It’s December, and you know what that means?
That’s right! Moral panics backed by popular science.
Truth to tell, this kind of thing happens constantly, and there’s nothing special about December. Usually the holidays even offer a sort of break from the idiocy as people get distracted by other public spectacles, like The War on Christmas, upcoming Presidential office handovers, brewing global disorder, and the growing certainty among a (un?)healthy segment of the population that next year is not going to be as good as this year.
Unfortunately, my famous calm and equanimity have taken a beating during the last couple of weeks as I’ve been repeatedly punched in the face by a particular fashionable bugaboo, so in the interest of spreading Christmas cheer I have decided to give you an article where I not only scream “bah humbug!” at the top of my lungs about said bugaboo, but also show how our history has put us in a position where just about everything we’re encouraged to get worked up about is a lie backed up by proof-texts.
Because, frankly, the only thing worse than being lied to is being preached at by proof-texters.
The Proof is in the Texting
In any intellectual/cultural system, public intellectuals operate on a metaphorical game board. The rules of the game are things like “standards of evidence,” “standards of goodness,” and “standards of ethics” (which are a separate axis unrelated to goodness). Within that game, there are three basic types of reasoning:1
In-Game reasoning. This is the reasoning, often deriving from agreed-upon first principles, which engages with the knowledge system in a fair-play kind of way. Its standards of truth and goodness assume that the consensus represented by the system is legitimate, and it plays nicely within those bounds. A fandom argument over whether, for example, Dumbledore was gay (held before Rowling outed him as such) might be vigorous, but it would proceed, on an as-if basis, from an agreed-upon literary canon and the mythos contained therein.
Faux-Critical reasoning. This is reasoning which purports to criticize the prevailing system, but in pretending to do so it accepts the axioms upon which the system depends. In so doing, it buttresses the legitimacy of the system it purports to oppose. To continue with the Harry Potter analogy, a fan who gripes about things like:
the stupid pacing and characterization issues that undermined the dramatic tension of Harry Potter and the Half-Blood Prince, or
the subtle rules changes in the magic system that crept into Deathly Hallows, or
the astonishing abandonment of time travel as a story mechanic just when it might have been useful
are employing faux-critical reasoning.
Sure, such a fan is voicing genuine gripes, and those gripes might even hold water, but the criticisms are utterly meaningless the moment you stop taking the Harry Potter universe seriously. Criticisms like this are, in a very important sense, a lie: they take the game-space entirely seriously and then try to undermine it (either in an attempt to salvage the suspension of disbelief, or to advance their own reading of the text). They are thus not actually criticisms at all; they are (however unconsciously) cover for an attempt to put a thumb on the scale of other people’s experience of the Harry Potter story.
Radical reasoning. This is reasoning that takes place from outside the knowledge system’s game-space. If the reasoning happens to share a common foundation with the game-space it’s speaking to/about, that commonality exists only at a very low level such as “we all experience physics” or “we like not being killed.” Where radical reasoning reaches a point of convergence with either faux-critical or in-game reasoning, it is by accident, as it proceeds from entirely different foundations and values than those which the first two categories have in common.
In Harry Potter terms, radical criticism treats the books as an artistic text and examines their achievements, not only in terms of how they achieve their artistic objectives (or not), but how they stack up against other works with similar aims. Or, to take another tack, a radical engagement with the series might attempt to unravel what its popularity reveals about the generation that embraced-and-then-rejected the series, and what that might show us about the future of its culture.
For Western culture over the past five hundred years (which is the part of our past that’s relevant to understanding how we think today), categories 1 and 2 have been entirely dominated by two basic cognitive paradigms:
Protestantism and Whiggism.
Protestantism (which, yes, descends from the Protestant Reformers of the late 15th and early 16th centuries) is primarily defined by a literal and legalistic turn of mind that is committed to prioritizing the map over the territory. This turn of mind is a great advantage to model-makers and map-makers who are seeking to bring nature (or at least that part of nature that is vulnerable to the observational tools we possess at any given moment) under human domination. It’s also pretty useful if you’re trying to get rid of a thousand-year-old priestly class of philosophers and wizards (i.e. the medieval Catholic church).
Protestantism (the intellectual tradition) overthrows the priests who purport to speak to God and the thinkers who wrestle with Big Questions—and with the entire idea of moral and cultural “elites”—and replaces them with professional geeks who explicate texts. After all, the prayerful conversations of visionaries can’t be scrutinized, but that which is written can be interrogated and analyzed, and those who learn to read can learn to interrogate texts and approach God through the text’s words. On this view, the plain (i.e. exoteric) sense of texts—legal, mythic, theological, and scientific—is the true sense.
Armed with this approach to truth, the Protestant thinker can master the universe, and all the people and thoughts within it.
The Protestant mind is an historical anomaly. Even the ancient traditions that most closely approximate it (such as Epicureanism) possessed an undercurrent of the esoteric.2 Most of the time, intellectuals worried about plumbing the meaning of life so they could bring meaning to their practical endeavors. In the Protestant world, that changed utterly.
It takes something powerful to pull the focus of intellectuals entirely away from intellectual pursuits (like philosophy, theology, alchemy, magic, and literature) and on to practical ones (like engineering, politics, and business).
Something very powerful indeed.
Like for example, money.
Centuries of money.
And lots of it.
The end of the Black Death kicked off an economic bonanza unmatched in world history.3 It disrupted thousand-year-old social and economic hierarchies and broke the back of the Catholic world long before Martin Luther decided to grump about the Pope being a bad lawyer and corrupt marketer. The invention of the printing press, the Age of Discovery, the consequent Age of Empires, and the massive wars that soaked the soil of Europe in blood were all part-and-parcel of that astonishing unlocking of material wealth.
With all of that going on, who’d give a fuck about things like “culture” and “meaning” and “higher truth”?
Well, okay, people still cared about those things, but they are kind of dull, aren’t they? In such a context, it makes intuitive sense to look to crass material4 realities for meaning. Truth comes in the form of Law, goodness in the form of Morals, the favor of God is found in material blessings, and comfort is found in the toys and luxuries and experiences that wealth can buy. Sure, family is good too, but not for the same reason. Where once it was important and meaningful because it connected the past to the future through your body and actions, now it brings meaning chiefly through legacy.
Widespread Protestant thought is only possible in an upwardly-mobile, outwardly expanding, wealth-generating civilization. As soon as the experience of “every year is better than the last, give or take a few blips in the curve” falls away, the hunger for those other, less quantifiable forms of meaning (which Protestant thought precludes) springs up again. The possibility of this happening, therefore, naturally becomes anathema to the mind of the Protestant.
Which brings us to Whiggism, which brings meaning back to the Protestant enterprise.
Whiggism is the deep, unexamined belief that human history is a tale of unequivocal progress. We began in savagery, but we have ascended (or are ascending) to the heavens. Our technological progress—an undeniable fact of life—is seen as a moral quality. We’re less violent now, less barbaric. On the Whiggish view, we’re not like this because our material comforts reduce the necessity of acting brutally, but because we have actually become better creatures.
“Better” does a lot of work in the Whiggish view. It’s an adjective that assumes its conclusion, and encourages a way of thinking that bends the facts to fit. Some courageous Whiggish souls—such as Michael Shermer in his book The Moral Arc, or Steven Pinker in his tome The Better Angels of Our Nature—have attempted to define and document “better,” invariably landing on “less violent.”
It’s a strange proxy for goodness—would a less violent lion be a “better” lion? Is the non-lethal coercive violence of the State truly superior to the neighborhood posse who rides out to lynch an embezzler or a child molester?
Well, if you’re a Protestant Whiggish thinker, the answer to such questions is obviously “yes.” In the Whig world, goodness is defined through law, law’s job is to restrain the passions and create order, therefore that which is “moral” and “good” is synonymous with “those behaviors which are legal and desirable in a well-ordered technocratic society.”
In our culture, the In-Game thinkers (like Shermer, Pinker, and most popular conservative-liberal intellectuals from the last few hundred years) play this game straight. The Faux-Critical thinkers (Bacon, Kant, Hegel, Marx, the Postmodernists, the Frankfurt School, various disaffected-liberal voices who have risen to prominence in recent years, etc.) pretend to oppose this way of thinking. However, when they run up against the edges of their paradigm, they reveal themselves to be merely a subset of that class of intellectuals who—in addition to being in love with “progress”—have made the Protestant error of prizing the exoteric story as esoteric truth. Such souls consequently try to influence the world in the same way that mosquitoes might try to influence an elephant.
The Radical thinker shares neither the Whiggish moral bent nor the Protestant love of crass materialism. He might instead approach things from an ecological, elitist, chaotic, vitalist, primitivist, historicist, Nietzschean, declinist, objective-aspirant, or mystical point of view (or any number of others). In all cases, the common denominator for the Radical is that he is outside the moral-cognitive frame and is capable, at least in some circumstances, of engaging with facts and ideas on their own terms rather than viewing them through a Whiggish-Protestant lens. Thinkers like Toynbee, Durant, Machiavelli, Goethe, Schmitt, Tolkien, Crowley, Ellul, Huxley, Tocqueville, and Taleb are (at least occasionally) radical thinkers.
Proof-texting “Not Ripe Yet”
Nothing delights the modern discourse-participant like a proof-text.
Proof-texts are pithy quotes or citations that can be made to support any position. They are usually ripped out of context and repurposed for the task at hand, with no acknowledgment at all of the piracy or perversion inherent in this act.
Creationists like Ken Ham love the proof-text, pretending as they do that quoting Genesis lends authority to their batshit pseudotheology5 (itself the invention of the late 19th century, with precious little connection to historic Christianity).
Conservatives proof-text Martin Luther King, Jr., quoting his line “…my four little children will one day live in a nation where they will not be judged by the color of their skin, but by the content of their character” as if King wanted a color-blind liberal society, when he was actually a socialist activist and a devotee of Marx and identity politics who personally mentored race hustlers like Jesse Jackson.
Liberals love a proof-text too, freely quoting Lincoln as if he cared about slaves and believed in the equal dignity of all humans, or did what he did because he prized the liberties that the Founding Fathers established, when he was not terribly interested in any of these things. They also happily quote JFK as if this aristocratic scion, drug-addicted womanizer, and grizzled warrior was an egalitarian feminist peacenik who would have ended the Vietnam war had he lived just a little bit longer (he was, in reality, a radical imperialist who lacked confidence in the intelligence establishment’s ability to usefully guide him in his goal of advancing the American conquest of the globe).
Modernity is Protestant, and Protestantism is a religion of lawyers whose primary loyalty is towards the expedient—and, in a Lawyer’s world, nothing is more expedient than a proof-text.
Which leads me to the proof-text that I’ve finally gotten sick enough of to bitch about:
“The Brain [specifically, the prefrontal cortex] does not stop developing until age 25” —Popular Neuroscience Dogma
You’ve heard some version of that knocking around, I’m sure. I first encountered it circulating among prudish conservatives who pretend to be political and cultural liberals. “The brain is still developing into your twenties. Maybe young adults aren’t really competent to own guns/consent to sex on the same level as other adults/be out on their own/etc.6 Maybe we’re expecting too much of them.”
In classic proof-text fashion, it both utterly misrepresents the significance of the cited factoid and utterly ignores the context of human history.
Context and Significance
To start with the last first, the traditional age-of-full-majority in the English-speaking world was 21. At this age, the adolescent was officially a fully independent adult, capable of marrying without leave (at least legally), of entering into contracts, and of voting.
The other things we so dearly associate with “coming of age”—having sex, drinking, smoking, serving in the military, buying a gun—were things that were all done significantly sooner than the age of full majority.
Adolescent boys typically became eligible for military service (usually through impressment7) between ages 14 and 16—sometimes younger if they were serving as drummer boys. They were typically allowed to visit brothels from the time they were old enough to earn their own money (again, usually sometime between ages 14 and 17).8 Drinking was done (in moderation) from childhood (also true for girls), while smoking was an “adult” vice and not typically tolerated in the young until they were of such an age that they could buy their own tobacco and socialize with adult men.9 On the other hand, boys owned their own weapons (or had access to the family arsenal) by the time they were old enough to hunt or watch their younger siblings (usually around age 8 or 9).
For girls, of course, things were different. Sex was generally not allowed before either marriage or full majority (except under the nominal supervision of parents between a daughter and her betrothed, which in most subcultures didn’t happen until 16 at the earliest, and often not until 18 or later),10 and they didn’t get the vote.
Our idea of adulthood is impoverished, if not completely backwards.
On the other hand, humans have always given people we would dishonestly call “children”11 levels of responsibility our culture considers insane, and the results were…pretty decent, all things considered.
So, on what basis should we even consider that fully grown adults are still children simply because we can now, with brain scans, notice that the prefrontal cortex continues developing until around age 25?
Well, Dan, the prefrontal cortex controls judgment and governs the ability to predict the future (and thus anticipate consequences). People in the past just didn’t know these things, so they couldn’t have been expected to understand things like we do. Science is all about learning new things about the world so that we can understand and deal with life better. We should change our behavior based on our new understandings, so of course we should seriously reconsider the way we treat young adults.
If that’s your attitude, let’s look at what it means for “the prefrontal cortex” to “stop developing.” The prefrontal cortex deals with (among other things) intentionality and risk management. A person’s will to live, their personality, their ability to formulate goals, and the like are also all centered in the prefrontal cortex. This is, in other words, the place in the brain where “you” (that is, the part of you that you and others recognize as being yourself) live.
Like all other areas of the brain, it settles into its adult pattern (i.e. “finishes developing”) as the result of its experiences.
The closing out of this developmental window in the mid-twenties is associated with a lot of changes. It marks the solidification of the risk-management style that characterizes a person’s adulthood. It’s the time when learning begins to become more difficult as “fluid intelligence” (i.e. the ability to grasp new problems on the fly without context) begins to fall and “crystallized intelligence” (i.e. problem solving based on previous experience) takes on a greater role in life—a role which increases as a person ages.
Everything in the human body works this way.
Bones learn how dense to be by the weight they bear during childhood. Put them under heavy loads with rough-and-tumble play, and you’re much more likely to have good bone density throughout life and avoid late-life diseases like severe osteoporosis.
Your body learns to maintain balance by being tossed around and dealing with heights at a young age. If your dad (or similar figure) didn’t throw you in the air and swing you around and take you on roller-coasters when you were an infant, toddler, and child (respectively), then you’re much more likely to suffer from clumsiness, motion sickness, and poor balance as an adult.
Once a developmental window closes, it’s closed. After the close, you cope with new situations based on what your nervous system has already learned. You might not learn to ride a motorcycle until you’re forty, but if kid-you ever rode a bicycle you’ll pick it up pretty quickly. If, as a child, you never rode a bike, or played on balance beams, or rode a horse, or otherwise used your body to manipulate balance and momentum, you’re probably going to die when you finally straddle a rice rocket.
Biological systems develop in response to the environment, and the prefrontal cortex is no different.
In the case of the prefrontal cortex, 25 is the age at which a human becomes an “old dog” who can’t easily learn “new tricks” where socialization, personality, relationships, and risk assessment are concerned (without finding a way to connect them to the old tricks they learned earlier in life).
That’s all well and good, Dan, but we know how grooming works. We know how abuse works. When someone with less power/authority/charisma/experience holds someone with more of these things in awe, that relationship (whatever its shape) is more likely to be abusive than not. The less capable deserve protection, no matter what.
Imagine what we might be able to achieve if we took to heart the idea that we should never touch the developing organism until it stops developing at age 25! We might be able to finally get rid of abuse, exploitation, and the horrific things that affect all of us, to one degree or another, throughout our lives!
Sounds reasonable, right?
Or does it?
We’ve already seen what happens to children who are protected from risk, emotional difficulty, rough-and-tumble play, and material deprivation (yes, I’m looking at you, late Millennials and early Zoomers). So let’s play it forward.
If we took “adulthood doesn’t begin until 25” seriously in our Protestant Whiggish frame, we might decide that it’s just plain good ethics to protect the delicate developing brains from conflict and fear…preventing their developed forms from ever being able to handle either.
We might protect them from being swindled…making them perfect marks throughout life.
We might protect them from the risks that lead to heartbreak…leaving them unable to cope with desire and intimate emotional bonds.
We might keep them away from relationships—especially sexual relationships—with power differentials…ensuring a reliable supply of naive, exploitable idiots who are infinitely pliable and happily vulnerable to all kinds of emotional, sexual, political, financial, and social manipulation.12
Because, as we all know, humanity progresses morally, and innocence is the morally exemplary position. If we honor the innocence of these developing young adults long enough, just think! We might be the last generation to ever have experienced the heartbreak of lost innocence.
Just imagine the kind of world that we savvy grown-ups could make for ourselves.
Imagine…never having to worry about the next generation having the capability of supplanting you.
Imagine the advantages.
If you’re looking for fresh stories, you can find my novels, short stories, visions, and dreams (along with some how-to books and literary studies) by clicking here.
When not haunting your Substack client, I write novels, literary studies, and how-to books. If you’re feeling adventurous, click here to find a ridiculous number of fiction and nonfiction podcasts for which I will eventually have to accept responsibility.
This column is a big part of how I make my living—bigger now due to recent exciting events which you can read about here. Because of this, I’m offering a 20% lifetime discount off the annual subscription rate. If you’re finding these articles valuable, I’d be honored to have you join the ranks of my supporters!
Those among you who spot a similarity with Hobbes’s discourse on the types of geometric reasoning are not wrong. I was not familiar with it until Travis Smith, author of Superhero Ethics, was kind enough to read a draft of this article. He pointed out that one of my epistemological mentors was influenced by Hobbes’s mathematical thinking. You can read Hobbes’s discourse here: https://quod.lib.umich.edu/e/eebo/A43987.0001.001/1:6.6?rgn=div2;view=fulltext
i.e. occult, or hidden, ways to understand the meaning embedded in the teachings
A more dramatic window of wealth creation and transfer occurred during the pioneer era of the United States, but it was considerably more localized than the aftermath of the Black Death.
I mean this in its strict sense—crass materialism (the view that the immediately visible world is all that matters, or indeed all that exists) is not the same thing as philosophical materialism (which holds that all things that exist are part of the material world, or at least detectable through interaction with it).
That is, Dispensationalist Literalist Fideist Fundamentalism.
Notice how this bugaboo is never used as a proof-text in support of denying the vote, denying the ability to get into student debt, or denying the ability to serve in the military. Convenient, isn’t it?
i.e. being drafted
Depending on their guardian situation. Boys in apprenticeship were typically more-or-less enslaved to their professional mentors, whose personal attitudes on such matters significantly impacted their pupil’s liberty one way or another.
Until the invention of the cigarette, smoking was mostly a social activity. Cigarettes made it a private vice.
That’s right, Juliet’s young age in Shakespeare’s play was intended to be scandalous even to Shakespeare’s original audience. Part of the reason for these age boundaries is that, before the advent of cheap and plentiful sugar-based calories (along with endocrine disruptors that facilitate early weight gain), girls didn’t often get their periods until around age 16. Boys, similarly, had significantly later puberty, and it wasn’t uncommon for a boy’s voice to fail to drop until he was closing in on his 17th birthday. An idea became fashionable in the late 20th century (due to a work of popular pseudohistory that I own a copy of, but which is currently locked up in a storage unit and whose title I can’t remember) that initiation into sex and marriage happened very young in the medieval world, but such was almost exclusively the case among royalty, and even then it was quite rare.
“Dishonest” because we happily and recklessly conflate six separate developmental stages—infancy, toddlerhood, childhood, preadolescence, puberty, and biological adulthood—into a single category. And then we have the unmitigated gall to wonder why so many twenty-year-olds seem too immature to handle a driver’s license.
These are, after all, exactly the sort of person targeted by Charles Manson for recruitment into his “family.” He’s not alone in using this as a victim profile.
"Imagine…never having to worry about the next generation having the capability of supplanting you."