There were sometimes loopholes for royalty and aristocrats because what country doesn’t benefit from an inbred child monarch from time to time? And Rome being Rome also factored in an estimate of when you were probably capable of understanding whether you were acting within the law.
But, for the most part, if you were at an age where you and most of your peers had gotten through puberty and were physically developed enough for battle, in most places you had reached the age of majority. Congratulations!
In the United States, guesstimates of a good age of majority, when formally codified, have generally been set at either 18 or 21. Eighteen probably makes a little more sense. You’ve been through puberty. You’re done with compulsory education. You’re free from your parents. You should have enough sense to know whether you are acting within the law. You’re physically capable of giving your life for your country if its leaders get in a pissing contest with Russia or its esteemed defense contractors need to move product. What more is there to consider?
Ever so briefly, the United States sort of recognized this. At the start of the Vietnam War, 18 was old enough for you to be drafted, but not quite old enough to choose the people drafting you or enjoy a beer before you shipped out. Ergo lawmakers at the federal level, acknowledging this apparent logical inconsistency, lowered the voting age to 18 in 1971. Some states similarly lowered their drinking age, until it was, for all practical purposes, raised to 21 at the federal level in 1984, albeit via something of a constitutional workaround: rather than legislate a national drinking age directly, Congress threatened to withhold federal highway funds from states that didn’t comply.
More recently though, it has become rather fashionable for the sophisticated people to shake their heads at this backwards notion that young adults in their late teens or early 20s be permitted to engage in activities usually reserved for grownups. The smart people know 21 is too young to make serious choices about how you wish to live. The educated people understand 18 isn’t old enough for you to behave responsibly without one of the big people looking over your shoulder. The rental car companies have known this for years: they only rent to 25 and older.
The rationale for much of this—with the apparent exception of Vivek Ramaswamy’s call to raise the voting age, which seems more about revitalizing perceptions of civic duty and the act of voting—generally comes down to an appeal to common sense with a dash of science. If you are in your late teens or early to mid-20s you are obviously immature, irresponsible, and not capable of the sound judgment of adults.
The latest brain science backs this up. Therefore, it would be in the best interest of you and the rest of society if we treated you as a child just a little bit longer until your brain finishes ripening.
A lot of the science, though, and perhaps some common sense, is lost in this argument. For a more comprehensive understanding of the science bit, one first needs to back up to about the mid-20th century. Prior to the neurofication of all human thought and behavior, which took hold somewhere in the aughts with the spread of neuroimaging devices, especially fMRI, developmental psychologists tended to work within a more theoretical and observational paradigm when dividing people’s lives, from birth through old age, into different developmental periods.
Erik Erikson, writing primarily in the 1950s and 1960s, was probably the most influential of them as he theorized that childhood likely ended around the onset of puberty, at which point adolescence began and lasted until the onset of young adulthood in the late teens. Young adulthood then lasted till about 40.
Jeffrey Arnett, writing in 2000, proposed inserting a new phase into this scheme, one he called emerging adulthood. His rationale was that when Erikson conceptualized his developmental phases in the mid-20th century, the lives of individuals in their late teens and 20s were much different than they were at the dawn of the new millennium. In Erikson’s day, people began work earlier. Most didn’t go to college. By 20 they had found a steady job. By 23 or so they were married. About a year later they had their first child.
In the late 1990s, however, young people in their late teens and early to mid-20s, instead of settling into adult roles, were entering a period of “semi-autonomy” in which they “take on some of the responsibilities of independent living but leave others to their parents, college authorities, or other adults.”
During this period they often pursue additional education and live lives characterized by exploration and frequent change while existing in a quasi-adult state. Physically they are adults. They are considered adults with some restrictions in the eyes of the law. Yet, they don’t feel like adults. They don’t feel responsible for their own lives. They don’t feel like they make their own independent decisions. Plus, they often lack financial independence. For many, this does not change until sometime in their mid-to-late 20s.
With time, many commentators and policymakers began proposing that oversimplified findings from neurodevelopmental studies inform law and policy, with a particular focus on how the brains and cognitive abilities of adolescents and those in young or emerging adulthood continue to change into roughly the mid-20s.
People started arguing that since the brain isn’t fully mature till the mid-20s, one is not an adult until 25. They started acting as if permitting 18, 21, or even 23-year-olds to take responsibility for their own lives or make decisions independently is as absurd as handing a 12-year-old a bottle of scotch, a handgun, and a box of condoms before sending him off to run a bank.
Sometimes this comes off as a cynical attempt to appeal to science as a means to indirectly restrict activities individual commentators or policymakers probably would rather just simply ban altogether. Other times it seems more like what overeducated proponents of the safetyist nanny-state would perceive as a well-intentioned, honest attempt to help less-informed hoi polloi stay safe by following The Science. In both cases though, it also reveals, at best, a naïve understanding of the science they claim to be following.
When examining questions pertaining to neurodevelopment, researchers don’t really have a clear single metric for neurodevelopment or neuroadulthood. Instead, they have many options to choose from, and generally those options don’t align perfectly with one another. Thus, for research purposes, scientists will select an operational measure and look to see at what age changes in that measure plateau.
But again, for any given study, researchers must decide what measure to use: structural changes, the amount of gray matter, the amount of white matter, connectivity, the availability of particular neurotransmitters, metabolic efficiency, etc. They also must choose what part of the brain to focus on. Depending on the choices the researchers of a given study make, they may then find neuroadulthood is attained as early as 15 or as late as never.
Increasingly though, many are homing in on the prefrontal cortex. In some ways this sort of makes sense. This is the part of the brain associated with many higher or executive functions and reasoning capabilities, after all. A related approach is to focus on psychological components of cognitive ability that can be measured without a neuroimaging device, then try to match up performance on the cognitive measure with some neurodevelopmental one because the pretty pictures of an fMRI convey the authority of science better than a bar graph showing reaction times on a complex cognitive task that would take 20 minutes to explain.
Yet when implementing either approach to divine the age of neuro or cognitive adulthood, researchers still seem to end up floating imperfect guesstimates, ranging from the mid-20s to the 30s to never, that do little more than further complicate what was once a fairly simple matter.
This does not mean the research isn’t interesting or worthwhile, but it should make one think twice before deferring to it when arguing for restricting the rights of putative adults.
Moreover, even if science here were a little less fuzzy and we had a more precise age for the maturation of the prefrontal cortex and could definitively correlate it with performance on a relevant cognitive task, a lot is still lost both scientifically and practically.
First, by at least partially tying legal adult activities to one or more scientific metrics, one establishes a seemingly risky precedent, opening the door to adulthood being something forever in flux. Today we might seek to reclassify 18–21-year-olds as children because their brains are not as mature as a 25-year-old’s.
Tomorrow we may reclassify 22–24-year-olds as minors because their brains are more similar to those of 21-year-olds than those of 35-year-olds. A generation from now, we may end up with the same conversation about 35-year-olds. Potentially, this could go on forever.
Second, if we go this route of reclassifying young adults as not quite real adults responsible for their lives and the choices they make, why shouldn’t we finalize the process and keep them under parental care or state control until they are 21, if not 25 or whatever other age, rewriting the remaining laws about tobacco, alcohol, guns, the age of consent, and a plethora of other opportunities for bad choices, and adjusting societal expectations for this age group accordingly?
Drinking and smoking would be prohibited for these twenty-something-year-old minors. Romantic relationships between proper adults and those under whatever the new cutoff is would be treated as statutory rape. College could be made mandatory. But professors would have to be careful not to make the coursework too hard because, in this view, 18 or even 20 is simply not old enough for a child to do adult-level schoolwork.
Lastly though, this whole endeavor to try to find a neurodevelopmental or cognitive measure for the precise age at which one becomes sufficiently adultlike and to shape policy around that measure would seem to discount that the neurodevelopmental and cognitive features being measured may themselves be forever in flux for a variety of sociocultural and environmental reasons. It also ignores that most societies throughout human history have gotten along just fine without knowing the exact moment the prefrontal cortex reaches maximum adultness.
Once more, Arnett noted in 2000 that the young adults of that era were different from those of the mid-20th century, taking on the responsibilities of steady work, marriage, and children later than their earlier counterparts. He also noted how it is well established that marriage and parenthood tend to hasten feelings of adulthood and decrease risky behaviors more effectively than practically any other human experience.
Taken together with Arnett’s observations, the brain research would seem to indicate that our society and culture have developed in a way where everyone gets held back a developmental stage, for roughly the duration of a developmental stage, at least until they’re 30.
Although we can’t know for certain, maybe if we had fMRIs in the era of Erikson or even the 1990s we’d see brains back then reached some metric of adultness earlier than those of kids today.
Of course young people have always done dumb things and made stupid decisions. Just watch any teen movie that takes place in the 1950s. Everyone apparently got into drag races with greaser kids and preppy bullies—even when trying to stop an alien blob from destroying Earth.
Perhaps by turning to science to tell us the exact age at which someone no longer must be protected from making their own decisions, we are further exacerbating a vicious cycle in which our society has already trapped its youth.
By attempting to cocoon both adolescents and young adults to protect them from bad choices, responsibility, and real-world consequences for their decisions until they reach a scientifically defined age at which they can enter the world fully mature and unsupervised, we will in fact be protracting their immaturity and delaying their development into the responsible adults we are waiting for them to become.