Plastic surgery, broadly defined, has a long history. The earliest records of going under the knife date back to 600 B.C. in India, where the punishment of nose amputation for adultery created a natural clientele for rhinoplasty. In ancient Rome, surgeons removed scars from their patients’ backs, since such marks suggested a personal history as a whipped slave.
Yet modern plastic surgery, as a provider of aesthetic enhancement rather than the restoration of one’s original features, is a relatively novel practice, emerging as a secondary phenomenon of the First World War.
The magnitude of the war and the nature of trench warfare resulted in an unprecedented number of disfigured veterans, which in turn led to a boom in the field of plastic surgery.
Today, the historical consensus attributes the rise of modern plastic surgery to the public’s reception of the large-scale treatment of disfigured veterans.
“If soldiers whose faces had been torn away by bursting shell on the battlefield could come back into an almost normal life with new faces created by the wizardry of the new science of plastic surgery, why couldn’t women whose faces had been ravaged by nothing more explosive than the hand of the years find again the firm clear contours of youth?” wrote surgeon Max Thorek in his 1943 autobiography.
Almost a hundred years later, history is repeating itself as advances in biotechnology and pharmaceutics concerned with the correction of defects show signs of a potential reorientation toward human enhancement.
From the towering IBM mainframes of the 1950s to Google Glass, the past half century has seen a continual evolution of technology that allows ever closer integration with the human body.
This year alone has seen the introduction of the Apple Watch and, more obscurely, the demonstration of a tiny track-pad that fits on your fingernail. It would only take a minor leap of the imagination to envisage the next stage of integration, where your nerves are directly wired to a machine, but such a leap isn’t needed—it’s already a reality.
In the early 2000s, researchers at Northwestern University developed a new type of surgery, known as targeted muscle re-innervation, that rewires the nerve endings that control the limbs so amputees can better control their electronic prosthetic arms and legs.
Since then, researchers at Johns Hopkins have created prosthetic systems that allow people who have had both arms amputated at the shoulder to perform basic household tasks, such as moving a cup to a higher shelf.
The device made at Johns Hopkins uses nerve sensors attached to the skin, whose signals can be disrupted by normal bodily processes like sweating. To tackle this problem, researchers in Sweden have experimented with a prosthetic device that goes under the skin, connecting directly to the nerves, muscles, and bones.
At the moment, the range of motion offered by prosthetic systems is quite primitive, emulating all the energy and grace of a cane-clutching octogenarian, with few purposes beyond servicing amputees and technology enthusiasts, but then, plastic surgery at the turn of the century was similarly primitive.
Walter Yeo, the first recorded person to undergo a skin graft after losing his eyelids in the war, emerged from surgery with the appearance of someone who was always wearing a discount masquerade mask. Only a hardened futurist would have predicted that one day numerous nominally attractive women would repeatedly go under the knife to enhance their looks, but that’s exactly what happened.
The desire for self-enhancement—whether in physical, mental, or economic varieties—is a perennial feature of the human race, a desire only amplified by the millennial wave of techno-optimism that promises that if there’s a problem, there’s an app for that.
Short of a mysterious halt in the natural progression of biotechnology, it’s almost certain that prosthetic systems will one day become advanced enough for recreational use—and more.
A cyborg is technically defined as an organism that is deeply integrated with an intelligent machine, although in science fiction cyborgs are usually superhuman, not physical therapy patients. Under that definition, cyborgs don’t exist yet, but there’s no reason to rule out their emergence in the next century or two.
Even if most biotech researchers themselves are hostile to the idea, the development of that technology is part and parcel of medical research. Improvements in prosthetic systems are necessarily advancements toward the creation of cyborgs, and the technology could easily escape the hands of that majority.
For instance, while American researchers roundly condemn the practice of modifying the genome of human embryos, the method that made editing a cell’s genome economically viable, known as CRISPR, was pioneered by that same community.