When it comes to predicting the future, science fiction writers are Texas marksmen: they fire a shotgun into the side of a barn, draw a target around the place where the pellets hit, and proclaim their deadly accuracy to anyone who’ll listen. They have made a lot of “predictions,” before and after Mary Shelley wrote her “modern Prometheus” story about a maker and his creature. Precious few of those predictions have come true, which is only to be expected: throw enough darts, and you’ll get a bull’s eye eventually, even if you’re wearing a blindfold.
Predicting the future is a mug’s game, anyway. If the future can be predicted, then it is inevitable. If it’s inevitable, then what we do doesn’t matter. If what we do doesn’t matter, why bother getting out of bed in the morning? Science fiction does something better than predict the future: it influences it.
The science fiction stories that we remember—such as Frankenstein—are ones that resonate with the public imagination. Most science fiction is forgotten shortly after it’s published, but a few of those tales live on for years, decades—even centuries in the case of Frankenstein. The fact that a story captures the public imagination doesn’t mean that it will come true in the future, but it tells you something about the present. You learn something about the world when a vision of the future becomes a subject of controversy or delight.
If some poor English teacher has demanded that you identify the “themes” of Mary’s Frankenstein, the obvious correct answers are ambition and hubris. Ambition because Victor Frankenstein has challenged death itself, one of the universe’s eternal verities. Everything dies: whales and humans and dogs and cats and stars and galaxies. Hubris—“extreme pride or self-confidence” (thanks, Wikipedia!)—because as Victor brings his creature to life, he is so blinded by his own ambition that he fails to consider the moral consequences of his actions. He fails to ask himself how the thinking, living being he is creating will feel about being stitched together, imbued with life force, and ushered into the uncaring universe.
Many critics panned Frankenstein when it was first published, but the crowds loved it—made it a best seller and packed the theaters where it was performed on the stage. Mary had awoken something in the public imagination, and it’s not hard to understand what that was: a story about technology mastering humans rather than serving them.
When Mary published Frankenstein in 1818, England was getting completely upended by runaway technological innovation in the Industrial Revolution. Ways of life that had endured for centuries disappeared in the blink of an eye. William Wordsworth would soon write mournful letters and poems about railroads ruining his beloved countryside. Ancient trades disappeared without fanfare; new careers appeared overnight. Every constant was unmade; the maps were redrawn; and the old, steady rhythm of life stuttered and pulsed erratically. Young Mary, eighteen years old when she started writing Frankenstein, felt revolution in the air.
In 1999, Douglas Adams—another prodigious predictor of the present—made a keen observation about the relationship of young people to technology:
I’ve come up with a set of rules that describe our reactions to technologies:
Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
Anything invented after you’re thirty-five is against the natural order of things.
Depending on your age, the truth of Adams’s law—and the terror of the nineteenth-century readers who relished Frankenstein’s cautionary message about technology mastering its creator—may not be immediately apparent to you. But we assuredly live in a world of continuous technological upheaval, an Information Revolution that makes Mary’s piddling Industrial Revolution seem mild by comparison, and that is the reason we still care about a two-hundred-year-old novel imagining a scientifically inarticulate project to bring the dead back to life.
“Technological change” isn’t a force of nature, though. The way technology changes and the way it changes us are the result of choices that we make as toolsmiths, individual users, and groups.
Robert Heinlein, a titan of science fiction (as well as a titanically problematic figure), wrote in The Door into Summer (1957), a time-travel novel all about technological revolution, “When railroading time comes you can railroad—but not before” (120). All through history, inventors, most famously Leonardo da Vinci, doodled things that looked like helicopters. But helicopters didn’t come into existence until a lot of other stuff was in place: metallurgy, engine design, aerodynamics, and so on. The idea of helicopters was floating around in our ether, occurring and recurring in the minds of dreamers, but just because you can think up and design a rotor doesn’t mean you can design a diesel motor, let alone build a Sikorsky that can lift a tank.
This theory of technological progress is called the “adjacent possible.” Fanciful inspiration strikes all the time as a consequence of our playful, unpredictable imaginations. Fancy becomes reality when enough of the necessary stuff is in place. When it’s railroading time, you get railroads. Writers had long dreamed of animating dead matter—think of the dirt that became Adam or the clay that rabbis turned into golems. Mary, living in the world of galvanism, industrial and democratic revolution, and the newfound delight in rationalism, was able to give us a golem without resorting to the supernatural.
But railroading time didn’t just give us railroads: it gave us robber barons who built huge corporate “trusts” that stole from the masses to enrich the few. It gave us forced laborers, kidnapped or tricked out of China or shipped from slave plantations, to do the back-breaking work of laying the tracks. Railroads may have been inevitable, given steel and tracks and land and engines. Slave labor was not inevitable. It was a choice.
Once the railroads were built, though, choices got harder to make. Railroads changed the way that farmers sold their wares, changed the way that settlements were opened and serviced, changed all those things that freaked out Wordsworth, redrew maps, made industries disappear, and created new ones. Living as though the railroad didn’t exist was hard, got harder, and eventually became virtually impossible. Whether it was your distant business correspondents expecting quick responses to their letters or what kinds of jobs your kids could get, you couldn’t just opt out of railroads—not without opting out of all the activities your pals and loved ones undertook on the trains.
How the railroads were built was the result of individual and often immoral choices. How the railroads were used was the result of a collective choice made by all the people in your social network: family, friends, bosses, and teachers.
That’s why there’s no such thing as an Amish village of one. To be Amish is to agree with all the people who matter to you to make the same choices about which technologies you’ll use and how you’ll use them.
Internet social networks were already huge before Facebook: SixDegrees, Friendster, Myspace, Bebo, and dozens of others had already come and gone. There was an adjacent possible in play: the Internet and the Web existed, and they had grown enough that many of the people you wanted to talk to could be found online, if only someone would design a service to facilitate finding or meeting them.
A service like Facebook was inevitable, but how Facebook works was not. Facebook is designed like a casino game where the jackpots are attention from other people (likes and messages) and the playing surface is a vast board whose parts can’t be seen most of the time. You place bets on what kind of personal revelation will ring the cherries, pull the lever—hit “post”—and wait while the wheel spins to see if you’ll win big. As in all casino games, in the Facebook game there’s one universal rule: the house always wins. Facebook continuously fine-tunes its algorithms to maximize the amount that you disclose to the service because it makes money by selling that personal information to advertisers. The more personal information you give up, the more ways they can sell you—if an advertiser wants to sell sugar water or subprime mortgages to nineteen-year-old engineering freshmen whose parents rent in a large northeastern city, then disclosing all those facts about you converts you from a user to a vendible asset.
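To make that mechanic concrete, here is a minimal, purely hypothetical sketch in Python. The field names, profiles, and matching function are all invented for illustration and are not drawn from Facebook’s actual systems; the point is only to show how each disclosed fact makes a person easier to match against an advertiser’s audience definition.

```python
# Hypothetical sketch: how disclosed profile facts could be matched against an
# advertiser's audience definition. Every field name and value here is made up;
# it only illustrates the essay's point that more disclosure means a more
# precisely sellable audience.

def matches(profile, criteria):
    """True if the profile satisfies every targeting criterion."""
    return all(profile.get(key) == value for key, value in criteria.items())

profiles = [
    {"age": 19, "major": "engineering", "year": "freshman",
     "parents_housing": "rent", "region": "northeast"},
    {"age": 47, "major": None, "year": None,
     "parents_housing": "own", "region": "midwest"},
]

# An advertiser's audience definition, echoing the essay's example.
criteria = {"age": 19, "major": "engineering", "year": "freshman",
            "parents_housing": "rent", "region": "northeast"}

audience = [p for p in profiles if matches(p, criteria)]
print(f"{len(audience)} profile(s) matched for this campaign")
```

The more fields a profile fills in, the more audience definitions it can satisfy; leave them blank and the match fails, which is exactly why the service is tuned to coax disclosure.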
Adding the surveillance business model to Facebook was an individual choice. But using Facebook—now that it is dominant—is a group choice.
I’m a Facebook vegan. I won’t even use WhatsApp or Instagram because they’re owned by Facebook. That means I basically never get invited to parties; I can’t keep up with what’s going on in my daughter’s school; I can’t find my old school friends or participate in the online memorials when one of them dies. Unless everyone you know chooses along with you not to use Facebook, being a Facebook vegan is hard. But it also lets you see the casino for what it is and make a more informed choice about what technologies you depend on.
Mary Shelley understood social exile. She walked away from the social network of England—ran away, really, at the age of sixteen with a married man, Percy Bysshe Shelley, and conceived two children with him before they finally married. Mary’s life is a story about the adjacent possible of belonging, and Frankenstein is a story about the adjacent possible of deliciously credible catastrophes in an age of technological whiplash and massive dislocation.
In 1989, the Berlin Wall fell, and the end of the ironically named German Democratic Republic was at hand. The GDR—often called “East Germany”—was one of the most spied-upon countries in the history of the world. The Stasi, its secret police force, were synonymous with totalitarian control, and their name struck terror wherever it was whispered.
The Stasi employed one snitch for every sixty people in the GDR: an army to surveil a nation.
Today, the US National Security Agency (NSA) has the entire world under surveillance more totally than the Stasi ever dreamed of. It has one employee for every twenty thousand people it spies on—not counting the contractors.
The NSA uses a workforce less than one-tenth the size of the Stasi to surveil a planet.
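Taking the essay’s own figures at face value (one Stasi snitch per sixty East Germans, one NSA employee per twenty thousand people watched), the arithmetic behind that plunge in surveillance labor is roughly:

\[
\frac{20{,}000}{60} \approx 333
\]

Each person doing the watching now covers on the order of three hundred times as many people as a Stasi informant did.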
How does the NSA do it? How did we get to the point where the labor costs of surveillance have plummeted so far in a few decades?
By enlisting the spied upon to do the spying. Your mobile device, your social media accounts, your search queries, and your Facebook posts—those juicy, detailed, revelatory Facebook posts—contain everything the NSA can possibly want to know about whole populations, and those populations foot the bill for its gathering of that information.
The adjacent possible made Facebook inevitable, but individual choices by technologists and entrepreneurs made Facebook into a force for mass surveillance. Opting out of Facebook is not a personal choice but a social one, one that you brave on your own at the cost of your social life and your ability to stay in touch with the people you love.
Frankenstein warns of a world where technology controls people instead of the other way around. Victor has choices to make about what he does with technology, and he gets those choices wrong again and again. But technology doesn’t control people: people wield technology to control other people.
The world’s adjacent possibles will enable you to dream up many technologies throughout your life. But what you do with them can take away other people’s possibilities. The decision to use a widely adopted technology is never entirely in your personal hands, but what about the decision to make that technology and how you make it?
That’s up to you.
Doctorow’s essay argues that science fiction is not really about predicting the future but rather about understanding the present. What does Mary’s novel, which was presumably written for a present two hundred years old, have to tell us about scientific practices today? Is it still relevant, or do we need new stories to confront the present?
According to the theory of the “adjacent possible,” technological change comes “when enough of the necessary stuff is in place” (here). According to this logic, discovery can proceed only through so many pathways, and what’s coming always depends on what has come. Do you agree with this view, or do you think that true surprise and serendipity are possible? Is the direction of scientific progress somehow predetermined?
Doctorow argues that although technological changes are often the result of individual choices, how they are used becomes a collective choice. Using the example of Facebook, he talks about how disavowing a surveillance society is a difficult social choice—but still one that you can make as an individual. What collective choices concerning contemporary technologies do you disagree with, and what would it take for you to opt out?