Since long before Chuang Tzu posed this ancient Chinese conundrum, some version of it has bedeviled bright youths in every generation—especially college sophomores. It seems the same specialized organs in our brains that allow us to make pragmatic plans and perform thought experiments, or gedankenexperiments, also unleash a human imagination so eager and creative that we can—briefly or permanently—lose track of what is real.
For example, although it might be called a form of lying, most societies have highly valued storytelling. In my role as a novelist, I join this tradition by stringing together lengthy chains of coded squiggles—in the Roman alphabet—that highly skilled readers later deconvolute and transform into stirring mental images, rollicking action, empathy with imagined characters, and even (possibly) an insight or two. Motion pictures shortcut and amplify this process with a firehose stream of visual images, cues and crutches that cater to the same human genius—a knack for picturing things, people and events that never (objectively) existed.
If “magic” is the creation of subjective realities in the minds of other people, then we moderns have learned how to perform magical incantations on a vast, industrial scale.
And now comes an era when we live immersed in computer-generated “virtual” realities, rendered through lavish games where ersatz selves get to do countless things that our mundane, fleshy selves cannot. Is it any wonder that some people have been talking about a near future when this process may reach its ultimate conclusion? When the denizens of Reality will not be able to verify, by any clear-cut means, that they aren’t living in—or even existing because of—a simulation?
Picture some future time when thinking beings may occupy simulated software realms within some vast cybernetic space—either in “holodeck” style physical manifestations or in purely cybernetic downloads. Realms that emulate the palpable “pinch-me test” of reality, with fine attention to every detail. We don’t yet know how far simulation can be extended, or whether there are inherent limits. Some very smart people believe there aren’t any, in which case there’s no guarantee that you, reading this paragraph right now, aren’t already living in such a simulation.
To illustrate, let me offer a scene from one of my own short stories, “Stones of Significance,” a somewhat intense and difficult tale, because it is set in a future far in advance of ours. A tomorrow wherein the main character—a designer of simulated worlds—has been asked about his relationship with the artificial beings that live in them:
In every grand simulation there is a gradient of detail. Despite having access to vast computing power, it is mathematically impossible to re-create the entire world, in all its texture, within the confines of any calculating engine. That will not happen until we all reach the Omega Point.
Fortunately, there are shortcuts. Even today, most true humans go through life as if they were background characters in some film, with predictable ambitions and reaction sets. The vast majority of my characters can therefore be simplified, while a few are modelled in great detail.
Most complex of all is the point-of-view character—or “pov”—the individual simulacrum through whose eyes and thoughts the feigned world will be subjectively observed. This persona must be rich in fine-grained memory and high fidelity sensation. It must perceive and feel itself to be a real player in the labyrinthine tides of causality, as if part of a very real world. Even as simple an act as reading or writing a sentence must be surrounded by perceptory nap and weave … an itch, a stray memory from childhood, the distant sound of a barking dog, or something leftover from lunch that is found caught between the teeth. One must include all the little things, even a touch of normal human paranoia—such as the feeling we all sometimes get (even in this post-singularity age) that “someone is watching.”
Is it any wonder why I oppose reification? Their very richness makes my povs prime candidates for “liberation.”
Once they are free, what could I possibly say to them?
This notion of simulated realities is getting a lot of attention lately, both in philosophical and scientific literature and in serious science fiction. There are endless ramifications, more than we could go into here. But to me, one implied conundrum stands head and shoulders above the rest.
Here is the prime theological question. The one whose answer affects all others. And yet, one that is almost never asked:
Is there moral or logical justification for a creator to wield capricious power of life and death over his creations … and is there any fundamental moral reason why those creations should have to obey?
Humanity long ago replied with a resounding “no!”… at least when talking about parents and their offspring. (There have been a few exceptions, such as the principle of pater familias in Roman law, which permitted a father to kill even adult offspring, if they offended him.) In most cultures, the created—our kids—eventually get full authority and a right to make their own way. In some societies, they are even welcome to argue with their creators along the way.
And yet, without noticing any irony, we have implicitly answered the same question “yes” when it came to God! The Creator, it seemed, was owed unquestioning servitude, just because this creator made us.
It is the ghost at the banquet, the underlying assumption of all religions, taken for granted for far too long. Is it puzzling that—after more than four millennia of theological wrangling, and the investment of millions of hours of thought to religious matters—this question only comes up now? Now that we are picking up creation’s tools, like bright apprentices? Tools of physics and biology, and also tools that let us simulate the creation of whole worlds.
It provokes some odd thoughts. For example, heaven and hell may not be such bizarre notions, after all! Consider our demigodlike descendants, with power at their fingertips to compute and emulate any reality. They will be able to “call up” simulated versions of people from times past, especially 20th century folk, what with all the data available about us, including skin cells in all our old letters and scrapbooks. What will they do with that power?
Perhaps, those who helped build the utopia of tomorrow will be remembered, immortalized, in software simulations by our descendants. Those who hindered progress, who obstructed or simply did nothing, will at best not be invited back. At worst, they might be assigned unpleasant roles in software scenarios. Might the old notion of “purgatory” have some resurrected relevance, after all? I leave possible extrapolations of this idea to the reader.
As I said, this topic has a million permutations. Here’s another: might there be clues, already visible, that we are living in a simulation?
I see a few clues. For example, quantum mechanics. Specifically, the division of reality into “quanta” that are fundamentally indivisible, like the submicroscopic Planck length, below which no questions may be asked. Isn’t this exactly the sort of truncation that a computer model would use, in order to prevent being taxed with infinite demands on processing power—which would happen if the model could look into ever-smaller domains like the fractal Mandelbrot set? Likewise, at the high end, both the speed-of-light limit and the intrinsically contained dimensions of a big-bang universe may be artifacts introduced in order not to have to deal with the software loads of modeling a cosmos that is infinitely observable.
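The truncation idea can be made concrete with a toy sketch. Everything below is invented for illustration (the names `observe` and `PLANCK_DEPTH` are hypothetical, not physics): a simulated world that refuses to resolve structure past a fixed zoom depth, the way a Planck-style cutoff would spare a world-simulator from infinite zoom-in demands.

```python
# Toy illustration only: a "world" engine that caps its resolution.
# Names (observe, PLANCK_DEPTH) are invented for this sketch.

PLANCK_DEPTH = 35  # deepest zoom level the engine will compute

def observe(depth: int) -> str:
    """Return what an in-world observer sees after zooming in `depth` levels.

    Simulated cost grows with depth, so the engine truncates: past the
    cutoff it returns a featureless answer instead of recursing forever,
    as a fractal (Mandelbrot-style) world would otherwise force it to.
    """
    if depth > PLANCK_DEPTH:
        return "featureless: below the quantum of the model"
    cells = 2 ** depth  # pretend workload: finer scale, more cells to render
    return f"resolved {cells} cells"

print(observe(10))   # coarse view: cheap to compute
print(observe(100))  # the simulation declines to compute further structure
```

A shallow query costs little; a query past the cutoff costs nothing at all, because the engine simply declares that no finer structure exists—which, from inside the simulation, is indistinguishable from a law of nature.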
Still, some of the “clues” are far more visceral and impulsive. Take the coincidence of names that keeps cropping up, almost as if the “author” of our cosmic simulation were having a little joke. Like the almost unlimited amount of fun you can have with Barack Obama’s name. Or the fact that World War II featured a battle in which Adolf the Wolf attacked the Church on the Hill, who begged help from the Field of Roses, which asked its Marshall to send an Iron-hewer to fight in the Old World and a Man of Arthur to fight across the greatest lake (the Pacific) … does the Designer really think we don’t notice stuff like this?
Or maybe this designer just doesn’t care.
(Reprinted from a posting on Closer to Truth)