It is a tribute to James Cameron that he provokes careful, even critical, appraisals of his work, which I tried to do in my riffs on Avatar. In Part III, I offered one proposal for a three-minute tweak -- possibly in a director's cut -- that might repair the core, moral heart of this great-but-flawed film.
Will that happen? When it snows on Pandora! ;-)
I also alluded to some other, even more far-out-meddlesome ideas. Just for fun, in this unofficial addendum... one writer having fun, playing in another fellow's sand box... why don't we look at a concept that isn't even my own! It is yet another, larger tweak that could both surprise audiences and really make them think, suggested by one of my readers -- Matthew Bell:
"All the amazing aspects of Pandora, all the magical exaggerations, along with its strangely un-biological biology and the behavior of its natives can be explained if you assume that the planet is a post-singularity world."
Now, some of you may be unfamiliar with the "singularity" as it was first laid out by the great science fiction author Vernor Vinge and later pushed hard by Ray Kurzweil, author of The Singularity is Near. This notion -- much discussed among the world's nerds -- is simple yet profoundly intricate. It starts from the fact that human skill and knowledge are accumulating not just at an accelerating rate -- the rate of acceleration is itself accelerating.
The most familiar sign of this acceleration is Moore's Law, under which computing power doubles every 18 months or so. At this pace, it should be possible to emulate human intelligence in a box, within 20 years or so. Then that artificially intelligent (AI) box can design a new, improved one, which designs the next and so on, in a sequence that rapidly takes off. In mathematical terms, a "singularity" is what happens when such trends accelerate beyond any ability to predict outcomes. All bets are off, when everything you took for granted has changed.
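(Just to make that arithmetic concrete, here is a rough back-of-envelope sketch -- my own illustrative numbers, assuming nothing more than a steady 18-month doubling; it is not anyone's actual forecast.)

```python
# Rough compounding arithmetic behind the "doubling every 18 months" claim.
def growth_factor(years, doubling_time_years=1.5):
    """Total multiplier after `years` of steady doubling."""
    return 2 ** (years / doubling_time_years)

for years in (10, 20, 30):
    print(f"after {years:2d} years: ~{growth_factor(years):,.0f}x the computing power")
# After 20 years the multiplier is roughly 10,000x -- the kind of jump that
# the "human intelligence in a box within 20 years" guess leans on.
```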
Now, a number of authors (including me) have tried to picture what life might be like on the other side of a singularity (see my novella Stones of Significance). If the huge brains we create turn out to be monstrous and unsympathetic, they may try to stomp us, as in the Terminator and Matrix flicks. Or they could become loyal assistants to human ambition, helping us span the starways, as in Luc Besson's Lucy or in Her, or in the Culture novels of Iain Banks. There are so many possible ways that this transition might work out -- and I cover a number of them in my novel, Existence.
But one is especially enticing when it comes to Avatar. The possibility -- suggested by Matthew Bell, but really kind of an obvious possible riff, and subsequently proposed by others -- is that we and our super-mind computer friends might use immense new "godlike" powers the way today's teenagers use the spectacular computers in their homes.
To play.
Okay then, picture this…
…the Na'vi are dashing about and flying through Pandora's vivid, colorful forests as kids -- young minds -- immersed in a game. Their true selves are rooted in the planetary mainframe, which manifests at the surface as a white tree. (How very Tolkien-esque!) This could explain why the biology and ethnography and all other features seem exaggerated for effect, including the internet-like rapid communion network that laces everything from the animals to the Tree of Life. Including the way Pandoran creatures can plug in.
If we play along with this post-singularity notion a bit, we realize that Avatar isn’t Dances with Wolves at all! It’s more like Star Trek’s “Errand of Mercy.” In this famous example of a frequent plot in SF, humans encounter a "primitive folk," and don’t understand them. Over time, it is revealed the primitives are actually vastly more advanced people who have decided to live in a rustic manner, either for their own reasons, or in order not to reveal the truth to young races out exploring. In that one memorable episode, the Organians are energy beings who get the Federation and Klingons to stop fighting. One of the recent Star Trek films had a similar theme.
Let's go a bit with this notion that Pandora's biosphere (and "unobtainium") turn out to be the result of a post-singularity super-civilization. Then the story that we all got to watch in Avatar might conceal one of three sub-plots.
1) Visiting humans were the primitives, in technology as well as culture! The "war" was a test, which those who sided with the Na'vi passed on our behalf. It ends with the soldiers/scientists "going back to school."
2) The Na'vi -- helped by Jake -- win the war. They then hit pause and evaluate the terrific game they all just played… only to be horrified! To learn that humans who are killed stay dead! (Their own dead just reboot.) "Why didn't you tell us you were mortal?" they cry out in angst. Though they are also impressed that human warriors would be willing to put so much on the line in battle.
3) The great simulation of Pandora, while beautiful, has a deeper purpose. A real foe is coming. This is training. And humanity is now embroiled, like it or not.
As Matthew Bell put it: "The Colonel’s bomb mission was never going to succeed. The only question was in what subtle way would it be averted. Eywa, or should I say AI-wa, had it worked out well in advance, and sent the seeds to tag Jake Sully so that he could play this role, and thus both find somebody who would be human enough to arrange expulsion of the humans, and also join the Na’vi and fight for their side. Indeed, you could say that Jake was AI-wa’s avatar, or at least instrument, as is clear from the very start."
Yipe! That may be drifting way too far, even for me. After all, despite the many elements that he borrowed, Avatar is James Cameron's story to tell. These are just fannish daydreams, then. My own readers send them to me all the time.
If Mr. Cameron reacts as I do, then he feels flattered and pleased. I am always a sucker for talking story, and for trying to find some new story that breaks with the cliches.
And on that final note, let us bid fond farewell to Planet Pandora and its very very very tall... Until we all get the great pleasure of visiting again, anon.
===
Return to Part I: Perils of Pandora: Why Avatar (Tragically) Fails to Make us Better
35 comments:
Humans are mortal? To a post-singularity mind? heh. I rather doubt it. Just read the device before it decays too much, fill in for the rotten memories with ones that fit well enough, and write a new device.
If Pandora is a post-singularity world, I think the better explanation is the Na'vi are not in on it at that deeper level. Their minds are too small. That makes them security sensing devices at best.
locumranch, under the previous post:
When is the progressive happy? Never.
That makes you a progressive, then? Just sayin'.
The progressive can never rest; his ideal is unattainable; and Continuous Quality Improvement is an unending death march.
No one ever has enough food, sleep, or (all exceptions duly noted) sex, either. No matter how good your last one was, there will come a time when you need more. So what?
Satiability means that even though you might always wish for more, there is a certain level at which you are content with what you have. Desiring further progress does not necessitate that one is miserable in the interim.
Jumper:
Like Adam West in Batman, William Shatner made Star Trek 'camp' & rescued it from NextGen pretentiousness. And, yes, he did challenge God to a fist fight in 'Who Mourns for Adonais?', Season 2.
He challenged a god to a fight in that episode. Not the Almighty, though. I think whoever mentioned "challenging God to a fist fight" was thinking of the movie that had the line "Why would God need a starship?"
And since you know your Star Trek so well, you must remember that the same Kirk you credit with upholding your values is the one who insisted ("This Side of Paradise," season whatever) that man was not meant to just be happy, but rather needs to be striving for a purpose. Again, just sayin'.
There is some credibility to the notion that continuous quality improvement is an unending death march. Get too much quality and it is possible to optimize a process so much that one loses robustness and all hope of anti-fragility. Death arrives in the guise of a black swan.
Surviving organic processes work out a solution that is 'just good enough' to the Resource Problem. Leave room for diversity through copying errors, though, or the tight solution one gets from solving the linear algebra problem will shatter when the environment changes.
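To make that "tight solution shatters" point concrete, here is a toy numerical sketch -- my own construction, assuming a simple least-squares setting, not anything from the thread: an exactly-optimized polynomial fit versus a slightly regularized one, nudged by a small change in the data.

```python
# Toy illustration: an exactly-optimized fit is brittle under a small change
# in the data, while a slightly "looser" regularized fit barely moves.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * x)
A = np.vander(x, 8)          # degree-7 polynomial: one coefficient per data point

def fit(y, ridge=0.0):
    """Least-squares fit, optionally ridge-regularized (leaves a little slack)."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + ridge * np.eye(n), A.T @ y)

tight = fit(y)               # interpolates the data exactly: maximal "quality"
loose = fit(y, ridge=1e-3)   # accepts a tiny misfit in exchange for robustness

y_shift = y + rng.normal(scale=0.01, size=y.size)   # the environment changes a little
print("tight solution moved by:", np.linalg.norm(fit(y_shift) - tight))
print("loose solution moved by:", np.linalg.norm(fit(y_shift, ridge=1e-3) - loose))
```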
What I don't get, though, is why someone who is essentially an existentialist in his outlook thinks anyone can rest. Being doesn't permit that by definition.
I sigh.
Moore's Law is not that computing power doubles every 18 months.
It is that transistor density on an integrated circuit doubles every 18 months. These days that correlates only very loosely with computing power, because the software to exploit it is devilishly hard to get right.
"There is some credibility to the notion that continuous quality improvement is an unending death march. Get too much quality and it is possible to optimize a process so much that one loses robustness and all hope of anti-fragility. Death arrives in the guise of a black swan."
Total nonsense!
Part of increased quality is increased robustness!
When you are doing the continuous improvement process, you are brainstorming about what could go wrong and putting processes in place to catch it.
There is a process that does lead to increased fragility, but it's not "continuous quality improvement"; it's "continuous cost reduction".
And that is the antithesis of quality improvement.
@Duncan: That's a much safer definition for quality. Sounds like we should all be working for you or your employer. 8)
That would make for a good yarn, but post-singularities are getting to be as lazy a device as 'society is dumb' these days.
If I may play along, too?
My take is that, when Grace dies, she is 'invited' to transition into Eye-wah rather than her avatar (pretty clear from the movie, actually). Why? Of course, the trained scientist wants to learn about this marvel she's been studying from afar. It turns out that Eye-wah is also curious, but very shy of these sky strangers. It has never seen their like before. Many sentient beings... acting together, yet strangely apart. (So... what are the Na'vi?)
Jake and the remaining sky people realise that it is more than likely that the Company will be back, and won't be gentle. There needs to be a call for groundswell help from Earth. This will require something more than bravado. It will require diplomacy. It will also require a united front from the clans, which Jake and Neytiri get to organising. Divisions emerge. Jake is embittered about Earth, thinking it and its people ruined beyond hope. Others, like Norm, aren't so sure.
Meanwhile, Norm starts to unearth a mystery. It seems that the first surveys of Pandora recorded the abundant ecosystem, but made no mention of the Na'vi. It was assumed that they'd hidden. However, the Na'vi account is strange, and different from subsequent contact: a dawn legend more in keeping with Na'vi creation myths than something that happened a few decades ago. Gradually, as Grace gains the ability to communicate coherently with the embodied Na'vi, she and Norm compare notes, and they are drawn to an astonishing conclusion.
The Na'vi didn't exist a few decades ago!
Eye-wah controls Pandora, and has been creating the eco-system using principles of emergence and natural selection, and a bit of guidance.
Ultimately, however, Eye-wah is a child, and Pandora is its nursery. On encountering the sky people, Eye-wah tried to make sense of them by making copies of them (adapted to local conditions, but diverging from the hexapedal form it has been using). It has created a society from what it could puzzle out from records on the survey vessel (including *ahem* a few well known stories about Indians). It has tried to provide the Na'vi with a sense of independence, but the result is a bit aloof. Still working from those stories, it tagged Jake to play the role of Captain John Smith.
And now Eye-wah wants to talk to Grace directly. It wants to learn, and it wants help. The sky people have *hurt* it, with their mining. Can Grace and the rest of the sky people help?
Of course they can, but that won't make for a good story. As a complicating factor, Eye-wah's parent(s) check in on their child. They are not impressed with brain damaging *parasites*. Earth had better watch out.
I haven't really tried to weave this into a coherent plot, just tried to address some of the more obvious holes in the original. Hope it enjoyably fires a few neurones elsewhere.
cool & deep, Tony... and not violent enough for a flick!
Perhaps not of itself, but I think there's enough room in three sequels to insinuate something like this *and* have a bit of the ol' ultra. It would certainly provide enough controversy to encourage factions.
One more post-Singularity scenario, sort of:
The Na'vi, Eye-wah, and the entire Pandoran ecosystem were created by a culture with post-Singularity technology ... and that culture is on Polyphemus, the gas giant that Pandora orbits. The Na'vi were created for their entertainment, and Eye-wah is the communications infrastructure.
How would the proud Na'vi react if they were to find out that they are playthings?
Hmmm ... After all this time, I've finally realized what Pandora's ecosystem reminds me of:
John Varley's Titan.
Hi Alfred
I learned about quality working for Cummins
Cummins had paid to have a "Continuous Improvement" system set up and integrated into their manufacturing systems
It cost in the tens of millions, and by the time I was there it had saved hundreds of millions.
Unfortunately the very senior management was replacing it with a badly thought out "Six Sigma" system
Probably so that they would have their "names" on the system
A core part of a continuous improvement system is the Design/Process FMEA: Failure Mode & Effects Analysis.
In an FMEA you brainstorm what could possibly go wrong! - and then estimate (guess) the likelihood, possible damage and the chances of detection
We managed to get the plant supplying the Dodge Ram down to 42 repairs per million
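For anyone curious what that FMEA arithmetic looks like, here is a minimal sketch of the conventional Risk Priority Number scoring: severity, occurrence, and detectability each rated 1-10, then multiplied. The failure modes and numbers below are invented examples, not Cummins data.

```python
# Minimal FMEA scoring sketch: rank failure modes by Risk Priority Number.
# Scores run 1-10; for "detection" a HIGHER score means harder to detect.
# These failure modes and scores are made-up examples, not real plant data.
failure_modes = [
    # (description,                          severity, occurrence, detection)
    ("ECU misreads a sensor signal",             8,         3,         6),
    ("New unit won't fit the production line",   5,         4,         2),
    ("Gasket sealant applied unevenly",          6,         5,         7),
]

ranked = sorted(
    ((sev * occ * det, desc) for desc, sev, occ, det in failure_modes),
    reverse=True,
)
for rpn, desc in ranked:
    print(f"RPN {rpn:4d}  {desc}")   # work the biggest numbers first
```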
My vote is: Tony's done it.
The fact that you mentioned The Matrix (laughable) and left out Dan Simmons' Hyperion series in your AI factor made me stop reading the rest of the post. You left out many (and better) stories of AI/Human relations.
A very good thread, overall, replete with engaging ideas and pleasant banter, yet I feel that I have been 'left below' with all this talk of 'post-singularity' cultures.
What does 'post-singularity' consist of, pray tell?
As a result of my limited intellect, I have enough trouble with the Singularity concept (with its causal collapse & 'wishes as fishes' whatnot), so I simply cannot conceive of that which comes after such an all-encompassing Oneness.
I am beginning to suspect, however, that the post-singularity concept (and its singular parent) are mere phantasms, like so many soft progressive endpoints, seeming to vanish on arrival, only to reappear on the horizon, like the mirage or will-o'-the-wisp that begs us follow.
"Follow me", the generic prophet says, "Untold knowledge, sex, wealth & pleasure (etc) awaits you a short ways down this primrose path ASSUMING that you remain silent, be-good, be-have, work hard and respect my authority", yet the road is longer than advertised and the goal never seems to materialise, until it becomes increasingly clear that Our Singular Purpose is only to 'be led', irregardless of direction or destination.
Enough is enough, and I'm all for sitting this one out. I am tired, as are my companions, and we will follow the will-o'-the-wisp no longer.
Promises have been made, only to be reneged on, and we are angry, angry that we have been misled, angrier still that it was we who first chose to follow (perhaps our last free act), and we are tempted (sorely tempted) to pull this false temple down upon our shorn & sightless heads as a last act of rebellion and/or contrition.
Singularity achieved, and 'post-singularity' is best defined as 'what comes after'.
Best
Jennifer B, your mental smallness is not my problem. But did you have to come on and actually brag about it? Oh, by all means.
locum we remain... yawn... vaguely (though without much hope) waiting for what YOU envision as a desired human course across the next ten, hundred, thousand and million years.
Margaret Mead once said that those who lived through the year 1945 were "immigrants in time." They had stayed in the same place, but had been transported into an entirely different world in less than a year. Most people essentially went from the world of the 1930s to the world of the early 1950s within a few months.
A singularity will probably be a lot like that, except that the changes will be greater, and occur more rapidly, than those experienced in 1945.
There may be several of these technological singularities in the 21st century. The most expected singularity is when AI exceeds human-level intelligence. There may be another when Drexler's general purpose nano-assemblers can suddenly be mass produced. There may be another when special forms of carbon become the ideal building material that replaces nearly everything else. (We'll need all that coal, and people will think we were crazy for burning it in the past.) There may be yet another singularity when genetic engineering suddenly becomes routine, and healthy human lifespans suddenly become indefinitely long.
We just all have to be prepared to become "immigrants in time" on many different occasions.
It is unlikely that the "singularity" will actually be a single event.
A projection of "a desired human course across then next (10 to the nth) years" is a teleological (and/or theological) construct that reflects individual ideology more than probability.
What I predict (based on my individual but less than unique ideological underpinning) is imminent collapse, a rejection of the prevalent 'more equals better' social assumption, followed by reestablishment of smaller human (plenipotent) populations.
Progressive incrementalism will be abandoned, along with the belief that major progress can be achieved by the mere (additive) accumulation of minor refinement, and human subgroups will be free to diverge from the mean & pursue alternative fates.
Most subgroups will stagnate (plateau), satisfied by mere subsistence, hopefully to serve as genetic custodians. Some will pass (extinction); some will bootstrap themselves into space despite current technological limitations; others will evolve (by accidental or deliberate mutation) to become 'post-human'; and still others will pursue increasingly obscure technologies.
The human future, then, will not spring from consensus, globalism, 'The New Conformity' or a progressive 'coming together', but from social fracture (chaos), just as Darwin postulated geographical isolation as the necessary requirement for bird speciation.
The human (and/or post-human) future is without number, yet the future is not ours to choose.
The present cannot choose the future.
Best
Locumranch, what you postulate seems to be what was the subject of Eric Frank Russell's novel, The Great Explosion.
In The Great Explosion, a technological breakthrough in space travel allows human groups to all go their own different ways and establish their own unique societies in complete isolation from other humans.
The story is based upon what happens when the monolithic culture of Earth tries, after several centuries, to establish "diplomatic relations" with all of these groups that have elected to go their own way. Things do not work out well for the bureaucratic diplomats.
See? He cannot do it. Express his own actual hopes or prescriptions. Just armwaves at the most general sci fi cliches imaginable.
Pandora as a post-singularity world has been discussed on Tumblr and even on Facebook for some time, since at least the DVD release, and probably in non-virtual conversations as well.
Why would a superintelligent AI be a threat to humanity? If it can live forever and doesn't need stuff like oxygen or water or food - just electricity - why wouldn't it head out to the stars? Why would it stay anywhere near earth?
My concern remains human minds living forever in computers, the uber-wealthy making decisions for mere mortals for all time, becoming ever less moral and ethical as they play their game of thrones, and we're all pawns. Some humans, to have access to power and resources and relative safety, become their worker bees, tending the computer consciousness, but the rest are expendable. This has been an idea others and I have had for the Lords of Kobol a-la BSG-75, computer minds that created humans and cylons and maybe other fantastical creatures (dragons, elves, dwarves, orcs) for their eternal amusement - even downloading temporary consciousness into avatars (a-la Kiln People) to walk amongst them, not quite mortal.
Everything locumranch fears happens routinely all the time already, and has for all of human history. If your valley gets wiped out in a flood, shit "collapsed" for you. Famines were routine everywhere just a few years ago; now they are only "somewhere" at any given time.
"Collapse" is made dramatic in novels, because drama is the name of the game. Actual reality is bad, but apparently not in locum's neighborhood, or it is but some dim longing for "purity" of the past is confused with locum's longing for the simplicity of his own childhood mind, which is, god knows, common enough among pessimists and misanthropes and the damned and the doomed, so it's not a rarity.
Who are the survivors when no one gets out alive? Why is despair the correct option on facing certain death? How is it distinguishable from the cowardice of the simply fearful?
I wish that James Cameron would abandon Avatar, and make a motion picture trilogy out of his immediately-prior work of fiction.
I know that is never going to happen because Avatar was too popular, far too financially successful, and (apparently) too important to James Cameron himself.
Cameron's television series Dark Angel has none of the flaws of Avatar. Because it was a drawn-out television series, the basic 5000-year-long mythology behind Dark Angel went over the heads of most viewers, and the series was cancelled before the story line was really completed.
In Dark Angel, and especially in the final expanded episode that James Cameron personally directed, people do mature and improve during the course of the show and many characters lose their previous prejudices and bigotry. Also, the narration by the writers on the DVD for the last episode reveals what was to be the conclusion of the story line.
A genius of human genetic engineering, shown only in the second season of Dark Angel in a 3-second flashback, has engineered one of the humans he created so that her genome will provide antibodies against an upcoming comet-deposited viral plague. Those antibodies will save everyone from the comet-deposited viral material. An elite cult, however, wants to make sure that only the elite cultists survive the coming plague.
Cameron's Dark Angel has a deep 5000-year mythology, but is all about the science and technology that is being developed right now.
The setting of Dark Angel is in the years 2019-2021. Most people can easily suspend their disbelief with respect to stories in the distant future. Too often, though, tales of the near future seem too incredible to be taken seriously.
Thank you Duncan.
Unfortunately, I had a brief experience with Cummins in late 2004. They had a disaster on their hands because their CRM product (customer facing) had fallen out of support and was hopelessly out of date. Their in-sourced IT contractor was hopelessly lost too. They didn't understand the product well enough to know a previous database migration had destroyed the indexes on all their major tables. What a mess. I set up new indexes and then bugged out of there within 3 weeks. 8)
Someone optimized their IT processes too much. They were fragile and broken when I was there. Hopefully they've learned, but I didn't want to be the one to teach them.
I wish people understood the 'quality' term the way you describe it instead of the usual buzzword nonsense I encounter. 8)
@locumranch: 'Left below' is a pretty good description of your thoughts regarding the singularity. Don't read this as me picking on you personally, but you really oughta read a few stories regarding near-singularity worlds before you say more. Drop your ever-present effort to connect things to those evil progressives, though. They aren't relevant.
The key thing to bring to those stories is that pre-singularity beings with minds roughly the size of a human are indistinguishable from devices we could make to a post-singularity mind.
How long does it take a student to learn enough in college and med school to become a competent doctor? How long would it take a fish? THAT's the problem in a nutshell. Our analogies all break at the singularity.
@Jerry: I like the notion of the post-1945 world being viewed as a kind of singularity, but it might be better to borrow from the science jargon and think of it as a phase change. Instead of getting a different kind of thing, it's just organized differently. That displaces the singularity to a derivative on something equivalent to a specific heat capacity.
When Google first made their n-gram system available I got all nerdy and immersed myself in it. I remember clearly seeing the explosion that happened in the English language with respect to the size of the vocabulary in the post-1945 world. I remember thinking at the time that something really odd must have happened and it couldn't just be that the war was over. It hadn't happened after WWI... so what could it be? In the end, I think it was something very small. Lots of people started using English. Kaboom!
Your idea of lots of small singularity events would fit with more phase changes, but I suspect we will eventually get to a point where we can't recognize the stuff that is changing as what it was before. Add enough energy to a solid and it eventually ionizes, right? 8)
@Alfred: The explosion in the use of English was part of the dramatic changes in 1945. Probably more important, though, were the 16 years of unprecedented technological development that had not made it to the general market until that year.
Also, there was a huge wartime industrial capacity that had to be rapidly converted to civilian use (along with a willingness to actually make that conversion). The combination of the Bretton Woods monetary system and the "pent-up demand" which abruptly increased the velocity of money was also an important factor.
Even more important, all of these things combined to suddenly give people hope, which hadn't been there for many years before 1945.
onward
Hi Alfred
I left Cummins in 2001 - but I had experience of their IT stuff
They used to have a half decent process for making any changes - mechanical ones that is
Any number of "improvements" have caused major problems in production - something as simple as the new unit won't fit on part of the production line
The software guys refused to buy into that process
They would update the main servers with no warning
One day I had a panicked call from our North Carolina operation
- A certain type of engine would not go to full power - all of them were failing
We had nearly 100 engines on the floor as failed tests.
The root cause was a software change - the engines were for buses, and the ECU was looking for a "door closed" signal!
And nobody had told us!!!!
Software!!!!!
heh. Yah.
There is the kind of software that gets sold in hardware that really, really has to work. Then there is the kind of software that goes on hardware that can be rebooted and the customer won't be upset. Hire a guy used to writing the second kind on a team that writes the first kind and you get a mess. If management can't tell the difference between the two, sell all the shares you have in that company.
//*How would the proud Na'vi react if they were to find out that they are playthings?*//
Ask Buzz Lightyear ;-)