I was invited by Astronaut Nicholas Patrick to attend his coming shuttle launch... one of the rare night takeoffs. I am sure I will kick myself for decades for turning down this spectacular opportunity to watch one of the greatest sights on (or off) Earth! But my Google trip, plus other matters, made it too hard to drop everything and go.
Nick is carrying draft copies of some of my chapters and stories into orbit with him, though. So I will be along for the ride.
(Another astronaut, Mike Foale, once left a manuscript of mine in the abandoned Spektr module on Mir, on the day a cargo ship collided with the station, and there it languished for a decade before burning up. A couple of days later, Mike’s wife asked me to FedEx a replacement, to be sent up with a new toothbrush and sleeping bag. To this day, I am the only author to have his work sent into space as “emergency cargo.”)
In any event, to Nick and his comrades, Godspeed and much success! Return home safely. You are the slender grasp that we still have upon our dreams.
=== === ===
In other related news: yesterday, NASA released photographs that reveal bright new deposits in two gullies on Mars, suggesting that water has flowed there in brief spurts within the last seven years. The Planetary Society congratulates the Mars Global Surveyor team for yet another significant scientific discovery. The fact that Mars Global Surveyor lasted so far beyond its projected lifetime made possible this type of discovery, which requires observing the same area over and over again. If this finding holds, it is very significant. Only a few years ago, the common belief was that liquid water last flowed on Mars over a billion years ago. Now we see evidence that liquid water may be flowing today and may currently exist in the subsurface. Glimmers of possibility.
Then there is the plan to “return to the moon” by 2020, only fifty or so years after we left. I once worked for James Arnold, who predicted that there would be ice in permanently shadowed craters at the lunar south pole, where they are now thinking of planting the base. I heartily approve.
And I wish we could add one month’s Iraq War budget to this endeavor, in order to make other fine dreams come true.
And another month’s worth for energy research. And another to study climate change. And another vs our kids’ mounting debt. And another to rebuild the reserves. And another for readiness. All the things we have neglected in favor of the war game of a pack of nasty little boys.
Heck. At $20 million a day, let’s take ONE DAY’s worth of Iraq wastage and just give a million dollars to twenty Americans. It would do more good.
But no, let’s get back to science....
=== === ===
When Vernor Vinge and I had dinner with investment guru John Mauldin and his son back in July, we discussed my opinion that -- even as raw hardware -- the brain is radically underestimated. I feel, for example, that Artificial Intelligence (AI) zealots like Ray Kurzweil steeply underestimate the brain’s computational power, and therefore the difficulty we face in matching it.
For one thing, the main estimates for “computer break-even” are based upon the assumption that our brains compute only at synapses. These flashy junctures between axons and dendrites are said to play a role somewhat analogous to the on-off nature of binary switches in a computer. And so, extrapolating according to Moore’s Law, AI optimists estimate that a supercomputer will have the same number of “switches” as a human brain has synapses, around the year 2025 or so.
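That 2025 figure comes from simple doubling arithmetic, and it is worth seeing how sensitive the arithmetic is to its starting assumptions. Here is a hedged back-of-envelope sketch; the synapse count, the 2006 switch count, and the doubling period are all rough guesses, not measurements:

```python
import math

# All three figures below are rough assumptions, not measurements.
SYNAPSES = 1e14          # common ballpark: ~100 trillion synapses in a brain
SWITCHES_2006 = 1e11     # guessed aggregate switch count of a 2006 supercomputer
DOUBLING_YEARS = 2.0     # classic Moore's Law doubling period

doublings = math.log2(SYNAPSES / SWITCHES_2006)
year = 2006 + doublings * DOUBLING_YEARS
print(round(year))       # lands in the mid-2020s under these assumptions
```

Shift any one of those assumptions by a factor of ten and the break-even year moves by nearly seven years, which is reason enough to hold such projections loosely.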
Ahem, well for one thing, that won’t make a bit of difference if advances in SOFTWARE don’t rapidly catch up. (My Google visit had something to do with hopeful ways to accomplish that.)
And yet, even if software advances are prodigious... and assuming that a lot of our brain power is redundant or “wasted”... even so, I am a bit dubious. This skepticism is based upon my own crackpot notion that synapses aren’t everything.
Yes, they can reconfigure and re-wire and strengthen or weaken and do many non-linear things that binary switches cannot. That alone is reason to expect we’ll need MORE binary switches than we have synapses. Maybe many more.
But even that isn’t my chief reason.
The way I see it, this model neglects to consider the neuron itself, as a -- well -- cellular automaton... or independent calculating entity... that follows its own complex set of rules in reacting to environmental stimuli. Stimuli that include the state and actions of its neighbors, but also chemical washes in the surrounding substrate and so on. There may be hundreds of complex rule sets, each of them interacting with the others non-linearly and mediated/moderated by INTRACELLULAR structures that we, even now, know very little about.
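That picture of a neuron as a rule-following automaton can be put into toy form. The sketch below is pure illustration with no biological fidelity claimed: each cell carries a firing state plus an index into a set of update rules, reacts to its neighbors, and can have its rule switched by a crude "chemical wash" bias term. Every name and number here is invented:

```python
import random

random.seed(1)  # deterministic toy run

def make_rules(n_rules):
    # each rule: fire iff (active neighbors + bias) reaches its threshold
    return [dict(threshold=random.randint(1, 3)) for _ in range(n_rules)]

def step(cells, rules, bias):
    new = []
    for i, cell in enumerate(cells):
        neighbors = cells[i - 1]["fire"] + cells[(i + 1) % len(cells)]["fire"]
        rule = rules[cell["rule"]]
        fire = int(neighbors + bias >= rule["threshold"])
        # a strong "chemical wash" can also switch which rule the cell follows
        rule_idx = (cell["rule"] + 1) % len(rules) if bias > 0.5 else cell["rule"]
        new.append(dict(fire=fire, rule=rule_idx))
    return new

rules = make_rules(4)
cells = [dict(fire=random.randint(0, 1), rule=random.randint(0, 3))
         for _ in range(8)]
for t in range(3):
    cells = step(cells, rules, bias=0.6 if t == 1 else 0.0)
print([c["fire"] for c in cells])
```

The point of the toy is only this: once a cell's behavior depends on an internal rule index as well as its neighbors, its repertoire grows far beyond what the wiring diagram alone suggests.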
If I am right about this, each neuron could have thousands of potential inner conditions!
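A quick combinatorial check of that guess (the counts are assumptions, not data): if a neuron runs on the order of a hundred interacting rule sets, then even in the crudest model -- each rule set merely on or off -- the internal state space dwarfs “thousands”:

```python
N_RULE_SETS = 100   # "there may be hundreds of complex rule sets"
STATES_EACH = 2     # crudest possible case: each rule set simply on or off

inner_conditions = STATES_EACH ** N_RULE_SETS
print(inner_conditions)   # 2**100, about 1.3e30 -- far more than "thousands"
```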
Moreover, it makes some sense that, while synapses may be important -- firing a “standing wave” of consciousness and rapid calculation -- memory itself (at least the long term kind) has no business being stored in transitory flashes.
A much more likely place for long term memory to be stored would be within the neurons themselves. (Consider, people who receive powerful electric shocks probably experience disruption of vast numbers of neuron firings. Yet many of them recover without severe long term memory loss, even if short term losses are severe.)
Vernor responded to my hypothesis with skepticism. And yet, always honest and openminded, he wrote to me recently that “... I ran across researcher(s) who figure that even a single neuron might have the computational competence of a supercomputer.” Here is the reference:
Rasmussen, S. _et al._, "Computational Connectionism within Neurons: a Model of Cytoskeletal Automata Subserving Neural Networks", in _Emergent Computation_, Stephanie Forrest, ed., pp428-449, MIT Press, 1991.
Yipes. I never claimed exactly supercomputer status for neurons. Nevertheless, I have long believed that the available neuronal response set is not JUST in the 1-to-1,000 synapses that they link to. Those synapses don't change position all that often. And they don't add up to enough variability to explain the depth and richness of human memory... at least, not by my figuring.
Reiterating: what could switch very rapidly is the set of internal rules followed by each neuron itself. There needn’t be many of these variable rule sets -- say, a hundred -- for the combined flexibility of response to be tremendous. Maybe on the scale of an old hand calculator. Per cell. If so, zowee.
And it means that calculations showing digital computers reaching our level of computational power by 2025 are way off -- maybe by a century!
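How far the break-even date slips depends on how much extra machinery you grant each neuron, and under Moore's-Law doubling the relationship is logarithmic. A hedged sketch, with illustrative per-neuron multipliers that are guesses rather than measurements:

```python
import math

DOUBLING_YEARS = 2.0  # assumed Moore's Law doubling period

# Illustrative per-neuron multipliers: modest intracellular machinery,
# roughly hand-calculator scale, and a supercomputer-scale claim.
for per_neuron_multiplier in (1e3, 1e6, 1e12):
    delay = DOUBLING_YEARS * math.log2(per_neuron_multiplier)
    print(f"x{per_neuron_multiplier:.0e} per neuron -> ~{delay:.0f} extra years")
```

Under these assumptions, a thousandfold multiplier per neuron adds about twenty years, and a supercomputer-per-neuron adds about eighty -- so "off by a century" is within the plausible range, though hardly proven.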
(Dang, that coulda made a good commentary column in a journal like WIRED and I’d’a got paid for it. Some folks deliver less than that. I hope you guys are appreciative.)
----- FINAL RIFF ----
From time to time, the topic of Education comes up. And people sure have strong opinions!
One of my own endeavors has been to help promote SCIENCE FICTION as a resource and helper in stimulating agile thinking in today’s students.
Are any of you at all interested in this?
The latest effort can be viewed at the AboutSF site, where a variety of lesson plans and other resources are now gathered in a convenient place.
One example curriculum site that was developed for The Postman is way cool. (For the sake of safe archiving, would anyone care to file away a source code backup of this entire site? Just asking. I think the creator is retiring. Would be a pity if it vanished.)
I confess I helped to fund and establish AboutSF. Look especially at the Speculation Speakers portion, which can get great authors to speak at your local (or national) events.
(Of course, if it is a BIG event that can afford a top-rated national speaker.....)
Back to education and science fiction.... Hey, we could really use skilled volunteers. (Especially teachers!) Those who are interested in getting active in this effort are invited to visit the “Reading For The Future” web site. http://readingforfuture.com/
See also a collection of articles on my website related to teaching Science Fiction.