Thursday, March 16, 2023

One page screenplay explains the 1980s! "The Bargain."

Okay, here's another 'shut up and play your guitar' posting. 

I originally submitted two scripts to the 2016 One Page Screenplay competition in LA. The first one -- "Bargain" -- won the contest! The other one placed.

The short script for "Bargain" is copied below. But you can read along while it is performed (a clickable video reading). The role of Ronald Reagan was delightfully performed by Peter Nelson. Poor sound quality, but nicely done.

Definitely sci fi... and it explains two mysteries from the 1980s... the weird US decision to invest in absurd Space Shuttles... and the fall of the USSR... a coincidence that's finally explained! And sure, this'd make a great short flick.

And yeah, Blogger won't take proper Final Cut script formatting. So sue me. Or else... enjoy!

=====

BARGAIN

by

David Brin


EXT. THE WHITE HOUSE - NIGHT

We zoom closer during credits, glimpsing hints that this is the 1980s.

INT. THE WHITE HOUSE - NIGHT

A BUTLER puts a silver tray and mug on a coffee table before RONALD REAGAN.

RONALD REAGAN    Ah, two marshmallows. Thank you, Benson. Do close the door as you leave.

BUTLER    Yes, Mr. President.

A big, old, mahogany-boxed TV announces “The Tonight Show with Johnny Carson!” Reagan leans in, switching to a channel that seems all static. Highlights flicker across his face.

RONALD REAGAN      It’s me. I’m getting your signal much better now that Carter’s damn solar panels are off the White House roof. We’ll nip that fad in the bud.

The TV flickers. Out of the static, a WARBLING SOUND seems almost like an eerie VOICE.

REAGAN  Yeah. The deal you offer. . . might some call it. . .well. . .kinda treason?

The flickers accompany uncanny, static TV VOICE tones that seem to dismiss that likelihood.

REAGAN       Easy for you to say! You won’t be down here, taking heat if the press finds out. Like the Marine barracks bombing. . . or Iran-Contra.

The staticky TV VOICE offers reassuring tones.

REAGAN      Only my friends call me ‘Gipper!’  You guys backed the commies! Without your economic and technical support, the Soviets would have collapsed long ago!

The VOICE warbling from the TV sounds ominous.

REAGAN        Don’t you dare try threats on me! Sure, you could trigger a war down here. That’d keep us out of space. But other aliens would notice! Genocide is against --

Now the TV VOICE comes across as soothing.

REAGAN     Well.  Okay. It’s a deal. You’ll pull the rug out from under the Commies. . . and I’ll take down the American space program. Fritter it away on 'shuttles.'

The TV VOICE sounds agreeable. Maybe smug.

REAGAN       But won’t someone add two plus two?  Connect the dots? 

The TV VOICE is cajoling now.

REAGAN    Yeah, they’d just call this a figment of senility. You’re right about us humans. Gullible to the end. (a beat) ... Now tell me more about this thing called “Reality TV.”

    ...Who are these Cardassian aliens, again?


            THE END


©2016 David Brin

46 comments:

Larry Hart said...

Heh.

The punchline.

But neither that spacefaring race nor the family their name resembles were of the Reagan years, were they?

Dwight Williams said...

Someone leaked something to Rick Berman and his crew...

:-)

scidata said...

The University of Rochester is calling its new superconductor "Reddmatter".
https://newatlas.com/materials/reddmatter-room-temperature-superconductivity/

Alfred Differ said...

scidata, (from last thread)

I think the step toward HCI was absolutely critical for driving up adoption of computing BY humans. Without the billions of us now using them, you don't get the millions who might be interested in comprehending how they work.

______

I think of this as the step needed for the construction of centaurs. I am more employable as a combination of human and computer. Pair me with decent technology and I'm much more effective at certain kinds of problems.

Thing is, the centaur's front end doesn't have to know how the back end works for the whole animal to function. It would help to understand, but it isn't necessary. Much as children with hammers don't have to understand how they really work, the hammer still extends them. With time they learn how not to injure themselves or bend nails. With enough time, they self-teach and become carpenter apprentices.

I'd like to live in a world where Johnny Can Code, but I'll settle for a world of centaurs capable of feats impossible to mere humans. That's why I don't think the HCI 'diversion' qualifies for a Fermi paradox explanation. 8)

David Brin said...

Centaurs. Um, extra encephalization to handle 6 limbs?

Um, where, what genitalia? Both? Only horse-on-horse is not ... um, awkward. But those Elgin Marbles...

scidata said...

Alfred Differ: HCI was absolutely critical

I did say 'ambivalent'. I actually have a website dedicated to HCI. The problem is that many more zombies have been created than centaurs.

Tony Fisk said...

I did an online course in HCI, and discovered it to be a gateway drug to statistical analysis (i.e. the crowd knows what it wants better than the designer). This may well be the underlying issue with 'algorithms'.

Alfred Differ said...

Ha! They obviously have to go on the horsey end. 8)

As with most mythological combo beings, they don't make physical sense. Can you imagine trying to eat enough grass, hay, or whatever to make use of that giant gut? Hominids gave all that up to get enough blood flowing to the brain for at least a moderate IQ. Still... they are fun to think about.

The modern example of a centaur (that actually gets called by that name) is a human chess player teamed with a computer in tournament play against other teams. A decent centaur will whip human masters of the game and has a strong edge on pure computer players. The human contribution is most evident in the middle game where heuristics dominate databases. Humans process heuristics fast, so clock-driven play is where this shows up.

scidata,

I hear you wrt zombies, but this isn't new to our civilization. Cheap gin did a number on the British in the early days of industrialization. Cheap alcohol is still a maker of zombies today, but we've been adapting socially.

I know a guy who has been doing 3D art for many years. If you want to see a whole lot of zombies, check out what people want when pursuing their fetishes. Or... maybe don't. The reason I mention him is that he's got just enough coding and admin experience to set up one of the recent AI art tools. He's playing with the one he got going and pointing out to others in his circle of friends that it's not that hard. He argues that it's actually easier to learn how to install and maintain (care and feed) an AI than it is to learn the 3D drawing tools he's been using for years. All that painstaking work gets him more of what he wants from the 3D tools, but the AIs are easier for people breaking in.

What that guy is doing... without calling it such... is encouraging the development of centaurs. He installed the AI tools out of curiosity and started learning. That's a Johnny-CAN-Code kind of behavior!

------

The real bargain being struck today involves how we decide to extend ourselves. Do you want cosmetic surgery... or computational agents? Both have the potential to make one a more viable reproducer of one's genetics, but one of them could change the world permanently.

Alfred Differ said...

Tony,

...the crowd knows what it wants better...

I almost always agree, but I've lived long enough to see a couple of exceptions.

1) The crowd can have a difficult time envisioning big changes it does not realize it wants. Sea changes. Most are terrible, but the crowd points that out quickly.

2) Very rarely, a designer is right. Apple had such a guy. (Not Jobs. The guy Jobs hired.) They are super-duper rare.

scidata said...

Charles Babbage, Ada Lovelace, Alan Turing, John von Neumann, Hans Bethe, and John Kemeny did just fine without a mouse. I'd bet that Alfred Differ would have too. Maybe my austere Presbyterian upbringing is showing, but this is why BNW scares me more than 1984 does.

The new GPT-4 can beat 90% of humans on SAT and bar exams. The Great Replacement may now have nothing to do with race or culture. Asimov once said that he doesn't fear computers, he fears the lack of them. Well, I feel the same way about computational thinking. We are in great peril of amusing ourselves to death, which is why I mentioned the Fermi paradox. BTW, that's not my own invention as a Fermi solution; others have already postulated that we may simply vanish into cyberspace.

Centaur chess takes one down the road of Vernor Vinge and Garry Kasparov. I think I've mentioned my own encounter with Peter Jennings, the physicist who created MicroChess (a human-level chess player in 1.5k of 6502 machine code). Amazing.

Acacia H. said...

I actually took a stab at this in the Roleplaying Game System I've been designing in my spare time. Centaurs are a magical species that exist as a humanoid and an equine separately, but in times of great danger the two merge to become the "Centaur" of myth, larger and more powerful than either species alone, and able to wield weapons and some armor. Theirs is a spiritual partnership, and if one of the two is slain, the other will form a spiritual centaur-like being that goes off to fight and avenge its spirit-mate against whatever being killed it, before ultimately fading away.

(One thing I decided on for the game was that rather than a natural 20 on a d20 roll being a Critical Success, a success is based on rolling a 21 or higher, with each additional success based on every 7 above 21. Any roll of exactly 21 or increments of 7 above that is a Critical and worth an additional success. So while rolling high is nice, rolling an exact modified 21 (or 28 or whatever) is even better.)
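A rough sketch of how that success count works out, assuming the "modified roll" means the d20 result plus modifiers (Python, names purely illustrative, and just one reading of the rule above):

```python
import random

def count_successes(modified_roll: int) -> int:
    """0 successes below 21; 1 success at 21+, plus 1 for every full 7 above 21;
    a roll of exactly 21, 28, 35, ... is a Critical worth one additional success."""
    if modified_roll < 21:
        return 0
    successes = 1 + (modified_roll - 21) // 7
    if (modified_roll - 21) % 7 == 0:   # landed exactly on 21 or an increment of 7 above it
        successes += 1                  # Critical: worth an additional success
    return successes

# Example: a natural 15 with a +6 modifier gives a modified 21 -> 2 successes (a Critical).
print(count_successes(random.randint(1, 20) + 6))
```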

Actually, I'd been basing some of the creatures of myth more on the actual myths themselves. For instance, Gorgons are women cursed (or blessed?) by a God, and it is claimed only a woman can kill them - at which point the curse moves over to the woman who slew the Gorgon and she becomes the next Gorgon. The story of Medusa (and her sisters) was said to be a curse (or blessing to help protect Medusa, depending on the story) laid by Athena after Medusa's rape in a temple of Athena.

After all, if you're going to take creatures of myth to craft into your monsters, then SHOW SOME RESPECT FOR THE STORY. And in some cases (such as a certain "cannibal" creature with a name starting with W, which the Algonquin people consider something you should not name), don't use them at all (or at the very least don't name them).

Acacia H.

Tony Fisk said...

Centaurs?
John Varley had his with dangly bits both ways.
(The entity controlling the Titan habitat was bored)
Robert Heinlein had Starman Jones encounter some carnivorous ones (but the story doesn't get to *that* logical conclusion!)
As for the augmented variety, it may not be four legged, but I always have an urge to give an air punch whenever I see Ellen Ripley walk out into the hangar of the Sulaco to assist Newt.
(Such constructs exist, even ones that are human powered only.)

Larry Hart said...

Alfred Differ:

Can you imagine trying to eat enough grass, hay, or whatever to make use of that giant gut?


The 1990s TV show based on DC Comics's "The Flash" paid homage to the idea that the hero was hungry all the time because of all the calories he burned moving at super speed. In typical TV fashion, this quickly became a comic relief trope. Still.

GMT -5 8032 said...

Get away from her you bench! Loved it in 1986. Still love it.

Larry Hart said...

scidata:

but this is why BNW scares me more than 1984 does.


1984 is scarier to contemplate actually living in, but until recently I didn't think it was any more likely to come about than, say, The Handmaid's Tale. Unfortunately, while that is still the case, neither is as improbable as I once imagined.

But Brave New World is scarier in the contemplation of how close to that reality we already live in.

To sum up, 1984 has a scarier premise, but Brave New World is a scarier prediction.

scidata said...

Re: BNW

Both 1984 and Handmaid's Tale have an 'us vs them' vibe (admittedly psychological, insidious, and frightening). But BNW has a more 'us vs us' theme. The id has scared the daylights out of me since FORBIDDEN PLANET.

Larry Hart said...

scidata:

The id has scared the daylights out of me since FORBIDDEN PLANET.


"The number ten, raised almost literally to the power of infinity!"

Larry Hart said...

@scidata,

Unlike Forbidden Planet, the id in Brave New World was unleashed as a means, not as the end in itself.

BNW kinda sorta answers the same question that Soylent Green does, namely "With all of these people alive, how do we keep them fed?" BNW's answer was that the people all had to work to tend the machines which produce the stuff of life, and that people themselves had to be engineered to do that work and like it. The soporifics which made people content with their lot were part of that.

Soylent Green took a shortcut to assure that the food supply automatically increased with the population.

Alan Brooks said...

A tiny bit of very concentrated flavoring, and it’s just like a milkshake.

Unknown said...

Lois Bujold, among others, noted that a long-term space station will recycle ALL organics, but she suggests that because people get a bit queasy about eating Gramps, corpses are broken down further than sewage and other organic material - there's a nice bit about that in "Ethan of Athos".

Pappenheimer

Alan Brooks said...

We’ve all ingested much worse at one time or other.

Unknown said...

Regarding centaur anatomy, the nearly entirely NSFW webcomic Oglaf offers a solution with its story, "Heterogenous."

Pappenheimer

Alfred Differ said...

BNW is way more scary than the others. The 'bad guy' in it is actually smart. That's the root of it.

scidata,

I learned to code back in the 70's while in high school, bought my first personal computer (a PET) in '80(?), and inherited an x286 in the very early 90's. All mouseless. However, I saw my first MAC in '85(?) and hyperventilated. That MAC belonged to the university and I had to break into the lab where they kept it. (Altering your TA key to match the profile of the professors is technically 'breaking in.') I wrote my dissertation on a different MAC and then graduated to a world where I honestly didn't care to use a computer to do more than write.

Then I met the woman I eventually married. She had a Windows machine and I started to learn it. I still don't like Windows to this day, but it got me back into the IT space. I got an early Linux distro and tried to create a dual boot environment for it, but only because I didn't like the way Windows handled memory. Once I got one working, I turned on Gnome and did NOT go back to a mouseless world.

Yes. I can do it, but it is not the way most humans think. For me to be a good centaur, I should make use of the strengths of each of my parts. The human in me is visual, spatial, and geometric. I throw things to hit other things. My deep ancestors pondered, pointed, and plundered.

———

Nothing to fret about with me though. I'm from a generation that did see coding examples and was encouraged to think algorithmically. The guy setting up the AI to assist his art is about my age too. Our host's WJCC issue is still spot on. It's just that I'm not as concerned because I think the law of big numbers will work for us. With billions of computer users in the world now, diversity will get us to a better world. If computational comprehension proves to be useful, exemplars will discover it for themselves in each generation. I think it is, but what matters is that the kids and their kids realize it. For themselves.

William said...

Kinda the same premise as "Senses Three and Six", I think? Maybe not quite the same universe, but similar zookeeping aliens.

scidata said...

Re: mice
I'm not actually 'against' mice - they're uber cool. I took the first one I ever bought apart and ruined it (but not before learning its secrets). It's more that I'm against hiding the underlying machinery. Seymour Papert used code to make the turtle roam around in Logo. OGH advised the same for pushing pixels around in WJCC. It's not only about making computers easier to use - the HCI door swings both ways. Turing spent at least as much time pondering the machine's cognition as he did wishing it had faster I/O or a smooth, understated enclosure. Steve Jobs set us back as much as he moved us forward.
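For flavor, Papert's turtle idea survives almost verbatim in Python's standard-library turtle module; a tiny, illustrative square, drawn the way a Logo beginner would:

```python
import turtle

t = turtle.Turtle()
for _ in range(4):      # a square: forward, turn right, four times
    t.forward(100)      # move 100 units
    t.right(90)         # turn 90 degrees clockwise

turtle.done()           # keep the window open until it is closed
```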

I had a short letter in Byte magazine (Oct 1991, I think) that ridiculed data hiding and needless abstraction in OOP. I get your billions of computer users argument, although one must be careful of the 'ten thousand lemmings can't be wrong' fallacy because we're not all Alfred Differs. The bean counters who've designed the pitiful attempts at coding in schools so far despise diversity. I doubt a single one of them has ever even heard of Perl's TMTOWTDI, let alone FORTH.

Darrell E said...

I started learning how to code in high school in the early 80s on a TI-16 and then a TRS-80 that a friend's parents got for him. Then our school invested in a dozen or so TRS-80s and started a computing class, which I signed up for. The calculus teacher volunteered to teach the class, which was a good thing. I already knew that he was one of those rare teachers that had no hang-ups about learning from his students, though admittedly I never managed to teach him a damn thing about calculus. But in the computing class, he had 0 coding experience. In that class he learned from me, and my friend with the TRS-80 at home, and he loved it. And of course I learned more too as we went through the curriculum.

In the latter half of the '80s, in early college, I wrote a presentation for TRW for a timing system they were designing for the Air Force. I used a Fat Mac, top of the line at that time. Compared to computers I'd used up till then it was amazing. Even then, as amazing as it was, I remember well being frustrated by how many floppies I had to juggle around while composing it. You couldn't work with the whole thing at once because the Fat Mac didn't have the capacity to hold it all in memory. I'd run out of room on disk F and have to create F(2) to go between F and G. The good 'ole days. (Capacities today still strike me as insane. TB capacity micro SD cards!?!?)

But that was light years beyond the TRS-80 and similar. And it wasn't just the increase in capacity, it was the UI. And later, working for a living, I wrote a lot of code for everything from estimating, to project management, to purchasing for the companies I worked for. All the coding was done old school, but the resultant applications were greatly empowered by the evolving UIs. Of course, a good bit of coding is devoted to driving the UI.

Given that meager, amateur experience I wonder, Scidata, is it really the UIs that are to blame for WJCC? I don't think coding was ever going to become a widespread skill no matter what. If things were such that the only way to use a computer were to sit down and compose some code to do whatever task you need to do I don't think the result would be most everyone learning how to code. The result would be that almost no one would use computers. Computer use would be a specialized profession.

But UIs made it possible for just about anyone to do useful work on a computer. That seems like a very good thing to me. In fact I think it could be argued that UIs have led to more people learning to code than otherwise would have, because nearly everyone uses a computer and so the chances of individuals being inspired to learn something about coding are increased.

Alfred Differ said...

scidata,

I took apart my parents' expensive shortwave radio back when $150 was a large sum of money to them. Dismay was the correct term for my father's response. A couple years earlier I had intercepted a package sent to him for the color TV kit he was building. I planted the vacuum tubes in the yard like flower bulbs. Dismay.

Ah… but I learned. It was many years before he picked up and completed the TV kit, but I got the shortwave working again. I still have it today though it IS missing some pieces. 8)

———

I'm with you on needless abstraction with OOP, but I didn't encounter OOP until the late 90's. I got to skip over some of the early hype. When I did finally see it I hyperventilated again. I could see advantages for that approach, but I also saw a lot of dumb stuff used to teach it. Just try organizing a class tree for various polygons in 2D and what you actually uncover is how people abstract ideas. A rhombus is a quadrilateral, but where does it fall on the tree wrt squares? Lots of people are very poor abstractors and it shows in their code which means it's not OOP's fault.
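A minimal sketch of that polygon puzzle, assuming Python classes (names purely illustrative): mathematically a square is both a rectangle and a rhombus, so the hard part is the abstraction, not the syntax.

```python
class Quadrilateral:
    sides = 4

class Rectangle(Quadrilateral):      # all angles equal
    pass

class Rhombus(Quadrilateral):        # all sides equal
    pass

class Square(Rectangle, Rhombus):    # both at once; multiple inheritance sidesteps
    pass                             # the forced single-parent choice of a naive tree

print(issubclass(Square, Rhombus))   # True
print(Square.__mro__)                # the linearization someone had to think through
```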

The irony I see is that humans are actually excellent at building abstractions. We aren't good at thinking about how we do it, but we do it all the time. We are suckled on the need. For example, division of labor in a market creates specialists. When it's done right the whole market lifts itself with everyone (on average) becoming wealthier. The market delivers more 'stuff' with specialists. [People generally eat better.] Specialists must abstract away some of their needs and imagine this fuzzy thing (the market) delivering. In exchange they get very detailed in their specialty… which might uncover more abstractions.

I'm of the opinion that the best measure of human intelligence is our ability to abstract patterns from chaos. With useful patterns we can predict small bits of the chaos. That so much of the cosmos appears to be computational in its relationships IS an abstraction. It is one that has served us very well.

———

The way computation is taught in US K-12 schools IS pretty lame, but I think that's true of a lot of subjects where the people building the curriculum don't really know the material. For example, multiplication still gets taught as 'multiple addition' across a lot of schools. It isn't. That just happens to work with regular numbers, but that's not how the ancients thought about multiplication. Division gets taught using sharing examples. It isn't. It's more about measuring.
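A tiny illustration of the distinction, purely for flavor (Python):

```python
# Where "multiplication is repeated addition" stops helping:
assert 4 * 2.5 == 2.5 + 2.5 + 2.5 + 2.5   # still works with a whole-number multiplier
print(1.5 * 2.5)   # 3.75 -- no whole number of additions to repeat; think scaling, or area
# Division as measurement ("how many 1.5s fit into 6?") rather than sharing:
print(6 / 1.5)     # 4.0 -- you can't share six cookies among one-and-a-half people
```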

When curriculum writers don't grok the abstractions, they guide teachers to deal with the concrete. Such stuff is measurable, right? We are able to measure our successes, right? Not really… but it is a challenge to test for a child's acquisition of abstractions.

Alfred Differ said...

Diversity won't get us to a better future simply because it exists. There has to be Selection as well. A billion computer users do exactly that because they are participating in markets.

1) Want more money? Chase after what people want and get it to them.
2) Face the choices they make! Get it wrong and you lose your shirt, but you may try again.

I take Adam Smith very seriously when he discusses the advantages of labor specialization. His pin factory example works in a modern IT setting by pointing out that we don't all have to be device driver coders. I CAN write assembly level code, but I'm much more interested in geometric abstractions. I CAN shuffle data from one register to another, but I'm more curious to know how we might represent a current density of stuff that has its own geometry.

Electromagnetism is built up from scalar charges in motion. What if the thing moving has geometry of its own? What if I want to ponder a theory where momentum is a kind of charge? How do I write that? How do I represent it in a physical model at the code level?

THAT'S more interesting to me, so I'll accept abstractions built by others that hide the registers. I know they are down there, but I don't really care. I probably won't ever care anymore unless they screw things up and make my simulations too slow. Even then, I'll probably grab someone knowledgeable instead of a book. I love books, but Adam Smith was right.

scidata said...

In honour of Tom Sizemore, who died earlier this month, a slightly altered line from RED PLANET:
Uh oh - we're going to talk about [markets] now, aren't we? 'Cause if we are, I'm going to need another pop.


I only tease those I respect.

Alfred Differ said...

Heh. They are only about as important to humans becoming what we are today as opposable thumbs.

In some future diorama where humans no longer exist in our current form, the makers will show us trading stuff with each other in one scene and making stuff to be traded (an expression of hope) in the one nearby. [In another they will show us torn between choices. Are we more related to chimps or bonobos? Hmm.] 8)

Tony Fisk said...

Having recently come back into the IT world after a break of several years, a substantial uplift challenge I have encountered is the now ubiquitous Git.
I have used several version control systems over the years, and have never had too much of a problem with them.
Partly because of its distributed nature, with local and remote branches, I think Git is aptly named. It is far too easy to run into the weeds when you try to recombine threads.*
I was able to come to a better understanding only after abandoning the mouse-centric front-end 'centaurs' like Tortoise, and diving into the CLI and the online help.
I have a snarky little mantra for such occasions:

"Do not meddle in the affairs of 'wizards', for they are subtle, and quick to lead you down the garden path to the back of the compost bin with the fairies."

* Turns out my opinion of Git is shared by a lot of sw engineers, but it's not wise to air it too loudly lest one be seen as 'uncool'. Since I am way past being cool: ditto asynchronous programming.

Alfred Differ said...

Git makes most sense after you've fought and bled on projects involving a whole lot of developers offering patches. Having said that, I think it best to avoid the GUIs until one understands what the buttons do. Git IS a good example where hiding/masking complexities will deprive one of a lot of its power.

I learned version control with CVS, but my employers ALWAYS used something else, for which they paid good money, only to watch developers mostly not use it. All that really shows, though, is my employers have not been in the business of writing software. They thought they were because they needed it, but that was never their competitive core capability.

I've come to terms with git. Linus said he needed it... and I'm reluctant to believe I know better. That guy is gifted.

Tony Fisk said...

It is just unfortunate that the thing I think Linus needed Git for (distribution) is what makes it difficult. I only hope Git doesn't turn out to be the software equivalent of the mousetrap.

Larry Hart said...

Tony Fisk:

I only hope Git doesn't turn out to be the software equivalent of the mousetrap.


Or, "Roaches check in, but they can't check out."

David Brin said...

Pappenheimer, yeah the fact that the bodies – in Soylent Green - are dumped into a green concoction implies an algae intermediate step. And if you are eating algae, then is there really a problem?

Larry Hart said...

On Soylent Green:


And if you are eating algae, then is there really a problem?


But, "The oceans are dying. Plankton is dying."

Robert said...

I take Adam Smith very seriously when he discusses the advantages of labor specialization.

I take him even more seriously when he discusses inequity. I wish the neocons who deify the Invisible Hand would actually bother to read him. But then, I think the same of Evangelicals who seem consumed by narrow-minded hate of all who are unlike them.

"No society can surely be flourishing and happy, of which the far greater part of the members are poor and miserable."

"A criminal is a person with predatory instincts who has not sufficient capital to form a corporation. Most government is by the rich for the rich. Government comprises a large part of the organized injustice in any society, ancient or modern. Civil government, insofar as it is instituted for the security of property, is in reality instituted for the defence of the rich against the poor, and for the defence of those who have property against those who have none."

"The disposition to admire, and almost to worship, the rich and the powerful, and to despise, or, at least, to neglect persons of poor and mean condition is the great and most universal cause of the corruption of our moral sentiments."

"All for ourselves, and nothing for other people, seems, in every age of the world, to have been the vile maxim of the masters of mankind."

"The rate of profit… is naturally low in rich and high in poor countries, and it is always highest in the countries which are going fastest to ruin."

"The interest of this class [bankers or merchants] is always in some respects different from, and even opposite to, that of the public … The proposal of any new law or regulation of commerce which comes from this order … ought never to be adopted, till after having been long and carefully examined … with the most suspicious attention. It comes from an order of men … who have generally an interest to deceive and even oppress the public."

"Labour was the first price, the original purchase-money that was paid for all things. It was not by gold or by silver, but by labour, that all wealth of the world was originally purchased."

"Wherever there is great property there is great inequality. For one very rich man, there must be at least five hundred poor, and the affluence of the few supposes the indigence of the many. The affluence of the rich excites the indignation of the poor, who are often both driven by want, and prompted by envy, to invade his possessions."

And my favourite, which for some reason I also associate with David Brin:

"Science is the great antidote to the poison of enthusiasm and superstition."

Alan Brooks said...

In Soylent Green there were 40 million living in NYC, 2022.
Maybe all prophets are false prophets.

Unknown said...

Robert,

"All for ourselves, and nothing for other people, seems, in every age of the world, to have been the vile maxim of the masters of mankind."

Are we sure Smith and Marx were two different people?

Pappenheimer

Larry Hart said...

Alan Brooks:

Maybe all prophets are false prophets.


I don't judge a film like Soylent Green on that level of accuracy. The story was not meant as prophecy in the sense of "No, really, this is what life will be like in fifty years." The premise was more like, "What if the population does get out of control in the not-so-distant future? It might be something like this." I'm more interested in how well the story proceeds from its premise than how accurate the premise turns out to be by the arbitrary year that it was set in.

Even a work that was meant as a kind of near-term extrapolation such as our host's Earth takes a certain amount of artistic license. That book's setting involved predictions of trends that turn out to be remarkably accurate, such as the internet, "Nextdoor.com" busybodies, and ubiquitous facial recognition. So we marvel at how accurately that book predicted our present time, while neither household black holes nor city-destroying gravity lasers are anywhere approaching reality.

David Brin said...

Robert, you'd do us a service if you submit that list of Smith quotes - some of them very familiar - again in some future comments section, after cleaning up those that were altered. I know at least a couple were and it makes the list less useful.

"while neither household black holes nor city-destroying gravity lasers are anywhere approaching reality."

Hey is it 2038 yet?

We still need the Helvetian War.

Larry Hart said...

Dr Brin:

Hey is it 2038 yet?


I trust my point is clear, that predicting trends as extrapolations of the present into the future is a whole different thing from accurately predicting that some particular out-of-the-blue discovery will come about at the same time. Those are two entirely different kinds of "prediction".

Same with Existence. I think most of us accept your setting as a reasonable possible future based upon current trends. The introduction of the (no spoilers) from orbit upon which the entire plot depends is a different thing. That's not a prediction, but a premise introduced into that setting.


We still need the Helvetian War.


Given the news about banks the past week or so, it might not be far off.

Alfred Differ said...

Robert,

It's always worth reminding people that Smith was a philosophy-of-ethics professor. Wealth of Nations was about a particular virtue: Prudence. In it he countered rationalizations being offered by others for why certain things had to be as they were. Social classes. Inequity. Mercantile rules. You name it. A lot of the ruling class believed things were the way they had to be for the sake of the nation.

Deep down, I think a lot of aristocrats are closet Hobbes fans. They imagine themselves as something more than they really are. The King is the Nation! I am all my servants, properties, and rights! With that view, protecting one's property is the same as protecting one's self. Smith argued in TWN that they were incorrect as well as unethical. In their error, they were undercutting their potential for wealth. That is imprudence.

Those of us who reject the "I am more than human" POV tend to look at the unethical behaviors of aristocrats rather than the incorrectness of their rationalizations. How dare they steal from the poor! Bankers and Merchants are naturally inclined toward deceit! I think it is obvious that Smith rejected the Hobbesian view as well, but he was still a man of his time. He looked down on merchants and bankers from what he saw as a higher social position.

Are merchants naturally inclined toward deceit? I don't think so. They aren't even inclined to be amoral. Most every merchant you deal with in your daily life has to be reasonably close to your ethical positions… or you won't deal with them. Merchants hidden from your view behind an army of employees might be able to get away with it, but most merchants run small shops even today. If a manager at the local burger place pisses you off, chances are high you'll avoid the place, so truly unethical merchants aren't the norm.

Are SOME merchants inclined toward deceit? Sure. That's true of all of us, though.

———

I think Smith was as wrong as Marx when it came to a labor theory of value. It is an ancient mistake to attempt building a foundation for the value of things let alone one based on labor input. Value is just a thing we assign in a trade. I give you 10 weezits for your 13 whatits and temporarily set the exchange rate for one weezit at 1.3 whatits. When someone else makes a different trade, their measure decides the rate for a while.

Smith could have made that argument, but as a philosopher I think he was disinclined to do so. It sounds so unfounded in a world where people BELIEVED in the value of gold and silver. We know better nowadays, but there are still an awful lot of believers.

Labor isn't foundational. Hope is.
He really should have thought of that.

Alfred Differ said...

David,

...after cleaning up those that were altered...

Are you thinking about the various editions Wealth of Nations went through? I know the invisible hand was only in one of them. Smith included it in a middle edition (I think) and then yanked it back out before the last one.

Alan Brooks said...

LH,
Completely agree; everyone at CB knows what you’ve written. I liked the flick and still do.
But, and there’s always a but, for starters the eschatology of religionists reminds me of Bones’ (loc’s) negativity. Educated whining. Sometimes SF appears...Bonesian. As the self-pity of religionists who project their angst on others: “I’m dying, thus the world must be dying.” No more personal problems after the world is gone!
Something like that.
Soylent was released in ‘73; back then it was unpleasant to see Heston running around as the crusader—plus, 40 million as the population of NYC seemed ludicrous.
The film was a half-century ago, which is taken into account. Still, there is a bit of personal-whining involved. One reason so many flicks contain over-the-top violence is that producers and directors have to claw their way up, and develop a certain anger on the way there. So having characters die gratuitously gruesome deaths and have x gallons of fake blood spilled is one manner of venting.
The higher the body count (plus mutilation and intense suffering), the more the sadists in the audience can be gratified, too. Yet wouldn't a few fewer deaths and fewer gallons of blood be sufficient? Wouldn't a tad less personal Angst be welcome?

David Brin said...

onward

onward