Saturday, January 14, 2017

Science Fictional News & Announcements

The Arthur C. Clarke Center for Human Imagination has launched a new podcast series - Into the Impossible - with conversations bridging the arts, sciences, technology, medicine and more! Episode 1: Imagining the Impossible; Episode 2: Becoming a Galactic Wonder; Episode 3: The Hard Problem, with Kim Stanley Robinson and Marina Abramović.

NEWS: my submission - “Bargain” - won the recent One Page Screenplay Contest! The script was performed and posted online by the Los Angeles Feedback Film Festival, with the role of Ronald Reagan well-played by Peter Nelson, who did a fine and fun job with the brief, comedic-ironic part.

So will I receive my accolades at a gala tuxedo event with red carpet? I have this fluffy, backless thing....

One story I did not expect to be "prophetic"... though it deals with prophecy... is "The Loom of Thessaly," about a spectacular, ancient site lurking in the middle of modern, heavily traveled Greece. Only now comes word of a "Lost City" discovered just a few hours north of Athens in the ancient region of ... Thessaly.  Give "The Loom of Thessaly" a read -- from my collection The River of Time! 

Oh, a reminder: Last week saw the debut of my latest book -- Chasing Shadows: Visions of Our Coming Transparent World, an anthology about a future filled with light, edited by David Brin and Stephen W. Potts in collaboration with the Arthur C. Clarke Center for Human Imagination, with stories and essays by Ramez Naam, Bruce Sterling, Brenda Cooper, Robert Sawyer, Nancy Fulda, Scott Sigler, James Morrow, Neal Stephenson, Robert Silverberg, Aliette de Bodard… and more.

Join me and Stephen Potts for a book signing at Mysterious Galaxy on Friday, January 27, at 7:30.

== More news! ==


I'll be the author guest at San Diego Comic Fest, along with Greg Bear and Gregory Benford, February 17 to 20.

Would-be science fiction authors! I’ll be participating as a lecturer at the Santa Barbara Writer’s Conference in June.

On Canada Day, July 1, 2016, my esteemed colleague Robert Sawyer was named a Member of the Order of Canada, the highest civilian honour bestowed by the Canadian government. See Rob's latest novel, Quantum Night.

Well it’s about time, c’est vrai! Some of you will want the French edition of Existence — with a cool (if obscure, totally apropos!) cover by Yohann Schepacz, published this autumn by Bragelonne, translated by Claude Mamier.
                       
For his 100th episode, Kevin Tumlinson's Wordslinger Podcast invited me to offer some Big Picture perspectives in an "amazing" interview.  The show's primary emphasis is on the writing process and the writing biz, but we also got into sci fi films and books and how to watch history unfolding around us. A fun and very well done podcast!

== Science Fictional Worlds ==

This essay on the history of the concept of parallel worlds is excellent, within its own narrow framework.  It leaves out all non-European versions of the idea, and gives short shrift to all but a few treatments in mainstream science fiction. A more comprehensive look might also range from clear-eyed explications like Niven's "All the Myriad Ways" all the way to Vladimir Nabokov's murky, indulgent novel Ada.  Still, it is an entertaining and thought-provoking article.

Only an Apocalypse Can Save Us Now: Here’s an erudite Harper’s piece by Mark Lilla on the varied ways that so many have been transfixed by romantic notions of apocalypse.

Lilla laces this rumination on doomcasting with bookish refs to Heidegger, Cervantes, the Bible, Islam, etc. For example, after discussing how pampered people often dive into nostalgia – like so many spoiled American baby boomers -- “La nostalgie de la boue is alien to history’s victims. Finding themselves on the other side of the chasm separating past and present, some recognize their loss and turn to the future, with hope or without it: the camp survivor who never mentions the number tattooed on his arm as he plays with his grandchildren on a Sunday afternoon. Others remain at the edge of the chasm and watch the lights recede on the other side, night after night, their minds ricocheting between anger and resignation: the aged White Russians sitting around a samovar in a chambre de bonne, the heavy curtains drawn, tearing up as they sing songs from the old country.

“Some, though, become idolaters of the chasm. They are obsessed with taking revenge on whatever Demiurge caused it to open up. Their nostalgia is revolutionary. Since the continuity of time has already been broken, they begin to dream of making a second break and escaping from the present. But in which direction?” …

Lilla makes a lot of strong points, many of them quite different than my own take on apocalyptic-messianic-millennialist thinking. Yet he points to some of the same historical ignoramuses and hypocrites. 

“Apocalyptic historiography never goes out of style. Today’s American conservatives have perfected a popular myth of how the nation emerged from World War II strong and virtuous, only to become a licentious society governed by a menacing secular state after the Nakba of the Sixties. They are divided over how to respond. Some want to return to an idealized traditional past; others dream of a libertarian future where frontier virtues will be reborn and internet speeds will be awesome.” 

In the end I was disappointed because he doesn’t even try to glance – even sideways – at the folks who build civilization and who fought and dug our way out of the last apocalypse, the human nadir of 1943, at the bottom of the Concave Century. No glimpse of the greatest accomplishment of our time – turning the attention of millions to building a golden age with our own hands, not lamenting a (mostly fictitious) lost one, in the past.

Least of all does he notice that his own passionate erudition is clearly aimed at helping to prevent the kind of alienation and despair that can wreck a civilization, the kind we recently saw in the election. Nor is he apparently aware that he aims to achieve the wonder of science fiction – a self-preventing prophecy – and yet the author’s bestiary makes no mention of folks like him.

== and more ==

Hugo Gernsback, who edited Amazing Stories and coined the (unfortunate) term “science fiction,” and after whom the Hugo Award is named, also invented the forerunner of today’s VR glasses.  See a glimpse of them in this entertaining article.

Interesting SFnal concept in a “mainstream” novel, the New York Times bestseller, Underground Airlines by Ben H. Winters. From the review on NPR: “A tale set in a modern America where the Civil War never happened and four states still enforce slavery follows the experiences of a talented black bounty hunter who infiltrates an abolitionist group to catch a high-profile runaway.”

"The future," as the author William Gibson once noted, "is already here. It's just unevenly distributed." Look, I respect Bill immensely as one of our finest and most thought provoking metaphorists.  But why do people keep quoting things that are just plain dumb?  It’s like Yoda’s horrid aphorism: “Do or do not, there is no try.”  Huh? Sure, one can squint and see a comment on wealth disparity, on the one hand and determination to succeed, on the other.  But seriously, you know that’s not what those sayings are supposed to mean.  What they are supposed to mean… isn’t even remotely or in any conceivable sense, true.

In his article, The novel Heinlein would have written about GW Bush's America, Cory Doctorow gives a rave review to John Varley’s delightful Red Lightning.

Author and Futurist Brenda Cooper received the 2016 Endeavour Award for her novel Edge of Dark, a science fiction novel about mankind coming face to face with its past mistakes, as a near-AI that was banished to the edge of known space finds its way home again. The book is the first volume in the author’s The Glittering Edge duology, published by Pyr Books.  Read her terrific story in Chasing Shadows!

Catch this music video on YouTube, a very clever takeoff on the style of Betty Boop cartoons of the 1930s, but this time on the shallowness of ubiquitous cellphone culture.  Pretty good music too, by Moby & the Void Pacific Choir.

== Zombies vs Vampires? ==

In response to my posting Vampires, Zombies, Werewolves and American Politics, one member of my blogmunity wrote: "I find I have to dispute your assertion, so comfortingly full of truthiness, that 'vampire flicks always correlate with Republican administrations. (During Democratic administrations, it’s zombies, all the way down.)' This may be because we are using differing data sets, but let's investigate. For science!"

Methods used: searches of IMDB for keywords "vampire" and "zombie", combined with 1-year time restrictions, for years 1953-2016. Data are too sparse to continue the analysis before 1953; besides which, the controls of both the studio system and wartime propaganda restrictions would likely introduce fatal confounders in any case. Test statistic used was (V-Z)/(V+Z), i.e., the relative excess of vampire films as a proportion of the total of vampire and zombie flicks. Comparisons were made to control of the federal Presidency, Senate, and House by major party.
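
In rough Python terms, the test statistic works something like the sketch below; the yearly counts are invented placeholders, not the actual IMDB tallies.

    def vz_balance(vampires, zombies):
        """Relative excess of vampire films: (V - Z) / (V + Z), from -1 (all zombie) to +1 (all vampire)."""
        total = vampires + zombies
        return (vampires - zombies) / total if total else 0.0

    # Hypothetical yearly counts, year -> (vampire films, zombie films); not real data.
    sample_counts = {1979: (14, 6), 1985: (7, 18), 2007: (9, 25)}
    for year, (v, z) in sample_counts.items():
        print(year, round(vz_balance(v, z), 2))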

Results: With the exception of two outlier years, 1953 and 1959, vampire films dominated the box office until 1980 regardless of party control of any branch of government. A dramatic shift to zombie films occurs with the election of Ronald Reagan and persists until 1990, when a slow creep back to a slightly vampiric balance occurs. With the election of George W. Bush in 2000, instability in the V/Z balance occurs, settling in 2007 (after Democratic House control is secured) on a distinct zombie preference, a period persisting until the present day.

Conclusions: The hypothesis that the country displays distinct "moods" regarding vampire/zombie preference is confirmed. Data are insufficient to verify statistical significance, but support is found for the idea that the V/Z preference is tied to political shifts. The hypothesis of a tight linkage between the V/Z preference and presidential or other partisan control patterns is rejected.

I stand admonished!  While my general impression’s truthiness stands confirmed overall, its statistical reliability correlating with party administrations kind of… er… sucks.

== And a New Year's wish ==

We should feel strengthened in our dedication to use the greatest gifts that either God or Evolution gave us, or that we seized for ourselves. Yes, love and compassion... but also curiosity and calm willingness to exult in the holy catechism of science: 

"Yes, I might be wrong! Or maybe you are. Perhaps both! Ain't it cool? Let's go find out."

Scientific songs of praise: Watch and enjoy. In dedication to our one chance to grow up. Together.

81 comments:

Paul SB said...

I loved that last line in the play about reality TV & those cardassians! The Moby video was gun, too. As Ray Bradbury once said (very roughly paraphrased), we don't need to burn books, we just need a technology that makes people disinterested in reading.

Paul SB said...

The Moby video was Fun, not Gun! Autocorrect! Or maybe it's these new bifocals I just got - now I really feel like an old geezer!

In the immortal words of Bill the Cat: Oop! Ack! Ptooey!

LarryHart said...

Robert, I posed a question to you toward the end of the previous post. I'd still be interested in the answer.

Otherwise, just taking advantage of the lull here to admire the farewell speech given by President Obama a few days back. And as always, imagining a postscript to the 44th president as a lyric from "Hamilton", with only a few names (and one adjective) actually changed here:


President Obama:
Let me tell you what I wish I’d known
When I was young and dreamed of glory:
You have no control
Who lives, who dies, who tells your story.

Speaker Ryan:
I’ll give him this:
His Health Care system is a work of genius.
I couldn’t undo it if I tried.
And I tried.

Senate Majority Leader McConnell:
He took our country from bankruptcy to prosperity.
I hate to admit it,
But he doesn’t get enough credit for all the credit he gave us.

...

Tony Fisk said...

In the scientific songs department, Tom Lehrer's classic ditty about the periodic table has been superseded by ASAPScience. Set to "Orpheus in the Underworld", it covers elements up to 118. With applications.

Anonymous said...

Dr Brin in the main post:

"Conclusions: The hypothesis that the country displays distinct "moods" regarding vampire/zombie preference is confirmed. Data are insufficient to verify statistical significance, but support is found for the idea that the V/Z preference is tied to political shifts. The hypothesis of a tight linkage between the V/Z preference and presidential or other partisan control patterns is rejected."

I stand admonished! While my general impression’s truthiness stands confirmed overall, its statistical reliability correlating with party administrations kind of… er… sucks.


There does seem to be a correlation (as you suggested) with the general sense that the real-life bad guys are the wealthy/powerful (vampires) or that the real-life bad guys are the unwashed masses (zombies). It might not correlate so much to who is in power--more to which memes are ascendant. If this is the case, then I'd expect the shifts in political power to lag the dominant meme rather than to lead it.

Paul SB said...

"If this is the case, then I'd expect the shifts in political power to lag the dominant meme rather than to lead it."

That would be the case if the movie memes were a result of changes in the memes of the voting public. The other possibility is that the movies are swaying the minds of the voting public unconsciously with their memes - either deliberately or randomly. If deliberately, then the movies represent intentional propaganda. I would suspect that the former is more likely than the latter, but you could easily have either one happening at different times in history. One time it might have been deliberate, but another time more of a fad.

Then there are generational issues. Vampires up until 1980, then zombies through the 1990's but a return to more vampires in the 2000's. Those who were young in the 80's grew up with zombies, but didn't have as much money & social influence until they were older and more settled into careers in the 2000's. Meanwhile, people who grew up with vampire movies were young then but more established and socially influential in the 1990's, fueling a rise in vampire flicks (I remember Goth-icky being super popular, and most of them were pretending to be vampires, so maybe it was the youth demographic...).

Zepp Jamieson said...

Second time today Heinlein has entered my consciousness. I watched this morning's SpaceX launch from Vandenberg, and read the subsequent story in the Guardian. In the comments section, someone referred to Elon Musk as the first great space entrepreneur, and I couldn't resist: I remarked that if Musk and Heinlein had been contemporaries, Musk would have been the model for DD in "The Man Who Sold the Moon."

David Brin said...


Tony, cool songs of science… though the elements song is missing at least 4!

Acacia H. said...

Larry, my mother refuses to talk politics with me.

There is far too much of my brother in me. My brother sought to enter politics as a Republican but I will admit he was old school, honorable, and probably would look at the current breed of Republican and consider them an abomination. I hope. I will admit he had his dark side and an intolerance toward those who didn't at least try to improve themselves. Seeing he was blind (comorbid issue with type 1 diabetes since he was 9 months old back in the 60s)... well, it could go either way. And he was a Massachusetts Republican which are liberal Democrats everywhere else.

That said, if I cornered my mother on this she would very likely brush it off with "I wasn't voting for Donald Trump, I was voting against Hillary Clinton."

She will have to deal with two things in the future. The first is a life under Republican misrule as her insurance goes away, her social security gets cut, and stocks crash under the Republicans. The second is the fact if she complains at all I will just look at her and remind her she voted for this.

But then, I freely admit to being an evil bastard.

My treaty with my Republican friend stands and this time I was the one who reinforced the No Politics rule while taking him and his wife out to eat for Chinese as a Christmas gift. Okay, technically it was religion but given Trump's attempts to turn people against the Muslims, it's politics. My only real regret is that a lot of people are going to die.

After all, that's what happens when people are tossed off their insurance programs and can't continue chemotherapy or the like. I heard one person state how prescient Republicans were in talking about government-run death panels. It's just the Republicans will be running the panels.

That's the other reason for 100 days. Trump is going to pass a lot of stuff for the Republicans. And then when the shit starts hitting the fan repeatedly, Republicans are going to turn the blame and hate on Trump and even point out that THEY got rid of him when things turn truly dark. And they will think that will give them an out.

What they don't realize is two things. First, Trump's supporters will never again trust a Republican. Second, Millennials are the best educated and most intelligent group of young voters we've seen... and they are fully aware of just how much the Republicans are fucking them over.

It's just... a lot of people are still going to die.

Oh, and James O'Keefe was trying to get people to disrupt Trump's inauguration as part of his ongoing effort to discredit progressives. Because Trump needs to be a victim.

Of course, if I were wanting to disrupt things with style, I'd have a half dozen flash bands in Washington D.C. on inauguration day to show up in various places and give impromptu concerts playing music with a theme of standing up for yourself and never backing down from bullies and the like. You know, rebellion.

Rob H.

LarryHart said...

Paul SB:

Meanwhile, people who grew up with vampire movies were young then but more established and socially influential in the 1990's, fueling a rise in vampire flicks (I remember Goth-icky being super popular, and most of them were pretending to be vampires, so maybe it was the youth demographic...).


Wasn't "Interview With A Vampire" a game changer in the late 1980s? Before that, almost all vampire movies were about one particular vampire, Count Dracula. "Interview" made vampires as a concept more fashionable, and also made them the point-of-view characters rather than just the threatening monster.

Jumper said...

I fear much of the zeitgeist is the result of the baby boomers demographics. I am one but haven't wanted to live in the boomer bubble for a long time, although there is much good there.
For example, the "evil children" meme didn't really kick off until after the Exorcist movie when a bunch of boomers started having kids, not just the early outliers. So I assume that vampires represent fear of exploitation by the young midlife boomers discovering treachery among the older exploitive classes. Zombies in contrast represent fear of the proletariat and the less experienced young.

Jumper said...

Anyone who missed The Vampire Tapestry, a 1980 fantasy novel by Suzy McKee Charnas, might want to read it. It's the closest to believable I recall reading.

LarryHart said...

Robert:

Larry, my mother refuses to talk politics with me.


When I read that first sentence, I thought you were going for "...so that's why I don't have an answer", and was ready to accept that. You went in an entirely different direction, though.


My brother sought to enter politics as a Republican but I will admit he was old school, honorable, and probably would look at the current breed of Republican and consider them an abomination. I hope. I will admit he had his dark side and an intolerance toward those who didn't at least try to improve themselves.


I don't advocate rewarding laziness. I do distinguish between "not willing to work" and "no work is available." I want technology to free humans from the need to work, not "free" them from the ability to support themselves.

...if I cornered my mother on this she would very likely brush it off with "I wasn't voting for Donald Trump, I was voting against Hillary Clinton."

She will have to deal with two things in the future. The first is a life under Republican misrule as her insurance goes away, her social security gets cut, and stocks crash under the Republicans. The second is the fact if she complains at all I will just look at her and remind her she voted for this.


That's what gets me about the liberals who snubbed Hillary. Their stated goals are all thwarted by the election results. Even the specific complaints about Hillary--too close to Wall Street, too hawkish, hints at pay-to-play--are all magnified with a Trump administration. So yes, it looks to me as if refusal to vote for the "lesser of evils" means they are ok with the greater of evils.

But then, I freely admit to being an evil bastard.


Heh. In most cases, I would not make the same admission about myself, but in this particular case, I concur. I have zero sympathy for eyes-wide-open Trump voters who don't like what he and his party do with power. If you let go of a hammer on a planet with a positive gravity, and the hammer lands on your foot, you can't blame the hammer or the gravity.


My only real regret is that a lot of people are going to die.


Well, yeah, but that's kind of the big one, isn't it?


I heard one person state how prescient Republicans were in talking about government-run death panels. It's just the Republicans will be running the panels.


When Republicans make claims about what Democrats will do, and those claims don't seem to make any kind of sense, they're always describing what they themselves would do in the situation. Throwing people off of insurance rolls to save money is a good example. So is faking scientific results to please donors. Heck, so is insisting that men will pretend to be transgender in order to sneak into women's bathrooms.

LarryHart said...

Robert (continuing...) :

That's the other reason for 100 days. Trump is going to pass a lot of stuff for the Republicans. And then when the shit starts hitting the fan repeatedly, Republicans are going to turn the blame and hate on Trump and even point out that THEY got rid of him when things turn truly dark. And they will think that will give them an out.


Ok, you're going to have to pick one. Will Republicans toss Trump out and take credit, or will they trick Democrats into taking the blame? It's not going to be both.

Me, I think they'll "let Trump be Trump" and not realize how badly this will hurt their brand. Because while you think Democrats are stupid, I think Republicans live inside their bubble and believe the voters really do care about lowering taxes on the wealthy and deregulating polluters. We saw how that went in 2006, and to some extent again in this past Republican primary. Trump beat the other Republicans in part by being outrageously racist and bullying, but also by promising not to touch Social Security and to bring back manufacturing jobs. Democrats in the 1990s ran on being "kinder, gentler" Republicans, but Trump won this time as a meaner, more obnoxious Democrat.

Paul SB said...

Rob,

While people are going to die because of Republican mishandling of the healthcare system, and that is a nightmare of preventable death, death is only the most extreme consequence. As long as we have older generations who live in terror of socialized medicine, we will always have huge numbers of people dying early of causes that a system not driven by the profit motive would prevent. The misery will be much more widespread. While the 1%ers will reap huge benefits, most people will sink and lose productive years of their lives. If you lose a job and can't get an equivalent within 6 months, you are almost doomed to years of trying to claw your way back to where you had been, and the ramifications are not just financial, they are also deeply psychological and medical. It's not just about preventable deaths, it's also the much broader preventable misery. Not that I am dissing your statement - just adding to it.

Of course, America frequently fails to learn from those kinds of mistakes. This culture has an amazing capacity to close its eyes, shout "La la la la! I'm not listening!" and blame the victims of every tragic mistake we make. Just yesterday my daughter was watching a documentary about the Danish resistance to the Nazi occupation. She said she hoped it would give her ideas about resisting the Grope Administration, but after watching it, we both realized there was little to be learned that we could apply to our current situation. In the car elf Denmark, very few people collaborated with the Hitler Regime. It started as a small resistance movement committing a lot of sabotage, but morphed into huge strikes at the factories where munitions were being made to feed the German war machine. Virtually the entire country backed the strikes. Here, people are far too divided, too quick to blame each other for the consequences of kleptocracy. Maybe if Grope starts allowing Russia to annex US states instead of small Baltic nations and former Soviet conquests...

Paul SB said...

Larry,

"Interview with a Vampire" came out in 1994, not the late 80's. It was based on a book from the 70's, though, which kind of illustrates what I was saying about decade skipping (a little like generation skipping with recessive genes?). People who would have read the book when they were young in the 70's would have been in their struggling years of early adulthood in the 80's, but by the 90's would be settling down, buying homes and cranking out 2.1 babies. Not only would they have been the target audience, but likely the Hollywood people who proposed making the movie would have fit the demographic themselves.

I just glanced up and saw that the Satan in his Autocorrect incarnation has befuddled another of my sentences. "In the case of..." not "In the car elf ..."

What is the world coming to when noble people like Elrond and Galadriel have been reduced to chauffeurs for the likes of our 1%ers!

LarryHart said...

@Paul SB,

I was referring to the popularity of the book "Interview with the Vampire". If you're right that it was from the 70s, then there must have been a revival of sorts later on. I don't remember much buzz about the novel until 1988 or 1989.

LarryHart said...

Paul SB:

If you lose a job and can't get an equivalent within 6 months, you are almost doomed to years of trying to claw your way back to where you had been, and the ramifications are not just financial, they are also deeply psychological and medical.


If people plan for a loss of a paycheck, they can save money in a rainy-day fund. Not so with eligibility for employer-based health insurance. Before Obamacare, people who were already diagnosed with a condition were not insurable under a new policy. That meant their only option was to stay employed. At best, that's feudalism, forcing people to kow-tow to the whims of an employer in order to keep a job. At worst, the employer lays you off anyway, and now you're out on the street. This is what Republicans in congress are promising to return to, to thunderous applause.

Jason said...

Wow. I wish I had a crystal ball to see the future. It would make life so much easier. Can you guys tell me where you got yours from?

Jason said...

By the way I'm not a troll, although I may have some troll DNA in me somewhere. An ancestral tragedy perhaps?

Anonymous said...

Question for Americans…

I've just read Pity the Billionaire by Thomas Frank, and The Party is Over by Mike Lofgren. How well do they describe what's happening in your politics?

(Lofgren is a former Republican staffer. Don't recall Frank's qualifications.)

LarryHart said...

Jason:

Wow. I wish I had a crystal ball to see the future. It would make life so much easier.


Without your own crystal ball, how do you know that?

Paul SB said...

Jason,

No crystal ball, dude! Just speculating based on experience - which is pretty much what everyone does. It's just that some are better at it than others. Some base their speculations on actual facts, rather than just believing the bull that "everybody knows."

As far as the troll blood goes, I think we all have a bit of that. It can be hard to keep it in, sometimes. :]

If you could turn an exclamation point on its side, it could make an emoticon for a troll's club. Better yet, angle it. Maybe someone will make a troll emoji.

Paul SB said...

Larry,

"If people plan for a loss of a paycheck, they can save money in a rainy-day fund."

That would depend on how much you are getting paid, wouldn't it? I'm making okay money, and I haven't had a rainy-day fund in my entire adult life. :{

As far as vampire stories go, I don't honestly remember. I got sick of vampires a long time ago, that may have had more to do with having a vampire-obsessed GM than what was floating around in the moviescape.

Carl M. said...

A belated link in reference to the dumbing down of the intel community.
http://www.theonion.com/article/trump-gives-intelligence-agencies-their-daily-brie-54961

duncan cairncross said...

Larry and Paul
Rainy day fund
Putting money into a "rainy day fund" is not only difficult - it is not the best way forwards
Unless you are part of the 1% you will start life with debts
House, education, car......
Before you start your rainy day fund you should pay off all of your debts - saving when you have debt is silly

For most of us that takes us up until about age 50

Paul SB said...

Duncan,

I'm almost there, and still probably a long way from the rainy day fund. And I'm seriously thinking of going back and getting my Ph.D, or maybe an Ed.D., as the politics in the public schools can get pretty mortifying. I've paid off my college debts but not my wife's, and my car died last year so now I'm paying on a new one. Daughter has at least a couple more years in college, and my son starts high school next year. Bad timing I'm sure, but if I don't do it soon my brain will turn to cottage cheese and I won't be capable. I love blowing my students' minds and getting them to see the world through different eyes than "everybody's" - but any minute some kid could lie to some parent, and the administrators are afraid to make parents angry, so they make life hell for teachers.

If I go back to college I will probably not manage to get debt-free until I retire... But then, I had more time for myself when I was in grad school than I have since becoming a teacher - one of those quality of life issues.

Paul SB said...

Carl,

The Onion is probably one of the few institutions that can really benefit from the Grope Administration...

David Brin said...

Jason, humor cancels many sins. UR welcome here. Oh grandson of trolls.

CarlM. Har! Sob!

donzelion said...

LarryHart: "Before Obamacare, people who were already diagnosed with a condition were not insurable under a new policy."
Almost always, the problem is more selective than one imagines. A savvy family member with inside connections can manipulate/bypass many sorts of screeners, so that certain 'preexisting conditions' that would typically render an outsider ineligible for coverage would still obtain that coverage. Or a rich family could pay someone a little extra to figure out a loophole. Other strategies include strategic bankruptcy, litigation threats (debt collectors for hospitals cave quickly to chase easier targets), and many more. The rest of us, lacking that 'insider' angle, would get blocked.

The wealthy are masters of getting expensive stuff for cheap (then selling it at full value). But when the only option for the non-wealthy is "stay employed," it shifts the balance of power toward the employer (and many employers will try to avoid hiring people with a sick family member, for fear of raising their entire pool's health insurance rates - quietly, mind you, in the old system, since it's frowned upon, heartless, and often illegal - but there are many ways to figure it out - e.g., hire all 'employees' as part-time 'independent contractors' for a year, then track their sick records to suss out conditions).

In many fields, that means (1) 3-12 months as an unpaid volunteer, then (2) 6-24 months as a paid part-time temporary contractor, and then maybe at the end, if the employer can afford it, (3) employment with coverage (but with extreme control by the employer, who typically pits 'true employees' against 'contractors' for bonuses, wages, benefits, and the rest). The prevalence of 'free' work compresses wages for everyone.

Obama's team tried to rein that in somewhat. Minimum wage rules, when enforced, can rein that in even more. Neither is a priority for our new friends at the federal Dept. of Labor, but Jerry Brown's team has certainly tried to add bodies to enforce the rules and rein this practice in.

"At best, that's feudalism, forcing people to kow-tow to the whims of an employer in order to keep a job."
Indeed. Seldom "peasants working the farms"-style feudalism - more like 'assistants working the phones.'

Tony Fisk said...

Troll? I suspect you're really just a bit of a fixer-upper, Jason.

On matters macabre, like the current ACA repeal pogrom, someone recently tweeted to @xeni Jardin (Boingboing reporter and cancer survivor) "survival bias here is interesting. Because of ACA we now have a bunch of people the GOP wanted dead alive to voice their opinions."

(To which Xeni responded "We, the undead")

...so *THIS* is the zombie apocalypse!*

*(apols. to anyone who doesn't find this a laughing matter. It isn't. Frankly, I regard any congress critter who votes to repeal the ACA without a replacement as part of a conspiracy to commit mass murder on their own constituents, and they should have their noses rubbed in that fact forever. Thanks Osama!)

LarryHart said...

Duncan Cairncross:

Putting money into a "rainy day fund" is not only difficult - it is not the best way forwards


My point was that it is possible to put something aside against the loss of a paycheck. It's also possible to sell your possessions for cash or to go into business for yourself.

It is not possible to put something aside against the loss of employer-provided health insurance.

If you are difficult or impossible to insure as an individual, then employer-based health insurance becomes a de-facto method of binding you to the will of an employer just like feudalism was. In fact, I'm not sure a feudal lord could divest himself of you as easily as a modern-day employer can.

raito said...

From the last,

I've been trying to think of some example of media manipulation being uncovered. And I finally remembered one. And I'm only a couple hops removed from it. A couple years ago now, there was a Facebook thing where they were manipulating news feeds. No particularly nefarious intent -- it was a research project by one of their staff sociologists attempting to see if positive or negative news had any effect on posting content. And it was found out. The guy whose project it was was pretty scared that he'd just lost his job (naturally, he didn't). In this case, news wasn't changed, it was just present or not.

On to the latest:

Re: Nostalgia

Geez. I'm on a couple communities that deal in rather severe forms of nostalgia. On the one hand, nostalgic electric guitarists believe that only amplifier designs dating from shortly after Black* worked out the negative feedback technique produce good sound and tend to use guitar designs dating back to mostly the early 50's. On the other are synthesizer guys who insist that only analog machines built with obsolete components can produce good sound. One believes in tubes, one believes in transistors, neither believes in the integrated circuit. All that has to happen is for those guys from that era to finally die off. It's not that the old stuff ever really dies**, but it'll no longer be mainstream. Kind of like the vinyl/CD problem. So full of confirmation bias it's hilarious.

And the internet does not help lessen nostalgia. With everything created available, and every new creation vying against everything else ever created, how does anything go out of style? Or stay in style? This is not to say that people can't have their preferences, as long as they're acknowledged as such. And even if there's a criterion, all that does is allow something to be 'better' according to the criterion. A completely separate thing from whether the criterion has any actual meaning other than aesthetic.

*Harold Stephen Black, another one of those Bell Labs giants whose shoulders we stand on. They sure had a lot of them. Remember my reference to Richard Hamming, for example (which also dealt, though indirectly, with nostalgia)?

**Which is kind of a corollary to Gibson's quote.

"The future is already here. It's just unevenly distributed."

Thanks for calling him a metaphorist. As for the quote, it is true that nothing spreads immediately. There's still guys in Africa running down prey with wooden spears. There were still party lines when touch-tone was around. I still can't get more than a 1.5Mb connection inside city limits where they're rolling out fiber. And it's not all about wealth disparity, really. So in some sense, it's true. Just not always the one that's applied to it.

Yes, the Red Lightning series is a fun read. Rests on a bit of a McGuffin, though.

Re: Vampire, Zombies, etc.

I've never looked at it in those terms. I agree with the likes of John Kenneth Muir, whose central tenet for genre media (sf, horror, fantasy, etc.) is that it ought to say something meaningful about the times in which it is created. This is why so many sequels and reboots fail on an artistic level. They are made during different times, yet are too rooted to their sources to say anything new -- anything about their own times.

raito said...

And on to the comments...

Duncan Cairncross,

Any number of financial advisers would disagree with you. If you can make at least your payments, it's far better to save up some than pay down your debts even further. The reason for this is that with some savings you might be able to ride out a bad patch. With no savings, you can't.

Jumper,

I think 'evil child' stuff started during the 'nature or nurture' debates. The Bad Seed novel was 1954. Then progressed through the 'juvenile delinquent' phase later in the 50's, to the 'hippie/counterculture' stuff of the 60's/early 70's (like Joe) then on to the supernatural stuff you're talking about.

Paul SB,

I can see some of those people in favor of cutting taxes feeling good about themselves when they have a lower tax bill next year -- because they have no income to tax. I wouldn't put it past some people to fail to see that particular forest for the trees.

As for the play, it reminds me a bit of one of my own efforts, in which the failure of Buddy Holly to die results in the Soviet Union loosing their nuclear store on the US in 1976 (more of a thought exercise than serious writing).

LarryHart said...

raito:

As for the play, it reminds me a bit of one of my own efforts, in which the failure of Buddy Holly to die results in the Soviet Union loosing their nuclear store on the US in 1976


That's no less plausible than the failure of Edith Keeler to die resulting in a Nazi victory in WWII.

Since our recent discussions of time travel, I've been thinking a lot about the subject, and even watched "Somewhere in Time" again. I've come to the conclusion that stories that use time travel must choose between the immutable past version (the time traveler is part of the past all along) or the multiple timeline version (the time traveler causes time to diverge into a separate timeline). The same story can't have both. The Star Trek episode mentioned above doesn't work with multiple timelines, because McCoy's trip backwards wouldn't have affected the timeline that Kirk, Spock, Uhura, and the Enterprise were still in.

"Back to the Future" seems to operate with multiple timelines, but if so, then it cheats when Marty McFly starts to fade out of existence when his parents seem unlikely to marry in the new timeline. If this is a new timeline, then his old timeline (including his own birth) should not be affected.

TCB said...

A few days ago I made a comment that caused Dr. Brin, our host, to caution against "violent scenarios."

Fair enow.

But I just this minute read a news-ish article with the exact sort of scenario I had mentioned. Creepers.

From the article:

Currently, the assumption is that if ever Trump appeared to be on the verge of summoning the “football” (the briefcase with the codes), the secret service would intervene to remove him from the planet while there is still a viable planet from which to remove him.

But that assumption is too nebulous to give comfort. Although there are precedents of a palpably insane ruler’s personal protection squad taking him out (Caligula’s murder by his Praetorian Guard, and so on), it would be complacent to rely on that.


In any case, the author invokes the 25th Amendment to the Constitution, in which Mike Pence and eight or more of the Cabinet solemnly affirm that The King is Mad and Must be, Ahem, Sent Away to Rest His Fevered Brain, Until Such Time As the Assembled Lords Shall Judge That He Hath Regained His Sanity, Which Day Will, For the Purposes of the Court, Never Come.

Paul SB said...

TCB (why do your initials make me think of yogurt?),

The fact that there have been so few precedents for the removal of a leader except through conquest by another nation might help to save Grope from himself, as it would all be conservatives who would be in a position to do it - assuming they try to legitimize their action with the 25th and not just go for plain, old-fashioned assassination. Conservative lawyers and law-makers tend to see precedent as paramount, and going all the way back to Roman times is not likely to carry much weight with them. Then again, in another generation, given the state of the average American's knowledge of history, they might use Game of Thrones as a precedent. ;)

Paul SB said...

Larry & Raito,

On the play, Brin's piece makes no change to the timeline, it only uses a familiar sci-fi trope to explain some of Reagan's stupidity. This is different from Raito's thought exercise or the old Trek episode, which have more temporal implications.

Raito said:
"I can see some of those people in favor of cutting taxes feeling good about themselves when they have a lower tax bill next year -- because they have no income to tax. I wouldn't put it past some people to fail to see that particular forest for the trees."

That would be a case of cutting off their noses to spite their faces, as the expression goes, but I suspect that getting to the point where that sort of thing wasn't obvious would require years of frontal lobe atrophy.

As far as the vampire/zombie thing goes, are you familiar with what archaeologists call a "battleship curve"? If you take some man-made object and graph its frequency over time (a frequency seriation), what usually happens is that when it is first invented or introduced, there aren't a whole lot of them, but for a time the number of the objects increases, widening the curve. Then it starts to lose popularity and for a while the numbers dwindle. On a graph it would be shaped like a battleship seen from above. Often these curves turn out bimodal. That is, the artifact type starts to dwindle for a while, then experiences a resurgence for a time, then fades away. This curve looks kind of feminiform, but what it shows you is a nostalgia pattern.
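
Here's a toy sketch in Python, with numbers I just made up, of what one of those curves looks like when you plot it (note the little second bump for the nostalgia revival):

    import matplotlib.pyplot as plt

    decades = list(range(1900, 2010, 10))         # time runs along the y-axis
    freq = [1, 4, 9, 16, 22, 15, 8, 12, 7, 3, 1]  # invented counts, bimodal on purpose

    # Center each bar so the plot widens and narrows like a battleship seen from above.
    plt.barh(decades, freq, left=[-f / 2 for f in freq], height=8)
    plt.xlabel("Relative frequency of artifact type")
    plt.ylabel("Decade")
    plt.title("Hypothetical battleship curve (frequency seriation)")
    plt.show()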

This site has a good example, though you have to scroll way down to the Sept 25, 2006 entry to see it. The graph is in red and salmon pink, so you can't miss it.

This gives a definition:
http://www.archaeologywordsmith.com/lookup.php?category=&where=headword&terms=battleship+curve

Pictures at:

Paul SB said...

Oop! Ack! Ptooey! I forgot to paste in the url:

http://www.dirtbrothers.org/college/introarchaeology.html

Paul451 said...

Jumper,
"For example, the "evil children" meme didn't really kick off until after the Exorcist movie when a bunch of boomers started having kids"

1960's Village Of The Damned is the trope-setter for creepy evil children. Everything after tends to follow that trope. And it's probably one of the early drive-in horror movies for boomers, hence their copying that trope when boomers started writing for Hollywood. (Twilight Zone's "It's A Good Life" aired around the same time.)

(OTOH, The Exorcist didn't really contribute to that, IMO. Regan was too old (teenager) to really count as a child, and the time-period overlaps too closely with The Omen (which is your more classic evil-child.) Exorcist created so many of the classic gross-demon tropes, but it was Village and Omen that nailed the evil-children tropes.)

LarryHart,
" "Back to the Future" seems to operate with multiple timelines, but if so, then it cheats when Marty McFly starts to fade out of existence when his parents seem unlikely to marry in the new timeline."

No, the writers clearly intended a single timeline. With a delayed effect on the time-travellers. (The only inconsistency is 2015-Biff taking the almanac back to 1955. He shouldn't have been able to return to Doc'n'Marty's unaltered 2015. And they shouldn't have been able to return to the altered 1985, since Doc had been locked up and couldn't have invented time-travel. Unless the time-machine itself becomes a fixed point of existence once invented, hence changes can't prevent its invention.)

That's what I tried to show, several posts back. There's no way for any specific character to experience the trilogy's story arc if you assume branching-timelines/many-worlds. You can get close on one branch, but it always fails.

Likewise Star Trek, it's clearly written with time-travel in the single timeline. Parallel universes are a separate phenomenon and I think every example has crossings always happening at the same point in alt.history. Time-travel is always within the same timeline, parallel universes are always at the same time.

The multiple-timelines Nu-Trek save for Old-Trek only works if, for example, the presence of magic red-matter causes time-travel to also jump universes. Ie, a new thing.

PaulSB,
I just image-searched "battleship curve" instead. Weirdly it doesn't seem to be used anywhere except archaeology. (And battleships, I guess. Hmmm, I wonder what the battleship curve for battleships would look like. Long skinny bow, blunt stern.)

Paul451 said...

Not sure if this has been mentioned:

There's a "contact binary" star system around 1800ly away that is approaching merger (and hence nova.) Researchers were able to pin down the date to 2022, give or take a year. With tighter predictions expected as they watch the system winding up.

Should reach magnitude 2. So not just naked eye, but "hey what's that bright red star?"

http://www.astronomy.com/news/2017/01/2022-red-nova

Paul SB said...

Paul 451,

It makes sense that archaeologists would use that kind of diagram (frequency seriation, colloquially the "battleship curve" chart), but you see these in paleontology and evolutionary biology, too. They usually look a little more smooth when paleontologists make them, probably because they are based on more approximate population estimates. I just don't know what they call them - they must be using a different name. Here's an article that uses one for dinosaur evolution, and they are flat at the prow, thanks to old Dr. Alvarez's asteroid.

http://www.getbirding.com/?p=552

Ioan said...

I ran across this article recently.

https://www.bloomberg.com/view/articles/2016-12-20/manufacturing-matters-even-if-it-doesn-t-create-many-jobs

This article tells me two things:
1. Trump opponents are correct that most jobs were lost to automation
2. Economic analysts are living in another world if they think that the loss of about 1 million manufacturing jobs to outsourcing isn’t a big deal. In other words, Trump supporters have every right to be angry that they lost 1 million jobs to other countries.

https://www.google.com/search?site=&tbm=isch&source=hp&biw=1600&bih=799&q=manufacturing+jobs+in+the+us&oq=manufacturing+jobs&gs_l=img.3.2.0l10.1422.6166.0.9773.24.14.3.7.8.0.134.1521.3j11.14.0....0...1ac.1.64.img..0.23.1528...0i10i24k1.rc1K74F_FLA#imgrc=NWMGetp7v8hXqM%3A

Now, that graph ends in 2010 or 2011, at the bottom of the job market. Since then, the US has added about 0.8 million jobs, almost compensating for the jobs lost to outsourcing.

http://www.factcheck.org/2016/12/obamas-record-on-manufacturing-jobs/


What do you guys make of this?

raito said...

LarryHart,

There is another time-travel variant. The one where a change in the past propagates to the present at (ironically) some rate, such that the protagonists must change the past within some amount of time, or Things Will Go Bad. The problem is that one would then be led to assume that the first change would propagate through, with the second change following. But it never seems to go that way. Instead, the changes cancel instantaneously, which makes the whole propagation idea not really work.

Paul SB,

One example there would be surf music. It was a pretty small niche even in its original 63-64 incarnation, then mostly existed on oldies stations. Even a (very) small resurgence at the end of the 70's didn't spark anything (although it was enough to get that scene going again in eastern Europe, of all places. It never went away in Japan, where being a faddist is apparently a norm). Then Pulp Fiction came out in 94, which was enough to renew the genre, and it's been pretty steady ever since.

But I wonder how much the internet affected that. Now, fans of anything can find each other, and the critical mass necessary to maintain a subculture is pretty small, maybe in the handful of hundreds. Are those battleship curves, even with nostalgia, going to change as nothing ever really goes away due to computer storage and the internet? Look at the Baen Free Library, for example. It made nostalgic-looking curves, but really those curves resulted from exposure to new readers through free media.

Ioan said...

Oops, wrong link

http://blog.cnccookbook.com/2013/04/01/the-decline-in-us-manufacturing-is-more-recent-than-you-think-and-turning-around/

Paul SB said...

Slim Moldie made a very interesting point in the last thread that I thought was worth pointing out again. Sorry for bringing it up so late. Sometimes my memory does odd things in the morning.

Here's what he wrote (I think he's a he, but not sure):

One issue regarding the attack on science is in language. Association and negative connotation. Imagine if "science" was a trademarked brand name. Every time we see or hear (bad, inaccurate, unsubstantiated, fraudulent and pseudo) paired with "science" our brains are making associations--even when intentions are well meaning. And I'd hazard a conjecture that the less an individual knows about what science really is the more the negative associations will inform their thinking or lack of...

This reminds me of something I heard (or maybe saw, but I think it was more likely a radio show or a book on CD) talking about children's TV shows & literature. They were comparing how "Blue's Clues" impacts children's behavior vs. "SpongeBob," then at one point they brought up the idea of using children's lit for moralizing purposes. The conclusion was that it usually has the opposite effect intended because of neuronal mirroring. They explained this one with a "Berenstain Bears" book, in which one of the brothers decides to get his way by hitting his little sister. In the end the brother is punished, shamed, and apologizes and promises to never do it again. But young children reading the book tend to turn around and start hitting their younger siblings. Why? They haven't developed the frontal lobes well enough to really get the moral distinction/consequences idea the way most adults would. But when they see someone hitting someone else, their mirror neurons rehearse the behavior unconsciously, adding "hitting" to their repertoire of behaviors.

Think about that in terms of both the memes flying around in propaganda space and the actual behavior of many voters. The more often demagogues ridicule scientific conclusions they don't like, throwing around terms like "junk science" the more that reinforces negative schemata in the minds of their audience. Many adults will get the moral message about some science being good but some being bad, but "adults" who have less developed frontal lobes will simply mirror what they hear. You can generally tell who has less myelin in their frontal lobes by the kinds of irresponsible, risky behavior and ideas they exhibit, like excessive drinking, drinking and driving, gambling, addictive behaviors, use of bullying tactics to get their way, excessive bragging, belief in the importance of individuals, "will to power" egomania, greed, anything that emphasizes self-indulgence over impulse control.

Paul SB said...

If we want to get people more on board with science and have more positive associations, we probably need to simply flood the airwaves with examples of good science doing good things for people. It's all around us, so much so that we take it for granted. Anyone who has any platform to bring up the benefits of science really should do so every chance they get, so the people surrounding us build richer schemata.

Don't you love it when someone goes to a fertility clinic because they can't have children, get fertility treatments, then produce a litter of half a dozen babies and start praising Jesus for it? It wasn't Jesus who made those fertility pills. Then they have a heart attack and doctors save their lives, and they thank the Lord for it, but don't bother to thank the doctors who saved their lives. I like to point out to my students sometimes that it was medical science that brought our infant mortality rate down from 50% a century ago to the 8% it is now (which is actually a few points higher in the US than other modernized nations). Looks at a classroom full of kids, imagine half of them not there because they died before they were 3, and think about what it is scientists really do, in spite of the movies and comic books that would have you think that scientists are mad and evil.

Paul SB said...

Raito,

I think that with the internet preservation of fads (maybe that should be an abbreviation - IPOF) you will see these curves get really, really long and thin, stretched out over much longer periods of time. Will they go on forever? Well, is there anything that goes on forever? I mean besides the typical work week ...?

Paul SB said...

Ioan,

I checked out that article, and it has a few problems in terms of both its logic and factors that it doesn't account for. It does have a couple good points, though.

Problems: The major decline in manufacturing starts in 1999 according to his graph, but then he blames this on offshoring and states that a huge decline in energy cost was necessary to make offshoring possible. But then, the huge decline in energy cost didn't happen until 2009.
No mention is made of the role of technology in energy prices, and the fact that energy costs are going back up shows that the fracking boom is only a temporary boost in the inevitable decline of a non-renewable resource.
You can counter the low energy cost leading to overseas outsourcing easily enough, as low energy costs lower the costs of manufacturing at home.
No mention is made of labor costs, except tangentially at the beginning where the author points out that manufacturing jobs pay better than the average American wage (and completely misses the illogic of saying that people think of manufacturing as dangerous drudgery in spite of higher pay - so it's high-paid dangerous drudgery).
The Chinese currency manipulation argument doesn't hold up as a cause of the manufacturing decline, since it didn't start until 2004, five years after the manufacturing slump.

The two good points: For one, the impression people have and the reality are not the same thing, and people would do well to get the facts rather than just listening to the gloom-and-doom media machine and believing every word of it.
The point he makes about more small-batch, customized manufacturing becoming a good trend to jump on is probably quite sound, and facilitated by advances in 3-D printing technology. If I had the money to invest in a business, that might be a good way to go. But like most people I live paycheck to paycheck; so much for the American Dream!

TheMadLibrarian said...

Paul451, the movie "Hidden Figures", about mathematicians at the start of the NASA program and their contributions, and especially how they overcame the strikes against them for being female and African-American, beat out Rogue One this week at the box office. More positive news for people with a taste for factual scientific history.

LarryHart said...

@TheMadLibrarian, re: "Hidden Figures"

I implicitly trust my teenage daughter's taste in entertainment, especially after she introduced me to "Hamilton" (and I returned the favor by hooking her on the comic book "Saga"). She and another mathematically-inclined friend loved the recent movie about Alan Turing (I forget the title), and while watching that one, they saw previews for "Woman in Gold", which they then insisted on seeing. It is no surprise that the two of them are now hot to see "Hidden Figures".

Twominds said...

@Paul451 7:25 AM,
I just image-searched "battleship curve" instead. Weirdly it doesn't seem to be used anywhere except archaeology. (And battleships, I guess. Hmmm, I wonder what the battleship curve for battleships would look like. Long skinny bow, blunt stern.)

IIRC we used "willow leaf pattern" or just "leaf pattern" for such graphs. I understood PaulSB's term for it, even though I hadn't seen it before.

Palynologists (pollen specialists) use these types of graphs too, to study changes in plant cover over long periods. Archaeologists greedily use them and learn to read them; they're indispensable for understanding the landscape and climate the peoples they study lived in, and how those influenced them.

Twominds said...

@Raito 7:53 AM
Now, fans of anything can find each other, and the critical mass necessary to maintain a subculture is pretty small, maybe in the handful of hundreds. Are those battleship curves, even with nostalgia, going to change as nothing ever really goes away due to computer storage and the internet?

I'm a member of a classic caravan club that finds and restores old caravans and camping paraphernalia from the early '60s to the late '80s. The internet is indispensable for us, but I see a natural tapering off, when either no more restorable caravans are to be found or it gets too expensive for most members. So, a leaf pattern: from a couple of early enthusiasts, to its current relatively broad form, to its probable pointy end in a couple of decades.

Catfish N. Cod said...

You can have both your single-timeline cake and eat your multiverse theory too. I've seen it done. There are two main methods:

(1) The discrete-timeline method. Jumping timelines and time travel are two separate activities, accomplished by completely separate methods. Most commonly, timelines are created naturally by the laws of physics -- irrespective of any actions by any characters -- and act as parallel dimensions. Time travel within any single timeline allows change in that timeline *and no others*. This is compatible either with a deterministic system (You Already Changed The Past) or a ripple system; in the latter case, your actions may or may not have created a new timeline, but if they did, you'll have to do parallel-dimension travel to find the unaltered timeline again.

(2) The change-threshold method. In this system, small changes in the timeline -- that don't cause major butterfly effects -- result in You Already Changed The Past. For instance, you can have a Nazi set an Enigma machine to RSTMN or MNRST, and it's not really going to affect matters much; Bletchley Park will decode it a few seconds earlier or later and the rest of history continues unchanged. Large changes result in new timelines -- if FDR dies a year later, history is deeply changed as he is the one that orders the use of atomic bombs; if he dies a year earlier, history is deeply changed as Truman never becomes President. There's some threshold of "change" that determines if the timeline branches, and typically, you can't predict in advance what will and won't be a sufficient change. (Would *you* be able to predict in advance that losing or not losing Special Order 191 would be a determining factor in the American Civil War?)

However, for plot purposes, it's often too confusing to make these distinctions. If you do make them, they have to be part of the development or tension in the plot -- "we don't know what changes will cause a divergence, so be very careful!"

LarryHart said...

raito:

There is another time-travel variant. The one where a change in the past propagates to the present at (ironically) some rate, such that the protagonists must change the past within some amount of time, or Things Will Go Bad. The problem is that one would then be led to assume that the first change would propagate through, with the second change following. But it never seems to go that way. Instead, the changes cancel instantaneously, which makes the whole propagation idea not really work.


Of all variants of time travel stories, the one I like the least is where the effects of altering the time stream slowly fade in (or out) and the characters can tell that a change is in progress. Marty McFly fading out is an example. To me, that's the worst of both worlds--Marty existed in the new timeline (or the past of the one-and-only timeline) for some time, but then might cease to exist there later? It doesn't resolve any paradoxes; it might just be creating a new one.

LarryHart said...

Paul SB:

Think about that in terms of both the memes flying around in propaganda space and the actual behavior of many voters. The more often demagogues ridicule scientific conclusions they don't like, throwing around terms like "junk science"


Or the more recent one, "Fake News!". Which is so ironic, I really wanted someone to ask Trump, "Oh, is fake news a bad thing now?" He rode fake news to the White House. And now, he's trying to tar a network with the label as if there's something wrong with fake news. Doubly disingenuous, because he's using the term to mean "news that I don't like" rather than anything actually...whatayacall...made up. So made up news is ok, but news that he doesn't like is "fake". We really are in 1984 territory here.

LarryHart said...

Catfish N. Cod:

The change-threshold method. In this system, small changes in the timeline -- that don't cause major butterfly effects -- result in You Already Changed The Past. For instance, you can have a Nazi set an Enigma machine to RSTMN or MNRST, and it's not really going to affect matters much; Bletchley Park will decode it a few seconds earlier or later and the rest of history continues unchanged. Large changes result in new timelines -- if FDR dies a year later, history is deeply changed as he is the one that orders the use of atomic bombs; if he dies a year earlier, history is deeply changed as Truman never becomes President. There's some threshold of "change" that determines if the timeline branches, and typically, you can't predict in advance what will and won't be a sufficient change


I think what you're talking about could be seen as "stable" vs "unstable" perturbations to the timeline. Or maybe "converging" and "diverging" is more accurate. What you call "small changes" might be said to create a branching timeline which differs slightly from the original, but then converges back to that original (or at least becomes indistinguishable from it). What you call "large changes" knock over enough dominoes that the new timeline will be forever different from the original.

Think of your changes as creating a new "timeline prime" which differs from the original timeline by some error function of time. With small changes, the error function tends back to zero as time moves ahead. With large changes, the error function increases with time.
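
(A minimal toy sketch of that error-function picture, purely as illustration: the exponential feedback factor below is an assumed simplification, not anything proposed in the thread. A change with feedback below 1 damps back toward the original timeline; above 1 it compounds into permanent divergence.)

```python
def timeline_error(initial_error, feedback, steps=50):
    """Divergence between the original and altered timelines at each step.

    feedback < 1 models a self-correcting ("small") change that converges
    back toward the original timeline; feedback > 1 models a butterfly-effect
    ("large") change that diverges forever.
    """
    errors = [initial_error]
    for _ in range(steps):
        errors.append(errors[-1] * feedback)
    return errors

small_change = timeline_error(1.0, feedback=0.8)   # error tends back to zero
large_change = timeline_error(1.0, feedback=1.2)   # error grows without bound

print(f"after 50 steps: small-change error ~ {small_change[-1]:.6f}, "
      f"large-change error ~ {large_change[-1]:.0f}")
```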

raito said...

Catfish N. Cod,

I have never liked the change-threshold method because it raises the question: what counts as "big"?

LarryHart,

I don't like those sorts of stories either, but there are a lot of them.

Paul SB,

With respect to science, I don't think most people understand it at all. Especially not research. I think most people think that negative results are >failure< (which they are not). So in most people's minds, any research yielding negative results was a waste of time and money. They don't see that narrowing in on positive results is something worthwhile. This comes up often in the form of 'we should spend money on our problems here, instead of spending it on research', completely losing sight of the fact that it's research that solved a lot of problems in the past. Then again, I doubt those people ever learned to fish.

LarryHart said...

raito:

I have never liked the change-threshold method because it raises the question: what counts as "big"?


That's why I like the mathematical description I just invented a few minutes ago (a few posts up). It doesn't rely on subjective decisions about what is "big" or "small". The error function between the two timelines either converges to zero or it doesn't.

David Brin said...

LarryHart tags the time travel variant I like, which is statistical/stochastic. You meddle in the past and create probability ripples. Sometimes they lock in. And sometimes the implausibility of the ripples just makes them fade away... as we will soon see happening in our peripheral vision, as the universe looks at Trump and goes.... naaaah.

David S said...

On manufacturing, I think it's helpful to look at the manufacturing labor force as a percentage of all labor (instead of just looking at the absolute number). When you do, you see manufacturing's share steadily decreasing since 1950.

If you look at the absolute number, the trend since 1980 is relatively flat except for steep declines during recessions (and there isn't much recovery post recession).

Here is a link to a graph of manufacturing labor since 1940:
http://www.kylefitzgibbons.com/uploads/1/2/7/0/12705850/screen-shot-2016-12-05-at-12-01-11-pm.png
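
(To make the share-versus-absolute distinction concrete, here is a rough sketch. The employment and labor-force figures are approximate round numbers assumed for illustration, not exact BLS data.)

```python
# Rough, illustrative figures in millions (approximate, not exact BLS data):
manufacturing_jobs_millions = {1950: 14.0, 1980: 18.7, 2000: 17.3, 2016: 12.3}
total_labor_force_millions = {1950: 62.0, 1980: 107.0, 2000: 142.6, 2016: 159.2}

for year in sorted(manufacturing_jobs_millions):
    share = manufacturing_jobs_millions[year] / total_labor_force_millions[year] * 100
    print(f"{year}: {manufacturing_jobs_millions[year]:4.1f}M manufacturing jobs "
          f"= {share:4.1f}% of the labor force")

# The absolute count is roughly flat from 1950 to 2000 before dropping,
# while the share falls steadily the whole time -- which is the point above.
```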

Paul SB said...

Raito,

I'm not too sure about the fishing thing. I have known enough witless fly fishers who go out, cast a rod and start downing the beers. If it takes all day to catch a fish, that's fine, because they drank a lot of beer with their buddies. If they had to live off their knowledge of how to fish, they would be digging in garbage cans.

But what you say about most people being pretty clueless is mostly right. Negative results sound like failure, and Real Men ® don't ever admit to failure. Real Men ® don't value patience, they value aggression. Science tends to require patience.

I just came back from watching "Hidden Figures" and, even though it was a slow, thinking, action-free movie, it was well worth it. The heroes are mathematicians, one of whom ends up becoming an engineer. Mostly the humor kept it going. I just wish my son had the patience to sit through something like that. Maybe by the time it comes out on DVD.

Jumper said...

Some variants make better stories; true. I suspect small changes would butterfly up to major changes always, every single time, given enough time, or rather, given a surprisingly short amount of time. I think a pebble tossed anywhere on earth, into any human's path whatever, would within 100 years ripple out such that no one is born who otherwise would have been. Just a few slight differences in reproductive timing, to be delicate about it all, would make all the difference. One second off and your brother would be your sister, or chance meetings wouldn't occur, etc.

Slim Moldie said...

Asimov's "End of Eternity" has always been my favorite for time travel. Any other stories using the kettle system? "Worlds of the Imperium" would be one...even though you have to argue to apply the time travel stamp...

raito said...

Paul SB,

I mean fishing in the sense of 'Give a man a fish and he eats for a day. Teach a man to fish and he eats for a lifetime.'

Paul SB said...

Okay, now I get you, Raito.

Ioan said...

David S,

I disagree for two reasons. First, looking at manufacturing as a fraction of the workforce is an intellectual exercise without many political consequences.

Let's look at a hypothetical: the absolute number of manufacturing jobs stays constant at 19.5 million people. This still means a drop in manufacturing as a fraction of the labor force (just not as steep). However, the factories remain in the small towns of what we call the Rust Belt.

As a result, the political surge which created Donald Trump never happens. That's why I'm not actually interested in the drop in manufacturing as a percentage of the population: it doesn't convey the political changes as well as looking at absolute numbers.

Trying to understand how many manufacturing jobs were outsourced versus automated gives us ammunition against demagogues like Trump. Do you want him to run a campaign of "the liberal elite have been exporting this nation's manufacturing jobs since 1950"?

Ioan said...

Second, emphasizing the declining share of manufacturing employment since 1950 in essence gives Republicans the ability to blame the decline of manufacturing on Civil Rights. They could blame Brown v. Board of Education for causing the decline.

TCB said...

I like to fiddle around with old tube amps (found in tape recorders, record players, any audio gear made before about 1965). Not that I'm very competent with the repairs, mind you... but I still find one every now and then at the thrift store for twenty bucks or so. You can turn them into low-watt guitar amps.

Soooo. Crack one of these things open and what do you see? Tubes, capacitors, resistors, switches, a couple hundred feet of wire... all put in by hand. All soldered by hand. All screwed in by hand. All done in Chicago or someplace like it.

When you look up the prices these things sold for new, holy cow, they're expensive! Some of the better ones, like this circa-1962 Roberts 990 stereo reel-to-reel advertised by Doris Day and Percy Faith, listed at $399.50. That's over $3100 in 2016 dollars. Even the cheap ones from Sears were not cheap. Built like tanks, meant to be repaired if they broke. When I was a pup the Western Auto store had a whole corner that was just replacement tubes and parts, the way the Home Depot has a whole aisle that's just light bulbs. Anyhow, I picked up an old Roberts just like this for about forty bucks. You really can't fix the tape mechanism because the rubber rollers and whatnot are not even made now. The twin amps in the 990, however, you can totally refurb. And they are still considered magnificent by those who know what they are.

I'm nattering, but the point is: that's a HUGE amount of (human) labor that went into that Roberts. A week's worth? More? Far more than modern electronics require, because all those little components you see on a circuit board are placed there by robotic arms (I've seen one doing this up close, and you couldn't do it by hand even a fourth as fast). Then they are soldered mechanically too, by one of several methods.

Even if they still did it in Chicago, you wouldn't expect a skilled human at living wage to spend an entire week making, say, a single Apple Macbook. If they did, that $1300 Macbook might cost something closer to $3100.

Anyway, there's a point in there someplace about where manufacturing employment went... what people did in a factory 50 years ago looked almost artisanal. Things that are made that way now are likely to be high-end or craft items, such as fine furniture.
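
(A back-of-the-envelope check of the inflation math above; the CPI values below are approximate figures assumed for illustration, not something cited in the comment.)

```python
# Approximate annual-average US CPI-U values (assumed for illustration)
CPI = {1962: 30.2, 2016: 240.0}

def in_2016_dollars(amount, year):
    """Scale a historical price to rough 2016 dollars by the CPI ratio."""
    return amount * CPI[2016] / CPI[year]

roberts_list_price_1962 = 399.50   # the 1962 list price quoted above
print(f"${roberts_list_price_1962:.2f} in 1962 is about "
      f"${in_2016_dollars(roberts_list_price_1962, 1962):,.0f} in 2016 dollars")
# prints roughly $3,175 -- consistent with the "over $3100" figure above
```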

LarryHart said...

Slim Moldie:

Asimov's "End of Eternity" has always been my favorite for time travel.


I like the story, but I have to turn off some critical parts of my brain while reading it. The weirdest thing for me is how each century is treated as a unit--a "floor" in a building. You get off the elevator in the 95th Century (for example), and you're in a particular "place" that you can return to in real (subjective) time, but you don't zero in on specific years like 9401 or 9456 or 9492.

I don't want to spoil the ending here, but my understanding is that the key image that the protagonist plants in a magazine from the past is something that Asimov actually saw in an old magazine and wondered why that image would be there. So he wrote a time travel story to explain the seeming-paradox. I suppose one could write a similar story to explain how the birth announcement for a certain president was introduced into the print run of a Honolulu newspaper from 1961.

Alfred Differ said...

If they made my Macbook the older, manual way, I doubt I would think of it as craft ware. More likely crap ware. 8)

I've got an old Heath CW transmitter (HW-16) sitting on the filing cabinet behind me with tubes and wiring and such. I learned a lot using it and from it back in the 70's, but no one would make these things that way anymore. If I want 100W of shortwave radio CW transmission, I'm sure I could do much better now and spend much less.

I also have an old Commodore PET (right after the chicklet keyboards) that the first owner bought at $5K in 1979. It came with a double (floppy) disk drive unit and a printer. Each unit had its own 6502, so it was the first time I saw a commercial application for concurrent programs. In 2016 terms, that $5K price is about $16.6K, so it shouldn't be a surprise to learn the first owner was a small business owner looking to automate one or more of his business processes. I bought it off him about two years later for $2K because he knew by then that the hardware is only part of the cost of automation. Today that would be about $5.3K. What I could buy today for that price isn't even in the same galaxy due to quality improvements.

Automation isn't the only story behind events in the manufacturing sector. Competition is the other part of the tale. There were many years between my first computer purchase (I was ~19 years old) and the second, because I could see the accelerating change. That first loan for the computer was a fluke no lender was likely to repeat, so I used to sneak into computer labs in grad school and use theirs. My second (1996?) was a 486 Frankenstein machine combined with a piecemeal upgrade plan. With no network, it WAS in the same galaxy, but I was a buyer in full control of my purchasing decision and its timing. Manufacturers had no control.

Whether jobs in manufacturing stay the same in volume or not, the point I'm making is that manufacturers have lost a lot of control and produce what many of us see as commodities. The economists are pretty clear about what happens to the price of commodities in the presence of competition. Employees in companies that face this reality can't be in control either, and turning to the voting booth won't help. It really won't. Buyers will sit on purchases until after another election. What is to stop us?

Jumper said...

Quality goes away too.
I remember the peak boombox. It had dual cassettes, AM/FM radio, a graphic EQ, microphone input, headphone output, and was cable-ready, with a cable input for back when your cable TV didn't strip out all the radio stations incidentally picked up on their nice tall towers. Those features slowly went away, until you weren't even able to record your own audio.

LarryHart said...

Jumper:

Quality goes away too.


That's because innovations in entertainment-delivery technology have not been about enhancing the customer experience, but about strengthening copyright enforcement. From the POV of the Disney corporation or Time-Warner (for example), the losses you mention are features, not bugs.

raito said...

Even more, all you needed to repair those was a soldering iron, some solder, and a new component. Even the last generation of surface-mount devices were repairable with little more than that (you might want a magnifier).

Today's electronics packages are astounding compared to what I started with, even leaving aside processors.

And with it came the added expense of manufacture. Sure, you need fewer people, but you need more equipment. You can't do it with a soldering iron any more. Wave soldering is goofy enough, but baking the boards to solder them? It boggles the mind if you started where I did.

As for audio entertainment, I think that for me, the sweet spot was CD and the hard drive iPod. In that time, one could carry one's whole music library. Then things got weird. Now, the providers are in control again. No one appears to care. On the other hand, my own library could play for a year without repeats.

Which leads to other questions. Why would I seek out new media when I already have more than I can possibly reasonably use? What can they offer me?

Paul451 said...

Re: Multi-thousand dollar early amplifiers/computers/etc.

IMO, this is what makes the decline in median income (as a proportion of Real Per Capita GDP) more than just a "wouldn't it be nice if..." or some moral tut-tut story about the 1%.

What is the modern equivalent of a small business buying an early personal computer? Or the modern version of an average home spending $3000 on a single-function device that's the equivalent of a tape recorder?

Electronics needed to pass through that phase to get here. As did airlines before it. As did cars before that.

Cheap electronics was able to "keep up" with the decline in buying power of the vast majority, but what about the next thing?

What technology is being strangled at birth, simply because there's not enough market for it to transition from the lab, or geek/hobbyist sector, to the mainstream?

The lower price-per-performance of electronics has papered over the cost of the decline in the share of wealth held by the majority of the US public, and hidden the opportunities lost.

[For example: The only reason most people could afford a smart-phone (especially an early smart-phone) was because it was subsidised by the network by getting you to sign up for a 12-month or 2-year contract. To a large degree this is still true today. But how? How were the network providers making enough money to maintain and upgrade their networks and pay for all those smart-phones, off the same revenue? Because the same advances in technology were drastically lowering the price (and increasing the capabilities) of commercial-scale network hardware as well. Instead of lowering the price, those improvements subsidised consumer hardware that people couldn't actually afford. Alfred thinks that's a win. But what did we lose? If income were distributed in similar patterns to, say, the early '80s, vastly more people could easily have afforded to buy the equivalent of a smart-phone outright. Network prices, therefore, would be more competitive and prices would plummet. New services could arise that are now locked out. Perhaps the drop in network hardware costs would have resulted in entirely new types of networks that would have broken the old providers completely, a decade before they could adapt. Did we, in effect, give buggy-makers the time to buy up the roads and make us pay rent on our own driveways?]

----

PaulSB,
"But what you say about most people being pretty clueless is mostly right. Negative results sound like failure, and Real Men ® don't ever admit to failure. Real Men ® don't value patience, they value aggression. Science tends to require patience."

Before we get too precious about how those people don't understand Real Science because they don't understand that Negative Results aren't failure...

...science has the same problem. It's vastly harder to get a paper published reporting a negative result than a paper reporting a positive result.

Alfred Differ said...

Quality should go away if it doesn't help with competition. In the case of the boombox, though, it was anti-competitive forces that drove it, as Larry already pointed out.

Quality is usually something we buy if we can get it at a reasonable price. Not only can we do more with it, we can use it for social signaling. I've seen quality fade in products, but my own personal experience says the product is usually fading too. Battleship curves again.

I got to pull out my soldering iron the other day and repair the intermittent headlight on my wife's car. Useful? Sure. It was the first time I've used it in about a decade, though. My storage costs vs paying someone else to do what I did probably don't come out in my favor, so prudence argues that I should toss that equipment. Ain't gonna happen, of course. Those skills are integral to how I see myself. They are part of my identity. Fortunately for the younger generation, they won't be trapped exactly that way. They'll have 3-D printers and other computer-aided maker tools. 8)

Alfred Differ said...

I still carry my iPod with my library because I prefer a phone that is just a phone. I will probably flex on that with my next phone purchase, but I don't like the always-present demand for my attention. My music should never be interrupted by mundane life. 8)

As for new media, I'm not inclined to move again, but I'm beginning to see the point of being able to leave my library at home to be accessed where I am. They can't deliver it straight to my auditory nerves yet, so I'll probably wait a while. They don't handle hearing impairment all that well yet, so I'm more interested in interfaces than distribution media.

I AM still willing to buy music, though. Boredom is a terrible thing.

Just got the Hamilton soundtrack. 8)

LarryHart said...

Alfred Differ:

I AM still willing to buy music, though. Boredom is a terrible thing.

Just got the Hamilton soundtrack. 8)


I envy you the novelty. You won't be bored for a while now! The problem will be trying to turn off the soundtrack in your head once it gets in there.



David Brin said...

Just bought the new Macbook Pro with the touch bar. They eliminated right-click! Sigh. But there's other stuff. Hints welcome.

onward

A.F. Rey said...

The problem will be trying to turn off the soundtrack in your head once it gets in there.

No, the problem will be trying to turn off the commercials that will come with the soundtrack in your head. :)

David Brin said...

onward