Wednesday, July 11, 2012

Will Johnny Code Again?

Partly in response to my famous challenge on Salon “Why Johnny Can’t Code” -- denouncing the disappearance of basic, introductory coding languages from our personal computing devices -- several groups have done wonderful things to help bring back at least a simple, reliable way for kids to learn programming.

For example, Quite BASIC is a cool, easy, accessible BASIC site, offering a simple and obvious entry and display system, usable via any mere browser, with ease of use and applicability to simple textbook exercises. Quite BASIC is instantly ready to use, with plainly separated coding, palette and results screens.

Just last week I spoke at Microsoft Research and brought this topic up again (along with many others; it was a wonderful audience of 200 or so brilliant people). So they are aware of the "basic" problem.

Given today's lavish available memories and cyber-power, they could tuck into the corner of Windows a turn-key system so simple and universal it might tempt textbook publishers to bring back "Try it in BASIC” exercises that used to ease millions of kids into ten-line programs that showed them that exciting, world-changing epiphany….

...The moment when they realized: "Wow! Every pixel is created by an algorithm!"
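That epiphany still fits in about a dozen lines. Here is a sketch of one such exercise, written in Python rather than the BASIC of those old textbooks (purely illustrative): every character "pixel" on the screen is lit or left dark by an algorithm.

```python
import math

# A ten-line taste of the old "Try it in BASIC" spirit: decide, cell by
# cell, whether each character-pixel lies on a sine wave, and light it.
WIDTH, HEIGHT = 40, 10
screen = []
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # The algorithm: the wave's height (in rows) at this column.
        wave = (HEIGHT - 1) / 2 * (1 + math.sin(x / 4))
        row += "*" if abs(wave - y) <= 0.5 else " "
    screen.append(row)
    print(row)
```

Type it in, change the `4`, run it again: that is the whole lesson plan.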

And now some coincidental news: Just today I was shown a new BASIC system for the iPhone and iPad that attempts to chip away at the problem. Have a look at techBASIC.

This is the sort of thing - rather than 140 character lobotomizations - that we need to be encouraging for kids.  Before we become a society of dunces.

=== More about Your Existence ===

Here's an excerpt from Existence that's been eliciting some yuks... and yucks (!) from readers:

"E-calculi— gut bacteria transformed to function as tiny computers, powered by excess food. Have a problem? Unleash trillions of tiny, parallel processors occupying your own intestine! Speed them up by eating more! And they produce Vitamin C! 

"At first, Tor thought this must be a hoax. It sounded like a comedy routine from Monty Phytoplankton. She wondered how the computed output finally emerged."

If that weren't enough to entice you to race out, buy the hardcover and tell your friends, then how about this from the LA Times review: "Whodunits are a sure thing in publishing — just about everyone loves a good mystery — but Brin's multifaceted novel proves that another question resonates just as powerfully with most people: Are we alone in the universe?" 

And this by Simon Bisson on ZDNet: "Science fiction is as much a literature of the moment as it is of the future….

...This book, then, is both a warning and an encouragement: a novel that engages with the world we're building and tries to show us a way to become a mature civilization rather than a raggle-taggle band of individuals. Technology has libertarian roots, but in the end we build the tools that construct a civil society. 

"In Existence Brin shows us the world our technology is building, and then poses one of the biggest questions: what is it all for?

"What we're left with in Existence is one of those rare SF novels that needs to be on every technologist's desk, alongside John Brunner's Shockwave Rider, Vernor Vinge's Rainbows End, Charles Stross's Rule 34, and Brin's own Earth. We may not be able to see our future, but in Existence we get a picture of a possible — even a plausible — tomorrow."

=== Experts line up against High Frequency Stock Trading ===

This from one of the smartest and most on-target tech economy sages around - Mark Anderson, of the Strategic News Service: 

“At a time when bankers are already at the bottom of the reputational heap, it now seems that their ration of scorn has jumped again. Not only are the worst of them greedy, it turns out, and dangerous, and unrepentant, and unwilling to pay for their mistakes, and undesiring of the most obviously needed reforms - but they are also building systems so complex that even they cannot manage them.

"We already knew that retail investors were no longer safe on the trading playground, but now we've learned that neither are the big bullies who made the new rules. Having already heard that about half of the volume of the NY Stock Exchange was in what is now called High Frequency Trading (HFT), this week I've learned that that figure is probably conservative, and that as much as 80% of the trades may be program-driven.

"What kind of zoo is this? A practice ground for Chaos Theory? Is this the centerpiece of capitalism? When the NYSE not only encourages HFT, but profits by selling colocation of servers at exorbitant rates to allow HFT practitioners that extra picosecond of advantage - you know that the wheels are about to come off."

Of course, I have my own "crackpot" - or rather far-seeing - reason for wanting this HFT lunacy to stop. It is the surest road to "Skynet"... to the emergence - in secret and without the slightest oversight or public scrutiny - of AI that is programmed from the start to be predatory, parasitical, voracious, insatiable, amoral and relentlessly sociopathic.

=== And Finally... ===

I have a few shout-outs to members of the brilliant Contrary Brin community, some of whom I met in person during my recent book tour, and some of whom I'd like to ask a favor!  I'll say more in the first "comment" below.


Paul451 said...

Every shuttle launch. On the same screen. At the same time.

Yes, that one too. Surprisingly moving.

David Brin said...

I posted a final note to Tom Craver and to Tony Fisk under last posting.

Was great to see Stefan Jones again in Portland and to shake hands with the handsome “Sociotard” and to tout the brilliant Patrick Farley before his home-town crowd at Powell’s Books. (Help his book trailer go viral!)

Quick questions for the blogmunity! I have spotted various inaccuracies or questionable statements about me on the David Brin Wikipedia page. I'm not asking any person to fix any particular thing. But if some of you brainy and sensible folks were to look at the page and correct anything you know to be just plain wrong... it might increment us toward a fair representation of this funky sci fi contrarian guy. Thanks!

While we're on the topic: Alert to Tony Fisk! I've been told there's been activity on the PBworks wiki that tracks and scores "David Brin's predictions," especially those made in my novel Earth. Would you care to give me a summary of what's been going on? Sorry, but I am slammed from book touring.

Finally: I'm told I was mentioned in the original British version of The Office, season 2, episode 6. Can anyone verify?

David Brin said...

Ooops, I included the earlier msg to Tony in that last one. Let's just talk Monday or so.

If folks tweak my Wiki page, speak up here?

Rob said...

28 years now and it's still a shock to see that explosion. My heart skipped again.

David, I both like and don't like that Quite BASIC site. On the one hand, hurray for a simple program to learn and teach basic algorithms. On the other, there's a real reason why line numbers and unstructured code went away.

I can hope for a middle ground that doesn't employ C/C++ semantics to teach.

Walter R. Moore said...

Another kid-friendly coding platform is MIT's Scratch:

My daughter's public school down here in Florida uses it in the 4th grade, and they did an intro class for the 3rd graders. She came home excited about it and has been coding strange, quirky games with rainbow wolves and warrior cats ever since.

Rob said...

I shoulda driven to Beaverton. Alas, the whole day was opening day of camp week, and I've got four campers in the house.

Lorraine said...

I'm sure there's a glut of coders by now. If the objective is to make Johnny computer literate, maybe SQL is the way to go. It should lead to the kind of aha moment where you realize that anonymized data provide no anonymity - the assembly of "dossiers" from disparate sources, etc.
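Lorraine's "aha" can be made concrete with a toy sketch (hypothetical data, and plain Python standing in for a SQL JOIN): two datasets that are each "anonymous" on their own can re-identify people the moment they are joined on shared quasi-identifiers.

```python
# "Anonymized" medical records: no names, just quasi-identifiers + a diagnosis.
medical = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "flu"},
    {"zip": "90210", "dob": "1980-01-02", "sex": "M", "diagnosis": "asthma"},
]

# A public voter roll: names attached to the very same quasi-identifiers.
voters = [
    {"name": "Alice Smith", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "Bob Jones",   "zip": "90210", "dob": "1980-01-02", "sex": "M"},
]

def reidentify(medical, voters):
    """Link records on (zip, dob, sex) -- in SQL this would be a plain JOIN."""
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    names = {key(v): v["name"] for v in voters}
    return [(names.get(key(m)), m["diagnosis"]) for m in medical]

for name, diagnosis in reidentify(medical, voters):
    print(name, "->", diagnosis)
```

The names and diagnoses here are invented; the linkage trick is the real lesson.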

Rob said...

If there's a glut of coders worth anything, I have yet to see it.

Lots of underpaid overworked IT guys, though.

Carl M. said...

The problem with high frequency trading is not the speed; it's the lack of accountability. Require any entity which does high speed high leverage trading to be a partnership instead of a corporation and we have accountability. Screw up the code and you go on Food Stamps.

Stefan Jones said...

Hey, I actually have a DVD of The Office, Second Series right here!

I'll check it out.

We can only hope that any mention is positive. Gervais is smart, talented, funny, and really, really mean spirited.

Stefan Jones said...

Regarding elementary-school programming languages:

It is ubiquity, more than the particular language used, that is important. The teaching language DOES NOT have to lead each and every child down a path that leads to a CS degree. It DOES NOT have to be the latest hottest thing. It CAN NOT just be something that is incidentally included in Windows as a scripting tool.

I've heard good things about using Python as a teaching tool. It might be the best shot we have. We just need a standard graphic library, one that works on Mac, X, and Windows.
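One candidate for that role already ships with Python itself: the standard-library turtle module, which sits on Tk and so runs on Windows, Mac, and X. A minimal sketch (the function accepts any turtle-like object, so the logic can be exercised even without a display):

```python
def draw_square(t, side=100):
    """Draw a square with any turtle-like object exposing forward()/left()."""
    for _ in range(4):
        t.forward(side)
        t.left(90)

# On a machine with a display, the standard-library turtle runs it:
#   import turtle
#   draw_square(turtle.Turtle())
#   turtle.done()
```

Whether turtle graphics count as the "standard graphic library" Stefan wants is debatable, but it is the closest thing bundled with the language today.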

James Salsman said...

Google Blockly might turn into something good.

Stefan Jones said...

I watched The Office, second series, episode 6.

Ricky Gervais's character (the equivalent of Steve Carell's "Michael Scott" in the US version of the show) is named "David Brent," and at one point he says his name in a way that kind of sounds like "David Brin."
* * *
Glenn Beck WARNED us that something like this would happen if Obama was elected!

Stefan Jones said...

Oh . . . for those who have not seen it:

"What an astonishing thing a book is."

Rob said...

These days, the "teaching language" is Java, if you go by the curriculum the College Board developed for AP courses.

But I agree, Stefan, that language doesn't matter. The beef I have with Quite BASIC is nothing more than the introduction of elements which, when a student moves into anything more complex, are not present. Things like explicit line number labels.

Then, there is the limitation of Quite BASIC with respect to recursive algorithms, which have to be one of the coolest kinds of "a-ha!" moments you can have in math, as precursors to Mandelbrot plotting and other impressive recursions.
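That recursive "a-ha" is cheap to demonstrate. Here is one possible exercise, sketched in Python (my illustration, not Rob's): a Sierpinski triangle, built by gluing three shifted half-size copies of itself together.

```python
def sierpinski(n):
    """Rows of an order-n Sierpinski triangle, defined recursively:
    one centered copy of order n-1 on top, two copies side by side below."""
    if n == 0:
        return ["*"]
    prev = sierpinski(n - 1)
    space = " " * (2 ** (n - 1))
    top = [space + row + space for row in prev]   # one copy, centered
    bottom = [row + " " + row for row in prev]    # two copies, side by side
    return top + bottom

for row in sierpinski(3):
    print(row)
```

Ten lines, no graphics library, and the self-similarity that leads toward Mandelbrot-style plotting is right there on the screen.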

Mostly it's the fact that we can't agree on *language* that keeps publishers from moving forward. That and the fact that books are scheduled to evaporate five years before the most optimistic Singularity prediction anyway...

sociotard said...

I have some hope that 0x10c will take off as a game. If it went viral, you'd really see Johnny learning to code.

Seriously, if you haven't seen it, go look at his pitch. The game is still in development, but the pitch is the nerdiest sci-fi ever. It's all based around a very real bug!

The only way it could be better would be if NASA pitched in to help him get the physics right. His latest posts look like he might be chickening out about the whole "hard sci fi" thing, which is a shame.

David Brin said...

Stefan and Rob get it... the key is ubiquity. When ALL computers had basic, textbook publishers and teachers could use it and ALL bright kids got a taste.

There should be a conference, and Apple and MSoft etc. should agree on SOMETHING to have as standard, turn-key, and tucked in an easy corner.

Stefan... you call THAT a mutant carrot??????

Somebody find out if Ricky Gervais is a sci fi fan.

Digging like that led to my having a pre-copy of Existence sent to Temple Grandin... who gave me a great blurb!!!

David Brin said...

Huh... interesting fellow:

Is he the same guy who said "Hypocrisy is the homage that vice pays to virtue"?

François Marcadé said...

Sorry, wrong century.
"L'Hypocrisie est un Hommage que le Vice rend à la vertu" ("Hypocrisy is a homage that vice pays to virtue") was written by François de la Rochefoucauld in 1664.
He is known to generations of students as "Maxime de la Rochefoucauld" (Maximes is the title of his book, and means "aphorisms" in French). He is probably an ancestor of Robert de la Rochefoucauld, but probably not a direct ancestor, as François was a duke while Robert was a count.

Sorry not to participate more in your community; my work keeps me very busy.

Carl M. said...

TCL is lightweight, runs on anything, and comes with a graphic library [tk] which has since been ported to just about every other scripting language.

It is not math-centric. The original implementation treated EVERYTHING as a string. You have to use the expr command to do a math expression. But in return you get most of the cool things you can do in Lisp, without the parentheses. And extending TCL with C used to be trivial, unlike Ruby.

While ubiquitous, it has lost popularity to Python and Ruby, which has resulted in a rather nice feature for textbook publishers: it is a stable language.

Java, BTW, is a horrible language for this purpose. WAY too much overhead to do simple things. TCL/Tk, on the other hand, has the shortest GUI "Hello World" implementation of any language.

Acacia H. said...

So, Dr. Brin, is Contrary Brin your own personal smart mob? =^-^=

Rob H.

(For those wondering what I'm talking about, borrow, take out of the library, or buy "Existence" and read it.)

Naum said...

When ALL computers had basic, textbook publishers and teachers could use it and ALL bright kids got a taste.

But when ALL computers had basic, only the few affluent kids had access.

More resources, more tools, more ubiquity (Javascript, which runs anywhere) exist today than ever before, especially compared to that nostalgic age when rich hobbyists with the means to tinker believed the experience was "ubiquitous".

Getting "a taste" is just a click away.

Boris Borcic said...

Proposing BASIC is shameful, IMO.

Back in 1990 I stopped reading Eco's Foucault's Pendulum when I saw he chose BASIC to exhibit computer code in its pages. On aesthetic and cultural grounds, BASIC is an insult. (And I say this as someone with years of experience with various versions of it in the 70s and 80s.)

If one is concerned about providing a minimal amount of programming culture to the masses, for God's sake, there are a myriad better possibilities today, starting with dynamic geometry software and Euclidean constructions.

Ian said...

Sorry if I appear to be obsessing on the topic of SETI and the Fermi Paradox but i'm increasingly finding myself compelled to accept the logic that we're living in a simulation.

Specifically, we're the "Business As Usual" model.

Think about it: if you're an advanced alien race and you're considering establishing contact with a newly-discovered alien race, you're going to game it out; you're going to run simulations first.

One of those simulations is going to consider the Business As Usual scenario: how is the new race likely to evolve if we DON'T contact them?

If we accept that First Contact is innately risky, then any species considering it is going to be much more likely to do so if the consequences of NOT doing so are particularly dire.

IOW, the Space Brothers didn't land to prevent nuclear war back in the 50's because they'd run simulations showing that nuclear war was unlikely.

Now assume a universe with multiple species: you get more than one species running simulations, and you probably get more than one simulation per species.

If there are 10, 50, 100 simulated Earths and one "real" one then the odds are we're part of one of the simulations.

(In Existence David also alludes briefly to the possibility that our own descendants may run such simulations.)

Also consider this: even if interstellar travel is possible, it's likely to be massively expensive and difficult.

Additionally, consider that humans, if we could visit, say, Venus or Io, would have extremely limited and highly-intermediated contact with the physical environment. So what's the advantage of being there, as opposed to experiencing a simulation?

Heck, it's already happening here: far more people have seen films of the interior of the Great Pyramid than have actually visited it.

So even if aliens are real and can visit Earth, for every one that does so there are likely billions (or trillions) exploring simulations of Earth.

Which once again, implies many more simulated Earths than the single prototype.

Plato seems to be one of David's bêtes noires for his political views, but maybe he was onto something with that whole Shadows in a Cave thing.

(Maybe this explains the lousy track record of Mars missions: the Mars expansion pack is still in Beta.)

Rob said...

Basic as an insult, Boris? Shameful?

Very nearly ALL of the coding I do is in Basic. It's my professional go-to language of choice.

I think a discussion of "which language" is *entirely* beside the point, though.

The other day my 8th-grade nephew asked me for help setting up some software. He wanted to write a game for the Android phone his dad had just handed him.

We couldn't do it. The *setup work* was too complex. You need no fewer than nine tools installed before you can even begin! Or $300 in legos, which comes with bad software to program the Mindstorms robot. Or the $100 entry fee AND a Mac computer to program the iPhone. Or just $100 to get a program onto a Windows phone.

That's the problem, in addition to a lack of ubiquity. Really, the programming tools have got to be both easy and free, and there's no money in that. Picking on Basic as a language illustrates the problem: no software developer is an educator. And what we need are strong educators with software dev skills!

Ye Cats, I may have to become the thing in the world that's missing, if computer math education is ever to proceed. One can only hope someone will drop $3 million in my lap or pay me a living wage to develop it.

Rob said...

Carl, I think you're chasing ubiquity without recognizing that TCL isn't useful for algorithm teaching, which is a math discipline.

Again, suggesting extant languages is beside the point, in my view.

Boris Borcic said...

Rob, probably we should agree on what we are talking about.

I am pretty sure your "professional language of choice" looks nothing like "Quite BASIC" or Eco's own productions in "Foucault's Pendulum", which otoh do look like the BASIC that used to serve to "give a taste" of programming to youngsters (at least in my time).

Further, I'd bet your "professional language of choice" involves an IDE and focuses attention not on algorithms but on manipulating objects and interfaces from a myriad of libraries. While navigating countless libraries with code-completing IDEs may count as representative of what coding has become nowadays for most practitioners, it is far removed from what I understand as a relevant taste to give of programming.

I'll reiterate my contention that dynamic geometry software is a good bet for introducing to algorithmic reasoning, not least because it kills other birds with a single stone (in terms of culture).


ell said...

I remember line numbers. They may have been necessary in case the sysop dropped the stack of punchcards and didn't get them back in the right order.

Yes, I wrote my own programs and keypunched the cards back in the 1960s for an IBM 360. I collected them in a rubber band and put them in the queue for the sysop to run, then picked up my printout the next day.

David Brin said...

François, many thanks for those kind words. You are a valued friend here.

Naum said "But when ALL computers had basic, only the few affluent kids had access."

False memory. There was a solid decade during which home computers were common enough, even down to the slightly-lower middle class, that textbook makers included TRY IT IN BASIC in nearly all math & science textbooks.

Boris, you simply miss the point. I don't care which language it is and you may argue for all of them! But ubiquity is key to getting textbooks and teachers assigning programming tasks again.

Ian, you describe yet another Zoo Hypothesis, and the variant "Let's let them develop their own unique culture" is certainly a worthy one that I even give some time in EXISTENCE. But it still implies either fierce LAW out there, or else very limited numbers of other races and almost no colonization.

The Simulation and diving into inner space models also have proponents. Why go out when it is cheaper and more pleasant to go in? Still, wise super races won't ignore the outer universe. They will deputize sub-selves, machines or lessers to assertively head out, if only to protect the inner, fully-linked world(s). Servants or robots bred not to want inner life. Or else resurrect frozen folks and assign them outward adventures (they'll say yay) without letting them know that paradise awaits within.

Ian, Plato wasn't the only one to say "our senses cannot ever make sense of Objective Reality and trying is futile."

Think about it. It is the same boring truism conveyed also by Buddha, Lao Tzu, Jesus, and so on. All said "beware illusions." Then they added "Give up that path! Instead, seek inner wisdom via..."

There the prescriptions varied: logic, detachment, ritual, faith... but the core message was the same... till Galileo.

David Brin said...


Let us imagine an encounter between two of history’s greatest minds, each defending his own view of reality.

Plato to Galileo --

“Our senses are defective, therefore we cannot discover truth through experience. That chair, for instance. Despite all your gritty ‘experiments’ you will never determine what it is. Not perfectly.
“Therefore give up! Empiricism is useless. Seek the essence of truth through pure reason.”

Galileo to Plato --

“You’re right. My eyesight is poor. My touch is flawed. I will never know with utter perfection what this chair is.
“Nevertheless, I can carve away untruths and wrong theories. I can demolish fancy ‘essences’ and epicycles, and disprove self-hypnotizing incantations.
“With good experiments -- and the helpful criticism of my peers -- I can find out what the chair is not.”

Boris Borcic said...

David, Dr Brin, my perception is you've been talking over my head. I'll answer nevertheless, hoping it's not wasted.

Ubiquity is no problem, unless you mean that kids should get a taste of coding through the touchscreen of their smartphone.
There are scores of free programming environments you can install on any computer in just a few minutes. Many are also available through the web, like Quite BASIC. The same goes for dynamic geometry software.

What you call for, imo, is something else, akin to the situation some 30 years ago, when teachers needed no brains to decide to teach BASIC because there was no other choice (and if they had another choice, the name BASIC reassured them, because they themselves on average knew next to nothing about programming).

What I am saying is that

1) the above situation will not recur (except for the last part, ignorant teachers)

2) that it is preferable that this will not recur if that means returning to anything similar to the BASIC of 30 years ago.

3) that special cultural relevance may achieve what ubiquity doesn't: lead to a consensual choice of frame to give kids a taste of algorithmics.

4) that Euclid's geometry (as embodied in dynamic geometry software) can provide such relevance and frame.

Just to expand a little on the latter points: a salient feature of Euclid's geometry is that it is even defensible as part of a curriculum in the humanities, given how it belongs to the background of so many luminaries throughout history.

Rob said...

Boris, I do relatively complex numerical analysis and 3D graphics displays, using Visual Basic 9 and 10. That's not IDE twiddling and code archaeology.

I reiterate with David: the *actual language* is not really the point. BASIC is useful because it lacks some low-level semantic travesties, like the idiotic semicolons that all C-based compilers require. Having to spend time hunting for them gets in the way of good math teaching.

We need to be discussing this at a higher level than "which language". Javascript is ubiquitous and fine for that, over my inconsequential objections, and with the release of Windows 8 will achieve a ubiquity not seen since the early 90's.

Ian said...

David, I'd reverse your argument about the few adventurous souls who venture out into the real universe.

I'm sure that happens.

What I'm saying is that there will also be those preferring to explore via simulations - and that even the adventurous souls are likely to run simulations before they proceed to make contact.

Think about it: you can spend decades to reach another planet, where you'll be confined to artificial habitats by your physical needs, OR you can explore hundreds of worlds via 99.9999% accurate simulations, tweaked just enough to let you participate far more fully.

Or, since you mention subselves, just as easily you could both go exploring in ur-space AND explore the simulations - in fact, how better to prepare for your arrival on Earth?

Think about it: there is, we can be pretty sure, one and only one physical Earth. Apart from anything else, that means all the different alien races have to share it - have to maintain a common set of policies for dealing with or not dealing with the indigenes.

So what would be more logical than, again, gaming out the different scenarios arising from different contact strategies - and running a baseline "No Contact" simulation as a reference point?

If humans did something like that, we probably wouldn't run A reference simulation, we'd recognize the random nature of events and the uncertainties in the data and use some sort of Monte Carlo technique, running thousands or millions of simulations with slightly different starting values for the variables and different random perturbations thrown in.
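Ian's Monte Carlo point can be sketched in a few lines of Python (with an entirely made-up toy model, just to show the technique): run the same noisy simulation many times with different random perturbations, then report the spread of outcomes rather than a single answer.

```python
import random

def simulate(seed, years=100):
    """One toy 'Business As Usual' run: noisy growth (made-up model)."""
    rng = random.Random(seed)
    level = 1.0
    for _ in range(years):
        level *= 1.0 + rng.gauss(0.01, 0.05)  # ~1% mean growth, randomly perturbed
    return level

# Monte Carlo: thousands of runs, each with a different random seed,
# summarized as a distribution instead of one prediction.
runs = sorted(simulate(seed) for seed in range(2000))
low, median, high = runs[100], runs[1000], runs[1899]
print(f"5th pct {low:.2f}  median {median:.2f}  95th pct {high:.2f}")
```

Swap the toy growth model for any scenario model you like; the "many perturbed runs, then look at the spread" structure is the whole technique.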

I'm not going to go Ron Hubbard here and start a new religion, but I think the Zoo Simulation theory is pretty compelling when you think about it.

It also has some disturbing consequences. I mentioned Mars and joked about an expansion pack. Well, consider this: as we continue to explore the universe, we're increasing the computational resources required to keep the simulation running. So either the people running the simulation have to devote more resources to it, or they have to slow down the clock speed.

At some point will we make ourselves too expensive to maintain?

Now imagine that all the other alien races are having this same discussion.

Boris Borcic said...

Rob, programming language design has decades behind it, and the fact that the specific choice of language is beside the point does not excuse illustrating that point with a recreation of the ugliest, most insane cheap kludge that ever passed for a programming language. That choice of illustration denies the breadth of culture implied in saying that the choice of language is beside the point.

Again, I was not talking about the n+1th version of VB, and I was talking as someone who's been employed for years writing code with products similar to eg Quite BASIC and worse.

For the rest, I can't quite make sense of your discussing specific languages while in the same breath denying it's relevant.

Rob said...

Boris, in a conversation like the one David started with the original question, my focus is not on imparting the culture of software developers. I have no interest in such a self-referential and schizophrenic thing and I won't agree that it's a good idea at all to expose children to it.

Instead, it's on imparting algorithmic skills to young people as part of their maths and sciences education.

Javascript is a C-based language whose ubiquity self-selects it. (I don't like them because I don't like C). From that point, the question of "which language to use" ought to be moot: it's Javascript. It remains only to write the textbook/website which contains the maths a teacher wants to guide students through. Or a student on his own.

Also, please. Gramma already knows how to suck eggs.

Naum said...

False memory. There was a solid decade during which home computers were common enough, even down to the slightly-lower middle class, that textbook makers included TRY IT IN BASIC in nearly all math & science textbooks.

Not true at all -- just look at home computer ownership figures. In 1990, only 15% of homes had PCs, and that figure was barely over a third by the end of the 1990s.

David, with all due respect, your affluence and nostalgia blind you in this regard. And while I speak anecdotally, of course, I was one of those kids who begged my parents to buy a PC but had to wait until after college before I could afford a "home" computer. I noted those "try it in BASIC" exercises in textbooks, but those passages may as well have been Koine Greek for all the practical good they did me.

Some will chime in that schools had computers -- yes, schools, especially during the 90s, came into the computer age, but I suspect only a fraction of users on a "community" PC ever tinkered enough to gain the acumen to code up some algorithms.

Contrast that with today's golden age of programming in a post-internet world, with cheap and plentiful hardware (only a few hundred dollars needed for a capable machine -- compare that against 80s/90s machines that cost 4-6K or more in 2012 dollars), and where tutelage and resources are a click away, with the most experienced minds willing to share beginner/intermediate/expert level advice and instruction.

Naum said...

But ubiquity is key to getting textbooks and teachers assigning programming tasks again.

That ubiquity exists -- it is Javascript. Any modern browser engine (Chrome, Safari, Firefox, and even the latest MSIE offerings) includes a code console where you can type code in and run it with the click of a button. And you can paint pixels, play music, roll dice, etc.… just like you could on those old dinosaur pre-Windows DOS boxes that shipped with BASIC.

Rob said...


I grew up in the 80's in the lower middle class. We had a Coleco Adam and I tinkered mercilessly with it, hardware and software both.

But I can also remember in the 9th grade (early 80's) doing a class activity where each student wrote an idea on some paper and passed it around to get responses. Mine was, "Every desktop in the school should have a computer." Every response then was to call the idea more stupid than ingesting arsenic-laced rocks.

So, no, not many people had Apples or Commodores, but the ones who did had them regardless of economic class.

David Brin said...

Ian. I do not buy into the near universal tendency for bright guys to glom onto ONE Fermi explanation and proclaim "This has to be it!!!" Hawking, Michio Kaku, Paul Davies all do this. But this is the only scientific topic without any known subject matter. Proclamations do not make something so.

Yes, an almost empty universe with just one sapient race is much easier to model. But so would be an Earth with a million people instead of 8 billion. At some point you can only shrug at the "it's a simulation" model and know it as a sophistry... even if it turns out to be true.

Frankly, best theory? We exist in a cheap holodeck in 2043, on quarters that had been dropped in by a ninety-year-old George W. Bush, who wanted to live his chain of fantasy dream jobs. Fighter pilot! Oilman! Baseball team owner! Guv uf Texas! Pres-e-dent!... and he got fatigued before he could appoint himself astronaut. How likely is it that we'd ACTUALLY be stupid enough for Culture War?

David Brin said...

BORIS, you persist in missing the point, and I despair of you ever pausing in your shouts to show enough curiosity to understand it. There are NOT easy ways for today's teens to experience an easy introduction to programming. There are not. I will repeat that. There are not. There are not. There are not.

A teen who wants to program today must overcome endless energy, time, and learning-curve barriers. Perhaps 1% will be motivated. But 20 years ago, ALL bright teens had PCs with a BASIC language aboard so ubiquitous that MOST high schools and jr. highs used texts with simple programs. Half of all American kids did regular homework assignments that involved charting data or moving pixels. THAT DOES NOT HAPPEN NOW.

You can wave your arms and lecture us about your favorite and most hated languages... I got far more hate mail for my BASIC article than I ever did for attacking Star Wars! But you (and all those letter writers) persistently miss the point and refuse to even consider your obligation to suggest an alternative.

Naum I am sorry you were deprived as a kid. But students DID go into the school's computer lab, to do those simple, 12-line exercises. And poor or not, they learned what makes the pixel move.

Rob said...

I tell you three times. David is right. Schools around where I live think of the AP class for Java as the entry point. Everywhere else they're using game scripting packages. Or ignoring the subject altogether as a subject in school. Your first offering comes in the 11th grade, after you've already been beaten into boredom on the subject.

The software industry is (I tell you three times!) too schizo to address this need. It's going to have to come grassroots. Maybe the Khan Academy guys could try some things, I dunno.

Paul451 said...

Re: Paying for Lego Mindstorm.

I think part of the problem is that you're treating owning a computer today like owning a computer when we were kids. Owning a computer back then was a special purchase, more like buying a musical instrument because your kid showed interest, or because you wanted them to. When your parents forked out several hundred dollars to buy that primitive machine, the one with BASIC, it was the same as you forking out $300 for Mindstorm or some programmable kit. The big difference is, such special purchases are no longer necessary, no longer the only option. As long as you have a browser...

Re: Programming on smartphones

This is a unique case. The carriers specifically forbid vendors from doing this. I've seen hardware makers get weird about "modding", but this one is just carriers. So you will never get an in-built user-accessible programming language on smartphones, unless it is somehow mandated by the EU. Still, as long as they have a browser...

"The software industry is (I tell you three times!) too schizo to address this need. It's going to have to come grassroots. Maybe the Khan Academy guys could try some things,"

Already done via "w3schools" for the whole web standard, or jump straight into JS. A text editor and a web browser: on every computer in the world. Ubiquitous, even. Useful also, since it's the scripting language of the web.

Paul451 said...

Apparently Naum's comment needs to be repeated (three times): In 1990, only 15% of homes had computers. More importantly, of children in the 8th grade or earlier, just 3% had access to a computer! Reaching just 9% for high-school graduates. Even by 1997, deep into the first boom, it was still just a third of homes, 10% of 8th graders, 20% of HS grads.

BASIC stopped being ubiquitous long before computers became so.

(And of that 3% in 1990, how many kids did more than play games? I'd be willing to bet any of you crazy uncles that more kids today know at least some JS than ever knew BASIC in my day. [Heh. I tried googling it, to see if someone's actually done that specific study, but of course "children()" appears in JS itself.] )

I think there's an old psychological illusion at work here, "what I experienced, everyone experienced". In the early '80s I owned a "Dick Smith Wizzard" (Australian rebadged Creativision) with a BASIC cartridge. Three kids in my immediate peer-group also owned exactly the same... ahem, "computer". And two of three were interested in programming; at least BASIC anyway. One even had the tape-recorder option to save programs. Therefore it must have been common as mud? Right? By the late '80s, when I had a PC, so did about half of my peers. So half of all Aussie kids had PCs! Well, no.

Likewise, I think BASIC feels somehow important simply because it was ubiquitous, it was how we all got started, because there was nothing else. But now the choice is overwhelming, paralysing. "Argh! Make it like it was when I was young!"

Paul451 said...

"There are NOT easy ways for today's teens to experience an easy introduction to programming. There are not. I will repeat that. There are not. There are not. There are not."

[sigh] Open your browser. Bring up google or your default search engine. Type "learn [language]" or "teach kids to program [language]" whichever language you like.

Beginners, intro, how to, guides, idiots, even "eloquent"... for page after page after page.

Too hard? With all that using google and having to choose from hundreds of options? Then just go into your address bar and type "www.learn[language].org/net/com" and there's a 90% chance you'll hit something useful, which will often include an in-browser interpreter (type in the code-window, click "run", see your output in the output-window). No downloads, no cost.

Kids have more options to learn programming, more free tutorial books and sites, more free libraries of code, and more communities and support, than you and I could have dreamed. We were starved, eating scraps, they can gorge. Any kid with a tenth of the patience/perseverance/curiosity needed to actually learn to code, has ten times the patience/perseverance/curiosity needed to be able to find a place to start.

Paul451 said...

Reading that back, it sounds kinda bitchy. Wasn't intended. Apologies. In my defence, I'm "drunk" tired.

Unknown said...

I'm a big fan of what they're doing at their site, which is where I've been pointing people who want to learn how to code. It's also worth noting that almost all non-PC machines (Mac, *nix) now come with Ruby installed, which is certainly simple enough for a kid to pick up while also allowing for things like object-oriented code.

Tony Fisk said...

@Paul451 ... Another hard day in the peloton?
(For the US readers, the TDF occurs from the middle of the night to the wee small hours in Oz)

My daughter (then aged 8-9) had much fun playing with Logo and e-toys on my/her OLPC*. She managed to plug together simple code chains to drive a little car icon to a target, although she needed some help driving it around a course.

*Depending on who you ask!

I think one of the problems causing the difference of opinion in this discussion is nailing down the nature of the algorithms in question. How low-level do you wish to go?

Interestingly, modern computers do make it virtually (ahem!) impossible for a casual user to access the silicon directly. Apart from the scope for bringing an already delicately balanced OS crashing into the BSOD (or black hole, or whatever it is these days), even the 'machine code' is virtual, sitting on top of a microcode layer that translates the nominal instructions for the chip that's actually under the hood.

Even so, there's a lot to be learnt from the lower layers; like how a subroutine call actually works. While I don't think I've seen a line of assembler for over ten years, it does sometimes come in useful to have an internal model for what's going on!

As for pixels, I remember the ZX Spectrum manual describing how the 6K allocated RAM was laid out to match the interlacing scan pattern on the monitor, and how you could poke values in direct and see the bit pattern. I don't think they do it that way these days. Co-processors are doing a lot of the donkey work.

Even more fun was tweaking the timed interrupt vector(s) so that you could invoke a special routine that could do anything from check a timer to manage multiple tasks.

Rob said...

Paul, I forgive you completely, but yeah, it came across as exasperated, just a bit. That site fails because the very first thing it says is "You must already understand HTML and CSS." Those are complex programming topics themselves, and we're talking about people whose attention spans haven't fully developed yet!

It's the sheer volume of stuff out there with no filters for quality! You can't just tell me that Google is my friend because in this instance, it's just not.

The Code Academy stuff fails on the "you must register!" requirement. If you're under 13, you just can't do that on your own in the U.S. But it's much, much closer. If they created lessons specific to basic algebra and geometry maths topics there would be something accessible for teachers to use.

Shep said...

"But when ALL computers had basic, only the few affluent kids had access."

As a member of the Old School (high school in the early '70s), I have to disagree with that statement. This is not to say we all had access to cool new Microcomputers (as they were called then), far from it. In fact, back then I had a buddy from a wealthy family who proudly showed me the latest Altair 8800 (or whatever). It had a lovely front panel which allowed him to click a series of binary switches and press a Step button to insert the Direct Machine Code into the thing. This was the only way to program the beast. Having the ability to use something as nice as BASIC was a pipe dream for a home user. But I digress...

What we did have access to was a Computer Lab of sorts there at the high school. This was simply a TTY terminal (yes, that means there was no CRT screen! All output was printed on a roll of brown paper like you get in restrooms to wipe your hands with) hooked up to the main computer down at the local school board office. Everybody who wanted it could sign up for time on this high tech device and run programs like Star Trek (you had to be there) or even program in your own stuff in BASIC. Note this was an inner city school without anything like plentiful resources or a large budget. During this era, nobody was rich enough to afford something to program on at home, but we all could take advantage of that Computer Lab.

This brings up another point. I see a lot of discussion above about the pros and cons of the various other languages out there in contrast to BASIC. Let me state for the record that BASIC is/was horrendously inefficient, unstructured and truly inferior to almost anything you care to name which has come along since. It was not ubiquitous because it was good (or even acceptable) as a programming language. It was ubiquitous because it was the only game in town, at least in that era.

So here's a question: How do I know it was so bad? Well, I have used other programming languages since and seen how much better they are. But ironically I would never have graduated to the more elegant, superior languages if I hadn't cut my programming teeth on BASIC. I now realize that having BASIC as my primary computer language was a blessing, not a curse. The main reason for this was that BASIC was incredibly easy to learn and, well, basic. It was full of common sense words like Print and Goto and If-Then. All the efficiency arguments aside, this made sense to me as a neophyte. I'm sure if I had been forced to learn with Java or some other more modern thing, I'd never have put in the effort.

So what this all boils down to is a sad realization that Boris is right - at least in that we will never again see a situation where all computerdom has standardized on a simple, beginner-accessible language. The scorn Boris heaps on BASIC is not his alone. It's a dead language as far as many (if not most) are concerned. Other candidates are neither ubiquitous nor set up for complete beginners.

Sorry David, it's not in the cards.

David Brin said...

Paul, are you proud to nurse the illusion you can see what we can't? While it is YOU who is blind to the obvious? If there were an easy way to do it, teachers today and textbooks would still assign simple TRY IT IN BASIC program problems. Many many gen-x-ers remember doing them, even if they never after did a single programming step. And it CANNOT HAPPEN NOW. Period. It would be better if it could happen now? Yes. Boom. Gotcha.

Worldcon Chicago? Anyone coming? Currently at Comicon.

Rob said...

A dead language. Really. Really?

Y'all are still. missing. the point. The point is *not* "teach beginning programming". The point is that computers have ceased to be a pedagogical tool *for maths*.

By the mid-80's, most every US middle and high school had a computer lab. That's ubiquitous enough for that purpose.

ell said...

I first learned FORTRAN 4. Then I worked for a company that thought all its employees, including the secretaries, couriers, and accountants, should know how to program. So then I learned BASIC, then Pascal. And then a dozen more. Employees of the company provided their old desktop computers in a conference room.

ell said...

There's nothing like programming for yourself to appreciate someone else's programming.

Some people explore new territory. Some people use maps the explorers created to get there. And some people take the bus to get there.

BCRion said...

Arguing which language is "best" is like arguing religion. We may as well discuss other theological questions about angels dancing on heads of pins while we're at it. It's possible to write good code in any well formed language, and it's also quite easy to write horrible code in one too.

Of course, all of this is irrelevant to the current discussion here. The point is to provide an easy and sustainable (going to be around quite a while) means for kids to learn the fundamental concept that all of our gadgets and their apps are not magical devices but are driven by lines of code written by normal humans, and, yes, they too can do it. The notion that you can connect all the glitz of flashy apps to simple lines of code designed to solve math problems really is a profound, yet simple idea once you realize it. It brings context as to why the math you're learning about in school is really important.

With an increasing percentage of the global economy going the way of digital, and no signs of that trend slowing or reversing, it's imperative to introduce these ideas early. The best way is not to just tell them about it, but to have them do it themselves. Our economy depends on it.

Jonathan S. said...

Kids today also won't know the fun I had down at the K-Mart, where their electronics section (formerly the music section) had C-64s on display. I'd program them all to display, centered on the screen, the message, "I am broken." The next instruction was, if I recall the numbers correctly, POKE 64,0 - which disabled the keyboard. Easy enough to fix - just reboot the machine - but it was entertaining to see the "experts" there who couldn't figure out how to fix what that doggone teenager over there had done... :)

rewinn said...

Hey @David Brin - I saw you briefly at Elliot Bay books in Seattle, but didn't wanna bug you by insisting on chatting, since you looked a little "Strung Out From The Road"; I hope the book tour was fun and thanks for the signature! My lovely wife Kris and I enjoyed your remarks, and maybe next time you're in town we can buy you a beer.

Tim H. said...

A couple of years ago Jerry Pournelle suggested that BASIC's low speed shouldn't matter at the speeds new machines run, though I think OGH's point is not so much a renaissance in BASIC as finding more of the young people who can code, who will quickly move on to other languages, and removing some of the mystery for the rest. Another possibility is to run an emulator in a virtual machine, where it can't hurt the host, such as PC Xformer, which emulated an Atari 800 on an 80486. I wouldn't be surprised if there was a virtual 8086 that could run DOS 2 and GW-BASIC in a sandbox where it wouldn't disturb anything.

Tony Fisk said...

My trick was to set up a simple graphics display running on a 'speccy' plus a message inviting someone to do the same on the neighbouring C64.

Anyway, in order to work out which bit of this elephant everyone's holding, try listing a handful of basic things you'd like to teach Johnny to do.

My list:
1. get a 'hello world' response
2. store and retrieve data: name = 'David'
3. use that data: print 'hello ' + name
4. peek and poke data
5. modify a pixel
6. make a decision: if name == 'David'...
7. store and retrieve a command

I'm sure there are variants to this, but no matter. The next step is to see how readily you can perform these tasks on a modern computer system. (I think the sticking points will be 4 & 5)
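For what it's worth, tasks 1, 2, 3, 6 and 7 from that list can be tried today by pasting plain JavaScript into any browser's console. A minimal sketch (the names are purely illustrative; tasks 4 and 5 would need a canvas element, so they're left out here):

```javascript
// 1. get a 'hello world' response
console.log('hello world');

// 2. store and retrieve data
var name = 'David';

// 3. use that data
console.log('hello ' + name);

// 6. make a decision
if (name === 'David') {
  console.log('welcome back');
}

// 7. store and retrieve a command (a function held in a variable)
var greet = function (who) { return 'hello ' + who; };
console.log(greet('Johnny'));
```

Whether twelve such lines are as discoverable to a teen as typing RUN at a BASIC prompt is, of course, the whole argument.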

DVGill said...

The kids are coding. They're not doing it in school, but they code at home as part of what Henry Jenkins calls participatory culture around game modding and other computer topics. It's fascinating because it's an independent and different type of learning than what goes on in schools.

I think David's blog mentioned this project some time ago. Okay, it's scripting, but still a gateway for some kids to get into CS. The forums and community that will grow around this are what Jenkins is talking about.

One important question is what do you want kids to learn from programming? Seymour Papert made sweeping claims for the cognitive benefits of LOGO that were later not provable for most kids.

Today, one big reason that programming in schools (public, USA) is difficult to do is because the assessment requirements that teachers are working under favor short term outcomes that can be easily shown on a standardized test. What is needed for doing anything meaningful in programming is more of a project based or portfolio assessment. We know how to do this, but the system doesn't make implementation easy.

David, I enjoy this blog, and am currently enjoying Existence.

Tom Craver said...

"ubiquity is key to getting textbooks and teachers assigning programming tasks again"

There are currently simply too many options for PCs, so no one option will be sufficiently accepted to convince textbook makers or teachers to universally target one language.

However, if one of the ebook formats were to include a programming language interface, the ability to include executable programs in texts, and the ability for the reader to modify those programs, each textbook would INCLUDE the interpreter/compiler needed to run its own examples.

If other ebook formats decided to add the same feature, they'd want to include that same language, so that existing texts could port automatically and correctly.

But it is critical to design it right, because you're going to be saddled with it forever, just as different nations have text that reads right to left or top to bottom.

David Brin said...

Dang, Rewinn, you shoulda said hi! Sociotard and Stefan did, in Portland! I love you guys!

DVGill you are welcome here! Clearly a worthy addition to the brainy blogmunity...

Good point TomC -- still, were MSoft and Apple and Red Hat to agree on three languages to embed and make available on all platforms, so text publishers could reliably offer problem sets and guides...

Tom Craver said...

IEEE is a natural to drive the standard as an extension of some open ebook standard. Engineering and math texts could use built-in software that students can examine and modify.

A hardware company like Intel would be pretty natural to sponsor it - they have a big education focus, and like to promote engineering.

Naum said...

You all make a case on the basis of anecdotal impressions and nostalgia for a golden age of your own.

But I posted real data that shows only a fraction of children had access to home computers. And it is late or I would explore stats on schools and computers, but even if there was a preponderance of DOS boxes in schools, a "community" PC evokes a much different dynamic than a home PC where one has ample time to tinker and play.

By the time Windows 95 was released (1995?), Basic was no longer "ubiquitous" -- it might have been present until ripped out with Windows 2000, but doubtful that many went spelunking beneath the GUI desktop to explore any of the command line tools, especially new users coming of age, more interested in running desktop "applications" than the old school line oriented BASIC programs.

All that aside, the title of the post is "Will Johnny Code Again" and it is a sequel to a post titled "Why Johnny Can't Code" -- which is pure bollocks if one examines the empirical data on how many are programming: the number of affluent (or fortunate) hobbyists who relish the days of typing in BASIC programs is dwarfed exponentially by the number of kids programming today (or at any point past the turn of the 21st century). More tools, more resources, more help than ever existed before are available at the click of a mouse.

Everything in that numbered list @Tony Fisk enumerated is available in any modern browser (Shift+Cmd+J, or the equivalent on Windows machines), including peek/poke: all that is needed is a canvas tag, and then drawing commands can be issued.

And advocating some obscure one-off site that replicates that BASIC of yesteryear is as pragmatic as running an Amiga emulator. Novel, yes, but pragmatic it is not.

As a software professional, and one who has also trained up many an aspiring software developer, I think this post stream misses the mark. I have great respect and love for your work, David, but you are "out of your element" in this realm.

And what of the absurdity of tying algorithm experimentation to textbook publishers, who freshen up their offerings nearly on an annual basis?

On mobile devices, there is a problem of a different sort, in that these are locked-down sandboxes, on purpose. There are still plentiful code-and-execute applications, but devices are becoming ever more sandboxed (indeed, even Apple is moving in this direction for desktop OS X versions), which threatens to make these "easy to fire up" programming environments extinct, leaving development to a trained professional caste willing to spend money on licensing and time learning the proprietary development environment tools.

David Brin said...

Naum is screaming about oranges while the topic was apples. Nothing he says is untrue... it just has almost no bearing whatsoever on the topic that's at hand.

A modern problem, alas.

David Brin said...

BTW... I systematically demolished TomC's political logic in holding to the "FDR was Satan" religion of the right. See the bottom of the comments section before this one.

Alas, I do not have time to go back there. Crushed by comicon, book touring and teenagers!

But good luck to us all. Order your blue civil war union kepi hat.

Rob said...

Let's be clear.

The United States had an imperialistic expansionist policy right up to WWII, including occupation of Central American countries (Mexico! Panama, which the U.S. created via a coup against Colombia, and where we didn't return the Canal Zone until the 1970's!) all the way through the 1920's. And the Philippines wasn't "liberated" until the 1940's!

There was nothing two-bit about Roosevelt's participation in Great Powers diplomacy, the thing that led us all inexorably into two world wars.

I'm about 70 pages into (a graciously signed copy of) Existence. It's beginning to feel like a magnum opus...

Ian Gould said...

Of more importance to us as a species isn't whether "Johnny" can code, but rather whether Fatima, Sanjay, Kwame and Consuela can code.

To return to one of my OTHER obsessions for a moment: one billion people, mostly in the developed world, have access to home computers; another four billion have access to mobile phones of some description.

If you want to promote programming skills and teach kids math etc., you need to realize that the kids who need that assistance most are likely to be living in Mumbai or Lagos or Chengdu or Montevideo, and their sole computing device is going to be a phone. If they're lucky, it'll be a low-end Google smart phone rather than a Symbian model.

So, the logical way to give them access is likely by way of an emulator running on a webserver.

Maybe you can set it up so they can send programming instructions via SMS and have the programs do stuff like play tunes of their own creation on their phone.

Tim H. said...

If memory serves, BASIC (Beginners All-purpose Symbolic Instruction Code) was intended as a teaching device, not a production language, and shouldn't be judged as one; you might as well throw away Legos because you can't build a real house with them. Back in the day, it enticed many into programming, and some of those moved on to other languages. For example, Jeff Minter once wrote simple games in BASIC; I don't think he wrote T2K or Space Giraffe in BASIC, not to mention several iOS games.

Tony Fisk said...

I can trivially report that I can readily achieve tasks 1, 2, 3, 6 & 7 (by defining a subroutine) using the standard Python editor environment ('Idle'). Task #5 (pixel manipulation) could be done, but would require a specialist library to be loaded (not quite a 'first task'!). Task #4 (peek and poke) is nowadays enveloped by several layers of operating system kernel.

Python is available by default on Ubuntu Linux, and can be readily obtained for most systems.

Tony Fisk said...

PS for Ian's peace of mind wrt Fatima & co., Python also ships in the XO Laptop.

(and of course there are several other languages that could do just as well)

Boris Borcic said...

@David Brin, you write : "(...) you persistently miss the point and refuse to even consider your obligation to suggest an alternative."

This only goes to prove that you didn't really read what I wrote. I certainly upheld my obligation to suggest an alternative, and not just once, but in each of my first three comments.

The third time, I marked it in boldface in the hope of drawing your attention to it. Your reaction to the boldface (observing its presence but not what it emphasized) reminds me of the proverb about the wise man who points at the moon, only to have attention drawn to his finger.

@Tim H. In the late '70s there were entire lines of computers intended for business with BASIC hardwired, from Wang and HP. Also, I find the comparison with Lego bricks totally misleading. Lego bricks are simple, modular and elegant. The only one of these virtues that the original BASIC could claim is simplicity - provided you chose the right metrics and don't compare it with, say, Forth or early Lisp.

Tony Fisk said...

... a quick scan of David's Wikipedia entry. I don't see obvious inaccuracies, but there are a couple of omissions (it mentions his grandfather, but not his father: a journalist, I believe? It also has yet to include Existence in the bibliography).

Tony Fisk said...

@Boris Since I'm around (but about to call it a night), I'll bite: what has Euclidean geometry got to do with ba... um, elementary coding?

David Brin said...

Boris, I read your bold and all caps and they simply reinforced your stubborn insistence that "I GET IT!" when you do not.

I will repeat (with a sigh). If you can persuade MSoft and Apple and Linux to offer a FAST TURNKEY introductory pedagogical language that is just a click away and UNIFORM AND UBIQUITOUS and requires no downloads or instruction...

...then I don't care which language the educational mavens choose and I am not wedded to BASIC. When that happens we will have (as we had in the 1980s into the 1990s) a ubiquitous accessibility...

... that will then let teachers and text makers go back to assigning TRY IT problems to millions of kids....

...some of whom will continue and millions will not. But they will understand what makes a pixel.

Not one of your remarks was aimed at this central and key problem and the central and obvious solution. When someone says "you miss the point" a mature person responds with CURIOSITY.

not with pricklish pride and denunciations.

Paul451 said...

Re: Why Johnny can't code.

I give up. I admit it, I don't understand at all.

You tell us the language doesn't matter, just "ubiquity", so I offer you Javascript on every single web-able computer in the world. All it needs is a text editor and a browser. But no, everyone says, kids can't learn JS, it's a terrible language. You want free, I offer you pages of free tutorial sites, free online versions of textbooks aimed at everything from beginners to experts, sites with languages invented solely for kids. But no, each one has some magic flaw that renders it incomplete (we copied lines of BASIC out of magazines printed once a month! How can a thousand free websites not be enough?) Naum offers you millions of kids coding in various forms, through game modding, scripting and so on. I even offered to crazy-uncle bet that the "problem" doesn't really exist. But no, that's "missing the point".

Apple, Microsoft, Mozilla, W3C, the whole internet, did agree on a universal language, Javascript. So why don't maths textbooks have JS examples? Ask the authors! Every computer-owning kid is on the web, that's why their parents bought them a computer, so if maths textbook authors don't like JS, they can create any hosted pseudo-language/teaching-language/maths-language they want on a website, it doesn't have to be on the kid's computer! Why don't they? Ask them! Whatever the answer is, it has nothing to do with a lack of ubiquity, a lack of standards, or a lack of access.

So I give up. I don't know what you want. Take pity on me, explain it to me as you might a particularly dumb child.

David Brin said...

Paul... please. Close your eyes and picture a math textbook saying "go to your puter and do the ONE STEP needed to get the Javascript screen up with the simple entry that will let you type in just TWELVE LINES and clearly see that the algorithm we discussed in the previous chapter will draw the arc of a cannonball rising and falling according to Galileo's formula."

And the experience is uniform across all platforms and NOTHING HAS TO BE DOWNLOADED.

You try it. Mac and PC. And recall every single step that you take... that seems trivial to you... will lose you half the teenagers, then half of those that remain.

Please publish here your step by step process and the 12 lines of code that will display the cannonball's arc. I would love to try it, cross platform, on PC, Apple & Linux.
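For reference, a rough sketch of such a program in plain JavaScript, pasteable into any browser's console. It prints a text-mode arc, with time running down the page and height running across, rather than true pixels; the constants here are illustrative, not taken from any textbook:

```javascript
// Galileo: height y = v*t - (g/2)*t^2
var v = 25;                                     // initial upward speed, m/s (illustrative)
var g = 9.8;                                    // gravitational acceleration, m/s^2
var lines = [];
for (var t = 0; t <= 2 * v / g; t += 0.25) {    // flight lasts 2v/g seconds
  var y = Math.round(v * t - 0.5 * g * t * t);  // height at time t
  lines.push(new Array(y + 1).join(' ') + '*'); // y spaces, then a star
}
console.log(lines.join('\n'));                  // the cannonball rises and falls
```

Whether "open the console, paste, press Enter" counts as ONE STEP across Mac, PC and Linux is exactly the point in dispute.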

Boris Borcic said...

@David "Boris, I read your bold and all caps and they simply reinforced your stubborn insistence that "I GET IT!" when you do not."

Dr Brin, this is clearly an absurd statement, even taken in isolation. And I never used all caps (outside acronyms).

"If you can persuade MSoft and Apple and Linux to offer a FAST TURNKEY introductory "

Now that's a quite outdated requirement in the age of web applications and cloud computing. All you really need is a web app (as is the Quite BASIC page you linked).

As Rob and Paul451 noted, this makes javascript the most natural candidate, and javascript in the browser is indeed the obvious contemporary analogue to the BASICs of old (including the existence of rabid critics).

"pedagogical language that is just a click away and UNIFORM AND UBIQUITOUS and requires no downloads or instruction..."

I succinctly explained that ubiquity was really not the problem, but that uniformity indeed was, why I thought so, and how to approach it. To put it in your terms: what mature expression of reasonable curiosity do you expect from people, beyond detailing the grounds on which they differ from you, so as to permit debate?

As for something one click away and intensely pedagogical, what's wrong (except the possibility of other choices) with Geogebra, in either a Java applet, Java Web Start, or a pure HTML5/JavaScript tablet-compatible version?

"I don't care which language the educational mavens choose"

But they won't choose any. They typically think that learning to program has become obsolete outside professional schools. But that's just the least part of the tragedy; the true tragedy is - like where I live - when they simultaneously drop all the other, more traditional topics that used to communicate the style of constructive abstract reasoning that is (not uniquely) characteristic of programming: Euclidean constructions, propositional logic, and devising proofs of theorem statements.

"they will understand what makes a pixel."

In some sense, ok. Now the problem isn't the choice of language, but of narrow pedagogical purpose.

Rob said...

Sure, but also, now, there are levels of abstraction in every CPU anyone can purchase that make hash of the idea that "POKE $d034, $fe" means anything analogous to a physical electrostatic state in a chip on the board. But there is Arduino, which can start a kid down a path towards understanding electronics.

If our goal is to demonstrate that x^2 + y^2 = 1 makes a circle, Geogebra won't help, because it's not a program that models algorithms.
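Rob's distinction can be made concrete with a short sketch (my own, not anything from Geogebra, and Python stands in for whichever teaching language one prefers): instead of handing an equation to a plotting package, an algorithm visits candidate pixels one by one and switches on those that satisfy x^2 + y^2 = 1, within a tolerance standing in for pixel size.

```python
# A minimal sketch of what "an algorithm makes the circle" means: scan a
# grid of candidate pixels and switch on the ones that approximately
# satisfy x^2 + y^2 = 1. All names and sizes here are invented.

def circle_pixels(size=21, radius=1.0, thickness=0.15):
    """Return rows of '#'/' ' characters approximating the unit circle."""
    rows = []
    for j in range(size):
        # Map grid coordinates onto the range [-1.5, 1.5] in both axes.
        y = 1.5 - 3.0 * j / (size - 1)
        row = ""
        for i in range(size):
            x = -1.5 + 3.0 * i / (size - 1)
            # The pixel is "on" when the equation holds within tolerance.
            row += "#" if abs(x * x + y * y - radius * radius) < thickness else " "
        rows.append(row)
    return rows

for line in circle_pixels():
    print(line)
```

The point is not the output but the loop: every character the student sees was put there by a test they wrote themselves.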

So, now, I'm reminded of "Phun", which is the physics software my son wanted in the 3rd grade. It's now called "Algodoo", and it simulates physics, but, fails David's test because after 15 hours or so, it costs $5. If you're a teacher, it costs $50.

Right now it's all schizo-new. Everyone everywhere doing his own thing. Most trying somehow to monetize it enough. And it's not helped by the stuttered, halting, and mostly inept efforts of Edu giants like Pearson, whose websites, frankly, just suck. For only $70/year or so.

But, Boris, please google (and then read!) "Lockhart's Lament". The problem with maths education, Why Johnny Really Can't Code, is far more fundamental: our schools make the maths so mind-bendingly boring and useless that no kid takes an interest who doesn't already have one. I currently have a senior in US high school (equivalent to a comprehensive school in Europe, I'm told) whose aptitude in maths has been high since she was three years old; surrounded by teachers who say it's Not Their Thing and who project that disinterest completely, she thought she hated it all the way until Grade 11.

Really. Lockhart's Lament. It'll make you cry.

David Brin said...

I give up trying to explain why it was a good thing for half a million teachers to be able to assign simple, twelve line programming assignments to 20 million students who had such universal access to the needed tools that textbook publishers could rely upon it and thus include samples of programming - a taste - in the MAIN textbooks used by those millions of kids.

Either you understand the meaning of that fact, and its importance, or you do not. Several of you, while dancing and cavorting around the issue, clearly do NOT grasp that core issue while claiming that you do.

Rob said...

David, I will continue to maintain that I understand your point, and acknowledge that there's nothing out there like what you want, any longer.

The rest of it is irritation at always being told that the computer language I use in my very professional efforts is somehow too stupid to continue ticking over on CPUs, along with a showcase of interesting (but only close!) tools I've seen that could mitigate *a portion* of your lament!

Anonymous said...

Any chance of a book signing appearance in the north east ?

Boris Borcic said...

Rob (and this answers Tony too), you write

"If our goal is to demonstrate that x^2 + y^2 = 1 makes a circle, Geogebra won't help because it's not a program that models algorithms."

On x^2 + y^2 = 1: I really don't see what you mean. In Geogebra you can type in exactly that equation and it will draw a circle; or, if you prefer, you can write y=sqrt(1-x^2) or y=asin(cos(x)) and it will draw the upper half-circle; or you can create/construct a circle with various point-and-click tools, and Geogebra will display it in the list of current objects next to that equation, appropriately shifted and rescaled for the given circle. What more can you expect?

On "algorithms": AFAIK an algorithm is a sequence of steps starting from inputs and constructing a corresponding output with stipulated properties. That's also exactly what a Euclidean construction is, and Geogebra, like all dynamic geometry software, beautifully demonstrates the constructive inputs-to-outputs relationship by letting you drag your input points over the plane and see your output change smoothly in real time.
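Boris's inputs-to-outputs framing can be illustrated with a minimal sketch (the function and its names are invented for illustration, not anything from Geogebra's API): a Euclidean construction written as a function of its input points, so that "dragging" an input simply means re-running the construction on new arguments.

```python
# A Euclidean construction as an algorithm: two input points go in, a
# stipulated output (their perpendicular bisector) comes out. This is a
# hedged illustration; the function name and representation are invented.

def perpendicular_bisector(a, b):
    """Given points a, b, construct (midpoint, direction) of their bisector."""
    (ax, ay), (bx, by) = a, b
    midpoint = ((ax + bx) / 2, (ay + by) / 2)
    # Rotating the segment's direction by 90 degrees gives the bisector's.
    direction = (-(by - ay), bx - ax)
    return midpoint, direction

# "Dragging" an input point just means re-running the construction:
print(perpendicular_bisector((0, 0), (2, 0)))   # ((1.0, 0.0), (0, 2))
print(perpendicular_bisector((0, 0), (2, 2)))
```

Dynamic geometry software does exactly this re-running, continuously, as the mouse moves.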

And if we take the matter of "ubiquity" to mean we also want it on tablets, isn't it a real boon that the construction can be expressed in terms of graphical tool actions rather than typed in?

BTW, Geogebra is available on OLPC XO too.

Boris Borcic said...

Rob - "Lockhart's Lament".

I haven't (yet) read it to the end, although I immediately recognize a diagnosis I made myself back in sophomore math class: it is possible, and presumably frequent, for a high-school math teacher to earn a bachelor's in math without ever touching the creative joy Lockhart is talking about, etc.

What intrigues me is how you deem the lament relevant. You don't quite state it as an objection to Geogebra, but in context you seem to be saying this: mathematical education is in such bad shape, and has such a disgusting reputation, that the purpose of teaching a taste of programming can only suffer from being conflated with math pedagogy, and should rather leave the latter to its own ills?

-- This relates in a roundabout way to the one issue I have with Geogebra development: it appears to be driven by the desires of scores and scores of high-school math teachers having fun (teaching) with it, which is a good thing vis-a-vis "Lockhart's Lament", since obviously their own fun is a prerequisite to their teaching the fun. One factor in this fun is that, coming after a dozen other dynamic geometry packages, Geogebra's development differs in philosophy by not pedantically limiting itself to the frame of Euclidean geometry.

But the problem with it is that it is thought of as a tool for math teachers. I would much rather see it develop into a picture-design tool that integrates symbolic math and Euclidean geometry as, e.g., a painter's means. This would, imo, essentially mean admitting a dual construction-and-rendering mode, together with extending the geometric/math treatment of shapes to the stipulation of arbitrary gradients.

Rob said...

In the right hands, Geogebra could be a tool for an Integrated Algebra curriculum. I plan to suggest a "have you seen this?" to a couple of school officials here.

Its approach doesn't resemble the narrower current definition of an "algorithm". It isn't an answer to Dr. Brin's concern. And what you think I'm saying is, put simply, not even close.

David Brin said...

If it magically paints a circle when you feed a formula, it does not teach the student HOW an algorithm paints successive pixels on a screen.

Look, all I ask is that each child be part of a small team that re-invents PONG. Maybe sixty lines of code. And when UR done? The landscape of the screen will never be the same.
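For a taste of what such an exercise involves, here is a hedged sketch, not Dr. Brin's sixty-line PONG itself, of its smallest ingredient: the game loop that moves a ball one tick at a time and reflects it off the walls. All names and field dimensions are invented for illustration.

```python
# The kernel of a PONG re-invention: a ball advanced tick by tick, with
# its velocity reversed (and position clamped, a deliberate shortcut)
# whenever it crosses a wall. Field size and names are made up.

def bounce(pos, vel, width=20, height=10, steps=1):
    """Advance a ball (pos, vel) through `steps` ticks, reflecting at walls."""
    x, y = pos
    vx, vy = vel
    for _ in range(steps):
        x, y = x + vx, y + vy
        if not 0 <= x <= width:       # hit a side wall: reverse horizontal motion
            vx = -vx
            x = max(0, min(x, width))
        if not 0 <= y <= height:      # hit top/bottom: reverse vertical motion
            vy = -vy
            y = max(0, min(y, height))
    return (x, y), (vx, vy)

pos, vel = (1, 1), (1, 1)
for _ in range(30):                   # thirty ticks of the game loop
    pos, vel = bounce(pos, vel)
print(pos, vel)
```

Add a paddle, a score, and a keypress, and the student has rebuilt the game; the epiphany is that nothing on the screen moves unless their loop moves it.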

Paul451 said...

"The rest of it is irritation at always being told that the computer language I use in my very professional efforts is somehow too stupid to continue ticking over on CPU's,"

Visual Basic? VB is not BASIC. It has very few similarities (which was kind of MS's point); it's no closer to BASIC than JS, Python, or any other modern object language. Which is why you and millions of people can write proper code with it.

[Just noticed that Visual Basic Express (cut down MS-Visual Studio) is available for apparently free download. With links to a free VB tutorial video series to take you from n00b to leet. So that's another option. But no, we're not allowed to download anything. In the 21st century. sigh]

Lorraine said...

I blame the GUI.

1. It encourages a more passive way of using technology.

2. In terms of code, it requires a lot of set-up. Pre-GUI-era languages other than BASIC had no graphics. Post-GUI languages (including post-GUI flavors of BASIC) require lines upon lines of code to set up a window, then set up a canvas to attach to the window, and only then can we finally get to setpixel and getpixel. BASIC flavors with POINT and PSET (or the equivalent) belong to a moment in time between the introduction of graphics processors and the introduction of graphical user environments. If you were to make pixels addressable relative to a corner of the screen in Windows/OSX/Xorg, you would be working outside the rubrics of the system. Early PCs were, at any given time, in text mode or graphics mode (think SCREEN statement in QBasic...)

3. The GUI revolution came hand-in-hand with the object-oriented revolution. Object-oriented programming means much more productivity for teams of programmers, but more hoops for the soloist to jump through before writing code that does stuff, at least in the strongly typed OO languages.

4. If you thought GUI cast the masses as passive users, I fear mobile computing will make it even worse.

Possible fixes?

1. Hardware is cheap, so a separate device (computer) dedicated to full-screen graphics INSTEAD OF GUI wouldn't be such an unaffordable luxury. Or a separate disk partition for a DOS-like operating system.

2. A graphics library (for Ruby, Python, whatever language is considered 'elegant' these days) that delivers, instead of a full-blown GUI like GTK, a single window that is opened upon load (require, include, whatever; or at most with a single function call) and is opened pre-configured with a canvas, ready for calls to getpixel/setpixel, or perhaps circle, arc and line drawing calls. I think I recall reading somewhere about something that implements turtle graphics for Ruby.

3. Half in jest, when you posted "Why Johnny Can't Code" back in 2006, I posted the idea of "re-issues" of classic computers (perhaps in miniature):

In the spirit of the electric guitar industry, perhaps there could be "re-issues" of vintage models of computer, perhaps miniaturized for energy efficiency and cell-phone-era portability. It wouldn't be without precedent: I recall, during the "palmtop era" of the mid-nineties, seeing pocket gadgets hardwired with MS-DOS 5.0, the concurrent version of Lotus 1-2-3, and other applications. Intended, of course, as work tools. An Atari-800-clone palmtop would be a nice stocking stuffer for kids (of any age) in your family you would like to introduce to the joys of hacking. The problem, as I see it, isn't the range of languages available today. It's the hardware.
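Lorraine's second fix, a graphics surface ready for setpixel/getpixel the moment it exists, might be sketched like this. The Canvas class below is invented for illustration (it is not a real package) and uses a text grid in place of a real window, but the shape of the interface is the point: zero set-up before the first pixel.

```python
# A rough sketch of a "ready canvas": no window, GUI toolkit, or event
# loop to configure before plotting. The class name and API are invented;
# a text grid stands in for the screen.

class Canvas:
    def __init__(self, width=40, height=12):
        self.width, self.height = width, height
        self.cells = [[" "] * width for _ in range(height)]

    def setpixel(self, x, y, ch="#"):
        # Out-of-range plots are silently ignored, BASIC-style.
        if 0 <= x < self.width and 0 <= y < self.height:
            self.cells[y][x] = ch

    def getpixel(self, x, y):
        return self.cells[y][x]

    def show(self):
        for row in self.cells:
            print("".join(row))

# Three lines from zero to pixels, as the BASICs of old allowed:
c = Canvas()
for x in range(40):
    c.setpixel(x, x % 12)   # a repeating diagonal of pixels
c.show()
```

A real library in this spirit would open one pre-configured window on import; Python's own turtle module comes close to this ideal.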

Paul451 said...

"David, I will continue to maintain that I understand your point,"

I don't. Perhaps you can explain it to me? Kids have more options than we ever dreamed. And you don't have to pick one. They all work in parallel, they all encourage exploration.

I just can't see what more David wants, except "things like it was when I was young". And I don't see the point of that, things are better now.

Lorraine said...

Tony Fisk: "As for pixels, I remember the ZX Spectrum manual describing how the 6K of allocated RAM was laid out to match the interlacing scan pattern on the monitor, and how you could poke values in directly and see the bit pattern. I don't think they do it that way these days. Co-processors are doing a lot of the donkey work."

Yup, and I had a Panasonic Epsonclone dot matrix printer whose user manual gave a full description of the language for printing graphics, with examples in BASIC, of course. But now the co-processors are doing the donkey work, and the business model is in the secret sauce...

Acacia H. said...

Dear divinity (or lack thereof) of your choice... I've seen less venom and disdain evoked during Dr. Brin's political commentaries. Seriously.

The simple point Dr. Brin is making is this: we should have an easy-to-use, easy-to-program, universal computer language available on all computers and smartphones, so people can get an introduction to computer programming.

From there, they can migrate to any computer language they wish. But the important thing is this: an easy language to get them interested in programming in the first place.

Just about every language being mentioned other than BASIC has one flaw: it is not easy to use. It is once you learn it, sure; but for the complete novice, it is not.


Nuff said.

Rob H., who had fun with ConnectiCon and tormenting girls wearing cat-ears with a laser pointer

Paul451 said...

Rob H,

1. Go to
2. In the code window, type: Print "Hello, World!"
3. Above the output window, click Run.

What is so difficult about that?

Honestly, I'm not trying to be difficult. I genuinely don't get the objections to the vast number of options available to kids today. (Which those kids are using, I'd be willing to bet, in much greater number than our generation.)

(1 byhowaes: 2 bysea. Or was it the other way 'round?)

Rob said...

Paul, does it plot pixels?


Tony Fisk said...

*sigh!* Am trying to give a quick run-down on what you can do from a standing start in Python. Blogger is *not* co-operating!

Paul451 said...

Rob H,
"does it plot pixels?"

Argh! You want something that's exactly like BASIC. No no, the language doesn't matter. Okay, what about this one? No no, it's not like BASIC.

What do you want it for? To teach algorithms? To teach maths? To teach graphics? To increase computer literacy (ie, under the bonnet, not just behind the wheel)? To encourage kids to learn computer programming? To encourage kids to tinker? Tell me and I'll give it to you.

Kids are learning types of mark-up from comment sections and wikis. They are learning scripting by modding games, and/or using Greasemonkey with Firefox (with thousands of sample scripts). They are hacking hardware, from pre-teen Lego Mindstorms to Arduino kits and, recently, the Raspberry Pi, with free websites that detail hardware-hack projects as if they were recipe books. There are tons of free tutorial websites, libraries of sample scripts/programs for beginners, and free online versions of programming textbooks. There are websites with pseudo-languages that, like learnpython, don't require anything to run on your computer, aimed at kids, some deliberately copying forms of BASIC, some purely graphical and object-based. There are whole communities available to help you with problems, at any level. Resources, riches, that we never had.

Hell, my first computer couldn't save the programs I wrote. Lost everything every time you turned it off. Must we replicate that too?

Boris Borcic said...

I'd like to congratulate the proponents of Python (which I refrained from citing, although I use it the way Rob uses VB). BTW, Python is being integrated into Geogebra 5 as an event-driven scripting language.

Anyway, I believe all have made their opinions clear, and that further constructive discussion would be served by zooming out to discuss the place of the magic itself. Everything here originated from Dr. Brin's quite justifiable affection for a specific corner of inspiring magic in an outdated (and, according to some, slightly mythical) standard schoolroom narrative, which he advocates resurrecting for the benefit of that inspiring magic.

The educational system's very low carrying capacity for presenting such magic to kids complicates the issue: there is real competition among worthy bits of magic to present.

And I for one believe that the magic of drawing pixels procedurally owed much to, in some sense, natural conditions that nowadays can only be framed artificially, in a manner that will not quite recover the magic.

OK, 'nough for now (work awaits).

Tony Fisk said...

Lorraine made a good point in citing the advent of GUI systems as adding those extra 'middle-byte' layers to conceal what's really going on at the bedrock.

However, I recall that AmigaBASIC was available from the start, in a pre-emptive multi-tasking GUI based OS, whose hardware made use of several co-processors. So, I don't think it's the whole story!

Urgh! I tried pasting the results of a simple 5 minute foray into what Johnny (or Jodie) can do with a standard Python setup. Blogger started playing silly-buggers by pretending to accept the post, and quietly discarding it! Anyway, what I was able to demonstrate was that you could do the 'hello world' thing, and store variables, and *even* define your own procedures on the fly (you could do the same with a class, but that's getting to the stage where it's more comfortable using files). There's online help available as well.

Anyway, I'll make a final attempt (with Idle's prompts substituted with '~' since I think the dumb parser suspected the originals of being XML tags) :

'memenia 11' a thoughtwaire virus

Tony Fisk said...

Here we go (from a CLI in Ubuntu Linux):

$ python
Python 2.7.3 (default, Apr 20 2012, 22:44:07)
[GCC 4.6.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
~~~ print "hello, world!"
hello, world!
~~~ name = "David"
~~~ print name
David
~~~ print "Hello, " + name + "!"
Hello, David!
~~~ def hello (name):
...     print "@" + name + ". Hello there!"

~~~ hello ("Rob")
@Rob. Hello there!
~~~ help()

Help manual is entered. I won't reproduce it here since that's where the blog glitch must be.

(Glad I managed to get that out. It was bothering me immensely!)