Why Johnny can't Code

A month or so ago I came across this post on Salon via OSNews. I read both the OSNews post and the Salon post and I've wanted to write about it since, but something wasn't sitting right - I just couldn't put my finger on it. Well, yesterday, while reading about child development - specifically language development - it hit me!
Here's an excerpt:

Only there's a rub. Most of these later innovations were brought to us by programmers who first honed their abilities with line-programming languages like BASIC. Yes, they mostly use higher level languages now, stacking and organizing object-oriented services, or using other hifalutin processes that come prepackaged and ready to use, the way an artist uses pre-packaged paints. (Very few painters still grind their own pigments. Should they?)

And yet the thought processes that today's best programmers learned at the line-coding level still serve these designers well. Renowned tech artist and digital-rendering wizard Sheldon Brown, leader of the Center for Computing in the Arts, says: "In my Electronics for the Arts course, each student built their own single board computer, whose CPU contained a BASIC ROM [a chip permanently encoded with BASIC software]. We first did this with 8052's and then with a chip called the BASIC Stamp. The PC was just the terminal interface to these computers, whose programs would be burned into flash memory. These lucky art students were grinding their own computer architectures along with their code pigments -- along their way to controlling robotic sculptures and installation environments."

But today, very few young people are learning those deeper patterns. Indeed, they seem to be forbidden any access to that world at all.

What this reminds me of is the difference between language and linguistics. As a computer science undergrad, I learned the linguistics of computation, not the language - in contrast to high school, where I was learning the language of programming in order to do something with it, i.e. make a game.

I think it's good for coders to know and appreciate the underlying structure of a computing architecture. However, do we expect a five-year-old learning language for the first time to be able to parse sentences and bust-a-rhyme like no one's business? No, not really! (unless you've got some pretty gifted children out there!)

Believe it or not, anything that can be accomplished with a procedural language can be accomplished with an object-oriented language - you just have to break down your mental framework and think outside the box.
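To make that concrete, here's a toy sketch in Python (my example, nobody else's): the same running-total computation written procedurally and as an object. Different framing, identical result.

```python
# The same computation, framed two ways.

# Procedural: a loop and a variable.
def total_procedural(numbers):
    total = 0
    for n in numbers:
        total += n
    return total

# Object-oriented: the state lives in an object, the loop feeds it.
class Accumulator:
    def __init__(self):
        self.total = 0

    def add(self, n):
        self.total += n

def total_oo(numbers):
    acc = Accumulator()
    for n in numbers:
        acc.add(n)
    return acc.total

print(total_procedural([1, 2, 3, 4]))  # 10
print(total_oo([1, 2, 3, 4]))          # 10
```

Neither version is "closer to the machine" than the other - they're two mental frameworks for one problem.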

The author goes on to say:
Those textbook exercises were easy, effective, universal, pedagogically interesting -- and nothing even remotely like them can be done with any language other than BASIC. Typing in a simple algorithm yourself, seeing exactly how the computer calculates and iterates in a manner you could duplicate with pencil and paper -- say, running an experiment in coin flipping, or making a dot change its position on a screen, propelled by math and logic, and only by math and logic: All of this is priceless. As it was priceless 20 years ago. Only 20 years ago, it was physically possible for millions of kids to do it. Today it is not.
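For what it's worth, nothing about that exercise is BASIC-specific. Here's a rough sketch (my example, not Brin's) of the same coin-flipping experiment in Python - every step of which a kid could still duplicate with pencil and paper, aside from the random numbers:

```python
import random

# Flip a coin 100 times and tally the results -- the classic
# BASIC-textbook exercise, in a modern language.
flips = 100
heads = 0
for _ in range(flips):
    if random.random() < 0.5:  # each flip is a 50/50 chance
        heads += 1
tails = flips - heads

print("heads:", heads)
print("tails:", tails)
```

Ten lines, one loop, one conditional - the "math and logic, and only math and logic" is all still there.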


This line of argument has a parallel in language learning: kids should learn Latin first, because Latin has more cases than English and you can parse sentences with much greater precision. Is this a prepositional phrase or a coordinated conjunction phrase? Or both? Or neither? With English it's more ambiguous because of the lack of declined endings on nouns, but Latin is clearer! Now, any parent (and school teacher) will tell you that this is complete and utter bullshit. Children pick up language - they problem-solve and form new frames of reference to be able to communicate.

The same thing is true for learning about computers - it's not the language you use, or how far away you are from the hardware; it's about problem solving. If you really want to get close to the hardware, forget BASIC - go with assembly language :) Personally, I think the author has a major case of nostalgia and is looking for a way to let it out.


Now, don't get me wrong, I think there is a certain magic when it comes to classic hardware and software. If I ever had a kid, I would love to program in Applesoft BASIC on an Apple IIgs (it would bring back memories of school for me), or get an old Atari with LOGO on it. The point, however, is not that there is something inherently better about one language over another. The point is that learning A language is good, no matter what that language is. If you really want your kid to know how a computer works, teach them binary mathematics ;-) (or assembly language)
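And if you do go the binary route, the arithmetic itself is the lesson. A quick illustration (again, my sketch) of how hardware adds two numbers bit by bit, carries and all - XOR gives the sum bits, AND gives the carries:

```python
# Add two non-negative integers using only bitwise operations,
# the way an adder circuit does it.
def binary_add(a, b):
    while b:
        carry = (a & b) << 1  # positions where both bits are 1 carry over
        a = a ^ b             # sum of the bits, ignoring carries
        b = carry             # repeat until no carries remain
    return a

print(bin(5), "+", bin(3), "=", bin(binary_add(5, 3)))  # 0b101 + 0b11 = 0b1000
```

No BASIC required - just math and logic, which was supposedly the whole point.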
