In the 1970s, a team of pioneering computer scientists at Xerox PARC known as the Learning Research Group investigated this very problem--that of creating a programming language suitable for children as young as five to write programs in. As part of their study, they took their computers and their language into Palo Alto schools and attempted, successfully, to teach children from different age groups how to program. The language, named Smalltalk, underwent continuous revision throughout the '70s and was finally released to the world in 1980 as Smalltalk-80.
Smalltalk is a tiny language, with the entire syntax famously fitting on an index card. It is pure in theory: everything is an object and every operation, even "1 + 2," a message sent to an object. It is also pure in implementation: except for a handful of primitive messages, Smalltalks are written almost entirely in themselves, like Lisps.
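The "everything is a message send" claim is easy to demo even outside Smalltalk. Python exposes a similar (if less uniform) mechanism: the + operator dispatches to the __add__ method on the left operand, so the "message send" behind "1 + 2" can be written out explicitly:

```python
# In Smalltalk, "1 + 2" sends the message #+ to the object 1 with the
# argument 2. Python's analogue: + dispatches to the left operand's
# __add__ method, so the two spellings below are the same operation.
result_operator = 1 + 2
result_message = (1).__add__(2)  # the explicit "message send"
print(result_operator, result_message)  # 3 3
```

The difference is that Smalltalk has no operator layer at all; the message send is the only spelling there is.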
That a developer could, in 2009, be completely ignorant of the advances made in his field a quarter century earlier is simply inexcusable. To see people, in 2009, touting BASIC as a fine pedagogical tool (and not a modern incarnation, but classic line-oriented, GOTO-ridden BASIC--the very BASIC upon which Dijkstra unleashed his rancour), and to see others seriously suggesting Ruby, Python or some other overly-complex bastard child of C instead, is to observe first-hand the tribute that ignorance exacts from the ignorant. And that ignorance is not only the cause of BASIC being inflicted on yet another generation of hapless youth, but is also the reason why C#'s classes aren't real objects, why Java's "new" is a special primitive, and the source of countless other incongruities between Smalltalk and nearly every other language advertised as "object-oriented."
The thing is, Smalltalk (and Forth, and to a certain extent Lisp) do what they do by pretending that everything outside the language runtime doesn't really exist. I wonder how much it's possible for young children to really care about manipulating symbols in such a closed world. Now consider LOGO, which is all about moving the turtle around the floor (and is quite Lisp-like). Even now, decades later, when I'm typing the commands that will make the tape robots in a distant datacentre spin into motion, I'm remembering LOGO at primary school...
I don't know why you include Forth in that "closed world" category. It was originally designed to control machinery, and it's still ideal for exploratory embedded code.
Forth's BLOCK mechanism for file access, for example, is unlike that of any other programming language. It doesn't even really have a concept of a filesystem or files; you load pages into memory yourself.
Block files are for when Forth is the OS and you want a quick, simple way to divide up your storage. Forths that are hosted on, e.g., Linux all let you use the regular filesystem.
Yes I am not knocking Forth as a language for embedded systems. I own a copy of Thinking Forth and it's great. But it's not what I'd use to teach kids (having had LOGO and BBC BASIC myself).
Smalltalks usually come with some sort of graphical environment that facilitates the building of GUIs. Those environments usually also provide simpler graphical primitives you can draw with. In fact, if all you want is LOGO, you can use the Pen class (a standard ST-80 class) in Squeak or Pharo as a LOGO turtle:
p := Pen new.
360 timesRepeat: [p go: 1; turn: 1].
that will draw a circle.
4 timesRepeat: [p go: 100; turn: 90].
and that, a square.
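For anyone without a Smalltalk image handy, the same two commands can be traced headlessly in Python. The Turtle class below is a hypothetical minimal stand-in for Squeak's Pen, tracking only position and heading rather than drawing anything:

```python
import math

class Turtle:
    """A minimal, headless turtle: tracks position and heading only."""
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0  # degrees; 0 means facing right

    def go(self, distance):
        rad = math.radians(self.heading)
        self.x += distance * math.cos(rad)
        self.y += distance * math.sin(rad)

    def turn(self, degrees):
        self.heading = (self.heading + degrees) % 360

t = Turtle()
for _ in range(4):   # 4 timesRepeat: [p go: 100; turn: 90]
    t.go(100)
    t.turn(90)
# After tracing the square the turtle is back at the origin,
# up to floating-point noise.
print(round(t.x, 6), round(t.y, 6))
```

Python's standard library also ships a real `turtle` module that draws on screen, which is closer in spirit to the Pen example above.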
More complicated stuff can be done with Morphic and eToys.
Whenever this topic comes up, there are a huge number of replies to the effect of, "I learned on BASIC, and I found it satisfying. I don't think kids can deal with more complexity."
This is a valid point, but there is a huge change here that we seem to be missing. When a lot of us started out, computers were not ubiquitous. Getting a computer to print out "trjordan is the BEST!!!!" in a loop was insanely cool, because that's what computers did. The reason BASIC seemed so great was because you could make computers do what you thought was their primary use (text-based input/output, and decisions based on your input).
Today, kids have a totally different relationship with computers. I can't claim to understand it, but the sheer pervasiveness of graphics, color, and mouse-based interfaces changes how kids work with computers. When you think of it this way, BASIC, Python, and every other programming language falls short, because you basically can't create things that ape real programs (interface, no logic) in a few short lines. Sure, Python is great for some kids, but it's not hooking anybody whose mind isn't already "wired for programming", in some sense.
So what's the answer? I'm not sure. I do think that we need to provide something to get GUIs and graphics in front of them in a fast, easy, and intuitive way. LOGO's turtle is OK, but ultimately too limited. I remember _why's Shoes seeming like a step in the right direction. Kids love building things, and programming is one of the only endeavors where you are limited only by your imagination. With the right set of tools, they could get a start on a lifetime of building, but they need to be shown how powerful that keyboard and monitor are.
> "When you think of it this way, BASIC, Python, and every other programming language falls short, because you basically can't create things that ape real programs (interface, no logic) in a few short lines."
Sure you can, cf. STOS/AMOS, GFA BASIC, BlitzBasic/BlitzMax, GameMaker, AGS and all the other BASIC-like systems that make it a breeze to program games and GUIs.
With BlitzMax (which is relatively modern), you can get GPU-accelerated sprites flying around on screen in three or four lines. Easier than Flash by a long way.
When starting out in programming, it's important for things to be as concrete as possible. It's basic usability: you want to do something and see the result immediately.
Interestingly, everyone here (including myself in my earlier post) seems to be ignoring one of the OP's primary points: BASIC, with its ugly GOTOs, is a lot more like assembly language than anything else mentioned here (as far as I know; I don't know Smalltalk or LOGO). Does anyone think that point is meaningful, or BS? I, for one, see two sides:
1) I agree that this can give kids a good understanding of how a computer works... but at least with QBasic there are functions so some teachers will be tempted to not teach GOTOs.
2) Kids don't care. As @trjordan mentioned, anything command line will seem boring to kids nowadays.
However, I don't see any reason why you can't have a programming language which is, in some sense, a "high level low level language" and still can produce neat graphics. I don't know if it exists though...
Understanding how a computer works is a somewhat different goal from that of infusing computational thinking.
> anything command line will seem boring to kids nowadays
Not quite. Remote control, for example, will always have its appeal - that you do something in one place that causes something to happen in another apparently physically disconnected location. So if what the command-line controls is "distant" enough, it'll be fun. I don't intend to say that command-line is the ideal interaction mode, but it doesn't need to be thrown out of the toolbox just yet.
I first learned programming in a language called CoolBasic. It's a simple BASIC-like language meant for making simple 2D games. It doesn't have any command line functionality at all, and a program always has exactly one window where everything happens. It has built-in commands for loading and showing images, playing sounds, making game objects and moving them, collisions between them, tilemaps, etc. It has its own IDE with an integrated manual (with tutorials, explanations and examples of all commands).
While it was impossible to do anything advanced in it (without using dll-files), it was really easy to make a simple game with it.
The current version of the language is outdated (uses DirectX 7 for graphics, no hardware acceleration, etc.), but a new version is in development. The language is made by a Finn and does not currently have an official English manual, but the next version is supposed to (there are some translations made by the community).
Nice link. Instant gratification is central for any environment targeted at kids. That said, Logo's motive was to have a low threshold and no ceiling, whereas CoolBasic's motive seems to be a low threshold, but a fairly low ceiling as well. So .. not a solved problem yet :)
As much as I dislike Basic in all of its forms, I have to agree with the author on this: it's a good way for kids to learn programming. My agreement is partially based on my own experience: when I was 13, I wanted to learn how to program my 386 computer. I tried, and failed, to teach myself C++. I then bought a book on QBasic, and learned it. Afterwards I went back to C++ (well, mostly the C subset of C++ actually). I eventually had to come back to Basic in the form of Visual Basic, but by the time that happened, I was an adult and already familiar with several different languages.
At some point, my dislike for Basic brought me to the question: "What language is appropriate to use for introducing kids to programming?" Of all of the other languages I know, none seem appropriate. (And yes, people have suggested Python, but I'm not convinced.) Languages designed for children are dumbed down. Languages designed to meet the rigor of modern programming requirements have complexities that should be hidden for relatively simple operations. Sigh. I'm sure there is a better way out there, but to find it within the haystack that is the collection of modern programming languages is a formidable task.
I am curious as to what you think basic does more easily than python. Is it the significant whitespace? I just don't see how python is harder than basic in terms of what a beginning programmer is likely to do with it. They both have for loops and if statements, which make up the lion's share of the easy programs.
I would think it is easier to program in python because of the for each syntax in python since it maps so closely to human language. For element in list: do something with that element. To open up a file it's just myopenfile = open("/path/to/file") etc....
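The two idioms mentioned above, sketched out so they actually run (a throwaway temp file stands in for the "/path/to/file" placeholder so the example works anywhere):

```python
import os
import tempfile

# The for-each loop reads like English: for each element in the list,
# do something with that element.
scores = [3, 1, 4, 1, 5]
total = 0
for element in scores:
    total += element
print(total)  # 14

# Opening and reading a file is similarly direct. We create a real
# temp file first so the sketch is self-contained.
path = os.path.join(tempfile.mkdtemp(), "hello.txt")
with open(path, "w") as f:
    f.write("hello, world\n")

with open(path) as f:
    for line in f:
        print(line.rstrip())  # hello, world
```

The `with` form also closes the file automatically, one less thing for a beginner to forget.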
Line numbers. My father was a computer science teacher, and when I was very young (5 or 6), I used to look at the example programs that his students had written, and that he was grading at home. The exercises were written in Pascal, and I just really had trouble grokking how the computer knew in which order to execute things. Even just the simple question of being able to identify where execution was going to start was a non-trivial exercise.
Line number based BASIC is much easier for a child to understand. You start at the top, and start reading. I remember that in my head I had an image of me writing 'stories' for the computer to read.
A (very) few years later, the question of order of execution was obvious to me, but I clearly remember those times in the lounge room looking at print outs of Pascal, and not being able to figure out what the program was going to do.
I first cut my teeth on GW-BASIC on my Tandy 1000, and I wrote some pretty crazy stuff on that. The author is totally right... demystifying what was going on was key to getting my feet wet. I soon learned that BASIC was an interpreted language (which is why I couldn't just run it at the command line), and that I needed a "compiler" to make machine code, or learn assembly, so then I wanted a compiler, and I learned C, etc. etc. etc.
Twenty something years later, I don't think the experience left me with any scars or bad habits, but I might not be a programmer today if I hadn't started with BASIC.
> I soon learned that BASIC was an interpreted language (which is why I couldn't just run it at the command line)
Maybe the BASIC you used couldn't be entered from the command line, but you could type BASIC commands into the Commodore 64 directly, without having to write a program first, and they would be executed. This has nothing to do with it being interpreted or compiled.
The Tandy 1000 was an MS-DOS clone. GWBASIC.EXE was a separate program that you ran at the DOS prompt, then you could start writing your program. You could probably also do something like GWBASIC <your program name>.
But when I was a kid, that never felt like a "real" program. The real .EXE and .COM programs were compiled and ran straight from the DOS prompt, and I wanted one of those.
You're right, I had friends with C-64s and Apples. For the C-64, BASIC was basically the whole "interface." Running a program involved running a BASIC LOAD command. Not so on the early DOS clones.
Ah, the first computer I learned how to program on, at the ripe age of 3: the Apple IIe, of course. I still miss the days of Millionaire, Karateka, Conan, and AutoDuel (we had this computer for years and years).
That said, I think we may be confusing two things: teaching computer science and teaching about computers. For the comp-sci approach, you would want something that can express rich, high-level concepts.
OTOH, if you want to teach how a computer works deep inside, BASIC, with GOTOs and one-line IFs, primitive types and so on teaches a lot about how the processor inside the computer thinks, without the fuss other languages bring. Creating your own recursion (building your own stack - or stacks - out of arrays of primitives) is a worthy exercise.
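That "build your own stack" exercise can be sketched in a few lines. The version below is a hypothetical illustration in Python rather than BASIC: the same factorial computed twice, once with real recursion and once with the call stack made explicit as a plain list, the way you might use an array and an index in GOTO-era BASIC:

```python
def factorial_recursive(n):
    """The textbook recursive definition."""
    return 1 if n <= 1 else n * factorial_recursive(n - 1)

def factorial_with_own_stack(n):
    """Same computation, but with the call stack made explicit:
    a plain list of pending multipliers standing in for the
    machine's real stack frames."""
    stack = []
    while n > 1:          # "push" phase: record each pending frame
        stack.append(n)
        n -= 1
    result = 1
    while stack:          # "pop" phase: unwind the frames in reverse
        result *= stack.pop()
    return result

print(factorial_recursive(6), factorial_with_own_stack(6))  # 720 720
```

Working through why the two functions always agree is exactly the kind of exercise that teaches what the processor is actually doing underneath recursion.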
I learned how to program with an Apple II. It didn't kill me.
If you want to learn how a computer works, I'd agree that BASIC is a great way to go about it. A processor pulls instructions out of memory and executes them. The instructions mutate the state of the machine to produce a result. BASIC models this imperative process in very plain, easily understood text.
However, if I was going to teach something about programming to a novice I'd try to get them to start thinking functionally from the beginning. All this imperative crap is how you implement the abstract model of what you are trying to compute on the hardware. Why get bogged down in the details? Show them a recursive data structure or algorithm and get them to understand it. Have them run it on paper not by thinking about what registers or variables you are modifying, but rather what the expression tree looks like when the recursive algorithm is evaluated.
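The "run it on paper" exercise above might look like this as code. A minimal sketch, not from the original comment: an expression tree for (1 + 2) * (3 + 4) as nested tuples, evaluated by structural recursion, so the student traces subtree values rather than registers:

```python
# An expression tree for (1 + 2) * (3 + 4), as nested tuples:
# a leaf is a number; an interior node is (operator, left, right).
tree = ("*", ("+", 1, 2), ("+", 3, 4))

def evaluate(node):
    """Evaluate by structural recursion, exactly as on paper:
    reduce each subtree to a value, then combine the two results."""
    if isinstance(node, (int, float)):
        return node          # a leaf evaluates to itself
    op, left, right = node
    a, b = evaluate(left), evaluate(right)
    return a + b if op == "+" else a * b

print(evaluate(tree))  # 21
```

On paper the student writes the same reduction: ("*", 3, 7), then 21, with no mention of variables being mutated anywhere.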
I think if my first language had been SML or a LISP I'd have had an easier time later on in my programming career.
"Lisp programmers know the value of everything but the cost of nothing"
Functional programming is a good paradigm but not a panacea, and imperative programming isn't crap. It has its uses. I like FP, but sometimes I'm embarrassed by how some people put it on a pedestal, like for OOP before...
"Interesting to note is that the computer language they use to teach it (scheme) has its roots in Lisp (which is over 40 years old) and will probably never change."
It should be noted that MIT switched the language they teach their undergrads this year to Python. It should also be noted that Python also has roots in Lisp.
Yeah, it doesn't really. Python was already at version 1.2 before anyone even drew the comparison; Guido says, "... in April 1995. CNRI's director, Bob Kahn, was the first to point out to me how much Python has in common with Lisp, despite being completely different at a superficial (syntactic) level."
People really think 7-year-olds should learn Java/C/C++/etc.? That's absurd. All the great stuff these languages provide is only useful after you can deal with some abstraction. Till then, the most critical thing for kids is getting them to think in "small steps."
Agreed, but I think this article is railing against a strawman. The choices aren't between basic and C++. Python, for example, can be used with simplicity rivaling most dialects of basic, but it is also a fairly powerful and expressive language once they get used to it.
Then there are things like Squeak or Alice that are aimed at providing a gentle and fun introduction to programming.
Learning basic is certainly preferable to learning nothing, and like a great many people here I also was introduced to programming through it. But as far as I can see there is little to no reason to consider it a preferable introductory language.
The best way to teach a young'n how to code in this day and age is a two pronged attack.
From one direction, come at them with microcontroller assembly language. Get them making Arduino-powered blinky LEDs and robots. That will learn 'em the fundamentals and should be a blast from the beginning.
From the other direction, use HTML+JS to teach high-level dynamic languages and GUIs. Plenty of fun to be had here as well. They can code up their own social sites. You might want to grab some equity.
These two courses together would be intense but they are so different that they can use one as a refreshing break from the other. They are both so dense with gratification that if you've got a real future hacker on your hands, you'll be the one who has a hard time keeping up.
A very interesting angle indeed. MIT recently switched their SICP based course to a robot-centric course for teaching computer "science". It is certainly more fun to infuse some hardware work into the curriculum. It is much more fun these days with all kinds of controllers being much more easily available - think Lego Mindstorms.
I've been wondering lately whether MIT's approach of starting with Scheme has any lasting impact on students. After almost a decade of coding, I'm finally grasping the power of Lisp, but I wonder if I'd started with a Lisp would I have gotten here sooner? Does it help or hinder students to learn something as alien as Lisp for their first language in college?
The Parallax Propeller is an 8-core microcontroller (with each core named a "cog") with very small amounts of memory on each core. The Propeller can fairly easily interface with PS/2 keyboards and mice, VGA monitors, and RCA Composite screens. You can code on it with ASM, a language with an onboard-interpreter called SPIN, some free but proprietary-and-slightly-limited C compilers, a tiny implementation of Basic that works, or a very robust Forth.
Give it to the kids in your life, and they too will be able to write "Ian Rules!" :)
My soon-to-be 8-year-old is enjoying Scratch immensely. (As am I.)
Without any instruction, apart from two tutorials from YouTube, he has made a number of exploratory programs.
I cannot recommend it more highly than by describing him dancing in delight as he showed his mother how the flying bit of doodle (made with the in-skin paint widget) that shoots from the mouth of the cat makes the stick men 'ghost'.
He set up broadcasts, loops, events, and keyboard control, all on his own.
I've used BASIC quite a bit on the C64 as a prototyping tool. Everything ran so slowly that even the slightest performance gain in my algorithms was noticeable with the naked eye.
So when my demos were coded in assembly, I knew which algorithms were the fastest.
Ahh the memories :-)
I'm not sure BASIC captures the imagination of kids these days. _why's Shoes does a much better job.