BASIC is the Best Programming Language for Beginners
"BASIC causes brain damage!" -Dijkstra
Every time a non-programmer asks me what programming language they should learn, I always answer "BASIC". Understand that not many computer programmers would answer that way, and some would even say BASIC is actively harmful. However, from my experience, these are the times it took me to learn various languages:
Applesoft BASIC: 3 hours
HTML: 3 hours
JavaScript: 2 days
Perl: 3 days
Processing: 3 days
VRML: 1 week
C: 40 hours
Pascal: 1 semester
Lisp: 4 weeks
Java: still on-going
Python: still on-going
There is a smattering of other languages, such as Scratch (instant!) and MS Visual Basic (1 week: 3 days being clueless, 2 days learning). However, you can see that BASIC and HTML occupy the short end of the learning curve. I understand that not everybody can learn things that fast, but still, the argument is that BASIC is very easy to pick up. Not as fast to pick up as Scratch and LOGO, mind you, but extremely easy to pick up without the severe limitations of LOGO, while still being a general-purpose programming language.
If you consider the fact that not only was Applesoft BASIC my first computer programming language, but also that I learned it simply by reading a book without touching a computer whatsoever, then it seems a miraculous event, indeed. But is it really? I don't think so. If you look at the form, it's basically (command) (parameter1),(parameter2),(parameter3). There may be odd grammatical markers such as parentheses and colons, but overall the form is very much like algebra, and just as easy to work with.
That is, I learned computer programming in 3 hours. That's all there is to it.
Yet that's actually not a big accomplishment. Coding, or implementing a computer program, is actually rather easy. Design, on the other hand, is extremely difficult to get right. I mastered coding a long time ago. Design? I have spent a lifetime learning it, and I'm still just scratching the surface.
That's not what most people would say. Most people would say that coding is hard and design is easy. Just as they say math is hard and art is easy. Who's right? The general population, or lonely me?
I'd say that math/coding is easy because if you blunder, the computer will tell you right away and you can fix it. If you get the design wrong, there's no one to tell you so, and the mistake will live on until one day you realize you're not getting the expected performance. Therefore, coding mistakes are easy and cheap to fix. Design mistakes can cause businesses to fail, and no one is the wiser.
Dijkstra wrote a paper saying, in effect, "blah blah blah BASIC causes brain damage! blah blah blah Once people learn something, they cannot unlearn it." The media, hungry for sensationalism, focused on the first sentence, and that sentence became famous. I, as an educator, focus on the second sentence. Why can't people unlearn what they have learned? I have had to do it several times while learning various computer languages. Why can't they do it? Why can't they bring themselves to another level? Why must they cling to outdated knowledge? BASIC does not cause brain damage. The damage is already there!
Is BASIC such a bad language? Not at all. Modern BASIC dialects are especially sophisticated. Even MS Small Basic, with all its bugs, is still superior to sophisticated programming languages such as Java and Python. Understand that most experienced computer programmers will blanch at that statement. They will expound on the greatness of such languages: Object oriented! Local variables! Strong data typing! No pointers!
However, when transitioning from BASIC to Pascal, I remember having to think for quite a long time about why local variables are desirable. Even to this day, I really don't see much difference between "point.x, point.y" (object oriented) and "point_x, point_y" (flat variables). The C language has the struct keyword to manage memory layout, and that I can understand. Computer memory in the old days was really expensive, and every little bit helped.
Think about it. Local variables can be implemented easily using a stack. That is, every time a function is called, you push its variables onto the stack, and every time you leave the function, you pop those variables off the stack.
- push (px)
- push (py)
- ...
- pop (py)
- pop (px)
When you have a long list of variables, this becomes unwieldy. Then you'd build a little convention on top to make it convenient for you.
- setmark (stack)
- push ("px",px)
- push ("py",py)
- ...
- poptilmark (stack)
And so on. There are many ways you can implement this. An array of pointers? A linked list? Preprocessing? There's no one right way. Now think about it. If you go from Pascal to BASIC, these chores become a burden. That will discourage people from learning new computer languages. However, if you go from BASIC to Pascal, then the fact that you no longer have to do this "by hand" becomes a pleasure. This will encourage people to seek newer and better languages. Isn't lifelong learning a worthy goal? I think so.
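To make the setmark/poptilmark sketch above concrete, here is one toy way to do it, in Python. All the names here are illustrative, not from any real BASIC or Pascal runtime:

```python
# Local variables kept on an explicit stack, one mark per function call.
stack = []   # holds (name, value) pairs
marks = []   # indices into `stack`, one per active call

def enter_function():
    marks.append(len(stack))      # setmark(stack)

def push_local(name, value):
    stack.append((name, value))   # push("px", px)

def get_local(name):
    # Search only the current frame: entries from the last mark onward.
    for n, v in reversed(stack[marks[-1]:]):
        if n == name:
            return v
    raise NameError(name)

def leave_function():
    del stack[marks.pop():]       # poptilmark(stack)

enter_function()
push_local("px", 3)
push_local("py", 4)
print(get_local("px"))  # → 3
leave_function()
print(stack)            # → [] — the locals vanish when the call returns
```

This is essentially what a compiler's stack frame does for you automatically, which is exactly the "pleasure" of moving from BASIC to Pascal described above.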
How many people have I seen put down BASIC because it lacks the convenient features that sophisticated languages have? Well, IMHO, if you can make it work, then it's not a language problem. It's a programmer problem! Otherwise, we may as well remove Scratch and LOGO from the list of viable languages, as those are even more primitive than BASIC! You don't put down Scratch because it's easy to learn and use. So don't do it to BASIC, either!
Another example of the person-versus-computer issue. Take a look at this problem:
- if s>=91 and s<=100 then g="A"
- if s>=81 and s<=90 then g="B"
- if s>=71 and s<=80 then g="C"
- if s>=61 and s<=70 then g="D"
- if s>=0 and s<=60 then g="F"
Do you see the problem? What if the score has a value of 90.5? What's that? Illegal value, because the data is stored internally as an integer? Well then, did you remember to round the values? Because if not, people will complain!
Notice that the choice of computer language is irrelevant. Some languages provide an if-else statement. It can be spelled "else if", "elif", or "elseif". It doesn't matter. It's the thought process that is faulty. Look at this:
- g="F"
- if s>=60 then g="D"
- if s>=70 then g="C"
- if s>=80 then g="B"
- if s>=90 then g="A"
That way, it's a lot cleaner: every score from 0 to 100 gets exactly one grade, boundaries included. So many people are busy learning the language, but not the algorithm. Learning a computer language is easy. It's the thinking process that is hard!
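The point translates into any language, as noted above. Here is the faulty ladder (written with explicit integer bands, which is what makes the 90.5 gap concrete) next to the clean cascade, sketched in Python:

```python
def grade_buggy(s):
    """Integer-band version: a score of 90.5 falls through every test."""
    g = None
    if 91 <= s <= 100: g = "A"
    if 81 <= s <= 90:  g = "B"
    if 71 <= s <= 80:  g = "C"
    if 61 <= s <= 70:  g = "D"
    if 0 <= s <= 60:   g = "F"
    return g

def grade_clean(s):
    """The reordered version: start at F, upgrade as thresholds pass."""
    g = "F"
    if s >= 60: g = "D"
    if s >= 70: g = "C"
    if s >= 80: g = "B"
    if s >= 90: g = "A"
    return g

print(grade_buggy(90.5))  # → None: no band claims it
print(grade_clean(90.5))  # → A
```

The clean version has no gaps and no overlaps by construction, because each threshold simply overrides the previous one.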
And let's talk about GOTO. It encourages "spaghetti code", as people say. Is that so? If the design is clean, there will be no spaghetti code, regardless of GOTO statements. On the other hand, imagine doing some object-oriented work. Let's model a banking system. Imagine Customer, Bank Safe, Teller, ATM, and Manager interactions. Five elements. Draw a line (with an arrow denoting direction) for each interaction:
Customer-Teller
Customer-ATM
Bank Safe-Teller
Bank Safe-Manager
Teller-Customer
Teller-Bank Safe
Teller-ATM
Teller-Manager
ATM-Teller
ATM-Manager
Manager-Bank Safe
Manager-Teller
Manager-ATM
As you draw on paper, check your work. Do any lines cross? If so, then it's a bit more complicated, isn't it? Then you need to reposition the elements to make the diagram clearer. Now, this is just 5 elements, with arrows all over. Imagine it with a dozen or so objects. Can you still see it clearly just because it's "object-oriented"? I doubt it. There is a reason flowcharts fell out of favor: they're great for small systems, but expand them greatly and they become tough to follow. Ditto for any visual language.
Now, take another look at the chart you have. What happens if a customer tries to bypass the Teller and go directly to the Manager? Well, you need to take care of that situation, too. Add another line. You may decide to allow it, or you may decide to throw an error message. What about the ATM-Customer interaction? Another error message? Keep going until you have a five-element graph with lines all over the place. Impossible to avoid crossing the lines, isn't it? That, my friend, is spaghetti code. Object-orientedness is by no means the deciding factor in confusion. Simple, clean design matters more.
I also don't understand why Java/Python people take it as a point of pride that they do not use pointers. Sure, pointers can be difficult to get right, but what are you going to do? Learn how to use them properly, or discard them wholesale? What kind of computer programmer are you?
Even a highly skilled, highly experienced programmer such as Mark Lutz, who wrote two excellent Python books, Learning Python and Programming Python, had trouble with Cartesian/polar coordinate conversion. Excuse me? That conversion is something I learned noodling around in BASIC for about half an hour. It's not the language/tool/car/money. It's the person! Which is better: that you do not use pointers and have trouble converting polar coordinates, or that you do use pointers and have no trouble converting polar coordinates?
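For reference, the conversion in question is a couple of lines in any language. A minimal Python sketch (the function names are mine):

```python
import math

def to_polar(x, y):
    # r = sqrt(x^2 + y^2); atan2 picks the angle in the correct quadrant.
    return math.hypot(x, y), math.atan2(y, x)

def to_cartesian(r, theta):
    return r * math.cos(theta), r * math.sin(theta)

r, t = to_polar(3, 4)
print(r)  # → 5.0 — the classic 3-4-5 triangle
x, y = to_cartesian(r, t)
print(round(x, 10), round(y, 10))  # → 3.0 4.0 — round-trips cleanly
```

The only subtlety worth knowing is atan2, which handles the quadrant bookkeeping that a naive atan(y/x) gets wrong.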
Yes, I had trouble using pointers. So much so that every time I used them, I'd draw little arrows on paper pointing here and there. It wasn't until I realized that pointers are references that the whole trouble disappeared. What are references used for? Variables, functions, strings, data structures. Any time you want to refer to something without actually making a copy, that's a reference. Which means the Unix command "ln" is a pointer operation. Java's garbage collection is also pointer-heavy. Python objects are pointer-heavy, too. The only difference is that those languages hide the operations from the user.
And yet referencing objects (a.k.a. pointers) is so useful that even an Excel spreadsheet has the capability. Not that it's in any tutorial, including advanced ones. I wonder why. The skill of referencing objects indirectly is an important one in the computer programming toolbox. Why won't people learn it? Why hobble themselves voluntarily?
Old-style BASIC, which is flat, is rather cumbersome to use. Sure, that's true. But a bad workman blames his tools. So what if the language is hard to use? You can keep a programmer's journal to clarify each problem and its solution. How many computer programmers today keep such a journal? If you're talking about Python, the answer is none. The source code is the documentation!
But that's not a good thing. You don't want to teach programmers via the source code. Imagine that the program's maintainers cannot understand quicksort. Would they replace highly optimised quicksort code (such as GNU C's qsort) with bubble sort? Very likely. Do you really want to teach these people what quicksort is, right in the source code itself? Of course not! That would be a distraction! So the way to bring beginner programmers up to date is via external instruction. With quicksort, that's easy enough; schooling and/or the Internet will take care of it.
Now replace the word "quicksort" with "the bleeding-edge technology (that nobody but you understands) on which your company's very survival depends, such that if it fails, you (along with everybody else) will be out of a job", and see if you're still willing to gamble that replacement coders will understand the issue completely just by reading the source code.
And that's the value of a computer programmer's journal. Just as novelists keep a source book detailing all the characters, locations, and other tidbits, so should computer programmers. This isn't being done. Why not? It's a necessary component of computer programming.
All the greatest people I know have the ability to admit that they are wrong. Therefore, the more stubborn they are in insisting that they are right, the less I think of them. With computers, there is no ambiguity: If you're wrong, you're wrong. If the computer is wrong (buggy), you have to work around it. Either way, you learn, adapt, and move on.
So many people complain that their language isn't "powerful" enough. I know many programmers who complained that BASIC doesn't do recursion. I was once among them. Not anymore. Not since learning assembly language. The difference is that before, I didn't know what to do; now, I do. There are many computer programmers who'd argue that they don't need to know such details because the language takes care of them. Considering that many highly skilled and experienced computer programmers spend their time removing recursion from quicksort just to get a fractional increase in speed, I'm afraid I can't agree with that sentiment. Yes, you do need to know how to do it!
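To illustrate what removing recursion looks like, here is quicksort with the call stack replaced by an explicit stack of ranges, sketched in Python. This shows the general technique (the same push/pop idea as the local-variables discussion earlier), not any particular library's optimized code:

```python
def quicksort_iterative(arr):
    """Quicksort without recursion: pending subranges live on an
    explicit stack instead of on the language's call stack."""
    arr = list(arr)
    stack = [(0, len(arr) - 1)]   # ranges still waiting to be sorted
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue              # ranges of 0 or 1 elements are done
        pivot = arr[hi]           # simple last-element pivot (Lomuto scheme)
        i = lo
        for j in range(lo, hi):
            if arr[j] <= pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[i], arr[hi] = arr[hi], arr[i]
        stack.append((lo, i - 1)) # "recurse" by pushing both subranges
        stack.append((i + 1, hi))
    return arr

print(quicksort_iterative([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```

In a language without recursion, such as old flat BASIC, this is exactly the shape the algorithm has to take, which is why knowing the trick matters.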
If you don't care about speed, as none of the Java/Python people seem to, then by all means ignore that. Ignore everything else, too. A programmer's journal? Program design? Automated testing? Whatever for? "It works on my machine" is a good enough excuse!
In my opinion, you need to learn how to design your program from the ground up. Once you do, then you need to learn how to implement it efficiently. Then, you need to learn how to test it properly. And finally, you need to learn how to document the process well enough, so that other people can benefit from your work.
So, back to the point. Why BASIC? Why not C/C++/C#, Java/JavaScript, Perl/Python/Ruby, etc.? Why not use the tools that highly skilled, highly experienced professionals use? Well, because they are highly skilled, highly experienced professionals. They have the skills to learn complicated subjects quickly and effortlessly. Can beginners do that? Some of them can't even count properly, and you want them to use highly complicated tools? What were you thinking?
Let me put it this way: do you want beginners to learn a language that is not only so easy it can be learned in an afternoon, but also so custom-made for the job that it has the word "Beginner" in its acronym? Or do you want to shove at them a language on which Mark Lutz spent 3000 pages of dry technical writing?
Those 3000 pages are equivalent to 6 months of learning. Do you want to spend that much time learning the language? Or do you want to spend the time learning program design and algorithms? I rest my case.
- Note that there are good alternatives to the BASIC programming language. I mentioned Scratch and LOGO. Processing is a good language as well, one that I recommend over Java.