Sunday, July 31, 2022

Essay 07 A 3G Computer Specification


Minimum Specification Computer


It was some time ago that the Raspberry Pi Foundation came up with the Raspberry Pi Zero. That's a bubble-gum-sized computer, in comparison to the mint-tin-sized original. I think it's cute, and of course I bought one. The specification, however, falls short of my vision of an ideal computer.


Of course, it would be nice to have everything, but what I'm really aiming for is a cheap computer that does good enough computing. I still keep one to use as a daily driver when travelling. Raspberry Pi Zero is such a cute and handy computer!


Regarding the performance, it is somewhat lacking. It reminds me of the story of the 3M computer specification, and of how Steve Jobs first heard the term.


 https://en.wikipedia.org/wiki/3M_computer  


which became the driving design of the NeXT Computer: a megabyte of memory, a megapixel display, and a million instructions per second of processing power. It is also commonly said that the price should be less than $10,000.


When I defined the specification of a simple yet usable computer, I came up with a 3G specification: 1 GHz CPU, 1 GB RAM, and 1 Gb bandwidth. The price should be less than $100. Raspberry Pi Zero didn't quite measure up to it: 700 MHz CPU, 512 MB RAM, and USB 2 spec. So, I'd say about half my ideal specification, except the price, which was $10 with WiFi.


Raspberry Pi 4, which I bought as soon as I was able upon release, exceeds the specification in many ways by about 4 times: 4 cores of 1 GHz CPU or faster, 4 GB RAM, and USB 3 bandwidth. The price is $55, quite reasonable for the power and performance. So powerful, in fact, that the only applications that could possibly max it out would be 3D rendering and video editing. It mostly just sits idle doing nothing most of the time.


The Invisible Bandwidth


If you notice the difference in specification between the two, the original specification called out MIPS as an important factor. Steve Jobs put in display resolution as the important factor. I put in bandwidth as the important factor.


The reason I don't really care about MIPS and display is that I don't really use them. I mostly deal with integer calculation, and as far as display resolution is concerned, I'm still stuck at 800x600. In fact, I regularly set the font to a large size, so that I don't have to squint at the display. Yes, I have a large HDMI monitor. No, I'm not a member of the Tiny Font Society!


I set all my fonts and menu bars to a large size, and I'm really dismayed that most programs, including games and productivity tools, regularly set the font to an extremely tiny size. Do you all have perfect vision, or do you just not care? I think it's mostly the latter, but I really do hate tiny fonts.


As far as the bandwidth specification is concerned, I was clued in by the difference between the Personal Computer (PC) and the mainframe computer. I really like to deal with large amounts of data, and the PC works well enough, but on the mainframe, I noticed that processing went by at a much faster rate than the CPU would indicate. This piqued my interest. As I pored over the literature for the difference, I noticed that the bandwidth is much larger on the mainframe. There is the answer!


Actually, that's not the whole story. There are several bandwidths involved. The obvious one is the System Bus, for data transfer between CPU and RAM. Then there's the bandwidth between the computer and I/O devices: Serial, Parallel, or USB. Another is the bandwidth between the main CPU, the math co-processor, and the graphics co-processor. Nowadays, we have the System on a Chip (SoC) architecture, where everything is contained in one die. So, the most restrictive bandwidth is the I/O bandwidth, hence Gigabit bandwidth.


Or is it? Remember that I/O stands for Input/Output. Which external device has the slowest data rate? It's not the hard disk, or network, or keyboard. The slowest most restrictive bandwidth remains ignored because it's invisible. It seems like I'm the only one in the world who works on it.


The slowest bandwidth of the system is the bandwidth between keyboard and chair!


Sunday, July 24, 2022

Essay06 The Core of Computer Programming (part 2)


The usage of IF-GOTO


In part one, I talked about the fundamental hardware necessary to enable IF-GOTO Conditional Branching. In part two, I will show the different ways IF-GOTO can be used. Notwithstanding the exhortation "GOTO considered harmful!", let's see how Conditional Branching is used in a computer program.


Remember that a computer program lies in Memory, and that it is in reality just a sequence of numbers in sequential memory addresses. Computers don't really distinguish code from data, unless the hardware stores each part separately. We're not concerned about that, though.


The simplest GOTO is Unconditional Branching. This is the case when the Conditional evaluates to a constant. In other words, either it is always True or always False. Or maybe the Conditional isn't even checked. It has very limited usage, mainly to connect to a subroutine somewhere else. Reasons may be to simplify the main code, to reach code beyond what a relative jump can cover, or to provide hooks that can be modified to point to updated code later. It can also be used to provide an Infinite Loop. Lastly, it can be abused to provide the "Spaghetti Code" style of coding.


Conditional Branching, therefore, implies variables in the Conditional. The result can be True or False, depending on the variables. There's a whole field of Boolean algebra regarding this, but let's not go there for now. The main consideration here is that we can skip Forward, or Loop back.


Skipping Forward is usually used to select between alternatives: IF Something THEN Do This ELSE Do That.


IF (TRUE) THEN DO THIS. 
IF (FALSE) THEN DO THIS.
IF (TRUE) THEN DO THIS ELSE DO THAT

IF (TRUE) THEN DO THIS
ELSE IF (TRUE) THEN DO THIS
ELSE IF (TRUE) THEN DO THIS
ELSE DO THAT


The second block is called an "IF-ELSE ladder", and you can chain them for quite a long chain. There is something similar to this structure: ON...GOTO


ON (var) GOTO 1,2,3,...,N
ON (var) GOSUB 1,2,3,...,N


In C, we use the switch () { case 1: ... default: ... } construct. The idea is that we execute different code depending on the value of a variable, instead of just TRUE/FALSE. Generally speaking, this is the concept of a Multiplexer/Demultiplexer.


Next, consider IF-GOTO to a preceding address. This is called a LOOP. The easiest form is the Infinite LOOP.


LOOP start:
...
...
GOTO start:


You can use the loop to repeat N times:


REPEAT
...
...
UNTIL N=10


In C, we use the do { } while () construct.


You can loop as long as a condition remains True


WHILE (cond)
...
...
WEND


In C, we use the while () { } construct.


There is also the FOR LOOP, but it's not interesting for this discussion. What is interesting are the keywords *break* and *continue*. CONTINUE is used to skip the rest of the loop and go back to the start immediately, while BREAK is used to break out of the loop immediately. You can think of them as GOTO Start and GOTO End.


LOOP Start:
...
IF (cond) GOTO End: #break
...
IF (cond) GOTO Start #continue
...
GOTO Start
End:  #End of loop


And that should be it. There is also the Spaghetti Code style, where you just go everywhere whenever you want. Such patterns are confusing and should be avoided whenever possible, but if you ever try to build a parser, maybe by using lex and yacc, then it can't be helped. State-based or Functional coding is like that, too.


Saturday, July 16, 2022

Essay05 The Core of Computer Programming (part 1)


A look at (simplified) computer system


The question is: What is the essence of computer programming? What is the core of computer programming? What is it that, if you take it away from the thing that is computer programming, will cause the thing to no longer be computer programming? My answer is: Conditional Branching (IF-GOTO).


This is a surprisingly deep issue. There are many considerations regarding this question. I had planned this to be just a single post, but having seen the whole, completed answer, I decided to split it into 2 parts. If you, like most people, think that the issue is a simple one, then I encourage you to stop at this point and work out your answer. Then continue reading, and compare our answers.


First, let me tell you what a computer is not. There is an old joke that when a prospective buyer asked a salesman what the computer could do, the salesman answered that the computer could do half the things the customer could do. "Great! I'll take two!" Ha ha. Well, computers don't work like that. Computers can only do what you tell them to do, no more, no less. And whatever a computer is told to do, it can do exceedingly fast, and with great accuracy.


Of course, fast is a relative term. Back in the old Apple II days, 1 MHz was fast, but not that fast. Yet, it was still much faster than Charles Babbage's Analytical Engine. Nowadays, we have fast computers, but are they really that fast? Newer programs are so bloated that a huge amount of computer resources is taken up, not by working on the problem, but by cute animations that aren't that useful, or even cute, for that matter. So much waste.


We don't really need compilers or programming languages, either. Computers are basically a collection of bits and bytes. Ones and Zeros. There's really not much more to it, until we get to Quantum mechanics. We do have the CPU, but what is the core difference between a programmable calculator and a computer? In simple terms, why is the Difference Engine a calculator, while the Analytical Engine is a computer?


The most fundamental principle that is uniquely a computer program is, in my view, Conditional Branching. The ability to change the path of computation is fundamental to computer programming. If you have everything else except conditional branching, then you don't have a computer. What you have, then, is batch processing. Imagine a robot that does a complicated dance, but repeatedly so, without any change. Is that a robot? Yes. Is that a computer program? I'd say no. It's a complex mechanical construct, but not a computer. If, say, a car-manufacturing robot can only be influenced by On and Off, then it's not a computer program, rather batch processing. If, however, the robot can sense a car being somehow incorrectly staged and alerts the shop foreman, then it is a computer program. A computer program reacts to changing conditions. A batch process cannot.


The simplest implementation of Conditional Branching that I can think of is IF-GOTO. This is true regardless of computer language. It can be Branch if Zero (BZ), Branch if Not Equal (BNE), IF-THEN-ELSE, WHILE/REPEAT LOOP, and other constructs. So, what does it mean, exactly that IF-GOTO exists?


Well, first of all, you need some kind of variables, and those variables must be comparable. Variables mean pointers. Pointers mean memory addresses to point to. Memory addresses can contain both code and data, but they are really just numbers. This means that a GOTO address can be used to call functions. Combine that with a data stack, and you can call functions recursively. You also need a CPU, because of the explosive combination between operations and data. A CPU means you need some kind of flip-flop clock, in order for the CPU to process each instruction cleanly.


There should also be some means of Input/Output available. That means we need a System Bus in order to shuffle data from Memory to CPU and vice versa. We also want the computer system to interact with the outside world. So, some kind of external data line, either serial or maybe GPIO, is in order. The interactive element can be a Keyboard, Mouse, or Telemetry signal receiver. Output can be a Display, a Teletype, or maybe just some speakers.


Along the way, we can optimize further with Cache Memory or a Co-Processor, but all things considered, the core of a computer program is IF-GOTO. The rest is just there to support that process.


Friday, July 15, 2022

Essay 04 Amateur vs Professional Quality


Does it really matter?


I have seen it too many times. In fact, I have hardly ever seen otherwise. My development environment, as well as my code style, is that of an amateur hobbyist. Other people, even beginner coders, would go professional from the start. The question I'm asking is: Why would you want to handicap yourself with professional-quality programs when a simple hobbyist tool suffices?


Do you always get a race car for your daily drive? Do you always get the comprehensive tool cabinet when a tool bag gets the job done? Do you always get a professional studio when a sketchbook is handier? Why would you want to insist that a 4 GB Visual Studio is the only IDE you use, when a simple text editor will do? 


This kind of argument gets really bad. Adobe Studio Tools are definitely top quality, but as professional tools, they are rather expensive; GIMP and Krita are reasonable choices. Microsoft Office is great, too, but I used to just get the cheaper Microsoft Works, since it got the job done just fine. Oracle's database may be the best in the world, but most people can use MySQL well enough. Why would I want to use GCC, with its long compile times, when TCC compiles near instantly?


As a hobbyist, coding should be fun. It doesn't matter that my hobbyist program compiled with TCC runs 3 times longer than professional code compiled with GCC. The difference between 1 second and 1/3 of a second isn't that much. The fact is, with TCC, I don't even bother doing the compilation step. There is an option to -run the source code directly! So, that is what I have been doing. It reminds me of the ease of use of Perl. Perl is even more convenient!


Are the resulting program any good? I like to think so. Clayton Christensen's The Innovator's Dilemma stated: "Generally, disruptive technologies underperform established products in mainstream markets. [snip] Products based on disruptive technologies are typically cheaper, simpler, smaller, and frequently, more convenient to use." The point is, you don't have to outperform the best in the world. There are other opportunities in the smaller, niche markets. If you can't be a big fish in a big pond, then try to be a big fish in a small pond and make the pond bigger!


There lies the best value of a hobbyist. We don't have to have the best of everything. We just have to have good enough tools. We don't have to be the smartest. We just have to be not stupid. We don't have to be the cheapest. We just have to provide good value. Coding? That's easy. Design? That's hard. The hard part of a journey isn't the making. The hard part of a journey is knowing where to go!


Let's talk about professional frameworks a bit. There are many names: Xtreme, Agile, Scrum, and others. For the most part, they deal with the same issues: how to handle communication with clients, how to track progress and manage milestones, how to manage personnel involved in the project, and a formal package for deliverables. There may be different philosophies and paradigms involved, but they mostly deal with the same issues.


Professionals need to deal with those issues. Not having a proper procedure to deal with them will negatively affect the client-producer relationship. So, following at least one methodology is absolutely crucial to the process.


Hobbyists, on the other hand, don't really need to do that. They are their own clients. If something goes wrong, they can just fix it themselves. They do their own program maintenance. They don't have to worry about the ignorant boss who claims that streamlining existing code is "a waste of time" even though it will save a lot of time in maintenance later. To that end, hobbyists have the advantage of not having to watch the clock, so to speak.


Expenses can be lower for hobbyists as well. Niche markets that big business would never dream of entering are fair game for hobbyists. It's simple math: less expense yields more profit, even if the overall revenue is less.


So, my coding is at the amateur hobbyist level. While most people consider that a handicap, I actually consider it an advantage! My code is cleaner, simpler, and easier to understand. It works fast enough, and well enough to do the job. Best of all, if there's something that is less than perfect, I just fix it. The source code is right there! It's easy to do because the code is simple and easy to understand. There's none of the complex framework involved as with professional-quality code. I should know; I've been both.


Following the lead of Satoru Iwata: My title may be business owner. My training is that of a computer programmer. But in my heart, I am an artist. I'm not a coder who does art. I am an artist who does code.



 


Thursday, July 14, 2022

Essay 3 C is misunderstood


Quirky, flawed, and tremendous success! -Dennis Ritchie


Imagine, if you will, that you are living in the 1970s, when computers were at the beginning of their development. You have a simple machine with a simple instruction set. And you want to program the computer. What will you do?


1. Rewire the circuit

2. Enter hex numbers, backwards

3. Enter decimal codes

4. Enter mnemonic assembly

5. Use FORTRAN

6. Use Forth

7. Use BASIC


Most programmers at the time used assembly language. Higher-level constructs were considered inefficient, to be used only by unskilled clerks. The assumption was that if you can say it in English, then you can say it in computerese. However, you do need to say it in restricted English. In other words, Pseudocode.


No one codes in Pseudocode, except Donald Knuth, and he's the only one to do so, with his Literate Programming. However, my point is simple: translating Pseudocode from English to a high-level computer language is so easy, anybody can do it!


Forth, a stack-based language, is unique in that it's simple to code, yet capable of powerful functionality. You don't see Forth nowadays, but its spirit lives on in PostScript, which underlies the Portable Document Format (PDF). Another simple language is LISt Processing (LISP) and its dialect cousin Scheme. You may see it in the Emacs scripting engine.


Of course, there are BASIC and LOGO, as well. Those are mostly limited to educational institutions. The thing about them is that they're easy to code. BASIC, famously, can easily fit in less than 8K. LOGO and more sophisticated BASICs can fit in a bit more memory. I have personally done a LISP-based interpreter in as little as 300 lines. Other students did it in about 600 lines. But I digress. The point is that creating a coding language was, by necessity, an exercise in small-scale expenditure. Computers of the day just weren't capable of meeting large memory footprint requirements.


So, then, the question that needed to be answered at the time was: "What is the most efficient way to program a computer, with the ease of a high-level language, yet capable of fast machine language execution speed?" And that was the goal of the project. It began with BCPL; then a simplified version of it, B, was conceived. Then it was improved into C.


C was never conceived to be a high-level language. It amuses me to no end that people nowadays call C a high-level language. Maybe C++, although that is debatable, too. You see, C++ was originally created as a precompiler that compiles to C. C++ now compiles directly to ASM, to take advantage of the language constructs and generate good optimization. However, at the time, C++ to C to ASM to ML was quite common. The reason was "ease of implementation."


If you have the hardware, you have Machine Language (ML). In order to ease the programming process, Assembly language (ASM) is used. The C compiler compiles to ASM, which is then assembled and linked into ML. It's trivial to write an assembler. It's not difficult to write a Tiny BASIC compiler, albeit, having done so, I can testify that Tiny BASIC has rather limited utility.


So, a more powerful language was desired, and the priority was to write something that is easy to write and compile. Hence, C leans on a set of standard libraries. About the only custom part is the stdio library, which is done with machine language. Most other libraries are actually written in C.


If you look at the libraries, most of them are really simple and easy to do. Once you have the core C compiler going, you can implement the rest really, really easily. Hence the nearly ubiquitous presence of the C compiler. A C compiler is relatively quick and easy to do, at least in principle, before the desire for a more powerful language comes into being.


I'm not the one to discuss the merits of various C language features; fortunately, Brian Kernighan is still alive and well, and I believe he is the de facto person to consult about such things. My point is: the C language is small and easily implementable across most systems, and therefore available on most computers.


There is a version of the C compiler called the Tiny C Compiler (TCC), which I actually use every day. I don't use the GNU C Compiler (GCC), which is what professional coders use. That's because it's really overkill for my use. I only use the smallest set of C language features, anyway. Why would I want to use GCC? GCC's compilation time is rather extensive!


And yet, simple as it is, C does feature a rather powerful set of capabilities. It bridges assembly language and high-level languages. C was actually classified as a "middle-level" language, yet quite powerful. There is a design called C--, which is an even simpler language, but it isn't popular. In short, C occupies the ideal "Goldilocks" zone of complexity and features.


That is my take on the C language. It's not the language that is the end of all languages; rather, it is the language that is the beginning of many languages. A whole lot of later-generation languages can trace their lineage to C. I already mentioned C++, but basically C -> C++ -> Java -> C# -> Rust is one such lineage. Another would be C -> Perl -> Raku. There are plenty more.


Should you learn C? I would advise it, but not if you're a beginner. For beginners, there are more accessible languages, such as BASIC, which is designed with beginners in mind; about the only difficulty is the lack of a unified dialect. The most popular BASIC dialect seems to be Microsoft QuickBasic, and QB64 seems to be a good implementation of it. The questions you should ask are: Will you be willing to learn Assembly programming? Do you want to code in as many languages as possible? Do you want to write programs on various systems and hardware? If yes, then learning C is a great way to get started!


As Dennis Ritchie himself pointed out: C is quirky and flawed, but it's also a tremendous success. Yes, the standard library isn't the greatest in the world, but it's not supposed to be. It's supposed to be simple and easily implementable. If you need more, then you should code your own libraries. C makes that easy.


As a note: the Tiny BASIC specification features a FOR loop. I'd say that's a mistake. A WHILE loop is easily implementable, as is a REPEAT loop. If you add *continue* and *break*, then you can even use an infinite loop! GOTO is also useful, albeit easily misused. The answer is simple: Structured GOTO.


Wednesday, July 13, 2022

Essay 2 A "Mechanic" Computer Programmer

Not insulting Mechanical Mechanics


My frame of reference for this piece is the magic circle, by which I mean magicians, whether performing on stage or on the street. If you want to know the secrets of magicians, all you have to do is go to a magic store, and there you will find gadgets galore, all ready to fulfill your magical needs.


There are tricks, however, that are unachievable by magical gimmicks alone. Certain magic tricks require the magician to perform with skill, either misdirection or hidden manipulation. It can take years to refine those skills, and it's certainly no easy feat to become an accomplished magician.


However, there are magicians who have decided that skills are beyond them, and they depend on gimmicks for their performance. The term bandied about is "mechanic", which, as you may surmise, reflects a rather dim view of the quality of their magic. Yes, they are magicians, but not high-level, or even medium-level, ones. In short, "mechanic" magicians are mediocre magicians.


Those magicians usually get hired for children's parties. Children, being inexperienced in the ways of the world, tend to be immune to misdirection. Only the greatest magicians can pull off successful misdirection with children, and most magicians who depend on these children's parties wisely choose mechanical magic for their entertainment. Furthermore, if you want to entice these children to become magicians in the future, you would be well advised to keep a few of these gimmicks around, to inspire them with the wonders of magic.


However, they should at least keep trying to attain higher skill levels. Those who don't generally receive well-deserved scorn for their performance. There's nothing new in their performance. No need for them to divulge their magical secrets; just go to the store and buy their tricks! No real value to their craft.


Now, imagine that you are a computer programmer. You have learned so many things. Maybe Java or C#. PHP or Python. HTML, CSS, and JavaScript? Sure. You learn so many frameworks and paradigms. Pair Programming, Agile, and Patterns? Why not? Why not use the most powerful tools available? After all, professionals use them, so why not give yourself the same advantage those professionals have? Why handicap yourself with primitive ones?


The thing is, I don't think primitive tools are handicaps. I think powerful tools are crutches. They allow you to do great things with their "black box" solutions, so to speak. There is a difference between knowing how to do things and using code libraries to do them, and using code libraries without understanding what they do, or how they do it.


Somebody asked me once, what does it matter whether people use a code library if they can recreate the library from scratch? "Nothing," I answered, "but the problem is that they cannot recreate even the simplest task!" That is the problem. It's not that using a code library is a problem; it's the lack of ability to be independent of it that is the problem.


Nowadays, such questions aren't even asked anymore. Whenever you see a computer programming course, you see a ton of these technology names spread around with no regard for how they're made. As more and more of these courses ignore the underlying technologies, the myth of the difficulty of computer programming becomes more and more exaggerated. Nowadays, people have no problem claiming that Turtle graphics is an absolute impossibility to teach to students. We're talking about Forward, Turn Left, Turn Right, and PenUp/PenDown! It's so simple that I've done it in 10 minutes. Somehow, their inability to code Turtle graphics becomes "nobody can code Turtle graphics easily, and it takes years to do"!


Somebody actually told me, with a straight face, that "include FizzBuzz" is a perfectly acceptable solution to the problem. And don't tell me about the numerous "professional" programmers who would be happy to remove recursion from Quicksort() by using a Stack() routine! Uh-huh. They just turn an implicit stack into an explicit one. How is that any better?


I was 13 years old when I independently learned how to use sin() and cos() to draw a circle. I then proceeded to write Spirograph-type programs. It wasn't until much later, in the latter part of my high school senior year in AP Calculus class, that the teacher taught the exact same subject. Needless to say, I scored 100%. The thing is, it seems like nobody else thought it was easy, judging by the expressions on their faces and the way they looked at me. And these were the smart kids. I'd have thought that maybe lots of them would get it as well, but apparently not.


I think the ability to learn on your own is more important than the ability to learn from a teacher in a classroom, or to collect as many certifications as possible. It takes independent thinking, and that takes guts, as well as the stubbornness to dissect pre-existing code libraries and try to figure out how they do things. It takes curiosity, problem-solving skills, and the willingness to suffer the inevitable frustration as you try to decipher professional code.


I don't think I want to suffer the pain of deciphering professional code. The few times I did, I ended up reducing the code size, usually to 25% of the original. I think most professional coders are Rube Goldberg practitioners. I have no patience for that. What usually happens of late is that I figure out the API, which tells me the functions and capabilities of the library, and then I figure out the appropriate Data Structures involved. After that, the code usually writes itself.


The point I'm making is that if I can be independent of Other People's Code, then I will do so. There's nothing more assuring than knowing that the code you have is the most fool-proof, correct code possible, because you wrote it yourself. After all, who would want to dedicate the most time to writing correct code for you, if not yourself? Nobody, that's who!


You can always ignore this advice and constrain yourself to a programming language and its libraries. Sure, you can do that. But then you will feel the same pain I felt when I moved away from a Linux/Perl solution to the Windows environment! When I decided to rebuild everything from scratch, I found out, to my dismay, that most of my code really relied on OS calls, and that I needed to rebuild most of the OS from scratch! Perusing GNU source code didn't help. Did I say that it's complicated? It's complicated! So, it was really a pain!


Nowadays, I'm using a Raspberry Pi, and that's Debian technology. So, that's fine. But I have already moved away from Perl. I'm building my own set of libraries, and I don't regret the extended amount of time I spend building them. After all, I only have to do it _once._


So, you can try your best to study, really study, the whole spectrum of computer programming. Or you can be a mechanic. Your choice.


Tuesday, July 12, 2022

Essay 02: Why do we code?



Why do we code? I don't know. I mean, I know why I code, but why would anyone else want to learn to code? I think the best answer came from Steve Jobs: "I think everyone in this country should learn to code, because it teaches you how to think." And what a fine answer it is!


Unfortunately, the Apple Macintosh, in fact, did not come with a built-in programming language. You had to buy an Apple Lisa system, at $10,000, to write a program for the Macintosh. Great user interface, but programming for it is rather a painful experience!


So, here are some ideas on why you should learn to code, and my reaction to them:


1. Logical Thinking. Also known as left-brain function. And yet, I have seen computer programmers use their right-brain side. Rather emotional. Artists tend to be that way. Steve Jobs, for example, did not do any heavy-duty coding. Sure, he knew how to code, but he didn't do it professionally. He regarded himself as a visionary, and that requires more than logical thinking. It requires imagination and intuition. Also, people can and do develop logical thinking without learning computer programming. The ability to reason isn't the exclusive domain of computer programmers. Chess players, for example, can do it, too.


2. Creativity. I doubt that. I think people are either creative or not. The creative ones can and do do well in computer programming. However, I have seen too many people just use other people's code, without thinking whatsoever. Only on the bleeding edge of technology is creativity rewarded, and that area of research isn't too common. As far as art is concerned, there are Creative Coding and Generative Art, but those are rather rare.


3. Resilience and Perseverance. The ability to weather hard times, so to speak. I find that hard to believe. As soon as some people see code, their minds go blank, and they never pick up coding. Those who stick with it will be amply rewarded, sure, but so is sticking with Karate to develop discipline, so to speak. Just as creativity is a talent, so is perseverance.


4. Communication skills. With this I agree completely. Many people say that in order to program a computer, you need to be good at math. The truth is, you don't. However, you do need good communication skills. The ability to articulate your thoughts, put them down on paper, and code them into the computer is an indispensable skill, and you have to have it to be a successful coder. Math? Did you know that coding used to be thought of as a "clerical" job?


5. Analytical Problem Solving. This is also true. The biggest roadblock people have in computer programming is the inability to properly recognize the problem! Defining a problem is a very necessary skill, and not too many people have it. Then, you need to be able to decide on how to solve the problem. So, yes, this is a skill and not a talent.


6. Coding is fun. Well, no. Creating a computer program from nothing is a worthy endeavor and should be encouraged. But it is not always a fun activity. Too many people have trouble debugging their programs, and what if there is no help available? It can be a maddening experience, indeed! Tearing your hair out in frustration is a rather common experience, I'm afraid.


7. Learning how to learn. Having done it, I can categorically say that learning how to learn and computer programming are not related to each other. That is to say, if you learn how to learn and can pick up a new programming language quickly, then you're well positioned to progress as a computer programmer. However, just because a lot of successful computer programmers can do so, it doesn't mean that the skill is guaranteed. Those are two different things.


8. Money. This is also not true. I see too many job offers requiring so many years of experience on a particular platform. Also, new platforms keep coming up every year. What was good won't stay good. Programming languages come and go. About the only steady job that I see is Database System Administrator. You don't need computer programming for that. Just learn SQL and some system administration skills. That requires certification, not boot camp.


9. Prestige. All I can think of about this is "EA Spouse", the deservedly lambasted practice of treating computer programmers shabbily. Only to be repeated at Rockstar Games, which forced its computer programmers to work a punishing 100 hours per week! Only to be let go after the game was done. All it does is burn out a lot of computer programmers. I don't see this being different anytime soon.


10. Be smarter. Not at all. There is this rather popular process of testing computer programmers with "LeetCode", that is, asking the job applicant to code a solution to a programming problem. The problem with that practice is that the applicant cannot use a computer or look anything up, whether in books or anywhere else, including the web! So, obviously, the most successful applicants are the ones who have memorized as many of these problems as possible. I have seen the question compilations, and most of them are totally worthless! Either applicants can solve a problem because they've seen it before, or they can't because they haven't. All it does is gatekeep the really smart programmers (never memorize what you can look up -Einstein, Feynman, and others), and value memorization rather than smarts.