Sunday, July 31, 2022

Essay 07 A 3G Computer Specification


Minimum Specification Computer


It was some time ago that the Raspberry Pi Foundation came up with the Raspberry Pi Zero. That's a bubble-gum-sized computer, compared to the mint-tin-sized one. I think it's cute, and of course I bought one. The specification, however, falls short of my vision of an ideal computer.


Of course, it would be nice to have everything, but what I'm really aiming for is a cheap computer that does good enough computing. I still keep one to use as a daily driver when travelling. Raspberry Pi Zero is such a cute and handy computer!


Its performance, however, is somewhat lacking. That reminds me of the story of the 3M computer specification, and of how Steve Jobs first heard the term:


 https://en.wikipedia.org/wiki/3M_computer  


which became the driving design of the NeXT Computer: a megabyte of memory, a megapixel display, and a million instructions per second of processing power. It is also commonly said that the price should be less than $10,000.


When I defined the specification of a simple yet usable computer, I came up with a 3G specification: 1 GHz CPU, 1 GB RAM, and 1 Gb bandwidth. The price should be less than $100. The Raspberry Pi Zero didn't quite measure up to it: 700 MHz CPU, 512 MB RAM, and USB2 bandwidth. So, I'd say about half my ideal specification, except the price, which was $10 with WiFi.


The Raspberry Pi 4, which I bought upon release as soon as I was able, in many ways exceeds the specification by about four times: four CPU cores at 1 GHz or faster, 4 GB RAM, and USB3 bandwidth. The price is $55, quite reasonable for the power and performance. So powerful, in fact, that the only applications that could possibly max it out would be 3D rendering and video editing. It mostly just sits idle doing nothing.


The Invisible Bandwidth


Notice the difference between the specifications: the original 3M specification counted MIPS as an important factor, Steve Jobs put in display resolution as the important factor, and I put in bandwidth.


The reason why I don't really care about MIPS and Display is because I don't really use them. I mostly deal with integer calculation, and as far as display resolution is concerned, I'm still stuck with 800x600 resolution. In fact, I regularly set the font to large size, so that I don't have to squint at the display. Yes, I have a large HDMI monitor. No, I'm not a member of Tiny Font Society!


I set all my fonts and menu bars to a large size, and I'm really dismayed that most programs, including games and productivity tools, regularly set the font to an extremely tiny size. Do you all have perfect vision, or do you just not care? I think it's mostly the latter, but I really do hate tiny fonts.


As far as the bandwidth specification is concerned, I was clued in to it by the difference between the Personal Computer (PC) and the mainframe computer. I like to deal with large amounts of data, and the PC works well enough, but on the mainframe I noticed that the processing went by at a much faster rate than the CPU speed would indicate. This piqued my interest. As I pored over the literature for the difference, I noticed that the bandwidth is much larger on the mainframe. There was the answer!


Actually, that's not the whole story. There are several bandwidths involved. The obvious one is the system bus, for data transfer between CPU and RAM. Then there's the bandwidth between the computer and its I/O devices: serial, parallel, or USB. Yet another is the bandwidth between the main CPU, the math co-processor, and the graphics co-processor. Nowadays we have the System on a Chip (SoC) architecture, where everything is contained in one die. So the most restrictive bandwidth is the I/O bandwidth, hence gigabit bandwidth.


Or is it? Remember that I/O stands for Input/Output. Which external device has the slowest data rate? It's not the hard disk, or network, or keyboard. The slowest most restrictive bandwidth remains ignored because it's invisible. It seems like I'm the only one in the world who works on it.


The slowest bandwidth of the system is the bandwidth between keyboard and chair!


Sunday, July 24, 2022

Essay 06 The Core of Computer Programming (part 2)


The usage of IF-GOTO


In part one, I talked about the fundamental hardware necessary to enable IF-GOTO Conditional Branching. In part two, I will show the different ways IF-GOTO can be used. Notwithstanding the exhortation "GOTO considered harmful!", let's see how Conditional Branching is used in a computer program.


Remember that a computer program lies in memory, and is in reality just a sequence of numbers at sequential memory addresses. Computers don't really distinguish code from data, unless the hardware stores each part separately. We're not concerned about that, though.


The simplest GOTO is Unconditional Branching. This is the case when the conditional evaluates a constant: either it is always True or always False, or maybe the condition isn't even checked. It has very limited usage, mainly to connect to a subroutine somewhere else. The reason may be to simplify the main code, to reach code beyond what a relative jump can cover, or to provide hooks that can later be modified to point to updated code. It can also be used to provide an infinite loop. Lastly, it can be abused to produce "Spaghetti Code".


Conditional Branching, therefore, implies variables in the condition. The result can be True or False, depending on the variables. There's a whole section of Boolean algebra regarding this, but let's not go there for now. The main consideration here is that we can skip forward, or loop back.


Skipping forward is usually used to provide selection: IF Something THEN Do This ELSE Do That.


IF (TRUE) THEN DO THIS. 
IF (FALSE) THEN DO THIS.
IF (TRUE) THEN DO THIS ELSE DO THAT

IF (TRUE) THEN DO THIS
ELSE IF (TRUE) THEN DO THIS
ELSE IF (TRUE) THEN DO THIS
ELSE DO THAT


The second example is called an "IF-ELSE" ladder, and you can chain it into quite a long chain. There is a similar construct: ON...GOTO.


ON (var) GOTO 1,2,3,...,N
ON (var) GOSUB 1,2,3,...,N


In C, we use the switch () { case 1: ... default: ... } construct. The idea is that we execute different code depending on the value of a variable, instead of just TRUE/FALSE. Generally speaking, this is the concept of a multiplexer/demultiplexer.


Next, consider an IF-GOTO to a preceding address. This is called a LOOP. The easiest is the infinite loop.


LOOP start:
...
...
GOTO start:


You can use the loop to repeat N times:


REPEAT
...
...
UNTIL N=10


In C, we use do { } while () construct.


You can loop as long as a condition remains True:


WHILE (cond)
...
...
WEND


In C, we use while () { } construct.


There is also the FOR loop, but it's not interesting for this discussion. What is interesting is the keywords *break* and *continue*. CONTINUE is used to skip the rest of the loop body and go back to the start immediately, while BREAK is used to break out of the loop immediately. You can think of them as GOTO Start and GOTO End.


LOOP Start:
...
IF (cond) GOTO End: #break
...
IF (cond) GOTO Start #continue
...
GOTO Start
End:  #End of loop


And that should be it. There is also the Spaghetti Code style, where you just go everywhere whenever you want. Such patterns are confusing and should be avoided whenever possible, but if you ever try to build a parser, maybe by using lex and yacc, then it can't be helped. State-machine or functional coding is like that, too.


Saturday, July 16, 2022

Essay 05 The Core of Computer Programming (part 1)


A look at (simplified) computer system


The question is: What is the essence of computer programming? What is the core of computer programming? What is the thing that, if you take it away from computer programming, causes it to no longer be computer programming? My answer is: Conditional Branching (IF-GOTO).


This is a surprisingly deep issue. There are many considerations regarding this question. I had planned this to be just a single post, but having seen the whole, completed answer, I decided to split it into 2 parts. If you, like most people, think that the issue is a simple one, then I encourage you to stop at this point and work out your answer. Then continue reading, and compare our answers.


First, let me tell you what a computer is not. There is an old joke: when a prospective buyer asked a salesman what the computer could do, the salesman answered that the computer could do half the things the customer could do. "Great! I'll take two!" Ha ha. Well, computers don't work like that. Computers can only do what you tell them to do, no more, no less. And whatever a computer is told to do, it can do exceedingly fast, and with great accuracy.


Of course, fast is a relative term. Back in the old Apple II days, 1 MHz was fast, but not that fast. Yet it was still much faster than Charles Babbage's Analytical Engine. Nowadays we have fast computers, but are they really that fast? Newer programs are such bloat that a huge amount of computer resources is spent, not on working on the problem, but on cute animations that aren't that useful, or even cute, for that matter. So much waste.


We don't really need compilers or programming languages, either. Computers are basically a collection of bits and bytes, ones and zeros. There's really not much more to it, until we get to quantum mechanics. We do have the CPU, but what is the core difference between a programmable calculator and a computer? In simple terms, why is the Difference Engine a calculator, while the Analytical Engine is a computer?


The most fundamental principle that is uniquely a computer program is, in my view, Conditional Branching. The ability to change the path of computation is fundamental to computer programming. If you have everything else except conditional branching, then you don't have a computer; what you have is batch processing. Imagine a robot that does a complicated dance, but repeatedly, without any change. Is that a robot? Yes. Is that a computer program? I'd say no. It's a complex mechanical construct, but not a computer. If, say, a car-manufacturing robot can only be influenced by On and Off, then it's not running a computer program, but a batch process. If, however, the robot can sense a car being somehow incorrectly staged and alert the shop foreman, then it is running a computer program. A computer program reacts to changing conditions. A batch process cannot.


The simplest implementation of Conditional Branching that I can think of is IF-GOTO. This is true regardless of computer language. It can be Branch if Zero (BZ), Branch if Not Equal (BNE), IF-THEN-ELSE, WHILE/REPEAT loops, and other constructs. So what, exactly, does the existence of IF-GOTO imply?


Well, first of all, you need some kind of variables, and those variables must be comparable. Variables mean pointers. Pointers mean memory addresses to point to. Memory addresses can contain both code and data, but they are really just numbers. This means that GOTO-to-an-address can be used to call functions. Combine that with a data stack, and you can call functions recursively. You also need a CPU to bring operations and data together. A CPU means you need some kind of flip-flop clock, in order for the CPU to process each instruction cleanly.


There should also be some means of Input/Output available. That means we need a system bus to shuttle data between memory and CPU and vice versa. We also want the computer system to interact with the outside world, so some kind of external data line, either serial or maybe GPIO, is in order. The interactive element can be a keyboard, a mouse, or a telemetry signal receiver. Output can be a display, a teletype, or maybe just some speakers.


Along the way we can optimize further with cache memory or a co-processor, but all things considered, the core of a computer program is IF-GOTO. The rest is just there to support that process.


Friday, July 15, 2022

Essay 04 Amateur vs Professional Quality


Does it really matter?


I have seen it too many times. In fact, I have hardly ever seen otherwise. My development environment, as well as my code style, is that of an amateur hobbyist. Other people, even beginner coders, go professional from the start. The question I'm asking is: Why would you handicap yourself with the suffering of professional-quality programs when a simple hobbyist tool suffices?


Do you always get a race car for your daily drive? Do you always get the comprehensive tool cabinet when a tool bag gets the job done? Do you always get a professional studio when a sketchbook is handier? Why would you want to insist that a 4 GB Visual Studio is the only IDE you use, when a simple text editor will do? 


This kind of argument gets really bad. Adobe's studio tools are definitely top quality, but as professional tools they are rather expensive; GIMP and Krita are reasonable choices. Microsoft Office is great, too, but I used to just get the cheaper Microsoft Works, since it got the job done just fine. Oracle's database is the best in the world, but most people can use MySQL well enough. Why would I want to use GCC, with its long compile times, when TCC compiles near instantly?


As a hobbyist, coding should be fun. It doesn't matter that my hobbyist program compiled with TCC runs three times longer than professional code compiled with GCC. The difference between 1 second and 1/3 of a second isn't that much. The fact is, with TCC I don't even bother with the compilation step: there is an option to -run the source code directly! So that is what I have been doing. It reminds me of the ease of use of Perl. Perl is even more convenient!


Are the resulting programs any good? I like to think so. Clayton Christensen's The Innovator's Dilemma states: "Generally, disruptive technologies underperform established products in mainstream markets. [snip] Products based on disruptive technologies are typically cheaper, simpler, smaller, and frequently, more convenient to use." The point is, you don't have to outperform the best in the world. There are other opportunities in smaller, niche markets. If you can't be a big fish in a big pond, then try to be a big fish in a small pond and make the pond bigger!


There lies the best value of a hobbyist. We don't have to have the best of everything; we just have to have good-enough tools. We don't have to be the smartest; we just have to be not stupid. We don't have to be the cheapest; we just have to provide good value. Coding? That's easy. Design? That's hard. The hard part of a journey isn't the making of it. The hard part of a journey is knowing where to go!


Let's talk about professional frameworks a bit. There are many names: Xtreme, Agile, Scrum, and others. For the most part, they deal with the same issues: how to handle communication with clients, how to track progress and manage milestones, how to manage personnel involved in the project, and a formal package for deliverables. There may be different philosophies and paradigms involved, but they mostly deal with the same issues.


Professionals need to deal with those issues. Not having a proper procedure for them will negatively affect the client-producer relationship. So, following at least one methodology is absolutely crucial to the process.


Hobbyists, on the other hand, don't really need to do that. They are their own clients. If something goes wrong, they can just fix it themselves. They do their own program maintenance. They don't have to worry about the ignorant boss who claims that streamlining existing code is "a waste of time" even though it will save a lot of time in maintenance later. To that end, hobbyists have the advantage of not having to watch the clock, so to speak.


Expenses can be lower for hobbyists as well. Niche markets that big business would never dream of entering are fair game for hobbyists. It's simple math: less expense yields more profit, even if the overall revenue is less.


So, my coding is at the amateur hobbyist level. While most people consider that a handicap, I actually consider it an advantage! My code is cleaner, simpler, and easier to understand. It works fast enough, and well enough to do the job. Best of all, if there's something that is less than perfect, I just fix it. The source code is right there! It's easy to do because the code is simple and easy to understand. There's none of the complex framework involved as with professional-quality code. I should know; I've been both.


Following the lead of Satoru Iwata: My title may be business owner. My training is that of a computer programmer. But in my heart, I am an artist. I'm not a coder who does art. I am an artist who does code.



 


Thursday, July 14, 2022

Essay 3 C is misunderstood


Quirky, flawed, and a tremendous success! - Dennis Ritchie


Imagine, if you will, that you are living in the 1970s, when computers were at the beginning of their creation. You have a simple machine with a simple instruction set, and you want to program the computer. What would you do?


1. Rewire the circuit

2. Enter hex numbers, backwards

3. Enter decimal codes

4. Enter mnemonic assembly

5. Use FORTRAN

6. Use Forth

7. Use BASIC


Most programmers at the time used assembly language. Higher-level constructs were considered inefficient, to be used only by unskilled clerks. The assumption was that if you can say it in English, then you can say it in computerese. However, you do need to say it in restricted English. In other words, pseudocode.


No one codes in pseudocode, except Donald Knuth; he's the only one to do so, with his Literate Programming. However, my point is simple: translating pseudocode from English to a high-level computer language is so easy, anybody can do it!


Forth, a stack-based language, is unique in that it's simple to code, yet capable of powerful functionality. You don't see Forth nowadays, but its spirit lives on in PostScript, the ancestor of the Portable Document Format (PDF). Another simple language is LISt Processing (LISP), along with its dialect Scheme. You may see it in Emacs's scripting engine.


Of course, there are BASIC and LOGO as well. Those are mostly limited to educational institutions. The thing about them is that they're easy to implement. BASIC, famously, can easily fit in less than 8K; LOGO and more sophisticated BASICs fit in a bit more memory. I have personally done a LISP-based interpreter in as little as 300 lines. Other students did it in about 600 lines. But I digress. The point is that creating a coding language was, by necessity, an exercise in small-scale expenditure. Computers of the day just weren't capable of meeting large memory-footprint requirements.


So the question that needed to be answered at the time was: "What is the most efficient way to program a computer that has the ease of a high-level language, yet is capable of fast machine-language execution speed?" That was the goal of the project. It began with BCPL; then a simplified version of it, B, was conceived, which was then improved into C.


C was never conceived to be a high-level language. It amuses me to no end that people nowadays call C a high-level language. Maybe C++, although that is debatable, too. You see, C++ was originally created as a precompiler that compiled to C. C++ now compiles directly to assembly, to take advantage of the language constructs and generate good optimization. However, at the time, C++ to C to ASM to ML was quite common. The reason is "ease of implementation."


If you have the hardware, you have Machine Language (ML). In order to ease the programming process, Assembly language (ASM) is used. C compiler compiles to ASM, which is then compiled and linked to ML. It's trivial to write an assembler. It's not difficult to write a Tiny BASIC compiler, albeit, having done so, I can testify that Tiny BASIC has a rather limited utility.


So a more powerful language was desired, and the priority was to write something easy to write and compile. Hence, C leans on its standard libraries. About the only one that needs machine-specific code is the stdio library; most of the other libraries are themselves written in C.


If you look at the libraries, most of them are really simple and easy to do. Once you have the core C compiler going, you can implement the rest really, really easily. Hence the nearly ubiquitous presence of C compilers. A C compiler is relatively quick and easy to build, at least in principle, before the desire for a more powerful language comes into being.


I'm not the one to discuss the merits of various C language features; fortunately, Brian Kernighan is still alive and well, and I believe he is the de facto person to consult about such things. My point is: the C language is small and easily implementable across most systems, and therefore available on most computers.


There is a version of the C compiler called the Tiny C Compiler (TCC), which I actually use every day. I don't use the GNU C Compiler (GCC), which is what professional coders use, because it's really overkill for my purposes. I only use the smallest set of C language features anyway. Why would I want to use GCC? GCC's compilation time is rather extensive!


And yet, simple as it is, C does feature a rather powerful set of capabilities. It bridges assembly language and high-level languages; C was actually classified as a "middle-level" language, yet it is quite powerful. There is also a design called C--, an even simpler language set, but it isn't popular. In short, C occupies the ideal "Goldilocks" zone of complexity and features.


That is my take on the C language. It's not the language that is the end of all languages; rather, it is the language that is the beginning of all languages. A whole lot of later-generation languages trace their lineage to C. I already mentioned C++, but basically C -> C++ -> Java -> C# -> Rust is one such lineage. Another would be C -> Perl -> Raku. There are plenty more.


Should you learn C? I would advise it, but not if you're a beginner. For beginners, there are more accessible languages such as BASIC, which was designed with beginners in mind; about the only difficulty is the lack of a unified dialect. The most popular BASIC dialect seems to be Microsoft QuickBASIC, and QB64 seems to be a good implementation of it. The questions you should ask are: Are you willing to learn assembly programming? Do you want to code in as many languages as possible? Do you want to write programs on various systems and hardware? If yes, then learning C is a great way to get started!


As Dennis Ritchie himself pointed out: C is quirky and flawed, but it's also tremendously successful. Yes, the standard library isn't the greatest in the world, but it's not supposed to be. It's supposed to be simple and easily implementable. If you need more, then you should code your own libraries. C makes that easy.


As a note: the Tiny BASIC specification features a FOR loop. I'd say that's a mistake. A WHILE loop is easily implementable, as is a REPEAT loop. If you add *continue* and *break*, then you can even use an infinite loop! GOTO is also useful, albeit easily misused. The answer is simple: structured GOTO.


Wednesday, July 13, 2022

Essay 2 A "Mechanic" Computer Programmer


Not insulting Mechanical Mechanics


My frame of reference for this piece is that of the magic circle, by which I mean magicians, performing either on stage or on the street. If you want to know the secrets of magicians, all you have to do is go to a magic store, and there you will find gadgets galore, ready to fulfill all your magical needs.


There are tricks, however, that are unachievable by magical gimmicks alone. Certain magic tricks require the magician to perform with skill, either misdirection or hidden manipulation. It can take years to refine those skills, and it's certainly no easy feat to become an accomplished magician.


However, there are magicians who have decided that skills are beyond them, and they depend on gimmicks for their performance. The term bandied about is "mechanic", which, as you may surmise, reflects a rather dim view of the quality of their magic. Yes, they are magicians, but not high-level, or even medium-level, ones. In short, "mechanic" magicians are mediocre magicians.


Those magicians usually get hired for children's parties. Children, being inexperienced in the ways of the world, tend to be immune to misdirection. Only the greatest magicians can pull off successful misdirection with children, and most magicians who depend on children's parties wisely choose mechanical magic for their entertainment. Furthermore, if you want to entice these children to become magicians in the future, you would be well advised to keep a few of these gimmicks around, to inspire them with the wonders of magic.


However, they should at least keep trying to attain higher skill levels. Those who don't generally receive well-deserved scorn for their performance. There's nothing new in it, no magical secret for them to divulge; just go to the store and buy their tricks! No real value to their craft.


Now, imagine that you are a computer programmer. You have learned so many things. Maybe Java or C#. PHP or Python. HTML, CSS, and JavaScript? Sure. You learn so many frameworks and paradigms. Pair Programming, Agile, and Patterns? Why not? Why not use the most powerful tools available? After all, professionals use them, so why not give yourself the same advantage those professionals have? Why handicap yourself with primitive ones?


The thing is, I don't think primitive tools are handicaps. I think powerful tools are crutches. They allow you to do great things with their "black box" solutions, so to speak. There is a difference between knowing how to do things and using code libraries to do them, and using code libraries without understanding what they do, or how they do it.


Somebody asked me once: why does it matter whether people use a code library, if they can recreate the library from scratch? "Nothing," I answered, "but the problem is that they cannot recreate even the simplest task!" That is the problem. It's not that using a code library is a problem; it's the inability to be independent of it that is the problem.


Nowadays, such questions aren't even asked anymore. Whenever you see a computer programming course, you see a ton of technology names spread around with no regard for how they're made. As more and more of these courses ignore the underlying technologies, the myth of the difficulty of computer programming becomes more and more exaggerated. Nowadays, people have no problem claiming that turtle graphics is an absolute impossibility to teach to students. We're talking about Forward, Turn Left, Turn Right, and PenUp/PenDown! It's so simple that I've done it in 10 minutes. Somehow, their inability to code turtle graphics becomes "nobody can code turtle graphics easily, and it takes years to do"!


Somebody actually told me, with a straight face, that "include FizzBuzz" is a perfectly acceptable solution to the problem. And don't tell me about the numerous "professional" programmers who would be happy to remove recursion from Quicksort() by using a Stack() routine. Uh-huh. They just turn an implicit stack into an explicit one. How is that any better?


I was 13 years old when I independently learned how to use sin() and cos() to draw a circle. I then proceeded to write Spirograph-type programs. It wasn't until much later, in the AP Calculus class of my high school senior year, that a teacher taught the exact same subject. Needless to say, I scored 100%. The thing is, it seems like nobody else thought it was easy, judging by the expressions on their faces and the way they looked at me. And these were the smart kids. I'd have thought that maybe lots of them would get it as well, but apparently not.


I think the ability to learn on your own is more important than the ability to learn from a teacher in a classroom, or to collect as many certifications as possible. It takes independent thinking, and that takes guts, as well as stubbornness, to dissect pre-existing code libraries and figure out how they do things. It takes curiosity, problem-solving skills, and the willingness to suffer the inevitable frustration as you decipher professional code.


I don't think I want to suffer the pain of deciphering professional code. The few times I did, I ended up reducing the code size, usually to 25% of the original. I think most professional coders are Rube Goldberg practitioners. I have no patience for that. What usually happens of late is that I figure out the API, which tells me the functions and capabilities of the library, then figure out the appropriate data structures involved. After that, the code usually writes itself.


The point I'm making is that if I can be independent from Other People's Code, then I will be. There's nothing more assuring than knowing your code is the most fool-proof, correct code possible, because you wrote it yourself. After all, who would dedicate the most time to writing correct code for you, if not yourself? Nobody, that's who!


You can always ignore this advice and constrain yourself to one programming language and its libraries. Sure, you can do that. But then you will feel the same pain I felt when I moved away from a Linux/Perl solution to the Windows environment! When I decided to rebuild everything from scratch, I found out, to my dismay, that most of my code really relied on OS calls, and that I needed to rebuild most of the OS from scratch! Perusing the GNU source code didn't help. Did I say that it's complicated? It's complicated! So it was really a pain!


Nowadays I'm using a Raspberry Pi, and that's Debian technology, so that's fine. But I have already moved away from Perl. I'm building my own set of libraries instead, and I don't regret the extended amount of time I've spent building them. After all, I only have to do it _once._


So, you can try your best to study, really study, the whole spectrum of computer programming. Or you can be a mechanic. Your choice.


Tuesday, July 12, 2022

Essay 02: Why do we code?



Why do we code? I don't know. I mean, I know why I code, but why would anyone else want to learn to code? I think the best answer came from Steve Jobs: "I think everyone in this country should learn to code, because it teaches you how to think." And what a fine answer it is!


Unfortunately, the Apple Macintosh did not, in fact, come with a built-in programming language. You had to buy an Apple Lisa system, at $10,000, to write a program for the Macintosh. Great user interface, but programming for it was rather a painful experience!


So, here are some ideas on why you should learn to code, and my reaction to them:


1. Logical Thinking. Also known as left-brain function. And yet I have seen computer programmers use their right-brain side. Rather emotional. Artists tend to be that way. Steve Jobs, for example, did not do any heavy-duty coding. Sure, he knew how to code, but he didn't do it professionally. He saw himself as a visionary, and that requires more than logical thinking; it requires imagination and intuition. As well, people can and do develop logical thinking without learning computer programming. The ability to reason isn't the exclusive domain of computer programmers. Chess players, for example, can do it, too.


2. Creativity. I doubt that. I think people are either creative or not. The creative ones can and do do well in computer programming. However, I have seen too many people just use other people's code, without thinking whatsoever. Only on the bleeding edge of technology is creativity rewarded, and that area of research isn't too common. As far as art is concerned, there are Creative Coding and Generative Art, but those are rather rare.


3. Resilience and Perseverance. The ability to weather hard times, so to speak. I find that hard to believe. As soon as some people see code, their minds go blank, and they never pick up coding. Those who stick with it are amply rewarded, sure, but so is sticking with Karate to develop discipline. Just as creativity is a talent, so is perseverance.


4. Communication skill. With this I agree completely. Many people say that in order to program a computer, you need to be good at math. The truth is, you don't. However, you do need good communication skills. The ability to articulate your thoughts, put them down on paper, and code them into the computer is an indispensable skill, and you have to have it to be a successful coder. Math? Did you know that coding used to be thought of as a "clerical" job?


5. Analytical Problem Solving. This is also true. The biggest roadblock people have in computer programming is the inability to properly recognize the problem! Defining a problem is a very necessary skill, and not too many people have it. Then, you need to be able to decide how to solve the problem. So, yes, this is a skill and not a talent.


6. Coding is fun. Well, no. Creating a computer program from nothing is a worthy endeavor and should be encouraged. But it is not a fun activity. Too many people have trouble debugging their programs, and what if there is no help available? It can be a maddening experience, indeed! Tearing your hair out in frustration is a rather common experience, I'm afraid.


7. Learn how to learn. Having done it, I can categorically say that learning how to learn and computer programming are not related to each other. That is to say, if you have learned how to learn and can pick up a new programming language quickly, then you're well set to progress as a computer programmer. However, just because a lot of successful computer programmers can do so, it doesn't mean the skill is guaranteed. Those are two different things.


8. Money. This is also not true. I see too many job offers demanding so many years of experience on a particular platform. Also, new platforms keep coming up every year. What was good won't stay good. Programming languages come and go. About the only steady job that I see is Database System Administrator. You don't need computer programming for that. Just learn SQL and some system administration skills. That requires certification, not a boot camp.


9. Prestige. All I can think of here is "EA Spouse", the deservedly lambasted practice of treating computer programmers shabbily. Only to be repeated at Rockstar Games, which forced its computer programmers to work a punishing 100 hours per week! Only to be let go after the game was done. All it does is burn out a lot of computer programmers. I don't see this changing anytime soon.


10. Be smarter. Not at all. There is this rather popular practice of testing computer programmers with "LeetCode", that is, asking the job applicant to code a programming problem. The problem with that practice is that the applicant cannot use a computer or look anything up, whether in books or anywhere else, including the web! So, obviously, the most successful applicants are the ones who have memorized as many of these problems as possible. I have seen the question compilations, and most of them are totally worthless! Either applicants can solve a problem because they've seen it before, or they can't because they haven't. All it does is gatekeep the really smart programmers (never memorize what you can look up -Einstein, Feynman, and others) and value memorization rather than smarts. 

Monday, July 11, 2022

Essay 1: Introduction

Essay 1: Intro to 100 Days of Code

A journey of 10,000 steps begins with the first one


One fine day, warm and sunny, I decided to take up the 100 Days of Code challenge. This was shortly after I joined Twitter and found out that there are all kinds of groups over there. Elon Musk was in the process of buying Twitter, and so I decided to join. Neil Gaiman is active there, too. Well, one thing led to another, and I was looking over the hashtags and found one that says "100DaysOfCode" and was sufficiently intrigued.


So, I looked over the tweets and found out that most of them do not feature any source code. I may have missed it, but the point of doing the 100 Days of Code challenge is to improve yourself, and part of that challenge is to publish the source code and share it with the world. The few who did take screenshots of their code did so with their favorite IDE, which inevitably means tiny, tiny font. Extremely hard to read. May as well not bother.


The suggestion was to publish your code on GitHub, and there's nothing wrong with that suggestion. I do have a blog account that I still post to from time to time, so I decided to just post it there. As long as it's available, no problem!


Most people actually have some kind of plan, either a book or, maybe, an on-line course. The name Angela Yu features in many of the posts. Another favorite is LeetCode. As for me, I don't have any established plan whatsoever. So, most of my time is spent trying to find the next feasible project. The constant worry is running out of ideas to try.


The point of this challenge is to learn, and I decided to learn. That means taking something I'm uncomfortable with and improving upon it. As of this writing, 3 weeks into the challenge, it's been a mix of projects so far. LeetCode challenges are there, but I actually skip most of them because I want to write actual, usable programs. That's a tall order, as I found out that just implementing the command line parameters alone takes upwards of half an hour. That's just setting variables, no actual program coding.


As to the actual coding, I firmly believe that most programs' cores are actually very simple. To that end, I screen capture the code from my blog, which means no tiny font! So far, I've managed to do so in one screen. The exception is the SVG library, which is rather extensive, even in its first incarnation.


The best part of taking this challenge, however, is that it anchors my day. Excepting rest days, which are Sundays, I always code. If time is lacking, then I code something simple, such as FizzBuzz. Overall, though, these have been extremely productive sessions. 


And that's the real benefit of the process. My coding skill is getting better every day. Of course, it wasn't lacking to begin with, but doing this challenge forces me to be extremely productive with my time. So far, I've been spending about 2-4 hours per day, including write-ups. I'm happy with the progress I'm making, especially since other people would code a challenge, and I would code a whole program.


That cannot be overstated: I write a whole new program every day! Not a little function, but a whole program! Now, the program may be rough and in need of revising, which I will do as needed later. Also, there may be bugs or missing features, which means I spend some time fixing the bugs, especially if it's a program I'll be using later in the challenge. It's still a program a day, though.


That just goes to show, no matter how skillful you are, there will always be a higher mountain to climb, and I have begun making my steps, one day at a time. Today, it's the 18th day, and I'm resting and writing this stream of consciousness so I have something to reflect back on from the day of completion.


2022 July 03


Sunday, July 10, 2022

Day 24/100 Turtle3

Day 24/100 Turtle3

Look! An animated Turtle!


This is just a simple improvement to the Turtle graphics program. I had planned on having multiple turtles going, but the truth is, I also wanted to get some kind of font system going. Unfortunately, I'm really tired at the end of the week, so I decided to implement just one easy, yet eye-catching feature: animation.


The SVG standard actually does have animation options, but I'm still fuzzy on the details, so for this one, I decided on animating it the hard way: animated GIF.


Fortunately, it's quite easy to do. All I have to do is generate a series of frames and then use the ImageMagick convert program to generate the animated GIF. It will also resize the images automatically. This is the command that I use:


time convert -delay 20 -loop 0 Frame*.svg -resize 320x200 FrameAnim.gif


It takes about 30 seconds to animate 23 SVGs, so not too speedy. However, for a small animated picture, that is acceptable. I do wonder why it takes somewhat long to build a 64 kB image.


As to the actual code, it is extremely simple. I have already written a simple SVG generator, and all I did was generate SVGs one after the other, with a line separator marking the desired filename. This is the code that causes the Turtle to write the separator:


  if (!strncmp("WF",s1,2)) {
    printf("WRITE FILE %s\n",s2);
  }


And that's it. If you remember the Day 16/100 SplitCat program, that is the one being used to split the file into separate ones. Quite a convenience. All it takes is a one-liner "WF" to write the files. Okay, I also have to set a string for the filename and increment the counter. I considered having that done internally by the Turtle, but in the end decided against it for transparency considerations. Here's the code:


  for (i=40;i<=90;i+=10) {
  for (j=60;j<=120;j+=20) {
    sprintf(FN,"Frame%02d.svg",c++);
    Turtle("WF",FN,0,0);
    Turtle("SC","",0,0);
    DrawCMYK(i,j);
    Turtle("sc","",0,0);
  }}


So, one line to set the string, and another to write the separator. Two lines total. And that is it for this week. Just a simple Turtle enhancement that will hopefully pay dividends down the road.


#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <math.h>

#define MAXENTRY 10000
#define MAXSTR 2560
#define sx0 0
#define sx1 118
#define sy0 0
#define sy1 180
#define PI 3.1415926

char Liner[MAXSTR];
char Data[MAXENTRY][MAXSTR];
int numentry;
int Debug=0;

//turtle
char co; //command
int  nu; //Number
int  he; //heading
int  di; //distance
float  an; //angle (rad)
float  tx0,ty0,tx1,ty1; //line
int  pe; //pen color
int  bg; //background color
int mode=0; //0-normal 1-fill
char FontFam[MAXSTR];
int FontSiz;

float map(float x0,float x1, float x2,float y0, float y2) {
  return y0+((x1-x0)*(y2-y0)/(x2-x0)); //y1
}

void SVGHeader() {
  //adjacent string literals are concatenated; a literal cannot span lines
  puts("<svg xmlns:xlink=\"http://www.w3.org/1999/xlink\" \n"
       "style=\"fill-opacity:1; color-rendering:auto; \n"
       "color-interpolation:auto; text-rendering:auto; \n"
       "stroke:black; stroke-linecap:square; \n"
       "stroke-miterlimit:10; shape-rendering:auto; \n"
       "stroke-opacity:0.4; fill:black; \n"
       "stroke-dasharray:none; font-weight:normal; \n"
       "stroke-width:0.25; font-family:'Dialog'; font-style:normal; \n"
       "stroke-linejoin:miter; font-size:12px; \n"
       "stroke-dashoffset:0; image-rendering:auto;\" \n"
       "width=\"128.0mm\" height=\"190.0mm\" \n"
       "viewBox=\"0 0 128.0 190.0\" \n"
       "xmlns=\"http://www.w3.org/2000/svg\">\n"
       "<!--Template generated by the Batik Graphics2D SVG Generator-->\n"
       "<defs id=\"genericDefs\" />\n"
       "<g>\n"
       "<line x1=\"10.0\" y1=\"10.0\" x2=\"118.0\" y2=\"10.0\" />\n"
       "<line x1=\"118.0\" y1=\"10.0\" x2=\"118.0\" y2=\"180.0\" />\n"
       "<line x1=\"10.0\" y1=\"180.0\" x2=\"118.0\" y2=\"180.0\" />\n"
       "<line x1=\"10.0\" y1=\"180.0\" x2=\"10.0\" y2=\"10.0\" />");
}

void SVGFooter() {
  puts("\n</g>\n</svg>\n");
}

void SVGDrawLine(float x0,float y0,float x1,float y1) {
  int t;
  if (mode==1) {
    if (pe) {
      printf("L %f,%f ",x1,y1);
    } else {
      printf("M %f,%f ",x1,y1);
    }
  } else if (pe) {
    printf("<line x1=\"%f\" y1=\"%f\" x2=\"%f\" y2=\"%f\" />\n",
           x0,y0,x1,y1);
  }
}


int Turtle(char s1[], char s2[], int n1, int n2) {
  if (!strncmp("SC",s1,2)) { //Capture SVG
    SVGHeader();
    if (Debug) puts("SVGStart");
  }
  if (!strncmp("sc",s1,2)) { //Close SVG
    SVGFooter();
    if (Debug) puts("SVGEnd");
  }
  if (!strncmp("F",s1,1)) { //forward absolute
    di=n1;
    if (Debug) printf("Forward Absolute %d\n",di);
    an=map(0.0,(float)he,360.0,0,2*PI);
    tx1=tx0+(float)di*sin(an);
    ty1=ty0-(float)di*cos(an);
    SVGDrawLine(tx0,ty0,tx1,ty1);
    tx0=tx1;ty0=ty1;
  }
  if (!strncmp("f",s1,1)) { //forward relative
    di+=n1;
    if (Debug) printf("Forward Relative %d\n",di);
    an=map(0.0,(float)he,360.0,0,2*PI);
    tx1=tx0+(float)di*sin(an);
    ty1=ty0-(float)di*cos(an);
    SVGDrawLine(tx0,ty0,tx1,ty1);
    tx0=tx1;ty0=ty1;
  }
  if (!strncmp("G",s1,1)) { //Goto XY Absolute
    tx1=n1; ty1=n2;
    SVGDrawLine(tx0,ty0,tx1,ty1);
    tx0=tx1;ty0=ty1;
  }
  if (!strncmp("g",s1,1)) { //Goto XY Relative
    tx1=tx0+n1; ty1=ty0+n2;
    SVGDrawLine(tx0,ty0,tx1,ty1);
    tx0=tx1;ty0=ty1;
  }
  if (!strncmp("H",s1,1)) { //Heading Absolute
    he=n1;
    if (Debug) printf("Heading Absolute %d\n",he);
  }
  if (!strncmp("h",s1,1)) { //Heading Relative
    he+=n1;
    if (Debug) printf("Heading Relative %d\n",he);
  }
  if (!strncmp("C",s1,1)) { //Color Stroke
    pe=n1;
    if (Debug) printf("Pen color %d\n",pe);
  }
  if (!strncmp("c",s1,1)) { //Color Fill
    bg=n1;
    if (Debug) printf("Background color %d\n",bg);
  }
  if (!strncmp("T",s1,1)) { //Font Fam+Siz
    strcpy(FontFam,s2);
    FontSiz=n1;
  }
  if (!strncmp("t",s1,1)) { //Draw text
    printf("<text x=\"%d\" y=\"%d\" font-family=\"%s\" font-size=\"%d%%\" >"
            ,n1,n2,FontFam,FontSiz);
    printf("%s",s2);
    printf("</text>\n");
  }
  if (!strncmp("M1",s1,2)) { //Mode 1 start
    mode=1;
    if (bg) {
      printf("<path stroke=\"#%06X\" fill-rule=\"evenodd\" fill=\"#%06X\" d=\"\n"
              ,pe,bg);
    } else {
      printf("<path stroke=\"#%06X\" fill-rule=\"evenodd\" fill=\"none\" d=\"\n"
              ,pe);
    }
    printf("M %f,%f ",tx0,ty0);
  }
  if (!strncmp("m1",s1,2)) { //Mode 1 end
    mode=0;
    puts("\" />");
  }
  if (!strncmp("WF",s1,2)) {
    printf("WRITE FILE %s\n",s2);
  }
  return 0;
}

int Init() {
  co=' ';
  nu=0; //Number
  he=0; //heading
  di=0; //distance
  an=0; //angle (rad)
  tx0=59;ty0=90; //line from
  tx1=tx0;ty1=ty0; //line to
  pe=1; //pen
  strcpy(FontFam,"PibotoThin");
  FontSiz=10;
  return 0;
}

void DrawCMYK(int x, int y) {
  Turtle("C","",0x000000,0);
  Turtle("G","",x,y);

  Turtle("C","",0x010101,0);
  Turtle("c","",0x88FFFF,0); //cyan
  Turtle("M1","",0,0);
  Turtle("G","",x,10);
  Turtle("G","",10,10);
  Turtle("G","",10,y);
  Turtle("G","",x,y);
  Turtle("m1","",0,0);
  Turtle("C","",0x010101,0);
  Turtle("c","",0xFF88FF,0); //magenta
  Turtle("M1","",0,0);
  Turtle("G","",x,10);
  Turtle("G","",118,10);
  Turtle("G","",118,y);
  Turtle("G","",x,y);
  Turtle("m1","",0,0);
  Turtle("C","",0x010101,0);
  Turtle("c","",0xFFFF88,0); //yellow
  Turtle("M1","",0,0);
  Turtle("G","",x,180);
  Turtle("G","",10,180);
  Turtle("G","",10,y);
  Turtle("G","",x,y);
  Turtle("m1","",0,0);
  Turtle("C","",0x010101,0);
  Turtle("c","",0x888888,0); //black
  Turtle("M1","",0,0);
  Turtle("G","",x,180);
  Turtle("G","",118,180);
  Turtle("G","",118,y);
  Turtle("G","",x,y);
  Turtle("m1","",0,0);
}

int ProcessData() {
  char Command[MAXSTR];
  char Value[MAXSTR];
  int  Num1=0;
  int  Num2=0;
  int i,j;
  int c=0;
  char FN[MAXSTR];

  for (i=40;i<=90;i+=10) {
  for (j=60;j<=120;j+=20) {
    sprintf(FN,"Frame%02d.svg",c++);
    Turtle("WF",FN,0,0);
    Turtle("SC","",0,0);
    DrawCMYK(i,j);
    Turtle("sc","",0,0);
  }}
  return 0;
}

int main (int argc, char *argv[] ) {
  int n,i;
  Init();
  ProcessData();
  return 0;
}


One last parting thought: I will do something drastic next week. I have been delaying it for some time, and truthfully, I could stay in the same format for some more weeks, but I think it's time for me to incorporate a graphical user interface. Since I'm using a Raspberry Pi, it's not the easiest thing to do. So, we'll see.


Saturday, July 9, 2022

Day 23/100 Steganography

Day 23/100 Steganography

How do you say Steganosaurus?


I've been working on this source code packager on and off. The whole idea for this program is that most forums on the web restrict the file size for text, while making a generous allowance for pictures. So, the idea of including text inside the picture was born.


There are several different ways to do this. You can simply draw the text in the picture, which is what I have been doing. You can encode the text as barcodes or QR codes. Or you can encode the text as black and white dots.


However, those solutions involve creating pictures in a rather ugly way. So, what if you could just include the text inside the picture itself? Hidden text, so to speak. Hence steganography.


In my case, I decided that since I'm using a Raspberry Pi, the display can only show 16-bit color, whereas pictures are usually 24-bit color. So, it is obvious that I should design it as one data byte per pixel. The problem is that I cannot just append a byte to the pixel. I have to split it as 3-2-3 bits to fit into the 5-6-5 RGB scheme.


It took some doing. I made it easy for myself by skipping the PNG format and going for the PPM (P6) format. You're going to have to convert it to PNG format, but I'll leave that up to you. Personally, I use the ImageMagick convert program.


This is to Write to the image:


      bit=(char) fgetc(fpImg);
      bit&=0xF8; bit|=(c&0xE0)>>5; bit1=bit;
      bit=(char) fgetc(fpImg);
      bit&=0xFC; bit|=(c&0x18)>>3; bit2=bit;
      bit=(char) fgetc(fpImg);
      bit&=0xF8; bit|=(c&0x07); bit3=bit;
      printf("%c%c%c",bit1,bit2,bit3);


And this is to Read from the image:


  for (i=0;i<l;i++) {
      bit1=(char) fgetc(fpImg);
      bit2=(char) fgetc(fpImg);
      bit3=(char) fgetc(fpImg);
      sout= ((bit1 & 0x07)<<5)
                  | ((bit2 & 0x03)<<3)
                  | ((bit3 & 0x07));
      putchar(sout);
  }


If you're wondering why the sout character isn't set to zero before reading in the data, that's because it's not necessary. sout is a character of 8 bits. I'm reading in 8 bits. So, the sout variable is going to be wholly overwritten anyway.


I was able to use the same program for both Write and Read. Depending on whether you supply both the picture name and a data filename, or just the picture name, the program selects the operation appropriately.


 if (argc<2) {
    puts("Write: fileglob [picname.ppm] [datafilename]");
    puts("Read:  fileglob [picname.ppm]");
    puts("ppm is P6 file (binary)");
    return 1;
  }


The hardest part was the design, and trying to decipher the PPM format. I still have trouble, and I'm not at all sure that I have it right. The specification states that there should be a whitespace character (usually a newline) after the color depth, but it gave me an off-by-one error which shifted the colors of the picture. I don't think I quite get it, yet. So, I can only guarantee that the program works on my machine.


#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAXSTR 256
FILE *fpImg;
FILE *fpDat;
char buff[MAXSTR];
int w,h,d;
long l;
char sname[MAXSTR];

int Init(int argc, char *argv[]) {

  if ((fpImg=fopen(argv[1],"rb"))==NULL) {
    puts("File open Error");
    return 1;
  }

  if ((fpDat=fopen(argv[2],"rb"))==NULL) {
    puts("File open Error");
    return 1;
  }

// Read P6 PPM format
  if (fgets(buff,MAXSTR,fpImg)==NULL) {
    puts("Image file read error");
    return 1;
  }
  if (strncmp("P6",buff,2)) {
    puts("Image file not P6 PPM error");
    return 1;
  }
  puts("P6");

  fseek(fpDat,0L, SEEK_END);
  l = ftell(fpDat);
  fseek(fpDat,0L, SEEK_SET);
  printf("#steg Aa A %ld %s\n",l,argv[2]);


// Read width height
  fscanf(fpImg,"%d %d",&w,&h);  //Line 50
  printf("%d %d\n",w,h);

// Read depth
  fscanf(fpImg,"%s",buff);
  d=atoi(buff);
  printf("%d",d);

  if (d!=255) {
    puts("Image depth isn't 255");
    return 1;
  }

  if (l>(w*h)) {
    puts("Image size too small!");
    printf("Image: %ld  Data: %ld\n",(long)(w*h),l);
    return 1;
  }

  return 0;
}

void CleanUp() {
  if (fpImg!=stdin) fclose(fpImg);
  if (fpDat!=stdin) fclose(fpDat);
}

int ReadData(int argc, char *argv[]) {
  char bit1,bit2,bit3;
  char sout;

  long i;
  long c; char c1;
  char s1[MAXSTR];
  char s2[MAXSTR];
  char s3[MAXSTR];
  char s4[MAXSTR];
  char s5[MAXSTR];

  if ((fpImg=fopen(argv[1],"rb"))==NULL) {
    puts("File open Error");
    return 1;
  }

// Read P6 PPM format
  if (fgets(buff,MAXSTR,fpImg)==NULL) {
    puts("Image file read error");
    return 1;
  }
  if (strncmp("P6",buff,2)) {
    puts("Image file not P6 PPM error");
    return 1;
  }
  fscanf(fpImg,"%s %s %s %s %s",s1,s2,s3,s4,s5);
  l=atol(s4); strcpy(sname,s5);

// Read width height
  fscanf(fpImg,"%d %d",&w,&h);  //Line 50

// Read depth
  fscanf(fpImg,"%d",&d);
  if (d!=255) {
    puts("Image depth isn't 255");
    return 1;
  }

  for (i=0;i<l;i++) {
      bit1=(char) fgetc(fpImg);
      bit2=(char) fgetc(fpImg);
      bit3=(char) fgetc(fpImg);
      sout= ((bit1 & 0x07)<<5)
                  | ((bit2 & 0x03)<<3)
                  | ((bit3 & 0x07));
      putchar(sout);
  }

  return 0;
}

void ProcessData() {
  long i,j;
  int c; char bit;
  char bit1,bit2,bit3;
  char sout;

  for (i=0;i<(w*h);i++) {
    c=fgetc(fpDat);
    if (c==EOF) {
      putchar(fgetc(fpImg));
      putchar(fgetc(fpImg));
      putchar(fgetc(fpImg));
    } else {
      bit=(char) fgetc(fpImg);
      bit&=0xF8; bit|=(c&0xE0)>>5; bit1=bit;
      bit=(char) fgetc(fpImg);
      bit&=0xFC; bit|=(c&0x18)>>3; bit2=bit;
      bit=(char) fgetc(fpImg);
      bit&=0xF8; bit|=(c&0x07); bit3=bit;
      printf("%c%c%c",bit1,bit2,bit3);
    }
  }
}

int main (int argc, char *argv[] ) {
  int e=0;

  if (argc<2) {
    puts("Write: fileglob [picname.ppm] [datafilename]");
    puts("Read:  fileglob [picname.ppm]");
    puts("ppm is P6 file (binary)");
    return 1;
  }

  if (argc==2) ReadData(argc,argv);

  if (argc==3) {
    Init(argc,argv);
    ProcessData();
    CleanUp();
  }

  return e;
}


One more thing:

Design is much, much harder than coding! In the process of writing this program, I changed the design several times. It took me hours to finish the program up to this state, but that's because I kept changing the design. There used to be more parameters required in order to write the image, and I also tried it out directly with the PNG format using Processing. But it was not until the PPM idea came into being that the program finally arrived at a satisfactory design.

Thursday, July 7, 2022

Day 22/100 Fibonacci and Factorial Trailing Zero

Day 22/100 Fibonacci and Factorial Trailing Zero

Failure at Grade School Arithmetic


So, I was in a conversation the other day, where the other person was touting the virtues of Python and dissing the C language. You know, the language I'm using to do the 100 Days of Code challenge. Truthfully, I did encounter a few "Segmentation fault" core dump errors, but I simply took them in stride. What I didn't know, I soon found out. It's part of the learning process.


Why would anybody do it differently? Why would you blame the programming language when you're supposed to increase your own skill so that you no longer make mistakes, instead of expecting the computer to catch yours?


If you look at my programs, you'll see my tendency to be light on error-checking routines. What can I say? I don't make that many mistakes. Sure, I've made some. Everybody does. The difference between me and other people is that I learn from my mistakes and don't repeat them. I don't see that kind of commitment from your average coder. This is why I wonder if the oft-claimed "coding makes you smarter" is really just survivorship bias.


Pardon me for having a dim view of the situation. I have seen too many stupid people who don't know what they're doing, make bold claims that are obviously false. Let's not spread more misinformation than what is already out there.


The title above is being honest. The failure does not lie in coding, regardless of platform or programming language chosen. The failure is at the grade school level of arithmetic. I'm talking about 5th grade arithmetic here! Add, subtract, multiply, and divide. It doesn't even involve any fractions! How hard can that be?


Very hard, it turns out. Let's take a look at the Fibonacci code I've written. I did it twice: the first is my take on it. The second is what is commonly done by the community, assuming neither Dynamic Programming (memoization) nor recursion is involved. This is LeetCode 509:


long long fib1(int n) {
  int i=1;
  long long m[2];
  if (n<1)  return 0;
  if (n>92) return -1; //oob
  for (m[0]=1,m[1]=1;--n;i=1-i) m[i]+=m[1-i];
  return m[i];
}

long long fib2(int n) {
  int i=0; long long a=0;
  long long b=1;long long c=1;
  if (n<1)  return 0;
  if (n>92) return -1; //oob
  for (a=0,b=1,c=1;--n;) {
    a=b; b=c; c=a+b;
  }
  return b;
}


As you can see, my version is more compact than what is usually done. It uses only 2 variables, stored as an array, so that I can flip between them. Testing does take a while, but if you pride yourself as a software engineer, then you would want to do it that way! Although mathematicians can resort to a recursive solution, software engineers cannot do that! For you to take pride in your coding skill as a software engineer, the code has to be extremely tight, wasting neither running time (recursion) nor memory (memoization). 


The second solution is actually acceptable. It's very commonly done by people who aren't mathematically sophisticated. It's kind of like the FizzBuzz challenge. Sure, it's easily done, but is it optimal? Not really. So, the second solution is acceptable, but nothing to be proud of.


As for recursion and memoization? In my very strong opinion, those are failures! A waste of cycles and bytes! Very unprofessional! I always say that my code is at an amateur hobbyist level, but looking at some of this professional coding, I wonder if I'm not already better than most of them.


A second example is even more telling than the Fibonacci problem. The problem is to count the number of trailing zeros in a factorial. LeetCode 172. Now, we're talking deep into the mathematical realm. Sure, LeetCoders act like it's an easy thing to figure out. But is it? If you can figure out the deep mathematical issues, then surely you can also figure out the most efficient way to compute it? 


Nope! That's a fail! As far as I'm concerned, this kind of question does not belong in the interview process. Fundamentally, either you have run into the problem already and remember the solution, or you haven't run into it previously and will now have to compute large samples of factorials to try to find some kind of pattern you can recognize. Oh, do you know that in these kinds of interviews you cannot use any kind of help, including computers and browsers? That's right! You only have the whiteboard.


What that tells me is that a company that does this kind of thing will fill up its ranks with people who memorize LeetCode problems, instead of the smart coders who can readily research the issue. What was it Einstein, Feynman, et al. said? "Do not memorize things that can be looked up." Exactly!


Let's see what the typical answer involves. You can probably google this in just a few minutes:


int trail1(int n) {
  int c; int i;
  for (c=0,i=5;n/i>=1;i*=5) {
    c+=(n/i);
  }
  return c;
}


Looks nice and easy, doesn't it? But it's deceptively tricky. Can you work out the problem even after looking at the code? I certainly cannot! I'm no mathematician. I need to see rows of factorials and count the zeros to get some kind of pattern recognition going. This question is not an easy question, despite the brevity of the solution.


So, am I a failure? Maybe as a mathematician. However, as a coder? Not only would I pass; those people who give the answer above are all failures! Remember, we're evaluating coding skills here, not math skills. The only math skill you need is grade school arithmetic! That's right! No more than arithmetic. Hence the failure is at the grade school level. Now tell me, what kind of highly skilled, highly experienced, highly paid, professional computer programmer would fail grade school math? Only impostors.


You need a proper foundation in your skills! This goes double if you want to be a professional! No shortcuts! If you cannot do grade school arithmetic, I suggest taking up coding as a hobby, rather than as a profession.


Here is the proper solution to the question:


int trail2(int n) {
  int c=0;
  for (;n/=5;c+=n);
  return c;
}


One division and one addition per loop iteration. That's it! 


Look, coding is hard. I understand that. What I don't understand is why people would fail at grade school arithmetic! The usual solution shown above features two divisions and a multiplication per iteration. How is that better? It's not better. It's not professional at all!


How many years ago did you first learn arithmetic? Are you still at that level? Sigh.


Sorry to be so negative about things, but I'm seeing way more incompetence lately than there used to be. Unfortunately, it seems that people just don't care anymore. That's a dangerous attitude to have. In the worst case scenario, it will mean the destruction of the whole industry as we know it. Don't laugh. It has happened before. Look up "Dot Com Meltdown" if you don't believe me. That's the equivalent of a "Game Industry Crash" or even a "Great Depression" level of apocalyptic event. Let's not have another one, okay?


Sigh.


I keep telling people to learn the fundamentals, but people just don't bother. They refuse to. Why would they? A convenient library is just a download away! Until one day, one of those libraries disappears, and if you google "... brought down the internet", you will find some incredible stories that shouldn't have happened, but did!


Study your fundamentals!


Lookit dem Smileys...


int d8(int B) {
  int              O;
  return  (O=  (B  /=  5)
  )?  O+   d8  (B  ):  0;
}


One more thing: Python isn't English! That distinction belongs to Literate Programming, as pioneered by Dr. Donald Knuth. Alternatively, you can pull up just about any Design Document. Are you a Rockstar computer programmer?