Saturday, December 6, 2014

Hour of Code 2014

Hour of Code is coming, and I'm preparing several programs that you can do in your spare time. It may not be one hour, at least for you, but it will be one hour for me. :)

I'll be using my favorite computer development platform: Petit Computer for the Nintendo DSi. I do wish I had the latest Mk 3 version, which is out in Japan now but so far not in the US.

The first program is already done: Debugger. I'll just have to write it up. Turtle Graphics is an old stand-by, but I think I'm going to implement the Travelling Salesman Problem. That's right, the NP-hard problem that has sunk so many man-hours of computer science researchers. There's also a re-implementation of Quicksort: both a recursive version (in BASIC!) and a non-recursive version (Merge sort!).

I'm sure I can think of a few more fun stuff to write in one hour. Hey, it's quick and easy. What are you waiting for? Download Petit Computer DSiWare for your Nintendo game device and get programming!

Wednesday, May 28, 2014

Travelling Salesman


Random Path Finding: Travelling Salesman

I was reading an article about quantum computing, where a commercial-quality version is being sold for $10 million. Of course, quantum computing promises a lot of performance gain, but in this case, the D-WAVE is specific to optimisation algorithms.

In brief, an optimisation algorithm can be described as a "hill climbing" algorithm. Basically, you're looking for the highest point in the area. So, as long as there is higher ground, you keep going up until you reach the highest point. The problem is that you will encounter local hills: points that are the highest in their area, but not the highest overall. So, you have to watch out for "false peaks".

Another problem is when you want to find a path through the area, and you want to find the highest overall path, or maybe the lowest. When the hills and valleys represent productivity and cost, then the path you have represents the highest productivity and the lowest cost.

This is all very desirable in computing. The payoff is immense and immediate. So, efficient path-finding algorithms are extremely desirable in the computing industry. An example of this is the travelling salesman problem, where a salesman needs to find the shortest route that covers all his stops. This is applicable not just to travel, but also to manufacturing, where the least distance a tool traverses, be it a cutter, welder, or packer, is desirable.

There are two main approaches to solving this problem. I will call these the "forest" and the "tree" approaches. This is just a simplification of the process. There is another approach that I will describe later, but for now, let's focus on these two methods.

The forest method looks at the overall picture of the area and tries to find the absolute best path possible. The simple, traditional way to do it is to enumerate every permutation of possible paths and decide which one is the best. Its good point is that it finds the optimal solution. Its bad point is that it is extremely costly in terms of computing power: permutation has factorial cost.
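The forest method can be sketched in a few lines of Python (a hypothetical illustration, not code from my Petit Computer programs; the function name is my own). It tries every ordering of the stops, so the running time grows factorially with the number of points:

```python
from itertools import permutations
from math import dist, inf

def tsp_brute_force(points):
    """Return (length, tour) of the shortest closed tour. O(n!) time."""
    best_len, best_tour = inf, None
    start = points[0]
    for perm in permutations(points[1:]):   # fix the start to avoid rotations
        tour = (start,) + perm + (start,)   # close the loop back to the start
        length = sum(dist(a, b) for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour
```

Even at a mere 12 cities this is already 11! = 39,916,800 tours to check, which is why nobody uses it beyond tiny inputs.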

The tree method looks at the immediate surroundings and simply chooses the best solution that is immediately present. The good point is that it is extremely cheap to do. The bad point is that it is susceptible to false peaks, dead ends, and other traps.
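A minimal sketch of the tree method, again in hypothetical Python rather than Petit Computer BASIC: always walk to the nearest unvisited point. It is cheap, roughly O(n^2), but it can be led astray exactly as described, because an early greedy choice may force a long jump later:

```python
from math import dist

def tsp_nearest_neighbour(points):
    """Greedy 'tree' tour: repeatedly visit the closest unvisited point."""
    unvisited = list(points[1:])
    tour = [points[0]]
    while unvisited:
        nxt = min(unvisited, key=lambda p: dist(tour[-1], p))
        unvisited.remove(nxt)
        tour.append(nxt)
    tour.append(points[0])          # close the loop
    return sum(dist(a, b) for a, b in zip(tour, tour[1:]))
```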

So, most people who are serious about solving the travelling salesman problem compromise. They take a section of the local area and solve it optimally using the forest method, and by connecting several sections, they solve the whole area using the tree method. It works, and works very well. It is not the optimum method, but it is a good enough solution.

Apparently, that's not what quantum computing promises. Due to the extremely efficient computation power available to the quantum computer, they instead solve it by a "random" method. That is, they simply lay a path haphazardly, then they jiggle the path to see if a better path presents itself. Repeat that many, many times and eventually you'll come up with a pretty good solution.

What the quantum computer can do, however, is lay down thousands of paths all at once, and therefore find the optimum path extremely quickly. There is a problem in reading the quantum result, but we'll ignore that for now. So, what it boils down to is simply a trial-and-error method. Is that wise?

Proponents of the method point out that such trial and error is desirable where the problem is so difficult that it is not otherwise solvable. I concede that point, but I am not convinced that the path-finding problem is "difficult".

So, I will point out another highly respected "random stab" method that has since fallen out of favor: the neural network. The field of neural networks was very interesting, with proponents promising fantastic dreams. Yet all it boils down to is a few mathematical functions. So, you have one gigantic mathematical equation, and what you do is twiddle the parameters until the error rate is near zero. There is a very sophisticated algorithm that you use to determine the parameters, but it looks random to me.

There is another alternative parameter-determination method. That is the "Bayesian" method: fire together, wire together. This is the tree method, whereas the former was the forest method. As you can surmise, the Bayesian method is very quick and effective. Will it run into the false-hill problem? So far that hasn't happened, because the problem isn't about paths, but about pattern recognition. So, the domain of the problem is limited, and it turns out the Bayesian algorithm is perfect for such a thing. However, path finding is beyond the Bayesian algorithm to solve.

So, this brings me to another idea: Bogo sort. For those who don't know Bogo sort, it's also called Clown sort. What you do, basically, is throw all the elements up in the air, and if they fall down sorted, you're done. Obviously, that will take an extremely long time to complete, and there is no guarantee that it will ever finish the job. This is the forest method.

So, if that's the forest method, what about the tree method? It turns out to be bubble sort, where you simply compare the elements that are next to each other. This is also slow, because you're comparing one element to another repeatedly. It's still faster than Bogo sort, at O(N^2), but slow when compared to other methods.

And here's the interesting thing: I found that if you combine these two slowest sorting algorithms, you speed up execution appreciably. It's still not the greatest, but it's a good step up. Here's the algorithm:

Random Sort:
1. For each element on list
1.1. Randomly compare this with other elements, swap if out of order
2. Repeat until sorted.

Bubble Sort:
1. For each element on list
1.1. Compare this element with the next one, swap if out of order.
2. Repeat until sorted.


Random Bubble Sort:
1. For each element on list
1.1. Randomly compare this with other elements, swap if out of order
1.2. Compare this element with the next one, swap if out of order.
2. Repeat until sorted.
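The Random Bubble Sort pseudocode above can be sketched as runnable Python (a hypothetical translation; the function and variable names are my own). Each pass does one random long-range compare-and-swap plus one adjacent bubble compare-and-swap per element, repeating until the list is sorted:

```python
import random

def random_bubble_sort(a):
    """Sort list a in place by combining random long-range swaps (step 1.1)
    with adjacent bubble swaps (step 1.2), repeating until sorted (step 2)."""
    n = len(a)
    while any(a[i] > a[i + 1] for i in range(n - 1)):
        for i in range(n - 1):
            # 1.1: randomly compare with another element, swap if out of order
            j = random.randrange(n)
            lo, hi = min(i, j), max(i, j)
            if a[lo] > a[hi]:
                a[lo], a[hi] = a[hi], a[lo]
            # 1.2: compare with the next element, swap if out of order
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a
```

Termination is guaranteed by the bubble half: a full pass with no swaps means the list is sorted, and every swap, random or adjacent, strictly reduces the number of inversions.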

There you have synergy, where the whole is greater than the sum of its parts. By combining the two approaches, the problem of randomness is eliminated by the fastidious bubble sort, and the problem of slowness is eliminated by the large random jumps. Combined, they represent a large cost reduction. There is a better solution than this, and that is comb sort. But that's a topic for another time. Let's focus on this random solution for now.

Going back to the original travelling salesman problem, what does this have to do with path finding? It turns out that at its heart, the travelling salesman problem can be looked at as a sorting problem. If you take the distances between pairs of points and compare them, then you can see that a comparison sort can yield a rather quick solution. A bubble sort TSP will yield interesting results, indeed.

The problem with bubble sort TSP is easy to see: being the tree solution, it falls foul of the false-peak problem. A random sort TSP, however, will not fall so easily. A permutation (sort) TSP will yield the optimum solution, but, as we have seen, it is an extremely expensive one.

So, a "random stab" TSP method is, in effect, inefficient because it lacks the self-organization factor that sorting provides. However, that problem is alleviated by using Random Bubble Sort on TSP:

1. Create a lookup table for distance travelled between two points.
2. Do Random Bubble Sort, using the lookup table for comparison.
3. Repeat until time runs out or no improvement to the solution.

The key point here is "random". You always want to jiggle the points repeatedly to see if reversing the order of visiting two towns will yield a better result. Eventually, a path will present itself that consists of points that are next to each other. It will automatically allow backtracking, too. Will it be optimum? Maybe not, but it will be good enough.
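The three steps above can be sketched in hypothetical Python (my own helper names, not the author's Petit Computer code). Step 1 precomputes the distance lookup table; then random and adjacent city swaps are tried, keeping any swap that shortens the tour:

```python
import random
from math import dist

def tour_length(tour, d):
    """Total length of a closed tour, using the distance lookup table d."""
    return sum(d[a][b] for a, b in zip(tour, tour[1:] + tour[:1]))

def random_bubble_tsp(points, passes=200):
    n = len(points)
    # Step 1: lookup table for the distance between every pair of points.
    d = [[dist(p, q) for q in points] for p in points]
    tour = list(range(n))
    for _ in range(passes):                  # Step 3: repeat
        for i in range(n - 1):
            j = random.randrange(n)          # Step 2: random partner...
            for k in (j, i + 1):             # ...then the adjacent one
                cand = tour[:]
                cand[i], cand[k] = cand[k], cand[i]
                if tour_length(cand, d) < tour_length(tour, d):
                    tour = cand              # keep only improving swaps
    return tour, tour_length(tour, d)
```

In a real program you would stop when a whole pass produces no improvement rather than running a fixed number of passes, but the fixed count keeps the sketch short.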

Obviously, there is room for improvement. The obvious one is to lay down all the points in order when comparing distant points with various points en route. Another improvement is to restrict the points under consideration to smaller and smaller distances. But that is for further research and not for this session.

Another question is whether another sorting algorithm would work just as well. The answer to that is no. The more efficient sorting algorithms will not work because they lack the "random jiggle". We want a random algorithm that constantly scans for better solutions.

And yet, there will come a time when the solution is fixed and will not change. Will that be optimum? Chances are, it won't. However, it will be close enough, and in quick enough time. If you are curious, simply run the program multiple times and see whether the solution changes.

And there you have it. A random bubble sort may not be desirable in the sorting problem domain. However, as it turns out, it's extremely desirable in the path-finding domain. It's all about matching the tools to the problem. That is all.

Sunday, April 27, 2014

Long Absence due to Illness

I'm back, and struggling to get back to form.

In case you're wondering, I've been struck with illness. It started in January. Half of January and half of February were spent being sick. No work done. But March took it all. The whole month was spent bedridden. No work done. I got better enough in April to write a quick April Fools joke about BASIC being the best language for beginners to learn. The doctor put me on Metformin for I don't know how long, but it's a big bottle!

Well, it's now the end of April. I was enrolled in Camp NaNoWriMo, too. I've only written about 3,000 words of it, out of a planned 25,000. So, not good.

I've noticed that the directory pages are outdated. I've just finished updating the cooking page, because that was convenient. I suppose I should update the rest based on popularity. So, the Petit Computer directory page is next, followed by the Raspberry Pi page.

I'll be reviewing more books to come, and it will follow up from the last book review. I'll also start on the Petit Computer programs, and it, too, will pick up from the last blog post.

I have not been active with the Raspberry Pi lately, so I will have to start over. I'd probably start by reviewing Linux commands piece by piece. I don't know. I do know that I want to finish that webcam project and start documenting my Slim Pi project.

As far as the Nintendo Journal goes, it's just for fun, so it's not urgent. I will skip posts and start over. I do have some more pictures to share.

An interesting idea is that I will start writing on a manual typewriter. I plan to write 4 double spaced pages on it per day. It should take me about 1-2 hours per day, assuming I did my homework beforehand. We'll see how that will work.

Well, that's it for now. Updating the cooking directory page took one hour because somehow the pages went missing! I didn't know what to do, but they magically went back up again. Oh, well. I really don't know what happened. But if I have to restart the directory pages again, they will have only dates, and not links. You can always look posts up by date, albeit inconveniently.

GTG. Take care for now!

Saturday, April 12, 2014

Chess: 3 Weak Squares


This is a game I played with Pure Chess at Scholar level. It's an interesting game, with me totally dominating Black's position. It's the finest game I've played so far. Just don't expect me to repeat this performance anytime soon!

1. e4 c5
2. g3 d5
3. Bg2 dxe4
4. Bxe4 Nf6
5. Bg2 

Oh, dear. 5 moves into the game, and I lost initiative already. There goes my hope for a good game.

5. ... h6
6. Nf3 e6
7. O-O Bd6
8. c3 O-O
9. d4 Nbd7
10. Be3 Ng4
11. Nbd2 Nxe3
12. fxe3 

A natural recapture. I moved a pawn from the f file to the e file, thus opening up my rook for an attack on Black's King.

12. ... Re8?

This is strange. About the only thing I can think of is that Black has no concept of long-range planning. By moving his rook there, he's setting up an attack on the f pawn. Right now, White can't attack that pawn, but after a few exchanges, things will be different!

13. Ne4 Bf8
14. Rf2 f5!

The situation is now clear. Black has no plan to save that pawn for defense after all. Black is using that pawn as an attacker!

15. Ned2 a5
16. Qc2 Ra6
17. Raf1 Qb6
18. Nc4 Qb5
19. Nce5 

A knight here is in a very strong position. Notice that no pawn can attack that knight. It's a comfy perch that will make Black's movement difficult. It's also the first attacking piece so far. We are now moving into the middle game.

19. ... Nxe5
20. Nxe5 g5?

You should not voluntarily give up the pawns protecting the King. Just because you don't see anything wrong with the current position, does not mean it won't change in the future!

21. g4?

Ill-advised attack on my part. I failed to calculate the consequences of my gambit of trying to open the f file.

21. ... cxd4
22. exd4 Bg7
23. gxf5 exf5

I looked at this position and stared in horror! My King has no pawn protection, in view of Black's 3 pawns! About the only salvation is the fact that my pieces are consolidated while Black's pieces are scattered all over the board. If only I can attack on the Queenside while preventing Black from consolidating his pieces toward my King!

24. c4 Qb6
25. Bd5+ Kf8?

An unfortunate move for Black, lining up his King opposite my doubled Rooks!

26. c5 Qf6?

Lining up his Queen with my doubled Rooks?

27. Bxb7 Ra7
28. c6!

Looking at this position, I'm very happy! Black has 3 major problems that he needs to take care of. There are 3 pressure points:


  1. The f5 square. Attacked twice, defended twice, defender wins. Attacked 3 times, defended twice, attacker wins.
  2. The d7 square. Possible King-Queen fork by the Knight.
  3. The c5 square. Possible King-Rook fork by the Queen.


Notice that I'm not looking at the position by the pieces. I'm evaluating the position by weak squares. This is not a common view, but I think squares are more important than pieces!

28. ... Rxb7
29. cxb7

Black attempts to solve problem #3 by trading his Rook for my Bishop. This is unfortunate for Black, since I now have two threats: the Bishop, and promotion on the b8 square. As we play along, notice how Black's position crumbles rather quickly.

29. ... Be6
30. Rxf5 Bxf5

Black has solved problem #2, but his problem #1 remains.

31. Rxf5 Qxf5
32. Qxf5 Kg8

And now, I see a winning combo. I realize that in order for me to promote the pawn, I need to remove the Rook from the 8th rank. I know just how to do it! We are entering the Endgame phase. Black did not survive for long.

33. Qf7+ Kh7
34. Qg6+ Kh8
35. Qxe8+

And I have the Rook. I wasn't calculating mate, but since all moves are forced, it's a mating combination.

35. ... Kh7
36. Qg6+ Kh8
37. b8=Q+ Bf8
38. Nf7#

Yes! Mated the King in the corner! Notice that Black did not make any obvious blunders. About the only inaccuracy I can see is a lack of long-term planning by Black. Overall, a great game that I do not expect to improve on any time soon.

Thursday, April 3, 2014

Book Review #30


Book #30 Java for Dummies
Aaron E. Walsh

This book is a blotch on the usually good "For Dummies" series. As I kept reading this book, I kept asking, where's the Java? I don't see any. Page after page, I don't see Java. I keep seeing HTML. Then it hit me. This isn't "Java for Dummies". This book is "Applets for Dummies."

That's right. This is "Java" in the loosest sense of the word. The correct title would have been "Applets", or Java applications inside the Web browser. So, I reread the book, and it's rather comprehensive in that regard. There's hardly any Java at all, so you need to learn Java development separately. You know what? Usually, such a book will have its own applet development section. So, maybe this book is about applet design? This book has a little about everything. I can't help but think that a more focused treatment of applet design would have been better. As it is, there's not enough meat in there to go deep. Good for a survey, but not enough to understand everything.

Who will benefit from this book? If you're a Java developer who has never touched applets before, this may be a good book. But I'd say odds are you can do better if you keep looking.

Tuesday, April 1, 2014

BASIC is the Best Language to Learn


BASIC is the Best Programming Language for Beginners

"BASIC causes brain damage!" -Dijkstra

Every time some non-programmer asks me what programming language they should learn, I always answer "BASIC". Understand that not many computer programmers would answer that way, and some would even think that BASIC is actually harmful. However, from my experience, these are the times it took me to learn various languages:

Applesoft BASIC: 3 hours
HTML: 3 hours
JavaScript: 2 days
Perl: 3 days
Processing: 3 days
VRML: 1 week
C: 40 hours
Pascal: 1 semester
Lisp: 4 weeks
Java: still on-going
Python: Still on-going

There are other languages in smatterings, such as Scratch (instant!) and MS Visual Basic (1 week: 3 days being clueless, 2 days learning). However, you can see that BASIC and HTML occupy the short end of the learning curve. I understand that not everybody can learn things that fast, but still, the argument is that BASIC is very easy to pick up. Not as fast as Scratch and LOGO, mind you, but extremely easy to pick up without the severe limitations of LOGO, while still being a general-purpose programming language.

If you consider the fact that not only was Applesoft BASIC my first computer programming language, but also that I learned it simply by reading a book without touching the computer whatsoever, then it's a miraculous event, indeed. However, is it really? I don't think so. If you look at the form, it's basically (command) (parameter1),(parameter2),(parameter3). There may be odd grammatical markers such as parentheses and colons, but overall the form is very much like algebra, and just as easy to do.

That is, I learned computer programming in 3 hours. That's all there is to it.

Yet, that's actually not a big accomplishment. Coding, or implementing a computer program, is actually rather easy. Design, on the other hand, is extremely difficult to get right. I mastered coding a long time ago. Design? I have spent a lifetime learning it, and I'm just scratching the surface.

That's not what most people would say. Most people would say that coding is hard, and design is easy. Just as they say Math is hard, and Art is easy. Who's right? The general population? Or the lonely me?

I'd say that math/coding is easy because if you blunder, the computer will tell you right away and you can fix it. If you design it wrong, there's no one to tell you so, and the mistake will live forever, until one day you realize that you are not getting the expected performance. Therefore, coding mistakes are easy and cheap to fix. Design mistakes will cause businesses to fail, and no one is the wiser.

Dijkstra wrote a paper saying, in effect, "blah blah blah BASIC causes brain damage! blah blah blah Once people learn something, they cannot unlearn it." Most media, hungry for sensationalism, focused on the first sentence, and that sentence became famous. I, as an educator, focus on the second sentence. Why can't people unlearn what they have learned? I have had to do it several times in learning the various computer languages. Why can't they do it? Why can't they bring themselves to another level? Why must they cling to outdated knowledge? BASIC does not cause brain damage. The damage is already there!

Is BASIC such a bad language? Not at all. The new, modern BASIC languages are especially sophisticated. Even MS Small Basic, with all its bugs, is still superior to sophisticated programming languages such as Java and Python. Understand that most experienced computer programmers will blanch at that statement. They will expound the greatness of such languages: Object Oriented! Local Variables! Strong Data Typing! No pointers!

However, transitioning from BASIC to Pascal, I remember having to think for quite a long time about why local variables are desirable. Even to this day, I really don't see the difference between "point.x, point.y" (object oriented) and "point_x, point_y" (functional style). The C language has the struct keyword to manage memory, and that I can understand. Computer memory in the old days was really expensive, and every little bit helped.

Think about it. Local variables can be easily implemented using a stack. That is, every time a function is called, you push its variables onto a stack, and every time you leave the function, you pop those variables from the stack.


  1. push (px)
  2. push (py)
  3. ...
  4. pop (py)
  5. pop (px)


When you have a long list of variables, this becomes unwieldy. Then you'd put in a special system to make it convenient for you:


  1. setmark (stack)
  2. push ("px",px)
  3. push ("py",py)
  4. ...
  5. poptilmark (stack)


And so on. There are many ways you can implement this. An array of pointers? A linked list? Preprocessing? There's no one right way. Now think about it. If you go from Pascal to BASIC, these become a burden. That will discourage people from learning new computer languages. However, if you go from BASIC to Pascal, then the fact that you don't have to do this "by hand" becomes a pleasure. This will encourage people to seek newer and better languages. Isn't the goal of lifelong learning a worthy one? I think so.
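The setmark/poptilmark scheme above can be sketched as a tiny class (hypothetical Python, with names borrowed from the pseudocode; an old flat BASIC would do the same thing with an array and an index):

```python
class LocalStack:
    """Hand-rolled 'local variables' on an explicit stack, as a flat
    language without scopes would force you to implement by hand."""
    def __init__(self):
        self._stack = []

    def setmark(self):
        # Sentinel marking the start of one function call's locals.
        self._stack.append(("__mark__", None))

    def push(self, name, value):
        self._stack.append((name, value))

    def poptilmark(self):
        """Pop saved locals back out until the last mark; return them."""
        restored = {}
        while self._stack:
            name, value = self._stack.pop()
            if name == "__mark__":
                break
            restored[name] = value
        return restored
```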

How many people have I seen put down BASIC because it doesn't have the convenient programming features that sophisticated languages have? Well, IMHO, if you can make it work, then it's not a language problem. It's a programmer's problem! Otherwise, we may as well remove Scratch and LOGO from the list of viable languages, as those are even more primitive than BASIC! You don't put down Scratch because it's easy to learn and use. Therefore, you don't do it to BASIC, either!

Another example regarding the person/computer issue. Take a look at this problem:


  1. if s>=91 and s<=100 then g="A"
  2. if s>=81 and s<=90 then g="B"
  3. if s>=71 and s<=80 then g="C"
  4. if s>=61 and s<=70 then g="D"
  5. if s>=0 and s<=60 then g="F"


Do you see the problem? What if the score has a value of 90.5? What's that? Illegal value, because the data is stored internally as an integer? Well, then, did you remember to round the values? Because if not, people will complain!

Notice that it is irrelevant which computer language you use. Some languages provide an if-else statement. It can be written as "else if", "elif", or "elseif". It doesn't matter. It's the thought process that is faulty. Look at this:


  1. g="F"
  2. if s>=60 then g="D"
  3. if s>=70 then g="C"
  4. if s>=80 then g="B"
  5. if s>=90 then g="A"


That way, it's a lot cleaner. So many people are busy learning the language, but not the algorithm. Learning a computer language is easy. It's the thinking process that is hard!

And let's talk about GOTO. It encourages "spaghetti code", as people would say. Is that so? If the design is clean, there will be no spaghetti code, regardless of GOTO statements. On the other hand, imagine doing some object-oriented work. Let's model a banking system. Imagine Customer, Bank Safe, Teller, ATM, and Manager interactions. 5 elements. Draw a line (with an arrow denoting direction) for each interaction.

Customer-Teller
Customer-ATM
Bank Safe-Teller
Bank Safe-Manager
Teller-Customer
Teller-Bank Safe
Teller-ATM
Teller-Manager
ATM-Teller
ATM-Manager
Manager-Bank Safe
Manager-Teller
Manager-ATM

As you draw on paper, check your work. Do any lines cross? If so, then it's a bit more complicated, isn't it? Then you need to reposition the elements to make it clearer. Now, this is just 5 elements, with arrows all over. Imagine it with a dozen or so objects. Can you still see it clearly just because it's "Object-Oriented"? I doubt it. There is a reason why flow charts fell out of favor. They're great for small systems, but expand them greatly and they become tough to follow. Ditto for any visual language.

Now, take another look at the chart you have. What happens if a customer tries to bypass the Teller and go directly to the Manager? Well, you need to take care of that situation, too. Add another line. You may decide to allow it, or you may decide to throw an error message. What happens to the ATM-Customer interaction? Another error message? Keep going until you have a five-element graph with lines all over the place. Impossible to avoid crossing the lines, isn't it? That, my friend, is spaghetti code. Object-orientedness is by no means the deciding factor of confusion. Simple, clean design matters more.

I also don't understand why Java/Python people take it as a point of pride that they do not use pointers. Sure, pointers can be difficult to get right, but what are you going to do? Learn how to use them properly, or discard the whole language? What kind of computer programmer are you?

Even a highly skilled, highly experienced programmer such as Mark Lutz, who wrote two excellent Python books, Learning Python and Programming Python, had trouble with Cartesian/polar coordinate conversion. Excuse me? That conversion is something I learned noodling around in BASIC for about half an hour. It's not the language/tool/car/money. It's the person! Which is better: that you do not use pointers and have trouble converting polar coordinates, or that you do use pointers and have no trouble converting polar coordinates?
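For reference, the Cartesian/polar conversion in question is only a couple of lines in any language (shown here as a hypothetical Python sketch; atan2 handles all four quadrants, which is the part people usually trip over):

```python
from math import atan2, cos, sin, hypot

def to_polar(x, y):
    """Cartesian (x, y) -> polar (radius, angle in radians)."""
    return hypot(x, y), atan2(y, x)

def to_cartesian(r, theta):
    """Polar (radius, angle in radians) -> Cartesian (x, y)."""
    return r * cos(theta), r * sin(theta)
```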

Yes, I had trouble using pointers. So much so that every time I used them, I'd draw little arrows on paper pointing here and there. It wasn't until I realised that pointers are references that the whole trouble disappeared. What are references used for? Variables, functions, strings, data structures. Any time you want to refer to something without actually having a copy, that is a reference. Which means that the Unix command "ln" is a pointer operation. Java's garbage collection is also pointer-heavy. Python objects are also pointer-heavy. The only difference is that those languages hide the operations away from the user.

And yet, referencing objects (aka pointers) is so useful that even the Excel spreadsheet has that capability. Not that it's in any tutorial, including advanced ones. I wonder why? The skill of referencing objects indirectly is an important one in the computer programming toolbox. Why won't people learn it? Why hobble themselves voluntarily?

Old-style BASIC, which is flat, is rather cumbersome to use. Sure, that's true. But a bad workman blames his tools. So what if the language is hard to use? You can write a programmer's journal to clarify the problem and solution. How many computer programmers today write such a journal? If you're talking about Python, then the answer is none. The source code is the document!

But that's not a good thing. You don't want to teach programmers via the source code. Imagine that the program maintainers cannot understand quicksort. Would they replace highly optimised quicksort code (such as GNU C qsort) with bubble sort? Very likely. Do you really want to teach these people what a quicksort is, right in the source code itself? Of course not! That would be a distraction! So, the way to bring beginner programmers up to speed is via external instruction. With quicksort, that's easy enough. Schooling and/or the Internet will take care of that.

Now replace the word "quicksort" with "bleeding edge technology (that nobody but you understand) that your company depends on its very own survival so that if it fails, you (along with everybody else) will be out of the job" and see if you're still willing to gamble that replacement coders will be able to understand the issue completely just by reading the source code.

And that's the value of a computer programmer's journal. Just as novel writers keep a source book detailing all the characters, locations, and other tidbits, so should computer programmers. This isn't being done. Why not? It's a necessary component of computer programming.

All the greatest people I know have the ability to admit that they are wrong. Therefore, the more stubborn they are in insisting that they are right, the less I think of them. With computers, there is no ambiguity: If you're wrong, you're wrong. If the computer is wrong (buggy), you have to work around it. Either way, you learn, adapt, and move on.

So many people complain that their language isn't "powerful" enough. I know many programmers who complained that BASIC doesn't do recursion. I was once among them. Not anymore. Not since learning assembly language. The difference is that before, I didn't know what to do. Now, I do. There are many computer programmers who'd argue that they don't need to know such details because the language takes care of them. Considering that many highly skilled and experienced computer programmers spend their time removing recursion from quicksort just to get a fractional increase in speed, I'm afraid I can't agree with that sentiment. Yes, you do need to know how to do it!

If you don't care about speed, as none of the Java/Python people do, then by all means ignore that. Ignore everything else, too. Programmer's journal? Program design? Automated testing? Whatever for? "It works on my machine" is a good enough excuse!

In my opinion, you need to learn how to design your program from the ground up. Once you do, then you need to learn how to implement it efficiently. Then, you need to learn how to test it properly. And finally, you need to learn how to document the process well enough, so that other people can benefit from your work.

So, back to the point. Why BASIC? Why not C/C++/C#, Java/JavaScript, Perl/Python/Ruby, etc? Why not use the tool highly skilled, highly experienced professionals use? Well, because they are highly skilled, highly experienced professionals. They have the skills to learn complicated subjects quickly and effortlessly. Can beginners do that? Some of them can't even count properly and you want them to use highly complicated tools? What were you thinking?

Let me put it this way: Do you want beginners to learn a language that is not only so easy it can be learned in an afternoon, but is also custom made for the job, so far as to have the word "Beginner" in its acronym? Or do you want to shove at them the language on which Mark Lutz spent 3,000 pages full of dry technical info?

Those 3,000 pages are equivalent to 6 months of learning. Do you want to spend that much time learning the language? Or do you want to spend the time learning computer programming design and algorithms? I rest my case.



  • Note that there are good alternatives to BASIC programming language. I mentioned Scratch and LOGO. Processing is a good language as well, one that I recommend over Java.

Thursday, March 27, 2014

Book Review #29


Book #29 The Total Outdoorsman Manual
T. Edward Nickens et al

Published by Field and Stream, it is indeed an excellent book for the outdoorsman. The book is filled with various tips for the outdoors, including hunting, fishing, camping, and survival. The hunting section is further divided by animal. Dressing and cooking are explained as well. Priced at only $25 yet filled with almost 400 picture-filled pages, this represents an excellent deal.

I'm really impressed with the wide variety of content. If there's anything I'd like to see more of, it's camping; that section is rather minimal. Then again, Field & Stream means hunting and fishing. The camping material is only enough to get you going and taken care of; the priority is certainly getting your animal bagged and back to camp. In that view, it succeeds admirably.

If you are a hunter, fisherman, or just someone who is contemplating whether or not to venture into the great outdoors, then this is a book worth reading. Recommended.

Thursday, March 20, 2014

Book Review #28


Learn to Play the Piano and Keyboard
Nick Freeth

Another book by Parragon Publishing, and again the author's name is hard to find. Unfortunately, unlike the sushi book, this one is about piano playing, which is a rather involved activity. Sure, there are some good techniques explained here. Plenty of good pictures, too. But going from simple keyboarding to The Entertainer simply requires more practice than the pages provide. Worse, there's no chord chart at the end of the book that the student can refer to. This book is not a tutorial book, or even an introductory one. It's a piano user guide. Here is the white key. Here is the black key, and here is the pedal. And not much after that. Not recommended.

Thursday, March 13, 2014

Book Review #27


Home Buying for Dummies
Eric Tyson and Ray Brown

This book presents a clear vision for the first-time home buyer. Starting with "which one is more expensive, rent or buy?", the authors give clear examples of the advantages of renting vs. buying. Both views are fairly presented so that you can make a good decision. Of course, the book then proceeds on the assumption that you're buying your house.

A workbook is provided to tell you exactly what you can afford, followed by mortgages, insurance, and other headaches. In chapter 9, the book advises hiring an agent, even if you find the home yourself. Luckily for me, I did find a good agent, and so I can recommend that you do so as well. You still need to do your homework, and find out the price range, neighborhood, and type of home you're looking for.

The last part of the book deals with forms. I can tell you that going through the process, there's quite a bit more paperwork than that. In the end, my agent got me a special loan processor, who was worth her fee, because the paperwork piles up into something as big and thick as a dictionary. I lost count of the number of signatures I had to sign.

If you ever wonder whether or not you should stop renting and buy a house for yourself, reading this book will get you ahead in your decision. Recommended.

Monday, March 10, 2014

Cooking Journal #29


Cooking #29 Breakfast Sushi



Well, well. This isn't how it's done! There's an art to making a sushi omelette and obviously, this is not it. In fact, I simply fried the eggs while rolling the sushi. Notice there's no wasabi involved. I use sriracha!

Next time I do this, I'll do scattered sushi. The nori can be cut up into little strips and used as a sprinkled garnish.

Thursday, March 6, 2014

Book Review #26


Sushi with Style
Ellen Brown

I think I have too many sushi books when I start buying them even though they're written by non-Japanese authors. Then again, as long as the information presented is good, that's good enough for me. I think there must be something alluring about what is essentially a piece of raw meat stacked on top of rice.

The simpleton in me rebelled at the list of ingredients. Rice, vinegar, soy sauce, hot sauce, and maybe pickled ginger is what I use for my sushi. Actually, soy sauce and hot sauce are optional, since I just use Sriracha. Of course, mine is not "authentic" Japanese sushi; no sushi that does not use authentic Japanese ingredients can be called authentic. Still, that just means that sushi recipes are pretty easy to do. It's hard to make a mistake when there are only a few ingredients, most of them uncooked.

I got the book in a kit, with a bamboo mat and chopsticks. Since I mostly make scattered sushi (in a bowl), as opposed to rolled sushi, the bamboo mat is mostly unused. Still, the recipes here are well done. Even if the illustrations are sometimes lacking, there are enough recipes there to try and experiment with. Some of the recipes employ easily found ingredients, and that's a plus.

If you ever wanted to know about sushi but don't know where to start, this book and a sushi kit should get you going just fine.

Tuesday, March 4, 2014

Raspberry Pi Journal #52



Spell Checker



Perhaps it may surprise you that the Raspberry Pi comes with built-in spell checkers: aspell and ispell.

The standard word list is located in /usr/share/dict/

However, aspell has its own directory: /usr/lib/aspell

The standard aspell mode is interactive spell checking. There is also a non-interactive method, which lets you dump all the misspelled words at once.


  1. cat sample.txt | aspell list


list is a command that basically prints misspelled words coming in from standard input.
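As a sketch of the non-interactive method (assuming aspell and an English dictionary are installed; the sample file and its contents are made up here):

```shell
# Non-interactive spell check: dump misspelled words, one per line.
# Guarded, since aspell and its dictionary may not be installed everywhere.
if command -v aspell >/dev/null 2>&1; then
    echo "Thiss sentense has erors." > /tmp/sample.txt
    aspell list < /tmp/sample.txt    # prints each misspelled word on its own line
    rm /tmp/sample.txt
fi
```

Piping the output through sort -u de-duplicates repeated misspellings.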

Monday, March 3, 2014

cooking journal #28


Cooking #28 Ice Cream Sandwich



Ha ha. What do you know? Ice cream and bread makes a nice pairing! Don't you just love chocolate mint ice cream on bread?

This brings back memories, as one of my childhood experiences. Simple, yet pleasant. Who needs fruit jams when you can have ice cream?

Thursday, February 27, 2014

Book Review #25


Damn! Why Didn't I Write That?
Marc McCutcheon

This is mostly an inspirational book. There are tips and tricks along the way, but mostly it's a case of "this is how I did it, and how you can do it, too."

That, in itself, is not bad, depending on the execution. How is the execution? Very good, apparently. There are enough outlines in there to list the steps necessary for writing, researching, editing, dealing with publishers, and so on. There isn't a single checklist or form for the steps, but all the steps are there. I suggest that you take a piece of paper and write down the steps to publication yourself. Then hang that paper on the wall, where you can look at it, and do it.

The author does not hold back any secrets, save for actual writing samples. However, he does reference enough published books that you can go to the bookstore and read them for yourself. And the amazing thing is, most of the writing is rather normal. Certainly, there's nothing wrong with it. But there's nothing so brilliant about it either. Mostly, it flows, like a good presenter would. Nothing earth-shattering. Just gentle words that flow.

And that is why I think I can do it, and so can you. If you are able to explain something clearly, and effortlessly, then you just may have a career as a writer. This book certainly inspired me to try. Highly recommended.

Wednesday, February 26, 2014

Musing Journal #8


On Being a Writer

So many people want to be a writer. Why not? After all, all you have to do is sit in a coffee shop every day for a couple of hours writing, and you're a writer. But is that a professional writer or a hobbyist?

Stephen King wrote that he maintains a four-page-per-day workload. Sometimes it comes easily for him. Other times, not so easy. But he wrote those 4 pages every day.

Considering standard submission formatting, where each page counts for 250 words, that means professional writers write 1000 words per day. Every day.

One week, then, equals 7000 words. That's enough for a whole chapter (or two). One chapter per week means 50 chapters per year. Or about 2 books per year. 350,000 words per year or 175,000 words per book.

Of course, that's assuming a dictionary-sized book. Normal book length is only about 100,000 words, give or take. So, if you are a real "professional" book writer, then you should be publishing 3 books per year.

That is what doing 1000 words per day will give you.
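The arithmetic above can be checked in one line (assuming a 50-week writing year, which is what the 350,000 total implies):

```shell
# 4 manuscript pages/day at ~250 words/page, 7 days/week, 50 weeks/year
awk 'BEGIN {
    per_day  = 4 * 250          # words per day
    per_week = per_day * 7      # words per week
    per_year = per_week * 50    # words per year
    print per_day, per_week, per_year
}'
# prints: 1000 7000 350000
```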

Consider that. 1000 words per day. Is that so hard? If you have a job, and other responsibilities in life, then yes, it's hard. There are people who just cannot do that much work per day. However, there is a difference between a dedicated writer and a hobbyist. A dedicated writer will write, at the very least, one page per day. And at the end of the year, they will have the next great American novel. Think about it: one page a day, 250 words per page, and you're a writer.

The most important thing is to make a schedule and keep at it. If you're doing it every day, fulfilling the minimum required word count every day, then you are a writer. If, on the other hand, you like to just chat, browse the internet, and maybe write something, then I'm afraid you're not a writer.

Unless you are keeping up a schedule, you're not a writer. A true writer will keep up his schedule every day. And to that end, I recommend Nanowrimo for everyone who dreams of being a writer.

Nanowrimo basically forces you to write an average of 1667 words per day. This, in a month where you are guaranteed to be busy dealing with holiday activities. Having participated twice, I can tell you that it is not at all easy. Sometimes, I just have to force things out.

The neat thing about it is that at the end of the month, when I fall back to normal workload, my normal workload seems easy in comparison! So, yes, Nanowrimo was painful, but it's a good pain. Growing pains. Once I went through the exercise, my productivity improved. And so, I will keep on participating. I like to improve my productivity. And I am keeping up with my schedule. I am a writer!

Note: I have written 1896 words today, not including blog updates. :)

Tuesday, February 25, 2014

Raspberry Pi Journal #51



Screen Capture



I'm using scrot as my screen capture program. Just real quick, to use scrot with 8 seconds delay:


  1. scrot -cd 8 filename.jpg


and to capture only the current window, use


  1. scrot -cud 8 filename.jpg


Then you get a JPEG picture with that filename.

Monday, February 24, 2014

Cooking Journal #27



Bento Box


Yup. My lunchbox! The rice ended up too dry after a while. Perhaps I should have covered it with a moist towel? Or maybe I should have used more water when cooking the rice? Oh, well. Next time I do this, it'll be scattered sushi for sure!

Thursday, February 20, 2014

Book Review #24


The Hobbit
J.R.R. Tolkien

"In a hole in the ground there lived a hobbit."

Reading that sentence does not in any way prepare me for the wonderful world that J.R.R. Tolkien sets up for his story. And yet, that magnificent world is far more present and impressive in his Lord of the Rings trilogy. The Hobbit is a prequel of sorts to that one magnificent trilogy. In fact, there are two movie series that I consider must-sees for nerds: the Star Wars and Lord of the Rings trilogies. The Hobbit is part of that world, but there's no magic in it.

For sure there are dwarves, elves, and goblins. Of course, Gandalf the Wizard makes an appearance. Is that not magic? Yes, but not of the right kind. Let me explain. The magic here can be considered special effects, something to dazzle the readers. However, in the Lord of the Rings trilogy, the magic is more of an "advanced science" that is used in daily life.

In terms of adventure, this one is full of it: more troubles than you can shake a stick at. In terms of heroic deeds, there's plenty. In terms of storytelling, this ranks very high indeed. But true magic is lacking, and that is a pity. I think the best part comes early in the story, where Bilbo Baggins and Gollum duel in riddles. The rest must follow, and though there are heroic deeds, that is more of a storytelling adventure than character development.

Plenty of cleverness abounds, and this is in no way a bad book. In fact, it is an excellent book. It's just that only in comparison with those other masterpieces can this one be seen as anything less than perfect. Recommended.

Wednesday, February 19, 2014

Musing Journal #7


The Tale of Two Bikes

So, I currently have two bikes. Which one do I like better? Well, I like them both equally. It's just too bad that the older bike is falling apart.

The first bike cost me $10. I know that's the price because that's the sticker I have on the bike. It's a Tourney 920se 10-speed road bike. Or hybrid bike, since it has a straight bar instead of a drop bar. It certainly has road tires.

When I took that bike around the Aurora Reservoir pathways, I had to build stamina, and the first few times around I had to stop at various points along the path. I did finally manage to loop the reservoir without stopping. It took about half an hour, or about a 16 mph average speed.

With my second bike, a Schwinn Frontier MTB 21-speed, I didn't have to stop. I can climb the hills just fine by dropping down to the lowest gear. My legs do get exercised rather severely, and near the end I wasn't so much pedaling as kicking my way up the hill. But I didn't have to worry about building my stamina up. The bike has enough gears to handle moderate hills.

So, is my second bike better? Come to think of it, my mountain bike feels slower than my road bike, and this is confirmed on flat road. On flat terrain, I think I'd enjoy my road bike more, even as old and heavy as it is. On hills, though, I need my gears, and the mountain bike handles those well enough.

So, I like them equally well for different reasons. Come to think of it, maybe a hybrid is better. Since the mountain bike has better gears and the road bike has better tires, how about if I put road tires on my mountain bike? Hmmmm.

Tuesday, February 18, 2014

Raspberry Pi Journal #50


Tape Archive Compression



There's a built-in program to provide backups, or archives. It's called tar, or Tape ARchive. You can read more about it in the man pages. I'm interested, at this point, in the execution time. As you know, tar does not provide compression by default. You can activate compression by including the -z option. There is no question that it works. The question is, how fast is it?


  1. time tar -c VID0000*.AVI >vid.tar
  2. real 0m1.903s
  3. user 0m0.040s
  4. sys 0m1.590s



  1. time tar -cz VID0000*.AVI >vid.tgz
  2. real 1m8.890s
  3. user 1m4.000s
  4. sys 0m2.890s


As you can see, the original operation is just a couple of seconds long. The compressed option, however, took over a minute. That's an enormous difference! Let's try a pipe through the gzip command.


  1. time tar -c VID0000*.AVI | gzip - >vid.tar.gz
  2. real 1m19.526s
  3. user 1m9.580s
  4. sys 0m3.320s


The pipe takes even longer. So, let's see if we can bring the time down to somewhere between the uncompressed run and the original compressed run.


  1. time tar -c VID0000*.AVI | ssh pi@remotepi gzip - >video.tgz
  2. pi@cloudypi's password: enter password
  3. real 12m54.990s
  4. user 0m16.460s
  5. sys 0m9.620s


So, there's a trick to it. I'm using a pipe to send the data to a remote pi on the network. The remote pi then compresses the data and sends it back to my local pi. As you can see, the time went through the roof. Obviously, the network bandwidth is the bottleneck. On the plus side, my CPU utilization rate is near zero, which is nice since I can watch anime while it's compressing. But that's beside the point, since I could just use the "nice" command to lower the job's priority (and "&" to put it in the background) anyway.

Oh, well. Some things just aren't worth it.
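As a sketch of that last idea, here is low-priority background compression, with demo files standing in for the VID0000*.AVI clips from the post:

```shell
# Create small demo files in place of the video clips
mkdir -p /tmp/tar_demo && cd /tmp/tar_demo
printf 'frame data'  > VID00001.AVI
printf 'more frames' > VID00002.AVI

# nice lowers the CPU priority; & runs the job in the background
nice -n 19 tar -czf vid.tgz VID0000*.AVI &
wait                      # block until the background archive finishes
tar -tzf vid.tgz          # list the archive contents to confirm it worked
```

That way the compression barely competes with the foreground work, at the cost of taking longer overall.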

Monday, February 17, 2014

Cooking Journal #26


Peasant Bread



So, after the dehydrator experiment failure, I decided to do another experiment. This time, I bought myself a bread maker. The first two experiments were, to put it mildly, total disasters. The first one obviously lacked water. The second one had too much.

The third experiment was successful. I bought a bread machine recipe book, and in the introduction, it mentioned how, unlike handmade bread, a bread machine demands accuracy in ingredient measurements. Well, now I know!

I picked the simplest recipe in that book, and that's peasant bread: 1 part water, 3 parts flour, 2 spoons sugar, 1 spoon salt (I used a leftover ramen salt packet!), and 1 packet of yeast. I noticed a bit of dryness when mixing the dough, so I added 2 spoonfuls of water at a time until I judged that the dough was fine. I think it ended up a little more moist than I anticipated, but that's okay.

The end result is terrific! Once I started eating the bread, I couldn't stop! As you can see from the picture, I ate the bread in one go! Well done!

Thursday, February 13, 2014

Book Review #23


Modernist Cuisine at Home
Nathan Myhrvold et al

If you have ever seen the book Modernist Cuisine in a book store, then you know that the package isn't shaped like a book, but more like a big block of wood. And it's just as heavy. If the $450 purchase price does not scare you, you will find that it is the best, most comprehensive cooking book anywhere in the world. However, the problem is that the equipment list is so long that most people would have no use for it. This version of the book (the "at Home" edition) I bought at 50% off during a New Year sale.

And it is wonderful. The equipment list is also long, and the fact is, there are recipes in there that I will never ever cook. But that's not the aim of the book. The purpose of the book is to understand how the food cooks, and that's why there are illustrations of how the cooking process happens inside the pot. By the way, these illustrations aren't hand-drawn. They're photos. That's right: you can actually see how the food cooks inside the pot, because they cut the pot in half! That's no mean feat when they're illustrating sous vide cooking!

A lot of the cooking process can be summarized as: cook low and slow. This holds true especially for sous vide (under vacuum). However, the result is worth it. Unfortunately, precision is essential, so be prepared to invest in more hardware that's used only for cooking. It wasn't until after reading this book that I was persuaded to buy a kitchen timer and thermometer, for example. Also, some of the hardware is industrial-looking. Did you know they use a special sprayer for omelettes? Sigh.

Anyway, the title may say "at Home", and for the most part, that's true. However, it's still too much for me. Is the book worth the purchase price? Production quality is top-notch, easily the best I've seen. Considering that I got it for 50% off, the answer is "Absolutely!" If you have to pay full price, then I can understand your reluctance. The best thing you can do is read it. Even if you're not a cooking aficionado, this book may just turn you into one. It did for me. :)

Wednesday, February 12, 2014

Musing Journal #6


Price of Electric Bike

So, I noticed that Walmart is selling a cheap electric bike for about $600. Amazon also sells some, slightly cheaper, but about the same price. However, I also noticed that electric bike conversion kits are priced at $400 something. The question is, if you take an electric bike and subtract the cost of the conversion, will you arrive at the price of the bike?

I think not. Assembling the bikes does cost money, but adding a few more parts at the factory isn't going to cost the same as aftermarket additions. So, you probably are going to get a better bike for the price.

And yet, I wonder if that's a wise decision. First of all, on a cheap bike, heaviness isn't a problem. Whatever energy drain the bike has is counteracted by the electric power. Even the smallest pedal-assist level will be enough to counter the drag. So, about the only concern is to have a bike that is strong enough to last a long time.

As I wandered around the REI store, however, I found a sample electric bike. I picked it up and noticed that the whole bike with battery weighs less than my current bike with no cargo! Huh, I didn't think that was possible. I also noticed that the price tag on the bike was $1700! At that price level, I'm probably getting a $1000 bike, which would be really sweet indeed. But I don't need a $1000 bike. I'm perfectly happy with my current heavy mountain bike. With an electric motor, who cares about the weight?

So, figure $400 bike plus $400 conversion kit. That's about $800, and that's my current limit for an electric bike. Anything above $1000, and I'd rather get a scooter.

Tuesday, February 11, 2014

Raspberry Pi Journal #49



SD Card Formatting



There's quite a bit of discussion on how to format an SD card. The easy way is to just use the gparted program. It's point-and-click easy. However, what if you want to do it automatically? Use the *sfdisk* program.

Once you've prepped the SD card using gparted, use the sfdisk program to dump the partition info out to a file. This is what I did for my 4 GB FAT32 card:


  • sudo sfdisk -d /dev/sda > SD4GFAT32.sdmap
  • cat SD4GFAT32.sdmap 
  • # partition table of /dev/sda
  • unit: sectors

  • /dev/sda1 : start=    16384, size=  7684096, Id= b
  • /dev/sda2 : start=        0, size=        0, Id= 0
  • /dev/sda3 : start=        0, size=        0, Id= 0
  • /dev/sda4 : start=        0, size=        0, Id= 0




So that's the file and its contents. Now, suppose you redid the whole partition layout using gparted into something like this:


  • sudo sfdisk -d /dev/sda 
  • Warning: extended partition does not start at a cylinder boundary.
  • DOS and Linux will interpret the contents differently.
  • # partition table of /dev/sda
  • unit: sectors

  • /dev/sda1 : start=     2048, size=  4319232, Id= b
  • /dev/sda2 : start=  4352000, size=  3356672, Id= 5
  • /dev/sda3 : start=        0, size=        0, Id= 0
  • /dev/sda4 : start=        0, size=        0, Id= 0



You can use sfdisk to restore it to its former partition layout, thus reformatting the SD card.


  • sudo sfdisk /dev/sda < SD4GFAT32.sdmap 
  • Checking that no-one is using this disk right now ...
  • OK

  • Disk /dev/sda: 1020 cylinders, 122 heads, 62 sectors/track
  • Warning: extended partition does not start at a cylinder boundary.
  • DOS and Linux will interpret the contents differently.
  • Old situation:
  • Units = cylinders of 3872768 bytes, blocks of 1024 bytes, counting from 0

  •    Device Boot Start     End   #cyls    #blocks   Id  System
  • /dev/sda1          0+    571-    572-   2159616    b  W95 FAT32
  • /dev/sda2        575+   1019-    444-   1678336    5  Extended
  • /dev/sda3          0       -       0          0    0  Empty
  • /dev/sda4          0       -       0          0    0  Empty
  • New situation:
  • Warning: The partition table looks like it was made
  •   for C/H/S=*/6/18 (instead of 1020/122/62).
  • For this listing I'll assume that geometry.
  • Units = sectors of 512 bytes, counting from 0

  •    Device Boot    Start       End   #sectors  Id  System
  • /dev/sda1         16384   7700479    7684096   b  W95 FAT32
  • start: (c,h,s) expected (151,4,5) found (2,20,17)
  • end: (c,h,s) expected (1023,5,18) found (1018,5,18)
  • /dev/sda2             0         -          0   0  Empty
  • /dev/sda3             0         -          0   0  Empty
  • /dev/sda4             0         -          0   0  Empty
  • Warning: partition 1 does not start at a cylinder boundary
  • Warning: partition 1 does not end at a cylinder boundary
  • Warning: no primary partition is marked bootable (active)
  • This does not matter for LILO, but the DOS MBR will not boot this disk.
  • Successfully wrote the new partition table

  • Re-reading the partition table ...

  • If you created or changed a DOS partition, /dev/foo7, say, then use dd(1)
  • to zero the first 512 bytes:  dd if=/dev/zero of=/dev/foo7 bs=512 count=1
  • (See fdisk(8).)




Just to make sure, here's the command again.


  • sudo sfdisk -d /dev/sda 
  • # partition table of /dev/sda
  • unit: sectors

  • /dev/sda1 : start=    16384, size=  7684096, Id= b
  • /dev/sda2 : start=        0, size=        0, Id= 0
  • /dev/sda3 : start=        0, size=        0, Id= 0
  • /dev/sda4 : start=        0, size=        0, Id= 0


One more thing. If you want to reformat the partition to FAT32 filesystem, you can do this:


  1. sudo mkfs -t vfat /dev/sda1


Yes, you need to format each partition separately, and that means you can have a different file system per partition. Also, vfat here gives you a FAT file system (FAT32, for a partition this size); you can pass a different -t value to use something else entirely.


Monday, February 10, 2014

Cooking Journal #25


Dried Kale Chips



If you remember the dehydrator I got a while back, it was not successful. Well, it was, but too high-maintenance for me. Here is the commercial version of kale chips. What may be surprising is how expensive these things are. Definitely not for everyday consumption. Okay for an occasional snack, though.

Another thing that I find objectionable is that there is too much salt. Okay, so I prefer things "pure" without any spices added, but this is clearly too much!

The weight is very surprising: these are almost weightless! The chips crumble very easily, so I'm guessing they really did evaporate the water completely!

You may want to try these out, but they're too salty for me! I much prefer the homemade ones. Maybe one of these days, I'll try making them in a regular oven.

Thursday, February 6, 2014

Book Review #22


Book #22 Programming Linux Games
John R. Hall

Programming Linux Games is an old, old book. In fact, it's obsolete. The reason is that SDL has evolved into a better API, and some of the functions are deprecated. In addition, it uses the plain old C language, which, although a great language, has been superseded by C++ and Java, among other languages. Still, there's enough similarity between this book and current SDL that it's a great introduction to the process of SDL programming.

One of my most frustrating endeavors is finding the proper answer to one question about any graphics library: "How do you draw a point on screen?"

That one very simple question frequently goes unanswered in a great many graphics programming books. I know that's silly, but there you go. This book answers the question right away: using a surface.

"Each SDL_Surface structure contains a pixels member. This is a void * to the raw image, and we can write to it directly if we know the type of pixel that the surface is set up for."

And upon those two sentences, I bought the whole book!

It's a good book, marred only by the fact that it's so old it's obsolete. However, reading this book and then the SDL man pages is infinitely preferable to going straight in without guidance. The explanations are solid, and adapting the code to a newer system is not a problem.

FYI, the ever popular PyGame extension for Python uses SDL as its base. SDL is a good graphics library to learn, especially if you're interested in harnessing the speed and efficiency of the C language.

For those of you still stuck with C programming, give this book a read. You just may like what you see.

Wednesday, February 5, 2014

Musing Journal #5


Electric Bike Cost

If you have been following my posts, you know that I have been smitten by electric bikes. No wonder. They are clean, light, simple, and really a good transportation mode all around. The fact that I have a bike path from the house really does seal the deal. There is only one problem: the cost.

Now, I can go buy myself a cute little red scooter for about $2000. A used one can be had for $1000. So, I really don't want to spend more than $1000 on an electric bike. A scooter can get you to the places a bike can, faster and more safely. Of course, there is the cost of ownership. Scooters cost more to insure and maintain. But let's face it, at an effective 100 mpg, it just doesn't cost all that much.

But how about electric bikes? You can always pedal, yes, but with a regular bike, that's free. How about the cost of an electric bike? There have been a lot of calculations thrown about saying an electric bike gets the equivalent of 1000 mpg! Some people have even attached the battery charger to solar cells; call that a million mpg. How can a scooter beat that?

I'm not interested in calculating pedal power into the equation. So, first, I take the motor-only range. Then I take the cost of a replacement battery. Finally, I take the number of recharges the battery can take before losing capacity. So, the true cost of an electric bike is: price of batteries / (miles per charge * number of charges). That gives you dollars per mile. A simple price check gives you the dollars per gallon of gasoline. Dividing the two gives you a miles-per-gallon equivalent.
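Plugging hypothetical numbers into that formula (a $500 battery pack, 20 miles per charge, 500 charge cycles, $3.50 per gallon of gasoline; all four figures are made up for illustration):

```shell
awk 'BEGIN {
    cost_per_mile = 500 / (20 * 500)   # battery price / total miles = $0.05 per mile
    mpg = 3.50 / cost_per_mile         # gas price / cost per mile
    print mpg
}'
# prints: 70
```

which lands in the same neighborhood as my own estimate below.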

In my calculation, I get about 80 mpg for electric bikes. So, that's slightly less than a scooter. Cost-wise, I consider scooters and electric bikes the same. Perhaps I will change my mind later, once battery technology has improved and gasoline costs more than it does now. However, that's some time in the future, and for now, I will not buy an electric bike that costs more than $1000. I'd rather have a scooter!

Tuesday, February 4, 2014

Raspberry Pi Journal #48


SD Back up time


How long does it take to back up a 16 GB SD card, anyway? Using my PNY 16 GB SD card, this is the time it took to fully back up the card using noob clone.

  • real 113m8.210s
  • user 2m21.400s
  • sys 24m42.570s

Or about 113 minutes of wall-clock time. That's pretty fast.
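Only the "real" line is elapsed wall-clock time; user and sys measure CPU time, so they shouldn't be added to it. The conversion is simply:

```shell
# 113m8.210s of wall-clock time, expressed in minutes
awk 'BEGIN { printf "%.1f minutes\n", 113 + 8.210/60 }'
# prints: 113.1 minutes
```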


Monday, February 3, 2014

Cooking Journal #24


Scrambled Egg


Nothing special, really. I got a jar of mushrooms and 3 eggs. Drop the mushrooms in a pan. Cook until warm. Break the eggs into a bowl. Stir. Drop in pan. Cover. Cook until done.

About the only thing I don't like is that I have to clean that extra bowl. Lazy, eh? But little things do add up!

Thursday, January 30, 2014

Book Review #21


Japanese Cooking
Lulu Grimes

First of all, I'm not even sure "Lulu Grimes" is the author of the book. There's no name listed on the cover. The publisher is Parragon Publishing in the UK. Printed in China. Produced by the Bridgewater Book Company, Ltd., with Home Economist Kate Moseley. I really don't know what to think of it. The photography, by David Jordan, seems good enough, but there are no blurbs or anything.

Each recipe follows a simple procedure:
1. List the ingredients
2. Show one picture of the process
3. Show the text of the preparation process
4. Show a big picture of the finished product

In some cases, the process picture just shows a knife cutting meat, or ingredients being dumped into a pan. So, not much there. As for the Japanese cooking itself, the recipes mostly use pure ingredients. In some cases: one short ingredient list, one short set of preparation steps, done.

That's my kind of cooking. It annoys me, though, that I can't find the name of the author of this book. Still, if simply prepared Japanese cooking is what you want, then this book is ideal.

By the way, if you haven't done so, try cutting meat into tiny little slices, and use chopsticks to drop them into hot oil in a pan. Wait until cooked, and fish them out with the chopsticks. Put them on a plate. Simple. Quick. Convenient. Done.

Wednesday, January 29, 2014

Musing Journal #4


Plan, Do, Learn

For the longest time, I have always worked with Learn, Plan, Do. This is because I value my time, and by properly learning the materials, I am able to properly plan the activity, so that when it comes to actually doing it, I have no trouble at all.

This has served me well throughout the years. Whether it's engaging in a project or writing term papers, a proper outline has been essential to getting things done quickly.

Lately, though, I have had second thoughts about the process. Probably because I'm being overwhelmed right now. There's so much to do that if I had to fully learn everything first, a lot of things wouldn't get done quickly. I suppose there's a good side to the "do one thing at a time" philosophy; however, I wonder if that's a good thing when there's so much to do at once. In essence, I wonder if there's a faster way to do things than one at a time.

So, I have been switching things around. Instead of the Learn-Plan-Do cycle, I decided on a Plan-Do-Learn cycle: make a plan, do the plan, and adjust accordingly. So far, it has been working well. Things are getting done, and more importantly, I am able to manage various projects at once. I still limit myself to doing two or three big things per day, and I focus on the most important thing to do that day. But as the weeks go by, I am able to see improvements in all aspects of my life.

If you look at my blog, you see various things going on. That's because I have wide-ranging interests, and if I focused on one thing at a time, there would be holes everywhere else. So, I have no choice. I have to spread my activities around, regardless of how I feel about it.

And that's a good thing. The ability to learn from doing does have value. I keep changing things, and I see definite improvements in each separate activity. Mind you, I'm still overwhelmed. However, I manage things much better now. I am now able to see progress on my schedule.

So, all I have to do now is keep improving, and I will catch up with the rest of the schedule eventually. It is taking longer than expected, but steady progress is better than heroic action, especially since I have a tendency to burn out after such heroics.

Tuesday, January 28, 2014

Raspberry Pi Journal #47


SD card capacity


So I was wondering why my dd command always fails with an insufficient disk space message. It turns out that sizes differ between SD card brands. Here is the size of my PNY 16 GB card:


sudo fdisk -l /dev/sda



Disk /dev/sda: 16.0 GB, 16012804096 bytes
255 heads, 63 sectors/track, 1946 cylinders, total 31275008 sectors
Units = sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x00000000

   Device Boot      Start         End      Blocks   Id  System
/dev/sda1            8192    31275007    15633408    c  W95 FAT32 (LBA)



And here is the size of my generic store-brand 16 GB card:


sudo fdisk -l /dev/sda
Warning: ignoring extra data in partition table 5

Disk /dev/sda: 16.0 GB, 16009658368 bytes
4 heads, 16 sectors/track, 488576 cylinders, total 31268864 sectors
Units = sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes
Disk identifier: 0x00048a81

   Device Boot      Start         End      Blocks   Id  System
/dev/sda1            2048     2324218     1161085+   e  W95 FAT16 (LBA)
/dev/sda2         2326528    31268863    14471168   85  Linux extended
/dev/sda5         2334720     2449407       57344    c  W95 FAT32 (LBA)



As you can see, the PNY SD card holds 16012804096 bytes, compared to 16009658368 for the generic one. That's a difference of 3145728 bytes, and the PNY image is too large to fit on the generic card. No wonder the dd command always fails. Well, there are two ways about it:


  1. Buy identical PNY SD cards
  2. Migrate the whole thing to a smaller, generic SD card.


Option one is easy, but depends on the availability of such cards. Option two is, frankly, a pain at this point. I have too much stuff in there already.

Option 3, which is to build another card and copy the files between them, somehow does not work because the keyboard stopped working. No, I don't know why, but that is why it's off the table.
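One thing I can at least do is make dd fail fast with a clear message instead of running out of space partway through. Here is a minimal sketch of such a size guard. The two byte counts are taken from the fdisk listings above; on a live system you'd read them with something like blockdev --getsize64 /dev/sdX (the device names here are hypothetical).

```shell
#!/bin/sh
# Guard a dd copy: refuse to start if the source image is bigger
# than the destination device.

fits() {
  # usage: fits SRC_BYTES DST_BYTES -> true if the image fits
  [ "$2" -ge "$1" ]
}

SRC_BYTES=16012804096   # PNY card, from the fdisk output above
DST_BYTES=16009658368   # generic card, from the fdisk output above

if fits "$SRC_BYTES" "$DST_BYTES"; then
  echo "image fits, safe to dd"
else
  echo "image is $((SRC_BYTES - DST_BYTES)) bytes too large"
fi
# prints: image is 3145728 bytes too large
```

In real use, you would put the dd invocation inside the first branch, so the copy only starts when it can actually finish.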

Monday, January 27, 2014

Cooking Journal #23

Frozen Pizza

So, I bought a few pizzas from the local supermarket. Buying in a batch of 5, they cost $2 each. Quite a good deal, if I may say so myself. A $2 meal is nice, but the large pizza means that I eat only half. The other half goes into the refrigerator for another meal. So, the cost per meal is only $1.

I find it hard to believe that people would order a pizza delivery. I can understand if you're travelling and have no means to cook good food. I can also understand if you're having a party, or maybe you want more than one pizza. But if all you want is a pizza, a frozen pizza cooks in about 20 minutes at 400 degrees, or about the same wait as delivery.

Pizza quality, of course, is less than gourmet pizza. Pizza Hut is great. So is Papa John's. However, the quality of frozen pizza is not bad. There's nothing wrong with stocking your freezer with them.

Take a simple pepperoni pizza. Maybe you think it's rather bland. So? Just add more toppings as you like. There's nothing wrong with customizing the toppings. You may need to experiment a little with the cooking time to accommodate opening the oven, but I don't think that's too difficult to do.


Thursday, January 23, 2014

Book Review #20


Chinese Cooking
Rose Cheng and Michelle Morris

I like this book because it opens with a short cultural lesson on Chinese cooking that is frequently missing in other cookbooks, yet is short enough to keep the focus on cooking, without being wordy or irrelevant. A philosophical overview may be irrelevant to some, but I find that as I experiment with the recipes, it provides the guidance I need.

The tool section is short and to the point. Normally, a cookbook would specify so many pots and pans, and so many utensils. This one specifies a few multi-purpose tools. I really appreciate that.

The problem comes with the ingredients. Chinese ingredients are unique to Chinese culture, and they're kind of hard to get in rural America. There's a big picture that helpfully shows the different kinds of ingredients desired, but unless there's a good Asian store in your neighborhood, there's no getting them. I'd appreciate it if the writers had listed equivalent substitutes for the ingredients. Alas, that information will not be found here.

The dishes are varied and good. My complaint about them is the same complaint I have about other recipes: they are all hard to make, requiring many different ingredients and spices, and are, frankly, a bother to do. I have no doubt that they make great-tasting food. I find myself skipping all the different exotic ingredients and substituting what is available locally. There's nothing to it. I appreciate the pictures of the food, and although mine comes out looking nowhere close to the same, it's good enough for me.

If you want a good survey of Chinese food, and don't mind either going to an Asian store and doing a lot of work, or converting the recipes to readily available local ingredients, then this is a good cookbook to have.

Wednesday, January 22, 2014

Musing Journal #3

3 Priorities

I think that's me. For some reason, I always fixate on the number 3. Or maybe the number 4. In this case, I'm thinking about what I'm doing with my time, and there are 3 answers:

1. PTC: There will be a GameJam this weekend, and I really, really want to go. Basically, I'll spend the weekend doing one whole game using PTC. It should be fun. There are a lot of projects that have been held up due to my setbacks. I need to step up my productivity.

2. My home. It's kind of a mess right now. I just discovered that the previous owner cleaned the oven by using the "self-cleaning oven" setting. Which is to say, he burned all the grease to a crisp. Unfortunately, that's all he did. So, I'll be scraping off all the fat that is currently sticking to it. Sigh. Where's Mr. Clean? I also need to work on my paperwork, speaking of which...

3. My books. I have 4 books nearly done. 2 need editing, 2 need a little more material before editing. I also have some more material for more books. So, I need to step up on that also.

It's kind of bad when bad health wreaks havoc on the schedule. Then I have the tendency to make up for lost time, get sick from overwork, and repeat. I'll have to figure something out to break the cycle.

So, I should ride my bicycle more often. That gets my heart rate up, and hopefully raises my stamina so I can get more work done.

Tuesday, January 21, 2014

Raspberry Pi Journal #46


Disable Screensaver

The last couple of posts dealt with the pesky screensaver that kept interfering with our display. We certainly want an easy way to disable the screen saver, and for that, we need to go to the source: the X window session itself.

Most of the instructions I see on the internet specify modifying the .xinitrc file. Although that should work, it makes the change permanent. If I can do it from the user shell, I'd rather do it that way.

Fortunately, there is a set of X window utilities. The one we're interested in is called xset. Disabling the DPMS (Energy Star) feature is as simple as

xset -dpms

Enabling it would be

xset +dpms

So, to disable screen blanking, we'd do this:


xset s noblank   #Disable blanking
xset s off       #Disable screensaver
xset -dpms       #Disable Energy Saving


And to enable them, we can do this:


xset s blank      #Enable blanking
xset s on         #Enable screensaver
xset +dpms        #Enable Energy Saving


So I created two scripts: unblank.sh and blank.sh, respectively. Oh, I also put the sudo command in front of the xset calls, just to make sure they take.
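For reference, here is a minimal sketch of what the two scripts might look like, built from the xset commands above. This version writes them into the current directory and omits the sudo prefix; where they live and whether sudo is needed will depend on your setup.

```shell
#!/bin/sh
# Generate the two helper scripts from the xset commands above.

cat > unblank.sh <<'EOF'
#!/bin/sh
# Keep the display on: disable blanking, the screensaver, and DPMS.
xset s noblank
xset s off
xset -dpms
EOF

cat > blank.sh <<'EOF'
#!/bin/sh
# Restore the defaults: blanking, screensaver, and DPMS back on.
xset s blank
xset s on
xset +dpms
EOF

chmod +x unblank.sh blank.sh
```

After that, the scripts run from the command prompt as described below, provided their directory is on your PATH (or you invoke them as ./unblank.sh).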

Now, all I have to do in order to disable the screen saver is type

unblank.sh

on the command prompt. And to enable it, just type

blank.sh

And there you go. Note that xset does more than just deal with screen savers. Check out its man page for details.