Programming Languages I've Loved and Hated

Of the various programming languages I encountered during my career, some appealed to me instantly, while others left me cold.  I never really understood why, but perhaps by writing about them, I can discern a pattern.

PL/C: This was the first language I learned.  Structured, imperative, and procedural, PL/C was Cornell's teaching variant of PL/I, and the language shaped much of my thinking.  It was a good language, and I liked it, but it was also an academic language and one I would never see or use again.

Basic (IBM PC): It was hate at first sight.  The language relied on line numbers, needed GOTOs, and was interpreted. But it was early in the PC days and I had to use it if I wanted to do anything useful.  In time, Basic became a compiled language and eventually evolved into Visual Basic for Windows programmers.  But even then, the first version of Visual Basic did not directly support arrays, an omission that convinced me that Basic would forever be a dumbed-down language.

8088 Asse…

Post Processing: Hiring, Microsoft, Einstein, Google Fi, and Linux

I explored how tech hiring was broken in two previous posts: The Best Selling Tech Book On Amazon Is ... and Alternatives to the Whiteboard. An article from CNBC adds to the story.  Historically, the Google hiring process required six to nine months and 15 to 25 interviews, but learning the lesson of diminishing returns, Google has adopted the "Rule of Four."

In the post Git-Hub Hub-Bub, I wrote how Microsoft, under Nadella, is different and better than when it was under Ballmer.  Microsoft continues that trajectory as written in Bloomberg's The Most Valuable Company (for Now) Is Having a Nadellaissance.  Most telling is Nadella's mature reaction to a $1 trillion valuation: "... not meaningful" and any rejoicing about such an arbitrary milestone would mark "the beginning of the end."

The figurines Albert Einstein and Homer Simpson sat together on my shelf for a few years.  Every now and then, I wondered what they would say to each other if they ha…

Bookshelf Classic: More Programming Pearls

If you were to judge a book by its cover, you would, from the image of the keyboard, conclude this book is old.  If you were to judge this book by what's inside, you would conclude likewise because the examples are written in C and Awk.  But were you to look deeper, you would see that these are no ordinary pearls.

Jon Bentley wrote a regular column for the Communications of the ACM, and his essays were collected into two books: "Programming Pearls" and "More Programming Pearls."  The first book focused on speed and efficiency, while the second book covered that and more, including debugging, I/O, and enlightening "Aha!" moments.

Because these essays were originally written for a monthly magazine, Bentley recommends taking it slow, reading one column per sitting, and trying the exercises.  Thankfully, he doesn't leave you hanging, and provides answers at the end of the book.  Chapters 5 through 8 are the exception and can be read i…

LuldCalc Live at the App Store

LuldCalc is live at Apple's App Store.  Designed for traders and stock exchange operators, it calculates the upper and lower price bands given a stock's price.  Search "Luld" on your iPhone's App Store.

The tables here summarize the rules involved in the calculations.  More details can be found on the luldplan website.
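To make the rules concrete, here is a minimal sketch of a band calculation in Python.  This is not LuldCalc's actual code; the function name and parameters are my own, and the percentage parameters are my reading of the published LULD plan (Tier 1 stocks above $3.00 use 5%, Tier 2 use 10%, stocks between $0.75 and $3.00 use 20%, and stocks under $0.75 use the lesser of $0.15 or 75%, with percentages doubled near the open and close).  Consult the luldplan website for the authoritative rules.

```python
def luld_bands(reference_price, tier=1, opening_or_closing=False):
    """Illustrative LULD price-band sketch (not the app's actual code).

    Returns (lower_band, upper_band) for a given reference price.
    Percentage parameters are assumptions taken from my reading of
    the published LULD plan -- verify against the official tables.
    """
    p = reference_price
    if p < 0.75:
        # Lesser of $0.15 or 75% of the reference price.
        offset = min(0.15, 0.75 * p)
    elif p <= 3.00:
        offset = 0.20 * p
    else:
        # Tier 1: 5%; Tier 2: 10%.
        offset = (0.05 if tier == 1 else 0.10) * p
    if opening_or_closing:
        # Bands double during the opening and closing periods.
        offset *= 2
    return (round(p - offset, 2), round(p + offset, 2))
```

For example, a Tier 1 stock with a $100.00 reference price gets bands of $95.00 and $105.00 during regular hours, widening to $90.00 and $110.00 near the open and close.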

Apple requires a privacy policy for all its apps. Simply put, LuldCalc is a stand-alone calculator, does not collect personal information, and does not connect to the internet.

Privacy Policy

This privacy notice discloses the privacy practices for this website.  This privacy notice applies solely to information collected by this website.  It will notify you of the following:

1. What personally identifiable information is collected from you through the website, how it is used and with whom it may be shared.
2. What choices are available to you regarding the use of your data.
3. The security procedures in place to protect against the misuse of your information.

Estimates Are Not Deadlines

A friend posted this image, and naturally, I laughed.  I shared it with other friends, and they laughed too.  That this situation was so widely recognized, however, made it cause for crying rather than for laughing.

Stating the obvious, estimates are not deadlines, and buffers -- proper spacing between milestones -- are needed for good project management.  In practice, this advice is lost, and the result is bad software.

Failure to implement buffers ignores the unexpected: people leave, requirements change, a show-stopping bug is found. Without buffers, one unexpected problem cascades to all the milestones downstream.

Some project managers play it "close to the vest."  They really do have buffers in the schedule, keep them a secret, and hope the developers continue to work hard to meet the original deadlines.  The hazard here is that the developers will make (unnecessary) compromises to meet them.

"Do we have time to add an extra field in the database record for faster…

Einstein meets Homer Simpson

Einstein: So Mr. Simpson, what makes you say the universe is shaped like a doughnut and is expanding?
Homer: Mmmm, expanding doughnut.
Einstein: Do you think the expansion is due to dark matter and dark energy?
Homer: Dark matter? Oh, you mean chocolate. There's lots of energy in chocolate.
Einstein: Hmmm, chocolate?
Homer: Mmmm, chocolate.

Pi and Free Will

Pi day (3/14) was 10 days ago, and my blog post is late.  Modern scientists tell us, however, that there is no such thing as free will, and that my lateness was pre-determined, unavoidable, and thus arguably, "not my fault."

The "no free will" thought experiment goes as follows: imagine we are all physical systems where every interaction can be known and measured.  Given enough data and computational ability, every cause could be identified and every effect calculated, making outcomes predictable with 100% accuracy.

If we accept determinism, that our lives from birth to death have already been set, where does our moral responsibility go?  Why try at all?

The Atlantic wrestles with these questions in There's No Such Thing as Free Will (But we're better off believing in it anyway). The sub-heading gives it away, indicating that scientists have publicly walked back their argument, "Ha ha, we were just trolling all of you," but privately still think we have no free wi…