Saturday, September 6, 2014

iPhone Photo Madness

Here's a neat little "trick" that'll cause some confusion and irritation... don't try this at home:
  1. With your iPhone locked, press the Home button to wake the screen
  2. Instead of swiping to unlock, swipe up from the camera icon in the lower right corner to open the Camera app
  3. Take your picture
  4. Forget that you're not unlocked
  5. Tap the camera roll icon
  6. Voila - your iPhone is now completely unresponsive... even the power off button does NOTHING!
Now I know we want to make sure that things are secure and all that, but the first time it happens I think: "Bug?" "Broken?" "!@#$%^&". Couldn't you have found a better way, Apple? Like:
  • Prompt for the passcode
  • Let me power off the iPhone to start over
  • Let the Home button take me back to the main screen and start over

Saturday, August 9, 2014

The BikeShed...

This morning I spent a bit of time skimming the latest copy of “Communications of the ACM” (Association for Computing Machinery) and came across the term ‘bike shedding’ in an article by Poul-Henning Kamp, one of the primary developers of the FreeBSD operating system.

The article - “Quality Software Costs Money - Heartbleed Was Free” - was about how to raise money to support FOSS (Free and Open Source Software) projects, many of which provide critical features for the internet and, increasingly, devices of all kinds such as your TV, your printer, your car... He relates his early experience in 2004 of crowd-sourcing funding for a FOSS project, long before KickStarter or Indiegogo were around.

Anyway, in the middle of the article he says: “Worst case, I would cause the mother of all bike sheds (http://www.bikeshed.org) to get thrown out of the FreeBSD community…” so I just had to visit the site. Turns out this is a very well known email, now available on the web, describing challenges on the FreeBSD mailing list. In summary, it says:

"...the simpler and more insignificant something is, the more heated the debate over it, illustrated by: 'you can go in to the board of directors and get approval for building a multi-million or even billion dollar atomic power plant, but if you want to build a bike shed you will be tangled up in endless discussions.'"

The concept is interesting because it applies to other settings as well, such as leading groups or managing projects. I’ve certainly seen it in action but never had a term to describe it.

However, it gets even more interesting (to me anyway) because in the bikeshed email, Poul-Henning makes the statement: “A lot of [my email] gets routed to /dev/null by filters: People like Brett Glass will never make it onto my screen…” (/dev/null means never having to say "I read it").
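
(For the non-Unix folks: /dev/null is the system's bit bucket; anything written to it simply vanishes. Just to make the idea concrete, here's a toy sketch in Python of what such a mail filter might look like. The addresses and the block list are made up for illustration; this is not Kamp's actual setup.)

    # Toy sketch of the "/dev/null filter" idea: mail from blocked senders is
    # written to /dev/null, the Unix bit bucket, and silently discarded.
    BLOCKED_SENDERS = {"someone@example.com"}   # hypothetical block list

    def deliver(sender, body):
        if sender in BLOCKED_SENDERS:
            with open("/dev/null", "w") as bit_bucket:
                bit_bucket.write(body)          # never having to say "I read it"
            return
        print(body)                             # everything else reaches the screen

    deliver("someone@example.com", "You will never see this.")
    deliver("friend@example.com", "Hi there!")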

Who was Brett Glass, you may wonder? Well, thank you for asking; you can read about that here: http://www.quora.com/Hacker-Culture/Who-was-Brett-Glass-as-named-in-the-original-bikeshed-email. The responses speak volumes about internet culture.

Finally, lest you think the story is over, check out http://white.bikeshed.com or http://blue.bikeshed.com. In fact, put just about any color in front of .bikeshed.com and you’ll be rewarded with differing visions of what your bikeshed might look like.

Saturday, January 4, 2014

Self-driving cars: can you trust the software?


Recently, there's been a lot of media coverage of self-driving (or autonomous) cars. DARPA has funded several challenges, open to all comers, for developing autonomous vehicles. A Stanford University team won the 2005 Grand Challenge, and a Carnegie Mellon team won the 2007 Urban Challenge. We all know that Google has been developing and using self-driving cars for a while; see this Wikipedia entry for more details. In a recent keynote presentation - "Google's Self Driving Cars: The Technology, Capabilities, & Challenges" - at the 2013 Embedded Linux Conference, Andrew Chatham claims that they've driven over 400,000 miles. Not a tremendous amount, certainly, but enough to have gained a lot of experience and press coverage.

Now we're told: "Fully self-driving cars expected by 2030"; at least that's what's claimed at this point. The article goes on to state that a few automobile manufacturers expect to have some of that capability as early as 2025. Sounds pretty terrific: reduce accidents, help the environment, let visually impaired people drive again. It's a long and exciting list of benefits.

All this is done through hardware and (lots of) software... very complicated stuff. And the software engineers associated with this effort are smart, earnest, hard-working, and well-meaning folks.

Sounds great... right?

Well... maybe not? We've also heard a great deal about software failures recently. The Affordable Care Act website was late and extremely buggy (I don't have to give you a link for this -- it's all over the web) and is only the latest and most visible example. There are many more. Just Google "recent software failures" and you'll be treated to a cornucopia of problems, like the first entry in "Highest profile software failures of 2012," which describes a trading bug that "cost a trading firm $440 Million in 45 minutes". The list of such failures is seemingly endless. On a more personal note, I'm sure many of you have experienced software problems with your everyday applications, or tried to upgrade a program only to introduce new problems.

We in the software industry have gotten much better at delivering good software, even as we deliver far more complicated applications than were dreamed possible in the not-too-distant past. Whoddathunk you could pack that much functionality into a smartphone? The original Bowmar four-function calculator was bigger and heavier than today's iPhone or Android phone. But that certainly doesn't mean we've got it all figured out, as all those high-profile (as well as multitudinous small) failures suggest. Robert Martin (a leading light in the software industry, fondly known as 'Uncle Bob') has an excellent article about these problems.

There's also a social question associated with self-driving cars, as expounded in "Why Google's Driverless Car is Evil" by Brad Berman. He also raises the question of security: can we make cars safe from being attacked and taken over for nefarious ends? We certainly haven't done a great job with personal computers.

The question is: what do we have to do -- what can we do -- to improve our software development capabilities so that all those self-driving cars don't wind up driving over a cliff?