Eat Your Own Dog Food

The joke goes that Bill Gates was once caught playing Windows Solitaire during work hours. When faced with raised eyebrows, Bill said, "What? Someone has to test it!"

I was in a meeting the other day where four people sat around a table to discuss how a particular technology would be implemented in a consumer device. The implementation had a direct effect on the user experience, so it was important to get it right.

It turned out that two of the four had never actually used the device, and a third had used it only once or twice. Only one person in the room regularly uses the technology that he develops.

"Yes," he repeated, "it might seem logical from a development point of view, but that's not how a typical user would use it. So it would be a bad idea to implement it like that!" I could see the frustration building in his face.

I was stunned. How is it possible to make important decisions about user experience in a consumer device if you never actually use the thing yourself?

This is not an isolated case. It is highly likely that the designers of my (major name brand) digital camera either don't own a camera, have never taken a photo, or (worse) only use manual SLRs! They failed miserably on a number of basic usability/UI issues. Here are a few examples:
  1. The user interface for switching between photo mode and movie mode (on-screen, rather than via a dial) is poorly designed. Several seldom-used items sit between the photo-mode and movie-mode options, so the switch costs three extra button presses. This design has caused me to miss a few really good shots.
  2. Selecting an item from the various menus is counter-intuitive. Instead of pressing the OK button to select an option and dig deeper into the menu tree, you press the right-pointing arrow. I always forget this and press OK instead - it seems natural to do so because OK means "go for it", it means "do it", it means "open it" - but not here. On this camera, OK means "go back to the beginning, wipe out the last four minutes you spent finding the correct menu option, and start all over again".
  3. There are no icons or other indicators on the camera for video playback. Sure, I could look it up in the manual (but who carries that around with the camera?). I could even try to comprehend the on-screen Help (but why would I want to put myself through that?). Would it hurt the designers to give some indication that the up button plays the video and the down button stops playback?
It is clear that the manufacturer of this camera (which is actually a good camera, if you can get used to the quirks) never conducted a usability study.

The MSDN website contains an article entitled "How to Design a Great User Experience". The last point on the list states:
You won't know if you've got it right until you've tested your program with real target users with a usability study. Most likely, you'll be (unpleasantly) surprised by the results. Be glad to have your UI criticized—that's required for you to do your best work. And be sure to collect feedback after your program ships.
Automated testing is not good enough - you have to use the product yourself! If that isn't practical, get your target users to test it for you. Microsoft, for one, has taken to crowd-sourced testing for some of its products: Internet Explorer 9 was made available for public download first as a Platform Preview, then as a Beta, and finally as a Release Candidate. Give a million people free copies of the product on condition that they report bugs and usability issues, and they will find problems you would never have found or thought of on your own - everyone uses software (and hardware) in different ways.
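
To see why automated testing alone falls short, here is a minimal, hypothetical sketch (the CameraMenu class and its behaviour are invented for illustration, not taken from any real camera's firmware). An automated test can confirm that the OK button works exactly as specified, while telling you nothing about whether the specification makes any sense to a person holding the camera:

```python
# Hypothetical sketch: an automated test passes because the menu
# behaves exactly as specified, even though the specification itself
# is the usability problem.

class CameraMenu:
    """Toy model of the camera's menu navigation."""

    def __init__(self):
        self.path = []  # the menu levels the user has drilled into

    def press_right(self):
        # Right arrow selects an option and digs deeper into the tree.
        self.path.append("deeper")

    def press_ok(self):
        # As specified: OK discards all navigation and starts over.
        self.path = []


def test_ok_button_resets_menu():
    menu = CameraMenu()
    for _ in range(4):      # four minutes of careful navigation...
        menu.press_right()
    menu.press_ok()         # ...undone by the "natural" button
    assert menu.path == []  # passes: works exactly as designed


test_ok_button_resets_menu()
print("All tests pass - and the UI is still infuriating.")
```

The test passes and the build is green, yet the user is still furious. That gap between "works as designed" and "works for people" is exactly what hands-on use (or a usability study) exists to close.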

In a "Google Testing" blog  post of Wednesday, 23 March 2011, James Whittaker writes (emphasis added):
Google prefers to release often and leans toward getting a product out to users so we can get feedback and iterate. The general idea is that if we have developed some product or a new feature of an existing product we want to get it out to users as early as possible so they may benefit from it. This requires that we involve users and external developers early in the process so we have a good handle on whether what we are delivering is hitting the mark. 
I think that the best way to deliver a truly great user experience is not only to imagine yourself in the user's shoes, but to actually become a regular user. Only then will you come close to feeling what the user feels, be it love at first sight or boiling frustration.

On the concept of using your own products, Wikipedia quotes John C. Dvorak, "The Problem with Eating Your Own Dog Food" (PC Magazine, 15 November 2007), and Lydia Ash, The Web Testing Companion: The Insider's Guide to Efficient and Effective Tests (2003):
...it should allow employees to test the products in real, complex scenarios, and it gives management pre-launch a sense of progress as the product is being used in practice.

I am positive that Larry and Sergey chat about Google business on Android phones. I am certain that Steve Ballmer was using Windows 7 as his primary OS well before it was released to market. There is no doubt that Steve Jobs relaxes in his lush garden and browses the Internet on his iPad 2. And I bet that they all make sure their managers, architects, developers, testers, tech writers, salesmen and admin assistants eat their own dog food.

Comments are most welcome!
Follow on Twitter: @ykarp
Subscribe to Y. Karp? Why Not! or follow on Facebook (see the side-bar).
Add this blog to your RSS feed reader
