SOBE by David Sobeski

Passive Computing

February 10, 2014

Going back to the original Apple Computer and the subsequent releases of computers and operating systems, we have been in an era of Active Computing. Active Computing is a state where we use the computer to do tasks. We use applications like #Word to create documents or #Photoshop to touch-up and edit photographs.

To me, this is the “old style” of computing that we are all used to dealing with on a daily basis. However, the future is in Passive Computing.

In the spring of 2006 I was meeting with an executive at #Apple at Tamarine in Palo Alto. We were talking about phones and the design of the phones of the day, and in particular I got onto the topic of the Palm Treo that I was carrying and using at the time. In typical #Apple fashion, we asked what we did not like about the current experience. Sure, there was low-hanging fruit - the stylus. But to me, the phone had to anticipate the things I did and do them on my behalf. It should be so simple that a user would never see the complexity required to make something so simple happen.

I focused on my cheekbone. I have a pointy cheekbone, or maybe just a fat face, that would hit the screen every time I talked to someone on the phone. I would hear a loud “beep beep beep” in my ear as my cheek dialed numbers while I talked. Of course, the person on the other end of the call would also hear the noise and almost always ask what it was. So I would awkwardly reposition the phone on my face to hear the conversation, make sure I was talking into the microphone and (finally) angle the screen so my cheek would not hit it as I spoke.

To solve this problem, we talked about adding a sensor to the phone that would automatically turn the screen off when it was next to my face. At first we talked about combining multiple sensors to understand the scenario. Then it became simple: all we needed was a proximity sensor. When it was covered, turn off the display. Problem solved. To me, this is what passive computing is all about - things doing things for us without our having to hit a button, touch something or click something. It just works.
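The proximity-sensor behavior described above is simple enough to sketch in a few lines. This is a hypothetical illustration - the class and method names are invented for the example, not any real phone API:

```python
# Hypothetical sketch of the proximity-sensor logic: when the sensor is
# covered (phone held against the face), the display turns off so the
# touchscreen cannot register accidental cheek presses.

class ProximityDisplay:
    """Tie the display's power state to a single proximity reading."""

    def __init__(self):
        self.display_on = True

    def on_sensor_reading(self, covered: bool) -> bool:
        # Covered -> display off; uncovered -> display back on.
        self.display_on = not covered
        return self.display_on
```

The point of the example is how little logic the user-facing simplicity requires once the right sensor exists: one reading, one rule, no button.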

We are starting to see the beginning of the new Passive Computing era. If you look at the two products Nest has produced, neither really needs a user interface to work. The Nest Thermostat has an interface for setting the desired temperature and seeing the current one, but it isn’t really needed. The beauty of the Nest is that it understands the internal and external temperatures and can dynamically adjust the inside temperature as needed. It isn’t fully there yet. Next, it needs to notice that there hasn’t been any activity in the house, conclude that you are most likely away, and dynamically lower the energy your house uses. It can also connect to other data sources to better understand where you are or what you are doing, making its dynamic algorithms even better. The Nest Protect doesn’t have any user interface elements at all. It simply senses smoke and tells you what to do.
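The “auto-away” behavior described above can be sketched as a small policy function. This is an illustration only - the timings and setback temperature are invented for the example and have nothing to do with Nest’s actual algorithms:

```python
# Illustrative "away" setback policy: if the house has seen no motion for
# long enough, relax the target temperature to save energy.

AWAY_AFTER_MINUTES = 60   # no motion for this long -> assume nobody is home
SETBACK_DEGREES = 6       # how far to relax the setpoint while away

def target_temperature(desired: float, minutes_since_motion: int,
                       heating: bool) -> float:
    """Return the setpoint to use, relaxed when the house looks empty."""
    if minutes_since_motion < AWAY_AFTER_MINUTES:
        return desired
    # While away: heat less (lower setpoint) or cool less (higher setpoint).
    return desired - SETBACK_DEGREES if heating else desired + SETBACK_DEGREES
```

The interesting part is what is absent: the occupant never tells the thermostat they are leaving. The device infers it from sensor data and acts on their behalf.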

When I think about a future microwave oven, I don’t want to tell it what food I placed in the oven. The next generation will have a variety of sensors built into the device that measure weight, food density and internal temperature. It might even seamlessly read the bar code on packaged food, look it up and automatically set the cooking time (great for Orville Redenbacher popcorn). The microwave should just cook what I want; I shouldn’t have to program it, touch it or do anything. I simply need to be told when the food has finished cooking. Of course, I don’t want it to just “ding” when the food is done - it should build in an appropriate cooling time so the food doesn’t burn me when I remove it from the oven.
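The barcode-lookup idea above amounts to a database query with a sensor-based fallback. A minimal sketch, assuming a made-up product database and an invented weight heuristic - none of this reflects any real appliance:

```python
# Hypothetical barcode-driven cook planner: known packaged foods get their
# looked-up time; unknown items fall back to a crude weight-based guess.

COOK_DB = {
    "0123456789012": {"name": "microwave popcorn", "seconds": 150},
}

def plan_cook(barcode: str, weight_grams: float) -> int:
    """Return a cook time in seconds for the item on the turntable."""
    entry = COOK_DB.get(barcode)
    if entry is None:
        # Unknown food: estimate from measured weight (invented heuristic).
        return int(weight_grams * 0.6)
    return entry["seconds"]
```

A real version would fold in the density and internal-temperature readings too, and append the cooling interval before announcing that the food is ready.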

The #iWatch is coming. If it is simply a notification center for my iOS device with some health and fitness built in, it will miss the promise of what Apple does well. For example, if I tilt my wrist the right way, it should know that I am looking at my watch and turn on the screen so I can see the time. When I look away, the screen turns off. The watch should vibrate slightly for notifications, but the screen should not turn on. I don’t want to be in a meeting or at a dinner with my watch screen constantly turning on and off because of notifications. It should have different styles of vibration for different notification types; all notifications are not the same. It should also understand that I am in a meeting (by looking at my iCloud calendar) and automatically not buzz because an e-mail or text message was received. The iWatch needs to be a passive computing device, doing things for me and making my life better. I never want to touch a button. The #Nike Fuelband and similar devices all fail because I get limited information, I need to sync the device, and I need to actively manage it. The Fuelband has no idea if I am walking or running. The iWatch needs to know when I am sitting in a car versus walking versus running. Its health data needs to be relevant and understand my context without me having to tell it my context.
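The calendar-aware notification behavior described above is, at heart, a small policy table. A sketch under stated assumptions - the notification kinds and vibration patterns are invented for illustration, not any real watch API:

```python
# Hedged sketch of a calendar-aware notification policy: each notification
# kind maps to its own vibration style, and low-priority buzzes are
# suppressed entirely while the calendar says the wearer is in a meeting.

from typing import Optional

VIBRATION = {"call": "long", "text": "double-short", "email": "short"}

def should_buzz(kind: str, in_meeting: bool) -> Optional[str]:
    """Return the vibration pattern to play, or None to stay silent."""
    if in_meeting and kind in ("email", "text"):
        return None  # the message can wait; the screen stays dark too
    return VIBRATION.get(kind, "short")
```

Notice that the wearer never configures anything: the context (the calendar) does the configuring, which is the essence of the passive computing argument.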

A great example of passive computing is the automobile. Today, my Mercedes has a temperature sensor to dynamically adjust the internal temperature of the car. The car has parking sensors for when I am trying to get into a tight spot. The car has lane detection and will vibrate the steering wheel if I veer from the white lines of the road. The car has sensors to tell me when things are in my blind spot. It even has the ability to scan for open parking spots and park itself. At low speeds it can automatically stop itself. When I walk up to the car the doors unlock, and when I leave they lock. I can get into the car, turn it on, and my iPhone will just start playing music from where it left off. It all just works and is an amazing experience. It makes my life safer, easier and better.

We are seeing all types of Passive Computing devices in our lives - my Withings Scale, Nest, WeMo home automation, my automobile.

The next generation of startups, and even products from Apple, Google and Microsoft, should be about making us forget about the technology and the products. They should just work, and they should all do things on my behalf. It is a sensor world: these devices connect to the cloud and to multiple services that make them better as they mine data on my behalf. We do not need more social networks or text editors or music makers or video compressors. It is all about the passive computing era and getting technology out of the way.


©2014 by David Sobeski