I remember watching Minority Report for the first time back in 2002.
Today, over a decade later, the scenes where Tom Cruise interacted with a computer display using only hand gestures are no longer in the realm of fiction. We have technologies that enable us to interact with computers and TV screens in the same way. We are in the midst of amazing innovations in the field of Human-Computer Interaction (HCI), where reality is becoming indistinguishable from magic. The next decade of computing will be the era of ‘disappearing UI.’
ENIAC to Siri
From punch card readers to touch screens and voice recognition, computer interfaces have come a long way. This journey, fueled by Moore’s law, has shrunk computers from room-sized monstrosities to pocket-sized smartphones. This is not a mere technological evolution; it is a sociological revolution. Computing power has gone from the hands of a very few ‘computer professionals’ into the pockets of everyone. This democratization of computing was possible only because computer interfaces have become more accessible and intuitive. The innovations of the mouse and the graphical user interface (GUI), originally invented at Xerox PARC, revolutionized personal computers. While pen-based computing, pioneered by Go and Apple’s Newton, never caught on, touch screens triggered the widespread adoption of smartphones and tablets. Voice recognition technology has been around for a long time but didn’t go mainstream until the introduction of Siri on the iPhone 4S. While its usage still remains limited, voice recognition has surely crossed the tipping point. What comes next after touch and voice?
You are the Interface
Widely adopted technologies are those that disappear and weave themselves into the fabric of life. The availability of cheap and powerful sensors is enabling a whole new generation of interfaces called Natural User Interfaces (NUI), or gesture interfaces. With this technology, users can control computers and other devices through spatial gestures. Microsoft Kinect and Leap Motion are examples of 3D gesture control that track users’ hand and body motions using depth-sensing cameras to control computing devices. Such gesture-sensing technology moves HCI closer to the way humans interact with things in the real world. This is a huge leap in HCI that promises many interesting possibilities. For example, our startup ZeroUI uses NUI to allow anyone to make digital 3D models with their hands, just like potters and sculptors. They can print them out using affordable 3D printers like MakerBot or 3D printing services like Shapeways. NUI can make education more hands-on and fun: students can hold and examine virtual 3D models of molecules to get a closer look, or navigate inside an archeological dig just by walking around their living room. Surgeons can manipulate a scope inside a patient’s body just by moving their hands. We will be able to change the volume of our car stereo using hand gestures.
Some companies are expanding the boundaries of NUI beyond gestures. Eye-tracking software from Tobii, Cube26, and The Eye Tribe can augment gestures by detecting where the user is looking. The MYO armband from Thalmic Labs uses electrical activity in our muscles to control computers. Emotiv Systems, NeuroSky, and InteraXon are developing brain-computer interfaces (BCI) that attempt to understand user intent from brain signals. When all these technologies come together, HCI will surpass real-world interactions. NUI will be pervasive in our lives. This is how we will interact with computing in the future. You are the interface.
Sensors on You
Moore’s law has made it possible for us to wear computers on our bodies; Google Glass and the Pebble watch are good examples. Google Glass is a head-mounted display (HMD) integrated into eyeglasses that shows information in our line of sight. Users can take pictures or video of what they are seeing while on the move and store them in the cloud. How will people control computers that are on their bodies? New technology promises to bring radical innovations in HCI over the next few years. This should be really fun to watch.
Maker in You
The intersecting trends of ‘Disappearing UI’ and ‘Consumerization of Manufacturing’ will bring out the maker in all of us. These trends are enabled by affordable 3D sensors and 3D printers. ‘Bringing out the maker in everyone’ is the simple but powerful vision that compelled us to start our company, ZeroUI. This has the potential to transform the global manufacturing industry the same way PCs, and now smartphones and tablets, have revolutionized the computing industry.
Imagine a world where you can 3D print your ideas right at your home or office. This world is much closer to reality than we think. Recently, doctors at the University of Michigan 3D printed an emergency airway right in the hospital to save a 20-month-old baby boy’s life. NASA is exploring 3D printing custom meals on the fly for astronauts in space. It is not far-fetched to imagine consumers designing and 3D printing household items right in their homes. The critical missing piece for making this vision a reality is software that enables anyone to quickly design things on the fly with minimal or no training. This is precisely where ‘Disappearing UI’ can play a key role, and the reason we come to work excited every day at ZeroUI.