It's one of the top meetings worldwide for researchers who spend their days (and nights) searching for a "Steve Jobs" type of breakthrough that will change how the rest of us interact with our computers. Scientists from Disney, Google, IBM, Intel, Microsoft, Samsung, and many other major technology companies make sure to show up every fall to learn about the latest innovations in human-computer interfaces coming out of universities and research labs.
One measure of the significance of this annual meeting, known as UIST: every six minutes, on average, year-round, someone downloads a paper from the conference. That's 87,000 copies annually. It's just a matter of time before the ideas presented in these papers end up in the latest devices that all of us must have.

For example, every smartphone display now automatically switches from portrait to landscape when you rotate the device. That technology was first described in a paper presented at UIST in 2000 by Microsoft researcher Ken Hinckley, a former student of a late Carnegie Mellon professor. The wall-size touchscreen you will be glued to on CNN during November's presidential election came out of the conference, as did many of the cartoon-style animation tools on our computers.
When Chris Harrison packed his bags last fall for the 24th UIST symposium in Santa Barbara, Calif., he suspected he had a game-changing idea. He believed it could transform the way we work, the way we play, the way we communicate. It was an ambitious hunch for the fifth-year Carnegie Mellon PhD student, who was still a relative newbie in the computing world. But Harrison had a track record of making big things happen.
For as long as the 27-year-old can remember, he has had a bent for creative tinkering—whether building an adapter to run an old laptop from his car battery or whiling away childhood hours developing a computer game in which angry gorillas hurled bananas at each other. Harrison even spent one teenage summer designing and building a trebuchet, a siege engine used in medieval times to pulverize castle walls. He received local media coverage—and drew a crowd of onlookers in one of his hometown's parks—when he tested his replica and successfully launched massive rocks at imaginary fortresses hundreds of yards away.
It's this kind of imaginative impulse that led Harrison to the field of computer science after he graduated from high school in 2002. That year, sales of a new mp3 player called the iPod were off the charts, and camera phones and wireless headsets were making headlines as they hit the marketplace. "It was really becoming glamorous to be a computer scientist," Harrison reflects.
He studied computer science as an undergraduate and master's student at New York University. There, he developed an interest in the space where computer science and the behavioral sciences intersect with design, known as human-computer interaction. The field aims to study and improve the relationship between people and computers. It's why we hate our alarm clocks but love using our iPhones. "They made it into something you want to use," Harrison says.
At Carnegie Mellon, research in human-computer interaction can be traced back to 1965. The university's founding computer science faculty—all recipients of the award considered the Nobel Prize of computing—believed that computer science should include the study of social phenomena surrounding computers, not just the theory and design of computation devices themselves. Today, more than 60 faculty and staff pursue human-computer interaction research at Carnegie Mellon, where Harrison began his PhD studies in 2007.
Harrison modestly describes himself as a "tinkerer," and a quick glance at the funky décor inside his lab in Newell-Simon Hall reveals the exceptional range of his imagination and talent. A skilled craftsman, he knows how to work with metal, glass, plastics, ceramics, and wood. For instance, he fabricated a table for the lab by welding together old computer motherboards. He reclaimed rusty exhaust manifolds from a metal salvage yard, which he repurposed into a modern lighting fixture.
Nearly two dozen such "fun projects," including the trebuchet, are listed on his Web site, together with about 40 PhD and MS research projects. Also catalogued online are his stunning visualizations of huge datasets—such as all the books on Amazon and all the Wikipedia topics—which have been displayed in museums and galleries internationally. In addition, he has mapped his travels to 50 countries, including Azerbaijan, Jordan, and Uganda, which he says has provided him with creative fodder for his research: "In developing countries, you can't just go to the auto mechanic to have new parts from the factory installed. I've seen people repair cars with duct tape. That kind of engineering creativity has always been very inspiring to me."
He knew it would take more than duct tape, though, to tackle the paradox of mobile devices—a challenge that had long interested him and aggravated the rest of us. Simply put, computers have dramatically increased in capability while decreasing in size; smartphones today are more powerful than desktop computers were a decade or two ago. That's all good. However, engineers haven't figured out how to miniaturize these powerful devices without shrinking their interactive surface area. That leaves us with smaller and smaller touchscreens, cramped buttons, and teeny jog wheels. As much as you might love your iPhone, you wouldn't want to peck out your dissertation or a novel on its little screen, let alone try to read anything of substantial length on it. Entire Web sites are devoted to the funny messaging gaffes people make with the slip of a finger, thanks to the autocorrect feature.
It's a classic Catch-22. Engineers can't enlarge the device, because who wants to schlep around an oversized smartphone? But they can't scale it down any further, either, because of the limits of our vision and the dexterity of our fingers. So we grit our teeth and text on, one tiny keystroke at a time.
In 2008, Harrison and his PhD advisor, Scott Hudson, started exploring a potential solution to this quandary through a project called Scratch Input. They used a bioacoustic sensor to "listen" to the high-frequency sound produced when a fingernail is dragged over a textured material like wood or paint. Coupling this sensor with a mobile device could allow any floor, piece of furniture, or article of clothing in contact with the device to serve as an input surface. For instance, if you had a smartphone in your pocket and wanted to silence an incoming call, you could just drag your fingernail across your jeans.
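The core trick—telling a deliberate fingernail scratch apart from ordinary sound by its unusually high-frequency signature—can be illustrated with a toy sketch in Python. Everything here (the frequency band, the threshold, the synthetic signals) is an illustrative assumption, not a detail from the Scratch Input system itself:

```python
import numpy as np

def scratch_energy(signal, sample_rate, band=(6000, 12000)):
    """Fraction of spectral energy in the high-frequency band that
    fingernail-on-texture scratches tend to occupy (band edges are
    made up for this demo, not taken from the research)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = spectrum.sum()
    return spectrum[in_band].sum() / total if total > 0 else 0.0

def is_scratch(signal, sample_rate, threshold=0.5):
    """Call it a scratch if most of the energy sits in the high band."""
    return scratch_energy(signal, sample_rate) > threshold

# Synthetic demo: a scratch-like high-frequency tone vs. a low hum.
rate = 44100
t = np.arange(rate) / rate
hum = np.sin(2 * np.pi * 120 * t)  # background vibration, low frequency
rng = np.random.default_rng(0)
scratch = np.sin(2 * np.pi * 9000 * t) + 0.1 * rng.standard_normal(rate)

print(is_scratch(hum, rate))      # low hum -> not a scratch
print(is_scratch(scratch, rate))  # high-frequency burst -> scratch
```

A real system would of course run this continuously on short windows of microphone data and use a trained classifier rather than a single hand-set threshold.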
Scratch Input later evolved into Skinput, which took acoustic sensing and put it on the human body. The system used an armband with an array of sensors to analyze the mechanical vibrations that propagate through your skin and bones when your fingers tap your body. Combining the armband with a tiny projector that beams an image of a touchscreen could turn the palm of your hand, for example, into an interactive monitor.
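Skinput's actual recognizer was trained on real vibration recordings; as a rough illustration of the underlying idea—each tap location produces a distinct energy signature across the armband's sensors—here is a nearest-centroid toy classifier. The sensor readings and locations are invented for the example:

```python
import numpy as np

# Hypothetical training data: each tap location yields a characteristic
# pattern of vibration energy across three armband sensors. Rows are
# example taps; columns are per-sensor energies (made-up numbers).
training = {
    "palm":    np.array([[0.9, 0.4, 0.1], [0.8, 0.5, 0.2]]),
    "forearm": np.array([[0.2, 0.9, 0.3], [0.3, 0.8, 0.4]]),
    "wrist":   np.array([[0.1, 0.3, 0.9], [0.2, 0.4, 0.8]]),
}

# One centroid (average signature) per tap location.
centroids = {loc: samples.mean(axis=0) for loc, samples in training.items()}

def classify_tap(sensor_energies):
    """Guess where the tap landed: the location whose centroid is
    closest (Euclidean distance) to the observed sensor energies."""
    return min(centroids,
               key=lambda loc: np.linalg.norm(sensor_energies - centroids[loc]))

print(classify_tap(np.array([0.85, 0.45, 0.15])))  # near the palm signature
```

The published system used a richer feature set and a proper machine-learning classifier, but the pipeline shape—featurize the vibration, compare to learned signatures—is the same.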
You might wonder why anyone would want to perform computing on their own bodies. The idea sounds almost creepy, and indeed, Popular Mechanics magazine two years ago called Skinput one of the top "Weird Science Stories" of 2010. "Sure, it looks a little crazy in pictures," Harrison admits. "But when it's actually on the skin, you find it's incredibly intuitive because we are so familiar with ourselves."
Harrison worked on Skinput as an intern at Microsoft Research in Redmond, Wash. He and his colleagues there presented their work at the 2010 international Conference on Human Factors in Computing Systems, where it won a Best Paper award. Soon afterward, a YouTube video of Skinput in action—featuring a jogger controlling his mp3 player by tapping on his palm and a gamer playing Tetris on his forearm—generated nearly 700,000 hits, and the project was named one of the 10 biggest technology stories of the year by New Scientist magazine.
Still, Harrison realized Skinput wasn't a perfect solution. The clunky armband had accuracy problems and was limited to use on the human body; it wasn't transferable to the general environment. So he returned to Microsoft for another internship last year, intending to take the idea to the next level by replacing the acoustic sensors with a depth-sensing camera that could "see" a room in 3D.
He mounted the camera (akin to the one used in the Microsoft Kinect for the Xbox 360) on a user's shoulder, where it tracked finger movements by looking for anything within arm's length that was roughly sausage-shaped. It then used the depth information it collected to work out when and where the fingers touched the image of a screen generated by a tiny "pico" projector. The system—called OmniTouch—also automatically adjusted that image to account for the shape and orientation of the surface.
Sounds complex, but the results are stunningly simple. "I don't want to call it magic, but it feels like magic—it can turn bare walls and your own arms into interactive surfaces," Harrison says.
With OmniTouch, by wearing the shoulder-mounted rig, your palm could become a tablet for jotting down notes or painting a digital picture. Maps projected on a wall could be panned and zoomed by swiping and pinching. You could plop down in front of your TV and simply open your hands, which would serve as a remote, to change the channel. Or you could read this article on the counter at Starbucks and then order another latte with a tap of your finger. How about writing your novel using the booth countertop as a keyboard?
"All of these scenarios that used to be thought of as science fiction are becoming reality," says Microsoft researcher Hrvoje Benko, who worked on the project. "That's exactly what OmniTouch illustrates—it's basically an exploration of what it would mean if you could turn any available object into an interactive surface."
After several months of work, Harrison and his Microsoft colleagues went public with OmniTouch at the highly anticipated 2011 UIST symposium. There, in front of the leading researchers in their field and industry professionals from around the world, they demonstrated how users could click away on a wall or their own hands instead of a standard touchscreen. "People totally got it," Harrison says. "Typically, when you give someone a crazy new system, you have to let them play with it for about 10 minutes before you do any testing, or your data get messed up. But almost right away, our users knew what to do—and they wanted to know if they could use our demo system to dial their friends!"
OmniTouch has continued to generate buzz in the press and the computing world since that conference. "This is really essential work in our field as we are moving toward ultra-mobile devices, and Chris is totally a rock star," exclaims Patrick Baudisch, chair of human-computer interaction at the Hasso Plattner Institute in Germany.
Maximum PC magazine has called OmniTouch a display technology that will "change the way you see the world." Wired UK dubbed it one of the "25 big ideas for 2012." And Forbes magazine recently named Harrison to its "30 under 30" list of "today's disruptors and tomorrow's brightest stars" in science.
Harrison estimates it will be a few years before OmniTouch or something like it is available to consumers. The shoulder-mounted rig needs to shrink. "People don't want to go to class or on a date with a huge thing on their shoulder—well, maybe at Carnegie Mellon," he quips.
The ultimate goal is to make the technology completely unobtrusive—a consumer version could be smaller than a matchbox, worn as a watch or necklace. Within the next decade, experts agree, we could be reminiscing about how, before OmniTouch, we were all shackled to keyboards or squinting at phone screens.
Jennifer Bails is an award-winning freelance writer. She is a regular contributor to this magazine.