Talking to the Wall

Our lives are awash with ambient electromagnetic radiation, from the fields generated by power lines to the signals that carry Wi-Fi data. Researchers at Microsoft and the University of Washington have found a way to harness this radiation for a computer interface that turns any wall in a building into a touch-sensitive surface.

Up in the air: Using an experimental interface, a person acts as an antenna for stray electromagnetic radiation in the environment.

The technology could allow light switches, thermostats, stereos, televisions, and security systems to be controlled from anywhere in the house, and could lead to new interfaces for games.

"There's all this electromagnetic radiation in the air," says Desney Tan, senior researcher at Microsoft (and a TR35 honoree in 2007). Radio antennas pick up some of the signals, Tan explains, but people can do this too. "It turns out that the body is a relatively good antenna," he says. 

The ambient electromagnetic radiation emitted by home appliances, mobile phones, computers, and the electrical wiring within walls is usually considered noise. But the researchers chose to put it at the core of their new interface.

When a person touches a wall with electrical wiring behind it, her body acts as an antenna that picks up the background radiation, producing a distinct electrical signal that depends on her body position, her proximity to the wall, and where on the wall she touches. That signal can be collected and interpreted by a device in contact with or close to her body. When a person touches a spot on the wall behind her couch, for example, the gesture can be recognized and used to turn down the volume on the stereo.

So far, the researchers have demonstrated only that the body can turn electromagnetic noise into a usable signal for a gesture-based interface. A paper describing the work will be presented next week at the CHI Conference on Human Factors in Computing Systems in Vancouver, BC.

In an experiment, test subjects wore a grounding strap on their wrist—a bracelet that is normally used to prevent the buildup of static electricity in the body. A wire from the strap was connected to an analog-to-digital converter, which fed data from the strap to a laptop worn in a backpack. Machine-learning algorithms then processed the data to identify characteristic changes in the electrical signals corresponding to a person's proximity to a wall, the position of her hand on the wall, and her location within the house.
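
The article stops short of the signal-processing details, but the pipeline it describes (digitize the voltage picked up by the wrist strap, then let machine-learning algorithms find characteristic changes in the signal) can be sketched in rough form. The sketch below is purely illustrative: the sample rate, window length, FFT-based features, scikit-learn SVM classifier, and file names are all assumptions, not the researchers' actual implementation.

    # Illustrative sketch (not the authors' code): classify wall-touch positions
    # from body-antenna voltage samples using frequency-domain features.
    # Assumed details: the sample rate, window size, FFT features, and the SVM
    # classifier are stand-ins for whatever the researchers actually used.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    SAMPLE_RATE = 10_000   # ADC samples per second (assumed)
    WINDOW = 1_000         # samples per analysis window (assumed)

    def features(window):
        """Normalized magnitude spectrum of one window of the body-antenna signal."""
        spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
        return spectrum / (np.linalg.norm(spectrum) + 1e-9)

    def make_dataset(samples, labels):
        """Slice a labeled recording into fixed-length windows, one feature vector each."""
        X, y = [], []
        for start in range(0, len(samples) - WINDOW, WINDOW):
            X.append(features(samples[start:start + WINDOW]))
            y.append(labels[start])
        return np.array(X), np.array(y)

    # Hypothetical files: 'adc_samples.npy' holds raw ADC voltages; 'touch_labels.npy'
    # holds the hand position logged during collection (0 = no touch, 1..N = wall spots).
    recording = np.load("adc_samples.npy")
    touch_labels = np.load("touch_labels.npy")

    X, y = make_dataset(recording, touch_labels)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)

    clf = SVC(kernel="rbf").fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))

The interesting part, presumably, is the features rather than the classifier: what distinguishes one touch location from another is how the spectrum of the picked-up power-line and appliance noise shifts as the hand moves across the wall.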

"Now we can turn any arbitrary wall surface into a touch-input surface," says Shwetak Patel, professor of computer science and engineering and electrical engineering at the University of Washington (and a TR35 honoree in 2009), who was involved with the work. The next step, he says, is to make the data analysis real-time and to make the system even smaller—with a phone or a watch instead of a laptop collecting and analyzing data.

"With Nintendo Wii and Microsoft's Kinect, people are starting to realize that these gesture interfaces can be quite compelling and useful," says Thad Starner, professor in Georgia Tech's College of Computing. "This is the sort of paper that says here is a new direction, an interesting idea; now can we refine it and make it better over time." 

Refining the system to make it more user-friendly will be important, says Pattie Maes, a professor in MIT's Media Lab who specializes in computer interfaces. "Many interfaces require some visual, tangible, or auditory feedback so the user knows where to touch." While the researchers suggest using stickers or other marks to denote wall-based controls, this approach might not appeal to everyone. "I think it is intriguing," says Maes, "but may only have limited use cases."

Joe Paradiso, another professor in MIT's Media Lab, says, "The idea is wild and different enough to attract attention," but he notes that the signal produced could vary depending on the way a person wears the device that collects the signal.

Patel has previously used a building's electrical, water, and ventilation systems to locate people indoors. Tan has worked with sensors that use brain activity for computing and muscle activity to control electronics wirelessly. The two researchers share an interest in pulling useful information out of noisy signals. With the recent joint project, Tan says, the researchers are "taking junk and making sense of it."

By Kate Greene
From Technology Review

1 comment:

I like this idea. Combine it with the painted-on display surface they're currently testing, and you've got functional control panels anywhere you like. One gesture to bring up the visual interface panel, poke the controls you wish to use, and another gesture to hide the interface. :)
