Microsoft's Kinect, a 3-D camera and software for gaming, has made a
big impact since its launch in 2010. Eight million devices were sold in
the product's first two months on the market as people clamored to play
video games with their entire bodies in lieu of handheld controllers.
But while Kinect is great for full-body gaming, it isn't useful as an
interface for personal computing, in part because its algorithms can't
quickly and accurately detect hand and finger movements.
Finger mouse: 3Gear uses depth-sensing cameras to track finger movements.
Now a San Francisco-based startup called 3Gear has developed a
gesture interface that can track fast-moving fingers. Today the company
will release an early version of its software to programmers. The setup
requires two 3-D cameras positioned above the user to the right and
left.
The hope is that developers will create useful applications that will
expand the reach of 3Gear's hand-tracking algorithms. Eventually, says
Robert Wang, who cofounded the company, 3Gear's technology could be used
by engineers to craft 3-D objects, by gamers who want precision play,
by surgeons who need to manipulate 3-D data during operations, and by
anyone who wants a computer to do her bidding with a wave of the finger.
One problem with gestural interfaces—as well as touch-screen desktop
displays—is that they can be uncomfortable to use. They sometimes lead
to an ache dubbed "gorilla arm." As a result, Wang says, 3Gear focused
on making its gesture interface practical and comfortable.
"If I want to work at my desk and use gestures, I can't do that all day," he says. "It's not precise, and it's not ergonomic."
The key, Wang says, is to use two 3-D cameras above the hands. They
are currently rigged on a metal frame, but eventually could be clipped
onto a monitor. A view from above means that hands can rest on a desk or
stay on a keyboard. (While the 3Gear software development kit is free
during its public beta, which lasts until November 30, developers must
purchase their own hardware, including cameras and frame.)
"Other projects have replaced touch screens with sensors that sit on
the desk and point up toward the screen, still requiring the user to
reach forward, away from the keyboard," says Daniel Wigdor, professor of computer science at the University of Toronto and author of Brave NUI World, a book about touch and gesture interfaces. "This solution tries to address that."
3Gear isn't alone in its desire to tackle the finer points of gesture
tracking. Earlier this year, Microsoft released an update that enabled
people who develop Kinect for Windows software to track head position,
eyebrow location, and the shape of a mouth. Additionally, Israeli
startup Omek, Belgian startup SoftKinetic, and San Francisco startup Leap Motion—which
claims its small, single-camera system will track movements to a
hundredth of a millimeter—are all jockeying for a position in the
fledgling gesture-interface market.
"Hand tracking is a hard, long-standing problem," says Patrick Baudisch,
professor of computer science at the Hasso-Plattner Institute in
Potsdam, Germany. He notes that there's a history of using cumbersome
gloves or color markers on fingers to achieve this kind of tracking. An
interface without these extras is "highly desirable," Baudisch says.
3Gear's system uses two depth cameras (the same type used with
Kinect) that capture 30 frames per second. The positions of a user's
hands and fingers are matched against a database of 30,000 potential hand and
finger configurations. The process of identifying and matching to the
database—a well-known approach in the gesture-recognition field—occurs
within 33 milliseconds, Wang says, so the computer appears to see
and respond to even a millimeter of finger movement almost instantly.
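The database-matching step described above can be sketched as a nearest-neighbor lookup. This is an illustrative Python sketch, not 3Gear's actual code: the feature vectors, database contents, and function names here are all assumptions, and real systems derive much richer descriptors from each depth frame.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in pose database: 30,000 candidate hand/finger configurations,
# each summarized by a small feature vector (hypothetical dimensions).
N_POSES, N_FEATURES = 30_000, 16
pose_db = rng.standard_normal((N_POSES, N_FEATURES))

def match_pose(frame_features: np.ndarray) -> int:
    """Return the index of the closest database pose (Euclidean distance)."""
    dists = np.linalg.norm(pose_db - frame_features, axis=1)
    return int(np.argmin(dists))

# At 30 frames per second, each frame must be handled within
# 1000 ms / 30 ≈ 33 ms -- the latency figure Wang cites.
frame_budget_ms = 1000 / 30

# Simulate a captured frame that closely resembles database pose 1234.
query = pose_db[1234] + 0.01 * rng.standard_normal(N_FEATURES)
best = match_pose(query)  # → 1234
```

A brute-force scan like this is the simplest version of the idea; to stay inside the 33-millisecond frame budget at scale, production systems typically use approximate nearest-neighbor indexes rather than an exhaustive search.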
Even with the growing interest in hand and finger gesture recognition,
the technology may take time to win wide adoption beyond gamers and
engineers.
"In the desktop space and productivity scenario, it's a much more challenging sell," notes Johnny Lee,
who previously worked at Microsoft on the Kinect team and now works at
Google. "You have to compete with the mouse, keyboard, and touch screen
in front of you." Still, Lee says, he is excited to see the sort of
applications that will emerge as depth cameras drop in price, algorithms
for 3-D sensing continue to improve, and more developers see gestures
as a useful way to interact with machines.
By Kate Greene
From Technology Review