Researchers in Germany have developed a new computer interface that responds to hand and finger movements. The system is reminiscent of the 2002 science fiction film, "Minority Report."
Users can interact with and alter images using their hands
The 2002 sci-fi thriller "Minority Report" delivered a memorable impression of the future of computing: a future where one could wave one's arms to navigate through data. It seemed pretty far-fetched eight years ago, but this year the concept is suddenly much more real.
After about six months of research and programming for his master's thesis, Georg Hackenberg at the Fraunhofer Institute for Applied Information Technology has managed to create what he and his colleagues are calling a 3-D noncontact gesture-based computer interface.
"A special image analysis algorithm filters out the positions of the hands and fingers," Hackenberg said. "This is achieved in real-time through the use of intelligent filtering of the incoming data. The raw data can be viewed as a kind of 3-D mountain landscape, with the peak regions representing the hands or fingers."
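Hackenberg's description of filtering a depth frame as a "mountain landscape" can be illustrated with a toy example. The sketch below is not the institute's actual algorithm; it simply treats a 2-D grid of heights as the landscape and reports local peaks above a threshold, the kind of regions the quote says correspond to hands or fingers. The function name and threshold are invented for illustration.

```python
# Illustrative sketch only, not Fraunhofer's algorithm: treat a depth
# frame as a "mountain landscape" and report local peaks, which in the
# article's description correspond to hands or fingertips.

def find_peaks(height, threshold=0.5):
    """Return (row, col) positions whose height exceeds `threshold`
    and all eight neighbours -- crude local-maximum detection."""
    rows, cols = len(height), len(height[0])
    peaks = []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            h = height[r][c]
            if h <= threshold:
                continue
            neighbours = [height[r + dr][c + dc]
                          for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                          if (dr, dc) != (0, 0)]
            if all(h > n for n in neighbours):
                peaks.append((r, c))
    return peaks

# Toy 5x5 "landscape" with one clear peak at (2, 2).
frame = [
    [0.0, 0.0, 0.0, 0.0, 0.0],
    [0.0, 0.2, 0.3, 0.2, 0.0],
    [0.0, 0.3, 0.9, 0.3, 0.0],
    [0.0, 0.2, 0.3, 0.2, 0.0],
    [0.0, 0.0, 0.0, 0.0, 0.0],
]
print(find_peaks(frame))  # [(2, 2)]
```

A real-time system would of course work on noisy sensor data and need the "intelligent filtering" Hackenberg mentions before any peak-picking step.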
Hackenberg, who is based at the institute just outside of Bonn, said that the system is a prototype technology demo and is essentially an advanced infrared sensor on a tripod attached to a desktop computer. A 3-D wireframe displayed on the monitor serves as the desktop for the system. When someone is standing in front of the system, the infrared sensor picks up the movements of their hands, interpreting them as commands.
At this early stage the system has only limited functionality. It has been programmed to load and manipulate photos, and it includes a 3-D jigsaw puzzle along with another puzzle designed to test users' ability to manipulate objects in the 3-D environment.
Standing in front of an infrared sensor, Hackenberg performed what looked like a dance in slow motion. He held his arms out at shoulder height, clenching and unclenching his fists, as his movements were mimicked on a computer monitor nearby.
As he made a fist and then spread his fingers again, Hackenberg's system displayed a randomly selected photograph from Flickr, a photo-sharing website.
Then he could "grab" the selected photo on the screen and play around with it.
"You make a fist to grab the object, which locks the image to the hand and then we move it around and then we release it, and then by opening the hand again, the object is placed where we left it," Hackenberg said as he demonstrated the system.
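The grab-and-release behavior Hackenberg describes amounts to a small state machine: a closed fist locks the object to the hand, and opening the hand leaves it wherever it was last moved. The sketch below is a hypothetical illustration of that logic; the class and parameter names are invented, not taken from the Fraunhofer code, and it skips details such as checking that the hand is actually over the object.

```python
# Hypothetical sketch of the grab/release gesture logic described in
# the article: fist closed = object locked to hand, hand open = object
# stays where it was left. Names are invented for illustration.

class GrabInteraction:
    def __init__(self, object_pos):
        self.object_pos = object_pos  # (x, y) of the on-screen image
        self.grabbed = False

    def update(self, hand_pos, fist_closed):
        if fist_closed:
            self.grabbed = True        # fist made: lock object to hand
            self.object_pos = hand_pos # object follows the hand
        else:
            self.grabbed = False       # hand opened: leave object in place
        return self.object_pos

ui = GrabInteraction(object_pos=(100, 100))
ui.update((10, 10), fist_closed=False)   # open hand: no effect on object
ui.update((100, 100), fist_closed=True)  # make a fist: grab
ui.update((240, 180), fist_closed=True)  # drag while fist is closed
ui.update((240, 180), fist_closed=False) # open hand: release
print(ui.object_pos)  # (240, 180)
```

The real system would drive this from the hand and finger positions extracted by the depth-sensing step, rather than from explicit coordinates.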
The system can recognize individual fingers and hands
Early stages of development
The tools are still fairly rudimentary. Photographers aren't going to be trading Adobe Photoshop, an industry-standard photo editor, for Hackenberg's system just yet, admitted Rod McCall, a Fraunhofer usability expert.
This won't be replacing the keyboard and mouse, he told Deutsche Welle, emphasizing that this was just another tool available for interacting with computers.
"This particular setup can be used for public experiences," McCall said. "Rather than having a rather bland looking PC, it's a much more interactive and participatory experience for people. Later we'll see this for museums, galleries, maybe other large scale venues."
Indeed, the multi-touch displays now commonly found on touchscreen products like the Apple iPhone and iPad were pioneered in research labs years before their commercial availability.
Strength in navigation
But the real allure of the system, said Sven Behnke, a professor of computer science at the University of Bonn, is that the user doesn't need to put on infrared markers or special gloves to interact with it.
He noted that the project has some precedents and potential applications in the world of video games. Gesture interfaces already play a key role in games for the Nintendo Wii and the forthcoming Kinect, an add-on for the Microsoft Xbox 360.
"Controlling games is an obvious application, but of course there might be more serious applications, for instance, exploring data," Behnke said. "To navigate three dimensional visualizations is not very easy, and if you find an intuitive way to use your body there, that might make it easier to navigate large data sets."
That way, just like in "Minority Report," researchers may be able to flip through statistics, photos or anything else, just by waving at a screen. In fact, many television presenters already use multi-touch screens to show weather patterns or illustrate political maps.
Fraunhofer FIT hopes to move the system to mobile phones
Potential for mobility
At this early stage, the Fraunhofer Institute is just drawing attention to the technology, but McCall, whose job also entails looking at the usability of the project, is already looking ahead to potential future applications.
"I think our aim is ultimately to take this technology to use on mobile devices," McCall said.
However, it might be a while before people interact with their mobile devices by waggling their fingers in front of a Fraunhofer gesture-based interface.
One drawback to the technology is the cost. The active infrared sensor alone costs as much as a small car, according to Behnke, who has used the same sensors in robots developed by his department.
So until mobile phones are packing that kind of technology or Hackenberg's algorithm can use basic cameras for motion capture, people will have to suffer through actually touching their phones for a while longer.
Author: Stuart Tiffen
Editor: Cyrus Farivar