Researchers at Fujitsu Laboratories have developed something quite remarkable: a next-generation user interface that detects the user's finger and identifies the object the finger is touching at that moment. The system then turns real-world objects into an interactive, touchscreen-like surface.
As quoted from the video, “We think paper and many other objects could be manipulated by touching them, as with a touchscreen. This system doesn’t use any special hardware; it consists of a simple device, like an ordinary webcam, and a commercial projector. Its capabilities are achieved through image processing technology.”
Importing content from paper documents becomes as easy as importing digital data: you simply select the parts of the document you want to capture using the touchscreen interface.
Basically, the touchscreen interface measures the shape of real-world objects and then aligns the coordinate systems of the projector, camera, and physical world. On top of that, even objects with curved surfaces can be brought into alignment just by touching them.
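The article does not describe Fujitsu's calibration method. For flat objects like a sheet of paper, a common way to relate camera pixels to projector pixels is a planar homography fitted from a few known point correspondences. The sketch below, with purely illustrative calibration points, estimates one via the standard direct linear transform (DLT):

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 homography mapping src points to dst points (DLT, 4+ pairs)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null-space vector of A, i.e. the last row of V^T.
    _, _, vt = np.linalg.svd(np.array(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pt):
    """Map a 2D point through H using homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical calibration: camera-pixel corners of a sheet of paper, and the
# projector-pixel positions that should land on those corners.
camera_pts = [(102, 80), (538, 91), (530, 410), (95, 402)]
projector_pts = [(0, 0), (1280, 0), (1280, 800), (0, 800)]
H = fit_homography(camera_pts, projector_pts)
```

Once `H` is known, a fingertip found in the camera image can be mapped directly into projector coordinates. Curved surfaces would need a denser model (e.g. a per-region mesh of such mappings); this flat-plane version only sketches the basic idea.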
Also quoted, “Until now, gesturing has often been used to operate PCs and other devices. But with this interface, we’re not operating a PC, but touching actual objects directly, and combining them with ICT equipment. The system is designed not to respond when you make ordinary motions on a table. It can be operated when you point with one finger. What this means is, the system serves as an interface combining analog operations and digital devices.”
For the touchscreen interface to register touches accurately, the system must measure fingertip height accurately: if the finger detection is off by just a single pixel, the computed height changes by 1 cm. Achieving that precision with an ordinary webcam requires dedicated image-processing technology.
“Using a low-res webcam gives a fuzzy picture, but the system calculates 3D positions with high precision, by compensating through image processing.”
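To see why a single pixel can matter so much, consider a standard depth-from-disparity model, treating the camera/projector pair as a stereo rig: depth is Z = f·B/d, so a one-pixel disparity error shifts the estimate by roughly Z²/(f·B). All the numbers below are illustrative assumptions, not Fujitsu's actual geometry:

```python
# Depth-from-disparity sensitivity (illustrative numbers, not Fujitsu's setup).
f = 800.0   # focal length in pixels (assumed low-cost webcam)
B = 0.10    # baseline between camera and projector, in metres (assumed)
Z = 0.9     # working distance to the tabletop, in metres (assumed)

# A one-pixel disparity error changes the depth estimate by about Z^2 / (f * B).
depth_error_per_pixel = Z**2 / (f * B)   # metres per pixel
print(round(depth_error_per_pixel * 100, 2), "cm per pixel")  # prints: 1.01 cm per pixel
```

With these plausible values the sensitivity comes out at about 1 cm per pixel, in line with the figure quoted above, which is why sub-pixel compensation through image processing is needed.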
The system also includes technology for controlling color and brightness, so it can identify fingertips consistently under varying conditions. And in situations where touching is not practical, the user can instead operate the system with gestures.
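The article does not explain how the color and brightness control works. One common trick for making skin detection robust to lighting changes is chromaticity normalization, dividing each channel by the pixel's total intensity so brightness largely cancels out. A minimal sketch, with assumed (uncalibrated) skin thresholds:

```python
import numpy as np

def normalize_brightness(frame):
    """Divide each channel by the pixel's total intensity (chromaticity),
    so that uniform changes in illumination brightness largely cancel out."""
    total = frame.sum(axis=2, keepdims=True).clip(min=1)
    return frame / total

def skin_mask(frame):
    """Very rough skin detector on normalized r/g chromaticity.
    The threshold values here are assumptions for illustration; a real
    system would calibrate them per scene and lighting setup."""
    rg = normalize_brightness(frame.astype(float))
    r, g = rg[..., 0], rg[..., 1]
    return (r > 0.35) & (r < 0.55) & (g > 0.25) & (g < 0.40)
```

Because the mask works on chromaticity rather than raw intensity, a skin-toned pixel keeps the same normalized values whether it is brightly or dimly lit, which is the property needed for consistent fingertip identification.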
“For example, we think this system could be used to show detailed information at a travel agent’s counter, or when you need to fill in forms at City Hall.”
“We aim to develop a commercial version of this system by fiscal 2014. It’s still at the demonstration level, so it has not yet been used in actual settings. Next, we’d like to get people to use it for actual tasks, see what issues arise, and evaluate usability. We want to reflect such feedback in this system.”