Thursday, 3 March 2011

Android In-Air Gesture User Interface with Pseudo-Holographic Display

The "Android In-Air Gesture User Interface with Pseudo-Holographic Display" is a project that comes to my mind after seeing the new OMAP 5 platform demo video shown below. The two features that I am very interested are the in-air gesture user input and the 3D holographic display. My intention would to add an in-air gesture user interface and a pseudo-holographic display to the Android platform running on a Pandaboard, a low-power and low-cost single-board computer based on the Texas Instruments OMAP4430 processor.


While both features seem futuristic, the possibility is already here, as shown by the many related open source projects. First, about in-air gestures: I was first introduced to the idea through Johnny Lee's project using a Wiimote, which acts as an infrared vision sensor to detect the infrared light reflected by the fingertips. A minimal sketch of how the Wiimote's IR tracking data could be decoded is shown below; it assumes the Wiimote has already been paired over Bluetooth and is reporting IR objects in the extended (3 bytes per object) mode, with the field layout as documented by the WiiBrew community. The class and method names are illustrative only.
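// Sketch: decode one Wiimote IR object from an extended-mode input report,
// assuming the Bluetooth HID payload has already been received into a byte array.
public final class WiimoteIrPoint {
    public final int x;     // 0..1023, horizontal position seen by the IR camera
    public final int y;     // 0..767, vertical position
    public final int size;  // rough blob size reported by the camera

    private WiimoteIrPoint(int x, int y, int size) {
        this.x = x;
        this.y = y;
        this.size = size;
    }

    // Decodes a single 3-byte extended-mode IR object starting at offset.
    public static WiimoteIrPoint fromExtendedReport(byte[] report, int offset) {
        int b0 = report[offset] & 0xFF;       // X low 8 bits
        int b1 = report[offset + 1] & 0xFF;   // Y low 8 bits
        int b2 = report[offset + 2] & 0xFF;   // size plus high bits of X and Y
        int x = b0 | ((b2 & 0x30) << 4);      // bits 5-4 of b2 are X[9:8]
        int y = b1 | ((b2 & 0xC0) << 2);      // bits 7-6 of b2 are Y[9:8]
        int size = b2 & 0x0F;                 // low nibble is the blob size
        return new WiimoteIrPoint(x, y, size);
    }
}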


Then we go a bit old school and look at the use of a normal digital camera or webcam to capture finger/hand gestures, as in the SixthSense project. Obviously this approach requires a lot more computing power for the real-time image processing compared to the hardware-based tracking in the Wiimote. SixthSense tracks coloured markers worn on the fingertips, and a very crude version of that idea is sketched below: threshold an RGB frame for pixels close to the marker colour and return the centroid. This is purely illustrative; a real implementation would use a proper vision library and a better colour model.
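// Sketch: crude colour-marker tracking over a captured camera frame.
public final class MarkerTracker {

    // Returns {x, y} in the 0..1 range, or null if the marker is not seen.
    public static float[] findRedMarker(int[] argbPixels, int width, int height) {
        long sumX = 0, sumY = 0;
        int hits = 0;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int p = argbPixels[y * width + x];
                int r = (p >> 16) & 0xFF;
                int g = (p >> 8) & 0xFF;
                int b = p & 0xFF;
                // Very naive "is this red enough?" threshold for a red marker.
                if (r > 150 && g < 90 && b < 90) {
                    sumX += x;
                    sumY += y;
                    hits++;
                }
            }
        }
        if (hits < 20) return null;  // too few pixels to trust
        return new float[] { (float) sumX / hits / width,
                             (float) sumY / hits / height };
    }
}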


Finally we have the gesture input of the Kinect, which uses a depth camera that not only provides the 2D colour image of the scene but also the depth information at every pixel (the original Kinect sensor does this with structured infrared light rather than a time-of-flight approach). This enables very precise finger/hand position tracking. Once raw depth frames can be read on the board, even a very simple heuristic can give a usable hand position, for example taking the nearest valid sample in the depth image, as sketched below. The depth buffer layout (one value per pixel, 0 meaning "no reading") is an assumption.
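// Sketch: locate the hand in a Kinect-style depth frame by picking the nearest
// valid depth sample, assuming the hand is the closest object to the sensor.
public final class DepthHandLocator {

    // Returns {x, y, depth} of the nearest valid pixel, or null if none.
    public static int[] nearestPoint(short[] depth, int width, int height) {
        int bestX = -1, bestY = -1, bestDepth = Integer.MAX_VALUE;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                int d = depth[y * width + x] & 0xFFFF;
                if (d > 0 && d < bestDepth) {   // 0 means no depth reading
                    bestDepth = d;
                    bestX = x;
                    bestY = y;
                }
            }
        }
        return bestX < 0 ? null : new int[] { bestX, bestY, bestDepth };
    }
}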


Among the three approaches, the connection from the Pandaboard to the Wiimote may be the easiest to accomplish, thanks to the Bluetooth connection already available. Connecting to a webcam, and especially to the Kinect, may require a lot more effort to get the Linux/Android device drivers ready. Once the data, whether points tracked by the Wiimote, image captures from the webcam or depth information from the Kinect, can be received by the Pandaboard, the OMAP4430 processor will have the necessary processing power to translate it into a usable Android input method (IME). As a first experiment, before a proper input method is written, the tracked position could simply be turned into synthetic touch events, as sketched below; note that injecting events system-wide this way needs system-level permissions (INJECT_EVENTS), so this is only a starting point.
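import android.app.Instrumentation;
import android.os.SystemClock;
import android.view.MotionEvent;

// Sketch: turn a tracked fingertip position (normalised 0..1 coordinates from
// the Wiimote, webcam or Kinect pipeline) into a synthetic touch event.
public final class GestureInjector {
    private final Instrumentation instrumentation = new Instrumentation();
    private final int screenWidth;
    private final int screenHeight;

    public GestureInjector(int screenWidth, int screenHeight) {
        this.screenWidth = screenWidth;
        this.screenHeight = screenHeight;
    }

    // Injects a single tap at the tracked position.
    public void tap(float normX, float normY) {
        float x = normX * screenWidth;
        float y = normY * screenHeight;
        long now = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(now, now, MotionEvent.ACTION_DOWN, x, y, 0);
        MotionEvent up = MotionEvent.obtain(now, now + 50, MotionEvent.ACTION_UP, x, y, 0);
        instrumentation.sendPointerSync(down);
        instrumentation.sendPointerSync(up);
        down.recycle();
        up.recycle();
    }
}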

Now about the 3D holographic display: the actual 3D projection of Princess Leia by R2-D2 that we saw in the Star Wars movie is not feasible yet. So I was thinking about a pseudo-holographic display that utilizes existing technology. The first approach would be to use a 360-degree light field display as shown below.


This approach requires a projector modified for an extremely high refresh rate, and the effect of a more economical, slower refresh rate would need to be studied. An alternative approach would be to use the Pepper's ghost effect, such as the one in HoloAd. For the Pepper's ghost pyramid, the usual trick is to compose four copies of the rendered frame, rotated 90 degrees apart around the centre of the panel, so that each face of the reflector sees one view. A minimal sketch of that composition is shown below, assuming the source frame is already available as a Bitmap; as a first approximation all four views show the same image, whereas a true light-field style effect would re-render the scene from four different camera angles.
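import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Matrix;

// Sketch: compose a Pepper's-ghost style frame with four rotated copies of the
// source frame placed around the centre of a square output panel.
public final class PeppersGhostComposer {

    public static Bitmap compose(Bitmap source, int outSize) {
        Bitmap out = Bitmap.createBitmap(outSize, outSize, Bitmap.Config.ARGB_8888);
        Canvas canvas = new Canvas(out);
        canvas.drawColor(Color.BLACK);   // black background keeps the reflection clean
        float half = outSize / 2f;
        float offset = outSize / 3f;     // distance of each view from the centre
        for (int i = 0; i < 4; i++) {
            Matrix m = new Matrix();
            // Centre the source frame, push it outwards, then spin it into place.
            m.postTranslate(-source.getWidth() / 2f, -source.getHeight() / 2f);
            m.postTranslate(0, -offset);
            m.postRotate(90f * i);
            m.postTranslate(half, half);
            canvas.drawBitmap(source, m, null);
        }
        return out;
    }
}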


Either approach also requires real-time re-rendering of the regular display to suit the 3D display, and this should be possible with the high-speed symmetric multiprocessing of the OMAP4430 processor, together with its powerful graphics core and multimedia accelerator. Since the OMAP4430 has two Cortex-A9 cores, the per-frame work (preparing the four Pepper's ghost views, or processing a camera/depth frame for the gesture input) could be spread across a small thread pool, as in the generic sketch below; the tasks themselves are placeholders for the real rendering steps.
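import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Sketch: spread per-frame work across the available CPU cores with a fixed
// thread pool, sized to the number of cores reported by the runtime.
public final class FrameWorkers {
    private final ExecutorService pool =
            Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());

    // Runs all per-view tasks for one frame and waits for them to finish.
    public void renderFrame(List<Callable<Void>> viewTasks) throws InterruptedException {
        pool.invokeAll(viewTasks);   // blocks until every view is rendered
    }

    public void shutdown() {
        pool.shutdown();
    }
}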

The integration of in-air gesture input and a pseudo-holographic 3D display into the now very popular and powerful Android mobile platform would be a glimpse into the future, and hopefully the project will bring usefulness and convenience to everyone, as you can see in the first video.
