Yuan-Kai Wang, Ching-Tang Fan, Shao-Ang Chen, Hou-Yeh Chen and Yueh-Ju Tai

Department of Electrical Engineering, Fu Jen Catholic University


        The goal of this project is to develop a smart portable device, named the X-Eye, which provides a gesture interface for human-computer interaction (HCI). To meet the high-speed image-processing requirements of gesture-based HCI, we adopt TI's dual-core architecture and choose the OMAP3530: a TI BeagleBoard with a CMOS sensor module and a TI Pico DLP projector serves as the X-Eye's hardware platform. Color identification and gesture recognition are the core software technologies of the project. We use the expectation-maximization (EM) algorithm to train a Gaussian mixture model (GMM) for color classification, and to improve the runtime performance of the GMM we devise a look-up table (LUT) data structure. Finally, gesture recognition is applied to interpret the user's gesture commands.
        The result of this project is a small portable device that enables anytime, anywhere interaction: it captures images and projects output images onto any plane at sizes up to 42 inches. In addition, the user can manage the image database on the device and select images through the gesture interface. Selected images are then transferred from the device to other electronic devices over a wireless network.
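The color-classification pipeline described above (EM-trained GMM plus a LUT that replaces per-pixel model evaluation with a table lookup) can be sketched as follows. This is a minimal illustration, not the project's embedded implementation: it uses scikit-learn's `GaussianMixture` (which fits via EM), synthetic chromaticity training samples, and an illustrative likelihood threshold and LUT resolution, all of which are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical training data: normalized (r, g) chromaticity samples of a
# target color (e.g. skin tone). The real system would train on camera frames.
rng = np.random.default_rng(0)
samples = rng.normal(loc=[0.45, 0.30], scale=0.03, size=(500, 2))

# Fit the GMM; scikit-learn runs the EM algorithm internally.
gmm = GaussianMixture(n_components=2, covariance_type="full",
                      random_state=0).fit(samples)

# Accept a pixel if its log-likelihood exceeds a threshold derived from the
# training data (the 5th percentile here is an illustrative choice).
thresh = np.percentile(gmm.score_samples(samples), 5)

# Precompute a LUT over the quantized color space so that classifying a
# pixel becomes one table lookup instead of a full GMM evaluation.
BINS = 64
axis = np.linspace(0.0, 1.0, BINS)
grid = np.stack(np.meshgrid(axis, axis, indexing="ij"), axis=-1).reshape(-1, 2)
lut = (gmm.score_samples(grid) > thresh).reshape(BINS, BINS)

def classify(r, g):
    """Table-lookup color classification for a chromaticity pair in [0, 1]."""
    i = min(int(round(r * (BINS - 1))), BINS - 1)
    j = min(int(round(g * (BINS - 1))), BINS - 1)
    return bool(lut[i, j])
```

With this sketch, a pixel near the trained color (e.g. `classify(0.45, 0.30)`) is accepted, while one far from the model (e.g. `classify(0.10, 0.80)`) is rejected; the per-pixel cost is two quantizations and an array access, which is the point of the LUT.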

System Structure

Hardware System / Software System

X-Eye System

Demo Video