Vol. 60(74), No. 2, June 2015

Gesture Control of Video Game Consoles and Smart TVs Using a Novel NIR Depth Camera
Dan Ionescu
NCCT Laboratory, University of Ottawa, School of Electrical Engineering and Computer Science, 161 Louis Pasteur, Room B-306, K1N 6N5, Ottawa, Canada, phone: (613) 562-5800, e-mail: dan@ncct.uottawa.ca
Viorel Suse
NCCT Laboratory, University of Ottawa, School of Electrical Engineering and Computer Science, 161 Louis Pasteur, Room B-306, K1N 6N5, Ottawa, Canada, e-mail: viorel@ncct.uottawa.ca
Cristian Gadea
NCCT Laboratory, University of Ottawa, School of Electrical Engineering and Computer Science, 161 Louis Pasteur, Room B-306, K1N 6N5, Ottawa, Canada, e-mail: cgadea@ncct.uottawa.ca
Bogdan Solomon
NCCT Laboratory, University of Ottawa, School of Electrical Engineering and Computer Science, 161 Louis Pasteur, Room B-306, K1N 6N5, Ottawa, Canada, e-mail: bsolomon@ncct.uottawa.ca
Bogdan Ionescu
Mgestyk Technologies, Inc., 80 Aberdeen Street, Suite 220, K1S 5R5, Ottawa, Ontario, Canada
Shahidul Islam
Mgestyk Technologies, Inc., 80 Aberdeen Street, Suite 220, K1S 5R5, Ottawa, Ontario, Canada


Keywords: human computer interfaces, real-time 3D camera technology, gesture control, video game consoles, digital television systems

Abstract
With the increased availability and affordability of 3D cameras like the Kinect sensor, academic research on gesture-based human computer interfaces is now more relevant than ever. Challenges in this domain include the creation of a 3D camera capable of capturing useful depth data, as well as software that can process the depth data and map it to commands. Since regular 2D cameras such as webcams are sensitive to lighting conditions, they fall short of the robustness required for detecting, tracking and recognizing gestures made with a person's hands or body. In this paper, a new depth camera that operates in the NIR spectrum is introduced. The camera is based on a novel “space slicing” principle, whereby an illuminator is pulsed using a monotonic increasing and decreasing function, allowing a cycle-driven feedback loop to control illumination intensity. The reflected IR light is captured in slices of the space in which the object of interest can be found. This allows a depth map to be reconstructed and processed in real-time. The Kinect has shown that there is a growing consumer appetite for controlling console-based video games through full-body gestures in the living room. In addition, users now wish to control their TV experience through gestures, including the various applications that run directly on modern Smart TVs. Through a series of experiments, this paper shows how the novel camera successfully enables these and other scenarios in a more natural and reliable way than was previously possible.
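The “space slicing” reconstruction described above can be illustrated with a minimal sketch. It assumes, as a simplification, that each illumination pulse level yields a binary mask of pixels whose NIR return was detected within the corresponding range slice (near to far); the function and array names are hypothetical, not from the paper's implementation.

```python
import numpy as np

def reconstruct_depth(slices):
    """Combine per-slice NIR detection masks into a depth map.

    slices: uint8 array of shape (n_slices, H, W); slices[i] is a
    binary mask of pixels detected while the illuminator output
    corresponded to range slice i (0 = nearest slice).
    Returns a depth map where each pixel holds 1 + the index of the
    nearest slice in which it was detected, or 0 if never detected.
    """
    n, h, w = slices.shape
    depth = np.zeros((h, w), dtype=np.uint8)
    # Iterate far-to-near so that nearer slices overwrite farther ones.
    for i in range(n - 1, -1, -1):
        depth[slices[i] > 0] = i + 1
    return depth

# Toy example: 3 slices over a 2x2 frame.
s = np.zeros((3, 2, 2), dtype=np.uint8)
s[0, 0, 0] = 1                    # pixel (0,0) seen in nearest slice
s[1, 0, 0] = 1                    # ...and again in slice 1
s[1, 1, 1] = 1                    # pixel (1,1) first seen in slice 1
d = reconstruct_depth(s)          # d == [[1, 0], [0, 2]]
```

In the real camera the feedback loop adjusts pulse intensity cycle by cycle, so slice boundaries adapt to the scene; the sketch only shows how detections across slices collapse into a single depth map.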

References
[1] I. E. Sutherland, “Sketchpad: A man-machine graphical communication system,” in AFIPS Spring Joint Computer Conference 23, 1964, pp. 329–346.
[2] (2008) Mgestyk Videos. Mgestyk Technologies Inc. [Accessed: October 2013]. [Online] Available: http://mgestyk.com/videos.html
[3] P. Lavoie, D. Ionescu, and E. Petriu, “3-D Object Model Recovery From 2-D Images Using Structured Light,” in Proc. IMTC/96, IEEE Instrum. Meas. Technol. Conf., Brussels, Belgium, 1996, pp. 377-382.
[4] K. Lai, J. Konrad, and P. Ishwar, “A gesture-driven computer interface using Kinect,” in Proc. of IEEE Southwest Symp. on Image Analysis and Interpretation, April 2012, pp. 185-188.
[5] C. Dal Mutto, P. Zanuttigh and G. M. Cortelazzo, “Time-of-Flight Cameras and Kinect,” in SpringerBriefs in Electrical and Computer Engineering, Springer, 2012.
[6] H. Shimotahira, K. Iizuka, S.-C. Chu, C. Wah, F. Costen, and Y. Yoshikuni, “Three-dimensional laser microvision,” in Appl. Opt. 40, 2001, pp. 1784-1794.
[7] H. Shimotahira, K. Iizuka, F. Taga, and S. Fujii, “3D laser microvision,” in Optical Methods in Biomedical and Environmental Science, Elsevier, New York, 1994, pp. 113-116.
[8] T. Kanamaru, K. Yamada, T. Ichikawa, T. Naemura, K. Aizawa, and T. Saito, “Acquisition of 3D image representation in multimedia ambiance communication using 3D laser scanner and digital camera,” in Three-Dimensional Image Capture and Applications III, Proc. SPIE 3958, 2000, pp. 80-89.
[9] D. A. Green, F. Blais, J.-A. Beraldin, and L. Cournoyer, “MDSP: a modular DSP architecture for a real-time 3D laser range sensor,” in Three-Dimensional Image Capture and Applications V, Proc. SPIE 4661, 2002, pp. 9-19.
[10] V. H. Chan and M. Samaan, “Spherical/cylindrical laser scanner for geometric reverse engineering,” in Three-Dimensional Image Capture and Applications VI, Proc. SPIE 5302, 2004, pp. 33-40.
[11] D. Ionescu, B. Ionescu, C. Gadea, and S. Islam, “A Multimodal Interaction Method that Combines Gestures and Physical Game Controllers,” in Proc. of the IEEE ICCCN 2011, Maui, Hawaii, July 31-August 4, 2011, pp. 1-6.
[12] W. T. Freeman and C. D. Weissman, “Television Control by Hand Gestures,” in Proc. of Int. Workshop on Automatic Face and Gesture Recognition. IEEE Computer Society, 1995, pp. 179–183.
[13] R. D. Vatavu, “User-Defined Gestures for Free-Hand TV Control,” in Proc. of the 10th European conf. on Interactive TV and Video, ACM, New York, 2012, pp. 45-48.
[14] R. A. Jarvis, “A perspective on range finding techniques for computer vision,” in IEEE Trans. Pattern Anal. Machine Intell., vol. 5, March 1983, pp. 122-139.
[15] D. Ionescu, B. Ionescu, C. Gadea, and S. Islam, “An Intelligent Gesture Interface for Controlling TV Sets and Set-Top Boxes,” in Proc. of 6th IEEE Int. Symp. on Applied Computational Intelligence and Informatics, SACI 2011, IEEE Computer Society, May 2011, pp. 159-165.
[16] D. Ionescu, V. Suse, C. Gadea, B. Solomon, B. Ionescu, and S. Islam, “A New NIR Camera for Gesture Control of Electronic Devices,” in Proc. of 8th IEEE International Symposium on Applied Computational Intelligence and Informatics, SACI 2013. IEEE Computer Society, Timisoara, Romania, May 2013.
[17] B. Wilson. (2012, November) Xbox 360 Controller Security Documentation. [Accessed: October 2013]. [Online] Available: http://www.brandonw.net/360bridge/doc.php
[18] Y. Zhu, B. Dariush, and K. Fujimura, “Controlled human pose estimation from depth image streams,” in IEEE Conference on Computer Vision and Pattern Recognition, 2008, pp. 1-8.
[19] S. Preitl and R.-E. Precup, “On the algorithmic design of a class of control systems based on providing the symmetry of open-loop Bode plots,” Buletinul Stiintific al U.P.T., Transactions on Automatic Control and Computer Science, vol. 41 (55), no. 2, pp. 47–55, Dec. 1996.