Vol. 57(71), No. 2, June 2012

Using Hand Gestures to Control a Collaborative Web-Based Environment
Cristian Gadea
NCCT Laboratory, University of Ottawa, School of Electrical Engineering and Computer Science, 161 Louis Pasteur, Room B-306, K1N 6N5, Ottawa, Canada, phone: (613) 562-5800 6236, e-mail: cgadea@ncct.uottawa.ca, web: http://www.ncct.uottawa.ca
Bogdan Ionescu
Mgestyk Technologies, Inc., 80 Aberdeen Street, Suite 220, K1S 5R5, Ottawa, Ontario, Canada, phone: (613) 591-1210 2036, e-mail: bogdan@mgestyk.com, web: http://www.mgestyk.com
Dan Ionescu
NCCT Laboratory, University of Ottawa, School of Electrical Engineering and Computer Science, 161 Louis Pasteur, Room B-306, K1N 6N5, Ottawa, Canada, e-mail: dan@ncct.uottawa.ca
Shahidul Islam
Mgestyk Technologies, Inc., 80 Aberdeen Street, Suite 220, K1S 5R5, Ottawa, Ontario, Canada, e-mail: shahid@mgestyk.com
Bogdan Solomon
NCCT Laboratory, University of Ottawa, School of Electrical Engineering and Computer Science, 161 Louis Pasteur, Room B-306, K1N 6N5, Ottawa, Canada, e-mail: bsolomon@ncct.uottawa.ca


Keywords: gesture-based control, natural user interface, web collaboration, human-computer interaction, 3D depth camera

Abstract
Gesture control is an important part of the next generation of human-computer interfaces. Hand gestures can make computers and devices easier to use, for example by allowing people to share photos with just a flick of the wrist. Much research has been undertaken to develop systems whereby users can effectively control computer software through gestures. Existing solutions, however, have relied on inadequate hardware, requiring elaborate setups confined to the laboratory. Because they depend heavily on image processing, the gesture recognition algorithms used so far are neither practical nor responsive enough for real-world use. Most importantly, existing solutions have lacked a software environment that allows users to perform common collaborative tasks. In this paper, a new paradigm for next-generation computer interfaces is introduced. The method presented is based on a custom 3D camera that is easy to set up and has a flexible detection range. The camera makes it possible to accurately detect hand gestures from depth data, allowing the gestures to be used to control any application or device. The paper proposes controlling application windows and their content within a collaborative, web-based environment in which many teams can cooperate to complete useful tasks. This is demonstrated with examples that include sharing photos and navigating maps.
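The abstract does not describe the authors' recognition pipeline in detail, but the core idea of detecting a hand from depth data can be illustrated with a simple depth-threshold segmentation. The sketch below is an assumption for illustration only (the function names, the millimetre depth units, and the nearest-object heuristic are all hypothetical, not taken from the paper): it isolates the closest surface in a depth frame and computes its centroid, the kind of position signal a gesture system could map to an on-screen cursor.

```python
import numpy as np

def segment_hand(depth_frame, band=150):
    """Segment the nearest object (assumed to be the hand) in a depth frame.

    depth_frame: 2D array of per-pixel distances in millimetres (0 = no reading).
    band: depth tolerance in mm around the nearest valid pixel.
    Returns a boolean mask of the hand region, or None if the frame is empty.
    """
    valid = depth_frame > 0
    if not valid.any():
        return None
    nearest = depth_frame[valid].min()
    # Keep only pixels within `band` mm of the closest surface.
    return valid & (depth_frame <= nearest + band)

def hand_centroid(mask):
    """Return the (row, col) centre of mass of the mask, e.g. for cursor mapping."""
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic 8x8 frame: background at 2000 mm, a 3x3 "hand" patch at 600 mm.
frame = np.full((8, 8), 2000)
frame[2:5, 3:6] = 600
mask = segment_hand(frame)
r, c = hand_centroid(mask)
print(mask.sum(), r, c)  # 9 pixels, centroid at (3.0, 4.0)
```

Working directly on depth values sidesteps the lighting and skin-colour problems that purely image-based segmentation faces, which is consistent with the abstract's argument for a 3D camera.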
