mr689.blogspot.com
Sight & Touch
http://mr689.blogspot.com/2010/05/eyedraw-enabling-children-with-severe.html
Monday, April 19, 2010. EyeDraw: Enabling Children with Severe Motor Impairments to Draw with Their Eyes. Anthony J. Hornof. Computer and Information Science, University of Oregon, Eugene, OR 97403 USA. Computer Science and Engineering, University of Washington, Seattle, WA 98195 USA. This is an application that enables children with severe motor disabilities to draw pictures with their eyes. The application runs on a computer equipped with an eye-tracking device. Save and open drawings. The easiest f...
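Drawing with gaze alone typically relies on dwell detection: the system treats a gaze that lingers in one spot as an intentional "click" or pen-down. As a minimal sketch of that idea (the radius and sample-count thresholds are illustrative assumptions, not values from the paper):

```python
import math

def detect_dwell(gaze_points, radius=30.0, min_samples=10):
    """Return the centroid of the most recent fixation if the gaze has
    stayed within `radius` pixels for at least `min_samples` samples,
    otherwise None. gaze_points is a list of (x, y) screen positions."""
    if len(gaze_points) < min_samples:
        return None
    recent = gaze_points[-min_samples:]
    cx = sum(x for x, _ in recent) / min_samples
    cy = sum(y for _, y in recent) / min_samples
    # A dwell only counts if every recent sample sits near the centroid.
    if all(math.hypot(x - cx, y - cy) <= radius for x, y in recent):
        return (cx, cy)
    return None
```

A drawing loop would call this on each new tracker sample and start or anchor a line segment whenever a dwell fires.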
Sight & Touch
http://mr689.blogspot.com/2010/05/webcam-mouse-using-face-and-eye.html
Monday, May 3, 2010. Webcam Mouse Using Face and Eye Tracking in Various Illumination Environments. Yuan-Pin Lin, Chung-Chih Lin, Jyh-Horng Chen. Institute of Electrical Engineering, Department of Computer Science and Information Engineering. Face tracking is performed using a nonlinear skin-color transform to overcome variations in lighting. Iris tracking is used to identify the eyes and their position. The cursor is then calculated as a product of the eye position relative to the head position...After...
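The core mapping — cursor position driven by where the eyes sit relative to the detected head — can be sketched as a simple gain on the eye-to-face offset. The `gain` constant and the clamping behaviour here are assumptions for illustration, not the paper's calibration:

```python
def cursor_from_tracking(face_center, eye_center, screen=(1920, 1080),
                         gain=(40.0, 40.0)):
    """Map the eye position relative to the detected face centre to a
    screen coordinate. gain = pixels of cursor travel per pixel of
    eye offset in the webcam image."""
    dx = eye_center[0] - face_center[0]
    dy = eye_center[1] - face_center[1]
    x = screen[0] / 2 + gain[0] * dx
    y = screen[1] / 2 + gain[1] * dy
    # Clamp so extreme offsets keep the cursor on-screen.
    return (max(0, min(screen[0] - 1, x)),
            max(0, min(screen[1] - 1, y)))
```

With eyes centred in the face the cursor rests at screen centre; offsets move it proportionally until it hits the screen edge.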
Sight & Touch: March 2010
http://mr689.blogspot.com/2010_03_01_archive.html
Monday, March 29, 2010. Whack Gestures: Inexact and Inattentive Interaction with Mobile Devices. Scott E. Hudson. Beverly L. Harrison. Human-Computer Interaction Institute, Carnegie Mellon University, Pittsburgh, PA 15213. Intel Labs Seattle, WA 98105. Whacks by themselves are hard to distinguish from bumps, so to minimize false positives a pair of whacks was used to frame a gesture. The three gestures used for evaluation: whack as a signal gesture; shaking as a signal gesture. The need for ges...
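The pair-of-whacks framing can be sketched as spike detection over accelerometer magnitudes: a single spike (a bump) is ignored, and only two spikes close together in time count as a gesture. Threshold and window values here are illustrative assumptions:

```python
def frame_gesture(samples, threshold=2.5, max_gap=20):
    """Detect a pair of acceleration spikes ("whacks") separated by at
    most `max_gap` samples. Returns the (first, second) sample indices
    of the framing pair, or None if no pair is found. A lone spike is
    rejected, which is the false-positive defence described above."""
    spikes = [i for i, a in enumerate(samples) if abs(a) > threshold]
    for i, first in enumerate(spikes):
        for second in spikes[i + 1:]:
            if 0 < second - first <= max_gap:
                return (first, second)
    return None
```

A real implementation would also debounce samples within one whack and require a minimum spacing between the pair, but the rejection of isolated bumps is the essential idea.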
Sight & Touch: January 2010
http://mr689.blogspot.com/2010_01_01_archive.html
Thursday, January 28, 2010. 3DM: A Three-Dimensional Modeler Using a Head-Mounted Display. Jeff Butterworth, Andrew Davidson, Stephen Hench and T. Marc Olano. The notion of developing an easily accessible 3D modeler using VR in the early 90s was very forward-thinking. In many regards this same goal, in a simplified non-VR version, was taken up by SketchUp when it appeared in its original form in 2000. Although the resulting images look a little primitive from today’s perspective, for the time it is very ...
Sight & Touch: April 2010
http://mr689.blogspot.com/2010_04_01_archive.html
Wednesday, April 28, 2010. Real-Time Hand-Tracking as a User Input Device. Robert Y. Wang. Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA USA. Low-dispersion sampling of hand poses is used to provide a maximum bound on the estimation error that the algorithm can make. The result is a database in which the distance from the k-nearest neighbors to the query image is minimized. Fast Nearest Neighbor Search: This application was intended for 3D direct man...
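The retrieval step — find the database pose whose stored image code is closest to the query image — can be sketched as a nearest-neighbour scan. The paper uses an optimised search structure over its pose database; this brute-force version with integer bit-codes and Hamming distance is an assumed simplification that just shows the lookup idea:

```python
def hamming(a, b):
    """Hamming distance between two integer bit-codes."""
    return bin(a ^ b).count("1")

def nearest_pose(database, query_code):
    """Linear scan over (code, pose) pairs, returning the pose whose
    code is closest to the query, along with that distance."""
    best_pose, best_dist = None, float("inf")
    for code, pose in database:
        d = hamming(code, query_code)
        if d < best_dist:
            best_pose, best_dist = pose, d
    return best_pose, best_dist
```

Low-dispersion sampling matters because it guarantees every possible query image lies near *some* database entry, bounding the error this lookup can make.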
Sight & Touch: February 2010
http://mr689.blogspot.com/2010_02_01_archive.html
Wednesday, February 24, 2010. American Sign Language Recognition in Game Development for Deaf Children. Georgia Institute of Technology, GVU Center, College of Computing, Atlanta, Georgia, USA. (brashear, sylee, vlh, thad)@cc.gatech.edu. Korea Advanced Institute of Science and Technology, Daejeon, Republic of Korea. Center for Accessible Technology in Sign, Atlanta Area School for the Deaf, Clarkston, Georgia, USA. hhamilto@doe.k12.ga.us. Five children were used in the study, all of them played all three levels...
Sight & Touch
http://mr689.blogspot.com/2010/04/coming-to-grips-with-objects-we-grasp.html
Wednesday, April 7, 2010. Coming to Grips with the Objects We Grasp: Detecting Interactions with Efficient Wrist-Worn Sensors. Eugen Berlin, Jun Liu, Kristof van Laerhoven, Bernt Schiele. Department of Computer Science, Technische Universität Darmstadt. M1-mini SkyeTek reader interface circuitry. Two ambient light sensors. Extending the range of the RFID from the wrist to the hand. Choice of antenna for best performance. Classification of 3D accelerometer data. Used in optimizing the antenna. How the essenc...
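Classifying 3D accelerometer data usually starts by reducing each window of raw samples to a small feature vector. As a sketch of that front end — the mean/standard-deviation feature choice is a common convention assumed here, not necessarily the paper's exact pipeline:

```python
import math

def window_features(ax, ay, az):
    """Compute per-window features (mean and population standard
    deviation per axis) from 3-axis accelerometer samples. The
    resulting 6-element vector is the kind of input a grasp/activity
    classifier could be trained on."""
    feats = []
    for axis in (ax, ay, az):
        n = len(axis)
        mean = sum(axis) / n
        var = sum((v - mean) ** 2 for v in axis) / n
        feats.extend([mean, math.sqrt(var)])
    return feats
```

Each wrist-worn sensor window thus becomes a fixed-length vector regardless of sampling rate, which is what makes standard classifiers applicable.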
Sight & Touch
http://mr689.blogspot.com/2010/05/peppermill-human-powered-user-interface.html
Monday, April 5, 2010. The Peppermill: A Human-Powered User Interface Device. The circuit diagram is presented, showing a relatively simple implementation: standard components, with a motor and reduction-gear assembly providing the source of power. An artificial video-browsing scenario application was developed. Users could choose channels and adjust volume. The rate of rotation determined the speed of selection of items on the screen. Posted by M Russell.
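Since the motor acts as a generator, the voltage it produces grows with the rate of rotation, and that signal can be mapped directly to a selection speed. A minimal sketch of that mapping, with an assumed deadband, gain, and rate cap (none of these constants come from the paper):

```python
def selection_rate(voltage, deadband=0.2, gain=5.0, max_rate=10.0):
    """Map the dynamo voltage from turning the Peppermill to an
    item-scrolling rate (items per second). Sign of the voltage is
    taken as the turning direction; magnitude sets the speed."""
    magnitude = abs(voltage)
    if magnitude < deadband:        # ignore jitter near standstill
        return 0.0
    rate = gain * (magnitude - deadband)
    direction = 1 if voltage >= 0 else -1
    return direction * min(rate, max_rate)
```

Turning faster raises the voltage and so scrolls faster, up to the cap; reversing direction flips the sign and scrolls the other way.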