
Gesture Tracking

Goal: Add the ability to use a PC camera to detect head and upper body movements and relay them to your own avatar. This is a design for a simple first project, intended as a summer intern project that affects only the viewer/Snowglobe codebase.

Minimal Functional Goal

  • Nodding or shaking your head 'yes' or 'no' triggers the equivalent animation on your SL avatar.
  • Relative motion of your head is captured and used to animate where your avatar is looking in real time. In other words, if you look up, down, left, or right, your avatar will do the same thing.

Design Details

Add the ability for the Snowglobe viewer to listen on a port for local UDP messages containing either head position updates or animation triggers. This gives developers an easy way to control the viewer simply by sending small network packets, and it keeps the latency and CPU cost of the interface to a minimum.
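
As a rough sketch of what the viewer-side listener might look like, here is a minimal loop using plain POSIX sockets. The port number (13524) and the one-line text protocol (LOOKAT dx dy dz, ANIM name) are illustrative assumptions, not part of this design; in the viewer this would run on its own thread or be polled non-blocking each frame.

    // Minimal sketch of the viewer-side UDP listener (POSIX sockets).
    // The port number and the one-line text protocol are illustrative
    // assumptions, not part of the actual design.
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <cstdio>

    int main() {
        int sock = socket(AF_INET, SOCK_DGRAM, 0);
        sockaddr_in addr{};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK); // local packets only
        addr.sin_port = htons(13524);                  // hypothetical port
        bind(sock, (sockaddr*)&addr, sizeof(addr));

        char buf[256];
        for (;;) {
            ssize_t n = recv(sock, buf, sizeof(buf) - 1, 0);
            if (n <= 0) continue;
            buf[n] = '\0';
            float dx, dy, dz;
            char name[64];
            if (sscanf(buf, "LOOKAT %f %f %f", &dx, &dy, &dz) == 3) {
                // hand the offsets to the avatar's lookAt target here
                printf("lookAt offset: %.2f %.2f %.2f\n", dx, dy, dz);
            } else if (sscanf(buf, "ANIM %63s", name) == 1) {
                // trigger the named avatar animation here
                printf("trigger animation: %s\n", name);
            }
        }
    }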

Write an application which accesses the PC camera, analyzes the image to detect head movement and gestures, and sends UDP packets to the viewer port.
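
The sending side of the same hypothetical protocol could be wrapped in a small helper that the camera app (and the keystroke-driven test harness from the first milestone) calls; the port and message strings below are assumptions matching the listener sketch above.

    // Sketch of the camera-app side: open a UDP socket once and send
    // small text messages to the viewer. Port and message format match
    // the hypothetical protocol in the listener sketch above.
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>
    #include <string>

    class ViewerLink {
    public:
        ViewerLink() : sock_(socket(AF_INET, SOCK_DGRAM, 0)), addr_{} {
            addr_.sin_family = AF_INET;
            addr_.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
            addr_.sin_port = htons(13524); // hypothetical port
        }
        ~ViewerLink() { close(sock_); }
        void send(const std::string& msg) {
            sendto(sock_, msg.data(), msg.size(), 0,
                   (const sockaddr*)&addr_, sizeof(addr_));
        }
    private:
        int sock_;
        sockaddr_in addr_;
    };

    // usage: ViewerLink link; link.send("ANIM nod_yes");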

Project Milestones (weekly or better)

  • Write the basic 'camera' app that can send UDP messages to the SL viewer, initially triggered by keystrokes for testing. Modify the SL viewer to receive these packets and trigger an animation.
  • Add the ability for the SL viewer to receive head position information and pass it on to the lookAt point for the avatar. Again, use keystrokes for trivial testing. Verify that latency and performance are adequate (effectively instantaneous).
  • Add the ability to get frames of video from the camera. Demonstrate the simplest analysis by looking for a bright light or color in the frame and animating lookAt according to its location in the camera frame (see the sketch after this list).
  • Add calls to OpenCV (or an alternative library) to detect head orientation. Pass head orientation to the viewer's lookAt.
  • Add the ability to trigger animations based on data reported from the analysis library. Demonstrate nodding and shaking of the head.
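
As a concrete sketch of the "simplest analysis" from the third milestone: find the brightest point in each frame with OpenCV and map its position to a normalized offset, which could then be sent as a LOOKAT message. This uses OpenCV's C++ API; the brightness threshold and the offset convention are illustrative guesses.

    // Sketch of the third milestone: find the brightest point in each
    // camera frame and turn its position into a normalized lookAt
    // offset. Threshold and offset convention are assumptions.
    #include <opencv2/opencv.hpp>
    #include <cstdio>

    int main() {
        cv::VideoCapture cam(0);       // default PC camera
        if (!cam.isOpened()) return 1;

        cv::Mat frame, gray;
        while (cam.read(frame)) {
            cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
            cv::GaussianBlur(gray, gray, cv::Size(11, 11), 0); // suppress noise

            double maxVal;
            cv::Point maxLoc;
            cv::minMaxLoc(gray, nullptr, &maxVal, nullptr, &maxLoc);
            if (maxVal < 200) continue; // no bright light in frame

            // Map the pixel position to [-1, 1], centered on the frame.
            float dx = 2.0f * maxLoc.x / gray.cols - 1.0f;
            float dy = 2.0f * maxLoc.y / gray.rows - 1.0f;
            std::printf("LOOKAT %.2f %.2f 0.0\n", dx, -dy); // send via ViewerLink
        }
    }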

Extra Credit

  • Attempt to detect other upper body gestures: hand raised, hand wave, shoulder shrug.
  • Extend the application to run on both Mac and Windows.
  • Include the application in the Snowglobe installer package.

Useful Links

  • Learning OpenCV: http://oreilly.com/catalog/9780596516130/

Project Log

The LL intern will start the first week of June and begin updating this page. In the meantime, it would be great if someone were interested in implementing the viewer-side changes: listening for UDP packets and triggering animations and lookAt based on the data in those packets.

See Also

Camera-based Input