This article explains how developers can experiment and code with the new Puppetry feature in Second Life. It is intended for programmers who are familiar with Python or C++ and want to make their avatar come alive.
Second Life Open Source Viewer
Puppetry requires an experimental Second Life Viewer. A pre-built version is available for download from the alternative downloads page. If you're only developing Puppetry plug-ins there is no need to build the Viewer from scratch. However, if you want to make changes to messaging, the avatar animation system, or the Inverse Kinematics (IK) algorithm, you will need to download and build the open-source Viewer code.
Puppetry feature detection
To avoid compatibility problems when deploying experimental features, Second Life uses a system called "capabilities". When a Second Life Viewer first connects to a Region it submits a list of features it knows how to use, and the Region responds with a URL (capability) for each of those features it supports. This is how the Puppetry viewer checks whether the current region has the "Puppetry" capability, and likewise how the region detects whether the viewer supports "Puppetry".
With a capability, viewers can GET data and status information from the server, or POST to configure and update it. This page has more information about how the Puppetry capability works. Capabilities are often referred to as "caps" in documentation.
Puppetry expects data from a LEAP plug-in that has been launched by the Viewer as a side process. LEAP stands for LLSD Event API Plug-in and is a technology that has been embedded in the Second Life Viewer for a while but has, until now, only been used for internal testing and automated updates. A LEAP plug-in can be an executable or a Python script; the only other requirement is that it read and write properly formatted LEAP data on its stdin and stdout pipes. So far, however, only Python utilities have been written, and all of the existing Puppetry plug-ins are Python scripts.
Clone the leap repository to your computer. The Puppetry plug-ins can be found in
You will need Python version 3.6 or later installed on your computer, along with any Python modules a plug-in depends on. Follow the instructions in the ReadMe file carefully and ensure all requirements are met, otherwise Puppetry may not work.
CAUTION: Only run plug-ins from a source that you trust. Because this project is in open development, there is currently no safety mechanism to protect against malicious plug-ins.
Try it out
When Puppetry is available the following menu item should be enabled:
Advanced → Puppetry → Launch LEAP plug-in...
Select that menu item and it will open a file picker. Navigate to where you downloaded the leap repository and find a plug-in to try. A simple plug-in to try is:
leap/puppetry/examples/arm_wave.py, which will make your avatar slowly wave one arm.
A more complex plug-in is the
leap/puppetry/webcam/webcam_puppetry.py module. This will start your webcam and try to recognize your movements and animate your avatar. It's a work in progress: at first release, head motion is reasonably good, but we're not satisfied with arm and hand movement, and have not explored finger, facial, or lip motion.
Developing Puppetry Plug-ins
All of the working examples are written in Python and use the puppetry module API to submit data to the Second Life Viewer. Data is exchanged between the plug-in and the Viewer via the plug-in's stdin and stdout streams. This means the plug-in must not write any spurious debug text to stdout, or it risks corrupting the LEAP data and confusing the Viewer. To log debug messages while running under the Viewer, use puppetry.log() and look in the Second Life log file for those messages; alternatively, anything written to stderr is also echoed to the log file.
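To illustrate the constraint above, here is a minimal logging helper that keeps debug text off stdout. The helper name is ours for illustration; in a real plug-in, prefer puppetry.log():

```python
import sys

def debug_log(message):
    # stdout is reserved for LEAP data, so debug text must go elsewhere.
    # Anything a plug-in writes to stderr is echoed to the Second Life
    # log file, making stderr a safe destination for ad-hoc messages.
    # (debug_log is an illustrative helper, not part of the puppetry API.)
    print(message, file=sys.stderr)

debug_log("plug-in started")
```

Printing to stderr this way never touches the stdout pipe the Viewer is parsing.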
puppetry depends on the leap module, which uses eventlet to run multiple co-routines on a single thread. As a consequence, any Python Puppetry plug-in must be eventlet-friendly: it should call
eventlet.sleep(0) in its main loop to yield control. For best results, follow the pattern used by a working example.
If a plug-in throws an exception and stops running, its callstack is written to stderr, which always gets echoed to the Second Life log file, so look there for details about what went wrong. If a Python script crashes immediately, it is sometimes possible to run it from the command line interface (CLI) to see the callstack.
The Puppetry data stream is a series of events. Each event is effectively a key:value pair where the key is the joint's name (as listed in the
avatar_skeleton.xml configuration file found in the Viewer's codebase) and the value is its intended transform. The transform can specify position and/or orientation.
The position is always in the root-frame of the avatar skeleton, where the root is the avatar's pelvis. The distance units are "normalized": when the avatar stands in the T-pose (upright, arms extended straight out from the sides), the far end of the hand of the avatar's longest arm has a reach of 1.0 units, so a symmetric avatar in T-pose has an arm span of 2.0 units in the normalized root-frame. When the Viewer receives a position it scales it uniformly according to the size of the avatar instance.
The position coordinate frame is "right handed" (i.e. Z equals X cross Y) and the axes are: X=forward, Y=left, Z=up.
The orientation can be specified in the avatar root-frame, or in the joint's parent-frame, and is always represented as a normalized quaternion (XYZW), although only the imaginary part (XYZ) is packed in the event and the real-part (W) is inferred mathematically when necessary.
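As a sketch of that packing convention: given the imaginary XYZ part, the real part W can be recovered from the unit-length constraint, assuming the usual convention that W is kept non-negative (the sign convention is our assumption here):

```python
import math

def unpack_quaternion(xyz):
    # A normalized quaternion satisfies x^2 + y^2 + z^2 + w^2 = 1, so the
    # real part W can be inferred from the packed imaginary part XYZ.
    # max() guards against tiny negative values from floating-point error;
    # this assumes the sender packed the quaternion with W >= 0.
    x, y, z = xyz
    w = math.sqrt(max(0.0, 1.0 - (x * x + y * y + z * z)))
    return (x, y, z, w)
```

For example, the identity rotation packs as (0, 0, 0) and unpacks to the quaternion (0, 0, 0, 1).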
An example Puppetry event might look something like this:
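In Python terms, an event could resemble the structure below. The joint name comes from avatar_skeleton.xml, but the field names ("pos", "rot") and values here are illustrative assumptions, not the authoritative wire schema:

```python
# A hypothetical Puppetry event targeting the left wrist. The key is a
# joint name from avatar_skeleton.xml; the value is its intended transform.
# "pos" is a position in the normalized avatar root-frame, and "rot" packs
# only the imaginary XYZ part of a normalized quaternion (field names and
# values are illustrative, not the authoritative schema).
event = {
    "mWristLeft": {
        "pos": [0.3, 0.4, 0.2],
        "rot": [0.0, 0.0, 0.19509],
    }
}
```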
Each Puppetry data event is applied locally as soon as it is received. It is also given a timestamp and streamed to the Second Life server, which relays the data to other nearby Puppetry-enabled Viewers. Rather than applying the events immediately, the remote Viewers put them into a jitter buffer, compute interpolated data from it, and apply that at a different (typically faster) frame rate than the events were received at. Consequently, if the local animation looks a little choppy, the remote view is typically smoother, albeit slightly delayed. You can see what your own interpolated data looks like by enabling the menu option:
Advanced → Puppetry → Use my server data
With this on, you will see your animations as others in-world see them: sent from the region server.
Viewer Code Changes
If you want to experiment with viewer source code, you'll be working in C++ and will need to download the Viewer source code which is available on GitHub. Follow the instructions there to download and build the code.
You will need to switch your source code to the DRTVWR-558 branch.
git checkout DRTVWR-558
Search the codebase for "puppetry" to get an idea of where the code is, or for "llik" to find the inverse kinematics implementation. Remember there are multiple components working together here: the interface to the LEAP Puppetry plug-in sends data into the viewer, while the viewer can send messages into the plug-in to select options such as which webcam to use.
If you modify this code, you'll need to rebuild the viewer, launch it, and then start your Puppetry module to test your changes.
If you plan on modifying code and submitting your work back to the project, create your own local branch from this one for your changes. The normal procedures for Open Source contributions to Second Life apply.
The Inverse Kinematics algorithm
The "Inverse Kinematics problem" is often encountered in the fields of video games, computer animation, and robotics. It is the situation where the final position and/or orientation of one or more bones within a hierarchical skeleton are specified or "known", while those of the rest of the system are "unknown". The bones with known transforms are called "end-effectors" and their target transforms are called "goals". The challenge is to compute transforms for ALL of the bones in the system such that the end-effectors achieve their goals, or approach them as closely as possible.
There are several IK algorithms to choose from. They are all iterative in nature: each time the algorithm is applied, the new result gets closer to an acceptable solution. The iterations are stopped when the end-effectors are "close enough" to their goals, or when the time/compute budget is exhausted. However, achieving the goals is often an insufficient criterion for the IK problem. There are typically an infinite number of solutions, but they are not equally valid because some require joints to bend in unnatural directions. Therefore not only must the end-effectors reach their goals, but the joints of the skeleton must be "constrained" to orient only within their limits: the elbow pivots about a fixed local axis within a limited angle range, and the forearm is limited in how much it can twist.
Example IK algorithms include: Cyclic Coordinate Descent (CCD), the Jacobian pseudo-inverse, and Forward And Backward Reaching Inverse Kinematics (FABRIK). The Second Life Puppetry system uses an implementation of FABRIK with joint constraints. At the moment the constraint parameters are hard-coded in the C++ and are specified only for the spine, arms, and head. The legs and other bones are currently unconstrained. The pelvis is immovable: it represents the origin of the root-frame.
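To make the backward/forward iteration concrete, here is a minimal, unconstrained FABRIK solve for a single chain of joints. This is a sketch of the general algorithm only, not the Viewer's C++ implementation, which layers joint constraints on top of passes like these:

```python
import math

def lerp(a, b, t):
    # Linear interpolation between two 3D points.
    return [a[i] + (b[i] - a[i]) * t for i in range(3)]

def fabrik(points, goal, tolerance=1e-3, max_iters=32):
    """Unconstrained FABRIK for one chain of 3D joints.

    points: joint positions, root first; goal: target position for the
    last joint (the end-effector). Returns new joint positions.
    """
    lengths = [math.dist(points[i], points[i + 1])
               for i in range(len(points) - 1)]
    root = list(points[0])
    pts = [list(p) for p in points]

    if math.dist(root, goal) > sum(lengths):
        # Goal out of reach: stretch the chain straight toward it.
        for i, length in enumerate(lengths):
            pts[i + 1] = lerp(pts[i], goal, length / math.dist(pts[i], goal))
        return pts

    for _ in range(max_iters):
        if math.dist(pts[-1], goal) < tolerance:
            break  # end-effector is "close enough" to its goal
        # Backward pass: pin the end-effector to the goal, walk to the root,
        # re-placing each joint at its bone length from its child.
        pts[-1] = list(goal)
        for i in range(len(pts) - 2, -1, -1):
            pts[i] = lerp(pts[i + 1], pts[i],
                          lengths[i] / math.dist(pts[i + 1], pts[i]))
        # Forward pass: re-anchor the root, walk back out to the tip.
        pts[0] = root
        for i in range(len(pts) - 1):
            pts[i + 1] = lerp(pts[i], pts[i + 1],
                              lengths[i] / math.dist(pts[i], pts[i + 1]))
    return pts
```

Each pass preserves the bone lengths while pulling the chain toward the goal; a constrained version would clamp each joint's orientation to its limits after every re-placement.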
We would love to hear about any ideas and creations you have with this technology. There are a lot of possibilities and different ways we can open up Second Life for more fun. Please join the conversation in user groups and forum postings, and show us what you make.