User talk:Aric Linden

Latest revision as of 07:22, 4 November 2007

Filled in very basic user page

Hi Aric, I filled in User:Aric Linden with the very basics, since I'm linking to the page from Bug triage/Monday Agenda. -- Rob Linden 15:37, 17 July 2007 (PDT)

Request Contact Info

Hi, Aric. My name is Jason Smith (SL user jhs Sungsoo). After nearly a year, my Google alert finally triggered a hit on SL MIDI and I found you. Unfortunately, I cannot find your email address.

About six months ago, I built a working prototype connecting MIDI and Second Life. I have since abandoned the project to chase more mundane business leads, but I always felt like I turned my back on something really positive for Second Life. Here is what I had done before I moved on to other things.

Summary

Client-side software that interfaces with traditional MIDI hardware or software to control avatars and in-world objects. Future plans cover real-time streaming (in-world "jam sessions") and control of in-world show objects, just as in the real world, using MIDI Show Control (MSC).

Details

The demo runs client-side using the open-source libsecondlife library. It is cross-platform: developed on Linux, it also works on Windows. When it runs, you will see some new MIDI ports show up on your computer. You may route to or from these ports however you like (e.g., plug in mixing software, keyboards, virtual devices, whatever).
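
A minimal sketch of just that port-exposure step, written in Python with the mido library (the actual demo was built against libsecondlife, so everything here, including the port name, is illustrative rather than the demo's own code):

 # Sketch: expose a virtual MIDI input port and watch the traffic arriving on it.
 # Assumes mido with the python-rtmidi backend; "SL-MIDI In" is an illustrative name.
 import mido

 def main():
     # virtual=True makes "SL-MIDI In" appear as a destination in other MIDI
     # software on the same machine, so anything can be routed into it.
     with mido.open_input('SL-MIDI In', virtual=True) as inport:
         print('Route your keyboard, sequencer, or fader box to "SL-MIDI In".')
         for msg in inport:  # blocks, yielding one decoded message at a time
             if msg.type == 'note_on' and msg.velocity > 0:
                 print(f'note {msg.note} on channel {msg.channel}')
             elif msg.type == 'control_change':
                 print(f'CC {msg.control} = {msg.value}')

 if __name__ == '__main__':
     main()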

MIDI data can be used to control your avatar. In my demo, a C chord will make the avatar fly, and adjusting a fader will change the altitude. (This is not useful in practice; it just shows off the possibilities.)
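
A sketch of that mapping, continuing the mido assumption above; fly() and set_altitude() are hypothetical placeholders for whatever avatar-control calls the client library exposes, not real libsecondlife calls:

 # Sketch: a held C major triad toggles flight, a fader (CC 7) sets altitude.
 # fly() and set_altitude() are hypothetical placeholders, not real library calls.
 C_MAJOR = {60, 64, 67}  # middle C, E, G
 held_notes = set()

 def fly(enabled):
     print('fly on' if enabled else 'fly off')  # placeholder for a client call

 def set_altitude(meters):
     print(f'altitude -> {meters:.1f} m')       # placeholder for a client call

 def handle(msg):
     if msg.type == 'note_on' and msg.velocity > 0:
         held_notes.add(msg.note)
         if C_MAJOR <= held_notes:              # all three chord tones are down
             fly(True)
     elif msg.type == 'note_off' or (msg.type == 'note_on' and msg.velocity == 0):
         held_notes.discard(msg.note)
         if not C_MAJOR <= held_notes:
             fly(False)
     elif msg.type == 'control_change' and msg.control == 7:
         set_altitude(20 + (msg.value / 127) * 180)  # scale 0-127 onto 20-200 m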

Scripts in SL can also pick up your MIDI signals and respond to them. I made a demo MIDI laser pointer, which changes its color based on the pitch of the notes you play and changes its pointing direction based on some faders you can slide around on the MIDI device in your studio. Another application is making an avatar play the correct piano keys based on the pitch coming from the instrument. (The mechanism is that you can give an avatar, say, a hundred related animations or poses, and instead of trudging through them in the clunky video-game interface, MIDI events trigger the appropriate animations automatically.)
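
The relay between the client and the scripted object is not spelled out above, so the sketch below assumes one plausible mechanism: the client forwards each event as chat on a private channel, which an LSL script inside the pointer could pick up with llListen. say_on_channel() is a hypothetical stand-in for that client call.

 # Sketch: map note pitch to a color and relay it to the in-world laser pointer.
 # say_on_channel() is a hypothetical stand-in for "send chat on a private channel";
 # an LSL script in the object would listen for these strings with llListen.
 import colorsys

 LASER_CHANNEL = -73120  # arbitrary private channel number, illustrative only

 def say_on_channel(channel, text):
     print(f'[chan {channel}] {text}')           # placeholder for a client call

 def note_to_color(note):
     # Spread the 12 pitch classes around the hue wheel: C is red, F# is cyan.
     hue = (note % 12) / 12.0
     return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

 def on_event(msg):
     if msg.type == 'note_on' and msg.velocity > 0:
         r, g, b = note_to_color(msg.note)
         say_on_channel(LASER_CHANNEL, f'COLOR {r:.2f} {g:.2f} {b:.2f}')
     elif msg.type == 'control_change' and msg.control in (1, 2):
         # Two faders steer the beam: CC 1 for yaw, CC 2 for pitch, as 0..1 fractions.
         say_on_channel(LASER_CHANNEL, f'AIM {msg.control} {msg.value / 127:.3f}')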

Implications

Interacting with Second Life over MIDI allows broader and more nuanced control of the avatar in real time. You can "play" avatars and objects as if they were instruments. You also get live control of virtual show equipment such as lights, smoke, and hydraulics. I think there are implications for machinima and in-world plays.

Controlling objects with MIDI is also valuable. MIDI Show Control (MSC) is a common standard for automated show equipment. The entertainment industry already uses MSC to produce live shows, so all it needs is some in-world objects that behave like their real-world counterparts, and it can produce entire in-world shows with the same software and equipment it uses every day. Mixed-reality events can also be better synchronized when the real-world lights and smoke machines get the same MIDI feed that the in-world lights and smoke machines get.
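
As a sketch of what such a gateway would have to decode, the parser below unpacks the basic MSC commands from a SysEx message, which is framed as F0 7F <device> 02 <format> <command> <cue...> F7. The mido assumption from above carries over, and where the decoded cue is sent is left as a placeholder.

 # Sketch: decode basic MIDI Show Control commands from a SysEx message.
 # mido strips the leading F0 and trailing F7, so msg.data starts with 0x7F.
 MSC_COMMANDS = {0x01: 'GO', 0x02: 'STOP', 0x03: 'RESUME', 0x07: 'FIRE'}

 def parse_msc(msg):
     """Return (device_id, command_name, cue) for an MSC message, else None."""
     d = list(msg.data)
     if len(d) < 5 or d[0] != 0x7F or d[2] != 0x02:  # universal real-time + MSC sub-ID
         return None
     device_id, command = d[1], d[4]
     # Cue numbers are ASCII digits and dots; fields are separated by 0x00 bytes.
     cue = bytes(b for b in d[5:] if b != 0x00).decode('ascii', errors='ignore')
     return device_id, MSC_COMMANDS.get(command, f'0x{command:02X}'), cue

 def on_sysex(msg):
     parsed = parse_msc(msg)
     if parsed:
         device_id, command, cue = parsed
         # A gateway would forward this to the matching in-world fixture,
         # e.g. a scripted light listening for "GO 1.5" on its channel.
         print(f'device {device_id}: {command} {cue}')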

Future

(This is why I am really excited about all this.) Working on my demo, I realized that beyond the simple stuff, MIDI really needs to be real-time, which Second Life does not offer. So I developed an idea which I believe is superior in any case: use Second Life as a control channel to set up out-of-band, peer-to-peer, real-time channels between individuals. All of this happens transparently. Second Life musicians need not understand any of it: all they know is that they run the SL-MIDI software, connect the MIDI-OUT to their synthesizer, teleport over to band practice, and their colleagues' notes pour out of the speakers in their studio. Multiple people can participate, and it can all be recorded and edited later: a virtual studio. The technology behind this is very similar to a VoIP conference call. In fact...
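
A rough sketch of that handshake, under heavy assumptions: the control channel (an IM, group chat, or scripted relay inside Second Life) carries only each peer's address as plain text, and the note data then travels peer to peer over UDP. A real implementation would use RTP-MIDI rather than the raw bytes shown here; send_control() is a hypothetical stand-in for the in-world transport.

 # Sketch: Second Life as the signalling channel, MIDI as an out-of-band UDP stream.
 # send_control() stands in for whatever SL channel carries the handshake text;
 # the payload exchanged over it is just "host:port".
 import socket

 def start_receiver(port):
     """Open the local UDP socket that peers will stream MIDI bytes to."""
     sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
     sock.bind(('0.0.0.0', port))
     return sock

 def invite(send_control, my_host, my_port):
     # Step 1: publish our endpoint over the in-world control channel.
     send_control(f'MIDI-INVITE {my_host}:{my_port}')

 def on_control(line, peers):
     # Step 2: a peer answered with their endpoint; remember it for streaming.
     if line.startswith('MIDI-ACCEPT '):
         host, port = line.split()[1].split(':')
         peers.add((host, int(port)))

 def stream_note(sock, peers, note, velocity):
     # Step 3: the real-time data never touches the SL servers; it goes peer to peer.
     packet = bytes([0x90, note, velocity])  # raw note-on; RTP-MIDI in practice
     for peer in peers:
         sock.sendto(packet, peer)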

If I say, "use a well-known server to set up peer-to-peer real-time channels between individuals" that may not sound like much to you, but that is exactly the same architecture that standard VoIP systems use, particularly, SIP. The same real-time protocol that is often used for MIDI is also used for VoIP. So while we go through the trouble of making Second Life a brilliant platform for musicians, we get standards-based VoIP as lagniappe. I think there are major applications for standards-based VoIP in Second Life, but I will stick to the topic of MIDI here.

In summary, the software I wrote is a proof of concept showing how to control avatars and scripted objects in-world from traditional MIDI equipment. But it also shows the way to integrate VoIP or any other real-time streaming application into Second Life as a third party, in a standards-based way, with zero impact on Linden's network.

Please contact me at jhs -at- vidya-corporation.com if you are interested in talking about this. I had always wanted to release this software under the GPL, but I had to shelve it for more pressing matters.