Control Second Life with GPS, accelerometer & magnetometer

As part of my PhD work I have produced a modified version of the Second Life viewer, which I have dubbed Pangolin after the Open Virtual Worlds research group’s previous Mongoose, Armadillo & Chimera projects. It allows:

  • connecting a serial device to the viewer
  • controlling movement of the avatar according to GPS readings
  • controlling the camera according to accelerometer & magnetometer readings

The combination of these features allows you to do things like connect an accelerometer, magnetometer & GPS receiver to an Arduino, have it stream readings into the viewer & have the viewer use them to control the avatar & camera. The motivation behind this was to address the ‘vacancy problem’ by creating a mobile cross reality interface: allowing a user to experience simultaneous presence in a real environment & an equivalent synthetic environment, using their physical position & orientation as an implicit method of control for their synthetic representation. I’m presenting a paper about this at the iED 2013 Boston summit in June – if I can secure funding to actually get there!

Getting Code & Building

The Pangolin source code is available on Bitbucket, with my additions & modifications licensed under the GNU General Public License. The serial IO functionality uses Terraneo Federico’s AsyncSerial class which is licensed under the Boost Software License. The viewer codebase was forked from Linden Lab’s viewer-release before the removal of the --loginuri flag so Pangolin is compatible with OpenSim grids/servers.

Instructions for building the viewer are available on the Second Life wiki. I build using 32-bit Debian GNU/Linux (specifically by chrooting into a 32-bit debootstrap install from a 64-bit Arch Linux host, see my instructions) which produces a binary that runs on 32-bit Linux & on 64-bit Linux with 32-bit compatibility libraries installed.

The serial connectivity makes use of Boost.Asio. The Linden-provided pre-built Boost library is missing some of the features that my modifications make use of, so I build with LightDrake’s alternative; his public libraries are available here. To use these libraries, edit the corresponding entry in the autobuild.xml file in the root of the codebase. You’re looking for the section below; here I’ve commented out the original library hash & URL for the Linux version & replaced them with LightDrake’s:

<key>boost</key>
  <map>
    <key>license</key>
    <string>boost</string>
    <key>license_file</key>
    <string>LICENSES/boost.txt</string>
    <key>name</key>
    <string>boost</string>
    <key>platforms</key>
    <map>
      <key>darwin</key>
      <map>
        <key>archive</key>
        <map>
          <key>hash</key>
          <string>d98078791ce345bf6168ce9ba53ca2d7</string>
          <key>url</key>
          <string>http://automated-builds-secondlife-com.s3.amazonaws.com/hg/repo/3p-boost/rev/222752/arch/Darwin/installer/boost-1.45.0-darwin-20110304.tar.bz2</string>
        </map>
        <key>name</key>
        <string>darwin</string>
      </map>
      <key>linux</key>
      <map>
        <key>archive</key>
        <map>
          <key>hash</key>
          <!--<string>a34e7fffdb94a6a4d8a2966b1f216da3</string>-->
          <string>2523af5082f44628e553635de6bbea70</string>
          <key>url</key>
          <!--<string>http://s3.amazonaws.com/viewer-source-downloads/install_pkgs/boost-1.45.0-linux-20110310.tar.bz2</string>-->
          <string>https://bitbucket.org/LightDrake/public-libs/downloads/boost-1.45.0-linux-20120213.tar.bz2</string>
        </map>
        <key>name</key>
        <string>linux</string>
      </map>
      <key>windows</key>
      <map>
        <key>archive</key>
        <map>
          <key>hash</key>
          <string>98be22c8833aa2bca184b9fa09fbb82b</string>
          <key>url</key>
          <string>http://s3.amazonaws.com/viewer-source-downloads/install_pkgs/boost-1.45.0-windows-20110124.tar.bz2</string>
        </map>
        <key>name</key>
        <string>windows</string>
      </map>
    </map>
  </map>

I’ve been building with the Linux 1.45.0 version; if you try the more recent Linux version or the Windows/Darwin versions let me know how it goes! Using this new library leads to a rather nasty namespace collision (at least with the Linux 1.45.0 version) for which I have uploaded a fix.

I’ve also put an example Arduino sketch on Bitbucket; it uses an HMC6343 accelerometer/magnetometer & a u-blox MAX-6 GPS receiver, both of which I wrote about previously.

Binary

I’ve uploaded a 32-bit Linux binary to Bitbucket if you just want to try it out without the rigmarole of successfully setting up the (rather particular) build environment.

Usage

Start the viewer & log in as normal, then take a look at the Serial menu, which contains a single entry, Serial Monitor. Click it & you will see something like this:

[Screenshot: the Serial Monitor floater]

Put the path to the serial device & the baudrate into the fields in the Device settings section at the top & click [Connect]. On my Linux boxes an Arduuino normally appears at /dev/ttyACM0 or /dev/ttyS0 if it’s the first serial device, /dev/ttyACM1 or /dev/ttyS1 if it’s the second, etc. If you’re using an Arduino & are having trouble finding it, just start the Arduino IDE & look at the Serial Port entry in the Tools menu.

Pangolin expects messages, separated by newline characters, in the following format:

<bearing> <pitch> <roll> <latitude> <longitude>

For example:

183.90 75.80 -59.30 56.339991 -2.7875334

So make sure that your serial device adheres to this message format; the example Arduino sketch linked above does this & might be a useful starting point for you.
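To make the message format concrete, here’s a minimal sketch – my own illustration, not the viewer’s actual parsing code – of unpacking one such line in C++:

```cpp
#include <cassert>
#include <sstream>
#include <string>

// One Pangolin message: "<bearing> <pitch> <roll> <latitude> <longitude>".
struct SensorReading {
    double bearing, pitch, roll, latitude, longitude;
};

// Returns true only when all five whitespace-separated fields parse as
// doubles; garbled input (e.g. from a wrong baudrate) simply fails.
bool parseReading(const std::string& line, SensorReading& out) {
    std::istringstream iss(line);
    return static_cast<bool>(iss >> out.bearing >> out.pitch >> out.roll
                                 >> out.latitude >> out.longitude);
}
```

Rejecting lines that don’t parse cleanly is the same spirit in which Pangolin ignores garbled data rather than letting it drive the avatar.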

The fields in the Anchor settings section are for entering a single point for which you know both the real world latitude/longitude & the corresponding virtual world coordinates, along with the scale of the virtual world relative to the real world – eg if 1m in the real world is represented by 1.2m in the virtual world then enter 1.2 into the Scale field.

Sim X & Sim Y of the anchor point are global coordinates; this is necessary to allow Pangolin to work across multiple regions (such as mega regions). Because regions are 256m x 256m you can easily calculate the global coordinate of a point as (256 * region position) + local coordinate. For example, if my anchor point is at 127,203,23 in a region at 1020,1043, then the global X coordinate of the anchor point is (1020 * 256) + 127 = 261247 & the global Y coordinate is (1043 * 256) + 203 = 267211. Height isn’t implemented (the vertical accuracy of GPS is substantially worse than the horizontal), so just pick a Sim Z of around your sim’s ground level. The latitude & longitude fields have 6 decimal places of accuracy.
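The arithmetic above can be captured in a couple of trivial helpers (hypothetical, purely to illustrate the calculation):

```cpp
#include <cassert>

// Regions are 256m x 256m, so a region-local coordinate becomes global by
// adding 256 * the region's grid position.
double globalCoord(int regionPos, double localCoord) {
    return 256.0 * regionPos + localCoord;
}

// The Scale field: metres in the virtual world per metre in the real world.
double toVirtualMetres(double realMetres, double scale) {
    return realMetres * scale;
}
```

For the worked example, globalCoord(1020, 127) gives 261247 & globalCoord(1043, 203) gives 267211.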

Once you have input all of the anchor settings, click [Set]; if everything is okay you should see the data from the serial device in the Received data section & the processed position values in the Calculated data section. The Pause checkbox will stop the fields from updating so you can copy the values, etc. If you get garbled received data then you have probably set the baudrate incorrectly – Pangolin will ignore this data rather than trying to process it & sending your avatar’s movement haywire.

To enable/disable control of the camera & your avatar’s movement from the received/calculated data, use the Orientation control & Position control checkboxes in the Controls section. The various spinners in this section allow you to alter the smoothing, high-pass filters & frequency of updates. Good values for these will depend upon what hardware/sensors you are using, the scale of your sim, etc.
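As a rough illustration of the kind of smoothing involved – this is my own sketch, not Pangolin’s actual filter code – an exponential moving average looks like this:

```cpp
#include <cassert>

// One-pole low-pass filter / exponential moving average. alpha is in (0,1]:
// higher alpha tracks the sensor more tightly, lower alpha smooths harder
// (at the cost of more lag).
class Smoother {
public:
    explicit Smoother(double alpha) : alpha_(alpha), value_(0.0), primed_(false) {}

    double update(double sample) {
        if (!primed_) {
            value_ = sample;   // seed with the first sample to avoid a start-up ramp
            primed_ = true;
        } else {
            value_ = alpha_ * sample + (1.0 - alpha_) * value_;
        }
        return value_;
    }

private:
    double alpha_, value_;
    bool primed_;
};
```

Note that a bearing wraps at 360°, so smoothing it naively across the 359°→0° boundary will swing the camera the long way round; a real implementation has to handle the wraparound.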

Does it work?

The code itself works nicely. I tested it by walking around the ruins of the cathedral at St Andrews, for which the Open Virtual Worlds group has produced an OpenSim reconstruction, using an MSI WindPad 110W tablet computer. The sim is accessible on our own grid & on OSgrid (search for regions called ‘StAndrewsOVW’), though the OSgrid version is usually older than the one on our local grid.

[Photo: the tablet running Pangolin]

Camera control from orientation was a very rewarding experience, but would benefit from a faster update rate than the 10Hz of the HMC6343 that I used.

[Photo: testing on site at the cathedral ruins]

The accuracy attainable from a GPS receiver, even a fairly high-spec one like the u-blox MAX-6 that I used, isn’t enough to have the avatar ‘follow in your footsteps’ as I’d originally envisaged, but is good enough to move/teleport the avatar between different points of interest when you move within a certain distance/threshold of them.
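That point-of-interest behaviour could be sketched like so (a hypothetical helper, not the shipped code), working in global sim coordinates in metres:

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { double x, y; };

// Trigger a move/teleport to a point of interest only once the calculated
// position comes within `threshold` metres of it, instead of chasing every
// noisy GPS fix directly.
bool withinThreshold(const Vec2& pos, const Vec2& poi, double threshold) {
    const double dx = pos.x - poi.x;
    const double dy = pos.y - poi.y;
    return std::sqrt(dx * dx + dy * dy) <= threshold;
}
```

Picking the threshold a comfortable margin above your receiver’s typical horizontal error keeps the avatar from bouncing between nearby points.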

Extending

Hopefully my additions & modifications to the viewer will provide a convenient starting point or reference for other people wishing to interface serial devices with the viewer &/or to leverage real world sensor data for controlling the avatar/camera.

The code is well commented throughout (more so than the rest of the viewer codebase!) & should be fairly self-explanatory. The commit logs on Bitbucket will reveal where the additions/modifications reside; however, as a brief overview to get you started:

  • Terraneo Federico’s AsyncSerial class is at indra/newview/AsyncSerial
  • most of my added functionality resides in indra/newview/llviewerserialmovement
  • see LLAppViewer::mainLoop() in indra/newview/llappviewer.cpp for how the serial functionality is actually started
  • the floater is in indra/newview/skins/default/xui/en/floater_serial_monitor.xml

Credits

As well as the people I’ve mentioned whose code I have used, I received huge amounts of help from various people on the Second Life opensource development IRC channel (#opensl on Freenode) & the opensource-dev mailing list – thanks guys!

3 thoughts on “Control Second Life with GPS, accelerometer & magnetometer”

  1. I’ve been doing something similar but necessarily crude by comparison with Lumiya and Automagic, using the latter to grab GPS coords and then force-teleport the avatar to a prim in the corresponding position inworld. Sadly, Lumiya goes to text mode to tp so it’s not exactly seamless. At the moment I’m using a car park for testing purposes which is not very photogenic — any chance I could use that nice pic of the cathedral in a talk I’m giving? Suitably attributed, of course, with a link back here.
