Narrative Clip – Week 2
I’m already slipping with Narrative posts & this is really 2–3 weeks’ worth of photos, but for the sake of naming simplicity I’m going to stick with ‘Week 2’!
Narrative Clip – Week 1
The Narrative Clip is a small, wearable ‘lifelogging’ camera. You attach it to your clothes or wear it on a necklace & it takes a photo every 30 seconds & tags it with your location using GPS. At the end of the day you connect the Clip to your computer & it uploads the photos to Narrative’s cloud. The blurry/badly exposed photos are weeded out & the good ones are organised into ‘moments’ depending upon location, subject matter, etc. These moments are sent to the Narrative app on your smartphone, where you can flick through them & share any you like via social networks, email, blogs, etc.
I’d been following the project since their original Kickstarter back in 2012 when they were called Memoto, but I didn’t actually bite the bullet & order one until early 2014. It arrived last week & here are some photos & remarks about my experience with the Clip so far.
Control Second Life with GPS, accelerometer & magnetometer
As part of my PhD work I have produced a modified version of the Second Life viewer – dubbed Pangolin, after the Open Virtual Worlds research group’s previous Mongoose, Armadillo & Chimera projects – that allows;
- connecting a serial device to the viewer
- controlling movement of the avatar according to GPS readings
- controlling the camera according to accelerometer & magnetometer readings
The combination of these features allows you to do things like connect an accelerometer, magnetometer & GPS receiver to an Arduino, have it dump readings into the viewer & have the viewer use them to control the avatar & camera. The motivation behind this was to address the ‘vacancy problem’ by creating a mobile cross reality interface; allowing a user to experience simultaneous presence in a real environment & an equivalent synthetic environment, using their physical position & orientation as an implicit method of control for their synthetic representation. I’m presenting a paper about this at the iED 2013 Boston summit in June – if I can secure funding to actually get me there!
Getting Code & Building
The Pangolin source code is available on Bitbucket, with my additions & modifications licensed under the GNU General Public License. The serial IO functionality uses Terraneo Federico’s AsyncSerial class, which is licensed under the Boost Software License. The viewer codebase was forked from Linden Lab’s viewer-release before the removal of the --loginuri flag, so Pangolin is compatible with OpenSim grids/servers.
Instructions for building the viewer are available on the Second Life wiki. I build using 32-bit Debian GNU/Linux (specifically by chrooting into a 32-bit debootstrap install from a 64-bit Arch Linux host, see my instructions) which produces a binary that runs on 32-bit Linux & on 64-bit Linux with 32-bit compatibility libraries installed.
The serial connectivity makes use of Boost.Asio. The Linden-provided Boost pre-built library is missing some of the features that my modifications make use of, so I build with LightDrake’s alternative; his public libraries are available here. To use these libraries, edit the corresponding entry in the autobuild.xml file in the root of the codebase. You’re looking for the section below; here I’ve commented out the original library & hash for the Linux version & replaced it with LightDrake’s;
<key>boost</key>
<map>
  <key>license</key>
  <string>boost</string>
  <key>license_file</key>
  <string>LICENSES/boost.txt</string>
  <key>name</key>
  <string>boost</string>
  <key>platforms</key>
  <map>
    <key>darwin</key>
    <map>
      <key>archive</key>
      <map>
        <key>hash</key>
        <string>d98078791ce345bf6168ce9ba53ca2d7</string>
        <key>url</key>
        <string>http://automated-builds-secondlife-com.s3.amazonaws.com/hg/repo/3p-boost/rev/222752/arch/Darwin/installer/boost-1.45.0-darwin-20110304.tar.bz2</string>
      </map>
      <key>name</key>
      <string>darwin</string>
    </map>
    <key>linux</key>
    <map>
      <key>archive</key>
      <map>
        <key>hash</key>
        <!--<string>a34e7fffdb94a6a4d8a2966b1f216da3</string>-->
        <string>2523af5082f44628e553635de6bbea70</string>
        <key>url</key>
        <!--<string>http://s3.amazonaws.com/viewer-source-downloads/install_pkgs/boost-1.45.0-linux-20110310.tar.bz2</string>-->
        <string>https://bitbucket.org/LightDrake/public-libs/downloads/boost-1.45.0-linux-20120213.tar.bz2</string>
      </map>
      <key>name</key>
      <string>linux</string>
    </map>
    <key>windows</key>
    <map>
      <key>archive</key>
      <map>
        <key>hash</key>
        <string>98be22c8833aa2bca184b9fa09fbb82b</string>
        <key>url</key>
        <string>http://s3.amazonaws.com/viewer-source-downloads/install_pkgs/boost-1.45.0-windows-20110124.tar.bz2</string>
      </map>
      <key>name</key>
      <string>windows</string>
    </map>
  </map>
</map>
I’ve been building with the Linux 1.45.0 version; if you try the more recent Linux version or the Windows/Darwin versions let me know how it goes! Using this new library leads to a rather nasty namespace collision (at least with the Linux 1.45.0 version) for which I have uploaded a fix.
I’ve also put an example Arduino sketch on Bitbucket, which uses an HMC6343 accelerometer/magnetometer & a u-blox MAX-6 GPS receiver, which I wrote about previously.
Binary
I’ve uploaded a 32-bit Linux binary to Bitbucket if you just want to try it out without the rigmarole of successfully setting up the (rather particular) build environment.
Usage
Start the viewer & login as normal, then take a look at the Serial menu, which contains a single entry, Serial Monitor. Click this & you will see something like this;
Put the path to the serial device & the baudrate into the fields in the Device settings section at the top & click [Connect]. For me, using an Arduino on Linux boxes, the serial device normally appears at /dev/ttyACM0 or /dev/ttyS0 if it’s the first serial device, /dev/ttyACM1 or /dev/ttyS1 if it’s the second, etc. If you’re using an Arduino & are having trouble finding it, just start the Arduino IDE & look at the Serial Port entry in the Tools menu.
Pangolin expects messages, separated by newline characters, in the following format;
<bearing> <pitch> <roll> <latitude> <longitude>
For example;
183.90 75.80 -59.30 56.339991 -2.7875334
So make sure that your serial device adheres to this message format; the example Arduino sketch linked above does this & might be a useful starting point for you.
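If you’re rolling your own firmware, a skeleton sketch that emits conforming messages might look something like this (illustrative only – the 115200 baudrate & the readSensors() stub are placeholders of mine, not taken from the example sketch);

float bearing, pitch, roll, latitude, longitude;

// Placeholder: populate the five values from your actual sensors.
void readSensors() {
  bearing = 183.90; pitch = 75.80; roll = -59.30;
  latitude = 56.339991; longitude = -2.787533;
}

void setup() {
  Serial.begin(115200); // must match the baudrate entered in the Serial Monitor floater
}

void loop() {
  readSensors();
  Serial.print(bearing, 2); Serial.print(" ");
  Serial.print(pitch, 2); Serial.print(" ");
  Serial.print(roll, 2); Serial.print(" ");
  Serial.print(latitude, 6); Serial.print(" ");
  Serial.println(longitude, 6); // println supplies the terminating newline
  delay(100);
}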
The fields in the Anchor settings section are for entering the location of a single point for which you know both the real world latitude/longitude & the corresponding virtual world coordinates, along with the scale of the virtual world to the real world – eg if 1m in the real world is represented by 1.2m in the virtual world then enter 1.2 into the Scale field.
Sim X & Sim Y of the anchor point are global; this is necessary to allow Pangolin to work across multiple regions (such as mega regions). Because regions are 256x256m you can easily calculate the global coordinate of a point by doing (256 * region position) + local coordinate. For example, if my anchor point is at 127,203,23 in a region at 1020,1043, then the global X coordinate of the anchor point is (1020 * 256) + 127 = 261247 & the global Y coordinate of the anchor point is (1043 * 256) + 203 = 267211. Height isn’t implemented (vertical accuracy of GPS is substantially worse than horizontal) so just pick a Sim Z of around your sim’s ground level. The latitude & longitude fields have 6 decimal places of accuracy.
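If you’d rather let code do the arithmetic, the conversion is a one-liner (a throwaway helper for illustration, not something from Pangolin itself);

// global coordinate = (256 * region position) + local coordinate
long globalCoordinate(long regionPosition, long localCoordinate) {
  return (256 * regionPosition) + localCoordinate;
}

// e.g. globalCoordinate(1020, 127) == 261247 & globalCoordinate(1043, 203) == 267211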
Once you have input all of the anchor settings, click [Set] & if everything is okay you should see the data from the serial device in the Received data section & the processed position values in the Calculated data section. The Pause checkbox will stop the fields from updating so you can copy the values, etc. If you get garbled received data then you have probably set the baudrate incorrectly – Pangolin will ignore this data instead of trying to process it & sending your avatar’s movement haywire.
To enable/disable control of the camera & your avatar’s movement from the received/calculated data, use the Orientation control & Position control checkboxes in the Controls section. The various spinners in this section allow you to alter the smoothing, high pass filters & frequency of updates. Good values for these will depend upon what hardware/sensors you are using, the scale of your sim, etc.
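To give a feel for what the smoothing setting is doing, conceptually it behaves like an exponential moving average (this is a sketch of the idea only – the viewer’s actual filter code is in llviewerserialmovement & may differ in detail);

// previous: the smoothed value so far; sample: the new raw reading.
// alpha = 1.0 passes readings straight through; smaller values damp
// sensor jitter at the cost of responsiveness.
float smooth(float previous, float sample, float alpha) {
  return previous + alpha * (sample - previous);
}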
Does it work?
The code itself works nicely. I tested it walking around the ruins of the cathedral at St Andrews, for which the Open Virtual Worlds group has produced an OpenSim reconstruction, using an MSI WindPad 110W tablet computer. This sim is accessible on our own grid & on OSgrid (search for regions called ‘StAndrewsOVW’), though the OSgrid version is usually older than our local grid.
Camera control from orientation was a very rewarding experience, but will benefit from a faster update rate than the 10Hz of the HMC6343 that I used.
The accuracy attainable from a GPS receiver, even a fairly high-spec one like the u-blox MAX-6 that I used, isn’t enough to have the avatar ‘follow in your footsteps’ as I’d originally envisaged, but it is good enough to move/teleport the avatar between different points of interest when you move within a certain distance/threshold of them.
Extending
Hopefully my additions & modifications to the viewer will provide a convenient starting point or reference for other people wishing to interface serial devices with the viewer &/or to leverage real world sensor data for controlling the avatar/camera.
The code is well commented throughout (more so than the rest of the viewer codebase!) & should be fairly self-explanatory. The commit logs on Bitbucket will reveal where the additions/modifications reside; however, as a brief overview to get you started;
- Terraneo Federico’s AsyncSerial class is at indra/newview/AsyncSerial
- most of my added functionality resides in indra/newview/llviewerserialmovement
- see LLAppViewer::mainLoop() in indra/newview/llappviewer.cpp for how the serial functionality is actually started
- the floater is in indra/newview/skins/default/xui/en/floater_serial_monitor.xml
Credits
As well as the people I’ve mentioned whose code I have used, I received huge amounts of help from various people on the Second Life opensource development IRC channel (#opensl on Freenode) & the opensource-dev mailing list – thanks guys!
Arduino + HMC6343 + u-blox MAX-6
Part of my investigation into simultaneous presence in real & virtual environments involves streaming real world position, heading & orientation data into a modified Second Life viewer. After much faffing about evaluating different accelerometers, magnetometers & GPS receivers, as well as learning about GNSS & exciting concepts like magnetic declination, hard/soft iron offsets & tilt compensation, the result is an Arduino with an HMC6343 tilt-compensated magnetometer & a u-blox MAX-6 GPS receiver.
For the interaction style that I want to investigate with the VTW project I need to know 3 things about the user;
- position – where they are
- heading – the direction they are looking
- pitch – the angle they are facing (up, down, level)
Pitch is the easiest of the three – you simply use an accelerometer. I tried both the ADXL335, with signal conditioned voltage outputs, & the MMA8452, which uses I2C. These are both 3-axis accelerometers so can tell you pitch & roll at the same time (yaw can’t be derived from gravity alone, which is where the magnetometer below comes in). They both did what I wanted, but the MMA8452 was easier to use as it didn’t involve experimenting to find out what values represented the rest/peak positions of each axis like the ADXL335 did.
Heading (by which I mean a compass heading) also seems easy to begin with – you just use a 3-axis magnetometer (aka ‘digital compass’ or ‘e-compass’) & perform a calculation that uses the readings of the 2 axes that represent pitch & roll. I tried the HMC5883L but soon discovered that things get much more complicated if you aren’t holding the magnetometer flat – because the calculation doesn’t take the third axis into consideration, the heading becomes more useless the more you tilt it! To calculate a heading even when not held level (which is important for many applications) you need to fold the reading from the third axis – which can be thought of as recording the magnetic field ‘lost’ by the other two when tilted out of alignment – into the heading calculation. But to do this you need to know how the magnetometer is tilted – good thing we have a 3-axis accelerometer!
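Concretely, the tilt-compensation step looks something like the following (axis & sign conventions vary between references, so treat this as a flavour of the approach rather than a definitive implementation – the guides below derive it properly);

#include <math.h>

// mx/my/mz: raw magnetometer readings; pitch/roll: from the accelerometer, in radians.
float tiltCompensatedHeading(float mx, float my, float mz, float pitch, float roll) {
  // Project the magnetic field readings back onto the horizontal plane...
  float xh = mx * cos(pitch) + mz * sin(pitch);
  float yh = mx * sin(roll) * sin(pitch) + my * cos(roll) - mz * sin(roll) * cos(pitch);
  // ...then compute the heading from the two horizontal components as usual.
  float heading = atan2(yh, xh);
  if (heading < 0) heading += 2 * M_PI;
  return heading * 180.0 / M_PI; // degrees clockwise from magnetic north
}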
There are a number of good tutorials/guides for this, including these two that I followed, however if you are lazy &/or scared by maths you can also buy ICs such as the HMC6343 that combine a 3-axis accelerometer with a 3-axis magnetometer & perform the tilt compensation algorithms onboard.
Of course there must be a downside to this convenience & most obvious is the price. If you are a SparkFun sort of person the HMC6343 will set you back $149.95. To put that into perspective the ADXL335 is $24.95, the MMA8452 a paltry $9.95 & the HMC5883L just $14.95.
The HMC6343
The HMC6343 ‘3-Axis Compass with Algorithms’ is an I2C device so is easy to hook up (SDA to analog 4, SCL to analog 5 on an Arduino Uno) & interact with. The datasheet contains the protocol definition & communicating with it looks something like this (the example here changes the rate of measurements from the default of 5Hz to the fastest of 10Hz). HMC6343_ADDRESS is 0x19 & obviously you will also need to include Wire.h at the top of your sketch.
Wire.beginTransmission(HMC6343_ADDRESS);
Wire.write(0xF1); // command: 'Write to EEPROM'
Wire.write(0x05); // register: Operational Mode Register 2
Wire.write(0x02); // value: measure at 10Hz
Wire.endTransmission();
The first Wire.write tells the device what command we want to run; in this example 0xF1 is the ‘Write to EEPROM’ command, which expects 2 binary byte arguments. The first argument is which register we want to write to; in this example 0x05 is the address of ‘Operational Mode Register 2’, which stores the measurement speed. The second argument is the value that we actually want to write to this register & in this case 0x02 means that we want measurements at 10Hz. Simple enough, no?
If you look through the sketch you’ll see this pattern repeated for setting the variation angle correction, the Infinite Impulse Response filter & temporarily changing the orientation (so that you can mount the IC in different orientations without having to manually swap the axes, which is very handy).
The HMC6343 also allows you to access just the accelerometer readings, separately from the tilt-compensated magnetometer readings, so this single IC gives me both the (tilt compensated) heading & the pitch.
One more handy feature of the HMC6343 is the hard-iron offset calibration. Metallic stuff around the chip (wires, the GPS module, the Arduino, etc.) can affect the magnetic field detected by the magneto resistors & throw off the readings, so we can compensate for this using the built-in calibration mode. Read the datasheet to see how this works, but essentially all you have to do is send command 0x71, rotate the device in a certain manner, then send command 0x7E.
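In sketch form the calibration sequence is just two commands around the physical rotation (the 20 second delay is illustrative – the datasheet describes exactly how long & around which axes to rotate);

// Enter the HMC6343's user calibration mode...
Wire.beginTransmission(HMC6343_ADDRESS);
Wire.write(0x71);
Wire.endTransmission();

// ...rotate the device as the datasheet describes...
delay(20000);

// ...then exit calibration mode, at which point the offsets take effect.
Wire.beginTransmission(HMC6343_ADDRESS);
Wire.write(0x7E);
Wire.endTransmission();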
In case you’re wondering what the difference between hard-iron & soft-iron offsets is, the former is for things that won’t change during operation (eg the wires connecting the IC to the Arduino are always going to be there) whereas the latter refers to things that will change during operation (such as moving the IC around your lodestone collection). Hard-iron offsets need only be calculated once, whereas soft-iron requires dynamic attention.
u-blox MAX-6
My research into GNSS led me to the u-blox MAX-6 as my best option: impressive specification, solid performance & compatibility with SBAS, including EGNOS in Europe (Wikipedia page on GNSS augmentation/SBAS here). It turns out that it’s a popular GPS receiver amongst high-altitude ballooners (the kind of people who attach Arduinos & cameras to weather balloons & let them go) as it operates correctly at high altitudes, which some receivers evidently don’t. So after some discussion on the #highaltitude channel on Freenode I ended up ordering a MAX-6 with the appropriate level conversion to work with an Arduino (the MAX-6 uses 3v logic, whereas the Arduino uses 5v, so conversion is required so as not to fry the module). I ordered it from here & based my sketch on the example code here.
The MAX-6 is a serial device (no convenient I2C here) so interfacing with an Arduino is slightly different. The wiki page linked above recommends not using Software Serial, however as I also want to run the HMC6343 on the same Arduino I have no choice – so far I haven’t experienced any trouble with it.
The protocol specification for the MAX-6 is substantially longer & the messages much more complex than the HMC6343’s. If you want to have a go at assembling the messages by hand you can do so by reading the 220+ page document – I actually had some success with this – however I then discovered that the easier way is to use the u-center software, which allows you to configure the receiver from Windows using tickboxes & menus, then see/copy the messages from the relevant console window to paste into your sketch. The u-center software worked fine for me in a Windows 7 VM running on a Linux host, using pins 0 & 1 on an Arduino to connect the receiver directly to the USB connection via the Arduino’s UART.
Once you have your message, sending it to the receiver looks something like this (the example here sets the dynamic platform model to ‘pedestrian’);
uint8_t CFG_NAV5[] = {0xB5, 0x62, 0x06, 0x24, 0x24, 0x00, 0xFF, 0xFF, 0x03, 0x03,
                      0x00, 0x00, 0x00, 0x00, 0x10, 0x27, 0x00, 0x00, 0x05, 0x00,
                      0xFA, 0x00, 0xFA, 0x00, 0x64, 0x00, 0x2C, 0x01, 0x32, 0x3C,
                      0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
                      0x00, 0x00, 0x00, 0x00};

// Fill in the final two (checksum) bytes of the message.
calculateUBXChecksum(CFG_NAV5, (sizeof(CFG_NAV5)/sizeof(uint8_t)));

// Resend the message until the receiver acknowledges it.
bool success = false;
while (!success) {
  sendUBX(CFG_NAV5, (sizeof(CFG_NAV5)/sizeof(uint8_t)));
  success = getUBX_ACK(CFG_NAV5);
}
The first thing we do is create an array of type uint8_t that contains the message itself. The final 2 hex values are the checksum; the call to calculateUBXChecksum calculates these values & substitutes them into the array. To send the message we use the sendUBX method & keep on trying until the receiver sends us a confirmation.
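For the curious, the checksum is UBX’s 8-bit Fletcher algorithm, computed over everything between the 0xB5 0x62 sync characters & the 2 checksum bytes themselves – calculateUBXChecksum boils down to something like this (a sketch, not necessarily my exact implementation);

void calculateUBXChecksum(uint8_t *message, uint8_t length) {
  uint8_t ckA = 0, ckB = 0;
  // Skip the 2 sync bytes at the start & the 2 checksum bytes at the end.
  for (uint8_t i = 2; i < length - 2; i++) {
    ckA += message[i];
    ckB += ckA;
  }
  message[length - 2] = ckA;
  message[length - 1] = ckB;
}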
Again you will see this pattern repeated a number of times in the sketch, to set the dynamic platform model, enable SBAS using EGNOS, disable all the messages that I don’t want & enable the ones that I do want.
Getting readings
Getting data from the HMC6343 is a simple case of sending the correct command (0x50), requesting the correct number of bytes & then reading them into sensible variables (heading, pitch, etc.).
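In sketch form that looks something like this (sizes & scaling are from my reading of the datasheet – the reply to 0x50 is 6 bytes, giving heading, pitch & roll as 16-bit values in tenths of a degree);

Wire.beginTransmission(HMC6343_ADDRESS);
Wire.write(0x50); // 'Post Heading Data' command
Wire.endTransmission();
delay(2); // give the device a moment to prepare its reply

Wire.requestFrom(HMC6343_ADDRESS, 6);
int16_t heading = Wire.read() << 8; heading |= Wire.read();
int16_t pitch = Wire.read() << 8; pitch |= Wire.read();
int16_t roll = Wire.read() << 8; roll |= Wire.read();
float headingDegrees = heading / 10.0; // likewise for pitch & roll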
Getting data from the MAX-6 isn’t much harder, using SoftwareSerial.available() & SoftwareSerial.read(), however as with most GPS receivers the data that it returns is in the form of NMEA 0183 messages, which look a bit like this;
$GPRMC,225446,A,4916.45,N,12311.12,W,000.5,054.7,191194,020.3,E*68
It’s not hard to parse it yourself, but why go to the effort when there are libraries like TinyGPS that can do it for you?
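Feeding TinyGPS is simply a case of handing it each character as it arrives – something like this (pin numbers & baudrates here are illustrative, use whatever your wiring dictates);

#include <SoftwareSerial.h>
#include <TinyGPS.h>

SoftwareSerial gpsSerial(2, 3); // RX, TX
TinyGPS gps;

void setup() {
  Serial.begin(9600);
  gpsSerial.begin(9600);
}

void loop() {
  while (gpsSerial.available()) {
    // encode() returns true once a complete, valid sentence has been parsed...
    if (gps.encode(gpsSerial.read())) {
      // ...at which point a decoded position can be read out.
      float lat, lon;
      unsigned long fixAge;
      gps.f_get_position(&lat, &lon, &fixAge);
      Serial.print(lat, 6); Serial.print(" "); Serial.println(lon, 6);
    }
  }
}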
…using readings?
So what was the point of all this again? Well, my modified Second Life client listens on the serial port using Boost.Asio, receives these messages & then uses them to control the avatar & camera – more on this soon!
OpenSim Region Modules & GPS
Something else I did late last year/early this year: investigating how one might combine OpenSim Region Modules with GPS readings to control movement of an avatar, & other exciting things. These experiments weren’t developed to fruition, but they may be of interest to somebody.
From the OpenSim wiki
Region modules are .net/mono DLLs. During initialization of the simulator, the OpenSimulator bin directory (bin/) and the scriptengines (bin/ScriptEngines) directory are scanned for DLLs, in an attempt to load region modules stored there.
Region modules execute within the heart of the simulator and have access to all its facilities. Typically, region modules register for a number of events, e.g. chat messages, user logins, texture transfers, and take what ever steps are appropriate for the purposes of the module.
Essentially they allow you to develop much more complex extensions to the OpenSim platform than in-world LSL does, but are easier & more accessible than directly modifying the client &/or server source code.
Instructions for making your own region modules can be found on the OpenSim wiki (follow the link above) but I also wrote a heavily commented version of the boilerplate code for a shared region module that might make it easier to understand which parts you actually need to implement/change to get a basic region module doing something. This was the first time I had worked with C# so apologies if some of the comments seem superfluous ;)
Instead of having hundreds of lines of code in this post, I’ve put all of the examples in public Bitbucket repositories – here is the first one, for the boilerplate region module code. All you really need to do is change names in a few places & then add some functionality, starting in PostInitialise(), to get a basic region module that can result in some visible effect in world.
One of the most basic visible effects is making something move & this example does just that. Despite its name it doesn’t actually have anything to do with GPS yet; it simply creates a spherical prim in each region & moves it a short distance on each tick of a timer. This shows the most basic usage of the SceneObjectGroup class to get a reference to a primitive in a scene & then do something with it (move it).
Moving onto something that actually begins to involve GPS, or at least begins to make some connection between real world latitude & longitude values & ‘equivalent’ positions in the virtual world, this next example waits for a latitude & longitude to be reported via a TCP connection & then moves the avatar to the equivalent position in the region. This approach assumes that there is one position within the OpenSim region for which the equivalent real world latitude & longitude is known (referred to as the ‘anchor’ point) & that the scale of the OpenSim region compared to the real world is also known (eg that every meter in the real world is represented by 1.2m in the OpenSim region).
When a latitude & longitude is received via TCP the haversine formula is used to calculate the real world ‘great circle’ distance between the anchor point & this new point. This distance is then scaled according to the scale of the real world to the OpenSim region & thus the equivalent OpenSim position is calculated as a displacement from the anchor point – to which the avatar is then moved.
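In outline the calculation looks like this (sketched in C++ rather than the region module’s C# since, as mentioned below, this logic is destined for the viewer anyway – names & conventions here are mine, not the module’s);

#include <cmath>

const double kPi = 3.14159265358979323846;
const double kEarthRadiusM = 6371000.0; // mean Earth radius

double toRadians(double degrees) { return degrees * kPi / 180.0; }

// Great circle distance between two lat/lon points (the haversine formula).
double haversineDistance(double lat1, double lon1, double lat2, double lon2) {
  double dLat = toRadians(lat2 - lat1);
  double dLon = toRadians(lon2 - lon1);
  double a = std::sin(dLat / 2) * std::sin(dLat / 2) +
             std::cos(toRadians(lat1)) * std::cos(toRadians(lat2)) *
             std::sin(dLon / 2) * std::sin(dLon / 2);
  return kEarthRadiusM * 2 * std::atan2(std::sqrt(a), std::sqrt(1 - a));
}

// Initial bearing from point 1 to point 2, in radians clockwise from north.
double initialBearing(double lat1, double lon1, double lat2, double lon2) {
  double dLon = toRadians(lon2 - lon1);
  double y = std::sin(dLon) * std::cos(toRadians(lat2));
  double x = std::cos(toRadians(lat1)) * std::sin(toRadians(lat2)) -
             std::sin(toRadians(lat1)) * std::cos(toRadians(lat2)) * std::cos(dLon);
  return std::atan2(y, x);
}

// Map a received lat/lon to sim coordinates as a scaled displacement from the anchor.
void gpsToSim(double lat, double lon,
              double anchorLat, double anchorLon,
              double anchorX, double anchorY, double scale,
              double& simX, double& simY) {
  double d = haversineDistance(anchorLat, anchorLon, lat, lon) * scale;
  double b = initialBearing(anchorLat, anchorLon, lat, lon);
  simX = anchorX + d * std::sin(b); // east component
  simY = anchorY + d * std::cos(b); // north component
}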
This is a fairly ‘rough & ready’ proof-of-concept – the avatar’s name & the position of the anchor point are currently hard-coded & movement across region boundaries isn’t supported. The implementation of the haversine formula & the GPSSanitizer method (which checks for both DMS & signed decimal latitude/longitude representations using regular expressions) may be useful to other applications. It has been tested by manually piping in latitude/longitude values via a simple TCP client – a rudimentary example is included below.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Net;
using System.Net.Sockets;

namespace TCPClient
{
    /*
     * A very simple TCP client test program. When run (with no command line arguments) it will send a message from the CLI to
     * localhost port 13000 via a TCP connection. The message, address & port can be changed by changing the hardcoded values.
     */
    class SimpleTcpClient
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Enter GPS value to move to (eg '56.340626, -2.808015' or '56 20 26 N, 2 48 28 W')");
            string s = Console.In.ReadLine();
            Connect("127.0.0.1", s);
        }

        static void Connect(String server, String message)
        {
            try
            {
                // Create a TcpClient.
                // Note, for this client to work you need to have a TcpServer
                // connected to the same address as specified by the server, port
                // combination.
                Int32 port = 13000;
                TcpClient client = new TcpClient(server, port);

                // Translate the passed message into ASCII and store it as a Byte array.
                Byte[] data = System.Text.Encoding.ASCII.GetBytes(message);

                // Get a client stream for reading and writing.
                NetworkStream stream = client.GetStream();

                // Send the message to the connected TcpServer.
                stream.Write(data, 0, data.Length);
                Console.WriteLine("Sent: {0}", message);
                // send & forget, don't bother waiting for a response

                // Close everything.
                client.Close();
            }
            catch (ArgumentNullException e)
            {
                Console.WriteLine("ArgumentNullException: {0}", e);
            }
            catch (SocketException e)
            {
                Console.WriteLine("SocketException: {0}", e);
            }

            Console.WriteLine("\n Press Enter to continue...");
            Console.Read();
        }
    }
}
The final example is something a bit different, but still GPS related. This one takes a latitude & longitude via TCP in the same manner as the previous example, but instead of moving something it queries the Google Maps API for a satellite image centered about this position & applies this image as a texture to a prim that covers the entire region.
This was written to make testing of the previous example easier – it’s easier to visualise whether the movements of the avatar are correct if s/he is walking over imagery of the real world than over blank terrain. It was written a long time ago, but I think I started extending it such that when an avatar moved into a neighbouring region that region would automatically be textured with another satellite image – at any rate the code is in a very unfinished state, but please feel free to harvest any bits that might be useful to you.
It became apparent during these experiments that it would make more sense to handle GPS/accelerometer/magnetometer avatar movement on the client side rather than the server side, thus these experiments were abandoned – however they still serve as an interesting demonstration of what can be achieved with region modules & a bit of imagination :)
In addition, much of the GPSAvatar code will be transferred into my modified Second Life viewer & will help to speed up development there.