Mirrorshades Update

I’ve just finished putting a bunch of people through the Mirrorshades treatment & will hopefully have a highlights video up soon :)


Mirrorshades update

Back with another video from St Salvator’s chapel, this time with some willing volunteers trying out the platform so I could film them & then combine the camera footage with the video captured on the laptop. I’m definitely going to change the lerp on the VR movement so it’s nowhere near as fast/sudden as it is atm!
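The movement smoothing I have in mind can be sketched like this (in Python rather than Unity’s C#, and with made-up numbers – the `rate` value and 60 fps loop are purely illustrative): rather than lerping by a fixed, large factor each frame, scale the factor by frame time so the camera eases toward each new position at a tunable, frame-rate-independent speed.

```python
import math

def lerp(a, b, t):
    """Linear interpolation from a to b by factor t in [0, 1]."""
    return a + (b - a) * t

def smooth_follow(current, target, rate, dt):
    """Frame-rate-independent smoothing: a lower `rate` gives gentler,
    less sudden movement than a fixed large per-frame lerp factor."""
    t = 1.0 - math.exp(-rate * dt)
    return lerp(current, target, t)

# Simulate the camera chasing a new position update over two seconds at 60 fps.
pos, target, dt = 0.0, 5.0, 1.0 / 60.0
for _ in range(120):
    pos = smooth_follow(pos, target, rate=3.0, dt=dt)
```

After two simulated seconds the camera has closed almost all of the gap, but each individual frame’s step is small – exactly the opposite of the sudden snapping in the current build.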

Mirrorshades update

I tested my Mirrorshades platform at St Salvator’s chapel in St Andrews for the first time today!

This test represents a less contrived scenario for a mobile cross reality platform like Mirrorshades than my earlier walks around the Computer Science building: while the real & virtual environments here share largely the same layout, they have substantial visual differences which are intriguing to contrast & compare.

The virtual component here shows the chapel as it stood in 1450-1460, since when there have been substantial changes, not least the replacement of the original stone roof with a wooden one & changes to the room division.


Until next time!

Mirrorshades project update – first walking test!

Did some more work on Mirrorshades & reached the point where I could actually give it an early go walking around the building I work in! IndoorAtlas didn’t seem to behave as well as in previous experiments, but the fact that I didn’t walk into any walls or fall on my face bodes well for future progress.

I’m not actually pressing anything on the phone, just tapping the screen occasionally to stop it from sleeping.

The setup isn’t exactly graceful atm… Add to this the Xbox controller used to toggle between real & virtual, plus the Android smartphone that captures my real-world position.


Mirrorshades project update – new camera mounts, IndoorAtlas into Unity

Scroll down for a fun video if you don’t want to read ;)

A long overdue update on Mirrorshades, my project that aims to let you walk around wearing an Oculus Rift, using cameras to see your real surroundings while the IndoorAtlas indoor positioning system tracks your position & moves you around a Unity environment, which you can switch to viewing through the Rift whenever you want.

New camera mounts

I realised from William Steptoe’s Rift-based AR platform (incidentally a much more professionally approached endeavour than mine!) that I had made a glaring error with my camera mount by having the cameras horizontal rather than vertical. The Rift’s 1280×800 display is split vertically into two 640×800 segments, one for each eye, so the area that each camera renders to is actually ‘portrait’ rather than ‘landscape’. So I went back to the 3D printer & made some new mounts.
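The arithmetic behind that realisation is easy to check – the side-by-side split means each eye’s region is taller than it is wide:

```python
# Rift DK1 panel: 1280x800, split down the middle into two per-eye views.
display_w, display_h = 1280, 800
eye_w, eye_h = display_w // 2, display_h  # 640x800 per eye
aspect = eye_w / eye_h

# aspect < 1 means each eye's view is portrait, so a camera mounted in
# landscape orientation wastes vertical field of view.
```

Hence rotating the cameras 90° to match the portrait per-eye region makes far better use of each sensor.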


They’re much simpler & still allow the interocular/interpupillary distance to be altered. I also switched from metal hex spacers & washers to rubber washers, which both makes toe-in adjustments easier & keeps the sensors closer to the eyes, so the ‘eyes on stalks’ feeling of the cameras sitting physically several inches in front of your face should be marginally reduced compared to the old mount.




I’ve now got things set up so that position data from an Android device running IndoorAtlas is dumped into a MySQL database. A Unity app with a nice model of the building I work in (obtained from another student – my 3D modelling skills are much more rudimentary!) then queries the database for the device’s current location & uses it to move a camera controller.
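The data flow boils down to one writer & one poller. Here’s a minimal sketch of the idea – in Python with sqlite3 so it runs standalone (the real setup uses MySQL), and with a table & column names that are my own invention, not the project’s actual schema:

```python
import sqlite3

# Hypothetical schema: the Android side INSERTs each IndoorAtlas fix,
# and the Unity side polls for the newest row.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE positions (id INTEGER PRIMARY KEY, x REAL, y REAL)")

def push_fix(x, y):
    """What the phone side does with each new position fix."""
    db.execute("INSERT INTO positions (x, y) VALUES (?, ?)", (x, y))
    db.commit()

def latest_fix():
    """What the Unity app polls for, e.g. once per frame or on a timer."""
    return db.execute(
        "SELECT x, y FROM positions ORDER BY id DESC LIMIT 1").fetchone()

push_fix(12.5, 3.1)
push_fix(12.7, 3.0)
```

`latest_fix()` always returns the most recent row, so the camera controller only ever sees the newest position & stale fixes are simply ignored.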

As a quick first test & to show things starting to work, I simply scripted a sphere with a camera pinned above it to move instantaneously to each new position value & built the app for Android so I could quickly try it out without having to carry around a laptop. This is obviously a very rudimentary approach – for a proper implementation you would almost certainly want to move the marker/camera smoothly, maybe with some pathfinding &/or extrapolation.
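One cheap step up from instantaneous snapping is dead reckoning between fixes: estimate a velocity from the two most recent timestamped positions & project forward while waiting for the next one. A sketch of that idea (Python, one axis, illustrative numbers only):

```python
def extrapolate(prev_fix, last_fix, now):
    """Linear extrapolation (dead reckoning) from the two most recent
    (timestamp, position) fixes -- a simple alternative to snapping the
    marker to each new position the instant it arrives."""
    (t0, x0), (t1, x1) = prev_fix, last_fix
    if t1 == t0:
        return x1
    velocity = (x1 - x0) / (t1 - t0)
    return x1 + velocity * (now - t1)

# Fixes arrive roughly once a second; frames render in between.
est = extrapolate((0.0, 2.0), (1.0, 3.0), now=1.5)  # expect 3.5
```

Overshooting around corners is the obvious failure mode, which is where the pathfinding mentioned above would come in – constraining the extrapolated position to walkable space.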

This could of course have been achieved with a single device by integrating the IndoorAtlas code into the Unity app, but by using two devices I could start playing around straight away. And since the Rift will most likely be running from a Windows laptop carried in a backpack, it was a good test to use a separate device for collecting the position data.

Next steps?

Next comes the fun part – building the app for Windows & walking around with the Rift on, switching between looking at the real world through the cameras & the Unity world when pressing a button!