Mirrorshades update

It’s been a while since I posted anything about the Mirrorshades project, so here are a few things I’ve been working on!


I’ve stopped using the PlayStation Eye cameras because the drivers were too buggy; I would frequently sit down to work on the project only to find that Unity couldn’t find the DLL, & there was seemingly no fix other than repeatedly reinstalling & restarting for hours until the cameras started working.

I’m now using a pair of Logitech C310s, & whilst the resolution is higher than the PlayStation Eye cameras’ (1280×960 vs 640×480), the refresh rate is lower (30Hz vs 60Hz). To my eyes, the PlayStation Eye cameras actually gave a nicer experience, but of course when they weren’t working they were no use!

I’m using the same 3D printed clips (red) with the cameras epoxied to thermoplastic (white) so they can be adjusted via the nuts & bolts with rubber washers. Once again, inspiration came from William Steptoe’s AR Rift project.

I quickly measured the latency introduced by the C310 webcams (& then realised that it would’ve been interesting to have run the same experiment on the PlayStation Eye cameras!). I placed the Rift, with the lenses removed, facing an LCD monitor displaying a timer from flatpanels.dk. I placed a camera behind such that it could see both the monitor & the Rift’s screen, then cranked the sensitivity up on the camera so that it could record 50fps video with a 1/4000th shutter speed.

The monitor & the Rift were both refreshing at 60fps, each frame lasting 16.67ms, whilst the 1/4000 shutter speed meant the camera’s shutter was open for just 0.25ms. The response time of the monitor (quoted by the manufacturer as 8ms GTG) was evidently much lower than that of the Rift, as the tenths & even hundredths digits on the monitor were usually legible in each frame of the video, whereas on the Rift the hundredths & thousandths digits were always illegible. So I went through the video frame-by-frame looking for adjacent frames where a transition from one tenths digit to the next was clear enough to read on the Rift & the hundredths/thousandths digits were clear enough to read on the monitor, such as this pair:
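As a quick sanity check on those numbers, the timing quantities involved work out as follows (a back-of-envelope sketch, nothing more):

```python
# Back-of-envelope timing for the latency experiment.
display_hz = 60                 # both the monitor & the Rift refresh at 60Hz
frame_ms = 1000 / display_hz    # duration of one displayed frame
shutter_ms = 1000 / 4000        # camera shutter open time at 1/4000th
video_frame_ms = 1000 / 50      # gap between frames of the 50fps video

print(f"display frame: {frame_ms:.2f} ms")    # 16.67 ms
print(f"shutter open:  {shutter_ms:.2f} ms")  # 0.25 ms
print(f"video frame:   {video_frame_ms:.2f} ms")  # 20.00 ms
```

The shutter being open for only 0.25ms is what makes it possible to catch digits mid-frame rather than motion-blurred across several.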



From these we can infer that the tenths digit on the Rift screen (right eye) changed from 9 to 0 sometime between 181 & 198 on the monitor’s timer, meaning a latency of between 181ms & 198ms. Of the 11 such pairs of frames, 7 showed this 181-198ms latency, whilst the other 4 showed 198-215ms, as in the pair below:
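The bounding logic can be sketched in a few lines: each frame pair brackets the Rift’s digit rollover between two monitor readings, & pooling the pairs gives the overall range (the per-pair readings below are taken from the counts in this post):

```python
# Each observation is (monitor reading in the frame before the Rift's
# tenths digit rolled over, monitor reading in the frame after it).
# The camera latency must lie between those two readings.
observations = [(181, 198)] * 7 + [(198, 215)] * 4  # 11 frame pairs from the video

lows = [lo for lo, _ in observations]
highs = [hi for _, hi in observations]
print(f"latency within {min(lows)}-{max(highs)} ms across {len(observations)} pairs")
# latency within 181-215 ms across 11 pairs
```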



I was also able to take some still photos at the same 1/4000th shutter speed, all of which showed the same 181-215ms latency (3 images follow); however, as timing a shot to catch legible digits was entirely down to luck, it was easier to shoot 50fps video to get enough frames to work from.




This latency of 181-215ms is substantially worse than the 60ms often quoted as the upper limit, between head movement & the resultant VR changes being displayed, for an acceptable VR experience. The gap between this camera latency & the tracker-to-VR latency (quoted as typically 30-60ms for applications running at 60fps on the Rift DK1, same link) will likely show up in experimental results once users actually try out the platform.
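To put those figures in display-frame terms (a rough conversion using the 60fps frame time from the experiment above):

```python
# How many 60fps display frames each latency figure corresponds to,
# comparing the measured camera latency against the oft-quoted 60ms limit.
frame_ms = 1000 / 60
for latency_ms in (60, 181, 215):
    frames = latency_ms / frame_ms
    print(f"{latency_ms} ms ~= {frames:.1f} frames at 60fps")
```

So the camera feed trails head movement by roughly 11-13 displayed frames, versus the ~3-4 frames of the 60ms budget.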


I’ve mapped St Salvator’s Chapel using IndoorAtlas. We plan to use this site for our case studies as it fits with one of my research group’s interests, cultural heritage, whilst also providing a good example of where mobile cross reality is useful. I wasn’t expecting IndoorAtlas to work well in this building, as it doesn’t have a metal frame, but I was pleasantly surprised. Perhaps the addition of central heating & electricity later in the building’s history helped?


Other than that, I’ve been focussing on theoretical work & designing experiments – after all, the platform is no good without evaluation!

Mirrorshades project – video of camera test w/ wider lenses

Yesterday’s video was shot with 2.5mm lenses fitted to the PlayStation Eye cameras; today I got a pair of 2.1mm lenses to try. A definite improvement (after shooting this video I tried fitting the 2.5mm lens in the right camera & the 2.1mm in the left, & compared by closing each eye in turn), but I’m tempted to try even wider lenses, as these 2.1mm ones are still noticeably more ‘zoomed in’ than my naked vision & they don’t introduce the nasty distortion I feared they might.
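The ‘zoomed in’ feel tracks the lens’s field of view, which for a simple rectilinear M12 lens can be roughly estimated from focal length & sensor width. A hedged sketch — the 3.6mm sensor width below is an assumed, purely illustrative figure for a small webcam sensor, not the PlayStation Eye’s actual spec:

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm):
    """Approximate horizontal field of view of a rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

SENSOR_W = 3.6  # mm -- assumed sensor width, for illustration only

for f in (2.5, 2.1):
    print(f"{f} mm lens: ~{horizontal_fov_deg(f, SENSOR_W):.0f} deg horizontal FOV")
```

Whatever the true sensor width, the shorter focal length gives the wider view — which is why dropping from 2.5mm to 2.1mm made a visible difference.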



Mirrorshades project – video of camera test

A quick video of me testing the Mirrorshades setup at my desk, with 2.5mm lenses fitted to the cameras. More info in the video description on YouTube.

Mirrorshades project – Oculus Rift, 2x PlayStation Eye cameras, Unity

Continuing my cross reality research, I’m working on an Oculus Rift project that lets you switch/fade between your real surroundings & virtual surroundings. I’m using 2x PlayStation Eye webcams (cheap, 60fps, & easy to fit with standard M12 lens mounts so you can use all sorts of different lenses – I’m using 2.5mm lenses atm) attached to the Rift via a 3D printed mount, with Unity on the software end using code from this chap (Japanese link), because Unity’s WebCamTexture doesn’t work with 2x identical cameras.


The mount comprises the clip designed by the guys at USC Information Technology Services (& freely downloadable from their website) with a second piece I designed & glued onto the front with epoxy. I found an EAGLE board file for the webcams, so I was able to design this second piece with channels matching the PCB mounting holes, allowing the webcams to be moved horizontally to experiment with different interpupillary distances (or even to remove one camera entirely & position the remaining one in the centre).


One of the cameras has to be mounted upside down because the width of the top of the PCBs (where the mic array is) means they can’t be placed close enough together when both are the same way up! The image from the upside-down camera is simply rotated in software.
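That rotation is cheap: a 180° turn of a row-major pixel buffer is just reversing the row order & then each row (a minimal pure-Python sketch of the idea — in Unity you’d do the equivalent with a texture/UV flip rather than touching pixels like this):

```python
def rotate_180(frame):
    """Rotate a row-major image (a list of rows of pixels) by 180 degrees:
    reverse the row order, then reverse each row."""
    return [list(reversed(row)) for row in reversed(frame)]

# Tiny 2x3 'image' stands in for a camera frame.
frame = [[1, 2, 3],
         [4, 5, 6]]
print(rotate_180(frame))  # [[6, 5, 4], [3, 2, 1]]
```

Applying it twice returns the original frame, as a 180° rotation should.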