Mirrorshades project update – first walking test!

Did some more work on Mirrorshades & reached the point where I could actually give it an early go, walking around the building I work in! IndoorAtlas didn't seem to behave as well as in previous experiments, but the fact that I didn't walk into any walls or fall on my face bodes well for future progress.

I’m not actually pressing anything on the phone, just tapping the screen occasionally to stop it from sleeping.

The setup isn't exactly graceful atm… Add to that the Xbox controller used to toggle between real & virtual, plus the Android smartphone that provides the real-world position.

[Image: IMG_20140129_181553]

DIY Minirig using Minirig packaging…

So I still had the nice sturdy tube that my Minirig Sub came in, as well as a pair of Tang Band W3-871C drivers… I had an idea & just went with it…

The surprising part is that it doesn’t sound completely terrible! The drivers are wired in parallel with one out-of-phase for a quasi-isobaric design, as I figured it would sound (more) terrible otherwise.

[Images: IMG_20140123_204838, IMG_20140123_214751, IMG_20140123_212739, IMG_20140123_221751]

Mirrorshades project update – new camera mounts, IndoorAtlas into Unity

Scroll down for a fun video if you don’t want to read ;)

A long overdue update on Mirrorshades, my project that aims to let you walk around wearing an Oculus Rift, using cameras to see your real surroundings. Meanwhile, the IndoorAtlas indoor positioning system tracks your position & moves you around a Unity environment, which you can switch to viewing through the Rift whenever you want.

New camera mounts

I realised from William Steptoe's Rift-based AR platform (incidentally a much more professionally approached endeavour than mine!) that I had made a glaring error with my camera mount by orienting the cameras horizontally rather than vertically. The Rift's 1280×800 display is split down the middle into two 640×800 segments, one for each eye, so the area each camera feed renders to is actually 'portrait' rather than 'landscape'. So I went back to the 3D printer & made some new mounts.
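If you want the quick sanity check behind that, it's purely illustrative arithmetic, nothing project-specific:

    # The Rift panel is 1280x800, shared side by side between the two eyes.
    panel_w, panel_h = 1280, 800
    eye_w, eye_h = panel_w // 2, panel_h   # 640x800 per eye
    print(eye_w, eye_h, eye_w / eye_h)     # 640 800 0.8 -> taller than wide, i.e. portrait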

[Image: rift]

They’re much simpler, still allow the interocular/interpupillary distance to be altered, & I switched from metal hex spacers & washers to rubber washers. This both makes toe-in adjustments easier & keeps the sensors closer to the eyes, so the ‘eyes on stalks’ feeling of the cameras sitting physically several inches in front of your eyes should be marginally reduced compared to the old mount.

[Images: back, side]

IndoorAtlas

I’ve now got things set up so that position data from an Android device running IndoorAtlas is dumped into a MySQL database. A Unity app containing a nice model of the building I work in (obtained from another student – my 3D modelling skills are much more rudimentary!) then queries the database for the device’s current location & can use it to move a camera controller.
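Just to illustrate the plumbing, the query boils down to "give me the latest fix for this device". Here's a minimal Python sketch of that lookup, assuming a hypothetical positions table with device_id, x, y, floor & recorded_at columns – the schema, credentials & device name here are placeholders, & the real query lives inside the Unity app rather than a Python script:

    import pymysql  # assumes a reachable MySQL server with the table described above

    conn = pymysql.connect(host="localhost", user="mirrorshades",
                           password="secret", database="tracking")
    try:
        with conn.cursor() as cur:
            # Most recent IndoorAtlas fix for the tracked phone (device id is made up).
            cur.execute(
                "SELECT x, y, floor FROM positions "
                "WHERE device_id = %s ORDER BY recorded_at DESC LIMIT 1",
                ("android-phone",),
            )
            row = cur.fetchone()
            if row is not None:
                x, y, floor = row
                print(f"Latest position: x={x}, y={y}, floor={floor}")
    finally:
        conn.close()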

As a quick first test, & to show things starting to work, I simply scripted a sphere (with a camera pinned above it) to jump instantaneously to each new position value, & built the app for Android so I could try it out quickly without having to carry a laptop around. This is obviously a very rudimentary approach – for a proper implementation you'd almost certainly want to move the marker/camera smoothly, maybe with some pathfinding &/or extrapolation.
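For what it's worth, the smoothing idea is nothing fancy – each update the marker just closes a fraction of the gap to the most recent fix instead of teleporting. A rough Python sketch of the concept (the real thing would be a Unity script, & the numbers here are arbitrary):

    from dataclasses import dataclass

    @dataclass
    class Marker:
        x: float
        y: float

    def step_towards(marker: Marker, target_x: float, target_y: float,
                     smoothing: float = 0.15) -> None:
        """Move the marker a fraction of the way towards the latest position fix.

        Called once per update; smoothing = 1.0 gives the current 'teleport'
        behaviour, smaller values trade responsiveness for smoothness.
        """
        marker.x += (target_x - marker.x) * smoothing
        marker.y += (target_y - marker.y) * smoothing

    # Example: marker starts at the origin & the latest fix is at (4.0, 2.0).
    m = Marker(0.0, 0.0)
    for _ in range(20):
        step_towards(m, 4.0, 2.0)
    print(round(m.x, 2), round(m.y, 2))  # creeps up on (4.0, 2.0)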

This could of course have been achieved with a single device by integrating the IndoorAtlas code into the Unity app, but using two devices meant I could start playing around straight away. And since the Rift will most likely be running from a Windows laptop carried in a backpack, it was a good test to use a separate device for collecting the position data.

Next steps?

Next comes the fun part – building the app for Windows & walking around with the Rift on, switching between the real world seen through the cameras & the Unity world at the press of a button!

Tri-X @ 1600 w/ full strength Xtol…

The lovely Ellen Shaw of Shawtography, performing at an exhibition/auction hosted by the university PhotoSoc.

My Xtol has been mixed for over 6 months now, so I used it at full strength – hence the complete & utter lack of shadow detail with the pushed Tri-X…

[Images: ellen_2, ellen_1]

Covent Garden at Christmas [Re-scan]

I rescanned the Covent Garden photo I took back in 2011 & originally scanned with the CanoScan 9000F, this time using the Epson V600 & producing a much better result. Whether that’s because the Epson is actually better or I simply chose more appropriate options is open to debate though!

Epson

[Image: covent_garden_FINAL_FOR_WEB]

Canon

[Image: IMG002-2]