Yesterday’s video was with 2.5mm lenses fitted to the PlayStation Eye cameras; today I got a pair of 2.1mm lenses to try. A definite improvement (after this video I tried fitting a 2.5mm lens in the right camera & a 2.1mm in the left, & compared by closing each eye in turn). I’m tempted to try even wider lenses, though: these 2.1mm ones are still noticeably more ‘zoomed in’ than my naked vision, & they don’t introduce the nasty distortion I feared they might.
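For a rough feel for why shorter focal lengths look less ‘zoomed in’, here’s a back-of-the-envelope field-of-view calculation. The 3.6mm sensor width is an assumption (a nominal 1/4"-format sensor; the Eye’s actual sensor may differ), so treat the numbers as illustrative only:

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm=3.6):
    """Approximate horizontal FOV of a simple rectilinear lens:
    fov = 2 * atan(sensor_width / (2 * focal_length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

for f in (2.5, 2.1):
    print(f"{f}mm lens: ~{horizontal_fov_deg(f):.0f} deg horizontal FOV")
```

Under that assumption the 2.1mm lens gives roughly ten degrees more horizontal FOV than the 2.5mm one, which matches the ‘wider’ impression in the video.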
A quick video of me testing the Mirrorshades setup at my desk, with 2.5mm lenses fitted to the cameras. More info in the video description on YouTube.
Continuing my cross reality research, I’m working on an Oculus Rift project that lets you switch/fade between your real surroundings & virtual surroundings. I’m using 2x PlayStation Eye webcams (cheap, 60fps, & easy to fit with standard M12 lens mounts so you can use all sorts of different lenses – I’m using 2.5mm lenses atm) attached to the Rift via a 3D printed mount. On the software end I’m using Unity, with code from this chap (Japanese link) because Unity’s WebCamTexture doesn’t work with 2x identical cameras.
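The fade itself can be thought of as a per-pixel linear blend between the camera feed & the rendered scene. In the actual project this would live in a Unity shader or material alpha; the pure-Python sketch below just illustrates the blend, with `t` as a hypothetical fade parameter (0 = fully real, 1 = fully virtual):

```python
def crossfade(real_px, virtual_px, t):
    """Linearly blend two RGB pixels: t=0 -> fully real, t=1 -> fully virtual."""
    return tuple(round((1 - t) * r + t * v) for r, v in zip(real_px, virtual_px))

# Halfway between a black camera pixel and a white virtual pixel:
crossfade((0, 0, 0), (255, 255, 255), 0.5)  # -> (128, 128, 128)
```

Sweeping `t` from 0 to 1 over a second or so gives the switch/fade effect described above.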
The mount comprises the clip designed by the guys at USC Information Technology Services (& freely downloadable from their website) with a second piece I designed & glued onto the front with epoxy. I found an EAGLE board file for the webcams so was able to design this second piece with channels matching the PCB mounting holes, allowing the webcams to be moved horizontally to experiment with different interpupillary distances (or even to completely remove one camera & simply position the remaining one in the centre).
One of the cameras has to be mounted upside down: the width of the top of the PCBs (where the mic array is) means they can’t be placed close enough together when both are the same way up. The image from the upside-down camera is simply rotated in software.
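In the project the rotation would be done on the Unity side (e.g. by flipping the texture), but the operation itself is trivial: a 180° rotation is just reversing the row order & then the pixel order within each row. A minimal sketch, treating a frame as a list of rows of pixel values:

```python
def rotate_180(frame):
    """Rotate an image by 180 degrees: reverse the row order,
    then reverse the pixel order within each row."""
    return [row[::-1] for row in frame[::-1]]

frame = [
    [1, 2, 3],
    [4, 5, 6],
]
rotate_180(frame)  # -> [[6, 5, 4], [3, 2, 1]]
```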