An AR/VR hack that lets you see like a dinosaur!
(Dinosaur onesie optional, but recommended for best experience)
I've never seen Jurassic Park, but this classic line is something I've been aware of for years. I thought it would be cool to try and visualise what that would actually look like. We now have devices in our pockets that are powerful enough to do cool, real-time, computer vision thingies - and with the new super powers that the web has gained over the years (getUserMedia + canvas), it's now super-easy to put something together.
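To give a flavour of what that plumbing looks like, here's a rough sketch of piping the camera into a canvas (not the hack's actual source; the 'view' canvas ID is made up for the example):

```js
// A rough sketch: grab the camera with getUserMedia and paint each
// video frame onto a <canvas>.
const video = document.createElement('video');
const canvas = document.getElementById('view'); // hypothetical canvas element
const ctx = canvas.getContext('2d');

navigator.mediaDevices.getUserMedia({ video: true, audio: false })
  .then((stream) => {
    video.srcObject = stream;
    return video.play();
  })
  .then(() => {
    // Size the canvas to the camera feed, then redraw on every frame.
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    (function draw() {
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
      requestAnimationFrame(draw);
    })();
  })
  .catch((err) => console.error('Could not access the camera:', err));
```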
Well, it looks a bit like this:
This is a snapshot of the feed the simulator generates as the viewer's head moves. The image is of the Central Hall in Southampton, where Hacksoton 2017 was held.
The view is an edge detection effect applied to a live video feed, generated by calculating the difference between two frames of the stream.
Frame 1 is the current frame of the video, and frame 2 is the previous frame. Both frames are converted from color to greyscale, and then each pixel is checked to see whether it has changed significantly between the two frames. If a significant change has occurred, the pixel is colored red (the natural color for a bloodthirsty dinosaur to see); if no change has occurred, the pixel is colored black.
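As a sketch of that per-pixel step (not the project's exact source; the threshold value is an arbitrary choice for the example), the comparison could look something like this:

```js
// Sketch of the frame comparison: `previous` and `current` are ImageData
// objects of the same size (e.g. from ctx.getImageData), and `output` is an
// ImageData that gets painted back to the canvas. THRESHOLD is arbitrary.
const THRESHOLD = 30;

function diffFrames(previous, current, output) {
  const prev = previous.data;
  const curr = current.data;
  const out = output.data;

  for (let i = 0; i < curr.length; i += 4) {
    // Reduce each RGBA pixel to greyscale (a simple average will do here).
    const prevGrey = (prev[i] + prev[i + 1] + prev[i + 2]) / 3;
    const currGrey = (curr[i] + curr[i + 1] + curr[i + 2]) / 3;

    // Red where the pixel changed significantly between frames, black otherwise.
    const changed = Math.abs(currGrey - prevGrey) > THRESHOLD;
    out[i]     = changed ? 255 : 0; // red
    out[i + 1] = 0;                 // green
    out[i + 2] = 0;                 // blue
    out[i + 3] = 255;               // alpha (fully opaque)
  }
  return output;
}
```

Each frame, you'd grab the current pixels with ctx.getImageData, diff them against the previous frame, put the result back with ctx.putImageData, and keep the current frame around as the next comparison.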
- The camera is not guaranteed to be environment facing; the front-facing camera will sometimes load instead. If you're using Firefox on Android you can choose the back-facing camera to create the effect, otherwise it's the luck of the draw with other browsers (for the time being). Update: the simulator will now try to make a best guess about which (if any) of the cameras is the rear/environment-facing one on your device. Failing that, it will default to the primary camera of the device (there's a sketch of this after the list).
- The old getUserMedia interface is used. I like it better, but I'll upgrade to the new promise-based interface when I get a chance. Update: the new promise-based getUserMedia interface has been implemented. It should work on most modern devices.
- Stereoscopic vision is not supported. Though a left-eye/right-eye view is displayed, both halves come from the same camera, so no illusion of depth is created. The image appears flat when viewed, but depth can be inferred from the edges of things in your FoV.
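For the curious, a best-guess camera pick like the one described above could look roughly like this, using the promise-based mediaDevices API mentioned in the list (an illustrative sketch, not the simulator's actual code):

```js
// Illustrative "best guess" at the rear camera: look for a video input whose
// label suggests it faces the environment, otherwise fall back to the
// facingMode hint, and ultimately to whatever camera the device gives us.
// (Device labels are often empty until camera permission has been granted,
// so in practice the facingMode hint does most of the work.)
async function getBestGuessStream() {
  const devices = await navigator.mediaDevices.enumerateDevices();
  const rearCamera = devices.find((device) =>
    device.kind === 'videoinput' && /back|rear|environment/i.test(device.label)
  );

  const constraints = rearCamera
    ? { video: { deviceId: { exact: rearCamera.deviceId } }, audio: false }
    : { video: { facingMode: { ideal: 'environment' } }, audio: false };

  return navigator.mediaDevices.getUserMedia(constraints);
}
```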