AR R&D

This project aims to push the boundaries of what's possible in AR. To make augmented reality truly immersive, it needs to feel connected to the real world, yet in most existing AR applications the virtual objects look disconnected from their surroundings. That's why I came up with this framework!

First, GPS data is used to determine whether the user is indoors or outdoors. Weather effects and sun shadows are not applied to the object while the user is inside a building.
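The indoor/outdoor check could look roughly like the Python sketch below. The accuracy threshold, satellite count and field names are my assumptions for illustration, not the framework's actual values.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    sun_shadows_enabled: bool = False
    weather_effects_enabled: bool = False

def is_outdoors(horizontal_accuracy_m: float, satellites_in_view: int) -> bool:
    """Rough heuristic: GPS degrades sharply indoors, so poor accuracy
    or few visible satellites suggests the user is inside a building."""
    return horizontal_accuracy_m < 15.0 and satellites_in_view >= 4

def apply_environment_effects(obj: VirtualObject, outdoors: bool) -> None:
    # Weather effects and sun shadows are only applied when the user is outdoors.
    obj.sun_shadows_enabled = outdoors
    obj.weather_effects_enabled = outdoors
```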

Then the user's location, weather data (fetched from an API using that location) and the time of day are used to simulate a virtual sun and create realistic lighting and shadows for the object. The weather data also drives clouds (which suppress sun shadows), rain, snow and even wind force and direction.
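To give a flavour of the sun simulation, here is a minimal sketch of how a virtual sun could be positioned from location and time and then attenuated by cloud cover. The formulas are standard low-accuracy approximations and the function names are mine; the actual framework may do this differently, for example through an engine or platform API.

```python
import math
from datetime import datetime

def sun_position(lat_deg: float, lon_deg: float, when_utc: datetime):
    """Approximate solar elevation and azimuth in degrees. A rough textbook
    approximation: fine for aiming a virtual directional light, not for astronomy."""
    day = when_utc.timetuple().tm_yday
    # Solar declination (approximation, radians).
    decl = math.radians(23.44) * math.sin(math.radians(360.0 / 365.0 * (day - 81)))
    # Hour angle: 15 degrees per hour away from solar noon, using longitude
    # as a crude offset from UTC.
    solar_hours = when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0
    hour_angle = math.radians(15.0 * (solar_hours - 12.0))
    lat = math.radians(lat_deg)

    elev = math.asin(math.sin(lat) * math.sin(decl)
                     + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    cos_az = ((math.sin(decl) - math.sin(elev) * math.sin(lat))
              / (math.cos(elev) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if hour_angle > 0:  # afternoon: sun is in the western half of the sky
        az = 2.0 * math.pi - az
    return math.degrees(elev), math.degrees(az)  # azimuth clockwise from north

def sun_light_settings(elevation_deg: float, cloud_cover: float):
    """Scale the virtual sun by cloud cover (0..1); heavy overcast or a sun
    below the horizon means no hard shadows."""
    intensity = max(0.0, math.sin(math.radians(elevation_deg))) * (1.0 - cloud_cover)
    cast_shadows = elevation_deg > 0.0 and cloud_cover < 0.9
    return intensity, cast_shadows
```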

A separate system takes care of reflections. The framework supports two types. The first is reflections on the object itself, which are approximated from the live video feed. The second is the ground reflection. Currently, ground reflections are shown only when water is detected below the object: rain data is used as the trigger, but to make sure the floor is actually wet, a machine learning service checks whether water is present.
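The gating logic for ground reflections can be summarised in a few lines. The water-detection endpoint and its request/response shape below are hypothetical placeholders; the text only says that an external machine learning service is used.

```python
import requests

WATER_DETECTION_ENDPOINT = "https://example.com/detect-water"  # placeholder URL

def detect_water(camera_frame_jpeg: bytes) -> bool:
    # Hypothetical request/response format; the real service is not documented here.
    resp = requests.post(WATER_DETECTION_ENDPOINT,
                         files={"image": camera_frame_jpeg}, timeout=5)
    resp.raise_for_status()
    return bool(resp.json().get("water_detected", False))

def ground_reflections_enabled(rain_mm_last_hour: float,
                               camera_frame_jpeg: bytes) -> bool:
    # Rain is only a hint that the floor might be wet; the ML check confirms
    # that there is actually water below the object.
    return rain_mm_last_hour > 0.0 and detect_water(camera_frame_jpeg)
```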

The machine learning system also returns data about the floor below the virtual object, and in the example here I use it to change the sound the floor makes depending on its material.
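A simple way to drive the floor sound is a lookup from the material label returned by the service. The labels and sound file names below are invented for the example.

```python
# Hypothetical mapping from the ML service's floor-material label to a sound clip.
FLOOR_SOUNDS = {
    "wood":     "sounds/footstep_wood.wav",
    "concrete": "sounds/footstep_concrete.wav",
    "grass":    "sounds/footstep_grass.wav",
    "water":    "sounds/splash.wav",
}
DEFAULT_SOUND = "sounds/footstep_generic.wav"

def floor_sound_for(material_label: str) -> str:
    # Fall back to a generic sound for materials the service does not recognise.
    return FLOOR_SOUNDS.get(material_label.lower(), DEFAULT_SOUND)
```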

UberReflections

This is a personal R&D project. I used everything I learned while working on the Theta S competition submission, as well as my earlier HoloLens reflections project.
My goal was to connect the virtual and the real world in a new way. The virtual object reflects real objects all around it while the real world "reflects" the virtual object.

The user places a HoloLens anchor and then captures a 360 image with a phone from the point of view of that anchor. Once the image is uploaded to a CMS, the HoloLens app downloads it and uses it for image-based lighting and reflections. This version of the app also features additional reflections, since the user is expected to place the objects on a reflective surface. As you can see, the objects appear to be reflected by the table.
When you use the HoloLens you see two different reflections on the table, one for each eye. In this video the reflections are not aligned, but when the app is viewed through the HoloLens everything is aligned properly.
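For the CMS step described above, a minimal sketch of the download side might look like this; the endpoint, query parameter and response format are placeholders, since the actual CMS is not described here. Once saved, the equirectangular image would be handed to the engine as an environment map for image-based lighting and reflections.

```python
import requests

CMS_PANORAMA_URL = "https://example-cms.local/api/panoramas/latest"  # placeholder

def fetch_latest_panorama(anchor_id: str, out_path: str = "environment.jpg") -> str:
    """Download the most recent 360 capture associated with a HoloLens anchor
    so the renderer can load it as an environment map."""
    resp = requests.get(CMS_PANORAMA_URL, params={"anchor": anchor_id}, timeout=10)
    resp.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(resp.content)
    return out_path
```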