AR Garden: AR Viewfinder
2021
A UX exploration for HoloLens 2 in which users explore the ocean floor with their hands.
R&D with Glowbox
About AR Garden: Viewfinder
In this research we were interested in finding aesthetic and interactive approaches that sidestepped the limited field of view and fill rate of the HoloLens 2. I was responsible for the research behind the final demo and for its programming.
To deal with performance, we first looked into occlusion culling and dynamic placement of points via the job system, constraining the rendering pressure on the GPU to a more constant level.
Continuous LOD of point clouds using the job system.
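The demo itself ran this work inside Unity's job system; the numpy sketch below is only meant to illustrate the core idea of distance-based thinning, where each point's chance of being drawn falls off with distance so the per-frame point budget stays roughly level. Function and parameter names are illustrative, not the project's actual code.

```python
import numpy as np

def lod_keep_mask(points, camera_pos, full_detail_dist=1.0, rng=None):
    """Stochastically thin a point cloud so nearby regions stay dense
    while distant regions are decimated, keeping GPU load roughly level.

    points: (N, 3) world-space point positions
    camera_pos: (3,) camera position
    full_detail_dist: distance at which every point is kept
    Returns a boolean mask of points to draw this frame.
    """
    rng = np.random.default_rng() if rng is None else rng
    dists = np.linalg.norm(points - camera_pos, axis=1)
    # Keep probability falls off with the square of distance, roughly
    # mirroring how many pixels a fixed-size splat covers on screen.
    keep_prob = np.clip((full_detail_dist / np.maximum(dists, 1e-6)) ** 2, 0.0, 1.0)
    return rng.random(len(points)) < keep_prob

# Example: a million random points, camera at the origin.
pts = np.random.uniform(-5, 5, size=(1_000_000, 3))
visible = pts[lod_keep_mask(pts, np.zeros(3))]
```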
We ultimately decided to realize the culling work through a viewfinder formed by the user's hands. This allowed us to approach the limited field of view in a more forgiving way and added a novel physical joy to an otherwise technical study.
Creating a camera from the user's hands deepened my understanding of the transformation matrices involved in rendering. In this case I needed to set the parameters for a camera representing the intersection of the view from the user's eyes with the plane formed between their hands.
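Concretely, this amounts to an off-axis projection through an arbitrary rectangle in space, in the style of Kooima's generalized perspective projection. The demo built these matrices in Unity; the numpy sketch below is an illustrative version under the assumption that pa, pb, and pc are the lower-left, lower-right, and upper-left corners of the hand window.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def frustum(l, r, b, t, n, f):
    """Standard OpenGL-style off-axis perspective matrix."""
    return np.array([
        [2*n/(r-l), 0,         (r+l)/(r-l),  0],
        [0,         2*n/(t-b), (t+b)/(t-b),  0],
        [0,         0,        -(f+n)/(f-n), -2*f*n/(f-n)],
        [0,         0,        -1,            0],
    ])

def hand_window_view_projection(pa, pb, pc, eye, near=0.05, far=20.0):
    """View-projection through an arbitrary rectangle in space.

    pa, pb, pc: lower-left, lower-right, upper-left corners of the
                window spanned between the hands, in world space.
    eye:        world-space position of the user's eye.
    """
    vr = normalize(pb - pa)            # window right axis
    vu = normalize(pc - pa)            # window up axis
    vn = normalize(np.cross(vr, vu))   # window normal, toward the eye
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)                # eye-to-window distance
    l = np.dot(vr, va) * near / d      # frustum extents on the near plane
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    P = frustum(l, r, b, t, near, far)
    # Rotate the world so the window axes align with the camera axes...
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    # ...and translate the eye to the origin.
    T = np.eye(4)
    T[:3, 3] = -eye
    return P @ M @ T
```

Feeding this matrix a rectangle tracked from the hands and the head position each frame yields a camera that always "looks through" the gap between the hands, regardless of where the hands sit relative to the head.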
Early footage of what it looks like to render the world through a plane.
Early footage demonstrating culling of point cloud points using the frustum formed by the user's eyes and the hand plane.
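The culling shown above reduces to plane tests against the pyramid whose apex is the eye and whose base is the hand rectangle. The shipping version ran per-point inside the job system; the sketch below is an illustrative, vectorized take on the same test, with all names assumed rather than taken from the project.

```python
import numpy as np

def cull_points_to_hand_frustum(points, eye, corners):
    """Keep only points inside the pyramid whose apex is the eye and
    whose base is the hand rectangle.

    points:  (N, 3) world-space point cloud positions
    eye:     (3,) eye position
    corners: (4, 3) hand-rectangle corners in winding order
    Returns a boolean mask of points inside the frustum.
    """
    center = corners.mean(axis=0)
    inside = np.ones(len(points), dtype=bool)
    for i in range(4):
        a, b = corners[i], corners[(i + 1) % 4]
        n = np.cross(a - eye, b - eye)           # side-plane normal
        if np.dot(n, center - eye) < 0:          # orient it inward
            n = -n
        inside &= (points - eye) @ n >= 0.0
    # Also require points to lie beyond the hand plane, so nothing
    # between the hands and the eye is drawn.
    plane_n = np.cross(corners[1] - corners[0], corners[3] - corners[0])
    if np.dot(plane_n, eye - corners[0]) > 0:
        plane_n = -plane_n                       # point away from the eye
    inside &= (points - corners[0]) @ plane_n >= 0.0
    return inside
```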
In-editor footage demonstrating the technique using MRTK's editor input simulation.