Les Boréades

2020

An immersive classical music performance wrapped in photoscanned environments.

Work with Glowbox

Written about in Oregon ArtsWatch.

Glowbox is a spatial interaction lab: www.glowbox.io

About Les Boréades

Les Boréades is the culmination of a huge amount of work and research by Glowbox, Brad Johnson, and 45th Parallel, and it represents much of my work in 2020. Because of that, this section is subdivided into several parts; read on if any of the descriptions jump out to you!

When I started work on Les Boréades, Glowbox was interested in blending point clouds from photogrammetry captured by Brad Johnson with a performance by 45th Parallel. The research they tasked me with came in three parts:

to assist in the visualization of what the performance might look like physically:


Previsualization of the stage inside PICA's project space, made in Blender


The previsualization as it evolved into a physical model by Ben Purdy and Thomas Wester


2019 Physical performance inside PICA's project space

to study blending performers captured with Kinect into point cloud scenes in virtual reality:


I developed a relighting approach for volumetric video, using Blender and mesh sequences exported from Depthkit, to better integrate performers into their environments.

and to develop the look of point clouds from photogrammetry:

This was the most extensive area of research for the project. Our goal was to dance between images of the natural and built environments. The product of this research was more than 50 VFX Graphs covering many different approaches to creating point clouds from raster data, baked with a modified version of Keijiro Takahashi's PCX.
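As a rough illustration of the baking idea (illustrative Python with invented names, not the actual PCX pipeline or the VFX Graph code): raster data can be flattened into per-point attributes, one point per pixel, with position coming from pixel coordinates plus a depth channel and color coming from the image.

```python
# Hypothetical sketch: turning raster data (color + depth per pixel)
# into a flat list of point attributes, similar in spirit to baking
# imagery into point clouds. Not the actual production pipeline.

def raster_to_points(colors, depths, width, height, scale=1.0):
    """colors: list of (r, g, b) per pixel, row-major.
    depths: list of depth values per pixel.
    Returns a list of (x, y, z, r, g, b) point attributes."""
    points = []
    for y in range(height):
        for x in range(width):
            i = y * width + x
            r, g, b = colors[i]
            points.append((x * scale, y * scale, depths[i] * scale, r, g, b))
    return points

# Tiny 2x2 example image
cols = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 255)]
deps = [0.0, 0.5, 1.0, 1.5]
pts = raster_to_points(cols, deps, 2, 2)
```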


For natural environments I looked to Neo-Impressionists like Georges Seurat and Henri-Edmond Cross.


For the built environment I looked for inspiration to sculptors interested in architecture, like Edoardo Tresoldi and Do Ho Suh.


I also developed several techniques that used sampling over time to calculate things like directions to neighboring points and the closest neighbors between two point clouds.
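A minimal sketch of the amortized-sampling idea, in illustrative Python rather than the actual VFX Graph implementation: each frame updates neighbor links for only a slice of the points, so the cost of a full neighbor search is spread across several frames.

```python
# Hedged sketch: amortizing nearest-neighbor searches between two point
# clouds across frames. Names and structure are illustrative only.
import math

def nearest_neighbor(p, cloud):
    """Brute-force nearest point in `cloud` to point `p`."""
    return min(cloud, key=lambda q: math.dist(p, q))

def update_slice(src, dst, neighbors, frame, slices=4):
    """Update neighbor links for one interleaved slice of `src` per frame."""
    for i in range(frame % slices, len(src), slices):
        neighbors[i] = nearest_neighbor(src[i], dst)
    return neighbors

src = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (2, 2, 2)]
dst = [(0, 0, 1), (2, 2, 3)]
neighbors = [None] * len(src)
for frame in range(4):  # after 4 frames, every point has been updated once
    update_slice(src, dst, neighbors, frame)
```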


These proved useful in effects where I could form loose meshes from point clouds for sharper-sounding orchestration, and they were used extensively in the final piece.

In the production phase of the project I worked closely with Kate Wolf, Cat Ross, and Brad Johnson to sequence visual effects and camera moves to a recording of the music using tools by Thomas Wester. During production I refined the visual effects as the sequencing evolved and contributed code to the sequencing software to expand support for Unity's VFX Graph.

The complete performance is viewable on 45th Parallel's YouTube channel.


Research after Les Boréades

Glowbox later continued this research by developing a stage for volumetrically capturing performers using three Azure Kinects. Most of my prior research had focused on static point clouds; shifting to dynamic point clouds introduced new opportunities and new challenges. One challenge was the gap in fidelity between the Azure Kinect's time-of-flight data, which was produced in realtime, and photogrammetry, which could take hours to produce. We were also challenged to render people in a way that allowed abstraction but better represented high-frequency details like facial features.

I approached this issue by looking into techniques for filling the holes between points, and presented the research to the rest of the team as a deck explaining the different approaches.


The first approach I tried was jump flooding, which builds a Voronoi-based coordinate field from the point clouds.
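A minimal CPU sketch of the jump flooding idea (the real effect ran on the GPU, and grid size and seeds here are made up): after passes with halving step sizes, every cell holds the coordinate of its nearest seed, which is exactly a Voronoi coordinate field.

```python
# Simplified jump flood on a 2D grid. A production JFA ping-pongs two
# buffers; this sketch updates in place, which still converges here.
def jump_flood(width, height, seeds):
    # nearest[x][y] holds the seed coordinate currently closest to (x, y)
    nearest = [[None] * height for _ in range(width)]
    for sx, sy in seeds:
        nearest[sx][sy] = (sx, sy)
    step = max(width, height) // 2
    while step >= 1:
        for x in range(width):
            for y in range(height):
                for dx in (-step, 0, step):
                    for dy in (-step, 0, step):
                        nx, ny = x + dx, y + dy
                        if not (0 <= nx < width and 0 <= ny < height):
                            continue
                        cand = nearest[nx][ny]
                        if cand is None:
                            continue
                        best = nearest[x][y]
                        d_cand = (cand[0] - x) ** 2 + (cand[1] - y) ** 2
                        if best is None or d_cand < (best[0] - x) ** 2 + (best[1] - y) ** 2:
                            nearest[x][y] = cand
        step //= 2
    return nearest

field = jump_flood(8, 8, [(1, 1), (6, 6)])
```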


The simplest approach to filling in missing data ended up being splats made of camera-facing cones, which produce Voronoi-like results when depth-sorted.
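The trick can be mimicked on the CPU: because a camera-facing cone's depth grows with screen-space distance from its apex, the depth test keeps, at each pixel, the splat whose center is nearest. A hypothetical sketch of that per-pixel "nearest wins" behavior:

```python
# Illustrative CPU version of depth-sorted cone splats: each pixel takes
# the color of the nearest splat center, yielding a Voronoi-like image.
def cone_splat(width, height, points):
    """points: list of ((x, y), color). Returns row-major pixel colors."""
    image = [None] * (width * height)
    for py in range(height):
        for px in range(width):
            best = None
            for (cx, cy), color in points:
                d = (cx - px) ** 2 + (cy - py) ** 2  # cone depth ~ distance
                if best is None or d < best[0]:
                    best = (d, color)
            image[py * width + px] = best[1]
    return image

img = cone_splat(4, 4, [((0, 0), "red"), ((3, 3), "blue")])
```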

Working with streamed data allowed me to bring in interesting approaches to rendering motion from video art.

Volumetric ghosting effect

Volumetric slit scan effect
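The slit-scan idea can be sketched in a few lines (illustrative Python; the volumetric version slices the point cloud along a spatial axis rather than image rows): each row of the output is taken from a different frame in a short history buffer, so motion smears through time down the image.

```python
# Hedged sketch of a slit-scan effect over a small frame history.
def slit_scan(frames, height):
    """frames: list of frames, each a list of `height` rows.
    Row i of the output comes from frame (i mod len(frames))."""
    return [frames[i % len(frames)][i] for i in range(height)]

# Three 3-row "frames" labeled by frame and row index
frames = [[f"f{f}r{r}" for r in range(3)] for f in range(3)]
out = slit_scan(frames, 3)
```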

We were also interested in ways we could expand the tooling available to artists. In Les Boréades we used a couple of in-house tools: Thomas Wester developed a camera pathing tool, which allowed Brad Johnson to make camera paths in VR, and a sequencer, which allowed Cat to change VFX and cameras during the performance in response to cues in the music. To expand this field we looked into everything from MIDI-controlled effects to attribute painting and baking for VR.

MIDI-controlled building effect

MIDI-controlled pulse effect
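The mapping underneath effects like these is simple; a hedged sketch (the parameter and its range are hypothetical) remapping a 7-bit MIDI control-change value onto an effect parameter:

```python
# Illustrative mapping from a MIDI CC value (0-127) to a VFX parameter
# range. The "pulse rate" parameter below is a made-up example.
def cc_to_param(cc_value, lo, hi):
    """Linearly remap a clamped 7-bit MIDI CC value onto [lo, hi]."""
    return lo + (max(0, min(127, cc_value)) / 127.0) * (hi - lo)

pulse_rate = cc_to_param(127, 0.0, 4.0)  # knob turned all the way up
```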


Attributes painted with the VFX Graph and baked to a texture for readback (left: painted attributes; right: the buffer holding colors)
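A rough sketch of the baking-for-readback idea (the texel layout and names are illustrative, not the production setup): per-point attributes are packed into a fixed-size row-major texture so each point's value can be looked up again by index.

```python
# Hypothetical sketch: packing per-point RGB attributes into a
# fixed-size "texture" buffer for later readback by point index.
def bake_attributes(values, width, height, fill=(0, 0, 0)):
    """Pack a flat list of RGB attribute values into a row-major
    width*height texture, padding unused texels with `fill`."""
    tex = [fill] * (width * height)
    for i, v in enumerate(values[: width * height]):
        tex[i] = v
    return tex

tex = bake_attributes([(255, 0, 0), (0, 255, 0)], 4, 4)
red_point = tex[0]  # readback: attribute for point 0
```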

Holly Newlands -- 2024