Negotiation Body (prototype)
2019
A tug-of-war game for desktop between the player and their first person controller.
About Negotiation Body (prototype)
Negotiation Body doesn't use any fancy neural networks or reinforcement learning; its relevance to artificial intelligence lies in procedurally animating a non-human character in a way that invites players to read intelligence into it. My hope was to make a space where players could choose to collaborate with or challenge their first person controller. I was interested in complicating the relationship players have with their first person controller: where other first person games might assume you're the sole proprietor of an avatar, or that your avatar is a blank slate, I tried to give the avatar the agency to challenge the player.
I wanted to challenge myself to build the game's aesthetic entirely from reality capture data. I used a Kinect v2 to scan my arms, a Kinect v1 with a jeweler's table to scan all of the props floating around the space, and RTAB-Map on a Tango dev kit to scan the room featured in the game. I then wrote MeshLab scripts to automatically process the scans down to game-asset resolution.
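A batch pipeline like this can be sketched as a MeshLab filter script run from the command line. The fragment below is only illustrative, not the script actually used for the project; filter and parameter names vary between MeshLab versions, so treat the specific names and values here as assumptions:

```xml
<!DOCTYPE FilterScript>
<FilterScript>
  <!-- Rebuild a watertight surface from the raw scan geometry -->
  <filter name="Surface Reconstruction: Poisson">
    <Param type="RichInt" name="OctDepth" value="8"/>
    <Param type="RichFloat" name="SamplesPerNode" value="1.5"/>
  </filter>
  <!-- Decimate the result down to game-asset resolution -->
  <filter name="Quadric Edge Collapse Decimation">
    <Param type="RichInt" name="TargetFaceNum" value="5000"/>
    <Param type="RichBool" name="PreserveNormal" value="true"/>
  </filter>
</FilterScript>
```

Saved as, say, `process.mlx`, a script in this style can be applied to each scan non-interactively, e.g. `meshlabserver -i scan.ply -o asset.obj -s process.mlx`, which is what makes it practical to push a whole folder of scans through the same treatment.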
In this project I leaned heavily on Poisson surface reconstruction in MeshLab. It let me push toward a more dreamlike aesthetic, blurring parts of the models while retaining an uncanny resemblance to their sources. When I first tried to use the room featured in the game I ran into all sorts of issues: furniture obscuring the walls, holes in the scan, and eventually the discovery that the room was simply too short for the gameplay loop. I found, however, that I could extend the room by feeding the Poisson reconstruction pieces of the room to guide where it needed to fill in.
Here's an example of the approach on a room that I didn't end up using:
Broken-up mesh used to feed the Poisson reconstruction
Poisson-reconstructed mesh, watertight, complete with flooring.