Projection mapping, forced perspective, and magic tricks.
In 2013, the Bot & Dolly team began working on a technical demonstration of projection mapping on moving surfaces. Six months and a few thousand iterations later, this project was released as the short film Box.
Box was an opportunity to use all of the technical tricks in our book at the time: synchronization of multiple cameras, projectors, robots, and lights; motion capture workflows; a virtual camera rig; and classic practical effects like forced perspective.
But it was ultimately our in-house creative team that took the project to another level, providing a concept and visual design that allowed the piece to transcend its origins as a technical demonstration.
We realized, fairly early on, that projection mapping onto moving surfaces was an effect we could achieve at Bot & Dolly, given the sub-frame positioning accuracy the robots provided. The challenge for Box was optimization: chasing down latencies and pushing the dynamic limits of the robots. TouchDesigner handled live compositing for both the virtual camera pass and the final take, and coordinated lights, sound, motion, and other playback elements.
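To make "chasing down latencies" concrete: when the full chain of tracking, rendering, and projection has a fixed delay, projected content lands behind a moving surface unless you predict where the surface will be. The sketch below is a minimal illustration of that idea (a constant-velocity extrapolation in plain numpy), not Bot & Dolly's actual pipeline; the sample values and the `extrapolate_position` helper are hypothetical.

```python
import numpy as np

def extrapolate_position(t_now, times, positions, latency):
    """Predict where a tracked surface will be `latency` seconds from now,
    using a constant-velocity model over the two most recent samples."""
    p_prev, p_curr = np.asarray(positions[-2]), np.asarray(positions[-1])
    t_prev, t_curr = times[-2], times[-1]
    velocity = (p_curr - p_prev) / (t_curr - t_prev)
    return p_curr + velocity * (t_now + latency - t_curr)

# Surface moving at 0.5 m/s along x; assume 50 ms of end-to-end latency.
times = [0.00, 0.01]
positions = [(0.000, 0.0, 0.0), (0.005, 0.0, 0.0)]
predicted = extrapolate_position(0.01, times, positions, latency=0.050)
# Rendering for the predicted pose (x = 0.030 m) instead of the last
# measured pose (x = 0.005 m) keeps the image pinned to the surface.
```

In practice a robot following a pre-programmed path makes this far easier than free tracking would, since the future pose is known exactly rather than extrapolated.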
Calibration was a constant challenge, given the number of distinct sensors, lenses, and mechanical assemblies involved. We calibrated the coordinate frames of the three robots, their external axes, the mounted projection screens, a virtual camera rig, and a motion capture system, as well as camera and projector intrinsics and extrinsics, using a mix of in-house software and open-source toolkits.
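What calibration ultimately estimates is the forward model that maps a 3-D point in a shared world frame to a pixel in a particular camera or projector. As a hedged illustration (not our in-house software), here is the standard pinhole model: intrinsics K and extrinsics [R | t]. The numbers are made up for the example.

```python
import numpy as np

def project(K, R, t, point_world):
    """Project a 3-D world point to pixel coordinates with a pinhole
    model: extrinsics [R | t] map world -> camera, intrinsics K map
    camera rays -> pixels."""
    p_cam = R @ point_world + t          # world frame -> camera frame
    p_img = K @ p_cam                    # camera frame -> homogeneous pixels
    return p_img[:2] / p_img[2]          # perspective divide

# Hypothetical intrinsics: 1000 px focal length, principal point at the
# center of a 1920x1080 image.
K = np.array([[1000.0,    0.0, 960.0],
              [   0.0, 1000.0, 540.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                    # device aligned with the world axes
t = np.array([0.0, 0.0, 2.0])    # world origin sits 2 m in front of the lens
uv = project(K, R, t, np.array([0.1, 0.0, 0.0]))
# A point 0.1 m right of the origin lands 1000 * 0.1 / 2 = 50 px
# right of the image center, at (1010, 540).
```

Open-source toolkits such as OpenCV solve the inverse problem: given many known 3-D points and their observed pixel positions, recover K, R, and t. With several devices calibrated into one world frame, chaining these transforms is what lets a projector hit a screen held by a robot.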
Virtual camera workflows were relatively new at the time, and we essentially rolled our own, stitching together a Canon 5D, PhaseSpace motion capture, and TouchDesigner. After capturing the camera path, we used custom tools in Maya to smooth, preview, and export a motion path for the robot-mounted camera to perform.
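The smoothing step matters because a hand-held capture carries jitter that a robot would faithfully, and jarringly, reproduce. As a minimal sketch of the idea (a centered moving average in numpy, standing in for the custom Maya tools; the `smooth_path` helper and sample data are hypothetical):

```python
import numpy as np

def smooth_path(samples, window=5):
    """Smooth a noisy hand-held camera path with a centered moving
    average so the robot-mounted camera performs a jitter-free
    trajectory. `samples` is an (N, 3) array of x, y, z positions."""
    samples = np.asarray(samples, dtype=float)
    kernel = np.ones(window) / window
    # Edge-pad so the smoothed path keeps its original endpoints and length.
    padded = np.pad(samples, ((window // 2, window // 2), (0, 0)), mode="edge")
    return np.column_stack([np.convolve(padded[:, i], kernel, mode="valid")
                            for i in range(samples.shape[1])])

# A straight-line dolly move along x with +/- 1 cm of alternating jitter.
noisy = np.array([[i * 0.1 + (0.01 if i % 2 else -0.01), 0.0, 0.0]
                  for i in range(20)])
smoothed = smooth_path(noisy, window=5)
# Interior points now deviate from the ideal line by about 2 mm
# instead of 10 mm.
```

A production tool would filter orientation as well as position and respect the robot's velocity and acceleration limits, but the position-only version above shows the shape of the step: capture, filter, preview, then export for the robot to perform.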