Once again Industrial Light & Magic rocks the world of visual effects, this time with the virtual production technology used on The Mandalorian. It has sent shock waves through the VFX community and has many of my fellow compositors worried about the future of our jobs. After all, shots that would normally have been done as green screen composites are now captured in-camera with no compositing at all. Over 50 percent of The Mandalorian Season 1 was filmed using this new methodology, and this scares the pudding out of my fellow pixel practitioners. As of this writing, there are 10 virtual sets in production with roughly another 100 underway all around the world. Is this the end of our careers? Are we soon to be as obsolete as CRTs? Will industry-standard tools like Nuke, Resolve, Silhouette, or Mocha go the way of the Dodo? The short answer is "no!" The long answer is "hell no!"
The visual effects color pipeline starts in the real world, of course. The light from surfaces and light sources is captured by cameras and travels a data path through several "spaces" to the movie screen. The challenge for visual effects is that the real world presents color and light with a dynamic range and color gamut that far exceed the capacity of our display devices, so a degraded version of the original scene must be carefully managed in order to deliver the most realistic version of that scene to the audience. The real world is not a "color space" in the color science sense, but it is presented here as the "World Space" where our entire VFX pipeline begins.
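To make the range mismatch concrete, here is a minimal sketch (not from the text; the luminance figures are rough, commonly cited ballpark values in nits) showing why scene light cannot simply be clipped to a display's peak:

```python
# Approximate real-world luminances in nits (cd/m^2). Illustrative values,
# not measurements from any actual production.
scene = {
    "sunlit snow": 30_000.0,
    "interior wall": 100.0,
    "moonlit grass": 0.05,
}

SDR_PEAK = 100.0  # typical SDR display peak luminance, nits


def naive_clip(nits):
    """Naively map scene luminance straight to the display by clipping."""
    return min(nits, SDR_PEAK)


clipped = {name: naive_clip(nits) for name, nits in scene.items()}
# "sunlit snow" and "interior wall" both land on 100.0 -- every bit of
# detail above the display's peak is destroyed. This is why the scene's
# range must be carefully compressed and managed rather than clipped.
```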
The way to think of the VFX color pipeline is as a sequence of operations in which the data is depleted at each step. By starting with high-quality data and properly managing the pipeline, you can ensure that your VFX not only look great on the intended display device, but are also future-proofed to look good on the display devices of tomorrow.