The project features thousands of photos representing the faces of hunger in America, blended into a single, unique, AI-generated image. The Mill’s team was tasked with taking that singular AI image and not only bringing it to life, but seamlessly transforming it into different faces of varying genders, ages and seated positions.
Typically, a full CG facial replacement would be the chosen method for a project like this. However, a build of that magnitude can take up to 12 months, and the team had only two, leading them down the path of a hybrid 2D approach.
To remain as efficient as possible, The Mill had to think outside the box and adjust its traditional pipeline. The team quickly beta-tested new tools from KeenTools: FaceTracker and FaceBuilder for Nuke (the 2D compositing software used on the project). These were untested, unreleased tools of which The Mill became an early adopter. FaceBuilder produces a generic face model in Nuke, which allowed the artists to realign facial features and ensure a near-perfect match to the initial image. The footage was then run through FaceTracker to animate the face and, most importantly, to sync the mouth movements with the voiceover.
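The KeenTools nodes handle this alignment inside Nuke; conceptually, realigning a generic face model to a set of target facial landmarks comes down to a least-squares fit of a similarity transform (scale, rotation, translation). A minimal 2D sketch in plain Python/NumPy, purely illustrative (the function names are hypothetical and this is not the KeenTools API):

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares similarity transform (scale, rotation, translation)
    mapping src landmarks onto dst landmarks. Both are Nx2 arrays of
    (x, y) points, e.g. generic-model features vs. detected features."""
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - src_mean, dst - dst_mean
    # Optimal rotation via SVD of the cross-covariance matrix (Kabsch/Umeyama)
    u, s, vt = np.linalg.svd(src_c.T @ dst_c)
    r = (u @ vt).T
    # Guard against a reflection sneaking in as the "best" fit
    if np.linalg.det(r) < 0:
        vt[-1] *= -1
        s[-1] *= -1
        r = (u @ vt).T
    scale = s.sum() / (src_c ** 2).sum()
    t = dst_mean - scale * (r @ src_mean)
    return scale, r, t

def apply_transform(pts, scale, r, t):
    """Apply the fitted similarity transform to an Nx2 point array."""
    return scale * (r @ pts.T).T + t
```

A fit like this gets the generic model into the right place; the per-feature sculpting that ensures a "near-perfect match" is then a much smaller correction on top.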
After successfully bringing the AI image to life, it was time to tackle the next challenge: continually transforming and morphing between the different characters portrayed. Using the same compositing tools, the artists were quickly able to track all the supporting actors' faces and correct the geometry and facial tracking for each of them. Of course, the transitions needed to be unnoticeable: one moment the viewer is seeing and hearing a young girl's testimonial, the next an elderly man's. The team meticulously adjusted subtle head movements, eye blinks and shadows to make these characters feel real.
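At its core, a transition like this is a morph: the tracked geometry of one face is warped toward the other while the pixels cross-dissolve over time. A minimal NumPy sketch of the temporal-blend half (illustrative only, not The Mill's actual pipeline; a production morph would also warp between the two sets of tracked landmarks):

```python
import numpy as np

def smoothstep(t):
    """Ease curve so the dissolve accelerates and decelerates smoothly,
    which reads as less mechanical than a linear fade."""
    return t * t * (3.0 - 2.0 * t)

def cross_dissolve(img_a, img_b, num_frames):
    """Yield num_frames frames blending img_a into img_b.
    Both images are float arrays of the same shape (H x W x C)."""
    for i in range(num_frames):
        t = smoothstep(i / (num_frames - 1))
        yield (1.0 - t) * img_a + t * img_b
```

The eased blend is one reason the cut point is hard to spot: there is never a single frame where the change of face happens all at once.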