February 21st, 2018

From gaming and social experiences like Pokémon Go and Snapchat to dynamic digital content from media outlets like the NY Times and W Magazine, 2017 was a breakout year for augmented reality as an emerging technology that can enhance the way we tell stories, consume information, and interact with the world around us. The mass adoption of smartphones has lowered the barrier to entry into the AR space, giving anyone and everyone the opportunity to partake in some form of AR, while also providing brands the chance to explore how the technology can be used to engage with fans and consumers through established distribution platforms like Facebook.

We asked Tetsuro Mise, Art Director at The Mill NY, to discuss his work with The Mill's Emerging Technology Team on AR projects, how the AR space has evolved in 2017 and how the technology and experiences can progress in the near future:

What has been the biggest innovation in the AR space in 2017? How has The Mill leveraged this in our work?

To be honest, I am still digesting this big leap forward in AR, where some of the strongest innovation is coming from major players like Apple, Google, and Facebook. For this reason, 2017 was a big year for AR, as the technology finally entered the main stage of entertainment as well as other areas. This has accelerated the speed of invention for both hardware and content.

The Mill has really leveraged this and achieved a great deal in the AR field, with some of our best examples coming from Facebook. After the success of the Justice League masks, which we started developing in early 2017, we have produced memorable content for brands like HBO's Game of Thrones, Netflix's Stranger Things, and BBC's Doctor Who, to name only a few.

Since then, we have raised the benchmark for high-fidelity content built for Facebook's camera functions. The Mill was the first vendor appointed to create Facebook's World Effects, which feature markerless surface detection and were used for projects like X-Files and Jurassic World: Fallen Kingdom. These projects all relied on our very talented team of creatives, engineers, and producers, who have really embraced emerging technology as a tool for immersive storytelling.

What's been the most challenging AR project you've worked on? And how did the team solve the challenge?

Our first project for the Facebook camera, Justice League, was a big learning curve for the team in terms of our understanding of the capabilities of AR Studio, the software that allows users to create Facebook masks and filters. At the project's inception, none of us had any experience with the software, and given the quick turnaround required, it was a fast and deep dive to create five high-fidelity superhero masks. Facebook engineers were always on hand to offer our team superb support, and through this process we developed a number of ways to refine the look of the 3D models we were creating with them.

Another challenging project, though one I wasn't involved in myself but would love to mention, was Jurassic World: Fallen Kingdom. This was our second installment of Facebook's World Effects, launched over the 2018 Super Bowl weekend. This new Facebook camera function features surface detection, so we can place a 3D object into your environment. With this Facebook effect, you can watch Blue the velociraptor fiercely roam around a subway station or chase after your cat in your living room.

The process of ensuring realistic movement in the Facebook camera effects platform wasn't straightforward at all! Animation data needs to be translated from Maya to Blender and then into AR Studio, where Facebook approves camera effects. Our 3D teams and engineers, as well as the Facebook tech team, spent a great amount of time establishing this seamless integration. After countless failures and rounds of debugging, we finally arrived at a smooth workflow. This doesn't just show how we solve technical issues; it really highlights The Mill's ability to extend our legacy of high-quality moving image into the field of AR.
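A hand-off step like the Maya-to-Blender hop described above is typically scripted. The snippet below is a minimal, hypothetical Blender (bpy) sketch of that middle step: importing a baked FBX from Maya and re-exporting a cleaned-up FBX for an AR authoring tool. The file paths and export settings are assumptions for illustration, not The Mill's actual pipeline configuration.

```python
# Illustrative Blender (bpy) script: import an FBX animation exported from Maya,
# then re-export a cleaned-up, baked FBX for hand-off to an AR authoring tool.
# Paths and settings are assumptions for this sketch, not a production setup.
import bpy

MAYA_EXPORT = "/tmp/raptor_from_maya.fbx"   # hypothetical Maya export
AR_HANDOFF = "/tmp/raptor_for_ar.fbx"       # hypothetical hand-off file

# Start from an empty scene so only the imported rig and mesh get exported.
bpy.ops.wm.read_homefile(use_empty=True)

# Bring in the Maya animation (mesh, armature, and keyframes).
bpy.ops.import_scene.fbx(filepath=MAYA_EXPORT)

# Re-export only meshes and armatures, baking the animation so the target
# tool does not have to evaluate Maya-specific rig controls or constraints.
bpy.ops.export_scene.fbx(
    filepath=AR_HANDOFF,
    object_types={'ARMATURE', 'MESH'},
    bake_anim=True,                  # bake keyframes for every frame
    bake_anim_simplify_factor=1.0,   # light curve simplification to keep file size down
    add_leaf_bones=False,            # skip extra end bones the AR tool does not need
)
print(f"Wrote AR hand-off file to {AR_HANDOFF}")
```

Run inside Blender (for example via blender --background --python this_script.py), this kind of script makes the translation step repeatable, which matters when an animation goes through many rounds of debugging before approval.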

What are some of the best examples of brands or creatives using AR?

I see a number of successful case studies using AR as a camera function, especially for clients and brands in the entertainment sector like Warner Bros and HBO. On a simple level, AR increases the level of engagement: for a moment, anyone can be Aquaman from Justice League or the Night King from Game of Thrones. The technology has definitely helped brands reach a wider range of audiences through these enriched mobile experiences. AR is quickly becoming the new medium for storytelling and advertising because it allows users to fuse reality with make-believe. The result is a user experience that is memorable because it gives users the opportunity to choose what information they access and absorb.

How do you think we can improve on how AR is being used?

Personally, I would love to see AR applied more to mobile learning. I think there are infinite possibilities for innovation in that arena. For instance, we could learn more about an insect while trekking through a forest, or identify the names or even the peculiarities of a certain type of plant or tree through our mobile devices. AR could then be used as a tool to access the vast amount of information out there by using image recognition technologies. This would redefine how we learn, making it a much more spontaneous, immediate, and immersive experience. Learning becomes deeper when you can link it to your own experience.
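As a rough illustration of the image-recognition building block this idea rests on, the sketch below classifies a single photo with a small pretrained network of the kind that can run on mobile-class hardware. The model, labels, and file name are assumptions for the example; a real plant or insect identifier would need a domain-specific model rather than general ImageNet classes.

```python
# Minimal sketch: classify a photo with a lightweight pretrained network,
# the kind of recognition step an AR learning tool could build on.
import torch
from torchvision import models, transforms
from PIL import Image

# Small model suited to phone-class hardware (downloads pretrained weights).
weights = models.MobileNet_V2_Weights.DEFAULT
model = models.mobilenet_v2(weights=weights)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("leaf_photo.jpg").convert("RGB")  # hypothetical input photo
batch = preprocess(image).unsqueeze(0)

with torch.no_grad():
    probs = torch.softmax(model(batch)[0], dim=0)

top_prob, top_idx = probs.max(dim=0)
labels = weights.meta["categories"]  # ImageNet class names bundled with the weights
print(f"Best guess: {labels[top_idx.item()]} ({top_prob.item():.1%} confidence)")
```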

Neon Knights, a two-player cooperative augmented reality music game created by The Mill

What tech innovation(s) are needed to further evolve AR?

The speed of the CPU needs to evolve to offer a more seamless experience for the user. Currently, the synchronization of movement and tracking does happen in real time and is quite accurate, but it is never quite sharp or immediate enough; there is still a lag or dislocation. This difference of milliseconds or millimeters gives the user a strange sense of 'reality'. 2018 will probably bring more precision to this technology, after which we will be able to add virtual content to the user's field of vision without a sense of separation. Only then will we have a truly immersive experience that can blur the line between the real and the virtual.

Jurassic World Facebook AR Effect

Where do you see AR going in 2018 and beyond?

We are on the verge of a new stage of AR/VR progression, where we are beginning to see the integration of high-resolution, complex audio-visual creations with an added sensorial component. I hope that our AR experiences will evolve further and integrate content personalization to make for a more emotionally driven experience that could include, for example, one's favorite memories, interests, people, and music.


See more AR projects in our Augmented Reality Collection.