Chad Eikhoff, TRICK 3D founder and director of the Kidoodle TV and Discovery Kids animated series about Santa’s North Pole reality talent show, discusses the challenges of producing episodic content – in a pandemic – with a real-time animation production pipeline using Unreal Engine and motion capture.
Expanding upon the Christmas motif that began with immersive media studio TRICK 3D’s holiday special The Elf on the Shelf: An Elf’s Story, The Jolliest Elf is a real-time 3D animated series that aired this past December on Kidoodle TV and Discovery Kids. The 12-episode show, which revolves around Santa holding a reality talent competition at the North Pole, is hosted by Mr. Jingles (Chad Eikhoff), with contestants ranging from sassy country singer Dottie (Tyra Madison) and smooth R&B toymaker Wedge (Teddy Obot) to adorable rapping reindeer Lil’ Rey (Macyn “Mac Sauce” McMillian).
The decision to move full force into real-time animation for the show was driven in large part by the pandemic. Eikhoff, the show’s director, producer, and TRICK 3D founder, notes, “The pandemic precipitated the move of going headfirst into real-time animation, which we have been playing with for a long time.”
Atlanta-based TRICK 3D was founded 15 years ago by Eikhoff and specializes in character animation, commercials, product visualization, virtual photography, and interactive/VR. “We created a VR camera tool for commercials and virtual photography which I leveraged when directing The Jolliest Elf,” he says. Stories involving elves and Christmas capture the spirit of what drove Eikhoff to produce animation. “Personally, my own creative passion leans more towards bringing light, beauty, whimsy and a child-like wonder like what we tapped into when working on The Elf on the Shelf, which is the reason why I got into animation in the first place.”
As part of their ongoing real-time animation work, TRICK 3D applied for an Epic MegaGrant. “Some tests were done for Zayden’s Wish, a VR experience that we did with Make-A-Wish Georgia; it was the first opportunity to put a character into Unreal Engine,” explains Eikhoff. “I had also tested inside of Unity at the same time and decided that I liked the character pipeline going through Unreal Engine. Epic Games had already announced the first round of MegaGrants so I went through their website process and submitted specifically to do the pilot of The Jolliest Elf. I hadn’t seen a lot of energy put into a stylized cartoon look and that’s what I wanted to do. Epic Games approved the pitch, which got moved into a group that was focused on trying to enhance animation inside of the engine.” The grant provided financing as well as access to Unreal Engine technical expertise. “The money covered the cost of the pilot and Epic Games was 100 percent open with access to knowledge, along with putting the completed pilot on its blog.”
Having a completed pilot was essential in making The Jolliest Elf a reality. “As we were talking to co-production partners and distributors, the pilot allowed us to have something to show as opposed to just talking about it,” notes Stacy Shade, Head of Studio, TRICK 3D. “I don’t know if we could have gotten a greenlight on Discovery for Season 1 if we did not have that pilot, which was enabled by the MegaGrant.”
The concept centered on a reality show with 10 characters presented over the 12 episodes. “I wanted the pilot to stand alone as a short film, so it was one character within the construct of a reality show and the backstory of where the character came from,” remarks Eikhoff. “Moving into Season 1, we did less of that and focused more on the music.” The animation style was never meant to be photorealistic. “The visual styling was meant to be simplified, with more of a Christmas special energy to it, as opposed to a high-end feature film.” The backgrounds took a cue from the 1964 TV movie Rudolph the Red-Nosed Reindeer. “It’s this mental trick where your mind sees the bright colours of the wardrobe and paints that into the background [which is in fact a flat grey],” he adds.
Characters were designed in 2D and quickly modelled into 3D. According to Eikhoff, who studied traditional 2D animation before the arrival of Maya, “I like iterating things as we’re working, so part of what I was doing with The Jolliest Elf was trying to revisit what is animation and how it’s done? As I came into 3D animation, I was applying what I knew from 2D animation, as were all of the people creating the tools. Pixar, DreamWorks Animation and Blue Sky Studios established what a 3D animation pipeline is. When you introduce facial and body motion capture, real-time rendering, and virtual cameras, which are easy and accessible tools, if you were to reset and say, ‘Let’s do animation starting with these tools,’ what would that pipeline be? Part of that was guiding the creation of the characters as well. If we’re going to be driving the facial animation with iPhone facial capture, what is the best way to fit that into the pipeline and when do you adjust those pieces to start to find the voices? We’re still working that out.”
One major challenge for real-time animation production is embracing a new pipeline where animators aren’t pushed into an assembly line process where they are cleaning up mocap and trying to add character through keyframe animation. “There is a hybrid world that a lot of production companies are going into,” observes Eikhoff. “But we have a larger opportunity where animators come on and use these tools as part of the animation process as opposed to animators being cleanup artists.”
“Because of the pandemic, there were occasions on the show where motion capture had to be done remotely at the homes of the actors,” he continues. “Sometimes they’d do a normal recording of their voice and we would use our own faces for the first layer of animation, then tweak it with keyframes. In the pilot, Mac Sauce recorded herself as Lil’ Rey using the Unreal Live Link Face app on her dad’s iPhone and sent us that file. We were able to import that and make it work. For the body, Noitom has a Perception Neuron mocap suit which can go over regular clothes and work with a laptop. We were able to transport and use that suit in multiple environments, including homes.”
A custom Maya script was written to convert the raw data from the Unreal Live Link Face app into animation data that could be used in both Maya and Unreal. “Unreal Live Link Face is great at capturing normal human facial expressions, but mapping them onto cartoon characters requires keyframe animation,” Eikhoff shares.
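TRICK 3D hasn’t published the script itself, but a minimal sketch of that kind of converter might look like the following, assuming the Live Link Face app’s per-take CSV export (a timecode column, a blendshape count, and then the ARKit blendshape weight columns) and a character rig whose blendShape targets match, or are remapped from, the ARKit names. The node, file, and function names here are hypothetical, not the studio’s actual tool.

```python
# Sketch: convert a Live Link Face take (CSV) into Maya blendshape keyframes.
# Assumes the CSV header is roughly "Timecode, BlendShapeCount, <ARKit names...>"
# and runs inside Maya's Python interpreter.
import csv
import maya.cmds as cmds

CAPTURE_FPS = 60.0  # Live Link Face records at up to 60 fps; match the take

def timecode_to_frame(timecode, fps=CAPTURE_FPS):
    """Convert an 'HH:MM:SS:FF.sub' timecode string to a frame number."""
    hours, minutes, seconds, frames = timecode.split(":")
    total_seconds = int(hours) * 3600 + int(minutes) * 60 + int(seconds)
    return total_seconds * fps + float(frames)

def import_face_take(csv_path, blendshape_node, remap=None):
    """Key each ARKit blendshape column onto the matching Maya target."""
    remap = remap or {}  # e.g. {"JawOpen": "mouth_open"} for a cartoon rig
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = [name.strip() for name in next(reader)]
        start = None
        for values in reader:
            row = dict(zip(header, values))
            frame = timecode_to_frame(row["Timecode"])
            start = frame if start is None else start  # re-zero the take
            for column, value in row.items():
                if column in ("Timecode", "BlendShapeCount"):
                    continue
                target = remap.get(column, column)
                if cmds.attributeQuery(target, node=blendshape_node, exists=True):
                    cmds.setKeyframe(blendshape_node, attribute=target,
                                     time=frame - start, value=float(value))

import_face_take("lilrey_take_003.csv", "lilRey_blendShapes")
```

From there, the keyed curves can be edited like any hand-animated pass (the “tweak it with keyframes” step Eikhoff describes) before the character is brought into Unreal.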
An Unreal Engine tool was essential in establishing the camera system. “Sequencer is my happy place!” he laughs, adding, “It’s like a 3D nonlinear editor because not only do you see it in clips where you edit the shots, but then you can click into it and change the camera move or the focus of the camera, or lighting, or what’s in the scene on the timeline.” Noting that the initial idea was to mimic a live reality show camera setup, Eikhoff goes on to say, “Art David [creative producer] had set up some of those live events in the past, so he knew exactly where every camera would go and set it up that way. The only downside to it is we didn’t have a camera person on every one of those, so we had to go in and manually move them around. I ended up changing that approach by creating specific cameras for particular shots so I could move the cameras in an intuitive manner that felt right.”
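The show’s Sequencer work was done interactively in the editor, but as a rough sense of what such a camera system amounts to, here is a sketch using Unreal’s editor Python API (UE 4.26-era calls) that builds a Level Sequence, binds a spawned cine camera, and adds the camera-cut track that drives the edit. The asset paths and names are hypothetical, and exact binding calls vary by engine version.

```python
# Sketch: build a Level Sequence with a bound cine camera and a camera-cut
# track via Unreal's editor Python API. Asset names are illustrative only.
import unreal

# Create the Level Sequence asset (hypothetical path/name)
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
sequence = asset_tools.create_asset(
    asset_name="SEQ_Audition_Ep01",
    package_path="/Game/JolliestElf/Sequences",
    asset_class=unreal.LevelSequence,
    factory=unreal.LevelSequenceFactoryNew())

# Spawn a cine camera in the level and bind it into the sequence
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(0.0, 0.0, 150.0))
camera_binding = sequence.add_possessable(camera)

# Give the binding a transform track so the camera move can be keyed
transform_track = camera_binding.add_track(unreal.MovieScene3DTransformTrack)
transform_section = transform_track.add_section()
transform_section.set_range(0, 240)  # frames

# Add a camera-cut track so this camera drives the edit for frames 0-240
cut_track = sequence.add_master_track(unreal.MovieSceneCameraCutTrack)
cut_section = cut_track.add_section()
cut_section.set_range(0, 240)
cut_section.set_camera_binding_id(
    unreal.MovieSceneSequenceExtensions.make_binding_id(
        sequence, camera_binding, unreal.MovieSceneObjectBindingSpace.LOCAL))
```

Even when scripted this way, each cut stays a live object on the timeline, which is what lets a director click into a shot after the edit exists and adjust the camera move, focus, or lighting, exactly as Eikhoff describes.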
The original soundtrack, featuring 10 songs, has been released via Spotify. “I was about halfway through production of Season 1 when I realized that we were not only producing a show, but also making a Christmas album,” reveals Eikhoff. “That is a good takeaway for Season 2. The whole show is reliant on those songs and the talent performing them is great. We spent a lot of energy on that for Season 1. My goal is to make songs that can stand the test of time and find their way into becoming Christmas songs that are played every year. One of the interesting things in the concept of the show is that each character represents a different genre. I wanted the North Pole to represent a convergence point of the entire world.”
“Season 1 was about figuring out what was possible and veering away from the traditional animation pipeline,” he says. “We were able to accomplish that and feel great about the show that we made. Season 2 is about improving every step of the process. 3D animation is proprietary-driven, but for this to move forward quickly in real-time animation, the more we share on what’s working and not working regarding the process, the better. I am working to catalogue a lot of my learnings to share publicly moving forward, and ideally get more people involved in the discussion.”