The last time I attended a
Langara-sponsored digital FX seminar, I was in the middle of a crowded lecture hall at the Vancouver Film School.
Thanks to the coronavirus, these are different times - I'm currently sitting in my spare bedroom in my bare feet with a cup of tea at hand, waiting for a Zoom webinar on
Game of Thrones digital dragon effects to begin.
Although most of the 155 attendees are from North America (based on a quick pre-event survey), guests are attending from around the world. A second poll reveals that we're fairly evenly split between VFX pros, students, would-be students, and Game of Thrones fans - sadly, there was no option for genre fans, so I put myself in with the GOT crowd.
We're running a bit late, but we start at five minutes after the hour. Resolution is average, but there's a lot of that going around right now; even YouTube™ has been limiting bandwidth to deal with the increased stay-at-home demand.
The hosts for the evening* are Tyler Weiss, Visual Effects producer and currently Vice President in charge of Strategic Initiatives at the Langara Centre for Entertainment Arts, and Visual Effects Supervisor Thomas Schelesny, both of whom worked on Seasons 7 and 8 of
Game of Thrones for Image Engine Design, a digital effects company that specializes in animal and creature animation. Ironically, the two first met after Weiss lost an Emmy award for special effects to Schelesny's work on Season Four of
Game of Thrones.
Their presentation begins with a very fundamental question regarding the extensive and complicated Season 8 animation work: how did they get this job done, given the combination of high standards and tight deadlines involved?
Both presenters emphasize the cooperative aspect of the production process behind their success, with a team of 120 people working on
Game of Thrones FX. Image Engine worked primarily on the dragons for the last two seasons of the show, producing 99% of the dragon animation work.
The key to the process was efficiency, given that the final season of the program required the same crew to produce three times as many effects, leading to the fundamental question of "What do we spend time doing that doesn't result in dragons on screen?" This resulted in several basic procedural changes in order to optimize the production process.
As with the previous session on Thanos from the
Avengers movies, the two presenters don't go into the technical details of the production process, but they provide some fascinating insights into the creation of believable fantasy animation.
As an example, the primary references for dragon flight came from passenger jets, given their roughly equivalent sizes - Drogon, the largest of the three dragons, is approximately the size of a 747.
As Schelesny explains, the audience knows how airplanes move, and using them as references connects to their mental image of large objects in flight, "Grabbing onto that part of your mind."
Each dragon had a different flight model based on its size. After Viserion's rebirth under the control of the Night King, a different treatment was required to convincingly reflect the slower, more deliberate flight of a magical dead creature.
The flight cycle for the dragon wings utilized actual animal references, with the upward flap coming from eagles in flight, and the downward flap from bats, whose hunting flight patterns also provided the reference for dragons picking up objects from the ground.
As part of the process of focusing on "getting dragons on screen", the animation rig that controlled dragon motion was simplified to offer fewer, better controls, making it easier for animators to animate the movements and providing smooth preview playback without rendering the figures. The dragon "face rig" was rebuilt as well, making it easier to control every small nuance of the dragon's facial expressions.
Standardized flight cycles provided a signature performance for each dragon in terms of speed, how fast their wings flapped and how high and low they went during both flight and hovering, thereby helping the varied group of animators to stay consistent.
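To make the idea of a "signature" flight cycle concrete, here's a toy sketch of what a parameterized cycle might look like. This is purely my own illustration, not Image Engine's actual rig: the dragon names come from the show, but the function, parameter values, and sinusoidal model are all assumptions.

```python
import math

def wing_angle(t, flap_hz, amplitude_deg, phase=0.0):
    """Wing angle (degrees) at time t for a simple sinusoidal flap cycle."""
    return amplitude_deg * math.sin(2 * math.pi * flap_hz * t + phase)

# Hypothetical per-dragon parameters: the bigger (or deader) the dragon,
# the slower and shallower the flap - echoing the 747 flight reference.
DRAGONS = {
    "Drogon":   {"flap_hz": 0.4, "amplitude_deg": 35.0},
    "Rhaegal":  {"flap_hz": 0.5, "amplitude_deg": 30.0},
    "Viserion": {"flap_hz": 0.3, "amplitude_deg": 25.0},  # undead: slower
}

def pose_at(name, t):
    """Evaluate a dragon's signature flap at time t (seconds)."""
    p = DRAGONS[name]
    return wing_angle(t, p["flap_hz"], p["amplitude_deg"])
```

Standardizing on shared parameters like these is what lets a varied group of animators produce consistent motion: each dragon flaps the same way no matter who animates the shot.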
The presenters then demonstrated how a variety of techniques came together to create a sequence from the first episode of Season 8, where Jon Snow first rides Rhaegal.
The workflow for the sequence began with paintings from the art department that provided the animation department with the visual intent of the scene. A simple cartoon version was created based on the painting to establish the editorial needs of the sequence, and a basic pre-animation was then done in order to establish the correct speed and motion for each shot in the sequence.
The shots themselves relied upon a combination of live action featuring the actors, and digital versions of the dragons in flight. The actors were filmed on a motion base - commonly referred to as a "buck plate" - which was programmed to match the dragon flight from the pre-animation. In cases where the buck plate was unable to match the dragon, camera movements completed the effect.
The initial dragon animation was then refined to match the 3D buck plate shots, the two elements were combined, and the effects were completed by lighting the dragon and fine-tuning the movement of individual dragon parts such as the tail.
For many of the sequences, the production team realized that it was easier to use green-screened practical shots instead of animation, such as the movement of Jon Snow's cape during the dragon ride, which was created by fans blowing on the buck plate rather than adding another animation element to the scene.
Similarly, all the dragon fire was created using real flames that were then composited with the dragons. When necessary, multiple flames were combined to create a larger, more solid flame.
In some cases, the production team initially struggled to achieve the look that they wanted. In the case of the wight attack on Drogon and Daenerys in Episode 3 of the final season, the animators initially used actual stunt performers with greenscreen elements and buck plates in the same manner that they'd produced the dragon riding sequence from Episode 1, but the results didn't sync with the dragon movements.
The group needed a simpler approach to the wight attack, one that would allow the hundreds of wights to match the dragon's movements while holding onto the dragon and each other.
The clever solution was based on a single live action crawl performance by Animation Supervisor Jason Snyman, a performance that was motion tracked to create an animation cycle that was then given to the animators. This allowed for the creation of hundreds of digital wights that could directly interact with each other and Drogon.
Surprisingly, the animators were also able to use the same cycle to make running wights, through the simple technique of "making their feet heavier than their heads". The resulting combination of effects "created the sense of chaos and interactivity that you see in the final shot."
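The instancing trick described above - one tracked performance reused across hundreds of wights - can be sketched in a few lines. To be clear, this is my own simplified illustration of the general crowd technique, not the studio's pipeline; the function names and the per-agent phase offsets are assumptions.

```python
import random

def make_cycle(frames):
    """One looping motion cycle (poses stand in as frame indices here)."""
    return list(range(frames))

def instance_crowd(cycle, count, seed=0):
    """Give every agent the same cycle with a random phase offset,
    so the crowd doesn't move in lockstep."""
    rng = random.Random(seed)
    return [{"agent": i, "offset": rng.randrange(len(cycle))}
            for i in range(count)]

def pose_for(agent, cycle, frame):
    """Look up an agent's pose at a global frame number."""
    return cycle[(frame + agent["offset"]) % len(cycle)]

cycle = make_cycle(24)          # e.g. a 24-frame crawl loop
crowd = instance_crowd(cycle, 300)
```

The appeal of the approach is that a single believable performance buys you an arbitrarily large crowd, with variation coming from timing rather than from animating each figure by hand.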
In addition to dragon and combat animations, the Image Engine team created cloud effects as well. Once again, they relied on airplane and fighter jet references to establish how objects in flight interact with clouds: breaking up the cloud formations, creating turbulence, and producing wingtip contrails and similar vapour effects that helped to create direction, which Schelesny described as "subtle but necessary to sell the effect." The clouds were initially created as high-definition, polygon-based static elements kilometers in relative length; motion was then added to each individual cloud, regardless of whether or not a dragon was in contact with the formation.
The evening concluded with an acknowledgement of the partnerships behind the success of the final
Game of Thrones effects as they appeared on screen. The final shot compositions were completed by the award-winning WETA Digital Studios, located in New Zealand, where all of the additional elements such as water, people and wights were added to the scenes, and the building renders for the destruction of King's Landing were provided by Scanline VFX.
Overall, it was an extremely interesting event, and I have to give the organizers full credit for adapting to the current situation. In fact, as with some of the other changes that COVID-19 has caused, I'd fully support this format for future seminars - all other issues aside, it's certainly nice to have a comfortable seat and lots of elbow room.
- Sid
* It's a bit ironic that two people who do award-winning movie-quality special effects are relying on the standard Zoom background feature. It's like being invited to have lunch at McDonald's with Gordon Ramsay.