Bickerton was given an eight-terabyte drive of assets from Game of Thrones by HBO that included the Red Keep and King’s Landing. “Miguel wanted it to be dirtier, dustier, grungier than Game of Thrones because we are taking place 130 years before, so there was a lot of smoke, and King’s Landing has a nastier look.” Explains Bickerton, “They had been built by different facilities for each season, so we had about five or six different variations of the Red Keep and King’s Landing. Our Visual Effects Art Director, Thomas Wingrove, brought in the different models, and we came up with our own fully-realized 3D environment because we wanted to be able to come back to it and know where everything was. In Game of Thrones, they tended to add in bits when needed for each episode.”

Around 2,800 visual effects shots were produced for the 10 episodes. “If you’re going to have a character who is 1/10th the screen size of a dragon, then it’s a digital double,” Bickerton states. “We used digital doubles for some of the fast action; otherwise, it’s an element of someone on a motion base if it’s dragon-riding. We tried to shoot an element for everything. All of the actors were scanned to various degrees, depending on how much of their performance is needed. There was quite a lot of face replacement for action and storm sequences.”

Comments Bickerton, “In the tournament at the beginning of Episode 101, there are numerous face replacements. We had to do CG for half the face of King Viserys I Targaryen in Episode 108, towards the end of his final days. We did a lot of 2D warping and distortion to make his neck thinner and get his face to be gaunt. The bit I love is the sheer diversity of the work. There are so many different environments and dragon characters. That’s what I like.”

Data was provided by the NASA Jet Propulsion Laboratory. “The rovers themselves are the most accurate versions of Opportunity and Spirit,” Nair observes.
Wētā took the process a step further to retain Johnson’s natural head motion. “We took Eyeline’s mesh and tried to incorporate that into our full-blown Black Adam digital double,” Stopsack remarks. “We could then take their head motion data and combine that onto our puppet so that the head motion would track perfectly with our digital asset, with our digital head motion. But volume capture gives you limited body motion. If you have pretty intricate body motion, your head motion can quickly go off what the volume capture would allow, such as if the head goes backwards and you want an extreme that it won’t permit. So, our animators would then see those constraints and work within them to see how far we could bend the head back without going beyond what volume capture could support,” preventing the bend from appearing too rubbery, unlike a real person’s movement. Stopsack adds, “We used the technology for a small number of shots, but it was great when you needed the unmistakable likeness of the actor.”

There were no static 2D matte paintings, as the camera always had to be fluid. “The trick was to always have atmosphere-like particles in the air,” Bickerton reveals. “I remember working on our first environment and asked, ‘Should we add some birds?’ And it worked. They were small in frame but were a key element in bringing life to the shot.”