The actor & the creature, part 4: ‘Computer pajamas’: The History of ILM’s IMocap
July 4, 2021
When most people think about the motion capture of actors, they are probably thinking about either optical capture (where cameras pick up markers on a suit) or inertial capture systems (which make use of magnetometers, accelerometers and gyroscopes inside a suit).
But another kind of capture, image-based capture (sometimes dubbed ‘faux cap’) is now also very common in helping to re-create on-set performances by utilizing suits with specific tracking markers on them and sometimes multiple witness cameras to help triangulate the performance in 3D space. A pioneer in this area was ILM, which launched its patented IMocap technology during the second Pirates of the Caribbean film, Dead Man’s Chest, in 2006.
befores & afters spoke to John Knoll and Kevin Wooley about ILM’s journey with IMocap, how it was conceived, and how it has evolved.
IMocap: the beginnings
On the first Pirates of the Caribbean movie, The Curse of the Black Pearl, ILM had, among several other effects challenges, realized a number of fully CG skeletal pirates using a combination of optical motion capture and keyframe animation. Visual effects supervisor John Knoll, now chief creative officer at ILM, and animation supervisor Hal Hickel felt that something different was needed for the sequel, especially to realize the tentacled star Davy Jones.
“The original inspiration for IMocap came out of a conversation that Hal Hickel and I had with director Gore Verbinski on the first Pirates of the Caribbean film,” relates Knoll. “Gore had told us that he intended Davy and crew to be all digital characters on the second film in the series. From the team’s experiences on Pirates 1, the manual matchamation was time-consuming and not particularly accurate. We had also made a failed attempt at doing optical mocap later to match, but Gore rejected it because we were essentially substituting a different performance into the shot.”
Since Davy and crew were always CG and didn’t have to make transitions like the pirates in the first film, Hal, recalls Knoll, thought ILM could put tracking marks on the performers to help get better matchamation.
“I remembered seeing a demo from ILM’s R&D department on how they did a two-camera mocap solve on Pearl Harbor for tiny digital stuntmen tumbling off the deck of a sinking battleship,” says Knoll. “That got me thinking about a similar witness camera approach, where we suit a performer with tracking marks of some kind, and shoot the performance with witness cameras to triangulate positions.”
A meeting was set up with ILM’s R&D group to lay out the challenge for some kind of on-set system that could help with matchamation. “The goal,” notes Knoll, “was to get close to optical motion capture quality, but with a small footprint on set, and able to handle difficult location shooting conditions.”
The stipulation was that there would be no constraints on cameras or lighting (visual effects artists will know that there has traditionally been a stigma associated with VFX taking up precious time on set). “We wanted to come up with a performance capture system that had an extremely small on-set footprint and was portable, since we knew that the upcoming films would involve complicated sets and a great deal of location work,” adds Knoll.
How IMocap works
It’s important to note that IMocap is not just about the suit (more on the suit, though, below). The system is heavily based on extracting data from the plate and witness cameras. “The original concept was to use ILM’s Academy Award-winning tracking and matchmoving software, M.A.R.S., to see if we could solve the motion of a skeleton,” explains Knoll. “We could already track and solve rigid objects with M.A.R.S. and the idea was to attach rigid, three-dimensional objects of known size and shape to the performer for tracking, and translate their motion into the motion of a simple, mocap-style skeleton. These objects are what became the ‘ILM IMocap Bands.’”
Indeed, the original architect of IMocap was ILM R&D supervisor Steve Sullivan (who, along with engineers Kevin Wooley, Brett Allen and Colin Davidson, received a Scientific & Engineering Technical Achievement Academy Award in 2010 for IMocap).
“Steve has a PhD in computer vision and was the original author of the first version of ILM’s M.A.R.S. tracking system, for which he also won an AMPAS Technical Achievement Award in 2001,” says Knoll. “Colin Davidson was one of the main developers of IMocap. Colin, who also has a PhD in computer vision was responsible for the core solver that was used on Pirates 2. Brett Allen, another computer vision PhD, was also one of the core engineers in the development after Pirates 2, and also developed ILM’s ‘IMocaptain’ tool. Kevin Wooley was one of the system’s champions from the beginning. As a general motion capture expert, he was IMocap’s main ‘power user’ and was instrumental in the design of the overall system, including the design of the bands.”
Mike Sanders was also a key person involved in the development of IMocap, leading the hardware/prototype/production side of things, and in particular the design of the suits and bands. He would also go on to supervise several IMocap productions at ILM.
Now, back to the IMocap Bands. The R&D team at ILM realized that, by using rigid objects of known size and pattern that could be modeled in software, it wasn’t necessary for each ‘marker’ to be visible from multiple camera views. This is, of course, where IMocap differs from most optical mocap systems.
“Unlike tracking markers with an optical mocap system,” states Knoll, “which requires a marker to be visible from multiple cameras in order to triangulate its position, if enough points around a band are visible, even from only a single view, we can solve its position and orientation in 3D space.”
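The geometric core of solving a rigid object's pose can be illustrated with a rigid-registration step: given the band's known 3D model points and estimates of where those points ended up, recover the rotation and translation that align them. The sketch below uses the standard Kabsch algorithm in Python. It is a simplification for illustration only (ILM's actual solver fits pose directly to 2D image observations, which is a harder perspective-n-point problem), and all names are hypothetical:

```python
import numpy as np

def rigid_fit(model_pts, observed_pts):
    """Recover rotation R and translation t such that
    observed ≈ R @ model + t, via the Kabsch algorithm (SVD).
    model_pts, observed_pts: (N, 3) arrays of corresponding points."""
    cm = model_pts.mean(axis=0)      # centroid of the band model
    co = observed_pts.mean(axis=0)   # centroid of the observations
    # Cross-covariance of the centered point sets
    H = (model_pts - cm).T @ (observed_pts - co)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection sneaking into the solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = co - R @ cm
    return R, t
```

Because the band is a rigid object of known shape, a handful of well-tracked points pins down all six degrees of freedom of its pose; that is the property that lets a single camera view suffice.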
Ultimately, the bands would be placed around each body part of the actor; the upper arm, lower arm, upper leg, lower leg, waist, shoulder, head, chest. Then each band was modeled as a piece of geometry wrapped around an oval. The result – as Knoll and Hickel had hoped – was that ILM could track the motion from a minimal number of cameras, greatly reducing the footprint on set.
The Pirates 2 (and 3) experience
Davy Jones actor Bill Nighy, and other actors playing his crew, wore IMocap suits during filming on Dead Man’s Chest and At World’s End (the two sequels were filmed back-to-back), which took place all the way from controlled sound-stages to sunlit and water-affected outdoor shoots. “For most of the Pirates films,” details Knoll, “we only had one or two witness cameras and the plate camera. Many shots were solved from a single camera view.”
The IMocap suits featured bands with white dots on black squares. This was chosen, says Knoll, because “we knew that the pattern would often be out of focus or motion blurred, and using a circular dot meant that it would still be trackable even when out of focus or blurred.”
Knowing that the shooting environments for Pirates 2 and 3 would sometimes be very dark, ILM originally employed LEDs under the white dots to make them glow in the dark. “This worked very well in tests, but on set it became impractical with the number of characters we were dealing with,” says Knoll. On Van Helsing (2004), ILM had designed and built custom active IR tracking markers for its performance capture suits. However, given the scenarios expected in Pirates 2, and to simplify the system, it was decided to forgo the LEDs and simply find the most high-contrast material possible, with a matte finish to avoid specular highlights.
“It also became clear in early tests,” says Knoll, “that we would need some visual distinction in the pattern. With only white dots on black squares, it was often hard to track something through a plate, so we created a repeating pattern of squares which was ‘white on black’, ‘black on white with bars’, ‘white on black’, ‘black on white without bars’. The tracking was still difficult, but by counting the squares it was possible to keep things consistent.”
To prevent the bands from stretching and shearing, which would distort the dot pattern too much, and to preserve the cylindrical model, ILM chose to make the bands out of a rigid material that could bend, but did not stretch. Meanwhile, the suit itself would be grey. “I selected a grey color for the fabric of the suit to avoid polluting the bounce lighting onto another character in the scene or parts of the set with the color from the suit,” reveals Knoll. “Another benefit that the team realized only after we were in production, is that the grey color provided great lighting reference. Better than a grey sphere.”
So, that covers the suit and the tracking/matchmoving approach. The final piece of the puzzle for Pirates 2 and 3 was a custom capture rig. It utilized two high-speed machine vision cameras as witness cameras. Explains Knoll: “These cameras were black and white, global shutter, and could run at any number of frame rates and resolutions. They could be synced to the film (yes, film!) camera, and were usually run at 48 or 72fps.”
The IMocap workflow on Pirates 2 and 3
So, how did, and does, IMocap move through ILM’s visual effects process from on-set to final animation? (Remember, too, that a significant animation component still exists in any IMocap capture.) A huge effort was undertaken on Pirates 2 and 3 to establish the workflow, and many of the decisions made then still apply today. One of those decisions was to ensure that multiple people could be captured simultaneously.
“Prior to production [on Pirates 2 and 3],” recalls ILM senior R&D engineer Kevin Wooley, “we were assured that there would never be more than three actors in IMocap suits on screen at one time. On our first day of shooting, every member of Davy Jones’ crew was on set in IMocap suits! Fortunately we designed the system to be very resilient and accommodating for these types of situations. For both Pirates 2 and 3, every member of the cursed crew was dressed in an IMocap suit, whether shooting on stages or on an island in the Caribbean.”
Wooley remembers that Nighy took to referring to the suit as his ‘computer pajamas.’ “He had just finished shooting Underworld, and at one point said something to the effect of, ‘I may look a tad silly, but at least I’m comfortable and I didn’t have to sit in makeup for four hours.’”
The custom capture system, which was designed to be portable, initially proved to be too cumbersome to use on most locations. So it was retained predominantly for shooting on stages in Southern California, while, on location, a couple of prosumer video cameras were used as reference.
Meanwhile, the suits were able to go anywhere and everywhere on the Pirates 2 and 3 locations. “We had to keep making more and more band material, eventually sending rolls of the stuff out to location where the costume dept kept refreshing the bands as they got trashed,” relates Wooley. “The maintenance of the suits and the bands was a challenge in itself.”
For each set-up, it became important to be consistent, but ILM was learning all the time about what was important to capture, and how quickly it could do it. “Since our witness cameras were prosumer models with zoom lenses, which we knew we would have to matchmove in post for them to be useful, we also tried to capture a lens calibration object for every setup, every time we moved the cameras,” says Wooley. “This was incredibly useful when we had it, but it was really hard to get.”
“So,” continues Wooley, “through the course of the production, it became clear that we were going to have to be way more flexible in post than originally anticipated. You’ll never get an ‘ideal’ setup with all the data and the best camera views, and since the bands were not permanently attached to the suits and moved all the time, we knew we were going to have to ‘re-calibrate’ things on a per-shot basis. Flexibility, and the ability to work with less-than-ideal data, is just as important as the best technology.”
Back at ILM, the first task became managing all the extra data they now had for each set-up in which they would be adding a digital character. “We hadn’t really ever done a show with witness camera footage for every shot,” notes Wooley. “Editorial had to figure out how to handle all this footage and synchronize it. The synchronization was particularly challenging when trying to line up 24fps film footage with witness cameras that were running at 59.94.”
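That kind of synchronization amounts to mapping time between two different frame rates. A minimal sketch of the idea, assuming a measured time offset between the film and witness recordings (the function name and parameters are hypothetical, not ILM's actual tooling):

```python
def witness_frame(film_frame, film_fps=24.0, witness_fps=59.94, offset_s=0.0):
    """Map a film frame index to the nearest witness-camera frame.
    offset_s is the time (seconds) by which the witness recording
    leads the film camera, e.g. measured from a clap or sync slate."""
    t = film_frame / film_fps + offset_s  # timestamp of the film frame
    return round(t * witness_fps)          # nearest witness frame index
```

Because 24 does not divide evenly into 59.94, the mapping drifts continuously rather than repeating on a short cycle, which is what made lining up the footage so fiddly in practice.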
Then, a team of ‘trackers’ at ILM tracked all the dots and markers on the bands of the actors in every shot. Some of this could be achieved automatically, but artists at the studio often went through a very manual tracking and clean-up process in preparation for each shot. “One of the strengths of the system,” advises Wooley, “is that we track right on the captured footage and a user can intervene when necessary. To add to the fun, we tracked the footage at 60fps for most shots.”
Wooley says a small team of ‘super-users’ who had figured out how to solve the motion went on to train the rest of the layout team in the process. Here, he breaks down what that involved, step-by-step:
1. Build simple digital doubles of the actors, based on a reference photo session. This included low-res geometry (no scans), skeletons, and placement of the bands.
2. Synchronize the witness cameras and the plate camera. We would actually load a 60fps version of the film plate in order to more accurately sync it with the witness footage!
4. Matchmove all available cameras! For the witness cameras to be useful, they have to be perfectly registered to the plate camera. You also couldn’t cheat the matchmove in any way.
4. Hook up the tracked markers to the bands that are part of the digi-double.
5. Re-solve the bands to the skeleton on a few frames of the shot. This was a key part of the process. Since the bands moved and changed all the time, we could never rely on the ‘default’ calibration. So the artist would match the digi-double on a few frames and then use our solver to re-solve the bands to a tighter fit.
6. Solve the motion! Once everything was setup, we could finally solve the motion of the skeleton. Artists could also go in and set a few ‘helper’ poses, lock or smooth certain channels, and re-run the solve to improve the results.
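As a toy illustration of the final step, translating solved band orientations into skeleton motion, the angle of a hinge joint such as the elbow can be estimated from the long axes of two adjacent bands. This is a gross simplification of a full skeleton solve (which fits all bands across all frames, with constraints and artist-set helper poses), and the function below is hypothetical:

```python
import numpy as np

def joint_angle(parent_axis, child_axis):
    """Angle in radians between the long axes of two adjacent solved
    bands, e.g. the upper-arm band vs. the forearm band for the elbow."""
    a = np.asarray(parent_axis, dtype=float)
    b = np.asarray(child_axis, dtype=float)
    a /= np.linalg.norm(a)
    b /= np.linalg.norm(b)
    # Clip to guard against floating-point drift outside [-1, 1]
    return float(np.arccos(np.clip(a @ b, -1.0, 1.0)))
```

A real solver would treat each joint's rotation as unknowns in a least-squares problem over every band observation, rather than reading angles off pairwise like this.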
“The process, as you might guess, was quite labor-intensive,” admits Wooley. “But the results were impressive, and the technique was used all the way through both shows, on hero shots of Davy Jones, shots with the entire crew standing close together, and dramatic sword battles. By the end of Pirates 3, most of the layout crew had worked on IMocap shots and developed their own techniques. Some artists would do a lot of tracking and rely on the solver, while others would use a combination of matchanimating, tracking and solving.”
Since Pirates – how IMocap has evolved
IMocap remains a large part of many ILM productions. The suit design has gone through several revisions since the Pirates 2 and 3 experience, and the software and workflows have also been updated.
“The first improvement to the suit, actually made on Pirates 3, was to get our suits from a vendor who specializes in mocap suits,” says Wooley, “rather than relying on the costume department of each show. The suits are now similar to normal mocap suits, made entirely out of a soft material that is easier to attach the bands to, and dyed a color that works for the character. To this day we use the same vendor to provide our IMocap suits.”
For more information, read on to learn about IMocap’s contribution to Marvel’s Iron Man franchise.
Related article by Ian Failes, on beforesandafters.com.