Lost Lederhosen Showcases Virtual Production at HPA Tech Retreat Supersession

By Debra Kaufman

Over the years, HPA Tech Retreat’s Supersession has tackled many topics. But this year, for the first time, attendees were treated to a front-row seat at a one-day movie-making experience: The Lost Lederhosen. “Not only did we bring together the most cloud technologies ever assembled into one filmmaking workflow to show what is possible with today’s technology, but the notion of presenting it live in front of hundreds of people was a next-level accomplishment when it comes to conference programming,” says HPA president Seth Hallen. HPA relied on the support and collaboration of many companies who contributed to make the movie a success. “This was the community coming together to do something innovative, rewarded with the sense of having learned and collaborated on something truly unique and evolutionary.”

The 11-minute short film was spearheaded by EFILM’s Joachim “JZ” Zell, who developed the concept, gathered the team and produced it. The Lost Lederhosen was directed by Steven Shaw, ASC and shot by cinematographer Roy Wagner, ASC. According to Zell, the movie highlighted cutting-edge technologies such as instant dailies. “Fast global collaboration in the cloud was mind-blowing and scary at the same time,” he admits. The production also highlighted the growing trend of using virtual production in place of greenscreens to marry real and CG elements in-camera.

Prior to the beginning of the Supersession, The Lost Lederhosen team worked closely with the Westin Mission Hills Golf Resort & Spa hotel staff and Sohonet to ensure that all remote collaboration systems would be a go. The first step was a public pre-production meeting with Zell, Shaw, Wagner, Peter Moss, ASC, ACS; Stargate Studios’ Sam Nicholson, ASC; and 616 Music Lab’s Richardo Mejia. The production used an ARRI Alexa, Blackmagic Design Ursa Mini Pro, Panavision DXL2, RED Monstro2, and Sony Venice to shoot segments of the movie. Several scenes were shot prior to the Supersession, but attendees witnessed the production of three live scenes, with the participation of Zeiss’ Snehal Patel, Nuke artist Caleb Knueven, and RED’s Dan Duran.

For the virtual production scenes, Epic Games’ David Morin explained the role of his company’s Unreal Engine in conjunction with Lux Machina’s integration. Mark Bender Aerials and Associates produced drone shots. Zell and Technicolor’s Josh Pines also laid out the details of the production’s ACES workflow in conjunction with the chosen cameras. CalMan Portrait’s Tyler Pruitt explained monitor calibration, also vis-à-vis ACES, and FilmLight’s Pete Postma demonstrated DIT on-site color correction within the parameters of the ASC CDL (color decision list). Other contributors to production and post included Pomfort’s Patrick Renner, who discussed the use of ASC metadata, and Frame.io’s Patrick Southern and Michael Cioni, who demonstrated instant dailies and editing in the cloud. Colorfront offered a cloud-based integration with Frame.io, making it possible to stream color-consistent camera log HEVC proxy files directly from Frame.io. Colorfront also streamed 4K, 6K and 8K HDR and SDR dailies and uploaded camera originals to S3 storage in AWS. It also provided synced color-graded dailies for viewing and editorial, and delivered 2K, 4K and 8K OpenEXR files in ACES with camera and lens metadata for VFX and color-grading artists. Grading included Pixelworks’ TrueCut motion grading, to minimize motion blur and judder.
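For readers unfamiliar with the ASC CDL mentioned above: it is a small, interoperable set of per-channel grading parameters (slope, offset, power, plus an overall saturation) that lets a DIT's on-set look travel between tools. The sketch below shows the standard CDL per-channel transfer function in Python; the parameter values are illustrative only, not taken from this production.

```python
def asc_cdl_channel(value, slope=1.0, offset=0.0, power=1.0):
    """Apply the ASC CDL per-channel formula to a value in [0, 1]:
    out = clamp(value * slope + offset) ** power
    """
    v = value * slope + offset
    v = min(max(v, 0.0), 1.0)  # clamp to [0, 1] before the power step
    return v ** power

# Illustrative example: lift shadows slightly and add contrast on one channel
graded = [asc_cdl_channel(v, slope=1.1, offset=0.02, power=1.2)
          for v in (0.0, 0.25, 0.5, 0.75, 1.0)]
```

Because these four numbers per channel fully describe a primary correction, the same look can be rendered consistently by the camera pipeline, the dailies system, and the finishing suite.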

The film was edited in Microsoft’s cloud with Avid; Teradici provided the compression/decompression for the remote collaboration. Sohonet provided the remote collaboration for grading with Baselight. Skywalker Sound and Ambidio collaborated on remote sound mixing. BorisFX provided remote VFX, with Pixspan accelerating the movement of data. Amazon’s Jack Wenzinger described the use of Amazon’s cloud workflow in the production, and freelance colorist Zach Medow color-timed the movie with Resolve. Finally, SGO’s Mistika cloud workflow manager created final deliverables. Every movie worth its salt has an outtake reel, and that’s what BeBop Adobe Technology’s David Benson created (with end credits).

At the end of the Supersession, the movie was screened for an enthusiastic audience. Reflecting on the chief challenges of making the cloud-based movie, Zell notes that “humans could become the weak link if our educational system does not keep up with the technical evolution. Events like the HPA Tech Retreat are very important to keep our production and post production community up to speed.

“We need to protect the creatives who need to stay in charge of the production,” he adds. “The director and the DP need to decide which shots are going into a cut, not the studio executive nor the producer nor the editor. If content is available to too many people too fast, wrong decisions will be made.”

“HPA is proud to have delivered this kind of ‘reality-based’ programming to a filmmaking trade conference which is truly the pinnacle of educational content for our industry,” said Hallen. “It was real and raw and provided a glimpse into the challenges and opportunities in filmmaking today.” Will the HPA try this kind of live movie-making next year? “Hell, yes!” says Zell.

