This year marked the 20th anniversary of FMX in Stuttgart, Germany. In attendance for the occasion were ArtFx students Estelle Lagarde (specializing in VFX) and Franck Menigoz (specializing in 3D Animation). When they weren’t working as ambassadors of ArtFx – manning a booth and presenting the school to an audience of around 100 people – they were attending conference presentations by various artists and teachers from within the field.

Below are summaries of the presentations that left the most lasting impression on them.

 

 

The Future of Filmmaking: High Dynamic Range (HDR), Wide Color Gamut (WCG) and Calibrated Color Acquisition

 

How can we use HDR and WCG to tell a story? What are the technical challenges, and how can post-production studios prepare their workflows for them?

Speakers: Bernd Eberhardt, Jan Fröhlich, Stefan Grandinetti & Andreas Karge (Stuttgart Media University)

 

Ultra High Definition (UHD), High Dynamic Range (HDR) and Wide Color Gamut (WCG) are becoming more and more prevalent in the video and cinema markets.

However, the improvements these technologies provide go well beyond marketing strategies: they offer very real benefits for visual artists.

Current television screens use Standard Dynamic Range (SDR), which supports a range of brightness clearly inferior to what the human eye is capable of discerning. Cameras, likewise, cannot capture the full palette of colors the eye perceives. Thanks to these new technologies, television will be able to produce images much closer to reality.

One of the key objectives of Ultra High Definition Television (UHDTV) is to give the viewer the feeling of “being there” and of witnessing “reality”. Increasing the resolution alone is no longer a satisfactory way to achieve this goal if the capture and display of the content do not also offer a wider range of brightness and contrast, as most cameras nowadays already do.

 
 

Calibrating screens and cameras

 

The brightness and range of colors a display can reproduce differ from one device to another, and can be considerably different from what the camera captured. Capturing content with a larger range of colors and contrast offers filmmakers and artists a pool of data from which they can draw out the details of the desired image in post-production. However, it is neither desirable nor practical to work with so many display possibilities.

What is more, certain screens rely on a range of colors different from the one the artist worked in. To resolve this problem, calibrated color acquisition across the different devices has become necessary. This calibration can be applied either during encoding or when the file is received.

This normalization of content would make it possible to adapt the dynamic range and color gamut to the chosen display medium. However, the process is still in the testing phase and carries a significant margin of error that must be resolved before it reaches the market.
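To make the idea of adapting content to a display’s capabilities more concrete, here is a minimal, purely illustrative sketch of compressing HDR luminance down to an SDR screen. This is not the calibration method described by the speakers: the Reinhard-style tone curve, the 100-nit SDR target and the gamma value are all assumptions chosen for the example.

```python
# Illustrative sketch only: normalizing HDR linear light for an SDR display.
# The tone curve, the 100-nit SDR target and the gamma value are assumptions.

def reinhard_tone_map(luminance_nits, sdr_peak_nits=100.0):
    """Compress HDR luminance (in nits) into the 0..1 SDR range."""
    normalized = luminance_nits / sdr_peak_nits
    return normalized / (1.0 + normalized)  # simple Reinhard curve

def encode_gamma(linear_value, gamma=2.4):
    """Apply display gamma so the value can be shown on an SDR screen."""
    clamped = max(0.0, min(1.0, linear_value))
    return clamped ** (1.0 / gamma)

# Example: a 1000-nit HDR highlight mapped onto an SDR display
hdr_highlight = 1000.0
sdr_value = encode_gamma(reinhard_tone_map(hdr_highlight))
print(f"SDR display value: {sdr_value:.3f}")
```

In a real workflow this mapping would be driven by the calibration data of the target display rather than by fixed constants, which is precisely why the normalization step described above matters.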

 
 

Pipeline for Feature Film – VFX

 

Speaker: Hannes Ricklefs (MPC)

 

 

This presentation gave us a peek at the VFX pipeline used on a production at MPC. Cross-referencing several examples from MPC’s productions, Hannes Ricklefs showed us the importance of a unified pipeline, especially in a globally collaborative environment.

Over the last year, MPC has been the main studio for Guardians of the Galaxy, X-Men: Days of Future Past and Maleficent and they are currently working on Disney’s The Jungle Book, Batman vs Superman, Terminator Genisys, The Fantastic Four and The Martian.

 
 

Productions that keep growing…

 

The pipeline of a VFX film is a long and laborious process that interlocks pre-production, production and post-production, and crosses tasks as varied as previz, modeling, rigging, lighting, rendering, matte painting and compositing. Hannes Ricklefs explained that it is also difficult to keep everything optimally synchronized given the geographic distances and time differences between the different studio locations. So, for increasingly large and decentralized productions, Hannes Ricklefs (Head of the Pipeline Department at MPC, London) has had to rethink the workflow they had been using up until now. In addition, close collaborations, like the one with Framestore on Guardians of the Galaxy, required him to rearrange the workflow to accommodate both studios and allow an easier exchange of information, since each studio has its own proprietary software and pipeline structures.

 
 

Adapting workflows

 

The former system of data management at MPC was based on incrementing versions of 3D models, rigs, matte paintings, etc. within a hierarchical structure organized by scene and subject. It rapidly became evident that this would not be sufficient to manage the complexity required by projects like Guardians of the Galaxy or The Jungle Book.

To deliver the required volume of work within a short period of time, the Pipeline team at MPC developed a new structure based on shared practices and working conventions between the different studios (internal or external), an asset management system with automated data handling, and synchronized workflows between departments. The focus was on efficient management of scenes and versions, which was necessary to guarantee quick delivery of high-quality images while still allowing artists some flexibility within the limits imposed by their own shots.
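To illustrate the idea of versioned asset publishing across departments, here is a minimal, hypothetical sketch. It is not MPC’s proprietary asset management system; the class, field and asset names are invented for the example.

```python
# Hypothetical sketch of versioned asset publishing, for illustration only.
# This is not MPC's proprietary system; all names here are invented.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AssetStore:
    # Maps (asset_name, department) -> list of published version numbers
    published: dict = field(default_factory=dict)

    def publish(self, asset_name: str, department: str) -> int:
        """Record a new publish and return its auto-incremented version number."""
        key = (asset_name, department)
        versions = self.published.setdefault(key, [])
        new_version = (versions[-1] + 1) if versions else 1
        versions.append(new_version)
        return new_version

    def latest(self, asset_name: str, department: str) -> Optional[int]:
        """Return the latest published version, or None if never published."""
        versions = self.published.get((asset_name, department))
        return versions[-1] if versions else None

# Example: modeling and rigging publish independent versions of the same asset
store = AssetStore()
store.publish("hero_creature", "modeling")   # -> version 1
store.publish("hero_creature", "modeling")   # -> version 2
store.publish("hero_creature", "rigging")    # -> version 1
print(store.latest("hero_creature", "modeling"))  # 2
```

The point of such a structure is that each department can iterate on its own versions while downstream departments always know which published version they are consuming, which is what makes synchronized workflows between sites possible.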

With a workflow that allows this kind of flexibility between disciplines and studios, MPC can now draw on techniques developed during previous productions and use them as solid starting points to produce, more efficiently, work that surpasses expectations.