The visual effects company RVX played a major role in producing content for the movie Everest. RVX's CTO reveals the studio's big data management challenges during the production.
The movie Everest, released in September 2015, was based on the real events of the 1996 Mount Everest disaster. The film, starring Jake Gyllenhaal, Josh Brolin and Jason Clarke, focuses on the survival attempts of two expedition groups.
Excellent visual imagery and photography were critical components of Everest. Increasingly, the development of visual imagery depends upon how well an animation and visual effects producer manages its big data.
SEE: Movie animation firm's big data challenges present lessons to learn (TechRepublic)
RVX, an Icelandic visual effects company, played a major role in the development of visual effects for Everest. During the visual effects development process for the film, developers worked with multiple media sources on multiple tracks, editing them together into a single work by cutting, copying, pasting, and adding effects. All of these edits were then mixed and rendered into a final file: a big data compilation of sounds, effects, and images.
The average file sizes for movies are 4 GB (DVD), 8 to 15 GB (HD), and 20 to 25 GB (Blu-ray). Movie and visual effects rendering files are large, cumbersome to store, and difficult to manage; their size also makes it a challenge to transport them over the internet.
All of these data storage, management, and transport issues were elements that RVX had to contend with during Everest.
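The transport challenge is easy to quantify with a back-of-the-envelope calculation. As an illustrative sketch (the link speeds below are assumptions for the example, not RVX's actual connection), transfer time is simply the file size in bits divided by the link bandwidth:

```python
def transfer_hours(size_gb, link_gbps):
    """Hours needed to move size_gb gigabytes over a link of link_gbps gigabits per second."""
    seconds = size_gb * 8 / link_gbps  # 8 bits per byte
    return seconds / 3600

# A 25 GB Blu-ray-sized render over a 1 Gbps link takes a few minutes,
# but the full 150 TB (150,000 GB) Everest data set would take roughly
# two weeks at 1 Gbps and well over a day even at 10 Gbps.
print(round(transfer_hours(150_000, 1), 1))   # hours at 1 Gbps
print(round(transfer_hours(150_000, 10), 1))  # hours at 10 Gbps
```

Numbers like these are why a nearby data center and dedicated fiber connections, rather than the public internet, were the practical choice.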
"On Everest, we had 150 terabytes of data under management," said Rui Gomes, RVX's CTO. "We spent many months rendering images, and in some cases, such as scenes where we had to show the snowfall and how snow was accumulating, we had to do significant work in order to capture the realism of the situation."
This meant that developers needed fast and immediate access to the visual effects files they were working with.
RVX chose to host its data at Verne Global, a data center in Iceland, because of its geographic proximity and because point-to-point fiber connections between the company and the data center could facilitate rapid data transfers.
"We also used a system where the latest versions of visual effects files were stored onsite in easy-access cache," said Gomes. "Each night, the files held in cache were refreshed. Those that had been changed during the day remained in cache, with their previous versions being filed away at the offsite data center. We did this in an automated process with Verne Global so that our developers could be assured that they were working with their most recent files when they arrived at work in the morning."
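A minimal sketch of this kind of nightly refresh is shown below, assuming a simple layout in which a local `archive/` directory stands in for the offsite data center. The paths, the 24-hour staleness threshold, and the move of the stale file itself (rather than its previous version) are simplifying assumptions for illustration; RVX's actual automation with Verne Global is not described in detail.

```python
import shutil
import time
from pathlib import Path

CACHE_DIR = Path("cache")      # onsite, easy-access cache (assumed path)
ARCHIVE_DIR = Path("archive")  # stands in for the offsite data center
STALE_AFTER = 24 * 3600        # files untouched for a day get archived

def refresh_cache(now=None):
    """Keep recently modified files in cache; move stale ones to the archive."""
    now = time.time() if now is None else now
    ARCHIVE_DIR.mkdir(exist_ok=True)
    moved = []
    for f in CACHE_DIR.iterdir():
        if f.is_file() and now - f.stat().st_mtime > STALE_AFTER:
            shutil.move(str(f), str(ARCHIVE_DIR / f.name))
            moved.append(f.name)
    return moved
```

Run nightly (for example from cron), a job like this leaves developers' current working files in the fast cache while older material migrates to cheaper offsite storage.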
In partnering with Verne Global, RVX wanted to ensure that the highest levels of security were maintained over its work product, that data transfers encountered little latency, and that power consumption was low, in order to economize its operating costs.
"The data center was fully ISO-compliant, a certification that all of the movie studios require," said Gomes. "But we also found that its use of 100% renewable power lowered costs."
RVX's data team also conducted a detailed review of its disaster recovery and business continuity plan.
"What we found was that not everything in our IT needed full redundancy," said Gomes. "For instance, our render blade servers weren't holding any data, so they didn't require backup."
SEE: Big data policy (Tech Pro Research)
What are the takeaways for IT managers at companies where big data is central to business operations?
First, latency and fast transport of data should be primary considerations in your big data planning. If you can't get effective delivery times for your data, it will impact your business's ability to do work.
Second, although many companies are hesitant to get rid of any of their data, it might be time to consider which data you really need to keep, and whether there are temporary files stored on resources that don't require backup. This can help economize data center costs.
"The Everest project was successful for us, and having the right IT infrastructure in place to manage the data was instrumental to it," said Gomes. "We are now beginning to field calls for virtual reality (VR) work, and we will continue to evolve our IT infrastructure to support VR."
- 6 big data trends to watch in 2017 (TechRepublic)
- 5 big data trends that will shape AI in 2017 (TechRepublic)
- Why 2017 could be a big year for AR and VR in business (TechRepublic)
- Executive's guide to the business value of VR and AR (free ebook) (TechRepublic)
- Big Data's 2017: Can more meta thinking free us from current malaise? (ZDNet)