Improving productivity on Planet 51

Today, the technology required to create computer-generated images for motion pictures has become a competitive differentiator for cutting-edge organizations such as Ilion Animation Studios. As part of this advanced infrastructure, deploying a high-performance storage solution allows artists to create more revisions by dramatically accelerating render farm output, fuelling extraordinary productions on schedule while reducing costs. SNS Europe talks to Gonzalo Rueda, Chief Technology Officer at Ilion.

Date: 1 Aug 2010

Ilion Animation Studios was founded in 2002 to create state-of-the-art computer-animated movies for worldwide theatrical release using its own purpose-built, cutting-edge technology. Based in Madrid, Spain, Ilion delivered Europe’s largest animation launch with the release of “Planet 51.” Premiering to a worldwide audience, with distribution on 3,800 screens in the United States alone, “Planet 51” is a defining production for Ilion as the studio’s first full-length feature film. Given the importance of the project to the company’s future, the management team realized that selecting the right storage partner was critical to their success.

The Challenge

Gonzalo Rueda, Ilion’s Chief Technology Officer with responsibility for all technical functions at the studio, quickly came to realize the importance of a high-performance NAS (Network Attached Storage) solution in enabling the creative potential of the studio’s 200 artists. This team was dependent on over 200 Hewlett Packard workstations and a render/compositing farm consisting of over 300 physical nodes and 2500 cores. File access was handled through a proprietary in-house rendering application and a home-grown asset management system. Ilion also ran applications such as Nuke, Autodesk Maya and 3D Studio Max on the farm. Toward the end of the project, production was at times moving at such an intense pace that Gonzalo’s team had to balance rendering and compositing work across the same physical server assets. On weekends, idle workstations were added to provide a total of 4000 cores of processing power. This environment posed a number of challenges:

  • The intense production schedule demanded a file storage system that could handle over 3000 simultaneous file interactions when rendering a scene, in addition to providing extremely high bandwidth for compositing.
  • The ability to manage large shared file systems and their corresponding folders was critical.
  • Since the applications employed ran on both Windows- and Linux-based servers, concurrent support for the NFS and CIFS protocols without performance degradation was a clear requirement.
  • Cost considerations required scaling the deployment in phases without compromising performance.

Gonzalo Rueda explains just how important a part the IT solution had to play in the making of the Planet 51 film: “There are some interesting issues with the feature film production environment. On the one hand, most of our production files contain references to other files. For example, one of our lighting files which we send to compute frames on the farm will reference anywhere from 300 to 3000 external files in the form of textures and geometry caches. One of the implications of these dependencies between files is that, in our environment, changing the paths where our files live is considered data corruption, since all the files that depend on them will no longer work correctly. This is where having a nice, scalable storage system is a necessity, because if during your upgrade and growth cycles you have to change a path, the migration process is not just about copying files; it’s about going through all the dependencies and re-pathing them, which is hugely complex.”
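
Ilion’s asset formats and pipeline tools are proprietary, so the details are not public; the Python sketch below simply illustrates the kind of dependency re-pathing Rueda describes, with hypothetical mount points and file extensions standing in for the real ones.

```python
import os

# Hypothetical illustration of the re-pathing problem Rueda describes:
# production files reference other files by path, so a storage migration
# means rewriting every reference, not just copying the data.
OLD_PREFIX = "/mnt/titan_old/production"          # assumed old mount point
NEW_PREFIX = "/mnt/titan_new/production"          # assumed new mount point
SCENE_EXTENSIONS = (".scene", ".light", ".comp")  # assumed file types

def repath_file(path):
    """Rewrite stale dependency paths inside one production file."""
    with open(path, "r", encoding="utf-8", errors="ignore") as f:
        text = f.read()
    updated = text.replace(OLD_PREFIX, NEW_PREFIX)
    if updated != text:
        with open(path, "w", encoding="utf-8") as f:
            f.write(updated)
        return True
    return False

def repath_tree(root):
    """Walk the whole project tree, since any file may reference any other."""
    changed = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith(SCENE_EXTENSIONS):
                if repath_file(os.path.join(dirpath, name)):
                    changed += 1
    print(f"re-pathed {changed} files")

if __name__ == "__main__":
    repath_tree(NEW_PREFIX)
```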

Rueda continues: “Another interesting aspect of our environment is that, in order to render a frame, a given render node needs to access all those dependencies, which can vary in size from 500MB to about 4GB. When you consider that at peak we had around 300 dedicated render nodes, with about another 100 artist workstations also rendering overnight and at weekends, that’s quite a lot of data to be pushed around. Since we have our own version control system, one of the things we did was to cache versions of the production files on the render nodes and, for each render, download only those files that had changed instead of all the files. Another thing we did with our version control system was benchmark the FTP and CIFS protocols, and for small files we opted to use FTP, which provided a network performance advantage over CIFS. That’s the sort of thing we can do because we have our own internal technology handling file access, and thanks to the flexibility that BlueArc provides.”
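
The version-aware caching Rueda mentions can be pictured with a short sketch. The cache location, manifest format and use of content hashes below are all assumptions made for illustration; Ilion’s own version control system would supply version numbers directly rather than hashing files, and the fetch itself would go over FTP or CIFS.

```python
import hashlib
import json
import shutil
from pathlib import Path

CACHE_DIR = Path("/var/cache/render")    # assumed node-local cache
MANIFEST = CACHE_DIR / "manifest.json"   # assumed version manifest

def file_version(path: Path) -> str:
    """Stand-in for a version lookup: a real pipeline would ask the
    version control system instead of hashing the file contents."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def sync(dependencies) -> int:
    """Fetch only the dependencies whose version changed since the last
    render; unchanged files are served from the local cache."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    manifest = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    fetched = 0
    for src in dependencies:
        key = str(src)
        version = file_version(src)
        if manifest.get(key) == version:
            continue                              # cache hit: skip the download
        shutil.copy2(src, CACHE_DIR / src.name)   # stand-in for an FTP/CIFS fetch
        manifest[key] = version
        fetched += 1
    MANIFEST.write_text(json.dumps(manifest))
    return fetched
```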

Prior to the Planet 51 project, Ilion’s storage infrastructure had evolved rather more ‘gently’ over time, as Rueda explains: “From a storage perspective, we started off with four IDE disks in RAID 5 in our domain controller. From there we moved on to a dedicated white box server with a Promise SCSI array. The next step, when we were about 50 people, was an Apple Xserve with one shelf of SATA drives attached to it, for a total of 3TB of storage. And after that came BlueArc. I don’t actually believe that the previous storage solutions are comparable to BlueArc, since they were all in essence regular servers sharing their internal disks.”

Contrast this with the IT infrastructure required at the height of the Planet 51 film production process. “At the peak of the production of Planet 51, our IT environment consisted of around 250 Windows clients, of which over 200 were the high-end HP workstations of our artists,” says Rueda. “There were 296 eight-core render nodes, of which about 176 were HP blades and the rest were Bull 1U dual-board nodes. We have a BlueArc system that consists of two Titan 2200 heads in a cluster with some 90TB of storage behind them, in a mixture of Fibre Channel and SATA drives. We also have some Linux machines serving our PostgreSQL databases and some of our in-house intranet apps. We have a StorageTek tape robot for backups. And lastly, our network infrastructure is based on four HP ProCurve 5406zl switches.”

The Solution

Given that Gonzalo and his team would be so dependent on the storage solution ultimately deployed in support of the Planet 51 project, he conducted a lengthy research engagement with an emphasis on bandwidth and performance relative to value. During its assessment, the team evaluated solutions from EMC, NetApp, HP and BlueArc. To help Ilion develop a storage infrastructure that would support such a diverse, high-performance application environment, the BlueArc team drew upon its extensive experience in the media and entertainment industry. According to Gonzalo, “The team took a consultative approach to help set up the initial configuration while keeping costs under control. BlueArc understood how our environment worked.”

Given the cost considerations associated with a start-up, Ilion elected to scale the deployment in two phases in order to balance cost and performance. The initial solution consisted of a single Titan Storage System with a pool of SATA-based storage, which supported 160 feeds through 200 render nodes. Over the course of two years, this solution scaled in step with Ilion’s production capacity. However, as the tempo of development accelerated, it became apparent that an additional system would be required. Upon completion of a full needs evaluation covering both current and future requirements, Ilion decided to add a second Titan storage system and an incremental tier of high-performance Fibre Channel storage.

The final configuration consisted of clustered Titans with 90TB of Fibre Channel and SATA capacity. In addition, Ilion took advantage of BlueArc’s Intelligent Tiered Storage functionality, which enabled transparent migration of files between the Fibre Channel and SATA tiers in order to reduce dependency on manual administration.

The Results

Despite the phased deployment approach, both the initial installation and subsequent upgrades to the BlueArc solution were completed successfully. In fact, Gonzalo recalls, “Given our tight schedule, simplicity was key. We needed the system to be easy to get set up and running. With BlueArc, there was no need to troubleshoot or fine-tune issues later on.”

While the scene work for Planet 51 spanned a period of 22 months, the last nine months of production leading to the film’s completion were particularly hectic for Gonzalo’s team. At peak production, members of staff were constantly present in order to maintain render operations 24 hours a day, seven days a week. In fact, a few particularly devoted administrators volunteered to work through the Christmas holiday. During this time, the BlueArc solution delivered sustained performance, allowing Ilion’s 200 artists to produce a greater number of shot iterations. The team wanted a solution in which larger effects jobs would not quickly overwhelm the system cache while reading input files and writing output files across 300 render nodes. Given the BlueArc Titan’s exceptional performance profile, Ilion found that many jobs were performed directly on the network without issue. This resulted not only in a higher quality production, but also enabled greater creativity and experimentation by the artists.

BlueArc’s data migration capabilities were used extensively to move files transparently between the primary and secondary tiers during the course of production.

Although the BlueArc solution provided standard policy templates, Ilion decided to create customized versions to best support its unique workflows. The primary Fibre Channel tier was devoted to sequences that were active in production. Once a sequence was completed for review or finalized, it was automatically migrated to the less costly and more energy-efficient SATA tier. As an additional benefit, file system metadata was transparently maintained on the primary tier, enabling fast access across the storage tiers even when the bulk of the data resided on lower-cost SATA disks.
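
BlueArc’s policy engine applied this kind of rule transparently inside the file system; the sketch below only illustrates the shape of the policy Ilion describes. The status values, tier names and idle threshold are assumptions for illustration, not the studio’s actual configuration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Sequence:
    name: str
    status: str              # assumed states: "active", "review", "final"
    last_modified: datetime
    tier: str = "fc"         # primary Fibre Channel tier

IDLE_THRESHOLD = timedelta(days=14)   # assumed inactivity window

def apply_policy(sequences):
    """Demote completed or idle sequences to the cheaper SATA tier.
    Metadata stays on the primary tier, so lookups remain fast."""
    now = datetime.now()
    for seq in sequences:
        done = seq.status in ("review", "final")
        idle = now - seq.last_modified > IDLE_THRESHOLD
        if seq.tier == "fc" and (done or idle):
            seq.tier = "sata"   # in practice the data blocks migrate;
                                # paths and metadata are unchanged
```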

This significantly improved overall performance without the need to maintain higher capacities on the primary tier of storage. While the vast majority of system capacity was dedicated to frames, assets, textures and animations, the solution’s multi-protocol accessibility and common storage pool allowed administrators to also use the Titan cluster for business data, such as Exchange and PostgreSQL.

According to Rueda, the key benefits of Ilion’s BlueArc storage system are:

  • Ease of use – storage and infrastructure is not our business, and having one box with an intuitive interface and easy configuration really helps us focus on the core of our business.
  • Knowledgeable support – one of the key factors in my trust in BlueArc is that they understand my industry and my problem domain. Working with storage vendors that only know about IOPS and bandwidth means we have to do all the translating from shots, frames and farms into that language. BlueArc not only wait for us to give them storage specs; they can also work with us on how to lay out the broader infrastructure to streamline a production studio, since they have a great deal of expertise in this industry.
  • Scalability – whenever we’ve had to grow, whether by adding a second Titan head for more bandwidth or by adding more storage to existing storage pools, BlueArc has delivered as advertised and without problems. Towards the end of the production we were maxing out the 12Gb/s of throughput we had combined across both Titan heads, which for me was a clear sign that the system had no internal bottlenecks and that we were getting the maximum theoretical performance out of the system. I wish I could say the same for all of my infrastructure elements.
  • Flexibility – the fact that we can share the same file system through different protocols, set up multi-tier file systems, and add global links to external storage (I know this is almost trivial in a *nix environment, but don’t forget I’m on Windows) all adds up to give me multiple options and alternatives for solving the issues that arise, and that’s something I really like because it leaves my options open.

The Conclusion

Supporting such varied, data-intensive application workloads was an important element of the creative effort put forth by Ilion in bringing the vision of Planet 51 to the big screen.

Despite the resounding success of the film, Ilion Animation Studios will continue to push the limits of reality-based animation and is already hard at work on a new and even more ambitious project.

As an ongoing partner in these endeavors, BlueArc, with its ability to provide high-performance, easy-to-manage and cost-effective network storage solutions, will continue to support Ilion into the future.

Rueda is already looking to the next storage challenge, presented by the latest project. “The next step in storage is seeing how I can scale the bandwidth access to my storage,” he explains. “Our next feature film is going to be stereoscopic, and that means we need to generate twice the number of frames, so I’m going to have to at a minimum double my render farm, storage capacity and bandwidth.

“Newer BlueArc products than those I have deployed now will already allow me to triple my existing bandwidth, and I’d expect that to grow even more in the near future, since the 3000 series has been on the market for a couple of years.”

Rueda also acknowledges that tiered storage will continue to be an important part of Ilion’s storage strategy, explaining: “Tiered storage has been a key part of our storage strategy during Planet 51, and I see it remaining key into the future.

“That’s one of the things I love about BlueArc. As I saw people starting to jump on the tiered storage bandwagon, at first with fairly complex setups and policies, I was just sitting back thinking, what’s all the fuss about? We’ve been using this technology in production for years now. I do have a perception of being ahead of the pack with BlueArc, and that helps me sleep at night, to be honest.”
