Improving productivity on Planet 51
Ilion Animation Studios was founded in 2002 to create state-of-the-art computer animated movies for worldwide theatrical release using its own purpose-built, cutting-edge technology. Based in Madrid, Spain, Ilion delivered Europe’s largest animation launch with the release of “Planet 51.” Premiering to a worldwide audience, and with distribution on 3800 screens in the United States alone, “Planet 51” is a defining production for Ilion as the studio’s first full-length feature film. Given the importance of the project to the company’s future, the management team realized that selecting the right storage partner was critical to their success.
Gonzalo Rueda, Ilion’s Chief Technology Officer, responsible for all technical functions at the studio, quickly came to realize how important a high-performance NAS (network-attached storage) solution would be in enabling the creative potential of the studio’s 200 artists. The team depended on over 200 Hewlett-Packard workstations and a render/compositing farm of more than 300 physical nodes and 2500 cores. File access was handled by a proprietary in-house rendering application and a home-grown asset management system, and Ilion also ran applications such as Nuke, Autodesk Maya and 3D Studio Max on the farm. Toward the end of the project, production was at times moving at such an intense pace that Gonzalo’s team had to balance rendering and compositing work across the same physical server assets. On weekends, idle workstations were added to provide a total of 4000 cores of processing power. This environment posed a number of challenges.
Gonzalo Rueda explains just how important a part the IT solution had to play in the making of the Planet 51 film: “There are some interesting issues with the feature film production environment. On the one hand, most of our production files contain references to other files. For example, one of our lighting files, which we send to compute frames on the farm, will reference anywhere from 300 to 3000 external files in the form of textures and geometry caches. One implication of these dependencies is that, in our environment, changing the paths where our files live is considered data corruption, since all the files that depend on them will no longer work correctly. This is where a solid and scalable storage system becomes a necessity: if, during your upgrade and growth cycles, you have to change a path, the migration process is not just about copying files but about going through all the dependencies and re-pathing them, which is hugely complex.”
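A minimal sketch of what such a re-pathing pass might look like. The storage roots, the `.ma` (Maya ASCII) file pattern and the simple prefix rewrite are illustrative assumptions, not Ilion's actual tooling, which would also need to handle binary formats and per-format reference syntax:

```python
import re
from pathlib import Path

# Hypothetical storage roots: every dependency reference under the old
# root must be rewritten, or files that point at it break after migration.
OLD_ROOT = "//titan1/production"
NEW_ROOT = "//titan2/production"

def repath_file(scene: Path) -> int:
    """Rewrite dependency paths inside one scene file; return count changed."""
    text = scene.read_text()
    patched, count = re.subn(re.escape(OLD_ROOT), NEW_ROOT, text)
    if count:
        scene.write_text(patched)
    return count

def repath_tree(root: Path) -> int:
    """Walk every text-based scene file under root and re-path its references."""
    total = 0
    for scene in root.rglob("*.ma"):   # assumed: Maya ASCII scenes only
        total += repath_file(scene)
    return total
```

With hundreds to thousands of references per lighting file, a pass like this has to touch every dependent file in the project, which is exactly why Rueda treats path changes as a migration cost rather than a routine rename.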
Rueda continues: “Another interesting aspect of our environment is that, in order to render a frame, a given render node needs to access all of those dependencies, which can vary in size from 500MB to about 4GB of information. When you consider that at peak we had around 300 dedicated render nodes, with about another 100 artist workstations also rendering overnight and at weekends, that’s quite a lot of data to be pushed around. Since we have our own version control system, one of the things we did was to cache versions of the production files on the render nodes and, for each render, download only the files that had changed rather than all of them. We also benchmarked the FTP and CIFS protocols, and for small files we opted for FTP, which provided some network performance advantage over CIFS. That’s the sort of thing we can do because we have our own internal technology handling file access, and thanks to the flexibility that BlueArc provides.”
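The cache-and-delta idea Rueda describes can be sketched as a manifest comparison: before a render, check each dependency against the node's local cache and fetch only what is missing or stale. The `manifest` shape, the content-hash comparison and the pluggable `fetch` callable are assumptions for illustration (Ilion's real client would sit on top of their version control system and the FTP/CIFS transports they benchmarked):

```python
import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    """Content hash of a cached file, used to detect stale versions."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def sync_dependencies(manifest: dict[str, str], cache: Path, fetch) -> list[str]:
    """manifest maps relative path -> expected content hash.
    Download only the entries that are missing or stale in the local cache;
    return the list of paths actually transferred."""
    downloaded = []
    for rel, expected in manifest.items():
        local = cache / rel
        if not local.exists() or file_hash(local) != expected:
            local.parent.mkdir(parents=True, exist_ok=True)
            local.write_bytes(fetch(rel))   # transport (FTP, CIFS, ...) is pluggable
            downloaded.append(rel)
    return downloaded
```

On a second render of the same shot, nothing is stale and nothing is transferred, which is what turns a 4GB dependency set into a much smaller per-render download.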
Prior to the Planet 51 project, Ilion’s storage infrastructure had evolved rather more ‘gently’ over time, as Rueda explains: “From a storage perspective, we started off with four IDE disks in RAID5 in our domain controller. From there we moved on to a dedicated white-box server with a Promise SCSI array. The next step, when we were about 50 people, was an Apple Xserve with one shelf of SATA drives attached to it, for a total of 3TB of storage. And after that came BlueArc. I don’t actually think the previous storage solutions are comparable to BlueArc, since they were all in essence regular servers sharing their internal disks.”
Contrast this with the IT infrastructure required at the height of the Planet 51 film production process. “At the peak of production of Planet 51, our IT environment consisted of around 250 Windows clients, of which over 200 were the high-end HP workstations of our artists,” says Rueda. “We had 296 eight-core render nodes, of which about 176 were HP blades and the rest Bull 1U dual-board nodes. We have a BlueArc system consisting of two Titan 2200 heads in a cluster with some 90TB of storage behind them, in a mixture of Fibre Channel and SATA drives. We also have some Linux machines serving our PostgreSQL databases and some of our in-house intranet apps, and a StorageTek tape robot for backups. And lastly, our network infrastructure is based on four HP ProCurve 5406zl switches.”
Given that Gonzalo and his team would be so dependent on the storage solution ultimately deployed in support of the Planet 51 project, he conducted lengthy research, weighing bandwidth and performance against value. During its assessment, the team evaluated solutions from EMC, NetApp, HP and BlueArc. To help Ilion develop a storage infrastructure that would support such a diverse, high-performance application environment, the BlueArc team drew upon its extensive experience in the media and entertainment industry. According to Gonzalo, “The team took a consultative approach to help set up the initial configuration while keeping costs under control. BlueArc understood how our environment worked.”
Given the cost considerations associated with a start-up, Ilion elected to scale the deployment in two phases in order to balance cost and performance. The initial solution consisted of a single Titan storage system with a pool of SATA-based storage, which supported 160 feeds across 200 render nodes. Over the course of two years, this solution scaled in step with Ilion’s production capacity. As the tempo of development accelerated, however, it became apparent that an additional system would be required. After a full needs evaluation covering both current and future requirements, Ilion decided to add a second Titan storage system and an incremental tier of high-performance Fibre Channel storage.
Ultimately, the final configuration consisted of clustered Titans and 90 TB of Fibre Channel and SATA capacity. In addition, Ilion took advantage of BlueArc’s Intelligent Tiered Storage functionality which enabled transparent migration of files between the Fibre Channel and SATA tiers in order to reduce dependency on manual administration.
Despite the phased deployment approach, both the initial installation and subsequent upgrades to the BlueArc solution were completed successfully. In fact, Gonzalo recalls, “Given our tight schedule, simplicity was key. We needed the system to be easy to get set-up and running. With BlueArc, there was no need to troubleshoot or fine tune issues later on.”
While the scene work for Planet 51 spanned a period of 22 months, the last nine months of production leading to the film’s completion were particularly hectic for Gonzalo’s team. At peak production, members of staff were constantly present to keep render operations running 24 hours a day, seven days a week; a few particularly devoted administrators even volunteered to work through the Christmas holiday. During this time, the BlueArc solution delivered sustained performance, allowing Ilion’s 200 artists to produce a greater number of shot iterations. The team needed a solution in which the larger effects would not quickly overwhelm the system cache while reading input files and writing output files on 300 render nodes. Given the Titan’s exceptional performance profile, Ilion found that many jobs could be performed directly over the network without issue. The result was not only a higher-quality production, but also greater creativity and experimentation by the artists.
BlueArc’s data migration capabilities were used extensively in transparently moving files between the primary and secondary tiers during the course of production.
Although the BlueArc solution provided standard policy templates, Ilion created customized versions to best support its unique workflows. The primary Fibre Channel tier was devoted to sequences active in production; once completed for review or finalized, sequences were automatically migrated to the less costly and more energy-efficient SATA tier.
This significantly improved overall performance without the need to maintain higher capacities on the primary tier of storage. While the vast majority of system capacity is dedicated to frames, assets, textures and animations, the solution’s multi-protocol accessibility and common storage pool allowed administrators to also use the Titan cluster for business data, such as Exchange and PostgreSQL.
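The custom policy described above can be sketched as a simple rule over sequence status. The `Sequence` record, the status values and the tier names are illustrative assumptions; BlueArc's actual policy engine operates on file attributes rather than production metadata, with Ilion's customized policies mapping one onto the other:

```python
from dataclasses import dataclass

@dataclass
class Sequence:
    name: str
    status: str        # assumed states: "active", "in_review", or "final"
    tier: str = "fc"   # "fc" (primary Fibre Channel) or "sata" (secondary)

def apply_tier_policy(sequences: list[Sequence]) -> list[str]:
    """Move finished sequences off the primary tier; return the names moved.
    Migration is transparent to clients: paths do not change, so the
    re-pathing problem described earlier never arises."""
    moved = []
    for seq in sequences:
        if seq.status == "final" and seq.tier == "fc":
            seq.tier = "sata"
            moved.append(seq.name)
    return moved
```

The point of the rule is that only work-in-progress occupies the expensive tier, so primary capacity stays small even as total footprint grows.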
According to Rueda, the key benefit of Ilion’s BlueArc storage system was its support for such varied, data-intensive application workloads, an important element of the creative effort Ilion put forth in bringing the vision of Planet 51 to the big screen.
Despite the resounding success of the film, Ilion Animation Studios will continue to push the limits of reality-based animation and is already hard at work on a new and even more ambitious project.
As an ongoing partner in these endeavors, BlueArc’s ability to provide high performance, easy to manage, and cost effective network storage solutions will continue to support Ilion into the future.
Rueda is already looking to the next storage challenge, presented by the latest project. “The next step in storage is seeing how I can scale the bandwidth access to my storage,” he explains. “Our next feature film is going to be stereoscopic, and that means we need to generate twice the number of frames, so I’m going to have to at a minimum double my render farm, storage capacity and bandwidth.
“Newer BlueArc products than the ones I have deployed now would already allow me to triple my existing bandwidth, and I’d expect that to grow even further in the near future, since the 3000 series has been on the market for a couple of years.”
Rueda also acknowledges that tiered storage will continue to be an important part of Ilion’s storage strategy, explaining: “Tiered storage has been a key part of our storage strategy during Planet 51 and I see it remaining key into the future.
“That’s one of the things I love about BlueArc. As I saw people starting to jump on the tiered storage bandwagon, at first with fairly complex setups and policies, I was just sitting back thinking, what’s all the fuss about? We’ve been using this technology in production for years now. I do have a sense of being ahead of the pack with BlueArc, and that helps me sleep at night, to be honest.”