
Data Gravity in the Content Creation Industry

By Chuck Parker, HPA Board Member

The longer you work in and around the production and post production industries, the more you notice the accelerating change in the industry’s underlying workflows. In the mid-2000s, most of the cataclysmic change facing the industry was the transition from physical and analog workflows to digital ones. A few years later the digital business model hit the industry with full force, causing upheaval in pricing and cost models for those that did not adapt quickly. The result was a number of new players across the value chain and the end of some storied brands and companies in the industry.

After a decade of significant change, the majority of the steep process changes (physical/analog to digital) and the massive price pressure have been endured. However, our industry is now largely subject to Moore’s Law as a result of that digital transition. The net effect of a “doubling” of “digital power” at price parity every 12-18 months is a phenomenon that touches many (but not all) parts of the production and post production process, and it is a driving economic force that can bear good tidings or incredible pain. Its continued power to disrupt often stems from “linear thinking”: falling prey to the assumption that a capability-per-price improvement that took five years to achieve will take just as long to be duplicated. In fact, with doublings every 12-18 months, capability at a given price point will improve somewhere between 8x and 32x over that same period.
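As a quick sanity check on that range, here is a minimal back-of-the-envelope sketch. The five-year window and the 12-18 month doubling periods come straight from the argument above; everything else is simple compounding.

```python
# Back-of-the-envelope compounding of "digital power" at a fixed price point.
# The 5-year window and the 12-18 month doubling periods come from the text above.

def improvement_factor(years: float, doubling_period_months: float) -> float:
    """Capability multiple at price parity after `years` of doublings."""
    doublings = (years * 12) / doubling_period_months
    return 2 ** doublings

for period_months in (12, 18):
    factor = improvement_factor(5, period_months)
    print(f"Doubling every {period_months} months over 5 years: ~{factor:.0f}x")

# Doubling every 12 months over 5 years: ~32x
# Doubling every 18 months over 5 years: ~10x (about three full doublings,
# which is where the more conservative 8x lower bound comes from)
```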

What does all this mean? Taken in its most basic form, many would believe it means a 2K feature shot in 2015 should cost significantly less to shoot in 2016. However, if we take a cue from the storage industry, they would tell us that the amount of data stored by the average enterprise has doubled every year for as long as anyone can remember. This is largely the result of changing behaviors and processes as the cost of keeping data around gets cheaper (i.e. everything becomes less efficient). Don’t believe this? Ask yourself how many photos you have stored online with your Apple or Android phone today vs. three years ago: at some point iCloud, Dropbox or Google Photos got so “cheap” that managing your photo catalog cost you more than simply paying the service to keep everything.

Combine this impact of price point on behavior and process efficiency with the march of 4K presentation to the consumer, the increasing availability of 6K cameras at affordable price points, and the inevitable march to 8K for either virtual reality or some future Japan-like broadcast standard, and we have rapidly arrived at a point where a shoot that used to generate 1-2 TB of raw footage daily now often produces more than double that amount, and occasionally 15-25 TB of daily capture on high-end projects.
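To see why those daily volumes balloon so quickly, consider how pixel counts scale with resolution. The sketch below uses nominal DCI-style frame sizes at a constant ~1.9:1 aspect ratio and assumes a hypothetical 1.5 TB/day 2K baseline (taken from the 1-2 TB range above), with codec, bit depth and shooting ratio held constant; it is an illustration of the scaling, not data from any specific camera.

```python
# Rough illustration of why daily capture volumes balloon with resolution.
# Frame dimensions are nominal DCI-style sizes at a constant ~1.9:1 aspect ratio;
# codec, bit depth and shooting ratio are held constant so only pixel count changes.

RESOLUTIONS = {
    "2K": (2048, 1080),
    "4K": (4096, 2160),
    "6K": (6144, 3240),
    "8K": (8192, 4320),
}

BASELINE_TB_PER_DAY = 1.5  # assumed 2K daily capture, from the 1-2 TB range above

base_pixels = RESOLUTIONS["2K"][0] * RESOLUTIONS["2K"][1]

for name, (width, height) in RESOLUTIONS.items():
    scale = (width * height) / base_pixels
    print(f"{name}: {scale:4.1f}x the pixels of 2K "
          f"-> ~{BASELINE_TB_PER_DAY * scale:4.1f} TB/day at the same settings")

# 8K carries 16x the pixels of 2K, which lands squarely in the 15-25 TB/day
# range cited above for high-end projects.
```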

The net impact of this incredible change is that every professional in our industry needs to master a new term: data gravity.

Everyone in the production and post production value chain needs to start thinking differently about the way they impart their magic upon the content creation process. Visual effects teams have been dealing with the challenge of remote compute resources for years already (geographically dispersed office locations) and adapted to cloud compute rather quickly. That model typically requires them to push a large amount of data to another location very quickly (and securely), work their magic on the content in that location (data gravity), and then move the end result to where it needs to be delivered. It is very likely that this kind of process change will be required of most creative processes in the very near future. For example, the convenience offered to professionals by the creation of digital dailies (think Pix/Dax for review and approvals on iPads) will become a requirement: real-time color grading, editing, and review-and-approve workflows will all have to match this data gravity concept as content production sizes continue to grow at these rapid rates.
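The pattern the visual effects example describes, push the heavy data to where the compute lives, work there, then move only the finished result, can be sketched in a few lines. The helpers below are deliberately simple stand-ins (they just report what they would do), not any particular vendor’s API; a real pipeline would call a facility’s actual secure-transfer and remote-compute tooling at each step.

```python
# Minimal sketch of the push-process-pull pattern described above.
# The three helpers are hypothetical stand-ins, not a real vendor API.

def push_to_remote(source_dir: str, remote_site: str) -> str:
    """Push the heavy source data to the site where the compute lives."""
    print(f"Pushing {source_dir} to {remote_site} (securely, at high speed)")
    return "dataset-001"  # placeholder identifier for the uploaded data

def run_remote_job(dataset_id: str, job: str) -> str:
    """Do the creative/compute work where the data now sits (data gravity)."""
    print(f"Running '{job}' next to {dataset_id} instead of moving the data")
    return "result-001"   # placeholder identifier for the finished output

def pull_result(result_id: str, delivery_dir: str) -> None:
    """Move only the much smaller finished result to its delivery destination."""
    print(f"Delivering {result_id} to {delivery_dir}")

# Push the raw footage, work on it remotely, then pull back just the result.
dataset = push_to_remote("/mnt/raw_footage/day_042", "cloud-region-west")
result = run_remote_job(dataset, job="vfx_comp")
pull_result(result, "/mnt/deliverables")
```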

Managing data gravity requires both a change of process to account for some abstraction layer (i.e. visual review of a sliver of the data vs. moving all of the data for the review) and the ability to push large amounts of data from one creative collaboration partner to the next quickly and securely. Solving the data abstraction problem is about understanding your creative process and knowing at which points others can peer into that process to impart their magic vs. which points require them to have all of the data (i.e. review and approve vs. creation of a new visual effect). Solving the data transport problem requires understanding it at the network and transport layers (layers 3 and 4), including the speed of light, contention and latency, as well as understanding how the application layer (layer 7) can help solve transport problems. In short, it means knowing when you need dedicated private bandwidth vs. lots of cheap internet and the right software to solve your data gravity problem.
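To make the dedicated-bandwidth vs. cheap-internet trade-off concrete, here is a rough transfer-time sketch. The payload sizes and link speeds are illustrative assumptions (only the 20 TB daily-capture figure echoes the numbers above), and real throughput will be lower once latency, contention and protocol overhead are factored in.

```python
# Rough transfer-time arithmetic for the data gravity trade-off described above.
# Payload sizes and link speeds are illustrative assumptions; real-world
# throughput is further reduced by latency, contention and protocol overhead.

PAYLOADS_TB = {
    "proxy/dailies for review": 0.05,   # a sliver of the data (assumption)
    "full day of raw capture":  20.0,   # high-end daily capture from the text
}

LINKS_GBPS = {
    "shared internet (effective)":  0.2,
    "1 Gbps dedicated":             1.0,
    "10 Gbps dedicated":           10.0,
}

for payload_name, terabytes in PAYLOADS_TB.items():
    bits = terabytes * 1e12 * 8
    print(payload_name)
    for link_name, gbps in LINKS_GBPS.items():
        hours = bits / (gbps * 1e9) / 3600
        print(f"  {link_name:28s} ~{hours:6.1f} hours")

# Moving a review proxy over commodity internet takes well under an hour,
# while a full day of raw capture ties up that same link for days -- the
# point at which dedicated private bandwidth starts to pay for itself.
```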

Regardless of how you decide to change your creative processes and deal with data gravity, one thing is for sure: as long as the underlying cost drivers in the digital workflow cut product prices in half every 12-18 months (or double product capability for the same price), someone else in the chain will be innovating the creative process to present a better, faster and cheaper solution to your customers, because the opportunity for disruption is just too large to ignore.

