IT Brief Asia - Technology news for CIOs & IT decision-makers

Media companies are going big on data – but workflow challenges loom


We are living through a new golden age of media and broadcasting, where everything is available on demand and accessible at lightning speed.

However, media companies in this highly competitive industry must navigate a complex digital landscape – AI integration, cloud migration, and hybrid workflows – to deliver the cost-efficiency, performance, scalability and security that consumers demand, all while sustaining market growth.

The other challenge is the change in monetisation models and the shift in audience behaviour towards non-linear channels, as witnessed recently by the 65 million viewers who tuned in live to watch the Mike Tyson vs. Jake Paul boxing match on Netflix – the most streamed sporting event in broadcasting history.

Having once been anchored to on-site production and legacy equipment, the media industry is now becoming increasingly reliant on cloud-based services and remote production. This shift, while offering greater efficiencies, brings substantial data challenges, especially when dealing with high-bandwidth media files and geographically diverse teams.

Content production is rapidly evolving too, with media companies now prioritising accelerated workflows and faster timelines to satisfy market demands. While cloud-based services offer the ideal solution to these complex challenges, they also open a Pandora's box of potential disasters.

A state of constant flux

Cloud adoption in the media and broadcasting sector is near universal, but the complexities around data workflow management, data security, and cost predictability – especially for a sector used to the stability of capital expenditure models – have left some media companies in a state of flux, particularly where teams collaborate across borders on large-scale projects. Transferring vast data files over the public internet may sound like the go-to solution, but time and again it has led to delays, inefficiencies, and significant risk of data loss or breach. The result is a workflow characterised by production setbacks, costly overruns and, in some cases, compromised data security – outcomes that media companies cannot afford in an intensely competitive industry.

What's more, the rapid rise of over-the-top (OTT) platforms like Netflix and Amazon Prime Video as well as multi-device usage has forced media companies to rethink their tech costs against content investment. For many, a hybrid model – integrating both on-prem and cloud – offers an attractive balance of control and scalability. Even so, several operational challenges persist. The most pressing of these is data transmission failure, and with it, the threat of data loss.

Remote production workflows increasingly depend on data transfer between disparate locations. For instance, a director in New York might need high-definition footage from a production crew in Hong Kong, which is then edited by a team in London. When relying on the public internet, this intricate process can suffer frequent transmission drops, incomplete file deliveries, and delays caused by bandwidth limitations and latency. As data volumes increase, which they surely will, these inefficiencies will become ever more costly. While remote production offers flexibility and cost reduction, it only works if data transfer across the entire workflow is reliable; otherwise, each stage in the process is held up by long waiting times.
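The scale of these delays is easy to underestimate. As a back-of-the-envelope sketch, the snippet below estimates how long a large media file takes to move over a link; the file size, link speed, and efficiency figures are illustrative assumptions, not figures from the article.

```python
def transfer_hours(file_gb: float, bandwidth_mbps: float,
                   efficiency: float = 1.0) -> float:
    """Estimate transfer time in hours for a file of `file_gb` gigabytes
    over a link of `bandwidth_mbps` megabits per second.
    `efficiency` loosely models protocol overhead, latency and congestion
    (1.0 = ideal link; public-internet paths often achieve far less)."""
    megabits = file_gb * 8_000        # 1 GB = 8,000 Mb (decimal units)
    seconds = megabits / (bandwidth_mbps * efficiency)
    return seconds / 3600

# Assumption: an hour of raw high-resolution footage can run to ~1 TB (1,000 GB).
ideal = transfer_hours(1000, 1000)            # 1 Gbps, ideal: ~2.2 hours
congested = transfer_hours(1000, 1000, 0.3)   # same link at 30% efficiency: ~7.4 hours
```

Even under these rough assumptions, a congested path more than triples the wait, and every downstream team inherits that delay.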

Another significant challenge is data security. A reliance on the public internet for transmitting high-value, sensitive data – think of an unedited movie or an unreleased video game script – exposes media companies to substantial security risks such as the interception of data by hackers. When such assets are leaked online, the damage is not only financial and reputational; it spoils months of carefully planned build-up and severely dents audience engagement. Such incidents underscore the need for robust security, which the public internet is often ill-equipped to provide.

A third challenge is costly delays. Efficiency is critical when multiple teams in various locations work sequentially on media projects. When files are delayed or lost in transit, teams waiting to perform colour grading, visual effects, or other post-production tasks encounter significant downtime. This domino effect can stall entire schedules, causing budgets to inflate and deadlines to be missed. For an industry that lives and dies by tightly managed release dates, these delays can have a devastating impact on the bottom line.

Then there's the irresistible rise of AI. As more media companies adopt emerging AI technologies, the need for fast, secure, and reliable data transfers becomes critical. Given that the media industry is the world's largest data collector, the challenges that come with managing massive data sets, deduplication, and dispersed data storage have forced companies to rethink their workflow strategies.

The private network effect

While the public internet remains the default option for many established media companies, especially those bound to long-term equipment investments, smaller and more agile media players are embracing private, direct connections to manage their data workflows. The reasons for this shift are myriad, but top of the list is greater security. Private networks offer a level of data security and control that is simply not achievable with the public internet. They ensure sensitive media assets are protected from cyberthreats and other risks associated with data interception and unauthorised access. After all, you can imagine the negative impact of preview shots being disclosed before a new movie or video game is ready to go to market. Private networks also enforce strict encryption and security protocols that guard against such breaches and preserve the confidentiality of high-value media content.

Equally important is the demand for a robust network that can support AI-driven data workflows. As AI becomes more integrated into content creation – from automated video editing to personalised recommendations – a private network can ensure these workflows remain efficient and responsive, allowing data to move quickly and seamlessly through each stage of production.

This leads to another crucial factor – high-speed performance. Private networks provide dynamic scalability, allowing media companies to adjust their bandwidth on demand so that data transmissions remain stable. For workflows that involve massive files such as high-resolution footage or complex visual effects, this stability translates into shorter transfer times, improved production efficiency, and vastly reduced bottlenecks.

Global reach is vital too. As a global industry, media companies require network connections that allow their teams to collaborate seamlessly across borders. Think of an international event like the recent Paris Olympics, where live media streams were distributed in real time to audiences worldwide. Now think of those media streams going down or being probed by hackers. By choosing a private network that offers secure coverage across hundreds of locations, media companies can deliver low-latency, high-speed connectivity without compromising quality or reliability.

Finally, but of no less importance, is the threat of financial or reputational damage that comes with data workflow failures. The consequences of these failures extend far beyond operational setbacks. The global entertainment market, for instance, generates over $100 billion in annual revenues by keeping its customers happy. Yet when data is compromised, whether through interception or incomplete transmission, it can trigger a catastrophic backlash from audiences and investors alike – one that damages a company's brand reputation, its market position, and ultimately its future. A private network mitigates this threat, helping companies remain competitive, meet audience expectations, and avoid the pitfalls of data mismanagement.
