Like the cloud before it, Big Data is both a reinvention of something that already existed and a term that does little to describe what it actually is. Nonetheless, people are suddenly talking about its potential to effect change in broadcast supply and commissioning.
“I don’t think there is such a thing [as big data],” declares Andrew Jordan, SVP, International Operations & Technology, NBCUniversal. “Big is too subjective a tag for anything related to data. What is ‘big’ for one organisation is ‘small’ for another. Big data is a term for a marketeer’s or sales person’s convenience.”
Many are dismissive of the catch-all phrase ‘big data’ but few believe data analysis is without merit. “We have always analysed the data we’ve captured and stored it internally,” says Jordan. “The difference now is that so much data is being created in the public domain and not by us.”
According to IBM, we create 2.5 quintillion bytes of data every day from the growing use of network-connected embedded microprocessors, often connected to sensors or other data-gathering instruments.
In broadcast, big data is most commonly applied to data sets about media consumption, which include Barb and other ratings metrics plus review and biographical data, social media, ISP and return path data.
“Because a data set is huge doesn’t mean it is valuable,” warns consultant Fran Cassidy. “The Twitter universe only represents those who use Twitter, not the total universe of TV viewers. Realtime data, which allows you to ‘see’ what people are thinking, is very seductive to production companies but potentially dangerous. If you rely on that [social media tracking], the silent majority will never get heard.”
According to Robert Ambrose, Media & Entertainment Industry director at Oracle, media customers find it hard to create a business case for investment in big data. There is a risk of ‘emperor’s new clothes’, he says, but only if you take a technology-for-technology’s-sake approach.
“If you start with an understanding of the potential data sources out there (web transactions, customer data, transaction/consumption logs, demographic data, service requests, social media, etc) and then think about the business pain points you’re trying to address (targeted marketing/advertising, reduced subscription churn, better content investment) then big data analytics has huge potential to deliver competitive advantage to media companies.”
Andy Brown, chairman at WPP-owned consultancy Kantar Media, says, “A fundamental limitation of big data is that it doesn’t collect attitudinal data. It tells you who is watching, but not why, or why they prefer this content over that. So it’s essential that we retain classic market research – such as panel measurement – alongside return path data and social media data.”
He adds, “Twitter is very aggressive at trying to penetrate the TV ecosystem to the point where they are launching a Twitter TV ratings service in the US. Can we evaluate a programme’s engagement using social media?”
Helping broadcasters improve
Big data is one of the most talked about topics in Silicon Valley, according to Larry Kaplan, founder and CEO of Software Defined Video Infrastructure (SDVI). “In the Valley they always talk about the consumer, of which the exemplar is Amazon. What Amazon has done in terms of analysing big data to predict consumer habits, and feeding back into the supply chain to package and distribute media to the consumer, is nothing short of amazing. The question is how we can apply this to broadcast.”
Since internal production and distribution processes generate data that is more structured and predictable than the unstructured data culled from the internet, could big data techniques be used to optimise how a broadcaster’s internal processes operate?
“Broadcasters already have a business with a lot of intelligence built in,” says Tony Taylor, CEO, TransMedia Dynamics. “Metadata is embedded in content to a greater or lesser degree. We can help broadcasters make better use of it to improve their operational performance.”
TMD integrated its Mediaflex Content Intelligence tool into the media asset management system at Irish broadcaster RTÉ. The whole system is claimed to save RTÉ over £300k a year by harvesting and analysing metadata from files coming into the broadcaster, then routing the content without manual intervention into the appropriate workflow.
“By extracting as much technical and intellectual information as possible we open the data up for the broadcaster to make business decisions on the performance of internal processes such as transcodes, content movements and publishing,” says Taylor.
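TMD does not publish the internals of Mediaflex, but the principle Taylor describes — harvest metadata from an incoming file, then route the content into the appropriate workflow without manual intervention — can be sketched roughly as follows. All names, formats and routing rules here are hypothetical, purely for illustration.

```python
# Hypothetical sketch of metadata-driven workflow routing, in the spirit of
# what Mediaflex is described as doing. Names and rules are illustrative only.

def harvest_metadata(filename):
    """Toy harvester: derive technical metadata from the filename.
    A real system would probe the file itself (wrapper, codec, resolution)."""
    ext = filename.rsplit(".", 1)[-1].lower()
    return {"filename": filename, "wrapper": ext}

def route(asset):
    """Pick a workflow for an incoming asset based on its metadata."""
    wrapper = asset["wrapper"]
    if wrapper == "mxf":
        return "qc-and-publish"       # house format: straight to QC
    elif wrapper in ("mov", "mp4"):
        return "transcode-to-house"   # needs a transcode first
    return "manual-review"            # unknown format: flag for an operator

for f in ["news_pkg.mxf", "promo.mov", "mystery.bin"]:
    asset = harvest_metadata(f)
    print(f, "->", route(asset))
```

The saving TMD claims comes from exactly this kind of decision being taken by rules rather than by an operator inspecting each file.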
According to Kaplan, what is needed in order for the broadcast industry to take full advantage of big data is a virtualisation of the facility and its services. Virtualisation starts from the concept of a facility as a collection of types of resources; one resource for storage, another for processing and so on.
“The percentage of a facility’s asset base in use at any one time is strikingly low,” he says. “Fixed facilities are generally very over-provisioned because of the inability to change their configuration. Software control over these utilities would help broadcasters achieve very significant benefits from pooled resources.”
Kaplan’s company is promoting an enterprise software suite to feed into a virtualised plant. “Once facilities are designed around the notion of pooled resources that can be configured and changed in software then the whole notion of big data techniques can be deployed to intelligently manage it,” he says. “Software defined networks and FIMS (Framework for Interoperable Media Services), the AMWA and EBU framework, are keys to making virtualisation possible.”
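Kaplan’s over-provisioning point has a simple arithmetic core: a fixed facility must size each dedicated chain for that chain’s own peak, while a software-configured pool only needs capacity for the peak of the combined demand. The figures below are invented purely to illustrate the argument.

```python
# Hypothetical illustration of the pooling argument. The hourly transcode-slot
# demand for three fixed-function chains is made up for this example.
demand = {
    "news":    [2, 8, 3, 1],
    "promos":  [1, 2, 7, 2],
    "archive": [6, 1, 1, 5],
}

# Fixed facility: each chain must be sized for its own peak demand.
fixed_capacity = sum(max(hours) for hours in demand.values())

# Virtualised facility: one pool sized for the peak of the summed demand.
combined = [sum(h) for h in zip(*demand.values())]
pooled_capacity = max(combined)

print("fixed:", fixed_capacity)    # 8 + 7 + 6 = 21 slots
print("pooled:", pooled_capacity)  # peak of [9, 11, 11, 8] = 11 slots
```

Because the three chains peak at different times, the pooled plant gets by with roughly half the capacity — which is the utilisation gap Kaplan says software control can recover.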
By Adrian Pennington