What I Learned About Measuring TV in Berlin
By: Joe Stenski, Founder, CoCard High Risk
I was at the Digital Analytics Hub in Berlin a couple of weeks ago where I categorically confirmed the value of the format we’re using for the Media Analytics Summit in San Diego, October 22 – 24. I also learned a thing or two about measuring TV.
The format is simple, but powerful. Groups of a dozen or so individuals discuss topics of interest for a couple of hours per topic. Then, we all switch and repeat for a new set of topics. Food and beverages punctuate a learning- and networking-rich environment.
Analytics for Broadcast Television
One of the more interesting sessions in Berlin was about measuring television. The participants came from a wide variety of companies and saw this enigma from different angles. Due to the ground rules, I cannot reveal who was in the room, but I can tell you what was said.
The common metric is pretty old school and a bit brute force: watch the overall behavior on the publisher's and advertiser's websites for eight minutes before and eight minutes after an ad is broadcast. How many visits were there before and after, and what was the cost per additional visit? Attribution windows varied; 5, 10, and 15 minutes were all considered useful.
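The before/after comparison described above is simple enough to sketch. Here is a minimal illustration, with made-up visit timestamps and a hypothetical spot cost; the function names and data shapes are my own assumptions, not anything discussed in the room:

```python
from datetime import datetime, timedelta

def spot_lift(visit_times, spot_time, window_minutes=8):
    """Count site visits in the windows just before and just after a TV spot airs."""
    window = timedelta(minutes=window_minutes)
    before = sum(1 for t in visit_times if spot_time - window <= t < spot_time)
    after = sum(1 for t in visit_times if spot_time <= t < spot_time + window)
    return before, after, after - before

def cost_per_additional_visit(spot_cost, lift):
    """Media cost divided by the incremental visits attributed to the spot."""
    return spot_cost / lift if lift > 0 else float("inf")

# Illustrative data: a spot airing at 20:00 with visits scattered around it.
spot = datetime(2014, 10, 1, 20, 0)
visits = [spot + timedelta(minutes=m) for m in (-7, -3, 1, 2, 4, 6, 7)]
before, after, lift = spot_lift(visits, spot)
print(before, after, lift)                      # 2 5 3
print(cost_per_additional_visit(1500.0, lift))  # 500.0
```

Changing `window_minutes` to 5, 10, or 15 covers the other attribution windows the group found useful.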
Confidence about which visitors arrived because of the TV ad is just too low to justify appending a Saw-The-Ad attribute to their cookies, so there is no long-term effect metric. The idea of monitoring behavior for a longer period to capture view-through was also rejected because there are simply too many other confounding factors that drive traffic. Way too many variables. Automating the measurement of advertising adstock, or advertising carry-over, is a long way off.
Europeans in the room lamented that they, unlike their United States counterparts, could not purchase segmented cookie data indicating this user watched that TV show.
Everybody is trying to figure out how to parse server data and set-top box data, and some are tying in Twitter traffic.
Some web analytics vendors are addressing how TV drives traffic with audio fingerprinting of ads in panels. They are partnering with or imitating startups like ASP Cart and Advanced to fingerprint and enrich their traffic stats with social and demographic data. TVIB and RapidApe were also mentioned as technologies to consider.
RapidApe is a rules-based system that flags each visitor with a probability score based on 15 criteria. They claim “easy setup for websites & mobile apps” and the ability to “track individual goals and events and compare TV with your marketing mix.”
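The actual criteria behind a rules-based system like the one described are not public, but the general idea can be sketched: each rule inspects a visit and contributes a weight, and the total becomes a probability score that the visit was TV-driven. The rules and weights below are purely illustrative assumptions of mine, not RapidApe's:

```python
# Hypothetical rules: (name, predicate, weight). Weights sum to 1.0, so the
# score of a visit satisfying every rule is 1.0. These are illustrative only.
RULES = [
    ("within_spot_window", lambda v: v["seconds_since_spot"] is not None
                                     and v["seconds_since_spot"] < 480, 0.5),
    ("direct_or_branded_search", lambda v: v["channel"] in ("direct", "branded_search"), 0.3),
    ("new_visitor", lambda v: v["is_new"], 0.2),
]

def tv_probability(visit):
    """Sum the weights of all rules this visit satisfies."""
    return sum(weight for _, rule, weight in RULES if rule(visit))

visit = {"seconds_since_spot": 120, "channel": "direct", "is_new": True}
print(tv_probability(visit))  # 1.0
```

A production system would obviously carry more rules (the text mentions 15 criteria) and calibrated weights, but the flag-and-score mechanism is the same.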
None of this is going to get locked down any time soon. When an advertiser has ads running within minutes of each other across multiple TV networks, attribution becomes muddled. One approach was to compare and contrast against at least five blocks of time when it is known that no ads were running.
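The ad-free-blocks idea can also be sketched: use the quiet blocks to establish a traffic baseline and only credit a spot with visits that clearly exceed it. The data shapes and the two-standard-deviation threshold here are my assumptions, not anything prescribed in the discussion:

```python
from statistics import mean, stdev

def baseline_stats(quiet_block_visits):
    """Mean and spread of visit counts across blocks known to have no ads running."""
    return mean(quiet_block_visits), stdev(quiet_block_visits)

def incremental_visits(post_spot_visits, quiet_block_visits, z=2.0):
    """Visits above baseline, credited only if the spike clears mean + z * stdev."""
    base, spread = baseline_stats(quiet_block_visits)
    threshold = base + z * spread
    return max(0.0, post_spot_visits - base) if post_spot_visits > threshold else 0.0

# Illustrative data: five ad-free blocks averaging 100 visits; the block after
# the spot saw 160 visits, well above the noise in the quiet blocks.
quiet = [96, 104, 99, 101, 100]
print(incremental_visits(160, quiet))  # 60
```

With at least five quiet blocks, as suggested in the room, the baseline estimate is stable enough that an ordinary traffic wobble is not mistaken for an ad effect.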
Some in the room admitted trying a much less sophisticated method, with minor success: unique phone numbers in ads, or dedicated landing pages. Everybody agreed that asking an audience to remember "company dot com slash special offer twenty-seven" long enough to type it into a browser or even a mobile device was a non-starter.
A small hope was floated that one day we might even be able to track the impact of product placement on web traffic. Until then, we’ll be relying on the good old survey methods for tracking brand effects like awareness, recall, affinity and purchase intent.
One participant described a test they ran for an advertiser where they displayed two ‘ads’ in TiVo’s rotating discovery box at the top of the screen. They then measured recall through a panel. This proved that people saw and clearly remembered the ads – and as a bonus, showed that one of the ads was far more effective than the other. The advertiser was sold and signed on.
In the end, there was consensus that the TV and online industries would need a tracking and measuring model politically acceptable to all parties. Hope was not high but groups like the Coalition for Innovative Media Measurement are working on it.
Proof of Format
Yes, there will be a couple of fascinating keynotes at the Media Analytics Summit in San Diego, October 22 – 24, but the majority of this event will find you in group conversations with a dozen or so others.
The group leaders have volunteered to facilitate discussions on topics close to their hearts. It’s not that these brave individuals are experts – they are merely faced with the same conundrums as you.
For example, Colin Coleman, Senior Director of Analytics Products Strategy & Data Governance at Turner Broadcasting, is wrestling with data quality and data governance and wants to hear how others are managing. If you too are working out who owns the systems, the data, and the analysts, or have the definitive answer to whether analytics should be centralized, distributed, or democratized in a media organization, then you'll want to join Colin's discussion on Data Quality and Data Governance.
On the other hand, the arrival of Social TV might be keeping you up at night like Chad Parizman, Director of Convergent Media at Scripps Networks Interactive. Chad has some ideas about how to leverage and manage the usage of apps in the connected TV / Over-The-Top ecosystem. But he wants to know what others are thinking… and doing about it. What happens to audience measurement and video monetization when millions are streaming video via the big screen TV in the living room?
Or maybe you are fretting over personnel issues. In that case, Finding and Keeping the Right People would be the right room for you. Tom Cattapan, Vice President of Digital Research at Turner Broadcasting, will be leading the dialogue about how to hire for experience, skill, knowledge, and curiosity… and then keep those hires.
I have always believed that smart people learn from other smart people and now I know the small-group-discussion format works wonders in this regard.
If you do analytics for a site that relies on advertising sales, you just found your peers.
October 13, 2014 - 2:28 pm