Bad numbers, bad research?

No topic is more hotly debated among broadcasters, marketers and media agencies than the still unpublished TV data. What went wrong? And what happens next? Manuel Dähler, Director of Mediapulse, answers these and other questions in an interview.

Manuel Dähler, Director Mediapulse (Photo: Pierre C. Meier)

WW: The switch to the new measurement system has turned into a ratings debacle. The industry has been without audience figures for almost four months. Was Mediapulse a little naive when it made the switch?
Manuel Dähler: No, the changeover was planned for a long time. The story began in 2010 with a customer satisfaction survey. One customer requested that time-shifted television and web usage also be recorded. We then investigated which providers could get to grips with the Swiss situation and, together with customers and the Federal Office of Communications (Bakom), drew up a list of requirements for a new measurement system. On this basis, three companies were invited to submit an offer. In December 2011, our Board of Directors (BoD) finally decided to award the contract to Kantar Media.
 
Why did the BoD take so long and only decide on a provider a year before the changeover?
In August 2011, the Executive Board submitted a proposal to set up a new panel. Before the BoD agreed, it had various points clarified: for example, whether the applicant companies also offered the option of integrating radio research retrospectively, or how the applicants were perceived in other countries. It took time to gather this information. However, we had already started negotiating the contracts before December, which subsequently enabled a swift conclusion. The decision in favor of Kantar was made on December 8, and the contract was signed on December 22. But it's true: setting up a new panel in a year is quite a feat.
 
We like to believe that.
However, Kantar did not have to start from scratch. Mediapulse had already started to build up an address pool in 2009 with the New Establishment Survey, from which targeted samples could be drawn. If Kantar, as a foreign provider, had first had to get to know the characteristics of the Swiss population, one year would of course have been utopian.
 
Why wasn't the old panel simply converted?
Around half of all households would have had to be recruited anyway. We no longer wanted to recruit the addresses purely by telephone, as before, and also wanted to include households without a TV set but with an Internet connection. In addition, there would have been certain distortions (biases) in the old part of the panel.
 
What kind of distortions?
In the last four years, we had to replace between 20 and 25 percent of all households on the GfK panel for measurement-technology reasons. Many households purchased new devices that the old measurement technology was unable to record. The lost households were replaced with socio-demographically identical households that had "older" devices. This made the panel more technically conservative: households with new technologies were significantly underrepresented. Alongside the change in media use, this is the main reason why the new data differs so greatly from the old data.
 
One factor that made the changeover risky was the lack of a transition phase. Why was the old panel not continued in parallel?
Parallel operation was planned for December and took place. In spring 2012, Mediapulse had to decide whether parallel operation was necessary beyond the deadline of January 1. We assessed how high the risk was that Kantar would not be able to set up the panel within a year - leaving us without a panel. If this risk had been high, we would have allowed the old panel to continue. As Kantar has always reached the target size on time for the last five or six panels, we decided not to extend it. From this perspective, we made the right decision. The panel was there on time and parallel operation was not necessary.
 
With a parallel system, the advertising industry would now at least have data...
Let's imagine that the old panel was still running. The differences we know today would still exist. There would be two numbers for the same item. If I've learned anything from the whole exercise, it's that advertisers only want one number.
 
That is correct.
Apart from that, a parallel operation cannot be there to tweak the figures until they look like the old ones. The new system is good if it does not produce any errors and if the basis, the recruitment, is in order. Under these conditions, there is no reason to fundamentally question the figures - even if they look different. There is no alternative to jumping into the new world.
 
It was foreseeable that there would be winners and losers in the system change - stations that would gain and others that would lose. Did Mediapulse think about how to deal with the "losers" beforehand?
Of course, we have looked at the media acceptance of the new currency. We have paid less attention to the magnitude of the figures for individual broadcasters - certainly not enough from today's perspective.
 
Has Mediapulse looked at the figures with the broadcasters?
Not at station level, but we provided information about the key data at events in December and explained in great detail what had changed: at the level of the country's regions and the main target groups. We rely on a new extrapolation, which entails changes in key target groups. We have more households than expected, and fewer with children. Reach has increased slightly and viewing time has decreased slightly.
 
And what happened next?
Information events with sender-specific data were held again at the beginning of January. In February, the BoD then began to debate whether the data should be published. We submitted the data to the BoD, which had it assessed by specialists.
 
These experts said yes?
They said yes - and after an in-depth discussion, the Mediapulse BoD also said yes. As a result of this decision, we began supplying data to marketers and broadcasters on February 14. From that moment on, every broadcaster knew the figures for its programs.
 
Only your own?
No, every customer saw more or less everything - including the competitor programs.
 
How did they react?
Despite all the preparatory work, certain customers were shocked when they saw the new data. Even those with representatives on the Mediapulse Board of Directors (see Info 1 below). Some intervened, so the BoD decided to wait before releasing the data. In addition to the first report, which was already underway, the BoD commissioned a second one, which focused in particular on local problems.
 
Why was the Media Science Commission (MWK) not involved?
The MWK was involved when Mediapulse decided to request an external expert opinion at the beginning of January. We wanted to work with the MWK, but it was not possible to obtain an expert opinion in time. As a result, external experts were commissioned.
 
So the MWK was contacted?
Yes, we contacted the MWK in January.
 
Why wasn't the MWK brought on board for the development? After all, it had been involved in TV research for years.
Firstly, the MWK has a supervisory mandate. It does not act in advance, but issues monitoring reports retrospectively. Secondly, we know from previous reports where the Commission has its finger on the pulse, where it identifies weaknesses and where it makes suggestions for improvement. This knowledge was certainly incorporated into the evaluation.
 
Let's move on to communication: Mediapulse provided only scant information about the events. Your assessment?
Mediapulse's focus in the first three weeks was on troubleshooting, not communication, that's right. Firstly, there were problems with the station splits; secondly, errors occurred when assigning the audio recordings; and the third difficulty was the weighting by concession areas. These problems had to be solved quickly, and Mediapulse was focused on that. We didn't know on what basis we should have provided information. Looking back, we have to admit: perhaps we should have said publicly what we were doing at the time. That would have increased transparency and created trust.
 
What happens now: when will there be figures?
Mediapulse has been delivering figures to its contract customers - marketers and broadcasters - since February 14. There was an interruption over Easter due to the super-provisional order. It stated that Mediapulse was not allowed to release any data - with the exception of customers to whom it had previously supplied the data under a confidentiality obligation. As we did not have an explicit confidentiality agreement with our customers, we stopped releasing the data until the court told us on request that we could release the figures again. On the other hand, everything relating to the marketing, planning, optimization and billing of advertising is not our responsibility, but that of the marketers (see Info 2 below).
 
In other words, marketers and broadcasters receive data and are in principle allowed to give it to their customers?
Yes, at least that is the interpretation of our lawyers (see Info 2 below).
 
What about the agencies: do they have access to the figures?
Mediapulse is still not allowed to serve agencies. Agencies that not only work with marketers' tools but also obtain data directly from us are affected by the stop due to the super-provisional ruling.
 
What is going on at Mediapulse in this regard?
We are in intensive talks with 3+ with the aim of having the super-provisional injunction lifted. We are trying to reach an agreement - but not on the condition that there is a "Lex 3+". There is no way that 3+ will get an individual solution. And we are prepared to make our case in court if we cannot reach an agreement.
 
Many broadcasters have given their customers a performance guarantee. Do you expect complaints from broadcasters who are experiencing a decline?
Given our contracts, our exposure is relatively limited - unless we can be proven to have made gross errors. I expect individual complaints from customers, but not en masse. We assume no responsibility for the business that our customers do with the data.
 
Finally, the local stations: For small stations, the range of fluctuation in ratings is large - and has apparently become even larger with the new system.
The fluctuations have not increased. The difference between the GfK panel and the Kantar panel is simply that we can now show confidence intervals (a statistical measure of the fluctuation of a measured value).
 
Is it even possible to survey local stations with the national panel?
You can also report audience figures for local stations, but you have to keep an eye on the confidence intervals, which are large in the local area. This has to do with the fact that there are fewer households in small regions. If we wanted to reduce the confidence intervals, we would have to measure x times the current number of households.
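Dähler's point - that narrowing the intervals would require a many-times-larger panel - follows from the fact that a confidence interval's half-width shrinks only with the square root of the sample size. A minimal sketch of this relationship, using the normal approximation for a rating share and purely illustrative figures (not Mediapulse's actual panel sizes):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of a 95% confidence interval for a rating share p
    measured on a panel of n households (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# A local station with a 2% share on a hypothetical 2,000-household panel ...
moe_small = margin_of_error(0.02, 2000)
# ... versus the same share on a panel four times as large.
moe_large = margin_of_error(0.02, 8000)

# Quadrupling the panel only halves the interval, since width ~ 1/sqrt(n).
print(round(moe_small / moe_large, 2))  # -> 2.0
```

In other words, cutting the local-market uncertainty to a quarter would require sixteen times as many measured households, which is why pooling several days of data (increasing the effective n) is the more practical route.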
 
Shouldn't we admit to ourselves: The recording of local stations is pointless due to the large confidence intervals?
The problem that confidence intervals become larger in small regions cannot really be solved. However, if several days are combined, for example, the values become more stable and therefore more meaningful. Other countries, incidentally, no longer report any figures at this level at all. We are currently in talks with Telesuisse, the Swiss regional television association, and are discussing various solutions. There is, of course, also the question of how much local broadcasters really depend on national viewer figures for their marketing.
 
Interview: Isabel Imper and Pierre C. Meier

Info 1: Between independence and interests
The Mediapulse research organization, owned by the Mediapulse Foundation, provides data on the distribution and use of radio and television programme services in Switzerland. It was established to implement the legal mandate (Art. 78-81 of the Federal Act on Radio and Television, RTVA). According to Art. 78 of the RTVA, it should be independent of the SRG, other broadcasters and the advertising industry. De facto, however, the Board of Directors - with the exception of Chairman Marco de Stoppani - consists of representatives of vested interests: Rudolf Matter (SRF, representing the SRG), Martin Schneider (Publisuisse, representing the SRG), André Moesch (Telesuisse, representing the private broadcasters), Martin Muerner (VSP, representing the private broadcasters), Klaus Kappeler (Goldbach Media Group, representing the advertising industry), Roger Harlacher (SWA, representing the advertising industry).

Info 2: Events after the interview, as of April 24, 4 p.m.: Publication ban also affects broadcasters and marketers
The interview took place on Friday, April 19. On April 22, the Nidwalden High Court confirmed the super-provisional injunction requested by 3+ against the publication of TV ratings. According to the court, the publication ban also affects broadcasters and marketers. The expert reports commissioned by Mediapulse are not sufficient for the judges: the experts referred only to narrowly defined topics, and various problem areas remained unexamined. 3+ is sticking to its existing catalog and thus to its demand that the system be tested by an independent organization. There is also a catalog from Goldbach Media with requirements that must be "implemented before data publication". (The documents are available at Igem.ch/schwerpunkte/tv-panel2013.) We will keep you up to date with developments on Werbewoche.ch.

On the subject for print subscribers: Will Mediapulse soon be superfluous? Christoph J. Walther on the change in the measurement of ratings due to the digitalization of the television world and the emerging alternative currencies.

