Quit Playing Ratings Games With My Heart
November 6, 2015
by Mike Couchman, BOOST 101.9/St. Louis
Before we get into this, I'll admit to you: I'm a hypocrite. I'm going to fire a few shots at the ratings game. But, being totally honest, when my station's numbers are up, I geek out! Maybe it's because there are so many things stacked against us that I can't believe the odds were ever in our favor. Or more likely, it's plain old human pride. When Nielsen gives us good news, all glory to God and my team. When things don't move, or when the numbers go south, don't most of us throw up our hands and say, "Well, it's such a flawed system in the first place"?
So, here's my situation: I program what would be considered a niche Christian station in St. Louis, MO. BOOST 101.9: our core sounds are Christian Pop and Hip Hop. We're on a super-translator in the middle of the market, and on two Class C3 rimshots to the northwest and southwest of the market. About 65% of the Nielsen-defined metro survey area can get our signals reliably on most days, in cars; there's very little home/office building penetration. If HD radio ever takes off (HA!), then everyone within 100 miles could hear us on our parent station's HD2 signal. (KLJY, the amazing 99.1 JOY-FM)
We're a PPM market. Extensive pieces have been written about the flaws of the PPM system; I can link you to my favorites if you want, just email me. To sum up the issues:

1. Sample size (this goes for diary markets too). Recent situations in Tampa and Los Angeles proved that one measly meter (just one!) can screw up an entire market. If that's not a house of cards, I don't know what is.

2. Sample diversity. Stations with young audiences and minority audiences have uniquely taken the brunt of PPM pain since 2007. What typical iPhone-carrying person under 30 is going to be caught dead flashing around a pager-like device?

3. The technology itself. PPM has faced legitimate questions regarding how well it encodes in various listening environments:
a. Earbud/headphone listening doesn't get captured at all.
b. PPM tones struggle with various spoken-word audio sources.
c. They don't cut through clearly among certain styles of music.
d. Or with MP3s.
e. Or with HD signals.
f. Stations like mine with signal challenges have also seen poor PPM performance, because the decoders can't separate the PPM tones from the static our devoted P1s graciously ignore.
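To see why one meter matters so much, here's a back-of-the-envelope sketch. The population and panel figures below are illustrative assumptions, not actual Nielsen numbers for St. Louis or any market:

```python
# Rough sketch of how much audience a single PPM meter can represent.
# All figures are illustrative assumptions, not real Nielsen data.

metro_population_12_plus = 2_300_000   # assumed metro 12+ population
in_tab_meters = 900                    # assumed daily in-tab panel size

# Each meter effectively "speaks for" this many listeners:
people_per_meter = metro_population_12_plus / in_tab_meters
print(f"One meter represents roughly {people_per_meter:,.0f} people")

# If a niche station's weekly cume traces back to, say, 12 meters,
# then gaining or losing a single panelist swings the estimate by ~8%:
station_meters = 12
swing = 1 / station_meters
print(f"One meter is {swing:.0%} of that station's entire cume estimate")
```

Under those assumed numbers, one panelist stands in for a couple thousand listeners, which is the "house of cards" dynamic in miniature.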
Let's set aside all my excuses for my station's numbers. Even if you're on a killer signal in the middle of your market with a PPM-friendly format/execution, you still face big challenges. The only way to judge your success by PPM numbers is to play the ratings game at the same level the stations across the street do. And it's almost as complicated as playing with stocks. If you're not consistently chasing after certain types of listeners in certain zip codes at certain times of day, with strategically coordinated on and off-air efforts, spending six figures on your tactics, you're not playing the same game many of your mainstream counterparts are. And if you're a non-com largely funded by generous donors, why should you? Sure, more cume can mean more donors. But nobody has research proving you'll have a better Pledge Drive if you're #1 instead of #5 in your market. The difference between first and fifth place for many markets is often well within the margin of error.
"But," most of us say, "this flawed system is the only one we have." If we can't measure our success by ratings (and revenue), what's left? Actually, plenty. I learned the hard way, back in September 2014.
BOOST 101.9 had been on a steady cume increase, month after month since our spring 2014 launch. Every month a little better than the one before. The Nielsen gods were smiling on us. Until the August numbers came out: our cume plunged by 50%. Was it related to recent signal challenges? Did 30,000 listeners have a secret meeting and all at once decide the novelty had worn off? Were our song and talent choices blatantly wrong? Was our research beyond flawed? Did Nielsen make some drastic panel changes?
Prior to joining the team I asked my boss, Sandi Brown, how we would evaluate our efforts. An unproven format on these signals surely wasn't going to be a ratings winner. Thankfully, she assured me her building was not a place where we lived and died by Nielsen's mood swings.
After placing my sunken heart back in my chest (it had fallen to ankle-level), I set out to see how damaged my station was. I started with our streaming stats. During the same time period Nielsen said half our cume vanished, our streaming numbers were up. In fact, that particular month turned out to be our third biggest of 2014 for streaming. Insert small sigh of relief here. Next, I turned to our app's stats. App activity among our audience was up 42% over the previous month. A record number of new users downloaded our app during this time frame. Our website traffic was steady. Our social media interaction was up, though part of that may be connected to how we were addressing the unrest in Ferguson. In other words, every metric I could tap into besides Nielsen told us things were fine and continuing to grow.
I'll grant you that all of these things added together may not make as much noise as one great, or terrible, set of ratings numbers. What's often said and yet simultaneously overlooked about Nielsen is this: each month (for PPM markets) and each quarter/season (for diary markets) is a snapshot. If you're going to place any faith in ratings data, only measure your station's performance by viewing a few snapshots back to back, over time. Also, notice the language Nielsen themselves preface their numbers with: they are "audience estimates." When you dive into the sausage making and investigate margins of error, the number of diaries/PPMs in your market actually influencing ratings at any given moment, and the way the sample is divvied up, you'll see why "estimate" is quite the key word with ratings.
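To put a rough number on "estimate": the textbook margin of error for a surveyed proportion shrinks only with the square root of the sample size. A quick sketch, using a hypothetical panel size and listening share (neither is a real Nielsen figure):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an estimated proportion p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Suppose a station's measured listening proportion is 5%,
# from a hypothetical in-tab panel of 900 meters:
p, n = 0.05, 900
moe = margin_of_error(p, n)
print(f"Estimate: {p:.1%} +/- {moe:.1%}")
```

With those assumed inputs the band comes out to about plus or minus 1.4 points, which is wider than the gap separating several stations in many markets' rankings.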
You did not just earn a free pass to do terrible radio though. We all should want to excel in every way measurable. Your on-air and off-air efforts should be more excellent than your market's #1 station. (Unless that's you! Good job!) But our self-worth and how we value our stations should not be superglued to ratings alone. They're as fickle as my exercise efforts. Above, you perhaps gleaned a few other ways you can determine if your station is moving in the right direction. (There are more I didn't have room to mention here.)
Perhaps the best measuring sticks are the most intangible ones: Constantly revisit your station's unique mission. Measure your success by how well your team is zoomed in on the mission. And execute it with Colossians 3:23 zeal. "Whatever you do, work at it with all your heart, as working for the Lord." Regardless of your cume or share, you're ultimately trying to please an audience of One.
"Not everything that can be counted counts, and not everything that counts can be counted." -- William Bruce Cameron