Play The Hits: Ignore The Burn
April 12, 2019
Programmers want to know everything possible when making music decisions. That's understandable. The more detail and insight, the more we can fine-tune the playlist. It's part of the pursuit of a perfect radio station. But in the endless chase for more information, we often create stale, boring radio stations. And one cause of that is asking music research respondents about burn.
Asking whether listeners are tired of a song in a survey is useless because it's redundant.
At worst, including a separate question about burn may be damaging scores of hit songs. At best, you don't need the question because you already have the information.
Five Reasons Music Burn Questions Are Useless
When testing music online, survey fatigue is normal. Streamlining the response process improves the listener experience. The easier it is to navigate your survey, the better your results.
In focus groups, it's common for listeners to say they "used to respond" to a station's music surveys. Why have they stopped? The most common response is it "takes too long" or "it's boring".
Yet some stations ask for three separate responses for each song: 1) familiarity, 2) a rating, and 3) whether they're tired of it.
With a survey of 20 or 30 songs, that's an eternity for a music respondent, especially in a world where attention is at a premium.
How long is too long? About five minutes, they tell us.
Eliminate Useless Questions
So why ask questions you don't need?
Think about it: If I'm tired of a song, won't I rate it lower than if I love it? Of course!
Even if the occasional song that is still popular (in score) but "tired of" is starting to rise, does it really matter? As long as listeners love the song, play it! And play it often. When popularity fades, play it less. It doesn't matter if the song is burned or just not as popular. Keep it simple.
Burn is in the data, even if it's not shown as its own value. Breaking it out separately just makes programmers second-guess a song's popularity. And sometimes, including burn as a separate question may cause a valuable song to be dropped.
Music Burn is Relative, But To What?
Whether a listener feels tired of a specific song or not is relative. What does it mean? One respondent may have a lower tolerance for burn than another. The question is subjective, so it's not actionable.
Some respondents have an attitude that all stations play all songs too often. When scoring burn, they aren't reacting to the song as much as protesting repetition. This distorts the data and can screw up results.
Some folks are black and white respondents. They either love it or hate it. They are either tired of it or not tired at all. Others see the world in shades of grey.
They aren't sure whether to rate a song as burned or not. Further, just because I'm tired of a song doesn't mean I don't like it anymore. Nor does it mean I'll turn it off. Today, in a world where so many stations share titles and music is everywhere, fatigue happens quickly. Reacting to what was once considered high burn can cause stations to churn through new songs faster than they should.
Another factor that affects burn scores is the list of songs being tested. When recurrent or gold titles are surveyed next to newer or current songs, perceived burn increases. The fatigue score will be higher than when testing all gold or recurrent titles.
Music Burn is Irrelevant
As you can see, it's impossible to measure whether the audience is actually tired of a song to the point of tuning out.
However, if your song rating system is solid, the information is already in your hands. If listeners are tired of a song to the point they'd tune out, the song will score lower.
How many times have you seen a top-testing song with a burn score of 40 or 50? And how often do you see a poor-testing familiar song with a low burn score? Not very often.
If I don't like the song, hearing it once is too much. If I love it, you can't play it often enough.
Music Burn Happens With Or Without You
Most stations view music research in a vacuum, through their own station's lens. Some even phrase the burn question as "Should we play this song more or less?"
That's a weak question. It doesn't take into account variables that go into answering such a question.
Burn doesn't happen because your station plays a song more often. Nor is it reduced if you play it less. Burn is station-agnostic and cumulative; it increases every time an individual is exposed to a song. They may be hearing it on Spotify, YouTube, other radio stations and television. Or maybe the band plays it in church every Sunday. It doesn't matter. Each exposure contributes to listener perceptions of music fatigue.
There's nothing you can do to manage how often other outlets play songs. Don't let rising burn scores cause you to reduce airplay on otherwise strong titles.
Moving out the hits to make room for new, less familiar and less popular music is a bad trade. Guess what happens then? Yep, tune out and ratings decline.
Music Burn: What's Your Number?
It's impossible to determine the point at which a song becomes too crispy to play. When does the percentage tip a song from power to secondary? Or from on the air to off?
Should burn tolerance be 20%? 25%? 30%? 50%? Why that percentage? It's fueled by your gut. There's nothing wrong with using intuition. But isn't the purpose of a music survey to get data to inform that opinion? So how can you apply the score? At best, it's an inexact science.
Again, if the score is holding up and favorites are high, does burn matter?
Asking About Burn Increases Burn
Finally, just asking about burn can increase burn. There may not be a problem until the respondent is asked to think about it.
Asking "Are you tired of this song?" invites the respondent to re-evaluate the song, when in reality it may not be an issue at all. They may think, "Hmm, now that I think about it, yes I am tired of that song".
So they click the box that says they're tired of it. Or they may be a little tired, but they exaggerate how they feel and click on VERY TIRED just because they were prompted.
It may have nothing to do with behavior, taste or feelings about the song, but asking a separate question distorts the results.
Conclusion
In a perfect world, we could ask many questions about each song. We'd compile a definitive profile that guarantees it's in the perfect rotation. The station would play nothing but popular, familiar hits with no burn and high favorites.
But music, research and radio are messy. And the methods most stations use to gather music data are far from perfect.
Is it worth introducing burn fatigue in a music survey by asking an extra question? Is burn even actionable data? Are you wasting respondents' time and driving them away from your survey?
In the final analysis, what would you rather play:
A song that a third of the audience says they're tired of, but still loves? Or a song that everyone rates as average, but has no burn?
Don't over-think it. Play the hits. Ignore the burn.