Friday, September 24, 2010

10 PR Measurement Mistakes And How To Avoid Them

Editor's Note: The following is a well-written guest blog post from Katie Delahaye Paine, chief executive officer of KDPaine & Partners, LLC, a New Hampshire-based research consultancy that provides public relations (PR) measurement and accountability for corporations, non-profits, and government agencies worldwide. This article first appeared in The Measurement Standard, a monthly newsletter published by Paine, and generally regarded as the world's first and most comprehensive PR measurement publication.
__________________________________________________

Despite the best-laid plans, public relations measurement programs can sometimes go awry. You can't always anticipate how everything will go, and your elegant research design rarely plays out quite as you planned. Let's face it: unforeseen problems and errors can creep in, and part of your job is to figure out how to get the job done anyway.

But there are certain errors your program just won't survive. These mistakes will ruin your data or analysis and leave you with no options but to learn an expensive lesson and start over. Here are 10 fatal research errors to avoid:

1. Clipping systems that miss clips. We won't name names, but you should regularly test your provider. Do what we call a "Pub/Month" check: Look back over the statistics for the last year and see on average how many articles you get in your key publications. If you are below that for the current month, or if you have zero clips for the month, someone's probably missing your clips.
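The "Pub/Month" check above is easy to automate. Here is a minimal sketch, assuming you can export per-publication clip counts from your provider; the publication names, counts, and the 50% alert threshold are illustrative assumptions, not part of the original advice.

```python
def pub_month_check(history, current, threshold=0.5):
    """Flag publications whose current-month clip count is zero or falls
    below `threshold` times their trailing 12-month average."""
    flagged = {}
    for pub, monthly_counts in history.items():
        avg = sum(monthly_counts) / len(monthly_counts)
        count = current.get(pub, 0)
        if count == 0 or count < threshold * avg:
            flagged[pub] = (count, round(avg, 1))
    return flagged

# Twelve months of clip counts per key publication (invented numbers)
history = {
    "The New York Times": [4, 6, 5, 7, 5, 6, 4, 5, 6, 5, 7, 6],
    "Wall Street Journal": [3, 2, 4, 3, 3, 2, 3, 4, 3, 2, 3, 3],
}
current = {"The New York Times": 5, "Wall Street Journal": 0}

print(pub_month_check(history, current))  # WSJ has zero clips this month
```

A zero for any key publication almost always means missed clips rather than a genuinely silent month, so it is flagged regardless of the threshold.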

2. Dirty data from your content provider. This means errors like not differentiating between nytimes.com and The New York Times. Again, check the data on a monthly basis to make sure that it includes what it's supposed to.

3. Bad circulation figures (impressions). Being off by 10 or even 100 doesn't matter much, but we've seen cases where providers have moved commas and made the NY Times circulation 14 million instead of 1.4 million. Do a reality check.
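That reality check can be scripted: compare reported impressions against a reference figure you trust and flag anything that differs by an order of magnitude. This is a hedged sketch; the reference figures and the 5x tolerance are assumptions chosen to catch slipped decimals, not small legitimate fluctuations.

```python
def sanity_check(reported, reference, factor=5):
    """Return publications whose reported circulation differs from the
    reference figure by more than `factor` in either direction."""
    suspect = []
    for pub, value in reported.items():
        ref = reference.get(pub)
        if ref and (value > ref * factor or value < ref / factor):
            suspect.append(pub)
    return suspect

# Trusted reference figures vs. this month's provider data (illustrative)
reference = {"The New York Times": 1_400_000, "USA Today": 1_800_000}
reported = {"The New York Times": 14_000_000, "USA Today": 1_750_000}

print(sanity_check(reported, reference))  # the moved-comma error is caught
```

A 5x tolerance is deliberately loose: real circulation moves by percentages, while a moved comma moves it by a factor of ten.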

4. Corporate articles that end up in product categories and vice versa. This needs to be checked monthly or even weekly for the first six months to make sure that it reflects reality.

5. An unclear definition of tonality. Ask three people what a positive article is and you'll get three different answers. We define it as "leaves the reader more likely to do business with, invest in, or go to work for the company." How you define it is your own business, just make sure it's consistent.

6. An unclear understanding of key messages. Again, do a monthly reality check.

7. Not comparing apples to apples in a competitive analysis. This includes errors like looking at your own local coverage but not the local coverage of your competition.

8. Not being clear about the universe of publications. Make up a written list of search terms as well as a list of the print/online publications and social media outlets to be covered.

9. Not having control of the names and mailing list for your survey. Beware of merging lists: You can end up with two surveys in one household just because the middle initial is left off one name but not the other.
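One way to catch the middle-initial problem before mailing is to normalize names when merging lists. This is a minimal sketch under the assumption that records are (name, address) pairs; the normalization rule (keep first and last name only, ignore case and punctuation) is an illustration, not the only defensible choice.

```python
import re

def dedupe_key(name, address):
    """Build a merge key that ignores middle names/initials, case, and
    punctuation, so near-duplicate records collapse to one entry."""
    parts = re.sub(r"[^\w\s]", "", name).lower().split()
    if len(parts) > 2:
        parts = [parts[0], parts[-1]]  # drop middle names and initials
    return (" ".join(parts), address.lower().strip())

records = [
    ("John Q. Smith", "12 Elm St"),
    ("John Smith", "12 Elm St"),
    ("Jane Doe", "12 Elm St"),
]
unique = {dedupe_key(n, a): (n, a) for n, a in records}
print(len(unique))  # the two John Smith entries collapse into one record
```

Aggressive normalization can also merge two genuinely different people, so review the collapsed pairs by hand before dropping anyone from the list.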

10. Not being clear about what social media you want to measure. Are you interested in user reviews, Facebook, Twitter, blogs, Foursquare, or all of the above? Choose the outlets your target audience uses.

© 2010 Ragan Communications, Inc. Reprinted courtesy of Ragan Communications, Inc. and KDPaine & Partners, LLC. All rights reserved.
