“Measurement is only really meaningful when it’s related to desired outcomes”: Q/A with Mike Daniels, Principal, The Measurement Practice

Mike Daniels, Principal at The Measurement Practice

Mike Daniels, Principal at The Measurement Practice, shares in this post his views on how we can achieve best practice for performance measurement; he explains why combining research, communications and technology expertise is a real challenge for measurement, and comments on the latest industry criticism around AVEs, sparked by Meltwater’s white paper promoting AVEs for PR measurement. Last but not least, Mike outlines what The Measurement Practice is and what it can do for FIBEP members.

Q: Please describe what The Measurement Practice is, what it does, and what it can do for FIBEP members?

MD: In simple terms, The Measurement Practice helps clients maximise the value of their communications measurement and research.

We’re a team of leading communications, evaluation, research and technology experts who evaluate the effectiveness of clients’ communications, enhance the business value it delivers and inform ongoing communications strategy and planning. Our consultancy is both objective and practical.

The Measurement Practice offers a range of services:

  • measurement health-checks;
  • communication & business outcomes audits;
  • stakeholder workshops & staff training;
  • measurement programme scoping & supplier identification;
  • technology reviews;
  • data integration planning.

We’re totally independent of any measurement provider, and can therefore offer businesses and organisations truly objective analysis, which ensures relevant, pragmatic and strategic recommendations, designed to help achieve significantly enhanced ROI from their communications and research investments.

The Measurement Practice grew out of clients – users of communications measurement and research data – telling me of their frustration at what we call the “aspiration gap”: the gap between the promise of improved communications performance, enhanced operational efficiency, and meaningful measures of ROI for their communications investment, and what they actually receive.

All too often, what the client gets from their supplier is just a batch of data, charts and reports, with precious little guidance about how to transform that data into relevant, usable decision support. They tell us that reports are way too dense, focused on the metrics and KPIs that can be measured rather than those that should be measured, and, most damningly, are divorced from the client’s genuine business imperatives.

We see TMP as a bridge between end-user clients and their measurement/research partners. We work with clients who have an existing programme, giving them an objective, totally independent analysis of how far it aligns to their needs and goals. And we support clients looking for measurement for the first time, or wishing to reboot their existing programme, by defining their goals in ways that can be implemented effectively. Our recommendations, which aim to be both practical and effective, are based on detailed analysis coupled to our collective multi-decade experience as practitioners.

TMP’s team combines research, technology and communications skills, and because we are not beholden to any particular supplier, methodology or technology, we can provide genuine answers to the questions in front of us.

FIBEP members can utilise our unique team experience and expertise across the customer lifecycle. We can help members deliver greater insight to their clients. Measuring what is genuinely important to clients, presenting results in a compelling manner, and ensuring relevance to the needs of all stakeholders, increases client satisfaction and reduces churn rates. We’d be delighted to support FIBEP members as their on-demand measurement experts!

Additionally, we have clients looking to TMP for measurement partner recommendations – we are actively building our own database of suppliers, which will certainly include FIBEP members.

Q: How to achieve best practice for performance measurement? 

MD: In my view, best practice stems more from ensuring a culture of research rigour and from understanding a client’s real needs rather than any one methodology being “better” than another.

There is no single one-stop definition of best practice. Rather, it resides in following certain key processes – not all of which will be necessary or relevant to every measurement client.

As a first step, it’s very important that clients understand that for any given measurement requirement, there are three critical components to balance: quality, speed and cost.

These trade-offs mean that organisations can benefit from any two of the three drivers – quality, speed and cost – but not all three at the same time. So, you can have a high-quality provider delivering results in the shortest possible time, but it’s going to be, other things being equal, more expensive than a service delivering either lower quality or at a slower pace. Equally, to reach a lower cost level, it is likely that either quality or speed will be sacrificed to some extent. There’s more about this in an IPR Measurement Commission paper, “International Media Analysis Made Simple”, that I co-authored.

In reality, “best practice” is thus constrained by whichever two of these three elements are deemed priorities. Additionally, “performance” can also be applied to either the outcomes of a programme (impact on sales, perceptions, voting intentions or other behaviours etc) or processes (internal efficiencies, improved cost/thousand, speed of response etc.).

However, in general, best practice in measurement needs two things. Firstly, adherence to the Barcelona Principles, especially in the areas of research consistency, and focus on outcomes. And secondly, commitment to certain critical operational processes.

After many years of evangelising, it’s now taken as given that measurement is only really meaningful when it’s related to desired outcomes. Reviewing and agreeing the client’s goals is therefore a prerequisite for any successful, sustainable programme, as this will identify the critical metrics to be included.

Subsequently, best practice demands a measurement framework that captures all relevant metrics at an appropriate level of granularity. The coding frame should be generated from the initial outcomes review. It’s critical to apply basic research standards, especially consistent coding and media source sampling. Be sensitive to different media channels – metrics for “traditional” media are very different from social media; don’t mix and match data from incompatible channels. Create reports that are relevant, visually clear and that adhere to the maxim – “less is more”. Your clients will certainly benefit from this approach!

Q: Combining research, communications and technology expertise sounds like a perfect match, but does it really work in practice?

MD: I believe that it’s only by bringing these three elements together that we can get close to meeting the real challenge for measurement.  In a nutshell, this challenge was posed to me many years ago by one of my biggest clients: “what I need from you, Mike, is to tell me what I don’t already know”. We need all three disciplines working in tandem to meet that challenge – to surface new insights that can help clients manage their forward activities and not just audit their past actions.

I would go further and argue that measurement is stuck in the 20th century largely because it is so far behind in bringing these disciplines together!

Traditional measurement companies, such as the one I co-founded 25 years ago (Report International, which became Salience Insight and is now Carma), are of course heavy users of computer systems to support their media capture and scraping requirements – especially as social media can only be captured digitally.

However, using technology as an analytics tool per se – the flavour of the month at the moment – has all too often been left to software engineers, who still tend to implement platforms with limited analytical insight, largely because they do not start from a research position.

And communicators, with their understandable bias towards words rather than numbers, find it tough sometimes to make sense of data, whether from research- or technology-focused suppliers. However, communicators need that insight in order to manage their changing media landscape effectively – and research companies must understand what communicators need, in order to provide real insight and not just data.

Critical to the success of my own measurement business was my experience in building a PR business in the 80s and 90s. When I moved into research, I brought my comms perspective, combined with some knowledge of research, to the development of programmes that really got under the skin of my PR clients. I could speak their language and created and delivered metrics that resonated with them, and more importantly, that they could immediately use for tactical (programme development) and strategic purposes (planning, branding development).

At TMP, we believe that all three disciplines absolutely need to work together – that the sum of their parts is far, far greater than each individually. Communicators inform the development of the code frame; researchers ensure the code frame can be rigorously implemented, and technologists surface new insight and create presentation platforms that dramatically increase the power and usability of the data.

Q: How would you comment on the latest industry criticism around AVEs following the “infamous” Meltwater white paper that was promoting AVEs for PR measurement?

MD: I have previously described AVE as a “zombie metric”. Since before my time as Chair of AMEC, the measurement industry has made, and continues to make, strenuous efforts to kill off this most maligned metric. Yet in a PRWeek UK survey from last year, a third of all communicators indicated that they still used AVE as a key performance metric.

The reality, of course, is that clients in the comms space continue to ask for AVEs – after all, it’s a simple number to understand (at least superficially), it has a suitably inflated dollar or euro symbol attached, and from their perspective, it seems to be “good enough” to indicate the “value” of their work. I believe clients don’t care terribly much about the protestations of the industry – as long as no-one further up their food chain questions its value, they will continue to demand it. And the truth is, of course, that if clients demand AVE, there are very few measurement companies that would, on principle, absolutely refuse to provide it, particularly as it is a relatively easy measure to produce.

My colleague in TMP, Guy Corbet (a corporate comms expert), has written a paper arguing that, from a communicator’s perspective, in some circumstances, AVE (or perhaps better, ACE – advertising cost equivalent) does capture a real performance metric, or at least is no worse at doing so than many now-fashionable measures. It will be published shortly on www.measurementpractice.com.

The Meltwater piece itself, in reality, said nothing new – Meltwater have never been ashamed of their use of AVE. Indeed, if a third of bill-paying clients still use AVEs, can you really blame them? Of course, from a research perspective, the lack of detail and liberal use of unexplained assumptions (e.g. the multiplier figure they quote) undermined any intellectual credibility they might have been seeking. But I am sure they weren’t looking for that!

More important, I think, is the fact that the only significant reaction to the piece came from the industry itself. Clients were conspicuous by their absence from the debate. And that, I think, is proof that weaning communicators onto more meaningful, outcomes-based metrics remains a long, hard road, requiring multiple strategies – from proscribing AVEs in awards entries, to building professional measurement practices into PR university courses and professional development, through to measurement companies showing clients that there are better and more meaningful metrics to reflect and improve their communications effectiveness.

If TMP can support these efforts by improving client education, and helping FIBEP companies build more effective measurement programmes, without recourse to AVE, then we will consider ourselves on the right track!

About Mike Daniels, Principal at The Measurement Practice
Mike Daniels has over 25 years’ experience in research, PR and marketing communications, and co-founded Report International, a pioneer & leader in media analysis research. He is a past Chairman and a founding Lifetime Fellow of AMEC, and a member of the IPR Commission on Measurement. Mike focuses on clients’ business and communications objectives.

You can reach Mike via email, follow him on Twitter, or discover more on LinkedIn.

The Measurement Practice

The Measurement Practice is a virtual consulting service helping clients maximise the insight and business value of their measurement and research data. We also provide practical support to clients seeking a new or refreshed research programme. TMP is a sponsor of the AMEC International Summit on Measurement in London, June 2016.

For more about The Measurement Practice, please visit their website, reach them via email or learn more on LinkedIn.
