Is research really helping marketing become an evidence-based profession?
Healthcare professionals (HCPs) know the term 'hierarchy of evidence'. At the bottom sits word of mouth, while randomised controlled trials and meta-analyses sit at the top. The hierarchy is one of the things taught during their professional training and most HCPs pride themselves on delivering evidence-based practice.
But what about the evidence from market research, particularly in areas where the facts are less clear-cut? There is far less discussion, let alone consensus, about what 'evidence-based practice' means here.
David Ogilvy – who built one of the world's greatest advertising agencies – said: “The trouble with market research is that people don't think how they feel, they don't say what they think and they don't do what they say.”
Does this hold true in healthcare? Is it conceivable that HCPs are vulnerable to giving the researcher an answer that may not represent the truth, whether deliberately or not?
Commercial decisions may not involve life-and-death judgements, but poor decisions can be costly. Isn't it time we evaluated how market research evidence is collected and the types of decisions that it can support?
All evidence is not equal
Direct questioning, or 'self-report', is used to collect the majority of data in market research. The great thing about asking direct questions is that respondents usually have an answer; but should we always accept those answers as the truth?
Just as consumers under-report their use of alcohol or illegal drugs, so HCPs may modify their responses when discussing what they really do in practice, or the influence that marketing materials might have on their prescribing.
Market research questions may demand a degree of self-insight or awareness that respondents simply cannot have. Direct questions are particularly unlikely to elicit accurate responses when the behaviour in question has become automatic or routine; in this situation the responses are most likely to be post-hoc rationalisations. And yet routine behaviour is often precisely what we most want to learn about.
For such difficult questions, we should look for other types of evidence.
Moving on from 'self-report'
Where self-report fails, newer techniques are beginning to provide useful answers. One technique we use is Implicit Research, a validated psychological tool first used in the US to test for racial bias in job interviews. (Google 'Implicit Research' to see the wealth of academic literature supporting it.)
Put simply, it reveals what a respondent really feels deep down, allowing the researcher to have a far more honest conversation with them. As a result, we can be far more confident that we are taking the right course of action, not simply the path of least resistance.
Of course, the self-report element of research will always be important, but we believe the time has come to stop relying on self-reported evidence to inform every marketing decision. Sometimes making the right decision demands stronger evidence: you cannot simply ask respondents; you need a way to find out what they really think.