Brain Juicer, a market research company that thrives on creativity and prides itself on its innovative methods, keeps a blog about all things human behavior and behavioral research.
Their blog mostly features market research experiments conducted by Brain Juicer and other companies, but it also occasionally throws in industry-related cartoons and ads.
One of the entries I found most interesting and relevant to our own course material had to do with the effectiveness of click-through advertisements. Citing an article originally posted on AdAge, the Brain Juicer post "I belong to the Blank Generation" details an experiment that measured the number of click-throughs on six blank ads and then compared those click-through rates to the rates for other, branded ads.
The researchers ultimately found that the click-through rates for the blank ads did not differ significantly from the rates for actual branded ads. This finding raises the question: are click-through rates reliable metrics of online behavior?
To make sure the results were accurate, the researchers used various methods to detect any potential click fraud; these methods included tracking "...hovers, interactions, 'mouse downs,' heat maps--everything. (Heat maps detect click fraud because bots tend to click on the same spot every time.)"
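The heat-map check the researchers mention rests on a simple idea: because bots tend to click the exact same spot every time, a single coordinate that accounts for an outsized share of all clicks is a fraud signal. A minimal sketch of that idea, assuming clicks are logged as (x, y) pixel pairs (the function name and threshold here are illustrative, not the researchers' actual method):

```python
from collections import Counter

def flag_suspect_coordinates(clicks, threshold=0.5):
    """Flag click coordinates that receive an outsized share of clicks.

    Bots tend to hit the exact same pixel repeatedly, so any (x, y) pair
    accounting for at least `threshold` of all clicks is treated as suspect.
    `clicks` is a list of (x, y) tuples.
    """
    counts = Counter(clicks)
    total = len(clicks)
    return [xy for xy, n in counts.items() if n / total >= threshold]

# Human clicks scatter across the ad; a bot hammers one pixel.
human_clicks = [(12, 40), (13, 41), (200, 90), (55, 10)]
bot_clicks = [(100, 100)] * 6
print(flag_suspect_coordinates(human_clicks + bot_clicks))  # → [(100, 100)]
```

A production system would bin nearby pixels together and combine this signal with the hovers, interactions, and "mouse downs" the article mentions, but the core heuristic is just this concentration check.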
The results suggested that roughly 4 in every 10,000 impressions produce an unintentional click. The research also indicates that online noise undermines the reliability of click-through rates as a metric: the clutter users encounter online leads to mistaken clicks, distorting the picture of intended behavior and rendering genuine behavioral data almost indistinguishable from the surrounding noise.
Brain Juicer sums it up best: "We are great believers in focusing on behaviour, and that changing behaviour should be a research outcome. But - especially online - there is an awful lot of tempting behaviour to measure, and it's easy to be seduced by that. 'If you can't measure it, you can't manage it,' the gurus tell us, and they sound very pragmatic. But it doesn't make 'If you can measure it, you can manage it' any truer. A click seems concrete, but may be as insubstantial as... a blank advert."
These findings further reinforce the importance of working backwards in research: it is more important to focus on how data will be applied than on how much data can be acquired. Just because something can be measured doesn't mean the measurement is reliable, or even relevant to any business objective.