Stated preferences do not align with revealed preferences


A fancy way of saying that you cannot necessarily believe what people say they think in a survey.

For example, in a survey on giving more money to charity, people are likely to express the opinion that of course they'd give more to charity if X, Y or Z were to happen. However, it is unlikely that everyone expressing that opinion would, in fact, give more money to charity when Y eventually does happen.

Another example: questions such as "how much do you drink?" or "would you like to drink less?" tend to receive answers the person being questioned thinks will paint them in a better light, rather than a 'truer' answer.

To give a concrete example, Wakefield, Hayes, Durkin and Borland (2013) has been used by tobacco control as justification that plain packaging was a success in Australia, because people reported thinking about quitting more when smoking cigarettes in plain packaging. Unfortunately, no one thought to follow up with these thoughtful people to find out whether they actually did give up.