Trust Your Gut: Too Much Thinking Leads To Bad Choices (1/26/09)
Don't think too much before purchasing that new car or television. According to a new study in the Journal of Consumer Research, people who deliberate about decisions make less accurate judgments than people who trust their instincts. ...
In five separate studies, the researchers found that better judgments can often be made without deliberation. In the first study, participants rated Chinese ideograms for attractiveness. In a following study, participants were asked to judge paintings that were widely considered high- or low-quality. Subsequent groups of participants rated jellybeans and apartments. In all the studies, some participants were encouraged to deliberate and others to go with their gut.
The more complex the decision, the less useful deliberation became. For example, when participants rated apartments on just three primary characteristics (location, price, and size), deliberation proved useful. But when the decision became more complex (with nine characteristics), the participants who deliberated made worse decisions.
One has to wonder exactly how these "researchers" define "better" or "worse" decisions. Better or worse for whom? The salesperson? Especially in view of the following, it almost seems as though outright deception and manipulation are being advocated:
"For example, if a car boasts one particularly good feature (for example, safety) but has a number of other negative features (for example, expensive, bad gas mileage, poor handling), a car salesman might encourage a potential car buyer to deliberate over the pros and cons of the car, while at the same time emphasizing the importance of safety. In this way, the disturbed weighting of attributes created by deliberation might be used to highlight the one sellable feature and draw attention away from the unattractive features," write the authors.
Consider decisions about home mortgages, for example: they are obviously complex, yet they require careful deliberation. The kind of thinking advocated in this "research" is what made it possible for so many objectively bad home mortgages to be sold to buyers in recent years, with eventually disastrous results.
One wonders whether political scientists have investigated to what extent such "consumer research" studies are taught to and studied by political operatives, and to what extent they adversely affect political choices about candidates and issues.
This strongly echoes an article from mid-2006 in The Guardian, by Johnjoe McFadden. He's at least a reputable molecular biologist and cognitive theoretician.
http://tinyurl.com/dbsdz5
"This strongly echoes an article from mid-2006 in The Guardian, by Johnjoe McFadden. He's at least a reputable molecular biologist and cognitive theoretician."
This is actually a very deep and interesting issue. There are (at least) two sides to it, and a whole lot of oversimplifications.
McFadden makes some good points in his article that you cite. However, I bridle at his subtitle: "The evidence seems to be that the conscious mind isn't much use in making hard decisions".
We have all seen the chaos that results when too many people go with their "intuition" to make hard decisions. Out of that "stupidity of crowds" we get results like the financial catastrophe that we are in the middle of. We also get the election of very bad public officials, sometimes, when the crowd listens only to the emotional appeals of the candidates.
Another example of bad decisions typically made by "intuition" is in mate choice. About 50% of marriages end in divorce, due to bad judgment regarding long-term compatibility and listening only to the "heart" instead of the "head". And who knows how many of the marriages that don't end in divorce are actually unhappy for both parties?
Caveat emptor!
There seem to be a lot of "popular" writers these days who produce books appealing to the public's frustration with the difficulty of making rational choices.
An example is Gladwell's Blink: The Power of Thinking Without Thinking. Another is Surowiecki's The Wisdom of Crowds.
A writer with somewhat more intellectual respectability is Gigerenzer, who's written a number of books on this subject, including Gut Feelings: The Intelligence of the Unconscious.
One theme in all these writings is that human consciousness isn't really aware of much that's going on in the brain.
I have discussed, with approval, that exact theme myself, here. There is also a very good blog post that links to mine here.
Just yesterday, in another context, I wrote a bit about why we should expect that evolution has made our brains in such a way that leaves a lot of room for unconscious "reasoning". I'll try to work that into a post here when time permits.
And yet, I have to keep coming back to point out the significant problems caused by relying only on "gut feelings". We simply cannot afford to give up on rational thinking in making our most important decisions. It's just too risky.
All this may be related to the fact that cognition is not only largely unconscious but also embodied, and memory is to at least some extent "externalized". (I'm thinking of the work by Alva Noë, George Lakoff and others.) We're not simply self-contained symbol-processors.
In other words, we have to consider the possibility that we think reactively because it's our evolved nature to do so, and that you can carry rational analysis only so far. Also, a considerable part of "the problem" is objective ignorance (hence the utility of Bayesian analysis). Yet hive-mentality, even if to some extent selected for, can be resisted. A lot of people knew the economy was a bubble. If I could have gotten my 401(k)s out of the market two years ago, penalty-free, and into (say) CDs, believe me I would've.
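To put the Bayesian point slightly more concretely, here is a toy sketch; the probabilities are invented just to show the mechanics of the update, not to describe the actual housing market:

    # Toy Bayesian update (Python): all numbers invented for illustration.

    prior_bubble = 0.3        # prior belief that the market is in a bubble
    p_sign_if_bubble = 0.8    # chance of a clear warning sign if it IS a bubble
    p_sign_if_not = 0.2       # chance of the same sign if it is NOT

    # Bayes' rule: P(bubble | sign) = P(sign | bubble) * P(bubble) / P(sign)
    p_sign = p_sign_if_bubble * prior_bubble + p_sign_if_not * (1 - prior_bubble)
    posterior_bubble = p_sign_if_bubble * prior_bubble / p_sign

    print(round(posterior_bubble, 2))  # about 0.63: one clear sign shifts belief a lot

Even under objective ignorance, a few such updates can move a belief from "who knows?" to "probably a bubble".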
"we have to consider the possibility that we think reactively because it's our evolved nature to do so, and that you can carry rational analysis only so far."
You make good points, including this one. However, thinking reactively is something that almost all animals do. That much has been evolving for over 500 million years. It has been taken about as far as it can go.
Thinking rationally, reasoning logically, planning ahead, etc. are capabilities rather recently evolved in humans, higher primates, and perhaps a few other species. (Maybe even birds, to some extent.)
But the problem is that, especially in humans, rational thinking usually identifies new kinds of information that could help make a better decision. To continue the process requires going out and collecting that information somehow.
Until very recently, the only way to get additional information was to go consult an older, presumably wiser person who might know more - a tribal elder.
Then writing was invented, followed by books, libraries, computer networks, and Google.
Humans certainly haven't evolved to make the best use of such things. Relying on "gut instincts" therefore means making decisions without information that might be important.
Of course, sometimes "gut instincts" are based on knowledge we acquired long ago and have to some extent "forgotten" that we knew. We may have learned shortcuts and heuristics that guide us to good decisions, even though we don't recall how we came upon them.
And obviously, at some point it's not economical or sensible to keep gathering information. Depending on how important the decision is, one just has to stop at some point and decide.
But what worries me is that too often people stop the information gathering process too soon - and make bad decisions as a result.
You see this all the time in elections, for example. Many people just get tired of the whole process and decide to vote for whichever candidate appeals to them most on an emotional level.
I understand your concern, believe me. However, might it be that "gut intuition" increases in complexity along with overall cognitive capability?
This is a science blog basically, so let me dredge up the intuitive "blue flash" which physical theorists and pure mathematicians sometimes report experiencing. Actually that's a generic term for sudden insight and maybe nobody's ever experienced it precisely that way. But Feynman for example cognized mysterious shapes which at first would make no conscious sense to him. That's the informed genius part. The brilliant professionalism part comes in the unpacking of the vision and its mapping onto some received system of symbolic representation.
The second part is impossible without a great deal of critical thinking. But is the first part all that different from many suddenly coalesced decisions, not all of them good? (Theoreticians and mathematicians usually have plenty of time to spot their conceptual gaffes before they submit the material, but once you've signed the purchase agreement for the pink Lamborghini and driven it home you're pretty much locked in.)
"might it be that 'gut intuition' increases in complexity along with overall cognitive capability?"
"Feynman for example cognized mysterious shapes which at first would make no conscious sense to him. That's the informed genius part. The brilliant professionalism part comes in the unpacking of the vision and its mapping onto some received system of symbolic representation."
One would certainly expect someone like Feynman to have better "gut intuition" than the average person, especially in physics - after all, he was not only brilliant to begin with, but also well trained professionally (as a student of Wheeler).
It seems to me, however, that considering the question of reasoning vs. intuition only for the best minds doesn't necessarily tell us much about people of average ability, and it is hard to draw conclusions even in the special case.
Why is it hard to draw conclusions about how experts think? Because, by definition, such people have already internalized large parts of their domain of expertise. They can solve problems in that domain with seeming effortlessness, since they've probably encountered similar problems before, even if they don't explicitly recall them.
Another problem is that domain wizards like Feynman have only a spotty record when they confront new territory later in their career. After helping invent QED, Feynman contributed relatively little to coming up with the Standard Model, which was the work of somewhat younger people like Gell-Mann, Weinberg, Salam, Glashow, et al.
Or consider Einstein. After he'd mostly completed relativity, his intuition seemed to completely fail him as far as quantum mechanics is concerned.
Yet another problem is that in domains that require high levels of ability in abstract thinking, such as math and physics, the biggest breakthroughs are often made by young people whose intuitions are not encumbered with "too much" information that has already been mined out, approaches that have already been explored to their limits.
Feynman and Einstein, in their early years, are examples of this. So this supports the idea that intuition can be a powerful tool. But unfortunately, this works only for a very few "real" geniuses. Hardly any beginning grad students, who may be equally unencumbered with professional knowledge, go on to repeat the achievements of Feynman or Einstein.
Bottom line: I don't see how the thinking styles of top domain experts reveal much that's useful for the vast majority of us.
I don't want to drag this out because there's really no fundamental disagreement here. Just one cavil ...
ReplyDelete"Or consider Einstein. After he'd mostly completed relativity, his intuition seemed to completely fail him as far as quantum mechanics is concerned."
Unified Field Theory was a disaster. And he said himself that as you age the problem becomes not so much coming up with ideas as being able to spot the bad ones readily. But although EPR is wrong it led Bell to try to prove it right, which, as failures go, wasn't unproductive.
As a sign-off: I haven't scoured all the archives, but a site-search suggests you haven't discussed Leggett-Garg here, or the IQOQI informatics approach and the ongoing Uncertainty-Incompleteness project. Or the stuff coming from Gisin's group. Your thoughts would be interesting.
"I don't want to drag this out because there's really no fundamental disagreement here."
No problem. It's an interesting discussion.
"But although EPR is wrong it led Bell to try to prove it right, which, as failures go, wasn't unproductive."
Even though Einstein was wrong, he certainly clarified the problem in a very productive way. That's very much to his credit.
"I haven't scoured all the archives, but a site-search suggests you haven't discussed Leggett-Garg here, or the IQOQI informatics approach and the ongoing Uncertainty-Incompleteness project. Or the stuff coming from Gisin's group. Your thoughts would be interesting."
I'm not really up on quantum information theory, quantum measurement theory, and so forth, but I am interested in them. One thing, in particular, I'd like to get into more is the Conway-Kochen "Free Will Theorem".
I'd be interested to know what specific questions in this area you find most intriguing.
If you want to discuss this further, email might be the best way (cgd at scienceandreason dot net).