Dan Ariely has written an interesting book, based on his research, called Predictably Irrational (HarperCollins, 2008).
Ariely is a behavioural economist, but his research is really about human behaviour and judgment.
Social psychologists have produced many studies over the past few decades illustrating very similar behavioural phenomena. Ariely's work reminds me specifically of the work of Robert Cialdini, a social psychologist who studied persuasion.
I think this work is important to look at, because it shows that there are powerful factors influencing our decision-making and judgment, factors of which we may not be aware. These factors are not mysterious phenomena residing in unconscious childhood memories, etc.; they are fairly simple. Here are some of Ariely's examples:
1) If a person has to choose between two things which are approximately equal (let's call them "item A" and "item B"), there is about a 50% chance of either one being chosen. Now suppose a third item is added, similar to item A but modestly inferior to it; let's call it "item A-". This third item could be called a "decoy". If a person has to choose one item out of this group of three, item A is chosen much more often than item B (in Ariely's experiments, item A gets chosen about 75% of the time).
These experiments show that our decisions are often strongly influenced by irrelevant comparisons.
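The shift in choice shares can be sketched as a toy simulation. The "dominance bonus" scoring rule below is my own simplifying assumption, not Ariely's model of the effect; it just illustrates how an option that clearly dominates a decoy can pull choices toward itself, even though the decoy adds no real information.

```python
import random

def choose(items, trials=10_000, seed=0):
    """Toy choice model: each simulated chooser scores an item by the sum of
    its two attributes, plus a bonus for each other item it dominates on both
    attributes (the bonus is an assumption for illustration), plus noise."""
    rng = random.Random(seed)
    wins = {name: 0 for name in items}
    for _ in range(trials):
        scores = {}
        for name, (x, y) in items.items():
            dominated = sum(
                1 for other, (ox, oy) in items.items()
                if other != name and x >= ox and y >= oy and (x, y) != (ox, oy)
            )
            scores[name] = x + y + 2.0 * dominated + rng.gauss(0, 1)
        winner = max(scores, key=scores.get)
        wins[winner] += 1
    return {name: wins[name] / trials for name in items}

# Two roughly equal options: A is strong on attribute 1, B on attribute 2.
two = {"A": (5.0, 3.0), "B": (3.0, 5.0)}
# Add a decoy "A-" that is slightly worse than A on both attributes.
three = {**two, "A-": (4.5, 2.5)}

print(choose(two))    # A and B split roughly 50/50
print(choose(three))  # A's share rises well above B's
```

With only A and B, the symmetric scores give each about half the choices; adding A- gives A a dominance bonus and its share jumps, mirroring the direction (though not the exact magnitude) of Ariely's result.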
2) If a price for something is suggested, it forms an "anchor" in our minds, making us more willing to pay about that much, regardless of the item's true value. This phenomenon is exploited in advertising. But I suspect that, as a general principle, we may be influenced to choose something, or to invest a certain amount of energy or commitment into something, based on suggestions, precedents, or personal "anchors", rather than on the "true value" of the thing.
3) People are much more likely to choose something that is "free", even if it is a worse deal than an alternative. Free offers substantially bias judgment. Ariely's studies show this nicely, in a quantitative way.
4) Market norms and social norms are conflicting motivators. Social norms are the healthier and more powerful of the two: motivations based on money are tenuous, shallow, and easily changeable, while motivations based on social goals are deeper and more stable. The corporate trend of optimizing productivity by continuously monitoring worker output is a "market" strategy; on a social level, it is often offensive and demoralizing. If workers have a sense of social belonging in their workplace, and also a sense that their employer will care for them in a time of need, then the health of the entire system will be much stronger.
Social language can itself be a persuasive tactic in advertising, though: ads (for a bank, cable company, or insurer, say) often make it sound as though your relationship with the seller will be something like one with a friend or family member. Such advertising can seem persuasive, but I think most sellers would not behave like a friend or family member if you got sick and couldn't make your payment on time!
Ariely wisely encourages the development of healthier social goals in education: education as a means to participate in the improvement of society, rather than as a means to get higher scores on a standardized test or to attain a higher-paying job.
5) Emotional arousal substantially increases the likelihood of making a risky decision. For example, his experiments showed that randomly selected college students were about twice as likely to consider engaging in dangerous or illegal sexual activities if they were sexually aroused when asked. This phenomenon highlights the need for two types of protection: first, people need to be protected from the potential consequences of rash decisions made in the heat of passion (e.g. being equipped with condoms would protect against the risks of impulsively-chosen sexual activity).
Second--and this is a point that Ariely does not make--it is not enough to learn decision-making skills only while in a cool, "rational" state. Perhaps it is important to teach people, through practice, how to make decisions while in the heat of passion.
I think this is an important idea in a psychotherapeutic process: calm, gentle analysis of thoughts and emotions is valuable (whether this happens in a therapy session or in a CBT journal, etc.) but it may also be necessary to practice rational and healthy decision-making while in an emotionally heated state. This, too, can sometimes happen in therapy sessions, or in CBT journals, etc.
6) Procrastination. Ariely's studies with groups of students showed that a rigid, external imposition of regular deadlines led to the best grades. Requiring students to commit to their own set of deadlines, in advance, led to grades in a middle range. Having no deadlines at all, except for the requirement that all work had to be in by the end of the term, led to the worst grades. Those in the middle group who committed to regularly-spaced deadlines did as well as the first group. This experiment shows that people have a tendency to procrastinate (no surprise here!), and that a commitment to regularly-spaced deadlines is the best way to improve the quality of the work (whether this commitment is chosen by you, or imposed upon you).
I do suspect that there are individual exceptions to this--I'd be curious to see a study examining it--in which some people have a better experience with a bit less structure.
He gives a few good applications of this phenomenon: committing in advance to some kind of care plan (whether it be for your health, your car, your teeth, your finances, etc.) will make it less likely that you will procrastinate or forget to do these tasks (e.g. medical check-ups, oil changes, dental cleanings, etc.). With such a system, everyone benefits (e.g. you stay healthier, your car stays in good shape, the auto mechanics get regular work, etc.). The main problem arises if you are being sold something that you don't really need. The solution is to be well-informed in advance about the type of care that works best for your needs.
A psychotherapy frame is usually a regularly-spaced commitment of one's time--I certainly do find that people I see are more likely to engage in a beneficial therapeutic process if this kind of structure is in place.
7) Ownership. People have a tendency to value things more when they "own" them already (Ariely gives entertaining examples of studies showing this phenomenon in a monetary sense). This can lead to biased decision-making if the "owned" item is not valuable, necessary, or healthy. This is a similar phenomenon to loss-aversion. We don't like losing something, even if that something is not really good for us. Other social psychology research has shown that this principle applies to ideas as well: if we have espoused an idea, or a viewpoint, or an attitude, about something, we are much more likely to "own" this idea, and to stick to it. We are less likely to change our view, even if the view is unhealthy for us. I find such thinking patterns often involved in chronic depression.
This is definitely a phenomenon that occurs in a psychotherapy environment: therapy is an invitation to change. Even if the change leads to a better quality of life, people are resistant to change, and are more likely to hold on to systems of thought, perception, or behaviour, which perpetuate unhappiness.
8) People are more likely to choose things that seem to be disappearing. Ariely again demonstrates this phenomenon, using economic measures, in a clever experiment. We see this in sales tactics all the time, such as when we are warned that some item is selling out quickly, so we had better act soon! In life, we may tend to spend a harmful amount of time, energy, money, and commitment keeping multiple options open: as a result, we may never get very far into any pathway we choose.
9) Stereotypes and expectations substantially affect behaviour and choice. In an amusing experiment involving a blinded beer-tasting test, Ariely showed that college subjects presented with two unlabeled containers actually preferred a beer that had been tainted by 10 drops of balsamic vinegar, over the untainted version. But if the students knew in advance that vinegar had been added, then nobody preferred the "vinegar beer". If we believe--or are persuaded to believe--that something is good or desirable, or that something is bad or undesirable (that "something" could be anything from toothpaste, to a new acquaintance, to a job, to our own self or our own skills), then we are significantly more likely to find our beliefs substantiated.
We need to have ways to "stand outside ourselves" at times, to reduce the biases caused by our own beliefs. I think that this, too, is one of the roles of psychotherapy.
10) Things that cost more tend to have a stronger effect. A more expensive placebo tends to be more effective than a less expensive placebo. This is an important, powerful bias to be aware of. This, too, can be a tool exploited by advertisers, in which the high price of their product is displayed prominently as a signifier of higher quality.
I have one major complaint about this book:
Ariely makes a few statements about medical treatments, including "when researchers tested the effect of the six leading antidepressants, they noted that 75% of the effect was duplicated in placebo controls." (p. 178) This claim is based on a single study, from a minor journal, published over 10 years ago, without considering data from the hundreds or thousands of other publications in the research literature. Furthermore, even if this 75% figure were accurate, the remaining 25% of the effect may be very significant for many suffering people. The psychological impact of Ariely's statement may be to cause skepticism and a dismissive attitude towards certain medical treatments, including antidepressant therapy. Ironically, Ariely would then be persuading people against something based on a tiny, inadequate, and negatively-framed presentation of the evidence.
11) Randomly-chosen college students in Ariely's experiments had a strong tendency to cheat; but if these subjects were reminded of some kind of honour code immediately beforehand, they had a much smaller tendency to cheat. Based on his findings, he encourages a more prominent role for "honour codes" to reduce dishonesty. He observes that cheating is no trifling matter: fraud accounts for much more stolen money and property than all other forms of crime put together. Also, cheating is much more likely and pronounced if it is perceived to be indirect: people will cheat more if some kind of token is involved, even if the token is worth the same amount as actual money. Our society is evolving to use indirect currencies much more (various forms of credit, for example), which will probably increase systemic dishonesty.
The idea of an "honour code" may seem a bit odd or trite, maybe hard to take seriously. But I think its application could be imaginative and important, and could, at least in a small way, address something that is missing in many workplaces, homes, or individual lives. I suggest this not necessarily as a way to reduce dishonesty, but as a motivational tactic that can remind us of ways to live healthily. Many workplaces and lives are so caught up with being busy, competing, and getting through the day that a grounding sense of purpose is rarely contemplated.
An "honour code" in a psychotherapy frame could involve a formal set of statements for oneself, a "mission statement", which could guide choices, motivations, priorities, and attitudes over time.
So it could be an interesting exercise to write down, and answer for yourself:
"What are my morals/values/guiding principles?"
"What is it to be a good person?"
"How can I live honourably in a world which can be harsh and difficult at times, and in a life which can be harsh and difficult at times?"