Why it’s so hard to make good decisions

Axial helps people help themselves. We consider this an important and noble thing to do. Specifically, we provide patients with the information and tools they need to manage their health and stay well between healthcare encounters. Some patients thrive, while others struggle to follow even the most basic care plans. The truth is that we all struggle to make decisions that are consistently in our best interest. We don't always behave rationally, despite what some microeconomic theories suggest.


Two Systems of Thought

The study of decision making sits at the intersection of economics and psychology, in a field called behavioral economics. Much of the work is based on the foundational concept of two thought systems: the automatic system and the reflective system.

  • Automatic: Your automatic system kicks in when you're surprised by the appearance of a snake. You don't stop to contemplate whether it's venomous. You react. The automatic system governs thinking that happens quickly and requires little conscious effort.

  • Reflective: The reflective system is deliberate and slow. Working through a complicated math problem or speaking a foreign language that you haven't quite mastered would both fall into the reflective category.


In the context of patient engagement, the automatic system drives the urge for instant gratification, which often opens the door to all sorts of unhealthy behaviors. You use the reflective system when you're thinking carefully about the long-term consequences of your current actions.

For more on behavioral economics, see Thinking, Fast and Slow by Nobel laureate Daniel Kahneman.


Cognitive Biases and the Impact on Patient Engagement

In Nudge: Improving Decisions about Health, Wealth, and Happiness, University of Chicago professors Cass Sunstein and Richard Thaler illustrate how the automatic and reflective systems of thought combine with cognitive biases to lead to poor decision making.


Anchoring and Adjustment Bias

An anchor is a piece of information that influences your evaluation of subsequent information. In a study done at Duke University, subjects were asked to write down the last two digits of their Social Security numbers. They were then asked to bid on various items such as wine, chocolate, and computer equipment. The subjects with higher two-digit numbers submitted bids that were 60% to 120% higher than those with lower two-digit numbers. The Social Security digits anchored the subjects' thinking even though they had nothing to do with the items up for bid.

Here's an experiment you can try on your own. Add 200 to the last 4 digits of your phone number. Now guess the year in which Attila the Hun sacked Europe. (The answer: A.D. 451.) Chances are that you anchored on your phone number and adjusted up or down from there. The anchoring and adjustment bias turns out to be an important consideration when communicating information about disease risk.

Lesson for Patient Engagement: Be mindful of how your patient engagement program introduces anchors that might influence a patient's perception of disease risk. When an anchor is introduced, consider how it might influence immediate and longer-term personal health decisions and make appropriate adjustments.


Availability Bias

The availability bias leads you to judge the probability of an event by how easily examples come to mind. After a flood, sales of flood insurance policies go up. This is your automatic thought system at work, serving up a vivid picture of the threat. The problem is that the occurrence of a recent flood doesn't necessarily make a future flood more likely. After reading about a recent plane crash, do you hesitate to book air travel? Automobile accidents increased in the weeks following September 11, 2001, as travelers bypassed air travel and took to the road, the riskier form of travel.

Lesson for Patient Engagement: Call out the availability bias when presenting healthcare statistics. Help patients recognize it and overcome it.


Representativeness Bias

Stereotypes are a feature of the representativeness bias. As an example, let's use a hypothetical woman named Linda. Linda is 31, bright, and very outspoken. In college, she majored in philosophy and was involved in many social justice demonstrations. Which do you think is the more probable description of Linda today: i) a bank teller, or ii) a bank teller who is active in the feminist movement? If you chose "ii", you were hit by the representativeness bias. A single event (bank teller) is always at least as likely as the conjunction of two events (bank teller who is active in the feminist movement). The representativeness bias causes you to judge probabilities by the degree to which A resembles B instead of judging them on a more objectively rational basis.
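
To see why choosing "ii" can never be right, here is a quick sketch of the conjunction rule in Python. The probabilities are made-up numbers for illustration; the inequality holds no matter what values you pick.

    # The conjunction rule: the probability of two events together can
    # never exceed the probability of either event alone.
    # These probabilities are illustrative assumptions, not measured data.

    p_bank_teller = 0.05            # chance Linda is a bank teller
    p_feminist_given_teller = 0.30  # chance she is also a feminist activist

    # P(teller AND feminist) = P(teller) * P(feminist | teller)
    p_both = p_bank_teller * p_feminist_given_teller

    assert p_both <= p_bank_teller  # holds for ANY choice of probabilities
    print(f"P(teller) = {p_bank_teller}, P(teller and feminist) = {p_both}")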

Lesson for Patient Engagement: Within healthcare, symptomatic patients can easily detect patterns in random noise. This can become dangerous when it leads to self-diagnosis. Help patients understand that many underlying illnesses share common symptoms. The human body has a limited vocabulary when communicating that something is wrong.


Optimism and Overconfidence Bias

Unrealistic optimism is a common feature of human life. Without it, few would undertake the risky endeavors that lead to things like Tesla Motors and Apple Computer. Why would anyone be so foolish as to start a new business? After all, 65% of them fail. Do you know anyone who has founded a new venture? (I do!) Founders are typically a confident lot. Problems arise when we become overly optimistic: 75% of drivers think they are above average, and at one university, 94% of professors considered their own teaching skills to be above average. In healthcare, we see this all the time in the form of overconfidence about immunity from disease. Consider the smoker who is aware of the statistical risks of smoking, but who believes that lung cancer and heart disease are things that happen to other people.

Lesson for Patient Engagement: When patients overestimate their personal immunity from harm, they fail to take the appropriate preventive action. Help patients view themselves, and their personal risk profiles, through the eyes of third parties.


Loss Aversion Bias

People hate losing something about twice as much as they like gaining the same thing. Consider a group of people asked to play a coin-flip game. They are told that if the coin lands tails, they must pay $100. They are then asked how large the "heads" payout would need to be for them to agree to play. The typical response? About $200. Loss aversion produces inertia in the sense that you are reluctant to give up what you have for fear of incurring losses, even when making the trade is in your best interest.
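
The arithmetic behind that $200 answer can be sketched with the simple piecewise-linear value function from prospect theory. This is a minimal sketch, assuming a loss-aversion coefficient of 2; the function and coefficient are illustrative, not taken from the original experiment.

    # Minimal sketch of loss aversion, assuming the piecewise-linear
    # value function from prospect theory with a coefficient of 2.

    LOSS_AVERSION = 2.0  # losses loom about twice as large as gains

    def subjective_value(outcome: float) -> float:
        """Felt value of a dollar outcome; losses get extra weight."""
        return outcome if outcome >= 0 else LOSS_AVERSION * outcome

    def gamble_appeal(heads_payout: float, tails_loss: float = 100.0) -> float:
        """Expected subjective value of a fair coin flip."""
        return 0.5 * subjective_value(heads_payout) + 0.5 * subjective_value(-tails_loss)

    # A $150 payout is a good bet in expected dollars, but it still
    # *feels* like a loser; the flip only breaks even at ~$200.
    for payout in (100, 150, 200, 250):
        print(f"heads pays ${payout}: felt value = {gamble_appeal(payout):+.0f}")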

Lesson for Patient Engagement: Consider using this bias to your advantage through gaming elements that enable patients to accumulate some kind of incentive currency by adhering to a care plan. Nonadherence could then result in the loss of the incentive.
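
As a hypothetical sketch of such a mechanic (the class, point values, and rules below are invented for illustration, not a description of any real program), the key design choice is to endow the balance up front so that missed tasks register as losses rather than missed gains:

    # Hypothetical loss-framed adherence incentive.

    class AdherenceWallet:
        ENDOWMENT = 500        # points granted up front (illustrative)
        MISSED_TASK_COST = 25  # points forfeited per missed task

        def __init__(self):
            self.balance = self.ENDOWMENT

        def record_task(self, completed: bool) -> None:
            if not completed:
                # Taking points away from an existing balance leans on
                # loss aversion harder than withholding a future reward.
                self.balance = max(0, self.balance - self.MISSED_TASK_COST)

    wallet = AdherenceWallet()
    wallet.record_task(completed=False)  # e.g., a skipped medication dose
    print(wallet.balance)                # 475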


Status Quo Bias

This is the bias that causes you to take your "regular" seat in a conference room. It is what prompts you to choose default options. For example, a study of college professors' pension plans showed that more than half made no adjustments to their portfolios over the lifetime of the plan. Over a lifetime, these allocation decisions can result in material changes in personal net worth. Nevertheless, the bulk of professors stuck with the default option presented by the pension fund.

Lesson for Patient Engagement: The majority of patients will live with whatever the default settings are. Think carefully about the defaults in your patient engagement program, especially in patient-facing software applications.
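
As a hypothetical sketch (the settings object and field names are invented for illustration), the practical upshot is that health-critical features should ship on by default, so status quo bias works for the patient rather than against them:

    # Hypothetical patient-app settings: whatever ships as the default
    # is effectively the choice most patients will make.

    from dataclasses import dataclass

    @dataclass
    class EngagementSettings:
        medication_reminders: bool = True    # opt-out, not opt-in
        refill_alerts: bool = True
        weekly_progress_summary: bool = True
        marketing_messages: bool = False     # off by default: not health-critical

    # Most patients never open the settings screen; status quo bias
    # does the rest.
    settings = EngagementSettings()
    print(settings)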


Framing

Suppose you are deciding whether or not to have surgery. The doctor tells you that "of the 100 people who have this operation, 90 are alive after five years." How would you react? Now consider the same statistic stated differently: "of the 100 people who have this operation, 10 are dead after five years." Your automatic system will likely kick into gear. I don't want to die. These hypothetical surgery statements have been studied, and it turns out that people react very differently to them. This is called the framing bias. According to Thaler and Sunstein, framing works because people tend to be somewhat mindless, passive decision makers. We don't engage our reflective system to do the work of checking whether the statements actually differ in content or have simply been reframed.
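
To make the equivalence concrete, here is a small sketch (the function names are invented for illustration) that renders both framings from the same underlying number:

    # Two framings of one statistic: the content is identical; only the
    # reference point changes.

    def survival_frame(survivors: int, total: int = 100) -> str:
        return (f"Of the {total} people who have this operation, "
                f"{survivors} are alive after five years.")

    def mortality_frame(survivors: int, total: int = 100) -> str:
        return (f"Of the {total} people who have this operation, "
                f"{total - survivors} are dead after five years.")

    # Both sentences carry exactly the same information (90 alive means
    # 10 dead), yet people react to them very differently.
    print(survival_frame(90))
    print(mortality_frame(90))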

Lesson for Patient Engagement: Carefully consider the use of language in your patient engagement program. Frame choices so that people's decisions are aligned with their long-term goals.


Bottom Line

Patient engagement is not always easy; however, investing in solutions that reflect the realities of human nature can generate a major financial return. These solutions can also keep people healthier, which is the goal of our healthcare system. Right?