Fallacies, Illusions, and Biases (Part 2)

I’m working my way through Rolf Dobelli’s The Art of Thinking Clearly by reading a few sections each morning. Below are my notes on sections 12-23. My notes on sections 1-11 are in Part 1, below.

  1. “It’ll-get-worse-before-it-gets-better” fallacy: A variant of confirmation bias. If the problem gets worse, the prediction is confirmed. If the situation improves unexpectedly, the customer is happy and the expert attributes it to his prowess. Look for verifiable cause-and-effect evidence instead.
  2. Story bias: We tend to impose meaning on things, especially things that seem connected. Stories are more interesting than details. Our lives are mostly a series of unconnected, unplanned events and experiences. Looking at these ex post facto and making up an overarching narrative is disingenuous. The problem with stories is that they give us a false sense of understanding, which leads us to take bigger risks and urges us to take a stroll on thin ice. Whenever you hear a story, ask: Who is the sender, what are his intentions, and what does this story leave out or gloss over?
  3. Hindsight bias: Possibly a variant on story bias. In retrospect, everything seems clear and inevitable. It makes us think we are better predictors than we actually are, causing us to be arrogant about our knowledge and take too much risk. To combat this, read diaries, listen to oral histories, and read news stories from the time you are looking at. Check out predictions from the time. And keep your own journal with your own predictions about your life, career, and current events. Compare them later to what actually happened to see how poor a predictor we all are.
  4. Overconfidence effect: We systematically overestimate our knowledge and our ability to predict, on a massive scale. The difference between what we know and what we think we know is huge. Be aware that you tend to overestimate your knowledge. Be skeptical of predictions, especially from so-called experts. With all plans, favor the pessimistic scenario.
  5. Chauffeur Knowledge: There are two types of knowledge: real knowledge (deep, nuanced understanding) and chauffeur knowledge (enough knowledge to put on a show, but not enough understanding to answer questions or make connections). Distinguishing between the two is difficult if you don’t understand the topics yourself. One method is the circle of competence. True experts understand the limits of their competence: the perimeter of what they do and do not know. They are more likely to say “I don’t know.” The chauffeurs are unlikely to do this.
  6. Illusion of Control: Similar to the placebo effect. The tendency to believe that we can influence something over which we have absolutely no sway. Sports, gambling, etc. Also: elevator buttons, crosswalk buttons, and fake thermostat dials. This illusion led prisoners (like Frankl, Solzhenitsyn, etc.) to not give up hope in concentration camps. The Federal Reserve’s federal funds rate is probably a fake dial, too. The world is mostly an uncontrollable system at the level we currently understand it. The things we can influence are very few.
  7. Incentive Super-Response Tendency: People respond to incentives by doing whatever is in their best interest. Extreme examples: Hanoi residents breeding rats to collect a per-rat bounty; the Dead Sea Scrolls being torn apart because finders were paid per fragment. Good incentive systems take into account both intent and reward. Poor incentive systems often overlook and even corrupt the underlying aim. “Never ask a barber if you need a haircut.” Try to ascertain what actions are incentivized in any situation.
  8. Regression to Mean: A cousin of the “It’ll-get-worse-before-it-gets-better” and Illusion of Control fallacies. Extreme performances are usually followed by less extreme ones. There are natural variations in performance. Students are rarely always high or low performers; they cluster around the mean. Thinking we can influence these high and low performers is an illusion of control. (A short simulation follows this list.)
  9. Outcome Bias: We tend to evaluate decisions based on the result rather than the decision process. This is a variant on the Hindsight Bias. Only in retrospect do signals seem clear. When samples are too small, the results are meaningless. A bad result does not necessarily indicate a bad decision, and vice versa. Focus on the reasons behind actions: Were they rational and understandable?
  10. Paradox of Choice: A large selection leads to inner paralysis and also poorer decisions. Think about what you want before inspecting existing offers. Write down the criteria and stick to them rigidly. There are never perfect decisions. Learn to love a good choice.
  11. Liking Bias: The more we like someone, the more inclined we are to buy from or help that person. We see people as pleasant if (a) they are outwardly attractive, (b) they are similar to us, or (c) they like us. This is why salespeople copy body language and why multi-level marketing schemes work. Advertising employs likable figures in ads. If you are a salesperson, make people like you. If you are a consumer, judge the product independent of the seller, and pretend you don’t like the seller.
  12. Endowment effect: We consider things to be more valuable the moment we own them. If we are selling something, we charge more for it than we ourselves would be willing to spend on it. We are better at holding on to things than getting rid of them. This effect works on auction participants, too, which drives up bidding. It also explains why late-stage interview rejections sting: candidates feel they already “own” the job. Don’t cling to things; view them as the universe temporarily bestowing them on you.
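
To make regression to the mean (item 8) concrete, here is a minimal simulation sketch. It is my own illustration rather than anything from Dobelli, and the means and standard deviations are made-up assumptions: each exam score is modeled as a stable “true ability” plus random luck.

```python
import random

random.seed(42)

# Hypothetical model: each student's exam score = true ability + luck.
# The means and standard deviations below are illustrative assumptions.
abilities = [random.gauss(70, 10) for _ in range(1000)]

def take_exam(abilities):
    """One exam: true ability plus random noise (luck)."""
    return [a + random.gauss(0, 10) for a in abilities]

first = take_exam(abilities)
second = take_exam(abilities)

# Pick the top ~5% of performers on the first exam...
cutoff = sorted(first, reverse=True)[50]
top = [i for i, score in enumerate(first) if score >= cutoff]

avg_first = sum(first[i] for i in top) / len(top)
avg_second = sum(second[i] for i in top) / len(top)

# ...and watch them drift back toward the mean on the second exam,
# even though nothing about the students changed in between.
print(f"Top group, first exam:  {avg_first:.1f}")
print(f"Top group, second exam: {avg_second:.1f}")
```

Run it a few times: the top scorers look exceptional on the first exam and land noticeably closer to the class average on the second, with no intervention at all. That drift is exactly the change we are tempted to credit to coaching, punishment, or our own influence.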

Fallacies, Illusions, and Biases (Part 1)

I’m working my way through Rolf Dobelli’s The Art of Thinking Clearly by reading a few sections each morning. Here are my notes on the first 11 sections (Confirmation Bias had two sections, which I’ve only noted as one below):

  1. Survivorship bias: You overestimate your probability of success because you only see success stories. You find common threads in success stories and think they are the answer. Both ignore the failures, because those stories aren’t told. When you are a survivor, you think, “I did it! Everyone else can, too!” Look for counterexamples and failures to overcome it.
  2. Swimmer’s body illusion: Swimmers usually choose swimming because they already have good physiques. Swimming doesn’t necessarily cause good physiques. Harvard has a rigorous vetting process, and skilled, driven people tend to get in. They’d likely be successful without Harvard. This may actually be a subset of the survivorship bias. (You don’t see ugly models selling makeup or fat swimmers because they don’t tend to last long in the business. Dumb people don’t make it through Harvard’s screening, so they won’t bring down the salary numbers after four years.)
  3. Clustering illusion: Our brains are pattern- and meaning-recognizing machines. First regard patterns as pure chance. If there seems to be more to them, test it statistically. (A short statistical sketch follows this list.)
  4. Social Proof: We are hardwired to copy the reactions of others. In the past it was beneficial for survival. Remember to look for links. Popular does not equal best on objective measures. “If 50M people say something foolish, it is still foolish.”
  5. Sunk Cost Fallacy: Investments of time or money to date don’t matter. Only future benefits or costs count.
  6. Reciprocity: The allure of both positive and negative reciprocity is so strong that, if you don’t want something, it is best to avoid accepting it in the first place.
  7. Confirmation bias: The tendency to interpret new information so it becomes compatible with your existing beliefs. We filter out disconfirming evidence. Look for disconfirming evidence and give it serious consideration. “Murder your darlings.”
  8. Authority bias: When making decisions, think about which authority figures are influencing your reasoning. Challenge them.
  9. Contrast Effect: Things seem cheaper, prettier, healthier, better, etc. in contrast to something else. This is how magicians and con men remove your watch: press hard in one area so you don’t feel the lighter touch on your wrist. This is also why it is easy to ignore inflation. Compare things in individual cost/benefit calculations, not in contrast to an “original price” or whatever they are framed against.
  10. Availability bias: We create a picture of the world using the examples that most easily come to mind. This creates an incorrect risk map in our heads. We attach too much likelihood to flashy outcomes. We think dramatically, not quantitatively. We tend to focus on what is in front of us, whether or not it is the most important question. We can overcome it by getting input from others with different experiences and expertise.
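
A concrete way to act on “test it statistically” (item 3) is to ask how probable the apparent pattern is under pure chance. Here’s a minimal sketch, my own example rather than Dobelli’s, using the binomial distribution:

```python
from math import comb

def p_at_least(k, n, p=0.5):
    """Probability of k or more hits in n independent trials,
    each hitting with probability p (the pure-chance baseline)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# A stock picker calls 8 of 10 market moves correctly. Skill, or chance?
single = p_at_least(8, 10)
print(f"P(8+ of 10 correct by luck alone): {single:.3f}")  # ~0.055

# With 100 pickers guessing at random, a "hot hand" somewhere is near-certain.
print(f"P(at least one of 100 hits 8+ of 10): {1 - (1 - single)**100:.3f}")  # ~0.996
```

The first number says a single streak is only mildly surprising; the second says that across many attempts, a streak somewhere is nearly guaranteed. This is the same math behind survivorship bias (item 1): the one lucky picker is the story that gets told.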

Focus on the Day-to-Day Work

Currently reading: Ego is the Enemy by Ryan Holiday

  • Focus on the day-to-day details of your work, not the grand vision. 
  • Gain the discipline to make your daily commitments happen and the bigger goals will follow. 
  • Working harder and smarter than your competition beats having a bigger vision every time. 
  • Make tangible improvements each and every day. No zero days. 
  • Silicon Valley’s recent infatuation with “changing the world” is funny. That isn’t how Google, Facebook, YouTube, Dropbox, or Airbnb started. They fixed one small problem and iteratively scaled. 
  • Focus on doing your best work each day and don’t blind yourself by trying to turn it into a good story. 

How to Avoid Pastoralism

I’m rereading Breaking Smart Season 1 right now, and it got me thinking about Rao’s concept of pastoralism vs. prometheanism and how to avoid the former. 

Whenever you find yourself pining for a specific technological solution, especially one that was dreamt up more than 15 years ago, ask yourself whether you want that exact solution itself, or a solution to the underlying problem it was supposed to solve. 

If it is the latter, you should work on a new solution that accounts for how the social and economic situation has changed in the intervening time, and for what is now technically possible that wasn’t before.

Don’t keep working toward a technically difficult, but outdated, solution just for nostalgia’s sake. Question your motives and work toward solving the problem again. 

Focused vs Unfocused Reading

  1. The gap between focused and unfocused reading is huge, especially when compounded over time. 
  2. Reducing distractions can lead to huge improvements in the number of pages read and understood. Maybe even more than traditional speed reading methods. 
  3. On my flight to Chicago this weekend, I read half of James Hogan’s Inherit the Stars. On the flight back to NYC, I reread 60% of Breaking Smart Season 1. Each leg was a little over 2 hours. I got through much more of each of these books than I have in equivalent amounts of time at home. It was like I had tunnel vision on the flight because I couldn’t get up and had no distractions available. 
  4. I need to do a better job of implementing airplane-like focus at home so that I can cover more ground in less time. I’m going through the 10 Days to Faster Reading book right now, but its methods aren’t that appealing to me. Working on my focus might be a better route. 

Takeaways from this week’s Breaking Smart newsletter, Betting the Spread on Inexorables:

  • Try multiple ultra short-term bets around a shared assumption. 
  • Don’t stick with something you don’t find valuable just so you “aren’t a quitter.”
  • Constantly question whether or not the next step is what will produce the best results. 
  • Bet the spread, then switch between parallel bets as the data changes. Work isn’t a horse race: you can change bets whenever you want. 
  • There is a difference between unfocused dabbling and betting the spread around a central inexorable trend. That difference is that learning and outcomes around that single trend compound. Unfocused dabbling doesn’t.