
Right Attitudes

Ideas for Impact


Why Your Judgment Sucks // Summary of Daniel Kahneman’s Thinking, Fast and Slow (2011)

April 5, 2021 By Nagesh Belludi

Israeli-American psychologist Daniel Kahneman’s bestselling Thinking, Fast and Slow (2011) describes the finer points of decision-making. It’s an engaging showcase of the innate biases of the mind and unthinking approaches to decision-making.

Human Beings are Intuitive Thinkers

Kahneman is a behavioral economics pioneer and the winner of the 2002 Nobel Memorial Prize in Economic Sciences. His lifelong collaboration with Amos Tversky (1937–96) shaped how we think about human error, risk, judgment, decision-making, happiness, and more. Tversky died in 1996, so he did not share in the Nobel.

Thinking, Fast and Slow explores what Kahneman calls the “mind’s machinery”: two coexisting modes of thought (“fast” and “slow,” as the title says). Kahneman divides the brain’s work into two radically divergent modes, employing a two-tier model of cognition.

  • System One makes judgments instantly, intuitively, and automatically, as when a cricket batsman decides whether to cut or pull. A significant part of System One is “evolved heuristics” that let us, for example, read a person’s expression in a microsecond from a block away. It can’t be switched off. System One’s thinking is fast and effortless, but it often jumps to the wrong conclusions, relies on hunches and biases, and tends to be overconfident.
  • System Two is slower, conscious, calculated, and deliberate, like long division. Its operations require attention. System Two is what we think of as “thinking”—slow, tiring, and essential. It’s what makes us human. Even if System Two believes it is on top of things, System One makes many of our decisions.

System One Isn’t All Flawed

In a world that often necessitates swift judgment and rapid decision-making (e.g., fight or flight), a person who relied solely on deliberative thinking (System Two) wouldn’t last long. Doctors and firefighters, for example, develop through training and repetition what’s called “expert intuition,” which helps them identify patterns and intuitively devise the right response to a complex emergency.

We humans are not simply rational agents; our thinking boils down to these two “Systems” of processing. As we strive to make better decisions in our work and personal lives, it benefits us to slow down and use the more deliberate System Two way of thinking. Learn to doubt your fast, intuitive way of thinking!

Human Intuition is Imperfect

Thinking, Fast and Slow is an eye-opener in various ways. It can be a frightening catalog of the biases, shortcuts, and cognitive illusions that skew our judgment—the endowment effect, priming, the halo effect, the anchoring effect, the conjunction fallacy, the narrative fallacy, and the rest. Such mental processes are not intrinsically flawed; they are heuristics—rules of thumb, stereotypes, shortcuts. They are strategies the mind embraces to find a path through a tsunami of data.

Kahneman teaches how to recognize situations that require slower, deliberative thinking. He asserts that the value of the book is to give people the vocabulary to spot biases and to criticize the decisions of others: “Ultimately, a richer language is essential to the skill of constructive criticism.”

Recommendation: Read Daniel Kahneman’s Thinking, Fast and Slow (2011). One of the most popular non-fiction books of the last decade, it’ll open your eyes to the quirky, error-prone ways in which you can be influenced without suspecting it.

The concepts behind behavioral economics can make Thinking, Fast and Slow a laborious read, and many chapters are bogged down by the hair-splitting details of Kahneman’s rigorous scientific work. Still, it’s a commanding survey of the field—superbly written and intelligible to non-experts.

Complement with Rolf Dobelli’s accessible The Art of Thinking Clearly (2013).

Wondering what to read next?

  1. The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’
  2. Let Go of Sunk Costs
  3. Clever Marketing Exploits the Anchoring Bias
  4. Situational Awareness: Learn to Adapt More Flexibly to Developing Situations
  5. How to Avoid Magical Thinking

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Bias, Critical Thinking, Decision-Making, Psychology, Thinking Tools, Thought Process

The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’

April 1, 2021 By Nagesh Belludi

Psychologists have argued that many cognitive biases are rooted in mental shortcuts. They are heuristics—rules of thumb, stereotypes, instincts—that help you make sense of the world. They aren’t intrinsically flawed, but they’re often quirky and error-prone. Your mental models can affect you in ways that you don’t suspect.

David McRaney’s You Are Not So Smart (2011) offers a brief—if hurried—tour of 48 cognitive biases that can deceive you. Based on the author’s popular blog, the book is a satisfying aid to understanding people’s—and your own—behavior a little better.

There is a growing body of work coming out of psychology and cognitive science that says you have no clue why you act the way you do, choose the things you choose, or think the thoughts you think. … From the greatest scientist to the most humble artisan, every brain within every body is infested with preconceived notions and patterns of thought that lead it astray without the brain knowing it. So you are in good company.

Each chapter starts with a brief statement of a misconception, followed by the fact. A synopsis of a related behavioral study then shows how our brains produce the deception—and what the truth is. Some of the lesser-known misconceptions discussed include:

  • Confabulation. You tend to create unreliable narratives to explain away your choices post hoc. These reassuring perceptions can make you think you’re more rational than you actually are.
  • Groupthink. People tend to fall in with the rest of the group to minimize conflict and foster group cohesiveness and social acceptance. No one wants to be the one person with a dissenting opinion.
  • Social Loafing. That others in a team will pick up your slack may induce you to put in less effort if you think you’ll get away with it. This can curb your own performance, even if you’re a conscientious, hardworking type. If you don’t feel your participation will be noticed, why bother putting in the effort?
  • Availability Heuristic. You’re likely to estimate the likelihood of an event based on your ability to recall immediate and easily accessed examples.
  • Fundamental Attribution Error. You tend to assign external reasons for your own behavior but internal motives to other people. For instance, if you’re late for a meeting, you’ll blame it on public transport. If someone else is running late for a meeting with you, you’ll blame it on her poor time-keeping.

Recommendation: Read David McRaney’s You Are Not So Smart. It’s an engaging, easy-to-read primer on how the mind works. Read it as a lead-up to Daniel Kahneman’s bestselling Thinking, Fast and Slow (2011; my summary).

Idea for Impact: Once you learn to spot the cognitive biases we all grapple with, they’re easier to overcome.

Wondering what to read next?

  1. Is Showing up Late to a Meeting a Sign of Power?
  2. Question Success More Than Failure
  3. How to Avoid Magical Thinking
  4. The Drunkard’s Search or the Streetlight Effect [Cognitive Bias]
  5. This is Yoga for the Brain: Multidisciplinary Learning

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Bias, Critical Thinking, Decision-Making, Psychology, Thinking Tools, Thought Process

The Data Never “Says”

March 1, 2021 By Nagesh Belludi


Data doesn’t say anything. Indeed, data can’t say anything for itself about an issue any more than a saw can build furniture or a saucepan can simmer a stew.

Data is inert and inanimate. Data doesn’t know why it was created. Data doesn’t have a mind of its own, and, therefore, it can’t infer anything.

Data is a necessary ingredient in judgment. It’s people who select and interpret data. People can turn it into insight or torture it to bring their agenda to bear. Data is therefore only as useful as its quality and the skills of the people wielding it.

Far more than we admit, subjectivity and intuition play a significant role in deciding how we collect, choose, process, explain, interpret, and apply the data. As entrepreneur Margaret Heffernan warns in Willful Blindness: Why We Ignore the Obvious at Our Peril (2012), “We mostly admit the information that makes us feel great about ourselves, while conveniently filtering whatever unsettles our fragile egos and most vital beliefs.”

In the hands of careless users, data can end up having the opposite of the effect its creators intended. All data is good or bad depending on how it’s employed in a compelling story and what end it’s serving—neither of which the data itself can control.

  • Don’t let data drive your conclusions. Let data inform your conclusions.
  • Don’t declare, “the data says,” any more than “the stock market thinks.” Data by itself cannot have a particular interpretation.
  • When you find data that seems to support the case you wish to make, don’t seize on it without caution and suspicion. Data can be very deceptive when used carelessly.
  • Be familiar with the limitations of your data. Investigate whether your data supports any other equally valid hypothesis that could suggest an alternative conclusion.

Idea for Impact: Beware of the risk of invoking data in ways that end up undermining your message.

Wondering what to read next?

  1. Presenting Facts Can Sometimes Backfire
  2. Admit When You Don’t Have All the Answers
  3. Surrounded by Yes: Social Media and Elsewhere
  4. Saying is Believing: Why People Are Reluctant to Change an Expressed Opinion
  5. When to Stop Thinking and Decide

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Bias, Conversations, Conviction, Critical Thinking, Decision-Making, Persuasion, Problem Solving, Thinking Tools, Thought Process

How to Avoid Magical Thinking

February 22, 2021 By Nagesh Belludi

Magical thinking remains a subtle impediment to making sound decisions. The more you examine yourself, the more you can reduce your tendency to indulge in it.

Discover the truth for yourself. Beware of the tendency to let others think for you. Don’t uncritically accept what your parents, teachers, counselors, mentors, priests, and authorities of all inclinations have taught you from an early age. (The best predictor of people’s spiritual beliefs is the religiosity of their parents.) Question others’ underlying premises and discover for yourself what’s reasonable. Force yourself to test for alternatives.

Don’t believe something just because you want it to be true. Many people believe in UFOs and ghosts, even though there’s no credible verification of any visitation from outer space or of dead souls haunting abandoned buildings. Often, misinformation is cunningly designed to evade careful analytical reasoning—it can easily slip under the radar of even the most well-informed people. Shun blind optimism.

Consciously identify your biases and adverse instincts. Psychologists have identified more than 100 cognitive biases that can get in the way of clear and rational thinking. Explore how those biases could come into play in your thinking, and try to determine what triggers them. Work to extricate yourself from them to the best of your ability.

Demand proof when the facts seem demonstrable. Remain intellectually agnostic toward what hasn’t been established scientifically or isn’t provable. If you can’t determine whether something is true, suspend judgment. Beware of anecdotes—emotionally swaying stories in particular—for they are the weakest form of evidence.

Don’t believe in something that isn’t true just because there’s a practical reason to. If you feel emotionally inclined to believe in something because it gives you hope, comfort, and the illusion of control, identify your belief as just that. Faith is often no more than an inclination that hasn’t withstood the tests of reason; the essence of faith is an absence of doubt. There’ll always be people who reject evolution for reasons that have little to do with evolution. Don’t act with more confidence in unproven theories than is justifiable.

Idea for Impact: Be wary of the influences that can put you at risk for magical thinking.

Give critical thinking and systematic evidence the central role in how you understand the world. Improving the criteria you use to judge the truth of things is difficult—but it’s of the essence. Have an unvarying, well-balanced degree of skepticism about everything, especially your own postulations.

Wondering what to read next?

  1. [Effective Arguments] Explain Your Opponent’s Perspective
  2. Question Success More Than Failure
  3. How to Stimulate Group Creativity // Book Summary of Edward de Bono’s ‘Six Thinking Hats’
  4. How to Gain Empathic Insight during a Conflict
  5. Lessons from Charlie Munger: Destroy Your Previous Ideas & Reexamine Your Convictions

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Critical Thinking, Introspection, Mindfulness, Persuasion, Questioning, Thinking Tools, Thought Process

A Real Lesson from the Downfall of Theranos: Silo Mentality

February 4, 2021 By Nagesh Belludi

The extraordinary rise and fall of Theranos, Silicon Valley’s biggest fraud, makes an excellent case study on what happens when teams don’t loop each other in.

Theranos’ blood-testing device never worked as advertised by its founder and CEO, Elizabeth Holmes. She created an illusion that became one of the greatest start-up stories. She kept her contraption’s malfunctions and her company’s problems shockingly well hidden—even from her distinguished board of directors.

At the core of Holmes’s sham was how she controlled the company’s flow of information

Holmes and her associate (and then-lover) Sunny Balwani fostered a culture of fear and intimidation at Theranos. They went to such lengths as hiring superstar lawyers to intimidate and silence employees and anyone else who dared to challenge their methods or expose their devices’ deficiencies.

Holmes kept the charade going for so long by keeping a tight rein on who talked to whom. She controlled the flow of information within the company. Not only that, she swiftly fired people who dared to question her approach, and she forcefully imposed non-disclosure agreements on those exiting the company.

In other words, Holmes went to incredible lengths to create and maintain a silo mentality in her startup. Her intention was to concentrate power, prevent employees from talking to each other, and perpetuate her deceit.

A recipe for disaster at Theranos: Silo mentality and intimidation approach

Wall Street Journal investigative reporter John Carreyrou’s book Bad Blood: Secrets and Lies in a Silicon Valley Startup (2018; my summary) is full of stories of how Holmes went out of her way to restrain employees from conferring about what they were working on. Even if they worked on the same project, Holmes made siloed functional teams report to her directly. She would edit progress reports before redirecting problems to other team heads.

Consider engineer Ed Ku’s mechatronics team, responsible for designing the intricate mechanisms that control the measured flow of biochemical fluids. Some of his team’s widgets were overheating, impinging on one another, and cross-contaminating the clinical fluids. Yet Holmes wouldn’t allow Ku and his team to talk to the teams that developed the biochemical processes.

Silo mentality can become very problematic when communication channels become too constricted and organizational processes too bureaucratic. Creativity gets stifled, collaboration limited, mistakes—misdeeds in the case of Theranos—suppressed, and collective objectives misaligned.

Idea for Impact: Functional silos make organizations slow, bureaucratic, and complicated

Innovation hinges increasingly on interdisciplinary cooperation. Examine if your leadership attitude or culture is unintentionally contributing to insufficient accountability, inadequate information-sharing, and limited collaboration between departments—especially on enterprise-wide initiatives.

Wondering what to read next?

  1. The Dramatic Fall of Theranos & Elizabeth Holmes // Book Summary of John Carreyrou’s ‘Bad Blood’
  2. Your Product May Be Excellent, But Is There A Market For It?
  3. Making Training Stick: Your Organization Needs a Process Sherpa
  4. How to Examine a Process and Ask the Right Questions
  5. Creativity by Imitation: How to Steal Others’ Ideas and Innovate

Filed Under: Business Stories, Leadership, Mental Models Tagged With: Biases, Critical Thinking, Entrepreneurs, Ethics, Leadership Lessons, Psychology, Thought Process

This New Year, Forget Resolutions, Set Intentions Instead

January 4, 2021 By Nagesh Belludi

I think resolutions set you up for failure because they’re usually daunting, and they don’t give you a plan for how to realize what you want to achieve. More to the point, you underestimate how long it’ll take to kick a bad habit or adopt a good one.

On the other hand, intentions propose paths forward—they can keep you accountable in the process.

Intentions dig into the WHY

Change is hard—change requires real commitment, planning, and follow-through. Intentions help by grounding you in what you can commit to today and tomorrow. Intentions will remind you of the kind of person you want to be and the kind of life you want to live.

Intentions don’t demand perfection, and intentions leave some room for error. Intentions will help you commit yourself and not fill you with guilt and shame if you fall off the wagon for a short period. With intentions, you can anticipate lapses and plan for them.

Setting intentions and then taking action becomes an exciting path of self-discovery rather than a guilt trap of broken resolutions.

Idea for Impact: Set Intentions Instead of Yearly Resolutions

Put less pressure on yourself and set yourself up for success by making regular daily, weekly, and monthly intentions. Once you set the intention, focus on getting to the first step. Then, regroup and think about step two. This way, you target short-term achievable results, and the intention orients you.

Don’t make intentions for the entire year. It’s just hard to keep up with something and stay excited about it year-round.

Wondering what to read next?

  1. A Secret of Dieting Success: Do Not Deprive Yourself of Your Guilty Pleasures
  2. Small Steps, Big Revolutions: The Kaizen Way // Summary of Robert Maurer’s ‘One Small Step Can Change Your Life’
  3. 5 Minutes to Greater Productivity [Two-Minute Mentor #11]
  4. Doing Is Everything
  5. How to Prepare an Action Plan at a New Job [Two-Minute Mentor #6]

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Discipline, Getting Things Done, Goals, Motivation, Performance Management, Procrastination, Thought Process

What the Duck!

December 28, 2020 By Nagesh Belludi

There’s a popular technique among programmers called “rubber duck debugging”: put a rubber duck (or a cardboard cutout of a dog) next to your computer, and whenever you get stuck, talk to the duck.

Yes, you talk through your problem and walk through your code with that inanimate object. And you perpetuate the stereotype that we geeks are a socially awkward bunch.

“Showing it to someone else” isn’t just a way of telling another person what you think; it’s a way of telling yourself what you think. Just the act of slowing down and explaining your problem and its context can bring about a moment of illumination.

Rubber duck debugging is related to what psychologists call “incubation.” The best solutions can strike suddenly and unexpectedly when you aren’t actively working on your problem. Think of Archimedes and his Eureka moment.

Idea for Impact: Talking is often a part of thought. After many failed attempts, switching your brain from problem-solving to problem-explaining—even to a cat, parent, sympathetic coworker, or somebody who may not know much about whatever it is that’s bothering you—can break you free from fixation and trigger creative breakthroughs.

Hat tip to reader Nick Ashcroft.

Wondering what to read next?

  1. What Archimedes Can Teach About Creative Problem Solving
  2. How You See is What You See
  3. Overcoming Personal Constraints is a Key to Success
  4. Good Questions Encourage Creative Thinking
  5. This is Yoga for the Brain: Multidisciplinary Learning

Filed Under: Mental Models Tagged With: Creativity, Critical Thinking, Parables, Problem Solving, Thinking Tools, Thought Process

Saying is Believing: Why People Are Reluctant to Change an Expressed Opinion

November 30, 2020 By Nagesh Belludi

Politicians shift their views shamelessly with the winds of opportunism. In their defense, they must choose between standing up for what they believe and conserving political capital.

Most politicians believe in one thing—winning elections and latching on to power. It seems they’ll say anything that can get them into office and keep them there. Like when, during the 2004 presidential election, Democratic nominee John Kerry famously proclaimed of the funding to rebuild Iraq, “I actually did vote for the $87 billion before I voted against it.”

Politicians Will Often Flip-flop to Maximize Their Popularity

Well, that’s the nature of the beast. Politicians enter politics for ideological reasons but must readily sell their souls to prolong their political careers. Politicians never seem willing to say, “I was wrong,” or “Upon mature reflection, I’ve changed my mind on such and such.”

But what about the rest of us? It seems that, unlike politicians, we’re shamed relatively easily when we change our minds and adjust our approach. Admitting we’ve made a mistake is too threatening to our sense of self. We end up overcompensating by denying fault and refusing ownership of our own mistakes, thereby protecting our self-image.

There’s evidence that suggests that saying is believing. Making a public pronouncement strengthens our commitment to that point of view. By committing ourselves openly to our present opinions, we may be hardening ourselves against future information that would otherwise change our minds.

The ‘Saying-Is-Believing’ Effect

According to Robert Cialdini’s Influence: The Psychology of Persuasion (2006), social psychologists have shown that openly committing to an opinion makes you less willing to change your mind.

Cialdini cites an experiment by social psychologists in which three sets of students were shown a group of lines. One set of students was asked to write down estimates of the lines’ lengths and turn in their estimates to the experimenter. The second set was asked to write down their estimates on a Magic Pad and then wipe out their estimates before anyone else could see them. The third set of students didn’t write down their estimates at all. After the students were shown new evidence suggesting that their initial estimates were wrong:

The students who had never written down their first choices were least loyal to those choices. … By far, it was the students who had publicly recorded their initial positions who most resolutely refused to shift from those positions later. Public commitment had hardened them into the most stubborn of all.

Publicly committing to an answer makes people less receptive to information suggesting they were wrong

Yup, the act of publicly documenting your opinion reinforces the feeling that others know what your opinion was. This produces a fear of being judged.

The hard part about admitting you’re wrong is, well, admitting you’re wrong. This may induce you to refuse to accept new ideas.

The American economist Paul Krugman has remarked on this “epidemic of infallibility”:

Just to be clear, everyone makes mistakes. Nobody is perfect. When you’re committed to a fundamentally false narrative, facing up to facts becomes an act of political disloyalty. What’s going on with Mr. Trump and his inner circle seems to have less to do with ideology than with fragile egos. To admit having been wrong about anything, they seem to imagine, would brand them as losers and make them look small. In reality, of course, the inability to engage in reflection and self-criticism is the mark of a tiny, shriveled soul.

Idea for Impact: Changing Your Mind is Actually a Good Thing

Changing your mind based on new information isn’t bad. It’s something to be encouraged. As the Transcendentalist essayist Ralph Waldo Emerson wrote, “A foolish consistency is the hobgoblin of little minds.”

In our vigilant, hypercritical, and judgmental society, the problem isn’t with people voicing and documenting their opinions (particularly on social media) but with people not being OK with someone changing theirs.

A professed commitment shouldn’t cause reluctance to change your opinion.

Wondering what to read next?

  1. Presenting Facts Can Sometimes Backfire
  2. Don’t Ignore the Counterevidence
  3. Charlie Munger’s Iron Prescription
  4. Here’s a Tactic to Sell Change: As a Natural Progression
  5. [Effective Arguments] Explain Your Opponent’s Perspective

Filed Under: Effective Communication, Mental Models, Sharpening Your Skills Tagged With: Attitudes, Conviction, Critical Thinking, Persuasion, Social Dynamics, Thought Process

Moderate Politics is the Most Sensible Way Forward

September 17, 2020 By Nagesh Belludi

A sharp observation on political extremism in this 1987 TV ad by comedian John Cleese for the Social Democratic Party-Liberal Party Alliance (1981–88) in the United Kingdom:

Extremism creates a nastier, harsher atmosphere everywhere, more abuse and bovver-boy behavior, less friendliness and tolerance and respect for opponents. What we never hear about extremism is its advantages … the biggest advantage of extremism is that it makes you feel good because it provides you with enemies. The great thing about having enemies is that you can pretend that all the badness in the whole world is in your enemies, and all the goodness in the whole world is in you. If you have a lot of anger and resentment in you anyway, and you therefore enjoy abusing people, then you can pretend that you’re only doing it because these enemies of yours are such very bad persons, and that if it wasn’t for them, you’d actually be good-natured and courteous and rational all the time.

As relevant now as it was then.

I don’t belong to a political party, and I don’t think I’ll ever join one. Partisan talking points irritate me no end. I’ll watch the upcoming debates, though, because I’ll find all the onstage mudslinging and the impulsive provocations very entertaining.

In politics, everyone tries to push emotional buttons. Few seem to talk about an evidence-based attitude for making decisions and allocating society’s resources where they’ll make the most impact.

Besides, the media today have made the exchange of ideas particularly charged and increasingly polarized. The only way to be heard in a screaming vortex is to scream louder and resort to premeditated ad hominem attacks.

Idea for Impact: Wisdom doesn’t reside solely on one side of the center. I am partial to those moderates whose political stance varies with the issue. Contrary to popular perception, they aren’t tuned-out or ill-informed. Instead, they’re disposed to see both sides of complex problems, disregard the excessively ideological positions of the left and the right, and seek the middle ground.

Wondering what to read next?

  1. Fight Ignorance, Not Each Other
  2. [Effective Arguments] Explain Your Opponent’s Perspective
  3. How to Gain Empathic Insight during a Conflict
  4. Rapoport’s Rules to Criticize Someone Constructively
  5. Presenting Facts Can Sometimes Backfire

Filed Under: Managing People, Mental Models Tagged With: Conflict, Critical Thinking, Getting Along, Persuasion, Politics, Thinking Tools, Thought Process

The Power of Asking Open-Ended Questions

August 24, 2020 By Nagesh Belludi

When Bill Gates first met Warren Buffett, Gates was particularly dazzled by how Buffett asked open-ended “big questions”:

I have to admit, when I first met Warren, the fact that he had this framework was a real surprise to me. I met him at a dinner my mother had put together. On my way there, I thought, “Why would I want to meet this guy who picks stocks?” I thought he just used various market-related things—like volume, or how the price had changed over time—to make his decisions. But when we started talking that day, he didn’t ask me about any of those things. Instead he started asking big questions about the fundamentals of our business. “Why can’t IBM do what Microsoft does? Why has Microsoft been so profitable?” That’s when I realized he thought about business in a much more profound way than I’d given him credit for.


“What are My Questions?”

Asking great questions is a skill, but it doesn’t develop the way you might expect. One reason: with age, education, and experience, we become conditioned to think in rigid terms. Heuristics and mental shortcuts become deep-seated and instinctual, allowing for faster problem-solving and programmed decision-making.

Idea for Impact: Don’t ask the same questions most people ask. The smartest people I know don’t begin with answers; they start by asking, “What are our questions?”

Make inquiries using open-ended questions that can’t be answered with a ‘yes’ or ‘no.’ Effective questions will help you think more deeply, generate meaningful explorations, and yield far more interesting insights.

Wondering what to read next?

  1. The Myth of the First-Mover Advantage
  2. Warren Buffett’s Rule of Thumb on Personal Integrity
  3. How to Examine a Process and Ask the Right Questions
  4. The Trickery of Leading Questions
  5. Good Questions Encourage Creative Thinking

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Asking Questions, Decision-Making, Questioning, Thought Process



About: Nagesh Belludi is an Ann Arbor, Michigan-based investor, effectiveness coach, and freethinker. He frequently voyages in discovery of the places, the people, and the spirits of the greatest countries of the world.


Unless otherwise stated in the individual document, the works above are © Nagesh Belludi under a Creative Commons BY-NC-ND license. You may quote, copy and share them freely, as long as you link back to RightAttitudes.com, don't make money with them, and don't modify the content. Enjoy!