
Right Attitudes

Ideas for Impact

Bias

Why You May Be Overlooking Your Best Talent

April 25, 2022 By Nagesh Belludi

Many organizations have a hard time articulating their culture. They can’t explain what they mean when they invoke the phrase “culture fit.” Sometimes it’s just an excuse to favor employees whom managers feel they can personally relate to.

Affinity bias is a common tendency to evaluate people like us more positively than others. This bias often affects who gets hired, promoted, or picked for job opportunities. Employees who look like those already in leadership roles are more likely to be recognized for career development, resulting in a lack of representation in senior positions.

This affinity for people who are like ourselves is hard-wired into our brains. Merely outlawing bias is doomed to fail.

Idea for Impact: If you want to avoid missing your top talent, become conscious of your implicit biases. Watch for any preference for like-minded people.

For any role, create a profile of the combination of hard and soft skills that will matter in the role and on the team. Determine what matters, and focus on the traits and skills you need.

Wondering what to read next?

  1. The Unlikely Barrier to True Diversity
  2. The Duplicity of Corporate Diversity Initiatives
  3. Can’t Ban Political Talk at Work
  4. How to Hire People Who Are Smarter Than You Are
  5. Consensus is Dangerous

Filed Under: Leadership, Leading Teams, Managing People Tagged With: Bias, Group Dynamics, Hiring & Firing, Introspection, Social Dynamics, Teams, Workplace

When Exaggerations Cross the Line

January 22, 2022 By Nagesh Belludi

Many myths and urban legends—even politicians’ infuriating rhetoric—aren’t wholly untrue. Rather, they’re exaggerations of claims rooted in a kernel of truth.

Yes, many of us don’t achieve our full intellectual potential. However, that doesn’t mean that most of us use only 10% of our brainpower.

Sure, men and women tend to differ somewhat in their communication styles. However, pop psychologists such as John Gray have taken this gender difference stereotype to an extreme, declaring “men are from Mars” and “women are from Venus.”

Idea for Impact: Exaggeration is part of human nature. Take care not to cross the line from harmless puffery to reckless overstatement.

Wondering what to read next?

  1. You Can’t Believe Those Scientific Studies You Read About in the Papers
  2. How to Avoid Magical Thinking
  3. The Data Never “Says”
  4. The Solution to a Problem Often Depends on How You State It
  5. Lessons from David Dao Incident: Watch Out for the Availability Bias!

Filed Under: Effective Communication, Sharpening Your Skills Tagged With: Bias, Critical Thinking, Questioning

You Can’t Believe Those Scientific Studies You Read About in the Papers

November 11, 2021 By Nagesh Belludi

Look at the filler articles in the well-being section of your preferred newspaper, and you’ll often stumble upon health advice with nuance-free mentions of all sorts of scientific studies.

One week, drinking coffee is good for you. The next week it’s harmful. Ditto video games. Swearing makes you look intelligent, but hold your flipping horses … the next day, swearing makes you seem too verbally challenged to express your annoyances respectfully.

Gutter-press revelations aren’t just less-than-scientific; they actually defeat the objectives of science.

In June 2014, the Proceedings of the National Academy of Sciences published an allegedly peer-reviewed paper titled “Female hurricanes are deadlier than male hurricanes.” The study deduced that hurricanes with feminine names generate more casualties supposedly because tacit sexism makes communities take storms with a feminine name less seriously. The work was discredited as soon as its methods were dissected. Nevertheless, the dubious paper had made its way into media channels across the country because of the believability implied by the influential National Academy of Sciences.

Positive results that make a sensational headline tend to get published readily—especially if they speak to the audience’s worldview. In truth, many of these are low-quality studies where the variables are latent and the effects aren’t directly observable or quantifiable, especially in the social sciences. Sadly, with the push to produce ever more papers in academia, peer review doesn’t corroborate the quality of research nearly so much as it enforces a discipline’s norms.

Idea for Impact: Let’s be skeptical readers. Let’s be better readers.

Let’s subject every claim to the common-sense test suggested by Daniel Levitin’s A Field Guide to Lies and Statistics: is the claim possible, plausible, probable, and likely? Everything possible isn’t plausible, everything plausible isn’t probable, and everything probable isn’t likely.

Being skeptical does not mean doubting the validity of everything, nor does it mean being cynical. Rather, to be skeptical is to judge the validity of a claim based on objective evidence and to understand the assertion’s nuances. Yes, even extraordinary claims can be valid, but the more extraordinary the claim, the more remarkable the evidence that must be mobilized.

While we’re on the subject, have you heard about research that found that you could make unsuspecting people believe in anything by merely asserting that it’s been “shown by research?” Now then, the former’s the only research worth believing. Very much so, yes, even without evidence!

Wondering what to read next?

  1. Question Success More Than Failure
  2. How to Avoid Magical Thinking
  3. The Data Never “Says”
  4. Why Your Judgment Sucks // Summary of Daniel Kahneman’s Thinking, Fast and Slow (2011)
  5. The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’

Filed Under: Sharpening Your Skills Tagged With: Bias, Critical Thinking, Questioning, Thinking Tools

Don’t Get Stuck in Middle Management

September 21, 2021 By Nagesh Belludi

A survey by the Association of Asian Americans in Investment Management (reported via The New York Times DealBook column) describes the nature of the discrimination and bias faced by Asian Americans:

Asian Americans and Pacific Islanders are often stereotyped as lacking leadership skills. At investment firms, they “fill middle management ranks, but their percentages plummet in senior management and C-suites.” Respondents said they were often tapped as technical experts and benefited from the perception that they are good workers. But their advancement stalled as they sought more senior roles that emphasize networking and communication skills.

Most professionals fail to realize that the competencies that made them successful in their early corporate roles are not necessarily the attributes that will allow them to excel in roles higher up the ladder. The qualities needed at those levels include forming coalitions, managing relationships and alliances, determining where and when to shift one’s focus, and learning to appreciate different perspectives.

Work out what you need to get to the top and fight the perceptions

  • Evaluate where your development priorities should be. Find out how you can acquire the necessary skills and competencies. Go get them. Become more visible to management and situate yourself for a promotion.
  • Network wisely. Understanding who must be won over to your point of view is vital in positioning yourself for a promotion. Spend time cultivating meaningful relationships.
  • Ask for honest feedback—not just from your boss but also from well-respected peers, customers, mentors, and others. Confront problems quickly lest they metastasize.

Idea for Impact: In today’s world, your skills and promotability are your responsibility.

Wondering what to read next?

  1. How to Improve Your Career Prospects During the COVID-19 Crisis
  2. How You Can Make the Most of the Great Resignation
  3. The #1 Cost of Overwork is Personal Relationships
  4. A Little Known, but Powerful Technique to Fast Track Your Career: Theo Epstein’s 20 Percent Rule
  5. Before Jumping Ship, Consider This

Filed Under: Career Development, Sharpening Your Skills Tagged With: Bias, Career Planning, Interpersonal, Leadership, Personal Growth, Skills for Success

Lessons from David Dao Incident: Watch Out for the Availability Bias!

August 23, 2021 By Nagesh Belludi

In the weeks and months after United Airlines’ David Dao incident and the ensuing customer-service debacle, news of all kinds of disruptive airline incidents, coldblooded managers, and inconsiderate airline staff showed up everywhere.

The United incident raised everyone’s awareness of airline incidents. Expectedly, the media started drawing attention to all sorts of airline incidents—fights on airplanes, confusion at airports, seats taken from small children, insects in inflight meals, snakes on the plane—affecting every airline, large and small. However, such unpleasant incidents are rare; thousands of flights every day experience nothing of the sort.

Parenthetically, the underlying problem that led to the David Dao incident wasn’t unique to United. The incident could have happened at other airlines. All airlines had similar policies regarding involuntary-denied boarding and prioritizing crew repositioning. Every other airline, I’m sure, felt lucky the David Dao incident didn’t happen on their airline.

In the aftermath of the incident, many people vowed to boycott United. Little by little, that negative consumer sentiment faded as the backlash—and media coverage—over the incident diminished.

Availability bias occurs when we make decisions based on whatever information comes to mind most easily, however incomplete.

The David Dao incident’s media coverage is an archetypal case of the availability bias (or availability heuristic) in action. Humans are inclined to assess how likely something is to happen by how easily comparable, and recent, examples come to mind. Moreover, examples that carry a fierce emotional weight tend to come to mind quickly.

The availability heuristic warps our perception of real risks. If we’re assessing whether something is likely to happen and a similar event has occurred recently, we’re much more liable to expect it to occur again.

What we remember is shaped by many things, including our beliefs, emotions, and the intensity and frequency of exposure, particularly in mass media. When rare events occur, as with the David Dao incident, they become salient. If you’re in a car accident involving a Chevy, you’re likely to rate the odds of getting into another car accident in a Chevy much higher than base rates would suggest.

If you are aware of the availability bias and begin to look for it, you will be surprised how often it shows up in all kinds of situations. As with many other biases, we can’t remove this natural tendency. Still, by accounting for it, we can let our rational minds make better decisions.

Idea for Impact: Don’t be disproportionately swayed by what you remember. Don’t underestimate or overestimate a risk or choosing to focus on the wrong risks. Don’t overreact to the recent facts.

Wondering what to read next?

  1. Why Your Judgment Sucks // Summary of Daniel Kahneman’s Thinking, Fast and Slow (2011)
  2. The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’
  3. The Data Never “Says”
  4. Five Where Only One is Needed: How Airbus Avoids Single Points of Failure
  5. How to Solve a Problem By Standing It on Its Head

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Aviation, Bias, Change Management, Critical Thinking, Decision-Making, Psychology, Thought Process

Why Your Judgment Sucks // Summary of Daniel Kahneman’s Thinking, Fast and Slow (2011)

April 5, 2021 By Nagesh Belludi

Israeli-American psychologist Daniel Kahneman’s bestselling Thinking, Fast and Slow (2011) describes the finer points of decision-making. It’s an engaging showcase of the innate biases of the mind and unthinking approaches to decision-making.

Human Beings are Intuitive Thinkers

Kahneman is a behavioral economics pioneer and the winner of the 2002 Nobel Memorial Prize in Economic Sciences. His lifelong collaboration with Amos Tversky (1937–96) has molded our thinking about human error, risk, judgment, decision-making, happiness, and more. Tversky died in 1996, so he did not share in the Nobel.

Thinking, Fast and Slow explores what Kahneman calls the “mind’s machinery” as two coexisting modes of thought (“fast and slow,” as the title says.) Kahneman splits cognition into two radically divergent modes, employing a two-tier model.

  • System One makes judgments instantly, intuitively, and automatically, as when a cricket batsman decides whether to cut or pull. A significant part of System One is “evolved heuristics” that lets us read a person’s expression in a microsecond from a block away, for example. And it can’t be switched off. System One’s thinking is fast and effortless. It often jumps to the wrong conclusions, relies on hunches and biases, and is prone to overconfidence.
  • System Two is slower, conscious, calculated, and deliberate, like long division. Its operations require attention. System Two is what we think of as “thinking”—slow, tiring, and essential. It’s what makes us human. Even if System Two believes it is on top of things, System One makes many of our decisions.

System One Isn’t All Flawed

In a world that often necessitates swift judgment and rapid decision-making (e.g., fight or flight,) a person who solely relies on deliberative thinking (System Two) wouldn’t last long. Doctors and firefighters, for example, through training and repetition, develop what’s called “expert intuition,” which helps them identify patterns and instinctively devise the right response to a complex emergency.

We humans are not simply rational agents; our thinking boils down to these two “systems” of processing. As we strive to make better decisions in our work and personal lives, it benefits us to slow down and engage the more deliberate System Two. Learn to doubt your fast, quick way of thinking!

Human Intuition is Imperfect

Thinking, Fast and Slow is an eye-opener in various ways. It can be a frightening catalog of the biases, shortcuts, and cognitive illusions that distort our judgment—the endowment effect, priming, the halo effect, the anchoring effect, the conjunction fallacy, the narrative fallacy, and the rest. Such mental processes are not intrinsically flawed; they are heuristics—rules of thumb, stereotypes, shortcuts. They are strategies the mind embraces to find a path through a tsunami of data.

Kahneman teaches how to recognize situations that require slower, deliberative thinking. Kahneman asserts that the value of the book is to give people the vocabulary to spot biases and to criticize the decisions of others: “Ultimately, a richer language is essential to the skill of constructive criticism.”

Recommendation: Read Daniel Kahneman’s Thinking, Fast and Slow (2011.) One of the most popular non-fiction books of the last decade, it’ll open your eyes to the quirky, error-prone ways in which you can be influenced without suspecting it.

The concepts behind behavioral economics can make Thinking, Fast and Slow a laborious read; many chapters are bogged down by the hair-splitting details of Kahneman’s rigorous scientific work and by academic jargon. Still, it’s a commanding survey of the field, superbly written and intelligible to non-experts.

Complement with Rolf Dobelli’s accessible The Art of Thinking Clearly (2013.)

Wondering what to read next?

  1. The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’
  2. The Data Never “Says”
  3. Lessons from David Dao Incident: Watch Out for the Availability Bias!
  4. This is Yoga for the Brain: Multidisciplinary Learning
  5. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Bias, Critical Thinking, Decision-Making, Psychology, Thinking Tools, Thought Process

The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’

April 1, 2021 By Nagesh Belludi

Psychologists have argued that many cognitive biases are rooted in mental shortcuts. They are heuristics—rules of thumb, stereotypes, instincts—that help you make sense of the world. They aren’t intrinsically flawed, but they’re often quirky and error-prone. Your mental models can affect you in ways that you don’t suspect.

David McRaney’s You Are Not So Smart (2011) offers a brief, if hurried, tour of 48 cognitive biases that can deceive you. Based on the author’s popular blog, the book is a satisfying aid to understanding other people’s behavior, and your own, a little better.

There is a growing body of work coming out of psychology and cognitive science that says you have no clue why you act the way you do, choose the things you choose, or think the thoughts you think. … From the greatest scientist to the most humble artisan, every brain within every body is infested with preconceived notions and patterns of thought that lead it astray without the brain knowing it. So you are in good company.

Each chapter starts with a brief statement of a misconception, followed by the fact. A synopsis of a related behavioral study then shows how our brains produce the deception, and what the truth is. Some of the lesser-known misconceptions discussed:

  • Confabulation. You tend to create unreliable narratives to explain away your choices post hoc. These reassuring perceptions can make you think you’re more rational than you actually are.
  • Groupthink. People tend to fall in with the rest of the group to minimize conflict and foster group cohesiveness and social acceptance. No one wants to be the one person with a dissenting opinion.
  • Social Loafing. That others in a team will pick up your slack may induce you to put in less effort if you think you’ll get away with it. This can curb your own performance, even if you’re a conscientious, hardworking type. If you don’t feel your participation will be noticed, why bother putting in the effort?
  • Availability Heuristic. You’re likely to estimate the likelihood of an event based on your ability to recall immediate and easily accessed examples.
  • Fundamental Attribution Error. You tend to assign external reasons for your own behavior but internal motives to other people. For instance, if you’re late for a meeting, you’ll blame it on public transport. If someone else is running late for a meeting with you, you’ll blame it on her poor time-keeping.

Recommendation: Read David McRaney’s You Are Not So Smart. It’s an engaging, easy-to-read primer on how the mind works. Read it as a lead-up to Daniel Kahneman’s bestselling Thinking, Fast and Slow (2011; my summary.)

Idea for Impact: Once you learn to spot the cognitive biases we all grapple with, they’re easier to overcome.

Wondering what to read next?

  1. Why Your Judgment Sucks // Summary of Daniel Kahneman’s Thinking, Fast and Slow (2011)
  2. The Data Never “Says”
  3. Lessons from David Dao Incident: Watch Out for the Availability Bias!
  4. This is Yoga for the Brain: Multidisciplinary Learning
  5. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Bias, Critical Thinking, Decision-Making, Psychology, Thinking Tools, Thought Process

The Data Never “Says”

March 1, 2021 By Nagesh Belludi

Data doesn’t say anything. Indeed, data can’t say anything for itself about an issue any more than a saw can form furniture, or a sauce can simmer a stew.

Data is inert and inanimate. Data doesn’t know why it was created. Data doesn’t have a mind of its own, and, therefore, it can’t infer anything.

Data is a necessary ingredient in judgment. It’s people who select and interpret data. People can turn it into insight or torture it to bring their agenda to bear. Data is therefore only as useful as its quality and the skills of the people wielding it.

Far more than we admit, subjectivity and intuition play a significant role in deciding how we collect, choose, process, explain, interpret, and apply the data. As entrepreneur Margaret Heffernan warns in Willful Blindness: Why We Ignore the Obvious at Our Peril (2012,) “We mostly admit the information that makes us feel great about ourselves, while conveniently filtering whatever unsettles our fragile egos and most vital beliefs.”

In the hands of careless users, data can end up having the opposite of the effect its creators intended. All data is good or bad depending on how it’s employed in a compelling story and what end it’s serving—neither of which the data itself can control.

  • Don’t let data drive your conclusions. Let data inform your conclusions.
  • Don’t declare, “The data says,” (as in, “the stock market thinks.”) Data by itself cannot have a particular interpretation.
  • When you find data that seems to support the case you wish to make, don’t swoop on it without caution and suspicion. Data can be very deceptive when used carelessly.
  • Be familiar with the limitations of your data. Investigate if your data informs any other equally valid hypothesis that could propose an alternative conclusion.

Idea for Impact: Beware of the risk of invoking data in ways that end up undermining your message.

Wondering what to read next?

  1. Making Tough Decisions with Scant Data
  2. Presenting Facts Can Sometimes Backfire
  3. Disproven Hypotheses Are Useful Too
  4. How to Solve a Problem By Standing It on Its Head
  5. Rapoport’s Rules to Criticize Someone Constructively

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Bias, Conversations, Conviction, Critical Thinking, Decision-Making, Persuasion, Problem Solving, Thinking Tools, Thought Process

Never Accept an Anecdote at Face Value

February 19, 2021 By Nagesh Belludi

Human beings generally find anecdotes highly compelling. We’re not transformed as much by facts and statistics as we are by stories.

But anecdotes aren’t often objective. Anecdotes are uncontrolled individual observations—sometimes no more than one.

Reported experience is subjective. Our recollections are ever-changing, and they’re often amazingly imprecise. We often misrepresent events to agree with the audience—even embellishing with made-up minutiae to render our stories more compelling.

And for that reason, anecdotes are usually the weakest form of evidence. Anecdotes are subject to a host of biases such as confirmation bias, generalization, and cherry-picking. Moreover, for every anecdote, an equal and contrary anecdote can be proffered.

Idea for Impact: Be deeply suspicious of anecdotes. Arguments that draw on anecdotal evidence to make broad generalizations are liable to be fallacious.

Wondering what to read next?

  1. How to Gain Empathic Insight during a Conflict
  2. The Data Never “Says”
  3. Persuade Others to See Things Your Way: Use Aristotle’s Ethos, Logos, Pathos, and Timing
  4. Facts Alone Can’t Sell: Lessons from the Intel Pentium Integer Bug Disaster
  5. Don’t Ignore the Counterevidence

Filed Under: Sharpening Your Skills Tagged With: Bias, Communication, Critical Thinking, Persuasion

An Olympian History of Humanity // Book Summary of Yuval Noah Harari’s ‘Sapiens’

September 10, 2020 By Nagesh Belludi

Israeli historian and philosopher Yuval Noah Harari’s bestselling 464-page Sapiens: A Brief History of Humankind (2015) retells the 13.5-billion-year odyssey from the Big Bang to the near future. Harari accounts for how Homo sapiens (the ‘wise man’) overcame the most extraordinary odds and numerous arbitrary contingencies to dominate the world the way we do at present.

Harari’s narratives span the cognitive revolution (70,000 years ago,) agricultural revolution (11,000 years,) scientific revolution (500 years,) industrial revolution (250 years,) and information revolution (50 years.) The first of these epochs, the cognitive revolution, coupled with a genetic mutation, was the real game-changer: Homo sapiens didn’t evolve efficiently from stooping apes to standing individuals. There were previously no less than six distinct homines, of which Homo sapiens came out top.

Sapiens argues that what made Homo sapiens special was our ability to develop networks and communities and tell stories, i.e., to organize and build large, connected communities around “shared fictions” or narratives—religion, nationalism, capitalism, trade groups, social institutions, for example. It was only through such intangible beliefs—not biological realities—that Homo sapiens were able to get the better of the physical world.

Homo sapiens’ talent for abstraction set us apart

Language made it easier to dwell upon abstract matters and to cooperate flexibly in ever-larger numbers. Harari’s examples show how Homo sapiens—from our ancestors all the way up to today—are so willing to create and believe in conceptual paradigms, and how those paradigms have been both the key to our success and the source of our problems.

Large numbers of strangers can cooperate successfully by believing in common myths. Any large-scale human cooperation—whether a modern state, a medieval church, an ancient city or an archaic tribe—is rooted in common myths that exist only in people’s collective imagination.

Harari’s inquiry is extensive. His scholarship is rigorous, and his interpretation creative. Yes, most of the book restates familiar facts and theories. Harari does an excellent job synthesizing a lot of information. What makes Sapiens exceptional is it gives culture a starring role in the human drama—something that many in science and sociology are hesitant to do, instead preferring to depict culture as transient, nebulous, and “soft.”

Harari builds on some provocative ideas about Homo sapiens and sets out his anthropological interpretations with vim and vigor:

  • The emergence of agriculture—especially livestock farming—is “the greatest crime in history … The domestication of animals was founded on a series of brutal practices that only became crueler with the passing of the centuries.” [Harari has said that he became a committed vegan while writing Sapiens.]
  • Organized religion is predictably contemptible: “You could never convince a monkey to give you a banana by promising him limitless bananas after death in monkey heaven.” The emergence of religion “was one of the most important revolutions in history, and made a vital contribution to the unification of humankind.” But the notion of a supernatural being becomes increasingly inconsequential as humans acquire divine abilities and rely increasingly upon themselves for creating life forms and averting death and destruction. Then, “Is there anything more dangerous than dissatisfied and irresponsible gods who don’t know what they want?”
  • Consumer capitalism is a dreadful prison. “For better or worse, in sickness and in health, the modern economy has been growing like a hormone-soused teenager. It eats up everything it can find and puts on inches faster than you can count.”

Recommendation: Read Harari’s astonishing history of the species, from insignificant apes to rulers of the world

Yuval Noah Harari’s Sapiens: A Brief History of Humankind (2015) is a must-read. It is a brilliantly executed examination of who we are and of our behaviors. Notwithstanding the seeming overstatements and the occasional drift to sensationalism, Sapiens is extremely interesting and thought-provoking. It is written elegantly, in a clear and engaging style, with a skeptic’s eye and irreverent—and sometimes-sarcastic—sensibility.

We are more powerful than ever before … Worse still, humans seem to be more irresponsible than ever. Self-made gods with only the laws of physics to keep us company, we are accountable to no one.

Harari is implacably cold and literal, abstaining from political correctness and pro-Western predispositions. Sapiens concludes with spine-tingling predictions about the future. Perhaps as a cliffhanger to his subsequent Homo Deus: A Brief History of Tomorrow (2016,) Harari contends that we’re the primary destructive force.

Homo sapiens are sowing the seeds for our own destruction. The forthcoming biotechnological revolution, Harari speculates, may signal the end of sapiens. Bioengineered “amortal cyborgs” may replace us. These post-human organic and inorganic organisms won’t necessarily be immortal but, absent an accident, can live forever. Homo not so sapiens?

Wondering what to read next?

  1. It’s Probably Not as Bad as You Think: The 20-40-60 Rule
  2. No One Has a Monopoly on Truth
  3. Does the Consensus Speak For You?
  4. How to Guard Against Anything You May Inadvertently Overlook
  5. Making Exceptions “Just Once” is a Slippery Slope

Filed Under: Leadership Reading Tagged With: Bias, Books for Impact, Philosophy, Religiosity, Risk, Scientists

About: Nagesh Belludi is a St. Petersburg, Florida-based freethinker, investor, and leadership coach. He specializes in helping executives and companies ensure that the overall quality of their decision-making isn’t compromised by a lack of big-picture understanding.
Unless otherwise stated in the individual document, the works above are © Nagesh Belludi under a Creative Commons BY-NC-ND license. You may quote, copy and share them freely, as long as you link back to RightAttitudes.com, don't make money with them, and don't modify the content. Enjoy!