
Right Attitudes

Ideas for Impact

Biases

Don’t Get Stuck in Middle Management

September 21, 2021 By Nagesh Belludi

A survey by the Association of Asian Americans in Investment Management (reported via The New York Times DealBook column) describes the discrimination and bias Asian Americans face:

Asian Americans and Pacific Islanders are often stereotyped as lacking leadership skills. At investment firms, they “fill middle management ranks, but their percentages plummet in senior management and C-suites.” Respondents said they were often tapped as technical experts and benefited from the perception that they are good workers. But their advancement stalled as they sought more senior roles that emphasize networking and communication skills.

Most professionals fail to realize that the competencies that made them successful in their early corporate roles are not necessarily the attributes that will help them excel in roles higher up the ladder. Those attributes include forming coalitions, managing relationships and alliances, determining where and when to shift one's focus, and learning to appreciate different perspectives.

Work out what you need to get to the top and fight the perceptions

  • Evaluate where your development priorities should be. Find out how you can acquire the necessary skills and competencies. Go get them. Become more visible to management and situate yourself for a promotion.
  • Network wisely. Understanding who must be won over to your point of view is vital to positioning yourself for promotion. Spend time cultivating meaningful relationships.
  • Ask for honest feedback—not just from your boss but also from well-respected peers, customers, mentors, and others. Confront problems quickly lest they metastasize.

Idea for Impact: In today’s world, your skills and promotability are your responsibility.

Wondering what to read next?

  1. How to … Be More Confident at Work
  2. Risk More, Risk Earlier
  3. The Career-Altering Question: Generalist or Specialist?
  4. Your time is far from being wasted!
  5. Do-What-I-Did Career Advice Is Mostly Nonsense

Filed Under: Career Development, Sharpening Your Skills Tagged With: Biases, Career Planning, Interpersonal, Leadership, Personal Growth, Skills for Success

Many Hard Leadership Lessons in the Boeing 737 MAX Debacle

August 24, 2021 By Nagesh Belludi

The U.S. House committee’s report on Boeing’s 737 MAX disaster makes interesting reading on contemporary leadership, particularly the pressures of rapid product development.

The rush to market and a culture of contributory negligence and concealment conspired to ensure that a not-yet-airworthy plane carried passengers into service, resulting in two fatal accidents and a long grounding.

Boeing’s design and development of the 737 MAX was marred by technical design failures, a lack of transparency with both regulators and customers, and efforts to downplay or disregard concerns about the operation of the aircraft.

Of particular importance are the “management failures,” “inherent conflicts of interest,” and “grossly insufficient oversight” at both Boeing and its regulator, the Federal Aviation Administration (FAA). Boeing let cost and schedule pressures override attention to customer safety and the aircraft’s design limitations. Leadership was fixated on fending off the runaway success of the Airbus A320neo program.

The company relied on too many technical assumptions, and it never gave itself the space and time to examine them critically. Boeing’s “culture of concealment” and “unwillingness to share technical details” are the report’s most damning indictments. Employees spoke up but went unheard; indeed, their voices were suppressed.

Wondering what to read next?

  1. The Boeing 737 MAX’s Achilles Heel
  2. Availability Heuristic: Our Preference for the Familiar
  3. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  4. How Stress Impairs Your Problem-Solving Capabilities: Case Study of TransAsia Flight 235
  5. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342

Filed Under: Leadership Tagged With: Aviation, Biases, Change Management, Decision-Making, Problem Solving, Risk, Thinking Tools

Lessons from David Dao Incident: Watch Out for the Availability Bias!

August 23, 2021 By Nagesh Belludi

In the weeks and months after United Airlines’ David Dao incident and the ensuing customer-service debacle, news of all kinds of disruptive airline incidents, cold-blooded managers, and inconsiderate airline staff showed up everywhere.

The United incident raised everyone’s awareness of airline mishaps. Expectedly, the media started drawing attention to all sorts of airline incidents—fights on airplanes, confusion at airports, seats taken from small children, insects in inflight meals, snakes on the plane—affecting every airline, large and small. Yet such unpleasant incidents are rare; thousands of flights every day experience nothing of the sort.

Parenthetically, the underlying problem that led to the David Dao incident wasn’t unique to United; it could have happened at other airlines, all of which had similar policies regarding involuntarily denied boarding and prioritizing crew repositioning. Every other airline, I’m sure, felt lucky the incident didn’t happen on its watch.

In the aftermath of the incident, many people vowed to boycott United. Little by little, that negative consumer sentiment faded away while the backlash—and media coverage—over the incident diminished.

Availability bias occurs when we make decisions based on information that comes to mind easily, even if it’s incomplete.

The media coverage of the David Dao incident is an archetypal case of the Availability Bias (or Availability Heuristic) at work. Humans are inclined to disproportionately judge how likely something is to happen by how easily they can summon up comparable, recent examples. Moreover, examples that carry fierce emotional weight tend to come to mind quickly.

The availability heuristic warps our perception of real risks. If we’re assessing whether something is likely to happen and a similar event has occurred recently, we’re much more liable to expect it to occur.

What we remember is shaped by many things, including our beliefs, our emotions, and the intensity and frequency of exposure, particularly in mass media. When rare events occur, as with the David Dao incident, they become salient. If you’re in a car accident involving a Chevy, you’re likely to rate the odds of getting into another accident in a Chevy much higher than base rates would suggest.

If you are aware of the availability bias and begin to look for it, you will be surprised how often it shows up in all kinds of situations. As with many other biases, we can’t remove this natural tendency, but by staying alert to it, we can let our rational minds account for it and make better decisions.

Idea for Impact: Don’t be disproportionately swayed by what you remember. Don’t underestimate or overestimate a risk, and don’t focus on the wrong risks. Don’t overreact to recent events.

Wondering what to read next?

  1. Why Your Judgment Sucks
  2. The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’
  3. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  4. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  5. Why Incentives Backfire and How to Make Them Work: Summary of Uri Gneezy’s Mixed Signals

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Aviation, Biases, Change Management, Critical Thinking, Decision-Making, Psychology, Thought Process

If You’re Looking for Bad Luck, You’ll Soon Find It

August 16, 2021 By Nagesh Belludi

Consider a woman who complained that her neighborhood dry cleaner ruined her expensive slacks. “Last month, he spoiled my wool blazer. Last Christmas, he … . It always happens,” she grumbled.

This woman knew she was taking chances with this dry cleaner. She allowed it to happen.

Luck is sometimes the result of taking appropriate action. And, bad luck is sometimes the result of tempting fate.

Say you’ve been planning for weeks for your next big trip. You got an incredible deal on the day’s very last flight to your destination. On the day of departure, your late-night flight gets canceled. Sure, you’re a victim of bad luck—but you invited it. Think about it: you’re more likely to suffer a flight delay or cancellation later in the day because airlines schedule their rosters tightly to maximize aircraft and crew utilization, so delays and disruptions from earlier in the day propagate onward to the late flights.

Often, bad luck has nothing to do with luck. “The fault,” as Shakespeare wrote, “is not in our stars, but in ourselves.”

Sometimes you can be your own worst enemy. Don’t sabotage yourself by tempting fate.

Idea for Impact: Bad choices beget bad luck

You have to be lucky to get lucky. You have no control over many outcomes in life, but you can always increase the odds of getting lucky by taking appropriate action. More importantly, you can reduce the chance of bad luck by not tempting fate.

Remember, a good mathematics student never buys a lottery ticket, and if he does, he never grumbles about not winning the jackpot!

Wondering what to read next?

  1. Be Smart by Not Being Stupid
  2. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  3. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  4. More Data Isn’t Always Better
  5. Smart Folks are Most Susceptible to Overanalyzing and Overthinking

Filed Under: Mental Models Tagged With: Biases, Critical Thinking, Decision-Making, Luck, Risk, Wisdom

Why Your Judgment Sucks

April 5, 2021 By Nagesh Belludi

Israeli-American psychologist Daniel Kahneman’s bestselling Thinking, Fast and Slow (2011) describes the finer points of decision-making. It’s an engaging showcase of the mind’s innate biases and our unthinking approaches to judgment.

Human Beings are Intuitive Thinkers

Kahneman is a behavioral economics pioneer and the winner of the 2002 Nobel Memorial Prize in Economic Sciences. His lifelong collaboration with Amos Tversky (1937–96) shaped our thinking about human error, risk, judgment, decision-making, happiness, and more. Tversky died in 1996, so he did not share in the Nobel.

Thinking, Fast and Slow explores what Kahneman calls the “mind’s machinery” as two coexisting modes of thought (“fast and slow,” as the title says). Kahneman splits cognition into two radically divergent modes, employing a two-tier model.

  • System One makes judgments instantly, intuitively, and automatically, as when a cricket batsman decides whether to cut or pull. A significant part of System One is “evolved heuristics” that, for example, let us read a person’s expression in a microsecond from a block away. And it can’t be switched off. System One’s thinking is fast and effortless, but it often jumps to the wrong conclusions, relies on hunches and biases, and tends to be overconfident.
  • System Two is slower, conscious, calculated, and deliberate, like long division. Its operations require attention. System Two is what we think of as “thinking”—slow, tiring, and essential. It’s what makes us human. Even if System Two believes it is on top of things, System One makes many of our decisions.

System One Isn’t All Flawed

In a world that often necessitates swift judgment and rapid decision-making (e.g., fight or flight), a person who relies solely on deliberative thinking (System Two) wouldn’t last long. Doctors and firefighters, for example, develop through training and repetition what’s called “expert intuition,” which helps them identify patterns and instinctively devise the right response to a complex emergency.

We humans are not simply rational agents; our thinking runs on these two systems. As we strive to make better decisions in our work and personal lives, it benefits us to slow down and engage the more deliberate System Two. Learn to doubt your fast, intuitive way of thinking!

Human Intuition is Imperfect

Thinking, Fast and Slow is an eye-opener in various ways. It’s a frightening catalog of the biases, shortcuts, and cognitive illusions that skew our judgment—the endowment effect, priming, the halo effect, the anchoring effect, the conjunction fallacy, the narrative fallacy, and the rest. Such mental processes are not intrinsically flawed; they are heuristics—rules of thumb, stereotypes, shortcuts—strategies the mind embraces to find a path through a tsunami of data.

Kahneman teaches how to recognize situations that require slower, deliberative thinking. Kahneman asserts that the value of the book is to give people the vocabulary to spot biases and to criticize the decisions of others: “Ultimately, a richer language is essential to the skill of constructive criticism.”

Recommendation: Read Daniel Kahneman’s Thinking, Fast and Slow (2011). One of the most popular non-fiction books of the last decade, it’ll open your eyes to the quirky, error-prone ways in which you can be influenced without suspecting it.

The conceptions behind behavioral economics can make Thinking, Fast and Slow a laborious read, and many chapters are bogged down by hair-splitting details of Kahneman’s rigorous scientific work. Still, it’s a commanding survey of the field, superbly written and largely intelligible to non-experts.

Complement with Rolf Dobelli’s accessible The Art of Thinking Clearly (2013).

Wondering what to read next?

  1. The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’
  2. The Data Never “Says”
  3. Question the Now, Imagine the Next
  4. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  5. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Psychology, Thinking Tools, Thought Process

The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’

April 1, 2021 By Nagesh Belludi

Psychologists have argued that many cognitive biases are rooted in mental shortcuts. They are heuristics—rules of thumb, stereotypes, instincts—that help you make sense of the world. They aren’t intrinsically flawed, but they’re often quirky and error-prone. Your mental models can affect you in ways you don’t suspect.

David McRaney’s You Are Not So Smart (2011) offers a brief—if hurried—tour of 48 cognitive biases that can deceive you. Based on the author’s popular blog, the book is a satisfying primer for understanding people’s—and your own—behavior a little better.

There is a growing body of work coming out of psychology and cognitive science that says you have no clue why you act the way you do, choose the things you choose, or think the thoughts you think. … From the greatest scientist to the most humble artisan, every brain within every body is infested with preconceived notions and patterns of thought that lead it astray without the brain knowing it. So you are in good company.

Each chapter starts with a brief statement of a misconception, followed by the fact and a synopsis of a related behavioral study showing how our brains produce the deception. Some of the lesser-known misconceptions discussed:

  • Confabulation. You tend to create unreliable narratives to explain away your choices post hoc. These reassuring perceptions can make you think you’re more rational than you actually are.
  • Groupthink. People tend to fall in with the rest of the group to minimize conflict and foster group cohesiveness and social acceptance. No one wants to be the one person with a dissenting opinion.
  • Social Loafing. That others in a team will pick up your slack may induce you to put in less effort if you think you’ll get away with it. This can curb your own performance, even if you’re a conscientious, hardworking type. If you don’t feel your participation will be noticed, why bother putting in the effort?
  • Availability Heuristic. You’re likely to estimate the likelihood of an event based on your ability to recall immediate and easily accessed examples.
  • Fundamental Attribution Error. You tend to assign external reasons for your own behavior but internal motives to other people. For instance, if you’re late for a meeting, you’ll blame it on public transport. If someone else is running late for a meeting with you, you’ll blame it on her poor time-keeping.

Recommendation: Read David McRaney’s You Are Not So Smart. It’s an engaging, easy-to-read primer on how the mind works. Read it as a lead-up to Daniel Kahneman’s bestselling Thinking, Fast and Slow (2011; summary forthcoming).

Idea for Impact: Once you learn to spot the cognitive biases we all grapple with, they’re easier to overcome.

Wondering what to read next?

  1. Why Your Judgment Sucks
  2. Question the Now, Imagine the Next
  3. Lessons from David Dao Incident: Watch Out for the Availability Bias!
  4. What if Something Can’t Be Measured
  5. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Psychology, Thinking Tools, Thought Process

Five Ways … You Could Avoid Being Wrong

March 20, 2021 By Nagesh Belludi

  • Beware exaggeration of a kernel of truth. For instance: indeed, many of us don’t realize our full intellectual potential; but that doesn’t give credence to the notion that most people use only 10% of their brainpower. Besides, beware of overstatements of small differences. Sure, men and women tend to differ somewhat in their communication styles, but declaring that “men are from Mars” and “women are from Venus” is taking a kernel of reality to an extreme, not to mention coercing psychology into stereotypes.
  • Don’t infer causation from correlation. Don’t be tempted to conclude that if two things co-occur statistically, they must be causally related. (Rabbi Harold Kushner once asserted that circumcision seems to increase men’s chances of winning a Nobel Prize.) Seek contending explanations.
  • Beware biased sampling and extrapolation. Inferences from a biased sample are not as trustworthy as conclusions from a truly random sample—e.g., don’t ask people coming out of Sunday Mass if they have a personal relationship with Jesus Christ and infer that Americans are turning to God. Don’t ascribe to the whole any attribute of the part.
  • Don’t let stress impair your problem-solving capabilities. As many airline disasters confirm, speed can narrow your cognitive map—small errors can quickly become linked and amplified into disastrous outcomes. When you feel rushed, you’re likely to miss details. You’re not present enough in the moment to notice what’s important and make the most beneficial choices.
  • Beware argumentum ad nauseam. Don’t confuse a statement’s familiarity (as with urban legends) with its accuracy. The fact that you’ve heard a claim repeated over and over again (think of President Trump’s allegations of widespread voter fraud), sometimes with rearranged phrasing and substitute terms, doesn’t make it correct.

Bonus: Be suspicious of any claim that doesn’t come with counterarguments or disconfirming evidence.

Wondering what to read next?

  1. Be Smart by Not Being Stupid
  2. How to … Escape the Overthinking Trap
  3. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  4. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  5. If You’re Looking for Bad Luck, You’ll Soon Find It

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Risk

The Data Never “Says”

March 1, 2021 By Nagesh Belludi

Data doesn’t say anything. Indeed, data can’t speak for itself about an issue any more than a saw can build furniture or a sauce can simmer a stew.

Data is inert and inanimate. Data doesn’t know why it was created. Data doesn’t have a mind of its own, and, therefore, it can’t infer anything.

Data is a necessary ingredient in judgment. It’s people who select and interpret data. People can turn it into insight or torture it to bring their agenda to bear. Data is therefore only as useful as its quality and the skills of the people wielding it.

Far more than we admit, subjectivity and intuition play a significant role in deciding how we collect, choose, process, explain, interpret, and apply data. As entrepreneur Margaret Heffernan warns in Willful Blindness: Why We Ignore the Obvious at Our Peril (2012), “We mostly admit the information that makes us feel great about ourselves, while conveniently filtering whatever unsettles our fragile egos and most vital beliefs.”

In the hands of careless users, data can end up having the opposite of the effect its creators intended. Data is good or bad depending on how it’s employed in a compelling story and what end it’s serving—neither of which the data itself can control.

  • Don’t let data drive your conclusions. Let data inform your conclusions.
  • Don’t declare, “The data says” (as in, “the stock market thinks”). Data by itself cannot have a particular interpretation.
  • When you find data that seems to support the case you wish to make, don’t swoop on it without caution and suspicion. Data can be very deceptive when used carelessly.
  • Be familiar with the limitations of your data. Investigate if your data informs any other equally valid hypothesis that could propose an alternative conclusion.

Idea for Impact: Beware of the risk of invoking data in ways that end up undermining your message.

Wondering what to read next?

  1. What if Something Can’t Be Measured
  2. Question the Now, Imagine the Next
  3. In Praise of Inner Voices: A Powerful Tool for Smarter Decisions
  4. Making Tough Decisions with Scant Data
  5. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Conversations, Conviction, Critical Thinking, Decision-Making, Persuasion, Problem Solving, Thinking Tools, Thought Process

Never Accept an Anecdote at Face Value

February 19, 2021 By Nagesh Belludi

Human beings generally find anecdotes highly compelling. We’re not transformed as much by facts and statistics as we are by stories.

But anecdotes are rarely objective. They are uncontrolled individual observations—sometimes no more than one.

Reported experience is subjective. Our recollections are ever-changing and often amazingly imprecise. We often misrepresent events to agree with our audience—even embellishing them with made-up minutiae to render our stories more compelling.

And for that reason, anecdotes are usually the weakest form of evidence. Anecdotes are subject to a host of biases such as confirmation bias, generalization, and cherry-picking. Moreover, for every anecdote, an equal and contrary anecdote can be proffered.

Idea for Impact: Be deeply suspicious of anecdotes. Arguments that draw on anecdotal evidence to make broad generalizations are liable to be fallacious.

Wondering what to read next?

  1. How to Gain Empathic Insight during a Conflict
  2. The Data Never “Says”
  3. Silence the Noise
  4. Tales vs. Truth & Anecdotal Evidence: The Case of Sports Illustrated Cover Jinx
  5. Don’t Ignore the Counterevidence

Filed Under: Sharpening Your Skills Tagged With: Biases, Communication, Critical Thinking, Persuasion

A Real Lesson from the Downfall of Theranos: Silo Mentality

February 4, 2021 By Nagesh Belludi

The extraordinary rise and fall of Theranos, Silicon Valley’s biggest fraud, makes an excellent case study on what happens when teams don’t loop each other in.

Theranos’ blood-testing device never worked as its founder and CEO, Elizabeth Holmes, claimed. She created an illusion that became one of the greatest start-up stories. She kept her contraption’s malfunctions and her company’s problems shockingly well hidden—even from her distinguished board of directors.

At the core of Holmes’s sham was how she controlled the company’s flow of information

Holmes and her associate (and then-lover) Sunny Balwani operated a culture of fear and intimidation at Theranos. They went to such lengths as hiring superstar lawyers to intimidate and silence employees and anyone else who dared to challenge their methods or expose their devices’ deficiencies.

Holmes kept the charade going for so long by maintaining a tight rein on who talked to whom. She controlled the flow of information within the company. Not only that, she swiftly fired people who dared to question her approach, and she forcefully imposed non-disclosure agreements, even on those exiting the company.

In other words, Holmes went to incredible lengths to create and maintain a silo mentality in her startup. Her intention was to wield much power, prevent employees from talking to each other, and perpetuate her deceit.

A recipe for disaster at Theranos: Silo mentality and intimidation

Wall Street Journal investigative reporter John Carreyrou’s book Bad Blood: Secrets and Lies in a Silicon Valley Startup (2018; my summary) is full of stories of how Holmes went out of her way to restrain employees from conferring about what they were working on. Even when they worked on the same project, Holmes made siloed functional teams report to her directly. She would edit progress reports before redirecting problems to other team heads.

Consider Ed Ku’s mechatronics team, responsible for designing the intricate mechanisms that control the measured flow of biochemical fluids. Some of the team’s components were overheating, impinging on one another, and cross-contaminating the clinical fluids. Yet Holmes wouldn’t allow Ku and his team to talk to the teams that worked on the biochemical processes.

Silo mentality can become very problematic when communication channels become too constricted and organizational processes too bureaucratic. Creativity gets stifled, collaboration limited, mistakes—misdeeds in the case of Theranos—suppressed, and collective objectives misaligned.

Idea for Impact: Functional silos make organizations slow, bureaucratic, and complicated

Innovation hinges increasingly on interdisciplinary cooperation. Examine if your leadership attitude or culture is unintentionally contributing to insufficient accountability, inadequate information-sharing, and limited collaboration between departments—especially on enterprise-wide initiatives.

Wondering what to read next?

  1. Let’s Hope She Gets Thrown in the Pokey
  2. The Dramatic Fall of Theranos & Elizabeth Holmes // Book Summary of John Carreyrou’s ‘Bad Blood’
  3. You Need to Stop Turning Warren Buffett Into a Prophet
  4. Your Product May Be Excellent, But Is There A Market For It?
  5. When Work Becomes a Metric, Metrics Risk Becoming the Work: A Case Study of the Stakhanovite Movement

Filed Under: Business Stories, Leadership, Mental Models Tagged With: Biases, Critical Thinking, Entrepreneurs, Ethics, Leadership Lessons, Psychology, Thought Process

About: Nagesh Belludi is a St. Petersburg, Florida-based freethinker, investor, and leadership coach. He specializes in helping executives and companies ensure that the overall quality of their decision-making isn’t compromised by a lack of big-picture understanding.

Unless otherwise stated in the individual document, the works above are © Nagesh Belludi under a Creative Commons BY-NC-ND license. You may quote, copy and share them freely, as long as you link back to RightAttitudes.com, don't make money with them, and don't modify the content. Enjoy!