
Right Attitudes

Ideas for Impact

Critical Thinking

The Problem of Living Inside Echo Chambers

June 14, 2021 By Nagesh Belludi

Psychologists use the term false-consensus effect to describe the human tendency to believe that we’re normal—that the way we see and do things is entirely representative of everybody else.

This false sense of consensus is intensified by our natural desire to associate with people similar to ourselves.

Social media algorithms make this worse—they reinforce our attitudes rather than change them. They steer us toward the sort of content we already know and like. They make it easy for us to form our own echo chambers, packed with people who share our views. This breeds confirmation bias: tribal allegiances shape flawed ideas and viewpoints about what is typical for organizations and communities.

Idea for Impact: Seek out and engage thoughtful folks who don’t think like you. Discuss, debate, and improve your reasoned understanding of one another and of the crucial issues. Your goal should be to enhance your own awareness of the counterarguments in contentious matters, not to win anyone over.

Wondering what to read next?

  1. The Sensitivity of Politics in Today’s Contentious Climate
  2. Couldn’t We Use a Little More Civility and Respect in Our Conversations?
  3. Of Course Mask Mandates Didn’t ‘Work’—At Least Not for Definitive Proof
  4. Presenting Facts Can Sometimes Backfire
  5. Moderate Politics is the Most Sensible Way Forward

Filed Under: Effective Communication, Mental Models, Sharpening Your Skills Tagged With: Conflict, Conviction, Critical Thinking, Getting Along, Persuasion, Politics, Social Dynamics, Thinking Tools

Creativity—It Takes a Village: A Case Study of the 3M Post-it Note

April 15, 2021 By Nagesh Belludi

Creativity isn’t always about sudden insights that work perfectly. No matter how good an idea is, it’ll probably need some work before it can mature into a helpful innovation.

The invention of the 3M Post-it (the original sticky note) is a particularly illuminating case in point: innovation requires actionable, differentiated insight, and cross-functional collaboration can help ensure creative involvement throughout the development process.

A Glue That Doesn’t Stick: A Solution Without a Problem

In the winter of 1974, a 3M adhesives engineer named Spencer Silver gave an internal presentation about a pressure-sensitive adhesive compound he had invented in 1968. The glue was so weak it could barely hold two pieces of paper together, and Silver and his colleagues could not imagine a good use for it. Yet the glue could be applied to a surface, peeled off, and reapplied elsewhere without leaving behind any residue.

In Silver’s audience was Arthur Fry, an engineer at 3M’s paper products division. Months later, on a frigid Sunday morning, Fry called to mind Silver’s glue in an unlikely context.

Fry sang in his church’s choir and used to put little pieces of paper in his hymnal to bookmark the songs he was supposed to sing. These makeshift bookmarks would often fall out, forcing Fry to thumb frantically through the book looking for the correct page. (This is one of those common hassles that we often assume we’re forced to live with.)

In a flash of insight, Fry recalled the weak glue he’d heard about at Silver’s presentation. He realized that the glue could be applied to paper to create a reusable bookmark: the adhesive bond was strong enough to stick to the page but weak enough to peel off without leaving a trace.

The sticky note was thus born as a bookmark called Press ’n Peel. It was initially sold in stores in four cities in 1977 and did poorly. When 3M offered free samples to office workers in Boise, Idaho, some recipients started using them as self-attaching notes. Only then did Post-it notes begin to catch on. They were introduced across America in 1980 and in Canada and Europe in 1981.

Ideas Intermingle and Evolve: Creativity Needs Collaboration

In all, it took twelve years from the initial discovery of the “glue that doesn’t stick” for 3M to make the Post-it available commercially. The Post-it continues to be one of the most widely used office products in the world.

This case study of the Post-it is a persuasive reminder that there’s a gap between an idea and its tangible application—one the creator cannot bridge alone. The creator has to expose the concept to diverse people who can evaluate, use, and trial the product.

In other words, the creative process does not end with an idea or a prototype. A happy accident often undergoes multiple iterations and reinterpretations that can reveal new applications for the concept. In the above example, Art Fry was able to see Spencer Silver’s invention from a different perspective and conceive of a novel use that its creator could not. And all this happened in 3M’s fertile atmosphere, one that many companies aspire to create to help ideas intermingle and creativity flourish.

Idea for Impact: Creativity Is About Generating New Possibilities

Creativity is a mental and social process involving the generation of new ideas and concepts—and new associations that connect the ideas with existing problems.

Excellent new ideas don’t emerge from within a single person or function but at the intersection of people and disciplines that may never have crossed paths before.

Wondering what to read next?

  1. How to Stimulate Group Creativity // Book Summary of Edward de Bono’s ‘Six Thinking Hats’
  2. Question the Now, Imagine the Next
  3. Defect Seeding: Strengthen Systems, Boost Confidence
  4. Four Ideas for Business Improvement Ideas
  5. You Can’t Develop Solutions Unless You Realize You Got Problems: Problem Finding is an Undervalued Skill

Filed Under: Business Stories, Sharpening Your Skills, The Great Innovators Tagged With: Creativity, Critical Thinking, Networking, Problem Solving, Teams, Thinking Tools, Thought Process

Why Your Judgment Sucks

April 5, 2021 By Nagesh Belludi

Israeli-American psychologist Daniel Kahneman’s bestselling Thinking, Fast and Slow (2011) describes the finer points of decision-making. It’s an engaging showcase of the mind’s innate biases and of our unthinking approaches to making decisions.

Human Beings are Intuitive Thinkers

Kahneman is a behavioral economics pioneer and the winner of the 2002 Nobel Memorial Prize in Economic Sciences. His lifelong collaboration with Amos Tversky (1937–96) shaped how we think about human error, risk, judgment, decision-making, happiness, and more. Tversky died in 1996 and so did not share in the Nobel.

Thinking, Fast and Slow explores what Kahneman calls the “mind’s machinery” as two coexisting modes of thought (“fast and slow,” as the title says). Kahneman splits cognition into two radically divergent modes, employing a two-tier model of the mind.

  • System One makes judgments instantly, intuitively, and automatically, as when a cricket batsman decides whether to cut or pull. A significant part of System One is made up of “evolved heuristics” that let us, for example, read a person’s expression in an instant from a block away. And it can’t be switched off. System One’s thinking is fast and effortless, but it often jumps to the wrong conclusions, relies on hunches and biases, and tends toward overconfidence.
  • System Two is slower, conscious, calculated, and deliberate, like long division. Its operations require attention. System Two is what we think of as “thinking”—slow, tiring, and essential. It’s what makes us human. Yet even when System Two believes it is on top of things, System One makes many of our decisions.

System One Isn’t All Flawed

In a world that often necessitates swift judgment and rapid decision-making (e.g., fight or flight), a person who relied solely on deliberative thinking (System Two) wouldn’t last long. Doctors and firefighters, for example, develop through training and repetition what’s called “expert intuition,” which helps them identify patterns and instinctively devise the right response to a complex emergency.

We humans are not simple rational agents; our thinking runs on these two systems of processing. As we strive to make better decisions in our work and personal lives, it pays to slow down and engage the more deliberate System Two mode of thinking. Learn to doubt your fast, intuitive judgments!

Human Intuition is Imperfect

Thinking, Fast and Slow is an eye-opener in various ways. It can be a frightening catalog of the biases, shortcuts, and cognitive illusions that distort our judgment—the endowment effect, priming, the halo effect, the anchoring effect, the conjunction fallacy, the narrative fallacy, and the rest. Such mental processes are not intrinsically flawed; they are heuristics—rules of thumb, stereotypes, shortcuts. They are strategies the mind embraces to find a path through a tsunami of data.

Kahneman teaches how to recognize situations that require slower, deliberative thinking. He asserts that the value of the book lies in giving people the vocabulary to spot biases and to critique the decisions of others: “Ultimately, a richer language is essential to the skill of constructive criticism.”

Recommendation: Read Daniel Kahneman’s Thinking, Fast and Slow (2011). One of the most popular non-fiction books of the last decade, it will open your eyes to the quirky, error-prone ways in which you can be influenced without suspecting it.

The concepts behind behavioral economics can make Thinking, Fast and Slow a laborious read, and many chapters get bogged down in the hair-splitting details of Kahneman’s rigorous scientific work. Still, it’s a commanding survey of the field, and the core ideas remain intelligible to non-experts.

Complement with Rolf Dobelli’s accessible The Art of Thinking Clearly (2013).

Wondering what to read next?

  1. The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’
  2. The Data Never “Says”
  3. Question the Now, Imagine the Next
  4. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  5. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Psychology, Thinking Tools, Thought Process

The Unthinking Habits of Your Mind // Book Summary of David McRaney’s ‘You Are Not So Smart’

April 1, 2021 By Nagesh Belludi

Psychologists have argued that many cognitive biases are rooted in mental shortcuts. They are heuristics—rules of thumb, stereotypes, instincts—that help you make sense of the world. They aren’t intrinsically flawed, but they’re often quirky and error-prone. These mental shortcuts can affect you in ways you don’t suspect.

David McRaney’s You Are Not So Smart (2011) offers a brief—if hurried—tour of 48 cognitive biases that can deceive you. Based on the author’s popular blog, the book is a satisfying way to understand other people’s—and your own—behavior a little better.

There is a growing body of work coming out of psychology and cognitive science that says you have no clue why you act the way you do, choose the things you choose, or think the thoughts you think. … From the greatest scientist to the most humble artisan, every brain within every body is infested with preconceived notions and patterns of thought that lead it astray without the brain knowing it. So you are in good company.

Each chapter starts with a brief statement of a misconception, followed by the fact. A synopsis of a related behavioral study then shows how our brains produce both the deception and the truth. Some of the lesser-known misconceptions discussed:

  • Confabulation. You tend to create unreliable narratives to explain away your choices post hoc. These reassuring perceptions can make you think you’re more rational than you actually are.
  • Groupthink. People tend to fall in with the rest of the group to minimize conflict and foster group cohesiveness and social acceptance. No one wants to be the one person with a dissenting opinion.
  • Social Loafing. That others in a team will pick up your slack may induce you to put in less effort if you think you’ll get away with it. This can curb your own performance, even if you’re a conscientious, hardworking type. If you don’t feel your participation will be noticed, why bother putting in the effort?
  • Availability Heuristic. You’re likely to estimate the likelihood of an event based on your ability to recall immediate and easily accessed examples.
  • Fundamental Attribution Error. You tend to assign external reasons for your own behavior but internal motives to other people. For instance, if you’re late for a meeting, you’ll blame it on public transport. If someone else is running late for a meeting with you, you’ll blame it on her poor time-keeping.

Recommendation: Read David McRaney’s You Are Not So Smart. It’s an engaging, easy-to-read primer on how the mind works. Read it as a lead-up to Daniel Kahneman’s bestselling Thinking, Fast and Slow (2011; summary forthcoming).

Idea for Impact: Once you learn to spot the cognitive biases we all grapple with, they’re easier to overcome.

Wondering what to read next?

  1. Why Your Judgment Sucks
  2. The Data Never “Says”
  3. Question the Now, Imagine the Next
  4. Situational Blindness, Fatal Consequences: Lessons from American Airlines 5342
  5. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Psychology, Thinking Tools, Thought Process

Five Ways … You Could Avoid Being Wrong

March 20, 2021 By Nagesh Belludi

  • Beware exaggeration of a kernel of truth. For instance: indeed, many of us don’t realize our full intellectual potential; but that doesn’t give credence to the notion that most people use only 10% of their brainpower. Besides, beware of overstatements of small differences. Sure, men and women tend to differ somewhat in their communication styles, but declaring that “men are from Mars” and “women are from Venus” is taking a kernel of reality to an extreme, not to mention coercing psychology into stereotypes.
  • Don’t infer causation from correlation. Don’t be tempted to conclude that if two things co-occur statistically, they must be causally related. (Rabbi Harold Kushner once noted, wryly, that circumcision seems to increase men’s chances of winning a Nobel Prize.) Seek contending explanations.
  • Beware biased sampling and extrapolation. Inferences from a biased sample are not as trustworthy as conclusions from a truly random sample—e.g., don’t ask people coming out of Sunday Mass if they have a personal relationship with Jesus Christ and infer that Americans are turning to God. Don’t ascribe to the whole any attribute of the part.
  • Don’t let stress impair your problem-solving capabilities. As many airline disasters confirm, haste can narrow your cognitive map—small errors can quickly become linked and amplified into disastrous outcomes. When you feel rushed, you’re likely to miss details. You’re not present enough in the moment to notice what’s important and make the most beneficial choices.
  • Beware argumentum ad nauseam. Don’t confuse a statement’s familiarity (as with urban legends) with its accuracy. The fact that you’ve heard a claim repeated over and over again (think of President Trump’s allegations of widespread voter fraud), sometimes with rearranged phrasing and substitute terms, doesn’t make it correct.

Bonus: Be suspicious of any claim that doesn’t come with counterarguments or disconfirming evidence.

Wondering what to read next?

  1. Be Smart by Not Being Stupid
  2. How to … Escape the Overthinking Trap
  3. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design
  4. Accidents Can Happen When You Least Expect Them: The Overconfidence Effect
  5. If You’re Looking for Bad Luck, You’ll Soon Find It

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Critical Thinking, Decision-Making, Risk

The Data Never “Says”

March 1, 2021 By Nagesh Belludi

Data doesn’t say anything. Indeed, data can’t say anything for itself about an issue any more than a saw can form furniture, or a sauce can simmer a stew.

Data is inert and inanimate. Data doesn’t know why it was created. Data doesn’t have a mind of its own, and, therefore, it can’t infer anything.

Data is a necessary ingredient in judgment. It’s people who select and interpret data. People can turn it into insight or torture it to bring their agenda to bear. Data is therefore only as useful as its quality and the skills of the people wielding it.

Far more than we admit, subjectivity and intuition play a significant role in how we collect, choose, process, explain, interpret, and apply data. As entrepreneur Margaret Heffernan warns in Willful Blindness: Why We Ignore the Obvious at Our Peril (2012), “We mostly admit the information that makes us feel great about ourselves, while conveniently filtering whatever unsettles our fragile egos and most vital beliefs.”

In the hands of careless users, data can end up having the opposite of the effect its creators intended. All data is good or bad depending on how it’s employed in a compelling story and what end it serves—neither of which the data itself can control.

  • Don’t let data drive your conclusions. Let data inform your conclusions.
  • Don’t declare, “The data says” (akin to saying “the stock market thinks”). Data by itself cannot have a particular interpretation.
  • When you find data that seems to support the case you wish to make, don’t swoop on it without caution and suspicion. Data can be very deceptive when used carelessly.
  • Be familiar with the limitations of your data. Investigate if your data informs any other equally valid hypothesis that could propose an alternative conclusion.

Idea for Impact: Beware of the risk of invoking data in ways that end up undermining your message.

Wondering what to read next?

  1. What if Something Can’t Be Measured
  2. In Praise of Inner Voices: A Powerful Tool for Smarter Decisions
  3. Be Smart by Not Being Stupid
  4. Of Course Mask Mandates Didn’t ‘Work’—At Least Not for Definitive Proof
  5. The “Ashtray in the Sky” Mental Model: Idiot-Proofing by Design

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Biases, Conversations, Conviction, Critical Thinking, Decision-Making, Persuasion, Problem Solving, Thinking Tools, Thought Process

Leadership is Being Visible at Times of Crises

February 25, 2021 By Nagesh Belludi

It’s terrible optics for an elected official to leave his constituency while it’s in the midst of a crisis.

In a grave slip-up for an ambitious politician, Texas Senator Ted Cruz’s initial lame excuse for his Cancún jaunt made him look insensitive. He was expected to stay and endure alongside his constituents, who were suffering through Texas’s recent freezing temperatures and blackouts.

Of course, Cruz didn’t do anything that hurt anybody, apart from drawing police resources away to shepherd him through the airport. Cruz’s argument—sensible in its own way—was that all he could do was stay in regular communication with the state and local officials spearheading the crisis response. After all, Cruz holds no formal power in the state administration.

By comparison, King George VI and Queen Elizabeth (the future Queen Mother) declined to leave London as bombs shattered their city during World War II. As an expression of concern and of commitment to the Allied cause, they even visited sites destroyed during the Blitz of 1940.

Idea for Impact: Leadership means serving as an anchor in times of crisis: being available, connected, and accessible. Leaders can’t do everything, and they need to delegate responsibilities. However, delegation should not entail emotional detachment.

Wondering what to read next?

  1. A Superb Example of Crisis Leadership in Action
  2. Make Friends Now with the People You’ll Need Later
  3. How to … Declutter Your Organizational Ship
  4. Making Tough Decisions with Scant Data
  5. Don’t Hide Bad News in Times of Crisis

Filed Under: Effective Communication, Leadership, Leading Teams Tagged With: Conflict, Critical Thinking, Decision-Making, Leadership, Leadership Lessons, Mindfulness, Problem Solving, Winning on the Job

How to Avoid Magical Thinking

February 22, 2021 By Nagesh Belludi

Magical thinking remains a subtle impediment to making sound decisions. The more you examine yourself, the more you can reduce your tendency to indulge in it.

Discover the truth for yourself. Beware of the tendency to let others think for you. Don’t accept uncritically what your parents, teachers, counselors, mentors, priests, and authorities of all inclinations have taught you from an early age. (The best predictor of people’s spiritual beliefs is the religiosity of their parents.) Question others’ underlying premises and discover for yourself what’s reasonable. Force yourself to test alternatives.

Don’t believe something is true just because you want it to be true. Many people believe in UFOs and ghosts, even though there’s no credible evidence of any visitation from outer space or of dead souls haunting abandoned buildings. Often, misinformation is cunningly designed to evade careful analytical reasoning—it can easily slip under the radar of even the most well-informed people. Shun blind optimism.

Consciously identify your biases and adverse instincts. Psychologists have identified more than 100 cognitive biases that can get in the way of clear and rational thinking. Explore how those biases could come into play in your thinking, and try to work out what triggers them. Work to extricate yourself from them to the best of your ability.

Demand proof when the facts should be demonstrable. Remain intellectually agnostic toward whatever hasn’t been established scientifically or isn’t provable. If you can’t determine whether something is true, suspend judgment. Beware of anecdotes—emotionally swaying stories in particular—for they are the weakest form of evidence.

Don’t believe in something that isn’t true just because there’s a practical reason to. If you feel emotionally inclined to believe in something because it gives you hope, comfort, and the illusion of control, identify your belief as just that. Faith is often no more than an inclination that hasn’t withstood the tests of reason; at its core, faith is an absence of doubt. There’ll always be people who reject evolution for reasons that have little to do with evolution. Don’t act with more confidence in unproven theories than is justifiable.

Idea for Impact: Be wary of the influences that can put you at risk for magical thinking.

Give critical thinking and systematic evidence the central role in how you understand the world. Improving the criteria you use to judge the truth of things is difficult—but essential. Maintain a consistent, well-balanced skepticism about everything, especially your own assumptions.

Wondering what to read next?

  1. Question Success More Than Failure
  2. In Praise of Inner Voices: A Powerful Tool for Smarter Decisions
  3. What if Something Can’t Be Measured
  4. Rapoport’s Rules to Criticize Someone Constructively
  5. Don’t Ignore the Counterevidence

Filed Under: Mental Models, Sharpening Your Skills Tagged With: Critical Thinking, Introspection, Mindfulness, Persuasion, Questioning, Thinking Tools, Thought Process

Never Accept an Anecdote at Face Value

February 19, 2021 By Nagesh Belludi

Human beings generally find anecdotes highly compelling. We’re not transformed as much by facts and statistics as we are by stories.

But anecdotes are rarely objective. They are uncontrolled, individual observations—sometimes no more than one.

Reported experience is subjective. Our recollections are ever-changing, and they’re often amazingly imprecise. We often misrepresent events to agree with the audience—even embellish with made-up minutiae to render our stories more compelling.

And for that reason, anecdotes are usually the weakest form of evidence. Anecdotes are subject to a host of biases such as confirmation bias, generalization, and cherry-picking. Moreover, for every anecdote, an equal and contrary anecdote can be proffered.

Idea for Impact: Be deeply suspicious of anecdotes. Arguments that draw on anecdotal evidence to make broad generalizations are liable to be fallacious.

Wondering what to read next?

  1. Here’s a Tactic to Sell Change: As a Natural Progression
  2. Lessons from JFK’s Inspiration Moon Landing Speeches
  3. What if Something Can’t Be Measured
  4. The Deceptive Power of False Authority: A Case Study of Linus Pauling’s Vitamin C Promotion
  5. Persuade Others to See Things Your Way: Use Aristotle’s Ethos, Logos, Pathos, and Timing

Filed Under: Sharpening Your Skills Tagged With: Biases, Communication, Critical Thinking, Persuasion

A Real Lesson from the Downfall of Theranos: Silo Mentality

February 4, 2021 By Nagesh Belludi

The extraordinary rise and fall of Theranos, Silicon Valley’s biggest fraud, makes an excellent case study on what happens when teams don’t loop each other in.

Theranos’s blood-testing device never worked as its founder and CEO, Elizabeth Holmes, claimed. She created an illusion that became one of the greatest start-up stories. She kept her contraption’s malfunctions and her company’s problems shockingly well hidden—even from her distinguished board of directors.

At the core of Holmes’s sham was how she controlled the company’s flow of information.

Holmes and her associate (and then-lover) Sunny Balwani operated a culture of fear and intimidation at Theranos. They went to such lengths as hiring superstar lawyers to intimidate and silence employees and anyone else who dared to challenge their methods or expose their devices’ deficiencies.

Holmes kept the charade going for so long by maintaining a tight rein on who talked to whom. She controlled the flow of information within the company. Not only that, she swiftly fired people who dared to question her approach, and she forcefully imposed non-disclosure agreements even on those exiting the company.

In other words, Holmes went to incredible lengths to create and maintain a silo mentality in her startup. Her intention was to wield much power, prevent employees from talking to each other, and perpetuate her deceit.

A recipe for disaster at Theranos: Silo mentality and intimidation approach

Wall Street Journal investigative reporter John Carreyrou’s book Bad Blood: Secrets and Lies in a Silicon Valley Startup (2018; my summary) is full of stories of how Holmes went out of her way to restrain employees from conferring about what they were working on. Even if they worked on the same project, Holmes made siloed functional teams report to her directly. She would edit progress reports before redirecting problems to other team heads.

Consider Ed Ku’s mechatronics team, responsible for designing the intricate mechanisms that controlled the measured flow of biochemical fluids. Some of the team’s components were overheating, impinging on one another, and cross-contaminating the clinical fluids. Holmes wouldn’t allow Ku and his team to talk to the teams developing the biochemical processes.

Silo mentality becomes very problematic when communication channels grow too constricted and organizational processes too bureaucratic. Creativity gets stifled, collaboration is limited, mistakes—misdeeds, in the case of Theranos—are suppressed, and collective objectives fall out of alignment.

Idea for Impact: Functional silos make organizations slow, bureaucratic, and complicated

Innovation hinges increasingly on interdisciplinary cooperation. Examine if your leadership attitude or culture is unintentionally contributing to insufficient accountability, inadequate information-sharing, and limited collaboration between departments—especially on enterprise-wide initiatives.

Wondering what to read next?

  1. Let’s Hope She Gets Thrown in the Pokey
  2. The Dramatic Fall of Theranos & Elizabeth Holmes // Book Summary of John Carreyrou’s ‘Bad Blood’
  3. You Need to Stop Turning Warren Buffett Into a Prophet
  4. Your Product May Be Excellent, But Is There A Market For It?
  5. Creativity by Imitation: How to Steal Others’ Ideas and Innovate

Filed Under: Business Stories, Leadership, Mental Models Tagged With: Biases, Critical Thinking, Entrepreneurs, Ethics, Leadership Lessons, Psychology, Thought Process


About: Nagesh Belludi [hire] is a St. Petersburg, Florida-based freethinker, investor, and leadership coach. He specializes in helping executives and companies ensure that the overall quality of their decision-making benefits isn’t compromised by a lack of a big-picture understanding.



Unless otherwise stated in the individual document, the works above are © Nagesh Belludi under a Creative Commons BY-NC-ND license. You may quote, copy and share them freely, as long as you link back to RightAttitudes.com, don't make money with them, and don't modify the content. Enjoy!