Comment

Comments and observations on social and political trends and events.

Sunday, March 1, 2026

8 Signs You’re Not Thinking Critically

I receive a newsletter from a site called Critical Thinking Secrets. The latest one covers eight signs of not thinking critically. I've provided the contents of this newsletter below.

8 Signs You’re Not Thinking Critically 🚩

Even the most brilliant minds can slip into "lazy thinking." Our brains are wired for efficiency, which often means taking the path of least resistance. Recognizing when you’ve fallen into a weak thinking pattern is the first step toward reclaiming your intellectual edge.

Here are eight warning signs that your critical thinking has gone offline—and how to reboot it.

1. You Feel a Surge of Defensive Anger

If your immediate reaction to a dissenting opinion is a "hot" emotional response rather than curiosity, you are likely protecting an identity, not a logic-based belief.

The Fix: Use the 5-Second Rule. Pause and ask: "Is this person attacking me, or just my current idea?"

2. You Can’t Explain the "Other Side"

If you think the people who disagree with you are simply "stupid" or "evil," you don't understand their argument well enough to critique it.

The Fix: Practice Steel-Manning. Try to write down the most rational version of their argument. If you can’t, you’re in an echo chamber.

3. You Search for Proof, Not Truth

If you find yourself Googling things like "Why [My Opinion] is right" instead of "Pros and cons of [Topic]," you are a victim of Confirmation Bias.

The Fix: Search for "disconfirming evidence." Force yourself to read one credible source that contradicts your view.

4. You Rely on Anecdotes Over Data

If you dismiss a statistical trend because "I know a guy who..." or "That didn't happen to me," you are letting a single story override reality.

The Fix: Check the Base Rate. Remember that your personal experience is a sample size of one. What does the data say about the other 99.9%?

5. You Defer Entirely to Your "Tribe"

If your opinions on politics, health, and technology perfectly align with your social circle or favorite influencer without exception, you aren't thinking—you're conforming.

The Fix: Find a "Point of Dissent." Identify one minor area where you disagree with your group and explore why.

6. You Use "All-or-Nothing" Language

Words like always, never, everyone, and nobody are red flags for False Dilemmas and oversimplification.

The Fix: Introduce nuance. Replace "always" with "frequently" or "under these specific conditions."

7. You’re "Morally Dumbfounded"

This happens when you feel a strong sense of "wrongness" but can't point to a logical reason why. You are letting Moral Intuition steer the ship without Moral Reasoning.

The Fix: Trace the harm. Ask: "Who is actually being hurt here, and what is the specific nature of that harm?"

8. You Accept Claims Because They Are "Common Sense"

"Common sense" is often just a collection of cultural biases. If you find yourself saying, "It’s just obvious," you’ve stopped investigating.

The Fix: Use First Principles. Strip away the "obvious" labels and look at the foundational facts. Is it really obvious, or just familiar?

Thursday, January 15, 2026

Coffee & COVID: On Spotting Propaganda

Jeff Childers, attorney and author of the daily newsletter Coffee & COVID, comments on a story in The New York Times that reports on the booming economy while pooh-poohing the idea that Trump’s tariffs have anything to do with it. Whether or not you agree with Childers’ take on the effects of the tariffs, he makes a valid point about how The Times and other news outlets shape their stories to steer you to the opinion they want you to have.

When reading this type of ‘news’ critically, balance is the first thing you should look for. Here’s the formula: the articles report a scrap of actual news (e.g., the economy is booming), and then round up several “experts” to tell readers what to think about the news.

If the “expert” portion of the story is unbalanced, then you are reading propaganda, not news. Corporate media uses experts to publish its own opinions—its bias—while hiding in the bushes behind the carefully curated people who all magically agree with its perspective. By publishing a totally lopsided group of voices, the reporter hopes to fool the reader into assuming expert “consensus” exists, without ever having to explicitly make that dubious argument.

Assuming you are masochistic enough to consume corporate media’s articles, when reading this type of piece, always first ask: “Do all the quoted sources agree with each other, while varying expert opinions are conspicuously absent?” If so, you can safely ignore all the quotes and focus just on the factual reporting of what actually happened.

Believe it or not, this kind of reporting is what is most responsible for killing legacy media and driving people to social media for news. On social media, folks actually find the diversity of voices and opinions that is lacking in contemporary corporate media. Even allowing for all the noise of misinformation, outright lies, silliness, and unintelligent commentary, Twitter’s “town square” beats whatever the Times is serving up.

At least the bias is obvious on Twitter/X, which is all anybody asked for anyway.

It would be trivially easy for big news publishers like the Times to give readers right-click access to quoted experts’ biographies, previous comments, publication history, and political donation records. But they don’t. Think about that. And think about the claim that publications like the Times allegedly exist to “inform” us.

I’m old enough to remember “the good old days” when the news would offer more than one set of expert opinions. Now they refer to people who disagree with the experts they’re pushing as “deniers,” “discredited,” “debunked,” and so on. Recently a lawyer friend told me he could almost always find an expert who would support the position he was taking in a case. Seems that’s exactly what our “news” media does too.


Wednesday, January 7, 2026

The Fallacy of Mind Reading + The Ladder of Misinference

In his book Loserthink: How Untrained Brains Are Ruining America, Scott Adams (creator of the Dilbert cartoon series) says: “We humans think we are good judges of what others are thinking. We are not. In fact, we’re dreadful at it. But people being people, we generally believe we are good at it while also believing other people are not.” (I highly recommend Adams’ book.)

Today’s edition of Coffee & COVID newsletter provides a typical example of mind reading.

Yesterday, the Wall Street Journal defecated a hurt-feelings story headlined, “Alarm Spreads Among U.S. Allies Over Trump’s Demand for Greenland.”

In short, the Journal’s ‘news’ article reported that Trump is being mean to European élites, again, and it is making them feel unsafe, again. “Europeans are afraid of Trump,” said Pascal Boniface, director of the Institute for International and Strategic Affairs, a think tank in Paris. The Journal’s inelegantly implied theme, or thrust, was that European leaders’ fear of Trump explains why they didn’t criticize the recent arrest of Venezuelan narco-terrorist Maduro.

This is journalistic misdirection, and I’ll tell you why. The Journal was imputing a motive (fear of Trump) to all European leaders, without admitting that it was editorializing (emphasis added), to diminish the significance of the leaders’ apparent agreement with the move and thereby prevent it from legitimizing Trump’s actions. They only went along because they were afraid, was the Journal’s implied argument, which was dressed up as ‘news.’

Mind reading is one of the most common cognitive errors I see in critical thinking and news reporting.

On a related topic, Alex Edmans points out two other common errors in his excellent book May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases – And What We Can Do about It. As he says in the Introduction:

We’ll take a deep dive into two psychological biases – confirmation bias and black-and-white thinking – that are the two biggest culprits in causing us to misinterpret information.

Edmans, professor of Finance at London Business School, introduces his Ladder of Misinference. The following paragraph from the Introduction summarizes it.

We accept a statement as fact, even if it’s not accurate – the information behind it may be unreliable and may even be misquoted in the first place. We accept a fact as data, even if it’s not representative but a hand-picked example – an exception that doesn’t prove the rule. We accept data as evidence, even if it’s not conclusive and many other interpretations exist. We accept evidence as proof, even if it’s not universal and doesn’t apply in other settings.

May Contain Lies takes a different approach from many of the books I’ve read on critical thinking. Those other books catalog common fallacies and cognitive biases; Edmans plows new ground by offering a fresh way of looking at statements and claims. Highly recommended!

Loserthink and May Contain Lies are two of my favorite books on critical thinking and being objective. They can help keep us from falling into thinking traps.