The Comfort Trap
Why We Obey, Conform, Deny, and Dig In — Even When We Know Better
Let’s Get Uncomfortable
It’s almost February, which by my reckoning means that about 90% of the 2026 crop of New Year’s resolutions have been fully abandoned for three weeks now, give or take. Our tenuous grasp on self-improvement notwithstanding, this is also the time of year when we’ve all started to settle back into whatever established routines we keep for ourselves. Last year is finished and the new year has begun.
The time for retrospection is over — maybe it’s time for a little introspection instead.
Maybe it’s the perfect time to talk about one of the most powerful forces in human psychology: our instinct to avoid discomfort at all costs.
Or maybe it isn’t. I’m no psych major. I was just looking for a “hook” that ties this topic to this time of year.
At any rate, we all do it. When something we see or hear clashes with what we believe, our brains turn on our “Check Engine” light.
And how do we respond?
We obey when we shouldn’t.
We conform when we don’t agree.
We deny the undeniable.
We double down when we should be admitting we are wrong and changing the way we think.
We do these things not because we’re bad or stupid.
We do them because being uncomfortable — morally, socially, emotionally, or intellectually — is the cerebral equivalent of stepping barefoot onto a piece of Lego.
The Mental Gymnastics We Perform to Feel Okay
Psychologists refer to this as cognitive dissonance — it’s the mental stress we feel when we hold conflicting attitudes or beliefs, or when we do things we know are wrong.
For example, people who smoke know that smoking is bad for them, and could even kill them. Some people have emphysema and still continue to smoke — and it doesn’t take a science major to deduce that dangling an open flame from your mouth while oxygen is being piped into your nose is probably not ideal.
It’s the kind of thing that makes you instinctively take a step back, just in case chemistry decides to make a point.
This creates an inner conflict for the guy who chain-smokes three or four packs a day. On one hand, he loves smoking. On the other, he also has a genuine fondness for living — free from life’s petty inconveniences like heart disease, stroke, cancer or being unceremoniously launched into low-Earth orbit when his cigarette meets his oxygen tank in a moment of randomly unfortunate collaboration.
We really hate when this happens — when our brains are trying to force us to admit we’re wrong. The posterior part of the medial frontal cortex is responsible for monitoring these conflicts, which is a fancy way of saying there’s a tiny, overworked intern in your brain whose only job is to whisper, “Uh… are you sure about this?”
Luckily for us, however, we stand armed and ready to deploy any number of ways to combat these feelings of conflict:
We find ways to rationalize and justify our behavior.
We hide our behavior from others out of feelings of shame or guilt.
We seek out (and immediately believe) information that supports our beliefs, while denying (if not outright ignoring) information to the contrary.
We Google “Is smoking actually that bad?” and click the one article written by a guy named Dr. Nicholas Riviera who says it’s fine.
Simply put, we hate being wrong. So we’re only too willing to blatantly ignore things like common sense, reason, good judgement and facts in exchange for using psychological origami to twist our ugly reality into a beautiful swan.
This instinct to dodge discomfort is the fertilizer we use to foster the growth of all our weird behaviours. Obedience, conformity, denial, doubling down… they all sprout from the same place.
Which brings us to some of the most unintentionally hilarious (if not deeply unsettling) psychology experiments ever conducted…
The Shocking Truth About Obedience
In 1961, psychologist Stanley Milgram ran an experiment to measure how willing people were to obey an authority figure who asked them to do something that ran afoul of their personal conscience.
It went like this:
There were two people involved in the experiment — the “learner,” strapped into a chair in a separate room, and the “teacher,” who couldn’t see him.
The teacher would read a list of word pairs aloud, then repeat the first word of each pair along with four possible matches for its partner. The learner would press a button to select his answer.
If the learner got the wrong answer, the teacher was instructed to administer an electric shock to him — and to increase the voltage with each incorrect answer.
Milgram wanted to see how far people would go when instructed by an authority figure.
Turns out the answer is: pretty damned far.
As the incorrect answers piled up, the teachers kept administering shocks at higher and higher voltages. They could hear the consequences — mild groans at first, then banging on the walls, then pleas for it to stop. And if a teacher hesitated or asked to stop, the experimenter calmly insisted they continue. Stopping was “not an option.”
Eventually, at the higher voltages, the screaming stopped. The learner’s room fell silent. And, as instructed, the teacher continued the procedure — asking the next question, waiting five to ten seconds for a response, and treating the silence as another wrong answer. Then came the next shock. And the next. All delivered into a void of absolute quiet.
Of course, the learner wasn’t actually being shocked. He was part of the research team — intentionally giving wrong answers, pretending to be in pain, “begging for mercy,” and finally going quiet to feign unconsciousness — or worse. But as far as the teacher was concerned, it was all too real.
The results of the experiment were (pardon the pun) shocking.
Every participant “administered” at least 300 volts, and almost two‑thirds went all the way to the maximum of 450 volts. To be fair, none of them were comfortable doing it. Many raised concerns, asked to stop, or even offered to return their $4.50 honorarium — which is about $47 in today’s dollars.
For context, that’s not even enough to buy a flat of cheap beer today, let alone the kind you’d want to splurge on to celebrate electrocuting a total stranger.
Still, it proved people will check their morals, decency, and empathy at the door for a guy with a clipboard and less than five bucks.
And if people are willing to do what a stranger in a lab coat tells them, imagine how much harder it is to push back against someone they do know — a trusted friend, a respected colleague, or a supervisor who signs their performance review.
Now, I’m not saying your buddy or your boss would ask you to do something immoral or illegal. But Milgram’s experiment does underscore something disturbing: we will often look the other way rather than face the awkwardness of questioning authority.
Asch‑Backwards Thinking: Why Peer Pressure Makes Us Do Dumb Things
Ten years earlier, social psychologist Solomon Asch ran an experiment to determine whether people would conform to a group’s opinion even when that opinion was clearly wrong.
The setup was simple and clever:
The experiment used a group of seven or eight people, all but one of whom were actors working with the experimenter. While everyone was introduced as a “participant,” the real focus was on the lone genuine participant and how he reacted to the answers given by the actors.
The group was shown a card with a single line on it, and then another card with three lines on it. One of the lines on the second card was the same length as the first, while the other two were obviously longer or shorter.
Each person in the group was asked to identify which line on the second card matched the length of the line on the first card. The real participant always answered last, having heard everyone else’s answer.
In the first few rounds, everyone agreed on which line matched, but as the experiment progressed, the actors posing as participants would unanimously select the same incorrect line.
The findings: On those rigged rounds, participants went along with the group’s obviously wrong answer over a third of the time. And overall, 74% of participants gave at least one incorrect answer.
Solomon Asch himself said of his findings: “That intelligent, well-meaning young people are willing to call white black is a matter of concern.”
So why do people ignore reality and conform to or agree with things they know to be wrong? It basically comes down to our unwillingness to be uncomfortable. Disagreeing, standing out or being the lone voice of reason in a group dynamic is uncomfortable.
This explains a lot — like meetings where everyone nods along to a deadline that is clearly impossible to meet, or the family Christmas dinner where everyone raves about Uncle Mel’s perennial potluck offering of “Braised Polecat and Cabbage Casserole.”
Mood Lighting for the Morally Bankrupt
If a room full of strangers can make you question your own eyesight, imagine what happens when the person distorting your reality isn’t a stranger at all — but someone you know.
Up to this point, the distortions we’ve talked about are mostly harmless — experiments that gave some poor schmo the heebie‑jeebies for a moment before being revealed for what they were.
Gaslighting is where things get darker.
Literally, as it turns out. The term comes from a 1938 play called Gas Light, where a manipulative husband secretly dims the gas lamps in the house and then insists to his wife that nothing has changed. When she notices the lights flickering or fading, he tells her she’s imagining things or being “overly sensitive,” slowly pushing her to doubt her own perception and depend on him for reality itself.
Gaslighting happens when someone convinces you that your own memory, perception, or experience is wrong. And unlike the Asch experiment, where strangers pressure you into doubting your eyes, gaslighting usually comes from someone you trust.
That’s what makes it powerful — and dangerous.
The Asch Experiment shows how a group of strangers can make you question your reality.
Gaslighting shows that the same effect can be achieved by a single person you know and trust.
At its worst, gaslighting is an intentionally abusive behaviour.
But it can be unintentional too. Sometimes people are so convinced of their own version of events — even events that never happened — that they refuse to entertain the possibility of being wrong. They don’t question themselves, they don’t examine the plausibility of their memories, and they insist on their version with total confidence.
For the person on the receiving end, the impact is the same: it feels like deliberate manipulation, even when it isn’t.
I’m trying to think of something funny or clever to write here to wrap up this section, but gaslighting is such an icky subject that I really don’t think I can. So here’s a Dad joke instead: Why was the letter E the only letter to get a present from Santa? Because the rest of the letters were not E!
I’ll see myself out now…
Doubling Down: Because Repeating Something False Somehow Makes It True
When someone shows us evidence that contradicts our way of thinking, we should say, “Oh! That’s interesting! I’ll need to read more about that. Thank you so much!” And, to be fair, many of us do — but only as a courtesy, not because we have any real intention of altering our beliefs.
So we don’t.
We insist the evidence is flawed. We declare the source unreliable. We ignore facts that directly contradict what we believe. We dismiss something as a lie simply because we want it to be one. Our own version of reality is unassailable, and we willfully wallow in our wonky worldview.
We do this because admitting we’re wrong threatens our identity. Just when we need critical thinking the most, we don’t use it at all — or maybe we can’t, because it’s a skill we never learned.
Instead of using it to see the world more clearly, we let new information that undermines our viewpoints trigger our brains to ignore it altogether, or to quietly shuffle our own beliefs around like Tetris blocks so they better fit what we’ve just heard. Either way, we carry on — happily and obliviously — as if nothing is wrong.
Doubling down is the psychological equivalent of rearranging the deck chairs on The Titanic after it has collided with an iceberg named “Fact”, all while reminding screaming passengers scrambling into lifeboats that dinner will be served at five o’clock.
Shiny Hats, Shaky Facts
At the extreme end of the Discomfort Reflex spectrum, we find conspiracy theories — psychological comfort food for people who find reality too chaotic, too random, or too widely accepted.
I’ve heard some doozies in my day:
The government has a machine that controls the weather.
The Pyramids and Stonehenge were built by aliens.
A cure for cancer/diabetes/male pattern baldness exists, but pharmaceutical companies are suppressing it because treating symptoms is more profitable.
The NHL has been complicit in preventing a Canadian team from winning the Stanley Cup since Gary Bettman became Commissioner.
Each of them delivered with the straightest of faces, in tones dripping with gravitas.
At the heart of every good conspiracy is something that is either true, or at least sounds as if it could be true. But the road to the actual conspiracy itself is one so long and winding that Sir Paul McCartney himself would be impressed.
That is, if he were still actually alive and not being played by an actor who’s been tricking you sheeple for the past 60 years.
There’s one in every family or workplace. You listen politely, trying not to be judgmental and resisting the urge to beat them senseless with a Grade 9 science textbook.
But whatever you do, you should never ever engage them by asking questions. Because not only do they deny contradictory information and double down, they absolutely thrive in a vacuum of evidence. What makes conspiracy theorists unbeatable in arguments is that whenever they lack proof (and they always do), they inevitably use the conspiracy itself as the explanation.
There’s no evidence because they are hiding it.
And who are they? It varies: the government, the Illuminati, big pharma, big hockey.
Why do people believe in conspiracies?
Lots of reasons. Some struggle with fragile self‑esteem. Others lack intellectual humility and can’t tolerate the possibility that they might be wrong. Still others crave the feeling of being important, intelligent, or uniquely knowledgeable.
For people with these tendencies, conspiracy theories offer emotional relief. They create the illusion of being “in the know,” of having control over something they absolutely cannot control. Nothing fuels a conspiracy theorist’s superiority complex more than the belief that they alone understand how the world really works. And once a conspiracy becomes part of their identity, denying it would cause a rapid, tragic deflation of their ego. They’d rather cling to a bizarre belief devoid of common sense than admit they were hoodwinked.
And, just for the record, tin foil hats are not only incapable of blocking mind control signals delivered by shadowy governments — science says they might actually amplify some radio frequencies by acting as an antenna.
Or so “big aluminum” would have us believe…
Here’s To The “Crazy” Ones
These behaviours come from the same place — the very human need we all have to feel that we belong and are respected. That we are good, intelligent people capable of making smart decisions for ourselves and our families. And when we are forced to face something that says we’re wrong, it’s like being walloped in the face with a frozen salmon. We’d rather accept the lies being told to us, or cling to the belief that we can’t be wrong, than face the discomfort of pushing back or looking inward.
The truth is, discomfort brings real benefits. It’s where empathy, growth, integrity, and courage live. It doesn’t take grand pronouncements — sometimes, it’s just confidently saying things like:
“This is wrong. We shouldn’t be doing this.”
“I don’t think you’re correct.”
“Maybe I need to reconsider my position on this.”
“Wait — this doesn’t make sense. Maybe I’m wrong.”
We don’t have to be contrarians or troublemakers.
Every now and then, we just need to be at ease with feeling uneasy.
Maybe being vexed isn’t all that bad — maybe we just need a little vex therapy: tolerating a little discomfort, asking for clarification, or acknowledging the elephants in the room without triggering a full-blown stampede.
Don’t forget that compliance isn’t the same as competence, and being content isn’t the same as being correct.
So here’s to the brave, the thoughtful, and the mildly uncomfortable. The people willing to sit with dissonance long enough to learn something from it. The ones who trust their own eyes, question their own assumptions, and admit when they’re wrong — even if that makes them look a little “crazy.”
Remember this — so the next time someone tells you to press a button, maybe you’ll ask them to switch places with the guy strapped in the chair first.
Like what you read? I write, rewrite, overthink, rewrite again, and eventually post these things in hopes they resonate. If something struck a chord, sparked an idea, or — most importantly — made you laugh, please drop a comment below. Sarcasm is welcome, cruelty is not. So be honest, and be nice. It’s possible to do both.
And if you'd like to support the effort (or just bribe me to keep going), you can buy me a coffee. No pressure — but caffeine is a powerful motivator.
Full disclosure: you can’t actually make me buy a coffee with your donation. I might use it for a beer instead.