The Parables of The Sower
Wrong?
Published 03/06/23 by Whisper [1 Comments]

I actually like debating with people who are wrong on the internet. It's generally thought of as an exercise in futility, but that's only if you're trying to change someone's mind. I do it because what they say, and how I respond, gives me thoughts I might not otherwise have had.

The other day, this habit helped me solve a puzzle I've been working on for ten years.

Every psychologist knows that when you argue against someone's strongly held belief, you increase their conviction in that belief. In fact, the more compelling your argument or evidence, the more you strengthen that belief.

But what psychologists cannot tell you is why.

The only explanation they usually venture is that accommodation (the act of changing one's world-model to deal with new information) takes more effort than assimilation (fitting new information into an existing worldview).

But that doesn't hold water, because people whose beliefs are being contradicted don't just come off as lazy and apathetic. No. They get mad. They get so mad that they sometimes resort to insults, or violence, or start to hate the person doing it. And when we talk about debates, an argument against someone's position is often referred to as an "attack".

Now, psychologists call this cognitive dissonance, meaning the stress people experience when their beliefs are contradicted by others or by experience. They're right... sort of. It feels like pain, in an abstract way. We've all felt it to some degree, and we've all seen others react to it.

But this doesn't explain anything. It's just a label. Yes, this thing occurs... but why?

Why does a disconfirming experience feel painful? Why does it tend to actually strengthen a belief rather than weaken it? Why does this only happen with strongly-held beliefs, and not weakly-held ones?

Ever since I helped found an internet discussion group called "the red pill", which dealt with presenting disconfirming arguments against certain deeply held beliefs, I have watched this happen, and it has puzzled me.

It couldn't be dismissed with "people are stupid", or otherwise explained in terms of individual dysfunction, because everyone does it to some degree. It's a human instinct.

Then, finally, one day, when arguing with someone online, I had an idea.

The words "right and wrong" mean both "correct and incorrect", and "good and evil". What if that's not a coincidence?

The majority of the brain isn't self-aware, but it heavily influences decision-making. What if those parts of the brain store moral opinions and factual beliefs in the same way? What if only the small, self-aware, prefrontal cortex is capable of making a distinction between the two?

That would mean that the midbrain doesn't really have an "ought" and an "is". Instead, it would have a category that encompasses both. I think that category might be "tribal alignment".

For example, "you ought not to disrespect tribal elders", and "don't step on the yellow and black snakes, they are poisonous", are a moral belief, and a factual one, but at their core, they both tell you what to do or not do. Why wouldn't the ever-practical and lazy programmer that is evolution store them in the same way.

Hence, alignment. Do your beliefs align with the tribe or not? And, as the world becomes larger and more integrated, which tribe do your beliefs align with? Or do they align with none at all?

(And if you're thinking here that it's no coincidence that a popular role-playing game uses this term to describe morality, then you might be correct.)

So perhaps people get mad when you challenge their beliefs, because, to their unconscious mind, you are basically telling them to unalign with their tribe. And under the conditions of brain evolution (stick and rock technology), isolation meant death.

(This might be why people living in the worst of communist dictatorships have never been as unfree as hunter-gatherers living in tribes. Tyrants have a very long list of things you may not do, but primitives have a very short list of things you may.)

Let's try an example:

Suppose I, an atheist, am speaking to a Christian about whether or not there is a god. The more factual arguments I muster, and the more compelling they are, the more I am, in effect, demanding that this person abandon their community. And that doesn't necessarily mean walking away. It can also mean losing the sense of belonging. And the latter can happen to someone involuntarily, merely by listening to me, without any conscious act of defection.

If we think about it like that, no wonder it makes people mad! They are being asked to sacrifice a concrete source of comfort and (perceived) benefit in exchange for... well, at worst nothing, and at best the benefits of being "correct", which, even if they exist, are hypothetical and invisible to them.

Of course trying to cut someone off from the tribe is experienced as an attack.

But every tribe, by its very nature, has to be wrong about something. New evidence comes to light all the time, and if your beliefs can't turn on a dime, then pretty soon some of them are going to be outdated. And if your beliefs can turn on a dime, constantly pivoting to whatever the current evidence suggests, then you have no tribe. You can't have a tribe, because you have nothing to hold that tribe together.

(This is why libertarians don't win elections.)

So the universal human experience is a choice: suffer the pain of delusion, or suffer the pain of isolation.

And these tradeoffs play out in a number of different ways... some false beliefs cost you very little, and some of them ruin your life. Some beliefs align you with close-knit and beneficial communities, and some align you with random nutjobs, or no one at all.

And how costly beliefs are can change.

Galileo was threatened by the church for challenging an aligning belief that was, at the time, consequence-free for most people. It didn't really matter in someone's daily life if they thought the sun orbited the earth, or vice versa.

And so Galileo was treated as if he were advocating genocide, or something, because Popes' brains don't make a distinction between factual positions and moral ones.

But in the twenty-second century, when we start trying to mine asteroids, the same belief would carry a huge cost, and would have to be stamped out. Fortunately, that stamping-out has already happened, partly because of Galileo.

But... what was different about Galileo? And about the people who read his work, nodded, and said "that sounds right, I'm going to change my beliefs"?

Well, there are a lot of people who run around priding themselves on being smarter or more objective than others. And that makes them feel good. But is that the whole story?

What if the people who embrace unaligning beliefs are doing so, at least partially, because they are already unaligned and have nothing to lose? What if Galileo was already suffering the pain of isolation, due to his vast personal differences from the people of his day? If so, it would have cost him very little to pivot his beliefs, and thereby avoid the pain of delusion as well.

And what if this equation is also impacted by how costly a particular delusion is?

Which brings us back to the red pill. The red pill was a push for unalignment: an attempt to help ourselves and others reject the "matrix" of an aligning belief system that was costing us a great deal. But it gradually became about other types of unalignment as well, and, more importantly, about the ability to unalign.

This what "taking the red pill" really means in a larger context... making the conscious choice to suffer the pain of isolation instead, when the pain of a particular delusion becomes large enough to make matrix participation a bad investment.

This is why we got so much hate mail at first. We weren't proselytizing, but our mere existence created a risk of unalignment for others who became aware of our views. This was a threat both to those people themselves, because they found unalignment painful, and to others, who found that unalignment threatening (because unaligned people are harder to exploit).

Thus, we were, in a very real sense, an infohazard. A self-perpetuating meme which caused painful cognitive dissonance.

My current hypothesis is that most resistance to well-supported ideas rests upon this dynamic. It might even be the case that being objective isn't always a good idea... that it might not always be beneficial to be correct. Some factually correct beliefs might have a very high psychological or external cost, in exchange for a lot less personal benefit.

Comment by HumanSockPuppet on 03/09/23 02:47am

I think that's why Vonnegut invented the word "granfalloon". He recognized that people had a reflexive, pre-cognitive need to avoid the pain of isolation and form groups, even if the justifications for those groups were threadbare.

Vonnegut merely lacked a formal explanation for the process.

There's certainly a price to be paid for being right too soon, or right in the wrong social context. I think the way to defeat the (Mis)Alignment Conundrum is by making strategic use of granfalloons. If you cannot appeal to a person's reason, appeal to your mutual association with some other label.

For example, if you cannot appeal to reason with another person because your religions differ, you might appeal to your shared nationality. Or your shared species. Something which binds you on a nearer basis with respect to some greater external threat.

"Me against my brother.

My brother and I against our cousin.

My cousin and I against the stranger.

The stranger and I against the wild animal."