The Parables of The Sower
Whisper's Three Step Plan for Not Being a Retard
Published 05/14/23 by Whisper

I got asked a while ago "what I would do" about "Saddam Hussein".


When I get asked how I would arrange the world, I like to take a day or two to chew it over. Because usually, it's not just a matter of answering, it's a matter of first correcting the question.


Questions people ask carry their assumptions, and those assumptions tend to limit their thinking.

For example, if you're arguing with someone who is trying to learn to fly by gluing feathers to his arms and flapping them real hard, at some point, you're gonna get asked, "Well, okay, smart guy, what would you do if you jumped out of a plane and didn't have a parachute?"

If you answer the question as it's asked, you can say "spread my arms and legs and hope". And he'll say "Ah-hah! You admit that mine is the best idea!"

Or you can say "there's nothing to do in that situation". And he'll say "Ah-hah! You criticize my idea, but you have no ideas at all!"

But what you actually do is refuse to answer any question that carries hidden assumptions. Instead, you point out the assumptions and correct the question.

In this case, you say, "I wouldn't jump out of a plane without a parachute."

And he'll say, "But what if you did?"


Just keep saying "I wouldn't."


So, how would I "deal with" Saddam Hussein?


Simple. I wouldn't. I wouldn't put myself in that position in the first place. If you don't have a parachute, don't jump, genius. Why the fuck would I put myself in a position to care what happens in Kuwait?


"But they're an ally!"

What the fuck does that even mean, when you're talking about a country the size of a postage stamp with no army to speak of? How the fuck is that an ally? There's no mutuality in that mutual defense. Who the fuck decided to "ally" with them?

This is just using a previous bad decision to justify the further bad decision of doubling down on it.


"But we need to be allied with Kuwait because we are dependent on foreign oil!"

Well, there's another shit decision. So now we're justifying our bad decision in terms of a previous bad decision, which was in turn justified by another bad decision we made even earlier?

Stop running around the world looking for oil like a stoner in a weed drought. Build fast breeder reactors.

"But the idea of corium scares me!"

Okay, so your shit decision that prompted your shit decision that prompted your shit decision was based on yet another shit decision... the decision to base policy on your feels rather than data.

You wanna buy into FUD instead of asking yourself how many Three Mile Islands it takes to equal one Deepwater Horizon. (Hint: 71 if you're counting dollars. Infinite if you're counting lives.)
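
To unpack that hint, here's a back-of-envelope sketch in Python. The figures are the commonly cited rough ones, not precise accounting: about $1 billion for the Three Mile Island cleanup, about $71 billion in total Deepwater Horizon costs, and death tolls of zero and eleven, respectively.

    # Back-of-envelope comparison, using rough public figures (approximate).
    tmi_cleanup_usd = 1e9    # Three Mile Island cleanup: ~$1 billion
    dwh_total_usd = 71e9     # Deepwater Horizon total costs: ~$71 billion
    print(dwh_total_usd / tmi_cleanup_usd)  # 71.0 -- TMIs per Deepwater Horizon, in dollars

    tmi_deaths = 0    # no deaths attributed to Three Mile Island
    dwh_deaths = 11   # eleven rig workers killed on Deepwater Horizon
    # dwh_deaths / tmi_deaths raises ZeroDivisionError:
    # "infinite", if you're counting lives.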

That's the problem with talking public policy with most people. Their shit decisions are based on an infinite recursion of shit decisions, stretching back to a point in history that they identify as "that's just the way the world is".

In the spirit of this realization, I would like to introduce Whisper's Three Step Plan for Not Being a Retard.

1. Don't make stupid choices.
2. If you have already made a stupid choice, don't double down on it.
3. If you inherit a stupid choice someone else made before you were here, cut your losses and change direction.

Yes, the first one is difficult, and the other two are painful. But you can either suffer the pain of admitting error, or suffer the pain of not admitting error.

One of the most important things I learned in twenty-five years of engineering is that the earlier a problem is detected, the cheaper and easier it is to fix. The same bug that costs you five dollars when it's caught by a unit test suite or code analysis tool will cost you five hundred dollars when it's caught by the QA team, or five million dollars if it's caught by users.
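
To make that concrete, here's a minimal sketch in Python. The function, names, and dollar figures are invented for illustration; the point is the kind of bug that costs almost nothing when a unit test catches it, and a fortune when a customer's invoice does.

    import unittest

    def apply_discount(price: float, percent: float) -> float:
        """Return price reduced by the given percentage."""
        # Bug: divides by 10 instead of 100, so a "10% discount"
        # silently wipes out the entire price.
        return price - price * (percent / 10)

    class TestDiscount(unittest.TestCase):
        def test_ten_percent_off(self):
            # A $100 item at 10% off should cost $90. The buggy version
            # returns $0, so the suite fails before the code ever ships.
            self.assertAlmostEqual(apply_discount(100.0, 10.0), 90.0)

    if __name__ == "__main__":
        unittest.main()

Caught here, the fix is one character. Caught on a production invoice, it's refunds, apologies, and an incident report.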

But it's not enough to just detect a problem. Before you can solve it, cheaply or not, you also have to admit that it is a problem. If you try to save your face, or your ego, or your budget, by pretending it was a good decision all along, then you're just going to have to fix it later, when it's worse. Or you're going to wait until it's so bad that you cannot fix it, and it kills everything you tried to build.

But that's not the worst part about not facing your previous bad decisions. The worst part about it is that the longer you avoid that self-awareness, the greater the cost of that moment, and the more you're going to want to put it off and hide your head in the sand. Which in turn makes things even worse in a self-perpetuating cycle.


There are a lot of dudes out there, mostly ex-SOCOM types, who spent their entire careers carrying out US foreign policy. Who lost friends. Who came home with permanent injuries, both physical and psychological. Who based their entire ego-identity on the idea that they were defending freedom and their homeland.


How much do you think it would cost them to face the lie? To realize that all their suffering and sacrifice was for the sake of the Rockefellers and Lockheed Martin? That's some stick-a-P320-in-your-mouth level existential crisis shit right there. And it's not surprising that even guys who will happily freeze their asses off swimming miles off Coronado would rather not face that shit.

That's why they'll spin you true, but ultimately irrelevant, tales about how bloodthirsty the people who hate us are, justifying this decade's shit decision in terms of the previous decade's shit decision. And that's how you end up in a VA hospital missing both legs while Boeing stockholders are in Barbados drinking pina coladas.

See, the secret is that you don't become a retard by making bad decisions. Everyone makes bad decisions. I once dated a Ukrainian-American lingerie model whose love language was "gifts".

While I was broke and in college.

No, retards are people who double down on their bad decisions instead of backing out of them the moment they realize they were bad. Learn to do this, and you will survive your mistakes. I survived Kateryna.

If you have trouble with this notion, remember that you don't have to admit your screwups to anyone but yourself... if you just stop the behavior and let the matter drop, people will usually forget. Also remember that some things take time and persistence to pay off, but that's not an excuse for continuing to lie to yourself once you realize that your bad results are coming from your choices, not from a lack of commitment to them.

Lastly, do not flagellate yourself, literally or metaphorically, for these kinds of mistakes. The more ashamed you are of making mistakes, the harder it will be to admit to yourself that you made them.

The secret to not fucking up your life, whether you are a single person or an entire country, is as simple as "When you are already in a hole, stop digging."





Wrong?
Published 03/06/23 by Whisper

I actually like debating with people who are wrong on the internet. It's generally thought of as an exercise in futility, but that's only true if you're trying to change someone's mind. I do it because what they say, and how I respond, makes me have thoughts I might not otherwise have had.

The other day, this habit helped me solve a puzzle I've been working on for ten years.

Every psychologist knows that when you argue against someone's strongly held belief, you increase their conviction in that belief. In fact, the more compelling your argument, or evidence, is, the more you strengthen this belief.

But what psychologists cannot tell you is why.

The only explanation they usually venture is that accommodation (the act of changing one's world-model to deal with new information) takes more effort than assimilation (fitting new information into an existing worldview).

But that doesn't hold water. Because people whose beliefs are being contradicted aren't just coming off as lazy and apathetic. No. They get mad. They get so mad that they sometimes resort to insults, or violence, or start to hate the person doing it. And when we talk about debates, an argument against someone's position is often referred to as an "attack".

Now, psychologists call this cognitive dissonance, meaning the stress people experience when their beliefs are contradicted by others or by experience. They're right... sort of. It feels like pain, in an abstract way. We've all felt it to some degree, and we've all seen others react to it.

But this doesn't explain anything. It's just a label. Yes, this thing occurs... but why?

Why does a disconfirming experience feel painful? Why does it tend to actually strengthen a belief rather than weaken it? Why does this only happen with strongly-held beliefs, and not weakly-held ones?

Ever since I helped found an internet discussion group called "the red pill", which dealt in presenting disconfirming arguments against certain deeply held beliefs, I have watched this happen, and it puzzled me.

It couldn't be dismissed with "people are stupid", or otherwise explained in terms of individual dysfunction, because everyone does it to some degree. It's a human instinct.

Then, finally, one day, when arguing with someone online, I had an idea.

The words "right and wrong" mean both "correct and incorrect", and "good and evil". What if that's not a coincidence?

The majority of the brain isn't self-aware, but it heavily influences decision-making. What if those parts of the brain store moral opinions and factual beliefs in the same way? What if only the small, self-aware, prefrontal cortex is capable of making a distinction between the two?

That would mean that the midbrain doesn't really have an "ought" and an "is". Instead, it would have a category that encompasses both. I think that category might be "tribal alignment".

For example, "you ought not to disrespect tribal elders", and "don't step on the yellow and black snakes, they are poisonous", are a moral belief, and a factual one, but at their core, they both tell you what to do or not do. Why wouldn't the ever-practical and lazy programmer that is evolution store them in the same way.

Hence, alignment. Do your beliefs align with the tribe or not? And, as the world becomes larger and more integrated, which tribe do your beliefs align with? Or do they align with none at all?

(And if you're thinking here that it's no coincidence that a popular role-playing game uses this term to describe morality, then you might be correct.)

So perhaps people get mad when you challenge their beliefs, because, to their unconscious mind, you are basically telling them to unalign with their tribe. And under the conditions of brain evolution (stick and rock technology), isolation meant death.

(This might be why people living in the worst of communist dictatorships have never been as unfree as hunter-gatherers living in tribes. Tyrants have a very long list of things you may not do, but primitives have a very short list of things you may.)

Let's try an example:

Suppose I, an atheist, am speaking to a Christian about whether or not there is a god. The more factual arguments I muster, and the more compelling they are, the more I am, in effect, demanding that someone abandon their community. And that doesn't even mean, necessarily, walking away. It can also mean losing the sense of belonging. And the latter can happen to someone involuntarily, merely by listening to me, without any conscious act of defection.

If we think about it like that, no wonder it makes people mad! They are being asked to sacrifice a concrete source of comfort and (perceived) benefit, in exchange for... well, at worst nothing, but even at best the benefits of being "correct", which might exist, but even if they do, are hypothetical and they can't see them.

Of course trying to cut someone off from the tribe is experienced as an attack.

But every tribe, by its very nature, has to be wrong about something. Because new evidence comes to light all the time, and if your beliefs aren't able to turn on a dime, then pretty soon some of them are going to be outdated. And if your beliefs are able to turn on a dime, constantly pivoting to whatever the current evidence suggests, then you have no tribe, you can't have a tribe, because you have nothing to hold that tribe together.

(This is why libertarians don't win elections.)

So the universal human experience is a choice: suffer the pain of delusion, or suffer the pain of isolation.

And these tradeoffs play out in a number of different ways... some false beliefs cost you very little, and some of them ruin your life. Some beliefs align you with close-knit and beneficial communities, and some align you with random nutjobs, or no one at all.

And how costly beliefs are can change.

Galileo was threatened by the church for challenging an aligning belief that was, at the time, consequence-free for most people. It didn't really matter in someone's daily life if they thought the sun orbited the earth, or vice versa.

And so Galileo was treated as if he were advocating genocide, or something, because Popes' brains don't make a distinction between factual positions and moral ones.

But, in the twenty-second century, when we start trying to mine asteroids, the same belief would carry a huge cost, and would have to be stamped out. Fortunately, that's already happened, partially because of Galileo.

But... what is different about Galileo? And about the people who read his work, nodded, and said "that sounds right, I'm going to change my beliefs"?

Well, there's a lot of people who run around priding themselves on being smarter than others, or more objective. And that makes them feel good. But is that the whole story?

What if the people who embrace unaligning beliefs are doing so, at least partially, because they are already unaligned and have nothing to lose? What if Galileo was already suffering the pain of isolation, due to his vast personal differences from the people of his day? If so, it would cost him very little to pivot his beliefs, and thereby avoid the pain of delusion as well.

And what if this equation is also impacted by how costly a particular delusion is?

Which brings us back to the red pill. The red pill was a push for unalignment; an attempt to help ourselves and others reject the "matrix" of an aligning belief system that was costing us a great deal. But it gradually became about other types of unalignment as well, and, more importantly, about the ability to unalign.

This what "taking the red pill" really means in a larger context... making the conscious choice to suffer the pain of isolation instead, when the pain of a particular delusion becomes large enough to make matrix participation a bad investment.

This is why we got so much hate mail at first. We weren't proselytizing, but our mere existence created a risk of unalignment for anyone who became aware of our views. That was a threat both to those people themselves, because they found unalignment painful, and to others, who found those people's unalignment costly (because the unaligned are harder to exploit).

Thus, we were, in a very real sense, an infohazard. A self-perpetuating meme which caused painful cognitive dissonance.

My current hypothesis is that most resistance to well-supported ideas rests upon this dynamic. It might even be the case that being objective isn't always a good idea... that it might not always be beneficial to be correct. Some factually correct beliefs might have a very high psychological or external cost, in exchange for a lot less personal benefit.






