12:18am September 11, 2014
Anonymous asked: I didn't know what LW was so I googled it and read the about page, I'm not quite sure I get it... it was making my head hurt. I got a very 'this way of thinking is the ultimate right way' vibe from it.

chordatesrock:

withasmoothroundstone:

If it makes your head hurt, I’d strongly advise you to avoid it.  It makes my head hurt too.  And there’s a reason, and it’s not a good thing (but I can’t explain it in words, so I’m not even gonna try).

These words might not describe what you’ve noticed, but having read the site, I do have some words for it. This is getting long; I’ll tl;dr at the end of each section.

Firstly: LessWrong believes in belief propagation. Belief propagation means that if you believe something, you believe what the thing implies. An example of when that works well might be: I believe humans need food every day and I believe I haven’t eaten yet today, so I know I should eat something before the day is over. Or another example might be: I know bosons are smaller than atoms and I know the Large Hadron Collider can find bosons, so therefore, the LHC studies subatomic particles.

At best, belief propagation means that your beliefs mean things. They don’t just exist as nonsense claims: they imply other things. And if you learn something, you don’t just keep the fact in some dusty corner of your mind: you learn the things that follow from it.
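Belief propagation in this sense is just forward chaining over implications. Here's a minimal sketch in Python (the facts and rules are my own illustrative examples, not anything LessWrong actually formalizes):

```python
# Minimal forward-chaining sketch: a set of beliefs plus if-all-then rules,
# repeatedly applied until nothing new follows.
# All facts and rules here are made up for illustration.

def propagate(beliefs, rules):
    """Close a set of beliefs under simple (premises -> conclusion) rules."""
    beliefs = set(beliefs)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= beliefs and conclusion not in beliefs:
                beliefs.add(conclusion)
                changed = True
    return beliefs

rules = [
    (["humans need food daily", "I haven't eaten today"],
     "I should eat today"),
    (["bosons are subatomic", "the LHC finds bosons"],
     "the LHC studies subatomic particles"),
]

facts = {"humans need food daily", "I haven't eaten today",
         "bosons are subatomic", "the LHC finds bosons"}

print(sorted(propagate(facts, rules)))
```

The sketch also makes the danger visible: whatever goes into `facts` gets mechanically extended, so a slightly-wrong premise propagates into confidently-held conclusions.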

There’s one problem with belief propagation. If you’re part of LW, you’ll understand immediately if I say that truth at the degree of precision you’re capable of knowing does not cohere.

Without the buzzwords, what that means is: beliefs are oversimplified models. The real world is so much more complicated than your beliefs can ever be. So sometimes you have a mental model that works pretty well, and another one that also works pretty well, but each one seems to imply that the other is wrong. They can’t both be right… but they both are right.

If you don’t believe that, then fine: is light a wave or a particle? It can’t be both! Or do you believe in relativity or quantum physics? Well? Which is it?

That’s why belief propagation is dangerous. It can be good. It’s also bad.

LW uses belief propagation in trying to understand what is moral. And they want to create a super AI to enforce their vision on the world. That should scare you a lot.

tl;dr: LW thinks all your beliefs have to fit together, but sometimes true beliefs don’t fit together. LW throws away truth because it doesn’t fit.

LessWrong misuses Occam’s Razor in attempting to argue for atheism. They also ignore every aspect of religion that doesn’t fit their stereotype of monotheism.

They argue that they’re starting from the simplest priors— the simplest assumptions. Why? Because, uh… God is complicated? Or something? It’s simpler to have a universe without God because it has one fewer thing in it, therefore we do have a universe without God, therefore we have a universe where spirituality is only a result of human brains being buggy.

They argue that all spirituality is insane because it’s so unlikely to be true (because it’s not simple) and because it doesn’t imply other things about the world and because religious tolerance conflicts with belief propagation because all religions must hold that all other religions are wrong. So let’s pick apart those last two ideas:

Firstly, some religious experiences do imply things about the world. (Didn’t you talk about learning about photosynthesis religiously, youneedacat?) Even setting aside things like “we’ll definitely know who’s right about the afterlife when we’re all dead” we still have the issue that some people believe it’s possible to have memories of past lives (which could totally let you know things about history without being taught— maybe verifiable things, even), or various other things. Some people believe they can work magic; they should totally expect it to have effects.

Secondly, not every religion teaches that every other religion is wrong. Wicca explicitly teaches that everyone else’s gods are real, sort of, not quite. A lot of polytheists believe in each other’s gods but don’t worship them. Contradictory ideas being simultaneously true is a big thing in a lot of religions, from Christianity to Asatru.

Thirdly, not arguing about it is different from believing there is no objective truth and falsehood. Arguing religion is so pointless most of the time.

It’s true that there are people who think religion and spirituality exist outside of objective truth and falsehood in some fuzzy individual experience hermetically sealed from the rest of the world, but please don’t judge all of us by them. Please. Some of us actually believe what we believe.

And sometimes spirituality is about ways of relating to things everyone agrees exist, not about what things exist. Like youneedacat’s religion: I’m not going to argue that sie’s worshiping something that isn’t real. I absolutely believe in the existence of redwood forests. I just don’t relate to them that way. (Though I’m so uncertain about it that I’m not sure whether it’s even something I would disagree with if I understood it.)

tl;dr: They think you have to be atheist and unspiritual because they don’t understand what religion is.

They’re right about a lot of things, too. Why’s that bad? It makes it harder to discount everything they say. They’re right about some things, like various cognitive biases. Their list of ways words can be wrong is very useful. The idea of playing rationalist taboo is also useful. “The map is not the territory” is useful. Noticing confusion is useful. If you can think very critically about them and don’t take them as Gospel, I highly recommend reading the core Sequences. Just don’t try to engage with the community or read the rest of the site, and remember to take the good and leave the bad. If you can’t do that, definitely don’t read LessWrong at all. 

tl;dr: They can lure people in by not being all bad.

They claim to be trying to be rational, but they make extensive use of framing and emotionally loaded language. They call a lot of people and ideas insane and have tried to convince people to feel a sort of contemptuous pity for religious people. They carefully shift the frame— the way you see things— to make it look like reality agrees with them. To make it look like disagreeing is ridiculous. They make extensive use of what they call the Dark Arts.

tl;dr: They claim to be all about rational argument, then they call people crazy and try to make you feel instead of think.

They hold to a view of rationality that privileges system 2 thinking (explicit, conscious thoughts in words and syllogisms and math) over system 1 thinking (quick, usually subconscious intuitive ideas that come to you as hunches or feelings). System 1 thinking is suspect (even though, as I said, they appeal to it constantly in their use of loaded terms and clever framing). Why is that bad? System 1 is everything you think that isn’t a syllogism. System 1 tells you there’s a car coming in time for you to get out of the way, while system 2 doesn’t know wtf is going on until it pieces things together afterward. System 1 can calculate a parabolic trajectory in the time it takes a thrown ball to follow that trajectory. System 1 is hugely important. System 1 kept your ancestors from getting eaten by tigers. System 1 is your gut, your intuition, and more.
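For what it's worth, here is what the system-2 version of that trajectory calculation looks like spelled out: ordinary projectile kinematics, with numbers I've picked arbitrarily for illustration. System 1 does the equivalent of this in the time the ball is in the air.

```python
import math

# Explicit ("system 2") projectile calculation that intuition does implicitly.
# Launch speed and angle are arbitrary illustrative values.
g = 9.81                     # gravitational acceleration, m/s^2
v0 = 20.0                    # launch speed, m/s
angle = math.radians(45)     # launch angle

vx = v0 * math.cos(angle)    # horizontal velocity component
vy = v0 * math.sin(angle)    # vertical velocity component

t_flight = 2 * vy / g        # time until the ball lands (level ground)
x_range = vx * t_flight      # horizontal distance covered

def height(t):
    """Height of the ball t seconds after launch."""
    return vy * t - 0.5 * g * t * t

print(f"flight time {t_flight:.2f} s, range {x_range:.1f} m")
```

Catching a ball means solving for where `height` hits zero and being there when it does, which your motor system manages without ever writing down `g`.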

System 1 and system 2 are smart in different ways and about different things.

I 100% believe youneedacat that intuitive sensing will tell you there’s something off about LessWrong faster than conscious thought, because I know that’s true. I’ve felt it. And then, because I’m not as sensing as youneedacat and I’m a lot better at system 2, I came up with this post. Because system 1 pointed to the problem and gathered observations and pointed in directions, and that would have been enough for me to know I don’t want to read any more of LessWrong, but system 2 was needed for me to explain it in a way someone else could understand.

System 1 works faster than system 2 using observations you don’t always know you’ve made. Sometimes it’s wrong. The classic example is that sometimes it’s really racist. But sometimes it’s very, very smart. Instead of mistrusting it and pitting system 2 against it, you should have them both work together, sanity-checking each other, looking at different parts of the world to give you a more complete picture, and learning to see and communicate about the things brought to their attention by the other system.

(And I didn’t know until just now that I knew exactly what youneedacat meant by sensing and had experienced it myself, because it sounds like we made exactly the same intuitive observation of the place.)

To them, if you can’t explain what you know, then do you even know it? To them, if it doesn’t follow logically, it can’t be right and must be biased. But it might just follow from observations you didn’t know you made, by a bunch of steps you went through unconsciously that provided you, very quickly, with your intuition.

tl;dr: They act like logical thinking is better than intuition, but that’s not true. Intuition keeps us alive. (So does logic.)

The jargon they use makes it hard to talk about the place with outsiders, or about the concepts with outsiders. Omega, acausal decision theory, rationalist taboo (this one contains the seeds of its own undoing, right here). Whatever else, too; the words have not stuck in my mind. Trying to talk about what you’re thinking with someone outside of LessWrong is impossible, or at least, much slower and more awkward with all the mental translating you have to do. Which could entice you to join if you know the jargon and have read the site. And which could entice you to stay, too, having more conversations with more jargon that refers to jargon that refers to jargon, until you’re so far down the rabbit hole that you don’t know how to talk to other people at all.

tl;dr: Learning the jargon makes it easy to talk to LWers and hard to talk to outsiders.

Whether they intend it to or not, their belief in the Singularity— the time when a supercomputer will know so much that it can redesign the entire world to be a utopia where everyone will be immortal and maximally happy forever— basically fills the same place in their minds as the Millennium and the Kingdom of Heaven do for Christians.

I’m not saying they think a “friendly” Artificial Intelligence is God or that they intentionally worship the works of their own hands. I’m saying they relate to these ideas like I see Christians relate to Biblical prophecy. They treat it somewhat in the same way I see people treat promises of the Eschaton. (No, spellcheck, I don’t mean Charleston, thanks for asking.) Not that they came to the ideas the same way or that it’s identical on a superficial level of conscious thought, but somewhere down deep it seems like they relate to the ideas the same way.

Their near-reverence for Eliezer Yudkowsky isn’t on the same scale as anyone’s reverence for my God. That said, there’s something a little too much about it.

And these are people who’ve analysed religious feelings, know they like them, and seek ways to have them about rationality, e.g., at solstice festivals. Like. This is really not a point it should be difficult for me to make. The point practically makes itself.

tl;dr: Computers, logic and Eliezer Yudkowsky are their religion.

They believe there’s nothing not made of… what’s the smallest unit of matter now? Not atoms but whatever it is atoms are made of. They’re materialists. Which is kind of funny considering how unmaterialist science is getting. But what do I know? I’m just layfolk.

Because they don’t believe in souls, they think that if they can make a computer simulation of you that perfectly mimics all your matter, it is you and creating it means you won’t be dead if you die. Why I should care about the immortality of an artificially-created identical twin instead of my own is beyond me. But they don’t believe there would be a difference if they just simulated you well enough. If they were just precise and accurate enough, they would be able to guarantee it was you (not just a realistic portrait) and that it would definitely be conscious in exactly the same way you are.

I don’t even understand the logic that would lead someone to such an obviously wrong conclusion.

(Yes, I did just use what they call Dark Arts there.)

tl;dr: Materialist ideas about immortality are nonsensical.

Eliezer Yudkowsky thinks he’s too smart for college and everyone supports him in this. It feels like a cult of personality sometimes.

tl;dr: The community encourages the most important members to be arrogant.

And more, I think, that I can’t articulate. But that’s pretty close to all of it, if I could just put a few more specific words to what I was saying about framing and Dark Arts earlier.

IDK; any of this sound right to you? Is any of it what you’ve noticed? At least, is it useful to more wordsy people who need to understand?

Yes, actually, a lot of those things are things I’ve noticed about LW, but I really suck at putting things like that into words.  So thank you very much for doing so.  I think one thing LW needs is some people who are closer to its target demographic than I am, who are able to dissect what it’s doing to people and say “Hey, this is messed up,” because I'm so far away that they’d just laugh at me if I tried.  Most of them anyway.  My friend who is deep into it wouldn’t laugh at me but wouldn’t necessarily take my viewpoint seriously either (sie knows I’m highly sensing, and sie’d possibly see that as a bias rather than a way that I might be seeing through some things in a big way), and my friend who is on the fringes does take me seriously but probably gets frustrated that I can’t answer hir questions about it.  Even seemingly straightforward questions.

They do, by the way, sometimes say that intuition is a good thing.  But they think its uses are more limited than they are.  My friend who’s done a lot of work in their “let’s teach our version of rationality to people for money” department, has made a point of saying that intuition is important (I don’t think sie could know me for most of my life and not know that intuition is important), but has also sort of made statements where it’s like “well intuition works really good for this sort of situation but not for this other sort of situation”.  Which I would disagree with.  I do agree that intuition is better for some situations than others, and same goes for conscious rational thought (which is why I’ve gone to great lengths to develop mine, even though it’s definitely my weaker suit of the two, because I know a balance of the two leads to the best results), but I don’t agree about which situations.

Also “they” is a kind of nebulous thing to say about them, because they’re definitely a group that has a set of group norms and morals and beliefs, but they’re also made up of individuals.  I actually think some of the most dangerous individuals, in terms of their effect on outsiders, are the people who sincerely believe in this and want to make it work – who are not just in it to fuck with people’s heads (which some people definitely are), or for other less-than-good purposes.  Because when they sincerely want to make it work, they will create things that are closer to reality, and harder to say “This is wrong” about, even if it’s just as wrong.  Because they’ll be trying to make it closer to reality, because they want it to work, really badly.  (And here I’m talking mostly about the ‘rationality methods’ and such that they teach.)

BTW I can’t fathom how “Harry Potter and the Methods of Rationality” got even remotely popular.  It made me physically ill to read it.  As in, I had such a strong gut response (no pun intended) that it came out as severe nausea.  Like my brain just wanted to vomit and vomit and vomit forever, and the only reason I read it as far as I did was my friend had recommended it and I wanted to do right by hir and stick it out till the end.  I can’t remember if I got to the end or if I had to quit at some point, but I would rather read a thousand bad Mary Sues than read one chapter of that again.  It felt strongly like having my mind manipulated – like literally someone reaching into my brain and moving things around in an exceedingly purposeful way – and generally anything that evokes such a strong reaction of that sort in me is something I ought to put down and run away from as fast as I can.  I’ve never had something I read have that effect on me before and I’m convinced the reasons it could do that are not good ones, but they certainly go along with my observations of the rest of the stuff they get up to.

My friend who is very into it has told me there’s no problem with us running out of natural resources because the Singularity will be able to create any element out of nothing, so there will be no resources crisis.  To hear that from someone I have always considered more intelligent by several orders of magnitude than I will ever be, really scared me.  It also scares me that the Singularity Institute deliberately funnels people with money away from giving to causes that deal with things like resource depletion, and into causes like … the Singularity … funny how that works.

But yeah one thing that scares me about it is that the people involved are almost all hyper-rational already, usually neurodivergent, and usually fitting a fairly particular cognitive profile.  And it encourages them to be even more hyper-rational (which would be like encouraging me to use sensing and nothing but sensing and forget I had a rational mind altogether, which I tried once and the results were catastrophic, and I’ve seen others try the same with similarly catastrophic results), and in so doing, to completely lose touch with reality in some areas.  Some really important areas.  Areas big enough to drive a starship through.  And then the group very deliberately does drive starships through those areas, and parks them there, and then makes sure that nobody notices there’s these big gaping holes in their thinking with weird cognitive artifacts parked inside them.

All of this… it’s just wrong.  I don’t want people to take my word for it just because I say it, though.  I hate when people do that.  But I do think people should use extreme caution.  Especially if they are a neurodivergent person with a hyper-rational thinking style, because that’s the most vulnerable to whatever the hell it is they’re doing.

(And I have not figured out what their actual goals are, or whether there even is a goal.  I know what their stated goals are, and I know those are the goals of some of the true believers.  But I don’t know the goals of the people who are more sinister than the true believers.  And I know those people are in there because I can read those signs as clear as day.)
