Saturday, June 10, 2023

Positions I Don't Hold

I have an idea for a new series of posts, called "Positions I Don't Hold."  These are intended as a series of ideological Turing tests for... positions that I don't hold.  This was inspired by a few things.  One was a discussion between Sam Harris and Jordan Peterson on morality, in which each made an effort to summarize the other's position as charitably as possible before beginning their disagreement.  Another was a recent read of some old scholastic writings, such as Aquinas, which always preface an argument by summarizing the competing arguments.

It's easy to bicker and argue, but it's rare for anyone to really listen or try to understand.  We argue in echo chambers, and usually argue against a facsimile of the opposing view -- what is sometimes called the strawman fallacy.  We normally listen only just long enough to find something to respond to, before interjecting with our own view of what is right.

Seeking to conform to the principle of charity in argument, in these posts, I will do two things.

The first is the ideological Turing test.  I will argue in favor of a position as though I hold it, and try to argue it in such a way that someone who does hold the position would read it and recognize their own view in what I have said, and even agree that the arguments I made are the arguments they would make.  I'm going to try not merely to summarize the belief, but to argue for it, and to argue for it so convincingly that someone actually holding the view would think I hold it too.  Obviously this isn't easy and I won't do a perfect job, but I will do my best.

The point is sort of ruined by the context, but the general idea is that if you copy/pasted my description and mixed it with three other descriptions from people who do hold that view, it would be difficult for someone to tell which was which.

[Image: a literal strawman, from a quintain.  The painted plywood knight is not a real knight, and is only there to be easily hit.]
The second, I call the steelman.  This is intended as the opposite of a strawman.  A strawman is an intentionally absurd or weak misrepresentation of an opposing view.  You've heard plenty of them.

Democrats are over-entitled snowflakes, so they think the government should just pay for everything and give us participation trophies.

Republicans are all rich white corporate fat cats who just hate women and minorities, and don't care about anyone's suffering.

Atheists just want the moral license to use drugs, have lots of sex, and skip church.

Christians believe in a magical sky-fairy who grants wishes.

Muslims worship the moon-god of ancient Arabia.

If evolution were true and we evolved from monkeys, then there wouldn't still be monkeys.   

In contrast, the steelman is intended as the strongest form of a position that I can think of.  When I say strongest, I don't mean "most extreme," as that tends to just produce another strawman.  Instead, I mean the form of the position that makes the most sense to me, that I find the least reason to object to, or the version of the position that I would come to adopt if my mind were changed.

I distinguish the steelman from the ideological Turing test, because the strongest form of a position that I can think of is not necessarily one others would agree with.  My notion of what makes a position strong will not necessarily match a true believer's, and my way of strengthening an idea might require deviating from the official position.

I may not be using the term "steelman" the exact way everyone else would, but the intention is similar.

I will follow these with an explanation of why I don't hold that view, and possibly a brief explanation of the view I do hold.

I have been writing these on and off for a while, and have a few planned.  I hope to begin posting some soon.

In This Series
