An Alternative to Argumentation: Persuasion via Questions

In my last post, I introduced Julia Galef’s way of thinking about motivated reasoning, what she calls soldier mindset: people take ideas personally, and then respond with defensiveness when “their” ideas are attacked. Among other things, soldier mindset leads people to evaluate evidence in a different way, depending on whether it supports their favored ideas or not. Soldier mindset helps explain why attempts to convince people with direct arguments not only fail to work as intended but actually backfire, making people become further entrenched in what they already believe. Plenty of people have offered ways to work on ourselves to become less defensive when our ideas are attacked. Basically, they are recommending that we take on a mindset more like a scout than a soldier, where our interest is primarily in figuring out how things really are, rather than in defending a particular idea that has become personally significant. Nevertheless, I would suggest that we also need to more fundamentally change the way that we relate to others in our attempts to persuade.

But I need to do some ground clearing to get this approach off the ground. This is because many philosophers have argued that adversarial argumentation is not only effective but really the only game in town. According to this view, any time a disagreement arises, it is actually impossible to resolve it without adversarial argumentation. John Casey thinks that this is because when we argue, we are arguing about our beliefs, and we have no direct control over our beliefs (as opposed to things that we voluntarily accept or consider). That is, if you’re convinced that grass is green or that abortion is wrong, there is no way that you can change those convictions by force of will. Try it. Can you simply decide to believe that grass is pink or decide to change your views on abortion for even the next ten seconds? The only real control you have over your own beliefs is indirect; you might be able to change your views on abortion indirectly by exposing yourself to certain kinds of evidence. According to Casey, this means that any time I give you a reason to change your beliefs, I am trying to accomplish something outside of your voluntary control, and at times almost against your will. And Casey thinks this is true even if I’m on your side of a debate: my arguments are still trying to change the strength or arrangement of your convictions without your consent.

This is a very interesting line of thought, but is it really inevitable that the reasons and evidence I give can only change your mind without your consent? Try this analogy on for size. Suppose I have appendicitis. There is something wrong with my body, and I cannot fix it of my own accord. I don’t have the ability to operate on myself to fix this problem (at least not without creating a whole mess of other problems!), but I can get a surgeon to operate. By doing so, she would be doing something that I cannot accomplish directly on my own. Now, does anyone think that the surgery is adversarial? I find it impossible to view it that way. But if adversariality is the wrong way to look at surgery, why must changes in belief or conviction be any different? Suppose I come to think that one of my beliefs is problematic: maybe I can’t convince myself that abortion is not murder, but I’m deeply uncomfortable with that belief. This might seem like a strange situation, but it doesn’t seem impossible. In that case, couldn’t I ask a friend to perform a kind of operation on my beliefs? She might give me extensive reasons why abortion is morally permissible and should be safe and widely available. Shall we say that her manner of relating to me is adversarial? If ordinary surgeries are not adversarial, then why should this kind of operation be considered as such?

The issue is this: my example of discomfiting beliefs truly is a strange case. In paradigmatic cases, people identify wholeheartedly with their beliefs and convictions. Whether you believe that abortion is murder or that abortion is healthcare, that belief has probably become part of your identity at some level. Moreover, you are resistant to changing it for that very reason. So, in the ordinary case, Casey is right that arguments must be adversarial. By giving you an argument against your closely held beliefs about abortion, I am trying to induce a change in your very self-conception outside of what you would choose if you had direct control of your beliefs. But this only raises the question: must we identify with our beliefs? On Galef’s view, paradigmatic scouts don’t, so neither must we. In a moment, I’ll consider this thought more explicitly in relation to rational discourse.

But first, Scott Aikin gives a different argument for adversarial argumentation that is worth considering. He grants that in any disagreement, both sides may share a common goal of getting to the truth. Nevertheless, the only way to get there is by giving reasons for one side as opposed to another and thereby taking sides. On his view, giving reasons involves an inevitable opposition between ideas or hypotheses. To give a reason for one view is always to give a reason against some contrasting set of views.

I suspect that any disagreement with Aikin on this point would be a confirming instance of his very point. To reason against this contrastive view of reasons would be to give a reason for some other way of thinking as opposed to it. Rather than disagreeing with his insight, I want to suggest a way to navigate the inevitable opposition between ideas without inviting opposition from people. Rational discourse must be adversarial in the first respect (setting ideas against one another), but it need not be adversarial in the second (setting people against one another). The idea is to find a way to give reasons without being argumentative; without posing a threat to others’ closely held beliefs; and thus without calling forth soldier mindset in response to reasoned discourse.

An alternative

What seems objectionable to me about the adversarial model is not that it engages in comparative theory choice or contrastive reasoning but instead how it engages. And this is common ground I find in feminist critiques of the adversarial model (see, e.g., Gilbert, Hundleby, Moulton, Rooney). A key insight in these discussions is that an adversarial approach to argument sets up rational discourse as a zero-sum game. If you win, I lose, and if I win, you lose.

I bet you can see what I mean: Consider the topic of God’s existence. This is a topic that is personal all around in that people place personal significance on God existing or not. In other words, no matter which side you stand on, your belief in God’s (non)existence means something to you: about what groups you belong to, how others view you, how you view yourself, how good you feel about your life, etc. Threats to this belief are going to be seen as threats to the self. (And we could say the same about many politically controversial topics, like abortion, climate change, immigration, racial injustice, etc.) Now, suppose that I draw you into a discussion about God’s existence in the most tactful way possible. And suppose I am arguing against you, aiming to disprove your belief about God’s existence. In that case, am I not trying to do all the reasoning for you and urging you down a path you don’t want to go? How likely is it that you will consider my arguments without feeling any need to defend your beliefs? Even if we suppose that I’m right and there is something inaccurate about your view of the world, is this really the best way to show you? Even if you were to accept my argument, are you not in the position of having to admit that you were wrong in order to get to the truth? Maybe if you are an inveterate scout or are far humbler than most, then such an admission would be a “win” for you. Perhaps we should even strive to be the sort of people that can win by admitting that we were wrong. (This is certainly what Galef’s book advocates.) But answer me this: are most of the people you might aim to persuade so humble and concerned with the truth that they can feel proud of themselves for changing their perspective? Even if they are humble and concerned with the truth, might they not prefer to do the reasoning for themselves?

Here’s how I see it (but maybe you can spot something I’m missing): when I give you a direct argument that aims to disprove your belief, I think two things are happening that invite soldier mindset. One is that I am taking the reasoning process out of your hands. Do you get to decide which reasons get considered when I am giving the argument? Do you get to decide where the reasons will take you? Doesn’t look like it to me. So, whatever conclusion I arrive at is unlikely to feel like a reason for you. It may even feel like a low-level threat to your sense of autonomy; you may feel you need to defend against it to save face. (In cultures where masculinity is strongly associated with autonomy, we might guess that this approach will often put men more on the defensive than women.) Am I wrong? Second, and more importantly, head-on argument frames the discussion as a zero-sum game. If my argument works as intended, you must admit you were wrong. When I say, “here’s a set of reasons (not) to believe in God,” it’s going to sound to almost anyone more like an invitation to a duel than an invitation to solve a problem.

So, I invite you to consider an alternative way of trying to convince people (one that I am exemplifying right now, in fact). Instead of setting up a win-lose scenario that takes over the reasoning process for your partner in a discussion, perhaps we can engineer situations in which arguments function as invitations to a win-win: one where we both have an opportunity to see the truth together, using both our faculties of reasoning.

What do you think that would look like? How might one create such a situation? Let’s imagine you are thoroughly immersed in the project of reacting to some idea you disagree with. What would make you drop your defenses and be open to a different picture? If you’re like me, what works best is feeling like someone else really gets you; that they understand your perspective to such a degree that they would adopt it too if they were in your situation. You would have to feel like the other person is looking out the same window as you. If someone can get you to that point, you tend to open up to hearing what they have to say. At that point, if they offer a better way of looking at the situation and you decide to change your mind, it could feel less like you’re losing or admitting defeat. In other words, maybe it can feel safe to change your mind, like you and your partner are reasoning together rather than in opposite directions; maybe you can both win by seeing the truth.

So, how do you get to that point? Here, it seems to me that scout mindset is indispensable. Of course, there is nothing preventing a good scout from giving a head-on argument and failing to really get the person they are trying to persuade, but if you can think (and feel!) like a scout, it opens the possibility of approaching the other person with curiosity and a desire to understand their perspective. You can ask questions that are authentically aimed at understanding them better; you can mirror what they say in ways that show you understand what they are feeling and thinking. There is some evidence that listening by itself can work to make people’s opinions less extreme, at least when the listener is nonjudgmental, attentive, and active (e.g., paraphrasing and asking for clarifications).

When questions are focused in the right way, these effects may be even more dramatic. In one experiment, asking participants to explain how a social policy works made their political opinions less extreme, probably by exposing gaps in their knowledge and increasing their uncertainty, whereas asking people to give reasons for the very same policies did nothing to change their opinions (and in some experiments has been shown to increase extremism). (There is some evidence that political extremism and belief in conspiracy theories may be supported by the belief that social problems are easy to solve.) For example, if a relative is telling you how common voter fraud is in states without photo ID requirements, you might ask them how someone could actually commit fraud in specific states without getting caught. This may help them to see for themselves that there are gaps in their knowledge of how votes get processed and certified, making them less confident that voter fraud is widespread.

Nonetheless, I think we can sometimes go even further down the road of persuasion: not merely getting our counterparts in a discussion to be less defensive and less certain of their rightness, but also drawing the conversation closer to the truth. Suppose you ask a lot of questions and get to the point where you really do understand what your discussion partner thinks and how they feel. At that point, you might find that your own mind has been changed by coming to a better understanding of their perspective. But getting to this point of understanding opens another possibility as well: if you’re looking out the same window as your conversation partner and you think you see something they are missing, you can ask, “what about that piece of evidence over there?” In other words, you can share some agency with them as you evaluate the evidence together.

I think this is quite distinct from the adversarial model of argumentation. Even though you might (inevitably?) be looking at reasons for/against contrasting ideas (a la Aikin), you are not doing so in a way that sets your interests against your partner’s. You and your partner are engaged in a project of trying to really “get” one another and arrive at the right idea together. (This may occur as part of what Daniela Dover calls interpersonal inquiry.) Even though the change in belief (in yourself or your partner) may be involuntary (a la Casey), you arrived there by the joint operation of your faculties of reasoning. You both consented to the possibility of changing your beliefs, and you each did so by dropping your guard and ceasing the project of defending your identity by defending the belief. In effect, argument and persuasion wouldn’t need to be adversarial if we could manage to relate to our beliefs in a different way; not as a part of oneself to defend against attacks, but as different possible truths to explore and test together.

Scout mindset introduces a new option for approaching one’s beliefs and so also for persuading (oneself or others!). To put it in Casey’s terms, scout mindset changes the indirect ways that we control our beliefs. Instead of protecting our beliefs as they are by selectively attending to certain evidence or by subjecting other evidence to increased scrutiny, we can actively open ourselves to contrary evidence (among other things). We do not necessarily lay ourselves open to all other “belief surgeons,” but we drop the self-protective slant of our indirect controls. Doing so sets the stage for a more collaborative form of belief modification: together, we consider the evidence that makes us currently see the world a certain way, and together we see if there is reason to change the way we see it.

1 COMMENT

  1. What Isaac Wiegman advocates is very similar to the process known as “Nonviolent Communication,” or NVC. In NVC we first try to establish that we understand where the other person is coming from, by paraphrasing (“mirroring”) their emotions and attitudes. Usually this involves identifying universally human concerns that lie behind the more partisan attitudes that the other person exhibits. So we can endorse those universal concerns, while issuing no judgment about the partisan attitude. Only when, in this way, a “bridge” has been established between us, so that (as Wiegman puts it) the other person feels understood, do we venture some candid comments of our own, on the issue in question. In this way, the two parties wind up feeling respected, and collaborating on the issue, seeking a solution that (like Wiegman’s “scouts”) both have found, rather than one winning and the other losing.
