
Monday, 6 April 2015

Visiting Project PERFECT

This post is by Naomi Kloosterboer, PhD student at the VU University of Amsterdam.

In February and March of this year, I was a visiting PhD student in the Philosophy Department of the University of Birmingham. There is a welcoming and open atmosphere at the department: philosophy chatter and discussions are abundant, staff members like hearing about your philosophical ideas and research, and many situations lend themselves to having drinks at the staff house bar and continuing the discussion.

During my time at ‘Brum’ I worked on the first part of my PhD thesis, which is about understanding the threat that ignorance – as arising from the psychological literature on confabulation, attitude misattributions, choice blindness, etc. – poses to rational agency. Furthermore, I participated in the weekly Proseminar (the postgrad reading group), in the weekly PGR seminar, where postgrads present their work, and in the bi-weekly PERFECT reading group. These groups provided very engaging discussions with fellow students and members of staff. In this post I will look at a paper discussed in the PERFECT reading group, chaired by Professor Lisa Bortolotti: ‘Lifting the Veil of Morality: Choice Blindness and Attitude Reversals on a Self-Transforming Survey’ by Hall et al. (2012).

Hall, Johansson and their group have developed what they call the choice blindness paradigm (cf. Hall et al. 2010, 2012; Johansson 2006, PhD thesis; Johansson et al. 2005, 2006). The paradigm is built upon the fact that it is possible to manipulate the relation between people’s decisions and the outcomes of those decisions without them noticing, revealing that people are prone to miss even dramatic mismatches between what they want and what they actually get. Moreover, when asked to explain choices they in fact did not make, participants readily offered reasons for them.

After several studies on aesthetic, gustatory and olfactory choices, an experiment was designed to test whether people are also blind to their opinions on moral issues (Hall et al. 2012). In the study, which was conducted in a park, participants rated their level of agreement or disagreement, on a scale from 1 to 9, either with a general moral principle, e.g. “Even if an action might harm the innocent, it can still be morally permissible to perform it”, or with a specific political moral statement, such as “Large scale governmental surveillance of e-mail and Internet traffic ought to be forbidden as a means to combat international crime and terrorism.” Minutes later they were asked to give reasons for the score they had filled in. However, unbeknownst to the participants and by means of a kind of magic trick (see the explanation of the method in Hall et al. 2012, 2), some of the principles and statements had been reversed. This meant that the participants were asked to give reasons for an opinion opposite to their original transcribed score.

Tuesday, 3 June 2014

Desiring to Believe and Self-Deception


This week, postgraduate student Martin Smith considers the relationship between self-deception, the desire to believe and rationality. 

Nicole swears that her husband is faithful. She’s adamant about it. Her friends, though, aren’t convinced. That weekly poker game of his? He spends it with Rachel, they say. They see his car at her place every week. And these friends of Nicole’s are good friends. They wouldn’t say this lightly. But Nicole’s firm; “I just don’t believe any of it”, she says.

She speaks with conviction but her words aren’t the whole story. She won’t drive by Rachel’s place when her friends say he’ll be there. A few times she’s needed to get somewhere and it would have been convenient to just go past Rachel’s. But if it’s ‘poker night’, she’ll avoid that route. It piles minutes on to her journey but she just won’t go near that house.

Nicole is a bit of a puzzle. She says she believes her husband to be faithful. She’s not trying to deceive her friends by saying that. As best she can tell in that moment, that’s an honest report of her perspective. But still, she doesn’t behave like someone who really believes it.

She seems to have some awareness that things are off. That’s why she avoids Rachel’s house. Part of her, somewhere, we might think, senses her friends have a point. But she’s keeping that sense – that awareness – at bay somehow. She’s blocking it out. She’s self-deceived.[1]

No doubt this self-deception is irrational. Hopefully, if we look at her case more deeply, we can draw certain lessons from it – lessons about what not to do if we want to remain rational! And what’s interesting about this case, I think, is that the lessons we can draw from it might challenge common assumptions about desire, belief and rationality.

We might be tempted to describe Nicole’s problem like this: she wants too much to believe that her husband is faithful. Or she is too committed to believing that her husband is faithful. This desire/commitment, sadly, we might think, is competing against and trumping the rational demands upon her. Demands, for instance, to be fully attentive to evidence against her husband’s faithfulness. If only she could be less ‘invested’ and more ‘neutral’ regarding her belief that her husband is faithful, she would be more able to be rational.

There is something right in this. I think Nicole’s desire to believe that her husband is faithful is involved, in some way, in her irrationality – her self-deception.[2] But I don’t think the problem is that this desire is too strong. Rather the desire (or commitment) is too weak. Let me explain.

Nicole’s belief that her husband is faithful can’t withstand serious levels of opposing evidence. If she were to find her husband’s car at Rachel’s place, it might not be psychologically possible (it would at least be very hard) for her to continue believing in her husband’s faithfulness. So if Nicole desires to believe that her husband is faithful, evidence that he isn’t will make her uncomfortable. She’ll feel the threat that this evidence poses to her desire. It will be distressing for her.  

What can you do to get out of a distressing situation? One option is to face up to it. When you’re, say, anxious about making a phone call to a friend you’ve upset, you can choose to respond by gritting your teeth and picking up that phone. The distress might temporarily increase as you do so but you’re actually tackling the problem. Once it’s resolved, the distress will disappear.

The other way out of a distressing situation is to avoid it. You put off the phone call. Put it out of your mind. Distract yourself whenever the thought of it arises. Avoidance, no doubt, is easier than facing up to a problem. In avoidance you can experience immediate relief from distress rather than the temporary increase that taking action brings about. But there can be other price tags attached to avoidance, as I hope to show.

Obviously Nicole, rather than facing up to the evidence against her husband’s faithfulness, avoids that evidence. She keeps away from Rachel’s house. Brushes aside memories of suspicious activity. She has some level of awareness of this evidence alright but she keeps it from the centre of conscious attention. It’s always pushed to the peripheries. That’s how she deals with the distress that threats against her desire to believe bring.

Well okay, that’s a little cowardly, but she’s getting what she wants, isn’t she? Hasn’t she kept her belief that her husband is faithful intact? It sure seems like her desire to believe is winning out against the demands of rationality. It seems her desire is being gratified.

Consider, though, that her avoidance of threatening evidence against her husband’s faithfulness seems to call into question whether she does in fact believe him to be faithful. After all, she’s not willing to put that belief to the test. She’s not willing to ‘put her money where her mouth is’. Really, her avoidance of the evidence just seems like distrust that the world really is as she professes to believe it to be. That is, she seems to distrust her judgment that her husband is faithful. But if she distrusts that judgment, does she really believe it? Intuitively, to me, it seems she doesn’t. Or at least, she doesn’t fully believe it.[3]

It seems to me that in avoiding evidence against her belief, Nicole gradually loses that belief by systematically distrusting it. Self-deception may gratify her temporary desire for relief from distress but it sabotages her desire to believe that her husband is faithful. Her desire to believe loses out against her desire for comfort. Self-deception isn’t quite as attractive regarding belief-preservation as it may have seemed.

So what could she have done differently? Could she have better served her desire to believe? Yes, she could have, by facing up to the evidence. Enough avoidance guarantees loss of belief through distrust. But while facing up to threats to a belief may also lead to loss of that belief, it at least opens up the possibility of preserving it.

Confronting the situation allows one to potentially find out that the ‘evidence’ is not what one feared. Perhaps Nicole, if she drove up to Rachel’s house, would find a satisfactory and innocent explanation of it all (Rachel’s help was needed for a surprise for Nicole). Even if the chances of this are slim, they beat the zero-chance of belief-preservation offered by avoidance.

If all this is correct, then Nicole’s desire to believe would have been best served by complying with the demands of rationality. She would have maximised her chances of gratifying that desire by being attentive to the evidence. Really, her failure to be attentive to the evidence is a failure to be properly committed to her desire (and her belief). So rather than her desire competing against the demands of rationality, taking her desire seriously requires meeting those demands.

It’s often thought that desire for a belief, emotional investment in a belief, or commitment to a belief compromises rationality. But the lesson we can draw from cases of self-deception like Nicole’s might be that the problem is not in these attitudes themselves but in the deficient methods we use to (attempt to) uphold them. Deficient methods like choosing avoidance over action (‘facing up’), for instance.  

Bibliography

Funkhouser, E., 2005. Do the self-deceived get what they want? Pacific Philosophical Quarterly, 86(3), pp.295-312.

Lynch, K., 2012. On the “tension” inherent in self-deception. Philosophical Psychology. 25(3), pp.433-450.

[1] This is an adaptation of a case of self-deception discussed by Funkhouser (2005, p.302).
[2] Of course, she may desire that her husband actually is faithful too. But it’s plausible to think that she also desires to believe that. Even if it were false that her husband was faithful, sincerely believing him to be would surely provide a level of comfort she could easily cherish. Ignorance is bliss, after all. For considerations in favour of viewing desire for belief, rather than for some state of affairs out there in the world, as more significant in self-deception, see Funkhouser (2005).
[3] I find plausible something in the spirit of Lynch’s claim that “the extent to which S really believes that p can be gauged by observing the risks he/she is willing to take on that assumption” (2012, p.444).

Wednesday, 7 May 2014

Delusion: doxasticity, rationality and normativity


This week, doctoral researcher Rachel Gunn examines the nature of delusions. 

If a subject says they believe something, then I am inclined to take this at face value. The subject usually has other mundane unexamined beliefs (e.g. I believe that when I turn a tap on, water comes out) as well as examined beliefs or opinions (e.g. I believe that liberal democracy is the best political system). Against this background of other beliefs it does not seem appropriate to ‘second guess’ the subject about his own experience. Not everyone would agree with this, and some would argue that delusions do not meet the criteria for beliefs as they are irrational, do not necessarily affect behaviour and do not cohere with other beliefs.

Some propose that a delusional subject fails to monitor an imagining as being self-generated (the subject is in some sense not the agent of the imagining). This mental activity is then mislabelled (representationally) as a belief and somehow ‘given’ as true. So the delusional person has a thought with content P. He does not believe P. He imagines P. And he believes that he believes P. On this view some delusions are imaginings with a strong feeling of subjective conviction (Currie and Jureidini, 2001). This is an intriguing way of describing some delusions and might help us explain why some subjects do not seek to integrate their delusions into their lives or to act on them (we do not routinely act on our imaginings). However, there are problems here – the most obvious being that there are many examples of people acting on their delusions and integrating them into elaborate belief networks that pervade the rest of their lives – for example, the person who believes he is a millionaire, a general and a senior psychiatrist, who regularly phones the bank to check on his millions, attempts to arrange to inspect local military bases and applies for a job as chief executive of a hospital (Bentall, 2004, pp.295–6).

The other problem arises from establishing how this characterisation of delusion differs from non-delusional subjects who are ‘believers’. Our normal propositional attitudes can be manifest as beliefs which we may not act on, which may not be integrated into the rest of our beliefs and which may also be irrational. For example, I might say that I believe smoking kills people and I do not want to die sooner than necessary, yet I continue to smoke. This series of un-integrated beliefs might include an unexamined belief (or sub-clinical delusion) that I am special and that the detrimental effect of smoking will somehow not have an impact on me. If questioned about it I would probably concede that the (weakly held) belief that I am special is not true, yet I am unlikely to change my behaviour. Further, one could successfully argue that my behaviour and my thinking in this case are irrational, but it is unlikely that one would question the belief status of my statement about smoking. Some say that delusions are non-doxastic acceptances that do not meet relevant rationality standards (Frankish, 2012) – and here I would have to question what is meant by ‘relevant rationality standards’. Ideal (normative) rationality is not consistent in human beings, and therefore one cannot deny the doxastic nature of delusions simply because they are sometimes irrational (for supporters of this position see Bayne and Pacherie, 2005; Bortolotti, 2010).

Whilst it might be true that some delusions are not beliefs, this does not alter the fact that beliefs as we ordinarily conceptualise them sometimes seem to have the same external characteristics as the phenomena that Currie and Jureidini describe as imaginings mistaken for beliefs and that Frankish describes as non-doxastic acceptances. Of course, as we are unable to consistently and accurately define or describe beliefs or imaginings, I cannot say more about it here – perhaps beliefs, acceptances and imaginings are complex overlapping forms of mental activity. For more on delusion see the imperfect cognitions blog.


Other (non-electronic) references:

Bentall, R.P. (2004) Madness Explained: Psychosis and Human Nature. London: Penguin
Bortolotti, L. (2010) Delusions and Other Irrational Beliefs. International Perspectives in Philosophy and Psychiatry. Oxford; New York: Oxford University Press


Monday, 3 March 2014

Birmingham Workshops in Philosophy - Belief & Perceptual Reasons



Belief & Perceptual Reasons

The presentations of this workshop will all investigate the notion of belief. This event is part of the new Birmingham Workshops in Philosophy series.


Wednesday 12th of March, 2014

10am - 4pm

Open to all.


10:40am  Coffee & Biscuits

11:00am  Scott Sturgeon (Birmingham): “The Tale of Bella and Creda”

                Lunch

1:45pm   Coffee

2:00pm   Rae Langton (Cambridge): “Moral Realism and the Plasticity of Mind”

4:00pm   Susanna Siegel (Birmingham, Harvard): “Can Expertise Rationally Influence Perceptual Experience?”

Talks will be in Lecture Room 3 of the Learning Centre: R28 on the campus map.