Delusion
This entry focuses on the phenomenon of clinical delusions. Although the nature of delusions is controversial, as we shall see, delusions are often characterised as strange beliefs that appear in the context of mental distress. Indeed, clinical delusions are a symptom of psychiatric disorders such as dementia and schizophrenia, and they also characterize delusional disorders. The following case reports describe one instance of erotomania, the delusion that one is loved by someone else, often of higher status, and one instance of the Cotard delusion, the delusion that one is dead or disembodied.
She realized he was empty without her and was pursuing her, but enemies were preventing them from uniting. The enemies included a number of people: people in her family, her classmates, neighbours and many other persons who were plotting to keep them apart. She knew that her conclusions were accurate because he would send messages to her proving his love. These messages would often present themselves as the license plates on cars of a certain state, the color purple and other indications that she received from the environment that proved to her that he loved her. (Jordan et al. 2006, p. 787)
She repeatedly stated that she was dead and was adamant that she had died two weeks prior to the assessment (i.e. around the time of her admission on 19/11/2004). She was extremely distressed and tearful as she related these beliefs, and was very anxious to learn whether or not the hospital she was in, was ‘heaven’. When asked how she thought she had died, LU replied ‘I don’t know how. Now I know that I had a flu and came here on 19th November. Maybe I died of the flu.’ Interestingly, LU also reported that she felt ‘a bit strange towards my boyfriend. I cannot kiss him, it feels strange—although I know that he loves me.’ (McKay and Cipolotti 2007, p. 353)
The category of delusions is not homogeneous, and we find that different delusions have different features. Some delusions have implausible content (as we saw in the case of Cotard). Other so-called bizarre delusions include mirrored-self misidentification (the delusion that the person in the mirror is not one’s reflection but a stranger), and the Capgras delusion (the delusion that the spouse or a relative has been replaced by an impostor). The content of other delusions can be plausible and even true (as in erotomania). One can have the delusion that one is an uncomprehended genius, that one’s spouse is unfaithful, or that one’s neighbor is a terrorist, and these may be true beliefs. What makes all the above examples instances of delusions is that they are rigid to some extent, that is, they are not easily given up in the face of challenges and they tend to resist counterevidence. Moreover, delusions are reported sincerely and with conviction, although the behavior of people with delusions is not always perfectly consistent with the content of their delusions and their conviction in the delusional content can fluctuate. Another common feature is that, for people experiencing delusions, the delusion is often a source of distress, and it can compromise good functioning. For instance, people who have delusions of persecution and believe that they are followed by malevolent others live in a state of great anxiety and can give up their jobs, stop communicating with their families, and move cities as a result.
The following first-personal account of delusions illustrates the pervasive effects of delusions on people’s lives:
I increasingly heard voices (which I’d always called “loud thoughts” or “impulses with words”) commanding me to take destructive action. I concluded that other people were putting these "loud thoughts" in my head and controlling my behavior in an effort to ruin my life. I smelled blood and decaying matter where no blood or decaying matter could be found (for example, in the classrooms at school). I had difficulty concentrating, I fantasized excessively, and I had trouble sleeping and eating. (Bockes 1985, p. 488)
This entry only starts to address some of the philosophical debates centered on delusions. Section 1 provides an overview of the philosophical significance of delusions. Section 2 introduces the issues surrounding the controversial definition of delusions and explains some of the common distinctions between types of delusions. Section 3 discusses the most prominent theoretical approaches to the nature and formation of delusions and highlights the conceptual questions emerging from such approaches. Section 4 reviews three of the most discussed themes in the philosophical literature on delusions: whether delusions are irrational; whether they are beliefs; and to what extent they overlap with cases of self-deception. The examination of the issues above often culminates in the attempt to understand how delusions differ from other pathological and non-pathological beliefs.
- 1. The Philosophical Significance of Delusion
- 2. The Nature of Delusion
- 3. Theoretical Approaches to Delusion
- 4. Delusions and the Continuity Thesis
- Bibliography
- Academic Tools
- Other Internet Resources
- Related Entries
1. The Philosophical Significance of Delusion
In recent years, delusions have attracted the attention of philosophers in at least three distinct areas. Here is a summary of the general issues that have been addressed and some examples of specific debates for each of these areas.
1.1 Delusions in the philosophy of mind and the philosophy of psychology
In the philosophy of mind and the philosophy of psychology, there have been various attempts to understand the cognitive processes responsible for the formation of delusions, based on the assumption, widely shared in cognitive neuropsychology, that understanding such processes can lead to the formulation of more empirically sound theories of normal cognition (see Marshall and Halligan 1996, pp. 5–6; Langdon and Coltheart 2000, pp. 185–6). For instance, let’s assume that delusions are pathological beliefs. How do they come about? Do people form delusional beliefs as a response to bizarre experiences? Do they form delusional beliefs because they suffer from some reasoning deficit? A detailed account of the formation of delusions can help flesh out the details of the formation of non-pathological beliefs.
As the above questions already suggest, the study of delusions raises conceptual questions about intentionality, and about the relationship between intentionality, rationality and self-knowledge. Moreover, it invites us to reconsider the interaction between perception, cognition, and intentional behavior. One basic question is what comes first, the experience or the belief (see Campbell 2001): are delusions bizarre convictions that alter one’s way of seeing the world, or are they hypotheses formulated to account for some unusual experiences, and then endorsed as beliefs? Another debated issue is whether delusions should be characterized as beliefs at all, given that they share features with acts of imagination (Currie 2000), desires (Egan 2009) and perceptions (Hohwy and Rajan 2012). Can delusions be beliefs if they present significant deviations from norms of rationality, and are often neither consistent with a person’s beliefs nor responsive to the available evidence? Bayne and Pacherie (2005) and Bortolotti (2009) offer defenses of the doxastic nature of delusions, but this is still a hotly debated issue. An interesting position recently defended by Schwitzgebel (2012) is that delusions are in-between states (neither beliefs nor non-beliefs), because they match only in part the dispositional profile of beliefs. Schwitzgebel’s position has recently been challenged by philosophers who argue that delusions play a belief-role in explaining and predicting intentional action (see Bortolotti 2012; Bayne and Hattiangadi 2013).
Another strand of investigation developing in this area concerns the possible failures of self knowledge exhibited by people with delusions. There are several manifestations of poor knowledge of the self in delusions (see Kircher and David 2003; Amador and David 1998). People reporting delusions of passivity may not recognize a movement or a thought as their own, and thus have a distorted sense of their personal boundaries (e.g., Stephens and Graham 2000). People with delusions may act or feel in a way that is incompatible with the content of their delusions, or be unable to endorse the content of their delusion with reasons that are regarded by others as good reasons (e.g., Gallagher 2009; Bortolotti and Broome 2008, 2009; Fernández 2010). Finally, people reporting delusions may encounter difficulties in remembering their experienced past and in projecting themselves into the future, because they construct unreliable self narratives (e.g., Gerrans 2009).
1.2 Delusions in the philosophy of psychiatry
In addition to the literature on the etiology of delusions and their status as beliefs, there is also a growing literature in the philosophy of psychiatry on other aspects of the nature of delusions and on the impact of delusions on people’s mental health. This literature aims at addressing the conceptualization of delusional experience and of delusional beliefs in the wider context of psychiatric research and clinical practice, without neglecting the very practical and urgent problems that mental health professionals and clients need to face in the diagnosis and treatment of delusions. More general debates in the philosophy of psychiatry, such as the difficulty of describing psychiatric disorders as natural kinds and the difficulty of providing a justification for the divide between the normal and the pathological, are often applied to schizophrenia and to delusional disorders. In such debates, delusions feature prominently. For instance, philosophers ask whether ‘delusion’ is a natural kind (e.g., Samuels 2009) and whether there is a principled and value-free way to distinguish clinical delusions from delusional ideas that are widespread in normal cognition, such as religious beliefs about divine revelation or paranoid beliefs (e.g., Fulford 2004).
There are at least six possible non-exclusive answers to what makes delusions pathological:
(i) Delusions are pathological because they present themselves as what they are not. They resemble beliefs but do not share some of the core features of beliefs such as action guidance, and are irrational to a higher degree than or in a qualitatively different way from irrational beliefs (for a discussion of aspects of this view, see Currie and Jureidini 2001 and Frankish 2009).
(ii) Delusions are pathological because they are signs that the person inhabits a fictional, non-actual reality and no longer shares some fundamental beliefs and practices with the people around her (for different versions of this view, see Stephens and Graham 2004 and 2006; Sass 1994; Gallagher 2009; Rhodes and Gipps 2008).
(iii) Delusions are pathological because they are puzzling and unsettling – in so far as they defy folk-psychological expectations – and this feature also makes them less amenable to rationalization and interpretation (this idea is explored in Campbell 2001).
(iv) Delusions are pathological because (differently from many irrational beliefs) they negatively affect a person’s well-being, causing impaired social functioning, social isolation and withdrawal (see Garety and Freeman 1999 for a multidimensional account of delusions, and Bolton 2008 for a harm-related account of mental illness in general).
(v) Delusions are pathological because they have forensic implications, that is, implications for judgements about whether agents can be held legally accountable for their actions. Hohwy and Rajan (2012) argue that we tend to attribute delusions when we notice significant impairments in decision-making, autonomy and responsibility.
(vi) Delusions are pathological because of their etiology. Differently from other beliefs, they are produced by mechanisms that are dysfunctional or defective. For instance, the process of their formation may be characterized by perceptual aberrations, reasoning biases or deficits.
The challenge for (i) is to account for the difference in kind between the irrationality of common beliefs that are ungrounded and resistant to change (such as superstitious beliefs or beliefs in alien abductions) and the irrationality of delusions. There is an abundance of evidence that delusional phenomena are widespread in the normal population, which suggests that a sharp dichotomy between the normal and the pathological would be a simplification (see data in Maher 1974, Johns and van Os 2001, and Bentall 2003).
Accounts in (ii) and (iii) may be plausible for some delusions that appear to defy common sense and are accompanied by a certain type of heightened experience, but do not seem to apply equally well to more mundane delusions such as jealousy or persecution. Moreover, it is not always obvious that ascribing a delusion to someone as a belief makes the behavior of that person particularly difficult to explain or to predict.
The view described in (iv) is very attractive because it captures the distinction between delusions and irrational beliefs in terms of their effects on other aspects of a person’s psychological and social life. However, using the notions of well-being and harm in accounts of delusions can be problematic, since it is possible for some people to live with the delusion in a way that is preferable to living without the delusion: ceasing to believe that one is a famous TV broadcaster after many years, and starting to accept that one has been mentally unwell instead, can cause low self-esteem leading to depression and suicidal thoughts.
Challenges for a forensic account of delusions in (v) lie in the heterogeneity of the behavior exhibited by those who experience delusions. Although some delusions can be accompanied by severe failures of autonomous decision-making and give rise to action for which the agent is not held accountable, it is not obvious that these are generalisable phenomena. Does the mere presence of delusions indicate lack of autonomy or responsibility? Broome et al. (2010) discuss a case study which raises interesting issues about the role of delusions in criminal action.
The etiological answer in (vi) to the question why delusions are pathological needs to be explored further. So far, the consensus seems to be that reasoning biases affect normal reasoning, and are not present only in people with delusions. Perceptual aberrations can explain the formation of some delusions, but are not always a core factor in the formation of all delusions. A problem with the hypothesis evaluation system involved in the formation of beliefs may be at the origin of all delusions, but there is no agreement as to whether the problem is a permanent deficit or a performance error. Thus, it is not clear whether etiological considerations can support a categorical distinction between pathological and non-pathological beliefs.
1.3 Moral psychology and neuroethics
Moral psychology and neuroethics investigate the implications of the debates on the nature of delusions in the philosophy of mind and the philosophy of psychiatry for the type of participation in the moral community to which people with delusions are entitled. This includes the attempt to understand better how people’s rights and responsibilities are affected by their having delusions. For instance, it is important to determine when people with delusions no longer have the capacity to consent to being treated in a certain way, and to safeguard their interests by ensuring that they receive good care. It is also important to understand whether they can be regarded as morally responsible for their actions if they commit acts of violence or other crimes that can be motivated by their believing the content of their delusion.
As a consequence of the failures in rationality and self-knowledge that may characterize people with delusions in the acute phase of their mental illness, they may appear as if they were ‘in two minds’, and they may not always present themselves as unified agents with a coherent set of beliefs and preferences (e.g., Kennett and Matthews 2009). As a result, they may be (locally or temporarily) unable to exercise their capacity for autonomous thought and action.
2. The Nature of Delusion
We saw some examples of delusions, but no definition yet. How are delusions defined and classified?
2.1 Defining delusion
Commonly used definitions of delusions make explicit reference to their surface features rather than to the underlying mechanisms responsible for their formation. Surface features are the behavioral manifestations of the delusions, and are often described in epistemic terms; that is, their description involves the concepts of belief, truth, rationality, or justification (e.g., delusions are beliefs held with conviction in spite of having little empirical support). According to the Glossary in the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV 2000, p. 765 and DSM-5 2013, p. 819), delusions are false beliefs based on incorrect inference about external reality that persist despite evidence to the contrary:
Delusion. A false belief based on incorrect inference about external reality that is firmly sustained despite what almost everyone else believes and despite what constitutes incontrovertible and obvious proof or evidence to the contrary. The belief is not one ordinarily accepted by other members of the person’s culture or subculture (e.g., it is not an article of religious faith). When a false belief involves a value judgment, it is regarded as a delusion only when the judgment is so extreme as to defy credibility.
Philosophers interested in the nature of delusions have asked a number of questions which highlight the weaknesses of the DSM definition. For instance, how can we tell delusions apart from other pathologies involving cognitive impairments or deficits? How can we distinguish delusions from non-pathological, but similarly false or unjustified beliefs? These questions aim at capturing both what is distinctive about delusions, and what makes them pathological.
Delusions are generally accepted to be beliefs which (a) are held with great conviction; (b) defy rational counter-argument; (c) and would be dismissed as false or bizarre by members of the same socio-cultural group. A more precise definition is probably impossible since delusions are contextually dependent, multiply determined and multidimensional. Exemplars of the delusion category that fulfil all the usual definitional attributes are easy to find, so it would be premature to abandon the construct entirely. Equally, in everyday practice there are patients we regard as deluded whose beliefs in isolation may not meet standard delusional criteria. In this way a delusion is more like a syndrome than a symptom. (Gilleen and David 2005, pp. 5–6)
Counterexamples are easily found to the DSM definition of delusion: there are delusions that do not satisfy all of the proposed criteria, and there are irrational beliefs that do, even though they are not commonly regarded as delusional. Coltheart summarizes the main problems with the DSM definition:
1. Couldn’t a true belief be a delusion, as long as the believer had no good reason for holding the belief? 2. Do delusions really have to be beliefs—might they not instead be imaginings that are mistaken for beliefs by the imaginer? 3. Must all delusions be based on inference? 4. Aren’t there delusions that are not about external reality? ‘I have no bodily organs’ or ‘my thoughts are not mine but are inserted into my mind by others’ are beliefs expressed by some people with schizophrenia, yet are not about external reality; aren’t these nevertheless still delusional beliefs? 5. Couldn’t a belief held by all members of one’s community still be delusional? (Coltheart 2007, p. 1043)
The Diagnostic and Statistical Manual of Mental Disorders has been updated recently and, although no changes appear in the Glossary, some interesting shifts can be noted in the description of delusions that appears in the section on schizophrenia (compare DSM-IV, p. 275 and DSM-IV-TR p. 299 with DSM-5, p. 87). The new description seems to take into account some of the issues identified by Coltheart and others. For instance, in the DSM-5 delusions are described not as false, but as “fixed beliefs that are not amenable to change in light of conflicting evidence”. Leaving the details aside, some general comments apply to the style of the DSM definitions and descriptions of delusions. In so far as delusions are defined and described as irrational beliefs, it is difficult for them to be uniquely identified because their epistemic ‘faults’ are shared with other symptoms of psychiatric disorders, and with non-pathological beliefs. But definitions such as the ones in the DSM probably cannot be expected to provide necessary and sufficient conditions for the phenomena they aim to define. At best, they can prove diagnostically useful and guide further research by conveniently delimiting an area of investigation worth pursuing.
A widespread critique of the DSM definition is that not enough weight is given to the consequences of having the delusion for the well-being of the person reporting it. Some recent definitions of delusion make more explicit reference to “disrupted functioning” (e.g., McKay et al. 2005a, p. 315). Freeman (2008, pp. 24–26) highlights the multi-dimensional nature of delusions and lists among the main characteristics of delusions not just that delusions are unfounded, firmly held, and resistant to change, but also that they are preoccupying and distressing, and that they interfere with the social dimension of a person’s life.
2.2 Types of delusion
2.2.1 Functional versus organic
Delusions used to be divided into functional and organic. Now the distinction is regarded by most as obsolete, at least in its original characterization. A delusion was called ‘organic’ if it was the result of brain damage (usually due to injuries affecting the right cerebral hemisphere). A delusion was called ‘functional’ if it had no known organic cause and was explained primarily via psychodynamic or motivational factors. It has become more and more obvious with the development of neuropsychiatry that the two categories overlap. Today, the received view is that there is a biological basis for all types of delusions, but that in some cases it has not yet been identified with precision. Some studies have reported very little difference between the phenomenology and symptomatology of delusions that were once divided into organic and functional (Johnstone et al. 1988).
2.2.2 Monothematic versus polythematic
As we saw, in persecutory delusions, people believe that they are followed and treated with hostility, and that others want to harm them. In delusions of mirrored-self misidentification, people usually preserve the capacity to recognize images in the mirror as reflections, but do not recognize their own face reflected in the mirror and come to think that there is a person in the mirror, a stranger who looks very much like they do. In either case, the delusion is resistant to counterevidence and has pervasive effects on one’s life. One of the differences is that persecutory delusions are polythematic, that is, they extend to more than one theme, where the themes can be interrelated. Delusions of mirrored-self misidentification are monothematic, and apart from the content of delusion itself, no other (unrelated) bizarre belief needs to be reported by the same person. Thus, a person who systematically fails to recognize her image in the mirror and comes to think that there is a person identical to her following her around (as in mirrored-self misidentification), but has no other unusual beliefs, has a monothematic delusion. Other examples of monothematic delusions often referred to in the philosophical literature are Capgras and Cotard. The Capgras delusion involves the belief that a dear one (a close relative or the spouse) has been replaced by an impostor. The Cotard delusion involves the belief that one is disembodied or dead. Delusions of persecution are very common polythematic delusions. A person who believes that she is surrounded by alien forces and that they control her own actions and are slowly taking over people’s bodies might have a number of different delusions (persecution and alien control). These delusions are interrelated and are manifest in the interpretation of most events occurring in the person’s life. Other examples of delusions that affect many aspects of one’s cognitive life are the belief that one is a genius but is often misunderstood by others (grandeur), and the belief that one is loved by a famous or powerful person (erotomania).
2.2.3 Circumscribed versus elaborated
Monothematic delusions tend to be circumscribed whereas polythematic delusions tend to be elaborated (see Davies and Coltheart 2000 for more detailed explanation and examples). The distinction between circumscribed and elaborated delusions is relevant to the level of integration between delusions and a person’s other intentional states and to the extent to which the person’s endorsement of the delusion is manifested in verbal reports and observable behavior. Delusions might be more or less circumscribed. A delusion is circumscribed if it does not lead to the formation of other intentional states whose content is significantly related to the content of the delusion, nor does it have pervasive effects on the behavior of the person reporting the delusion. For instance, a person with Capgras who believes that his wife has been replaced by an impostor but shows no preoccupation with his wife and does not go and look for her appears to have a circumscribed delusion. A delusion is elaborated if the person reporting the delusion draws consequences from the delusional state and forms other beliefs that revolve around the theme of the delusion. For instance, a person with Capgras can develop paranoid thoughts related to the content of the delusion, along the lines that the impostor has evil intentions and will cause harm when the occasion presents itself.
2.2.4 Primary versus secondary
Depending on whether the delusion seems to be reported on the basis of some reasons, and defended with arguments, delusions can be described as primary or secondary. The traditional way of distinguishing primary from secondary delusions relied on the notion that primary delusions ‘arise out of nowhere’ (Jaspers 1963). This traditional characterization of the distinction has been found problematic, because it is difficult to establish whether there are antecedents of the delusion in a person’s line of reasoning, and for other methodological and clinical reasons (e.g., Miller and Karoni 1996, p. 489). New readings of the distinction have been provided in the recent philosophical literature on delusions, where the need arises for distinguishing between people who can endorse the content of their delusions with reasons, and people who cannot (e.g. Bortolotti and Broome 2008 talk about authored and un-authored delusions; and Aimola Davies and Davies 2009 distinguish between pathologies of belief and pathological beliefs on similar lines).
3. Theoretical Approaches to Delusion
There are several theoretical approaches to delusion formation which attempt to explain the surface features of delusions by reference to abnormal experiences, reasoning biases, neuropsychological deficits, motivational factors, and prediction error, but the task of describing the behavioral manifestations of delusions, and reconstructing their etiology is made difficult by the variation observed both in the form and in the content of delusions.
When the distinction between functional and organic delusion was still widely accepted, functional delusions were primarily explained on the basis of psychodynamic factors, whereas organic delusions primarily received a neuropsychological explanation. At the present stage of empirical investigation into the formation of delusional states, the received view is that all delusions are due to neuropsychological deficits, which might include motivational factors.
3.1 Neuropsychological and psychodynamic accounts of delusion
According to psychodynamic accounts, there need be no neurobiological deficit: delusions are caused by motivational factors alone. For instance, delusions of persecution would be developed in order to protect one from low self-esteem and depression, and would be due to the attribution of negative events to some malevolent other rather than to oneself. The delusion would be part of a defense mechanism. Other delusions, such as Capgras, have also received a psychodynamic interpretation: a young woman believes that her father has been replaced by a stranger looking just like him in order to make her sexual desire for him less socially objectionable. In this way, the delusion would have the function of reducing anxiety and a sense of guilt. Psychodynamic accounts of the Capgras delusion have been strongly criticized on the basis of recent findings about the type of brain damage that characterizes people with Capgras and affects their face recognition system. Psychodynamic accounts of other delusions that are supposed to play a defensive or self-enhancing role (e.g., persecution, anosognosia and erotomania) are still very popular.
Neuropsychological accounts offer very satisfactory explanations of some delusions, as one can often identify with some precision the damaged region of the brain and the causal link between the damage and the formation of the delusion. Neuropsychological accounts of other delusions—once regarded as ‘functional’—are also being developed and explored. For some delusions, hybrid accounts have been proposed, where a combination of different factors (including motivation) significantly contributes to the formation of the delusion (e.g. McKay et al. 2007). One such case seems to be the Reverse Othello Syndrome, the delusion that a spouse or romantic partner is still faithful when this is no longer the case. The belief can be regarded as a defense against the suffering that the acknowledgement of the infidelity of one’s partner would cause (see example in Butler 2000 as cited and discussed by McKay et al. 2005a, p. 313).
According to popular neuropsychological accounts, delusions are the result of a cognitive failure, which can be an abnormal perceptual experience (Maher 1974); an abnormal experience accompanied by milder dysfunctions such as reasoning biases (Garety and Freeman 1999; Garety et al. 2001); a breakdown of certain aspects of perception and cognition including a deficit in hypothesis evaluation (Langdon and Coltheart 2000); or a failure of predictive coding (Fletcher and Frith 2009; Corlett et al. 2010).
In the two-factor theory framework, an abnormal event is responsible for the formation of the delusion. The young woman who thinks that her father has been replaced by an impostor would form this belief because she has a reduced autonomic response to familiar faces, and this affects her capacity to recognize the face of the man in front of her as her father’s face, even if she can judge that the face is identical (or virtually identical) to that of her father. But this abnormal event (reduction of autonomic response) is not the only factor responsible for the formation of the delusion. In order to explain why the thought that a dear one has been replaced by an impostor is adopted as a plausible explanation of the abnormal event, one also needs to posit a deficit at the level of hypothesis evaluation (Coltheart 2007), or the presence of exaggerated attributional or data-gathering biases, such as the tendency to ‘jump to conclusions’ on the basis of limited evidence (Garety and Freeman 1999).
According to the prediction-error theory, expectations are formed about experience, and greater attention is paid to those events which defy expectations. The discrepancy between what is expected and the information taken in is an important part of the way learning occurs. When expectations are not met, a prediction error is coded, and the representation of the world is updated accordingly. Prediction-error signaling is disrupted when events are inappropriately invested with special significance (in psychosis this may be due to dopamine dysregulation), causing one to update one’s current (correct) beliefs about reality. The woman who sees her father and does not get the usual autonomic response experiences an unexpected event which gives rise to a prediction error. The reaction to the signal is to attempt an explanation of the unexpected event (‘That man is not my father!’). This results in the formation of a delusion (Corlett et al. 2007).
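To fix ideas, the updating scheme that prediction-error accounts appeal to can be sketched as follows; the notation and the simple linear form of the update are purely illustrative and are not drawn from any particular model in the cited literature.

```latex
% Illustrative sketch only: an error-driven update of the kind invoked by
% prediction-error accounts; the symbols and the linear form are not from the cited papers.
\begin{align*}
  \delta_t &= o_t - \hat{o}_t
    && \text{prediction error: input minus what the current model predicted} \\
  \hat{o}_{t+1} &= \hat{o}_t + \alpha\,\sigma\,\delta_t
    && \text{revision weighted by a learning rate } \alpha \text{ and the salience } \sigma \text{ of the error}
\end{align*}
```

On this schematic picture, the suggestion that dopamine dysregulation produces inappropriately strong prediction-error signals can be read as the claim that the salience weight is set too high, so that ordinary input prompts revisions of a model of the world that was in fact accurate.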
3.2 Bottom-up versus top-down theories of delusion
Another distinction, introduced and developed in the philosophical literature on delusions, is between bottom-up and top-down theories, where these labels are meant to refer to the direction of the causal relation between experience and belief in the formation of the delusion. Bottom-up theorists argue that the direction of causal explanation is from the experience to the belief. Top-down theorists argue that the direction of causal explanation is from the belief to the experience. Notice that not everybody finds the distinction useful. For instance, Hohwy and Rosenberg (2005) and Hohwy (2004) argue that the distinction loses its appeal in the framework proposed by prediction-error theorists given that delusion formation involves both bottom-up and top-down processes. A person’s prior expectations affect the way in which the perceptual signals are processed and give rise to unusual experiences. Then, the unusual experiences go through reality testing and are subject to further interpretation, after which they become a central factor in the formation of the delusional belief.
For bottom-up theorists, delusions involve modifications of the belief system that are caused by ‘strange experiences’ due to organic malfunction (Bayne and Pacherie 2004a; Davies et al. 2001). For instance, I experience people watching me with suspicion or hostility, and as a result I form the hypothesis that they want to harm me; or something does not feel right when I see my sister’s face, and as a result I come to believe that the person I am looking at is not really my sister but an impostor.
The proximal cause of the delusional belief is a certain highly unusual experience (Bayne and Pacherie 2004a, p. 2).
What would top-down theorists say about the same examples? I believe that people want to harm me, and as a result I perceive them as looking at me malevolently; or I believe that someone looking almost identical to my sister has replaced her, and as a result the person claiming to be my sister doesn’t look to me as my sister does. The top-down thesis about delusion formation has been proposed especially for monothematic delusions such as Capgras (Campbell 2001; Eilan 2000) and for delusions of passivity, when people report that there are external influences on their thoughts and actions (Sass 1994; Graham and Stephens 1994; Stephens and Graham 2000).
[D]elusion is a matter of top-down disturbance in some fundamental beliefs of the subject, which may consequently affect experiences and actions (Campbell 2001, p. 89).
Both bottom-up and top-down theories face challenges: whereas top-down theorists need to account for where the belief comes from, and why it is so successful in affecting perceptual experiences, bottom-up theorists are pressed to explain why people tend to endorse a bizarre hypothesis to explain their unusual experiences, given that hypotheses with higher probability should be available to them.
Within the bottom-up camp, further divisions apply. For some, it is correct to say that the delusional belief explains the experience. Others claim that the delusion is an endorsement of the experience. According to the explanationist account (Maher 1999; Stone and Young 1997), the content of experience is vaguer than the content of the delusion, and the delusion plays the role of one potential explanation for the experience. For instance, in the Capgras delusion, the experience would be that of someone looking very much like my sister but not being my sister. The delusion would be an explanation of the fact that the woman looks like my sister, but her face feels strange to me: the woman must be an impostor. In persecution, the experience would be of some people as being hostile, and the delusion would be an explanation of why they seem hostile: they have an intention to harm me. This account leaves it open that the same experience could have been explained differently (i.e., without any appeal to the delusional hypothesis).
According to the rival account, the endorsement account (Bayne and Pacherie 2004a; Pacherie et al. 2006), the content of the experience is already as conceptually rich as the content of the delusion. The delusion is not an explanation of the experience, but an endorsement of it: the content of the experience is taken as veridical and believed. In Capgras, the experience is that of a woman looking very much like my sister but being an impostor, and when the experience is endorsed, it becomes the delusional belief that my sister has been replaced by an impostor. In persecution, the experience is that of people having an intention to harm me, and when it is endorsed, it becomes the delusional belief that those people want to harm me.
Both versions of the bottom-up theory seem to imply that the delusion starts with a conscious experience, or better, with an experience whose content is available to a person as something to be explained or to be endorsed. But Coltheart (2005b) suggests instead that in the typical case the process of delusion formation starts with an event that a person is not aware of, such as the absence of an autonomic response.
3.3 One-factor, two-factor and prediction-error theories of delusion formation
If the delusional belief comes from the experience, why is the delusional hypothesis preferred to more probable and plausible hypotheses (in the explanationist language), or why is the content of the experience endorsed in spite of low probability and plausibility (in the language of the endorsement account)? There are several replies to this objection in the literature, which have given rise to competing theories of delusion formation. Bottom-up theorists can be divided into those who think that the unusual experience is sufficient for the formation of the delusion (one-factor theorists), and those who think that the unusual experience is only one factor in the formation of the delusion (two-factor theorists).
For some one-factor theorists (Maher 1974), the delusion is a reasonable hypothesis given the strangeness of the experience, or the strange experience is in a sensory modality or at a processing stage where further reality testing is not available (Hohwy and Rosenberg 2005). But other one-factor theorists (e.g. Gerrans 2002a) argue that, although it may be reasonable to articulate a delusional hypothesis, it is not rational to maintain it in the face of counterevidence. For two-factor theorists (Davies et al. 2001; Stone and Young 1997), the delusion is formed in order to explain a puzzling experience or a failed prediction, but the presence of the experience or the failed prediction is not sufficient for the formation of the delusion. The mechanism responsible for the formulation of the delusional hypothesis must be affected by reasoning biases or deficits. Recent developments of this theory have been offered by Aimola Davies and Davies 2009, by Coltheart et al. 2010, and by Davies and Egan 2013.
Thus, there are three main positions as to whether reasoning is impaired in people with delusions: (1) it is not impaired at all or the apparent impairment is due to a performance error rather than to a limitation of reasoning competence; (2) it is impaired due to a hypothesis evaluation deficit, and possibly reasoning biases; (3) it is impaired due to reasoning biases only. Although the predominance of certain reasoning styles and the presence of reasoning biases in people with delusions have been studied extensively, the available evidence does not seem to clearly prioritise one of the three options above. It is difficult at this stage of theoretical development to establish whether a certain reasoning “mistake” is due to a failure of competence or a failure of performance, or to specify exactly what processes are involved in the hypothesis evaluation system.
By reference to monothematic delusions, Max Coltheart explains the two main factors involved in the formation of delusions as follows:
- There is a first neuropsychological impairment that presents the patient with new (and false) data, and the delusional belief formed is one which, if true, would explain these data. The nature of this impairment varies from patient to patient.
- There is a second neuropsychological impairment, of a belief evaluation system, which prevents the patient from rejecting the newly formed belief even though there is much evidence against it. This impairment is the same in all people with monothematic delusions. (Coltheart 2005b, p. 154)
In Davies et al. (2001) and Coltheart (2007), factor two is described in more detail. First, there is the generation of a hypothesis which serves as an explanation of the experience or an endorsement of the content of the experience. Second, there is a failure in rejecting the hypothesis, even when it is not supported by the available evidence and it is implausible given the person’s background beliefs; such a failure is probably due to frontal right hemisphere damage. Finally, the hypothesis is accepted, attended to and reported, and can be subject to further (personal-level) evaluation when counterevidence emerges. When it is endorsed, the hypothesis is regarded as more plausible, more probable, and more explanatory than relevant alternatives. This influential account of the neuropsychology of delusions appeals to general mechanisms of belief formation, namely hypothesis generation and evaluation, and is compatible both with the view that people with delusions have ‘non-optimal hypothesis-testing strategies’ (Kihlstrom and Hoyt 1988, p. 96) and with the thesis that these sub-optimal strategies may be caused by damage to the right hemisphere (Ramachandran and Blakeslee 1998), which would be responsible for examining the fit between hypothesis and reality.
A similar story is told for polythematic delusions, self-deception, and delusion- and confabulation-like episodes in the normal population, although in such cases a single deficit could be at the origin of the reported belief (see McKay et al. 2005a). Experiential information is misinterpreted due to attentional or data-gathering biases that affect the generation of hypotheses or to powerful motivational factors.
The prediction-error theory of delusion formation differs from the two-factor account in that it is not wedded to the doxastic nature of delusions and it focuses on the similarities between delusions and perceptual illusions (Hohwy 2012, 2013). There need be no specific reasoning deficit contributing to the formation of the delusion, but rather a disruption in the coding of prediction error, which causes accurate beliefs to be revised, with the result that they become inaccurate. The choice between one-factor and two-factor theories depends on a sharp distinction between perception and reasoning. The prediction-error approach to delusions assumes an architecture on which this distinction is harder to draw, since even the most paradigmatically perceptual processing draws on predictions.
4. Delusions and the Continuity Thesis
This section focuses on three debates that have animated the philosophical literature on delusions in recent years. They can all be seen as attempts to examine the extent to which the reasoning patterns and styles exhibited by people with delusions are continuous with those exhibited by people who have no known pathology of cognition.
4.1 Are delusions irrational?
There is no doubt that the definitions of delusions in DSM-IV and DSM-5 characterise delusions as irrational beliefs. However, in the philosophical literature on delusions, the status of delusions as irrational beliefs does not go unchallenged. Are delusions really irrational?
In a number of influential papers Brendan Maher (1974, 1988, 1999, 2003) argues that delusions are not ill-formed beliefs, and that there is nothing irrational in the relationship between the evidence supporting the delusional hypothesis and the formation of such a hypothesis. According to Maher, the abnormality of the delusion is entirely due to the abnormality of the experiences on the basis of which the delusion is formed. By reference to Maher’s model, Blaney (1999) describes delusions as ‘false but reasonable’. Some difficulties have been identified with this strategy. A first difficulty is that there seem to be people who suffer from the same type of brain damage, and plausibly have the same experience, as the people who develop the delusion, but do not accept any delusional hypotheses. How can these people avoid forming a delusion? One possible answer is that those who have strange experiences and do not form the delusion have hypothesis-evaluation mechanisms that work efficiently, and thus end up rejecting hypotheses with low probability and plausibility. But those who have strange experiences and do form the delusion are instead affected by an additional problem, a deficit at the level of hypothesis evaluation, which can be conceived as a failure of rationality.
On Maher’s view, […] [i]t follows that anyone who has suffered neuropsychological damage that reduces the affective response to faces should exhibit the Capgras delusion; anyone with a right hemisphere lesion that paralyzes the left limbs and leaves the subject with a sense that the limbs are alien should deny ownership of the limbs; anyone with a loss of the ability to interact fluently with mirrors should exhibit mirrored-self misidentification, and so on. However, these predictions from Maher’s theory are clearly falsified by examples from the neuropsychological literature (Davies et al. 2001, p. 144).
Another difficulty with Maher’s original account of delusions as ‘false but reasonable’ is that, even if the abnormality of the experience were to satisfactorily explain the acceptance of the delusional hypothesis and the formation of the delusion, this would not be sufficient to guarantee that the behavior of people with delusions is overall rational. We would still have to explain why delusions are maintained in the face of counterevidence once the delusional hypothesis has been formed and endorsed (see Gerrans 2002a). One aspect of the notion of rationality for beliefs is that people are disposed to revise or abandon beliefs that seem to be in conflict with the acquired evidence. The “incorrigibility” of delusions speaks in favor of their being held irrationally.
Let’s concede that maintaining delusions (if not forming them) is irrational. Which norms of rationality are violated by the obstinate attachment to a delusional hypothesis? One norm that does seem to be infringed by delusions is consistency, where this is intended both as consistency between the delusion and the person’s other beliefs, and consistency between the delusion and the person’s behavior.
Rationality is a normative constraint of consistency and coherence on the formation of a set of beliefs and thus is prima facie violated in two ways by the delusional subject. First she accepts a belief that is incoherent with the rest of her beliefs, and secondly she refuses to modify that belief in the face of fairly conclusive counterevidence and a set of background beliefs that contradict the delusional belief (Gerrans 2000, p. 114).
Delusions do not seem to respect the idea that the belief system forms a coherent whole and that adjustments to one belief will require adjustments to many others (Young 2000, p. 49).
In the course of the same interview, a woman may claim that her husband died four years earlier and was cremated and that her husband is a patient in the same hospital where she is (Breen et al. 2000, p. 91). In the Capgras delusion, people may worry about the disappearance of their loved one, but also be cooperative and even flirtatious with the alleged impostor (see Lucchelli and Spinnler 2007). This suggests that delusions do not always give rise to appropriate action (Bleuler 1950; Sass 2001), although they must be reported either spontaneously or after questioning, or they could not be diagnosed as delusions. How can we square people’s apparent strong conviction in the content of the delusion with their failure to act on it? One hypothesis is that the content of the delusion is not genuinely believed. Another hypothesis is that the content of the delusion is genuinely believed but not converted into action, because the person fails to acquire or maintain the motivation to act (this would be consistent with negative symptoms of schizophrenia, see Bortolotti and Broome 2012).
One should not be too impressed by ‘behavioural inertia’ in people with delusions, as there are many examples of people acting on their delusions. Affected by perceptual delusional bicephaly, the delusion that one has two heads, a man who believed that the second head belonged to his wife’s gynecologist attempted to attack it with an axe. When the attack failed, he attempted to shoot it down; as a consequence he was hospitalized with gunshot wounds (Ames 1984). Cases of Cotard delusion have been reported where people stop eating and bathing themselves as a consequence of believing that they are dead (Young and Leafhead 1996).
Other possible violations of norms of rationality come from the relationship between the content of the delusion and the available evidence. Resistance to revising or abandoning the delusion in the face of powerful counterevidence or counterargument is a sign of irrationality in normal and abnormal cognition alike: people with delusions ignore relevant evidence or attempt to defend their beliefs from apparent objections with obvious confabulations. Often these attempts are deeply perplexing, as the reasons offered for believing in the content of their delusions do not seem to be good reasons: in one of the examples of delusions offered at the start, a woman incorrectly believed that a man was in love with her and claimed that he was sending her secret love messages hidden in the license plates on cars of a certain state.
Thus, delusions may be inconsistent with a person’s beliefs and behavior, are typically unresponsive to both counterevidence and counterargument, and are often defended by weak evidence or argument. The empirical literature suggests that the reasoning performance of people with delusions reflects data-gathering and attribution biases. For instance, it has been argued that people with delusions ‘jump to conclusions’; they need less evidence to be convinced that a hypothesis is true (Garety 1991; Huq et al. 1988; Garety and Freeman 1999), and are more hasty in their decisions (Moritz and Woodward 2005; Fine et al. 2007). Other biases have also been noted: people with delusions of persecution tend to attribute responsibility for negative events to other people (e.g., McKay et al. 2005b); in the Cotard delusion there seems to be a tendency to attribute responsibility for negative events to oneself (Young and Leafhead 1996; Gerrans 2000; McKay and Cipolotti 2007). There are further studies suggesting that people with delusions are worse than controls at inhibiting the evidence of their senses when it conflicts with other things they know (Langdon et al. 2008b) and that they have an accentuated need for closure, which comprises a desire for clarity and structure (see Kruglanski 1989, p. 14). These data are not by themselves sufficient to support the view that delusions are irrational, but show interesting deviations from statistically normal performance in the behavior of people with delusions.
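The ‘jumping to conclusions’ finding can be illustrated with a worked example of the kind of probabilistic task used in this literature; the numbers below are chosen purely for illustration and are not taken from the cited studies. Suppose beads are drawn from one of two jars, jar A containing 85% black and 15% white beads and jar B containing the reverse proportions, with each jar equally likely to have been selected.

```latex
% Illustrative numbers only. After a single black bead has been drawn:
\begin{align*}
  P(A \mid \mathrm{black})
    &= \frac{P(\mathrm{black} \mid A)\,P(A)}
            {P(\mathrm{black} \mid A)\,P(A) + P(\mathrm{black} \mid B)\,P(B)} \\
    &= \frac{0.85 \times 0.5}{0.85 \times 0.5 + 0.15 \times 0.5} = 0.85
\end{align*}
```

Deciding in favour of jar A at this point is not in itself a violation of Bayesian norms; the finding is rather that people with delusions tend to request fewer draws before committing to a decision than controls do, which is why the bias is usually described in terms of hasty data gathering rather than faulty calculation.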
A very recent debate relevant to the rationality of delusions concerns the step from abnormal data gathered via perception and the delusional belief. (This is primarily an issue that emerged among two-factor theorists, so the language used below is acceptable to them, but the problem can be reformulated in terms that are friendly to prediction-error theorists.) Coltheart et al. 2010 argue that the step from abnormal data to belief is an instance of abductive inference, as those who end up endorsing a delusional belief need to select an explanatory hypothesis for their abnormal data from a range of relevant hypotheses. Coltheart and colleagues use the Bayesian model of abductive inference which invites us to ask two questions: Which hypothesis better explains the data? Which hypothesis is the most plausible given what we already know? In the case of delusions, they argue, it is reasonable to adopt the delusional hypothesis given the data, and the good fit between hypothesis and data swamps general considerations about the overall implausibility of the hypothesis. What may not be reasonable is the fact that people with delusions hang onto their delusions even when they keep gathering evidence against them; that is, the delusional belief is not correctly updated in the light of new information. The authors argue that the information undermining the delusional hypothesis does not present itself as disconfirming evidence to the person with the delusion. Thus, the new evidence is interpreted in the light of the delusion and confabulation is used to fill the gaps. The behaviour of the person with the delusion is not very different from that of an obstinate scientist refusing to see the new data as undermining the support for the theory she proposed and is now deeply committed to.
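The structure of this Bayesian argument can be made concrete with a toy calculation; the hypothesis labels follow the Capgras example, and the probabilities are invented purely to show how a strong likelihood can swamp a low prior.

```latex
% Invented numbers for illustration. H = the impostor hypothesis,
% D = the abnormal datum (a flat autonomic response to a familiar face).
% A low prior paired with a strong likelihood advantage for H:
\begin{align*}
  &P(H) = 0.01, \qquad P(D \mid H) = 0.9, \qquad P(D \mid \neg H) = 0.001,\\
  &P(H \mid D) = \frac{P(D \mid H)\,P(H)}{P(D \mid H)\,P(H) + P(D \mid \neg H)\,P(\neg H)}
    = \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.001 \times 0.99} \approx 0.90.
\end{align*}
```

Whether assignments anything like these are defensible for delusional contents, and in particular whether the prior of the impostor hypothesis can be anywhere near this high, is exactly what is contested in the criticism discussed next.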
McKay 2012 offers some criticism of the account by Coltheart and colleagues, and raises the following points among others: (1) it is not a perfectly rational response to adopt the delusional hypothesis as an explanation for the abnormal data, unless the probability of the delusional hypothesis before any abnormal data is gathered is very high, and this is implausible given the content of some delusional hypotheses (“my wife has been replaced by an almost identical impostor”); (2) factor two as described by Coltheart and colleagues (i.e., a failure to update a belief in the light of conflicting evidence) cannot precede factor one or be acquired at the same time as factor one, because such a form of conservatism would prevent the person from adopting the delusional hypothesis in the first place. McKay’s positive account is that factor two is not a bias towards conservatism (which causes new data conflicting with the delusional hypothesis to be discounted), but a bias towards explanatory adequacy (which causes the delusional hypothesis to be adopted and maintained because it fits the abnormal data so well). McKay also argues that his account is compatible with prediction-error theories of delusion formation:
An excess of prediction error signal is what underpins the bias towards explanatory adequacy. Prediction error signals are triggered by discrepancies between the data expected and the data encountered. Such signals render salient the unexpected data and initiate a revision of beliefs to accommodate these data. If there is an excess of prediction error signal, inappropriately heightened salience is attached to the data, and belief revision is excessively accommodatory—biased towards explanatory adequacy. (McKay 2012, p. 18)
The debate is reviewed by Davies and Egan (2013), who helpfully focus on the distinction between the adoption and the maintenance of the delusional hypothesis and argue that there is no need to postulate an additional reason why people hang on to their delusions (a bias towards conservatism or explanatory adequacy). Once the delusional hypothesis is adopted as an explanation for or an endorsement of the abnormal data, it is normatively correct from a Bayesian point of view that it is not updated or revised merely because it conflicts with previous beliefs. New evidence would be necessary to prompt an update or a revision. That said, critical evaluation of the delusional hypothesis after adoption may fail to occur if we consider that the delusional belief was formed as a prepotent doxastic response to abnormal data and is thus likely to be compartmentalised. In a fragmented belief system, it could be the case that the compartmentalised belief is not assessed on the basis of previous beliefs.
The debate described above (in a very simplified way) is an attempt to specify what the second factor is in a two-factor account of delusion formation. However, as Davies and Egan themselves acknowledge, the application of idealised models of inference such as Bayesianism is of somewhat limited use when we are considering actual belief systems. Even when no pathology is present, biases affecting the adoption and evaluation of hypotheses are the (statistical) norm, which makes it difficult to identify precisely what goes wrong in the adoption and persistence of delusional beliefs.
Although delusions can be irrational to a higher degree than normal beliefs, as they may be less consistent with a person’s other beliefs and actions and more resistant to counterevidence, they do not seem to be irrational in a qualitatively different way from normal beliefs. This would suggest that they are continuous with irrational beliefs, although (as we shall see in the next section) there exist sophisticated philosophical arguments challenging the continuity claim.
4.2 Are delusions beliefs?
According to the doxastic conception of delusions (dominant among psychologists and psychiatrists), delusions are belief states—it is an important diagnostic feature of delusions that they can lead to action and that they can be reported with conviction, and thus they behave as typical beliefs. (See the entry on belief.) But there is an increasingly influential view in philosophy warning that the doxastic characterization of delusions leads to an oversimplification of the phenomenon. Although some of the alternative accounts of delusions (e.g., experiential, phenomenological and metarepresentational accounts) are critical of standard doxastic conceptions, they do not necessarily deny that the phenomenon of delusions involves the formation of normal or abnormal beliefs. Rather, the central idea seems to be that, even if people with delusions report false or irrational beliefs, paying attention only to their first-order cognitive states and to the doxastic dimension of their pathology can lead to a partial and incorrect view of the phenomenon of delusions (see also Radden 2010).
Some authors emphasize the experiential and phenomenological character of delusions over the doxastic one (e.g., Sass 1994; Gold and Hohwy 2000), and others conceive of delusions not as mere representations of a person’s experienced reality, but as attitudes towards representations (e.g., Currie 2000; Currie and Jureidini 2001; Stephens and Graham 2006). Gallagher 2009 argues that an explanation of the delusion as a mere cognitive error would be inadequate, and introduces the terminology of delusional realities, modes of experience which involve shifts in familiarity and sense of reality and encompass cognition, bodily changes, affect, social and environmental factors.
Most of the authors who deny belief status to delusions have a negative and a positive thesis. The positive thesis is an alternative account of what delusions are. For instance, one might argue that delusions are acts of imagination mistakenly taken by a person to have belief status (Currie and Ravenscroft 2002), or empty speech acts with no intentional import (Berrios 1991). The negative thesis is an account of why delusions are not beliefs. Beliefs have certain characteristics: they are formed and revised on the basis of evidence, they are consistent with other beliefs, and they are action-guiding in the relevant circumstances. If delusions do not share these characteristics, then they are not beliefs.
Let us list some of the arguments for the negative thesis:
- (1) Beliefs are integrated with other beliefs. If delusions are not integrated with a person’s beliefs, then they are not beliefs.
- (2) Beliefs are responsive to evidence. If delusions are not responsive to evidence, then they are not beliefs.
- (3) Beliefs guide action. If delusions do not guide action, then they are not beliefs.
These arguments are central to the debate about the doxastic nature of delusions (Bortolotti 2009). For instance, Currie and Jureidini (2001, p. 161) argue that delusions are more plausibly imaginings than beliefs, because delusions ‘fail, sometimes spectacularly, to be integrated with what the subject really does believe’, whereas there is no requirement that imaginings be consistent with what the person believes. Berrios (1991) argues that delusions cannot be beliefs because, as explanations for an abnormal experience, they are not regarded even by the person reporting them as more probable than alternative explanations of the experience. Berrios reaches the extraordinary conclusion that delusions are not even intentional states, but utterances without meaning, “empty speech acts”.
Assessing arguments in (1) to (3) requires assessing empirical and conceptual claims. Let’s consider (1), the ‘bad integration’ objection to the belief status of delusions. In order to see whether the conclusion is convincing, we need to examine an empirical claim about delusions first: Do delusions really fail to integrate with a person’s beliefs? Then, we need to assess a conceptual claim, the claim that not being integrated with a person’s beliefs prevents delusions from being beliefs at all. In many cases, we shall find that the alleged ‘fault’ of delusions has been exaggerated (e.g., delusions sometimes integrate well with beliefs), but that it is correct to claim that delusions exhibit that mark of irrationality (bad integration) to a higher degree than ordinary beliefs.
The most common versions of anti-doxastic arguments seem to rely on an idealization of normal belief states, and impose constraints on delusions that typical beliefs would not meet. The assumption seems to be that beliefs are essentially rational, and that delusions are not beliefs because they are not rational. But the abundant psychological evidence on familiar irrationality tells us that ordinary beliefs are often irrational in exactly the same way as delusions can be—although to a lesser degree. It is sufficient to think about hypocrisy, about prejudiced and superstitious beliefs, and about the many biases that affect belief updating in normal cognition to realize that the same kinds of irrationality that we find in delusions are also common in many ordinary beliefs (e.g., Nisbett and Ross 1980). For the doxastic conception of delusions, the greatest challenge is to provide a satisfactory reply to the double-bookkeeping objection: if people truly believe the content of their delusions, why is their behavior often inconsistent with it? Aren’t beliefs distinct from other intentional states in virtue of their action-guiding character? A more general worry is that the very notion of belief is not theoretically useful if the criteria for what counts as a belief become too loose.
Independently of how one answers the question whether delusions are beliefs, two opposed conceptions of delusions compete on the philosophical scene. One highlights the discontinuity between delusions and beliefs, and between normal and abnormal cognition, with consequences for the conceptualization of the disorder, but also for the availability of therapeutic options, such as cognitive behavioral therapy, to people with delusions. The other view insists that there is continuity between delusions and beliefs, and attempts to gather data suggesting both that people with delusions can reason in much the same way as people without delusions, and that delusion-like ideas are widespread in the normal population. Bentall 2003, for instance, gathered a vast amount of empirical data about the temporal variations in delusions reported by people affected by psychopathologies, and about the presence of delusion-like beliefs in the normal population.
4.3 Does delusion overlap with self-deception?
There is no consensus on whether self-deception and delusion significantly overlap. Self-deception has traditionally been characterized as driven by motivational factors. Delusions are now primarily accounted for in neurobiological terms, and theories of delusion formation appeal to perceptual and cognitive impairments. However, motivational factors can still play an important role in the explanation of some delusions, for instance by partially determining the specific content of the reported delusional state. Thus, one plausible view is that self-deception and delusion are distinct phenomena that may overlap in some circumstances (for further analysis, see Bayne and Fernández 2009). Here are three views about the relationship between delusion and self-deception.
The first view about the relationship between delusions and self-deception is that, when they overlap, they do so because they both involve a motivationally biased treatment of evidence. If we agree with deflationists that the motivationally biased treatment of the evidence is the key feature of self-deception (Mele 2001 and 2008), then people with delusions can be said to be self-deceived if they treat the evidence at their disposal in a motivationally biased way, or if they search for evidence in a motivationally biased way. This does not seem to be generally the case, but it is useful to distinguish between different types of delusions. Some delusions of misidentification (at least according to neuropsychological accounts) do not seem to be akin to self-deception, given that there is no fundamental role for motivational biases in the explanation of how a person comes to hold or retain the delusion. A different analysis might be appropriate for other delusions, such as delusions of jealousy or persecution.
The second view is that (some) delusions are extreme cases of self-deception and that they have a protective and adaptive function (see Hirstein 2005). An example is offered by Ramachandran, who discusses anosognosia, the denial of illness, and somatoparaphrenia, the delusion that a part of one’s body belongs to someone else. Ramachandran (1996) reports the case of a woman (FD) who suffered a right-hemisphere stroke which left her with left hemiplegia. FD could not move around without a wheelchair and could not move her left arm. But when she was asked whether she could walk and whether she could engage in activities requiring both hands (such as clapping), she claimed that she could. Ramachandran advances the hypothesis that the behaviors giving rise to confabulations and delusions are an exaggeration of normal defense mechanisms that have an adaptive function, as they allow us to create a coherent system of beliefs and to behave in a stable manner. In normal subjects, the left hemisphere produces confabulatory explanations aimed at preserving the status quo (‘I’m not ill’; ‘My arm can move’), but the right hemisphere does its job and detects an anomaly between the hypotheses generated by the left hemisphere and reality, and so forces a revision of the belief system. In patients such as FD, the discrepancy detector no longer works. It is very plausible that the delusions reported by people with anosognosia involve motivational aspects, but whether we regard these delusions as an exaggerated form of self-deception depends on our preferred theoretical characterization of self-deception.
The third view about the potential overlap of delusions and self-deception is that the very existence of delusions (which shows that doxastic conflict is possible) can help us vindicate the traditional account of self-deception, according to which a person has two contradictory beliefs but is aware of only one of them, because she is motivated to remain unaware of the other (McKay et al. 2005a, p. 314). This account derives from Donald Davidson’s theory of self-deception (e.g., Davidson 1982 and 1985b). When I deceive myself, I believe a true proposition but act in such a way as to cause myself to believe the negation of that proposition. Neil Levy argues that the conditions for self-deception set by the traditional approach are not necessary for self-deception, but that the case of FD described by Ramachandran (1996) is living proof that a person can, at the same time, believe that her arm is paralyzed and believe that she can move her arm. Moreover, it is the belief that her arm is paralyzed that causes her to acquire the belief that it is not. This is Levy’s analysis of the typical person with anosognosia (Levy 2008, p. 234):
- (1) Subjects believe that their limb is healthy.
- (2) Nevertheless, they also have the simultaneous belief (or strong suspicion) that their limb is significantly impaired, and they are profoundly disturbed by this belief (suspicion).
- (3) Condition (1) is satisfied because condition (2) is satisfied; that is, subjects are motivated to form the belief that their limb is healthy because they have the concurrent belief (suspicion) that it is significantly impaired and they are disturbed by this belief (suspicion).
If this analysis is correct, at least one case of delusion (e.g., anosognosia) involves doxastic conflict. The most controversial aspect of this analysis concerns condition (2). Is the belief that their limb is impaired truly available to people affected by anosognosia? One could argue that, given that they probably have a deficit in the discrepancy detector of the right hemisphere of the brain, they have no awareness of the impairment they deny (see also Hirstein 2005). But Levy’s reply is that availability comes in degrees. He suggests that, given that people with paralysis and anosognosia often avoid tasks that would require mobility when the costs of failure are high, and given that they can acknowledge some difficulties in movement (and say ‘I have arthritis’ or ‘My left arm has always been weaker’), it is plausible that they have some awareness of their impairment—although they may lack a fully formed and conscious belief about it.
Recently, the debate about the differences between delusion and self-deception has centred on whether delusions (just like ordinary instances of self-deception) are explicable from a folk-psychological point of view (Bortolotti and Mameli 2012; Murphy 2012, 2013). Murphy argues that we diagnose delusions when folk psychology runs out of resources for understanding what someone seems to report as a genuine belief. By contrast, self-deception does not challenge our folk-psychological generalisations, because we expect people’s beliefs to be influenced by their desires at least on some occasions (e.g., when stakes are high). Bortolotti and Mameli maintain that the gap between self-deception and motivated delusions (such as anosognosia) is narrower than the gap between self-deception and apparently non-motivated delusions, but even in the latter case folk psychology can account for the delusional belief as an explanation (irrational as it may be) of the delusional experience.
In sum, the views summarized here show that it can be very difficult to justify clear-cut distinctions between delusion and self-deception. It is diagnostically and scientifically useful to maintain a distinction between symptoms of conditions such as amnesia, dementia, or schizophrenia, and the irrational beliefs that characterize normal cognition, but one should acknowledge that there are also many elements of genuine overlap.
Bibliography
- Aimola Davies, A.M. and Davies, M., 2009. “Explaining pathologies of belief,” in M.R. Broome and L. Bortolotti (eds.) Psychiatry as Cognitive Neuroscience: Philosophical Perspectives, Oxford: Oxford University Press, 285–326.
- American Psychiatric Association, 2000. Diagnostic and Statistical Manual of Mental Disorders, Fourth edition, Text Revision (DSM-IV-TR).
- American Psychiatric Association, 2013. Diagnostic and Statistical Manual of Mental Disorders, Fifth edition (DSM-5).
- Ames, D., 1984. “Self shooting of a phantom head,” British Journal of Psychiatry, 145: 193–194.
- Bayne, T. and Fernández, J. (eds.), 2009. Delusion and Self-deception: Affective and Motivational Influences on Belief Formation, Hove: Psychology Press.
- Bayne, T. and Pacherie E., 2004a. “Bottom up or top down?,” Philosophy, Psychiatry, & Psychology, 11 (1): 1–11.
- –––, 2004b. “Experience, belief, and the interpretive fold,” Philosophy, Psychiatry, & Psychology, 11 (1): 81–86.
- –––, 2005. “In defence of the doxastic conception of delusion,” Mind & Language, 20 (2): 163–188.
- Bayne, T. and Hattiangadi, A., 2013. “Belief and its bedfellows,” in N. Nottelmann (ed.) New Essays on Belief: Constitution, Content and Structure, London: Palgrave, chapter 6.
- Bell, V., Halligan, P. and Ellis, H., 2003. “Beliefs about delusions,” The Psychologist, 16 (8): 418–423.
- –––, 2006. “Explaining delusions: a cognitive perspective,” Trends in Cognitive Science, 10 (5): 219–226.
- Bentall, R., 2003. “The paranoid self,” in T. Kircher and A. David (eds.) The Self in Neuroscience and Psychiatry, Cambridge: Cambridge University Press, 293–318.
- Bentall, R. P., Corcoran, R., Howard, R., Blackwood, N., and Kinderman, P., 2001. “Persecutory delusions: A review and theoretical integration,” Clinical Psychology Review, 21: 1143–1192.
- Bentall, R., and Kaney, S., 1996. “Abnormalities of self-representation and persecutory delusions: a test of a cognitive model of paranoia,” Psychological Medicine, 26: 1231–1237.
- Bermúdez, J., 2001. “Normativity and rationality in delusional psychiatric disorders,” Mind & Language, 16 (5): 457–493.
- Berrios, G. E., 1991. “Delusions as ‘wrong beliefs’: a conceptual history,” British Journal of Psychiatry, 159 (suppl. 14): 6–13.
- Berrios, G.E. and Luque, R., 1995a. “Cotard’s syndrome: analysis of 100 cases,” Acta Psychiatrica Scandinavica, 91: 185–188.
- –––, 1995b. “Cotard’s delusion or syndrome: a conceptual history,” Comprehensive Psychiatry, 35 (3): 218–223.
- Blaney, P.H., 1999. “Paranoid conditions,” in T. Millon, P.H. Blaney, and R. D. Davis (eds.) Oxford Textbook of Psychopathology, Oxford: Oxford University Press, chapter 13.
- Bleuler, E., 1950. Dementia Praecox, or the Group of Schizophrenias, New York: International Universities Press.
- Bockes, Z., 1985. “First Person Account: ‘Freedom’ Means Knowing You Have a Choice”, Schizophrenia Bulletin, 11(3): 487–489.
- Bolton, D., 2008. What is Mental Disorder?, Oxford: Oxford University Press.
- Bolton, D., and Hill, J., 2003. Mind, Meaning and Mental Disorder, Oxford: Oxford University Press.
- Bortolotti, L., 2005. “Delusions and the background of rationality,” Mind & Language, 20 (2): 189–208.
- –––, 2009. Delusions and Other Irrational Beliefs, Oxford: Oxford University Press.
- –––, 2012. “In defence of modest doxasticism about delusions,” Neuroethics, 5 (1): 39–53.
- Bortolotti, L. and Broome, M.R., 2007. “If you didn’t care, you wouldn’t notice: recognition and estrangement in psychopathology,” Philosophy, Psychiatry, & Psychology, 14 (1): 39–42.
- –––, 2008. “Delusional beliefs and reason giving,” Philosophical Psychology, 21 (3): 1–21.
- –––, 2009. “A role for ownership and authorship of thoughts in the analysis of thought insertion,” Phenomenology and the Cognitive Sciences, 8(2): 205–224.
- –––., 2012. “Affective dimensions of the phenomenon of double bookkeeping in delusions,” Emotion Review, 4 (2): 187–191.
- Bortolotti, L. and Mameli, M., 2012. “Self-deception, delusion and the boundaries of folk-psychology,” Humana.Mente Journal of Philosophical Studies, 20: 203–221.
- Breen, N., Caine, D., Coltheart, M., Hendy, J. and Roberts, C., 2000. “Towards an understanding of delusions of misidentification: four case studies,” in M. Coltheart and M. Davies (eds.) Pathologies of Belief, Oxford: Blackwell, 74–110.
- Brett-Jones, J., Garety, P. and Hemsley, D., 1987. “Measuring delusional experiences: a method and its application,” British Journal of Clinical Psychology, 26: 257–277.
- Broome, M.R., 2004. “Rationality in psychosis and understanding the deluded,” Philosophy, Psychiatry, & Psychology, 11(1): 35–41.
- Broome M.R., Bortolotti, L. and Mameli, M., 2010. “Moral responsibility and mental illness: a case study,” Cambridge Quarterly of Healthcare Ethics, 19 (2): 179–187.
- Broome M.R., Johns L. C., Valli I., Woolley J.B., Tabraham, P., Valmaggia, L., Peters, E., Garety, P., and McGuire, P., 2007. “Delusion formation and reasoning biases in those at clinical high risk for psychosis,” British Journal of Psychiatry, 191: s38–42
- Butler, P., 2000. “Reverse Othello syndrome subsequent to traumatic brain injury,” Psychiatry: interpersonal and biological processes, 63: 85–92.
- Campbell, J., 1999. “Schizophrenia, the space of reasons and thinking as a motor process,” The Monist, 82 (4): 609–625.
- –––, 2001. “Rationality, meaning and the analysis of delusion,” Philosophy, Psychiatry, & Psychology, 8 (2–3): 89–100.
- –––, 2002. “The ownership of thoughts,” Philosophy, Psychiatry & Psychology, 9 (1): 35–39.
- –––, 2009. “What does rationality have to do with psychological causation? Propositional attitudes as mechanisms and as control variables,” in M. Broome and L. Bortolotti (eds.) Psychiatry as Cognitive Neuroscience: Philosophical Perspectives, Oxford: Oxford University Press, 137–150.
- Colbert, S.M., Peters, E.K., and Garety, P.A., 2006. “Need for closure and anxiety in delusions: A longitudinal investigation in early psychosis,” Behavior Research and Therapy, 44 (10): 1385–1396.
- Coltheart, M., 2005a. “Delusional belief,” Australian Journal of Psychology, 57: 72–6.
- –––, 2005b. “Conscious experience and delusional belief,” Philosophy, Psychiatry & Psychology, 12 (2): 153–157.
- –––, 2007. “Cognitive neuropsychiatry and delusional belief” (The 33rd Sir Frederick Bartlett Lecture), The Quarterly Journal of Experimental Psychology, 60 (8): 1041–1062.
- Coltheart, M., Langdon, R. and McKay, R., 2007. “Schizophrenia and monothematic delusions,” Schizophrenia Bulletin, 33 (3): 642–647.
- Coltheart, M., Menzies, P. and Sutton, J., 2010. “Abductive inference and delusional belief,” Cognitive Neuropsychiatry, 15 (1): 261–287.
- Cooper, R., 2007. Psychiatry and Philosophy of Science, London: Acumen.
- Corlett, P., Murray, G.K., Honey, G.D., Aitken, M.R.F., Shanks, D.R., Robbins, T.W., Bullmore, E.T., Dickinson, A. and Fletcher, P.C., 2007. “Disrupted prediction-error signal in psychosis: evidence for an associative account of delusions,” Brain, 130 (9): 2387–2400.
- Corlett, P., Taylor J., Wang X., Fletcher P., Krystal J., 2010. “Toward a neurobiology of delusion,” Progress in Neurobiology, 92 (3): 345–369.
- Currie, G., 2000. “Imagination, delusion and hallucinations,” in M. Coltheart and M. Davies (eds.) Pathologies of Belief, Oxford: Blackwell, 167–182.
- Currie, G. and Jureidini, J., 2001. “Delusions, rationality, empathy,” Philosophy, Psychiatry and Psychology, 8 (2–3): 159–162.
- Currie, G. and Ravenscroft, I., 2002. Recreative Minds: Imagination in Philosophy and Psychology, Oxford: Oxford University Press.
- Davidson, D., 1982. “Paradoxes of irrationality,” in R. Wollheim (ed.) Philosophical essays on Freud, Cambridge University Press, 289–305. Reprinted in D. Davidson (2004) Problems of Irrationality, Oxford: Clarendon Press, 169–188.
- –––, 1985a. “Incoherence and Irrationality,” Dialectica, 39: 345–354. Reprinted in D. Davidson, 2004, Problems of Irrationality, Oxford: Clarendon Press, 189–198.
- –––, 1985b. “Deception and Division,” in J. Elster (ed.), The Multiple Self, Cambridge: Cambridge University Press. Reprinted in D. Davidson, 2004, Problems of Irrationality, Oxford: Clarendon Press, 199–212.
- Davies, M., 2008. “Delusion and motivationally biased belief: self deception in the two factor framework,” in T. Bayne and J. Fernàndez (eds.), Delusion and Self deception: Affective and Motivational Influences on Belief Formation, Hove: Psychology Press, 71–86.
- Davies, M. and Coltheart, M., 2000. “Introduction,” in M. Coltheart and M. Davies (eds.), Pathologies of Belief, Oxford: Blackwell, 1–46.
- Davies, M., Coltheart, M., Langdon, R. and Breen, N., 2001. “Monothematic delusions: Towards a two- factor account,” Philosophy, Psychiatry and Psychology, 8 (2/3): 133–158.
- Davies, M. and Egan, A., 2013. “Delusion: Cognitive approaches. Bayesian inference and compartmentalisation,” in K.W.M. Fulford et al. (eds.), The Oxford Handbook of Philosophy and Psychiatry, Oxford: Oxford University Press.
- Diez-Alegria, C., Vazquez, C., Nieto-Moreno, M., Valiente, C., and Fuentenebro, F., 2006. “Personalizing and externalizing biases in deluded and depressed patients: Are attributional biases a stable and specific characteristic of delusions?,” British Journal of Clinical Psychology, 45 (4): 531–544.
- Dudley, R.E. and Over, D., 2003. “People with delusions jump to conclusions,” Clinical Psychology and Psychotherapy, 10: 263–274.
- Egan, A., 2009. “Imagination, delusion, and self-deception,” in T. Bayne and J. Fernandez (eds.) Delusions, Self-Deception, and Affective Influences on Belief-formation, Hove: Psychology Press, 263–280.
- Eilan, N., 2000. “On understanding schizophrenia,” in D. Zahavi (ed.) Exploring the self, Amsterdam: John Benjamins, 97–113.
- Elkin, G.D., 1999. Introduction to Clinical Psychiatry, Maidenhead: McGraw-Hill Professional.
- Ellis, H., 1998. “Cognitive neuropsychiatry and delusional misidentification syndromes: an exemplary vindication of a new discipline,” Cognitive Neuropsychiatry, 3 (2): 81–89.
- Fernández, J., 2010. “Thought insertion and self-knowledge,” Mind & Language, 25 (1): 66–88.
- Fine, C., Craigie, J. and Gold, I., 2005. “Damned if you do, damned if you don’t: The impasse in cognitive accounts of the Capgras delusion,” Philosophy, Psychiatry & Psychology, 12: 143–151.
- Fine, C., Gardner, M., Craigie, J., Gold, I., 2007. “Hopping, skipping or jumping to conclusions? Clarifying the role of the JTC bias in delusions,” Cognitive Neuropsychiatry, 12 (1): 46–77.
- Fletcher, P. C. and C. D. Frith, 2009. “Perceiving is believing: a Bayesian approach to explaining the positive symptoms of schizophrenia,” Nature Reviews Neuroscience 10 (1): 48–58.
- Fowler, D., Garety, P. and Kuipers, E., 1995. Cognitive Behavior Therapy for Psychosis: Theory and Practice, Chichester: Wiley.
- Frankish, K., 2009. “Delusions: a two-level framework,” in M.R. Broome and L. Bortolotti (eds.) Psychiatry as Cognitive Neuroscience: Philosophical Perspectives, Oxford: Oxford University Press, 269–284.
- Freeman, D., 2008. “The assessment of persecutory ideation,” in D. Freeman, R. Bentall, and P. Garety (eds.) Persecutory Delusions. Assessment, Theory and Treatment, Oxford: Oxford University Press, 23–52.
- Freeman, D., Garety, P., Kuipers, E., Colbert, E., Jolley, S., Fowler, D., Dunn, G. and Bebbington, P., 2006. “Delusions and decision-making style: Use of the Need for Closure Scale,” Behavior Research and Therapy, 44 (8): 1147–1158.
- Freud, S., 1917. Delusion and Dream, New York: Moffat Yard.
- Frith, C., 1992. The Cognitive Neuropsychology of Schizophrenia, Hove: Psychology Press.
- Fulford, K.W.M., 1998. “Completing Kraepelin’s psychopathology: Insight, delusion and the phenomenology of illness,” in X. F. Amador and A. David (eds.) Insight and Psychosis, Oxford: Oxford University Press: 47–65.
- –––, 1993. “Mental illness and the mind-brain problem: delusion, belief and Searle’s theory of intentionality,” Theoretical Medicine and Bioethics, 14 (2): 181–194.
- –––, 2004. “Neuro-Ethics or Neuro-Values? Delusion and religious experience as a case study in values-based medicine,” Poiesis and Praxis, 2 (4): 297–313.
- Fulford, K.W.M., Thornton, T., Graham, G., 2006. Oxford Textbook of Philosophy and Psychiatry, Oxford: Oxford University Press.
- Gallagher, S., 2003. “Self-narratives in schizophrenia,” in T. Kircher and A. David (eds.) The Self in Neuroscience and Psychiatry, Cambridge: Cambridge University Press, 336–357.
- –––, 2004. “Neurocognitive models of schizophrenia: A neurophenomenological critique,” Psychopathology, 37: 8–19.
- –––, 2009. “Delusional realities,” in M. R. Broome and L. Bortolotti (eds.) Psychiatry as Cognitive Neuroscience: Philosophical Perspectives, Oxford: Oxford University Press, 245–268.
- Garety, P., 1991. “Reasoning and delusions,” British Journal of Psychiatry, 159: 14–18.
- Garety, P. A. and Freeman, D., 1999. “Cognitive approaches to delusions: A critical review of theories and evidence,” British Journal of Clinical Psychology, 38: 113–154.
- Garety, P. and Hemsley, D., 1987. “Characteristics of delusional experience,” European Archives of Psychiatry and Neurological Sciences, 236: 294–298.
- –––, 1997. Delusions: Investigations into the Psychology of Delusional Reasoning, Hove: Psychology Press.
- Garety, P.A, Kuipers, E., Fowler, D.G, Freeman, D. and Bebbington, P.E., 2001. “A cognitive model of the positive symptoms of psychosis,” Psychological Medicine, 31: 189–195.
- Gazzaniga, M., 1985. The Social Brain, New York: Basic Books.
- Gerrans, P., 2000. “Refining the explanation of the Cotard delusion,” Mind & Language, 15 (1): 111–122.
- –––, 2001. “Delusions as performance failures,” Cognitive Neuropsychiatry, 6 (3): 161–173.
- –––, 2002a. “A one-stage explanation of the Cotard delusion,” Philosophy, Psychiatry, & Psychology, 9 (1): 47–53.
- –––, 2002b. “Multiple paths to delusion,” Philosophy, Psychiatry, & Psychology, 9 (1): 65–72.
- –––, 2009. “Mad scientists or unreliable autobiographers? Dopamine dysregulation and delusion,” in M.R. Broome and L. Bortolotti (eds.) Psychiatry as Cognitive Neuroscience: Philosophical Perspectives, Oxford: Oxford University Press, 151–172.
- –––, 2013. “Delusional Attitudes and Default Thinking,” Mind & Language, 28 (1): 83–102.
- Gilleen, J. and David, A.S., 2000. “The cognitive neuropsychiatry of delusions: from psychopathology to neuropsychology and back again,” Psychological Medicine, 35 (1): 5–12.
- Gold, I. and Hohwy, J., 2000. “Rationality and schizophrenic delusion,” Mind & Language, 15 (1): 146–167.
- Graham, G. and Stephens, G.L., 1994. “Mind and mine,” in G. Graham and G. Stephens (eds.), Philosophical Psychology, Cambridge, MA: MIT Press, 91–109.
- Hacking, I., 1995. Rewriting the Soul: Multiple Personality and the Sciences of Memory, Princeton: Princeton University Press.
- Haggard, P., Cartledge, P., Dafydd, M., and Oakley, D. A., 2004. “Anomalous control: When ‘free will’ is not conscious,” Consciousness and Cognition, 13: 646–654.
- Halligan, P. and Marshall, J., 1996. “The wise prophet makes sure of the event first: Hallucinations, amnesia, and delusions,” in P. Halligan and J. Marshall (eds.) Method in Madness, Hove: Psychology Press, 235–266.
- Halligan, P. and Marshall, J. (eds.), 1996. Method in Madness: Case Studies in Cognitive Neuropsychiatry, Hove: Psychology Press.
- Hamilton, A., 2007. “Against the belief model of delusion,” in M.C. Chung, K.W.M. Fulford, and G. Graham (eds.) Reconceiving Schizophrenia, Oxford: Oxford University Press, 217–234.
- Hirstein, W., 2005. Brain Fiction: Self deception and the Riddle of Confabulation, Cambridge, MA: MIT Press.
- Hohwy, J., 2004. “Top-down and bottom-up in delusion formation,” Philosophy, Psychiatry, & Psychology, 11 (1): 65–70.
- –––, 2007. “The sense of self in the phenomenology of agency and perception,” Psyche, 13 (1).
- –––, 2013. “Delusions, Illusions, and Inference under Uncertainty,” Mind & Language, 28 (1): 57–71.
- –––, 2013. The Predictive Mind, Oxford and New York: Oxford University Press.
- Hohwy, J. and Rajan, V., 2012. “Delusions as forensically disturbing perceptual inferences,” Neuroethics, 5 (1): 5–11.
- Hohwy, J. and Rosenberg, R., 2005. “Unusual experiences, reality testing and delusions of alien control,” Mind & Language, 20 (2): 141–162.
- Humphreys, N. and Dennett, D., 1989. “Speaking for our selves: An assessment of multiple personality disorder,” Raritan, 9 (1): 68–98.
- Huq, S., Garety, P., Hemsley, D., 1988. “Probabilistic judgements in deluded and non-deluded subjects,” Quarterly Journal of Experimental Psychology, 40(A): 801–812.
- Jaspers, K., 1963. General Psychopathology, J. Hoenig and M. Hamilton (trans.), Manchester: Manchester University Press.
- Jeannerod M. and Pacherie E., 2004. “Agency, simulation and self-identification,” Mind & Language, 19 (2): 113–146.
- Johns, L.C., van Os, J., 2001. “The continuity of psychotic experiences in the general population,” Clinical Psychology Review, 21 (8): 1125–1141.
- Johnstone, E., Cooling, N., Frith, C., Crow, T., Owens, D., 1988. “Phenomenology of organic and functional psychoses and the overlap between them,” British Journal of Psychiatry, 153: 770–776.
- Jordan, H.W., Lockert, E.W., Johnson-Warren, M., Cabell, C., Cooke, T., Greer, W., and Howe, G., 2006. “Erotomania revisited: Thirty-four years later,” Journal of the National Medical Association, 98 (5): 787–793.
- Kapur, S., 2003. “Psychosis as a state of aberrant salience: a framework for linking biology, phenomenology and pharmacology in schizophrenia,” American Journal of Psychiatry, 160: 13–23.
- –––, 2004. “How antipsychotics become anti-‘psychotic’ – from dopamine to salience to psychosis,” Trends in Pharmacological Sciences, 25: 402–406.
- Kemp, R., Chua, S., McKenna, P. and David, A., 1997. “Reasoning and delusions,” British Journal of Psychiatry, 170: 398–405.
- Kennett, J. and Matthews, S., 2003. “Delusion, dissociation and identity,” Philosophical Explorations, 6: 31–49.
- –––, 2009. “Mental time travel, agency and responsibility,” in M.R. Broome and L. Bortolotti (eds.) Psychiatry as Cognitive Neuroscience: Philosophical Perspectives, Oxford: Oxford University Press, 327–350.
- Kircher, T. and David, A. (eds.), 2003. The Self in Neuroscience and Psychiatry, Cambridge: Cambridge University Press.
- Krafft-Ebing, R. von, 2007. Textbook of Insanity: Based on Clinical Observations for Practitioners and Students of Medicine, C. Chaddock (trans.), Whitefish: Kessinger Publishing. Originally published in 1879 as: Lehrbuch der Psychiatrie auf klinischer Grundlage für practische Ärzte und Studirende, Stuttgart: F. Enke.
- Kruglanski, A.W., 1989. Lay Epistemics and Human Knowledge: Cognitive and Motivational Bases, New York: Plenum.
- Langdon, R. and Coltheart, M., 2000. “The cognitive neuropsychology of delusions,” in M. Coltheart and M. Davies (eds.) Pathologies of Belief, Oxford: Blackwell, 183–216.
- Langdon, R., Ward, P. and Coltheart, M., 2008. “Reasoning anomalies associated with delusions in schizophrenia,” Schizophrenia Bulletin, doi: 10.1093/schbul/sbn069 [Available online].
- Leeser, J., and O’Donohue, W., 1999. “What is a delusion? Epistemological dimensions,” Journal of Abnormal Psychology, 108(4): 687–694.
- Levy, N., 2008. “Self deception without thought experiments,” in T. Bayne and J. Fernández (eds.) Delusions and Self deception: Affective and Motivational Influences on Belief-Formation, Hove: Psychology Press, 227–242.
- Lucchelli, F. and Spinnler, H., 2007. “The case of lost Wilma: a clinical report of Capgras delusion,” Neurological Science, 28 (4): 188–195.
- Maher, B.A., 1974. “Delusional thinking and perceptual disorder,” Journal of Individual Psychology, 30: 98–113.
- –––, 1988. “Anomalous experience and delusional thinking: The logic of explanations,” in T.F. Oltmann and B.A. Maher (eds.) Delusional Beliefs, New York: Wiley, 15–33.
- –––, 1999. “Anomalous experience in everyday life: Its significance for psychopathology,” The Monist, 82: 547–70.
- –––, 2003. “Schizophrenia, aberrant utterance and delusions of control: The disconnection of speech and thought, and the connection of experience and belief,” Mind & Language, 18: 1–22.
- Marshall, J. and Halligan, P., 1996. “Introduction,” in P. Halligan and J. Marshall (eds.) Method in Madness, Hove: Psychology Press, 3–11.
- McAllister-Williams, R., 1997. “The description of primary delusions: confusions in standard texts and among clinicians,” Psychiatric Bulletin, 21: 346–349.
- McKay, R., 2012. “Delusional inference,” Mind & Language, 27 (3): 330–355.
- McKay, R. and Cipolotti, L., 2007. “Attributional styles in a case of Cotard delusion,” Consciousness and Cognition, 16: 349–359.
- McKay, R., Langdon, R. and Coltheart, M., 2007. “Models of misbelief: Integrating motivational and deficit theories of delusions,” Consciousness and Cognition, 16: 932–941.
- –––, 2005a. “Sleights of mind: Delusions, defences, and self deception,” Cognitive Neuropsychiatry, 10: 305–326.
- –––, 2005b. “Paranoia, persecutory delusions and attributional biases,” Psychiatry Research, 136 (2–3): 233–245.
- Mele, A., 2001. Self Deception Unmasked, Princeton: Princeton University Press.
- –––, 2007. “Self deception and three psychiatric delusions,” in M. Timmons, J. Greco, A. Mele (eds.) Rationality and the Good, Oxford: Oxford University Press: 163–176.
- –––, 2008. “Self deception and delusions,” in T. Bayne and J. Fernández (eds.) Delusions and Self deception: Affective and Motivational Influences on Belief-Formation, Hove: Psychology Press, 55–70.
- Metcalf, K., Langdon, R. and Coltheart, M., 2007. “Models of confabulation: A critical review and a new framework,” Cognitive Neuropsychology, 24 (1): 23–47.
- Miller, E. and Karoni, P., 1996. “The cognitive psychology of delusions: a review,” Applied Cognitive Psychology, 10: 487–502.
- Moritz, S. and Woodward, T. S., 2005. “Jumping to delusions in delusional and non-delusional schizophrenic patients,” British Journal of Clinical Psychology, 44: 193–207.
- Mullen, R., 2003. “Delusions: the continuum versus category debate,” Australian and New Zealand Journal of Psychiatry, 37: 505–511.
- Murphy, D., 2006. Psychiatry in the Scientific Image, Cambridge, MA: MIT Press.
- –––, 2012. “The folk epistemology of delusions,” Neuroethics, 5 (1): 19–22.
- –––, 2013. “Delusions, Modernist Epistemology and Irrational Belief,” Mind & Language, 28 (1): 113–124.
- Nisbett, R.E. and Ross, L., 1980. Human Inference: Strategies and Shortcomings in Social Judgment, Englewood Cliffs: Prentice Hall.
- Nisbett, R.E. and Wilson, T.D., 1977. “Telling more than we can know: Verbal reports on mental processes,” Psychological Review, 84 (3): 231–259.
- Oyebode, F. and Sims, A., 2008. Sims’ Symptoms in the Mind: An Introduction to Descriptive Psychopathology, London: Elsevier.
- Pacherie, E., Green, M. and Bayne, T., 2006. “Phenomenology and delusions: Who put the ‘alien’ in alien control?,” Consciousness and Cognition, 15: 566–577.
- Parnas, J. and Handest, P., 2003. “Phenomenology of self-experience in early schizophrenia,” Comprehensive Psychiatry, 44 (2): 121–134.
- Phillips, J., 2003. “Psychopathology and the narrative self,” Philosophy, Psychiatry & Psychology, 10 (4): 313–328.
- Radden, J., 2004. The Philosophy of Psychiatry, Oxford: Oxford University Press.
- –––, 2010. On Delusion, Abingdon and New York: Routledge.
- Ramachandran, V.S., 1996. “The evolutionary biology of self deception, laughter, dreaming and depression: some clues from anosognosia,” Medical Hypotheses, 47: 347–362.
- Ramachandran, V.S. and Blakeslee, S., 1998. Phantoms in the Brain: Human Nature and the Architecture of the Mind, London: Fourth Estate.
- Ratcliffe, M., 2004. “Interpreting delusions,” Phenomenology and Cognitive Sciences, 3: 25–48.
- Rhodes, J. and Gipps, R., 2008. “Delusions, certainty, and the background,” Philosophy, Psychiatry, & Psychology, 15 (4): 295–310.
- Samuels, R., 2009. “Delusions as a natural kind,” in M.R. Broome and L. Bortolotti (eds.) Psychiatry as Cognitive Neuroscience: Philosophical Perspectives, Oxford: Oxford University Press, 49–82.
- Sass, L., 1994. The Paradoxes of Delusion: Wittgenstein, Schreber, and the Schizophrenic Mind, Ithaca: Cornell University Press.
- –––, 2001. “Self and world in schizophrenia: Three classic approaches,” Philosophy, Psychiatry, & Psychology, 8 (4): 251–270.
- –––, 2004. “Some reflections on the (analytic) philosophical approach to delusion,” Philosophy, Psychiatry, & Psychology, 11(1): 71–80.
- Schwitzgebel, E., 2012. “Mad belief,” Neuroethics, 5 (1): 13–17.
- Sims, A., 2003. Symptoms in the Mind, London: Elsevier.
- Stephens, G. L. and Graham, G., 2000. When Self-Consciousness Breaks: Alien Voices and Inserted Thoughts, Cambridge, MA: MIT Press.
- –––, 2004. “Reconceiving delusions,” International Review of Psychiatry, 16(3): 236–241.
- –––, 2006. “The delusional stance,” in M. Cheung Chung, W. Fulford, G. Graham (eds.) Reconceiving Schizophrenia, Oxford: Oxford University Press, 193–216.
- Stone, T. and Young, A.W., 1997. “Delusions and brain injury: the philosophy and psychology of belief,” Mind & Language, 12: 327–364.
- Thornton, T., 2008. “Why the idea of framework propositions cannot contribute to an understanding of delusion,” Phenomenology and the Cognitive Sciences, 7 (2): 159–175.
- Vosgerau, G. and Newen, A., 2007. “Thoughts, motor actions, and the self,” Mind & Language, 22 (1): 22–43.
- Young, A. W., and de Pauw, K. W., 2002. “One stage is not enough,” Philosophy, Psychiatry, & Psychology, 9(1): 55–59.
- Young, A., Reid, I., Wright, S., Hellawell, D.J., 1993. “Face-processing impairments and the Capgras delusion,” British Journal of Psychiatry, 162: 695–698.
- Young, A.W., 2000. “Wondrous strange: The neuropsychology of abnormal beliefs,” in M. Coltheart and M. Davies (eds.) Pathologies of Belief, Oxford: Blackwell: 47–73.
- Young, A.W. and Leafhead, K., 1996. “Betwixt life and death: Case studies of the Cotard delusion,” in P. Halligan and J. Marshall (eds.) Method in Madness: Case Studies in Cognitive Neuropsychiatry, Hove: Psychology Press, chapter 8.
- Young, G., 2007. “Clarifying ‘Familiarity’: Examining differences in the phenomenal experiences of patients suffering from prosopagnosia and Capgras delusion,” Philosophy, Psychiatry, & Psychology, 14(1): 29–37.
- –––, 2008. “Capgras delusion: An interactionist model,” Consciousness and Cognition, 17: 863–876.
Other Internet Resources
- International Network for Philosophy and Psychiatry.
- Delusions, list of papers in PhilPapers.
- Blog on Imperfect Cognitions.