INTRODUCTION

A Brief History of Psychotomimesis and NMDA Blockade

Psychotomimetic drugs have a rich, though controversial, history in neuropsychopharmacology research. The term psychotomimetic was first used to describe the similarity between the subjective effects of lysergic acid diethylamide in healthy volunteers and the reports of patients with schizophrenia (Osmond, 1957). However, research with compounds that engender psychosis-like experiences has a much longer history; Moreau (1845/1973) documented the effects of marijuana intoxication during the 1840s, noting that different subjects reacted quite differently to identical doses. Knauer and Maloney (1913) emphasized the importance of set (the individual subject's physiology) and setting (the situation in which, and route by which, the compound is administered) in explaining the variability in subjects' responses to psychotogenic compounds such as mescaline. Beringer (1927) first used the term ‘artificial model of psychosis’; he emphasized the potential utility of testing subjects before, during, and after the intake of the agent, believing that by inducing psychosis in this way he could discern primary pathological mechanisms from secondary reactions to, and coping mechanisms for, such pathology. These themes, emerging early in the history of drug models of psychosis, will guide much of the current review. While we view the central ideas as generalizing across a number of different psychotomimetic drugs (see Corlett et al, 2009a, 2009b and Box 1 for review), we focus here on drugs whose mode of action involves noncompetitive NMDA receptor antagonism.

We aim to outline a model within which to understand how the administration of psychotomimetic compounds can induce changes in subjective experience that are redolent of psychosis. There is a wealth of work on the detailed physiological mechanisms through which psychotomimetics affect neural function (Aghajanian, 2009). We aim to build upon that work by outlining a set of principles that may allow us to extrapolate from perturbed synaptic function to disordered subjective experience. Although we focus on the cognitive aspects of psychotomimetic effects, we believe that the Bayesian model that we outline may well provide a tool for translational investigations of the psychotic symptoms of schizophrenia (Corlett et al, 2010b).

NMDA receptor antagonists and human subjects

Phencyclidine (or PCP) and ketamine bind noncompetitively to NMDA receptors (Anis et al, 1983). At higher doses, these NMDA receptor antagonists produce a dissociative anesthesia: a profound analgesia coupled with a somnolent state in which the patient does not appear to be asleep or anesthetized but rather disconnected from his surroundings (Corssen et al, 1968). However, on emergence from anesthesia, subjects treated with PCP experienced psychotomimetic side effects (Luby et al, 1959), which resembled the effects of sensory deprivation (Meyer et al, 1959). These include ‘feelings of anxiety, depression or fear together with a difficulty in thinking and concentration. In higher doses illusional, delusional and hallucinatory experience and a sensation of displacement commonly occur’; for example, subjects administered PCP suffered persecutory thoughts and illusory experiences of objects, as well as of their own bodies. PCP also induced negative symptoms, including subjective thought block, anhedonia, catatonia, and waxy flexibility; ‘his limbs would persist if placed in abnormal postures for several minutes’. These states were postoperative, following the use of PCP as an anesthetic, and they were transient, lasting 12–72 h (Meyer et al, 1959).

Healthy subjects administered a low subanesthetic dose of the drug experienced disturbances in body image and a difficulty distinguishing self from nonself. Their thinking was disorganized, manifest as concreteness in proverb interpretation. Subjects were apathetic: they were able to recall hearing questions and comprehending instructions, but felt no special compulsion to respond or comply with requests. They also described a peculiar state (described by Luby et al as hypnagogic), reporting feeling as though they were in some specific setting from their personal past (for example, a classroom) that they were able to describe in detail (eg, recognizing an old school friend); a phenomenon reminiscent of the rare reports of confabulation in head-injured patients with frontostriatal damage (Schnider, 2001).

Ketamine was first synthesized by Parke-Davis in 1963 (Jansen, 2004) and first administered to a human subject on 3 August 1964 (Domino, 1984). Then named CI-581, it was found to have dissociative anesthetic properties but also induced emergence reactions including ‘marked changes in mood and affect’, which varied across subjects, as well as ‘frank hallucinatory episodes’. These effects usually subsided within 30 min (Domino et al, 1965).

Unlike a number of other drugs used to model schizophrenia, NMDA receptor blockade can induce both positive and negative symptoms and therefore provides a more complete model of psychosis (Javitt and Zukin, 1991). These observations provided the foundation for the NMDA receptor hypofunction model of schizophrenia (Abi-Saab et al, 1998; Javitt and Zukin, 1991; Olney and Farber, 1995; Sharp et al, 2001; Tsai and Coyle, 2002). However, while the negative symptoms induced by the drug are related to its NMDA-binding capacity, the positive symptoms are not (Stone et al, 2008), suggesting some non-NMDA component to the generation of positive symptoms. This may be mediated by postsynaptic AMPA receptors (Jackson et al, 2004; Moghaddam et al, 1997). Despite modeling both the positive and negative symptoms associated with schizophreniform psychosis, the acute ketamine model is not without its detractors; see Box 2 for discussion.

Acute ketamine administration

Since its discovery, many studies have used acute, subanesthetic ketamine administration to induce psychotic symptoms (Bowdle et al, 1998; Ghoneim et al, 1985; Krystal et al, 1994; Oye et al, 1992), as well as cognitive deficits redolent of schizophrenia (eg, Honey et al, 2005a, 2005b, 2005c; Krystal et al, 1994, 2000; Malhotra et al, 1996; Morgan et al, 2004a; Newcomer et al, 1999). The dose and pattern of ketamine delivery vary across studies, although the intravenous route is used almost exclusively. Single bolus injections of ketamine lead to rapid changes in subjective experience that decline as the drug is metabolized and excreted (Bowdle et al, 1998; Oye et al, 1992). In order to prolong the effects of the drug and allow for more rigorous characterization of psychotic symptoms, bolus injections can be followed by a maintenance dose (Bowdle et al, 1998; Breier et al, 1997; Krystal et al, 1994, 1998; Malhotra et al, 1996; Newcomer et al, 1999; Vollenweider et al, 1997a, 1997b, 2000). Maintenance doses are often applied using computer-controlled pumps to deliver ketamine at a particular rate (Honey et al, 2003, 2004, 2005a, 2005b, 2005c). The infusion schemes commonly employed can be summarized as BET; that is, a bolus (B), followed by a constant rate infusion to compensate for drug elimination (E), and an exponentially decreasing infusion to compensate for drug distribution or transfer (T) between the body's compartments (Kruger-Thiemer, 1968). The pharmacokinetic parameters for these computerized pumps came from early anesthetic studies by Domino et al (1982) (Keefe et al, 1999). Because these models were compartment based and were parameterized on volunteer subjects who were amputees (ie, their compartment volumes were significantly smaller than those of nonamputee volunteers), their predictions were often inaccurate (Absalom et al, 2007).
Furthermore, because the pharmacokinetic and pharmacodynamic parameters were derived from higher anesthetic doses, there were inaccuracies and inconsistencies at the lower subanesthetic doses favored in psychotomimetic ketamine research. Newer models are available with a shorter time to peak concentration and a more accurate maintenance of ketamine plasma levels (Absalom et al, 2007).
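To make the BET scheme concrete, the following sketch computes the bolus and the time-varying infusion rate required to hold a target plasma concentration in a simple two-compartment model. All parameter values are hypothetical placeholders for illustration, not the Domino et al or Absalom et al estimates.

```python
import math

def bet_infusion(cp_target, v1, k10, k12, k21, t):
    """BET scheme for a two-compartment model (hypothetical parameters).

    cp_target: target plasma concentration (mg/l)
    v1: central compartment volume (l)
    k10, k12, k21: elimination and intercompartmental rate constants (1/min)
    t: time since the bolus (min)
    Returns (bolus_mg, infusion_rate_mg_per_min).
    """
    bolus = cp_target * v1                 # B: fill the central compartment
    r_elim = cp_target * v1 * k10          # E: constant rate replacing eliminated drug
    # T: exponentially decreasing rate replacing drug transferred to the periphery
    r_trans = cp_target * v1 * k12 * math.exp(-k21 * t)
    return bolus, r_elim + r_trans
```

At t = 0 the infusion runs fastest (elimination plus full transfer compensation); as the peripheral compartment fills, the rate decays toward the constant elimination term, which is the qualitative behavior the BET acronym describes.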

A further potential source of variability across studies is the ketamine itself. Ketamine has two enantiomers, (S)- and (R)-ketamine. Although the drug is commonly administered as a racemic mixture (a 50 : 50 mix of both enantiomers), the (S) enantiomer appears to be more psychotogenic than the (R). Furthermore, the two enantiomers have different effects on brain metabolism; (S)-ketamine induces hyperfrontality that correlates with the severity of psychotomimetic effects, whereas (R)-ketamine induces hypoactivation of posterior cortical regions (Vollenweider et al, 1997a). If different batches of ketamine were composed of different proportions of these enantiomers, then variability in responses would be observed. However, there is still significant interindividual variability in the ketamine response even within a single study using one batch of ketamine.

Exploring Individual Variability: Relating Tasks to Symptoms

Given that we can reliably control the level of drug to which a subject is exposed, it may be possible to use modern neuroscientific techniques to dissociate the individual variability in symptoms from the variations in dose received. Following Beringer (1927), we argue that characterization of the individual differences in the psychotomimetic effects of ketamine enables us to test cognitive and neural models of how those symptoms arise (Kosslyn et al, 2002; Underwood, 1975). We can challenge subjects with psychological tasks that engage a number of key cognitive processes; we can capture the neural circuitry engaged by those processes using functional imaging; and, subsequently, we can administer controlled doses of ketamine that induce the psychotomimetic effects of interest. With these data, we can relate subjects’ baseline neural and behavioral responses (in the absence of the drug and even when the participant has never before experienced the drug) to the psychopathology they experience; such an approach may be used to test hypotheses about the cognitive and neural bases of specific symptoms (Corlett et al, 2006; Honey et al, 2008; Umbricht et al, 2002) (Krystal et al, 2003).

Previous work employing this approach has found evidence in favor of such links for both positive (Corlett et al, 2006; Honey et al, 2008; Umbricht et al, 2002) and negative symptoms (Honey et al, 2008). These studies exploited the potential offered by functional neuroimaging for allowing observation of latent behavioral processes (Henson, 2006; Poldrack, 2006). For example, as we discuss in more detail below, we can use functional neuroimaging to provide an assay or brain marker for prediction error and thereby devise means of using variations in this marker as the basis for developing our understanding of the emergence of delusions.

A key concept, common to this and to the other examples below, is that these signals (and their variability across subjects) provide an opportunity for testing cognitive neuropsychiatric models of psychotic symptoms. This approach is centrally related to notions of efficiency and sensitivity in brain–behavior relationships during cognitive tasks. That is, for some tasks, more proficient subjects engage particular neural circuits to a lesser degree (Rypma and D’Esposito, 1999); for other tasks, greater activation in a set of regions may be associated with better task performance (Gray et al, 2003). These differences in relationship between magnitude of brain response and behavioral competence may reflect the engagement of different cognitive processes and neural circuits by specific task demands (Rypma and Prabhakaran, 2010). An appreciation of the relationship between magnitude of brain response and behavioral competence is critical in interpreting any correlation between brain responses and ketamine-induced symptoms. Thus, fMRI data that capture a particular process in intact individuals can be used to implicate that process in psychotic symptom generation by exploring the extent to which the presence or magnitude of the neural marker for the process predicts the severity of specific symptoms when those same individuals are administered a psychotomimetic drug. Combining cognitive neuroscience, psychopharmacology, and psychiatry in this way can enrich our understanding of the formation of psychotic symptoms; a disease process not readily amenable to empirical study (Corlett et al, 2007).

Exploring Individual Variability: Using Tasks to Predict Symptoms

Predicting delusions and perceptual aberrations

Beliefs emerge because we learn about associations in our environment. Formal learning theories relating to these associations appeal to prediction error, or surprise, as a driving force in learning and have been able to explain a number of key behavioral observations (Kamin, 1969; Rescorla and Wagner, 1972). However, prediction error remained a ‘latent’ process whose presence was inferred but not observed for many years. Neurophysiological recording from midbrain dopamine neurons during reward conditioning in nonhuman primates revealed a prediction error signal in these neurons (Montague et al, 1996; Schultz, 1998; Waelti et al, 2001). Subsequent neuroimaging studies in human subjects have found evidence for similar signals in the midbrain (D’Ardenne et al, 2008), the striatum (McClure et al, 2003; O’Doherty et al, 2003), and the frontal cortex (Corlett et al, 2004; Fletcher et al, 2001; Turner et al, 2004).
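The error-driven learning these theories describe can be sketched with the Rescorla-Wagner update, in which associative strength changes in proportion to the mismatch between the expected and the experienced outcome. The learning-rate and asymptote values below are illustrative, not fitted parameters.

```python
def rescorla_wagner(outcomes, alpha=0.3, lam=1.0):
    """Rescorla-Wagner learning for a single cue.

    Associative strength v is updated by a fraction (alpha) of the
    prediction error (target - v); outcomes is a sequence of 1s (outcome
    occurs) and 0s (outcome omitted). Returns final v and the error trace.
    """
    v = 0.0
    errors = []
    for present in outcomes:
        target = lam if present else 0.0
        delta = target - v        # prediction error: surprise drives learning
        v += alpha * delta
        errors.append(delta)
    return v, errors
```

Over repeated reinforced trials the error shrinks toward zero and the association asymptotes; it is this declining, expectancy-dependent error signal that the midbrain recordings cited above resemble.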

Thus, the prediction error signal in the brain can be estimated even though it does not have any explicit behavioral correlates. This has proven useful in exploring the brain basis for the anomalous perceptions and beliefs that occur in association with acute ketamine administration. With respect to positive symptoms, sensitivity to prediction error within right dorsolateral prefrontal cortex was predictive of the perceptual aberrations and delusional ideation suffered on a high dose of ketamine across subjects; those subjects with a larger prediction error response to events that violated their expectancies were more likely to suffer perceptual aberrations, attentional capture, and referential delusions under ketamine (Corlett et al, 2006); see Figure 1. Furthermore, a low subpsychotic dose of ketamine engendered aberrant prediction error responding during causal learning and inference in the midbrain, hippocampus, striatum, and DLPFC, which showed a trend toward predicting delusions and perceptual aberrations (Corlett et al, 2006).

Figure 1

Baseline neural responses predict the psychotogenic effects of ketamine. (a) The N-Back working memory task, the circuitry it engages, and the relationship between thalamic responses during working memory performance under placebo and ketamine-induced negative symptoms. (b) CPT, the circuitry it engages and the relationship between inferior frontal gyrus responses to the task under placebo and ketamine-induced negative symptoms. (c) Sentence Completion Task, the circuitry it engages, and the relationship between task-induced responses in middle temporal gyrus acquired under placebo and ketamine-induced thought disorder. (d) The Auditory Verbal Monitoring Task, the circuitry it engages, and the relationship between task-induced responses in inferior frontal gyrus captured under placebo and auditory illusory responses engendered by ketamine. (e) The Associative Causal Learning Task, the circuitry it engages, and the relationship between prediction error responses in right frontal cortex and perceptual aberrations induced by ketamine.


A physiological measure with relevance to sensory prediction and prediction error is mismatch negativity (MMN). The MMN is an electrocortical response to rare deviant auditory or visual stimuli embedded in a stream of identical stimuli (Cammann, 1990; Naatanen et al, 1978). Umbricht et al (2002) found that a weaker auditory MMN response at baseline (in the absence of ketamine) was predictive of more severe positive symptoms under ketamine. That is, those subjects who showed a reduced electrocortical response to rare deviant tones in a stream of predictable tones scored more highly on an aberrant experiences scale when they were administered ketamine. This appears at odds with our observation; however, the paradigm used to engage the MMN response involves a predictable pattern of tones with a deviant occurring once every ten tones. Paradigms in which deviant events are less predictable elicit larger MMN responses (Kimura et al, 2010; Sussman et al, 1998), suggesting that subjects (either implicitly or explicitly) learn the tone sequence. In formal learning theory terms, they use prediction errors (events that violate their expectancy) to learn the structure of their environment (Rescorla and Wagner, 1972). As such, subjects with a smaller average MMN response might actually have an increased sensitivity to prediction error and learn the predictability of their sensory inputs more effectively. Of course, this is speculative, and there are other methodological differences between the studies that may have contributed to the discrepancy (such as different doses and methods of ketamine administration, as well as different symptom rating scales).
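The intuition that a learner's surprise at a deviant tone should fall as the sequence statistics are acquired can be illustrated with a toy surprisal model. Frequency counting with Laplace smoothing is a deliberately crude stand-in for whatever expectancy process the MMN indexes; the stimulus stream and numbers are purely illustrative.

```python
import math
from collections import Counter

def surprisal_trace(sequence):
    """Surprisal (-log2 p) of each event under running frequency estimates.

    Probabilities are Laplace-smoothed over the two stimulus types, so the
    first deviant is maximally surprising and later ones less so as the
    listener's model of the stream improves.
    """
    counts = Counter()
    total = 0
    trace = []
    for tone in sequence:
        p = (counts[tone] + 1) / (total + 2)  # smoothed estimate from history
        trace.append(-math.log2(p))
        counts[tone] += 1
        total += 1
    return trace

# A regular oddball stream: a deviant 'D' after every nine standards 'S'
regular = (['S'] * 9 + ['D']) * 5
```

In this toy model, deviants remain more surprising than standards, but each successive deviant carries less surprisal than the first, which is the qualitative pattern the paragraph above appeals to.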

Predicting negative symptoms

Negative symptoms involve social and cognitive disengagement, perhaps due to a reduction in the processing capacity of prefrontal cortex (Silver and Feldman, 2005), which leads to difficulties in sustaining concentration and maintaining task set (Nuechterlein et al, 1986). If this is the case, subjects with inefficient prefrontal function during attention and working memory tasks should show increased vulnerability to negative symptoms under high-dose ketamine. This is what we observed (Honey et al, 2008). For both an n-back working memory task and a continuous performance task (CPT) involving sustained attention, there was a significant association between task-related activation under placebo and the expression of negative symptoms on a high dose of ketamine. Regression analyses revealed a strong relationship between lateral prefrontal cortex activity under placebo and negative symptom scores under ketamine (see Figure 1). For the attention task (CPT), there was a significant association between task-related activation and negative symptoms in the bilateral inferior frontal gyri and right middle frontal gyrus (see Figure 1). Both findings are consistent with our prediction that the efficiency of responses to working memory and attentional demands, particularly frontally mediated responses, would relate to negative symptoms under the drug.

Predicting thought disorder and (pseudo)hallucinations

Another cognitive domain that can be explored in relation to positive symptoms is self-monitoring; the ability to distinguish self- from nonself-related processing (Frith et al, 2000b). Self-monitoring has been implicated in the genesis of auditory hallucinations, where sufferers may be misattributing their own inner speech to an external source (Frith et al, 2000b). Furthermore, aberrations of self-monitoring have been invoked to explain delusions of passivity, in which the sufferer believes that their thoughts or actions are under the control of an external agent (Frith et al, 2000b). Finally, verbal self-monitoring is relevant to another cardinal psychotic symptom, formal thought disorder, in which a sufferer's expressive language is disrupted such that they produce tangential, disconnected speech (Oh et al, 2002); sometimes subjects produce no speech at all (alogia) and claim not to have any thoughts (subjective thought block) (Berenbaum et al, 2008). Working memory and, in particular, contextual regulation of ongoing processing are particularly relevant to the induction of thought disorder (Barrera et al, 2005). We explored the idea that, if ketamine produces thought disorder and hallucination-like perceptual aberrations, then the extent to which it does so may relate to subjects' ability to perform tasks engaging self-monitoring and to the neural underpinnings of such tasks. Subjects were therefore required to imagine visually presented sentences being read either in their own inner speech or in the voice of one of two ‘robots’. Imagining robotic speech preferentially engaged frontal and temporal cortices. Critically, those subjects in whom this underlying activation was greatest experienced greater changes in perceptual salience and severity of thought disorder when receiving ketamine outside the scanner on a separate occasion.
The finding was considered in terms of a relative inefficiency of processing such that those who required the greatest level of activation to carry out the task were those most vulnerable to these particular effects of the drug. Furthermore, within the same task, we manipulated a requirement to generate the endings of sentences. Inefficiency of left frontal cortical engagement during this component of the task predicted the severity of thought disorder across subjects (Honey et al, 2008); see Figure 1.

It may be that the prediction error model that we have found useful in considering brain responses predictive of delusional ideas (above) is also useful here. Self-monitoring processes involve a predictive expectation and a comparison with experience in order to discern self from other (Blakemore et al, 2002; Frith et al, 2000a). Furthermore, sentence completion involves a predictive contextual process that constrains the set of words that may be employed to complete a particular sentence (Hoeks et al, 2004). Aberrant prediction-related processes may underpin individual susceptibility to thought disorder and auditory illusory phenomena (precursory to hallucinations).

Thus, there is evidence that individual variability in brain responses can be predictive of ketamine-induced symptoms and experiences. Moreover, these observations offer clues about the cognitive and neural basis for symptoms insofar as they highlight the process- and region-specific vulnerability markers for the array of drug effects. In the remainder of this article, we will attempt to bring these links between cognition, brain function, and psychiatric phenomenology under a common explanatory framework: the Bayesian model of brain function (Friston, 2005a, 2009) and psychosis (Corlett et al, 2009a; Fletcher and Frith, 2009; Stephan et al, 2009).

Individual Variability and the Bayesian Brain: a Model that Links Cognitive Variability to Symptom Susceptibility

Of course it is interesting, and potentially clinically useful, to be able to predict how an individual will respond to a drug on the basis of ‘baseline’ measures of task-specific neural response. But it would be especially valuable if we could use this information to frame hypotheses about how cognition and symptoms emerge from the same systems whose variable responsiveness predicts an individual's specific vulnerability (to, say, negative symptoms or perceptual disturbance). We will consider whether this is possible by proposing a unifying account of the positive and negative symptoms engendered by ketamine (both acutely and chronically) expressed in terms of the increasingly influential Bayesian conception of the brain and behavior (Corlett et al, 2009a; Fletcher and Frith, 2009; Stephan et al, 2009). What we perceive and learn, and how we comport ourselves, is affected by the sensations incident upon us combined with our prior experiences and expectations about the causes of those sensations (Bayes, 1764); see Box 3. We add that this explanation may be generalized to other drug models (see Corlett et al, 2009a, 2009b and Box 1).
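The core Bayesian computation is simply the combination of a prior expectation with the likelihood of the sensory evidence. A minimal sketch over a discrete hypothesis space, with hypothetical numbers chosen to echo the hollow-mask example discussed later in this article:

```python
def posterior(prior, likelihood):
    """Bayes' rule over a discrete set of hypotheses about the cause of a
    sensation: posterior is proportional to likelihood times prior."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

# A strong prior (faces are convex) can override mildly contrary evidence,
# as when a hollow mask is nonetheless perceived as a convex face.
prior = {'convex': 0.9, 'concave': 0.1}
likelihood = {'convex': 0.4, 'concave': 0.6}  # sensory data mildly favor concave
```

With these illustrative values the posterior still favors the convex interpretation, because the prior outweighs the ambiguous sensory evidence; weaken the prior and the inference flips.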

The model conceives of a general purpose for the brain, brain systems, and even single neurons: they aim to predict their subsequent inputs with increasing certainty (Deneve, 2008a, 2008b; Fiorillo, 2008; Friston, 2005a). This idea has its roots in formal associative and reinforcement learning theories (Rescorla and Wagner, 1972; Sutton and Barto, 1981), as well as in the work of Hermann von Helmholtz, who theorized that perception is a constructive process based on learned expectancies (Helmholtz, 1878/1971). We attend to and learn about events that violate our expectancies such that we are less surprised subsequently (Kamin, 1969; Pearce and Hall, 1980; Rescorla and Wagner, 1972). This ‘prediction error’ concept has been formally developed to account not only for reinforcement and causal learning (Dickinson, 2001; Fletcher et al, 2001; Schultz and Dickinson, 2000) but also for the receptive field properties of the visual system (Rao and Ballard, 1999), the anatomy and physiology of cortical hierarchies (Friston, 2005a; Mesulam, 2008; Mumford, 1992), as well as the interactions between basal ganglia learning systems and sensory cortices that mediate perceptual learning (den Ouden et al, 2009).

Implementing Bayes in the Brain

The Bayesian hierarchical model makes explicit predictions about the roles of different neurotransmitter systems and subsystems in signaling predictions and prediction errors. This division of labor maps on to other conceptions of interneuronal signaling such as the notion of neural ‘drivers’ and ‘modulators’ (Sherman and Guillery, 1998); ‘drivers’ send signals upward through the hierarchy and ‘modulators’ specify predictions top-down (Friston, 2005a). While the originators of driving and modulation as interneural effects did not specify a currency for driving, they did speculate that modulation took place through metabotropic glutamate receptors (Sherman and Guillery, 1998). Subsequently, Friston suggested that the topography and signaling properties of glutamate receptor subtypes might be well suited to instantiating both driving and modulation (Angelucci et al, 2002a, 2002b; Bullier et al, 2001; Hupe et al, 1998, 2001a, 2001b); in a hierarchical cortical system in which representations become more abstract with increasing distance from the primary input, higher levels of the hierarchy specify top-down predictions through NMDA receptor signaling and any mismatches between expectancy and experience are conveyed upward through the hierarchy via rapid AMPA and GABA signaling (Friston, 2005a). There is some neurophysiological support for this contention (Self et al, 2008); however, it is unlikely that NMDA receptors exclusively signal predictions and AMPA receptors exclusively signal prediction errors; rather, there may be a division of labor in which AMPA receptors are relatively more engaged bottom-up and NMDA receptors are relatively more involved in top-down processes.
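The computational idea behind this division of labor can be sketched as a toy two-level hierarchy in which predictions descend and residual errors ascend. This is a sketch of the algorithmic scheme only, with arbitrary gain parameters; it makes no claim about receptor physiology.

```python
def two_level_hierarchy(inputs, eta1=0.2, eta2=0.05):
    """Toy two-level predictive hierarchy.

    mu1 (lower level) predicts the sensory input; mu2 (higher level)
    predicts mu1. Mismatches pass upward as errors ('AMPA-like' in the
    scheme above); each level's estimate is also pulled toward the
    prediction descending from above ('NMDA-like').
    """
    mu1 = 0.0
    mu2 = 0.0
    for x in inputs:
        err1 = x - mu1       # bottom-up error from the senses
        err2 = mu1 - mu2     # bottom-up error from level 1 to level 2
        mu1 += eta1 * err1 + eta1 * (mu2 - mu1)  # data and top-down prior
        mu2 += eta2 * err2
    return mu1, mu2
```

Under a stable stream of inputs, both levels converge and the ascending errors are quenched; blocking the top-down term would leave the lower level driven by raw input alone, which is one intuition for how NMDA blockade might unbalance the system.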

Considering the distribution of NMDA and AMPA receptors on the surface of neocortical neurons, the soma and proximal dendrite are highly sensitive to NMDA receptor manipulations, whereas more distal parts of the dendrite are more sensitive to manipulations of AMPA function (Dodt et al, 1998), implying that NMDA receptors near the soma might regulate the amplification of synaptic signals resulting from AMPA receptor activation on remote dendritic sites (Dodt et al, 1998). A similar division of labor is apparent in the connections between the striatum and frontal cortex, although with added complexity (Gittis et al, 2010).

Striatal GABAergic interneurons control the output of the striatum to frontal cortex via the direct and indirect pathways (which express D1 and D2 dopamine receptor subtypes, respectively (Shen et al, 2008)). The majority of control involves feedforward inhibition from local interneurons, which contain parvalbumin and are fast spiking (Tepper and Bolam, 2004). However, some inhibition is mediated by low-threshold spiking interneurons (Koos and Tepper, 1999). These two subtypes of interneurons have distinct signaling properties; fast spiking cells have large rectifying AMPA-mediated currents, but no detectable NMDA-mediated responses (Gittis et al, 2010). On the other hand, low-threshold spiking interneurons have currents indicative of a small AMPA current and an NMDA receptor-mediated component. The AMPA signaling FS interneurons preferentially target direct pathway medium spiny neurons over those in the indirect pathway (Gittis et al, 2010). This arrangement would amount to AMPA receptor stimulation driving and NMDA receptor signaling inhibiting driving outputs from the striatum to the prefrontal cortex. Furthermore, while NMDA receptor antagonism does not alter phasic dopamine release directly (Chesselet, 1984; Mount et al, 1989), top-down regulation of tonic extrasynaptic dopamine levels by glutamatergic afferents from the prefrontal cortices sets the tone on prediction error responses via NMDA receptors (Grace, 1991).

There are, of course, numerous other aspects to the glutamatergic synapse and its regulation; too numerous to treat in detail here. We will briefly discuss some of those aspects with relevance to psychosis and the model under consideration. There are high steady-state glutamate levels in the extrasynaptic space, controlled by extrasynaptic metabotropic group II glutamate receptors (mglur2), which interact with an active exchange mechanism that shuttles cystine into, and glutamate out of, the glial cells that surround synapses (Baker et al, 2008). As mglur2s function as presynaptic autoreceptors (Baskys and Malenka, 1991), extrasynaptic glutamate negatively modulates the synaptic release of glutamate (Moran et al, 2005). Hence the modulatory role of mglur receptors on feedforward driving inputs (Sherman and Guillery, 1998).

Another metabotropic receptor, subtype 5 (mglur5), has emerged as a close signaling partner with the NMDA receptor; the two physically interact via postsynaptic density scaffolding proteins homer and shank (Tu et al, 1999); dysfunctions in homer (Gauthier et al, 2010; Gilks et al, 2010), as well as numerous other components of the postsynaptic density (Hahn et al, 2006), have been associated with risk for schizophrenia. Activation of mglur5 receptors reverses the effects of NMDA antagonism on cortical function (Lecourtier et al, 2007). Crucially, mglur5-knockout mice display schizophrenia-like phenotypes (Gray et al, 2009) and mglur antagonists are psychotomimetic in human subjects (Friedmann et al, 1980), underlining the key role of decreased NMDA signaling in the generation of psychosis-like symptoms.

More broadly, these more detailed observations of the glutamatergic synapse underline the relevance of excess synaptic glutamate for the generation of psychotic symptoms (Moghaddam et al, 1997) and, within the context of our proposed translational model, how excess synaptic glutamate might arise either due to inappropriate feedforward signaling or an absence of the regulating processes that specify expected inputs.

Slower Neuromodulators—Precision, Uncertainty, and Learning

There is, of course, much more to cognition, perception, their dysfunction, and their underlying neurobiology than rapid neurotransmission; slower neuromodulators, such as dopamine, acetylcholine, serotonin, and noradrenaline, have a significant role (Greengard, 2001; Spitzer, 1995). Multiple neuromodulators, most notably dopamine, have been implicated in the pathophysiology of psychosis (Carlsson et al, 2001; Kapur, 2003). Where do slower neuromodulators fit into this Bayesian framework? An important distinction, within Bayesian hierarchical models of the brain (Friston, 2005a, 2005b, 2009, 2010), is between prediction errors per se and the precision of, or uncertainty about, those errors. We (and others before us) propose that rapid glutamatergic and GABAergic neurotransmission represents prediction error and, depending on the neuroanatomical circuitry, slower neuromodulators encode the precision of prediction errors (Corlett et al, 2009a; Friston, 2005a); see Figure 2. Some theorists distinguish between these two neurochemical components as mediating inference and learning, respectively (Fiser et al, 2010; Friston, 2005a). While these processes are intimately connected, they are dissociable to some extent (see Figure 3). Inferences involve short-term decisions, such as unconscious perceptual inductions, in which one perceptual hypothesis is favored over another, for example, the Necker Cube (Barlow, 1990). On the other hand, learning represents the set of prior expectancies brought to bear on current processing (Fiser et al, 2010); the set of learned predictions that drives healthy individuals to perceive a hollow mask as a convex face (Emrich, 1989).
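The proposed role of the slower neuromodulators can be captured in a single precision-weighted update, in which the influence of a prediction error on belief depends on the relative precision of the evidence. The values below are illustrative and the mapping to dopamine is, as in the text, a hypothesis rather than an established mechanism.

```python
def precision_weighted_update(mu, obs, pi_prior, pi_obs):
    """Precision-weighted belief update.

    The prediction error (obs - mu), the quantity we suggest fast
    glutamatergic/GABAergic signaling conveys, is scaled by the relative
    precision of the evidence, the quantity we tentatively assign to
    slower neuromodulators such as dopamine.
    """
    delta = obs - mu                       # prediction error
    weight = pi_obs / (pi_obs + pi_prior)  # precision sets the learning weight
    return mu + weight * delta
```

When sensory precision is high relative to the prior, the same error moves the belief a long way; when the prior dominates, the error is largely discounted. Aberrantly high precision on errors would thus make ordinary sensations unduly belief-revising, one reading of the psychotomimetic state.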

Figure 2

A model of the reciprocal relationships between inference and learning, priors and prediction error, synaptic plasticity and neural dynamics. Inference is encapsulated in the bistable percepts of the Necker cube: faced with ambiguous inputs, the brain entertains multiple hypotheses and makes an inference as to the best candidate. The powerful effect of learning on perception is captured by the hollow-mask illusion, wherein, as a result of our overwhelming experience of faces as convex, we perceive a hollow, concave, inverted mask as convex. Every prediction, or hypothesis that we entertain, has a likelihood distribution, which we compare with the inputs, computing both a prediction error and a degree of uncertainty associated with that prediction error. We speculate that fast neurotransmitters (GABA and glutamate) may code the prediction error and slower neuromodulators (eg, dopamine and acetylcholine, depending on the task and underlying circuitry) may compute the uncertainty.


Figure 3

The putative effects of acute and chronic ketamine treatment within the Bayesian model. We predict that, with repeated ketamine exposure, aberrant learning (due to deranged synaptic plasticity) and subsequent inappropriate inferences (based on perturbed neural dynamics) lead to maladaptive and inaccurate representations of the world; delusional beliefs.


This distinction (between prediction error and uncertainty, and between inference and learning) expands the role of phasic dopaminergic discharges beyond encoding reward prediction error, to include a more general role for dopamine in modulating or optimizing the precision of prediction errors that may or may not be reward related (ie, modulating the signal-to-noise response properties of neural units encoding prediction error). This notion fits with the proposed role of dopamine and glutamate interactions in controlling signal-to-noise ratio (Grace, 1991; Spitzer and Walter, 2003) and with the numerous proposals that dopamine (at least in terms of its tonic discharge rates) encodes uncertainty or violation of expectations (Fiorillo et al, 2003; Preuschoff and Bossaerts, 2007). These models are also consistent with physiological recordings of prediction errors; for example, the slower cortical dopaminergic response that follows a rapid glutamatergic surprise signal, perhaps involved in maintaining surprising events in working memory (Lavin et al, 2005), as well as in updating expectancies through changes in synaptic function (Schultz and Dickinson, 2000). Furthermore, frontostriatal prediction error signals guide the formation of sensory associations between nonrewarding stimuli (den Ouden et al, 2009, 2010).

Of course, the general model is a gross oversimplification, but it does provide a skeletal framework in which to consider learning, inference, perception, and, perhaps, consciousness in a manner conducive to generating and testing hypotheses. Below, we consider in more detail the relationship of this overarching view of the importance of prediction error and the symptoms of psychosis.

Prediction Error and Psychosis

The positive symptoms of psychosis involve gross misrepresentations of reality. In terms of the Bayesian model, delusions result when an individual experiences internally engendered aberrant prediction error coincident with what should be unsurprising and highly predictable events. Hallucinations involve excessively strong predictions within the hierarchical sensory cortices, conferring an apparent structure upon sensory noise such that the individual experiences a percept without sensory stimulus (Corlett et al, 2009a, 2009b). Passivity phenomena (experiencing one's own actions without intention and inferring that they are therefore under the control of an external agent) occupy a hinterland between hallucinations and delusions (Fletcher and Frith, 2009). Within the Bayesian model, they too involve a prediction error dysfunction (Corlett et al, 2009a; Fletcher and Frith, 2009; Stephan et al, 2009). In the absence of robust predictions about the sensory consequences of an intentional action, performance of the action feels surprising, and that surprise encourages an external agency attribution.

It is important to include in the model of psychotic symptoms a consideration of the characteristic prodromal phase of illness. During the early stages of psychosis, patients report changes in the intensity of their perceptual experience, such that background noises seem much louder and colors seem brighter (McGhie and Chapman, 1961; Bowers and Freedman, 1966; Freedman, 1974; Matussek, 1952). Further, they perceive inappropriate relatedness between external and internal stimuli and events (Schneider, 1930; Freedman and Chapman, 1973; Freedman, 1974; Matussek, 1952). Within the framework of a model combining prior expectations with incoming sense data, these experiences can be conceived of as arising from a persistent and irreconcilable mismatch, such that even mundane experiences are imbued with a degree of salience (see Kapur, 2003) that renders them vivid, novel, and important.

In light of this simplified account of the prepsychotic and psychotic state, we believe that psychotomimetic drugs, and ketamine in particular, offer a means of modeling these states, allowing us to induce transient psychotic experiences by blocking NMDA receptors and enhancing AMPA receptor signaling (Corlett et al, 2007, 2009a, 2009b). NMDA antagonists impair the constraint of current experience by learned expectation by blocking the top-down actions of NMDA receptors (Phillips and Silverstein, 2003), while, at the same time, encouraging aberrant prediction error responses to merely coincident events via enhanced AMPA receptor signaling (Jackson et al, 2004; Moghaddam et al, 1997). In this way, the world of an individual administered an NMDA receptor antagonist becomes highly unpredictable: different things seem important, important things seem different, and, from this confusing state, delusions arise as explanatory schemes (Corlett et al, 2007).
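The conjunction of weakened top-down priors and amplified feedforward error can be illustrated with a small simulation; this is our own sketch under stated assumptions, not a biophysical model, and the parameter names (`prior_precision`, `error_gain`) are hypothetical stand-ins for NMDA-dependent expectation and AMPA-like feedforward signaling:

```python
import random

def surprise_trace(inputs, prior_precision, error_gain, obs_precision=1.0):
    """Track the precision-weighted prediction error ('felt surprise') as a
    Gaussian belief is updated over a stream of inputs. error_gain stands in
    for amplified feedforward signaling; prior_precision stands in for the
    strength of top-down expectations."""
    mean, precision = 0.0, prior_precision
    surprise = []
    for x in inputs:
        pe = x - mean
        weight = obs_precision / (precision + obs_precision)
        surprise.append(error_gain * abs(pe) * weight)  # readout of surprise
        mean += weight * pe                             # standard belief update
        precision += obs_precision
    return surprise

random.seed(1)
stream = [random.gauss(0.0, 0.2) for _ in range(50)]   # mundane, predictable events
healthy = surprise_trace(stream, prior_precision=5.0, error_gain=1.0)
ketamine = surprise_trace(stream, prior_precision=0.5, error_gain=3.0)
assert sum(ketamine) > sum(healthy)  # the same events register as more surprising
```

With weak priors and an amplified error signal, identical mundane inputs accumulate far more surprise: a toy analogue of a world in which "different things seem important".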

Predictive Learning, Negative Symptoms, and Cognitive Control

Can prediction error also apply to negative symptoms? Negative symptoms involve deficits in motivated behavior and a lack of pleasure in previously enjoyed activities, events, and experiences (Andreasen, 1982). Recently, negative symptoms have been analyzed within the reinforcement learning framework (a special case of the Bayesian model; Dayan and Daw, 2008). Patients with negative symptoms appear to have problems predicting the ultimate consequences of a series of actions (Polgar et al, 2008) and using those predictions to guide goal-directed decision-making (Barch et al, 2003). This deficit may also involve working memory dysfunction, one of the cardinal cognitive dysfunctions associated with schizophrenia (Goldman-Rakic, 1994); that is, the short-term storage of salient information in a limited-capacity system in the service of behavior (Baddeley, 1981).

There is an important contextual aspect to working memory processes, such that behaviors, thoughts, and percepts are constrained to what is appropriate to the current situation (Barch et al, 2001; Braver et al, 1999). Working memory deficits also disturb contextual appropriateness (Barch et al, 2003) and lead to a reduction in behavioral engagement and output (Barch et al, 2003), as well as an inability to logically order thoughts and produce coherent and communicative speech (thought disorder, a positive symptom, although complete thought block and alogia are considered negative symptoms).

We contend that negative symptoms arise when no viable predictions can be produced or maintained, subtending a paucity of behavior, disengagement, and withdrawal. Others have attempted to bring negative symptoms within the Bayesian framework (Stephan et al, 2009); these authors conceived of negative symptoms as a reaction to the unpredictability associated with positive symptoms and ‘impaired learning’. Our empirical work with ketamine (see above) contradicts this idea of negative symptoms as merely a reaction to positive symptoms: subjects treated with ketamine experience a range of different responses and, crucially, those who experience the most severe perceptual aberrations and delusion-like ideas are not the same subjects who experience the most severe negative symptoms (Honey et al, 2008).

Furthermore, social isolation (presumably related to negative symptoms in at least some patients) appears to incubate the generation of psychotic symptoms (Hoffman, 2007). Indeed, negative symptoms appear to precede the genesis of positive symptoms in some patients (Iyer et al, 2008) and to exist as endophenotypes in otherwise unaffected relatives of patients with schizophrenia (Laurent et al, 2000). Our account of negative symptoms as a paucity of motivated expectation (and not secondary to positive symptoms) would be entirely consistent with these observations.

Thus, we have considered a prediction error-based model of both the positive and negative symptoms of schizophrenia and shown how observations of neural responses and, critically, the individual differences therein, may help us to relate this theoretical framework more directly to ketamine-induced psychosis. Clearly, a great deal of work remains and the existing literature is not without controversy (see Box 2).

EXTENDING THE MODEL IN TIME: LEARNING FROM THE EFFECTS OF CHRONIC AND REPEATED KETAMINE USE

The acute ketamine model is not without its critics (see Box 2). One important point is the surprising rapidity with which ketamine produces psychotomimetic effects (of the order of minutes; Bowdle et al, 1998). However, changes in synaptic function can happen that rapidly (Stephan et al, 2006). Ketamine's effects also subside rapidly. The drug is metabolized with a half-life of between 10 and 15 min. Within 30 min, subjects administered a psychotomimetic dose of ketamine will have returned almost completely to normal (with a small minority experiencing residual effects for the next few hours). There is no evidence of adverse effects subsequent to a single acute ketamine challenge in healthy subjects, nor of any significant sensitization of responses over multiple (up to four) repeated ketamine administrations (Cho et al, 2005; Perry et al, 2007).

Despite this lack of sensitization across a small number of administrations, widely spaced in time and given in an experimental setting, chronic ketamine abusers do show escalating cognitive changes and symptom experiences (Morgan et al, 2010). Admittedly, their exposure involves more numerous and more frequent ketamine administrations than those of subjects in repeated-administration studies (whose doses will have been separated by months, or even years). Recreational users usually administer ketamine intranasally, with an estimated duration of effect of up to 1 h (Siegel, 1978). In an evening, ketamine abusers will often self-administer several sequential doses of the drug in order to maintain its psychotropic effects over time (Muetzelfeldt et al, 2008).

CHRONIC KETAMINE ABUSE

The recreational use of ketamine was first reported in the 1970s in North America (Petersen and Stillman, 1978). Renowned experimenters or ‘psychonauts’ John C Lilly and Marcia Moore both published books on their experiences with ketamine in 1978 (Lilly, 1978; Moore and Altounian, 1978). The possibility of consciousness expansion and the exploration of new worlds provided by ketamine appealed to both of them; however, as their ketamine use continued, tolerance developed and both spiraled toward larger and larger consumption (Jansen, 2004). Both Lilly and Moore perceived increasing numbers of coincidences or synchronicities that demanded explanation. Lilly conceived of a Coincidence Control Center and became increasingly preoccupied with communicating with cetaceans (themes to which he alludes in published academic work; Lilly, 1963). As each continued to use ketamine excessively, the degree of self-harm escalated: Lilly was committed to a state psychiatric hospital and suffered a near-fatal drowning accident; Moore died of exposure in a forest near her home (Jansen, 2004).

These instructive yet cautionary tales have not quelled enthusiasm for illicit ketamine use; from 1999 to 2003, the prevalence of ketamine use among club goers in the UK rose from 25 to 40% (McCambridge et al, 2007). An increasing number of studies have examined cognition and phenomenology in chronic ketamine users (Curran and Monaghan, 2000, 2001; Freeman et al, 2009; Morgan et al, 2004b, 2004c, 2006, 2008, 2009, 2010). Ketamine users had high levels of dissociation and schizophrenia-like symptoms while on the drug, and frequent users remained impaired when drug free (Curran and Monaghan, 2001). Although some cognitive processes remain relatively unaltered (for example, frequent ketamine users do not have the severe oculomotor abnormalities redolent of schizophrenia; Morgan et al, 2009), chronic users do have semantic and episodic memory impairments that correlate with their degree of exposure (Morgan et al, 2010). Moreover, and relevant to the current thesis, chronic ketamine users show aberrant learning: they engage in predictive responding toward irrelevant stimuli, a superstitious responding style (note, though, that ketamine-naive subjects also show a considerable degree of superstition; Freeman et al, 2009). Crucially, frequent ketamine use increases the severity of delusional ideation when subjects are followed up longitudinally for 1 year (Morgan et al, 2010). These beliefs remain, although attenuated, even in individuals who become abstinent (Morgan et al, 2010).

A further important question arises: what is the basis for these individual differences in acute and chronic effects upon which the Bayesian approach may capitalize?

FACTORS UNDERPINNING ACUTE AND CHRONIC KETAMINE EFFECTS

Genetic Effects

The idea that genetic variability between individuals might influence drug response was described, and termed pharmacogenetics, by Vogel (1959). Genetic variation confers differences in receptor function, cortical development, and synapse formation, structure, or plasticity; this may underpin the individual differences we have observed (Stephan et al, 2006). Although an association study between the acute effects of ketamine in healthy volunteers and genetic variability has yet to be published, there are a number of important leads.

Family history of alcoholism: Individuals with a family history of alcoholism show a blunted psychotomimetic response to ketamine, experiencing fewer dysphoric effects, fewer positive symptoms, and milder negative symptoms than individuals without such a family history (Petrakis et al, 2004). As ketamine binds to NMDA receptors, this variability presumably relates in some way to the function, structure, or regulation of NMDA receptor signaling. Future studies may unveil the specific mechanisms of this effect.

Apolipoprotein E4: A ketamine challenge study in a group of remitted patients with schizophrenia (n=18) revealed a milder effect of ketamine in subjects carrying the epsilon 4 allele of apolipoprotein E (n=7); in particular, they had smaller ketamine-induced changes in unusual thought content (a measure of delusional ideation) relative to placebo (Malhotra et al, 1998). Patients with schizophrenia who carry this allele have less severe positive symptoms (Pickar et al, 1997). However, patients with Alzheimer's disease who carry it are more likely to suffer hallucinations and delusions (Zdanys et al, 2007). Apolipoprotein E4 is involved in lipid metabolism; however, data on its neural interactions are emerging: for example, ApoE4 modulates NMDA receptor function via effects on insulin-degrading enzyme and protein kinase A (Sheng et al, 2008), which is indicative of a potential role for cyclic AMP signaling in the psychotomimetic effects of ketamine. The relationship between the presence of the epsilon 4 allele and the severity of ketamine-induced psychosis in healthy volunteers has yet to be established empirically.

Personality

Initial studies of emergence phenomena following ketamine anesthesia explored personality as an explanatory factor (Biersner et al, 1977). These studies focused on Eysenck's personality dimensions, with inconsistent results: high neuroticism, high psychoticism, and low extraversion were related to ketamine-induced emergence phenomena. However, these studies were not placebo controlled and, given the initial use of ketamine in obstetric and gynecological anesthesia, female subjects predominated, which may have biased the results (Jansen, 2004).

The lack of a significant association between schizotypal personality and ketamine responses (Krystal et al, 1994; PR Corlett, GD Honey, and PC Fletcher, unpublished observations) suggests that schizotypy and acute ketamine-induced psychosis are mediated by different neurochemical mechanisms. However, such inferences should be made with caution: healthy volunteers invited to partake in ketamine research are carefully screened for a family or personal history of mental illness (some groups even use high schizotypy scores as an exclusion criterion).

Although prolonged ketamine abuse may engender more convincing delusions than acute administration, this experimental model has not yet been examined prospectively. That is, just as we have found individual differences in behavioral and brain function that predispose toward certain patterns of experience with acute ketamine, there may be predisposing factors related to chronic ketamine use. A placebo-controlled study is impossible here. Indeed, it is possible that more schizotypal subjects (for example) are drawn to engage in repeated ketamine use. Exploring this idea would require a prospective study with an extremely large initial population of subjects, some of whom would go on to abuse ketamine and in whom assessments had been made before the onset of ketamine use.

Future work should triangulate the relationship between ketamine-induced psychopathology, personality measures, and cortical neurochemistry. Other relevant personality measures ought to be explored, in particular suggestibility and absorption, which correlate with susceptibility to hypnosis (Braffman and Kirsch, 1999), itself a relatively recent model of psychosis that is beginning to be employed in delusion research (Barnier et al, 2008).

Other Candidate Mechanisms for Individual Variability in Ketamine Response

In addition to single-nucleotide polymorphisms (SNPs), which are one marker of genetic variability, there are numerous biological mechanisms through which the observed variability in ketamine response may be mediated. For example, gene copy-number variation has been associated with schizophrenia (Walsh et al, 2008) and may be associated with responses to pharmacological interventions (Dhawan and Padh, 2009). In addition, alternative splicing provides a means through which variability can be introduced (Passetti et al, 2009). Put simply, alternative splicing involves the formation of multiple alternative messenger RNAs (mRNAs) from the same gene (Gilbert, 1978), which can code for proteins of varying functionality (Black, 2000). Alternative splicing, too, has been implicated in the generation of interindividual differences in drug responses (Passetti et al, 2009). With respect to NMDA antagonist model psychoses, alternative splicing has been employed to alter the functionality of NMDA receptors in vitro (Rodriguez-Paz et al, 1995). In brief, NMDA receptors are assembled from a combination of subunits; in this study, the inclusion of an alternatively spliced insertion of 21 amino acids into the NMDAR1 subunit gave rise to receptors that were more sensitive to ketamine, an effect that was maintained when the alternatively spliced NR1 subunits were combined with NMDAR2B subunits into heteromeric receptors (Rodriguez-Paz et al, 1995). Furthermore, NMDAR2B-selective antagonists do not engender psychotomimetic effects in human subjects (Wang and Shuaib, 2005). The variability in the behavioral effects of NMDA receptor antagonists (Gilmour et al, 2009) may be explicable in terms of their differing abilities to bind NMDA receptors of varying subunit composition, as well as the variability in the concentration of those subunits across brain regions (Wenzel et al, 1995).

Furthermore, variation in the ketamine response of human subjects may relate to the subunit composition of their NMDA receptors, specifically the functionality of NR1 subunits as affected by alternative splicing. However, the inbred Balb/c mouse strain that is more sensitive to NMDA receptor antagonists (Deutsch et al, 1997) shows no significant alteration in mRNAs for NR1, NR2A, or NR2B subunits (Perera et al, 2008).

Another possible mechanism through which alternative splicing might influence ketamine responses is through interactions with microRNAs (miRNAs). miRNAs are noncoding RNAs that can block mRNA translation and affect mRNA stability, and hence the function of ∼30% of protein-coding genes in the human genome (Rajewsky, 2006). miRNAs mediate their effects by targeting the 3′ untranslated region of the mRNAs generated from the genes being regulated (Bartel, 2009). Variations in DNA sequence, such as SNPs, as well as variations in mRNA sequence generated by alternative splicing, can affect an miRNA's ability to regulate its target (Georges et al, 2006). Hence, variance in miRNA function could also be involved in the individual differences in ketamine response. More specifically, microRNA 219 (miR-219) is significantly reduced in rodent prefrontal cortex following NMDA antagonism and is involved in the expression of the psychotomimetic effects of NMDA blockade (Kocerha et al, 2009). miR-219 has also been implicated in neuronal development, specifically myelination and the generation of oligodendrocytes (Dugas et al, 2010), as well as in long-term potentiation, a synaptic learning process (Wibrand et al, 2010). Hence, variation in miR-219 function might confer susceptibility to the effects of NMDA antagonism, in particular those effects that involve aberrant neural learning.

There are multiple other potential sources of variability in the ketamine response; we highlight these mechanisms as potential future avenues of inquiry.

CONSILIENCE?

We aim to provide a consilient account of acute and chronic NMDA antagonist administration in terms of the Bayesian model we introduced.

To briefly rehearse: we and others have postulated that brains, brain systems, and even single neurons function as Helmholtz machines (Dayan et al, 1995; Hinton and Dayan, 1996); they try to minimize their uncertainty about subsequent inputs and stimulation by generating predictions and responding to violations of those expectancies. Predictions are specified top-down, via NMDA receptor signaling, to lower layers of the hierarchy, and lower layers signal prediction errors bottom-up, via AMPA and GABA receptor signaling. These errors are either ignored or accommodated; that is, fed forward through the hierarchy to update subsequent predictions with new learning (Dayan et al, 1995; Hinton and Dayan, 1996).
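This settle-and-update cycle, with predictions passed down, errors passed up, and accommodated errors revising the prediction, can be caricatured in a few lines. The sketch below is a deliberately minimal illustration of the scheme, not the authors' published model, and the learning-rate parameter `lr` is a hypothetical stand-in for the gain on fed-forward error:

```python
def predictive_coding_step(prediction, inputs, lr=0.2):
    """One cycle of a minimal two-level hierarchy: the higher level sends a
    prediction down; the lower level returns prediction errors; the pooled
    error is fed forward to revise the prediction (new learning)."""
    errors = [x - prediction for x in inputs]   # bottom-up error signals
    mean_error = sum(errors) / len(errors)
    return prediction + lr * mean_error, mean_error

prediction = 0.0
for _ in range(100):
    prediction, err = predictive_coding_step(prediction, inputs=[1.0, 1.2, 0.8])
# with repeated exposure the prediction converges on the input statistics
# and the residual error shrinks toward zero
```

On the account in the text, NMDA blockade would weaken the downward (prediction) term while enhanced AMPA signaling would amplify the upward (error) term, so the system keeps treating well-learned inputs as if they were news.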

In addition to the rapid glutamatergic (and GABAergic) signaling between layers in a cortical hierarchy, there are slower neuromodulatory effects; for example, dopamine and acetylcholine appear to code a confidence estimate, or uncertainty, in predictions and prediction errors (Preuschoff and Bossaerts, 2007; Yu and Dayan, 2002) in different hierarchies (Friston, 2005a; Stephan et al, 2006, 2009). In the mesocortical dopamine system, dopamine and glutamate can be coreleased; dopamine levels rise over a slower timescale following a rapid glutamatergic prediction error signal, perhaps to ensure the maintenance of potentially explanatory information in working memory and to promote subsequent plasticity and new learning (Lavin et al, 2005). We and others make a distinction, based on these two signaling modes, rapid and neuromodulatory, between inference and learning (Fiser et al, 2010; Friston, 2005a). These two processes complement one another but, we posit, may be differentially affected by acute and chronic ketamine.

We posit that inference is glutamatergically mediated and, in the short term, only briefly perturbs slower neuromodulatory function. However, with persistent changes in glutamatergic function, learning is engaged that alters the parameters of dopaminergic uncertainty coding and thus affects subsequent inferences (see Figures 3 and 4).

Figure 4

Potential synaptic processes in health and in acute and chronic ketamine exposure. (a) ‘Health’: In the absence of psychotomimetic drugs, information processing at a glutamatergic synapse involves glutamate release from a presynaptic cell (regulated by NMDA receptors and mGluRs), which is incident upon a postsynaptic cell. The number and functionality of postsynaptic receptors on that cell, and the tone of slower, neuromodulatory inputs such as dopamine and acetylcholine, set the ‘prior’ (how much stimulation to expect) and the uncertainty (the level of confidence ascribed to that particular input), respectively. Glial cells regulate the reuptake of synaptic glutamate and its cycling back into the presynaptic cell, also under the control of NMDA receptors and mGluRs, as well as slower neuromodulators (such as noradrenaline). (b) Acute ketamine: Ketamine blocks NMDA receptors, thus impairing the specification of prior expectancies. It has transient effects on slower neuromodulators, such as acetylcholine, thus affecting inference processes and vitiating perception and cognition. Crucially for the model at hand, ketamine administration increases presynaptic glutamate release (via effects on glial glutamate reuptake and noradrenergic signaling), and as such, AMPA receptors are excessively and inappropriately stimulated. Thus, prediction errors are registered inappropriately, inducing delusion-like ideation as aberrant or inappropriate inference: an attempt to make sense of uncertain experiences. (c) Chronic ketamine: With chronic exposure, there is a compensatory increase in the number and function of NMDA receptors. However, glial glutamate reuptake remains impaired and there is a sensitization of slower dopaminergic inputs (eg, to medium spiny neurons in the striatum). As such, the delusion-like ideas characteristic of the acute phase become crystallized as new learning; that is, a new prior.


Acute ketamine, then, would impair the specification of top-down priors (by blocking NMDA receptors); furthermore, it would enhance presynaptic glutamate release (Jackson et al, 2004), engendering aberrant prediction error responses by stimulating AMPA receptors (Moghaddam et al, 1997; Jackson et al, 2004). These effects are primarily responsible for the psychotomimetic responses that ketamine engenders. Acute ketamine may induce striatal dopamine release (Breier et al, 1998; Smith et al, 1998; Vollenweider et al, 2000), although it does not always do so (Kegeles et al, 2000), and typical D2 dopamine-blocking antipsychotic drugs do not alleviate the psychotomimetic effects of acute ketamine (Krystal et al, 2005). However, lamotrigine, which blocks presynaptic glutamate release, does reverse the psychotomimetic effects of ketamine (Anand et al, 2000).

Perhaps the psychosis-like experiences are transient because ketamine is rapidly metabolized by the liver and excreted. Homeostatic mechanisms involving glia retune glutamatergic synapses (Baker et al, 2008), and hence cognition and phenomenal experience return to normal.

Acute ketamine administration does increase cortical cholinergic function (Nelson et al, 2002), perhaps providing a mechanism for the perceptual aberrations and attentional capture that characterize its effects (Pomarol-Clotet et al, 2006). Furthermore, formal learning theories (Pearce and Hall, 1980) and preclinical behavioral neuroscience tell us that increasing cortical acetylcholine engages new explanatory learning and tunes attention to stimuli with unexpected and unpredictable consequences (Sarter et al, 2005). The engagement of cortical acetylcholine to aid uncertain perceptual inference may underpin the referential delusion-like ideas that a narrow, darting focus of attention happens upon under ketamine (Sarter et al, 2005).

These mechanistic differences may explain why acetylcholinesterase inhibitors are psychotogenic (Sarter et al, 2005). Furthermore, according to this model, acute ketamine should not engender hallucinations, because of its effects on cortical acetylcholine and the consequent uncertainty of cortical inference; hallucinations would occur in the context of reduced cholinergic function, when cortical inference is more certain, for example, following ingestion of anticholinergic drugs (Hall et al, 1977) or in disease states associated with reduced cholinergic activity (Collerton et al, 2005).

These primary effects on cortical inference (rather than more permanent learning effects) may explain the degree of insight that ketamine-treated subjects retain; they describe their experiences in relative terms and do not appear to experience longer-term effects of the drug once it has been metabolized (Pomarol-Clotet et al, 2006; Corlett et al, 2010a; see Figures 3 and 4).

On the other hand, chronic ketamine administration involves dysfunction of the homeostatic regulation of glutamatergic synapses such that new learning takes place, both at the level of glutamate receptors and in the dopamine system: chronic NMDA blockade sensitizes the dopamine system (Jentsch and Roth, 1999). Intriguingly, longer-term ketamine administration does not sensitize cortical cholinergic function (Nelson et al, 2002), again potentially explaining why chronic ketamine use is associated with delusions rather than hallucinations (Morgan et al, 2010); see Figures 3 and 4 and also Box 1 for a brief discussion of the effects of binge amphetamine use.

Continued frequent ketamine use is associated with delusions that worsen over time, but not with hallucinations (Morgan et al, 2010). Morgan et al speculate that delusion formation occurs during acute ketamine intoxication, but that delusions crystallize and persist into the drug-free state. We propose that this crystallization involves interplay between sensitized dopamine function in the basal ganglia (especially the associative striatum; Kegeles et al, 2010) and glutamatergic reinforcement of new explanatory priors (Friston, 2005a). Associative striatal dopamine sensitization may also explain the apparent increase in susceptibility to superstitious conditioning in chronic ketamine users (Freeman et al, 2009), as chaotic and hypersensitive dopamine signaling has been implicated in superstitious learning in animal models (King et al, 1984; Shaner, 1999).

Dopamine antagonism, then, particularly with drugs that target D2 dopamine receptors, may ameliorate the psychopathology generated by chronic ketamine use, despite having no effect on the psychopathology induced by acute ketamine administration. A sensitized dopamine system may be prone to aberrant salience attribution (Kapur, 2003), either because of excess uncertainty (associated with bottom-up prediction error) that demands explanation (Pearce and Hall, 1980) or because of excessive top-down salience, based on inappropriately strong conditioned expectations (Mackintosh, 1975). Typical dopaminergic antipsychotics would modulate both of these effects and hence ameliorate both hallucinations and delusions (Hamamura and Harada, 2007).

One further characteristic of repeated ketamine use that we wish to consider is the commonality of experience across exposures; in a phenomenological analysis of repeated ketamine use (Newcombe, 2008), one psychonaut describes common themes and experiences across multiple ketamine administrations. We believe that the same mechanisms may explain the maintenance of delusional themes across episodes of psychosis (even when periods of remission separate episodes by years; Sinha and Chaturvedi, 1989) and perhaps provide a means through which ketamine experiences crystallize into delusions. In brief, the components of a synapse that confer its strength and excitability are completely recycled at frequent intervals (Arshavsky, 2006). One relatively stable and information-rich candidate for the storage of memories across the lifespan is DNA: changes in how the genome can be expressed could mediate the long-term retention of information by an organism (Crick, 1984; Holliday, 1999). Much of the genome may be silenced in a particular cell, whereas other portions will be highly expressed; a process mediated, in part, through changes in proteins called histones that combine with DNA to form chromatin (Levenson et al, 2004). We posit that one effect of repeated ketamine use is chromatin remodeling, such that prior ketamine experiences are repeated on subsequent use.

In addition to altering synaptic connection weights (and perhaps chromatin structure) and thus inducing aberrant learning and memory, we should, of course, highlight that chronic ketamine use may also alter brain structure more macroscopically, for example, by engendering excitotoxic cell death. Olney, Faber, and coworkers found evidence of cortical lesions in rodents following treatment with NMDA receptor antagonists (Farber et al, 1995; Olney et al, 1989; Olney et al, 1991); lesions that impinged upon circuits relevant to schizophrenia (Sharp et al, 2001). Whether these lesions occur in primates has yet to be determined, but we note that structural brain changes following ketamine are a possible mediator of the more marked and persistent delusions associated with chronic self-administration.

In the context of the Bayesian hierarchical model, we speculate that negative symptoms result from a failure to specify coherent expectations with sufficient certainty to justify motivated behavior. Although withdrawal from social interaction may result in part from uncertainty and unpredictability (Stephan et al, 2006, 2009), we propose that negative symptoms such as alogia, amotivation, and anhedonia arise when sufferers cannot predict the positive, hedonic qualities of a valued goal, maintain that prediction online, or use it to guide their behavior (Heerey and Gold, 2007). Again, we might expect particular genes to predispose individuals to particularly severe negative symptoms on ketamine; for example, the val158met polymorphism of catechol-O-methyltransferase is associated with inefficient frontal cortical responses during working memory tasks and a paucity of exploratory decision-making during reward learning (Frank et al, 2009). Chronic ketamine use further compounds inefficient frontal cortical function (Narendran et al, 2005) and is associated with lowered mood (Morgan et al, 2010). This is worth underlining given the recent move toward treating depression with NMDA antagonists and suggestions that chronic administration may prolong the short-lived antidepressant effect of acute ketamine (Machado-Vieira et al, 2009).
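The precision-weighting intuition behind this account can be made concrete with a toy Gaussian belief update (all numbers hypothetical; a sketch of the general scheme, not of any specific published implementation): when the prior on a goal's hedonic value is imprecise, the posterior expectation is dominated by neutral momentary evidence and collapses toward indifference, leaving too weak a prediction to motivate behavior.

```python
def fuse(mu_prior, pi_prior, mu_evidence, pi_evidence):
    """Precision-weighted fusion of two Gaussian beliefs about a goal's
    value; returns (posterior mean, posterior precision)."""
    pi_post = pi_prior + pi_evidence
    mu_post = (pi_prior * mu_prior + pi_evidence * mu_evidence) / pi_post
    return mu_post, pi_post

# A confidently held hedonic prediction (precise prior that value = 1.0)
# survives neutral evidence; an uncertain one collapses toward neutrality.
precise = fuse(mu_prior=1.0, pi_prior=4.0, mu_evidence=0.0, pi_evidence=1.0)
vague = fuse(mu_prior=1.0, pi_prior=0.25, mu_evidence=0.0, pi_evidence=1.0)
print(round(precise[0], 2), round(vague[0], 2))  # → 0.8 0.2
```

On this reading, the deficit is not an absence of value per se but an inability to hold a value prediction with enough precision for it to win out over moment-to-moment noise.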

CLINICAL UTILITY OF THE MODEL

NMDA antagonist model psychoses inspired the development of the first antipsychotic drug without a direct effect on dopamine function: an mGluR2/3 agonist that acts presynaptically to regulate glutamatergic tone, decreasing excess glutamate release and the consequent stimulation of non-NMDA glutamate receptors, thus ameliorating the positive symptoms of psychosis (Patil et al, 2007). Other adjunct treatments have been trialed as a result of the model, for example, lamotrigine, which reverses the neural and phenomenological effects of ketamine (Anand et al, 2000; Deakin et al, 2008). Lamotrigine has modest ameliorative effects as an adjunct treatment in schizophrenia (Dursun and Deakin, 2001). Glycine and D-serine, two compounds that act as co-agonists at the NMDA receptor, increase NMDA receptor function and have clinical utility as adjunct treatments (Heresco-Levy et al, 2004, 2005).

But does our Bayesian analysis suggest any novel treatment options? In the context of the model, delusional beliefs may be conceptualized as inappropriately reinforced prior expectations that are constantly updated with novel information or reconsolidated (Corlett et al, 2009b). One novel approach therefore involves engaging the prior belief and administering a drug that destabilizes it, preventing its reconsolidation (Rubin, 1976). Propranolol, a β-adrenergic receptor antagonist, has been used in this manner in preclinical models to disrupt fear memories (Debiec and Ledoux, 2004) and drug memories (Milton et al, 2008) and recently to attenuate learned fear responses in human subjects (Kindt et al, 2009). With respect to the NMDA antagonist models under consideration, propranolol attenuates the excessive glutamate release engendered by NMDA antagonists (Narimatsu et al, 2002). Propranolol has been used to treat psychosis with varying results (Atsmon et al, 1972; Manchanda and Hirsch, 1986); one reason for the inconsistency may involve variability in the degree to which the prior mediating the psychotic symptoms is engaged at the time of treatment (Lee et al, 2006).

FUTURE DIRECTIONS

The Bayesian conception of psychosis, which we have outlined, provides a framework for us to consider the co-occurrence of positive and negative symptoms, with obvious relevance for endogenous psychotic illnesses. Empirical examinations will either support or disconfirm the model. We feel that there is already sufficient consilience between glutamatergic model psychoses and endogenous psychotic illnesses that they may provide a useful step in developing novel antipsychotic medications. Although ketamine-induced symptoms are not reversed by typical antipsychotic drugs (Krystal et al, 1999), atypical antipsychotics ameliorate them (Malhotra et al, 1997), as do drugs that modulate presynaptic glutamate release (Anand et al, 2000) and synaptic glutamate processing (Krystal et al, 2005). In the context of the framework we have outlined, it may be possible to engage in experimental medicine, targeting specific psychotic symptoms both theoretically and empirically, by identifying individuals who will respond to acute ketamine with a particular profile of symptoms and perhaps using those symptoms as a test bed for novel antipsychotic treatments. By focusing on individual differences in ketamine effects via the Bayesian model of psychosis, we may have a framework within which to explore psychosis translationally. Indeed, genetic association studies of the cognitive and neural bases of the model may well assist our understanding of the biological mechanisms of psychosis. Furthermore, the individual differences in ketamine response that we describe, and their prediction by the engagement of specific neural circuitry and cognitive processes, portend the use of fMRI and cognitive science in clinical decision-making and perhaps targeted, personalized medicine.

CONCLUSION

NMDA receptor antagonists provide a useful model of psychosis following both acute and chronic administration. Acute administration permits the transient and reversible engagement of both positive and negative psychotic symptoms. With repeated chronic (self-) administration, delusions develop and persist even after the ketamine has been metabolized (Morgan et al, 2010).

We outlined a series of findings that explored and exploited individual differences in response to acute ketamine, emphasizing the utility of quantifying brain responses to theoretically relevant cognitive tasks and what such responses might tell us about the brain and behavioral bases of psychotic symptoms.

We interpreted these effects in terms of a hierarchical Bayesian model of cognition, comportment, and brain function. By comparing and contrasting the phenomenology and neurobiology of acute and chronic ketamine administration, we were able to explain, among other things, why ketamine administration does not engender hallucinations (either acutely or chronically), why the psychotomimetic effects of ketamine are not reversed by haloperidol, and why there is no strong association between schizotypy and the acute effects of ketamine, but at the same time, highly schizotypal individuals may be particularly susceptible to chronic ketamine use. Furthermore, we outline a potential explanation for negative symptoms in terms of aberrant predictions and ineffective maintenance of those predictions in service of goal-directed behavior. We find this simplifying framework a very powerful means to relate a number of disparate observations relevant to psychosis. This exercise has generated a number of testable predictions both in terms of the generation of psychosis and strategies that may be used to ameliorate it.