26 April 2013. More than 100 researchers gathered to discuss the early stages of oncoming psychosis at the meeting of the International Prodromal Research Network (IPRN) on 21 April 2013 in Orlando, Florida. Like the rain outside, ideas poured forth in the meeting, organized by Tyrone Cannon of Yale University, New Haven, Connecticut, and Barbara Cornblatt of the Zucker Hillside Hospital, Glen Oaks, New York, as a satellite to the International Congress on Schizophrenia Research. For 11 years, the group has been meeting to share research about the prodrome, the early signs of oncoming psychosis. Although the question of whether to include a prodromal category in DSM-5 has been settled for now (in the negative; see SRF related news story), the researchers made clear that work remains to refine ideas about the prodrome, how best to detect it, and how to intervene.
Paths to psychosis
The first two talks explored different ways to detect people destined to develop psychosis among those who show early signs. Describing two such patients, Diana Perkins of the University of North Carolina, Chapel Hill, challenged the audience to predict which actually developed psychosis. Noting the need for other evidence to help evaluate this question, she referred to existing diabetes or heart disease algorithms that take into account different variables about people to predict the chances of their developing these disorders. “My fantasy is that maybe in five years we can do this for psychosis,” she said.
Perkins then presented new data on the development of a blood test, based on samples taken from 72 people at “clinical high risk” (CHR) before they developed psychosis. Measuring 141 components in blood, many inflammation-related, the researchers eventually culled 16 that differed between those who eventually developed psychosis (n = 32) and those who did not. When values for these 16 components were put into an algorithm, it correctly identified 91 percent of those who became psychotic and 78 percent of those who did not. It also distinguished those who developed psychosis from healthy controls. Future studies will have to test whether this algorithm works in other samples, and Perkins noted that such a blood test would be only one piece of a panel of items that could help evaluate psychosis risk among people already showing early signs, rather than being a screening test for the general population.
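The 91 percent and 78 percent figures correspond to the classifier’s sensitivity and specificity. As a point of clarification, a minimal sketch of how those two rates are computed from binary predictions (the labels and predictions here are illustrative, not the study’s data, and the study’s actual algorithm is not described in the talk summary):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity (true-positive rate) and specificity (true-negative rate)
    for binary labels, where 1 = later developed psychosis, 0 = did not."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    sensitivity = tp / (tp + fn)  # fraction of converters correctly flagged
    specificity = tn / (tn + fp)  # fraction of non-converters correctly cleared
    return sensitivity, specificity

# Hypothetical toy data: 4 converters, 4 non-converters
sens, spec = sensitivity_specificity([1, 1, 1, 1, 0, 0, 0, 0],
                                     [1, 1, 1, 0, 0, 0, 0, 1])
# sens = 0.75, spec = 0.75
```

In Perkins’s sample, sensitivity was 0.91 and specificity 0.78; as Bahn’s comment below underlines, such numbers from a single cohort still need replication.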
In the audience, Sabine Bahn of the University of Cambridge, U.K., noted the difficulties of discerning whether such blood tests index general risk rather than something specific to the process of becoming psychotic. She suggested that repeated blood tests may be more capable of picking up key changes on the way to psychosis than would a single baseline measurement.
Turning to the brain, Daniel Mathalon of the University of California, San Francisco (UCSF), presented his data on using composite measures of brain activity, as measured by electroencephalography (EEG), to distinguish those destined to develop psychosis. When a person listens to a train of auditory stimuli, brain responses vary depending on whether each stimulus was expected. When a stimulus does not match expectations, a negative signal develops, termed “mismatch negativity” (MMN). Measuring MMN in 212 people at CHR for psychosis, he reported that MMN was reduced in those who went on to develop psychosis within two years. This reduction is also found in schizophrenia patients, whereas those who do not “convert” to psychosis and healthy controls maintain robust MMN responses. Because MMN is thought to reflect a form of plasticity at synapses in the brain, Mathalon proposed that the run-up to psychosis may be marked by impaired synaptic plasticity, which would lead to an overabundance of weak synapses. Excessive pruning away of these synapses could then drive conversion to psychosis.
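MMN is conventionally quantified as the deviant-minus-standard difference wave averaged over a post-stimulus window. A minimal single-channel sketch of that computation (the array shapes, labels, and analysis window are illustrative assumptions, not Mathalon’s pipeline):

```python
import numpy as np

def mismatch_negativity(epochs, labels, window):
    """MMN amplitude: mean of the deviant-minus-standard ERP difference
    wave within a post-stimulus time window.

    epochs : array (n_trials, n_samples), single-channel EEG epochs
    labels : per-trial strings, "standard" or "deviant"
    window : (start, stop) sample indices, e.g. covering ~100-250 ms
    """
    labels = np.asarray(labels)
    erp_standard = epochs[labels == "standard"].mean(axis=0)
    erp_deviant = epochs[labels == "deviant"].mean(axis=0)
    diff_wave = erp_deviant - erp_standard  # difference wave
    lo, hi = window
    # MMN is negative in robust responses; reduced (less negative)
    # values index the attenuation reported in converters
    return diff_wave[lo:hi].mean()
```

A reduced (less negative) value from such a measure is what “reduced MMN” refers to in the converter group.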
Noting that stress may push someone toward psychosis, Rachel Loewy of UCSF talked about ways to get a readout of a person’s stress levels. First, Loewy found an increased incidence of trauma in 69 people at CHR for psychosis, compared to 56 healthy controls matched for age and socioeconomic status. As a biological index of stress, Loewy measured cortisol, a stress hormone detectable in saliva: 34 people from the CHR group had higher cortisol levels than 32 controls while enduring a stressful task involving public speaking or math problem solving. The at-risk group tended to rate this experience as more threatening (as opposed to challenging) than controls did, and this rating correlated with cortisol levels. Loewy suggested that people at risk may be sensitized to stress, responding more strongly to stressful situations than controls, and that something like cognitive behavioral therapy (CBT) might help people manage stress by learning to reappraise aversive situations. An audience member suggested that certain interventions, such as school programs against bullying, could limit stressful experiences in a way that could have a real impact in preventing psychosis. Another raised the possibility that the subjective burden of stress may be more important than the actual number of stressful events a person experiences.
Citing the benefits of cognitive remediation therapies for people who have already developed schizophrenia, Matcheri Keshavan of Harvard Medical School, Boston, Massachusetts, explored whether such a strategy would be useful to a small at-risk group (n = 14). A “Cognitive Enhancement Therapy” combining computerized cognitive training and social cognitive training (i.e., emotion recognition) with group and individual therapy (Hogarty et al., 2004) was delivered for 40 hours over eight weeks. This significantly improved performance in the training categories, including processing speed and visual learning. These gains correlated with functional improvements measured after training, as well as with a decrease in brain activation measured by fMRI—something that might reflect increased brain efficiency. Although it is not clear yet whether this could forestall psychosis, Keshavan and others suggested that the emphasis on whether interventions bring down rates of conversion to psychosis overlooks other informative outcomes, such as depression symptoms, negative symptoms, and other functional deficits. While an intervention may not prevent conversion, it could still mitigate the severity of someone’s illness.
Kristin Cadenhead of the University of California, San Diego, followed with a review of some of the 11 randomized, controlled trials of interventions for CHR done so far, including CBT, antipsychotics, and omega-3 fatty acids. These interventions had moderate effects, meaning they seemed to reduce the number of people who went on to develop full-blown psychosis. Cadenhead reviewed other avenues worth trying, including treating biomarkers of illness such as gray matter loss or suppression of μ brain waves. Because only a subset of at-risk people go on to convert to psychosis, she emphasized that approaches combining pharmacological and non-pharmacological treatments may make more sense.
Whatever the treatment route chosen, it should be taken early, was the message from Thomas McGlashan of Yale University, who spoke on behalf of Tor Ketil Larsen. McGlashan described the TIPS (Treatment and Intervention in Psychosis) early-detection study published last year, which found that treatment-as-usual given in the early stages of psychosis produces durable reductions in negative and cognitive symptoms, lasting up to 10 years, compared to giving the same treatment later (Hegelstad et al., 2012). “Early timing appears to have a lasting difference,” he said.—Michele Solis.