Understanding Emotional Language Content

SECTION II

Understanding Emotional Language Content (Edited by Johanna Kissler)

Anders, Ende, Junghöfer, Kissler & Wildgruber (Eds.) Progress in Brain Research, Vol. 156 ISSN 0079-6123 Copyright © 2006 Elsevier B.V. All rights reserved

CHAPTER 8

Emotional and semantic networks in visual word processing: insights from ERP studies

Johanna Kissler, Ramin Assadollahi and Cornelia Herbert

Department of Psychology, University of Konstanz, P.O. Box D25, D-78457 Konstanz, Germany

Abstract: The event-related brain potential (ERP) literature concerning the impact of emotional content on visual word processing is reviewed and related to general knowledge on semantics in word processing: emotional connotation can enhance cortical responses at all stages of visual word processing following the assembly of the visual word form (up to 200 ms), such as semantic access (around 200 ms), allocation of attentional resources (around 300 ms), contextual analysis (around 400 ms), and sustained processing and memory encoding (around 500 ms). Even earlier effects have occasionally been reported with subliminal or perceptual threshold presentation, particularly in clinical populations. Here, the underlying mechanisms are likely to diverge from the ones operational in standard natural reading. The variability in timing of the effects can be accounted for by dynamically changing lexical representations that can be activated as required by the subjects' motivational state, the task at hand, and additional contextual factors. Throughout, subcortical structures such as the amygdala are likely to contribute to these enhancements. Further research will establish whether or when emotional arousal, valence, or additional emotional properties drive the observed effects and how experimental factors interact with these. Meticulous control of other word properties known to affect ERPs in visual word processing, such as word class, length, frequency, and concreteness, and the use of more standardized EEG procedures is vital. Mapping the interplay between cortical and subcortical mechanisms that give rise to amplified cortical responses to emotional words will be of highest priority for future research.

Keywords: emotion; semantics; word processing; event-related potentials; healthy volunteers; clinical populations

Introduction

Influential dimensional approaches to the study of emotion derive their basic dimensions from analyses of written language. Osgood and collaborators, using the 'semantic differential' technique, were the first to empirically demonstrate that affective connotations of words are determined by three principal dimensions, namely evaluation, potency, and activity (Osgood et al., 1957), the first two accounting for the majority of the variance. On the semantic differential, a word's evaluative connotation is determined by ratings on a multitude of seven-point scales, spanned by pairs of antonyms such as hot–cold, soft–hard, happy–sad, etc. Factor analyses of the judgments of many words on such scales, given by large subject populations, reveal a three-dimensional evaluative space, whose structure has been replicated many times and across different cultures (Osgood et al., 1975). Figure 1 provides an illustration of the evaluative space determined by the semantic differential.

Corresponding author. Tel.: +49-7531-884616; Fax: +49-7531-4601; E-mail: [email protected] DOI: 10.1016/S0079-6123(06)56008-X


Fig. 1. A three-dimensional affective space of connotative meaning as postulated by Osgood (1957, 1975). The figure depicts the three orthogonal bipolar dimensions, evaluation (E), activity (A) and potency (P), and gives examples of prototypical words for each dimension and polarity. (Adapted from Chapman, 1979.)
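To make the factor-analytic logic behind the semantic differential concrete, the following sketch (not part of the original chapter) simulates a ratings matrix of words on bipolar scales and recovers a three-dimensional solution; the data, scale count, and variable names are invented for illustration.

```python
# Illustrative sketch only (not from the chapter): recovering Osgood-style
# dimensions from semantic-differential ratings. The ratings matrix is simulated;
# in a real study it would contain mean ratings of many words on bipolar
# seven-point scales such as good-bad, strong-weak or active-passive.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_words, n_scales = 300, 12

# Simulate three latent dimensions (evaluation, potency, activity) and project
# them onto 12 bipolar scales with added noise.
latent = rng.normal(size=(n_words, 3))                    # E, P, A score per word
true_loadings = rng.normal(scale=0.8, size=(3, n_scales))
ratings = latent @ true_loadings + rng.normal(scale=0.5, size=(n_words, n_scales))

fa = FactorAnalysis(n_components=3, rotation="varimax")   # requires scikit-learn >= 0.24
scores = fa.fit_transform(ratings)    # word coordinates in the 3-D evaluative space
print(scores.shape)                   # (300, 3)
print(fa.components_.shape)           # (3, 12): loading of each scale on each factor
```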

Osgood's principal dimensions are at the core of other circumplex theories of affect (Lang, 1979; Russell, 1980). For instance, Lang and colleagues posit that human affective responses are determined by the dimensions valence, arousal, and dominance; again, the first two have the largest impact, leading these authors to propose a model of affect defined by the dimensions of arousal and valence. Within such a two-dimensional affective space, different classes of stimuli such as pictures, sounds, and words cluster in a U-shaped manner: highly arousing stimuli usually receive valence ratings as either highly pleasant or highly unpleasant, whereas low-arousing material is generally regarded as more neutral with regard to valence (see Fig. 2 for an illustration derived from the ratings of German word stimuli). The impact of valence and arousal on various central nervous and peripheral physiological indicators of affective processing has been repeatedly validated using picture and sound media (Lang et al., 1993; Bradley and Lang, 2000; Junghöfer et al., 2001; Keil et al., 2002). Word stimuli assessed for perceived arousal and valence, such as the 'Affective Norms for English Words', ANEW (Bradley and Lang, 1998), have also been put to use in physiological research (see e.g., Fischler and Bradley, this volume), although so far the resulting evidence seems more restricted than that for pictorial material.


Fig. 2. Illustration of a two-dimensional affective space spanned by the dimensions arousal and valence. Examples of the relative position of some German adjectives and nouns used in our studies are given in English translation. Along the x- and y-axes are depicted the arousal and valence scales of the Self-Assessment Manikin (SAM; Bradley and Lang, 1994) used to rate individual emotional responses to the words.
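The U-shaped distribution described above can be summarized numerically. The sketch below, using simulated ratings rather than the German norms referred to in the text, fits a quadratic relation between SAM valence and arousal; a clearly positive quadratic coefficient indicates that arousal rises toward both the pleasant and the unpleasant extremes.

```python
# Illustrative sketch with simulated ratings (not the German norms used in the
# chapter): quantify the U-shaped valence-arousal relation by regressing arousal
# on the deviation of valence from the neutral midpoint of the 9-point SAM scale.
import numpy as np

rng = np.random.default_rng(1)
valence = rng.uniform(1, 9, size=200)                         # SAM valence ratings
arousal = 1 + 0.55 * (valence - 5) ** 2 + rng.normal(scale=0.8, size=200)
arousal = np.clip(arousal, 1, 9)                              # keep on the 9-point scale

# Quadratic fit: arousal ~ a*(valence-5)^2 + b*(valence-5) + c
a, b, c = np.polyfit(valence - 5, arousal, deg=2)
print("quadratic coefficient:", round(a, 2))                  # clearly > 0 -> U shape
```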

Emotions are generally viewed as culturally universal, largely innate, evolutionarily 'old' signaling and activation systems, residing in 'old', subcortical parts of the brain. They are designed to promote survival in critical situations, i.e., to signal and activate fight, flight or feeding, attachment, and sexual behavior. Reading and writing, by contrast, represent comparatively recent developments in the history of mankind, and in individual development these are acquired much later than oral language. Consequently, these skills are often regarded as cultural achievements, but during the acquisition of written language considerable regional specialization emerges in the human brain (Warrington and Shallice, 1979, 1980; Dehaene et al., 2005). Reading acts as a secondary process that utilizes the processing capabilities of the earlier acquired auditory language system once the analysis of visual word form is completed (Perfetti, 1998; Everatt et al., 1999; Perfetti and Sandak, 2000). Language and emotion share a communicative function, but linguistic communicative functions are obviously not restricted to the communication of affect.


How do the 'emotional brain' and the 'linguistic brain' interact when written words with emotional connotations are encountered? Emotion theories posit that linguistic expressions are stored within semantic networks that encompass links to all aspects of their linguistic and pragmatic usages and emotional connotations (Lang, 1979; Bower, 1981). Thus, the word 'gun', for example, not only represents the object itself, but also includes links to its operations, use, purposes, and their consequences as well as their emotional evaluation (Bower, 1981). A converging view is shared in neurolinguistics (Pulvermüller, 1999) and cognitive semantics (Barsalou, 1999): all information related to a word is stored in a dynamic network. Recent evidence suggests that subnetworks¹ representing different aspects of a word's lexical representation can be separately and dynamically activated. For instance, differential neuromagnetic activations of semantic subnetworks have recently been shown for subclasses of plant or animal names (Assadollahi and Rockstroh, 2005). Moreover, biasing contextual constraints can affect the timing of access to the dominant vs. subordinate meaning of homonyms (Sereno et al., 2003), challenging the modular view that in word processing initially all lexical entries have to be exhaustively accessed. Functional divisions of the semantic system mirroring functional divisions in the organization of the cortex have repeatedly been shown for verbs denoting different types of actions (Pulvermüller et al., 2001b; Hauk et al., 2004). Investigating verbs pertaining to movements carried out with different parts of the body, these authors demonstrate that the meaning of action words is reflected by the correlated somatotopic activation of motor and premotor cortex. These patterns of coactivation presumably reflect individual learning history, where the so-called referential meaning has been acquired by repeated coactivation of the body movement and the descriptive speech pattern, for instance when a child observes or carries out a gesture such as throwing and simultaneously hears the caregiver say the respective word. Later, in the acquisition of written language, this phonological code is mapped onto the visual word form (Perfetti and Sandak, 2000).

¹ The terms sub-network or sub-representation as used here are not necessarily intended to imply a fixed hierarchical ordering of the neural networks coding for different aspects of semantics, although a certain degree of hierarchical ordering may indeed exist. Instead, sub-network or sub-representation refers to the fact that different neural networks are likely to code for different aspects of a word's meaning, such as animacy, emotional connotation, grammatical gender, etc.

For emotional concepts, Lang et al. (1993, 1994) assume that not only associated semantic but also physiological and motor response information is coactivated in associative networks. Figure 3 illustrates such a multilevel network representation of an emotional scene, encompassing a semantic code of the given situation as well as associated motor and physiological responses. Thus, the semantic network that codes for 'emotional semantics' could include the neuronal circuitry processing the associated emotion (see also Cato and Crosson, this volume, for a related suggestion).

How does emotional content influence different stages of visual word processing? Here, the literature is sparse. Event-related brain potentials (ERPs), the scalp-recorded averaged synchronized activity of several thousand cortical pyramidal cells, have successfully been used to delineate different stages of visual word processing (for reviews see e.g., Posner et al., 1999; Tarkiainen et al., 1999). A closer look also reveals that a considerable number of electrophysiological studies of emotional processing have employed visually presented word stimuli. Some, particularly early, studies have used the semantic differential as their theoretical vantage point. In fact, in the wake of Osgood's studies, the examination of ERP correlates of emotional semantics generated substantial research interest (Chapman et al., 1978, 1980; Chapman, 1979; Skrandies, 1998; Skrandies and Chiu, 2003). More recently, two-dimensional valence × arousal models have been used as a framework for research (see Fischler and Bradley, this volume). However, many studies used words with emotional connotations as experimentally convenient instances of a broader class of emotional events (Anderson and Phelps, 2001; Dijksterhuis and Aarts, 2003) or, conversely, as a semantic class without much reference to any particular theory of language and/or emotion (Begleiter and Platz, 1969).

So far, little systematic knowledge has been gathered on the relationship between the emotion and language systems in visual word processing and possible implications for the underlying neural implementation of meaning.

Fig. 3. A network representation of a complex emotional scene (exam situation) illustrates how, in dynamic emotional processing, perceptual, semantic and response systems are interactively linked. Activation on any level of this system can spread to other subsystems. (Adapted after Lang, 1994.)

ERP recordings have an excellent temporal resolution, allowing for a fine-grained analysis of the temporal sequence of different processing stages. Their spatial resolution is more restricted, and inferences from the spatial distribution of scalp-measured ERPs to their neural generators can only be made with caution. The number of electrodes and the recording reference used influence the probability of finding effects, the generalizability of these findings, and the accuracy of spatial localization (see also Junghöfer et al., this volume).

The present review will summarize and systematize existing studies on the role of emotion in visual word processing, both in healthy volunteers and in clinical populations, and relate this evidence to the available knowledge on ERP indices of stages of visual word processing. First, we will address studies reporting early effects of emotional content on ERP responses, occurring within the first 300 ms after stimulus onset, separately for healthy volunteers and clinical populations; second, we will review effects of emotional content on the late ERPs to visual words; and third, we will discuss how an interplay of subcortical and cortical mechanisms of emotion and word processing may give rise to the observed effects. To facilitate comparisons between studies, the main methodological parameters and results concerning effects of emotional content, as stated in the reviewed studies, are summarized in two tables in the appendix. Table A1 describes studies with healthy volunteers; Table A2 describes work with clinical populations. There, it becomes immediately apparent that the studies described vary considerably in theoretical approach and methodology.
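As a minimal illustration of the ERP logic underlying the reviewed studies (trial averaging and condition difference waves), the sketch below uses placeholder NumPy arrays; the sampling rate, epoch layout, and trial counts are assumptions, not parameters of any particular study discussed here.

```python
# Minimal sketch of the ERP logic behind the reviewed studies: single-trial epochs
# are averaged per condition and a difference wave (emotional minus neutral) is
# formed. Sampling rate, trial counts and array shapes are placeholder assumptions.
import numpy as np

fs = 500                                      # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.8, 1 / fs)          # -100 to 800 ms around word onset

def erp(epochs: np.ndarray) -> np.ndarray:
    """Average trials x channels x samples over trials; residual noise shrinks
    roughly as 1/sqrt(number of trials), which is why many repetitions are used."""
    return epochs.mean(axis=0)

emotional_epochs = np.random.randn(80, 64, times.size)   # placeholder single trials
neutral_epochs = np.random.randn(80, 64, times.size)

erp_emotional = erp(emotional_epochs)                     # channels x samples
erp_neutral = erp(neutral_epochs)
difference_wave = erp_emotional - erp_neutral             # e.g., inspected around 250 ms
```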

Early effects — occurring within 300 ms after stimulus onset

Healthy volunteers

The extent to which early exogenous components of the human event-related potential are subject to modification by nonphysical stimulus characteristics is a matter of ongoing controversy. In visual word processing, a traditional view holds that within the first 150–200 ms after a word has been presented, specific perceptual features of written words but no meaning-related attributes are extracted (Schendan et al., 1998; Posner et al., 1999), and for many years the N400 potential, a centro-parietal negativity arising around 400 ms after stimulus onset, has been viewed as 'the' index of semantic processing (Kutas and Federmeier, 2000). Using other types of visually presented stimuli with emotional content, such as faces or pictures, remarkable ERP differences between emotionally significant and neutral stimuli have been found within the first 300 ms after stimulus onset, some even within the first 100 ms. In his single-subject study of ERP responses to emotional words, Lifshitz (1966) failed to find a visually impressive differentiation between emotional and neutral words within the first 500 ms after word onset, although upon visual inspection the difference between the likewise presented erotic and neutral line drawings was sizeable. However, in the meantime there is a considerable body of evidence indicating that even very early ERP responses can diverge between emotional and neutral words. Thus, probably the first quantitative study on the effects of emotional content on ERP indices of word processing found differences between negative 'taboo words' and neutral words already within the first 200 ms after word presentation (Begleiter and Platz, 1969). Notably, this study was entitled 'Cortical evoked potentials to semantic stimuli', expressing a view of emotion as a vital part of semantics. The technological standards at the time were not very sophisticated and the authors
recorded only from a single right occipital electrode (O2), but ERP differences due to emotional content appeared in the ERP tracings already in the P1–N1 complex: twenty repetitions of each of two clearly negative 'four-letter words', presented minimally above threshold, led to larger responses than twenty repetitions of each of the words 'tile' and 'page'. This pattern held for both a passive viewing condition and a naming condition. Several years later, Kostandov and Arzumanov (1977) contrasted ERP responses to both subliminally and supraliminally presented 'conflict' and neutral words. In the subliminal condition, the earliest differences between 'conflict' and neutral words appeared around 200 ms after stimulus onset. Subliminally presented 'conflict words', apparently relating to the subjects' relationship conflicts caused by jealousy, led to larger N200 responses than neutral words. Supraliminally presented words led to later differentiations starting around 300 ms, in the P3a window. Unfortunately, the description of the study is somewhat sparse on details regarding the materials used. Chapman et al. (1978) sought to electrophysiologically validate Osgood's connotative dimensions of meaning (see Fig. 1), namely evaluation (E), potency (P), and activity (A). They recorded from one midline electrode (CPz) referenced to linked mastoids while subjects had to name briefly flashed (17 ms) words. Twenty words each from the extreme ends of the three connotative dimensions, E, P, and A, yielding six semantic classes (E+/-, P+/-, and A+/-), were used as stimuli. Random sequences of all these words were presented 12–20 times to the subjects (until a set criterion of artifact-free trials was entered into the average). Although the exact multivariate analysis of the data in the original report is somewhat hard to reconstruct, in essence Chapman et al. (1978) were able to statistically differentiate between all six connotative categories using ERP data from a single central channel. Whereas the different connotative dimensions are not mapped onto any specific ERP components, in the original grand-averages a second positive peak at around 260 ms after word onset is noticeable that clearly differentiates between the extreme ends (+ and -) of all three dimensions but not so much among E, P, and
A. In general, the material in this study was well controlled for physical aspects but apparently not for other linguistic attributes such as word frequency, word class, or concreteness. Replication studies by the same group investigated the effects of explicit affective rating tasks on ERPs (Chapman, 1979) and directly compared the effects of affective rating and word naming in one study, yielding similar results (Chapman et al., 1980). Thus, a series of studies demonstrated reliable statistical differentiation between six connotative categories within 500 ms after stimulus onset on the basis of ERP data from a single channel. Extending their 1969 study across a larger selection of stimuli, Begleiter et al. (1979) recorded ERPs elicited by unpleasant, pleasant, and neutral words, as 10 subjects had to either identify the last vowel in a presented word or give their personal affective evaluation of the word. Recordings were made from three electrodes over each hemisphere, referenced to linked mastoids. However, results are reported only for two of these electrodes, namely P3 and P4. Stimuli were the 62 most unpleasant, 62 most neutral, and 62 most pleasant five-letter words derived from a larger pool of words previously assessed with the semantic differential (Osgood et al., 1957). Words were presented very briefly (20 ms). The words’ emotional meaning affected N1–P2 peak-to-peak amplitude during emotional evaluation. ERP responses to words evaluated as pleasant, neutral, or unpleasant could be distinguished statistically at both electrodes. Overall, effects of emotional content were somewhat more pronounced over the left hemisphere and were restricted to the affective evaluation condition. At the left hemispheric electrode, ERPs were also generally larger when the words were shown in the emotional evaluation than in the letter-identification task. Unlike the previous studies, Begleiter et al. (1979) was the first to provide evidence for a major impact of task, showing early ERP effects of emotional content only during active evaluation. Skrandies (1998), studying ‘ERP correlates of semantic meaning’, used an approach conceptually similar to Chapman’s (1978, 1979, 1980). A pool of 60 nouns representing the bipolar extremes on Osgood’s E, P, and A dimensions were selected and presented in a rapid serial visual presentation
(RSVP) design, i.e., a continuous stream of alternating words without interstimulus interval. Skrandies (1998) used a comparatively slow RSVP design presenting each word for 1 s. Ten stimuli per category and polarity were selected and each stimulus was repeated 40 times, yielding 400 averages per category, thus optimizing the signal-to-noise ratio. Subjects were instructed to visualize the words and to remember them for a subsequent memory test in order to ensure active engagement with the stimuli. EEG was recorded from 30 channels referenced to an average reference. Brain responses between the emotional word categories differed in six distinct time windows, with respect to either their peak latency, associated global field power or the location of the centroids of the scalp distribution. Remarkably, most of the differences occurred within the first 300 ms after word presentation, starting with P1 at around 100 ms. These results were recently extended cross-culturally in a virtually identical study of affective meaning in Chinese, yielding a somewhat different but equally complex pattern of differences depending on emotional word content, which were restricted to the first 300 ms after word onset (Skrandies and Chiu, 2003). Together with Begleiter and Platz (1969), Skrandies studies report probably the earliest meaning-dependent differentiation between words. Consequently, these studies are frequently cited in the visual word processing literature, albeit without reference to the particular emotional semantic contents used. Schapkin et al. (2000) recorded ERPs as subjects evaluated nouns as emotional or neutral. The stimuli consisted of a total of 18 words, 6 pleasant, 6 unpleasant, and 6 neutral, which were matched across emotional categories for word length, frequency, concreteness, and initial letters. Words were presented peripherally, to the left and right visual fields for 150 ms, while the EEG was recorded from 14-linked mastoid referenced electrodes, eight of which were consistently analyzed. Stimuli were repeated 32 times, 16 times in each visual field. The earliest effect of emotional significance of the words on ERP responses was observed in the P2 window, peaking at 230 ms. At bilateral central sites, P2 responses to pleasant words were larger than responses to unpleasant and neutral ones. While the P2 response was generally larger
over the left hemisphere, the emotion effect was not lateralized. Similar effects of emotional content on ERP responses to words were also observed in later time windows (see below). As already suggested by Kostandov and Arzumanov (1977), the powerful effects of emotional connotation on brain responses appear to extend even below the limits of conscious perception. Bernat et al. (2001), recording from six-linked mastoid referenced scalp electrodes, report a differentiation between both subliminally and briefly (40 ms) presented unpleasant and pleasant adjectives at lefthemispheric electrode positions already in the P1 and N1 time ranges, as participants simply maintained fixation on a central cross and viewed the computer screen without an explicit behavioral response being required. In the P1 and N1 windows, unpleasant adjectives lead to larger ERP responses in the left hemisphere. Overall larger responses to unpleasant as compared to pleasant adjectives were obtained in the subsequent P2, P3a, and LPC time ranges, the main effects of emotional content having earlier onsets in the subliminal condition. The affective valence of the stimuli had been determined by assessing a larger pool of words on five bipolar scales from the evaluative dimension of the semantic differential. Both the subsequent ERP study subjects and an independent sample had repeatedly rated the stimuli. ERPs to 6 repetitions of the 10 most extremely pleasant and unpleasant as well as to 12 neutral words were recorded. Thus, study subjects had considerable experience with the words. Also, it is unclear, whether the stimuli were assessed on other potentially relevant emotional or linguistic dimensions such as arousal and dominance or word length, frequency, and abstractness. Still, these results as well as data from Kostandov and Azurmanov (1977) or Silvert et al. (2004) and Naccache et al. (2005) in principle support the possibility of measurable physiological responses to subliminally presented emotional words and add to the evidence of emotional content-dependent P1 differences (Begleiter, 1969; Skrandies, 1998; Skrandies and Chiu, 2003). Recently, Ortigue et al. (2004) also reported a very early effect of word emotionality in a dense array ERP study recording from 123 scalp channels. The task consisted of a lexical decision to
very briefly flashed (13 ms) stimuli. Subjects had to indicate which of two simultaneously presented letter combinations in both visual fields constituted an actual word. Stimuli were half neutral and half emotional nouns of both pleasant and unpleasant valence. They were matched for word length and frequency and selected from a larger pool of words pre-rated on a bipolar seven-point scale spanning the neutral-emotional continuum. Overall, emotional words presented in the right visual field were classified most accurately and fastest. However, the relative advantage for emotional words was larger for words presented in the left visual field. Using a source estimation approach (LAURA) the authors identified a stable topographic pattern from the spatiotemporal distribution of the ERP data that accounted for the processing advantage of emotional words in the right visual field. This pattern emerged between 100 and 140 ms after stimulus onset, i.e., mostly in the P1/N1 window. Curiously, it was localized to primarily right-hemispheric extra-striate cortex. Surprisingly, no specific neurophysiological correlate of the even more pronounced advantage for emotional words in the left visual field was identified within the 250 ms after stimulus presentation that this study restricted its analysis to. Recent data from our own laboratory also produced evidence for early differences between cortical responses to emotionally arousing (both pleasant and unpleasant) and neutral adjectives and nouns. The word’s emotional content had been predetermined in a separate experiment, obtaining valence and arousal ratings on two ninepoint rating scales (see Fig. 2) from 45 undergraduate students. According to these ratings, highly arousing pleasant and unpleasant and low arousing neutral words were selected. Different subsets of these words were used in three studies where ERPs from 64 scalp sites were measured. Neutral and highly arousing pleasant and unpleasant words matched for word length, frequency, and in one experiment also for concreteness were presented in RSVP designs to subjects instructed to read the words. Across three different stimulus presentation durations (333, 666, 1000 ms) and regardless of word type (adjectives or nouns), a lefthemispheric dominant occipitotemporal negativity
differentiated emotional (both pleasant and unpleasant) from neutral words. This negativity had its maximum around 260 ms after stimulus onset. The influence of stimulus repetition was assessed, but neither habituation nor sensitization was found for the emotional-neutral difference within the five repetitions used (Fig. 4 illustrates the effect for the 666 ms presentation rate). In one of the studies we also manipulated task demands, instructing subjects to attend to and count one of the two word classes (adjective or noun). Interestingly, this manipulation did not affect the enhanced early negativity to emotional words but had a significant impact on the later positivity. Herbert et al. (2006) also recorded ERPs from 64 average-reference linked scalp channels, as 26 subjects evaluated the emotional significance of
highly arousing pleasant and unpleasant as well as neutral adjectives. The affective content of the stimuli had been predetermined in a separate population using the above-described procedure. Words were presented for a relatively long period, namely 5 s. In this study the P2 component was the first index of differential processing of emotional vs. neutral words. This P2 component primarily responded to perceived stimulus intensity/arousal and did not differentiate brain responses to pleasant from those to unpleasant words. The same was true for the subsequent P3a component, but the picture changed for a later LPC component and the simultaneously recorded startle response that were more pronounced for pleasant than for unpleasant and neutral words. This sequence of effects is depicted in Fig. 5.

Fig. 4. Early arousal-driven enhancement of cortical responses to emotional words. Uninstructed reading of both pleasant and unpleasant words in a rapid serial visual stimulation paradigm (RSVP, 666 ms stimulus duration) leads to larger occipitotemporal negativities than reading of neutral words. The effect is illustrated at two occipital sensors (O9, O10) and the scalp topography of the difference potential emotional–neutral words is depicted. Grand-averages from 16 subjects are shown.
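A hypothetical sketch of the kind of stimulus selection and RSVP sequencing described for these studies is given below: highly arousing pleasant and unpleasant words and low-arousing neutral words, roughly matched on length and frequency, arranged as a continuous stream with one word every 666 ms and no interstimulus interval. The file name, column names, and cut-off values are assumptions for illustration only, not the materials actually used.

```python
# Hypothetical sketch of stimulus selection and RSVP sequencing of the kind
# described above: high-arousal pleasant and unpleasant words and low-arousal
# neutral words, roughly matched on length and frequency, shown as a continuous
# stream (666 ms per word, no interstimulus interval). File name, column names
# and cut-offs are invented for illustration.
import random
import pandas as pd

words = pd.read_csv("word_norms.csv")   # columns: word, valence, arousal, length, frequency

pleasant = words[(words["valence"] >= 7) & (words["arousal"] >= 6)]
unpleasant = words[(words["valence"] <= 3) & (words["arousal"] >= 6)]
neutral = words[words["valence"].between(4, 6) & (words["arousal"] <= 4)]

# Crude matching check on length and frequency before building the stream
for name, subset in [("pleasant", pleasant), ("unpleasant", unpleasant), ("neutral", neutral)]:
    print(name, subset["length"].mean(), subset["frequency"].mean())

stream = pleasant["word"].tolist() + unpleasant["word"].tolist() + neutral["word"].tolist()
random.shuffle(stream)
trials = [{"word": w, "duration_ms": 666} for w in stream]   # one word every 666 ms
```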


Fig. 5. Difference maps of cortical activation for emotional minus neutral words in a covert evaluation task. Averaged activity in three time windows is shown: P2 (180–280 ms), P3 (280–400 ms), and LPC (550–800 ms). For P2 and P3 both pleasant and unpleasant words are associated with larger positivities than neutral ones. In the LPC window only processing of pleasant words diverges from neutral. The time course of the activity is shown at electrode Pz. Grand-averages from 26 subjects are shown.
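One common way to quantify component effects like those in Fig. 5 is to average the ERP within a priori time windows at a given electrode. The sketch below applies the windows named in the figure caption (P2: 180–280 ms, P3: 280–400 ms, LPC: 550–800 ms) to placeholder data at Pz; it illustrates the general procedure, not the authors' analysis code.

```python
# Sketch of a common way to quantify component effects such as those in Fig. 5:
# mean amplitude per condition in a priori windows (P2: 180-280 ms, P3: 280-400 ms,
# LPC: 550-800 ms) at a parietal electrode. ERP arrays here are placeholders.
import numpy as np

fs = 500
times = np.arange(-0.1, 1.0, 1 / fs)          # seconds relative to word onset
windows = {"P2": (0.180, 0.280), "P3": (0.280, 0.400), "LPC": (0.550, 0.800)}

def mean_amplitude(erp_channel: np.ndarray, start: float, stop: float) -> float:
    """Mean voltage of one channel's ERP within [start, stop] seconds."""
    mask = (times >= start) & (times <= stop)
    return float(erp_channel[mask].mean())

erp_pz = {"pleasant": np.random.randn(times.size),     # placeholder ERPs at Pz
          "unpleasant": np.random.randn(times.size),
          "neutral": np.random.randn(times.size)}

for component, (start, stop) in windows.items():
    amps = {cond: mean_amplitude(wave, start, stop) for cond, wave in erp_pz.items()}
    print(component, amps)
```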

Clinical studies

One of the first studies to use emotional words as a tool to address processing biases (or a lack thereof) in clinical populations is Williamson et al. (1991), who investigated behavioral and cortical responses to pleasant, unpleasant, and neutral words in psychopathic and nonpsychopathic prisoners. Subjects had to detect words in a sequence consisting of words and nonwords while their EEG was being recorded from five scalp positions referenced to linked mastoids. Stimuli were presented vertically for 176 ms, separately to either visual field, and repeated three times. Stimuli had been matched for length,
number of syllables, word frequency, and concreteness but differed in emotional connotation. Nonpsychopathic subjects had faster reaction times and larger P2 responses to both pleasant and unpleasant emotional words than to neutral ones. These differences induced by emotional arousal extended into the late positive component (LPC) time range in controls but were completely absent in psychopaths. Weinstein (1995) assessed correlates of enhanced processing of threatening and nonthreatening verbal information in university students with elevated or normal trait anxiety levels. Subjects read sentences with threatening and pleasant content that served as primes for subsequently presented threat related,
neutral, or pleasant words. ERPs in response to the target words were assessed at Fz, Cz, and Pz, as subjects had to decide whether the target word contextually fit the previously shown sentence. Highly anxious subjects were reported to exhibit a larger frontal N1 in the threat-priming condition and an enhanced P400 (i.e., reduced N400, see below) in a later time window. In retrospect, a number of methodological problems seem to exist in this study or the data presentation may contain errors. For instance, a large ERP offset at baseline is shown, the presented condition means occasionally do not seem to correspond to the ERPs displayed and information on linguistic properties of the stimuli used is missing. However, taken at face value, the results indicate heightened selective attention to (N1) and facilitated semantic integration of (N400/P400) threat-related information in students with elevated trait anxiety levels. A similar pattern of early ERP differences was also found in a study investigating the processing of pain-, body-related, and neutral adjectives in healthy volunteers and prechronic pain patients. Knost et al. (1997) report enhanced N1 responses to the pain-related stimuli at a left frontal sensor (F3) in the patient group. ERPs had been recorded from 11 mastoid linked scalp positions while subjects had to name words that were presented at the individually determined perceptual threshold. In both groups, pain- and body-related words produced larger positivities than neutral ones in a later time window (600–800 ms) and were also associated with larger startle eye-blink responses on separately administered startle probe trials. The authors interpret their findings as evidence for preconsciously heightened attention to unpleasant, pain-related stimuli in the patients. These results were paralleled in an analogous study with chronic pain patients (Flor et al., 1997). Chronic pain patients had a larger frontal N1 response to pain-related words than comparison subjects. Additionally, a general hemispheric asymmetry emerged: N1 responses were larger over the right for pain-related words and over the left side of the head for neutral words. The enhanced responses for painrelated words in the patient group were also visible in a centro-parietally maximal N2 component. In the P2 window, right hemispheric responses to pain
words were likewise larger in the patients. In contrast to the first study, no differential effects of emotional category were observed in subsequent time windows (P3 and LPC). The ERP results are taken to reflect heightened preconscious allocation of attention (N1) and stimulus discrimination (N2) to disorder-related words in pain patients, but show no evidence of further evaluative processing. In a similar vein, Pauli et al. (2005) studied cognitive biases in panic patients and healthy volunteers, analyzing ERP responses from nine scalp channels to panic-related unpleasant and neutral words that were presented, in separate runs, at individually determined perceptual thresholds and for 1000 ms. Early ERPs differentiated panic patients from comparison subjects. At threshold presentation, patients showed two enhanced early frontal positivities in response to panic words, one between 100 and 200 ms (P2) and the other (P3a) between 200 and 400 ms post-stimulus onset. Both early effects were absent at the longer exposure duration and did not occur at all in the control group. Interestingly, subsequent positivities between 400 and 600 ms as well as between 600 and 1000 ms differentiated panic words from neutral words in both groups and for both presentation durations. This pattern of data resembles the results by Knost et al. (1997). Kissler and colleagues (in preparation) assessed processing biases in depressed patients and comparison subjects using the above-described RSVP paradigm, recording from 256 scalp electrodes and comparing the amplitude and scalp distribution of the previously described early negativity to pleasant, unpleasant, and neutral adjectives matched for length and frequency. Around 250 ms after word onset (see Fig. 3), comparison subjects displayed the above-described left-hemispheric dominant enhanced negativity for emotional words, pleasant and unpleasant alike. Depressed patients, by contrast, exhibited this enhanced negativity solely in response to the unpleasant words and only in the right hemisphere.

Comparing early emotional and early semantic processing

In sum, numerous studies have found early (<300 ms) amplifications of ERPs in response to
words with emotional content compared with neutral words. The occurrence of such effects is remarkable since controlled conscious processing has been suggested to arise only with the P3/N400 components (Halgren and Marinkovic, 1995), implying that emotion can affect preconscious stages of word processing. Such effects appear to be more pronounced in various clinical populations, reflecting heightened sensitivity and orienting to unpleasant, disorderrelated material (Weinstein, 1995; Flor et al., 1997; Knost et al., 1997; Pauli et al., 2005). Other patients groups, by contrast, seem to selectively lack processing advantages for emotional words, psychopaths showing no cortical differentiation between emotional and neutral words (Williamson et al., 1991) and depressed patients showing preferential processing of unpleasant but not pleasant words (Kissler et al., in preparation). At any rate, processing biases in a number of clinical populations are reflected in their patterns of early responses to pleasant, unpleasant, and neutral words. A debated issue in emotion research pertains to whether, when, and how cortical responses differ as a function of arousal, valence, and additional factors such as dominance, or complex interactions of these. Here, the data must remain somewhat inconclusive, as the studies discussed differed vastly on the dimensions included and assessment methods used. Studies that assessed their materials with the semantic differential found that brain responses differentiate between all dimensions and polarities within the first 300 ms after stimulus onset (Chapman et al., 1978, 1980; Skrandies, 1998; Skrandies and Chiu, 2003). However, the arising pattern of results is so complex that it is hard to gauge the effect of each individual dimension on brain responses. The vast majority of studies report generally larger ERP responses to emotional than to neutral words, with some studies reporting these effects even in the absence of a task that would explicitly require processing of emotional content or other types of semantic access (Begleiter and Platz, 1969; Bernat et al., 2001; Kissler et al., submitted manuscript). However, occasionally, early emotion effects in word processing were found restricted to situations where explicit
processing of the emotion dimension is required by the task (Begleiter et al., 1979). Directly comparing the impact of pleasant vs. unpleasant word content yields mixed results, with some studies finding larger early effects for pleasant words (Schapkin et al., 2000) and others larger effects of unpleasant ones (Bernat et al., 2001). As mentioned above, the subjects’ clinical or motivational status may bias their cortical responses in either direction. Also, task characteristics as well as the timing of stimulus presentation may have an additional impact but so far the influence of these parameters is not well understood. Of note, some of the described effects occurred even before 200 ms after word onset, in a time range in which from a traditional theoretical standpoint meaning-related processing differences would not be expected (Schendan et al., 1998; Cohen et al., 2000). These very early effects of emotional content on ERP indices of visual word processing are rather heterogeneous with regard to timing, locus, and direction. Some of the inconsistencies are probably related to differences in instrumentation and recording methodology, number of electrodes, and choice of reference electrode(s) representing but the most obvious differences. The described studies also differ vastly in the way emotional content of the stimulus material was assessed as well as in the extent to which other, nonemotional, linguistic factors such as word class, length, and frequency or concreteness were controlled. Nevertheless, the bulk of the evidence suggests that, indeed, under certain circumstances the emotional connotation of words can affect even the earliest stages of preconscious sensory processing. Thus, the challenge is to specify under which circumstances such emotional modulation of earliest processing may occur and what the underlying mechanisms are. Two experimental factors arise from the reviewed studies that may contribute to the emergence of very early emotion effects. First, very brief stimulus presentation, near or even below the perceptual threshold (Begleiter and Platz, 1969; Kostandov and Arzumanov, 1977; Chapman et al., 1978, 1980; Flor et al., 1997; Knost et al., 1997; Bernat et al., 2001; Ortigue et al., 2004; Pauli et al.,
2005) and second, repeated presentation of comparatively small stimulus sets (Begleiter and Platz, 1969; Chapman et al., 1978, 1980; Skrandies, 1998; Skrandies and Chiu, 2003; Ortigue et al., 2004). None of the cited studies have explicitly assessed the effect of stimulus repetition on the latency of emotional-neutral ERP differences. Our own studies of repetition effects on negative difference waves distinguishing emotional from neutral content around 250 ms after stimulus onset show no evidence of change within five repetitions. However, some of the cited studies used by far more than five stimulus repetitions and studies of early semantic processing indeed suggest an effect of stimulus repetition on the timing on meaning-related differences in cortical activity: Pulvermu¨ller and colleagues report neurophysiological evidence of differences in semantic processing from 100 ms post word onset (Pulvermu¨ller et al., 2001a). They used a task in which a single subject was repeatedly, over several days, presented with a set of 16 words that she had to monitor and hold active memory as responses to occasionally presented new words were required. Thus, in above threshold presentation, preactivation of the cortical networks coding for meaning by using tasks that require continuous attention to and working memory engagement with the stimuli as well as use of many repetitions may foster earliest semantic processing differences, nonemotional and emotional alike. Further, a recent study on repetition effects in symbol processing found an increase in N1 amplitude (around 150 ms) across three repetitions of initially unfamiliar symbol strings (Brem et al., 2005), supporting the view that stimulus repetition can amplify early cortical responses to word-like stimuli. Thus, repetition effects affecting emotional stimuli more than neutral ones as a consequence of differential initial capture of attention and rapid perceptual learning may account for some of the very early ERP effects in emotional word processing. Early effects of emotional content on brain responses to subliminally or near-subliminally presented stimuli have occasionally been accounted for by fast, subcortical short-cut routes (see Wiens, this volume, for a discussion of issues of subliminal stimulus presentation). Evidence from animal
experiments and functional neuroimaging indeed reveals the existence of such subcortical ‘short-cut’ routes in emotional processing, particularly of fear-relevant stimuli (Davis, 1992; LeDoux, 1995; Morris et al., 1999). Direct pathways from the superior colliculi and the thalamus to the amygdala and the cortex allow for the automatic processing of relevant stimuli outside of conscious awareness, preparing rapid behavioral responses. In humans, this subcortical pathway has been mapped by functional neuroimaging during fear conditioning of subliminally presented faces (Morris et al., 1999) as well as during the subliminal presentation of faces with fearful expressions (Liddell et al., 2005). On a cortical level, its activity may be reflected in transient early responses. Brief, subliminal stimulation with fearful faces has recently been shown to result in a transient enhancement of the N2 and early P3a components, which, however, did not continue in later N4/P3b/LPC windows. For supraliminal stimulation, conversely, N4/P3/LPC but not N2 components responded to emotional content (Liddell et al., 2004), suggesting the operation of a slower, conscious processing and evaluation route. Conceivably, subliminal stimuli receive a temporally limited amount of processing that wanes if it is not confirmed by further supraliminal input, much like in the case of subliminal priming (Greenwald et al., 1996; Kiefer and Spitzer, 2000). Recording ERPs during subliminal and supraliminal semantic priming, Kiefer and Spitzer observe decay of subliminal semantic activation within 200 ms, a delay at which supraliminal priming effects can still be robustly demonstrated. A plastic, maladaptive downregulation of subcortical excitability may account for early responsiveness to unpleasant and disorder-related words in clinical populations (see e.g. Pauli et al., 2005 for supportive evidence). Clearly, at present the operation of a fast subcortical route from the thalamus and the amygdala in emotional word processing that could account for near or subthreshold emotion effects in visual word processing remains a speculative conjecture. A most critical point is that such a mechanism would require at least basic ‘reading abilities’ in the thalamus. While the case for stimuli such as faces or threatening scenes that by some are
assumed to be part of our ‘evolved fear module’ (O¨hman and Mineka, 2001) can be made much more easily, many would have a hard time believing in rather sophisticated subcortical visual capacities allowing for the discrimination of written words. On the other hand, subcortical structures are also subject to modifications by learning, and by the time people take part in experiments they will usually have had about two decades of reading expertise. So far, most of the evidence for the subliminal processing of emotional stimuli is based on studies with aversive material. Accordingly, the abovereviewed studies evidence extremely early effects primarily for unpleasant words (Flor et al., 1997; Knost et al., 1997; Bernat et al., 2001). An alternative explanation of some of these early effects of enhancement by emotional content in visual word processing that would not rely on subcortical by-pass routes and therefore on subcortical vision is reentrant connections between the so-called visual word form area (VWFA) and the emotion processing system. During visual word recognition the earliest activation of an invariant version of the visual word form (i.e. the font-, size-, position-invariant representation of the string of letters) occurs from about 100 ms after stimulus onset (Sereno et al., 1998; Assadollahi and Pulvermuller, 2003). Form invariant, abstract representations of highly overlearned visual objects such as words and faces have been found to originate in the fusiform gyrus (Haxby et al., 1994; Chao et al., 1999; Cohen et al., 2000; Dehaene et al., 2002). Electrophysiological evidence with regard to the onset of word-specific effects of fusiform activity varies, with some authors reporting onsets around 120 ms (Tarkiainen et al., 1999; Assadollahi and Pulvermu¨ller, 2001) and others somewhat later around 170 ms (Bentin et al., 1999; Cohen et al., 2000). Timing differences may be partly attributable to differences in word familiarity across experiments (King and Kutas, 1998). Immediately after access of the visual word form, meaning can be activated: Assadollahi and Rockstroh (2005) showed that activation differences due to super-ordinate categorical differences (animals vs. plants) can be found in left occipitotemporal
areas between 100 and 150 ms after word onset, whereas activation differentiating between subordinate categories was evident only from 300 ms on. Dehaene (1995) observed the earliest ERP differences between words of different categories (verbs, proper names, animals), 250–280 ms after word onset. Semantic category differences were reflected in the scalp distribution of a left occipitotemporal negativity. Using RSVP designs a similar occipitotemporal negativity has been identified. This negativity has been termed the ‘recognition potential’ (RP). It is sensitive to semantic aspects of visual word processing and has its maximum around 250 ms after word (Rudell, 1992; Martin-Loeches et al., 2001; Hinojosa et al., 2004). The ‘RP’ responds to manipulations of depth of semantic analysis, its amplitude increasing with the meaningfulness and task-relevance of the presented word. Source analysis has placed the origin of the RP in the fusiform gyrus (Hinojosa et al., 2001). Results from our laboratory are consistent with the view that a word’s emotional connotation enhances the associated recognition potential (see Fig. 4). Thus, a word’s emotional connotation could be directly connected to the abstract representation of its visual form. Moreover, the combined evidence suggests that emotional content amplifies early stages of semantic analysis in much the same way an instructed attention enhancing processing task would. If enhanced semantic processing is an important mechanism by which emotional content affects visual word processing, again, the question arises as to the causative mechanism: back-projections from the anterior cingulate and the amygdala may give rise to such processing enhancements. In support, amygdala lesions impair the enhanced detection of unpleasant words in an RSVP attentional blink paradigm but not of identification enhancements caused by manipulation of target color (Anderson and Phelps, 2001). Thus, assuming that the amygdala plays a pivotal role in the preferential processing of emotional words as recently suggested by several neuroimaging and lesion studies (Isenberg et al., 1999; Anderson and Phelps, 2001; Garavan et al., 2001; Hamann and Mao, 2002; Naccache et al., 2005), an alternative model to the above described thalamo-amygdalo-cortical route
could account for most of the data. Emotional amplification of semantic processing would occur after initial stimulus identification, caused by bidirectional reentrant communication between cortical regions and the amygdala (Amaral et al., 2003). Crucially, cortical analysis would precede and spark subcortical amplification of cortical processing. Clearly, a theoretically crucial priority for future research is to determine the timing of subcortical mechanisms in relation to cortical enhancement of ERP responses to emotional words. Unlike for other semantic categories such as movement-related verbs (Pulvermüller et al., 2000, 2001b), ERP data for emotional words so far suggest little consistent emotion-specific change in topography (but see Skrandies, 1998; Skrandies and Chiu, 2003; Ortigue et al., 2004). A distinct, emotion-associated topography might point to the existence of a homogeneous emotion lexicon localizable in distinct neuronal populations of the brain, as has been suggested for other semantic categories (Martin et al., 1996). Rather, emotional content seems to amplify cortical word processing, much in the same way as it enhances picture (Junghöfer et al., 2001) or face processing (Schupp et al., 2004). However, functional neuroimaging techniques with better spatial resolution of especially deep cortical and subcortical structures (see Cato and Crosson, this volume) and the more consistent use of dense array EEG studies (Ortigue et al., 2004) may provide additional information.
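The distinction drawn here between mere amplification and a genuinely different topography can be operationalized with global field power (GFP) and global map dissimilarity (GMD) between strength-normalized maps, both standard reference-free measures in EEG topographic analysis. The sketch below illustrates the two measures on placeholder 64-channel maps; near-zero GMD would be consistent with amplification of a common generator configuration rather than an emotion-specific topography.

```python
# Sketch of two reference-free measures that bear on the amplification-vs-topography
# question discussed above: global field power (GFP, overall response strength) and
# global map dissimilarity (GMD) between strength-normalized maps (topographic,
# i.e. generator, differences). Maps are placeholder average-referenced 64-channel
# values at one latency, not data from the reviewed studies.
import numpy as np

def gfp(scalp_map: np.ndarray) -> float:
    """Global field power: spatial standard deviation of an average-referenced map."""
    return float(scalp_map.std())

def gmd(map_a: np.ndarray, map_b: np.ndarray) -> float:
    """Global map dissimilarity: GFP of the difference of GFP-normalized maps."""
    a = (map_a - map_a.mean()) / gfp(map_a)
    b = (map_b - map_b.mean()) / gfp(map_b)
    return float((a - b).std())

map_emotional = np.random.randn(64)     # placeholder topography, emotional words
map_neutral = np.random.randn(64)       # placeholder topography, neutral words

print("GFP emotional:", gfp(map_emotional))
print("GFP neutral:", gfp(map_neutral))
print("GMD:", gmd(map_emotional, map_neutral))   # near 0 -> amplification of a common map
```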

Late components (after 300 ms)

Healthy volunteers

In relation to traditional stages of visual word processing, effects occurring later than 300 ms after word onset are less puzzling than the previously discussed early ones. Enhanced late positivities in response to emotional word content have most often, but not invariably, been found. In several of the already discussed studies reporting early ERP modulations as a function of emotional content, later enhanced positivities associated with the emotional content of the word stimuli are also apparent. For instance, in the data shown by
Chapman et al. (1978, 1979, 1980; see above) a positivity occurring around 300 ms is discernible and appears to be primarily related to the potency dimension extracted from their data. Using materials from the Begleiter et al. (1969, 1979) and Chapman et al. (1978, 1979) studies, Vanderploeg et al. (1987) assessed ERP responses to visually presented emotional (20 pleasant, 20 unpleasant) and 20 neutral words and face drawings (two per emotion category), which were evaluated during viewing. The EEG was recorded from six electrodes referenced to linked ears in 10 male subjects. During viewing, the visual stimuli were presented for either 80 ms (words) or 100 ms (faces). In the conditioning phase, the face drawings were shown for 1500 ms. For both faces and words clearly discernible emotion-category dependent differences in ERP tracings appear from around 300 ms after stimulus onset as parietal positivities. A small but significant effect of emotional connotation of words but not faces on the spatial distribution of the ERP was also evident in the P2 window. Interestingly, although sizeable in appearance, the P3 effect of emotional connotation did not reach statistical significance for words. For a later positivity (positive slow wave/late positive complex) a similar result was obtained; although visible in the presented grand-averages, the difference in parietal positivity between emotional, pleasant and unpleasant, and neutral words does not reach significance in an analysis of the corresponding PCA factors while it does for faces. The authors, in line with Lifshitz’ (1966) early finding, suggest that words may be less powerful (or more heterogeneously evaluated) emotional stimuli than pictures (Vanderploeg et al., 1987). Thus, ERPs from10 subjects may not yield enough statistical power to assess connotation-dependent ERP differences, particularly in studies using comparatively sparse electrode arrays. Also, the perceptual variance between 20 words of a category may be higher than among two faces. Differential effects may, therefore, also result from the greater consistency or higher frequency of occurrence of the faces. Indeed, subsequent studies have found robust effects of emotional connotation on later ERP components. For instance, Naumann et al. (1992) investigated late positive potentials to adjectives
varying in emotional content. Their key idea was that using ERPs it should be possible to dissociate emotional and cognitive processing and that, following LeDoux (1989), cognitive and emotional processing systems should be functionally and neuronally separable as reflected in distinct ERP scalp topographies. In an initial experiment, 30 prerated pleasant, unpleasant, and neutral adjectives were presented to 14 subjects who had to either evaluate the words as pleasant, unpleasant, or neutral (affective task) or determine whether a word’s length was longer, shorter, or equaled six letters (structural task). The EEG was recorded from three midline electrodes (Fz, Cz, and Pz), which were referenced to the left mastoid. ERPs were assessed between 300 and 700 ms after word presentation for the P3 component and between 700 and 1200 ms for the later positive slow wave. For both components and all word categories, ERP amplitudes were more positive going for the affective than for the structural task, particularly at electrodes Fz and Cz. Moreover, P3 amplitudes were also generally more positive in response to emotional than neutral adjectives. The spatial distribution of the subsequent slow wave component varied with emotional category, displaying larger amplitudes at Pz than at Cz and Fz for pleasant and unpleasant words but having equal amplitudes at all three electrodes for the neutral words. This pattern was taken as evidence for the hypothesized separateness of affective and cognitive processing systems. Naumann et al. (1992) replicated this result in a second experiment having calculated an ideal sample size of 106 subjects, minimizing the likelihood of false-negative results. In the replication study a between groups design was used, assigning 53 subjects each to either the structural or the affective task. Again, for both components and all word types, more positive frontal ERPs were obtained for the affective than for the structural task. The P3 component was larger for both pleasant and unpleasant than for neutral adjectives. Furthermore, positivities in response to emotional adjectives were particularly pronounced at Cz and Pz, and this gradient was most evident for the pleasant words. For the slow wave, the scalp distribution likewise exhibited a parietal maximum and this
was more pronounced for the emotional than for the neutral adjectives. Thus, overall, an emphasis on emotional processing (affective task) caused an anterior shift of the scalp distribution. Furthermore, regardless of task, emotional stimuli led to more pronounced parietal peaks than neutral ones. Again, the authors interpreted their results as evidence for a functional and structural distinctiveness of affective and cognitive functions in the human brain as suggested by LeDoux (1989). However, a third demonstration of dissociable affective and cognitive processes in visual word processing failed. Naumann et al. (1997) examined a sample of 54 students in three different tasks, namely letter search (structural task), concrete–abstract decision (semantic task), and an unpleasant–neutral decision (affective task) on a set of nouns varying in emotional content. Fifty-six nouns were used that could be divided into subsets of seven words unambiguously belonging to one of eight possible combinations of these attributes. ERPs were now recorded from nine scalp locations (Fz, Cz, Pz, and adjacent left and right parallels). There was indeed a considerably larger P3 for unpleasant compared to neutral words, albeit the effect was not general but restricted to the affective task. This casts doubt on the assumption that cognitive and emotional word processing operate along completely separable routes and raises the question to what extent larger late positive potentials to emotional word stimuli occur outside the attentional focus. Given the early effects (<300 ms) reported above, it would have been interesting to analyze the data with a focus on early and possibly automatic impacts of emotion on word processing. Naumann et al. (1997) raise a number of conceivable reasons for the reduction of the effect, favoring a familiarity-based explanation. In the experiments that had yielded 'uninstructed' and topographically distinct late responses, subjects had been familiar with the stimuli beforehand. Thus, the affective differences between the stimuli may have already attracted the participants' attention. Moreover, the new design reduced the probability of occurrence for an emotional word, possibly making this dimension less salient, although the converse hypothesis, based on an


oddball effect, would be equally plausible. Moreover, in their initial studies, Naumann et al. (1992) had used adjectives, which may produce somewhat different effects, given that ERP differences between word classes have been reported (e.g. Federmeier et al., 2000; Kellenbach et al., 2002). Fischler and Bradley (this volume) report on a series of well-controlled studies where effects of word emotionality on late positivities are consistently found for both pleasantly and unpleasantly arousing words when the task requires semantic processing of the presented words, but not otherwise. Some studies also show larger late positive potential effects restricted to pleasant words (Schapkin et al., 2000; Herbert et al., 2006), which were missing in Naumann's 1997 study. For instance, Schapkin et al. (2000, see above) report larger P3 and late positive slow wave responses to pleasant compared to both neutral and unpleasant words during evaluative decision. As mentioned above, Herbert et al. (2006) report a study where early ERP responses (P2, P3a) reflected the arousal dimension of the words, differentiating both pleasant and unpleasant from neutral words. The later LPC, however, differentiated pleasant from unpleasant stimuli and was larger for pleasant words (see Fig. 5). Bernat et al. (2001), on the other hand, report enhanced responses to unpleasant as compared to pleasant words across the entire analysis window, up to 1000 ms after word onset, encompassing P3 and LPC. Schapkin et al. (2000) additionally assessed late negativities that were labeled N3 (around 550 ms) and N4 (around 750 ms), finding no effect of emotional content. Data from our own studies do show a small effect of emotional content on N4 amplitudes, with larger N4 to neutral than to emotional words, possibly reflecting a contextual expectancy for emotional content caused by unequal stimulus probabilities. In both studies (Schapkin et al., 2000; Kissler et al., submitted), two-thirds of the stimuli had emotional content (pleasant or unpleasant) and only one-third were neutral.

Late components — clinical studies

Weinstein (1995) is one of the few reports of a modulation of integration of emotionally charged

words following a sentence context. Students with high trait anxiety levels had a reduced N400 (or, in Weinstein's terminology, an enhanced P400) to words following a threatening sentence context, indicating facilitated integration of information within threatening contexts. An alternative interpretation might suggest enhanced sustained attention to threatening information in highly anxious subjects, if the potential described were taken to resemble a P3/LPC component, which is not entirely clear on the basis of the presented data. Personality-dependent changes in late cortical responses to emotional words have subsequently been replicated: Kiehl et al. (1999) tried to extend Williamson et al.'s (1991, see above) results of deficient early (P2) and late (LPC) ERP responses to emotionally charged words in psychopathic subjects. They assessed similarities and differences in the processing of abstract–concrete vs. pleasant–unpleasant words in psychopaths and comparison subjects. To address the processing of emotional words, a pleasant–unpleasant decision task was used, although the initial study had not revealed any valence differences. The altered task was apparently motivated by clinical observations suggesting that psychopaths have difficulty in understanding abstract information and in distinguishing pleasant from unpleasant valence. Stimuli were controlled for word length and frequency, syllable number, and concreteness. Word presentation was extended to 300 ms and words were presented only once, centrally and in a horizontal format. EEG was recorded from nine scalp positions, again with a linked mastoids reference. Analyses now focused on a 300–400 ms post-stimulus window and an LPC window (400–800 ms). Behaviorally, in both groups responses to pleasant words were faster and more accurate than those to unpleasant ones. Cortically, an N350 component differentiated between pleasant and unpleasant words but not between psychopaths and nonpsychopaths, being larger for the pleasant words across groups. Moreover, the valence differentiation was more pronounced over the left hemisphere. In the later time window (400–800 ms), unpleasant words elicited more positive-going brain waves than pleasant ones. This left-hemisphere-dominant differentiation was absent in psychopaths. In


effect, ERPs to unpleasant words were more positive than ERPs to pleasant words across both time windows, and the differentiation was reduced in psychopaths. It is unclear how the ERP patterns relate to the behavioral data (both groups were faster and more accurate for pleasant words). However, more positive-going late potentials for unpleasant stimuli in a binary pleasant–unpleasant decision are in line with data from Bernat et al. (2001). During a lexical decision task, Williamson et al. (1991) reported a larger LPC to emotional than to neutral words in nonpsychopathic subjects and, to a lesser degree, in psychopaths, but no differentiation between the pleasant and unpleasant words. Schapkin et al. (2000) and Herbert et al. (2006), by contrast, report larger late positivities for pleasant words in comparison to both neutral and unpleasant words. Note that neither the Kiehl et al. (1999) nor the Bernat et al. (2001) study reports data on neutral stimuli.
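The component amplitudes compared in the studies just reviewed (P2, P3, N350, LPC, slow wave) are typically obtained by averaging the single-trial EEG into an ERP per condition and electrode and then taking the mean voltage within a predefined latency window. The following minimal sketch (in Python; window boundaries, channel set, and data are purely illustrative and not taken from any particular study) shows the principle:

```python
import numpy as np

def mean_window_amplitude(epochs, times, t_start, t_end):
    """Mean amplitude of the average ERP within [t_start, t_end), per channel.

    epochs: (n_trials, n_channels, n_samples) single-trial EEG in microvolts
    times:  (n_samples,) sample times in seconds relative to word onset
    """
    erp = epochs.mean(axis=0)                     # average across trials -> ERP
    window = (times >= t_start) & (times < t_end)  # samples inside the latency window
    return erp[:, window].mean(axis=1)            # mean over time, one value per channel

# Hypothetical data: one subject, 3 midline channels (Fz, Cz, Pz), 500 Hz sampling
sfreq = 500.0
times = np.arange(-0.1, 1.2, 1.0 / sfreq)
rng = np.random.default_rng(0)
pleasant_epochs = rng.normal(size=(30, 3, times.size))
neutral_epochs = rng.normal(size=(30, 3, times.size))

# P3-type window (e.g., 300-700 ms), analogous to windows used in the reviewed studies
p3_pleasant = mean_window_amplitude(pleasant_epochs, times, 0.3, 0.7)
p3_neutral = mean_window_amplitude(neutral_epochs, times, 0.3, 0.7)
print(p3_pleasant - p3_neutral)  # per-channel condition difference; values like these, per subject, enter an ANOVA
```

Peak-amplitude and peak-latency measures, also reported in several of the reviewed studies, are obtained analogously by locating the extremum of the ERP within the window rather than averaging over it.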

Comparing late emotional and late semantic word processing

A considerable number of studies have found amplifying effects of emotional word content on electrophysiological cortical activity later than 300 ms after word onset. In contrast to the very early effects, they occur in a time range where modulation of cortical responses by word meaning is not unusual in itself. By 300–400 ms after stimulus onset, ERP tracings reflect conscious processing stages (Halgren et al., 1994a, b) and clearly vary with semantic expectancy (Kutas and Hillyard, 1980, 1984), task relevance (Sutton et al., 1967), or depth of mental engagement (Dien et al., 2004). Thus, it is not surprising that ERPs in this time range can reflect processing differences between words of different emotional content. From a semantics perspective, the N400 might represent an appropriate 'classical' ERP component to assess for emotion effects. Indeed, some studies have found modulations of N400 or N400-like ERP responses to words of different emotional categories (Williamson et al., 1991; Weinstein, 1995; Kiehl et al., 1999). However, in line with ERP studies of affective processing of faces (Schupp et al., 2004) and

pictures (Keil et al., 2002), most researchers focused on an analysis of late positivities. The comparative paucity of reports on N400 modulation by emotional word content may partly reflect a bias on the part of investigators and may appear surprising given that the N400 is often regarded as 'the electrophysiological indicator' of semantic processes in the brain. On the other hand, it is becoming increasingly clear that the N400 response does not index lexical access or semantic processing per se but reflects semantic integration within a larger context, created by either expectations on sentence content or other contextual constraints within experiments (Kutas and Federmeier, 2000). Thus, it is reasonable to assume that single-word studies will only result in N400 modulations if strong expectations on emotional word content are established. Priming studies or experiments establishing an emotional expectation within a sentence context may provide a better testing ground for the issue of N400 modulations by emotional word content. Indeed, Weinstein (1995) followed this rationale, establishing emotional expectations at the sentence level. Recent work from our laboratory also shows N400 modulation by emotional content in a lexical decision task where an emotional expectation (pleasant or unpleasant) was established by a preceding emotional picture. Of note, the pictures were of similar emotional connotation to the subsequent adjectives, but the words were not descriptive of the picture content (Kissler and Kössler, in preparation). A transient mood induction may have mediated the effect; recently, effects of subjects' emotional states on semantic processing have been reported (Federmeier et al., 2001). When subjects were in a mildly positive mood, their semantic processing was facilitated, as reflected by a smaller N400 potential to more distant members of given categories than when in a neutral mood. Thus, a number of studies suggest that both a word's emotional content and a subject's emotional state may affect the N400 ERP response (but see Fischler and Bradley, this volume). Still, so far the most consistently reported later effects of emotional word categories on the ERP are seen in broadly distributed late positivities with a parietal maximum (see also Fischler and Bradley,


this volume). Such late positivities have generally not been associated with specific aspects of semantic processing but rather with task demands such as attentional capture, evaluation, or memory encoding. In neurolinguistics, late positivities have repeatedly been suggested to index syntactic reanalysis following morphosyntactic violations (Osterhout et al., 1994; Friederici et al., 1996; Hagoort and Brown, 2000). Yet, some studies also report modulations of late positivities by semantic attributes of language. For instance, in antonym processing, differential P3 and LPC responses were found depending on whether a word contained a given attribute or lacked it (Molfese, 1985). Contextual semantic constraints and stimulus abstractness have also been reported to affect late positivities (Holcomb et al., 1999). Both contextually expected and unexpected sentence-final words were associated with larger positivities than contextually unconstrained words, the effect being even more pronounced when the abstract words were contextually unexpected. Münte et al. (1998) also find late positive responses to language semantics, thereby challenging the account of specific morphosyntactic late positive shifts and corroborating the view that late positivities reflect mental engagement and effortful processing across a wide range of higher cognitive functions (Dien et al., 2004). Late positivities are likely to share some neural generators and to differ in others, reflecting the extent to which the tasks that elicit them draw on shared or distinct neural systems. Thus, topographically distinct late positive shifts may relate to different aspects of cognitive and emotional functioning, as suggested for instance by Naumann et al. (1992). However, in order to unambiguously elucidate topographic changes that reflect shifts in neural generator structure, simultaneous recordings from dozens of electrodes and advanced data analysis techniques are necessary (see also Junghöfer and colleagues, this volume). From the extant studies on the emotional modulation of late components in word processing, it is hard to gauge the extent to which emotion induces genuine topographic changes, indicative of the recruitment of additional distinct cortical structures, or purely amplifies the activity of a unitary processing system.
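One common way to make this distinction explicit in densely sampled recordings is to separate overall response strength from map shape: global field power (the spatial standard deviation across electrodes at each time point) indexes strength, whereas scalp maps that are first scaled to unit global field power can be compared for shape alone, so that a residual difference points to at least partly distinct generator configurations rather than pure amplification. The sketch below merely illustrates these two quantities under the assumption of average-referenced data; the data and function names are hypothetical and the snippet is not drawn from any of the reviewed studies.

```python
import numpy as np

def global_field_power(erp):
    """Global field power: spatial standard deviation across electrodes at each
    time point (erp: n_channels x n_samples, assumed average-referenced)."""
    return erp.std(axis=0)

def map_dissimilarity(erp_a, erp_b):
    """Strength-independent topographic difference between two ERPs.

    Both maps are re-referenced to the average and scaled to unit global field
    power; the GFP of their difference is 0 for identical map shapes (pure
    amplification) and grows as the scalp topographies diverge."""
    a = erp_a - erp_a.mean(axis=0, keepdims=True)
    b = erp_b - erp_b.mean(axis=0, keepdims=True)
    a = a / global_field_power(a)   # unit-strength maps
    b = b / global_field_power(b)
    return global_field_power(a - b)

# Hypothetical average-referenced grand means: 64 channels, 600 samples
rng = np.random.default_rng(1)
erp_emotional = rng.normal(size=(64, 600))
erp_neutral = rng.normal(size=(64, 600))

strength_difference = global_field_power(erp_emotional) - global_field_power(erp_neutral)
shape_difference = map_dissimilarity(erp_emotional, erp_neutral)
```

In practice, evaluating such map differences statistically typically relies on randomization procedures and, as noted above, on dense electrode arrays, which is one reason why the question is difficult to settle on the basis of the older three- and nine-electrode studies.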

In emotion research, larger late positivities have consistently been shown during free viewing of emotional vs. neutral pictures (Keil et al., 2002) and faces (Schupp et al., 2004). If primary tasks distract participants from the emotional content of the visual stimuli, late positivities to emotional stimuli are often diminished, reflecting competition for attentional resources. The degree to which, and the circumstances under which, emotion and attention compete for resources, have additive effects, or operate in parallel is a matter of ongoing debate (see Schupp et al., this volume, for a discussion). For visually presented word stimuli, the picture is similar: when the primary task requires an evaluative emotional decision (pleasant–unpleasant–neutral, emotional–neutral), emotional words, like pictures or faces, are consistently associated with larger late positivities than neutral ones. When the primary task requires structural stimulus processing, the evidence is mixed, with some studies still finding larger positivities in response to emotional stimuli (Naumann et al., 1992) while others do not (Naumann et al., 1997). During free viewing, a recent study (Kissler et al., submitted manuscript) finds a larger LPC to emotional words, suggesting that when subjects are free to allocate their processing resources as they wish, they process emotional words more deeply than nonemotional ones. Our results also indicate that late responses, around 500 ms, may be more affected by explicit attentional tasks than the simultaneously observed early effects around 250 ms. During lexical decision (Williamson et al., 1991) and naming tasks (Knost et al., 1997; Pauli et al., 2005), emotional words have also been found to be associated with larger late positivities (but see Fischler and Bradley, this volume). Thus, when the task allows for or even requires semantic processing, emotional words are processed more deeply than neutral ones. When the task requires structural processing, this processing advantage is considerably diminished (Naumann et al., 1997). Clearly, the extent to which larger late positivities to emotionally relevant words are driven by arousal, valence, or additional subject-, task-, or situation-specific factors is not quite settled. The matter is complicated by the fact that studies differed in the instruments used to assess emotional word content


and the extent to which the pleasant and unpleasant dimensions were differentiated or collapsed into one 'emotional' category. A fair number of studies employed the empirically well-founded semantic differential technique, or the two-dimensional arousal × valence space, to assess emotional content, yet others do not even report the criteria by which the emotional content of the material was determined. Although multidimensional models of affect are empirically well founded and the use of numerical rating scales allows for the rapid assessment of large numbers of stimuli on many dimensions, an inherent problem with Likert-type scaling remains. Such scaling techniques assume that subjects will meaningfully assign numbers to psychological stimuli, such that the quantitative relationships between the numbers correctly reflect the psychologically perceived relationships among the stimuli, including conservation of distance or conservation of ratio, yielding interval or even ratio scales. But these assumptions do not always hold, such that it is unclear whether the psychological distance between stimuli rated 2 and 4 on a given scale is really the same as that between stimuli rated 6 and 8 (Luce and Suppes, 1965; Kissler and Bäuml, 2000; Wickelmaier and Schmid, 2004). Moreover, the relationship between behavioral ratings and the physiological impact of emotional stimuli is likely to be nonlinear. Nevertheless, the bulk of the data corroborates the view that during earlier stages of processing, emotion acts as a non-valence-specific, arousal-driven alerting system (see above). During later stages of processing (>300 ms), the patterns found are more varied and may reflect flexible adaptations to contextual factors. In support, Herbert et al. (2006) recently found arousal-driven amplification of cortical responses to both pleasant and unpleasant words within the first 300 ms and a divergent pattern that favors the processing of pleasant material in a later time window (see Fig. 5). Keil (this volume) discusses a number of task factors that contribute to processing advantages for pleasant or unpleasant material in turn. For language material with emotional content, a general pleasant–unpleasant asymmetry in emotional processing may be important: at low levels of arousal, a 'positivity offset' is often found in

that the approach system responds more strongly to relatively little input. The withdrawal system in response to unpleasant input, in turn, is activated comparatively more at high levels of arousal, this latter process being termed 'negativity bias' (Caccioppo, 2000; Ito and Caccioppo, 2000). Visually presented words are likely to constitute less arousing stimuli than complex colored pictures, i.e., the word 'cruel' will be less arousing than a photograph of a corresponding scene, even if both stimuli receive comparable ratings. Therefore, in the absence of strong unpleasant personal associations for a given word, which may well be present in various clinical populations (see above), a 'positivity offset' for written verbal material might be expected. Corresponding data are reported, for instance, by Schapkin et al. (2000) or Herbert et al. (2006). As for the early effects, the question arises how late effects of emotional word content on ERPs come about; subcortical activity has again been implicated. Naccache et al. (2005) have recently, for the first time, recorded field potentials directly from the amygdala in response to emotional words. Three epilepsy patients with depth electrodes implanted for presurgical evaluation performed an evaluative decision task (threatening–non-threatening) on a series of threat or nonthreat words presented subliminally or supraliminally. In all three patients, larger amygdala potentials to threat than to nonthreat words could be identified around 800 ms after word presentation in the subliminal condition and around 500–600 ms in the supraliminal condition. The study is pivotal in that it both directly measures amygdala activity during emotional word processing and provides clues as to the timing of this activity. As detailed before, subcortical, primarily amygdala, activity may be a source of the cortical amplifying mechanisms in response to emotional stimuli visible in ERPs. Amygdala activity measured by depth electrodes around 600 ms after stimulus onset may provide a basis for LPC amplifications evident in the surface ERP. However, the timing of the responses poses new puzzles. If amygdala activity in emotional word processing starts only around 600 ms, how are early effects of emotional word content generated (see discussion above)? Amplified cortical ERP responses reflect the activation of larger


patches of cortex, indicating spread of activation in a more densely packed neural network. Such networks result from life-long associative learning mechanisms. The effects of emotional learning can be seen in amplified cortical ERP tracings whenever the corresponding semantic network is accessed. Subcortical mechanisms might be active primarily in the acquisition of emotional semantics, reflecting the role of the amygdala in emotional learning even of abstract representations (Phelps et al., 2001). Their impact may be attenuated once a representation has been acquired. Clearly, elucidating the mechanisms by which amplified responses to emotional words are generated is a vital issue for future research.

In sum, a considerable number of studies show enhanced late positive responses when people process emotionally laden words, pleasant and unpleasant alike. The responses are not as large as those to pictorial or face stimuli, but they have been reliably demonstrated across numerous studies. Major challenges for future research remain in determining the relative roles of arousal and valence and their interactions with task demands. Finally, the question of to what extent, and at which points in time, the processing of emotional words recruits specific cortical and subcortical neural circuitries merits further scientific attention.

Processing emotional words — electrophysiological conjectures

The above review demonstrates that emotional word content can amplify word processing at all stages, from access to word meaning (around 200 ms) to contextual integration (around 400 ms), evaluation, and memory encoding (around 600 ms). Occasionally, emotionality-dependent enhancements have been reported even before 200 ms. In neurolinguistics, the timing of lexical access is heatedly debated. Reported time points at which some aspect of the lexical information on a word is accessed vary between 100 and 600 ms. Importantly, the interpretation of the N400 has shifted from an index of semantic access to a signature of the interaction between single-word semantics and context. Accordingly, a

growing body of evidence demonstrates that some aspects of word meaning must be active before the N400 is elicited. We propose that there is no fixed point in time at which all semantic information is equally available. Instead, subrepresentations can become activated in a dynamic, possibly cascaded manner. Variations in the timing of semantic signatures can be interpreted in light of an internal structure of the lexical entry in the mental lexicon. Different aspects of semantics could be flexibly prioritized, depending on context, task, and the motivation of the subject. If the tenet of a monolithic lexical entry is given up, the internal structure of a word's lexical representation can be assessed by investigating the time at which each subrepresentation becomes available (Assadollahi and Rockstroh, 2005) or the contextual constraints that lead to the activation of a particular subrepresentation at a given point in time (Sereno et al., 2003). Emotional semantics may be special; their connection to biologically significant system states and behavioral output tendencies may ensure particularly rapid activation of the underlying neural network representations. A number of studies show a simultaneous impact of emotional word content on both cortical and peripheral responses (Flor et al., 1997; Knost et al., 1997; Herbert et al., 2006), corroborating the view that the neural networks representing emotional concepts dynamically link semantic and response information (Lang et al., 1993, see Fig. 3). Many studies also show surprisingly early cortical responses to emotional word content. Thus, the subrepresentation of a word's emotional content may sometimes, though clearly not always, be activated before other lexical information is available, for example, whether a word denotes an animate or an inanimate entity. Depending on personality, context, and task, emotional semantic networks may become activated rapidly or gradually, may operate over several hundred milliseconds, or their activation may decay quickly when not currently relevant. The specification of the experimental factors that contribute to the generation of arousal-, valence-, or even potency-specific effects, or to the temporal gradient of the processing enhancement caused by emotional connotation, is still lacking in detail. As a


working hypothesis, emotional content can amplify word processing from the earliest stage at which the visual word form has been assembled, i.e., no sooner than about 100 ms after word presentation. Learning, top-down, and priming processes may dynamically modify the time point at which the activation of the representation takes place (see above). Arousal is likely to govern this amplification process within the first 300 ms, reflecting a general orienting function of emotion. Under specific experimental manipulations, such as very brief stimulus durations, or in populations with pronounced negative processing biases, the earliest advantages may be obtained for unpleasant material, possibly reflecting modality-independent operations of a rudimentary rapid threat detection system (Öhman and Mineka, 2001). In general, the neural representations of abstract linguistic features are less likely to be activated by such a superfast detection system. Later than 300 ms after stimulus onset, the behavioral relevance of the stimulus in a given

situation is likely to determine its further processing, with tasks that require processing of semantic content supporting sustained enhancement for emotional stimuli, and with contextual effects determining possible processing advantages for either valence. Under circumstances where structural processing outweighs semantic processing, or where the stimuli occur very briefly and are not followed by confirmatory input, only transient enhancement of early brain responses and less, if any, amplification in later time windows will result. Subcortical structures, most prominently the amygdala, have been implicated at all stages of this emotional content-driven amplification process, but the dynamics of the interplay between subcortical and cortical structures in processing emotional words await specification. Combining electrophysiological, functional magnetic resonance imaging, and lesion approaches within a network theory of emotion and language will be useful to clarify these issues.

Appendix

Table A1. A summary of experimental design, recording parameters and results of the reviewed studies on effects of emotional content on ERP measures of word processing in healthy volunteers. The summary covers, for each study, subjects (N, sex), task(s), stimuli (word class, emotional content, control of additional stimulus parameters), presentation (stimulus duration, position of presentation, stimulus repetitions), recording and analysis (number of electrodes, reference, poststimulus epoch duration, statistical analysis), and ERP effects. Studies covered: Lifshitz (1966); Begleiter and Platz (1969); Kostandov and Arzumanov (1977); Chapman et al. (1978); Begleiter et al. (1979); Chapman (1979); Chapman et al. (1980); Vanderploeg et al. (1987); Naumann et al. (1992); Naumann et al. (1997); Skrandies (1998); Schapkin et al. (2000); Bernat et al. (2001); Skrandies and Chiu (2003); Ortigue et al. (2004); Herbert et al. (2006); Kissler et al. (submitted). Abbreviations: E, P, A, the evaluation, potency, and activity dimensions of the semantic differential, with + and - denoting their positive and negative poles; RSVP, rapid serial visual presentation of stimuli in a continuous consecutive stream without interstimulus interval.

Table A2. A summary of experimental design, recording parameters and results of the reviewed studies on effects of emotional content on ERP measures of word processing in clinical populations. The summary covers, for each study, subjects (N, sex, clinical status), task(s), stimuli (word class, emotional content, control of additional stimulus parameters), presentation (stimulus duration, position of presentation, stimulus repetitions), recording and analysis (electrodes and reference, poststimulus analysis window, statistics), and ERP effects. Studies covered: Williamson et al. (1991); Weinstein (1995); Flor et al. (1997); Knost et al. (1997); Kiehl et al. (1999), Task 3; Pauli et al. (2005).

Acknowledgments

This work was supported by a grant from the Heidelberg Academy of Sciences (Mind and Brain Program). We thank Anne Hauswald for help in preparation of this manuscript and Christiane Beck, Susanne Kössler, Bistra Ivanona and Irene Winkler for assistance in the experimental work described.

References Amaral, D.G., Behniea, H. and Kelly, J.L. (2003) Topographic organization of projections from the amygdala to the visual cortex in the macaque monkey. Neuroscience, 118: 1099–1120. Anderson, A.K. and Phelps, E.A. (2001) Lesions of the human amygdala impair enhanced perception of emotionally salient events. Nature, 411: 305–309. Assadollahi, R. and Pulvermu¨ller, F. (2001) Neuromagnetic evidence for early access to cognitive representations. Neuroreport, 12: 207–213. Assadollahi, R. and Pulvermuller, F. (2003) Early influences of word length and frequency: a group study using MEG. Neuroreport, 14: 1183–1187. Assadollahi, R. and Rockstroh, B. (2005) Neuromagnetic brain responses to words from semantic sub- and supercategories. BMC Neurosci., 6: 57. Barsalou, L.W. (1999) Perceptual symbol systems. Behav. Brain Sci., 22: 577–609 discussion 610–660. Begleiter, H. and Platz, A. (1969) Cortical evoked potentials to semantic stimuli. Psychophysiology, 6: 91–100. Begleiter, H., Projesz, B. and Garozzo, R. (1979) Visual evoked potentials and affective ratings of semantic stimuli. In: Begleiter, H. (Ed.), Evoked Brain Potentials and Behavior. Plenum Press, New York, pp. 127–143. Bentin, S., Mouchetant-Rostaing, Y., Giard, M.H., Echallier, J.F. and Pernier, J. (1999) ERP manifestations of processing printed words at different psycholinguistic levels: time course and scalp distribution. J. Cogn. Neurosci., 11: 235–260. Bernat, E., Bunce, S. and Shevrin, H. (2001) Event-related brain potentials differentiate positive and negative mood adjectives during both supraliminal and subliminal visual processing. Int. J. Psychophysiol., 42: 11–34. Bower, G. (1981) Mood and memory. Am. Psychol., 36: 129–148. Bradley, M.M. and Lang, P.J. (1994) Measuring emotion: the Self-Assessment Manikin and the Semantic Differential. J. Behav. Ther. Exp. Psychiatry, 25(1): 49–59. Bradley, M., and Lang, P.J., 1998. Affective norms for English words (ANEW): Instruction manual and affective ratings. Technical report A-8, The Center for Research in Psychophysiology, University of Florida.

Bradley, M.M. and Lang, P.J. (2000) Affective reactions to acoustic stimuli. Psychophysiology, 37: 204–215. Brem, S., Lang-Dullenkopf, A., Maurer, U., Halder, P., Bucher, K. and Brandeis, D. (2005) Neurophysiological signs of rapidly emerging visual expertise for symbol strings. Neuroreport, 16: 45–48. Caccioppo, J.T. (2000) Asymmetries in affect laden information processing. In: Banaji, R. and Prentice, D.A. (Eds.), Perspectivism in Social Psychology: The Yin and Yang of Scientific Progress. American Psychological Association Press, Washington, DC, pp. 85–95. Chao, L.L., Haxby, J.V. and Martin, A. (1999) Attribute-based neural substrates in temporal cortex for perceiving and knowing about objects. Nat. Neurosci., 2: 913–919. Chapman, R.M. (1979) Connotative meaning and averaged evoked potentials. In: Begleiter, H. (Ed.), Evoked Brain Potentials and Behavior. Plenum Press, New York, pp. 171–197. Chapman, R.M., McCrary, J.W., Chapman, J.A. and Bragdon, H.R. (1978) Brain responses related to semantic meaning. Brain Lang., 5: 195–205. Chapman, R.M., McCrary, J.W., Chapman, J.A. and Martin, J.K. (1980) Behavioral and neural analyses of connotative meaning: word classes and rating scales. Brain Lang., 11: 319–339. Cohen, L., Dehaene, S., Naccache, L., Lehericy, S., DehaeneLambertz, G., Henaff, M.A. and Michel, F. (2000) The visual word form area: spatial and temporal characterization of an initial stage of reading in normal subjects and posterior splitbrain patients. Brain, 123(Pt 2): 291–307. Davis, M. (1992) The role of the amygdala in fear and anxiety. Annu. Rev. Neurosci., 15: 353–375. Dehaene, S. (1995) Electrophysiological evidence for categoryspecific word processing in the normal human brain. Neuroreport, 6: 2153–2157. Dehaene, S., Cohen, L., Sigman, M. and Vinckier, F. (2005) The neural code for written words: a proposal. Trends Cogn. Sci., 9: 335–341. Dehaene, S., Le Clec, H.G., Poline, J.B., Le Bihan, D. and Cohen, L. (2002) The visual word form area: a prelexical representation of visual words in the fusiform gyrus. Neuroreport, 13: 321–325. Dien, J., Spencer, K.M. and Donchin, E. (2004) Parsing the late positive complex: mental chronometry and the ERP components that inhabit the neighborhood of the P300. Psychophysiology, 41: 665–678. Dijksterhuis, A. and Aarts, H. (2003) On wildebeests and humans: the preferential detection of negative stimuli. Psychol. Sci., 14: 14–18. Everatt, J., McCorquidale, B., Smith, J., Culverwell, F., Wilks, A., Evans, D., Kay, M. and Baker, D. (1999) Association between reading ability and visual processes. In: Everatt, J. (Ed.), Reading and Dyslexia. Routledge, London, pp. 1–39. Federmeier, K.D., Kirson, D.A., Moreno, E.M. and Kutas, M. (2001) Effects of transient, mild mood states on semantic memory organization and use: an event-related potential investigation in humans. Neurosci. Lett., 305: 149–152.

181 Federmeier, K.D., Segal, J.B., Lombrozo, T. and Kutas, M. (2000) Brain responses to nouns, verbs and class-ambiguous words in context. Brain, 123(Pt 12): 2552–2566. Flor, H., Knost, B. and Birbaumer, N. (1997) Processing of pain- and body-related verbal material in chronic pain patients: central and peripheral correlates. Pain, 73: 413–421. Friederici, A.D., Hahne, A. and Mecklinger, A. (1996) Temporal structure of syntactic parsing: early and late event-related brain potential effects. J. Exp. Psychol. Learn. Mem. Cogn., 22: 1219–1248. Garavan, H., Pendergrass, J.C., Ross, T.J., Stein, E.A. and Risinger, R.C. (2001) Amygdala response to both positively and negatively valenced stimuli. Neuroreport, 12: 2779–2783. Greenwald, A.G., Draine, S.C. and Abrams, R.L. (1996) Three cognitive markers of unconscious semantic activation. Science, 273: 1699–1702. Hagoort, P. and Brown, C.M. (2000) ERP effects of listening to speech compared to reading: the P600/SPS to syntactic violations in spoken sentences and rapid serial visual presentation. Neuropsychologia, 38: 1531–1549. Halgren, E., Baudena, P., Heit, G., Clarke, J.M., Marinkovic, K., Chauvel, P. and Clarke, M. (1994a) Spatio-temporal stages in face and word processing. 2. Depth-recorded potentials in the human frontal and Rolandic cortices. J. Physiol. (Paris), 88: 51–80. Halgren, E., Baudena, P., Heit, G., Clarke, J.M., Marinkovic, K. and Clarke, M. (1994b) Spatio-temporal stages in face and word processing. I. Depth-recorded potentials in the human occipital, temporal and parietal lobes [corrected]. J. Physiol. (Paris), 88: 1–50. Hamann, S. and Mao, H. (2002) Positive and negative emotional verbal stimuli elicit activity in the left amygdala. Neuroreport, 13: 15–19. Hauk, O., Johnsrude, I. and Pulvermuller, F. (2004) Somatotopic representation of action words in human motor and premotor cortex. Neuron, 41: 301–307. Haxby, J.V., Horwitz, B., Ungerleider, L.G., Maisog, J.M., Pietrini, P. and Grady, C.L. (1994) The functional organization of human extrastriate cortex: a PET-rCBF study of selective attention to faces and locations. J. Neurosci., 14: 6336–6353. Herbert, C., Kissler, J., Jungho¨fer, M., Peyk, P. and Rockstroh, B. (2006) Processing emotional adjectives: evidence from startle EMG and ERPs. Psychophysiology, 43(2): 197–206. Hinojosa, J.A., Martin-Loeches, M., Munoz, F., Casado, P. and Pozo, M.A. (2004) Electrophysiological evidence of automatic early semantic processing. Brain Lang., 88: 39–46. Hinojosa, J.A., Martin-Loeches, M. and Rubia, F.J. (2001) Event-related potentials and semantics: an overview and an integrative proposal. Brain Lang., 78: 128–139. Holcomb, P.J., Kounios, J., Anderson, J.E. and West, W.C. (1999) Dual-coding, context-availability, and concreteness effects in sentence comprehension: an electrophysiological investigation. J. Exp. Psychol. Learn. Mem. Cogn., 25: 721–742.

Isenberg, N., Silbersweig, D., Engelien, A., Emmerich, S., Malavade, K., Beattie, B., Leon, A.C. and Stern, E. (1999) Linguistic threat activates the human amygdala. Proc. Natl. Acad. Sci. USA, 96: 10456–10459. Ito, T.A. and Caccioppo, J.T. (2000) Electrophysiological evidence of implicit and explicit categorization processes. J. Exp. Soc. Psychol., 36: 660–676. Jungho¨fer, M., Bradley, M.M., Elbert, T.R. and Lang, P.J. (2001) Fleeting images: a new look at early emotion discrimination. Psychophysiology, 38: 175–178. Keil, A., Bradley, M.M., Hauk, O., Rockstroh, B., Elbert, T. and Lang, P.J. (2002) Large-scale neural correlates of affective picture processing. Psychophysiology, 39: 641–649. Kellenbach, M.L., Wijers, A.A., Hovius, M., Mulder, J. and Mulder, G. (2002) Neural differentiation of lexico-syntactic categories or semantic features? Event-related potential evidence for both. J. Cogn. Neurosci., 14: 561–577. Kiefer, M. and Spitzer, M. (2000) Time course of conscious and unconscious semantic brain activation. Neuroreport, 11: 2401–2407. Kiehl, K.A., Hare, R.D., McDonald, J.J. and Brink, J. (1999) Semantic and affective processing in psychopaths: an eventrelated potential (ERP) study. Psychophysiology, 36: 765–774. King, J.W. and Kutas, M. (1998) Neural plasticity in the dynamics of human visual word recognition. Neurosci. Lett., 244: 61–64. Kissler, J. and Ba¨uml, K.H. (2000) Effects of the beholder’s age on the perception of facial attractiveness. Acta Psychol. (Amst.), 104: 145–166. Kissler, J., Herbert, C., Peyk, P. and Jungho¨fer, M. (submitted manuscript) Sex, crime and videotape—enhanced early cortical responses to rapidly presented emotional words. Kissler, J., and Ko¨ssler, S., (in preparation) Pleasant pictures facilitate lexical decision. Knost, B., Flor, H., Braun, C. and Birbaumer, N. (1997) Cerebral processing of words and the development of chronic pain. Psychophysiology, 34: 474–481. Kostandov, E. and Arzumanov, Y. (1977) Averaged cortical evoked potentials to recognized and non-recognized verbal stimuli. Acta Neurobiol. Exp. (Wars), 37: 311–324. Kutas, M. and Federmeier, K.D. (2000) Electrophysiology reveals semantic memory use in language comprehension. Trends Cogn. Sci., 4: 463–470. Kutas, M. and Hillyard, S.A. (1980) Reading senseless sentences: brain potentials reflect semantic incongruity. Science, 207: 203–205. Kutas, M. and Hillyard, S.A. (1984) Brain potentials during reading reflect word expectancy and semantic association. Nature, 307: 161–163. Lang, P.J. (1979) Presidential address, 1978A bio-informational theory of emotional imagery. Psychophysiology, 16: 495–512. Lang, P.J. (1994) The motivational organization of emotion: Affect-reflex connections. In: Van Goozen, S.H.M., Van de Poll, N.E. and Sergeant, J.E. (Eds.), Emotions: Essays on

182 Emotion Theory. Lawrence Erlbaum Associates, Hillsdale, NJ, pp. 61–93. Lang, P.J., Greenwald, M.K., Bradley, M.M. and Hamm, A.O. (1993) Looking at pictures: affective, facial, visceral, and behavioral reactions. Psychophysiology, 30: 261–273. LeDoux, J.E. (1989) Cognitive-emotional interactions in the brain. Cogn. Emotion, 3: 267–289. LeDoux, J.E. (1995) Emotion: clues from the brain. Annu. Rev. Psychol., 46: 209–235. Liddell, B.J., Brown, K.J., Kemp, A.H., Barton, M.J., Das, P., Peduto, A., Gordon, E. and Williams, L.M. (2005) A direct brainstem-amygdala-cortical ‘alarm’ system for subliminal signals of fear. Neuroimage, 24: 235–243. Liddell, B.J., Williams, L.M., Rathjen, J., Shevrin, H. and Gordon, E. (2004) A temporal dissociation of subliminal versus supraliminal fear perception: an event-related potential study. J. Cogn. Neurosci., 16: 479–486. Lifshitz, K. (1966) The averaged evoked cortical response to complex visual stimuli. Psychophysiology, 3: 55–68. Luce, R.D. and Suppes, P. (1965) Preference, utility and subjective probability. In: Luce, R.D., Bush, R.R. and Galanter, E.H. (Eds.) Handbook of Mathematical Psychology, Vol. 3. New York, Wiley, pp. 249–410. Martin, A., Wiggs, C.L., Ungerleider, L.G. and Haxby, J.V. (1996) Neural correlates of category-specific knowledge. Nature, 379: 649–652. Martin-Loeches, M., Hinojosa, J.A., Gomez-Jarabo, G. and Rubia, F.J. (2001) An early electrophysiological sign of semantic processing in basal extrastriate areas. Psychophysiology, 38: 114–124. Molfese, D. (1985) Electrophysiological correlates of semantic features. J. Psycholinguistic Res., 14: 289–299. Morris, J.S., O¨hman, A. and Dolan, R.J. (1999) A subcortical pathway to the right amygdala mediating ‘unseen’ fear. Proc. Natl. Acad. Sci. USA,, 96: 1680–1685. Mu¨nte, T.F., Heinze, H.J., Matzke, M., Wieringa, B.M. and Johannes, S. (1998) Brain potentials and syntactic violations revisited: no evidence for specificity of the syntactic positive shift. Neuropsychologia, 36: 217–226. Naccache, L., Gaillard, R., Adam, C., Hasboun, D., Clemenceau, S., Baulac, M., Dehaene, S. and Cohen, L. (2005) A direct intracranial record of emotions evoked by subliminal words. Proc. Natl. Acad. Sci. USA, 102: 7713–7717. O¨hman, A. and Mineka, S. (2001) Fears, phobias, and preparedness: toward an evolved module of fear and fear learning. Psychol. Rev., 108: 483–522. Ortigue, S., Michel, C.M., Murray, M.M., Mohr, C., Carbonnel, S. and Landis, T. (2004) Electrical neuroimaging reveals early generator modulation to emotional words. Neuroimage, 21: 1242–1251. Osgood, C.E., Miron, M.S. and May, W.H. (1975) Cross-Cultural Universals of Affective Meaning. University of Illinois Press, Urbana, Chicago, London. Osgood, C.E., Suci, G.J. and Tannenbaum, P.H. (1957) The measurement of meaning. University of Illinois Press, Urbana, Chicago, and London.

Osterhout, L., Holcomb, P.J. and Swinney, D.A. (1994) Brain potentials elicited by garden-path sentences: evidence of the application of verb information during parsing. J. Exp. Psychol. Learn. Mem. Cogn., 20: 786–803. Pauli, P., Amrhein, C., Muhlberger, A., Dengler, W. and Wiedemann, G. (2005) Electrocortical evidence for an early abnormal processing of panic-related words in panic disorder patients. Int. J. Psychophysiol., 57: 33–41. Perfetti, C.A. (1998) Comprehending written language: a blueprint of the reader. In: Brown, C.M. and Hagoort, P. (Eds.), The Neurocognition of Language. University Press, Oxford. Perfetti, C.A. and Sandak, R. (2000) Reading optimally builds on spoken language: implications for deaf readers. J. Deaf. Stud. Deaf. Educ., 5: 32–50. Phelps, E.A., O’Connor, K.J., Gatenby, J.C., Gore, J.C., Grillon, C. and Davis, M. (2001) Activation of the left amygdala to a cognitive representation of fear. Nat. Neurosci., 4: 437–441. Posner, M.I., Abdullaev, Y.G., McCandliss, B.D. and Sereno, S.C. (1999) Neuroanatomy, circuitry and plasticity of word reading. Neuroreport, 10: R12–R23. Pulvermu¨ller, F. (1999) Words in the brain’s language. Behav. Brain Sci., 22: 253–279 discussion 280–336. Pulvermu¨ller, F., Assadollahi, R. and Elbert, T. (2001a) Neuromagnetic evidence for early semantic access in word recognition. Eur. J. Neurosci., 13: 201–205. Pulvermu¨ller, F., Harle, M. and Hummel, F. (2000) Neurophysiological distinction of verb categories. Neuroreport, 11: 2789–2793. Pulvermu¨ller, F., Ha¨rle, M. and Hummel, F. (2001b) Walking or talking? Behavioral and neurophysiological correlates of action verb processing. Brain Lang., 78: 143–168. Rudell, A.P. (1992) Rapid stream stimulation and the recognition potential. Electroencephalogr. Clin. Neurophysiol., 83: 77–82. Russel, J. (1980) A circumplex model of affects. J. Pers. Soc. Psychol., 39: 1161–1178. Schapkin, S.A., Gusev, A.N. and Kuhl, J. (2000) Categorization of unilaterally presented emotional words: an ERP analysis. Acta. Neurobiol. Exp. (Wars), 60: 17–28. Schendan, H.E., Ganis, G. and Kutas, M. (1998) Neurophysiological evidence for visual perceptual categorization of words and faces within 150 ms. Psychophysiology, 35: 240–251. Schupp, H.T., O¨hman, A., Junghofer, M., Weike, A.I., Stockburger, J. and Hamm, A.O. (2004) The facilitated processing of threatening faces: an ERP analysis. Emotion, 4: 189–200. Sereno, S.C., Brewer, C.C. and O’Donnell, P.J. (2003) Context effects in word recognition: evidence for early interactive processing. Psychol. Sci., 14: 328–333. Sereno, S.C., Rayner, K. and Posner, M.I. (1998) Establishing a time-line of word recognition: evidence from eye movements and event-related potentials. Neuroreport, 9: 2195–2200. Silvert, L., Delplanque, S., Bouwalerh, H., Verpoort, C. and Sequeira, H. (2004) Autonomic responding to aversive words

183 without conscious valence discrimination. Int. J. Psychophysiol., 53: 135–145. Skrandies, W. (1998) Evoked potential correlates of semantic meaning — a brain mapping study. Brain Res. Cogn. Brain Res., 6: 173–183. Skrandies, W. and Chiu, M.J. (2003) Dimensions of affective semantic meaning — behavioral and evoked potential correlates in Chinese subjects. Neurosci. Lett., 341: 45–48. Sutton, S., Tueting, P., Zubin, J. and John, E.R. (1967) Information delivery and the sensory evoked potential. Science, 155: 1436–1439. Tarkiainen, A., Helenius, P., Hansen, P.C., Cornelissen, P.L. and Salmelin, R. (1999) Dynamics of letter string perception in the human occipitotemporal cortex. Brain, 122(Pt 11): 2119–2132.

Vanderploeg, R.D., Brown, W.S. and Marsh, J.T. (1987) Judgments of emotion in words and faces: ERP correlates. Int. J. Psychophysiol., 5: 193–205. Warrington, E.K. and Shallice, T. (1979) Semantic access dyslexia. Brain, 102: 43–63. Warrington, E.K. and Shallice, T. (1980) Word-form dyslexia. Brain, 103: 99–112. Weinstein, A. (1995) Visual ERPs evidence for enhanced processing of threatening information in anxious university students. Biol. Psychiatry, 37: 847–858. Wickelmaier, F. and Schmid, C. (2004) A Matlab function to estimate choice model parameters from paired-comparison data. Behav. Res. Methods Instrum. Comput., 36: 29–40. Williamson, S., Harpur, T.J. and Hare, R.D. (1991) Abnormal processing of affective words by psychopaths. Psychophysiology, 28: 260–273.