Regular Article
Event-Related Brain Potentials Reflect Semantic Priming in an Object Decision Task
Abstract
Subjects made speeded object decisions about target line drawings that were preceded by semantically related or unrelated prime line drawings. One hundred of the targets depicted real objects and 50 others were unrecognizable non-objects. As in a recent picture-matching study, the ERPs showed a larger negativity for unrelated than for related target pictures between 325 and 550 msec. Although these differences had a time course similar to that of the N400 component in semantic priming lexical decision tasks, they were more frontally distributed and were larger over the left than the right hemisphere. Non-objects, the picture equivalent of pseudowords, produced an even larger negativity with a somewhat different distribution. The results are discussed with regard to recent claims about amodal conceptual memory processes.
Cited by (195)
Action-speech and gesture-speech integration in younger and older adults: An event-related potential study
2022, Journal of Neurolinguistics
In daily communication, speech is enriched with co-speech gestures, providing a visual context for the linguistic message. It has been shown that older adults are less sensitive to incongruencies between context (e.g., a sentence) and target (e.g., a final sentence word). This is evidenced by a smaller and delayed N400 (in)congruency effect, which reflects the difference between the N400 component in response to congruent versus incongruent targets. The present study investigated whether the effect of age on the N400 effect in sentence-final word integration would also arise for verb-gesture/action integration. Assuming that gestures have a tight connection to language, these would provide a higher contextual constraint for the action phrase than the literal actions (i.e., an action performed on an object can be understood in isolation, without the action phrase). EEG was recorded from a sample of younger and older participants while they watched audio-visual stimuli of a human actor performing an action or pantomime gesture and heard a congruent or incongruent action phrase. Results showed that the N400 (in)congruency effect was less widespread in the older than the younger adults. It seemed that older adults, but not younger adults, were less sensitive to the gestural than to the action (object) information when processing an action phrase.
Memory after visual search: Overlapping phonology, shared meaning, and bilingual experience influence what we remember
2021, Brain and Language
How we remember the things that we see can be shaped by our prior experiences. Here, we examine how linguistic and sensory experiences interact to influence visual memory. Objects in a visual search that shared phonology (cat-cast) or semantics (dog-fox) with a target were later remembered better than unrelated items. Phonological overlap had a greater influence on memory when targets were cued by spoken words, while semantic overlap had a greater effect when targets were cued by characteristic sounds. The influence of overlap on memory varied as a function of individual differences in language experience: greater bilingual experience was associated with a decreased impact of overlap on memory. We conclude that phonological and semantic features of objects influence memory differently depending on individual differences in language experience, guiding not only what we initially look at, but also what we later remember.
Attentional automatic processes and cerebral activity may differ between individuals with different weight statuses in the presence of food stimuli (e.g. odors, pictures). In the present study, we used an implicit olfactory priming paradigm to test the influence of non-attentively perceived food odors on the cerebral activity underlying the processing of food pictures, in normal-weight, overweight, and obese adults. A pear odor and a pound cake odor were used as primes, respectively priming sweet low-energy-density foods and high-energy-density foods. Event-related potentials were recorded while the participants passively watched pictures of sweet low and high-energy-density foods, under the two priming conditions plus an odorless control condition. The amplitude and latency of several peaks were measured (P100, N100, P200, N400). As a major result, we found that weight status influences the cerebral activity underlying the processing of food cues outside of consciousness, as early as the first detectable P100 peak.
Relationships between cognitive event-related brain potential measures in patients at clinical high risk for psychosis
2020, Schizophrenia Research
Neurophysiological measures of cognitive functioning that are abnormal in patients with schizophrenia are promising candidate biomarkers for predicting development of psychosis in individuals at clinical high risk (CHR). We examined the relationships among event-related brain potential (ERP) measures of early sensory, pre-attentional, and attention-dependent cognition in antipsychotic-naïve help-seeking CHR patients (n = 36) and healthy control participants (n = 22). These measures included the gamma auditory steady-state response (ASSR; early sensory); mismatch negativity (MMN) and P3a (pre-attentional); and N400 semantic priming effects (a measure of using meaningful context to predict related items) over a shorter and a longer time interval (attention-dependent). Compared to controls, CHR patients had significantly smaller P3a amplitudes (d = 0.62, p = 0.03) and N400 priming effects over the long interval (d = 0.64, p = 0.02). In CHR patients, gamma ASSR evoked power and phase-locking factor were correlated (r = 0.41, p = 0.03). Reductions in mismatch negativity (MMN) and P3a amplitudes were also correlated (r = −0.36, p = 0.04). Moreover, lower gamma ASSR evoked power correlated with smaller MMN amplitudes (r = −0.45, p = 0.02). MMN amplitude reduction was also associated with reduced N400 semantic priming over the shorter but not the longer interval (r = 0.52, p < 0.002). This pattern of results suggests that, in a subset of CHR patients, impairment in pre-attentional measures of early information processing may contribute to deficits in attention-dependent cognition involving rapid, more automatic processing, but may be independent from pathological processes affecting more controlled or strategic processing. Thus, combining neurophysiological indices of cognitive deficits in different domains offers promise for improving their predictive power as prognostic biomarkers of clinical outcome.
When a second language hits a native language. What ERPs (do and do not) tell us about language retrieval difficulty in bilingual language production.
2020, Neuropsychologia
Accumulating evidence suggests that prior usage of a second language (L2) leads to processing costs on the subsequent production of a native language (L1). However, it is unclear what mechanism underlies this effect. It has been proposed that the L1 cost reflects inhibition of L1 representations acting during L1 production; however, previous studies exploring this issue were inconclusive. It is also unsettled whether the mechanism operates at the whole-language level or is restricted to translation equivalents in the two languages. We report a study that allowed us to address both issues behaviorally and with the use of ERPs while focusing on the consequences of using L2 on the production of L1. In our experiment, native speakers of Polish (L1) and learners of English (L2) named a set of pictures in L1 following a set of pictures in either L1 or L2. Half of the pictures were repeated from the preceding block and half were new; this enabled dissociation of effects at the level of the whole language from those specific to individual lexical items. Our results are consistent with the notion that language after-effects operate at a whole-language level. Behaviorally, we observed a clear processing cost at the whole-language level and a small facilitation at the item-specific level. The whole-language effect was accompanied by an enhanced, fronto-centrally distributed negativity in the 250–350 ms time window, which we identified as the N300 (in contrast to previous research, which probably misidentified the effect as the N2), a component that presumably reflects retrieval difficulty of relevant language representations during picture naming. As such, unlike previous studies that reported an N2 for naming pictures in L1 after L2 use, we propose that the reported ERPs (N300) indicate that prior usage of L2 hampers lexical access to names in L1. Based on the literature, the after-effects could be caused by L1 inhibition and/or L2 interference, but the ERPs so far have not been informative about the causal mechanism.
Hemispheric differences in perceptual integration during language comprehension: An ERP study
2020, Neuropsychologia
The left hemisphere (LH) is responsible for many fundamental aspects of language; however, converging evidence suggests the right hemisphere (RH) is critically involved in higher-level language comprehension. We examined the extent of each hemisphere's access to a meaningful mental representation of language by recording electroencephalography while participants (N = 44) completed a computer-based task in which auditory sentences described individual elements of an image. If integrated successfully, this allowed the construction of a meaningful mental representation; if unsuccessful, the individual elements were in themselves meaningless. Participants saw a lateralised image that was either an integrated representation of the object described in the previous auditory passage ("integrated"), an unintegrated representation of each of the individual elements ("unintegrated"), or an integrated representation of an object that did not match the previous passage ("unrelated"). Evidenced by the trend in N300 amplitudes, we found that both hemispheres accessed a mental representation that embodied the elements described in the preceding passage. However, only the RH distinguished integrated versus unintegrated targets, suggesting that the RH accessed a mental representation that embodied the correct spatial relationships between elements (i.e., perceptual integration) as well as the individual imagined elements (i.e., perceptual elaboration). These results provide evidence of a clear RH contribution to the integration of perceptual information during language comprehension.