Fourteen-month-old infants track the language comprehension of communicative partners
Infants employ sophisticated mechanisms to acquire their first language, including some that rely on taking the perspective of adults as speakers or listeners. When do infants first show awareness of what other people understand? We tested 14-month-old infants in two experiments measuring event-related potentials. In Experiment 1, we established that infants produce the N400 effect, a brain signature of semantic violations, in a live object naming paradigm in the presence of an adult observer. In Experiment 2, we induced false beliefs about the labelled objects in the adult observer to test whether infants keep track of the other person's comprehension. The results revealed that infants reacted to the semantic incongruity heard by the other as if they had encountered it themselves: they exhibited an N400-like response, even though the labels were congruous from their own perspective. This finding demonstrates that infants track the linguistic understanding of their social partners.
Neural signatures for sustaining object representations attributed to others in preverbal human infants
A major feat of social beings is to encode what their conspecifics see, know, or believe. While various nonhuman animals show precursors of these abilities, humans perform uniquely sophisticated inferences about other people's mental states. However, it is still unclear how these possibly human-specific capacities develop, and whether preverbal infants, similarly to adults, form representations of other agents' mental states, specifically metarepresentations. We explored the neurocognitive bases of 8-month-olds' ability to encode the world from another person's perspective, using gamma-band EEG activity over the temporal lobes, an established neural signature of sustained object representation after occlusion. We observed such gamma-band activity when an object was occluded from the infants' perspective, as well as when it was occluded only from the other person's perspective (Study 1), and also when the object subsequently disappeared but the other person falsely believed it to be still present (Study 2). These findings suggest that the cognitive systems involved in representing the world from infants' own perspective are also recruited for encoding others' beliefs. Such results point to an early-developing, powerful apparatus suited to dealing with multiple concurrent representations, and suggest that infants can have a metarepresentational understanding of other minds even before the onset of language.
Neural responses to multimodal ostensive signals in 5-month-old infants
Infants' sensitivity to ostensive signals, such as direct eye contact and infant-directed speech, is well documented in the literature. We investigated how infants interpret such signals by assessing common processing mechanisms devoted to them and by measuring neural responses to their compounds. In Experiment 1, we found that ostensive signals from different modalities elicit overlapping electrophysiological activity in 5-month-old infants, suggesting that these signals share neural processing mechanisms independently of their modality. In Experiment 2, we found that the activation elicited by ostensive signals from different modalities is not additive but rather reflects the presence of ostension in either stimulus stream. These data support the thesis that ostensive signals obligatorily indicate to young infants that communication is directed at them.
Electrophysiological evidence for the understanding of maternal speech by 9-month-old infants
Early word learning in infants relies on statistical, prosodic, and social cues that support speech segmentation and the attachment of meaning to words. It is debated whether such early word knowledge represents mere associations between sound patterns and visual object features, or reflects referential understanding of words. By using event-related brain potentials, we demonstrate that 9-month-old infants detect the mismatch between an object appearing from behind an occluder and the preceding label with which their mother introduced it. The N400 effect has been shown to reflect semantic priming in adults, and its absence in infants has been interpreted as a sign of associative word learning. By setting up a live communicative situation for referring to objects, we demonstrate that a similar priming effect also occurs in young infants. This finding may indicate that word meaning is referential from the outset and drives, rather than results from, vocabulary acquisition in humans.
Influence of Eye Gaze on Spoken Word Processing: An ERP Study With Infants
Eye gaze is an important communicative signal, both as mutual eye contact and as referential gaze toward objects. To examine whether attention to speech versus nonspeech stimuli in 4- to 5-month-olds (n = 15) varies as a function of eye gaze, event-related brain potentials were measured. Faces with mutual or averted gaze were presented in combination with forward- or backward-spoken words. Infants rapidly processed gaze and spoken words in combination. A late slow wave suggested an interaction of the two factors, separating the combination of backward-spoken words with direct gaze from all other conditions. An additional experiment (n = 15) extended the results to referential gaze. The current findings suggest that interactions between visual and auditory cues are present early in infancy.
Eye contact and emotional face processing in 6-month-old infants: advanced statistical methods applied to event-related potentials
Event-related potential (ERP) studies with infants are often limited by a small number of measurements. We introduce a weighted general linear mixed model analysis with a time-varying covariate, which allows for the efficient analysis of all available infant event-related potential data. The method controls for the effect that small and varying trial numbers have on the signal-to-noise ratio of averaged ERP estimates, and it enables the analysis of infant ERP data sets that often could not be analyzed otherwise. We illustrate this method by analyzing an experimental study and discuss its advantages over currently used methods as well as its potential limitations. In this study, 6-month-old infants saw a face showing a neutral or an angry expression in combination with direct or averted eye gaze. We examined how the infant brain processes facial expressions and whether the direction of eye gaze influences this processing. We focused on the infant Negative Central ERP component (Nc). The neutral expression elicited a larger Nc amplitude and peaked earlier than the angry expression. An interaction between emotion and gaze was found for Nc latency, suggesting that emotions are processed in combination with eye gaze in infancy.
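To make the weighting idea concrete, the following is a minimal illustrative sketch, not the authors' implementation: it simulates per-infant averaged ERP amplitudes built from varying numbers of trials and weights each average by its trial count, on the assumption that the variance of an averaged ERP scales inversely with the number of trials it contains. For simplicity, weighted least squares with subject dummies stands in for the paper's weighted general linear mixed model, and all variable names are hypothetical.

```python
# Illustrative sketch only (not the authors' code). Assumes the variance of
# an averaged ERP scales as 1/n_trials, so each infant's average is weighted
# by its trial count. Weighted least squares with subject dummies stands in
# for the paper's weighted general linear mixed model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for infant in range(20):
    subj_effect = rng.normal(0.0, 1.0)            # infant-level variability
    for cond in ("neutral", "angry"):
        n_trials = int(rng.integers(3, 25))       # usable trials vary widely
        true_amp = -8.0 + (2.0 if cond == "angry" else 0.0) + subj_effect
        avg_amp = true_amp + rng.normal(0.0, 4.0 / np.sqrt(n_trials))
        rows.append({"infant": infant, "cond": cond,
                     "n_trials": n_trials, "amp": avg_amp})
df = pd.DataFrame(rows)

# Each averaged ERP enters the model with weight proportional to its trial
# count, so noisier averages (built from fewer trials) influence the fit less.
fit = smf.wls("amp ~ cond + C(infant)", data=df, weights=df["n_trials"]).fit()
print(fit.params["cond[T.neutral]"])              # estimated condition effect
```

The key design choice this illustrates is that no infant's data need to be discarded for having few usable trials; averages from sparse data are simply downweighted rather than excluded.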
The detection of communicative signals directed at the self in infant prefrontal cortex
A precondition for successful communication between people is the detection of signals indicating the intention to communicate, such as eye contact or calling a person's name. In adults, establishing communication by eye contact or by calling a person's name results in overlapping activity in right prefrontal cortex, suggesting that, regardless of modality, the intention to communicate is detected by the same brain region. We measured prefrontal cortex responses in 5-month-olds using near-infrared spectroscopy (NIRS) to examine the neural basis of detecting communicative signals across modalities in early development. Infants watched human faces that either signaled eye contact or directed their gaze away from the infant, and they also listened to voices that addressed them with their own name or with another name. The results revealed that infants recruit adjacent but non-overlapping regions in the left dorsal prefrontal cortex when they process eye contact and their own name. Moreover, a correlation analysis revealed that infants who responded sensitively to eye contact in one prefrontal region were also more likely to respond sensitively to their own name in the adjacent region, suggesting that responding to communicative signals in these two regions might be functionally related. These NIRS results suggest that infants selectively process and attend to communicative signals directed at them. However, unlike adults, infants do not seem to recruit a common prefrontal region when processing communicative signals of different modalities. The implications of these findings for our understanding of infants' developing communicative abilities are discussed.
"Did You Call Me?'' 5-Month-Old Infants Own Name Guides Their Attention
An infant's own name is a unique social cue. Infants are sensitive to their own name by 4 months of age, but whether they use their name as a social cue is unknown. Electroencephalogram (EEG) activity was measured while infants heard their own name or strangers' names and while they looked at novel objects. Event-related brain potentials (ERPs) in response to names revealed that infants differentiate their own name from strangers' names from the first phoneme. The amplitude of the ERPs to objects indicated that infants attended more to objects after hearing their own name than after hearing another name. Thus, by 5 months of age infants not only detect their name but also use it as a social cue to guide their attention to events and objects in the world.
Processing Faces in Dyadic and Triadic Contexts
In a series of four experiments, we assessed whether functional properties of the human face, such as signaling an object through eye gaze, influence face processing in 3- and 4-month-old infants. Infants viewed canonical and scrambled faces. We found that the ERPs of 4- but not 3-month-old infants showed an enhanced face-sensitive N170 component for the scrambled stimulus. Furthermore, when canonical and scrambled faces were gazing toward an object, 4-month-olds displayed an enhanced Negative central (Nc) component, related to attentional processes, for the scrambled face. Three-month-olds did not display any of these effects. These results point to an important transition in the first months of infancy and show that triadic cues influence the processing of the human face.
The neural correlates of infant and adult goal prediction: evidence for semantic processing systems
The sequential nature of action ensures that an individual can anticipate the conclusion of an observed action via the use of semantic rules. The semantic processing of language and action has been linked to the N400 component of the event-related potential (ERP). The authors developed an ERP paradigm in which infants and adults observed simple sequences of actions. In one condition the conclusion of the sequence could be anticipated, whereas in the other condition it could not. Adults and infants at 9 and 7 months were assessed with the same neural measures: the N400 component and analysis of the theta frequency band. Results indicated that adults and 9-month-old infants produced N400-like responses when anticipating action conclusions, whereas 7-month-old infants displayed no N400 component. Analysis of the theta frequency band provided support for the relation between the N400 and semantic processing. This study suggests that 9-month-old infants anticipate goals and use cognitive mechanisms similar to those of adults in this task. In addition, this result suggests that language processing may derive from action understanding in early development.
Looking at eye gaze processing and its neural correlates in infancy: implications for social development and autism spectrum disorder
The importance of eye gaze as a means of communication is indisputable. However, there is debate about whether a dedicated neural module functions as an eye gaze detector, and about when infants become able to use eye gaze cues referentially. The application of neuroscience methodologies to developmental psychology has provided new insights into early social cognitive development. This review integrates findings on the development of eye gaze processing with research on the neural mechanisms underlying infant and adult social cognition. This research shows how a cognitive neuroscience approach can improve our understanding of social development and autism spectrum disorder.
Direct eye contact influences the neural processing of objects in 5-month-old infants
Do 5-month-old infants show differences in processing objects as a function of a prior interaction with an adult? We addressed this question using a live ERP paradigm with a within-subjects design. Infants saw objects during two pretest phases with an adult experimenter, and we recorded event-related potentials to the presentation of the objects following these interactive pretest phases. The experimental conditions differed only in the nature of eye contact between the infant and the experimenter during the pretests: in one condition the experimenter engaged the infant with direct eye contact, whereas in a second condition the experimenter looked only at the infant's chest. We found that the negative component, related to attentional processes, differed between experimental conditions at left fronto-central locations. These data show that 5-month-old infants allocate more attention to objects that they have previously seen during an interaction involving direct eye contact. In addition, these results help clarify the functional nature of the negative component.
Influence of vocal cues on learning about objects in joint attention contexts
An experimenter taught infants about a novel toy in two joint attention conditions, one with and one without vocal cues. In test trials, infants viewed the now-familiar toy alongside a novel toy. Infants in the Joint Attention plus Voice condition looked significantly longer at the novel toy.