When do infants imitate facial expressions?
Don't interrupt or look away when your baby's "talking" — show you're interested and that your little one can trust you. Most people raise the pitch of their voices and exaggerate their speech when talking to babies. This is fine — studies have shown that "baby talk" doesn't delay the development of speech — but mix in some regular adult words and tone.
It may seem early, but you're setting the stage for your baby's first word. Sometimes babies aren't in the mood to talk or vocalize — even babies need their space and a break from all the stimulation in the world. Babies might turn away, close their eyes, or become fussy or irritable.
If this happens, let your little one be or just try cuddling. There might be times when you've met all of your baby's needs, yet he or she continues to cry. Don't despair — your baby may be overstimulated, have gas, or simply have too much energy and need a good cry. It's common for babies to have a fussy period around the same time every day, generally between early evening and midnight.
Though all babies cry and show some fussiness, when an otherwise healthy infant cries for more than 3 hours per day, on more than 3 days per week, for at least 3 weeks, the condition is known as colic. This can be upsetting, but the good news is that it's short-lived — most babies outgrow it at around 3 or 4 months of age.
This indicates that mu ERD was specific to central clusters and not due to changes in occipital alpha power.
The scheme we used to code the mother-infant interaction videos, devised by Murray and colleagues 31, identifies various infant facial expressions, including smiles, mouth opening, and negative expressions. A number of maternal responses to infant expressions are also identified in the scheme, including mirroring, marking, and negative responses. More details regarding this scheme can be found in the Methods section and Supplementary Information.
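As a purely illustrative aside, the coded categories described above can be represented as simple event records for downstream analysis. The sketch below is a hypothetical format: the field names, actor labels, and time units are assumptions, not the authors' actual coding files.

```python
from dataclasses import dataclass

# Expression and response categories taken from the scheme described above;
# the data structure itself is a hypothetical representation.
INFANT_EXPRESSIONS = {"smile", "mouth_opening", "negative"}
MATERNAL_RESPONSES = {"mirroring", "marking", "negative"}

@dataclass
class CodedEvent:
    """One coded behaviour in a mother-infant interaction video."""
    actor: str       # "infant" or "mother"
    category: str    # one of the category sets above
    onset_s: float   # onset relative to the start of the clip, in seconds
    offset_s: float

def validate(event: CodedEvent) -> None:
    """Reject events whose category does not belong to the coding scheme."""
    allowed = INFANT_EXPRESSIONS if event.actor == "infant" else MATERNAL_RESPONSES
    if event.category not in allowed:
        raise ValueError(f"unknown category {event.category!r} for actor {event.actor!r}")
    if event.offset_s <= event.onset_s:
        raise ValueError("offset must come after onset")
```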
Inspection of the mirroring data revealed that mothers selectively mirrored specific infant expressions (smiles or mouth opening), rather than simply varying in their overall levels of mirroring. Accordingly, mothers were allocated to a high or low mirroring group for both smiles and mouth opening (see Methods). Note that for all mixed models described here, visual inspection of residual plots did not reveal any deviations from homoscedasticity or normality. All p values were based on Kenward-Roger corrected degrees of freedom, and all post-hoc tests (least-squares means) were corrected for multiple comparisons using Tukey-Kramer contrasts.
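To make the allocation step concrete, here is a rough sketch of how an expression-specific mirroring rate could be computed from coded events and used to split mothers into high and low groups. The response window and the median-split criterion are illustrative assumptions; the actual allocation procedure is described in the Methods.

```python
from statistics import median

def mirroring_rate(events, expression, window_s=2.0):
    """Proportion of infant expressions of one type that receive a maternal
    'mirroring' response within `window_s` seconds (an assumed linking rule).

    `events` is a list of dicts, e.g.
    {"actor": "infant", "category": "smile", "onset_s": 12.3, "offset_s": 13.1}.
    """
    infant = [e for e in events
              if e["actor"] == "infant" and e["category"] == expression]
    mirrors = [e for e in events
               if e["actor"] == "mother" and e["category"] == "mirroring"]
    if not infant:
        return 0.0
    hits = sum(
        any(0.0 <= m["onset_s"] - i["offset_s"] <= window_s for m in mirrors)
        for i in infant
    )
    return hits / len(infant)

def allocate_groups(rate_by_mother):
    """Median split of mothers into 'high' and 'low' mirroring groups (assumed criterion)."""
    cut = median(rate_by_mother.values())
    return {m: ("high" if rate > cut else "low")
            for m, rate in rate_by_mother.items()}
```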
Significant main effects of condition [F(1, …)] were found, and the three-way interaction was followed up with planned pairwise comparisons. In the right hemisphere, there was significantly more mu desynchronization during the happy condition in the high compared to the low mirroring group [t(…)]. In the left hemisphere, there was significantly more mu desynchronization during the mouth opening condition for the high compared to the low mirroring group [t(…)] (see Fig. …). Figure: Infant mu ERD in high and low mirroring groups during observation.
Infant mu ERD during observation of the mouth opening and happy conditions in the low and high maternal mirroring groups for each expression, in both the left and right hemisphere. In order to confirm that the effects in the previous analysis were specific to maternal mirroring, control analyses were run to rule out any influence of purely motor (infant execution) or visual (maternal execution) experience during the early interactions. To do this, two linear mixed models were used to investigate whether any relationship existed between the base rates of mouth opening or smiles, by infant or mother, during the two-month interactions and infant mu ERD in central electrode clusters during observation of those same expressions.
Finally, we tested whether the mirroring effects above might actually be accounted for by more general measures of maternal behaviour, rather than by the specific correspondence between infant motor activity and maternal mirroring responses. To do this, two linear models were used to investigate associations between infant mu ERD during observation and (i) the overall rate of maternal mirroring (i.e., …).
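A rough sketch of the structure of these models is given below, using synthetic data. The column names and the statsmodels-based fitting are illustrative assumptions; the Kenward-Roger correction and Tukey-Kramer contrasts reported in the paper are typically obtained with R packages such as lme4/pbkrtest and are not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic long-format data: one row per infant x observed expression,
# with hypothetical column names.
rng = np.random.default_rng(0)
n = 20
infant_id = np.repeat([f"s{i:02d}" for i in range(n)], 2)
condition = np.tile(["happy", "mouth_opening"], n)
base_rate = rng.uniform(0.2, 2.0, 2 * n)                       # expressions/min at two months
overall_mirroring = np.repeat(rng.uniform(0.05, 0.40, n), 2)   # expression-unspecific mirroring rate
mu_erd = -0.2 - 0.05 * base_rate + rng.normal(0, 0.05, 2 * n)  # % change; negative = desynchronization

df = pd.DataFrame({"infant_id": infant_id, "condition": condition, "mu_erd": mu_erd,
                   "base_rate": base_rate, "overall_mirroring": overall_mirroring})

# Control analysis 1: linear mixed model relating two-month base rates of an
# expression to mu ERD during observation of that expression, with a random
# intercept per infant.
mixed = smf.mixedlm("mu_erd ~ base_rate * condition", data=df,
                    groups=df["infant_id"]).fit()
print(mixed.summary())

# Control analysis 2: simple linear model relating mu ERD to the overall
# maternal mirroring rate, irrespective of which expression was mirrored.
ols = smf.ols("mu_erd ~ overall_mirroring", data=df).fit()
print(ols.summary())
```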
More information about these extra analyses, as well as some additional control analyses testing the specificity of these effects, can be found in the Supplementary Information file. The aim of this study was to investigate whether, in human infants, as in adults and macaques (e.g., …), the motor system is recruited during observation of facial expressions. Additionally, we wished to test the hypothesis that early maternal mirroring predicts the development of the neural mechanism mapping between self- and other-generated facial movements 30,31,41. The pattern of mu ERD revealed here suggests that motor regions are activated in nine-month-old infants not only during execution of facial expressions, but also during their observation, thereby supporting the existence of a facial action-perception network at this young age.
In addition, and in line with our hypothesis, greater maternal mirroring of a particular facial expression (mouth opening or smiles) at two months postpartum predicted stronger infant mu ERD during observation of the same expression later in infancy. This result constitutes the first evidence that visuomotor experience 29,41,42, afforded by maternal facial mirroring, facilitates the development of a neural action-perception matching mechanism for faces.
In the EEG experiment, infants showed significant mu ERD in central electrode clusters during observation of various facial expressions (mouth opening, happy, sad) relative to a static neutral face, but not during observation of scrambled versions of those same expressions. The lack of mu ERD during observation of the scrambled stimuli indicates that desynchronization in the other conditions was not simply a function of observing a moving face-like stimulus, or of other attentional factors.
Additionally, no significant mu ERD was found in occipital electrode clusters, suggesting that central responses were not driven by alpha desynchronization in visual cortex, and were specific to motor cortical regions.
These results therefore indicate recruitment of the motor system during facial expression processing. Of particular note was our finding that infants whose mothers mirrored either smiles or mouth opening more often during early social interactions showed greater mu ERD during observation of happy and mouth opening stimuli, respectively, in the later EEG experiment.
Control analyses confirmed that this was not simply the result of increased motor or visual experience, and no relationship was found between more general measures of maternal mirroring and infant mu ERD. This provides evidence that maternal mirroring supports the development of an action-perception matching mechanism by strengthening specific visuomotor mappings, rather than by broadly modulating motor system responses through some other, generalised mechanism.
One previous study with macaques does, however, indicate a more general influence of early mother-infant interactions on the development of this mechanism: mu ERD in a group of mother-reared macaques was found to be greater during observation of facial gestures than in nursery-reared infants, but the specific experiential factors that may have contributed to this were not considered.
As well as substantiating the idea that maternal mirroring is important for the development of a brain network that couples visual representations with corresponding motor programs, our results are in line with studies demonstrating a relationship between exposure to atypical emotional environments and altered infant neural activity during observation of emotional expressions 56. Similarly, our findings are consistent with more recent research showing how even normal variation in mother-infant interaction quality influences infant brain development 58, including facial expression processing 4.
Notably, however, our study extends this previous work in that it tested hypotheses concerning the role of specific kinds of early social experience in the development of particular neural mechanisms, rather than more generic measures of the social environment.
Although the patterns of mu ERD were very similar to those we identified previously in …-month-old children 34, one difference was revealed. Unlike the older children, who demonstrated right-lateralized activity for all emotional expressions, infants in the current study exhibited bilateral mu ERD for sad expressions. The reason for this is unclear, but it could reflect a more refined response for happy compared to sad expressions by nine months of age.
Interestingly, ERD in the happy condition also appeared more right-lateralized in infants whose mothers mirrored smiles more often, consistent with right-hemisphere specialization for emotional face processing 15. By corollary, the lack of lateralization for sad expressions at nine months could possibly be linked to less maternal mirroring of negative infant expressions (i.e., …).
Given that depressed mothers show atypical levels of mirroring during early interactions 60, responding more to negative and less to positive infant expressions 55, it would be both clinically and scientifically relevant to explore how such differences in maternal mirroring in the context of depression might affect development of a facial action-perception mechanism; and ultimately, the later problems in affective regulation characteristic of offspring of depressed mothers.
Although our study indicates an influence of social experience in the early postpartum period, it cannot speak to the status of a facial action-perception network at birth. In humans, maternal mirroring has very specific effects on emerging infant social behaviour 31, and therefore an action-perception network could play an important part in very early face-to-face interactions.
Murray et al. … However, there are two points to keep in mind when interpreting this result. First, although mothers did not mirror negative expressions very often during the recorded interaction periods, this does not mean that they never mirrored these expressions. Potentially communicative infant expressions, such as smiles and mouth opening, are the most likely to be mirrored during early interactions, but mothers also mirror expressions of negative affect 61,62, albeit less frequently in typical populations.
If we had observed more mother-infant interactions, we might have observed more instances of negative mirroring; however, we would expect the relative proportion of mirrored expressions (i.e., mostly smiles and mouth opening, with fewer negative expressions) to remain similar. If the infant brain is highly sensitive to maternal mirroring 31, visuomotor mappings for negative facial expressions could be strengthened even with very little experience of being mirrored.
Second, and as noted previously, mu ERD for sad expression observation at nine months was bilateral, whereas a right-lateralized response to this condition was found in …-month-old children. One hypothesis is that mu ERD occurs during observation of all facial expressions from a very early age in human infants, but that maternal mirroring then refines the visuomotor networks involved, resulting in right-lateralized representations for different emotional expressions at varying times.
A limitation should be acknowledged regarding the execution condition in our EEG experiment. As the participants were very young infants, it was not feasible to include an explicit facial expression execution condition; execution trials instead had to come from spontaneously produced expressions, which resulted in relatively few trials per expression type.
The subsequent need to combine different facial expressions into one condition for analysis restricts conclusions concerning the specificity of action-perception coupling. However, the overlap in neural activity revealed during observation and execution in central regions is still indicative of an action-perception matching mechanism. Further, although these results strongly suggest that early maternal mirroring of infant facial expressions influences the degree of infant motor system activity during observation of the same expressions later on, the data presented here are correlational, and thus cannot prove causality of the relationship.
Future research involving the systematic manipulation of mirroring variables would help to address this point. Finally, as with many studies in this field, our sample size was modest, and replication with a larger sample is required to establish the reliability of the effects we have reported. In summary, our findings suggest that the motor system is recruited during observation of facial expressions in human infants, and that early maternal mirroring facilitates the development of a mechanism mapping between own- and other-generated expressions.
Given how critical this developmental process is for the individual, our results also underscore the value of analysing mother-infant dyads as tightly-coupled systems in which infant behaviour influences maternal responses, which, in turn, shape the development of the infant brain.
Mothers gave written, informed consent before participation. Further details concerning this sample and exclusions are provided in the Supplementary Information file. Stimuli consisted of short videos of female actors executing various facial expressions (Fig. …). Previous studies have utilized static or non-biological moving stimuli in control conditions 24,33; however, a scrambled stimuli condition was used here instead, to control for low-level visual features and overall movement across all experimental conditions. Equivalent videos were made for the non-emotional mouth opening condition, comparable with the ADFES stimuli in terms of onset, duration of movement, size, brightness, contrast, and spatial frequency.
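The scrambling procedure itself is not detailed in this excerpt. One common way to build such control stimuli is to phase-scramble each frame, which preserves low-level image statistics (brightness, contrast, spatial-frequency content) while removing facial structure; the sketch below illustrates that idea under those assumptions and is not necessarily the authors' procedure.

```python
import numpy as np

def phase_scramble(frame: np.ndarray, rng=None) -> np.ndarray:
    """Phase-scramble a greyscale video frame (2-D float array in [0, 1]).

    Randomising Fourier phases while keeping amplitudes preserves the
    spatial-frequency content but destroys recognisable facial form; this is
    one common way to make scrambled control stimuli (assumed here, the
    paper's exact method may differ).
    """
    rng = np.random.default_rng() if rng is None else rng
    spectrum = np.fft.fft2(frame)
    random_phase = np.exp(1j * rng.uniform(0, 2 * np.pi, frame.shape))
    scrambled = np.real(np.fft.ifft2(np.abs(spectrum) * random_phase))
    # Rescale back to the original intensity range.
    scrambled -= scrambled.min()
    scrambled /= max(scrambled.max(), 1e-12)
    return scrambled
```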
Figure: Time-course of stimuli in the four experimental conditions. Experimental stimuli were presented on the monitor using PsychoPy v1. These clips were randomized within blocks, and the blocks themselves were pseudo-randomized so that the same condition could not be presented more than twice in succession.
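As a minimal illustration of that constraint, the sketch below draws a block order in which no condition appears more than twice in a row; the rejection-sampling strategy and the condition labels are assumptions for illustration, and the real experiment was run in PsychoPy.

```python
import random

def pseudo_randomize(conditions, n_blocks, max_run=2, seed=None):
    """Return a block order in which the same condition never appears more
    than `max_run` times in succession (the constraint described above)."""
    rng = random.Random(seed)
    while True:
        order = [rng.choice(conditions) for _ in range(n_blocks)]
        # Every window of max_run + 1 consecutive blocks must contain
        # at least two distinct conditions.
        runs_ok = all(
            len(set(order[i:i + max_run + 1])) > 1
            for i in range(len(order) - max_run)
        )
        if runs_ok:
            return order

# Example with hypothetical condition labels and 25 blocks.
print(pseudo_randomize(["happy", "sad", "mouth_opening", "scrambled"], 25, seed=1))
```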
The experiment was terminated if the infant became too inattentive or distressed, moved excessively, or after 25 experimental blocks had been presented. Data were sampled at … Hz with an analogue band-pass filter of 0.…–… Hz. An experimental block began when triggered manually by an experimenter who was watching the participant on a screen from another section of the room. Trial blocks were triggered as soon as the infant was attentive to the monitor.
Synchronous video recordings of the experiment were examined offline to allow exclusion of trials in which the infant was inattentive, and to enable coding of infant expression execution.
To identify trials in which infants executed the facial expressions presented during experimental blocks, their expressions (happy, sad, and mouth opening) were coded offline from the video recordings. All videos were coded by a research assistant blind to the experimental condition being presented. Videos were viewed in real time and frame by frame to accurately identify the onsets and offsets of movements. Periods of inattention were marked using EGI software (NetStation v4.) after viewing the video recordings of the infants during the experiment.
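The exclusion of inattentive trials can be sketched as a simple interval-overlap check; the tuple-based format below is a hypothetical stand-in for the NetStation markers actually used in the pipeline.

```python
def attentive_trials(trial_windows, inattention_periods):
    """Return indices of trials that do not overlap any coded inattention period.

    `trial_windows` and `inattention_periods` are lists of (start_s, end_s)
    tuples on a common video timeline (an assumed representation).
    """
    def overlaps(a, b):
        return a[0] < b[1] and b[0] < a[1]

    return [
        i for i, trial in enumerate(trial_windows)
        if not any(overlaps(trial, gap) for gap in inattention_periods)
    ]

# Example: the second trial overlaps an inattention period and is dropped.
print(attentive_trials([(0.0, 2.0), (2.5, 4.5), (5.0, 7.0)], [(3.0, 3.5)]))
# -> [0, 2]
```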
More details about the pre-processing steps used before data analysis can be found in the Supplementary Information. ERD was calculated for four clusters of electrodes.
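For orientation, a minimal sketch of an ERD computation for one electrode cluster follows. The frequency band, the baseline and observation windows, and the Welch-based power estimate are assumptions (the infant mu rhythm is commonly taken as roughly 6–9 Hz); the paper's exact parameters are given in the Supplementary Information.

```python
import numpy as np
from scipy.signal import welch

def mu_erd(baseline, event, fs, band=(6.0, 9.0)):
    """Event-related desynchronization (ERD) for one electrode cluster.

    `baseline` and `event` are 1-D arrays of cluster-averaged EEG from the
    pre-stimulus and observation windows, sampled at `fs` Hz. ERD is the
    percentage change in band power relative to baseline, so negative values
    indicate desynchronization. The 6-9 Hz band is a common choice for the
    infant mu rhythm; the paper's exact band and windows may differ.
    """
    def band_power(signal):
        freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), int(fs)))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return float(np.mean(psd[mask]))

    p_base = band_power(baseline)
    p_event = band_power(event)
    return 100.0 * (p_event - p_base) / p_base
```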
In this study, we evaluated the effect of different moderators on the production of facial expressions (FEs) in children between 6 and 11 years old. We found that age, emotion, task, and cultural environment modulate their productions. Production on request was also easier than production when imitating an avatar model. Taking these variables into account is necessary for evaluating the competences of typically developing children, but also for comparison with a pathological population.
In future research, we plan to propose this protocol to children with ASD in order to characterize their productions and compare them with those of typically developing children. We will also use the dataset to train classification algorithms for FE recognition, with a view to integrating them into the serious game JEMImE. HP: analysis and interpretation of data, drafting the work.
PF is general director of Groupe Genious Healthcare, a private company that develops serious games for health purposes. The other authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.