
Thursday, September 8, 2016

Baby talk words with repeated sounds help infants learn language

Babies find it easier to learn words with repetitive syllables rather than mixed sounds, a study suggests.

Assessments of language learning in 18-month-olds suggest that children are better at grasping the names of objects when those names are made of repeated syllables rather than non-identical syllables.

Researchers say the study may help explain why some words or phrases, such as 'train' and 'good night', have given rise to versions with repeated syllables, such as choo-choo and night-night.

The researchers say such words are easier for infants to learn, and may provide them with a starting point for vocabulary learning.

A team from the University of Edinburgh assessed the infants' language learning behaviour in a series of visual and attention tests using pictures on a computer screen of two unfamiliar objects.

The two objects were named with made-up words which were communicated to the infants by a recorded voice - one with two identical syllables, for example neenee, and the other without repeated syllables, such as bolay.

The infants were then tested for their recognition of each made-up word. Recordings of their eye movements showed they looked more reliably at the object labelled with repeated syllables than at the other object.
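
For readers curious how "looked more reliably at" is typically quantified in studies like this, here is a minimal, hypothetical sketch of a preferential-looking measure: the proportion of object-directed gaze samples that land on the named target. The coding scheme, sample data and function name are illustrative assumptions, not the authors' actual analysis.

```python
# Illustrative sketch only -- not the study's analysis code.
# Assumes each eye-tracker sample has already been coded as a look to the
# "target" (the object named with repeated syllables), the "distractor",
# or "away", a common setup in preferential-looking studies.

def target_preference(gaze_samples):
    """Proportion of object-directed looking that falls on the target."""
    target = sum(1 for s in gaze_samples if s == "target")
    distractor = sum(1 for s in gaze_samples if s == "distractor")
    total = target + distractor
    return target / total if total else None  # None: no usable looking

# Hypothetical trial: one entry per eye-tracker sample.
trial = ["target", "target", "away", "target", "distractor", "target"]
print(target_preference(trial))  # 0.8 -> mostly looked at the named object
```

Scores reliably above 0.5 across infants would correspond to the looking pattern described above.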

Researchers validated their results with a control test, in which the infants responded to pictures of familiar objects - such as a dog or an apple.

Previous studies show that infants more easily learn patterns involving repetition in visual sequences and musical notes. Researchers say these latest findings show that this tendency also applies to word learning.

Lead researcher Mitsuhiko Ota, of the University of Edinburgh's School of Philosophy, Psychology and Language Sciences, said: "This is the first evidence to show that infants have a repetition bias in learning new words. It also shows that there may be a good reason why in so many cultures across the world, existing adult words and expressions are replaced by words with repeated syllables in baby-talk vocabulary. Some examples could be tum-tum, mama, dada, din-din and wee-wee."
_________________
Reference:

EurekAlert. 2016. “Baby talk words with repeated sounds help infants learn language”. EurekAlert. Posted: May 27, 2016. Available online: http://www.eurekalert.org/pub_releases/2016-05/uoe-btw052716.php

Wednesday, May 11, 2016

Bilingual baby brains show increased activity in executive function regions

Many brain studies show that bilingual adults have more activity in areas associated with executive function, a set of mental abilities that includes problem-solving, shifting attention and other desirable cognitive traits.

Now new findings reveal that this bilingualism-related difference in brain activity is evident as early as 11 months of age, just as babies are on the verge of producing their first words.

"Our results suggest that before they even start talking, babies raised in bilingual households are getting practice at tasks related to executive function," said Naja Ferjan Ramírez, lead author and a research scientist at the Institute for Learning & Brain Sciences (I-LABS) at the University of Washington.

"This suggests that bilingualism shapes not only language development, but also cognitive development more generally," she said.

The study also gives evidence that the brains of babies from bilingual families remain more open to learning new language sounds, compared with babies from monolingual families.

The study was published online April 4 in Developmental Science and will appear in an upcoming issue of the journal.

"Monolingual babies show a narrowing in their perception of sounds at about 11 months of age -- they no longer discriminate foreign-language sounds they successfully discriminated at 6 months of age," said co-author Patricia Kuhl, co-director of I-LABS.

"But babies raised listening to two languages seem to stay 'open' to the sounds of novel languages longer than their monolingual peers, which is a good and highly adaptive thing for their brains to do," Kuhl said.

The researchers used magnetoencephalography (MEG), which measures magnetic changes given off by active nerve cells. Unlike other brain-imaging methods, MEG can precisely pinpoint both the timing and location of activity in the brain.

The study is the first to use MEG to do whole-brain analyses comparing activation patterns in response to speech sounds in babies raised in monolingual and bilingual households.

In the experiment, 16 11-month-old babies -- 8 from English-only households and 8 from Spanish-English households, with an even mix of demographic factors such as the family's socioeconomic status -- sat in a highchair beneath the helmet-like MEG scanner.

The babies listened to an 18-minute stream of speech sounds, such as "da's" and "ta's." The stream included sounds specific to English or Spanish, and sounds shared by the two languages.

The researchers compared monolingual and bilingual babies' brain responses to the language sounds.

The most obvious difference they saw was in two brain regions associated with executive function, the prefrontal cortex and orbitofrontal cortex. In these regions, the Spanish-English bilingual babies had stronger brain responses to speech sounds, compared with English-only babies.
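
As a rough, hypothetical illustration of the kind of group comparison being described (the actual MEG analysis involved whole-brain statistics and was far more involved), the sketch below compares the mean of a single region-of-interest response measure between the two groups of eight babies. All values are invented placeholders.

```python
# Illustrative sketch: comparing a hypothetical prefrontal response measure
# between groups, in the spirit of "stronger brain responses in bilinguals".
# The numbers are invented placeholders, not data from the study.
from statistics import mean

bilingual = [2.1, 2.4, 1.9, 2.6, 2.2, 2.8, 2.0, 2.5]    # 8 Spanish-English babies
monolingual = [1.6, 1.8, 1.5, 2.0, 1.7, 1.9, 1.4, 1.8]  # 8 English-only babies

print(f"bilingual mean:   {mean(bilingual):.2f}")
print(f"monolingual mean: {mean(monolingual):.2f}")
print(f"group difference: {mean(bilingual) - mean(monolingual):.2f}")
```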

The findings align with brain studies in bilingual and monolingual adults, Ferjan Ramírez said. The boost bilingualism gives to executive function areas in the brain could arise from bilinguals needing to switch back and forth between languages, allowing them to routinely practice and improve executive function skills.

Other brain evidence from the study should be a relief for parents wondering if their bilingual baby is learning enough language:

  • Bilingual babies displayed neural sensitivity to both English and Spanish sounds, meaning that they were learning both languages.
  • Bilingual babies had the same sensitivity to English sounds as the monolingual babies, which suggests that they were learning English at the same rate as the monolingual babies.

"The 11-month-old baby brain is learning whatever language or languages are present in the environment and is equally capable of learning two languages as it is of learning one language," Ferjan Ramírez said.

"Our results underscore the notion that not only are very young children capable of learning multiple languages, but that early childhood is the optimum time for them to begin," she said.
_________________
Reference:

EurekAlert. 2016. “Bilingual baby brains show increased activity in executive function regions”. EurekAlert. Posted: April 4, 2016. Available online: http://www.eurekalert.org/pub_releases/2016-04/uow-bbb040416.php

Saturday, December 12, 2015

Language acquisition: From sounds to the meaning

Do young infants know that words in language 'stand for' something else?

Without understanding the 'referential function' of language (words as 'verbal labels', symbolizing other things) it is impossible to learn a language. Is this implicit knowledge already present early in infants? 

The word "apple," as we pronounce it, is a sequence of sounds (phonemes) that we use whenever we want to refer to the object it indicates. If we did not know that a referential relationship exists between the sound and the object it would be impossible for us to use, and learn, a language. Where does this implicit knowledge come from, and how early in human development does it manifest? This is the question Hanna Marno and her SISSA colleagues Marina Nespor and Jacques Mehler in a collaboration with Teresa Farroni, from the University of Padova, attempted to answer in a study just published in Scientific Reports.

"A sensitivity to speech sounds is already present in newborns. These types of sounds are in fact perceived as special starting from the first days of life, and they are processed differently from other types of auditory stimuli. What makes this type of stimulus so special for the newborn?" asks Marno. "There's definitely a 'social' saliency: speech sounds signal interaction between conspecifics, which is important for the survival of the infant. But there is also another important aspect, i.e., referentiality: words are symbols that carry meanings and convey messages. If infants didn't know this, albeit implicitly, they wouldn't be able to acquire language."

"Try to imagine an infant who, on several occasions, sees his mother holding up a cup while uttering the word 'cup'," explains the researcher. "He could just think that this is something his mum would do whenever holding the cup, a strange habit of hers. But instead in a short while he will learn that the word refers to that object, as if he were 'programmed' to do so."

To test this hypothesis, Marno conducted experiments with 4-month-old infants. The babies watched a series of videos where a person might (or might not) utter an (invented) name of an object, while directing (or not directing) their gaze towards the position on the screen where a picture of the object would appear. By monitoring the infants' gaze, Marno and colleagues observed that, in response to speech cues, the infants looked for the visual object faster, indicating that they were ready to find a potential referent of the speech. However, this effect did not occur if the person in the video remained silent or if the sound was a non-speech sound.

"The mere fact of hearing verbal stimuli placed the infants in a condition to expect the appearance, somewhere, of an object to be associated with the word, whereas this didn't happen when there was no speech, even when the person in the video directed the infant's gaze to where the object would appear, concludes Marno. "This suggests that infants at this early age already have some knowledge that language implies a relation between words and the surrounding physical world. Moreover, they are also ready to find out these relations, even if they don't know anything about the meanings of the words yet. Thus, a good advice to mothers is to speak to their infants, because infants might understand much more than they would show, and in this way their attention can be efficiently guided by their caregivers. This doesn't only facilitate the task of acquiring a language, but also helps to learn about their surrounding world."
_________________
Reference:

Science Daily. 2015. “Language acquisition: From sounds to the meaning”. Science Daily. Posted: September 2, 2015. Available online: http://www.sciencedaily.com/releases/2015/09/150902093259.htm

Sunday, October 18, 2015

Babies' brains show that social skills linked to second language learning

Babies learn language best by interacting with people rather than passively through a video or audio recording. But it's been unclear what aspects of social interactions make them so important for learning.

New findings by researchers at the Institute for Learning & Brain Sciences (I-LABS) at the University of Washington demonstrate for the first time that an early social behavior called gaze shifting is linked to infants' ability to learn new language sounds.

Babies about 10 months old who engaged in more gaze shifting during sessions with a foreign language tutor showed a boost in a brain response that indicates language learning, according to the study, which is published in the current issue of Developmental Neuropsychology.

"Our study provides evidence that infants' social skills play a role in cracking the code of the new language," said co-author Patricia Kuhl, co-director of I-LABS.

"We found that the degree to which infants visually tracked the tutors and the toys they held was linked to brain measures of infant learning, showing that social behaviors give helpful information to babies in a complex natural language learning situation," Kuhl said.

Gaze shifting, when a baby makes eye contact and then looks at the same object that the other person is looking at, is one of the earliest social skills that babies show.

"These moments of shared visual attention develop as babies interact with their parents, and they change the baby's brain," said co-author Rechele Brooks, research assistant professor at I-LABS.

In an earlier report, Brooks and others showed that infant gaze shifting serves as a building block for more sophisticated language and social skills as measured in preschool children.

"Since gaze shifting is linked to a larger vocabulary in preschoolers, we suspected that eye gaze might be important earlier when babies are first learning the sounds of a new language, and we wanted to use brain measures to test this," Brooks said.

In the experiment, 9.5-month-old babies from English-speaking households attended foreign language tutoring sessions. Over four weeks, the 17 infants interacted with a tutor during twelve 25-minute sessions. The tutors read books and talked and played with toys while speaking in Spanish.

At the beginning and end of the four-week period, researchers counted how often the infants shifted their eye gaze between the tutor and the toys the tutor showed the baby. 

After the tutoring sessions ended, the researchers brought the babies back to the lab to see how much Spanish the babies had learned. This was measured by their brain responses to English and Spanish sounds. The babies listened to a series of language sounds while wearing an electroencephalography (EEG) cap to measure their brain activity.

The results showed that the more gaze shifting the babies participated in during their tutoring sessions, the greater their brain responses were to the Spanish language sounds.
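
The relationship reported here is, at its core, a correlation between a behavioural count and a brain measure. Below is a minimal sketch of that kind of analysis; the gaze-shift counts and response amplitudes are invented placeholders, not the study's data.

```python
# Illustrative sketch: correlating per-infant gaze-shift counts with an EEG
# response measure ("more gaze shifting -> greater brain response").
# All values are invented placeholders.
from scipy import stats

gaze_shifts = [12, 20, 8, 25, 15, 30, 10, 18, 22, 27]                # per infant
brain_response = [1.1, 1.8, 0.9, 2.2, 1.4, 2.5, 1.0, 1.6, 1.9, 2.3]  # per infant

r, p = stats.pearsonr(gaze_shifts, brain_response)
print(f"r = {r:.2f}, p = {p:.4f}")  # positive r mirrors the reported pattern
```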

"Our findings show that young babies' social engagement contributes to their own language learning -- they're not just passive listeners of language," Brooks said. "They're paying attention, and showing parents they're ready to learn when they're looking back and forth. That's when the most learning happens."

The study builds on earlier work by Kuhl's team, which found that babies from English-speaking households could learn Mandarin from live tutors but not from video or audio recordings of Mandarin, and on other work at I-LABS establishing the importance of infant eye gaze for language learning.

The researchers hope their findings help parents, caregivers and early childhood educators develop strategies for teaching young children.

"Babies learn best from people," Brooks said. "During playtime your child is learning so much from you. Spending time with your child matters. Keeping them engaged -- that's what helps them learn language."
_________________
Reference:

Science Daily. 2015. “Babies' brains show that social skills linked to second language learning”. Science Daily. Posted: July 27, 2015. Available online: http://www.sciencedaily.com/releases/2015/07/150727100024.htm

Saturday, October 17, 2015

Mum's the word: Maternal language has strong effect on children's social skills

Psychologists at the University of York have revealed new evidence showing how specific language used by parents to talk to their babies can help their child to understand the thoughts of others when they get older.

Studying the effects of maternal mind-mindedness (the ability to 'tune in' to their young child's thoughts and feelings), lead author Dr Elizabeth Kirk observed 40 mothers and their babies when they were 10, 12, 16, and 20 months old.

Keeping a record of parental language while a mother and her child played for 10 minutes, psychologists logged every time the mother made 'mind related comments' - inferences about their child's thought processes through their behaviour (for example, if an infant had difficulty with opening a door on a toy car, they could be labelled as 'frustrated').

When the children reached 5-6 years old, the researchers revisited 15 of the mother-child pairs and assessed each child's Theory of Mind (ToM), or socio-cognitive ability. Using the 'strange stories' method, they recorded the level at which each child was able to relate to others and understand another person's thoughts.

The strange stories method involves reading a fictional vignette to the child which poses one of 12 social scenarios (contrary emotions, lies, white lies, persuasion, pretend, joke, forget, misunderstanding, double-bluff, figure of speech, appearance versus reality or sarcasm). Children are then asked a comprehension question followed by a test to prove whether they have understood the mental manipulation covered in the story.

Results showed a strong, positive correlation between mind-related comments at 10, 12 and 20 months old and a child's score on the strange stories task. Therefore, children's ability to understand the thoughts of other people when they were aged 5 was related to how mind-minded their mothers were when they were babies.

Dr Kirk, Lecturer in York's Department of Psychology, said: "These findings show how a mother's ability to tune-in to her baby's thoughts and feelings early on helps her child to learn to empathise with the mental lives of other people. This has important consequences for the child's social development, equipping children to understand what other people might be thinking or feeling.

"These results are significant as they demonstrate the critical role of conversational interaction between mothers and their children in infancy. This also supports previous research led by psychologist Professor Liz Meins, who leads mind-mindedness research at York."
_________________
Reference:

EurekAlert. 2015. “Mum's the word: Maternal language has strong effect on children's social skills”. EurekAlert. Posted: July 27, 2015. Available online: http://www.eurekalert.org/pub_releases/2015-07/uoy-mtw072715.php

Sunday, April 5, 2015

Brain activity shows infants are hardwired to link images, sounds as they learn to speak

New research examining electrical brain activity in infants suggests that we are biologically predisposed to link images and sounds to create language.

In a paper published in the journal Cortex, an international team of researchers in the UK and Japan, including those at the University of Warwick, examined the electrical activity of the brain in 11-month-olds at the initial stages of word learning.

They used novel words ('kipi' or 'moma') to refer to pictures of a spiky or a rounded shape. They found the infants very quickly began to match the word to the image.

One of the authors, Dr Sotaro Kita from the University of Warwick, said: "The oscillatory activity of the infant brain increased when the word they heard matched the shape they were shown, compared to when it did not. This suggests that the infant brain spontaneously engages in matching visual and auditory input."

An analysis of how different areas of the brain communicate with each other also showed surprising results. Dr Kita said: "Communication traffic between regions of the brain was light when the word matched the shape, but the traffic became heavy, especially in the left hemisphere, where language is typically processed, when the word did not match the shape. The left hemisphere had to work harder to associate visual and auditory input when they were not a natural match."

"The N400 response was higher for mismatching word-image pairs, which is a classic index of word meaning processing in the brain. This indicates that the infants were trying to work out the meaning of the novel words."

Dr Kita added that these findings reveal that sound symbolism allows 11-month-old infants to spontaneously bind a speech sound to a visual referent, and that this spontaneous binding may provide infants with an insight that spoken words refer to objects they can see in the world.

He said: "It is this cross-modal mapping between sound and image that plays a key role in the origin and development of language-learning."
_________________
Reference:

Science Daily. 2015. “Brain activity shows infants are hardwired to link images, sounds as they learn to speak”. Science Daily. Posted: February 23, 2015. Available online: http://www.sciencedaily.com/releases/2015/02/150223122502.htm

Monday, November 17, 2014

Improving babies' language skills before they're even old enough to speak

A Rutgers researcher focuses infants on noticing the sounds that are most important

In the first months of life, when babies begin to distinguish sounds that make up language from all the other sounds in the world, they can be trained to more effectively recognize which sounds "might" be language, accelerating the development of the brain maps which are critical to language acquisition and processing, according to new Rutgers research.

The study by April Benasich and colleagues of Rutgers University-Newark is published in the October 1 issue of the Journal of Neuroscience. The researchers found that when 4-month-old babies learned to pay attention to increasingly complex non-language audio patterns and were rewarded for correctly shifting their eyes to a video reward when the sound changed slightly, their brain scans at 7 months old showed they were faster and more accurate at detecting other sounds important to language than babies who had not been exposed to the sound patterns.

"Young babies are constantly scanning the environment to identify sounds that might be language," says Benasich, who directs the Infancy Studies Laboratory at the University's Center for Molecular and Behavioral Neuroscience. "This is one of their key jobs -- as between 4 and 7 months of age they are setting up their pre-linguistic acoustic maps. We gently guided the babies' brains to focus on the sensory inputs which are most meaningful to the formation of these maps."

Acoustic maps are pools of interconnected brain cells that an infant brain constructs to allow it to decode language both quickly and automatically – and well-formed maps allow faster and more accurate processing of language, a function that is critical to optimal cognitive functioning. Benasich says babies of this particular age may be ideal for this kind of training.

"If you shape something while the baby is actually building it," she says, "it allows each infant to build the best possible auditory network for his or her particular brain. This provides a stronger foundation for any language (or languages) the infant will be learning. Compare the baby's reactions to language cues to an adult driving a car. You don't think about specifics like stepping on the gas or using the turn signal. You just perform them. We want the babies' recognition of any language-specific sounds they hear to be just that automatic."

Benasich says she was able to accelerate and optimize the construction of babies' acoustic maps, as compared to those of infants who either passively listened or received no training, by rewarding the babies with a brief colorful video when they responded to changes in the rapidly varying sound patterns. The sound changes could take just tens of milliseconds, and became more complex as the training progressed.

"While playing this fun game we can convey to the baby, 'Pay attention to this. This is important. Now pay attention to this. This is important,'" says Benasich, "This process helps the baby to focus tightly on sounds in the environment that 'may' have critical information about the language they are learning. Previous research has shown that accurate processing of these tens-of-milliseconds differences in infancy is highly predictive of the child's language skills at 3, 4 and 5 years."

The experiment has the potential to provide lasting benefits. The EEG (electroencephalogram) scans showed the babies' brains processed sound patterns with increasing efficiency at 7 months of age after six weekly training sessions. The research team will follow these infants through 18 months of age to see whether they retain and build upon these abilities with no further training. That outcome would suggest to Benasich that once the child's earliest acoustic maps are formed in the most optimal way, the benefits will endure.

Benasich says this training has the potential to advance the development of typically developing babies as well as children at higher risk for developmental language difficulties.

For parents who think this might turn their babies into geniuses, the answer is – not necessarily. Benasich compares the process of enhancing acoustic maps to some people's wishes to be taller. "There's a genetic range to how tall you become – perhaps you have the capacity to be 5'6" to 5'9"," she explains. "If you get the right amounts and types of food, the right environment, the right exercise, you might get to 5'9" but you wouldn't be 6 feet. The same principle applies here."

Benasich says it's very likely that one day parents at home will be able to use an interactive toy-like device – now under development – to mirror what she accomplished in the baby lab and maximize their babies' potential. For the 8 to 15 percent of infants at highest risk for poor acoustic processing and subsequent delayed language, this baby-friendly behavioral intervention could have far-reaching implications and may offer the promise of improving or perhaps preventing language difficulties.
_________________
Reference:

EurekAlert. 2014. “Improving babies' language skills before they're even old enough to speak”. EurekAlert. Posted: September 30, 2014. Available online: http://www.eurekalert.org/pub_releases/2014-09/ru-ibl092614.php

Thursday, October 2, 2014

Singaporean birth cohort study finds benefits for babies exposed to 2 languages

Six-month-old infants spoken to in 2 languages show better learning and memory than infants exposed to 1 language

A team of investigators and clinician-scientists in Singapore and internationally have found that there are advantages associated with exposure to two languages in infancy. As part of GUSTO, a long-term birth cohort study of Singaporean mothers and their offspring run as a tripartite project between A*STAR's Singapore Institute for Clinical Sciences (SICS), KK Women's and Children's Hospital (KKH) and the National University Hospital (NUH), six-month-old bilingual infants recognised familiar images faster than those brought up in monolingual homes. They also paid more attention to novel images compared to monolingual infants. The findings reveal a generalised cognitive advantage that emerges early in bilingual infants and is not specific to a particular language. The findings were published online on 30 July 2014 in the highly regarded scientific journal Child Development.

Infants were shown a coloured image of either a bear or a wolf. For half the group, the bear was made to become the "familiar" image while the wolf was the "novel" one, and vice versa for the rest of the group. The study showed that bilingual babies got bored of familiar images faster than monolingual babies.

Several previous studies in the field have shown that the rate at which an infant becomes bored of a familiar image, and its subsequent preference for novelty, is a common predictor of better pre-school developmental outcomes, such as advanced performance in concept formation, non-verbal cognition, expressive and receptive language, and IQ tests. Those studies showed that babies who looked at an image and then rapidly got bored went on to demonstrate higher performance in various domains of cognition and language later on as children.

Bilingual babies also stared for longer periods of time at the novel image than their monolingual counterparts, demonstrating "novelty preference". Other studies in the field have shown this is linked with improved performance in later IQ and vocabulary tests during pre-school and school-going years.
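
A hedged sketch of how the two measures described above are often quantified: the number of trials an infant needs to "get bored" (habituate), and a novelty-preference score at test. The trial durations and the 50% habituation criterion are illustrative assumptions, not the GUSTO protocol.

```python
# Illustrative sketch of two common infant-looking measures; the looking
# times (in seconds) and the 50% criterion are invented for illustration.

def trials_to_habituate(looking_times, criterion=0.5):
    """Trial on which looking first drops below `criterion` x the first trial."""
    threshold = looking_times[0] * criterion
    for trial_number, t in enumerate(looking_times, start=1):
        if t < threshold:
            return trial_number
    return None  # never habituated within the session

def novelty_preference(novel_look, familiar_look):
    """Proportion of test looking directed at the novel image."""
    return novel_look / (novel_look + familiar_look)

familiarisation = [8.0, 7.2, 5.1, 3.5, 2.9]  # looking time per trial
print(trials_to_habituate(familiarisation))  # 4 -> habituated on the 4th trial
print(round(novelty_preference(novel_look=6.0, familiar_look=3.0), 2))  # 0.67
```

On these measures, faster habituation and a novelty-preference score above 0.5 correspond to the bilingual advantage reported in the study.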

Associate Professor Leher Singh, who is from the Department of Psychology at the National University of Singapore's Faculty of Arts and Social Sciences and lead author of this study said, "One of the biggest challenges of infant research is data collection. Visual habituation works wonderfully because it only takes a few minutes and capitalises on what babies do so naturally, which is to rapidly become interested in something new and then rapidly move on to something else. Even though it is quite a simple task, visual habituation is one of the few tasks in infancy that has been shown to predict later cognitive development."

A bilingual infant encounters more novel linguistic information than its monolingual peers. A six-month-old infant in a bilingual home is not just learning another language; it is learning two languages while also learning to distinguish between them. It is possible that, because learning two languages at once requires more information-processing efficiency, these infants rise to the challenge by developing skills to cope with it.

Said Assoc Prof Leher Singh, "As adults, learning a second language can be painstaking and laborious. We sometimes project that difficulty onto our young babies, imagining a state of enormous confusion as two languages jostle for space in their little heads. However, a large number of studies have shown us that babies are uniquely well positioned to take on the challenges of bilingual acquisition and in fact, may benefit from this journey."

In comparison to many other countries, a large proportion of Singaporean children are born into bilingual environments. The finding that bilingual input to babies is associated with cognitive enhancement suggests a potentially strong neurocognitive advantage for Singaporean children outside the domain of language, in processing new information and recognising familiar objects with greater accuracy.

Said Assoc Prof Chong Yap Seng, Lead Principal Investigator for GUSTO, "This is good news for Singaporeans who are making the effort to be bilingual. These findings were possible because of the unique Singaporean setting of the study and the detailed neurodevelopmental testing that the GUSTO researchers perform." Assoc Prof Chong is Senior Consultant, Department of Obstetrics & Gynaecology, National University Hospital (NUH), as well as Acting Executive Director, Singapore Institute for Clinical Sciences (SICS), Agency for Science, Technology and Research (A*STAR).
________________
References:

EurekAlert. 2014. “Singaporean birth cohort study finds benefits for babies exposed to 2 languages”. EurekAlert. Posted: September 1, 2014. Available online: http://www.eurekalert.org/pub_releases/2014-09/bsi-sbc090114.php

Monday, July 21, 2014

Multilingual or not, infants learn words best when it sounds like home

Growing up in a multilingual home has many advantages, but many parents worry that exposure to multiple languages might delay language acquisition. New research could now lay some of these multilingual myths to rest, thanks to a revealing study that shows both monolingual and bilingual infants learn a new word best from someone with a language background that matches their own.

While 1.5-year-old babies are powerful word learners, they can have difficulty learning similar-sounding words (e.g., "coat" and "goat"). A string of previous studies had found unexplained differences in monolingual and bilingual children's ability to learn these types of similar-sounding words, sometimes suggesting a bilingual advantage, and other times suggesting a bilingual delay. Christopher Fennell from the University of Ottawa and Krista Byers-Heinlein from Concordia University, both in Canada, wanted to understand these differences between monolingual and bilingual word learning. They observed that these groups differ not only in how many languages they are learning, but often in whether they are raised by parents who themselves are monolingual or bilingual.

Adults raised bilingual sound subtly different from those raised in a monolingual environment. They possess a slight 'accent' in both of their languages, so subtle that it is not usually detected by other adults. Yet children are sometimes sensitive to differences that adults ignore. Fennell and Byers-Heinlein asked: would bilingual children learn words better from a bilingual adult, and would monolingual children learn new words best from a monolingual adult?

To answer these questions, the researchers taught 61 English monolingual and English-French bilingual 17-month-olds two similar-sounding nonsense words. Infants sat on their parents' laps in front of a television monitor, where they were taught similar-sounding words for two novel objects: a clay crown-shaped object labelled with the word "kem", and a molecule from a chemistry set labelled with the word "gem". For half the babies, the labels were produced by an adult who matched their language-learning environment (e.g., monolinguals heard a monolingual, and bilinguals heard a bilingual). For the other half, the labels were produced by an adult who did not match their language-learning environment (e.g., monolinguals heard a bilingual, and bilinguals heard a monolingual). To determine whether children had learned the words, the researchers presented an incorrect pairing (e.g., "kem" paired with the molecule). Babies who have learned the words should be surprised at this wrong label and stare at the mislabelled object longer than when a correct label is presented. Babies who have not learned the words should look at the object for about the same amount of time whether it is correctly or incorrectly labelled.
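
To make the test logic concrete, here is a minimal, hypothetical sketch of the comparison implied by this paragraph: looking times on correctly labelled ("same") trials versus incorrectly labelled ("switch") trials, compared with a paired test. The values are invented, not the study's data.

```python
# Illustrative sketch: paired comparison of looking times (in seconds) on
# correctly vs. incorrectly labelled test trials; longer looking on
# "switch" trials is taken as evidence that the word was learned.
# All values are invented placeholders.
from scipy import stats

same_trials = [5.2, 4.8, 6.1, 5.5, 4.9, 5.8, 5.0, 5.6]    # one value per infant
switch_trials = [6.8, 5.9, 7.4, 6.6, 5.7, 7.0, 6.2, 6.9]  # one value per infant

t, p = stats.ttest_rel(switch_trials, same_trials)
print(f"t = {t:.2f}, p = {p:.4f}")  # longer looking at the mislabelled object
```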

Both monolingual and bilingual children could learn the words, but only from a speaker that matched their language-learning environment. Bilingual babies efficiently learned the words from the bilingual speaker, but not from the monolingual speaker. Conversely, monolingual babies effectively learned the words from the monolingual speaker, but not from the bilingual speaker. In other words, there was no overall bilingual advantage or a bilingual delay, but just a difference in which speaker the babies found easier to learn words from.

To further test their hypothesis, the researchers explored whether any of the bilinguals were able to learn from the monolingual speaker. They found that bilinguals who were exposed to more English in their everyday environment were more successful at learning from the monolingual speaker than bilinguals with less English exposure. The researchers suspected the bilinguals who succeeded might be children of English-dominant parents who did not possess a bilingual accent in English (e.g., Mom grew up as an English monolingual, even if she was now bilingual).

"We found that all infants, regardless of whether they are learning one or two languages, learn words best when listening to people who sound like their primary caregivers," Fennell explains. "Monolingual infants succeeded with a monolingual speaker, bilingual infants with a bilingual speaker, but each group had difficulty with the opposite speaker."

The findings reveal that both monolingual and bilingual babies are highly tuned to their home language environments. The results contradict hypotheses that bilingual children are better able to deal with varied accents than monolinguals and that monolinguals have more solid word representations than bilinguals. All babies show similar strengths and weaknesses in their early word learning abilities.

Infants' ability to discern the subtle sound differences between words spoken by bilingual or monolingual speakers is striking. But it also makes a great deal of sense in the context of other evidence suggesting that infants are uniquely tuned to their caregivers' voices. "Children seem to adapt to their language environments," says Byers-Heinlein. "This supports them in reaching their language milestones, no matter whether they grow up monolingual or multilingual."

Finally, these results have strong implications for other studies of bilingual infants and children, the authors say. If a researcher does not take into account whether the speaker used in their experiment grew up monolingual or bilingual, as well as language dominance in a bilingual child's home, they could generate misleading results. They may "discover" that bilingual children have difficulty with some language task when, in reality, some bilingual subgroups succeed and others struggle depending on the language stimuli used.
________________
References:

EurekAlert. 2014. “Multilingual or not, infants learn words best when it sounds like home”. EurekAlert. Posted: June 4, 2014. Available online: http://www.eurekalert.org/pub_releases/2014-06/sp-mon060414.php

Monday, June 2, 2014

How toddlers learn verbs: New insight

Parents can help toddlers' language skills by showing them a variety of examples of different actions, according to new research from the University of Liverpool.

Previous research has shown that verbs pose particular difficulties to toddlers as they refer to actions rather than objects, and actions are often different each time a child sees them.

To find out more about this area of child language, University psychologists asked a group of toddlers to watch one of two short videos.

They then examined whether watching a cartoon star repeat the same action, compared to a character performing three different actions, affected the children's understanding of verbs.

Developmental psychologist, Dr Katherine Twomey, said: "Knowledge of how children start to learn language is important to our understanding of how they progress throughout preschool and school years.

"This is the first study to indicate that showing toddlers similar but, importantly, not identical actions actually helped them understand what a verb refers to, instead of confusing them as you might expect."

Dr Jessica Horst from the University of Sussex who collaborated on the research added: "It is a crucial first step in understanding how what children see affects how they learn verbs and action categories, and provides the groundwork for future studies to examine in more detail exactly what kinds of variability affect how children learn words."

The research is published in the British Journal of Developmental Psychology by The British Psychological Society and Wiley.
________________
References:

Science Daily. 2014. “How toddlers learn verbs: New insight”. Science Daily. Posted: April 16, 2014. Available online: http://www.sciencedaily.com/releases/2014/04/140416101621.htm

Friday, April 11, 2014

Pointing is infants' first communicative gesture

Catalan researchers have studied the acquisition and development of language in babies on the basis of the temporal coordination of gestures and speech. The results are the first to show how and when babies acquire the pattern of coordination between the two elements which allows them to communicate very early on.

A new study carried out by two researchers from the Pompeu Fabra University of Barcelona analyses the temporal coordination between gestures and speech in babies during the very early stages of language development, from the babbling period until the production of their first words.

The results, published in the journal Speech Communication, are the first to show how and when babies acquire the coordination between gesture and speech.

"There are now more and more investigations showing that the study of language and human communication cannot be carried out through an analysis of speech alone," Núria Esteve Gibert, one of the authors, explained to SINC.

In fact, in communicative interactions meanings and emotions are transmitted through speech and non-verbal elements (hand gestures, facial expressions or body position).

"Our analysis indicates that it is during the transition between the babbling period and first words (that is to say, before the infant is capable of producing two joined words, one after the other) that the gestural system and the speech system are already closely linked," affirmed Esteve Gibert.

According to the authors, this study supports the view that speech and body language are two elements required for studying human communication, as there are more and more indications that both modes develop at the same time and that they are closely coordinated, both semantically and temporally.

The aim of this pioneering work was to investigate the process of acquisition and development of language in relation to the temporal coordination of gestures and speech.

In order to do so, the researchers filmed four babies, born into Catalan-speaking families, while they played with their parents at home, from when the children were aged 11 months to when they reached 19 months old.

"These recordings were used to investigate when children start to combine gesture and speech in the same way as adults, and whether, when they combine the two modes, the patterns of temporal coordination between gesture and speech are appropriate," Esteve Gibert continued.

In total, more than 4,500 communicative acts produced by the babies across the analysed months, over 24 hours of recordings, were obtained and studied from the point of view of the gestures and of the acoustic properties of the vocalisations produced by the children.

"Special importance has been given to the analysis of the temporal coordination between speech and the act of pointing, because this gesture is crucial in linguistic and cognitive development, since it is the first communicative gesture that babies are capable of understanding and producing," the expert pointed out.

Moreover, the authors note that the correct development of this coordination is closely linked to the child's future linguistic abilities at a more advanced stage.

Combination of gesture and speech

During the babbling stage babies still produce many gestures without combining them with vocalisations. However, from the beginning of the period in which they start to produce their first words (four words during half an hour of recording), babies produce the majority of hand gestures in combination with vocalisations, just as adults do.

Furthermore, on analysing the combinations of gesture and vocalisation that the babies produce at this early age, the researchers saw that most of the gestures combined with vocalisations are deictic gestures (pointing and reaching) with a declarative communicative intention (to inform) rather than an imperative, commanding intention (to obtain the object).

"Already in the first combinations of gesture with vocalisation, the pattern of temporal coordination of the two modes (which consists of synchronising the most prominent interval of the deictic gesture with the most prominent interval of the vocalisation) is very similar to that of adults," concluded Esteve Gibert.
________________
References:

Science Daily. 2014. “Pointing is infants' first communicative gesture”. Science Daily. Posted: February 24, 2014. Available online: http://www.sciencedaily.com/releases/2014/02/140224081117.htm

Friday, January 3, 2014

Toddlers Can Learn Verbs Even in Non-Social Contexts

Language acquisition has traditionally been considered a social, interactive process; however, new research from Boston University College of Health & Rehabilitation Sciences: Sargent College reveals that toddlers are able to acquire the meanings of words even in "socially impoverished contexts" where social or visual information is absent.

Sudha Arunachalam, Ph.D., director of the BU Child Language Lab and assistant professor in the Department of Speech, Language and Hearing Sciences at Sargent College, authored the study, which provides new evidence that just overhearing words may be enough for children to learn them.

This study builds upon previous research from Arunachalam demonstrating that by age two, toddlers can extract information about a new verb from its syntactic context, even before viewing a relevant event. These new findings show that children can do so even in an impoverished social context, without discourse context or visual access to the speakers.

"The only information provided was linguistic," said Arunachalam. "Our goal was to determine whether 2-year-olds, on hearing new verbs in informative sentences, could use their syntactic content alone to map the novel verbs to meaning, even though no social or visual information was available."

Arunachalam and her team presented sentences as ambient noise -- meaning toddlers did not have to directly attend to anyone. Researchers then tested whether the toddlers had learned the word meanings by tracking the children's eye gaze as they looked at potential referents for the verbs.

Findings indicate that toddlers can learn at least some aspects of word meaning from contexts in which they are not directly attending to the conversation around them, without observational or social information for cues.

"What this new research tells us is that toddlers have strong abilities to extract meanings, and not just word forms, from the ambient speech stream." Arunachalam said.

Arunachalam explains that she and other researchers have begun to address the question "Can toddlers benefit from socially impoverished learning situations?" by closing in on the kinds of information, at a minimum, that toddlers require to establish a new word's meaning.

These results shed light on the way toddlers may acquire aspects of verb meaning by overhearing conversations in which they may not be directly involved, and in which no visual referent, discourse context, or child-directed conversation is available.

"Of course we do not claim that toddlers don't use observational and social information to acquire the meanings of words when they are available," said Arunachalam. "Rather, we argue that when these sources of information are absent, toddlers nevertheless posit rudimentary word meanings that can be built upon in future encounters."
__________________________
References:

Science Daily. 2014. “Toddlers Can Learn Verbs Even in Non-Social Contexts”. Science Daily. Posted: November 14, 2013. Available online: http://www.sciencedaily.com/releases/2013/11/131114101923.htm

Saturday, December 7, 2013

One, Two, Buckle My Shoe: Importance of Language to Learning Math

Talk to your toddler. And use numbers when you talk. Doing so may give a child a better head start in math than teaching her to memorize 1-2-3 counting routines.

That's the takeaway of an international study published this week in the online early edition of the Proceedings of the National Academy of Sciences. Led by David Barner, associate professor of psychology and linguistics in the Division of Social Sciences at the University of California, San Diego, the study examined how well children ages 2 to 4 understand number concepts.

It has been known since the 1970s that learning to count is different than comprehending what numbers really mean, said Barner, director of the Language and Development Lab at UC San Diego. A youngster who can recite numerals in the correct order, to 10 or even 20, will not necessarily be able to correctly answer "How many?" when presented with a small pile of five or even fewer. It has also been known for a long time that "later mathematical achievements are affected by the very earliest experiences, but it has been difficult to point to what exactly it is about these early experiences that matters," Barner said.

So Barner and colleagues -- from University College London, King Saud University, University of Nova Gorica and MIT -- looked into the possible role of language. The researchers found a natural experiment among speakers of Slovenian in the capital Ljubljana and speakers of the Saudi dialect of Arabic. In both cases, in addition to singular and plural forms for words, the languages also make a finer grammatical distinction for sets of two. They have a "dual marker," or a piece of grammar that is similar to a plural, but that speakers use when talking about quantities of exactly two.

Children who speak these dual-marking languages seem to grasp the concept of "two" much earlier than their English-speaking counterparts, Barner said, even when the children have received little to no training on counting. In fact, they are faster to begin learning number words than children learning any other language reported so far, including Russian, Japanese and Mandarin Chinese, despite the fact that some of those studied had much more experience with counting routines.

"Our study provides the strongest evidence to date that the language a child speaks affects the rate at which they learn number words, and also that hearing number words in naturalistic speech -- not just in counting routines and procedures -- is a critical part of number word learning," Barner said.

Interestingly, the early advantage conferred by language seems to disappear when looking at numbers over two and as the children grow. Where there are drastic differences among 2-year-olds who are "two-knowers," by age 4 the English speakers in San Diego have actually surpassed their counterparts in Riyadh and Ljubljana, and more of them understand "three" and higher.

"Language provides really rich cues to number meaning," Barner said, "but language only gets you so far." After that, you do need to teach the routines. "We know that early learning about number is a critical foundation to later mathematics performance in the classroom," Barner said, "so it can only be beneficial to also expose kids to speech that contains informative cues to number concepts."

In the simplest terms: It is important to teach children to memorize their 1, 2, 3s along with their A, B, Cs. And singing songs like "One, Two, Buckle My Shoe" can't hurt. But it's at least as important to put numbers into natural speech and say to your 12- to 24-month-old "There are two buttons," when pointing to a pair.
__________________________
References:

Science Daily. 2013. “One, Two, Buckle My Shoe: Importance of Language to Learning Math”. Science Daily. Posted: October 28, 2013. Available online: http://www.sciencedaily.com/releases/2013/10/131028162057.htm

Monday, November 4, 2013

Understanding How Infants Acquire New Words Across Cultures

Infants show strong universals as they acquire their native language, but a recent study with infants acquiring Korean also reveals that there are striking language differences.

Sandra Waxman, Louis W. Menk Professor of Psychology at Northwestern University, is senior author of a new study providing the first evidence of how monolingual infants in Korea acquiring Korean learn new nouns and verbs.

Researchers have long suggested that in "noun friendly" languages including English, infants' attention is focused primarily on objects, typically marked by nouns. In "verb friendly" languages including Korean, Japanese and Hindi, verbs are said to enjoy a more privileged status because infants' attention is focused more directly on the actions and relations typically marked by verbs.

"Almost all of the research on infants acquiring these "verb-friendly" languages has looked at the nouns and verbs that they produce in their daily lives," said Sudha Arunachalam, lead author of the study and assistant professor of speech and hearing sciences at Boston University.

"By using an experimental method instead, our approach lets us watch infants acquire new words, so we can get real insight into the mental processes that are at work during learning."

Waxman said their new work shows strong universals in language acquisition, but also shows some real cross-linguistic differences.

"Like infants acquiring other languages, Korean infants very successfully learn nouns to name objects such as ball, bottle and boy," Waxman said. "However, when it comes to learning verbs -- names for activities and relations -- like running, hugging, twirling, we see differences across languages."

Previous research had shown that in English, 24-month-old infants were better able to learn novel verbs for novel actions (e.g., petting) if the surrounding noun phrases were explicitly mentioned (e.g., "The girl is petting the dog") than if they were dropped from the sentence (e.g., "Look. Petting!"). In contrast, the new research shows that in Korean (a language in which noun phrases are typically dropped in conversation) 24-month-olds were better able to learn novel verbs for novel actions if the surrounding noun phrases (e.g., the girl, the dog) were dropped; in fact, unlike English-acquiring infants, those acquiring Korean struggled if the nouns were explicitly mentioned.

"We know that even before infants begin to say many verbs, they begin to understand them," Waxman said. "What this new research tells us is that the information that infants need to 'get' that understanding varies, depending upon the native language they are learning. This piece of the language acquisition process is not universal; instead, it is 'language-specific.'

"Even in the early stages of language learning, infants are shaped by the structure of their native language, so much so that the way they learn verbs is influenced by the way they've been hearing verbs in the ambient language, even before they could understand them. This means that like early speech and music perception, the structure of what infants passively hear influences how they actively learn," Waxman said.
__________________________
References:

Science Daily. 2013. “Understanding How Infants Acquire New Words Across Cultures”. Science Daily. Posted: September 27, 2013. Available online: http://www.sciencedaily.com/releases/2013/09/130927123424.htm

Thursday, July 11, 2013

Grammar May Be Hidden in Toddler Babble

The little sounds and puffs of air that toddlers often inject into their baby babble may actually be subtle stand-ins for grammatical words, new research suggests.

For their study, Cristina Dye, a Newcastle University researcher in child language development, made recordings of tens of thousands of utterances of French-speaking children between 23 months and 37 months old.

Dye and her colleagues analyzed each sound the kids made and the context in which it was produced. The team said they documented a pattern of sounds and puffs of air that seemed to replace grammatical words in many cases. Their findings suggest that toddlers may properly use little words (as, a, an, can, is) sooner than thought.

"Many of the toddlers we studied made a small sound, a soft breath, or a pause, at exactly the place that a grammatical word would normally be uttered," Dye said in a statement.

"The fact that this sound was always produced in the correct place in the sentence leads us to believe that young children are knowledgeable of grammatical words. They are far more sophisticated in their grammatical competence than we ever understood."

Though Dye was studying French-speaking toddlers, she and her colleagues expect their findings to apply to other languages as well. She also thinks their results could have implications for understanding language delay in children.

"When children don't learn to speak normally it can lead to serious issues later in life," Dye said in a statement. "For example, those who have it are more likely to suffer from mental illness or be unemployed later in life. If we can understand what is 'normal' as early as possible then we can intervene sooner to help those children."

Previous research has shown that toddlers, before they articulate full sentences themselves, may be able to understand complex grammar. A 2011 study published in the journal Cognitive Science found that as early as 21 months, children could match made-up verbs with pictures that made sense grammatically. For example, if they were told "The rabbit is glorping the duck," they would point to a picture of a rabbit lifting a duck's leg rather than the duck lifting its leg on its own.

The new research on the French-speaking toddlers was detailed in the Journal of Linguistics.
__________________________
References:

Gannon, Megan. 2013. “Grammar May Be Hidden in Toddler Babble”. Live Science. Posted: June 17, 2013. Available online: http://www.livescience.com/37502-grammar-may-be-hidden-in-toddler-babble.html

Thursday, April 18, 2013

Bringing Up Bilingual, Bigoted Baby

Two studies out of Canada this week suggest that while your cute toddler can already understand grammar in two languages, he or she may be suspicious of difference.

The first study, published in the journal Nature Communications, found children “as young as seven months can distinguish between, and begin to learn, two languages with vastly different grammatical structures.” The study found that infants in bilingual environments quickly learned to identify words that appeared frequently, like “the” or “and” in English, and used them as cues to distinguish which language was in use.

The research included languages with marked differences in grammatical structure, like Hindi and English. The children in bilingual households learned to key in on “pitch and duration cues,” and used that to develop what the researchers suggest is a child’s concept of discrete systems of grammar.

Understanding grammar doesn’t mean they will talk any better or sooner. Which is fortunate, because if nine-month-olds could tell us what they are thinking, it might be scary as hell.

Research released at the same time by the University of British Columbia found that infants as young as nine months have begun to “condone antisocial behavior when it is directed at individuals who are dissimilar.”

In what sounds like a scene from a David Cronenberg movie, researchers started by putting a series of babies in front of a plate of graham crackers and some green beans and letting each choose which snack they preferred. The kids then watched a puppet show in which the puppets went through the same exercise, selecting graham crackers or green beans, and were afterwards allowed to interact with the puppets. The kids not only gravitated toward the puppet that shared their snack preference, but appeared to grow more attached if that puppet then attacked the puppets who had expressed the other food preference. From the UBC summary:

In the experiments, other puppets harmed, helped or acted neutrally towards the puppets with different or similar food preferences. Prompted to pick their favorite puppet, infants demonstrated a strong preference for the puppets who harmed the “dissimilar” puppet and helped the “similar” one – one infant even planted a kiss on the puppet she liked.

“These findings suggest that babies either feel something like schadenfreude – pleasure when an individual they dislike or consider threatening experiences harm,” says Hamlin. “Or babies have some early understanding of social alliances, recognizing that the ‘enemy of their enemy’ is their friend.”

The two studies had nothing to do with each other. It would be interesting to run one of the bilingual babies through the puppet experiment, and see if they liked both graham crackers and green beans.
__________________________
References:

Herman, Marc. 2013. “Bringing Up Bilingual, Bigoted Baby”. Pacific Standard. Posted: March 14, 2013. Available online: http://www.psmag.com/blogs/the-101/bringing-up-bilingual-bigoted-baby-54002/

Sunday, April 7, 2013

Babies' brains may be tuned to language before birth

Brain imaging shows that premature babies process speech in similar ways to adults.

Despite having brains that are still largely under construction, babies born up to three months before full term can already distinguish between spoken syllables in much the same way that adults do, an imaging study has shown [1].

Full-term babies — those born after 37 weeks' gestation — display remarkable linguistic sophistication soon after they are born: they recognize their mother’s voice [2], can tell apart two languages they’d heard before birth [3] and remember short stories read to them while in the womb [4].

But exactly how these speech-processing abilities develop has been a point of contention. “The question is: what is innate, and what is due to learning immediately after birth?” asks neuroscientist Fabrice Wallois of the University of Picardy Jules Verne in Amiens, France. 

To answer that, Wallois and his team needed to peek at neural processes already taking place before birth. It is tough to study fetuses, however, so they turned to their same-age peers: babies born 2–3 months premature. At that point, neurons are still migrating to their final destinations; the first connections between upper brain areas are snapping into place; and links have just been forged between the inner ear and cortex.

Colourful sounds

To test these neural pathways, the researchers played soft voices to premature babies while they were asleep in their incubators a few days after birth, then monitored their brain activity using a non-invasive optical imaging technique called functional near-infrared spectroscopy. They were looking for the tell-tale signals of surprise that brains display — for example, when they suddenly hear male and female voices intermingled after hearing a long run of only female voices.

The young brains were able to distinguish between male and female voices, as well as between the trickier sounds ‘ga’ and ‘ba’, which demands even faster processing. What is more, the parts of the cortex used were the same as those used by adults for sophisticated understanding of speech and language. 

The results show that linguistic connections inside the cortex are already “present and functional” and did not need to be gradually acquired through repeated exposure to sound, Wallois says. This suggests that at least part of these speech-processing abilities is innate. The work could also lead to better techniques for caring for the most vulnerable brains, including those of premature babies, Wallois adds. The team's results appear in Proceedings of the National Academy of Sciences [1].

These are “remarkable findings”, says Janet Werker, a developmental psychologist at the University of British Columbia in Vancouver, Canada. They are, she says, the first evidence that brains can distinguish between difficult consonants even before a full-term birth, hinting at greater brain sensitivities than previously imagined [5].

Yet this does not fully answer the innate-versus-learned question, Werker says. “It is possible that the experience of birth triggers a set of processes that prime the brain of a premature infant to respond to language in ways that a same-aged fetus will not.”
__________________________
References:

Nuzzo, Regina. 2013. “Babies' brains may be tuned to language before birth”. Nature. Posted: February 25, 2013. Available online: http://www.nature.com/news/babies-brains-may-be-tuned-to-language-before-birth-1.12489

Journal References:

1. Mahmoudzadeh, M. et al. Proc. Natl Acad. Sci. USA http://dx.doi.org/10.1073/pnas.1212220110 (2013).
2. DeCasper, A. J. & Fifer, W. P. Science 208, 1174–1176 (1980).
3. Byers-Heinlein, K., Burns, T. C. & Werker, J. F. Psychol. Sci. 21, 343–348 (2010).
4. DeCasper, A. J. & Spence, M. J. Infant Behav. Dev. 9, 133–150 (1986).
5. Weikum, W. M., Oberlander, T. F., Hensch, T. K. & Werker, J. F. Proc. Natl Acad. Sci. USA 109, 17221–17227 (2012).

Monday, April 1, 2013

Children with brain lesions able to use gestures important to language learning

Children with brain lesions suffered before or around the time of birth are able to use gestures – an important aspect of the language learning process – to convey simple sentences, a Georgia State University researcher has found.

Şeyda Özçalışkan, assistant professor of psychology, and fellow researchers at the University of Chicago looked at children who suffered lesions to one side of the brain to see whether they used gestures the way typically developing children do. She examined gestures such as pointing to a cookie while saying "eat" to convey the meaning "eat cookie," combinations that children produce several months before expressing such sentences exclusively in speech.

"We do know that children with brain injuries show an amazing amount of plasticity (the ability to change) for language learning if they acquire lesions early in life," Özçalışkan said. "However, we did not know whether this plasticity was characterized by the same developmental trajectory shown for typically developing children, with gesture leading the way into speech. We looked at the onset of different sentence constructions in children with early brain injuries, and wanted to find out if we could see precursors of different sentence types in gesture.

"For children with brain injuries, we found that this pattern holds, similar to typically developing children," she said. "Children with unilateral brain injuries produce different kinds of simple sentences several months later than typically developing children. More important, the delays we observe in producing different sentences in speech are preceded by a similar delay in producing the same sentences in gesture-speech combinations."

Children with brain injuries also had a more difficult time in producing complex sentences across gesture and speech, such as conveying relationships between actions, for example saying "help me do it" while making a painting gesture.

"This in turn was later reflected in a much narrower range of complex sentence types expressed in their speech," Özçalışkan said. "This suggested to us, in general, that producing sentences across gesture and speech may serve as an embodied sensorimotor experience, that might help children take the next developmental step in producing these sentences in speech.

"And if you bypass the gesture-speech combination stage, that might negatively affect developing a broader representation of complex sentence types in speech."

The researchers also compared children with smaller brain lesions to those with larger lesions, and found that children with larger lesions showed a greater delay in producing sentences, both in speech and in gesture-speech combinations.

The research has implications for developing interventions to help children with the language learning process, "as it shows that gestures are integral to the process of language learning even when that learning is taking place in an injured brain," Özçalışkan said.

"When children do different kinds of sentence combinations across gesture and speech, that's like a signal to the caregiver that 'I'm ready for this,'" she said. "The caregiver can then provide relevant input to the child, and that could in turn help the child take the next developmental step in producing that sentence entirely in speech."
__________________________
References:

EurekAlert. 2013. “Children with brain lesions able to use gestures important to language learning”. EurekAlert. Posted: February 20, 2013. Available online: http://www.eurekalert.org/pub_releases/2013-02/gsu-cwb022013.php

Sunday, January 20, 2013

While in womb, babies begin learning language from their mothers

Babies only hours old are able to differentiate between sounds from their native language and a foreign language, scientists have discovered. The study indicates that babies begin absorbing language while still in the womb, earlier than previously thought.

Sensory and brain mechanisms for hearing are developed at 30 weeks of gestational age, and the new study shows that unborn babies are listening to their mothers talk during the last 10 weeks of pregnancy and at birth can demonstrate what they've heard.

"The mother has first dibs on influencing the child's brain," said Patricia Kuhl, co-author and co-director of the Institute for Learning & Brain Sciences at the University of Washington. "The vowel sounds in her speech are the loudest units and the fetus locks onto them."

Previously, researchers had shown that newborns are born ready to learn and begin to discriminate between language sounds within the first months of life, but there was no evidence that language learning had occurred in utero.

"This is the first study that shows fetuses learn prenatally about the particular speech sounds of a mother's language," said Christine Moon, lead author and a professor of psychology at Pacific Lutheran University in Tacoma, Wash. "This study moves the measurable result of experience with speech sounds from six months of age to before birth."

The results will be published in an upcoming issue of the journal Acta Paediatrica.

Forty infants, about 30 hours old and an even mix of girls and boys, were studied in Tacoma and Stockholm, Sweden. While still in the nursery, the babies listened to vowel sounds in their native tongue and in foreign languages.

Their interest in the sounds was captured by how long they sucked on a pacifier that was wired into a computer measuring the babies' reaction to the sounds. Longer or shorter sucking for unfamiliar or familiar sounds is evidence for learning, because it indicates that infants can differentiate between the sounds heard in utero.

In both countries, the babies at birth sucked longer for the foreign language than they did for their native tongue.

The researchers say that infants are the best learners, and discovering how they soak up information could give insights on lifelong learning. "We want to know what magic they put to work in early childhood that adults cannot," Kuhl said. "We can't waste that early curiosity."
__________________________
References:

EurekAlert. 2013. “While in womb, babies begin learning language from their mothers”. EurekAlert. Posted: January 2, 2013. Available online: http://www.eurekalert.org/pub_releases/2013-01/uow-wiw122812.php

Tuesday, December 4, 2012

The road to language learning is iconic

Languages are highly complex systems and yet most children seem to acquire language easily, even in the absence of formal instruction. New research on young children's use of British Sign Language (BSL) sheds light on one of the mechanisms - iconicity - that may endow children with this amazing ability.

For spoken and written language, the arbitrary relationship between a word's form – how it sounds or how it looks on paper – and its meaning is a particularly challenging feature of language acquisition. But one of the first things people notice about sign languages is that signs often represent aspects of meaning in their form. For example, in BSL the sign EAT involves bringing the hand to the mouth just as you would if you were bringing food to the mouth to eat it.

In fact, a high proportion of signs across the world's sign languages are similarly iconic, connecting human experience to linguistic form.

Robin Thompson and colleagues David Vinson, Bencie Woll, and Gabriella Vigliocco at the Deafness, Cognition and Language Research Centre (DCAL) at University College London in the United Kingdom wanted to examine whether this kind of iconicity might provide a key to understanding how children come to link words to their meaning.

Their findings are published in Psychological Science, a journal of the Association for Psychological Science.

The researchers looked at data from 31 deaf children who were being raised in deaf BSL signing families in the United Kingdom. Parents indicated the number of words understood and produced by their children between the ages of 8 and 30 months. The researchers decided to focus on 89 specific signs, examining children's familiarity with the signs as well as the iconicity and complexity of the signs.

The findings reveal that younger (11-20 months) and older (21-30 months) children comprehended and produced more BSL signs that were iconic than those that were less iconic. And the benefit of iconicity seemed to be greater for the older children. Importantly, this relationship did not seem to depend on how familiar, complex or concrete the words were.

Together, these findings suggest that iconicity could play an important role in language acquisition.

Thompson and colleagues hypothesize that iconic links between our perceptual-motor experience of the world and the form of a sign may provide an imitation-based mechanism that supports early sign acquisition. These iconic links highlight motor and perceptual similarity between actions and signs such as DRINK, which is produced by tipping a curved hand to the mouth and represents the action of holding a cup and drinking from it.

The researchers emphasize that these results can also be applied to spoken languages, in which gestures, tone of voice, inflection, and face-to-face communication can help make the link between words and their meanings less arbitrary.

"We suggest that iconicity provides scaffolding – a middle-ground – to bridge the "great divide" between linguistic form and bodily experience for both sign language and spoken language learners," says Thompson.
__________________________
References:

EurekAlert. 2012. “The road to language learning is iconic”. EurekAlert. Posted: November 13, 2012. Available online: http://www.eurekalert.org/pub_releases/2012-11/afps-trt111312.php