Psycholinguistics
@psycholinguistics

Here we share reviews and articles about Psycholinguistics and related fields.
Psycholinguistics
2019-05-15 

✍ Falk Huettig, Martin J. Pickering. Literacy Advantages Beyond Reading: Prediction of Spoken Language. Trends in Cognitive Sciences, 2019. DOI: 10.1016/j.tics.2019.03.008
@psycholinguistics
👇👇👇👇👇👇👇👇👇👇
https://www.cell.com/trends/cognitive-sciences/fulltext/S1364-6613(19)30090-7#secst0010
Psycholinguistics
2019-05-15 

The literate mind. Because reading is self-paced, readers have a strong incentive to predict upcoming words: prediction speeds up reading, which is typically already much faster than listening.

Skilled readers tend to take in whole words at a single glance (fixating on multiple letters at the same time) and time their eye movements to optimise the reading process. Printed texts (even given occasional changes in fonts and word capitalization) are much more regular than conversational speech, which is full of disfluencies, incomplete word pronunciations, and speech errors. This regularity of written texts helps readers to form predictive relationships between words that, by extension, can also be used to better predict words when listening to speech.
It may be hard to imagine for someone who learned to read long ago, but even the concept of a 'word' is an invention of the literate mind; it is very hard to grasp for an illiterate person who only ever hears a continuous stream of speech sounds. For example, when illiterate adults or children who haven't yet learned to read are asked to repeat the last word of a spoken sentence, they tend to repeat the whole sentence. In contrast, words clearly stand out in written language, typically being separated by white space. Written forms make words more salient and precise: readers become more aware that words are stable units in language. Storing the written form of words in memory also helps to make spoken word forms more salient, so they can be accessed faster when predicting upcoming speech. And, again, it is prediction of upcoming language that makes language understanding fast and proficient.
"Our arguments provide one more reason why more efforts should be undertaken to teach the hundreds of millions of illiterates in developing countries and functional illiterates across the world how to read (or to read better) and why a focus on artificial intelligence voice recognition and voice assistants to overcome literacy-related problems has its dangers," the authors argue.
"Writing is an ancient human technology that we shouldn't give up easily. Teaching how to read and how to read better remains very important even in a modern technological world," concludes Huettig.
#language #anticipation #prediction #printedText #speech
@psycholinguistics
👇👇👇👇👇👇👇👇👇👇
https://www.mpi.nl/speech-recognition-technology-is-not-a-solution-for-poor-readers-1
Psycholinguistics
2019-05-15 

👉 Speech recognition technology is not a solution for poor readers

Could artificial intelligence be a solution for people who cannot read well (functional illiterates) or those who cannot read at all (complete illiterates)? According to psycholinguists, speech technology should never replace learning how to read. Researchers argue that literacy leads to a better understanding of speech because good readers are good at predicting words.
@psycholinguistics
Even today, about one in five humans is considered 'low literate' or illiterate; they cannot read or write simple statements about everyday life. Low literacy can be due to little or no reading practice, or to reading impairments such as dyslexia. For developing countries with low literacy rates, voice recognition has been hailed as a solution by companies such as Google, which call it 'the next big leap in technology'. But is speech technology really the solution for low literacy?
Falk Huettig and Martin Pickering argue that it is not. In an opinion article in Trends in Cognitive Sciences, the psycholinguists suggest that relying on speech technology might be counterproductive, as literacy has crucial benefits even beyond reading. "It is very relevant and timely to look at the advantages of reading on speech, especially as people tend to read less and in different ways than they used to," says Falk Huettig. "Contemporary social media writing and reading habits for example are quite different from traditional print media. Information that people used to get from written sources such as novels, newspapers, public notices, or even recipe books they get more and more from YouTube videos, podcasts, or audiobooks."
This is not necessarily a bad thing, as some of the general benefits of reading can also be obtained from listening to audiobooks. As audiobooks also provide 'book language', listening to them will confer some similar advantages -- such as a larger vocabulary, increased knowledge of the world, and a larger short-term ('working') memory, which is important for keeping track of information and multiple entities over several sentences, paragraphs, or often even pages.
But according to Huettig and Pickering, reading itself -- the actual physical act of reading -- is crucially important for developing the skill of predicting upcoming words, which transfers from reading to understanding spoken language. Reading trains the language prediction system, although even very young children -- who cannot read yet -- can predict where a sentence is going. When 2-year-olds hear "the boy eats a big cake," they already look at something edible (i.e. a cake) after hearing "eats," but before hearing "cake." Predicting upcoming information is useful, as it reduces processing load and frees up limited brain resources. And crucially, skilled readers get much better at predicting.
Children who are among the most avid readers encounter over 4 million words a year, while children who rarely read encounter only about 50,000 words. As a result, good readers get a deeper understanding of the meaning of words and build large networks of words with strong associations between them -- which helps them to predict upcoming words. As poor readers have smaller vocabularies and weaker representations of words in their mind (i.e. the recollection of the sound and meaning of a word), the predictive relationships between words are also weaker (e.g. the expectation that 'read the ... ' will be followed by 'book').
Psycholinguistics
2019-05-08 

✍ Zhenghan Qi, Michelle Han, Yunxin Wang, Carlo de los Angeles, Qi Liu, Keri Garel, Ee San Chen, Susan Whitfield-Gabrieli, John D.E. Gabrieli, Tyler K. Perrachione. Speech processing and plasticity in the right hemisphere predict variation in adult foreign language learning. NeuroImage, Volume 192, 2019, Pages 76-87, ISSN 1053-8119. https://doi.org/10.1016/j.neuroimage.2019.03.008
@psycholinguistics
👇👇👇👇👇👇👇👇👇👇
Psycholinguistics
2019-05-08 

Additional research will investigate whether the findings apply to those learning other languages, not just Mandarin.

The eventual goal is to explore whether someone can practice sound recognition early in the process of learning a new language to potentially improve their success.
"We found that the more active the right hemisphere is, the more sensitive the listener is to acoustic differences in sound," Qi said. "Everyone has different levels of activation, but even if you don't have that sensitivity to begin with, you can still learn successfully if your brain is plastic enough."
Researchers can't say for certain how to apply these findings to real-life learning, but when it comes down to it, "Adults are trainable," Qi said. "They can train themselves to become more sensitive to foreign speech sounds."
#fMRI #IndividualDifferences #LanguageLearning #Prediction #RestingStateConnectivity #SpeechPerception
@psycholinguistics
👇👇👇👇👇👇👇👇👇👇
https://www.sciencedaily.com/releases/2019/05/190508093716.htm
Psycholinguistics
2019-05-08 

👉 Learning language: New insights into how brain functions

When it comes to learning a language, the left side of the brain has traditionally been considered the hub of language processing. But new research shows the right brain plays a critical early role in helping learners identify the basic sounds associated with a language. That finding could point to new teaching methods that improve students' success in picking up a foreign language.
@psycholinguistics
Learning a foreign language as an adult is daunting enough; learning it in a class that essentially compresses a one-semester college course into a single month of intensive instruction -- and agreeing to have your brain scanned before and after -- might seem even more daunting.
But the 24 Americans who did just that have enabled University of Delaware cognitive neuroscientist Zhenghan Qi and her colleagues to make new discoveries about how adults learn a foreign language.
The study, published in May in the journal NeuroImage, focused on the roles of the brain's left and right hemispheres in language acquisition. The findings could lead to instructional methods that potentially improve students' success in learning a new language.
"The left hemisphere is known as the language-learning part of the brain, but we found that it was the right hemisphere that determined the eventual success" in learning Mandarin, said Qi, assistant professor of linguistics and cognitive science.
"This was new," she said. "For decades, everyone has focused on the left hemisphere, and the right hemisphere has been largely overlooked."
The left hemisphere is undoubtedly important in language learning, Qi said, noting that clinical research on individuals with speech disorders has indicated that the left side of the brain is in many ways the hub of language processing.
But, she said, before any individuals -- infants learning their native language or adults learning a second language -- begin processing such aspects of the new language as vocabulary and grammar, they must first learn to identify its basic sounds or phonological elements.
It's during that process of distinguishing "acoustic details" of sounds where the right side of the brain is key, according to the new findings.
Researchers began by exposing the 24 participants in the study to pairs of sounds that were similar but began with different consonants, such as "bah" and "nah," and having them describe the tones, Qi said.
"We asked: Were the tones of those two sounds similar or different?" she said. "We used the brain activation patterns during this task to predict who would be the most successful learners" of the new language.
The study continued by teaching the participants in a setting designed to replicate a college language class, although the usual semester was condensed into four weeks of instruction. Students attended class for three and a half hours a day, five days a week, completed homework assignments and took tests.
"Our research is the first to look at attainment and long-term retention of real-world language learned in a classroom setting, which is how most people learn a new language," Qi said.
By scanning each participant's brain with functional MRI (magnetic resonance imaging) at the beginning and end of the project, the scientists were able to see which part of the brain was most engaged while processing basic sound elements in Mandarin. To their surprise, they found that -- although, as expected, the left hemisphere showed a substantial increase of activation later in the learning process -- the right hemisphere in the most successful learners was most active in the early, sound-recognition stage.
"It turns out that the right hemisphere is very important in processing foreign speech sounds at the beginning of learning," Qi said. She added that the right hemisphere's role then seems to diminish in those successful learners as they continue learning the language.
Psycholinguistics
2019-05-04 

✍ Katrina d'Apice, Rachel M. Latham, Sophie von Stumm. A naturalistic home observational approach to children's language, cognition, and behavior. Developmental Psychology, 2019. DOI: 10.1037/dev0000733
@psycholinguistics
👇👇👇👇👇👇👇👇👇👇
Psycholinguistics
2019-05-04 

👉 Chatterbox parents may boost tots' intelligence

A major new study has identified a link between the quantity of adult speech children hear and their nonverbal abilities, such as reasoning, numeracy, and shape awareness.
@psycholinguistics
The study, led by researchers at the University of York, gained unprecedented insight into the secret lives of pre-schoolers by fitting tiny audio recorders into the clothing of children aged two to four.
The experiences of 107 children and their interactions with parents and other caregivers were recorded in the home environment over three days for up to 16 hours per day.
Parents were also asked to complete activities with their children -- involving drawing, copying and matching tasks -- designed to test their child's cognitive skills.
Lead author of the study, Katrina d'Apice, a PhD student from the University of York's Department of Education, said: "Using the audio recorders allowed us to study real-life interactions between young children and their families in an unobtrusive way within the home environment rather than a lab setting.
"We found that the quantity of adult spoken words that children hear is positively associated with their cognitive ability. However, further research is needed to explore the reasons behind this link -- it could be that greater exposure to language provides more learning opportunities for children, but it could also be the case that more intelligent children evoke more words from adults in their environment."
The researchers also found that high quality adult speech may have benefits for children's linguistic development, as children in the study who interacted with adults who used a diverse vocabulary knew a greater variety of words themselves.
The study also analysed the recordings to look at the impact different parenting styles might have on the children's behaviour.
d'Apice and her colleagues found that positive parenting -- where parents are responsive and encouraging of exploration and self-expression -- was associated with children showing fewer signs of restless, aggressive and disobedient behaviours.
The study's senior author, Professor Sophie von Stumm from the Department of Education at the University of York, said: "This study is the largest naturalistic observation of early life home environments to date.
"We found that the quantity of adult spoken words that children were exposed to varied greatly within families. Some kids heard twice as many words on one day as they did on the next.
"The study highlights the importance of treating early life experiences as dynamic and changeable rather than static entities -- approaching research in this way will help us to understand the interplay between environmental experiences and children's differences in development."
#homeObservation #language #parenting #cognitiveDevelopment #lexicalDiversity
@psycholinguistics
👇👇👇👇👇👇👇👇👇👇
https://www.sciencedaily.com/releases/2019/04/190430105753.htm
Psycholinguistics
2019-04-28 

✍ Lyn Tieu, Philippe Schlenker, Emmanuel Chemla. Linguistic inferences without words. Proceedings of the National Academy of Sciences, 2019; 201821018. DOI: 10.1073/pnas.1821018116
@psycholinguistics
👇👇👇👇👇👇👇👇👇👇
Psycholinguistics
2019-04-28 

The PNAS study uses experimental methods to investigate four classic inferential types (such as presuppositions and implicatures), confirming the results of earlier theoretical linguistic work.

But it also goes one step further by replacing the gestures with visual animations embedded in written texts, thus answering two new questions: First, can the results be reproduced for visual stimuli that subjects cannot possibly have seen in a linguistic context, given that people routinely speak with gestures but not with visual animations? Second, can entirely non-linguistic material be structured by the same processes?
Both answers are positive.
In a series of experiments, approximately 100 subjects watched videos of sentences in which some words were replaced either by gestures or by visual animations. They were asked how strongly they derived various inferences that are the hallmarks of different inferential types (for instance, inferences derived in the presence of negation). The subjects' judgments displayed the characteristic signature of four classic inferential types (including presuppositions and implicated content) for gestures but also for visual animations: as expected, subjects sorted the informational content of these non-standard expressions on the fly into the well-established slots of the inferential typology.
#gesture #inference #implicature #presupposition #iconicity
@psycholinguistics
👇👇👇👇👇👇👇👇👇👇
https://www.sciencedaily.com/releases/2019/04/190425104117.htm