Electronic news service of the Ministry of Communications and Information Technologies

UCLA smartphone app translates the meaning of your baby's cries


As any new parent will quickly confirm, the sound of a crying baby can communicate a multitude of things, from "I'm hungry" to "I'm in pain." A team at UCLA has now developed an innovative app that can identify when a baby is crying and help decode what they are trying to communicate.
 
The app is called Chatterbaby and is initially being directed at assisting deaf parents in identifying when their baby is crying, as opposed to simply making loud noises. Remote noise monitors, while helpful for hearing-impaired parents, cannot distinguish between a cry of distress and a loudly talkative baby.
 
The second function of Chatterbaby is where things begin to get really interesting. The team at UCLA set out to create an algorithm that can place different cries into three categories: pain, hunger and fussiness.
 
"As a mother, you instinctively know what your child is trying to tell you simply by listening to how they cry, even if you can't see them," explains Ariana Anderson, lead in the UCLA research. "As a statistician, I thought, 'Can we train an algorithm to do what my ears as a parent can do automatically?' The answer was yes."
 
After testing a series of different algorithms on more than 2,000 samples of baby cries, the system was ultimately able to correctly identify pain cries over 90 percent of the time, although it had more difficulty differentiating between fussy and hungry cries.
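Purely as an illustration of the kind of supervised, three-way classification described here (and not the UCLA team's actual pipeline), a prototype could be built with scikit-learn, assuming each cry has already been reduced to a fixed-length vector of acoustic features and labeled as pain, hunger or fussiness:

    # Illustrative sketch only -- not the Chatterbaby team's actual code.
    # Assumes each cry has been converted to a fixed-length acoustic
    # feature vector with a parent-supplied label.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Stand-in data: 2,000 cries x 20 features, 3 classes (all synthetic).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 20))      # placeholder feature vectors
    y = rng.integers(0, 3, size=2000)    # 0 = pain, 1 = hunger, 2 = fussy

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validation
    print(f"mean accuracy: {scores.mean():.2f}")

Accuracy in such a setup depends entirely on how informative the extracted features are; with the random placeholder data above the score hovers around chance.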
 
"The program looks at the types of frequencies that are in the cry and at the different patterns of sounds and silence," says Anderson. "For example, when you hear a cry that has a long period of silence in it, it's more likely that the baby is fussy. But when babies are in pain, the cries typically have louder, longer bursts and there's very little silence between sounds."
 
Of course, not every baby communicates in exactly the same way, which is why the app cleverly incorporates a personalization feature that allows users to train the algorithm to better translate their own baby's specific sounds. Audio clips of a baby's cry can be recorded, tagged and uploaded to the UCLA server to improve the accuracy of the app's future predictions. The researchers point out that this audio data is stripped of any identifying characteristics, but it is permanently stored as part of a larger research study designed to gradually improve the algorithm's overall accuracy.
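On the client side, the record-tag-upload loop described above might look something like the following; the endpoint URL and field names are invented for illustration, since the app's actual API is not described in the article:

    # Hypothetical client-side sketch; the endpoint and field names are
    # made up for illustration and are not Chatterbaby's real API.
    import base64
    import json
    import urllib.request

    def upload_tagged_cry(wav_bytes, label, url="https://example.org/cries"):
        """Send a de-identified, parent-labeled cry clip to a research server."""
        payload = json.dumps({
            "label": label,   # e.g. "pain", "hunger" or "fussy"
            "audio_b64": base64.b64encode(wav_bytes).decode("ascii"),
        }).encode("utf-8")
        req = urllib.request.Request(
            url, data=payload,
            headers={"Content-Type": "application/json"}, method="POST")
        with urllib.request.urlopen(req) as resp:
            return resp.status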
 
The team also intends to use the growing bank of baby cries to determine if neurodevelopmental disorders can be diagnosed through abnormal vocalizations. Stephen Sheinkopf, from Brown University, has been studying correlations between irregular cry acoustics and neurodevelopmental disorders for several years. He believes that with the appropriate tools a condition such as autism could potentially be identified in a very young baby simply through the study of its cries.
 
"We've known for a long time that older individuals with autism produce sounds or vocalizations that are unusual or atypical," says Sheinkopf. "So vocalizations in babies have been discussed as being useful in developing early identification tools for autism. That's been a major challenge. How do you find signs of autism in infancy?"
 
While the Chatterbaby app is free to use, it is also, ostensibly, part of a larger research project. Use of the app requires consent from the user, and the researchers are hoping parents will engage further in the longitudinal aspects of the study. In order to better correlate certain crying characteristics with conditions such as autism, the researchers are asking parents using the app to fill out annual surveys that follow a child's development until the age of six.
 
"This study is unique because it brings the lab to the participant instead of the participant to the lab," says Anderson. "It's open to anyone willing to download the Chatterbaby app on their iPhone or Android devices, record five seconds of their baby's cries, then upload it to the database."






24/05/18