Preterm babies born early in the third trimester of pregnancy are likely to experience delays in the development of the auditory cortex, a brain region essential to hearing and understanding sound, a new study reveals. Such delays are associated with speech and language impairments at age 2, the researchers found.
MRI brain scans can predict language improvement after a cochlear implant, laying the foundation for the creation of brain-specific therapy.
Altered development of a part of the auditory cortex in preterm infants is associated with poorer language skills in early childhood, finds a brain imaging study of very early-born babies in a neonatal intensive care unit. The research, published in eNeuro, suggests that developmental disturbances to this brain region may underlie speech and language difficulties observed in this population.
Three- to five-year-old children with autism spectrum disorder (ASD) and delayed language development appear to process voices differently than typically developing children, according to a new study published in Scientific Reports.
Over a three-year period, researchers from the United Kingdom examined the relationship between poor oral health and older adults' risks for becoming frail. They published their findings in the Journal of the American Geriatrics Society.
Millions of Americans hear ringing in their ears -- a condition called tinnitus -- but a new study shows an experimental device could help quiet the phantom sounds by targeting unruly nerve activity in the brain. In the first animal tests and clinical trial of the approach, which uses precisely timed sounds and weak electrical pulses that activate touch-sensitive nerves, participants experienced a decrease in tinnitus loudness and an improvement in tinnitus-related quality of life.
Amazon recently announced that its language assistant Alexa is now able to recognise voices. What is celebrated as a tech revolution is an everyday process for our brain. Until now, however, it was unclear which areas of the brain we use to differentiate voices. The Max Planck Institute for Human Cognitive and Brain Sciences has uncovered new findings: our personal assistant for voice recognition is a fold, or convolution, in the right temporal lobe.
Fish sense water motion the same way humans sense sound, according to new research out of Case Western Reserve University School of Medicine. Researchers discovered that a gene also found in humans helps zebrafish convert water motion into electrical impulses that are sent to the brain for perception. The shared gene allows zebrafish to sense water flow direction, and it also helps cells inside the human ear sense a range of sounds.
In a new Nature paper, a Rice University professor outlines a strategy that uses gene editing to slow the progression of a genetic hearing disease.
Researchers have developed a CRISPR-Cas9 genome-editing therapy to prevent hearing loss in a mouse model of human genetic progressive deafness.