Mixed signals: Machine learning helps detect roars from lion collars without recording actual audio
Peer-Reviewed Publication
This month, we’re focusing on artificial intelligence (AI), a topic that continues to capture attention everywhere. Here, you’ll find the latest research news, insights, and discoveries shaping how AI is being developed and used across the world.
Updates every hour. Last Updated: 27-Apr-2026 04:16 ET (27-Apr-2026 08:16 GMT/UTC)
Roaring over long distances is a key behaviour of lions: they communicate within their prides as well as with other animals using distinct sequences of moans and grunts. Scientists from the GAIA Initiative have now published a machine learning approach in the journal “Ecological Informatics” that improves how roaring behaviour can be studied. The algorithm reliably detects long-distance roaring based solely on accelerometer (ACC) data recorded by collars – without a microphone and without energy- and storage-intensive audio files. For the first time, such an algorithm works reliably for both male and female lions, and even on mixed signals produced when lions roar while walking.
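The core idea – flagging roar-like events from collar accelerometer data alone – can be illustrated with a minimal sketch. This is not the GAIA team's published algorithm: the window size, the energy feature, and the threshold below are all illustrative assumptions, standing in for whatever features and classifier the paper actually uses.

```python
import math

def window_rms(samples, start, size):
    """Root-mean-square magnitude of tri-axial ACC samples in one window."""
    window = samples[start:start + size]
    return math.sqrt(sum(x * x + y * y + z * z for x, y, z in window) / len(window))

def detect_roar_windows(samples, size=32, threshold=1.5):
    """Flag windows whose ACC energy exceeds a (hypothetical) roar threshold.

    samples: list of (x, y, z) acceleration tuples from a collar.
    Returns a list of (window_start_index, rms) for flagged windows.
    """
    flagged = []
    for start in range(0, len(samples) - size + 1, size):
        rms = window_rms(samples, start, size)
        if rms > threshold:
            flagged.append((start, rms))
    return flagged

# Synthetic demo: a quiet baseline with one high-energy "roar" burst.
quiet = [(0.1, 0.0, 0.98)] * 64   # resting posture, roughly 1 g on one axis
roar = [(1.2, 1.1, 1.3)] * 32     # vigorous movement on all three axes
trace = quiet + roar + quiet

hits = detect_roar_windows(trace)
print(hits)  # only the burst window is flagged
```

A real detector would replace the single threshold with a trained classifier over many windowed features, which is presumably how the published approach copes with mixed signals such as walking while roaring.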
A research team led by Profs. HUANG Jiacong and GAO Junfeng from the Nanjing Institute of Geography and Limnology of the Chinese Academy of Sciences, together with collaborators from the Institute of Mountain Hazards and Environment and Jiangxi Normal University, has developed a novel high-resolution machine learning model based on a comprehensive national dataset, enabling high-precision prediction and spatial mapping of CO₂ emissions from China’s lakes.
When probes are inserted into the brain for research or clinical purposes, the electrical activity of neurons is recorded. These signals can be used to understand how the brain performs certain computations or even to identify pathological states. However, brains are composed of cell types that perform different roles in computation and are differentially affected by certain psychiatric disorders or drugs. Without a deep understanding of how cell types orchestrate the overall activity patterns, we cannot develop the next generation of therapies.
Researchers from Boston University’s Chobanian & Avedisian School of Medicine, College of Arts & Sciences, College of Engineering and Faculty of Computing & Data Sciences have developed a tool called PhysMAP to separate the “voices” of individual cell types within a crowd of electrical noise by combining several complementary features of each type's electrical signature. This machine learning algorithm could open up the study of how cell types shape both the healthy computations and the pathological states that electrical recordings have long detected but never fully explained.
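The underlying idea – combining complementary features of a unit's electrical signature to assign it a cell type – can be sketched as follows. This is not PhysMAP itself: the two features (spike width and firing rate), the class labels, and the reference centroids are illustrative assumptions, and a nearest-centroid rule stands in for the actual algorithm.

```python
import math

# Hypothetical reference signatures for two broad cell classes, each described
# by two complementary features: spike width (ms) and mean firing rate (Hz).
CENTROIDS = {
    "putative_excitatory": (0.6, 4.0),    # broad spikes, low firing rate
    "putative_inhibitory": (0.25, 20.0),  # narrow spikes, high firing rate
}

def normalize(features, scales=(1.0, 25.0)):
    """Put each feature on a comparable scale before combining them."""
    return tuple(f / s for f, s in zip(features, scales))

def classify_unit(spike_width_ms, firing_rate_hz):
    """Assign a recorded unit to the nearest class in the combined feature space."""
    point = normalize((spike_width_ms, firing_rate_hz))

    def distance_to(label):
        return math.dist(point, normalize(CENTROIDS[label]))

    return min(CENTROIDS, key=distance_to)

label = classify_unit(0.3, 18.0)
print(label)  # a narrow, fast-firing unit lands nearest the inhibitory centroid
```

The point of combining features is that neither one alone is decisive: some narrow-spiking cells fire slowly, and some broad-spiking cells fire fast, so a joint feature space separates the classes better than any single measurement.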