New tool decodes neural activity using facial movements (VIDEO)
Caption
Janelia scientists have developed a tool that could bring researchers one step closer to understanding brain-wide signals driven by spontaneous behaviors. The tool, known as Facemap, uses deep neural networks to relate information about a mouse’s eye, whisker, nose, and mouth movements to neural activity in the brain.
Here, a video of a mouse face has been edited to label 13 key points that correspond to different facial movements associated with individual spontaneous behaviors, like whisking, grooming, and licking.
The team first developed a neural network-based model that could identify these key points in videos of mouse faces collected in the lab under various experimental setups. They then developed another deep neural network-based model to map this key point data, which captures the mouse's facial movements, to neural activity, allowing them to see how a mouse's spontaneous behaviors drive neural activity in a particular brain region.
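The two-stage idea described above, tracking facial key points and then predicting neural activity from them, can be sketched in miniature. The sketch below is illustrative only: it substitutes a simple ridge regression on synthetic data for Facemap's deep neural networks, and every variable name and array shape is an assumption for illustration, not Facemap's actual API.

```python
import numpy as np

# Illustrative sketch, NOT Facemap's implementation: Facemap uses deep
# neural networks, but the core idea -- predicting neural activity from
# tracked facial key points -- can be shown with ridge regression on
# synthetic data. All names and shapes below are assumptions.

rng = np.random.default_rng(0)

n_frames = 500     # video frames
n_keypoints = 13   # key points tracked on the mouse face
n_neurons = 50     # recorded neurons

# Synthetic key point trajectories: (frames, keypoints * 2) for x/y coordinates
keypoints = rng.standard_normal((n_frames, n_keypoints * 2))

# Synthetic mapping from facial movements to neural activity, plus noise
true_weights = rng.standard_normal((n_keypoints * 2, n_neurons))
neural = keypoints @ true_weights + 0.1 * rng.standard_normal((n_frames, n_neurons))

# Ridge regression: W = (X^T X + lambda * I)^-1 X^T Y
lam = 1.0
X, Y = keypoints, neural
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

# Fraction of neural variance explained by the facial key point movements
pred = X @ W
r2 = 1.0 - ((Y - pred) ** 2).sum() / ((Y - Y.mean(axis=0)) ** 2).sum()
print(f"variance explained: {r2:.2f}")
```

On this synthetic data the regression recovers most of the variance by construction; in real recordings, the fraction of brain-wide activity explained by spontaneous facial movements is exactly the kind of quantity a tool like Facemap is built to measure.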
Credit
Credit: Atika Syeda/HHMI Janelia Research Campus
Usage Restrictions
Please credit Atika Syeda/HHMI Janelia Research Campus
License
Original content