Using artificial intelligence to track animal behavior is a rapidly expanding area of research and application. AI technologies have the potential to transform how we study and understand animal behavior in a variety of contexts, including wildlife protection, research, and even pet care.
GlowTrack is a non-invasive movement-tracking system developed by scientists that uses fluorescent dye markers to train artificial intelligence to capture movement ranging from a single mouse digit to the human hand. GlowTrack has applications in biology, robotics, medicine, and other fields.
Movement provides insight into how the brain operates and regulates the body. Tracking human and animal activity has come a long way, from clipboard-and-pen observation to sophisticated artificial intelligence-based approaches. Artificial intelligence is now used in cutting-edge technologies to automatically track parts of the body as they move. However, training these models remains time-consuming and is limited by the need for researchers to manually mark each body part hundreds to thousands of times.
Associate Professor Eiman Azim and colleagues have developed GlowTrack, a non-invasive movement-tracking system that uses fluorescent dye markers to train artificial intelligence. GlowTrack is a robust, time-efficient, and high-definition tracking system capable of tracking a single digit on a mouse’s paw or hundreds of landmarks on a human hand.
The approach, described in Nature Communications, has applications in biology, robotics, medicine, and other fields.
“Over the last several years, there has been a revolution in tracking behavior as powerful artificial intelligence tools have been brought into the laboratory,” says Azim, the study’s senior author and holder of the William Scandling Developmental Chair. “Our method makes these tools more versatile, allowing us to capture more diverse movements in the laboratory. Better movement quantification enables us to have a better understanding of how the brain influences behavior and may aid in the study of movement disorders such as amyotrophic lateral sclerosis (ALS) and Parkinson’s disease.”
Current approaches for capturing animal movement often require researchers to manually and repeatedly mark body parts on a computer screen – a time-consuming process that is prone to human error and constrained by researchers’ time. Because artificial intelligence models specialize to the limited training data they receive, these tools can usually only be employed in a confined testing environment. For example, if the lighting, the animal’s body posture, the camera angle, or any number of other conditions changed, the model would no longer recognize the tracked body part.
To address these limitations, the researchers used fluorescent dye to label parts of the animal or human body. With these “invisible” fluorescent dye markers, an enormous amount of visually diverse data can be created quickly and fed into the artificial intelligence models without the need for human annotation. Once fed this robust data, these models can be used to track movements across a much more diverse set of environments and at a resolution that would be far more difficult to achieve with manual human labeling.
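The core idea – that a fluorescence channel can stand in for a human annotator – can be illustrated with a toy sketch. The function name, threshold value, and synthetic frame below are illustrative assumptions, not GlowTrack’s actual pipeline: it simply treats bright dye pixels in a fluorescence image as the marker and returns their intensity-weighted centroid as an automatic (x, y) keypoint label for the paired visible-light frame.

```python
import numpy as np

def label_from_fluorescence(fluor_frame, threshold=0.5):
    """Derive a keypoint label from a fluorescence-channel frame.

    Pixels above `threshold` are treated as the dye marker; the label is
    the intensity-weighted centroid (x, y) of that region. Returns None
    if no marker is visible in the frame.
    """
    mask = fluor_frame > threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    weights = fluor_frame[mask]
    x = float(np.average(xs, weights=weights))
    y = float(np.average(ys, weights=weights))
    return (x, y)

# Synthetic example: a 64x64 frame with a Gaussian dye spot at (40, 20).
yy, xx = np.mgrid[0:64, 0:64]
frame = np.exp(-((xx - 40) ** 2 + (yy - 20) ** 2) / (2 * 3.0 ** 2))

label = label_from_fluorescence(frame)  # centroid near (40.0, 20.0)
```

Running a routine like this over every frame of a recording would yield thousands of labeled examples with no manual clicking, which is what lets the resulting models train on far more visually diverse data.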
This also makes it easier to compare movement data across studies, because different laboratories can use the same models to capture body movement in a range of settings. Azim believes that experiment comparison and reproducibility are critical to the process of scientific discovery.
“Fluorescent dye markers were the perfect solution,” explains first author and Salk bioinformatics analyst Daniel Butler. “Our fluorescent dye markers, like the invisible print on a dollar bill that only lights up when you want it to, can be turned on and off in the blink of an eye, allowing us to collect huge amounts of training data.”
Looking ahead, the team is excited to enable a variety of GlowTrack applications and to pair its capabilities with other tracking technologies that reconstruct movement in three dimensions, as well as with analysis methods that can mine these enormous movement datasets for patterns.
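The 3D reconstruction mentioned here is typically done by triangulating the same tracked landmark seen from two or more calibrated cameras. The following is a minimal sketch of classic linear (DLT) triangulation, assuming known 3x4 camera projection matrices; it is not part of GlowTrack itself, just the standard technique such pairings rely on.

```python
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices.
    pt1, pt2: (x, y) image coordinates of the same landmark in each view.
    """
    x1, y1 = pt1
    x2, y2 = pt2
    # Each view contributes two linear constraints on the homogeneous 3D point.
    A = np.vstack([
        x1 * P1[2] - P1[0],
        y1 * P1[2] - P1[1],
        x2 * P2[2] - P2[0],
        y2 * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean

def project(P, X):
    """Project a 3D point through a camera matrix to pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras: one at the origin, one shifted 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 4.0])
X_hat = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

With noise-free projections the recovered point matches the original; in practice, per-camera 2D tracks from a system like GlowTrack would feed into this step.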
“Our approach can benefit a host of fields that need more sensitive, reliable, and comprehensive tools to capture and quantify movement,” Azim said. “I am eager to see how other scientists and non-scientists adopt these methods, and what unique, unforeseen applications might arise.”