Deep learning powers a motion-tracking revolution

As a postdoc, physiologist Valentina Di Santo spent a lot of time scrutinizing high-resolution films of fish.

Di Santo was investigating the motions involved when fish such as skates swim. She filmed individual fish in a tank and manually annotated their body parts frame by frame, an effort that required about a month of full-time work for 72 seconds of footage. Using DLTdv, an open-source application written in the programming language MATLAB, she then extracted the coordinates of body parts — the key information needed for her research. That analysis showed, among other things, that when little skates (Leucoraja erinacea) need to swim faster, they create an arch on their fin margin to stiffen its edge¹.
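As a rough illustration of why those digitized coordinates matter, the short sketch below (written in Python rather than DLTdv's MATLAB, and with a made-up file name, column layout and frame rate — none of these come from Di Santo's study) turns frame-by-frame body-part positions into an instantaneous swimming speed.

    # Minimal sketch, not Di Santo's actual pipeline: derive a simple
    # kinematic quantity from body-part coordinates digitized per frame.
    import numpy as np
    import pandas as pd

    FPS = 500  # hypothetical high-speed camera frame rate

    # Assumed columns: frame, snout_x, snout_y (pixel coordinates)
    coords = pd.read_csv("skate_digitized_points.csv")

    # Frame-to-frame displacement of the snout marker, in pixels
    dx = np.diff(coords["snout_x"].to_numpy())
    dy = np.diff(coords["snout_y"].to_numpy())

    # Instantaneous swimming speed, in pixels per second
    speed = np.hypot(dx, dy) * FPS
    print(f"mean speed: {speed.mean():.1f} px/s")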

But as the focus of Di Santo’s research shifted from individual animals to schools of fish, it was clear a new approach would be required. “It would take me forever to analyse [those data] with the same detail,” says Di Santo, who is now at Stockholm University. So, she turned to DeepLabCut instead.

DeepLabCut is an open-source software package that allows users to train a computational model called a neural network to track animal postures in videos. It was developed by Mackenzie Mathis, a neuroscientist at Harvard University in Cambridge, Massachusetts, and her colleagues. The publicly available version didn’t have an easy way to track multiple animals over time, but Mathis’ team agreed to run an updated version using the fish data, which Di Santo annotated using a graphical user interface (GUI). The preliminary output looks promising, Di Santo says, although she is waiting to see how the tool performs on the full data set. But without DeepLabCut, she says, the study “would not be possible”.

Researchers have long been interested in tracking animal motion, Mathis says, because motion is “a very good read-out of intention within the brain”. But conventionally, that has involved spending hours recording behaviours by hand. The previous generation of animal-tracking tools mainly determined centre of mass and sometimes orientation, and the few tools that captured finer details were highly specialized for specific animals or subject to other constraints, says Talmo Pereira, a neuroscientist at Princeton University in New Jersey.

Over the past several years, deep learning — an artificial-intelligence method that uses neural networks to recognize subtle patterns in data — has powered a new crop of tools. Open-source packages such as DeepLabCut, LEAP Estimates Animal Pose (LEAP) and DeepFly3D use deep learning to determine the coordinates of animal body parts in videos. Complementary tools perform tasks such as identifying specific animals. These packages have aided research on everything from motion in hunting cheetahs to collective behaviour in zebrafish.
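In practice, what these pose-estimation packages have in common is their output: for each video frame, an (x, y) position plus a confidence score for every tracked body part. The sketch below uses simulated numbers (the array shapes and the 0.9 threshold are illustrative assumptions, not any package's defaults) to show a typical downstream step, discarding low-confidence detections before analysis.

    # Illustrative sketch with simulated data, not output from a real model.
    import numpy as np

    n_frames, n_bodyparts = 1000, 8
    rng = np.random.default_rng(0)

    # Typical prediction layout: (frames, body parts, [x, y, confidence])
    predictions = rng.random((n_frames, n_bodyparts, 3))

    # Mask out low-confidence keypoints before downstream analysis
    confident = predictions[..., 2] > 0.9
    xy = np.where(confident[..., None], predictions[..., :2], np.nan)
    print(f"kept {np.isfinite(xy[..., 0]).mean():.0%} of keypoints")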

Each tool has limitations; some require specific experimental set-ups or don’t work well when animals always crowd together. But methods will improve alongside advances in image capture and machine learning, says Sandeep Robert Datta, a neuroscientist at Harvard Medical School in Boston, Massachusetts. “What you’re looking at now is just the very beginning of what is certain to be a long-term transformation in the way neuroscientists study behaviour,” he says.

Strike a pose

DeepLabCut is based on software used to analyse human poses. Mathis’ team adapted its underlying neural network to work for other animals with relatively little training data. Between 50 and 200 manually annotated frames are generally sufficient for standard lab studies, although the amount needed depends on factors such as data quality and the consistency of the people doing the labelling, Mathis says. In addition to annotating body parts with a GUI, users can issue commands through a Jupyter Notebook, a computational document popular with data scientists. Scientists have used DeepLabCut to study both lab and wild animals, including mice, spiders, octopuses and cheetahs. Neuroscientist Wujie Zhang at the University of California, Berkeley, and his colleague used it to estimate the behavioural activity of Egyptian fruit bats.
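For readers curious what issuing commands through a notebook looks like, the sketch below follows the workflow the DeepLabCut documentation describes: create a project, label frames in the GUI, train the network, then analyse new videos. The project name, experimenter and video paths are placeholders, and exact function signatures vary between DeepLabCut versions.

    # Sketch of the standard DeepLabCut workflow, run from Python or a
    # Jupyter Notebook. Paths are placeholders; signatures vary by version.
    import deeplabcut

    # Create a project; returns the path to its config file
    config = deeplabcut.create_new_project(
        "skate-swimming", "experimenter", ["videos/skate.mp4"]
    )

    deeplabcut.extract_frames(config)           # choose frames to annotate
    deeplabcut.label_frames(config)             # opens the labelling GUI
    deeplabcut.create_training_dataset(config)  # ~50-200 labelled frames
    deeplabcut.train_network(config)            # train the neural network
    deeplabcut.analyze_videos(config, ["videos/skate.mp4"])  # track footage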
