Category: Data Analysis

Automated classification of self-grooming in mice

In the Journal of Neuroscience Methods, Bastijn van den Boom and colleagues have shared their ‘how-to’ instructions for implementing behavioral classification with JAABA, featuring Bonsai and motr!


In honor of our 100th post on OpenBehavior, we wanted to feature a project that exemplifies how multiple open-source projects can be implemented to address a common theme in behavioral neuroscience: tracking and classifying complex behaviors! The protocol from van den Boom et al. implements JAABA, an open-source machine-learning-based behavior detection system; motr, an open-source mouse trajectory tracking software; and Bonsai, an open-source system capable of streaming and recording video. Together, the authors use these tools to process videos of mice performing grooming behaviors in a variety of behavioral setups.
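
To make the classification step concrete, here is a minimal sketch of frame-wise behavior classification in Python with scikit-learn. It is not JAABA itself (JAABA is a MATLAB application built around boosted classifiers over window features); the features, labels, and model below are hypothetical stand-ins for the kind of per-frame trajectory features a tracker such as motr produces.

```python
# Minimal sketch: label each video frame as grooming / not-grooming from
# per-frame trajectory features. All data here are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical data: 5000 frames x 4 features (e.g., speed, angular
# velocity, body-ellipse axes) taken from a tracker's output.
X = rng.normal(size=(5000, 4))
y = (X[:, 0] < -0.5).astype(int)  # stand-in labels: "grooming" when slow

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```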

They then compare multiple tools for analyzing grooming behavior sequences in both wild-type and genetic knockout mice with a tendency to over-groom. The JAABA-trained classifier outperforms commercially available behavior analysis software and more closely aligns with manual scoring of behavior by expert observers. This offers a novel, cost-effective, and easy-to-use method for assessing grooming behavior in mice that is comparable to an expert observer, with the added advantage of being automated. Instructions for training your own JAABA classifier can be found in their paper!

Read more in their publication here!


AutonoMouse

In a recently published article (Erskine et al., 2019), the Schaefer lab at the Francis Crick Institute introduced their new open-source project called AutonoMouse.


AutonoMouse is a fully automated, high-throughput system for self-initiated conditioning and behavior tracking in mice. Many aspects of behavior can be analyzed by having rodents perform operant conditioning tasks. However, in operant experiments, many variables can potentially alter or confound results: experimenter presence, picking up and handling animals, altered physiological states through water restriction, and the fact that rodents often need to be individually housed to track their individual performance. This was the main motivation for the authors to completely automate operant conditioning.

The authors developed AutonoMouse as a fully automated system that can track large numbers (over 25) of socially housed mice via implanted RFID chips. With the RFID trackers and other analyses, the behavior of mice can be tracked as they train and are subsequently tested on (or self-initiate testing in) an odor discrimination task over months, with thousands of trials performed every day. The novelty in this study is the fully automated nature of the entire system (training, experiments, water delivery, and weighing the animals are all automated) and the ability to keep mice socially housed 24/7, all while still training them and tracking their performance in an olfactory operant conditioning task. The modular setup makes it possible for AutonoMouse to be used to study other sensory modalities, such as vision, or decision-making tasks. The authors provide a components list, layouts, construction drawings, and step-by-step instructions for the construction and use of AutonoMouse in their publication and on the project’s GitHub.
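
As a rough illustration of the control logic such a system needs, here is a hypothetical sketch, not the AutonoMouse codebase: the function names, schedule structure, and data are all invented for illustration. An RFID read gates access to a self-initiated trial, and performance is logged per animal.

```python
# Hypothetical control loop: RFID tag gates a self-initiated trial,
# per-animal performance is logged. Not the AutonoMouse implementation.
import random
import time

schedules = {"tag_001": "odor_discrimination_stage2"}  # per-animal stage
performance_log = []                                   # (tag, stage, correct)

def read_rfid():
    """Stand-in for the RFID reader; returns a tag ID or None."""
    return random.choice([None, "tag_001"])

def run_trial(stage):
    """Stand-in for stimulus delivery and lick detection."""
    return random.random() < 0.8

def control_loop(n_polls=10):
    for _ in range(n_polls):
        tag = read_rfid()
        if tag in schedules:               # only enrolled animals start trials
            stage = schedules[tag]
            performance_log.append((tag, stage, run_trial(stage)))
        time.sleep(0.1)                    # poll interval

control_loop()
print(performance_log)
```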


For more details, check out this YouTube interview with Andreas Schaefer, PI on the project.


The GitHub repository for the project’s control software is located here: https://github.com/RoboDoig/autonomouse-control, and the project’s design and hardware instructions are here: https://github.com/RoboDoig/autonomouse-design. The schedule generation program is located here: https://github.com/RoboDoig/schedule-generator


Phenopy

April 17, 2019

In a recent Nature Protocols article, Edoardo Balzani and colleagues from Valter Tucci’s lab have developed and shared Phenopy, a Python-based open-source analytical platform for behavioral phenotyping.


Behavioral phenotyping of mice using classic methods can be a long process and is susceptible to high variability, leading to inconsistent results. To reduce variance and speed up the process of behavioral analysis, Balzani et al. developed Phenopy, an open-source software platform for recording and analyzing behavioral data for phenotyping. The software allows for recording components of a behavioral task in combination with electrophysiology data. It is capable of performing online analysis as well as large-scale analysis of recorded data, all within a user-friendly interface. Detailed information about the software is available in their Nature Protocols publication.*

Check out the full article from Nature Protocols!


(*alternatively available on ResearchGate)

CAVE

In a recent article, Jennifer Tegtmeier and colleagues have shared CAVE: an open-source tool in MATLAB for combined analysis of head-mounted calcium imaging and behavior.


Calcium imaging is spreading through the neuroscience field like melted butter on hot toast. Like other imaging techniques, the data collected with calcium imaging is large and complex. CAVE (Calcium ActiVity Explorer) aims to analyze imaging data from head-mounted microscopes simultaneously with behavioral data. Tegtmeier et al. developed this software in MATLAB with a bundle of unique algorithms to specifically analyze single-photon imaging data, which can then be correlated to behavioral data. A streamlined workflow is available for novice users, with more sophisticated options for advanced users. The code is available for download from GitHub.
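
As a generic illustration of the kind of operation such tools perform (this sketch uses plain NumPy rather than CAVE’s MATLAB interface, and all data are stand-ins): align each cell’s calcium trace to behavioral event onsets and compute an event-triggered average.

```python
# Generic sketch: event-triggered average of calcium traces around
# behavioral events. Frame rate, traces, and event times are invented.
import numpy as np

fs = 20.0                                    # imaging frame rate (Hz), assumed
traces = np.random.randn(50, 12000)          # 50 cells x 10 min of dF/F (stand-in)
event_frames = np.array([400, 2100, 5000, 9000])  # behavior onsets (frames)

pre, post = int(1 * fs), int(3 * fs)         # window: 1 s before to 3 s after
snippets = np.stack([traces[:, t - pre:t + post] for t in event_frames])
eta = snippets.mean(axis=0)                  # event-triggered average, cells x time
print(eta.shape)                             # (50, 80)
```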

Read more from Frontiers in Neuroscience, or check it out directly from GitHub.


idtracker.ai

February 20, 2019

Francisco Romero-Ferrero and colleagues have developed idtracker.ai, an algorithm and software for tracking individuals in large collectives of unmarked animals, recently described in Nature Methods.


Tracking individual animals in large collective groups can give interesting insights into behavior but has proven to be a challenge for analysis. With advances in artificial intelligence and tracking software, it has become increasingly easy to collect such information from video data. Romero-Ferrero et al. have developed an algorithm and tracking software built on two deep networks: one that identifies individual animals, and a second that detects when animals touch or cross paths in front of one another. The software has been validated to track individuals with high accuracy in collectives of up to 100 animals, across species ranging from rodents to zebrafish to ants. This software is free, fully documented, and available online, with additional Jupyter notebooks for data analysis.
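
For downstream analysis, a sketch like the following could compute inter-individual distances from tracked positions. It assumes a trajectories array laid out as frames x individuals x (x, y), which matches how idtracker.ai’s documentation describes its output, though exact file contents may vary by version.

```python
# Sketch of downstream analysis on tracked trajectories.
import numpy as np

# Real data would come from the tool's output, e.g. something like:
#   data = np.load("trajectories.npy", allow_pickle=True).item()
#   traj = data["trajectories"]
# Stand-in array with the same assumed layout (frames x individuals x 2):
traj = np.random.rand(1000, 10, 2) * 100

diff = traj[:, :, None, :] - traj[:, None, :, :]
dist = np.linalg.norm(diff, axis=-1)          # (frames, n, n)
nearest = np.sort(dist, axis=-1)[:, :, 1]     # skip self-distance at column 0
print("mean nearest-neighbor distance:", np.nanmean(nearest))
```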

Check out their website with full documentation, the recent Nature Methods article, the bioRxiv preprint, and a great video of idtracker.ai tracking 100 zebrafish!


Open-source platform for worm behavior

February 13, 2019

In Nature Methods, Avelino Javer and colleagues developed and shared an open-source platform for analyzing and sharing worm behavioral data.


Collecting behavioral data is important, and analyzing this data is just as crucial. Sharing these data matters too: it can further our understanding of behavior and increase the replicability of worm behavioral studies by allowing many scientists to re-analyze available data, as well as develop new methods for analysis. Javer and colleagues developed an open resource in an effort to streamline the steps involved in this process, from storing and accessing video files to software for reading and analyzing the data. The platform features an open-access repository for storing, accessing, and filtering data; an interchange format for annotating single- or multi-worm behavior; and Python software for feature extraction, review, and analysis. Together, these tools serve as an accessible suite for quantitative behavior analysis that can be used by experimentalists and computational scientists alike.
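
As a small illustration of the interchange-format idea: the format the platform describes is WCON, a JSON-based notation. The toy file below follows its basic units-plus-data structure, though real WCON files carry much more metadata.

```python
# Minimal sketch of reading a WCON-style interchange file (JSON-based).
import json

wcon_text = """{
  "units": {"t": "s", "x": "mm", "y": "mm"},
  "data": [{"id": "worm_1",
            "t": [0.0, 0.5, 1.0],
            "x": [[1.0], [1.2], [1.5]],
            "y": [[0.0], [0.1], [0.3]]}]
}"""

doc = json.loads(wcon_text)
for worm in doc["data"]:
    print(worm["id"], "tracked for",
          worm["t"][-1] - worm["t"][0], doc["units"]["t"])
```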


Read more about this platform from Nature Methods! (the preprint is also available from bioRxiv!)


CaImAn

January 23, 2019

Hot off the press in eLife, Andrea Giovannucci and colleagues have shared their open-source software library, CaImAn, for one- and two-photon Calcium Imaging data Analysis.


In vivo calcium imaging has gained popularity in recent years for its ability to record large quantities of neural activity from multiple brain areas over extended time periods. Advanced recording tools produce large quantities of data, and large datasets demand streamlined ways to analyze them. Giovannucci and colleagues have developed and shared a toolbox for analyzing complex calcium imaging datasets. CaImAn, developed in the open-source Python language (with optional implementation in MATLAB), is designed to correct for motion, estimate spikes, detect new neurons, and assess neuronal activity and locations in a given timeframe. The software can be used on pre-recorded data or enabled for real-time analysis. CaImAn is available to download with examples from GitHub, and more information can be found in the manuscript.
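
As a generic, simplified illustration of one step such pipelines perform (this is not CaImAn’s API; the rolling-percentile baseline and all data below are stand-ins): compute dF/F from raw fluorescence traces to assess neuronal activity over time.

```python
# Generic sketch: dF/F from raw fluorescence using a rolling
# percentile baseline. Data and window length are invented.
import numpy as np
from scipy.ndimage import percentile_filter

raw = np.abs(np.random.randn(30, 6000)) + 100.0   # 30 cells x frames (stand-in)
window = 600                                       # baseline window (frames)

f0 = percentile_filter(raw, percentile=10, size=(1, window))  # per-cell baseline
dff = (raw - f0) / f0
print(dff.shape)
```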

Check out GitHub, or the article from eLife!


DeepSqueak

January 9, 2019

Kevin Coffey has shared the following about DeepSqueak, a deep learning-based system for detection and analysis of ultrasonic vocalizations, which he developed with Russell Marx.


Rodents engage in social communication through a rich repertoire of ultrasonic vocalizations (USVs). Recording and analysis of USVs can be performed noninvasively in almost any rodent behavioral model to provide rich insights into emotional state and motor function. Despite strong evidence that USVs serve an array of communicative functions, technical and financial limitations have inhibited widespread adoption of vocalization analysis. Manual USV analysis is slow and laborious, while existing automated analysis software is vulnerable to the broad-spectrum noise routinely encountered in the testing environment.

To promote accessible and accurate USV research, we present “DeepSqueak”, a fully graphical MATLAB package for high-throughput USV detection, classification, and analysis. DeepSqueak applies state-of-the-art region-based object detection neural networks (Faster R-CNN) to detect USVs. This dramatically reduces the false positive rate and facilitates reliable analysis under standard experimental conditions. DeepSqueak includes pre-trained detection networks for mouse USVs and for 50 kHz and 22 kHz rat USVs. After detection, USVs can be clustered by k-means models or classified by trainable neural networks.
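
As a deliberately simplified stand-in for the detection step (DeepSqueak’s actual detector is a Faster R-CNN operating on spectrogram images; the threshold rule below is only for illustration, and is exactly the kind of approach that struggles with the noise the network handles):

```python
# Toy sketch: flag candidate USV frames by thresholding spectrogram
# energy in an ultrasonic band. Sample rate, band, and audio are invented.
import numpy as np
from scipy.signal import spectrogram

fs = 250_000                                  # sample rate (Hz), assumed
t = np.arange(int(0.5 * fs)) / fs
audio = np.random.randn(t.size) * 0.05
audio[20_000:30_000] += np.sin(2 * np.pi * 60_000 * t[20_000:30_000])  # fake call

f, times, Sxx = spectrogram(audio, fs=fs, nperseg=512)
band = (f > 40_000) & (f < 90_000)            # ultrasonic band of interest
energy = Sxx[band].sum(axis=0)
detected = energy > energy.mean() + 3 * energy.std()
print("frames flagged as putative USV:", detected.sum())
```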

Read more in their recent publication and check out DeepSqueak on GitHub!


Live Mouse Tracker

December 5, 2018

In a recent preprint, Fabrice de Chaumont and colleagues share Live Mouse Tracker, a real-time behavioral analysis system for groups of mice.


Monitoring the social interactions of mice is important for understanding preclinical models of various psychiatric disorders; however, gathering data on social behaviors can be time-consuming and is often limited to a few subjects at a time. With advances in computer vision, machine learning, and individual identification methods, gathering social behavior data from many mice is now easier. de Chaumont and colleagues have developed Live Mouse Tracker, which allows behavior tracking for up to 4 mice at a time using RFID sensors. Infrared/depth (RGBD) cameras allow tracking of animal shape and posture. The system automatically labels behaviors at the individual, dyadic, and group levels. Live Mouse Tracker can be used to assess complex social behavioral differences between mice.
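
As a toy sketch of the dyadic labeling idea (not Live Mouse Tracker’s implementation, which works on RGBD shape and posture data; the positions and threshold here are invented): flag a contact whenever two tracked centroids come within a distance threshold.

```python
# Toy sketch: label dyadic "contact" events from tracked centroid positions.
import numpy as np

pos = np.random.rand(1000, 4, 2) * 50         # frames x 4 mice x (x, y) in cm
threshold_cm = 3.0                            # assumed contact distance

d = np.linalg.norm(pos[:, :, None] - pos[:, None, :], axis=-1)  # (frames, 4, 4)
contact = (d < threshold_cm) & ~np.eye(4, dtype=bool)           # exclude self-pairs
print("frames with at least one contact:", contact.any(axis=(1, 2)).sum())
```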

Learn more on bioRxiv, or check out the Live Mouse Tracker website!


OpenBehavior Feedback Survey

We are looking for your feedback to understand how we can better serve the community! We’re also interested to know if/how you’ve implemented some of the open-source tools from our site in your own research.

We would greatly appreciate it if you could fill out a short survey (~5 minutes to complete) about your experiences with OpenBehavior.

https://american.co1.qualtrics.com/jfe/form/SV_0BqSEKvXWtMagqp

Thanks!