Category: Video Analysis

Automated classification of self-grooming in mice

In the Journal of Neuroscience Methods, Bastijn van den Boom and colleagues have shared their ‘how-to’ instructions for implementing behavioral classification with JAABA, featuring Bonsai and motr!


In honor of our 100th post on OpenBehavior, we wanted to feature a project that exemplifies how multiple open-source projects can be implemented to address a common theme in behavioral neuroscience: tracking and classifying complex behaviors! The protocol from Van den Boom et al. implements JAABA, an open-source, machine-learning-based behavior detection system; motr, open-source mouse-trajectory-tracking software; and Bonsai, an open-source system for streaming and recording video. Together, these tools are used to process videos of mice performing grooming behaviors in a variety of behavioral setups.

They then compare multiple tools for analyzing grooming behavior sequences in both wild-type and genetic knockout mice with a tendency to over-groom. The JAABA-trained classifier outperforms commercially available behavior-analysis software and more closely aligns with manual scoring of behavior by expert observers. This offers a novel, cost-effective, and easy-to-use method for assessing grooming behavior in mice that is comparable to an expert observer, with the added advantage of being automated. Instructions for training your own JAABA classifier can be found in their paper!
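To give a flavor of what such a pipeline works with (JAABA itself is a MATLAB GUI, and motr has its own output format), here is a minimal Python sketch, with hypothetical file and column names, of deriving per-frame movement features from tracked trajectories, the kind of input a grooming classifier is trained on:

```python
# Illustrative sketch only: JAABA is a MATLAB GUI and motr has its own output
# format. This shows the general idea of computing per-frame movement features
# from tracked trajectories. File name and column names are hypothetical.
import numpy as np
import pandas as pd

# Hypothetical motr-style trajectory export: one row per frame,
# with centroid position (x, y) and body-ellipse orientation (theta).
traj = pd.read_csv("mouse_trajectory.csv")  # columns: x, y, theta

# Per-frame features: speed and angular velocity.
dx = np.diff(traj["x"], prepend=traj["x"].iloc[0])
dy = np.diff(traj["y"], prepend=traj["y"].iloc[0])
speed = np.hypot(dx, dy)
ang_vel = np.abs(np.diff(np.unwrap(traj["theta"]), prepend=traj["theta"].iloc[0]))

features = pd.DataFrame({"speed": speed, "ang_vel": ang_vel})
# A boosting-based classifier (the family JAABA uses) would map windows of
# such features to per-frame grooming / not-grooming labels.
```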

Read more in their publication here!


Stytra

Vilim Štih has shared Stytra, a new project from the Portugues lab, which was recently published in PLOS Computational Biology (Štih, Petrucco et al., 2019):


“Stytra is a flexible open-source software package written in Python and designed to cover all the general requirements involved in larval zebrafish behavioral experiments. It provides timed stimulus presentation, interfacing with external devices and simultaneous real-time tracking of behavioral parameters such as position, orientation, tail and eye motion in both freely-swimming and head-restrained preparations. Stytra logs all recorded quantities, metadata, and code version in standardized formats to allow full provenance tracking, from data acquisition through analysis to publication. The package is modular and expandable for different experimental protocols and setups. Current releases can be found at https://github.com/portugueslab/stytra. We also provide complete documentation with examples for extending the package to new stimuli and hardware, as well as a schema and parts list for behavioral setups. We showcase Stytra by reproducing previously published behavioral protocols in both head-restrained and freely-swimming larvae. We also demonstrate the use of the software in the context of a calcium imaging experiment, where it interfaces with other acquisition devices. Our aims are to enable more laboratories to easily implement behavioral experiments, as well as to provide a platform for sharing stimulus protocols that permits easy reproduction of experiments and straightforward validation. Finally, we demonstrate how Stytra can serve as a platform to design behavioral experiments involving tracking or visual stimulation with other animals and provide an example integration with the DeepLabCut neural network-based tracking method.”
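To give a sense of the workflow, here is a minimal protocol sketch following the pattern shown in Stytra's documentation (treat the class and argument names as assumptions, since they may differ across releases):

```python
# A minimal Stytra protocol, following the pattern in the Stytra docs.
# Class and argument names may vary between releases; treat this as a sketch.
from stytra import Stytra, Protocol
from stytra.stimulation.stimuli.visual import Pause, FullFieldVisualStimulus

class FlashProtocol(Protocol):
    name = "flash_protocol"  # protocol name recorded in logs/metadata

    def get_stim_sequence(self):
        # 4 s of darkness followed by a 1 s full-field white flash
        return [
            Pause(duration=4.0),
            FullFieldVisualStimulus(duration=1.0, color=(255, 255, 255)),
        ]

if __name__ == "__main__":
    # Launches the Stytra GUI with this protocol loaded
    Stytra(protocol=FlashProtocol())
```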

Check out the enhanced version of the paper with full documentation at www.portugueslab.com/stytra, or the PDF at PLOS Computational Biology.


CAVE

In a recent article, Jennifer Tegtmeier and colleagues have shared CAVE: an open-source tool in MATLAB for combined analysis of head-mounted calcium imaging and behavior.


Calcium imaging is spreading through the neuroscience field like melted butter on hot toast. Like other imaging techniques, calcium imaging yields large and complex datasets. CAVE (Calcium ActiVity Explorer) aims to analyze imaging data from head-mounted microscopes simultaneously with behavioral data. Tegtmeier et al. developed this software in MATLAB with a bundle of unique algorithms to specifically analyze single-photon imaging data, which can then be correlated with behavioral data. A streamlined workflow is available for novice users, with more advanced options for experienced users. The code is available for download from GitHub.
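CAVE itself is a MATLAB GUI, but the core idea of relating calcium activity to behavior is easy to sketch. The Python snippet below, with hypothetical file names and an assumed frame rate, computes an event-triggered average of each cell's fluorescence trace around behavior onsets; it illustrates the concept only and is not CAVE's API:

```python
# Concept illustration only (CAVE is a MATLAB GUI, this is not its API):
# average each cell's fluorescence trace around behavior-event onsets.
# File names and the frame rate are hypothetical.
import numpy as np

fs = 20.0                                     # imaging frame rate (Hz), assumed
traces = np.load("dff_traces.npy")            # (n_cells, n_frames), hypothetical
event_times = np.load("behavior_onsets.npy")  # onsets in seconds, hypothetical

pre, post = int(2 * fs), int(4 * fs)          # 2 s before, 4 s after each onset
snips = []
for t in event_times:
    i = int(t * fs)
    if i - pre >= 0 and i + post < traces.shape[1]:
        snips.append(traces[:, i - pre : i + post])

# Event-triggered average per cell: shape (n_cells, pre + post)
eta = np.mean(snips, axis=0)
```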

Read more from Frontiers in Neuroscience, or check it out directly from GitHub.


idtracker.ai

February 20, 2019

Francisco Romero Ferrero and colleagues have developed idtracker.ai, an algorithm and software for tracking individuals in large collectives of unmarked animals, recently described in Nature Methods.


Tracking individual animals in large collective groups can give interesting insights into behavior, but has proven to be a challenge for analysis. With advances in artificial intelligence and tracking software, it has become increasingly easy to collect such information from video data. Romero Ferrero et al. have developed an algorithm and tracking software built around two deep networks: the first identifies individual animals, and the second detects when animals touch or cross paths in front of one another. The software has been validated to track individuals with high accuracy in groups of up to 100 animals across diverse species, from rodents to zebrafish to ants. The software is free, fully documented, and available online with additional Jupyter notebooks for data analysis.
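By way of illustration, recent idtracker.ai versions save their output as a NumPy file containing a dictionary whose trajectories entry holds per-frame, per-animal centroid coordinates. Key and file names may vary by version, so treat this sketch as an assumption to check against your own output:

```python
# Loading idtracker.ai output: a .npy file holding a dictionary whose
# "trajectories" entry is a (frames x animals x 2) coordinate array.
# Exact key and file names may differ across versions; verify locally.
import numpy as np

data = np.load("trajectories_wo_gaps.npy", allow_pickle=True).item()
traj = data["trajectories"]               # shape: (n_frames, n_animals, 2)

# Per-animal speed in pixels/frame; NaNs mark frames without a resolved identity
disp = np.diff(traj, axis=0)              # frame-to-frame displacement
speed = np.linalg.norm(disp, axis=2)      # (n_frames - 1, n_animals)
mean_speed = np.nanmean(speed, axis=0)
print(mean_speed)
```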

Check out their website with full documentation, the recent Nature Methods article, the bioRxiv preprint, and a great video of idtracker.ai tracking 100 zebrafish!


Open-source platform for worm behavior

February 13, 2019

In Nature Methods, Avelino Javer and colleagues developed and shared an open-source platform for analyzing and sharing worm behavioral data.


Collecting behavioral data is important, and analyzing these data is just as crucial. Sharing data matters, too: it furthers our understanding of behavior and increases the replicability of worm behavioral studies by allowing many scientists to re-analyze available data and to develop new methods of analysis. Javer and colleagues developed an open resource to streamline the steps in this process, from storing and accessing video files to software for reading and analyzing the data. The platform features: an open-access repository for storing, accessing, and filtering data; an interchange format for annotating single- or multi-worm behavior; and Python software for feature extraction, review, and analysis. Together, these tools serve as an accessible suite for quantitative behavior analysis that can be used by experimentalists and computational scientists alike.
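The interchange format (WCON) is JSON-based, so the data are easy to inspect from Python. A minimal sketch, assuming the basic single-worm layout from the WCON specification and a hypothetical file name:

```python
# Minimal WCON inspection sketch. Assumes the basic layout from the WCON
# spec: a "units" block plus a "data" list of tracked records. Real files
# may nest skeleton points per timepoint; the file name is hypothetical.
import json

with open("example_worm.wcon") as f:
    wcon = json.load(f)

print("units:", wcon["units"])            # e.g. {"t": "s", "x": "mm", "y": "mm"}
for record in wcon["data"]:
    t = record["t"]                       # timestamps
    x, y = record["x"], record["y"]       # tracked coordinates
    print(f"worm {record['id']}: {len(t)} timepoints")
```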


Read more about this platform in Nature Methods! (The preprint is also available from bioRxiv!)


Head-Mounted Camera System

February 6, 2019

Arne Meyer and colleagues recently shared their design and implementation of a head-mounted camera system for capturing detailed behavior in freely moving mice.


Video monitoring of animals can give great insight into behavior. Most video monitoring systems that collect precise behavioral data require fixed-position cameras and stationary animals, which can limit observation of natural behaviors. To address this, Meyer et al. developed a system that combines a lightweight head-mounted camera with head-movement sensors to detect behaviors in mice. The system, built from commercially available and 3D-printed parts, can be used to monitor a variety of subtle behaviors, including eye position, whisking, and ear movements, in unrestrained animals. Furthermore, the device can be mounted in combination with neural implants for recording brain activity.
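The authors' acquisition code is their own, but one recurring task with any such device is aligning the sensor and video streams. Here is a generic sketch (not the authors' method, with made-up arrays) of matching each video frame to the nearest head-movement sample by timestamp:

```python
# Generic synchronization sketch, not the authors' code: match each video
# frame to the nearest motion-sensor sample by timestamp. All data are
# stand-ins for real camera and sensor streams.
import numpy as np

frame_times = np.arange(0, 10, 1 / 30)   # 30 Hz camera timestamps (s), assumed
imu_times = np.arange(0, 10, 1 / 200)    # 200 Hz motion-sensor timestamps, assumed
imu_pitch = np.sin(imu_times)            # stand-in for a real head-pitch trace

# For each frame, find the index of the nearest sensor sample
idx = np.searchsorted(imu_times, frame_times)
idx = np.clip(idx, 1, len(imu_times) - 1)
nearer = np.abs(imu_times[idx - 1] - frame_times) < np.abs(imu_times[idx] - frame_times)
idx[nearer] -= 1

pitch_per_frame = imu_pitch[idx]         # one head-pitch value per video frame
```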

Read more here!


Live Mouse Tracker

December 5, 2018

In a recent preprint, Fabrice de Chaumont and colleagues share Live Mouse Tracker, a real-time behavioral analysis system for groups of mice.


Monitoring the social interactions of mice is important for understanding preclinical models of various psychiatric disorders; however, gathering data on social behaviors can be time-consuming and is often limited to a few subjects at a time. With advances in computer vision, machine learning, and individual-identification methods, gathering social behavior data from many mice is now easier. de Chaumont and colleagues have developed Live Mouse Tracker, which tracks the behavior of up to four mice at a time using RFID sensors. An infrared depth (RGBD) camera allows tracking of animal shape and posture. The system automatically labels behaviors at the individual, dyadic, and group levels, and can be used to assess complex social behavioral differences between mice.
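As a toy illustration of the kind of low-level event such systems build on (this is not the Live Mouse Tracker API), one can flag dyadic contacts as frames where two tracked mice come within a distance threshold:

```python
# Illustrative only, not the Live Mouse Tracker API: flag dyadic "contact"
# events as frames where two tracked mice come within a distance threshold.
# The position file and threshold are hypothetical.
import numpy as np
from itertools import combinations

pos = np.load("positions.npy")        # hypothetical: (n_frames, 4 mice, 2) coords
threshold = 50.0                      # contact distance in pixels, assumed

for a, b in combinations(range(pos.shape[1]), 2):
    dist = np.linalg.norm(pos[:, a] - pos[:, b], axis=1)
    contact_frames = np.flatnonzero(dist < threshold)
    print(f"mice {a}-{b}: {contact_frames.size} contact frames")
```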

Learn more on bioRxiv, or check out the Live Mouse Tracker website!


KineMouse Wheel

October 10, 2018

On Hackaday, Richard Warren of the Sawtell Lab at Columbia University has shared his design for the KineMouse Wheel, a lightweight running wheel for head-fixed locomotion that allows 3D positioning of mice with a single camera.


Locomotor behavior is a common behavioral readout in neuroscience research, and running wheels are a great tool for assessing motor function in head-fixed mice. The KineMouse Wheel takes this tool a step further. Constructed from lightweight, transparent polycarbonate with an angled mirror mounted inside, this innovative device lets a single camera capture two views of locomotion simultaneously. When combined with DeepLabCut, deep-learning-based tracking software, the locomotion of head-fixed mice can be captured in three dimensions, allowing a more complete assessment of motor behavior. The wheel can also be customized to fit the needs of a lab by using different build materials. More details about the KineMouse Wheel, including a full parts list and build instructions, are available at hackaday.io.
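The geometry is the key trick: the direct view gives the horizontal and vertical position of a body part, while the mirrored view adds the depth axis. A deliberately simplified Python sketch (ignoring camera calibration, with hypothetical arrays standing in for DeepLabCut output):

```python
# Deliberately simplified: the direct view gives (x, y) and the mirrored view
# gives (x, z) for the same body part, so stacking them yields 3D coordinates.
# A real setup needs calibration; these files stand in for DeepLabCut output.
import numpy as np

direct_xy = np.load("paw_direct.npy")     # hypothetical: (n_frames, 2) -> x, y
mirror_xz = np.load("paw_mirror.npy")     # hypothetical: (n_frames, 2) -> x, z

paw_3d = np.column_stack([
    (direct_xy[:, 0] + mirror_xz[:, 0]) / 2,  # x appears in both views; average
    direct_xy[:, 1],                          # y from the direct view
    mirror_xz[:, 1],                          # z from the mirrored view
])
print(paw_3d.shape)                           # (n_frames, 3)
```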

Read more about the KineMouse Wheel on Hackaday, and check out other awesome open-source tools on the OpenBehavior Hackaday list!



OpenBehavior Feedback Survey

We are looking for your feedback to understand how we can better serve the community! We’re also interested to know if/how you’ve implemented some of the open-source tools from our site in your own research.

We would greatly appreciate it if you could fill out a short survey (~5 minutes to complete) about your experiences with OpenBehavior.

https://american.co1.qualtrics.com/jfe/form/SV_0BqSEKvXWtMagqp

Thanks!

EthoWatcher: a tool for behavioral and video-tracking analysis in laboratory animals

September 26, 2018

In Computers in Biology and Medicine, Carlos Fernando Crispim Junior and colleagues share their software EthoWatcher: a computational tool that supports video tracking, detailed ethography, and the extraction of kinematic variables from video files of laboratory animals.


The freely available EthoWatcher software has two modules: a tracking module and an ethography module. The tracking module permits controlled separation of the target from its background and extraction of the image attributes used to calculate distance traveled, orientation, length, area, and a path graph of the target. The ethography module allows recording of catalog-based behaviors from video files, from the environment, or frame by frame. The output reports the latency, frequency, and duration of each behavior, as well as the sequence of events in a time-segmented format fixed by the user. EthoWatcher was validated by conducting tests on the detection of known behavioral effects of drugs and on kinematic measurements.
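EthoWatcher is a standalone application, but the summary statistics its ethography module reports are straightforward to illustrate. A minimal Python sketch, with a hypothetical scoring log, of computing latency, frequency, and total duration per behavior:

```python
# Concept sketch only (EthoWatcher is a standalone GUI): compute latency,
# frequency, and total duration per behavior from a scored event log.
# The event list is hypothetical.
events = [  # (behavior, start_s, end_s)
    ("grooming", 12.0, 15.5),
    ("rearing", 20.0, 21.2),
    ("grooming", 30.0, 31.0),
]

summary = {}
for behavior, start, end in events:
    s = summary.setdefault(behavior, {"latency": start, "frequency": 0, "duration": 0.0})
    s["latency"] = min(s["latency"], start)   # time of first occurrence
    s["frequency"] += 1                       # number of bouts
    s["duration"] += end - start              # total time spent in the behavior

for behavior, s in summary.items():
    print(behavior, s)
```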

Read more in their paper or download the software from the EthoWatcher webpage!


Crispim Junior, C. F., Pederiva, C. N., Bose, R. C., Garcia, V. A., Lino-de-Oliveira, C., & Marino-Neto, J. (2012). ETHOWATCHER: Validation of a tool for behavioral and video-tracking analysis in laboratory animals. Computers in Biology and Medicine, 42(2), 257–264. doi:10.1016/j.compbiomed.2011.12.002