In the Journal of Neuroscience Methods, Bastijn van den Boom and colleagues have shared their ‘how-to’ instructions for implementing behavioral classification with JAABA, featuring Bonsai and motr!
In honor of our 100th post on OpenBehavior, we wanted to feature a project that exemplifies how multiple open-source projects can be implemented to address a common theme in behavioral neuroscience: tracking and classifying complex behaviors! The protocol from van den Boom et al. implements JAABA, an open-source machine learning-based behavior detection system; motr, open-source mouse trajectory tracking software; and Bonsai, an open-source system capable of streaming and recording video. Together, they use these tools to process videos of mice performing grooming behaviors in a variety of behavioral setups.
They then compare multiple tools for analyzing grooming behavior sequences in both wild-type mice and genetic knockout mice with a tendency to over-groom. The JAABA-trained classifier outperforms commercially available behavior analysis software and more closely aligns with manual scoring by expert observers. This offers a novel, cost-effective, and easy-to-use method for assessing grooming behavior in mice that is comparable to an expert observer, with the added advantage of being automatic. Instructions for training your own JAABA classifier can be found in their paper!
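JAABA itself runs in MATLAB, so the sketch below is not the authors' code; it is a hypothetical Python illustration of the kind of post-processing such a pipeline involves, turning per-frame classifier scores into discrete grooming bouts (the threshold and minimum bout length here are assumptions, not values from the paper):

```python
def scores_to_bouts(scores, threshold=0.5, min_frames=3):
    """Convert per-frame classifier scores into (start, end) behavior bouts.

    Frames scoring above `threshold` are candidate behavior frames;
    runs shorter than `min_frames` are discarded as noise.
    Bout ends are exclusive frame indices.
    """
    bouts, start = [], None
    for i, s in enumerate(scores):
        if s > threshold and start is None:
            start = i                      # a candidate bout begins
        elif s <= threshold and start is not None:
            if i - start >= min_frames:
                bouts.append((start, i))   # bout long enough: keep it
            start = None
    if start is not None and len(scores) - start >= min_frames:
        bouts.append((start, len(scores)))  # bout running at end of video
    return bouts
```

A smoothing step like this is a common final stage after any frame-wise classifier, whatever software produced the scores.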
Read more in their publication here!
van den Boom, B. J., Pavlidi, P., Wolf, C. J., Mooij, A. H., & Willuhn, I. (2017). Automated classification of self-grooming in mice using open-source software. Journal of Neuroscience Methods, 289, 48-56. doi:10.1016/j.jneumeth.2017.05.026
Vilim Štih has shared a new project from the Portugues lab called Stytra, which was recently published in PLOS Computational Biology (Štih, Petrucco et al., 2019):
“Stytra is a flexible open-source software package written in Python and designed to cover all the general requirements involved in larval zebrafish behavioral experiments. It provides timed stimulus presentation, interfacing with external devices and simultaneous real-time tracking of behavioral parameters such as position, orientation, tail and eye motion in both freely-swimming and head-restrained preparations. Stytra logs all recorded quantities, metadata, and code version in standardized formats to allow full provenance tracking, from data acquisition through analysis to publication. The package is modular and expandable for different experimental protocols and setups. Current releases can be found at https://github.com/portugueslab/stytra. We also provide complete documentation with examples for extending the package to new stimuli and hardware, as well as a schema and parts list for behavioral setups. We showcase Stytra by reproducing previously published behavioral protocols in both head-restrained and freely-swimming larvae. We also demonstrate the use of the software in the context of a calcium imaging experiment, where it interfaces with other acquisition devices. Our aims are to enable more laboratories to easily implement behavioral experiments, as well as to provide a platform for sharing stimulus protocols that permits easy reproduction of experiments and straightforward validation. Finally, we demonstrate how Stytra can serve as a platform to design behavioral experiments involving tracking or visual stimulation with other animals and provide an example integration with the DeepLabCut neural network-based tracking method.”
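Stytra's actual API is documented at the links below; as a rough, language-level illustration of the core idea described above (a timed stimulus sequence whose presentation is logged in a standardized format), here is a minimal Python sketch using hypothetical `Stimulus` and `run_protocol` names rather than Stytra's real classes:

```python
import json

class Stimulus:
    """Hypothetical stand-in for a stimulus: a name, a duration in
    seconds, and arbitrary display parameters."""
    def __init__(self, name, duration, **params):
        self.name, self.duration, self.params = name, duration, params

def run_protocol(stimuli, present=lambda s: None):
    """Present each stimulus in order and return a JSON log of what was
    shown and when, so the experiment is fully reconstructable later."""
    log, t = [], 0.0
    for stim in stimuli:
        present(stim)                      # draw to screen in a real setup
        log.append({"stimulus": stim.name, "t_start": t,
                    "duration": stim.duration, "params": stim.params})
        t += stim.duration
    return json.dumps(log)

# A pause followed by a one-second full-field flash.
protocol = [Stimulus("pause", 4.0),
            Stimulus("full_field_flash", 1.0, color=(255, 255, 255))]
```

In Stytra itself, protocols are classes that return a stimulus sequence, and logging of data, metadata, and code version happens automatically.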
Check out the enhanced version of the paper, with full documentation, at www.portugueslab.com/stytra, or the PDF at PLOS Computational Biology.
Štih, V., Petrucco, L., Kist, A. M., & Portugues, R. (2019). Stytra: An open-source, integrated system for stimulation, tracking and closed-loop behavioral experiments. PLOS Computational Biology, 15(4), e1006699. doi:10.1371/journal.pcbi.1006699
In a recent article, Jennifer Tegtmeier and colleagues have shared CAVE: an open-source tool in MATLAB for combined analysis of head-mounted calcium imaging and behavior.
Calcium imaging is spreading through the neuroscience field like melted butter on hot toast. Like other imaging techniques, the data collected with calcium imaging are large and complex. CAVE (Calcium ActiVity Explorer) aims to analyze imaging data from head-mounted microscopes simultaneously with behavioral data. Tegtmeier et al. developed this software in MATLAB with a bundle of unique algorithms specifically for analyzing single-photon imaging data, which can then be correlated to behavioral data. A streamlined workflow is available for novice users, with more advanced options for experienced users. The code is available for download from GitHub.
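CAVE itself is written in MATLAB; purely as a hypothetical Python sketch of the kind of imaging-to-behavior correlation it supports, the function below averages a fluorescence trace in windows around behavioral event times:

```python
def event_triggered_average(trace, event_frames, pre=2, post=3):
    """Average a fluorescence trace in windows around behavioral events.

    Returns the mean trace from `pre` frames before to `post` frames
    after each event; events too close to the recording edges are
    skipped so every window has the same length.
    """
    windows = [trace[e - pre:e + post + 1]
               for e in event_frames if e - pre >= 0 and e + post < len(trace)]
    n = len(windows)
    return [sum(w[i] for w in windows) / n for i in range(pre + post + 1)]
```

Event-triggered averaging like this is one of the simplest ways to ask whether neural activity is locked to a behavior, whatever tool performs it.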
Read more from Frontiers in Neuroscience, or check it out directly from GitHub.
Tegtmeier, J., Brosch, M., Janitzky, K., Heinze, H., Ohl, F. W., & Lippert, M. T. (2018). CAVE: An Open-Source Tool for Combined Analysis of Head-Mounted Calcium Imaging and Behavior in MATLAB. Frontiers in Neuroscience, 12. doi:10.3389/fnins.2018.00958
February 20, 2019
Francisco Romero Ferrero and colleagues have developed idtracker.ai, an algorithm and software for tracking individuals in large collectives of unmarked animals, recently described in Nature Methods.
Tracking individual animals in large collective groups can give interesting insights into behavior, but has proven to be a challenge for analysis. With advances in artificial intelligence and tracking software, it has become increasingly easier to collect such information from video data. Romero-Ferrero et al. have developed an algorithm and tracking software built around two deep networks: one detects when animals touch or cross paths in front of one another, and the other identifies each individual. The software has been validated to track individuals with high accuracy in cohorts of up to 100 animals across diverse species, from rodents to zebrafish to ants. The software is free, fully documented, and available online with additional Jupyter notebooks for data analysis.
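The deep networks themselves are beyond a short example, but the geometric notion of a crossing is simple. A minimal Python sketch, assuming trajectories are per-frame lists of (x, y) centroids (idtracker.ai exports a similar frames x individuals x 2 array) and an arbitrary contact threshold:

```python
from itertools import combinations

def find_crossings(trajectories, min_distance=10.0):
    """Flag frames where two animals come close enough to touch or cross.

    `trajectories` is a list of frames, each a list of per-animal (x, y)
    centroids. Returns (frame, animal_i, animal_j) tuples for every pair
    closer than `min_distance` (in the same units as the coordinates).
    """
    crossings = []
    for frame, positions in enumerate(trajectories):
        for i, j in combinations(range(len(positions)), 2):
            (xi, yi), (xj, yj) = positions[i], positions[j]
            if ((xi - xj) ** 2 + (yi - yj) ** 2) ** 0.5 < min_distance:
                crossings.append((frame, i, j))
    return crossings
```

These are exactly the ambiguous moments where identity can be lost, which is why idtracker.ai devotes a dedicated network to resolving them.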
Check out their website with full documentation, the recent Nature Methods article, bioRxiv preprint, and a great video of idtracker.ai tracking 100 zebrafish!
Romero-Ferrero, F., Bergomi, M. G., Hinz, R. C., Heras, F. J., & de Polavieja, G. G. (2019). idtracker.ai: Tracking all individuals in small or large collectives of unmarked animals. Nature Methods, 16(2), 179-182. doi:10.1038/s41592-018-0295-5
February 13, 2019
In Nature Methods, Avelino Javer and colleagues developed and shared an open-source platform for analyzing and sharing worm behavioral data.
Collecting behavioral data is important, and analyzing those data is just as crucial. Sharing the data also matters: it can further our understanding of behavior and increase the replicability of worm behavioral studies by allowing many scientists to re-analyze available data and to develop new methods of analysis. Javer and colleagues developed an open resource to streamline the steps involved in this process, from storing and accessing video files to software for reading and analyzing the data. The platform features an open-access repository for storing, accessing, and filtering data; an interchange format for annotating single- or multi-worm behavior; and open-source Python software for feature extraction, review, and analysis. Together, these tools serve as an accessible suite for quantitative behavior analysis that can be used by experimentalists and computational scientists alike.
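The platform's interchange format, WCON, is built on JSON, so records can be produced and consumed with the standard library alone. A minimal sketch of a WCON-style single-worm record follows; the field names are simplified from the full specification:

```python
import json

def make_worm_record(worm_id, times, xs, ys):
    """Build a minimal WCON-style record: per-worm timestamps and x/y
    coordinates, with measurement units declared up front so any reader
    can interpret the numbers (fields simplified from the real format)."""
    return {
        "units": {"t": "s", "x": "mm", "y": "mm"},
        "data": [{"id": worm_id, "t": times, "x": xs, "y": ys}],
    }

record = make_worm_record("worm_1", [0.0, 0.04], [1.2, 1.3], [0.5, 0.6])
serialized = json.dumps(record)   # ready to archive or share
```

Declaring units alongside the data is what makes such files self-describing, which is the point of an interchange format.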
Read more about this platform from Nature Methods! (the preprint is also available from bioRxiv!)
Javer, A., Currie, M., Lee, C. W., Hokanson, J., Li, K., Martineau, C. N., . . . Brown, A. E. (2018). An open-source platform for analyzing and sharing worm-behavior data. Nature Methods, 15, 645-646. doi:10.1038/s41592-018-0112-1
February 6, 2019
Arne Meyer and colleagues recently shared their design and implementation of a head-mounted camera system for capturing detailed behavior in freely moving mice.
Video monitoring of animals can give great insight into behavior. Most video monitoring systems that collect precise behavioral data require fixed-position cameras and stationary animals, which can limit observation of natural behaviors. To address this, Meyer et al. developed a system that combines a lightweight head-mounted camera with head-movement sensors to detect behaviors in mice. The system, built from commercially available and 3D-printed parts, can be used to monitor a variety of subtle behaviors, including eye position, whisking, and ear movements, in unrestrained animals. Furthermore, the device can be mounted in combination with neural implants for recording brain activity.
Read more here!
Meyer, A. F., Poort, J., O’Keefe, J., Sahani, M., & Linden, J. F. (2018). A Head-Mounted Camera System Integrates Detailed Behavioral Monitoring with Multichannel Electrophysiology in Freely Moving Mice. Neuron, 100(1). doi:10.1016/j.neuron.2018.09.020
December 5, 2018
In a recent preprint, Fabrice de Chaumont and colleagues share Live Mouse Tracker, a real-time behavioral analysis system for groups of mice.
Monitoring the social interactions of mice is important for understanding preclinical models of various psychiatric disorders; however, gathering data on social behaviors can be time-consuming and is often limited to a few subjects at a time. With advances in computer vision, machine learning, and individual identification methods, gathering social behavior data from many mice is now easier. de Chaumont and colleagues have developed Live Mouse Tracker, which tracks the behavior of up to 4 mice at a time using RFID sensors, while an infrared/depth RGBD camera allows tracking of animal shape and posture. The system automatically labels behaviors at the individual, dyadic, and group levels. Live Mouse Tracker can be used to assess complex social behavioral differences between mice.
Learn more on bioRxiv, or check out the Live Mouse Tracker website!
October 10, 2018
On Hackaday, Richard Warren of the Sawtell Lab at Columbia University has shared his design for KineMouse Wheel, a lightweight running wheel for head-fixed locomotion that allows for 3D positioning of mice with a single camera.
Locomotor behavior is a common behavioral readout in neuroscience research, and running wheels are a great tool for assessing motor function in head-fixed mice. KineMouse Wheel takes this tool a step further. Constructed from lightweight, transparent polycarbonate with an angled mirror mounted inside, this innovative device allows a single camera to capture two views of locomotion simultaneously. When combined with DeepLabCut, deep learning-based tracking software, the locomotion of head-fixed mice can be captured in three dimensions, allowing a more complete assessment of motor behavior. The wheel can also be customized to fit the needs of a lab by using different materials for the build. More details about the KineMouse Wheel are available at hackaday.io, along with a full list of parts and build instructions.
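The exact 3D reconstruction depends on the mirror angle and camera calibration; assuming an idealized 45-degree mirror so the camera sees a direct side view (x, y) and a mirrored bottom view (x, z), a minimal Python sketch of merging two 2D tracks (e.g. from DeepLabCut) into 3D might look like this:

```python
def combine_views(direct_xy, mirror_xz):
    """Merge two 2D views of the same tracked point into 3D positions.

    `direct_xy` holds per-frame (x, y) points from the side view;
    `mirror_xz` holds per-frame (x, z) points from the mirrored view.
    The x coordinate is visible in both views, so we average it;
    y comes from the direct view and z from the mirror.
    """
    points_3d = []
    for (x_side, y), (x_bottom, z) in zip(direct_xy, mirror_xz):
        x = (x_side + x_bottom) / 2.0   # x is redundant across views
        points_3d.append((x, y, z))
    return points_3d
```

A real setup would first map both views into a common coordinate frame via calibration; the averaging step here simply illustrates how the redundant axis ties the two views together.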
Read more about KineMouse Wheel on Hackaday,
and check out other awesome open-source tools on the OpenBehavior Hackaday list!
We are looking for your feedback to understand how we can better serve the community! We’re also interested to know if/how you’ve implemented some of the open-source tools from our site in your own research.
We would greatly appreciate it if you could fill out a short survey (~5 minutes to complete) about your experiences with OpenBehavior.
September 26, 2018
In Computers in Biology and Medicine, Carlos Fernando Crispim Junior and colleagues share their software EthoWatcher: a computational tool that supports video tracking, detailed ethography, and the extraction of kinematic variables from video files of laboratory animals.
The freely available EthoWatcher software has two modules: a tracking module and an ethography module. The tracking module permits controlled separation of the target from its background and the extraction of image attributes used to calculate distance traveled, orientation, length, area, and a path graph of the target. The ethography module allows catalog-based recording of behaviors from video files, from direct observation of the environment, or frame-by-frame. The output reports the latency, frequency, and duration of each behavior, as well as the sequence of events in time segments of a length fixed by the user. EthoWatcher was validated by tests on the detection of known behavioral effects of drugs and on kinematic measurements.
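EthoWatcher is a GUI application, so the sketch below is not the authors' code; it is only a Python illustration of how per-frame behavior labels reduce to the latency, frequency, and duration measures that appear in its reports:

```python
def ethogram_report(frame_labels, frame_rate=30.0):
    """Summarize per-frame behavior labels into per-behavior measures:
    latency (seconds to first onset), frequency (number of bouts),
    and total duration (seconds)."""
    report = {}
    previous = None
    for i, label in enumerate(frame_labels):
        if label not in report:
            report[label] = {"latency": i / frame_rate, "frequency": 0,
                             "duration": 0.0}
        if label != previous:
            report[label]["frequency"] += 1   # a new bout of this behavior
        report[label]["duration"] += 1.0 / frame_rate
        previous = label
    return report
```

For example, for the label sequence `["rest", "groom", "groom", "rest"]` at 1 frame per second, grooming has a latency of 1 s, one bout, and 2 s total duration.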
Read more in their paper or download the software from the EthoWatcher webpage!
Crispim Junior, C. F., Pederiva, C. N., Bose, R. C., Garcia, V. A., Lino-de-Oliveira, C., & Marino-Neto, J. (2012). ETHOWATCHER: Validation of a tool for behavioral and video-tracking analysis in laboratory animals. Computers in Biology and Medicine, 42(2), 257-264. doi:10.1016/j.compbiomed.2011.12.002