March 21, 2019
Victor Wumbor-Apin Kumbol and colleagues have developed and shared Actifield, an automated open-source actimeter for rodents, in a recent HardwareX publication.
Measuring locomotor activity can be a useful readout for understanding the effects of a range of experimental manipulations in neuroscience research. Commercially available locomotor activity recording devices can be cost-prohibitive and often lack the ability to be customized to a specific lab's needs. Kumbol et al. offer an open-source alternative that uses infrared motion detection and an Arduino to record activity in a variety of chamber setups. A full list of build materials, links to 3D-print and laser-cut files, and assembly instructions are available in their publication.
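Actifield's firmware runs on an Arduino, but the basic idea of scoring activity from motion events is easy to sketch. Below is a minimal, hypothetical Python post-processing example (not the authors' code) that bins timestamped infrared motion events into fixed-width activity counts:

```python
# Hypothetical sketch: bin timestamped motion-sensor events (seconds from
# session start) into per-bin activity counts. Not the Actifield firmware.
def bin_activity(event_times_s, bin_s=60, session_s=600):
    """Count motion events in fixed time bins across a session."""
    n_bins = session_s // bin_s
    counts = [0] * n_bins
    for t in event_times_s:
        if 0 <= t < session_s:  # ignore events outside the session window
            counts[int(t // bin_s)] += 1
    return counts

events = [5.2, 12.8, 61.0, 65.4, 300.1]  # mock beam-break timestamps
print(bin_activity(events, bin_s=60, session_s=300))  # [2, 2, 0, 0, 0]
```

A real pipeline would read the event log the device writes out, but the binning logic is the same.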
Read more from HardwareX!
In a recent article, Jennifer Tegtmeier and colleagues have shared CAVE: an open-source tool in MATLAB for combined analysis of head-mounted calcium imaging and behavior.
Calcium imaging is spreading through the neuroscience field like melted butter on hot toast. Like other imaging techniques, calcium imaging produces large, complex datasets. CAVE (Calcium ActiVity Explorer) aims to analyze imaging data from head-mounted microscopes simultaneously with behavioral data. Tegtmeier et al. developed this software in MATLAB with a bundle of unique algorithms specifically for analyzing single-photon imaging data, which can then be correlated with behavioral data. A streamlined workflow is available for novice users, with more advanced options for experienced users. The code is available for download from GitHub.
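CAVE itself is written in MATLAB; purely as a language-agnostic illustration of relating imaging to behavior (not part of CAVE), the sketch below computes an event-triggered average of a calcium trace around behavioral event frames:

```python
# Illustrative sketch, not CAVE code: average a fluorescence trace in
# windows around behavioral event indices (an event-triggered average).
def event_triggered_average(trace, event_idx, pre=2, post=3):
    """Mean trace in a [i - pre, i + post) window around each event index i."""
    windows = []
    for i in event_idx:
        if i - pre >= 0 and i + post <= len(trace):  # skip edge events
            windows.append(trace[i - pre:i + post])
    n = len(windows)
    return [sum(w[j] for w in windows) / n for j in range(pre + post)]

trace = [0, 0, 1, 3, 1, 0, 0, 1, 3, 1, 0]  # mock calcium trace
events = [3, 8]                            # frames where a behavior occurred
print(event_triggered_average(trace, events))  # [0.0, 1.0, 3.0, 1.0, 0.0]
```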
Read more from Frontiers in Neuroscience, or check it out directly from GitHub.
Tegtmeier, J., Brosch, M., Janitzky, K., Heinze, H., Ohl, F. W., & Lippert, M. T. (2018). CAVE: An Open-Source Tool for Combined Analysis of Head-Mounted Calcium Imaging and Behavior in MATLAB. Frontiers in Neuroscience, 12. doi:10.3389/fnins.2018.00958
February 20, 2019
Francisco Romero Ferrero and colleagues have developed idtracker.ai, an algorithm and software for tracking individuals in large collectives of unmarked animals, recently described in Nature Methods.
Tracking individual animals in large collective groups can give interesting insights into behavior, but has proven to be a challenge for analysis. With advances in artificial intelligence and tracking software, it has become increasingly easy to collect such information from video data. Romero-Ferrero et al. have developed an algorithm and tracking software built around two deep networks: the first identifies individual animals, and the second detects when animals touch or cross paths in front of one another. The software has been validated to track individuals with high accuracy in collectives of up to 100 animals across diverse species, from rodents to zebrafish to ants. The software is free, fully documented, and available online with additional Jupyter notebooks for data analysis.
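In idtracker.ai a dedicated network resolves crossings; as a toy geometric illustration only (not the authors' method), candidate crossing frames can be flagged wherever two animals' bounding boxes overlap:

```python
# Toy illustration, not idtracker.ai's crossing-detection network:
# flag frames where two animals' bounding boxes intersect.
def boxes_overlap(a, b):
    """a, b: (x_min, y_min, x_max, y_max). True if the boxes intersect."""
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

def crossing_frames(boxes_a, boxes_b):
    """Frame indices where the two animals' boxes overlap."""
    return [f for f, (a, b) in enumerate(zip(boxes_a, boxes_b))
            if boxes_overlap(a, b)]

boxes_a = [(0, 0, 2, 2), (1, 1, 3, 3), (4, 4, 6, 6)]  # mock per-frame boxes
boxes_b = [(5, 5, 7, 7), (2, 2, 4, 4), (5, 5, 7, 7)]
print(crossing_frames(boxes_a, boxes_b))  # [1, 2]
```

Frames flagged this way are exactly where identity is ambiguous and a learned identifier has to take over.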
Check out their website with full documentation, the recent Nature Methods article, BioRXiv preprint, and a great video of idtracker.ai tracking 100 zebrafish!
Romero-Ferrero, F., Bergomi, M. G., Hinz, R. C., Heras, F. J., & de Polavieja, G. G. (2019). idtracker.ai: Tracking all individuals in small or large collectives of unmarked animals. Nature Methods, 16(2), 179-182. doi:10.1038/s41592-018-0295-5
February 13, 2019
In Nature Methods, Avelino Javer and colleagues developed and shared an open-source platform for analyzing and sharing worm behavioral data.
Collecting behavioral data is important, and analyzing that data is just as crucial. Sharing data matters too: it can further our understanding of behavior and increase the replicability of worm behavioral studies by allowing many scientists to re-analyze available data and to develop new methods of analysis. Javer and colleagues developed an open resource to streamline the steps in this process, from storing and accessing video files to software for reading and analyzing the data. The platform features an open-access repository for storing, accessing, and filtering data; an interchange format for annotating single- or multi-worm behavior; and Python file formats for feature extraction, review, and analysis. Together, these tools serve as an accessible suite for quantitative behavior analysis that can be used by experimentalists and computational scientists alike.
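As a hedged illustration of the kind of feature extraction such tooling supports (hypothetical code, not the platform's own), the sketch below computes centroid speed from per-frame worm skeleton coordinates:

```python
# Hypothetical sketch, not the platform's code: centroid speed of a worm
# from per-frame skeleton coordinates.
def centroid_speed(xs, ys, ts):
    """xs, ys: per-frame lists of skeleton point coordinates;
    ts: frame times in seconds. Returns speed between successive frames."""
    cx = [sum(x) / len(x) for x in xs]  # per-frame centroid x
    cy = [sum(y) / len(y) for y in ys]  # per-frame centroid y
    return [((cx[i + 1] - cx[i]) ** 2 + (cy[i + 1] - cy[i]) ** 2) ** 0.5
            / (ts[i + 1] - ts[i]) for i in range(len(ts) - 1)]

xs = [[0, 1, 2], [1, 2, 3], [3, 4, 5]]  # mock 3-point skeletons
ys = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
ts = [0.0, 0.5, 1.0]
print(centroid_speed(xs, ys, ts))  # [2.0, 4.0]
```

In practice the coordinates would come from the repository's interchange files rather than hand-typed lists.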
Read more about this platform from Nature Methods! (the preprint is also available from bioRxiv!)
Javer, A., Currie, M., Lee, C. W., Hokanson, J., Li, K., Martineau, C. N., . . . Brown, A. E. (2018). An open-source platform for analyzing and sharing worm-behavior data. Nature Methods, 15, 645-646. doi:10.1038/s41592-018-0112-1
February 6, 2019
Arne Meyer and colleagues recently shared their design and implementation of a head-mounted camera system for capturing detailed behavior in freely moving mice.
Video monitoring can give great insight into animal behavior. However, most video monitoring systems for collecting precise behavioral data require fixed-position cameras and stationary animals, which limits observation of natural behaviors. To address this, Meyer et al. developed a system that combines a lightweight head-mounted camera with head-movement sensors to detect behaviors in mice. The system, built from commercially available and 3D-printed parts, can be used to monitor a variety of subtle behaviors, including eye position, whisking, and ear movements, in unrestrained animals. Furthermore, the device can be mounted in combination with neural implants for recording brain activity.
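The published system fuses video with head-movement sensors; as a simplified, hypothetical example of what such sensors afford (a standard gravity-vector formula, not the authors' implementation), head pitch can be estimated from a static accelerometer reading:

```python
import math

# Hypothetical illustration, not the authors' code: estimate head pitch from
# a static accelerometer reading, where (ax, ay, az) is the measured gravity
# vector in units of g.
def head_pitch_deg(ax, ay, az):
    return math.degrees(math.atan2(-ax, math.hypot(ay, az)))

print(head_pitch_deg(0.0, 0.0, 1.0))   # level head -> 0.0
print(head_pitch_deg(-1.0, 0.0, 0.0))  # nose pitched up -> 90.0
```

This only works while the head is still; during movement the reading mixes gravity with acceleration, which is why systems like this one combine sensors with video.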
Read more here!
Meyer, A. F., Poort, J., O’Keefe, J., Sahani, M., & Linden, J. F. (2018). A Head-Mounted Camera System Integrates Detailed Behavioral Monitoring with Multichannel Electrophysiology in Freely Moving Mice. Neuron, 100(1). doi:10.1016/j.neuron.2018.09.020
January 16, 2019
In the Journal of Neurophysiology, Brice Williams and colleagues have shared their design for a novel dual-port lick detector. This device can be used for both real-time measurement and manipulation of licking behavior in head-fixed mice.
Measuring licking behavior in mice provides a valuable metric of sensory-motor processing and pairs nicely with simultaneous neural recordings. Williams and colleagues have developed their own device for precisely measuring licking behavior and for manipulating it in real time. To address the limitations of many available lick sensors, the authors designed their device to be small (appropriate for mice), contactless (to reduce electrical artifacts in neural recordings), and precise on a submillisecond timescale. The dual-port detector can measure directional licking behavior during sensory tasks and can be used in combination with neural recording. Further, given its submillisecond precision, the device can be used in a closed-loop system to perturb licking behaviors via neural inhibition. Overall, this dual-port lick detector is a cost-effective, replicable solution that can be used in a variety of applications.
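As a hedged sketch of how directional lick events might be handled downstream (hypothetical code, not the authors' firmware), the example below debounces timestamped events and sorts them by port:

```python
# Hypothetical sketch, not the authors' firmware: debounce timestamped lick
# events and sort them by port ("L" or "R").
def classify_licks(events, min_interval_ms=20):
    """events: list of (timestamp_ms, port). Returns lick times per port,
    discarding repeats within the debounce window for that port."""
    last = {}
    licks = {"L": [], "R": []}
    for t, port in sorted(events):
        if port in last and t - last[port] < min_interval_ms:
            continue  # ignore sensor chatter within the debounce window
        last[port] = t
        licks[port].append(t)
    return licks

events = [(10, "L"), (12, "L"), (40, "R"), (45, "L"), (48, "R")]
print(classify_licks(events))  # {'L': [10, 45], 'R': [40]}
```

A closed-loop use would act on each accepted event immediately instead of accumulating a list, but the per-port debouncing logic is the same.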
Learn how to build your own here!
And be sure to check out their GitHub.
January 9, 2019
Kevin Coffey has shared the following about DeepSqueak, a deep learning-based system for detection and analysis of ultrasonic vocalizations, which he developed with Russell Marx.
Rodents engage in social communication through a rich repertoire of ultrasonic vocalizations (USVs). Recording and analysis of USVs can be performed noninvasively in almost any rodent behavioral model to provide rich insights into emotional state and motor function. Despite strong evidence that USVs serve an array of communicative functions, technical and financial limitations have inhibited the widespread adoption of vocalization analysis. Manual USV analysis is slow and laborious, while existing automated analysis software is vulnerable to the broad-spectrum noise routinely encountered in the testing environment.
To promote accessible and accurate USV research, we present “DeepSqueak”, a fully graphical MATLAB package for high-throughput USV detection, classification, and analysis. DeepSqueak applies a state-of-the-art regional object detection neural network (Faster R-CNN) to detect USVs, dramatically reducing the false-positive rate and enabling reliable analysis under standard experimental conditions. DeepSqueak includes pre-trained detection networks for mouse USVs and for 50 kHz and 22 kHz rat USVs. After detection, USVs can be clustered by k-means models or classified by trainable neural networks.
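DeepSqueak's clustering runs in MATLAB on rich call features; purely as a minimal illustration of the k-means idea (a toy 1-D version, not DeepSqueak's implementation), the sketch below separates mock rat calls into 22 kHz and 50 kHz groups by principal frequency:

```python
# Toy 1-D k-means for illustration only; DeepSqueak's clustering is in
# MATLAB and operates on richer call features (shape, duration, frequency).
def kmeans_1d(values, k=2, iters=20):
    """Cluster scalar call features (e.g., principal frequency in kHz)."""
    centers = sorted(values)[:k]  # naive init: the k lowest values
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - centers[c]))
            groups[nearest].append(v)
        centers = [sum(g) / len(g) if g else centers[j]
                   for j, g in enumerate(groups)]
    return sorted(centers)

freqs = [21.5, 22.0, 22.3, 49.0, 50.5, 51.2]  # mock call frequencies (kHz)
print(kmeans_1d(freqs, k=2))  # two centers, near 22 kHz and 50 kHz
```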
Read more in their recent publication and check out DeepSqueak on GitHub!
December 19, 2018
In 2007, Adam Hoffman and colleagues shared their design for an Electronic Operant Testing Apparatus (ELOPTA) in Behavior Research Methods.
Operant behavior is commonly studied in behavioral neuroscience, so there is a need for devices that can train animals and collect data during operant procedures. Commercially available systems often require training to program and use, and can be expensive. Hoffman and colleagues developed a system that automatically controls operant procedures and records behavioral outputs. The system is intended to be easy to use: it is easily programmable, portable, and durable.
Read more here!
Hoffman, A. M., Song, J., & Tuttle, E. M. (2007). Behavior Research Methods, 39, 776. https://doi.org/10.3758/BF03192968
December 5, 2018
In a recent preprint, Fabrice de Chaumont and colleagues share Live Mouse Tracker, a real-time behavioral analysis system for groups of mice.
Monitoring social interactions of mice is important for understanding preclinical models of various psychiatric disorders; however, gathering data on social behaviors can be time-consuming and is often limited to a few subjects at a time. With advances in computer vision, machine learning, and individual identification methods, gathering social behavior data from many mice is now easier. de Chaumont and colleagues have developed Live Mouse Tracker, which tracks the behavior of up to 4 mice at a time using RFID sensors. An infrared depth (RGBD) camera allows tracking of animal shape and posture. The system automatically labels behaviors at the individual, dyadic, and group levels. Live Mouse Tracker can be used to assess complex social behavioral differences between mice.
Learn more on bioRxiv, or check out the Live Mouse Tracker website!
November 30, 2018
Nikolas Francis and Patrick Kanold of the University of Maryland share their design for Psibox, a platform for automated operant conditioning in the mouse home cage, in Frontiers in Neural Circuits.
The ability to collect behavioral data from large populations of subjects is advantageous for advancing behavioral neuroscience research. However, few cost-effective options are available for collecting large amounts of data, especially for operant behaviors. Francis and Kanold have developed and shared Psibox, an automated operant conditioning system. It incorporates three modules for central control, water delivery, and the home cage interface, all of which can be customized with different parts. The system was validated by training mice in a positive reinforcement auditory task and can be customized for other tasks as well. The full, low-cost system allows for quick training of groups of mice in an operant task with little day-to-day experimenter involvement.
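As a hypothetical sketch of the bookkeeping such a system performs (not the Psibox code), the example below tallies hits and water delivered over a block of trials in a toy positive reinforcement task:

```python
# Hypothetical sketch, not Psibox code: summarize a block of go-trials in a
# toy positive-reinforcement task, where each correct response earns water.
def run_session(responses, reward_ul=5):
    """responses: per-trial booleans (True = correct response during tone)."""
    hits = sum(responses)
    return {"trials": len(responses), "hits": hits,
            "hit_rate": hits / len(responses), "water_ul": hits * reward_ul}

print(run_session([True, False, True, True]))
# {'trials': 4, 'hits': 3, 'hit_rate': 0.75, 'water_ul': 15}
```

In a home-cage system this tally runs continuously, which is what lets mice train themselves with minimal experimenter involvement.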
Learn how to set up your own Psibox system here!
Francis, N. A., & Kanold, P. O. (2017). Automated operant conditioning in the mouse home cage. Frontiers in Neural Circuits.