April 17, 2019
In a recent Nature Protocols article, Edoardo Balzani and colleagues from Valter Tucci’s lab have developed and shared Phenopy, a Python-based open-source analytical platform for behavioral phenotyping.
Behavioral phenotyping of mice using classic methods can be a long process and is susceptible to high variability, leading to inconsistent results. To reduce variance and speed up the process of behavioral analysis, Balzani et al. developed Phenopy, open-source software for recording and analyzing behavioral data for phenotyping. The software allows recording of components of a behavioral task in combination with electrophysiology data. It is capable of performing online analysis as well as large-scale analysis of recorded data, all within a user-friendly interface. Further information is available in their Nature Protocols publication.*
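A core operation in this kind of combined behavior/electrophysiology analysis is aligning recorded event timestamps to trial starts. As a rough illustration (a hypothetical helper, not Phenopy’s actual API), here is a minimal Python sketch of peri-event alignment:

```python
def align_events(event_times, trial_starts, window=(-1.0, 2.0)):
    """Collect event times relative to each trial start within a window.

    Hypothetical helper illustrating peri-event alignment of behavioral
    or electrophysiology timestamps; not Phenopy's actual API.
    """
    aligned = []
    for t0 in trial_starts:
        rel = [t - t0 for t in event_times if window[0] <= t - t0 <= window[1]]
        aligned.append(rel)
    return aligned
```

Binned counts of the aligned times would then give a peri-event histogram for each trial.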
Check out the full article from Nature Protocols!
(*alternatively available on ResearchGate)
March 29, 2019
In a 2011 Journal of Neuroscience Methods article, Pishan Chang and colleagues shared their design for a novel open-source telemetry system for recording EEG in small animals.
EEG monitoring in freely behaving small animals is a useful technique for observing natural fluctuations in neural activity over time. Monitoring frequencies above 80 Hz continuously over a period of weeks can be a challenge. Chang et al. have shared their design for a system that combines an implantable telemetric sensor, radio-frequency transmission, and open-source data acquisition software to collect EEG data over a span of up to 8 weeks. Various modifications to the system have increased the longevity of the device and reduced transmission noise to provide continuous and reliable data. Schematics of the device, transmission system, and validation results in a population of epileptic rodents are available in their publication.
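Capturing activity above 80 Hz requires a sampling rate comfortably above the Nyquist limit, and a quick spectral check is a common way to verify that a band of interest survived transmission. As a toy illustration (not part of the published system; a real pipeline would use an FFT library), a single-frequency DFT power estimate in plain Python:

```python
import math

def dft_power(signal, fs, freq):
    """Naive single-bin DFT power at `freq` Hz for a signal sampled at
    `fs` Hz. Illustration only; real analyses would use an FFT library."""
    n = len(signal)
    k = freq * n / fs  # fractional frequency bin
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
    im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
    return (re * re + im * im) / n
```

For a clean 100 Hz test tone sampled at 500 Hz, the power at 100 Hz dominates neighboring bands, confirming the band is representable at that sampling rate.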
Learn more from the Journal of Neuroscience Methods!
March 21, 2019
Victor Wumbor-Apin Kumbol and colleagues have developed and shared Actifield, an automated open-source actimeter for rodents, in a recent HardwareX publication.
Measuring locomotor activity can be a useful readout for understanding the effects of a number of experimental manipulations in neuroscience research. Commercially available locomotor activity recording devices can be cost-prohibitive and often lack the ability to be customized to fit a specific lab’s needs. Kumbol et al. offer an open-source alternative that uses infrared motion detection and an Arduino to record activity in a variety of chamber setups. A full list of build materials, links to 3D-print and laser-cut files, and assembly instructions are available in their publication.
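The core of infrared-based activity counting is tallying beam interruptions while ignoring rapid re-triggers. The actual Actifield firmware runs on an Arduino; purely as an illustration of the counting logic, a Python sketch:

```python
def count_beam_breaks(samples, min_gap=2):
    """Count rising edges (beam interruptions) in a stream of IR sensor
    readings (1 = beam blocked, 0 = clear), ignoring re-triggers closer
    than `min_gap` samples (a simple software debounce).

    Illustrative only; the published device implements this on an Arduino.
    """
    count, last_edge, prev = 0, -min_gap, 0
    for i, s in enumerate(samples):
        if s and not prev and i - last_edge >= min_gap:
            count += 1
            last_edge = i
        prev = s
    return count
```

Summing such counts per time bin yields the activity traces an actimeter reports.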
Read more from HardwareX!
February 20, 2019
Francisco Romero Ferrero and colleagues have developed idtracker.ai, an algorithm and software for tracking individuals in large collectives of unmarked animals, recently described in Nature Methods.
Tracking individual animals in large collective groups can give interesting insights into behavior, but has proven to be a challenge for analysis. With advances in artificial intelligence and tracking software, it has become increasingly easy to collect such information from video data. Romero-Ferrero et al. have developed an algorithm and tracking software built around two deep networks: the first identifies individual animals, and the second detects when animals touch or cross paths in front of one another. The software has been validated to track individuals with high accuracy in groups of up to 100 animals across diverse species, from rodents to zebrafish to ants. The software is free, fully documented, and available online with additional Jupyter notebooks for data analysis.
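After a crossing is detected, identities must be re-assigned from the identification network’s per-animal confidence scores. idtracker.ai’s actual procedure is more involved; as a toy sketch of the idea, a greedy assignment that pairs the most confident detections first:

```python
def assign_identities(prob_rows):
    """Greedy identity assignment after a crossing. Each row holds one
    detected blob's probabilities over all identities. The most confident
    (blob, identity) pairs are fixed first, without reusing either.

    Toy sketch only; idtracker.ai's real assignment is more sophisticated.
    """
    pairs = sorted(
        ((p, blob, ident)
         for blob, row in enumerate(prob_rows)
         for ident, p in enumerate(row)),
        reverse=True,
    )
    used_blobs, used_ids, assignment = set(), set(), {}
    for p, blob, ident in pairs:
        if blob not in used_blobs and ident not in used_ids:
            assignment[blob] = ident
            used_blobs.add(blob)
            used_ids.add(ident)
    return assignment
```

Note that the greedy order matters: a very confident match for one blob can force the other blob onto its second-best identity.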
Check out their website with full documentation, the recent Nature Methods article, bioRxiv preprint, and a great video of idtracker.ai tracking 100 zebrafish!
Romero-Ferrero, F., Bergomi, M. G., Hinz, R. C., Heras, F. J., & de Polavieja, G. G. (2019). idtracker.ai: Tracking all individuals in small or large collectives of unmarked animals. Nature Methods, 16(2), 179-182. doi:10.1038/s41592-018-0295-5
January 16, 2019
In the Journal of Neurophysiology, Brice Williams and colleagues have shared their design for a novel dual-port lick detector. This device can be used for both real-time measurement and manipulation of licking behavior in head-fixed mice.
Measuring licking behavior in mice provides a valuable metric of sensory-motor processing and can be nicely paired with simultaneous neural recordings. Williams and colleagues have developed their own device for precise measurement of licking behavior, as well as for manipulating this behavior in real time. To address the limitations of many available lick sensors, the authors designed their device to be smaller (appropriate for mice), contactless (to diminish electrical artifacts during neural recording), and precise to a submillisecond timescale. The dual-port detector can report directional licking behavior during sensory tasks and can be used in combination with neural recording. Further, given its submillisecond precision, the device can be used in a closed-loop system to perturb licking behaviors via neural inhibition. Overall, this dual-port lick detector is a cost-effective, replicable solution suitable for a variety of applications.
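Downstream of such a detector, the usual first-pass analysis is per-port lick counts and inter-lick intervals from the timestamped, directional events. A hypothetical Python sketch of that summary step (the detector itself timestamps events in hardware; this is not the authors’ code):

```python
def lick_stats(events):
    """Summarize directional lick events given as (timestamp_ms, port)
    tuples. Returns per-port counts and inter-lick intervals in ms.

    Hypothetical downstream analysis, not part of the published design.
    """
    counts = {}
    for _, port in events:
        counts[port] = counts.get(port, 0) + 1
    times = sorted(t for t, _ in events)
    intervals = [b - a for a, b in zip(times, times[1:])]
    return counts, intervals
```

Inter-lick intervals are a standard readout here because rodent licking is strongly rhythmic, so deviations are easy to spot.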
Learn how to build your own here!
And be sure to check out their Github.
January 9, 2019
Kevin Coffey has shared the following about DeepSqueak, a deep learning-based system for detection and analysis of ultrasonic vocalizations, which he developed with Russell Marx.
Rodents engage in social communication through a rich repertoire of ultrasonic vocalizations (USVs). Recording and analysis of USVs can be performed noninvasively in almost any rodent behavioral model to provide rich insight into emotional state and motor function. Despite strong evidence that USVs serve an array of communicative functions, technical and financial limitations have inhibited widespread adoption of vocalization analysis. Manual USV analysis is slow and laborious, while existing automated analysis software is vulnerable to the broad-spectrum noise routinely encountered in the testing environment.
To promote accessible and accurate USV research, we present “DeepSqueak”, a fully graphical MATLAB package for high-throughput USV detection, classification, and analysis. DeepSqueak applies a state-of-the-art regional object detection neural network (Faster-RCNN) to detect USVs. This dramatically reduces the false-positive rate, facilitating reliable analysis in standard experimental conditions. DeepSqueak includes pre-trained detection networks for mouse USVs and for 50 kHz and 22 kHz rat USVs. After detection, USVs can be clustered by k-means models or classified by trainable neural networks.
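DeepSqueak itself is a MATLAB package with its own trained models; purely to illustrate the k-means clustering step on call features such as peak frequency and duration, here is a minimal plain-Python sketch of the algorithm:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means over feature tuples, e.g. (frequency_kHz, duration_ms)
    per detected call. Toy illustration of the clustering idea only;
    DeepSqueak's MATLAB implementation is separate.
    """
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center (squared distance).
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # Recompute each center as the mean of its assigned points.
        centers = [
            tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[j]
            for j, g in enumerate(groups)
        ]
    return centers
```

With well-separated call types (e.g. long 22 kHz calls vs. short 50 kHz calls), the centers converge to the two cluster means.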
Read more in their recent publication and check out DeepSqueak on Github!
December 12, 2018
Vladislav Voziyanov and colleagues have developed and shared the TRIO Platform, a low-profile in vivo imaging support and restraint system for mice.
In vivo optical imaging methods are common tools for understanding neural function in mice. This technique is often performed in head-fixed, anesthetized animals, which requires monitoring anesthesia level and body temperature while stabilizing the head. Fitting each of the components necessary for these experiments on a standard microscope stage can be rather difficult. Voziyanov and colleagues have shared their design for the TRIO (Three-In-One) Platform. This system is compact and provides sturdy head fixation, a gas anesthesia mask, and a warm-water bed. While the design is compact enough to work with a variety of microscope stages, the use of 3D-printed components makes it customizable.
Read more about the TRIO Platform in Frontiers in Neuroscience!
The design files and list of commercially available build components are provided here.
Voziyanov, V., Kemp, B. S., Dressel, C. A., Ponder, K., & Murray, T. A. (2016). TRIO Platform: A Novel Low Profile In vivo Imaging Support and Restraint System for Mice. Frontiers in Neuroscience, 10. doi:10.3389/fnins.2016.00169
December 5, 2018
In a recent publication, Fabrice de Chaumont and colleagues share Live Mouse Tracker, a real-time behavioral analysis system for groups of mice.
Monitoring social interactions of mice is important for understanding preclinical models of various psychiatric disorders; however, gathering data on social behaviors can be time-consuming and is often limited to a few subjects at a time. With advances in computer vision, machine learning, and individual identification methods, gathering social behavior data from many mice is now easier. de Chaumont and colleagues have developed Live Mouse Tracker, which tracks the behavior of up to 4 mice at a time using RFID sensors. An infrared depth (RGBD) camera allows tracking of animal shape and posture. The system automatically labels behaviors at the individual, dyadic, and group levels. Live Mouse Tracker can be used to assess complex social behavioral differences between mice.
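The key trick in such systems is fusing camera tracks with RFID reads so each tracked shape keeps a persistent identity. Live Mouse Tracker’s actual fusion is more sophisticated; as a toy sketch of the principle, a nearest-antenna matching in Python (all names here are hypothetical):

```python
def label_tracks(track_positions, rfid_reads):
    """Assign an RFID identity to each camera track by nearest antenna.

    track_positions: {track_id: (x, y)} blob centroids from the depth camera.
    rfid_reads: {tag_id: (x, y)} antenna position that last read each tag.
    Toy nearest-neighbor matching; not Live Mouse Tracker's algorithm.
    """
    labels = {}
    for track, (tx, ty) in track_positions.items():
        tag = min(rfid_reads,
                  key=lambda t: (rfid_reads[t][0] - tx) ** 2
                              + (rfid_reads[t][1] - ty) ** 2)
        labels[track] = tag
    return labels
```

Because RFID reads are sparse in time, real systems also propagate identities along continuous video tracks between reads.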
Learn more in their manuscript in Nature Biomedical Engineering (also on bioRxiv), or check out the Live Mouse Tracker website!
de Chaumont, F., Ey, E., Torquet, N., Lagache, T., Dallongeville, S., Imbert, A., … & Olivo-Marin, J. C. (2019). Real-time analysis of the behaviour of groups of mice via a depth-sensing camera and machine learning. Nature Biomedical Engineering.
November 30, 2018
Nikolas Francis and Patrick Kanold of the University of Maryland share their design for Psibox, a platform for automated operant conditioning in the mouse home cage, in Frontiers in Neural Circuits.
The ability to collect behavioral data from large populations of subjects is advantageous for advancing behavioral neuroscience research. However, few cost-effective options are available for collecting large amounts of data, especially for operant behaviors. Francis and Kanold have developed and shared Psibox, an automated operant conditioning system. It incorporates three modules for central control, water delivery, and home cage interface, all of which can be customized with different parts. The system was validated for training mice in a positive-reinforcement auditory task and can be customized for other tasks as well. The full, low-cost system allows quick training of groups of mice in an operant task with little day-to-day experimenter involvement.
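The control logic of a positive-reinforcement auditory task reduces to scoring each trial from the stimulus and the animal’s response timing. As an illustrative sketch only (hypothetical trial outcomes, not the Psibox firmware):

```python
def score_trial(tone_played, licked, lick_latency_s, response_window_s=3.0):
    """Score one trial of a hypothetical go/no-go auditory task: a lick
    within the response window after the tone earns a water reward.

    Illustrative control logic; the outcome names and 3 s window are
    assumptions, not Psibox's published parameters.
    """
    if tone_played and licked and lick_latency_s <= response_window_s:
        return "reward"
    if licked:
        return "late" if tone_played else "false_alarm"
    return "miss" if tone_played else "correct_reject"
```

Tallying these outcomes across trials gives hit and false-alarm rates, the standard measures of task learning.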
Learn how to set up your own Psibox system here!
Francis, N. A., & Kanold, P. O. (2017). Automated operant conditioning in the mouse home cage. Frontiers in Neural Circuits.
November 14, 2018
John Stowers and colleagues from the Straw Lab at the University of Freiburg have developed and shared FreemoVR, a virtual reality setup for unrestrained animals.
Virtual reality (VR) systems can help mimic natural settings in behavioral paradigms, helping us understand behavior and brain function. Typical VR systems require that animals be movement-restricted, which limits natural responses. The FreemoVR system was developed to address these issues and allows virtual reality to be integrated with freely moving behavior. The system can be used with a number of different species, including mice, zebrafish, and Drosophila. FreemoVR has been validated for investigating several behaviors in tests of height aversion, social interaction, and visuomotor responses in unrestrained animals.
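The closed-loop principle behind free-movement VR is that each rendered frame is computed from the animal’s tracked position, so virtual objects stay world-stable as the animal moves. FreemoVR performs full 3D tracking and rendering; as a deliberately minimal 2D sketch of that update step (all names hypothetical):

```python
def update_view(animal_xy, target_xy, gain=1.0):
    """One closed-loop VR step: render the virtual target relative to the
    animal's tracked position so it appears fixed in the world.

    Toy 2D geometry only; FreemoVR's real pipeline tracks and renders in 3D.
    """
    ax, ay = animal_xy
    tx, ty = target_xy
    # Scene coordinates expressed in the animal-centered frame.
    return (tx - gain * ax, ty - gain * ay)
```

Setting `gain` away from 1.0 is how such systems probe visuomotor responses, since the virtual world then moves more or less than the animal does.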
Read more on the Straw Lab site, Nature Methods paper, or access the software on Github.
Stowers, J. R., Hofbauer, M., Bastien, R., Griessner, J., Higgins, P., Farooqui, S., . . . Straw, A. D. (2017). Virtual reality for freely moving animals. Nature Methods, 14(10), 995-1002. doi:10.1038/nmeth.4399