
Category: Behavior Tracking

An Open Source Automated Bar Test for Measuring Catalepsy in Rats

August 6, 2020

Researchers at the University of Guelph have created a low-cost automated apparatus for measuring catalepsy that increases measurement accuracy and reduces observer bias.


Catalepsy is a state of muscular rigidity that can result from several factors, including Parkinson's disease or pharmacological exposure to antipsychotics or cannabis. Catalepsy bar tests are widely used to measure this rigidity. The test consists of placing the forepaws of a rodent on a horizontal bar raised off the ground and measuring the time it takes for the subject to remove itself from this imposed posture. Traditionally, this has been measured by an experimenter with a stopwatch, or with prohibitively expensive commercial apparatus that have issues of their own. The automated bar test described here combines a 3D-printed base with an Arduino-operated design to keep the build simple and affordable. The design sets itself apart by using extremely low-cost beam break sensors, which avoid the pitfalls of the traditional "complete the circuit" approach, where changes in the rat's grip can produce false measurements. The beam break sensors detect whether the rat is on the bar, and the device automatically measures the time the rat takes to remove itself from the bar and stores it on an SD card for later retrieval. The device has been validated in rats; however, the bar height is adjustable, so there is no reason it cannot be used with other rodents as well. This bar test thus makes catalepsy measurement easy and accurate, and limits experimenter bias from manual measurements.
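The published device runs its timing loop on an Arduino, but the underlying logic is simple: start a timer when the beam is broken (forepaws placed on the bar) and log the elapsed time once the beam is restored. Purely as an illustration of that logic (not the authors' firmware), here is a minimal Python sketch; the read_beam callback and the CSV log path are assumptions standing in for the real sensor read and SD card write.

```python
import csv
import time

def run_trial(read_beam, log_path="catalepsy_log.csv"):
    """Time one catalepsy trial.

    read_beam: caller-supplied function returning True while the IR beam
    is interrupted (i.e., the rat's forepaws are on the bar). This is a
    hypothetical stand-in for the device's beam-break sensor read.
    """
    # Wait for the experimenter to place the rat's forepaws on the bar.
    while not read_beam():
        time.sleep(0.01)
    start = time.monotonic()

    # Time how long the imposed posture is maintained.
    while read_beam():
        time.sleep(0.01)
    latency = time.monotonic() - start

    # The real device logs to an SD card; here we append to a CSV file.
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow([time.time(), round(latency, 3)])
    return latency
```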

Learn more about this project in the recent paper!

Or check out the hackaday project page


Luciani, K. R., Frie, J. A., & Khokhar, J. Y. (2020). An Open Source Automated Bar Test for Measuring Catalepsy in Rats. eNeuro, 7(3). https://doi.org/10.1523/ENEURO.0488-19.2020

DeepFly3D, a deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila

July 23, 2020

Semih Günel and colleagues have created a deep learning-based pose estimator for studying how neural circuits control limbed behaviors in tethered Drosophila.


Appendage tracking is an important behavioral measure in motor circuit research. Until now, algorithms for accurate 3D pose estimation in animals as small as Drosophila did not exist. Instead, researchers have had to rely on alternative approaches, such as placing small reflective markers on fly leg segments. While marker-based tracking works well for larger animals, implementing it in Drosophila-sized animals is motion limiting and labor intensive, and it cannot recover 3D information, limiting the accuracy of behavioral measures. DeepFly3D is a PyTorch- and PyQt5-based software package designed to solve these issues and provide a user-friendly interface for pose estimation and appendage tracking. DeepFly3D makes use of supervised deep learning for 2D joint detection and a multicamera setup to iteratively infer 3D poses. This approach achieves sub-millimeter accuracy in automated measurements. Remarkably, DeepFly3D is not limited to Drosophila and can be adapted to study other animals, such as rodents, primates, and humans. DeepFly3D therefore allows for versatile pose estimation while permitting an extraordinary level of behavioral detail and accuracy.
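DeepFly3D's full pipeline (pictorial structures, iterative refinement of the camera models, and so on) is considerably more sophisticated, but the multicamera step can be illustrated with a standard linear (DLT) triangulation: each calibrated camera contributes a 2D detection of the same joint, and the 3D position most consistent with all views is recovered by least squares. The sketch below is a generic textbook version of that step, not DeepFly3D code.

```python
import numpy as np

def triangulate_point(proj_mats, points_2d):
    """Linear (DLT) triangulation of one joint from several calibrated cameras.

    proj_mats : list of 3x4 camera projection matrices
    points_2d : list of (x, y) pixel coordinates of the same joint,
                one per camera (e.g., from a 2D pose detector)
    Returns the estimated 3D position as a length-3 array.
    """
    rows = []
    for P, (x, y) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the homogeneous 3D point.
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.stack(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                 # null-space solution (homogeneous coordinates)
    return X[:3] / X[3]        # dehomogenize
```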

Read more in the paper!

Or check out the project’s GitHub!


Günel, S., Rhodin, H., Morales, D., Campagnolo, J., Ramdya, P., & Fua, P. (2019). DeepFly3D, a deep learning-based approach for 3D limb and appendage tracking in tethered, adult Drosophila. eLife, 8. https://doi.org/10.7554/eLife.48571

Rodent Arena Tracker (RAT)

June 18, 2020

Jonathan Krynitsky and colleagues from the Kravitz lab at Washington University have constructed and shared RAT, a closed loop system for machine vision rodent tracking and task control.


The Rodent Arena Tracker, or RAT, is a low-cost wireless position tracker for automatically tracking mice in high-contrast arenas. The device can use subject position information to control other devices in real time, allowing closed-loop control of tasks based on positional behavior data. RAT is built around the OpenMV Cam M7 (openmv.io), an open-source machine vision camera with onboard processing for real-time analysis, which reduces data storage requirements and removes the need for an external computer. The authors optimized the control code for tracking mice and designed a custom circuit board that runs the device off a battery and includes a real-time clock for synchronization, a BNC input/output port, and a push button for starting the device. Build instructions for RAT, along with validation data highlighting its effectiveness and potential uses, are available in their recent publication. All the design files, including the PCB design, 3D printer files, and Python code, are available on hackaday.io.
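The OpenMV Cam is programmed in MicroPython, so a tracking loop of this kind can stay remarkably small: grab a frame, find the dark blob against the light arena, and drive an output line based on the animal's position. The sketch below is written in the spirit of the OpenMV API but is not the authors' code; the grayscale threshold, frame size, blob-size cutoffs, output pin, and closed-loop rule are all illustrative assumptions (the real firmware and PCB details are on hackaday.io).

```python
# Minimal MicroPython-style sketch of dark-mouse-on-light-arena tracking.
import sensor, pyb

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)            # let the sensor settle

sync_pin = pyb.Pin("P0", pyb.Pin.OUT_PP) # e.g. a TTL line to other devices
DARK = [(0, 60)]                         # grayscale range treated as "mouse"

while True:
    img = sensor.snapshot()
    blobs = img.find_blobs(DARK, pixels_threshold=50, area_threshold=50)
    if blobs:
        mouse = max(blobs, key=lambda b: b.pixels())
        x, y = mouse.cx(), mouse.cy()
        # Closed-loop rule: raise the TTL line whenever the mouse is in the
        # left half of the frame (a stand-in for any position-based trigger).
        sync_pin.value(1 if x < img.width() // 2 else 0)
```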

Read the full article here!

Or check out the project on hackaday.io!


Krynitsky, J., Legaria, A. A., Pai, J. J., Garmendia-Cedillos, M., Salem, G., Pohida, T., & Kravitz, A. V. (2020). Rodent Arena Tracker (RAT): A Machine Vision Rodent Tracking Camera and Closed Loop Control System. eNeuro, 7(3). https://doi.org/10.1523/ENEURO.0485-19.2020

 

Video Repository Initiative on OpenBehavior

June 11, 2020

Last fall, while teaching an undergraduate course on computational methods in neuroscience at American University, we wanted to bring in some of the tools for video analysis that have been promoted on OpenBehavior. The idea was to introduce these tools at the end of the course, after the students had learned a bit about Python, Anaconda, Jupyter, Arduinos, etc. We decided to use ezTrack from the Cai Lab, as it is written in Python and uses Jupyter notebooks. Preparing for this topic was easy until we realized that we needed simple videos for tracking. Those from our lab come from operant chambers illuminated with infrared LEDs and require a good bit of preprocessing before they are suited to analysis with simple tracking algorithms. In addition, we use Long-Evans rats in our studies, and they are a challenge to track given their coloring. So we looked around the web for example videos and were surprised by how few example videos have been shared by labs that have developed, published with, and promoted tools for video analysis. Most videos that we found showed the results of tracking but did not provide raw video data. We did find a nice example of open-field behavior by mice (Samson et al., 2015) and used the supplemental videos from this now five-year-old paper for the course.

These experiences made us wonder if a collection of videos for teaching and training would be useful to the community. A collection of video recordings of animals engaged in standard neuroscience behavioral tasks (e.g., feeding, foraging, fear conditioning, operant learning) would be useful for educational purposes: students could read published papers to understand the experimental design and then analyze data from those studies using modifications of available tutorial code for packages such as ezTrack. For researchers, these same videos would be useful for reproducing analyses from published studies and for quickly learning how to apply published code to their own data. Furthermore, with the development of tools that use advanced statistical methods for video analysis (e.g., DeepLabCut, B-SOiD), it seems warranted to have a repository available for benchmarking algorithms and exploring their parameter space. One could even envision analysis competitions using standard benchmark videos, similar to those in the field of machine learning, which have driven the development of powerful algorithms (e.g., XGBoost) that go well beyond the performance of those available only a decade ago.

So we are posting today to ask for community participation in the creation of a video repository. The plan is to post license-free videos to the OpenBehavior Google Drive account. Our OpenBehavior team will convert the files to a standard (mp4) format and post links to the videos on the OpenBehavior website, so they will be accessible to the community. The website will list the creator of the video file, the camera and software used for the recording, the resolution, frame rate and duration of recording, the species and information on the behavioral experiment (and a link to the publication or preprint if the work is from a manuscript).

For studies in rodents, we are especially interested in videos showing overhead views from open-field and operant arena experiments and close-up videos of facial reactions, eyeblinks, oral movements and limb reaching. We are happy to curate videos from other species (fish, birds, monkeys, people) as well.

If you are interested in participating, please complete the form on this page or reach out to us via email at openbehavior@gmail.com or Twitter at @OpenBehavior.

 

ACRoBaT

May 14, 2020

David A. Bjånes and Chet T. Moritz from the University of Washington in Seattle have developed and published their device for training rats to perform a modified center-out task.


As neuroscience tools for studying the rodent brain have improved in the 21st century, researchers have started to use increasingly complex tasks to study rodent behavior, sometimes adapting tasks commonly used with primates. One such task used for studying motor behavior, the center-out reaching task, has been modified for use in rodents. Bjånes and Moritz have contributed further to this adaptation by creating ACRoBaT, the Automated Center-out Rodent Behavioral Trainer. The device features two custom-printed PCBs, a 3D-printed housing unit, an Arduino microcontroller, and other commercially available parts, and can be mounted outside a behavioral arena. It also provides a fully automated algorithm that trains rats based on behavioral feedback fed into the device through various sensors. The authors demonstrate the device's effectiveness with data from 18 rats tested across different conditions to find the optimal training procedure for this task. Information on how to build the device is available in their publication, as well as on GitHub.
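The paper describes the staged, feedback-driven training procedure in detail; as a generic illustration of how this kind of automation can work (not the authors' actual algorithm), a trainer can track recent trial outcomes and advance to the next difficulty level once performance over a sliding window crosses a criterion. The stage values, window length, and criterion below are placeholders.

```python
# A generic sketch of automated, criterion-based stage advancement.
from collections import deque

class AutoTrainer:
    def __init__(self, stages, window=20, criterion=0.8):
        self.stages = stages            # e.g. increasing hold times or target distances
        self.recent = deque(maxlen=window)
        self.criterion = criterion
        self.level = 0

    def record_trial(self, success: bool):
        """Log one trial outcome and return the current stage parameters."""
        self.recent.append(success)
        full_window = len(self.recent) == self.recent.maxlen
        hit_rate = sum(self.recent) / len(self.recent)
        # Advance when recent performance exceeds criterion on a full window.
        if full_window and hit_rate >= self.criterion and self.level < len(self.stages) - 1:
            self.level += 1
            self.recent.clear()
        return self.stages[self.level]

# Example usage with placeholder hold-time stages (in seconds):
trainer = AutoTrainer(stages=[0.1, 0.25, 0.5, 1.0])
current_stage = trainer.record_trial(success=True)
```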

Read the full publication here, or check out the files on GitHub!


neurotic

April 30, 2020

Jeffrey P. Gill and colleagues have developed and shared a new toolbox for synchronizing video and neural signals, cleverly named neurotic!


Collecting neural and behavioral data is fundamental to behavioral neuroscience, and the ability to synchronize these data streams is just as important as collecting them in the first place. To make this process a little simpler, Gill et al. developed an open-source option called neurotic, a NEUROscience Tool for Interactive Characterization. The tool is written in Python and includes a simple GUI, which makes it accessible to users with little coding experience. Users can read in a variety of file formats for neural data and video, which they can then process, filter, analyze, annotate, and plot. To show its effectiveness across species and signal types, the authors tested the software on Aplysia feeding behavior and human beam walking. Given its open-source nature and strong integration with other popular open-source packages, this software will continue to develop and improve as the community uses it.
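neurotic handles the loading and alignment for you (building on packages such as Neo), but the core bookkeeping behind video and signal synchronization is worth seeing once: given a frame rate, a sampling rate, and a measured offset, each video frame maps to a window of samples in the electrophysiology trace. The helper below is a generic illustration of that mapping, not part of the neurotic API; the frame rate, sampling rate, and offset are values you would measure for your own recording.

```python
import numpy as np

def frame_to_sample_window(frame_idx, fps, signal_rate, offset_s=0.0):
    """Map a video frame index to the matching index range in a signal
    sampled at signal_rate Hz. offset_s is the lag of the video relative
    to the signal (e.g., measured from a sync pulse).
    """
    t0 = frame_idx / fps + offset_s
    t1 = (frame_idx + 1) / fps + offset_s
    return int(np.floor(t0 * signal_rate)), int(np.ceil(t1 * signal_rate))

# Example: which samples of a 10 kHz nerve recording fall within
# frame 150 of a 30 fps video that lags the recording by 120 ms?
print(frame_to_sample_window(150, fps=30, signal_rate=10_000, offset_s=0.12))
```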

Read more about neurotic here!

Check out the documentation here.


Gill, J. P., Garcia, S., Ting, L. H., Wu, M., & Chiel, H. J. (2020). Neurotic: Neuroscience Tool for Interactive Characterization. eNeuro. https://doi.org/10.1523/ENEURO.0085-20.2020

3DOC: 3D Operant Conditioning

April 23, 2020

Raffaele Mazziotti from the Istituto di Neuroscienze CNR di Pisa has generously shared the following about 3DOC, a recently developed and published project from their team.


“Operant conditioning is a classical paradigm and a standard technique used in experimental psychology in which animals learn to perform an action in order to achieve a reward. By using this paradigm, it is possible to extract learning curves and accurately measure reaction times. Both of these measurements are proxies for cognitive capabilities and can be used to evaluate the effectiveness of therapeutic interventions in mouse models of disease. Recently in our lab, we constructed a fully 3D-printable chamber able to perform operant conditioning using off-the-shelf, low-cost optical and electronic components, which can be reproduced rigorously in any laboratory equipped with a 3D printer at a total cost of around €160. Requirements include a 3D-printable filament (e.g., polylactic acid, PLA), a low-cost microcontroller (e.g., Arduino UNO), and a single-board computer (e.g., Raspberry Pi). We designed the chamber entirely using 3D modelling for several reasons: first, it has a high degree of reproducibility, since the model is standardized and can be downloaded to print the same structure with the same materials in different laboratories. Secondly, it can be easily customized for specific experimental needs. Lastly, it can be shared through online repositories (GitHub: https://github.com/raffaelemazziotti/oc_chamber). With these cost-efficient and accessible components, we tested the possibility of performing two-alternative forced-choice operant conditioning using audio-visual cues while tracking mouse position in real time. As a proof of principle of customizability, we added a version of the OC chamber that can show more complex visual stimuli (e.g., images). This version includes a modified frontal wall that can host a TFT monitor and code that runs on PsychoPy2 on a Raspberry Pi. This tool can be employed to test learning and memory in models of disease. We expect that the open design of the chamber will be useful for scientific teaching and research as well as for further improvements from the open hardware community.”
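Because the stimulus side of 3DOC runs on PsychoPy on a Raspberry Pi, a single trial of a two-alternative forced-choice task can be sketched in a few lines. The snippet below is a minimal illustration in that spirit, not the 3DOC code; the window size, stimulus parameters, and the read_nosepoke() helper are assumptions (in the real chamber, responses are read through the Arduino and its sensors).

```python
# Minimal PsychoPy sketch of one two-alternative forced-choice trial.
import random
from psychopy import visual, core

win = visual.Window(size=(800, 480), color="grey", units="norm")

def read_nosepoke():
    """Hypothetical stand-in for querying the chamber's response sensors;
    should return 'left', 'right', or None if no response yet."""
    return None

def run_trial(timeout_s=10.0):
    side = random.choice(["left", "right"])
    cue = visual.GratingStim(win, tex="sin", sf=5, size=0.6,
                             pos=(-0.5, 0) if side == "left" else (0.5, 0))
    cue.draw()
    win.flip()                           # present the visual cue

    clock = core.Clock()
    while clock.getTime() < timeout_s:
        choice = read_nosepoke()
        if choice is not None:
            return choice == side        # True -> deliver reward
        core.wait(0.01)
    return False                         # no response: trial times out
```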

Check out the full publication here.

Or take a peek at the GitHub repository for this project.


PASTA

April 16, 2020

Thanks to Jan Homolak from the Department of Pharmacology, University of Zagreb School of Medicine, Zagreb, Croatia for sharing the following about repurposing a digital kitchen scale for neuroscience research: a complete hardware and software cookbook for PASTA (Platform for Acoustic STArtle).


“As we were starving for a solution on how to obtain relevant and reliable information from a kitchen scale, sometimes used in very creative ways in neuroscience research, we decided to cut the waiting and cook something ourselves. Here we introduce a complete hardware and software cookbook for PASTA, a guide on how to demolish your regular kitchen scale and use the parts to turn it into a beautiful multifunctional neurobehavioral platform. This project is still medium raw, as it is a work in progress; however, we hope you will still find it well done.
PASTA comes in various flavors such as:
– complete hardware design for PASTA
– PASTA data acquisition software codes (C++/Arduino)
– PASTA Chef: An automatic experimental protocol execution Python script for data acquisition and storage
– ratPASTA (R-based Awesome Toolbox for PASTA): An R-package for PASTA data analysis and visualization

…and all can be found on bioRxiv here: https://www.biorxiv.org/content/10.1101/2020.04.10.035766v1.supplementary-material

bon appétit!”
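For readers wondering what the Python side of such a setup might look like, here is an illustrative sketch of streaming load-cell readings from the Arduino over USB serial and logging them with timestamps. It is a stand-in for the role the PASTA Chef script plays, not the actual PASTA code; the serial port, baud rate, one-integer-per-line format, and 60-second session length are assumptions.

```python
import csv
import time

import serial  # pyserial

with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as ser, \
        open("pasta_session.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["t_s", "load_cell_raw"])
    t0 = time.monotonic()
    while time.monotonic() - t0 < 60:    # log a 60-second session
        line = ser.readline().strip()
        if not line:
            continue                     # serial read timed out
        try:
            value = int(line)            # assumes one integer reading per line
        except ValueError:
            continue                     # skip malformed lines
        writer.writerow([round(time.monotonic() - t0, 4), value])
```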


Open Source Whisking Video Database

April 2, 2020

Alireza Azarfar and colleagues at the Donders Institute for Brain, Cognition and Behaviour at Radboud University have published an open-source database of high-speed videos of whisking behavior in freely moving rodents.


As responsible citizens, it's possible you are missing the lab and working from home. Maybe you have plenty to do, or maybe you're looking for some new data to analyze to increase your knowledge of active sensing in rodents! Well, in case you don't have that data at hand and can't collect it yourself for a few more weeks, we have a treat for you! Azarfar et al. have shared a database of 6,642 high-quality videos featuring juvenile and adult male rats and adult male mice exploring stationary objects. The dataset covers a wide variety of experimental conditions, including genetic, pharmacological, and sensory deprivation interventions, to explore how active sensing behavior is modulated by different factors. Information about the interventions and experimental conditions is available as a supplementary Excel file.

The videos are available as mp4 files as well as MATLAB matrices that can be converted into a variety of data formats. All videos underwent quality control and feature different amounts of noise and background, which makes the collection a great tool for mastering video analysis. A toolbox for a variety of whisker analysis methods, including nose and whisker tracking, is available from the group on GitHub. This database is a great resource for studying sensorimotor computation, top-down mechanisms of sensory navigation, cross-species comparisons of active sensing, and the effects of pharmacological interventions on whisking behavior.
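If you want to work with the MATLAB matrices from Python, SciPy can read them directly; the short sketch below loads one file and writes its frames back out as an mp4 (imageio with the ffmpeg plugin is assumed to be installed). The file name, the variable name "frames", and the height x width x n_frames layout are assumptions; inspect the keys of the loaded dictionary to see how the actual files are organized.

```python
import numpy as np
from scipy.io import loadmat
import imageio

data = loadmat("example_whisking_video.mat")
print(data.keys())                       # inspect what the file actually contains

frames = data["frames"]                  # assumed: height x width x n_frames
frames = np.moveaxis(frames, -1, 0)      # -> n_frames x height x width
imageio.mimwrite("example_whisking_video.mp4",
                 frames.astype(np.uint8), fps=100)
```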

Read more about active sensing behavior, data collection methods, and rationale here.

The database is available through GigaScience DB. 

You can also try out their MATLAB Whisker Tracker on GitHub!


Open Source Joystick

March 26, 2020

This week we want to talk about joy! I mean, joy-sticks. Parley Belsey, Mark Nicholas and Eric Yttri have developed and shared an open-source joystick for studying motor behavior and decision making in mice!


Mice are hopping and popping up in research, and researchers are using more creativity and innovation to capture the finer aspects of their behavior. Recently, members of the Yttri lab at Carnegie Mellon used their skills to create an open-source joystick for studying mouse motor and decision-making behaviors! In their paper they describe the full behavioral setup (based on the RIVETS design from the Dudman lab), featuring a removable head-fixation point, a sipping tube, and a joystick that measures reach trajectory, amplitude, speed, and more. Data are collected and devices are controlled via an Arduino, a solenoid circuit, a microSD card reader, and an LCD readout, and data can be analyzed in real time or saved to a CSV file for later analysis. The Arduino can be programmed to trigger reward delivery when a correct response is recorded from the joystick, which streamlines outcome-based reward delivery. Belsey et al. tested their device with adult mice; the training results, the full build instructions, and ideas for how the tool might be built and used in your own lab can all be found in the paper.
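Once a session has been saved to CSV, simple reach metrics fall out of a few lines of offline analysis. The sketch below computes reach amplitude and peak speed from logged x/y samples; the file name, column names, and units are assumptions about the log layout, not the authors' actual file format.

```python
import numpy as np
import pandas as pd

df = pd.read_csv("joystick_session.csv")           # assumed columns: t_ms, x, y
xy = df[["x", "y"]].to_numpy(dtype=float)

displacement = np.linalg.norm(xy - xy[0], axis=1)  # distance from the rest position
amplitude = displacement.max()                     # peak reach amplitude

dt = np.diff(df["t_ms"].to_numpy()) / 1000.0       # seconds between samples
speed = np.linalg.norm(np.diff(xy, axis=0), axis=1) / dt
print(f"reach amplitude: {amplitude:.2f}, peak speed: {speed.max():.2f} units/s")
```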

For more, check out their publication or Github!


Belsey, P., Nicholas, M. A., & Yttri, E. A. (2020). Open-source joystick manipulandum for decision-making, reaching, and motor control studies in mice. eNeuro. https://doi.org/10.1523/ENEURO.0523-19.2020