Category: Behavior Tracking

Updates on LocoWhisk and ART

OCTOBER 3, 2019

Dr Robyn Grant of Manchester Metropolitan University (Manchester, UK) has shared her group’s most recent project, LocoWhisk, a hardware and software solution for measuring rodent exploratory, sensory and motor behaviours:


In describing the project, Dr Grant writes, “Previous studies from our lab have shown that analysing whisker movements and locomotion allows us to quantify the behavioural consequences of sensory, motor and cognitive deficits in rodents. Independent whisker and feet trackers existed, but there was no fully automated, open-source software and hardware solution that could measure both whisker movements and gait.

We developed the LocoWhisk arena and new accompanying software that together allow the automatic detection and measurement of both whisker and gait information from high-speed video footage. The arena can easily be made from low-cost materials; it is portable and incorporates both gait analysis (using a pedobarograph) and whisker tracking (using a high-speed video camera and infrared light source).

The software, ARTv2, is freely available and open source. ARTv2 is fully automated and has been developed from our previous ART (Automated Rodent Tracker) software.

ARTv2 contains new whisker and foot detector algorithms. On high-speed video footage of freely moving small mammals (including rat, mouse and opossum), we have found that ARTv2 is comparable in accuracy to, and in some cases significantly better than, readily available software and manual trackers.

The LocoWhisk system enables the collection of quantitative data from whisker movements and locomotion in freely behaving rodents. The software automatically records both whisker and gait information and provides added statistical tools to analyse the data. We hope the LocoWhisk system and software will serve as a solid foundation from which to support future research in whisker and gait analysis.”

For more details on the ARTv2 software, check out the GitHub page here.

Check out the paper that describes LocoWhisk and ARTv2, which has recently been published in the Journal of Neuroscience Methods.

LocoWhisk was initially shared and developed through the NC3Rs CRACK IT website here.


SpikeGadgets

AUGUST 22, 2019

We’d like to highlight groups and companies that take an open-source approach to their software and/or hardware for behavioral neuroscience. One of these groups is SpikeGadgets, a company co-founded by Mattias Karlsson and Magnus Karlsson.


SpikeGadgets is a group of electrophysiologists and engineers working to develop neuroscience hardware and software tools. Their open-source software, Trodes, is a cross-platform suite for neuroscience data acquisition and experimental control, made up of modules that communicate with a centralized GUI to visualize and save electrophysiological data. Trodes includes a camera module and a StateScript module; StateScript is a state-based scripting language that can be used to program behavioral tasks using lights, levers, beam breaks, lasers, stimulation sources, audio, solenoids, and more. The camera module can acquire video synchronized to the neural recordings and can track the animal’s position in real time or during playback after the experiment, and it works with USB webcams or GigE cameras.
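StateScript has its own syntax, so the snippet below is not StateScript code. It is a hypothetical Python sketch of the kind of state-based task logic such a module expresses (wait for a beam break, open a reward solenoid, enforce a timeout); the helper functions and hardware names are placeholders, not part of the SpikeGadgets API.

```python
# Hypothetical sketch of state-based task logic (NOT StateScript syntax):
# wait for a beam break, deliver a reward, enforce a timeout, repeat.
import random  # stands in for a real hardware input
import time


def read_beam_break():
    """Placeholder for a digital input; returns True when the beam is broken."""
    return random.random() < 0.01


def set_output(name, on):
    """Placeholder for driving a digital output (LED, solenoid, etc.)."""
    print(f"{time.time():.3f}  {name} -> {'ON' if on else 'OFF'}")


state = "WAIT_FOR_POKE"
while True:
    if state == "WAIT_FOR_POKE":
        if read_beam_break():                    # animal breaks the IR beam
            set_output("reward_solenoid", True)
            reward_start = time.time()
            state = "REWARD"
    elif state == "REWARD":
        if time.time() - reward_start > 0.05:    # 50 ms solenoid open time
            set_output("reward_solenoid", False)
            timeout_start = time.time()
            state = "TIMEOUT"
    elif state == "TIMEOUT":
        if time.time() - timeout_start > 2.0:    # 2 s inter-trial interval
            state = "WAIT_FOR_POKE"
    time.sleep(0.001)                            # 1 ms polling loop
```

In the real system this kind of logic would be written in StateScript and run on the SpikeGadgets behavioral control hardware rather than in a Python polling loop on the host computer.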

Paired with the Trodes software and StateScript language is SpikeGadgets hardware, which can be purchased on their website. The hardware handles data acquisition (the Main Control Unit, used for electrophysiology) and behavioral control (the Environmental Control Unit). SpikeGadgets also provides both MATLAB and Python toolboxes on their site for analyzing behavioral and electrophysiological data. Trodes runs on Windows, Linux, or Mac, and there are step-by-step instructions for installing and using Trodes on the group’s Bitbucket page.

SpikeGadgets’ mission is “to develop the most advanced neuroscience tools on the market, while preserving ease of use and science-driven customization.”


For more information on SpikeGadgets or to download or purchase their software or hardware, check out their website here.

There is additional documentation on their BitBucket Wiki, with a user manual, instructions for installation, and FAQ.

Check out their entire list of collaborators, contributors, and developers here.

Pathfinder

AUGUST 8, 2019

Matthew Cooke and colleagues from Jason Snyder’s lab at the University of British Columbia recently developed Pathfinder, an open-source software package for analyzing spatial navigation behavior in animals:


Spatial navigation is studied in animals across several different paradigms and for different purposes; by analyzing spatial behavior we can gain insight into how an animal learns a task, how it changes its approach strategy, and how it carries out goal-directed behavior more generally. Pathfinder is open-source software for analyzing rodent navigation. It automatically classifies patterns of navigation as a rodent performs a task, picking up subtle patterns in spatial behavior that simpler measures may miss. Specifically, many water maze analyses use escape latency or path length as the measure of performance, but the authors point out that the time it takes to reach the platform may not differ even when the strategy does, so latency may not be the optimal measure of an animal’s strategy, and experimenters may miss key differences in behavior. Pathfinder therefore aims to analyze more subtle aspects of the task to detect differences in spatial navigation and strategy.

Originally intended for water maze navigation, Pathfinder can also be used to analyze many other spatial behaviors across different tasks, mazes, and species. The software takes x-y coordinates from behavior tracking software (for example, it can open files from Noldus EthoVision, Actimetrics’ WaterMaze, Stoelting’s ANY-maze, and the open-source project ezTrack from Denise Cai’s lab) and then calculates the best-fit search strategy for each trial. For the Morris water maze task, trials are classified into several categories: Direct Swim, Directed Search, Focal Search, Spatial Indirect, Chaining, Scanning, Thigmotaxis, and Random Search.
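As a rough illustration of the kind of path-based measures such a classification builds on, the minimal Python sketch below loads one trial’s x-y coordinates and computes path length, path efficiency, and mean distance to the goal. The file name, column names, and goal location are assumptions, and Pathfinder’s actual classifier uses a richer parameter set than this.

```python
# Minimal sketch: derive path-based measures (the kind a strategy
# classification builds on) from one trial's x-y coordinates.
# File name, column names, and the goal location are hypothetical.
import numpy as np
import pandas as pd

trial = pd.read_csv("trial_coords.csv")             # assumed columns: "x", "y"
xy = trial[["x", "y"]].to_numpy()
goal = np.array([75.0, 120.0])                      # hypothetical platform center

steps = np.diff(xy, axis=0)
path_length = np.linalg.norm(steps, axis=1).sum()
ideal_length = np.linalg.norm(xy[0] - goal)         # straight-line start-to-goal distance
efficiency = ideal_length / max(path_length, 1e-9)  # 1.0 would be a perfect direct swim

dist_to_goal = np.linalg.norm(xy - goal, axis=1)
mean_dist_to_goal = dist_to_goal.mean()             # a cumulative-search-error-style measure

print(f"path length: {path_length:.1f}, efficiency: {efficiency:.2f}, "
      f"mean distance to goal: {mean_dist_to_goal:.1f}")
```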

Pathfinder runs in Python and has an easy-to-use GUI; many aspects and parameters can be adjusted to analyze different tasks or behaviors.

For more details, check out their bioRxiv preprint here.

There’s a nice (humorous!) writeup of the project on the Snyder Lab website.

You can also download the project and view more details on their GitHub:
https://matthewbcooke.github.io/Pathfinder/

https://github.com/MatthewBCooke/Pathfinder/


RAD

August 1, 2019

In their recent eNeuro article, Bridget Matikainen-Ankney and colleagues from the Kravitz lab have developed and shared the Rodent Activity Detector (RAD), a low-cost system that can track and record activity in rodent home cages.


Physical activity is a measure used in many research studies and an important determinant of human health. Current methods for measuring physical activity in laboratory rodents have limitations, including high expense, specialized caging/equipment, and high computational overhead. To address these limitations, Matikainen-Ankney et al. designed an open-source, cost-effective device for measuring rodent activity.

In their new manuscript, they describe the design and implementation of RAD, the Rodent Activity Detector. The system allows for high-throughput installation, minimal investigator intervention, and circadian monitoring. The design includes a battery-powered passive infrared (PIR) sensor, a microcontroller, a microSD card logger, and an OLED screen for displaying data. All of the build instructions for RAD manufacture and programming, including the Arduino code, are provided on the project’s website.

The system records the number of PIR activation bouts and the total duration the PIR is active each minute. The authors report that RAD is most useful for quantifying changes across minutes rather than on a second-to-second timescale, so the default data-logging frequency is set to one minute. The resulting CSV files can be viewed and visualized using the provided Python scripts. Validation against video monitoring showed that the PIR data correlated strongly with speed and captured place-to-place locomotion, but not slow or in-place movements. To verify the device’s utility, RAD was used to collect physical activity data from 40 animals for 10 weeks; it detected high-fat-diet (HFD)-induced changes in activity and quantified individual animals’ circadian rhythms. Major advantages of this tool are that the PIR sensor is not triggered by activity in other cages, it can detect and quantify within-mouse activity changes over time, and little investigator intervention is needed beyond infrequent battery replacement. Although the design was optimized for the lab’s specific caging, the open-source nature of the project makes it easy to modify.
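Because the log is a simple per-minute CSV, it is easy to summarize in a few lines of Python. The sketch below is not the authors’ provided scripts and the column names are assumed; it bins the per-minute records into hours and averages across days to give a coarse circadian activity profile.

```python
# Sketch (not the authors' provided scripts): summarize RAD-style per-minute
# activity logs into an hourly circadian profile. Column names are assumed.
import pandas as pd

log = pd.read_csv("rad_log.csv", parse_dates=["timestamp"])
# assumed columns: "timestamp", "active_bouts", "active_seconds" (one row per minute)

hourly = (log
          .set_index("timestamp")
          .resample("1H")[["active_bouts", "active_seconds"]]
          .sum())

# Average across days to get a 24-point circadian profile
profile = hourly.groupby(hourly.index.hour).mean()
print(profile)
```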

More details on RAD can be found in their eNeuro manuscript here, and all documentation can also be found on the project’s Hackaday.io page.


Matikainen-Ankney, B. A., Garmendia-Cedillos, M., Ali, M., Krynitsky, J., Salem, G., Miyazaki, N. L., … Kravitz, A. V. (2019). Rodent Activity Detector (RAD), an Open Source Device for Measuring Activity in Rodent Home Cages. eNeuro, 6(4). https://doi.org/10.1523/ENEURO.0160-19.2019

MouseMove

July 18, 2019

In a 2015 Scientific Reports article, Andre Samson and colleagues shared their project MouseMove, an open-source software for quantifying movement in the open field test:


The Open Field (OF) test is a commonly used assay for monitoring exploratory behavior and locomotion in rodents. Most research groups use commercial systems for recording and analyzing OF behavior, but these systems can be expensive and inflexible. A few open-source OF systems have been developed, but they are limited in the movement parameters they can collect and analyze. MouseMove is the first open-source software capable of providing qualitative and quantitative information on mouse locomotion in a semi-automated, high-throughput manner. With the aim of providing a freely available program for analyzing OF test data, the researchers developed software that accurately quantifies numerous parameters of movement.

In their manuscript, Samson et al. describe the design and implementation of MouseMove. Their OF system measures distance, speed, and laterality with >96% accuracy. They use MouseMove to analyze the OF behavior of mice after experimental stroke, demonstrating reduced locomotor activity and quantifying laterality deficits. The system is used in combination with the open-source program ImageJ and its MTrack2 plugin to analyze pre-recorded OF test video.

The system has two downloadable components: an ImageJ macro and a separate program with the custom-built MouseMove GUI. ImageJ is used to subtract the background from the experimental video and create an image of the animal’s total trajectory. The MouseMove GUI then performs a detailed analysis of the movement patterns, measuring the fraction of time spent stationary, distance traveled, mean speed, and various measures of laterality. The results are presented both graphically and as a saveable text file. In the manuscript, they provide step-by-step instructions for using MouseMove. The authors additionally highlight the software’s region-of-interest (ROI) capability, which makes it suitable for analyzing cognitive tests such as Novel Object Recognition. The tool offers relatively fast video processing of motor and cognitive behaviors and has many applications for studying rodent models of brain injury or stimulation in which locomotion is altered.
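MouseMove itself runs as an ImageJ macro plus a standalone GUI, but the kinds of measures it reports can be illustrated from an x-y trajectory with a short Python sketch. The calibration, frame rate, movement threshold, and laterality definition below are assumptions for illustration, not MouseMove’s own implementation.

```python
# Illustrative re-computation of MouseMove-style measures (distance, mean speed,
# fraction of time stationary, a simple laterality index) from an x-y trajectory.
# This is not MouseMove's code; the calibration values are assumed.
import numpy as np

fps = 30.0                       # assumed camera frame rate
cm_per_px = 0.1                  # assumed spatial calibration

xy = np.loadtxt("trajectory.csv", delimiter=",")     # two columns: x, y (pixels)
steps = np.diff(xy, axis=0) * cm_per_px
step_len = np.linalg.norm(steps, axis=1)

distance = step_len.sum()                            # total distance (cm)
speed = step_len * fps                               # instantaneous speed (cm/s)
moving = speed > 2.0                                 # assumed "stationary" threshold
frac_stationary = 1.0 - moving.mean()

# Laterality: balance of left vs right turns between successive movement vectors
headings = np.arctan2(steps[:, 1], steps[:, 0])
turns = np.angle(np.exp(1j * np.diff(headings)))     # wrap each turn into (-pi, pi]
laterality_index = (np.sum(turns > 0) - np.sum(turns < 0)) / max(len(turns), 1)

print(f"distance: {distance:.1f} cm, mean speed: {speed.mean():.1f} cm/s, "
      f"stationary: {frac_stationary:.0%}, laterality index: {laterality_index:+.2f}")
```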

 

More information on MouseMove can be found in their manuscript here.


Samson, A. L., Ju, L., Ah Kim, H., Zhang, S. R., Lee, J. A. A., Sturgeon, S. A., … Schoenwaelder, S. M. (2015). MouseMove: an open source program for semi-automated analysis of movement and cognitive testing in rodents. Scientific Reports, 5, 16171. https://doi.org/10.1038/srep16171

Teensy-based Interface

July 3, 2019

Michael Romano and colleagues from the Han Lab at Boston University recently published a project in which a Teensy microcontroller controls an sCMOS camera during behavioral experiments with high temporal precision:


Teensy microcontrollers are becoming increasingly popular and widespread in the neuroscience community. One benefit of a Teensy is its ease of programming for those with little programming experience, since it is programmed in the Arduino/C++ language; another is that it can receive and send time-precise signals. Romano et al. developed a flexible Teensy 3.2-based interface for data acquisition and delivery of analog and digital signals during a rodent locomotion tracking experiment and a trace eye-blink conditioning experiment, and they show how the interface can be paired with optical calcium imaging. The setup integrates an sCMOS camera with behavioral experiments, and the interface is quite user-friendly.

The interface keeps the data temporally precise and can deliver digital signals with microsecond precision to trigger image capture on the paired sCMOS camera. During the eye-blink conditioning experiment, calcium imaging was performed by having the Teensy send pulses to the camera to capture calcium activity in the hippocampus at 20 Hz. The interface can also generate analog sound waveforms to drive speakers for the eye-blink experiment. The study shows how an inexpensive piece of lab equipment, such as a simple Teensy microcontroller, can drive multiple aspects of a neuroscience experiment, and it provides inspiration for future experiments that use microcontrollers for behavioral control.

For more details on the project, check out the project’s GitHub here.

 

Romano, M., Bucklin, M., Gritton, H., Mehrotra, D., Kessel, R., & Han, X. (2019). A Teensy microcontroller-based interface for optical imaging camera control during behavioral experiments. Journal of Neuroscience Methods, 320, 107-115.

 

optoPAD

June 27, 2019

Carlos Ribeiro’s lab at the Champalimaud Centre for the Unknown recently published their new project, optoPAD, in eLife:


Analyses of behavior and of neural activity both need to be time-precise in order to be correlated or compared with each other. Behavior can be analyzed with many methods (as seen in many of the projects featured on this site!). The Ribeiro lab previously published flyPAD (Itskov et al., 2014), a system for automated analysis of feeding behavior in Drosophila with high temporal precision. However, to manipulate specific feeding behaviors, the group wanted to go one step further and perturb neural activity during feeding, and they needed a method precise enough to compare with behavior.

In their new manuscript, Moreira et al. describe the design and implementation of a high-throughput system for closed-loop optogenetic manipulation of neurons in Drosophila during feeding behavior. Named optoPAD, the system allows perturbation of specific groups of neurons. They use optoPAD to induce appetitive and aversive effects on feeding by activating or inhibiting gustatory neurons in a closed-loop manner. optoPAD combines the existing flyPAD system with LED stimulation for optogenetic perturbation, and it works together with Bonsai, an open-source framework for processing behavioral data streams.

The system first uses flyPAD to measure the fly’s interaction with the food presented in an experiment. Bonsai detects when the fly interacts with a food electrode and sends a signal to a microcontroller, which turns on an LED for optogenetic perturbation of neurons in the fly. The authors additionally highlight the flexibility and expandability of the optoPAD system, detailing how flyPAD, once published and later placed in an optogenetics framework by their group, was successfully adapted by another group, a great example of the benefit of sharing projects openly.
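The closed-loop logic itself is conceptually simple. The Python sketch below is only a schematic stand-in for it, assuming a hypothetical contact detector and a serial-connected LED driver; the real system uses flyPAD’s capacitive signal, Bonsai, and dedicated microcontroller hardware rather than a Python loop.

```python
# Schematic sketch of an optoPAD-style closed loop: when the fly contacts a
# food electrode, trigger an LED pulse for optogenetic stimulation.
# The contact detector and serial LED driver are hypothetical stand-ins.
import time

import serial  # pyserial; assumes an LED driver listening on a serial port


def touch_detected():
    """Placeholder for flyPAD/Bonsai contact detection."""
    return False  # replace with a real signal source


led = serial.Serial("/dev/ttyACM0", 115200, timeout=0)  # hypothetical port

PULSE_S = 0.1            # assumed LED pulse duration (s)
while True:
    if touch_detected():
        led.write(b"LED ON\n")
        time.sleep(PULSE_S)
        led.write(b"LED OFF\n")
    time.sleep(0.002)    # poll the contact signal at ~500 Hz
```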

 

Details on the hardware and software can be found on the Ribeiro lab GitHub. More details on flyPAD, the original project, can be found on their GitHub as well.

Information on flyPAD can also be found on the flyPAD website and in the flyPAD paper.


Moreira, J. M., Itskov, P. M., Goldschmidt, D., Steck, K., Walker, S. J., & Ribeiro, C. (2019). optoPAD: a closed-loop optogenetics system to study the circuit basis of feeding behaviors. eLife, 8, e43924. https://doi.org/10.7554/eLife.43924

DeepBehavior

June 20, 2019

Ahmet Arac from Peyman Golshani’s lab at UCLA recently developed DeepBehavior, a deep-learning toolbox with post-processing methods for video analysis of behavior:


Recently, there has been a major push for more fine-grained and detailed behavioral analysis in neuroscience. While there are methods for capturing high-quality, high-speed video of behavior, the data still need to be processed and analyzed. DeepBehavior is a deep-learning toolbox that automates this process; its main purpose is to analyze and track behavior in rodents and humans.

The authors provide three different convolutional neural network models (TensorBox, YOLOv3, and OpenPose), chosen for their ease of use, and the user can decide which model to implement based on the experiment and the kind of data they aim to collect and analyze. The article provides methods and tips on how to train neural networks with this type of data and gives methods for post-processing of the image data.

In the manuscript, the authors give examples of using DeepBehavior in five behavioral tasks in both animals and humans. For rodents, they use a food-pellet reaching task, a three-chamber test, and social interaction between two mice; for humans, a reaching task and a supination/pronation task. They provide 3D kinematic analysis in all tasks and show that a transfer-learning approach accelerates network training when images from the behavior videos are used. A major benefit of this tool is that it can be modified and generalized across behaviors, tasks, and species. Additionally, DeepBehavior offers several different neural network architectures and, uniquely, provides post-processing methods for 3D kinematic analysis, which sets it apart from previously published toolboxes for video-based behavioral analysis. Finally, the authors emphasize the toolbox’s potential in clinical settings for analyzing human motor function.
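As a generic illustration of this kind of post-processing (not DeepBehavior’s exact pipeline), the sketch below smooths tracked 3D keypoints and computes per-joint speed; the file name, array layout, frame rate, and smoothing window are all assumptions.

```python
# Generic post-processing sketch (not DeepBehavior's exact pipeline): smooth
# tracked 3D keypoints and compute per-joint speed. Layout and parameters are assumed.
import numpy as np

fps = 100.0                                    # assumed video frame rate
keypoints = np.load("keypoints_3d.npy")        # assumed shape: (n_frames, n_joints, 3)

# Simple moving-average smoothing over a 5-frame window, per joint and coordinate
kernel = np.ones(5) / 5.0
smoothed = np.apply_along_axis(
    lambda v: np.convolve(v, kernel, mode="same"), axis=0, arr=keypoints)

# Per-joint speed: frame-to-frame displacement scaled by the frame rate
disp = np.linalg.norm(np.diff(smoothed, axis=0), axis=2)   # (n_frames - 1, n_joints)
speed = disp * fps
print("mean speed per joint:", speed.mean(axis=0))
```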

 

For more details, take a look at their project’s Github.

All three models used in the paper also have their own GitHub repositories: TensorBox, YOLOv3, and OpenPose.


Arac, A., Zhao, P., Dobkin, B. H., Carmichael, S. T., & Golshani, P. (2019). DeepBehavior: A deep learning toolbox for automated analysis of animal and human behavior imaging data. Frontiers in Systems Neuroscience, 13.

ezTrack

June 13, 2019

Zach Pennington from Denise Cai’s lab at Mt. Sinai recently posted a preprint describing their latest open-source project, ezTrack:


ezTrack is an open-source, platform-independent set of behavior analysis pipelines built on interactive Python (IPython/Jupyter Notebook) that researchers with no prior programming experience can use, which comes as a sigh of relief for those with little to no coding background. Behavioral tracking analysis shouldn’t be limited to people with extensive programming knowledge, and ezTrack is a nice alternative to currently available software that may demand more of it. The manuscript and Jupyter notebooks are written in the style of a tutorial and are meant to give the user straightforward instructions for implementing ezTrack. ezTrack differs from other recent video-analysis toolboxes in that it does not use deep-learning algorithms and therefore does not require training sets for transfer learning.

ezTrack can be used to analyze videos of a single animal’s behavior in different settings, and the authors provide examples of positional analysis across several tasks (place preference, water maze, open field, elevated plus maze, light-dark box, etc.), as well as analysis of freezing behavior. ezTrack provides frame-by-frame output in .csv files, and users can crop video frames to avoid issues with cables from optogenetic or electrophysiology experiments. ezTrack accepts multiple video formats, such as mpg1, wmv, avi, and more.

Aside from being open source, ezTrack has several major advantages. Notably, it is accessible to researchers with little to no programming background; the user does not need to adjust many parameters, and the data can be processed into interactive visualizations and easily exported as .csv files. ezTrack is operating-system and hardware independent and can be used across multiple platforms, and the use of IPython/Jupyter Notebook makes it easy for researchers to replicate their analyses.
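The core idea behind this style of non-deep-learning tracking is background subtraction. The sketch below illustrates that idea with OpenCV and NumPy; it is not ezTrack’s own code, and the file name and threshold are assumptions.

```python
# Minimal background-subtraction tracking sketch (illustrative only, not
# ezTrack's code). File name and noise threshold are assumed.
import cv2
import numpy as np

cap = cv2.VideoCapture("session.avi")
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32))
cap.release()

background = np.median(frames, axis=0)        # static background estimate

positions = []
for frame in frames:
    diff = np.abs(frame - background)
    diff[diff < 30] = 0                       # assumed noise threshold
    ys, xs = np.nonzero(diff)
    if len(xs):
        # centroid of the animal, weighted by how much each pixel changed
        w = diff[ys, xs]
        positions.append((np.average(xs, weights=w), np.average(ys, weights=w)))
    else:
        positions.append((np.nan, np.nan))

np.savetxt("positions.csv", np.array(positions), delimiter=",", header="x,y")
```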

Check out their GitHub with more details on how to use ezTrack: https://github.com/denisecailab/ezTrack


Pennington, Z. T., Dong, Z., Bowler, R., Feng, Y., Vetere, L. M., Shuman, T., & Cai, D. J. (2019). ezTrack: An open-source video analysis pipeline for the investigation of animal behavior. bioRxiv, 592592.

Low Cost Open Source Eye Tracking

May 30, 2019

On Hackaday, John Evans and colleagues have shared a design and build for an open-source eye-tracking system for human research.


We’ve wanted to expand our coverage of behavioral tools to include those used in human research. To get this rolling, we’d like to highlight an eye-tracking project that might be helpful to many labs, especially those without grant funding for collecting pilot data. Check out Low Cost Open Source Eye Tracking: it uses open-source code, available from GitHub, and a pair of cheap USB cameras.

Check out the details on Hackaday.io and GitHub!


Evans, J. (2018). Low Cost Open Source Eye Tracking. Retrieved from https://hackaday.io/project/153293-low-cost-open-source-eye-tracking