Tag: video analysis

Oat: online animal tracker

December 5, 2019

Jonathan Newman of the Wilson Lab at Massachusetts Institute of Technology has developed and shared a set of programs for processing video.


The only thing you’ll enjoy more than an oat milk latte from your favorite coffeeshop is the Oat collection of video processing components designed for use with Linux! Developed by Jonathan Newman of Open Ephys, this set of programs is useful for processing video, extracting object position information, and streaming data. While it was designed for real-time animal position tracking, it can be used in many applications that require real-time object tracking. Each Oat component exposes a standard interface, so components can be chained together into complex dataflow networks for capturing, processing, and recording video streams.
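As a rough illustration of that chaining idea, the sketch below launches a hypothetical Oat pipeline from Python: one component serving frames, one detecting object position, and one recording the result. The component names, arguments, and stream names (frameserve, posidet, record, raw, pos) are assumptions modeled on the project's command-line style, not verified syntax; consult the Oat documentation for the real interface.

```python
# Hypothetical sketch: chain Oat components into a simple dataflow network.
# Component names and arguments are assumptions; see the Oat docs for real syntax.
import subprocess

# Serve frames from a webcam into a named stream called "raw" (assumed syntax).
frameserver = subprocess.Popen(["oat", "frameserve", "wcam", "raw"])

# Detect object position from "raw" and publish it to a stream called "pos".
detector = subprocess.Popen(["oat", "posidet", "diff", "raw", "pos"])

# Record the position stream to disk.
recorder = subprocess.Popen(["oat", "record", "-p", "pos", "-f", "./data"])

# Wait for all components to finish (e.g., when the frame server is stopped).
for proc in (frameserver, detector, recorder):
    proc.wait()
```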

Read more about Oat on the Open Ephys website, or check out more on Github!


https://open-ephys.org/oat

MouseMove

July 18, 2019

In a 2015 Scientific Reports article, Andre Samson and colleagues shared their project MouseMove, open-source software for quantifying movement in the open field test:


The Open Field (OF) test is a commonly used assay for monitoring exploratory behavior and locomotion in rodents. Most research groups use commercial systems for recording and analyzing behavior in the OF test, but these commercial systems can be expensive and lack flexibility. A few open-source OF systems have been developed, but they are limited in the movement parameters that can be collected and analyzed. MouseMove is the first open-source software capable of providing qualitative and quantitative information on mouse locomotion in a semi-automated, high-throughput manner. With the aim of providing a freely available program for analyzing OF test data, the researchers developed software that accurately quantifies numerous parameters of movement.

In their manuscript, Samson et al. describe the design and implementation of MouseMove. Their OF system allows for the measurement of distance, speed, and laterality with >96% accuracy. They use MouseMove to analyze the OF behavior of mice after experimental stroke, demonstrating reduced locomotor activity and quantifying laterality deficits. The system is used in combination with the open-source program ImageJ and the MTrack2 plugin to analyze pre-recorded OF test video.

The system has two downloadable components: the ImageJ macro and a separate program with the custom-built MouseMove GUI. ImageJ is used to subtract the background from the experiment video and create an image of the animal's total trajectory. The MouseMove GUI then completes a detailed analysis of the movement patterns, measuring the fraction of time spent stationary, the distance traveled, mean speed, and various measures of laterality. The results are depicted both in visual/graphical form and as a saveable text file. In the manuscript, they provide step-by-step instructions on how to use MouseMove. The authors additionally highlight the software's region-of-interest (ROI) capability, which makes it suitable for analysis of cognitive tests such as Novel Object Recognition. This tool offers relatively fast video processing of motor and cognitive behaviors and has many applications for the study of rodent models of brain injury/stimulation to measure altered locomotion.
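To make the kinds of measures MouseMove reports more concrete, here is a minimal, hypothetical Python sketch that computes distance traveled, mean speed, and fraction of time stationary from a frame-by-frame trajectory. It illustrates the metrics only and is not MouseMove's own code; the frame rate, pixel scale, and stationary threshold are made-up parameters.

```python
# Hypothetical sketch of trajectory metrics like those MouseMove reports.
# Not MouseMove's code; input is an (N, 2) array of x/y pixel positions per frame.
import numpy as np

def trajectory_metrics(xy, fps=30.0, px_per_cm=10.0, stationary_cm_s=1.0):
    xy = np.asarray(xy, dtype=float)
    # Per-frame displacement, converted from pixels to cm.
    step_cm = np.linalg.norm(np.diff(xy, axis=0), axis=1) / px_per_cm
    speed_cm_s = step_cm * fps                      # instantaneous speed per frame
    return {
        "distance_cm": step_cm.sum(),
        "mean_speed_cm_s": speed_cm_s.mean(),
        "fraction_stationary": np.mean(speed_cm_s < stationary_cm_s),
    }

# Example: a short synthetic trajectory sampled at 30 frames per second.
path = np.cumsum(np.random.randn(300, 2) * 2.0, axis=0)
print(trajectory_metrics(path))
```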

 

More information on MouseMove can be found in their manuscript here.


Samson, A. L., Ju, L., Ah Kim, H., Zhang, S. R., Lee, J. A. A., Sturgeon, S. A., … Schoenwaelder, S. M. (2015). MouseMove: an open source program for semi-automated analysis of movement and cognitive testing in rodents. Scientific Reports, 5, 16171.  doi: 10.1038/srep16171

DeepBehavior

June 20, 2019

Ahmet Arac from Peyman Golshani’s lab at UCLA recently developed DeepBehavior, a deep-learning toolbox with post-processing methods for video analysis of behavior:


Recently, there has been a major push for more fine-grained and detailed behavioral analysis in the field of neuroscience. While there are methods for capturing high-speed, high-quality video of behavior, the resulting data still need to be processed and analyzed. DeepBehavior is a deep learning toolbox that automates this process, its main purpose being to analyze and track behavior in rodents and humans.

The authors provide three different convolutional neural network models (TensorBox, YOLOv3, and OpenPose), chosen for their ease of use, and the user can decide which model to implement based on the experiment and the kind of data they aim to collect and analyze. The article provides methods and tips on how to train neural networks with this type of data, and gives methods for post-processing of image data.

In the manuscript, the authors give examples of utilizing DeepBehavior in five behavioral tasks in both animals and humans. For rodents, they use a food pellet reaching task, a three-chamber test, and social interaction of two mice. In humans, they use a reaching task and a supination/pronation task. They provide 3D kinematic analysis in all tasks, and show that the transfer learning approach accelerates network training when images from the behavior videos are used. A major benefit of this tool is that it can be modified and generalized across behaviors, tasks, and species. Additionally, DeepBehavior uses several different neural network architectures, and uniquely provides post-processing methods for 3D kinematic analysis, which separates it from previously published toolboxes for video behavioral analysis. Finally, the authors emphasize the potential for using this toolbox in a clinical setting for analyzing human motor function.
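Because the 3D kinematic post-processing is a distinguishing feature, here is a minimal, generic sketch of that kind of step: triangulating a keypoint detected in two calibrated camera views into 3D coordinates and computing its speed. It uses OpenCV and NumPy; the projection matrices P1 and P2 are assumed to come from a prior camera calibration, and the code illustrates the general technique rather than DeepBehavior's actual implementation.

```python
# Generic sketch of 3D kinematic post-processing from two camera views.
# Not DeepBehavior's implementation; P1/P2 are assumed 3x4 projection matrices
# obtained from a separate stereo calibration step.
import numpy as np
import cv2

def triangulate_track(pts_cam1, pts_cam2, P1, P2, fps=100.0):
    """pts_cam*: (N, 2) pixel coordinates of the same keypoint across N frames."""
    pts1 = np.asarray(pts_cam1, dtype=float).T       # 2 x N
    pts2 = np.asarray(pts_cam2, dtype=float).T
    homog = cv2.triangulatePoints(P1, P2, pts1, pts2)  # 4 x N homogeneous points
    xyz = (homog[:3] / homog[3]).T                    # N x 3 world coordinates
    # Instantaneous 3D speed from frame-to-frame displacement.
    speed = np.linalg.norm(np.diff(xyz, axis=0), axis=1) * fps
    return xyz, speed
```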

 

For more details, take a look at their project’s Github.

All three models used in the paper also have their own Github: TensorBox, YOLOv3, and OpenPose.


Arac, A., Zhao, P., Dobkin, B. H., Carmichael, S. T., & Golshani, P. (2019). DeepBehavior: A deep learning toolbox for automated analysis of animal and human behavior imaging data. Frontiers in Systems Neuroscience, 13.

 

ezTrack

June 13, 2019

Zach Pennington from Denise Cai’s lab at Mt. Sinai recently posted a preprint describing their latest open-source project called ezTrack:


ezTrack is an open-source, platform-independent set of behavioral analysis pipelines built on interactive Python (iPython/Jupyter Notebook) that researchers with no prior programming experience can use. It is a sigh of relief for anyone who feels that behavioral tracking analysis shouldn’t be limited to those with extensive programming knowledge, and it is a nice alternative to currently available software that may require a bit more programming experience. The manuscript and Jupyter notebooks are written in the style of a tutorial and are meant to provide straightforward instructions for implementing ezTrack. ezTrack is unique among recent video analysis toolboxes in that it does not use deep learning algorithms and thus does not require training sets for transfer learning.

ezTrack can be used to analyze rodent behavior videos of a single animal in different settings, and the authors provide examples of positional analysis across several tasks (place preference, water maze, open field, elevated plus maze, light-dark box, etc.), as well as analysis of freezing behavior. ezTrack provides frame-by-frame data output in .csv files, and users can crop the frames of the video to remove issues caused by cables from optogenetic or electrophysiology experiments. ezTrack accepts multiple video formats, such as mpg1, wmv, avi, and more.
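Since ezTrack relies on classical image processing rather than deep learning, its location tracking can be illustrated with a simple background-subtraction sketch. The example below is a hypothetical, minimal version of that idea using OpenCV, NumPy, and pandas, and is not ezTrack's own code: it estimates a background as the median of sampled frames, takes the center of mass of the largest deviations from that background in each frame as the animal's position, and writes frame-by-frame x/y coordinates to a .csv file.

```python
# Hypothetical background-subtraction tracker in the spirit of ezTrack (not its code).
import cv2
import numpy as np
import pandas as pd

def track_video(path, out_csv="positions.csv", n_bg_frames=100):
    cap = cv2.VideoCapture(path)
    frames = []
    # Sample frames to build a median background image.
    while len(frames) < n_bg_frames:
        ok, frame = cap.read()
        if not ok:
            break
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    background = np.median(frames, axis=0).astype(np.uint8)

    cap.set(cv2.CAP_PROP_POS_FRAMES, 0)           # rewind to the first frame
    rows, frame_idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, background)      # deviation from the background
        # Center of mass of the strongest deviations approximates the animal's position.
        ys, xs = np.where(diff > diff.max() * 0.5)
        if xs.size:
            rows.append({"frame": frame_idx, "x": xs.mean(), "y": ys.mean()})
        frame_idx += 1
    cap.release()
    pd.DataFrame(rows).to_csv(out_csv, index=False)
```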

Aside from the benefit of being open-source, ezTrack has several major advantages. Notably, the tool is user-friendly in that it is accessible to researchers with little to no programming background. The user does not need to adjust many parameters, the data can be processed into interactive visualizations, and results are easily exported as .csv files. ezTrack is both operating-system and hardware independent and can be used across multiple platforms. Utilizing iPython/Jupyter Notebook also allows researchers to easily replicate their analyses.

Check out their GitHub with more details on how to use ezTrack: https://github.com/denisecailab/ezTrack


Pennington, Z. T., Dong, Z., Bowler, R., Feng, Y., Vetere, L. M., Shuman, T., & Cai, D. J. (2019). ezTrack: An open-source video analysis pipeline for the investigation of animal behavior. BioRxiv, 592592. 

Mousecam

February 6, 2019

Arne Meyer and colleagues recently shared their design and implementation of a head-mounted camera system for capturing detailed behavior in freely moving mice.


Video monitoring of animals can give great insight into behavior. Most video monitoring systems for collecting precise behavioral data require fixed-position cameras and stationary animals, which can limit observation of natural behaviors. To address this, Meyer et al. developed a system which combines a lightweight head-mounted camera and head-movement sensors to detect behaviors in mice. The system, built using commercially available and 3D-printed parts, can be used to monitor a variety of subtle behaviors, including eye position, whisking, and ear movements, in unrestrained animals. Furthermore, the device can be mounted in combination with neural implants for recording brain activity.
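As a rough, hypothetical illustration of how such combined recordings might be handled downstream (this is not the authors' software), the sketch below aligns each camera frame with the nearest head-movement (accelerometer) sample by timestamp using pandas; the column names, sample rates, and values are invented for the example.

```python
# Hypothetical alignment of camera frames with head-movement samples by timestamp.
# Not the mousecam software; column names and values are assumptions for illustration.
import pandas as pd

# Frame timestamps (seconds) and accelerometer samples with their own timestamps.
frames = pd.DataFrame({"t": [0.00, 0.033, 0.066], "frame": [0, 1, 2]})
accel = pd.DataFrame({"t": [0.00, 0.01, 0.02, 0.03, 0.04, 0.05, 0.06],
                      "ax": [0.0, 0.1, 0.0, -0.1, 0.0, 0.2, 0.1]})

# For each frame, take the accelerometer sample closest in time.
aligned = pd.merge_asof(frames.sort_values("t"), accel.sort_values("t"),
                        on="t", direction="nearest")
print(aligned)
```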

Read more here! You can also check out their GitHub here. Documentation and files are also available on Open Ephys here.