
Category: Video Analysis

Rodent Arena Tracker (RAT)

June 18, 2020

Jonathan Krynitsky and colleagues from the Kravitz lab at Washington University have constructed and shared RAT, a closed-loop system for machine-vision rodent tracking and task control.


The Rodent Arena Tracker, or RAT, is a low-cost, wireless position tracker for automatically tracking mice in high-contrast arenas. The device can use subject position information to control other devices in real time, allowing for closed-loop control of various tasks based on positional behavior data. RAT is based on the OpenMV Cam M7 (openmv.io), an open-source machine vision camera with onboard processing for real-time analysis, which reduces data storage requirements and removes the need for an external computer. The authors optimized the control code for tracking mice and designed a custom circuit board that runs the device off a battery and adds a real-time clock for synchronization, a BNC input/output port, and a push button for starting the device. The build instructions for RAT, as well as validation data highlighting its effectiveness and potential uses, are available in their recent publication. Further, all of the design files, including the PCB design, 3D-printer files, and Python code, are available on hackaday.io.
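To give a flavor of the approach, below is a minimal MicroPython sketch (not the published RAT code) of how an OpenMV camera can track a dark mouse on a bright background and toggle a digital output when the animal enters one side of the arena; the threshold values and pin name are placeholders, not RAT’s actual settings or wiring.

```python
# Minimal sketch (not the published RAT code): track a dark mouse on a
# bright background and raise a digital pin when it enters the left half.
import sensor
from pyb import Pin

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)          # 320 x 240
sensor.skip_frames(time=2000)              # let auto-exposure settle

ttl_out = Pin("P0", Pin.OUT_PP)            # placeholder pin, not RAT's BNC wiring
DARK = [(0, 60)]                           # grayscale threshold; tune per arena

while True:
    img = sensor.snapshot()
    blobs = img.find_blobs(DARK, pixels_threshold=200,
                           area_threshold=200, merge=True)
    if blobs:
        mouse = max(blobs, key=lambda b: b.pixels())   # largest dark blob
        ttl_out.value(1 if mouse.cx() < img.width() // 2 else 0)
```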

Read the full article here!

Or check out the project on hackaday.io!


Krynitsky, J., Legaria, A. A., Pai, J. J., Garmendia-Cedillos, M., Salem, G., Pohida, T., & Kravitz, A. V. (2020). Rodent Arena Tracker (RAT): A Machine Vision Rodent Tracking Camera and Closed Loop Control System. eNeuro, 7(3). doi:10.1523/ENEURO.0485-19.2020

 

neurotic

April 30, 2020

Jeffrey P. Gill and colleagues have developed and shared a new toolbox for synchronizing video and neural signals, cleverly named neurotic!


Collecting neural and behavioral data is fundamental to behavioral neuroscience, and the ability to synchronize these data streams is just as important as collecting them in the first place. To make this process a little simpler, Gill et al. developed an open-source option called neurotic, a NEUROscience Tool for Interactive Characterization. This tool is programmed in Python and includes a simple GUI, which makes it accessible for users with little coding experience. Users can read in a variety of file formats for neural data and video, which they can then process, filter, analyze, annotate, and plot. To show its effectiveness across species and signal types, the authors tested the software with Aplysia feeding behavior and human beam walking. Given its open-source nature and strong integration with other popular open-source packages, this software will continue to develop and improve as the community uses it.
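neurotic handles this alignment through its metadata files and GUI; purely to illustrate the underlying bookkeeping (this is not neurotic’s API), mapping a video frame to the corresponding neural sample only requires the frame rate, the sampling rate, and any start-time offset:

```python
# Conceptual sketch of video/neural alignment (not neurotic's API).
def frame_to_sample(frame_index, frame_rate_hz, sample_rate_hz, video_offset_s=0.0):
    """Return the neural-signal sample index corresponding to a video frame."""
    t = frame_index / frame_rate_hz + video_offset_s   # time of the frame (s)
    return round(t * sample_rate_hz)

# e.g. frame 300 of a 30 fps video that starts 1.5 s into a 30 kHz recording
print(frame_to_sample(300, 30.0, 30000.0, video_offset_s=1.5))  # -> 345000
```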

Read more about neurotic here!

Check out the documentation here.


Gill, J. P., Garcia, S., Ting, L. H., Wu, M., & Chiel, H. J. (2020). neurotic: Neuroscience Tool for Interactive Characterization. eNeuro. doi:10.1523/ENEURO.0085-20.2020

Open Source Whisking Video Database

April 2, 2020

Alireza Azarfar and colleagues at the Donders Institute for Brain, Cognition and Behaviour at Radboud University have published an open-source database of high-speed videos of whisking behavior in freely moving rodents.


As responsible citizens, there’s a good chance you are missing the lab and working from home. Maybe you have plenty to do, or maybe you’re looking for some new data to analyze to increase your knowledge of active sensing in rodents! Well, in case you don’t have that data at hand and can’t collect it yourself for a few more weeks, we have a treat for you! Azarfar et al. have shared a database of 6,642 high-quality videos featuring juvenile and adult male rats and adult male mice exploring stationary objects. The dataset covers a wide variety of experimental conditions, including genetic, pharmacological, and sensory-deprivation interventions, to explore how active sensing behavior is modulated by different factors. Information about interventions and experimental conditions is available as a supplementary Excel file.

The videos are available as MP4 files as well as MATLAB matrices that can be converted into a variety of data formats. All videos underwent quality control and feature different amounts of noise and background, which makes them a great tool for mastering video analysis. A toolbox with a variety of whisker-analysis methods, including nose and whisker tracking, is available from this group on GitHub. This database is a great resource for studying sensorimotor computation, top-down mechanisms of sensory navigation, cross-species comparisons of active sensing, and the effects of pharmacological interventions on whisking behavior.
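As a quick start, the MP4s can be read with OpenCV and the MATLAB matrices loaded into NumPy via SciPy; the file and variable names below are placeholders, so consult the database documentation for the real naming scheme.

```python
# Hypothetical loading sketch; file and variable names are placeholders.
import cv2
import numpy as np
from scipy.io import loadmat

# Read frames from one of the MP4 videos
cap = cv2.VideoCapture("session_0001.mp4")
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
cap.release()
video = np.stack(frames)          # (n_frames, height, width)

# Load the corresponding MATLAB matrix and inspect its variables
mat = loadmat("session_0001.mat")
print([k for k in mat if not k.startswith("__")])
```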

Read more about active sensing behavior, data collection methods, and rationale here.

The database is available through GigaScience DB. 

You can also try out their MATLAB Whisker Tracker on GitHub!


OpenMonkeyStudio

February 27, 2020

OpenMonkeyStudio is an amazing new tool for tracking movements by and interactions among freely moving monkeys. Ben Hayden and Jan Zimmerman kindly sent along this summary of the project:

Tracking animal pose (that is, identifying the positions of their major joints) is a major frontier in neuroscience. When combined with neural recordings, pose tracking allows for identifying the relationship between neural activity and movement, as well as decision-making inferred from movement. OpenMonkeyStudio is a system designed to track rhesus macaques moving freely in large environments.

Tracking monkeys is at least an order of magnitude more difficult than tracking mice, flies, and worms. Monkeys are, basically, large furry blobs; they don’t have clear body segmentations. And their movements are much richer and more complex. For these reasons, out-of-the-box systems don’t work with monkeys.

The major innovation of OpenMonkeyStudio is how it tackles the annotation problem. Deep learning systems aren’t very good at generalization. They can replicate things they have seen before or things that are kind of similar to what they have seen. So the important thing is giving them a sufficiently large training set. We ideally want about a million annotated images. That would cost about $10 million, and we don’t have that kind of money. So we use several cool tricks, which we describe in our paper, to augment a small dataset and turn it into a large one. Doing that works very well, and results in a system that can track one or even two interacting monkeys.
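The augmentation tricks themselves are described in the paper; as a generic illustration of the idea (not the authors’ actual pipeline), a single annotated frame can be multiplied into many training samples by applying the same geometric transform to the image and its keypoints:

```python
# Generic keypoint-preserving augmentation sketch (not OpenMonkeyStudio's pipeline).
import cv2
import numpy as np

def augment(image, keypoints, angle_deg, scale=1.0):
    """Rotate/scale an image and apply the same transform to its (x, y) keypoints."""
    h, w = image.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, scale)   # 2x3 affine
    image_aug = cv2.warpAffine(image, M, (w, h))
    pts = np.hstack([keypoints, np.ones((len(keypoints), 1))])      # homogeneous coords
    keypoints_aug = pts @ M.T                                       # (N, 2)
    return image_aug, keypoints_aug

# One labeled frame -> many synthetic training samples
img = np.zeros((480, 640, 3), dtype=np.uint8)        # stand-in image
joints = np.array([[320.0, 240.0], [350.0, 260.0]])  # stand-in 2D joint labels
samples = [augment(img, joints, angle) for angle in range(-30, 31, 10)]
```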


Check out the preprint:

OpenMonkeyStudio: Automated Markerless Pose Estimation in Freely Moving Macaques

Praneet C. Bala, Benjamin R. Eisenreich, Seng Bum Michael Yoo, Benjamin Y. Hayden, Hyun Soo Park, Jan Zimmermann

https://www.biorxiv.org/content/10.1101/2020.01.31.928861v1

Camera Control

February 6, 2020

The Adaptive Motor Control Lab at Harvard recently posted their project, Camera Control, a Python-based camera software GUI, to GitHub.


Camera Control is an open-source software package written by postdoctoral fellow Gary Kane that allows video to be recorded in sync with behavior. The Python GUI and scripts allow investigators to record from multiple Imaging Source camera feeds with associated timestamps for each frame. When used in combination with a NIDAQ card, timestamps from a behavioral task can also be recorded on the falling edge of a TTL signal. This allows video analysis to be paired with physiological recordings, which can be beneficial in assessing behavioral results. The package requires Windows 10, Anaconda, and Git, and is compatible with Imaging Source USB3 cameras. The software is available for download from the lab’s GitHub, where instructions for installation and video recording are provided.
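Camera Control itself wraps the Imaging Source camera interface; as a rough stand-in for readers without that hardware (and not the package’s API), the same record-frames-with-system-timestamps pattern can be sketched with OpenCV and any webcam:

```python
# Conceptual sketch of timestamped video recording (not Camera Control's API).
import csv
import time
import cv2

cap = cv2.VideoCapture(0)                       # any attached camera
fourcc = cv2.VideoWriter_fourcc(*"XVID")
writer = cv2.VideoWriter("session.avi", fourcc, 30.0, (640, 480))

with open("timestamps.csv", "w", newline="") as f:
    ts_writer = csv.writer(f)
    ts_writer.writerow(["frame", "system_time_s"])
    for frame_idx in range(300):                # ~10 s at 30 fps
        ok, frame = cap.read()
        if not ok:
            break
        writer.write(cv2.resize(frame, (640, 480)))
        ts_writer.writerow([frame_idx, time.time()])   # per-frame system timestamp

cap.release()
writer.release()
```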

Find more on Github.


Kane, G. & Mathis, M. (2019). Camera Control: record video and system timestamps from Imaging Source USB3 cameras. GitHub. https://zenodo.org/badge/latestdoi/200101590

SimBA

January 23, 2020

Simon Nilsson from Sam Golden’s lab at the University of Washington recently shared their project SimBA (Simple Behavioral Analysis), an open source pipeline for the analysis of complex social behaviors:


“The manual scoring of rodent social behaviors is time-consuming and subjective, impractical for large datasets, and can be incredibly repetitive and boring. If you spend significant time manually annotating videos of social or solitary behaviors, SimBA is an open-source GUI that can automate the scoring for you. SimBA does not require any specialized equipment or computational expertise.

SimBA uses data from popular open-source tracking tools in combination with a small amount of behavioral annotation to create supervised machine learning classifiers that can then rapidly and accurately score behaviors across different background settings and lighting conditions. Although SimBA was developed and validated for complex social behaviors such as aggression and mating, it has the flexibility to generate classifiers in different environments and for different behavioral modalities. SimBA takes users through a step-by-step process, and we provide detailed installation instructions and tutorials for different use-case scenarios online. SimBA has a range of built-in tools for video pre-processing, accessing third-party tracking models, and evaluating the performance of machine learning classifiers. There are also several methods for in-depth visualization of behavioral patterns. Because of constraints in animal tracking tools, the initial release of SimBA is limited to processing social interactions of differently coat-colored animals recorded from a top-down view; future releases will advance past these limitations. SimBA is very much in active development and a manuscript is in preparation. Meanwhile, we are very keen to hear from users about potential new features that would advance SimBA and help make automated behavioral scoring accessible to more researchers in behavioral neuroscience.”
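Under the hood, the general recipe is to turn pose-tracking output into per-frame features and fit a supervised classifier against a small set of human annotations. A stripped-down sketch of that idea with scikit-learn (illustrative only; SimBA’s real feature set and pipeline are far richer) might look like this:

```python
# Stripped-down sketch of pose-features -> supervised behavior classifier
# (illustrative only; not SimBA's actual feature set or pipeline).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: per-frame features derived from tracked body parts,
# e.g. inter-animal distance and both animals' speeds.
n_frames = 5000
features = rng.normal(size=(n_frames, 3))          # [distance, speed_1, speed_2]
labels = (features[:, 0] < -0.5).astype(int)       # stand-in "attack" annotations

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# New videos can then be scored frame-by-frame from their pose features alone.
predicted = clf.predict(X_test)
```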


For more information on SimBA, you can check out the project’s GitHub page here.

For those looking to contribute, or to try out SimBA and give feedback, you can interact with the developers on the project’s Gitter page.

Plus, take a look at their recent Twitter thread detailing the project.

If you would like to be added to the project’s listserv for updates, fill out this form here.

 

B-SOiD

January 16, 2020

Eric Yttri from Carnegie Mellon University has shared the following about B-SOiD, an open-source unsupervised algorithm for the discovery of spontaneous behaviors:


“Capturing the performance of naturalistic behaviors remains a tantalizing but prohibitively difficult field of study – current methods are difficult, expensive, low temporal resolution, or all of the above. Recent machine learning applications have enabled localization of limb position; however, position alone does not yield behavior. To provide a high-temporal-resolution bridge from positions to actions and their kinematics, we developed Behavioral Segmentation of Open-field In DeepLabCut, or B-SOiD. B-SOiD is an unsupervised learning algorithm that discovers and classifies actions based on the inherent statistics of the data points provided (from any marker or markerless system, not just DeepLabCut). Our algorithm enables the automated segregation of different, sub-second behaviors with a single bottom-up-perspective video camera – and does so without considerable effort or potential bias from the user. This open-source platform opens the door to the efficient study of spontaneous behavior and its neural mechanisms. It also readily provides critical behavioral metrics that historically have been difficult to quantify, such as grooming and stride length in OCD and stroke research.”
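As a conceptual illustration of that bottom-up approach (not the B-SOiD code itself, which builds richer spatiotemporal features and has its own embedding and clustering pipeline), pose-derived features can be embedded and grouped into candidate behaviors without any labels:

```python
# Conceptual sketch of unsupervised behavior discovery from pose features
# (not the B-SOiD implementation).
import numpy as np
from sklearn.manifold import TSNE
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Stand-in per-frame features (e.g. limb distances, angles, speeds)
features = rng.normal(size=(2000, 6))

# Embed to 2-D, then let a mixture model discover clusters = candidate behaviors
embedded = TSNE(n_components=2, random_state=1).fit_transform(
    StandardScaler().fit_transform(features))
gmm = GaussianMixture(n_components=5, random_state=1).fit(embedded)
behavior_labels = gmm.predict(embedded)            # one label per frame
print(np.bincount(behavior_labels))
```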


Code available: https://github.com/YttriLab/B-SOID
Preprint available: https://www.biorxiv.org/content/10.1101/770271v1.full


Oat: online animal tracker

December 5, 2019

Jonathan Newman of the Wilson Lab at Massachusetts Institute of Technology has developed and shared a set of programs for processing video.


The only thing you’ll enjoy more than an oat milk latte from your favorite coffeeshop is Oat, a collection of video processing components designed for use with Linux! Developed by Jonathan Newman of Open Ephys, this set of programs is useful for processing video, extracting object position information, and streaming data. While it was designed for real-time animal position tracking, it can be used in any application that requires real-time object tracking. Each Oat component exposes a standard interface, so components can be chained together to create complex dataflow networks for capturing, processing, and recording video streams.
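Oat’s components are separate programs that pass frames and positions between one another; purely to illustrate the chained-dataflow idea in Python (this is not how Oat is actually invoked), a pipeline can be written as composable stages:

```python
# Toy illustration of the chained-component idea (not Oat's actual interface).
import numpy as np

def frame_source(n_frames, shape=(240, 320)):
    """Pretend camera: yields grayscale frames."""
    rng = np.random.default_rng(2)
    for _ in range(n_frames):
        yield rng.integers(0, 256, size=shape, dtype=np.uint8)

def threshold_filter(frames, level=200):
    """Processing stage: binarize each frame."""
    for frame in frames:
        yield (frame > level).astype(np.uint8)

def position_detector(frames):
    """Detection stage: emit the centroid of above-threshold pixels."""
    for frame in frames:
        ys, xs = np.nonzero(frame)
        yield (xs.mean(), ys.mean()) if len(xs) else None

# Chain the stages, much like piping Oat components together
for position in position_detector(threshold_filter(frame_source(5))):
    print(position)
```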

Read more about Oat on the Open Ephys website, or check out more on GitHub!


https://open-ephys.org/oat

The OpenMV project: Machine Vision with Python

November 14, 2019

OpenMV – Better, Stronger, Faster and only $65 USD


Recent updates to the firmware for the OpenMV H7 Camera have brought new functionality to this device, which is popular for open-source neuroscience projects (e.g., the Rodent Arena Tracker, or RAT: https://hackaday.io/project/162481-rodent-arena-tracker-rat). The new firmware allows the popular TensorFlow Lite library for machine learning to run on this MicroPython-based device. The camera is small (1.5 by 1.75 inches), draws a maximum of only 140 mA when processing data, has 1 MB of RAM and 2 MB of flash, and performs 64-bit computations at 480 MHz (3.84 GB/s of bandwidth). OpenMV is capable of frame differencing, color tracking, marker tracking, face detection, eye tracking, person detection (with TensorFlow Lite), and more. The project supports a very easy-to-use GUI, the OpenMV IDE, which is intuitive and offers a number of ready-to-go example applications. Arduino users will feel right at home, despite the code being Python-based.
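For a taste of the MicroPython workflow, a minimal frame-differencing loop on the OpenMV might resemble the sketch below (modeled loosely on the style of the IDE’s bundled examples; the thresholds are arbitrary and untested on hardware):

```python
# Minimal frame-differencing sketch for the OpenMV; thresholds are arbitrary.
import sensor

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)

# Keep a spare frame buffer holding the static background
background = sensor.alloc_extra_fb(sensor.width(), sensor.height(), sensor.GRAYSCALE)
background.replace(sensor.snapshot())

while True:
    img = sensor.snapshot()
    img.difference(background)                 # pixels that changed vs. background
    changed = img.find_blobs([(25, 255)], pixels_threshold=100, area_threshold=100)
    if changed:
        print("motion at", changed[0].cx(), changed[0].cy())
```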

Check out the project here: https://openmv.io/.

Updates on LocoWhisk and ART

October 3, 2019

Dr Robyn Grant from Manchester Metropolitan University in Manchester, UK, has shared her group’s most recent project, LocoWhisk, a hardware and software solution for measuring rodent exploratory, sensory and motor behaviours:


In describing the project, Dr Grant writes, “Previous studies from our lab have shown that analysing whisker movements and locomotion allows us to quantify the behavioural consequences of sensory, motor and cognitive deficits in rodents. Independent whisker and feet trackers existed, but there was no fully automated, open-source software and hardware solution that could measure both whisker movements and gait.

We developed the LocoWhisk arena and new accompanying software that allow the automatic detection and measurement of both whisker and gait information from high-speed video footage. The arena can easily be made from low-cost materials; it is portable and incorporates both gait analysis (using a pedobarograph) and whisker tracking (using a high-speed video camera and an infrared light source).

The software, ARTv2, is freely available and open source. ARTv2 is fully automated and has been developed from our previous ART software (Automated Rodent Tracker).

ARTv2 contains new whisker and foot detector algorithms. On high-speed video footage of freely moving small mammals (including rat, mouse and opossum), we have found that ARTv2 is comparable in accuracy to, and in some cases significantly better than, readily available software and manual trackers.

The LocoWhisk system enables the collection of quantitative data from whisker movements and locomotion in freely behaving rodents. The software automatically records both whisker and gait information and provides added statistical tools to analyse the data. We hope the LocoWhisk system and software will serve as a solid foundation from which to support future research in whisker and gait analysis.”

For more details on the ARTv2 software, check out the GitHub page here.

Check out the paper that describes LocoWhisk and ARTv2, which has recently been published in the Journal of Neuroscience Methods.

LocoWhisk was initially shared and developed through the NC3Rs CRACK IT website here.