Category: Video Analysis

FaceMap: Unsupervised analysis of rodent behaviors

May 15, 2018

We developed a toolbox for video processing of head-fixed rodent behaviors. It extracts the principal components of the animal's movements, either from a single movie or across movies recorded simultaneously. Several regions of interest in the movie (such as the whisker pad or the nose) can be processed at once. We found that the behavioral components from the full face of the mouse predicted up to half of the explainable neural activity across the brain (see https://www.biorxiv.org/content/early/2018/04/22/306019).
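
For readers curious how such movement components can be computed, below is a minimal sketch of the general idea (not FaceMap's actual implementation): take the frame-to-frame motion energy of a movie and extract its principal components with an SVD. The file name and component count are hypothetical.

```python
import numpy as np

# Load a grayscale movie as a (n_frames, height, width) array.
# "movie.npy" is a hypothetical file; FaceMap itself reads standard video formats.
movie = np.load("movie.npy").astype(np.float32)

# Motion energy: absolute difference between consecutive frames,
# flattened so each frame difference is one row.
motion = np.abs(np.diff(movie, axis=0)).reshape(movie.shape[0] - 1, -1)

# Center the data and take the top principal components via SVD.
motion -= motion.mean(axis=0)
U, S, Vt = np.linalg.svd(motion, full_matrices=False)

n_components = 50  # hypothetical choice
pcs = U[:, :n_components] * S[:n_components]  # time course of each component
masks = Vt[:n_components].reshape(n_components, *movie.shape[1:])  # spatial masks
```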

In addition to extracting movement components, the toolbox can compute the pupil area of the rodent using a center-of-mass estimation. In experiments where the mouse sits on a textured surface, the software can also estimate the animal's running speed. The software is available here (https://github.com/carsen-stringer/FaceMap).
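
The pupil measurement can be illustrated in a few lines: pixels darker than a threshold within an eye region are treated as pupil, their centroid gives the pupil position, and their count gives the area. This is a simplified sketch assuming a dark pupil on a brighter background; the threshold value is hypothetical.

```python
import numpy as np

def pupil_com(roi, threshold=50):
    """Estimate pupil center and area from a grayscale eye ROI.

    Pixels darker than `threshold` are treated as pupil
    (an assumed convention; the threshold value is hypothetical).
    """
    mask = roi < threshold
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None, 0.0                  # no pupil pixels detected
    center = (xs.mean(), ys.mean())       # center of mass of pupil pixels
    area = float(mask.sum())              # pupil area in pixels
    return center, area
```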

LocoWhisk: Quantifying rodent exploration and locomotion behaviours

March 8, 2018

Robyn A. Grant, from Manchester Metropolitan University, has shared the following on Twitter regarding the development of the LocoWhisk arena:

“Come help me develop my new arena. Happy to hear from anyone looking to test it or help me develop it further.”

The LocoWhisk system is a new, portable behavioural set-up that incorporates both gait analysis (using a pedobarograph) and whisker movement tracking (using a high-speed video camera and an infrared light source). The system has so far been successfully piloted on many rodent models and would benefit from further validation and commercialisation opportunities.

Learn more here: https://crackit.org.uk/locowhisk-quantifying-rodent-exploration-and-locomotion-behaviours

Article in Nature on monitoring behavior in rodents

An interesting summary of recent methods for monitoring behavior in rodents was published this week in Nature. The article mentions Lex Kravitz and his lab's efforts on the Feeding Experimentation Device (FED), as well as OpenBehavior. Check it out: https://www.nature.com/articles/d41586-018-02403-5

ZebraTrack

December 18, 2017

ZebraTrack is a cost-effective imaging setup for distraction-free behavioral acquisition in zebrafish, with automated tracking and extraction of behavioral endpoints using the open-source ImageJ software. The ImageJ workflow gives users control at key steps while keeping the tracking automated, and it requires no external plugins.
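
While ZebraTrack itself runs inside ImageJ, the underlying idea of background subtraction, thresholding, and centroid extraction can be sketched in Python (a hypothetical illustration, not the published workflow; the threshold value is assumed):

```python
import numpy as np

def track_centroids(frames, threshold=30):
    """Track a single fish per frame by background subtraction.

    `frames`: (n_frames, height, width) grayscale array; the median
    frame serves as a static background (an assumed simplification).
    """
    background = np.median(frames, axis=0)
    centroids = []
    for frame in frames:
        diff = np.abs(frame.astype(np.float32) - background)
        ys, xs = np.nonzero(diff > threshold)
        if len(xs):
            centroids.append((xs.mean(), ys.mean()))
        else:
            centroids.append(None)  # fish not detected in this frame
    return centroids
```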


Nema, S., Hasan, W., Bhargava, A., & Bhargava, Y. (2016). A novel method for automated tracking and quantification of adult zebrafish behaviour during anxiety. Journal of Neuroscience Methods, 271, 65-75. doi:10.1016/j.jneumeth.2016.07.004

 

Pyper

November 28, 2017

Pyper is developed by the Margrie Laboratory.

Pyper provides real-time or pre-recorded motion tracking of a specimen in an open field. It can send TTL pulses based on detection of the specimen within user-defined regions of interest. The software can be used through the command line or through a built-in graphical user interface, and the live feed can be provided by a USB or Raspberry Pi camera.
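
As a rough illustration of the ROI-triggered pulse idea (a sketch, not Pyper's actual code), one could test whether the tracked position falls inside a user-defined rectangle and drive a TTL line from a Raspberry Pi GPIO pin. The pin number, ROI coordinates, and pulse duration below are hypothetical.

```python
import time
import RPi.GPIO as GPIO  # available on a Raspberry Pi

TTL_PIN = 17                   # hypothetical BCM pin driving the TTL line
ROI = (100, 100, 300, 250)     # hypothetical rectangle: x0, y0, x1, y1

GPIO.setmode(GPIO.BCM)
GPIO.setup(TTL_PIN, GPIO.OUT, initial=GPIO.LOW)

def in_roi(x, y, roi=ROI):
    """Return True if the tracked position (x, y) lies inside the ROI."""
    x0, y0, x1, y1 = roi
    return x0 <= x <= x1 and y0 <= y <= y1

def send_ttl(duration=0.005):
    """Emit a single TTL pulse of `duration` seconds."""
    GPIO.output(TTL_PIN, GPIO.HIGH)
    time.sleep(duration)
    GPIO.output(TTL_PIN, GPIO.LOW)

# In the tracking loop, fire a pulse whenever the specimen enters the ROI:
# if in_roi(x, y): send_ttl()
```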

Example of Pyper tracking a mouse in an open field


Find more information here.

Manual for Pyper.

3DTracker – 3D video tracking system for animal behavior

November 8, 2017

Jumpei Matsumoto has submitted the following to OpenBehavior regarding 3DTracker, a 3D video tracking system for animal behavior.


3DTracker-FAB is open-source software for 3D-video-based markerless behavioral analysis of laboratory animals (currently mice and rats). The software uses multiple depth cameras to reconstruct a full 3D image of the animals and fits skeletal models to the 3D image to estimate the animals' 3D pose.
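
To give a flavor of the multi-camera reconstruction step (a simplified sketch, not 3DTracker-FAB's implementation), depth images can be back-projected into 3D points using each camera's intrinsics and then merged into a single world-frame point cloud via calibrated camera-to-world transforms. All parameters here are assumed to come from a prior calibration.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into camera-frame 3D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

def merge_clouds(depth_images, intrinsics, extrinsics):
    """Merge per-camera point clouds into one world-frame cloud.

    `extrinsics[i]` is a 4x4 camera-to-world matrix from calibration
    (hypothetical input; calibration itself is not shown here).
    """
    clouds = []
    for depth, (fx, fy, cx, cy), T in zip(depth_images, intrinsics, extrinsics):
        pts = depth_to_points(depth, fx, fy, cx, cy)
        pts_h = np.c_[pts, np.ones(len(pts))]   # homogeneous coordinates
        clouds.append((pts_h @ T.T)[:, :3])     # transform to world frame
    return np.vstack(clouds)
```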


More information on 3DTracker may be found on the system's website, www.3dtracker.org.

Additionally, a dynamic poster on the system will be presented on November 12, 2017 at the Society for Neuroscience annual meeting. Click here for more information.