Category: Software

3DTracker – 3D video tracking system for animal behavior

November 8, 2017

Jumpei Matsumoto has submitted the following to OpenBehavior regarding 3DTracker, a 3D video tracking system for animal behavior.


3DTracker-FAB is open-source software for markerless, 3D-video-based behavioral analysis of laboratory animals (currently mice and rats). The software uses multiple depth cameras to reconstruct a full 3D image of the animals and fits skeletal models to that image to estimate each animal's 3D pose.
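A first step in any multi-depth-camera pipeline like the one described is bringing every camera's point cloud into a shared arena coordinate frame. The sketch below is illustrative only, not 3DTracker's actual code, and assumes each camera's extrinsic pose is already known as a rotation matrix and translation vector:

```python
# Minimal sketch (not 3DTracker's code): merge point clouds from multiple
# depth cameras into one arena-centered cloud, assuming each camera's
# extrinsics are known as a 3x3 rotation R and a translation t.

def transform_point(point, R, t):
    """Apply a rigid transform: p' = R @ p + t (plain lists, no numpy)."""
    return [
        sum(R[i][j] * point[j] for j in range(3)) + t[i]
        for i in range(3)
    ]

def merge_clouds(clouds, extrinsics):
    """Map every camera-local point into the shared arena frame."""
    merged = []
    for cloud, (R, t) in zip(clouds, extrinsics):
        merged.extend(transform_point(p, R, t) for p in cloud)
    return merged

# Example: two cameras, identity rotation, offset translations.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
cloud_a = [[0.0, 0.0, 0.5]]
cloud_b = [[0.0, 0.0, 0.5]]
merged = merge_clouds([cloud_a, cloud_b], [(I, [0, 0, 0]), (I, [1, 0, 0])])
# merged -> [[0.0, 0.0, 0.5], [1.0, 0.0, 0.5]]
```

Fitting a skeletal model to the merged cloud is a far harder optimization problem; this only shows the reconstruction-by-registration idea.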

More information on 3D tracker may be found on the system’s website, www.3dtracker.org

Additionally, a dynamic poster on the system will be presented on November 12th at the Society for Neuroscience annual meeting. Click here for more information.

Autonomous Training of a Forelimb Motor Task

November 3, 2017

Greg Silasi, from the University of Ottawa, has kindly contributed the following to OpenBehavior.


“Silasi et al developed a low-cost system for fully autonomous training of group housed mice on a forelimb motor task. We demonstrate the feasibility of tracking both end-point as well as kinematic performance of individual mice, each performing thousands of trials over 2.5 months. The task is run and controlled by a Raspberry Pi microcomputer, which allows for cages to be monitored remotely through an active internet connection.”
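The trial logic such a Raspberry Pi-based system might run can be sketched as follows. This is an assumption-laden illustration, not the published code: identifying group-housed animals by RFID tag and scoring a trial by a lever-displacement threshold are both hypothetical choices here.

```python
# Hedged sketch (not Silasi et al.'s code) of autonomous home-cage trial
# logic: identify the animal (assumed here: by RFID tag), score a trial as
# successful if the lever crosses a displacement threshold, and keep
# per-animal counts so group-housed mice can train without supervision.

class AutonomousTask:
    def __init__(self, threshold_mm=2.0):
        self.threshold_mm = threshold_mm
        self.trials = {}  # tag -> [successes, total]

    def run_trial(self, tag, lever_trace_mm):
        """Score one trial from a list of lever-displacement samples (mm)."""
        success = max(lever_trace_mm) >= self.threshold_mm
        stats = self.trials.setdefault(tag, [0, 0])
        stats[0] += int(success)
        stats[1] += 1
        return success

task = AutonomousTask(threshold_mm=2.0)
task.run_trial("mouse_A", [0.1, 1.5, 2.4, 0.3])  # peaks at 2.4 mm: success
task.run_trial("mouse_A", [0.0, 0.8, 1.1])       # peaks at 1.1 mm: failure
# task.trials["mouse_A"] -> [1, 2]
```

On actual hardware, the lever trace would come from an ADC read in a GPIO polling loop rather than a hard-coded list.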

Click here to submit a piece of open-source software or hardware to OpenBehavior.

Mousetrap: An integrated, open-source computer mouse-tracking package

October 31, 2016

Mousetrap, an open-source software plugin to record and analyze mouse movements in computerized lab experiments, was developed by Pascal Kieslich and Felix Henninger, both located in Germany.


Mousetrap is a plugin for the OpenSesame experiment builder that records mouse movements during computerized lab experiments; these movements can serve as an indicator of commitment or conflict in decision making. Because Mousetrap integrates with a general-purpose graphical experiment builder, users also have access to OpenSesame's other core features and software extensions, which offers more flexibility when designing experiments. Mousetrap runs on all platforms (Linux, Windows, and Mac), and data collected with the software can be imported directly into R for analysis with the accompanying mousetrap R package.
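A core step in mouse-tracking analysis of the kind the R package supports is time normalization: resampling each recorded cursor trajectory to a fixed number of time steps so trajectories of different durations can be averaged and compared. The sketch below shows the idea in Python; it mirrors the concept, not the package's implementation.

```python
# Hedged sketch of time normalization for mouse-tracking data (the concept
# behind averaging cursor trajectories of unequal duration; not the
# mousetrap R package's code): linearly resample a coordinate series onto a
# fixed number of equally spaced time steps.

def time_normalize(xs, n_steps=5):
    """Linearly interpolate a 1-D coordinate series onto n_steps points."""
    m = len(xs)
    out = []
    for k in range(n_steps):
        pos = k * (m - 1) / (n_steps - 1)   # fractional index into xs
        i = int(pos)
        frac = pos - i
        if i + 1 < m:
            out.append(xs[i] * (1 - frac) + xs[i + 1] * frac)
        else:
            out.append(xs[-1])
    return out

# A 3-sample x-trajectory stretched onto 5 equally spaced time steps:
print(time_normalize([0.0, 1.0, 4.0], n_steps=5))  # [0.0, 0.5, 1.0, 2.5, 4.0]
```

With every trajectory on the same time base, per-step averages across trials become meaningful.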

The GitHub for this project may be found here.

 

M-Track

September 12, 2017
 
Annalisa Scimemi, of the Department of Biology at SUNY Albany, has shared the following Python-based code for tracking the movement of labelled paws in grooming and freely behaving mice, described in an article published in PLoS Computational Biology.

Traditional approaches to analyzing grooming rely on manually scoring the onset time and duration of each grooming episode. This type of analysis is time-consuming and provides limited information about finer aspects of grooming behavior, which are important for understanding bilateral coordination in mice. Currently available commercial and freeware video-tracking software allows automated tracking of the whole body of a mouse, or of its head and tail, but not of individual paws. M-Track is open-source code that lets users simultaneously track the movement of individual paws during spontaneous grooming episodes and walking in multiple freely behaving mice or rats. The toolbox provides a simple platform for trajectory analysis of paw movement and a user-friendly interface that streamlines the analysis of spontaneous grooming in biomedical research studies.
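Once per-frame paw coordinates are available, the trajectory analysis mentioned above reduces to simple geometry. The sketch below is an assumption, not M-Track's code: it computes total path length and mean per-frame speed from hypothetical (x, y) samples.

```python
# Minimal sketch (assumed analysis, not M-Track's source) of paw trajectory
# analysis: given per-frame (x, y) positions of one labelled paw, compute
# total path length and mean per-frame speed.
import math

def path_length(points):
    """Sum of Euclidean distances between consecutive (x, y) samples."""
    return sum(
        math.dist(points[i], points[i + 1])
        for i in range(len(points) - 1)
    )

paw = [(0.0, 0.0), (3.0, 4.0), (3.0, 10.0)]   # hypothetical positions (px)
total = path_length(paw)             # 5.0 + 6.0 = 11.0 pixels
mean_speed = total / (len(paw) - 1)  # 5.5 pixels per frame
```

Dividing by the camera frame rate and a pixels-per-millimetre calibration would turn these into physical units.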

OMR Arena

September 7, 2017

Researchers at the National Eye Institute and the University of Oldenburg, Germany, have developed the OMR-arena for measuring visual acuity in mice.


The OMR-arena is an automated measurement and stimulation system developed to determine visual thresholds in mice. It uses an optometer to characterize the visual performance of freely moving mice, video-tracking their head movements while presenting appropriate 360° stimuli. The head tracker adjusts the stimulus to the current head position and automatically calculates visual acuity. Being open-source and affordable, the device offers researchers an objective way to measure the visual performance of freely moving mice.
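The acuity calculation can be sketched in back-of-envelope form. The formula below assumes the head sits at the centre of the stimulus drum (the published system's head tracker corrects for off-centre positions, which is not modelled here): a grating with C cycles around the full 360° drum subtends C/360 cycles per degree, and acuity is the highest spatial frequency that still drives an optomotor response.

```python
# Hedged sketch (assumed geometry, not the OMR-arena code): convert a drum
# grating to cycles per degree for a centred head, then take acuity as the
# highest spatial frequency with a detected optomotor response.

def spatial_frequency(cycles_per_revolution):
    """Cycles per degree for a mouse at the centre of the stimulus drum."""
    return cycles_per_revolution / 360.0

def acuity_threshold(responses):
    """responses maps cycles-per-revolution -> True/False head tracking.
    Returns the highest tracked spatial frequency in cycles per degree."""
    tracked = [c for c, r in responses.items() if r]
    return spatial_frequency(max(tracked)) if tracked else 0.0

# Hypothetical session: the mouse tracks gratings up to 144 cycles/rev.
session = {36: True, 72: True, 144: True, 216: False}
print(acuity_threshold(session))  # 0.4 cycles per degree
```

A threshold near 0.4 cycles per degree is in the range typically reported for mouse visual acuity.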


Kretschmer F, Kretschmer V, Kunze VP, Kretzberg J (2013) OMR-Arena: Automated Measurement and Stimulation System to Determine Mouse Visual Thresholds Based on Optomotor Responses. PLoS ONE 8(11): e78058. https://doi.org/10.1371/journal.pone.0078058

OptiMouse

August 2017

Yoram Ben-Shaul, of the Department of Medical Neurobiology at The Hebrew University, has shared the following about OptiMouse, described in an article published in BMC Biology:


OptiMouse is open-source software designed to semi-automatically analyze the positions of individual mice, specifically their nose positions, in a behavioral arena, with the goal of minimizing error. It was designed to provide highly accurate position detection and to make the entire analysis process accessible and free: it has its own graphical user interface and requires no prior programming knowledge. OptiMouse differs from other position-tracking software in that it applies multiple detection algorithms to a single session while allowing seamless integration of custom functions, all controllable through the GUI. The software also makes it easy to identify frames with incorrect position detection, so the settings for those frames can be adjusted, producing higher-quality data.
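The central idea of running several detection algorithms per session can be sketched schematically. This is an illustration of the concept, not OptiMouse's implementation; the algorithm names and the confidence scores are invented for the example.

```python
# Schematic sketch (not OptiMouse's code) of per-frame selection among
# multiple position-detection algorithms: keep the highest-confidence
# result, and flag low-confidence frames for manual review.

def best_detection(frame_results, min_confidence=0.5):
    """frame_results: list of (algorithm_name, (x, y), confidence)."""
    name, pos, conf = max(frame_results, key=lambda r: r[2])
    flagged = conf < min_confidence          # needs manual inspection
    return name, pos, flagged

frame = [
    ("threshold", (120, 85), 0.62),
    ("background_subtraction", (118, 84), 0.91),
    ("edge_based", (140, 60), 0.30),
]
print(best_detection(frame))  # ('background_subtraction', (118, 84), False)
```

Flagging, rather than silently accepting, uncertain frames is what lets a user revisit exactly the frames where detection settings need adjustment.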


Ben-Shaul, Y. (2017). OptiMouse: a comprehensive open source program for reliable detection and analysis of mouse body and nose positions. BMC Biology, 15(1). doi:10.1186/s12915-017-0377-3

The GitHub for this project may be found here.

Automated Rodent Tracker (ART)

May 5, 2017

Robyn Grant, from Manchester Metropolitan University, has shared the following with Open Behavior regarding the development of an automated rodent tracking (ART) program:


We have developed a program (ART, available from: http://mwa.bretthewitt.net/downloads.php) that can automatically track rodent position and movement. It is able to track head movements, body movements and also aspects of body size. It is able to identify certain behaviours from video footage too, such as rotations, moving forwards, interacting with objects and staying still. Our program is really flexible, so it can have additional modules that can be easily “plugged in”. For example, at the moment, it has a manual tracker module, which allows for your automatic tracking to be validated with manual tracking points (using MWA: Hewitt, Yap & Grant 2016, Journal of Open Research Software). This versatility means that in the future other modules might be added, such as additional behaviour identifiers, or other trackers such as for feet or whiskers.

Our program, ART, is also very automatic. It has minimal user input, but still performs as well as other trackers that require a lot of manual processing. It can automatically find video frames where the mouse is present and will only track these frames; or you can specify to track only when the mouse is locomoting, or rotating, for example. We hope that this tracker will form a solid basis from which to investigate rodent behaviour further.
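The idea of tracking only frames where the animal is locomoting can be sketched simply. The logic below is an assumption for illustration, not ART's source: it classifies a frame as locomotion when the body centroid's displacement since the previous frame exceeds a movement threshold.

```python
# Hedged sketch (assumed logic, not ART's code) of restricting analysis to
# locomotion frames: keep only frames where the body centroid moved more
# than a threshold since the previous frame.
import math

def locomoting_frames(centroids, threshold_px=2.0):
    """Return indices of frames whose centroid displacement exceeds threshold."""
    return [
        i for i in range(1, len(centroids))
        if math.dist(centroids[i - 1], centroids[i]) > threshold_px
    ]

track = [(0, 0), (0.5, 0), (4, 0), (9, 0), (9.2, 0)]   # hypothetical centroids
print(locomoting_frames(track))  # [2, 3] - only the two large steps count
```

The same displacement series, with a sign or angle attached, could feed the rotation and forward-movement classifiers the authors describe.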

ART may be downloaded here



Pixying Behavior

April 3, 2017

Robert Sachdev, from the NeuroCure Cluster of Excellence, Humboldt-Universität zu Berlin, Germany, has generously shared the following regarding automated optical tracking of animal movement:


“We have developed a method for tracking the motion of whiskers, limbs and whole animals in real-time. We show how to use a plug and play Pixy camera to monitor the real-time motion of multiple colored objects and apply the same tools for post-hoc analysis of high-speed video. Our method has major advantages over currently available methods: we can track the motion of multiple adjacent whiskers in real-time, and apply the same methods post-hoc, to “recapture” the same motion at a high temporal resolution.  Our method is flexible; it can track objects that are similarly shaped like two adjacent whiskers, forepaws or even two freely moving animals. With this method it becomes possible to use the phase of movement of particular whiskers or a limb to perform closed-loop experiments.”
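The colour-blob tracking a Pixy camera performs onboard can be illustrated in toy form. The sketch below is illustrative only, not the authors' pipeline: it labels target-colour pixels in a tiny synthetic frame and reports the blob centroid, which is the kind of per-object signature streamed out in real time.

```python
# Toy sketch (not the published pipeline) of colour-blob tracking: find all
# pixels matching a target colour label and return the blob centroid.

def track_colour(frame, target):
    """frame: 2-D grid of colour labels; returns centroid of target pixels."""
    hits = [
        (x, y)
        for y, row in enumerate(frame)
        for x, label in enumerate(row)
        if label == target
    ]
    if not hits:
        return None
    n = len(hits)
    return (sum(x for x, _ in hits) / n, sum(y for _, y in hits) / n)

# A 3x4 synthetic frame where 'G' marks a green-painted whisker:
frame = [
    [".", "G", "G", "."],
    [".", "G", "G", "."],
    [".", ".", ".", "."],
]
print(track_colour(frame, "G"))  # (1.5, 0.5)
```

Tracking several differently coloured objects is then just repeated calls with different targets, which is what makes multiple adjacent whiskers separable.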



Open Ephys

March 17, 2017

Jakob Voigts, from the Massachusetts Institute of Technology, has shared the following regarding www.open-ephys.org. Open Ephys aims to distribute reliable open source software as well as tools for extracellular recording and stimulation.


“Open Ephys is a collaborative effort to develop, document, and distribute open-source tools for systems neuroscience. Since the spring of 2011, our main focus has been on creating a multichannel data acquisition system optimized for recordings in freely behaving rodents. However, most of our tools are general enough to be used in applications involving other model organisms and electrode types.

We believe that open-source tools can improve basic scientific research in a variety of ways. They are often less expensive than their closed-source counterparts, making it more affordable to scale up one’s experiments. They are readily modifiable, giving scientists a degree of flexibility that is not usually provided by commercial systems. They are more transparent, which leads to a better understanding of how one’s data is being generated. Finally, by encouraging researchers to document and share tools they would otherwise keep to themselves, the open-source community reduces redundant development efforts, thereby increasing overall scientific productivity.”
– Jakob Voigts

Open Ephys features devices such as the flexDrive, a “chronic drive implant for extracellular electrophysiology”, an Arduino-based tetrode twister, and the Pulse Pal, which generates precise voltage pulses. Also featured on Open Ephys is software such as Symphony, a MATLAB-based data acquisition system for electrophysiology.
The Open Ephys GitHub can be found here.

Eco-HAB

February 12, 2017 

Dr. Ewelina Knapska from the Nencki Institute of Experimental Biology in Warsaw, Poland has shared the following regarding Eco-HAB, an RFID-based system for automated tracking: 


Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors.
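One of the measures described above, time voluntarily spent together, can be sketched from RFID-derived occupancy data. The data layout below is an assumption for illustration, not Eco-HAB's format: per-mouse compartment IDs binned over time, from which each pair's co-occupancy fraction is computed.

```python
# Simplified sketch (assumed data layout, not Eco-HAB's software) of an
# in-cohort sociability measure: from per-mouse compartment occupancy over
# time bins, compute the fraction of time each pair spends together.
from itertools import combinations

def pairwise_cooccupancy(occupancy):
    """occupancy: dict mouse -> list of compartment IDs, one per time bin.
    Returns dict (mouse_a, mouse_b) -> fraction of bins in the same place."""
    result = {}
    for a, b in combinations(sorted(occupancy), 2):
        together = sum(ca == cb for ca, cb in zip(occupancy[a], occupancy[b]))
        result[(a, b)] = together / len(occupancy[a])
    return result

logs = {
    "m1": [1, 1, 2, 2],
    "m2": [1, 1, 1, 2],
    "m3": [3, 4, 3, 4],
}
print(pairwise_cooccupancy(logs))
# {('m1', 'm2'): 0.75, ('m1', 'm3'): 0.0, ('m2', 'm3'): 0.0}
```

Comparing such observed fractions against the chance level expected from each mouse's individual compartment preferences is what turns raw co-occupancy into a sociability score.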