Category: Software

M-Track

September 12, 2017
 
Annalisa Scimemi, of the Department of Biology at SUNY Albany, has shared the following Python-based code, published in PLoS Computational Biology, for tracking the movement of labelled paws in grooming and freely behaving mice.

Traditional approaches to analyzing grooming rely on manually scoring the time of onset and duration of each grooming episode. This type of analysis is time-consuming and provides limited information about finer aspects of grooming behavior, which are important for understanding bilateral coordination in mice. Currently available commercial and freeware video-tracking software allows automated tracking of the whole body of a mouse, or of its head and tail, but not of individual paws. M-Track is an open-source toolbox that allows users to simultaneously track the movement of individual paws during spontaneous grooming episodes and walking in multiple freely behaving mice or rats. It provides a simple platform for trajectory analysis of paw movement and a user-friendly interface that streamlines the analysis of spontaneous grooming in biomedical research studies.
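In broad strokes, this kind of paw tracking can be done by isolating the color of each labelled paw in every video frame and following the centroid of the resulting mask. The sketch below illustrates that general approach in Python with OpenCV; it is not M-Track's actual implementation, and the HSV color bounds and file name are hypothetical values that would need tuning to the dye used.

# A minimal sketch of color-marker paw tracking, not M-Track's own code.
import cv2
import numpy as np

def track_paw(video_path, hsv_lo, hsv_hi):
    """Return a list of (frame_index, x, y) centroids for one colored marker."""
    cap = cv2.VideoCapture(video_path)
    trajectory = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, hsv_lo, hsv_hi)   # isolate the dye color
        m = cv2.moments(mask)
        if m["m00"] > 0:                          # marker visible in this frame
            trajectory.append((frame_idx, m["m10"] / m["m00"], m["m01"] / m["m00"]))
        frame_idx += 1
    cap.release()
    return trajectory

# Example: track a red-labelled paw (bounds and file name are illustrative only).
left_paw = track_paw("grooming.avi", np.array([0, 120, 70]), np.array([10, 255, 255]))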

OMR Arena

September 7, 2017

Researchers at the National Eye Institute and the University of Oldenburg, Germany, have developed the OMR-arena for measuring visual acuity in mice.


The OMR-arena is an automated measurement and stimulation system developed to determine visual thresholds in mice. It uses an optometer to characterize the visual performance of freely moving mice, monitoring head movement with a video-tracking system while presenting appropriate 360° stimuli. The head tracker is used to adjust the stimulus to the head position and to automatically calculate visual acuity. In addition to being open-source and affordable, the device offers researchers an objective way to measure the visual performance of freely moving mice.
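Conceptually, the acuity measurement reduces to two steps: decide whether the head follows the rotating grating, and step the grating's spatial frequency until that optomotor response disappears. The Python sketch below illustrates this logic; the function names and the 0.3 gain criterion are illustrative assumptions, not values taken from the OMR-arena code.

# A sketch of the underlying logic, not the OMR-arena implementation.
import numpy as np

def tracking_gain(head_angle_deg, stim_velocity_dps, dt):
    """Mean head angular velocity expressed as a fraction of stimulus velocity."""
    head_velocity = np.gradient(head_angle_deg, dt)   # deg/s from the head-angle trace
    return float(np.mean(head_velocity)) / stim_velocity_dps

def estimate_acuity(run_trial, freqs_cpd, criterion=0.3):
    """Step up through spatial frequencies (cycles/degree) until the optomotor
    response disappears; the last frequency that still evoked tracking is the
    acuity estimate. run_trial(freq) must run one stimulus epoch and return a gain."""
    acuity = None
    for freq in freqs_cpd:
        if run_trial(freq) >= criterion:   # head still follows the grating
            acuity = freq
        else:
            break
    return acuity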


Kretschmer F, Kretschmer V, Kunze VP, Kretzberg J (2013) OMR-Arena: Automated Measurement and Stimulation System to Determine Mouse Visual Thresholds Based on Optomotor Responses. PLoS ONE 8(11): e78058. https://doi.org/10.1371/journal.pone.0078058

OptiMouse

August 2017

Yoram Ben-Shaul, of the Department of Medical Neurobiology at The Hebrew University, has shared the following about OptiMouse in an article published in BMC Biology:


OptiMouse is open-source software designed to semi-automatically analyze the positions of individual mice, specifically their nose positions, in a behavioral arena, with the goal of minimizing error. The software was designed to provide highly accurate position detection and to make the entire analysis process accessible and free: it has its own graphical user interface and requires no prior programming knowledge. OptiMouse differs from other position-tracking software in that it applies multiple detection algorithms to a single session while allowing seamless integration of custom functions, all controllable through the GUI. It also makes it easy to identify frames with incorrect position detection so that the settings for those frames can be adjusted, producing higher-quality data.
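One way to picture the multiple-algorithm idea is to run several independent position detectors on each frame and flag the frames where they disagree for manual review. The sketch below is a loose Python illustration of that principle, not OptiMouse's MATLAB implementation; the detector functions are hypothetical placeholders.

# Consensus-and-disagreement sketch; detectors here are toy stand-ins.
import numpy as np

def flag_uncertain_frames(frames, detectors, tol_px=5.0):
    """Run every detector on every frame; return consensus (x, y) positions
    and the indices of frames where the detectors disagree by > tol_px."""
    positions, suspects = [], []
    for i, frame in enumerate(frames):
        estimates = np.array([detect(frame) for detect in detectors])  # (n, 2)
        consensus = estimates.mean(axis=0)
        spread = np.max(np.linalg.norm(estimates - consensus, axis=1))
        positions.append(consensus)
        if spread > tol_px:     # detectors disagree: review this frame
            suspects.append(i)
    return np.array(positions), suspects

# Example with two toy "detectors" that each return a fixed point:
frames = [None, None]
detectors = [lambda f: (10.0, 12.0), lambda f: (11.0, 12.5)]
print(flag_uncertain_frames(frames, detectors))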


Ben-Shaul, Y. (2017). OptiMouse: a comprehensive open source program for reliable detection and analysis of mouse body and nose positions. BMC Biology, 15(1). doi:10.1186/s12915-017-0377-3

The GitHub for this project may be found here.

Automated Rodent Tracker (ART)

May 5, 2017

Robyn Grant, from Manchester Metropolitan University, has shared the following with Open Behavior regarding the development of an automated rodent tracking (ART) program:


We have developed a program (ART, available from: http://mwa.bretthewitt.net/downloads.php) that can automatically track rodent position and movement. It is able to track head movements, body movements, and aspects of body size, and it can also identify certain behaviours from video footage, such as rotations, moving forwards, interacting with objects, and staying still. Our program is really flexible, so additional modules can be easily “plugged in”. For example, at the moment it has a manual tracker module, which allows your automatic tracking to be validated with manual tracking points (using MWA: Hewitt, Yap & Grant 2016, Journal of Open Research Software). This versatility means that in the future other modules might be added, such as additional behaviour identifiers or trackers for feet or whiskers.

Our program, ART, is also highly automated. It requires minimal user input, yet performs as well as other trackers that demand extensive manual processing. It can automatically find the video frames in which the mouse is present and track only those frames, or you can specify that it track only when the mouse is locomoting or rotating, for example. We hope that this tracker will form a solid basis from which to investigate rodent behaviour further.
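The “plugged in” modules described above suggest a simple pattern: a core tracker emits per-frame positions, and each module consumes them independently. The Python sketch below illustrates that pattern only; ART itself is a standalone Windows program, and all names here are hypothetical.

# An illustration of a plug-in module pattern, not ART's actual architecture.
class TrackerModule:
    """Base interface: a module consumes the core tracker's per-frame output."""
    def process_frame(self, frame_index, head_xy, body_xy):
        raise NotImplementedError

class StillnessDetector(TrackerModule):
    """Example module: counts frames in which the body barely moves."""
    def __init__(self, threshold_px=2.0):
        self.threshold_px = threshold_px
        self.still_frames = 0
        self._last_body = None

    def process_frame(self, frame_index, head_xy, body_xy):
        if self._last_body is not None:
            dx = body_xy[0] - self._last_body[0]
            dy = body_xy[1] - self._last_body[1]
            if (dx * dx + dy * dy) ** 0.5 < self.threshold_px:
                self.still_frames += 1
        self._last_body = body_xy

def run_modules(per_frame_positions, modules):
    """per_frame_positions: iterable of (head_xy, body_xy) tuples per frame."""
    for i, (head_xy, body_xy) in enumerate(per_frame_positions):
        for module in modules:
            module.process_frame(i, head_xy, body_xy)

# Example: two frames of fake positions fed to one module.
detector = StillnessDetector()
run_modules([((0, 0), (5, 5)), ((0, 1), (5, 6))], [detector])
print(detector.still_frames)   # 1: the body moved only 1 px between frames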

ART may be downloaded here



Pixying Behavior

April 3, 2017

Robert Sachdev, from the NeuroCure Cluster of Excellence, Humboldt-Universität zu Berlin, Germany, has generously shared the following regarding automated optical tracking of animal movement:


“We have developed a method for tracking the motion of whiskers, limbs and whole animals in real-time. We show how to use a plug-and-play Pixy camera to monitor the real-time motion of multiple colored objects and apply the same tools for post-hoc analysis of high-speed video. Our method has major advantages over currently available methods: we can track the motion of multiple adjacent whiskers in real-time, and apply the same methods post-hoc to “recapture” the same motion at a high temporal resolution. Our method is flexible; it can track similarly shaped objects, such as two adjacent whiskers, forepaws, or even two freely moving animals. With this method it becomes possible to use the phase of movement of particular whiskers or a limb to perform closed-loop experiments.”
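As an illustration of the closed-loop idea, the Python sketch below estimates the instantaneous phase of a tracked whisker's horizontal position and finds the samples at which a trigger could fire. This is not the authors' code: it uses an offline Hilbert transform, whereas a real closed-loop system would estimate phase causally, and the target-phase parameters are arbitrary.

# Phase-of-movement sketch for post-hoc analysis of a tracked whisker trace.
import numpy as np
from scipy.signal import hilbert

def whisk_phase(x_positions):
    """Instantaneous phase (radians) of a whisker position trace."""
    centered = np.asarray(x_positions, dtype=float) - np.mean(x_positions)
    return np.angle(hilbert(centered))

def phase_trigger_samples(x_positions, target_phase=0.0, tol=0.1):
    """Indices where the whisk cycle passes within tol radians of target_phase."""
    phase = whisk_phase(x_positions)
    wrapped = np.angle(np.exp(1j * (phase - target_phase)))  # wrap to [-pi, pi]
    return np.where(np.abs(wrapped) < tol)[0]

# Example with a synthetic 8 Hz whisk sampled at 500 Hz.
t = np.arange(0, 2, 1 / 500.0)
x = np.cos(2 * np.pi * 8 * t)
print(phase_trigger_samples(x)[:5])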



Open Ephys

March 17, 2017

Jakob Voigts, from the Massachusetts Institute of Technology, has shared the following regarding www.open-ephys.org. Open Ephys aims to distribute reliable open-source software and tools for extracellular recording and stimulation.


“Open Ephys is a collaborative effort to develop, document, and distribute open-source tools for systems neuroscience. Since the spring of 2011, our main focus has been on creating a multichannel data acquisition system optimized for recordings in freely behaving rodents. However, most of our tools are general enough to be used in applications involving other model organisms and electrode types.

We believe that open-source tools can improve basic scientific research in a variety of ways. They are often less expensive than their closed-source counterparts, making it more affordable to scale up one’s experiments. They are readily modifiable, giving scientists a degree of flexibility that is not usually provided by commercial systems. They are more transparent, which leads to a better understanding of how one’s data is being generated. Finally, by encouraging researchers to document and share tools they would otherwise keep to themselves, the open-source community reduces redundant development efforts, thereby increasing overall scientific productivity.”
– Jakob Voigts

Open Ephys features devices such as the flexDrive, a “chronic drive implant for extracellular electrophysiology”, as well as an Arduino-based tetrode twister and the Pulse Pal, which generates precise voltage pulses. Also featured on Open Ephys is software such as Symphony, a MATLAB-based data acquisition system for electrophysiology.
The Open Ephys GitHub can be found here.

Eco-HAB

February 12, 2017 

Dr. Ewelina Knapska from the Nencki Institute of Experimental Biology in Warsaw, Poland has shared the following regarding Eco-HAB, an RFID-based system for automated tracking: 


Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors.
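As a rough illustration of what such RFID data support, the Python sketch below computes, for every pair of mice, the fraction of time bins the two animals spend in the same compartment. The input format is an assumption made for this example, and the published in-cohort sociability measure additionally corrects for the co-occupancy expected by chance; this sketch does not.

# Pairwise time-together sketch over assumed per-bin compartment logs.
from itertools import combinations

def time_together(compartment_log):
    """compartment_log: dict mapping mouse ID -> list of compartment IDs,
    one entry per time bin. Returns {(mouse_a, mouse_b): fraction of bins
    the pair spent in the same compartment}."""
    fractions = {}
    for a, b in combinations(sorted(compartment_log), 2):
        log_a, log_b = compartment_log[a], compartment_log[b]
        together = sum(ca == cb for ca, cb in zip(log_a, log_b))
        fractions[(a, b)] = together / len(log_a)
    return fractions

# Example: three mice, five time bins, compartments numbered 1-4.
print(time_together({"m1": [1, 1, 2, 3, 3],
                     "m2": [1, 2, 2, 3, 4],
                     "m3": [4, 4, 4, 4, 4]}))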


Janelia Automatic Animal Behavior Annotator (JAABA)

December 30, 2016

Mayank Kabra has shared the following about JAABA, a machine learning-based behavior detection system developed by the Branson Lab at HHMI Janelia Farm.


The Janelia Automatic Animal Behavior Annotator (JAABA) is a machine learning-based system that enables researchers to automatically detect behaviors in video of behaving animals. Users begin by labeling the behavior of the animal, e.g., walking, grooming, or following, in a few of the video frames. Using machine learning, JAABA learns behavior detectors from these labels that can then be used to automatically classify the behaviors of animals in other videos. JAABA has a fast and intuitive graphical user interface and offers multiple ways for users without machine learning expertise to visualize and understand the resulting classifier. JAABA has enabled the extraction of detailed, scientifically meaningful measurements of behavioral effects in large experiments.
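The label-train-apply workflow can be sketched in a few lines. The Python example below uses scikit-learn and synthetic per-frame trajectory features purely as an illustration; JAABA itself is a MATLAB program built around boosting, and none of the names below come from its code.

# Conceptual sketch of sparse labeling followed by per-frame classification.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic per-frame features standing in for tracker output
# (e.g. speed, turning rate, distance to another animal).
rng = np.random.default_rng(0)
features = rng.normal(size=(2000, 3))
true_labels = (features[:, 0] > 0.5).astype(int)   # placeholder ground truth

# The user labels only a small subset of frames in the GUI.
labeled_idx = rng.choice(2000, size=100, replace=False)
clf = GradientBoostingClassifier().fit(features[labeled_idx],
                                       true_labels[labeled_idx])

# The learned detector is then applied to every frame of new videos.
predicted = clf.predict(features)
print(f"flagged {predicted.sum()} of {len(predicted)} frames")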

Behavioral Observation Research Interactive Software (BORIS)

Olivier Friard of the University of Turin has generously shared the following about BORIS:


BORIS is an open-source, multiplatform standalone program that provides a user-specific coding environment for computer-based review of previously recorded videos or live observations. The program lets users define a highly customised, project-based ethogram that can then be shared with collaborators, imported, or modified. Once the coding process is complete, the program can automatically extract a time budget from single or grouped observations and present an at-a-glance summary of the main behavioural features. The observation data and analyses can be exported in various formats (e.g., CSV or XLS), and the behavioural events can also be plotted and exported in various graphic formats (e.g., PNG, EPS, and PDF).
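A downstream analysis of such an export might look like the Python sketch below, which totals the duration of each behaviour per subject from START/STOP event pairs. The column names, file name, and the assumption that every START has a matching STOP are illustrative guesses about the file layout, not a documented BORIS format.

# Time-budget sketch over an assumed event-log export layout.
import pandas as pd

def time_budget(csv_path):
    """Total duration of each behaviour per subject from START/STOP events."""
    events = pd.read_csv(csv_path).sort_values("Time")
    budget = {}
    for (subject, behavior), group in events.groupby(["Subject", "Behavior"]):
        starts = group.loc[group["Status"] == "START", "Time"].to_numpy()
        stops = group.loc[group["Status"] == "STOP", "Time"].to_numpy()
        budget[(subject, behavior)] = float((stops - starts).sum())
    return budget

# Example call (file and columns are assumed): time_budget("observation_export.csv")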

BORIS is currently used around the world for strikingly different approaches to the study of animal and human behaviour.

The latest version of BORIS can be downloaded from www.boris.unito.it, where the manual and some tutorials are also available.



Friard, O. and Gamba, M. (2016). BORIS: a free, versatile open-source event-logging software for video/audio coding and live observations. Methods Ecol Evol, 7: 1325–1330. doi:10.1111/2041-210X.12584

Bonsai

Gonçalo Lopes of the Champalimaud Neuroscience Programme has shared the following about Bonsai:

Bonsai is an open-source visual programming language for composing reactive and asynchronous data streams coming from video cameras, microphones, electrophysiology systems or data acquisition boards. It gives the experimenter an easy way not only to collect data from all these devices simultaneously, but also to quickly and flexibly chain together real-time processing pipelines that can be fed back to actuator systems for closed-loop stimulation using sound, images or other kinds of digital and analog output.
Bonsai is being used around the world to design all kinds of behavioral experiments from eye-tracking to virtual reality, spanning multiple model organisms from fish to humans.
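Bonsai pipelines are composed visually, but the underlying idea of chaining a data source through transforms into a closed-loop sink can be loosely imitated in Python with generators, as in the sketch below. This is only an analogy, not Bonsai's model or API; all functions here are hypothetical and the "camera" is a simulated signal.

# A loose generator-chain analogy of a source -> transform -> sink pipeline.
import itertools
import math

def camera_frames():
    """Source: yield one value per frame (a simulated brightness trace)."""
    for t in itertools.count():
        yield math.sin(t / 10.0)            # stand-in for a real camera frame

def threshold(stream, level=0.8):
    """Transform: map each frame to a boolean detection."""
    for value in stream:
        yield value > level

def trigger_stimulus(stream, n_samples=100):
    """Sink: drive an (imaginary) actuator whenever the detection fires."""
    for i, detected in enumerate(stream):
        if detected:
            print(f"sample {i}: stimulus on")   # closed-loop output would go here
        if i >= n_samples:
            break

trigger_stimulus(threshold(camera_frames()))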

The latest version of Bonsai can be downloaded from BitBucket, along with instructions for quickly setting up a working system.
A public user forum is also available where you can leave your questions and feedback about how to use Bonsai.