Category: Software

MAPLE: a Modular Automated Platform for Large-Scale Experiments

January 8th, 2018 
The de Bivort lab and FlySorter, LLC are happy to share on OpenBehavior their open-source Drosophila handling platform, called MAPLE: Modular Automated Platform for Large-Scale Experiments.

Drosophila melanogaster has proven a valuable genetic model organism thanks to its rapid reproduction, low maintenance requirements, and extensively documented genetics. However, the tedious chore of handling flies and phenotyping them manually remains a bottleneck for data collection. MAPLE, a Modular Automated Platform for Large-Scale Experiments, addresses this limitation.

MAPLE is a Drosophila-handling robot with a modular design, allowing the platform both to automate diverse phenotyping assays and to help with routine lab chores (e.g., collecting virgin female flies). MAPLE permits a small-part manipulator, a USB digital camera, and a fly manipulator to work simultaneously over the platform holding the flies. Failsafe mechanisms let users leave MAPLE unattended without risking damage to the platform or its modules.

The physical platform integrates phenotyping and animal husbandry, enabling end-to-end experimental protocols. MAPLE features a large, physically open workspace for user convenience, and its sides, top, and bottom are made of clear acrylic, allowing optical phenotyping at any time except when the end-effector carriages are above the modules. Finally, its low cost (roughly $3,500, versus hundreds of thousands of dollars for a commercial “fly-flipping” robot) and scalability make large-scale experiments practical.

MAPLE’s utility and versatility were demonstrated through the execution of two tasks: collection of virgin female flies, and a large-scale longitudinal measurement of fly social networks and behavior.

Links to materials:

CAD files

Control Software

Raw data and analysis scripts 

De Bivort Lab Site 


ArControl: Arduino Control Platform

January 3rd, 2018

The following behavioral platform was developed and published by Xinfeng Chen and Haohong Li of Huazhong University of Science and Technology, Wuhan, China.

ArControl (Arduino Control Platform) is a comprehensive behavioral platform developed to deliver stimuli and monitor responses. This easy-to-use, high-performance system combines an Arduino UNO board and a simple drive circuit with a stand-alone GUI application. Experimental data are recorded automatically by the built-in data acquisition function, and the entire behavioral schedule is stored on the Arduino chip. Collectively, this makes ArControl a “genuine, real-time system with high temporal resolution”. Chen and Li have tested ArControl using a Go/No-Go task and a probabilistic switching behavior task, and their results show that ArControl is a reliable system for behavioral research.
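
For readers unfamiliar with this division of labor, the minimal Python sketch below illustrates the general idea: the Arduino executes the behavioral schedule and streams event codes, while the PC merely logs them. The serial port, baud rate, "event,timestamp" line format, and END marker are illustrative assumptions, not ArControl's actual protocol; see the repository linked below for the real implementation.

    # Host-side logger sketch (assumed protocol, for illustration only):
    # the Arduino streams lines like "LICK,153200" and "END" when done.
    import csv
    import serial  # pyserial

    PORT = "/dev/ttyACM0"   # adjust for your system (e.g., "COM3" on Windows)
    BAUD = 115200           # must match the rate set in the Arduino sketch

    board = serial.Serial(PORT, BAUD, timeout=1)
    with open("session_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["event_code", "arduino_time_ms"])
        while True:
            line = board.readline().decode("ascii", errors="ignore").strip()
            if line == "END":          # hypothetical end-of-session marker
                break
            if "," not in line:
                continue               # read timeout or malformed line; keep waiting
            event, t_ms = line.split(",", 1)
            writer.writerow([event, t_ms])
    board.close()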

Source code and PCB drafts may be found here: ArControl GitHub




ZebraTrack

December 18, 2017

ZebraTrack is a cost-effective imaging setup for distraction-free behavioral acquisition in zebrafish, with automated tracking and extraction of behavioral endpoints using the open-source ImageJ software. The ImageJ workflow gives users control at key steps while keeping the tracking itself automated, and it requires no external plugins.
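
As a rough illustration of the kind of pipeline ZebraTrack automates (background subtraction, thresholding, centroid extraction, and computation of behavioral endpoints), here is a minimal Python/OpenCV sketch. It is not the published ImageJ workflow, and the file name, filter settings, and pixel-to-centimeter calibration are assumptions.

    # Generic tracking sketch: segment the moving fish, record its centroid
    # per frame, then compute a simple endpoint (distance traveled).
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("zebrafish_arena.avi")
    bg = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16)

    PX_PER_CM = 20.0          # assumed spatial calibration
    centroids = []

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        mask = bg.apply(gray)                       # foreground = moving fish
        mask = cv2.medianBlur(mask, 5)              # suppress speckle noise
        m = cv2.moments(mask, binaryImage=True)
        if m["m00"] > 0:                            # fish detected this frame
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))

    cap.release()

    # Example behavioral endpoint: total distance traveled, in cm
    steps = np.diff(np.array(centroids), axis=0)
    distance_cm = np.sum(np.hypot(steps[:, 0], steps[:, 1])) / PX_PER_CM
    print(f"distance traveled: {distance_cm:.1f} cm")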

Nema, S., Hasan, W., Bhargava, A., & Bhargava, Y. (2016). A novel method for automated tracking and quantification of adult zebrafish behaviour during anxiety. Journal of Neuroscience Methods, 271, 65-75. doi:10.1016/j.jneumeth.2016.07.004



Pyper

November 28, 2017

Pyper was developed by the Margrie Laboratory.

Pyper tracks the motion of a specimen in an open field, either in real time or from pre-recorded video. It can send TTL pulses when the specimen is detected within user-defined regions of interest. The software can be used from the command line or through a built-in graphical user interface, and the live feed can come from a USB or Raspberry Pi camera.
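
To make the ROI-to-TTL idea concrete, here is a minimal Python sketch of the general approach: pulse a Raspberry Pi GPIO line whenever the tracked position enters a region of interest. The pin number, ROI coordinates, pulse width, and demo trajectory are placeholders and do not reflect Pyper's actual API.

    # ROI-triggered TTL sketch (illustrative values throughout)
    import time
    import RPi.GPIO as GPIO

    TTL_PIN = 17                      # assumed BCM pin wired to the recording system
    ROI = (100, 100, 300, 250)        # x_min, y_min, x_max, y_max (pixels)

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TTL_PIN, GPIO.OUT, initial=GPIO.LOW)

    def in_roi(x, y, roi):
        x0, y0, x1, y1 = roi
        return x0 <= x <= x1 and y0 <= y <= y1

    def send_ttl(width_s=0.005):
        """Drive the line high briefly; downstream hardware sees a TTL pulse."""
        GPIO.output(TTL_PIN, GPIO.HIGH)
        time.sleep(width_s)
        GPIO.output(TTL_PIN, GPIO.LOW)

    # Demo trajectory standing in for live tracker output, one (x, y) per frame
    trajectory = [(50, 80), (120, 150), (200, 200), (350, 300), (150, 180)]

    was_inside = False
    for x, y in trajectory:
        inside = in_roi(x, y, ROI)
        if inside and not was_inside:   # pulse only on ROI entry, not every frame
            send_ttl()
        was_inside = inside
        time.sleep(0.04)                # ~25 fps frame interval

    GPIO.cleanup()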

Example of Pyper tracking a mouse in an open field

Find more information here.

Manual for Pyper.

3DTracker – 3D video tracking system for animal behavior

November 8th, 2017

Jumpei Matsumoto has submitted the following to OpenBehavior regarding 3DTracker, a 3D video tracking system for animal behavior.

3DTracker-FAB is open-source software for 3D-video-based, markerless, computerized behavioral analysis of laboratory animals (currently mice and rats). The software uses multiple depth cameras to reconstruct a full 3D image of each animal and fits skeletal models to the 3D image to estimate the animals' 3D pose.
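
As a simple illustration of the first step in this kind of pipeline (bringing several calibrated depth cameras into a common coordinate frame before any model fitting), here is a short numpy sketch. The camera poses and point clouds below are made-up placeholders; the real software estimates the poses during calibration, and this is a conceptual sketch rather than 3DTracker-FAB's implementation.

    # Merge point clouds from multiple depth cameras into one world frame
    import numpy as np

    def merge_point_clouds(clouds, extrinsics):
        """clouds: list of (N_i, 3) arrays in each camera's coordinates.
        extrinsics: list of (R, t) pairs mapping each camera into the world frame."""
        world_points = []
        for pts, (R, t) in zip(clouds, extrinsics):
            world_points.append(pts @ R.T + t)     # rigid transform per camera
        return np.vstack(world_points)

    # Two toy "cameras": identity pose, and a 90-degree rotation about z plus an offset
    R0, t0 = np.eye(3), np.zeros(3)
    R1 = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
    t1 = np.array([0.5, 0.0, 0.0])

    cloud0 = np.random.rand(1000, 3)               # stand-ins for depth-camera output
    cloud1 = np.random.rand(1000, 3)

    merged = merge_point_clouds([cloud0, cloud1], [(R0, t0), (R1, t1)])
    print(merged.shape)                             # (2000, 3): one combined cloud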

More information on 3DTracker may be found on the system’s website.

Additionally, a dynamic poster on the system was presented on November 12, 2017 at the Society for Neuroscience annual meeting. Click here for more information.

Autonomous Training of a Forelimb Motor Task

November 3, 2017

Greg Silasi, from the University of Ottawa, has kindly contributed the following to OpenBehavior.

“Silasi et al developed a low-cost system for fully autonomous training of group housed mice on a forelimb motor task. We demonstrate the feasibility of tracking both end-point as well as kinematic performance of individual mice, each performing thousands of trials over 2.5 months. The task is run and controlled by a Raspberry Pi microcomputer, which allows for cages to be monitored remotely through an active internet connection.”


Mousetrap: An integrated, open-source computer mouse-tracking package

October 31, 2016

Mousetrap, an open-source software plugin to record and analyze mouse movements in computerized lab experiments, was developed by Pascal Kieslich and Felix Henninger, both located in Germany.

Mousetrap is a plugin for the OpenSesame experiment builder that records mouse movements during computerized lab experiments; these movements can serve as an indicator of commitment or conflict in decision making. Integrating Mousetrap with a general-purpose graphical experiment builder also gives users access to OpenSesame's other core features and software extensions, offering more flexibility when designing experiments. Mousetrap runs on all major platforms (Linux, Windows, and Mac), and the collected data can be imported directly into R for analysis with the accompanying Mousetrap package.
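
As an illustration of the kind of measure mouse-tracking analyses produce, the short Python sketch below computes the maximum absolute deviation (MAD) of a cursor path from the straight line between its start and end points, a common index of attraction toward the non-chosen option. It is a generic sketch with made-up coordinates, not the Mousetrap package's implementation.

    # Maximum absolute deviation of a cursor trajectory from the direct path
    import numpy as np

    def max_abs_deviation(xy):
        """xy: (N, 2) array of cursor samples from movement start to response."""
        xy = np.asarray(xy, dtype=float)
        start, end = xy[0], xy[-1]
        direction = end - start
        length = np.linalg.norm(direction)
        # Perpendicular distance of each sample from the start->end line
        # (the 2D cross product gives the area of the spanned parallelogram).
        rel = xy - start
        cross = rel[:, 0] * direction[1] - rel[:, 1] * direction[0]
        return (np.abs(cross) / length).max()

    # Toy trajectory that bows away from the direct path before settling
    trajectory = [(0, 0), (10, 40), (35, 90), (80, 120), (150, 130), (200, 130)]
    print(f"MAD = {max_abs_deviation(trajectory):.1f} px")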

The GitHub for this project may be found here.



M-Track

September 12, 2017

Annalisa Scimemi, of the Department of Biology at SUNY Albany, has shared the following Python-based code for tracking the movement of labelled paws in grooming, freely behaving mice, described in an article published in PLOS Computational Biology.

Traditional approaches to analyzing grooming rely on manually scoring the onset and duration of each grooming episode. This type of analysis is time-consuming and provides limited information about the finer aspects of grooming behavior, which are important for understanding bilateral coordination in mice. Currently available commercial and freeware video-tracking software allows automated tracking of a mouse's whole body, or of its head and tail, but not of individual paws. M-Track is an open-source code that lets users simultaneously track the movement of individual paws during spontaneous grooming episodes and walking in multiple freely behaving mice or rats. The toolbox provides a simple, user-friendly platform for trajectory analysis of paw movement, streamlining the analysis of spontaneous grooming in biomedical research studies.
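
To give a flavor of the color-label tracking approach, here is a generic Python/OpenCV sketch that thresholds each video frame in HSV space for a dyed paw and records the paw's centroid. It is not M-Track's code; the video name and the HSV bounds (here for a green label) are assumptions.

    # Track one color-labelled paw by HSV thresholding and centroid extraction
    import cv2
    import numpy as np

    LOWER = np.array([40, 80, 80])     # assumed HSV bounds for a green paw label
    UPPER = np.array([80, 255, 255])

    cap = cv2.VideoCapture("grooming_session.avi")
    paw_path = []

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)        # pixels matching the label color
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
        m = cv2.moments(mask, binaryImage=True)
        if m["m00"] > 0:
            paw_path.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
        else:
            paw_path.append((np.nan, np.nan))        # label occluded this frame

    cap.release()
    np.savetxt("paw_xy.csv", np.array(paw_path), delimiter=",", header="x,y")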

OMR Arena

September 7, 2017

Researchers at the National Eye Institute and the University of Oldenburg, Germany, have developed the OMR-arena for measuring visual acuity in mice.

The OMR-arena is an automated measurement and stimulation system developed to determine visual thresholds in mice. It acts as an automated optometer, characterizing the visual performance of freely moving mice from their optomotor responses: a video-tracking system monitors the head movements of the mice while appropriate 360° stimuli are presented, and the head tracker is used to adjust the stimulus to the head position and to calculate visual acuity automatically. In addition to being open-source and affordable, the device offers researchers an objective way to measure the visual performance of freely moving mice.
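
A small worked example helps show why closed-loop head tracking matters for acuity measurements: the spatial frequency (in cycles per degree) that a grating on the arena wall subtends at the eye changes with the animal's distance from the wall, so the displayed pattern must be rescaled as the head moves to hold the nominal spatial frequency constant. The Python sketch below uses illustrative geometry and numbers that are not taken from the paper.

    # Spatial frequency of a wall grating as a function of viewing distance
    import math

    def cycles_per_degree(stripe_period_cm, distance_cm):
        """Spatial frequency of a grating with the given period seen from distance_cm."""
        degrees_per_cycle = math.degrees(2 * math.atan(stripe_period_cm / (2 * distance_cm)))
        return 1.0 / degrees_per_cycle

    period = 1.0          # cm per grating cycle on the wall (assumed)
    for d in (5.0, 10.0, 15.0):
        print(f"distance {d:4.1f} cm -> {cycles_per_degree(period, d):.2f} cyc/deg")

    # To keep the spatial frequency fixed at a target value, the displayed period
    # must shrink or grow with viewing distance:
    target = 0.3          # cyc/deg (assumed threshold being probed)
    for d in (5.0, 10.0, 15.0):
        period_needed = 2 * d * math.tan(math.radians(1.0 / target) / 2)
        print(f"distance {d:4.1f} cm -> show period {period_needed:.2f} cm")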

Kretschmer F, Kretschmer V, Kunze VP, Kretzberg J (2013) OMR-Arena: Automated Measurement and Stimulation System to Determine Mouse Visual Thresholds Based on Optomotor Responses. PLoS ONE 8(11): e78058.