Category: All

3DTracker – 3D video tracking system for animal behavior

November 8th, 2017

Jumpei Matsumoto has submitted the following to OpenBehavior regarding 3D tracker, a 3D video tracking system for animal behavior.


3DTracker-FAB is open-source software for 3D-video-based, markerless behavioral analysis of laboratory animals (currently mice and rats). The software uses multiple depth cameras to reconstruct full 3D images of the animals and fits skeletal models to these images to estimate the animals' 3D poses.

More information on 3DTracker may be found on the system’s website, www.3dtracker.org.

Additionally, a dynamic poster on the system will be presented on November 12th at the Society for Neuroscience annual meeting. Click here for more information.

Autonomous Training of a Forelimb Motor Task

November 3, 2017

Greg Silasi, from the University of Ottawa, has kindly contributed the following to OpenBehavior.


“Silasi et al. developed a low-cost system for fully autonomous training of group-housed mice on a forelimb motor task. We demonstrate the feasibility of tracking both end-point as well as kinematic performance of individual mice, each performing thousands of trials over 2.5 months. The task is run and controlled by a Raspberry Pi microcomputer, which allows for cages to be monitored remotely through an active internet connection.”
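The logging side of such a home-cage system can be sketched in a few lines. The sketch below is illustrative only (the tag identifiers and outcome labels are assumptions, not details from the paper): it tallies completed trials per mouse from a stream of events that a controller such as the Raspberry Pi might log.

```python
# Illustrative sketch (not the authors' code): tallying completed trials per
# mouse in a group-housed cage, assuming each mouse is identified by a tag
# read at the task entrance. Tag names and outcome labels are hypothetical.

from collections import defaultdict

def tally_trials(events):
    """Count completed trials per tag from (tag, outcome) event tuples."""
    counts = defaultdict(int)
    for tag, outcome in events:
        if outcome in ("hit", "miss"):  # only completed trials count
            counts[tag] += 1
    return dict(counts)

events = [("A1F3", "hit"), ("A1F3", "miss"), ("B702", "hit"), ("A1F3", "hit")]
print(tally_trials(events))  # {'A1F3': 3, 'B702': 1}
```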

Click here to submit a piece of open-source software or hardware to OpenBehavior.

Mousetrap: An integrated, open-source computer mouse-tracking package

October 31, 2016

Mousetrap, an open-source software plugin to record and analyze mouse movements in computerized lab experiments, was developed by Pascal Kieslich and Felix Henninger, both located in Germany.


Mousetrap is a plugin that is used with OpenSesame software for mouse-tracking, or the analysis of mouse movements during computerized lab experiments which can serve as an indicator of commitment or conflict in decision making. The integration of Mousetrap with a general-purpose graphical experiment builder also allows users to access other core features and software extensions of OpenSesame, which offers more flexibility to users when designing experiments. Mousetrap is available for use across all platforms (Linux, Windows and Mac) and the data collected with the software can also be imported directly into R for analysis with an available Mousetrap package.
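As an illustration of the kind of measure a mouse-tracking analysis produces, the sketch below computes the maximum absolute deviation of a cursor trajectory from the direct start-to-end path, a common index of decision conflict. This is a hand-rolled example, not Mousetrap's actual implementation.

```python
# Illustrative sketch (assumed, not Mousetrap's code): maximum absolute
# deviation (MAD) of a cursor trajectory from the straight line between its
# start and end points.

import math

def max_abs_deviation(points):
    """Largest perpendicular distance of any trajectory sample from the
    straight line connecting the first and last samples."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    return max(abs((x - x0) * dy - (y - y0) * dx) / length for x, y in points)

traj = [(0, 0), (1, 1), (2, 2), (3, 1), (4, 0)]  # curved path, ends on axis
print(max_abs_deviation(traj))  # 2.0
```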

The GitHub for this project may be found here.

 

Moving Wall Box (MWB)

October 26th, 2017

Andreas Genewsky, from the Max-Planck Institute of Psychiatry, has generously shared the following regarding his Moving Wall Box task and associated apparatus.


“Typically, behavioral paradigms which aim to assess active vs. passive fear responses involve the repeated application of noxious stimuli like electric foot shocks (step-down avoidance, step-through avoidance, shuttle-box). Alternative methods to motivate the animals and ultimately induce a conflict situation which needs to be overcome often involve food and/or water deprivation.

In order to repeatedly assess fear coping strategies in an emotionally challenging situation without footshocks, food deprivation, or water deprivation (complying with the Replace, Reduce, Refine (3R) principles), we devised a novel testing strategy, henceforth called the Moving Wall Box (MWB) task. In short, during the MWB task a mouse is repeatedly forced to jump over a small ice-filled box (10 trials, 1 min inter-trial interval, ITI) by slowly moving walls (2.3 mm/s, over 60 s), whereby the presence of the animal is automatically sensed via balances and analyzed by a microcontroller board, which in turn controls the movements of the walls. The behavioral readouts are (1) the latency to reach the other compartment (high levels of behavioral inhibition lead to high latencies) and (2) the number of inter-trial shuttles per trial (low levels of behavioral inhibition lead to high levels of shuttling during the ITI).

The MWB offers the possibility to conduct simultaneous in vivo electrophysiological recordings, which can later be aligned to the behavioral responses (escapes). The MWB task therefore fosters the study of activity patterns in, e.g., optogenetically identified neurons with respect to escape responses in a highly controlled setting. To our knowledge, no comparable behavioral paradigm is available.”
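The two readouts described above are straightforward to compute from timestamped compartment crossings. The sketch below is an illustration of that logic, not the authors' firmware; the 60 s wall-movement window comes from the description above, and all function and variable names are assumptions.

```python
# Illustrative sketch (assumed, not the MWB firmware): computing the two MWB
# readouts from timestamped compartment-crossing events. `trial_start` is when
# the walls begin to move (they move for 60 s); `crossings` are the times at
# which the balances register a switch of compartment.

def mwb_readouts(trial_start, trial_end, crossings):
    """Return (escape latency, inter-trial shuttle count) for one trial
    followed by its inter-trial interval ending at `trial_end`."""
    in_trial = [t for t in crossings if trial_start <= t <= trial_start + 60]
    latency = (in_trial[0] - trial_start) if in_trial else None  # escape latency
    # crossings after the walls retract count as inter-trial shuttles
    shuttles = sum(1 for t in crossings if trial_start + 60 < t <= trial_end)
    return latency, shuttles

lat, n = mwb_readouts(0.0, 120.0, [12.5, 70.0, 95.0])
print(lat, n)  # 12.5 2
```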

M-Track

September 12, 2017
 
Annalisa Scimemi, of the Department of Biology at SUNY Albany, has shared the following Python-based code to track the movement of labelled paws in grooming and freely behaving mice, described in an article published in PLoS Computational Biology.

Traditional approaches to analyzing grooming rely on manually scoring the time of onset and duration of each grooming episode. This type of analysis is time-consuming and provides limited information about finer aspects of grooming behaviors, which are important for understanding bilateral coordination in mice. Currently available commercial and freeware video-tracking software allows automated tracking of the whole body of a mouse, or of its head and tail, but not of individual paws. M-Track is open-source code that allows users to simultaneously track the movement of individual paws during spontaneous grooming episodes and walking in multiple freely behaving mice or rats. This toolbox provides a simple platform for trajectory analysis of paw movement and a user-friendly interface to streamline the analysis of spontaneous grooming in biomedical research studies.
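As a minimal example of the trajectory analysis described above, the sketch below computes the total path length of a tracked paw from frame-by-frame coordinates. It is a hand-rolled illustration, not M-Track's code.

```python
# Illustrative sketch (not M-Track's code): total path length of a tracked paw
# from frame-by-frame (x, y) coordinates, a basic trajectory-analysis measure.

import math

def path_length(coords):
    """Sum of Euclidean distances between consecutive tracked positions."""
    return sum(math.dist(a, b) for a, b in zip(coords, coords[1:]))

paw = [(0, 0), (3, 4), (3, 10)]  # three tracked frames
print(path_length(paw))  # 11.0
```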

OMR Arena

September 7, 2017

Researchers at the National Eye Institute and the University of Oldenburg, Germany, have developed the OMR-arena for measuring visual acuity in mice.


The OMR-arena is an automated measurement and stimulation system developed to determine visual thresholds in mice. The system acts as an automated optometer, characterizing the visual performance of freely moving mice: a video-based head tracker monitors head movements while appropriate 360° stimuli are presented. The head tracker is used to adjust the stimulus to the current head position and to automatically calculate visual acuity. The device, in addition to being open-source and affordable, offers an objective way for researchers to measure the visual performance of freely moving mice.
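One piece of the head-tracking adjustment can be illustrated with simple geometry: to hold spatial frequency constant in cycles per degree, the grating period drawn on the wall must scale with the head-to-wall distance. The sketch below assumes this simplified geometry and is not the published implementation.

```python
# Illustrative sketch (assumed geometry, not the published implementation):
# keeping a grating at a constant spatial frequency (cycles/degree) as the
# mouse's head moves off-center, by rescaling the grating period drawn on the
# wall to the current head-to-wall distance.

import math

def cycle_width_on_wall(cpd, distance_cm):
    """Width (cm) of one grating cycle on the wall so that it subtends
    1/cpd degrees at the animal's eye at the given viewing distance."""
    return 2 * distance_cm * math.tan(math.radians(1.0 / cpd) / 2)

# a 0.5 cyc/deg grating must be drawn wider when the head is farther away:
print(cycle_width_on_wall(0.5, 15.0))  # ≈ 0.52 cm per cycle at 15 cm
print(cycle_width_on_wall(0.5, 20.0))  # ≈ 0.70 cm per cycle at 20 cm
```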


Kretschmer F, Kretschmer V, Kunze VP, Kretzberg J (2013) OMR-Arena: Automated Measurement and Stimulation System to Determine Mouse Visual Thresholds Based on Optomotor Responses. PLoS ONE 8(11): e78058. https://doi.org/10.1371/journal.pone.0078058

Open Source platform for Sensory Tasks

August, 2017

Lucy Palmer and Andrew Micallef, of the Florey Institute of Neuroscience and Mental Health, University of Melbourne, Australia, have shared the following Arduino- and Python-based platform for Go/No-Go tasks, described in an article published in Frontiers in Cellular Neuroscience.


The Go/No-Go sensory task requires an animal to report a decision in response to a stimulus. In “Go” trials, the subject must respond to a target stimulus with an action, while in “No-Go” trials, the subject withholds a response. To execute this task, a behavioral platform was created which consists of three main components: 1) a water reward delivery system, 2) a lick sensor, and 3) a sensory stimulation apparatus. The water reward is administered by a gravity flow water system, controlled by a solenoid pinch valve, while licking is monitored by a custom-made piezo-based sensor. An Arduino Uno Rev3 simultaneously controls stimulus and reward delivery. In addition, the Arduino records lick frequency and timing through the piezo sensor. A Python script, employing the pyserial library, aids communication between the Arduino and a host computer.
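Scoring such a task reduces to classifying each trial by stimulus type and response. The sketch below illustrates that logic; it is a generic illustration, not the authors' code.

```python
# Illustrative sketch (not the published code): scoring Go/No-Go trials from
# the stimulus type and whether a lick was detected in the response window.

def score_trial(go_trial, licked):
    """Classify one trial into the four standard Go/No-Go outcomes."""
    if go_trial:
        return "hit" if licked else "miss"
    return "false alarm" if licked else "correct rejection"

trials = [(True, True), (True, False), (False, True), (False, False)]
print([score_trial(go, lick) for go, lick in trials])
# ['hit', 'miss', 'false alarm', 'correct rejection']
```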


The GitHub for the project may be found here.

OptiMouse

August, 2017

Yoram Ben-Shaul, of the Department of Medical Neurobiology at The Hebrew University, has shared the following about OptiMouse, described in an article published in BMC Biology:


OptiMouse is open-source software designed to semi-automatically analyze the positions of individual mice, specifically their nose positions, in a behavioral arena, with the goal of minimizing error. The software was designed to provide highly accurate position detection and to make the entire analysis process accessible and free: it provides its own graphical user interface and requires no prior programming knowledge. OptiMouse differs from other position-tracking software in that it applies multiple detection algorithms to a single session while allowing seamless integration of custom functions, all controllable through the GUI. The software also makes it easy to identify frames with incorrect position detection, so that the settings for those frames can be adjusted, producing higher-quality data.
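The per-frame choice among multiple detection algorithms can be shown schematically. The sketch below assumes each detector returns a position plus a quality score and keeps the highest-scoring result; this is an assumed scheme for illustration, not OptiMouse's implementation.

```python
# Illustrative sketch (assumed logic, not OptiMouse's implementation): apply
# several candidate detectors to a frame and keep the result whose quality
# score is highest.

def best_detection(frame, detectors):
    """detectors: functions mapping a frame to ((x, y), quality_score)."""
    return max((detect(frame) for detect in detectors), key=lambda r: r[1])[0]

# hypothetical detectors returning fixed results for demonstration
d1 = lambda frame: ((10, 12), 0.6)   # e.g., a thresholding detector
d2 = lambda frame: ((11, 12), 0.9)   # e.g., a background-subtraction detector
print(best_detection(None, [d1, d2]))  # (11, 12)
```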


Ben-Shaul, Y. (2017). OptiMouse: a comprehensive open source program for reliable detection and analysis of mouse body and nose positions. BMC Biology,15(1). doi:10.1186/s12915-017-0377-3

The GitHub for this project may be found here.

Free-Behavior Monitoring and Reward System for Non-human Primates

August, 2017

In Frontiers in Neuroscience, Tyler Libey and Eberhard E. Fetz share their open-source device for recording neural activity from freely behaving non-human primates in their home cages and administering reward.


The device is designed to document bodily movement and neural activity and to deliver rewards to monkeys behaving freely in their home cages. It allows researchers to explore behaviors in freely moving non-human primates rather than relying on rigid, tightly controlled movements, which lends itself to a fuller understanding of movement, reward, and the neural signals involved in these behaviors. Studying freely moving animals may offer essential insight into the neural signals associated with reward-guided movement, which in turn may guide the development of more accurate brain-machine interfaces. The behavior monitoring system incorporates existing untethered recording equipment (Neurochip) and a custom hub that controls a cage-mounted feeder to deliver short-latency rewards. A depth camera provides gross movement data streams from the home cage in addition to the recorded neural activity.


Libey T and Fetz EE (2017) Open-Source, Low Cost, Free-Behavior Monitoring, and Reward System for Neuroscience Research in Non-human Primates. Front. Neurosci. 11:265. doi: 10.3389/fnins.2017.00265

Pulse Pal

July 12, 2017

Josh Sanders has also shared the following with OpenBehavior regarding Pulse Pal, an open source pulse train generator. Pulse Pal and Bpod, featured earlier, were both created by Sanworks.


Pulse Pal is an Arduino-powered device that generates precise sequences of voltage pulses for neural stimulation and stimulus control. It is controlled either through its APIs in MATLAB, Python and C++, or as a stand-alone instrument using its OLED screen and clickable thumb joystick. Pulse Pal can play independent stimulus trains on its output channels. These trains are either defined parametrically, or pulse-wise by specifying each pulse’s onset time and voltage. Two optically isolated TTL trigger channels can each be mapped to any subset of the output channels, which can range between -10V and +10V and deliver pulses as short as 100µs. This feature set allows Pulse Pal to serve as an open-source alternative to commercial stimulation timing devices such as the Master 8 (AMPI), PSG-2 (ISSI), Pulsemaster A300 (WPI), BPG-1 (Bak Electronics), StimPulse PGM (FHC Inc.) and Multistim 3800 (A-M Systems).
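For a sense of what parametric versus pulse-wise definition means, the sketch below expands a parametric train (pulse count, period, width, voltage) into the explicit per-pulse onset/voltage list that pulse-wise programming would specify directly. Function and parameter names are assumptions, not the Pulse Pal API; the 100 µs and ±10 V limits come from the description above.

```python
# Illustrative sketch (not the Pulse Pal API): expanding a parametric pulse
# train into the per-pulse (onset, voltage) list that pulse-wise programming
# would specify directly. All names here are assumptions for illustration.

def parametric_train(n_pulses, period_ms, pulse_width_us, voltage):
    """Return [(onset_ms, voltage), ...] for a train of identical pulses."""
    assert pulse_width_us >= 100, "pulses shorter than 100 us not supported"
    assert -10.0 <= voltage <= 10.0, "output range is -10 V to +10 V"
    return [(i * period_ms, voltage) for i in range(n_pulses)]

# five 1 ms, 5 V pulses at 100 Hz (10 ms period):
train = parametric_train(n_pulses=5, period_ms=10, pulse_width_us=1000, voltage=5.0)
print(train)  # [(0, 5.0), (10, 5.0), (20, 5.0), (30, 5.0), (40, 5.0)]
```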

Because Pulse Pal is an Arduino-powered device, modifying its firmware for custom applications is within the capabilities of most modern neuroscience research labs. As an example, Pulse Pal’s GitHub repository provides alternative firmware that entirely repurposes the device as a waveform generator. In this configuration, a user can specify a waveform, frequency, amplitude and maximum playback duration, and toggle playback by TTL pulse with ~100µs latency. The firmware can also loop custom waveforms up to 40,000 samples long.

Pulse Pal was first published in 2014 by Josh Sanders, while he was a student in the Kepecs Lab at Cold Spring Harbor Laboratory. A significantly improved second-generation stimulator (Pulse Pal 2) became available in early 2016, coincident with the opening of Sanworks LLC. Over the past year, more than 125 Pulse Pal 2 devices were sold at $545 each through the Sanworks assembly service, while several labs elected to build their own. The initial success of this product demonstrates that fully open-source hardware can make headway against closed-source competitors in the neuroscience instrumentation niche market.

The Sanworks GitHub page for Pulse Pal may be found here.

The Wiki page for Pulse Pal, including assembly instructions, may be found here.