December 20, 2017
StimDuino is an inexpensive Arduino-controlled stimulus isolator that allows highly accurate, reproducible, automated setting of stimulation currents. Stimulation patterns are software-controlled, and their parameters are set from a simple, intuitive, and user-friendly Matlab-coded graphical user interface. By automating assessment of the input-output relationship, StimDuino eliminates the need to adjust current intensity manually, improves stimulation reproducibility and accuracy, and allows on-site and remote control of stimulation parameters for both in vivo and in vitro applications.
Sheinin, A., Lavi, A., & Michaelevski, I. (2015). StimDuino: An Arduino-based electrophysiological stimulus isolator. Journal of Neuroscience Methods, 243, 8-17. doi:10.1016/j.jneumeth.2015.01.016
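StimDuino's own control code is Arduino firmware driven by a Matlab GUI; purely as an illustration of what an automated input-output sweep involves, here is a minimal Python sketch that generates an evenly spaced series of stimulation currents (the function name, units, and ranges are hypothetical, not taken from the paper):

```python
def io_curve_currents(i_min_ua, i_max_ua, n_steps):
    """Evenly spaced stimulation currents (in microamps) for an
    automated input-output curve, ordered lowest to highest.
    Hypothetical helper, not StimDuino code."""
    if n_steps < 2:
        raise ValueError("need at least two steps")
    step = (i_max_ua - i_min_ua) / (n_steps - 1)
    return [round(i_min_ua + k * step, 2) for k in range(n_steps)]

# Example: sweep 10-100 uA in 10 steps
currents = io_curve_currents(10, 100, 10)
```

In StimDuino, an equivalent series is built from the GUI settings and sent to the Arduino, which times each pulse in firmware.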
December 18, 2017
ZebraTrack is a cost-effective imaging setup for distraction-free behavioral acquisition, with automated tracking using the open-source ImageJ software and a workflow for extracting behavioral endpoints of zebrafish. The ImageJ algorithm gives users control at key steps while keeping the tracking automated, without requiring the installation of external plugins.
Nema, S., Hasan, W., Bhargava, A., & Bhargava, Y. (2016). A novel method for automated tracking and quantification of adult zebrafish behaviour during anxiety. Journal of Neuroscience Methods, 271, 65-75. doi:10.1016/j.jneumeth.2016.07.004
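ZebraTrack performs its tracking inside ImageJ, but the core step, thresholding each frame and taking the centroid of the dark fish blob, can be sketched in plain Python (the threshold value and the 2D-list frame representation are illustrative assumptions, not ZebraTrack code):

```python
def fish_centroid(frame, threshold=50):
    """Centroid (row, col) of pixels darker than `threshold` in a
    grayscale frame given as a 2D list of intensities (0-255).
    Returns None if no pixel passes, i.e. no fish detected.
    Illustrative sketch of threshold-and-centroid tracking."""
    rows = cols = count = 0
    for r, line in enumerate(frame):
        for c, px in enumerate(line):
            if px < threshold:
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return (rows / count, cols / count)
```

Repeating this per frame yields the position trace from which behavioral endpoints (distance moved, time in zones) are then computed.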
November 29, 2017
An open-source, Arduino-controlled syringe pump delivers small, accurate amounts of liquid. The basic menu allows users to select bolus size, and Arduino can be easily customized for a variety of projects. The pump uses 3D-printed parts and easily obtainable hardware.
Full instructions and 3D print files for assembling this pump can be found at https://hackaday.io/project/1838-open-syringe-pump
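The firmware itself runs on the Arduino, but the volume-to-motion conversion behind a bolus setting is simple geometry; the following Python sketch shows one way to compute it (the parameter names, and the assumption that plunger travel is linear in motor steps, are ours, not from the project):

```python
import math

def bolus_to_steps(bolus_ul, syringe_diameter_mm, mm_per_step):
    """Stepper-motor steps needed to dispense one bolus.

    bolus_ul: desired volume in microliters (1 uL = 1 mm^3)
    syringe_diameter_mm: inner diameter of the syringe barrel
    mm_per_step: linear plunger travel per motor step (depends on
    lead-screw pitch and microstepping; hypothetical parameter)
    """
    area_mm2 = math.pi * (syringe_diameter_mm / 2) ** 2
    travel_mm = bolus_ul / area_mm2  # 1 uL == 1 mm^3
    return round(travel_mm / mm_per_step)
```

Calibrating against the actual syringe (diameter and measured travel per step) is what makes the delivered volumes accurate.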
November 28, 2017
Pyper is developed by The Margrie Laboratory.
Pyper provides real-time or pre-recorded motion tracking of a specimen in an open-field. Pyper can send TTL pulses based on detection of the specimen within user-defined regions of interest. The software can be used through the command line or through a built-in graphical user interface. The live feed can be provided by a USB or Raspberry Pi camera.
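Pyper's TTL output is handled by its own software and hardware interface; the underlying region-of-interest entry detection can be sketched as follows (a hypothetical illustration in Python, not Pyper's actual code):

```python
def in_roi(x, y, roi):
    """True if the tracked centroid (x, y) lies inside a
    rectangular region of interest given as (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = roi
    return x0 <= x <= x1 and y0 <= y <= y1

def ttl_events(track, roi):
    """Frame indices at which the specimen *enters* the ROI,
    i.e. where a TTL pulse would be emitted. Pulses fire on
    entry only, not on every frame spent inside."""
    events, was_inside = [], False
    for i, (x, y) in enumerate(track):
        inside = in_roi(x, y, roi)
        if inside and not was_inside:
            events.append(i)
        was_inside = inside
    return events
```

In a live setup the same check would run per frame on the camera feed, with the pulse sent out a GPIO or DAQ line at each entry event.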
Find more information here.
Manual for Pyper.
November 28, 2017
Airtrack was developed in the LARKUM Lab by Mostafa Nashaat, Hatem Oraby, Robert Sachdev, York Winter, and Matthew Larkum. Alexander Schill, engineer at the Charité workshop (CWW), made significant contributions to the design of the platform and the airtrack table.
Airtrack is a head-fixed behavioral environment that uses a lightweight physical maze floating on an air table. The maze moves around the animal's body under the direct control of the animal itself, solving many problems associated with using virtual reality for head-fixed animals.
More information can be found at http://www.neuro-airtrack.com/
Nashaat, M. A., Oraby, H., Sachdev, R. N. S., Winter, Y., & Larkum, M. E. (2016). Air-Track: A real-world floating environment for active sensing in head-fixed mice. Journal of Neurophysiology, 116(4), 1542-1553. doi:10.1152/jn.00088.2016
November 8, 2017
Jumpei Matsumoto has submitted the following to OpenBehavior regarding 3D tracker, a 3D video tracking system for animal behavior.
3DTracker-FAB is an open source software for 3D-video based markerless computerized behavioral analysis for laboratory animals (currently mice and rats). The software uses multiple depth cameras to reconstruct full 3D images of animals and fit skeletal models to the 3D image to estimate 3D pose of the animals.
More information on 3D tracker may be found on the system’s website, www.3dtracker.org
Additionally, a dynamic poster on the system was presented on November 12, 2017 at the Society for Neuroscience annual meeting. Click here for more information.
November 3, 2017
Greg Silas, from the University of Ottawa, has kindly contributed the following to OpenBehavior.
“Silasi et al. developed a low-cost system for fully autonomous training of group-housed mice on a forelimb motor task. We demonstrate the feasibility of tracking both end-point as well as kinematic performance of individual mice, each performing thousands of trials over 2.5 months. The task is run and controlled by a Raspberry Pi microcomputer, which allows for cages to be monitored remotely through an active internet connection.”
The DropBox folder containing the python code may be found here.
Silasi, G., Boyd, J., Bolanos, F., LeDue, J., Scott, S. H., & Murphy, T. H. (2017). Individualized tracking of self-directed motor learning in group-housed mice performing a skilled lever positioning task in the home cage. Journal of Neurophysiology, jn.00115.2017. https://doi.org/10.1152/jn.00115.2017
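The published task logic runs in Python on the Raspberry Pi; below is a heavily simplified sketch of how a single trial's outcome might be scored from a lever-position trace (the thresholds, units, and hold criterion are invented for illustration and are not the authors' parameters):

```python
def trial_outcome(positions, target=(0.8, 1.2), hold_samples=5):
    """Score one trial of a lever-positioning task: 'hit' if the
    lever trace stays inside the `target` range for `hold_samples`
    consecutive samples, else 'miss'. Hypothetical criterion."""
    run = 0
    for p in positions:
        run = run + 1 if target[0] <= p <= target[1] else 0
        if run >= hold_samples:
            return "hit"
    return "miss"
```

Scoring trials on-device like this is what lets individual, RFID-identified mice accumulate thousands of self-directed trials without experimenter intervention.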
Click here to submit a piece of open-source software or hardware to OpenBehavior.
October 31, 2017
Mousetrap, an open-source software plugin to record and analyze mouse movements in computerized lab experiments, was developed by Pascal Kieslich and Felix Henninger, both located in Germany.
Mousetrap is a plugin for the OpenSesame software that enables mouse-tracking: the analysis of mouse movements during computerized lab experiments, which can serve as an indicator of commitment or conflict in decision making. The integration of Mousetrap with a general-purpose graphical experiment builder also gives users access to the other core features and software extensions of OpenSesame, offering more flexibility when designing experiments. Mousetrap is available across all platforms (Linux, Windows, and Mac), and the data collected with the software can be imported directly into R for analysis with an available Mousetrap package.
The GitHub for this project may be found here.
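Mousetrap itself is an OpenSesame plugin with an accompanying R analysis package; one standard mouse-tracking index of decision conflict, the maximum absolute deviation of the cursor path from the direct start-to-end line, can be sketched in Python (a generic illustration, not Mousetrap code):

```python
def max_abs_deviation(path):
    """Maximum perpendicular distance of a cursor path from the
    straight line joining its first and last points. Larger values
    indicate stronger attraction toward the non-chosen option.
    `path` is a list of (x, y) samples."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    dx, dy = x1 - x0, y1 - y0
    length = (dx * dx + dy * dy) ** 0.5
    if length == 0:
        raise ValueError("start and end points coincide")
    # Perpendicular distance of each sample from the direct line
    return max(abs(dy * (x - x0) - dx * (y - y0)) / length
               for x, y in path)
```

In practice such indices are computed per trial over time-normalized trajectories before statistical comparison across conditions.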
October 26, 2017
Andreas Genewsky, from the Max-Planck Institute of Psychiatry, has generously shared the following regarding his Moving Wall Box task and associated apparatus.
“Typically, behavioral paradigms which aim to assess active vs. passive fear responses involve the repeated application of noxious stimuli like electric foot shocks (step-down avoidance, step-through avoidance, shuttle-box). Alternative methods to motivate the animals and ultimately induce a conflict situation which needs to be overcome often involve food and/or water deprivation.
In order to repeatedly assess fear coping strategies in an emotionally challenging situation without foot shocks, food deprivation, or water deprivation (complying with the Reduce, Refine, Replace 3R principles), we devised a novel testing strategy, henceforward called the Moving Wall Box (MWB) task. In short, during the MWB task a mouse is repeatedly forced to jump over a small ice-filled box (10 trials, 1 min inter-trial intervals, ITI) by slowly moving walls (2.3 mm/s, over 60 s). The presence of the animal is automatically sensed via balances and analyzed by a microcontroller board, which in turn controls the movements of the walls. The behavioral readouts are (1) the latency to reach the other compartment (high levels of behavioral inhibition lead to high latencies) and (2) the number of inter-trial shuttles per trial (low levels of behavioral inhibition lead to high numbers of shuttles during the ITI).
The MWB offers the possibility to conduct simultaneous in vivo electrophysiological recordings, which could be later aligned to the behavioral responses (escapes). Therefore the MWB task fosters the study of activity patterns in, e.g., optogenetically identified neurons with respect to escape responses in a highly controlled setting. To our knowledge there is no other available compatible behavioral paradigm.”
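The sensing and wall control run on a microcontroller; the two behavioral readouts described above could be computed offline from balance-crossing timestamps roughly as follows (a Python sketch with hypothetical names and a simplified event format, not the authors' analysis code):

```python
def mwb_readouts(crossings, trial_start, trial_end):
    """Behavioral readouts for one Moving Wall Box trial.

    crossings: sorted timestamps (s) at which the balances detected
    the mouse switching compartments.
    Returns (escape latency in s, or None if the mouse never
    crossed during the trial; number of shuttles after the trial,
    i.e. during the inter-trial interval)."""
    in_trial = [t for t in crossings if trial_start <= t <= trial_end]
    latency = in_trial[0] - trial_start if in_trial else None
    iti_shuttles = sum(1 for t in crossings if t > trial_end)
    return latency, iti_shuttles
```

Timestamps recorded this way can then be aligned with simultaneously acquired electrophysiological data to relate neural activity to escape responses.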
September 12, 2017
Annalisa Scimemi, of the Department of Biology at SUNY Albany, has shared the following Python-based code to track the movement of labelled paws in grooming and freely behaving mice, described in an article published in PLoS Computational Biology.
Traditional approaches to analyzing grooming rely on manually scoring the time of onset and duration of each grooming episode. This type of analysis is time-consuming and provides limited information about finer aspects of grooming behaviors, which are important for understanding bilateral coordination in mice. Currently available commercial and freeware video-tracking software allows automated tracking of the whole body of a mouse, or of its head and tail, but not of individual paws. M-Track is an open-source code that allows users to simultaneously track the movement of individual paws during spontaneous grooming episodes and walking in multiple freely behaving mice or rats. This toolbox provides a simple platform for trajectory analysis of paw movement, and a valuable, user-friendly interface that streamlines the analysis of spontaneous grooming in biomedical research studies.
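M-Track exports per-paw trajectories for downstream analysis; one basic quantity that can be computed from such a trajectory is its total path length, sketched here in Python (an illustrative example, not M-Track code):

```python
def path_length(trajectory):
    """Total distance travelled by a tracked paw, given a list of
    (x, y) positions from consecutive video frames. Sums the
    Euclidean distance between each pair of successive frames."""
    return sum(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
               for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]))
```

Comparing such per-paw measures between left and right forepaws is one simple way to quantify the bilateral coordination the post describes.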