Category: Stimuli

ArControl: Arduino Control Platform

January 3rd, 2018

The following behavioral platform was developed and published by Xinfeng Chen and Haohong Li, from Huazhong University of Science and Technology, Wuhan, China.

ArControl: Arduino Control Platform is a comprehensive behavioral platform developed to deliver stimuli and monitor responses. This easy-to-use, high-performance system uses an Arduino UNO board and a simple drive circuit, along with a stand-alone GUI application. Experimental data are automatically recorded by the built-in data acquisition function, and the entire behavioral schedule is stored on the Arduino chip itself. Collectively, this makes ArControl a “genuine, real-time system with high temporal resolution”. Chen and Li have tested ArControl using a Go/No-Go task and a probabilistic switching behavior task. The results of their work show that ArControl is a reliable system for behavioral research.
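For readers unfamiliar with the task structure, the trial logic of a Go/No-Go session is simple to express. The sketch below is a minimal Python illustration of that logic only; it is not ArControl's firmware (which runs as Arduino code on the chip), and the function name and outcome labels are our own:

```python
def gonogo_outcome(stimulus, responded):
    """Classify one Go/No-Go trial.

    stimulus: "go" or "nogo"
    responded: True if the subject responded within the response window.
    """
    if stimulus == "go":
        # On go trials, a response is rewarded; withholding is an error.
        return "hit" if responded else "miss"
    # On no-go trials, withholding is correct; responding is an error.
    return "false_alarm" if responded else "correct_rejection"
```

The four outcome categories (hit, miss, false alarm, correct rejection) are the standard readouts used to score performance on this task.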

Source code and PCB designs may be found here: ArControl GitHub




StimDuino: An Arduino-Based Electrophysiological Stimulus Isolator

December 20, 2017

StimDuino is an inexpensive Arduino-controlled stimulus isolator that allows highly accurate, reproducible, automated setting of stimulation currents. Stimulation patterns are software-controlled, and their parameters are set from a simple, intuitive, and user-friendly Matlab-coded graphical user interface. By automating assessment of the input-output relationship, StimDuino eliminates the need to adjust current intensity manually, improves stimulation reproducibility and accuracy, and allows on-site and remote control of stimulation parameters for both in vivo and in vitro applications.
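To illustrate what an automated input-output assessment involves: rather than the experimenter dialing in each intensity by hand, the software steps through a predefined series of stimulation currents. The Python sketch below shows only that stepping logic; the function name and default values are hypothetical and are not taken from StimDuino itself:

```python
def io_curve_currents(start_ua=10.0, stop_ua=100.0, step_ua=10.0):
    """Generate the ascending series of stimulation intensities (in uA)
    for an automated input-output curve. Defaults are illustrative only."""
    currents = []
    i = start_ua
    # Small tolerance so float rounding does not drop the final step.
    while i <= stop_ua + 1e-9:
        currents.append(round(i, 6))
        i += step_ua
    return currents
```

At each current in the series, the isolator would deliver a stimulus and the response amplitude would be recorded, yielding the input-output curve automatically.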

Sheinin, A., Lavi, A., & Michaelevski, I. (2015). StimDuino: An Arduino-based electrophysiological stimulus isolator. Journal of Neuroscience Methods, 243, 8-17. doi:10.1016/j.jneumeth.2015.01.016

Autonomous Training of a Forelimb Motor Task

November 3, 2017

Greg Silas, from the University of Ottawa, has kindly contributed the following to OpenBehavior.

“Silasi et al. developed a low-cost system for fully autonomous training of group-housed mice on a forelimb motor task. We demonstrate the feasibility of tracking both end-point and kinematic performance of individual mice, each performing thousands of trials over 2.5 months. The task is run and controlled by a Raspberry Pi microcomputer, which allows cages to be monitored remotely through an active internet connection.”

Click here to submit a piece of open-source software or hardware to OpenBehavior.

Moving Wall Box (MWB)

October 26th, 2017

Andreas Genewsky, from the Max-Planck Institute of Psychiatry, has generously shared the following regarding his Moving Wall Box task and associated apparatus.

“Typically, behavioral paradigms which aim to assess active vs. passive fear responses involve the repeated application of noxious stimuli like electric foot shocks (step-down avoidance, step-through avoidance, shuttle-box). Alternative methods to motivate the animals and ultimately induce a conflict situation which needs to be overcome often involve food and/or water deprivation.

In order to repeatedly assess fear coping strategies in an emotionally challenging situation without foot shocks, food deprivation, or water deprivation (complying with the 3R principles: Reduce, Refine, Replace), we devised a novel testing strategy, henceforward called the Moving Wall Box (MWB) task. In short, during the MWB task a mouse is repeatedly forced to jump over a small ice-filled box (10 trials, 1-min inter-trial intervals, ITIs) by slowly moving walls (2.3 mm/s, over 60 s). The presence of the animal is automatically sensed via balances and analyzed by a microcontroller board, which in turn controls the movements of the walls. The behavioral readouts are (1) the latency to reach the other compartment (high levels of behavioral inhibition lead to high latencies) and (2) the number of inter-trial shuttles per trial (low levels of behavioral inhibition lead to high numbers of shuttles during the ITI).

The MWB offers the possibility to conduct simultaneous in vivo electrophysiological recordings, which can later be aligned to the behavioral responses (escapes). The MWB task therefore fosters the study of activity patterns in, e.g., optogenetically identified neurons with respect to escape responses in a highly controlled setting. To our knowledge, no comparable behavioral paradigm is available.”
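The quantities described above (wall travel, escape latency, and inter-trial shuttles) are straightforward to compute from the balance-crossing timestamps. The following Python sketch illustrates the arithmetic only; it is not the authors' microcontroller code, and the function names are ours:

```python
WALL_SPEED_MM_PER_S = 2.3   # wall speed reported for the MWB
TRIAL_DURATION_S = 60.0     # walls advance over 60 s per trial

def wall_travel_mm(t_s):
    """Distance (mm) the walls have advanced t_s seconds into a trial."""
    return WALL_SPEED_MM_PER_S * min(t_s, TRIAL_DURATION_S)

def mwb_readouts(crossing_times_s, trial_start_s, trial_end_s):
    """Compute the two MWB readouts from shuttle (crossing) timestamps:
    (1) latency to reach the other compartment during the trial,
    (2) number of shuttles after the trial ends (inter-trial shuttles)."""
    in_trial = [t for t in crossing_times_s
                if trial_start_s <= t <= trial_end_s]
    latency_s = in_trial[0] - trial_start_s if in_trial else None
    iti_shuttles = sum(1 for t in crossing_times_s if t > trial_end_s)
    return latency_s, iti_shuttles
```

For example, a mouse that crosses 2 s after trial onset and shuttles twice during the ITI yields a latency of 2 s and an ITI shuttle count of 2.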

Operant Box for Auditory Tasks (OBAT)

June 2, 2017

Mariana de Araújo has shared the following regarding OBAT, an operant box designed for auditory tasks developed at the Edmond and Lily Safra International Institute of Neuroscience, Santos Dumont Institute, Macaiba, Brazil. 

OBAT is a low-cost operant box designed to train small primates in auditory tasks. The device presents auditory stimuli via an MP3 player shield connected to an Arduino Mega 2560 through an intermediate, custom-made shield. It also controls two touch-sensitive bars and a reward delivery system. A graphical user interface allows the experimenter to easily set the parameters of experimental sessions. All board schematics, source code, and equipment specifications and designs are available on GitHub and in the publication. Despite its low cost, OBAT has high temporal accuracy and reliably sends TTL signals to other equipment. Finally, the device was tested with one marmoset, showing that it can successfully be used to train these animals in an auditory two-choice task.

Fig. 1
Overview of the OBAT, inside the sound-attenuating chamber with the door open. External sources of heating were left outside the chamber: (a) Arduino Mega 2560 and shields and (b) the power amplifier. The modules controlling sound delivery, the response bars, and reward delivery can be seen in this lateral view: (c) speaker, (d) retractable bars, (e) reward delivery system, and (f) reward dispenser. The animal was kept inside the (g) Plexiglas chamber, and monitored by an (h) internal camera mounted on the wall of the (i) sound isolation chamber

Ribeiro MW, Neto JFR, Morya E, Brasil FL, de Araújo MFP (2017) OBAT: An open-source and low-cost operant box for auditory discriminative tasks. Behav Res Methods. doi: 10.3758/s13428-017-0906-6


Visual Stimuli Presentation Device

This apparatus is designed to present complex visual stimuli in rodent behavioral experiments, such as visual discrimination tasks or visually guided choice paradigms. This low-cost device uses an Arduino Uno microcontroller and three (green) 8×8 LED matrices to present a montage of visual cues across a behavioral arena. Diffusion filters were used to decrease the luminance of the visual cues in order to render them more suitable for rodent visual discrimination. The present design incorporates three light displays mounted above three choice ports (nose pokes, levers, etc.); however, as many as eight light displays can be controlled by a single Arduino. This flexible device can be programmed to display a multitude of distinct static and dynamic visual cues, can easily be integrated into an existing behavioral chamber, and can seamlessly interface with commercial systems such as MedPC. The wiring diagram and schematic below detail the configuration of this apparatus in a MedPC-based system; however, the device can be controlled by any comparable system, TTL signal, or other device in a behavioral chamber.
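To see how a static cue on such a display is typically represented: each 8×8 frame can be packed into eight row bytes, one bit per LED, which is the form a matrix driver chip ultimately receives. The Python sketch below is purely illustrative of that packing; the device itself is programmed in Arduino code using the Adafruit libraries, and the function name here is our own:

```python
def pattern_to_row_bytes(pattern):
    """Pack an 8x8 picture ('#' = LED on, '.' = LED off) into eight
    row bytes, with the leftmost column as the most significant bit."""
    assert len(pattern) == 8 and all(len(row) == 8 for row in pattern)
    return bytes(
        # Set bit (7 - col) for every lit LED in this row.
        sum(1 << (7 - col) for col, ch in enumerate(row) if ch == "#")
        for row in pattern
    )
```

For example, a cue consisting of a single lit left-hand column packs to eight bytes of 0x80, and a fully lit top row packs its first byte as 0xFF.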

Schematic
Wiring Diagram

Adafruit provides extensive documentation on assembly and programming of these components on their website.

Please contact us for the Arduino source code and the 3D design files of the mounts used to install this device in a behavioral chamber.