
Laubach Lab GitHub Repository

The Laubach Lab at American University investigates executive control and decision making, focusing on the role of the prefrontal cortex. Through their GitHub repository, these researchers provide 3D print files for many of the behavioral devices used in their lab, including a nosepoke and a lickometer designed for rats. The repository also includes a script that reads MedPC data files into Python in a usable format.
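MedPC writes each session's data as lettered arrays, with each array header ("A:") followed by indented rows of "index: value value …" pairs. A minimal, hypothetical parser for that layout (a sketch of the general format, not the lab's actual script) might look like:

```python
import re

def parse_medpc_arrays(text):
    """Parse the lettered data arrays from a MedPC output file.

    MedPC lists each array as a bare letter header ("A:") followed by
    indented rows of "index: value value ..." pairs.  Header fields such
    as "Subject: 7" are skipped; single-value variables on the same line
    as their letter are not handled in this sketch.
    """
    arrays = {}
    current = None
    for line in text.splitlines():
        header = re.fullmatch(r"([A-Z]):", line.strip())
        if header:
            current = header.group(1)
            arrays[current] = []
        elif current and re.match(r"\s*\d+:", line):
            # Drop the leading row index; keep the data values.
            values = line.split(":", 1)[1].split()
            arrays[current].extend(float(v) for v in values)
        else:
            current = None  # any other line ends the current array block
    return arrays
```

In use, `parse_medpc_arrays(open(path).read())` returns a dict mapping array letters to flat lists of floats, ready for NumPy or pandas.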

Hao Chen lab, UTHSC – openBehavior repository

The openBehavior GitHub repository from Hao Chen's lab at UTHSC aims to establish a computing platform for rodent behavior research using the Raspberry Pi computer. They have built several devices for conducting operant conditioning and monitoring environmental data.

The operant licking device can be placed in a standard rat home cage and can run fixed ratio, variable ratio, or progressive ratio schedules. A preprint describing this project, including data on sucrose vs. water intake, is available. Detailed instructions for making the device are also provided.
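The three reinforcement schedules differ only in how the response requirement is chosen after each reward: fixed ratio keeps it constant, variable ratio draws it around a mean, and progressive ratio increments it. A schematic Python sketch of that logic (illustrative, not the published code):

```python
import random

class RatioSchedule:
    """Minimal sketch of fixed (FR), variable (VR), and progressive (PR)
    ratio schedules.  Each lick is passed to record_response(), which
    reports whether the current response requirement has been met.
    """

    def __init__(self, mode="FR", ratio=5, step=2, seed=None):
        self.mode, self.ratio, self.step = mode, ratio, step
        self.rng = random.Random(seed)
        self.rewards = 0
        self.count = 0
        self.requirement = self._next_requirement()

    def _next_requirement(self):
        if self.mode == "FR":   # fixed: same requirement every time
            return self.ratio
        if self.mode == "VR":   # variable: drawn with mean == ratio
            return self.rng.randint(1, 2 * self.ratio - 1)
        # PR: requirement grows by `step` after each earned reward
        return self.ratio + self.step * self.rewards

    def record_response(self):
        """Register one lick; return True if it earns a reward."""
        self.count += 1
        if self.count >= self.requirement:
            self.count = 0
            self.rewards += 1
            self.requirement = self._next_requirement()
            return True
        return False
```

For example, `RatioSchedule(mode="FR", ratio=5)` rewards every fifth lick, while `mode="PR"` makes each successive reward cost `step` more responses than the last.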

The environment sensor can record the temperature, humidity, barometric pressure, and illumination at fixed time intervals and automatically transfer the data to a remote server.
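Fixed-interval logging of this kind reduces to a timestamped row per reading. In the sketch below the sensor read is a caller-supplied function, since the specific sensor models and field names here are illustrative rather than taken from the published design:

```python
import csv
import io
import time

def log_reading(writer, read_sensors, clock=time.time):
    """Append one timestamped row of environmental readings to a CSV.

    `read_sensors` is a caller-supplied function returning a dict of
    temperature (deg C), humidity (%), pressure (hPa), and illumination
    (lux).  On the real device this would query the attached sensors;
    the field names here are illustrative.
    """
    r = read_sensors()
    writer.writerow([round(clock(), 1), r["temp_c"], r["humidity_pct"],
                     r["pressure_hpa"], r["lux"]])
```

A fixed-interval loop would call `log_reading` every few minutes (e.g. via `time.sleep` or cron) and periodically upload the resulting file to the remote server.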

There is also a stand-alone RFID reader for EM4100 implantable glass chips, a motion-sensor add-on for standard operant chambers, and several other devices.

WaveSurfer

WaveSurfer is an open-source application for neurophysiology data acquisition and analysis. The program is written in MATLAB and evolved from an earlier open-source software package called Ephus. WaveSurfer is currently pre-release, but can be downloaded from the WaveSurfer webpage or the WaveSurfer GitHub repository.

The project was initiated by the Svoboda Lab and developed as a collaborative effort between several research groups at the Howard Hughes Medical Institute's Janelia Research Campus. Janelia is a major proponent of collaboration and open science, providing documentation for dozens of tools and innovations developed on its campus through its webpage, including several tools specific to behavioral neuroscience research.


Automated Home-Cage Functional Imaging

Timothy Murphy and his colleagues at the University of British Columbia have developed an automated system for mesoscopic functional imaging that allows subjects to self-initiate head-fixation and imaging within the home cage. In their 2016 paper, "High-throughput automated home-cage mesoscopic functional imaging of mouse cortex," Dr. Murphy and his colleagues present this device and demonstrate its use with a group of calcium indicator transgenic mice. The supplementary material to this paper includes a diagram of the hardware, a graphic representation of the training cage, several videos of subjects interacting with the device, and sample imaging data. The Python source code and 3D print files can be found on Dr. Murphy's UBC webpage.

Murphy, T. H., Boyd, J. D., Bolaños, F., Vanni, M. P., Silasi, G., Haupt, D., & LeDue, J. M. (2016). High-throughput automated home-cage mesoscopic functional imaging of mouse cortex. Nature Communications, 7, 11611.

Nose-Poke System – Kelly Tan Research Group

The Kelly Tan research group at the University of Basel, Switzerland investigates the neural correlates of motor behavior, focusing on the role of the basal ganglia in controlling various aspects of motor actions. To aid in their investigation, the group has developed an open-source nose-poke system utilizing an Arduino microcontroller, several low-cost electronic components, and a PVC behavioral arena. These researchers have shared the following information about the project:

Giorgio Rizzi, Meredith E. Lodge, Kelly R. Tan.
MethodsX 3 (2016) 326–332
Operant behavioral tasks for animals have long been used to probe the function of multiple brain regions. The recent development of tools and techniques has opened the door to refine the answer to these same questions with a much higher degree of specificity and accuracy, both in biological and spatial-temporal domains. A variety of systems designed to test operant behavior are now commercially available, but have prohibitive costs. Here, we provide a low-cost alternative to a nose poke system for mice. Adapting a freely available sketch for Arduino boards, in combination with an in-house-built PVC box and inexpensive electronic components, we constructed a four-port nose poke system that detects and counts port entries.
  • We provide a low-cost alternative to commercially available nose-poke systems.
  • Our custom-made apparatus is open source and TTL compatible.
  • We validate our system with optogenetic self-stimulation of dopamine neurons in mice.
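At its core, detecting and counting port entries is edge detection on a sampled beam-break signal, with a short debounce so sensor flicker is not counted as an entry. A minimal sketch of that logic (not the group's published Arduino code, which handles this on the microcontroller):

```python
def count_entries(samples, min_consecutive=3):
    """Count port entries from a sampled beam-break signal.

    `samples` is a sequence of 0/1 readings (1 = beam broken).  An entry
    is counted on each low-to-high transition that stays high for at
    least `min_consecutive` samples -- a simple software debounce.
    """
    entries = 0
    run = 0        # length of the current high run
    counted = False  # has this high run already been counted?
    for s in samples:
        if s:
            run += 1
            if run >= min_consecutive and not counted:
                entries += 1
                counted = True
        else:
            run = 0
            counted = False
    return entries
```

The same edge, mirrored on a digital output pin, is what makes the apparatus TTL compatible: each debounced entry can drive a pulse to external recording hardware.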


The Kelly Tan research group provides further documentation for this device, including SketchUp design files, Arduino source code, and a full bill of materials, as supplementary data in their 2016 paper.

ArduiPod Box

ArduiPod Box is a simple touchscreen-based operant conditioning chamber that utilizes an iPod Touch in conjunction with an Arduino microcontroller to present visual and auditory stimuli, record behavior in the form of nose-pokes or screen touches, and deliver liquid reward. In his 2014 paper, Oskar Pineño introduces the ArduiPod Box and demonstrates the use of the device in a visual discrimination task.

ArduiPod Box relies on an open-source iOS app named Shaping that can be downloaded for free from the iTunes store, as well as from Dr. Pineño's website. Detailed instructions for assembly of the ArduiPod Box are also provided on the website. In addition, a video demonstrating the ArduiPod Box can be found here.



Pineño, Oskar (2014). ArduiPod Box: a low-cost and open-source Skinner box using an iPod Touch and an Arduino microcontroller. Behav Res Methods, 46(1), 196–205.

Lickometer – Feldman Lab

Brian Isett, a graduate researcher in the Feldman Lab at UC Berkeley, writes, "Measuring licks using a lickometer can provide an intuitive and simple signal for scientists studying many aspects of rodent behavior. Commercial lickometers are often bulky and expensive, easily costing a few hundred dollars. In the Feldman Lab, we designed a small and inexpensive lickometer with parts costing less than $20. The lickometer employs an infrared beam and sensor to minimize electrical noise artifacts during neurophysiology experiments and can be easily mounted in a micromanipulator for precise and repeatable positioning.
This open-source lickometer was designed in conjunction with an open-source water delivery system.  Together, these provide the basic hardware for a DIY behavioral assay and reward system for mice.”
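Once the IR sensor yields a stream of lick-onset timestamps, a quick sanity check is the inter-lick interval distribution, since rodents lick in stereotyped bursts at roughly 6–8 Hz. A small illustrative summary function (an assumption about the downstream analysis, not the Feldman Lab's code):

```python
def lick_stats(timestamps):
    """Summarize lick-onset times (in seconds) from IR beam-break events.

    Returns the lick count, mean inter-lick interval, and mean lick rate
    (Hz).  A mean rate far from the typical rodent 6-8 Hz burst rate is
    a useful flag for sensor noise or missed licks.
    """
    n = len(timestamps)
    if n < 2:
        return {"licks": n, "mean_ili_s": None, "rate_hz": None}
    ilis = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean_ili = sum(ilis) / len(ilis)
    return {"licks": n, "mean_ili_s": mean_ili, "rate_hz": 1.0 / mean_ili}
```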

Syringe Pump – Pearce Research Group

In their 2014 paper "Open-Source Syringe Pump Library," Bas Wijnen, Emily Hunt, Gerald Anzalone, and Joshua Pearce detail an open-source syringe pump apparatus developed in their lab and validate the performance of the device. The authors write, "This syringe pump was designed using freely available open-source computer aided design (CAD) software and manufactured using an open-source RepRap 3-D printer and readily available parts. The design, bill of materials and assembly instructions are globally available to anyone wishing to use them on the Open-source syringe pump Appropedia page… The cost of the entire system, including the controller and web-based control interface, is on the order of 5% or less than one would expect to pay for a commercial syringe pump having similar performance. The design should suit the needs of a given research activity requiring a syringe pump including carefully controlled dosing of reagents, pharmaceuticals, and delivery of viscous 3-D printer media among other applications."
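The dosing arithmetic behind any leadscrew-driven syringe pump is simple: plunger travel equals dose volume divided by the syringe bore area, and travel converts to motor steps through the leadscrew pitch and step resolution. A worked sketch with illustrative parameter values (not the published design's specific hardware):

```python
import math

def steps_for_volume(volume_ml, syringe_diameter_mm, pitch_mm_per_rev,
                     steps_per_rev=200, microstepping=16):
    """Convert a target dose into stepper-motor microsteps.

    The plunger advances pitch/(steps_per_rev * microstepping) mm per
    microstep; dispensed volume is plunger travel times the syringe
    bore area.  Default motor parameters are typical for a RepRap-style
    stepper, not taken from the paper.
    """
    area_mm2 = math.pi * (syringe_diameter_mm / 2) ** 2
    travel_mm = (volume_ml * 1000.0) / area_mm2      # 1 mL = 1000 mm^3
    steps_per_mm = steps_per_rev * microstepping / pitch_mm_per_rev
    return round(travel_mm * steps_per_mm)
```

For a hypothetical 20 mm bore syringe on a 2 mm pitch leadscrew, a 1 mL dose works out to roughly five thousand microsteps, which is why such pumps achieve microliter-scale resolution.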

The Pearce Research Group also provides an Open Source Lab page dedicated to low-cost, open-source lab hardware.

Wijnen, Bas; Hunt, Emily; Anzalone, Gerald; Pearce, Joshua (2014). Open-Source Syringe Pump Library. PLoS ONE, 9(9), e107216.

Visual Stimuli Presentation Device

This apparatus is designed to present complex visual stimuli in rodent behavioral experiments, such as visual discrimination tasks or visually guided choice paradigms. This low-cost device utilizes an Arduino Uno microcontroller and three green 8×8 LED matrices to present a montage of visual cues across a behavioral arena. Diffusion filters were used to decrease the luminance of the visual cues in order to render them more suitable for rodent visual discrimination. The present design incorporates three light displays to be mounted above three choice ports (nose pokes, levers, etc.); however, as many as eight light displays can be controlled by a single Arduino. This flexible device can be programmed to display a multitude of distinct static and dynamic visual cues, can easily be integrated into an existing behavioral chamber, and can seamlessly interface with commercial systems such as MedPC. The wiring diagram and schematic below detail the configuration of this apparatus in a MedPC-based system; however, this device can be controlled by any comparable system, TTL signal, or other device in a behavioral chamber.
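Each cue on an 8×8 matrix is just eight row bytes, one bit per LED, which is the row-wise bitmap layout Adafruit's matrix libraries accept. A small illustrative sketch for designing and previewing patterns on a host machine before flashing them to the Arduino (the pattern and renderer are examples, not the device's source code):

```python
def render(pattern):
    """Render an 8x8 bitmap (eight row bytes, MSB = leftmost LED) as text,
    so a cue can be previewed before it is sent to the LED matrix."""
    return "\n".join(
        "".join("#" if row & (0x80 >> col) else "." for col in range(8))
        for row in pattern
    )

# Example static cue: a plus sign, one byte per row, MSB-first.
PLUS = [0b00011000, 0b00011000, 0b00011000, 0b11111111,
        0b11111111, 0b00011000, 0b00011000, 0b00011000]
```

Calling `print(render(PLUS))` draws the plus-sign cue with `#` for lit LEDs; dynamic cues are simply sequences of such eight-byte frames pushed to the matrix at a fixed frame interval.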

Schematic and wiring diagram

Adafruit provides extensive documentation on assembly and programming of these components on their website.

Please contact us for the Arduino source code and the 3D design files of the mounts used to install this device in a behavioral chamber.

Feeding Experimentation Device (FED)

Feeding Experimentation Device (FED) is a home cage-compatible feeding system that measures food intake with high accuracy and temporal resolution. FED offers a low-cost alternative (~$350) to commercial feeders, with the convenience of use in traditional colony rack caging.

In their 2016 paper, "Feeding Experimentation Device (FED): A flexible open-source device for measuring feeding behavior," Katrina P. Nguyen, Timothy J. O'Neal, Olurotimi A. Bolonduro, Elecia White, and Alexxai V. Kravitz validate the reliability of food delivery and the precise measurement of feeding behavior provided by FED, and demonstrate its application in one experiment examining light- and dark-cycle feeding trends and another measuring optogenetically evoked feeding.


KravitzLab has shared the Arduino scripts for controlling FED, as well as the Python code used to analyze the feeding data FED collects, on the KravitzLab GitHub. Additionally, build instructions and power considerations are detailed on the FED Wiki page, and 3D design files are provided through TinkerCAD.
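Because FED timestamps every pellet retrieval, the light/dark-cycle comparison from the paper reduces to a one-pass tally over clock times. An illustrative sketch (the cycle boundaries and function name here are assumptions, not the KravitzLab analysis code):

```python
def light_dark_counts(pellet_times_h, lights_on=7, lights_off=19):
    """Split pellet-retrieval times (hour of day, 0-24) into light vs.
    dark phase counts.

    A 12:12 cycle with lights on at 07:00 is assumed here; adjust the
    boundaries to match the colony room's actual light schedule.
    """
    light = sum(1 for t in pellet_times_h
                if lights_on <= t % 24 < lights_off)
    return {"light": light, "dark": len(pellet_times_h) - light}
```

Mice, being nocturnal, should show a strong dark-phase bias in such a tally, which is the pattern the paper's light/dark experiment examines.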

Nguyen, Katrina; O’Neal, Timothy; Bolonduro, Olurotimi; White, Elecia; Kravitz, Alexxai (2016). Feeding Experimentation Device (FED): A flexible open-source device for measuring feeding behavior. J Neurosci Methods, 267:108-14.