OptiMouse

August 2017

Yoram Ben-Shaul, of the Department of Medical Neurobiology at The Hebrew University, has shared the following about OptiMouse in an article published in BMC Biology:


OptiMouse is an open-source software package designed to semi-automatically analyze the positions of individual mice, specifically their nose positions, in a behavioral arena with the goal of minimizing error. The software was designed to provide highly accurate position detection and to make the entire analysis process accessible and free: it provides its own graphical user interface and requires no prior programming knowledge. OptiMouse differs from other position-tracking software in that it applies multiple detection algorithms to a single session while allowing seamless integration of custom functions, all controllable through the GUI. The software also makes it easy to identify frames with incorrect position detection, so that the settings for those frames can be adjusted, producing higher-quality data.
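
The value of running several detectors on the same session is easy to illustrate: frames where two algorithms' position estimates diverge are natural candidates for review. Below is a minimal Python sketch of that idea; the function name, threshold, and coordinate layout are assumptions for illustration, not OptiMouse's actual implementation.

```python
import math

def flag_disagreements(positions_a, positions_b, threshold=5.0):
    """Return indices of frames where two detectors' (x, y) nose-position
    estimates differ by more than `threshold` pixels (Euclidean distance).
    Illustrative sketch only, not OptiMouse's algorithm."""
    flagged = []
    for i, ((xa, ya), (xb, yb)) in enumerate(zip(positions_a, positions_b)):
        if math.hypot(xa - xb, ya - yb) > threshold:
            flagged.append(i)
    return flagged

# Detectors agree on frames 0 and 2, disagree on frame 1.
a = [(10.0, 10.0), (50.0, 50.0), (80.0, 80.0)]
b = [(11.0, 10.0), (70.0, 50.0), (81.0, 81.0)]
print(flag_disagreements(a, b))  # [1]
```

Flagged frames would then be re-inspected and re-run with adjusted settings, which is the workflow the paragraph above describes.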


Ben-Shaul, Y. (2017). OptiMouse: a comprehensive open source program for reliable detection and analysis of mouse body and nose positions. BMC Biology, 15(1). doi:10.1186/s12915-017-0377-3

The GitHub for this project may be found here.

Free-Behavior Monitoring and Reward System for Non-human Primates

August 2017

In Frontiers in Neuroscience, Tyler Libey and Eberhard E. Fetz share their open-source device for recording neural activity from freely behaving non-human primates in their home cages and administering rewards.


This device is designed to document bodily movement and neural activity and to deliver rewards to monkeys behaving freely in their home cages. It allows researchers to explore behaviors in freely moving non-human primates rather than relying solely on rigid, tightly controlled movements, which furthers our understanding of movement, reward, and the neural signals involved in these behaviors. Studying freely moving animals may offer essential insight into the neural signals associated with reward-guided movement, which in turn may guide the development of more accurate brain-machine interfaces. The behavior monitoring system incorporates existing untethered recording equipment (Neurochip) and a custom hub that controls a cage-mounted feeder to deliver short-latency rewards. A depth camera provides gross movement data streams from the home cage in addition to the recorded neural activity.
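
A common way to reduce a depth-camera stream to a gross movement signal is to difference consecutive frames and summarize the change per frame pair. The Python below is a hypothetical sketch of that idea (the function name, noise floor, and list-of-lists frame format are assumptions), not the authors' actual pipeline.

```python
def movement_index(prev_frame, curr_frame, noise_floor=2):
    """Summarize gross movement between two depth frames as the mean
    per-pixel depth change, ignoring changes at or below the noise
    floor. Frames are 2D lists of depth values (illustrative format)."""
    total, count = 0, 0
    for prev_row, curr_row in zip(prev_frame, curr_frame):
        for p, c in zip(prev_row, curr_row):
            d = abs(c - p)
            if d > noise_floor:
                total += d
            count += 1
    return total / count

still = [[100, 100], [100, 100]]
moved = [[100, 100], [100, 140]]
print(movement_index(still, moved))  # 10.0
```

A per-frame index like this could then be logged alongside the neural data to align periods of gross movement with recorded activity.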


Libey T and Fetz EE (2017) Open-Source, Low Cost, Free-Behavior Monitoring, and Reward System for Neuroscience Research in Non-human Primates. Front. Neurosci. 11:265. doi: 10.3389/fnins.2017.00265

Pulse Pal

July 12, 2017

Josh Sanders has also shared the following with OpenBehavior regarding Pulse Pal, an open source pulse train generator. Pulse Pal and Bpod, featured earlier, were both created by Sanworks.


Pulse Pal is an Arduino-powered device that generates precise sequences of voltage pulses for neural stimulation and stimulus control. It is controlled either through its APIs in MATLAB, Python, and C++, or as a stand-alone instrument using its OLED screen and a clickable thumb joystick. Pulse Pal can play independent stimulus trains on its output channels. These trains are either defined parametrically, or pulse-wise by specifying each pulse’s onset time and voltage. Two optically isolated TTL trigger channels can each be mapped to any subset of the output channels, which can range between -10V and +10V and deliver pulses as short as 100µs. This feature set allows Pulse Pal to serve as an open-source alternative to commercial stimulation timing devices, e.g., the Master 8 (AMPI), PSG-2 (ISSI), Pulsemaster A300 (WPI), BPG-1 (Bak Electronics), StimPulse PGM (FHC Inc.), and Multistim 3800 (A-M Systems).
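
The parametric mode mentioned above amounts to expanding a handful of parameters into explicit pulse onset times. The sketch below assumes a simple monophasic train with made-up parameter names; it illustrates the concept only and is not the Pulse Pal API or firmware.

```python
def build_pulse_train(frequency_hz, pulse_width_s, train_duration_s, voltage):
    """Expand a parametric train definition into (onset_time, voltage)
    pairs. Illustrative sketch of parametric pulse-train expansion,
    not Pulse Pal's actual implementation."""
    period = 1.0 / frequency_hz
    pulses = []
    t = 0.0
    while t + pulse_width_s <= train_duration_s:
        pulses.append((round(t, 9), voltage))  # round to absorb float drift
        t += period
    return pulses

# A 10 Hz train of 1 ms, 5 V pulses over 0.5 s yields 5 pulses.
train = build_pulse_train(10, 0.001, 0.5, 5.0)
print(len(train))   # 5
print(train[0])     # (0.0, 5.0)
```

The pulse-wise mode described in the text is the inverse: the user supplies the (onset, voltage) list directly instead of having it generated from parameters.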

Because Pulse Pal is an Arduino-powered device, modifying its firmware for custom applications is within the capabilities of most modern neuroscience research labs. As an example, the Pulse Pal GitHub repository provides an alternative firmware that entirely repurposes the device as a waveform generator. In this configuration, a user can specify a waveform, frequency, amplitude, and maximum playback duration, and toggle playback by TTL pulse with ~100µs latency. The firmware can also loop custom waveforms up to 40,000 samples long.

Pulse Pal was first published in 2014 by Josh Sanders, while he was a student in the Kepecs Lab at Cold Spring Harbor Laboratory. A significantly improved second-generation stimulator (Pulse Pal 2) became available in early 2016, coinciding with the opening of Sanworks LLC. Over the past year, more than 125 Pulse Pal 2 devices were sold at $545 each through the Sanworks assembly service, while several labs elected to build their own. The initial success of this product demonstrates that fully open-source hardware can make headway against closed-source competitors in the niche market of neuroscience instrumentation.

The Sanworks GitHub page for Pulse Pal may be found here.

The Wiki page for Pulse Pal, including assembly instructions, may be found here.

Bpod

July 10, 2017

Josh Sanders has shared the following with OpenBehavior regarding Bpod, an open platform for precision animal behavior measurement created by Sanworks.


Bpod is a measurement and control system for behavior research, most often used to implement operant (Go/NoGo, 2AFC) tasks. Its software controls a hierarchy of hardware modules, each powered by an Arduino-programmable microcontroller. Atop the hierarchy is a “state machine” module that accepts an abstract trial definition, relating detected behavioral events to progression through user-defined hardware states. On trial start, the module serves as a real-time controller in parallel with the non-real-time computer, reading inputs and updating outputs in 100µs cycles until it reaches an exit state. Measured events are then returned to the computer, where software updates user-defined online plots and loads the next trial’s state machine.
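
Conceptually, the trial definition is a table mapping (state, event) pairs to next states, which the machine steps through until it reaches an exit state. Below is a software-only Python sketch of that idea; the state and event names are invented for illustration, and this is not Bpod's firmware or its actual API.

```python
def run_trial(state_matrix, events, start='WaitForPoke'):
    """Step through a trial definition: each detected event moves the
    machine to the next state until 'exit'. Simplified sketch of the
    state-matrix concept, not Bpod's real-time implementation."""
    state = start
    visited = [state]
    for event in events:
        next_state = state_matrix.get(state, {}).get(event)
        if next_state is None:
            continue  # event not handled in the current state
        if next_state == 'exit':
            break
        state = next_state
        visited.append(state)
    return visited

# A Go/NoGo-style trial: poke -> cue -> reward on lick, then exit.
matrix = {
    'WaitForPoke': {'PokeIn': 'PlayCue'},
    'PlayCue':     {'CueDone': 'WaitForLick'},
    'WaitForLick': {'LickIn': 'Reward', 'Timeout': 'exit'},
    'Reward':      {'RewardDone': 'exit'},
}
print(run_trial(matrix, ['PokeIn', 'CueDone', 'LickIn', 'RewardDone']))
```

On real hardware the event loop runs on the microcontroller in 100µs cycles, with the host computer only sending the matrix and receiving the event log, which is what makes the timing independent of the non-real-time PC.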

The Bpod state machine has on-board hardware interfaces for TTL logic and behavior ports (a.k.a. nosepokes) containing a photogate to detect snout entry, a miniature solenoid valve for liquid reward, and a visible LED to deliver cues and feedback. Modules under state machine control specialize in larger solenoids, analog input and output, direct digital synthesis, and a gamepad interface (for human research). An Arduino shield is provided, for users to interface new sensors and actuators with the state machine.

By handling the time-critical logic relating measurements to environment control in an open-source embedded computing environment, the Bpod system provides experimenters with a powerful family of tools for rigor and automation in behavioral research.

Sanworks’ GitHub may be found here.

The Wiki page for Bpod, including assembly instructions and a Bill of Materials, may be found here.

Operant Box for Auditory Tasks (OBAT)

June 2, 2017

Mariana de Araújo has shared the following regarding OBAT, an operant box designed for auditory tasks, developed at the Edmond and Lily Safra International Institute of Neuroscience, Santos Dumont Institute, Macaíba, Brazil.


Fig. 1
Overview of the OBAT, inside the sound-attenuating chamber with the door open. External sources of heating were left outside the chamber: (a) Arduino Mega 2560 and shields and (b) the power amplifier. The modules controlling sound delivery, the response bars, and reward delivery can be seen in this lateral view: (c) speaker, (d) retractable bars, (e) reward delivery system, and (f) reward dispenser. The animal was kept inside the (g) Plexiglas chamber, and monitored by an (h) internal camera mounted on the wall of the (i) sound isolation chamber

OBAT is a low-cost operant box designed to train small primates in auditory tasks. The device presents auditory stimuli via an MP3 player shield connected to an Arduino Mega 2560 through an intermediate, custom-made shield. It also controls two touch-sensitive bars and a reward delivery system. A graphical user interface allows the experimenter to easily set the parameters of experimental sessions. All board schematics, source code, equipment specifications, and designs are available on GitHub and in the publication. Despite its low cost, OBAT has high temporal accuracy and reliably sends TTL signals to other equipment. Finally, the device was tested with one marmoset, showing that it can successfully be used to train these animals in an auditory two-choice task.


Ribeiro MW, Neto JFR, Morya E, Brasil FL, de Araújo MFP (2017) OBAT: An open-source and low-cost operant box for auditory discriminative tasks. Behav Res Methods. doi: 10.3758/s13428-017-0906-6

Autoreward2

May 26th, 2017

Jesús Ballesteros, Ph.D. (Learning and Memory Research Group, Department of Neurophysiology at Ruhr-University Bochum, Germany) has generously shared his project, called Autoreward2, with OpenBehavior. Autoreward2 is an Elegoo Uno-based system designed to detect and reward rodents in a modified T-maze task.


In designing their modified T-maze, Ballesteros found the need for an automatic reward delivery system. Using open-source resources, he aimed to create a system with the following capabilities:

  • Detect an animal at a certain point in a maze;
  • Deliver a certain amount of fluid through the desired licking port;
  • Provide visual cues that indicate which point in the maze has been reached;
  • Allow different modes to be easily selected using an interface;
  • Allow different working protocols (i.e., habituation, training, experimental, cleaning).

To achieve these aims, he used an Elegoo UNO R3 board to read inexpensive infrared beam-break detectors. Breaking any infrared beam causes the board to open one or two solenoid valves connected to a fluid tank. The valve remains open for around 75 milliseconds, allowing a single drop of fluid to form at the tip of the licking port. Additionally, the breadboard contains LEDs that signal to the researcher when an IR beam has been crossed.

Currently, a membrane keypad allows different protocols or modes to be selected. The system is powered through a 9V wall adapter, providing 3.3V to the LEDs and IR circuits and 9V to the solenoids.
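
The beam-break-to-reward timing described above can be sketched in a few lines. The Python below simulates only the logic (the function name, millisecond timestamps, and the choice to ignore re-triggers while the valve is open are assumptions for illustration; the actual system runs as Arduino-style code on the Elegoo board).

```python
def valve_openings(beam_break_times_ms, open_duration_ms=75):
    """Given beam-break timestamps (ms), return the (open, close)
    intervals of the solenoid valve. Breaks occurring while the valve
    is already open are ignored. Illustrative simulation only."""
    intervals = []
    valve_closes_at = -1
    for t in sorted(beam_break_times_ms):
        if t >= valve_closes_at:
            intervals.append((t, t + open_duration_ms))
            valve_closes_at = t + open_duration_ms
    return intervals

# A break at 30 ms falls inside the first 75 ms opening and is ignored.
print(valve_openings([0, 30, 100]))  # [(0, 75), (100, 175)]
```

Keeping the open duration fixed at ~75 ms is what yields a single, repeatable drop at the licking port.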

Importantly, the entire system can be built for under 80€. In the future, Ballesteros hopes to add a screen and an SD card port, and to switch the keypad out for a wireless interface.

The full description may be found here, the GitHub page for the project may be found here, and the sketch and Arduino code may be found here.

Link to share: https://edspace.american.edu/openbehavior/2017/05/26/autoreward2/

FinchScope

May 19th, 2017 

William Liberti, from the Gardner Lab at Boston University, has shared the following with Open Behavior regarding ‘FinchScope’. Although originally designed for finches, this 3D-printed single-photon fluorescence imaging microscope has since been adapted for rodents and other avian species.


The FinchScope project aims to provide a modular in-vivo optophysiology rig for awake, freely behaving animals, with a transparent acquisition and analysis pipeline. The goal is to produce a customizable and scalable single-photon fluorescence imaging microscope system that takes advantage of developing open-source analysis platforms. These tools are built from easily procured off-the-shelf components and 3D-printed parts.

We provide designs for a 3D-printed, lightweight, wireless-capable microscope and motorized commutator, designed for multi-month monitoring of the neural activity (via genetically encoded calcium indicators) of zebra finches while they sing their courtship songs. It has since been adapted for rodents and for other birds such as canaries.

The GitHub project page can be found here.

Link to share: https://edspace.american.edu/openbehavior/2017/05/19/finchscope/

Automated Rodent Tracker (ART)

May 5, 2017

Robyn Grant, from Manchester Metropolitan University, has shared the following with Open Behavior regarding the development of an automated rodent tracking (ART) program:


We have developed a program (ART, available from: http://mwa.bretthewitt.net/downloads.php) that can automatically track rodent position and movement. It is able to track head movements, body movements, and also aspects of body size. It is able to identify certain behaviours from video footage too, such as rotations, moving forwards, interacting with objects, and staying still. Our program is really flexible, so it can have additional modules that can be easily “plugged in”. For example, at the moment, it has a manual tracker module, which allows your automatic tracking to be validated with manual tracking points (using MWA: Hewitt, Yap & Grant 2016, Journal of Open Research Software). This versatility means that in the future other modules might be added, such as additional behaviour identifiers, or other trackers such as for feet or whiskers.

Our program, ART, is also very automatic. It has minimal user input, but still performs as well as other trackers that require a lot of manual processing. It can automatically find video frames where the mouse is present and will only track these frames; or you can specify to track only when the mouse is locomoting, or rotating, for example. We hope that this tracker will form a solid basis from which to investigate rodent behaviour further.
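
As a toy illustration of how behaviours such as forward locomotion, rotation, and staying still can be labeled from tracked positions, per-frame displacement and heading change can be thresholded. The thresholds, function name, and data layout below are invented for illustration and are not ART's actual classifier.

```python
import math

def classify_frames(positions, headings, move_thresh=2.0, turn_thresh=10.0):
    """Label each frame transition as 'forward', 'rotation', or 'still'
    from centroid displacement (pixels) and heading change (degrees).
    Hypothetical thresholds; illustrative sketch only."""
    labels = []
    for i in range(1, len(positions)):
        (x0, y0), (x1, y1) = positions[i - 1], positions[i]
        displacement = math.hypot(x1 - x0, y1 - y0)
        turn = abs(headings[i] - headings[i - 1])
        if displacement > move_thresh:
            labels.append('forward')
        elif turn > turn_thresh:
            labels.append('rotation')
        else:
            labels.append('still')
    return labels

pos = [(0, 0), (5, 0), (5, 0), (5, 1)]
hdg = [0, 0, 45, 46]
print(classify_frames(pos, hdg))  # ['forward', 'rotation', 'still']
```

A labeler like this is also what makes selective tracking possible: frames labeled 'still' could simply be skipped when the user asks to track only locomotion or rotation.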

ART may be downloaded here.

Link to share: https://edspace.american.edu/openbehavior/2017/05/05/automated-rodent-tracker-art/


FlyPi

April 28, 2017

In addition to sharing his Nose Poke Device, Dr. Andre Chagas from Open Neuroscience has also shared the following with Open Behavior regarding FlyPi:

The FlyPi is an open-source and affordable microscope/experimental setup (the basic configuration can be built for ~$100). A collaboration between Open Neuroscience and TReND in Africa, the device is based on the Raspberry Pi, an Arduino microcontroller, and off-the-shelf electronic components. So far it has been used for light microscopy, fluorescence microscopy, optogenetics experiments, behavioural tracking, and thermogenetic experiments. It has also been used as a teaching tool in workshops, where students use it as an entry point into the world of electronics and programming.

Due to the modular and portable design, new applications could be easily created by the community to solve unforeseen needs/problems.

Link to share: https://edspace.american.edu/openbehavior/2017/04/28/flypi/


Nose Poke Device

April 20, 2017 

Andre Chagas, creator of Open Neuroscience, has generously shared the following with OpenBehavior regarding an Arduino-based, 3D-printed nose poke device:


“This nose poke device was built as a “proof of principle”. The idea was to show that scientists, too, can leverage the open-source philosophy and the knowledge built by the community developing around open-source hardware. Moreover, the bill of materials was kept simple and affordable: one device can be built for ~25 dollars and should take 2-3 hours to build, including the time to print parts.

The device is organised as follows: the 3D-printed frame (which can also be built from other materials when a printer is not available) contains a hole where the animals are expected to insert their snouts. At the front of the hole, an infrared LED is aligned with an infrared detector, forming an “infrared curtain” at the hole’s entrance. If this curtain is interrupted, a signal is sent to a microcontroller (an Arduino in this case), which can be used to trigger other electronic components, such as a water pump, an LED indicator, or, in this case, a piezo buzzer.

At the back of the hole, a white LED is placed to indicate that the system is active and ready for “nose pokes”.

The microcontroller contains the code responsible for controlling the electronic parts, which can easily be changed, as it is written for Arduino and many code examples/tutorials (for beginners and experts alike) can be found online.”

Find more documentation on FigShare and Open Neuroscience.

Link to share: https://edspace.american.edu/openbehavior/2017/04/20/nose-poke-device/