
OMR Arena

September 7, 2017

Researchers at the National Eye Institute and the University of Oldenburg, Germany, have developed the OMR-Arena for measuring visual acuity in mice.


The OMR-Arena is an automated measurement and stimulation system developed to determine visual thresholds in mice. It serves as an optometer for characterizing the visual performance of freely moving mice: a video-tracking system monitors the animal's head while appropriate 360° stimuli are presented, and the head tracker is used both to adjust the stimulus to the current head position and to automatically calculate visual acuity. In addition to being open source and affordable, the device offers researchers an objective way to measure visual performance in freely moving mice.
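The key trick, correcting the stimulus for the animal's head position, comes down to simple geometry. Below is a minimal illustrative sketch; the arena radius, function names, and example spatial frequency are assumptions for illustration, not values from the paper:

```python
# Minimal sketch of head-position correction: draw the grating on the
# surrounding screen so it has a constant spatial frequency *at the eye*,
# wherever the tracked head happens to be. Hypothetical geometry and names.
import math

ARENA_RADIUS_CM = 30.0  # assumed cylindrical screen radius

def grating_period_on_screen(head_x, head_y, azimuth_deg, target_cpd):
    """Linear period (cm) to draw at a given screen azimuth so the grating
    has target_cpd cycles/degree as seen from the tracked head position."""
    az = math.radians(azimuth_deg)
    # Point on the cylindrical screen at this azimuth
    sx, sy = ARENA_RADIUS_CM * math.cos(az), ARENA_RADIUS_CM * math.sin(az)
    # Viewing distance from the head to that point
    dist = math.hypot(sx - head_x, sy - head_y)
    # One cycle should subtend 1/target_cpd degrees at the eye
    cycle_deg = 1.0 / target_cpd
    return 2.0 * dist * math.tan(math.radians(cycle_deg) / 2.0)

# A mouse sitting off-center needs a finer grating on the near wall:
print(grating_period_on_screen(10.0, 0.0, 0.0, 0.37))    # near wall
print(grating_period_on_screen(10.0, 0.0, 180.0, 0.37))  # far wall
```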


Kretschmer F, Kretschmer V, Kunze VP, Kretzberg J (2013) OMR-Arena: Automated Measurement and Stimulation System to Determine Mouse Visual Thresholds Based on Optomotor Responses. PLoS ONE 8(11): e78058. https://doi.org/10.1371/journal.pone.0078058

Open-Source Platform for Sensory Tasks

August 2017

Lucy Palmer and Andrew Micallef, of the Florey Institute of Neuroscience and Mental Health, University of Melbourne, Melbourne, VIC, Australia, have shared the following Arduino- and Python-based platform for Go/No-Go tasks in an article published in Frontiers in Cellular Neuroscience.


The Go/No-Go sensory task requires an animal to report a decision in response to a stimulus: in "Go" trials, the subject must respond to a target stimulus with an action, while in "No-Go" trials the subject must withhold its response. To implement this task, the authors built a behavioral platform consisting of three main components: 1) a water reward delivery system, 2) a lick sensor, and 3) a sensory stimulation apparatus. The water reward is administered by a gravity-fed water system controlled by a solenoid pinch valve, while licking is monitored by a custom-made piezo-based sensor. An Arduino Uno Rev3 controls stimulus and reward delivery and simultaneously records lick frequency and timing through the piezo sensor. A Python script employing the pyserial library handles communication between the Arduino and a host computer.
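As a minimal sketch of what that host-side communication can look like, the snippet below drives a trial over pyserial. The serial protocol here (single-character trial commands, "LICK <ms>" lines) is hypothetical; the authors' actual scripts are in their repository:

```python
# Host-side sketch of Arduino <-> Python communication with pyserial.
# Port name and message format are assumptions, not the published protocol.
import serial
import time

ard = serial.Serial('/dev/ttyACM0', 115200, timeout=0.1)
time.sleep(2)  # wait for the Arduino to reset after the port opens

def run_trial(go_trial, duration_s=3.0):
    ard.write(b'G' if go_trial else b'N')  # tell the Arduino which stimulus to play
    licks = []
    t0 = time.time()
    while time.time() - t0 < duration_s:
        line = ard.readline().decode(errors='ignore').strip()
        if line.startswith('LICK'):
            licks.append(float(line.split()[1]))  # lick time in ms, Arduino clock
    return licks

licks = run_trial(go_trial=True)
print(f'{len(licks)} licks detected')
```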


The GitHub for the project may be found here.

OptiMouse

August 2017

Yoram Ben-Shaul, of the Department of Medical Neurobiology at The Hebrew University, has shared the following about OptiMouse in an article published in BMC Biology:


OptiMouse is open-source software designed to semi-automatically analyze the positions of individual mice, specifically their nose positions, in a behavioral arena while minimizing error. It was designed to provide highly accurate position detection and to make the entire analysis process accessible and free: it has its own graphical user interface and requires no prior programming knowledge. OptiMouse differs from other position-tracking software in that it applies multiple detection algorithms to a single session and allows seamless integration of custom functions, all controllable through the GUI. The software also makes it easy to identify frames with incorrect position detection so that the settings for those frames can be adjusted, producing higher-quality data.
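The core idea, running several detectors on each frame and flagging frames where none gives a plausible position, can be conveyed in a few lines of Python. This is illustrative only; OptiMouse itself is a MATLAB application with its own detection algorithms:

```python
# Sketch of multi-detector position tracking with flagging of suspect frames.
# Both detectors here are toy examples, not OptiMouse's algorithms.
import numpy as np

def detect_bright_blob(frame):
    ys, xs = np.nonzero(frame > frame.mean() + 2 * frame.std())
    return (np.nan, np.nan) if xs.size == 0 else (xs.mean(), ys.mean())

def detect_darkest_point(frame):
    y, x = np.unravel_index(np.argmin(frame), frame.shape)
    return (float(x), float(y))

DETECTORS = [detect_bright_blob, detect_darkest_point]

def track(frames, max_jump=20.0):
    """Per frame, keep the first detector whose position is plausible given
    the previous frame; frames where every detector disagrees get flagged
    for manual review, mirroring OptiMouse's GUI workflow."""
    positions, flagged, prev = [], [], None
    for i, frame in enumerate(frames):
        candidates = [d(frame) for d in DETECTORS]
        ok = [c for c in candidates
              if prev is None or np.hypot(c[0] - prev[0], c[1] - prev[1]) < max_jump]
        pos = ok[0] if ok else candidates[0]
        if not ok:
            flagged.append(i)   # candidate frame for manual adjustment
        positions.append(pos)
        prev = pos
    return positions, flagged

positions, flagged = track(np.random.rand(100, 48, 64))  # stand-in video
```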


Ben-Shaul, Y. (2017). OptiMouse: a comprehensive open source program for reliable detection and analysis of mouse body and nose positions. BMC Biology, 15(1). doi:10.1186/s12915-017-0377-3

The GitHub for this project may be found here.

Free-Behavior Monitoring and Reward System for Non-human Primates

August 2017

In Frontiers in Neuroscience, Tyler Libey and Eberhard E. Fetz share their open-source device for recording neural activity from freely behaving non-human primates in their home cages and for administering rewards.


This device is designed to document bodily movement and neural activity, and to deliver rewards, in monkeys behaving freely in their home cages. It allows researchers to explore natural behaviors in freely moving non-human primates rather than relying solely on rigid, tightly controlled movements. Studying freely moving animals may offer essential insight into the neural signals associated with reward-guided movement, which in turn may guide the development of more accurate brain-machine interfaces. The behavior monitoring system combines existing untethered recording equipment, the Neurochip, with a custom hub that controls a cage-mounted feeder to deliver short-latency rewards. A depth camera provides gross movement data streams from the home cage in addition to the recorded neural activity.
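As a rough sketch of the closed-loop principle, reward delivery might be keyed to a running estimate of firing rate. This is not the authors' code; the port name, command byte, and threshold below are placeholders:

```python
# Hypothetical closed-loop reward: when a recorded activity measure crosses
# a threshold, command the feeder hub at short latency. The real system uses
# the Neurochip and a custom cage-mounted feeder hub.
import serial
import time

hub = serial.Serial('/dev/ttyUSB0', 9600)  # assumed feeder-hub port and baud
REWARD_CMD = b'F'                          # hypothetical "run feeder" command

RATE_THRESHOLD = 40.0   # spikes/s; arbitrary example value
WINDOW_S = 0.5          # window for the running rate estimate
REFRACTORY_S = 5.0      # minimum spacing between rewards

last_reward = -1e9

def on_spikes(spike_times):
    """Called with recent spike timestamps; rewards high firing at short latency."""
    global last_reward
    now = time.time()
    rate = sum(1 for t in spike_times if now - t < WINDOW_S) / WINDOW_S
    if rate > RATE_THRESHOLD and now - last_reward > REFRACTORY_S:
        hub.write(REWARD_CMD)   # hub drives the cage-mounted feeder
        last_reward = now
```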


Libey T and Fetz EE (2017) Open-Source, Low Cost, Free-Behavior Monitoring, and Reward System for Neuroscience Research in Non-human Primates. Front. Neurosci. 11:265. doi: 10.3389/fnins.2017.00265

Pulse Pal

July 12, 2017

Josh Sanders has also shared the following with OpenBehavior regarding Pulse Pal, an open-source pulse train generator. Pulse Pal and Bpod, featured earlier, were both created by Sanworks.


Pulse Pal is an Arduino-powered device that generates precise sequences of voltage pulses for neural stimulation and stimulus control. It is controlled either through its APIs in MATLAB, Python, and C++, or as a stand-alone instrument using its OLED screen and clickable thumb joystick. Pulse Pal can play independent stimulus trains on its output channels. These trains are either defined parametrically or pulse-wise, by specifying each pulse's onset time and voltage. Two optically isolated TTL trigger channels can each be mapped to any subset of the output channels, which range between -10 V and +10 V and deliver pulses as short as 100 µs. This feature set allows Pulse Pal to serve as an open-source alternative to commercial stimulation timing devices, e.g., the Master 8 (AMPI), PSG-2 (ISSI), Pulsemaster A300 (WPI), BPG-1 (Bak Electronics), StimPulse PGM (FHC Inc.), and Multistim 3800 (A-M Systems).
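The distinction between parametric and pulse-wise train definitions can be illustrated with a short sketch. Function and parameter names here are illustrative, not Pulse Pal's actual API; its MATLAB/Python/C++ APIs expose equivalent settings:

```python
# Two ways to describe the same stimulus train.
def parametric_train(delay_s, pulse_width_s, period_s, train_duration_s, voltage):
    """Expand parametric settings into explicit (onset_time, voltage) pulses."""
    pulses, t = [], delay_s
    while t + pulse_width_s <= delay_s + train_duration_s:
        pulses.append((t, voltage))
        t += period_s
    return pulses

# Parametric: a 20 Hz train of 1 ms, 5 V pulses for 0.5 s after a 10 ms delay
train = parametric_train(0.010, 0.001, 0.050, 0.5, 5.0)

# Pulse-wise: the same train could instead be given explicitly,
# one (onset, voltage) pair per pulse:
print(train[:3])  # ~[(0.01, 5.0), (0.06, 5.0), (0.11, 5.0)]
```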

Because Pulse Pal is an Arduino-powered device, modifying its firmware for custom applications is within the capabilities of most modern neuroscience research labs. As an example, Pulse Pal's GitHub repository provides alternative firmware that entirely repurposes the device as a waveform generator. In this configuration, a user can specify a waveform, frequency, amplitude, and maximum playback duration, and toggle playback by TTL pulse with ~100 µs latency. The firmware can also loop custom waveforms up to 40,000 samples long.
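For example, a custom waveform for this firmware might be prepared as follows. The 10 kHz sampling rate is an assumption for illustration; check the repository for the firmware's actual rate and upload mechanism:

```python
# Build a custom waveform within the firmware's 40,000-sample limit.
import numpy as np

def make_sine(freq_hz, amplitude_v, duration_s, sample_rate_hz=10_000):
    n = int(duration_s * sample_rate_hz)
    assert n <= 40_000, "firmware loops waveforms up to 40,000 samples"
    t = np.arange(n) / sample_rate_hz
    return amplitude_v * np.sin(2 * np.pi * freq_hz * t)

wave = make_sine(freq_hz=50, amplitude_v=2.5, duration_s=2.0)  # 20,000 samples
```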

Pulse Pal was first published in 2014 by Josh Sanders, while he was a student in the Kepecs Lab at Cold Spring Harbor Laboratory. A significantly improved second-generation stimulator (Pulse Pal 2) became available in early 2016, coincident with the opening of Sanworks LLC. Over the past year, more than 125 Pulse Pal 2 devices were sold at $545 each through the Sanworks assembly service, while several labs elected to build their own. The initial success of this product demonstrates that fully open-source hardware can make headway against closed-source competitors in the niche market of neuroscience instrumentation.

The Sanworks GitHub page for Pulse Pal may be found here.

The Wiki page for Pulse Pal, including assembly instructions, may be found here.

Bpod

July 10, 2017

Josh Sanders has shared the following with OpenBehavior regarding Bpod, an open platform for precision animal behavior measurement created by Sanworks.


Bpod is a measurement and control system for behavior research, most often used to implement operant (Go/NoGo, 2AFC) tasks. Its software controls a hierarchy of hardware modules, each powered by an Arduino-programmable microcontroller. Atop the hierarchy is a "state machine" module that accepts an abstract trial definition relating detected behavioral events to progression through user-defined hardware states. On trial start, the module serves as a real-time controller in parallel with the non-real-time computer, reading inputs and updating outputs in 100 µs cycles until it reaches an exit state. Measured events are then returned to the computer, where software updates user-defined online plots and loads the next trial's state machine.
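The flavor of such an abstract trial definition can be conveyed with a small Python simulation. Bpod's real API is MATLAB-based, and the state and event names below are illustrative:

```python
# A trial as named states with event -> next-state transitions, run until an
# exit state is reached. In the real system, each state's timer elapsing
# generates a 'Tup' event on the state machine module itself.
trial = {
    'WaitForPoke': {'timer': 0,   'on': {'Port1In': 'Stimulus'}},
    'Stimulus':    {'timer': 0.5, 'on': {'Tup': 'Reward', 'Port1Out': 'exit'}},
    'Reward':      {'timer': 0.1, 'on': {'Tup': 'exit'}},
}

def run(trial, events):
    """Step through states given a scripted event stream (a stand-in for the
    real-time 100 us update loop on the state machine module)."""
    state, log = 'WaitForPoke', []
    for ev in events:
        log.append((state, ev))
        nxt = trial[state]['on'].get(ev)
        if nxt == 'exit':
            return log
        if nxt:
            state = nxt
    return log

print(run(trial, ['Port1In', 'Tup', 'Tup']))
```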

The Bpod state machine has on-board hardware interfaces for TTL logic and behavior ports (a.k.a. nosepokes) containing a photogate to detect snout entry, a miniature solenoid valve for liquid reward, and a visible LED to deliver cues and feedback. Modules under state machine control specialize in larger solenoids, analog input and output, direct digital synthesis, and a gamepad interface (for human research). An Arduino shield is provided for users to interface new sensors and actuators with the state machine.

By handling the time-critical logic relating measurements to environment control in an open-source embedded computing environment, the Bpod system provides experimenters with a powerful family of tools for rigor and automation in behavioral research.

Sanworks’ GitHub may be found here.

The Wiki page for Bpod, including assembly instructions and a bill of materials, may be found here.

Operant Box for Auditory Tasks (OBAT)

June 2, 2017

Mariana de Araújo has shared the following regarding OBAT, an operant box designed for auditory tasks, developed at the Edmond and Lily Safra International Institute of Neuroscience, Santos Dumont Institute, Macaíba, Brazil.


Fig. 1: Overview of the OBAT inside the sound-attenuating chamber with the door open. External sources of heating were left outside the chamber: (a) Arduino Mega 2560 and shields and (b) the power amplifier. The modules controlling sound delivery, the response bars, and reward delivery can be seen in this lateral view: (c) speaker, (d) retractable bars, (e) reward delivery system, and (f) reward dispenser. The animal was kept inside the (g) Plexiglas chamber and monitored by an (h) internal camera mounted on the wall of the (i) sound isolation chamber.

OBAT is a low-cost operant box designed to train small primates in auditory tasks. The device presents auditory stimuli via an MP3 player shield connected to an Arduino Mega 2560 through an intermediate, custom-made shield, and it also controls two touch-sensitive bars and a reward delivery system. A graphical user interface allows the experimenter to easily set the parameters of experimental sessions. All board schematics, source code, and equipment specifications and designs are available on GitHub and in the publication. Despite its low cost, OBAT has high temporal accuracy and reliably sends TTL signals to other equipment. Finally, the device was tested with one marmoset, showing that it can successfully be used to train these animals in an auditory two-choice task.
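A hypothetical sketch of the resulting two-choice trial flow, with a stub standing in for the Arduino-driven hardware (method names are illustrative; the real firmware and GUI are in the authors' repository):

```python
# OBAT-style two-choice trial logic with a stand-in for the rig.
import random

class RigStub:
    """Stand-in for the Arduino rig: MP3 shield, retractable bars, dispenser."""
    def play(self, f): print('playing', f)
    def extend_bars(self): print('bars out')
    def retract_bars(self): print('bars in')
    def wait_for_press(self, timeout): return random.choice(['left', 'right', None])
    def dispense_reward(self, ms): print('reward', ms, 'ms')

STIMULI = {'left': 'toneA.mp3', 'right': 'toneB.mp3'}

def run_trial(rig):
    target = random.choice(['left', 'right'])
    rig.play(STIMULI[target])                # auditory stimulus via MP3 shield
    rig.extend_bars()
    press = rig.wait_for_press(timeout=5.0)  # touch-sensitive response bars
    rig.retract_bars()
    if press == target:                      # correct choice -> water reward
        rig.dispense_reward(300)
    return target, press

print(run_trial(RigStub()))
```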


Ribeiro MW, Neto JFR, Morya E, Brasil FL, de Araújo MFP (2017) OBAT: An open-source and low-cost operant box for auditory discriminative tasks. Behav Res Methods. doi: 10.3758/s13428-017-0906-6

FinchScope

May 19, 2017

William Liberti, from the Gardner Lab at Boston University, has shared the following with OpenBehavior regarding FinchScope. Although originally designed for finches, the 3D-printed single-photon fluorescence imaging microscope has since been adapted for rodents and other avian species.


The FinchScope project aims to provide a modular in vivo optophysiology rig for awake, freely behaving animals, with a transparent acquisition and analysis pipeline. The goal is to produce a customizable and scalable single-photon fluorescence imaging microscope system that takes advantage of developing open-source analysis platforms. These tools are built from easily procured off-the-shelf components and 3D-printed parts.
We provide designs for a 3D-printed, lightweight, wireless-capable microscope and motorized commutator, designed for multi-month monitoring of neural activity (via genetically encoded calcium indicators) in zebra finches while they sing their courtship songs. It has since been adapted for rodents and for other birds such as canaries.
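A typical first analysis step downstream of such a microscope is extracting a ΔF/F trace from a region of interest. Here is a generic sketch on stand-in data, not FinchScope code; the project points to open-source analysis platforms for the real pipeline:

```python
# Generic dF/F extraction for single-photon calcium imaging.
import numpy as np

def dff(movie, roi_mask, baseline_pct=20):
    """movie: (frames, H, W) array; roi_mask: boolean (H, W) region of interest."""
    f = movie[:, roi_mask].mean(axis=1)   # mean ROI fluorescence per frame
    f0 = np.percentile(f, baseline_pct)   # low percentile as baseline estimate
    return (f - f0) / f0

movie = np.random.rand(1000, 64, 64).astype(np.float32)  # stand-in data
mask = np.zeros((64, 64), bool)
mask[30:34, 30:34] = True
trace = dff(movie, mask)
```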
The GitHub project page can be found here.

Nose Poke Device

April 20, 2017 

Andre Chagas, creator of OpenNeuroscience, has generously shared the following with OpenBehavior regarding an Arduino-based, 3D-printed nose poke device:


“This nose poke device was built as a “proof of principle”. The idea was to show that scientists, too, can leverage the open-source philosophy and the knowledge built by the community developing around open-source hardware. Moreover, the bill of materials was kept simple and affordable: one device can be built for ~25 dollars and should take 2-3 hours to build, including the time to print parts.

The device is organised as follows: the 3D-printed frame (which can also be built from other materials when a printer is not available) contains a hole where the animal is expected to insert its snout. At the front of the hole, an infrared LED is aligned with an infrared detector, forming an “infrared curtain” at the hole’s entrance. If this curtain is interrupted, a signal is sent to a microcontroller (an Arduino in this case), which can then trigger other electronic components, such as a water pump, an LED indicator, or, in this case, a piezo buzzer.
At the back of the hole, a white LED indicates that the system is active and ready for “nose pokes”.

The microcontroller contains the code responsible for controlling the electronic parts, and it can easily be changed, as it is written for Arduino and several code examples/tutorials (for beginners and experts) can be found online.”
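The beam-break logic in the quoted description reduces to simple edge detection. Here is that logic sketched in Python for clarity; the real implementation runs on the Arduino and is included in the documentation linked below:

```python
# Detect "nose poke" onsets from an IR detector signal: an onset is the
# moment the infrared curtain goes from unbroken to broken, i.e., where the
# Arduino would trigger the buzzer, pump, or LED.
def poke_onsets(ir_samples, threshold=0.5):
    onsets, broken = [], False
    for i, v in enumerate(ir_samples):
        now_broken = v < threshold      # detector signal drops when blocked
        if now_broken and not broken:
            onsets.append(i)
        broken = now_broken
    return onsets

print(poke_onsets([1, 1, 0.2, 0.1, 1, 1, 0.3, 1]))  # -> [2, 6]
```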

Find more documentation on FigShare and Open Neuroscience.


Pixying Behavior

April 3, 2017

Robert Sachdev, from the NeuroCure Cluster of Excellence, Humboldt-Universität zu Berlin, Germany, has generously shared the following regarding automated optical tracking of animal movement:


“We have developed a method for tracking the motion of whiskers, limbs, and whole animals in real time. We show how to use a plug-and-play Pixy camera to monitor the real-time motion of multiple colored objects, and we apply the same tools for post-hoc analysis of high-speed video. Our method has major advantages over currently available methods: we can track the motion of multiple adjacent whiskers in real time and apply the same methods post hoc to “recapture” the same motion at high temporal resolution. Our method is flexible; it can track similarly shaped objects, such as two adjacent whiskers, forepaws, or even two freely moving animals. With this method it becomes possible to use the phase of movement of particular whiskers or a limb to perform closed-loop experiments.”
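As an illustration of that closed-loop use case, the phase of a tracked whisker's oscillation can be estimated with a Hilbert transform and used to time triggers. This is a generic sketch on synthetic data, not the authors' code:

```python
# Estimate instantaneous whisking phase and find trigger points at a chosen
# phase. For a sinusoidal angle trace, phase zero falls at peak protraction.
import numpy as np
from scipy.signal import hilbert

def whisker_phase(angle_trace):
    """Instantaneous phase (radians) of a whisker angle time series."""
    centered = angle_trace - np.mean(angle_trace)
    return np.angle(hilbert(centered))

t = np.arange(0, 1, 0.001)                 # 1 s at 1 kHz
angle = 10 * np.sin(2 * np.pi * 8 * t)     # 8 Hz whisking, stand-in data
phase = whisker_phase(angle)

# Fire a trigger (e.g., a TTL out) at each upward crossing of phase zero:
trigger_idx = np.where((phase[:-1] < 0) & (phase[1:] >= 0))[0]
```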
