Category: All

Operant Box for Auditory Tasks (OBAT)

June 2, 2017

Mariana de Araújo has shared the following regarding OBAT, an operant box for auditory tasks developed at the Edmond and Lily Safra International Institute of Neuroscience, Santos Dumont Institute, Macaíba, Brazil.


Fig. 1
Overview of the OBAT, inside the sound-attenuating chamber with the door open. Heat-generating components were kept outside the chamber: (a) Arduino Mega 2560 and shields and (b) the power amplifier. The modules controlling sound delivery, the response bars, and reward delivery can be seen in this lateral view: (c) speaker, (d) retractable bars, (e) reward delivery system, and (f) reward dispenser. The animal was kept inside the (g) Plexiglas chamber and monitored by an (h) internal camera mounted on the wall of the (i) sound isolation chamber.

OBAT is a low-cost operant box designed to train small primates in auditory tasks. The device presents auditory stimuli via an MP3 player shield connected to an Arduino Mega 2560 through an intermediate, custom-made shield. It also controls two touch-sensitive bars and a reward delivery system. A graphical user interface allows the experimenter to easily set the parameters of experimental sessions. All board schematics, source code, and equipment specifications and designs are available on GitHub and in the publication. Despite its low cost, OBAT has high temporal accuracy and reliably sends TTL signals to other equipment. Finally, the device was tested with one marmoset, showing that it can successfully be used to train these animals in an auditory two-choice task.
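The two-choice contingency described above can be sketched in a few lines. The following Python simulation is purely illustrative: the stimulus labels, bar names, and stimulus-to-bar mapping are assumptions, not taken from the OBAT firmware.

```python
def run_trial(stimulus, pressed_bar, reward_map=None):
    """Score one simulated two-choice auditory trial.

    stimulus    : which of two sounds was played ("A" or "B"; hypothetical labels)
    pressed_bar : which retractable bar was pressed ("left" or "right")
    reward_map  : stimulus -> rewarded bar (an invented mapping, for illustration)
    """
    if reward_map is None:
        reward_map = {"A": "left", "B": "right"}
    # Reward is delivered only when the pressed bar matches the stimulus.
    rewarded = pressed_bar == reward_map[stimulus]
    return {"stimulus": stimulus, "response": pressed_bar, "rewarded": rewarded}
```

In a real session the firmware would also retract the bars, time out incorrect responses, and emit TTL markers; this sketch captures only the scoring rule.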


Ribeiro MW, Neto JFR, Morya E, Brasil FL, de Araújo MFP (2017) OBAT: An open-source and low-cost operant box for auditory discriminative tasks. Behav Res Methods. doi: 10.3758/s13428-017-0906-6

Autoreward2

May 26th, 2017

Jesús Ballesteros, Ph.D. (Learning and Memory Research Group, Department of Neurophysiology at Ruhr-University Bochum, Germany) has generously shared his project, Autoreward2, with OpenBehavior. Autoreward2 is an Elegoo Uno-based system designed to detect and reward rodents in a modified T-maze task.


In designing their modified T-maze, Ballesteros found the need for an automatic reward delivery system. Using open-source resources, he aimed to create a system with the following capabilities:

  • Detect an animal at a certain point in a maze;
  • Deliver a certain amount of fluid through the desired licking port;
  • Provide visual cues that indicate which point in the maze has been reached;
  • Allow different modes to be easily selected using an interface;
  • Allow different working protocols (i.e., habituation, training, experimental, cleaning).

To achieve these aims, he used an Elegoo UNO R3 board to read inexpensive infrared beam-break sensors. Breaking any infrared beam causes the board to open one or two solenoid valves connected to a fluid tank. The valve remains open for around 75 milliseconds, allowing a single drop of fluid to form at the tip of the licking port. Additionally, the breadboard contains LEDs that signal to the researcher when an IR beam has been crossed.
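The beam-to-valve behavior described above amounts to a small state machine: trigger on the first broken-beam sample, hold the valve open for ~75 ms, and ignore the beam while the animal remains in it. The Python simulation below is only a sketch of that logic (class and method names are invented; the actual system is an Elegoo/Arduino sketch driving solenoids).

```python
VALVE_OPEN_MS = 75  # approximate opening time from the description above

class BeamRewarder:
    """Opens a valve for a fixed time when an IR beam is first broken.

    An illustrative simulation of the described logic, not
    Ballesteros's actual microcontroller code.
    """
    def __init__(self, open_ms=VALVE_OPEN_MS):
        self.open_ms = open_ms
        self.valve_open_until = None
        self.prev_broken = False

    def update(self, beam_broken, now_ms):
        """Feed one sensor sample; return True while the valve is open."""
        # Trigger only on the rising edge of the beam break, so an
        # animal sitting in the beam gets one drop, not a stream.
        if beam_broken and not self.prev_broken:
            self.valve_open_until = now_ms + self.open_ms
        self.prev_broken = beam_broken
        return self.valve_open_until is not None and now_ms < self.valve_open_until
```

Edge-triggering plus a fixed open window is one simple way to guarantee single-drop delivery regardless of how long the beam stays interrupted.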

Currently, a membrane keypad allows different protocols or modes to be selected. The system is powered through a 9V wall adapter, providing 3.3V to the LEDs and IR circuits and 9V to the solenoids.

Importantly, the entire system can be built for under 80€. In the future, Ballesteros hopes to add a screen and an SD card port, and to switch the keypad out for a wireless interface.

The full description may be found here, the GitHub page for the project may be found here, and the Arduino sketch and code may be found here.

Link to share: https://edspace.american.edu/openbehavior/2017/05/26/autoreward2/

FinchScope

May 19th, 2017 

William Liberti, from the Gardner Lab at Boston University, has shared the following with Open Behavior regarding ‘FinchScope’. Although originally designed for finches, this 3D-printed single-photon fluorescence imaging microscope has since been adapted for rodents and other avian species.


The FinchScope project aims to provide a modular in-vivo optophysiology rig for awake, freely behaving animals, with a transparent acquisition and analysis pipeline. The goal is to produce a customizable and scalable single-photon fluorescence imaging microscope system that takes advantage of developing open-source analysis platforms. These tools are built from easily procured off-the-shelf components and 3D printed parts.
We provide designs for a 3D printed, lightweight, wireless-capable microscope and motorized commutator, designed for multi-month monitoring of neural activity (via genetically encoded calcium indicators) in zebra finches while they sing their courtship songs. It has since been adapted for rodents and other birds, such as canaries.
The Github project page can be found here.
Link to share: https://edspace.american.edu/openbehavior/2017/05/19/finchscope/

Automated Rodent Tracker (ART)

May 5, 2017

Robyn Grant, from Manchester Metropolitan University, has shared the following with Open Behavior regarding the development of an automated rodent tracking (ART) program:


We have developed a program (ART, available from: http://mwa.bretthewitt.net/downloads.php) that can automatically track rodent position and movement. It is able to track head movements, body movements, and also aspects of body size. It can also identify certain behaviours from video footage, such as rotations, moving forwards, interacting with objects, and staying still. Our program is very flexible, so additional modules can easily be “plugged in”. For example, at the moment it has a manual-tracker module, which allows your automatic tracking to be validated with manual tracking points (using MWA: Hewitt, Yap & Grant 2016, Journal of Open Research Software). This versatility means that in the future other modules might be added, such as additional behaviour identifiers, or other trackers, such as for feet or whiskers.

Our program, ART, is also highly automated. It requires minimal user input, yet performs as well as other trackers that require substantial manual processing. It can automatically find the video frames in which the mouse is present and track only those frames; alternatively, you can specify that it track only when the mouse is locomoting or rotating, for example. We hope that this tracker will form a solid basis from which to investigate rodent behaviour further.
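Validation against the manual-tracker module could be as simple as comparing matched per-frame points. The helper below is a hypothetical illustration of such a check in Python; it is not part of ART.

```python
import math

def mean_tracking_error(auto_points, manual_points):
    """Mean Euclidean distance (in pixels) between automatically and
    manually tracked (x, y) points on matched frames.

    A hypothetical validation helper, not ART's own code.
    """
    if len(auto_points) != len(manual_points):
        raise ValueError("point lists must cover the same frames")
    dists = [math.dist(a, m) for a, m in zip(auto_points, manual_points)]
    return sum(dists) / len(dists)
```

A small mean error on a sample of hand-labelled frames would indicate that the automatic tracker agrees with the human annotator.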

ART may be downloaded here

Link to share: https://edspace.american.edu/openbehavior/2017/05/05/automated-rodent-tracker-art/


FlyPi

April 28, 2017
In addition to sharing his Nose Poke Device, Dr. Andre Chagas from Open Neuroscience has also shared the following with Open Behavior regarding FlyPi:

The FlyPi is an open-source and affordable microscope/experimental setup (the basic configuration can be built for ~US$100). A collaboration between Open Neuroscience and TReND in Africa, the device is based on a Raspberry Pi, an Arduino microcontroller, and off-the-shelf electronic components. So far it has been used for light microscopy, fluorescence microscopy, optogenetics experiments, behavioural tracking, and thermogenetic experiments. It has also been used as a teaching tool in workshops, where students use it as an entry point into the world of electronics and programming.

Due to the modular and portable design, new applications could be easily created by the community to solve unforeseen needs/problems.

Link to share: https://edspace.american.edu/openbehavior/2017/04/28/flypi/


Nose Poke Device

April 20, 2017 

Andre Chagas, creator of OpenNeuroscience, has generously shared the following with OpenBehavior regarding an arduino-based, 3D-printed nose poke device:


“This nose poke device was built as a “proof of principle”. The idea was to show that scientists, too, can benefit from the open-source philosophy and the knowledge built by the community developing around open-source hardware. Moreover, the bill of materials was kept simple and affordable: one device can be built for ~25 dollars and should take 2–3 hours to build, including the time to print the parts.

The device is organised as follows: the 3D-printed frame (which can also be built with other materials when a printer is not available) contains a hole where the animals are expected to insert their snouts. At the front of the hole, an infrared LED is aligned with an infrared detector, forming an “infrared curtain” at the hole’s entrance. If this curtain is interrupted, a signal is sent to a microcontroller (an Arduino in this case), which can be used to trigger other electronic components, such as a water pump, an LED indicator, or, in this case, a piezo buzzer.
At the back of the hole, a white LED is placed to indicate that the system is active and ready for “nose pokes”.

The microcontroller contains the code responsible for controlling the electronic parts, and the code can easily be changed, as it is written for Arduino and several code examples/tutorials (for beginners and experts) can be found online.”
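As an illustration of the kind of logic such a sketch contains, the Python function below counts distinct pokes from a binary beam trace with a simple software debounce. It is a hedged sketch, not Chagas's Arduino code; the debounce window is an assumption.

```python
def count_pokes(beam_trace, debounce=3):
    """Count distinct nose pokes in a binary beam trace (1 = beam broken).

    Broken-beam runs separated by fewer than `debounce` unbroken samples
    are merged into one poke, filtering brief sensor flicker -- a simple
    software debounce like one a microcontroller sketch might apply.
    """
    pokes = 0
    gap = debounce  # start "out of a poke" so the first break counts
    for broken in beam_trace:
        if broken:
            if gap >= debounce:
                pokes += 1  # a new poke begins
            gap = 0
        else:
            gap += 1
    return pokes
```

On real hardware the same idea would run sample-by-sample in the loop, triggering the buzzer on each new poke rather than counting after the fact.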

Find more documentation on FigShare and Open Neuroscience.

Link to share:  https://edspace.american.edu/openbehavior/2017/04/20/nose-poke-device/

Pixying Behavior

April 3, 2017

Robert Sachdev, from the NeuroCure Cluster of Excellence, Humboldt-Universität zu Berlin, Germany, has generously shared the following regarding automated optical tracking of animal movement:


“We have developed a method for tracking the motion of whiskers, limbs and whole animals in real time. We show how to use a plug-and-play Pixy camera to monitor the real-time motion of multiple colored objects, and apply the same tools for post-hoc analysis of high-speed video. Our method has major advantages over currently available methods: we can track the motion of multiple adjacent whiskers in real time, and apply the same methods post hoc to “recapture” the same motion at high temporal resolution. Our method is flexible; it can track similarly shaped objects, such as two adjacent whiskers, forepaws, or even two freely moving animals. With this method it becomes possible to use the phase of movement of particular whiskers or a limb to perform closed-loop experiments.”
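As a toy example of the closed-loop idea in the quote, the function below finds the samples where a tracked whisker angle crosses a threshold while increasing (protracting) — the kind of real-time event that could gate a stimulus. The angle convention and threshold are assumptions for illustration, not the authors' published code.

```python
def protraction_crossings(angles, threshold):
    """Return indices where a tracked angle crosses `threshold` upward.

    `angles` is a per-frame sequence of whisker (or limb) angles from a
    tracker; an upward crossing marks a protraction event that a
    closed-loop system could use as a trigger (illustrative sketch).
    """
    events = []
    for i in range(1, len(angles)):
        # Crossing: previous sample below threshold, current at or above it.
        if angles[i - 1] < threshold <= angles[i]:
            events.append(i)
    return events
```

In an online setting the same comparison would run on each incoming Pixy frame, firing the stimulus immediately at the crossing rather than collecting indices.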

Link to share:  https://edspace.american.edu/openbehavior/2017/04/03/pixying-behavior/


Open Ephys

March 17, 2017

Jakob Voigts, from the Massachusetts Institute of Technology, has shared the following regarding www.open-ephys.org. Open Ephys aims to distribute reliable open source software as well as tools for extracellular recording and stimulation.


“Open Ephys is a collaborative effort to develop, document, and distribute open-source tools for systems neuroscience. Since the spring of 2011, our main focus has been on creating a multichannel data acquisition system optimized for recordings in freely behaving rodents. However, most of our tools are general enough to be used in applications involving other model organisms and electrode types.

We believe that open-source tools can improve basic scientific research in a variety of ways. They are often less expensive than their closed-source counterparts, making it more affordable to scale up one’s experiments. They are readily modifiable, giving scientists a degree of flexibility that is not usually provided by commercial systems. They are more transparent, which leads to a better understanding of how one’s data is being generated. Finally, by encouraging researchers to document and share tools they would otherwise keep to themselves, the open-source community reduces redundant development efforts, thereby increasing overall scientific productivity.”
– Jakob Voigts

Open Ephys features devices such as the flexDrive, a “chronic drive implant for extracellular electrophysiology”, as well as an Arduino-based tetrode twister. The Pulse Pal generates precise voltage pulses. Also featured on Open Ephys is software such as Symphony, a MATLAB-based data acquisition system for electrophysiology.
The Open Ephys GitHub can be found here.

Eco-HAB

February 12, 2017 

Dr. Ewelina Knapska from the Nencki Institute of Experimental Biology in Warsaw, Poland has shared the following regarding Eco-HAB, an RFID-based system for automated tracking: 


Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors.
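A minimal stand-in for the “time voluntarily spent together” idea: given per-sample compartment IDs for two mice (as the RFID antenna crossings would yield), compute the fraction of samples in which they share a compartment. The published Eco-HAB software uses its own, more careful definitions; this is only a sketch with invented names.

```python
def time_together(occupancy_a, occupancy_b):
    """Fraction of samples two mice spent in the same compartment.

    occupancy_a, occupancy_b : equal-length sequences of compartment IDs,
    one entry per time sample, derived from RFID antenna readings.
    A toy proxy for in-cohort sociability, not Eco-HAB's actual measure.
    """
    if len(occupancy_a) != len(occupancy_b):
        raise ValueError("occupancy traces must be the same length")
    same = sum(a == b for a, b in zip(occupancy_a, occupancy_b))
    return same / len(occupancy_a)
```

Comparing this observed fraction against what chance co-occupancy would predict is one way a sociability score could be framed.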


Attys

January 28, 2017 

Dr. Bernd Porr has shared the following regarding Attys, an open-source bioamplifier:


“Attys is an open-source, wearable data acquisition device with a special focus on biomedical signals such as heart activity (ECG), muscle activity (EMG) and brain activity (EEG). In contrast to many neurogadgets, the Attys transmits the data as it is being recorded, without any compression or pre-filtering and at its full precision of 24 bits, to a mobile phone, tablet or PC. This guarantees the maximum possible openness, so that the raw data can be published alongside the processed data, as now required by many research councils.

All software for the Attys is open source, including its firmware.

The story of the Attys started four years ago, when Dr. Bernd Porr filmed numerous YouTube clips to educate the public about the possibilities and limits of biosignal measurement (see BPM Biosignals). The site has been very popular ever since, and visitors have been asking whether a ready-made bio-amp could be made available. This was the birth of Attys.”
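For context on the 24-bit precision mentioned above, converting a raw two's-complement ADC word to volts can be sketched as follows. The reference voltage and gain here are placeholders for illustration, not the actual Attys front-end values.

```python
def raw24_to_volts(raw, vref=2.42, gain=1):
    """Convert a signed 24-bit ADC sample to volts.

    Assumes two's-complement encoding; `vref` and `gain` are
    hypothetical values, not the Attys's real front-end parameters.
    """
    if raw & 0x800000:       # sign bit of the 24-bit word is set
        raw -= 1 << 24       # sign-extend to a negative Python int
    return raw * vref / (gain * (1 << 23))
```

At 24 bits, the least significant bit of such a converter corresponds to a fraction of a microvolt of input-referred signal, which is why transmitting the unfiltered full-precision stream matters for EEG/ECG work.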