
Category: Behavioral Apparatus

An Open Source Automated Bar Test for Measuring Catalepsy in Rats

August 6, 2020

Researchers at the University of Guelph have created a low-cost automated apparatus for measuring catalepsy that increases measurement accuracy and reduces observer bias.


Catalepsy is a measure of muscular rigidity that can result from several factors, including Parkinson’s disease or pharmacological exposure to antipsychotics or cannabis. Catalepsy bar tests are widely used to measure this rigidity. The test consists of placing the forepaws of a rodent on a horizontal bar raised off the ground and measuring the time the animal takes to remove itself from this imposed posture. Traditionally, this has been timed by an experimenter with a stopwatch, or with prohibitively expensive commercial apparatus that have issues of their own. The automated bar test described here combines a 3D printed base with Arduino control to keep the design simple and affordable. It sets itself apart by using extremely low-cost beam break sensors, avoiding the pitfalls of the traditional “complete the circuit” approach, where changes in the rat’s grip can produce false measurements. The beam break sensors determine whether the rat is on the bar, and the device automatically measures the time the rat takes to remove itself and stores it on an SD card for later retrieval. The device has been validated in rats; however, the bar height is adjustable, so there is no reason it cannot be used with other rodents as well. This bar test thus makes catalepsy measurement easy and accurate, and limits experimenter bias from manual timing.
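For intuition, here is a minimal Python sketch of the beam-break timing logic described above. The published device implements this on an Arduino with SD card storage; the function names, polling rate, and CSV logging below are illustrative stand-ins, not the authors’ code.

```python
import csv
import time

def measure_descent_latency(read_beam, poll_s=0.01):
    """Block until the beam is broken (rat placed on the bar), then time
    how long the posture is held until the beam clears again."""
    while not read_beam():              # wait for the trial to start
        time.sleep(poll_s)
    start = time.monotonic()
    while read_beam():                  # rat still holding the bar
        time.sleep(poll_s)
    return time.monotonic() - start

def log_trial(path, subject_id, latency_s):
    """Append one trial to a CSV file (the published device writes to SD)."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([subject_id, f"{latency_s:.3f}"])

# Example with a fake sensor that reports 'on bar' for about two seconds:
if __name__ == "__main__":
    t0 = time.monotonic()
    fake_beam = lambda: 0.5 < (time.monotonic() - t0) < 2.5
    log_trial("catalepsy_log.csv", "rat01", measure_descent_latency(fake_beam))
```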

Learn more about this project in the recent paper!

Or check out the Hackaday project page!


Luciani, K. R., Frie, J. A., & Khokhar, J. Y. (2020). An Open Source Automated Bar Test for Measuring Catalepsy in Rats. eNeuro, 7(3). doi:10.1523/ENEURO.0488-19.2020

PiDose

July 30, 2020

Cameron Woodard has kindly shared the following write-up about PiDose, an open-source system for oral drug administration to group-housed mice.


“PiDose is an open-source tool for scientists performing drug administration experiments with mice. It allows for automated daily oral dosing of mice over long time periods (weeks to months) without the need for experimenter interaction and handling. To accomplish this, a small 3D-printed chamber is mounted adjacent to a regular mouse home-cage, with an opening in the cage to allow animals to freely access the chamber. The chamber is supported by a load cell, and does not contact the cage but sits directly next to the entrance opening. Prior to treatment, mice have a small RFID capsule implanted subcutaneously, and when they enter the chamber they are detected by an RFID reader. While the mouse is in the chamber, readings are taken from the load cell in order to determine the mouse’s body weight. At the opposite end of the chamber from the entrance, a nose-poke port accesses a spout which dispenses drops from two separate liquid reservoirs. This spout is wired to a capacitive touch sensor controller in order to detect licks, and delivers liquid drops in response to licking. Each day, an average weight is calculated for each mouse and a drug dosage is determined based on this weight. When a mouse licks at the spout it dispenses either regular drinking water or a drop of drug solution, depending on whether they have received their daily dosage or not. All components are controlled by a Python script running on a Raspberry Pi 3B. PiDose is low cost (~$250 for one system) and full build instructions, as well as 3D-printing files and software, can be found online.”
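As a rough illustration of the dosing logic described in the write-up, here is a minimal Python sketch. PiDose itself runs a Python script on a Raspberry Pi 3B, but the class, constants, and function below are hypothetical stand-ins, not the PiDose source:

```python
from dataclasses import dataclass, field
from statistics import median

DOSE_MG_PER_KG = 10.0      # illustrative dosage; set per experiment
DROP_VOLUME_ML = 0.01      # volume of one dispensed drop (assumed)
DRUG_MG_PER_ML = 1.0       # drug solution concentration (assumed)

@dataclass
class Mouse:
    rfid: str
    weights_g: list = field(default_factory=list)  # today's load-cell readings
    drug_ml_given: float = 0.0                     # drug volume dispensed today

    def daily_drug_ml(self) -> float:
        """Target drug volume from the median of today's weight readings."""
        weight_kg = median(self.weights_g) / 1000.0
        return (DOSE_MG_PER_KG * weight_kg) / DRUG_MG_PER_ML

def on_lick(mouse: Mouse) -> str:
    """Decide which reservoir to dispense from on each detected lick."""
    if mouse.weights_g and mouse.drug_ml_given < mouse.daily_drug_ml():
        mouse.drug_ml_given += DROP_VOLUME_ML
        return "drug"
    return "water"

m = Mouse(rfid="A1B2")
m.weights_g.extend([24.8, 25.1, 25.0])   # grams, from the load cell
print(on_lick(m))                         # "drug" until the daily dose is met
```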

Read more about PiDose in their paper!

Check out the Open Science Framework repository for the project.

Or head over to the Hackaday page to learn more!


Woodard, C. L., Nasrallah, W. B., Samiei, B. V., Murphy, T. H., & Raymond, L. A. (2020). PiDose: An open-source system for accurate and automated oral drug administration to group-housed mice. Scientific Reports, 10(1). doi:10.1038/s41598-020-68477-2

BonVision

July 16, 2020

In a recent bioRxiv preprint, Gonçalo Lopes and colleagues from NeuroGEARS and University College London have shared BonVision, an open-source software package for creating and controlling visual environments.


With advances in computer gaming and software rendering, it is now possible to create realistic virtual environments. These virtual environments, which can be programmed to react to user input, are useful tools for understanding the neural basis of many behaviors. To expand access to this useful tool, Lopes and colleagues have developed BonVision, a software package for Bonsai that allows for control of 2D and 3D environments. Bonsai is a high-performance, open-source, event-based language that has already been widely used in the neuroscience community for control of closed-loop experiments, with compatibility across a flexible range of inputs and outputs. BonVision features a modular workflow in which users specify stimulus parameters in 2D or 3D environments, which can then be adapted to a number of display configurations. To demonstrate the utility of BonVision across species and common experimental paradigms, the team performed experiments in human psychophysics, animal behavior, and animal neurophysiology. Overall, this software provides considerable flexibility for application in a variety of experiments across species.

Read more from the preprint here!

Check out the Bonsai programming language here!

Or take a peek at the GitHub repository for the BonVision project!


Lopes, G., Farrell, K., Horrocks, E. A., Lee, C., Morimoto, M. M., Muzzu, T., . . . Saleem, A. B. (2020). BonVision – an open-source software to create and control visual environments. bioRxiv. doi:10.1101/2020.03.09.983775

Rodent Arena Tracker (RAT)

June 18, 2020

Jonathan Krynitsky and colleagues from the Kravitz lab at Washington University have constructed and shared RAT, a closed-loop system for machine-vision rodent tracking and task control.


The Rodent Arena Tracker, or RAT, is a low-cost wireless position tracker for automatically tracking mice in high-contrast arenas. The device can use subject position information to control other devices in real time, allowing for closed-loop control of various tasks based on positional behavior data. The device is based on the OpenMV Cam M7 (openmv.io), an open-source machine vision camera equipped with onboard processing for real-time analysis, which reduces data storage requirements and removes the need for an external computer. The authors optimized the control code for tracking mice and created a custom circuit board that runs the device off a battery and includes a real-time clock for synchronization, a BNC input/output port, and a push button for starting the device. The build instructions for RAT, as well as validation data highlighting the effectiveness and potential uses of the device, are available in their recent publication. Further, all of the design files (PCB design, 3D printer files, Python code, etc.) are available on hackaday.io.
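To give a sense of how on-camera tracking might look, here is a short MicroPython sketch in the style of an OpenMV script. RAT’s actual code is available on hackaday.io; the grayscale threshold, pin name, and closed-loop rule below are placeholders, not the authors’ values:

```python
# MicroPython sketch in the spirit of RAT, for an OpenMV camera.
import sensor
from pyb import Pin

sensor.reset()
sensor.set_pixformat(sensor.GRAYSCALE)
sensor.set_framesize(sensor.QVGA)
sensor.skip_frames(time=2000)            # let auto-exposure settle

ttl_out = Pin("P0", Pin.OUT_PP)          # TTL output pin (assumed assignment)
DARK = [(0, 60)]                          # threshold for a dark mouse; tune per arena

while True:
    img = sensor.snapshot()
    blobs = img.find_blobs(DARK, pixels_threshold=200, merge=True)
    if blobs:
        mouse = max(blobs, key=lambda b: b.pixels())   # largest dark blob
        # Closed loop: raise the TTL line while the mouse is in the left half.
        ttl_out.value(1 if mouse.cx() < img.width() // 2 else 0)
```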

Read the full article here!

Or check out the project on hackaday.io!


Krynitsky, J., Legaria, A. A., Pai, J. J., Garmendia-Cedillos, M., Salem, G., Pohida, T., & Kravitz, A. V. (2020). Rodent Arena Tracker (RAT): A Machine Vision Rodent Tracking Camera and Closed Loop Control System. eNeuro, 7(3). doi:10.1523/ENEURO.0485-19.2020


ACRoBaT

May 14, 2020

David A. Bjånes and Chet T. Moritz from the University of Washington in Seattle have developed and published their device for training rats to perform a modified center-out task.


As neuroscience tools for studying rodent brains have improved in the 21st century, researchers have begun using increasingly complex tasks to study behavior, sometimes adapting tasks commonly used with primates. One such task used for studying motor behavior, the center-out reaching task, has been modified for use in rodents. Bjånes and Moritz have further contributed to the adaptation of this task by creating ACRoBaT, the Automated Center-out Rodent Behavioral Trainer. The device features two custom PCBs, a 3D printed housing unit, an Arduino microcontroller, and other commercially available parts, and can be mounted outside a behavioral arena. It also provides a fully automated algorithm that trains rats based on behavioral feedback fed into the device through various sensors. The authors demonstrate the effectiveness of the device with data from 18 rats across different conditions, used to find the optimal training procedure for this task. Instructions for building the device are available in their publication, as well as on GitHub.
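Criterion-based automated training of the kind described above can be sketched in a few lines of Python. This is a generic illustration, not the published algorithm; the stages, window size, and criterion are invented for the example:

```python
from collections import deque

class AutoTrainer:
    """Advance through training stages when recent performance passes criterion."""

    def __init__(self, stages, window=50, criterion=0.75):
        self.stages = stages                 # parameter sets of increasing difficulty
        self.stage = 0
        self.window = deque(maxlen=window)   # rolling record of trial outcomes
        self.criterion = criterion

    def record_trial(self, success: bool):
        self.window.append(success)
        full = len(self.window) == self.window.maxlen
        if full and sum(self.window) / len(self.window) >= self.criterion:
            if self.stage < len(self.stages) - 1:
                self.stage += 1
                self.window.clear()          # re-evaluate within the new stage

    def current_params(self):
        return self.stages[self.stage]

# Example: gradually increase the required reach distance (arbitrary units).
trainer = AutoTrainer(stages=[{"reach_mm": 2}, {"reach_mm": 5}, {"reach_mm": 10}])
```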

Read the full publication here, or check out the files on GitHub!


Visual stimulator with customizable light spectra

May 7, 2020

Katrin Franke, Andre Maia Chagas, and colleagues have developed and shared a spatial visual stimulator with an arbitrary spectrum of light for visual neuroscientists.


Vision research, quite obviously, relies on control of the visual stimuli used in an experiment. A great number of commercially available devices and hardware are used to present visual stimuli to humans and other species; however, these devices are predominantly developed for the human visual spectrum. The visual spectrum of other species, such as Drosophila, zebrafish, and rodents, extends into the UV, and devices that fail to present this range often limit our understanding of these visual systems. To address this, Franke, Chagas, and colleagues developed an open-source, generally low-cost visual stimulator that can be customized with up to six chromatic channels. Given the components used to build the device, the spectrum of light is arbitrary and customizable, and can be adapted to different animal models based on their visual spectrum. The details of this device, including the parts list and information on a custom Python library for generating visual stimuli (QDSpy), can be found in the eLife publication. The device has been tested with mouse retina stimulation and in vivo zebrafish studies; details on these experiments can also be found in the publication.
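To make “arbitrary spectrum” concrete: with several chromatic channels, one can solve for nonnegative channel intensities whose summed emission best approximates a target spectrum. Below is a generic Python sketch of that idea with made-up spectra; it is not QDSpy code, and every number is a placeholder:

```python
import numpy as np
from scipy.optimize import nnls

# Each column is the emission spectrum of one LED channel, sampled at
# common wavelengths; the random values here stand in for measured spectra.
wavelengths = np.arange(350, 701, 50)            # nm, extending into the UV
led_spectra = np.random.default_rng(0).random((len(wavelengths), 6))
target = np.ones(len(wavelengths))               # desired spectrum, same samples

# Nonnegative least squares: channel drive levels that best match the target.
intensities, residual = nnls(led_spectra, target)
print(intensities, residual)
```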

Check out the eLife article here!


Franke, K., Chagas, A. M., Zhao, Z., Zimmermann, M. J., Bartel, P., Qiu, Y., . . . Euler, T. (2019). An arbitrary-spectrum spatial visual stimulator for vision research. eLife, 8. doi:10.7554/eLife.48779

3DOC: 3D Operant Conditioning

April 23, 2020

Raffaele Mazziotti from the Istituto di Neuroscienze CNR di Pisa has generously shared the following about 3DOC, a recently developed and published project from their team.


“Operant conditioning is a classical paradigm and a standard technique used in experimental psychology in which animals learn to perform an action in order to achieve a reward. By using this paradigm, it is possible to extract learning curves and accurately measure reaction times. Both of these measurements are proxies for cognitive capabilities and can be used to evaluate the effectiveness of therapeutic interventions in mouse models of disease. Recently in our lab, we constructed a fully 3D-printable chamber able to perform operant conditioning using off-the-shelf, low-cost optical and electronic components, which can be reproduced rigorously in any laboratory equipped with a 3D printer at a total cost of around 160€. Requirements include a 3D-printable filament (e.g., polylactic acid, PLA), a low-cost microcontroller (e.g., Arduino UNO), and a single-board computer (e.g., Raspberry Pi). We designed the chamber entirely using 3D modelling for several reasons: first, it has a high degree of reproducibility, since the model is standardized and can be downloaded to print the same structure with the same materials in different laboratories. Secondly, it can be easily customized in relation to specific experimental needs. Lastly, it can be shared through online repositories (GitHub: https://github.com/raffaelemazziotti/oc_chamber). With these cost-efficient and accessible components, we assessed the possibility of performing two-alternative forced-choice operant conditioning using audio-visual cues while tracking mouse position in real time. As a proof of principle of customizability, we added a version of the OC chamber that is able to show more complex visual stimuli (e.g., images). This version includes an edit of the frontal wall that can host a TFT monitor, and code that runs on Psychopy2 on the Raspberry Pi. This tool can be employed to test learning and memory in models of disease. We expect that the open design of the chamber will be useful for scientific teaching and research as well as for further improvements from the open hardware community.”
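For readers unfamiliar with the paradigm, a two-alternative forced-choice trial reduces to a small control loop. The Python sketch below is a generic illustration; the hardware hooks present_cue, get_response, and dispense_reward are hypothetical names, not functions from the 3DOC code:

```python
import random
import time

def run_trial(present_cue, get_response, dispense_reward, timeout_s=10.0):
    """One two-alternative forced-choice trial: present a cue on a random
    side, then reward the animal if it responds on that side in time."""
    correct_side = random.choice(["left", "right"])
    present_cue(correct_side)                  # audio-visual cue on one side
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        response = get_response()              # None until the mouse responds
        if response is not None:
            if response == correct_side:
                dispense_reward()
            return response == correct_side
        time.sleep(0.005)
    return False                               # omission: no response in time
```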

Check out the full publication here.

Or take a peek at the GitHub repository for this project.


PASTA

April 16, 2020

Thanks to Jan Homolak from the Department of Pharmacology, University of Zagreb School of Medicine, Zagreb, Croatia for sharing the following about repurposing a digital kitchen scale for neuroscience research: a complete hardware and software cookbook for PASTA (Platform for Acoustic STArtle).


“As we were starving for a solution on how to obtain relevant and reliable information from a kitchen scale, sometimes used in very creative ways in neuroscience research, we decided to cut the waiting and cook something ourselves. Here we introduce a complete hardware and software cookbook for PASTA, a guide on how to demolish your regular kitchen scale and use the parts to turn it into a beautiful multifunctional neurobehavioral platform. This project is still medium raw, as it’s a work in progress; however, we hope you will still find it well done.
PASTA comes in various flavors such as:
– complete hardware design for PASTA
– PASTA data acquisition software codes (C++/Arduino)
– PASTA Chef: An automatic experimental protocol execution Python script for data acquisition and storage
– ratPASTA (R-based Awesome Toolbox for PASTA): An R-package for PASTA data analysis and visualization

…and all can be found on bioRxiv here: https://www.biorxiv.org/content/10.1101/2020.04.10.035766v1.supplementary-material

bon appétit!”
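To taste-test the acquisition side: the platform amounts to a load cell streaming force samples to a computer, from which startle responses can be picked out as deviations from baseline. Here is a minimal Python sketch under that assumption; the serial port, baud rate, one-sample-per-line format, and threshold are all invented for illustration, not PASTA’s actual defaults:

```python
import serial  # pyserial; assumes the Arduino prints one force sample per line

BASELINE_N = 200       # samples used to estimate resting weight (assumed)
THRESHOLD = 3.0        # startle = deviation beyond this many SDs of baseline

def detect_startles(port="/dev/ttyACM0", baud=115200):
    """Read the load-cell stream and report each startle-like peak."""
    with serial.Serial(port, baud, timeout=1) as ser:
        baseline = []
        while True:
            line = ser.readline().decode(errors="ignore").strip()
            try:
                value = float(line)
            except ValueError:
                continue                        # skip empty/malformed lines
            if len(baseline) < BASELINE_N:
                baseline.append(value)          # still building the baseline
                continue
            mean = sum(baseline) / len(baseline)
            sd = (sum((s - mean) ** 2 for s in baseline) / len(baseline)) ** 0.5
            if sd > 0 and abs(value - mean) > THRESHOLD * sd:
                print(f"startle: {value:.1f} (baseline {mean:.1f})")
```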


Open Source Joystick

March 26, 2020

This week we want to talk about joy! I mean, joy-sticks. Parley Belsey, Mark Nicholas and Eric Yttri have developed and shared an open-source joystick for studying motor behavior and decision making in mice!


Mice are hopping and popping in research, and researchers are using more creativity and innovation to understand the finer aspects of their behaviors. Recently, members of the Yttri lab at Carnegie Mellon used their skills to create an open-source joystick for studying mouse motor and decision-making behaviors! In their paper they describe the full behavioral setup (based on the RIVETS design from the Dudman lab), featuring a removable head-fixation point, a sipping tube, and a joystick that measures reach trajectory, amplitude, speed, and more. Data are collected and devices are controlled via an Arduino, a solenoid circuit, a microSD card reader, and an LCD readout, and data can be analyzed in real time or saved to a CSV file for later analysis. The Arduino can be programmed to signal reward delivery when a correct response is recorded from the joystick, which streamlines outcome-based reward delivery. Belsey et al. tested their device with adult mice; the results of training can be found in the paper, along with full build instructions and ideas for how their tool might be built and used in your lab.
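As a sketch of what host-side logging might look like, here is a short Python example that streams samples from an Arduino into a CSV file. The serial port, baud rate, and “time_ms,x,y” line format are assumptions for illustration, not the published firmware’s actual output:

```python
import csv
import serial  # pyserial

def log_session(port="/dev/ttyACM0", baud=115200, out_path="reaches.csv"):
    """Stream joystick samples from the Arduino into a CSV for later analysis."""
    with serial.Serial(port, baud, timeout=1) as ser, \
         open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_ms", "x", "y"])   # assumed sample format
        while True:                               # stop with Ctrl-C
            line = ser.readline().decode(errors="ignore").strip()
            if line:
                writer.writerow(line.split(","))
```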

For more, check out their publication or GitHub!


Belsey, P., Nicholas, M. A., & Yttri, E. A. (2020). Open-source joystick manipulandum for decision-making, reaching, and motor control studies in mice. eNeuro. doi:10.1523/ENEURO.0523-19.2020

Robotic Flower System for Bee Behavior

March 19, 2020

Erno Kuusela and Juho Lämsä, from the University of Oulu in Finland, have shared their design for an open-source, computer-controlled robotic flower system for studying bumble bee behavior.


Oh, to be a bumble bee… collecting nectar from a robotic flower… of open-source design… splendid. As with behavioral studies in species common to neuroscience (from rodents to Drosophila to zebrafish and humans), data collection in bee studies can be time-consuming and sensitive to human error. Thanks to the growth of the open-source movement, it’s easier than ever to develop hardware and software to automate such studies, which is what Kuusela and Lämsä have demonstrated in their publication. They developed a system of robotic flowers to study bee behavior. Their design features a control unit, based on an Arduino Mega 2560, which can collect data from and send inputs to up to 32 individual robotic flowers. Each flower contains its own servo-controlled refill system. The nectar cup (in this design, a Phillips screw head that can hold 1.7 µL!) is attached to the servomotor’s shaft via a servo horn, which, when prompted by the program, dips the cup into the flower’s individual nectar reservoir. Each flower captures feeding data via an IR beam that is broken when a bee engages the feeding mechanism, and sends these data to the control unit. A covering on the system can be marked with symbols to attract bees. The custom control software is available under an open-source license, to be used as is or modified to fit an experimenter’s needs. While developed and tested with bumble bees, the system can also be adapted for a number of other species.
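The per-flower control flow is simple enough to sketch: watch the IR beam, log visit start and end, and refill the cup once the bee leaves. Below is a generic Python illustration of that loop; the real system runs equivalent logic on the Arduino Mega 2560, and beam_broken, refill, and log are hypothetical hardware/IO hooks:

```python
import time

N_FLOWERS = 32   # the control unit supports up to 32 flowers

def poll_flowers(beam_broken, refill, log, poll_s=0.005):
    """Poll each flower's IR beam; log visit start/end times and refill
    the nectar cup after the bee leaves the feeding mechanism."""
    visiting = [False] * N_FLOWERS
    while True:
        now = time.monotonic()
        for i in range(N_FLOWERS):
            broken = beam_broken(i)
            if broken and not visiting[i]:
                visiting[i] = True
                log(i, now, "visit_start")
            elif not broken and visiting[i]:
                visiting[i] = False
                log(i, now, "visit_end")
                refill(i)            # servo dips the cup into the reservoir
        time.sleep(poll_s)
```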

Read more about the specifics of this system in Kuusela & Lämsä (2016). The circuit diagrams, parts list, and control software source code are available in the paper’s supplemental information.