
Tag: Raspberry Pi

3DOC: 3D Operant Conditioning

April 23, 2020

Raffaele Mazziotti from the Istituto di Neuroscienze CNR di Pisa has generously shared the following about 3DOC, a recently developed and published project from their team.


“Operant conditioning is a classical paradigm and a standard technique used in experimental psychology in which animals learn to perform an action in order to achieve a reward. Using this paradigm, it is possible to extract learning curves and to measure reaction times accurately. Both measurements are proxies of cognitive capabilities and can be used to evaluate the effectiveness of therapeutic interventions in mouse models of disease. Recently in our lab, we constructed a fully 3D-printable chamber able to perform operant conditioning using off-the-shelf, low-cost optical and electronic components, which can be reproduced rigorously in any laboratory equipped with a 3D printer at a total cost of around 160€. Requirements include a 3D-printable filament (e.g. polylactic acid, PLA), a low-cost microcontroller (e.g. Arduino UNO), and a single-board computer (e.g. Raspberry Pi).

We designed the chamber entirely using 3D modelling for several reasons. First, it offers a high degree of reproducibility, since the model is standardized and can be downloaded to print the same structure with the same materials in different laboratories. Second, it can be easily customized in relation to specific experimental needs. Lastly, it can be shared through online repositories (GitHub: https://github.com/raffaelemazziotti/oc_chamber).

With these cost-efficient and accessible components, we assessed the possibility of performing two-alternative forced-choice operant conditioning using audio-visual cues while tracking the mouse position in real time. As a proof of principle of customizability, we added a version of the OC chamber that can show more complex visual stimuli (e.g. images). This version includes a modified frontal wall that can host a TFT monitor, and code that runs on PsychoPy2 on the Raspberry Pi. This tool can be employed to test learning and memory in models of disease. We expect that the open design of the chamber will be useful for scientific teaching and research as well as for further improvements from the open hardware community.”
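
For a sense of what the PsychoPy2-based stimulus presentation might look like, here is a minimal sketch that shows an image on the chamber's TFT screen and waits for a response. The window size, image file, and response keys are placeholders rather than values from the 3DOC code; the actual task scripts are in the GitHub repository linked above.

```python
# Minimal sketch: present an image stimulus with PsychoPy and wait for a response.
# Screen size, image path, and keys are placeholders, not the published 3DOC values.
from psychopy import visual, core, event

win = visual.Window(size=(320, 240), fullscr=False, color='black', units='pix')
stim = visual.ImageStim(win, image='stimulus.png')  # hypothetical image file

stim.draw()
win.flip()                                           # show the stimulus
keys = event.waitKeys(maxWait=5.0, keyList=['left', 'right'])  # stand-in for the real response input
print('response:', keys)

win.close()
core.quit()
```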

Check out the full publication here.

Or take a peek at the GitHub repository for this project.


Autopilot

December 12, 2019

Jonny Saunders from Michael Wehr’s lab at the University of Oregon recently posted a preprint documenting their project Autopilot, a Python framework for running behavioral experiments:


Autopilot is a Python framework for behavioral experiments built on Raspberry Pi single-board computers. Autopilot incorporates all aspects of an experiment, including the hardware, stimuli, behavioral task paradigm, data management, data visualization, and a user interface. The authors propose that Autopilot is the fastest, least expensive, and most flexible behavioral system currently available.

The benefit of using Autopilot is its experimental flexibility, which lets researchers optimize it for their specific experimental needs. Additionally, this project exemplifies how useful a Raspberry Pi can be for performing experiments and recording data. The preprint discusses many benefits of Raspberry Pis, including their speed, precision, and proper data logging, and the fact that they cost only $35 (!!). Ultimately, the authors developed Autopilot in an effort to encourage users to write reusable, portable experiments that can be shared through a public central library, promoting replication and reproducibility.
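
To give a concrete flavor of what a “reusable, portable experiment” means, here is a schematic sketch of a task written as a sequence of stages. It deliberately does not use Autopilot's actual API (see the documentation linked below for that); it is plain Python illustrating the general pattern of a trial loop whose stages each return data.

```python
# Schematic only -- not the Autopilot API. Illustrates the general pattern of a
# behavioral task defined as a sequence of stages, each returning trial data.
import random
import time


class TwoAlternativeTask:
    """Toy two-alternative task: present a target side, then score the response."""

    def __init__(self, n_trials=10):
        self.n_trials = n_trials
        self.stages = [self.request, self.discrim]   # stage sequence per trial
        self.target = None

    def request(self):
        """Stage 1: choose a target side and 'present' the stimulus."""
        self.target = random.choice(['left', 'right'])
        return {'timestamp': time.time(), 'target': self.target}

    def discrim(self):
        """Stage 2: read a (simulated) response and score it."""
        response = random.choice(['left', 'right'])  # stand-in for a real poke/lick input
        return {'response': response, 'correct': response == self.target}


if __name__ == '__main__':
    task = TwoAlternativeTask(n_trials=5)
    for trial in range(task.n_trials):
        data = {}
        for stage in task.stages:
            data.update(stage())
        print(trial, data)
```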

 

For more information, check out their presentation or the Autopilot website here.

Additionally, documentation is here, along with a GitHub repo, and a link to their preprint is here.


Hao Chen lab, UTHSC – openBehavior repository

September 19, 2016

The openBehavior GitHub repository from Hao Chen’s lab at UTHSC aims to establish a computing platform for rodent behavior research using the Raspberry Pi computer. They have built several devices for conducting operant conditioning and monitoring environmental data.

The operant licking device can be placed in a standard rat home cage and can run fixed ratio, variable ratio, or progressive ratio schedules. A preprint describing this project, including data on sucrose vs. water intake, is available. Detailed instructions for making the device are also provided.
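
For readers less familiar with the terminology, the three schedule types differ only in how the response requirement for a reward is computed. The sketch below illustrates the standard definitions in plain Python; it is not the lab's device code, and the parameter values are arbitrary.

```python
# Illustrative sketch of the three reinforcement schedules mentioned above,
# not the actual licking-device code from the repository.
import random


def fixed_ratio(n):
    """Reward every n-th response (e.g. FR5: every 5th lick)."""
    count = 0
    while True:
        count += 1
        yield count % n == 0


def variable_ratio(mean_n):
    """Reward after a randomly varying number of responses averaging mean_n."""
    required, count = random.randint(1, 2 * mean_n - 1), 0
    while True:
        count += 1
        if count >= required:
            yield True
            count, required = 0, random.randint(1, 2 * mean_n - 1)
        else:
            yield False


def progressive_ratio(step=2):
    """Each earned reward raises the response requirement by `step`."""
    required, count = step, 0
    while True:
        count += 1
        if count >= required:
            yield True
            count, required = 0, required + step
        else:
            yield False


# Example: simulate 20 licks on an FR5 schedule
schedule = fixed_ratio(5)
rewards = [next(schedule) for _ in range(20)]
print(sum(rewards), "rewards in 20 licks")
```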

The environment sensor can record the temperature, humidity, barometric pressure, and illumination at fixed time intervals and automatically transfer the data to a remote server.
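
The post does not specify the sensor hardware or server interface, but the logging loop it implies is simple: sample at a fixed interval and push each reading upstream. Below is a hedged sketch with a placeholder read function and a hypothetical endpoint URL; the repository has the actual implementation.

```python
# Sketch of a fixed-interval environment logger that uploads readings to a
# remote server. Sensor reads and the server URL are placeholders.
import time
import requests

SERVER_URL = "http://example.org/api/readings"   # hypothetical endpoint
INTERVAL_S = 300                                 # e.g. one sample every 5 minutes


def read_sensors():
    """Placeholder: replace with real temperature/humidity/pressure/light reads."""
    return {
        "temperature_c": 22.5,
        "humidity_pct": 40.0,
        "pressure_hpa": 1013.2,
        "illumination_lux": 150.0,
        "timestamp": time.time(),
    }


while True:
    reading = read_sensors()
    try:
        requests.post(SERVER_URL, json=reading, timeout=10)
    except requests.RequestException as err:
        print("upload failed, will retry next cycle:", err)
    time.sleep(INTERVAL_S)
```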

There is also a standalone RFID reader for EM4100 implantable glass chips, a motion sensor add-on for standard operant chambers, and several other devices.
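
Many EM4100 readers present tag IDs as simple serial frames, so on a Raspberry Pi the reading loop can be short. The sketch below assumes an RDM6300-style module (frames of 0x02, ten ASCII ID characters, two checksum characters, 0x03, at 9600 baud); the reader used in the repository may differ.

```python
# Sketch of reading EM4100 tag IDs over UART on a Raspberry Pi, assuming a
# common RDM6300-style serial reader. The repository's reader may differ.
import serial

PORT = "/dev/serial0"   # Raspberry Pi hardware UART; adjust to your wiring

with serial.Serial(PORT, baudrate=9600, timeout=1) as rfid:
    buffer = b""
    while True:
        byte = rfid.read(1)
        if not byte:
            continue
        if byte == b"\x02":          # start of a new frame
            buffer = b""
        elif byte == b"\x03":        # end of frame: 10 ID chars + 2 checksum chars
            if len(buffer) == 12:
                tag_id = buffer[:10].decode("ascii")
                print("tag detected:", tag_id)
        else:
            buffer += byte
```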

 

Automated Home-Cage Functional Imaging

Timothy Murphy and his colleagues at the University of British Columbia have developed an automated system for mesoscopic functional imaging that allows subjects to self-initiate head-fixation and imaging within the home-cage. In their 2016 paper, “High-throughput automated home-cage mesoscopic functional imaging of mouse cortex,” Dr. Murphy and his colleagues present this device and demonstrate its use with a group of calcium indicator transgenic mice. The supplementary material to this paper includes a diagram of the hardware, a graphic representation of the training cage, several videos of subjects interacting with the device, and sample imaging data. The Python source code and 3D print files can be found on Dr. Murphy’s UBC webpage.

Murphy, T. H., Boyd, J. D., Bolaños, F., Vanni, M. P., Silasi, G., Haupt, D., & LeDue, J. M. (2016). High-throughput automated home-cage mesoscopic functional imaging of mouse cortex. Nature Communications, 7, 11611.