Tag: Raspberry Pi

Autopilot

DECEMBER 12, 2019

Jonny Saunders from Michael Wehr’s lab at the University of Oregon recently posted a preprint documenting their project Autopilot, a Python framework for running behavioral experiments:


Autopilot is a Python framework for behavioral experiments built around Raspberry Pi single-board computers. Autopilot incorporates all aspects of an experiment, including the hardware, stimuli, behavioral task paradigm, data management, data visualization, and a user interface. The authors propose that Autopilot is the fastest, least expensive, and most flexible behavioral system currently available.

A key benefit of Autopilot is its experimental flexibility, which lets researchers tailor it to their specific experimental needs. Additionally, this project exemplifies how useful a Raspberry Pi can be for performing experiments and recording data. The preprint discusses many benefits of Raspberry Pis, including their speed, timing precision, and sound data logging, and they only cost $35 (!!). Ultimately, the authors developed Autopilot to encourage users to write reusable, portable experiments that can be contributed to a central public library, promoting replication and reproducibility.
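To give a flavor of what running an experiment on this hardware looks like, here is a minimal sketch of timestamped event logging on a Raspberry Pi GPIO pin. It is a generic illustration using the gpiozero library, not Autopilot’s actual API; the pin number and log file name are arbitrary choices.

```python
import csv
import time
from signal import pause

from gpiozero import Button  # gpiozero ships with Raspberry Pi OS

# Input device (e.g., a nosepoke IR sensor or lever switch) wired to GPIO 17.
poke = Button(17)

def log_event():
    # Append a high-resolution timestamp for each detected event.
    with open("events.csv", "a", newline="") as f:
        csv.writer(f).writerow(["poke", time.monotonic()])

poke.when_pressed = log_event
pause()  # block forever; events are handled via GPIO interrupts
```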


For more information, check out their presentation or the Autopilot website here.

Additionally, documentation is here, along with a GitHub repo, and a link to their preprint is here.


Hao Chen lab, UTHSC – openBehavior repository

September 19, 2016

The openBehavior GitHub repository from Hao Chen’s lab at UTHSC aims to establish a computing platform for rodent behavior research using the Raspberry Pi computer. They have built several devices for conducting operant conditioning and monitoring environmental data.

The operant licking device can be placed in a standard rat home cage and can run fixed ratio, variable ratio, or progressive ratio schedules. A preprint describing this project, including data on sucrose vs. water intake, is available. Detailed instructions for making the device are also provided.
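To make the schedule terminology concrete, here is a minimal, hardware-free sketch of the three reinforcement rules; the class names and parameter choices are illustrative, not taken from the Chen lab code.

```python
import random

class FixedRatio:
    """Reward every n-th response (FR-n)."""
    def __init__(self, n):
        self.n, self.count = n, 0

    def respond(self):
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True  # deliver reward
        return False

class VariableRatio:
    """Reward after an unpredictable response count averaging n (VR-n)."""
    def __init__(self, n):
        self.n, self.count = n, 0
        self.threshold = random.randint(1, 2 * n - 1)

    def respond(self):
        self.count += 1
        if self.count >= self.threshold:
            self.count = 0
            self.threshold = random.randint(1, 2 * self.n - 1)
            return True
        return False

class ProgressiveRatio:
    """Response requirement grows by `step` after each earned reward (PR)."""
    def __init__(self, start=1, step=2):
        self.requirement, self.step, self.count = start, step, 0

    def respond(self):
        self.count += 1
        if self.count >= self.requirement:
            self.count = 0
            self.requirement += self.step
            return True
        return False
```

With FixedRatio(5), for instance, every fifth lick earns a reward, while VariableRatio(5) rewards after an unpredictable count that averages five responses.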

The environment sensor can record the temperature, humidity, barometric pressure, and illumination at fixed time intervals and automatically transfer the data to a remote server.
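The sample-and-upload loop behind such a sensor can be quite small; in this sketch read_sensor() is a stand-in for the real sensor drivers, and the server URL and sampling interval are hypothetical.

```python
import time
import requests

UPLOAD_URL = "http://example.org/api/environment"  # hypothetical endpoint
INTERVAL_S = 300  # one sample every five minutes

def read_sensor():
    """Placeholder: replace with real driver calls for the attached sensors."""
    return {"temp_c": 22.5, "humidity_pct": 40.1,
            "pressure_hpa": 1013.2, "illumination_lux": 310.0}

while True:
    sample = read_sensor()
    sample["timestamp"] = time.time()
    try:
        requests.post(UPLOAD_URL, json=sample, timeout=10)
    except requests.RequestException:
        pass  # in practice, buffer locally and retry later
    time.sleep(INTERVAL_S)
```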

There is also a stand-alone RFID reader for EM4100 implantable glass chips, a motion sensor add-on for standard operant chambers, and several other devices.
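Reading such a tag is mostly a matter of parsing a serial frame. The sketch below uses pyserial and assumes the framing common to many inexpensive 125 kHz EM4100 reader modules (an STX byte, ten ASCII hex ID characters, a checksum, and an ETX byte); the port name, baud rate, and frame layout may differ for the reader used here.

```python
import serial  # pyserial

with serial.Serial("/dev/ttyS0", 9600, timeout=1) as port:
    buf = b""
    while True:
        byte = port.read(1)
        if byte == b"\x02":      # STX: start of a new frame
            buf = b""
        elif byte == b"\x03":    # ETX: frame complete
            tag_id = buf[:10]    # 10 hex characters encode the EM4100 ID
            print("read tag:", tag_id.decode(errors="replace"))
        elif byte:               # accumulate payload bytes (ID + checksum)
            buf += byte
```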


Automated Home-Cage Functional Imaging

Timothy Murphy and his colleagues at the University of British Columbia have developed an automated system for mesoscopic functional imaging that allows subjects to self-initiate head-fixation and imaging within the home cage. In their 2016 paper, “High-throughput automated home-cage mesoscopic functional imaging of mouse cortex,” Dr. Murphy and his colleagues present this device and demonstrate its use with a group of calcium indicator transgenic mice. The supplementary material to this paper includes a diagram of the hardware, a graphic representation of the training cage, several videos of subjects interacting with the device, and sample imaging data. The Python source code and 3D print files can be found on Dr. Murphy’s UBC webpage.
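Schematically, the self-initiated workflow reduces to a simple polling loop; every function below is a hypothetical stub standing in for real hardware control, not code from the published implementation.

```python
import random
import time

def animal_detected():
    """Stub: would poll an RFID antenna or beam-break at the fixation port."""
    return random.random() < 0.01

def engage_headfix():
    print("head-fix engaged")        # stub: actuate the fixation mechanism

def acquire_imaging(duration_s):
    print(f"imaging for {duration_s} s")
    time.sleep(duration_s)           # stub: trigger the mesoscope camera

def release_headfix():
    print("head-fix released")       # stub: free the animal

while True:
    if animal_detected():
        engage_headfix()
        acquire_imaging(duration_s=30)
        release_headfix()
    time.sleep(0.1)  # poll at 10 Hz
```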

Murphy, T. H., Boyd, J. D., Bolaños, F., Vanni, M. P., Silasi, G., Haupt, D., & LeDue, J. M. (2016). High-throughput automated home-cage mesoscopic functional imaging of mouse cortex. Nature Communications, 7, 11611.