Category: Other

3D Printed Headstage Implant

Richard Pinnell from Ulrich Hofmann's lab has three publications centered on open-source, 3D-printed methods for headstage implant protection and on portable, waterproof DBS and EEG systems paired with water maze tasks. We share details on the three studies below:


Most researchers opt to single-house rodents after surgery. This helps the wound heal and prevents damage to the implant. However, there are substantial benefits to social housing, as isolation can be a stressor for rodents. To allow rats to remain socially housed after implantation, Pinnell et al. (2016a) created a novel 3D-printed headstage socket that surrounds the electrode connector. Rats fitted with these implants and their protective caps were successfully pair-housed.

The polyamide headcap socket itself is 3D printed, and a stainless-steel thimble can be screwed into it; unscrewing the thimble reveals the electrode connector. The implant not only improves rodent well-being post-surgery, but also protects the electrode implant from damage during experiments and keeps it clean.

The 3D-printed headcap was used in a second study (Pinnell et al., 2016b) for wireless EEG recording in rats during a water maze task. The headstage socket housed the PCB electrode connector, to which the waterproof wireless system was attached. During normal housing conditions, this waterproof attachment was replaced with a standard 18×9 mm stainless-steel sewing thimble with 1.2 mm holes drilled at either end for attachment to the headstage socket. A PCB connector was manufactured to fit inside the socket; it contained an 18-pin ZIF connector, two DIP connectors, and an 18-pin Omnetics electrode connector, providing an interface between the implanted electrodes and the wireless recording system.

Finally, the implant was utilized in a third study (Pinnell et al., 2018), in which the same group created a miniaturized, programmable deep-brain stimulation (DBS) device for use in a water maze. The portable stimulator was built around a custom PCB and paired with the 3D-printed headcap, which was modified from its design in Pinnell et al. (2016a) to completely cover the implant and protect the PCB. The device, battery, and housing together weigh 2.7 g; the assembly offers protection both from the environment and from other rats, and can be used for DBS during water maze behavior.

The portable stimulator, the 3D-printed cap .stl files, and other files from the publications can be found at https://figshare.com/s/31122e0263c47fa5dabd.


Pinnell, R. C., Almajidy, R. K., & Hofmann, U. G. (2016a). Versatile 3D-printed headstage implant for group housing of rodents. Journal of Neuroscience Methods, 257, 134-138.

Pinnell, R. C., Almajidy, R. K., Kirch, R. D., Cassel, J. C., & Hofmann, U. G. (2016b). A wireless EEG recording method for rat use inside the water maze. PLoS ONE, 11(2), e0147730.

Pinnell, R. C., Pereira de Vasconcelos, A., Cassel, J. C., & Hofmann, U. G. (2018). A miniaturized, programmable deep-brain stimulator for group-housing and water maze use. Frontiers in Neuroscience, 12, 231.

AutonoMouse

In a recently published article (Erskine et al., 2019), the Schaefer lab at the Francis Crick Institute introduced their new open-source project called AutonoMouse.


AutonoMouse is a fully automated, high-throughput system for self-initiated conditioning and behavior tracking in mice. Many aspects of behavior can be analyzed by having rodents perform operant conditioning tasks. However, in operant experiments many variables can alter or confound results: experimenter presence, picking up and handling animals, altered physiological states through water restriction, and the fact that rodents often need to be individually housed to keep track of their individual performance. This was the main motivation for the authors to develop a way to completely automate operant conditioning.

AutonoMouse is a fully automated system that can track large numbers (over 25) of socially housed mice via implanted RFID chips. With the RFID trackers and other analyses, the behavior of mice can be tracked as they train and are subsequently tested on (or self-initiate testing in) an odor discrimination task over months, with thousands of trials performed every day. The novelty of this study is the fully automated nature of the entire system (training, experiments, water delivery, and weighing of the animals are all automated) and the ability to keep mice socially housed 24/7, all while still training them and tracking their performance in an olfactory operant conditioning task. The modular setup makes it possible to adapt AutonoMouse to study other sensory modalities, such as visual stimuli, or to run decision-making tasks. The authors provide a components list, layouts, construction drawings, and step-by-step instructions for the construction and use of AutonoMouse in their publication and on the project's GitHub.
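The core bookkeeping behind this kind of RFID-based tracking is simple: each self-initiated trial is attributed to whichever tag the reader detected at the behavior port, so per-animal performance accumulates even in a shared cage. The sketch below is illustrative only; the class names and tag IDs are hypothetical and not taken from the AutonoMouse codebase.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class AnimalRecord:
    """Running performance record for one RFID-tagged mouse."""
    trials: int = 0
    correct: int = 0

    @property
    def accuracy(self) -> float:
        return self.correct / self.trials if self.trials else 0.0

class GroupTracker:
    """Attributes self-initiated trials to individuals in a group-housed cage."""
    def __init__(self):
        # A new record is created the first time a tag is seen at the port.
        self.records = defaultdict(AnimalRecord)

    def log_trial(self, rfid_tag: str, correct: bool) -> AnimalRecord:
        rec = self.records[rfid_tag]
        rec.trials += 1
        rec.correct += int(correct)
        return rec

tracker = GroupTracker()
tracker.log_trial("tag_0412", correct=True)   # hypothetical tag IDs
tracker.log_trial("tag_0412", correct=False)
tracker.log_trial("tag_0977", correct=True)
print(tracker.records["tag_0412"].accuracy)   # 0.5
```

Keying every event by tag ID is what lets the animals stay group-housed: individual histories are reconstructed from the reads rather than from which cage an animal lives in.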


For more details, check out this YouTube interview with Andreas Schaefer, PI on the project.

 

The GitHub repository for the project's control software is located at https://github.com/RoboDoig/autonomouse-control, the design and hardware instructions are at https://github.com/RoboDoig/autonomouse-design, and the schedule generation program is at https://github.com/RoboDoig/schedule-generator.


Craniobot

March 13, 2019

Suhasa Kodandaramaiah from the University of Minnesota, Twin Cities, has shared the following about Craniobot, a computer numerical controlled robot for cranial microsurgeries.


The palette of tools available for neuroscientists to measure and manipulate the brain during behavioral experiments has greatly expanded over the past decade. In many cases, using these tools requires removing sections of the skull to access the brain. Precisely removing the sub-millimeter-thick mouse skull without damaging the underlying brain can be technically challenging and often takes significant skill and practice. This presents a potential obstacle for neuroscience labs wishing to adopt these technologies in their research. To overcome this challenge, a team at the University of Minnesota led by Mathew Rynes and Leila Ghanbari (equal contribution) created the 'Craniobot,' a cranial microsurgery platform that combines automated skull-surface profiling with a computer numerical controlled (CNC) milling machine to perform a variety of cranial microsurgical procedures on mice. The Craniobot can be built from off-the-shelf components for a little over $1,000, and the team has demonstrated its capability to perform small to large craniotomies, skull-thinning procedures, and drilling of pilot holes for installing bone anchor screws.

Read more about the Craniobot here. The software package for controlling the Craniobot can be found on GitHub.


TRIO Platform

December 12, 2018

Vladislav Voziyanov and colleagues have developed and shared the TRIO Platform, a low-profile in vivo imaging support and restraint system for mice.


In vivo optical imaging methods are common tools for understanding neural function in mice. The technique is often performed in head-fixed, anesthetized animals, which requires monitoring anesthesia level and body temperature while stabilizing the head. Fitting all of the components necessary for these experiments onto a standard microscope stage can be rather difficult. Voziyanov and colleagues have shared their design for the TRIO (Three-In-One) Platform, a compact system that provides sturdy head fixation, a gas anesthesia mask, and a warm-water bed. While the design is compact enough to work with a variety of microscope stages, its 3D-printed components make it customizable.


Read more about the TRIO Platform in Frontiers in Neuroscience!

The design files and list of commercially available build components are provided here.


Upcoming Posters and Talks at SfN 2018

October 31, 2018

At the upcoming Society for Neuroscience meeting in San Diego, there will be a number of posters and talks that highlight novel devices and software that have implications for behavioral neuroscience. If you’re heading to the meeting, be sure to check them out! Relevant posters and talks are highlighted in the document, available at the following link: https://docs.google.com/document/d/12XqODhW14K2drCCEARVESoqqE0KrSjksZKN40xURVmk/edit?usp=sharing

OpenBehavior Feedback Survey

We are looking for your feedback to understand how we can better serve the community! We’re also interested to know if/how you’ve implemented some of the open-source tools from our site in your own research.

We would greatly appreciate it if you could fill out a short survey (~5 minutes to complete) about your experiences with OpenBehavior.

https://american.co1.qualtrics.com/jfe/form/SV_0BqSEKvXWtMagqp

Thanks!

Collaboration between OpenBehavior and Hackaday.io

July 23, 2018

OpenBehavior has been covering open-source neuroscience projects for a few years, and we are always thrilled to see projects that are well documented and can be easily reproduced by others.  To further this goal, we have formed a collaboration with Hackaday.io, who have provided a home for OpenBehavior on their site.  This can be found at: https://hackaday.io/OpenBehavior, where we currently have 36 projects listed ranging from electrophysiology to robotics to behavior.  We are excited about this collaboration because it provides a straightforward way for people to document their projects with instructions, videos, images, data, etc.  Check it out, see what’s there, and if you want your project linked to the OpenBehavior page simply tag it as “OPENBEHAVIOR” or drop us a line at the Hackaday page.

Note: This collaboration between OpenBehavior and Hackaday.io is completely non-commercial, meaning that we don’t pay Hackaday.io for anything, nor do we receive any payments from them.  It’s simply a way to further our goal of promoting open-source neuroscience tools and their goal of growing their science and engineering community.



 

Article in Nature on monitoring behavior in rodents

An interesting summary of recent methods for monitoring behavior in rodents was published this week in Nature. The article mentions Lex Kravitz and his lab's efforts on the Feeding Experimentation Device (FED), as well as OpenBehavior. Check it out: https://www.nature.com/articles/d41586-018-02403-5

MAPLE: a Modular Automated Platform for Large-Scale Experiments

January 8th, 2018

The de Bivort lab and FlySorter, LLC are happy to share on OpenBehavior their open-source Drosophila handling platform, called MAPLE: Modular Automated Platform for Large-Scale Experiments.

Drosophila melanogaster has proven a valuable genetic model organism due to its rapid reproduction, low maintenance, and extensive genetic documentation. However, the tedious chore of handling and manually phenotyping flies remains a limitation on data collection. MAPLE, a Modular Automated Platform for Large-Scale Experiments, provides a solution to this limitation.

MAPLE is a Drosophila-handling robot with a modular design, allowing the platform both to automate diverse phenotyping assays and to aid with lab chores (e.g., collecting virgin female flies). MAPLE permits a small-part manipulator, a USB digital camera, and a fly manipulator to work simultaneously over a platform of flies. Failsafe mechanisms allow users to leave MAPLE unattended without risking damage to the robot or the modules.

The physical platform integrates phenotyping and animal husbandry to allow end-to-end experimental protocols. MAPLE features a large, physically open workspace for user convenience. The sides, top, and bottom are made of clear acrylic to allow optical phenotyping at all time points except when the end-effector carriages are above the modules. Finally, its low cost and scalability permit large-scale experiments ($3,500 versus the hundreds of thousands of dollars for a comparable "fly-flipping" robot).

MAPLE’s utility and versatility were demonstrated through the execution of two tasks: collection of virgin female flies, and a large-scale longitudinal measurement of fly social networks and behavior.

Links to materials:

CAD files

Control Software

Raw data and analysis scripts 

De Bivort Lab Site 


 

Pulse Pal

July 12, 2017

Josh Sanders has also shared the following with OpenBehavior regarding Pulse Pal, an open-source pulse train generator. Pulse Pal and Bpod, featured earlier, were both created by Sanworks.


Pulse Pal is an Arduino-powered device that generates precise sequences of voltage pulses for neural stimulation and stimulus control. It is controlled either through its APIs in MATLAB, Python, and C++, or as a stand-alone instrument using its OLED screen and clickable thumb joystick. Pulse Pal can play independent stimulus trains on its output channels. These trains are defined either parametrically, or pulse-wise by specifying each pulse's onset time and voltage. Two optically isolated TTL trigger channels can each be mapped to any subset of the output channels, which can range between -10 V and +10 V and deliver pulses as short as 100 µs. This feature set allows Pulse Pal to serve as an open-source alternative to commercial stimulation timing devices such as the Master-8 (AMPI), PSG-2 (ISSI), Pulsemaster A300 (WPI), BPG-1 (Bak Electronics), StimPulse PGM (FHC Inc.), and Multistim 3800 (A-M Systems).
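The distinction between parametric and pulse-wise train definitions can be made concrete: a parametric description (frequency, duration, voltage) expands into an explicit list of (onset time, voltage) pairs. The helper below is a hedged sketch of that expansion, not the actual Pulse Pal API; the function name and the 100 µs snapping step are illustrative assumptions based on the minimum pulse duration quoted above.

```python
def parametric_to_pulsewise(frequency_hz, train_duration_s, voltage):
    """Expand a parametric pulse train into explicit (onset_s, voltage) pulses.

    Onsets are snapped to a 100 microsecond grid, matching the shortest
    pulse duration quoted for Pulse Pal's output channels (illustrative).
    """
    resolution_s = 100e-6
    period_s = 1.0 / frequency_hz
    n_pulses = int(train_duration_s / period_s)
    return [(round(i * period_s / resolution_s) * resolution_s, voltage)
            for i in range(n_pulses)]

# A 10 Hz, 5 V train lasting 0.5 s expands to 5 pulses spaced 0.1 s apart.
train = parametric_to_pulsewise(frequency_hz=10, train_duration_s=0.5, voltage=5.0)
print(len(train))  # 5
```

Pulse-wise definition is the general case: arbitrary, irregular onset lists can encode trains that no single frequency/duration pair could describe.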

Because Pulse Pal is an Arduino-powered device, modifying its firmware for custom applications is within the capabilities of most modern neuroscience research labs. As an example, Pulse Pal's GitHub repository provides an alternative firmware that entirely repurposes the device as a waveform generator. In this configuration, a user can specify a waveform, frequency, amplitude, and maximum playback duration, and toggle playback by TTL pulse with ~100 µs latency. The firmware can also loop custom waveforms up to 40,000 samples long.
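In waveform mode, the host supplies a buffer of voltage samples for the device to play back. As a rough illustration of the constraints mentioned above (±10 V output range, 40,000-sample loop limit), here is how one might compute a sine buffer before uploading it. The function name and the sample rate are assumptions for illustration, not the firmware's actual interface.

```python
import math

MAX_SAMPLES = 40_000   # loop-buffer limit of the alternative firmware
V_RANGE = 10.0         # output channels span -10 V to +10 V

def build_sine_buffer(freq_hz, amplitude_v, sample_rate_hz, duration_s):
    """Sample a sine wave into a playback buffer, respecting device limits."""
    n = min(int(sample_rate_hz * duration_s), MAX_SAMPLES)
    amp = max(-V_RANGE, min(V_RANGE, amplitude_v))  # clip to the output range
    return [amp * math.sin(2 * math.pi * freq_hz * i / sample_rate_hz)
            for i in range(n)]

# 100 Hz sine at 5 V, sampled at 10 kHz for 0.1 s -> 1000 samples.
wave = build_sine_buffer(freq_hz=100, amplitude_v=5.0,
                         sample_rate_hz=10_000, duration_s=0.1)
print(len(wave))  # 1000
```

Capping the buffer at 40,000 samples mirrors the looping limit described above; at 10 kHz that corresponds to a 4 s waveform before playback must wrap.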

Pulse Pal was first published in 2014 by Josh Sanders, then a student in the Kepecs Lab at Cold Spring Harbor Laboratory. A significantly improved second-generation stimulator (Pulse Pal 2) became available in early 2016, coincident with the opening of Sanworks LLC. Over the past year, more than 125 Pulse Pal 2 devices were sold at $545 each through the Sanworks assembly service, while several labs elected to build their own. The initial success of this product demonstrates that fully open-source hardware can make headway against closed-source competitors in the neuroscience instrumentation niche market.


Sanworks Github page for Pulse Pal may be found here.

The Wiki page for Pulse Pal, including assembly instructions, may be found here.