Category: Neural Stimulation

HOPE

July 12, 2019

Sebastien Delcasso from the Graybiel lab at MIT published a method for building "HOPE," a brain implant that combines optogenetics, pharmacology, and electrophysiology:


HOPE (hybrid-drive combining optogenetics, pharmacology, and electrophysiology) is a method that simplifies the construction of a drivable, multi-task recording implant. HOPE is a new type of implant that supports up to 16 tetrodes and allows recordings from two different brain areas in a mouse at the same time, along with simultaneous optogenetic or pharmacological manipulation. The HOPE implants are open-source and can be recreated in CAD software and then 3D printed, drastically lowering the cost of an electrophysiological implant. Additionally, instead of waiting months for a custom-made implant, these can be printed within a few hours.

The manuscript provides detailed instructions for constructing the implant and allows users to modify it for their own needs (including adaptation for rats or non-human primates). HOPE is designed for experiments that pair electrophysiological recordings with either optogenetic or pharmacological manipulations, which should open the door to many more experiments. The implant is intended for microdrive recordings and consists of only six 3D-printed parts, an electrode interface board (EIB), and five screws.

The authors validate the implant by first successfully recording striatal neurons, then using transgenic PV-Cre mice to optogenetically inhibit parvalbumin interneurons, and finally infusing muscimol into the striatum in a head-fixed mouse preparation. HOPE is a novel open-source neural implant that can be paired with multiple methods (recordings, optogenetics, and pharmacology) to manipulate and simultaneously record brain activity.


More details of their implant can be found on their project site and on the project GitHub.


Delcasso, S., Denagamage, S., Britton, Z., & Graybiel, A. M. (2018). HOPE: Hybrid-Drive Combining Optogenetics, Pharmacology and Electrophysiology. Frontiers in neural circuits, 12, 41.


optoPAD

June 27, 2019

Carlos Ribeiro’s lab at Champalimaud recently published their new project called optoPAD in eLife:


Analyses of behavior and of neural activity must both be temporally precise before the two can be meaningfully correlated or compared. Behavior can be analyzed through many methods (as seen in many projects featured on this site!). The Ribeiro lab previously published flyPAD (Itskov et al., 2014), a system for automated analysis of feeding behavior in Drosophila with high temporal precision. To manipulate specific feeding behaviors, however, the group wanted to go a step further and perturb neural activity during feeding, which required a method precise enough to compare with behavior.

In their new manuscript, Moreira et al. describe the design and implementation of a high-throughput system for closed-loop optogenetic manipulation of neurons in Drosophila during feeding behavior. Named optoPAD, the system allows precise perturbation of specific groups of neurons. The authors use optoPAD to induce appetitive and aversive effects on feeding by activating or inhibiting gustatory neurons in a closed-loop manner. OptoPAD combines the previous flyPAD system with additional hardware for driving LEDs for optogenetic perturbation, and integrates with Bonsai, an open-source framework for behavioral analysis.

The system first uses flyPAD to measure the fly's interaction with the food presented in an experiment. Bonsai then detects when the fly touches a food electrode and sends a signal to a microcontroller, which turns on an LED to optogenetically perturb neurons in the fly. The authors additionally highlight the flexibility and expandability of the optoPAD system, describing how flyPAD, once published and later implemented in an optogenetics framework by their group, was successfully adapted by another group, a great example of the benefit of open-source sharing of projects.
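
The detect-then-stimulate logic described above is simple enough to sketch. The Python fragment below is only an illustration of the closed-loop rule, not the actual optoPAD code (which runs as a Bonsai workflow plus microcontroller firmware); the threshold value and function names are hypothetical.

```python
def led_command(capacitance, threshold=0.5):
    """Return True (LED on) when the fly-electrode capacitance signal
    crosses a touch threshold, i.e. the fly is contacting the food."""
    return capacitance >= threshold

def run_closed_loop(trace, threshold=0.5):
    """Map a capacitance trace (one value per sample) to per-sample
    LED on/off commands, mimicking the Bonsai -> microcontroller path."""
    return [led_command(c, threshold) for c in trace]

# Simulated trace: two samples where the fly touches the electrode
trace = [0.1, 0.2, 0.7, 0.8, 0.3]
print(run_closed_loop(trace))  # [False, False, True, True, False]
```

In the real system the latency of this loop matters; keeping the decision rule this simple is part of what makes sample-by-sample closed-loop stimulation feasible.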


Details on the hardware and software can be found on the Ribeiro lab GitHub. More details on flyPAD, the original project, can be found on its own GitHub repository.

Information on flyPAD can also be found on the flyPAD website and in the flyPAD paper.


Moreira, J. M., Itskov, P. M., Goldschmidt, D., Steck, K., Walker, S. J., & Ribeiro, C. (2019). optoPAD: a closed-loop optogenetics system to study the circuit basis of feeding behaviors. eLife, doi: 10.7554/eLife.43924

Cerebro Wireless Optogenetic System

April 5, 2019

Andy Lustig from the Karpova Lab at Janelia has developed, documented, and shared a system for wireless optogenetic stimulation.


Several commercial systems for wirelessly controlled optogenetic stimulation are available; however, as you might expect, these systems can be cost-prohibitive and often lack the ability to be customized. To address these limitations, Lustig developed his own wireless, open-source optogenetic stimulation system. It features Cerebro, a rechargeable, battery-powered wireless receiver; a head implant containing optical fibers and two independent laser diodes; a base station for transmitting radio signals to the Cerebro, controlled by a Windows computer via USB or by TTL; a charging dock; and Xavier, a user-friendly GUI for sending and logging base station commands. Full documentation for building this system is available on the Karpova Lab GitHub.


Craniobot

March 13, 2019

Suhasa Kodandaramaiah from the University of Minnesota, Twin Cities, has shared the following about Craniobot, a computer numerical control (CNC) robot for cranial microsurgeries.


The palette of tools available for neuroscientists to measure and manipulate the brain during behavioral experiments has greatly expanded over the past decade. In many cases, using these tools requires removing sections of the skull to access the brain. Removing the sub-millimeter-thick mouse skull precisely without damaging the underlying brain is technically challenging and takes significant skill and practice, a potential obstacle for neuroscience labs wishing to adopt these technologies in their research. To overcome this challenge, a team at the University of Minnesota led by Mathew Rynes and Leila Ghanbari (equal contribution) created the 'Craniobot,' a cranial microsurgery platform that combines automated skull-surface profiling with a computer numerical control (CNC) milling machine to perform a variety of cranial microsurgical procedures on mice. The Craniobot can be built from off-the-shelf components for a little over $1,000, and the team has demonstrated its ability to perform small to large craniotomies, thin the skull, and drill pilot holes for installing bone anchor screws.
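
The core idea, profiling the skull surface and then cutting at a fixed depth below it, can be expressed in a few lines. The following is an illustrative Python sketch under assumed units and data structures, not the Craniobot control software:

```python
def milling_targets(probed_surface, cut_depth=0.05):
    """Given skull-surface heights probed at grid points (z in mm),
    return the target mill z at each point: the probed surface offset
    downward by a fixed cut depth, so the cut follows the skull's
    curvature instead of assuming a flat skull."""
    return {point: z - cut_depth for point, z in probed_surface.items()}

# Hypothetical 3-point profile of a curved skull surface (z in mm)
surface = {(0, 0): 1.20, (0, 1): 1.25, (1, 0): 1.18}
targets = milling_targets(surface, cut_depth=0.05)
```

Surface profiling is what makes the approach safe: because the mill's depth is referenced to the measured surface at each point, a uniform thin cut can be made over a curved skull.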

Read more about the Craniobot here. The software package for controlling the Craniobot can be found on GitHub.


OpenBehavior Feedback Survey

We are looking for your feedback to understand how we can better serve the community! We’re also interested to know if/how you’ve implemented some of the open-source tools from our site in your own research.

We would greatly appreciate it if you could fill out a short survey (~5 minutes to complete) about your experiences with OpenBehavior.

https://american.co1.qualtrics.com/jfe/form/SV_0BqSEKvXWtMagqp

Thanks!

PhotometryBox

August 29, 2018

In a recent bioRxiv preprint, Scott Owen and Anatol Kreitzer share PhotometryBox, an open-source solution for electronic control of fiber-based fluorescence measurements.


Fluorescence measurements from deep-brain structures through optical fibers (fiber photometry) represent a versatile, powerful, and rapidly growing neuroscience technique. A typical fiber photometry system consists of three parts: (1) an implant with an optical fiber that is cemented to the skull, (2) optical components for generation of fluorescence excitation light and detection of emission light, and (3) electronic components for controlling light sources and acquiring signals. Excellent technical solutions are available for implants and optical components; however, currently available electronic control systems are not optimized for these experiments. The most commonly used electronic components are either over-engineered or unnecessarily inflexible. To address these issues, Owen et al. have developed an open-source, low-cost solution for the electronic components. This system is based on a programmable microcontroller (MBED LPC1768) and can be assembled in ~1 hour (less than a day for an inexperienced user with limited soldering experience). The total estimated cost is about $650, less than one tenth the price of the most commonly used commercially available systems.

The design, development, and implementation of this project are described in a manuscript now available on bioRxiv, while details regarding parts, construction, and use are available on Hackaday.
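
As one concrete illustration of the job the electronic components do, the sketch below shows time-division multiplexing of an excitation LED with synchronized demodulation of the detector signal. This is a common fiber-photometry scheme sketched in Python for clarity; it is not code from PhotometryBox, and all numbers are made up.

```python
def demodulate(samples, led_on):
    """Recover the fluorescence signal from interleaved acquisition:
    mean detector reading while the excitation LED is on, minus the
    off-phase baseline (ambient light plus detector offset)."""
    on = [s for s, m in zip(samples, led_on) if m]
    off = [s for s, m in zip(samples, led_on) if not m]
    return sum(on) / len(on) - sum(off) / len(off)

# Simulated interleaved samples: ~2.0 V with the LED on, ~0.5 V baseline
samples = [2.0, 0.5, 2.1, 0.5, 1.9, 0.5]
led_on = [True, False, True, False, True, False]
signal = demodulate(samples, led_on)  # roughly 1.5 V of true fluorescence
```

The point of the sketch is why the light source and the acquisition must share one controller: the demodulation only works if each sample is tagged with the exact LED state under which it was acquired.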

Read more on bioRxiv

or check out the Hackaday page.


Collaboration between OpenBehavior and Hackaday.io

July 23, 2018

OpenBehavior has been covering open-source neuroscience projects for a few years, and we are always thrilled to see projects that are well documented and can be easily reproduced by others. To further this goal, we have formed a collaboration with Hackaday.io, which has provided a home for OpenBehavior on their site. This can be found at https://hackaday.io/OpenBehavior, where we currently have 36 projects listed, ranging from electrophysiology to robotics to behavior. We are excited about this collaboration because it provides a straightforward way for people to document their projects with instructions, videos, images, data, etc. Check it out, see what's there, and if you want your project linked to the OpenBehavior page, simply tag it as "OPENBEHAVIOR" or drop us a line at the Hackaday page.

Note: This collaboration between OpenBehavior and Hackaday.io is completely non-commercial, meaning that we don’t pay Hackaday.io for anything, nor do we receive any payments from them.  It’s simply a way to further our goal of promoting open-source neuroscience tools and their goal of growing their science and engineering community.




Open source modules for tracking animal behavior and closed-loop stimulation based on Open Ephys and Bonsai

June 15, 2018

In a recent preprint on bioRxiv, Alessio Buccino and colleagues from the University of Oslo provide a step-by-step guide for setting up an open-source, low-cost, and adaptable system for combined behavioral tracking, electrophysiology, and closed-loop stimulation. Their setup integrates Bonsai and Open Ephys with multiple modules they have developed for robust real-time tracking and behavior-based closed-loop stimulation. In the preprint, they describe using the system to record place cell activity in the hippocampus and medial entorhinal cortex, and present a case where they used it for closed-loop optogenetic stimulation of grid cells in the entorhinal cortex. Expanding the Open Ephys system to include animal tracking and behavior-based closed-loop stimulation extends the availability of high-quality, low-cost experimental setups built on standardized data formats.
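
A position-based closed-loop rule like the one used for grid-cell stimulation can be sketched as follows. This Python fragment is only a schematic of the logic (the real system implements it as Bonsai workflows feeding Open Ephys modules); the circular zone geometry and all values are hypothetical.

```python
import math

def in_stim_zone(x, y, center=(0.0, 0.0), radius=5.0):
    """True when the tracked position falls inside a circular stimulation
    zone, e.g. a region placed over a cell's firing field."""
    return math.hypot(x - center[0], y - center[1]) <= radius

def stim_commands(positions, center=(0.0, 0.0), radius=5.0):
    """Per-frame stimulation commands: stimulate while the animal is
    inside the zone, standing in for the tracking -> TTL pipeline."""
    return [in_stim_zone(x, y, center, radius) for x, y in positions]

path = [(10.0, 0.0), (3.0, 0.0), (0.0, 1.0), (8.0, 8.0)]
print(stim_commands(path))  # [False, True, True, False]
```

In practice the zone would be drawn around a field estimated from previously recorded spiking, and each frame's decision must be delivered with low enough latency to overlap the traversal.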

Read more on bioRxiv, or on GitHub!


Buccino A, Lepperød M, Dragly S, Häfliger P, Fyhn M, Hafting T (2018). Open Source Modules for Tracking Animal Behavior and Closed-loop Stimulation Based on Open Ephys and Bonsai. bioRxiv. http://dx.doi.org/10.1101/340141

Head-Fixed Setup for Combined Behavior, Electrophysiology, and Optogenetics

June 12, 2018

In a recent publication in Frontiers in Systems Neuroscience, Solari and colleagues of the Hungarian Academy of Sciences and Semmelweis University share a behavioral setup for temporally controlled rodent behavior. The arrangement allows head-fixed animals to be trained with calibrated sound stimuli and precisely timed fluid and air-puff presentations as reinforcers. It combines microcontroller-based behavior control with a sound delivery system for acoustic stimuli, fast solenoid valves for reinforcement delivery, and a custom-built sound-attenuated chamber, and is shown to be suitable for combined behavior, electrophysiology, and optogenetics experiments. The system uses a fully open-source hardware and software stack built on Bonsai, Bpod, and Open Ephys.
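
The trial structure described above, a calibrated stimulus followed by a precisely timed reinforcer, can be expressed as a simple event schedule. The sketch below is a hypothetical Python rendering of that logic, not the authors' Bpod/Bonsai code; all durations are placeholders.

```python
def build_trial(tone_s=1.0, delay_s=0.5, valve_s=0.04):
    """Return a list of (time_s, event) pairs for one trial: a tone,
    a fixed delay, then a brief solenoid opening whose duration sets
    the delivered fluid volume (the valve is calibrated beforehand)."""
    t = 0.0
    events = [(t, "tone_on")]
    t += tone_s
    events.append((t, "tone_off"))
    t += delay_s
    events.append((t, "valve_open"))
    t += valve_s
    events.append((t, "valve_close"))
    return events

for time_s, event in build_trial():
    print(f"{time_s:.2f}s  {event}")
```

On the real rig this schedule is executed by the microcontroller rather than a PC, which is what gives the millisecond-scale timing precision the experiments require.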

Read more here!

GitHub


Solari N, Sviatkó K, Laszlovszky T, Hegedüs P and Hangya B (2018). Open Source Tools for Temporally Controlled Rodent Behavior Suitable for Electrophysiology and Optogenetic Manipulations. Front. Syst. Neurosci. 12:18. doi: 10.3389/fnsys.2018.00018

Article in Nature on monitoring behavior in rodents

An interesting summary of recent methods for monitoring behavior in rodents was published this week in Nature. The article mentions Lex Kravitz and his lab's efforts on the Feeding Experimentation Device (FED), as well as OpenBehavior. Check it out: https://www.nature.com/articles/d41586-018-02403-5