
eNeuro’s “Open Source Tools and Methods” paper topic

SEPTEMBER 12, 2019

There’s a new place to publish your open-source tools or methods in neuroscience! Christophe Bernard, Editor-in-Chief of the journal eNeuro (an open-access journal of the Society for Neuroscience), recently wrote an editorial announcing a new topic track in eNeuro for Open Source Tools and Methods. In the editorial, Bernard describes the recent push for open-source science and highlights the many new open-source projects being developed in neuroscience that need a proper home for publication. While eNeuro already has a “Methods/New Tools” submission type, Bernard notes that the “Open Source Tools and Methods” submission is intended for projects like “low-cost devices to measure animal behavior, a new biophysical model of a single neuron, a better method to realign images when performing in vivo two-photon imaging, scripts and codes to analyze signals” and more.

Until now, there has been no publication venue explicitly intended for open-source tools and methods in neuroscience; this article type gives new tools, methods, devices, and projects a straightforward path to publication. It should also aid the neuroscience field in replication, reproducibility, and transparency of the methods and tools being used. A major point from Bernard is that this may help the developers of a tool or method, since “it allows for acknowledgment of those who developed such tools and methods fully, often rotating students or engineers recruited on a short-duration contract. On a standard research paper, their name ends up in the middle of the list of authors, but the Open Source Tools and Methods type will allow them to be the first author.”

The submission details for an open-source tool or method on the eNeuro site are as follows: “Open Source Tools and Methods are brief reports (limited to 4500 words) describing the creation and use of open-source tools in neuroscience research. Examples of tools include hardware designs used in behavioral or physiological studies and software used for data acquisition and analysis. They must contain a critique of the importance of the tool, how it compares to existing open- and closed-source solutions, and a demonstration of tool use in a neuroscience experiment.”

 

Cheers to you, eNeuro, for your inclusion of open-source projects to help advance the neuroscience field!


Link to the editorial: https://www.eneuro.org/content/6/5/ENEURO.0342-19.2019

Current articles for Open Source Tools and Methods are listed here.

To submit an article under Open Source Tools and Methods, check out the instructions for authors at eNeuro here.

Curated Itinerary on Open-Source Tools at SfN-19

September 5, 2019

OpenBehavior is now an official part of the SfN team for curated itineraries at SfN-19! This year, we will provide an itinerary on Open-Source Tools. Linda Amarante (@L_Amarante) and Samantha White (@samantha6rose) are working on the itinerary now. If you would like your presentation to be included, please DM us through our Twitter account (@OpenBehavior) or send an email message about your presentation to openbehavior@gmail.com before noon on Saturday, September 8. Thanks!

Ratcave

AUGUST 29, 2019

Nicholas A. Del Grosso and Anton Sirota at the Bernstein Centre for Computational Neuroscience recently published their new project, Ratcave, a Python 3D graphics library that allows researchers to create and present 3D stimuli in their experiments:


Neuroscience experiments often require software to present stimuli to a subject and record their responses. Many current stimulus libraries lack the 3D graphics support needed for psychophysics experiments. While Python and other programming languages have 3D graphics libraries, these are hard to integrate into psychophysics libraries without modification. To make 3D graphics programming fit naturally into the existing ecosystem of Python psychophysics software, the authors developed Ratcave.

Ratcave is an open-source, cross-platform Python library that adds 3D stimulus support to OpenGL-based 2D Python stimulus libraries, including VisionEgg, PsychoPy, Pyglet, and Pygame. Ratcave comes with resources including basic 3D object primitives and a wide range of 3D lighting effects. Its intuitive object-oriented interface allows all objects, including meshes, lights, and cameras, to be repositioned, rotated, and scaled. Objects can also be parented to one another to specify complex relationships between them. By sending vertex data as a single array using OpenGL’s VAO (Vertex Array Object) functionality, Ratcave makes drawing much more efficient; this approach allows over 30,000 vertices to be rendered at a performance level surpassing the needs of most behavioral research studies.
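To give a sense of the object-oriented interface, here is a minimal sketch modeled on the tutorial pattern in the Ratcave documentation; the exact names used here (WavefrontReader, resources.obj_primitives, default_shader) come from that documentation and may differ slightly between versions:

```python
import pyglet
import ratcave as rc

window = pyglet.window.Window()

# Load one of the built-in 3D primitives shipped with Ratcave.
reader = rc.WavefrontReader(rc.resources.obj_primitives)
monkey = reader.get_mesh("Monkey")
monkey.position.xyz = 0, 0, -2   # place the mesh in front of the camera

# A Scene bundles meshes, a camera, and a light into one drawable object.
scene = rc.Scene(meshes=[monkey])

@window.event
def on_draw():
    with rc.default_shader:
        scene.draw()

pyglet.app.run()
```

Because the drawing happens inside an ordinary Pyglet event loop, the same pattern can be dropped into experiments that already use Pyglet- or PsychoPy-based stimulus code.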

An advantage of Ratcave is that it supplements existing Python stimulus libraries rather than replacing them, so researchers can keep using their preferred tools and simply add 3D stimuli to current experiments. The manuscript also reports that Ratcave has been tested and used in other groups’ research, demonstrating reproducibility across labs and experiments.

Details on the software and source code can be found at https://github.com/ratcave/ratcave.

Documentation for Ratcave is available at https://ratcave.readthedocs.org.


The Future is Open

August 16, 2019

This week’s post is about the current state of OpenBehavior (OB) and ongoing efforts within the open source neuroscience community. Next week, we will resume posting about new tools.

Samantha White, Linda Amarante, Lex Kravitz, and Mark Laubach published a commentary in eNeuro last week about how open-source tools are being used in neuroscience. We reported on our experiences running OB since the summer of 2016, the many wonderful projects that we have posted about over the past three years, two surveys that we conducted on our site and on open-source tool use in general, and some observations on the mindset that comes from making and using open-source tools. The paper is available at https://www.eneuro.org/content/6/4/ENEURO.0223-19.2019.

The timing of our commentary and the social media attention it generated (e.g. https://twitter.com/samantha6rose/status/1159913815393341440) was especially nice, as we have been working to expand OB to better serve the research community and hope to find external support for the project. We would like to address an outstanding problem: it is not currently possible to systematically track the development and use of open-source hardware and software in neuroscience research. To address this issue, we would like to create a database of existing open-source projects, characterize them using a newly developed “taxonomy” based on their functions (video analysis, behavioral control systems, hardware for measuring or controlling behavior), and register projects using the SciCrunch RRID registry.
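As a purely illustrative sketch of what one record in such a database might contain (the field names and taxonomy labels below are our assumptions, not a finalized schema):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ProjectRecord:
    """Hypothetical record for one open-source project in the proposed database."""
    name: str                                             # e.g. "Ratcave"
    url: str                                              # repository or documentation link
    functions: List[str] = field(default_factory=list)    # taxonomy labels, e.g. ["video analysis"]
    rrid: Optional[str] = None                            # SciCrunch RRID once the project is registered
    publications: List[str] = field(default_factory=list) # DOIs of papers that used the tool
```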

If you haven’t heard of SciCrunch, you should check it out: https://scicrunch.org/. It’s an awesome project that tracks the usage of research tools such as antibodies. RRIDs are citable and, if created for open-source hardware and software, would allow developers to track how their tools are used in neuroscience publications. This could provide both an incentive for sharing and a metric (RRIDs) of tool use and publication.

We are also planning to work with the Society for Neuroscience (SfN) to increase public awareness of neuroscience research by participating in SfN-sponsored advocacy and outreach events, facilitating discussions of open source tools through a new discussion topic in the Neuronline forums (more news on that soon), and continuing to provide curated itineraries on open source tools for attendees of the annual SfN meeting.

 

3D Printed Headstage Implant

June 6, 2019

Richard Pinnell from Ulrich Hofmann’s lab has three publications centered on open-source, 3D-printed methods for protecting headstage implants and on portable, waterproof DBS and EEG systems for use during water maze activity. We share details on the three studies below:


Most researchers opt to single-house rodents after surgery; this helps the wound heal and prevents damage to the implant. However, there are substantial benefits to social housing, as social isolation can be a stressor for rodents. As a way to keep rats socially housed, Pinnell et al. (2016a) created a novel 3D-printed headstage socket that surrounds the electrode connector. Rats with these implants and their protective caps were successfully pair-housed.

The polyamide headcap socket itself is 3D printed, and a stainless-steel thimble screws into it; unscrewing the thimble exposes the electrode connector. The implant not only improves the well-being of the rodent post-surgery, but also prevents damage to the electrode implant during experiments and keeps it clean.

The 3D-printed headcap was used in a second study (Pinnell et al., 2016b) for wireless EEG recording in rats during a water maze task. The headstage socket housed the PCB electrode connector, to which the waterproof wireless system was attached. During normal housing conditions, the waterproof attachment was replaced with a standard 18 × 9 mm stainless-steel sewing thimble with 1.2 mm holes drilled at either end for attachment to the headstage socket. A PCB connector was manufactured to fit inside the socket; it contains an 18-pin ZIF connector, two DIP connectors, and an 18-pin Omnetics electrode connector that provides the interface between the implanted electrodes and the wireless recording system.

Finally, the implant was used in a third study (Pinnell et al., 2018) in which the same group created a miniaturized, programmable deep brain stimulation (DBS) device for use in a water maze. The portable stimulator was built on a custom PCB and paired with the 3D-printed headcap, which was modified from Pinnell et al. (2016a) to completely cover the implant and protect the PCB. The device, together with its battery and housing, weighs 2.7 g, offers protection from both the environment and other rats, and can be used for DBS during water maze behavior.

The portable stimulator, the 3D-printed cap .stl files, and other files from these publications can be found at https://figshare.com/s/31122e0263c47fa5dabd.


Pinnell, R. C., Almajidy, R. K., & Hofmann, U. G. (2016a). Versatile 3D-printed headstage implant for group housing of rodents. Journal of Neuroscience Methods, 257, 134–138.

Pinnell, R. C., Almajidy, R. K., Kirch, R. D., Cassel, J. C., & Hofmann, U. G. (2016b). A wireless EEG recording method for rat use inside the water maze. PLoS ONE, 11(2), e0147730.

AutonoMouse

May 10, 2019

In a recently published article (Erskine et al., 2019), the Schaefer lab at the Francis Crick Institute introduced their new open-source project, AutonoMouse.


AutonoMouse is a fully automated, high-throughput system for self-initiated conditioning and behavior tracking in mice. Many aspects of behavior can be analyzed by having rodents perform operant conditioning tasks. However, in operant experiments many variables can alter or confound results: experimenter presence, picking up and handling the animals, altered physiological states from water restriction, and the fact that rodents often need to be individually housed to keep track of individual performance. This was the main motivation for the authors to fully automate operant conditioning. AutonoMouse can track large numbers (over 25) of socially housed mice via implanted RFID chips. With the RFID tracking and accompanying analyses, mice can be followed as they train on, and subsequently self-initiate testing in, an odor discrimination task over months, with thousands of trials performed every day. The novelty of this study is the fully automated nature of the entire system (training, experiments, water delivery, and weighing of the animals are all automated) and the ability to keep mice socially housed 24/7 while still training them and tracking their performance in an olfactory operant conditioning task. The modular setup makes it possible to use AutonoMouse to study other sensory modalities, such as vision, or decision-making tasks. The authors provide a components list, layouts, construction drawings, and step-by-step instructions for the construction and use of AutonoMouse in their publication and on the project’s GitHub.
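To illustrate the general idea of RFID-gated, self-initiated training (this is not the AutonoMouse code; the reader, trial, schedule, and log names below are hypothetical placeholders), a minimal control-loop sketch might look like this:

```python
import csv
import datetime
import random

# Hypothetical stand-ins for the hardware interfaces; the real control
# software (github.com/RoboDoig/autonomouse-control) differs.
def read_rfid():
    """Block until a mouse enters the behavior port and return its RFID tag."""
    return random.choice(["0007A1", "0007B2", "0007C3"])   # placeholder

def run_odor_trial(schedule_step):
    """Present the scheduled odor, record the lick response, return correctness."""
    return random.random() < 0.8                            # placeholder outcome

# Per-animal training schedules, looked up by RFID tag (illustrative labels).
schedules = {"0007A1": "easy_discrimination",
             "0007B2": "hard_discrimination",
             "0007C3": "easy_discrimination"}

with open("trial_log.csv", "a", newline="") as log:
    writer = csv.writer(log)
    for _ in range(10):                           # a real system loops continuously
        mouse_id = read_rfid()                    # the animal self-initiates a trial
        step = schedules.get(mouse_id, "habituation")
        correct = run_odor_trial(step)
        writer.writerow([datetime.datetime.now().isoformat(), mouse_id, step, correct])
```

The key design point this sketch reflects is that the animal, not the experimenter, triggers each trial, and the RFID tag lets one shared apparatus maintain separate schedules and performance records for every mouse in the group.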


For more details, check out this YouTube interview with Andreas Schaefer, PI on the project.

 

The GitHub repository for the project’s control software is located here: https://github.com/RoboDoig/autonomouse-control, the design and hardware instructions are here: https://github.com/RoboDoig/autonomouse-design, and the schedule generation program is located here: https://github.com/RoboDoig/schedule-generator


Craniobot

March 13, 2019

Suhasa Kodandaramaiah from the University of Minnesota, Twin Cities, has shared the following about Craniobot, a computer numerical controlled robot for cranial microsurgeries.


The palette of tools available for neuroscientists to measure and manipulate the brain during behavioral experiments has expanded greatly over the past decade. In many cases, using these tools requires removing sections of the skull to access the brain. Precisely removing the sub-millimeter-thick mouse skull without damaging the underlying brain is technically challenging and takes significant skill and practice, presenting a potential obstacle for neuroscience labs wishing to adopt these technologies in their research. To overcome this challenge, a team at the University of Minnesota led by Mathew Rynes and Leila Ghanbari (equal contribution) created the ‘Craniobot,’ a cranial microsurgery platform that combines automated skull surface profiling with a computer numerical controlled (CNC) milling machine to perform a variety of cranial microsurgical procedures on mice. The Craniobot can be built from off-the-shelf components for a little over $1000, and the team has demonstrated its ability to perform small to large craniotomies, skull thinning procedures, and drilling of pilot holes for installing bone anchor screws.
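As a rough illustration of the profile-then-mill idea (this is not the Craniobot software; the serial port, G-code dialect, and coordinates are assumptions for a generic grbl-style CNC controller):

```python
import serial  # pyserial

# Connect to a generic grbl-style CNC controller (port and baud rate are assumptions).
cnc = serial.Serial("/dev/ttyUSB0", 115200, timeout=2)

def send(gcode_line):
    """Send one G-code line and return the controller's reply."""
    cnc.write((gcode_line + "\n").encode())
    return cnc.readline().decode().strip()

# 1) Surface profiling: probe downward (G38.2) over a small XY grid and record
#    the height at which the probe contacts the skull.
surface_z = {}
for x in [0.0, 0.5, 1.0]:                    # mm coordinates, purely illustrative
    for y in [0.0, 0.5, 1.0]:
        send(f"G0 X{x} Y{y} Z1.0")           # hover above the probe point
        reply = send("G38.2 Z-2.0 F10")      # probe slowly toward the skull surface
        surface_z[(x, y)] = reply            # contact position reported by the controller

# 2) Milling: a cutting path would then follow the measured surface at a fixed
#    depth offset, keeping cut depth constant despite skull curvature
#    (path generation and reply parsing omitted in this sketch).
```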

Read more about the Craniobot here. The software package for controlling the Craniobot can be found on GitHub.


TRIO Platform

December 12, 2018

Vladislav Voziyanov and colleagues have developed and shared the TRIO Platform, a low-profile in vivo imaging support and restraint system for mice.


In vivo optical imaging methods are common tools for understanding neural function in mice. The technique is often performed in head-fixed, anesthetized animals, which requires monitoring anesthesia level and body temperature while stabilizing the head. Fitting all of the components necessary for these experiments onto a standard microscope stage can be difficult. Voziyanov and colleagues have shared their design for the TRIO (Three-In-One) Platform. The system is compact and provides sturdy head fixation, a gas anesthesia mask, and a warm-water bed. While the design is compact enough to work with a variety of microscope stages, the use of 3D-printed components makes it customizable.

Image: TRIO Platform (Figure 4 from the Frontiers in Neuroscience article): https://www.frontiersin.org/files/Articles/184541/fnins-10-00169-HTML/image_m/fnins-10-00169-g004.jpg

Read more about the TRIO Platform in Frontiers in Neuroscience!

The design files and list of commercially available build components are provided here.


Upcoming Posters and Talks at SfN 2018

October 31, 2018

At the upcoming Society for Neuroscience meeting in San Diego, there will be a number of posters and talks that highlight novel devices and software that have implications for behavioral neuroscience. If you’re heading to the meeting, be sure to check them out! Relevant posters and talks are highlighted in the document, available at the following link: https://docs.google.com/document/d/12XqODhW14K2drCCEARVESoqqE0KrSjksZKN40xURVmk/edit?usp=sharing

OpenBehavior Feedback Survey

We are looking for your feedback to understand how we can better serve the community! We’re also interested in whether, and how, you’ve implemented some of the open-source tools from our site in your own research.

We would greatly appreciate it if you could fill out a short survey (~5 minutes to complete) about your experiences with OpenBehavior.

https://american.co1.qualtrics.com/jfe/form/SV_0BqSEKvXWtMagqp

Thanks!