
SfN 2019 Open-Source Itinerary

October 18, 2019

It’s that time of year again: SfN season! The OpenBehavior Team has put together a curated itinerary of talks and posters featuring open-source tools. The list contains over 200 projects related to open-source in general, with 90 or so projects that are specifically relevant to behavioral neuroscience (denoted in bold)! You can find it at this link: https://docs.google.com/document/d/1s09KorOuBcga2TygATB6lXcjpNYlMxCEZ1yXzxWsnYE/edit?usp=sharing.

Thanks to everyone who reached out to add their projects to the itinerary! If you want to add your poster or talk to the list, it’s not too late! Send us a DM on Twitter or a quick email to openbehavior@gmail.com.

See you in Chicago!

Updates on LocoWhisk and ART

October 3, 2019

Dr Robyn Grant from Manchester Metropolitan University in Manchester, UK, has shared her group’s most recent project, LocoWhisk, a hardware and software solution for measuring rodent exploratory, sensory and motor behaviours:


In describing the project, Dr Grant writes, “Previous studies from our lab have shown that analysing whisker movements and locomotion allows us to quantify the behavioural consequences of sensory, motor and cognitive deficits in rodents. Independent whisker and feet trackers existed, but there was no fully automated, open-source software and hardware solution that could measure both whisker movements and gait.

We developed the LocoWhisk arena and new accompanying software that allow the automatic detection and measurement of both whisker and gait information from high-speed video footage. The arena can easily be made from low-cost materials; it is portable and incorporates both gait analysis (using a pedobarograph) and whisker tracking (using a high-speed video camera and an infrared light source).

The software, ARTv2, is freely available and open source. ARTv2 is fully automated and was developed from our previous ART software (Automated Rodent Tracker).

ARTv2 contains new whisker and foot detection algorithms. On high-speed video footage of freely moving small mammals (including rat, mouse and opossum), we have found that ARTv2 is comparable in accuracy to, and in some cases significantly better than, readily available software and manual trackers.

The LocoWhisk system enables the collection of quantitative data from whisker movements and locomotion in freely behaving rodents. The software automatically records both whisker and gait information and provides added statistical tools to analyse the data. We hope the LocoWhisk system and software will serve as a solid foundation from which to support future research in whisker and gait analysis.”

For more details on the ARTv2 software, check out the GitHub page here.

Check out the paper that describes LocoWhisk and ARTv2, which has recently been published in the Journal of Neuroscience Methods.

LocoWhisk was initially shared and developed through the NC3Rs CRACK IT website here.


3D Printed Headcap and Microdrive

September 26, 2019

In their 2015 Journal of Neurophysiology article, the Paré Lab at the Center for Molecular and Behavioral Neuroscience at Rutgers University describes its novel head-cap and microdrive design for chronic multi-electrode recordings in rats, made possible by 3D printing, and highlights the impact of 3D printing technology on neurophysiology:


There is a need for microdrives and head-caps that can accommodate different recording configurations. Many investigators implant multiple individual drives to record from numerous areas, but this extends surgery time, impairs animal recovery, and complicates experiments. Other strategies rely on more expensive custom-machined drive assemblies built for a particular set of regions, limiting their adaptability. Some proposed designs allow targeting of multiple regions, but the recording sites must lie within a few millimeters of one another, so these designs are suitable only for mice and cannot reach areas of larger brains, such as those of rats.

Utilizing 3D printing technology, the group created a novel microdrive and head-cap design that allows recording from multiple brain regions in different configurations. In the article, the lab reviews the basic principles of 3D design and printing, introduces its approach to multisite recording, and explains how to construct each required component. The 3D printed head cap and electrode microdrive enable investigators to perform chronic multi-site recordings in rats. The head cap is composed of five components, and three types of microdrives can be used in different combinations or positions to reach different targets. The microdrive designs serve different purposes, including extended driving depths, targeting of thin layers, and fitting many microdrives into a small area.

To show the viability of their new designs, the lab presents LFP recordings obtained throughout the cortico-hippocampal loop using 3D printed components. The lab encourages investigators to modify the designs to best suit their research needs and provides editable versions of the three parts most relevant to such modification. The investigators also give a detailed explanation of the printing, assembly, and implantation of the head caps and microdrives. Finally, they point to advancements in 3D printing, notably 3D scanning and new material development, that may change how chronic implants are designed and used.

For more information on the microdrive and headcap, see their paper’s Appendix, which has full instructions and advice on building these devices.


Headley, D. B., DeLucca, M. V., Haufler, D., & Paré, D. (2015). Incorporating 3D-printing technology in the design of head-caps and electrode drives for recording neurons in multiple brain regions. Journal of Neurophysiology, 113(7), 2721–2732. https://doi.org/10.1152/jn.00955.2014

SignalBuddy

September 19, 2019

Richard Warren, a graduate student in the Sawtell lab at Columbia University, recently shared his new open-source project called SignalBuddy:


SignalBuddy is an easy-to-make, easy-to-use signal generator for scientific applications. Making friends is hard, but making SignalBuddy is easy. All you need is an Arduino Uno! SignalBuddy replaces more complicated and (much) more expensive signal generators in laboratory settings where one millisecond resolution is sufficient. SignalBuddy generates digital or true analog signals (sine waves, step functions, and pulse trains), can be controlled with an intuitive serial monitor interface, and looks fabulous in an optional 3D printed enclosure.

To get SignalBuddy working, all you need to do is install the SignalBuddy.ino Arduino code provided on their GitHub and follow the step-by-step instructions there to program the Arduino for your specific experimental needs. SignalBuddy can be used for numerous lab purposes, including creating pulse trains for optogenetic light stimulation, microstimulation, or electrophysiology, or for generating stimuli for behavioral paradigms.

Additionally, their Hackaday site provides instructions for 3D printing an enclosure to house the Arduino, using just two .stl files.


For more information, check out the SignalBuddy GitHub repository here.

You can also get further details on the SignalBuddy Hackaday.io page here.


Fun Fact: This group also developed the KineMouse Wheel, a project previously featured on OpenBehavior that is now being used in numerous labs! Cheers to another great open-source project from Richard Warren and the Sawtell lab!

eNeuro’s “Open Source Tools and Methods” paper topic

September 12, 2019

There’s a new place to publish your open-source tools or methods in neuroscience! Christophe Bernard, Editor-in-Chief of eNeuro (an open-access journal of the Society for Neuroscience), recently wrote an editorial detailing the opening of a new topic track in eNeuro for Open Source Tools and Methods. In the editorial, Bernard describes the recent push for open-source science and notes that many new open-source projects in neuroscience need a proper home for publication. While eNeuro already has a “Methods/New Tools” submission type, Bernard says the “Open Source Tools and Methods” type is intended for projects like “low-cost devices to measure animal behavior, a new biophysical model of a single neuron, a better method to realign images when performing in vivo two-photon imaging, scripts and codes to analyze signals” and more.

Until now there has been no publication venue explicitly intended for open-source tools and methods in neuroscience; this new article type gives such tools, methods, devices, and projects a straightforward path to publication. It will also aid the field in the replication, reproducibility, and transparency of the methods and tools used. A major point from Bernard is that this may help the developers of a tool or method, since “it allows for acknowledgment of those who developed such tools and methods fully, often rotating students or engineers recruited on a short-duration contract. On a standard research paper, their name ends up in the middle of the list of authors, but the Open Source Tools and Methods type will allow them to be the first author.”

The eNeuro site describes the submission requirements as follows: “Open Source Tools and Methods are brief reports (limited to 4500 words) describing the creation and use of open-source tools in neuroscience research. Examples of tools include hardware designs used in behavioral or physiological studies and software used for data acquisition and analysis. They must contain a critique of the importance of the tool, how it compares to existing open- and closed-source solutions, and a demonstration of tool use in a neuroscience experiment.”


Cheers to you, eNeuro, for your inclusion of open-source projects to help advance the neuroscience field!


Link to the editorial: https://www.eneuro.org/content/6/5/ENEURO.0342-19.2019

Current articles for Open Source Tools and Methods are listed here.

To submit an article under Open Source Tools and Methods, check out the instructions for authors at eNeuro here.

Curated Itinerary on Open-Source Tools at SfN-19

September 5, 2019

OpenBehavior is now an official part of the SfN team for curated itineraries at SfN-19! This year, we will provide an itinerary on Open-Source Tools. Linda Amarante (@L_Amarante) and Samantha White (@samantha6rose) are working on the itinerary now. If you would like your presentation to be included, please DM us through our Twitter account (@OpenBehavior) or send an email message about your presentation to openbehavior@gmail.com before noon on Saturday, September 8. Thanks!

Ratcave

August 29, 2019

Nicholas A. Del Grosso and Anton Sirota at the Bernstein Centre for Computational Neuroscience recently published their new project, Ratcave, a Python 3D graphics library that allows researchers to create and present 3D stimuli in their experiments:


Neuroscience experiments often require software to present stimuli to a subject and record the subject’s responses. Many current psychophysics libraries lack the 3D graphics support needed for some experiments, and although Python and other languages have 3D graphics libraries, these are hard to integrate into psychophysics libraries without modification. To bring 3D graphics into the existing ecosystem of Python psychophysics software, the authors developed Ratcave.

Ratcave is an open-source, cross-platform Python library that adds 3D stimulus support to all OpenGL-based 2D Python stimulus libraries, including VisionEgg, PsychoPy, Pyglet, and Pygame. Ratcave comes with resources including basic 3D object primitives and a wide range of 3D lighting effects. Its intuitive object-oriented interface allows all objects (meshes, lights, and cameras) to be repositioned, rotated, and scaled, and objects can be parented to one another to specify complex relationships between them. By sending mesh data to the graphics card as a single array using OpenGL’s VAO (Vertex Array Object) functionality, Ratcave makes the drawing process much more efficient; this approach allows over 30,000 vertices to be rendered at a performance level surpassing the needs of most behavioral research studies.
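The parenting idea described above can be pictured with a tiny scene-graph sketch. This is a generic illustration of the concept, not Ratcave’s actual API (the `Node` class below is hypothetical):

```python
# Hypothetical scene-graph sketch: a child's world position is offset by
# its parent's, so moving the parent moves everything attached to it.

class Node:
    def __init__(self, position=(0.0, 0.0, 0.0)):
        self.position = list(position)   # local position (x, y, z)
        self.parent = None

    def add_child(self, child):
        child.parent = self

    def world_position(self):
        x, y, z = self.position
        if self.parent is not None:
            px, py, pz = self.parent.world_position()
            return (x + px, y + py, z + pz)
        return (x, y, z)

camera = Node(position=(0.0, 1.0, 5.0))
mesh = Node(position=(1.0, 0.0, 0.0))
camera.add_child(mesh)       # the mesh now follows the camera
camera.position[2] -= 2.0    # move the camera; the mesh moves with it
```

In a full scene graph, rotations and scales compose the same way, which is what lets a whole arrangement of stimuli be moved as one unit.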

An advantage of Ratcave is that it supplements existing Python stimulus libraries rather than replacing them, so researchers can keep using their preferred library and simply add 3D stimuli to it. The manuscript also reports that Ratcave has been tested and used in others’ research, demonstrating reproducibility across labs and experiments.

Details on the software can be found at https://github.com/ratcave/ratcave.

Documentation for Ratcave is also available at https://ratcave.readthedocs.org.


SpikeGadgets

August 22, 2019

We’d like to highlight groups and companies that bring an open-source framework to their software and/or hardware for behavioral neuroscience. One of these groups is SpikeGadgets, a company co-founded by Mattias Karlsson and Magnus Karlsson.


SpikeGadgets is a group of electrophysiologists and engineers working to develop neuroscience hardware and software tools. Their open-source software, Trodes, is a cross-platform suite for neuroscience data acquisition and experimental control, made up of modules that communicate with a centralized GUI to visualize and save electrophysiological data. Trodes includes a camera module and a StateScript module; StateScript is a state-based scripting language for programming behavioral tasks using lights, levers, beam breaks, lasers, stimulation sources, audio, solenoids, and more. The camera module acquires video that can be synchronized to the neural recordings, and it can track the animal’s position in real time or during playback after the experiment. It works with USB webcams or GigE cameras.
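StateScript has its own syntax, which we won’t reproduce here; as a language-agnostic sketch of what “state-based” behavioral control means, here is a toy Python state machine for a hypothetical trial in which a cue light is followed by a beam-break-triggered reward (all names below are our own, not StateScript’s):

```python
# Toy state machine illustrating state-based behavioral control.
# States and events are hypothetical, chosen only for illustration.

def run_trial(events):
    """Step through task states given a sequence of input events."""
    state = "cue_on"             # start the trial: cue light on
    log = [state]
    for event in events:
        if state == "cue_on" and event == "beam_break":
            state = "reward"     # e.g. open a solenoid valve
        elif state == "reward" and event == "reward_done":
            state = "intertrial" # wait out the intertrial interval
        log.append(state)
    return log
```

A real behavioral control system adds timing, hardware I/O, and logging on top of this core idea: the task is always in exactly one state, and inputs move it between states.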

Paired with the Trodes software and StateScript language is SpikeGadgets hardware, which can be purchased on their website. The hardware handles data acquisition (the Main Control Unit, used for electrophysiology) and behavioral control (the Environmental Control Unit). SpikeGadgets also provides Matlab and Python toolboxes on their site for analyzing both behavioral and electrophysiological data. Trodes runs on Windows, Linux, or Mac, and there are step-by-step instructions for installing and using it on the group’s Bitbucket page.

SpikeGadgets’ mission is “to develop the most advanced neuroscience tools on the market, while preserving ease of use and science-driven customization.”



For more information on SpikeGadgets or to download or purchase their software or hardware, check out their website here.

There is additional documentation on their BitBucket Wiki, with a user manual, instructions for installation, and FAQ.

Check out their entire list of collaborators, contributors, and developers here.

The Future is Open

August 16, 2019

This week’s post is about the current state of OpenBehavior (OB) and ongoing efforts within the open source neuroscience community. Next week, we will resume posting about new tools.

Samantha White, Linda Amarante, Lex Kravitz, and Mark Laubach published a commentary in eNeuro last week about how open-source tools are being used in neuroscience. We reported on our experiences in running OB since the summer of 2016, the many wonderful projects that we have posted about over the past three years, two surveys that we conducted on our site and open source tool use in general, and some observations on the mindset that comes from making and using open source tools. A link to our paper is https://www.eneuro.org/content/6/4/ENEURO.0223-19.2019.

The timing of our commentary and the related social media attention it generated (e.g. https://twitter.com/samantha6rose/status/1159913815393341440) was especially nice, as we have been working to expand OB to better serve the research community and hope to find external support for the project. We would like to address an outstanding problem: it is not currently possible to systematically track the development and use of open source hardware and software in neuroscience research. To address this issue, we would like to create a database of existing open source projects, characterize them using a newly developed “taxonomy” based on their functions (video analysis, behavioral control systems, hardware for measuring or controlling behavior), and register projects in the SciCrunch RRID registry.

If you haven’t heard of SciCrunch, you should check it out: https://scicrunch.org/. It’s an awesome project that tracks the usage of research tools such as antibodies. RRIDs are citable and, if created for open source hardware and software, would allow developers to track how their tools are used in neuroscience publications. This might help provide both incentives for sharing and metrics (RRIDs) on tool use and publication.

We are also planning to work with the Society for Neuroscience (SfN) to increase public awareness of neuroscience research by participating in SfN-sponsored advocacy and outreach events, facilitating discussions of open source tools through a new discussion topic in the Neuronline forums (more news on that soon), and continuing to provide curated itineraries on open source tools for attendees of the annual SfN meeting.


Pathfinder

August 8, 2019

Matthew Cooke and colleagues from Jason Snyder’s lab at the University of British Columbia recently developed Pathfinder, open-source software for detecting spatial navigation behavior in animals:


Spatial navigation is studied across several paradigms for different purposes; by analyzing spatial behaviors we can gain insight into how an animal learns a task, how it changes its approach strategy, and how it directs behavior toward goals. Pathfinder is open-source software for analyzing rodent navigation that automatically classifies patterns of navigation as a rodent performs a task. It can pick up on subtle patterns in spatial behavior that simpler measures miss. For example, many water maze analyses rely on escape latency or path length, but the authors point out that the time it takes to reach the platform may not differ even when the strategy does, so latency alone can hide key differences in behavior. Pathfinder therefore analyzes more subtle aspects of the task to distinguish spatial navigation strategies.

Originally intended for water maze navigation, Pathfinder can also be used to analyze many other spatial behaviors across different tasks, mazes, and species. The software takes x-y coordinates from behavior-tracking software (for example, it can open files from Noldus EthoVision, ActiMetrics’ Watermaze, Stoelting’s ANY-maze, and the open-source project ezTrack from Denise Cai’s lab) and calculates the best-fit search strategy for each trial. For the Morris water maze task, trials are fit into several categories: Direct Swim, Directed Search, Focal Search, Spatial Indirect, Chaining, Scanning, Thigmotaxis, and Random Search.
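Pathfinder’s classifier weighs many features of each trial; as a simplified, hypothetical illustration of why path shape can matter more than latency, consider a path-efficiency measure: the straight-line distance from start to platform divided by the distance actually swum (1.0 means a perfectly direct swim). This sketch is our own, not Pathfinder’s actual algorithm:

```python
# Illustrative path-efficiency measure (not Pathfinder's classifier):
# ideal straight-line distance divided by distance actually traveled.
import math

def path_efficiency(xy):
    """xy: list of (x, y) tracking coordinates from start to platform."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    swum = sum(dist(xy[i], xy[i + 1]) for i in range(len(xy) - 1))
    ideal = dist(xy[0], xy[-1])
    return ideal / swum if swum > 0 else 0.0

direct = [(0, 0), (1, 1), (2, 2)]                     # straight to the platform
meander = [(0, 0), (2, 0), (2, 2), (0, 2), (0, 0.1)]  # long, winding route
```

Two trials with identical escape latencies can differ sharply on a measure like this, which is exactly the kind of distinction between strategies that Pathfinder is designed to capture.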

Pathfinder runs in Python and has an easy-to-use GUI; many aspects and parameters can be adjusted to analyze different tasks or behaviors.

For more details, check out their bioRxiv preprint here.

There’s a nice (humorous!) writeup of the project on the Snyder Lab website.

You can also download the project and view more details on their GitHub:
https://matthewbcooke.github.io/Pathfinder/

https://github.com/MatthewBCooke/Pathfinder/