
Category: Other

OpenMonkeyStudio

February 27, 2020

OpenMonkeyStudio is an amazing new tool for tracking the movements of, and interactions among, freely moving monkeys. Ben Hayden and Jan Zimmermann kindly sent along this summary of the project:

Tracking animal pose (that is, identifying the positions of their major joints) is a major frontier in neuroscience. When combined with neural recordings, pose tracking allows for identifying the relationship between neural activity and movement, and for studying decision-making inferred from movement. OpenMonkeyStudio is a system designed to track rhesus macaques in large environments in which they can move freely.

Tracking monkeys is at least an order of magnitude more difficult than tracking mice, flies, or worms. Monkeys are, basically, large furry blobs; they don’t have clearly delineated body segments. And their movements are much richer and more complex. For these reasons, out-of-the-box systems don’t work with monkeys.

The major innovation of our OpenMonkeyStudio system is how it tackles the annotation problem. Deep learning systems aren’t very good at generalization. They can replicate things they have seen before, or things that are kind of similar to what they have seen. So the important thing is giving them a sufficiently large training set. We ideally want about a million annotated images. That would cost about $10 million, and we don’t have that kind of money. So we use several cool tricks, which we describe in our paper, to augment a small dataset and turn it into a large one. Doing that works very well, and results in a system that can track one or even two interacting monkeys.
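The augmentation tricks themselves are described in the preprint; purely as a generic illustration of what keypoint-preserving augmentation looks like (this is not the OpenMonkeyStudio pipeline, and the function names and array formats are just illustrative choices), here is a minimal Python sketch using OpenCV and NumPy:

```python
# Generic illustration of keypoint-preserving image augmentation.
# NOT the OpenMonkeyStudio pipeline described in the preprint.
import numpy as np
import cv2  # opencv-python


def augment_flip(image, keypoints):
    """Horizontally flip an image and its N x 2 array of (x, y) joint annotations."""
    w = image.shape[1]
    flipped = cv2.flip(image, 1)             # flip around the vertical axis
    kps = keypoints.astype(float).copy()
    kps[:, 0] = (w - 1) - kps[:, 0]          # mirror the x coordinates
    return flipped, kps


def augment_rotate(image, keypoints, angle_deg=10.0):
    """Rotate an image and its keypoints about the image center."""
    h, w = image.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, 1.0)   # 2x3 affine
    rotated = cv2.warpAffine(image, M, (w, h))
    ones = np.ones((keypoints.shape[0], 1))
    kps = np.hstack([keypoints.astype(float), ones]) @ M.T        # apply the same affine
    return rotated, kps
```

Applying a handful of such transforms (flips, rotations, crops, lighting changes) to each hand-annotated frame is one simple way a small labeled set can be multiplied into a much larger one.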


Check out the preprint:

OpenMonkeyStudio: Automated Markerless Pose Estimation in Freely Moving Macaques

Praneet C. Bala, Benjamin R. Eisenreich, Seng Bum Michael Yoo, Benjamin Y. Hayden, Hyun Soo Park, Jan Zimmermann

https://www.biorxiv.org/content/10.1101/2020.01.31.928861v1

Open Source Science: Learn to Fly

February 20, 2020

Yesterday, Lex and I participated in a “hack chat” over on hackaday.io. The log of the chat is now posted on the hackaday.io site. A few topics came up that we felt deserved more attention, especially the non-research uses of open source hardware developed for neuroscience applications. Today’s post is about those topics.

For me, it has become clear that there is a major need for trainees (and many faculty) to learn the basic skill set needed to make use of the open source tools that we feature on OpenBehavior. In my own teaching at American University, I run a course for undergraduates (and graduate students too, if they want to take it) that covers the basics of Python and Arduino programming, how to use Jupyter notebooks, how to connect Python with R and GNU Octave (rpy2 and oct2py), and how to do simple hardware projects with Arduinos. The students build a simple rig for running reaction time experiments, collect some data, analyze their own data, and then develop extension experiments to run on their own. We also cover a lot of other issues, like never using the jet colormap and why pandas is awesome. Last year, we partnered with Backyard Brains and brought their Muscle SpikerBox and Neurorobots into the course, with major help from Chris Harris (and of course Greg Gage, who has been a long-time supporter of open source science).
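For anyone curious what the Python-to-Octave and Python-to-R bridges look like in practice, here is a minimal sketch using oct2py and rpy2 (the packages named above); the reaction-time values are made up and this is not an actual course exercise:

```python
# Minimal sketch of calling GNU Octave and R from Python.
# Requires Octave and R to be installed, plus: pip install oct2py rpy2
from oct2py import Oct2Py
import rpy2.robjects as robjects

# Octave via oct2py: Octave functions become Python methods
oc = Oct2Py()
rt_mean = oc.mean([0.41, 0.38, 0.45, 0.52])   # mean reaction time, computed in Octave

# R via rpy2: evaluate R code and pull the result back into Python
r_summary = robjects.r('summary(c(0.41, 0.38, 0.45, 0.52))')

print(rt_mean, list(r_summary))
```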

Yesterday in the chat, I learned that I have not been alone in developing such content. Andre Maia Chagas at the University of Sussex is working on his own set of tools for training folks to build and use open source devices for neuroscience research. Another site you might check out is Lab On The Cheap. They have done a lot of posts on how to make lab equipment yourself, for a lot less than any commercial vendor would charge.

In reflecting on all of these activities late last night, I was reminded of this amazing video from 2015 in which 1000 musicians play Learn to Fly by Foo Fighters to ask Dave Grohl and the Foo Fighters to come and play in Cesena, Italy. To me, the awesomeness of what is currently happening in open source neuroscience is kind of like this video. We just need to work together to make stuff happen, and we can have a blast along the way.

-Mark


Check it out: https://www.youtube.com/watch?v=JozAmXo2bDE

Open-Source Neuroscience Hardware Hack Chat

February 13, 2020

This week we would like to highlight an event hosted by Hackaday.io: The Open-Source Neuroscience Hardware Hack Chat. Lex Kravitz and Mark Laubach will be available on Wednesday, February 19, 2020 at noon Pacific Time to chat with users of Hackaday.io about open-source tools for neuroscience research.

In case you don’t know, Hackaday.io is a really awesome project-hosting site. Many open-source projects are hosted there that can teach you about microcontrollers, 3D printing, and other makerspace tools. It is easy to find new project ideas and helpful build instructions there.

We have previously posted about several popular projects that are hosted on Hackaday.io, such as SignalBuddy, PhotometryBox, and FED. (By the way, FED is now offered through a collaboration between the OpenBehavior and OpenEphys projects: https://open-ephys.org/fed3/fed3.) But there are a number of other interesting projects hosted on Hackaday.io that are worth a look.

For example, FORCE (Force Output of Rodent Calibrated Effort), developed by Bridget Matikainen-Ankney, measures the force exerted by behaving rodents and can be used in studies where response force controls reward delivery.

Another interesting project is the LabRATory Telepresence Robot developed by Brett Smith. It is a robotic system that allows for motion correction in imaging studies done in behaving mice using trackball setups.

Two other cool projects on Hackaday.io provide tools for studying behavior in electric fish, an electric fish detector by Michael Haag and the electric fish piano by Davis Catolico. The electric fish piano can be used to listen to, record, and manipulate the electrical tones made by these kinds of fish.

Finally, there are a couple of projects that could be useful for research and teaching labs, including a project by Dieu My Nguyen on measuring jumping behavior in grasshoppers and a rig by Nancy Sloan for recording central pattern generators in snails.

Check out these projects and let us know what you think! We hope to chat with you next Wednesday.

Open-Source Neuroscience Hardware Hack Chat


LINK: https://hackaday.io/event/169511-open-source-neuroscience-hardware-hack-chat

Camera Control

February 6, 2020

The Adaptive Motor Control Lab at Harvard recently posted their project Camera Control, a Python-based camera software GUI, to GitHub.


Camera Control is an open-source software package written by postdoctoral fellow Gary Kane that allows video to be recorded in sync with behavior. The Python GUI and scripts allow investigators to record from multiple Imaging Source camera feeds with associated timestamps for each frame. When used in combination with a NIDAQ card, timestamps from a behavioral task can also be recorded on the falling edge of a TTL signal. This allows video analysis to be paired with physiological recordings, which can be beneficial in assessing behavioral results. The package requires Windows 10, Anaconda, and Git, and is compatible with Imaging Source USB3 cameras. It is available for download from the lab’s GitHub, where instructions for installation and video recording are provided.
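Camera Control itself talks to Imaging Source cameras through the vendor’s SDK, so the following is only a rough, generic sketch of the underlying idea (writing video while logging a system timestamp for every frame), using OpenCV rather than the package’s actual code; the file names and frame count are arbitrary:

```python
# Generic illustration of recording video with per-frame timestamps.
# This is NOT the Camera Control package, which uses the Imaging Source SDK.
import csv, time
import cv2

cap = cv2.VideoCapture(0)                            # first attached camera
fourcc = cv2.VideoWriter_fourcc(*'XVID')
writer = cv2.VideoWriter('session.avi', fourcc, 30.0, (640, 480))

with open('timestamps.csv', 'w', newline='') as f:
    ts = csv.writer(f)
    ts.writerow(['frame', 'system_time_s'])
    for frame_idx in range(300):                     # ~10 s at 30 fps
        ok, frame = cap.read()
        if not ok:
            break
        writer.write(cv2.resize(frame, (640, 480)))  # save the frame
        ts.writerow([frame_idx, time.time()])        # log its timestamp

cap.release()
writer.release()
```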

Find more on GitHub.


Kane, G. & Mathis, M. (2019). Camera Control: record video and system timestamps from Imaging Source USB3 cameras. GitHub. https://zenodo.org/badge/latestdoi/200101590

RatHat: A self-targeting printable brain implant system

January 9, 2020

Leila Allen and colleagues in Tim Allen’s lab at Florida International University recently developed RatHat, a self-targeting printable brain implant system. Below they describe their project:


“There has not been a major change in how neuroscientists approach stereotaxic methods in decades. Here we present a new stereotaxic method that improves on traditional approaches by reducing costs, training, surgical time, and aiding repeatability. The RatHat brain implantation system is a 3D printable stereotaxic device for rats that is fabricated prior to surgery and fits to the shape of the skull. RatHat builds are directly implanted into the brain without the need for head-leveling or coordinate-mapping during surgery. The RatHat system can be used in conjunction with the traditional u-frame stereotaxic device, but does not require the use of a micromanipulator for successful implantations. Each RatHat system contains several primary components including the implant for mounting intracranial components, the surgical stencil for targeting drill sites, and the protective cap for impacts and debris. Each component serves a unique function and can be used together or separately. We demonstrate the feasibility of the RatHat system in four different proof-of-principle experiments: 1) a 3-pole cannula apparatus, 2) an optrode-electrode assembly, 3) a fixed-electrode array, and 4) a tetrode hyperdrive. Implants were successful, durable, and long-lasting (up to 9 months). RatHat print files are easily created, can be modified in CAD software for a variety of applications, and are easily shared, contributing to open science goals and replications. The RatHat system has been adapted to multiple experimental paradigms in our lab and should be a useful new way to conduct stereotaxic implant surgeries in rodents.

RatHat is freely available to academic researchers, achieving open science goals. Academic and non-profit researchers interested in receiving the 3D files can contact Dr. Timothy Allen (tallen@fiu.edu). We will first provide you a simple noncommercial license to be executed by your institution, and upon completion, printable and editable 3D files of the implant system. Our responses are fast, and all files are provided immediately after receiving the aforementioned document. Our goals are noncommercial, and our only interests are to share RatHat as widely as possible in support of our open science goals and to improve the pace of discovery using chronic brain implant systems for behavioral studies.”

 


The Allen lab has provided a video tutorial on how to implant RatHat, which you can view here on YouTube.

 

For more details, you can check out the preprint here.

 

Autopilot

December 12, 2019

Jonny Saunders from Michael Wehr’s lab at the University of Oregon recently posted a preprint documenting their project Autopilot, a Python framework for running behavioral experiments:


Autopilot is a Python framework for running behavioral experiments on Raspberry Pi single-board computers. It incorporates all aspects of an experiment, including the hardware, stimuli, behavioral task paradigm, data management, data visualization, and a user interface. The authors propose that Autopilot is the fastest, least expensive, and most flexible behavioral system currently available.

The benefit of using Autopilot is that it allows more experimental flexibility, which lets researchers optimize it for their specific experimental needs. Additionally, this project exemplifies how useful a Raspberry Pi can be for performing experiments and recording data. The preprint discusses many benefits of Raspberry Pis, including their speed, precision, and proper data logging, and the fact that they only cost $35 (!!). Ultimately, the authors developed Autopilot in an effort to encourage users to write reusable, portable experiments that can be put into a public central library to promote replication and reproducibility.
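Autopilot’s own API is documented at the links below. Purely as a generic illustration of the kind of low-level control a Raspberry Pi offers (this is not Autopilot code; the pin numbers and task logic are arbitrary), here is a minimal RPi.GPIO sketch that lights an LED while a lever is pressed and logs the press times:

```python
# Generic Raspberry Pi GPIO example (NOT Autopilot's API): light an LED
# while a lever is pressed and log event times. Pin numbers are arbitrary.
import time
import RPi.GPIO as GPIO

LED_PIN, LEVER_PIN = 17, 27                      # BCM numbering, chosen arbitrarily

GPIO.setmode(GPIO.BCM)
GPIO.setup(LED_PIN, GPIO.OUT)
GPIO.setup(LEVER_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

events = []
try:
    for _ in range(1000):                        # simple ~10 s polling loop
        pressed = GPIO.input(LEVER_PIN) == GPIO.LOW
        GPIO.output(LED_PIN, GPIO.HIGH if pressed else GPIO.LOW)
        if pressed:
            events.append(time.time())           # timestamp the press
        time.sleep(0.01)
finally:
    GPIO.cleanup()
```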

 

For more information, check out their presentation or the Autopilot website here.

Additionally, documentation is here, along with a GitHub repo, and a link to their preprint is here.


3D Printed Headcap and Microdrive

September 26, 2019

In their 2015 Journal of Neurophysiology article, the Paré Lab at the Center for Molecular and Behavioral Neuroscience at Rutgers University describes a novel head-cap and microdrive design for chronic multi-electrode recordings in rats, made possible by 3D printing technology, and highlights the impact of 3D printing on neurophysiology:


There is a need for microdrives and head-caps that can accommodate different recording configurations. Many investigators implant multiple individual drives to record from numerous areas, but this extends surgery time, impairs animal recovery, and complicates experiments. Other strategies rely on more expensive custom-machined drive assemblies that are built for a particular set of regions, limiting their adaptability. Some proposed designs allow targeting of multiple regions, but the recording sites must lie within a few millimeters of each other, so they are suitable only for mice and cannot reach widely separated areas in larger brains (such as those of rats).

Using 3D printing technology, the group developed a novel microdrive and head-cap design that allows recording from multiple brain regions in different configurations. In their article, the lab reviews the basic principles of 3D design and printing, introduces their approach to multisite recording, and explains how to construct the required components. The 3D-printed head cap and electrode microdrives enable investigators to perform chronic multi-site recordings in rats. The head cap is composed of five components, and there are three types of microdrives that can be used in different combinations or positions to target different structures. The microdrive designs offer different functionality, including extended driving depths, targeting of thin layers, and packing many microdrives into a small area.

To show the viability of their new designs, the lab presents LFP recordings obtained throughout the cortico-hippocampal loop using 3D-printed components. The lab encourages investigators to modify the designs to best suit their research needs and provides editable versions of the three parts most likely to require modification. The investigators also give a detailed explanation of the printing, assembly, and implantation of the head caps and microdrives. Finally, they point to ways that advances in 3D printing, notably 3D scanning and new material development, can change how chronic implants are designed and used.

For more information on the microdrive and headcap, see their paper’s Appendix, which has full instructions and advice on building these devices.


Headley, D. B., DeLucca, M. V., Haufler, D., & Paré, D. (2015). Incorporating 3D-printing technology in the design of head-caps and electrode drives for recording neurons in multiple brain regions. Journal of Neurophysiology, 113(7), 2721–2732. https://doi.org/10.1152/jn.00955.2014

SignalBuddy

September 19, 2019

Richard Warren, a graduate student in the Sawtell lab at Columbia University, recently shared his new open-source project called SignalBuddy:


SignalBuddy is an easy-to-make, easy-to-use signal generator for scientific applications. Making friends is hard, but making SignalBuddy is easy. All you need is an Arduino Uno! SignalBuddy replaces more complicated and (much) more expensive signal generators in laboratory settings where one millisecond resolution is sufficient. SignalBuddy generates digital or true analog signals (sine waves, step functions, and pulse trains), can be controlled with an intuitive serial monitor interface, and looks fabulous in an optional 3D printed enclosure.

To get SignalBuddy working, all you need to do is install the SignalBuddy.ino Arduino code provided on their GitHub and follow the step-by-step instructions there to program the Arduino for your specific experimental needs. SignalBuddy can be used for numerous lab purposes, including creating pulse trains for optogenetic light stimulation, microstimulation, or electrophysiology, or generating stimuli for behavioral paradigms.
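Because SignalBuddy is configured over a serial connection, you could also script it from Python with pyserial instead of typing into the Arduino serial monitor. The sketch below is hypothetical: the port, baud rate, and command string are placeholders, and the real command syntax is the one documented in the SignalBuddy instructions on GitHub:

```python
# Hypothetical sketch of scripting a SignalBuddy over serial with pyserial.
# The port, baud rate, and command string are placeholders only; see the
# SignalBuddy instructions on GitHub for the actual command syntax.
import serial  # pip install pyserial

with serial.Serial('/dev/ttyACM0', 9600, timeout=1) as ser:  # port/baud are assumptions
    ser.write(b'<your SignalBuddy command here>\n')          # placeholder command
    reply = ser.readline()                                   # confirmation, if any
    print(reply.decode(errors='replace'))
```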

Additionally, their Hackaday.io page provides instructions for 3D printing an enclosure to house the Arduino, using just two .stl files.


For more information, check out the SignalBuddy github repository here.

You can also get further details on the SignalBuddy Hackaday.io page here.

 

Fun Fact: This group also developed KineMouse Wheel, a project that was previously posted on OpenBehavior and is now being used in numerous labs! Cheers to another great open-source project from Richard Warren and the Sawtell lab!

eNeuro’s “Open Source Tools and Methods” paper topic

September 12, 2019

There’s a new place to publish your open-source tools or methods in neuroscience! Christophe Bernard, Editor-in-Chief at the journal eNeuro (an open-access journal of the Society for Neuroscience), recently wrote an editorial detailing the opening of a new topic track in eNeuro for Open Source Tools and Methods. In his editorial, Bernard describes the recent push for open-source science and highlights how many new open-source projects being developed in neuroscience need a proper home for publication. While eNeuro already has a “Methods/New Tools” submission type, Bernard says the “Open Source Tools and Methods” submission is available for projects like “low-cost devices to measure animal behavior, a new biophysical model of a single neuron, a better method to realign images when performing in vivo two-photon imaging, scripts and codes to analyze signals” and more.

There has been no publication venue explicitly intended for open-source tools and methods in neuroscience, and with the addition of this article type, new tools, methods, devices, and projects can be published in a straightforward manner. This publication type will aid the neuroscience field in replication, reproducibility, and transparency of methods and tools. A major point from Bernard is that it may also help the developers of a tool or method, since “it allows for acknowledgment of those who developed such tools and methods fully, often rotating students or engineers recruited on a short-duration contract. On a standard research paper, their name ends up in the middle of the list of authors, but the Open Source Tools and Methods type will allow them to be the first author.”

The details for submission of an open source tool or method on the eNeuro site are as follows: “Open Source Tools and Methods are brief reports (limited to 4500 words) describing the creation and use of open-source tools in neuroscience research. Examples of tools include hardware designs used in behavioral or physiological studies and software used for data acquisition and analysis. They must contain a critique of the importance of the tool, how it compares to existing open- and closed-source solutions, and a demonstration of tool use in a neuroscience experiment.”

 

Cheers to you, eNeuro, for your inclusion of open-source projects to help advance the neuroscience field!


Link to the editorial: https://www.eneuro.org/content/6/5/ENEURO.0342-19.2019

Current articles for Open Source Tools and Methods are listed here.

To submit an article under Open Source Tools and Methods, check out the instructions for authors at eNeuro here.

Curated Itinerary on Open-Source Tools at SfN-19

September 5, 2019

OpenBehavior is now an official part of the SfN team for curated itineraries at SfN-19! This year, we will provide an itinerary on Open-Source Tools. Linda Amarante (@L_Amarante) and Samantha White (@samantha6rose) are working on the itinerary now. If you would like your presentation to be included, please DM us through our Twitter account (@OpenBehavior) or send an email message about your presentation to openbehavior@gmail.com before noon on Saturday, September 8. Thanks!