
OpenMonkeyStudio

February 27, 2020

OpenMonkeyStudio is an amazing new tool for tracking movements by and interactions among freely moving monkeys. Ben Hayden and Jan Zimmermann kindly sent along this summary of the project:

Tracking animal pose (that is, identifying the positions of their major joints) is a major frontier in neuroscience. When combined with neural recordings, pose tracking allows for identifying the relationship between neural activity and movement, and for studying decision-making inferred from movement. OpenMonkeyStudio is a system designed to track rhesus macaques moving freely in large environments.

Tracking monkeys is at least an order of magnitude more difficult than tracking mice, flies, or worms. Monkeys are, basically, large furry blobs; they don’t have clear body segmentation. And their movements are much richer and more complex. For these reasons, out-of-the-box systems don’t work with monkeys.

The major innovation of OpenMonkeyStudio is how it tackles the annotation problem. Deep learning systems aren’t very good at generalization: they can handle things they have seen before, or things that are kind of similar to what they have seen. So the important thing is giving them a sufficiently large training set. Ideally we would want about a million annotated images. That would cost about $10 million, and we don’t have that kind of money. So we use several cool tricks, which we describe in our paper, to augment a small dataset and turn it into a large one. Doing that works very well, and results in a system that can track one or even two interacting monkeys.
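The paper describes those augmentation tricks in detail. As a generic illustration of the idea only (this is not the OpenMonkeyStudio pipeline), here is a minimal Python sketch of annotation-preserving augmentation, assuming Pillow is installed and keypoints are stored as (x, y) pixel coordinates; the file name and joint positions are made up:

```python
# A generic illustration of annotation-preserving augmentation (NOT the
# OpenMonkeyStudio pipeline): each labeled image is mirrored and rotated,
# and its keypoints are transformed with the same geometry, so one
# annotation yields several training examples.
from PIL import Image

def flip_horizontal(image, keypoints):
    """Mirror the image left-right and remap its (x, y) keypoints."""
    w, _ = image.size
    flipped = image.transpose(Image.Transpose.FLIP_LEFT_RIGHT)
    return flipped, [(w - 1 - x, y) for (x, y) in keypoints]

def rotate_90(image, keypoints):
    """Rotate the image 90 degrees counterclockwise and remap keypoints."""
    w, _ = image.size
    rotated = image.transpose(Image.Transpose.ROTATE_90)
    # Under a 90-degree CCW rotation, pixel (x, y) maps to (y, w - 1 - x).
    return rotated, [(y, w - 1 - x) for (x, y) in keypoints]

# Hypothetical annotated frame: a file name and three (x, y) joint positions.
image = Image.open("monkey_frame.png")
keypoints = [(120, 85), (140, 190), (98, 240)]

augmented = [
    (image, keypoints),
    flip_horizontal(image, keypoints),
    rotate_90(image, keypoints),
]
```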


Check out the preprint:

OpenMonkeyStudio: Automated Markerless Pose Estimation in Freely Moving Macaques

Praneet C. Bala, Benjamin R. Eisenreich, Seng Bum Michael Yoo, Benjamin Y. Hayden, Hyun Soo Park, Jan Zimmermann

https://www.biorxiv.org/content/10.1101/2020.01.31.928861v1

Open Source Science: Learn to Fly

February 20, 2020

Yesterday, Lex and I participated in a “hack chat” over on Hackaday.io. The log of the chat is now posted on the Hackaday.io site. A few topics came up that we felt deserved more attention, especially the non-research uses of open source hardware developed for neuroscience applications. Today’s post is about those topics.

For me, it has become clear that there is a major need for trainees (and many faculty) to learn the basic skill set needed to make use of the open source tools that we feature on OpenBehavior. In my own teaching at American University, I run a course for undergraduates (and graduate students too, if they want to take it) that covers the basics of Python and Arduino programming, how to use Jupyter notebooks, how to connect Python with R and GNU Octave (rpy2 and oct2py), and how to do simple hardware projects with Arduinos. The students build a simple rig for running reaction time experiments, collect some data, analyze their own data, and then develop extension experiments to run on their own. We also cover a lot of other issues, like never using the jet colormap and why pandas is awesome. Last year, we partnered with Backyard Brains and brought their Muscle SpikerBox and Neurorobots into the course, with major help from Chris Harris (and of course Greg Gage, who has been a long-time supporter of open source science).
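To give a flavor of the rpy2/oct2py portion of the course, here is a minimal sketch of calling R and GNU Octave from Python; the reaction-time values are invented for illustration:

```python
# A minimal sketch of driving R and GNU Octave from Python with rpy2 and
# oct2py (both pip-installable; R and Octave themselves must be installed).
import rpy2.robjects as robjects
from oct2py import octave

# Hypothetical reaction times (in seconds) from the students' Arduino rig.
rts = [0.312, 0.287, 0.355, 0.301, 0.298]

# R: call base-R mean() and sd() through rpy2.
r_vec = robjects.FloatVector(rts)
r_mean = robjects.r["mean"](r_vec)[0]
r_sd = robjects.r["sd"](r_vec)[0]
print(f"R:      mean = {r_mean:.3f} s, sd  = {r_sd:.3f} s")

# Octave: compute the same statistics through oct2py.
oct_mean = octave.mean(rts)
oct_std = octave.std(rts)
print(f"Octave: mean = {oct_mean:.3f} s, std = {oct_std:.3f} s")
```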

Yesterday in the chat, I learned that I have not been alone in developing such content. Andre Maia Chagas at the University of Sussex is working on his own set of tools for training folks to build and use open source devices for neuroscience research. Another site that you might check out is Lab On The Cheap. They have posted a lot of guides on how to make lab equipment yourself, for a lot less than any commercial vendor would charge.

In reflecting on all of these activities late last night, I was reminded of this amazing video from 2015 in which 1000 musicians play Learn to Fly by Foo Fighters to ask Dave Grohl and the Foo Fighters to come and play in Cesena, Italy. To me, the awesomeness of what is currently happening in open source neuroscience is kind of like this video. We just need to work together to make stuff happen, and we can have a blast along the way.

-Mark


Check it out: https://www.youtube.com/watch?v=JozAmXo2bDE

Open-Source Neuroscience Hardware Hack Chat

February 13, 2020

This week we would like to highlight an event hosted by Hackaday.io: The Open-Source Neuroscience Hardware Hack Chat. Lex Kravitz and Mark Laubach will be available on Wednesday, February 19, 2020 at noon Pacific Time to chat with users of Hackaday.io about open-source tools for neuroscience research.

In case you don’t know, Hackaday.io is a really awesome project-hosting site. Many open-source projects are hosted there that can teach you about microcontrollers, 3D printing, and other makerspace tools, and it is easy to find new project ideas and helpful build instructions there.

We have previously posted about several popular projects that are hosted on Hackaday.io, such as SignalBuddy, PhotometryBox, and FED. (By the way, FED is now offered through a collaboration between the OpenBehavior and OpenEphys projects: https://open-ephys.org/fed3/fed3.) But there are a number of other interesting projects hosted on Hackaday.io that are worth a look.

For example, FORCE (Force Output of Rodent Calibrated Effort) was developed by Bridget Matikainen-Ankney. It can be used in studies where response force controls reward delivery in behaving rodents.

Another interesting project is the LabRATory Telepresence Robot developed by Brett Smith. It is a robotic system that allows for motion correction in imaging studies done in behaving mice using trackball setups.

Two other cool projects on Hackaday.io provide tools for studying behavior in electric fish, an electric fish detector by Michael Haag and the electric fish piano by Davis Catolico. The electric fish piano can be used to listen to, record, and manipulate the electrical tones made by these kinds of fish.

Finally, there are a couple of projects that could be useful for research and teaching labs, including a project by Dieu My Nguyen on measuring jumping behavior in grasshoppers and a rig by Nancy Sloan for recording central pattern generators in snails.

Check out these projects and let us know what you think! We hope to chat with you next Wednesday.


LINK: https://hackaday.io/event/169511-open-source-neuroscience-hardware-hack-chat

SimBA

January 23, 2020

Simon Nilsson from Sam Golden’s lab at the University of Washington recently shared their project SimBA (Simple Behavioral Analysis), an open source pipeline for the analysis of complex social behaviors:


“The manual scoring of rodent social behaviors is time-consuming and subjective, impractical for large datasets, and can be incredibly repetitive and boring. If you spend significant time manually annotating videos of social or solitary behaviors, SimBA is an open-source GUI that can automate the scoring for you. SimBA does not require any specialized equipment or computational expertise.

SimBA uses data from popular open-source tracking tools in combination with a small amount of behavioral annotations to create supervised machine learning classifiers that can then rapidly and accurately score behaviors across different background settings and lighting conditions. Although SimBA was developed and validated for complex social behaviors such as aggression and mating, it has the flexibility to generate classifiers for different environments and different behavioral modalities. SimBA takes users through a step-by-step process, and we provide detailed installation instructions and tutorials for different use-case scenarios online. SimBA has a range of built-in tools for video pre-processing, accessing third-party tracking models, and evaluating the performance of machine learning classifiers. There are also several methods for in-depth visualizations of behavioral patterns.

Because of constraints in animal tracking tools, the initial release of SimBA is limited to processing social interactions between animals with different coat colors, recorded from a top-down view; future releases will advance past these limitations. SimBA is very much in active development and a manuscript is in preparation. Meanwhile, we are very keen to hear from users about potential new features that would advance SimBA and help in making automated behavioral scoring accessible to more researchers in behavioral neuroscience.”
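SimBA’s GUI handles this whole workflow internally, but as a rough conceptual sketch of the underlying idea (pose-derived features plus a small set of human annotations train a supervised classifier that then scores the remaining frames), something like the scikit-learn snippet below captures it. The features, labels, and classifier settings here are invented for illustration; this is not SimBA’s actual code:

```python
# A conceptual sketch (not SimBA's actual code): pose-derived features plus
# a small set of human annotations train a supervised classifier that can
# then score every remaining frame automatically.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Invented stand-in data: one row per video frame, with columns standing in
# for pose-derived features (e.g., inter-animal distance, speed, body angle).
n_frames = 5000
features = rng.normal(size=(n_frames, 3))

# Invented human annotations: 1 = behavior present in the frame, 0 = absent.
labels = (features[:, 0] + 0.5 * features[:, 1] > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0
)

# A random forest is one common choice for this kind of tabular problem.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# The trained classifier can now score behaviors frame by frame in new videos.
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```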


For more information on SimBA, you can check out the project’s GitHub page here.

For those looking to contribute, or to try out SimBA and give feedback, you can interact on the project’s Gitter page.

Plus, take a look at their recent Twitter thread detailing the project.

If you would like to be added to the project’s listserv for updates, fill out this form.

RatHat: A self-targeting printable brain implant system

January 9, 2020

Leila Allen and colleagues in Tim Allen’s lab at Florida International University recently developed RatHat, a self-targeting printable brain implant system. Below they describe their project:


“There has not been a major change in how neuroscientists approach stereotaxic methods in decades. Here we present a new stereotaxic method that improves on traditional approaches by reducing costs, training, and surgical time, and by aiding repeatability. The RatHat brain implantation system is a 3D-printable stereotaxic device for rats that is fabricated prior to surgery and fits to the shape of the skull. RatHat builds are directly implanted into the brain without the need for head-leveling or coordinate-mapping during surgery. The RatHat system can be used in conjunction with the traditional u-frame stereotaxic device, but does not require the use of a micromanipulator for successful implantations. Each RatHat system contains several primary components, including the implant for mounting intracranial components, the surgical stencil for targeting drill sites, and the protective cap that guards against impacts and debris. Each component serves a unique function and can be used together or separately. We demonstrate the feasibility of the RatHat system in four different proof-of-principle experiments: 1) a 3-pole cannula apparatus, 2) an optrode-electrode assembly, 3) a fixed-electrode array, and 4) a tetrode hyperdrive. Implants were successful, durable, and long-lasting (up to 9 months). RatHat print files are easily created, can be modified in CAD software for a variety of applications, and are easily shared, contributing to open science goals and replications. The RatHat system has been adapted to multiple experimental paradigms in our lab and should be a useful new way to conduct stereotaxic implant surgeries in rodents.

RatHat is freely available to academic researchers, in keeping with our open science goals. Academic and non-profit researchers interested in receiving the 3D files can contact Dr. Timothy Allen (tallen@fiu.edu). We will first provide you with a simple noncommercial license to be executed by your institution and, upon completion, printable and editable 3D files of the implant system. Our responses are fast, and all files are provided immediately after receiving the aforementioned document. Our goals are noncommercial, and our only interests are to share RatHat as widely as possible in support of our open science goals and to improve the pace of discovery using chronic brain implant systems for behavioral studies.”


The Allen lab has provided a video tutorial on how to implant RatHat, which you can view here on YouTube.

For more details, you can check out the preprint here.

SignalBuddy

September 19, 2019

Richard Warren, a graduate student in the Sawtell lab at Columbia University, recently shared his new open-source project called SignalBuddy:


SignalBuddy is an easy-to-make, easy-to-use signal generator for scientific applications. Making friends is hard, but making SignalBuddy is easy. All you need is an Arduino Uno! SignalBuddy replaces more complicated and (much) more expensive signal generators in laboratory settings where one millisecond resolution is sufficient. SignalBuddy generates digital or true analog signals (sine waves, step functions, and pulse trains), can be controlled with an intuitive serial monitor interface, and looks fabulous in an optional 3D printed enclosure.

To get SignalBuddy working, all you need to do is install the SignalBuddy.ino Arduino sketch provided in the project’s GitHub repository and follow the step-by-step instructions there to program the Arduino for your specific experimental needs. SignalBuddy can be used for numerous lab purposes, including creating pulse trains for optogenetic light stimulation, microstimulation, or electrophysiology, or for generating stimuli for behavioral paradigms.
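Because SignalBuddy is driven over the serial port, it can also be scripted rather than typed at by hand. Here is a minimal pyserial sketch of that idea; the port name, baud rate, and command string are placeholders, not SignalBuddy’s actual syntax (check the GitHub instructions for the real interface):

```python
# A minimal sketch of scripting a serial-controlled device like SignalBuddy
# from Python with pyserial (pip install pyserial). The port name, baud
# rate, and command string are placeholders, not SignalBuddy's actual
# syntax; see the GitHub instructions for the real interface.
import time

import serial

PORT = "/dev/ttyACM0"  # hypothetical; on Windows this might be "COM3"

with serial.Serial(PORT, 9600, timeout=1) as ser:
    time.sleep(2)               # give the Arduino time to reset after connect
    ser.write(b"pulse 10 5\n")  # placeholder command (say, 10 Hz, 5 ms pulses)
    reply = ser.readline()      # read back any echo or confirmation
    print(reply.decode(errors="replace"))
```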

Additionally, their Hackaday.io page provides instructions for 3D printing an enclosure to house the Arduino, using just two .stl files.


For more information, check out the SignalBuddy github repository here.

You can also get further details on the SignalBuddy Hackaday.io page here.

Fun Fact: This group also developed the KineMouse Wheel, a project previously featured on OpenBehavior that is now being used in numerous labs! Cheers to another great open-source project from Richard Warren and the Sawtell lab!

Curated Itinerary on Open-Source Tools at SfN-19

September 5, 2019

OpenBehavior is now an official part of the SfN team for curated itineraries at SfN-19! This year, we will provide an itinerary on Open-Source Tools. Linda Amarante (@L_Amarante) and Samantha White (@samantha6rose) are working on the itinerary now. If you would like your presentation to be included, please DM us through our Twitter account (@OpenBehavior) or send an email message about your presentation to openbehavior@gmail.com before noon on Saturday, September 8. Thanks!

RAD

August 1, 2019

In their recent eNeuro article, Bridget Matikainen-Ankney and colleagues from the Kravitz Lab developed and shared the Rodent Activity Detector (RAD), a low-cost system that can track and record activity in rodent home cages.


Physical activity is a key measure in many research studies and an important determinant of human health. Current methods for measuring physical activity in laboratory rodents have limitations, including high expense, specialized caging and equipment, and high computational overhead. To address these limitations, Matikainen-Ankney et al. designed an open-source, cost-effective device for measuring rodent activity.

In their new manuscript, they describe the design and implementation of RAD. The system allows for high-throughput installation, minimal investigator intervention, and circadian monitoring. The design includes a battery-powered passive infrared (PIR) sensor, a microcontroller, a microSD card logger, and an OLED screen for displaying data. All of the build instructions for RAD manufacture and programming, including the Arduino code, are provided on the project’s website.

The system records the number of PIR active bouts and the total duration the PIR is active each minute. The authors report that RAD is useful for quantifying changes across minutes rather than on a second-to-second time scale, so the default data-logging frequency is set to one minute. The resulting CSV files can be viewed, and the data visualized, using the provided Python scripts. Validation against video monitoring showed that the PIR data correlated strongly with movement speed and that the device captures place-to-place locomotion but not slow or in-place movements. To verify the device’s utility, RAD was used to collect physical activity data from 40 animals over 10 weeks. RAD detected high-fat-diet (HFD)-induced changes in activity and quantified individual animals’ circadian rhythms. Several major advantages of this tool are that the PIR sensor is not triggered by activity in other cages, it can detect and quantify within-mouse activity changes over time, and little investigator intervention is necessary other than infrequent battery replacement. Although the design was optimized for the lab’s specific caging, the open-source nature of the project makes it easy to modify.
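The provided scripts already handle visualization, but as a rough sketch of what working with minute-binned activity logs looks like in pandas, something like this aggregates activity by hour to expose circadian structure; the file name and column names are assumptions, not RAD’s actual headers:

```python
# A rough sketch of summarizing minute-binned home-cage activity logs with
# pandas. The file name and column names ("timestamp", "active_seconds")
# are assumptions for illustration, not RAD's actual CSV headers.
import pandas as pd

df = pd.read_csv("rad_cage01.csv", parse_dates=["timestamp"])
df = df.set_index("timestamp")

# Summing the per-minute activity into hourly bins exposes circadian
# structure across light and dark phases.
hourly = df["active_seconds"].resample("1h").sum()
print(hourly.head(24))
```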

More details on RAD can be found in their eNeuro manuscript here, and all documentation can also be found on the project’s Hackaday.io page.


Matikainen-Ankney, B. A., Garmendia-Cedillos, M., Ali, M., Krynitsky, J., Salem, G., Miyazaki, N. L., … Kravitz, A. V. (2019). Rodent Activity Detector (RAD), an open source device for measuring activity in rodent home cages. eNeuro, 6(4). https://doi.org/10.1523/ENEURO.0160-19.2019

ezTrack

June 13, 2019

Zach Pennington from Denise Cai’s lab at Mt. Sinai recently published a paper in Scientific Reports describing their latest open-source project, ezTrack:


ezTrack is an open-source, platform-independent set of behavior-analysis pipelines built on interactive Python (IPython/Jupyter Notebook) that researchers with no prior programming experience can use: a sigh of relief for anyone with little to no coding background. Behavioral tracking analysis shouldn’t be limited to those with extensive programming knowledge, and ezTrack is a nice alternative to currently available software that may require more programming experience. The manuscript and Jupyter notebooks are written in the style of a tutorial and are meant to provide straightforward instructions for implementing ezTrack. ezTrack differs from other recent video-analysis toolboxes in that it does not use deep-learning algorithms, and thus does not require training sets for transfer learning.

ezTrack can be used to analyze videos of a single animal’s behavior in different settings, and the authors provide examples of positional analysis across several tasks (place preference, water maze, open field, elevated plus maze, light-dark box, etc.), as well as analysis of freezing behavior. ezTrack provides frame-by-frame data output in .csv files, and users can crop video frames to avoid issues with cables from optogenetic or electrophysiology experiments. ezTrack accepts multiple video formats, such as mpg1, wmv, avi, and more.
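As a quick sketch of what can be done with that frame-by-frame .csv output, the snippet below plots an animal’s trajectory with pandas and matplotlib; the file name and the "X"/"Y" column names are assumptions for illustration and may not match ezTrack’s actual output headers:

```python
# A quick sketch of plotting an animal's trajectory from frame-by-frame
# tracking output. The file name and the "X"/"Y" column names are
# assumptions for illustration and may not match ezTrack's actual headers.
import matplotlib.pyplot as plt
import pandas as pd

df = pd.read_csv("LocationOutput.csv")

plt.plot(df["X"], df["Y"], linewidth=0.5)
plt.gca().invert_yaxis()  # video pixel coordinates: y increases downward
plt.xlabel("x (pixels)")
plt.ylabel("y (pixels)")
plt.title("Animal trajectory")
plt.show()
```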

Aside from being open-source, ezTrack has several major advantages. Notably, it is user-friendly and accessible to researchers with little to no programming background. The user does not need to adjust many parameters, the data can be processed into interactive visualizations, and results are easily extractable as .csv files. ezTrack is both operating-system- and hardware-independent and can be used across multiple platforms. Using IPython/Jupyter Notebook also allows researchers to easily replicate their analyses.

Check out their GitHub with more details on how to use ezTrack: https://github.com/denisecailab/ezTrack


Pennington, Z. T., Dong, Z., Bowler, R., Feng, Y., Vetere, L. M., Shuman, T., & Cai, D. J. (2019). ezTrack: An open-source video analysis pipeline for the investigation of animal behavior. Scientific Reports.

OpenBehavior Feedback Survey

We are looking for your feedback to understand how we can better serve the community! We’re also interested in knowing whether and how you’ve implemented some of the open-source tools from our site in your own research.

We would greatly appreciate it if you could fill out a short survey (~5 minutes to complete) about your experiences with OpenBehavior.

https://american.co1.qualtrics.com/jfe/form/SV_0BqSEKvXWtMagqp

Thanks!