
The OpenMV project: Machine Vision with Python

November 14, 2019

OpenMV – Better, Stronger, Faster and only $65 USD


Recent updates to the firmware for the OpenMV H7 camera have brought new functionality to this device, which is popular for open-source neuroscience projects (e.g., the Rodent Arena Tracker, or RAT: https://hackaday.io/project/162481-rodent-arena-tracker-rat). The new firmware supports the popular TensorFlow library for machine learning on this MicroPython-based device. The board is small (1.5 by 1.75 inches), draws at most 140 mA while processing data, has 1 MB of RAM and 2 MB of flash, and performs 64-bit computations at 480 MHz (3.84 GB/s memory bandwidth). OpenMV is capable of frame differencing, color tracking, marker tracking, face detection, eye tracking, person detection (with TensorFlow Lite), and more. The project also provides an easy-to-use GUI, the OpenMV IDE, which is intuitive and ships with a number of ready-to-run example applications. Arduino users will feel right at home, despite the code being Python-based.
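For a flavor of what OpenMV scripts look like, here is a minimal color-tracking sketch in MicroPython, patterned on the style of the IDE's built-in examples. The LAB color threshold and blob-size cutoff are placeholder values that would need to be tuned in the IDE's threshold editor for a real target.

import sensor
import time

sensor.reset()                       # initialize the camera sensor
sensor.set_pixformat(sensor.RGB565)  # 16-bit color
sensor.set_framesize(sensor.QVGA)    # 320x240
sensor.skip_frames(time=2000)        # let auto-exposure settle

clock = time.clock()
red_threshold = (30, 100, 15, 127, 15, 127)  # placeholder LAB min/max values

while True:
    clock.tick()
    img = sensor.snapshot()
    # find_blobs returns connected regions matching the color threshold
    for blob in img.find_blobs([red_threshold], pixels_threshold=100):
        img.draw_rectangle(blob.rect())
        img.draw_cross(blob.cx(), blob.cy())
    print(clock.fps())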

Check out the project here: https://openmv.io/.

An automated behavioral box to assess forelimb function in rats

November 7, 2019

Chelsea C. Wong and colleagues at the University of California, San Francisco have developed and shared a design for an open-source behavioral chamber for measuring forelimb function in rats.


Forelimb function (reaching, grasping, retrieving, etc.) is a common behavioral readout for studying neural correlates of motor learning, neural plasticity, and recovery from injury. One version of the task commonly used to study these behaviors, the Whishaw single-pellet reach-to-grasp task, traditionally requires an experimenter to manually present each pellet and to shape the rats' behavior over multiple trials by placing the next pellet only once the rat has relocated to the other end of the cage. Wong et al. developed an open-source, low-cost, automated, high-throughput version of this task. The behavioral apparatus, constructed from commercially available acrylic sheets, features a custom-built pellet dispenser; cameras and IR detectors for tracking the positions of the rat and the pellet; and an Arduino board that integrates information about the animal with pellet dispensing. Code for automating the task was written in MATLAB and includes a GUI for altering experiment parameters. Collected data can be analyzed using MATLAB, Excel, or most other statistical packages. The authors provide example data from the device to highlight its potential for combining this reaching task with chronic electrophysiological recording techniques. The full design is available in their publication in the Journal of Neuroscience Methods.
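To make the automation concrete, below is a hedged sketch of the trial loop described above, written in Python purely for illustration; the published system implements this logic in MATLAB and on an Arduino, and the sensor and dispenser functions here are hypothetical, simulated stand-ins rather than the authors' code.

import random
import time

def rear_beam_broken():
    # Hypothetical stand-in for the rear IR detector (simulated here).
    return random.random() < 0.05

def pellet_present():
    # Hypothetical stand-in for the pellet-position detector (simulated).
    return random.random() > 0.05

def dispense_pellet():
    # Hypothetical stand-in for triggering the pellet dispenser.
    pass

def run_trial():
    # Shaping rule from the Whishaw task: arm the dispenser only once
    # the rat has relocated to the rear of the cage.
    while not rear_beam_broken():
        time.sleep(0.01)
    dispense_pellet()
    start = time.time()
    # The pellet detector reports when the pellet leaves the shelf,
    # marking a retrieval (or displacement) event.
    while pellet_present():
        time.sleep(0.01)
    return time.time() - start  # latency from dispense to retrieval

latencies = [run_trial() for _ in range(10)]  # e.g., a short 10-trial session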

Check out the full publication here!


Wong, C. C., Ramanathan, D. S., Gulati, T., Won, S. J., & Ganguly, K. (2015). An automated behavioral box to assess forelimb function in rats. Journal of Neuroscience Methods, 246, 30–37. https://doi.org/10.1016/j.jneumeth.2015.03.008

Automated Home-Cage Rodent Two-bottle Choice Test: open-source success story

October 31, 2019

Elizabeth Godynyuk and colleagues from the Creed Lab at Washington University in St. Louis recently published their design for a two-bottle choice homecage apparatus in eNeuro. It incorporates the original design (published on Hackaday.io in May 2018), modifications from Jude Frie and Jibran Khokhar (Frie & Khokhar, 2019), and additional improvements made over the course of use. This project is a great example of collaborative open-source tool development.


Studies of liquid ingestive behaviors are used in neuroscience to investigate reward-related behavior, metabolism, and circadian biology. Accurate measurement of these behaviors is needed when studying drug administration, testing preference between two substances, and measuring caloric intake. To measure consummatory behavior in mice choosing between two liquids, members of the Creed lab designed a low-cost, Arduino-based device that automatically measures consumption in a homecage two-bottle choice test. Posted to Hackaday in May 2018, the initial version of the device used photointerrupters to measure time at each sipper, 15 mL conical tubes for volumetric measurement of fluid, and a 3D-printed holder for the apparatus. Data from the photobeams are recorded to an SD card by a standard Arduino. In August 2018, the project was updated to version 2, which made the device battery-powered and added a screen to display data. The editable TinkerCAD design is available on Hackaday.io.
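As an example of what can be done with the logged data, here is a short Python sketch that summarizes such a sipper log into a preference score. The column names and file layout are assumptions made for illustration; the actual format is defined by the project's Arduino code.

import csv
from collections import defaultdict

def preference(log_path):
    seconds_at = defaultdict(float)
    with open(log_path) as f:
        # Assumed columns: time, side ("left"/"right"), duration_s
        for row in csv.DictReader(f):
            seconds_at[row["side"]] += float(row["duration_s"])
    total = sum(seconds_at.values()) or 1.0
    # Preference = fraction of total sipper time spent at the left bottle
    return seconds_at["left"] / total

# Example usage: print(preference("SIPPER01.CSV"))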

In October 2018, Dr. Jibran Khokhar and colleagues at the University of Guelph posted a project log describing modifications that make the device larger and suitable for studying liquid intake in rats. This updated design was published in April 2019 in HardwareX. It adds the ability to analyze drinking microstructure by recording licking behavior and volume consumed in real time. The modifications include larger liquid reservoirs and a hydrostatic depth sensor, which allows each bout of drinking to be matched to a specific change in volume.
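The depth-sensor approach rests on the hydrostatic relation P = ρgh: the pressure at the bottom of the reservoir scales with fluid height, and height times the reservoir's cross-sectional area gives the remaining volume. A minimal Python sketch of that conversion follows, with placeholder constants rather than values from Frie and Khokhar's build.

RHO = 1000.0      # fluid density in kg/m^3 (water)
G = 9.81          # gravitational acceleration in m/s^2
AREA_M2 = 7.0e-4  # assumed reservoir cross-section in m^2 (placeholder)

def volume_ml(pressure_pa):
    height_m = pressure_pa / (RHO * G)  # hydrostatic relation: h = P / (rho * g)
    return height_m * AREA_M2 * 1e6     # convert m^3 to mL

# Volume consumed during a bout = reading before minus reading after:
bout_ml = volume_ml(68.6) - volume_ml(58.8)  # example pressures in pascals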

Most recently, Elizabeth Godynyuk and colleagues from the Creed lab have shared their own updated version of the device in eNeuro. It remains low-cost and open-source, and the authors present results validating the device with preference testing. They also show that the two-bottle choice apparatus can be integrated with a fiber photometry system. In the eNeuro article, Godynyuk et al. cite Frie and Khokhar's modifications to highlight how easily the design can be adjusted to fit investigator needs.

These two projects show how open-source designs can be modified and how different groups can collaborate to improve on them. Open-source tools allow research groups to adapt designs to best address their research questions, rather than shaping their questions around the commercial tools that happen to be available.

Creed Lab Version 1: https://hackaday.io/project/158279-automated-mouse-homecage-two-bottle-choice-test

Creed Lab Version 2: https://hackaday.io/project/160388-automated-mouse-homecage-two-bottle-choice-test-v2

Frie and Khokhar 2019 (HardwareX): https://www.sciencedirect.com/science/article/pii/S2468067219300045#b0005

Godynyuk et al 2019 (eNeuro): https://www.eneuro.org/content/6/5/ENEURO.0292-19.2019.long


Frie, J. A., & Khokhar, J. Y. (2019). An open source automated two-bottle choice test apparatus for rats. HardwareX, 5, e00061. https://doi.org/10.1016/j.ohx.2019.e00061

Godynyuk, E., Bluitt, M. N., Tooley, J. R., Kravitz, A. V., & Creed, M. C. (2019). An Open-Source, Automated Home-Cage Sipper Device for Monitoring Liquid Ingestive Behavior in Rodents. eNeuro, 6(5), ENEURO.0292-19.2019. https://doi.org/10.1523/ENEURO.0292-19.2019

SfN 2019 Open-Source Itinerary

October 18, 2019

It’s that time of year again: SfN season! The OpenBehavior Team has put together a curated itinerary of talks and posters featuring open-source tools. The list contains over 200 projects related to open-source in general, with 90 or so projects that are specifically relevant to behavioral neuroscience (denoted in bold)! You can find it at this link: https://docs.google.com/document/d/1s09KorOuBcga2TygATB6lXcjpNYlMxCEZ1yXzxWsnYE/edit?usp=sharing.

Thanks to everyone who reached out to add their projects to the itinerary! If you want to add your poster or talk to the list, it’s not too late! Send us a DM on Twitter or a quick email to openbehavior@gmail.com.

See you in Chicago!

Updates on LocoWhisk and ART

October 3, 2019

Dr Robyn Grant from Manchester Metropolitan University in Manchester, UK, has shared her group’s most recent project, LocoWhisk, a hardware and software solution for measuring rodent exploratory, sensory and motor behaviours:


In describing the project, Dr Grant writes, “Previous studies from our lab have shown that analysing whisker movements and locomotion allows us to quantify the behavioural consequences of sensory, motor and cognitive deficits in rodents. Independent whisker and feet trackers existed, but there was no fully automated, open-source software and hardware solution that could measure both whisker movements and gait.

We developed the LocoWhisk arena and new accompanying software that allow the automatic detection and measurement of both whisker and gait information from high-speed video footage. The arena can easily be made from low-cost materials; it is portable and incorporates both gait analysis (using a pedobarograph) and whisker tracking (using a high-speed video camera and an infrared light source).

The software, ARTv2, is freely available and open source. ARTv2 is fully automated and was developed from our previous ART software (the Automated Rodent Tracker).

ARTv2 contains new whisker and foot detection algorithms. On high-speed video footage of freely moving small mammals (including rats, mice and opossums), we have found that ARTv2 is comparable in accuracy to, and in some cases significantly better than, readily available software and manual trackers.

The LocoWhisk system enables the collection of quantitative data from whisker movements and locomotion in freely behaving rodents. The software automatically records both whisker and gait information and provides added statistical tools to analyse the data. We hope the LocoWhisk system and software will serve as a solid foundation from which to support future research in whisker and gait analysis.”

For more details on the ARTv2 software, check out the github page here.

Check out the paper that describes LocoWhisk and ARTv2, which has recently been published in the Journal of Neuroscience Methods.

LocoWhisk was initially shared and developed through the NC3Rs CRACK IT website here.


3D Printed Headcap and Microdrive

September 26, 2019

In their 2015 Journal of Neurophysiology article, the Paré Lab at the Center for Molecular and Behavioral Neuroscience at Rutgers University describes a novel 3D-printed head-cap and microdrive design for chronic multi-electrode recordings in rats and highlights the impact of 3D printing technology on neurophysiology:


There is a need for microdrives and head-caps that can accommodate different recording configurations. Many investigators implant multiple individual drives to record from numerous areas, but this extends surgery time, impairs animal recovery, and complicates experiments. Other strategies rely on more expensive custom-machined drive assemblies built for one particular set of regions, which limits their adaptability. Some proposed designs allow targeting of multiple regions, but their recording sites must sit within a few millimeters of one another, making them suitable only for mice and unable to reach widely separated areas in larger brains (such as the rat’s).

Using 3D printing technology, the group created a novel microdrive and head-cap design that allows recording from multiple brain regions in different configurations. In their article, the authors review the basic principles of 3D design and printing, introduce their approach to multisite recording, and explain how to construct the required components. The 3D-printed head cap and electrode microdrive enable investigators to perform chronic multi-site recordings in rats. The head cap is composed of five components, and three types of microdrives can be used in different combinations or positions to reach different targets. The microdrive designs offer different functionality, including extended driving depths, targeting of thin layers, and dense placement of many microdrives in a small area.

To demonstrate the viability of the new designs, the lab presents LFP recordings obtained throughout the cortico-hippocampal loop using 3D-printed components. The authors encourage investigators to modify the designs to best suit their research needs and provide editable versions of the three parts most relevant to modification. They also give a detailed explanation of the printing, assembly, and implantation of the head caps and microdrives. Finally, they discuss how advances in 3D printing, notably 3D scanning and new material development, may change how chronic implants are designed and used.

For more information on the microdrive and headcap, see their paper’s Appendix, which has full instructions and advice on building these devices.


Headley, D. B., DeLucca, M. V., Haufler, D., & Paré, D. (2015). Incorporating 3D-printing technology in the design of head-caps and electrode drives for recording neurons in multiple brain regions. Journal of Neurophysiology, 113(7), 2721–2732. https://doi.org/10.1152/jn.00955.2014

SignalBuddy

September 19, 2019

Richard Warren, a graduate student in the Sawtell lab at Columbia University, recently shared his new open-source project called SignalBuddy:


SignalBuddy is an easy-to-make, easy-to-use signal generator for scientific applications. Making friends is hard, but making SignalBuddy is easy. All you need is an Arduino Uno! SignalBuddy replaces more complicated and (much) more expensive signal generators in laboratory settings where one millisecond resolution is sufficient. SignalBuddy generates digital or true analog signals (sine waves, step functions, and pulse trains), can be controlled with an intuitive serial monitor interface, and looks fabulous in an optional 3D printed enclosure.

To get SignalBuddy working, all you need to do is flash the SignalBuddy.ino code provided on their GitHub and follow the step-by-step repository instructions to program the Arduino for your specific experimental needs. SignalBuddy can be used for numerous lab purposes, including generating pulse trains for optogenetic light stimulation, microstimulation, and electrophysiology, or producing stimuli for behavioral paradigms.
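Because SignalBuddy is configured over the Arduino’s serial port, it can also be scripted from a PC rather than typed into the serial monitor. Below is a hedged sketch using pyserial; the port name is platform-specific, and the command string is a hypothetical placeholder, since the real command syntax is documented in the SignalBuddy README.

import serial  # pip install pyserial

# Port name varies by OS (e.g., "COM3" on Windows).
with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as port:
    # Hypothetical command: request a 20 Hz pulse train with 5 ms pulses.
    port.write(b"pulse 20 5\n")
    print(port.readline().decode().strip())  # print the device's reply, if any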

Additionally, their Hackaday site provides instructions for 3D printing an enclosure to house the Arduino, using just two .stl files.


For more information, check out the SignalBuddy github repository here.

You can also get further details on the SignalBuddy Hackaday.io page here.


Fun fact: this group also developed the KineMouse Wheel, a project previously featured on OpenBehavior that is now being used in numerous labs! Cheers to another great open-source project from Richard Warren and the Sawtell lab!

eNeuro’s “Open Source Tools and Methods” paper topic

September 12, 2019

There’s a new place to publish your open-source tools and methods in neuroscience! Christophe Bernard, Editor-in-Chief of the journal eNeuro (an open-access journal of the Society for Neuroscience), recently wrote an editorial announcing a new topic track in eNeuro for Open Source Tools and Methods. In the editorial, Bernard describes the recent push for open-source science and highlights the many new open-source projects being developed in neuroscience that need a proper home for publication. While eNeuro already has a “Methods/New Tools” submission type, Bernard says the “Open Source Tools and Methods” type is intended for projects like “low-cost devices to measure animal behavior, a new biophysical model of a single neuron, a better method to realign images when performing in vivo two-photon imaging, scripts and codes to analyze signals” and more.

Until now, there has been no publication venue explicitly intended for open-source tools and methods in neuroscience; this article type gives new tools, methods, and devices a straightforward path to publication. It should also aid the field in replication, reproducibility, and transparency of the methods and tools used. A major point from Bernard is that this may help the developers of a tool or method, since “it allows for acknowledgment of those who developed such tools and methods fully, often rotating students or engineers recruited on a short-duration contract. On a standard research paper, their name ends up in the middle of the list of authors, but the Open Source Tools and Methods type will allow them to be the first author.”

The details for submitting an open-source tool or method, from the eNeuro site, are as follows: “Open Source Tools and Methods are brief reports (limited to 4500 words) describing the creation and use of open-source tools in neuroscience research. Examples of tools include hardware designs used in behavioral or physiological studies and software used for data acquisition and analysis. They must contain a critique of the importance of the tool, how it compares to existing open- and closed-source solutions, and a demonstration of tool use in a neuroscience experiment.”


Cheers to you, eNeuro, for your inclusion of open-source projects to help advance the neuroscience field!


Link to the editorial: https://www.eneuro.org/content/6/5/ENEURO.0342-19.2019

Current articles for Open Source Tools and Methods are listed here.

To submit an article under Open Source Tools and Methods, check out the instructions for authors at eNeuro here.

Curated Itinerary on Open-Source Tools at SfN-19

September 5, 2019

OpenBehavior is now an official part of the SfN team for curated itineraries at SfN-19! This year, we will provide an itinerary on Open-Source Tools. Linda Amarante (@L_Amarante) and Samantha White (@samantha6rose) are working on the itinerary now. If you would like your presentation to be included, please DM us through our Twitter account (@OpenBehavior) or send an email message about your presentation to openbehavior@gmail.com before noon on Saturday, September 8. Thanks!

Ratcave

August 29, 2019

Nicholas A. Del Grosso and Anton Sirota at the Bernstein Centre for Computational Neuroscience recently published their new project, Ratcave, a Python 3D graphics library that allows researchers to create and present 3D stimuli in their experiments:


Neuroscience experiments often rely on software to present stimuli to a subject and record their responses. Many current stimulus libraries lack the 3D graphics support needed for psychophysics experiments, and while Python and other languages offer 3D graphics libraries, these are difficult to integrate into psychophysics software without modification. To bring 3D graphics programming into the existing ecosystem of Python stimulus software, the authors developed Ratcave.

Ratcave is an open-source, cross-platform Python library that adds 3D stimulus support to all OpenGL-based 2D Python stimulus libraries, including VisionEgg, PsychoPy, Pyglet, and Pygame. Ratcave comes with resources including basic 3D object primitives and a wide range of 3D lighting effects. Its intuitive object-oriented interface allows all objects, including meshes, lights, and cameras, to be repositioned, rotated, and scaled, and objects can be parented to one another to specify complex relationships. By sending data to the graphics card as a single array using OpenGL’s VAO (Vertex Array Object) functionality, Ratcave makes drawing much more efficient: over 30,000 vertices can be rendered at a performance level surpassing the needs of most behavioral research studies.
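To give a sense of the interface, here is a minimal scene patterned on Ratcave’s documented quickstart (exact API details may vary between versions): load one of the bundled primitive meshes, position it, and draw it inside a Pyglet window.

import pyglet
import ratcave as rc

window = pyglet.window.Window()

# Ratcave ships with example meshes; read one of the bundled primitives.
reader = rc.WavefrontReader(rc.resources.obj_primitives)
monkey = reader.get_mesh("Monkey")
monkey.position.xyz = 0, 0, -2  # move the mesh in front of the camera

scene = rc.Scene(meshes=[monkey])  # a Scene groups meshes, lights, and a camera

@window.event
def on_draw():
    with rc.default_shader:
        scene.draw()

pyglet.app.run()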

An advantage of Ratcave is that researchers can continue to use their preferred libraries, since Ratcave supplements existing Python stimulus libraries rather than replacing them, making it easy to add 3D stimuli to current experiments. The manuscript also reports that Ratcave has been tested and used in other groups’ research, demonstrating reproducibility across labs and experiments.

Details on the software and source code can be found at https://github.com/ratcave/ratcave.

Documentation for Ratcave can also be found at https://ratcave.readthedocs.org.