January 28, 2017
Dr. Bernd Porr has also shared the following open source bioamplifier:
“Attys is an open source wearable data acquisition device with a special focus on biomedical signals such as heart activity (ECG), muscle activity (EMG) and brain activity (EEG). In contrast to many neurogadgets, the Attys transmits the data as it’s being recorded without any compression or pre-filtering, and at its full precision of 24 bits, to a mobile phone, tablet or PC. This guarantees maximum possible openness so that the raw data can be published alongside the processed data, as required now by many research councils.
All software for the Attys, including its firmware, is open source.
The story of the Attys started four years ago, when Dr. Bernd Porr filmed numerous YouTube clips to educate the public about the possibilities and limits of biosignal measurement (see BPM Biosignals). The site has been very popular ever since, and visitors have been asking whether a ready-made bio-amp could be made available. This was the birth of the Attys.”
January 28, 2017
Dr. Bernd Porr has shared his open-source webpage, BPM Biosignals, with Open Behavior in hopes of providing an educational resource regarding the construction and implementation of bioamplifiers as well as interpretation of biosignals.
“BPM Biosignals provides a no-nonsense approach to assembling a two-stage bio-amplifier and then using it to measure ECG, EEG, EMG and other biosignals. The page features a list of components, the circuit diagram, and video tutorials on how to assemble the amplifier and how to use it properly. The main aim of this page is biosignal education, in particular how to distinguish noise from actual biosignals. One will learn that noise is often “sold” as a biosignal, even in top-level publications. With the help of these clips, one can critically evaluate such results and make up one’s own mind.”
January 13, 2017
Tom Baden, from the University of Sussex, has generously shared the following device with Open Behavior:
Designed for ease of use, robustness and low cost, the “Openspritzer” is an open-hardware “Picospritzer,” as routinely used in labs around the world for administering picoliters of liquid to biological samples. The performance of the Openspritzer and of commercial alternatives is effectively indistinguishable.
The system is based on a solenoid valve connected to a pressure gauge. Control can be attained directly via an external TTL pulse or internally through an Arduino set by a rotary encoder. The basic setup can be put together for €300–400, or substantially less if you are prepared to shop around.
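As a rough illustration of the internal trigger mode described above, the control logic amounts to scheduling valve open/close events from an encoder-set pulse duration. The sketch below is a hypothetical Python model of that timing, not the actual Openspritzer firmware; all function and parameter names are illustrative.

```python
def pulse_schedule(start_ms, duration_ms, n_pulses=1, interval_ms=0):
    """Return (open, close) times in ms for a train of solenoid pulses.

    duration_ms stands in for the value the user dials in on the rotary
    encoder; in the real device the Arduino would drive the valve pin
    high/low at these times.
    """
    schedule = []
    t = start_ms
    for _ in range(n_pulses):
        schedule.append((t, t + duration_ms))  # valve opens, then closes
        t += duration_ms + interval_ms         # wait before the next pulse
    return schedule
```

For example, `pulse_schedule(0, 10, n_pulses=2, interval_ms=5)` yields two 10 ms ejections separated by a 5 ms gap.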
More information regarding Openspritzer can be found on Open Labware.
3D Files and Arduino Code
Forman, C. J., Tomes, H., Mbobo, B., Baden, T., & Raimondo, J. V. (2016). Openspritzer: an open hardware pressure ejection system for reliably delivering picolitre volumes. bioRxiv, 93633. https://doi.org/10.1101/093633
December 30, 2016
Mayank Kabra has shared the following about JAABA, a machine learning-based behavior detection system developed by the Branson Lab at HHMI Janelia Farm.
The Janelia Automatic Animal Behavior Annotator (JAABA) is a machine learning-based system that enables researchers to automatically detect behaviors in video of behaving animals. Users begin by labeling the behavior of the animal, e.g. walking, grooming, or following, in a few of the video frames. Using machine learning, JAABA learns behavior detectors from these labels that can then be used to automatically classify the behaviors of animals in other videos. JAABA has a fast and intuitive graphical user interface and offers multiple ways for users without machine-learning expertise to visualize and understand the resulting classifiers. JAABA has enabled the extraction of detailed, scientifically meaningful measurements of behavioral effects in large experiments.
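The label-a-few-frames, then classify-the-rest workflow described above can be illustrated with a toy sketch. This is not JAABA’s actual algorithm; the nearest-centroid classifier below and all names in it are illustrative assumptions standing in for JAABA’s real per-frame features and learned detectors.

```python
def train_nearest_centroid(features, labels):
    """Fit one centroid per behavior label from hand-labeled frames."""
    grouped = {}
    for feat, label in zip(features, labels):
        grouped.setdefault(label, []).append(feat)
    # Mean of each feature dimension per label.
    return {label: [sum(col) / len(col) for col in zip(*feats)]
            for label, feats in grouped.items()}

def classify(centroids, frame):
    """Assign an unlabeled frame to the nearest behavior centroid."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(centroids[label], frame))
```

Given a handful of labeled frames, `classify` then scores every remaining frame automatically, which is the division of labor the JAABA interface is built around.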
Kabra, M., Robie, A. A., Rivera-Alba, M., Branson, S., & Branson, K. (2013). JAABA: interactive machine learning for automatic annotation of animal behavior. Nature Methods, 10(1), 64–67. https://doi.org/10.1038/nmeth.2281
December 21, 2016
David Barker from the National Institute on Drug Abuse Intramural Research Program has shared the following regarding the development of a device designed to allow the automatic detection of 50-kHz ultrasonic vocalizations.
Ultrasonic vocalizations (USVs) have been utilized to infer animals’ affective states in multiple research paradigms, including animal models of drug abuse, depression, fear or anxiety disorders, Parkinson’s disease, and in studying the neural substrates of reward processing. Currently, the analysis of USV data is performed manually and is thus time-consuming.
The present method was developed to allow for the automated detection of 50-kHz ultrasonic vocalizations using a template detection procedure. The detector runs in XBAT, a MATLAB extension developed by the Bioacoustics Research Program at Cornell University. The specific template detection procedure for ultrasonic vocalizations, along with a number of companion tools, was developed and tested by our laboratory. Details of the detector’s performance can be found in our published work, and a detailed readme file is published along with the MATLAB package on our GitHub.
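As a rough illustration of the template detection idea, the sketch below slides a template along a signal and flags positions where the correlation score crosses a threshold. The actual detector runs in XBAT/MATLAB on spectrograms; this Python fragment and its names are assumptions for exposition, not the published code.

```python
def detect(signal, template, threshold):
    """Flag start indices where the sliding dot product with the
    template reaches the threshold (a crude template detector)."""
    hits = []
    n = len(template)
    for i in range(len(signal) - n + 1):
        window = signal[i:i + n]
        score = sum(a * b for a, b in zip(window, template))
        if score >= threshold:
            hits.append(i)
    return hits
```

A real detector would work on time-frequency windows and normalize the score, but the slide-score-threshold structure is the same.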
Our detector was designed to be freely shared with the USV research community in the hope that all members of the community might benefit from its use. We have included instructions for getting started with XBAT, running the detector, and developing new analysis tools. We encourage users who are familiar with MATLAB to develop and share new analysis tools. To facilitate this type of collaboration, all files have been shared as part of a GitHub repository, allowing suggested changes or novel contributions to be made to the software package. I would happily integrate novel analysis tools created by others into future releases of the detector.
Work on a detector for 22-kHz vocalizations is ongoing; these calls lie nearer the audible range, which makes them technically more difficult to detect. Those interested in contributing can email me at djamesbarker@gmail-dot-com or find me on Twitter (@DavidBarker_PhD).
Olivier Friard of The University of Turin has generously shared the following about BORIS:
BORIS is an open-source, multiplatform standalone program that provides a user-specific coding environment for computer-based review of previously recorded videos or live observations. The program allows users to define a highly customised, project-based ethogram that can then be shared with collaborators, imported, or modified. Once the coding process is completed, the program can automatically extract a time budget for single or grouped observations and present an at-a-glance summary of the main behavioural features. The observation data and analyses can be exported in various formats (e.g., CSV or XLS). The behavioural events can also be plotted and exported in various graphic formats (e.g., PNG, EPS, and PDF).
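The time-budget extraction mentioned above can be illustrated with a minimal sketch: sum the duration of each behaviour and express it as a fraction of the observation. This is hypothetical Python, not BORIS code, and the `(behaviour, start, stop)` event format is an assumption.

```python
def time_budget(events, total_duration):
    """Total time and proportion of the observation per behaviour.

    events: iterable of (behaviour, start, stop) tuples, times in seconds.
    """
    budget = {}
    for behaviour, start, stop in events:
        budget[behaviour] = budget.get(behaviour, 0) + (stop - start)
    # Report (duration, proportion of observation) per behaviour.
    return {b: (d, d / total_duration) for b, d in budget.items()}
```

For a 20 s observation coded as rest (0–10 s), walk (10–15 s), rest (15–20 s), this yields 15 s (75%) resting and 5 s (25%) walking.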
BORIS is currently used around the world for strikingly different approaches to the study of animal and human behaviour.
The latest version of BORIS can be downloaded from www.boris.unito.it, where the manual and some tutorials are also available.
Friard, O., & Gamba, M. (2016). BORIS: a free, versatile open-source event-logging software for video/audio coding and live observations. Methods in Ecology and Evolution, 7, 1325–1330. https://doi.org/10.1111/2041-210X.12584
Gonçalo Lopez of the Champalimaud Neuroscience Programme has shared the following about Bonsai:
Bonsai is an open-source visual programming language for composing reactive and asynchronous data streams coming from video cameras, microphones, electrophysiology systems or data acquisition boards. It gives the experimenter an easy way not only to collect data from all these devices simultaneously, but also to quickly and flexibly chain together real-time processing pipelines that can be fed back to actuator systems for closed-loop stimulation using sound, images or other kinds of digital and analog output.
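The idea of chaining stream transforms that Bonsai expresses visually can be sketched with Python generators. This is a conceptual analogy only — Bonsai is a visual language, not Python — and the smoothing-then-trigger pipeline below, like all its names, is an illustrative assumption.

```python
def smooth(stream, n=3):
    """Moving-average filter over an incoming sample stream."""
    buf = []
    for x in stream:
        buf.append(x)
        if len(buf) > n:
            buf.pop(0)
        yield sum(buf) / len(buf)

def trigger(stream, level):
    """Turn a continuous stream into boolean events for closed-loop output."""
    for x in stream:
        yield x > level
```

Chaining them, e.g. `trigger(smooth(samples), level)`, mirrors how Bonsai workflows feed processed sensor streams into actuators for closed-loop stimulation.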
Bonsai is being used around the world to design all kinds of behavioral experiments from eye-tracking to virtual reality, spanning multiple model organisms from fish to humans.
A public user forum is also available, where you can leave questions and feedback about how to use Bonsai.
Ian Dublon, a former postdoctoral researcher, has shared the following about scintillate:
The program itself is relatively simple and originated out of a real need to rapidly appraise 2D calcium-imaging datasets, together with frustration that existing acquisition software relies exclusively on defined regions of interest (ROIs). There is of course nothing wrong with ROIs, but it is useful to be able to rapidly appraise the whole matrix; it really depends on the biological question being asked.
At its simplest, when you open a stacked TIFF, scintillate uses MATLAB’s imabsdiff to look at the absolute difference across the image matrix for successive frames in the stack. Several other, more powerful tools, including pre-stimulus background subtraction and ICA (using the excellent FastICA), are provided to help the user ascertain the value of the studied preparation. This helps to confirm and pinpoint areas of signal change, allowing the imager to make adjustments to image-acquisition parameters or simply to start afresh.
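The core frame-differencing step — what imabsdiff does for successive frames — can be sketched in Python. This is an illustrative analogue, not scintillate’s code; frames here are assumed to be plain nested lists of pixel values.

```python
def abs_diff_stack(stack):
    """Absolute per-pixel difference between each pair of successive frames.

    stack: list of frames, each frame a list of rows of pixel values.
    Returns len(stack) - 1 difference images highlighting signal change.
    """
    diffs = []
    for prev, cur in zip(stack, stack[1:]):
        diffs.append([[abs(c - p) for p, c in zip(row_p, row_c)]
                      for row_p, row_c in zip(prev, cur)])
    return diffs
```

Pixels that change between frames light up in the difference images, which is what lets the imager quickly spot areas of activity without predefining ROIs.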
Written as a GUI in MATLAB, with source and GUIDE files provided alongside a getting-started manual, it is designed both for image-acquisition people and for MATLAB coders. If the Compiler toolbox is present, it’s possible to package a version of scintillate that will run without a MATLAB installation, making it ideal for running on the image-acquisition workstation. Simply provide it with an acquired stacked TIFF and, within three or so clicks and no command-line syntax, you are ready to go.
Code is hosted on GitHub at github.com/dublon/scintillate. Scintillate uses some freely available third-party code, which is gladly acknowledged throughout. Scintillate is not designed to replace traditional analysis methods but rather to provide an open-source means of rapid evaluation during the pre-processing stage. It is hoped that by hosting it on GitHub it may be developed further and thus adjusted to suit each person’s individual imaging needs.
The Feeding Experimentation Device (FED) is a free, open-source system for measuring food intake in rodents. FED uses an Arduino processor, a stepper motor, an infrared beam detector, and an SD card to record time-stamps of 20-mg pellets eaten by singly housed rodents. FED is powered by a battery, which allows it to be placed in colony caging or within other experimental equipment. The battery lasts ~5 days on a charge, providing uninterrupted feeding records over this duration. The electronics for building each FED cost around US$150, and the 3D-printed parts cost between $20 and $400, depending on access to 3D printers and the desired print quality.
The Kravitz lab has published a large update of their Feeding Experimentation Device (FED) to their Github site (https://github.com/KravitzLab/fed), including updated 3D design files that print more easily and updates to the code to dispense pellets more reliably. Step-by-step build instructions are available here: https://github.com/KravitzLab/fed/wiki
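Summarising the time-stamp log that a FED writes to its SD card might look like the following sketch. The log format (pellet time-stamps in seconds) and the function names here are assumptions for illustration, not part of the FED code.

```python
def pellets_per_hour(timestamps_s):
    """Count pellets retrieved in each hour of the recording.

    timestamps_s: seconds since the start of recording, one per pellet.
    """
    counts = {}
    for t in timestamps_s:
        hour = int(t // 3600)
        counts[hour] = counts.get(hour, 0) + 1
    return counts

def total_intake_mg(timestamps_s, pellet_mg=20):
    """Total intake, assuming the 20-mg pellets FED dispenses."""
    return len(timestamps_s) * pellet_mg
```

Binning the time-stamps by hour gives the kind of uninterrupted multi-day feeding record the device is designed to produce.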
In their 2014 paper, “Visualizing and quantifying movement from pre-recorded videos: The spectral time-lapse (STL) algorithm,” Christopher Madan and Marcia Spetch propose an approach for summarizing animal movement data as a single image (the spectral time-lapse algorithm), as well as for automating the analysis of animal movement data.
The paper includes an implementation of the algorithm as a MATLAB toolbox, available on GitHub.
As an example application, the toolbox has been used to analyze movement data of pigeons solving the traveling salesperson problem (Baron et al., 2015).
Madan, C., & Spetch, M. (2014). Visualizing and quantifying movement from pre-recorded videos: The spectral time-lapse (STL) algorithm. F1000Research, 3, 19.
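The core of the spectral time-lapse idea — collapsing a whole video into a single image whose values encode when each pixel last showed movement — can be sketched as follows. This is a loose illustration under assumed conventions (nested-list frames, a simple change threshold), not the published MATLAB toolbox.

```python
def spectral_time_lapse(frames, threshold=1):
    """Collapse a video into one image of 'time of last change' values.

    frames: list of 2D frames (lists of rows of pixel values).
    Returns an image where each pixel holds the index of the last frame
    in which it changed by at least `threshold`, or 0 if it never did.
    """
    height, width = len(frames[0]), len(frames[0][0])
    out = [[0] * width for _ in range(height)]
    for t in range(1, len(frames)):
        for r in range(height):
            for c in range(width):
                if abs(frames[t][r][c] - frames[t - 1][r][c]) >= threshold:
                    out[r][c] = t  # latest frame at which this pixel moved
    return out
```

Mapping these frame indices to a color scale would give the single time-colored summary image the paper describes; quantifying movement then reduces to counting changed pixels per frame.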