
Nose Poke Device

April 20, 2017 

Andre Chagas, creator of Open Neuroscience, has generously shared the following with OpenBehavior regarding an Arduino-based, 3D-printed nose poke device:


“This nose poke device was built as a “proof of principle”. The idea was to show that scientists, too, can leverage the open-source philosophy and the knowledge built by the community that is developing around open-source hardware. Moreover, the bill of materials was kept simple and affordable: one device can be built for ~25 dollars and should take 2–3 hours to build, including the time to print the parts.

The device is organised as follows: the 3D-printed frame (which can also be built with other materials when a printer is not available) contains a hole into which the animals are expected to insert their snouts. At the front of the hole, an infrared LED is aligned with an infrared detector, forming an “infrared curtain” at the hole’s entrance. If this curtain is interrupted, a signal is sent to a microcontroller (an Arduino in this case), which can then trigger other electronic components, such as a water pump, an LED indicator, or, in this case, a piezo buzzer.
At the back of the hole, a white LED indicates that the system is active and ready for “nose pokes”.

The microcontroller contains the code responsible for controlling the electronic parts. The code can easily be changed, as it is written for Arduino, and many code examples and tutorials (for beginners and experts) can be found online.”
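The beam-break logic described above lends itself to a very short Arduino sketch. The sketch below is a minimal illustration written for this post rather than the author's published code; the pin numbers, the assumption that the detector reads LOW while the beam is broken, and the 2 kHz feedback tone are all arbitrary choices:

const int IR_DETECTOR_PIN = 2;   // output of the infrared detector
const int READY_LED_PIN   = 3;   // white LED at the back of the hole
const int BUZZER_PIN      = 4;   // piezo buzzer (a pump driver could go here instead)

void setup() {
  pinMode(IR_DETECTOR_PIN, INPUT);
  pinMode(READY_LED_PIN, OUTPUT);
  pinMode(BUZZER_PIN, OUTPUT);
  digitalWrite(READY_LED_PIN, HIGH);   // signal that the system is active and ready
  Serial.begin(9600);                  // log pokes to the serial monitor
}

void loop() {
  // Assumed wiring: the detector reads LOW while the infrared curtain is broken.
  if (digitalRead(IR_DETECTOR_PIN) == LOW) {
    Serial.println("Nose poke detected");
    tone(BUZZER_PIN, 2000, 200);       // 200 ms feedback tone at 2 kHz
    delay(300);                        // crude debounce while the snout withdraws
  }
}

Swapping the tone() call for a digitalWrite() to a pump driver would turn the same poke event into a water reward, as in the description above.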

Find more documentation on FigShare and Open Neuroscience.

Link to share:  https://edspace.american.edu/openbehavior/2017/04/20/nose-poke-device/

Pixying Behavior

April 3, 2017

Robert Sachdev, from the NeuroCure Cluster of Excellence, Humboldt-Universität zu Berlin, Germany, has generously shared the following regarding automated optical tracking of animal movement:


“We have developed a method for tracking the motion of whiskers, limbs and whole animals in real time. We show how to use a plug-and-play Pixy camera to monitor the real-time motion of multiple colored objects and apply the same tools for post-hoc analysis of high-speed video. Our method has major advantages over currently available methods: we can track the motion of multiple adjacent whiskers in real time and apply the same methods post hoc to “recapture” the same motion at high temporal resolution. Our method is flexible; it can track similarly shaped objects, such as two adjacent whiskers, forepaws, or even two freely moving animals. With this method it becomes possible to use the phase of movement of particular whiskers or a limb to perform closed-loop experiments.”
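To give a sense of what the real-time readout looks like, here is a minimal sketch in the style of the original Pixy (CMUcam5) Arduino library's examples. This is not the authors' code; the SPI wiring, the serial output format and the idea of training each whisker or paw as its own color signature are assumptions made for illustration:

#include <SPI.h>
#include <Pixy.h>

Pixy pixy;   // Pixy CMUcam5 connected over SPI (the library default)

void setup() {
  Serial.begin(115200);
  pixy.init();
}

void loop() {
  // getBlocks() returns how many trained color signatures were found this frame.
  uint16_t n = pixy.getBlocks();
  for (uint16_t i = 0; i < n; i++) {
    // Each block reports which signature it matched (e.g. whisker 1 vs whisker 2)
    // and where it sits in image coordinates.
    Serial.print(pixy.blocks[i].signature);
    Serial.print('\t');
    Serial.print(pixy.blocks[i].x);
    Serial.print('\t');
    Serial.println(pixy.blocks[i].y);
  }
}

In a closed-loop experiment, the position or movement phase computed inside loop() is where a criterion would be checked to trigger a stimulus.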

Link to share:  https://edspace.american.edu/openbehavior/2017/04/03/pixying-behavior/


Open Ephys

March 17, 2017

Jakob Voigts, from the Massachusetts Institute of Technology, has shared the following regarding www.open-ephys.org. Open Ephys aims to distribute reliable open source software as well as tools for extracellular recording and stimulation.


“Open Ephys is a collaborative effort to develop, document, and distribute open-source tools for systems neuroscience. Since the spring of 2011, our main focus has been on creating a multichannel data acquisition system optimized for recordings in freely behaving rodents. However, most of our tools are general enough to be used in applications involving other model organisms and electrode types.

We believe that open-source tools can improve basic scientific research in a variety of ways. They are often less expensive than their closed-source counterparts, making it more affordable to scale up one’s experiments. They are readily modifiable, giving scientists a degree of flexibility that is not usually provided by commercial systems. They are more transparent, which leads to a better understanding of how one’s data is being generated. Finally, by encouraging researchers to document and share tools they would otherwise keep to themselves, the open-source community reduces redundant development efforts, thereby increasing overall scientific productivity.”
– Jakob Voigts

Open Ephys features devices such as the flexDrive, a “chronic drive implant for extracellular electrophysiology”, as well as an Arduino-based tetrode twister. The Pulse Pal generates precise voltage pulses. Also featured on Open Ephys is software such as Symphony, a MATLAB-based data acquisition system for electrophysiology.
The Open Ephys GitHub can be found here.

Eco-HAB

February 12, 2017 

Dr. Ewelina Knapska from the Nencki Institute of Experimental Biology in Warsaw, Poland, has shared the following regarding Eco-HAB, an RFID-based system for automated tracking:


Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors.
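To make the occupancy bookkeeping behind "time spent together" concrete, here is a toy C++ sketch. It is not the Eco-HAB analysis software; the crossing-event format, mouse IDs, compartment numbers and times are invented for illustration. It converts a log of antenna crossings into the time two mice spend in the same compartment:

#include <iostream>
#include <map>
#include <string>
#include <vector>

struct Crossing {
  double time;        // seconds since the start of the recording
  std::string mouse;  // RFID tag of the animal
  int compartment;    // compartment the animal has just entered (1-4)
};

int main() {
  // Invented example log; in practice these events come from the RFID antennas.
  std::vector<Crossing> log = {{0.0, "A", 1},  {0.0, "B", 2},  {10.0, "B", 1},
                               {40.0, "A", 3}, {60.0, "B", 3}, {90.0, "A", 1}};
  const double end_time = 120.0;

  std::map<std::string, int> where;  // current compartment of each mouse
  double together = 0.0;             // total time A and B share a compartment
  double prev = 0.0;                 // time of the previous event

  // Credit the interval that just ended, then apply the new crossing.
  auto accumulate = [&](double now) {
    if (where.count("A") && where.count("B") && where["A"] == where["B"])
      together += now - prev;
    prev = now;
  };

  for (const auto& c : log) {
    accumulate(c.time);
    where[c.mouse] = c.compartment;
  }
  accumulate(end_time);

  std::cout << "Mice A and B spent " << together << " s of " << end_time
            << " s in the same compartment.\n";
  return 0;
}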


Attys

January 28, 2017 

Dr. Bernd Porr has also shared the following open source bioamplifier:


“Attys is an open-source wearable data acquisition device with a special focus on biomedical signals such as heart activity (ECG), muscle activity (EMG) and brain activity (EEG). In contrast to many neurogadgets, the Attys transmits the data as it is being recorded, without any compression or pre-filtering and at its full precision of 24 bits, to a mobile phone, tablet or PC. This guarantees maximum openness, so that the raw data can be published alongside the processed data, as is now required by many research councils.

All software for the Attys is open source, including its firmware.

The story of the Attys started four years ago, when Dr. Bernd Porr filmed numerous YouTube clips to educate the public about the possibilities and limits of biosignal measurement (see BPM Biosignals). The site has been very popular ever since, and visitors have been asking if a ready-made bio-amp could be made available. This was the birth of the Attys.”

BPM Biosignals

January 28, 2017

Dr. Bernd Porr has shared his open-source webpage, BPM Biosignals, with Open Behavior in hopes of providing an educational resource regarding the construction and implementation of bioamplifiers as well as interpretation of biosignals.


“BPM Biosignals provides a no-nonsense approach to assembling a two-stage bio-amplifier and then using it to measure ECG, EEG, EMG and other biosignals. The page features the list of components, the circuit diagram and video tutorials on how to assemble the amplifier and how to use it properly. The main aim of this page is biosignal education, in particular how to distinguish noise from actual biosignals. One will learn that noise is often “sold” as a biosignal, even in top-level publications. With the help of these clips one can critically evaluate such results and make up one’s own mind.”

Openspritzer

January 13, 2017

Tom Baden, from the University of Sussex, has generously shared the following device with Open Behavior:


Designed for ease of use, robustness and low cost, the “Openspritzer” is an open-hardware version of the “Picospritzer” routinely used in labs around the world to administer picoliters of liquid to biological samples. The performance of Openspritzer and commercial alternatives is effectively indistinguishable.
The system is based on a solenoid valve connected to a pressure gauge. Control can be attained directly via an external TTL pulse or internally through an onboard Arduino, with settings adjusted by a rotary encoder. The basic setup can be put together for 300–400€, or substantially less if you are prepared to shop around.
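As a rough sketch of the TTL-triggered mode described above (this is not the published Openspritzer firmware; the pin assignments and the fixed 50 ms puff duration are assumptions), an Arduino can watch the trigger line and open the valve for a set time:

const int TRIGGER_PIN = 2;          // external TTL input
const int VALVE_PIN   = 8;          // drives the solenoid valve via a transistor
const unsigned long PUFF_MS = 50;   // puff duration; on the real device this is
                                    // adjusted with the rotary encoder

void setup() {
  pinMode(TRIGGER_PIN, INPUT);
  pinMode(VALVE_PIN, OUTPUT);
  digitalWrite(VALVE_PIN, LOW);     // valve closed at rest
}

void loop() {
  if (digitalRead(TRIGGER_PIN) == HIGH) {        // TTL pulse arrived
    digitalWrite(VALVE_PIN, HIGH);               // open valve: deliver the puff
    delay(PUFF_MS);
    digitalWrite(VALVE_PIN, LOW);                // close valve
    while (digitalRead(TRIGGER_PIN) == HIGH) {}  // wait for the pulse to end
  }
}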

More information regarding Openspritzer can be found on Open Labware.

Supplementary Materials

3D Files and Arduino Code 


Janelia Automatic Animal Behavior Annotator (JAABA)

December 30, 2016

Mayank Kabra has shared the following about JAABA, a machine learning-based behavior detection system developed by the Branson Lab at HHMI Janelia Farm.


The Janelia Automatic Animal Behavior Annotator (JAABA) is a machine learning-based system that enables researchers to automatically detect behaviors in video of behaving animals. Users begin by labeling the behavior of the animal, e.g., walking, grooming, or following, in a few of the video frames. Using machine learning, JAABA learns behavior detectors from these labels that can then be used to automatically classify the behaviors of animals in other videos. JAABA has a fast and intuitive graphical user interface and offers multiple ways for a non-machine-learning user to visualize and understand the resulting classifier. JAABA has enabled the extraction of detailed, scientifically meaningful measurements of behavioral effects in large experiments.

Ultrasonic Vocalization (USV) Detector

December 21, 2016

David Barker from the National Institute on Drug Abuse Intramural Research Program has shared the following regarding the development of a method designed to allow the automatic detection of 50-kHz ultrasonic vocalizations.


Ultrasonic vocalizations (USVs) have been utilized to infer animals’ affective states in multiple research paradigms, including animal models of drug abuse, depression, fear or anxiety disorders, Parkinson’s disease, and studies of the neural substrates of reward processing. Currently, the analysis of USV data is performed manually and is thus time-consuming.

The present method was developed to allow automated detection of 50-kHz ultrasonic vocalizations using a template detection procedure. The detector runs in XBAT, a MATLAB extension developed by the Bioacoustics Research Program at Cornell University. The specific template detection procedure for ultrasonic vocalizations, along with a number of companion tools, was developed and tested by our laboratory. Details of the detector’s performance can be found in our published work, and a detailed readme file is published along with the MATLAB package on our GitHub.
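For readers unfamiliar with template detection, the general idea can be sketched in a few lines: slide a template across the recording and flag the positions where the normalized cross-correlation with the template exceeds a threshold. The toy C++ example below is not the XBAT detector; it works on an invented one-dimensional trace with an arbitrary threshold, purely to illustrate the principle:

#include <cmath>
#include <iostream>
#include <vector>

// Normalized cross-correlation of the template against the signal at offset pos.
double ncc(const std::vector<double>& signal, const std::vector<double>& tmpl,
           size_t pos) {
  double dot = 0.0, ss = 0.0, tt = 0.0;
  for (size_t i = 0; i < tmpl.size(); ++i) {
    dot += signal[pos + i] * tmpl[i];
    ss  += signal[pos + i] * signal[pos + i];
    tt  += tmpl[i] * tmpl[i];
  }
  return (ss > 0.0 && tt > 0.0) ? dot / std::sqrt(ss * tt) : 0.0;
}

int main() {
  // Toy "recording": two copies of the template embedded in silence.
  std::vector<double> tmpl = {0.0, 1.0, 0.0, -1.0, 0.0};
  std::vector<double> signal(200, 0.0);
  for (size_t i = 0; i < tmpl.size(); ++i) {
    signal[40 + i]  += tmpl[i];
    signal[120 + i] += 0.8 * tmpl[i];
  }

  const double threshold = 0.9;  // detection threshold (an arbitrary choice)
  for (size_t pos = 0; pos + tmpl.size() <= signal.size(); ++pos) {
    const double score = ncc(signal, tmpl, pos);
    if (score > threshold)
      std::cout << "Detection at sample " << pos << " (score " << score << ")\n";
  }
  return 0;
}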

Our detector was designed to be freely shared with the USV research community in the hope that all members of the community might benefit from its use. We have included instructions for getting started with XBAT, running the detector, and developing new analysis tools. We encourage users who are familiar with MATLAB to develop and share new analysis tools. To facilitate this type of collaboration, all files have been shared as part of a GitHub repository, allowing suggested changes or novel contributions to be made to the software package. I would happily integrate novel analysis tools created by others into future releases of the detector.

Work on a detector for 22-kHz vocalizations is ongoing; the technical challenges are greater because 22-kHz vocalizations are nearer to audible noise. Those interested in contributing can email me at djamesbarker@gmail-dot-com or find me on Twitter (@DavidBarker_PhD).


Behavioral Observation Research Interactive Software (BORIS)

Olivier Friard of the University of Turin has generously shared the following about BORIS:


BORIS is an open-source, multiplatform standalone program that provides a user-specific coding environment for computer-based review of previously recorded videos or live observations. The program allows the user to define a highly customised, project-based ethogram that can then be shared with collaborators, imported, or modified. Once the coding process is completed, the program can automatically extract a time budget from single or grouped observations and present an at-a-glance summary of the main behavioural features. The observation data and analyses can be exported in various formats (e.g., CSV or XLS). The behavioural events can also be plotted and exported in various graphic formats (e.g., PNG, EPS, and PDF).

BORIS is currently used around the world for strikingly different approaches to the study of animal and human behaviour.

The latest version of BORIS can be downloaded from www.boris.unito.it, where the manual and some tutorials are also available.

[Screenshots: a running observation; an ethogram]


Friard, O. and Gamba, M. (2016). BORIS: a free, versatile open-source event-logging software for video/audio coding and live observations. Methods in Ecology and Evolution, 7: 1325–1330. doi:10.1111/2041-210X.12584