November 14, 2018
John Stowers and colleagues from the Straw Lab at the University of Freiburg have developed and shared FreemoVR, a virtual reality setup for unrestrained animals.
Virtual reality (VR) systems can help mimic nature in behavioral paradigms, advancing our understanding of behavior and brain function. Typical VR systems require that animals be movement-restricted, which limits natural responses. FreemoVR was developed to address this limitation, allowing virtual reality to be integrated with freely moving behavior. The system can be used with a number of species, including mice, zebrafish, and Drosophila, and has been validated in tests of height aversion, social interaction, and visuomotor responses in unrestrained animals.
Read more on the Straw Lab site or in the Nature Methods paper, or access the software on GitHub.
Stowers, J. R., Hofbauer, M., Bastien, R., Griessner, J., Higgins, P., Farooqui, S., . . . Straw, A. D. (2017). Virtual reality for freely moving animals. Nature Methods, 14(10), 995-1002. doi:10.1038/nmeth.4399
October 31, 2018
At the upcoming Society for Neuroscience meeting in San Diego, there will be a number of posters and talks that highlight novel devices and software that have implications for behavioral neuroscience. If you’re heading to the meeting, be sure to check them out! Relevant posters and talks are highlighted in the document, available at the following link: https://docs.google.com/document/d/12XqODhW14K2drCCEARVESoqqE0KrSjksZKN40xURVmk/edit?usp=sharing
October 24, 2018
Qingchun Guo and colleagues share their cost-effective, multi-channel fiber photometry system in Biomedical Optics Express.
Fiber photometry is a viable tool for recording in vivo calcium activity in freely behaving animals. In combination with genetically encoded calcium indicators, it can be used to measure neuronal and population activity from a genetically defined subset of neurons. Guo and colleagues have developed a setup that allows recording from multiple brain regions, or multiple animals, simultaneously with the use of a galvo-mirror system. This creative and simple solution reduces the number of detectors needed for multi-channel data collection, expanding researchers' ability to collect calcium imaging data from many subjects in a cost-effective way.
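The key idea is time-division multiplexing: the galvo mirror steers light between fibers, so a single detector records an interleaved stream of samples that is later split back into per-channel traces. A minimal sketch of that demultiplexing step (illustrative only; function names and sample values are invented, not from the authors' code):

```python
import numpy as np

def demultiplex(signal, n_channels):
    """Split an interleaved 1-D photometry trace into per-channel traces.

    With the galvo mirror cycling through channels 0, 1, ..., N-1, the
    detector output is ordered ch0, ch1, ..., chN-1, ch0, ch1, ... so
    demultiplexing reduces to a reshape.
    """
    signal = np.asarray(signal, dtype=float)
    n_frames = len(signal) // n_channels           # drop any trailing partial cycle
    trimmed = signal[: n_frames * n_channels]
    return trimmed.reshape(n_frames, n_channels).T  # shape: (channels, frames)

# Example: 3 channels sharing one detector, two full galvo cycles
interleaved = np.array([1.0, 10.0, 100.0,   # cycle 1
                        2.0, 20.0, 200.0])  # cycle 2
per_channel = demultiplex(interleaved, 3)
print(per_channel[2])  # trace for channel 2: [100. 200.]
```

A real system would also account for galvo settling time and discard samples recorded mid-transition.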
Read more here!
Guo, Q., Zhou, J., Feng, Q., Lin, R., Gong, H., Luo, Q., … Fu, L. (2015). Multi-channel fiber photometry for population neuronal activity recording. Biomedical Optics Express, 6(10), 3919–3931. https://doi.org/10.1364/BOE.6.003919
October 17, 2018
In the journal HardwareX, Jinook Oh and colleagues share their design for OpenFeeder, an automatic feeder for animal experiments.
Automatic delivery of precisely measured food amounts is important when studying reward and feeding behavior. Commercially available devices are often designed with specific species and food types in mind, limiting the ways they can be used. This open-source automatic feeder can easily be customized for food types from seeds to pellets to fit the needs of any species. OpenFeeder integrates plexiglass tubes, an Arduino Uno, a motor driver, and a piezo sensor to reliably deliver accurate amounts of food, and can also be built using 3D-printed parts.
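The motor-plus-piezo pairing suggests a simple closed loop: step the dispenser until the piezo sensor confirms an item actually dropped. A hedged sketch of that logic in Python (the real device runs C on the Arduino; these names are illustrative, not from the OpenFeeder firmware):

```python
def dispense(step_motor, piezo_hit, max_steps=50):
    """Advance the feeder motor until the piezo sensor confirms a food
    item landed, or give up after max_steps. Returns True on success."""
    for _ in range(max_steps):
        step_motor()        # rotate the dispensing mechanism one step
        if piezo_hit():     # vibration pulse = item hit the tray
            return True
    return False            # jam or empty hopper: no confirmation

# Simulated hardware: the third motor step shakes an item loose.
steps = {"n": 0}
ok = dispense(lambda: steps.__setitem__("n", steps["n"] + 1),
              lambda: steps["n"] == 3)
print(ok)  # -> True
```

Confirming each delivery with the sensor, rather than trusting a fixed number of motor steps, is what makes the dispensed amount reliable across food types.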
Read more from HardwareX.
Or check out the device on Open Science Framework and Github.
October 10, 2018
On Hackaday, Richard Warren of the Sawtell Lab at Columbia University has shared his design for the KineMouse Wheel, a lightweight running wheel for head-fixed locomotion that allows 3D positioning of mice with a single camera.
Locomotor behavior is a common behavioral readout in neuroscience research, and running wheels are a great tool for assessing motor function in head-fixed mice. The KineMouse Wheel takes this tool a step further. Constructed from lightweight, transparent polycarbonate with an angled mirror mounted inside, this innovative device allows a single camera to capture two views of locomotion simultaneously. When combined with DeepLabCut, a deep-learning tracking software, the locomotion of head-fixed mice can be captured in three dimensions, allowing a more complete assessment of motor behavior. The wheel can also be customized to fit the needs of a lab by using different build materials. More details about the KineMouse Wheel, including a full parts list and build instructions, are available at hackaday.io.
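The mirror trick means one camera frame contains both a direct side view and a reflected bottom view of the mouse. Once a tracker such as DeepLabCut locates the same body part in both views, the two 2-D points can be merged into one 3-D coordinate. A minimal sketch, assuming an idealized 45-degree mirror and matched pixel scales (a real setup would calibrate both):

```python
def merge_views(side_xy, bottom_xz):
    """Combine two tracked views of one body part into a 3-D point.

    side_xy:   (x, y) from the direct side view
    bottom_xz: (x, z) from the mirrored bottom view
    The x coordinate is seen in both views, so the two estimates
    are averaged; y comes from the side view, z from the mirror.
    """
    x_side, y = side_xy
    x_bottom, z = bottom_xz
    x = 0.5 * (x_side + x_bottom)
    return (x, y, z)

# Toy example: a paw tracked in both halves of one camera frame
print(merge_views(side_xy=(100.0, 40.0), bottom_xz=(102.0, 12.0)))
# -> (101.0, 40.0, 12.0)
```

The redundant x estimate doubles as a sanity check: a large disagreement between the two views flags a tracking error in one of them.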
Read more about KineMouse Wheel on Hackaday,
and check out other awesome open-source tools on the OpenBehavior Hackaday list!
We are looking for your feedback to understand how we can better serve the community! We’re also interested to know if/how you’ve implemented some of the open-source tools from our site in your own research.
We would greatly appreciate it if you could fill out a short survey (~5 minutes to complete) about your experiences with OpenBehavior.
October 3, 2018
Thomas Akam and researchers from the Champalimaud Foundation and Oxford University have developed pyControl, a system that combines open-source hardware and software for control of behavioral experiments.
The ability to seamlessly control the various aspects of a complex task is important for behavioral neuroscience research. pyControl, an open-source framework, combines Python scripts and a MicroPython microcontroller to control behavioral experiments. The framework can be run through a command-line interface (CLI) or a user-friendly graphical user interface (GUI) that lets users manage a variety of devices such as nose pokes, LED drivers, stepper motor controllers, and more. Data collected with the system can then be imported easily into Python for analysis. In addition to complete documentation on the pyControl website, users are welcome to ask questions and interact with the developers and other users via the pyControl Google group.
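Frameworks of this kind typically express a behavioral task as a finite state machine: the task sits in a state until a hardware event (a poke, a timer) triggers a transition. The sketch below illustrates that general idea only; it is NOT the actual pyControl syntax (see the pyControl documentation for real task definitions), and the state and event names are invented:

```python
class Task:
    """Toy finite state machine in the spirit of behavioral task control."""

    def __init__(self, initial_state, transitions):
        # transitions maps (current_state, event) -> next_state
        self.state = initial_state
        self.transitions = transitions
        self.log = []  # (event, resulting state), akin to a session log

    def handle(self, event):
        key = (self.state, event)
        if key in self.transitions:
            self.state = self.transitions[key]
        self.log.append((event, self.state))

# Toy nose-poke task: a correct poke triggers reward, then re-arms.
task = Task("wait_for_poke", {
    ("wait_for_poke", "poke_in"): "reward",
    ("reward", "reward_done"): "wait_for_poke",
})
task.handle("poke_in")
print(task.state)  # -> reward
```

Keeping the task logic declarative like this is what makes the resulting event logs easy to parse back into Python for analysis.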
Read more on the pyControl website.
Purchase the pyControl breakout board at OpenEphys.
Or check out the pyControl Google group!
September 26, 2018
In Computers in Biology and Medicine, Carlos Fernando Crispim Junior and colleagues share their software EthoWatcher: a computational tool that supports video tracking, detailed ethography, and extraction of kinematic variables from video files of laboratory animals.
The freely available EthoWatcher software has two modules: a tracking module and an ethography module. The tracking module permits controlled separation of the target from its background and extraction of image attributes used to calculate distance traveled, orientation, length, area, and a path graph of the target. The ethography module allows recording of catalog-based behaviors from video files, from the environment, or frame by frame. The output reports the latency, frequency, and duration of each behavior, as well as the sequence of events in a time-segmented format set by the user. EthoWatcher was validated through tests on the detection of known behavioral effects of drugs and on kinematic measurements.
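The tracking module's pipeline can be pictured in a few lines: separate the animal from a static background, take the centroid of the foreground pixels, and sum centroid displacements across frames to get distance traveled. A hedged sketch of that idea (toy frames and thresholds, not EthoWatcher's actual implementation):

```python
import numpy as np

def centroid(frame, background, thresh=30):
    """Foreground = pixels differing from the background beyond a threshold;
    return the (x, y) centroid of those pixels (assumes the animal is visible)."""
    fg = np.abs(frame.astype(int) - background.astype(int)) > thresh
    ys, xs = np.nonzero(fg)
    return (xs.mean(), ys.mean())

def distance_traveled(centroids):
    """Sum of Euclidean steps along the centroid path, in pixels."""
    c = np.array(centroids, dtype=float)
    return float(np.sum(np.linalg.norm(np.diff(c, axis=0), axis=1)))

# Toy 5x5 video: a bright "animal" pixel moves three columns to the right.
bg = np.zeros((5, 5), dtype=np.uint8)
f1 = bg.copy(); f1[1, 1] = 255       # animal at x=1, y=1
f2 = bg.copy(); f2[1, 4] = 255       # animal at x=4, y=1
path = [centroid(f1, bg), centroid(f2, bg)]
print(distance_traveled(path))  # -> 3.0
```

Converting the pixel path to physical units would only require multiplying by a calibration factor (cm per pixel).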
Read more in their paper or download the software from the EthoWatcher webpage!
Crispim Junior, C. F., Pederiva, C. N., Bose, R. C., Garcia, V. A., Lino-de-Oliveira, C., & Marino-Neto, J. (2012). ETHOWATCHER: Validation of a tool for behavioral and video-tracking analysis in laboratory animals. Computers in Biology and Medicine, 42(2), 257-264. doi:10.1016/j.compbiomed.2011.12.002
September 19, 2018
In HardwareX, Brendan Drackley and colleagues share VASIC, an open-source weight-bearing device for high-throughput and unbiased behavioral pain assessment in rodents.
The assessment of pain in animal models is a key component in understanding and developing treatments for chronic pain. Drackley and colleagues developed VASIC (Voluntary Access Static Incapacitance Chamber), a modified version of the weight-bearing test. A brief water deprivation encourages rats or mice to seek water in a test chamber equipped with weighing platforms under the water spout, allowing assessment of weight shifting toward the unaffected side in animals with nerve injury or inflammatory pain. The design incorporates a custom printed circuit board (available in the paper), an infrared sensor, an Arduino microcontroller, 3D-printed parts, and open-source analysis software. A full parts list, links to files, and data from a validation study are available in their paper.
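The underlying measure is simple: while the animal drinks, compare the load on the two platforms. A healthy animal bears weight roughly 50/50; after a unilateral injury, the affected side's share drops. An illustrative computation (not from the VASIC code; values are invented):

```python
def weight_bearing_ratio(left_g, right_g):
    """Fraction of weight borne on the left platform (0..1).

    left_g, right_g: mean platform readings in grams during a drinking
    bout. Returns 0.5 when no load is registered on either platform.
    """
    total = left_g + right_g
    return left_g / total if total else 0.5

# e.g., injured left hind limb: the animal shifts weight to the right
print(round(weight_bearing_ratio(12.0, 28.0), 2))  # -> 0.3
```

Averaging this ratio over many voluntary drinking bouts is what makes the measure high-throughput and free of experimenter handling bias.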
Read more here!
September 12, 2018
In Frontiers in Neuroinformatics, Jason Rothman and R. Angus Silver share NeuroMatic, an open-source toolkit for acquiring, analyzing and simulating electrophysiological data.
Data acquisition, analysis, and simulation are key components of understanding neural activity from electrophysiological recordings. Traditionally, these three components of ephys work have been handled by separate software tools. NeuroMatic was developed to merge them into a single package capable of performing a variety of patch-clamp recordings, data analysis routines, and simulations of neural activity. Additionally, thanks to its open-source, modular design in WaveMetrics Igor Pro, NeuroMatic allows users to develop their own analysis functions that can be easily incorporated into its framework. By integrating acquisition, analysis, and simulation, researchers can preserve experimental metadata and track the analysis performed in real time, without juggling separate software packages.
Read more about NeuroMatic here!
Or check out their website and GitHub.