An automated behavioral platform for multisensory decision-making in mice

Fatemeh Yousefi1

1 Forschungszentrum Jülich GmbH

Many studies in systems neuroscience that aim to link neural activity patterns to cognitive functions involve training animals to perform complex behavioral tasks. However, conventional training procedures are highly time- and labor-intensive and often suffer from large variability in experimental results due to unknown environmental factors or interactions between animals and experimenters that affect animal behavior. To address this issue, we created a fully automated hallway system that connects home cages to different types of behavioral setups, allowing individual animals to be trained in perceptual tasks without direct interaction with human experimenters. The system uses RFID sensors to identify individual animals and video-based object detection to ensure that only one animal at a time moves through the hallway. All components are open-source and easy to replicate. Building on this automated system, we integrated a recently developed touchscreen chamber, allowing mice to freely enter it under a pre-defined training protocol and be trained in different visual and multisensory paradigms. To track the training progress of each animal in its respective paradigm, we also created an SQL database that manages the training parameters of each animal together with the corresponding behavioral data, allowing experimenters to maintain an overview of the progression of different projects. We are currently working to extend our approach to be compatible with commonly used behavioral platforms, such as Bpod, with the ultimate aim of creating a lab-wide standardized platform for behavioral training and for the acquisition and management of experimental data.
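The access-control logic described above (RFID identification combined with video-based occupancy detection) can be sketched as follows. This is an illustrative sketch only, not the authors' implementation; the tag IDs, dictionary, and function names are hypothetical.

```python
# Hypothetical sketch of the hallway gating rule: an RFID read identifies
# the animal, and a video-based occupancy count confirms that the animal
# is moving through the hallway alone before passage is granted.

# Example registry mapping RFID tag IDs to animal names (invented values).
KNOWN_TAGS = {"04A1B2C3": "mouse_01", "04D4E5F6": "mouse_02"}

def grant_access(tag_id: str, occupancy_count: int) -> bool:
    """Allow passage only for a registered animal detected alone."""
    return tag_id in KNOWN_TAGS and occupancy_count == 1
```

Under this rule, an unknown tag or a second animal detected in the hallway blocks the transition, which is what prevents two mice from entering the behavioral setup together.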
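A database like the one described, linking each animal's training parameters to its behavioral data, might be organized as in the following sketch. The schema, table names, and columns here are assumptions for illustration (using SQLite), not the actual database design.

```python
# Minimal sketch of a training database: one table of animals with their
# assigned paradigm, one table of training sessions. All names and values
# are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE animals (
    rfid TEXT PRIMARY KEY,       -- tag read by the hallway sensors
    name TEXT,
    paradigm TEXT                -- e.g. 'visual' or 'multisensory'
);
CREATE TABLE sessions (
    id INTEGER PRIMARY KEY,
    rfid TEXT REFERENCES animals(rfid),
    started_at TEXT,             -- ISO timestamp
    training_stage INTEGER,      -- position in the pre-defined protocol
    n_trials INTEGER,
    hit_rate REAL
);
""")
conn.execute("INSERT INTO animals VALUES ('04A1B2C3', 'mouse_01', 'visual')")
conn.execute(
    "INSERT INTO sessions (rfid, started_at, training_stage, n_trials, hit_rate) "
    "VALUES ('04A1B2C3', '2024-01-15T09:30:00', 2, 310, 0.78)"
)

# A join like this yields the kind of per-project progress overview
# mentioned in the text.
row = conn.execute(
    "SELECT a.name, a.paradigm, s.training_stage, s.hit_rate "
    "FROM sessions s JOIN animals a USING (rfid)"
).fetchone()
```

Keeping training parameters and behavioral results keyed to the same RFID means the hallway, the touchscreen chamber, and the experimenter's overview all refer to one record per animal.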