3 Minute 3Rs February 2019

3 Minute 3Rs - A podcast by the NC3Rs, the North American 3Rs Collaborative, and Lab Animal

This is the February episode of 3-Minute 3Rs, brought to you by the North American 3Rs Collaborative (www.na3rsc.org), the NC3Rs (www.nc3rs.org.uk), and Lab Animal (www.nature.com/laban).

The papers behind the pod:
1. 3D-printed Wash Station with Integrated Anesthesia Delivery Manifold for High-throughput Depilation of Laboratory Mice. https://bit.ly/2IpGrAP
2. DeepSqueak: a deep learning-based system for detection and analysis of ultrasonic vocalizations. https://go.nature.com/2tupSJK
3. Super-Mendelian inheritance mediated by CRISPR–Cas9 in the female mouse germline. https://go.nature.com/2RgnoYw

[NA3RsC] Depilation (hair removal) is needed for many procedures done with laboratory mice, including surgery and imaging. Commercial depilatory creams are a safe, efficacious, and often-used method to achieve this hair removal. However, the use of these creams tends to be messy and time consuming, and procedural variation between laboratories is wide. Furthermore, mice are generally under anesthesia during depilation, so increasing the efficiency of the process is a valuable refinement that minimizes anesthesia-associated adverse effects. Researchers from the University of Notre Dame have designed a 3D-printed device to simplify, standardize, and streamline the hair removal process. The Mouse Depilation Station (MDS) consists of an elevated stage with flow-through slats and is designed for integration with a self-scavenging anesthesia manifold and waste collection tray. It allows three mice to be depilated in parallel, improving efficiency compared with processing mice in sequence. The design files for the MDS stage are available online for free download – read the full article in JAALAS for the link.

[NC3Rs] Imagine if laboratory animals could tell us how they were feeling. Scientists at the University of Washington in Seattle have moved this a step closer by developing DeepSqueak, a system to decipher rodent ultrasonic vocalisations. Rats and mice communicate with one another using a characteristic set of squeaks, most of which fall outside the human range of hearing. Writing in the journal Neuropsychopharmacology, Coffey et al. have used deep learning algorithms, similar to those that power Alexa’s speech recognition abilities, to identify and categorise rodent ‘syllables’. The rodent chatter can be linked to behaviours and physiological parameters, allowing an electronic dictionary of rodent calls to be built. One particular area of interest is neuroscience research, where rodent behavioural tests are routinely applied to study a diverse collection of outcomes and diseases. By using fully automated monitoring of ultrasonic vocalisations during experiments, an additional level of behavioural information can be collected, providing a non-invasive method for monitoring rodent welfare.

[LA] All animals inherit their genes from their parents, but which of a parent’s two copies of a gene gets passed on to a given pup is a 50/50 coin flip. When you want to make a mouse model of a particular genetic mutation, that often means a lot of animals are bred that don’t have the desired change. But what if you could weight that probability? It’s been demonstrated already in insects, but Kimberly Cooper from the University of California San Diego and her colleagues bring active genetics to mammals for the first time in a new publication in Nature. They developed a CRISPR/Cas9 construct that could produce a litter of mice in which over 70% of the pups had the desired mutation.
So far the approach has only worked in females, but it proves that it is feasible to weight the genetic coin in mice.
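As a rough back-of-the-envelope sketch (our own illustration, not a calculation from the paper): a standard heterozygous carrier passes the edited allele to each pup with probability 1/2. If the CRISPR construct also converts a fraction c of the wild-type alleles in the germline into the edited version, the expected share of pups carrying the edit becomes

f = 1/2 + c/2

so an illustrative conversion efficiency of c around 0.4–0.5 would already push the carrier fraction past the 70% figure mentioned above. The real system is more complicated (for example, not every cut is repaired as a faithful copy of the edit), but this is the sense in which the coin gets weighted.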
