Most of what we do on a day-to-day basis involves the ongoing and fluid coordination between our senses and our actions. For example, making a cup of tea involves processing a constant stream of visual and tactile (touch) information to continuously correct how we move our muscles in order to avoid spilling milk or breaking a mug. Our brains coordinate this flow from sensory input to motor output effortlessly, yet it is an ability that modern engineering still cannot rival; think of the clumsy robots in this year's DARPA Robotics Challenge. The ability to coordinate sensory input and motor actions is also often impaired in conditions such as Parkinson's disease and dyspraxia. Understanding this ability is therefore not only a central goal of modern neuroscience but also one that promises to deliver advances in engineering and facilitate the treatment of disease.
Over the last century, neuroscience research has revealed that areas across the brain are required to process sensory and motor information during active behaviours. These include sensory and motor systems but also areas such as the cerebellum and the basal ganglia. While a partial understanding of the underlying processes in specific circuits has been achieved, a full understanding would ideally require recording neural activity from across the brain in a behaving animal. This type of experiment has been impossible in the past for two reasons: (1) neural recording techniques require a great degree of stability between recording devices and neural tissue, so most experiments involve heavily restrained or anaesthetized animals, which prevents meaningful brain-environment interactions; (2) typical brain recordings have been limited to either small numbers of neurons at cellular resolution or indirect recordings from large areas of brain tissue at low spatial and/or temporal resolution. To address these challenges we will combine advanced techniques in experimental and computational neuroscience. First, a virtual-reality environment for larval zebrafish, which will allow us to record from a non-moving brain while the fish performs fictive (intended) swimming behaviour. Second, light-sheet microscopy, a technique that can simultaneously image tens of thousands of neurons across the zebrafish brain. Third, distributed computing techniques, which will enable us to analyse the enormous data sets (up to a terabyte per trial) acquired in these experiments.
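To illustrate the scale and style of the distributed analysis, the sketch below computes, plane by plane and in parallel, the correlation of each voxel's fluorescence time series with a fictive swim regressor. This is a minimal sketch only: the file layout, array shapes and names (e.g. planes/plane_*.npy, swim_regressor.npy) are hypothetical placeholders, not the actual pipeline or data format we will use.

```python
# Minimal sketch of a parallel analysis step for large light-sheet datasets:
# correlating every voxel's fluorescence time series with a motor (swim)
# regressor, one imaging plane at a time. File layout, array shapes and names
# are illustrative assumptions, not the project's actual pipeline.
import glob
from multiprocessing import Pool

import numpy as np


def plane_correlation(args):
    """Correlate each pixel's time series in one z-plane with the swim regressor."""
    plane_path, regressor = args
    plane = np.load(plane_path)                  # shape: (time, y, x), one z-plane
    t, y, x = plane.shape
    traces = plane.reshape(t, -1).astype(np.float64)
    traces -= traces.mean(axis=0)
    reg = regressor - regressor.mean()
    denom = traces.std(axis=0) * reg.std() * t
    denom[denom == 0] = np.inf                   # flat pixels get correlation 0
    corr = (traces * reg[:, None]).sum(axis=0) / denom
    return corr.reshape(y, x)


if __name__ == "__main__":
    swim_regressor = np.load("swim_regressor.npy")          # fictive swim power per frame
    plane_files = sorted(glob.glob("planes/plane_*.npy"))   # one file per z-plane
    with Pool() as pool:
        corr_maps = pool.map(plane_correlation,
                             [(p, swim_regressor) for p in plane_files])
    np.save("swim_correlation_volume.npy", np.stack(corr_maps))  # (z, y, x) map
```

In practice the same map-style computation would be spread over a compute cluster rather than the cores of one machine, but the structure of the analysis, namely independent per-plane jobs reduced into a brain-wide map, is the same.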
We will use these tools to address three fundamental questions about brain function in behaving animals. First, when animals actively engage with the world, the brain receives two types of sensory input: input caused by changes in the external world, e.g. the optic flow experienced by a fish as water sweeps past its retina, and input that is a consequence of the animal's own actions, e.g. the optic flow that results from the fish's own swimming. These two types of input convey different kinds of information but arrive together on the retina; a central question we will ask is therefore what brain-wide circuits allow the fish to distinguish between them. Second, animals readily adapt their behaviour when the sensory inputs caused by their own actions do not meet their expectations. For example, fish modulate the strength of their swimming when changes in water viscosity cause a mismatch between the actual and expected consequences of swimming, i.e. when their swimming does not propel them as far as they expect. We will ask which distributed neural circuits allow fish to detect these mismatch errors. Third, we will combine our results to produce a biologically plausible model of closed-loop control in an actively swimming fish that reproduces our experimental observations and could be used to inspire robotic control systems.
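To make the closed-loop idea concrete, the sketch below simulates a fish whose internal forward model uses an efference copy of the motor command to predict the expected optic flow; the residual between observed and predicted flow serves both as an estimate of externally caused flow and as a mismatch error that adapts the model when, for instance, water viscosity changes. The linear forward model, the gain values and the learning rate are illustrative assumptions chosen for exposition, not claims about zebrafish circuitry or the model we will ultimately build.

```python
# Minimal sketch of closed-loop, efference-copy-based mismatch detection.
# Observed optic flow = self-generated flow (environment gain) + external flow.
# The forward model predicts the self-generated part; the residual (mismatch)
# estimates the external contribution and drives adaptation of the model gain.
# All numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_trials = 200
true_gain = 1.0          # how strongly a motor command actually moves the fish
model_gain = 0.5         # the brain's current estimate of that gain (forward model)
learning_rate = 0.05
external_flow = rng.normal(0.0, 0.2, n_trials)   # flow caused by the world (e.g. a current)

for t in range(n_trials):
    if t == n_trials // 2:
        true_gain = 0.4  # e.g. increased water viscosity: swimming propels the fish less

    motor_command = 1.0                                    # fixed swim-bout strength
    observed_flow = -true_gain * motor_command + external_flow[t]
    predicted_flow = -model_gain * motor_command           # efference-copy prediction
    mismatch = observed_flow - predicted_flow              # reafference prediction error
    estimated_external = mismatch                          # what the world contributed

    # adapt the forward model so its predictions match the self-generated flow
    model_gain += learning_rate * -mismatch * motor_command

print(f"final gain estimate: {model_gain:.2f} (true gain: {true_gain:.2f})")
```

In the full model the same mismatch signal would also be used to modulate the strength of subsequent swim bouts, reproducing the behavioural adaptation described above.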
This project will develop new techniques to record and analyse large neural datasets and provide unique insight into the distributed and dynamic nature of brain function necessary for successful active behaviour.