To understand how neurons transform sensory information into behaviour, a typical approach is to train an animal and a deep neural network (DNN) model on the same behavioural task and then test whether DNN unit activity predicts neural responses to sensory stimuli. However, DNNs trained in this way do not reveal how individual neurons causally contribute to behaviour, and recording neural activity during complex natural behaviours is difficult. To overcome these challenges, in a new study, Cowley et al. developed a modelling method termed knockout training and used it to investigate how visual information is transformed into courtship behaviours in male Drosophila melanogaster interacting with female conspecifics.
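The conventional model-to-brain comparison described above is often implemented as a regularized regression from DNN unit activations to recorded neural responses, scored on held-out stimuli. The following is a minimal sketch of that idea using synthetic stand-in data; all variable names and dimensions here are hypothetical, not taken from the study.

```python
# Minimal sketch: does DNN unit activity predict neural responses?
# Synthetic data stand in for real recordings; all names are hypothetical.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_stimuli, n_units, n_neurons = 500, 64, 20
dnn_activity = rng.normal(size=(n_stimuli, n_units))        # DNN responses to stimuli
neural_responses = (dnn_activity @ rng.normal(size=(n_units, n_neurons))
                    + 0.5 * rng.normal(size=(n_stimuli, n_neurons)))  # noisy "recordings"

# Fit a ridge regression from DNN units to neurons, then score predictivity
# on held-out stimuli (R^2 averaged across neurons).
X_tr, X_te, y_tr, y_te = train_test_split(dnn_activity, neural_responses,
                                          test_size=0.25, random_state=0)
fit = Ridge(alpha=1.0).fit(X_tr, y_tr)
print("held-out predictivity (R^2):", fit.score(X_te, y_te))
```

Note that even a high held-out R^2 in this setup establishes only a correlational mapping between units and neurons, which is the limitation knockout training is designed to address.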
Knockout training creates a DNN with a one-to-one mapping between its units and neurons in a behaving animal's brain. During training, a DNN unit is perturbed while the model learns from behavioural data obtained from an animal in which the corresponding neuron has also been perturbed. This constrains the model so that each unit's activity must correspond to the behavioural contribution of its matched neuron. The authors applied knockout training to the visual system of male flies, in which mappings between visual sensory features, specific behaviours and specific visual projection neurons — the lobula columnar and lobula plate neurons (LC neuron types) — have previously been established. LC neurons link the optic lobes to the central fly brain, receiving sensory input from the former and sending axons to the optic glomeruli of the latter to drive behaviour. Thus, each DNN unit corresponded to a single optic glomerulus innervated by one LC neuron type.
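To make the logic of knockout training concrete, the sketch below shows one way the masking scheme could be implemented: a bottleneck layer with one unit per LC neuron type, where the unit matching the genetically silenced neuron is zeroed during the forward pass on that animal's behavioural data. This is a minimal illustration under assumed dimensions and a toy training loop, not the authors' implementation; the network sizes, data and names are hypothetical.

```python
# Minimal sketch of knockout training (not the authors' implementation).
# Assumptions (hypothetical): behavioural data come as (stimulus, behaviour)
# pairs, each dataset tagged with the index of the silenced LC neuron type
# (or None for intact flies); the DNN bottleneck has one unit per LC type.
import torch
import torch.nn as nn

N_LC_TYPES = 8     # hypothetical number of modelled LC neuron types
STIM_DIM = 32      # hypothetical dimensionality of the visual stimulus
BEHAV_DIM = 3      # hypothetical behavioural outputs (e.g., movement parameters)

class KnockoutNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(STIM_DIM, 64), nn.ReLU(),
                                     nn.Linear(64, N_LC_TYPES))
        self.decoder = nn.Sequential(nn.ReLU(), nn.Linear(N_LC_TYPES, BEHAV_DIM))

    def forward(self, stim, knocked_out=None):
        units = self.encoder(stim)            # one bottleneck unit per LC type
        if knocked_out is not None:
            mask = torch.ones(N_LC_TYPES)
            mask[knocked_out] = 0.0           # silence the matched unit,
            units = units * mask              # mirroring the neural knockout
        return self.decoder(units)

model = KnockoutNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Toy training loop over per-knockout datasets (synthetic stand-in data):
# one dataset from intact flies (ko=None) plus one per silenced LC type.
datasets = [(torch.randn(128, STIM_DIM), torch.randn(128, BEHAV_DIM), ko)
            for ko in [None] + list(range(N_LC_TYPES))]
for epoch in range(5):
    for stim, behav, ko in datasets:
        opt.zero_grad()
        loss = loss_fn(model(stim, knocked_out=ko), behav)
        loss.backward()
        opt.step()
```

Because each unit is forced to be absent exactly when its matched neuron is absent, the model can only fit the behavioural differences across knockout animals by routing each neuron's behavioural contribution through its designated unit, which is what establishes the one-to-one correspondence.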