---
res:
  bibo_abstract:
    - "Neuronal computations depend on synaptic connectivity and intrinsic electrophysiological properties. Synaptic connectivity determines which inputs from presynaptic neurons are integrated, while cellular properties determine how inputs are filtered over time. Unlike their biological counterparts, most computational approaches to learning in simulated neural networks are limited to changes in synaptic connectivity. However, if intrinsic parameters change, neural computations are altered drastically. Here, we include the parameters that determine the intrinsic properties, e.g., time constants and reset potential, into the learning paradigm. Using sparse feedback signals that indicate target spike times, and gradient-based parameter updates, we show that the intrinsic parameters can be learned along with the synaptic weights to produce specific input-output functions. Specifically, we use a teacher-student paradigm in which a randomly initialised leaky integrate-and-fire or resonate-and-fire neuron must recover the parameters of a teacher neuron. We show that complex temporal functions can be learned online and without backpropagation through time, relying on event-based updates only. Our results are a step towards online learning of neural computations from ungraded and unsigned sparse feedback signals with a biologically inspired learning mechanism.@eng"
  bibo_authorlist:
    - foaf_Person:
        foaf_givenName: Lukas
        foaf_name: Braun, Lukas
        foaf_surname: Braun
    - foaf_Person:
        foaf_givenName: Tim P
        foaf_name: Vogels, Tim P
        foaf_surname: Vogels
        foaf_workInfoHomepage: http://www.librecat.org/personId=CB6FF8D2-008F-11EA-8E08-2637E6697425
        orcid: 0000-0003-3295-6181
  bibo_volume: 20
  dct_date: 2021^xs_gYear
  dct_isPartOf:
    - http://id.crossref.org/issn/1049-5258
    - http://id.crossref.org/issn/9781713845393
  dct_language: eng
  dct_publisher: Neural Information Processing Systems Foundation@
  dct_title: Online learning of neural computations from sparse temporal feedback@
...