[{"publication_identifier":{"eissn":["2522-5812"]},"OA_type":"hybrid","oa_version":"Published Version","title":"Mitochondrial Ca2+ efflux controls neuronal metabolism and long-term memory across species","date_created":"2026-03-02T10:04:49Z","publication_status":"published","file":[{"access_level":"open_access","content_type":"application/pdf","success":1,"file_size":5326608,"date_created":"2026-03-02T15:21:27Z","checksum":"365932a599d05bc9ce8a57204e7a1465","file_name":"2026_NatureMetab_AmrapaliVishwanath.pdf","file_id":"21392","date_updated":"2026-03-02T15:21:27Z","creator":"dernst","relation":"main_file"}],"publisher":"Springer Nature","article_processing_charge":"Yes (in subscription journal)","tmp":{"short":"CC BY (4.0)","image":"/images/cc_by.png","legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)"},"pmid":1,"doi":"10.1038/s42255-026-01451-w","oa":1,"acknowledgement":"We thank all members of the laboratory of J.d.J.-S. for insightful discussions and comments. We thank S. Perez for technical assistance. This work was made possible by the Paris Brain Institute Diane Barriere Chair in Synaptic Bioenergetics awarded to J.d.J.-S., who is also supported by an ERC Starting Grant (SynaptoEnergy, European Research Council; ERC-StG-852873), 2019 ATIP-Avenir Grant (CNRS, Inserm), a Big Brain Theory Grant (ICM Foundation) and a Kavli Exploratory Award (Kavli Foundation). This work was also supported by an ERC Advanced Grant (EnergyMeMo; ERC-AdG-741550) to T.P. and grants from the Agence Nationale de la Recherche to P.Y.P. (ANR-20-CE92-0047-01), T.P. (ANR-23-CE16-0029-01), A.P. and J.d.J.-S. (ANR-22-CE16-0020) and J.d.J.-S. (ANR-24-CE16-0221). T.P., P.Y.P. and J.d.J.-S. are permanent CNRS researchers. A.P. is a permanent ESPCI associate professor. T.C. was funded by the French Ministry of Research and the Fondation pour la Recherche Médicale. V.R. 
was funded by the Max Planck Society, the Chan Zuckerberg Initiative DAF, an advised fund of the Silicon Valley Community Foundation grant number 2024-349543 and the NIH Director’s New Innovator Award (DP2 MH140148). A.B.-G. and C.R.-D. received funding from an ERC Starting Grant (HighMemory; ERC-StG-948217), the Ministry of Economy and Competitiveness (PID2021-122795OB-I00) and the Departament d’Economia i Coneixement de la Generalitat de Catalunya (SGR 00022). T.P.V. was funded by the Wellcome Trust and a Royal Society Sir Henry Dale Research Fellowship (WT100000) and a Wellcome Trust Senior Research Fellowship (214316/Z/18/Z). K.G. was supported by the DIM C-BRAINS, funded by the Conseil Régional d’Ile-de-France. The contributions of H.F. and E.R.S. were supported by the Howard Hughes Medical Institute. The PHENO-ICMice animal Core at ICM is supported by two ‘Investissements d’avenir’ (ANR-10-IAIHU-06 and ANR-11-INBS-0011-NeurATRIS) and the Fondation pour la Recherche Médicale.","month":"02","department":[{"_id":"TiVo"}],"article_type":"original","project":[{"grant_number":"214316/Z/18/Z","_id":"c084a126-5a5b-11eb-8a69-d75314a70a87","name":"What’s in a memory? 
Spatiotemporal dynamics in strongly coupled recurrent neuronal networks."}],"author":[{"last_name":"Amrapali Vishwanath","full_name":"Amrapali Vishwanath, Anjali","first_name":"Anjali"},{"first_name":"Typhaine","full_name":"Comyn, Typhaine","last_name":"Comyn"},{"first_name":"Rodrigo G.","full_name":"Mira, Rodrigo G.","last_name":"Mira"},{"first_name":"Claire","last_name":"Brossier","full_name":"Brossier, Claire"},{"first_name":"Carlos","full_name":"Pascual-Caro, Carlos","last_name":"Pascual-Caro"},{"first_name":"Maya","last_name":"Faour","full_name":"Faour, Maya"},{"first_name":"Kahina","last_name":"Boumendil","full_name":"Boumendil, Kahina"},{"id":"BA06AFEE-A4BA-11EA-AE5C-14673DDC885E","first_name":"Chaitanya","orcid":"0000-0003-4252-1608","full_name":"Chintaluri, Chaitanya","last_name":"Chintaluri"},{"full_name":"Ramon-Duaso, Carla","last_name":"Ramon-Duaso","first_name":"Carla"},{"full_name":"Fan, Ruolin","last_name":"Fan","first_name":"Ruolin"},{"first_name":"Kishalay","last_name":"Ghosh","full_name":"Ghosh, Kishalay"},{"first_name":"Helen","full_name":"Farrants, Helen","last_name":"Farrants"},{"first_name":"Jean-Paul","full_name":"Berwick, Jean-Paul","last_name":"Berwick"},{"first_name":"Riya","full_name":"Sivakumar, Riya","last_name":"Sivakumar"},{"full_name":"Lopez-Manzaneda, Mario","last_name":"Lopez-Manzaneda","first_name":"Mario"},{"last_name":"Schreiter","full_name":"Schreiter, Eric R.","first_name":"Eric R."},{"last_name":"Preat","full_name":"Preat, Thomas","first_name":"Thomas"},{"first_name":"Tim P","id":"CB6FF8D2-008F-11EA-8E08-2637E6697425","orcid":"0000-0003-3295-6181","full_name":"Vogels, Tim P","last_name":"Vogels"},{"full_name":"Rangaraju, Vidhya","last_name":"Rangaraju","first_name":"Vidhya"},{"last_name":"Busquets-Garcia","full_name":"Busquets-Garcia, Arnau","first_name":"Arnau"},{"full_name":"Plaçais, Pierre-Yves","last_name":"Plaçais","first_name":"Pierre-Yves"},{"last_name":"Pavlowsky","full_name":"Pavlowsky, 
Alice","first_name":"Alice"},{"first_name":"Jaime","last_name":"de Juan-Sanz","full_name":"de Juan-Sanz, Jaime"}],"volume":8,"user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","ddc":["570"],"year":"2026","intvolume":"         8","OA_place":"publisher","date_published":"2026-02-11T00:00:00Z","quality_controlled":"1","type":"journal_article","external_id":{"pmid":["41673453"]},"_id":"21378","language":[{"iso":"eng"}],"abstract":[{"lang":"eng","text":"From insects to mammals, essential brain functions, such as forming long-term memories (LTMs), increase metabolic activity in stimulated neurons to meet the energetic demand associated with brain activation. However, while impairing neuronal metabolism limits brain performance, whether expanding the metabolic capacity of neurons boosts brain function remains poorly understood. Here, we show that LTM formation of flies and mice can be enhanced by increasing mitochondrial metabolism in central memory circuits. By knocking down the mitochondrial Ca2+ exporter Letm1, we favour Ca2+ retention in the mitochondrial matrix of neurons due to reduction of mitochondrial H+/Ca2+ exchange. The resulting increase in mitochondrial Ca2+ over-activates mitochondrial metabolism in neurons of central memory circuits, leading to improved LTM storage in training paradigms in which wild-type counterparts of both species fail to remember. Our findings unveil an evolutionarily conserved mechanism that controls mitochondrial metabolism in neurons and indicate its involvement in shaping higher brain functions, such as LTM."}],"day":"11","citation":{"ama":"Amrapali Vishwanath A, Comyn T, Mira RG, et al. Mitochondrial Ca2+ efflux controls neuronal metabolism and long-term memory across species. <i>Nature Metabolism</i>. 2026;8(2):467-488. 
doi:<a href=\"https://doi.org/10.1038/s42255-026-01451-w\">10.1038/s42255-026-01451-w</a>","ista":"Amrapali Vishwanath A, Comyn T, Mira RG, Brossier C, Pascual-Caro C, Faour M, Boumendil K, Chintaluri C, Ramon-Duaso C, Fan R, Ghosh K, Farrants H, Berwick J-P, Sivakumar R, Lopez-Manzaneda M, Schreiter ER, Preat T, Vogels TP, Rangaraju V, Busquets-Garcia A, Plaçais P-Y, Pavlowsky A, de Juan-Sanz J. 2026. Mitochondrial Ca2+ efflux controls neuronal metabolism and long-term memory across species. Nature Metabolism. 8(2), 467–488.","chicago":"Amrapali Vishwanath, Anjali, Typhaine Comyn, Rodrigo G. Mira, Claire Brossier, Carlos Pascual-Caro, Maya Faour, Kahina Boumendil, et al. “Mitochondrial Ca2+ Efflux Controls Neuronal Metabolism and Long-Term Memory across Species.” <i>Nature Metabolism</i>. Springer Nature, 2026. <a href=\"https://doi.org/10.1038/s42255-026-01451-w\">https://doi.org/10.1038/s42255-026-01451-w</a>.","mla":"Amrapali Vishwanath, Anjali, et al. “Mitochondrial Ca2+ Efflux Controls Neuronal Metabolism and Long-Term Memory across Species.” <i>Nature Metabolism</i>, vol. 8, no. 2, Springer Nature, 2026, pp. 467–88, doi:<a href=\"https://doi.org/10.1038/s42255-026-01451-w\">10.1038/s42255-026-01451-w</a>.","short":"A. Amrapali Vishwanath, T. Comyn, R.G. Mira, C. Brossier, C. Pascual-Caro, M. Faour, K. Boumendil, C. Chintaluri, C. Ramon-Duaso, R. Fan, K. Ghosh, H. Farrants, J.-P. Berwick, R. Sivakumar, M. Lopez-Manzaneda, E.R. Schreiter, T. Preat, T.P. Vogels, V. Rangaraju, A. Busquets-Garcia, P.-Y. Plaçais, A. Pavlowsky, J. de Juan-Sanz, Nature Metabolism 8 (2026) 467–488.","ieee":"A. Amrapali Vishwanath <i>et al.</i>, “Mitochondrial Ca2+ efflux controls neuronal metabolism and long-term memory across species,” <i>Nature Metabolism</i>, vol. 8, no. 2. Springer Nature, pp. 467–488, 2026.","apa":"Amrapali Vishwanath, A., Comyn, T., Mira, R. G., Brossier, C., Pascual-Caro, C., Faour, M., … de Juan-Sanz, J. (2026). 
Mitochondrial Ca2+ efflux controls neuronal metabolism and long-term memory across species. <i>Nature Metabolism</i>. Springer Nature. <a href=\"https://doi.org/10.1038/s42255-026-01451-w\">https://doi.org/10.1038/s42255-026-01451-w</a>"},"has_accepted_license":"1","page":"467-488","status":"public","PlanS_conform":"1","scopus_import":"1","issue":"2","file_date_updated":"2026-03-02T15:21:27Z","date_updated":"2026-03-02T15:23:10Z","publication":"Nature Metabolism"},{"doi":"10.1103/PhysRevX.15.011057","oa":1,"acknowledgement":"We thank Helen Barron, Vezha Boboeva, Adam Packer, João Sacramento, Andrew Saxe, Misha Tsodyks, and Friedemann Zenke for helpful comments at various stages of this work, and Rubem Erichsen, Jr. for carefully reading the manuscript and valuable comments. This work was\r\nsupported by a Sir Henry Dale Fellowship by the Wellcome Trust and the Royal Society [No. WT100000 (W. F. P., E. J. A., and T. P. V.)], a Wellcome Trust Senior Research Fellowship [No. 214316/Z/18/Z (E. J. A. and T. P. V.)], and a Research Project Grant by the Leverhulme Trust\r\n[No. RPG-2016-446 (E. J. A.)]. ","department":[{"_id":"TiVo"}],"month":"03","project":[{"_id":"B67AFEDC-15C9-11EA-A837-991A96BB2854","name":"IST Austria Open Access Fund"},{"name":"What’s in a memory? 
Spatiotemporal dynamics in strongly coupled recurrent neuronal networks.","grant_number":"214316/Z/18/Z","_id":"c084a126-5a5b-11eb-8a69-d75314a70a87"}],"article_type":"original","volume":15,"author":[{"orcid":"0000-0001-6619-7502","first_name":"William F.","full_name":"Podlaski, William F.","last_name":"Podlaski"},{"last_name":"Agnes","full_name":"Agnes, Everton J.","orcid":"0000-0001-7184-7311","first_name":"Everton J."},{"id":"CB6FF8D2-008F-11EA-8E08-2637E6697425","first_name":"Tim P","orcid":"0000-0003-3295-6181","full_name":"Vogels, Tim P","last_name":"Vogels"}],"publisher":"American Physical Society","article_processing_charge":"Yes","corr_author":"1","tmp":{"short":"CC BY (4.0)","image":"/images/cc_by.png","legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)"},"related_material":{"link":[{"url":"https://github.com/wpodlaski/contextual-memory-nets","relation":"software"}]},"publication_status":"published","file":[{"content_type":"application/pdf","success":1,"file_size":1373704,"date_created":"2025-03-20T12:47:17Z","access_level":"open_access","creator":"dernst","relation":"main_file","date_updated":"2025-03-20T12:47:17Z","file_id":"19432","checksum":"1f27ee469ab51a3e1ce1e2df0022e81d","file_name":"2025_PhysReviewX_Podlaski.pdf"}],"publication_identifier":{"eissn":["2160-3308"]},"OA_type":"gold","oa_version":"Published Version","date_created":"2020-07-16T12:24:28Z","title":"High capacity and dynamic accessibility in associative memory networks with context-dependent neuronal and synaptic gating","file_date_updated":"2025-03-20T12:47:17Z","isi":1,"publication":"Physical Review X","date_updated":"2025-05-19T13:51:27Z","status":"public","locked":"1","scopus_import":"1","quality_controlled":"1","type":"journal_article","external_id":{"isi":["001451378900002"]},"_id":"8125","abstract":[{"lang":"eng","text":"Biological memory is known to be flexible—memory formation 
and recall depend on factors such as the behavioral context of the organism. However, this property is often ignored in associative memory models, leaving it unclear how memories can be organized and recalled when subject to contextual control. Because of the lack of a rigorous analytical framework, it is also unknown how contextual control affects memory stability, storage capacity, and information content. Here, we bring the dynamic nature of memory to the fore by introducing a novel model of associative memory, which we refer to as the context-modular memory network. In our model, stored memory patterns are associated to one of several background network states, or contexts. Memories are accessible when their corresponding context is active, and are otherwise inaccessible. Context modulates the effective network connectivity by imposing a specific\r\nconfiguration of neuronal and synaptic gating—gated neurons (synapses) have their activity (weights) momentarily silenced, thereby reducing interference from memories belonging to other contexts. Memory patterns are randomly and independently chosen, while neuronal and synaptic gates may be selected randomly or optimized through a process of contextual synaptic refinement. Through analytic and numerical results, we show that context-modular memory networks can exhibit both improved memory capacity and differential control of memory stability with random gating (especially for neuronal gating). For contextual synaptic refinement, we devise a method in which synapses are gated off for a given context if they destabilize the memory patterns in that context, drastically improving memory capacity and enabling even more precise control over memory stability. Notably, synaptic refinement allows for patterns to be\r\naccessible in multiple contexts, stabilizing memory patterns even for weight matrices that alone do not contain any information about the memory patterns, such as Gaussian random matrices. 
Overall, our model integrates recent ideas about context-dependent memory organization with classic associative memory models and proposes a rigorous theory which can act as a framework for future work. Furthermore, our work carries important implications for the understanding of biological memory storage and recall in the brain, such as highlighting an intriguing trade-off between memory capacity and accessibility."}],"language":[{"iso":"eng"}],"day":"13","citation":{"apa":"Podlaski, W. F., Agnes, E. J., &#38; Vogels, T. P. (2025). High capacity and dynamic accessibility in associative memory networks with context-dependent neuronal and synaptic gating. <i>Physical Review X</i>. American Physical Society. <a href=\"https://doi.org/10.1103/PhysRevX.15.011057\">https://doi.org/10.1103/PhysRevX.15.011057</a>","short":"W.F. Podlaski, E.J. Agnes, T.P. Vogels, Physical Review X 15 (2025).","ieee":"W. F. Podlaski, E. J. Agnes, and T. P. Vogels, “High capacity and dynamic accessibility in associative memory networks with context-dependent neuronal and synaptic gating,” <i>Physical Review X</i>, vol. 15. American Physical Society, 2025.","mla":"Podlaski, William F., et al. “High Capacity and Dynamic Accessibility in Associative Memory Networks with Context-Dependent Neuronal and Synaptic Gating.” <i>Physical Review X</i>, vol. 15, 011057, American Physical Society, 2025, doi:<a href=\"https://doi.org/10.1103/PhysRevX.15.011057\">10.1103/PhysRevX.15.011057</a>.","chicago":"Podlaski, William F., Everton J. Agnes, and Tim P Vogels. “High Capacity and Dynamic Accessibility in Associative Memory Networks with Context-Dependent Neuronal and Synaptic Gating.” <i>Physical Review X</i>. American Physical Society, 2025. <a href=\"https://doi.org/10.1103/PhysRevX.15.011057\">https://doi.org/10.1103/PhysRevX.15.011057</a>.","ista":"Podlaski WF, Agnes EJ, Vogels TP. 2025. 
High capacity and dynamic accessibility in associative memory networks with context-dependent neuronal and synaptic gating. Physical Review X. 15, 011057.","ama":"Podlaski WF, Agnes EJ, Vogels TP. High capacity and dynamic accessibility in associative memory networks with context-dependent neuronal and synaptic gating. <i>Physical Review X</i>. 2025;15. doi:<a href=\"https://doi.org/10.1103/PhysRevX.15.011057\">10.1103/PhysRevX.15.011057</a>"},"has_accepted_license":"1","ddc":["530"],"user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","year":"2025","article_number":"011057","intvolume":"        15","OA_place":"publisher","date_published":"2025-03-13T00:00:00Z"},{"tmp":{"short":"CC BY (4.0)","image":"/images/cc_by.png","legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)"},"corr_author":"1","article_processing_charge":"Yes (in subscription journal)","publisher":"National Academy of Sciences","pmid":1,"related_material":{"link":[{"relation":"software","url":"https://github.com/ccluri/metabolic_spiking"}]},"acknowledgement":"We thank Prof. C. Nazaret and Prof. J.-P. Mazat for sharing the code of their mitochondrial model. We also thank G. Miesenböck, E. Marder, L. Abbott, A. Kempf, P. Hasenhuetl, W. Podlaski, F. Zenke, E. Agnes, P. Bozelos, J. Watson, B. Confavreux, and G. Christodoulou, and the rest of the Vogels Lab for their feedback. 
This work was funded by Wellcome Trust and Royal Society Sir Henry Dale Research Fellowship (WT100000), a Wellcome Trust Senior Research Fellowship (214316/Z/18/Z), and a UK Research and Innovation, Biotechnology and Biological Sciences Research Council grant (UKRI-BBSRC BB/N019512/1).","oa":1,"doi":"10.1073/pnas.2306525120","author":[{"full_name":"Chintaluri, Chaitanya","last_name":"Chintaluri","first_name":"Chaitanya","id":"E4EDB536-3485-11EA-98D2-20AF3DDC885E"},{"orcid":"0000-0003-3295-6181","first_name":"Tim P","id":"CB6FF8D2-008F-11EA-8E08-2637E6697425","last_name":"Vogels","full_name":"Vogels, Tim P"}],"volume":120,"project":[{"_id":"c084a126-5a5b-11eb-8a69-d75314a70a87","grant_number":"214316/Z/18/Z","name":"What’s in a memory? Spatiotemporal dynamics in strongly coupled recurrent neuronal networks."}],"article_type":"original","month":"11","department":[{"_id":"TiVo"}],"publication_identifier":{"issn":["0027-8424"],"eissn":["1091-6490"]},"date_created":"2023-12-10T23:01:00Z","title":"Metabolically regulated spiking could serve neuronal energy homeostasis and protect from reactive oxygen species","oa_version":"Published Version","OA_type":"hybrid","publication_status":"published","file":[{"file_name":"2023_PNAS_Chintaluri.pdf","checksum":"bf4ec38602a70dae4338077a5a4d497f","creator":"dernst","relation":"main_file","date_updated":"2023-12-11T12:45:12Z","file_id":"14678","access_level":"open_access","file_size":16891602,"date_created":"2023-12-11T12:45:12Z","success":1,"content_type":"application/pdf"}],"status":"public","issue":"48","scopus_import":"1","file_date_updated":"2023-12-11T12:45:12Z","publication":"Proceedings of the National Academy of Sciences of the United States of America","date_updated":"2025-09-24T11:16:56Z","isi":1,"article_number":"e2306525120","intvolume":"       
120","year":"2023","ddc":["570"],"user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","date_published":"2023-11-21T00:00:00Z","OA_place":"publisher","_id":"14666","abstract":[{"lang":"eng","text":"So-called spontaneous activity is a central hallmark of most nervous systems. Such non-causal firing is contrary to the tenet of spikes as a means of communication, and its purpose remains unclear. We propose that self-initiated firing can serve as a release valve to protect neurons from the toxic conditions arising in mitochondria from lower-than-baseline energy consumption. To demonstrate the viability of our hypothesis, we built a set of models that incorporate recent experimental results indicating homeostatic control of metabolic products—Adenosine triphosphate (ATP), adenosine diphosphate (ADP), and reactive oxygen species (ROS)—by changes in firing. We explore the relationship of metabolic cost of spiking with its effect on the temporal patterning of spikes and reproduce experimentally observed changes in intrinsic firing in the fruitfly dorsal fan-shaped body neuron in a model with ROS-modulated potassium channels. We also show that metabolic spiking homeostasis can produce indefinitely sustained avalanche dynamics in cortical circuits. Our theory can account for key features of neuronal activity observed in many studies ranging from ion channel function all the way to resting state dynamics. We finish with a set of experimental predictions that would confirm an integrated, crucial role for metabolically regulated spiking and firmly link metabolic homeostasis and neuronal function."}],"language":[{"iso":"eng"}],"external_id":{"isi":["001157389000005"],"pmid":["37988463"]},"type":"journal_article","quality_controlled":"1","has_accepted_license":"1","day":"21","citation":{"chicago":"Chintaluri, Chaitanya, and Tim P Vogels. 
“Metabolically Regulated Spiking Could Serve Neuronal Energy Homeostasis and Protect from Reactive Oxygen Species.” <i>Proceedings of the National Academy of Sciences of the United States of America</i>. National Academy of Sciences, 2023. <a href=\"https://doi.org/10.1073/pnas.2306525120\">https://doi.org/10.1073/pnas.2306525120</a>.","ama":"Chintaluri C, Vogels TP. Metabolically regulated spiking could serve neuronal energy homeostasis and protect from reactive oxygen species. <i>Proceedings of the National Academy of Sciences of the United States of America</i>. 2023;120(48). doi:<a href=\"https://doi.org/10.1073/pnas.2306525120\">10.1073/pnas.2306525120</a>","ista":"Chintaluri C, Vogels TP. 2023. Metabolically regulated spiking could serve neuronal energy homeostasis and protect from reactive oxygen species. Proceedings of the National Academy of Sciences of the United States of America. 120(48), e2306525120.","apa":"Chintaluri, C., &#38; Vogels, T. P. (2023). Metabolically regulated spiking could serve neuronal energy homeostasis and protect from reactive oxygen species. <i>Proceedings of the National Academy of Sciences of the United States of America</i>. National Academy of Sciences. <a href=\"https://doi.org/10.1073/pnas.2306525120\">https://doi.org/10.1073/pnas.2306525120</a>","mla":"Chintaluri, Chaitanya, and Tim P. Vogels. “Metabolically Regulated Spiking Could Serve Neuronal Energy Homeostasis and Protect from Reactive Oxygen Species.” <i>Proceedings of the National Academy of Sciences of the United States of America</i>, vol. 120, no. 48, e2306525120, National Academy of Sciences, 2023, doi:<a href=\"https://doi.org/10.1073/pnas.2306525120\">10.1073/pnas.2306525120</a>.","ieee":"C. Chintaluri and T. P. Vogels, “Metabolically regulated spiking could serve neuronal energy homeostasis and protect from reactive oxygen species,” <i>Proceedings of the National Academy of Sciences of the United States of America</i>, vol. 120, no. 48. 
National Academy of Sciences, 2023.","short":"C. Chintaluri, T.P. Vogels, Proceedings of the National Academy of Sciences of the United States of America 120 (2023)."}},{"scopus_import":"1","ec_funded":1,"status":"public","isi":1,"date_updated":"2025-04-14T09:44:14Z","publication":"Communications biology","file_date_updated":"2022-09-05T08:55:11Z","date_published":"2022-08-25T00:00:00Z","ddc":["570"],"user_id":"4359f0d1-fa6c-11eb-b949-802e58b17ae8","year":"2022","intvolume":"         5","article_number":"873","citation":{"short":"D.W. Jia, T.P. Vogels, R.P. Costa, Communications Biology 5 (2022).","ieee":"D. W. Jia, T. P. Vogels, and R. P. Costa, “Developmental depression-to-facilitation shift controls excitation-inhibition balance,” <i>Communications biology</i>, vol. 5. Springer Nature, 2022.","mla":"Jia, David W., et al. “Developmental Depression-to-Facilitation Shift Controls Excitation-Inhibition Balance.” <i>Communications Biology</i>, vol. 5, 873, Springer Nature, 2022, doi:<a href=\"https://doi.org/10.1038/s42003-022-03801-2\">10.1038/s42003-022-03801-2</a>.","apa":"Jia, D. W., Vogels, T. P., &#38; Costa, R. P. (2022). Developmental depression-to-facilitation shift controls excitation-inhibition balance. <i>Communications Biology</i>. Springer Nature. <a href=\"https://doi.org/10.1038/s42003-022-03801-2\">https://doi.org/10.1038/s42003-022-03801-2</a>","ista":"Jia DW, Vogels TP, Costa RP. 2022. Developmental depression-to-facilitation shift controls excitation-inhibition balance. Communications biology. 5, 873.","ama":"Jia DW, Vogels TP, Costa RP. Developmental depression-to-facilitation shift controls excitation-inhibition balance. <i>Communications biology</i>. 2022;5. doi:<a href=\"https://doi.org/10.1038/s42003-022-03801-2\">10.1038/s42003-022-03801-2</a>","chicago":"Jia, David W., Tim P Vogels, and Rui Ponte Costa. “Developmental Depression-to-Facilitation Shift Controls Excitation-Inhibition Balance.” <i>Communications Biology</i>. 
Springer Nature, 2022. <a href=\"https://doi.org/10.1038/s42003-022-03801-2\">https://doi.org/10.1038/s42003-022-03801-2</a>."},"day":"25","has_accepted_license":"1","quality_controlled":"1","type":"journal_article","external_id":{"isi":["000844814800007"]},"_id":"12009","language":[{"iso":"eng"}],"abstract":[{"text":"Changes in the short-term dynamics of excitatory synapses over development have been observed throughout cortex, but their purpose and consequences remain unclear. Here, we propose that developmental changes in synaptic dynamics buffer the effect of slow inhibitory long-term plasticity, allowing for continuously stable neural activity. Using computational modeling we demonstrate that early in development excitatory short-term depression quickly stabilises neural activity, even in the face of strong, unbalanced excitation. We introduce a model of the commonly observed developmental shift from depression to facilitation and show that neural activity remains stable throughout development, while inhibitory synaptic plasticity slowly balances excitation, consistent with experimental observations. Our model predicts changes in the input responses from phasic to phasic-and-tonic and more precise spike timings. We also observe a gradual emergence of short-lasting memory traces governed by short-term plasticity development. We conclude that the developmental depression-to-facilitation shift may control excitation-inhibition balance throughout development with important functional consequences.","lang":"eng"}],"publisher":"Springer Nature","article_processing_charge":"No","tmp":{"short":"CC BY (4.0)","image":"/images/cc_by.png","legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)"},"department":[{"_id":"TiVo"}],"month":"08","project":[{"_id":"c084a126-5a5b-11eb-8a69-d75314a70a87","grant_number":"214316/Z/18/Z","name":"What’s in a memory? 
Spatiotemporal dynamics in strongly coupled recurrent neuronal networks."},{"name":"Learning the shape of synaptic plasticity rules for neuronal architectures and function through machine learning.","_id":"0aacfa84-070f-11eb-9043-d7eb2c709234","call_identifier":"H2020","grant_number":"819603"}],"article_type":"original","volume":5,"author":[{"first_name":"David W.","last_name":"Jia","full_name":"Jia, David W."},{"last_name":"Vogels","full_name":"Vogels, Tim P","first_name":"Tim P","id":"CB6FF8D2-008F-11EA-8E08-2637E6697425","orcid":"0000-0003-3295-6181"},{"first_name":"Rui Ponte","full_name":"Costa, Rui Ponte","last_name":"Costa"}],"oa":1,"doi":"10.1038/s42003-022-03801-2","acknowledgement":"We would like to thank the Vogels Lab for feedback on an earlier version of this manuscript. D.W.J. was supported by a Marshall Scholarship and a Clarendon Scholarship. R.P.C. and T.P.V. were supported by a Wellcome Trust and Royal Society Sir Henry Dale Fellowship (WT 100000), a Wellcome Trust Senior Research Fellowship (214316/Z/18/Z), and an ERC Consolidator Grant (SYNAPSEEK).","oa_version":"Published Version","title":"Developmental depression-to-facilitation shift controls excitation-inhibition balance","date_created":"2022-09-04T22:02:02Z","publication_identifier":{"eissn":["2399-3642"]},"file":[{"checksum":"3ec724c4f6d3440028c217305e32915f","file_name":"2022_CommBiology_Jia.pdf","relation":"main_file","creator":"dernst","date_updated":"2022-09-05T08:55:11Z","file_id":"12022","access_level":"open_access","content_type":"application/pdf","success":1,"date_created":"2022-09-05T08:55:11Z","file_size":2491191}],"publication_status":"published"},{"scopus_import":"1","issue":"8","status":"public","isi":1,"publication":"PLoS Computational Biology","date_updated":"2025-06-11T13:51:21Z","file_date_updated":"2022-09-12T07:47:55Z","date_published":"2022-08-15T00:00:00Z","year":"2022","ddc":["570"],"user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","intvolume":"        
18","article_number":"e1010365","has_accepted_license":"1","citation":{"apa":"Christodoulou, G., Vogels, T. P., &#38; Agnes, E. J. (2022). Regimes and mechanisms of transient amplification in abstract and biological neural networks. <i>PLoS Computational Biology</i>. Public Library of Science. <a href=\"https://doi.org/10.1371/journal.pcbi.1010365\">https://doi.org/10.1371/journal.pcbi.1010365</a>","ieee":"G. Christodoulou, T. P. Vogels, and E. J. Agnes, “Regimes and mechanisms of transient amplification in abstract and biological neural networks,” <i>PLoS Computational Biology</i>, vol. 18, no. 8. Public Library of Science, 2022.","short":"G. Christodoulou, T.P. Vogels, E.J. Agnes, PLoS Computational Biology 18 (2022).","mla":"Christodoulou, Georgia, et al. “Regimes and Mechanisms of Transient Amplification in Abstract and Biological Neural Networks.” <i>PLoS Computational Biology</i>, vol. 18, no. 8, e1010365, Public Library of Science, 2022, doi:<a href=\"https://doi.org/10.1371/journal.pcbi.1010365\">10.1371/journal.pcbi.1010365</a>.","chicago":"Christodoulou, Georgia, Tim P Vogels, and Everton J. Agnes. “Regimes and Mechanisms of Transient Amplification in Abstract and Biological Neural Networks.” <i>PLoS Computational Biology</i>. Public Library of Science, 2022. <a href=\"https://doi.org/10.1371/journal.pcbi.1010365\">https://doi.org/10.1371/journal.pcbi.1010365</a>.","ista":"Christodoulou G, Vogels TP, Agnes EJ. 2022. Regimes and mechanisms of transient amplification in abstract and biological neural networks. PLoS Computational Biology. 18(8), e1010365.","ama":"Christodoulou G, Vogels TP, Agnes EJ. Regimes and mechanisms of transient amplification in abstract and biological neural networks. <i>PLoS Computational Biology</i>. 2022;18(8). 
doi:<a href=\"https://doi.org/10.1371/journal.pcbi.1010365\">10.1371/journal.pcbi.1010365</a>"},"day":"15","type":"journal_article","quality_controlled":"1","language":[{"iso":"eng"}],"_id":"12084","abstract":[{"text":"Neuronal networks encode information through patterns of activity that define the networks’ function. The neurons’ activity relies on specific connectivity structures, yet the link between structure and function is not fully understood. Here, we tackle this structure-function problem with a new conceptual approach. Instead of manipulating the connectivity directly, we focus on upper triangular matrices, which represent the network dynamics in a given orthonormal basis obtained by the Schur decomposition. This abstraction allows us to independently manipulate the eigenspectrum and feedforward structures of a connectivity matrix. Using this method, we describe a diverse repertoire of non-normal transient amplification, and to complement the analysis of the dynamical regimes, we quantify the geometry of output trajectories through the effective rank of both the eigenvector and the dynamics matrices. Counter-intuitively, we find that shrinking the eigenspectrum’s imaginary distribution leads to highly amplifying regimes in linear and long-lasting dynamics in nonlinear networks. We also find a trade-off between amplification and dimensionality of neuronal dynamics, i.e., trajectories in neuronal state-space. Networks that can amplify a large number of orthogonal initial conditions produce neuronal trajectories that lie in the same subspace of the neuronal state-space. Finally, we examine networks of excitatory and inhibitory neurons. 
We find that the strength of global inhibition is directly linked with the amplitude of amplification, such that weakening inhibitory weights also decreases amplification, and that the eigenspectrum's imaginary distribution grows with an increase in the ratio between excitatory-to-inhibitory and excitatory-to-excitatory connectivity strengths. Consequently, the strength of global inhibition reveals itself as a strong signature for amplification and a potential control mechanism to switch dynamical regimes. Our results shed a light on how biological networks, i.e., networks constrained by Dale's law, may be optimised for specific dynamical regimes.","lang":"eng"}],"external_id":{"isi":["000937227700001"],"pmid":["35969604"]},"pmid":1,"publisher":"Public Library of Science","corr_author":"1","tmp":{"short":"CC BY (4.0)","image":"/images/cc_by.png","legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)"},"article_processing_charge":"No","project":[{"_id":"c084a126-5a5b-11eb-8a69-d75314a70a87","grant_number":"214316/Z/18/Z","name":"What's in a memory? Spatiotemporal dynamics in strongly coupled recurrent neuronal networks."}],"article_type":"original","department":[{"_id":"TiVo"}],"month":"08","author":[{"first_name":"Georgia","last_name":"Christodoulou","full_name":"Christodoulou, Georgia"},{"last_name":"Vogels","full_name":"Vogels, Tim P","orcid":"0000-0003-3295-6181","id":"CB6FF8D2-008F-11EA-8E08-2637E6697425","first_name":"Tim P"},{"first_name":"Everton J.","last_name":"Agnes","full_name":"Agnes, Everton J."}],"volume":18,"acknowledgement":"We thank Friedemann Zenke for his comments, especially on the effect of the self loops on the spectrum. We also thank Ken Miller and Bill Podlaski for helpful comments. 
This research was funded by a Wellcome Trust and Royal Society Henry Dale Research Fellowship (WT100000; TPV), a Wellcome Senior Research Fellowship (214316/Z/18/Z; GC, EJA, and TPV), and a Research Project Grant by the Leverhulme Trust (RPG-2016-446; EJA and TPV). ","oa":1,"doi":"10.1371/journal.pcbi.1010365","oa_version":"Published Version","title":"Regimes and mechanisms of transient amplification in abstract and biological neural networks","date_created":"2022-09-11T22:01:56Z","publication_identifier":{"eissn":["1553-7358"]},"file":[{"content_type":"application/pdf","success":1,"file_size":2867337,"date_created":"2022-09-12T07:47:55Z","access_level":"open_access","date_updated":"2022-09-12T07:47:55Z","relation":"main_file","creator":"dernst","file_id":"12090","checksum":"8a81ab29f837991ee0ea770817c4a50e","file_name":"2022_PLoSCompBio_Christodoulou.pdf"}],"publication_status":"published"},{"citation":{"ista":"Braun L, Vogels TP. 2021. Online learning of neural computations from sparse temporal feedback. Advances in Neural Information Processing Systems - 35th Conference on Neural Information Processing Systems. NeurIPS: Neural Information Processing Systems vol. 20, 16437–16450.","ama":"Braun L, Vogels TP. Online learning of neural computations from sparse temporal feedback. In: <i>Advances in Neural Information Processing Systems - 35th Conference on Neural Information Processing Systems</i>. Vol 20. Neural Information Processing Systems Foundation; 2021:16437-16450.","chicago":"Braun, Lukas, and Tim P Vogels. “Online Learning of Neural Computations from Sparse Temporal Feedback.” In <i>Advances in Neural Information Processing Systems - 35th Conference on Neural Information Processing Systems</i>, 20:16437–50. Neural Information Processing Systems Foundation, 2021.","ieee":"L. Braun and T. P. 
Vogels, “Online learning of neural computations from sparse temporal feedback,” in <i>Advances in Neural Information Processing Systems - 35th Conference on Neural Information Processing Systems</i>, Virtual, Online, 2021, vol. 20, pp. 16437–16450.","short":"L. Braun, T.P. Vogels, in:, Advances in Neural Information Processing Systems - 35th Conference on Neural Information Processing Systems, Neural Information Processing Systems Foundation, 2021, pp. 16437–16450.","mla":"Braun, Lukas, and Tim P. Vogels. “Online Learning of Neural Computations from Sparse Temporal Feedback.” <i>Advances in Neural Information Processing Systems - 35th Conference on Neural Information Processing Systems</i>, vol. 20, Neural Information Processing Systems Foundation, 2021, pp. 16437–50.","apa":"Braun, L., &#38; Vogels, T. P. (2021). Online learning of neural computations from sparse temporal feedback. In <i>Advances in Neural Information Processing Systems - 35th Conference on Neural Information Processing Systems</i> (Vol. 20, pp. 16437–16450). Virtual, Online: Neural Information Processing Systems Foundation."},"day":"01","quality_controlled":"1","type":"conference","_id":"11453","language":[{"iso":"eng"}],"abstract":[{"lang":"eng","text":"Neuronal computations depend on synaptic connectivity and intrinsic electrophysiological properties. Synaptic connectivity determines which inputs from presynaptic neurons are integrated, while cellular properties determine how inputs are filtered over time. Unlike their biological counterparts, most computational approaches to learning in simulated neural networks are limited to changes in synaptic connectivity. However, if intrinsic parameters change, neural computations are altered drastically. Here, we include the parameters that determine the intrinsic properties,\r\ne.g., time constants and reset potential, into the learning paradigm. 
Using sparse feedback signals that indicate target spike times, and gradient-based parameter updates, we show that the intrinsic parameters can be learned along with the synaptic weights to produce specific input-output functions. Specifically, we use a teacher-student paradigm in which a randomly initialised leaky integrate-and-fire or resonate-and-fire neuron must recover the parameters of a teacher neuron. We show that complex temporal functions can be learned online and without backpropagation through time, relying on event-based updates only. Our results are a step towards online learning of neural computations from ungraded and unsigned sparse feedback signals with a biologically inspired learning mechanism."}],"main_file_link":[{"url":"https://proceedings.neurips.cc/paper/2021/file/88e1ce84f9feef5a08d0df0334c53468-Paper.pdf","open_access":"1"}],"date_published":"2021-12-01T00:00:00Z","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","year":"2021","intvolume":"        20","publication":"Advances in Neural Information Processing Systems - 35th Conference on Neural Information Processing Systems","date_updated":"2025-04-14T09:44:14Z","scopus_import":"1","status":"public","page":"16437-16450","publication_status":"published","conference":{"name":"NeurIPS: Neural Information Processing Systems","start_date":"2021-12-06","end_date":"2021-12-14","location":"Virtual, Online"},"oa_version":"Published Version","title":"Online learning of neural computations from sparse temporal feedback","date_created":"2022-06-19T22:01:59Z","publication_identifier":{"issn":["1049-5258"],"isbn":["9781713845393"]},"department":[{"_id":"TiVo"}],"month":"12","project":[{"name":"What's in a memory? 
Spatiotemporal dynamics in strongly coupled recurrent neuronal networks.","grant_number":"214316/Z/18/Z","_id":"c084a126-5a5b-11eb-8a69-d75314a70a87"}],"author":[{"first_name":"Lukas","full_name":"Braun, Lukas","last_name":"Braun"},{"last_name":"Vogels","full_name":"Vogels, Tim P","orcid":"0000-0003-3295-6181","id":"CB6FF8D2-008F-11EA-8E08-2637E6697425","first_name":"Tim P"}],"volume":20,"oa":1,"acknowledgement":"We would like to thank Professor Dr. Henning Sprekeler for his valuable suggestions and Dr. Andrew Saxe, Milan Klöwer and Anna Wallis for their constructive feedback on the manuscript. Lukas Braun was supported by the Network of European Neuroscience Schools through their NENS Exchange Grant program, by the European Union through their European Community Action Scheme for the Mobility of University Students, the Woodward Scholarship awarded by Wadham College, Oxford and the Medical Research Council [MR/N013468/1]. Tim P. Vogels was supported by a Wellcome Trust Senior Research Fellowship [214316/Z/18/Z].","publisher":"Neural Information Processing Systems Foundation","article_processing_charge":"No","corr_author":"1"},{"tmp":{"short":"CC BY (4.0)","image":"/images/cc_by.png","legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)"},"corr_author":"1","article_processing_charge":"No","publisher":"MIT Press","pmid":1,"acknowledgement":"F.Z. was supported by the Wellcome Trust (110124/Z/15/Z) and the Novartis Research Foundation. T.P.V. 
was supported by a Wellcome Trust Sir Henry Dale Research fellowship (WT100000), a Wellcome Trust Senior Research Fellowship (214316/Z/18/Z), and an ERC Consolidator Grant SYNAPSEEK.","doi":"10.1162/neco_a_01367","oa":1,"volume":33,"author":[{"orcid":"0000-0003-1883-644X","first_name":"Friedemann","full_name":"Zenke, Friedemann","last_name":"Zenke"},{"orcid":"0000-0003-3295-6181","first_name":"Tim P","id":"CB6FF8D2-008F-11EA-8E08-2637E6697425","full_name":"Vogels, Tim P","last_name":"Vogels"}],"project":[{"call_identifier":"H2020","_id":"0aacfa84-070f-11eb-9043-d7eb2c709234","grant_number":"819603","name":"Learning the shape of synaptic plasticity rules for neuronal architectures and function through machine learning."},{"grant_number":"214316/Z/18/Z","_id":"c084a126-5a5b-11eb-8a69-d75314a70a87","name":"What's in a memory? Spatiotemporal dynamics in strongly coupled recurrent neuronal networks."}],"article_type":"original","department":[{"_id":"TiVo"}],"month":"03","publication_identifier":{"issn":["0899-7667"],"eissn":["1530-888X"]},"date_created":"2020-08-12T12:08:24Z","title":"The remarkable robustness of surrogate gradient learning for instilling complex function in spiking neural networks","oa_version":"Published Version","publication_status":"published","file":[{"checksum":"eac5a51c24c8989ae7cf9ae32ec3bc95","file_name":"2021_NeuralComputation_Zenke.pdf","creator":"dernst","date_updated":"2022-04-08T06:05:39Z","file_id":"11131","relation":"main_file","access_level":"open_access","success":1,"content_type":"application/pdf","file_size":1611614,"date_created":"2022-04-08T06:05:39Z"}],"page":"899-925","status":"public","issue":"4","ec_funded":1,"scopus_import":"1","file_date_updated":"2022-04-08T06:05:39Z","publication":"Neural Computation","date_updated":"2025-04-14T09:44:14Z","isi":1,"intvolume":"        
33","year":"2021","user_id":"4359f0d1-fa6c-11eb-b949-802e58b17ae8","ddc":["000","570"],"date_published":"2021-03-01T00:00:00Z","_id":"8253","language":[{"iso":"eng"}],"abstract":[{"text":"Brains process information in spiking neural networks. Their intricate connections shape the diverse functions these networks perform. In comparison, the functional capabilities of models of spiking networks are still rudimentary. This shortcoming is mainly due to the lack of insight and practical algorithms to construct the necessary connectivity. Any such algorithm typically attempts to build networks by iteratively reducing the error compared to a desired output. But assigning credit to hidden units in multi-layered spiking networks has remained challenging due to the non-differentiable nonlinearity of spikes. To avoid this issue, one can employ surrogate gradients to discover the required connectivity in spiking network models. However, the choice of a surrogate is not unique, raising the question of how its implementation influences the effectiveness of the method. Here, we use numerical simulations to systematically study how essential design parameters of surrogate gradients impact learning performance on a range of classification problems. We show that surrogate gradient learning is robust to different shapes of underlying surrogate derivatives, but the choice of the derivative’s scale can substantially affect learning performance. When we combine surrogate gradients with a suitable activity regularization technique, robust information processing can be achieved in spiking networks even at the sparse activity limit. 
Our study provides a systematic account of the remarkable robustness of surrogate gradient learning and serves as a practical guide to model functional spiking neural networks.","lang":"eng"}],"external_id":{"isi":["000663433900003"],"pmid":["33513328"]},"type":"journal_article","quality_controlled":"1","has_accepted_license":"1","day":"01","citation":{"short":"F. Zenke, T.P. Vogels, Neural Computation 33 (2021) 899–925.","ieee":"F. Zenke and T. P. Vogels, “The remarkable robustness of surrogate gradient learning for instilling complex function in spiking neural networks,” <i>Neural Computation</i>, vol. 33, no. 4. MIT Press, pp. 899–925, 2021.","mla":"Zenke, Friedemann, and Tim P. Vogels. “The Remarkable Robustness of Surrogate Gradient Learning for Instilling Complex Function in Spiking Neural Networks.” <i>Neural Computation</i>, vol. 33, no. 4, MIT Press, 2021, pp. 899–925, doi:<a href=\"https://doi.org/10.1162/neco_a_01367\">10.1162/neco_a_01367</a>.","apa":"Zenke, F., &#38; Vogels, T. P. (2021). The remarkable robustness of surrogate gradient learning for instilling complex function in spiking neural networks. <i>Neural Computation</i>. MIT Press. <a href=\"https://doi.org/10.1162/neco_a_01367\">https://doi.org/10.1162/neco_a_01367</a>","ista":"Zenke F, Vogels TP. 2021. The remarkable robustness of surrogate gradient learning for instilling complex function in spiking neural networks. Neural Computation. 33(4), 899–925.","ama":"Zenke F, Vogels TP. The remarkable robustness of surrogate gradient learning for instilling complex function in spiking neural networks. <i>Neural Computation</i>. 2021;33(4):899-925. doi:<a href=\"https://doi.org/10.1162/neco_a_01367\">10.1162/neco_a_01367</a>","chicago":"Zenke, Friedemann, and Tim P Vogels. “The Remarkable Robustness of Surrogate Gradient Learning for Instilling Complex Function in Spiking Neural Networks.” <i>Neural Computation</i>. MIT Press, 2021. 
<a href=\"https://doi.org/10.1162/neco_a_01367\">https://doi.org/10.1162/neco_a_01367</a>."}},{"oa_version":"Published Version","date_created":"2021-07-04T22:01:27Z","title":"A meta-learning approach to (re)discover plasticity rules that carve a desired function into a neural network","publication_identifier":{"issn":["1049-5258"]},"conference":{"name":"NeurIPS: Conference on Neural Information Processing Systems","start_date":"2020-12-06","end_date":"2020-12-12","location":"Vancouver, Canada"},"publication_status":"published","related_material":{"record":[{"id":"14422","status":"public","relation":"dissertation_contains"}],"link":[{"url":"https://doi.org/10.1101/2020.10.24.353409","relation":"is_continued_by"}]},"article_processing_charge":"No","project":[{"name":"Learning the shape of synaptic plasticity rules for neuronal architectures and function through machine learning.","_id":"0aacfa84-070f-11eb-9043-d7eb2c709234","call_identifier":"H2020","grant_number":"819603"},{"_id":"c084a126-5a5b-11eb-8a69-d75314a70a87","grant_number":"214316/Z/18/Z","name":"What’s in a memory? Spatiotemporal dynamics in strongly coupled recurrent neuronal networks."}],"month":"12","department":[{"_id":"TiVo"}],"author":[{"full_name":"Confavreux, Basile J","last_name":"Confavreux","first_name":"Basile J","id":"C7610134-B532-11EA-BD9F-F5753DDC885E"},{"last_name":"Zenke","full_name":"Zenke, Friedemann","first_name":"Friedemann"},{"first_name":"Everton J.","last_name":"Agnes","full_name":"Agnes, Everton J."},{"full_name":"Lillicrap, Timothy","last_name":"Lillicrap","first_name":"Timothy"},{"last_name":"Vogels","full_name":"Vogels, Tim P","orcid":"0000-0003-3295-6181","first_name":"Tim P","id":"CB6FF8D2-008F-11EA-8E08-2637E6697425"}],"volume":33,"acknowledgement":"We would like to thank Chaitanya Chintaluri, Georgia Christodoulou, Bill Podlaski and Merima Šabanovic for useful discussions and comments. 
This work was supported by a Wellcome Trust Senior Research Fellowship (214316/Z/18/Z), a BBSRC grant (BB/N019512/1), an ERC consolidator Grant (SYNAPSEEK), a Leverhulme Trust Project Grant (RPG-2016-446), and funding from École Polytechnique, Paris.","oa":1,"main_file_link":[{"url":"https://proceedings.neurips.cc/paper/2020/hash/bdbd5ebfde4934142c8a88e7a3796cd5-Abstract.html","open_access":"1"}],"date_published":"2020-12-06T00:00:00Z","year":"2020","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","intvolume":"        33","citation":{"mla":"Confavreux, Basile J., et al. “A Meta-Learning Approach to (Re)Discover Plasticity Rules That Carve a Desired Function into a Neural Network.” <i>Advances in Neural Information Processing Systems</i>, vol. 33, 2020, pp. 16398–408.","ieee":"B. J. Confavreux, F. Zenke, E. J. Agnes, T. Lillicrap, and T. P. Vogels, “A meta-learning approach to (re)discover plasticity rules that carve a desired function into a neural network,” in <i>Advances in Neural Information Processing Systems</i>, Vancouver, Canada, 2020, vol. 33, pp. 16398–16408.","short":"B.J. Confavreux, F. Zenke, E.J. Agnes, T. Lillicrap, T.P. Vogels, in:, Advances in Neural Information Processing Systems, 2020, pp. 16398–16408.","apa":"Confavreux, B. J., Zenke, F., Agnes, E. J., Lillicrap, T., &#38; Vogels, T. P. (2020). A meta-learning approach to (re)discover plasticity rules that carve a desired function into a neural network. In <i>Advances in Neural Information Processing Systems</i> (Vol. 33, pp. 16398–16408). Vancouver, Canada.","ama":"Confavreux BJ, Zenke F, Agnes EJ, Lillicrap T, Vogels TP. A meta-learning approach to (re)discover plasticity rules that carve a desired function into a neural network. In: <i>Advances in Neural Information Processing Systems</i>. Vol 33. ; 2020:16398-16408.","ista":"Confavreux BJ, Zenke F, Agnes EJ, Lillicrap T, Vogels TP. 2020. 
A meta-learning approach to (re)discover plasticity rules that carve a desired function into a neural network. Advances in Neural Information Processing Systems. NeurIPS: Conference on Neural Information Processing Systems vol. 33, 16398–16408.","chicago":"Confavreux, Basile J, Friedemann Zenke, Everton J. Agnes, Timothy Lillicrap, and Tim P Vogels. “A Meta-Learning Approach to (Re)Discover Plasticity Rules That Carve a Desired Function into a Neural Network.” In <i>Advances in Neural Information Processing Systems</i>, 33:16398–408, 2020."},"day":"06","type":"conference","quality_controlled":"1","abstract":[{"text":"The search for biologically faithful synaptic plasticity rules has resulted in a large body of models. They are usually inspired by – and fitted to – experimental data, but they rarely produce neural dynamics that serve complex functions. These failures suggest that current plasticity models are still under-constrained by existing data. Here, we present an alternative approach that uses meta-learning to discover plausible synaptic plasticity rules. Instead of experimental data, the rules are constrained by the functions they implement and the structure they are meant to produce. Briefly, we parameterize synaptic plasticity rules by a Volterra expansion and then use supervised learning methods (gradient descent or evolutionary strategies) to minimize a problem-dependent loss function that quantifies how effectively a candidate plasticity rule transforms an initially random network into one with the desired function. We first validate our approach by re-discovering previously described plasticity rules, starting at the single-neuron level and “Oja’s rule”, a simple Hebbian plasticity rule that captures the direction of most variability of inputs to a neuron (i.e., the first principal component). 
We expand the problem to the network level and ask the framework to find Oja’s rule together with an anti-Hebbian rule such that an initially random two-layer firing-rate network will recover several principal components of the input space after learning. Next, we move to networks of integrate-and-fire neurons with plastic inhibitory afferents. We train for rules that achieve a target firing rate by countering tuned excitation. Our algorithm discovers a specific subset of the manifold of rules that can solve this task. Our work is a proof of principle of an automated and unbiased approach to unveil synaptic plasticity rules that obey biological constraints and can solve complex functions.","lang":"eng"}],"_id":"9633","language":[{"iso":"eng"}],"scopus_import":"1","ec_funded":1,"page":"16398-16408","status":"public","date_updated":"2026-04-27T22:30:20Z","publication":"Advances in Neural Information Processing Systems"}]
