{"keyword":["Artificial Intelligence","Computer Networks and Communications","Computer Vision and Pattern Recognition","Human-Computer Interaction","Software"],"publication_identifier":{"issn":["2522-5839"]},"acknowledgement":"This research was supported in part by the AI2050 program at Schmidt Futures (grant G-22-63172), the Boeing Company, and the United States Air Force Research Laboratory and the United States Air Force Artificial Intelligence Accelerator and was accomplished under cooperative agreement number FA8750-19-2-1000. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the United States Air Force or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes, notwithstanding any copyright notation herein. This work was further supported by The Boeing Company and Office of Naval Research grant N00014-18-1-2830. M.T. is supported by the Poul Due Jensen Foundation, grant 883901. M.L. was supported in part by the Austrian Science Fund under grant Z211-N23 (Wittgenstein Award). A.A. was supported by the National Science Foundation Graduate Research Fellowship Program. We thank T.-H. Wang, P. Kao, M. Chahine, W. Xiao, X. Li, L. Yin and Y. Ben for useful suggestions and for testing of CfC models to confirm the results across other domains.","quality_controlled":"1","department":[{"_id":"ToHe"}],"volume":4,"page":"992-1003","type":"journal_article","article_processing_charge":"No","doi":"10.1038/s42256-022-00556-7","issue":"11","publisher":"Springer Nature","date_created":"2023-01-12T12:07:21Z","external_id":{"isi":["000884215600003"],"arxiv":["2106.13898"]},"oa":1,"file_date_updated":"2023-01-24T09:49:44Z","isi":1,"date_updated":"2023-08-04T09:00:10Z","tmp":{"name":"Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)","short":"CC BY (4.0)","legal_code_url":"https://creativecommons.org/licenses/by/4.0/legalcode","image":"/images/cc_by.png"},"language":[{"iso":"eng"}],"month":"11","file":[{"file_name":"2022_NatureMachineIntelligence_Hasani.pdf","date_created":"2023-01-24T09:49:44Z","content_type":"application/pdf","checksum":"b4789122ce04bfb4ac042390f59aaa8b","file_id":"12355","file_size":3259553,"date_updated":"2023-01-24T09:49:44Z","relation":"main_file","creator":"dernst","success":1,"access_level":"open_access"}],"year":"2022","related_material":{"link":[{"url":"https://doi.org/10.1038/s42256-022-00597-y","relation":"erratum"}]},"title":"Closed-form continuous-time neural networks","oa_version":"Published Version","author":[{"last_name":"Hasani","first_name":"Ramin","full_name":"Hasani, Ramin"},{"last_name":"Lechner","first_name":"Mathias","id":"3DC22916-F248-11E8-B48F-1D18A9856A87","full_name":"Lechner, Mathias"},{"full_name":"Amini, Alexander","last_name":"Amini","first_name":"Alexander"},{"last_name":"Liebenwein","first_name":"Lucas","full_name":"Liebenwein, Lucas"},{"full_name":"Ray, Aaron","first_name":"Aaron","last_name":"Ray"},{"full_name":"Tschaikowski, Max","first_name":"Max","last_name":"Tschaikowski"},{"full_name":"Teschl, Gerald","first_name":"Gerald","last_name":"Teschl"},{"full_name":"Rus, Daniela","first_name":"Daniela","last_name":"Rus"}],"date_published":"2022-11-15T00:00:00Z","abstract":[{"lang":"eng","text":"Continuous-time neural networks are a class of machine learning systems that can tackle representation learning on spatiotemporal decision-making tasks. These models are typically represented by continuous differential equations. However, their expressive power when they are deployed on computers is bottlenecked by numerical differential equation solvers. This limitation has notably slowed down the scaling and understanding of numerous natural physical phenomena such as the dynamics of nervous systems. Ideally, we would circumvent this bottleneck by solving the given dynamical system in closed form. This is known to be intractable in general. Here, we show that it is possible to closely approximate the interaction between neurons and synapses—the building blocks of natural and artificial neural networks—constructed by liquid time-constant networks efficiently in closed form. To this end, we compute a tightly bounded approximation of the solution of an integral appearing in liquid time-constant dynamics that has had no known closed-form solution so far. This closed-form solution impacts the design of continuous-time and continuous-depth neural models. For instance, since time appears explicitly in closed form, the formulation relaxes the need for complex numerical solvers. Consequently, we obtain models that are between one and five orders of magnitude faster in training and inference compared with differential equation-based counterparts. More importantly, in contrast to ordinary differential equation-based continuous networks, closed-form networks can scale remarkably well compared with other deep learning instances. Lastly, as these models are derived from liquid networks, they show good performance in time-series modelling compared with advanced recurrent neural network models."}],"citation":{"chicago":"Hasani, Ramin, Mathias Lechner, Alexander Amini, Lucas Liebenwein, Aaron Ray, Max Tschaikowski, Gerald Teschl, and Daniela Rus. “Closed-Form Continuous-Time Neural Networks.” Nature Machine Intelligence. Springer Nature, 2022. https://doi.org/10.1038/s42256-022-00556-7.","ama":"Hasani R, Lechner M, Amini A, et al. Closed-form continuous-time neural networks. Nature Machine Intelligence. 2022;4(11):992-1003. doi:10.1038/s42256-022-00556-7","mla":"Hasani, Ramin, et al. “Closed-Form Continuous-Time Neural Networks.” Nature Machine Intelligence, vol. 4, no. 11, Springer Nature, 2022, pp. 992–1003, doi:10.1038/s42256-022-00556-7.","ista":"Hasani R, Lechner M, Amini A, Liebenwein L, Ray A, Tschaikowski M, Teschl G, Rus D. 2022. Closed-form continuous-time neural networks. Nature Machine Intelligence. 4(11), 992–1003.","ieee":"R. Hasani et al., “Closed-form continuous-time neural networks,” Nature Machine Intelligence, vol. 4, no. 11. Springer Nature, pp. 992–1003, 2022.","short":"R. Hasani, M. Lechner, A. Amini, L. Liebenwein, A. Ray, M. Tschaikowski, G. Teschl, D. Rus, Nature Machine Intelligence 4 (2022) 992–1003.","apa":"Hasani, R., Lechner, M., Amini, A., Liebenwein, L., Ray, A., Tschaikowski, M., … Rus, D. (2022). Closed-form continuous-time neural networks. Nature Machine Intelligence. Springer Nature. https://doi.org/10.1038/s42256-022-00556-7"},"day":"15","has_accepted_license":"1","scopus_import":"1","_id":"12147","ddc":["000"],"intvolume":" 4","status":"public","user_id":"4359f0d1-fa6c-11eb-b949-802e58b17ae8","article_type":"original","project":[{"call_identifier":"FWF","_id":"25F42A32-B435-11E9-9278-68D0E5697425","name":"The Wittgenstein Prize","grant_number":"Z211"}],"publication_status":"published","publication":"Nature Machine Intelligence"}