Closed-form continuous-time neural networks

Hasani R, Lechner M, Amini A, Liebenwein L, Ray A, Tschaikowski M, Teschl G, Rus D. 2022. Closed-form continuous-time neural networks. Nature Machine Intelligence. 4(11), 992–1003.

Download: 2022_NatureMachineIntelligence_Hasani.pdf (3.26 MB, Open Access)

Journal Article | Published | English

Scopus indexed
Author
Hasani, Ramin; Lechner, Mathias (ISTA); Amini, Alexander; Liebenwein, Lucas; Ray, Aaron; Tschaikowski, Max; Teschl, Gerald; Rus, Daniela
Abstract
Continuous-time neural networks are a class of machine learning systems that can tackle representation learning on spatiotemporal decision-making tasks. These models are typically represented by continuous differential equations. However, their expressive power when deployed on computers is bottlenecked by numerical differential equation solvers. This limitation has notably slowed down the scaling and understanding of numerous natural physical phenomena such as the dynamics of nervous systems. Ideally, we would circumvent this bottleneck by solving the given dynamical system in closed form, which is known to be intractable in general. Here, we show that the interaction between neurons and synapses (the building blocks of natural and artificial neural networks), as constructed by liquid time-constant networks, can be efficiently and closely approximated in closed form. To this end, we compute a tightly bounded approximation of the solution of an integral appearing in liquid time-constant dynamics that previously had no known closed-form solution. This closed-form solution impacts the design of continuous-time and continuous-depth neural models. For instance, because time appears explicitly in the closed-form expression, the formulation relaxes the need for complex numerical solvers. Consequently, we obtain models that are between one and five orders of magnitude faster in training and inference than their differential equation-based counterparts. More importantly, in contrast to ordinary differential equation-based continuous networks, closed-form networks scale remarkably well compared with other deep learning instances. Lastly, as these models are derived from liquid networks, they show good performance on time-series modelling compared with advanced recurrent neural network models.
Publishing Year
2022
Date Published
2022-11-15
Journal Title
Nature Machine Intelligence
Acknowledgement
This research was supported in part by the AI2050 program at Schmidt Futures (grant G-22-63172), the Boeing Company, and the United States Air Force Research Laboratory and the United States Air Force Artificial Intelligence Accelerator and was accomplished under cooperative agreement number FA8750-19-2-1000. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the United States Air Force or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes, notwithstanding any copyright notation herein. This work was further supported by The Boeing Company and Office of Naval Research grant N00014-18-1-2830. M.T. is supported by the Poul Due Jensen Foundation, grant 883901. M.L. was supported in part by the Austrian Science Fund under grant Z211-N23 (Wittgenstein Award). A.A. was supported by the National Science Foundation Graduate Research Fellowship Program. We thank T.-H. Wang, P. Kao, M. Chahine, W. Xiao, X. Li, L. Yin and Y. Ben for useful suggestions and for testing of CfC models to confirm the results across other domains.
Volume
4
Issue
11
Page
992-1003
ISSN
IST-REx-ID

Cite this

Hasani R, Lechner M, Amini A, et al. Closed-form continuous-time neural networks. Nature Machine Intelligence. 2022;4(11):992-1003. doi:10.1038/s42256-022-00556-7
Hasani, R., Lechner, M., Amini, A., Liebenwein, L., Ray, A., Tschaikowski, M., … Rus, D. (2022). Closed-form continuous-time neural networks. Nature Machine Intelligence. Springer Nature. https://doi.org/10.1038/s42256-022-00556-7
Hasani, Ramin, Mathias Lechner, Alexander Amini, Lucas Liebenwein, Aaron Ray, Max Tschaikowski, Gerald Teschl, and Daniela Rus. “Closed-Form Continuous-Time Neural Networks.” Nature Machine Intelligence. Springer Nature, 2022. https://doi.org/10.1038/s42256-022-00556-7.
R. Hasani et al., “Closed-form continuous-time neural networks,” Nature Machine Intelligence, vol. 4, no. 11. Springer Nature, pp. 992–1003, 2022.
Hasani R, Lechner M, Amini A, Liebenwein L, Ray A, Tschaikowski M, Teschl G, Rus D. 2022. Closed-form continuous-time neural networks. Nature Machine Intelligence. 4(11), 992–1003.
Hasani, Ramin, et al. “Closed-Form Continuous-Time Neural Networks.” Nature Machine Intelligence, vol. 4, no. 11, Springer Nature, 2022, pp. 992–1003, doi:10.1038/s42256-022-00556-7.
All files available under the following license(s):
Creative Commons Attribution 4.0 International Public License (CC-BY 4.0)
Main File(s)
Access Level
Open Access
Date Uploaded
2023-01-24
MD5 Checksum
b4789122ce04bfb4ac042390f59aaa8b


External material:
Erratum

Sources

arXiv 2106.13898
