The role of pretrained representations for the OOD generalization of reinforcement learning agents
Dittadi A, Träuble F, Wüthrich M, Widmaier F, Gehler P, Winther O, Locatello F, Bachem O, Schölkopf B, Bauer S. 2022. The role of pretrained representations for the OOD generalization of reinforcement learning agents. 10th International Conference on Learning Representations. ICLR: International Conference on Learning Representations.
https://doi.org/10.48550/arXiv.2107.05686
[Preprint]
Conference Paper | Published | English
Author
Dittadi, Andrea;
Träuble, Frederik;
Wüthrich, Manuel;
Widmaier, Felix;
Gehler, Peter;
Winther, Ole;
Locatello, Francesco (ISTA);
Bachem, Olivier;
Schölkopf, Bernhard;
Bauer, Stefan
Abstract
Building sample-efficient agents that generalize out-of-distribution (OOD) in real-world settings remains a fundamental unsolved problem on the path towards achieving higher-level cognition. One particularly promising approach is to begin with low-dimensional, pretrained representations of our world, which should facilitate efficient downstream learning and generalization. By training 240 representations and over 10,000 reinforcement learning (RL) policies on a simulated robotic setup, we evaluate to what extent different properties of pretrained VAE-based representations affect the OOD generalization of downstream agents. We observe that many agents are surprisingly robust to realistic distribution shifts, including the challenging sim-to-real case. In addition, we find that the generalization performance of a simple downstream proxy task reliably predicts the generalization performance of our RL agents under a wide range of OOD settings. Such proxy tasks can thus be used to select pretrained representations that will lead to agents that generalize.
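The abstract outlines the interface the paper studies: a frozen, pretrained VAE encoder maps image observations to a low-dimensional latent, and a downstream RL policy acts on that latent. The following is a minimal, hypothetical PyTorch sketch of that interface only; the class names, layer sizes, latent_dim=16, and action_dim=9 are illustrative placeholders and not the authors' implementation.

```python
import torch
import torch.nn as nn

class VAEEncoder(nn.Module):
    """Stand-in for a pretrained VAE encoder (illustrative, not the paper's architecture)."""
    def __init__(self, latent_dim: int = 16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=4, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.fc_mu = nn.Linear(64, latent_dim)  # mean of q(z|x), used as the representation

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc_mu(self.features(x))

class Policy(nn.Module):
    """Small MLP policy that consumes the pretrained latent representation."""
    def __init__(self, latent_dim: int = 16, action_dim: int = 9):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, action_dim), nn.Tanh(),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z)

encoder = VAEEncoder()
for p in encoder.parameters():      # the representation stays frozen downstream
    p.requires_grad_(False)
policy = Policy()

obs = torch.rand(1, 3, 64, 64)      # one RGB camera observation (placeholder size)
with torch.no_grad():
    z = encoder(obs)                # low-dimensional pretrained representation
action = policy(z)                  # only the policy would be trained with RL
print(z.shape, action.shape)        # torch.Size([1, 16]) torch.Size([1, 9])
```

In the paper's setting, many such encoders are pretrained and could then be ranked by the OOD performance of a cheap proxy task before committing to RL training; the sketch above shows only the frozen-encoder-to-policy data flow.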
Publishing Year
2022
Date Published
2022-04-25
Proceedings Title
10th International Conference on Learning Representations
Conference
ICLR: International Conference on Learning Representations
Conference Location
Virtual
Conference Date
2022-04-25 – 2022-04-29
Cite this
Dittadi A, Träuble F, Wüthrich M, et al. The role of pretrained representations for the OOD generalization of reinforcement learning agents. In: 10th International Conference on Learning Representations. ; 2022.
Dittadi, A., Träuble, F., Wüthrich, M., Widmaier, F., Gehler, P., Winther, O., … Bauer, S. (2022). The role of pretrained representations for the OOD generalization of reinforcement learning agents. In 10th International Conference on Learning Representations. Virtual.
Dittadi, Andrea, Frederik Träuble, Manuel Wüthrich, Felix Widmaier, Peter Gehler, Ole Winther, Francesco Locatello, Olivier Bachem, Bernhard Schölkopf, and Stefan Bauer. “The Role of Pretrained Representations for the OOD Generalization of Reinforcement Learning Agents.” In 10th International Conference on Learning Representations, 2022.
A. Dittadi et al., “The role of pretrained representations for the OOD generalization of reinforcement learning agents,” in 10th International Conference on Learning Representations, Virtual, 2022.
Dittadi, Andrea, et al. “The Role of Pretrained Representations for the OOD Generalization of Reinforcement Learning Agents.” 10th International Conference on Learning Representations, 2022.
All files available under the following license(s):
Copyright Statement:
This Item is protected by copyright and/or related rights. [...]
Link(s) to Main File(s)
Access Level
Open Access
Sources
arXiv 2107.05686