{"language":[{"iso":"eng"}],"publication":"The 11th International Conference on Learning Representations","day":"01","author":[{"full_name":"Moschella, Luca","first_name":"Luca","last_name":"Moschella"},{"full_name":"Maiorca, Valentino","last_name":"Maiorca","first_name":"Valentino"},{"full_name":"Fumero, Marco","last_name":"Fumero","first_name":"Marco"},{"full_name":"Norelli, Antonio","last_name":"Norelli","first_name":"Antonio"},{"full_name":"Locatello, Francesco","id":"26cfd52f-2483-11ee-8040-88983bcc06d4","first_name":"Francesco","orcid":"0000-0002-4850-0683","last_name":"Locatello"},{"full_name":"Rodolà, Emanuele","last_name":"Rodolà","first_name":"Emanuele"}],"main_file_link":[{"url":"https://arxiv.org/abs/2209.15430","open_access":"1"}],"publication_status":"published","month":"05","date_created":"2023-08-22T14:22:20Z","year":"2023","external_id":{"arxiv":["2209.15430"]},"oa":1,"abstract":[{"lang":"eng","text":"Neural networks embed the geometric structure of a data manifold lying in a high-dimensional space into latent representations. Ideally, the distribution of the data points in the latent space should depend only on the task, the data, the loss, and other architecture-specific constraints. However, factors such as the random weights initialization, training hyperparameters, or other sources of randomness in the training phase may induce incoherent latent spaces that hinder any form of reuse. Nevertheless, we empirically observe that, under the same data and modeling choices, the angles between the encodings within distinct latent spaces do not change. In this work, we propose the latent similarity between each sample and a fixed set of anchors as an alternative data representation, demonstrating that it can enforce the desired invariances without any additional training. We show how neural architectures can leverage these relative representations to guarantee, in practice, invariance to latent isometries and rescalings, effectively enabling latent space communication: from zero-shot model stitching to latent space comparison between diverse settings. We extensively validate the generalization capability of our approach on different datasets, spanning various modalities (images, text, graphs), tasks (e.g., classification, reconstruction) and architectures (e.g., CNNs, GCNs, transformers)."}],"quality_controlled":"1","status":"public","date_published":"2023-05-01T00:00:00Z","conference":{"start_date":"2023-05-01","name":"International Conference on Machine Learning Representations","location":"Kigali, Rwanda","end_date":"2023-05-05"},"_id":"14217","extern":"1","article_processing_charge":"No","citation":{"ama":"Moschella L, Maiorca V, Fumero M, Norelli A, Locatello F, Rodolà E. Relative representations enable zero-shot latent space communication. In: The 11th International Conference on Learning Representations. ; 2023.","apa":"Moschella, L., Maiorca, V., Fumero, M., Norelli, A., Locatello, F., & Rodolà, E. (2023). Relative representations enable zero-shot latent space communication. In The 11th International Conference on Learning Representations. Kigali, Rwanda.","chicago":"Moschella, Luca, Valentino Maiorca, Marco Fumero, Antonio Norelli, Francesco Locatello, and Emanuele Rodolà. “Relative Representations Enable Zero-Shot Latent Space Communication.” In The 11th International Conference on Learning Representations, 2023.","short":"L. Moschella, V. Maiorca, M. Fumero, A. Norelli, F. Locatello, E. 
Rodolà, in:, The 11th International Conference on Learning Representations, 2023.","ieee":"L. Moschella, V. Maiorca, M. Fumero, A. Norelli, F. Locatello, and E. Rodolà, “Relative representations enable zero-shot latent space communication,” in The 11th International Conference on Learning Representations, Kigali, Rwanda, 2023.","ista":"Moschella L, Maiorca V, Fumero M, Norelli A, Locatello F, Rodolà E. 2023. Relative representations enable zero-shot latent space communication. The 11th International Conference on Learning Representations. International Conference on Machine Learning Representations.","mla":"Moschella, Luca, et al. “Relative Representations Enable Zero-Shot Latent Space Communication.” The 11th International Conference on Learning Representations, 2023."},"title":"Relative representations enable zero-shot latent space communication","user_id":"2DF688A6-F248-11E8-B48F-1D18A9856A87","date_updated":"2023-09-13T09:44:26Z","type":"conference","department":[{"_id":"FrLo"}],"oa_version":"Preprint"}