LUCA MOSCHELLA

PhD Graduate

PhD program: XXXV

Co-supervisor: Prof. Emanuele Rodolà

Thesis title: Latent Communication in Artificial Neural Networks

As neural networks (NNs) permeate various scientific and industrial domains, understanding the universality and reusability of their representations becomes crucial. At their core, these networks create intermediate neural representations of the input data, referred to as latent spaces, and subsequently leverage them to perform specific downstream tasks. This dissertation focuses on the universality and reusability of neural representations: do the latent representations crafted by an NN remain exclusive to a particular trained instance, or can they generalize across models, adapting to factors such as randomness during training, model architecture, or even data domain? This adaptive quality introduces the notion of Latent Communication, a phenomenon that describes when representations can be unified or reused across neural spaces.

A salient observation from our research is the emergence of similarities in latent representations, even when these originate from distinct or seemingly unrelated NNs. By exploiting a partial correspondence between the two data distributions that establishes a semantic link, we found that these representations can either be projected into a universal representation (Moschella, Maiorca, et al., 2023), coined the Relative Representation, or be directly translated from one space to another (Maiorca et al., 2023). Intriguingly, this holds even when the transformation relating the spaces is unknown (Cannistraci, Moschella, Fumero, et al., 2024) and when the semantic bridge between them is minimal (Cannistraci, Moschella, Maiorca, et al., 2023).

Latent Communication builds a bridge between independently trained NNs, irrespective of their training regimen, architecture, or the data modality they were trained on, as long as the semantic content of the data stays the same (e.g., images and their captions). This holds for generation, classification, and retrieval downstream tasks; in supervised, weakly supervised, and unsupervised settings; and spans various data modalities including images, text, audio, and graphs, showcasing the universality of the Latent Communication phenomenon.

From a practical standpoint, our research offers the potential to repurpose and reuse models, circumventing the need for resource-intensive retraining; enables the transfer of knowledge across them; and allows for downstream performance evaluation directly in the latent space. Indeed, several works have leveraged the insights from our Latent Communication research (Kiefer and Buckley, 2024; Z. Wu, Y. Wu, and Mou, 2024; Jian et al., 2023; Norelli, Fumero, et al., 2023; G. Wang et al., 2023). For example, relative representations have been instrumental in attaining state-of-the-art results in Weakly Supervised Vision-and-Language Pretraining (C. Chen et al., 2023). Reflecting its significance, (Moschella, Maiorca, et al., 2023) was presented as an oral at ICLR 2023, and Latent Communication was a central theme of the UniReps: Unifying Representations in Neural Models Workshop at NeurIPS 2023, co-organized by our team.
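The two mechanisms mentioned above can be made concrete with short sketches. The first is a minimal illustration of the relative-representation idea, assuming cosine similarity as the comparison function between each sample and a shared set of anchor samples; the function names and tensor shapes are illustrative rather than taken from any released codebase.

```python
import torch
import torch.nn.functional as F

def relative_representation(latents: torch.Tensor, anchors: torch.Tensor) -> torch.Tensor:
    """Re-express each latent vector by its cosine similarity to a set of anchors.

    latents: (N, D) absolute embeddings produced by some encoder.
    anchors: (A, D) embeddings of the anchor samples from the same encoder.
    Returns: (N, A) relative representation, invariant to rotations and
    rescalings of the original latent space.
    """
    latents = F.normalize(latents, dim=-1)  # unit-norm rows
    anchors = F.normalize(anchors, dim=-1)
    return latents @ anchors.T              # cosine similarity to each anchor
```

A direct translation between two latent spaces can likewise be sketched, assuming an orthogonal map estimated from corresponding anchor pairs (orthogonal Procrustes); again, this is an illustrative approximation of the semantic-alignment approach rather than its exact implementation.

```python
def estimate_translation(src_anchors: torch.Tensor, tgt_anchors: torch.Tensor) -> torch.Tensor:
    """Estimate an orthogonal map R such that src_anchors @ R ≈ tgt_anchors.

    src_anchors, tgt_anchors: (A, D) embeddings of the same anchor samples
    in the source and target latent spaces. The closed-form Procrustes
    solution is R = U @ Vh, where U, S, Vh = svd(src_anchors.T @ tgt_anchors).
    """
    u, _, vt = torch.linalg.svd(src_anchors.T @ tgt_anchors)
    return u @ vt  # apply as: translated = src_latents @ R
```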

Research products

11573/1712069 - 2024 - From Bricks to Bridges: Product of Invariances to Enhance Latent Space Communication
Cannistraci, Irene; Moschella, Luca; Fumero, Marco; Maiorca, Valentino; Rodolà, Emanuele - 04b Conference paper in proceedings
conference: The Twelfth International Conference on Learning Representations (Vienna, Austria)
book: International Conference on Learning Representations

11573/1713815 - 2023 - Bootstrapping Parallel Anchors for Relative Representations
Cannistraci, Irene; Moschella, Luca; Maiorca, Valentino; Fumero, Marco; Norelli, Antonio; Rodolà, Emanuele - 04b Conference paper in proceedings
conference: Tiny Papers Track at ICLR (Kigali, Rwanda)
book: The First Tiny Papers Track at ICLR 2023

11573/1698842 - 2023 - Latent Space Translation via Semantic Alignment
Maiorca, Valentino; Moschella, Luca; Norelli, Antonio; Fumero, Marco; Locatello, Francesco; Rodolà, Emanuele - 04b Conference paper in proceedings
conference: Thirty-seventh Conference on Neural Information Processing Systems (New Orleans, Louisiana, United States of America)
book: Advances in Neural Information Processing Systems

11573/1673206 - 2023 - Relative representations enable zero-shot latent space communication
Moschella, Luca; Maiorca, Valentino; Fumero, Marco; Norelli, Antonio; Locatello, Francesco; Rodolà, Emanuele - 04b Conference paper in proceedings
conference: The Eleventh International Conference on Learning Representations (Kigali, Rwanda)
book: International Conference on Learning Representations

11573/1672156 - 2022 - Metric Based Few-Shot Graph Classification
Crisostomi, Donato; Antonelli, Simone; Maiorca, Valentino; Moschella, Luca; Marin, Riccardo; Rodolà, Emanuele - 04b Conference paper in proceedings
conference: First Learning on Graphs Conference (Virtual Conference)
book: Proceedings of the First Learning on Graphs Conference

11573/1649224 - 2022 - Learning Spectral Unions of Partial Deformable 3D Shapes
Moschella, Luca; Melzi, Simone; Cosmo, Luca; Maggioli, Filippo; Litany, Or; Ovsjanikov, Maks; Guibas, Leonidas; Rodolà, Emanuele - 02a Book chapter or article
book: Computer Graphics Forum

11573/1615701 - 2021 - Shape Registration in the Time of Transformers
Trappolini, Giovanni; Cosmo, Luca; Moschella, Luca; Marin, Riccardo; Melzi, Simone; Rodolà, Emanuele - 04b Conference paper in proceedings
conference: Advances in Neural Information Processing Systems (was NIPS) (Online)
book: ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021) - (9781713845393; 1713845393)

11573/1015128 - 2017 - Performance variations of the Bayesian model of peer-assessment implemented in OpenAnswer response to modifications of the number of peers assessed and of the quality of the class
De Marsico, Maria; Moschella, Luca; Sterbini, Andrea; Temperini, Marco - 04b Conference paper in proceedings
conference: 16th International Conference on Information Technology Based Higher Education and Training, ITHET (Ohrid, Macedonia)
book: Proceedings of 2017 16th International Conference on Information Technology Based Higher Education and Training (ITHET) - (978-1-5386-3968-9)

11573/978646 - 2017 - Effects of network topology on the OpenAnswer’s Bayesian model of peer assessment
De Marsico, Maria; Moschella, Luca; Sterbini, Andrea; Temperini, Marco - 04b Conference paper in proceedings
conference: 12th European Conference on Technology Enhanced Learning, EC-TEL 2017 (Tallinn, Estonia)
book: Data Driven Approaches in Digital Education
