LUDOVICA SERRICCHIO

PhD Graduate

PhD program: XXXVIII


Advisors: Giancarlo Ruocco, Federico Ricci-Tersenghi
Co-supervisor: Enzo Marinari

Thesis title: Beyond Hebbian Learning: Optimizing Memory Retrieval and Network Structure in Bio-Inspired Neural Models

The study of learning and memory in neural systems has long inspired theoretical models that bridge biology and computation. Among these, the Hopfield model established a foundational framework for understanding associative memory through the lens of statistical physics, describing collective neural dynamics as the minimization of an energy function in a high-dimensional landscape. Despite its analytical tractability, however, the model suffers from intrinsic limitations: its storage capacity saturates at a critical load 𝛼_𝑐 ≃ 0.138, beyond which retrieval fails and spurious attractors dominate. This thesis addresses these limitations along two complementary research directions aimed at enhancing both the performance and the biological realism of associative memory networks.

In Part I, we introduce the Daydreaming algorithm, a local learning rule that integrates Hebbian learning and unlearning into a unified, continuously adaptive process. Unlike traditional approaches that require fine-tuning or risk catastrophic forgetting, the Daydreaming algorithm autonomously refines synaptic couplings by reinforcing memory attractors while destabilizing spurious ones. Extensive numerical analyses demonstrate that it extends the retrieval phase of the Hopfield model far beyond the classical limit, achieving near-optimal storage capacity (𝛼 = 1) for uncorrelated patterns and even higher effective capacity for structured, correlated data. Applied to real-world datasets such as MNIST, the algorithm exhibits the spontaneous emergence of class prototypes, revealing an intrinsic ability to generalize and extract latent structure without supervision.

In Part II, the focus shifts from learning rules to network architecture. We investigate how asymmetry, dilution, and topological directionality in the coupling matrix reshape the dynamics of non-Hamiltonian spin systems.
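The classical Hebbian construction and retrieval dynamics underlying Part I, together with a daydreaming-style learning/unlearning step, can be sketched as follows. The sizes, the learning rate, and the exact form of the unlearning update are illustrative assumptions, not the thesis' published rule.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                          # neurons, stored patterns (load alpha = P/N = 0.05)
xi = rng.choice([-1, 1], size=(P, N))   # random uncorrelated binary patterns

# Hebbian couplings: J_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-coupling
J = xi.T @ xi / N
np.fill_diagonal(J, 0.0)

def relax(J, s, sweeps=50):
    """Zero-temperature asynchronous dynamics until a fixed point is reached."""
    s = s.copy()
    for _ in range(sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            new = 1 if J[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s

# Retrieval: start from a corrupted copy of pattern 0 (10% of spins flipped)
probe = xi[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
fixed = relax(J, probe)
overlap = fixed @ xi[0] / N             # overlap m = 1 means perfect retrieval
print(f"overlap with stored pattern: {overlap:.2f}")

# Schematic daydreaming-style update (assumed form): reinforce a stored pattern
# ("day") and unlearn the attractor reached by free relaxation from noise ("dream").
eps = 0.01
mu = rng.integers(P)
J += eps * np.outer(xi[mu], xi[mu]) / N
dream = relax(J, rng.choice([-1, 1], size=N))
J -= eps * np.outer(dream, dream) / N
np.fill_diagonal(J, 0.0)
```

At this load (𝛼 = 0.05, well below 𝛼_𝑐 ≃ 0.138) plain Hebbian storage already retrieves the pattern; the point of the daydreaming-style iteration is that repeating it keeps retrieval working as the load grows past the classical limit.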
Using exhaustive numerical simulations, we systematically explore the three-dimensional parameter space defined by connection density (𝜌), synaptic correlation (𝜂), and topological asymmetry (𝑎_1). The results reveal rich dynamical regimes, including fixed points, limit cycles, and chaotic behavior, and identify an optimal regime of high dilution (𝜌 < 0.2) that combines maximal attractor diversity, short convergence times, and robustness, features consistent with biological neural circuits.

Together, these two lines of investigation establish a unified perspective on memory and learning in recurrent systems: efficient information storage emerges from the interplay between local adaptive plasticity and global structural constraints. The findings contribute both to the theoretical understanding of associative memory and to the design of biologically inspired neural architectures.
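A minimal sketch of the kind of dynamical survey carried out in Part II. The parameterization below (𝜌 as the probability that a spin pair is connected, 𝑎_1 as the probability that a connection is made one-directional) is an illustrative assumption, it omits the synaptic-correlation parameter 𝜂, and the synchronous dynamics and attractor classification are likewise schematic, not the thesis' exact protocol.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_network(N, rho, a1):
    """Diluted +-1 coupling matrix: each spin pair is connected with probability
    rho; with probability a1 one direction of the connection is dropped,
    introducing topological asymmetry. Parameterization is illustrative."""
    J = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1, N):
            if rng.random() < rho:
                J[i, j] = J[j, i] = rng.choice([-1.0, 1.0])
                if rng.random() < a1:          # make the link one-directional
                    if rng.random() < 0.5:
                        J[i, j] = 0.0
                    else:
                        J[j, i] = 0.0
    return J

def attractor_period(J, s0, t_max=500):
    """Run synchronous parallel dynamics and return the period of the attractor:
    1 = fixed point, >1 = limit cycle, None = no recurrence within t_max."""
    seen = {}
    s = s0.copy()
    for t in range(t_max):
        key = s.tobytes()
        if key in seen:
            return t - seen[key]
        seen[key] = t
        s = np.where(J @ s >= 0, 1, -1)
    return None

N = 60
J = random_network(N, rho=0.15, a1=0.5)        # highly diluted, partly asymmetric
periods = [attractor_period(J, rng.choice([-1, 1], size=N)) for _ in range(20)]
print("attractor periods over 20 random initial states:", periods)
```

Sweeping `rho` and `a1` over a grid and histogramming the resulting periods gives a crude map of the fixed-point, limit-cycle, and non-recurrent regimes that the full survey characterizes in far greater detail.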

Research products

11573/1736930 - 2025 - Daydreaming Hopfield networks and their surprising effectiveness on correlated data
Serricchio, L.; Bocchi, D.; Chilin, C.; Marino, R.; Negri, M.; Cammarota, C.; Ricci-Tersenghi, F. - 01a Journal article
paper: NEURAL NETWORKS (Elsevier) pp. 1-12 - issn: 0893-6080 - wos: WOS:001439509100001 (2) - scopus: 2-s2.0-85218148358 (1)

© Università degli Studi di Roma "La Sapienza" - Piazzale Aldo Moro 5, 00185 Roma