Thesis title: Beyond Hebbian Learning: Optimizing Memory Retrieval and Network Structure in Bio-Inspired Neural Models
The study of learning and memory in neural systems has long inspired the development of theoretical models that bridge biology and computation. Among these, the Hopfield model established a foundational framework for understanding associative memory through the lens of statistical physics, describing collective neural dynamics as the minimization of an energy function in a high-dimensional landscape. However, despite its analytical tractability, the model suffers from intrinsic limitations: its storage capacity saturates at a critical load 𝛼_𝑐 ≃ 0.138, beyond which retrieval fails and spurious attractors dominate.
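For orientation, in standard textbook notation (not specific to this thesis): with 𝑁 binary neurons 𝑠_i = ±1 storing 𝑃 patterns 𝜉^𝜇 through the Hebbian rule, the energy, couplings, and load read

```latex
E(\mathbf{s}) = -\frac{1}{2}\sum_{i \neq j} J_{ij}\, s_i s_j,
\qquad
J_{ij} = \frac{1}{N} \sum_{\mu=1}^{P} \xi_i^{\mu} \xi_j^{\mu},
\qquad
\alpha = \frac{P}{N},
```

and retrieval is reliable only for loads below the critical value 𝛼_𝑐 quoted above.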
This thesis addresses these limitations along two complementary research directions, both aimed at enhancing the performance and the biological realism of associative memory networks.
In Part I, we introduce the Daydreaming algorithm, a local learning rule that integrates Hebbian learning and unlearning into a unified, continuously adaptive process. Unlike traditional approaches that require fine-tuning or risk catastrophic forgetting, the Daydreaming algorithm autonomously refines synaptic couplings by reinforcing memory attractors while destabilizing spurious ones. Extensive numerical analyses demonstrate that it extends the retrieval phase of the Hopfield model far beyond the classical limit, achieving near-optimal storage capacity (𝛼=1) for uncorrelated patterns and even higher effective capacity for structured, correlated data. Applied to real-world datasets such as MNIST, the algorithm exhibits spontaneous emergence of class prototypes, revealing an intrinsic ability to generalize and extract latent structure without supervision.
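To make the structure of the rule concrete, the following is a minimal numerical sketch of a daydreaming-style update, under stated assumptions rather than the thesis's exact implementation: the function names, the learning rate lr, and the one-pattern/one-dream schedule per step are illustrative. Couplings are nudged toward a stored pattern (Hebbian term) and away from whatever attractor the free dynamics reaches from a random initial state (anti-Hebbian term), so that genuine memories are reinforced while spurious attractors are progressively destabilized.

```python
import numpy as np

def relax(J, s, max_sweeps=100):
    """Asynchronous zero-temperature dynamics until a fixed point (or sweep cap)."""
    n = len(s)
    for _ in range(max_sweeps):
        changed = False
        for i in np.random.permutation(n):
            new = 1.0 if J[i] @ s >= 0 else -1.0
            if new != s[i]:
                s[i], changed = new, True
        if not changed:
            break
    return s

def daydream_step(J, patterns, lr=0.01):
    """One illustrative daydreaming-style iteration (hypothetical form):
    Hebbian reinforcement of a stored pattern, anti-Hebbian unlearning
    of an attractor reached from a random state."""
    n = J.shape[0]
    xi = patterns[np.random.randint(len(patterns))]      # "day": a stored memory
    s = relax(J, np.random.choice([-1.0, 1.0], size=n))  # "dream": free relaxation
    J += (lr / n) * (np.outer(xi, xi) - np.outer(s, s))
    np.fill_diagonal(J, 0.0)
    return J
```

Iterating such local updates is what extends the retrieval phase beyond 𝛼_𝑐 in the thesis's experiments; the sketch is only meant to convey the locality of the rule and its learning/unlearning balance.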
In Part II, the focus shifts from learning rules to network architecture. We investigate how asymmetry, dilution, and topological directionality in the coupling matrix reshape the dynamics of non-Hamiltonian spin systems. Through extensive numerical simulations, we systematically explore the three-dimensional parameter space defined by connection density (𝜌), synaptic correlation (𝜂), and topological asymmetry (𝑎_1). The results reveal rich dynamical behavior, including fixed points, limit cycles, and chaos, and identify an optimal region of high dilution (𝜌 < 0.2) that combines maximal attractor diversity, short convergence times, and robustness, features consistent with biological neural circuits.
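As a concrete illustration of the kind of exploration involved, here is a sketch under explicit assumptions: a standard Gaussian coupling ensemble with connection density 𝜌 and pairwise symmetry correlation 𝜂 (the thesis's exact ensemble, and in particular the topological-asymmetry parameter 𝑎_1, may be defined differently and is omitted here), plus a routine that classifies a trajectory of the parallel sign dynamics as a fixed point, a limit cycle, or non-recurrent within the step budget.

```python
import numpy as np

def coupling_matrix(n, rho, eta, rng):
    """Gaussian couplings with symmetry correlation eta (eta = 1: symmetric,
    eta = -1: antisymmetric, eta = 0: independent), diluted by an undirected
    mask with connection density rho. Illustrative ensemble only."""
    a = rng.standard_normal((n, n))
    b = rng.standard_normal((n, n))
    S = (a + a.T) / np.sqrt(2.0)                      # symmetric part
    A = (b - b.T) / np.sqrt(2.0)                      # antisymmetric part
    J = np.sqrt((1 + eta) / 2.0) * S + np.sqrt((1 - eta) / 2.0) * A
    mask = np.triu(rng.random((n, n)) < rho, 1)
    mask = mask | mask.T                              # symmetric dilution mask
    return J * mask / np.sqrt(rho * n)                # O(1) local fields

def classify_dynamics(J, s, max_steps=1000):
    """Iterate parallel sign dynamics s <- sign(J s) and report
    ('fixed', time), ('cycle', period), or ('no-recurrence', max_steps)."""
    seen = {}
    for t in range(max_steps):
        key = s.tobytes()
        if key in seen:
            period = t - seen[key]
            return ("fixed", t - 1) if period == 1 else ("cycle", period)
        seen[key] = t
        s = np.where(J @ s >= 0, 1.0, -1.0)
    return ("no-recurrence", max_steps)

rng = np.random.default_rng(0)
n = 200
J = coupling_matrix(n, rho=0.15, eta=0.5, rng=rng)
print(classify_dynamics(J, rng.choice([-1.0, 1.0], size=n)))
```

Sweeping rho and eta over a grid and tallying the returned labels is the basic numerical experiment behind phase diagrams of this kind.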
Together, these two lines of investigation establish a unified perspective on memory and learning in recurrent systems: efficient information storage emerges from the interplay between local adaptive plasticity and global structural constraints. The findings contribute to both the theoretical understanding of associative memory and the design of biologically inspired neural architectures.