Physicist working on neural networks and machine learning, specializing in associative memory models and brain-inspired AI algorithms.
I am a physicist working on neural networks and machine learning. I am a member of the research staff at the MIT-IBM Watson AI Lab and IBM Research in Cambridge, MA. Prior to this, I was a member of the Institute for Advanced Study in Princeton.
Broadly defined, my research focuses on the theoretical foundations of artificial neural networks, with particular emphasis on associative memory, energy-based models, and brain-inspired architectures. My work bridges physics, neuroscience, and AI, seeking to understand and develop algorithms that process information as efficiently as biological neural networks do.
Together with John Hopfield, I developed Dense Associative Memories, which significantly increased the information storage capacity of Hopfield Networks. My research has implications both for our understanding of how biological brains store and retrieve information and for the development of more efficient AI systems.
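For readers who want a concrete feel for what a Dense Associative Memory does, here is a minimal sketch in NumPy. It is an illustration, not the published implementation: the function names (energy, update) and all parameter choices are mine. It uses an energy of the form E = -Σ_μ F(ξ_μ · σ) with a rapidly growing F (here a rectified polynomial), which is the ingredient that raises the storage capacity above that of the classical Hopfield model (recovered when F is quadratic).

```python
# A minimal sketch of a Dense Associative Memory (illustrative, not the original code).
import numpy as np

rng = np.random.default_rng(0)

def energy(sigma, memories, n=3):
    # Dense Associative Memory energy: E = -sum_mu F(xi_mu . sigma),
    # here with the rectified polynomial F(x) = max(x, 0)**n.
    overlaps = memories @ sigma
    return -np.sum(np.maximum(overlaps, 0.0) ** n)

def update(sigma, memories, n=3):
    # One asynchronous sweep: set each spin to whichever sign gives lower energy.
    sigma = sigma.copy()
    for i in range(sigma.size):
        plus, minus = sigma.copy(), sigma.copy()
        plus[i], minus[i] = 1, -1
        sigma[i] = 1 if energy(plus, memories, n) <= energy(minus, memories, n) else -1
    return sigma

# Toy usage: store K random +/-1 patterns over N neurons, corrupt one, recover it.
N, K = 64, 20
memories = rng.choice([-1, 1], size=(K, N))
probe = memories[0].copy()
probe[: N // 4] *= -1                      # flip a quarter of the bits
recovered = update(probe, memories)
print("overlap with the stored pattern:", recovered @ memories[0] / N)
```

With these toy settings the corrupted pattern is typically restored in a single sweep; increasing the exponent n makes the energy minima sharper and lets the network hold many more patterns than the quadratic (classical Hopfield) case.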
If you want to learn more about my work, please check out this recent Q&A with me or a profile of my research.
A recent overview of Dense Associative Memory models and their applications, IPAM at UCLA. Watch Full Talk →
2025 New Year Opening Lecture of the Van Vreeswijk Theoretical Neuroscience Seminar series. Watch Full Lecture →
Review of Energy Transformer and related applications. Watch Full Seminar →
Oral talk of the original Dense Associative Memory paper at NeurIPS 2016. Watch Full Talk →
MIT 6.S191 Introduction to Deep Learning course. Watch Full Lecture →
An exploration of how statistical physics and energy landscapes inspired the development of neural networks and artificial intelligence. Read Article →
New research reveals how astrocytes, previously overlooked brain cells, play a crucial role in memory formation and storage. Read Article →
Coverage of the Nobel Prize in Physics awarded for foundational work on machine learning and artificial neural networks. Read Article →
MIT researchers develop AI models to better understand the role of astrocytes in brain function and memory processing. Read Article →
Research suggests that previously overlooked brain cells could be key to understanding the enormous storage capacity of human memory. Read Article →
The Nobel Prize in Physics recognizes groundbreaking contributions to the field of machine learning. Read Article →