There are few people in neuroscience as impactful as Donald Hebb. His research on the cellular mechanisms behind learning and memory has largely shaped our current understanding of these processes. Hebb synthesized years of work with primates and rats, along with his command of the existing neuroscientific literature, to propose his own theory of learning and memory in his groundbreaking book, The Organization of Behavior: A Neuropsychological Theory (Haider, 2008). In light of this week’s discussion, I thought it would be interesting to dive into his theory to better understand the molecular basis of learning and memory.
Hebbian theory postulates that learning unfolds through the repeated activation of populations of neurons. The repeated activation of cell A leads to the repeated activation of cell B, and so on down the chain. Over time, these connections strengthen through the addition of receptors and spiny processes between cell A and cell B (Brown et al., 2021; Hebb, 1949; Langille & Brown, 2018). Eventually, when cell A fires, cell B becomes far more likely to fire as well. So what does this look like? Let’s imagine you have set out to learn the Pythagorean theorem: a^2 + b^2 = c^2. Some neuron, cell A, becomes primed to fire when you think “What is the Pythagorean theorem?” and cell B becomes primed to fire when you think “a^2 + b^2 = c^2.” Through repeated study sessions, cell A will be activated, followed by cell B. Eventually, the connection between these two cells strengthens, so that when you are asked “What is the Pythagorean theorem?” cell B fires immediately, prompting the response “a^2 + b^2 = c^2.” (Keep in mind that this is a simplified example; in reality, hundreds if not thousands of neurons are likely involved in a task like this.)
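To make the idea concrete, here is a minimal sketch in Python of the classic Hebbian update rule, often written as delta_w = eta * pre * post. The learning rate, activity values, and number of study sessions are all illustrative assumptions on my part, not biological measurements:

```python
# A minimal sketch of the Hebbian update rule: when the presynaptic
# cell ("cell A") and the postsynaptic cell ("cell B") are active at
# the same time, the synapse between them gets stronger.
# All numbers below are illustrative assumptions.

learning_rate = 0.1
weight = 0.0  # strength of the A -> B connection

for session in range(20):       # twenty hypothetical study sessions
    pre_activity = 1.0          # cell A fires: "What is the Pythagorean theorem?"
    post_activity = 1.0         # cell B fires: "a^2 + b^2 = c^2"
    # Hebb's rule: delta_w = eta * pre * post
    weight += learning_rate * pre_activity * post_activity

print(f"Connection strength after studying: {weight:.1f}")  # -> 2.0
# Because A and B are repeatedly co-active, the weight grows every
# session, so firing cell A increasingly drives cell B to fire.
```

The key property to notice is that the weight only grows when both cells are active together, which is exactly the “cells that fire together wire together” intuition behind the theory.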
Although this theory was first put forth by Hebb in the late 1940s, it still plays an important role in our understanding of learning and memory today. In fact, Hebb’s theory laid the groundwork for the neural network techniques used to build artificial intelligence software (Wu & Feng, 2017). It just goes to show the far-reaching impact of scientific discovery!
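For a concrete sense of that connection, here is a minimal sketch of a Hopfield-style associative memory, one of the classic artificial networks whose weights are set with the Hebbian outer-product rule. The pattern values and network size are hypothetical choices for illustration:

```python
import numpy as np

# A tiny Hopfield-style associative memory whose weight matrix is
# built with the Hebbian outer-product rule: units that are active
# together get a stronger mutual connection.
# The stored pattern and network size are illustrative assumptions.

pattern = np.array([1, -1, 1, 1, -1])  # a stored "memory" (+1/-1 activity)

# Hebbian rule: w_ij = x_i * x_j, with no self-connections
W = np.outer(pattern, pattern)
np.fill_diagonal(W, 0)

# Present a corrupted cue (one unit flipped), then take one update step.
cue = pattern.copy()
cue[0] *= -1

recalled = np.sign(W @ cue)
print("recalled:", recalled)                      # matches the stored pattern
print("match:", np.array_equal(recalled, pattern))  # -> True
```

Even from a partial cue, the Hebbian weights pull the network back to the full stored pattern, much like cell A prompting cell B in the studying example above.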