
More Than We Thought

There are few things as astounding as the brain and human memory. Yet Salk researchers and collaborators have gained critical insight into the size of neural connections, revealing that the brain's memory capacity is much higher than previously believed. The work also answers a longstanding question about how the brain manages to be so energy-efficient, which could help engineers build computers that are not only incredibly powerful but also conserve energy. The researchers say they have uncovered the design principle that lets hippocampal neurons achieve high computational power on low energy. Their new measurements put the brain's memory capacity at around 10 times what was previously thought, on the order of a petabyte, roughly as much as the entire World Wide Web.

Our thoughts and memories are the result of patterns of electrical and chemical activity in the brain. A key part of this activity happens where branches of neurons, much like electrical wires, interact at junctions known as synapses. An output wire (the axon) of one neuron connects to an input wire (a dendrite) of a second neuron. Signals travel across the synapse as chemicals called neurotransmitters, telling the receiving neuron whether to convey an electrical signal to other neurons. Each neuron can have thousands of these synapses with thousands of other neurons.
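If you like to think in code, here is a minimal toy sketch of that wiring (purely illustrative; the Neuron class, the strengths, and the threshold are invented for this post, not taken from the study). One neuron relays a signal across synapses of different strengths, and only the strong connection gets its message through:

# Toy model of the wiring described above: an axon's signal crosses
# synapses to the dendrite of a downstream neuron. Strong synapses
# are more likely to drive the receiving neuron.

class Neuron:
    def __init__(self, name):
        self.name = name
        self.synapses = []          # list of (target neuron, synaptic strength)

    def connect(self, target, strength):
        self.synapses.append((target, strength))

    def fire(self, threshold=0.5):
        # Neurotransmitter release, abstracted to a weighted signal.
        for target, strength in self.synapses:
            if strength >= threshold:
                print(f"{self.name} -> {target.name}: signal relayed")
            else:
                print(f"{self.name} -> {target.name}: too weak, not relayed")

a, b = Neuron("A"), Neuron("B")
a.connect(b, 0.8)   # a large, strong synapse
a.connect(b, 0.3)   # a smaller, weaker synapse
a.fire()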

Synapses remain something of a mystery, although their dysfunction can cause a range of neurological diseases. Larger synapses are stronger, making them more likely to activate their surrounding neurons than medium or small synapses. While building a 3D reconstruction of rat hippocampus tissue, the Salk team noticed that in some cases a single axon from one neuron formed two synapses reaching out to a single dendrite of a second neuron, meaning the first neuron appeared to be sending a duplicate message to the receiving one. The researchers didn't think much of this duplication at first, which occurs about 10 percent of the time in the hippocampus. Then one of them realized that because both synapses in such a pair share the same activity history, measuring the difference between the two would reveal how precisely synaptic sizes are set, which until then had only been classified in the field as small, medium, and large. To do this, the researchers used advanced microscopy and computational algorithms they had developed to image rat brains and reconstruct the connectivity, shapes, volumes, and surfaces of the brain tissue down to a nanomolecular level.
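Here is a hedged sketch of that paired-synapse trick. Since both synapses in a pair experienced the same history, the fractional size difference within each pair reveals how precisely synaptic sizes are tuned. The volumes below are made-up sample values for illustration, not measurements from the paper:

# Compare the two synapses in each shared axon-dendrite pair.
# Sample "spine head volumes" (invented, in arbitrary units).

pairs = [(0.110, 0.118), (0.052, 0.049), (0.210, 0.195)]

for v1, v2 in pairs:
    mean = (v1 + v2) / 2
    diff = abs(v1 - v2) / mean      # fractional difference within the pair
    print(f"pair difference: {diff:.1%}")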

While the researchers expected the paired synapses to be roughly similar in size, they were shocked to find them nearly identical, differing by only about eight percent on average. Since the memory capacity of neurons depends on synapse size, this eight percent difference was a key number the team could plug into their algorithmic models of the brain to measure how much information can be stored in synaptic connections. Armed with the knowledge that synapses can vary in increments as small as eight percent across a size range spanning a factor of 60, the team determined that there could be about 26 categories of synaptic sizes, as opposed to just a few.
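Twenty-six distinguishable sizes translates directly into an information estimate. As a quick sanity check of the arithmetic (a sketch only; the team derived the 26 categories from a signal-detection analysis of the data, not from this naive calculation), 26 categories correspond to about 4.7 bits of information per synapse:

import math

size_categories = 26                      # distinguishable synaptic sizes reported by the team
bits_per_synapse = math.log2(size_categories)
print(f"~{bits_per_synapse:.1f} bits of information per synapse")   # ~4.7 bits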

The researchers calculated that for the smallest synapses, about 1,500 signaling events (roughly 20 minutes' worth) are needed to cause a change in their size and strength, while for the largest synapses, only a couple hundred signaling events (1 to 2 minutes) cause a change. This means that every 2 to 20 minutes, your synapses are adjusting up or down to the next size. The implications of what the researchers have found are far-reaching, offering a valuable explanation for the brain's surprising efficiency: the brain runs on only about 20 watts of continuous power, about as much as a very dim light bulb. The discovery could help computer scientists build ultraprecise, energy-efficient computers that employ "deep learning" and artificial neural nets.
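Taking those numbers at face value (and reading "a couple hundred" and "1 to 2 minutes" as roughly 200 events over 1.5 minutes, which are my approximations, not the paper's), the implied plasticity rates look like this:

# Back-of-the-envelope rates implied by the article's figures.

small_events, small_minutes = 1500, 20    # smallest synapses
large_events, large_minutes = 200, 1.5    # largest synapses (approximate midpoints)

print(f"smallest synapses: ~{small_events / small_minutes:.0f} events/min before a size change")
print(f"largest synapses: ~{large_events / large_minutes:.0f} events/min before a size change")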

If you’d like to learn more, you can click here!
