The human brain can store 10 TIMES as many memories as previously thought, says study
· Scientists found storage capacity of synapses by measuring their size
· They found, on average, a synapse can hold about 4.7 bits of information
· This means that the entire human brain has a capacity of one petabyte
· This is the same as about 20 million four-drawer filing cabinets filled with text
The human brain has a capacity that is ten times greater than first thought.
This is according to US scientists who have measured the storage capacity of synapses - the brain connections that are responsible for storing memories.
They discovered that, on average, one synapse can hold about 4.7 bits of information. This means that the human brain has a capacity of one petabyte, or 1,000,000,000,000,000 bytes.
One petabyte is roughly equivalent to 20 million four-drawer filing cabinets filled with text, or about 13.3 years of HD-TV recordings.
WHAT DOES 1 PETABYTE EQUAL?
On average, one synapse can hold about 4.7 bits of information.
This means that the human brain has a capacity of one petabyte, or 1,000,000,000,000,000 bytes.
One petabyte is roughly equivalent to 20 million four-drawer filing cabinets filled with text, or about 13.3 years of HD-TV video.
20 petabytes is around the same amount of data that is processed by Google each day.
Meanwhile, 50 petabytes can hold the entire written works of humankind, from the beginning of recorded history, in all languages.
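As a back-of-envelope sketch of how a per-synapse figure scales to a petabyte-order total: the synapse count below is an illustrative order-of-magnitude assumption, not a number measured in the study.

```python
# Rough scaling: bits per synapse times an assumed total synapse count.
# The ~1.7e15 synapse count is an illustrative assumption chosen to show
# the order of magnitude, not a figure reported by the Salk team.
bits_per_synapse = 4.7
synapse_count = 1.7e15           # assumed, for illustration only

total_bits = bits_per_synapse * synapse_count
total_bytes = total_bits / 8     # 8 bits per byte
petabytes = total_bytes / 1e15   # 1 petabyte = 10^15 bytes

print(f"{petabytes:.2f} PB")     # on the order of one petabyte
```

With these assumptions the arithmetic lands at roughly one petabyte, matching the study's "same ballpark as the World Wide Web" framing.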
The new work also answers a longstanding question on how the brain is so energy efficient and could help engineers build computers that are powerful but also conserve energy.
'This is a real bombshell in the field of neuroscience,' says Terry Sejnowski, a Salk Institute professor and co-senior author of the paper, which was published in eLife.
'We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power.
'Our new measurements of the brain's memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web.'
Our memories and thoughts are the result of patterns of electrical and chemical activity in the brain.
A key part of the activity happens when branches of neurons, much like electrical wire, interact at certain junctions, known as synapses.
An output 'wire' (an axon) from one neuron connects to an input 'wire' (a dendrite) of a second neuron.
Signals travel across the synapse as chemicals called neurotransmitters to tell the receiving neuron whether to convey an electrical signal to other neurons.
Each neuron can have thousands of these synapses with thousands of other neurons.
CAN MAGNETS BOOST MEMORY?
Applying magnetic stimulation to the brain can improve memory, researchers from Northwestern University recently found.
They say the discovery could open new avenues for treating strokes, early-stage Alzheimer's and even the normal effects of aging on the brain.
They used a non-invasive technique of delivering electrical current using magnetic pulses, called Transcranial Magnetic Stimulation (TMS).
In the past, TMS has been used in a limited way to temporarily change brain function to improve performance during a test, for example, making someone push a button slightly faster while the brain is being stimulated.
The study showed that TMS can be used to improve memory for events at least 24 hours after the stimulation is given.
Synapses are still a mystery, though their dysfunction can cause a range of neurological diseases.
Larger synapses are stronger, making them more likely to activate their surrounding neurons than medium or small synapses.
The Salk team, while building a 3D reconstruction of rat hippocampus tissue, noticed something unusual.
In some cases, a single axon from one neuron formed two synapses reaching out to a single dendrite of a second neuron.
This suggests that the first neuron was sending a duplicate message to the receiving neuron.
At first, the researchers didn't think much of this duplication, which occurs about 10 per cent of the time in the hippocampus.
But Tom Bartol, a Salk staff scientist, had an idea: if they could measure the difference between two very similar synapses such as these, they might glean insight into synaptic sizes.
'We were amazed to find that the difference in the sizes of the pairs of synapses was very small: on average, only about eight percent. No one thought it would be such a small difference. This was a curveball from nature,' says Bartol.
Because the memory capacity of neurons depends on synapse size, this eight percent difference turned out to be a key number the team could plug into their algorithmic models.
'Our data suggests there are 10 times more discrete sizes of synapses than previously thought,' says Bartol.
In computer terms, 26 sizes of synapses correspond to about 4.7 'bits' of information.
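The conversion from distinguishable sizes to bits is a base-2 logarithm: 26 distinguishable states carry log2(26) bits. A minimal check of the arithmetic:

```python
import math

# Number of distinguishable synapse sizes reported by the Salk team
n_sizes = 26

# Information capacity in bits = log2 of the number of distinguishable states
bits_per_synapse = math.log2(n_sizes)

print(round(bits_per_synapse, 2))  # 4.7
```

This is the same reasoning by which the previous one-to-two-bit estimates correspond to only two to four distinguishable synaptic states.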
Previously, it was thought that synapses in the hippocampus could store just one to two bits for short- and long-term memory.
'This is roughly an order of magnitude of precision more than anyone has ever imagined,' says Sejnowski.
The findings also offer a valuable explanation for the brain's surprising efficiency.
The waking adult brain runs on only about 20 watts of continuous power, as much as a very dim light bulb.
The Salk discovery could help computer scientists build ultraprecise, but energy-efficient, computers, particularly ones that employ 'deep learning' and artificial neural nets—techniques capable of sophisticated learning and analysis, such as speech, object recognition and translation.
'This trick of the brain absolutely points to a way to design better computers,' says Sejnowski.
'Using probabilistic transmission turns out to be just as accurate while requiring much less energy, for both computers and brains.'