
The average human brain can store as much information as the entire World Wide Web, says study


The human brain has a memory capacity ten times greater than previously thought, a new study from the Salk Institute reveals. The US scientists reconstructed brain tissue and discovered that the average human brain can store as much information as the entire World Wide Web.

The researchers studied the storage capacity of synapses, the connections in the brain responsible for storing our memories. They found that each synapse can hold about 4.7 bits of data, meaning the entire brain has a capacity of roughly a petabyte – the equivalent of 62,500 iPhones, according to IBTimes.
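That petabyte figure follows from simple arithmetic once a synapse count is assumed. A rough sketch, assuming on the order of 10^15 synapses (a commonly cited estimate, not a number from the study) and 16 GB as the iPhone comparison unit:

```python
# Back-of-the-envelope brain-storage estimate (illustrative assumptions only).
BITS_PER_SYNAPSE = 4.7        # per-synapse capacity reported by the study
NUM_SYNAPSES = 1e15           # assumed order-of-magnitude synapse count

total_bytes = BITS_PER_SYNAPSE * NUM_SYNAPSES / 8
petabytes = total_bytes / 1e15               # ~0.6 PB, i.e. on the order of a petabyte

IPHONE_BYTES = 16e9                          # assumed 16 GB device
iphones_per_petabyte = 1e15 / IPHONE_BYTES   # = 62,500

print(f"~{petabytes:.2f} PB total; 1 PB holds {iphones_per_petabyte:,.0f} iPhones' worth")
```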


An output ‘wire’ (an axon) from one neuron connects to an input ‘wire’ (a dendrite) of a second neuron. Signals travel across the synapse as chemicals called neurotransmitters to tell the receiving neuron whether to convey an electrical signal to other neurons. Each neuron can have thousands of these synapses with thousands of other neurons.

Synapses are still a mystery, though their dysfunction can cause a range of neurological diseases.

Larger synapses are stronger, making them more likely to activate their surrounding neurons than medium or small synapses.

The Salk team, while building a 3D reconstruction of rat hippocampus tissue, noticed something unusual: in some cases, a single axon from one neuron formed two synapses reaching out to a single dendrite of a second neuron, suggesting that the first neuron was sending a duplicate message to the receiving neuron.

At first, the researchers didn’t think much of this duplication, which occurs about 10 percent of the time in the hippocampus.

But Tom Bartol, a Salk staff scientist, had an idea: if they could measure the difference between two very similar synapses such as these, they might glean insight into synaptic sizes.

“We were amazed to find that the difference in the sizes of the pairs of synapses were very small, on average, only about eight percent different in size. No one thought it would be such a small difference. This was a curveball from nature,” says Bartol.

Because the memory capacity of neurons is dependent upon synapse size, this eight percent difference turned out to be a key number the team could then plug into their algorithmic models.

“Our data suggests there are 10 times more discrete sizes of synapses than previously thought,” says Bartol. In computer terms, 26 sizes of synapses correspond to about 4.7 “bits” of information. Previously, it was thought that the brain was capable of just one to two bits for short and long memory storage in the hippocampus.
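The jump from size categories to bits is just a base-2 logarithm; a minimal illustration of that conversion (the 26 categories and the earlier one-to-two-bit figure come from the article, the code itself is only a sketch):

```python
import math

# Information needed to label one of n distinguishable, roughly equally
# likely synaptic states: log2(n) bits.
num_sizes = 26
print(f"{num_sizes} sizes ≈ {math.log2(num_sizes):.1f} bits per synapse")

# The older estimate of one to two bits corresponds to far fewer states.
print(f"1 bit -> {2**1} states, 2 bits -> {2**2} states")
```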

“This is roughly an order of magnitude of precision more than anyone has ever imagined,” says Terry Sejnowski, Salk professor and co-senior author of the paper.

 

Memory capacity of the brain is 10 times greater than previously thought

In a reconstruction of brain tissue in the hippocampus, Salk scientists found the unusual occurrence of two synapses from the axon of one neuron (translucent black strip) forming onto two spines on the same dendrite of a second neuron (yellow), suggesting that the first neuron was sending a duplicate message.

What makes this precision puzzling is that hippocampal synapses are notoriously unreliable. When a signal travels from one neuron to another, it typically activates that second neuron only 10 to 20 percent of the time.

“We had often wondered how the remarkable precision of the brain can come out of such unreliable synapses,” says Bartol. One answer, it seems, is in the constant adjustment of synapses, averaging out their success and failure rates over time. The team used their new data and a statistical model to find out how many signals it would take a pair of synapses to get to that eight percent difference.
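To get a feel for how averaging over many unreliable transmission events can yield a precise estimate of a synapse’s strength, here is a minimal simulation. The roughly 20 percent success rate comes from the article; the event counts and everything else in the code are illustrative assumptions:

```python
import random

# Each transmission succeeds with a fixed but unknown probability.
# Averaging over many events recovers that probability ever more precisely,
# which is how precise strengths can emerge from unreliable synapses.
TRUE_RELEASE_PROBABILITY = 0.20            # ~10-20 percent, per the article

def estimated_strength(num_events: int, p: float = TRUE_RELEASE_PROBABILITY) -> float:
    """Fraction of successful transmissions observed over num_events trials."""
    successes = sum(random.random() < p for _ in range(num_events))
    return successes / num_events

random.seed(0)
for n in (10, 100, 1500):                  # 1,500 echoes the smallest-synapse figure
    print(f"{n:5d} events -> estimated strength {estimated_strength(n):.3f}")
```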

The researchers calculated that for the smallest synapses, about 1,500 signaling events (roughly 20 minutes’ worth) are needed to cause a change in their size and strength, while for the largest synapses only a couple of hundred signaling events (1 to 2 minutes) cause a change.

“This means that every 2 or 20 minutes, your synapses are going up or down to the next size. The synapses are adjusting themselves according to the signals they receive,” says Bartol.
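Put another way, the two figures imply comparable signaling rates but very different adjustment intervals; a quick check of the arithmetic (the event counts and time windows are from the article, and 200 stands in for “a couple hundred”):

```python
# Signaling rates implied by the reported adjustment figures (rough arithmetic).
smallest_rate = 1500 / 20      # ~75 events per minute, one change per ~20 minutes
largest_rate_low = 200 / 2     # ~100 events per minute if the change takes 2 minutes
largest_rate_high = 200 / 1    # ~200 events per minute if it takes 1 minute

print(f"smallest synapses: ~{smallest_rate:.0f} events/min, change every ~20 min")
print(f"largest synapses:  ~{largest_rate_low:.0f}-{largest_rate_high:.0f} events/min, change every 1-2 min")
```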

“Our prior work had hinted at the possibility that spines and axons that synapse together would be similar in size, but the reality of the precision is truly remarkable and lays the foundation for whole new ways to think about brains and computers,” says co-senior author Kristen Harris of The University of Texas at Austin. “The work resulting from this collaboration has opened a new chapter in the search for learning and memory mechanisms.” Harris adds that the findings suggest more questions to explore, for example, whether similar rules apply for synapses in other regions of the brain and how those rules differ during development and as synapses change during the initial stages of learning.

“The implications of what we found are far-reaching,” adds Sejnowski. “Hidden under the apparent chaos and messiness of the brain is an underlying precision to the size and shapes of synapses that was hidden from us.”

The findings also offer a valuable explanation for the brain’s surprising efficiency. The waking adult brain generates only about 20 watts of continuous power – as much as a very dim light bulb. The Salk discovery could help computer scientists build ultraprecise but energy-efficient computers, particularly ones that employ “deep learning” and artificial neural nets – techniques capable of sophisticated learning and analysis, such as speech, object recognition and translation.

“This trick of the brain absolutely points to a way to design better computers,” says Sejnowski. “Using probabilistic transmission turns out to be as accurate and require much less energy for both computers and brains.”
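In artificial neural nets, probabilistic transmission loosely resembles dropout-style stochastic connections: each weight participates only some of the time, yet the averaged behaviour matches the deterministic network. A minimal sketch of that idea (purely illustrative; the layer sizes and the 20 percent transmission probability are assumptions, not the architecture from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy layer with "unreliable synapses": each weight transmits with
# probability p on any single pass, loosely analogous to ~20% release.
weights = rng.normal(size=(4, 3))   # 4 inputs -> 3 outputs (arbitrary toy sizes)
x = rng.normal(size=4)              # a toy input vector
p = 0.2                             # assumed transmission probability

def stochastic_forward(x, weights, p):
    """One pass in which each synapse transmits only with probability p."""
    mask = rng.random(weights.shape) < p
    return x @ (weights * mask) / p     # scaling by 1/p keeps the expectation equal

# Averaging many unreliable passes approaches the deterministic output.
deterministic = x @ weights
average = np.mean([stochastic_forward(x, weights, p) for _ in range(5000)], axis=0)
print("deterministic :", np.round(deterministic, 3))
print("stochastic avg:", np.round(average, 3))
```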
