
What is the theoretical storage capacity of the human brain?

  • Jun 27, 2016
  • 16 min read

Theoretical Biology • Human Brain • Neuroscience • Science

Experts estimate that if we compare our brain to a computer, we can store anywhere from a couple of terabytes to approximately 2.5 petabytes (1 petabyte = 1,000 terabytes). However, this comparison is limited by the fact that the way the human brain creates and stores memories is nothing like a computer. The brain has about a billion neurons, and each neuron is connected to about a thousand other neurons, which makes about a trillion connections. That is a rough estimate of the hard wiring. The way the neural networks combine allows a single neuron to be involved in many memories, which takes the estimate along an exponential curve up into the trillions. A healthy brain need never worry about running out of memory. 2.5 petabytes is roughly 3 million hours of video. [Scientific American, May 1, 2010, Paul Reber, "What Is the Memory Capacity of the Human Brain?"]

The 2.5 petabytes figure is quoted from the Scientific American article listed above. The article says that if there are 100 billion neurons, each forming about 1,000 connections with other neurons, and each neuron could store only one memory, running out of space might be an issue. However, it then allows that neurons combine for memory, running the capacity up exponentially. But it doesn't say how it arrived at 2.5 petabytes of storage. I don't think it is possible to realistically calculate this.

Here's a story. I have been involved in cortical mapping for epilepsy surgery since 1999. I have participated as a tech in hundreds of these mappings, which determine whether the brain tissue the surgeons want to remove, the tissue that is causing the patient's seizures, is free of speech, language and motor function. They do this by having a neuropsychologist test the patient's speech and language while, at the same time, the epileptologist directly stimulates the patient's brain with electrical current via implanted electrodes.
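The arithmetic behind the figures quoted above can be sketched in a few lines. The one-byte-per-connection assumption and the video bitrate are my own illustrative choices, not numbers from the Scientific American article:

```python
# Back-of-envelope version of the figures quoted above. The one-byte-per-
# connection and ~0.83 GB-per-hour video rates are illustrative assumptions,
# not from the Scientific American article.

neurons = 100e9            # ~100 billion neurons, as quoted from the article
connections_each = 1_000   # ~1,000 connections per neuron

total_connections = neurons * connections_each   # 1e14, i.e. 100 trillion
print(f"{total_connections:.0e} connections")

# If every connection stored just one byte, that alone would be ~100 TB:
terabytes = total_connections / 1e12
print(terabytes)           # 100.0

# 2.5 PB expressed as hours of video at ~0.83 GB per hour (assumed rate):
petabytes = 2.5
hours = petabytes * 1e6 / 0.83    # 1 PB = 1,000,000 GB
print(round(hours / 1e6, 1))      # ~3.0 million hours
```

Notice that getting from ~100 terabytes of raw connections to 2.5 petabytes requires each connection to carry far more than a byte, which is exactly the step the article never explains.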
Stimulating a patient's brain tissue in this manner renders the area just under the tip of the stimulator (a dot in an 8 x 8 grid of 64 platinum dots) useless during the period of stimulation. Thus, if the stimulation affects the area being tested, the patient might stop talking, get confused, fail to recognize pictures, or feel a tingle in their face, tongue, arm or leg.

During one particular stimulation I remember the patient, a woman about 40 years of age, stopped speaking and said "oh." The physician asked her what she felt, and she said it was the memory of a feeling, but she couldn't remember what it was. Testing continued, and a while later they came back to the same spot, a dot about the size of the "O" I just typed, and increased the current. The patient smiled and said that the same memory came, a comforting feeling, and that it was about a song her mother had sung to her when she was a child of 8 years old. They increased the current once again, and this time she could remember the name of the song. All of us were very excited about this because it was something that Penfield, one of the neurosurgical pioneers of neurophysiology, had described in the early 1950s; however, we were not in an area usually associated with memory. So this was unusual and very interesting.

The testing continued until we were about 4 dots away (millions of neurons away, actually) from the dot that evoked the memory of the song. The stimulation here once again produced the same memory, only different in some way, the patient said. The current was increased, and now the woman remembered the lyrics of the song. So this particular memory had connections involving neurons at least millions of neurons apart, and included emotions and specific details that the patient had no memory of without this cortical stimulation.
Because it involved emotions, we know that many parts of her limbic system were also probably stimulated via an incredibly large network, in a thick layer of networks, trillions of trillions of synapses in a web too thick to contemplate. This is why I don't see how anyone can come up with a number when it comes to the memory capacity of the brain.

But if you are into the math, check this out: the possible number of unique combinations of inputs for a single neuron with just 100 incoming dendrites can be computed as 100 x 99 x 98 x 97 x ... x 2 x 1 possibilities, that is, 100 factorial. That represents more than 9 x 10^157 unique possible combinations! Multiply that number by 100 and divide by 8 to measure the number of bytes of possible memory. A single nerve cell with 100 dendrites can potentially remember that many bytes of singular combinations. Some nerve cells have up to 250,000 dendrites! Only the possible existence of such codes can explain the phenomenal capacity of human memory. [Source: Human Memory Capacity]

We don't really know what the size of a memory looks like. Then there is consciousness, which is not something science can quantify yet. For all we know, memories can exist in consciousness externally to neural tissue. Here is where the scientists are going to stop reading my answer. I don't blame them, as there is no hard evidence that memories exist in consciousness apart from neural tissue. In fact, the hard evidence we have limits memory storage to neural tissue. But I take license in my answers to go beyond science, because I can. It limits my upvotes and can attract some well-deserved criticism from scientists, but I have to go with what resonates with my own vision, intuition and experience.
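For the curious, the factorial figure above is easy to check. This just reproduces the arithmetic as the cited source describes it, including its multiply-by-100-divide-by-8 step:

```python
import math

# 100 x 99 x ... x 1 is 100 factorial: a 158-digit number, about 9.33e157,
# which matches the scale of the giant number quoted above.
combos = math.factorial(100)
print(len(str(combos)))          # 158 digits

# The source's conversion to "bytes": multiply by 100, divide by 8.
claimed_bytes = combos * 100 // 8
print(claimed_bytes > 10**158)   # True: still an astronomically large count
```

Whether counting orderings of dendritic inputs really corresponds to storable bytes is, of course, the source's own leap, not established neuroscience.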
For various reasons, my own brain has resisted conforming to our culture's consensual version of reality. Perhaps that is why I've experienced some things that science doesn't explain (unless you subscribe to the holographic-universe theory put forth by David Bohm and others). I have worked in the field of neuroscience my entire life and have found it to be a marvelous tool, but one with limitations. I predict that soon science will expand to explain more things, or we will discover another tool that goes beyond science. These are my own humble opinions.

Updated Aug 12, 2015 • Answer requested by Wykas


Clayton Bingham, Researcher in Center for Neural Engineering at University of Southern California


Originally Answered: How much information in terms of memories/observation/learning worth terabytes can our brain hold?

While it is undeniable that our brains can hold a ton of information, guesses range anywhere from 1 TB to 2.5 PB. This may be as good as the guesses will get, mostly because brains and computers don't store information in the same way. Brains store information through a complex physiological response to chemical stimuli at the synapse. Bitwise computer memory and internodal synaptic "memory" are fundamentally different because synapses don't have a "state" or a "switch" they can use to define a bit of information (binary encoding); instead, they encode information as a synaptic weight, or connective strength, that can change as the network learns more about the stimuli. While neural networks can perform extremely complex computation, this is only possible because of scale and plasticity. A neuron on its own is more of a filter than a traditional computing unit: it transforms input in a controlled fashion into a novel output.
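The weight-versus-bit distinction above can be made concrete with a toy unit. Everything here (the threshold, the learning rate, the Hebbian-style update) is an illustrative sketch of the general idea, not a model of real synapses:

```python
# Toy illustration of the synaptic-weight idea: a single unit whose
# "memory" lives in continuous weights adjusted by learning, not in
# on/off bits. All parameters here are illustrative assumptions.

def neuron(inputs, weights, threshold=1.0):
    """Weighted-sum-and-threshold: the neuron acts as a filter on its inputs."""
    drive = sum(i * w for i, w in zip(inputs, weights))
    return 1 if drive >= threshold else 0

def hebbian_update(inputs, weights, rate=0.1):
    """Strengthen the weights of active inputs (crude Hebbian plasticity)."""
    return [w + rate * i for i, w in zip(inputs, weights)]

weights = [0.2, 0.2, 0.2]
stimulus = [1, 0, 1]
print(neuron(stimulus, weights))   # 0: drive 0.4 is below threshold

for _ in range(10):                # repeated exposure strengthens the synapses
    weights = hebbian_update(stimulus, weights)

print(neuron(stimulus, weights))   # 1: the unit now responds to the pattern
```

The "memory" here is the graded weight values, which have no natural bit count; that is exactly why converting synapses to bytes is guesswork.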

Robert Russell, Wrote a paper on Relativity and QFT for my MBA. And the Punic Wars, too.


Originally Answered: What is the memory size of the human brain?

If our brain's storage capacity were measured in megabytes, gigabytes, etc., how big would it be? It's relatively easy to count the storage capacity of an inert, rigid, inflexible and largely unchanging box of switches, gates and magnetisable platters. It's almost trivial. And you get the same result every time, unless you degrade or erode some material, crash a read/write head into it, or upgrade it. And because the box itself can be programmed to do things for us, extend us, and present information in ways we understand and can often even relate to, we mistake it for a "model" of ourselves. But even the most sophisticated self-programming, semi-autonomic computer system remains just a primitive tool. It may as well be a hammer or a nail.

There are many "storage" estimates out there, but it's largely educated guesswork based on estimates and assumptions about the average number of neurones, spines and receptors we think we can see, and what we currently imagine to be their "purpose". It's a big number of potential "storage elements", if that's what they are. But are they?

Indeed, our neurones extend outside of the brain. They are highly differentiated and specialised. They are location and orientation specific. They respond not just to a stimulus per se but to the type, intensity and timing of the stimulus. And to varying densities, and gradients. And to upstream and downstream products. And a lot more besides. As a biologically adaptive neural system that has evolved over millions of years, that constantly changes as required and renews and reconfigures itself in ways we don't fully comprehend, it is not easily compared with a fixed, hard set of silicon chips. Probably, it's not comparable. So I'll decline to guess.

Can it be filled? Potentially, any physical system has a limit. You can only pack so much material into our bulging brain case. But we are not even close to really answering this question. It's so organic, so adaptive.
On one level it appears to "fade" unneeded or unreviewed memories. It seems to drop or compress data. It even "cheats" by reusing elements. You could conclude that this all "saves space", but alternatively it may not be capacity but access speed, or relevance of retrieved information, that is the real driver here. Indeed, the brain appears to nest new information indefinitely, using a form of tagging and compression that we don't completely understand. And sometimes it "recreates" a seeming facsimile of stored information based not on a complete hard memory but on a fragmentary set of recollections linked to standard shorthand or substitutable memories. Sometimes various regions of the brain even conspire to recalculate a "memory" based on believed time and location data. How can you estimate the "capacity" of such a creative memory system?

It's possible (based on a few revealing cases) that some sort of fairly complete "hard copy" is distributed fragmentally in many locations across the brain, but the tagging-and-retrieval system down-plays the less-used memories and promotes the quickest, most seemingly useful answer over the "real" or "complete" one. Even if it's not completely accurate.

How would our brain "evolve" over a 200-year life expectancy? Evolve, or simply age better? I'm not sure that the brain would necessarily evolve simply because we live longer. What selective pressure would be applied over 200 years, especially if most of us reproduce much earlier in our lives? If our brains become less creatively functional over time yet we maintain our bodies by whatever means, what would that mean for brain function? As an adaptive system the brain would continue to find ways to be useful, so long as it maintained its own fitness and functionality. Perhaps maintaining general bodily health over such a long lifetime would be more critical, and avoiding cerebrovascular accidents and the like, of course.

Lourdes Trammell, Cognitive Science, Thought and Language, Brain Studies


Originally Answered: What is the memory size of the human brain?

According to Forrest Wickman: "The human brain contains roughly 100 billion neurons" http://io9.com/5890414/the-4-big... "Each of these neurons seems capable of making around 1,000 connections, representing about 1,000 potential synapses, which largely do the work of data storage. Multiply each of these 100 billion neurons by the approximately 1,000 connections it can make, and you get 100 trillion data points, or about 100 terabytes of information." The brain has a way of compiling data and creating resolvable blocks of information, so chances are it will never be completely full. If we were living lifespans of 200 years, evolution would slow its rate: diversity and adaptation across generations are required for phenotypes and genotypes to change.

Bill Skaggs, Ph.D. in neuroscience


Originally Answered: What is the memory size of the human brain?

We don't understand the brain well enough to give a definite answer. Most neuroscientists believe that memory is stored in brain elements called dendritic spines. It is relatively easy to estimate the number of spines in the human brain: on the order of 100 trillion (there are around 10 billion pyramidal cells in the cerebral cortex, each having around 10 thousand spines). Thus we can estimate that the memory capacity of the human brain is about 100 Teraspines. Converting that to bytes is not so easy, though. If we assume that each spine is continuously adjustable and can store on the order of 1 byte of information, then we get a total capacity on the order of 100 Terabytes. That's the top end of the plausible range. My personal belief, though, is that the usable capacity is far lower. I believe that the level of noise in the brain requires each bit of information to be stored redundantly across multiple spines, perhaps as many as 100 or more. If we go with 100, we get a capacity of 100 Gigabytes. That's probably the bottom end of the plausible range. Bottom line: the plausible range is from 100 Gigabytes to 100 Terabytes. My personal view is that the lower values are more likely to be correct.
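Both ends of the range in this answer can be reproduced with one extra assumption: reading the low end as roughly one bit per spine stored with 100-fold redundancy. That reading is mine, not spelled out in the answer:

```python
# Sketch of the spine-count estimate described above. The byte-per-spine
# and bit-per-spine-with-redundancy readings are assumptions used to
# reproduce the two ends of the stated range.

pyramidal_cells = 10e9      # ~10 billion pyramidal cells in cortex
spines_per_cell = 10_000    # ~10 thousand spines each

spines = pyramidal_cells * spines_per_cell   # 1e14: the "100 Teraspines"
print(f"{spines:.0e} spines")

# Top end: each spine continuously adjustable, worth ~1 byte -> ~100 TB.
top_end_tb = spines / 1e12
print(top_end_tb)                            # 100.0

# Low end: ~1 bit per spine, stored redundantly across ~100 spines.
bottom_end_gb = spines / 8 / 100 / 1e9
print(bottom_end_gb)                         # 125.0, i.e. ~100 GB
```

The two readings (byte-per-spine at the top, redundant bit-per-spine at the bottom) bracket the answer's stated 100 GB to 100 TB range.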

Originally Answered: How much information can the human brain hold? What is its limit?

You may be assuming that the brain absorbs all sensory input, and many scientists claim that only 1/5th of brain capacity is used. NEITHER IS TRUE.

Neurons connect with each other chemically, with many hundreds of receptors on their surface that deliver a composite electrical signal. The makeup of these receptors (ion channels along the axon) is not fixed but changes, becoming stronger or weaker.

Using voltammetry, scientists can measure glucose fluctuations in real time, giving a window into how memories are retained in the temporal lobes and, within them, the hippocampus.

Brain plasticity is one area where damaged neurons that have signal pathways to body areas are remapped to other neurons to regain motor control, often as the result of brain trauma or strokes. Not all brain areas have this remapping ability, due to circuit specialization. However, in the pons of the brain (the circuit switchboard), the brain can rewire itself to some degree.

Healthline link

"The pons is a portion of the brain stem, located above the medulla oblongata and below the midbrain. Although it is small, at approximately 2.5 centimeters long, it serves several important functions. It is a bridge between various parts of the nervous system, including the cerebellum and cerebrum, which are both parts of the brain.

There are many important nerves that originate in the pons. The trigeminal nerve is responsible for feeling in the face. It also controls the muscles that are responsible for biting, chewing, and swallowing. The abducens nerve allows the eyes to look from side to side. The facial nerve controls facial expressions, and the vestibulocochlear nerve allows sound to move from the ear to the brain. All of these nerves start within the pons."

Pons Anatomy, Function & Diagram

Originally Answered: How much memory can a brain hold?

I think that memory capacity is vastly increased over time by two things:

The first is that memory is a somewhat "lossy" format. We don't often remember as a video tape would; we remember more like an outline, and fill in the blanks based on our understanding.

It would take too long for me to go through the details, but this accounts for the increasingly obvious unreliability of eye-witness accounts in court. We remember a few points and recreate the rest of the memory based on what we believe we saw. Much more efficient, and good enough for most uses.

But there is another, related aspect to memory which serves to compact the data, and make it more available to quickly and accurately fill in those blanks. I think it works for all types of memories, but I will give an example based on learning one's way around a new city.

When I first came to Houston, I learned how to get to the grocery store: turn right, 3 miles, turn left, past the bakery, right at the second light and straight ahead on the left.

I learned a set of specific directions like that. I think remembering these sorts of lists is very memory intensive. It is also "me" centered, like a "first-person" video game, and very purpose directed- getting to the store.

As things progressed, I began to learn a different sort of direction. Not centered on me, or my task, but centered on the city itself. I began to develop a mental map.

As my map improved in detail and accuracy, I no longer needed any list of instructions to go anywhere; I could just consult the map in my head and go. The lists could be thrown away. Better still, I only need one map. All-purpose. But I did have to stop learning based on me or my task.

I think this is the way we organize vast quantities and types of data as we gain experience; it allows us to compact the data from all the mental lists into a much more useful system, into which new data can be incorporated with relative ease (draw mental maps in pencil!).
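The list-versus-map contrast above can be sketched as two data structures: a per-route list of turns versus a single reusable graph of the city, from which any route can be derived on demand. All the place names here are invented for illustration:

```python
from collections import deque

# A "me-centered" memory: one brittle, purpose-specific list per destination.
route_to_store = ["right", "3 miles", "left", "past bakery",
                  "right at 2nd light"]

# A "city-centered" memory: one graph serves every destination.
city_map = {
    "home":    ["main st"],
    "main st": ["home", "bakery", "store"],
    "bakery":  ["main st", "store"],
    "store":   ["main st", "bakery"],
}

def find_route(graph, start, goal):
    """Breadth-first search: derive any route on demand from the one map."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])

print(find_route(city_map, "home", "store"))  # ['home', 'main st', 'store']
```

One graph replaces every list, and adding a new place means adding one node, not memorizing a whole new set of directions; that is the compaction the answer describes.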

Just one bit of evidence: I note in myself, and in others, a tendency in youth to evaluate issues (moral, ethical, political, etc.) according to "lists" of categories or standards. Now I am much more likely to judge in overall context, by how an issue interacts with the overall map.

But I think these two aspects of memory allow for increasing effective density with experience, and an almost limitless ultimate capacity.

Originally Answered: Does the human brain have a limited amount of memory?

It must of course be limited, but there's no evidence that we reach the limit in any standard lifetime. Sherlock Holmes famously thought that the brain was like an attic filled with junk, where at some point, if you put junk in, you must take other junk out. So if you learn a new fact, you must forget an old one. Holmes therefore tried to remember only things that were useful to his work. Watson told him the Earth goes around the Sun, and Holmes said not only hadn't he known that, but now he would try to forget it. ;) That's not true. "Forgetting" goes along with "learning" in most people, but that creates an "association error" whereby people become convinced that one action causes the other (as Holmes did). But we now know of people with "perfect autobiographical memory" who remember every day of their lives (back to their teen years or earlier) with the same clarity that you and I remember what we did this morning (but no better). That suggests that our gradual forgetting of those details over days and weeks is not necessary. We do it for some reason other than space-saving.

Much of that may depend less on the structure of the brain and more on how information is measured, which is not a simple question to answer. To make a crude estimate, suppose we measure information in bits. If the brain has 100 billion neurons and we pretend that each one is as simple as a bit (a ridiculous assumption), then the brain has at least 12 gigabytes of storage capacity. Of course, that is a terribly simplistic approach. Neurons are not on/off switches like bits: they have connections to other neurons, which dramatically complicates the picture. I've done some further calculations on this based on the number of possible configurations of connections in the brain, my theory of measurement being that the amount of information something can hold equates to the number of possible states it can be in. With this premise and the inverse power law in hand, I arrive at

gigabytes = log(∏_{c=1}^{m} (2^c − 1)^{n_c}) / (8 × 10^9 × log 2)

(where the n_c count neurons by their number of connections c, and m is the maximum number of connections for a single neuron), which is apparently too large to be meaningful when the neuron count is 100 billion. I tried to scale the problem down a bit by reducing the brain to 100 neurons in which the most connections any one neuron has is 10. Such a "reduced" brain has a storage capacity of 7.17 × 10^103 gigabytes.
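The crude one-bit-per-neuron figure checks out directly. For the reduced brain, the sketch below uses my own simplifying assumption that every one of the 100 neurons can take any nonempty pattern over its 10 connection slots; this is a reading for illustration, not the answer's exact formula:

```python
import math

# One-bit-per-neuron estimate from the text above:
neurons = 100e9
gigabytes = neurons / 8 / 1e9
print(gigabytes)                 # 12.5 -> "at least 12 gigabytes"

# Reduced brain: 100 neurons, at most 10 connections each. Assume each
# neuron can be in any of (2^10 - 1) nonempty connection patterns
# (my simplification, not the answer's exact formula):
states = (2**10 - 1) ** 100      # number of whole-brain configurations
info_bits = math.log2(states)    # information content of one configuration
print(round(info_bits))          # ~1000 bits for the reduced brain
```

Note that `math.log2` accepts arbitrarily large Python integers, so the enormous `states` value never needs to fit in a float.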

There is a limit to everything, and the brain has its maximum capacity too. It is estimated at 2.5 petabytes, which is a huge amount. The estimate is based on the number of neurons we have and the number of connections each of these neurons can possess. From the article: "Imagine your brain as a computer hard drive. How much memory would your 'system' have? A few scientists have puzzled over this question in recent years, with widely varying results. Syracuse University Professor Robert Birge estimated in 1996 that the human brain had a storage capacity of between 1 and 10 terabytes, with a likely value of 3 TB (for comparison, high-end iMacs today come with 1 terabyte of storage). However, Scientific American posed the same question to a psychologist in 2010, who estimated that the brain's memory storage capacity is 2.5 petabytes (a petabyte is 1 million gigabytes, or 1,000 terabytes)."

You’ll get a lot of strange answers, in part because you have confused two things that are nothing alike. The first is storage in machines. Since we have, for some time now, fallen under the deadly spell of comparing human brains or minds to computers, people apparently think that the two are functionally the same. They are not even the same order of phenomenon. I shall be blunt: human ‘brains’ (which are actually only partly local and extend throughout the entire body, including the nervous system, and are profoundly influenced by metabolic conditions and even the relation with the commensal microbiome) have about as much in common with computers as an ocean wave has with a cartoon of one. They do not exist at the same order of reality at all. Machines are abiorelational. Organisms are biorelational hyperstructures with shockingly nonlocal extent. So firstly, the idea that a brain ‘is like a computer’ is a simile. The idea that the brain is a computer is a linguistic delusion.

Now that we have that out of the way, let us get further clarity. What you are speaking of is not memory. For human beings, memory is biorelationally intelligent and impossibly multivalent. There are probably thousands or billions of forms of memory, and when we speak of memory as it relates to consciousness this remains true, even though we would there select a relatively tiny subset of memory as it may exist or be accessed in consciousness. But memory itself is not storage, nothing like it. It is an array of intelligence-like faculties, and there is no specific number of their species. Effectively, these species alone are infinite, because they will be uniquely instanced and expressed in every organism, human person and situation. Memory is a constellation of intelligence-like faculties, and they produce synthetic unities from vast arrays of possible forms, figures and relationships.
As for storage, it is also going to be infinite; it is far more than possible that it might be, it fairly has to be. Because, again, there are thousands or more forms of memory and endless ways of constellating their results. More still, the entire process is changing and developing (or regressing) in every human mind, moment, culture and context. So we are ceaselessly inventing or reforging various of the underlying assets and potentials of storage.

Mechanical estimates might be interesting for analytic sorts, but I must suggest they are fantasies that flatten the entire topic into a monodimensional, single factor and then attempt to give a number. This is impossible, and it is nothing like what organisms are or become. Ever. Unless you isolate and dissect them, and then count the parts. But organisms, unlike machines, have no parts. Organisms are complex unities within relational unities of unimaginable sophistication. They are not like computers, and whatever ‘storage’ may mean in this context is something we will need to discover rather than declare by comparison to our machines, whose powers and features we vastly over-estimate in comparison to biology. I suspect that to accurately simulate all functions of a complex bacterial cell within a living body for 10 minutes would require more computing power than our species has ever produced. And it might still fail. Do not make the mistake of reifying organisms in the image of mechanisms; it’s a schizogenic gambit, and a naive error of collapsing a comparison into equivalency.


 
 
 



A BLOG BY AJ

© 2016 BY KING HUNTER AJ PRODUCTIONS
