Some fun numbers about the human brain

Since it is fashionable these days to compare the performance of connectionist models with humans (even though these models, often referred to as deep learning, only stand a chance of competing with humans in extremely narrow contests), a popular belief has emerged that these models, powered by modern GPUs, somehow approach the computational power of the human brain.

Now the latter is really not well defined, since we don't even know how brains work, and therefore it is extremely hard to decide at which level of abstraction to assign the fundamental computation. But we can still play with some numbers just to get a vague idea of where we are.

So let us start with neurons: the average human brain has roughly 80 billion neurons. The popular belief is that neurons are responsible for the function of the brain, but there are plenty of other cells there, called glia, whose function is not yet understood. So it is very likely that orders of magnitude more cells somehow contribute to the computation, but for now let us stick to the "official" 80B figure.

Each of these neurons is an extremely complex cell, with a membrane, the electrochemical dynamics of action potentials, and so on. In artificial neural networks all that complexity is reduced to a dot product of an input vector with a weight vector, passed through a nonlinear activation. This is an enormous simplification, but since we don't really know how biological neurons work, while our "perceptron units" appear to be doing something useful, let us give ourselves the benefit of the doubt. That said, I don't think there is an instance of an artificial neural network anywhere in the world right now that has 80B unique artificial neurons...
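To make the simplification concrete, here is what an entire "artificial neuron" amounts to in code (a minimal sketch using numpy; the particular input, weight, and bias values are arbitrary illustrations):

```python
import numpy as np

def artificial_neuron(x, w, b):
    # The whole "neuron": a dot product plus a bias, squashed by a nonlinearity.
    return np.tanh(np.dot(x, w) + b)

# Arbitrary example values, just to show the shape of the computation.
x = np.array([0.5, -1.2, 3.0])   # inputs (roughly: presynaptic activity)
w = np.array([0.1, 0.4, -0.2])   # weights (roughly: synaptic strengths)
print(artificial_neuron(x, w, b=0.1))
```

That single line of arithmetic stands in for all the membrane and spiking dynamics of a biological cell.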

It is somewhat easier to see this discrepancy in the synapse figure. According to estimates (which again may vary) there are 1.5*10^14 synapses in the human brain. I'd take this number with a grain of salt, because it is actually very difficult to estimate, but again let's stick to this official number. What about contemporary artificial neural nets? E.g. the number of parameters in the VGG16 net is estimated at 134 million. So if we assume that one biological synapse is equivalent to one weight parameter in a multilayer perceptron, we roughly get that a human brain is equivalent to ~1 million VGG16 nets running in real time.

Now, as it has been pointed out to me on Twitter, this calculation may be a bit confusing: if we were to count each convolution separately (note that the convolutional filters are reused over the entire window), we would come up with a much larger number of parameters; according to my quick calculation, for VGG16 that would be a bit less than 16 billion, so roughly 115 times more. In that case the brain would only be equivalent to roughly 10,000 VGG16's - still a whopping 4 orders of magnitude. But if we were to let those parameters loose in VGG16, or any other deep net for that matter, backprop would have absolutely no chance of training them (vanishing gradients). So I'm hesitant to treat shared weights/individual convolutions as first-class citizens.
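These back-of-the-envelope ratios are easy to reproduce (a sketch; the figures are the estimates quoted above, not measurements):

```python
synapses = 1.5e14        # estimated synapses in the human brain
vgg16_params = 134e6     # VGG16 parameters, shared weights counted once
vgg16_convs = 16e9       # rough count if every convolution is tallied separately

print(synapses / vgg16_params)  # on the order of a million VGG16 nets
print(synapses / vgg16_convs)   # on the order of ten thousand VGG16 nets
```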

If we assume that every synapse in the human brain could be represented with a single byte, we'd need 136TB to store all that memory. If we were to load that onto GPU memory, we'd need nearly 13 thousand GTX 1080 Ti boards just to store the synaptic weights.
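The memory arithmetic goes like this (a sketch; 11GB is the memory of a single GTX 1080 Ti):

```python
synaptic_bytes = 1.5e14             # one byte per synapse
terabytes = synaptic_bytes / 2**40  # ~136 TB
board_mem = 11 * 2**30              # 11 GB per GTX 1080 Ti
boards = synaptic_bytes / board_mem

print(terabytes)  # ~136
print(boards)     # ~12.7 thousand boards
```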

On the one hand, one byte may be too much, since biological synapses tend to be very noisy (and they either pass a signal or they don't). On the other hand, each synapse undergoes a very complex process of plasticity, both short- and long-term, modulated by neuromodulators and whatnot, all of which may equally well take megabytes to encode. We don't really know, but it should be clear from this post that we are very likely way off.

And all of that is aside from the fact that our neural net models are probably not even doing the right thing... And above everything, even if the human brain could be successfully simulated using some 13k GTX 1080 Ti's, they would draw more than 3MW of power, while the biological implementation uses roughly 20W... some 5 orders of magnitude less...
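The power comparison works out as follows (a sketch; the 250W per-board draw is my assumption for a GTX 1080 Ti under load, not a figure from the estimates above):

```python
import math

boards = 13_000
watts_per_board = 250                  # assumed draw of one GTX 1080 Ti under load
gpu_power = boards * watts_per_board   # ~3.25 MW for the whole rig
brain_power = 20                       # watts for the biological implementation

print(gpu_power / 1e6)                      # megawatts
print(math.log10(gpu_power / brain_power))  # ~5 orders of magnitude
```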

Anyway, I'd strongly encourage everyone to consider these numbers. Even if they are likely very imprecise, they still offer some interesting perspective.

 

Comments


  1. Looking back in time, computers have (roughly) scaled 5 orders of magnitude in, say, 60 years. While performance itself may not continue at that pace, it is possible that compute/watt may continue to improve along an exponential curve for several more decades.

    The point here is that some people alive today may see rough parity, if it's possible to get into molecular level computing.

    I'm not sure you'd disagree with this future possibility, as your point in this post is that we are clearly not close today, which is quite true of course.

    1. Certainly such possibilities are not out of the question. Generally I do believe that the function of the brain will eventually be approximated by what we would likely be inclined to call a computer, but I'm not sure it would be anything we would call a computer today.

      That said, my point here is that contrary to some PR statements, we are not anywhere close. And I do think that the estimates I have given here are actually very, very optimistic. More likely we will find additional levels of complexity in the brain that will set us back some 2-3 orders of magnitude with respect to the calculations above. So even if Moore's law persists - which is _highly_ questionable at this point - we may still be many decades away. Above everything, I don't believe anything we do right now with deep learning is even moving in the right direction, but that is a separate story.
