Some fun numbers about the human brain

Since it is fashionable these days to compare the performance of connectionist models with that of humans (even though these models, often referred to as deep learning, only stand a chance of competing with humans in extremely narrow contests), a popular belief has emerged that these models, powered by modern GPUs, somehow approach the computational power of the human brain.

Now the latter is not really well defined, since we don't even know how brains work, and it is therefore extremely hard to decide at which level of abstraction the fundamental computation takes place. But we can still play with some numbers, just to get a vague idea of where we stand.

So let us start with neurons: the average human brain has roughly 80 billion neurons. The popular belief is that neurons are responsible for the function of the brain, but there are plenty of other cells there, called glia, whose function is not yet well understood. So it may well be that many more cells than that somehow realize the computational function, but for now let us stick to the "official" 80B figure.

Each of these neurons is an extremely complex cell, with a membrane, the electrochemical dynamics of action potentials, and so on. In artificial neural networks all that complexity is typically reduced to a dot product of an input vector with a weight vector, passed through a nonlinear activation. This is an enormous simplification, but since we don't really know how biological neurons work, while our "perceptron units" appear to be doing something useful, let us give ourselves the benefit of the doubt. That said, I don't think there is an instance of an artificial neural network anywhere in the world right now that has 80B unique artificial neurons...
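Just to make the simplification concrete, here is a minimal sketch of such a "perceptron unit" in Python/numpy; the function name, the input size and the ReLU activation are arbitrary choices for illustration:

```python
import numpy as np

def perceptron_unit(x, w, b):
    # The entire "neuron": a dot product of inputs and weights,
    # plus a bias, passed through a nonlinearity (ReLU here).
    return np.maximum(0.0, np.dot(x, w) + b)

rng = np.random.default_rng(0)
x = rng.normal(size=100)  # 100 inputs, standing in for synapses
w = rng.normal(size=100)  # one weight per input
print(perceptron_unit(x, w, b=0.0))
```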

It is somewhat easier to see this discrepancy in the synapse figure. According to estimates (which again may vary) there are 1.5×10^14 synapses in the human brain. I'd take this number with a grain of salt because it is actually very difficult to estimate, but again let's stick to this official number. What about contemporary artificial neural nets? E.g. the number of parameters in the VGG16 net is estimated at 134 million. So if we assume that one biological synapse is equivalent to one weight parameter in a multilayer perceptron, we get, roughly, that a human brain is equivalent to ~1 million VGG16 nets running in real time.

Now, as it has been pointed out to me on Twitter, this calculation may be a bit confusing: if we were to count each convolution separately (note that the convolutional filters are reused over the entire input), we would come up with a much larger number of parameters; according to my quick calculation, for VGG16 that would be a bit less than 16 billion, roughly 115 times more. In that case the brain would only be equivalent to roughly 10,000 VGG16s, still a whopping 4 orders of magnitude. But if we were to set all those parameters free in VGG16 (or any other deep net for that matter), backprop would have absolutely no chance of training them (vanishing gradients). So I'm hesitant to treat shared weights/individual convolutions as first-class citizens.
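For those who want to check my arithmetic, here is the back-of-the-envelope calculation; the 15.5 billion figure for the unshared count is my own rough estimate of the "a bit less than 16 billion" above:

```python
synapses = 1.5e14        # estimated synapses in the human brain
vgg16_params = 134e6     # VGG16 parameter count, weights shared across positions
vgg16_unshared = 15.5e9  # rough count if each convolution position had its own weights

print(f"{synapses / vgg16_params:.2e}")        # ~1.1e+06: about a million VGG16s
print(f"{vgg16_unshared / vgg16_params:.0f}")  # ~116: the "roughly 115 times more"
print(f"{synapses / vgg16_unshared:.0f}")      # ~9700: roughly 10,000 VGG16s
```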

If we assume that every synapse in the human brain could be represented with a single byte, we'd need some 136TB of memory to store all the synaptic weights. If we were to load that onto GPU memory, we'd need nearly 13 thousand GTX 1080 Ti boards (11GB each), just to hold the weights.
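A quick sanity check of these figures (assuming binary terabytes, and the 11GB of memory a single GTX 1080 Ti actually carries):

```python
synapse_bytes = 1.5e14                # one byte per synapse
print(synapse_bytes / 2**40)          # ~136.4 (binary) terabytes

gpu_mem_bytes = 11 * 2**30            # 11GB of memory on one GTX 1080 Ti
print(synapse_bytes / gpu_mem_bytes)  # ~12700 boards, i.e. nearly 13 thousand
```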

On one hand one byte may be too much, since biological synapses tend to be very noisy (and they either pass a signal or not). On the other hand each synapse undergoes a very complex process of plasticity, both short and long term, modulated by neuromodulators and whatnot, all of which may equally well take megabytes to encode. We don't really know, but it should be clear from this post that we are very likely way off.

And all that is aside from the fact that our neural net models are probably not even doing the right thing... And on top of everything, even if the human brain could be successfully simulated using some 13k GTX 1080 Ti's, they would draw more than 3MW of power, while the biological implementation uses roughly 20W... some 5 orders of magnitude less...
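And the power arithmetic, assuming the 250W TDP of a single GTX 1080 Ti and ignoring host CPUs, cooling and so on:

```python
boards = 12700          # GTX 1080 Ti cards needed for the weights alone
tdp_watts = 250         # TDP of a single board
gpu_watts = boards * tdp_watts
brain_watts = 20        # rough power budget of a human brain

print(gpu_watts / 1e6)          # ~3.2 MW
print(gpu_watts / brain_watts)  # ~1.6e5, some 5 orders of magnitude
```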

Anyway, I'd strongly encourage everyone to ponder these numbers; even if they are likely very imprecise, they still give some interesting perspective.
