Virtual habitability

Introduction

This is my collection of attempts to calculate the processing power needed to completely simulate a human brain and hence, given Moore's Law and the current state of computing power, to estimate how long it will be until such power is available. See Virtuality Universe for related stuff.

Blue Brain 2008 estimate

from this article posted on 2008-03-03:

Markram estimates that in order to accurately simulate the trillion synapses in the human brain, you'd need to be able to process about 500 petabytes of data (peta being a million billion, or 10 to the fifteenth power).

But if computing speeds continue to develop at their current exponential pace, and energy efficiency improves, Markram believes that he'll be able to model a complete human brain on a single machine in ten years or less.

These numbers don't make sense, especially when combined with the 2005 figures below:

  • Markram says "trillion synapses", but singinst.org says "100 trillion"
  • Markram says you'd need to process ~500 PB of data -- is that online storage? Per second?
    • If it's online storage, then his time estimate seems far too optimistic: he's looking at AI-level computing in 10 years despite needing over 1000 times as much storage as my estimate below (see the quick check after this list)
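
That "over 1000 times" is easy to verify; here it is as a minimal Python sketch, assuming the 500 PB figure means online storage and using the 400 TB estimate derived further down:

  markram_storage = 500e15   # 500 petabytes, in bytes
  my_estimate     = 400e12   # 400 terabytes (4 bytes x 100 trillion synapses, below)
  print(markram_storage / my_estimate)   # -> 1250.0, i.e. over 1000x
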
Some relevant bits from the Blue Brain Project web site FAQ:

The human neocortex has many millions of [neocortical column]s (NCCs). ... [One] column has 10,000 - 100,000 neurons depending on the species and particular neocortical region ... We have estimated that we may approach real-time simulations of a NCC with 10,000 morphologically complex neurons interconnected with 10^8 synapses on a 8,000 - 12,000 processor Blue Gene/L machine. To simulate a human brain with around millions of NCCs will probably require more than proportionately more processing power.

So, taking the most pessimistic end of each range (quick arithmetic sketch after the list):

  • human brain has 10 million (10^7) NCCs ("many millions" but not "many tens of millions")
  • 100,000 (10^5) neurons / NCC
  • 100,000,000 (10^8) synapses / NCC
  • = 10^12 neurons (1,000,000,000,000 = trillion = teraneuron) - 10 times the singinst estimate, so not way off
  • = 10^15 synapses (1,000,000,000,000,000 = quadrillion = petasynapse) - again, 10 times the singinst estimate
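
The same pessimistic-end arithmetic in Python (numbers straight from the FAQ ranges above):

  nccs             = 10e6    # 10 million (10^7) neocortical columns
  neurons_per_ncc  = 100e3   # 10^5 neurons per NCC
  synapses_per_ncc = 100e6   # 10^8 synapses per NCC
  print(f"{nccs * neurons_per_ncc:.0e} neurons")    # 1e+12 (teraneuron)
  print(f"{nccs * synapses_per_ncc:.0e} synapses")  # 1e+15 (petasynapse)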

So the singinst estimate probably reflects the same numbers Markram is using, with the 10x factor due to the FAQ's ranges not being specific -- especially given Markram's "ten years or less" estimate, which would be way off for singinst*10 and only slightly optimistic for singinst*1 (he's probably looking at cutting-edge hardware, while I'm trying to estimate when consumer-level hardware will be adequate). Either way, it would seem to confirm that the model of 4 bytes / synapse isn't excessively optimistic. Blue Brain is also devoting a whole CPU to each neuron, but apparently this is taken into account in the "ten years or less" prediction.

Woozle's ~2002 estimate

(Originally posted here)

"The human brain contains somewhere around 100 billion neurons and 100 trillion synapses." [ http://www.singinst.org/seedAI/general.html ] this seems to be the updated version "The current estimate is that the typical human brain contains something like a hundred billion neurons and a hundred trillion synapses."

Oversimplistically modeling a synapse as a single-precision floating-point number (4 bytes), this means we'd need 400,000,000,000,000 bytes (400 terabytes) just to store the data contained in a typical human brain.
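
As a quick back-of-the-envelope check, in Python:

  synapses          = 100e12   # 100 trillion synapses (singinst estimate)
  bytes_per_synapse = 4        # one single-precision float per synapse
  print(synapses * bytes_per_synapse / 1e12)   # -> 400.0 terabytes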

A good portion of those 100 terasynapses are probably in the autonomic sections of the brain and may be unnecessary (when an Essienaut is not using a robotic extension, at any rate); we may also get lucky and find that 4 bytes of resolution is unnecessary for many synapses. On the other hand, we will probably need additional storage for the Essienaut's work and daily life (think of all the things you "store" in your immediate environment)... so 400 TB seems like a reasonable guess of what we'd need in order to get started, if not to live in digital luxury.

Starting with a present low-budget disc storage capacity of 120 GB (about $100 at today's prices) and doubling every 1.5 years, how long does it take to reach 400 TB? Approximately 12 doublings gets us from 120 GB to 400,000 GB -- about 17.5 years. That's NOT LONG.

OK, let's be pessimistic... we need 400 TB of RAM, not disc; disc access is too slow. Present RAM sizes generally start around 128 MB, but it's not hard to find system boards supporting over a gigabyte (and it's not terribly expensive to fill them up). So let's start at 1 GB. 18.6 doublings gets us from 1 GB to 400,000 GB -- not quite 30 years.
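
Both doubling counts are just a base-2 log; here's a throwaway Python helper (years_to_reach is just an illustrative name of mine; the 1.5-year doubling period is the assumption used above):

  from math import log2

  def years_to_reach(start_gb, target_gb, years_per_doubling=1.5):
      # doublings needed to grow from start_gb to target_gb, and the
      # calendar time that implies at the assumed doubling rate
      doublings = log2(target_gb / start_gb)
      return doublings, doublings * years_per_doubling

  print(years_to_reach(120, 400_000))   # disc: ~11.7 doublings, ~17.5 years
  print(years_to_reach(1, 400_000))     # RAM:  ~18.6 doublings, ~27.9 years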

Now, CPU power...

"Neurons are slow; a neuron has a firing rate of only 200 times per second, and the signals travel at a maximum speed of only 150 meters per second." - ibid.

One might think "oh, THAT one's EASY!" but you have to remember that this is EACH of those 100 terasynapses firing 200 times a second (ok, processing signals from neurons firing at about 200 times/second -- same thing, computationally), all working in parallel... or in other words, (100e+12 x 200 =) 20,000 teraflops. Determining flops for today's microprocessors is tricky because it really depends what sort of processing you're doing, but here's my best estimate. AMD's latest 64-bit processor (available at retail for about $500) can be clocked at up to 2,600 MHz and can run two 64-bit (8-byte) operations per cycle. Boldly extrapolating that this means it could run four 4-byte floating-point ops per cycle, at 2,600,000,000 cycles per second, that means 10.4 gflops.

From 10.4 gflops to 20,000 tflops (20,000,000 gflops) is... almost 21 doublings, or 31 years. (And who's to say that we won't come up with some specialized hardware for this stuff before then?)
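
The same estimate, sketched in Python (the 4 single-precision ops per cycle is the bold extrapolation above, not a measured figure):

  from math import log2

  required  = 100e12 * 200   # 100 terasynapses x 200 Hz = 2e16 flops = 20,000 tflops
  available = 2.6e9 * 4      # 2.6 GHz x 4 single-precision ops/cycle = 10.4 gflops
  doublings = log2(required / available)
  print(doublings, doublings * 1.5)   # ~20.9 doublings -> ~31 years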

So it seems clear to me that barring major setbacks, we should have quite adequate computing power (at least to set up a rough digital homestead) by the year 2035 or so.