Supercomputer models one second of human brain activity
The most accurate simulation of the human brain ever has been carried out, but a single second’s worth of activity took one of the world’s largest supercomputers 40 minutes to calculate
That was in 2014.
Where are they now with this technology?
Intel said it would have a machine up and running by this year.
Researchers used the K computer in Japan, currently the fourth most powerful in the world, to simulate human brain activity. The computer has 705,024 processor cores and 1.4 million GB of RAM, but still took 40 minutes to crunch the data for just one second of brain activity.
It used the open-source Neural Simulation Technology (NEST) tool to replicate a network consisting of 1.73 billion nerve cells connected by 10.4 trillion synapses.
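A quick back-of-envelope check of the article's figures (a sketch only; it assumes 1 GB = 1e9 bytes, and the slowdown factor is just 40 minutes of wall time over one second of simulated time):

```python
# Back-of-envelope check of the article's figures.
# Assumption: 1 GB = 1e9 bytes; "1.4 million GB" read as 1.4e15 bytes.
neurons = 1.73e9          # nerve cells in the simulated network
synapses = 10.4e12        # synapses connecting them
ram_bytes = 1.4e6 * 1e9   # 1.4 million GB of RAM on the K computer

bytes_per_synapse = ram_bytes / synapses      # ~135 bytes of state per synapse
slowdown = (40 * 60) / 1.0                    # 40 min wall time per 1 s of brain time

print(f"{bytes_per_synapse:.0f} bytes of RAM per synapse")
print(f"{slowdown:.0f}x slower than real time")
```

So the memory budget works out to roughly 135 bytes per synapse, and the run was about 2,400 times slower than real time.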
It will be interesting to see the latest simulation times.
> 1.73 billion nerve cells connected by 10.4 trillion synapses.
That doesn’t look right. It would require an average of 6,000 synapses per neuron. A neuron has (?) on the rough order of 60 dendrites, and where there are multiple connections between the same two nerve cells you would only need to model two of them, one for each direction.
Looks like a hopelessly inefficient computer algorithm to me, so far.
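For what it’s worth, the 6,000 figure is just the ratio of the two numbers quoted in the article; a quick check of the division (not a claim about the biology either way):

```python
# Average synapses per neuron implied by the article's two figures.
neurons = 1.73e9
synapses = 10.4e12

per_neuron = synapses / neurons
print(f"average of {per_neuron:.0f} synapses per neuron")  # ~6012
```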
mollwollfumble said:
> 1.73 billion nerve cells connected by 10.4 trillion synapses. That doesn’t look right. It would require an average of 6,000 synapses per neuron. A neuron has (?) of the rough order of 60 dendrites, and where there are multiple connections between the same two nerve cells you would only need to model two of them, one for each direction.
Looks like a hopelessly inefficient computer algorithm to me, so far.
I was reading how they severely underestimated the capabilities of neurons, and how they believe each one is not just a simple computer but a supercomputer in itself. If so, you can imagine why it’s so hard to simulate a working brain.
You could think of human efforts to simulate or replicate biological processes with technology as very immature; nature has had a head start of billions of years of evolution to come up with the designs currently alive.
Cymek said:
mollwollfumble said:
> 1.73 billion nerve cells connected by 10.4 trillion synapses. That doesn’t look right. It would require an average of 6,000 synapses per neuron. A neuron has (?) of the rough order of 60 dendrites, and where there are multiple connections between the same two nerve cells you would only need to model two of them, one for each direction.
Looks like a hopelessly inefficient computer algorithm to me, so far.
I was reading how they severely underestimated the capabilities of neurons, and how they believe each one is not just a simple computer but a supercomputer in itself. If so, you can imagine why it’s so hard to simulate a working brain.
No. A neuron isn’t a supercomputer. It’s simply a thresholder: when the input exceeds a given threshold it fires, then takes its time to reset.
What I find particularly interesting is that a neuron isn’t strictly an analog or digital device, more an analog to digital converter.
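The thresholder description maps onto the standard leaky integrate-and-fire model; a minimal sketch (the function name and constants here are illustrative, not taken from any particular simulator):

```python
# Minimal leaky integrate-and-fire neuron: an analog membrane voltage
# that produces a digital spike when it crosses a threshold -- the
# "analog to digital converter" behaviour described above.
def simulate_lif(inputs, threshold=1.0, leak=0.9, refractory=3):
    """Return a list of 0/1 spikes, one per input sample."""
    v = 0.0        # membrane potential (the analog state)
    cooldown = 0   # remaining refractory steps after a spike
    spikes = []
    for x in inputs:
        if cooldown > 0:
            cooldown -= 1          # still resetting: ignore input
            spikes.append(0)
            continue
        v = leak * v + x           # integrate the input with leak
        if v >= threshold:         # threshold crossing: fire
            spikes.append(1)
            v = 0.0                # reset the potential
            cooldown = refractory  # take time to reset
        else:
            spikes.append(0)
    return spikes

# A steady sub-threshold input accumulates until the neuron fires.
print(simulate_lif([0.4] * 10))  # -> [0, 0, 1, 0, 0, 0, 0, 0, 1, 0]
```

The analog part is the continuously varying membrane potential; the digital part is the all-or-nothing spike train that comes out.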
I’m sure I could simulate a working brain on a very average supercomputer. It’d be easier to simulate the workings of a brain than to program the connectome brain architecture in the first place. I used to specialize in speeding up other people’s software, sometimes speeding it up by a factor of 100,000.