IBM has created a new microchip architecture that emulates the axons, dendrites, and synapses of neurons in the human brain, which makes it far better suited to brain-like computation.
There’s been talk of an eventual milestone called the Singularity, the point at which computers become capable of humanlike thought. With computers that can win at Jeopardy! and beat the best chess players in the world, the Singularity seems to be drawing near. But there’s one very big obstacle: computer architecture.
Computer processors are not built in a way that mimics nature, so any attempt to emulate the human brain on them would be brute force at best, and would likely waste a lot of power. Remember that a human brain runs on about 20 watts and, by some rough estimates, is equivalent to around 1.7 THz of processing power (plus it comes pre-wired). Compare that to today’s computers, which draw a few hundred watts and run at around 3 GHz.
The difference is that computers, by design, do brute-force work. They’re not made to do what the human brain does. With current computer architecture, it will be a long time before we can emulate a human brain. Check out this video, which discusses the Singularity.
The cool news about IBM’s new architecture is that, since it emulates the human brain, it’s better suited to ‘cognitive thinking’ — the kind of pattern-based thinking human brains do — as opposed to typical computational thinking.
So far, these microchips have been made to play Pong, control a racing game, and identify images on a screen. But because they work by cognitive thinking rather than computational thinking, the microchips weren’t explicitly programmed for any of these tasks. The new microchips just did it.
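For a rough sense of what ‘emulating a neuron’ means, here’s a minimal sketch of a leaky integrate-and-fire neuron in Python — a standard textbook abstraction of the membrane-and-synapse behavior that neuromorphic chips implement in silicon. The constants and function name here are illustrative, not IBM’s actual design:

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# The membrane potential v leaks toward rest each step, integrates
# incoming synaptic current, and emits a spike (then resets) when
# it crosses a threshold. All constants are illustrative.

def simulate_lif(inputs, leak=0.9, threshold=1.0, reset=0.0):
    """Return the time steps at which the neuron spikes.

    inputs    -- synaptic input current at each time step
    leak      -- fraction of potential retained per step (decay)
    threshold -- potential at which the neuron fires
    """
    v = reset
    spikes = []
    for t, current in enumerate(inputs):
        v = v * leak + current   # leak, then integrate the input
        if v >= threshold:       # fire when the threshold is crossed
            spikes.append(t)
            v = reset            # reset after the spike
    return spikes

# A steady weak input accumulates until the neuron fires, over and over.
print(simulate_lif([0.3] * 10))  # → [3, 7]
```

The key contrast with a conventional CPU is that nothing here is a stored program of instructions: behavior emerges from how charge accumulates and spikes propagate, which is the kind of dynamics these chips realize directly in hardware.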
One of the other key differences is that, because of how this microchip is designed, it has memory built in — making separate RAM modules unnecessary.
This is a big step toward the Singularity. And though it might not transform mainstream computing, chips like these will likely wind up taking over everything that requires smart control: autopilots, AI in video games, robots that search for survivors of natural disasters, and things like that. This might open up a whole new field of computer engineering and computer science, so be on the lookout!