Tuesday, December 6, 2011
Modeling Computers After The Human Brain?
Interesting article in the NYTimes about artificial intelligence.
Excerpt: Yet the principles of biology are gaining ground as a tool in computing. The shift in thinking results from advances in neuroscience and computer science, and from the prod of necessity.
The physical limits of conventional computer designs are within sight — not today or tomorrow, but soon enough. Nanoscale circuits cannot shrink much further. Today’s chips are power hogs, running hot, which curbs how much of a chip’s circuitry can be used. These limits loom as demand is accelerating for computing capacity to make sense of a surge of new digital data from sensors, online commerce, social networks, video streams and corporate and government databases.
To meet the challenge, without gobbling the world’s energy supply, a different approach will be needed. And biology, scientists say, promises to contribute more than metaphors. “Every time we look at this, biology provides a clue as to how we should pursue the frontiers of computing,” said John E. Kelly, the director of research at I.B.M.
Dr. Kelly points to Watson, the question-answering computer that can play “Jeopardy!” and beat two human champions earlier this year. I.B.M.’s clever machine consumes 85,000 watts of electricity, while the human brain runs on just 20 watts. “Evolution figured this out,” Dr. Kelly said.
In designing so-called neuromorphic chips, chips that bear some structural resemblance to the brain, neuroscience was a guiding principle as well. Brains are low-power, nimble computing mechanisms — real-world proof that it is possible.
A brain does its computing with a design drastically different from today’s computers. Its processors — neurons — are, in computing terms, massively distributed; there are billions in a human brain. These neuron processors are wrapped in the brain’s data memory devices — synapses — so that its paths of communication, the neurons’ impulse-conducting axons, are extremely efficient and diverse.
A machine that adopts that approach, said Dr. Modha, an I.B.M. researcher, would represent “a crucial shift away from von Neumann computing.” He was referring to a design with processor and memory physically separated and connected by a narrow communications channel, or bus, and operating according to step-by-step sequential methods — the von Neumann architecture used in current computers, named after the mathematician John von Neumann.
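To make the contrast concrete, here is a minimal sketch (not from the article; all names and constants are invented) of the brain-like style it describes: a leaky integrate-and-fire neuron update in which each neuron's "memory" — its synaptic weights — sits right next to the computation that uses it, rather than across a bus.

```python
def lif_step(potentials, weights, spikes, leak=0.9, threshold=1.0):
    """Advance every neuron one time step.

    potentials: membrane potential of each neuron
    weights:    weights[i][j] = synaptic strength from neuron j to neuron i
    spikes:     0/1 flags saying which neurons fired on the previous step
    """
    new_potentials = []
    new_spikes = []
    for i, v in enumerate(potentials):
        # Each neuron integrates input through its own local synapses:
        # processing and memory (the weights) are co-located, unlike the
        # separate processor and memory of a von Neumann machine.
        v = leak * v + sum(w * s for w, s in zip(weights[i], spikes))
        if v >= threshold:
            new_spikes.append(1)
            v = 0.0  # reset after firing
        else:
            new_spikes.append(0)
        new_potentials.append(v)
    return new_potentials, new_spikes

# Two neurons: neuron 1 listens to neuron 0 with weight 1.5.
potentials = [0.0, 0.0]
weights = [[0.0, 0.0], [1.5, 0.0]]
potentials, spikes = lif_step(potentials, weights, [1, 0])
```

A real neuromorphic chip would run billions of such updates in parallel, event-driven hardware; the loop above is only a toy to show where the state lives.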
Read full NYTimes article here.
Labels: Artificial Intelligence, Computers, Science