Google’s New Computer With Human-Like Learning Abilities Will Program Itself
The new hybrid device might not need humans at all.
By Sage Lazzaro 10/29 3:22pm
In college, it wasn’t rare to hear a verbal battle
regarding artificial intelligence erupt between my friends studying
neuroscience and my friends studying computer science.
One rather outrageous fellow would mention the
possibility of a computer takeover, and off they went. The neuroscience-savvy
would marvel at the potential of such hybrid technology, while the CS majors argued we have nothing to fear, since computers will always need a programmer to tell them what to do.
Today’s news brings us to the Neural Turing Machine, a
computer that will combine the way ordinary computers work with the way the
human brain learns, enabling it to actually program itself. Perhaps my CS
friends should reevaluate their position?
The computer is currently being developed by London-based DeepMind Technologies, an artificial intelligence firm that was
acquired by Google earlier this year. Neural networks — which will enable the
computer to invent programs for situations it has not seen before — will make
up half of the computer’s architecture. Experts at the firm hope this will
equip the machine with the means to create like a human, but still with the
number-crunching power of a computer, New Scientist reports.
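Roughly speaking, DeepMind’s design pairs a neural network with an external memory bank that the network reads and writes through soft, attention-style addressing, so the whole system can be trained end to end. As a rough sketch only (the names and sizes below are illustrative, not DeepMind’s code), a content-based memory read works something like this:

    import numpy as np

    def content_read(memory, key, sharpness):
        """Toy content-based read: score every memory row against a key vector
        (cosine similarity), turn the scores into soft attention weights, and
        return the weighted average of the rows. The result is a smooth blend
        rather than a hard lookup, so the network can learn where to read."""
        sims = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
        weights = np.exp(sharpness * sims)   # sharpen and normalise into attention weights
        weights /= weights.sum()
        return weights @ memory              # the read vector handed back to the network

    memory = np.random.randn(128, 20)   # 128 memory slots of 20 numbers each (made-up sizes)
    key = np.random.randn(20)           # in a real NTM the network itself produces this key
    read_vector = content_read(memory, key, sharpness=5.0)

Because every step is a smooth calculation, the machine can learn by trial and error where to store and fetch information, which is what lets it come up with its own small programs rather than waiting for a human to write them.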
In two different tests, the NTM was asked to 1) learn to
copy blocks of binary data and 2) learn to remember and sort lists of data. The
results were compared with those of a more basic neural network, and the NTM learned faster and produced longer blocks of data with fewer errors. Additionally, the computer’s methods were found to be very similar to the code a human programmer would’ve written to make a computer complete the same task.
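To make the copy test concrete, here is a small, hypothetical sketch of the kind of training data such a test involves (illustrative only, not DeepMind’s benchmark code): the machine is shown a random block of bits followed by an end marker, then has to reproduce the block from memory alone.

    import numpy as np

    def make_copy_example(length, width=8):
        """Build one toy training pair for a copy task: the input is a random
        binary block plus an end-of-block marker row, and the target is the
        same block, which the model must reproduce from whatever it stored."""
        block = np.random.randint(0, 2, size=(length, width))
        marker = np.ones((1, width))   # signals "now play the block back"
        inputs = np.vstack([block, marker])
        targets = block.copy()
        return inputs, targets

    x, y = make_copy_example(length=10)
    print(x.shape, y.shape)   # (11, 8) (10, 8)

The longer the block, the more the machine has to hold in memory before playing it back, which is exactly where the NTM pulled ahead of the plain neural network.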
These are extremely simple tasks for a computer to accomplish when it is told exactly how to do them, but a computer’s ability to learn them on its own could mean a lot for the future of AI.