Alan Turing published "Computing Machinery and Intelligence" in 1950
The Turing Test
Language allows for in-depth probing of intelligence and consciousness:
Q: In the first line of your sonnet which reads "Shall I compare thee to a
summer's day," would not "a spring day" do as well or better?
A: It wouldn't scan.
Q: How about "a winter's day?" That would scan all right.
A: Yes, but nobody wants to be compared to a winter's day.
Q: Would you say Mr. Pickwick reminded you of Christmas?
A: In a way.
Q: Yet Christmas is a winter's day, and I do not think Mr. Pickwick would mind the comparison.
A: I don't think you're serious. By a winter's day one means a typical
winter's day, rather than a special one like Christmas.
1956 Dartmouth Conference
Organized by John McCarthy, who coined the phrase "Artificial Intelligence"
John McCarthy, Marvin Minsky, Claude Shannon, Nathaniel Rochester, Allen Newell, Herbert Simon, Oliver Selfridge, Trenchard More, Arthur Samuel, Ray Solomonoff
McCulloch and Pitts: Networks of logic units (1943)
Mauchly and Eckert at Penn: ENIAC (1946)
Marvin Minsky at MIT: Neural networks (early 50s)
John McCarthy at MIT: LISP (1959)
Arthur Samuel at IBM: Checkers (1959)
Allen Newell and Herbert Simon at CMU: Logic Theorist, General Problem Solver (late 50s, early 60s)
Frank Rosenblatt at Cornell: Perceptrons (early 60s)
Thomas Evans at MIT: ANALOGY (mid 60s)
Joseph Weizenbaum at MIT: ELIZA (mid 60s)
Marvin Minsky and Seymour Papert publish Perceptrons in 1969
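The limitation Minsky and Papert highlighted is easy to demonstrate. Below is a minimal sketch (illustrative code, not from the slides) of Rosenblatt's error-correction learning rule: a single perceptron converges on the linearly separable AND function, but no setting of its weights can get all four XOR cases right.

```python
# Rosenblatt-style perceptron with the classic error-correction rule.
# Illustrative sketch: hyperparameters are arbitrary choices.

def train_perceptron(samples, epochs=20, lr=1.0):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0]*x1 + w[1]*x2 + b > 0 else 0
            err = target - out                 # +1, 0, or -1
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0]*x1 + w[1]*x2 + b > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w, b = train_perceptron(AND)    # AND is linearly separable: converges
print([predict(w, b, x1, x2) for (x1, x2), _ in AND])   # [0, 0, 0, 1]

wx, bx = train_perceptron(XOR)  # XOR is not: some case is always wrong
print(sum(predict(wx, bx, x1, x2) == t for (x1, x2), t in XOR))
```

The XOR count printed at the end can never reach 4, because no single linear threshold unit separates XOR's positive and negative examples.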
Research shifts to knowledge-representation, heuristic search, logic, theorem-proving, and planning
Many rule-based expert systems developed
Doug Lenat's AM program discovered new mathematical concepts
Terry Winograd develops SHRDLU at MIT
What is a neural network?
(Re)discovery of "backpropagation learning algorithm"
Backpropagation solved some of the problems pointed out by Minsky and Papert
Other researchers developed new learning algorithms
NetTalk (Sejnowski and Rosenberg): learned to pronounce English text by example
ALVINN (CMU): learned to autonomously drive a car on the highway
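The backpropagation idea can be sketched in a few lines of plain Python: a small two-layer network trained by gradient descent on XOR, the function a lone perceptron cannot represent. The hidden-layer size, learning rate, and epoch count are illustrative choices, not details of NetTalk or ALVINN.

```python
# Two-layer sigmoid network trained with backpropagation on XOR.
# Illustrative sketch; all hyperparameters are arbitrary.
import math, random

random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [0, 1, 1, 0]

H = 4                                          # hidden units
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
W2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0
lr = 1.0

def forward(x):
    h = [sigmoid(sum(W1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(H)]
    y = sigmoid(sum(W2[j] * h[j] for j in range(H)) + b2)
    return h, y

def loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in zip(X, Y))

loss_before = loss()
for _ in range(5000):
    for x, t in zip(X, Y):
        h, y = forward(x)
        dy = (y - t) * y * (1 - y)               # output-layer error signal
        for j in range(H):
            dh = dy * W2[j] * h[j] * (1 - h[j])  # error propagated backward
            W2[j] -= lr * dy * h[j]
            for i in range(2):
                W1[j][i] -= lr * dh * x[i]
            b1[j] -= lr * dh
        b2 -= lr * dy

print(loss() < loss_before)      # True: training reduces squared error
print([round(forward(x)[1]) for x in X])
```

The key step Minsky and Papert's critique left open is the `dh` line: the output error is propagated backward through the hidden weights, letting multi-layer networks learn functions like XOR.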
Genetic algorithms and evolutionary approaches to AI (80s to present)
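As a concrete illustration of the evolutionary approach, here is a minimal genetic algorithm on the classic OneMax toy problem (maximize the number of 1-bits in a string). Population size, mutation rate, and the truncation selection scheme are arbitrary illustrative choices.

```python
# Minimal genetic algorithm on OneMax: evolve bit strings toward all 1s.
# Illustrative sketch; parameters are arbitrary.
import random

random.seed(1)
L, POP, GENS = 20, 30, 60

def fitness(bits):
    return sum(bits)                     # count of 1-bits

def mutate(bits, rate=0.05):
    return [b ^ (random.random() < rate) for b in bits]   # flip bits at random

def crossover(a, b):
    cut = random.randrange(1, L)         # single-point crossover
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]             # truncation selection (elitist)
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

best = max(pop, key=fitness)
print(fitness(best))                     # close to L after 60 generations
```

Because the parents are carried over unmutated, the best fitness never decreases; selection plus crossover and mutation steadily drive the population toward the all-ones string.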
Rodney Brooks at MIT pioneered a new approach to robotics called "behavior-based control"
Robotics is currently an extremely active research area
IBM's Deep Blue defeated world chess champion Garry Kasparov 3.5 to 2.5 in a match ending May 11, 1997
Deep Junior held its own against Kasparov in a February 2003 match (3 to 3)
Gerry Tesauro's TD-Gammon program learned to play world-class backgammon by playing millions of games against itself
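TD-Gammon combined a neural network with TD(λ) learning from self-play; the core temporal-difference idea can be shown on a much smaller problem. The sketch below (illustrative, not backgammon) applies tabular TD(0) to a standard five-state random-walk task, where the agent learns state values purely from its own generated episodes.

```python
# Tabular TD(0) on a 5-state random walk: start in the middle,
# step left or right at random, reward 1 for exiting right, 0 for left.
# Illustrative sketch of the temporal-difference update; parameters arbitrary.
import random

random.seed(2)
N = 5
alpha = 0.1
V = [0.5] * N                    # value estimates for states 0..N-1

for _ in range(2000):            # learn from self-generated episodes
    s = N // 2
    while True:
        s2 = s + random.choice([-1, 1])
        if s2 < 0:               # exit left: terminal reward 0
            V[s] += alpha * (0 - V[s])
            break
        if s2 >= N:              # exit right: terminal reward 1
            V[s] += alpha * (1 - V[s])
            break
        V[s] += alpha * (V[s2] - V[s])   # TD(0): bootstrap from next state
        s = s2

print([round(v, 2) for v in V])  # true values are 1/6, 2/6, ..., 5/6
```

Each update nudges a state's value toward the value of the state that actually followed it, so good estimates emerge with no teacher; TD-Gammon applied the same principle, with a neural network in place of the table, over millions of self-play games.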
David Cope's EMI program composes music in the style of Bach and Mozart
Maple and Mathematica do sophisticated symbolic mathematics
AI has grown into an enormous field with a huge number of subareas
Successes so far have been limited to narrow domains
AI systems still lack human-level flexibility and adaptiveness
Computer speed and memory capacity continue to increase exponentially
Ray Kurzweil predicts that a $1000 PC will match the computing speed and capacity of the human brain by the year 2020
Hans Moravec predicts that a humanoid robot will be viable by the year 2030
AI researchers often suffer from over-confidence
It is not my aim to surprise or shock you, but the simplest way I
can summarize is to say that there are now in the world machines that think,
that learn and that create. . . . Within 10 years a computer will be world chess
champion.
Herbert Simon, 1957
Moral: We should be skeptical of the hype
Fast, powerful hardware is not enough
Software is probably much more important
AI has come far since Turing's 1950 article...
...but it still has far to go