artificial intelligence

              fdm 20c introduction to digital media
                            lecture 01.06.2007




warren sack / film & digital media department / university of california, santa cruz
last time

• computer games
outline for today (1 of 3)

• artificial intelligence: the founding document
   – who was turing? what is he famous for?
   – a reading of turing’s article “computing machinery and
     intelligence” in which the following is highlighted:
      • gender: the role of the woman in the “imitation game”
      • the aesthetics of the game: the aesthetics of the uncanny
      • the prescient insights of turing on gender and the body that
        turn out, now, to be most useful for trying to understand
        online role-playing games and also some of the central
        weaknesses of decades of ai research (especially oversights
        about the role of the body in models of thinking)
outline (2 of 3)

• a short history of artificial intelligence in software
   – planning as a technical problem
      • GPS as a “solution”: The General Problem Solver by Herbert
        Simon, Allen Newell, and Cliff Shaw
      • demo of GPS
   – story generation as a planning problem
      • TALESPIN as a “solution”
      • demo of micro-talespin
   – story understanding as a plan recognition problem
      • FRUMP as a “solution”
   – question answering as a problem
      • ELIZA as a “solution”
      • demo of ELIZA
outline (3 of 3)

• ELIZA as an evocative object / the
  ethnomethodological approach
alan turing

          • Founder of computer science and
            artificial intelligence; mathematician,
            philosopher, codebreaker, and a gay
            man
              – see http://www.turing.org.uk/turing/
alan turing (1912-1936)

• 1912 (23 June): Birth, Paddington, London
• 1926-31: Sherborne School
• 1930: Death of friend Christopher Morcom
• 1931-34: Undergraduate at King's College,
  Cambridge University
• 1932-35: Quantum mechanics, probability, logic
• 1935: Elected fellow of King's College,
  Cambridge
• 1936: The Turing machine, computability,
  universal machine
alan turing (1936-1946)
• 1936-38: Princeton University. Ph.D. Logic, algebra,
  number theory
• 1938-39: Return to Cambridge. Introduced to
  German Enigma cipher machine
• 1939-40: The Bombe, machine for Enigma
  decryption
• 1939-42: Breaking of U-boat Enigma, saving battle
  of the Atlantic
• 1943-45: Chief Anglo-American crypto consultant.
  Electronic work.
• 1945: National Physical Laboratory, London
• 1946: Computer and software design leading the
  world
alan turing (1947-1954)

• 1947-48: Programming, neural nets, and artificial
  intelligence
• 1948: Manchester University
• 1949: First serious mathematical use of a computer
• 1950: The Turing Test for machine intelligence
• 1951: Elected FRS. Non-linear theory of biological
  growth
• 1952: Arrested as a homosexual, loss of security
  clearance
• 1953-54: Unfinished work in biology and physics
• 1954 (7 June): Death (suicide) by cyanide
  poisoning, Wilmslow, Cheshire.
turing’s “imitation game” (1 of 3)

• “The new form of the problem can be described
  in terms of a game which we call the ‘imitation
  game.’ It is played with three people, a man, a
  woman, and an interrogator who may be of
  either sex. The interrogator stays in a room
  apart from the other two. The object of the game
  for the interrogator is to determine which of the
  other two is the man and which is the woman.”
turing’s “imitation game” (2 of 3)

• “It is [the man's] object in the game to try and
  cause [the interrogator] to make the wrong
  identification.”

• “The object of the game for [the woman] is to
  help the interrogator.”
turing’s “imitation game” (3 of 3)

• “We now ask the question, ‘What will happen
  when a machine takes the part of [the man] in
  this game?’ Will the interrogator decide wrongly
  as often when the game is played like this as he
  does when the game is played between a man
  and a woman? These questions replace our
  original [question], ‘Can machines think?’”
  (Turing, 1950, pp. 433-434)
walker/sack/walker “online caroline”

• walker: “My hair is still wet from the shower
  when I connect my computer to the network,
  sipping my morning coffee. I check my email
  and find it there in between other messages: an
  email from Caroline.”
• sack (citing turing): “[The interrogator asks]: Will
  [you] please tell me the length of [your] hair?”
• walker: “The first lines in my essay on Online
  Caroline really are striking in their insistence on
  a feminine imagery, ...”
walker/sack/walker “online caroline”

• walker: “The first lines in my essay on Online
  Caroline really are striking in their insistence on
  a feminine imagery, ... and especially since the
  images I used (of wet hair and a shower) are so
  typical of the male objectifying gaze Sack refers
  to: imagine shampoo ads with half-naked
  women or the shower scene in Psycho. Why on
  earth did I choose such a way to ground my
  reading of Online Caroline?”
walker/sack/walker “online caroline”

• what is this virtual body evoked by turing and
  walker and “online caroline”?

• do you have a gender when you are online?
artificial intelligence: a definition

“... artificial intelligence [AI] is the science of
   making machines do things that would require
   intelligence if done by [humans]”
   Marvin Minsky, 1963
artificial intelligence: research areas

•   Knowledge Representation
•   Programming Languages
•   Natural Language (e.g., Story) Understanding
•   Speech Understanding
•   Vision
•   Robotics
•   Machine Learning
•   Expert Systems
•   Qualitative Simulation
•   Planning
planning as a technical problem

 – GPS is what is known in AI as a “planner.”
    • Newell, Allen, Shaw, J. C., and Simon, Herbert A. “GPS, A
      Program That Simulates Human Thought.” In Computers and
      Thought, ed. Edward A. Feigenbaum and Julian Feldman.
      pp. 279-293. New York, 1963
 – To work, GPS required that a full and accurate model
   of the “state of the world” (i.e., insofar as one can
   even talk of a “world” of logic or cryptarithmetic, two
   of the domains in which GPS solved problems) be
   encoded and then updated after any action was taken
   (e.g., after a step was added to the proof of a
   theorem).
    • demo: implementation from Peter Norvig’s Paradigms of
      Artificial Intelligence Programming (see www.norvig.com)
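To make the planning idea concrete, here is a minimal means-ends sketch in Python rather than the Common Lisp of Norvig's implementation; the operators and goal are invented for illustration, loosely echoing the "driving to school" domain of Norvig's GPS chapter:

```python
# A toy means-ends planner in the spirit of GPS: each operator lists the
# conditions it needs and the facts it adds to the model of the world.
ops = [
    {"name": "drive to school", "needs": {"car works"},  "adds": {"at school"}},
    {"name": "repair car",      "needs": {"have money"}, "adds": {"car works"}},
    {"name": "work",            "needs": set(),          "adds": {"have money"}},
]

def achieve(goal, state, plan):
    """Achieve one goal, recursively achieving each operator's preconditions."""
    if goal in state:
        return True
    for op in ops:
        if goal in op["adds"] and all(achieve(n, state, plan) for n in op["needs"]):
            state |= op["adds"]      # the world model must be updated by hand
            plan.append(op["name"])
            return True
    return False

state, plan = set(), []
achieve("at school", state, plan)
print(plan)  # ['work', 'repair car', 'drive to school']
```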
a problem with ai planning

• the “frame problem”: This assumption – that
  perception was always accurate and that all of
  the significant details of the world could be
  modeled and followed – was incorporated into
  most AI programs for decades and resulted in
  what became known to the AI community as the
  “frame problem;” i.e., the problem of deciding
  what parts of the internal model to update when
  a change is made to the model or the external
  world.
     • Cf. Martins, J. “Belief Revision.” In Encyclopedia of Artificial
       Intelligence, Second Edition. Stuart C. Shapiro (editor-in-
       chief), pp. 110-116. New York, 1992
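A small, invented illustration of why this is hard: after a single action the program must decide, fact by fact, what still holds, and in any realistic model the list of untouched facts is enormous:

```python
# An internal model of the world as a set of facts.
world = {"box is blue", "box is on table", "door is open", "it is raining"}

def paint_box_red(world):
    """One action changes one fact; everything else must be carried over."""
    world = (world - {"box is blue"}) | {"box is red"}
    # The frame problem: nothing persists unless the program decides it
    # does. Did painting the box move it off the table? Close the door?
    # Stop the rain? The model has to encode an answer for every fact.
    return world

print(sorted(paint_box_red(world)))
```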
story generation as planning

– James Meehan, "The Metanovel: Writing Stories by
  Computer", Ph.D. diss., Yale University, 1976.
   • demo: micro-talespin
      – http://www.eliterature.org/2006/01/meehan-and-sacks-micro-talespin/
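A minimal sketch of the TALESPIN idea, that a story is the narrated trace of a character's plan. The characters are borrowed from Meehan's examples, but the goal/event table below is invented, and in Python rather than Meehan's Lisp:

```python
# Story generation as planning: to tell a story, satisfy a character's goal
# by recursively satisfying preconditions, narrating each event in order.
story_ops = {
    "joe bear is not hungry":
        ("Joe Bear eats the honey", ["joe bear has honey"]),
    "joe bear has honey":
        ("Joe Bear takes the honey", ["joe bear knows where the honey is"]),
    "joe bear knows where the honey is":
        ("Irving Bird tells Joe Bear where the honey is",
         ["joe bear asks irving bird"]),
    "joe bear asks irving bird":
        ("Joe Bear asks Irving Bird where the honey is", []),
}

def narrate(goal):
    event, preconditions = story_ops[goal]
    for pre in preconditions:   # plan for the preconditions first
        narrate(pre)
    print(event + ".")

print("One day Joe Bear was hungry.")
narrate("joe bear is not hungry")
```

Meehan's list of missing common sense (next slide) is precisely a catalogue of what such a table leaves out.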
problems with story generation:
missing common sense
• Examples of Talespin’s missing common sense
  (from Meehan, 1976)

  – Answers to questions can take more than one form.
  – Don’t always take answers literally.
  – You can notice things without being told about them.
  – Gravity is not a living creature.
  – Stories aren’t really stories if they don’t have a central
    problem.
  – Sometimes enough is enough.
  – Schizophrenia can be dysfunctional.
story understanding
as a plan recognition problem
G. DeJong (1979) FRUMP: Fast Reading Understanding and Memory Program

$demonstration script
• The demonstrators arrive at the demonstration
  location.
• The demonstrators march.
• Police arrive on the scene.
• The demonstrators communicate with the target
  of the demonstration.
• The demonstrators attack the target of the
  demonstration.
• The demonstrators attack the police.
(From DeJong, 1979; pp. 19-20)
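A hedged sketch of script application: skim the text and, whenever a sentence matches an event the active script predicts, fill in that slot; everything the script does not predict is ignored. The keyword cues and sample story are invented and far cruder than FRUMP's "sketchy scripts":

```python
# The $demonstration script as an ordered list of expected events, each
# with a few (invented) surface cues that count as a match.
script = [
    ("demonstrators arrive",    ["gathered", "arrived"]),
    ("demonstrators march",     ["marched"]),
    ("police arrive",           ["police"]),
    ("communicate with target", ["demanded", "chanted"]),
    ("attack target",           ["stormed"]),
    ("attack police",           ["clashed"]),
]

story = ("Students gathered outside the embassy and marched down "
         "the avenue, where they demanded the release of prisoners.")

for event, cues in script:
    matched = any(cue in story.lower() for cue in cues)
    print(f"{event:25s} {'FILLED' if matched else 'skipped'}")
```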
story understanding as plan recognition

• demo: micro-sam
  – Richard Cullingford, “Script application: computer
    understanding of newspaper stories,” Ph.D. diss.,
    Yale University, 1977.
question answering as a problem

 – ELIZA as a “solution”
    • J. Weizenbaum, “ELIZA -- A Computer Program for the Study
      of Natural Language Communication between Man and
      Machine,” Communications of the Association for Computing
      Machinery, vol. 9, no. 1 (January 1966), pp. 36-45.
    • demo: see www.norvig.com for source code
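A minimal sketch of the keyword-and-template mechanism ELIZA uses; the rules below are invented and far smaller than Weizenbaum's DOCTOR script (see the Norvig source above for a fuller version):

```python
import re

# ELIZA in miniature: scan for a keyword pattern, reflect pronouns,
# and fill the captured text into a canned template.
rules = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)",   "How long have you been {0}?"),
    (r"my (.*)",     "Tell me more about your {0}."),
    (r"(.*)",        "Please go on."),                  # fallback
]
reflections = {"my": "your", "me": "you", "i": "you", "am": "are"}

def reflect(text):
    return " ".join(reflections.get(w, w) for w in text.split())

def respond(utterance):
    for pattern, template in rules:
        m = re.match(pattern, utterance.lower())
        if m:
            return template.format(*(reflect(g) for g in m.groups()))

print(respond("I need a vacation"))  # -> Why do you need a vacation?
print(respond("I am unhappy"))       # -> How long have you been unhappy?
```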
question for today

• what problem does weizenbaum’s eliza system
  address or solve?

  – the artificial intelligence answer: it does (or does not)
    behave like a human and is therefore successful (or
    not successful)

  – the ethnomethodology answer: it is taken to be like
    a person in a conversation and thus simply works like
    most other technologies in a social situation
remember johnstone’s “algorithm”

• If the last two answers were “No,” then answer
  “Yes.”
• Else, if more than 20 total answers, then answer
  “Yes.”
• Else, if the question ends in vowel, then answer
  “No.”
• Else, if question ends in “Y,” then answer
  “Maybe.”
• Else, answer “Yes.”
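The rules above are already an algorithm; a direct transcription into Python (the function name and the running list of answers are my own bookkeeping):

```python
def johnstone_answer(question, answers):
    """Answer a yes/no question using Johnstone's rules, tried in order."""
    last_letter = question.rstrip("?").strip()[-1].lower()
    if answers[-2:] == ["No", "No"]:      # last two answers were "No"
        answer = "Yes"
    elif len(answers) > 20:               # more than 20 total answers
        answer = "Yes"
    elif last_letter in "aeiou":          # question ends in a vowel
        answer = "No"
    elif last_letter == "y":              # question ends in "y"
        answer = "Maybe"
    else:
        answer = "Yes"
    answers.append(answer)
    return answer

answers = []
print(johnstone_answer("Are you a machine?", answers))  # ends in 'e' -> No
print(johnstone_answer("Are you happy?", answers))      # ends in 'y' -> Maybe
```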
ethnomethodology: a definition

• Ethnomethodology simply means the study of
  the ways in which people make sense of their
  social world.
• Ethnomethodology is a fairly recent sociological
  perspective, founded by the American
  sociologist Harold Garfinkel in the early 1960s.
  The main ideas behind it are set out in his book
  "Studies in Ethnomethodology" (1967).
  (Simon Poore, http://www.hewett.norfolk.sch.uk/curric/soc/ethno/intro.htm)
ethnomethodology

• Ethnomethodology differs from other
  sociological perspectives in one very important
  respect:
  – Ethnomethodologists assume that social order is
    illusory. They believe that social life merely appears
    to be orderly; in reality it is potentially chaotic. For
    them social order is constructed in the minds of
    social actors as society confronts the individual as a
    series of sense impressions and experiences which
    she or he must somehow organise into a coherent
    pattern.
     • Simon Poore, http://www.hewett.norfolk.sch.uk/curric/soc/ethno/intro.htm
ethnomethodology

• Q: How do people make sense of the world?
• A: They/we use the “documentary method”
• Karl Mannheim, “the documentary method”
  – Garfinkel on Mannheim: “The method consists of
    treating an actual appearance as ‘the document of,’
    as ‘pointing to,’ as ‘standing on behalf of’ a
    presupposed underlying pattern. The method is
    recognizable for the everyday necessities of
    recognizing what a person is ‘talking about’ given that
    he does not say exactly what he means, or in
    recognizing such common occurrences and objects
    as mailmen, friendly gestures, and promises.”
lucy suchman

• Ph.D. in Social/Cultural Anthropology from the
  University of California at Berkeley
• Researcher at Xerox’s Palo Alto Research
  Center (PARC)
• Founded and directed the Work Practice &
  Technology research group at PARC
• Currently Professor in the Centre for Science
  Studies and Sociology Department at Lancaster
  University in England
lucy suchman

• is an ethnomethodologist and an anthropologist
  of science (cf. bruno latour in next week’s
  lectures)

• her work radically challenged work in hci and ai

• she is one of the primary people working in the
  fields of participatory design (pd) and computer-
  supported cooperative work (cscw)

				