The representational headaches of cognitive science

       Embodied Embedded Cognition

                                 Pim Haselager
Artificial Intelligence and Cognition, NICI - Radboud University, Nijmegen, The Netherlands
Cognitive Science and Philosophy of Mind, Philosophy Dpt. - UNESP, Marília, SP, Brazil


 • Cognitive science and representations
 • A growing discontent with representational models
   and explanations
      – Attempt to indicate why
      – Discontent with exactly what (& with what not)
 • A new approach to cognition
      – Focus on the role of the body and environment in the
        emergence of behavior
      – Embodied Embedded Cognition (EEC)

Cognitive Science and representations

   •   Difference with behaviorism: representations
   •   Cognitive systems (minds) are representational
   •   Representations are of causal relevance to behavior
   •   Classical cognitive science and standard forms of
       connectionism are both representational

            Representation defined
• “A sophisticated system (organism) designed (evolved) to
  maximize some end (e.g., survival) must in general adjust its
  behavior to specific features, structures, or configurations of its
  environment in ways that could not have been fully
  prearranged in its design. [...]
  But if the relevant features are not always present (detectable),
  then they can, at least in some cases, be represented; that is,
  something else can stand in for them, with the power to guide
  behavior in their stead. That which stands in for something else
  in this way is a representation; that which it stands for is its
  content; and its standing in for that content is representing it”.
(Haugeland, 1991, p. 62).

       Representational Headaches
  1) The frame problem
  2) The representational format problem
  3) The operationalization problem
  4) The grounding problem
  5) The observer problem
  6) The eagerness problem
  7) The omnipresence problem

            1) The frame problem
                      (Act normal!)
• Arises in the context of creating (modeling) a system that
  is capable of acting intelligently in the world (McCarthy
  & Hayes, 1969)
• This requires the ability to notice rapidly the relevant
  consequences of actions or events
• Traditionally, the basic idea is that people are capable of
  doing so by efficiently using what they know about the world
• How to represent the immense amount of common sense
  knowledge in a way that enables instantaneous use of the
  relevant parts of it?

        Examples of common sense

• Alexander fries an egg
• Amalia moves her cup of coffee away from the
  paper she is writing
• Carol sees a bike with a flat tire and walks away
  to a bus stop
• John gets upset when Mary laughs while Paul is
  talking to her

Common sense: a complicated affair?

   [Diagram: Input → World Model (Knowledge, Beliefs, Desires) →
    Plans → Decisions → Output]
          Distorting common sense
 “Consider a famous commonsense problem:
    what knowledge comes into play in formulating a
    simple plan to crack an egg on the side of a bowl, with
    the intention that the egg contents will be in the bowl.”
       (Elio, 2002, ‘Common sense, reasoning & rationality’, p.11).
 Bowls, eggs, liquids, gravity, hands, grips, speedy movements, etc.

• The way the issue is formulated helps to create the
      frame problem

2) The representational format problem
   (Two fatally ill candidates: whom to vote for?)

 • Symbolic representations
    – Too brittle
    – Explicitness is self-defeating

 • Distributed representations
    – Unable to represent structure inherent in information
    – Brittle training procedures

 Result: A representational impasse
   • Currently available models indicate that it is
     not possible both:
        – to internally represent a large amount of well-
          structured information, and
        – to use the relevant parts of it efficiently.
   • It may be possible to do one or the other,
     but not both at the same time.

3) The operationalization problem
             (Representations? Where?)

• “We discuss simple machine vision systems developed by
  artificial evolution rather than traditional engineering design
  techniques, and note that the task of identifying internal
  representations is made difficult by the lack of an operational
  definition of representation at the causal mechanistic level.
  Consequently, we question the nature and indeed the existence of
  representations posited to be used within natural vision systems
  (i.e., animals).”
(Cliff & Noble, 1997, p.1165).

       4) The grounding problem
                (What do they mean?)

 • How do representations become meaningful
   to the model or system itself?
    – How to make the meanings of representations
      intrinsic to the system instead of parasitic on
      the meanings in our heads (Harnad, 1990).
       • ‘Intrinsic’ intentionality and ‘genuine’ meaning
         currently remain problematic from a computational-
         representational perspective

        5) The observer problem
         (This behavior looks intelligent,
      therefore it must be representational)
• The ant (Simon’s parable of the ant tracing a winding path on a beach)

• and Simon’s hypothesis:
     “Human beings, viewed as behaving systems, are quite
     simple. The apparent complexity of our behavior over
     time is largely a reflection of the complexity of the
     environment in which we find ourselves.”
   (Simon, 1969/1996, p.53).

            6) The eagerness problem
          (Representations, no matter what task)
• Solving a ‘simple’ problem
   (how to ensure a smooth output of a flywheel while dealing with an
     irregular power source and a changing workload)
     van Gelder (1995)
      • 1. Measure the speed of the flywheel
      • 2. Compare the actual speed against the desired speed
      • 3. If there is no discrepancy, return to step 1; otherwise
          – a. Measure the current steam pressure
          – b. Calculate the desired alteration in steam pressure
          – c. Calculate the necessary throttle valve adjustment
      • 4. Make the throttle valve adjustment
      Return to step 1.
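Van Gelder’s algorithmic governor above can be sketched as an explicit control loop. The function name, the proportional adjustment rule standing in for steps 3a–3c, and the gain are illustrative assumptions, not van Gelder’s own formulation:

```python
def computational_governor_step(actual_speed, desired_speed, valve, gain=0.1):
    """One pass through the algorithmic (computational) governor.

    Steps 1-4 of the algorithm: measure, compare, and -- only if a
    discrepancy is found -- compute and apply a valve adjustment.
    The proportional rule and the gain value are illustrative.
    """
    discrepancy = desired_speed - actual_speed        # steps 1-2: measure & compare
    if discrepancy == 0:                              # step 3: no discrepancy, loop
        return valve
    adjustment = gain * discrepancy                   # steps 3a-3c, collapsed
    return min(1.0, max(0.0, valve + adjustment))     # step 4: adjust, clamp to [0, 1]

# Repeated measuring and comparing drives the valve toward equilibrium:
valve = 0.5
for _ in range(3):
    valve = computational_governor_step(actual_speed=90.0,
                                        desired_speed=100.0, valve=valve)
```

Note that every cycle passes through explicit measure/compare/calculate stages, each operating on a stored value, which is exactly the representational structure van Gelder contrasts with Watt’s device.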

                          Watt’s solution
   • No internal message
   • Direct coupling of the parts of the governor
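By contrast, Watt’s direct coupling can be sketched as a toy dynamical simulation in which nothing is measured, compared, or calculated: arm angle, valve opening, and flywheel speed continuously co-determine one another. All constants and the linear couplings are illustrative assumptions, not the actual physics of the device:

```python
def watt_governor_sim(load, steps=2000, dt=0.01):
    """Toy dynamical sketch of Watt's governor (illustrative constants).

    No internal messages: the spindle-arm angle tracks the flywheel
    speed, the raised arms close the throttle valve, and the valve in
    turn drives the flywheel against the load.
    """
    speed, angle = 0.0, 0.0
    for _ in range(steps):
        angle += dt * (speed - angle)          # arms rise as the wheel speeds up
        valve = max(0.0, 1.0 - angle)          # raised arms close the valve
        speed += dt * (valve - load * speed)   # valve drives wheel, load brakes it
    return speed
```

Under different workloads the same mechanism settles at different speeds, with no step in the loop that could be singled out as “measuring” or “representing” the speed.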

          7) The omnipresence problem
                 (Representations? Everywhere!)
       • Bechtel (1998): The Watt governor (WG) is representational
   [Diagram: system Z (opening of throttle valve) uses Y (angle of
    spindle arms) to coordinate behavior with X; Y represents the
    object X (speed of flywheel)]

                   Really everywhere.....
      “It may be suggested as a criticism of this very
        general characterization of computation that it is
        too general.
        For in this very wide sense, even a sieve or a
        threshing machine could be considered a
        computer, since they sort their inputs into types,
        and if one wanted to spend time at it, one could
        discover a function that describes the input-output
        behavior.
        While this observation is correct, it is not so much
        a criticism as an apt appreciation of the breadth of
        the notion.”
      (Churchland & Sejnowski, 1992, p.66).

    Representational Headaches
1) The frame problem
2) The representational format problem
3) The operationalization problem
4) The grounding problem
5) The observer problem
6) The eagerness problem
7) The omnipresence problem

                 A motto
 ‘Away with representations’

 ‘Don’t use representations in explanation
 and modeling if you can do without them’

      Embodied Embedded Cognition
• Bodily interaction with the environment is primary,
  not secondary, to cognition
• Labels: enactive cognition, situated cognition,
  embedded cognition, etc.
• Some books:
      Varela, Thompson & Rosch, 1991; Edelman, 1992; Thelen &
      Smith, 1994; Port & van Gelder, 1995; Kelso, 1995; Clancey,
      1997; Agre, 1997; Clark, 1997, 2001; Juarrero, 1999; Keijzer,
      2001; and many others.

 • Intrinsic dynamics (Kelso, 1995)
    – relatively autonomous coordination tendencies
 • Learning to descend a slope (Adolph, 1993; Thelen &
   Smith, 1994)
 • Learning to drive a car
 • Playing football after many years
 • Cognitive systems ‘tune into their bodies’
    (Chiel & Beer, 1997)
    – phylogenetically
    – ontogenetically

• Scaffolding (Clark, 1997)
    – "We manage our physical and spatial surroundings in ways that
      fundamentally alter the information-processing tasks our brains
      confront." (Clark, 1997)
• The 007 principle (Clark, 1997)
    – "In general, evolved creatures will neither store nor
      process information in costly ways when they can use
      the structure of the environment and their operations
      upon it as a convenient stand-in for the information-
      processing operations.
    – That is, know only as much as you need to know to get
      the job done." (Clark, 1997, p.46).
• Epistemic action (Kirsh & Maglio, 1994)

              Reactive Robotics
•   Rodney Brooks, MIT
•   Late 1980s
•   Lots of resistance
•   Lots of publicity

• Cybernetics
  – Wiener (mathematics)
     • Cybernetics, or control and
       communication in the animal and
       machine (1948)
  – Ashby (psychiatry)
     • Principles of the Self-Organizing
       Dynamic System (1947)
  – Von Bertalanffy (biology)
      • General System Theory: Foundations,
        Development, Applications (1968)
• Grey Walter (neurophysiology)
• Braitenberg (neurophysiology)

• Braitenberg vehicles (1984)
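A Braitenberg vehicle shows how seemingly purposeful behavior falls out of direct sensor-motor wiring, with no internal model. A minimal sketch; the function name and the unit gains are assumptions, and real vehicles would also integrate motion over time:

```python
def braitenberg_step(left_sensor, right_sensor, crossed=True):
    """One step of a two-sensor, two-motor Braitenberg vehicle.

    Each sensor drives a motor directly. With crossed wiring
    (Braitenberg's vehicle 2b) the side nearer the stimulus speeds up
    the opposite wheel, so the vehicle turns toward the stimulus
    ("aggression"); with straight wiring (2a) it turns away ("fear").
    """
    if crossed:
        left_motor, right_motor = right_sensor, left_sensor
    else:
        left_motor, right_motor = left_sensor, right_sensor
    return left_motor, right_motor
```

With a stimulus on the left (strong left sensor), crossed wiring makes the right motor spin faster, steering the vehicle leftward toward the source: behavior an observer might gloss as goal-directed, produced by two wires.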

             History of reactive robotics
   [Timeline, 1950–2010: Walter; Braitenberg; Brooks; Nolfi & …;
    also the Stanford cart and the planetary rovers]

                 Biological plausibility
• Animal models
• Cambrian intelligence
   – “the ability to move around in a dynamic environment sensing
     the surroundings to a degree sufficient to achieve the necessary
     maintenance of life and reproduction.” (Brooks, 1999)
   – Hannibal & Attila
      • Infrared sensors: seeking moving target (heat)
      • 19 degrees of freedom, 60 sensory inputs, and 8 microprocessors
      [Images: the robots Hannibal and Attila]

                        Layers of behavior
                             Rodney Brooks
 • Traditional approach: serial (horizontal) decomposition into
   functional modules
 • Brooks: parallel (vertical) decomposition into behavioral layers,
   each running directly from sensors to effectors (e.g., explore)
   [Video: Allen’s performance]
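The layered idea can be sketched as a single control cycle in which lower reflex layers keep control whenever they fire and higher layers take over only when the reflexes are quiet. This is a common flattening of the scheme; the layer names are illustrative, and Brooks’s actual subsumption architecture uses asynchronous augmented finite state machines, not one if-chain:

```python
def subsumption_step(sensors):
    """One control cycle of a minimal subsumption-style sketch.

    Each layer maps sensors to an action on its own; a lower layer's
    reflex, when triggered, subsumes the output of the layers above it.
    """
    # Lowest layer: avoid obstacles (reflex, always wins when triggered).
    if sensors.get("obstacle_close"):
        return "turn_away"
    # Middle layer: wander when nothing demands avoidance.
    if sensors.get("bored"):
        return "wander"
    # Highest layer: default exploratory motion toward open space.
    return "move_forward"
```

Note that no layer consults a world model: each is a direct sensor-to-effector mapping, and coherent overall behavior emerges from their interaction with the environment.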

         Mars Exploratory Rovers
• Sojourner - landed July 4, 1997
• Spirit - launched June 10, 2003; landed Jan. 4, 2004
• Opportunity - launched July 7, 2003; landed Jan. 25, 2004
   – Sp & O
       • 40 meters per day
       • Size of a fridge
       • Geological evidence for water
       • O got free from a ‘sand trap’

Simulated robots
Willems & Haselager

                  The Play Back Panel

  Figure 1: Decoy Behavior
  These diagrams show the expected decoy behavior of the Unies and a Wumpus
  near a Unie-food source. a) The Unies approach a food source but hold back
  because the Wumpus is near the food, b) The hungriest Unie comes near
  enough to the Wumpus for it to start the chase. c) The Unie senses that the
  Wumpus comes too near and reverses its direction. The Wumpus gets nearer to
  the Unie. d) The Wumpus chases one of the Unies. Because the Wumpus is
  farther away from the food, the Unies that are not being chased can approach
  and eat the food.
                                                     Willems & Haselager (2003)

     Embodied Embedded Systems
   The coupling between world, brain and body

   [Diagrams: two views of the coupling among world, body, and brain]

• EEC implies a different conceptualization
  of the primary task of a cognitive system:
   – not focused on problem solving by means of
     internal information processing,
   – but on contributing to the ongoing interaction with the
     environment
                  The brain
• Not a conductor (Chiel & Beer, 1997)
   – rather a player in a jazz ensemble
• Not primarily a problem solver
   – directed at improvisation
• Contributing to a behavioral repertoire
   – behavioral layers (Brooks, 1991)
   – the environment selects
• Following the ‘laziness principles’

            The laziness principles
        Cognitive strategies for being lazy instead of tired

• Let the environment do the difficult work for you
   – scaffolding
• Don’t think: Act!
   – just get started
      • the environment can correct you
      • it’s often possible to adjust later on
• Copying and imitating are good
   – follow ‘mom and dad’
• Postpone
   – don’t think now of what you can think about later
     (something may happen in the meantime)
• Lower your ambitions
   – if the world doesn’t cooperate: “Oh well, it’s not
     all that important anyway”
• Seek company of people that agree with you
   – Call them ‘friends’

          Auto-pilot & deep thought
• Many times we function on auto-pilot
   – Almost automatically, habitually, on-line
• Other times we operate on the basis of ‘deep thought’
   – Concentrated, conscious, off-line
• The majority of our daily life activities (‘getting
  by’) is based on this automatic pilot mode
   – Stop & think, switch to deep thought, return to auto-pilot
• Representations are not ‘wrong’
  – But being addicted to representations is
     • Don’t use representations if you can do without them
• Embodied Embedded Cognition
  – Reinterprets the main tasks of our cognitive system
     • Laziness principles
     • Distinction between deep thought and autopilot behavior
• Empirical issues
  – How much behavior can be modeled this way?
  – Abstract (‘higher-level’) representations
     • How do they fit in?
     • When do they fit in?