
CS626-449: NLP, Speech and Web-Topics-in-AI
Pushpak Bhattacharyya
CSE Dept., IIT Bombay
Lecture 37: Semantic Role Extraction
(obtaining Dependency Parse)
Vauquois Triangle

    Interlingua based
    (do deep semantic processing before
    entering the target language)

    Transfer based
    (do partial syntactic/semantic analysis,
    then transfer, before generating the
    target language)

    Direct
    (enter the target language immediately
    through a dictionary)

Vauquois: an eminent French Machine
Translation researcher, originally a physicist
                                                                                2
Universal Networking
Language
   Universal Words (UWs)
   Relations
   Attributes
   Knowledge Base




                            3
UNL Graph

       He forwarded the mail to the minister.

       forward(icl>send) @entry @past
           agt → He(icl>person)
           obj → mail(icl>collection) @def
           gol → minister(icl>person) @def
                                                                        4
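The UNL graph above is just a set of labelled binary relations between attributed Universal Words, so it can be held in a very simple data structure. Below is a minimal Python sketch (the Relation tuple and the uw_attrs/unl_graph names are illustrative, not part of the UNL specification) that stores and prints the relations of the example sentence.

from collections import namedtuple

# Illustrative container; not part of the UNL standard.
Relation = namedtuple("Relation", ["label", "head", "dependent"])

# Universal Words of the sentence with their attribute lists.
uw_attrs = {
    "forward(icl>send)":    ["@entry", "@past"],
    "he(icl>person)":       [],
    "mail(icl>collection)": ["@def"],
    "minister(icl>person)": ["@def"],
}

# "He forwarded the mail to the minister."
unl_graph = [
    Relation("agt", "forward(icl>send)", "he(icl>person)"),
    Relation("obj", "forward(icl>send)", "mail(icl>collection)"),
    Relation("gol", "forward(icl>send)", "minister(icl>person)"),
]

def with_attrs(uw):
    """Render a UW together with its attributes, e.g. forward(icl>send).@entry.@past"""
    return uw + "".join("." + a for a in uw_attrs[uw])

for rel in unl_graph:
    print(f"{rel.label}({with_attrs(rel.head)}, {with_attrs(rel.dependent)})")

Running this prints relation lines in the same style as the worked examples on the following slides, e.g. agt(forward(icl>send).@entry.@past, he(icl>person)).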
AGT / AOJ / OBJ
   AGT (Agent)
    Definition: Agt defines a thing which
    initiates an action
   AOJ (Thing with attribute)
    Definition: Aoj defines a thing which is in a
    state or has an attribute
   OBJ (Affected thing)
    Definition: Obj defines a thing in focus
    which is directly affected by an event or state
                                                    5
Examples
   John broke the window.
    agt ( break.@entry.@past, John)

   This flower is beautiful.
    aoj ( beautiful.@entry, flower)

   He blamed John for the accident.
    obj ( blame.@entry.@past, John)
                                       6
BEN
   BEN (Beneficiary)
     Definition: Ben defines a beneficiary or
     victim that is not directly related to an
     event or state

   Can I do anything for you?
ben ( do.@entry.@interrogation.@politeness, you )
obj ( do.@entry.@interrogation.@politeness, anything )
agt (do.@entry.@interrogation.@politeness, I )
                                                         7
PUR
   PUR (Purpose or objective)
     Definition: Pur defines the purpose or
     objective of the agent of an event, or the
     purpose for which a thing exists

   This budget is for food.
       pur ( food.@entry, budget )
       mod ( budget, this )

                                                 8
RSN

     RSN     (Reason)
      Definition: Rsn defines a reason why
      an event or a state happens
     They selected him for his honesty.
        agt(select(icl>choose).@entry, they)
        obj(select(icl>choose) .@entry, he)
        rsn (select(icl>choose).@entry, honesty)

                                                   9
TIM
   TIM (Time)
    Definition: Tim defines the time an
    event occurs or a state is true
   I wake up at noon.
    agt ( wake up.@entry, I )
    tim ( wake up.@entry, noon(icl>time))


                                            10
     TMF
   TMF (Initial time)
    Definition: Tmf defines a time an
    event starts
   The meeting started from morning.
    obj ( start.@entry.@past, meeting.@def )
    tmf ( start.@entry.@past, morning(icl>time) )




                                                    11
     TMT
   TMT (Final time)
    Definition: Tmt defines a time an event
    ends
   The meeting continued till evening.
    obj ( continue.@entry.@past, meeting.@def )
    tmt ( continue.@entry.@past,evening(icl>time) )




                                                      12
PLC
   PLC (Place)
    Definition: Plc defines the place an
    event occurs or a state is true or a thing
    exists
   He is very famous in India.
       aoj ( famous.@entry, he )
       man ( famous.@entry, very)
       plc ( famous.@entry, India)

                                            13
PLF
   PLF (Initial place)
    Definition: Plf defines the place an
    event begins or a state becomes true
   Participants come from the whole
    world.
      agt ( come.@entry, participant.@pl )
      plf ( come.@entry, world )
      mod ( world, whole)

                                             14
PLT
   PLT (Final place)
    Definition: Plt defines the place an
    event ends or a state becomes false
   We will go to Delhi.
      agt ( go.@entry.@future, we )
      plt ( go.@entry.@future, Delhi)



                                           15
INS
   INS (Instrument)
    Definition: Ins defines the instrument
    to carry out an event
    I solved it with a computer.
      agt ( solve.@entry.@past, I )
      ins ( solve.@entry.@past, computer )
      obj ( solve.@entry.@past, it )


                                             16
Attributes
   Constitute syntax of UNL
   Play the role of bridging the conceptual world and
    the real world in the UNL expressions
   Show how and when the speaker views what is said
    and with what intention, feeling, and so on
   Seven types:
       Time with respect to the speaker
       Aspects
       Speaker’s view of reference
       Speaker’s emphasis, focus, topic, etc.
       Convention
       Speaker’s attitudes
       Speaker’s feelings and viewpoints
                                                    17
     Tense: @past
             He went there yesterday
   The past tense is normally expressed by
    @past

    {unl}
    agt(go.@entry.@past, he)
    …
    {/unl}

                                              18
  Aspects: @progress
             It’s raining hard.
{unl}
  man ( rain.@entry.@present.@progress, hard )
{/unl}




                                          19
     Speaker’s view of reference
   @def (Specific concept (already referred))
      The house on the corner is for sale.
   @indef (Non-specific class)
    There is a book on the desk
   @not is always attached to the UW which
    is negated.
      He didn’t come.
    agt ( come.@entry.@past.@not, he )
                                                 20
Speaker’s emphasis
   @emphasis
    John his name is.
        mod ( name, he )
        aoj ( John.@emphasis.@entry, name )


   @entry denotes the entry point or main
    UW of a UNL expression


                                              21
Subcategorization Frames
   Specify the categorial class of the lexical item.
   Specify the environment.
   Examples:
       kick: [V; _ NP]
       cry: [V; _ ]
       rely: [V; _PP]
       put: [V; _ NP PP]
        think: [V; _ S`]


                                                   22
    Subcategorization Rules
Subcategorization Rule:

      V  →  y /   _ NP]        (kick)
                  _ ]          (cry)
                  _ PP]        (rely)
                  _ NP PP]     (put)
                  _ S`]        (think)




                                     23
 Subcategorization Rules
The boy relied on the friend.

       1.   S  → NP VP
       2.   VP → V (NP) (PP) (S`) …
       3.   NP → Det N
       4.   V  → rely / _ PP]
       5.   P  → on / _ NP]
       6.   Det → the
       7.   N  → boy, friend

                                     24
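The lexicon entries and rules above can be checked mechanically. A minimal Python sketch follows, assuming a hand-written SUBCAT table keyed by verb (the table and the licenses helper are illustrative, not from the lecture); it reproduces the judgement that "rely" needs a PP complement, as in "The boy relied on the friend."

# Hypothetical subcategorization lexicon: verb -> list of allowed frames,
# each frame being the sequence of phrase tags expected after the verb.
SUBCAT = {
    "kick":  [["NP"]],
    "cry":   [[]],
    "rely":  [["PP"]],
    "put":   [["NP", "PP"]],
    "think": [["S'"]],
}

def licenses(verb, complements):
    """Return True if one of the verb's frames matches the observed complements."""
    return complements in SUBCAT.get(verb, [])

# "The boy relied on the friend." -> rely takes a PP complement.
print(licenses("rely", ["PP"]))        # True
print(licenses("rely", ["NP"]))        # False: *The boy relied the friend.
print(licenses("put", ["NP", "PP"]))   # True: put the book on the table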
Semantically Odd
Constructions
   Can we exclude these two ill-formed
    structures ?
       *The boy frightened sincerity.
       *Sincerity kicked the boy.

   Selectional Restrictions


                                          25
Selectional Restrictions
   Inherent Properties of Nouns:
       [+/- ABSTRACT], [+/- ANIMATE]


   E.g.,
      Sincerity [+ ABSTRACT]
      Boy [+ANIMATE]


                                       26
    Selectional Rules
   A selectional rule specifies certain selectional
    restrictions associated with a verb.

     V  →  y /          [+/-ABSTRACT] __
                        __ [+/-ANIMATE]

     V  →  frighten /   [+/-ABSTRACT] __
                        __ [+ANIMATE]


                                                       27
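The same style of lookup extends to selectional restrictions. The sketch below encodes inherent noun features and per-verb restrictions on subject and object (NOUN_FEATURES and SELECTION are illustrative hand-made tables, not from the lecture) and rejects the two semantically odd constructions from the earlier slide.

# Inherent features of nouns (illustrative entries).
NOUN_FEATURES = {
    "sincerity": {"ABSTRACT": True,  "ANIMATE": False},
    "boy":       {"ABSTRACT": False, "ANIMATE": True},
    "window":    {"ABSTRACT": False, "ANIMATE": False},
}

# Selectional restrictions: verb -> required feature values of (subject, object).
SELECTION = {
    "frighten": ({}, {"ANIMATE": True}),   # anything may frighten, but only animates get frightened
    "kick":     ({"ANIMATE": True}, {}),   # the kicker must be animate
}

def satisfies(required, noun):
    feats = NOUN_FEATURES[noun]
    return all(feats.get(f) == v for f, v in required.items())

def well_formed(subj, verb, obj):
    subj_req, obj_req = SELECTION[verb]
    return satisfies(subj_req, subj) and satisfies(obj_req, obj)

print(well_formed("boy", "frighten", "boy"))         # True
print(well_formed("boy", "frighten", "sincerity"))   # False: *The boy frightened sincerity.
print(well_formed("sincerity", "kick", "boy"))       # False: *Sincerity kicked the boy.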
  Subcategorization Frame
forward      V,  __ NP PP     e.g., We will be forwarding our new catalogue to you

invitation   N,  __ PP        e.g., An invitation to the party

accessible   A,  __ PP        e.g., A program making science more accessible to young people
                                                        28
     Thematic Roles
The man forwarded the mail to the minister.

    forward      V,  __ NP PP

    [Event FORWARD ( [Thing THE MAN], [Thing THE MAIL], [Path TO THE MINISTER] )]
                                                                       29
How to define the UWs in UNL
Knowledge-Base?


        Nominal concept
            Abstract
            Concrete
        Verbal concept
            Do
            Occur
            Be
        Adjective concept
        Adverbial concept

                               30
Nominal Concept: Abstract thing

abstract thing{(icl>thing)}
  culture(icl>abstract thing)
       civilization(icl>culture{>abstract thing})
  direction(icl>abstract thing)
       east(icl>direction{>abstract thing})
  duty(icl>abstract thing)
       mission(icl>duty{>abstract thing})
                 responsibility(icl>duty{>abstract thing})
                         accountability{(icl>responsibility>duty)}
  event(icl>abstract thing{,icl>time>abstract thing})
  meeting(icl>event{>abstract thing,icl>group>abstract thing})
                 conference(icl>meeting{>event})
                            TV conference{(icl>conference>meeting)}




                                                                      31
Nominal Concept: Concrete thing

 concrete thing{(icl>thing,icl>place>thing)}
     building(icl>concrete thing)
               factory(icl>building{>concrete thing})
               house(icl>building{>concrete thing})
     substance(icl>concrete thing)
               cloth(icl>substance{>concrete thing})
                         cotton(icl>cloth{>substance})
               fiber(icl>substance{>concrete thing})
                         synthetic fiber{(icl>fiber>substance)}
                                 textile fiber{(icl>fiber>substance)}
               liquid(icl>substance{>concrete thing})

                beverage(icl>food{,icl>liquid>substance})
                         coffee(icl>beverage{>food})
                         liquor(icl>beverage{>food})
                                  beer(icl>liquor{>beverage})
                                                                        32
Verbal concept: do

 do({icl>do,}agt>thing,gol>thing,obj>thing)
    express({icl>do(}agt>thing,gol>thing,obj>thing{)})
        state(icl>express(agt>thing,gol>thing,obj>thing))
            explain(icl>state(agt>thing,gol>thing,obj>thing))
    add({icl>do(}agt>thing,gol>thing,obj>thing{)})
    change({icl>do(}agt>thing,gol>thing,obj>thing{)})
        convert(icl>change(agt>thing,gol>thing,obj>thing))
    classify({icl>do(}agt>thing,gol>thing,obj>thing{)})
        divide(icl>classify(agt>thing,gol>thing,obj>thing))


                                                                     33
Verbal concept: occur and be
   occur({icl>occur,}gol>thing,obj>thing)
        melt({icl>occur(}gol>thing,obj>thing{)})
        divide({icl>occur(}gol>thing,obj>thing{)})
        arrive({icl>occur(}obj>thing{)})

   be({icl>be,}aoj>thing{,^obj>thing})
        exist({icl>be(}aoj>thing{)})
        born({icl>be(}aoj>thing{)})



                                                     34
How to define the UWs in UNL Knowledge
Base?

     In order to distinguish among the verb classes
      headed by 'do', 'occur' and 'be', the following
      features are used:
                       needs an    needs an
            UW          agent?      object?     English
           'do'           +            +        "to kill"
          'occur'         -            +        "to fall"
           'be'           -            -        "to know"
                                                        35
How to define the UWs in UNL Knowledge-
Base?

     The verbal UWs (do, occur, be) also take some
      pre-defined semantic cases, as follows:
            UW       pre-defined cases                  English
           'do'      necessarily takes agt>thing        "to kill"
          'occur'    necessarily takes obj>thing        "to fall"
           'be'      necessarily takes aoj>thing        "to know"




                                                             36
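The two tables collapse into a single test on the restriction list of a verbal UW: the presence of agt>thing signals 'do', obj>thing without an agent signals 'occur', and aoj>thing alone signals 'be'. A minimal sketch (the classify_verbal_uw helper and the example UW strings are illustrative):

def classify_verbal_uw(uw):
    """Classify a verbal UW as 'do', 'occur' or 'be' from its case restrictions."""
    if "agt>thing" in uw:
        return "do"       # needs an agent (and normally an object), e.g. "to kill"
    if "obj>thing" in uw:
        return "occur"    # no agent, but an affected object, e.g. "to fall"
    if "aoj>thing" in uw:
        return "be"       # neither agent nor object, a state, e.g. "to know"
    return "unknown"

print(classify_verbal_uw("kill(agt>thing,obj>thing)"))   # do
print(classify_verbal_uw("fall(obj>thing)"))             # occur
print(classify_verbal_uw("know(aoj>thing)"))             # be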
Complex sentence: "I want to watch this movie."

    want(icl>) @entry.@past
        agt → I(iof>person)
        obj → :01
    :01  watch(icl>do) @entry.@inf
        agt → I(iof>person)
        obj → movie(icl>) @def
                                                         37
Approach to UNL
Generation




                  38
Problem Definition
   Generate UNL expressions for English
    sentences
       in a robust and scalable manner,
       using syntactic analysis and lexical resources
        extensively.
   This needs
       detecting semantically relatable entities
       and solving attachment problems
Semantically Relatable Sequences
(SRS)
Definition: A semantically relatable
 Sequence (SRS) of a sentence is a
 group of words in the sentence (not
 necessarily consecutive) that appear in the
 semantic graph of the sentence as linked
 nodes or nodes with speech act labels

(This is motivated by UNL representation)
SRS as an intermediary to an intermediary

   Source Language Sentence  →  SRS  →  UNL  →  Target Language Sentence
Example to illustrate SRS

"The man bought a new car in June"

    bought  (past tense)
        agent  → man   (the: definite)
        object → car   (a: indefinite; new: modifier)
        time   → June  (in: modifier)
Sequences from “the man bought a
new car in June”
a.   {man, bought}
b.   {bought, car}
c.   {bought, in, June}
d.   {new, car}
e.   {the, man}
f.   {a, car}
Basic questions


      Which words can form semantic
       constituents, which we call Semantically
       Relatable Sequences (SRS)?
      What after all are the SRSs of the given
       sentence?
      What semantic relations can link the
       words in an SRS and the SRSs
       themselves?
Postulate


      A sentence needs to be broken into
       sequences of one of three forms:
          {CW, CW}
          {CW, FW, CW}
          {FW, CW}
   where CW refers to a content word or a
    clause and FW to a function word
    (a minimal sketch of this check follows)
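The postulate can be stated as a small type check: every SRS of a sentence must match one of the three word-group shapes. A minimal sketch (the srs_form helper and the CW/FW tagging are illustrative), applied to sequences (a)-(f) of "The man bought a new car in June":

def srs_form(group):
    """group: list of (word, 'CW'|'FW') pairs.
    Return which of the three postulated forms the group matches, else None."""
    tags = tuple(t for _, t in group)
    if tags == ("CW", "CW"):
        return "{CW, CW}"
    if tags == ("CW", "FW", "CW"):
        return "{CW, FW, CW}"
    if tags == ("FW", "CW"):
        return "{FW, CW}"
    return None

# The sequences (a)-(f) listed earlier in the lecture.
sequences = [
    [("man", "CW"), ("bought", "CW")],
    [("bought", "CW"), ("car", "CW")],
    [("bought", "CW"), ("in", "FW"), ("June", "CW")],
    [("new", "CW"), ("car", "CW")],
    [("the", "FW"), ("man", "CW")],
    [("a", "FW"), ("car", "CW")],
]
for seq in sequences:
    print([w for w, _ in seq], "->", srs_form(seq))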
SRS and Language Phenomena

Movement: Preposition Stranding
   John, we laughed at.
        (we, laughed.@entry) ---------------- (CW, CW)
        (laughed.@entry, at, John) ---------- (CW, FW, CW)
Movement: Topicalization
   The problem, we solved.
       (we , solved.@entry)------------(CW, CW)
       (solved.@entry , problem)-----(CW,CW)
       (the, problem)--------------------(CW,CW)
Movement: Relative Clauses
   John told a joke which we had already heard.
       (John, told.@entry) ---------------------- (CW, CW)
       (told.@entry, :01) ----------------------- (CW, CW)
       SCOPE01(we, had, heard.@entry) ----------- (CW, FW, CW)
       SCOPE01(already, heard.@entry) ----------- (CW, CW)
       SCOPE01(heard.@entry, which, joke) ------- (CW, FW, CW)
       SCOPE01(a, joke) ------------------------- (FW, CW)
Movement: Interrogatives
   Who did you refer her to?
      (did, refer.@entry.@interrogative) --------- (FW, CW)
      (you, refer.@entry.@interrogative) --------- (CW, CW)
      (refer.@entry.@interrogative, her) --------- (CW, CW)
      (refer.@entry.@interrogative, to, who) ----- (CW, FW, CW)
Empty Pronominals: to-infinitivals
   Bill was wise to sell the piano.
      (wise.@entry, SCOPE01) --------------------- (CW, CW)
      SCOPE01(sell.@entry, piano) ---------------- (CW, CW)
      (Bill, was, wise.@entry) ------------------- (CW, FW, CW)
      SCOPE01(Bill, to, sell.@entry) ------------- (CW, FW, CW)
      SCOPE01(the, piano) ------------------------ (FW, CW)
Empty pronominal: Gerundial
   The cat leapt down spotting a thrush on the lawn.
   (The, cat) -------------------------------(FW, CW)
   (cat, leapt.@entry) --------------------(CW, CW)
   (leapt.@entry , down) ----------------(CW, CW)
   (leapt.@entry , SCOPE01) -----------------(CW, CW)
   SCOPE01(spotting.@entry,thrush)--------(CW,CW)
   SCOPE01(spotting.@entry,on,lawn)---(CW,FW,CW)
PP Attachment
   John cracked the glass with a stone.
       (John, cracked.@entry) -------------------- (CW, CW)
       (cracked.@entry, glass) ------------------- (CW, CW)
       (cracked.@entry, with, stone) ------------- (CW, FW, CW)
       (a, stone) -------------------------------- (FW, CW)
       (the, glass) ------------------------------ (FW, CW)
 SRS and PP attachment
 (Mohanty, Almeida, Bhattacharyya, 04)
1. [PP] is subcategorized by the verb [V]; [NP2] is licensed by a preposition [P]
   → [NP2] is attached to the verb [V]
     (e.g., He forwarded the mail to the minister)
2. [PP] is subcategorized by the noun in [NP1]; [NP2] is licensed by a preposition [P]
   → [NP2] is attached to the noun in [NP1]
     (e.g., John published six articles on machine translation)
3. [PP] is neither subcategorized by the verb [V] nor by the noun in [NP1];
   [NP2] carries a [PLACE] / [TIME] feature
   → [NP2] is attached to the verb [V]
     (e.g., I saw Mary in her office; The girls met the teacher on different days)
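These conditions translate directly into an ordered set of checks. A sketch of the heuristic follows (the subcat sets and the place/time feature set are hypothetical stand-ins for the subcategorization database and the WordNet-derived features used by the system):

def resolve_pp_attachment(verb, np1_head, prep, np2_head,
                          verb_subcat, noun_subcat, place_or_time):
    """Decide whether [P NP2] attaches to the verb or to the noun in NP1.

    verb_subcat:   set of (verb, preposition) pairs licensed by verbs
    noun_subcat:   set of (noun, preposition) pairs licensed by nouns
    place_or_time: set of nouns carrying a [PLACE] or [TIME] feature
    """
    if (verb, prep) in verb_subcat:          # PP subcategorized by the verb
        return "verb"                        # e.g. forward ... to the minister
    if (np1_head, prep) in noun_subcat:      # PP subcategorized by the noun
        return "noun"                        # e.g. articles on machine translation
    if np2_head in place_or_time:            # locative / temporal adjunct
        return "verb"                        # e.g. saw Mary in her office
    return None                              # undecided: fall back to the parser

verb_subcat = {("forward", "to")}
noun_subcat = {("article", "on")}
place_or_time = {"office", "June"}

print(resolve_pp_attachment("forward", "mail", "to", "minister",
                            verb_subcat, noun_subcat, place_or_time))   # verb
print(resolve_pp_attachment("publish", "article", "on", "translation",
                            verb_subcat, noun_subcat, place_or_time))   # noun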
Linguistic Study to
Computation
Syntactic constituents to Semantic
constituents

      A probabilistic parser (Charniak, 04) is used.
      Other resources: Wordnet and Oxford
       Advanced Learner’s Dictionary
      In a parse tree, tags give indications of CW
       and FW:
         NP, VP, ADJP and ADVP → CW
         PP (prepositional phrase), IN (preposition)
          and DT (determiner) → FW
     Observation:
     Headwords of sibling nodes form SRSs

     "John has bought a car."

     Parse tree (headwords in parentheses):
         (C) VP (bought)
             (F) AUX (has)
             (C) VP (bought)
                 (C) VBD (bought)
                 (C) NP (car)
                     (F) DT (a)
                     (C) NN (car)

     SRS: {has, bought}, {a, car}, {bought, car}
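The observation can be turned into a walk over the parse tree: at every internal node, pair the headwords of adjacent children and note whether each child carries a CW or FW tag. A toy sketch for "John has bought a car." (the nested-tuple tree encoding, CW_TAGS set and collect_srs function are illustrative, not the system's actual data structures):

# A node is (tag, headword, children); leaves have an empty child list.
tree = ("VP", "bought", [
    ("AUX", "has", []),
    ("VP", "bought", [
        ("VBD", "bought", []),
        ("NP", "car", [
            ("DT", "a", []),
            ("NN", "car", []),
        ]),
    ]),
])

CW_TAGS = {"NP", "VP", "ADJP", "ADVP", "VBD", "VB", "NN", "NNS", "NNP", "JJ"}

def form(tag):
    return "CW" if tag in CW_TAGS else "FW"

def collect_srs(node, out):
    """Pair headwords of adjacent sibling nodes, recording their CW/FW form."""
    _tag, _head, children = node
    sibs = [(c[0], c[1]) for c in children]
    for (t1, h1), (t2, h2) in zip(sibs, sibs[1:]):
        out.append(((h1, h2), (form(t1), form(t2))))
    for child in children:
        collect_srs(child, out)
    return out

print(collect_srs(tree, []))
# [(('has', 'bought'), ('FW', 'CW')), (('bought', 'car'), ('CW', 'CW')),
#  (('a', 'car'), ('FW', 'CW'))]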
       Need:
       Resilience to wrong PP attachment

   "John has published an article on linguistics"

     Parse tree (headwords in parentheses; the parser attaches the PP under the VP):
         (C) VP (published)
             (C) VBD (published)
             (C) NP (article)
                 (F) DT (an)
                 (C) NN (article)
             (F) PP (on)
                 (F) IN (on)
                 (C) NP (linguistics)
                     (C) NNS (linguistics)

   Use PP attachment heuristics
   Get {article, on, linguistics}
  to-infinitival
"I forced him to watch this movie"

     Parse tree after modification (headwords in parentheses):
         (C) VP (forced)
             (C) VBD (forced)
             (C) NP (him)          [duplicated and inserted copy]
             (C) S (SCOPE)
                 (C) NP (him)
                 (C) VP
                     (F) TO (to)
                     (C) VP (watch)

The clause boundary is marked by labelling the node with SCOPE.
The tag of "to" is modified to TO, a FW tag, indicating that it heads
a to-infinitival clause.
The duplication and insertion of the NP node with head "him" (depicted
by shaded nodes in the original figure) as a sibling of the VBD node
with head "forced" is done to bring out the existence of a semantic
relation between "force" and "him".
Linking of clauses:
"John said that he was reading a novel"

     Parse tree (headwords in parentheses):
         (C) VP (said)
             (C) VBD (said)
             (F) SBAR (that)
                 (F) IN (that)
                 (C) S (SCOPE)

The head of the S node is marked as SCOPE.  SRS: {said, that, SCOPE}.
Adverbial clauses have similar parse tree structures, except that the
subordinating conjunctions are different from "that".
Implementation
   Block diagram of the system (pipeline):

    Input Sentence
      → Charniak Parser → Parse Tree
      → Parse tree modification and augmentation with head and scope information
          (Scope Handler; noun classification and time/place features from WordNet 2.0)
      → Augmented Parse Tree
      → Semantically Relatable Sequences Generator
          (Attachment Resolver, consulting the sub-categorization database:
           THAT-clause and preposition as subcat properties)
      → Semantically Relatable Sequences
        Head determination
       Uses a bottom-up strategy to determine the
        headword for every node in the parse tree.
       Crucial in obtaining the SRSs, since wrong head
        information may end up getting propagated all the
        way up the tree
       Processes the children of every node starting from
        the rightmost child and checks the head information
        already specified against the node’s tag to
        determine the head of the node
       Some special cases are:
         SBAR node
         A VP node with PRO insertion, copula, Phrasal verbs etc.
         NP nodes with of-PP cases and conjunctions under them,
          which lead to scope creation.
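A bottom-up pass of this kind can be sketched as a recursive walk that fills in each node's head from its children's heads, scanning the children from the right against a tag-priority table. The HEAD_PRIORITY table and node format below are rough illustrations, not the actual head rules or special-case handling of the system.

# Rough head-priority table per parent tag: earlier entries are preferred heads.
HEAD_PRIORITY = {
    "S":  ["VP", "NP"],
    "VP": ["VBD", "VB", "VBZ", "VP"],
    "NP": ["NN", "NNS", "NNP", "NP"],
    "PP": ["IN"],
}

def assign_heads(node):
    """node: dict with 'tag', 'word' (for leaves) and 'children'.
    Fill in node['head'] bottom-up and return it."""
    if not node.get("children"):
        node["head"] = node["word"]
        return node["head"]
    child_heads = [(c["tag"], assign_heads(c)) for c in node["children"]]
    # scan the children, rightmost first, against the priority table
    for wanted in HEAD_PRIORITY.get(node["tag"], []):
        for tag, head in reversed(child_heads):
            if tag == wanted:
                node["head"] = head
                return head
    node["head"] = child_heads[-1][1]          # default: rightmost child's head
    return node["head"]

tree = {"tag": "S", "children": [
    {"tag": "NP", "children": [{"tag": "NNP", "word": "John", "children": []}]},
    {"tag": "VP", "children": [
        {"tag": "VBD", "word": "bought", "children": []},
        {"tag": "NP", "children": [
            {"tag": "DT", "word": "a", "children": []},
            {"tag": "NN", "word": "car", "children": []}]}]}]}
print(assign_heads(tree))   # bought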
Scope handler
   Performs modification on the parse
    trees by insertion of nodes in to-
    infinitival cases
   Adjusts the tag and head information
    in case of SBAR nodes
      Attachment resolver

   Takes a (CW1, FW, CW2) as input and
    checks
          the time and place features of CW2,
          the noun class of CW1 and
          the subcategorization information for the CW1 and
           FW pair
     to decide the attachment.
   If none of these yield any deterministic
    results, take the attachment indicated by
    the parser
SRS generator
   Performs a breadth-first search on the
    parse tree and performs detailed
    processing at every node N1 of the tree.
   S nodes which dominate entire clauses
    (main or embedded) are treated as
    CWs.
   SBAR and TO nodes are treated as
    FWs.
  Algorithm

If the node N1 is a CW (new/JJ, published/VBD, fact/NN, boy/NN, John/NNP),
perform the following checks:
    If the sibling N2 of N1 is a CW (car/NN, article/NN, SCOPE/S),
        then create {CW,CW} ({new, car}, {published, article}, {boy, SCOPE}).
    If the sibling N2 is a FW (in/PP, that/SBAR, and/CC),
        then check if N2 has a child FW N3 (in/IN, that/IN) and a child CW N4
        (June/NN, SCOPE/S).
        If yes, use the attachment resolver to decide the CW to which N3 and N4
            attach, and create {CW,FW,CW} ({published, in, June}, {fact, that, SCOPE}).
        If no, check if the next sibling N5 of N1 is a CW (Mary/NN).
            If yes, create {CW,FW,CW} ({John, and, Mary}).

If the node N1 is a FW (the/DT, is/AUX, to/TO), perform the following checks:
    If the parent node is a CW (boy/NP, famous/VP), check if the sibling is an adjective.
        If yes (famous/JJ), create {CW,FW,CW} ({She, is, famous}).
        If no (boy/NN), create {FW,CW} ({the, boy}, {has, bought}).
    If the parent node N6 is a FW (to/TO) and the sibling node N7 is a CW (learn/VB),
        use the attachment resolver to decide on the preceding CW to which N6 and N7
        can attach, and create {CW,FW,CW} ({exciting, to, learn}).
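Stripped of the tree bookkeeping, the per-node decision logic above looks roughly like the following sketch (the node dictionaries and the attachment_resolver stub are illustrative; the real system operates on the augmented Charniak parse tree, and the copula/adjective and to-infinitival sub-cases are omitted here).

def srs_for_node(n1, attachment_resolver):
    """Apply a subset of the per-node checks to node N1 and return the
    sequences it contributes.  Nodes are plain dicts with 'head', 'is_cw'
    and optional 'sibling', 'parent', 'child_fw', 'child_cw' entries --
    an illustrative shape only."""
    out = []
    n2 = n1.get("sibling")
    if n1["is_cw"] and n2 is not None:
        if n2["is_cw"]:                                   # {CW,CW}, e.g. {new, car}
            out.append((n1["head"], n2["head"]))
        else:                                             # FW sibling such as in/PP or that/SBAR
            n3, n4 = n2.get("child_fw"), n2.get("child_cw")
            if n3 and n4:                                 # {CW,FW,CW}, e.g. {published, in, June}
                cw = attachment_resolver(n1, n3, n4)      # pick the CW that N3/N4 attach to
                out.append((cw["head"], n3["head"], n4["head"]))
    if not n1["is_cw"]:
        parent = n1.get("parent")
        if parent and parent["is_cw"]:                    # {FW,CW}, e.g. {the, boy}, {has, bought}
            out.append((n1["head"], parent["head"]))
    return out

# "John has published an article in June": the PP node is the FW sibling of 'published'.
june = {"head": "June", "is_cw": True}
in_fw = {"head": "in", "is_cw": False}
pp = {"head": "in", "is_cw": False, "child_fw": in_fw, "child_cw": june}
published = {"head": "published", "is_cw": True, "sibling": pp}
print(srs_for_node(published, lambda cw_node, fw, dep: cw_node))
# [('published', 'in', 'June')]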

                              
        Evaluation
   FrameNet corpus [Baker et al., 1998], a
    semantically annotated corpus, as the
    test data.
   92310 sentences (call this the gold
    standard)
   Created automatically from the FrameNet
    corpus taking verbs, nouns and adjectives
    as the targets
       Verbs as the target- 37,984 (i.e., semantic
        frames of verbs)
       Nouns as the target-37,240
       Adjectives as the target-17,086
Score for high frequency verbs
Verb        Frequency    Score
Swim        280          0.709
Depend      215          0.804
Look        187          0.835
Roll        173          0.7
Rush        172          0.775
Phone       162          0.695
Reproduce   159          0.797
Step        159          0.795
Urge        157          0.765
Avoid       152          0.789
Scores of 10 verb groups of
high frequency in the Gold
Standard
Scores of 10 noun groups of
high frequency in the Gold
Standard
An actual sentence
   A. Sentence: A form of asbestos once
    used to make Kent cigarette filters has
    caused a high percentage of cancer
    deaths among a group of workers
    exposed to it more than 30 years ago,
    researchers reported.
Relative performance on SRS constructs

    [Bar chart: recall and precision (0-100%) for the parameters matched:
     (CW,CW), (CW,FW,CW), (FW,CW), and total SRSs]
  Results on sentence constructs

    [Bar chart: recall and precision (0-100%) for the parameters:
     PP resolution, clause linkings, complement-clause resolution,
     and to-infinitival clause resolution]


Rajat Mohanty, Anupama Dutta and Pushpak Bhattacharyya,
"Semantically Relatable Sets: Building Blocks for Representing Semantics",
10th Machine Translation Summit (MT Summit 05), Phuket, September 2005.
Statistical Approach
Use SRL marked corpora
   Daniel Gildea and Daniel Jurafsky. 2002. Automatic labeling of
    semantic roles. Computational Linguistics, 28(3):245–288.
   PropBank corpus
        Role annotated WSJ part of Penn Treebank [10]
   PropBank role-set [2,4]
        Core roles: ARG0 (Proto-agent), ARG1 (Proto-patient) to ARG5
        Adjunctive roles:
         ARGM-LOC (for locatives),
         ARGM-TMP (for temporals), etc.
         SRL marked corpora contd…
   PropBank roles: an example
      [ARG0 It] operates [ARG1 stores] [ARGM-LOC mostly in Iowa and Nebraska]




                                             Fig.4: Parse tree output, Source: [5]

   Preprocessing systems [2]
        Part of speech tagger
        Base Chunker
        Full syntactic parser
        Named entities recognizer
         Probabilistic estimation [1]
   Empirical probability estimation over candidate roles for each
    constituent, based on the extracted features:

    $$ P(r \mid h, pt, gov, position, voice, t) = \frac{\#(r, h, pt, gov, position, voice, t)}{\#(h, pt, gov, position, voice, t)} $$

    where t is the target word, r is a candidate role, and h (head word),
    pt (phrase type), gov (governing category), position and voice are the
    extracted features.

   Linear interpolation, with the condition $\sum_i \lambda_i = 1$:

    $$ P(r \mid constituent) = \lambda_1 P(r \mid t) + \lambda_2 P(r \mid pt, t) + \lambda_3 P(r \mid pt, gov, t) + \lambda_4 P(r \mid h) + \lambda_5 P(r \mid h, pt, t) $$

   Geometric mean, with the condition $\sum_r P(r \mid constituent) = 1$:

    $$ P(r \mid constituent) = \frac{1}{Z} \exp\{\lambda_1 \log P(r \mid t) + \lambda_2 \log P(r \mid pt, t) + \lambda_3 \log P(r \mid pt, gov, t) + \lambda_4 \log P(r \mid h) + \lambda_5 \log P(r \mid h, pt, t)\} $$
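Numerically, the estimation is just counting feature tuples and mixing back-off distributions. A toy sketch follows (invented counts and made-up lambda weights, only their sum being constrained to 1; the tuple layout mirrors the feature list above):

from collections import Counter

# Toy training observations: (role, head, phrase type, governor, position, voice, target)
observations = [
    ("ARG0", "he",     "NP", "S",  "before", "active", "break"),
    ("ARG0", "john",   "NP", "S",  "before", "active", "break"),
    ("ARG1", "window", "NP", "VP", "after",  "active", "break"),
    ("ARG1", "glass",  "NP", "VP", "after",  "active", "break"),
]

full = Counter(obs for obs in observations)        # counts #(r, h, pt, gov, pos, voice, t)
marg = Counter(obs[1:] for obs in observations)    # counts #(h, pt, gov, pos, voice, t)

def p_role(r, h, pt, gov, pos, voice, t):
    """Empirical P(r | h, pt, gov, position, voice, t) from the toy counts."""
    denom = marg[(h, pt, gov, pos, voice, t)]
    return full[(r, h, pt, gov, pos, voice, t)] / denom if denom else 0.0

print(p_role("ARG1", "window", "NP", "VP", "after", "active", "break"))   # 1.0

# Linear interpolation of back-off distributions; the lambda weights must sum to 1
# (values here are made up for illustration).
lambdas = [0.3, 0.25, 0.2, 0.15, 0.1]
backoffs = [0.9, 0.8, 1.0, 0.6, 0.7]   # P(r|t), P(r|pt,t), P(r|pt,gov,t), P(r|h), P(r|h,pt,t)
print(sum(l * p for l, p in zip(lambdas, backoffs)))   # interpolated P(r | constituent)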
          A state-of-the-art SRL system: ASSERT [4]
   Main points [3,4]
        Use of Support Vector Machines [13] as the classifier
        Similar to FrameNet "domains", "Predicate Clusters" are introduced
        Named Entities [14] are used as a new feature
   Experiment I (Parser dependency testing)
        Use of PropBank bracketed corpus
        Use of Charniak parser trained on Penn Treebank corpus
     Parse                    Task          Precision (%)      Recall (%)       F-score (%)    Accuracy (%)
                               Id.              97.5              96.1             96.8              -
     Treebank                Class.               -                -                 -              93.0
                           Id. + Class.         91.8              90.5             91.2              -
                               Id.              87.8              84.1             85.9              -
     Charniak                Class.               -                -                 -              92.0
                           Id. + Class.         81.7              78.4             80.0              -


                        Table 1: Performance of ASSERT for Treebank and Charniak parser outputs.
                 Id. stands for the identification task and Class. for the classification task. Data source: [4]
          Experiments and Results
   Experiment II (Cross genre testing)
      1.    Training on PropBanked WSJ data and testing on the Brown Corpus
      2.    Charniak parser trained first on PropBank and then on Brown




             Table 2: Performance of ASSERT for various experimental combinations.
                                        Data source: [4]

								