# Fuzzy (PowerPoint)

Artificial Intelligence
Knowledge Representation Problem
Reasoning With Uncertainty

Conventional set theory vs. fuzzy set theory

[Figure: the concept "Strong Fever" over the temperatures 37.2°C, 38°C, 38.7°C, 39.3°C, 40.1°C, 41.4°C, 42°C. In conventional set theory each temperature either belongs to "Strong Fever" or does not; in fuzzy set theory each temperature belongs to it to a degree.]
Uncertainty
- Let action At = leave for airport t minutes before flight
- Will At get me there on time?
- Problems:
  - partial observability (road state, other drivers' plans, etc.)
  - noisy sensors (traffic reports)
  - uncertainty in action outcomes (flat tire, etc.)
- "A25 will get me there on time if there's no accident on the bridge and it doesn't rain and my tires remain intact, etc."
Making decisions under uncertainty
Suppose I believe the following:

P(A25 gets me there on time | …)   = 0.04
P(A90 gets me there on time | …)   = 0.70
P(A120 gets me there on time | …)  = 0.95
P(A1440 gets me there on time | …) = 0.9999

- Which action to choose?
- Depends on my preferences for missing the flight vs. time spent waiting, etc.
- Utility theory is used to represent and infer preferences
- Decision theory = probability theory + utility theory
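The choice among the A_t actions can be sketched as a maximum-expected-utility computation. The P(on time) values are from the slide; the utility and waiting-cost numbers are illustrative assumptions, not part of the original.

```python
# Sketch: choose the action A_t that maximizes expected utility.
# P_ON_TIME comes from the slide; U_CATCH, U_MISS, and WAIT_COST are
# assumed numbers for illustration only.

P_ON_TIME = {25: 0.04, 90: 0.70, 120: 0.95, 1440: 0.9999}

U_CATCH = 1000    # assumed utility of making the flight
U_MISS = 0        # assumed utility of missing it
WAIT_COST = 0.5   # assumed disutility per minute spent waiting

def expected_utility(t):
    p = P_ON_TIME[t]
    return p * U_CATCH + (1 - p) * U_MISS - WAIT_COST * t

best = max(P_ON_TIME, key=expected_utility)
```

With these assumed preferences, leaving 120 minutes early wins; changing the waiting cost or the value of the flight shifts the answer, which is exactly the slide's point.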
Uncertainty in Expert Systems
- From correct premises and sound rules → correct conclusions
- But sometimes we have to manage uncertain information, encode uncertain pieces of knowledge, model parallel firing of inference rules, and tackle ambiguity
- There are a number of models of uncertain reasoning:
  - Bayesian reasoning – the classical statistical approach
  - Dempster-Shafer theory of evidence
  - Stanford certainty algebra – MYCIN
Bayesian Reasoning
P(a ∧ b) = P(a) · P(b)        given that a and b are independent
P(a ∧ b) = P(a | b) · P(b)    given that a depends on b

- prior probability (unconditional): P(hypothesis)
- posterior probability (conditional): P(hypothesis | evidence)

Bayes' rule:
P(h | e) = P(h ∧ e) / P(e) = P(h) · P(e | h) / P(e)

Normalizing over the competing hypotheses hj:
P(h | e) = P(h) · P(e | h) / Σj P(hj) · P(e | hj)

(PROSPECTOR, dice examples)
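The normalized form of Bayes' rule above can be sketched in a few lines. The fair/loaded-die priors and likelihoods below are illustrative assumptions in the spirit of the slide's dice example, not numbers from the slide.

```python
# Sketch: Bayes' rule with normalization over competing hypotheses h_j,
# P(h|e) = P(h) * P(e|h) / sum_j P(h_j) * P(e|h_j).

def posterior(priors, likelihoods):
    """priors[h] = P(h); likelihoods[h] = P(e | h).  Returns P(h | e)."""
    evidence = sum(priors[h] * likelihoods[h] for h in priors)  # P(e)
    return {h: priors[h] * likelihoods[h] / evidence for h in priors}

# Assumed example: did a rolled six come from a fair or a loaded die?
priors = {"fair": 0.9, "loaded": 0.1}
likelihoods = {"fair": 1 / 6, "loaded": 0.5}   # P(roll == 6 | h)
post = posterior(priors, likelihoods)
```

Observing a six raises P(loaded) from the prior 0.1 to 0.25, and the posteriors sum to 1 by construction.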
What is Fuzzy logic
- Fuzzy logic is a superset of conventional (Boolean) logic
- It was created by Dr. Lotfi Zadeh in the 1960s for the purpose of modeling the vagueness inherent in natural language
- In fuzzy logic, it is possible to have partial truth values

Lotfi A. Zadeh, the founder of fuzzy logic.

Fuzzy Sets

L. A. Zadeh, "Fuzzy sets," Information and Control, vol. 8, pp. 338-353, 1965.

- Fuzzy-logic-driven picture generator
- Massive engine
- Battle scenes from The Lord of the Rings
Degrees of truth
- The word "fuzzy" can be defined as "imprecisely defined, confused, vague"
- Humans represent and manage natural language terms (data) which are vague. Almost all answers to questions raised in everyday life are within some proximity of the absolute truth
Reasoning With Uncertainty

Term                         Certainty Factor
Definitely not                   -1.0
Almost certainly not             -0.8
Probably not                     -0.6
Maybe not                        -0.4
Unknown                          -0.2 to +0.2
Maybe                            +0.4
Probably                         +0.6
Almost certainly                 +0.8
Definitely                       +1.0
Fuzzy set theory basics
- There is a strong correlation between Boolean logic and classical set theory
- Likewise, there is a very strong correlation between fuzzy logic and fuzzy set theory
- In fuzzy set theory, one deals with a set "S" which determines a universe of discourse, and a fuzzy subset "F" that contains degrees of membership and the relationship between the two sets
Crisp Sets
- Classical sets are called crisp sets
- Either an element belongs to a set or not, i.e., x ∈ A or x ∉ A
- Membership function of a crisp set:
  μA(x) = 1 if x ∈ A, 0 if x ∉ A;   μA(x) ∈ {0, 1}
Sets
{z ∈ Z+ | z ≤ 3} = {1, 2, 3}

{Live dinosaurs in British Museum} = ∅

{0, 1, 1, 2} = {0, 1, 2}
Fuzzy Sets
- Categorization of elements xi into a set S, described through a membership function μS : x → [0, 1]
- It associates each element xi with a degree of membership in S: 0 means no membership, 1 means full membership
- Values in between indicate how strongly an element is affiliated with the set
Fuzzy Sets
- Formal definition: a fuzzy set A in X is expressed as a set of ordered pairs:
  A = {(x, μA(x)) | x ∈ X}
  where μA is the membership function (MF) of the fuzzy set and X is the universe (universe of discourse).
- A fuzzy set is totally characterized by its membership function (MF).
Fuzzy Sets with Discrete Universes
- Fuzzy set C = "desirable city to live in"
  X = {SF, Boston, LA}
  C = {(SF, 0.9), (Boston, 0.8), (LA, 0.6)}
- Fuzzy set A = "sensible number of children"
  X = {0, 1, 2, 3, 4, 5, 6}
  A = {(0, .1), (1, .3), (2, .7), (3, 1), (4, .6), (5, .2), (6, .1)}
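The two discrete fuzzy sets above map each element of the universe to a membership degree, which a dict captures directly. This is a minimal sketch of that representation; the `membership` helper is an illustrative name, not from the slide.

```python
# Sketch: the discrete fuzzy sets from the slide as dicts mapping each
# element of the universe to its degree of membership.

C = {"SF": 0.9, "Boston": 0.8, "LA": 0.6}   # "desirable city to live in"
A = {0: 0.1, 1: 0.3, 2: 0.7, 3: 1.0,        # "sensible number of children"
     4: 0.6, 5: 0.2, 6: 0.1}

def membership(fuzzy_set, x):
    # Elements outside the support have membership degree 0.
    return fuzzy_set.get(x, 0.0)
```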
Fuzzy Sets: sets with fuzzy boundaries
A = set of tall people

[Figure: for the crisp set A, membership jumps from 0 to 1 at a height of 5'10''; for the fuzzy set A, the membership function rises gradually, e.g. reaching 0.5 at 5'10'' and 0.9 at 6'2''.]
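A fuzzy boundary like the "tall" set can be sketched as a piecewise-linear ramp. The breakpoints below (68 and 74 inches) are illustrative assumptions, chosen only to show the shape, not values taken from the slide.

```python
# Sketch of a "tall" membership function with a fuzzy boundary:
# 0 below `low`, 1 above `high`, linear in between.
# The 68/74-inch breakpoints are assumed for illustration.

def mu_tall(height_in, low=68.0, high=74.0):
    """Degree of membership in the fuzzy set 'tall people'."""
    if height_in <= low:
        return 0.0
    if height_in >= high:
        return 1.0
    return (height_in - low) / (high - low)
```

A crisp "tall" set would instead return only 0 or 1, with a single threshold.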
Possibility vs. Probability
- Possibility refers to allowed values
- Probability expresses expected occurrences of events
- Example: rolling two dice
  - X is an integer in U = {2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12}
  - Probabilities: p(X = 7) = 2·3/36 = 1/6, since 7 = 1+6 = 2+5 = 3+4
  - Possibilities: Poss{X = 7} = 1, the same for all numbers in U
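The dice example can be checked by enumerating all 36 ordered outcomes: every total in U is possible (possibility 1), while probabilities vary with the number of outcomes producing each total. A minimal sketch:

```python
# Sketch: probability vs. possibility for the sum of two dice.
from itertools import product

counts = {}
for a, b in product(range(1, 7), repeat=2):          # 36 ordered outcomes
    counts[a + b] = counts.get(a + b, 0) + 1

prob = {total: n / 36 for total, n in counts.items()}  # expected occurrence
poss = {total: 1 for total in range(2, 13)}            # every total allowed
```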
Set-Theoretic Operations
- Subset: A ⊆ B ⟺ μA(x) ≤ μB(x)
- Complement: Ā = X − A ⟺ μĀ(x) = 1 − μA(x)
- Union: C = A ∪ B ⟺ μC(x) = max(μA(x), μB(x)) = μA(x) ∨ μB(x)
- Intersection: C = A ∩ B ⟺ μC(x) = min(μA(x), μB(x)) = μA(x) ∧ μB(x)
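The max/min operations above translate directly to code when fuzzy sets are dicts of membership degrees. A minimal sketch, with two small assumed example sets:

```python
# Sketch: standard (max/min) fuzzy set operations over a shared universe.

def f_union(A, B):          # mu_C(x) = max(mu_A(x), mu_B(x))
    return {x: max(A.get(x, 0.0), B.get(x, 0.0)) for x in A.keys() | B.keys()}

def f_intersection(A, B):   # mu_C(x) = min(mu_A(x), mu_B(x))
    return {x: min(A.get(x, 0.0), B.get(x, 0.0)) for x in A.keys() | B.keys()}

def f_complement(A):        # mu_notA(x) = 1 - mu_A(x)
    return {x: 1.0 - m for x, m in A.items()}

# Assumed example sets for illustration:
A = {"x1": 0.2, "x2": 0.8}
B = {"x1": 0.5, "x2": 0.4}
```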
Set Operations

[Figure: Venn diagrams a)-f) of two sets A and B, illustrating union A ∪ B, intersection A ∩ B, and related operations.]
Fuzzy OR
p     q    pq
0     0     0
0    0.5   0.5
0     1     1
0.5    0    0.5
0.5   0.5   0.5
0.5    1     1
1     0     1
1    0.5    1
1     1     1
Logics in general

Language              Ontological commitment             Epistemological commitment
Propositional logic   facts                              true/false/unknown
First-order logic     facts, objects, relations          true/false/unknown
Temporal logic        facts, objects, relations, times   true/false/unknown
Probability theory    facts                              degree of belief
Fuzzy logic           facts + degree of truth            known interval value
Rough set theory
- Rough set theory was developed by Zdzislaw Pawlak in the early 1980s
- Representative publications:
  - Z. Pawlak, "Rough Sets", International Journal of Computer and Information Sciences, Vol. 11, 341-356 (1982).
  - Z. Pawlak, Rough Sets: Theoretical Aspects of Reasoning about Data, Kluwer Academic Publishers (1991).
Introduction (2)
- The main goal of rough set analysis is the induction of approximations of concepts
- Rough sets constitute a sound basis for KDD, offering mathematical tools to discover patterns hidden in data
- They can be used for feature selection, feature extraction, data reduction, decision rule generation, and pattern extraction (templates, association rules), etc.
Information Systems/Tables

     Age     LEMS
x1   16-30   50
x2   16-30   0
x3   31-45   1-25
x4   31-45   1-25
x5   46-60   26-49
x6   16-30   26-49
x7   46-60   26-49

- An information system IS is a pair (U, A)
- U is a non-empty finite set of objects
- A is a non-empty finite set of attributes such that a : U → Va for every a ∈ A
- Va is called the value set of a
Decision Systems/Tables

     Age     LEMS    Walk
x1   16-30   50      yes
x2   16-30   0       no
x3   31-45   1-25    no
x4   31-45   1-25    yes
x5   46-60   26-49   no
x6   16-30   26-49   yes
x7   46-60   26-49   no

- DS: T = (U, A ∪ {d})
- d ∉ A is the decision attribute (instead of one we can consider more decision attributes)
- The elements of A are called the condition attributes
Indiscernibility
- Let IS = (U, A) be an information system. With any B ⊆ A there is an associated equivalence relation:
  IND_IS(B) = {(x, x′) ∈ U² | ∀a ∈ B, a(x) = a(x′)}
  where IND_IS(B) is called the B-indiscernibility relation.
- If (x, x′) ∈ IND_IS(B), then objects x and x′ are indiscernible from each other by attributes from B.
- The equivalence classes of the B-indiscernibility relation are denoted [x]_B.
An Example of Indiscernibility

     Age     LEMS    Walk
x1   16-30   50      yes
x2   16-30   0       no
x3   31-45   1-25    no
x4   31-45   1-25    yes
x5   46-60   26-49   no
x6   16-30   26-49   yes
x7   46-60   26-49   no

- The non-empty subsets of the condition attributes are {Age}, {LEMS}, and {Age, LEMS}
- IND({Age}) = {{x1,x2,x6}, {x3,x4}, {x5,x7}}
- IND({LEMS}) = {{x1}, {x2}, {x3,x4}, {x5,x6,x7}}
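The indiscernibility classes above can be computed by grouping objects with identical values on the chosen attributes. A minimal sketch over the Age/LEMS table from the slide (the helper name `ind_classes` is illustrative):

```python
# Sketch: B-indiscernibility classes = objects grouped by identical
# values on the attribute subset B.

TABLE = {
    "x1": {"Age": "16-30", "LEMS": "50"},
    "x2": {"Age": "16-30", "LEMS": "0"},
    "x3": {"Age": "31-45", "LEMS": "1-25"},
    "x4": {"Age": "31-45", "LEMS": "1-25"},
    "x5": {"Age": "46-60", "LEMS": "26-49"},
    "x6": {"Age": "16-30", "LEMS": "26-49"},
    "x7": {"Age": "46-60", "LEMS": "26-49"},
}

def ind_classes(table, attrs):
    """Partition the objects into B-indiscernibility classes."""
    groups = {}
    for obj, row in table.items():
        key = tuple(row[a] for a in attrs)
        groups.setdefault(key, set()).add(obj)
    return list(groups.values())

age_classes = ind_classes(TABLE, ["Age"])
lems_classes = ind_classes(TABLE, ["LEMS"])
```

The results match the slide: IND({Age}) yields {x1,x2,x6}, {x3,x4}, {x5,x7}.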
Set Approximation
- Let T = (U, A), B ⊆ A and X ⊆ U. We can approximate X using only the information contained in B by constructing the B-lower and B-upper approximations of X, denoted B̲X and B̄X respectively, where
  B̲X = {x | [x]_B ⊆ X},
  B̄X = {x | [x]_B ∩ X ≠ ∅}.
Set Approximation (2)
 B-boundary region of X,BN B ( X )  B X  B X ,
consists of those objects that we cannot
decisively classify into X in B.
 B-outside region of X, U  B X ,
consists of those objects that can be with
certainty classified as not belonging to X.
 A set is said to be rough if its boundary
region is non-empty, otherwise the set is
crisp.
An Example of Set Approximation

     Age     LEMS    Walk
x1   16-30   50      yes
x2   16-30   0       no
x3   31-45   1-25    no
x4   31-45   1-25    yes
x5   46-60   26-49   no
x6   16-30   26-49   yes
x7   46-60   26-49   no

Let W = {x | Walk(x) = yes}. Then
A̲W = {x1, x6},
ĀW = {x1, x3, x4, x6},
BN_A(W) = {x3, x4},
U − ĀW = {x2, x5, x7}.

The decision class Walk is rough, since the boundary region is not empty.
An Example of Set Approximation (2)

[Figure: the equivalence classes {{x1}, {x6}} (Walk = yes) form the lower approximation A̲W; {{x3, x4}} (yes/no) forms the boundary region; {{x2}, {x5, x7}} (Walk = no) lie outside the upper approximation ĀW.]
Lower & Upper Approximations

[Figure: the universe U partitioned by U/R into equivalence classes, with a set X overlaid; the lower approximation R̲X ⊆ X is the union of classes fully inside X, and the upper approximation R̄X covers every class that touches X. R is a subset of attributes.]
Lower & Upper Approximations (2)

Upper approximation:
R̄X = ∪ {Y ∈ U/R : Y ∩ X ≠ ∅}

Lower approximation:
R̲X = ∪ {Y ∈ U/R : Y ⊆ X}
Lower & Upper Approximations (3)

     Headache   Temp.       Flu
u1   Yes        Normal      No
u2   Yes        High        Yes
u3   Yes        Very-high   Yes
u4   No         Normal      No
u5   No         High        No
u6   No         Very-high   Yes
u7   No         High        Yes
u8   No         Very-high   No

The indiscernibility classes defined by R = {Headache, Temp.} are
{u1}, {u2}, {u3}, {u4}, {u5, u7}, {u6, u8}.

X1 = {u | Flu(u) = yes} = {u2, u3, u6, u7}
X2 = {u | Flu(u) = no} = {u1, u4, u5, u8}
R̲X1 = {u2, u3}                R̲X2 = {u1, u4}
R̄X1 = {u2, u3, u6, u7, u8, u5}
R̄X2 = {u1, u4, u5, u8, u7, u6}
Lower & Upper Approximations (4)

U/R = {{u1}, {u2}, {u3}, {u4}, {u5, u7}, {u6, u8}}
X1 = {u | Flu(u) = yes} = {u2, u3, u6, u7}
X2 = {u | Flu(u) = no} = {u1, u4, u5, u8}

R̲X1 = {u2, u3}                 R̄X1 = {u2, u3, u6, u7, u8, u5}
R̲X2 = {u1, u4}                 R̄X2 = {u1, u4, u5, u8, u7, u6}

[Figure: X1 and X2 drawn over the equivalence classes; the classes {u5, u7} and {u6, u8} straddle both decision classes, producing the boundary.]
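The lower and upper approximations above follow mechanically from the definitions: union the equivalence classes contained in X, and union the classes that intersect X. A minimal sketch using the equivalence classes and decision classes from the slide:

```python
# Sketch: R-lower and R-upper approximations computed from the
# equivalence classes U/R of the headache/temperature table.

U_R = [{"u1"}, {"u2"}, {"u3"}, {"u4"}, {"u5", "u7"}, {"u6", "u8"}]
X1 = {"u2", "u3", "u6", "u7"}   # Flu = yes
X2 = {"u1", "u4", "u5", "u8"}   # Flu = no

def lower(classes, X):
    """Union of equivalence classes fully contained in X."""
    out = set()
    for Y in classes:
        if Y <= X:
            out |= Y
    return out

def upper(classes, X):
    """Union of equivalence classes that intersect X."""
    out = set()
    for Y in classes:
        if Y & X:
            out |= Y
    return out
```

These reproduce the slide's results, e.g. R̲X1 = {u2, u3} and R̄X1 = {u2, u3, u5, u6, u7, u8}.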
Four Basic Classes of Rough Sets
- X is roughly B-definable iff B̲(X) ≠ ∅ and B̄(X) ≠ U
- X is internally B-undefinable iff B̲(X) = ∅ and B̄(X) ≠ U
- X is externally B-undefinable iff B̲(X) ≠ ∅ and B̄(X) = U
- X is totally B-undefinable iff B̲(X) = ∅ and B̄(X) = U
An Example of Reducts & Core

     Headache   Muscle-pain   Temp.       Flu
u1   Yes        Yes           Normal      No
u2   Yes        Yes           High        Yes
u3   Yes        Yes           Very-high   Yes
u4   No         Yes           Normal      No
u5   No         No            High        No
u6   No         Yes           Very-high   Yes

Reduct1 = {Muscle-pain, Temp.}

        Muscle-pain   Temp.       Flu
u1,u4   Yes           Normal      No
u2      Yes           High        Yes
u3,u6   Yes           Very-high   Yes
u5      No            High        No

Reduct2 = {Headache, Temp.}

     Headache   Temp.       Flu
u1   Yes        Normal      No
u2   Yes        High        Yes
u3   Yes        Very-high   Yes
u4   No         Normal      No
u5   No         High        No
u6   No         Very-high   Yes

CORE = {Headache, Temp.} ∩ {Muscle-pain, Temp.} = {Temp.}
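A reduct must classify as well as the full attribute set: no two objects may agree on the reduct's attributes yet differ on Flu. A minimal sketch of that consistency check on the table above (the helper `consistent` is an illustrative name); note the core attribute Temp alone fails the check, which is why both reducts need a second attribute.

```python
# Sketch: an attribute subset B preserves the classification iff no two
# objects have equal values on B but different Flu decisions.

TABLE = {
    "u1": {"Headache": "Yes", "Muscle-pain": "Yes", "Temp": "Normal",    "Flu": "No"},
    "u2": {"Headache": "Yes", "Muscle-pain": "Yes", "Temp": "High",      "Flu": "Yes"},
    "u3": {"Headache": "Yes", "Muscle-pain": "Yes", "Temp": "Very-high", "Flu": "Yes"},
    "u4": {"Headache": "No",  "Muscle-pain": "Yes", "Temp": "Normal",    "Flu": "No"},
    "u5": {"Headache": "No",  "Muscle-pain": "No",  "Temp": "High",      "Flu": "No"},
    "u6": {"Headache": "No",  "Muscle-pain": "Yes", "Temp": "Very-high", "Flu": "Yes"},
}

def consistent(table, attrs):
    """True iff attrs determine the Flu decision unambiguously."""
    seen = {}
    for row in table.values():
        key = tuple(row[a] for a in attrs)
        if seen.setdefault(key, row["Flu"]) != row["Flu"]:
            return False
    return True
```

Both reducts pass; {Temp} alone does not (u2 and u5 share Temp = High but differ on Flu).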
Soft Techniques for KDD

[Figure: soft techniques for KDD shown as the overlap of three fields: logic, set theory, and probability.]
Soft Techniques for KDD (2)

[Figure: the reasoning modes deduction, induction, and abduction, with associated techniques: stochastic processes, rough sets, belief networks, fuzzy sets, connectionist networks, and GDT.]
A Hybrid Model

[Figure: a hybrid model connecting deduction, induction, and abduction through GrC, RS&ILP, TM, GDT, and RS.]
