# Conditional Independence and Markov Properties


Lecture 1
Saint-Flour Summer School, July 5, 2006
Steffen L. Lauritzen, University of Oxford
## Overview of lectures

1. Conditional independence and Markov properties
2. More on Markov properties
3. Graph decompositions and junction trees
4. Probability propagation and similar algorithms
5. Log-linear and Gaussian graphical models
6. Conjugate prior families for graphical models
7. Hyper Markov laws
8. Structure learning and Bayes factors
9. More on structure learning.
## Conditional independence

The notion of conditional independence is fundamental for graphical models.
For three random variables X, Y and Z we write X ⊥⊥ Y | Z when X is conditionally independent of Y given Z; graphically this is the chain

    X ——— Z ——— Y
If the random variables have density w.r.t. a product measure µ, the conditional independence is reflected in the relation

f(x, y, z)f(z) = f(x, z)f(y, z),

where f is a generic symbol for the densities involved.
## Graphical models

[Figure: undirected graph with vertices 1–7 and edges 1–2, 1–3, 2–4, 2–5, 3–5, 3–6, 4–7, 5–6, 5–7, 6–7.]

For several variables, complex systems of conditional independence can be described by undirected graphs.
A set of variables A is then conditionally independent of a set B, given the values of a set of variables C, if C separates A from B in the graph.
## A directed graphical model

Directed model showing relations between risk factors,
diseases, and symptoms.
## A pedigree

Graphical model for a pedigree from study of Werner’s
syndrome. Each node is itself a graphical model.
## A highly complex pedigree
Family relationship of 1641 members of Greenland Eskimo
population.
## Conditional independence

Random variables X and Y are conditionally independent
given the random variable Z if

L(X | Y, Z) = L(X | Z).

We then write X ⊥⊥ Y | Z (or X ⊥⊥_P Y | Z).
Intuitively:
Knowing Z renders Y irrelevant for predicting X.
Factorisation of densities w.r.t. product measure:

X ⊥⊥ Y | Z ⇐⇒ f(x, y, z)f(z) = f(x, z)f(y, z)
⇐⇒ ∃a, b : f(x, y, z) = a(x, z)b(y, z).
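As a quick numerical check of this equivalence, here is a NumPy sketch (the table sizes and the random entries of a and b are illustrative assumptions): it builds a joint density of the product form a(x, z)b(y, z) and verifies the criterion cell by cell.

```python
import numpy as np

# Build a joint density of the form f(x, y, z) = a(x, z) b(y, z),
# which by the factorisation criterion implies X ⊥⊥ Y | Z.
rng = np.random.default_rng(0)
a = rng.random((2, 3))                 # a(x, z)
b = rng.random((4, 3))                 # b(y, z)

f = np.einsum('xz,yz->xyz', a, b)
f /= f.sum()                           # normalise to a probability table

f_z = f.sum(axis=(0, 1))               # f(z)
f_xz = f.sum(axis=1)                   # f(x, z)
f_yz = f.sum(axis=0)                   # f(y, z)

# Check f(x, y, z) f(z) = f(x, z) f(y, z) for every cell.
lhs = np.einsum('xyz,z->xyz', f, f_z)
rhs = np.einsum('xz,yz->xyz', f_xz, f_yz)
assert np.allclose(lhs, rhs)
```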
## Fundamental properties

For random variables X, Y, Z, and W it holds that

(C1) if X ⊥⊥ Y | Z then Y ⊥⊥ X | Z;
(C2) if X ⊥⊥ Y | Z and U = g(Y), then X ⊥⊥ U | Z;
(C3) if X ⊥⊥ Y | Z and U = g(Y), then X ⊥⊥ Y | (Z, U);
(C4) if X ⊥⊥ Y | Z and X ⊥⊥ W | (Y, Z), then X ⊥⊥ (Y, W) | Z.

If the density w.r.t. product measure satisfies f(x, y, z) > 0, also

(C5) if X ⊥⊥ Y | Z and X ⊥⊥ Z | Y then X ⊥⊥ (Y, Z).

The condition f(x, y, z) > 0 is not necessary for (C5). It is enough, e.g., that f(y, z) > 0 for all (y, z), or that f(x, z) > 0 for all (x, z).
In the discrete and finite case it is even enough that the bipartite graphs G⁺ = (Y ∪ Z, E⁺) defined by

y ∼⁺ z ⇐⇒ f(y, z) > 0

are all connected.
Alternatively it is sufficient if the same condition is satisfied with X replacing Y.
Is there a simple necessary and sufficient condition?
## Graphoid axioms

A ternary relation ⊥σ among subsets of a finite set V is a graphoid if for all disjoint subsets A, B, C, and D of V:

(S1) if A ⊥σ B | C then B ⊥σ A | C;
(S2) if A ⊥σ B | C and D ⊆ B, then A ⊥σ D | C;
(S3) if A ⊥σ B | C and D ⊆ B, then A ⊥σ B | (C ∪ D);
(S4) if A ⊥σ B | C and A ⊥σ D | (B ∪ C), then A ⊥σ (B ∪ D) | C;
(S5) if A ⊥σ B | (C ∪ D) and A ⊥σ C | (B ∪ D) then A ⊥σ (B ∪ C) | D.

It is a semigraphoid if only (S1)–(S4) hold.
## Irrelevance

Conditional independence can be seen as encoding irrelevance in a fundamental way. With the interpretation "knowing C, A is irrelevant for learning B", (S1)–(S4) translate to:

(I1) If, knowing C, learning A is irrelevant for learning B,
then B is irrelevant for learning A;
(I2) If, knowing C, learning A is irrelevant for learning B,
then A is irrelevant for learning any part D of B;
(I3) If, knowing C, learning A is irrelevant for learning B,
it remains irrelevant having learnt any part D of B;
(I4) If, knowing C, learning A is irrelevant for learning B
and, having also learnt A, D remains irrelevant for
learning B, then both of A and D are irrelevant for
learning B.

The property (S5) is slightly more subtle and not generally
obvious.
Also the symmetry (C1) is a special property of
probabilistic conditional independence, rather than of
general irrelevance, where (I1) could appear dubious.
## Probabilistic semigraphoids

V a finite set, X = (Xv, v ∈ V) random variables.
For A ⊆ V, let XA = (Xv, v ∈ A).
Let 𝒳v denote the state space of Xv.
Similarly xA = (xv, v ∈ A) ∈ 𝒳A = ×v∈A 𝒳v.
Abbreviate: A ⊥⊥ B | S ⇐⇒ XA ⊥⊥ XB | XS.
Then the basic properties of conditional independence imply:
The relation ⊥⊥ on subsets of V is a semigraphoid.
If f(x) > 0 for all x, ⊥⊥ is also a graphoid.
Not all (semi)graphoids are probabilistically representable.
## Second order conditional independence

Sets of random variables A and B are partially uncorrelated for fixed C if their residuals after linear regression on XC are uncorrelated:

Cov{XA − E∗ (XA | XC ), XB − E∗ (XB | XC )} = 0,

in other words, if the partial correlations are zero

ρAB·C = 0.

We then write A ⊥2 B | C.
Also ⊥2 satisfies the semigraphoid axioms (S1)–(S4), and the graphoid axioms if there is no non-trivial linear relation between the variables in V.
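This can be checked directly from simulated data. A NumPy sketch (the linear dependence of X and Y on Z, and the coefficients 2.0 and −1.5, are illustrative assumptions): regress each variable on Z and correlate the residuals.

```python
import numpy as np

# X and Y depend on each other only through Z, so the partial
# correlation ρ_{XY·Z} should vanish while the marginal one does not.
rng = np.random.default_rng(1)
n = 100_000
z = rng.standard_normal(n)
x = 2.0 * z + rng.standard_normal(n)
y = -1.5 * z + rng.standard_normal(n)

def residual(t, c):
    """Residual of t after least-squares linear regression on c."""
    design = np.column_stack([np.ones_like(c), c])
    beta, *_ = np.linalg.lstsq(design, t, rcond=None)
    return t - design @ beta

rho_xy_z = np.corrcoef(residual(x, z), residual(y, z))[0, 1]
assert abs(rho_xy_z) < 0.02                    # ≈ 0, i.e. X ⊥2 Y | Z
assert abs(np.corrcoef(x, y)[0, 1]) > 0.5      # yet marginally correlated
```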
## Separation in undirected graphs

Let G = (V, E) be a finite and simple undirected graph (no self-loops, no multiple edges).
For subsets A, B, S of V, let A ⊥G B | S denote that S separates A from B in G, i.e. that all paths from A to B intersect S.
Fact: The relation ⊥G on subsets of V is a graphoid.
This fact is the reason for choosing the name ‘graphoid’ for
such separation relations.
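Separation is easy to check by search. A pure-Python breadth-first sketch (the adjacency list encodes the 7-vertex example graph used throughout these slides) tests whether every path from A to B meets S:

```python
from collections import deque

def separates(adj, A, B, S):
    """True iff S separates A from B in the undirected graph `adj`,
    i.e. no path from A to B avoids S entirely."""
    A, B, S = set(A), set(B), set(S)
    seen = set(A - S)
    queue = deque(seen)
    while queue:
        v = queue.popleft()
        if v in B:
            return False            # reached B without passing through S
        for w in adj[v]:
            if w not in S and w not in seen:
                seen.add(w)
                queue.append(w)
    return True

# The 7-vertex example graph, as adjacency lists.
adj = {1: {2, 3}, 2: {1, 4, 5}, 3: {1, 5, 6}, 4: {2, 7},
       5: {2, 3, 6, 7}, 6: {3, 5, 7}, 7: {4, 5, 6}}

assert separates(adj, {1}, {7}, {2, 5, 6})
assert separates(adj, {2}, {6}, {3, 4, 5})
assert not separates(adj, {1}, {7}, {2, 5})   # path 1–3–6–7 avoids {2, 5}
```

The same routine verifies the separating sets quoted later in the global Markov property examples.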
## Geometric Orthogonality

As another fundamental example, consider geometric orthogonality in Euclidean vector spaces or Hilbert spaces.
Let L, M, and N be linear subspaces of a Hilbert space H and define

L ⊥ M | N ⇐⇒ (L ⊖ N) ⊥ (M ⊖ N),

where L ⊖ N = L ∩ N^⊥. Then L and M are said to meet orthogonally in N. This has the properties

(O1) If L ⊥ M | N then M ⊥ L | N ;
(O2) If L ⊥ M | N and U is a linear subspace of L, then
U ⊥ M | N;
(O3) If L ⊥ M | N and U is a linear subspace of M , then
L ⊥ M | (N + U );
(O4) If L ⊥ M | N and L ⊥ R | (M + N ), then
L ⊥ (M + R) | N .

The analogue of (C5) does not hold in general; for example
if M = N we may have

L ⊥ M | N and L ⊥ N | M,

but if L and M are not orthogonal then it is false that
L ⊥ (M + N ).
## Variation independence

Let U ⊆ 𝒳 = ×v∈V 𝒳v and define, for S ⊆ V, the S-section U^{u*_S} of U as

U^{u*_S} = {u_{V\S} : u_S = u*_S, u ∈ U}.

Define further the conditional independence relation ‡U as

A ‡U B | C ⇐⇒ ∀u*_C : U^{u*_C} = {U^{u*_C}}_A × {U^{u*_C}}_B,

i.e. if and only if the C-sections all have the form of a product space.
The relation ‡U satisfies the semigraphoid axioms. In particular, A ‡U B | C holds if U is the support of a probability measure satisfying the similar conditional independence restriction.
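The section condition can be sketched in a few lines of Python (the encoding of U as a set of triples (u_A, u_B, u_C) and the example sets are illustrative assumptions):

```python
from collections import defaultdict
from itertools import product

def sections_are_products(U):
    """U: set of triples (a, b, c) encoding points (u_A, u_B, u_C).
    A ‡_U B | C holds iff every C-section of U is a product set."""
    sections = defaultdict(set)
    for a, b, c in U:
        sections[c].add((a, b))
    for ab in sections.values():
        a_vals = {a for a, _ in ab}
        b_vals = {b for _, b in ab}
        if ab != set(product(a_vals, b_vals)):
            return False
    return True

# A full product space: every C-section is a product, so A ‡_U B | C holds.
U1 = {(a, b, c) for a in (0, 1) for b in (0, 1, 2) for c in (0, 1)}
assert sections_are_products(U1)

# The section at c = 0 is {(0, 0), (1, 1)}, which is not a product set.
U2 = {(0, 0, 0), (1, 1, 0), (0, 0, 1)}
assert not sections_are_products(U2)
```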
## Markov properties for semigraphoids

G = (V, E) a simple undirected graph; ⊥σ a (semi)graphoid relation. Say ⊥σ satisfies

(P) the pairwise Markov property if

α ≁ β =⇒ α ⊥σ β | V \ {α, β};

(L) the local Markov property if

∀α ∈ V : α ⊥σ V \ cl(α) | bd(α);

(G) the global Markov property if

A ⊥G B | S =⇒ A ⊥σ B | S.
## Pairwise Markov property

[Figure: undirected graph with vertices 1–7 and edges 1–2, 1–3, 2–4, 2–5, 3–5, 3–6, 4–7, 5–6, 5–7, 6–7.]

Any non-adjacent pair of random variables are conditionally independent given the remaining variables.
For example, 1 ⊥⊥ 5 | {2, 3, 4, 6, 7} and 4 ⊥⊥ 6 | {1, 2, 3, 5, 7}.
## Local Markov property

[Figure: undirected graph with vertices 1–7 and edges 1–2, 1–3, 2–4, 2–5, 3–5, 3–6, 4–7, 5–6, 5–7, 6–7.]

Every variable is conditionally independent of the remaining variables, given its neighbours.
For example, 5 ⊥⊥ {1, 4} | {2, 3, 6, 7} and 7 ⊥⊥ {1, 2, 3} | {4, 5, 6}.
## Global Markov property

[Figure: undirected graph with vertices 1–7 and edges 1–2, 1–3, 2–4, 2–5, 3–5, 3–6, 4–7, 5–6, 5–7, 6–7.]

To find conditional independence relations, one should look for separating sets, such as {2, 3}, {4, 5, 6}, or {2, 5, 6}.
For example, it follows that 1 ⊥⊥ 7 | {2, 5, 6} and 2 ⊥⊥ 6 | {3, 4, 5}.
## Structural relations among Markov properties

For any semigraphoid it holds that

(G) =⇒ (L) =⇒ (P).

If ⊥σ satisfies the graphoid axioms it further holds that

(P) =⇒ (G),

so that in the graphoid case

(G) ⇐⇒ (L) ⇐⇒ (P).

The latter holds in particular for ⊥⊥ when f(x) > 0.
## (G) =⇒ (L) =⇒ (P)

(G) implies (L) because bd(α) separates α from V \ cl(α).
Assume (L) and let α ≁ β. Then β ∈ V \ cl(α). Thus

bd(α) ∪ ((V \ cl(α)) \ {β}) = V \ {α, β}.

Hence by (L) and (S3) we get that

α ⊥σ (V \ cl(α)) | V \ {α, β}.

(S2) then gives α ⊥σ β | V \ {α, β}, which is (P).
## (P) =⇒ (G) for graphoids

Assume (P) and A ⊥G B | S. We must show A ⊥σ B | S.
Wlog assume A and B non-empty. The proof is reverse induction on n = |S|.
If n = |V | − 2 then A and B are singletons and (P) yields
A ⊥σ B | S directly.
Assume |S| = n < |V | − 2 and conclusion established for
|S| > n.
First assume V = A ∪ B ∪ S. Then either A or B has at
least two elements, say A.
If α ∈ A then B ⊥G (A \ {α}) | (S ∪ {α}) and also
α ⊥G B | (S ∪ A \ {α}) (as ⊥G is a semigraphoid).
Thus by the induction hypothesis

(A \ {α}) ⊥σ B | (S ∪ {α}) and {α} ⊥σ B | (S ∪ A \ {α}).

Now (S5) gives A ⊥σ B | S.
For A ∪ B ∪ S ⊂ V we choose α ∈ V \ (A ∪ B ∪ S). Then
A ⊥G B | (S ∪ {α}) and hence the induction hypothesis
yields A ⊥σ B | (S ∪ {α}).
Further, either A ∪ S separates B from {α} or B ∪ S
separates A from {α}. Assuming the former gives
α ⊥σ B | A ∪ S.
Using (S5) we get (A ∪ {α}) ⊥σ B | S and from (S2) we
derive that A ⊥σ B | S.
The latter case is similar.
## Factorisation and Markov properties

For a ⊆ V, ψa(x) is a function depending on xa only, i.e.

xa = ya =⇒ ψa(x) = ψa(y).

We can then write ψa(x) = ψa(xa) without ambiguity.
The distribution of X factorizes w.r.t. G, or satisfies (F), if its density f w.r.t. product measure on 𝒳 has the form

f(x) = ∏_{a∈𝒜} ψa(x),

where 𝒜 is a collection of complete subsets of G or, equivalently, if

f(x) = ∏_{c∈𝒞} ψ̃c(x),

where 𝒞 is the set of cliques of G.
## Factorization example

[Figure: undirected graph with vertices 1–7 and edges 1–2, 1–3, 2–4, 2–5, 3–5, 3–6, 4–7, 5–6, 5–7, 6–7.]

The cliques of this graph are the maximal complete subsets
{1, 2}, {1, 3}, {2, 4}, {2, 5}, {3, 5, 6}, {4, 7}, and {5, 6, 7}.
A complete set is any subset of these sets.
The graph above corresponds to a factorization as

f (x) = ψ12 (x1 , x2 )ψ13 (x1 , x3 )ψ24 (x2 , x4 )ψ25 (x2 , x5 )
× ψ356 (x3 , x5 , x6 )ψ47 (x4 , x7 )ψ567 (x5 , x6 , x7 ).
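As a numerical illustration, a NumPy sketch (binary state spaces and random positive clique potentials are illustrative assumptions) can build this factorization explicitly and confirm an implied conditional independence such as 1 ⊥⊥ 7 | {2, 5, 6}:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
# Cliques of the example graph, with random positive potentials.
cliques = [(1, 2), (1, 3), (2, 4), (2, 5), (3, 5, 6), (4, 7), (5, 6, 7)]
psi = {c: rng.random((2,) * len(c)) + 0.1 for c in cliques}

# Joint density over 7 binary variables: f(x) ∝ ∏_c ψ_c(x_c).
f = np.empty((2,) * 7)
for x in product(range(2), repeat=7):
    f[x] = np.prod([psi[c][tuple(x[v - 1] for v in c)] for c in cliques])
f /= f.sum()

# {2, 5, 6} separates 1 from 7, so the factorization should imply
# f(x1,x2,x5,x6,x7) f(x2,x5,x6) = f(x1,x2,x5,x6) f(x2,x5,x6,x7).
m = f.sum(axis=(2, 3))                      # marginalise out x3, x4
lhs = np.einsum('abesg,bes->abesg', m, m.sum(axis=(0, 4)))
rhs = np.einsum('abes,besg->abesg', m.sum(axis=4), m.sum(axis=0))
assert np.allclose(lhs, rhs)
```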
## Factorisation of the multivariate Gaussian

Consider a multivariate Gaussian random vector X ∼ N_V(ξ, Σ) with Σ regular, so it has density

f(x | ξ, Σ) = (2π)^{−|V|/2} (det K)^{1/2} exp{−(x − ξ)⊤K(x − ξ)/2},

where K = Σ^{−1} is the concentration matrix of the distribution.
Thus the Gaussian density factorizes w.r.t. G if and only if

α ≁ β =⇒ kαβ = 0,

i.e. if the concentration matrix has zero entries for all non-adjacent pairs of vertices.
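For instance (a minimal sketch; the 3-variable chain graph 1–2–3 and the numerical entries of K are assumptions made for illustration), a zero concentration entry gives a vanishing partial correlation even though the marginal correlation is non-zero:

```python
import numpy as np

# Chain graph 1 – 2 – 3 (no edge between 1 and 3): k13 = 0 encodes 1 ≁ 3.
K = np.array([[1.0, 0.4, 0.0],
              [0.4, 1.0, 0.3],
              [0.0, 0.3, 1.0]])   # concentration matrix, positive definite
Sigma = np.linalg.inv(K)          # covariance matrix

# Partial correlation of X1 and X3 given the rest, read off from K:
rho_13_2 = -K[0, 2] / np.sqrt(K[0, 0] * K[2, 2])
assert rho_13_2 == 0.0            # zero entry ⇔ X1 ⊥⊥ X3 | X2

# Marginally, however, X1 and X3 remain correlated:
assert abs(Sigma[0, 2]) > 0.1
```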
## Factorization theorem

Consider a distribution with density f w.r.t. a product measure and let (G), (L) and (P) denote the Markov properties w.r.t. the semigraphoid relation ⊥⊥.
It then holds that
(F) =⇒ (G)
and further:
If f (x) > 0 for all x: (P) =⇒ (F).
Thus in the case of positive density (but typically only
then), all the properties coincide:

(F) ⇐⇒ (G) ⇐⇒ (L) ⇐⇒ (P).
