Bayesian Networks - an introduction

Henrik Bengtsson <hb@maths.lth.se>
Computer Science and Technology
Lund Institute of Technology
Master's Thesis Project
(c) 1999 Henrik Bengtsson

Slide 3 (47) - Outline

- Introduction
- A simple Bayesian Network
- Graph Theory
- The Junction Tree Algorithm
- hbBN - a simple Bayesian Network tool
- The Twin-Model Example
- Summary

Slide 4 (47) - Introduction

"The Year 2000 problem"

Slide 5 (47) - History

- 60s: The first expert systems; IF-THEN rules.
- 1968: Attempts to use probabilities in expert systems (Gorry & Barnett).
- 1973: Gave up - too heavy calculations! (Gorry)
- 1976: MYCIN, a medical predicate-logic expert system with certainty factors (Shortliffe).
- 1976: PROSPECTOR, which predicts the likely location of mineral deposits using Bayes' rule (Duda et al.).

Summary of the time up until the mid 80s:
"Pure logic will solve the AI problems!"
"Probability theory is intractable to use and too complicated for complex models."

Example (PROSPECTOR):
  P(d | a) = P(d | a, b) P(b | a) + P(d | a, ¬b) P(¬b | a)

Certainty factor (MYCIN): a real value in (-1, +1):
  -1: the expression is known to be false.
   0: no belief either way.
  +1: the expression is known to be true.
Example:
  rule 34: a ∧ b ∧ c → q  (+0.7)
  rule 35: d ∧ q ∧ r → s  (-0.9)

Slide 6 (47) - But...

- 1986: Bayesian networks were revived and reintroduced to expert systems (Pearl).
- 1988: Breakthrough for efficient calculation algorithms (Lauritzen & Spiegelhalter) - tractable calculations on BNs.
- 1995: In Windows 95, for printer troubleshooting and Office assistance ("the paper clip").
- 1999: BNs are getting more and more used, e.g.
  gene expression analysis, business strategy, etc.
- 2000: Widely used - a BN tool will be shipped with every Windows Commercial Server.

Bayesian networks are the future!

Slide 7 (47) - What is a Bayesian Network?

A Bayesian network has two parts:
1) The qualitative part - the structure:
   - a directed acyclic graph (DAG)
   - vertices represent variables
   - edges represent relations between variables
2) The quantitative part - the strength of the relationships between variables:
   - conditional probability functions

Slide 8 (47) - Why Bayesian Networks?

- Models probabilities.
- Gives posterior beliefs given some observations.
- Can be used as a classifier.
- Can explain why.
- Can find the variables with the most impact.
- Algorithms exist.
- Can model expert (subjective) knowledge.
- Can automatically be learned from raw data.
- Simple - my grandmother can use it!

Slide 9 (47) - A simple Bayesian Network

"Icy Roads"

Slide 10 (47)

"Watson has had an accident!"  P(X_Watson = yes) = 1

Slide 11 (47)

"No, the roads are not icy."  P(X_Icy = no) = 1

When instantiating X_Icy, X_Holmes becomes independent of X_Watson:
  X_Holmes ⊥ X_Watson | X_Icy

Slide 12 (47)

(Figure: naive update steps 1-8.)
This naive approach of updating the network inherits oscillation problems!

Slide 13 (47) - Idea behind the Junction Tree Algorithm

Slide 14 (47)

Bayesian network: one-dimensional random variables, conditional probabilities.
Secondary structure / junction tree: multi-dimensional random variables, joint probabilities (potentials).
(Figure: transformation steps 1-4.)

Slide 15 (47) - Graph Theory

A graph G = (V, E) consists of:
- a set of vertices V
- a set of edges E
  Ex. V = {a, b, c, d}, E = {a→b, a→d, b→c, d→c}
- e = uv: u is a parent of v, and v a child of u.
- pa(u) = the parent set of u; ch(u) = the child set of u.
  Ex. pa(b) = pa(d) = {a}, ch(a) = {b, d}, ch(c) = ∅
- the family of u: fa(u) = {u} ∪ pa(u)
  Ex. fa(a) = {a}, fa(b) = {a, b}, fa(d) = {a, d}, fa(c) = {b, c, d}
- the neighbors of u: ne(u) = {u} ∪ pa(u) ∪ ch(u)
  Ex.
  ne(a) = {a, b, d}, ne(b) = {a, b, c}, ne(c) = {b, c, d}, ne(d) = {a, c, d}

Slide 17 (47) - Junction Tree Algorithm

1. Transformation (of structure) - qualitative:
   a) Moralize
   b) Triangulate
   c) Build junction tree
2. Initialization (of values) - quantitative:
   a) Set up potentials
   b) Propagate potentials
3. Update beliefs - quantitative:
   a) Insert evidence into the junction tree
   b) Propagate potentials

Steps 1 and 2 are performed only once.

Slide 18 (47) - Step 1a: Moralization

Given G = (V, E), form the moral graph G^M:
1. For all w ∈ V: for all u, v ∈ pa(w), add an edge e = u-v.
2. Undirect all edges.

Slide 19 (47) - Step 1b: Triangulation

Add edges to G^M such that there is no cycle of length ≥ 4 that does not contain a chord.

Slide 20 (47) - Elimination of a vertex

1. Connect all its neighbors pairwise.
2. Remove the vertex and the edges connected to it.

Slide 21 (47)

G^M → G^T: repeatedly eliminate the vertex that requires the least number of edges to be added.

Slide 22 (47) - Cliques

Bayesian network G = (V, E) → moral graph G^M → triangulated graph G^T → cliques.

Slide 23 (47) - Step 1c: Building the junction tree

In the junction tree G^JT the cliques become vertices; adjacent cliques are connected via sepsets (separator sets).
Ex: {c, e, g} ∩ {e, g, h} = {e, g}

Slide 24 (47) - Junction Tree Property

All vertices C and sepsets S along the path between any two vertices A and B contain the intersection A ∩ B.

Slide 25 (47) - Junction Tree Algorithm (recap: next, step 2 - Initialization)

Slide 26 (47) - Potentials

DEFINITION: A potential φ_A over a set of variables X_A is a function that maps each instantiation x_A into a non-negative real number.
We denote the number that φ_A maps x_A to by φ_A(x_A). A joint probability distribution is a special case of a potential, one for which Σ_{x_A} φ_A(x_A) = 1.

Slide 27 (47) - Decomposable Distribution

DEFINITION: A probability distribution is said to be decomposable with respect to a graph G = (V, E) if G is triangulated and it holds for any clusters A and B with separator C that X_A ⊥ X_B | X_C.

Steps 1 and 2 of the junction tree algorithm guarantee this property!

Slide 28 (47) - Factorization of potentials

THEOREM: A decomposable probability distribution P(X_V) on the graph G = (V, E) can be written as the product of the potentials of all cliques divided by the product of the potentials of all sepsets:

  P(X_V) = Π_C φ_C(x_C) / Π_S φ_S(x_S)

Slide 29 (47) - Step 2a: Setting up potentials

1. For each cluster C and sepset S: φ_C ← 1, φ_S ← 1.
2. For each vertex u in the BN, select a parent cluster C such that C ⊇ fa(u), and include the conditional probability P(X_u | X_pa(u)) in φ_C:
   φ_C ← φ_C · P(X_u | X_pa(u))

Slide 30 (47)

The potentials in the junction tree are not yet consistent with each other; i.e., if we use marginalization to get the probability distribution of a variable X_u, we get different results depending on which clique we use. The potentials might not even sum to one, i.e., they are not joint probability distributions.

Slide 31 (47) - Step 2b: Propagating potentials

Message passing from clique A to clique B:
1. Projection: project the potential of A onto the sepset S_AB (marginalize φ_A over the variables not in S_AB).
2. Absorption: absorb the potential of S_AB into B (multiply φ_B by the ratio of the new and old sepset potentials).

Slide 32 (47) - Global Propagation

(Figure: message-passing order over the junction tree; "Start here!")

Slide 33 (47) - A priori distribution

After global propagation the potentials are consistent, and marginalization gives the probability distributions of the variables.

Slide 34 (47) - Summary (initialization)

1. For each cluster C and sepset S: φ_C ← 1, φ_S ← 1.
2. For each vertex u in the BN, select a parent cluster C such that C ⊇ fa(u).
   Include the conditional probability P(X_u | X_pa(u)) in φ_C: φ_C ← φ_C · P(X_u | X_pa(u)).

Slide 35 (47) - Junction Tree Algorithm (recap: next, step 3 - Update beliefs)

Slide 36 (47) - Evidence

Evidence is new information about a random variable that changes our belief about its distribution.
Ex. Before receiving evidence: P(X_u) = (0.14, 0.43, 0.31, 0.12).
- Hard evidence - the r.v. is instantiated (observed): X_u = x_u ⇒ P(X_u) := (0, 0, 1, 0).
- Soft evidence - everything else.

Slide 40 (47) - hbBN: a simple Bayesian Network tool

- Run from the command line: > java hb.math.bn.HBBN IcyRoads.xbs
- Can be called from Matlab or any other environment.
- Generates figures to be included in LaTeX.
- Only discrete variables.

Slide 41 (47) - Example of the hbBN script language (.xbs)

Page 1 comments:
- "Icy Roads. This example is taken from An Introduction to Bayesian Networks, Finn V. Jensen, 1996."
- Loading the network (OK) and building the join tree.

Page 2 comments:
- "The information that Watson has crashed updates the probability that the roads are icy and that Holmes also has crashed."
- "At last, when the Inspector is convinced that the roads are not icy, P(H | I=n) = (0.1, 0.9)."

Slide 42 (47) - The Icy Roads model in the XML Belief Network File format

Variables: Watson (yes/no), Holmes (yes/no), Icy (yes/no).
Tables:
  P(Icy)           = (0.7, 0.3)
  P(Watson | Icy):   Icy=yes: (0.8, 0.2);  Icy=no: (0.1, 0.9)
  P(Holmes | Icy):   Icy=yes: (0.8, 0.2);  Icy=no: (0.1, 0.9)

Slide 43 (47) - The Twin-Model Example: "the execution of a prisoner or not"

P(CourtOrder = Execution) = 0.70
P(Prisoner = Dead) = 0.70
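The Icy Roads model is small enough that the beliefs the slides describe can be checked by brute-force enumeration of the joint distribution. The sketch below is a minimal illustration, not the junction tree algorithm itself: it assumes the CPT values given for the Icy Roads model (P(Icy) = (0.7, 0.3); P(Watson | Icy) and P(Holmes | Icy) both (0.8, 0.2) / (0.1, 0.9)), and the helper names `joint` and `query` are hypothetical, not part of hbBN.

```python
# Exact inference by enumerating the joint distribution of the
# "Icy Roads" network (Jensen, 1996), using the CPTs from the
# XML Belief Network model. True stands for "yes"/"crash".
from itertools import product

p_icy = {True: 0.7, False: 0.3}
p_watson = {True: {True: 0.8, False: 0.2},   # P(Watson | Icy=yes)
            False: {True: 0.1, False: 0.9}}  # P(Watson | Icy=no)
p_holmes = p_watson  # the deck uses the same table for both drivers

def joint(i, w, h):
    """P(Icy=i, Watson=w, Holmes=h), factorized along the DAG."""
    return p_icy[i] * p_watson[i][w] * p_holmes[i][h]

def query(target, evidence):
    """P(target=True | evidence) by summing the joint over all states."""
    num = den = 0.0
    for i, w, h in product([True, False], repeat=3):
        state = {"Icy": i, "Watson": w, "Holmes": h}
        if all(state[k] == v for k, v in evidence.items()):
            p = joint(i, w, h)
            den += p
            if state[target]:
                num += p
    return num / den

print(query("Holmes", {}))                  # prior belief, ~0.59
print(query("Holmes", {"Watson": True}))    # Watson's crash raises it, ~0.76
print(query("Holmes", {"Icy": False}))      # ~0.1, as in the .xbs script
print(query("Holmes", {"Icy": False, "Watson": True}))  # still ~0.1
```

The last two queries agree, which is exactly the conditional independence X_Holmes ⊥ X_Watson | X_Icy from the Icy Roads slides: once Icy is instantiated, Watson's accident carries no further information about Holmes.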