Chapter 9  Uncertainty-Based Information

9.1 Information and Uncertainty

What is uncertainty? Uncertainty is a deficiency of information: the available information may be incomplete, imprecise, fragmentary, vague, contradictory, or unreliable. In general, a gain of information (a new fact, a measurement, experience, knowledge) corresponds to a reduction of uncertainty; information conceived in this way is called uncertainty-based information.

We therefore try to build mathematical models that express how uncertain a system is. Classically, set theory and probability theory lead to information theory, which measures probabilistic uncertainty (Claude Shannon). Generalized information theory broadens this framework to fuzzy set theory, fuzzy measure theory, possibility theory, and evidence theory.

Three types of uncertainty are distinguished:
1. nonspecificity (or imprecision): related to the cardinalities of sets of alternatives;
2. fuzziness (vagueness): related to imprecise boundaries of fuzzy sets;
3. strife (discord): related to conflicts among evidential claims.

9.2 Nonspecificity of Crisp Sets

To measure the uncertainty of a system we begin with set theory. For a finite nonempty crisp set A of possible alternatives, define

U(A) = c \log_b |A|,   (9.1)

where |A| is the cardinality of A and the constants b > 1, c > 0 determine the unit of uncertainty. With b = 2 and c = 1 this is the Hartley function

U(A) = \log_2 |A|,

and uncertainty is measured in bits: one bit of uncertainty is the total uncertainty regarding the truth or falsity of one proposition. Formally, U maps finite nonempty sets to nonnegative real numbers.

The interpretation of the uncertainty depends entirely on what the set A represents, e.g.:
- predictive uncertainty: A is a set of predicted states of a variable;
- diagnostic uncertainty: A is a set of possible diseases of a patient;
- retrodictive uncertainty: A is a set of possible answers to a question;
- prescriptive uncertainty: A is a set of possible policies.

Every set of possible alternatives carries potential nonspecificity: large sets are less specific, while singletons are fully specific. If a set A is reduced to a subset B, the amount of uncertainty-based information produced by this action equals the amount of reduced uncertainty:

I(A, B) = U(A) - U(B) = \log_2 |A| - \log_2 |B| = \log_2 (|A| / |B|).

When |B| = 1, I(A, B) = \log_2 |A| = U(A); hence U(A) may also be conceived as the amount of information needed to characterize one element of A.

Let X, Y be universal sets, R \subseteq X \times Y a relation, R_X the domain of R (its projection on X), and R_Y the range of R (its projection on Y). The simple uncertainties of the two sets, each taken by itself, are

U(R_X) = \log_2 |R_X|,   U(R_Y) = \log_2 |R_Y|,

and the joint uncertainty, which treats the two sets together through the relation, is

U(R_X, R_Y) = \log_2 |R|.

The conditional uncertainties are defined by

U(R_X | R_Y) = \log_2 (|R| / |R_Y|),   U(R_Y | R_X) = \log_2 (|R| / |R_X|),

where |R| / |R_Y| is the average number of elements of R_X that remain possible given an element of R_Y; thus U(R_X | R_Y) is the average nonspecificity of R_X given R_Y (Example 9.1 below makes these definitions concrete). It follows that

U(R_X | R_Y) = U(R_X, R_Y) - U(R_Y),   (9.8)
U(R_Y | R_X) = U(R_X, R_Y) - U(R_X),   (9.9)

and, subtracting (9.9) from (9.8),

U(R_X) - U(R_Y) = U(R_X | R_Y) - U(R_Y | R_X).

The two sets may be noninteractive or interactive:
- noninteractive: U(R_X | R_Y) = U(R_X), U(R_Y | R_X) = U(R_Y), and U(R_X, R_Y) = U(R_X) + U(R_Y);
- interactive: U(R_X | R_Y) < U(R_X), U(R_Y | R_X) < U(R_Y), and U(R_X, R_Y) < U(R_X) + U(R_Y).

By (9.8) and (9.9), the information transmission

T(R_X, R_Y) = U(R_X) + U(R_Y) - U(R_X, R_Y) = U(R_X) - U(R_X | R_Y) = U(R_Y) - U(R_Y | R_X)

(total information minus joint information) measures the strength of the constraint between the two sets; it equals 0 for noninteractive sets and is positive for interactive ones.

Example 9.1. Let X = {low, medium, high}, Y = {1, 2, 3, 4}, and let R \subseteq X \times Y be a relation with |R| = 7. Then
U(X) = \log_2 3 \approx 1.6, U(Y) = \log_2 4 = 2, U(X, Y) = \log_2 7 \approx 2.8,
U(X | Y) = U(X, Y) - U(Y) = 2.8 - 2 = 0.8,
U(Y | X) = U(X, Y) - U(X) = 2.8 - 1.6 = 1.2,
T(X, Y) = U(X) + U(Y) - U(X, Y) = 1.6 + 2 - 2.8 = 0.8.

Formula (9.1) is applicable only to finite sets. For infinite but bounded and measurable subsets of the real line,

U(A) = \log_2 [1 + \mu(A)],   (9.21)

where \mu(A) is the measure of A defined by the Lebesgue integral of the characteristic function of A. For an interval A = [a, b], \mu(A) = b - a and U([a, b]) = \log_2 (1 + b - a). (The slides illustrate at this point the main difference between the Riemann integral and the Lebesgue integral.)
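These calculations are easy to reproduce programmatically. The Python sketch below uses a hypothetical seven-pair relation, since the actual table of Example 9.1 is not reproduced in these notes; any relation with |R| = 7 whose projections equal X and Y yields the same numbers.

from math import log2

# Hypothetical relation R on X = {low, medium, high} x Y = {1, 2, 3, 4} with |R| = 7.
# Only the cardinalities of R and of its projections matter for the values computed here.
R = {("low", 1), ("low", 2), ("low", 4),
     ("medium", 2), ("medium", 3),
     ("high", 3), ("high", 4)}

R_X = {x for x, _ in R}      # projection of R on X (domain of R)
R_Y = {y for _, y in R}      # projection of R on Y (range of R)

U_X = log2(len(R_X))         # simple uncertainty U(X)    = log2 3 ~ 1.6
U_Y = log2(len(R_Y))         # simple uncertainty U(Y)    = log2 4 = 2
U_XY = log2(len(R))          # joint uncertainty  U(X, Y) = log2 7 ~ 2.8

U_X_given_Y = U_XY - U_Y     # conditional uncertainty, eq. (9.8)
U_Y_given_X = U_XY - U_X     # conditional uncertainty, eq. (9.9)
T = U_X + U_Y - U_XY         # information transmission

print(round(U_X, 1), round(U_Y, 1), round(U_XY, 1))               # 1.6 2.0 2.8
print(round(U_X_given_Y, 1), round(U_Y_given_X, 1), round(T, 1))  # 0.8 1.2 0.8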
9.3 Nonspecificity of Fuzzy Sets

We first set up the mathematics needed to determine the degree of nonspecificity of a fuzzy set. For a fuzzy set A on a finite universal set X,

U(A) = \frac{1}{h(A)} \int_0^{h(A)} \log_2 |{}^{\alpha}A| \, d\alpha,   (9.22)

where |{}^{\alpha}A| is the cardinality of the \alpha-cut of A and h(A) is the height of A. U(A) is a weighted average of the values of the Hartley function over all \alpha-cuts of the normalized counterpart A(x)/h(A); each weight is the difference between the value of \alpha of a given \alpha-cut and that of the immediately preceding \alpha-cut. Consequently, if A(x)/h(A) = B(x)/h(B) for all x, then U(A) = U(B): as Example 9.2 shows, fuzzy sets with the same normalized counterpart have the same nonspecificity.

Formula (9.22) is applicable to a finite universal set. For infinite sets,

U(A) = \frac{1}{h(A)} \int_0^{h(A)} \log_2 [1 + \mu({}^{\alpha}A)] \, d\alpha,

where the \alpha-cuts {}^{\alpha}A are measurable and Lebesgue integrable and \mu({}^{\alpha}A) is the measure of {}^{\alpha}A defined by the Lebesgue integral of its characteristic function.

U-uncertainty. The same idea can be expressed for possibility distributions. Let U : R \to \mathbb{R}^+, where R is the set of all finite and ordered possibility distributions; each possibility distribution r = (r_1, r_2, ..., r_n) satisfies r_1 \ge r_2 \ge ... \ge r_n. The U-uncertainty of r is

U(r) = \sum_{i=2}^{n} (r_i - r_{i+1}) \log_2 i,   (9.25)

where r_{n+1} = 0. If r represents a normal fuzzy set A (r_i = A(x_i), with the x_i ordered by decreasing membership), then U(r) = U(A) as defined by (9.22). An alternative form of (9.25) is

U(r) = \sum_{i=2}^{n} r_i \log_2 \frac{i}{i-1}.

Furthermore,

U(m) = \sum_{A \in F} m(A) \log_2 |A|,

where m is the basic probability assignment corresponding to r; its focal elements are the nested \alpha-cuts A_i = {x_1, ..., x_i} with m(A_i) = r_i - r_{i+1}.

Example 9.4. Given the possibility distribution

r = (1, 1, 0.8, 0.7, 0.7, 0.7, 0.4, 0.3, 0.2, 0.2),

the corresponding basic probability assignment (the consecutive differences r_i - r_{i+1}) is

m = (0, 0.2, 0.1, 0, 0, 0.3, 0.1, 0.1, 0, 0.2),

and

U(r) = 0.2 \log_2 2 + 0.1 \log_2 3 + 0.3 \log_2 6 + 0.1 \log_2 7 + 0.1 \log_2 8 + 0.2 \log_2 10 \approx 2.38 = U(m).

Let r now be a joint possibility distribution defined on X \times Y, U(r) the joint U-uncertainty, and U(r_X), U(r_Y) the simple U-uncertainties of the marginal possibility distributions r_X, r_Y; all of them are computed by (9.25), or equivalently from the corresponding basic probability assignments over their sets of focal elements. The conditional U-uncertainty U(r_{X|Y}) is defined so that, as in the crisp case, it represents the average number of elements of X that remain possible alternatives given an element of Y. For normal fuzzy sets,

U(r_{X|Y}) = U(r) - U(r_Y),

which is a generalization of (9.8); similarly, the counterpart of (9.9) holds. Equation (9.16) remains valid for the U-uncertainty, the information transmission (9.17) can be defined for it, and (9.18) and (9.19) remain valid as well.

9.4 Fuzziness (Vagueness)

A measure of fuzziness is a function f : F(X) \to \mathbb{R}^+, where F(X) is the fuzzy power set of X; f(A) expresses the degree to which the boundary of the fuzzy set A is not sharp. Requirements:
1. f(A) = 0 iff A is a crisp set (a crisp set is not fuzzy at all);
2. f(A) is maximal iff A(x) = 0.5 for all x (membership 0.5 is the hardest to classify, so fuzziness is then greatest);
3. f(A) \le f(B) whenever A is sharper than B, i.e., A(x) \le B(x) when B(x) \le 0.5 and A(x) \ge B(x) when B(x) \ge 0.5 (a sharper set is less fuzzy).
In short, the flatter a membership function stays around 0.5, the higher the fuzziness. There are two natural ways to measure the fuzziness of a set:
1. measure the difference between its membership function and the characteristic function of the nearest crisp set;
2. measure the lack of distinction between the set and its complement; the smaller the distinction between a set and its complement, the fuzzier the set.
(If a particular fuzzy complement other than the standard one were used, the value 0.5 in requirements 2 and 3 would have to be replaced with the equilibrium of that fuzzy complement.)

Using the standard fuzzy complement and the Hamming distance (the sum of absolute values of differences):
1. the local distinction between A and its complement at x is |A(x) - (1 - A(x))| = |2A(x) - 1|;
2. the lack of local distinction is 1 - |2A(x) - 1|;
3. the measure of fuzziness is

f(A) = \sum_{x \in X} (1 - |2A(x) - 1|) = |X| - \sum_{x \in X} |2A(x) - 1|,

which is 0 iff A is crisp and maximal when A(x) = 0.5 for all x. In the continuous case,

f(A) = \int_X (1 - |2A(x) - 1|) \, dx.   (9.35)

Example 9.5. For the three fuzzy sets of this example, the measure gives:
(i) f = 4 - 0.5 - 0.5 - 0.5 - 0.5 = 2;
(ii) f = 4 - 1 - 0.5 - 0.5 - 1 = 1;
(iii) f = 4 - 1 - 0.5 - 0.5 - 1 = 1.

A set and its normalized counterpart have the same nonspecificity but, in general, not the same fuzziness. Example: on X = [0, 4], let A_k (k = 4, 5) be defined by A_k(x) = h(A_k)(x - 1) for x in [1, 2], A_k(x) = h(A_k)(3 - x) for x in [2, 3], and A_k(x) = 0 otherwise, with heights h(A_4) = 0.4 and h(A_5) = 0.625. Then

f(A_4) = 4 - 1 - 0.6 - 0.6 - 1 = 0.8,   f(A_5) = 4 - 1 - 0.425 - 0.425 - 1 = 1.15.

In the accompanying figure, the shaded areas indicate the difference between the membership grades of a set and its complement; the degree of fuzziness is the deficiency of this difference with respect to 1 and is measured by the total size of the unshaded areas.
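Both measures introduced above are straightforward to compute. The following Python sketch evaluates the U-uncertainty (9.25) against Example 9.4 and applies the discrete fuzziness measure f to a hypothetical set of membership grades (chosen only to exercise the formula; they are not the grades of Example 9.5).

from math import log2

def u_uncertainty(r):
    # U(r) = sum_{i=2..n} (r_i - r_{i+1}) * log2(i), with r_{n+1} = 0; eq. (9.25)
    r = list(r) + [0.0]
    return sum((r[i - 1] - r[i]) * log2(i) for i in range(2, len(r)))

def fuzziness(grades):
    # f(A) = sum_x (1 - |2 A(x) - 1|), standard complement and Hamming distance
    return sum(1 - abs(2 * a - 1) for a in grades)

r = (1, 1, 0.8, 0.7, 0.7, 0.7, 0.4, 0.3, 0.2, 0.2)  # possibility distribution of Example 9.4
print(round(u_uncertainty(r), 2))                    # 2.38 bits

A = [0.25, 0.75, 0.75, 0.25]                         # hypothetical membership grades
print(fuzziness(A))                                  # 2.0 (each grade contributes 0.5)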
Nonspecificity and fuzziness are therefore different types of uncertainty, and they are independent of each other. A reduction of nonspecificity is always a gain of information; whether a reduction of fuzziness is a gain of information depends on the accompanying change in nonspecificity. For example, compare A with U(A) = 2.99, f(A) = 6 and B with U(B) = 3, f(B) = 0: in passing from A to B, U increased while f decreased.

9.5 Uncertainty in Evidence Theory

Nonspecificity is generalized along the chain: classical set theory, fuzzy set theory, possibility theory, evidence theory.

Nonspecificity (N-uncertainty). For a body of evidence (F, m),

N(m) = \sum_{A \in F} m(A) \log_2 |A|.   (9.36)

i. N is the unique measure of nonspecificity (under the appropriate axiomatic requirements);
ii. when the focal elements in F are nested, N coincides with the U-uncertainty;
iii. N is a weighted average of the Hartley function over the focal elements A in F, the weights being the values m(A) of the basic probability assignment: m(A) expresses the degree of evidence focusing on A, while \log_2 |A| expresses the lack of specificity of this evidential claim;
iv. the larger m(A), the stronger the evidence; the larger the set A (and \log_2 |A|), the less specific the evidence;
v. the range of N is [0, \log_2 |X|]: N(m) = 0 when all focal elements are singletons (in particular when m({x}) = 1 for some x, i.e., no nonspecificity), and N(m) = \log_2 |X| when m(X) = 1 (total ignorance);
vi. equations (9.8)-(9.19) remain valid for N.

In probability measures the focal elements are singletons, i.e., |A| = 1, so \log_2 |A| = 0 and N(m) = 0. For probability measures, uncertainty is instead measured by the Shannon entropy

H(m) = - \sum_{x \in X} m(\{x\}) \log_2 m(\{x\}),   (9.37)

which measures the average uncertainty associated with the prediction of outcomes in a random experiment. Its range is [0, \log_2 |X|]: H(m) = 0 when m({x}) = 1 for some x, and H(m) = \log_2 |X| when m({x}) = 1/|X| for all x (the uniform probability distribution on X). Equations (9.8)-(9.19) remain valid for H.

Let

Con(\{x\}) = \sum_{y \ne x} m(\{y\}) = 1 - m(\{x\}),

the total evidential claim pertaining to focal elements different from the focal element {x}, that is, the sum of all evidential claims that fully conflict with the one focusing on {x}. Since m({x}) = 1 - Con({x}), (9.37) can be rewritten as

H(m) = - \sum_{x \in X} m(\{x\}) \log_2 [1 - Con(\{x\})],

so H(m) is the expected value of the conflict among evidential claims within a given probabilistic body of evidence.

Shannon cross-entropy (directed divergence):

D(f, g) = \int_a^b f(x) \log_2 \frac{f(x)}{g(x)} \, dx,

where f(x), g(x) are probability density functions defined on [a, b]. Discrete case:

D(p, q) = \sum_x p(x) \log_2 \frac{p(x)}{q(x)}.

Continuous information transmission:

T = \int \int f(x, y) \log_2 \frac{f(x, y)}{f_X(x) f_Y(y)} \, dx \, dy,

where f(x, y) is the joint probability density on X \times Y and f_X, f_Y are the density functions of the marginal distributions on X and Y.

Entropy-like measures in evidence theory.

A. Dissonance. From (7.11), Pl(A) = \sum_{B : A \cap B \ne \emptyset} m(B). Let

K(A) = \sum_{B : A \cap B = \emptyset} m(B) = 1 - Pl(A),

the total evidential claim pertaining to focal elements that are disjoint with A, that is, the sum of all evidential claims that fully conflict with A. The dissonance is

E(m) = - \sum_{A \in F} m(A) \log_2 Pl(A) = - \sum_{A \in F} m(A) \log_2 [1 - K(A)].

i. E(m) is the expected value of the conflict among evidential claims within a given body of evidence (F, m);
ii. it measures conflict;
iii. its range is [0, \log_2 |X|];
iv. it is not fully satisfactory as a measure of conflict: m(B) is counted as conflicting with m(A) only when A \cap B = \emptyset, although a conflict also arises whenever B is not a subset of A.

B. Confusion. From (7.10), Bel(A) = \sum_{B \subseteq A} m(B). Let

1 - Bel(A) = \sum_{B \not\subseteq A} m(B),

the sum of all evidential claims m(B) that, according to this criterion, conflict with m(A) whenever B \not\subseteq A. The confusion is

C(m) = - \sum_{A \in F} m(A) \log_2 Bel(A).

i. C(m) is the expected value of the conflict among evidential claims within a given body of evidence (F, m);
ii. it is not fully satisfactory as a measure of conflict either: it does not properly scale each particular conflict of m(B) with respect to m(A) according to the degree of violation of the subsethood relation B \subseteq A; the more this subsethood relation is violated, the greater the conflict should be.

C. Discord. Let

Con(A) = \sum_{B \in F} m(B) \frac{|B - A|}{|B|},

the sum of the individual conflicts of evidential claims with respect to A, each conflict scaled by the degree to which the subsethood B \subseteq A is violated; its range is [0, 1]. The discord is

D(m) = - \sum_{A \in F} m(A) \log_2 [1 - Con(A)] = - \sum_{A \in F} m(A) \log_2 \sum_{B \in F} m(B) \frac{|A \cap B|}{|B|},

a measure of the mean conflict among evidential claims within each body of evidence (F, m).
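The measures N, E, C and D defined above can be computed directly from a basic probability assignment. In the Python sketch below the body of evidence is hypothetical (it is not taken from the text) and focal elements are represented as frozensets.

from math import log2

# Hypothetical body of evidence on X = {1, 2, 3, 4}:
# focal elements are frozensets, m is the basic probability assignment.
m = {
    frozenset({1, 2}): 0.5,
    frozenset({3, 4}): 0.3,
    frozenset({1, 2, 3, 4}): 0.2,
}

def bel(A, m):
    # Bel(A) = sum of m(B) over all focal B that are subsets of A; eq. (7.10)
    return sum(v for B, v in m.items() if B <= A)

def pl(A, m):
    # Pl(A) = sum of m(B) over all focal B that intersect A; eq. (7.11)
    return sum(v for B, v in m.items() if A & B)

def N(m):
    # nonspecificity, eq. (9.36)
    return sum(v * log2(len(A)) for A, v in m.items())

def E(m):
    # dissonance: -sum_A m(A) log2 Pl(A)
    return -sum(v * log2(pl(A, m)) for A, v in m.items())

def C(m):
    # confusion: -sum_A m(A) log2 Bel(A)
    return -sum(v * log2(bel(A, m)) for A, v in m.items())

def D(m):
    # discord: -sum_A m(A) log2 sum_B m(B) |A ∩ B| / |B|
    return -sum(vA * log2(sum(vB * len(A & B) / len(B) for B, vB in m.items()))
                for A, vA in m.items())

print(round(N(m), 3), round(E(m), 3), round(C(m), 3), round(D(m), 3))
# 1.2 0.557 1.021 0.765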
A defect of D(m):

i. Let A \subset B. (a) According to the function Con, the claim m(B) is taken to be in conflict with the claim m(A) to the degree |B - A| / |B|. (b) However, since every element of A is also an element of B, m(B) cannot be viewed as contributing to the conflict with m(A). Statements (a) and (b) contradict each other. Example (incomplete information): evidential claim 1, "Joe is between 15 and 17," with degree m(A), where A = [15, 17]; evidential claim 2, "Joe is a teenager," with degree m(B), where B = [13, 19]. Claim 1 is stronger (more specific), claim 2 is weaker, and claim 2 does not conflict with claim 1.
ii. On the other hand, m(B) does conflict with m(A) to a degree proportional to the number of elements of A that are not in B, i.e., to |A - B|. The function Con(A) used in D is therefore not a proper expression of the conflict.

D. Strife. Scale the individual conflicts instead by |A - B| / |A|, the degree to which the subsethood A \subseteq B is violated, and define

S(m) = - \sum_{A \in F} m(A) \log_2 \sum_{B \in F} m(B) \frac{|A \cap B|}{|A|},   (9.55)

where |A \cap B| / |A| is the degree of subsethood of A in B. Equivalently,

S(m) = N(m) - \sum_{A \in F} m(A) \log_2 \sum_{B \in F} m(B) |A \cap B|,

where N(m) is the nonspecificity measure (9.36). S is the measure of the mean conflict among evidential claims within each body of evidence (F, m); it is additive, it reduces to the Shannon entropy for probability measures (where every focal element is a singleton and \sum_B m(B)|A \cap B|/|A| = m(A)), and its range is [0, \log_2 |X|].

Strife in possibility theory. Assume an ordered possibility distribution r = (r_1, ..., r_n), r_1 \ge ... \ge r_n, with r_{n+1} = 0. The strife then becomes

S(r) = \sum_{i=2}^{n} (r_i - r_{i+1}) \log_2 \frac{i}{\sum_{j=1}^{i} r_j} = U(r) - \sum_{i=2}^{n} (r_i - r_{i+1}) \log_2 \sum_{j=1}^{i} r_j,

where U(r) is the measure of possibilistic nonspecificity (the U-uncertainty).
1. The mathematical properties of S(r) are almost the same as those of the discord measure D.
2. The possibility distributions for which S attains its maximum are different from those for which D attains its maximum.
3. The values that S and D can attain in general evidence theory are constrained to a much smaller range within the domain of possibility theory.
4. Possibility theory is therefore almost conflict-free: within it, S and D are negligible compared with the other type of uncertainty, nonspecificity.

Total uncertainty.

A. Add the individual measures to form the total uncertainty

NS(m) = N(m) + S(m).

In evidence theory, from (9.55) and (9.36),

NS(m) = \sum_{A \in F} m(A) \log_2 \frac{|A|^2}{\sum_{B \in F} m(B) |A \cap B|},

and in possibility theory

NS(r) = U(r) + S(r) = \sum_{i=2}^{n} (r_i - r_{i+1}) \log_2 \frac{i^2}{\sum_{j=1}^{i} r_j}.

NS(m) = 0 exactly in the case of complete certainty (full information), correctly represented as follows:
1. in evidence theory, when m({x}) = 1 for some x in X;
2. in possibility theory, when r_1 = 1 and r_i = 0 for all i > 1.
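For ordered possibility distributions, the possibilistic strife and the total uncertainty NS(r) = U(r) + S(r) given above can be computed directly. A minimal Python sketch, reusing the distribution of Example 9.4:

from math import log2

def u_uncertainty(r):
    # possibilistic nonspecificity (U-uncertainty), eq. (9.25)
    r = list(r) + [0.0]
    return sum((r[i - 1] - r[i]) * log2(i) for i in range(2, len(r)))

def strife(r):
    # possibilistic strife: S(r) = sum_{i=2..n} (r_i - r_{i+1}) * log2( i / sum_{j<=i} r_j )
    rr = list(r) + [0.0]
    return sum((rr[i - 1] - rr[i]) * log2(i / sum(rr[:i])) for i in range(2, len(rr)))

r = (1, 1, 0.8, 0.7, 0.7, 0.7, 0.4, 0.3, 0.2, 0.2)   # the distribution of Example 9.4
U, S = u_uncertainty(r), strife(r)
print(round(U, 2), round(S, 2), round(U + S, 2))     # 2.38 0.34 2.72

For this distribution the strife (about 0.34 bits) is small compared with the nonspecificity (about 2.38 bits), in line with the observation that possibility theory is almost conflict-free.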
B. An alternative total uncertainty measure: view any body of evidence as a set of constraints that define acceptable probability distributions, and choose, as the total uncertainty, the largest Shannon entropy among them,

\max \left( - \sum_{x \in X} p_x \log_2 p_x \right),

where the maximum is taken over all distributions (p_x | x in X) that satisfy
(a) p_x in [0, 1] for all x in X and \sum_{x \in X} p_x = 1;
(b) Bel(A) \le \sum_{x \in A} p_x for every A \subseteq X.

Fuzziness in evidence theory. When evidence theory is fuzzified, the focal elements become fuzzy sets. Given a body of evidence (F, m) in which the elements of F are fuzzy sets, the total degree of fuzziness is

F(m) = \sum_{A \in F} m(A) f(A),

the weighted average of the individual degrees of fuzziness f(A) of all focal elements, each weighted by m(A). The other uncertainty measures remain unchanged when evidence theory is fuzzified.

9.6 Summary of Uncertainty Measures

Three types of uncertainty arise in the various theories:
- in classical set theory: nonspecificity only;
- in fuzzy set theory: nonspecificity and fuzziness;
- in probability theory: strife (conflict) only, measured by the Shannon entropy;
- in possibility theory: nonspecificity and (a small amount of) strife;
- in evidence theory: nonspecificity and strife, and fuzziness as well once the theory is fuzzified.

Functions measuring the various types of uncertainty:
(A) finite sets: the measures introduced in this chapter;
(B) infinite sets: no direct counterparts exist for some of those uncertainty measures;
(C) bounded subsets of R: nonspecificity (9.21) and fuzziness (9.35).

9.7 Principles of Uncertainty

Utility of uncertainty measures:
1. extrapolating evidence;
2. assessing the strength of the relationship between given groups of variables;
3. assessing the influence of given input variables on given output variables;
4. measuring the loss of information when a system is simplified.

Principles for managing uncertainty:
a. minimum uncertainty;
b. maximum uncertainty;
c. uncertainty invariance.

The principle of minimum uncertainty is an arbitration principle for narrowing down solutions in problems that involve uncertainty: we should accept only those of all otherwise equivalent solutions whose uncertainty is minimal.

Simplification problems: simplification entails a loss of information, that is, an increase of uncertainty; among competing simplifications, choose one whose increase of uncertainty is minimal. Categories of simplification strategies:
1. eliminating entities (e.g., variables, subsystems);
2. aggregating entities (e.g., variables, states);
3. breaking systems into subsystems.

Conflict-resolution problems: removing a conflict requires adjusting the given information, which again entails a loss of information (an increase of uncertainty); the principle selects the adjustment whose increase of uncertainty is minimal.

The principle of maximum uncertainty applies to ampliative reasoning problems, that is, reasoning in which the conclusions are not entailed in the given premises: for example, estimating microstates from the knowledge of relevant macrostates and partial information regarding the microstates, or identifying an overall system from some of its subsystems. Maximum uncertainty: in any ampliative inference, use all information available, but make sure that no additional information is unwittingly added.

Classical information theory.

A. The principle of maximum entropy: determine the probability distribution that maximizes the Shannon entropy subject to the given constraints, which specify the available information about the unknown probability distribution, together with the general axioms of probability theory.

Example: let X be a random variable with possible real values x_1, ..., x_n. Assume that the probabilities p(x_i) are unknown but the expected value E(x) = \sum_i x_i p(x_i) is known. Estimate p(x_1), ..., p(x_n) by the maximum entropy principle, i.e., maximize

H(p) = - \sum_i p(x_i) \log_2 p(x_i)

subject to \sum_i p(x_i) = 1, \sum_i x_i p(x_i) = E(x), and p(x_i) \ge 0. The solution has the exponential form

p(x_i) = \frac{e^{-\beta x_i}}{\sum_k e^{-\beta x_k}},

where \beta satisfies the expected-value constraint. If E(x) were not known, the uniform probability distribution p(x_i) = 1/n would be obtained.

B. The principle of minimum cross-entropy: given a prior probability distribution function q and some relevant new evidence, determine a new (posterior) probability distribution function p that minimizes the cross-entropy D(p, q) subject to constraints that represent the new evidence, as well as the axioms of probability theory. The uncertainty expressed by p is smaller than the uncertainty expressed by q.

Evidence theory.

A. The principle of maximum nonspecificity. Let X be the universal set, A, B and A \cap B nonempty subsets of X, and a, b the total beliefs focusing on A and B, respectively. To estimate the degree of support for A \cap B, formulate an optimization problem: determine m(X), m(A), m(B), m(A \cap B) for which

N(m) = m(A) \log_2 |A| + m(B) \log_2 |B| + m(A \cap B) \log_2 |A \cap B| + m(X) \log_2 |X|

reaches its maximum subject to the constraints

m(A) + m(A \cap B) = a,   m(B) + m(A \cap B) = b   (the evidence),
m(A) + m(B) + m(A \cap B) + m(X) = 1,   all masses nonnegative   (the axioms).

Select m(A \cap B) as the free variable. From (i) m(A) \ge 0 and (ii) m(B) \ge 0 we obtain the upper bound m(A \cap B) \le \min(a, b); from (iii) m(X) \ge 0 we obtain the lower bound m(A \cap B) \ge a + b - 1; hence

\max(0, a + b - 1) \le m(A \cap B) \le \min(a, b).   (9.66)

In terms of the free variable \lambda = m(A \cap B), the objective function becomes

N(\lambda) = \lambda \log_2 \frac{|A \cap B| \cdot |X|}{|A| \cdot |B|} + a \log_2 |A| + b \log_2 |B| + (1 - a - b) \log_2 |X|,

so the solution of the optimization problem depends only on the constant cardinalities |A|, |B|, |A \cap B|, |X|:
i. assuming A, B and A \cap B are nonempty, if \log_2 (|A \cap B| \cdot |X| / (|A| \cdot |B|)) = 0, then N does not depend on m(A \cap B) and any value within the bounds (9.66) is maximizing;
ii. if this coefficient is negative, m(A \cap B) must be minimized, i.e., set to the lower bound \max(0, a + b - 1), in order to maximize N; if it is positive, m(A \cap B) is set to the upper bound \min(a, b).
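The optimization therefore reduces to placing m(A ∩ B) at one end of the interval (9.66) according to the sign of \log_2 (|A \cap B| \cdot |X| / (|A| \cdot |B|)). The Python sketch below implements this under the constraint structure given above; the cardinalities and belief values in the example call are hypothetical.

from math import log2

def max_nonspecificity(a, b, card_A, card_B, card_AB, card_X):
    # Choose m(A∩B) within the bounds of eq. (9.66) so that
    # N(m) = m(A) log2|A| + m(B) log2|B| + m(A∩B) log2|A∩B| + m(X) log2|X|
    # is maximal; assumes A, B and A∩B are nonempty.
    lo, hi = max(0.0, a + b - 1.0), min(a, b)        # bounds (9.66)
    coeff = log2(card_AB * card_X / (card_A * card_B))
    lam = hi if coeff > 0 else lo                    # any value in [lo, hi] is optimal if coeff == 0
    masses = {"m(A)": a - lam, "m(B)": b - lam, "m(A∩B)": lam, "m(X)": 1.0 - a - b + lam}
    N = (masses["m(A)"] * log2(card_A) + masses["m(B)"] * log2(card_B)
         + lam * log2(card_AB) + masses["m(X)"] * log2(card_X))
    return masses, N

# Hypothetical inputs: |X| = 10, |A| = 5, |B| = 4, |A∩B| = 3, a = 0.5, b = 0.4
masses, N = max_nonspecificity(0.5, 0.4, 5, 4, 3, 10)
print({k: round(v, 3) for k, v in masses.items()}, round(N, 3))
# {'m(A)': 0.1, 'm(B)': 0.0, 'm(A∩B)': 0.4, 'm(X)': 0.5} 2.527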
