Machine Vision Technology and Applications

Logistic Regression: Classification (Machine Learning)

Classification examples:
Email: Spam / Not Spam?
Online Transactions: Fraudulent (Yes / No)?
Tumor: Malignant / Benign?

y ∈ {0, 1}
0: "Negative Class" (e.g., benign tumor)
1: "Positive Class" (e.g., malignant tumor)

First attempt: plot Malignant? ((Yes) 1 vs. (No) 0) against Tumor Size, fit linear regression, and threshold the classifier output at 0.5:
If h_θ(x) ≥ 0.5, predict "y = 1"; if h_θ(x) < 0.5, predict "y = 0".

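The thresholded-regression idea can be sketched as follows (Python standing in for the course's Octave; the tumor-size data and variable names are made-up illustrations, not course material):

```python
import numpy as np

# Hypothetical 1-D training set: tumor sizes (cm) and 0/1 malignancy labels.
sizes = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5])
labels = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0])

# Fit a straight line h(x) = t0 + t1*x by least squares...
t1, t0 = np.polyfit(sizes, labels, 1)

# ...and threshold its output at 0.5 to get class predictions.
predictions = ((t0 + t1 * sizes) >= 0.5).astype(float)
```

On this tidy data the thresholded line classifies every example correctly; the slides go on to show why the approach breaks down (one outlier shifts the line, and h(x) escapes [0, 1]), which motivates the sigmoid hypothesis.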
This thresholded straight-line fit is fragile: one far-out example (a very large tumor) tilts the line and flips earlier predictions, and h_θ(x) = θᵀx can be > 1 or < 0 even though classification labels are only y = 0 or 1. Logistic regression guarantees 0 ≤ h_θ(x) ≤ 1.

Logistic Regression: Hypothesis Representation (Machine Learning)

Logistic Regression Model. Want 0 ≤ h_θ(x) ≤ 1, so pass the linear score through the sigmoid function (also called the logistic function) g:
h_θ(x) = g(θᵀx), where g(z) = 1 / (1 + e^(−z))
g(z) → 0 as z → −∞, g(0) = 0.5, and g(z) → 1 as z → +∞, so h_θ(x) always lies in (0, 1).

Interpretation of Hypothesis Output:
h_θ(x) = estimated probability that y = 1 on input x.
Example: if x = [x₀; x₁] = [1; tumorSize] and h_θ(x) = 0.7, tell the patient there is a 70% chance of the tumor being malignant.
Compactly: h_θ(x) = P(y = 1 | x; θ), "the probability that y = 1, given x, parameterized by θ".

Logistic Regression: Decision Boundary (Machine Learning)

Since g(z) ≥ 0.5 exactly when z ≥ 0, suppose we
predict "y = 1" if h_θ(x) ≥ 0.5, i.e. θᵀx ≥ 0, and
predict "y = 0" if h_θ(x) < 0.5, i.e. θᵀx < 0.

Linear decision boundary example in the (x₁, x₂) plane: h_θ(x) = g(θ₀ + θ₁x₁ + θ₂x₂) with θ = [−3; 1; 1].
Predict "y = 1" if −3 + x₁ + x₂ ≥ 0, i.e. x₁ + x₂ ≥ 3: the decision boundary is the line x₁ + x₂ = 3.

Non-linear decision boundaries come from polynomial features:
h_θ(x) = g(θ₀ + θ₁x₁ + θ₂x₂ + θ₃x₁² + θ₄x₂²) with θ = [−1; 0; 0; 1; 1].
Predict "y = 1" if x₁² + x₂² ≥ 1: the boundary is the unit circle, crossing each axis at 1 and −1.

Logistic Regression: Cost Function (Machine Learning)

Training set: {(x⁽¹⁾, y⁽¹⁾), (x⁽²⁾, y⁽²⁾), …, (x⁽ᵐ⁾, y⁽ᵐ⁾)}, m examples; x ∈ ℝⁿ⁺¹ with x₀ = 1; y ∈ {0, 1}. How do we choose the parameters θ?

Linear regression uses J(θ) = (1/m) Σᵢ ½ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾)². With the sigmoid hypothesis substituted in, that J(θ) is "non-convex" (many local optima); we want a "convex" cost that gradient descent can minimize globally.

Logistic regression cost function. If y = 1:
Cost(h_θ(x), y) = −log(h_θ(x))

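The y = 1 branch of the cost just introduced can be checked numerically (a short Python sketch; the function name is mine):

```python
import math

# Per-example logistic cost for the y = 1 branch: -log(h),
# where h = h_theta(x) is the predicted probability that y = 1.
def cost_if_y1(h):
    return -math.log(h)

# A confident correct prediction (h near 1) costs almost nothing;
# a confident wrong prediction (h near 0) is penalized without bound.
```

For example, cost_if_y1(1.0) is exactly 0, while cost_if_y1(0.01) is about 4.6 and keeps growing as h shrinks, which is exactly the "penalize the learning algorithm by a very large cost" behavior the slides describe.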
If y = 0:
Cost(h_θ(x), y) = −log(1 − h_θ(x))
Each branch runs from cost 0 (confident correct prediction) to ∞ (confident wrong one): when y = 1, the cost is 0 at h_θ(x) = 1 and blows up as h_θ(x) → 0, and symmetrically for y = 0.

Logistic Regression: Simplified Cost Function and Gradient Descent (Machine Learning)

Since y is always 0 or 1, the two branches combine into a single expression:
Cost(h_θ(x), y) = −y log(h_θ(x)) − (1 − y) log(1 − h_θ(x))

J(θ) = (1/m) Σᵢ Cost(h_θ(x⁽ⁱ⁾), y⁽ⁱ⁾)
     = −(1/m) Σᵢ [ y⁽ⁱ⁾ log(h_θ(x⁽ⁱ⁾)) + (1 − y⁽ⁱ⁾) log(1 − h_θ(x⁽ⁱ⁾)) ]

To fit parameters θ: min_θ J(θ).
To make a prediction given a new x: output h_θ(x) = 1 / (1 + e^(−θᵀx)).

Gradient Descent. Want min_θ J(θ). Repeat:
θⱼ := θⱼ − α (∂/∂θⱼ) J(θ)   (simultaneously update all θⱼ)
The partial derivative works out to
θⱼ := θⱼ − α (1/m) Σᵢ (h_θ(x⁽ⁱ⁾) − y⁽ⁱ⁾) xⱼ⁽ⁱ⁾   (simultaneously update all θⱼ)
The algorithm looks identical to linear regression! Only the hypothesis h_θ(x) has changed.

Logistic Regression: Advanced Optimization (Machine Learning)

Optimization algorithm. Cost function J(θ); want min_θ J(θ). Given θ, we have code that can compute J(θ) and ∂J/∂θⱼ (for j = 0, 1, …, n); gradient descent then repeats θⱼ := θⱼ − α ∂J/∂θⱼ.

Given that same code, other optimization algorithms also apply: gradient descent, conjugate gradient, BFGS, L-BFGS.
Advantages of the last three: no need to manually pick α; often faster than gradient descent. Disadvantage: more complex.

Example (Octave): minimize J(θ) = (θ₁ − 5)² + (θ₂ − 5)², whose gradient is [2(θ₁ − 5); 2(θ₂ − 5)]:

    function [jVal, gradient] = costFunction(theta)
      jVal = (theta(1) - 5)^2 + (theta(2) - 5)^2;
      gradient = zeros(2, 1);
      gradient(1) = 2 * (theta(1) - 5);
      gradient(2) = 2 * (theta(2) - 5);

    options = optimset('GradObj', 'on', 'MaxIter', 100);
    initialTheta = zeros(2, 1);
    [optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);

For logistic regression, with theta = [θ₀; θ₁; …; θₙ], write the same template:

    function [jVal, gradient] = costFunction(theta)
      jVal = [code to compute J(θ)];
      gradient(1) = [code to compute ∂J/∂θ₀];
      gradient(2) = [code to compute ∂J/∂θ₁];
      ...
      gradient(n+1) = [code to compute ∂J/∂θₙ];

Logistic Regression: Multi-class Classification: One-vs-all (Machine Learning)

Multiclass classification examples:
Email foldering/tagging: Work, Friends, Family, Hobby
Medical diagnosis: Not ill, Cold, Flu
Weather: Sunny, Cloudy, Rain, Snow

Binary classification has two groups of points in the (x₁, x₂) plane; multi-class classification has one group per class.

One-vs-all (one-vs-rest): for each class i (Class 1, Class 2, Class 3, …), relabel the data as "class i" versus everything else and train a logistic regression classifier h_θ⁽ⁱ⁾(x) estimating P(y = i | x; θ). To classify a new input x, pick the class i that maximizes h_θ⁽ⁱ⁾(x).
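The simplified cost function and gradient-descent update from the slides can be sketched in Python as follows (NumPy in place of the course's Octave; the toy data, learning rate, and iteration count are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    # J(theta) = -(1/m) * sum( y*log(h) + (1-y)*log(1-h) )
    m = len(y)
    h = sigmoid(X @ theta)
    return -np.sum(y * np.log(h) + (1 - y) * np.log(1 - h)) / m

def gradient(theta, X, y):
    # dJ/dtheta_j = (1/m) * sum( (h - y) * x_j ): the same form as
    # linear regression, but h is the sigmoid hypothesis.
    m = len(y)
    return X.T @ (sigmoid(X @ theta) - y) / m

def gradient_descent(X, y, alpha=0.1, iters=1000):
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        theta = theta - alpha * gradient(theta, X, y)  # simultaneous update of all theta_j
    return theta

# Toy separable data: a column of ones (x0 = 1) plus one feature.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 3.0], [1.0, 4.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
theta = gradient_descent(X, y)
```

In practice one would hand `cost` and `gradient` to a library optimizer such as `scipy.optimize.minimize` with `method='BFGS'` or `'L-BFGS-B'`, which plays the role of Octave's `fminunc`: no α to pick, and usually faster convergence.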

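The one-vs-all scheme above can be sketched on made-up 2-D data (three well-separated clusters; the data set, learning rate, and helper names are illustrative assumptions, not course material):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_binary(X, y, alpha=0.1, iters=5000):
    # Plain gradient descent on the logistic cost, as in the earlier slides.
    theta = np.zeros(X.shape[1])
    m = len(y)
    for _ in range(iters):
        theta = theta - alpha * (X.T @ (sigmoid(X @ theta) - y)) / m
    return theta

def one_vs_all(X, y, num_classes):
    # For each class i, relabel as "class i vs. rest" and train h_i(x) ~ P(y = i | x).
    return [fit_binary(X, (y == i).astype(float)) for i in range(num_classes)]

def predict(thetas, x):
    # Pick the class whose classifier is most confident on x.
    return int(np.argmax([sigmoid(t @ x) for t in thetas]))

# Three clusters in the (x1, x2) plane; first column is x0 = 1.
X = np.array([[1.0, 0.0, 0.0], [1.0, 0.5, 0.5],   # class 0
              [1.0, 4.0, 0.0], [1.0, 3.5, 0.5],   # class 1
              [1.0, 2.0, 4.0], [1.0, 2.0, 3.5]])  # class 2
y = np.array([0, 0, 1, 1, 2, 2])
thetas = one_vs_all(X, y, 3)
```

Each of the three binary problems here is linearly separable, so every per-class classifier becomes confident on its own cluster and the argmax picks the right label at the cluster centers.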