Resource listing
Document preview from the archive (first 20 of 45 pages)
ID: 22946057
Type: shared resource
Size: 4.34 MB
Format: RAR
Uploaded: 2019-11-05
Uploader: qq77****057
Keywords: rebar, cutting, design
Resource description: Design of a rebar straightening and cutting machine (钢筋调直切断机的设计)
Content preview:
International Journal of Wavelets, Multiresolution and Information Processing, Vol. 5, No. 1 (2007) 159-172. © World Scientific Publishing Company.

SUPPORT VECTOR MACHINE SOLVING FREEFORM CURVE AND SURFACE RECONSTRUCTION PROBLEM

ZHIXIA YANG (PhD student entrusted by Xinjiang University) and LING JING (corresponding author)
College of Science, China Agricultural University, Beijing 100083, P. R. China

Received 15 November 2005; Accepted 5 July 2006

Abstract. Reconstruction of mathematically unknown freeform curves and surfaces is of paramount importance for reverse engineering. The problem is a regression problem, but with a particular requirement: the curve or surface has to be smooth. The support vector machine (SVM) is a new and powerful method for regression. However, the fitting results of SVM are usually not smooth enough, owing to its sensitivity to outliers and noise. In this paper a modified version, called the smooth support vector machine (S-SVM), is proposed. The new version treats the training points differently by constructing their penalty factors according to the degree of smoothness, so that smooth regression curves and surfaces can be obtained. To compare the new version with SVM, numerical experiments on both curve and surface fitting are given, which show the improvements clearly.

Keywords: Reverse engineering; freeform curve; freeform surface; smoothness; support vector machine.

Mathematics Subject Classification 2000: 65D10, 68T99, 65K05

1. Introduction

Design and manufacturing of freeform curves and surfaces have wide industrial applications, for example in the automobile, aircraft, shipbuilding, and plastics industries. There has been much research on reconstructing freeform curves and surfaces from known mathematical representations [1, 2, 9]. Modeling of mathematically unknown freeform curves and surfaces is of paramount importance for reverse engineering [3]. One of the main issues in reverse engineering is to generate models or design representations from existing products; these models and representations are then used to manufacture the products. There is a variety of engineering applications. For example, in the automobile, shipbuilding, or aircraft industries, designers may create a physical model based on functional requirements and analysis. Such a model does not have a mathematical representation. For simple geometry it is not difficult to generate such a representation in terms of drawings, although it is very time consuming. If the model has a very complex surface, such as a freeform surface, developing such a design representation or model becomes difficult. To reconstruct an existing freeform curve or surface, it is first digitized by sampling a number of points with laser scanners or coordinate measuring machines. The measured data points are in general dense as well as noisy, which makes the fitted curve or surface not smooth enough. Basically, reconstruction of freeform curves and surfaces is a regression problem, but with a particular requirement: the produced curve or surface has to be smooth.

The support vector machine (SVM) for classification and nonlinear function estimation, introduced by Vapnik [4, 5] and further investigated by many others [6-8, 16, 18-23], is an important new methodology in the area of neural networks and nonlinear modeling.
SVM solutions are characterized by convex optimization problems: typically, one solves a convex quadratic programming (QP) problem in the dual space to determine the SVM model. The formulation of the optimization problem in the primal space associated with this QP problem involves inequality constraints. Recently, a least squares (LS) version of SVM has been investigated for classification [11-13, 17] and function estimation [14, 15]. In the LS-SVM formulation one works with equality instead of inequality constraints and a sum-of-squared-errors cost function. This reformulation greatly simplifies the problem, in such a way that the solution is characterized by a linear system; thus LS-SVM can be considerably faster than SVM. However, SVM and its modified versions share a potential drawback: they give all training data points the same penalty factor. As a result, the fitting results of SVM and LS-SVM are usually not smooth enough, owing to their sensitivity to outliers and noise. This paper deals with that problem.

Different from SVM, our smooth support vector machine (S-SVM) treats the training data points with different importance during training: each training point receives its own penalty factor according to its degree of smoothness. We show in this paper that in this way S-SVM produces smoother regression curves and surfaces than SVM does.

This paper is organized as follows. In Sec. 2 we first give a brief review of SVM, then present the basic idea of our S-SVM for reconstruction of freeform curves and surfaces. A practical approach for constructing penalty factors for the training data points in the one-dimensional case is proposed and the S-SVM algorithm is given; the algorithm is then generalized to the two-dimensional case in order to solve the freeform surface reconstruction problem. In Sec. 3, preliminary numerical experiments are given. Finally, further improvements are discussed in Sec. 4.

2. Smooth SVM

In this section we first introduce the basic idea of our S-SVM; then we give an S-SVM algorithm in the one-dimensional case for reconstructing a freeform curve. Lastly, in order to solve the freeform surface problem, we propose an S-SVM algorithm in the two-dimensional case.

2.1. The basic idea of S-SVM

In this subsection we review SVM and introduce the basic idea of our S-SVM. Given a training set

$T = \{(x_1, y_1), (x_2, y_2), \dots, (x_l, y_l)\} \in (X \times Y)^l,$   (2.1)

where the input $x_i \in X = R^n$ and the output $y_i \in Y = R$, $i = 1, \dots, l$, the input is first mapped into a higher-dimensional space by a function $\Phi$. Assume that the regression function takes the form

$f(x) = (w \cdot \Phi(x)) + b,$

where $\Phi(x): R^n \to R^m$ maps the input space into a so-called higher-dimensional (possibly infinite-dimensional) feature space, $w \in R^m$ is the weight vector, and $b \in R$ is the bias term. Let the loss function be the $\varepsilon$-insensitive loss function

$c(x, y, f(x)) = c_\varepsilon(f(x) - y),$   (2.2)

where $c_\varepsilon(\xi) = \max(|\xi| - \varepsilon, 0)$, $\varepsilon \ge 0$.

The primal problem is

$\min_{w, b, \xi, \xi^*} \ \frac{1}{2}\|w\|^2 + \sum_{i=1}^{l} C(\xi_i + \xi_i^*)$
s.t. $y_i - (w \cdot \Phi(x_i)) - b \le \varepsilon + \xi_i,$
     $(w \cdot \Phi(x_i)) + b - y_i \le \varepsilon + \xi_i^*,$   (2.3)
     $\xi_i, \xi_i^* \ge 0, \quad i = 1, \dots, l,$

where $\xi = (\xi_1, \dots, \xi_l)^T$, $\xi^* = (\xi_1^*, \dots, \xi_l^*)^T$, and $\varepsilon$ and $C$ are positive parameters.

In order to solve problem (2.3), consider its dual problem:

$\min_{\alpha, \alpha^*} \ \frac{1}{2}\sum_{i=1}^{l}\sum_{j=1}^{l} (\alpha_i - \alpha_i^*) K(x_i, x_j)(\alpha_j - \alpha_j^*) - \sum_{i=1}^{l} y_i(\alpha_i - \alpha_i^*) + \varepsilon \sum_{i=1}^{l}(\alpha_i + \alpha_i^*)$
s.t. $\sum_{i=1}^{l} (\alpha_i - \alpha_i^*) = 0,$   (2.4)
     $0 \le \alpha_i, \alpha_i^* \le C, \quad i = 1, \dots, l,$

where $\alpha = (\alpha_1, \dots, \alpha_l)^T$, $\alpha^* = (\alpha_1^*, \dots, \alpha_l^*)^T$, and $K(x_i, x_j) = (\Phi(x_i) \cdot \Phi(x_j))$ is the kernel function. After obtaining the solution $\alpha$ and $\alpha^*$, we obtain the decision (regression) function

$f(x) = \sum_{i=1}^{l} (\alpha_i - \alpha_i^*) K(x_i, x) + b,$

where $b$ is determined by the KKT (Karush-Kuhn-Tucker) conditions [10].
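In practice the dual QP (2.4) is rarely solved by hand; a standard epsilon-SVR solver can be used. The following is a minimal sketch using scikit-learn's SVR (an assumption, since the paper does not name an implementation), with the Gaussian kernel and the parameter values used in the one-dimensional experiment of Sec. 3.1 (sigma = 0.5, C = 1000, epsilon = 0.1):

```python
# Baseline epsilon-SVR of Eqs. (2.3)-(2.4), fitted with an off-the-shelf solver.
# Assumption: scikit-learn's SVR; the paper does not specify an implementation.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 21)                                 # x_i = -11/10 + i/10, i = 1..21
y = np.sqrt(1.0 - x**2) + np.sqrt(0.1) * rng.standard_normal(21)  # noisy half circle (Sec. 3.1)

# Gaussian kernel K(x, x') = exp(-||x - x'||^2 / (2 sigma^2));
# scikit-learn parametrizes it as exp(-gamma ||x - x'||^2), hence gamma = 1 / (2 sigma^2).
sigma, C, eps = 0.5, 1000.0, 0.1
svr = SVR(kernel="rbf", gamma=1.0 / (2.0 * sigma**2), C=C, epsilon=eps)
svr.fit(x.reshape(-1, 1), y)
y_fit = svr.predict(x.reshape(-1, 1))
```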
However, SVM treats every training point with the same importance, since it uses the same penalty factor for all training points. This leads to unsmooth results, so SVM must be modified.

Our S-SVM is obtained by modifying the SVM described above. The key point of S-SVM is to replace the constant $C$ in SVM by $C_i$, which depends on the $i$th training point, $i = 1, \dots, l$. The parameter $C_i$ is a penalty factor for the $i$th training point $(x_i, y_i)$ and thus expresses the importance we attach to that point. In order to make the final decision function smooth, we should find the bad training points that induce the non-smooth behavior and suppress their influence. A natural strategy is therefore: first, among the training points, find the bad points and define their degree of badness; second, decide the corresponding penalty factor $C_i$: the larger the badness degree, the smaller the penalty factor. The details are given in Secs. 2.2 and 2.3 below. Once the penalty parameters $C_i$, $i = 1, \dots, l$, are available, it is easy to write down the optimization problems of S-SVM corresponding to Eqs. (2.3) and (2.4). The primal problem is

$\min_{w, b, \xi, \xi^*} \ \frac{1}{2}\|w\|^2 + \sum_{i=1}^{l} C_i(\xi_i + \xi_i^*)$
s.t. $y_i - (w \cdot \Phi(x_i)) - b \le \varepsilon + \xi_i,$
     $(w \cdot \Phi(x_i)) + b - y_i \le \varepsilon + \xi_i^*,$   (2.5)
     $\xi_i, \xi_i^* \ge 0, \quad i = 1, \dots, l,$

where $\xi = (\xi_1, \dots, \xi_l)^T$, $\xi^* = (\xi_1^*, \dots, \xi_l^*)^T$, and $C_i$, $i = 1, \dots, l$, are parameters.

In order to solve problem (2.5), consider its dual problem:

$\min_{\alpha, \alpha^*} \ \frac{1}{2}\sum_{i=1}^{l}\sum_{j=1}^{l} (\alpha_i - \alpha_i^*) K(x_i, x_j)(\alpha_j - \alpha_j^*) - \sum_{i=1}^{l} y_i(\alpha_i - \alpha_i^*) + \varepsilon \sum_{i=1}^{l}(\alpha_i + \alpha_i^*)$
s.t. $\sum_{i=1}^{l} (\alpha_i - \alpha_i^*) = 0,$   (2.6)
     $0 \le \alpha_i, \alpha_i^* \le C_i, \quad i = 1, \dots, l,$

where $\alpha = (\alpha_1, \dots, \alpha_l)^T$, $\alpha^* = (\alpha_1^*, \dots, \alpha_l^*)^T$, and $K(x_i, x_j) = (\Phi(x_i) \cdot \Phi(x_j))$ is the kernel function. After obtaining the solution $\alpha$ and $\alpha^*$, the decision function is

$f(x) = \sum_{i=1}^{l} (\alpha_i - \alpha_i^*) K(x_i, x) + b,$

where $b$ is determined by the KKT conditions.

2.2. S-SVM in the one-dimensional case

Our S-SVM is first proposed to solve freeform curve reconstruction, which is a regression problem in one-dimensional space. Given an ordered training set

$T = \{P_1, P_2, \dots, P_l\} = \{(x_1, y_1), (x_2, y_2), \dots, (x_l, y_l)\} \subset (R \times R),$   (2.7)

where the input $x_i \in R$ and the output $y_i \in R$, $i = 1, \dots, l$, satisfy $x_1 \le x_2 \le \cdots \le x_l$. For each training point $P_i$ the sign of its discrete curvature is computed; the resulting sign sequence is used to distinguish "good points" from "bad points", and a bad point is assigned a reduced penalty factor $C_i$ determined by the included angle $\theta_i$ at $P_i$, where $p > 0$ is a positive parameter. Combining this technique for selecting $C_i$ with SVM, the S-SVM in the one-dimensional case is obtained.

Algorithm 1: S-SVM in the one-dimensional case.

(1) Given a training set (2.7) with $x_1 \le x_2 \le \cdots \le x_l$;
(2) Select $\varepsilon \ge 0$, $C > 0$, $p > 0$, and a kernel function $K(x, x')$;
(3) According to the above rule, calculate the sign sequence of curvature $\{\mathrm{sign}(K_i) \mid i = 1, 2, \dots, l\}$ for the training points $\{P_i \mid i = 1, 2, \dots, l\}$;
(4) For $i = 1, 2, \dots, l$, decide whether $P_i$ is "a good point" or "a bad point" by the above distinguishing criterion;
(5) For $i = 1, 2, \dots, l$, select $C_i$ by

$C_i = \begin{cases} C, & \text{if } P_i \text{ is a good point}; \\ C\left(\dfrac{1 - \cos\theta_i}{2}\right)^{p}, & \text{if } P_i \text{ is a bad point}, \end{cases}$   (2.10)

where $\theta_i$ is the included angle between $P_{i-1}P_i$ and $P_iP_{i+1}$, and both $p$ and $C$ are positive parameters given in Step 2;
(6) Solve the optimization problem

$\min_{\alpha, \alpha^*} \ \frac{1}{2}\sum_{i=1}^{l}\sum_{j=1}^{l} (\alpha_i - \alpha_i^*) K(x_i, x_j)(\alpha_j - \alpha_j^*) - \sum_{i=1}^{l} y_i(\alpha_i - \alpha_i^*) + \varepsilon \sum_{i=1}^{l}(\alpha_i + \alpha_i^*)$   (2.11)
s.t. $\sum_{i=1}^{l} (\alpha_i - \alpha_i^*) = 0,$
     $0 \le \alpha_i, \alpha_i^* \le C_i, \quad i = 1, \dots, l,$

and get its solution $(\alpha, \alpha^*) = (\alpha_1, \alpha_1^*, \dots, \alpha_l, \alpha_l^*)$;
(7) Construct the decision function as

$f(x) = \sum_{i=1}^{l} (\alpha_i - \alpha_i^*) K(x_i, x) + b,$   (2.12)

where $b$ is determined by the KKT conditions.
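The sketch below illustrates Steps (3)-(6) of Algorithm 1. Two points are assumptions rather than the paper's exact procedure: the good/bad rule used here (a local sign flip in the discrete-curvature sequence) stands in for the paper's distinguishing criterion, and $\theta_i$ is taken as the interior angle at $P_i$ (close to $\pi$ for nearly collinear points), so that (2.10) shrinks $C_i$ at sharp kinks. The per-point bounds of (2.11) are approximated through scikit-learn's sample_weight, which rescales C per sample.

```python
# Sketch of Algorithm 1 (one-dimensional S-SVM); see the assumptions stated above.
import numpy as np
from sklearn.svm import SVR

def interior_angle(p_prev, p, p_next):
    """Included angle theta_i at P_i between segments P_i P_{i-1} and P_i P_{i+1}."""
    u, v = p_prev - p, p_next - p
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(c, -1.0, 1.0))

def penalty_factors(x, y, C, p):
    pts = np.column_stack([x, y])
    seg = np.diff(pts, axis=0)
    # sign of the discrete curvature K_i at interior points (cross product of segments)
    sgn = np.sign(seg[:-1, 0] * seg[1:, 1] - seg[:-1, 1] * seg[1:, 0])
    Ci = np.full(len(x), C)
    for i in range(2, len(x) - 2):
        k = i - 1                                # entry of sgn that belongs to P_i
        bad = sgn[k] != 0 and sgn[k] == -sgn[k - 1] and sgn[k] == -sgn[k + 1]  # assumed rule
        if bad:
            theta = interior_angle(pts[i - 1], pts[i], pts[i + 1])
            Ci[i] = C * ((1.0 - np.cos(theta)) / 2.0) ** p   # Eq. (2.10)
    return Ci

# Data of Sec. 3.1: noisy half circle
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 21)
y = np.sqrt(1.0 - x**2) + np.sqrt(0.1) * rng.standard_normal(21)

C, p, sigma, eps = 1000.0, 37.0, 0.5, 0.1
Ci = penalty_factors(x, y, C, p)
ssvm = SVR(kernel="rbf", gamma=1.0 / (2.0 * sigma**2), C=C, epsilon=eps)
ssvm.fit(x.reshape(-1, 1), y, sample_weight=Ci / C)   # effective per-point C is C * weight
```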
2.3. S-SVM in the two-dimensional case

Now let us turn to the two-dimensional case. Given an ordered training set

$T = \{(x_{jk}, y_{jk}), \ j = 1, \dots, l_1, \ k = 1, \dots, l_2\} \subset (R^2 \times R),$   (2.13)

where the input $x_{jk} = (x_{jk1}, x_{jk2})^T \in R^2$ and the output $y_{jk} \in R$, $j = 1, \dots, l_1$, $k = 1, \dots, l_2$, satisfy

$x_{j1,1} \le x_{j2,1} \le \cdots \le x_{j l_2,1}, \quad j = 1, \dots, l_1,$   (2.14)
$x_{j1,2} = x_{j2,2} = \cdots = x_{j l_2,2}, \quad j = 1, \dots, l_1,$   (2.15)

and

$x_{1k,1} = x_{2k,1} = \cdots = x_{l_1 k,1}, \quad k = 1, \dots, l_2,$   (2.16)
$x_{1k,2} \le x_{2k,2} \le \cdots \le x_{l_1 k,2}, \quad k = 1, \dots, l_2.$   (2.17)

Each grid point $P_{jk}(x_{jk}, y_{jk})$ is examined in two one-dimensional sub-training sets, one along each grid direction (denoted (2.18) and (2.19)), to which the good/bad distinction of Sec. 2.2 is applied; a bad point receives a reduced penalty factor $C_{jk}$ defined through the included angle $\theta_{jk}$ in (2.20) below, where $p > 0$ is a positive parameter. Combining SVM with this technique for selecting $C_{jk}$, we derive the following algorithm, which can be used to solve the freeform surface reconstruction problem.

Algorithm 2: S-SVM in the two-dimensional case.

(1) Given a training set (2.13)-(2.17);
(2) Select $\varepsilon \ge 0$, $C > 0$, $p > 0$, and a kernel function $K(x, x')$;
(3) For $j = 1, \dots, l_1$, $k = 1, \dots, l_2$, distinguish whether the point $P_{jk}(x_{jk}, y_{jk})$ is a bad point or a good point by considering it as a point of the training set (2.18) and of (2.19). If $P_{jk}$ is "a bad point" in either (2.18) or (2.19), the point $P_{jk}$ is said to be "a bad point"; otherwise it is said to be "a good point";
(4) For $j = 1, \dots, l_1$, $k = 1, \dots, l_2$, decide the $C_{jk}$ corresponding to $P_{jk}$: if $P_{jk}$ is "a good point", define $C_{jk} = C$; otherwise define

$C_{jk} = C\left(\dfrac{1 - \cos\theta_{jk}}{2}\right)^{p},$   (2.20)

where $\theta_{jk} = \max\{\theta^1_{jk}, \theta^2_{jk}\}$, $\theta^1_{jk}$ is the included angle between $P_{j-1,k}P_{jk}$ and $P_{jk}P_{j+1,k}$, $\theta^2_{jk}$ is the included angle between $P_{j,k-1}P_{jk}$ and $P_{jk}P_{j,k+1}$, and $p > 0$ is the positive parameter given in Step 2;
(5) Solve the optimization problem

$\min_{\alpha, \alpha^*} \ \frac{1}{2}\sum_{i,j=1}^{l_1}\sum_{k=1}^{l_2} (\alpha_{ik} - \alpha_{ik}^*)(\alpha_{jk} - \alpha_{jk}^*) K(x_{ik}, x_{jk}) + \varepsilon \sum_{i=1}^{l_1}\sum_{k=1}^{l_2}(\alpha_{ik} + \alpha_{ik}^*) - \sum_{i=1}^{l_1}\sum_{k=1}^{l_2} y_{ik}(\alpha_{ik} - \alpha_{ik}^*),$   (2.21)
s.t. $\sum_{i=1}^{l_1}\sum_{k=1}^{l_2} (\alpha_{ik} - \alpha_{ik}^*) = 0,$
     $0 \le \alpha_{ik}, \alpha_{ik}^* \le C_{ik}, \quad i = 1, \dots, l_1, \ k = 1, \dots, l_2,$

and get its solution $(\alpha, \alpha^*) = (\alpha_{11}, \alpha_{11}^*, \dots, \alpha_{1 l_2}, \alpha_{1 l_2}^*, \dots, \alpha_{l_1 l_2}, \alpha_{l_1 l_2}^*)$;
(6) Construct the decision function as

$f(x) = \sum_{i=1}^{l_1}\sum_{k=1}^{l_2} (\alpha_{ik} - \alpha_{ik}^*) K(x, x_{ik}) + b,$   (2.22)

where $b$ is determined by the KKT conditions.

Note that the parameter $p$ in the algorithm controls the degree to which the influence of the bad points is suppressed. Generally speaking, $p = 0$ implies that the bad points are not suppressed at all, and $p = +\infty$ implies that the bad points are deleted completely. For a practical problem, a suitable $p$ should be chosen.
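A sketch of Step (4) of Algorithm 2, computing the per-point penalty factors on a rectangular grid, follows. As in the one-dimensional sketch above, the angle convention (interior angle at $P_{jk}$) and the detection of bad points are assumptions; only formula (2.20) and the choice $\theta_{jk} = \max\{\theta^1_{jk}, \theta^2_{jk}\}$ are taken directly from the algorithm.

```python
# Sketch of Step (4) of Algorithm 2: per-point penalties C_jk on an l1 x l2 grid.
import numpy as np

def grid_angle(p_prev, p, p_next):
    """Included angle at P_jk between the two neighbouring grid segments (points in R^3)."""
    u, v = p_prev - p, p_next - p
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(c, -1.0, 1.0))

def penalty_grid(X1, X2, Y, is_bad, C, p):
    """X1, X2, Y: (l1, l2) arrays of grid coordinates and outputs.
    is_bad: (l1, l2) boolean mask from Step (3); the detection itself is not sketched here."""
    P = np.stack([X1, X2, Y], axis=-1)               # grid points viewed as points in R^3
    l1, l2 = Y.shape
    Cjk = np.full((l1, l2), C)
    for j in range(1, l1 - 1):
        for k in range(1, l2 - 1):
            if not is_bad[j, k]:
                continue
            th1 = grid_angle(P[j - 1, k], P[j, k], P[j + 1, k])   # along the first index
            th2 = grid_angle(P[j, k - 1], P[j, k], P[j, k + 1])   # along the second index
            theta = max(th1, th2)                                 # theta_jk = max(th1, th2)
            Cjk[j, k] = C * ((1.0 - np.cos(theta)) / 2.0) ** p    # Eq. (2.20)
    return Cjk
```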
3. Numerical Experiments

In this section we give several experiments with artificial data to evaluate the performance of our algorithms.

3.1. The one-dimensional example

Consider the half circle $y = \sqrt{1 - x^2}$, $x \in [-1, 1]$. The inputs are given by $x_i = -\frac{11}{10} + \frac{1}{10} i$, $i = 1, 2, \dots, 21$, and the corresponding outputs by $y_i = \sqrt{1 - x_i^2} + \xi_i$, where the noise $\xi_i$ obeys a normal distribution with $E\xi_i = 0$, $E\xi_i^2 = 0.1$.

Both SVM and our S-SVM with the Gaussian kernel are executed; the parameters $\sigma$, $C$, $\varepsilon$ and $p$ are given in Fig. 3. The regression curves obtained by the two methods are shown in Figs. 3(a) and 3(c). We have also calculated the absolute value of the curvature of both regression curves, shown in Figs. 3(b) and 3(d).

Fig. 3. (a) The regression curves with $\sigma = 0.5$, $C = 1000$, $\varepsilon = 0.1$; (b) absolute value of curvature with $\sigma = 0.5$, $C = 1000$, $\varepsilon = 0.1$; (c) the regression curves with $\sigma = 0.5$, $C = 1000$, $\varepsilon = 0.05$; (d) absolute value of curvature with $\sigma = 0.5$, $C = 1000$, $\varepsilon = 0.05$. (S-SVM uses $p = 37$ in (a), (b) and $p = 42$ in (c), (d).)

It is easy to see that the absolute value of the curvature corresponding to S-SVM is flatter than that corresponding to SVM, so the regression curves obtained by S-SVM are smoother than those obtained by SVM.

3.2. The two-dimensional example

Consider the function

$y = x_1^2 + x_2^2, \quad x \in [-1, 1] \times [-1, 1],$   (3.1)

where $x = (x_1, x_2)^T$. The training set

$T = \{(x_{jk}, y_{jk}), \ j, k = 1, \dots, 11\}$   (3.2)

is obtained from (3.1) with disturbance,

$y_{jk} = x_{jk1}^2 + x_{jk2}^2 + \xi_{jk}, \quad j, k = 1, \dots, 11,$

where $x_{jk} = (x_{jk1}, x_{jk2})^T$ with

$x_{jk1} = -\frac{6}{5} + \frac{1}{5} j, \quad x_{jk2} = -\frac{6}{5} + \frac{1}{5} k, \quad j, k = 1, \dots, 11,$

and the disturbance $\xi_{jk}$ obeys a normal distribution with $E\xi_{jk} = 0$, $E\xi_{jk}^2 = 0.2$.

For the training set (3.2), the S-SVM with the Gaussian kernel

$K(x, x') = \exp(-\|x - x'\|^2 / (2\sigma^2))$

is executed. The parameters $\sigma$, $C$ and $p$ are selected by minimizing the LOO (leave-one-out) error. For comparison, SVM is also executed with the same kernel and parameters. The regression surfaces obtained by both algorithms are shown in Fig. 4.

Fig. 4. (a) The regression surface with the contour lines for SVM, where $\sigma = 0.5$, $C = 100$, $\varepsilon = 0.1$; (b) the regression surface with the contour lines for S-SVM, where $\sigma = 0.5$, $C = 100$, $\varepsilon = 0.1$, $p = 12$; (c) the regression surface with the contour lines for SVM, where $\sigma = 0.5$, $C = 100$, $\varepsilon = 0.05$; (d) the regression surface with the contour lines for S-SVM, where $\sigma = 0.5$, $C = 100$, $\varepsilon = 0.05$, $p = 20$.

As another numerical example, we also consider a training set with a simpler disturbance,

$T = \{(x_{jk}, \tilde{y}_{jk}), \ j, k = 1, \dots, 11\},$   (3.3)

where $x_{jk} = (x_{jk1}, x_{jk2})^T$ and

$\tilde{y}_{jk} = \begin{cases} x_{jk1}^2 + x_{jk2}^2 + \xi_{jk}, & k = 6, \\ x_{jk1}^2 + x_{jk2}^2, & k \ne 6, \end{cases}$   (3.4)

and the disturbance $\xi_{jk}$ obeys a normal distribution with $E\xi_{jk} = 0$, $E\xi_{jk}^2 = 0.2$.

For the training set (3.3), both S-SVM and SVM with the Gaussian kernel are executed. The results obtained by both algorithms are shown in Fig. 5.

Fig. 5. (a) The sectional drawing of the regression surface ($x \in [-1, 1] \times [0, 1]$) with the contour lines for SVM, where $\sigma = 0.5$, $C = 100$, $\varepsilon = 0.1$; (b) the same for S-SVM, where $\sigma = 0.5$, $C = 100$, $\varepsilon = 0.1$, $p = 10$; (c) the same for SVM, where $\sigma = 0.5$, $C = 100$, $\varepsilon = 0.05$; (d) the same for S-SVM, where $\sigma = 0.5$, $C = 100$, $\varepsilon = 0.1$, $p = 20$.

We can see from Figs. 4 and 5 that the regression surfaces obtained by S-SVM are smoother than those obtained by SVM; the corresponding contour lines clearly support this conclusion as well.
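The smoothness comparison in Figs. 3(b) and 3(d) is based on the absolute value of the curvature of the fitted curves. A minimal sketch of how such a curve can be computed numerically follows (a plain finite-difference estimate on a dense grid; the paper does not state how the curvature was evaluated):

```python
# Absolute curvature |f''| / (1 + f'^2)^(3/2) of a fitted one-dimensional model,
# estimated by finite differences on a dense evaluation grid.
import numpy as np

def abs_curvature(predict, grid):
    y = predict(grid.reshape(-1, 1))
    dy = np.gradient(y, grid)
    d2y = np.gradient(dy, grid)
    return np.abs(d2y) / (1.0 + dy**2) ** 1.5

# Usage with the models fitted above (names as in the earlier sketches):
# grid = np.linspace(-1.0, 1.0, 401)
# kappa_svm  = abs_curvature(svr.predict,  grid)   # baseline SVM
# kappa_ssvm = abs_curvature(ssvm.predict, grid)   # S-SVM with per-point penalties
```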
4. Further Improvements

This paper proposes a new approach to the reconstruction of freeform curves and surfaces. It should be pointed out that further improvements in the two-dimensional case are possible:

(a) For the selection of the penalty factor $C_{jk}$, expressions other than (2.20) are also allowed. The key point is the monotonicity: any decreasing function with respect to $\theta_{jk}$ can be considered, for example $1 - \theta_{jk}$, or $(1 - \theta_{jk})^p$, where $p$ is a positive parameter.

(b) In order to examine whether a point $P_{jk}$ is a bad point, and its degree of badness, we have introduced two one-dimensional sub-problems. However, for dirty data it might be necessary to introduce more than two sub-problems. For example, we may consider extra one-dimensional sub-problems whose training-set inputs are chosen as $\dots, x_{j-1,k-1}, x_{j,k}, x_{j+1,k+1}, \dots$ and $\dots, x_{j-1,k-2}, x_{j,k}, x_{j+1,k+2}, \dots$, etc.

The performance and efficiency of the above strategies are under investigation.

Acknowledgment

This work is supported by the National Natural Science Foundation of China (No. 10371131). We would like to thank Professor Naiyang Deng, who shared his insights with us in discussions.

References

1. G. Hermann, Free-form shapes: An integrated CAD/CAM system, Comput. Ind. 5 (1984) 205-210.
2. W. Boehm, A survey of curve and surface methods in CAGD, Comput. Aided Geom. Des. 1 (1984) 1-60.
3. T. Varady, R. R. Martin and J. Cox, Reverse engineering of geometric models: an introduction, Comput. Aided Des. 4 (1997) 255-268.
4. C. Cortes and V. N. Vapnik, Support vector networks, Mach. Learn. 20 (1995) 273-297.
5. V. Vapnik, Statistical Learning Theory (John Wiley & Sons, New York, 1998).
6. N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines (Cambridge University Press, Cambridge, UK, 2000).
7. A. J. Smola and B. Schölkopf, A tutorial on support vector regression, NeuroCOLT Technical Report NC-TR-1998-030, Royal Holloway College, University of London, UK (1998).
8. N. Y. Deng and Y. J. Tian, The New Method in Data Mining: Support Vector Machine (Science Press, Beijing, China, 2004) (in Chinese).
9. B. Q. Su and D. Y. Liu, Computational Geometry (Shanghai Scientific and Technical Publishers, Shanghai, China, 1981) (in Chinese).
10. R. Fletcher, Practical Methods of Optimization (Wiley, New York, 1987).
11. J. A. K. Suykens, J. De Brabanter and L. Lukas, Weighted least squares support vector machines: Robustness and sparse approximation, Neurocomputing