雨课堂/学堂在线/学堂云《现代优化方法(英文)》(北京理工大学): Unit Test and Assessment Answers

Note: subjective questions are not included.

Question 1【Note】Click the question image to enlarge.【This item is not counted toward the grade.】

Practice Exercises
Question 1【Note】Click the question image to enlarge.【This item is not counted toward the grade.】
Questions 2-11: the question text and answer options are given only as images (options A-D; Question 4 has options A-C).

【Homework Instructions】
Question 1【Note】Click the question image to enlarge.【This item is not counted toward the grade.】

Homework
Question 1. What is the wrong claim?
A. A local minimizer is also a global minimizer for convex optimization.
B. An isolated local minimizer is also a strict local minimizer.
C. A global minimizer is also a local minimizer for unconstrained optimization.
D. A strict local minimizer is also an isolated local minimizer.

Question 2. (Question text given as an image.)
A. local minimizers.
B. stationary points.
C. global minimizers.
D. isolated local minimizers.

Question 3. Which of the following claims is not correct for unconstrained convex optimization?
A. The first-order necessary condition is also the sufficient condition for x to be a minimizer.
B. If x is a local minimizer, it is also a global minimizer.
C. If x is a local minimizer, it is also a strict local minimizer.
D. Any convex combination of two global minimizers is also a global minimizer.

Question 4. (Question text given as an image.)
A. trust region strategy
B. exact line search
C. inexact line search
D. steepest descent direction

Question 5. (Question text and options A-D given as images.)

Question 6. (Question text given as an image.)
A. linear
B. sublinear
C. superlinear
D. quadratic
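Questions 1 and 3 of this homework rest on standard facts about convex functions; the following reminder is added here for reference and is not part of the original answer key. For a differentiable convex objective f, the first-order condition is both necessary and sufficient for global optimality:

    \nabla f(x^*) = 0 \quad\Longleftrightarrow\quad x^* \text{ is a global minimizer of } f.

In particular, every local minimizer of a convex function is a global minimizer, and the set of global minimizers is convex, so any convex combination of two global minimizers is again a global minimizer. By contrast, a strict local minimizer of a general (nonconvex) function need not be an isolated local minimizer.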

【Homework Instructions】
Question 1【Note】Click the question image to enlarge.【This item is not counted toward the grade.】

Homework
Question 1. Which of the following step lengths is not an inexact step length?
A. backtracking
B. Wolfe condition
C. strong Wolfe condition
D. (option given as an image)

Question 2. Which conditions are the Wolfe conditions? (multiple choices)
(Options A-D given as images.)
Correct answer: AB

Question 3. Steepest descent method with ________ line search is globally convergent. (multiple choices)
A. Wolfe
B. strong Wolfe
C. Goldstein
D. backtracking
Correct answer: CBA

Question 4. Under certain conditions, the convergence rate for Newton's method is
A. linear
B. sublinear
C. superlinear
D. quadratic

Question 5. The convergence rate for the steepest descent method under certain conditions is at best
A. linear
B. sublinear
C. superlinear
D. quadratic

Question 6. There may be a ________ phenomenon for the steepest descent method toward solutions.
A. Zigzag
B. fast convergence
C. ellipsoids
D. increasing

Question 7. (Question text and options A-D given as images.)
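The statements asked for in Question 2 appear only as images in the source. For reference, in standard textbook notation (added here, not copied from the images), the Wolfe conditions for a step length α along a descent direction p_k consist of the sufficient-decrease (Armijo) condition and the curvature condition:

    f(x_k + \alpha p_k) \le f(x_k) + c_1 \alpha \nabla f(x_k)^T p_k,
    \nabla f(x_k + \alpha p_k)^T p_k \ge c_2 \nabla f(x_k)^T p_k,
    \qquad 0 < c_1 < c_2 < 1.

The strong Wolfe conditions replace the curvature condition with |\nabla f(x_k + \alpha p_k)^T p_k| \le c_2 |\nabla f(x_k)^T p_k|, while plain backtracking enforces only the sufficient-decrease condition.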

【Homework Instructions】
Question 1【Note】Click the question image to enlarge.【This item is not counted toward the grade.】

Homework
Question 1. Which is not the key point in trust region methods?
A. the size of the trust region
B. the model function m_k
C. the step p_k
D. sufficient decrease

Question 2. Which methods can be used to solve the subproblem of the trust region method? (multiple choices)
A. Dogleg method
B. Two-dimensional subspace minimization
C. Cauchy point
D. KKT condition
Correct answer: ABC

Question 3. (Question text and options A-D given as images.)

Question 4. (Question text given as an image.)
A. (option given as an image)
B. (option given as an image)
C. We may not make great progress at the current iteration.
D. We may make great progress at the current iteration.

Question 5. (Question text given as an image.)
A. shrink the trust region radius
B. stay at the current iteration
C. reproduce the model function and solve the subproblem within a smaller trust region
D. enlarge the trust region radius
Correct answer: CBA

Question 6. The Cauchy point can be viewed as
A. steepest descent method with a special step length
B. Dogleg method
C. Newton's method
D. two-dimensional subspace minimization method

Question 7. In the dogleg method, which direction is not involved?
A. steepest descent direction
B. Newton's direction
C. the combination of the steepest descent direction and Newton's direction
D. conjugate direction

Question 8. (Question text given as an image.)
A. linearly
B. sublinearly
C. superlinearly
D. quadratically
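Questions 2, 6, and 7 of this homework refer to the dogleg method and the Cauchy point for the trust-region subproblem min_p g^T p + (1/2) p^T B p subject to ||p|| <= Delta. The sketch below is an illustration added here (it is not taken from the course materials); the function name dogleg_step is arbitrary, and the code assumes B is symmetric positive definite so that the full Newton step exists.

    import numpy as np

    def dogleg_step(g, B, delta):
        """Approximately solve min_p g@p + 0.5*p@B@p subject to ||p|| <= delta."""
        # Minimizer of the model along the steepest descent direction -g.
        pU = -(g @ g) / (g @ B @ g) * g
        # Full Newton step, the end point of the dogleg path.
        pB = -np.linalg.solve(B, g)
        if np.linalg.norm(pB) <= delta:
            return pB                                 # Newton step lies inside the region
        if np.linalg.norm(pU) >= delta:
            return delta * pU / np.linalg.norm(pU)    # truncate the first leg at the boundary
        # Otherwise find tau in [0, 1] with ||pU + tau*(pB - pU)|| = delta.
        d = pB - pU
        a, b, c = d @ d, 2 * (pU @ d), pU @ pU - delta**2
        tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
        return pU + tau * d

As Question 7 suggests, only the steepest descent direction, the Newton direction, and their combination appear in the dogleg path; no conjugate direction is involved. For example, dogleg_step(np.array([1.0, 2.0]), np.eye(2), 0.5) returns a step of norm 0.5 on the trust-region boundary.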

【Homework Instructions】
Question 1【Note】Click the question image to enlarge.【This item is not counted toward the grade.】

Homework
Question 1. Any set of vectors satisfying conjugacy is also ________.
A. linearly independent
B. orthogonal
C. linearly dependent
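For reference (an argument added here, not part of the original answer key): if the nonzero vectors p_1, ..., p_m are conjugate with respect to a symmetric positive definite matrix A, i.e. p_i^T A p_j = 0 for i \ne j, then they are linearly independent, because

    \sum_i c_i p_i = 0 \;\Longrightarrow\; p_j^T A \Big(\sum_i c_i p_i\Big) = c_j\, p_j^T A p_j = 0 \;\Longrightarrow\; c_j = 0 \quad \text{for every } j,

using p_j^T A p_j > 0. Conjugacy does not by itself imply orthogonality in the usual inner product.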

Question 2. Which of the following statements is not correct?
A. (option given as an image)
B. (option given as an image)
C. When applied to solving a quadratic convex optimization problem, the different versions of nonlinear conjugate gradient methods are reduced to the same linear conjugate gradient method.
D. When applied to solving a quadratic convex optimization problem, the different versions of nonlinear conjugate gradient methods are still different.
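A note added here for reference (not part of the original answer key): the nonlinear conjugate gradient variants differ only in the scalar \beta_{k+1} used to build the new search direction, e.g.

    \beta_{k+1}^{FR} = \frac{\nabla f_{k+1}^T \nabla f_{k+1}}{\nabla f_k^T \nabla f_k},
    \qquad
    \beta_{k+1}^{PRP} = \frac{\nabla f_{k+1}^T\,(\nabla f_{k+1} - \nabla f_k)}{\nabla f_k^T \nabla f_k}.

On a strictly convex quadratic with exact line search, successive gradients are mutually orthogonal, so \nabla f_{k+1}^T \nabla f_k = 0 and the two formulas coincide; this is why the variants all reduce to the linear conjugate gradient method in that case.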

Question 3. (Question text and options A-D given as images.)

Question 4. For the linear conjugate gradient method with exact line search, which is not true?
(Options A-D given as images.)

Question 5. (Question text given as an image.)
A. Krylov subspace
B. hyperplane
C. orthogonal subspace
D. range space

Question 6. (Question text and options A-D given as images.)

Question 7. (Question text and options A-D given as images.)

Question 8. (Question text given as an image.)
A. (option given as an image)
B. (option given as an image)
C. Different variants, including the FR method and the PRP method, lead to different iteration sequences.
D. Different variants, including the FR method and the PRP method, can terminate within finitely many steps.
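Questions 4, 5, and 8 of this homework concern the linear conjugate gradient method for Ax = b with A symmetric positive definite. The sketch below is added here as an illustration (the function name conjugate_gradient and its parameters are not taken from the course materials). In exact arithmetic it terminates in at most n steps for an n-by-n system, and the k-th iterate minimizes the quadratic over x_0 plus the Krylov subspace span{r_0, A r_0, ..., A^{k-1} r_0}.

    import numpy as np

    def conjugate_gradient(A, b, x0=None, tol=1e-10):
        """Linear conjugate gradient for Ax = b with A symmetric positive definite."""
        n = b.shape[0]
        x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
        r = A @ x - b                 # residual = gradient of 0.5*x@A@x - b@x
        p = -r                        # first direction: steepest descent
        for _ in range(n):
            if np.linalg.norm(r) <= tol:
                break
            Ap = A @ p
            alpha = (r @ r) / (p @ Ap)         # exact line search along p
            x = x + alpha * p
            r_new = r + alpha * Ap
            beta = (r_new @ r_new) / (r @ r)   # makes the new direction A-conjugate to p
            p = -r_new + beta * p
            r = r_new
        return x

For example, conjugate_gradient(np.array([[4.0, 1.0], [1.0, 3.0]]), np.array([1.0, 2.0])) solves this 2-by-2 system in two iterations.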

【Homework Instructions】
Question 1【Note】Click the question image to enlarge.【This item is not counted toward the grade.】

Homework
Question 1. (Question text and options A-D given as images.)

Question 2. (Question text and options A-D given as images.)

Question 3. (Question text given as an image.)
A. It is strongly semismooth.
B. It is semismooth.
C. It is a piecewise linear function.
D. It is a smooth function.

Question 4. (Question text and options A-D given as images.)
Correct answer: BAC

Question 5. Support vector machine can be divided into _______
A. Support vector classification and support vector regression
B. Support vector classification and clustering
C. Support vector regression and clustering
D. Deep learning and reinforcement learning

Question 6. The traditional methods for support vector classification include ________ (multiple choices)
A. TRON
B. DCD
C. Coordinate descent method for the primal problem
D. Modified Newton's method
Correct answer: DCBA

Question 7. One can apply semismooth Newton's method to solve _________ (multiple choices)
A. L2-loss SVC
B. ε-L2 SVR
C. L1-loss SVC
D. Logistic-loss SVC
Correct answer: BA

Question 8. When the globalized version of semismooth Newton's method is applied to solve L2-loss SVC, we have the following results ________ (multiple choices)
A. The sequence converges globally to the optimal solution of L2-loss SVC
B. The local convergence rate is quadratic
C. Every matrix in Clarke's generalized Jacobian is positive definite
D. The objective function in L2-loss SVC is strongly semismooth
Correct answer: DCBA
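Questions 7 and 8 refer to the L2-loss support vector classification (SVC) model. For reference (added here; the exact formulation used in the course may differ in scaling or in how the bias term is handled), the primal L2-loss SVC problem for training data (x_i, y_i) with y_i \in \{-1, +1\} is

    \min_{w} \; \tfrac{1}{2}\|w\|^2 + C \sum_{i=1}^{m} \max\bigl(0,\; 1 - y_i\, w^T x_i\bigr)^2 .

The squared hinge term is once but not twice continuously differentiable, and the gradient of the objective is piecewise linear in w, hence strongly semismooth; this is what makes a globalized semismooth Newton method applicable, with a locally quadratic convergence rate under suitable regularity conditions.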

【Homework Instructions】
Question 1【Note】Click the question image to enlarge.【This item is not counted toward the grade.】

Homework
Question 1. Which of the following statements is not correct for constrained optimization?
A. Constraints may make it very difficult to find a local solution.
B. Constraints may also make it difficult to identify local solutions and global solutions.
C. Isolated local solutions are strict local solutions.
D. Strict local solutions are isolated local solutions.

Question 2. (Question text and options A-D given as images.)

Question 3. (Question text and options A-C given as images.)

Question 4. (Question text and options A-C given as images.)

Question 5. (Question text and options A-D given as images.)

Question 6. If the linearly independent constraint qualification holds at x, then
A. The linearized feasible direction coincides with the tangent cone at x with respect to the feasible set.
B. The critical cone is the same as the linearized feasible direction.
C. Other constraint qualifications also hold.
D. The first-order optimality conditions hold.

Question 7. (Question text given as an image.)
A. {1,2}
B. {1,3}
C. {2,3}
D. {1,2,3}

Question 8. (Question text and options A-D given as images.)

Question 9. What is the relation between the linearized feasible direction and the critical cone?
A. The critical cone is a subset (the available preview of the source document ends here)
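Questions 6 and 9 involve constraint qualifications, the first-order optimality (KKT) conditions, and the critical cone. As a reference added here in generic notation (not copied from the course images), for the problem min f(x) subject to c_i(x) = 0 for i \in E and c_i(x) \ge 0 for i \in I, with Lagrangian L(x, \lambda) = f(x) - \sum_i \lambda_i c_i(x), the KKT conditions at x^* with multipliers \lambda^* are

    \nabla_x L(x^*, \lambda^*) = \nabla f(x^*) - \sum_{i \in E \cup I} \lambda_i^* \nabla c_i(x^*) = 0,
    c_i(x^*) = 0 \;(i \in E), \qquad c_i(x^*) \ge 0 \;(i \in I),
    \lambda_i^* \ge 0 \;(i \in I), \qquad \lambda_i^*\, c_i(x^*) = 0 \;(i \in E \cup I).

When the linear independence constraint qualification holds at a local solution x^*, the set of linearized feasible directions coincides with the tangent cone of the feasible set and the KKT conditions are necessary for optimality. The critical cone consists of those linearized feasible directions d along which the first-order term \nabla f(x^*)^T d vanishes; in particular, the critical cone is a subset of the set of linearized feasible directions.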

