Optimization on Graphs Part II: Technical Models

Classical Solution to the Graph Matching Problem
- Graph matching (GM) seeks a node correspondence maximizing node-node and edge-edge similarity; the problem is NP-hard.
- Classical GM pipeline: SIFT feature extractor → compute node and edge similarity → classical algorithm → approximate solution → matching result.
- The similarity is computed by a fixed method, e.g. a Gaussian kernel function (see the sketch below).
- Limitations: limited representation capability of SIFT; limited capacity due to the fixed similarity; limited performance of the classical algorithm.

Learning GM by a Graph Embedding Model (ICCV 19 / TPAMI 20)
- Deep graph embedding GM pipeline: CNN feature extractor → intra- + cross-graph GNN embedding → Sinkhorn algorithm → matching result.
- The node similarity is learned, enabling differentiable end-to-end learning.

[1] Combinatorial Learning of Robust Deep Graph Matching: an Embedding based Approach, TPAMI 2020.
[2] Learning Combinatorial Embedding Networks for Deep Graph Matching, ICCV 2019.
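To make the fixed-similarity step concrete, here is a minimal sketch of a Gaussian-kernel node affinity between two sets of SIFT-like descriptors; the array shapes and the bandwidth sigma are illustrative choices, not the slides' exact settings.

import numpy as np

def gaussian_node_affinity(feat1, feat2, sigma=1.0):
    """Gaussian-kernel similarity between two node feature sets.
    feat1: (n1, d) descriptors of graph 1 (e.g. SIFT features)
    feat2: (n2, d) descriptors of graph 2
    Returns an (n1, n2) node-to-node similarity matrix."""
    # Pairwise squared Euclidean distances via broadcasting.
    diff = feat1[:, None, :] - feat2[None, :, :]
    dist2 = (diff ** 2).sum(-1)
    return np.exp(-dist2 / (2 * sigma ** 2))

# Toy usage: 5 and 6 nodes with 128-dim SIFT-like descriptors.
rng = np.random.default_rng(0)
S = gaussian_node_affinity(rng.random((5, 128)), rng.random((6, 128)))
print(S.shape)  # (5, 6)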
Learning GM by a Graph Embedding Model (ICCV 19 / TPAMI 20)
- Intra-graph node embedding: a graph convolutional network; the update step iterates over all nodes.
- Cross-graph node embedding: calculate similarity → compute cross-graph weights by Sinkhorn.
- Advantages of graph embedding: it includes graph structure information (compared with node matching) and reduces complexity from NP-hard to O(N^3)-solvable (compared with graph matching).
- Sinkhorn: a differentiable solver for the (relaxed) linear assignment problem.

How to invoke: pip install pygmtools (numpy, pytorch, paddle and jittor backends are already supported; tensorflow and mindspore support is planned).

>>> import numpy as np
>>> import torch
>>> import pygmtools as pygm
>>> pygm.BACKEND = 'pytorch'
>>> np.random.seed(0)
>>> # 2-dimensional (non-batched) input
>>> s_2d = torch.from_numpy(np.random.rand(5, 5))
>>> s_2d
tensor([[0.5488, 0.7152, 0.6028, 0.5449, 0.4237],
        [0.6459, 0.4376, 0.8918, 0.9637, 0.3834],
        [0.7917, 0.5289, 0.5680, 0.9256, 0.0710],
        [0.0871, 0.0202, 0.8326, 0.7782, 0.8700],
        [0.9786, 0.7992, 0.4615, 0.7805, 0.1183]])
>>> x = pygm.sinkhorn(s_2d)
>>> x
tensor([[0.1888, 0.2499, 0.1920, 0.1603, 0.2089],
        [0.1895, 0.1724, 0.2335, 0.2219, 0.1827],
        [0.2371, 0.2043, 0.1827, 0.2311, 0.1447],
        [0.1173, 0.1230, 0.2382, 0.1996, 0.3219],
        [0.2673, 0.2504, 0.1536, 0.1869, 0.1418]])
>>> print('row_sum:', x.sum(1), 'col_sum:', x.sum(0))
row_sum: tensor([1.0000, 1.0000, 1.0000, 1.0000, 1.0000]) col_sum: tensor([1.0000, 1.0000, 1.0000, 1.0000, 1.0000])
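The Sinkhorn output is doubly stochastic rather than discrete; to obtain a hard matching, one typically projects it onto a permutation matrix, which pygmtools exposes as a Hungarian solver. Continuing the session above:

>>> x_discrete = pygm.hungarian(x)  # project onto a 0/1 permutation matrix
>>> x_discrete.sum()                # exactly one match per row and column
tensor(5., dtype=torch.float64)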
Learning GM by a Graph Embedding Model (ICCV 19 / TPAMI 20)
Permutation loss: the matching problem can be viewed as a binary classification problem for each element of the assignment matrix. Compared with the regression-based offset loss used in earlier work, the permutation loss better captures the combinatorial optimization nature of graph matching.
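Concretely, the permutation loss is the binary cross-entropy between the (doubly-stochastic) predicted matching and the 0/1 ground-truth permutation. A minimal PyTorch sketch follows; shapes and values are illustrative, and pygmtools ships a comparable utility.

import torch
import torch.nn.functional as F

def permutation_loss(s_pred, perm_gt):
    """Binary cross-entropy between a doubly-stochastic matching
    prediction s_pred and a 0/1 ground-truth permutation perm_gt."""
    return F.binary_cross_entropy(s_pred, perm_gt, reduction='mean')

# Toy usage: a 4x4 Sinkhorn output scored against the identity matching.
s_pred = torch.full((4, 4), 0.1)
s_pred.fill_diagonal_(0.7)
print(permutation_loss(s_pred, torch.eye(4)))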
Matching results on Pascal VOC:

Model     | CNN   | GM Formulation                         | Loss Func | Matching Acc
GMN       | VGG16 | Classical GM (Zanfir et al. CVPR 2018) | OffsLoss  | 55.3
GMN-PL    | VGG16 | Classical GM (Zanfir et al. CVPR 2018) | PermLoss  | 57.9
PIA-GM-OL | VGG16 | Intra-graph GNN                        | OffsLoss  | 61.6
PIA-GM    | VGG16 | Intra-graph GNN                        | PermLoss  | 63.0
PCA-GM    | VGG16 | Intra- + Cross-graph GNN               | PermLoss  | 63.8

Permutation loss > offset loss, and intra- + cross-graph GNN > intra-graph GNN > classical GM. The model also has the capability to transfer across categories.

GM is equivalent to node classification on an association graph: node 1 matching node a in the original graphs corresponds to labeling association-graph node (1, a) with x_{1a} = 1. A GM solver is therefore exactly a node classifier on the association graph, so GNNs that excel at node classification can naturally serve as graph matching solvers.
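To make the association graph concrete: its nodes are candidate pairs (i, a), and two of them are adjacent exactly when both underlying edges exist, i.e. the association adjacency is the Kronecker product of the two graphs' adjacencies. A minimal sketch on toy unweighted graphs (real solvers attach edge affinities to these association edges):

import numpy as np

def association_adjacency(A1, A2):
    """Adjacency of the association graph: node (i, a) pairs node i of
    graph 1 with node a of graph 2; (i, a) ~ (j, b) iff i~j and a~b."""
    return np.kron(A1, A2)  # shape (n1*n2, n1*n2)

A1 = np.array([[0, 1], [1, 0]])                    # 2-node graph
A2 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])   # 3-node path
A = association_adjacency(A1, A2)
print(A.shape)  # (6, 6); entry ((i,a),(j,b)) = A1[i,j] * A2[a,b]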
Learning GM Solvers (TPAMI 2021)
- Introduce SplineConv (Fey et al., CVPR 18) to encode geometric information at the feature level; devise kernel functions associated with the CNN (the CNN extractor is optional).
- Learning solver: a GCN is adopted for node classification on the association graph (i.e., for solving GM), with matching constraints considered inside the GNN before outputting the matching result.

Pascal VOC Dataset                     | Matching Acc
Previous SOTA (Rolinek et al. ECCV 20) | 79.0
NGMv2 (ours)                           | 80.1

Extend to multi-graph matching: adopt the permutation synchronization technique, computing the leading N eigenvectors (N = 3); see the sketch after the results table below. [Pachauri et al., Solving the multi-way matching problem by permutation synchronization, NIPS 2013]

Extend to hypergraph matching: Neural Graph Matching (NGM) solves GM with 1st/2nd-order features on an association graph, updating features along edges (matching acc 80.1); Neural Hypergraph Matching (NHGM) solves hypergraph matching with high-order features on an association hypergraph, updating features along hyperedges (matching acc 80.7).

MGM Test Dataset | Matching Acc
NGMv2 (2GM)      | 97.5
NHGMv2 (HGM)     | 97.8
NMGMv2 (MGM)     | 98.2
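A minimal sketch of the permutation synchronization step referenced above (Pachauri et al., NIPS 2013): stack the pairwise matchings into a block matrix, take its leading n eigenvectors, and round each block back to a permutation. scipy's linear_sum_assignment stands in for the Hungarian rounding, and the function name is illustrative.

import numpy as np
from scipy.optimize import linear_sum_assignment

def synchronize(P_pairwise, m, n):
    """P_pairwise: (m*n, m*n) block matrix whose (i, j) block is the
    n x n pairwise matching between graphs i and j, with identity on
    the diagonal and P_ji = P_ij^T (so the matrix is symmetric).
    Returns cycle-consistent matchings of every graph to graph 0."""
    vals, vecs = np.linalg.eigh(P_pairwise)
    U = vecs[:, -n:]                      # leading n eigenvectors, (m*n, n)
    blocks = U.reshape(m, n, n)
    consistent = []
    for i in range(m):
        # Project U_i U_0^T onto the nearest permutation (Hungarian step).
        M = blocks[i] @ blocks[0].T
        r, c = linear_sum_assignment(-M)  # maximize similarity
        P = np.zeros((n, n))
        P[r, c] = 1
        consistent.append(P)
    return consistent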
Learning GM Solvers (TPAMI 2021)
The full pipeline for the Quadratic Assignment Problem (QAP): association graph construction → GNN node classifier → doubly-stochastic iteration → solution. Test dataset: QAPLIB (https://www.opt.math.tugraz.at/qaplib/).

On the objective (loss func, lower is better), the method is compared with SOTA baselines: a Sinkhorn-based algorithm (SIAM J. Imaging Sci. 2019), an annealing algorithm (ECCV 10), and a learning algorithm (CVPR 18 Best Paper Honorable Mention). Compared with Gurobi on running time (log scale), the proposed learning algorithm (TPAMI 21) achieves a 1631x acceleration with GPU computing and a 431x acceleration with CPU computing (hardware: Intel(R) Xeon(R) W-3175X CPU @ 3.10 GHz, NVIDIA RTX 8000).

Neural Graph Matching Network: Learning Lawler's Quadratic Assignment Problem with Extensions to Hypergraph and Multi-graph Matching, TPAMI 2021.

Solving Combinatorial Optimization over Graphs by a General Bi-level ML Framework (NeurIPS 2021)
A Bi-level Framework for Learning to Solve Combinatorial Optimization over Graphs, NeurIPS 2021.
Existing papers model combinatorial optimization (CO) problems over graphs with reinforcement learning, mapping the decision variable to an action sequence, the objective function to the reward, and the constraints to the feasible domain of actions. However: a larger scale means a longer action sequence; the reward is sparse and hard to converge on; and assuming adequate model capacity to learn an NP-hard problem makes the model hard to devise.

This work resorts to a classic idea: modify the original problem to aid problem solving, as cutting planes do for integer programming. Here the modification is to the graph structure itself: adding edges. (Code is available; see the GitHub repo.)

Proposed bi-level optimization formulation:
- Upper level: a reinforcement learning agent adaptively modifies the graphs.
- Lower level: the decision variables are optimized by heuristics.
Whenever the upper-level RL agent modifies the graph structure, the lower-level heuristic is invoked. The upper-level optimizer is an RL action network trained by PPO; the lower-level optimizer is a classical heuristic algorithm. A schematic sketch follows.
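A schematic sketch of this bi-level loop under stated assumptions: `policy` stands in for the PPO-trained action network and `heuristic` for the problem-specific solver (scheduling, GED, or Hamiltonian-cycle heuristics); both are placeholders, and the dummy usage is purely illustrative.

import random
import networkx as nx

def bilevel_episode(graph, policy, heuristic, steps=10):
    """Upper level: an RL policy edits the graph (here: adds edges).
    Lower level: a classical heuristic solves the edited instance.
    The improvement of the objective serves as the reward."""
    best = heuristic(graph)               # objective on the raw instance
    trajectory = []
    for _ in range(steps):
        u, v = policy(graph)              # upper level: pick an edge to add
        graph.add_edge(u, v)              # modify the problem instance
        value = heuristic(graph)          # lower level: solve modified instance
        reward = best - value             # improvement over the incumbent
        trajectory.append(((u, v), reward))
        best = min(best, value)
    return best, trajectory               # trajectory would feed PPO training

# Dummy usage with stand-ins: a random policy and edge count as "objective".
g = nx.gnp_random_graph(8, 0.3, seed=0)
best, traj = bilevel_episode(
    g,
    policy=lambda G: random.sample(list(G.nodes), 2),
    heuristic=lambda G: G.number_of_edges(),
)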
A General Framework for Different Graph Theory Problems: (a) DAG scheduling, (b) the GED problem, (c) Hamiltonian cycle.

DAG scheduling time on the TPC-H dataset (lower is better):
TPC-H Dataset | Customized | General | Improvement
50 DAGs       | 9821       | 8906    | 9.3%
100 DAGs      | 16914      | 15193   | 10.2%
150 DAGs      | 24429      | 22371   | 8.4%

GED on the AIDS dataset (lower is better):
AIDS Dataset | Customized | General | Improvement
20-30 nodes  | 37.4       | 29.1    | 22.2%
30-50 nodes  | 70.4       | 61.1    | 13.2%
50+ nodes    | 101.9      | 77.0    | 24.4%

Hamiltonian cycle accuracy on the FHCP dataset (higher is better):
FHCP Dataset  | Customized | General | Improvement
500-600 nodes | 20         | 25      | 25%

Optimization on Graphs Part I: Outlook and Applications
Background and motivation; graph construction and optimization; PCB placement. There is half a century of research on the classical graph matching problem, and 50 years of research on single graph theory problems (TSP, GCP, MFP, MCP) by researchers such as Robert Bixby (Prof. at Rice), Jerome Chailloux (Asst. Prof. at IP Paris) and Erling Andersen (Asst. Prof. at SDU); yet ever more graph-theory-problem challenges arise in real-world scenarios.

A quadrant view (continuous vs. discrete, unconstrained vs. constrained) places perceptual machine learning models such as (deep) neural networks, logistic regression, PCA, decision trees, Gaussian mixture models, support vector machines, naïve Bayes and probabilistic topic models on the continuous side, while graph theory and CO problems (clustering, graph cut, path planning, matching, ...) are discrete and constrained. Perceptual problems work on matrix/sequence data (continuous, unconstrained); graph theory problems work on graph structure (discrete, constrained): a different problem setting and data form that calls for general NN models for CO and for methodological innovation.

Background of research: the CVPR'22 Best Paper "Learning to Solve Hard Minimal Problems"; an EJOR'21 survey by Prof. Bengio on machine learning for CO; NSF's $20 million investment in an optimization-focused AI research institute. Classical solvers occupy a comfort zone of relative stability and moderate size with exact certainty on classical problems; open scenarios bring the four V's (velocity, volume, variety, veracity): large size, rapid change, multiple forms and more uncertainty, which motivates machine learning.

Graph Matching Based Model Fusion (ICML 22)
Pipeline: input models → model alignment → model fusion → output model.
Deep Neural Network Fusion via Graph Matching with Applications to Model Ensemble and Federated Learning, ICML 22. Code available at "/Thinklab-SJTU/GAMF" on GitHub.

Model ensemble: prediction-based model ensemble needs to maintain all individual models, whereas fusion-based model ensemble needs to maintain only one model.

Federated learning (FL) pipeline, with clients 1..N and a global server: efficiently aggregate local models by model fusion. [Li, Q., He, B., and Song, D. Model-contrastive federated learning. CVPR, 2021.]
The FL loop: 1) the global server sends the global model to each local client; 2) each client trains the local model on its own dataset; 3) the local clients send the local models back to the global server; 4) the global server gathers all local models and merges them into a shared global model.

Neural channel graph: each node is a channel and each edge carries a similarity; the channels to be matched are scored by weight similarity.

Challenge: problem scale. Model fusion must align large common NNs, with up to 1024 channels per layer and a total number of channels exceeding 10,000, whereas graphs in commonly used graph matching datasets have fewer than 100 keypoints, which differs significantly from the requirements of model fusion.

Structure of the permutation matrix P: the input and output layers are fixed nodes, while the hidden layers of Model 1 and Model 2 are matched nodes, with one permutation block per layer.

Graduated assignment (see the sketch below):
- Select 3 adjacent layers at a time.
- Fix the front and back layers.
- Update the permutation matrix of the middle layer.
- Iterate until convergence.
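A simplified sketch of the sliding-window idea in plain weight-matching form: with the neighbouring layers' permutations held fixed, the middle layer's channel similarity is assembled from its incoming and outgoing weights and rounded to a permutation with the Hungarian algorithm. This is only a stand-in for the paper's graduated assignment (which anneals a soft assignment instead); the MLP setting and all names are illustrative.

import numpy as np
from scipy.optimize import linear_sum_assignment

def align_channels(W1, W2, iters=10):
    """W1, W2: lists of (out_ch, in_ch) weight matrices of two models
    with identical architecture. P[l] permutes the channels of layer l
    of model 2; the input and output layers stay fixed (identity)."""
    L = len(W1)
    P = [np.eye(W1[l].shape[0]) for l in range(L)]   # P[L-1] stays identity
    for _ in range(iters):
        for l in range(L - 1):                        # hidden layers only
            P_in = P[l - 1] if l > 0 else np.eye(W1[l].shape[1])
            # Channel similarity from incoming and outgoing weights,
            # with the neighbouring layers' permutations held fixed.
            S = W1[l] @ P_in @ W2[l].T + W1[l + 1].T @ P[l + 1] @ W2[l + 1]
            r, c = linear_sum_assignment(-S)          # maximize similarity
            P[l] = np.zeros_like(S)
            P[l][r, c] = 1
    return P
    # Fusing layer l afterwards could average aligned weights, e.g.
    # (W1[l] + P[l] @ W2[l] @ P_in.T) / 2.

# Toy usage: two random 3-layer MLPs (weights only, biases omitted).
rng = np.random.default_rng(0)
W1 = [rng.standard_normal(s) for s in [(16, 8), (16, 16), (4, 16)]]
W2 = [rng.standard_normal(s) for s in [(16, 8), (16, 16), (4, 16)]]
P = align_channels(W1, W2)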
Graph Optimization Problem of Placement and Routing
DeepPR (NeurIPS 21) explores solving placement and routing via machine learning, as an alternative to classical algorithms: RL places macros, DL places standard cells, and RL combined with a classical algorithm routes, with the reward function closing the loop (a variant uses RL to place macros and a classical algorithm to place standard cells). On the ISPD 2005 dataset, wirelength drops by 8%.
On Joint Learning for Solving Placement and Routing in Chip Design, NeurIPS 2021.

Against this background, a cyclic placement and routing model is proposed to minimize wirelength.

Architecture of the generative routing model: the generator is composed of a basic generator for input sizes of 64×64 or below and an extension for input sizes larger than 64×64. The discriminator consists of two sub-discriminators that judge routes for validity and realness.
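Since wirelength is the quantity these models minimize, here is a minimal sketch of half-perimeter wirelength (HPWL), the standard placement-stage estimate of routed wirelength; the net-list format is an illustrative assumption.

def hpwl(nets):
    """Half-perimeter wirelength: for each net (a list of (x, y) pin
    positions), the half-perimeter of its bounding box; summed over
    all nets. A standard estimate of routed wirelength in placement."""
    total = 0.0
    for pins in nets:
        xs = [x for x, _ in pins]
        ys = [y for _, y in pins]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

# Two toy nets connecting three and two cells.
print(hpwl([[(0, 0), (3, 4), (1, 1)], [(2, 2), (5, 2)]]))  # 7 + 3 = 10.0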
Formulation of mixed-size placement, and a neural macro placement and routing pipeline: combining the RL-based model for learning mixed-size macro placement with the one-shot generative routing network introduced above yields a purely neural pipeline for macro placement and routing. Inspired by the EM algorithm, the generative router is first updated using placement results from the mixed-size agent (similar to the E step); then the placement and net-order agents are learned jointly in a single reinforcement learning framework to minimize the wirelength estimated by the trained generative model (corresponding to the M step).
The Policy-gradient Placement and Generative Routing Neural Networks for Chip Design, NeurIPS 2022.

Outlook on Typical Paradigms
- Paradigm 1: differentiable learning to improve overall front- and back-end agility. The front-end perception (machine learning) and back-end decision (classical algorithm/solver) are fused into an end-to-end, differentiable pipeline with hard example mining, from input to output.
- Paradigm 2: multi-task, distributed, self-supervised learning to improve generalizability, pairing a generator with an optimisation solver.
Self-supervised Learning: contrastive learning and reconstruction-based learning
Self-supervised learning proceeds by pretext tasks; contrastive-based learning was popular roughly 2015-2019, and generative & reconstruction-based learning from 2020-2022 onward. [SJTU Deep Learning Lecture]

Contrastive learning (InfoNCE, 2018): attract the features of positive samples and repel the features of negative samples, built on data augmentations. [Representation learning with contrastive predictive coding, arXiv:1807.03748, DeepMind]

Contrastive Learning: MoCo (CVPR 2020)
Components:
1. A stochastic data augmentation module that transforms any given data sample randomly, resulting in two correlated views of the same sample, x_i and x_j, considered a positive pair.
2. Two neural-network-based encoders q(.) and k(.) that extract representation vectors from augmented data samples, giving the feature pair (q, k+), where k+ is the positive sample.
3. A memory bank holding a set of keys {k1, k2, ...} (randomly initialized from a Gaussian distribution); for each query q, each pair (q, ki) is treated as a negative pair.
4. A contrastive loss defined for a contrastive prediction task; this loss updates the parameters of the q(.) encoder.
5. A momentum-based optimization method to update the k(.) encoder.
Positives: images augmented from the same image are recognized as positives; other images are negatives.
Objective function: the InfoNCE loss, L_q = -log( exp(q·k+/τ) / Σ_{i=0..K} exp(q·k_i/τ) ).
Optimization method: θ_k ← m·θ_k + (1 − m)·θ_q, where m is a momentum coefficient (default 0.999).

Contrastive Learning: SimCLR (ICML 2020)
Components:
1. A stochastic data augmentation module that transforms any given data sample randomly, resulting in two correlated views of the same example, denoted x̃_i and x̃_j, considered a positive pair.
2. A neural-network-based encoder f(·) that extracts representation vectors from augmented data samples.
3. A small neural-network projection head g(·) that maps representations to the space where the contrastive loss is applied; an MLP with one hidden layer produces z in the contrastive space.
4. A contrastive loss function defined for a contrastive prediction task.
Positive and negative samples: images augmented from the same image are recognized as positives; other images are negatives.
Objective function: the NT-Xent contrastive loss over the projections z (cosine similarity with temperature).
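A minimal sketch of the two ingredients shared by the MoCo-style setup above: the InfoNCE objective and the momentum update of the key encoder. Temperature 0.07 and momentum 0.999 are the commonly used defaults, not prescriptions.

import torch
import torch.nn.functional as F

def info_nce(q, k_pos, k_neg, tau=0.07):
    """q: (N, d) queries; k_pos: (N, d) positive keys; k_neg: (K, d)
    negative keys (e.g. from MoCo's memory bank). All l2-normalized."""
    l_pos = (q * k_pos).sum(-1, keepdim=True)      # (N, 1)
    l_neg = q @ k_neg.T                            # (N, K)
    logits = torch.cat([l_pos, l_neg], dim=1) / tau
    # The positive key sits at index 0 of every row.
    return F.cross_entropy(logits, torch.zeros(len(q), dtype=torch.long))

@torch.no_grad()
def momentum_update(encoder_q, encoder_k, m=0.999):
    """MoCo-style update: the key encoder trails the query encoder."""
    for pq, pk in zip(encoder_q.parameters(), encoder_k.parameters()):
        pk.mul_(m).add_((1 - m) * pq)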
Contrastive Learning: BYOL, negative-free (NeurIPS 2020)
Components:
1. A stochastic data augmentation module that transforms any given data sample randomly, resulting in two correlated views of the same sample.
2. Two neural-network-based encoders: an online network and a target network.
3. A teacher-student-like architecture: the online network learns to predict the target network's embedding.
4. MSE is adopted to train the online network (the target network is randomly initialized); the compared embeddings z are l2-normalized.
5. A momentum-based optimization method updates the target encoder.
Positives: only views augmented from the same image are used, so no negative pairs are required.
[Bootstrap your own latent: a new approach to self-supervised learning, NeurIPS 2020, DeepMind]

Contrastive Learning: Barlow Twins, negative-free (ICML 2021)
[Barlow Twins: Self-Supervised Learning via Redundancy Reduction, ICML 2021]
Barlow Twins computes a cross-correlation matrix between the embeddings of the two views along the batch dimension, not the instance dimension (batch size)! The objective pushes this matrix toward the identity:

C_ij = Σ_b z^A_{b,i} z^B_{b,j} / ( sqrt(Σ_b (z^A_{b,i})^2) · sqrt(Σ_b (z^B_{b,j})^2) ),
L = Σ_i (1 − C_ii)^2 + λ Σ_i Σ_{j≠i} C_ij^2,

where b is the index of the sample in the batch and i, j are the indices of the features.
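A minimal PyTorch sketch of this objective; λ = 5e-3 follows the paper's default, and batch statistics are used for the standardization.

import torch

def barlow_twins_loss(z_a, z_b, lam=5e-3):
    """z_a, z_b: (N, D) embeddings of two augmented views.
    Cross-correlation is taken over the batch index b; i, j index
    feature dimensions, as in the formula above."""
    N, D = z_a.shape
    # Standardize each feature over the batch.
    z_a = (z_a - z_a.mean(0)) / z_a.std(0)
    z_b = (z_b - z_b.mean(0)) / z_b.std(0)
    c = (z_a.T @ z_b) / N                      # (D, D) cross-correlation
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    off_diag = (c - torch.diag_embed(torch.diagonal(c))).pow(2).sum()
    return on_diag + lam * off_diag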
Generative & Reconstruction-based Learning [SJTU Deep Learning Lecture]

Masked Autoencoders (MAE) (CVPR 2022): the input image is split into patches, a large random subset of them is masked, and the autoencoder reconstructs the raw pixels (raw → masked → recovered).
[Masked Autoencoders Are Scalable Vision Learners, CVPR 2022; Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross Girshick]

SimMIM: A Simple Framework for Masked Image Modeling (CVPR 2022)
[Zhenda Xie, Zheng Zhang, Yue Cao, Yutong Lin, Jianmin Bao, Zhuliang Yao, Qi Dai, Han Hu]
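A minimal sketch of the random patch masking at the heart of MAE-style pretraining; the 75% mask ratio is MAE's default, and the token shapes are illustrative.

import torch

def random_masking(patches, mask_ratio=0.75):
    """MAE-style random masking: keep a random subset of patch tokens.
    patches: (N, L, D). Returns visible patches and the binary mask."""
    N, L, D = patches.shape
    len_keep = int(L * (1 - mask_ratio))
    noise = torch.rand(N, L)                  # per-patch random score
    ids_shuffle = noise.argsort(dim=1)        # random permutation of patches
    ids_keep = ids_shuffle[:, :len_keep]
    visible = torch.gather(patches, 1, ids_keep[..., None].expand(-1, -1, D))
    mask = torch.ones(N, L)
    mask.scatter_(1, ids_keep, 0.0)           # 0 = kept, 1 = masked
    return visible, mask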
Masked Image Modeling (MIM) (CVPR 2022).

Corrupted Image Modeling (CIM) (ICLR 2023)
[Corrupted Image Modeling for Self-Supervised Visual Pre-Training, ICLR 2023; Yuxin Fang, Li Dong, Hangbo Bao, Xinggang Wang, Furu Wei]

Self-supervised Learning: pretext-task learning
Self-supervised learning sits alongside supervised and unsupervised learning; its pretext tasks span contrastive-based learning (popular 2015-2019) and generative & reconstruction-based learning (2020-2022 onward). [SJTU Deep Learning Lecture]

Unsupervised Visual Representation Learning by Context Prediction (ICCV 2015 Oral), Carl Doersch, Abhinav Gupta, Alexei A. Efros. Experiments show that the learned seman…