Fundamentals of Artificial Neural Networks by Mohamad H. Hassoun (MIT Press, 1995)

Preface

My purpose in writing this book has been to give a systematic account of major concepts and methodologies of artificial neural networks and to present a unified framework that makes the subject more accessible to students and practitioners. This book emphasizes fundamental theoretical aspects of the computational capabilities and learning abilities of artificial neural networks. It integrates important theoretical results on artificial neural networks and uses them to explain a wide range of existing empirical observations and commonly used heuristics. The main audience is first-year graduate students in electrical engineering, computer engineering, and computer science. This book may be adapted for use as a senior undergraduate textbook by selective choice of topics. Alternatively, it may also be used as a valuable resource for practicing engineers, computer scientists, and others involved in research in artificial neural networks.

This book has evolved from lecture notes of two courses on artificial neural networks, a senior-level course and a graduate-level course, which I have taught during the last 6 years in the Department of Electrical and Computer Engineering at Wayne State University. The background material needed to understand this book is general knowledge of some basic topics in mathematics, such as probability and statistics, differential equations and linear algebra, and something about multivariate calculus. The reader is also assumed to have enough familiarity with the concept of a system and the notion of state, as well as with the basic elements of Boolean algebra and switching theory. The required technical maturity is that of a senior undergraduate in electrical engineering, computer engineering, or computer science.

Artificial neural networks are viewed here as parallel computational models, with varying degrees of complexity, comprised of densely interconnected adaptive processing units. These networks are fine-grained parallel implementations of nonlinear static or dynamic systems. A very important feature of these networks is their adaptive nature, where learning by example replaces traditional programming in solving problems. This feature makes such computational models very appealing in application domains where one has little or incomplete understanding of the problem to be solved but where training data is readily available. Another key feature is the intrinsic parallelism that allows for fast computations of solutions when these networks are implemented on parallel digital computers or, ultimately, when implemented in customized hardware. Artificial neural networks are viable computational models for a wide variety of problems, including pattern classification, speech synthesis and recognition, adaptive interfaces between humans and complex physical systems, function approximation, image data compression, associative memory, clustering, forecasting and prediction, combinatorial optimization, nonlinear system modeling, and control. These networks are neural in the sense that they may have been inspired by neuroscience, but not because they are faithful models of biological neural or cognitive phenomena. In fact, the majority of the network models covered in this book are more closely related to traditional mathematical and/or statistical models such as optimization algorithms, nonparametric pattern classifiers, clustering algorithms, linear and nonlinear filters, and statistical regression models than they are to neurobiological models.

The theories and techniques of artificial neural networks outlined here are fairly mathematical, although the level of mathematical rigor is relatively low. In my exposition I have used mathematics to provide insight and understanding rather than to establish rigorous mathematical foundations. The selection and treatment of material reflect my background as an electrical and computer engineer. The operation of artificial neural networks is viewed as that of nonlinear systems: static networks are viewed as mapping or static input/output systems, and recurrent networks are viewed as dynamical systems with evolving state. The systems approach is also evident when it comes to discussing the stability of learning algorithms and recurrent network retrieval dynamics, as well as in the adopted classifications of neural networks as discrete-state or continuous-state and discrete-time or continuous-time.
The neural network paradigms (architectures and their associated learning rules) treated here were selected because of their relevance, mathematical tractability, and/or practicality. Omissions have been made for a number of reasons, including complexity, obscurity, and space.

This book is organized into eight chapters. Chapter 1 introduces the reader to the most basic artificial neural net, consisting of a single linear threshold gate (LTG). The computational capabilities of linear and polynomial threshold gates are derived. A fundamental theorem, the function counting theorem, is proved and is applied to study the capacity and the generalization capability of threshold gates. The concepts covered in this chapter are crucial because they lay the theoretical foundations for justifying and exploring the more general artificial neural network architectures treated in later chapters.

Chapter 2 mainly deals with theoretical foundations of multivariate function approximation using neural networks. The function counting theorem of chapter 1 is employed to derive upper bounds on the capacity of various feedforward nets of LTGs. The necessary bounds on the size of LTG-based multilayer classifiers for the cases of training data in general position and in arbitrary position are derived. Theoretical results on continuous function approximation capabilities of feedforward nets, with units employing various nonlinearities, are summarized. The chapter concludes with a discussion of the computational effectiveness of neural net architectures and the efficiency of their hardware implementations.

Learning rules for single-unit and single-layer nets are covered in Chapter 3. More than 20 basic discrete-time learning rules are presented. Supervised rules are considered first, followed by reinforcement, Hebbian, competitive, and feature mapping rules. The presentation of these learning rules is unified in the sense that they may all be viewed as realizing incremental steepest-gradient-descent search on a suitable criterion function. Examples of single-layer architectures are given to illustrate the application of unsupervised learning rules (e.g., principal component analysis, clustering, vector quantization, and self-organizing feature maps).

Chapter 4 is concerned with the theoretical aspects of supervised, unsupervised, and reinforcement learning rules. The chapter starts by developing a unifying framework for the characterization of various learning rules (supervised and unsupervised). Under this framework, a continuous-time learning rule is viewed as a first-order stochastic differential equation/dynamical system whereby the state of the system evolves so as to minimize an associated instantaneous criterion function. Statistical approximation techniques are employed to study the dynamics and stability, in an average sense, of the stochastic system. This approximation leads to an average learning equation that, in most cases, can be cast as a globally, asymptotically stable gradient system whose stable equilibria are minimizers of a well-defined criterion function. Formal analysis is provided for supervised, reinforcement, Hebbian, competitive, and topology preserving learning. Also, the generalization properties of deterministic and stochastic neural nets are analyzed. The chapter concludes with an investigation of the complexity of learning in multilayer neural nets.

Chapter 5 deals with learning in multilayer artificial neural nets. It extends the gradient descent-based learning to multilayer feedforward nets, which results in the back error propagation learning rule (or backprop). An extensive number of methods and heuristics for improving backprop's convergence speed and solution quality are presented, and an attempt is made to give a theoretical basis for such methods and heuristics. Several significant applications of backprop-trained multilayer nets are described. These applications include conversion of English text to speech, mapping of hand gestures to speech, recognition of handwritten ZIP codes, continuous vehicle navigation, and medical diagnosis. The chapter also extends backprop to recurrent networks capable of temporal association, nonlinear dynamical system modeling, and control.

Chapter 6 is concerned with other important adaptive multilayer net architectures, such as the radial basis function (RBF) net and the cerebellar model articulation controller (CMAC) net, and their associated learning rules. These networks often have similar computational capabilities to feedforward multilayer nets of sigmoidal units, but with the potential for faster learning. Adaptive multilayer unit-allocating nets such as hyperspherical classifiers, the restricted Coulomb energy (RCE) net, and the cascade correlation net are discussed. The chapter also addresses the issue of unsupervised learning in multilayer nets, and it describes two specific networks, the adaptive resonance theory (ART) net and the autoassociative clustering net, suitable for adaptive data clustering. The clustering capabilities of these nets are demonstrated through examples, including the decomposition of complex electromyogram signals.

Chapter 7 discusses associative neural memories. Various models of associative learning and retrieval are presented and analyzed, with emphasis on recurrent models. The stability, capacity, and error-correction capabilities of these models are analyzed. The chapter concludes by describing the use of one particular recurrent model (the Hopfield continuous model) for solving combinatorial optimization problems.

Global search methods for optimal learning and retrieval in multilayer neural networks are the topic of Chapter 8. It covers the use of simulated annealing, mean field annealing, and genetic algorithms for optimal learning. Simulated annealing is also discussed in the context of local-minima-free retrievals in recurrent neural networks (Boltzmann machines). Finally, a hybrid genetic algorithm/gradient-descent search method that combines optimal and fast learning is described.

Each chapter concludes with a set of problems designed to allow the reader to further explore the concepts discussed. More than 200 problems of varying degrees of difficulty are provided.
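To make the chapter summaries above concrete, the following sketch (written for this summary, not reproduced from the book) combines the single linear threshold gate of Chapter 1 with the perceptron learning rule of Chapter 3, training it on the linearly separable Boolean AND function:

```python
def ltg(w, x):
    """Linear threshold gate: fire (output 1) iff the weighted sum of the
    inputs plus the bias term (the last weight) is nonnegative."""
    s = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
    return 1 if s >= 0 else 0

def train_perceptron(samples, eta=0.5, epochs=50):
    """Perceptron rule: w <- w + eta * (target - output) * augmented input.
    Converges for linearly separable data (perceptron convergence theorem)."""
    w = [0.0, 0.0, 0.0]  # two input weights plus a bias weight
    for _ in range(epochs):
        for x, target in samples:
            err = target - ltg(w, x)
            w = [wi + eta * err * xi for wi, xi in zip(w, x + [1.0])]
    return w

# Boolean AND is linearly separable, so the gate learns it exactly.
data = [([0.0, 0.0], 0), ([0.0, 1.0], 0), ([1.0, 0.0], 0), ([1.0, 1.0], 1)]
w = train_perceptron(data)
print([ltg(w, x) for x, _ in data])  # -> [0, 0, 0, 1]
```

A single LTG cannot realize a non-separable function such as XOR, which is one motivation for the multilayer architectures analyzed in Chapter 2 and trained in Chapter 5.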
The problems can be divided roughly into three categories. The first category consists of problems that are relatively easy to solve. These problems are designed to directly reinforce the topics discussed in the book. The second category of problems, marked with an asterisk (*), is relatively more difficult. These problems normally involve mathematical derivations and proofs and are intended to be thought provoking. Many of these problems include reference to technical papers in the literature that may give complete or partial solutions. This second category of problems is intended mainly for readers interested in exploring advanced topics for the purpose of stimulating original research ideas. Problems marked with a dagger (†) represent a third category of problems that are numerical in nature and require the use of a computer. Some of these problems are miniprogramming projects, which should be especially useful for students.

This book contains enough material for a full semester course on artificial neural networks at the first-year graduate level. I have also used this material selectively to teach an upper-level undergraduate introductory course. For the undergraduate course, one may choose to skip all or a subset of the following material: Sections 1.4-1.6, 2.1-2.2, 4.3-4.8, 5.1.2, 5.4.3-5.4.5, 6.1.2, 6.2-6.4, 6.4.2, 7.2.2, 7.4.1-7.4.4, 8.3.2, 8.4.2, and 8.6.

I hope that this book will prove useful to those students and practicing professionals who are interested not only in understanding the underlying theory of artificial neural networks but also in pursuing research in this area. A list of about 700 relevant references is included with the aim of providing guidance and direction for the reader's own search of the research literature. Even though this reference list may seem comprehensive, the published literature is too extensive to allow such a list to be complete.

Acknowledgments

First and foremost, I acknowledge the contributions of the many researchers in the area of artificial neural networks on which most of the material in this text is based. It would have been extremely difficult (if not impossible) to write this book without the support and assistance of a number of organizations and individuals. I would first like to thank the National Science Foundation (through a PYI Award), Electric Power Research Institute (EPRI), Ford Motor Company, Mentor Graphics, Sun Microsystems, Unisys Corporation, Whitaker Foundation, and Zenith Data Systems for supporting my research. I am also grateful for the support I have received for this project from Wayne State University through a Career Development Chair Award.

I thank my students, who have made classroom use of preliminary versions of this book and whose questions and comments have definitely enhanced it. In particular, I would like to thank Raed Abu Zitar, David Clark, Mike Finta, Jing Song, Agus Sudjianto, Chuanming (Chuck) Wang, Hui Wang, Paul Watta, and Abbas Youssef. I also would like to thank my many colleagues in the artificial neural networks community and at Wayne State University, especially Dr. A. Robert Spitzer, for many enjoyable and productive conversations and collaborations. I am indebted to Mike Finta, who very capably and enthusiastically typed the complete manuscript and helped with most of the artwork, and to Dr. Paul Watta of the Computation and Neural Networks Laboratory, Wayne State University, for his critical reading of the manuscript and assistance with the simulations that led to Figures 5.3.8 and 5.3.9. My deep gratitude goes to the reviewers for their critical and constructive suggestions. They are Professors Shun-Ichi Amari of the University of Tokyo, James Anderson of Brown University, Thomas Cover of Stanford University, Richard Golden of the University of Texas-Dallas, Laveen Kanal of the University of Maryland, John Taylor of King's College London,
Francis T. S. Yu of the University of Pennsylvania, Dr. Granino Korn of G. A. and T. M. Korn Industrial Consultants, and other anonymous reviewers. Finally, let me thank my wife Amal, daughter Lamees, and son Tarek for their quiet patience through the many lonely hours during the preparation of the manuscript.

Mohamad H. Hassoun
Detroit, 1994

Table of Contents

Chapter 1 Threshold Gates
1.0 Introduction; Threshold Gates; Linear Threshold Gates; Quadratic Threshold Gates; Polynomial Threshold Gates; Computational Capabilities of Polynomial Threshold Gates; General Position and the Function Counting Theorem; Weierstrass's Approximation Theorem; Points in General Position; Function Counting Theorem; Separability in f-Space; Minimal PTG Realization of Arbitrary Switching Functions; Ambiguity and Generalization; Extreme Points; Summary; Problems

Chapter 2 Computational Capabilities of Artificial Neural Networks
2.0 Introduction; Some Preliminary Results on Neural Network Mapping Capabilities; Network Realization of Boolean Functions; Bounds on the Number of Functions Realizable by a Feedforward Network of LTGs; Necessary Lower Bounds on the Size of LTG Networks; Two Layer Feedforward Networks; Three Layer Feedforward Networks; Generally Interconnected Networks with no Feedback; Approximation Capabilities of Feedforward Neural Networks for Continuous Functions; Kolmogorov's Theorem; Single Hidden Layer Neural Networks are Universal Approximators; Single Hidden Layer Neural Networks are Universal Classifiers; Computational Effectiveness of Neural Networks; Algorithmic Complexity; Computational Energy; Summary

Chapter 3 Learning Rules
3.0 Introduction; Supervised Learning in a Single Unit Setting; Error Correction Rules; Perceptron Learning Rule; Generalizations of the Perceptron Learning Rule; The Perceptron Criterion Function; Mays's Learning Rule; Widrow-Hoff (alpha-LMS) Learning Rule; Other Gradient Descent-Based Learning Rules; mu-LMS Learning Rule; The mu-LMS as a Stochastic Process; Correlation Learning Rule; Extension of the mu-LMS Rule to Units with Differentiable Activation Functions: Delta Rule; Adaptive Ho-Kashyap (AHK) Learning Rules; Other Criterion Functions; Extension of Gradient Descent-Based Learning to Stochastic Units; Reinforcement Learning; Associative Reward-Penalty Reinforcement Learning Rule; Unsupervised Learning; Hebbian Learning; Oja's Rule; Yuille et al. Rule; Linsker's Rule; Hebbian Learning in a Network Setting: Principal Component Analysis (PCA); PCA in a Network of Interacting Units; PCA in a Single Layer Network with Adaptive Lateral Connections; Nonlinear PCA; Competitive Learning; Simple Competitive Learning; Vector Quantization; Self-Organizing Feature Maps: Topology Preserving Competitive Learning; Kohonen's SOFM; Examples of SOFMs; Summary; Problems

Chapter 4 Mathematical Theory of Neural Learning
4.0 Introduction; Learning as a Search Mechanism; Mathematical Theory of Learning in a Single Unit Setting; General Learning Equation; Analysis of the Learning Equation; Analysis of Some Basic Learning Rules; Characterization of Additional Learning Rules; Simple Hebbian Learning; Improved Hebbian Learning; Oja's Rule; Yuille et al. Rule; Hassoun's Rule; Principal Component Analysis (PCA); Theory of Reinforcement Learning; Theory of Simple Competitive Learning; Deterministic Analysis; Stochastic Analysis; Theory of Feature Mapping; Characterization of Kohonen's Feature Map; Self-Organizing Neural Fields; Generalization; Generalization Capabilities of Deterministic Networks; Generalization in Stochastic Networks; Complexity of Learning; Summary; Problems

Chapter 5 Adaptive Multilayer Neural Networks I
5.0 Introduction; Learning Rule for Multilayer Feedforward Neural Networks; Error Backpropagation Learning Rule; Global Descent-Based Error Backpropagation; Backprop Enhancements and Variations; Weights Initialization; Learning Rate; Momentum; Activation Function; Weight Decay, Weight Elimination, and Unit Elimination; Cross-Validation; Criterion Functions; Applications; NETtalk; Glove-Talk; Handwritten ZIP Code Recognition; ALVINN: A Trainable Autonomous Land Vehicle; Medical Diagnosis Expert Net; Image Compression and Dimensionality Reduction; Extensions of Backprop for Temporal Learning; Time-Delay Neural Networks; Backpropagation Through Time; Recurrent Back-Propagation; Time-Dependent Recurrent Back-Propagation; Real-Time Recurrent Learning; Summary

Chapter 6 Adaptive Multilayer Neural Networks II
6.0 Introduction; Radial Basis Function (RBF) Networks; RBF Networks versus Backprop Networks; RBF Network Variations; Cerebellar Model Articulation Controller (CMAC); CMAC Relation to Rosenblatt's Perceptron and Other Models; Unit-Allocating Adaptive Networks; Hyperspherical Classifiers; Restricted Coulomb Energy (RCE) Classifier; Real-Time Trained Hyperspherical Classifier; Cascade-Correlation Network; Clustering Networks; Adaptive Resonance Theory (ART) Networks; Autoassociative Clustering Network; Summary; Problems

Chapter 7 Associative Neural Memories
7.0 Introduction; Basic Associative Neural Memory Models; Simple Associative Memories and Their Associated Recording Recipes; Correlation Recording Recipe; A Simple Nonlinear Associative Memory Model; Optimal Linear Associative Memory (OLAM); OLAM Error Correction Capabilities; Strategies for Improving Memory Recording; Dynamic Associative Memories (DAM); Continuous-Time Continuous-State Model; Discrete-Time Continuous-State Model; Discrete-Time Discrete-State Model; DAM Capacity and Retrieval Dynamics; Correlation DAMs; Projection DAMs; Characteristics of High-Performance DAMs; Other DAM Models; Brain-State-in-a-Box (BSB) DAM; Non-Monotonic Activations DAM; Discrete Model; Continuous Model; Hysteretic Activations DAM; Exponential Capacity DAM; Sequence Generator DAM; Heteroassociative DAM; The DAM as a Gradient Net and Its Application to Combinatorial Optimization; Summary

Chapter 8 Global Search Methods for Neural Networks
8.0 Introduction; Local versus Global Search; A Gradient Descent/Ascent Search Strategy; Stochastic Gradient Search: Global Search via Diffusion; Simulated Annealing-Based Global Search; Simulated Annealing