# word2vec 中的数学原理详解 (The Mathematical Principles behind word2vec, Explained) — peghoty, 2014
## Contents

1. Preface
2. Preliminaries: the sigmoid function; logistic regression; Bayes' formula; Huffman coding
3. Background: statistical language models; n-gram models; the neural probabilistic language model; word vectors
4. Models based on Hierarchical Softmax: CBOW; Skip-gram
5. Models based on Negative Sampling: CBOW; Skip-gram; the sampling algorithm
6. Selected source-code details

## 1 Preface

word2vec is a tool for learning word vectors, open-sourced by Google in 2013 and authored by Tomas Mikolov's team. Since its release it has drawn wide attention, partly because of the general interest in Deep Learning; related earlier work includes Collobert's SENNA. These notes, finished in July 2014, collect the mathematical derivations behind word2vec in one place.

## 2 Preliminaries

### 2.1 The sigmoid function

$$\sigma(x)=\frac{1}{1+e^{-x}}$$

takes values in $(0,1)$. Its derivative satisfies $\sigma'(x)=\sigma(x)\,[1-\sigma(x)]$, hence

$$[\log\sigma(x)]'=1-\sigma(x),\qquad [\log(1-\sigma(x))]'=-\sigma(x).$$

These two identities are used over and over in the gradient derivations below.

### 2.2 Logistic regression

Given labelled samples $\{(\mathbf x_i,y_i)\}_{i=1}^{m}$ with $\mathbf x_i\in\mathbb R^{n}$ and $y_i\in\{0,1\}$, binary logistic regression uses the hypothesis

$$h_\theta(\mathbf x)=\sigma(\theta_0+\theta_1x_1+\cdots+\theta_nx_n)=\sigma(\theta^\top\mathbf x)$$

(with the convention $x_0=1$ so the intercept is absorbed into $\theta$), classifying $\mathbf x$ as positive when $h_\theta(\mathbf x)\ge 0.5$ and negative otherwise. The parameters are fit by maximizing a log-likelihood whose per-sample term has exactly the form $y\log h_\theta(\mathbf x)+(1-y)\log[1-h_\theta(\mathbf x)]$.

### 2.3 Bayes' formula

For events $A$ and $B$ with $p(B)>0$, $\;p(A\mid B)=p(B\mid A)\,p(A)/p(B)$.

### 2.4 Huffman coding

A fixed-length code gives every symbol the same number of bits (three bits suffice for six symbols: 000, 001, 010, 011, 100, 101). A variable-length prefix code can do better by giving frequent symbols short codewords, and Huffman coding is the optimal such scheme. A Huffman tree over weighted leaves is built greedily: repeatedly merge the two subtrees of smallest total weight into a new internal node until a single tree remains; a leaf's codeword is the sequence of branch labels on its root-to-leaf path (in the document's six-symbol example the codewords come out as 0, 111, 110, 101, 1001, 1000). In word2vec the leaves are the dictionary words and the weights are their corpus frequencies; by convention one branch of each internal node is labelled 1 and the other 0, and word2vec later treats a node coded 1 as the negative class and a node coded 0 as the positive class.

## 3 Background

### 3.1 Statistical language models

Many NLP problems reduce to judging how probable a piece of text is. In speech recognition, for instance, one seeks the text maximizing $p(\text{Text}\mid\text{Voice})$, which by Bayes' formula is proportional to $p(\text{Voice}\mid\text{Text})\,p(\text{Text})$; the factor $p(\text{Text})$ is supplied by a statistical language model. For a sentence $W=w_1^T=(w_1,\dots,w_T)$,

$$p(W)=p(w_1)\,p(w_2\mid w_1)\cdots p(w_T\mid w_1^{T-1}).\tag{3.1}$$

With a dictionary of size $N$, estimating all these conditionals directly is hopeless: the number of free parameters grows like $N^T$.

### 3.2 n-gram models

By Bayes' formula and the MLE "count and divide" estimate,

$$p(w_k\mid w_1^{k-1})=\frac{p(w_1^{k})}{p(w_1^{k-1})}\approx\frac{\mathrm{count}(w_1^{k})}{\mathrm{count}(w_1^{k-1})}.\tag{3.2}$$

The $n$-gram model adds an order-$(n-1)$ Markov assumption, so only the previous $n-1$ words matter:

$$p(w_k\mid w_1^{k-1})\approx p(w_k\mid w_{k-n+1}^{k-1})\approx\frac{\mathrm{count}(w_{k-n+1}^{k})}{\mathrm{count}(w_{k-n+1}^{k-1})}.\tag{3.3}$$

For $n=2$ this is $\mathrm{count}(w_{k-1}w_k)/\mathrm{count}(w_{k-1})$. Larger $n$ keeps more context, but the number of parameters explodes with $n$.
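As a concrete illustration of the count-based estimate (3.3) with $n=2$, here is a minimal Python sketch; the toy corpus and the helper name `bigram_prob` are illustrative, not from the original text or the word2vec source:

```python
from collections import Counter

def bigram_prob(corpus, w_prev, w):
    """MLE bigram estimate p(w | w_prev) = count(w_prev w) / count(w_prev)."""
    unigrams = Counter(corpus)
    bigrams = Counter(zip(corpus, corpus[1:]))
    if unigrams[w_prev] == 0:
        return 0.0
    return bigrams[(w_prev, w)] / unigrams[w_prev]

corpus = "a dog is running in the room and a cat is running in the room".split()
assert bigram_prob(corpus, "a", "dog") == 0.5    # "a" occurs twice, "a dog" once
assert bigram_prob(corpus, "dog", "cat") == 0.0  # unseen bigram: zero, hence smoothing
```

The zero for an unseen bigram is exactly the problem smoothing addresses.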
With a dictionary of size $N=2\times10^5$, the parameter counts of the $n$-gram model are roughly:

| model | number of parameters |
|---|---|
| unigram ($n=1$) | $2\times10^{5}$ |
| bigram ($n=2$) | $4\times10^{10}$ |
| trigram ($n=3$) | $8\times10^{15}$ |
| 4-gram ($n=4$) | $16\times10^{20}$ |

so in practice $n\le 3$ is the common choice. Two further issues need care: if $\mathrm{count}(w_{k-n+1}^{k})=0$, should the probability really be 0? And if $\mathrm{count}(w_{k-n+1}^{k})=\mathrm{count}(w_{k-n+1}^{k-1})$, should it really be 1? Smoothing techniques (Laplace smoothing being the simplest) handle both.

More generally, language modelling can be cast as an optimization over a corpus $\mathcal C$: maximize

$$\mathcal L=\sum_{w\in\mathcal C}\log p(w\mid \mathrm{Context}(w)),\tag{3.4}$$

where for the $n$-gram model $\mathrm{Context}(w_i)=w_{i-n+1}^{i-1}$. If one parametrizes $p(w\mid\mathrm{Context}(w))=F(w,\mathrm{Context}(w),\theta)$ and fits $\theta$, then any conditional probability can later be computed by evaluating $F$, instead of storing all the counts as the $n$-gram model does. The neural networks below are one particular choice of $F$.

### 3.3 The neural probabilistic language model

This is the model of Bengio et al., "A neural probabilistic language model" (JMLR, 2003). Every word $w$ in the dictionary $D$ carries a vector $\mathbf v(w)\in\mathbb R^{m}$. The network has input, projection, hidden, and output layers, with parameters $(W,\mathbf p)$ and $(U,\mathbf q)$. Taking $\mathrm{Context}(w)$ to be the previous $n-1$ words, their vectors are concatenated into $\mathbf x_w\in\mathbb R^{(n-1)m}$, and

$$\mathbf z_w=\tanh(W\mathbf x_w+\mathbf p),\qquad \mathbf y_w=U\mathbf z_w+\mathbf q,\tag{3.5}$$

after which $\mathbf y_w=(y_{w,1},\dots,y_{w,N})^\top$ is normalized by a softmax:

$$p(w\mid\mathrm{Context}(w))=\frac{e^{y_{w,i_w}}}{\sum_{i=1}^{N}e^{y_{w,i}}},\tag{3.6}$$

where $i_w$ is the index of $w$ in $D$. Typical scales are $n\le 5$, $m\sim10^{1}$ to $10^{2}$, hidden size $n_h\sim10^{2}$, and $N\sim10^{4}$ to $10^{5}$; the hidden-to-output product $U\mathbf z_w$ and the softmax dominate the cost, and it is precisely this part that word2vec's tricks eliminate.

Compared with $n$-gram models this has two advantages. First, similarity between words is exploited: having seen $S_1$ = "A dog is running in the room" many (say 10 000) times and $S_2$ = "A cat is running in the room" once, an $n$-gram model makes $p(S_1)\gg p(S_2)$, while the neural model assigns them close probabilities because $\mathbf v(\text{dog})$ and $\mathbf v(\text{cat})$ are similar. Second, (3.6) is never zero, so the model is smooth by construction and needs no extra smoothing.

### 3.4 Word vectors

The simplest representation is one-hot: a vector of dimension $N$ with a single 1. It is easy to build but suffers from the curse of dimensionality and encodes no notion of similarity. The alternative, the distributed representation (tracing back to Hinton, 1986 [1]), is a dense low-dimensional vector in which related words lie close together; this is what word2vec produces, and related ideas underlie LSA (Latent Semantic Analysis) and LDA (Latent Dirichlet Allocation). Besides serving the language model itself, such vectors are useful as features in other NLP tasks, as Ronan Collobert's SENNA system illustrates.
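The forward pass (3.5)-(3.6) fits in a few lines. The sketch below uses toy sizes and randomly initialized parameters; all names and dimensions are mine, for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy sizes: vocabulary N, vector dimension m, n-1 context words, n_h hidden units
N, m, n_ctx, n_h = 8, 4, 3, 5

C = rng.normal(size=(N, m))            # one word vector v(w) per row
W = rng.normal(size=(n_h, n_ctx * m))  # projection -> hidden
p = rng.normal(size=n_h)
U = rng.normal(size=(N, n_h))          # hidden -> output
q = rng.normal(size=N)

def nplm_probs(context_ids):
    """Forward pass of the neural probabilistic language model."""
    x_w = np.concatenate([C[i] for i in context_ids])  # concatenation, (n-1)*m long
    z_w = np.tanh(W @ x_w + p)                         # eq. (3.5)
    y_w = U @ z_w + q
    e = np.exp(y_w - y_w.max())                        # numerically stable softmax
    return e / e.sum()                                 # eq. (3.6)

probs = nplm_probs([1, 2, 3])
# every word gets strictly positive probability: smoothing comes for free
assert abs(probs.sum() - 1.0) < 1e-12 and (probs > 0).all()
```

The strictly positive output echoes the document's point that (3.6) needs no extra smoothing.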
[Table 2 of the original, taken from the SENNA materials, reports "performance in per-word accuracy for POS and F1 score for all the other tasks" (POS, CHK, NER); "timing corresponds to the time needed by SENNA to pass over the given test data set" (MacBook Pro, i7 2.8 GHz, Intel MKL); "for PSG, the F1 score is the one over all sentences". The numeric columns are not recoverable from this copy.]

A striking application is machine translation (Tomas Mikolov et al.). Train English vectors $u_1,\dots,u_5$ for one, two, three, four, five and Spanish vectors $s_1,\dots,s_5$ for uno, dos, tres, cuatro, cinco, then project each set to two dimensions with PCA, giving $v_1,\dots,v_5$ and $t_1,\dots,t_5$. Figure 7 of the original shows that the two point sets have essentially the same geometric arrangement, suggesting a linear map between the two vector spaces that can be exploited for translation.

## 4 Models based on Hierarchical Softmax

word2vec contains two architectures: CBOW (Continuous Bag-of-Words) and Skip-gram (Continuous Skip-gram), both due to Tomas Mikolov's team. CBOW predicts the current word $w_t$ from its context $(w_{t-2},w_{t-1},w_{t+1},w_{t+2})$; Skip-gram does the reverse, predicting the context from $w_t$ (Figures 8 and 9 of the original). Each architecture comes in two training flavors, Hierarchical Softmax and Negative Sampling. Following (3.4), the objectives are

$$\text{CBOW:}\quad \mathcal L=\sum_{w\in\mathcal C}\log p(w\mid\mathrm{Context}(w)),\tag{4.1}$$

$$\text{Skip-gram:}\quad \mathcal L=\sum_{w\in\mathcal C}\log p(\mathrm{Context}(w)\mid w).\tag{4.2}$$

### 4.1 The CBOW model

#### 4.1.1 Network structure

Three layers (Figure 10 of the original):

1. Input: the $2c$ context vectors $\mathbf v(\mathrm{Context}(w)_1),\dots,\mathbf v(\mathrm{Context}(w)_{2c})\in\mathbb R^{m}$.
2. Projection: their sum $\mathbf x_w=\sum_{i=1}^{2c}\mathbf v(\mathrm{Context}(w)_i)\in\mathbb R^{m}$.
3. Output: a Huffman tree whose $N$ leaves are the words of $D$, weighted by corpus frequency.

Relative to Bengio's model: concatenation is replaced by a sum, the tanh hidden layer is removed, and the softmax output layer is replaced by the Huffman tree. This last change is the heart of Hierarchical Softmax.

#### 4.1.2 Gradient computation

Notation, for an arbitrary word $w$:

- $p^w$: the path from the root to the leaf of $w$, containing $l^w$ nodes $p^w_1,\dots,p^w_{l^w}$ ($p^w_1$ the root, $p^w_{l^w}=w$);
- $d^w_2,\dots,d^w_{l^w}\in\{0,1\}$: the Huffman code of $w$ (the root carries no code);
- $\theta^w_1,\dots,\theta^w_{l^w-1}\in\mathbb R^{m}$: auxiliary vectors attached to the non-leaf nodes of the path.

Figure 11 of the original walks through a word $w$ with code $d^w=1001$: then $l^w=5$, the path is $p^w_1,\dots,p^w_5$, and the non-leaf vectors are $\theta^w_1,\dots,\theta^w_4$.

Each step down the path is a binary classification. word2vec treats a node coded 1 as the negative class and a node coded 0 as the positive class, i.e. $\mathrm{Label}(p^w_i)=1-d^w_i$ for $i=2,\dots,l^w$, and uses $\sigma(\mathbf x_w^\top\theta)$ as the probability of the positive class. For the $1001$ example:

$$p(d^w_2=1\mid\mathbf x_w,\theta^w_1)=1-\sigma(\mathbf x_w^\top\theta^w_1),$$
$$p(d^w_3=0\mid\mathbf x_w,\theta^w_2)=\sigma(\mathbf x_w^\top\theta^w_2),$$
$$p(d^w_4=0\mid\mathbf x_w,\theta^w_3)=\sigma(\mathbf x_w^\top\theta^w_3),$$
$$p(d^w_5=1\mid\mathbf x_w,\theta^w_4)=1-\sigma(\mathbf x_w^\top\theta^w_4),$$

and $p(w\mid\mathrm{Context}(w))=\prod_{j=2}^{5}p(d^w_j\mid\mathbf x_w,\theta^w_{j-1})$.
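A Huffman tree over word counts, and the codes read off it, can be sketched as follows. The merge convention here (heavier branch coded 0, lighter coded 1) is one common choice; the tie-breaking details of word2vec's own `CreateBinaryTree` may differ, so individual codes can come out differently:

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build a Huffman tree over {symbol: weight}; return {symbol: code string}."""
    tick = count()  # tie-breaker so heapq never compares payloads
    heap = [(w, next(tick), sym) for sym, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, _, light = heapq.heappop(heap)   # lighter subtree
        w2, _, heavy = heapq.heappop(heap)   # heavier (or equal) subtree
        heapq.heappush(heap, (w1 + w2, next(tick), (heavy, light)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: (coded-0, coded-1)
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

codes = huffman_codes({"the": 50, "of": 30, "a": 20, "dog": 10, "cat": 8, "ran": 7})
assert codes["the"] == "1"  # the most frequent word gets the shortest code
```

Frequent words end up near the root, so their paths (and the Hierarchical Softmax products over them) are short.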
In general, Hierarchical Softmax defines, for any $w\in D$,

$$p(w\mid\mathrm{Context}(w))=\prod_{j=2}^{l^w}p(d^w_j\mid\mathbf x_w,\theta^w_{j-1}),\tag{4.3}$$

where each factor is a two-class probability,

$$p(d^w_j\mid\mathbf x_w,\theta^w_{j-1})=\big[\sigma(\mathbf x_w^\top\theta^w_{j-1})\big]^{1-d^w_j}\cdot\big[1-\sigma(\mathbf x_w^\top\theta^w_{j-1})\big]^{d^w_j}.$$

Unlike (3.6), no explicit normalization is required: at every internal node the two branch probabilities sum to 1, so $\sum_{w\in D}p(w\mid\mathrm{Context}(w))=1$ holds automatically. Substituting (4.3) into (4.1),

$$\mathcal L=\sum_{w\in\mathcal C}\sum_{j=2}^{l^w}\Big\{(1-d^w_j)\log\sigma(\mathbf x_w^\top\theta^w_{j-1})+d^w_j\log\big[1-\sigma(\mathbf x_w^\top\theta^w_{j-1})\big]\Big\},\tag{4.4}$$

whose summand we write $\mathcal L(w,j)$ (4.5). Using $[\log\sigma]'=1-\sigma$ and $[\log(1-\sigma)]'=-\sigma$,

$$\frac{\partial\mathcal L(w,j)}{\partial\theta^w_{j-1}}=\big[1-d^w_j-\sigma(\mathbf x_w^\top\theta^w_{j-1})\big]\,\mathbf x_w,$$

giving the update $\theta^w_{j-1}:=\theta^w_{j-1}+\eta\big[1-d^w_j-\sigma(\mathbf x_w^\top\theta^w_{j-1})\big]\mathbf x_w$ with learning rate $\eta$. Since $\mathbf x_w$ and $\theta^w_{j-1}$ enter $\mathcal L(w,j)$ symmetrically,

$$\frac{\partial\mathcal L(w,j)}{\partial\mathbf x_w}=\big[1-d^w_j-\sigma(\mathbf x_w^\top\theta^w_{j-1})\big]\,\theta^w_{j-1}.$$

$\mathbf x_w$ is a sum of context vectors, and word2vec simply adds the accumulated gradient to each of them:

$$\mathbf v(\tilde u):=\mathbf v(\tilde u)+\eta\sum_{j=2}^{l^w}\frac{\partial\mathcal L(w,j)}{\partial\mathbf x_w},\qquad \tilde u\in\mathrm{Context}(w).$$

Pseudocode for one training pair $(\mathrm{Context}(w),w)$, CBOW with Hierarchical Softmax:

```
1. e = 0
2. x_w = sum over u in Context(w) of v(u)
3. FOR j = 2 .. l^w DO
   3.1  q = sigma(x_w' theta^w_{j-1})
   3.2  g = eta * (1 - d^w_j - q)
   3.3  e = e + g * theta^w_{j-1}
   3.4  theta^w_{j-1} = theta^w_{j-1} + g * x_w
4. FOR u in Context(w) DO  v(u) = v(u) + e
```

Steps 3.3 and 3.4 must not be swapped: e has to accumulate the old $\theta^w_{j-1}$ before it is overwritten. In the word2vec source, syn0 corresponds to $\mathbf v(\cdot)$, syn1 to $\theta^w_{j-1}$, neu1 to $\mathbf x_w$, and neu1e to e.

### 4.2 The Skip-gram model

#### 4.2.1 Network structure

Figure 12 of the original: the input layer is the single vector $\mathbf v(w)\in\mathbb R^m$ of the current word; the projection layer is the identity (kept only for symmetry with CBOW); the output layer is again the Huffman tree.

#### 4.2.2 Gradient computation

Skip-gram needs $p(\mathrm{Context}(w)\mid w)$, defined as

$$p(\mathrm{Context}(w)\mid w)=\prod_{u\in\mathrm{Context}(w)}p(u\mid w),$$

where each factor is a Hierarchical Softmax product over $u$'s path, exactly as in (4.3):

$$p(u\mid w)=\prod_{j=2}^{l^u}p(d^u_j\mid\mathbf v(w),\theta^u_{j-1}),\qquad
p(d^u_j\mid\mathbf v(w),\theta^u_{j-1})=\big[\sigma(\mathbf v(w)^\top\theta^u_{j-1})\big]^{1-d^u_j}\big[1-\sigma(\mathbf v(w)^\top\theta^u_{j-1})\big]^{d^u_j}.\tag{4.6}$$

Substituting into (4.2),

$$\mathcal L=\sum_{w\in\mathcal C}\sum_{u\in\mathrm{Context}(w)}\sum_{j=2}^{l^u}\Big\{(1-d^u_j)\log\sigma(\mathbf v(w)^\top\theta^u_{j-1})+d^u_j\log\big[1-\sigma(\mathbf v(w)^\top\theta^u_{j-1})\big]\Big\},\tag{4.7}$$

with summand $\mathcal L(w,u,j)$.
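The CBOW Hierarchical Softmax update can be sketched in Python; the array names and shapes are mine, and the Huffman path and code of the target word are assumed precomputed:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbow_hs_step(v, theta, context_ids, path, code, eta=0.025):
    """One (Context(w), w) update, CBOW with Hierarchical Softmax.

    v     -- (N, m) word vectors (syn0 in the source)
    theta -- (K, m) internal-node vectors (syn1)
    path  -- indices of the non-leaf nodes theta^w_1 .. theta^w_{l-1}
    code  -- Huffman code d^w_2 .. d^w_l of w (same length as path)
    """
    x_w = v[context_ids].sum(axis=0)      # projection layer
    e = np.zeros_like(x_w)
    for node, d in zip(path, code):
        q = sigmoid(x_w @ theta[node])    # step 3.1
        g = eta * (1 - d - q)             # step 3.2
        e += g * theta[node]              # step 3.3 (uses the old theta!)
        theta[node] += g * x_w            # step 3.4
    for u in context_ids:                 # step 4: push e back to every
        v[u] += e                         # context word vector

def cbow_hs_loglik(v, theta, context_ids, path, code):
    """log p(w | Context(w)) along w's Huffman path, the summand of (4.4)."""
    x_w = v[context_ids].sum(axis=0)
    return sum((1 - d) * np.log(sigmoid(x_w @ theta[n]))
               + d * np.log(1 - sigmoid(x_w @ theta[n]))
               for n, d in zip(path, code))
```

With a small $\eta$ each step increases the per-pair log-likelihood, which `cbow_hs_loglik` makes easy to check.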
Differentiating $\mathcal L(w,u,j)$ as before,

$$\frac{\partial\mathcal L(w,u,j)}{\partial\theta^u_{j-1}}=\big[1-d^u_j-\sigma(\mathbf v(w)^\top\theta^u_{j-1})\big]\,\mathbf v(w),$$

giving the update $\theta^u_{j-1}:=\theta^u_{j-1}+\eta\big[1-d^u_j-\sigma(\mathbf v(w)^\top\theta^u_{j-1})\big]\mathbf v(w)$, and by symmetry

$$\frac{\partial\mathcal L(w,u,j)}{\partial\mathbf v(w)}=\big[1-d^u_j-\sigma(\mathbf v(w)^\top\theta^u_{j-1})\big]\,\theta^u_{j-1},\qquad
\mathbf v(w):=\mathbf v(w)+\eta\sum_{u\in\mathrm{Context}(w)}\sum_{j=2}^{l^u}\frac{\partial\mathcal L(w,u,j)}{\partial\mathbf v(w)}.$$

A literal transcription would accumulate e over all of $\mathrm{Context}(w)$ before touching $\mathbf v(w)$. The word2vec source does not wait: it processes the context one word $u$ at a time and updates $\mathbf v(w)$ immediately after each:

```
FOR u in Context(w) DO
   e = 0
   FOR j = 2 .. l^u DO
      1.  q = sigma(v(w)' theta^u_{j-1})
      2.  g = eta * (1 - d^u_j - q)
      3.  e = e + g * theta^u_{j-1}
      4.  theta^u_{j-1} = theta^u_{j-1} + g * v(w)
   v(w) = v(w) + e
```

Steps 3 and 4 again keep their order; in the source, syn0 corresponds to $\mathbf v(\cdot)$, syn1 to $\theta^u_{j-1}$, and neu1e to e.

## 5 Models based on Negative Sampling

Negative Sampling (NEG), introduced by Tomas Mikolov et al., is a simplified variant of NCE (Noise Contrastive Estimation). It dispenses with the Huffman tree of Hierarchical Softmax, using randomly drawn negative samples instead, which both speeds up training and improves the quality of the resulting vectors.

### 5.1 The CBOW model

For a pair $(\mathrm{Context}(w),w)$, the word $w$ is the one positive sample; a non-empty set $\mathrm{NEG}(w)$ of negative samples (words other than $w$) is drawn. Define labels

$$L^w(u)=\begin{cases}1,&u=w,\\ 0,&u\ne w,\end{cases}$$

and maximize, for this pair,

$$g(w)=\prod_{u\in\{w\}\cup \mathrm{NEG}(w)}p(u\mid\mathrm{Context}(w)),\tag{5.1}$$

where, with $\mathbf x_w$ the sum of context vectors and $\theta^u\in\mathbb R^m$ an auxiliary vector per word,

$$p(u\mid\mathrm{Context}(w))=\big[\sigma(\mathbf x_w^\top\theta^u)\big]^{L^w(u)}\cdot\big[1-\sigma(\mathbf x_w^\top\theta^u)\big]^{1-L^w(u)}.\tag{5.2}$$

Expanding (5.1), $g(w)=\sigma(\mathbf x_w^\top\theta^w)\prod_{u\in \mathrm{NEG}(w)}\big[1-\sigma(\mathbf x_w^\top\theta^u)\big]$: maximizing $g(w)$ raises the probability of the positive sample while pushing down those of the negatives. Over the corpus, maximize $G=\prod_{w\in\mathcal C}g(w)$, i.e.

$$\mathcal L=\log G=\sum_{w\in\mathcal C}\sum_{u\in\{w\}\cup \mathrm{NEG}(w)}\Big\{L^w(u)\log\sigma(\mathbf x_w^\top\theta^u)+\big[1-L^w(u)\big]\log\big[1-\sigma(\mathbf x_w^\top\theta^u)\big]\Big\}.\tag{5.3}$$

Using $1-\sigma(x)=\sigma(-x)$, this can be rewritten as $\sum_{w}\big\{\log\sigma(\mathbf x_w^\top\theta^w)+\sum_{u\in \mathrm{NEG}(w)}\log\sigma(-\mathbf x_w^\top\theta^u)\big\}$, matching the objective given by Mikolov et al. and analysed in the note by Yoav Goldberg and Omer Levy. Writing $\mathcal L(w,u)$ for the summand of (5.3) (5.4):

$$\frac{\partial\mathcal L(w,u)}{\partial\theta^u}=\big[L^w(u)-\sigma(\mathbf x_w^\top\theta^u)\big]\,\mathbf x_w,\tag{5.5}$$

$$\theta^u:=\theta^u+\eta\big[L^w(u)-\sigma(\mathbf x_w^\top\theta^u)\big]\,\mathbf x_w,\tag{5.6}$$

$$\frac{\partial\mathcal L(w,u)}{\partial\mathbf x_w}=\big[L^w(u)-\sigma(\mathbf x_w^\top\theta^u)\big]\,\theta^u,\tag{5.7}$$

and, just as for Hierarchical Softmax CBOW, every context vector receives the accumulated gradient:

$$\mathbf v(\tilde u):=\mathbf v(\tilde u)+\eta\sum_{u\in\{w\}\cup \mathrm{NEG}(w)}\frac{\partial\mathcal L(w,u)}{\partial\mathbf x_w},\qquad\tilde u\in\mathrm{Context}(w).\tag{5.8}$$
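The updates (5.5)-(5.8) can be sketched the same way as in the Hierarchical Softmax case; array names are mine, and the negative samples are assumed already drawn:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cbow_neg_step(v, theta, context_ids, w, negatives, eta=0.025):
    """One (Context(w), w) update, CBOW with Negative Sampling.

    v         -- (N, m) word vectors (syn0)
    theta     -- (N, m) per-word output vectors (syn1neg)
    negatives -- sampled word ids, all different from w
    """
    x_w = v[context_ids].sum(axis=0)
    e = np.zeros_like(x_w)
    for u in [w] + list(negatives):
        label = 1.0 if u == w else 0.0    # L^w(u)
        q = sigmoid(x_w @ theta[u])
        g = eta * (label - q)             # the factor of (5.5)-(5.7)
        e += g * theta[u]
        theta[u] += g * x_w
    for u in context_ids:
        v[u] += e                         # eq. (5.8)

def cbow_neg_loglik(v, theta, context_ids, w, negatives):
    """log g(w) = log sigma(x_w . theta^w) + sum_u log(1 - sigma(x_w . theta^u))."""
    x_w = v[context_ids].sum(axis=0)
    s = np.log(sigmoid(x_w @ theta[w]))
    for u in negatives:
        s += np.log(1 - sigmoid(x_w @ theta[u]))
    return s
```

With a small $\eta$ each step increases $\log g(w)$ for the processed pair.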
Pseudocode for one pair $(\mathrm{Context}(w),w)$, CBOW with Negative Sampling:

```
1. e = 0
2. x_w = sum over u in Context(w) of v(u)
3. FOR u in {w} + NEG(w) DO
   3.1  q = sigma(x_w' theta^u)
   3.2  g = eta * (L^w(u) - q)
   3.3  e = e + g * theta^u
   3.4  theta^u = theta^u + g * x_w
4. FOR u in Context(w) DO  v(u) = v(u) + e
```

Steps 3.3 and 3.4 must again not be swapped. In the source, syn0 corresponds to $\mathbf v(\cdot)$, syn1neg to $\theta^u$, neu1 to $\mathbf x_w$, and neu1e to e.

### 5.2 The Skip-gram model

Mirroring the Hierarchical Softmax treatment, one would take

$$G=\prod_{w\in\mathcal C}\prod_{u\in\mathrm{Context}(w)}g(u),\tag{5.9}$$

with $g(u)=\prod_{z\in\{u\}\cup \mathrm{NEG}(u)}p(z\mid w)$ (5.10) and

$$p(z\mid w)=\big[\sigma(\mathbf v(w)^\top\theta^z)\big]^{L^u(z)}\big[1-\sigma(\mathbf v(w)^\top\theta^z)\big]^{1-L^u(z)},\tag{5.11}$$

leading to an objective (5.12) analogous to (5.3). The word2vec source, however, proceeds the other way round, in the spirit of its CBOW code: for each context word $\tilde w\in\mathrm{Context}(w)$, the center word $w$ is the positive sample and a fresh negative set $\mathrm{NEG}^{\tilde w}(w)$ is drawn, giving

$$g(w)=\prod_{\tilde w\in\mathrm{Context}(w)}\ \prod_{u\in\{w\}\cup \mathrm{NEG}^{\tilde w}(w)}p(u\mid\tilde w),\qquad
p(u\mid\tilde w)=\big[\sigma(\mathbf v(\tilde w)^\top\theta^u)\big]^{L^w(u)}\big[1-\sigma(\mathbf v(\tilde w)^\top\theta^u)\big]^{1-L^w(u)},$$

and, with $G=\prod_{w\in\mathcal C}g(w)$,

$$\mathcal L=\sum_{w\in\mathcal C}\ \sum_{\tilde w\in\mathrm{Context}(w)}\ \sum_{u\in\{w\}\cup \mathrm{NEG}^{\tilde w}(w)}\Big\{L^w(u)\log\sigma(\mathbf v(\tilde w)^\top\theta^u)+\big[1-L^w(u)\big]\log\big[1-\sigma(\mathbf v(\tilde w)^\top\theta^u)\big]\Big\}.\tag{5.13}$$

With $\mathcal L(w,\tilde w,u)$ the summand:

$$\frac{\partial\mathcal L(w,\tilde w,u)}{\partial\theta^u}=\big[L^w(u)-\sigma(\mathbf v(\tilde w)^\top\theta^u)\big]\,\mathbf v(\tilde w),\tag{5.14}$$

$$\theta^u:=\theta^u+\eta\big[L^w(u)-\sigma(\mathbf v(\tilde w)^\top\theta^u)\big]\,\mathbf v(\tilde w),\tag{5.15}$$

$$\frac{\partial\mathcal L(w,\tilde w,u)}{\partial\mathbf v(\tilde w)}=\big[L^w(u)-\sigma(\mathbf v(\tilde w)^\top\theta^u)\big]\,\theta^u,\qquad
\mathbf v(\tilde w):=\mathbf v(\tilde w)+\eta\sum_{u\in\{w\}\cup \mathrm{NEG}^{\tilde w}(w)}\frac{\partial\mathcal L(w,\tilde w,u)}{\partial\mathbf v(\tilde w)}.$$

Pseudocode for one pair $(w,\mathrm{Context}(w))$, Skip-gram with Negative Sampling:

```
FOR w~ in Context(w) DO
   e = 0
   FOR u in {w} + NEG^{w~}(w) DO
      1.  q = sigma(v(w~)' theta^u)
      2.  g = eta * (L^w(u) - q)
      3.  e = e + g * theta^u
      4.  theta^u = theta^u + g * v(w~)
   v(w~) = v(w~) + e
```

Steps 3 and 4 keep their order; syn0 corresponds to $\mathbf v(\cdot)$, syn1neg to $\theta^u$, and neu1e to e.

### 5.3 The negative-sampling algorithm

Drawing $\mathrm{NEG}(w)$ is weighted sampling: high-frequency words should be chosen as negatives more often. With $\mathrm{counter}(w)$ the corpus count of $w$, give each word the segment length

$$\mathrm{len}(w)=\frac{\mathrm{counter}(w)}{\sum_{u\in D}\mathrm{counter}(u)}.\tag{5.16}$$
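The segment-plus-grid table construction of (5.16)-(5.17), including the 3/4 power that the source applies to the counts, can be sketched as follows; the function and argument names are mine, modelled on InitUnigramTable:

```python
def init_unigram_table(counts, table_size=10_000, power=0.75):
    """Fill a sampling table: cell i holds the word whose cumulative
    (count ** power) interval contains the grid point (i + 1) / table_size,
    up to grid rounding. word2vec uses power = 3/4 and table_size = 1e8."""
    words = list(counts)
    total = sum(c ** power for c in counts.values())
    table, j = [], 0
    cum = counts[words[0]] ** power / total
    for i in range(table_size):
        table.append(words[j])
        if (i + 1) / table_size > cum and j < len(words) - 1:
            j += 1
            cum += counts[words[j]] ** power / total
    return table

table = init_unigram_table({"the": 1000, "cat": 10, "dog": 10})
# "the" holds about counter^0.75 of the mass: ~94% of cells, not the raw ~98%
assert 0.92 < table.count("the") / len(table) < 0.96
```

NEG(w) is then drawn by indexing the table with uniform random integers; a draw equal to $w$ itself is skipped.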
Order the dictionary as $w_1,\dots,w_N$ and set $l_0=0$, $l_k=\sum_{j=1}^{k}\mathrm{len}(w_j)$ for $k=1,\dots,N$; the points $\{l_j\}$ cut $[0,1]$ into $N$ intervals $I_i=(l_{i-1},l_i]$ of length $\mathrm{len}(w_i)$. Independently, place $M$ equidistant points $m_0,\dots,m_M$ on $[0,1]$ with $M\gg N$ (Figure 13 of the original) and precompute the lookup table

$$\mathrm{Table}(i)=w_k\ \text{ where }\ m_i\in I_k,\qquad i=1,\dots,M-1.\tag{5.17}$$

Sampling is then a constant-time lookup: draw an integer $r$ uniformly from $[1,M-1]$ and return $\mathrm{Table}(r)$; if the drawn word happens to be $w$ itself, it is simply skipped. Two implementation details: word2vec raises the counts to the power $3/4$,

$$\mathrm{len}(w)=\frac{\mathrm{counter}(w)^{3/4}}{\sum_{u\in D}\mathrm{counter}(u)^{3/4}},$$

and uses $M=10^8$ (table_size); the table is built in InitUnigramTable.

## 6 Selected source-code details

### 6.1 Approximating σ(x)

σ is evaluated constantly during training and exponentials are expensive, so word2vec tabulates it on $[-6,6]$ (MAX_EXP = 6), split into K = EXP_TABLE_SIZE = 1000 equal pieces. A query is answered as

$$\sigma(x)\approx\begin{cases}0,&x\le -6,\\ \text{the table value nearest }x,&-6<x<6,\\ 1,&x\ge 6,\end{cases}\tag{6.1}$$

with the table filled once at startup:

```
expTable = (real *)malloc((EXP_TABLE_SIZE + 1) * sizeof(real));
for (i = 0; i < EXP_TABLE_SIZE; i++) {
  // e^x for x sweeping (-MAX_EXP, MAX_EXP)
  expTable[i] = exp((i / (real)EXP_TABLE_SIZE * 2 - 1) * MAX_EXP);
  // convert e^x into sigma(x) = e^x / (e^x + 1)
  expTable[i] = expTable[i] / (expTable[i] + 1);
}
```

### 6.2 Storing the dictionary

The dictionary D is held in a hash table vocab_hash of size $3\times10^{7}$, initialized to −1. A word $w$ goes to slot $hv(w)$; collisions are resolved by open addressing (probe the following slots until a free one is found), and lookup probes from $hv(w)$ until either $w$ or an empty slot turns up.

### 6.3 Discarding low-frequency words

After the dictionary is built, SortVocab sorts it by count and drops words occurring fewer than min_count times (default 5):

```
for (a = 0; a < size; a++) {
  // words are in decreasing order of count, so low-count words sit at the tail
  if (vocab[a].cn < min_count) {
    vocab_size--;
    free(vocab[vocab_size].word);
  } else { ... }  // re-insert the kept word into vocab_hash
}
```

### 6.4 Trimming and subsampling high-frequency words

While the dictionary is being built, whenever its size exceeds 70% of vocab_hash_size, ReduceVocab removes all words with count not exceeding min_reduce and then increments min_reduce (initially 1). Separately, very frequent words ("the" and the like) carry little information, and Mikolov et al. subsample them: with $f(w)=\mathrm{counter}(w)/\sum_{u\in D}\mathrm{counter}(u)$ and a threshold $t$ (parameter sample; the paper suggests values around $10^{-5}$), an occurrence of $w$ is discarded with probability

$$\mathrm{prob}(w)=1-\sqrt{t/f(w)}.\tag{6.2}$$

The source uses a slight variant: an occurrence of $w$ is kept when $\mathrm{ran}(w)=\sqrt{t/f(w)}+t/f(w)$ is at least a uniform random $r\in(0,1)$, and discarded otherwise.

### 6.5 Sentences and windows

A sentence is truncated at MAX_SENTENCE_LENGTH (1000) words. The window parameter (default 5) bounds the context: for each occurrence of $w$, a random $\tilde c$ is drawn from $\{1,\dots,\text{window}\}$, and $\mathrm{Context}(w)$ is the $\tilde c$ words on each side (fewer near sentence boundaries).

### 6.6 Learning-rate decay

Starting from $\eta_0$ (default 0.025), the rate is recomputed every 10 000 words as

$$\eta=\eta_0\left(1-\frac{\text{word\_count\_actual}}{\text{train\_words}+1}\right),\tag{6.3}$$

floored at $\eta_0\times10^{-4}$ so it never reaches 0.

### 6.7 Parameter initialization

The word vectors syn0 are initialized componentwise to (rand() / RAND_MAX − 0.5) / m, i.e. uniform on $[-\tfrac{1}{2m},\tfrac{1}{2m}]$, where $m$ is the vector dimension.
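The decay schedule (6.3), with its floor, can be sketched in a few lines; the helper name is mine:

```python
def adjusted_alpha(alpha0, word_count_actual, train_words):
    """Linear decay of eq. (6.3) with the floor alpha0 * 1e-4, as recomputed
    every 10 000 words in word2vec's TrainModelThread."""
    alpha = alpha0 * (1 - word_count_actual / (train_words + 1))
    return max(alpha, alpha0 * 1e-4)

assert adjusted_alpha(0.025, 0, 10**6) == 0.025             # start of training
assert adjusted_alpha(0.025, 10**6, 10**6) == 0.025 * 1e-4  # floored near the end
```

The floor keeps late updates from vanishing entirely while the bulk of the decay stays linear in the number of words processed.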
The auxiliary vectors, syn1 for the Huffman-tree internal nodes (Hierarchical Softmax) and syn1neg for Negative Sampling, are initialized to zero.

### 6.8 Multi-threaded training

Training is parallelized with pthreads (parameter num_threads):

```
pthread_t *pt = (pthread_t *)malloc(num_threads * sizeof(pthread_t));
for (a = 0; a < num_threads; a++) pthread_create(&pt[a], NULL, TrainModelThread, (void *)a);
for (a = 0; a < num_threads; a++) pthread_join(pt[a], NULL);
```

Each thread id positions itself in the training file with

```
fseek(fi, file_size / (long long)num_threads * (long long)id, SEEK_SET);
```

and trains on its own slice, updating the shared parameters without locking. In the words of Mikolov et al. (on Skip-gram): "This makes the training extremely efficient: an optimized single-machine implementation can train on more than 100 billion words in one day."

### 6.9 Remaining details

In TrainModelThread, the table of §6.1 appears inside the gradient factor g. The Hierarchical Softmax branch skips saturated samples outright:

```
if (f <= -MAX_EXP) continue;
else if (f >= MAX_EXP) continue;
else f = expTable[(int)((f + MAX_EXP) * (EXP_TABLE_SIZE / MAX_EXP / 2))];
g = (1 - vocab[word].code[d] - f) * alpha;
```

consistent with (6.1): outside $[-6,6]$, σ is effectively 0 or 1. The Negative Sampling branch clamps instead of skipping:

```
if (f > MAX_EXP) g = (label - 1) * alpha;
else if (f < -MAX_EXP) g = (label - 0) * alpha;
else g = (label - expTable[(int)((f + MAX_EXP) * (EXP_TABLE_SIZE / MAX_EXP / 2))]) * alpha;
```

Also worth noting: Hierarchical Softmax and Negative Sampling are not mutually exclusive in the code; with, for example, hs = 1 and negative = 5, both sets of parameters are trained in the same pass.

## References

[1] David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams. Learning representations by back-propagating errors. Nature, 323:533-536, 1986.