
Diffusion Probabilistic Models: Theory and Applications
By Fan Bao, Tsinghua University

Diffusion Probabilistic Models (DPMs)
• Ho et al. Denoising Diffusion Probabilistic Models (DDPM), NeurIPS 2020.
• Song et al. Score-Based Generative Modeling through Stochastic Differential Equations, ICLR 2021.
• Bao et al. Analytic-DPM: An Analytic Estimate of the Optimal Reverse Variance in Diffusion Probabilistic Models, ICLR 2022.
• Bao et al. Estimating the Optimal Covariance with Imperfect Mean in Diffusion Probabilistic Models, ICML 2022.

The Diffusion Process
• The diffusion process gradually injects noise into the data.
• It is described by a Markov chain: $q(x_0, x_1, \ldots, x_N) = q(x_0) \prod_{n=1}^{N} q(x_n \mid x_{n-1})$.
• Transition of diffusion: $q(x_n \mid x_{n-1}) = \mathcal{N}(x_n; \sqrt{1-\beta_n}\, x_{n-1}, \beta_n I)$.
• The chain $x_0 \to x_1 \to x_2 \to \cdots \to x_N$ ends at $x_N \approx \mathcal{N}(0, I)$.
(Demo images from Song et al., Score-Based Generative Modeling through Stochastic Differential Equations, ICLR 2021.)
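Composing the per-step transitions gives the closed form $x_n = \sqrt{\bar\alpha_n}\, x_0 + \sqrt{1-\bar\alpha_n}\, \epsilon$, so $x_n$ can be sampled without simulating the whole chain. A minimal NumPy sketch; the linear $\beta$ schedule is an assumed illustrative choice, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000
betas = np.linspace(1e-4, 0.02, N)   # assumed DDPM-style linear schedule
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)       # \bar{alpha}_n = prod_{i<=n} alpha_i
beta_bar = 1.0 - alpha_bar           # \bar{beta}_n

def diffuse(x0, n):
    """Sample x_n ~ q(x_n | x_0) in closed form."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[n]) * x0 + np.sqrt(beta_bar[n]) * eps

x0 = rng.standard_normal(4)          # toy "data" point
xN = diffuse(x0, N - 1)
# by the end of the chain almost no signal remains, so x_N ~ N(0, I)
print(alpha_bar[-1])                 # ~ 4e-5
```

The printed $\bar\alpha_N \approx 0$ is exactly why the chain's endpoint can be replaced by a standard Gaussian at sampling time.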

The Denoising Process
• The diffusion process in the reverse direction ⇔ a denoising process.
• Reverse factorization: $q(x_0, \ldots, x_N) = q(x_N) \prod_{n=1}^{N} q(x_{n-1} \mid x_n)$.
• Transition of denoising: $q(x_{n-1} \mid x_n) = \,?$

The Model
• Approximate the diffusion process in the reverse direction.
• Model transition: $p_n(x_{n-1} \mid x_n) = \mathcal{N}(\mu_n(x_n), \Sigma_n(x_n))$, which approximates the transition of denoising $q(x_{n-1} \mid x_n)$.
• The model: $p(x_0, \ldots, x_N) = p(x_N) \prod_{n=1}^{N} p_n(x_{n-1} \mid x_n)$, with $p(x_N) = \mathcal{N}(0, I)$.

Training Objective
• We hope $q(x_0, \ldots, x_N) \approx p(x_0, \ldots, x_N)$.
• This is achieved by minimizing their KL divergence (i.e., maximizing the ELBO):
$$\min_{\mu_n, \Sigma_n} \mathrm{KL}\big(q(x_{0:N}) \,\|\, p(x_{0:N})\big) \;\Leftrightarrow\; \max_{\mu_n, \Sigma_n} \mathbb{E}_q \log \frac{p(x_{0:N})}{q(x_{1:N} \mid x_0)}.$$
• What is the optimal solution?
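The optimal Gaussian under this KL is the moment-matched one: among Gaussians $p$, $\mathrm{KL}(q \,\|\, p)$ is minimized by matching the mean and variance of $q$. A small numerical check on a 1-D mixture (the mixture and the search grids are illustrative choices):

```python
import numpy as np

# q: a two-component Gaussian mixture with mean 0 and variance 5
xs = np.linspace(-12.0, 12.0, 4801)
dx = xs[1] - xs[0]

def normal(x, m, v):
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

q = 0.5 * normal(xs, -2.0, 1.0) + 0.5 * normal(xs, 2.0, 1.0)
mean_q = np.sum(xs * q) * dx                 # = 0
var_q = np.sum((xs - mean_q) ** 2 * q) * dx  # = 0.5*(1+4) + 0.5*(1+4) = 5

def kl_to_gaussian(m, v):
    """KL(q || N(m, v)) by numerical integration on the grid."""
    p = normal(xs, m, v)
    return np.sum(q * (np.log(q + 1e-300) - np.log(p))) * dx

ms = np.linspace(-3.0, 3.0, 61)
vs = np.linspace(0.5, 8.0, 76)
K = np.array([[kl_to_gaussian(m, v) for v in vs] for m in ms])
i, j = np.unravel_index(K.argmin(), K.shape)
print(ms[i], vs[j])   # minimizer matches the moments of q: ~ (0.0, 5.0)
```

This is the first of the proof's key steps on the next slide; the other two steps extend the same idea across the whole chain.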

Optimal Reverse Variance (Analytic-DPM)
Bao et al. Analytic-DPM: An Analytic Estimate of the Optimal Reverse Variance in Diffusion Probabilistic Models, ICLR 2022.

Notation: $\alpha_n = 1 - \beta_n$, $\bar\alpha_n = \prod_{i \le n} \alpha_i$, $\bar\beta_n = 1 - \bar\alpha_n$, $\lambda_n^2 = \bar\beta_{n-1}\beta_n/\bar\beta_n$, and $d$ is the data dimension.

Theorem (the optimal solution under a scalar variance, i.e., $\Sigma_n = \sigma_n^2 I$). The optimal solution to $\min_{\mu_n, \sigma_n^2} \mathrm{KL}(q(x_{0:N}) \,\|\, p(x_{0:N}))$ is
$$\mu_n^*(x_n) = \frac{1}{\sqrt{\alpha_n}}\Big(x_n + \beta_n \nabla_{x_n} \log q_n(x_n)\Big),$$
$$\sigma_n^{*2} = \lambda_n^2 + \frac{\beta_n^2}{\alpha_n \bar\beta_n}\Big(1 - \bar\beta_n\, \mathbb{E}_{q_n(x_n)} \big[\|\nabla_{x_n} \log q_n(x_n)\|^2 / d\big]\Big).$$

Three key steps in the proof:
➢ Moment matching
➢ The law of total variance
➢ A score representation of the moments of $q(x_0 \mid x_n)$

Noise-prediction form. Since $\nabla_{x_n} \log q_n(x_n) = -\mathbb{E}_q[\epsilon_n \mid x_n] / \sqrt{\bar\beta_n}$, the mean can be parameterized as
$$\mu_n(x_n) = \frac{1}{\sqrt{\alpha_n}}\Big(x_n - \frac{\beta_n}{\sqrt{\bar\beta_n}}\, \hat\epsilon_n(x_n)\Big),$$
estimated by predicting the noise.
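For 1-D Gaussian data $q_0 = \mathcal{N}(0, 1)$ every quantity in the theorem is analytic: all marginals $q_n$ stay $\mathcal{N}(0, 1)$, the score is $-x$, and the exact conditional $q(x_{n-1} \mid x_n)$ has variance $\beta_n$. A sketch checking that the formula reproduces this (the schedule values are assumptions):

```python
import numpy as np

N = 100
betas = np.linspace(1e-4, 0.02, N)     # assumed schedule
alphas = 1.0 - betas
abar = np.cumprod(alphas)
bbar = 1.0 - abar                      # \bar{beta}_n

n = 50                                 # some interior step (n >= 1)
lam2 = (1.0 - abar[n - 1]) * betas[n] / bbar[n]   # λ_n² = β̄_{n−1} β_n / β̄_n

# For q_0 = N(0,1): q_n = N(0, v) with v = ᾱ_n + β̄_n = 1, so
# ∇log q_n(x) = −x / v and E‖∇log q_n‖² / d = 1 / v.
v = abar[n] + bbar[n]
E_score2 = 1.0 / v
sigma2_star = lam2 + betas[n] ** 2 / (alphas[n] * bbar[n]) * (1.0 - bbar[n] * E_score2)

# exact conditional variance of q(x_{n-1} | x_n) for this jointly Gaussian chain
print(sigma2_star, betas[n])           # the two agree
```

The agreement is exact here because Gaussian data makes the posterior of the chain Gaussian; for real data the score expectation has to be estimated, which is what the noise-prediction parameterization provides.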

Optimal Diagonal Covariance
Bao et al. Estimating the Optimal Covariance with Imperfect Mean in Diffusion Probabilistic Models, ICML 2022.

Theorem (the optimal solution for a diagonal covariance, i.e., $\Sigma_n = \mathrm{diag}(\sigma_n^2(x_n))$). The optimal solution to $\min \mathrm{KL}(q(x_{0:N}) \,\|\, p(x_{0:N}))$ is
$$\mu_n^*(x_n) = \frac{1}{\sqrt{\alpha_n}}\Big(x_n + \beta_n \nabla_{x_n} \log q_n(x_n)\Big) \quad \text{(predict the noise)},$$
$$\sigma_n^{*2}(x_n) = \lambda_n^2 + \frac{\beta_n^2}{\alpha_n \bar\beta_n}\Big(\mathbb{E}_{q(x_0 \mid x_n)}\big[\epsilon_n^2\big] - \big(\mathbb{E}_{q(x_0 \mid x_n)}[\epsilon_n]\big)^2\Big) \quad \text{(predict the squared noise)},$$
where squares of vectors are taken elementwise.

Implementation Framework for Predicting the Squared Noise
• Optimal covariance expression: $\sigma_n^{*2}(x_n) = \lambda_n^2 + \text{const} \cdot \big(\mathbb{E}[\epsilon_n^2 \mid x_n] - \mathbb{E}[\epsilon_n \mid x_n]^2\big)$.
• Data $x_0$ plus Gaussian noise $\epsilon_n$ gives the noisy data $x_n$.
• A prediction network $\hat\epsilon_n(x_n)$ minimizes the mean squared error: $\min \mathbb{E}\|\epsilon_n - \hat\epsilon_n(x_n)\|^2$.
• A second prediction network $h_n(x_n)$ targets the squared noise $\epsilon_n^2$: $\min \mathbb{E}\|\epsilon_n^2 - h_n(x_n)\|^2$.

• Optimal covariance estimate based on the predicted squared noise: $\hat\sigma_n^2(x_n) = \lambda_n^2 + \text{const} \cdot \big(h_n(x_n) - \hat\epsilon_n(x_n)^2\big)$.

Noise Prediction Residual (NPR)
Bao et al. Estimating the Optimal Covariance with Imperfect Mean in Diffusion Probabilistic Models, ICML 2022.
• Generally, the mean $\mu_n(x_n) = \frac{1}{\sqrt{\alpha_n}}\big(x_n - \frac{\beta_n}{\sqrt{\bar\beta_n}}\, \hat\epsilon_n(x_n)\big)$ is not optimal, due to approximation or optimization error of $\hat\epsilon_n(x_n)$.
• Theorem (the optimal solution for a diagonal covariance, i.e., $\Sigma_n = \mathrm{diag}(\sigma_n^2(x_n))$, with an imperfect mean). The optimal solution to $\min \mathrm{KL}(q(x_{0:N}) \,\|\, p(x_{0:N}))$ with an imperfect mean replaces the central second moment by the noise prediction residual (NPR):
$$\sigma_n^{*2}(x_n) = \lambda_n^2 + \frac{\beta_n^2}{\alpha_n \bar\beta_n}\, \mathbb{E}_{q(x_0 \mid x_n)}\big[(\epsilon_n - \hat\epsilon_n(x_n))^2\big].$$

Implementation Framework for Predicting the NPR
• Optimal covariance expression: $\sigma_n^{*2}(x_n) = \lambda_n^2 + \text{const} \cdot \mathbb{E}\big[(\epsilon_n - \hat\epsilon_n(x_n))^2 \mid x_n\big]$.
• Data $x_0$ plus Gaussian noise $\epsilon_n$ gives the noisy data $x_n$.
• A prediction network $g_n(x_n)$ targets the noise residual $(\epsilon_n - \hat\epsilon_n(x_n))^2$: $\min \mathbb{E}\|(\epsilon_n - \hat\epsilon_n(x_n))^2 - g_n(x_n)\|^2$.
• Optimal covariance estimate based on the predicted noise residual: $\hat\sigma_n^2(x_n) = \lambda_n^2 + \text{const} \cdot g_n(x_n)$.

Continuous-Time DPMs (SDEs)
Song et al. Score-Based Generative Modeling through Stochastic Differential Equations, ICLR 2021.
• The continuous-timestep version replaces the Markov chain with an SDE.
• The diffusion process $q(x_0, \ldots, x_N)$ becomes a forward SDE $dx = f(x, t)\, dt + g(t)\, dw$, whose exact reverse is $dx = [f(x, t) - g(t)^2 \nabla_x \log q_t(x)]\, dt + g(t)\, d\bar{w}$.
• The model $p(x_0, \ldots, x_N)$ becomes $dx = [f(x, t) - g(t)^2 s_t(x)]\, dt + g(t)\, d\bar{w}$, where $s_t(x)$ approximates the score $\nabla_x \log q_t(x)$.
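The reverse SDE can be simulated with Euler–Maruyama. For a VP-type forward SDE $dx = -\tfrac12 \beta(t) x\, dt + \sqrt{\beta(t)}\, dw$ and data $q_0 = \mathcal{N}(0, 1)$, every marginal stays $\mathcal{N}(0, 1)$ and the score is simply $-x$, so the sampler's output is easy to verify. The $\beta(t)$ schedule is an assumed choice:

```python
import numpy as np

rng = np.random.default_rng(2)
beta = lambda t: 0.1 + 19.9 * t          # assumed β(t): β(0)=0.1, β(1)=20
T, steps = 1.0, 1000
dt = T / steps

x = rng.standard_normal(50_000)          # start from x_T ~ N(0, 1)
for i in range(steps, 0, -1):            # integrate the reverse SDE to t = 0
    b = beta(i * dt)
    score = -x                           # ∇log q_t(x) for q_t = N(0, 1)
    drift = -0.5 * b * x - b * score     # f(x, t) − g(t)² ∇log q_t(x)
    x = x - drift * dt + np.sqrt(b * dt) * rng.standard_normal(x.shape)

print(x.mean(), x.std())                 # ~ 0, 1: recovers the data law N(0, 1)
```

Swapping the analytic score for a learned $s_t(x)$ turns this loop into the actual sampler of a score-based DPM.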

Conditional DPMs: Paired Data
• We have pairs $(x_0, c)$, where $x_0$ is the data and $c$ is the condition.
• The goal is to learn the unknown conditional data distribution $q(x_0 \mid c)$.

Conditional Model
• Original model $s_n(x_n)$ → conditional model $s_n(x_n \mid c)$.
• Training: $\min_{s_n} \mathbb{E}_c\, \mathbb{E}_{q(x_n \mid c)}\, \bar\beta_n \|s_n(x_n \mid c) - \nabla_{x_n} \log q_n(x_n \mid c)\|^2$.
• Conditional DPM:
  - Discrete time: $p_n(x_{n-1} \mid x_n, c) = \mathcal{N}(\mu_n(x_n \mid c), \Sigma_n(x_n))$ with $\mu_n(x_n \mid c) = \frac{1}{\sqrt{\alpha_n}}\big(x_n + \beta_n s_n(x_n \mid c)\big)$.
  - Continuous time: $dx = [f - g^2 s_t(x \mid c)]\, dt + g\, d\bar{w}$.
• Challenge: designing the model architecture for $s_n(\cdot \mid c)$.

Discriminative Guidance
• Exact reverse SDE: $dx = [f - g^2 \nabla_x \log q_t(x \mid c)]\, dt + g\, d\bar{w}$.
• By Bayes' rule, $\nabla_x \log q_t(x \mid c) = \nabla_x \log q_t(x) + \nabla_x \log q_t(c \mid x)$: the original DPM plus a discriminative model.
• Conditional score-based SDE: $dx = [f - g^2 (s_t(x) + \nabla_x \log q_t(c \mid x))]\, dt + g\, d\bar{w}$.
• The paired data is used in the training of the discriminative model.
• Benefit: many discriminative models have well-studied architectures.

Scaled Discriminative Guidance
• Exact reverse SDE: $dx = [f - g^2 (\nabla_x \log q_t(x) + \nabla_x \log q_t(c \mid x))]\, dt + g\, d\bar{w}$.
• Scaled discriminative guidance introduces a guidance scale $\lambda$:
$$dx = [f - g^2 (\nabla_x \log q_t(x) + \lambda \nabla_x \log q_t(c \mid x))]\, dt + g\, d\bar{w}.$$
• Conditional score-based SDEs:
$$dx = [f - g^2 (s_t(x) + \lambda \nabla_x \log q_t(c \mid x))]\, dt + g\, d\bar{w},$$
$$dx = [f - g^2 (s_t(x \mid c) + \lambda \nabla_x \log q_t(c \mid x))]\, dt + g\, d\bar{w}.$$
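With two 1-D classes $q(x \mid c) = \mathcal{N}(x; \pm 2, 1)$ and equal priors, every term of the guided score is analytic, so the Bayes'-rule decomposition and the role of $\lambda$ can be checked exactly (the class means are an illustrative choice):

```python
import numpy as np

def normal(x, m):
    return np.exp(-(x - m) ** 2 / 2) / np.sqrt(2 * np.pi)

def score_uncond(x):
    """∇log q(x) for the mixture q = ½ N(−2, 1) + ½ N(2, 1)."""
    w0, w1 = normal(x, -2.0), normal(x, 2.0)
    return (w0 * (-(x + 2.0)) + w1 * (-(x - 2.0))) / (w0 + w1)

def score_classifier(x):
    """∇log q(c=1|x) = ∇log q(x|c=1) − ∇log q(x)  (Bayes' rule)."""
    return -(x - 2.0) - score_uncond(x)

def guided_score(x, lam):
    """∇log q(x) + λ ∇log q(c=1|x): the drift correction used above."""
    return score_uncond(x) + lam * score_classifier(x)

x = np.linspace(-4.0, 4.0, 9)
# λ = 1 recovers the exact conditional score ∇log q(x|c=1) = −(x − 2)
print(np.max(np.abs(guided_score(x, 1.0) - (-(x - 2.0)))))   # ~ 0
```

$\lambda = 0$ ignores the condition, $\lambda = 1$ is exact conditioning, and $\lambda > 1$ over-weights the classifier, which is the practical regime in the papers above.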

Conditioned on Label
Dhariwal et al. Diffusion Models Beat GANs on Image Synthesis.

Self Guidance
Ho et al. Unconditional Diffusion Guidance.
• Scaled discriminative guidance $dx = [f - g^2 (\nabla_x \log q_t(x) + \lambda \nabla_x \log q_t(c \mid x))]\, dt + g\, d\bar{w}$ requires an extra discriminative model.
• By Bayes' rule, $\nabla_x \log q_t(c \mid x) = \nabla_x \log q_t(x \mid c) - \nabla_x \log q_t(x)$.
• Learn the conditional & unconditional models together: introduce a token $\varnothing$, and use $s_t(x \mid \varnothing)$ to represent the unconditional case.
• Conditional score-based SDE: $dx = [f - g^2 (s_t(x \mid \varnothing) + \lambda (s_t(x \mid c) - s_t(x \mid \varnothing)))]\, dt + g\, d\bar{w}$.
• Training combines a conditional loss and an unconditional loss:
$$\min_{s} \; \mathbb{E}_c\, \mathbb{E}_{q(x_n \mid c)}\, \bar\beta_n \|s_n(x_n \mid c) - \nabla \log q_n(x_n \mid c)\|^2 \;+\; \lambda\, \mathbb{E}_{q(x_n)}\, \bar\beta_n \|s_n(x_n \mid \varnothing) - \nabla \log q_n(x_n)\|^2.$$
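At sampling time this needs no extra model: one network is evaluated with the token $\varnothing$ and with $c$, and the two outputs are combined. A minimal sketch of the combination rule with toy network outputs:

```python
import numpy as np

def cfg_score(s_cond, s_uncond, lam):
    """s_t(x|∅) + λ (s_t(x|c) − s_t(x|∅))."""
    return s_uncond + lam * (s_cond - s_uncond)

s_uncond = np.array([0.5, -1.0])   # toy output of s_t(x|∅)
s_cond = np.array([2.0, 0.0])      # toy output of s_t(x|c)

print(cfg_score(s_cond, s_uncond, 0.0))   # λ=0: the unconditional score
print(cfg_score(s_cond, s_uncond, 1.0))   # λ=1: the conditional score
print(cfg_score(s_cond, s_uncond, 3.0))   # λ>1 extrapolates beyond it: [5. 2.]
```

The combination is linear, so it extrapolates past the conditional score for $\lambda > 1$, trading diversity for condition fidelity just as scaled discriminative guidance does.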

Application: Image Super-Resolution
Saharia et al. Image Super-Resolution via Iterative Refinement.
• Paired data $(x_0, c)$: $x_0$ is the high-resolution image, $c$ is the low-resolution image.
• Learn a conditional model $s_n(\cdot \mid c)$.
• Architecture: $s_n(x_n \mid c) = \mathrm{UNet}(\mathrm{cat}(x_n, c'), n)$, where $c'$ is the bicubic interpolation of $c$.
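The conditioning itself is just an upsample-and-concatenate step before the UNet. A dependency-free NumPy sketch; nearest-neighbor upsampling stands in for the paper's bicubic interpolation, and the shapes are illustrative:

```python
import numpy as np

def make_unet_input(x_n, c):
    """cat(x_n, c'): upsample the low-res condition c to the resolution of
    the noisy image x_n, then concatenate along the channel axis."""
    sy = x_n.shape[0] // c.shape[0]
    sx = x_n.shape[1] // c.shape[1]
    c_up = c.repeat(sy, axis=0).repeat(sx, axis=1)   # nearest-neighbor upsample
    return np.concatenate([x_n, c_up], axis=-1)

x_n = np.zeros((64, 64, 3))     # noisy high-resolution image
c = np.ones((16, 16, 3))        # low-resolution condition
inp = make_unet_input(x_n, c)
print(inp.shape)                # (64, 64, 6): this is what UNet(·, n) consumes
```

Doubling the channel count is the only architectural change; the rest of the UNet is the unconditional one.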

Application: Image Super-Resolution (example results; Saharia et al., Image Super-Resolution via Iterative Refinement).

Application: Text to Image
Nichol et al. GLIDE: Towards Photorealistic Image Generation and Editing with Text-Guided Diffusion Models.
• The dataset contains pairs $(x_0, c)$, where $x_0$ is an image and $c$ is text.
• Techniques: a conditional model with self guidance.
• Challenge: designing $s_t(x \mid c)$.

Application: Text to Image
• Architecture of $s_t$: UNet + Transformer.
  - The UNet encodes the image $x$.
  - The Transformer encodes the text, and the embedding is injected into the UNet:
    the token embedding is injected after group normalization in each ResBlock, and
    the token embeddings are concatenated to the attention context in the UNet.
• Other details: the dataset is the same as DALL-E; about 2.3 billion parameters for the 64×64 UNet.

Application: Segmentation
Amit et al. SegDiff: Image Segmentation with Diffusion Probabilistic Models.
• Paired data $(x_0, c)$: $x_0$ is the segmentation map, $c$ is the image.
• Architecture: $s_t = \mathrm{UNet}(F(x_t) + G(c), t)$.

Conditional DPMs: Unpaired Data
• We only have a set of $x_0$ (data).
• The goal is to construct a conditional distribution $p(x_0 \mid c)$.

Energy Guidance
• An unconditional DPM trained from a set of $x_0$ (data): $dx = [f - g^2 s_t(x)]\, dt + g\, d\bar{w}$.
• A strategy to construct $p(x_0 \mid c)$ is to insert an energy function:
$$dx = [f - g^2 (s_t(x) - \nabla_x E_t(x, c))]\, dt + g\, d\bar{w},$$
which defines the conditional distribution $p(x_0 \mid c)$.
• The generated data tends to have a low energy $E_t(x, c)$.
• The energy depends on the specific application.
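The effect of the energy term is visible in a toy 1-D setting: take the unconditional score of $\mathcal{N}(0, 1)$, i.e. $s(x) = -x$, and a hypothetical quadratic energy $E(x, c) = (x - c)^2/2$ pulling samples toward $c$. The guided score $-x - (x - c)$ is the score of $\mathcal{N}(c/2, 1/2)$, which Langevin dynamics recovers (energy and constants are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(3)
c = 4.0                                   # the condition
x = rng.standard_normal(20_000)           # initial particles
step = 0.01

for _ in range(2000):                     # Langevin dynamics on the guided score
    guided = -x - (x - c)                 # s(x) − ∇E(x, c), with s(x) = −x
    x = x + step * guided + np.sqrt(2 * step) * rng.standard_normal(x.shape)

# the guided score is the score of N(c/2, 1/2): samples sit at low energy
print(x.mean(), x.var())                  # ~ 2.0, 0.5
```

Subtracting $\nabla_x E_t$ from the score tilts the model density by $e^{-E_t}$, which is exactly why the generated data ends up with low energy.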

Energy Guidance: Pros and Cons
• Pros: provides a framework for incorporating domain knowledge into DPMs.
• Cons: $p(x_0 \mid c)$ is very black-box, and the energy design is based on intuition.

Application: Text to Image
• High-level idea: define the energy as a negative similarity between the image and the text.
• CLIP provides a model to measure the similarity between images and texts:
  - Similarity: $\mathrm{sim}(x, c) = f(x) \cdot g(c)$, where $f$ is the image encoder and $g$ is the text encoder.
  - Energy: $E_t(x, c) = -\mathrm{sim}(x, c)$.
(Nichol et al., GLIDE.)

Application: Text to Image (example results comparing energy guidance and self guidance).

Application: Generate Low-Density Images
Vikash et al. Generating High Fidelity Data from Low-density Regions using Diffusion Models.
• Samples from the SDE of $s_t(x \mid c)$ are more similar to the high-density part of the dataset.

Application: Generate Low-Density Images
• Original SDE: $dx = [f - g^2 s_t(x \mid c)]\, dt + g\, d\bar{w}$.
• New SDE: $dx = [f - g^2 (s_t(x \mid c) - \nabla_x E_t(x, c))]\, dt + g\, d\bar{w}$.
• High-level intuition: a small energy corresponds to $x$ being away from the high-density region of class $c$:
  - $E_t(x, c) = \mathrm{sim}(f_t(x), \mu_c)$, where $f_t$ is an image encoder and $\mu_c$ is the averaged embedding of class $c$.
  - Empirically, a contrastive version of this loss is used.

Application: Generate Low-Density Images (example results: the dataset, samples from the SDE of $s_t(x \mid c)$, and samples guided by $s_t(x \mid c) - \nabla E_t(x, c)$).

Application: Image-to-Image Translation
Meng et al. Image Synthesis and Editing with Stochastic Differential Equations.
• $c$ is the reference image; $s_t$ is a DPM on the target domain.
• Sampling runs $dx = [f - g^2 s_t(x)]\, dt + g\, d\bar{w}$, which defines $p(x_0 \mid c)$.
• No energy guidance: $c$ only influences the start distribution.
• Choose an early start time $t_0 < T$.
• $p(x_{t_0} \mid c)$ is a Gaussian perturbation of $c$.
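The start-time trade-off can be simulated with the earlier toy target domain $q_0 = \mathcal{N}(0, 1)$, whose score is $-x$ at every $t$ (the $\beta(t)$ schedule is again an assumption): perturb the reference $c$ to time $t_0$, then run the target domain's reverse SDE down to $0$.

```python
import numpy as np

rng = np.random.default_rng(4)
beta = lambda t: 0.1 + 19.9 * t              # assumed β(t)
dt = 1e-3

def alpha_bar(t):
    """ᾱ(t) = exp(−∫₀ᵗ β(s) ds) for this linear schedule."""
    return np.exp(-(0.1 * t + 9.95 * t * t))

def sdedit(c, t0, n=20_000):
    a = alpha_bar(t0)
    x = np.sqrt(a) * c + np.sqrt(1.0 - a) * rng.standard_normal(n)  # perturb c
    for i in range(int(t0 / dt), 0, -1):     # reverse SDE of the target domain
        b = beta(i * dt)
        drift = -0.5 * b * x - b * (-x)      # f − g² ∇log q_t, score = −x
        x = x - drift * dt + np.sqrt(b * dt) * rng.standard_normal(n)
    return x

c = 3.0                                      # off-distribution reference image
x_faithful = sdedit(c, t0=0.2)               # early start: output stays near c
x_free = sdedit(c, t0=1.0)                   # start near T: output ~ N(0, 1)
print(x_faithful.mean(), x_free.mean())
```

A small $t_0$ keeps fidelity to the reference, while $t_0 \to T$ forgets it and samples the target domain; choosing $t_0$ is the knob behind stroke-to-painting results.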

Application: Image-to-Image Translation
• $p(x_{t_0} \mid c)$ is a Gaussian perturbation of $c$ (example results: stroke to painting).

DPMs for Downstream Tasks
• Regard DPMs as pretrained models (feature extractors).

DPMs for Downstream Segmentation
Dmitry et al. Label-Efficient Semantic Segmentation with Diffusion Models.
• DPM features are
