Knowledge Graphs in Practice for Virtual Beings: Question Answering Systems
(狗尾草智能科技人工智能研究院)
Outline
• Improving traditional QA methods with deep learning
• Deep-learning-based End2End models

[Figure: the traditional KB-QA pipeline. A question such as "What is the nationality of ___'s wife?" is semantically parsed (entity recognition, relation classification, entity disambiguation) into a logical form, which is matched against the knowledge graph (nodes e1–e12); End2End models instead score the similarity between the question and the graph directly.]

Improving Traditional QA Methods with Deep Learning
• DL-based relation template recognition:
  • Yih et al. "Semantic Parsing via Staged Query Graph Generation: Question Answering with Knowledge Base." In Proceedings of ACL 2015 (Outstanding Paper).
  • Zeng et al. "Relation Classification via Convolutional Deep Neural Network." In Proceedings of COLING 2014 (Best Paper).
  • Zeng et al. "Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks." In Proceedings of EMNLP 2015.
  • Xu et al. "Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths." In Proceedings of EMNLP 2015.

Staged Query Graph Generation (Yih et al., ACL 2015)
• QA process: build a structured semantic representation of the question (lambda calculus) and match it against the best-scoring subgraph of the knowledge graph.

Example: "Who voiced Meg on Family Guy?" maps to the logical form

$$\mathrm{argmin}\big(\lambda x.\,\mathrm{Actor}(x,\mathrm{Family\_Guy}) \wedge \mathrm{Voice}(x,\mathrm{Meg\_Griffin}),\ \lambda x.\,\mathrm{casttime}(x)\big)$$

[Figure: the corresponding query graph in Freebase. The topic entity Family Guy connects through CVT nodes (cvt1, cvt2, cvt3) to Lacey Chabert (cast from 1/31/1999) and Mila Kunis (cast from 12/26/1999); the core inferential chain carries the constraints and aggregations (character = Meg Griffin, argmin over cast start time).]

This example comes from Yih's slides (2015).

Linking the Topic Entity
For "Who voiced Meg on Family Guy?", the candidate topic-entity graphs are s1 = Family Guy and s2 = Meg Griffin (starting from the empty graph s0 = ϕ).
Entity linking:
• Step 1: lexicon-based candidate generation (anchor texts, redirect pages, and others).
• Step 2: selecting entities by using structured learning (keep the top 10).
This example comes from Yih's slides (2015).
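A minimal sketch of these two steps, with a toy alias lexicon standing in for one mined from anchor texts and redirect pages; the `LEXICON` contents and entity ids are made up, and plain link-probability ranking stands in for the structured-learning ranker:

```python
from collections import defaultdict

# Toy alias lexicon: surface form -> {entity id: anchor-link count}.
# A real lexicon would be mined from Wikipedia anchors and redirects.
LEXICON = {
    "family guy": {"m.019nnl_Family_Guy": 12000},
    "meg": {"m.02nwxl_Meg_Griffin": 900, "m.0xyz_Meg_Ryan": 450},
}

def candidate_entities(question: str, top_k: int = 10):
    """Step 1: n-gram lookup in the alias lexicon. Step 2 stand-in:
    rank candidates by link probability instead of structured learning."""
    tokens = question.lower().rstrip("?").split()
    scored = defaultdict(float)
    for i in range(len(tokens)):
        for j in range(i + 1, len(tokens) + 1):
            counts = LEXICON.get(" ".join(tokens[i:j]))
            if not counts:
                continue
            total = sum(counts.values())
            for entity, c in counts.items():
                scored[entity] = max(scored[entity], c / total)
    return sorted(scored.items(), key=lambda kv: -kv[1])[:top_k]

print(candidate_entities("Who voiced Meg on Family Guy?"))
```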

Identify the Core Inferential Chain
Starting from the topic entity, candidate relation paths are enumerated, e.g. for Family Guy:
• s3: Family Guy → cast → y → actor → x
• s4: Family Guy → writer → y → start → x
• s5: Family Guy → genre → x
For "Who voiced Meg on Family Guy?" this yields the candidate chains {cast−actor, writer−start, genre}. Chains extend 2 steps if y is a CVT node and 1 step if y is a non-CVT node.
This example comes from Yih's slides (2015).
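A sketch of this enumeration over a toy Freebase-style adjacency list; the `KB` dict, CVT markers, and names are illustrative only:

```python
# Toy Freebase-style adjacency: entity -> [(relation, object), ...].
# CVT nodes are dummy event nodes; chains pass through them in two hops.
KB = {
    "Family_Guy": [("cast", "cvt1"), ("cast", "cvt2"), ("genre", "Animation")],
    "cvt1": [("actor", "Lacey_Chabert"), ("character", "Meg_Griffin")],
    "cvt2": [("actor", "Mila_Kunis"), ("character", "Meg_Griffin")],
}
CVT_NODES = {"cvt1", "cvt2"}

def core_chains(topic_entity: str):
    """Enumerate candidate core inferential chains: 1 hop to a plain
    entity, 2 hops when the first hop lands on a CVT node."""
    chains = set()
    for r1, y in KB.get(topic_entity, []):
        if y in CVT_NODES:
            for r2, _x in KB.get(y, []):
                chains.add((r1, r2))   # e.g. ("cast", "actor")
        else:
            chains.add((r1,))          # e.g. ("genre",)
    return sorted(chains)

print(core_chains("Family_Guy"))
```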

Matching the Chain with a Convolutional Semantic Model
[Figure: the CNN architecture. Each position of `<s> w1 w2 … wT </s>` is a 15K-dimensional letter-trigram vector, convolved into 1000-dimensional feature maps, max-pooled, and projected to a 300-dimensional semantic vector.]
Word hashing: "cat" → "#ca", "cat", "at#", i.e. each word becomes a sparse letter-trigram vector such as [0…1…1…0…1…0].
The question q ("Who voiced Meg on Family Guy?") and each candidate relation r (e.g. cast-actor) are embedded into the same space, and the relation is scored with

$$p(r \mid q) = \frac{\exp\big(\cos(e_r, e_q)\big)}{\sum_{r'} \exp\big(\cos(e_{r'}, e_q)\big)}$$
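A runnable sketch of word hashing and the cosine-softmax score, using a raw bag of letter trigrams in place of the CNN tower; the vocabulary, relations, and sizes are toy-scale assumptions:

```python
import numpy as np

def letter_trigrams(word: str):
    """Word hashing from the slide: 'cat' -> {'#ca', 'cat', 'at#'}."""
    padded = f"#{word}#"
    return {padded[i:i + 3] for i in range(len(padded) - 2)}

# Tiny illustrative trigram vocabulary; real models use ~15K dimensions.
TRIGRAMS = sorted(letter_trigrams("cat") | letter_trigrams("voiced")
                  | letter_trigrams("actor") | letter_trigrams("cast"))
INDEX = {t: i for i, t in enumerate(TRIGRAMS)}

def hash_vector(text: str) -> np.ndarray:
    """Bag of letter trigrams over all words (stand-in for the CNN)."""
    v = np.zeros(len(INDEX))
    for w in text.lower().split():
        for t in letter_trigrams(w):
            if t in INDEX:
                v[INDEX[t]] += 1.0
    return v

def relation_posterior(question: str, relations: list) -> dict:
    """p(r|q) = softmax over cosine similarities, as in the slide."""
    q = hash_vector(question)
    sims = []
    for r in relations:
        e_r = hash_vector(r.replace("-", " "))
        denom = (np.linalg.norm(q) * np.linalg.norm(e_r)) or 1.0
        sims.append(np.dot(q, e_r) / denom)
    exp = np.exp(np.array(sims))
    return dict(zip(relations, exp / exp.sum()))

print(relation_posterior("who voiced meg on family guy",
                         ["cast-actor", "writer-start", "genre"]))
```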

Argument Constraints and Aggregations
Using rules, constraints are added onto the core inferential chain: starting from s3 (Family Guy → cast → y → actor → x), s6 attaches the entity constraint Meg Griffin to y, and s7 adds an argmin aggregation to x.
• If x is an entity, it can be added as an entity node.
• If x is a keyword such as "first" or "latest", it can be added as an aggregation constraint.
This example comes from Yih's slides (2015).
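A toy version of these two rules, assuming the chain and the linked entities are already known; the keyword-to-aggregation mapping is a simplification of the paper's lexicon:

```python
AGGREGATION_KEYWORDS = {"first": "argmin", "latest": "argmax"}

def attach_constraints(chain, question_entities, question_tokens):
    """Rule-based constraint attachment on a core inferential chain:
    entities become entity-node constraints, keywords like 'first' or
    'latest' become aggregation constraints."""
    constraints = [("entity", ent) for ent in question_entities]
    for tok in question_tokens:
        if tok in AGGREGATION_KEYWORDS:
            constraints.append(("aggregation", AGGREGATION_KEYWORDS[tok]))
    return {"chain": chain, "constraints": constraints}

g = attach_constraints(("cast", "actor"), ["Meg_Griffin"],
                       "who first voiced meg on family guy".split())
print(g)
```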

Ranking
The candidate query graphs for "Who voiced Meg on Family Guy?" are ranked with a log-linear model. Main features:
• Topic entity: entity-linking score
• Core inferential chain: relation-matching score (NN-based model)
• Constraints: keyword and entity matching
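A minimal log-linear ranker over candidate query graphs; the feature values and weights below are invented for illustration (a real system learns the weights from labeled query graphs):

```python
import math

# Hypothetical feature weights; in practice learned from training data.
WEIGHTS = {"entity_linking": 2.0, "relation_match": 3.5, "constraint": 1.2}

def score(features: dict) -> float:
    """Log-linear score: w . f(q, graph)."""
    return sum(WEIGHTS[k] * v for k, v in features.items())

def rank(candidates: list) -> list:
    """Softmax-normalized ranking over candidate query graphs."""
    scores = [score(c["features"]) for c in candidates]
    z = sum(math.exp(s) for s in scores)
    for c, s in zip(candidates, scores):
        c["prob"] = math.exp(s) / z
    return sorted(candidates, key=lambda c: -c["prob"])

candidates = [
    {"graph": "cast-actor + Meg_Griffin + argmin",
     "features": {"entity_linking": 0.9, "relation_match": 0.8, "constraint": 1.0}},
    {"graph": "writer-start",
     "features": {"entity_linking": 0.9, "relation_match": 0.2, "constraint": 0.0}},
]
print(rank(candidates)[0]["graph"])
```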

Results
• Benchmark: WebQuestions, 5,810 question-answer pairs derived from search query logs.
[Bar chart: average F1 (accuracy) on the WebQuestions test set. Yao-14: 33.0, Berant-13: 35.7, Bao-14: 37.5, Bordes-14b: 39.2, Berant-14: 39.9, Yang-14: 41.3, Yao-15: 44.3, Wang-14: 45.3, Yih-15: 52.5.]

Key Step: Relation Classification
Choosing among {cast−actor, writer−start, genre} for "Who voiced Meg on Family Guy?" is a relation classification problem: labeled training data → feature representation → classifier.

Traditional Feature Representation
Traditional feature extraction requires NLP preprocessing plus hand-designed features. For the sentence "The [haft]e1 of the [axe]e2 is made of yew wood" with gold label Component-Whole(e1, e2):
• Words: haft, of, the, axe
• Entity types: OBJECT (m1), PRODUCT (m2)
• Parse tree: OBJECT-NP-PP-PRODUCT
• Kernel features
Problem 1: for languages lacking NLP tools and resources, these text features cannot be extracted at all.
Problem 2: NLP preprocessing tools introduce "error accumulation": their mistakes propagate into the classifier.

Relation Classification with a Deep Convolutional Neural Network (Zeng et al., COLING 2014)
The model maps "The [haft]e1 of the [axe]e2 is made of yew wood." directly to Component-Whole(e1, e2) using two kinds of learned features:
• Lexical-level features: capture the semantics of the words themselves, i.e. the embeddings of the marked entities and their neighboring words (features L1–L5, rows such as [0.5, 0.2, −0.1, 0.4]).
• Sentence-level features: capture the context of the whole sentence.

Sentence-Level Features
For S: "[People] have been moving back into [downtown].", each token is represented by word features (WF, its embedding) and position features (PF, its relative distances to the two marked entities, e.g. −2 and 4). A convolutional layer slides over these representations, max pooling collapses them into a fixed-size output vector, and a softmax classification layer predicts the relation.
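Putting the sentence-level model together, a minimal sketch assuming PyTorch is available; vocabulary size, filter counts, and the 19 SemEval relation classes are illustrative defaults, not the paper's exact settings:

```python
import torch
import torch.nn as nn

class RelationCNN(nn.Module):
    """Sketch of the sentence-level model in Zeng et al. (2014):
    word features (WF) + position features (PF) -> convolution ->
    max pooling -> softmax classifier."""

    def __init__(self, vocab=5000, n_rel=19, d_word=50, d_pos=5,
                 n_filters=230, window=3, max_dist=62):
        super().__init__()
        self.wf = nn.Embedding(vocab, d_word)
        self.pf1 = nn.Embedding(2 * max_dist, d_pos)  # offsets to e1
        self.pf2 = nn.Embedding(2 * max_dist, d_pos)  # offsets to e2
        d_in = d_word + 2 * d_pos
        self.conv = nn.Conv1d(d_in, n_filters, window, padding=1)
        self.out = nn.Linear(n_filters, n_rel)

    def forward(self, words, pos1, pos2):
        x = torch.cat([self.wf(words), self.pf1(pos1), self.pf2(pos2)], -1)
        h = torch.tanh(self.conv(x.transpose(1, 2)))  # (B, F, T)
        h = h.max(dim=2).values                       # max pooling over time
        return self.out(h)    # logits; softmax via CrossEntropyLoss

# Toy usage: batch of 2 sentences, length 10.
model = RelationCNN()
words = torch.randint(0, 5000, (2, 10))
pos1 = torch.randint(0, 124, (2, 10))
pos2 = torch.randint(0, 124, (2, 10))
print(model(words, pos1, pos2).shape)  # torch.Size([2, 19])
```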

Experimental Results
• SemEval-2010 Task 8: experiments show that the proposed method effectively improves entity relation classification while requiring no NLP preprocessing and no complex hand-designed features.

Classifying Relations via LSTM Networks along Shortest Dependency Paths (Xu et al., EMNLP 2015)
Instead of the raw word sequence, the model runs Long Short-Term Memory networks along the shortest dependency path between the two entities, e.g. for "A trillion gallons of water have been poured into an empty region of outer space."
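A sketch of that idea, reduced to a single word channel (the paper also uses POS, grammatical-relation, and WordNet channels); sizes are illustrative:

```python
import torch
import torch.nn as nn

class SDPLSTM(nn.Module):
    """Sketch of Xu et al. (2015): run an LSTM over the words on the
    shortest dependency path (SDP) between the two entities and classify
    the relation from the final hidden state."""

    def __init__(self, vocab=5000, n_rel=19, d_word=50, d_hidden=100):
        super().__init__()
        self.emb = nn.Embedding(vocab, d_word)
        self.lstm = nn.LSTM(d_word, d_hidden, batch_first=True)
        self.out = nn.Linear(d_hidden, n_rel)

    def forward(self, path_ids):          # (B, L) word ids along the SDP
        h, _ = self.lstm(self.emb(path_ids))
        return self.out(h[:, -1])         # last state -> relation logits

# e.g. an SDP like "gallons <- poured -> into -> region" as token ids:
model = SDPLSTM()
print(model(torch.randint(0, 5000, (2, 4))).shape)  # torch.Size([2, 19])
```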

Summary
• Existing DL-based improvements to QA still follow the basic pipeline of traditional KB-QA.
• They still operate on symbolic representations, with deep models (CNN, LSTM-RNN) used for semantic relation recognition.

End2End-based QA
[Figure: an End2End model matches the question (e.g. "What is the nationality of ___'s wife?") directly against knowledge-graph nodes e1–e12 by similarity.]

Progress
• Bordes et al. "Open Question Answering with Weakly Supervised Embedding Models." In Proceedings of ECML-PKDD 2014. (basic system)
• Bordes et al. "Question Answering with Subgraph Embedding." In Proceedings of EMNLP 2014. (contextual information of answers)
• Yang et al. "Joint Relational Embeddings for Knowledge-Based Question Answering." In Proceedings of EMNLP 2014. (entity type)
• Dong et al. "Question Answering over Freebase with Multi-Column Convolutional Neural Network." In Proceedings of ACL 2015. (topic entity, relation path, contextual information)
• Bordes et al. "Large-scale Simple Question Answering with Memory Networks." arXiv preprint arXiv:1506.02075, 2015. (memory network)

Basic End2End QA System (Bordes et al. 2014)
• Currently handles only single-relation questions (Simple Questions).
Basic steps:
• Step 1: candidate generation. Entity linking finds the main entity; every entity around the main entity in the KB becomes a candidate answer.
• Step 2: candidate ranking.

Scoring in the Basic End2End QA System (Bordes et al. 2014)
The question q and a candidate triple t are embedded by summing the embeddings of their symbols:

$$f(q) = \sum_{w_k \in q} E(w_k), \qquad g(t) = \sum_{e_k \in t} E(e_k)$$

Matching is a similarity in this shared space, either

$$S(f(q), g(t)) = f(q)^{\top} g(t) \qquad \text{or} \qquad S(f(q), g(t)) = f(q)^{\top} M\, g(t)$$

The training objective is a margin ranking loss over corrupted candidates t′:

$$L = \sum \max\big(0,\ 0.1 - S(f(q), g(t)) + S(f(q), g(t'))\big)$$

Multitask learning with paraphrases adds a score that pulls paraphrased questions $q_p$ together:

$$S_p(f(q), f(q_p)) = f(q)^{\top} f(q_p)$$

[Figure: a candidate answer e2 in a subgraph e1–e4 linked by relations r1–r4.]
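A runnable sketch of this scorer and the 0.1-margin loss, assuming PyTorch; vocabulary sizes and dimensions are illustrative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmbeddingScorer(nn.Module):
    """Bordes-style scorer: sum word embeddings for the question, sum
    symbol embeddings for a KB triple, score by dot product."""

    def __init__(self, n_words=10000, n_symbols=20000, dim=64):
        super().__init__()
        self.word_emb = nn.Embedding(n_words, dim)
        self.sym_emb = nn.Embedding(n_symbols, dim)

    def f(self, q):            # q: (B, Lq) word ids
        return self.word_emb(q).sum(dim=1)

    def g(self, t):            # t: (B, Lt) entity/relation ids
        return self.sym_emb(t).sum(dim=1)

    def score(self, q, t):
        return (self.f(q) * self.g(t)).sum(dim=1)

def margin_loss(model, q, t_pos, t_neg, margin=0.1):
    """L = sum(max(0, margin - S(q, t+) + S(q, t-)))."""
    return F.relu(margin - model.score(q, t_pos) + model.score(q, t_neg)).sum()

model = EmbeddingScorer()
q = torch.randint(0, 10000, (4, 6))
t_pos = torch.randint(0, 20000, (4, 3))
t_neg = torch.randint(0, 20000, (4, 3))
print(margin_loss(model, q, t_pos, t_neg))
```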

Considering More Information (Bordes et al. 2014)
The answer-side representation is enriched with:
• Entity information
• The path from the entity mentioned in the question to the answer
• The subgraph around the answer (contextual information)
The score keeps the same form, $S(q,a) = f(q)^{\top} g(a)$, but $g(a)$ now sums the embeddings $E(e_k)$ and $E(r_k)$ of all entities $e_k$ and relations $r_k$ on the path to the answer and in the answer's context subgraph.
[Figure: a candidate answer with its path from the question entity (via r5, e5) and its surrounding subgraph e1–e6, r1–r4.]
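A small sketch of how the symbol bag for $g(a)$ can be collected (path plus 1-hop context), reusing the toy adjacency-list KB format from the earlier sketch; all names are illustrative:

```python
# Sketch: bag of symbols whose embeddings are summed into g(a) in the
# subgraph-embedding model: path to the answer plus the answer's context.
def answer_symbols(kb, path_entities, path_relations, answer):
    """Collect entity/relation ids for the path and the 1-hop context."""
    symbols = set(path_entities) | set(path_relations)
    for rel, neighbor in kb.get(answer, []):   # contextual subgraph
        symbols.add(rel)
        symbols.add(neighbor)
    return symbols

KB = {"Mila_Kunis": [("profession", "Actor"), ("nationality", "USA")]}
print(answer_symbols(KB, ["Family_Guy", "Mila_Kunis"],
                     ["cast", "actor"], "Mila_Kunis"))
```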

Multi-Column CNN (Dong et al. 2015)
Following the structure of the QA task, the model scores different dimensions of answer information, each with its own column:
• Answer type
• Answer context
• Answer path
[Figure: the framework on "when did Avatar release in UK": the answer node with its type, its path to the entity in the question (r5, e5), and its context e1–e6, r1–r4.]
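A sketch of the multi-column idea, assuming PyTorch: three question projections, one per answer aspect, with mean pooling standing in for the paper's convolutional columns:

```python
import torch
import torch.nn as nn

class MultiColumnScorer(nn.Module):
    """Three question 'columns', one per answer aspect (path, context,
    type); the final score sums the per-aspect dot products."""

    def __init__(self, vocab=10000, n_symbols=20000, dim=64):
        super().__init__()
        self.word_emb = nn.Embedding(vocab, dim)
        self.sym_emb = nn.Embedding(n_symbols, dim)
        self.columns = nn.ModuleList([nn.Linear(dim, dim) for _ in range(3)])

    def forward(self, q_ids, aspect_ids):
        # q_ids: (B, L); aspect_ids: three (B, K) symbol-id tensors
        q = self.word_emb(q_ids).mean(dim=1)            # (B, d)
        score = 0.0
        for col, ids in zip(self.columns, aspect_ids):  # path, context, type
            a = self.sym_emb(ids).mean(dim=1)           # aspect embedding
            score = score + (col(q) * a).sum(dim=1)
        return score

model = MultiColumnScorer()
q = torch.randint(0, 10000, (2, 8))
aspects = [torch.randint(0, 20000, (2, 4)) for _ in range(3)]
print(model(q, aspects))
```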

Neural Attention-based Model for QA Combining Global Knowledge Information (Zhang et al. 2016)
Problems addressed:
• Question representation: most existing methods represent the question as the average of its word embeddings, which is too simple; when attending to different parts of a candidate answer, the question representation should differ accordingly.
• Knowledge representation: embeddings trained only on the QA corpus ignore the global information in the knowledge base.
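A sketch of the attention this motivates: the question is re-summarized once per answer aspect, so each aspect sees its own question vector (the single-projection form and dimensions are illustrative, not the paper's exact architecture):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AspectAttention(nn.Module):
    """Attention over question words conditioned on an answer aspect
    (entity / type / relation / context), yielding an aspect-specific
    question representation."""

    def __init__(self, dim=64):
        super().__init__()
        self.proj = nn.Linear(dim, dim, bias=False)

    def forward(self, word_vecs, aspect_vec):
        # word_vecs: (T, d) question word embeddings; aspect_vec: (d,)
        weights = F.softmax(word_vecs @ self.proj(aspect_vec), dim=0)  # (T,)
        return weights @ word_vecs     # aspect-specific question vector

attn = AspectAttention()
q_words = torch.randn(7, 64)
for aspect in ["entity", "type", "relation", "context"]:
    q_vec = attn(q_words, torch.randn(64))  # differs per aspect embedding
    print(aspect, q_vec.shape)
```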

Incorporating global information:
• Use TransE to obtain knowledge embeddings over the whole KB, and train them jointly with the QA objective (multi-task learning).
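A minimal TransE sketch (assuming PyTorch), the translation-based embedding used here to supply globally trained entity and relation vectors; sizes are illustrative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransE(nn.Module):
    """TransE: energy(h, r, t) = ||h + r - t||, trained so that
    corrupted triples have higher energy by a margin."""

    def __init__(self, n_entities=1000, n_relations=100, dim=50):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)

    def energy(self, h, r, t):
        return (self.ent(h) + self.rel(r) - self.ent(t)).norm(p=2, dim=1)

def transe_loss(model, pos, neg, margin=1.0):
    h, r, t = pos
    h2, r2, t2 = neg
    return F.relu(margin + model.energy(h, r, t)
                  - model.energy(h2, r2, t2)).mean()

model = TransE()
pos = tuple(torch.randint(0, n, (8,)) for n in (1000, 100, 1000))
neg = tuple(torch.randint(0, n, (8,)) for n in (1000, 100, 1000))
print(transe_loss(model, pos, neg))
```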

Experimental Results
Example of the answer aspects the model attends to: entity: Slovakia; type: /location/country; relation: partially_containedby; context: /m/04dq9kf, /m/01mp…

Summary
• DL-based End2End QA follows a retrieval framework: its core is computing, in a semantic vector space, the similarity between the entities and relations in the knowledge base and the current question.
• The neural network must be designed around the characteristics of KB-QA (entity types, semantic relations, context structure).
• At present, End2End QA solves only Simple (single-relation) questions; complex and reasoning-heavy questions remain open.

Comparison on WebQuestions
(Average F1; figures taken from published evaluations on the public benchmark dataset.)

Traditional methods:
• Logistic regression (Yao 2014a): 33.0
• Structured perceptron (Fader 2014): 35.0
• DCS (Berant 2013): 35.7
• MT (Bao 2014): 37.5

End-to-end DL-based systems:
• Average embedding (Bordes 2014a): 32.0
• Subgraph embedding (Bordes 2014b): 39.2
• Relation + embedding (Yang 2014): 41.3
• Multi-column CNN (Dong 2015): 40.8
• Memory network (Bordes 2015): 42.2
• Attention model (Zhang 2016): 42.6

Traditional methods promoted by DL:
• NLG + embedding (Berant 2014): 39.9
• Question graph + LR (Yao 2015): 44.3
• MT + relation + embedding: 45.8
• Query graph + reranking (Yih 2015): 52.5
• Joint learning + answer validation (Xu 2016): 53.3

References

[Zelle, 1996] J. M. Zelle and R. J. Mooney. "Learning to parse database queries using inductive logic programming." In Proceedings of the National Conference on Artificial Intelligence, 1996, pp. 1050–1055.
[Wong, 2007] Y. W. Wong and R. J. Mooney. "Learning synchronous grammars for semantic parsing with lambda calculus." In Proceedings of the 45th Annual Meeting of the Association for Computational Linguistics, 2007.
[Lu, 2008] W. Lu, H. T. Ng, W. S. Lee, and L. S. Zettlemoyer. "A generative model for parsing natural language to meaning representations." In Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing, 2008, pp. 783–792.
[Zettlemoyer, 2005] L. S. Zettlemoyer and M. Collins. "Learning to map sentences to logical form: Structured classification with probabilistic categorial grammars." In Proceedings of the 21st Conference on Uncertainty in Artificial Intelligence, 2005, pp. 658–666.
[Clarke, 2010] J. Clarke, D. Goldwasser, M.-W. Chang, and D. Roth. "Driving semantic parsing from the world's response." In Proceedings of the 14th Conference on Computational Natural Language Learning, 2010, pp. 18–27.
[Liang, 2011] P. Liang, M. I. Jordan, and D. Klein. "Learning dependency-based compositional semantics." In Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, Volume 1, 2011, pp. 590–599.
[Cai, 2013] Qingqing Cai and Alexander Yates. "Large-scale semantic parsing via schema matching and lexicon extension." In Proceedings of ACL 2013.
[Berant, 2013] Jonathan Berant, Andrew Chou, Roy Frostig, and Percy Liang. "Semantic parsing on Freebase from question-answer pairs." In Proceedings of EMNLP 2013.
[Yao, 2014] Xuchen Yao and Benjamin Van Durme. "Information extraction over structured data: Question answering with Freebase." In Proceedings of ACL 2014.
[Unger, 2012] Christina Unger, Lorenz Bühmann, Jens Lehmann, Axel-Cyrille Ngonga Ngomo, Daniel Gerber, and Philipp Cimiano. "Template-based question answering over RDF data." In Proceedings of WWW 2012.
[Yahya, 2012] Mohamed Yahya, Klaus Berberich, Shady Elbassuoni, Maya Ramanath, Volker Tresp, and Gerhard Weikum. "Natural language questions for the web of data." In Proceedings of EMNLP 2012.
[He, 2014] S. He, K. Liu, Y. Zhang, L. Xu, and J. Zhao. "Question answering over linked data using first-order logic." In Proceedings of EMNLP 2014.
[Artzi, 2011] Y. Artzi and L. Zettlemoyer. "Bootstrapping semantic parsers from conversations." In Proceedings of EMNLP 2011, pp. 421–432.
[Goldwasser, 2011] Dan Goldwasser, Roi Reichart, James Clarke, and Dan Roth. "Confidence-driven unsupervised semantic parsing." In Proceedings of ACL 2011.
[Krishnamurthy, 2012] Jayant Krishnamurthy and Tom M. Mitchell. "Weakly supervised training of semantic parsers." In Proceedings of the 2012 Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning (EMNLP-CoNLL), 2012.
[Reddy, 2014] S. Reddy, M. Lapata, and M. Steedman. "Large-scale semantic parsing without question-answer pairs." Transactions of the Association for Computational Linguistics (TACL), 2(10):377–392, 2014.
[Fader, 2013] A. Fader, L. S. Zettlemoyer, and O. Etzioni. "Paraphrase-driven learning for open question answering." In Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics, 2013, pp. 1608–1618.
[Bordes, 2014a] Antoine Bordes, Jason Weston, and Nicolas Usunier. "Open question answering with weakly supervised embedding models." In Proceedings of ECML-PKDD 2014.
[Bordes, 2014b] Antoine Bordes, Sumit Chopra, and Jason Weston. "Question answering with subgraph embeddings." In Proceedings of EMNLP 2014.
[Yang, 2014] Min-Chul Yang, Nan Duan, Ming Zhou, and Hae-Chang Rim. "Joint relational embeddings for knowledge-based question answering." In Proceedings of EMNLP 2014.
[Dong, 2015] Li Dong, Furu Wei, Ming Zhou, and Ke Xu. "Question answering over Freebase with multi-column convolutional neural networks." In Proceedings of ACL 2015.
[Bordes, 2015] Antoine Bordes, Nicolas Usunier, Sumit Chopra, and Jason Weston. "Large-scale simple question answering with memory networks." arXiv preprint arXiv:1506.02075, 2015.
