Gran'ma or Grammar: A Probabilistic Approach to Natural Language Processing
Lars Winderling, MPI/LMU München, Torsten Enßlin's Arbeitsgruppe, Seminar on Information Theory and Signal Reconstruction
September 30, 2011
(MPI/ LMU)
NLP
September 30, 2011
1 / 35
Outline
1. Outline of NLP
   - Introduction
   - n-gram model
   - Grammar models
   - Route for today
2. Natural language grammar
   - Example 1: A cat hit the tree
   - Example 2: Dr. Ruth to talk about sex
3. PCFG
   - Define a Context-Free Grammar
   - Make the CFG useable
   - PCFG
4. The Hidden Markov model
   - The Markov Chain
   - The Hidden Markov model
   - Example 3: Mr. Sunshine and the weather
5. The Bayes' way
   - How Bayes comes into play
   - Definition of the probabilities
   - Maximum likelihood
6. Fit grammar to data
   - How to create a new rule
   - Example 4: Create a new rule
7. End of the story
8. Literature
Outline of NLP
Introduction
What's the problem, man?
- Natural language processing (NLP)
- Aim: compute language-specific problems
- Applicability: text recognition/production, speech recognition, spam filters, error detection, translation, ...
- Many approaches for many topics
Outline of NLP
Introduction
Problems of NLP
- A perfect language model is out of reach
- Rely on probabilistics
- Piece of language → does it fit in?
- Balance between computational simplicity and methodical power
- Right approach for a specific topic
Outline of NLP
Introduction
Some common approaches
Two ways:
- Word frequency-based
- Grammar-based
Outline of NLP
n-gram model
n-gram
- n-gram: Markov chain with (n-1)-step memory
- Word frequencies result in (constrained) word probabilities
- Example of a trigram (3-gram) model: "Kate went to school."

  p(Kate went to school.) = p(Kate|SB SB) p(went|SB Kate) p(to|Kate went) × p(school|went to) p(SE|to school) p(SE|school SE)

  (SB and SE denote sentence-begin and sentence-end padding symbols)
- Only 2-/3-grams are tractable
- Context not captured
- Result depends highly on the text corpus / training data
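The trigram factorisation above can be sketched in plain Python. This is a toy illustration with maximum-likelihood counts, not the talk's implementation; SB and SE are the begin/end padding symbols from the slide:

```python
from collections import defaultdict

def train_trigrams(sentences):
    """Count trigrams and their (w1, w2) contexts over padded sentences."""
    tri = defaultdict(int)
    ctx = defaultdict(int)
    for s in sentences:
        toks = ["SB", "SB"] + s.split() + ["SE", "SE"]
        for a, b, c in zip(toks, toks[1:], toks[2:]):
            tri[(a, b, c)] += 1
            ctx[(a, b)] += 1
    return tri, ctx

def p_sentence(s, tri, ctx):
    """p(s) as a product of trigram probabilities p(w3 | w1 w2)."""
    toks = ["SB", "SB"] + s.split() + ["SE", "SE"]
    p = 1.0
    for a, b, c in zip(toks, toks[1:], toks[2:]):
        if ctx[(a, b)] == 0:          # unseen context: probability undefined, use 0
            return 0.0
        p *= tri[(a, b, c)] / ctx[(a, b)]
    return p

tri, ctx = train_trigrams(["Kate went to school ."])
print(p_sentence("Kate went to school .", tri, ctx))  # 1.0 on its own training data
```

Real systems smooth these counts, since any unseen trigram otherwise drives the sentence probability to zero, which is exactly the corpus dependence noted above.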
Outline of NLP
Grammar models
Grammar models
- Try to model further dependencies
- Could increase precision
- Computationally costly
- Many still at the academic/testing stage
- For natural text, still behind n-grams
Outline of NLP
Grammar models
Grammar models
- Vary by the complexity of the modelled word interdependencies
- Only partly comparable to natural grammars
- Closer to artificial languages
- Two roads:
  - Use an annotated text corpus
  - Use a plain text corpus
Outline of NLP
Route for today
Route for today
- Introduce the probabilistic Context-Free Grammar
- Start with English grammar
- Give an outlook on artificial grammars
- Pick out one example
- Mention prior assumptions
- Tell you why Bayes was a good guy
Natural language grammar
Decoding grammatical structures
Task: Split the information content of a sentence.
- First refer to the grammatical structure, then look at the inter-relations between the parts.
- First stick to natural grammar, then go to a higher abstraction layer.
Natural language grammar
Example 1: A cat hit the tree
Example of a Parse Tree

We start with a simple example: "A cat hit the tree."
- "A cat" parses as determiner + noun (D N), forming a noun phrase (NP).
- "hit the tree" parses as verb + determiner + noun (V D N), forming a verb phrase (VP).

The full parse tree, in bracket notation:

[S [NP [D A] [N cat]] [VP [V hit] [NP [D the] [N tree]]]]
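Such a parse tree is conveniently represented as nested tuples. A small sketch (not the talk's code; the labels D, N, V, NP, VP, S follow the slide):

```python
# (label, children...) -- leaves are plain word strings
tree = ("S",
        ("NP", ("D", "A"), ("N", "cat")),
        ("VP", ("V", "hit"),
               ("NP", ("D", "the"), ("N", "tree"))))

def yield_words(t):
    """Read the sentence back off the leaves, left to right."""
    if isinstance(t, str):
        return [t]
    label, *children = t
    words = []
    for c in children:
        words += yield_words(c)
    return words

print(" ".join(yield_words(tree)))  # A cat hit the tree
```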
Natural language grammar
Example 2: Dr. Ruth to talk about sex
An extended example

Consider the sentence: "Dr. Ruth to talk about sex with newspaper editors."
It contains a small ambiguity. Let's apply a parse tree to it!

Reading 1 (the prepositional phrase "with newspaper editors" attaches to the verb phrase: Dr. Ruth talks with the editors):

[S [NP [PN Dr.] [PN Ruth]]
   [VP [VP' [V to talk] [PP [P about] [N sex]]]
       [PP [P with] [NP [N newspaper] [N editors]]]]]

Reading 2 (the same PP attaches to "sex": the talk is about sex with newspaper editors):

[S [NP [PN Dr.] [PN Ruth]]
   [VP [V to talk]
       [PP [P about] [NP [N sex] [PP [P with] [NP [N newspaper] [N editors]]]]]]]
PCFG
Define a Context-Free Grammar
The Context-Free Grammar

Remember the first parse tree? We can use it to define a Context-Free Grammar (CFG). Reading the rules off the tree for "A cat hit the tree":

S  → NP VP
NP → D N
D  → a | the
N  → cat | tree
VP → V NP
V  → hit
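This toy CFG is small enough to enumerate exhaustively. A sketch (not the talk's code; the rule table below encodes the rules above, with "|" alternatives split into separate right-hand sides):

```python
import itertools

# The toy CFG read off the parse tree; each right-hand side is a sequence.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["D", "N"]],
    "VP": [["V", "NP"]],
    "D":  [["a"], ["the"]],
    "N":  [["cat"], ["tree"]],
    "V":  [["hit"]],
}

def expand(symbol):
    """All terminal strings derivable from `symbol` (finite for this grammar)."""
    if symbol not in GRAMMAR:                    # terminal word
        return [[symbol]]
    results = []
    for rhs in GRAMMAR[symbol]:
        child_options = [expand(s) for s in rhs]
        for combo in itertools.product(*child_options):
            results.append([w for part in combo for w in part])
    return results

sentences = [" ".join(s) for s in expand("S")]
print(len(sentences))                      # 16: (2 determiners x 2 nouns) squared
print("a cat hit the tree" in sentences)   # True
```

Note that the grammar happily derives "the tree hit a cat" as well: a CFG constrains structure, not meaning.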
PCFG
Make the CFG useable
Make the CFG useable

So far we have only considered familiar syntactical objects. For practicability, we need abstract symbols. We define a simple CFG:

S → X
S → S X

It generates left-branching chains: [S X], [S [S X] X], [S [S [S X] X] X], and so on. We can use the symbol X for further branching.
PCFG
P + CFG = PCFG?

P stands for probabilistic. We start with P & CFG to arrive at PCFG. An example:

S → X    (0.9)
S → S X  (0.1)

The probability of a derivation is the product of the probabilities of the rules applied: the tree [S X] has probability 0.9, the tree [S [S X] X] has probability 0.1 × 0.9, and so on.
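A minimal sketch of this PCFG, treating X as a terminal symbol (an assumption; the slides expand X further):

```python
import random

# PCFG from the slide: S -> X with probability 0.9, S -> S X with probability 0.1.
RULES = {"S": [(["X"], 0.9), (["S", "X"], 0.1)]}

def sample(symbol="S", rng=None):
    """Sample a derivation top-down; return (terminals, derivation probability)."""
    rng = rng or random.Random()
    if symbol not in RULES:                       # terminal symbol
        return [symbol], 1.0
    r, acc = rng.random(), 0.0
    for rhs, p in RULES[symbol]:                  # pick a rule by its probability
        acc += p
        if r <= acc:
            words, prob = [], p
            for s in rhs:
                w, q = sample(s, rng)
                words, prob = words + w, prob * q
            return words, prob

def p_n_xs(n):
    """Probability of deriving exactly n X's:
    apply S -> S X (0.1) n-1 times, then S -> X (0.9)."""
    return 0.9 * 0.1 ** (n - 1)

words, prob = sample(rng=random.Random(0))
print(words, prob)
```

Summing p_n_xs(n) over all n ≥ 1 gives 1, i.e. this PCFG is a proper probability distribution over strings.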
The Hidden Markov model
The Markov Chain
Hidden Markov model

Task: Word frequencies ⇒ most likely grammar / system state.

Remember the Markov model / Markov chain:
- t_i a state
- ⇒ p(t_{i+1} | t_i)   (transition probability)
- p(t_2 | t_0, t_1) = p(t_2 | t_1)  ∀ t_0 < t_1 < t_2   (Markov property)
The Hidden Markov model
The Hidden Markov model
Hidden Markov model

Task: Word frequencies ⇒ most likely grammar / system state.
- Probabilities of the PCFG ≙ transition probabilities, e.g. p(X|S) for the step S → X, or p(cat|V) = 1 for emitting the word "cat"
- Word frequencies ≙ output data
- ? Word frequencies ⇒ transition probabilities
- ? Output data ⇒ grammar
The Hidden Markov model
Example 3: Mr. Sunshine and the weather
Mr. Sunshine: Output frequency for "sun" [figure: 85% / 15%]
Mr. Sunshine: Output frequency for "rain" [figure: 5% / 95%]
Mr. Sunshine: Markovian weather change [figure: transition probabilities 99% / 10%]
Mr. Sunshine: Hidden Markov: "Two days ago, I went for a walk!" Sun or rain?
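The forward algorithm answers exactly Mr. Sunshine's kind of question: how probable is an observed report sequence under the hidden weather chain? The numbers below are one plausible reading of the lecture's figures and should be treated as assumptions, not as the talk's actual model:

```python
# Two-state hidden Markov model: hidden weather, observed reports.
states = ["sunny", "rainy"]
start = {"sunny": 0.5, "rainy": 0.5}                      # assumed uniform start
trans = {"sunny": {"sunny": 0.99, "rainy": 0.01},         # assumed reading of 99%
         "rainy": {"sunny": 0.10, "rainy": 0.90}}         # assumed reading of 10%
emit = {"sunny": {"sun": 0.85, "rain": 0.15},             # output frequencies, sunny
        "rainy": {"sun": 0.05, "rain": 0.95}}             # output frequencies, rainy

def forward(obs):
    """p(obs sequence) summed over all hidden state paths (forward algorithm)."""
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[r] * trans[r][s] for r in states) * emit[s][o]
                 for s in states}
    return sum(alpha.values())

print(forward(["sun", "sun", "rain"]))
```

Replacing the sum with a max (the Viterbi algorithm) recovers the most likely hidden weather sequence, which is the "sun or rain?" question on the slide.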
The Bayes' way
How Bayes comes into play
How Bayes comes into play
(MPI/ LMU)
NLP
September 30, 2011
23 / 35
The Bayes' way
How Bayes comes into play
How Bayes comes into play
Rule:
Tree branching + probability
(MPI/ LMU)
NLP
September 30, 2011
23 / 35
The Bayes' way
How Bayes comes into play
How Bayes comes into play
Rule: Grammar:
Tree branching + probability set of rules for modelling a language
(MPI/ LMU)
NLP
September 30, 2011
23 / 35
The Bayes' way
How Bayes comes into play
How Bayes comes into play
Rule: Grammar: Output:
Tree branching + probability set of rules for modelling a language Word frequencies of text korpus
(MPI/ LMU)
NLP
September 30, 2011
23 / 35
The Bayes' way
How Bayes comes into play
How Bayes comes into play
Rule: Grammar: Output: Task:
Tree branching + probability set of rules for modelling a language Word frequencies of text korpus Compute probability of signal and data
(MPI/ LMU)
NLP
September 30, 2011
23 / 35
The Bayes' way
How Bayes comes into play
How Bayes comes into play
Rule: Grammar: Output: Task:
Tree branching + probability set of rules for modelling a language Word frequencies of text korpus Compute probability of signal and data
(MPI/ LMU)
NLP
September 30, 2011
23 / 35
The Bayes' way
How Bayes comes into play

Grammar ≙ signal
Output  ≙ data

Normal way:  maximize the posterior p(G|O) ≙ p(s|d)
Bayes says:  p(G|O) = p(O|G) p(G) / p(O)
Maximize:    p(O|G) p(G) with respect to G (p(O) does not depend on G)
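The "Bayes says" step can be sketched in a few lines of Python. The two candidate grammars and all numbers below are invented purely for illustration; only the rule "maximize p(O|G) p(G), since p(O) is a constant" comes from the slides.

```python
# Comparing two hypothetical grammars G1, G2 by the unnormalized
# posterior p(O|G) * p(G). All numbers are made up for illustration.
candidates = {
    "G1": {"likelihood": 1e-6, "prior": 0.5},   # simple grammar, mediocre fit
    "G2": {"likelihood": 4e-6, "prior": 0.1},   # complex grammar, better fit
}

def unnormalized_posterior(g):
    """p(O|G) * p(G); p(O) is constant in G and can be ignored for the argmax."""
    return g["likelihood"] * g["prior"]

scores = {name: unnormalized_posterior(g) for name, g in candidates.items()}
best = max(scores, key=scores.get)

# Normalizing by p(O) = sum of scores recovers the true posterior p(G|O):
evidence = sum(scores.values())
posteriors = {name: s / evidence for name, s in scores.items()}
```

Note that the simpler grammar wins here despite the lower likelihood, because its prior is higher; this is exactly the trade-off the following slides make precise.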
The Bayes' way
Definition of the probabilities
What about the single probabilities?

p(O|G) = ∏_i p(O_i|G),  with O_i a sentence in O
p(G): high if the grammar is as simple as possible
Idea: encode G and use it (simplicity ≙ short description length),
      e.g. an encoding in a computer language
l(G) = number of encoded bits
p(G) = 2^(−l(G))
The prior captures our intuition!
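The definitions above combine into a single score. A minimal sketch, assuming toy numbers (the grammar sizes and sentence probabilities below are invented); working in log space avoids numerical underflow for long corpora:

```python
import math

# p(G) = 2^(-l(G)) with l(G) the encoded length in bits, and
# p(O|G) = product over the sentences O_i of p(O_i|G).

def log2_prior(encoded_bits):
    """log2 p(G) = -l(G)."""
    return -encoded_bits

def log2_likelihood(sentence_probs):
    """log2 p(O|G) = sum of log2 p(O_i|G) over the corpus."""
    return sum(math.log2(p) for p in sentence_probs)

def log2_posterior_score(encoded_bits, sentence_probs):
    """log2 [p(O|G) p(G)], the quantity to maximize."""
    return log2_prior(encoded_bits) + log2_likelihood(sentence_probs)

# A 100-bit grammar that gives each of 3 sentences probability 1/8
# beats a 200-bit grammar that gives each of them probability 1/2:
small = log2_posterior_score(100, [1/8] * 3)   # -100 - 9 = -109
large = log2_posterior_score(200, [1/2] * 3)   # -200 - 3 = -203
```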
The Bayes' way
Maximum likelihood

It is possible to maximize the likelihood p(O|G) alone.
Adding rules fits G to O.
Problem: a grammar G with l(G) = ∞ models all data, i.e. p(O|G) = 1 for l(G) = ∞.
We use the Bayes framework, which favours smaller grammars.
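The overfitting problem above can be made concrete. In this toy (all numbers invented), a grammar that simply memorizes the corpus reaches p(O|G) = 1, but only at the cost of an enormous description length, so the MDL prior reverses the verdict:

```python
grammars = {
    "memorizer": {"bits": 10_000, "log2_likelihood": 0.0},   # p(O|G) = 1
    "compact":   {"bits": 50,     "log2_likelihood": -300.0},
}

def log2_map_score(g):
    """log2 [p(O|G) p(G)] with the description-length prior p(G) = 2^(-bits)."""
    return g["log2_likelihood"] - g["bits"]

# Maximum likelihood picks the memorizer...
ml_winner = max(grammars, key=lambda n: grammars[n]["log2_likelihood"])
# ...but the Bayes/MDL score picks the compact grammar.
map_winner = max(grammars, key=lambda n: log2_map_score(grammars[n]))
```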
How to create a new rule
HOWTO: Find the best grammar

Task: given a text corpus ⇒ fit a grammar to it.
Two obvious approaches:
  Minimal grammar ⇒ create new rules
  Maximal grammar ⇒ delete superfluous rules
For n symbols the maximal grammar has O(n³) rules; n ≫ 1 makes this impracticable.
Choice: minimal grammar ⇒ create new rules.
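The chosen strategy can be sketched as a greedy search: start from the minimal grammar and keep a proposed rule only if it improves the score (in this deck, log₂ of p(O|G) p(G)). The helper names (`propose_rules`, `score`) and the toy at the bottom are assumptions for illustration, not from the slides:

```python
def fit_grammar(corpus, grammar, propose_rules, score, max_iter=100):
    """Greedily extend `grammar` while some proposed rule improves the score."""
    best = score(grammar, corpus)
    for _ in range(max_iter):
        improved = False
        for candidate in propose_rules(grammar, corpus):
            s = score(candidate, corpus)
            if s > best:
                grammar, best, improved = candidate, s, True
        if not improved:
            break  # local optimum: no single new rule helps any more
    return grammar

# Tiny runnable toy: a "grammar" is the set of words it can produce; the
# score rewards coverage and penalizes size (a stand-in for the MDL prior).
def toy_score(g, corpus):
    return sum(1 for w in corpus if w in g) - 0.1 * len(g)

def toy_proposals(g, corpus):
    return [g | {w} for w in corpus if w not in g]

result = fit_grammar(["ann", "calls", "me"], frozenset(), toy_proposals, toy_score)
```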
How to create a new rule
Example 4: Create a new rule
Example: Create a new rule

Minimal grammar:
  S   → X        (ε)
  S   → S X      (1 − ε)
  X   → A_i      (p(X → A_i))
  A_i → A_j      (p(A_i → A_j))

New rules:
  B → A_i    (p(B → A_i))
  X → B      (p(X → B))
How to create a new rule
Example 4: Create a new rule
Example: Create a new rule

Start with the sentence "Ann calls me" and the grammar
  S → X          (ε)
  S → S X        (1 − ε)
  X → A_Ann      (1/3)
  X → A_calls    (1/3)
  X → A_me       (1/3)
Parse tree: S → S X → (S → S X → (S → X → A_Ann)(X → A_calls))(X → A_me), hence
  p(S → ...) = (1 − ε)² ε (1/3)³

Adding "Mary calls me" extends the grammar to
  S → X          (ε)
  S → S X        (1 − ε)
  X → A_Ann      (1/6)
  X → A_Mary     (1/6)
  X → A_calls    (1/3)
  X → A_me       (1/3)
Each of the two sentences now has
  p(S → ...) = (1 − ε)² ε (1/6)(1/3)²
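The tree probabilities above follow a single rule: the probability of a parse tree is the product of the probabilities of the rules it uses. A minimal sketch, with ε arbitrarily fixed to 1/10 and exact rational arithmetic (the data layout is an assumption, the rule probabilities are the slide's):

```python
from fractions import Fraction
from functools import reduce
from operator import mul

eps = Fraction(1, 10)  # arbitrary illustrative choice
rule_prob = {
    ("S", ("X",)):       eps,
    ("S", ("S", "X")):   1 - eps,
    ("X", ("A_Ann",)):   Fraction(1, 6),
    ("X", ("A_Mary",)):  Fraction(1, 6),
    ("X", ("A_calls",)): Fraction(1, 3),
    ("X", ("A_me",)):    Fraction(1, 3),
}

def tree_prob(rules):
    """Probability of a parse tree = product of the used rule probabilities."""
    return reduce(mul, (rule_prob[r] for r in rules), Fraction(1))

# Derivation of "Ann calls me": two S -> S X steps, one S -> X step,
# then one lexical rule per word.
ann_calls_me = [
    ("S", ("S", "X")), ("S", ("S", "X")), ("S", ("X",)),
    ("X", ("A_Ann",)), ("X", ("A_calls",)), ("X", ("A_me",)),
]
p = tree_prob(ann_calls_me)   # (1-eps)**2 * eps * (1/6) * (1/3)**2
```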
How to create a new rule
Example 4: Create a new rule
Example: Create a new rule

[Parse trees for "Ann calls me" and "Mary calls me", shown side by side]
How to create a new rule
Example 4: Create a new rule
Example: Create a new rule

New rules:
  X          → A_calls_me      (1/2)
  A_calls_me → A_calls A_me    (1)
  X          → A_Ann           (1/4)
  X          → A_Mary          (1/4)

Parse tree for "Ann calls me": S → S X → (S → X → A_Ann)(X → A_calls_me → A_calls A_me);
analogously for "Mary calls me".

p(S → ...) = ε(1 − ε)(1/4)(1/2)   instead of   p(S → ...) = ε(1 − ε)²(1/6)(1/3)²
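A quick check (with ε again arbitrarily set to 1/10) confirms that the new rule makes both sentences more probable, which is why the likelihood rewards creating it:

```python
from fractions import Fraction

eps = Fraction(1, 10)  # arbitrary illustrative choice

# Before the new rule: "Ann calls me" needs two S -> S X steps,
# with word probabilities 1/6 (Ann), 1/3 (calls), 1/3 (me).
p_old = (1 - eps)**2 * eps * Fraction(1, 6) * Fraction(1, 3)**2

# After adding X -> A_calls_me (1/2), A_calls_me -> A_calls A_me (1)
# and X -> A_Ann (1/4): a single S -> S X step suffices.
p_new = (1 - eps) * eps * Fraction(1, 4) * Fraction(1, 2)

ratio = p_new / p_old   # how much the new rule helps
```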
End of the story
End of presentation

We considered today:
  Different approaches to NLP
  Word-frequency- and grammar-based methods
  CFG using parse trees
  Fitting a grammar to data
  The notion of a short-description-length prior
End of the story
Literature
...will be given later...
End of the story
Thank you for your interest ... and questions!
End of the story
EOF