Bayesian Image Modeling by Generalized Sparse Markov Random Fields and
Loopy Belief Propagation
Kazuyuki Tanaka
GSIS, Tohoku University, Sendai, Japan
http://www.smapip.is.tohoku.ac.jp/~kazu/
Collaborators: Muneki Yasuda (GSIS, Tohoku University, Japan)
Shun Kataoka (GSIS, Tohoku University, Japan)
D. M. Titterington (Department of Statistics, University of Glasgow, UK)
20 March, 2013 SPDSA2013, Sendai, Japan
Outline
1. Supervised Learning of Pairwise Markov Random Fields by Loopy Belief Propagation
2. Bayesian Image Modeling by Generalized Sparse Prior
3. Noise Reductions by Generalized Sparse Prior
4. Concluding Remarks
Probabilistic Model and Belief Propagation
Probabilistic Information Processing
Probabilistic Models
Bayes Formulas
Belief Propagation = Bethe Approximation
Bayesian Networks, Markov Random Fields
For a pairwise Markov random field on a graph G = (V, E),

P(\mathbf{f}) \propto \prod_{\{i,j\} \in E} \exp\bigl(-U(f_i, f_j)\bigr),

loopy belief propagation computes effective fields (messages) by the update rule

M_{j \to i}(f_i) = \frac{1}{\mathrm{Constant}} \sum_{f_j} \exp\bigl(-U(f_i, f_j)\bigr) \prod_{\{k,j\} \in E,\ k \neq i} M_{k \to j}(f_j).

V: set of all the nodes (vertices) in graph G.  E: set of all the links (edges) in graph G.  Message = effective field.
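As a concrete illustration of the message-passing rule above, the following Python sketch (numpy, the (q, q) energy array, and the function name are our own choices for illustration, not taken from the slides) performs a single update of the message M_{j->i} for a q-state pairwise MRF:

import numpy as np

def update_message(U, messages_to_j, q):
    """One LBP update: M_{j->i}(f_i) is proportional to
    sum_{f_j} exp(-U(f_i, f_j)) * prod_k M_{k->j}(f_j), with k ranging over the
    neighbours of j other than i.
    U             : (q, q) array, U[f_i, f_j] is the pairwise energy
    messages_to_j : list of length-q arrays, the messages M_{k->j} with k != i
    """
    prod = np.ones(q)
    for m in messages_to_j:
        prod *= m                    # combine the incoming messages at node j
    new_m = np.exp(-U) @ prod        # sum over f_j of exp(-U(f_i, f_j)) * prod(f_j)
    return new_m / new_m.sum()       # normalization (the "Constant" above)

# toy usage: q = 3 states, U(f_i, f_j) = |f_i - f_j|
q = 3
states = np.arange(q)
U = np.abs(states[:, None] - states[None, :]).astype(float)
print(update_message(U, [np.ones(q) / q, np.array([0.7, 0.2, 0.1])], q))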
Supervised Learning of Pairwise Markov Random Fields by Loopy Belief Propagation

The prior probability of natural images is assumed to be the following pairwise Markov random field:

\Pr\{\mathbf{F} = \mathbf{f}\} = P(\mathbf{f}) \propto \exp\Bigl(-\sum_{\{i,j\} \in E} U(f_i, f_j)\Bigr), \qquad f_i \in \{0, 1, \ldots, q-1\}.
Supervised Learning of Pairwise Markov Random Fields by Loopy Belief Propagation

Supervised learning scheme by loopy belief propagation in pairwise Markov random fields: in the Bethe (LBP) approximation the joint probability is expressed through the one- and two-pixel marginals as

P(\mathbf{f}) \simeq \prod_{\{i,j\} \in E} \frac{P_{ij}(f_i, f_j)}{P_i(f_i)\, P_j(f_j)} \prod_{i \in V} P_i(f_i),

so the interactions U(f_i, f_j) can be read off from the marginals P_i(f_i) and P_{ij}(f_i, f_j). In the supervised setting these marginals are estimated as histograms from the supervised (training) data.
M. Yasuda, S. Kataoka and K. Tanaka: J. Phys. Soc. Jpn., Vol.81, No.4, Article No.044801, 2012.
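Since the Bethe expression above only involves one- and two-pixel marginals, the learning step reduces to collecting histograms from the training data. The following Python sketch (our own illustration, not the authors' code; the uniform degree of 4 assumes interior pixels of a square lattice) estimates the empirical marginals of a q-level image and reads off a pairwise energy of the Bethe form, distributing each single-site term evenly over its incident edges:

import numpy as np

def empirical_marginals(img, q):
    """Histograms of single pixels and of horizontally/vertically adjacent pixel pairs."""
    P1 = np.bincount(img.ravel(), minlength=q).astype(float)
    P1 /= P1.sum()
    P2 = np.zeros((q, q))
    for a, b in [(img[:, :-1], img[:, 1:]), (img[:-1, :], img[1:, :])]:
        np.add.at(P2, (a.ravel(), b.ravel()), 1.0)
    P2 = (P2 + P2.T) / (2.0 * P2.sum())        # symmetrize and normalize
    return P1, P2

def bethe_pair_energy(P1, P2, degree=4, eps=1e-12):
    """U(f_i, f_j) read off from the Bethe form of P(f); 'degree' is the lattice
    coordination number (4 for interior pixels of a square lattice)."""
    w = (degree - 1.0) / degree                # share of each ln P_i assigned to one edge
    logP1 = np.log(P1 + eps)
    return -np.log(P2 + eps) + w * logP1[:, None] + w * logP1[None, :]

img = np.random.randint(0, 8, size=(64, 64))   # stand-in for a supervised (training) image
P1, P2 = empirical_marginals(img, q=8)
U = bethe_pair_energy(P1, P2)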
Supervised Learning of Pairwise Markov Random Fields by Loopy Belief Propagation

Supervised learning scheme by loopy belief propagation in pairwise Markov random fields: the learned pairwise energy is well fitted by a power of the grey-level difference,

\exp\bigl(-U(f_i, f_j)\bigr) \sim \exp\Bigl(-\tfrac{1}{2} |f_i - f_j|^{0.45}\Bigr),

i.e. an exponent close to 0.45, which motivates the generalized sparse prior introduced next.
Bayesian Image Modeling by Generalized Sparse Prior

Assumption: the prior probability is given as the following Gibbs distribution, with an interaction between every nearest-neighbour pair of pixels:

\Pr\{\mathbf{F} = \mathbf{f} \mid \lambda, p\} = P(\mathbf{f} \mid \lambda, p) = \frac{1}{Z(\lambda, p)} \exp\Bigl(-\frac{\lambda}{2} \sum_{\{i,j\} \in E} |f_i - f_j|^p\Bigr), \qquad f_i \in \{0, 1, \ldots, q-1\}.

p = 0: q-state Potts model.  p = 2: discrete Gaussian graphical model.
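To make the role of the exponent p concrete, here is a small Python sketch (our own illustrative code; the normalization constant Z is not computed) of the prior energy (lambda/2) * sum |f_i - f_j|^p on a square lattice, here with lambda = 1:

import numpy as np

def prior_energy(f, p):
    """(1/2) * sum over nearest-neighbour pairs of |f_i - f_j|**p on a square lattice."""
    dh = np.abs(f[:, 1:] - f[:, :-1]).astype(float)   # horizontal grey-level differences
    dv = np.abs(f[1:, :] - f[:-1, :]).astype(float)   # vertical grey-level differences
    return 0.5 * (np.sum(dh ** p) + np.sum(dv ** p))

f = np.random.randint(0, 16, size=(32, 32))
print(prior_energy(f, p=2.0))   # p = 2: discrete Gaussian graphical model
print(prior_energy(f, p=0.5))   # 0 < p < 1: sparse prior
# As p -> 0+, |f_i - f_j|**p tends to 1 for f_i != f_j and to 0 for f_i == f_j,
# i.e. the q-state Potts energy (p = 0 itself needs the convention 0**0 = 0).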
Prior in Bayesian Image Modeling

\Pr\{\mathbf{F} = \mathbf{f} \mid \lambda, p\} = \frac{1}{Z(\lambda, p)} \exp\Bigl(-\frac{\lambda}{2} \sum_{\{i,j\} \in E} |f_i - f_j|^p\Bigr)

The expectation of the pair term per edge,

u(\lambda, p) \equiv \frac{1}{|E|} \sum_{\{i,j\} \in E} \sum_{\mathbf{f}} |f_i - f_j|^p \Pr\{\mathbf{F} = \mathbf{f} \mid \lambda, p\},

is computed by loopy belief propagation. Given an original image f*, the hyperparameter is estimated by maximum likelihood,

\lambda^* = \arg\max_{\lambda} \ln \Pr\{\mathbf{F} = \mathbf{f}^* \mid \lambda, p\},

which is equivalent to solving u(\lambda^*, p) = u^*, where

u^* \equiv \frac{1}{|E|} \sum_{\{i,j\} \in E} |f_i^* - f_j^*|^p.

In the region 0 < p < 0.3504…, a first-order phase transition appears and the solution λ* does not exist. (q = 16)
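The equivalence between the maximum-likelihood condition and the constraint u(λ*, p) = u* follows from standard exponential-family algebra; written out (this derivation is our own spelling-out of the step, under the parametrization given above):

\begin{align}
\ln \Pr\{\mathbf{F}=\mathbf{f}^* \mid \lambda, p\}
  &= -\frac{\lambda}{2}\sum_{\{i,j\}\in E} |f_i^* - f_j^*|^p - \ln Z(\lambda, p), \\
\frac{\partial}{\partial \lambda} \ln \Pr\{\mathbf{F}=\mathbf{f}^* \mid \lambda, p\}
  &= -\frac{1}{2}\sum_{\{i,j\}\in E} |f_i^* - f_j^*|^p
     + \frac{1}{2}\sum_{\{i,j\}\in E} \bigl\langle |f_i - f_j|^p \bigr\rangle_{\lambda, p} = 0 \\
  &\Longleftrightarrow\quad
     u(\lambda, p) = \frac{1}{|E|}\sum_{\{i,j\}\in E} \bigl\langle |f_i - f_j|^p \bigr\rangle_{\lambda, p}
     = \frac{1}{|E|}\sum_{\{i,j\}\in E} |f_i^* - f_j^*|^p = u^*.
\end{align}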
Bayesian Image Modeling by Generalized Sparse Prior: Conditional Maximization of Entropy

The prior is characterized as the distribution of maximal entropy under a constraint on the pair term:

\Pr\{\mathbf{F} = \mathbf{f} \mid p, u\} = \hat{P}(\mathbf{f}), \qquad \hat{P} = \arg\max_{P} \Bigl\{ -\sum_{\mathbf{z}} P(\mathbf{z}) \ln P(\mathbf{z}) \Bigm| \sum_{\mathbf{z}} P(\mathbf{z}) = 1, \ \sum_{\{i,j\} \in E} \sum_{\mathbf{z}} |z_i - z_j|^p P(\mathbf{z}) = |E|\, u \Bigr\}.

The solution is the Gibbs distribution

\Pr\{\mathbf{F} = \mathbf{f} \mid p, u\} = \frac{1}{Z(p, u)} \exp\Bigl(-\frac{\lambda(p, u)}{2} \sum_{\{i,j\} \in E} |f_i - f_j|^p\Bigr),

where λ(p, u) is the Lagrange multiplier associated with the constraint, and the marginals are computed by loopy belief propagation.
K. Tanaka, M. Yasuda and D. M. Titterington: J. Phys. Soc. Jpn, 81, 114802, 2012.
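The step from the constrained entropy maximization to the Gibbs form is the standard maximum-entropy argument; as a sketch (λ and μ below are the multipliers for the two constraints):

\begin{align}
\mathcal{L}[P] &= -\sum_{\mathbf{z}} P(\mathbf{z}) \ln P(\mathbf{z})
  - \frac{\lambda}{2}\Bigl(\sum_{\{i,j\}\in E}\sum_{\mathbf{z}} |z_i - z_j|^p P(\mathbf{z}) - |E|\,u\Bigr)
  - \mu\Bigl(\sum_{\mathbf{z}} P(\mathbf{z}) - 1\Bigr), \\
0 = \frac{\delta \mathcal{L}}{\delta P(\mathbf{f})} &= -\ln P(\mathbf{f}) - 1
  - \frac{\lambda}{2}\sum_{\{i,j\}\in E} |f_i - f_j|^p - \mu \\
\Longrightarrow\quad P(\mathbf{f}) &= \frac{1}{Z(p, u)} \exp\Bigl(-\frac{\lambda(p, u)}{2}\sum_{\{i,j\}\in E} |f_i - f_j|^p\Bigr),
\end{align}

with λ(p, u) fixed by the constraint and μ by the normalization.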
Prior Analysis by LBP and Conditional Maximization of Entropy in Bayesian Image Modeling

\Pr\{\mathbf{F} = \mathbf{f} \mid p, u\} = \frac{1}{Z(p, u)} \exp\Bigl(-\frac{\lambda(p, u)}{2} \sum_{\{i,j\} \in E} |f_i - f_j|^p\Bigr)

For given (p, u), the Lagrange multiplier λ and the LBP messages M are determined by the following iteration:

1. Compute the marginals P_i(f_i | λ) and P_{ij}(f_i, f_j | λ) from the messages M by loopy belief propagation, e.g.

   P_{ij}(f_i, f_j \mid \lambda) \propto \exp\Bigl(-\frac{\lambda}{2} |f_i - f_j|^p\Bigr) \prod_{\{k,i\} \in E,\ k \neq j} M_{k \to i}(f_i) \prod_{\{k,j\} \in E,\ k \neq i} M_{k \to j}(f_j).

2. Update λ so that the constraint

   \frac{1}{|E|} \sum_{\{i,j\} \in E} \sum_{f_i} \sum_{f_j} |f_i - f_j|^p P_{ij}(f_i, f_j \mid \lambda) = u, \qquad u = u^* = \frac{1}{|E|} \sum_{\{i,j\} \in E} |f_i^* - f_j^*|^p,

   is satisfied.

3. Repeat until λ (and the messages) converge.

[Figure: sample patterns drawn from the prior probability for q = 16 with p = 0.2 and q = 16 with p = 0.5.]
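Schematically, the iteration above adjusts λ until the model expectation of the pair term matches the target u. The Python sketch below is self-contained only because it replaces the LBP estimate on the full lattice with the exactly computable expectation for a single edge (two pixels); in the real algorithm the function pair_expectation would instead be supplied by the LBP marginals of the previous slides:

import numpy as np

def pair_expectation(lam, p, q):
    """Exact E[|f_i - f_j|^p] for a single isolated edge with q grey levels
    (illustrative stand-in for the LBP estimate on the whole lattice)."""
    a = np.arange(q)
    d = np.abs(a[:, None] - a[None, :]).astype(float) ** p
    w = np.exp(-0.5 * lam * d)
    return float((d * w).sum() / w.sum())

def fit_lambda(p, u_target, q, lam=1.0, lr=0.5, tol=1e-8, max_iter=2000):
    """Adjust the Lagrange multiplier lam until E[|f_i - f_j|^p] = u_target."""
    for _ in range(max_iter):
        u = pair_expectation(lam, p, q)
        if abs(u - u_target) < tol:
            break
        lam += lr * (u - u_target)   # larger lam -> smoother model -> smaller u
    return lam

q = 16
u_star = 1.5   # stands in for (1/|E|) * sum |f*_i - f*_j|**p measured on an original image
print(fit_lambda(p=0.5, u_target=u_star, q=q))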
Prior Analysis by LBP and Conditional Maximization of Entropy in Generalized Sparse Prior

Log-likelihood for p when the original image f* is given:

\ln \Pr\{\mathbf{F} = \mathbf{f}^* \mid p, u^*\} = -\frac{\lambda(p, u^*)}{2} \sum_{\{i,j\} \in E} |f_i^* - f_j^*|^p - \ln Z(p, u^*), \qquad u^* = \frac{1}{|E|} \sum_{\{i,j\} \in E} |f_i^* - f_j^*|^p.

The free energy of the prior, −ln Z(p, u*), is evaluated by loopy belief propagation, and the exponent is estimated as

\hat{p} = \arg\max_{p} \ln \Pr\{\mathbf{F} = \mathbf{f}^* \mid p, u^*\}. \qquad (q = 16)
K. Tanaka, M. Yasuda and D. M. Titterington: J. Phys. Soc. Jpn, 81, 114802, 2012.
Degradation Process in Bayesian Image Modeling
Assumption: Degraded image is generated from the original image by Additive White Gaussian Noise.
Prior, degradation process, posterior:

Pr{Original Image | Degraded Image} ∝ Pr{Degraded Image | Original Image} × Pr{Original Image}
(Posterior ∝ Degradation Process × Prior)

\Pr\{\mathbf{G} = \mathbf{g} \mid \mathbf{F} = \mathbf{f}, \sigma\} \propto \exp\Bigl(-\frac{1}{2\sigma^2} \sum_{i \in V} (g_i - f_i)^2\Bigr)
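For completeness, simulating the degradation process in Python is a one-liner (numpy; keeping g real-valued, as in the model):

import numpy as np

def degrade(f, sigma, seed=0):
    """g_i = f_i + n_i with n_i ~ N(0, sigma^2), independently for each pixel."""
    rng = np.random.default_rng(seed)
    return f.astype(float) + rng.normal(0.0, sigma, size=f.shape)

f = np.random.randint(0, 16, size=(32, 32))   # stand-in for an original image
g = degrade(f, sigma=2.0)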
Noise Reductions by Generalized Sparse Prior

Posterior probability:  Pr{Original Image | Degraded Image} ∝ Pr{Degraded Image | Original Image} × Pr{Original Image}  (Posterior ∝ Degradation Process × Prior).

The posterior is again obtained by conditional maximization of entropy:

\Pr\{\mathbf{F} = \mathbf{f} \mid \mathbf{G} = \mathbf{g}, p, u, \sigma\} = \hat{P}(\mathbf{f}), \qquad \hat{P} = \arg\max_{P} \Bigl\{ -\sum_{\mathbf{z}} P(\mathbf{z}) \ln P(\mathbf{z}) \Bigm| \sum_{\mathbf{z}} P(\mathbf{z}) = 1, \ \sum_{i \in V} \sum_{\mathbf{z}} (g_i - z_i)^2 P(\mathbf{z}) = |V|\, \sigma^2, \ \sum_{\{i,j\} \in E} \sum_{\mathbf{z}} |z_i - z_j|^p P(\mathbf{z}) = |E|\, u \Bigr\}.

The solution is

\Pr\{\mathbf{F} = \mathbf{f} \mid \mathbf{G} = \mathbf{g}, p, B, C\} = \frac{1}{Z(p, B, C \mid \mathbf{g})} \exp\Bigl(-\frac{B}{2} \sum_{i \in V} (f_i - g_i)^2 - \frac{C}{2} \sum_{\{i,j\} \in E} |f_i - f_j|^p\Bigr),

where B and C are the Lagrange multipliers associated with the two constraints.
K. Tanaka, M. Yasuda and D. M. Titterington: J. Phys. Soc. Jpn, 81, 114802, 2012.
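Putting the two constraints together, the posterior energy whose Boltzmann weight appears above can be evaluated directly; in this sketch (our own illustration, same square-lattice conventions as before) B and C are treated as given numbers, whereas in the method they are fixed by the constraints:

import numpy as np

def posterior_energy(f, g, p, B, C):
    """(B/2) * sum_i (f_i - g_i)^2 + (C/2) * sum over nearest-neighbour pairs of |f_i - f_j|**p."""
    data_term = 0.5 * B * np.sum((f.astype(float) - g) ** 2)
    dh = np.abs(f[:, 1:] - f[:, :-1]).astype(float)
    dv = np.abs(f[1:, :] - f[:-1, :]).astype(float)
    prior_term = 0.5 * C * (np.sum(dh ** p) + np.sum(dv ** p))
    return data_term + prior_term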
Noise Reduction Procedures Based on LBP and Conditional Maximization of Entropy

Input p and the data g.

1. For the current Lagrange multipliers B and C, compute the marginals P_i(f_i | p, B, C, g) and P_{ij}(f_i, f_j | p, B, C, g) and the free energy ln Z(p, B, C | g) from the LBP messages M; repeat until the messages M converge.

2. Update the Lagrange multipliers B and C so that the constraints

   \frac{1}{|V|} \sum_{i \in V} \sum_{f_i} (g_i - f_i)^2 P_i(f_i \mid p, B, C, \mathbf{g}) = \sigma^2, \qquad \frac{1}{|E|} \sum_{\{i,j\} \in E} \sum_{f_i} \sum_{f_j} |f_i - f_j|^p P_{ij}(f_i, f_j \mid p, B, C, \mathbf{g}) = u

   are satisfied; repeat Steps 1–2 until B and C converge.

3. Update the hyperparameters (u, σ²) by maximizing the marginal likelihood,

   (\hat{u}, \hat{\sigma}^2) = \arg\max_{u, \sigma^2} \Pr\{\mathbf{G} = \mathbf{g} \mid p, u, \sigma\},

   where ln Pr{G = g | p, u, σ} is evaluated from the marginals and free energies computed by LBP, together with the term −(|V|/2) ln(2πσ²) from the Gaussian degradation process; repeat Steps 1–3 until u, σ² and the multipliers converge.

Output: the restored image

\hat{f}_i = \arg\max_{f_i} P_i(f_i \mid p, \hat{B}, \hat{C}, \mathbf{g}).
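As a concrete, deliberately simplified illustration of Step 1, the sketch below runs loopy belief propagation for the posterior at fixed p, B and C on a small image and returns the estimate f̂_i = argmax P_i(f_i). The hyperparameter updates for B, C, u and σ² described above are omitted, and the code is our own illustration rather than the authors' implementation:

import numpy as np

def lbp_denoise(g, q, p, B, C, n_iter=30):
    """Loopy BP for the posterior exp(-(B/2) sum (f_i - g_i)^2 - (C/2) sum |f_i - f_j|^p)
    on a 4-neighbour square lattice; returns the argmax of the node marginals."""
    H, W = g.shape
    levels = np.arange(q, dtype=float)
    node = np.exp(-0.5 * B * (levels[None, None, :] - g[:, :, None]) ** 2)    # (H, W, q)
    pair = np.exp(-0.5 * C * np.abs(levels[:, None] - levels[None, :]) ** p)  # (q, q)

    def neighbours(i, j):
        for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            if 0 <= i + di < H and 0 <= j + dj < W:
                yield (i + di, j + dj)

    # messages[(s, t)] = M_{s->t}(f_t), initialized uniform
    messages = {((i, j), t): np.ones(q) / q
                for i in range(H) for j in range(W) for t in neighbours(i, j)}

    for _ in range(n_iter):
        new = {}
        for (s, t) in messages:
            b = node[s].copy()                    # local evidence at s
            for k in neighbours(*s):
                if k != t:
                    b *= messages[(k, s)]         # all incoming messages except from t
            m = pair.T @ b                        # sum over f_s of pair[f_s, f_t] * b[f_s]
            new[(s, t)] = m / m.sum()
        messages = new

    f_hat = np.zeros((H, W), dtype=int)
    for i in range(H):
        for j in range(W):
            b = node[i, j].copy()
            for k in neighbours(i, j):
                b *= messages[(k, (i, j))]
            f_hat[i, j] = int(np.argmax(b))       # pixel-wise maximizer of the marginal
    return f_hat

# toy usage: 16x32 ramp image with q = 8 grey levels, degraded by Gaussian noise
rng = np.random.default_rng(0)
f0 = np.tile(np.repeat(np.arange(8), 4), (16, 1))
g = f0 + rng.normal(0.0, 1.0, size=f0.shape)
f_hat = lbp_denoise(g, q=8, p=0.5, B=1.0, C=1.0)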
Noise Reductions by Generalized Sparse Priors
and Loopy Belief Propagation
[Figure: original image, degraded image, and restored images for p = 0.2, p = 0.5, and p = 1.]
K. Tanaka, M. Yasuda and D. M. Titterington: J. Phys. Soc. Jpn, 81, 114802, 2012.
Noise Reductions by Generalized Sparse Priors
and Loopy Belief Propagation
[Figure: original image, degraded image, and restored images for p = 0.2, p = 0.5, and p = 1 (second example).]
K. Tanaka, M. Yasuda and D. M. Titterington: J. Phys. Soc. Jpn, 81, 114802, 2012.
Summary

A formulation of Bayesian image modeling for image processing by means of generalized sparse priors and loopy belief propagation has been proposed. Our formulation is based on the conditional maximization of entropy with some constraints. Although first-order phase transitions often appear in our sparse priors, our algorithm works well also in such cases.
References
1. S. Kataoka, M. Yasuda, K. Tanaka and D. M. Titterington: Statistical Analysis of the Expectation-Maximization Algorithm with Loopy Belief Propagation in Bayesian Image Modeling, Philosophical Magazine: The Study of Condensed Matter, Vol.92, Nos.1-3, pp.50-63, 2012.
2. M. Yasuda and K. Tanaka: TAP Equation for Nonnegative Boltzmann Machine, Philosophical Magazine: The Study of Condensed Matter, Vol.92, Nos.1-3, pp.192-209, 2012.
3. S. Kataoka, M. Yasuda and K. Tanaka: Statistical Analysis of Gaussian Image Inpainting Problems, Journal of the Physical Society of Japan, Vol.81, No.2, Article No.025001, 2012.
4. M. Yasuda, S. Kataoka and K. Tanaka: Inverse Problem in Pairwise Markov Random Fields using Loopy Belief Propagation, Journal of the Physical Society of Japan, Vol.81, No.4, Article No.044801, pp.1-8, 2012.
5. M. Yasuda, Y. Kabashima and K. Tanaka: Replica Plefka Expansion of Ising Systems, Journal of Statistical Mechanics: Theory and Experiment, Vol.2012, No.4, Article No.P04002, pp.1-16, 2012.
6. K. Tanaka, M. Yasuda and D. M. Titterington: Bayesian Image Modeling by Means of Generalized Sparse Prior and Loopy Belief Propagation, Journal of the Physical Society of Japan, Vol.81, No.11, Article No.114802, 2012.
7. M. Yasuda and K. Tanaka: Susceptibility Propagation by Using Diagonal Consistency, Physical Review E, Vol.87, No.1, Article No.012134, 2013.