Mutual Information Algorithm Applied to Rigid Registration

Mutual Information Algorithm applied to Rigid Registration
Vibha Chaswal, Ph.D.


Post on 27-Jan-2015


TRANSCRIPT

Page 1: Mutual Information Algorithm applied to rigid registration

Mutual Information Algorithm applied to Rigid Registration

Vibha Chaswal, Ph.D.

Page 2

Registration

• Image registration – define geometric transformations T that map coordinates of one image onto another such that some image-quality criterion is maximized

– Also referred to as image fusion, superimposition, matching, or merging

Page 3

Registration algorithms

• Used to find the transformation

• Rigid & affine
  – Landmark based
  – Edge based
  – Voxel intensity based
  – Information theory based

• Non-rigid
  – Registration using basis functions
  – Registration using splines
  – Physics based
    • Elastic, fluid, optical flow, etc.

Page 4

Rigid Body Registration of medical images

• Assumes the anatomical and pathological structures do not deform during image acquisition

• Tissue deformations are ignored and the images are registered using rigid-body transforms

• Only rotations and translations
• 6 degrees of freedom: 3 translations and 3 rotations

• Key characteristic: all distances are preserved

Page 5

3D Rigid-body Transformations

• A 3D rigid-body transform is defined by:
  – 3 translations, in the X, Y & Z directions
  – 3 rotations, about the X, Y & Z axes

• The order of the operations matters

[Figure: translations; pitch about the x axis, roll about the y axis, yaw about the z axis]
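As an illustrative sketch (not from the slides), the six degrees of freedom can be assembled into a 4x4 homogeneous matrix with NumPy; the function and parameter names here are hypothetical, and the chosen rotation order (x, then y, then z) is one convention among several:

```python
import numpy as np

def rot_x(theta):
    """Pitch: rotation about the x axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(theta):
    """Roll: rotation about the y axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(theta):
    """Yaw: rotation about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rigid_transform(tx, ty, tz, ax, ay, az):
    """4x4 homogeneous rigid-body transform: rotate about x, then y, then z, then translate."""
    T = np.eye(4)
    T[:3, :3] = rot_z(az) @ rot_y(ay) @ rot_x(ax)
    T[:3, 3] = [tx, ty, tz]
    return T

# The order of the operations matters: rotations about different axes do not commute.
A = rot_z(0.3) @ rot_x(0.2)
B = rot_x(0.2) @ rot_z(0.3)
print(np.allclose(A, B))  # False
```

Because the 3x3 block is orthonormal, such a transform preserves all distances, which is the key characteristic noted on the previous slide.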

Page 6

Information theory based Rigid-body Registration

• Image registration is viewed as maximizing the amount of shared information in the two images, i.e. reducing the amount of information in the combined image

• Algorithms used
  – Joint entropy
    • Joint entropy measures the amount of information in the two images combined
  – Mutual information
    • A measure of how well one image explains the other; it is maximized at the optimal alignment
  – Normalized Mutual Information

Page 7

Measures of Information

• Hartley defined the first information measure:
  – H = n log s
  – n is the length of the message and s is the number of possible values for each symbol in the message
  – Assumes all symbols are equally likely to occur

• Shannon proposed a variant (Shannon's entropy): H = -Σ_i p_i log p_i

• It weighs the information based on the probability that an outcome will occur

• The second term (log 1/p_i) shows that the amount of information an event provides is inversely proportional to its probability of occurring
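Shannon's entropy for an image can be sketched in a few lines of NumPy (an illustration, not part of the slides): estimate the grey-level probabilities from the histogram, then apply H = -Σ_i p_i log p_i.

```python
import numpy as np

def shannon_entropy(image, bins=256):
    """Shannon entropy of an image's grey-level histogram, in bits."""
    hist, _ = np.histogram(image, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                       # by convention, 0 * log(0) contributes nothing
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
flat = np.full((64, 64), 100)              # single amplitude: one certain outcome
noisy = rng.integers(0, 256, (64, 64))     # many equally probable greyscales
print(shannon_entropy(flat))               # 0.0 bits: no uncertainty
print(shannon_entropy(noisy))              # close to 8 bits: high dispersion
```

This matches the three interpretations on the next slide: the single-amplitude image has a fully concentrated histogram and zero entropy, while the many-greyscale image has a dispersed histogram and high entropy.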

Page 8

Three Interpretations of Entropy

• The amount of information an event provides
  – An infrequently occurring event provides more information than a frequently occurring event

• The uncertainty in the outcome of an event
  – Systems with one very common event have less entropy than systems with many equally probable events

• The dispersion in the probability distribution
  – An image of a single amplitude has a less disperse histogram than an image of many greyscales
    • The lower dispersion implies lower entropy

Page 9

Joint Entropy for Image Registration

• Define a joint probability distribution:
  – Generate a 2-D histogram where each axis is the number of possible greyscale values in each image
  – Each histogram cell is incremented each time a pair (I_1(x,y), I_2(x,y)) occurs in the pair of images
    • If the images are perfectly aligned, the histogram is highly focused; as the images misalign, the dispersion grows

• Recall that entropy is a measure of histogram dispersion
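A minimal sketch of this construction (illustrative, not from the slides): build the 2-D histogram with NumPy's `histogram2d` and compute the entropy of the resulting joint distribution. Aligned images give a focused histogram and low joint entropy; misaligned images give a dispersed one.

```python
import numpy as np

def joint_entropy(img1, img2, bins=32):
    """Joint entropy H(A,B) of two images from their 2-D grey-level histogram, in bits."""
    hist, _, _ = np.histogram2d(img1.ravel(), img2.ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
img = rng.integers(0, 256, (64, 64)).astype(float)
print(joint_entropy(img, img))               # aligned: focused histogram, low H(A,B)
print(joint_entropy(img, np.roll(img, 5)))   # misaligned: dispersed histogram, higher H(A,B)
```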

Page 10

Entropy for Image Registration

• Using joint entropy for registration
  – Define the joint entropy to be: H(A,B) = -Σ_{a,b} p(a,b) log p(a,b)
  – Images are registered when one is transformed relative to the other so as to minimize the joint entropy
  – The dispersion in the joint histogram is thus minimized
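To make the minimization concrete, here is a toy NumPy sketch (not from the slides): a moving image with a known translation is registered to a fixed image by exhaustively searching for the 1-D shift that minimizes the joint entropy. Real registration would optimize over all six rigid-body parameters with interpolation rather than a circular shift.

```python
import numpy as np

def joint_entropy(a, b, bins=32):
    """Joint entropy H(A,B) from the 2-D grey-level histogram, in bits."""
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = h[h > 0] / h.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(2)
fixed = rng.integers(0, 256, (64, 64)).astype(float)
moving = np.roll(fixed, 3, axis=1)          # fixed image shifted 3 pixels to the right

# Exhaustive 1-D search: the shift that undoes the misalignment minimizes H(A,B).
shifts = range(-8, 9)
best = min(shifts, key=lambda s: joint_entropy(fixed, np.roll(moving, s, axis=1)))
print(best)  # -3, recovering the alignment
```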

Page 11

Joint entropy: overlap problem

• Joint entropy is very sensitive to the mapping of position and intensity

• The joint histogram "blurs" with increasing misregistration

• May lead to an incorrect solution

[Figure: joint histograms for MR/MR, MR/CT and MR/PET image pairs, aligned and at 2 mm and 5 mm misregistration. From Hill et al., Voxel Similarity Measures for Automated Image Registration, Proc. SPIE 2359, 1994]

Page 12

Solution: Mutual Information

A solution to the overlap problem from which joint entropy suffers is to consider the information contributed to the overlapping volume by each image being registered, together with the joint information. The information contributed by the individual images is simply the entropy of the portion of the image that overlaps with the other image volume.

Page 13

Definitions of Mutual Information

• Three commonly used definitions:

– 1) MI(A,B) = H(B) - H(B|A) = H(A) - H(A|B)
  • Mutual information is the amount by which the uncertainty in B (or A) is reduced when A (or B) is known

– 2) MI(A,B) = H(A) + H(B) - H(A,B)
  • Maximizing the mutual information is equivalent to minimizing the joint entropy (the last term)
  • The advantage of mutual information over joint entropy is that it includes the entropies of the individual inputs
  • It works better than joint entropy alone in regions of image background (low contrast): these produce low joint entropy, but this is offset by low individual entropies as well, so the overall mutual information is low
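Definition 2 translates directly into code. Here is an illustrative NumPy sketch (not from the slides) that computes both marginal entropies and the joint entropy from a single 2-D histogram:

```python
import numpy as np

def entropy(hist):
    """Entropy in bits of a (possibly multi-dimensional) histogram of counts."""
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(a, b, bins=32):
    """Definition 2: MI(A,B) = H(A) + H(B) - H(A,B)."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    h_a = entropy(joint.sum(axis=1))   # marginal histogram of image A
    h_b = entropy(joint.sum(axis=0))   # marginal histogram of image B
    return h_a + h_b - entropy(joint)

rng = np.random.default_rng(3)
img = rng.integers(0, 256, (64, 64)).astype(float)
print(mutual_information(img, img))                      # aligned: high MI
print(mutual_information(img, np.roll(img, 5, axis=0)))  # misaligned: lower MI
```

Unlike joint entropy, which is minimized, MI is maximized at the optimal alignment, since the marginal terms penalize transformations that merely shrink the overlap.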

Page 14

Definitions of Mutual Information II

– 3) I(A,B) = Σ_{a,b} p(a,b) log [ p(a,b) / (p(a) p(b)) ]

• This definition is related to the Kullback-Leibler distance between two distributions

• It measures the dependence of the two distributions

• In image registration, I(A,B) will be maximized when the images are aligned

• In feature selection, choose the features that minimize I(A,B) to ensure they are not related
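Definition 3 can be evaluated directly as the double sum over the joint histogram. This NumPy sketch is illustrative (not from the slides); it gives the same value as the H(A) + H(B) - H(A,B) form, since the two definitions are algebraically equivalent:

```python
import numpy as np

def mi_kl(a, b, bins=32):
    """Definition 3: I(A,B) = sum over (a,b) of p(a,b) * log2[ p(a,b) / (p(a) p(b)) ]."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pab = joint / joint.sum()
    pa = pab.sum(axis=1, keepdims=True)    # marginal p(a), as a column vector
    pb = pab.sum(axis=0, keepdims=True)    # marginal p(b), as a row vector
    nz = pab > 0                           # where p(a,b) > 0, p(a) and p(b) are > 0 too
    return np.sum(pab[nz] * np.log2(pab[nz] / (pa * pb)[nz]))

rng = np.random.default_rng(4)
img = rng.integers(0, 256, (64, 64)).astype(float)
print(mi_kl(img, img) > mi_kl(img, np.roll(img, 7)))  # alignment maximizes I(A,B)
```

Read as a Kullback-Leibler distance, I(A,B) measures how far the joint distribution p(a,b) is from the independent product p(a)p(b), so it is zero exactly when the two images carry no information about each other.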

Page 15

MI algorithm

Page 16

Registration using MI Maximization

[Figure: Rigid Registration, Affine Registration, Non-rigid Registration]