Decomposing feature-level variation with Covariate Gaussian Process Latent Variable Models
Kaspar Märtens, Kieran Campbell, Christopher Yau


Post on 28-Jul-2020


TRANSCRIPT

Page 1:

Decomposing feature-level variation with Covariate Gaussian Process Latent Variable Models
Kaspar Märtens, Kieran Campbell, Christopher Yau
@kasparmartens · kasparmartens.rbind.io · Poster #261

Page 2:

Motivation: disease progression modelling

Page 5:

GPLVM maps latent to observed data using GP mappings

Covariate-GPLVM

z_i ∼ N(0, 1)
y_i^(j) = f^(j)(z_i) + ε_ij

Page 6:

GPLVM maps latent to observed data using GP mappings

Covariate-GPLVM extends GPLVM by:

1. Incorporating covariates

Covariate-GPLVM

z_i ∼ N(0, 1)
y_i^(j) = f^(j)(z_i) + ε_ij

With covariates x:
y_i^(j) = f^(j)(x_i, z_i) + ε_ij
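One way to place a GP prior over the joint mapping f^(j)(x_i, z_i) is a product kernel over the two inputs. The poster does not state which kernel is used, so both the product form and the binary covariate below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 3  # illustrative sizes

z = rng.standard_normal(n)                       # latent z_i ~ N(0, 1)
x = rng.binomial(1, 0.5, size=n).astype(float)   # e.g. a binary covariate

def rbf(a, b, lengthscale=1.0):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

# Product of two PSD kernels is PSD (Schur product theorem),
# so f^(j) can vary jointly with both x and z
K = rbf(z, z) * rbf(x, x) + 1e-6 * np.eye(n)
L = np.linalg.cholesky(K)
F = L @ rng.standard_normal((n, p))
Y = F + 0.1 * rng.standard_normal((n, p))
```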

Page 7:

GPLVM maps latent to observed data using GP mappings

Covariate-GPLVM extends GPLVM by:

1. Incorporating covariates

2. Providing a feature-level decomposition

Covariate-GPLVM

z_i ∼ N(0, 1)
y_i^(j) = f^(j)(z_i) + ε_ij

With covariates x:
y_i^(j) = f^(j)(x_i, z_i) + ε_ij

Decomposed:
y_i^(j) = μ^(j) + f_z^(j)(z_i) + f_x^(j)(x_i) + f_zx^(j)(z_i, x_i) + ε_ij

Page 8:

Feature-level decomposition

Readily available for linear models, otherwise challenging:

y_i^(j) = μ^(j) + f_z^(j)(z_i) + f_x^(j)(x_i) + f_zx^(j)(z_i, x_i) + ε_ij

Page 9:

Feature-level decomposition

Readily available for linear models, otherwise challenging:

Naive decompositions (with standard GP priors) can lead to misleading conclusions

With appropriate functional constraints we learn an identifiable non-linear decomposition

y_i^(j) = μ^(j) + f_z^(j)(z_i) + f_x^(j)(x_i) + f_zx^(j)(z_i, x_i) + ε_ij
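The identifiability issue is the same one that functional-ANOVA decompositions face: without constraints, mass can be shifted freely between μ, f_z, f_x, and f_zx. Forcing each component to average to zero over its arguments pins the split down uniquely. A grid-based numerical sketch of that idea (illustrative only; the example function and grid are mine, and the paper enforces analogous constraints through the GP prior rather than by post-hoc averaging):

```python
import numpy as np

# Evaluate an example f(z, x) on a grid
zg = np.linspace(-2, 2, 41)
xg = np.linspace(-2, 2, 41)
Z, X = np.meshgrid(zg, xg, indexing="ij")
f = np.sin(Z) + 0.5 * X + 0.3 * Z * X  # illustrative function

mu = f.mean()                                # grand mean μ
f_z = f.mean(axis=1) - mu                    # main effect of z, averages to zero
f_x = f.mean(axis=0) - mu                    # main effect of x, averages to zero
f_zx = f - mu - f_z[:, None] - f_x[None, :]  # interaction remainder

# Zero-mean constraints make this the unique such decomposition
assert abs(f_z.mean()) < 1e-10 and abs(f_x.mean()) < 1e-10
```

A naive decomposition without these constraints could, for instance, absorb all of f_x into f_zx, which is the kind of misleading conclusion the poster warns about.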

Page 10:

Decomposing feature-level variation with Covariate Gaussian Process Latent Variable Models
@kasparmartens
kasparmartens.rbind.io

Page 11:

Poster #261