3D Geometry for Computer Graphics Class 3


Page 1: 3D Geometry for Computer Graphics, Class 3

Page 2: Last week - eigendecomposition

We want to learn how the matrix A works:

Page 3: Last week - eigendecomposition

If we look at arbitrary vectors, it doesn’t tell us much.

Page 4: Spectra and diagonalization

If A is symmetric, the eigenvectors are orthogonal (and there’s always an eigenbasis).

$$A = U \Lambda U^{T}, \qquad \Lambda = \operatorname{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n), \qquad A u_i = \lambda_i u_i ,$$

where the columns of $U$ are the eigenvectors $u_i$.
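As a quick numerical illustration (not part of the original slides), here is a minimal NumPy sketch of the decomposition; `np.linalg.eigh` is the eigensolver for symmetric matrices:

```python
import numpy as np

# A small symmetric matrix (symmetry guarantees an orthogonal eigenbasis).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# np.linalg.eigh handles symmetric (Hermitian) matrices: it returns real
# eigenvalues and orthonormal eigenvectors (the columns of U).
eigvals, U = np.linalg.eigh(A)
Lambda = np.diag(eigvals)

# Check A = U Lambda U^T ...
assert np.allclose(A, U @ Lambda @ U.T)
# ... and that each column u_i satisfies A u_i = lambda_i u_i.
for lam, u in zip(eigvals, U.T):
    assert np.allclose(A @ u, lam * u)
```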

Page 5: The plan for today

First we'll see some applications of PCA (Principal Component Analysis), which uses the spectral decomposition.

Then we'll look at the theory, and then at the assignment.

Page 6: PCA – the general idea

PCA finds an orthogonal basis that best represents a given data set: the sum of squared distances from the x' axis is minimized.

[Figure: a 2D point set with the original axes x, y and the PCA axes x', y'.]

Page 7: PCA – the general idea

PCA finds an orthogonal basis that best represents a given data set. In 3D, PCA finds the best approximating plane (again, in terms of squared distances).

[Figure: a 3D point set in the standard basis x, y, z.]


Page 9: Application: finding tight bounding box

An axis-aligned bounding box agrees with the axes.

[Figure: a 2D point set with its axis-aligned bounding box, bounded by minX, maxX, minY, maxY.]

Page 10: Application: finding tight bounding box

Oriented bounding box: we find better axes!

[Figure: an oriented bounding box aligned with the axes x', y'.]

Page 11: Application: finding tight bounding box

In 3D, the axis-aligned box is not the optimal bounding box.

[Figure: a 3D point set in the x, y, z axes with its axis-aligned bounding box.]

Page 12: Application: finding tight bounding box

Oriented bounding box: we find better axes!

Page 13: Usage of bounding boxes (bounding volumes)

Bounding boxes serve as a very simple "approximation" of the object:
- fast collision detection and visibility queries
- whenever we need to know the dimensions (size) of the object

The models consist of thousands of polygons. To quickly test that they don't intersect, their bounding boxes are tested first (see the sketch below).

Sometimes a hierarchy of BBs is used.

The tighter the BB, the fewer "false alarms" we get.
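To make the rejection test concrete, here is a minimal sketch of an AABB overlap check in Python/NumPy; the function names and data layout are illustrative, not from the slides:

```python
import numpy as np

def aabb(points):
    """Axis-aligned bounding box of a point set: (min corner, max corner)."""
    pts = np.asarray(points)
    return pts.min(axis=0), pts.max(axis=0)

def aabbs_intersect(box_a, box_b):
    """Two AABBs overlap iff their intervals overlap on every axis."""
    (min_a, max_a), (min_b, max_b) = box_a, box_b
    return bool(np.all(min_a <= max_b) and np.all(min_b <= max_a))

# If the boxes don't intersect, the enclosed meshes can't intersect either,
# so the expensive polygon-level test is skipped.
a = aabb(np.random.rand(100, 3))
b = aabb(np.random.rand(100, 3) + 5.0)   # translated far away
print(aabbs_intersect(a, b))             # False: cheap early-out
```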

Page 14: Scanned meshes

Page 15: Point clouds

Scanners give us raw point-cloud data. How do we compute normals to shade the surface?

[Figure: a point cloud with the normal at one surface point.]

Page 16: Point clouds

Local PCA: run PCA on the points in a small neighborhood and take the third eigenvector (the direction of least variance) as the normal.
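A minimal sketch of this normal-estimation idea in NumPy, assuming the neighboring points of the query point have already been gathered (the helper name is my own):

```python
import numpy as np

def estimate_normal(neighborhood):
    """Estimate the surface normal at a point from nearby samples.

    Local PCA: the eigenvector of the scatter matrix with the smallest
    eigenvalue (the "third vector" in 3D) is the direction of least
    variance, i.e. the normal of the best-fitting plane.
    """
    pts = np.asarray(neighborhood)
    m = pts.mean(axis=0)                 # center of mass of the patch
    Y = (pts - m).T                      # 3 x k matrix of centered points
    S = Y @ Y.T                          # local scatter matrix
    _, V = np.linalg.eigh(S)             # eigh sorts eigenvalues ascending
    return V[:, 0]                       # eigenvector of the smallest eigenvalue

# Noisy samples near the plane z = 0: the estimate should be close to (0, 0, ±1).
patch = np.random.rand(50, 3) * [1.0, 1.0, 0.01]
print(estimate_normal(patch))
```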

Page 17: Notations

Denote our data points by $x_1, x_2, \ldots, x_n \in \mathbb{R}^d$.

$$
X = \begin{pmatrix}
x_1^1 & x_2^1 & \cdots & x_n^1 \\
x_1^2 & x_2^2 & \cdots & x_n^2 \\
\vdots & \vdots & & \vdots \\
x_1^d & x_2^d & \cdots & x_n^d
\end{pmatrix}
= \begin{pmatrix}
| & | & & | \\
x_1 & x_2 & \cdots & x_n \\
| & | & & |
\end{pmatrix}
$$

Page 18: The origin of the new axes

The origin is the zero-order approximation of our data set (a point). It will be the center of mass:

$$m = \frac{1}{n} \sum_{i=1}^{n} x_i$$

It can be shown that the center of mass minimizes the sum of squared distances:

$$m = \operatorname*{argmin}_{x \in \mathbb{R}^d} \sum_{i=1}^{n} \| x_i - x \|^2$$
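A quick numerical sanity check of the argmin claim (a sketch with random data, not from the slides): perturbing the mean in any direction never decreases the sum of squared distances.

```python
import numpy as np

X = np.random.rand(100, 3)               # n = 100 points in R^3, one per row
m = X.mean(axis=0)                       # center of mass

def sq_dist_sum(p):
    """Sum of squared distances from all points to the candidate p."""
    return np.sum((X - p) ** 2)

# No perturbed candidate beats the mean.
for _ in range(1000):
    p = m + 0.1 * np.random.randn(3)
    assert sq_dist_sum(m) <= sq_dist_sum(p)
```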

Page 19: Scatter matrix

Denote $y_i = x_i - m$, $i = 1, 2, \ldots, n$.

The scatter matrix is

$$S = Y Y^{T},$$

where $Y$ is the $d \times n$ matrix with $y_k$ as columns ($k = 1, 2, \ldots, n$). Entrywise,

$$S_{jk} = \sum_{i=1}^{n} y_i^{j}\, y_i^{k} .$$
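A direct transcription into NumPy (a sketch; `X` holds the points as columns, matching the slides' convention):

```python
import numpy as np

d, n = 3, 100
X = np.random.rand(d, n)                 # data points as columns, as in the slides

m = X.mean(axis=1, keepdims=True)        # center of mass, a d x 1 column
Y = X - m                                # centered points y_i = x_i - m
S = Y @ Y.T                              # scatter matrix, d x d

# S is symmetric, and S_jk = sum_i y_i^j y_i^k:
assert np.allclose(S, S.T)
```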

Page 20: Variance of projected points

In a way, S measures the variance (= "scatterness") of the data in different directions.

Let's look at a line L through the center of mass m, and project our points x_i onto it. The variance of the projected points x'_i is:

$$\operatorname{var}(L) = \frac{1}{n} \sum_{i=1}^{n} \| x_i' - m \|^2$$

[Figure: three panels, the original set, a line L with small variance, and a line L with large variance.]

Page 21: Variance of projected points

Given a direction $v$, $\|v\| = 1$, the projection of $x_i$ onto the line $L = m + vt$ gives:

$$\| x_i' - m \| = \langle x_i - m, \, v \rangle = \langle y_i, v \rangle = v^{T} y_i$$

[Figure: the point x_i, its projection x'_i onto the line L, and the unit direction v at m.]

Page 22: Variance of projected points

So,

$$
\operatorname{var}(L) = \frac{1}{n} \sum_{i=1}^{n} \| x_i' - m \|^2
= \frac{1}{n} \sum_{i=1}^{n} \left( v^{T} y_i \right)^2
= \frac{1}{n} \| v^{T} Y \|^2
= \frac{1}{n} \left( v^{T} Y \right) \left( v^{T} Y \right)^{T}
= \frac{1}{n}\, v^{T} Y Y^{T} v
= \frac{1}{n}\, v^{T} S v
= \frac{1}{n} \langle S v, v \rangle ,
$$

where $v^{T} Y = \left( v^{T} y_1, \; v^{T} y_2, \; \ldots, \; v^{T} y_n \right)$ is the row vector of projected coordinates.
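This chain of equalities is easy to verify numerically; a minimal sketch, reusing the column-wise `Y` and `S` conventions from the earlier slides:

```python
import numpy as np

d, n = 3, 200
X = np.random.rand(d, n)
Y = X - X.mean(axis=1, keepdims=True)    # centered columns y_i = x_i - m
S = Y @ Y.T                              # scatter matrix

v = np.random.randn(d)
v /= np.linalg.norm(v)                   # a unit direction

proj = v @ Y                             # row vector (v^T y_1, ..., v^T y_n)
var_direct = np.sum(proj ** 2) / n       # (1/n) sum_i (v^T y_i)^2
var_quadratic = (v @ S @ v) / n          # (1/n) <Sv, v>
assert np.isclose(var_direct, var_quadratic)
```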

Page 23: Directions of maximal variance

So we have var(L) = (1/n)⟨Sv, v⟩. The constant factor 1/n does not change where the extrema are, so we study f(v) = ⟨Sv, v⟩.

Theorem:

Let $f : \{ v \in \mathbb{R}^d \mid \|v\| = 1 \} \to \mathbb{R}$, $f(v) = \langle Sv, v \rangle$, where $S$ is a symmetric matrix.

Then the extrema of $f$ are attained at the eigenvectors of $S$.

So, eigenvectors of S are directions of maximal/minimal variance!

Page 24: Summary so far

- We take the centered data vectors $y_1, y_2, \ldots, y_n \in \mathbb{R}^d$.
- We construct the scatter matrix $S = Y Y^{T}$; it measures the variance of the data points.
- Eigenvectors of $S$ are directions of maximal variance.
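Putting the summary together as one routine (a sketch; the function name and return convention are mine):

```python
import numpy as np

def pca(X):
    """PCA of a d x n matrix whose columns are the data points.

    Returns the mean point, the eigenvalues in descending order, and
    the matrix V whose columns are the matching eigenvectors.
    """
    m = X.mean(axis=1, keepdims=True)    # center of mass
    Y = X - m                            # centered data vectors y_i
    S = Y @ Y.T                          # scatter matrix
    eigvals, V = np.linalg.eigh(S)       # ascending order for symmetric S
    order = np.argsort(eigvals)[::-1]    # re-sort: largest variance first
    return m, eigvals[order], V[:, order]

m, lam, V = pca(np.random.rand(3, 500))
print(lam)                               # lam[0] >= lam[1] >= lam[2] >= 0
```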

Page 25: Scatter matrix - eigendecomposition

S is symmetric

S has an eigendecomposition $S = V \Lambda V^{T}$:

$$
S = \begin{pmatrix}
| & | & & | \\
v_1 & v_2 & \cdots & v_d \\
| & | & & |
\end{pmatrix}
\begin{pmatrix}
\lambda_1 & & & \\
& \lambda_2 & & \\
& & \ddots & \\
& & & \lambda_d
\end{pmatrix}
\begin{pmatrix}
\text{---} & v_1^{T} & \text{---} \\
\text{---} & v_2^{T} & \text{---} \\
& \vdots & \\
\text{---} & v_d^{T} & \text{---}
\end{pmatrix}
$$

The eigenvectors form an orthogonal basis.

Page 26: Principal components

Eigenvectors that correspond to big eigenvalues are the directions in which the data has strong components (= large variance).

If the eigenvalues are more or less the same – there is no preferable direction.

Note: the eigenvalues are always non-negative. Think why…

Page 27: Principal components

If there's no preferable direction, S looks like

$$S = V \begin{pmatrix} \lambda & \\ & \lambda \end{pmatrix} V^{T},$$

and any vector is an eigenvector.

If there is a clear preferable direction, S looks like

$$S = V \begin{pmatrix} \lambda_1 & \\ & \lambda_2 \end{pmatrix} V^{T},$$

where $\lambda_2$ is close to zero, much smaller than $\lambda_1$.

Page 28: How to use what we got

For finding the oriented bounding box, we simply compute the bounding box with respect to the axes defined by the eigenvectors. The origin is at the mean point m.

[Figure: a 3D oriented bounding box with axes v1, v2, v3 centered at m.]
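A sketch of that recipe in NumPy (illustrative, not the assignment's required interface): express the centered points in the eigenvector frame and take per-axis min/max there.

```python
import numpy as np

def oriented_bounding_box(X):
    """Oriented bounding box of a d x n point set (points as columns)."""
    m = X.mean(axis=1, keepdims=True)    # origin: the mean point m
    Y = X - m                            # centered data
    _, V = np.linalg.eigh(Y @ Y.T)       # eigenvectors of the scatter matrix
    coords = V.T @ Y                     # coordinates in the eigenvector frame
    lo, hi = coords.min(axis=1), coords.max(axis=1)
    return m, V, lo, hi                  # box = { m + V @ c : lo <= c <= hi }

m, V, lo, hi = oriented_bounding_box(np.random.rand(3, 500))
print(hi - lo)                           # side lengths of the oriented box
```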

Page 29: For approximation

[Figure: a 2D point set with axes x, y and principal directions v1, v2; the points are projected onto v1.]

The projected data set approximates the original data set; the resulting line segment approximates the original data set.

Page 30: For approximation

In general dimension d, the eigenvalues are sorted in descending order:

$$\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_d ,$$

and the eigenvectors are sorted accordingly. To get an approximation of dimension d' < d, we take the first d' eigenvectors and look at the subspace they span (d' = 1 is a line, d' = 2 is a plane, and so on).

Page 31: For approximation

To get an approximating set, we project the original data points onto the chosen subspace:

$$x_i = m + \alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_{d'} v_{d'} + \cdots + \alpha_d v_d , \qquad \alpha_j = \langle x_i - m, \, v_j \rangle$$

Projection:

$$x_i' = m + \alpha_1 v_1 + \alpha_2 v_2 + \cdots + \alpha_{d'} v_{d'} + 0 \cdot v_{d'+1} + \cdots + 0 \cdot v_d$$
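A sketch of this projection in NumPy, assuming the eigenvector matrix `V` is already sorted by descending eigenvalue as on the previous slide (the helper name is mine):

```python
import numpy as np

def project_to_subspace(X, V, d_prime):
    """Project points (columns of X) onto the span of the first d' eigenvectors.

    V holds the eigenvectors as columns, sorted by descending eigenvalue;
    coefficients alpha_j = <x_i - m, v_j> beyond d' are zeroed out.
    """
    m = X.mean(axis=1, keepdims=True)
    Vk = V[:, :d_prime]                  # keep only the first d' directions
    alpha = Vk.T @ (X - m)               # alpha_j = <x_i - m, v_j>
    return m + Vk @ alpha                # x'_i = m + sum_j alpha_j v_j

# d' = 2: project a 3D point set onto its best-fitting plane.
X = np.random.rand(3, 200)
Y = X - X.mean(axis=1, keepdims=True)
_, V = np.linalg.eigh(Y @ Y.T)           # eigh returns ascending eigenvalues
X_approx = project_to_subspace(X, V[:, ::-1], 2)
```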

Page 32: Good luck with Ex1

Page 33: Technical remarks

$\lambda_i \ge 0$ for $i = 1, \ldots, d$ (such matrices are called positive semi-definite), so we can indeed sort the eigenvalues by the magnitude of $\lambda_i$.

Theorem: $\lambda_i \ge 0 \;\Leftrightarrow\; \langle Sv, v \rangle \ge 0 \;\; \forall v$.

Proof:

$$
\langle Sv, v \rangle = \langle V \Lambda V^{T} v, \, v \rangle = v^{T} V \Lambda V^{T} v
= \left( V^{T} v \right)^{T} \Lambda \left( V^{T} v \right) = \langle \Lambda u, u \rangle , \qquad u = V^{T} v .
$$

Writing $u = (u_1, u_2, \ldots, u_d)^{T}$,

$$\langle \Lambda u, u \rangle = \lambda_1 u_1^2 + \lambda_2 u_2^2 + \cdots + \lambda_d u_d^2 ,$$

which is non-negative for every $u$ exactly when all $\lambda_i \ge 0$ (and $u$ ranges over all of $\mathbb{R}^d$ as $v$ does, since $V$ is orthogonal). Therefore, $\lambda_i \ge 0 \;\Leftrightarrow\; \langle Sv, v \rangle \ge 0 \;\; \forall v$.

In our case $S = Y Y^{T}$, so $\langle Sv, v \rangle = \| Y^{T} v \|^2 \ge 0$, and the eigenvalues of the scatter matrix are indeed non-negative.
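A one-line numerical confirmation of the remark (a sketch): the eigenvalues of any $Y Y^{T}$ come out non-negative up to floating-point round-off.

```python
import numpy as np

Y = np.random.randn(3, 100)              # any data matrix
eigvals = np.linalg.eigvalsh(Y @ Y.T)    # eigenvalues of the symmetric S = Y Y^T
print(eigvals)                           # all >= 0 up to round-off: S is PSD
assert np.all(eigvals > -1e-9)
```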