Chapter 7: Generating and Processing Random Signals


Page 1: Chapter 7 Generating and Processing Random Signals

1

Chapter 7: Generating and Processing Random Signals

Group 1: EE Year 4, B93902016, 蔡馭理; CS Year 4, B93902076, 林宜鴻

Page 2: Chapter 7 Generating and Processing Random Signals

2

Outline

- Stationary and Ergodic Processes
- Uniform Random Number Generators
- Mapping Uniform RVs to an Arbitrary pdf
- Generating Uncorrelated Gaussian RVs
- Generating Correlated Gaussian RVs
- PN Sequence Generators
- Signal Processing

Page 3: Chapter 7 Generating and Processing Random Signals

3

Random Number Generator

Noise, interference

Random Number Generator: a computational or physical device designed to generate a sequence of numbers or symbols that lacks any pattern, i.e., appears random (a pseudo-random sequence).

MATLAB: rand(m,n), randn(m,n)

Page 4: Chapter 7 Generating and Processing Random Signals

4

Stationary and Ergodic Process

- Strict-sense stationary (SSS)
- Wide-sense stationary (WSS)

SSS => WSS; WSS does not imply SSS in general, but for a Gaussian process WSS => SSS.

Time average vs. ensemble average: the ergodicity requirement is that the ensemble average coincide with the time average. Sample functions generated to represent signals, noise, and interference should be ergodic.

Page 5: Chapter 7 Generating and Processing Random Signals

5

Time average vs. ensemble average

Time average: <x> = lim_{T→∞} (1/T) ∫_0^T x(t) dt

Ensemble average: E{X} = ∫ x f_X(x) dx

Page 6: Chapter 7 Generating and Processing Random Signals

6

Example 7.1 (N=100)

[Figure: six panels on 0 ≤ t ≤ 2 comparing sample functions with ensemble averages over N = 100 realizations: x(t) and its ensemble average, y(t) and its ensemble average, z(t) and its ensemble average.]

x(t, ξ_i) = A(1 + μ_i) cos(2πft + φ_i)

x(t, ξ_i) = A cos(2πft + φ_i)

Page 7: Chapter 7 Generating and Processing Random Signals

7

Uniform Random Number Generator

Goal: generate a random variable that is uniformly distributed on the interval (0,1).

Generate a sequence of integers between 0 and M, then divide each element of the sequence by M.

The most common technique is the linear congruential generator (LCG).

Page 8: Chapter 7 Generating and Processing Random Signals

8

Linear Congruence

The LCG is defined by the operation:

x_{i+1} = [a x_i + c] mod(m)

x_0 is the seed of the generator.

a, c, m, and x_0 are integers.

Desirable property: a full period.

Page 9: Chapter 7 Generating and Processing Random Signals

9

Technique A: The Mixed Congruence Algorithm

The mixed linear congruential algorithm takes the form:

x_{i+1} = [a x_i + c] mod(m)

The generator has a full period if and only if:

- c ≠ 0 and c is relatively prime to m
- a - 1 is a multiple of every prime factor p of m
- a - 1 is a multiple of 4 if m is a multiple of 4

Page 10: Chapter 7 Generating and Processing Random Signals

10

Example 7.4

m = 5000 = (2^3)(5^4); c = (3^3)(7^2) = 1323.

a - 1 must be a multiple of 2 and of 5 (the prime factors of m), and a multiple of 4 since m is a multiple of 4: a - 1 = k_1·2 or k_2·5 or 4·k_3, so a - 1 = 4·2·5·k = 40k. With k = 6, we have a = 241:

x_{i+1} = [241 x_i + 1323] mod(5000)

We can verify that the period is 5000, so the generator has a full period.
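A minimal MATLAB sketch of this generator (the seed value is an arbitrary choice); the last line checks the full-period claim by counting distinct states:

    % Mixed LCG of Example 7.4: x_{i+1} = [241 x_i + 1323] mod(5000)
    a = 241; c = 1323; m = 5000;
    x = zeros(1, m); x(1) = 1;               % any seed works for a full-period LCG
    for i = 1:m-1
        x(i+1) = mod(a*x(i) + c, m);
    end
    u = x/m;                                 % approximately uniform on (0,1)
    isFullPeriod = (length(unique(x)) == m)  % true: all 5000 states are visited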

Page 11: Chapter 7 Generating and Processing Random Signals

11

Technique B: The Multiplication Algorithm With Prime Modulus

The multiplicative generator is defined as:

x_{i+1} = [a x_i] mod(m)

- m is prime (usually large)
- a is a primitive element mod(m):

(a^{m-1} - 1)/m = k = integer

(a^i - 1)/m ≠ integer, i = 1, 2, 3, ..., m-2

Page 12: Chapter 7 Generating and Processing Random Signals

12

Technique C: The Multiplication Algorithm With Nonprime Modulus

The most important case of this generator has m equal to a power of two:

x_{i+1} = [a x_i] mod(2^n)

The maximum period is 2^n / 4 = 2^{n-2}.

The maximum period is achieved if:

- the multiplier a is 3 or 5
- the seed x_0 is odd

Page 13: Chapter 7 Generating and Processing Random Signals

13

Example of Multiplication Algorithm With Nonprime Modulus

a=3

c=0

m=16

x0=1

[Figure: the generated sequence plotted against the sample index; with a = 3, c = 0, m = 16, and x_0 = 1 the output 1, 3, 9, 11, 1, ... repeats with period 4 = 2^{4-2}.]

Page 14: Chapter 7 Generating and Processing Random Signals

14

Testing Random Number Generators

Chi-square test, spectral test, etc.: tests of the randomness of a given sequence.

Scatterplots: a plot of x_{i+1} as a function of x_i.

Durbin-Watson Test:

D = [(1/N) Σ_{n=2}^{N} (X[n] - X[n-1])^2] / [(1/N) Σ_{n=2}^{N} X^2[n]]

Page 15: Chapter 7 Generating and Processing Random Signals

15

Scatterplots: Example 7.5

[Figure: three scatterplots of x_{i+1} versus x_i on the unit square, one for each of the generators below.]

(i) rand(1,2048)

(ii) x_{i+1} = [65 x_i + 1] mod(2048)

(iii) x_{i+1} = [1229 x_i + 1] mod(2048)
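A hedged MATLAB sketch of scatterplot (ii); the generator constants come from the caption above, and the seed is an arbitrary choice:

    % Scatterplot test: plot each normalized sample against its successor.
    a = 65; c = 1; m = 2048; N = 2048;
    x = zeros(1, N); x(1) = 1;            % arbitrary seed
    for i = 1:N-1
        x(i+1) = mod(a*x(i) + c, m);
    end
    u = x/m;
    plot(u(1:end-1), u(2:end), '.')       % visible structure indicates correlation
    xlabel('x_i'), ylabel('x_{i+1}')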

Page 16: Chapter 7 Generating and Processing Random Signals

16

Durbin-Watson Test (1)

D = [(1/N) Σ_{n=2}^{N} (X[n] - X[n-1])^2] / [(1/N) Σ_{n=2}^{N} X^2[n]]

Let X = X[n] and Y = X[n-1]. Assume X[n] and X[n-1] are correlated and that X[n] is an ergodic process. Then

D = E{(X - Y)^2} / E{X^2} = (1/σ_x^2) E{(X - Y)^2}

Let Y = ρX + sqrt(1 - ρ^2) Z, with -1 ≤ ρ ≤ 1.

Page 17: Chapter 7 Generating and Processing Random Signals

17

Durbin-Watson Test (2)

With X and Z uncorrelated and zero mean, each with variance σ^2:

D = (1/σ^2) E{[(1 - ρ)X - sqrt(1 - ρ^2) Z]^2} = [(1 - ρ)^2 σ^2 + (1 - ρ^2) σ^2] / σ^2 = 2(1 - ρ)

- D > 2: negative correlation
- D = 2: uncorrelated (most desired)
- D < 2: positive correlation

Page 18: Chapter 7 Generating and Processing Random Signals

18

Example 7.6

rand(1,2048): the value of D is 2.0081 and ρ is -0.0041.

x_{i+1} = [65 x_i + 1] mod(2048): the value of D is 1.9925 and ρ is 0.0037273.

x_{i+1} = [1229 x_i + 1] mod(2048): the value of D is 1.6037 and ρ is 0.19814.
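A minimal MATLAB sketch of the Durbin-Watson computation for a zero-mean sample sequence X (here a stand-in drawn with rand):

    % Durbin-Watson statistic and the implied correlation estimate rho.
    X = rand(1, 2048) - 0.5;              % zero-mean stand-in sequence
    D = sum(diff(X).^2) / sum(X(2:end).^2)   % the (1/N) factors cancel
    rho = 1 - D/2                         % from D = 2(1 - rho)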

Page 19: Chapter 7 Generating and Processing Random Signals

19

Minimum Standards

- Full period
- Passes all applicable statistical tests for randomness
- Easily transportable from one computer to another

The Lewis, Goodman, and Miller minimum standard (used prior to MATLAB 5):

x_{i+1} = [16807 x_i] mod(2^31 - 1)

Page 20: Chapter 7 Generating and Processing Random Signals

20

Mapping Uniform RVs to an Arbitrary pdf

- The cumulative distribution function (CDF) of the target random variable is known in closed form: Inverse Transform Method

- The pdf of the target random variable is known in closed form but the CDF is not: Rejection Method

- Neither the pdf nor the CDF is known in closed form: Histogram Method

Page 21: Chapter 7 Generating and Processing Random Signals

21

Inverse Transform Method

The CDF F_X(x) is known in closed form.

Set U = F_X(X) = Pr{X ≤ x} and invert:

X = F_X^{-1}(U)

Check: Pr{X ≤ x} = Pr{F_X^{-1}(U) ≤ x} = Pr{U ≤ F_X(x)} = F_X(x)

[Figure: a CDF F_X(x) rising to 1; a uniform sample U on the vertical axis maps through the curve to X = F_X^{-1}(U) on the horizontal axis.]

Page 22: Chapter 7 Generating and Processing Random Signals

22

Example 7.8 (1)

A Rayleigh random variable has pdf

f_R(r) = (r/σ^2) exp(-r^2/(2σ^2)) u(r)

Integrating gives the CDF:

F_R(r) = ∫_0^r (y/σ^2) exp(-y^2/(2σ^2)) dy = 1 - exp(-r^2/(2σ^2))

Setting F_R(R) = U:

1 - exp(-R^2/(2σ^2)) = U

Page 23: Chapter 7 Generating and Processing Random Signals

23

Example 7.8 (2)

Since the RV 1 - U is equivalent to U (they have the same pdf), we can set

exp(-R^2/(2σ^2)) = U

Solving for R gives

R = sqrt(-2σ^2 ln(U))

MATLAB: [n,xout] = hist(Y,nbins) and bar(xout,n) plot the histogram.
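A minimal MATLAB sketch of this example (sigma, the sample count, and the bin count are arbitrary choices):

    % Rayleigh samples by the inverse transform method, then a histogram.
    sigma = 1; N = 10000;
    U = rand(1, N);                       % uniform on (0,1)
    R = sqrt(-2*sigma^2*log(U));          % R = sqrt(-2 sigma^2 ln U)
    [n, xout] = hist(R, 50);
    bar(xout, n)                          % histogram of the generated samples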

Page 24: Chapter 7 Generating and Processing Random Signals

24

Example 7.8 (3)

[Figure: top panel, histogram of the generated samples (number of samples versus the independent variable x); bottom panel, the true Rayleigh pdf compared with the probability density estimated from the histogram.]

Page 25: Chapter 7 Generating and Processing Random Signals

25

The Histogram Method

The CDF and pdf are unknown; both are estimated from a histogram of sample data. More samples give more accuracy.

Within bin i the pdf is approximated by a constant c_i, so

P_i = Pr{x_{i-1} < X ≤ x_i} = c_i (x_i - x_{i-1})

F_{i-1} = Pr{X ≤ x_{i-1}} = Σ_{j=1}^{i-1} P_j

F_X(x) = F_{i-1} + c_i (x - x_{i-1})

Setting F_X(X) = U = F_{i-1} + c_i (X - x_{i-1}) and solving:

X = x_{i-1} + (1/c_i)(U - F_{i-1})

Page 26: Chapter 7 Generating and Processing Random Signals

26

Rejection Methods (1)

Given a target pdf f_X(x), choose a bounding function M g_X(x) with

M g_X(x) ≥ f_X(x), for all x

A convenient choice is a uniform g_X:

M g_X(x) = M/a = b for 0 ≤ x ≤ a, and 0 otherwise

where b = M/a ≥ max{f_X(x)}.

[Figure: f_X(x) lying under the rectangle M g_X(x) = b of width a; g_X(x) itself has height 1/a.]

Page 27: Chapter 7 Generating and Processing Random Signals

27

Rejection Methods (2)

1. Generate U_1 and U_2, uniform in (0,1).

2. Generate V_1 = a U_1, uniform in (0,a), where a is the maximum value of X.

3. Generate V_2 = b U_2, uniform in (0,b), where b is at least the maximum value of f_X(x).

4. If V_2 ≤ f_X(V_1), set X = V_1. If the inequality is not satisfied, V_1 and V_2 are discarded and the process is repeated from step 1.

Page 28: Chapter 7 Generating and Processing Random Signals

28

Example 7.9 (1)

The target pdf is

f_X(x) = (4/(πR^2)) sqrt(R^2 - x^2), 0 ≤ x ≤ R; 0 otherwise

With g_X(x) = 1/R on (0,R), the bound must satisfy M g_X(x) = M/R ≥ max f_X(x) = 4/(πR), so M = 4/π.

[Figure: f_X(x) under the rectangle M g_X(x) of height 4/(πR) and width R.]
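A minimal MATLAB sketch of the rejection steps applied to this pdf (R and the sample count are arbitrary choices):

    % Rejection method for f_X(x) = (4/(pi*R^2))*sqrt(R^2 - x^2) on (0,R).
    R = 5; b = 4/(pi*R); N = 5000;        % b = max f_X(x), reached at x = 0
    X = zeros(1, N); i = 0;
    while i < N
        V1 = R*rand;                      % V1 uniform on (0,a), a = R
        V2 = b*rand;                      % V2 uniform on (0,b)
        if V2 <= (4/(pi*R^2))*sqrt(R^2 - V1^2)
            i = i + 1; X(i) = V1;         % accept
        end                               % otherwise discard and redraw
    end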

Page 29: Chapter 7 Generating and Processing Random Signals

29

Example 7.9 (2)

[Figure: top panel, histogram of the accepted samples (number of samples versus the independent variable x); bottom panel, the true pdf compared with the probability density estimated from the histogram.]

Page 30: Chapter 7 Generating and Processing Random Signals

30

Generating Uncorrelated Gaussian RV

Its CDF can’t be written in closed form , so Inverse method can’t be used and rejection method are not efficient

Other techniques

1.The sum of uniform method

2.Mapping a Rayleigh to Gaussian RV

3.The polar method

Page 31: Chapter 7 Generating and Processing Random Signals

31

The Sum of Uniforms Method (1)

1. Based on the central limit theorem.
2. Form

Y = B Σ_{i=1}^{N} (U_i - 1/2)

where U_i, i = 1, 2, ..., N, are independent uniform RVs and B is a constant that determines the variance of Y.

3. As N → ∞, Y converges to a Gaussian RV.

Page 32: Chapter 7 Generating and Processing Random Signals

32

The Sum of Uniforms Method (2)

Expectation and variance:

E{U_i} = 1/2, so E{Y} = B Σ_{i=1}^{N} (E{U_i} - 1/2) = 0

var{U_i - 1/2} = ∫_{-1/2}^{1/2} x^2 dx = 1/12

σ_y^2 = var{Y} = B^2 Σ_{i=1}^{N} var{U_i - 1/2} = N B^2 / 12

B = σ_y sqrt(12/N)

We can set σ_y^2 to any desired value, but the pdf of Y is nonzero only for |y| ≤ NB/2 = σ_y sqrt(3N).
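A minimal MATLAB sketch of the method (N, sigma_y, and the sample count are arbitrary choices):

    % Sum-of-uniforms Gaussian approximation: one column per generated sample.
    N = 12; sigma_y = 1; M = 10000;
    B = sigma_y*sqrt(12/N);
    Y = B*sum(rand(N, M) - 0.5);          % length-M row of approximate Gaussians
    var(Y)                                % should be close to sigma_y^2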

Page 33: Chapter 7 Generating and Processing Random Signals

33

The Sum of Uniforms Method (3)

The result is only approximately Gaussian: the pdf is truncated at ±σ_y sqrt(3N), which may not be a realistic tail behavior.

Page 34: Chapter 7 Generating and Processing Random Signals

34

Mapping a Rayleigh to Gaussian RVs (1)

A Rayleigh RV can be generated by

R = sqrt(-2σ^2 ln(U))

where U is a uniform RV on (0,1).

Assume X and Y are independent zero-mean Gaussian RVs with common variance σ^2; their joint pdf is

f_XY(x,y) = (1/(2πσ^2)) exp(-x^2/(2σ^2)) exp(-y^2/(2σ^2)) = (1/(2πσ^2)) exp(-(x^2 + y^2)/(2σ^2))

Page 35: Chapter 7 Generating and Processing Random Signals

35

Mapping a Rayleigh to Gaussian RVs (2)

Transform to polar coordinates: let x = r cos θ and y = r sin θ, so that x^2 + y^2 = r^2 and θ = tan^{-1}(y/x).

Probability is preserved over corresponding areas, f_RΘ(r,θ) dA_RΘ = f_XY(x,y) dA_XY, and the Jacobian of the transformation gives dA_XY = r dA_RΘ. Therefore

f_RΘ(r,θ) = (r/(2πσ^2)) exp(-r^2/(2σ^2))

Page 36: Chapter 7 Generating and Processing Random Signals

36

Mapping a Rayleigh to Gaussian RVs (3)

Examine the marginal pdfs:

f_R(r) = ∫_0^{2π} (r/(2πσ^2)) exp(-r^2/(2σ^2)) dθ = (r/σ^2) exp(-r^2/(2σ^2)), r ≥ 0

f_Θ(θ) = ∫_0^∞ (r/(2πσ^2)) exp(-r^2/(2σ^2)) dr = 1/(2π), 0 ≤ θ ≤ 2π

So R is a Rayleigh RV, Θ is a uniform RV, and

X = R cos Θ = sqrt(-2σ^2 ln(U_1)) cos(2πU_2)

Y = R sin Θ = sqrt(-2σ^2 ln(U_1)) sin(2πU_2)
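A minimal MATLAB sketch of this mapping (sigma and the sample count are arbitrary choices):

    % Rayleigh-to-Gaussian (Box-Muller) mapping: two Gaussians per uniform pair.
    sigma = 1; N = 10000;
    U1 = rand(1, N); U2 = rand(1, N);
    X = sqrt(-2*sigma^2*log(U1)) .* cos(2*pi*U2);
    Y = sqrt(-2*sigma^2*log(U1)) .* sin(2*pi*U2);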

Page 37: Chapter 7 Generating and Processing Random Signals

37

The Polar Method

From the previous result (with σ = 1),

X = sqrt(-2 ln(U_1)) cos(2πU_2),  Y = sqrt(-2 ln(U_1)) sin(2πU_2)

We may transform these to avoid evaluating the trigonometric functions. For a point (u, v) inside the unit circle, let

s = R_s^2 = u^2 + v^2  (so R_s = sqrt(s))

cos(2πU_2) = u/R_s = u/sqrt(s),  sin(2πU_2) = v/R_s = v/sqrt(s)

Since s is itself uniform on (0,1), it can play the role of the uniform variate in the logarithm:

X = sqrt(-2 ln(s)) (u/sqrt(s)) = u sqrt(-2 ln(s)/s)

Y = sqrt(-2 ln(s)) (v/sqrt(s)) = v sqrt(-2 ln(s)/s)

Page 38: Chapter 7 Generating and Processing Random Signals

38

The Polar Method Algorithm

1. Generate two uniform RVs, U_1 and U_2, both on the interval (0,1).
2. Let V_1 = 2U_1 - 1 and V_2 = 2U_2 - 1, so they are independent and uniform on (-1,1).
3. Let S = V_1^2 + V_2^2. If S < 1, continue; else go back to step 1.
4. Form A(S) = sqrt(-2 ln(S) / S).
5. Set X = A(S) V_1 and Y = A(S) V_2.
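A minimal MATLAB sketch of these five steps (the sample count is an arbitrary choice):

    % Polar method: accept only points inside the unit circle.
    N = 10000; X = zeros(1, N); Y = zeros(1, N); i = 0;
    while i < N
        V1 = 2*rand - 1; V2 = 2*rand - 1;  % uniform on (-1,1)
        S = V1^2 + V2^2;
        if S > 0 && S < 1                  % accept; about pi/4 of draws pass
            A = sqrt(-2*log(S)/S);
            i = i + 1; X(i) = A*V1; Y(i) = A*V2;
        end
    end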

Page 39: Chapter 7 Generating and Processing Random Signals

39

Establishing a Given Correlation Coefficient (1)

Assume two Gaussian RVs X and Y that are zero-mean and uncorrelated.

Define a new RV

Z = ρX + sqrt(1 - ρ^2) Y,  |ρ| ≤ 1

Z is also a Gaussian RV. We show that ρ is the correlation coefficient relating X and Z.

Page 40: Chapter 7 Generating and Processing Random Signals

40

Establishing a Given Correlation Coefficient (2)

Mean:

E{Z} = ρ E{X} + sqrt(1 - ρ^2) E{Y} = 0

Variance, using E{XY} = E{X} E{Y} = 0 and σ_X^2 = σ_Y^2 = σ^2 (both RVs are zero mean):

σ_Z^2 = E{[ρX + sqrt(1 - ρ^2) Y]^2} = ρ^2 E{X^2} + 2ρ sqrt(1 - ρ^2) E{XY} + (1 - ρ^2) E{Y^2} = ρ^2 σ^2 + (1 - ρ^2) σ^2 = σ^2

Page 41: Chapter 7 Generating and Processing Random Signals

41

Establishing a Given Correlation Coefficient (3)

Covariance between X and Z:

E{XZ} = E{X[ρX + sqrt(1 - ρ^2) Y]} = ρ E{X^2} + sqrt(1 - ρ^2) E{XY} = ρ σ^2

Therefore

ρ_XZ = E{XZ}/(σ_X σ_Z) = ρ σ^2 / σ^2 = ρ

as desired.
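A minimal MATLAB sketch (the target rho and the sample count are arbitrary choices); the last line estimates the achieved correlation coefficient:

    % Impose a chosen correlation coefficient on two uncorrelated Gaussians.
    rho = 0.6; N = 100000;
    X = randn(1, N); Y = randn(1, N);
    Z = rho*X + sqrt(1 - rho^2)*Y;
    rho_hat = sum(X.*Z)/sqrt(sum(X.^2)*sum(Z.^2))  % should be close to rho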

Page 42: Chapter 7 Generating and Processing Random Signals

42

Pseudonoise (PN) Sequence Generators

A PN generator produces a periodic sequence that appears to be random.

The sequence is generated by an algorithm using an initial seed. Although not random, it can pass many tests of randomness. Unless the algorithm and seed are known, the sequence is impractical to predict.

Page 43: Chapter 7 Generating and Processing Random Signals

43

PN Generator implementation

Page 44: Chapter 7 Generating and Processing Random Signals

44

Properties of the Linear Feedback Shift Register (LFSR)

- Nearly random, with a long period.
- An m-stage LFSR may have a maximum period of 2^m - 1. An output with this period is called a maximal-length sequence, or m-sequence.
- We define the generator polynomial as g(D) = g_m D^m + g_{m-1} D^{m-1} + ... + g_1 D + g_0. Coefficients that generate an m-sequence can always be found.
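A hedged MATLAB sketch of an LFSR-based PN generator; the tap set below (corresponding to a degree-5 primitive polynomial) is an assumed example, not necessarily the generator used in the figures that follow:

    % Fibonacci LFSR: output the last stage, feed back the XOR of tapped stages.
    taps = [5 2]; m = 5;                  % assumed primitive-polynomial taps
    reg = [1 zeros(1, m-1)];              % any nonzero seed
    N = 2^m - 1; out = zeros(1, N);       % one full period (31 bits)
    for i = 1:N
        out(i) = reg(end);
        fb = mod(sum(reg(taps)), 2);      % XOR of the tapped stages
        reg = [fb reg(1:end-1)];          % shift the register
    end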

Page 45: Chapter 7 Generating and Processing Random Signals

45

Example of PN generator

Page 46: Chapter 7 Generating and Processing Random Signals

46

Different seeds for the PN generator

Page 47: Chapter 7 Generating and Processing Random Signals

47

Family of m-sequences

Page 48: Chapter 7 Generating and Processing Random Signals

48

Property of m-sequence

An m-sequence of period N = 2^m - 1 contains 2^{m-1} ones and 2^{m-1} - 1 zeros.

The periodic autocorrelation of an m-sequence is

R[k] = 1 for k = 0, ±N, ±2N, ..., and R[k] = -1/N otherwise.

If the PN sequence has a large period, the autocorrelation function approaches an impulse, and the PSD is approximately white, as desired.

Page 49: Chapter 7 Generating and Processing Random Signals

49

PN Autocorrelation Function

Page 50: Chapter 7 Generating and Processing Random Signals

50

Signal Processing

Relationships between the input and output of a linear system:

1. mean of input and output
2. variance of input and output
3. input-output cross-correlation
4. autocorrelation and PSD

Page 51: Chapter 7 Generating and Processing Random Signals

51

Input/Output Means

Assume the system is linear, so the output is the convolution

y[n] = Σ_k h[k] x[n-k]

Taking expectations,

E{y[n]} = E{Σ_k h[k] x[n-k]} = Σ_k h[k] E{x[n-k]}

Under the stationarity assumption, E{x[n-k]} = E{x[n]} = E{x}, so

E{y} = E{x} Σ_k h[k] = H(0) E{x}

where Σ_k h[k] = H(0) is the DC gain of the filter.
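A minimal MATLAB sketch checking E{y} = H(0) E{x}; the impulse response and the input mean are arbitrary choices:

    % Compare the output mean with H(0) times the input mean.
    h = [1 0.5 0.25 0.125];               % example impulse response
    x = 2 + randn(1, 100000);             % input with mean E{x} = 2
    y = filter(h, 1, x);
    [mean(y), sum(h)*mean(x)]             % the two values should nearly agree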

Page 52: Chapter 7 Generating and Processing Random Signals

52

Input/Output Cross-Correlation

The cross-correlation is defined by

R_xy[m] = E{x[n] y[n+m]} = E{x[n] Σ_j h[j] x[n+m-j]}

R_xy[m] = Σ_j h[j] E{x[n] x[n+m-j]} = Σ_j h[j] R_xx[m-j]

This result is used in the development of a number of performance estimators, which will be developed in Chapter 8.

Page 53: Chapter 7 Generating and Processing Random Signals

53

Output Autocorrelation Function (1)

The autocorrelation of the output is

R_yy[m] = E{y[n] y[n+m]} = E{Σ_j h[j] x[n-j] Σ_k h[k] x[n+m-k]}

R_yy[m] = Σ_j Σ_k h[j] h[k] E{x[n-j] x[n+m-k]} = Σ_j Σ_k h[j] h[k] R_xx(m-k+j)

This cannot be simplified further without knowledge of the statistics of x[n].

Page 54: Chapter 7 Generating and Processing Random Signals

54

Output Autocorrelation Function (2)

If the input is delta-correlated (i.e., white noise),

R_xx[m] = E{x[n] x[n+m]} = σ_x^2, m = 0; 0, m ≠ 0

Substituting into the previous equation gives

R_yy[m] = σ_x^2 Σ_j Σ_k h[j] h[k] δ(m-k+j) = σ_x^2 Σ_j h[j] h[j+m]

Page 55: Chapter 7 Generating and Processing Random Signals

55

Input/Output Variances

By definition, R_yy[0] = E{y^2[n]}. Letting m = 0 in the expression for R_yy[m]:

σ_y^2 = R_yy[0] = Σ_j Σ_k h[j] h[k] R_xx[j-k]

But if x[n] is a white noise sequence, this reduces to

σ_y^2 = R_yy[0] = σ_x^2 Σ_j h^2[j]
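A minimal MATLAB sketch checking the white-noise variance relation; the impulse response is an arbitrary choice:

    % Compare the output variance with sigma_x^2 * sum(h.^2) for a white input.
    h = [1 0.5 0.25 0.125];               % example impulse response
    x = randn(1, 100000);                 % white Gaussian input, sigma_x^2 = 1
    y = filter(h, 1, x);
    [var(y), sum(h.^2)]                   % the two values should nearly agree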

Page 56: Chapter 7 Generating and Processing Random Signals

56

The End. Thanks for listening!