Random Signals


CHAPTER 9

Random Processes

INTRODUCTION

Much of your background in signals and systems is assumed to have focused on the effect of LTI systems on deterministic signals, developing tools for analyzing this class of signals and systems, and using what you learned in order to understand applications in communication (e.g., AM and FM modulation), control (e.g., stability of feedback systems), and signal processing (e.g., filtering). It is important to develop a comparable understanding and associated tools for treating the effect of LTI systems on signals modeled as the outcome of probabilistic experiments, i.e., a class of signals referred to as random signals (alternatively referred to as random processes or stochastic processes). Such signals play a central role in signal and system design and analysis, and throughout the remainder of this text. In this chapter we define random processes via the associated ensemble of signals, and begin to explore their properties. In successive chapters we use random processes as models for random or uncertain signals that arise in communication, control and signal processing applications.

9.1 DEFINITION AND EXAMPLES OF A RANDOM PROCESS

In Section 7.3 we defined a random variable X as a function that maps each outcome of a probabilistic experiment to a real number. In a similar manner, a real-valued CT or DT random process, X(t) or X[n] respectively, is a function that maps each outcome of a probabilistic experiment to a real CT or DT signal respectively, termed the realization of the random process in that experiment. For any fixed time instant t = t0 or n = n0, the quantities X(t0) and X[n0] are just random variables. The collection of signals that can be produced by the random process is referred to as the ensemble of signals in the random process.

EXAMPLE 9.1 Random Oscillators

As an example of a random process, imagine a warehouse containing N harmonic oscillators, each producing a sinusoidal waveform of some specific amplitude, frequency, and phase, all of which may be different for the different oscillators. The probabilistic experiment that results in the ensemble of signals consists of selecting an oscillator according to some probability mass function (PMF) that assigns a probability to each of the numbers from 1 to N, so that the ith oscillator is picked


with probability pi. Associated with each outcome of this experiment is a specific sinusoidal waveform.

FIGURE 9.1 A random process: amplitude of one realization plotted against t, with a sample instant t0 marked.

In Example 9.1, before an oscillator is chosen, there is uncertainty about what the amplitude, frequency and phase of the outcome of the experiment will be. Consequently, for this example, we might express the random process as

$$X(t) = A \sin(\omega t + \phi)$$

where the amplitude A, frequency ω and phase φ are all random variables. The value X(t1) at some specific time t1 is also a random variable. In the context of this experiment, knowing the PMF associated with each of the numbers 1 to N involved in choosing an oscillator, as well as the specific amplitude, frequency and phase of each oscillator, we could determine the probability distributions of any of the underlying random variables A, ω, φ, or X(t1) mentioned above.

Throughout this and later chapters, we will be considering many other examples of random processes. What is important at this point, however, is to develop a good mental picture of what a random process is. A random process is not just one signal but rather an ensemble of signals, as illustrated schematically in Figure 9.2 below, for which the outcome of the probabilistic experiment could be any of the four waveforms indicated. Each waveform is deterministic, but the process is probabilistic or random because it is not known a priori which waveform will be generated by the probabilistic experiment. Consequently, prior to obtaining the outcome of the probabilistic experiment, many aspects of the signal are unpredictable, since there is uncertainty associated with which signal will be produced. After the experiment, or a posteriori, the outcome is totally determined.

If we focus on the values that a random process X(t) can take at a particular instant of time, say t1 (i.e., if we look down the entire ensemble at a fixed time), what we have is a random variable, namely X(t1). If we focus on the ensemble of values taken at an arbitrary collection of fixed time instants t1 < t2 < ⋯ < tℓ, we are dealing with a set of ℓ jointly distributed random variables X(t1), …, X(tℓ).


FIGURE 9.2 Realizations xa(t), xb(t), xc(t), xd(t) of the random process X(t), with sample instants t1 and t2 marked.

From this point of view, a random process can be thought of as a family of jointly distributed random variables indexed by t (or n in the DT case). A full probabilistic characterization of this collection of random variables would require the joint PDFs of multiple samples of the signal, taken at arbitrary times:

$$f_{X(t_1), X(t_2), \ldots, X(t_\ell)}(x_1, x_2, \ldots, x_\ell)$$

for all ℓ and all t1, t2, …, tℓ.

An important set of questions that arises as we work with random processes in later chapters of this book is whether, by observing just part of the outcome of a random process, we can determine the complete outcome. The answer will depend on the details of the random process, but in general the answer is no. For some random processes, having observed the outcome in a given time interval might provide sufficient information to know exactly which ensemble member was determined. In other cases it would not be sufficient. We will be exploring some of these aspects in more detail later, but we conclude this section with two additional examples that


further emphasize these points.

EXAMPLE 9.2 Ensemble of batteries

Consider a collection of N batteries, each providing one voltage out of a given finite set of voltage values. The histogram of voltages (i.e., the number of batteries with a given voltage) is given in Figure 9.3. The probabilistic experiment is to choose

one of the batteries, with the probability of picking any specific one being 1/N, i.e., they are all equally likely to be picked. A little reflection should convince you that if we multiply the histogram in Figure 9.3 by 1/N, this normalized histogram will represent (or approximate) the PMF for the battery voltage at the outcome of the experiment. Since the battery voltage is a constant signal, this corresponds to a random process, and in fact is similar to the oscillator example discussed earlier, but with ω = 0 and φ = 0, so that only the amplitude is random.

FIGURE 9.3 Histogram of battery distribution for Example 9.2 (number of batteries versus voltage).

For this example, observation of X(t) at any one time is sufficient information to determine the outcome for all time.

EXAMPLE 9.3 Ensemble of coin tossers

Consider N people, each independently having written down a long random string of ones and zeros, with each entry chosen independently of any other entry in their string (similar to a sequence of independent coin tosses). The random process now comprises this ensemble of strings. A realization of the process is obtained by randomly selecting a person (and therefore one of the N strings of ones and zeros), following which the specific ensemble member of the random process is totally determined. The random process described in this example is often referred to as


the Bernoulli process because of the way in which the string of ones and zeros is generated (by independent coin flips).

Now suppose that person shows you only the tenth entry in the string. Can you determine (or predict) the eleventh entry from just that information? Because of the manner in which the string was generated, the answer clearly is no. Similarly, if the entire past history up to the tenth entry was revealed to you, could you determine the remaining sequence beyond the tenth? For this example, the answer is again clearly no.

While the entire sequence has been determined by the nature of the experiment, partial observation of a given ensemble member is in general not sufficient to fully specify that member.

Rather than looking at the nth entry of a single ensemble member, we can consider the random variable corresponding to the values from the entire ensemble at the nth entry. Looking down the ensemble at n = 10, for example, we would see ones and zeros with equal probability.

In the above discussion we indicated and emphasized that a random process can be thought of as a family of jointly distributed random variables indexed by t or n. Obviously it would in general be extremely difficult or impossible to represent a random process this way. Fortunately, the most widely used random process models have special structure that permits computation of such a statistical specification. Also, particularly when we are processing our signals with linear systems, we often design the processing or analyze the results by considering only the first and second moments of the process, namely the following functions:

$$\text{Mean:} \quad \mu_X(t_i) = E[X(t_i)], \tag{9.1}$$

$$\text{Auto-correlation:} \quad R_{XX}(t_i, t_j) = E[X(t_i)X(t_j)], \quad\text{and} \tag{9.2}$$

$$\text{Auto-covariance:} \quad C_{XX}(t_i, t_j) = E[(X(t_i) - \mu_X(t_i))(X(t_j) - \mu_X(t_j))] = R_{XX}(t_i, t_j) - \mu_X(t_i)\,\mu_X(t_j). \tag{9.3}$$
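As a sketch of how the ensemble averages (9.1)–(9.3) would be estimated from data, the following example replaces the expectations by sample averages down a matrix of realizations (one row per ensemble member); the estimators are ordinary sample means of our own choosing, not something prescribed by the text:

```python
import numpy as np

def ensemble_moments(X, i, j):
    """Sample estimates of (9.1)-(9.3) from an ensemble of realizations.

    X : array of shape (num_realizations, num_times); each row is one
        realization sampled on a common time grid.
    i, j : column indices playing the roles of t_i and t_j.
    """
    mu_i = X[:, i].mean()                                # mu_X(t_i), Eqn. (9.1)
    mu_j = X[:, j].mean()
    R_ij = (X[:, i] * X[:, j]).mean()                    # R_XX(t_i,t_j), Eqn. (9.2)
    C_ij = ((X[:, i] - mu_i) * (X[:, j] - mu_j)).mean()  # C_XX(t_i,t_j), Eqn. (9.3)
    assert np.isclose(C_ij, R_ij - mu_i * mu_j)          # the identity in (9.3)
    return mu_i, R_ij, C_ij
```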

The word auto (which is sometimes written without the hyphen, and sometimes dropped altogether to simplify the terminology) here refers to the fact that both samples in the correlation function or the covariance function come from the same process; we shall shortly encounter an extension of this idea, where the samples are taken from two different processes.

One case in which the first and second moments actually suffice to completely specify the process is the case of what is called a Gaussian process, defined as a process whose samples are always jointly Gaussian (the generalization of the bivariate Gaussian to many variables).

We can also consider multiple random processes, e.g., two processes, X(t) and Y(t). For a full stochastic characterization of this pair, we need the PDFs of all possible combinations of samples from X(t) and Y(t). We say that X(t) and Y(t) are independent if every set of samples from X(t) is independent of every set of samples from Y(t),


so that the joint PDF factors as follows:

$$f_{X(t_1), \ldots, X(t_k),\, Y(t'_1), \ldots, Y(t'_\ell)}(x_1, \ldots, x_k,\, y_1, \ldots, y_\ell) = f_{X(t_1), \ldots, X(t_k)}(x_1, \ldots, x_k) \cdot f_{Y(t'_1), \ldots, Y(t'_\ell)}(y_1, \ldots, y_\ell). \tag{9.4}$$

If only first and second moments are of interest, then in addition to the individual first and second moments of X(t) and Y(t) respectively, we need to consider the cross-moment functions:

$$\text{Cross-correlation:} \quad R_{XY}(t_i, t_j) = E[X(t_i)Y(t_j)], \quad\text{and} \tag{9.5}$$

$$\text{Cross-covariance:} \quad C_{XY}(t_i, t_j) = E[(X(t_i) - \mu_X(t_i))(Y(t_j) - \mu_Y(t_j))] = R_{XY}(t_i, t_j) - \mu_X(t_i)\,\mu_Y(t_j). \tag{9.6}$$

If C_XY(t1, t2) = 0 for all t1, t2, we say that the processes X(t) and Y(t) are uncorrelated. Note again that the term uncorrelated in its common usage means that the processes have zero covariance rather than zero correlation.

Note that everything we have said above can be carried over to the case of DT random processes, except that now the sampling instants are restricted to be discrete time instants. In accordance with our convention of using square brackets [·] around the time argument for DT signals, we will write μ_X[n] for the mean of a random process X[·] at time n; similarly, we will write R_XX[ni, nj] for the correlation function involving samples at times ni and nj; and so on.

9.2 STRICT-SENSE STATIONARITY

In general, we would expect that the joint PDFs associated with the random variables obtained by sampling a random process at an arbitrary number k of arbitrary times will be time dependent, i.e., the joint PDF

$$f_{X(t_1), \ldots, X(t_k)}(x_1, \ldots, x_k)$$

will depend on the specific values of t1, …, tk. If all the joint PDFs stay the same under arbitrary time shifts, i.e., if

$$f_{X(t_1), \ldots, X(t_k)}(x_1, \ldots, x_k) = f_{X(t_1 + \tau), \ldots, X(t_k + \tau)}(x_1, \ldots, x_k) \tag{9.7}$$

for arbitrary τ, then the random process is said to be strict-sense stationary (SSS). Said another way, for a strict-sense stationary process, the statistics depend only on the relative times at which the samples are taken, not on the absolute times.

EXAMPLE 9.4 Representing an i.i.d. process

Consider a DT random process whose values X[n] may be regarded as independently chosen at each time n from a fixed PDF f_X(x), so the values are independent and identically distributed, thereby yielding what is called an i.i.d. process. Such processes are widely used in modeling and simulation. For instance, if a particular


DT communication channel corrupts a transmitted signal with added noise that takes independent values at each time instant, but with characteristics that seem unchanging over the time window of interest, then the noise may be well modeled as an i.i.d. process. It is also easy to generate an i.i.d. process in a simulation environment, provided one can arrange a random number generator to produce samples from a specified PDF (and there are several good ways to do this). Processes with more complicated dependence across time samples can then be obtained by filtering or other operations on the i.i.d. process, as we shall see in the next chapter.

For such an i.i.d. process, we can write the joint PDF quite simply:

$$f_{X[n_1], X[n_2], \ldots, X[n_\ell]}(x_1, x_2, \ldots, x_\ell) = f_X(x_1)\, f_X(x_2) \cdots f_X(x_\ell) \tag{9.8}$$

for any choice of ℓ and n1, …, nℓ. The process is clearly SSS.
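As a sketch of the simulation point made in this example, the snippet below generates an i.i.d. process from a specified PDF (a uniform PDF here, an arbitrary illustrative choice) and then filters it to create dependence across time samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# An i.i.d. DT process: each x[n] drawn independently from a fixed PDF.
x = rng.uniform(-1.0, 1.0, size=100_000)

# Filtering the i.i.d. process introduces dependence across time samples.
h = np.array([1.0, 0.5, 0.25])                 # a short FIR filter (illustrative)
y = np.convolve(x, h)[:x.size]

# Adjacent samples of x are uncorrelated; adjacent samples of y are not.
print(np.corrcoef(x[:-1], x[1:])[0, 1])        # ~ 0
print(np.corrcoef(y[:-1], y[1:])[0, 1])        # clearly nonzero
```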

    9.3 WIDE-SENSE STATIONARITY

Of particular use to us is a less restricted type of stationarity. Specifically, if the mean value μ_X(ti) is independent of time and the autocorrelation R_XX(ti, tj), or equivalently the autocovariance C_XX(ti, tj), is dependent only on the time difference (ti − tj), then the process is said to be wide-sense stationary (WSS). Clearly a process that is SSS is also WSS. For a WSS random process X(t), therefore, we have

$$\mu_X(t) = \mu_X \tag{9.9}$$

$$R_{XX}(t_1, t_2) = R_{XX}(t_1 + \alpha,\, t_2 + \alpha) \;\text{ for every } \alpha \;=\; R_{XX}(t_1 - t_2,\, 0). \tag{9.10}$$

(Note that for a Gaussian process (i.e., a process whose samples are always jointly Gaussian) WSS implies SSS, because jointly Gaussian variables are entirely determined by their joint first and second moments.)

Two random processes X(t) and Y(t) are jointly WSS if their first and second moments (including the cross-covariance) are stationary. In this case we use the notation R_XY(τ) to denote E[X(t + τ)Y(t)].

EXAMPLE 9.5 Random Oscillators Revisited

Consider again the harmonic oscillators as introduced in Example 9.1, i.e.,

$$X(t; A, \Theta) = A \cos(\omega_0 t + \Theta)$$

where A and Θ are independent random variables, and now ω0 is fixed at some known value.

If Θ is actually fixed at the constant value θ0, then every outcome is of the form x(t) = A cos(ω0t + θ0), and it is straightforward to see that this process is not WSS


(and hence not SSS). For instance, if A has a nonzero mean value, μ_A ≠ 0, then the expected value of the process, namely μ_A cos(ω0t + θ0), is time-varying. To argue that the process is not WSS even when μ_A = 0, we can examine the autocorrelation function. Note that x(t) is fixed at the value 0 for all values of t such that ω0t + θ0 is an odd multiple of π/2, and takes the values ±A halfway between such points; the correlation between such samples taken π/ω0 apart in time can correspondingly be 0 (in the former case) or −E[A²] (in the latter). The process is thus not WSS.

On the other hand, if Θ is distributed uniformly in [−π, π], then

$$\mu_X(t) = \mu_A \int_{-\pi}^{\pi} \frac{1}{2\pi} \cos(\omega_0 t + \theta)\, d\theta = 0, \tag{9.11}$$

$$C_{XX}(t_1, t_2) = R_{XX}(t_1, t_2) = E[A^2]\, E[\cos(\omega_0 t_1 + \Theta)\cos(\omega_0 t_2 + \Theta)] = \frac{E[A^2]}{2} \cos(\omega_0 (t_2 - t_1)), \tag{9.12}$$

so the process is WSS. It can also be shown to be SSS, though this is not totally straightforward to show formally.
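A quick numerical check of (9.11) and (9.12) is straightforward; in this sketch the amplitude distribution and the value of ω0 are illustrative choices of ours:

```python
import numpy as np

rng = np.random.default_rng(0)
w0, M = 2.0, 200_000                       # omega_0, ensemble size (illustrative)
A = rng.uniform(0.0, 2.0, M)               # any amplitude PDF, independent of Theta
theta = rng.uniform(-np.pi, np.pi, M)      # Theta uniform on [-pi, pi]

def X(t):                                  # X(t; A, Theta), evaluated down the ensemble
    return A * np.cos(w0 * t + theta)

t1, t2 = 0.7, 1.9
print(X(t1).mean())                                    # ~ 0, matching (9.11)
print((X(t1) * X(t2)).mean())                          # sample R_XX(t1, t2)
print(0.5 * (A**2).mean() * np.cos(w0 * (t2 - t1)))    # the (9.12) prediction
```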

To simplify notation for a WSS process, we write the correlation function as R_XX(t1 − t2); the argument t1 − t2 is referred to as the lag at which the correlation is computed. For the most part, the random processes that we treat will be WSS processes. When considering just first and second moments and not entire PDFs or CDFs, it will be less important to distinguish between the random process X(t) and a specific realization x(t) of it, so we shall go one step further in simplifying notation, by using lowercase letters to denote the random process itself.

We shall thus talk of the random process x(t), and in the case of a WSS process denote its mean by μ_x and its correlation function E{x(t + τ)x(t)} by R_xx(τ). Correspondingly, for DT we'll refer to the random process x[n] and (in the WSS case) denote its mean by μ_x and its correlation function E{x[n + m]x[n]} by R_xx[m].

9.3.1 Some Properties of WSS Correlation and Covariance Functions

It is easily shown that for real-valued WSS processes x(t) and y(t) the correlation and covariance functions have the following symmetry properties:

$$R_{xx}(\tau) = R_{xx}(-\tau), \qquad C_{xx}(\tau) = C_{xx}(-\tau) \tag{9.13}$$

$$R_{xy}(\tau) = R_{yx}(-\tau), \qquad C_{xy}(\tau) = C_{yx}(-\tau). \tag{9.14}$$

We see from (9.13) that the autocorrelation and autocovariance have even symmetry. Similar properties hold for DT WSS processes.

Another important property of correlation and covariance functions follows from noting that the correlation coefficient of two random variables has magnitude not


exceeding 1. Applying this fact to the samples x(t) and x(t + τ) of the random process x(·) directly leads to the conclusion that

$$-C_{xx}(0) \le C_{xx}(\tau) \le C_{xx}(0). \tag{9.15}$$

In other words, the autocovariance function never exceeds in magnitude its value at the origin. Adding μ_x² to each term above, we find the following inequality holds for correlation functions:

$$-R_{xx}(0) + 2\mu_x^2 \le R_{xx}(\tau) \le R_{xx}(0). \tag{9.16}$$

In Chapter 10 we will demonstrate that correlation and covariance functions are characterized by the property that their Fourier transforms are real and non-negative at all frequencies, because these transforms describe the frequency distribution of the expected power in the random process. The above symmetry constraints and bounds will then follow as natural consequences, but they are worth highlighting here already.

9.4 SUMMARY OF DEFINITIONS AND NOTATION

In this section we summarize some of the definitions and notation we have previously introduced. As in Section 9.3, we shall use lowercase letters to denote random processes, since we will only be dealing with expectations and not densities. Thus, with x(t) and y(t) denoting (real) random processes, we summarize the following definitions:

$$\text{mean:} \quad \mu_x(t) = E\{x(t)\} \tag{9.17}$$

$$\text{autocorrelation:} \quad R_{xx}(t_1, t_2) = E\{x(t_1)\,x(t_2)\} \tag{9.18}$$

$$\text{cross-correlation:} \quad R_{xy}(t_1, t_2) = E\{x(t_1)\,y(t_2)\} \tag{9.19}$$

$$\text{autocovariance:} \quad C_{xx}(t_1, t_2) = E\{[x(t_1) - \mu_x(t_1)][x(t_2) - \mu_x(t_2)]\} = R_{xx}(t_1, t_2) - \mu_x(t_1)\,\mu_x(t_2) \tag{9.20}$$

$$\text{cross-covariance:} \quad C_{xy}(t_1, t_2) = E\{[x(t_1) - \mu_x(t_1)][y(t_2) - \mu_y(t_2)]\} = R_{xy}(t_1, t_2) - \mu_x(t_1)\,\mu_y(t_2) \tag{9.21}$$


strict-sense stationary (SSS): all joint statistics for x(t1), x(t2), …, x(tℓ), for all ℓ > 0 and all choices of sampling instants t1, …, tℓ, depend only on the relative locations of the sampling instants.

wide-sense stationary (WSS): μ_x(t) is constant at some value μ_x, and R_xx(t1, t2) is a function of (t1 − t2) only, denoted in this case simply by R_xx(t1 − t2); hence C_xx(t1, t2) is a function of (t1 − t2) only, and written as C_xx(t1 − t2).

jointly wide-sense stationary: x(t) and y(t) are individually WSS and R_xy(t1, t2) is a function of (t1 − t2) only, denoted simply by R_xy(t1 − t2); hence C_xy(t1, t2) is a function of (t1 − t2) only, and written as C_xy(t1 − t2).

For WSS processes we have, in continuous time and with simpler notation,

$$R_{xx}(\tau) = E\{x(t + \tau)\,x(t)\} = E\{x(t)\,x(t - \tau)\} \tag{9.22}$$

$$R_{xy}(\tau) = E\{x(t + \tau)\,y(t)\} = E\{x(t)\,y(t - \tau)\}, \tag{9.23}$$

and in discrete time,

$$R_{xx}[m] = E\{x[n + m]\,x[n]\} = E\{x[n]\,x[n - m]\} \tag{9.24}$$

$$R_{xy}[m] = E\{x[n + m]\,y[n]\} = E\{x[n]\,y[n - m]\}. \tag{9.25}$$

We use corresponding (centered) definitions and notation for covariances: C_xx(τ), C_xy(τ), C_xx[m], and C_xy[m].

It is worth noting that an alternative convention used elsewhere is to define R_xy(τ) as R_xy(τ) = E{x(t)y(t + τ)}. In our notation, this expectation would be denoted by R_xy(−τ). It is important to be careful to take account of what notational convention is being followed when you read this material elsewhere, and you should also be clear about what notational convention we are using in this text.
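Because this lag convention is easy to get backwards in code, here is a minimal DT estimator sketch that is explicit about the convention used in this text, R_xy[m] = E{x[n+m]y[n]}; the implementation details are ours:

```python
import numpy as np

def Rxy(x, y, m):
    """Sample estimate of R_xy[m] = E{ x[n+m] y[n] } (this text's convention).

    The m < 0 branch uses R_xy[m] = R_yx[-m], Eqn. (9.14). Under the
    alternative convention E{ x[n] y[n+m] }, the same number would be
    reported at lag -m.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if m < 0:
        return Rxy(y, x, -m)
    n = min(x.size - m, y.size)
    return float(np.mean(x[m:m + n] * y[:n]))
```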

9.5 FURTHER EXAMPLES

EXAMPLE 9.6 Bernoulli process

The Bernoulli process, a specific example of which was discussed previously in Example 9.3, is an example of an i.i.d. DT process with

$$P(x[n] = 1) = p \tag{9.26}$$

$$P(x[n] = -1) = (1 - p) \tag{9.27}$$

and with the value at each time instant n independent of the values at all other


time instants. A simple calculation results in

$$E\{x[n]\} = 2p - 1 = \mu_x \tag{9.28}$$

$$E\{x[n + m]\,x[n]\} = \begin{cases} 1, & m = 0 \\ (2p - 1)^2, & m \ne 0 \end{cases} \tag{9.29}$$

$$C_{xx}[m] = E\{(x[n + m] - \mu_x)(x[n] - \mu_x)\} \tag{9.30}$$

$$= \{1 - (2p - 1)^2\}\,\delta[m] = 4p(1 - p)\,\delta[m]. \tag{9.31}$$
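These moments are easy to confirm by simulation; in this sketch the value of p and the lag are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
p, N = 0.3, 1_000_000
x = np.where(rng.random(N) < p, 1.0, -1.0)       # P(x[n]=1)=p, P(x[n]=-1)=1-p

print(x.mean(), 2 * p - 1)                       # (9.28): mu_x = 2p - 1
m = 3                                            # any nonzero lag
print((x[m:] * x[:-m]).mean(), (2 * p - 1)**2)   # (9.29) for m != 0
xc = x - x.mean()
print((xc * xc).mean(), 4 * p * (1 - p))         # (9.31): C_xx[0] = 4p(1-p)
print((xc[m:] * xc[:-m]).mean())                 # ~ 0, since C_xx[m] is a scaled delta[m]
```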

EXAMPLE 9.7 Random telegraph wave

A useful example of a CT random process that we'll make occasional reference to is the random telegraph wave. A representative sample function of a random telegraph wave process is shown in Figure 9.4. The random telegraph wave can be defined through the following two properties:

FIGURE 9.4 One realization of a random telegraph wave: x(t) switches between +1 and −1.

1. X(0) = ±1 with probability 0.5 each.

2. X(t) changes polarity at Poisson times, i.e., the probability of k sign changes in a time interval of length T is

$$P(k \text{ sign changes in an interval of length } T) = \frac{(\lambda T)^k e^{-\lambda T}}{k!}. \tag{9.32}$$

Property 2 implies that the probability of a nonnegative, even number of sign changes in an interval of length T is

$$P(\text{a nonnegative even \# of sign changes}) = \sum_{\substack{k=0 \\ k \text{ even}}}^{\infty} \frac{(\lambda T)^k}{k!}\, e^{-\lambda T} = e^{-\lambda T} \sum_{k=0}^{\infty} \frac{1 + (-1)^k}{2} \frac{(\lambda T)^k}{k!}. \tag{9.33}$$

Using the identity

$$e^{\lambda T} = \sum_{k=0}^{\infty} \frac{(\lambda T)^k}{k!},$$


equation (9.33) becomes

$$P(\text{a nonnegative even \# of sign changes}) = \frac{e^{-\lambda T}}{2}\left(e^{\lambda T} + e^{-\lambda T}\right) = \frac{1}{2}\left(1 + e^{-2\lambda T}\right). \tag{9.34}$$

Similarly, the probability of an odd number of sign changes in an interval of length T is $\tfrac{1}{2}(1 - e^{-2\lambda T})$. It follows that

$$\begin{aligned}
P(X(t) = 1) &= P(X(t) = 1 \mid X(0) = 1)\,P(X(0) = 1) + P(X(t) = 1 \mid X(0) = -1)\,P(X(0) = -1) \\
&= \tfrac{1}{2}\,P(\text{even \# of sign changes in } [0, t]) + \tfrac{1}{2}\,P(\text{odd \# of sign changes in } [0, t]) \\
&= \tfrac{1}{2} \cdot \tfrac{1}{2}\left(1 + e^{-2\lambda t}\right) + \tfrac{1}{2} \cdot \tfrac{1}{2}\left(1 - e^{-2\lambda t}\right) = \tfrac{1}{2}. \tag{9.35}
\end{aligned}$$

Note that because of Property 1, the expression in the last line of Eqn. (9.35) is not needed, since the line before that already allows us to conclude that the answer is ½: since the number of sign changes in any interval must be either even or odd, their probabilities add up to 1, so P(X(t) = 1) = ½. However, if Property 1 is relaxed to allow P(X(0) = 1) = p0 ≠ ½, then the above computation must be carried through to the last line, and yields the result

$$P(X(t) = 1) = p_0 \cdot \tfrac{1}{2}\left(1 + e^{-2\lambda t}\right) + (1 - p_0) \cdot \tfrac{1}{2}\left(1 - e^{-2\lambda t}\right) = \tfrac{1}{2}\left(1 + (2p_0 - 1)\,e^{-2\lambda t}\right). \tag{9.36}$$

Returning to the case where Property 1 holds, so P(X(t) = 1) = ½, we get

$$\mu_X(t) = 0, \quad\text{and} \tag{9.37}$$

$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = 1 \cdot P(X(t_1) = X(t_2)) + (-1) \cdot P(X(t_1) \ne X(t_2)) = e^{-2\lambda|t_2 - t_1|}. \tag{9.38}$$

In other words, the process is exponentially correlated and WSS.
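The two defining properties translate directly into a simulation. The sketch below samples the wave at two instants by counting Poisson sign changes in [0, t1] and (t1, t2], and checks (9.37) and (9.38); λ and the sampling instants are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, M = 1.5, 200_000                  # switching rate, ensemble size
t1, t2 = 0.4, 1.1                      # two sampling instants, t1 < t2

x0 = rng.choice([-1.0, 1.0], M)        # Property 1: X(0) = +/-1, equally likely
k1 = rng.poisson(lam * t1, M)          # Property 2: sign changes in [0, t1]
k2 = rng.poisson(lam * (t2 - t1), M)   # sign changes in (t1, t2], independent
X1 = x0 * (-1.0) ** k1
X2 = X1 * (-1.0) ** k2

print(X1.mean())                       # ~ 0, Eqn. (9.37)
print((X1 * X2).mean())                # sample R_XX(t1, t2)
print(np.exp(-2 * lam * (t2 - t1)))    # the (9.38) prediction e^{-2 lam |t2 - t1|}
```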

9.6 ERGODICITY

The concept of ergodicity is sophisticated and subtle, but the essential idea is described here. We typically observe the outcome of a random process (e.g., we record a noise waveform) and want to characterize the statistics of the random process by measurements on one ensemble member. For instance, we could consider the time average of the waveform to represent the mean value of the process (assuming this


mean is constant for all time). We could also construct histograms that represent the fraction of time (rather than the probability-weighted fraction of the ensemble) that the waveform lies in different amplitude bins, and this could be taken to reflect the probability density across the ensemble of the value obtained at a particular sampling time. If the random process is such that the behavior of almost every particular realization over time is representative of the behavior down the ensemble, then the process is called ergodic.

A simple example of a process that is not ergodic is Example 9.2, an ensemble of batteries. Clearly, for this example, the behavior of any realization is not representative of the behavior down the ensemble.

Narrower notions of ergodicity may be defined. For example, if the time average

$$\langle x \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, dt \tag{9.39}$$

almost always (i.e., for almost every realization or outcome) equals the ensemble average μ_X, then the process is termed ergodic in the mean. It can be shown, for instance, that a WSS process with finite variance at each instant and with a covariance function that approaches 0 for large lags is ergodic in the mean. Note that a (nonstationary) process with time-varying mean cannot be ergodic in the mean.

In our discussion of random processes, we will primarily be concerned with first- and second-order moments of random processes. While it is extremely difficult to determine in general whether a random process is ergodic, there are criteria (specified in terms of the moments of the process) that will establish ergodicity in the mean and in the autocorrelation. Frequently, however, such ergodicity is simply assumed for convenience, in the absence of evidence that the assumption is not reasonable. Under this assumption, the mean and autocorrelation can be obtained from time-averaging on a single ensemble member, through the following equalities:

$$E\{x(t)\} = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, dt \tag{9.40}$$

and

$$E\{x(t)\,x(t + \tau)\} = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\,x(t + \tau)\, dt. \tag{9.41}$$

A random process for which (9.40) and (9.41) are true is referred to as second-order ergodic.
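In DT form, the working assumption behind (9.40)–(9.41) is that a long time average along one realization matches the ensemble average. The sketch below contrasts an ergodic i.i.d. example with the non-ergodic battery ensemble of Example 9.2; all the distributions here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ergodic case: an i.i.d. process; the time average along one long
# realization approaches the ensemble mean.
x = rng.normal(loc=1.0, scale=2.0, size=1_000_000)
print(x.mean())                              # ~ 1.0 = E{x[n]}

# Non-ergodic case (Example 9.2): once a battery is picked, the
# realization is constant, so its time average is just that voltage.
voltages = np.array([1.5, 9.0])              # hypothetical voltage values
v = rng.choice(voltages)                     # one realization: x(t) = v for all t
print(v, voltages.mean())                    # time average vs. ensemble average: they differ
```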

9.7 LINEAR ESTIMATION OF RANDOM PROCESSES

A common class of problems in a variety of aspects of communication, control and signal processing involves the estimation of one random process from observations


of another, or estimating (predicting) future values from the observation of past values. For example, it is common in communication systems that the signal at the receiver is a corrupted (e.g., noisy) version of the transmitted signal, and we would like to estimate the transmitted signal from the received signal. Other examples lie in predicting weather and financial data from past observations. We will be treating this general topic in much more detail in later chapters, but a first look at it here can be beneficial in understanding random processes.

We shall first consider a simple example of linear prediction of a random process, then a more elaborate example of linear FIR filtering of a noise-corrupted process to estimate the underlying random signal. We conclude the section with some further discussion of the basic problem of linear estimation of one random variable from measurements of another.

9.7.1 Linear Prediction

As a simple illustration of linear prediction, consider a discrete-time process x[n]. Knowing the value at time n0, we may wish to predict what the value will be m samples into the future, i.e., at time n0 + m. We limit the prediction strategy to a linear one, i.e., with x̂[n0 + m] denoting the predicted value, we restrict x̂[n0 + m] to be of the form

$$\hat{x}[n_0 + m] = a\,x[n_0] + b \tag{9.42}$$

and choose the prediction parameters a and b to minimize the expected value of the square of the error, i.e., choose a and b to minimize

$$\epsilon = E\{(x[n_0 + m] - \hat{x}[n_0 + m])^2\} \tag{9.43}$$

    or

$$\epsilon = E\{(x[n_0 + m] - a\,x[n_0] - b)^2\}. \tag{9.44}$$

To minimize ε, we set to zero its partial derivative with respect to each of the two parameters and solve for the parameter values. The resulting equations are

$$E\{(x[n_0 + m] - a\,x[n_0] - b)\,x[n_0]\} = E\{(x[n_0 + m] - \hat{x}[n_0 + m])\,x[n_0]\} = 0 \tag{9.45a}$$

$$E\{x[n_0 + m] - a\,x[n_0] - b\} = E\{x[n_0 + m] - \hat{x}[n_0 + m]\} = 0. \tag{9.45b}$$

Equation (9.45a) states that the error x̂[n0 + m] − x[n0 + m] associated with the optimal estimate is orthogonal to the available data x[n0]. Equation (9.45b) states that the estimate is unbiased.

Carrying out the multiplications and expectations in the preceding equations results in the following equations, which can be solved for the desired constants:

$$R_{xx}[n_0 + m, n_0] - a\,R_{xx}[n_0, n_0] - b\,\mu_x[n_0] = 0 \tag{9.46a}$$

$$\mu_x[n_0 + m] - a\,\mu_x[n_0] - b = 0. \tag{9.46b}$$


If we assume that the process is WSS, so that R_xx[n0 + m, n0] = R_xx[m] and R_xx[n0, n0] = R_xx[0], and also assume that it is zero mean (μ_x = 0), then equations (9.46) reduce to

$$a = R_{xx}[m]/R_{xx}[0] \tag{9.47}$$

$$b = 0 \tag{9.48}$$

so that

$$\hat{x}[n_0 + m] = \frac{R_{xx}[m]}{R_{xx}[0]}\, x[n_0]. \tag{9.49}$$

If the process is not zero mean, then it is easy to see that

$$\hat{x}[n_0 + m] = \mu_x + \frac{C_{xx}[m]}{C_{xx}[0]}\,(x[n_0] - \mu_x). \tag{9.50}$$
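A sketch of the predictor (9.50) in action, using a first-order autoregressive signal as an illustrative WSS process and sample averages in place of the true covariances (the model and its parameters are our own choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative signal: AR(1), x[n] = a x[n-1] + w[n], with C_xx[m]/C_xx[0] = a^{|m|}.
a, N = 0.8, 100_000
w = rng.normal(size=N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

m = 3
mu = x.mean()
xc = x - mu
C0 = np.mean(xc * xc)                        # estimate of C_xx[0]
Cm = np.mean(xc[m:] * xc[:-m])               # estimate of C_xx[m]
print(Cm / C0, a**m)                         # the gain in (9.50) vs. its AR(1) value

xhat = mu + (Cm / C0) * (x[:-m] - mu)        # predictions of x[n0 + m] from x[n0]
err = x[m:] - xhat
print(np.mean(err * x[:-m]))                 # ~ 0: orthogonality, Eqn. (9.45a)
print(err.mean())                            # ~ 0: unbiasedness, Eqn. (9.45b)
```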

An extension of this problem would consider how to do prediction when measurements of several past values are available. Rather than pursue this case, we illustrate next what to do with several measurements in a slightly different setting.

9.7.2 Linear FIR Filtering

As another example, which we will treat in more generality in Chapter 11 on Wiener filtering, consider a discrete-time signal s[n] that has been corrupted by additive noise d[n]. For example, s[n] might be a signal transmitted over a channel and d[n] the noise introduced by the channel. The received signal r[n] is then

$$r[n] = s[n] + d[n]. \tag{9.51}$$

Assume that both s[n] and d[n] are zero-mean random processes and are uncorrelated. At the receiver we would like to process r[n] with a causal FIR (finite impulse response) filter to estimate the transmitted signal s[n].

FIGURE 9.5 Estimating the noise-corrupted signal: r[n] = s[n] + d[n] is passed through the filter h[n] to produce ŝ[n].

If h[n] is a causal FIR filter of length L, then

$$\hat{s}[n] = \sum_{k=0}^{L-1} h[k]\, r[n - k]. \tag{9.52}$$


We would like to determine the filter coefficients h[k] that minimize the mean square error between s[n] and ŝ[n], i.e., minimize ε given by

$$\epsilon = E\left(s[n] - \hat{s}[n]\right)^2 = E\left(s[n] - \sum_{k=0}^{L-1} h[k]\, r[n - k]\right)^2. \tag{9.53}$$

To determine h, we set ∂ε/∂h[m] = 0 for each of the L values of m. Taking this derivative, we get

$$\frac{\partial \epsilon}{\partial h[m]} = -E\Big\{2\Big(s[n] - \sum_k h[k]\, r[n - k]\Big)\, r[n - m]\Big\} = -E\{2\,(s[n] - \hat{s}[n])\, r[n - m]\} = 0, \quad m = 0, 1, \ldots, L - 1, \tag{9.54}$$

which is the orthogonality condition we should be expecting: the error (s[n] − ŝ[n]) associated with the optimal estimate is orthogonal to the available data, r[n − m].

Carrying out the multiplications in the above equations and taking expectations results in

$$\sum_{k=0}^{L-1} h[k]\, R_{rr}[m - k] = R_{sr}[m], \quad m = 0, 1, \ldots, L - 1. \tag{9.55}$$

Eqns. (9.55) constitute L equations that can be solved for the L parameters h[k]. With r[n] = s[n] + d[n], it is straightforward to show that R_sr[m] = R_ss[m] + R_sd[m], and since we assumed that s[n] and d[n] are uncorrelated, R_sd[m] = 0. Similarly, R_rr[m] = R_ss[m] + R_dd[m].

These results are also easily modified for the case where the processes no longer have zero mean.
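To make (9.55) concrete, the sketch below builds and solves the L normal equations for an assumed correlation model: R_ss[m] = a^{|m|} for the signal and white noise d[n], so that R_rr[m] = R_ss[m] + σ_d² δ[m] and R_sr[m] = R_ss[m]. The model and its parameters are illustrative, not from the text:

```python
import numpy as np

# Assumed correlation model (illustrative): R_ss[m] = a^{|m|}, white noise d.
a, sigma_d2, L = 0.8, 0.5, 8
m = np.arange(L)
Rss = a ** m                              # R_ss[0..L-1]
Rrr = Rss + sigma_d2 * (m == 0)           # R_rr[m] = R_ss[m] + R_dd[m]
Rsr = Rss                                 # R_sr[m] = R_ss[m] since R_sd = 0

# Eqns. (9.55): sum_k h[k] R_rr[m-k] = R_sr[m]; R_rr[m-k] = Rrr[|m-k|] by symmetry.
A = np.array([[Rrr[abs(i - k)] for k in range(L)] for i in range(L)])
h = np.linalg.solve(A, Rsr)
print(np.round(h, 4))                     # coefficients of the causal FIR estimator
```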

9.8 THE EFFECT OF LTI SYSTEMS ON WSS PROCESSES

Your prior background in signals and systems, and in the earlier chapters of these notes, has characterized how LTI systems act on deterministic input signals. We will see in later chapters how the correlation properties of a random process, and the effects of LTI systems on these properties, play an important role in understanding and designing systems for such tasks as filtering, signal detection, signal estimation and system identification. We focus in this section on understanding in the time domain how LTI systems shape the correlation properties of a random process.

In Chapter 10 we develop a parallel picture in the frequency domain, after establishing that the frequency distribution of the expected power in a random signal is described by the Fourier transform of the autocorrelation function.

Consider an LTI system whose input is a sample function of a WSS random process x(t), i.e., a signal chosen by a probabilistic experiment from the ensemble that constitutes the random process x(t); more simply, we say that the input is the random


process x(t). The WSS input is characterized by its mean and its autocovariance or (equivalently) autocorrelation function.

Among other considerations, we are interested in knowing when the output process y(t), i.e., the ensemble of signals obtained as responses to the signals in the input ensemble, will itself be WSS, and want to determine its mean and autocovariance or autocorrelation functions, as well as its cross-correlation with the input process. For an LTI system whose impulse response is h(t), the output y(t) is given by the convolution

$$y(t) = \int_{-\infty}^{+\infty} h(v)\, x(t - v)\, dv = \int_{-\infty}^{+\infty} x(v)\, h(t - v)\, dv \tag{9.56}$$

for any specific input x(t) for which the convolution is well defined. The convolution is well defined if, for instance, the input x(t) is bounded and the system is bounded-input bounded-output (BIBO) stable, i.e., h(t) is absolutely integrable. Figure 9.6 indicates what the two components of the integrand in the convolution integral may look like.

FIGURE 9.6 Illustration of the two terms, x(v) and h(t − v), in the integrand of Eqn. (9.56).

Rather than requiring that every sample function of our input process be bounded, it will suffice for our convolution computations below to assume that E[x²(t)] = R_xx(0) is finite. With this assumption, and also assuming that the system is BIBO stable, we ensure that y(t) is a well-defined random process, and that the formal manipulations we carry out below (for instance, interchanging expectation and convolution) can all be justified more rigorously by methods that are beyond our scope here. In fact, the results we obtain can also be applied, when properly interpreted, to cases where the input process does not have a bounded second moment, e.g., when x(t) is so-called CT white noise, for which R_xx(τ) = δ(τ). The results can also be applied to a system that is not BIBO stable, as long as it has a well-defined frequency response H(jω), as in the case of an ideal lowpass filter, for example.

We can use the convolution relationship (9.56) to deduce the first- and second-order properties of y(t). What we shall establish is that y(t) is itself WSS, and that

    Alan

    V.

    Oppenheim

    and

    George

    C.

    Verghese,

    2010

    c

  • 8/9/2019 Ramdom Signals

    18/23

    178

    Chapter

    9

    Random

    Processes

x(t) and y(t) are in fact jointly WSS. We will also develop relationships for the autocorrelation of the output and the cross-correlation between input and output.

First, consider the mean value of the output. Taking the expected value of both sides of (9.56), we find

$$E[y(t)] = E\left[\int_{-\infty}^{+\infty} h(v)\, x(t - v)\, dv\right] = \int_{-\infty}^{+\infty} h(v)\, E[x(t - v)]\, dv = \int_{-\infty}^{+\infty} h(v)\, \mu_x\, dv = \mu_x \int_{-\infty}^{+\infty} h(v)\, dv = H(j0)\, \mu_x = \mu_y. \tag{9.57}$$

In other words, the mean of the output process is constant, and equals the mean of the input scaled by the DC gain of the system. This is also what the response of the system would be if its input were held constant at the value μ_x.

The preceding result and the linearity of the system also allow us to conclude that applying the zero-mean WSS process x(t) − μ_x to the input of the stable LTI system would result in the zero-mean process y(t) − μ_y at the output. This fact will be useful below in converting results that are derived for correlation functions into results that hold for covariance functions.

Next consider the cross-correlation between output and input:

$$E\{y(t + \tau)\, x(t)\} = E\left\{\int_{-\infty}^{+\infty} h(v)\, x(t + \tau - v)\, dv \cdot x(t)\right\} = \int_{-\infty}^{+\infty} h(v)\, E\{x(t + \tau - v)\, x(t)\}\, dv. \tag{9.58}$$

Since x(t) is WSS, E{x(t + τ − v)x(t)} = R_xx(τ − v), so

$$E\{y(t + \tau)\, x(t)\} = \int_{-\infty}^{+\infty} h(v)\, R_{xx}(\tau - v)\, dv = h(\tau) * R_{xx}(\tau) = R_{yx}(\tau). \tag{9.59}$$

Note that the cross-correlation depends only on the lag τ between the sampling instants of the output and input processes, not on both τ and the absolute time location t. Also, this cross-correlation between the output and input is deterministically related to the autocorrelation of the input, and can be viewed as the signal that would result if the system input were the autocorrelation function, as indicated in Figure 9.7.


FIGURE 9.7 Representation of Eqn. (9.59): R_xx(τ) applied to a system with impulse response h(τ) yields R_yx(τ).

We can also conclude that

$$R_{xy}(\tau) = R_{yx}(-\tau) = R_{xx}(-\tau) * h(-\tau) = R_{xx}(\tau) * h(-\tau), \tag{9.60}$$

where the second equality follows from Eqn. (9.59) and the fact that time-reversing the two functions in a convolution results in time reversal of the result, while the last equality follows from the symmetry Eqn. (9.13) of the autocorrelation function.

The above relations can also be expressed in terms of covariance functions rather than in terms of correlation functions. For this, simply consider the case where the input to the system is the zero-mean WSS process x(t) − μ_x, with corresponding zero-mean output y(t) − μ_y. Since the correlation function for x(t) − μ_x is the same as the covariance function for x(t), i.e., since

$$R_{x - \mu_x,\, x - \mu_x}(\tau) = C_{xx}(\tau), \tag{9.61}$$

the results above hold unchanged when every correlation function is replaced by the corresponding covariance function. We therefore have, for instance, that

$$C_{yx}(\tau) = h(\tau) * C_{xx}(\tau). \tag{9.62}$$

Next we consider the autocorrelation of the output y(t):

$$E\{y(t + \tau)\, y(t)\} = E\left\{\int_{-\infty}^{+\infty} h(v)\, x(t + \tau - v)\, dv \cdot y(t)\right\} = \int_{-\infty}^{+\infty} h(v)\, \underbrace{E\{x(t + \tau - v)\, y(t)\}}_{R_{xy}(\tau - v)}\, dv = h(\tau) * R_{xy}(\tau) = R_{yy}(\tau). \tag{9.63}$$

Note that the autocorrelation of the output depends only on τ, and not on both τ and t. Putting this together with the earlier results, we conclude that x(t) and y(t) are jointly WSS, as claimed.


The corresponding result for covariances is

$$C_{yy}(\tau) = h(\tau) * C_{xy}(\tau). \tag{9.64}$$

Combining (9.63) with (9.60), we find that

$$R_{yy}(\tau) = R_{xx}(\tau) * \underbrace{h(\tau) * h(-\tau)}_{\bar{R}_{hh}(\tau)} = R_{xx}(\tau) * \bar{R}_{hh}(\tau). \tag{9.65}$$

The function R̄_hh(τ) is typically referred to as the deterministic autocorrelation function of h(t), and is given by

$$\bar{R}_{hh}(\tau) = h(\tau) * h(-\tau) = \int_{-\infty}^{+\infty} h(t + \tau)\, h(t)\, dt. \tag{9.66}$$

For the covariance function version of (9.65), we have

$$C_{yy}(\tau) = C_{xx}(\tau) * \underbrace{h(\tau) * h(-\tau)}_{\bar{R}_{hh}(\tau)} = C_{xx}(\tau) * \bar{R}_{hh}(\tau). \tag{9.67}$$

Note that the deterministic correlation function of h(t) is still what we use, even when relating the covariances of the input and output. Only the means of the input and output processes get adjusted in arriving at the present result; the impulse response is untouched.

The correlation relations in Eqns. (9.59), (9.60), (9.63) and (9.65), as well as their covariance counterparts, are very powerful, and we will make considerable use of them. Of equal importance are their statements in the Fourier and Laplace transform domains. Denoting the Fourier and Laplace transforms of the correlation function R_xx(τ) by S_xx(jω) and S_xx(s) respectively, and similarly for the other correlation functions of interest, we have:

$$S_{yx}(j\omega) = S_{xx}(j\omega)\, H(j\omega), \qquad S_{yy}(j\omega) = S_{xx}(j\omega)\, |H(j\omega)|^2,$$

$$S_{yx}(s) = S_{xx}(s)\, H(s), \qquad S_{yy}(s) = S_{xx}(s)\, H(s)\, H(-s). \tag{9.68}$$

We can denote the Fourier and Laplace transforms of the covariance function C_xx(τ) by D_xx(jω) and D_xx(s) respectively, and similarly for the other covariance functions of interest, and then write the same sorts of relationships as above.

Exactly parallel results hold in the DT case. Consider a stable discrete-time LTI system whose impulse response is h[n] and whose input is the WSS random process x[n]. Then, as in the continuous-time case, we can conclude that the output process y[n] is jointly WSS with the input process x[n], and

$$\mu_y = \mu_x \sum_{n=-\infty}^{+\infty} h[n] \tag{9.69}$$

$$R_{yx}[m] = h[m] * R_{xx}[m] \tag{9.70}$$

$$R_{yy}[m] = R_{xx}[m] * \bar{R}_{hh}[m], \tag{9.71}$$


where R̄_hh[m] is the deterministic autocorrelation function of h[m], defined as

$$\bar{R}_{hh}[m] = \sum_{n=-\infty}^{+\infty} h[n + m]\, h[n]. \tag{9.72}$$

The corresponding Fourier and Z-transform statements of these relationships are:

$$\mu_y = H(e^{j0})\, \mu_x, \qquad S_{yx}(e^{j\Omega}) = S_{xx}(e^{j\Omega})\, H(e^{j\Omega}), \qquad S_{yy}(e^{j\Omega}) = S_{xx}(e^{j\Omega})\, |H(e^{j\Omega})|^2,$$

$$\mu_y = H(1)\, \mu_x, \qquad S_{yx}(z) = S_{xx}(z)\, H(z), \qquad S_{yy}(z) = S_{xx}(z)\, H(z)\, H(1/z). \tag{9.73}$$

All of these expressions can also be rewritten for covariances and their transforms.
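A numerical sanity check of (9.70) in DT is short: color a white ±1 sequence with one filter to make a WSS input x[n], pass it through a second filter h, and compare the measured R_yx[m] against h[m] ∗ R_xx[m]. All the filter taps below are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 400_000
w = rng.choice([-1.0, 1.0], N)            # white driver, R_ww[m] = delta[m]
g = np.array([1.0, 0.5])                  # coloring filter (illustrative)
x = np.convolve(w, g)[:N]                 # WSS input with R_xx[m] = conv(g, g[::-1])
h = np.array([1.0, -0.6, 0.3])            # system under study (illustrative)
y = np.convolve(x, h)[:N]

def R(u, v, m):                           # sample R_uv[m] = E{u[n+m] v[n]}, m >= 0
    return float(np.mean(u[m:] * v[:N - m]))

print([round(R(y, x, m), 3) for m in range(3)])   # measured R_yx[0..2]
Rxx = np.convolve(g, g[::-1])                     # R_xx at lags -1, 0, +1
pred = np.convolve(h, Rxx)                        # h * R_xx at lags -1..3
print([round(v, 3) for v in pred[1:4]])           # predicted R_yx[0..2], Eqn. (9.70)
```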

The basic relationships that we have developed so far in this chapter are extremely powerful. In Chapter 10 we will use these relationships to show that the Fourier transform of the autocorrelation function describes how the expected power of a WSS process is distributed in frequency. For this reason, the Fourier transform of the autocorrelation function is termed the power spectral density (PSD) of the process.

The relationships developed in this chapter are also very important in using random processes to measure or identify the impulse response of an LTI system. For example, from (9.70), if the input x[n] to a DT LTI system is a WSS random process with autocorrelation function R_xx[m] = δ[m], then by measuring the cross-correlation between the input and output we obtain a measurement of the system impulse response. It is easy to construct an input process with autocorrelation function δ[m], for example an i.i.d. process that is equally likely to take the values +1 and −1 at each time instant.
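A sketch of this identification idea: drive a system (treated as a black box) with an i.i.d. ±1 input and read off h[m] from the measured cross-correlation, per (9.70). The "unknown" impulse response inside the black box is of course made up for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500_000
x = rng.choice([-1.0, 1.0], N)            # i.i.d. +/-1 input: R_xx[m] = delta[m]

def black_box(u):
    """Stand-in for the unknown LTI system being identified."""
    h_true = np.array([0.9, 0.4, -0.2, 0.1])   # hidden from the 'experimenter'
    return np.convolve(u, h_true)[:u.size]

y = black_box(x)
h_est = [float(np.mean(y[m:] * x[:N - m])) for m in range(6)]
print(np.round(h_est, 3))                 # ~ [0.9, 0.4, -0.2, 0.1, 0, 0] by (9.70)
```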

As another example, suppose the input x(t) to a CT LTI system is a random telegraph wave, with changes in sign at times that correspond to the arrivals in a Poisson process with rate λ, i.e.,

$$P(k \text{ switches in an interval of length } T) = \frac{(\lambda T)^k e^{-\lambda T}}{k!}. \tag{9.74}$$

Then, assuming x(0) takes the values ±1 with equal probabilities, we can determine that the process x(t) has zero mean and correlation function R_xx(τ) = e^{−2λ|τ|}, so it is WSS (for t ≥ 0). If we determine the cross-correlation R_yx(τ) with the output y(t) and then use the relation

$$R_{yx}(\tau) = R_{xx}(\tau) * h(\tau), \tag{9.75}$$

we can obtain the system impulse response h(τ). For example, if S_yx(s), S_xx(s) and H(s) denote the associated Laplace transforms, then

$$H(s) = \frac{S_{yx}(s)}{S_{xx}(s)}. \tag{9.76}$$

Note that S_xx(s) is a rather well-behaved function of the complex variable s in this case, whereas any particular sample function of the process x(t) would not have such a well-behaved transform. The same comment applies to S_yx(s).


As a third example, suppose that we know the autocorrelation function R_xx[m] of the input x[n] to a DT LTI system, but do not have access to x[n] and therefore cannot determine the cross-correlation R_yx[m] with the output y[n], but can determine the output autocorrelation R_yy[m]. For example, if

$$R_{xx}[m] = \delta[m] \tag{9.77}$$

and we determine R_yy[m] to be R_yy[m] = (1/2)^{|m|}, then

$$R_{yy}[m] = \left(\tfrac{1}{2}\right)^{|m|} = \bar{R}_{hh}[m] = h[m] * h[-m]. \tag{9.78}$$

Equivalently, H(z)H(z⁻¹) can be obtained from the Z-transform S_yy(z) of R_yy[m]. Additional assumptions or constraints, for instance on the stability and causality of the system and its inverse, may allow one to recover H(z) from knowledge of H(z)H(z⁻¹).
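For an FIR case, the recovery step can be sketched numerically: the zeros of the (finite) Z-transform of R̄_hh come in pairs (r, 1/r), and keeping the zeros inside the unit circle yields the minimum-phase factor. This sketch assumes the underlying h[n] is itself minimum phase, and the taps are illustrative values of ours:

```python
import numpy as np

h_true = np.array([1.0, -0.6, 0.3])           # assumed minimum phase (illustrative)
Rhh = np.convolve(h_true, h_true[::-1])       # Rbar_hh[m]: what R_yy gives when R_xx = delta

roots = np.roots(Rhh)                          # zeros of z^{L-1} H(z) H(1/z), paired r, 1/r
inside = roots[np.abs(roots) < 1.0]            # keep the minimum-phase factor's zeros
h = np.real(np.poly(inside))                   # monic candidate for h[n]
# Rescale so the candidate's lag-0 deterministic autocorrelation matches Rbar_hh[0]:
h *= np.sqrt(Rhh[h_true.size - 1] / np.convolve(h, h[::-1])[h.size - 1])
print(np.round(h, 3))                          # ~ h_true (up to an overall sign)
```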


MIT OpenCourseWare, http://ocw.mit.edu

    6.011 Introduction to Communication, Control, and Signal Processing

    Spring 2010

    For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
