D Com Lecture 5

Upload: nidulsinha

Post on 04-Jun-2018


TRANSCRIPT

  • 8/13/2019 D Com Lecture5

    Slide 1/17

    Communications Engineering (EE321B)

    Lecture 5: Entropy, Energy and Systems

    March 28, 2006

  • Slide 2/17: Overview

    Entropy, Thermodynamics and Information (Lecture 5)

    (Haykin Chapter 9)

    Thermodynamics

    Information

    Entropy

    Relevance to communications

  • Slide 3/17: 2nd Law of Thermodynamics

    The thermodynamic entropy of an isolated system is non-decreasing over time.

    Entropy ~ log of # of states

    Example microstates (each particle on the Left or Right):
    LLLLLLLLLLLLLLLLLLLL vs. LRLLLRRLLRRRLRLRLRL
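The "entropy ~ log of # of states" idea can be made concrete with a quick count (a sketch added for illustration, not from the lecture): for n two-state particles, the all-L configuration is a single microstate, while a half-L, half-R mix is one of C(n, n/2) microstates.

```python
import math

def log2_microstates(n, k):
    """log2 of the number of L/R configurations with exactly k R's."""
    return math.log2(math.comb(n, k))

n = 20
# All-L (like LLLLLLLLLLLLLLLLLLLL): a single microstate, zero entropy.
print(log2_microstates(n, 0))    # 0.0
# Half L, half R: many microstates, hence high entropy.
print(log2_microstates(n, 10))   # ≈ 17.5
```

The mixed configuration is typical precisely because there are exponentially more ways to realize it; that is the statistical content of the 2nd law.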

  • Slide 4/17: Entropy

    Binary entropy function: H(p) = -p log2(p) - (1 - p) log2(1 - p)

    General case: H(X) = -Σ p(x) log2 p(x)

    Stirling's formula: ln(n!) ≈ n ln(n) - n

    [Figure: plot of H(p) against p for 0 ≤ p ≤ 1; H(0) = H(1) = 0, maximum H(0.5) = 1]
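The binary entropy function plotted on this slide is straightforward to evaluate; a minimal sketch in Python (the function name is my own):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0  # by the convention 0*log(0) = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0: a fair coin is maximally uncertain
print(binary_entropy(0.1))   # ≈ 0.469: a heavily bent coin carries less information
```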

  • Slide 5/17: Poincaré's Infinite Recurrence

  • Slide 6/17: Arrow of Time

    (Almost) all physical theories have time symmetry

    Newtonian mechanics

    General relativity

    Quantum mechanics

    Then why do we perceive an arrow of time?

    Why does glass break, but never spontaneously reassemble itself?

    Why do we feel that time flows from the past to the future?

    Why do we remember the past but not the future?

    Because we are living in a low-entropy condition

    Otherwise life could not exist, and we would not exist to ask the question

    We simply call the lower-entropy state the "past" and the higher-entropy state the "future"

  • Slide 7/17: Arrow of Time

    [Figure: entropy plotted against time, with the lower-entropy side labeled Past and the higher-entropy side labeled Future]

  • Slide 8/17: Information

    Why care about entropy?

    More randomness

    More possible states

    More unpredictability

    More information

    Higher entropy

    More increase in your knowledge when you receive the information

  • Slide 9/17: Randomness, Entropy and Information

    How random are the following sequences?

    000000000000000000000000000000

    010101010101010101010101010101

    000100000100010000000010000010

    011001010111010100001011011101

    Which ones are easier to describe?

    Repeat 0 30 times

    Repeat 01 15 times

    Print 011001010111010100001011011101

    Shorter description

    Less information

    Less randomness

    Lower entropy
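The "shorter description ⇔ lower entropy" correspondence can be illustrated with an off-the-shelf compressor (a rough stand-in for minimum description length, which is not computable in general):

```python
import zlib

# The four 30-bit sequences from the slide, as byte strings.
sequences = [
    b"000000000000000000000000000000",
    b"010101010101010101010101010101",
    b"000100000100010000000010000010",
    b"011001010111010100001011011101",
]
for s in sequences:
    # Compressed size is a crude proxy for description length.
    print(len(zlib.compress(s, 9)), s.decode())
```

The regular sequences should compress to fewer bytes than the irregular one; with only 30 symbols the zlib header dominates, so the gap grows clearer for longer sequences.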

  • Slide 10/17: Entropy

    Entropy: Measure of uncertainty, randomness

    Tossing of a fair coin: 1 bit of information

    Tossing of two fair coins: 2 bits of information

    Tossing of a fair die: log2(6) ≈ 2.585 bits of information

    1, 2, 3, 4, 5, 6: 6 outcomes, ⌈log2 6⌉ = 3 bits

    11, 12, 13, …, 16, 21, 22, …, 66: 36 outcomes, ⌈log2 36⌉ = 6 bits

    111, 112, …, 666: 216 outcomes, ⌈log2 216⌉ = 8 bits

    Asymptotically: log2(6) bits per roll

    Bent (biased) coin?
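A quick check of the dice numbers (an illustration added here, assuming a fair die):

```python
import math

# Entropy of one fair die roll.
print(math.log2(6))  # ≈ 2.585 bits

# Describing n rolls exactly takes ceil(log2(6**n)) = ceil(n*log2(6)) bits;
# the per-roll cost approaches log2(6) as n grows.
for n in (1, 2, 3, 10):
    bits = math.ceil(n * math.log2(6))
    print(f"{n} rolls: {bits} bits total, {bits / n:.3f} bits/roll")
```

One roll alone costs 3 bits, but blocks of rolls amortize toward 2.585 bits per roll, which is the asymptotic point on the slide.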

  • Slide 11/17: Binary Source Example

    How do you compress an i.i.d. Bernoulli(p) sequence?

    000100000100010000000010000010 (n bits long)

    Specify k = # of ones, then specify the sequence's index among the C(n, k) sequences with exactly k ones

    Need about log2(n + 1) + log2 C(n, k) bits

    Stirling's formula gives log2 C(n, k) ≈ n H(k/n), i.e., about n H(p) bits per sequence
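The counting argument can be checked numerically (a sketch reusing the slide's n = 30 example, whose sequence has k = 5 ones; the exact split of bits is my own bookkeeping):

```python
import math

def binary_entropy(p):
    """H(p) in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# The slide's 30-bit sequence 000100000100010000000010000010 has 5 ones.
n, k = 30, 5
exact = math.log2(n + 1) + math.log2(math.comb(n, k))  # send k, then the index
approx = n * binary_entropy(k / n)                     # Stirling: log2 C(n,k) ≈ n*H(k/n)
print(exact, approx)
```

At n = 30 the Stirling estimate is still loose; the two figures converge (per symbol) as n grows, which is why i.i.d. Bernoulli(p) sources compress to about H(p) bits per symbol.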

  • Slide 12/17: Huffman Codes

    Example

    X    p(X)    C(X)
    1    0.30    00
    2    0.25    10
    3    0.20    11
    4    0.15    010
    5    0.10    011

    Merge steps (combine the two smallest probabilities; label the two branches 0 and 1):
    0.15 + 0.10 = 0.25, leaving {0.30, 0.25, 0.25, 0.20}
    0.25 + 0.20 = 0.45, leaving {0.45, 0.30, 0.25}
    0.30 + 0.25 = 0.55, leaving {0.55, 0.45}
    0.55 + 0.45 = 1

    This algorithm is in fact optimal
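The merge procedure can be sketched in Python (my own implementation of the standard greedy merge, not code from the lecture; exact codewords depend on tie-breaking, but the optimal average length does not):

```python
import heapq
from itertools import count

def huffman_code_lengths(probs):
    """Build a Huffman tree and return codeword length per symbol.

    Repeatedly merges the two least-probable nodes; each merge adds
    one bit to the depth of every leaf underneath it."""
    tiebreak = count()  # avoids comparing dicts when probabilities tie
    # heap items: (probability, tiebreaker, {symbol: current depth})
    heap = [(p, next(tiebreak), {sym: 0}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

probs = {1: 0.3, 2: 0.25, 3: 0.2, 4: 0.15, 5: 0.1}
lengths = huffman_code_lengths(probs)
avg = sum(probs[s] * lengths[s] for s in probs)
print(avg)  # 2.25 bits/symbol, matching the table's code lengths 2,2,2,3,3
```

For this distribution the average length is 2.25 bits/symbol, slightly above the entropy of about 2.23 bits, consistent with Huffman optimality.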

  • Slide 13/17

  • Slide 14/17: Source Coding

    [Figure: cartoon of Alice drinking from a bottle labeled COMPRESS ME, producing a bit stream 1010110101001 0001011011010 1000110101…]

    Lossy compression

    Question: What is the best we can do?

    Minimum compression size for a given distortion?

    Minimum distortion given a compressed size?
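One way to see the size-vs-distortion tradeoff is a uniform scalar quantizer (my illustration, not the lecture's method): each extra bit per sample halves the quantization step, cutting mean squared error roughly fourfold.

```python
# Uniform scalar quantization of samples in [0, 1): with R bits per
# sample there are 2**R levels; for a uniform source the mean squared
# error is about (step size)**2 / 12.
def quantize(x, bits):
    levels = 2 ** bits
    step = 1.0 / levels
    index = min(int(x / step), levels - 1)   # the compressed representation
    return (index + 0.5) * step              # the reconstructed value

samples = [i / 1000 for i in range(1000)]    # stand-in uniform source
for bits in (1, 2, 4, 8):
    mse = sum((x - quantize(x, bits)) ** 2 for x in samples) / len(samples)
    print(bits, mse)   # distortion drops ~4x per extra bit
```

Rate-distortion theory answers the slide's questions in general: it gives the minimum rate achievable at a given distortion, of which this quantizer is only a simple instance.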

  • Slide 15/17: Source Coding

    Without source coding, none of the following would have been successful:

    Digital cell phones

    MP3 players, PMP

    Digital multimedia broadcasting (DMB)

    Voice over IP

    Video on demand

    Digital cameras, digital camcorders

    Typical numbers

    5:1 compression for text

    10:1 compression for audio

    100:1 compression for video

  • Slide 16/17: Channel Coding

    Channel coding is also essential

    Typically more than 10 times increase in capacity with channel coding

    Without source & channel coding, capacity would reduce to 1/100 ~1/1,000

    No digital revolution would have been possible without source and channel coding!

  • Slide 17/17: Analog vs. Digital

    Telephone

    Can carry ~10 digital voice calls over one phone line

    Channel BW: 4 kHz

    Source BW of analog signal: 3.4 kHz

    Channel capacity: ~50 kbps

    Source rate before compression: 8 bits × 8 kHz = 64 kbps

    Source rate after compression: ~5 kbps

    Video

    Can carry ~6 digital video streams over one video channel

    Channel BW: 6 MHz

    Source BW of analog signal: 4 MHz

    Channel capacity: ~60 Mbps

    Source rate before compression: ~200 Mbps

    Source rate after compression: ~10 Mbps
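The call and stream counts follow directly from dividing channel capacity by the compressed source rate; a quick arithmetic check of the slide's (approximate) figures:

```python
# Approximate figures from the slide.
phone_capacity_kbps = 50
voice_rate_kbps = 8 * 8          # 8 bits/sample * 8 kHz sampling = 64 kbps uncompressed
voice_compressed_kbps = 5
video_capacity_mbps = 60
video_compressed_mbps = 10

print(phone_capacity_kbps // voice_compressed_kbps)   # 10 voice calls per line
print(video_capacity_mbps // video_compressed_mbps)   # 6 video streams per channel
print(voice_rate_kbps)                                # 64 kbps before compression
```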