
Information Complexity: an Overview

Rotem Oshman, Princeton CCI
Based on work by Braverman, Barak, Chen, Rao, and others
Charles River Science of Information Day 2014

Classical Information Theory

• Shannon ‘48, A Mathematical Theory of Communication:
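For reference, a standard statement of the central definition (not claimed to be the slide's own rendering): the entropy of a source $X$,

```latex
% Entropy: the optimal number of bits per sample, in expectation,
% for transmitting many independent samples of X.
H(X) \;=\; \sum_{x} \Pr[X = x] \,\log_2 \frac{1}{\Pr[X = x]}
```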

Motivation: Communication Complexity

[Figure: Alice holds 𝑋, Bob holds 𝑌; they communicate to decide whether 𝑋 = 𝑌]

Yao ‘79, “Some complexity questions related to distributive computing”

Motivation: Communication Complexity

[Figure: Alice holds 𝑋, Bob holds 𝑌; they communicate to solve a joint task]

More generally: solve some task

Yao ‘79, “Some complexity questions related to distributive computing”
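To make the model concrete, here is a minimal sketch (not from the talk) of the textbook shared-randomness protocol for the equality task pictured above:

```python
import random

def equality_protocol(x: str, y: str, reps: int = 20) -> bool:
    """Randomized protocol for EQUALITY with shared randomness.

    Alice holds x, Bob holds y (n-bit strings). In each repetition the
    players use a shared random string r, and Alice sends the single bit
    <x, r> mod 2; Bob compares it with <y, r> mod 2. If x != y, each
    repetition catches the difference with probability 1/2, so the error
    is at most 2**-reps -- using only `reps` bits of communication,
    versus n+1 bits for any deterministic protocol.
    """
    n = len(x)
    for _ in range(reps):
        r = [random.randint(0, 1) for _ in range(n)]  # shared randomness
        alice_bit = sum(int(xi) & ri for xi, ri in zip(x, r)) % 2  # Alice's message
        bob_bit = sum(int(yi) & ri for yi, ri in zip(y, r)) % 2
        if alice_bit != bob_bit:
            return False  # definitely unequal
    return True  # equal with probability >= 1 - 2**-reps

print(equality_protocol("10110", "10110"))  # True
print(equality_protocol("10110", "10111"))  # False (with high probability)
```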

Motivation: Communication Complexity

• Applications:
– Circuit complexity
– Streaming algorithms
– Data structures
– Distributed computing
– Property testing
– …

Example: Streaming Lower Bounds

• Streaming algorithm:

• Reduction from communication complexity [Alon, Matias, Szegedy ’99]

[Figure: a data stream feeds into the algorithm. How much space is required to approximate f(data)?]

Example: Streaming Lower Bounds

• Streaming algorithm:

• Reduction from communication complexity [Alon, Matias, Szegedy ’99]

[Figure: Alice runs the algorithm on her half of the data, then sends the state of the algorithm to Bob, who continues on his half]
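A schematic of the reduction in Python (a sketch; `make_algorithm` and the methods `process`, `get_state`, `set_state`, `output` are assumed names for a generic streaming-algorithm interface, not a real API):

```python
def protocol_from_streaming(make_algorithm, alice_stream, bob_stream):
    """One-way protocol obtained from a streaming algorithm (sketch).

    Alice runs the streaming algorithm on her half of the stream and
    sends only its state; if the algorithm uses S bits of space, the
    protocol communicates S bits. Hence a communication lower bound for
    the two-party task is also a space lower bound for any streaming
    algorithm computing it.
    """
    alg = make_algorithm()
    for item in alice_stream:      # Alice's half of the data
        alg.process(item)
    message = alg.get_state()      # the only message: S bits

    alg = make_algorithm()
    alg.set_state(message)         # Bob resumes from Alice's state...
    for item in bob_stream:        # ...and finishes the stream
        alg.process(item)
    return alg.output()
```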

Advances in Communication Complexity

• Very successful in proving unconditional lower bounds, e.g.,
– for set disjointness [KS ’92, Razborov ‘92]
– for gap Hamming distance [Chakrabarti, Regev ‘10]

• But stuck on some hard questions:
– Multi-party communication complexity
– Karchmer-Wigderson games

• [Chakrabarti, Shi, Wirth, Yao ’01], [Bar-Yossef, Jayram, Kumar, Sivakumar ‘04]: use tools from information theory

Extending Information Theory to Interactive Computation

• One-way communication:
– Task: send $X \sim \mu$ across the channel
– Cost: $H(X)$ bits
• Shannon: $H(X)$ bits per copy, in the limit over many instances
• Huffman: at most $H(X) + 1$ bits for one instance

• Interactive computation:
– Task: e.g., compute $f(X, Y)$
– Cost?
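A small self-contained sketch (not from the talk) contrasting the two one-way bullets: the entropy $H(X)$ against the expected length of a one-instance Huffman code:

```python
import heapq, math

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution given as a list."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def huffman_cost(p):
    """Expected codeword length of an optimal (Huffman) prefix code.

    Each merge of the two lightest subtrees adds one bit to every
    codeword beneath them, so the total expected length is the sum of
    the merged weights.
    """
    heap = list(p)
    heapq.heapify(heap)
    cost = 0.0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        cost += a + b
        heapq.heappush(heap, a + b)
    return cost

p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p), huffman_cost(p))  # 1.75 1.75: here H(X) = E[length]
p = [0.9, 0.1]
print(entropy(p), huffman_cost(p))  # ~0.469 1.0: within the H(X)+1 bound
```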

Information Cost

• Reminder: mutual information

$I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)$

• Conditional mutual information:

$I(X; Y \mid Z) = H(X \mid Z) - H(X \mid Y, Z)$

• Basic properties:

– $I(X; Y) \ge 0$ and $I(X; Y) \le \min\{H(X), H(Y)\}$
– Chain rule: $I(X, Y; Z) = I(X; Z) + I(Y; Z \mid X)$
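A quick numerical sanity check of these properties (a sketch, not from the talk), verifying the chain rule on the joint distribution where $Z = X \oplus Y$ for independent uniform bits $X, Y$:

```python
import math
from collections import defaultdict

def mutual_info(pxy):
    """I(X;Y) in bits, from a joint pmf given as {(x, y): prob}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in pxy.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

def marginal(joint, keep):
    """Project a joint pmf with tuple keys onto the index positions `keep`."""
    out = defaultdict(float)
    for key, p in joint.items():
        out[tuple(key[i] for i in keep)] += p
    return dict(out)

def cond_mutual_info(pxyz):
    """I(X;Y|Z) = sum_z Pr[Z=z] * I(X;Y|Z=z), pmf given as {(x, y, z): p}."""
    pz = marginal(pxyz, (2,))
    return sum(qz * mutual_info({(x, y): p / qz
                                 for (x, y, z), p in pxyz.items() if (z,) == z0})
               for z0, qz in pz.items())

# X, Y uniform bits and Z = X xor Y: then I(X;Z) = 0 but I(Y;Z|X) = 1.
pxyz = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}
lhs = mutual_info({((x, y), z): p for (x, y, z), p in pxyz.items()})  # I(XY;Z)
rhs = mutual_info(marginal(pxyz, (0, 2))) + \
      cond_mutual_info({(y, z, x): p for (x, y, z), p in pxyz.items()})
print(lhs, rhs)  # 1.0 1.0 -- chain rule: I(XY;Z) = I(X;Z) + I(Y;Z|X)
```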

Information Cost

• Fix a protocol $\pi$
• Notation abuse: let $\pi$ also denote the transcript of the protocol
• Two ways to measure information cost:
– External information cost: $IC^{ext}_\mu(\pi) = I(XY; \pi)$
– Internal information cost: $IC^{int}_\mu(\pi) = I(X; \pi \mid Y) + I(Y; \pi \mid X)$
– Cost of a task: infimum over all protocols for the task
– Which cost is “the right one”?

Information Cost: Basic Properties

External information: $I(XY; \pi)$    Internal information: $I(X; \pi \mid Y) + I(Y; \pi \mid X)$

• Internal $\le$ external
• Internal can be much smaller, e.g.:
– $X = Y$, uniform over $\{0, 1\}^n$
– Alice sends $X$ to Bob

• But the two are equal if $X$ and $Y$ are independent
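Worked out for the example above ($X = Y$ uniform over $\{0,1\}^n$, and $\pi$ is “Alice sends $X$”); a standard calculation, spelled out here:

```latex
IC^{ext}(\pi) = I(XY;\,\pi) = H(X) = n,
\qquad
IC^{int}(\pi) = I(X;\,\pi \mid Y) + I(Y;\,\pi \mid X) = 0 + 0 = 0
```

since Bob already knows $X$ (because $Y = X$), and the transcript is a function of Alice's own input.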

Information Cost: Basic Properties

External information: $I(XY; \pi)$    Internal information: $I(X; \pi \mid Y) + I(Y; \pi \mid X)$

• External information $\le$ communication: $I(XY; \pi) \le H(\pi) \le |\pi|$

Information Cost: Basic Properties

• Internal information $\le$ communication cost: $I(X; \pi \mid Y) + I(Y; \pi \mid X) \le |\pi|$

• By induction: let $\pi_r$ denote the transcript of the first $r$ rounds and $M_r$ the message of round $r$.
• $I(X; \pi_r \mid Y)$: what we know after $r$ rounds. By the chain rule,

$I(X; \pi_r \mid Y) = \underbrace{I(X; \pi_{r-1} \mid Y)}_{\text{what we knew after } r-1 \text{ rounds (I.H.)}} + \underbrace{I(X; M_r \mid Y, \pi_{r-1})}_{\text{what we learn in round } r \text{, given what we already know}}$

Information vs. Communication

• Want: $I(X; M_r \mid Y, \pi_{r-1}) + I(Y; M_r \mid X, \pi_{r-1}) \le |M_r|$
• Suppose $M_r$ is sent by Alice.
• What does Alice learn?
– $M_r$ is a function of $X$ and $\pi_{r-1}$, and so $I(Y; M_r \mid X, \pi_{r-1}) = 0$

• What does Bob learn?
– At most $I(X; M_r \mid Y, \pi_{r-1}) \le H(M_r) \le |M_r|$ bits

Information vs. Communication

• We have:
– Internal information $\le$ communication
– External information $\le$ communication
– Internal information $\le$ external information

Information vs. Communication

• “Information cost = communication cost”?
– In the limit: internal information! [Braverman, Rao ‘10]
– For one instance: external information! [Barak, Braverman, Chen, Rao ‘10]

• Big question: can protocols be compressed down to their internal information cost?
– [Ganor, Kol, Raz ’14]: no!
– There is a task with internal IC $= k$ but CC $= 2^{\Omega(k)}$
– … but: remains open for functions with small output.

Information vs. Amortized Communication

• Theorem [Braverman, Rao ‘10]:

$\lim_{n \to \infty} \frac{CC_\mu(f^n, \varepsilon)}{n} = IC_\mu(f, \varepsilon)$

• The “$\le$” direction: compression
• The “$\ge$” direction: direct sum
• We know: $CC_\mu(f^n, \varepsilon) \ge IC_\mu(f^n, \varepsilon)$
• We can show: $IC_\mu(f^n, \varepsilon) \ge n \cdot IC_\mu(f, \varepsilon)$

Direct Sum Theorem [BR‘10]

• Let $\pi$ be a protocol for $f^n$ on $n$-copy inputs $(X_1, \ldots, X_n), (Y_1, \ldots, Y_n)$
• Construct $\pi'$ for $f$ as follows:
– Alice and Bob get inputs $x, y$
– Choose a random coordinate $i$, set $X_i = x$, $Y_i = y$
– Bad idea: publicly sample all other coordinates $U = X_{-i}$, $V = Y_{-i}$

[Figure: the real inputs 𝑋, 𝑌 are embedded at coordinate 𝑖, surrounded by sampled padding 𝑈, 𝑉]

Direct Sum Theorem [BR‘10]

• Let $\pi$ be a protocol for $f^n$ on $n$-copy inputs $(X_1, \ldots, X_n), (Y_1, \ldots, Y_n)$
• Construct $\pi'$ for $f$ as follows:
– Alice and Bob get inputs $x, y$
– Choose a random coordinate $i$, set $X_i = x$, $Y_i = y$
– Bad idea: publicly sample all other coordinates

Suppose in $\pi$, Alice sends $X_1 \oplus \cdots \oplus X_n$. In $\pi$, Bob learns one bit about the whole input; in $\pi'$ he should learn only $1/n$ of a bit about $x$.

But if $X_{-i}$ is public, Bob learns 1 full bit about $x$!

Direct Sum Theorem [BR‘10]

• Let $\pi$ be a protocol for $f^n$ on $n$-copy inputs $(X_1, \ldots, X_n), (Y_1, \ldots, Y_n)$
• Construct $\pi'$ for $f$ as follows:
– Alice and Bob get inputs $x, y$
– Choose a random coordinate $i$, set $X_i = x$, $Y_i = y$
– Fix: for each coordinate $j \ne i$, publicly sample one side and let the owner of the other side sample it privately, conditioned on the public part

[Figure: the padding splits into publicly sampled halves 𝑈, 𝑉 and privately sampled completions of 𝑋 and 𝑌]
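A schematic Python sketch of this embedding (glossing over the correlated-sampling details; `sample_pair` is an assumed helper that draws one padding pair from the input distribution):

```python
import random

def embed_instance(x, y, n, sample_pair):
    """Sketch of the [BR '10] embedding of one f-instance into f^n.

    In the actual construction the "public" half below is shared
    randomness, and the private half is sampled by its owner conditioned
    on the public half; that split is exactly what prevents the leak
    illustrated on the previous slide.
    """
    i = random.randrange(n)   # public random coordinate
    X, Y, public = [None] * n, [None] * n, {}
    X[i], Y[i] = x, y         # embed the real inputs at coordinate i
    for j in range(n):
        if j == i:
            continue
        X[j], Y[j] = sample_pair()        # padding pair (X_j, Y_j)
        if j < i:
            public[j] = ("x", X[j])       # X_j public, Y_j private to Bob
        else:
            public[j] = ("y", Y[j])       # Y_j public, X_j private to Alice
    # The players run the n-copy protocol pi on (X, Y) and output its
    # answer at coordinate i; averaging over i, IC(pi') = IC(pi) / n.
    return i, X, Y, public
```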

Compression

• What we know: a protocol with communication $C$, internal info $I$ and external info $I^{ext}$ can be compressed to:
– $O(\sqrt{C \cdot I} \cdot \mathrm{polylog}(C))$ bits [BBCR’10]
– $O(I^{ext} \cdot \mathrm{polylog}(C))$ bits [BBCR’10]
– $2^{O(I)}$ bits [Braverman’10]

• Major open question: can we compress to $\tilde{O}(I)$? [GKR, partial answer: no]

Using Information Complexity to Prove Communication Lower Bounds

• Internal/external info $\le$ communication, so information lower bounds imply communication lower bounds
• Essentially the most powerful technique known
– [Kerenidis, Laplante, Lerays, Roland, Xiao ’12]: most lower bound techniques imply IC lower bounds

• Disadvantage: hard to show incompressibility!
– Must exhibit a problem with low IC but high CC
– But proving high CC usually proves high IC…

Extending IC to Multiple Players

• Recent interest in multi-player number-in-hand communication complexity

• Motivated by “big data”:
– Streaming and sketching, e.g., [Woodruff, Zhang ‘11, ’12, ’13]
– Distributed learning, e.g., [Awasthi, Balcan, Long ‘14]

Extending IC to Multiple Players

• Multi-player computation traditionally hard to analyze

• [Braverman, Ellen, O., Pitassi, Vaikuntanathan]: an $\Omega(nk)$ communication lower bound for Set Disjointness with $n$ elements, $k$ players, private channels, NIH input

Information Complexity on Private Channels

• First obstacle: secure multi-party computation
• [Goldreich, Micali, Wigderson ’87]: any function can be computed with perfect information-theoretic security against coalitions of players
– Solution: redefine information cost, measure both
• the information a player learns, and
• the information a player leaks to all the others.

Extending IC to Multiple Players

• Set disjointness:
– Input: $X_1, \ldots, X_k \subseteq [n]$
– Output: whether $X_1 \cap \cdots \cap X_k = \emptyset$

• Open problem: can we extend to gap set disjointness?
– First step: “purely info-theoretic” 2-party analysis

Extending IC to Multiple Players

• In [Braverman, Ellen, O., Pitassi, Vaikuntanathan] we show direct sum for multi-party:
– Solving $n$ instances = $n \times$ solving one instance

• Does direct sum hold “across players”?
– Solving with $k$ players = $k \times$ solving with 2 players?
– Not always

• Does compression work for multi-party?

Conclusion

• Information complexity extends classical information theory to the interactive setting

• Picture is much less well-understood
• Powerful tool for lower bounds
• Fascinating open problems:
– Compression
– Information complexity for multi-player computation, quantum communication, …
