arXiv:1503.08033v5 [q-bio.NC] 2 Jan 2017

Power-law statistics and universal scaling in the absence of criticality

Jonathan Touboul1, 2, ∗ and Alain Destexhe3, 4

1The Mathematical Neuroscience Laboratory, CIRB / College de France (CNRS UMR 7241,INSERM U1050, UPMC ED 158, MEMOLIFE PSL*)

2MYCENAE Team, INRIA Paris3Unit for Neurosciences, Information and Complexity (UNIC), CNRS, Gif sur Yvette, France

4The European Institute for Theoretical Neuroscience (EITN), Paris(Dated: January 4, 2017)

Critical states are sometimes identified experimentally through power-law statistics or universalscaling functions. We show here that such features naturally emerge from networks in self-sustainedirregular regimes away from criticality. In these regimes, statistical physics theory of large interactingsystems predict a regime where the nodes have independent and identically distributed dynamics.We thus investigated the statistics of a system in which units are replaced by independent stochasticsurrogates, and found the same power-law statistics, indicating that these are not sufficient toestablish criticality. We rather suggest that these are universal features of large-scale networkswhen considered macroscopically. These results put caution on the interpretation of scaling lawsfound in nature.

PACS numbers: 02.50.-r, 05.40.-a, 87.18.Tt, 87.19.ll, 87.19.lc, 87.19.lm

I. INTRODUCTION

Power law statistics are ubiquitous in physics and life sciences. They are observed in a wide range of complex phenomena arising in natural systems, from sandpile avalanches to forest fires [1, 2], earthquake amplitudes, solar flares, and website traffic, as well as in economics [3]. These distributions reveal unusual properties: the observed quantity has no typical scale and is not well characterized by its mean (when it exists): it is distributed over orders of magnitude, and large deviations are not exceptionally rare, in the sense that extreme events are far more likely than they would be, for instance, under a Gaussian distribution. These singular properties, combined with their ubiquity in nature, have attracted wide attention in applied science.

A number of theories have been proposed to account for the presence of such power-law distributions. Some theories use an analogy with statistical physics and interpret power-law scalings as the hallmark of a system operating at a phase transition. These theories associate power laws with the notion of criticality: classically, in statistical physics, critical phenomena are the behaviors occurring in systems at second-order phase transitions. These are largely thought to be universal, although no proof has been provided yet. Indeed, a number of identical statistics are found in vastly distinct models, and particularly in the Ising model [4] poised at its phase transition. The critical regime occurs only at very specific parameter values. In this regime the properties of the system are particularly singular; in particular, a number of statistics are scale-invariant, including for instance the size, duration, and shape of collective phenomena. The seminal work of Bak, Tang and Wiesenfeld on the Abelian sandpile model [5] largely popularized the hypothesis that criticality may be the origin of power laws observed in nature. Indeed, despite the fact that parameter sets yielding criticality are very rare, systems may self-organize naturally at criticality (at the phase transition point) without requiring fine tuning, through a mechanism called self-organized criticality (SOC) [1].

* [email protected]

While power laws and phase transitions (thus criticality) are well identified in models, a number of authors have underlined the importance of being cautious when claiming power-law behavior in finite systems, questioning their relevance or usefulness [6, 7]. In particular, Stumpf and Porter [6] aptly noted the importance of taking a nuanced view of the theoretical and empirical statistical support for a reported power law, as the theories arise from infinite systems while real systems and usual datasets are finite.

In physics, several alternative theories have been proposed to account for the presence of power laws (see e.g. [3] for a review). In particular, it was noticed very early that a pure random typewriter (a monkey sitting at a typewriter) would generate texts with a power-law distribution of word frequencies [8], identical to the one observed in real data. This work brilliantly showed that power laws may arise from purely stochastic mechanisms, evidencing that some power-law distributions may not reflect deep structure in the data. Li [9] formalized Miller's theory, highlighting the fact that combinations of exponentials naturally yield power-law relationships. Bouchaud and others [10, 11] noted that inverses of regularly distributed quantities may show power-law distributions. Newman and colleagues showed that random walks generate several statistics scaling as power laws [3], Yule introduced a process with broad applications, particularly in evolution, naturally associated with power-law distributions, and Takayasu and collaborators [12–14] showed that systems with aggregation and injection naturally generate clusters whose size scales as a power law with exponent −3/2. In neuroscience, Benayoun, Wallace and Cowan have shown that neuronal network models in a regime of balanced excitation and inhibition also produce power-law scalings of avalanches [15]. All these mechanisms are independent of any phase transition and arise away from criticality, from a particular way of considering a random process.
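The random-typewriter mechanism is easy to reproduce numerically. The following sketch (alphabet size and space probability are arbitrary choices, not parameters from [8]) generates random text and tallies word frequencies, which follow a Zipf-like rank-frequency curve:

```python
import random
from collections import Counter

# Miller's "random typewriter": keys (8 letters + space) hit at random;
# the alphabet size and space probability here are arbitrary choices.
rng = random.Random(0)
alphabet = "abcdefgh"
chars = [" " if rng.random() < 0.2 else rng.choice(alphabet)
         for _ in range(500_000)]
words = "".join(chars).split()
freqs = sorted(Counter(words).values(), reverse=True)
# Words of length L are exponentially rare individually, but there are
# exponentially many of them; the combination of the two exponentials
# yields an approximate power-law rank-frequency distribution.
```

Plotting `freqs` against rank on log-log axes shows the roughly straight line characteristic of a power law, despite the purely stochastic generating mechanism.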

The hypothesis that networks of the brain operate at criticality was introduced a decade ago with the development of techniques for recording local populations of cells and the analysis of specific events corresponding to collective bursts of activity separated by periods of silence. The first empirical evidence that neuronal avalanches may show power-law distributions of duration or size was derived from the analysis of in vitro activity in neuronal cultures [16]. Based on an analogy with the sandpile model, these bursts were seen as "neuronal avalanches", and were apparently distributed as a power law with a slope close to −3/2, consistent with the distribution of sandpile avalanches. These in vitro findings were based on indirect evidence of spiking derived from local field potentials (LFPs), extracellular signals associated with the summation of postsynaptic potentials (bursts produce negative peaks in the LFP signals) and affected by a number of events unrelated to spiking activity. Similar LFP statistics were later found in vivo in the awake monkey [17] and in the anesthetized rat [18]. This empirical evidence was used to draw strong conclusions about neural coding: the presence of such power laws would ensure maximized information capacity, dynamic range and information transmission [19, 20]. However, the method of analyzing the amplitude of negative LFP peaks was shown to produce spurious power-law scalings [21] regardless of the spiking activity of cells. Indeed, identical scalings were found in surrogate data and in positive LFP peaks (which are independent of spiking activity), and also arise in elementary, purely stochastic signals, such as excursions of Ornstein-Uhlenbeck processes through thresholds away from the mean, or in one-dimensional random walks [22, 23]: both the duration and size of excursions show power-law statistics, and they display shape invariance. It was further shown that, both in data and in surrogate models, the statistical significance of these power laws of LFP peaks was poor and depended on the threshold chosen.
In [24], Dehghani and collaborators performed a statistical analysis combining multielectrode in vivo recordings from the cerebral cortex of cat, monkey and human, and did not confirm the presence of power laws. The data rather showed an optimal fit with two independent exponential processes.

The poor statistical significance of LFP avalanche analysis and the ambiguous results it yields motivated an in-depth exploration of in vitro spiking data from cultures of neurons [25]. In this remarkable work, the authors used multi-unit data from in vitro cultures and addressed a number of properties of critical systems reported in Sethna et al.'s unified theory of criticality in statistical systems [26]. Of crucial importance in this theory are the power-law scalings of specific events and the relationships between the different scaling exponents. Friedman and collaborators [25] revealed that the data were consistent with Sethna et al.'s theory, since power-law scalings of both avalanche size and duration were reported, with slopes consistent with the critical exponents of −3/2 and −2 respectively, as well as universal mean temporal profiles of firing events that collapse, under a specific rescaling, onto a single universal scaling function, thereby providing a more substantial analogy between this in vitro system and statistical physics models at criticality.

In the present paper, we show that these observations can arise naturally in neuronal systems that are not at criticality. We also provide a theoretical explanation for this, as well as analytic access to some of the relevant properties of such systems.

II. AVALANCHES IN SPIKING NETWORK MODELS

We start by investigating the avalanche distributions generated by the classical model of a spiking neuronal network with excitatory and inhibitory connections introduced by Nicolas Brunel in [27]. This model describes the interaction of n neurons through their voltages (v_i)_{i=1···n}, which decay to the reversal potential in the absence of input, receive external input and spikes from interconnected cells, and fire action potentials when the voltage exceeds a threshold θ. In detail, the voltage of neuron i satisfies the equation:

\[ \tau \frac{dv_i}{dt} = -v_i + R\tau \sum_{j=1}^{n} J_{ij} \sum_{k\geq 0} \delta(t - t_j^k - D) \tag{1} \]

while v_i ≤ θ, where τ denotes the membrane time constant and R its resistance. The inputs received by the neuron are Dirac δs. Neuron i receives the kth impulse of neuron j, emitted at time t_j^k, after a delay D assumed constant, which alters its membrane potential by a quantity proportional to J_ij. Brunel's model assumes that these coefficients are zero except for a fixed number of cells randomly chosen in the excitatory and inhibitory populations, for which the coefficients J_ij take the fixed values J and −gJ respectively (see [27] for details). This model is particularly interesting for its versatility and its ability to produce diverse spiking patterns. One can classify the regimes of activity in terms of levels of synchrony and regularity, and different regimes emerge as a function of the relative levels of excitation and inhibition and of the input, which can be identified through the computation of precise bifurcation curves [27]. The regimes thus obtained are termed activity states, to distinguish them from the statistical mechanics notion of phase: the regimes are here separated by bifurcations occurring in the mean-field limit of the system. Of special interest are the Asynchronous Irregular (AI) states, in which neurons fire in a Poisson-like manner, with no periods of silence (hence no avalanches). This activity is evocative of the spike trains observed experimentally in awake animals. Such sparsely connected networks can also display periods of collective activity of broadly distributed duration interspersed with periods of silence, called Synchronous Irregular (SI) regimes, known to reproduce the qualitative features of spiking in anesthetized animals or neuronal cultures. Although partially ordered and partially disordered, SI regimes occur for a wide range of parameter values (all in inhibition-dominated regimes) [27] and are not very sensitive to modifications of biophysical parameters.
The SI regime is not at a transition in the activity regime; within this region, chaotic activity takes place. Indeed, inhibition dominates excitation, so that when activity spreads throughout the network, it triggers massive inhibition that naturally silences the network. This is what we observe in simulations of the Brunel model (Fig. 1) [28]. We investigate the statistics of spike units in both cases (see Fig. 1).
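A simulation in the spirit of eq. (1) can be sketched as follows; all parameter values, units, and helper names below are illustrative placeholders, not the exact settings of [27]:

```python
import numpy as np

def simulate_lif_network(n=1000, f_exc=0.8, eps=0.1, J=0.2, g=5.0,
                         theta=20.0, v_reset=10.0, tau=20.0, delay=1.5,
                         nu_ratio=0.9, t_max=1000.0, dt=0.1, seed=0):
    """Sparse excitatory/inhibitory LIF network with delta synapses, in the
    spirit of eq. (1). Times are in ms; parameter values are illustrative
    placeholders, not those of the paper."""
    rng = np.random.default_rng(seed)
    n_e = int(f_exc * n)                   # first n_e neurons are excitatory
    targets = [rng.choice(n, size=int(eps * n), replace=False)
               for _ in range(n)]          # sparse random projections
    weight = np.where(np.arange(n) < n_e, J, -g * J)  # inhibitory kicks: -gJ
    nu_ext = nu_ratio * theta / (J * tau)  # external Poisson rate (kHz)
    d = max(1, int(round(delay / dt)))
    ring = np.zeros((d + 1, n))            # ring buffer for delayed kicks
    v = rng.uniform(0.0, theta, size=n)
    spike_times, spike_ids = [], []
    for step in range(int(t_max / dt)):
        slot = step % (d + 1)
        v += -v * dt / tau + ring[slot] + J * rng.poisson(nu_ext * dt, size=n)
        ring[slot] = 0.0
        fired = np.flatnonzero(v >= theta)
        v[fired] = v_reset
        arrive = (step + d) % (d + 1)
        for j in fired:                    # deliver kicks after the delay D
            ring[arrive, targets[j]] += weight[j]
            spike_times.append(step * dt)
            spike_ids.append(j)
    return np.array(spike_times), np.array(spike_ids)
```

With g > 4 and f_exc = 0.8 the recurrent input is inhibition-dominated on average, the qualitative condition under which SI-like dynamics are reported.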

First, in the AI state, the sustained and irregular firing does not leave room for repeated periods of quiescence at the network scale, preventing the definition of avalanches (see raster plot in Fig. 1(B)) [29]. This sharply contrasts with the SI state, in which we can define avalanches
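The standard operational definition of an avalanche (a maximal run of consecutive non-empty time bins, as used throughout the avalanche literature) can be sketched as follows; the bin width is a free parameter of the analysis:

```python
import numpy as np

def avalanches(spike_times, bin_ms=1.0):
    """Bin pooled spike times; an avalanche is a maximal run of consecutive
    non-empty bins. Returns (sizes in spikes, durations in bins)."""
    t = np.asarray(spike_times, dtype=float)
    n_bins = int(np.ceil((t.max() + bin_ms) / bin_ms))
    counts, _ = np.histogram(t, bins=n_bins, range=(0.0, n_bins * bin_ms))
    sizes, durations = [], []
    s = d = 0
    for c in counts:
        if c > 0:                 # inside an avalanche
            s += c
            d += 1
        elif d > 0:               # a silent bin closes the avalanche
            sizes.append(s)
            durations.append(d)
            s = d = 0
    if d > 0:                     # close a trailing avalanche
        sizes.append(s)
        durations.append(d)
    return np.array(sizes), np.array(durations)
```

Feeding the pooled spike times of an SI-state simulation into this function yields the size and duration samples whose histograms are then tested against power laws.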


FIG. 1. Avalanche spike statistics in the sparsely connected network [27] with N = 5 000 neurons in the SI and AI states. (A) Raster plot of the SI state together with the firing rate (below). (B) Raster plot in the AI state: spiking is asynchronous and faster (notice that the time window is shorter compared to A for legibility): no silent period arises. Parameters as in [27, Fig. 2D] with g = 4 (excitation-dominated regime) and νext/νthresh = 1.1. In the SI state, (C) avalanche size (number of spikes) and (D) avalanche duration (number of bins) scale as power laws (dashed line is the maximum likelihood fit), and (E) average avalanche size scales as a power law of avalanche duration according to the universal scaling law [26]. Average avalanche shapes collapse onto the same curve (F,G) very accurately. Avalanche shapes from 20 to 40 bins are plotted in gray, and 3 of the same sizes as in [25] are colored and drawn with distinct thicknesses. Parameters as in [27, Fig. 2D] with N = 1000, g = 5 (inhibition-dominated regime) and νext/νthresh = 0.9.


which display the same statistics assumed to reveal criticality in cultures [25]. Fig. 1(A) represents the raster plot with typical avalanches taking place. Strikingly, both avalanche duration and avalanche size show an excellent fit to a power law, validated by a Kolmogorov-Smirnov test on the maximum likelihood estimator [30], and the exponents found are consistent with those found in neuronal cultures [25]. In particular, we find that the size s of the avalanches (number of firings during one avalanche) apparently scales as s^{−τ} with τ = 1.42 (Fig. 1(C)), close to the theoretical value of 3/2 (plotted on the figure for indication) and to the neuronal culture scaling (1.7 reported in [25]). Using the Kolmogorov-Smirnov test [30], we have tested the hypothesis that the data are distributed as a power law, and validated the hypothesis (Kolmogorov-Smirnov distance 0.0097, p-value p > 0.99). This is also the case for the distribution of the duration t of the avalanches, found to scale as t^{−α} with α = 2.11 (Fig. 1(D)), close to the theoretical value of 2 for critical systems and to the experimental value of 1.9 found in neuronal cultures [25]. The Kolmogorov-Smirnov distance to a pure power law is very low, evaluated at 0.031, which leads to a high p-value p = 0.99, clearly validating the hypothesis of a power-law distribution of avalanche durations. Similar to what was observed experimentally, the fit is valid over two octaves and drops off beyond [31]. This exponential drop is classically related to subsampling effects. We validated the presence of finite-size cutoffs by varying the size of the system, and found indeed that as the network size increases, the fit with a power-law distribution is valid on larger time intervals and the cutoff only arises at later times (see Supplementary Figure 6).
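A simplified, continuous-variable version of the maximum-likelihood fit and Kolmogorov-Smirnov distance used in such tests (the function name and exact formulation are ours, a sketch of the approach of [30], not the paper's code) is:

```python
import numpy as np

def fit_power_law(x, xmin=1.0):
    """Continuous maximum-likelihood exponent estimate for a power-law tail
    on [xmin, inf), and the Kolmogorov-Smirnov distance between the
    empirical and fitted CDFs. A simplified sketch, not the discrete test
    used in the paper."""
    x = np.sort(np.asarray(x, dtype=float))
    x = x[x >= xmin]
    n = len(x)
    alpha = 1.0 + n / np.sum(np.log(x / xmin))          # MLE exponent
    model_cdf = 1.0 - (x / xmin) ** (1.0 - alpha)       # fitted P(X <= x)
    emp_cdf = np.arange(1, n + 1) / n                   # empirical CDF
    ks = np.max(np.abs(emp_cdf - model_cdf))
    return alpha, ks
```

A small KS distance (and a correspondingly high p-value from resampling) is what "validates" a power-law hypothesis in this kind of analysis.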
Also consistent with neuronal data and crackling noise, we found that the average avalanche size scales very clearly as a power of its duration, but with a positive exponent γ = 1.50, not consistent with the crackling noise relationship between exponents

\[ \gamma = \frac{\alpha - 1}{\tau - 1} \tag{2} \]

which predicts an exponent γ = 2.64, but is quantitatively consistent with the in vitro data of [25]. We have also investigated the shape of avalanches of different durations. We found that, similarly to critical systems and to the in vitro data, the avalanche shapes collapsed onto a universal scaling function (Fig. 1E) when time is rescaled to the unit interval and the shape is rescaled by T^{γ−1}, where γ = 1.5 is the power-law exponent of the average size.
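The mismatch quoted above can be checked directly from relation (2) and the fitted exponents:

```python
# Crackling-noise prediction (2) from the fitted avalanche exponents
tau, alpha = 1.42, 2.11                 # size and duration exponents (Fig. 1)
gamma_pred = (alpha - 1) / (tau - 1)
print(round(gamma_pred, 2))             # -> 2.64, far from the measured 1.50
```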

These scalings are not specific to the particular choice of parameters used in our simulations: we consistently find similar apparent power-law scalings with similar scaling exponents in the whole range of parameter values corresponding to the SI state of Brunel's model, and in particular away from any transitions (see Supplementary Figure 7). Moreover, in addition to the fact that this regime is away from all transitions between the different network activity states, we confirmed that the system is not at criticality by showing that relaxation towards the SI regime after a perturbation is fast (within milliseconds) within the region considered, although it does slow down close to the transition point (see Supplementary Figure 8). We conclude that these statistics are valid in a whole regime of the system where the activity is synchronous irregular, and neither at a transition of the

model nor in a regime consistent with the slow decay of perturbations associated with critical regimes. Therefore, finding power-law statistics in neuronal avalanches with exponents 1.5 and 2 does not reveal that the system operates at criticality; rather, it seems to be a property of synchronous irregular states.

The SI states are prominent in neuronal activity, especially under anesthesia and in neuronal cultures. It is precisely in these situations that power-law distributions of spike avalanches were reported experimentally [16, 18]. This regime differs from awake activity, where neurons fire in an AI manner. In those regimes, power laws and criticality were reported based on LFP recordings [17]. We will come back to the experimental evidence for power laws in local field potential recordings in section V.

III. AVALANCHES AND BOLTZMANN MOLECULAR CHAOS PROPERTY

The observation of such scaling relationships in simple models of neuronal networks away from criticality, and for a broad range of parameter values, suggests that the observed scaling is related to properties of the system that are independent of the notion of criticality and that may be relatively general. These may therefore be related to the properties of the network activity, which we now describe in more detail.

III.1. Propagation of chaos in neural network models

The classical theory of the thermodynamics of interacting particle systems states that in large networks (such as those of the brain), the correlations between neurons vanish. This is also known as Boltzmann's molecular chaos hypothesis (Stoßzahlansatz), in reference to the hypothesis that the velocities of distinct particles should be independent, which is key to Ludwig Boltzmann's kinetic theory of gases [32]. In mathematics, this property is called propagation of chaos, and is rigorously defined as follows:

Definition 1. Let (X^n_1, ..., X^n_n) be a sequence of measures on (R^d)^n. The sequence is said to be X-chaotic if, for any k ∈ N and any set of indices i_1, ..., i_k independent of n, (X^n_{i_1}, ..., X^n_{i_k}) converges to k independent copies of X as n → ∞.

In our context, in the limit of large networks, neurons behave as independent jump processes with a common rate, which is the solution of an implicit equation. This property is at the core of theoretical approaches to understanding the dynamics of large-scale networks [27, 33–35]. In the case of Brunel's model, it is shown that two neurons share a vanishing proportion of common input in the thermodynamic limit, allowing one to conclude that the correlations of the fluctuating parts of the synaptic inputs of different neurons are negligible. This leads the authors to conclude that the spike trains of different neurons are independent point processes with an identical instantaneous firing rate ν(t) that is the solution of a self-consistent equation (see [27, p. 186, first column]). In that view, except in the case of a constant firing rate (the asynchronous irregular state), neurons always show a certain degree of synchrony due to the correlations of the instantaneous firing rates of different neurons.

Mathematically, several methods have been developed for interacting particle systems and gases (see e.g. [36] for an excellent review). It is shown that, generically, systems of interacting agents with sufficient regularity show propagation of chaos. All these results are in particular valid for neuronal networks, as was shown recently in a number of distinct situations. Large-n limits and propagation of chaos were demonstrated for large networks of integrate-and-fire neurons [37] and firing-rate models with multiple populations [38], for conductance-based models even in the large-time regime [39], and were shown to hold in realistic network models incorporating delays and the spatial extension of the system [40, 41]. Rigorous methods of convergence of particle systems show that the empirical measure of the system:

\[ \mu_n = \frac{1}{n} \sum_{j=1}^{n} \delta_{x_j} \]

converges in law towards a unique solution. A very powerful and universal mathematical result demonstrated in [42, Lemma 3.1] ensures that the convergence of the empirical measure of a particle system towards a unique measure implies propagation of chaos. Several methods may be used to show convergence of the empirical measure, including, in the case of neuroscience, coupling methods [41], compactness estimates [37] or large deviations [43].
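Propagation of chaos can be illustrated with a toy mean-field system (our own illustration, not a neural model and not an example from the cited works): Ornstein-Uhlenbeck particles coupled only through their empirical mean, for which the pairwise correlation is of order 1/n:

```python
import numpy as np

def tagged_pair_correlation(n, J=0.5, t_max=2000.0, dt=0.05, seed=0):
    """Mean-field coupled OU particles dX_i = (-X_i + J * mean(X)) dt + dW_i.
    A toy illustration of propagation of chaos: the stationary correlation
    of two tagged particles is of order 1/n and vanishes as n grows."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    x0, x1 = [], []
    for _ in range(int(t_max / dt)):
        # Euler-Maruyama step; each particle feels only the empirical mean
        x += dt * (-x + J * x.mean()) + np.sqrt(dt) * rng.standard_normal(n)
        x0.append(x[0])
        x1.append(x[1])
    return np.corrcoef(x0, x1)[0, 1]
```

Running this for, say, n = 5 and n = 200 shows the tagged-pair correlation shrinking towards zero as the system grows, the finite-n signature of the chaoticity in Definition 1.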

These results indicate that a universal form of activity emerges from neural networks, whereby neurons are independent copies of the same process. Before investigating the avalanche statistics of such regimes of activity, let us discuss the plausibility of the existence of these regimes in neuronal data.

III.2. Decorrelation in experimental data and models

In natural environments, the regularity of sensory input to the brain may create strong and long-range correlations in space and time [44–47]. It soon appeared that these correlations would be detrimental to the brain's ability to encode sensory stimuli and detect changes efficiently [48, 49]. Theoretical models of the visual system in particular have shown that reducing redundancy through decorrelation is important for efficient encoding of natural images [50–52]. This was confirmed experimentally. In [53], the authors used a high-density two-dimensional electrode array and found, in particular, a marked exponential decay of the correlation of excitatory cells. A clear confirmation of decorrelation, even for nearby cells receiving similar input, was recently provided by a remarkable experimental work in which chronically implanted multielectrode arrays were developed and implanted in the visual cortex of the macaque [54]. This protocol produced exquisite data showing that even nearby neurons, generally thought to be strongly connected and to receive a substantial amount of common input, exhibit a very low level of correlation. Similar decorrelation results were reported in the rodent neocortex [55] with the same level of accuracy.

The origin of this decorrelation is still controversial, and several hypotheses have been formulated, including a central role for adaptation ionic currents in temporal decorrelation [56], negative correlations associated with the co-evolution of excitatory and inhibitory cell activity [55], or sophisticated and robust mechanisms relying on neuronal nonlinearities and amplified by recurrent connectivity [57], which are compatible with the pattern decorrelation observed in the olfactory bulb of the zebrafish.

All these experimental findings confirm that regimes in which neurons are independent are plausible representations of neural network activity. We now investigate the avalanche statistics of such networks.

III.3. Statistics of networks in the molecular chaos regime

Both the mathematical analysis of neuronal network models and the fine analysis of the structure of spike trains motivate the study of ensembles of neurons that are independent but share common non-stationary statistics. The simplest such model one could think of is a collection of independent Poisson processes with identical time-dependent rates.

In that view, cells with constant firing rates resemble AI regimes. To generate a stochastic surrogate of the SI regime, the common rate of the cells should display non-periodic periods of silence. An obvious choice would be to replay the rates extracted from the SI state, and indeed such a surrogate generated power-law statistics (not shown); but in this case we could not rule out that the power-law statistics were encoded in the rate functions. To show that this is not the case, we generated a surrogate independent of the rate functions, by using a common firing rate given by the positive part ρ⁺ₜ of the Ornstein-Uhlenbeck process:

\dot{\rho}_t = -\alpha \rho_t + \sigma \xi_t

with (ξ_t) a Gaussian white noise. This choice is interesting in that, although periods of silence do not occur periodically, the duration between two such silences has a finite mean. Actually, the distributions of excursion shape and duration of the Ornstein-Uhlenbeck process are known in closed form [58]. These distributions are, of course, not heavy-tailed: they have exponential tails with exponent α, the decay timescale of the process.
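For concreteness, the rate process can be simulated with a standard Euler–Maruyama scheme. The sketch below uses α = σ = 1, matching the parameters of Fig. 2; the time step and horizon are illustrative choices, not taken from the text:

```python
import numpy as np

def simulate_ou_rate(alpha=1.0, sigma=1.0, dt=1e-3, T=10.0, seed=0):
    """Euler-Maruyama simulation of the Ornstein-Uhlenbeck process
    rho' = -alpha * rho + sigma * xi; returns the positive part rho_+,
    used as the common firing rate of all units."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    rho = np.empty(n)
    rho[0] = 0.0
    for t in range(1, n):
        rho[t] = rho[t - 1] - alpha * rho[t - 1] * dt \
                 + sigma * np.sqrt(dt) * rng.standard_normal()
    return np.maximum(rho, 0.0)
```

Periods where ρ_t < 0 silence the whole population; by the closed-form excursion statistics cited above, these silences recur with exponential, not heavy, tails.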

We investigated the collective statistics of N = 2 000 independent realizations of Poisson processes with this rate. The resulting raster plot is displayed in Fig. 2(A). While each unit fires as an inhomogeneous Poisson process, the macroscopic statistics show power-law distributions for the size (τ = 1.47, Fig. 2(B)) and for the duration (α = 1.9, Fig. 2(C)), both statistically significant [30] and consistent with critical exponents. Again, a linear relationship between average avalanche size and duration is found, with a coefficient evaluated to γ = 1.4 and explaining all the variance but 6 × 10⁻⁴. Again, this coefficient is not consistent with the crackling relationship (2). Notwithstanding, we found that the average shapes of avalanches of a given duration collapse onto one universal curve when the amplitude is rescaled by the duration to the power γ − 1 (Fig. 2(D,E)).
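The surrogate raster and the avalanche statistics can be reproduced along the following lines. This is a sketch under assumptions not fixed by the text (bin width, rate scale, and the stand-in rate trace used here); an avalanche is taken, as is standard, to be a maximal run of non-empty time bins:

```python
import numpy as np

def avalanche_stats(spike_counts):
    """Given the binned population spike count (one integer per time bin),
    return arrays of avalanche sizes and durations.  An avalanche is a
    maximal run of non-empty bins bounded by empty bins."""
    active = spike_counts > 0
    sizes, durations = [], []
    start = None
    for t, a in enumerate(active):
        if a and start is None:
            start = t
        elif not a and start is not None:
            sizes.append(int(spike_counts[start:t].sum()))
            durations.append(t - start)
            start = None
    if start is not None:  # avalanche still open at the end of the record
        sizes.append(int(spike_counts[start:].sum()))
        durations.append(len(active) - start)
    return np.array(sizes), np.array(durations)

# Surrogate: N independent Poisson units sharing a common rate trace.
# The rate trace and bin width below are illustrative stand-ins.
rng = np.random.default_rng(1)
N, dt = 2000, 1e-3
rate = np.maximum(np.cumsum(rng.standard_normal(5000)) * 0.05, 0.0)
counts = rng.poisson(N * rate * dt)  # population spike count per bin
sizes, durations = avalanche_stats(counts)
```

Log-log histograms of `sizes` and `durations` then give the empirical distributions analyzed in Fig. 2.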


FIG. 2. Avalanche statistics in the independent Poisson model with Ornstein-Uhlenbeck firing rate (α = σ = 1). Apparent power-law scalings, together with scale invariance of avalanche shapes. Avalanches of size 10 to 40 in gray, 3 specific trajectories highlighted.

Of course, these statistics are not tied to the particular rate process chosen. In Supplementary Fig. 9, for instance, we display the avalanche statistics of independent Poisson processes with rate given by the positive part of a reflected Brownian motion, and we find exactly the same power-law statistics of avalanche sizes and durations, as well as a very clean collapse of avalanche shapes.

III.4. Analytical derivations in the stationary and slow rate regime

In this simple model, it is actually straightforward to compute explicitly, for instance, the distribution of avalanche durations. Regardless of the Poisson nature of the firings and of the type of rate chosen, we can write down the probability for an avalanche of duration τ to occur at a specific time, whenever the spiking is described by a point process. Denoting by p(t) the probability for a neuron to spike in the time interval [t, t + δ], and by q(t) = 1 − p(t) the probability of silence, the probability to observe an avalanche of duration τ starting at time t* in a collection of n independent realizations is

q(t^*)^n \, q(t^* + \tau + 1)^n \prod_{t=1}^{\tau} \big(1 - q(t^* + t)^n\big).

Assuming stationarity, the probability of finding an avalanche of duration τ is given by

\int_{[0,1]^{\tau+2}} q_0^n \, q_{\tau+1}^n \prod_{t=1}^{\tau} (1 - q_t^n) \, d\rho_{\tau+2}(q_0, \cdots, q_{\tau+1}),

where ρ_{τ+2} denotes the joint probability for the firing rate to take a specific sequence of values (q_0, \cdots, q_{\tau+1}).

This formula remains quite complex. Assuming now that the rate varies extremely slowly compared to the avalanches, one can further simplify the probability p^0_\tau of an avalanche of duration τ:

p^0_\tau = \int_0^1 q^{2n} (1 - q^n)^{\tau} \rho_1(q) \, dq,

thus, with the simple change of variable x = q^n:

p^0_\tau = \frac{1}{n} \int_0^1 x^{1 + \frac{1}{n}} (1 - x)^{\tau} \rho_1\big(x^{\frac{1}{n}}\big) \, dx.

As expected, when n → ∞, the probability of having an avalanche of prescribed, finite duration goes to zero as 1/n. The typical shape of the distribution can be obtained by rescaling this probability by n. Since we are interested in the logarithmic shape of the distribution, we disregard any multiplicative constant. As n → ∞, we thus obtain that the probability profile converges towards a universal limit independent of the particular shape of the distribution ρ, precisely given by:

p^{0,\infty}_\tau \propto \int_0^1 x (1 - x)^{\tau} \, dx = \frac{1}{(\tau + 1)(\tau + 2)}

which is indeed a power law with exponent −2, identical to the one arising in critical systems and consistent with those reported in neuronal data [25], in neuron models (Fig. 1), and in surrogate systems (Fig. 2).
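The closed form above is the Beta integral B(2, τ + 1); a quick numerical sketch (using a plain Riemann sum) confirms both the identity and the τ⁻² tail:

```python
import math

def p_limit(tau):
    """Closed form of the limit distribution: integral_0^1 x (1-x)^tau dx,
    i.e. the Beta function B(2, tau+1) = 1/((tau+1)(tau+2))."""
    return 1.0 / ((tau + 1) * (tau + 2))

# Check the closed form against a crude left Riemann sum of the integral.
for tau in (1, 5, 50):
    n = 20000
    riemann = sum((k / n) * (1 - k / n) ** tau for k in range(n)) / n
    assert abs(riemann - p_limit(tau)) < 1e-3

# The ratio p(tau) / tau^{-2} tends to 1 for large tau: exponent -2.
assert abs(p_limit(1000) * 1000**2 - 1.0) < 0.01
```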

We now show that this extends to the distribution of avalanche sizes and to the scaling of the mean avalanche size


with duration. Indeed, the size of avalanches of duration τ has a binomial distribution, corresponding to s − τ successes among (n − 1)τ independent Bernoulli variables with probability of success 1 − q(t). Moreover, the De Moivre-Laplace theorem [59] ensures convergence, as n increases, towards a normal variable with mean (n − 1)τ p(t) and variance (n − 1)τ p(t) q(t), with p(t) = 1 − q(t). Using again our separation of timescales and stationarity hypotheses, we find the probability of finding an avalanche of size s and duration τ, averaged over the firing rate:

p_n(s, \tau) \sim \int_0^1 \frac{e^{-\frac{(s - \tau - n(1-q)\tau)^2}{2 n q \tau (1-q)}} \, q^{2n} (1 - q^n)^{\tau}}{\sqrt{2\pi n q \tau (1 - q)}} \, \rho(q) \, dq

which converges, as n→∞, towards:

p_\infty(s, \tau) \sim \int_0^1 \frac{e^{\frac{(s - \tau(1 - \log(u)))^2}{2\tau \log(u)}} \, u (1 - u)^{\tau}}{\sqrt{-2\pi\tau \log(u)}} \, du

We thus obtain at leading order the size distribution:

\mathbb{P}(s) \sim e^{s} \sum_{\tau=1}^{s} \int_0^1 \frac{e^{\frac{(s - \tau)^2}{2\tau \log(u)}} \, u \big(\sqrt{u}(1 - u)\big)^{\tau}}{\sqrt{-2\pi\tau \log(u)}} \, du. \qquad (3)

It is hard to further simplify this formula, but it can easily be evaluated numerically. We depict the result of this computation in Fig. 3, which illustrates the apparent power-law scaling with slope −3/2.
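The binomial description of avalanche sizes used above (s − τ extra spikes among (n − 1)τ Bernoulli trials with success probability p = 1 − q) and its De Moivre-Laplace normal approximation can be checked by direct sampling; the values of n, τ and q below are illustrative choices, not the paper's:

```python
import numpy as np

# Monte Carlo check: conditioned on a duration tau, the avalanche size
# minus tau counts the extra spikes, a Binomial((n-1)*tau, p) variable.
rng = np.random.default_rng(2)
n, tau, q = 200, 5, 0.99          # illustrative; q = per-bin silence probability
p = 1.0 - q
extra = rng.binomial((n - 1) * tau, p, size=100000)
sizes = tau + extra               # one triggering spike per bin plus the extras

# De Moivre-Laplace: sample mean and variance match (n-1)*tau*p and
# (n-1)*tau*p*q up to sampling error.
assert abs(extra.mean() - (n - 1) * tau * p) < 0.1
assert abs(extra.var() - (n - 1) * tau * p * q) < 0.3
```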

FIG. 3. Universal shape of the avalanche size distribution in the slow rate and large n limit.

Eventually, we obtain for the average size A_τ of avalanches of duration τ:

A_\tau \sim \int_0^1 \int_{s=\tau}^{\infty} s \, \frac{e^{\frac{(s - \tau(1 - \log(u)))^2}{2\tau \log(u)}}}{\sqrt{-2\pi\tau \log(u)}} \, ds \, du \sim \tau^{3/2} \int_0^1 (-\log(u))^{5/4} \, u \, \sqrt{\frac{\tau}{-8\pi \log(u)}} \, du.

We thus conclude that, while a power-law relationship persists between A_τ and τ, the scaling exponent is not related to the exponents of the power laws of the size (3/2) and duration (2) distributions through Sethna's crackling-noise relationship, which would predict an exponent equal to 2. Importantly, we note that the exponent found here is quantitatively consistent with the exponent found in in vitro data [25], in the neural network model (Fig. 1), and in the surrogate Poisson system (Fig. 2).

IV. SPIKE PATTERN ENTROPY AND INFORMATION CAPACITY

We have thus proved that power-law distributions of avalanches do not necessarily reveal that the network is operating at criticality. However, a number of theories have proposed that operating at criticality is an optimal regime for information processing in the brain, maybe selected by evolution as a useful trait for the nervous system [19, 60–64]. The question that arises is thus whether these theories break down when power-law statistics no longer arise from the system operating at criticality, but from a mean-field Boltzmann chaos regime.

In order to address this outstanding question, we came back to the methods used to demonstrate the optimality of data-processing capabilities at criticality. These theories rely on the computation of the information capacity of the network in different activity regimes, evaluated, following Shannon's information theory, as the entropy of the patterns of spikes fired. In detail, a spike pattern in a network of size N is an N-uplet s ∈ {0, 1}^N, with s_i = 1 (resp. s_i = 0) if neuron i has fired (resp. has not fired) in a specific time bin. If p denotes the probability of occurrence of spike patterns, the entropy is given by:

\mathrm{Entropy} = -\sum_{s \in \{0,1\}^N} p(s) \log(p(s)).

In order to test whether our theory for the emergence of power-law distributions in the absence of criticality challenges the high information capacity of neuronal networks, we computed the information capacity of Brunel's model in different regimes (see Fig. 4). The numerical results show that this is not the case: the information capacity is maximal in the SI regime where power-law statistics of avalanches were observed. However, we observe no difference between entropy levels in the SI and AI states. Therefore, we conclude that the maximality of entropy is not necessarily related to the emergence of power-law statistics.
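A plug-in estimate of this pattern entropy can be computed directly from a raster, which is feasible for small N or short patterns; the two example rasters below are illustrative constructions, not the paper's simulations:

```python
import numpy as np
from collections import Counter

def pattern_entropy(raster):
    """Shannon entropy (in bits) of the empirical distribution of binary
    spike patterns.  `raster` is a (time_bins x N) 0/1 array; each row is
    one pattern s in {0,1}^N."""
    counts = Counter(map(tuple, raster))
    total = sum(counts.values())
    probs = np.array([c / total for c in counts.values()])
    return float(-(probs * np.log2(probs)).sum())

# A regular, synchronized raster visits few patterns (low entropy);
# an irregular raster visits many (high entropy).
rng = np.random.default_rng(3)
sync = np.tile(np.array([[1] * 5, [0] * 5]), (500, 1))       # 2 patterns only
irregular = (rng.random((1000, 5)) < 0.5).astype(int)        # ~2^5 patterns
assert pattern_entropy(sync) < pattern_entropy(irregular)
```

This is the heuristic behind Fig. 4: pattern diversity, not criticality, drives the entropy.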

These observations can be well understood heuristically. Indeed, the entropy of spike patterns is a measure of the variability of the possible spike patterns observed in the course of neuronal activity. The diagram of Fig. 4 is thus not surprising: while the diversity of spike patterns is reduced in the highly synchronized regular regimes, it is large in the irregular regimes, both synchronized and asynchronous [65]. In other words, entropy is maximized within irregular regimes where more diverse patterns are fired, independently of the underlying mechanisms supporting the emergence of the irregular activity.

V. LOCAL FIELD POTENTIALS ARE AMBIGUOUS MEASURES OF CRITICALITY

We have thus shown that discarding the criticality assumption does not degrade the quantifications of network efficiency. This being said, we are facing an apparent contradiction. Indeed, our theory provides an account for the presence of critical statistics in networks in the SI regime, but discards AI states as possibly having critically distributed avalanches. Evidence of critical statistics of avalanches in vivo in the awake brain has been


FIG. 4. Computation of the entropy (circles) of the network in Brunel's model [27] for distinct values of the input intensity ν_ext and inhibition ratio g. Color code indicates the entropy amplitude. Parameters as in the original model [27, Fig. 2B]. The blue surface represents the boundary between SI and AI regimes, and the green surface separates SR regimes from SI regimes. We observe a clear transition from low entropy (SR) to high entropy (SI+AI).

reported, but they are scarcer and more controversial. Unlike neuronal cultures, the activity in the awake brain does not display bursts separated by silences, but is sustained. Using a macroscopic measurement of neuronal activity, the local field potential (LFP), power laws could be shown from the distribution of peaked events [17]. The motivation to use negative LFP peaks to deduce information about the distribution of spike avalanches relied on the fact that the amplitude of these peaks correlates with firing activity [16, 19].

However, this monotonic relationship between peak amplitude and the number of spikes does not imply that there should exist a relationship between the distribution of peak amplitudes and that of avalanches. Moreover, it was shown that power laws naturally emerge from the random nature of the signal and the thresholding procedure used in this analysis, and that these power laws may not be statistically significant [21]. Indeed, no power-law scaling could be found from unit activity, which was better fit by double-exponential distributions [24]. These analyses rather suggest that the power-law statistics of LFP peaks do not reflect scale-invariant neural activity.

In an attempt to clarify this, we investigate here whether the distribution of LFP peaks can display power-law scaling in spiking networks or in their stochastic surrogates. To obtain a more biophysical model in which an LFP can be defined, we considered the current-based Vogels-Abbott model [66], a biologically realistic spiking network displaying asynchronous irregular and synchronous regular states, in which synaptic currents are described by exponentially decaying functions with distinct time constants for excitation and inhibition (in place of the Dirac impulses in the Brunel model (1); see [66] for details). Simulations of the model provide an instantaneous distribution of postsynaptic currents, from which we computed LFP signals.

In detail, we have considered a spatially extended neural network of 5 000 units randomly located on a two-dimensional square and governed by the Vogels-Abbott model. We evaluated an LFP signal V_LFP from the postsynaptic currents according to Coulomb's law [67]:

V_{\mathrm{LFP}} = \frac{R_e}{4\pi} \sum_j \frac{I_j}{r_j},

where V_LFP is the electric potential at the electrode position, R_e = 230 Ω·cm is the extracellular resistivity of brain tissue, I_j are the synaptic currents, and r_j is the distance between current source j and the electrode. Remarkably, applying to this more sophisticated model the same procedures as in the original paper [17], we found that the method cannot distinguish between structured and non-structured activity: for a fixed firing rate and bin size, we compare in Figure 5 the LFP statistics of the avalanche duration in a network of neurons in the AI regime (A) and in the SI regime (C), which shows partial ordering of the firing, and we see no difference. We also compared these statistics to those of independent Poisson processes with constant rate (B): the three instances show power-law scaling of avalanche duration, with the same exponent, which seems to be related to the firing rate and bin size rather than to the form of the network activity. The scaling coefficient is again close to 3/2 and varies with bin size. Clearly, in this case, the statistics of LFP peaks cannot distinguish between critical and non-critical regimes.
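The LFP evaluation amounts to a weighted sum over current sources; the sketch below illustrates it with schematic positions, currents, and units (all placeholders, only the formula itself is from the text):

```python
import numpy as np

def lfp_coulomb(currents, positions, electrode, Re=230.0):
    """V_LFP = (Re / 4 pi) * sum_j I_j / r_j, with r_j the distance from
    current source j to the electrode.  Re is the extracellular
    resistivity (Ohm * cm in the text); units here are left schematic."""
    r = np.linalg.norm(positions - electrode, axis=1)
    return (Re / (4 * np.pi)) * np.sum(currents / r)

# 5000 units scattered on a unit square, electrode at the center;
# the currents are stand-in values, not model output.
rng = np.random.default_rng(4)
pos = rng.random((5000, 2))
I = rng.standard_normal(5000)
v = lfp_coulomb(I, pos, np.array([0.5, 0.5]))
```

Evaluating this sum at every time step of a simulation yields the V_LFP trace whose negative peaks are then thresholded as in [17].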

FIG. 5. Avalanche analysis defined from the macroscopic variable V_LFP of a network of integrate-and-fire neurons with exponential synapses. Networks displaying an SI (left) or AI (middle) state, and a purely stochastic surrogate (right), show similar macroscopic power-law LFP peak statistics with the same exponent close to −3/2.

VI. DISCUSSION

In this paper, we have evaluated how power-law statistics and universal scaling can arise in the absence of criticality. We first outline the novel contributions of the present manuscript, then discuss their significance.

The contributions of the present manuscript are the following:

(i). We first investigated the avalanche statistics of spiking neural network models (Section II). We showed that these networks display power-law


statistics with the critical exponents −3/2 and −2, as well as a collapse of the shapes of avalanches. This observation is robust and valid for a wide region of parameters corresponding to synchronous irregular activity in such networks, and thus away from any transition point (i.e., away from criticality).

(ii). This robust numerical observation led us to investigate theoretically how such statistics can emerge from the collective activity of spiking networks. One ubiquitous property of large neural network models is that neurons behave as independent processes with the same statistics. This property, termed Boltzmann's molecular chaos regime in statistical physics, or propagation of chaos in mathematics, is universal in the dynamics of large-scale networks. We reviewed such properties in models, as well as recent experimental evidence supporting this decorrelation between the dynamics of cells.

(iii). We next tested the hypothesis that power laws may emerge from such large systems of weakly correlated units with similar statistics. We introduced and investigated numerically the dynamics of a surrogate network made of independent neurons sharing the same statistics (Section III.3). Surprisingly, despite the simplicity of these systems, they display precisely the same power-law statistics of avalanche duration and size, with the same exponents and shape collapse as fully connected networks.

(iv). This surrogate model is simple enough to derive in closed form the statistics of the duration and size of the avalanches. Using this analytic expression, we demonstrate that the 'critical' exponents emerge naturally in large-scale systems (operating within the Boltzmann chaos regime), without the need to invoke criticality (Section III.4).

(v). The current literature indicates that criticality is an optimal information-processing regime for the brain. We considered the same measures of information capability (entropy of spike trains) on our biologically plausible models and showed that, indeed, information is maximized in the SI regime where power laws emerge. However, we show that this is not exclusive to the SI regime: the same levels of entropy are found in AI regimes where no avalanche can be defined.

(vi). Finally, we also addressed the presence of power-law scalings observed in LFP recordings by simulating such signals emerging from more realistic neuronal network models with excitatory and inhibitory cells. Surprisingly, we observed that power laws with the same exponents as observed in experiments are found in all regimes tested. These exponents persisted when neurons were replaced by independent Poisson processes. This clearly indicates that power-law scalings in LFPs do not constitute any proof of criticality in the underlying system.

Altogether, these numerical and theoretical findings provide a new interpretation for the emergence of power-law statistics in large-scale systems, independent of the notion of criticality. We propose to explain the emergence of such scaling based on the Boltzmann molecular chaos

regime, known to govern the dynamics of most large-scale interacting systems [27, 36, 40]. In other words, power laws and universal scaling functions can be due to a mean-field effect in systems made of a large number of interacting units. Of course, this theory does not hypothesize that the elements considered (here, neurons) are disconnected in reality. To the contrary, the fact that the critical exponents survive the removal of interconnections shows that such exponents do not need criticality to be explained.

The main mechanism explored here, Boltzmann's molecular chaos, is a universal feature of many statistical systems. The very particular structure of activity it induces, namely statistical independence of the particles' behavior together with a correlation in their law, may induce, as we have observed, the same type of power laws as in critical systems, with universal coefficients consistent with those found in critical systems. In agreement with this theory, we have seen that the same statistics are reproduced by a sparsely connected network and by a surrogate stochastic process in which the periods of firing and silence are themselves generated by another stochastic process. This interpretation suggests that similar scaling relationships shall arise in more realistic neural network models with fixed connectivity patterns, in particular including axonal propagation delays (constant delays are already present in the Brunel model), dendritic structure, spike-frequency adaptation, and non-instantaneous synaptic transmission. Indeed, most of these elements will make the intrinsic dynamics of each cell more complex, but we do not expect this complexity to affect the fact that these systems operate within the Boltzmann molecular chaos regime. Notwithstanding, models including synaptic plasticity, the process by which the brain acquires skills and stores memories, may not belong to the class of systems described in this paper. Indeed, in such systems, the connectivity patterns vary depending on the pairwise correlations of cell activity, and this relationship may compete with the establishment of Boltzmann's molecular chaos regime.
While this may not occur in the adult brain, where plasticity is much slower than neuronal activity, distinct phenomena not described by our model may occur during the critical periods of brain development, when plasticity operates on a faster timescale. Further experimental and theoretical investigations are necessary to characterize avalanche distributions in these systems, as well as to compute correlation levels to test whether the decorrelation characteristic of Boltzmann's molecular chaos occurs.

Interestingly, a network operating in Boltzmann's molecular chaos regime can be interpreted as a high-dimensional system with hidden variables, as studied recently in [68, 69]. In these contributions, the authors investigate the rank distribution of high-dimensional data with hidden latent variables, and show that such systems display Zipf-law scaling (power laws with slope −1 in the rank distribution) that generically arises from entropy considerations, using the elegant identity between entropy and energy shown in [70]. While these developments do not generalize here, Boltzmann molecular chaos provides a natural explanation for the emergence of weakly correlated units with similar probability laws: in the neural network case, the common rate can be seen as a latent variable, and both independence and irregularity build up only from the interactions between


cells. As a result of this theory, apparent power-law scaling with exponents of −3/2 and −2, as well as shape collapse, may be entirely explained statistically; in particular, these criteria constitute no proof of criticality, and experimental studies relying solely on them should be re-evaluated.

This statement is even more true when it comes to macroscopic measurements such as the LFP: both SI and AI regimes, as well as stochastic surrogates, display power-law statistics in the distribution of LFP peaks. This shows that systems of weakly correlated units, or their stochastic surrogates, can generate power-law statistics when considered macroscopically. Here again, the power-law statistics tell nothing about the critical or non-critical nature of the underlying system. This potentially reconciles the contradictory observations that macroscopic brain variables display power-law scaling [17], while no sign of such scaling was found in the units [24, 71]. More generally, these results also put caution on the interpretation of power-law relations found in nature.

A question that naturally emerges is how to distinguish power laws due to criticality from those due to Boltzmann's molecular chaos regime. We used the previous observation [26] that a prominent characteristic of criticality, beyond the presence of power-law scalings, is the particular relationship between the exponents. We found here that the power-law scalings emerging in the absence of criticality did not satisfy this relationship. We propose to use this criterion as a possible way to distinguish between power-law scaling due to criticality and that due to Boltzmann's molecular chaos.

In conclusion, we have shown here that stochastic models can replicate many of the experimental observations about 'critical' exponents, which demonstrates not only that power-law scaling is not enough to prove criticality, but also that we need new and better methods to investigate this in experimental systems. The fact that such exponents are seen both for networks and for stochastic systems shows that they apply to a large class of natural systems and may be more universal than previously thought. As George Miller noticed in his seminal paper [8] examining random text typed by virtual monkeys, the texts produced may not be interesting, but they have some of the statistical properties considered interesting when humans, rather than monkeys, hit the keys. Similarly, the present results show systems that can emulate the power-law scaling seen in brain activity, but with no criticality involved. We thus cannot conclude whether the brain operates at criticality or not; more elaborate methods are needed to resolve this point.

Acknowledgments

A.D. was supported by the CNRS and grants from the European Community (BrainScales FP7-269921 and Human Brain Project FP7-604102). We warmly thank Quan Shi and Roberto Zuniga Valladares for preliminary work and analyses. We thank the anonymous referees for their suggestions of analyses and references.

Appendix A: Power law statistics and maximum likelihood fits for stationary data

We review here the methods used to fit power-law distributions, closely following the methodology exposed in [30]. This methodology applies to stationary data. Taking the logarithm of the probability density of a power-law random variable, we obtain log(p(x)) = −α log(x) + log(a). The histogram of a power law therefore presents an affine relation in a log-log plot. For this reason, power laws in empirical data are often studied by plotting the logarithm of the histogram as a function of the logarithm of the values of the random variable, and fitting an affine line through the data points (usually with a least-squares algorithm). This method dates back to Pareto in the 19th century (see e.g. [72]). The point x_min at which the data start having a power-law distribution is mostly evaluated visually, but this method is very sensitive to noise (see e.g. [73] and references therein). The maximum likelihood estimator of the exponent α corresponding to n data points x_i ≥ x_min is:

\hat{\alpha} = 1 + n \left( \sum_{i=1}^{n} \log \frac{x_i}{x_{\min}} \right)^{-1}.

The log-likelihood of the data for the estimated parameter value is:

\mathcal{L}(\alpha \mid X) = n \log\!\left(\frac{\alpha - 1}{x_{\min}}\right) - \alpha \sum_{i=1}^{n} \log\!\left(\frac{x_i}{x_{\min}}\right).

The parameter x_min is then evaluated by minimizing the Kolmogorov-Smirnov distance:

\mathrm{KS} = \max_{x \ge x_{\min}} |S(x) - P(x)|

where S(x) is the cumulative distribution function (CDF) of the data and P(x) is the CDF of the theoretical distribution fitted with the parameter that best fits the data for x ≥ x_min, as proposed by Clauset and colleagues in [74]. In order to quantify the accuracy of the fit, we use a standard goodness-of-fit test which generates a p-value. This quantity characterizes the likelihood of obtaining a fit as good as or better than the one observed, if the hypothesized distribution is correct. This method involves sampling the fitted distribution to generate artificial data sets of size n, and then calculating the Kolmogorov-Smirnov distance between each data set and the fitted distribution, producing the distribution of Kolmogorov-Smirnov distances expected if the fitted distribution is the true distribution of the data. A p-value is then calculated as the proportion of artificial data sets showing a poorer fit than the observed data set. When this value is close to 1, the data set can be considered to be drawn from the fitted distribution; if not, the hypothesis might be rejected. The smallest p-values usually considered to validate the statistical test are taken between 0.1 and 0.01. These values are computed following the method described in [75], which in particular involves generating artificial samples through a Monte-Carlo procedure.

These methods, very efficient for stationary data, fail to evaluate the tails of non-stationary data, as is the case


of neuronal data. In such cases, a weighted Kolmogorov-Smirnov test with a refined goodness-of-fit estimate, valid up to extreme tails, can be used [76].

Appendix B: Subsampling effects

Of course, any analysis of finite sequences of data is subject to subsampling effects. While these may be neglected for light-tailed data, they become prominent when assessing a possible slow decay of the tails of a sample distribution. These effects were discussed in detail in a number of contributions. In the context of neuronal avalanches, they were characterized in [77], and the results indeed show a modification of the slope with subsampling, together with exponential cutoffs pushed to larger sizes as sampling becomes finer.

FIG. 6. Subsampling effects. Statistics for a randomly extracted subset of neurons of size n = 125, 250, 500 among 1 000 neurons whose dynamics are described by Brunel's model. Theoretical power laws with critical exponents are displayed as black dashed lines.

We have confirmed these results in our own data. In Fig. 6, we have computed the distribution of avalanche sizes and durations when considering only a fraction of the neurons for the statistics. In detail, we have simulated the Brunel model [27] with N = 1 000. This yields a raster plot, from which we have extracted a randomly chosen subset of n neurons, with n = N/k for k ∈ {2, 4, 8}. We indeed observed that the exponential cutoff is shifted towards larger sizes and that slopes increase with the subsampling ratio k.

Appendix C: Brunel’s model

In our simulations, we have used the neuronal network model introduced by Brunel in [27], and have referred to the different dynamical regimes of this system. We review here the model, provide all parameters used in our simulations, and show that the conclusions drawn in one example of the synchronous irregular (SI) state are valid for all parameters tested within this regime. The model describes the dynamics of N integrate-and-fire neurons, 80% of which are excitatory and the others inhibitory. In the model, it is assumed that each neuron receives C = εN randomly chosen connections, drawn uniformly from the excitatory and inhibitory populations, so that 80% of the incoming connections to any cell come from the excitatory population. The network is assumed to be sparsely connected, thus ε ≪ 1. The depolarization v_i of neuron i at the soma satisfies the

FIG. 7. Avalanche statistics for the Brunel model with randomly chosen parameters within the SI regime.

equation:

\tau \frac{dv_i}{dt} = -v_i + R I_i(t)

where I_i(t) is the total current reaching the soma at time t. These currents arrive from the synapses made with other cells within the network, as well as from connections to neurons outside the network. It is assumed that each neuron receives C_ext connections from excitatory neurons outside the network, and that these synapses are activated by independent Poisson processes with rate ν_ext. The current received by neuron i is thus the sum:

R Ii(t) = τ Σ_{j=1}^{C+Cext} Jij Σ_k δ(t − t_j^k − D)

where the sum is taken over all synapses, Jij are the synaptic efficacies, t_j^k are the spike times at synapse j of neuron i, and D is the typical transmission delay, considered homogeneous at all synapses for simplicity. To simplify the model further, it is assumed that Jij = J > 0 for all excitatory synapses and Jij = −gJ < 0 for inhibitory synapses. The parameter g is relevant in that it controls the balance between excitation and inhibition: if g < 4, the network is dominated by excitation; otherwise it is dominated by inhibition. Neuron i fires an action potential when vi reaches a fixed threshold θ, and its depolarization is instantaneously reset to a fixed value Vr, where it remains during a refractory period τrp (during this period, the neuron is insensitive to any stimulation). An important parameter is the ratio between the rate of external input νext and the quantity νthresh, corresponding to the minimal frequency that can drive a neuron, disconnected from the network, to fire an action potential: νthresh = θ/(0.8 J τ) (the coefficient 0.8 in that formula corresponds to the fraction of excitatory neurons).
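The single-neuron dynamics described above (leak between inputs, a jump of J per incoming excitatory spike, threshold, reset, and refractoriness) can be sketched with a forward-Euler scheme. This is a minimal illustration under our own conventions: the membrane time constant `tau` is an assumption (it is not listed in Table I), and the input spike train is supplied by hand rather than generated by the network.

```python
import numpy as np

# J, theta, Vr in mV and times in ms, as in Table I; tau = 20 ms is an
# assumed membrane time constant (not given in the table).
tau, J, theta, Vr, tau_rp = 20.0, 0.2, 20.0, 10.0, 2.0

def lif_sim(input_spikes, T=100.0, dt=0.01):
    """Forward-Euler sketch of tau dv/dt = -v between inputs, with a jump
    of J per incoming excitatory spike, threshold theta, reset to Vr, and
    refractory period tau_rp. Returns the output spike times."""
    v, t, refr, out = 0.0, 0.0, 0.0, []
    pending = sorted(input_spikes)
    i = 0
    while t < T:
        if refr > 0.0:
            refr -= dt                           # frozen during refractoriness
        else:
            v += dt * (-v / tau)                 # leak
            while i < len(pending) and pending[i] <= t:
                v += J                           # delta synapse
                i += 1
            if v >= theta:                       # fire and reset
                out.append(t)
                v, refr = Vr, tau_rp
        t += dt
    return out
```

With J = 0.2 mV and θ = 20 mV, a single neuron needs on the order of a hundred near-simultaneous inputs to reach threshold, which is consistent with the role of νthresh above.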

In this model, the parameters that are kept free are the balance between excitation and inhibition g and the external firing rate νext. All other parameters are chosen as in Table I. Using a mean-field analysis together with a diffusion approximation, the authors find that all neurons


ε      D        J        τrp    θ       Vr
0.1    1.8 ms   0.2 mV   2 ms   20 mV   10 mV

TABLE I. Parameters used in all simulations of Brunel’s model, as in [27].

are independent point processes driven by a common rate ν(t) given by a self-consistent equation. Heuristically, during the time interval [t, t+dt], the probability for any given neuron to spike is ν(t) dt, and the realizations of this random variable are independent across neurons. When the rate ν(t) depends on time, neurons thus show a level of synchrony, and when ν(t) is constant, the regime is called asynchronous, in the parlance of [27]. In that paper, an analysis of the self-consistent rate equation in the mean-field limit led to the identification of several regimes, depicted in Fig. 7:

• The asynchronous irregular (AI) state, in which ν(t) converges towards a strictly positive constant value; this occurs when the excitatory external inputs are sufficiently large (νext > νthresh) and when inhibition dominates excitation.

• The synchronous regular (SR) regime corresponds to a state in which ν(t) is a periodic function of time. This regime arises in the excitation-dominated regime, and the oscillation frequency is controlled essentially by the transmission delay D and the refractory period τrp (varying approximately as τrp/D). The transition thus occurs close to the line g = 4.

• The synchronous irregular (SI) regime occurs essentially in the inhibition-dominated regime when the inputs are not sufficient to drive the network to a sustained firing state, i.e. when νext < νthresh.
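The heuristic mean-field picture above (neurons as independent point processes driven by a common rate ν(t)) can be sketched by drawing, in each time bin, independent Bernoulli variables with probability ν(t) dt. The sinusoidal rate below is purely illustrative and not the self-consistent ν(t) of [27].

```python
import numpy as np

def surrogate_raster(nu, N, dt, rng):
    """N independent neurons; each spikes in bin [t, t+dt) with
    probability nu(t)*dt, nu being a common (possibly time-varying)
    rate sampled on the bin grid."""
    p = np.clip(np.asarray(nu, dtype=float) * dt, 0.0, 1.0)
    return (rng.random((N, p.size)) < p).astype(int)

rng = np.random.default_rng(1)
t = np.arange(0.0, 1.0, 0.001)                    # 1 s in 1 ms bins
nu = 20.0 * (1.0 + np.sin(2 * np.pi * 5 * t))     # illustrative rate (Hz)
raster = surrogate_raster(nu, N=100, dt=0.001, rng=rng)
```

A time-varying ν(t) makes the columns of `raster` correlated across neurons (synchrony), while a constant ν(t) yields the asynchronous case.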

We have reproduced in Fig. 7 the bifurcation diagram [27, Fig. 2B] with the bifurcation lines between the AI, SI and SR states. Within the SI state, we randomly drew 30 parameter points and analyzed the avalanches arising for these parameters. We found that all regimes show a very clear power-law distribution of avalanche size and duration, with exponents consistent with the values −1.5 and −2 predicted by the theory.
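A quick diagnostic for exponents near −1.5 is a least-squares fit on a log-log histogram. This is a sketch with our own function names and synthetic Pareto data; maximum-likelihood fitting (Clauset et al. [30]) is the rigorous approach used for real analyses.

```python
import numpy as np

def loglog_slope(values, nbins=30):
    """Least-squares slope of the log-log histogram of `values`.
    Quick diagnostic only; MLE fitting [30] is preferred."""
    values = np.asarray(values, dtype=float)
    bins = np.logspace(np.log10(values.min()),
                       np.log10(values.max()), nbins)
    hist, edges = np.histogram(values, bins=bins, density=True)
    centers = np.sqrt(edges[:-1] * edges[1:])   # geometric bin centers
    mask = hist > 0                              # skip empty bins
    slope, _ = np.polyfit(np.log10(centers[mask]),
                          np.log10(hist[mask]), 1)
    return slope

# Synthetic avalanche sizes with density ~ s^(-1.5): a classical Pareto
# with shape parameter a = 0.5 has density a * s^(-(a+1)) for s >= 1.
rng = np.random.default_rng(0)
sizes = rng.pareto(0.5, 200_000) + 1.0
```

On `sizes`, the fitted slope comes out close to the nominal −1.5, up to binning and tail-noise bias.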

Appendix D: No slowing down within the SI regime

In addition to the fact that the SI regime is away from any transition between the different network regimes, we confirmed that the system did not show the typical properties of critical states. A number of criteria have been proposed as characteristic of the critical regime of the Ising model. These include the divergence of the correlation length, of the heat capacity, or of the magnetic susceptibility, all of which are related to long-range correlations between spins. Here, the absence of an order parameter and of a spatial dimension prevents us from using similar criteria to investigate the presence of critical dynamics. However, a criterion independent of the definition of an analogue of the order parameter or magnetic susceptibility is the critical slowing-down occurring at phase transitions for dynamical systems. This criterion states that the relaxation time of the system, namely the time it takes for the system to return to its stationary regime after a perturbation, diverges at criticality.

In the present case, computing relaxation times is a challenge since the system is not at an equilibrium but within a chaotic regime; all perturbations thus produce massive changes in the dynamics of the system. Following the methodology developed in [78, 79], we designed a numerical criterion to evaluate the relaxation time to the SI regime. That regime is essentially defined by the alternation of periods of collective activity followed by silences. We have thus perturbed the system by adding a constant input within a short time window (see Fig. 8(A)), which has the effect of switching the system into an asynchronous irregular regime where the firing is uninterrupted. As the perturbation stops, the system quickly returns to an SI regime with alternations of silences and collective bursts. An upper bound on the relaxation time can thus be defined as the first time, after the perturbation has stopped, at which the system is completely silent. We have made extensive simulations within the SI regime to compute the relaxation time and found that the system returns to SI statistics within a few milliseconds of the end of stimulation (on the order of 2 ms). This time increases very fast close to the SI-AI transition, as expected from the theory, but within the SI regime, the system did not show any indication of critical slowing-down.
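The upper bound just defined (the first completely silent bin after the perturbation ends) is straightforward to compute from a binned population spike count; the function name and signature below are ours.

```python
import numpy as np

def relaxation_time(counts, t_end, dt):
    """Upper bound on the relaxation time to the SI regime: the delay
    between the end of the perturbation (t_end) and the first time bin
    in which the population spike count is zero. `counts` holds the
    number of spikes per bin of width dt. Returns None if the network
    is never silent within the recording."""
    counts = np.asarray(counts)
    start = int(np.ceil(t_end / dt))     # first bin after the perturbation
    for i in range(start, len(counts)):
        if counts[i] == 0:
            return i * dt - t_end
    return None
```

Repeating this over many perturbations and parameter points within the SI domain yields the relaxation-time map of Fig. 8(B).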

Appendix E: Diverse regimes of independent processes

We have confirmed that the statistics of independent Poisson processes with fluctuating instantaneous firing rates produce avalanches with power-law distributions of size and duration, consistent with our theory. To this purpose, we have performed an analysis similar to that of Fig. 2, replacing the Ornstein-Uhlenbeck firing rates by the positive part of a Brownian motion reflected at ±1. This choice was motivated by two constraints: the positive part was taken in order to consider only positive firing rates, for consistency, and the reflection at ±1 was imposed in order to prevent overly long excursions of the Brownian motion, so that we can attribute the heavy tails of the avalanche distributions to the statistical structure of the firings rather than to possible very long excursions of the Brownian motion. The results of the simulations are provided in Fig. 9. As in the case of the positive part of the Ornstein-Uhlenbeck process, we find very clear power-law distributions of avalanche size and duration, with slopes consistent with our theory, and a very clear collapse of the avalanche shapes.
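The surrogate rate described above (positive part of a Brownian motion reflected at ±1) can be generated with a simple Euler scheme; the step size, diffusion coefficient, and function name below are our own choices for illustration.

```python
import numpy as np

def reflected_bm_rate(n_steps, dt, sigma, rng):
    """Euler sketch of a Brownian motion reflected at +/-1, started at 0;
    excursions past a wall are folded back inside. The positive part
    max(x, 0) serves as the instantaneous firing rate."""
    x = np.empty(n_steps)
    x[0] = 0.0
    for i in range(1, n_steps):
        y = x[i - 1] + sigma * np.sqrt(dt) * rng.standard_normal()
        while y > 1.0 or y < -1.0:               # fold back at the walls
            y = 2.0 - y if y > 1.0 else -2.0 - y
        x[i] = y
    return np.maximum(x, 0.0)

rate = reflected_bm_rate(10_000, 0.001, 1.0, np.random.default_rng(0))
```

Feeding `rate` as the common ν(t) of independent Poisson neurons reproduces the setup analyzed in Fig. 9.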

We add that, beyond the collapse of the avalanche trajectories, the shapes onto which these avalanches collapse may convey important information, as noted and investigated in the context of one-dimensional random walks [22, 23]. We indeed observe that the shapes of the network-generated avalanches are not similar to the shapes obtained in the Brownian or Ornstein-Uhlenbeck case, and may similarly contain information that goes beyond the pure shape collapse reported in neural data [25].


FIG. 8. Relaxation time of the Brunel model in the SI regime. (A) A typical trajectory (blue) perturbed by a constant input (red) into an SI regime returns to the SI state after a few milliseconds. This is true over the whole SI domain (B): relaxation times are on the order of a few milliseconds and increase sharply close to the transition.

FIG. 9. Avalanche statistics and shape collapses for independent Poisson processes with rates given by a reflected Brownian motion.

[1] P. Bak, How Nature Works (Oxford University Press, Oxford, 1997).
[2] H. J. Jensen, Self-Organized Criticality: Emergent Complex Behavior in Physical and Biological Systems, Vol. 10 (Cambridge University Press, 1998).
[3] M. E. Newman, Contemporary Physics 46, 323 (2005).
[4] M. Kardar, Statistical Physics of Fields (Cambridge University Press, 2007).
[5] P. Bak, C. Tang, and K. Wiesenfeld, Physical Review Letters 59, 381 (1987).
[6] M. P. Stumpf and M. A. Porter, Science 335, 665 (2012).
[7] D. Avnir, O. Biham, D. Lidar, and O. Malcai, Science 279, 39 (1998).
[8] G. A. Miller, The American Journal of Psychology 70, 311 (1957).
[9] W. Li, IEEE Transactions on Information Theory 38, 1842 (1992).
[10] J. Bouchaud, in Lévy Flights and Related Topics in Physics (Springer, 1995), pp. 237–250.
[11] N. Jan, L. Moseley, T. Ray, and D. Stauffer, Advances in Complex Systems 2, 137 (1999).
[12] H. Takayasu, M. Takayasu, A. Provata, and G. Huber, Journal of Statistical Physics 65, 725 (1991).
[13] H. Takayasu, I. Nishikawa, and H. Tasaki, Physical Review A 37, 3110 (1988).
[14] H. Takayasu, Physical Review Letters 63, 2563 (1989).
[15] M. Benayoun, J. D. Cowan, W. van Drongelen, and E. Wallace, PLoS Computational Biology 6, e1000846 (2010).
[16] J. M. Beggs and D. Plenz, The Journal of Neuroscience 23, 11167 (2003).
[17] T. Petermann, T. C. Thiagarajan, M. A. Lebedev, M. A. Nicolelis, D. R. Chialvo, and D. Plenz, Proceedings of the National Academy of Sciences 106, 15921 (2009).
[18] G. Hahn, T. Petermann, M. N. Havenith, S. Yu, W. Singer, D. Plenz, and D. Nikolic, Journal of Neurophysiology 104, 3312 (2010).
[19] W. L. Shew and D. Plenz, The Neuroscientist 19, 88 (2013).
[20] D. R. Chialvo, Nature Physics 2, 301 (2006).
[21] J. Touboul and A. Destexhe, PLoS ONE 5, e8982 (2010).
[22] F. Colaiori, A. Baldassarri, and C. Castellano, Physical Review E 69, 041105 (2004).
[23] A. Baldassarri, F. Colaiori, and C. Castellano, Physical Review Letters 90, 060601 (2003).
[24] N. Dehghani, N. G. Hatsopoulos, Z. D. Haga, R. A. Parker, B. Greger, E. Halgren, S. S. Cash, and A. Destexhe, Frontiers in Physiology 3 (2012).
[25] N. Friedman, S. Ito, B. A. Brinkman, M. Shimono, R. L. DeVille, K. A. Dahmen, J. M. Beggs, and T. C. Butler, Physical Review Letters 108, 208102 (2012).
[26] J. P. Sethna, K. A. Dahmen, and C. R. Myers, Nature 410, 242 (2001).
[27] N. Brunel, Journal of Computational Neuroscience 8, 183 (2000).
[28] We used the algorithm freely available on ModelDB.
[29] Sub-sampling of the network may lead to finding periods of quiescence whose statistics depend on the network size. However, these are not robust statistical quantities with which to define a critical regime.
[30] A. Clauset, C. R. Shalizi, and M. E. Newman, SIAM Review 51, 661 (2009).
[31] The Kolmogorov-Smirnov statistical test is not affected by events of very small frequency.
[32] L. Boltzmann, Lectures on Gas Theory (University of California Press, 1964).
[33] A. Renart, N. Brunel, and X.-J. Wang, in Computational Neuroscience: A Comprehensive Approach, p. 431 (2004).
[34] H. Sompolinsky, A. Crisanti, and H. Sommers, Physical Review Letters 61, 259 (1988).
[35] S. Ostojic, Nature Neuroscience 17, 594 (2014).
[36] A. Sznitman, École d'Été de Probabilités de Saint-Flour XIX, 165 (1989).
[37] P. Robert and J. D. Touboul, arXiv preprint arXiv:1410.4072 (2014).
[38] J. Touboul, G. Hermann, and O. Faugeras, SIAM Journal on Applied Dynamical Systems 11, 49 (2012).
[39] S. Mischler, C. Quininao, and J. Touboul, arXiv preprint arXiv:1503.00492 (2015).
[40] J. Touboul et al., The Annals of Applied Probability 24, 1298 (2014).
[41] J. Touboul, Journal of Statistical Physics 156, 546 (2014).
[42] A.-S. Sznitman, Journal of Functional Analysis 56, 311 (1984).
[43] T. Cabana and J. Touboul, Journal of Statistical Physics 153, 211 (2013).
[44] D. L. Ruderman, Network: Computation in Neural Systems 5, 517 (1994).
[45] D. L. Ruderman and W. Bialek, Physical Review Letters 73, 814 (1994).
[46] D. W. Dong and J. J. Atick, Network: Computation in Neural Systems 6, 159 (1995).
[47] Y. Dan, J. J. Atick, and R. C. Reid, The Journal of Neuroscience 16, 3351 (1996).
[48] F. Attneave, Psychological Review 61, 183 (1954).
[49] H. B. Barlow (1961).
[50] S. Laughlin, Zeitschrift für Naturforschung 36, 910 (1981).
[51] E. P. Simoncelli and B. A. Olshausen, Annual Review of Neuroscience 24, 1193 (2001).
[52] A. Dimitrov and J. D. Cowan, Neural Computation 10, 1779 (1998).
[53] A. Peyrache, N. Dehghani, E. N. Eskandar, J. R. Madsen, W. S. Anderson, J. A. Donoghue, L. R. Hochberg, E. Halgren, S. S. Cash, and A. Destexhe, Proceedings of the National Academy of Sciences 109, 1731 (2012).
[54] A. Ecker, P. Berens, G. Keliris, M. Bethge, N. Logothetis, and A. Tolias, Science 327, 584 (2010).
[55] A. Renart, J. De la Rocha, P. Bartho, L. Hollender, N. Parga, A. Reyes, and K. Harris, Science 327, 587 (2010).
[56] X.-J. Wang, Y. Liu, M. V. Sanchez-Vives, and D. A. McCormick, Journal of Neurophysiology 89, 3279 (2003).
[57] M. T. Wiechert, B. Judkewitz, H. Riecke, and R. W. Friedrich, Nature Neuroscience 13, 1003 (2010).
[58] L. Alili, P. Patie, and J. L. Pedersen, Stochastic Models 21, 967 (2005).
[59] W. Feller, An Introduction to Probability Theory and Its Applications, Vol. 2 (John Wiley & Sons, 2008).
[60] W. L. Shew, H. Yang, S. Yu, R. Roy, and D. Plenz, The Journal of Neuroscience 31, 55 (2011).
[61] J. Hesse and T. Gross, Criticality as a signature of healthy neural systems: multi-scale experimental and computational studies (2015).
[62] W. L. Shew, H. Yang, T. Petermann, R. Roy, and D. Plenz, The Journal of Neuroscience 29, 15595 (2009).
[63] J. M. Beggs and N. Timme, Frontiers in Physiology 3, 163 (2012).
[64] J. Beggs, Philosophical Transactions of the Royal Society of London A: Mathematical, Physical and Engineering Sciences 366, 329 (2008).
[65] In models based on branching processes with parameter p, the maximal diversity clearly arises at criticality, where the entropy is larger than in the sub-critical regime (only patterns with a small number of spikes) or in the super-critical regime p > 1 (patterns with a large number of spikes).
[66] T. P. Vogels and L. F. Abbott, The Journal of Neuroscience 25, 10786 (2005).
[67] P. L. Nunez and R. Srinivasan, Electric Fields of the Brain: The Neurophysics of EEG (Oxford University Press, 2006).
[68] L. Aitchison, N. Corradi, and P. E. Latham, arXiv preprint arXiv:1407.7135 (2014).
[69] D. J. Schwab, I. Nemenman, and P. Mehta, Physical Review Letters 113, 068102 (2014).
[70] T. Mora and W. Bialek, Journal of Statistical Physics 144, 268 (2011).
[71] C. Bedard, H. Kroeger, and A. Destexhe, Physical Review Letters 97, 118102 (2006).
[72] B. Arnold, Pareto Distributions (International Co-operative Publishing House, 1983).
[73] S. A. Stoev, G. Michailidis, and M. S. Taqqu, arXiv preprint math/0609163 (2006).
[74] A. Clauset, M. Young, and K. Gleditsch, Journal of Conflict Resolution 51 (2007).
[75] A. Clauset, C. R. Shalizi, and M. Newman, SIAM Review 51, 661 (2009), arXiv:0706.1062.
[76] R. Chicheportiche and J.-P. Bouchaud, Physical Review E 86, 041115 (2012).
[77] V. Priesemann, M. H. Munk, and M. Wibral, BMC Neuroscience 10, 40 (2009).
[78] R. Engelken, F. Farkhooi, D. Hansel, C. van Vreeswijk, and F. Wolf, F1000Research 5 (2016).
[79] Y. Zerlaut, Z. Girones, and A. Destexhe, arXiv preprint.