The Shannon capacity of linear dynamical networks
Baggio G.; Zampieri S.
2019
Abstract
Understanding the fundamental mechanisms enabling fast and reliable communication in the brain is one of the key open challenges in neuroscience. In this work, we address this problem from a systems- and information-theoretic perspective. Specifically, we first develop a simple and tractable framework to model information transmission in networks driven by linear dynamics. We then resort to the notion of Shannon capacity to quantify the information transfer performance of these networks. Building on this framework, we show that it is possible to increase the Shannon capacity via two fundamentally different mechanisms: either by decreasing the degree of stability of the network adjacency matrix, or by increasing its degree of non-normality. We illustrate and validate our findings by means of simple, insightful examples.
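
The two quantities named in the abstract admit standard matrix-theoretic proxies: the degree of stability of an adjacency matrix A can be measured by how far its spectrum lies in the open left half-plane, and its degree of non-normality by, for example, Henrici's departure from normality. The sketch below (Python/NumPy) is only an illustration of these notions under those assumed definitions, not the paper's actual framework or capacity computation; it contrasts two small networks with identical eigenvalues but very different non-normality.

import numpy as np

def degree_of_stability(A):
    # Distance of the spectrum from the imaginary axis for x' = A x:
    # -max_i Re(lambda_i(A)); larger means "more stable".
    return -np.max(np.real(np.linalg.eigvals(A)))

def henrici_departure(A):
    # Henrici's departure from normality:
    # sqrt(||A||_F^2 - sum_i |lambda_i(A)|^2); it is zero iff A is normal.
    eigs = np.linalg.eigvals(A)
    gap = np.linalg.norm(A, 'fro')**2 - np.sum(np.abs(eigs)**2)
    return np.sqrt(max(gap, 0.0))

# Two stable 2-node examples with the same spectrum {-1, -2}:
A_normal = np.diag([-1.0, -2.0])          # normal (diagonal) adjacency matrix
A_nonnormal = np.array([[-1.0, 10.0],
                        [ 0.0, -2.0]])    # same eigenvalues, strongly non-normal

for name, A in [("normal", A_normal), ("non-normal", A_nonnormal)]:
    print(f"{name:>10s}: stability degree = {degree_of_stability(A):.2f}, "
          f"Henrici departure = {henrici_departure(A):.2f}")

As the printout shows, the two matrices are equally stable yet the second is highly non-normal, which is precisely the kind of distinction the paper exploits: capacity can be raised either by moving the spectrum (less stability) or by increasing non-normality while keeping the spectrum fixed.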




