Information Processing in Complex Systems 2019

Complex systems process information in a variety of interesting and unexpected ways: from slime molds anticipating the distribution of food sources in their environment to the changing networks of people relaying information on social media sites such as Instagram and Twitter. The behavior of all such systems can be interpreted in terms of information flow. Meanwhile, their study generally relies on cross-disciplinary techniques, drawing together ideas from computer science, physics, mathematical modeling, the social sciences, economics, biology, medical research and many other areas. This workshop is a platform for discussing how information theory shapes our view of complex systems across all of these areas.

We aim to bring together leaders across all these areas to discuss questions such as:

  • What are the different signatures of information flow in complex systems? How do we quantify the information content or dynamics of a process?
  • Is information content in systems split between different levels? Is there a hierarchical structure for processing this information?
  • Can we trace the flow of information through a complex dynamical process?

Venue:  Nanyang Executive Centre, Singapore
Date:  October 3, 2019

Abstract Submission: Please submit through EasyChair (deadline: July 26, 2019)

Workshop Programme: see the schedule below

Organizers: Rick Quax, Eckehard Olbrich, Mile Gu

Invited speakers

Schedule

Abstracts are listed below this table, in the same order.

Time | Name | Title
09:00 - 09:40 | Oscar Dahlsten | TBA
09:40 - 10:05 | Sudipta Singha Roy, Utkarsh Mishra and Debraj Rakshit (NO SHOW) | Trends of non-Markovianity and localization in disordered spin chains
10:05 - 10:30 | Fernando E. Rosas | Leveraging high-order interactions for enabling private information processing
10:30 - 11:00 | Coffee break |
11:00 - 11:40 | Karoline Wiesner | Shannon entropy and mutual information indicate transitions between stable states: glass formers and stem cells
11:40 - 12:05 | Pradeep Kumar Banerjee, Johannes Rauh, Eckehard Olbrich and Juergen Jost | On unique information based bounds on the secret key rate
12:05 - 12:25 | Conor Finn and Joseph Lizier | Generalised Measures of Multivariate Information Content
12:25 - 14:00 | Lunch break |
14:00 - 14:40 | Joseph Lizier | What can quantifying information processing tell us about ... flocks?
14:40 - 15:20 | Masafumi Oizumi | Towards Identifying Places and Boundaries of Consciousness
15:20 - 15:45 | Nao Tsuchiya | Quality and quantity of consciousness: empirical evidence from integrated information analysis from neural data
15:45 - 16:00 | Closing of the workshop |
16:00 - 16:30 | Coffee break |

Abstracts

Oscar Dahlsten

TBA.

Sudipta Singha Roy

Within a framework based on the tools of the theory of open quantum systems, we investigate the relation between localization of wave functions and memory effects associated with the dynamics of a sub-part of a disordered transverse-field Heisenberg chain. To this end, the decay profile of bipartite entanglement (BE) shared between a probe-qubit and a system-qubit (sub-part) of the chain is monitored in time. A clear shift in the trends of the decay profiles of the BE from monotonic in the low-disorder limit to non-monotonic in the moderately large disorder limit occurs due to strong information backflow from the environment (complementary-part) to the system-qubit. A connection between environmental interruption caused by the information backflow and the disorder strength is established by examining the entanglement revival frequencies. The growth patterns of the revival frequencies in the localized phase effectively distinguish an interacting system (many-body localized) from its non-interacting (Anderson localized) counterpart.
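For readers less familiar with how such memory effects are quantified: a standard entanglement-based witness of non-Markovianity (in the spirit of Rivas, Huelga and Plenio; the talk may use a different quantifier) integrates every transient revival of the entanglement E(t),

\[
\mathcal{N} \;=\; \int_{dE/dt \,>\, 0} \frac{dE(t)}{dt}\,\mathrm{d}t ,
\]

so that \(\mathcal{N} = 0\) for a monotonically decaying profile (Markovian regime), while \(\mathcal{N} > 0\) signals information flowing back from the environment, which is exactly the signature tracked here through the revival frequencies.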

Fernando E. Rosas

Perfect data privacy seems to be in fundamental opposition to the economic and scientific opportunities associated with extensive data exchange. Defying this intuition, we explore how one can leverage high-order interactions to allow dataset disclosure without compromising the privacy of individual data samples. Our proposal is based on the notion of synergistic variables, which are correlated with global statistical properties of datasets while being independent of individual samples. We present an algorithm to build an optimal disclosure strategy/mapping, and discuss its fundamental limits on finite and asymptotically large datasets. Furthermore, we present explicit expressions for the asymptotic performance of this scheme in some scenarios, and study cases where our approach attains maximal efficiency. Finally, we study the relationship between our approach and the well-known Partial Information Decomposition (PID) framework, and discuss potential future extensions.
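To make the notion of a synergistic variable concrete, here is a minimal sketch (ours, not the authors' construction): the parity of two private bits is independent of each bit individually, yet fully determined by the pair, so disclosing it reveals a global property without leaking any individual sample.

```python
# Minimal illustration (not the authors' algorithm): Z = X XOR Y is a
# synergistic variable. It carries zero information about X alone and
# about Y alone, but one full bit about the pair (X, Y).
import math
from collections import defaultdict

def mutual_information(joint):
    """I(A;B) in bits from a dict {(a, b): probability}."""
    pa, pb = defaultdict(float), defaultdict(float)
    for (a, b), p in joint.items():
        pa[a] += p
        pb[b] += p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def marginal(joint3, f):
    """Collapse the joint over (x, y, z) onto the pair returned by f."""
    out = defaultdict(float)
    for (x, y, z), p in joint3.items():
        out[f(x, y, z)] += p
    return out

# X, Y: independent uniform bits; Z = X XOR Y.
p_xyz = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

print(mutual_information(marginal(p_xyz, lambda x, y, z: (z, x))))       # 0.0 bits
print(mutual_information(marginal(p_xyz, lambda x, y, z: (z, y))))       # 0.0 bits
print(mutual_information(marginal(p_xyz, lambda x, y, z: (z, (x, y)))))  # 1.0 bit
```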

Karoline Wiesner

I will use two examples of experimental-theoretical collaboration to illustrate the power of information theory in discovering transitions in complex systems.

Example 1: Glass former – Among the key challenges to our understanding of solidification in the glass transition is that it is accompanied by little apparent change in structure. We introduced an information-theoretic approach to determine correlations in displacement for particle relaxation encoded in the initial configuration of a glass-forming liquid. Our analysis may resolve some of the difficulties of convoluted time and length scales.

Example 2: Stem cells – The metaphor of a potential epigenetic differentiation landscape broadly suggests that during differentiation a stem cell descends from a higher free energy towards a stable equilibrium state which represents the final cell type. It has been conjectured that there is an analogy to the concept of entropy in statistical mechanics: in the undifferentiated state the entropy would be large, since fewer constraints exist on the gene expression programmes of the cell. We compute the Shannon entropy for time-resolved single-cell gene expression data in two different experimental set-ups of haematopoietic differentiation. We find that the behaviour of this entropy measure is in contrast to these predictions.
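As a toy version of this computation (synthetic numbers, purely for illustration; the study cited below works with real single-cell measurements), one can bin a gene's expression across cells at each time point using a fixed set of bins and track the Shannon entropy of the binned distribution:

```python
# Toy sketch with hypothetical synthetic data: with fixed bins shared
# across time points, a narrowing expression distribution shows up
# directly as a falling Shannon entropy.
import numpy as np

def shannon_entropy(samples, edges):
    """H in bits of the distribution of `samples` binned at `edges`."""
    counts, _ = np.histogram(samples, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
edges = np.linspace(0.0, 10.0, 21)  # same bins at every time point
# Hypothetical trend: expression variability narrows as cells differentiate.
for day, spread in [(0, 2.0), (2, 1.0), (4, 0.3)]:
    expr = rng.normal(loc=5.0, scale=spread, size=1000)
    print(f"day {day}: H = {shannon_entropy(expr, edges):.2f} bits")
```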

Wiesner, K., J. Teles, M. Hartnor, and C. Peterson. "Haematopoietic stem cells: entropic landscapes of differentiation." Interface Focus 8, no. 6 (2018): 20180040.

Dunleavy, Andrew J., Karoline Wiesner, Ryoichi Yamamoto, and C. Patrick Royall. "Mutual information reveals multiple structural relaxation mechanisms in a model glass former." Nature Communications 6 (2015): 6089.

Pradeep Kumar Banerjee

The unique information (UI) is an information measure that quantifies a deviation from the Blackwell order. We show that the UI shares some intuitive and basic properties of classical quantities called secret key rates. In particular, we show that the UI is a secrecy monotone. A consequence of this property is that the UI is an upper bound on the one-way secret key rate. We also show that the UI satisfies a triangle inequality, which implies that the UI shares a key property with state-of-the-art upper bounds on the two-way secret key rate related to a "secret key decomposition". This result implies that the UI is never greater than the best known computable upper bound on the two-way rate. We conjecture that the UI lower bounds the two-way rate and discuss implications of the conjecture. We also discuss our results in the context of nonnegative decompositions of the mutual information into unique, redundant and synergistic components.
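For reference, a hedged sketch of the quantities involved, using the convex-programming definition of unique information from Bertschinger et al. (2014), on which this line of work builds:

\[
UI(X; Y \setminus Z) \;=\; \min_{Q \in \Delta_P} I_Q(X; Y \mid Z),
\qquad
\Delta_P \;=\; \bigl\{\, Q : Q_{XY} = P_{XY},\; Q_{XZ} = P_{XZ} \,\bigr\},
\]

and the abstract's upper-bound claim then reads \( S_{\rightarrow}(X; Y \,\|\, Z) \le UI(X; Y \setminus Z) \), where \( S_{\rightarrow} \) denotes the one-way secret key rate.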

Conor Finn

The multivariate mutual information can be negative for certain probability distributions. This property has been said to have "no intuitive meaning" and has prevented the multivariate mutual information from being widely adopted. More recently, however, an area of research known as information decomposition has provided insights into why the multivariate mutual information can be negative. Nevertheless, information decomposition has proven to be a contentious area of research and is still subject to ongoing debate. In this presentation, we take an approach which differs from that of information decomposition and yet yields similar results. We begin by considering the information that a set of observers (e.g. Alice and Bob) gain from a single realisation, and discuss the various ways in which these observers might share their information with some non-observer (e.g. Eve). We prove that Eve's information must be given by a particular function, which we call the union information content, and demonstrate that it has a more consistent correspondence with set theory than the mutual information content. This correspondence also enables us to introduce a function which we call the intersection information content. We will discuss the various properties of these two functions and their rich algebraic structure, which enables us to define the functions for an arbitrary number of observers. Finally, we show that we can re-derive our existing approach to information decomposition by combining the intersection information content with the Bell lattice of joint information contents.
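To make the opening claim concrete (our illustration, not necessarily the one used in the talk): for three variables, the multivariate (interaction) mutual information can be written

\[
I(X;Y;Z) \;=\; I(X;Y) \;-\; I(X;Y \mid Z),
\]

and for X, Y independent uniform bits with \( Z = X \oplus Y \) we get \( I(X;Y) = 0 \) while \( I(X;Y \mid Z) = 1 \) bit, so \( I(X;Y;Z) = -1 \) bit.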

Joseph Lizier

The space-time dynamics of interactions in complex systems are often described in the terminology of information processing, or computation, in particular with reference to information being stored, transferred and modified in these systems. In this talk, we describe an information-theoretic framework -- information dynamics -- that we have used to quantify each of these operations on information and their dynamics in space and time. Not only does this framework quantitatively align with natural qualitative descriptions of neural information processing, it provides multiple complementary perspectives on how, where and why a system is exhibiting complexity. We will review the application of this framework to flocking behaviour, and in particular schooling in fish, describing what it can reveal, and indeed has revealed, regarding flocking models and in the analysis of real collective-motion data. First, we show how the space-time dynamics of information processing in flocks highlight hot-spots and emergent computational structures, for example in quantifying the cascading turning waves of the Trafalgar effect as information flows, and in spatially locating the strongest information sources for a target fish. Next, we discuss examples of characterising regimes of flocking behaviour in terms of information processing, including the effects of speed, hunger and mixed-species schooling on interactions within fish schools. Finally, we provide an outlook on future work on analysing multivariate information interactions in flocks.
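For reference, the two core quantities of the information dynamics framework are standardly written as (with \( X_t^{(k)} \) denoting the length-k past of process X):

\[
A_X(k) \;=\; I\bigl(X_t^{(k)};\, X_{t+1}\bigr),
\qquad
T_{Y \to X}(k, l) \;=\; I\bigl(Y_t^{(l)};\, X_{t+1} \,\bigm|\, X_t^{(k)}\bigr),
\]

where the active information storage \( A_X \) captures information stored in a process's own past, and the transfer entropy \( T_{Y \to X} \) captures information transferred from a source Y (e.g. a neighbouring fish) over and above that storage.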

Masafumi Oizumi

It has been a long-standing question “where” consciousness resides in the brain. The problem of identifying the place of consciousness can be restated as the problem of identifying the boundary of consciousness, i.e., the problem of drawing the boundary in a complex neural network which determines the place of consciousness. Although many experimental findings have accumulated over the last decades, the boundary problem of consciousness has not yet been resolved. In this talk, I will discuss an information-theoretical approach to the boundary problem of consciousness based on Integrated Information Theory (IIT). IIT is an attempt to mathematically quantify consciousness from the viewpoint of information and integration, which are considered to be the essential properties of consciousness. IIT hypothesizes that the place of consciousness corresponds to the locally most “integrated” subnetwork in the brain, where the amount of integrated information is locally maximal. Finding the most integrated subnetwork (called a complex or an information core) in a large network is extremely difficult because it involves optimization problems that require an exponentially large amount of computational time. To resolve this difficulty, we have developed efficient algorithms that reduce the computational time to polynomial order, which enables us to find a complex within a reasonable amount of time. I will introduce several applications of the proposed algorithms to real neural data and discuss how the theoretical predictions about the places and boundaries of consciousness can be tested experimentally.
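As a hedged sketch of the combinatorial difficulty (following the general shape of complex-search formulations; the talk's precise measure of integrated information Φ may differ): a complex is a subnetwork whose integration exceeds that of every strictly larger subnetwork containing it,

\[
S^{*} \text{ is a complex} \quad\Longleftrightarrow\quad \Phi(S^{*}) > \Phi(T) \ \text{ for all } T \supsetneq S^{*},
\]

so a brute-force search must in principle examine all \( 2^{N} \) subnetworks of an N-node network, which is the exponential cost referred to above; the proposed algorithms exploit structure in Φ to avoid this exhaustive enumeration.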

Nao Tsuchiya

Integrated information theory (IIT) gives quantitative predictions about the relationships between conscious experience, its neural basis and their informational structure. More specifically, based on phenomenological analysis, IIT claims that the quality of consciousness (e.g. the redness of red) correlates with the “shape” of the Integrated Information Structure (IIS; the Maximally Integrated Conceptual Structure of Oizumi et al. 2014), while the quantity of consciousness (e.g. awake vs. asleep) correlates with the system-level Integrated Information (the “big phi” of Oizumi et al. 2014). IIT prescribes methods to estimate both the IIS and the system-level II from our knowledge of the neural system. By resolving several issues associated with the computation of the IIS and the system-level II, we directly test these two IIT predictions on two sets of neural data. Our results suggest that the IIS correlates better with both the quantity and the quality of consciousness than the system-level II (big phi) does. We will also discuss several other candidate measures that may correlate better with both the quantity and quality of consciousness in empirical data.