Student profile: Mr Conor Finn



Thesis work

Thesis title: Identifying the synergistic information interactions of distributed computation in complex systems

Supervisors: Mikhail Prokopenko, Joseph Lizier

Thesis abstract:

My current research focuses on developing a multivariate information theory that would allow one to understand and quantify multivariate interdependence in complex systems.

One of the fundamental measures in information theory is the mutual information, which quantifies the pairwise interdependence between two variables. Complex systems consist of many interdependent components, and the behaviour of any one component cannot always be understood in terms of mere pairwise interactions with the others; yet surprisingly, there is no multivariate generalisation of the mutual information that provides a satisfactory decomposition of multivariate interdependence [Bertschinger et al., 2013; Griffith and Koch, 2014; Harder et al., 2013; Williams and Beer, 2010]. Information theory was not originally developed for the purpose of quantifying multivariate dependence, hence its failure to do so is hardly surprising. However, understanding and quantifying multivariate interdependence is of the utmost importance in complex systems science, particularly for the development of a mathematical theory of information storage, transfer and modification [Lizier, 2010], and hence, to meet these contemporary demands, Shannon's information theory must be extended.
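
For concreteness, the mutual information between two variables X and Y with joint distribution p(x, y) is the standard Shannon quantity

    I(X;Y) = \sum_{x,y} p(x,y) \log_2 \frac{p(x,y)}{p(x)\, p(y)},

which vanishes exactly when X and Y are independent.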

The kernel of the problem is that the current Shannon information measures conflate the concepts of redundancy and synergy. Redundancy is what the mutual information captures for two variables: the common or shared information between the variables due to their mutual interdependence. Synergy, on the other hand, is a purely multivariate effect which only appears when there are three or more variables. It may be the case that knowing the value of either one of two variables alone gives no information about the value of a third variable, yet knowing the values of the two variables together informs you as to the value of the third: this is synergy.

The conditional mutual information conflates the redundant interdependency between three or more variables with the synergistic interdependency between those variables. It is now clear that understanding multivariate interdependence, i.e. generalising the mutual information, requires quantifying these distinct modes of interdependence separately, which the conditional mutual information fails to do. Hence, we now seek an information decomposition for an arbitrary number of variables or components which quantifies all the distinct ways these components can interact. Indeed, this very idea has spawned a flurry of research [Bertschinger et al., 2014; Griffith et al., 2014; Olbrich et al., 2015; Perrone and Ay, 2016; Williams and Beer, 2010]. However, the problem remains unresolved due to the surprising difficulty of separating these effects in general. My current research focuses on working towards a resolution of this particular problem.
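
The canonical illustration of synergy is the XOR relationship described above. The following minimal sketch, in plain Python with function and variable names of my own choosing, estimates the relevant mutual informations from the four equally likely XOR outcomes:

    from collections import Counter
    from math import log2

    def mutual_information(pairs):
        # Estimate I(X;Y) in bits, treating the list of (x, y) outcomes
        # as equally likely samples from the joint distribution.
        n = len(pairs)
        p_xy = {xy: c / n for xy, c in Counter(pairs).items()}
        p_x = {x: c / n for x, c in Counter(x for x, _ in pairs).items()}
        p_y = {y: c / n for y, c in Counter(y for _, y in pairs).items()}
        return sum(p * log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items())

    # Y = X1 XOR X2, with X1 and X2 independent and uniformly distributed.
    outcomes = [(x1, x2, x1 ^ x2) for x1 in (0, 1) for x2 in (0, 1)]

    print(mutual_information([(x1, y) for x1, _, y in outcomes]))         # I(X1;Y)    = 0.0 bits
    print(mutual_information([(x2, y) for _, x2, y in outcomes]))         # I(X2;Y)    = 0.0 bits
    print(mutual_information([((x1, x2), y) for x1, x2, y in outcomes]))  # I(X1,X2;Y) = 1.0 bit

Neither input alone carries any information about the output, yet together the inputs determine the output completely: one full bit of purely synergistic information that no pairwise measure can attribute to either input individually.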

Note: This profile is for a student at the University of Sydney. Views presented here are not necessarily those of the University.