Complex Systems Seminars

Generalized Entropies and the Similarity of Texts

Date: Wednesday 21 February 2018
Time: 11:00am
Venue: Civil Eng J05, Conference Room 438

Presenter: Dr. Martin Gerlach, Northwestern University, USA.

Quantifying the similarity between written texts is a traditional problem with applications in, e.g., Information Retrieval. Information-theoretic measures provide a natural framework for comparing sequences of symbols (e.g. words). Here, we show how a generalization of the Jensen-Shannon divergence (based on a generalization of the Gibbs-Shannon entropy) provides new insights into the statistical properties of texts. In particular, we show that the heavy-tailed distribution of word frequencies (Zipf's law) has important consequences for the estimation and interpretation of these divergences. The large number of low-frequency words poses major difficulties for estimating the similarity in finite samples, leading to much larger than expected systematic and statistical errors. The spectrum of divergences allows for magnifying different scales in the distribution of word frequencies, yielding additional information on the similarity of texts. We show the practical significance of these results by i) quantifying the evolution of the English language over the last two centuries in millions of digitized books; and ii) investigating the organization and evolution of scientific fields in more than 21M scientific papers.
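As a rough illustration of the idea (a sketch only, not the exact estimators from the talk), the following builds a generalized Jensen-Shannon divergence from a Tsallis-type entropy of order alpha; alpha = 1 recovers the ordinary Shannon entropy and the standard Jensen-Shannon divergence. The function names and the particular entropy family are illustrative assumptions.

```python
import math

def entropy(p, alpha=1.0):
    """Generalized (Tsallis-type) entropy of order alpha; alpha = 1 gives Shannon."""
    if abs(alpha - 1.0) < 1e-12:
        return -sum(x * math.log(x) for x in p if x > 0)
    return (1.0 - sum(x ** alpha for x in p)) / (alpha - 1.0)

def gen_jsd(p, q, alpha=1.0):
    """Generalized Jensen-Shannon divergence between distributions p and q:
    entropy of the mixture minus the mean entropy of the parts."""
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return entropy(m, alpha) - (entropy(p, alpha) + entropy(q, alpha)) / 2
```

Varying alpha re-weights the contribution of rare versus frequent words, which is the "spectrum of divergences" idea: small alpha emphasizes the heavy tail of low-frequency words, large alpha the most common words.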

Computer-Assisted Risk Assessment of Hospital Infections: an Implementation in Polish Hospitals

Date: Monday 15 January 2018
Time: 10:30am
Venue: Civil Eng J05, Conference Room 438

Presenter: Mr. Andrzej Jarynowski, Moldova State University, Chișinău, Moldova.

We have created an intuitive and functional desktop application (free of charge, open-licensed) to support the work of hospital epidemiologists in preventing and containing the spread of infections in hospitals. We study the spread of hospital-associated infections through computer simulations on two layers of contact networks: organizational and empirical. The aim is to integrate a wide range of networks in which physical contact is a crucial factor. The goal of the application is to reconstruct the most likely paths of infection and to classify places and individuals into different risk groups.
We propose a multilevel approach to the heuristic analysis of the hospital's many layers via agent-based modelling. We have collected time-varying contact structures reconstructed from Polish hospitals:

1. The organizational layer is mapped via a set of questionnaires, CAD map integration, functional path annotation and on-site inspection. This is done mostly through surveys of medical staff via an interactive web application.
2. The empirical layer processes data from the register of patient admissions and discharges for each hospital unit (wards, clinics, etc.), microbiological laboratory test results and the medical staff register.

Epidemiological models are implemented on a temporal network of contacts, where each link can provide a path for the pathogen.
We analysed the spread of various alert pathogens. With simulated infection paths, we were able to compute network measures for patients. We obtained the risk of getting infected, based on a patient's incoming connections, and the risk of spreading infections, resulting from outgoing connections (both analogous to Google's PageRank algorithm). We are currently validating our ‘computer-assisted’ risk assessment against ‘human’ risk assessment in a prospective study.
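The in-risk and out-risk scores described above can be sketched with a plain power-iteration PageRank run once on the directed contact graph and once on its reverse. This is a hypothetical minimal version, not the study's actual pipeline; node names and the damping value are illustrative.

```python
def pagerank(edges, nodes, damping=0.85, iters=100):
    """Power-iteration PageRank on a directed graph given as (src, dst) edges."""
    out = {n: [] for n in nodes}
    for s, d in edges:
        out[s].append(d)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        nxt = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for s in nodes:
            if out[s]:
                share = damping * rank[s] / len(out[s])
                for d in out[s]:
                    nxt[d] += share
            else:  # dangling node: spread its rank uniformly
                for d in nodes:
                    nxt[d] += damping * rank[s] / len(nodes)
        rank = nxt
    return rank

# Toy contact network: patient C receives contacts from A and B.
nodes = ["A", "B", "C"]
contacts = [("A", "C"), ("B", "C"), ("A", "B")]
in_risk = pagerank(contacts, nodes)                        # risk of getting infected
out_risk = pagerank([(d, s) for s, d in contacts], nodes)  # risk of spreading
```

Running PageRank on the reversed graph simply swaps the roles of incoming and outgoing contacts, so the same routine yields both risk rankings.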

Thermodynamics of Computation

Date: Thursday 14 December 2017
Time: 10:00am
Venue: Civil Eng J05, Conference Room 438

Presenter: Prof. David Wolpert, Santa Fe Institute, Santa Fe, USA.

Local Information Decomposition Using the Specificity and Ambiguity Lattices

Date: Monday 6 November 2017
Time: 11am
Venue: Civil Eng J05, Lecture Theatre 3 (Level 3, Room 302)

Presenter: Mr Connor Finn, School of Civil Engineering, University of Sydney.

Multivariate information theory has long been problematic. Recently, the partial information decomposition (PID) of Williams and Beer has provided a promising axiomatic framework which clarifies the general structure of multivariate information. However, PID lacks the measure of redundant information required to complete the framework; despite much recent research, no well-accepted measure of redundant information has emerged that is applicable to more than two sources and respects the locality of information. In this paper, we introduce a new framework based upon the axiomatic approach taken in PID, but which aims to decompose multivariate information on a local or pointwise scale. It is shown that in order to identify when information from two sources is indeed the same information, one must consider decomposing the local mutual information into its two directed components, the specificity and the ambiguity. Following this axiomatic approach, we decompose these two components separately, resulting in two lattices - the specificity and ambiguity lattices. This Specificity and Ambiguity (SpAm) decomposition retains the appealing multivariate structure provided by PID, but applying this notion on a much more granular level enables the decomposition to identify when information is the same information. This last point is justified by providing an operational interpretation of redundancy in terms of Kelly gambling. Applying the decomposition to canonical examples from the PID literature demonstrates its unique ability to provide a local decomposition, and the fact that the SpAm decomposition possesses the much sought-after target chain rule. Finally, interpreting these results sheds light on why defining a redundancy measure for PID has proven so difficult - one lattice is not enough.
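As a pointwise illustration of the two directed components (a sketch only; the talk's lattice construction goes well beyond this), the local mutual information i(s;t) can be written as the difference between a specificity term h(s) and an ambiguity term h(s|t):

```python
import math

def local_decomposition(p_joint, s, t):
    """Split local mutual information i(s;t) into specificity h(s) and
    ambiguity h(s|t), so that i(s;t) = h(s) - h(s|t)."""
    p_s = sum(p for (x, y), p in p_joint.items() if x == s)
    p_t = sum(p for (x, y), p in p_joint.items() if y == t)
    p_st = p_joint[(s, t)]
    specificity = -math.log2(p_s)         # h(s): surprisal of the source event
    ambiguity = -math.log2(p_st / p_t)    # h(s|t): remaining uncertainty given t
    return specificity, ambiguity

# Uniform joint over a copied bit: S = T, each value with probability 1/2
joint = {(0, 0): 0.5, (1, 1): 0.5, (0, 1): 0.0, (1, 0): 0.0}
spec, amb = local_decomposition(joint, 0, 0)
# i(0;0) = spec - amb = 1 - 0 = 1 bit
```

The specificity is always non-negative, while the ambiguity can make i(s;t) negative (local misinformation), which is one reason the two components are decomposed separately.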

An interview-based study of pioneering experiences in teaching and learning Complex Systems in Higher Education

Date: Monday 30 October 2017
Time: 11am
Venue: Civil Eng J05, Lecture Theatre 1 (Level 2, Room 203)

Presenter: Dr Joseph Lizier, School of Civil Engineering, University of Sydney.

Abstract: Due to the interdisciplinary nature and reach of complex systems as a field, students undertaking courses in complex systems at university level have diverse backgrounds across physics, mathematics, computer science, engineering, biology, neuroscience, economics, the social sciences and the humanities. This brings challenges (e.g. diversity of skills, computer programming and analysis ability) but also opportunities (e.g. facilitating interdisciplinary interactions and projects, and applications that meet disciplinary needs) for the classroom. However, little has been published regarding how these challenges and opportunities are handled in teaching and learning Complex Systems as an explicit subject in higher education, and how this differs from other subject areas. We seek to explore these particular challenges and opportunities in more depth by examining the primary body of knowledge, which currently resides in the experience of pioneering teachers and learners in this space. We report an interview-based study of several such subjects (conducted amongst the authors) on their experiences, and a discussion and analysis comparing and contrasting those experiences. Our discussions explore: how curriculum design was approached, how theories/models/frameworks of teaching and learning informed decisions and experience, how diversity in student backgrounds was addressed, and assessment task design. We find a striking level of commonality in the issues expressed as well as the strategies to handle them. For example: there was a significant focus on problem- or activity-based learning; a focus on understanding and applying key principles, with technical analysis and programming implementation as a means to this end; and the use of major student-led creative projects for both achieving and assessing learning outcomes. While similar approaches to curriculum design (e.g. constructive alignment) were observed, curriculum content was the main area recognised as contested, since the field is still rapidly evolving; however, this can be interpreted as a strength of the field, in that it tightly knits research and teaching into one community.

Video - watch recording - requires internal USyd login.

What does a model of glucose-insulin regulation tell us about diabetes I and II

Date: Thursday 19 October 2017
Time: 2pm
Venue: Civil Eng J05, Conference Room 438

Presenter: Prof. Maia Angelova, School of Information Technology, Deakin University, Melbourne, Australia.

Abstract: We study the effect of diabetic deficiencies on the production of an oscillatory ultradian regime using a deterministic nonlinear model which incorporates two physiological delays. We show that insulin resistance impairs the production of oscillations by dampening the ultradian cycles. In order to study the effect on health, four strategies for restoring healthy regulation are explored. Through the introduction of an instantaneous glucose-dependent insulin response, explicit conditions for the existence of periodic solutions in the linearised model are formulated, significantly reducing the complexity of identifying an oscillatory regime. The model is thus shown to be suitable for representing the effect of diabetes on the oscillatory regulation. In particular, it may provide additional pathways for reintroducing a physiologically appropriate cyclic regulation and devising new regimes for a personalized treatment. Finally, in view of the recent efforts for the development of an artificial pancreas, these results open the way for more in-depth analysis of the underlying mechanisms which are most responsible for generating the oscillations.

Time-varying network approach in modelling social dynamics

Date: Wednesday 2 August 2017
Time: 11am
Venue: Civil Eng J05, Conference Room 438

Presenter: Dr Michele Starnini, Departament de Fisica Fonamental, Universitat de Barcelona, Spain.

Abstract: The temporal dimension of social systems is fundamental in shaping the topological properties of real social networks, and it deeply impacts the behaviour of dynamical processes running on top of them, such as epidemic spreading or information diffusion. In this talk I will give an overview of some recent works addressing the role of the temporal dimension in the modelling of social dynamics. First, I will focus on empirical data from face-to-face interaction networks, describing social interactions in human gatherings recorded in different experimental settings. I will present a simple model, based on the social attractiveness of individuals, that is able to quantitatively reproduce most statistical features of face-to-face networks. The role of network dynamics will also be illustrated by analysing the behaviour of simple diffusion processes, such as random walks, on top of empirical face-to-face networks. The case of social networks formed by multiple layers, representing different kinds of social interactions, is particularly interesting for showing the effects of temporal correlations between layers on coupled spreading processes. Finally, I will briefly present an analytical approach to modelling the dynamics of social networks that is able to incorporate the bursty nature of social interactions, a feature shared across several contexts of human dynamics. Our contributions to the modelling framework of time-varying social networks shed light on the dynamics of human interactions, and will be of interest to researchers in the broad field of time-evolving complex systems.
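A heavily simplified sketch of the core rule of such an attractiveness-based mobility model follows. Agent counts, box size, step length and the exact stopping rule are illustrative assumptions, not the model as presented in the talk.

```python
import random

def simulate(n=50, steps=200, box=10.0, radius=1.0, seed=1):
    """Toy attractiveness-driven mobility model: agents walk randomly in a
    periodic box; an agent with nearby neighbours stops with probability equal
    to the maximum attractiveness among them, otherwise it keeps moving.
    Returns the set of agent pairs that came into contact."""
    rng = random.Random(seed)
    attract = [rng.random() for _ in range(n)]
    pos = [(rng.uniform(0, box), rng.uniform(0, box)) for _ in range(n)]
    contacts = set()
    for _ in range(steps):
        for i in range(n):
            neigh = [j for j in range(n) if j != i and
                     (pos[i][0] - pos[j][0]) ** 2 +
                     (pos[i][1] - pos[j][1]) ** 2 < radius ** 2]
            for j in neigh:
                contacts.add((min(i, j), max(i, j)))
            # stay put with probability = max attractiveness of neighbours
            if not (neigh and rng.random() < max(attract[j] for j in neigh)):
                pos[i] = ((pos[i][0] + rng.uniform(-0.5, 0.5)) % box,
                          (pos[i][1] + rng.uniform(-0.5, 0.5)) % box)
    return contacts

edges = simulate()
```

Because attractive agents hold their neighbours in place longer, contact durations become heterogeneous, which is the mechanism the model uses to reproduce the broad distributions seen in empirical face-to-face data.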

Information Geometry and its Application to Complexity Theory

Date: Wednesday 19 April 2017
Time: 2pm
Venue: Carslaw Building, Access Grid Room, 8th floor

Presenter: Prof. Nihat Ay

Bio: Prof. Nihat Ay leads the Information Theory of Cognitive Systems research group at the Max Planck Institute for Mathematics in the Sciences, and is an Honorary Professor at the University of Leipzig and a Professor at the Santa Fe Institute.

Abstract: In the first part of my talk, I will review information-geometric structures and highlight the important role of divergences. I will present a novel approach to canonical divergences which extends the classical definition and recovers, in particular, the well-known Kullback-Leibler divergence and its relation to the Fisher-Rao metric and the Amari-Chentsov tensor.
Divergences also play an important role within a geometric approach to complexity. This approach is based on the general understanding that the complexity of a system can be quantified as the extent to which it is more than the sum of its parts. In the second part of my talk, I will motivate this approach and review corresponding work.
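One concrete instance of "more than the sum of its parts" is the multi-information (total correlation): the KL divergence of a joint distribution from the product of its marginals. The sketch below is a minimal illustration; the geometric complexity measures discussed in the talk are more general.

```python
import math
from itertools import product

def H(dist):
    """Shannon entropy (bits) of a probability dict."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def multi_information(p_joint, k):
    """Sum of marginal entropies minus the joint entropy: the KL divergence
    of the joint from the product of its marginals."""
    marginals = []
    for i in range(k):
        m = {}
        for x, p in p_joint.items():
            m[x[i]] = m.get(x[i], 0.0) + p
        marginals.append(m)
    return sum(H(m) for m in marginals) - H(p_joint)

# Two independent fair bits: the system is exactly the sum of its parts.
indep = {(a, b): 0.25 for a, b in product((0, 1), repeat=2)}
# Two perfectly correlated bits: one bit of multi-information.
corr = {(0, 0): 0.5, (1, 1): 0.5}
```

Geometrically, the product distributions form a submanifold, and the multi-information is the divergence from the joint to that submanifold, which is what makes divergences central to this approach to complexity.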

Mathematical Aspects of Embodied Intelligence

Date: Wednesday 12 April 2017
Time: 10:30am - 12pm
Venue: Civil Eng J05 Conf Room 438

Presenter: Prof. Nihat Ay

Bio: Prof. Nihat Ay leads the Information Theory of Cognitive Systems research group at the Max Planck Institute for Mathematics in the Sciences, and is an Honorary Professor at the University of Leipzig and a Professor at the Santa Fe Institute.

Abstract: I will present recent results on the design of embodied systems with concise control architectures, formalising the notion of "cheap design" within the field of embodied intelligence. This notion highlights the fact that high behavioural complexity, seen from the external observer perspective, does not necessarily imply high control complexity. This complexity gap is a result of two different frames of reference, which is closely related to Uexküll's Umwelt concept. If time allows, I will present a measure-theoretic formalisation of this concept and discuss its implications.

Percolation, Cascades, and Control of Networks

Date: Monday 3 April 2017
Time: 10:30am - 12pm
Venue: Civil Eng J05 Conf Room 438

Presenter: Prof. Raissa D'Souza, University of California, Davis

Bio: Raissa D'Souza is Professor of Computer Science and of Mechanical Engineering at the University of California, Davis, as well as an External Professor at the Santa Fe Institute. She received a PhD in Statistical Physics from MIT in 1999, then was a postdoctoral fellow, first in Fundamental Mathematics and Theoretical Physics at Bell Laboratories, and then in the Theory Group at Microsoft Research. Her interdisciplinary work on network theory spans the fields of statistical physics, theoretical computer science and applied math, and has appeared in journals such as Science, PNAS, and Physical Review Letters. She is a Fellow of the American Physical Society, serves on the editorial board of numerous international mathematics and physics journals, has organized key scientific meetings like NetSci 2014, was a member of the World Economic Forum's Global Agenda Council on Complex Systems, and is currently the President of the Network Science Society.

Abstract: Networks are at the core of modern society, spanning physical, biological and social systems. Each distinct network is typically a complex system, shaped by the collective action of individual agents and displaying emergent behaviors. Moreover, collections of these complex networks often interact and depend upon one another, which can lead to unanticipated consequences such as cascading failures and novel phase transitions. Simple mathematical models of networks, grounded in techniques from statistical physics, can provide important insights into such phenomena. Here we will cover several such models, beginning with control of phase transitions in an individual network and the novel classes of percolation phase transitions that result from repeated, small interventions intended to delay the transition. We will then move on to modeling phenomena in coupled networks, including cascading failures, catastrophe-hopping and optimal interdependence.

A Framework and Language for Complex Adaptive System Modeling and Simulation

Date: Wednesday 7 December 2016
Time: 10:30am
Venue: Building J05 Civil Engineering, Conference Room 438

Presenter: Lachlan Birdsey and Dr Claudia Szabo from the Centre for Distributed and Intelligent Technologies – Complex Systems Program at The University of Adelaide.

Abstract: Complex adaptive systems (CAS) exhibit properties beyond those of complex systems, such as self-organization, adaptability and modularity. Designing models of CAS is typically a non-trivial task, as many components are made up of sub-components and rely on a large number of complex interactions. Studying features of these models also requires specific work for each system.
Moreover, running these models as simulations with a large number of entities requires a large amount of processing power.
We propose a language, Complex Adaptive Systems Language (CASL), and a framework to handle these issues. In particular, an extension to CASL that introduces the concept of 'semantic grouping' allows large-scale simulations to execute on relatively modest hardware. A component of our framework, the observation module, aims to provide an extensible and expandable set of metrics to study key features of CAS, such as aggregation, adaptability, and modularity, while also allowing for more domain-specific techniques.

Modeling neuromorphic devices: structure and dynamics of atomic switch networks

Date: Thursday 27 October 2016
Time: 2pm
Venue: Room 4020, Sydney Nanoscience Hub (SNH), Research Wing Level 4

Presenter: Ido Marcus, presenting preliminary results from a pilot project on neuromorphic devices.

Abstract: Self-assembled networks of silver nanowires are next-generation neuromorphic devices. Empirical results show that their collective dynamics share features with the activity of brain tissue. These features include criticality, plasticity and hysteretic behavior akin to memory.
In order to understand the behavior of these neuromorphic chips and the extent of their similarity to biological networks, we have developed a model of their structure and dynamics. Nonlinearity is introduced in the form of voltage-dependent switching of the junctions (i.e., atomic switches) formed at the intersections of nanowires.
Preliminary results show that our model of atomic switch networks (i) reproduces the collective dynamics measured in experiments; (ii) enables us to study the microscopic origin of these collective dynamics; and (iii) has dynamics consistent with criticality. Lastly, our results suggest that these systems are interesting in their own right, and could be used to model the dynamics of different complex systems.

The brain as its own decoder: predicting behavior from the structure of brain representations

Date: September 2016
Time: 2pm
Venue: Lecture Theatre 2, Physics Building (A28)

Presenter: Associate Professor Thomas A Carlson (ARC Future Fellow, School of Psychology, University of Sydney).

Summary: Multivariate pattern analysis has become an important tool for measuring “information” in the brain. An important question for the future is whether or not the information measured by neuroscientists is actually utilized by the brain for behaviour; e.g. the retina contains a complete record of the visual world, but the brain does not directly access its content. One potentially fruitful approach to this question is to construct models of how information is “read out” from brain representations, and then test whether the model’s output can predict behaviour. In this talk, I will discuss a simple “read out” model our lab has been developing. We have shown that this model can predict reaction time (RT) behaviour for object categorisation based on recordings of neural activity in human inferior temporal cortex measured using fMRI. Using the same model with MEG data, we found evidence that the brain “reads out” object category information from the optimal brain state. Our most recent work has utilized single-unit recordings from rhesus macaque ITC to test the limits of the model. Our analysis of these recordings has identified several limitations of the model, which represent general challenges for future models of “read out”.

Chaos and Synchronisation in Large Systems of Interacting Particles with Random Connections

Date: Monday 5 September 2016
Time: 2pm
Venue: Carslaw 829

Presenter: James MacLaurin (School of Physics, University of Sydney).

Abstract: I study systems of interacting particles indexed by a lattice. These have many applications, but in this talk I focus on applications in neuroscience. The particles are subject to white noise (Brownian motion), with random connections sampled from a probability distribution that is invariant under translations of the lattice. I study the limiting behaviour of system-wide averages (the 'empirical measure') as the size of the network asymptotes to infinity. I find a variety of different behaviours under different scalings of the connection strength. When the connection strength is scaled by N^{-1/2} (N being the network size), the system becomes non-Markovian (meaning that the dynamics is influenced by the entire past). When the connection strength is unscaled but decays spatially, one obtains spatial correlations in the infinite size limit. I also find conditions under which particles with connections from an Erdős-Rényi random graph synchronize their oscillations in the large time limit.

Top Down Causation as the basis of the emergence of complexity

Date: Monday 22 August 2016
Time: 11am
Venue: School of IT Lecture Theatre 123

Presenter: Prof. George Ellis FRS, Emeritus Professor of Applied Mathematics and Senior Research Scholar in the Mathematics Department, University of Cape Town (South Africa).

Abstract: The expanding universe context makes clear that genuine emergence must occur. Because this occurs through adaptive processes, it is only possible through a combination of bottom up and top down (contextual) processes. A key feature in making this possible is multiple realisability of higher level structures and processes at lower levels. Arguments against top-down causation, such as claims of causal closure at lower levels and the concept of supervenience, will be answered. Five essentially different types of top-down effects can be identified; in the case of both digital computers and the mind, they enable the causal effectiveness of non-physical entities such as computer programs, algorithms, theories, plans, and social agreements.

Information decompositions

Date: Monday 11 April 2016
Time: 3pm
Venue: Peter Nicol Russell (PNR) building Lecture Theatre 1 (Farrell lecture theatre).

Presenter: Prof. Jürgen Jost, Director, Max Planck Institute for Mathematics in the Sciences, Leipzig; External Professor, Santa Fe Institute.

Abstract: When several inputs contribute to an output, it is natural to ask about their respective contributions: what information each of the inputs contributes uniquely to the output, what they possess in common, and what is complementary. Recently, several proposals have been put forward for how to quantify these contributions, and I will present the version developed in our group.

Video - watch recording - requires internal USyd login.

Information theoretic analyses of neural goal functions

Date: Friday 1 April 2016
Time: 11am
Venue: School of IT Lecture Theatre (Level 1)

Presenter: Prof. Michael Wibral, head of the Magnetoencephalography (MEG) Unit at the Brain Imaging Center (BIC), Goethe University Frankfurt.

Summary: In many neural systems, anatomical motifs are present repeatedly, but despite their structural similarity they can serve very different tasks. A prime example of such a motif is the canonical microcircuit of six-layered neo-cortex, which is repeated across cortical areas and is involved in a number of different tasks (e.g. sensory, cognitive, or motor tasks). This observation has spawned interest in finding a common underlying principle, a ‘goal function’, of information processing implemented in this structure. By definition such a goal function, if universal, cannot be cast in processing-domain-specific language (e.g. ‘edge filtering’, ‘working memory’). Thus, to formulate such a principle, we have to use a domain-independent framework. Information theory offers such a framework. However, while the classical framework of information theory focuses on the relation between one input and one output (Shannon’s mutual information), we argue that neural information processing crucially depends on the combination of multiple inputs to create the output of a processor. To account for this, we use a very recent extension of Shannon information theory, called partial information decomposition (PID). PID allows us to quantify the information that several inputs provide individually (unique information), redundantly (shared information) or only jointly (synergistic information) about the output. In my presentation I demonstrate how to leverage information theory in general, and PID in particular, for a better understanding of the principles of neural information processing.
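The need to go beyond classical mutual information can be seen in the XOR example: each input alone carries zero mutual information about the output, yet together they determine it completely, i.e. the information is purely synergistic. The sketch below uses only classical quantities to exhibit this gap (PID itself additionally requires a redundancy measure).

```python
import math
from itertools import product

def marginal(joint, idxs):
    """Marginal distribution over the given coordinate indices."""
    m = {}
    for k, p in joint.items():
        key = tuple(k[i] for i in idxs)
        m[key] = m.get(key, 0.0) + p
    return m

def mi(joint, a_idx, b_idx):
    """Mutual information (bits) between the variable groups a_idx and b_idx."""
    pa = marginal(joint, a_idx)
    pb = marginal(joint, b_idx)
    pab = marginal(joint, a_idx + b_idx)
    return sum(p * math.log2(p / (pa[k[:len(a_idx)]] * pb[k[len(a_idx):]]))
               for k, p in pab.items() if p > 0)

# Y = X1 XOR X2 with uniform inputs; tuples are (x1, x2, y)
joint = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}
# Each input alone tells us nothing about the output (0 bits)...
single = mi(joint, [0], [2])
# ...but together they determine it completely (1 bit): pure synergy.
together = mi(joint, [0, 1], [2])
```

PID assigns the full bit of I(X1,X2;Y) here to the synergistic term, with zero unique and zero shared information, which no combination of pairwise mutual informations can express.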

Video - watch recording - requires internal USyd login.