Seminars

The Non-linearity and Complexity Research Group runs a series of regular seminars, usually held on Friday afternoons at 1pm in the Mathematics Common Room (MB310) at Aston University.

The seminars cover a range of topics, from theoretical issues to practical applications.

People from outside the University are welcome to attend the seminar series. Details of individual seminars can be found here.

If you would like to give a talk, or just need more information about seminars, please contact:

Dr Otti D'Huys
Lecturer in Mathematics
System Analytics Research Institute
Aston University
Birmingham B4 7ET, UK
email: o.dhuys@aston.ac.uk

Seminar Calendar 2019

Bayesian modelling has in recent years become prevalent in scientific data analysis. In this talk I introduce Bayesian hierarchical modelling as a tool to analyse supernova cosmology. I compare this to the standard statistical analysis and shed light on theoretical drawbacks of the classical approach. I apply this to real supernova observations and compare the cosmological conclusions from the two methods. Combining supernovae with cosmic microwave background data, I obtain Ω_m = 0.399 ± 0.027, which is 2.8σ higher than previously reported using the standard analysis.
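As a toy illustration of why hierarchical (shrinkage) estimates can outperform independent classical fits, consider many noisy measurements of latent quantities drawn from a common population. This sketch is generic and all numbers are invented; it is not the supernova analysis itself:

```python
import numpy as np

# Toy hierarchical model: latent per-object quantities mu_i ~ N(0, tau^2),
# noisy observations obs_i = mu_i + N(0, sigma^2).
rng = np.random.default_rng(42)
n, tau, sigma = 500, 1.0, 2.0
mu_true = rng.normal(0.0, tau, n)           # latent quantities
obs = mu_true + rng.normal(0.0, sigma, n)   # noisy per-object measurements

# Classical per-object estimate: just the observation itself.
# Hierarchical posterior mean: shrink each observation toward the
# population mean (0 here) by the known variance ratio.
shrink = tau**2 / (tau**2 + sigma**2)
post = shrink * obs

mse_mle = np.mean((obs - mu_true) ** 2)
mse_bayes = np.mean((post - mu_true) ** 2)
print(mse_mle, mse_bayes)   # shrinkage gives a smaller mean-square error
```

The same pooling-of-information effect is what a full hierarchical analysis exploits, with the population-level parameters inferred rather than assumed known.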

The vast majority of network data sets contain errors and omissions, although this fact is rarely incorporated into traditional network analysis. Recently, an increasing effort has been made to fill this methodological gap by developing network-reconstruction approaches based on Bayesian inference. These approaches, however, rely on assumptions of uniform error rates and on direct estimation of the existence of each edge via repeated measurements, something that is currently unavailable for the majority of network data. Here, we develop a Bayesian reconstruction approach that lifts these limitations by allowing not only for heterogeneous errors, but also for single-edge measurements without direct error estimates. Our approach works by coupling the inference with structured generative network models, which enable the correlations between edges to be used as reliable uncertainty estimates. Although our approach is general, we focus on the stochastic block model as the basic generative process, from which efficient nonparametric inference can be performed, yielding a principled method to infer hierarchical community structure from noisy data. We demonstrate the efficacy of our approach with a variety of empirical and artificial networks.
With the rapid development of communication and computer engineering, many dynamic systems are connected by the internet or other communication networks. There are many challenges and opportunities in control and other engineering applications of network-connected dynamic systems. This talk will focus on consensus control, a kind of distributed control, of multi-agent systems (MAS), with some details about the basic theory and control algorithms. The talk will start with consensus control of single integrators, and then move on to consensus control of nonlinear systems, including details of key technical results and their proofs. In the later part of the talk, an introduction is given to distributed optimization based on MAS, viewed as a topic closely related to consensus control. Some basic distributed optimization algorithms are described, and applications to smart grids and power systems are demonstrated.
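The single-integrator case can be sketched in a few lines: each agent moves toward its neighbours, so the stacked dynamics are x' = -Lx with L the graph Laplacian, and over a connected undirected graph all states converge to the average of the initial states. The four-node path graph, initial states and step size below are illustrative assumptions:

```python
import numpy as np

# Undirected path graph 0-1-2-3 (illustrative choice)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A        # graph Laplacian

x = np.array([4.0, 0.0, -2.0, 6.0])   # initial agent states (average = 2.0)
dt = 0.05
for _ in range(2000):
    x = x + dt * (-L @ x)             # xdot_i = sum_j a_ij (x_j - x_i)

print(x)                              # all entries approach 2.0
```

Because 1^T L = 0 for an undirected graph, the average is conserved exactly, which is why the consensus value is the initial average.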
We examine the heterogeneous responses of individual nodes in sparse networks to the random removal of a fraction of edges. Using the message-passing formulation of percolation, we discover considerable variation across the network in the probability of a particular node to remain part of the giant component, and in the expected size of small clusters containing that node. In the vicinity of the percolation threshold, weakly non-linear analysis reveals that node-to-node heterogeneity is captured by the recently introduced notion of non-backtracking centrality. We supplement these results for fixed finite networks by a population dynamics approach to analyse random graph models in the infinite system size limit, also providing closed-form approximations for the large mean degree limit of Erdős–Rényi random graphs. Interpreted in terms of the application of percolation to real-world processes, our results shed light on the heterogeneous exposure of different nodes to cascading failures, epidemic spread, and information flow.
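A minimal sketch of the message-passing (cavity) equations for bond percolation on a fixed graph: the message on each directed edge is updated as H_{i→j} = 1 - p + p ∏_{k∈∂i\j} H_{k→i}, and the probability that node i belongs to the giant component is S_i = 1 - ∏_{k∈∂i} H_{k→i}. The graph (K5) and occupation probability p are illustrative assumptions; the node-to-node heterogeneity discussed in the talk would show up on non-regular graphs:

```python
import numpy as np

def percolation_mp(A, p, iters=500):
    """Message passing for bond percolation with occupation probability p.

    H[(i, j)]: probability that the edge leading from j into i fails to
    connect j to the giant component (either unoccupied, or occupied but
    leading only to a finite cluster)."""
    n = len(A)
    # start away from the trivial fixed point H = 1
    H = {(i, j): 0.5 for i in range(n) for j in range(n) if A[i][j]}
    for _ in range(iters):
        H = {(i, j): 1 - p + p * np.prod([H[(k, i)] for k in range(n)
                                          if A[k][i] and k != j])
             for (i, j) in H}
    # S[i]: probability that node i belongs to the giant component
    return [1 - np.prod([H[(k, i)] for k in range(n) if A[k][i]])
            for i in range(n)]

A = [[0 if i == j else 1 for j in range(5)] for i in range(5)]  # K5
S = percolation_mp(A, p=0.6)
print(S)
```

On K5 all nodes are equivalent, so the messages settle on the non-trivial root of H = 1 - p + pH^3, and all S_i coincide.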
Kuramoto–Sakaguchi-type models are probably the simplest and most generic approach to investigating phase-coupled oscillators. Particular partially synchronised solutions, so-called chimera states, have recently received a great deal of attention. Dynamical behaviour of this type will be discussed in the context of time-delay dynamics caused by a finite propagation speed of signals.
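A minimal Kuramoto–Sakaguchi simulation, written in mean-field form, illustrates the basic model dθ_i/dt = ω + (K/N) Σ_j sin(θ_j - θ_i - α). All parameter values are illustrative assumptions; identical, globally coupled oscillators simply phase-lock for |α| < π/2, whereas chimera states require, e.g., non-local coupling:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, alpha, omega, dt = 50, 1.0, 0.3, 1.0, 0.01
theta = rng.uniform(0, 2 * np.pi, N)          # random initial phases

for _ in range(20000):
    z = np.mean(np.exp(1j * theta))           # Kuramoto order parameter r*e^{i psi}
    # mean-field identity: (K/N) sum_j sin(theta_j - theta_i - alpha)
    #                    = K * r * sin(psi - theta_i - alpha)
    theta += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - theta - alpha))

r = abs(np.mean(np.exp(1j * theta)))
print(r)                                       # close to 1: phase-locked state
```

The mean-field rewriting reduces the O(N²) coupling sum to O(N) per step, which is the standard trick for large Kuramoto simulations.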
Non-locality is the most important pillar of quantum mechanics. I will discuss the early history of this concept, some present-day manifestations, and a bird’s eye view of its relation to quantum information processing platforms. 

We study the stability of multiple conducting edge states in a topological insulator against all multi-particle perturbations allowed by time-reversal symmetry. We model the system as a multi-channel Luttinger liquid, where the number of channels equals the number of Kramers doublets at the edge. We show that in a clean system with N Kramers doublets there always exist relevant perturbations (either of superconducting or charge density wave character) which open N-1 gaps. In the charge density wave regime, N-1 edge states become localised. The single remaining gapless mode describes the sliding of a 'Wigner crystal'-like structure. Disorder introduces multi-particle backscattering processes. While single-particle backscattering turns out to be irrelevant, the two-particle process may localise this mode, which is gapless in the translation-invariant system. Our main result is that an interacting system with N Kramers doublets at the edge may be either a trivial insulator or a topological insulator for N=1 or 2, depending on the density-density repulsion parameters, whereas any higher number N>2 of doublets gets fully localised by disorder pinning, irrespective of the parity issue.

The discovery of the Fractional Quantum Hall Effect in 1982 triggered considerable theoretical debate on the possibility of fractional quantization of conductance in the absence of Landau levels formed by a quantizing magnetic field. Various situations have been theoretically envisaged, particularly lattice models in which band flattening resembles Landau levels; such models resemble the Composite Fermion model suggested by Jain in which flux quanta act as lattice sites [1-4]. The discovery of quantisation of conductance in the absence of magnetic field showed conductance plateaux appearing at values of 2ne²/h (where n=1,2,3…) [5-6]. However, no report existed of the fractional quantisation until the group at UCL showed experimentally the first observation of fractional conductance in a hole-based quasi-1D quantum wire in Germanium, corresponding to charge values of e/2 and e/4 [7]. We have now shown a wide variety of non-magnetic fractional quantum states in electrons in GaAs [8].

In this presentation, I will show that a rich mix of fractions can be observed, and manipulated, in the absence of a quantizing magnetic field, when a low-density electron system in a GaAs-based quasi-1D quantum wire is allowed to relax in the second dimension. The behaviour has been observed for both symmetric and asymmetric confinement, but increasing the asymmetry of the confinement potential, resulting in a flattening of confinement, enhances the appearance of new fractional states. The new quantum states found have implications both for the physics of low-dimensional electron systems and for quantum technologies.

The representation of complex systems as networks of interacting parts has provided insight into the workings of systems as diverse as on-line social networks, global trade, technological networks, the brain and the financial system. However, the use of networks relies on the simplifying assumption that all of the information crucial to understanding how the parts of a system interact can be encoded in a set of pairwise relations between them (i.e. as nodes and links). This assumption fails to recognise that for some systems, interactions can intrinsically involve more than two nodes.

Simplicial complexes are a generalisation of networks that are able to encode these many-body interactions between more than two nodes. In this talk, I will discuss two ensembles of simplicial complexes: the configuration model and canonical ensemble of simplicial complexes. These ensembles are ‘maximum entropy’ with respect to hard or soft constraints on the generalised degrees of the nodes. As such they are suitable as null models, and may be useful for detecting interesting mesoscopic structure in real simplicial complexes.

Our work includes a full statistical mechanical characterisation of the two ensembles. In particular we note that in statistical mechanics terminology the two ensembles are ‘conjugated’. Using this fact we calculate the entropies of the two ensembles, and show that they are not asymptotically equivalent in the large system limit, in contradiction to the received wisdom about conjugated ensembles. We also show how the configuration model can be explored numerically, using a ‘stub-matching’  algorithm that generates d-dimensional simplicial complexes with prescribed generalised degrees.
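A naive rejection-sampling sketch of stub-matching for d = 2: each node i receives as many triangle-stubs as its generalised degree g_i, and stubs are matched in triples into 2-simplices, restarting whenever a triple is degenerate (repeated node) or duplicates an existing triangle. The degree sequence below is an illustrative assumption; its sum must be divisible by 3:

```python
import random

def simplicial_configuration(g, seed=0, max_tries=10000):
    """Match triangle-stubs into 2-simplices with generalised degrees g."""
    rng = random.Random(seed)
    for _ in range(max_tries):
        stubs = [i for i, gi in enumerate(g) for _ in range(gi)]
        rng.shuffle(stubs)
        triangles = set()
        ok = True
        for j in range(0, len(stubs), 3):
            tri = frozenset(stubs[j:j + 3])
            if len(tri) < 3 or tri in triangles:   # degenerate or duplicate
                ok = False
                break
            triangles.add(tri)
        if ok:
            return triangles
    raise RuntimeError("no valid matching found")

g = [2, 2, 2, 1, 1, 1]   # generalised degrees; sum = 9 triangles stubs
tris = simplicial_configuration(g)
print(sorted(tuple(sorted(t)) for t in tris))
```

Restart-on-failure sampling is only practical for small instances; it is meant to convey the idea, not to sample the ensemble uniformly at scale.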

For more than 15 years, society has been introducing electronic technologies into everyday life. Building trust in the online environment is key to economic and social development. A lack of trust makes consumers, businesses and administrations hesitate to carry out transactions electronically and to adopt new services. Moreover, the solution of these issues and the reliable operation of such systems is one of the foundations of state cyber security.

Progress in the field of quantum computing is an important challenge for the modern information world and for cryptography. The rapid evolution of quantum computers, and the resulting growth in computational speed, creates new risks for modern information systems. The first part of the presentation will summarise the international requirements and specifications for potential new algorithms, as well as their conditions of use.

Neither the successful implementation of modern technologies of electronic governance nor electronic trust services is possible without the creation of an appropriate infrastructure: a Public Key Infrastructure (PKI). The presentation outlines the main development principles of the existing public key infrastructure and describes problems related to the functioning of such a system. The possibility of securely using an alternative trust model (i.e. a trust model centred around the user) is shown, and, as a result, a new concept for public key infrastructure development based on blockchain technology is proposed, which avoids the weaknesses and shortcomings of a hierarchical architecture.

Many physical, biological and engineering processes can be represented mathematically by models of coupled systems with time delays. Time delays in such systems are often either hard to measure accurately, or they are changing over time, so it is more realistic to take time delays from a particular distribution rather than to assume them to be constant. In this talk, I will show how distributed time delays affect the stability of solutions in systems of coupled oscillators. Furthermore, I will present a system with distributed delays and Gaussian noise, and illustrate how to calculate the optimal path to escape from the basin of attraction of the stable steady state, as well as how the distribution of time delays influences the rate of escape away from the stable steady state. Throughout the talk, analytical calculations will be supported by numerical simulations to illustrate possible dynamical regimes and processes.
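One standard way to handle a distributed delay numerically, when the delay kernel is a Gamma distribution, is the "linear chain trick", which converts the delayed term into a short chain of ODEs. The kernel order, feedback gain and parameter values below are illustrative assumptions, not the systems studied in the talk:

```python
# Distributed-delay system  x'(t) = -x(t) + c * (g * x)(t)
# with Gamma kernel g(t) = a^2 t e^{-a t}.  The linear chain trick makes
# this equivalent to the ODE system
#   x'  = -x + c*y2,   y1' = a*(x - y1),   y2' = a*(y1 - y2),
# where y2 is exactly the distributed-delay term.
a, c, dt = 2.0, 0.5, 0.001
x, y1, y2 = 1.0, 0.0, 0.0          # history collapsed into chain variables

for _ in range(20000):             # integrate to t = 20 with forward Euler
    x, y1, y2 = (x + dt * (-x + c * y2),
                 y1 + dt * a * (x - y1),
                 y2 + dt * a * (y1 - y2))

print(x)                           # decays toward 0: origin stable for c < 1
```

The trick trades an infinite-dimensional delay state for a few extra ODE variables, which also makes linear stability analysis a finite eigenvalue problem.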

Networks of pulse-coupled oscillators are widely studied as models for numerous systems such as spiking neurons, communicating fireflies, impacting mechanical oscillators, electronic oscillators and optical systems. In realistic networks of various physical nature, pulses propagate with finite speed, leading to nonzero coupling delays. The focus of my talk is the collective dynamics of oscillatory networks with pulse delayed coupling, particularly the scenarios of destabilization of regular spiking. By regular spiking I mean the regimes in which the oscillators produce spikes periodically with constant inter-spike intervals, for example in global synchrony. I will show that destabilization of regular spiking is determined by the slope of the phase response curve (PRC) characterizing the oscillators' interaction. I will describe two different destabilization scenarios. The first is the so-called multi-jitter bifurcation, which takes place for PRCs with large negative slopes. It leads to the emergence of so-called jittering regimes characterized by unequal inter-spike intervals. These intervals form various complex sequences, giving rise to multiple long-period regimes and extreme multistability. The second scenario is observed for positive PRC slopes and relates to a homoclinic bifurcation of the regular spiking solution. Regular spiking destabilizes through so-called phase slip patterns, which manifest themselves as a repetitive process in which oscillators leave the synchronized cluster and then return. I will present a comprehensive analytical investigation of both scenarios.


Interacting neural networks are used to model US Appellate Court three-judge panels. Agents, whose initial states have three contributions derived from common knowledge of the law, political affiliation and personality, learn by exchanging opinions, updating their state and their trust in other agents. The model replicates data patterns only if the agents initially trust each other and are certain about their trust independently of party affiliation, showing evidence of ideological voting, dampening and amplification. Absence of the law or party contribution destroys the theoretical-empirical agreement. We identify quantitative signatures for different levels of the law, ideological or idiosyncratic contributions.

The existing hard thresholding methods may cause a dramatic increase and numerical oscillation of the residual. This inherent drawback renders the algorithms unstable and generally inefficient for solving practical compressed sensing problems. How to develop an efficient thresholding technique has therefore become a fundamental question in this area. In this presentation, the notion of optimal k-thresholding will be introduced and a new thresholding technique going beyond the existing framework will be discussed. This leads to a natural design principle for efficient thresholding algorithms. It turns out that the theoretical performance of the proposed optimal thresholding algorithms is guaranteed under a standard condition. Numerical experiments demonstrate that the traditional hard-thresholding algorithms are significantly outperformed by the proposed algorithms, which may also outperform the l1-minimization method in sparse signal recovery.
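For context, classical iterative hard thresholding (the baseline that optimal k-thresholding aims to improve on) iterates x ← H_k(x + μ Aᵀ(y - Ax)), where H_k keeps the k largest-magnitude entries. The dimensions, seed and step size below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 80, 200, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                                  # noiseless measurements

def hard_threshold(v, k):
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]            # k largest-magnitude entries
    out[idx] = v[idx]
    return out

mu = 0.9 / np.linalg.norm(A, 2) ** 2            # step size: mu*||A||^2 < 1
x = np.zeros(n)
for _ in range(3000):
    x = hard_threshold(x + mu * A.T @ (y - A @ x), k)

print(np.linalg.norm(x - x_true))               # typically small for Gaussian A
```

With μ‖A‖² < 1 the residual ‖y - Ax‖ is non-increasing along the iterations; the residual oscillations criticised in the abstract arise when more aggressive step sizes are used.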

Re-entrant spiral and scroll waves are observed in a huge variety of dissipative excitable and oscillatory media. In the last 50 years, these dissipative vortices have gained ever-increasing interest as regimes of self-organised synchronisation and transition to chaos, with the most important motivation being medical applications, e.g. better control of the cardiac re-entry underlying dangerous arrhythmias and fatal fibrillation.

In the simplest 2D case, a spiral wave rotates around a rotation centre R with angular velocity omega. A 3D vortex rotates around an "organising filament". Thus, a homogeneous system spontaneously divides into the core, defined by the centre of rotation in 2D or by the organising filament in 3D, and the periphery, synchronised by signals from the core; the location of the core is determined by initial conditions, not by properties of the medium. In the presence of a small perturbation, the vortex preserves its pattern and slowly changes its frequency and the location of its core. Although the regime appears non-localised, because it fills up and synchronises all available space, the vortex behaves as a localised object, sensitive only to perturbations affecting the core. This macroscopic dissipative wave-particle duality is due to the localisation of the vortex's Response Functions (RFs) in the immediate vicinity of the core. Knowledge of the response functions allows quantitative prediction of spiral waves' drift due to small perturbations of any nature, which makes the RFs as fundamental a characteristic for spiral waves as mass is for matter.

The growth of ageing populations globally poses a number of challenges to families, communities and societies. Most countries face the problem of ensuring that their health systems are ready to adapt to the demographic shift. Assistive technologies for elderly people are emerging as one of the most promising approaches to addressing the challenges of an ageing society. The aim is to offer alternative solutions that help maintain or improve the quality of life of elderly people. One of the key research issues is to innovate and provide more accurate, unobtrusive, low-cost and practical assistive technologies for the elderly, helping them to live independently at home for as long as possible.

In this talk, I will first outline the challenges faced by an ageing society, and then present related current research and future prospects of ICT-enabled health care. In particular, I will concentrate on multi-sensor activity recognition systems for elderly people in assisted living, from both theory and practice perspectives. These recognition systems aim to simultaneously monitor the elderly's specific activities and their long-term behaviour patterns, supporting a healthy lifestyle and enabling the elderly to live alone safely. Finally, I will briefly discuss current issues and challenges concerning combination forecasting models, and future research directions in this area.

I am a statistician, but have made significant contributions to engineering and physics. I will talk about some of the contributions I made to physics. The talk will be about certain models for network degrees [1], word frequencies [2], city sizes [3] and impact factors [4]. 

References:

[1] Chu, J. and Nadarajah, S., Physica A: Statistical Mechanics and its Applications 490, 869-885 (2018)
[2] Wiegand, M., Nadarajah, S. and Si, Y., Physics Letters A 382 (9), 621-632 (2018)
[3] Kwong, H.K. and Nadarajah, S., Physica A: Statistical Mechanics and its Applications 513, 55-62 (2019)
[4] Okorie, I.E. and Nadarajah, S., Physica A: Statistical Mechanics and its Applications 491, 209-218 (2018)

Data are more ubiquitous and richer than ever before, and the volume of such data creates great challenges in handling, visualizing and analysing them. Data analytics is a new interdisciplinary field between statistics and computer science, employing advanced data analysis tools to detect potential relationships in the data in the absence of any hypothesis or prior knowledge. These can lead to informed decisions, or novel results. Disease control strategies can have both intended and unintended effects on the dynamics of infectious diseases. Routine testing for the harmful pathogen bovine tuberculosis (bTB) was suspended briefly during the foot and mouth disease epidemic of 2001 in Great Britain. We use statistical analysis of spatio-temporal bTB incidence data and computational models to demonstrate how a lapse in management can alter epidemiological parameters, including the rate of new infections and the duration of infection cycles. The testing interruption shifted the dynamics from annual to 4-year cycles, and created long-lasting shifts in the spatial synchrony of new infections among regions of Great Britain. After annual testing was introduced in some GB regions, new infections became more de-synchronised, a result also confirmed by a stochastic model. These results demonstrate that abrupt events can synchronise disease dynamics and that changes in the epidemiological parameters can lead to chaotic patterns, which are hard to quantify, predict and control.

Deterministic and stochastic models of interacting populations, in discrete and continuous time, are mainly concerned with the determination of steady/stationary states and conditions for their stability for two interacting species with a given functional response. Loss of stability and changes in dynamic behaviour due to variation in parameter values are dealt with using bifurcation theory. Most of these analyses are concerned with the convergence of solution trajectories to a certain steady state. In reality, such convergence to an attractor, with the species surviving at a stationary state, is rarely observed in nature, and real data sets never exhibit this kind of dynamic behaviour. Instead, we observe that population density always changes with the advancement of time and never settles down to any steady state. This is mainly due to adaptive mechanisms among the individuals of the constituent species, as well as evolutionary changes over multiple time scales. The main objective of this presentation is to explain a simple modelling approach that captures the dynamic evolution of population density rather than settling down to some stationary state. The proposed model formulation is neither strictly deterministic nor fully stochastic, but rather a combination of the two over disjoint time intervals.

The presentation summarises the main findings of the emerging paradigm of quantum-like modelling in the social sciences, mainly in decision-making theory relevant for economics, finance, political science and allied fields. This new paradigm is an alternative to standard neoclassical decision theory, which has been the mainstream thinking for the last two centuries or so. The standard paradigm came to power with the rise of marginalism and utility maximisation theory, which slowly replaced classical political economy. However, since the 1960s we have accumulated a huge amount of real data sets, i.e., the choices of real people under scenarios of uncertainty and ambiguity, which routinely violate even the basic predictions of standard decision theory.

Recent advances in experimental techniques, especially live-cell imaging at single-cell resolution, have greatly changed our view of how biological systems work. They have revealed much more dynamic behaviour than we previously thought, while it has also become clear that cellular behaviour is highly stochastic. Using three important model systems: 1) Hes1 genetic oscillations and Notch signalling; 2) Nrf2 nucleo-cytoplasmic shuttling; and 3) prolactin hormone gene expression in pituitary tissue, I will discuss how to develop a mathematical model for such a dynamic system, and how to analyse and simulate it. Model development involved the analysis of imaging data, in which statistics played an important role, to infer the types of cellular interactions and the model parameters. These issues will also be covered.

How do honeybees decide where to build a new nest? What behavioural traits motivate the study of this topic? Biologists have been studying social animals and social insects for a long time, but only recently have the results of these studies been applied to a variety of research areas, from evolutionary dynamics in AI to value-sensitive decision-making in swarm robotics. This talk aims to explain the features of the original context from which the proposed model originates, namely the consensus problem in honeybee swarms, and the impact that this study has in different areas of research. Crucial to this talk is the multi-disciplinary approach, which allows us to show the similarities of the model in different contexts and to propose new applications that constitute an interesting contribution for multi-agent systems. An important part of this research is the game-theoretic framework, which provides a different perspective on the same system and allows us to carry out the corresponding stability analysis. Finally, we propose an extension of the original model to the structured case, where the structure is captured via a complex network or an undirected graph.

We live in an information saturated world. Every day, information is created, copied, processed and deleted. However, it is only recently that the significance of information processing has been fully recognised in Physics. This talk will introduce a simple treatment of information processing in measurement based feedback systems, including a model of an 'information engine' that uses an information resource to do work. We then consider an explicit model of feedback control that demystifies the notion of measurement. Finally, we discuss a controversial version of this model that we believe prompts further discussion on the rules for feedback controlled systems.
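A hedged numerical illustration of the kind of bound such information engines obey: for a Szilard-type engine with binary measurement error ε, the average work extractable per cycle is limited by the acquired information, W ≤ kT (ln 2 - H(ε)), with H the binary entropy in nats. This generic bound is an assumption of the sketch, not the specific model discussed in the talk:

```python
import math

def max_work_per_cycle(eps, kT=1.0):
    """Upper bound on extractable work for a Szilard-type engine whose
    binary measurement is wrong with probability eps."""
    if eps in (0, 1):
        H = 0.0                                  # error-free (or always-wrong)
    else:
        H = -eps * math.log(eps) - (1 - eps) * math.log(1 - eps)
    return kT * (math.log(2) - H)                # kT * mutual information

print(max_work_per_cycle(0.0))   # perfect measurement: kT ln 2
print(max_work_per_cycle(0.5))   # uninformative measurement: no work
```

The two limiting cases bracket the behaviour: one full bit of information buys at most kT ln 2 of work, and a coin-flip measurement buys none.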

The meaning of analysis is explored using ideas from topology, to see whether we may solve problems in mathematical physics and what new methods might emerge from this work. This started from looking at the inclusion of a body within its ambient space, and what consequences this has for the quantities that simultaneously describe the motion of the body and the motion within its ambient space.

We use statistical mechanics to study model-based Bayesian data clustering. In this approach, each partition of the data into clusters is regarded as a microscopic system state, the negative data log-likelihood gives the energy of each state, and the data set realisation acts as disorder. Optimal clustering corresponds to the ground state of the system, and is hence obtained from the free energy via a low 'temperature' limit. We assume that for large sample sizes the free energy density is self-averaging, and we use the replica method to compute the asymptotic free energy density. The main order parameter in the resulting (replica symmetric) theory, the distribution of the data over the clusters, satisfies a self-consistent equation which can be solved by a population dynamics algorithm. From this order parameter one computes the average free energy, and all relevant macroscopic characteristics of the problem. The theory describes numerical experiments perfectly, and gives a significant improvement over the mean-field theory that was used to study this model in the past.



Turbulence by Prof Friedrich Busse: Leverhulme Lecture 2013