Noise is inherent in most forms of computing, and its impact becomes more dramatic as computing circuits grow in complexity and scale. The consequences of component failure are felt most strongly in areas where highly reliable circuits are required.
Equipment reliability can be restored, to some extent, by standard error-correction techniques, but their effectiveness is jeopardised by the physical limits of hardware miniaturisation. Classical computing circuits suffer from thermal noise and production errors, and quantum computers suffer from decoherence, while the noisy processes inherent in neural networks and biological systems remain poorly understood.
The relative computing power of these different architectures should be revisited to incorporate the limitations imposed by noise. This will be particularly relevant to future integrated circuits that rely on ever-growing gate densities, which give rise to higher noise levels and a greater probability of production errors.
Von Neumann was the first to consider the problem of noisy computing, aiming to better understand biological neural networks. He studied the limitations imposed by faults in the fundamental elements of circuits and demonstrated a method for correcting their output. His analysis has been followed by more recent studies, which show that for Boolean functions there exists a limit on the tolerable gate noise, and that increased noise tolerance requires circuits of greater depth.
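The redundancy idea underlying von Neumann's construction can be illustrated with a minimal sketch. The example below is a simplification, not the original multiplexing scheme: it assumes a single noisy NAND gate that flips its output with probability eps, triplicates it, and combines the copies with an ideal (noise-free) majority vote, so the effective error rate drops from eps to 3*eps^2*(1 - eps) + eps^3 whenever eps < 1/2. All function names here are illustrative, not taken from the source.

```python
import random

def noisy_nand(x, y, eps, rng):
    """A NAND gate whose output is flipped with probability eps."""
    out = not (x and y)
    return (not out) if rng.random() < eps else out

def majority(a, b, c):
    """Ideal (noise-free) majority vote over three bits."""
    return (a + b + c) >= 2

def majority_error(eps):
    """Probability that the majority over three independent noisy
    copies is wrong: at least two of the three must fail."""
    return 3 * eps**2 * (1 - eps) + eps**3

if __name__ == "__main__":
    eps, trials = 0.1, 100_000
    rng = random.Random(0)
    x, y = True, False              # correct NAND output is True
    errors = 0
    for _ in range(trials):
        votes = [noisy_nand(x, y, eps, rng) for _ in range(3)]
        if majority(*votes) != (not (x and y)):
            errors += 1
    print(f"single-gate error rate : {eps}")
    print(f"triplicated, measured  : {errors / trials:.4f}")
    print(f"triplicated, analytic  : {majority_error(eps):.4f}")
```

Note that the gain vanishes as eps approaches 1/2, consistent with the noise thresholds mentioned above; in a realistic circuit the majority elements are themselves noisy, which is precisely what makes the full analysis non-trivial.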
Principal investigator: Prof David Saad
Research fellow: Dr Alexander Mozeika
A. Mozeika and D. Saad, “On the Computational Ability of Boolean Circuits”, in preparation (2011).