Context-sensitive Matter
By Klaus-Peter Zauner


The quest for life-like artifacts has so far yielded only clumsy results compared to what nature proves feasible. This is readily apparent if one compares the performance of robots with that of organisms. The discrepancy holds across scales from metres down to micrometres, and across performance categories from power efficiency to pattern recognition in ambiguous situations. We are at a stage comparable to observing birds from a hot-air balloon: nature demonstrates a radically different solution, and understanding its underlying concept opens the path to a technological revolution. At the interface between biology and information processing it becomes increasingly apparent that matter, i.e., the implementation substrate, plays a crucial role in the marvellous capabilities of organisms.



If we abstract Darwin’s principle of evolution into an algorithm (a loop of variation, selection and reproduction) it becomes sterile. Its power to drive emergence derives from the physico-chemical properties of the matter on which it acts. Crucial here is the fact that chemical elements can not only be combined into a combinatorial number of different structures, but that the physical and chemical properties of a chemical compound can be radically different from those of the reactants that form it. This is the key to getting from inanimate molecules to conscious brains. Conversely, within computer science the role of the physical substrate used to implement computation has largely been ignored, presumably because already at the dawn of practical computing devices Turing’s universal machine indicated that the ultimate theoretical capability of all adequate computing mechanisms may be the same. Crucial to the notion of universality, however, is freedom from time and space constraints. If resources are limited and response time is critical, as is typically the case for both organisms and robots, then it is far from clear that conventional computing methods are the most suitable mode of information processing. Computation driven directly by the physics of the implementation substrate may harbour a significant advantage with regard to both the amount of matter required to implement the requisite functionality and the energy required to perform the computation. There is a cost, however. Conrad (1988) pointed out that a system cannot at the same time be programmable, adaptable and efficient. It is therefore not surprising that Brooks (2001) finds that “[…] matter that makes up living systems obeys the laws of physics in ways that are expensive to simulate computationally.”
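Abstracted in this way, the loop fits in a few lines of code; the bit-string representation and fitness function below are toy placeholders, which is precisely the point: the algorithm itself is substrate-neutral, and all generative power has to be supplied by whatever is plugged into it.

```python
import random

def evolve(population, fitness, mutate, generations=100):
    """A bare Darwinian loop: variation, selection, reproduction.

    The loop is substrate-neutral; everything interesting has to
    come from `fitness` and `mutate`, i.e. from the properties of
    the 'matter' being varied, which the abstraction leaves open.
    """
    for _ in range(generations):
        offspring = [mutate(ind) for ind in population]    # variation
        pool = sorted(population + offspring,
                      key=fitness, reverse=True)           # selection
        population = pool[:len(population)]                # reproduction
    return population

# Toy substrate: bit strings scored by the number of set bits.
def fitness(ind):
    return sum(ind)

def mutate(ind):
    child = list(ind)
    child[random.randrange(len(child))] ^= 1               # flip one bit
    return child

random.seed(0)
population = [[0] * 16 for _ in range(8)]
best = max(evolve(population, fitness, mutate), key=fitness)
```

Run on this toy substrate the loop merely maximises a bit count; run on molecules, whose combinations exhibit radically new properties, the very same loop can be generative.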


Rather than imitating biological subsystems, the science of semi-biotic systems aims at integrating biological components, ranging from macromolecules to living cells, directly into engineered architectures. A slime mould cell, for instance, can be integrated into a fully enclosed microfluidic chip which is then mounted on a circuit board and interfaced (via USB) with a conventional computer. The chip can be stored in a dry state for months and activated upon a signal from the computer. Release of water from the microfluidic channels activates the dormant mould cell. The computer can then send signals to the cell and receive its responses. Such a system can operate for a week without a supply of nutrients.
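The stimulate-and-observe cycle between computer and chip might be pictured as follows; the class, its method names and the simulated oscillation are purely hypothetical stand-ins for the real device and its USB protocol:

```python
import math

class SlimeMouldChip:
    """Hypothetical stand-in for the USB-attached microfluidic chip.

    All names and behaviour here are illustrative assumptions; a real
    chip is driven through its interface board, and the cell's
    oscillatory response is measured, not computed.
    """
    def __init__(self):
        self.active = False   # the chip ships in a dry, dormant state
        self.phase = 0.0      # phase of the cell's oscillation

    def hydrate(self):
        """Release water into the channels, waking the dormant cell."""
        self.active = True

    def stimulate(self, strength):
        """Apply a stimulus (e.g. light) that perturbs the oscillation."""
        if not self.active:
            raise RuntimeError("chip must be hydrated first")
        self.phase += strength

    def read(self, t):
        """Sample the cell's (here: simulated) thickness oscillation."""
        if not self.active:
            raise RuntimeError("chip must be hydrated first")
        return math.sin(0.5 * t + self.phase)

chip = SlimeMouldChip()
chip.hydrate()                                  # activation signal
baseline = [chip.read(t) for t in range(10)]    # observe the resting cell
chip.stimulate(math.pi)                         # send a stimulus ...
response = [chip.read(t) for t in range(10)]    # ... and read the response
```

The point of the sketch is the protocol, not the physics: the computer never commands the cell, it can only perturb it and listen.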

The purpose is twofold: firstly, to investigate how components such as living cells can be integrated into conventional architectures; and secondly, to learn more about the distributed information-processing capabilities and methods of cells. The process of building such semi-biotic systems requires a rethinking of engineering processes, which historically have been developed largely in a physics setting. The design focus moves from the components to their interfaces, where freedom over modality and encoding allows the largely autonomous components to be adapted into purposeful systems. Prescriptive control and instructive paradigms are replaced by learning, self-organisation and adaptability. Giving up detailed control eliminates the need for strong predictability of the system. Interactions that in a conventional architecture would be undesirable because they interfere with predictability are now welcome, enriching the variability and thus the adaptability of the system. Fine-grained sensitivity to complex physico-chemical context, as we find it in proteins, is then not a nuisance (cf. the side effects of drugs) but a resource that can be harvested for sophisticated fusion of milieu information at the nanoscale, or as a foothold from which to start the evolution of new functionality.

Macromolecules fluctuate through conformational states; the physico-chemical context stabilises subsets of these states and thus influences functionality that directly or indirectly affects the context. Networks of macromolecules communicate by modifying their local environment, the modifications being distributed through diffusion. Depending on their conformational state they may also bind together into supra-molecular structures, which can then rapidly communicate conformational state changes throughout the structure. Nature’s molecular computing architectures are highly organised heterogeneous chemical systems. The large surface-to-volume ratio, concomitant with a high degree of compartmentalisation, allows for fine-grained control of chemical reactions. Present engineering is far from achieving anything like the level of organisation found in nature. Nevertheless, the advent of microfluidics has in recent years provided a tool that facilitates better spatio-temporal control over chemical reaction media. If the media are suitably chosen, one can combine this control with self-assembly of compartments at the sub-microfluidic level. We use 3D printing to fabricate moulds for microfluidic structures which generate droplets of the oscillating Belousov-Zhabotinsky reaction medium immersed in oil. The oil phase contains lipids which self-assemble at the surface of the Belousov-Zhabotinsky droplets. The lipids prevent droplets that come into contact from merging, but do not prevent the transmission of excitation waves from one droplet to another, and thus facilitate the transmission of information. The droplets can be viewed as crude analogues of neurons and are presently under investigation as a wet information-processing substrate.
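The hand-off of excitation waves between touching droplets can be caricatured with a Greenberg-Hastings-style excitable automaton; the chain below is a deliberately crude sketch (the states, timings and one-dimensional layout are illustrative assumptions, not a model of the chemistry):

```python
# Droplet states: 0 = rest (excitable), 1 = excited, 2-3 = refractory.
REST, EXCITED = 0, 1
LAST_REFRACTORY = 3

def step(chain):
    """One synchronous update of a chain of excitable 'droplets'.

    A resting droplet fires if a neighbour is excited (the wave
    crossing the lipid interface); a fired droplet then passes
    through a refractory period, which keeps the wave travelling
    forward instead of re-igniting the droplet behind it.
    """
    new = []
    for i, state in enumerate(chain):
        if state == REST:
            neighbours = chain[max(i - 1, 0):i + 2]
            new.append(EXCITED if EXCITED in neighbours else REST)
        elif state < LAST_REFRACTORY:
            new.append(state + 1)    # excited -> refractory
        else:
            new.append(REST)         # recovered, excitable again
    return new

chain = [EXCITED] + [REST] * 9       # excite the leftmost droplet
history = [chain]
for _ in range(12):
    chain = step(chain)
    history.append(chain)

for row in history:                  # '.' rest, '*' excited, '-' refractory
    print(''.join('.*--'[s] for s in row))
```

Printed as a space-time diagram, the excitation travels down the chain one droplet per step and the medium then recovers: a one-bit signal relayed droplet to droplet, much as the lipid-stabilised Belousov-Zhabotinsky droplets relay chemical excitation.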

Where will this lead? By far the most exciting opportunity opened up by molecular information technology is its potential application to chemistry. The intricate organisation of a small set of chemical building blocks gives rise to the wide variety of material structures and specialised functions seen in nature. A complexity barrier largely prevents chemists and engineers from entering this design space. If this hurdle can be overcome, however, the impact of the technology will rival the advent of organic chemistry.


R. Pfeifer, C. Scheier (1999) Understanding Intelligence. MIT Press.

M. Conrad (1988) The price of programmability. In The Universal Turing Machine: A Fifty-Year Survey (R. Herken, ed.), Oxford University Press, New York, pp. 285–307.

R. Brooks (2001) The relationship between matter and life. Nature, 409:409–411.

K.-P. Zauner, M. Conrad (2001) Molecular approach to informal computing. Soft Computing, 5:39–44.

K.-P. Zauner (2005) From prescriptive programming of solid-state devices to orchestrated self-organisation of informed matter. LNCS, 3566:47–55.

S. Tsuda, K.-P. Zauner, Y.-P. Gunji (2007) Robot control with biological cells. BioSystems, 87:215–223.








Copyright 2019 TEKS / A Matter of Feeling

TEKS - Trondheim Electronic Arts Centre