An adaptive system is a set of interacting or interdependent entities, real or abstract, that form an integrated whole able to respond to environmental changes or to changes in the interacting parts, in a way analogous to either continuous physiological homeostasis or evolutionary adaptation in biology. Feedback loops are a key feature of adaptive systems, such as ecosystems and individual organisms; or, in the human world, communities, organizations, and families.

Artificial adaptive systems include robots with control systems that utilize negative feedback to maintain desired states.
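The negative-feedback idea can be illustrated with a minimal sketch (the function name, gain, and setpoint are illustrative, not from any particular robot control system): the controller repeatedly measures the deviation from the desired state and corrects against it.

```python
# Minimal sketch of negative feedback maintaining a desired state
# (a simple proportional controller; all names and constants are illustrative).

def regulate(state, setpoint, gain=0.5, steps=50):
    """Drive `state` toward `setpoint` by repeatedly correcting against the error."""
    for _ in range(steps):
        error = setpoint - state   # deviation from the desired state
        state += gain * error      # negative feedback: correction opposes the error
    return state

final = regulate(state=0.0, setpoint=10.0)
print(round(final, 6))  # -> 10.0
```

With a gain between 0 and 1, the error shrinks geometrically each step, so the system settles at the setpoint rather than oscillating or diverging.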

## The law of adaptation

The law of adaptation can be stated informally as:

Every adaptive system converges to a state in which all kinds of stimulation cease.[1]

Formally, the law can be defined as follows:

Given a system ${\displaystyle S}$, we say that a physical event ${\displaystyle E}$ is a stimulus for the system ${\displaystyle S}$ if and only if the probability ${\displaystyle P(S\rightarrow S'|E)}$ that the system undergoes a change or is perturbed (in its elements or in its processes) when the event ${\displaystyle E}$ occurs is strictly greater than the prior probability that ${\displaystyle S}$ undergoes a change independently of ${\displaystyle E}$:

${\displaystyle P(S\rightarrow S'|E)>P(S\rightarrow S')}$

Let ${\displaystyle S}$ be an arbitrary system subject to changes in time ${\displaystyle t}$ and let ${\displaystyle E}$ be an arbitrary event that is a stimulus for the system ${\displaystyle S}$: we say that ${\displaystyle S}$ is an adaptive system if and only if, as t tends to infinity ${\displaystyle (t\rightarrow \infty )}$, the probability that the system ${\displaystyle S}$ changes its behavior ${\displaystyle (S\rightarrow S')}$ in a time step ${\displaystyle t_{0}}$ given the event ${\displaystyle E}$ equals the probability that the system changes its behavior independently of the occurrence of the event ${\displaystyle E}$. In mathematical terms:

1. ${\displaystyle P_{t_{0}}(S\rightarrow S'|E)>P_{t_{0}}(S\rightarrow S')>0}$
2. ${\displaystyle \lim _{t\rightarrow \infty }P_{t}(S\rightarrow S'|E)=P_{t}(S\rightarrow S')}$

Thus, for each instant ${\displaystyle t}$ there will exist a temporal interval ${\displaystyle h}$ such that:

${\displaystyle P_{t+h}(S\rightarrow S'|E)-P_{t+h}(S\rightarrow S')<P_{t}(S\rightarrow S'|E)-P_{t}(S\rightarrow S')}$
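A toy numerical model can make the formal statement concrete (this model is hypothetical, not from the source): let the stimulus-specific excess probability decay geometrically with time, as in habituation, so that the conditional probability approaches the baseline.

```python
# Toy illustration of the law of adaptation (hypothetical model, not from
# the source): the excess change probability conferred by a stimulus E
# decays over time, so P_t(S -> S' | E) approaches the baseline P_t(S -> S').

BASELINE = 0.1   # P_t(S -> S'): spontaneous change probability
EXTRA = 0.8      # stimulus-specific excess probability at t = 0
DECAY = 0.9      # habituation factor per time step

def p_change_given_e(t):
    """P_t(S -> S' | E) in this toy model."""
    return BASELINE + EXTRA * DECAY ** t

# Condition 1: the stimulus initially raises the change probability above baseline.
print(p_change_given_e(0) > BASELINE > 0)   # True

# Condition 2: the gap shrinks monotonically and vanishes in the limit.
for t in (0, 10, 100):
    print(t, round(p_change_given_e(t) - BASELINE, 6))
```

Because the gap at time ${\displaystyle t+h}$ is smaller than the gap at time ${\displaystyle t}$ for any ${\displaystyle h>0}$, this toy system satisfies both conditions of the definition.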

In an adaptive system, a parameter changes slowly and has no preferred value. In a self-adjusting system, by contrast, the parameter value "depends on the history of the system dynamics". One of the most important qualities of self-adjusting systems is their "adaptation to the edge of chaos", that is, their ability to avoid chaos. Practically speaking, by heading to the edge of chaos without going further, a leader may act spontaneously yet without disaster. A March/April 2009 Complexity article further explains the self-adjusting systems used and the realistic implications.[2] Physicists have shown that adaptation to the edge of chaos occurs in almost all systems with feedback.[3]
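One common toy setting for this phenomenon is the logistic map with a slowly self-adjusting parameter. The sketch below is illustrative only, not the specific model of [2] or [3]: an estimate of the Lyapunov exponent feeds back on the parameter `a`, nudging it down when the dynamics are chaotic and up when they are periodic, so the history-dependent parameter drifts toward a boundary between chaos and order.

```python
import math

# Hedged sketch of "adaptation to the edge of chaos" (illustrative assumption,
# not the model from the cited articles). The logistic map's parameter `a` is
# the slowly changing, history-dependent quantity.

def lyapunov(a, x=0.4, n=2000):
    """Estimate the Lyapunov exponent of the logistic map x -> a*x*(1-x)."""
    total = 0.0
    for _ in range(n):
        x = a * x * (1 - x)
        total += math.log(max(abs(a * (1 - 2 * x)), 1e-6))
    return total / n

a, rate = 3.9, 0.01                # start deep in the chaotic regime
for _ in range(300):
    a -= rate * lyapunov(a)        # negative feedback on the degree of chaos
    a = min(max(a, 3.0), 3.99)     # keep the map confined to the unit interval
print(round(a, 3))                 # final parameter value after self-adjustment
```

A positive exponent (chaos) lowers `a` and a negative exponent (periodicity) raises it, so the only resting places are parameter values where the exponent is near zero, i.e. the edge of chaos.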

## Practopoiesis

How do various types of adaptations interact in a living system? Practopoiesis, a term coined by its originator Danko Nikolić, refers to a hierarchy of adaptation mechanisms that answers this question. The adaptive hierarchy forms a kind of self-adjusting system in which autopoiesis of the entire organism or a cell occurs through a hierarchy of allopoietic interactions among components.[4] This is possible because the components are organized into a poietic hierarchy: adaptive actions of one component result in the creation of another component. The theory proposes that living systems exhibit a hierarchy of a total of four such adaptive poietic operations:

   evolution (i) → gene expression (ii) → non-gene-involving homeostatic mechanisms (anapoiesis) (iii) → final cell function (iv)

As the hierarchy evolves towards higher levels of organization, the speed of adaptation increases. Evolution is the slowest; the final cell function is the fastest. Ultimately, practopoiesis challenges current neuroscience doctrine by asserting that mental operations primarily occur at the homeostatic, anapoietic level (iii) — i.e., that minds and thought emerge from fast homeostatic mechanisms poietically controlling the cell function. This contrasts with the widespread belief that thinking is synonymous with neural activity (i.e., with the 'final cell function' at level iv).

Each slower level contains knowledge that is more general than that of the faster levels; for example, genes contain more general knowledge than anapoietic mechanisms, which in turn contain more general knowledge than cell functions. This hierarchy of knowledge enables the anapoietic level to directly activate concepts, the most fundamental ingredient for the emergence of the mind.