5

I have an intuitive concept of a system that grows steadily in complexity as it computes. I can think of some examples:

  1. Nature. Given the physical laws, some initial conditions (say, planet Earth at the beginning of life) and enough time, the system eventually evolves in complexity, all the way from dinosaurs fighting over territory to humans building castles.

  2. Cellular automata. Given a set of automaton rules, some initial conditions (a grid with specific cells set) and enough time, some automata eventually evolve complex structures.

  3. AI applications such as genetic programming (to some extent). The issue is that most of those evolve towards a specific goal and often reach a tipping point and stop.
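A minimal sketch of example 2, using the elementary cellular automaton Rule 110 (a standard example of very simple rules producing complex, even Turing-complete, behavior); the cell encoding and grid size here are arbitrary illustration choices:

```python
# Elementary cellular automaton, Rule 110: each cell's next state is
# determined by its 3-cell neighborhood, read as a 3-bit index into
# the rule number's binary expansion.

RULE = 110  # 0b01101110: next state for neighborhoods 111, 110, ..., 000

def step(cells):
    """One synchronous update of a row of 0/1 cells, with wrapping edges."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Initial condition: a single live cell in the middle.
cells = [0] * 31
cells[15] = 1
for _ in range(15):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

Running this prints the familiar growing triangular pattern; whether such automata keep producing *new* structure indefinitely is exactly the question's open point.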

The kind of system I'm talking about is supposed to grow in complexity indefinitely. That is, one can expect that, given enough time, such a system will eventually develop complex structures such as "lifeforms" that defend their own existence. It is also not goal-oriented: it is supposed to grow in complexity because its own internal structures are more stable when they are more complex. We know nature/physical laws satisfy that criterion, but I'm not sure about cellular automata and genetic programming. This is a very vague concept and hard to define (there is no accepted definition of complexity, after all). Still: is there any name/formalization for this kind of system? Do we know any examples of systems/specific automata that, as far as our observation goes, never reach a "tipping point" where they stop becoming more complex?

Raphael
  • 73,212
  • 30
  • 182
  • 400
MaiaVictor
  • 4,199
  • 2
  • 18
  • 34

2 Answers

5

The hypotheses could be:

  1. Time is discrete; let $s_i$ be the state of your system at $t=i$.
  2. Your system is deterministic and its transition function $f$ is computable.
  3. Its complexity $c : \mathbb{N} \to \mathbb{N}$ is the Kolmogorov complexity: $c(i)$ is the length of the shortest C++ program which prints $s_i$.

Then you have a bound: $$c(i) \leq O^+(1) + c_0 + |f| + |i| = O^+(1) + |i|$$

($O^+(1)$ denotes a positive constant, $c_0 = c(0)$ is the complexity of the initial state, and $|x|$ is the Kolmogorov complexity of $x$.)

And more generally, for any computable function $g$: $$c(g(i)) \leq O^+(1) + c_0 + |f| + |g| + |i| = O^+(1) + |i|$$

The system's complexity grows more slowly than any computable function!
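The reason for the bound can be sketched concretely: a fixed program reconstructs $s_i$ from the index alone, so the description of $s_i$ costs only a constant plus the bits of $i$. A minimal Python sketch, where the trivial `f` is a stand-in for any computable transition function:

```python
# Why c(i) <= O(1) + c_0 + |f| + |i|: this fixed program, together with
# the binary encoding of i (about log2(i) bits), is a full description
# of s_i. Everything except i contributes only a constant.

def f(state):
    # Stand-in for any computable transition function
    # (here: a trivial counter, purely for illustration).
    return state + 1

S0 = 0  # initial state; describing it costs the constant c_0

def s(i):
    """Reconstruct s_i by iterating f from the initial state."""
    state = S0
    for _ in range(i):
        state = f(state)
    return state

print(s(1000))  # s_1000 is fully described by this code plus the number 1000
```

No matter how intricate $s_i$ looks, it never carries more information than this program plus the index.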

Conclusion:

  • You should look at systems which are not deterministic, or whose transition function is not computable. Or use another definition of complexity (like this one: Chaos Theory).
  • With Kolmogorov complexity, chaotic states are more complex than states with lifeforms defending themselves. But maybe one can define a physical, macroscopic Kolmogorov complexity in which the unbreakable basic components used to describe the world are quite big and their internal features are negligible. Then it becomes more complex to describe a tangle of life and vacuum than homogeneous chaos.
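One crude way to experiment with that "macroscopic" variant: coarse-grain the state before compressing it, using compressed length as a rough, computable stand-in for Kolmogorov complexity. The `coarse_grain` and `macro_complexity` helpers below are hypothetical illustrations, not standard definitions:

```python
import zlib

def coarse_grain(grid, block=4):
    """Replace each block x block patch of a 0/1 grid with its majority value,
    discarding microscopic detail (grid side must be a multiple of block)."""
    n = len(grid)
    return [
        [int(sum(grid[r + dr][c + dc] for dr in range(block) for dc in range(block))
             > block * block // 2)
         for c in range(0, n, block)]
        for r in range(0, n, block)
    ]

def macro_complexity(grid, block=4):
    """Compressed size of the coarse-grained grid: a crude proxy for
    a 'macroscopic' Kolmogorov complexity."""
    coarse = coarse_grain(grid, block)
    data = bytes(cell for row in coarse for cell in row)
    return len(zlib.compress(data, 9))
```

Under such a measure, microscopic noise is averaged away before compression, so homogeneous chaos can score low while large-scale structure still costs bits to describe.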
François
  • 671
  • 5
  • 16
1

There is some scientific study of systems that roughly fit your criteria. You might examine the following to determine which fit, based on their properties; there is a wide range of examples, with different phenomena in each category.

vzn
  • 11,162
  • 1
  • 28
  • 52