IV Российский философский конгресс



Елена Гурко (США). Комментарий к докладу Никоса Псарроса

Helena Gourko (USA). Comments on Nikos Psarros’s presentation


This paper touches upon issues that are in a sense eternal for science and philosophy, yet positioned differently in every new epoch of scientific evolution and methodological analysis. One could argue that the author assumes a cumulative approach with a distinct personal flavor that makes this research enterprise extremely interesting and quite promising. Answering the self-imposed question "What is science?", he starts from Plato and Aristotle, yet his main emphasis is on the idea of Einheitswissenschaft and the debates surrounding it. Using the ancient notions of epistéme and dóxa, he argues that both refer to true or valid knowledge, with a difference in the "quality" of its validity, universal or local, and in the normativity related to it. Added to this, the medieval distinction between scientia and disciplina describes the institutional significance of transmitting science. This double explication of science becomes the foundation for the author's criteria for the divisions among the natural sciences themselves and between the natural and social sciences. In doing this, he rejects the unionist credo, hardly viable in contemporary sciences, both natural and social, as a mere convention for approaching these divisions, and argues that they are based on various realms of objects and processes independent of each other. He also attempts to debunk the argument for the ontological uniformity of the world by citing the universality of epistéme and the local character of dóxa. At this point, however, his argumentation is not truly convincing: the obviously different character of the two kinds of knowledge does not warrant the conclusion that there is no conflict between dóxa and epistéme knowledge (this would rather hold for the ontologically uniform knowledge of the unionist credo). Everyday practices at the dóxa level, as it seems, cannot produce universally valid, relevant and teachable knowledge. This raises the question whether natural science can be defined as an epistéme-practice whose aim is to sustain aspect-related poietical practices of everyday life by providing them with universally valid, relevant and teachable knowledge about the common objects of interest.

This restrictiveness of definition is also indicative of the author's approach to the social sciences. Emphasizing that a social scientist is always committed to a participant's point of view (as an example of which he cites a moral obligation to intervene if a scientist observes that the members of the culture he studies practice ritual homicide, ritual rape or similar criminal practices, even against the opinion of members of the local culture who do not think that such practices are bad), the author introduces a criterion that is not in his double explication of science: the common moral stance of a human being conducting particular research. This stance is inevitable in any research, whether in the natural or the social sciences, by simple virtue of our living in a certain culture. Whether to act upon this stance in a particular research situation is a different matter (one could argue that acting upon it in the anthropological field research cited above would effectively end that particular research venture). Since this peculiarity seems not to be indicative of the social sciences alone, and should not be taken as a rule of scientific behavior, the author's assumption that in a strict sense the distinction between epistéme and dóxa knowledge does not apply to the social sciences is highly debatable. It also raises questions about the author's assumption that epistéme quality in the social sciences should be achieved and maintained by other means (namely, what are these means?).

A missing link in the author's approach to both definitions, as it seems, is the social/cultural environment and nature of science, i.e., its meaningful milieu. Any science, like any other human undertaking, is conducted in a world of meanings assigned by us to everything we come into contact with. There can be no scientific objects not mediated by meanings, if only by virtue of the human language applied to their delineation and description. Ernst Cassirer, as a philosopher and a scientist in both natural and social fields, was aware of this perhaps more than others. The issue for Cassirer was the same as it was for Kant: how to explain the apparent givenness of the empirical datum in the a priori forms of sensation and understanding, if the significance and validity of the latter do not arise from the empirical datum and are not based on it. His philosophy of symbolic forms started as an attempt to rectify the Kantian difficulties with matter and form, which led Cassirer to extend the Kantian idea of intellectual form from mathematical concepts to all possible contacts of humans with their surroundings. His rationale was that once Kant made both of them concepts of reflection, he no longer treated them (at least after the section on the "amphiboly of the concepts of reflection" in the Critique of Pure Reason) as poles standing in a real and insurmountable opposition, but rather as members of a methodic correlation. Cassirer's further train of thought is well known: "we never find naked sensation as a raw material to which some form is given; all that is tangible and accessible to us is rather the concrete determinacy, the living multiformity, of a world of perception, which is dominated and permeated through and through by definite modes of formation" (The Philosophy of Symbolic Forms, 3:14-15), so "we are not dealing with bare perceptive data, in which some sort of apperceptive acts are later grafted, through which they are interpreted, judged, transformed. Rather, it is the perception itself which by virtue of its own immanent organisation takes on a kind of spiritual articulation – which, being ordered in itself, also belongs to a determinate order of meaning. In its full actuality, its living totality, it is at the same time a life "in" meaning." (Ibid., 3:202). As a natural and social scientist, as well as a philosopher, Cassirer fully implemented these ideas in his research, and this is perhaps what makes his example so telling. He is, however, certainly not alone in contemporary science, which embraces the idea of the social construction of knowledge if not through philosophical reasoning then for practical considerations.

In this particular respect all sciences appear to be the same, and paradoxically they all have to be treated as belonging to Einheitswissenschaft. This does not make them the same or similar; it only clarifies certain fundamental issues of their status and of the general approach to their objects. In view of this, the criteria for separating the natural and social sciences presented by the author should probably be modified. Criteria (1), (2), (3) and (4) of natural scientific knowledge appear to be applicable, at least to a certain extent, to social science as well (under the condition that technical obstacles include the necessity of accommodating meanings). Criterion (2) of social scientific knowledge is applicable to the natural sciences on the grounds that any scientist has a starting point as far as his or her meaningful stance is concerned. The same holds for criterion (3), the orientation towards the conditio humana, which is universal by virtue of all scientists being human, i.e., sharing their meaningful universe with everybody else.

By way of conclusion: this does not mean that the idea of Einheitswissenschaft should be brought back into the methodology of the sciences in all its old glory. The author certainly has a point in resorting to it to a measured extent at the end of his presentation. It appears, however, that this idea has other applications besides the cumulative one emphasised by the author in his idea of the United States of Sciences. Sciences, whether natural or social, have something in common that is embedded in their very principles, which in turn are rooted in the fundamentals of human existence in the world.


Четвертое заседание


Парадигмы и развитие науки


Fourth Session


Paradigms and Scientific Change


Ханс Позер (Германия). Хаотический аутопойезис и самоорганизация катастроф? Новые научные модели и их следствия

Hans Poser (Germany). Chaotic autopoiesis and the self organisation of catastrophes? New scientific models and their consequences


1. Introduction

When we drive downtown in the morning, there is always a chaotic traffic jam in the same streets and around the same hour, caused by ourselves, the drivers. So one has time enough to listen to the broadcast news and hear that terrorists have organized a catastrophic attack. Does that mean that we are surrounded by chaotic autopoiesis and the self organisation of catastrophes? Some journalists try to say so, in order to document their knowledge of today's sciences.

Throughout the last century one could observe the transformation of the causal worldview, via relativity theory and quantum theory on the one side and evolution theory on the other, into different types of complexity theories. This seemed to be a kind of local paradigm switch, but not a transformation of the whole worldview, since going from classical to relativistic mechanics, as well as to quantum mechanics, implied no change in the causal worldview as long as one restricted it to everyday experience; evolution theory had been taken as the expression of the difference between mechanical systems and bio-systems, since the latter depend on more than the laws of physics alone. Furthermore, complexity theories have been taken as the continuous prolongation of well-known theoretical approaches to more complex structures in physics, chemistry, biology and society. But all this would omit an entirely new element within the development:

Relativity theory has to be taken as a far-reaching limitation of human knowledge, since we can never gain empirical knowledge outside the light cone. Quantum theory, which depends on statistical laws, implies (in light of the Copenhagen interpretation) that nature has a probabilistic structure, since we cannot go beyond the uncertainty relation. So this condition of the possibility of empirical knowledge is – in a Kantian perspective – at the same time a condition of the possibility of the objects of empirical knowledge. Each theory of evolution postulates a comparable restriction of knowledge, for it presupposes the occurrence of unpredictable mutations, followed by a selection. This unpredictability is not due to a lack of knowledge comparable with that in quantum mechanics, where we have to substitute probability for causality; rather, we cannot attain such knowledge at all, so that we have to take these mutations as a new kind of object of experience, one which excludes even statistical laws. Today, evolutionary approaches are in use not only in social Darwinism, but also in models of language evolution, of the evolution of technology, of science and even of morals and knowledge (evolutionary ethics and evolutionary epistemology). This indicates that thinking in evolutionary processes means thinking in structures of processes in history, which not only excludes dependence on causal laws alone, but also includes the occurrence of something unpredictable throughout the temporal development. This, indeed, implies that the cosmological development is pushed forward by chance, not only by contingency: in this perspective, even a kind of Laplacean demon cannot know in advance what will happen. The old models, according to which this world is chosen and created by a rational God – the incarnation of logic, mathematics and wisdom, as Leibniz saw it – are replaced by an unguided sequence of unforeseen happenings. (The only element which leaves room for the so-called anthropic principle consists in the fact that the evolutionary process is, generally speaking, a process which produces higher complexity, so that it has an underlying directed potentiality.) The theories of complexity, developed throughout the last five decades, have to be understood as an elaboration of the formal side of this new view; therefore we have not only to clarify their theoretical fertility, but also to analyse their presuppositions and limitations, both of which have to be seen as essential conditions of the worldview depending on them.
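For concreteness, the uncertainty relation invoked above can be recalled in its standard textbook form; the paper itself does not quote it, so the inequality below is only a reminder of the commonly cited formulation:

```latex
% Heisenberg's uncertainty relation for position and momentum:
% the product of the two uncertainties has a lower bound,
\[
  \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2},
\]
% which, on the Copenhagen reading referred to above, is not merely a
% practical limit of measurement but a structural feature of the
% objects of empirical knowledge themselves.
```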


2. Models of complexity

First let us have a look at the models of complexity.62

They range from deterministic chaos via dissipative and evolutionary structures up to autopoietic and synergetic systems; the theory of catastrophes and the theory of fractals belong to them as well.

The simplest case is that of deterministic chaos, which depends on systems of non-linear differential equations. It was discussed for the first time by Leonhard Euler and recognised as a deep-rooted problem by Henri Poincaré. These systems have the property that when they are used to describe a dynamic process, its development is highly dependent on the starting conditions. This means that even the smallest alterations lead to totally different states. As a consequence, a long-term prognosis is impossible, since – due to the uncertainty relation of quantum physics – we can never know the exact starting conditions. Examples are the double pendulum, weather forecasts or hydrodynamic processes. Here, each step is determined as in classical physics; but predictions – normally taken as the means to corroborate or to falsify a hypothesis – are restricted to states near the starting position. However, what happens is nothing really 'new' – it is seemingly contingent, but only in the sense of an epistemological modality, since the conceptual frame here is classical causality; only the impossibility of a complete description of the starting conditions implies this kind of uncertainty.
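The sensitivity to starting conditions described here can be made concrete with a minimal numerical sketch. The logistic map used below is an illustrative choice of my own (the paper's examples are the double pendulum, weather and hydrodynamics); it shows how two fully deterministic trajectories that begin almost identically soon become completely uncorrelated:

```python
# Minimal sketch of sensitive dependence on initial conditions, using the
# logistic map x_{n+1} = r * x_n * (1 - x_n) in its chaotic regime (r = 4).
# Every step is fully deterministic, yet two trajectories starting 1e-10
# apart diverge until long-term prediction becomes impossible.

def logistic_trajectory(x0, r=4.0, steps=60):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-10)  # almost the same starting condition

for n in (0, 10, 20, 30, 40, 50):
    print(f"n={n:2d}  x_a={a[n]:.8f}  x_b={b[n]:.8f}  |diff|={abs(a[n] - b[n]):.2e}")
```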

The next step is characterized by dissipative structures, which have been studied by Ilya Prigogine.63 Starting from physico-chemical processes far away from thermodynamic equilibrium, he generalized his results and emphasized the importance of these structures and their theory for a completely new way of understanding the universe. He looked at mathematical systems which in the simplest case are characterized by an interval in which a given function has only one solution, followed, from the branching point on, by an interval with two branches as solutions (a so-called bifurcation). If such a system is taken as an empirical model, the process after passing the branching point can in reality follow only one branch; the process, so to say, has chosen one of the two possibilities. Further important elements are the so-called strange attractors, i.e. regions where the trajectories follow a relatively constant path – not exactly the same way, but nearby – for a longer period (as in our traffic jam). Prigogine's important idea consists in the view that the unpredictability of the branch the system 'chooses' is not a mere epistemological problem, but that there is no cause or reason for the choice at all. Furthermore, from a certain level of complexity on we are unable to say anything concerning possible ways or trajectories as a sequence of states – the process in question brings about something new in a radical sense: it is neither known in advance, nor known as a possibility, and it can produce a structure which never existed before in the universe: the process itself is creative.
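The branching behaviour sketched above can be illustrated by the standard normal form of a supercritical pitchfork bifurcation (a textbook example, not one of Prigogine's own chemical systems): as the control parameter passes the branching point, a single stable solution splits into two, and the real process can follow only one of them.

```latex
% Supercritical pitchfork bifurcation: for \mu \le 0 there is a single
% stable equilibrium; beyond the branching point \mu = 0 two stable
% branches appear and the system must "choose" one of them.
\[
  \dot{x} = \mu x - x^{3},
  \qquad
  x^{*} =
  \begin{cases}
    0 \ \text{(stable)}, & \mu \le 0,\\[4pt]
    0 \ \text{(unstable)}, \ \pm\sqrt{\mu} \ \text{(stable)}, & \mu > 0.
  \end{cases}
\]
```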

Concerning dissipative structures, there is no closed mathematical theory underlying them, but in fact a combination of continuous and discontinuous elements. Therefore the theory of catastrophes, going back to René Thom and continued by E.C. Zeeman,64 has been developed to close the gap. It takes topology as its basic mathematical domain in order to be able to describe the points of instability. (The name of the theory goes back to a fact known since the early 19th century: that the history of life and of geologic formations on earth is by no means a continuous one, so that in cases of instability catastrophes must have happened.) Even in hydrodynamics we can observe that a continuous process suddenly changes into a totally different one; thus a flow of water or a wave might dissolve into isolated drops. Thom developed a differential topology which focuses on the singularities, using the discreteness of differential equations with varying parameters. (Speaking technically, he introduced into the topological structure a formalism near the singularity which allows one to handle the point of discreteness.) Both authors made use of these mathematical structures in physics, biology, linguistics and the social sciences. Normally only those differential equations which are differentiable and integrable are used in the sciences and seen as useful – but this presupposes mathematical continuity, corresponding to a continuous process in the model's interpretation. The theory of catastrophes, on the contrary, focuses on the switch from continuity to singularity as an instability of the system. Thom showed that in the simplest cases these situations can be described topologically by means of seven elementary catastrophes. He believed that this allows us to interpret the mathematical result as a morphology: morphogenesis, the rise of new forms, is marked by the discreteness of mathematical systems.
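As a concrete illustration of Thom's elementary catastrophes, the cusp is the simplest non-trivial case; the sketch below is the standard textbook form of its potential and is not taken from Thom's or Zeeman's own applications.

```latex
% The cusp catastrophe: a potential V in one state variable x with two
% control parameters a and b.
\[
  V(x;a,b) = \tfrac{1}{4}x^{4} + \tfrac{1}{2}a\,x^{2} + b\,x,
  \qquad
  \frac{\partial V}{\partial x} = x^{3} + a\,x + b = 0
  \ \text{(equilibria)}.
\]
% Inside the cusp region 4a^{3} + 27b^{2} < 0 three equilibria coexist;
% when the controls (a,b) drift across its boundary, the occupied
% equilibrium disappears and the state jumps discontinuously: this is the
% "catastrophe" marking the switch from continuity to singularity.
```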

These sketchy remarks show that catastrophe theory is helpful in cases where systems of discrete functions can be applied to processes in nature and in society – otherwise we would never be able to interpret the discreteness in mathematics by means of the instability given in phenomena. Even if the theory of catastrophes in fact plays no really important role at the moment,65 it fits into the horizon of a global change of worldview.

Prigogine has pointed out many times that his view can be extended to areas other than thermodynamics and chemistry. This extension is done by means of theories of autopoiesis and self organisation. Whereas autopoiesis in its narrow sense means the self stabilisation of a system within changing surroundings and is used by adherents of radical constructivism such as Humberto R. Maturana and Francisco J. Varela,66 theories of self organisation include the genesis of systems and their feedback on the surroundings, which are themselves taken as a system, too. So the cornerstone of the theory consists in modelling spontaneously formed, ordered macroscopic structures as an outcome of the self-reinforcement of microscopic fluctuations together with selections depending on the constraints.67 This presupposes non-equilibrium states, which allow a permanent flow of material and energy to keep the order of the system in question or to stabilize a contingent deviation of the internal state of the system. This kind of theoretical model spread out from cybernetics in the 1960s and became a universal, transdisciplinary model, which Hermann Haken has called synergetics.68 His guiding idea consists in a reduction of complexity by distinguishing the parameters which are used to describe a system, namely those belonging to the outside and those characterizing the inside state of the system; he then looked for dependencies among the parameters in order to reduce the whole system to those inside ones which are dominant, so that one can interpret the others as their "slaves": self organisation, therefore, is understood as the process of organisation of the whole system, embedded in its surroundings, by alterations of unstable dominant inside parameters.
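Haken's "slaving" of parameters can be made concrete with a schematic two-variable model of the kind used in synergetics textbooks; the variables and coefficients below are illustrative, not taken from Haken's own systems. A fast, strongly damped mode is eliminated adiabatically and is thereby "enslaved" by the slow order parameter:

```latex
% A slow order parameter \xi coupled to a fast, strongly damped mode s:
\[
  \dot{\xi} = \lambda\,\xi - \xi\, s,
  \qquad
  \dot{s} = -\gamma\, s + \xi^{2},
  \qquad
  \gamma \gg |\lambda| .
\]
% Since s relaxes much faster than \xi, one sets \dot{s} \approx 0
% (adiabatic elimination), giving s \approx \xi^{2}/\gamma; the fast mode
% follows, i.e. is "enslaved" by, the order parameter, and the whole
% system reduces to a single equation for \xi:
\[
  \dot{\xi} \;\approx\; \lambda\,\xi - \frac{\xi^{3}}{\gamma} .
\]
```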

This is, in short, the current state of model development. It shows the way from wholly determined systems, via singular disturbances by bifurcations, to mathematical models which, in the case of Haken, try to say more about the inner structure of the occurrence of something absolutely new, in the sense of a new order imposed and impressed on the system by the unpredictable instability of dominant parameters.


3. The non-traditional elements and their problems

The short overview indicates that all these theories use mathematical structures in order to establish a consistent structure for partly incoherent and discontinuous processes. The interpretation of the mathematical structures as structures of the world includes several important new elements. Deterministic chaos has been developed as a mathematical model within the framework of classical causal physics; dissipative structures took their origin from mathematical models of bifurcation and of families of solution curves as relatively stable states, useful in chemistry; autopoietic structures are thought of as useful for mechanisms of self organisation, especially in human society, whereas the theory of catastrophes took its origin from topological models useful in explanatory functions in biology. They are all developed with the intention of generalizing them: deterministic chaos is offered as an approach to understanding the whole cosmos; dissipative structures as a global theory of the emergence of the new; evolutionary structures started from biology but are now used to map every kind of temporal development in biology, psychology, sociology and even the history of ideas. Haken's synergetics has been introduced nearly from the very beginning as a transdisciplinary approach. This shows that this way of dealing with the world as well as with the realm of ideas is much more than a Kuhnian paradigm switch – it is a new worldview: complexity instead of causality or of Aristotelian-medieval finality. This intention is already well known – Mandelbrot and Prigogine, Haken and Thom or Maturana have stressed this point in nearly every one of their writings. There is no need to repeat their prophetic utterances.

No doubt the new approaches have been successful within the disciplines they started from; and altogether, we have learned to understand complexity in a way which nobody could have expected half a century ago. Indeed, they have been so successful that they are used in a metaphorical way, especially in sociology, where we have no possibility of introducing a differential topology, not to mention mathematical functions, which normally demand real numbers – something which makes no sense in theories of societies or of history. Therefore, our traffic jam might be a borderline case, but terrorism would not fit into the complexity scheme. Does that mean: no chaotic autopoiesis and no self organisation of catastrophes?

Let me first sketch the new problems implied by these models of complexity theory:

The models show, under conditions of causality, that there are phenomena which one can explain as part of a complex whole, but which one cannot predict. This implies that the old methodological idea of corroborating or falsifying a hypothesis by prediction does not hold.

The elements of unpredictability are seen as something new, and in fact in many cases as something unique, so that a repetition is excluded. Therefore, induction as the classical fundament of scientific generalisation and concept formation is excluded.

These models do not deal with objects and their qualities, but with processes. Prigogine and Manfred Eigen speak of a switch from Being to Becoming, and Alfred N. Whitehead had already proposed a process ontology at the beginning of the last century. This means giving up not only the Aristotelian or Cartesian substance-accident scheme of the world, but even the traditional idea of individual objects which can be generalized in a classification, which then allows us to formulate hypotheses or even laws concerning their qualities, relations, movements, etc. But in order to be able to speak of a process, there must be continuity and lawlike behaviour at least in a more general sense; otherwise it would not be possible to identify some sequence as a process, and it would be impossible to make use of mathematical functions which allow for a physical interpretation.

All theories of complexity make extensive use of mathematical functions, where the attributes of the functions (continuity, integrability, discreteness, etc.) are important; but even if the mathematical structure allows an interpretation as a model of the world (i.e. interpreting a point of discreteness as a bifurcation or as an unexpected jump of a dominant parameter of the system, or a sequence of mathematical solutions as a sequence of states lying near each other, indicating a quasi-stable state, a strange attractor), these mathematical properties and their interpretation as qualities by no means explain the complex process; they only indicate – or, more correctly, impose – a formal structure.69

These new elements themselves differ in a fundamental respect. Something unpredictable within the model of deterministic chaos is new only with respect to our knowledge – it therefore involves epistemic contingency, whereas the process which brings it about is by no means a contingent one, for it is nothing but causal; or, to put it in modal terms, it is physically necessary. But in dissipative structures, and much more so in synergetic ones, the unpredictability does not depend on a lack of knowledge but on an ontic contingency: there is no kind of physical necessity behind it. At least a statistical lawlikeness, or some kind of regularity, has been the cornerstone of the classical sciences' ability to formulate hypotheses or laws of nature; if this does not hold for the entirely New, how then can it be treated in science?

All the models have to walk this tightrope: to make use of continuous processes on the one hand, while simultaneously including, on the other hand, the epistemic or ontic contingency of discontinuous gaps – breaks which may mark a new singular state, the beginning of a new process, or even complete disorder. The fundamental difficulty the models try to handle consists in describing these breaks; but a description based on an interpreted mathematical structure is no prediction and not even an explanation at all: the entirely New excludes this from the very beginning for conceptual reasons.

All this implies that we cannot understand 'complex' nature 'as it is' by means of explanations, since the classical presupposition of every inductive generalization, "similar causes are followed by similar effects", does not hold (even in the case of deterministic chaos, one could only say "identical causes are followed by identical effects", whereas there will never be really identical causes). Therefore our understanding depends on a model which we impose, a model of which it is not even possible to say that it is an abstraction.

How, then, is a science of complexity possible? Science depends on classification, generalization and repetition – otherwise we would not be able to introduce concepts and to formulate lawlike propositions. The old way of looking at the universe – from physics up to the mind – presupposed that there are classes of entities (atoms, chemical substances, plants and so on, up to ideas) which warrant generalizations and which allow us to formulate laws, at least as hypotheses. None of this is available in complexity theories. Therefore we need a new orientation not only concerning the ontology, but also concerning the sources of knowledge, the network of these sources, and – most importantly – new criteria for justifying a proposition or a local theory within the new framework.


4. Complexity theories and the constitutive commitments of scientific systems

These remarks show that all variants of complexity theory cause methodological, epistemological and ontological problems. One could even ask whether they are scientific at all, since they offend against central classical methodological rules or commitments. Even evolution theory seems to be nothing but a narrative, since it cannot predict its essential new element (namely, mutations). Now, Thomas S. Kuhn has shown that the commitments in question are themselves not fixed, but change historically – a change known today as a paradigm switch. But even if he believed that these changes are arbitrary, further analysis has shown that one has to distinguish between several commitments which come into play and which have always been modified under the pressure of arguments.70

Some introductory remarks first. If Kant is right that we can never transgress the limits of our understanding, so that all objects and processes of experience have to be seen as phenomena constituted by the means of our reasoning (be these forms of thinking and intuition or schemes of ideas, as Whitehead called them), then we are forced to say that we need to impose those schemes on the objects of knowledge. Kant believed that our faculty of understanding can deduce all forms of thinking as categories from a complete list of the logical forms, whereas today we must admit that we are able to create new forms even in logic – forms of such a kind that they would contradict each other if one were to mix them up (propositional logic, many-valued logic, modal calculi, deontic logic, etc.). This insight makes it possible to understand and to accept Whitehead's view that, within the history of ideas, schemes of ideas might undergo alterations, even though they are the necessary conditions of understanding, since without them we would not be able to think: it is our thinking which originates the schemes of thought and imposes them upon the objects of thinking. The a priori itself has a history; it is relative and admits the invention of totally new formal structures – something which completely fits into the fundamental view of complexity theory. Therefore, changes within the methodological commitments of the sciences themselves fit into this view (including its corroboration – but this might imply a logical circularity).

Each theory is guided by a paradigm; or, in the light of a further analysis, it presupposes several methodological commitments, namely:

(1) ontological commitments, concerning the fundamental objects and the fundamental properties or relations of and between them,

(2) the accepted sources of knowledge,

(3) a hierarchy of these sources,

(4) judicial commitments, which say how to prove, to falsify or to corroborate a proposition of the theory, and

(5) normative commitments concerning the form, the beauty, etc. of a theory and its propositions.

We can omit the last point here; but there are remarkable changes from classical sciences to the new view expressed by complexity theories:

ad (1): As already said, the new ontology takes processes as its objects. The fundamental properties are stability (continuity) and instability (either chaos or a singularity which marks the beginning of a new period of stability). Naturally, even processes have elements, namely states and their trajectories. Whereas the system allows general lawlike propositions concerning the process (how these have to be justified will be discussed under (2) to (4)), the elements are meant as concrete, unique singularities in such a radical sense that their individuality is unpredictable. It is indeed just this concrete or historical and unique individuality which we accept today, although we cannot derive it from covering laws. But what kind of understanding is it, if we say that these structures allow us to understand the process of which the individual, unrepeatable states are the elements?

One of the Kantian consequences of transcendental reflection has been that epistemic contingency implies ontic contingency, as Niels Bohr maintained within the Copenhagen interpretation of quantum mechanics. We find this in a radical version among protagonists of the new view of complexity, namely among radical constructivists; but on the whole, complex structures are normally taken as structures of nature as it is, and the occurrence of new things leads to formulations such as "inventions of nature", which take nature as an acting and creative entity. Indeed, one has to take into account that the occurrence of the New is a constitutive part of the process ontology! This belongs to the indispensable prerequisites, even if there is no possibility at all of predicting the New. Furthermore, it is not determined in which way one has to integrate it. In evolutionary theories, one takes it simply as an empirical datum that it occurs as a mutation, leaving open when and in which form this happens. The question of how we understand the New must therefore be answered by saying that it is covered by a presupposition. In fact, it is not possible to give a definition of creativity or of the entirely New, but we have no problem using both concepts – we presuppose the disposition to understand what is meant.

ad (2 and 3): Normally, the commitments concerning the sources of knowledge explain which experimental and/or observational methods and which mathematical procedures are the fundamental ones. Commitment (3) says whether the rational (mathematical and theoretical) or the observational source has priority. But in our case, the situation itself is a complex one. At the beginning of each complexity theory there are two empirical insights. Firstly, that there are very typical singularities in the history of the universe and in chemical, biological and social processes; singularities which cannot be modeled and explained within the framework of classical scientific approaches; this was part of the ontological commitment. Secondly, that these singularities are connected with the complexity of the system in question and its surroundings. These insights lead, as a second step, to purely mathematical considerations regarding how to develop and manage mathematical structures which contain continuity as well as discreteness, including sudden jumps – not only in the area of systems of non-linear equations, but also in algebraic and differential topology, including Mandelbrot's concept of dimension, which is not restricted to integers. The third step, then, consists in an interpretation of these formal structures – but in a way which does not really touch reality, since a mathematical structure can at best describe an idealized and generalized state or process, not an arbitrarily unique one embedded in the whole network of a structure and its outside (where to speak of this or that structure and its outside already means to think in a model): "Individuum est ineffabile", as Johann Wolfgang von Goethe said, imitating scholastic terminology. This indicates that the sources of knowledge of complexity theories have to be seen in mathematical structures which are initiated against the background of certain universalized experiences. The models one gets by means of an interpretation are by no means pictures of complex reality, but a way of imposing a selective structure on this reality without really meeting it. Instead of a hierarchy of knowledge sources, we find a network of mathematical theorems, their interpretations as models, and correspondences between the model and generalized empirical data.
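The non-integer dimension mentioned above can be stated briefly: for a strictly self-similar set that splits into N copies of itself, each scaled by a factor r, the similarity dimension is log N / log(1/r). The middle-thirds Cantor set below is a standard illustration of my own choosing, not an example used in the paper:

```latex
% Similarity dimension of a self-similar set (N copies, scaling factor r):
\[
  D = \frac{\log N}{\log (1/r)} .
\]
% For the middle-thirds Cantor set, N = 2 and r = 1/3, so
\[
  D = \frac{\log 2}{\log 3} \approx 0.63,
\]
% a non-integer value of the kind Mandelbrot generalized beyond the
% integer dimensions of classical geometry.
```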

ad (4): The most difficult point is the question of justificatory procedures. As we saw, all classical methods of prediction, corroboration and falsification fail insofar as they are empty. The interpreted mathematical model allows at best a retrospective understanding of the processes or of the trajectories in the past – but even this only in a descriptive, not an explanatory way, and there cannot be any exact retrodictions. This shows that the explanatory power does not concern any singular states or processes. But nature and society can in fact be understood as systems depending on a self-imposed order, including the possibility that (and partly: how) these ordering structures might change: the systems are therefore justified insofar as they allow a kind of understanding of the kind of order found in nature and society. This indicates that all the classical results of causal and statistical laws of nature are still useful, namely in those regions where processes can be seen as quite continuous, so that we succeed in generalizing even if we know that this, radically speaking, never meets reality. One might therefore be tempted to see the whole approach of complexity theories as a mere prolongation of the traditional sciences; but this would neglect the main and essential point concerning the worldview, namely that we have to understand the development of the whole universe as a creative process of growing complexity on ever higher levels.


5. The constitutive principles

Let us pick up the problem of understanding, since it is directly interwoven with the judicial commitments: as science tries to understand its objects (including processes), science has to aim at truth. The traditional way of the philosophy of science and of the analytic tradition has been to take understanding as a kind of incomplete explanation in the sense of Hempel and Oppenheim. However, within complex systems this is inadequate, since the main element of the classical explanatory scheme – namely prediction – is misleading here. And to take it as a hermeneutic approach would miss the point, too. In fact, evolution theories are characterized by the fact that even if we cannot predict the next mutation, we have a kind of insight concerning the whole evolutionary process: presupposing principles such as the 'survival of the fittest', we are able to understand why an evolution takes place – and, looking back, how it took place. Instead of predictions of future events, we gain retrodictions. Now, in complex systems even retrodictions are not possible – what we understand is the type of process going on, not the singular case; but even that needs principles which one has to presuppose (such as the 'survival of the fittest' in evolution, which, by the way, in many cases does not work). And since what happens in fact cannot be reproduced, there is no chance of finding an empirical foundation for principles explaining the unique.

There are two quite simple limiting principles which one has to have in mind:

No extended logical system can, as a system, treat itself in totality; this follows from Gödel’s theorem.

No physical system whatsoever can contain a complete picture of itself, for otherwise the picture would contain a second order picture which would have to contain a third order one, and so on, up to infinity.

This implies that rational models of the world – however complex they may be – can never overcome these limits. Therefore, we can never have a theoretical or physical model of the whole. And therefore we should reflect on the boundaries connected with the proposal of the new worldview.

A further difficulty, which in many cases sets a practical limit, consists in the way in which computers are used. We would never have been able to handle the systems discussed by Mandelbrot or Prigogine if we did not have the possibility of gaining approximate solutions and simulation models by means of computers in cases where an exact and direct solution is impossible. This means that the models are restricted to finite methodological operations – whereas they are used as if the complexity of the whole universe could be mapped by these methods.
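A minimal sketch of the kind of finite, step-by-step approximation referred to here: a fixed-step numerical integration of the Lorenz system, for which no closed-form solution is available. The classical parameter values are used, but the step size, duration and the very choice of example are illustrative assumptions, not taken from the paper.

```python
# Explicit Euler integration of the Lorenz system: the continuous flow is
# replaced by a finite sequence of arithmetic operations, i.e. by exactly
# the kind of "finite methodological operations" discussed in the text.

def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    # one finite time step of size dt
    return (x + dt * dx, y + dt * dy, z + dt * dz)

state = (1.0, 1.0, 1.0)
for _ in range(50_000):          # 50 time units in steps of 0.001
    state = lorenz_step(state)
print("approximate state after 50 time units:", state)
```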

The complexity theories of today lie in between these two limits, for they make use of human creativity, which allows us to develop new formal structures and new methods of proof or approximation. But we have to pay a price which, as we will see at the end, includes non-empirical and in this sense metaphysical elements:

The models we get do not touch reality in its ‘ineffability’, but constitute a conceptual network.

Since predictions are excluded for the singularities, the models cannot be corroborated or falsified in this essential point; furthermore in many cases the models are not useful for practical purposes.

One has to accept the dynamics of the processes in question, which leads to higher complexity in the sense of higher ordering structures. This dynamics depends on chance, mixed with causal necessity. It is chance which pushes the development forward and makes complexity grow throughout the whole evolution of the cosmos, whereas the continuous periods stabilize the process.

The models are constructed by the human mind, but they offer a 'divine' perspective from far outside. Yet they do not include the acting, thinking, modelling and evaluating subject as an individual, even when they are used in the human sciences.

In view of these challenges, one has to explain why we really accept these new theories together with the worldview they include. Indeed, even traffic jams and terrorism are metaphorically described in a vocabulary belonging to this intellectual horizon. The answer must be that a secularised world has found its adequate mapping. The new scheme of thought which complexity theories offer us allows us to impose an order on processes going on in time which would otherwise be inaccessible to human understanding. It seems possible to integrate not only the way we conceptualize everything from matter up to life and soul, but even freedom and responsibility. Since we, as human minds, will always insist upon our ability to be creative and to possess free will, we are prepared to accept the new commitments connected with this view. The answer concerning the justification of the new judicial commitments is: the theories of complexity offer a universal order for the cosmos of matter as well as of ideas. This is what constitutes their fertility. And the answer concerning the constitutive principles is that, as a metaphysical (i.e. neither empirical nor formal) principle, we are prepared to accept that nature as a whole is creative in its emergence towards higher-order structures. Back to a metaphysics of science: this is neither conceptual chaos nor an intellectual catastrophe, as hard-minded analytic philosophers might think, but the consequence of the new scheme of thought in the sciences as well as in the worldview.