TELEOLOGY – NATURAL SELECTION – PART III – THE APPEARANCE OF LIFE

"Intelligence is characterized by a natural incomprehension of life." – Henri Bergson.

In our consideration of natural selection as a potential alternative to design in explaining the universe, the last post explored the evolution of organic compounds and the environment wherein they exist. Today we will discuss the appearance of life. The fundamental question here is less about selection than origin. In other words, is it credible that life appeared spontaneously from a primordial soup of colloidal substances, or did it require a creator?

This is one of those difficult questions that may never be resolved definitively by science. If life forms are eventually found on other planets, the theory of spontaneous life would become more reasonable, but not clearly proven, since a designer might create life in multiple locations or life may have disseminated from an initial site. If scientists eventually produce a living thing in a laboratory from scratch, one might argue this is simply further evidence that life is created only by design (in that case by a human designer). And the hope of an experiment wherein organic and inorganic chemicals known to form readily in nature are placed in a sterilized vat and observed for the spontaneous appearance of life seems futile, given the astronomical odds against life arising on a human time frame.

On the other hand, the inability of man to synthesize life to date and the absence of demonstrated life elsewhere in the universe serve as only weak evidence of a designer, and certainly not proof. It would appear we have reached an impasse – the origination of life turns out to be inconclusive in deciding whether teleology applies to the universe; rather, opinion, often based on prejudice, likely informs one's belief. Agnosticism here is vain as well, since whether life is accidental or intentional is the crux of the whole question of teleology as opposed to a pointless universe. That is, if life is designed, then the universe has a point – as the setting for the fabrication of life!

Perhaps we should think about this enigma for a couple of days…and pick up where we left off next time.


TELEOLOGY – NATURAL SELECTION – PART II – INORGANIC

"Evolution consists largely of molecular tinkering – producing new objects from old odds and ends." – George Wald, Nobel Prize winner, Medicine, 1967.

Last time we discussed the two possible explanations for the mechanism of the universe and for life – either a designer (presumably God) or natural selection. We noted that natural selection is a highly coherent theory based on the assembly of some common observations and scientific principles; in fact, nowadays many theists concede that natural selection is irrefutable. Our next step is to sort out the two facets of this selection process – the inorganic and the organic.

In an earlier post we reviewed the 'evolution' of the inorganic as outlined by Fred Kohler.1 Briefly, following the Big Bang, subatomic particles aggregated into atoms, which combined to create molecules. Stellar processes formed heavier elements such as carbon, oxygen, and iron, and the forces in supernovae created the very heavy elements such as gold. Crystalline structures followed in favorable places, for example on the crust of the Earth. These highly organized substances were able to enlarge spontaneously or 'self-replicate,' as were the still more advanced colloidal structures, such as proteins, which formed from primitive amino acids.

But at the cosmic level inorganic evolution also includes the elaboration of essential environments, a process in which the force of gravity appears to be fundamental. Stars, planets, and galaxies are explained by the effects of gravity on what would otherwise be free-floating matter. Large amounts of matter in close proximity are drawn together by gravity into megastructures. The largest resulting structures become stars, while some of the smaller structures become planets and moons bound to the stars by gravity, though at varying distances. This critical circumstance allows for different temperatures on the various planetary bodies, permitting a range of inorganic chemical reactions.

Assuming we grant this 'evolution' of complex inorganic structures and their environment, we are left with the question of its implications for teleology. Can we truthfully say the unfolding of nonliving matter is purposive or designed? Let's turn to Julian Huxley, an early 20th-century biologist, who thinks we should broaden the idea of evolution into 'the directional processes' seen in the universe. "So far as a main direction is to be observed in physics and chemistry, it is, as all authorities are agreed, towards the degradation of energy and a final state in which not only life but all activity whatsoever will be reduced to nothing."2 However, he identifies a subsidiary direction towards the production of more complex forms of matter.3 Huxley further defines his three great principles of the cosmos – unity, uniformity, and development – which he sees as 'emergent' rather than 'creative.'4

In short, Huxley feels 'directionality' is a more accurate term than 'purpose' or 'goal,' and 'emergence' more accurate than 'creation.' The direction appears to be one of interim complexification of matter but eventual futility. Perhaps he is hinting at a component of short-term, but not long-term, design in the form of complexification. Next time we will expand our discussion to organic evolution.

1See Posts on this site September 9 and September 11, 2019, Human Destiny – Part IV – General Science View.

2Huxley, Julian, Essays of a Biologist. Chatto and Windus, London, 1929. Page 72.

3Ibid., page 252.

4Ibid., pages 241-242.


TELEOLOGY – NATURAL SELECTION – PART I

"And many monsters too the earth at that time essayed to produce…but all in vain, since nature set a ban on their increase and they could not reach the coveted flower of age nor find food nor be united in marriage. For we see that many conditions must meet together in things in order that they may beget and continue their kinds." – Lucretius.

The traditional arguments on teleology focus on the idea of the mechanical workings of the universe, the 'miracle' of life, and the apparent hierarchy of living things with man in all his cleverness at the top. The thesis is simple enough: this astonishing apparatus would not appear spontaneously or by accident, any more than a pocket watch found on the ground would be assumed to have materialized into existence. The rest of the argument is then automatic: there must be a designer or creator, which we call God.

It took thousands of years for man to unravel an alternative explanation for the mystery of the working of the universe and the existence of complex forms of life found on Earth. But of course the solution was not difficult; in fact, in some ways we might be amazed it took so long to solve this riddle. If we grant the reliability of our experience and some basic science, then a few facts and observations can be assembled into the theory of natural selection:

  1. The universe is very large and very old.
  2. The matter of the universe is made up of tiny particles called atoms.
  3. Atoms combine in a variety of ways spontaneously into compounds.
  4. Some chemical compounds are more stable than others.
  5. Some chemical compounds or entities can self-replicate.
  6. Self-replicating entities increase more readily than non-self-replicating entities.
  7. Some self-replicating entities replicate imperfectly.
  8. The environment surrounding entities in the universe undergoes constant change.
  9. Entities which have features that favor their existence in a particular environment persist.
  10. Self-replicating entities with features that favor their existence in an environment will continue to exist and replicate.

The first of these points was known by most of the ancients – particularly the Indians and the Greeks. Atomic theory (#2) dates back to ancient Greece, although its more modern form is only a few hundred years old, and #3 is simply basic chemistry. We empirically know items #4 (a rock is more stable than dirt), #5 (crystals and life), #6 (living things versus rocks), and #7 (congenital abnormalities; varying talents in children). The constant flux of the universe (#8) was identified in the 5th century BCE by Heraclitus, and its effects on life, for instance through the simple change of seasons, are obvious. Logic and experience inform #9 (fish die when a river or pond dries up, but mammals like the beaver or otter survive).

It is the last point which is a leap, but even it is suggested by variations in easily observable animal behavior (seagulls live on the coast where they can find fish, not inland). We need simply invert the idea that animals choose to live where they can live most easily into the idea that the animals that live where they do are those that can live there.
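To see how the ten premises combine, here is a minimal toy simulation, offered purely as an illustration rather than as part of the original argument; every number in it is an arbitrary assumption, and each "entity" is reduced to a single trait value.

```python
import random

# Toy model of the premises above: imperfect self-replication in a
# changing environment, with persistence of whatever happens to fit.
# All parameters are arbitrary; only the logic matters.

random.seed(1)

population = [0.5] * 20      # each entity is just a trait value in [0, 1]
environment = 0.2            # the trait value the current environment favors

for generation in range(200):
    # Premise 8: the surrounding environment undergoes constant change.
    environment = min(1.0, max(0.0, environment + random.uniform(-0.02, 0.02)))

    # Premises 5-7: entities self-replicate, but copying is imperfect.
    offspring = [min(1.0, max(0.0, trait + random.gauss(0, 0.05)))
                 for trait in population
                 for _ in range(2)]

    # Premises 9-10: entities whose features suit the environment persist.
    offspring.sort(key=lambda trait: abs(trait - environment))
    population = offspring[:20]

average_trait = sum(population) / len(population)
print(f"environment favors {environment:.2f}; surviving average trait {average_trait:.2f}")
```

Nothing in the loop 'chooses' anything; the surviving traits come to track the drifting environment simply because poorly matched replicators fail to persist – the inversion described above.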

In the next three posts I will expand this understanding of natural selection to encompass the inorganic, the organic, and humanity. Our focus will not be on the science behind natural selection and evolution, but rather on the philosophical consequences for teleology.


TELEOLOGY – STATISTICS

"Factual science may collect statistics and make charts. But its predictions are, as has been well said, but past history reversed." – John Dewey.

In talking about probability last time we distinguished subjective from objective probability and discussed the a priori form of the latter. Today we will look at the second form of objective probability: statistics, which we can define as a theory of information obtained by using experimentation or random sampling to make an inference about a larger set of measurements, existing or conceptual, called a population.1

An example is helpful here. If I tell you that 2% of Americans have naturally red hair, you understand that no one has recorded the hair color of every American; rather, this assertion is based on a representative sampling of the population. Of course we recognize the possibility of error in such sampling and its conclusions. Unlike the a priori calculation of a coin toss, statistics are less certain and less exact. A particularly salient example is the political poll, which can be fraught with problems and whose results are often misleading.
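As a purely illustrative sketch (the 2% figure and the sample size below are assumptions, not actual survey data), this is roughly how such an estimate and its margin of error are produced from a random sample:

```python
import math
import random

# Illustrative only: inferring a population proportion from a random sample
# rather than a full census.
random.seed(0)
TRUE_PROPORTION = 0.02   # assumed, for the sake of the example
SAMPLE_SIZE = 2000

# Sample the population at random and record who has the trait.
sample = [random.random() < TRUE_PROPORTION for _ in range(SAMPLE_SIZE)]
p_hat = sum(sample) / SAMPLE_SIZE

# Normal-approximation 95% margin of error for a proportion.
margin = 1.96 * math.sqrt(p_hat * (1 - p_hat) / SAMPLE_SIZE)

print(f"estimate: {p_hat:.3f}, 95% interval roughly "
      f"{p_hat - margin:.3f} to {p_hat + margin:.3f}")
```

It is the interval, not the point estimate, that carries the acknowledgment that the sampling – unlike the a priori coin calculation – can be wrong.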

Science, it turns out, consists largely of subjecting careful observations and experimental results to statistical analysis in order to generate general laws and predictions. Alternatively stated, science uses statistics to derive the universal from particulars. Of course this mathematically cleansed induction is not logically certain. As Hume observed, there is no metaphysical basis to believe that prior results will be repeated in the future; rather, this is just a habit of the human mind.

But the crux of the significance of statistics for our purpose is its reliance on two fundamental premises: the uniformity of nature and the law of universal causation. These premises appear in fact to be supported by scientific experiment and its technological applications even if they cannot be proven categorically. Once we grant the validity of statistical predictions, it then seems fair to ask whether uniformity in nature suggests design.

But we can extract a similar significance from the more exact a priori probability of the coin flip. On the one hand there is the consistency of mathematical calculation, which signals design; on the other, a priori probability is manifest in nature, where the coin flip is mirrored, for instance, in the roughly 50/50 sex ratio of animal offspring.

And then there is perhaps the most sublime corollary: mathematics itself is instantiated in nature (as opposed to being simply a human idea). Take the common white cabbage butterfly, where male and female are identical in appearance except that the male has one dark spot on its wing and the female two; it appears these butterflies can "count" to two. Or consider that while robins typically have three eggs per season, they correct for a fourth egg by pushing one out of the nest; thus it appears robins can "count" to three.

Mathematical, probabilistic, and statistical uniformity are strong arguments for design in the universe, but we must hold off on a final conclusion until the end of this section. Next time we will take up evolution – often called 'natural selection.'

1Ott, Lyman and Mendenhall, William, Understanding Statistics. Duxbury Press, Boston, 1985. ISBN 0-87150-855-9, page 3.


TELEOLOGY – PROBABILITY

"The most important questions of life are indeed, for the most part, really only problems of probability." – Pierre-Simon Laplace.

In thinking through whether reality and the universe are meaningful or pointless, I have been taking the approach that if the universe is designed, then it has meaning, whereas if it is not designed, its meaningfulness is suspect. So far we have looked at chance, complexity, chaos theory, accident, and quantum uncertainty as possible explanations of the universe that circumvent design. The next logical subject in our search is the statistical or probabilistic nature of the cosmos.

Probability is a difficult concept to pin down, but we can dissect its meaning into subjective and objective forms. Subjective probability refers to opinions or estimations of the likelihood of an event – that is, levels of confidence – typically in non-recurring or unique situations where the estimate cannot be justified by specific scientific or mathematical data. So, for instance, if you think your political party will probably win the next Presidential election, this is subjective probability.

Objective probability consists of two forms. The first of these is what A.J. Ayer calls a priori, referring to situations with a mathematical symmetry of frequencies of outcomes.1 A simple example is the coin flip, where there is a 50% probability of the coin coming up heads or tails. However, there is no way to confirm empirically that a coin flipped repeatedly will yield 50% heads and 50% tails, since in fact any outcome in any number of coin flips is theoretically possible. (If you flip a coin 100 times and get heads 57 times, you don't decide that coin flips give heads 57% of the time; you assume this was just one particular outcome of 100 coin flips.)

In fact, objective probability assures that no specific outcome can be predicted. For instance, while the chance of rolling double sixes is one in 36 for a single roll, the chance of rolling double sixes 10 times in a row is roughly one in 3.7 x 10^15; yet given sufficient rolls, 10 consecutive double sixes will occur eventually, and with an infinite number of rolls will theoretically occur an infinite number of times.
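A short worked calculation, offered only as a sketch of the arithmetic behind these figures, makes the contrast concrete:

```python
import random

# The arithmetic behind the a priori figures above (illustrative sketch).
p_double_six = 1 / 36                # one roll of two fair dice
p_ten_in_a_row = p_double_six ** 10  # ten consecutive double sixes

print(f"double sixes once:         1 in {1 / p_double_six:.0f}")
print(f"double sixes ten in a row: 1 in {1 / p_ten_in_a_row:.2e}")  # about 1 in 3.7e15

# And the coin-flip point: any particular run of 100 fair flips is possible,
# so a run with 57 heads says nothing against the 50/50 symmetry.
random.seed(3)
heads = sum(random.random() < 0.5 for _ in range(100))
print(f"one run of 100 fair flips: {heads} heads")
```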

But of course the symmetry of objective probabilities cannot be absolute, since in fact a coin flip must come down either heads or tails, and a roll of two dice must yield a specific total between 2 and 12. Chaos theory offers the best theoretical explanation of the eventual outcome of coin flips and dice rolls: tiny or obscure factors determine the result in a given instance, even though those factors are neutral in the long run. (Contrast this with quantum mechanics, where both outcomes can occur simultaneously at the subatomic level and only the addition of the observer leads to an actual outcome.)

The conclusion from this brief post is that subjective probability refers only to an individual's level of confidence, while a priori probability can neither be proven empirically nor predict specific outcomes. Next time we will look at the second form of objective probability – statistics.

1Ayer, A.J., The Central Questions of Philosophy. Penguin Books, Harmondsworth, Middlesex, England, 1973. Page 164.


TELEOLOGY – UNCERTAINTY – PART II

"I have no reason for believing the existence of matter. I have no immediate intuition thereof: neither can I immediately, from any sensation, ideas, notions, actions, or passions infer an unthinking, unperceiving, inactive substance – either by probable deduction or necessary consequence." – George Berkeley.

Last time I traced briefly the history of the development of the uncertainty principle as finalized by Werner Heisenberg. Subsequently, he and other physicists attempted to understand the significance of this discovery and created what came to be known as the Copenhagen Interpretation, consisting of three main theses:1

(1)  The fundamental micronature is indivisibly bipartite: i.e., it depends on both field-theoretical and particle-theoretical considerations.

(2)  We will never return to classical determinism.

(3) We must learn to live in thought with the uncertainty relationship.

The philosophical implications are even more wide-ranging, as delineated by Norwood Russell Hanson:2

(1)  The significance of scientific knowledge becomes questionable.

(2)  The role of science may need to be reconsidered as descriptive rather than predictive.

(3) Causality remains unexplained or, worse, is literally unexplainable.

(4) It is unclear whether man can gain objective knowledge of the world.

This latter point is particularly poignant. In fact, quantum mechanics reminds us that the act of measurement defines the thing measured and is inextricably intertwined with it. So even at the macro level, the world revealed to us depends on the measuring tools we possess and the kind of information we are capable of understanding. Reality as experienced by man, then, is not absolute but subjective.3

In addition, when we enlarge the scope of uncertainty to the level of the universe, we are left with a question: how do superstructures such as galaxies, stars, and planets emerge from the indeterminate quantum fog without an external measurer or observer (assuming there is nothing external to the cosmos)? The answer offered by physicists is 'decoherence,' wherein "the internal interactions of a complex quantum system constitute a kind of incessant self-measurement that allows the system as a whole to display fixed and definite properties even though the underlying quantum state is in constant flux."4 From this unfolds the independent and objective reality we label as classical.

Of course this solution is at best hypothetical, and an outside observer seems the more palatable alternative. Therefore Heisenberg's uncertainty principle leads us to question not only the nature of reality but also whether its fundamental laws require an observer/designer – if not a deity, then perhaps a demigod. We will pick this up again in the synthesis at the end of this section.

1Edwards, Paul (editor), The Encyclopedia of Philosophy. Macmillan Publishing Co., Inc. & The Free Press, 1972. Volume 7, page 43.

2Ibid., page 41.

3Lindley, David (introduction), in Heisenberg, Werner, Physics and Philosophy. HarperCollins Publishers, New York, 2007. ISBN 978-0-06-120919-2, pages xiii-xiv.

4Ibid., page xx.


TELEOLOGY – UNCERTAINTY – PART I

"Nothing is certain; not even that." – Arcesilaus.

Closely related to chance, complexity, chaos, and accident is the concept of uncertainty. The road to understanding uncertainty in reality is sinuous. It begins in ancient Greece, when Democritus realized that objects in the world can be broken into smaller parts, and the resulting parts can be further broken into still smaller parts. But of course logic alone instructs us that in this process there will be a part so small it can no longer be divided. Democritus called this the atom (Greek a- meaning 'not' and tomos meaning 'cut' – i.e., uncuttable). In the materialist world espoused by a later Greek philosopher, Epicurus, and still later by the Roman Lucretius, atoms and their movements (swerves) were the fundamental building blocks and determinants of reality.

This simplistic physics survived the unscientific middle ages and was augmented in the 17th century when Newton discovered his fundamental laws of motion and gravity, which suggested a mechanistic or deterministic explanation of events in the universe. Newton was committed to a 'corpuscular' description of matter (and of light as well) in accordance with the logically deduced description of the ancients.

Early in the 20th century, Niels Bohr proposed the tiny solar-system-like model of the atom we still picture as the building block of matter. But advances in 'microphysics' revealed that the atom was not in fact the smallest component of matter, opening up the entire field of subatomic physics. Subatomic particles appear different from classical matter, demonstrating both particle and wave aspects. Electrons and positrons have finite mass and size, photons (though massless) carry momentum and energy, and all display both motion and energy. All attempts to reduce particles to waves or vice versa failed, and over time it became apparent that subatomic entities have a dual 'particle-wave' nature.

At this point the concept of uncertainty appears on the scene, when Werner Heisenberg demonstrated that if a subatomic particle's location is precisely specified, its momentum cannot be determined, and likewise, if its momentum is specified, its location cannot be determined. Moreover, this is not the result of imprecision in our measuring devices, but rather is intrinsic to subatomic matter itself. In fact, he demonstrated that location and momentum are not even defined until witnessed by an observer. These seemingly irrational conclusions appear to be confirmed by experiment and are further validated by their utility in explaining other features of physics.
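In symbols, the relation Heisenberg established is conventionally written as follows (a standard textbook statement, included here only for reference):

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

where Δx is the uncertainty in position, Δp the uncertainty in momentum, and ħ the reduced Planck constant; no refinement of the measuring apparatus can push the product of the two uncertainties below this bound.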

The philosophical problem is immediately evident: if determinism requires specification of the location and momentum of particles to predict events, and such specifications are impossible or non-existent, then the universe at its elemental level cannot be mechanistic as previously assumed. But then what is its nature, and what determines the location and movement of subatomic particles in the absence of an observer?


TELEOLOGY – ACCIDENT

"Nothing under the sun is accidental." – Gotthold Ephraim Lessing, German critic and dramatist, Emilia Galotti (1772)

To some, the last three posts on complexity, emergence, self-organization, and chaos theory amount to attributing the existence of the universe and life to accident – perhaps more than blind chance, but certainly less than planned design. In this post we will look at some key objections to this hypothesis.1

The counterarguments can be listed as below:

1.   In statistics, mathematicians typically consider anything less likely than one in 10^50 as de facto impossible. The probability of a universe compatible with life appearing spontaneously from inert matter is less than one in 10^50. Therefore the universe did not appear spontaneously.2

2.  Even setting aside the question of probability, a reasonable person following the scientific method will not favor a proposition with very low probability over one with a much higher probability.

3.   Calculated probabilities of the formation of life by accidental processes are also astronomically low.

4.   Complexity and chaos theory can explain order developing from disorder, but not the information content intrinsic to living organisms.

5.  The laws and mathematics that govern the origin and functioning of the universe indicate that a logic or intelligence existed prior to spacetime.

Other thinkers argue that even now physical laws cannot explain biological finalism or consciousness. Rather, the reductionists are in effect begging the question, defining away the distinctions between animate and inanimate matter by focusing only on the physical laws they share without addressing their fundamental differences.3

At the end of the day, the challenge to the accidental notion of reality is the central debate of teleology, itself the fundamental issue in whether the universe has meaning. If the universe is accidental, then its meaning and that of its features, including us, is suspect. If design can be demonstrated by default – as would be the case were both chance and accident excluded – then there is hope for meaning.

As we continue to dissect this issue, perhaps we should stop to assess the definition of the word 'accident.' For our purposes, Webster's has two entries of use:

(1) any event that happens unexpectedly, without a deliberate plan or cause.

(2) Philos. any entity or event contingent upon the existence of something else.4

With regard to the first definition, I find it difficult to think of reality as "unexpected"; however, our investigation of teleology is an attempt to establish or refute reality as deliberately planned. On the other hand, the second definition makes even a deliberate universe an "accident," in which case philosophically one might say reality can be an accident even if designed. We will come back to this in our synthesis at the end of this section, but moving forward, we look next at uncertainty as a manifestation of the universe.

1Overman, Dean L., A Case Against Accident and Self-Organization. Rowman and Littlefield Publishers, Inc., Lanham, 1997. ISBN 0-8476-8966-2, pages 181-197.

2This is essentially a reformulation of the 'fine tuning' version of the teleologic argument for the existence of God as outlined in the post on this site dated February 15, 2019.

3Davies, Paul, The Cosmic Blueprint. Simon and Schuster, New York, 1988. ISBN 0-671-60233-0, page 101.

4Webster's New Universal Unabridged Dictionary. Barnes & Noble, Inc., 2003. ISBN 0-7607-4975-2, page 12, definitions 3 and 6.


TELEOLOGY – CHAOS

"A very small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we say that effect is due to chance." – Henri Poincaré.

In our search for meaning we have been exploring the origin of reality with a focus not on how but on why the universe and its many facets came to be. The possibility that everything is the result of mere chance seems unscientific and hence untenable. Last time we looked at complex systems and noted their propensity for creation through emergence and self-organization. Today we will focus on the other side of that coin – chaos.

Chaos for our purposes will be defined as “seemingly random, unpredictable, behavior in a system governed by deterministic laws.”1 Our interest then is in chaos as the explanation of what appears as the randomness of the universe, the subjective experience of which is interpreted as pointlessness.

Mathematicians and scientists have become aware in the last 150 years that the predictable, mechanistic Newtonian description of nature is an oversimplification. In the real world, the conditions under which natural events occur involve innumerable variables which can be specified with only limited accuracy. When calculations and predictions based on these slightly inexact variables are carried through multiple steps, the inexactitudes compound into unpredictable outcomes. Thus tiny changes in causal factors lead to unexpectedly large effects, a phenomenon known as the butterfly effect – referring to the image of the flapping of a single butterfly's wings altering conditions just enough that, with magnification, a hurricane occurs thousands of miles away.

The mathematics of chaos theory is fascinating but cannot be detailed in brief enough form for this site. However, the key findings are that chaotic systems of different types appear to gravitate to similar patterns: bifurcations in calculated results, recurring ratios and so-called 'magic numbers' (such as 4.669201… and 2.5029…, the Feigenbaum constants), and nonlinearity. What the calculations and patterns reveal is that our universe is not a linear Newtonian system, but a chaotic one. Philosophically we learn that reductionism – the thesis that all complex phenomena can and should be understood by 'reducing' them into simpler pieces – "will not explain the great mysteries of the universe," and that traditional "determinism is a myth."2,3
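As a minimal sketch of what this sensitivity means, consider the logistic map, a standard textbook example of deterministic chaos (offered here as my own illustration, not something drawn from the cited sources):

```python
# Logistic map x -> r * x * (1 - x): two trajectories that start almost
# identically diverge completely even though each step is fully determined.

R = 3.9                           # parameter value in the chaotic regime
x_a, x_b = 0.200000, 0.200001     # starting points differing by one part in a million

for step in range(1, 51):
    x_a = R * x_a * (1 - x_a)     # deterministic update rule
    x_b = R * x_b * (1 - x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: {x_a:.6f} vs {x_b:.6f} (gap {abs(x_a - x_b):.6f})")
```

After a few dozen steps the two trajectories bear no resemblance to each other, even though no randomness was ever introduced – the butterfly effect in miniature.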

In conclusion, the universe may be the result not of chance, but of the emergence and self-organization described by complexity theory, coupled with the mathematically demonstrable unpredictability of chaos theory. Next time we will consider the consequences of this possibility and arguments against it.

1Strogatz, Steven, Chaos: Course Guidebook. The Great Courses, Chantilly Virginia, 2008. Page 116.

2Ibid., pages 99-102.

3Davies, Paul, The Cosmic Blueprint. Simon and Schuster, New York, 1988. ISBN 0-671-60233-0, pages 35-56.


TELEOLOGY – COMPLEXITY (CONT’D)

Last time we defined complex systems and reviewed their features such as unpredictability, robustness, and especially non-linear behavior. However, for our purposes, the most critical feature of complex systems is emergence – "the spontaneous creation of order and functionality from the bottom up."4 The last part of this definition is the basis on which complexity may serve as an explanation for reality, in contradistinction to 'design,' typically understood as 'top down.' So, for instance, the 'music' of songbirds is the spontaneous creation of many individuals, each singing for its own purposes and in response to other birds' calls, while the designed music of an orchestra is directed by a conductor.

Emergence can be simple, meaning a macro-level property arising from a system in equilibrium – for example the wetness of water, which develops with the aggregation of H2O molecules; or complex, where a macro-level property develops in a system not in equilibrium, such as a flame. Emergence can also be weak, where the macro-level effect would not be expected but can be explained at the micro level, for example a bee colony; or strong, where the macro-level property cannot be deduced from interactions at the micro level – for instance human consciousness.

The next step then is to think about whether our concerns such as the universe, life, and humanity can be explained by a 'bottom up' emergence. Some scientists such as Paul Davies and some philosophers such as Karl Popper believe that, in fact, emergence leads to new states of higher organization not fully explained by lower-level laws and entities. In this line of reasoning, 'self-organization' is a fundamental property of nature under conditions characterized as "far-from-equilibrium, open, non-linear systems with a high degree of feedback."5 The laws involved in emergent phenomena are not the same as those in traditional physics and may not be deterministic in an absolute sense. So then we might say emergence and self-organization appear when the whole is more than the sum of its parts. Alternatively, as astrophysicists John Barrow and Frank Tipler have written: "We do not think teleological laws either in biology or physics can be fully reduced to non-teleological laws."6

Complexity then may be a more acceptable explanation for reality than chance. Before we finish our work on an alternative to conscious design as the explanation of reality, we need to make one last stop – chaos theory. Join me for that next time.

1Eugene Wigner won the 1963 Nobel Prize for physics for his work on elementary particle physics.

2Page, Scott E., Understanding Complexity. The Great Courses. Lecture 1.

3Davies, Paul, The Cosmic Blueprint. Simon and Schuster, New York, 1988. ISBN 0-671-60233-0, pages 21-34.

4Page, Scott E., Understanding Complexity. The Great Courses. Lecture 6.

5Davies, Paul, The Cosmic Blueprint. Simon and Schuster, New York, 1988. ISBN 0-671-60233-0, page 142.

6Ibid. Page 149.
