Chance, Randomness, and Determinism

Tucker S. McElroy

This is a brief examination of the subject of randomness. Namely, what do we mean by "random"; what do we mean by "chance"? These seem to be philosophically loaded terms, and one cannot employ them without invoking religious presuppositions. This essay is an attempt to delineate the various approaches taken to the concepts of chance, randomness, and determinism. We can only provide an introduction to the rich collection of concepts, else this minor endeavor would swiftly become a book. More meticulous readers, disturbed by the cursory treatment of epochal metaphysical topics, should consult the references for further information. The latter portion of this essay takes an innovative "density" approach to the concept of probability, and thereby places a non-standard interpretation upon the axiomatic definition; this is an attempt to utterly remove the ambiguous notion of "chance" from the parlance of probability theory.

Glossary

The terminology is profuse, and, unfortunately, in practice the definitions are not standardized. An engineer, businessman, and theologian all mean something distinct when they use the word "random". For the purposes of clarity, the following definitions will be used throughout this paper.

Chance: An abstract principle, which governs the outcome of events in our cosmos (this term will hereafter refer to the empirically knowable universe, our own space-time domain). Roughly speaking, the doctrine of chance states that when an event occurs, it could just as easily have come about some other way, and there is no particular reason or cause for things having occurred this way. Note that the principle of chance is assumed to operate through certain rules or laws (the probabilities); chance does not attempt to explain these rules or laws, but takes them for granted and operates through them.

Ergodic: This is a precise mathematical term, which in various contexts usually means that spatial and temporal averages or computations are asymptotically identical. In probability theory, the comparison is made between a distributional average and a temporal/sampling average. This latter perspective is the one taken here—namely that an average of sampled data approximates with high probability the average over every theoretical possibility.

Stochastic: An adjective, which denotes that which pertains to probability theory. It is often used synonymously with "random", but has a more precise interpretation—the word "stochastic" is the technical term for what we often call "random" in common parlance. A stochastic object does obey certain rules and patterns, but is not completely predictable.

Chaos: The common usage of the term denotes disorder, unpredictability, and fluctuation. Paradoxically, it cannot be the complete absence of order (such a concept can never be defined, since "defining" is an order-imposing operation), but rather is the apparent loss or corruption of order, perhaps relative to some subjective aesthetic. In mathematics, a chaotic phenomenon is a deterministic structure (i.e., it has a functional form, with theoretical predictive capacity) which appears to be stochastic.

Random: Some use this word when they discuss raw chance. Others mean a stochastic number between zero and one, generated in such a way that any outcome is equally likely (i.e., a uniform random variable on the unit interval). This is a quite limited meaning. Others refer to a sequence of numbers that have no probabilistic relationship to one another; this is redundant terminology, since the probabilistic concept of independence covers this idea. A mathematician merely uses random as a synonym for stochastic—as such it does not preclude the possibility of some outcomes being more likely than others.

Deterministic: This idea says that a phenomenon has some cause, which necessarily determines the outcome (the cause typically preceding the outcome in time). Yet, it is more subtle than fatalism, since determinism allows for the possibility of primary and secondary causes—which just means that some causes are apparent (perhaps empirically), whereas others are hidden from finite understanding. From a scientist's standpoint, these secondary causes become somewhat moot, since they may never be detected or measured. As used in this paper, determinism can allow for non-predictability within the cosmos, and at the same time allow full causality once the viewpoint is extended beyond the boundaries of this world. This concept will be fleshed out more fully below.

Fatalistic: Everything in the cosmos is completely determined by forces acting within the universe—thus everything is predictable (in theory, though it may be unfeasible) if only sufficient information can be gathered. This view seems to bear uncomfortably against the edifice of quantum mechanics, which preaches the inherent unpredictability of the small particles within the bed of quantum foam.1

Probability: This is a mathematical theory, which is the basis of all modern studies in stochastic processes and statistics. The measure-theoretic (or axiomatic) formulation of the theory nicely lends itself to a density interpretation, which is described below. We often speak of the "probability of an event," by which we mean the chance that something happens. Depending on our notion of chance, this has various nuances.

Natural Philosophy

The term "natural philosophy" refers to the belief that there are sufficient explanations of the observed phenomenon to be found within the cosmos. In fact, all supernaturalism is precluded by axiom, since the universe is assumed to be a closed topological space—no information comes in, and none goes out. Thus the natural includes the full scope of the "possible," so that the supernatural becomes synonymous with the "impossible." No discussion will be given here of why this is an attractive or adequate belief system for many scientists and intellectuals; the point here is to describe the concept of "chance" that naturally flows from this epistemological source.

Certain laws and rules appear to be operating in the cosmos. Physics, chemistry, and the other sciences have attempted, over the past centuries, to trace these relationships through a partnership of reason and empiricism. The premise that empiricism leads to true knowledge is taken as a given in the current academic community. If a certain phenomenon is observed repeatedly, we notice the pattern and look for a cause. Upon such foundations modern science is avidly and faithfully pursued, and new truths established. For each observed phenomenon, some cause within the cosmos is to be sought; if no such reason can be determined, then one may either speculate or attribute the behavior to chance.

Now there is considerable variation in the natural philosopher's position on the concept of chance. Perhaps in older times (i.e., the beginning of the twentieth century), there was a current of optimism that various laws and rules would be worked out to such an extent that complete predictability would be a theoretical possibility. This would imply strong fatalism, and a mechanistic conception of man and his realm. But certain experiments in quantum mechanics have cast serious doubts on the tenability of such doctrine. Apparently, small particles move about randomly in the fullest sense of the word. Without any traceable cause, a particle may move to one location or another, and nothing in this world can account for the difference! It appears that a probability distribution on space is attached to each particle, and after "rolling the dice," the little particle moves to the appropriate location.2

Notice the logical deduction that a consistent natural philosopher makes at this point: since there are no forces external to the universe that can account for the motion of this particle, and nothing within the universe can be said to cause or determine its alteration, there is no recourse but to say that "chance" decides the path. Beyond this, no further progress can be made. We have no hope of comprehending the impersonal "chance" any more than the ancient pagans did (they often conceived of Fortune as a fickle woman!); so we shut the book and declare: "It is a mystery." Of course nothing prevents us from continuing the project of determining the exact probabilistic laws through which raw chance operates, but we must not try to probe the nature of chance itself. As a passing remark, we observe that similar conclusions from the premise of natural philosophy have been used to logically deduce the theory of evolution. To some of us, this is extremely questionable.3

Supernaturalism

Things are quite a bit different if we once admit the possibility of non-trivial "other-worldliness". If indeed there is "something" beyond and outside our own cosmos, and if interaction in some definable sense is possible, then we may have an alternate explanation for the phenomena we observe. Indeed, it may just be possible to completely eliminate raw chance from the picture, and thus obtain a more satisfying science—one that attempts to maximize explanatory power and reduce the dominion of the unknown.

First of all, perhaps we should present a brief argument as to why this would be a desirable situation. The objective of science is to explore and describe various aspects of our own cosmos, employing axioms laid down centuries ago.4 It is apparent that our universe is extremely structured (or at least we perceive it that way), and we have an innate desire to understand and analyze this structure. Thus, the extent to which we can eliminate "unknowns" determines the breadth and depth of our knowledge. However, raw chance is not just an unknown, but is rather an "unknowable"; it states that not only is prediction impracticable, but it is even inconceivable. For example, if a believer in the supremacy of raw chance also claimed belief in God (an all-powerful being who created the cosmos), they would have to conclude that this God could not control everything, since things, in the last analysis, were left to chance. This person might say that God sets up the distributions for events, but has no actual control over the random number generator! (A random number generator is a theoretical item, which spits out strings of zeroes and ones each with probability one-half.) God would not be all-powerful after all.

Thus, to a scientific mind, order is preferable to raw chance. But perhaps this is all wishful thinking—after all, experimental evidence does lead us to the conclusion that events in the cosmos are ultimately unpredictable. So a few words should be said to show that "raw chance" is intellectually repugnant. If all history is, in the last analysis, the product of raw chance, we may conclude that things could have turned out quite differently—and no meaning or cause can be adduced to justify one history versus another. Then why are we scuttling around so busily making propositions and theories, if our whole civilization, our language, our brains—are the product of a meaningless happenstance? It is fruitless to seek order in a world where that appearance of order only came about by chance.5 Thus, not only is supernatural determinism a desirable epistemological bastion, but it is a necessary prerequisite for the pursuit of meaningful scientific inquiry.

Now in defense of the opposition, one could mention that there is structure together with chance; perhaps the variance of the distributions is low, so that there is a concentration of events, and "on average" things tend to obey strict rules. This indeed seems to be the case, and it raises the next question—how did these certain distributions or laws (it is an elegant coincidence of probability theory that probability distributions are also called "laws") come about? Perhaps through chance—distributions on distributions ad infinitum; it wearies the mind!

The word "supernatural" is taken in the old sense—that which is above or beyond the natural. This is to be developed in the next section; Christianity presents the most coherent treatment of a supernatural system.6

Christian Determinism

The ancient religion of Christianity gives a consistent outworking of these ideas, which combines supernaturalism and the apparent randomness of this world in a subtle but lovely marriage. Here I will attempt to outline the corollaries of the basic doctrines which apply to this discussion.

Firstly, any concept of ultimate chance is utterly excluded from the beginning, since all events in reality are governed and determined (yea, caused) by a single intelligent entity—God. As a weak formulation, some theologians have conceived of God as only possessing foreknowledge; such a being would be too deficient in puissance to merit the epithet of "omnipotent", and thus we discard such feeble conceptions. In fact, God foreordains all events in this cosmos. Also, we must keep in mind that this entity "resides" outside and beyond our own cosmos, and thus it is utterly fallacious to apply our own limitations of space and time to One who transcends this order. And as God made the laws and rules of this cosmos, he also has power and authority to break or surpass them.

Now it becomes apparent that objects in this world, on the average, obey the laws discovered by science. Any small deviations can be attributed to either measurement error or noise—the conglomeration of a plethora of small effects, which it is unfeasible to compute. As for the mysterious quantum effects, we can now assert that the particles move according to the direction of God; and if the overall pattern is measured, it is seen to follow certain well-studied probability distributions. Thus God gives us many instances or samples, from a divinely scrutinized theoretical distribution.7

This formulation is consistent with unpredictability "within the cosmos"— since there is no possibility (for things within the natural world, obeying natural laws) of ascertaining a cause originating "outside", we may well perceive any happenstance as "uncaused." To speak mathematically, we may conceive of these supernatural causes as functions from beyond, which take values in our own history (the range space is contained within the events of our cosmos). Then, we observe only the values of the functions, but sadly have no knowledge of the function itself. In addition, this is a formulation with which theologians should be quite comfortable: God is continually managing the most minute matters of our world, operating upon matter supernaturally. This does not constitute miraculous activity, since there are no natural laws being broken; rather the supernatural economy is the foundation for natural law.

Fatalism and Chaos

Let us now contrast the former view with fatalism. In this picture, God (or a supernatural agent) merely makes one initial cause, which commences the growth of the cosmos; from that point onwards, every event causes every subsequent event in a theoretically predictable fashion. This is disagreeable to Christianity, which preaches a God that continually upholds the universe. It is at odds with modern science, which has noticed theoretically unpredictable phenomena. And it is odious to the human aesthetic.8 Whereas one (e.g., this author) can easily get on and enjoy life under the view that every action is predetermined supernaturally by God, it is another thing to say that this behavior is totally computable based upon one event in the distant past!

How is a theory of probability to be grounded within a fatalistic framework? The concept of chaos, as defined above, gives the only tenable sense. Indeed everything is caused and determined by prior cosmic events, but this is so intricate and complicated that no computational machine could possibly make sense of the data. Thus, while being in essence cosmically deterministic, phenomena are nevertheless apparently stochastic, defying even the most diligent scrutiny. If finding the deterministic laws and functions is unfeasible empirically and mathematically, then from a practical standpoint the underlying fatalism is somewhat irrelevant—we are better off (from the perspective of predictability) modeling the cosmos stochastically, so that we may employ the full power of probability theory.

Contrast this with natural philosophy—which says that randomness is not merely apparent, but is a fundamental reality—and the Christian determinism here formulated, which says that randomness is "random" as far as this cosmos is concerned, but ultimately there is a cause for everything found in supernatural realms. In natural philosophy, the probability model is an absolute reality—the pure abstraction of a mathematical theory constantly intrudes and permeates our world. In fatalism, the probability theory is a convenient tool, which is implemented due to the loss of information in the whirl of chaos. Between these extreme views, supernatural determinism preaches a probability theory, which is concrete and undecipherable from a finite perspective, and yet is completely tangible to the supernatural entity generating cosmic events. The mathematical definition of random variables and probability spaces lends itself nicely to this latter interpretation.

Ergodicity and Stochastics

Some discussion should now be given on the issue of "ergodicity" and stochastic structures in general. First, some important terminology will be introduced. When a probabilist speaks of a random variable, he is describing a variable that takes on various values (like one, two, three, or red, white, and blue—anything!) with given (known) likelihoods. The distribution is the collection of probabilities associated to each possible value of a random variable. If I measure the value of a tossed die, the outcome is a random variable, which can take on any integer value between one and six. The distribution tells me what the chance is for each outcome (for a fair die, each outcome has probability of one-sixth). Now suppose that we repeat an experiment three times, and measure the outcome each time—then we have three identically distributed random variables. Since the phenomenon is the same (even though the outcome may be different) each of the three times, we say that the distributions of the random variables are identical. This concept of identical distribution is extremely important in the subsequent discussion.
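To fix ideas (a standard illustration, not original to this essay), the fair-die distribution and the meaning of "identically distributed" can be written out explicitly:

```latex
% Distribution of a fair die: each of the six faces is equally likely.
P(X = k) = \tfrac{1}{6}, \qquad k = 1, 2, \ldots, 6.

% Three repetitions X_1, X_2, X_3 of the same experiment are
% "identically distributed" when they share this same distribution:
P(X_1 = k) = P(X_2 = k) = P(X_3 = k) \qquad \text{for every } k.
```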

We could compute an average of a random variable two different ways: theoretically and empirically. The theoretical average would involve taking a sum of the values weighted by their corresponding probabilities, which are determined by the distribution. The empirical average would be obtained by repeating a phenomenon in such a way that we generate a sequence of identically distributed random variables. Then we simply measure each outcome, and take the usual average of all the observations. The basic "ergodic theorem" states9 that (under some conditions), the empirical average gets closer and closer to the theoretical average (with a high probability) as we increase the number of repeated experiments. In other words, the sampling (empirical) average is asymptotically identical with the distributional (theoretical) average. If the observations were taken at subsequent times, then we might say that the temporal average approximates the distributional average—this was our definition of the term "ergodic".
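The contrast between the two averages is easy to see in a small simulation. The sketch below (in Python, purely illustrative and not part of the original essay; the function name empirical_average is my own) computes the theoretical average of a fair die, 3.5, and then empirical averages over increasingly many tosses:

```python
import random

# Theoretical (distributional) average: each value weighted by its probability 1/6.
theoretical = sum(range(1, 7)) / 6  # equals 3.5

def empirical_average(n, seed=0):
    """Toss a fair die n times and return the ordinary average of the outcomes."""
    rng = random.Random(seed)
    return sum(rng.randint(1, 6) for _ in range(n)) / n

for n in (10, 1_000, 100_000):
    print(f"n = {n:>6}: empirical {empirical_average(n):.4f} vs theoretical {theoretical}")

# As n grows, the empirical average typically settles near 3.5,
# illustrating the ergodic theorem (law of large numbers) described above.
```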

It is strangely apparent that our universe is ergodic. By this, I mean that many phenomena satisfy the ergodic theorem in practice. In some cases, one may have postulated the distribution of a random variable, computed the theoretical average; then this is compared with an empirical average conducted upon data generated by the same stochastic mechanism, and behold!—the stated convergence is eerily obtained. In other cases, we have no idea what the theoretical average is, but we do see the empirical average closing in on the same number (for instance, generate a large data set and calculate the average; then repeat this whole process many times—each of the averages will often be quite close to one another!).

Why should this be the case? Since the conclusions of the ergodic theorem (as well as the subtler "central limit theorems") surround us, this seems to lend validity to our probabilistic modeling of the cosmos. So it behooves us to take a closer look at the precise statement of the ergodic theorem (which, in other contexts, is called the "Law of Large Numbers"). What do we mean by an average of random variables converging to a fixed number? In its strongest formulation, the theorem can be interpreted in the following way: with probability one (i.e., all the time except for fluke instances), the empirical average tends to the distributional average as the sample size grows toward infinity. Thus, in theory it could happen that for a given experiment convergence would not occur, but in practice you would never see it happen.
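For reference, the strong form alluded to here (the Strong Law of Large Numbers; see Billingsley 1995) reads as follows for independent, identically distributed random variables X_1, X_2, ... with finite mean; the notation is standard rather than the essay's:

```latex
P\!\left( \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} X_i = \mu \right) = 1,
\qquad \text{where } \mu = E[X_1] \text{ is the distributional average.}
```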

A Brief Survey of Probability

Probability theory is a fairly recent development upon the scale of human civilization. The classical formulation dealt with a class of random variables where the number of possible outcomes was finite, and each was equally likely (for example, the flipping of a fair coin has two equally likely outcomes). But what if some outcomes are just more likely than others (consider an unevenly shaped coin, which gives a bias to one outcome over another)? Clearly a better theory was needed. Another suggestion was that of "empirical probability"— that we define theoretical probabilities as the limits of empirical proportions. This is essentially equivalent to assuming the ergodic theorem from the beginning and using it as a definition. In the event that the ergodic theorem does not hold for certain phenomena (this can and does happen, e.g., for random variables that fluctuate "too wildly"), this definition falls flat on its face. An axiomatic approach was developed in the twentieth century (by A. Kolmogorov10), which was built on the foundations of real analysis. This seems to be the most successful and most elegant of the approaches.
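To make the contrast explicit, the classical and empirical notions just described can be written side by side (standard formulations, not the essay's own notation); the axiomatic approach, revisited below in the density model, replaces both with a measure P satisfying a short list of axioms:

```latex
% Classical probability: finitely many, equally likely outcomes.
P(A) = \frac{\#\{\text{outcomes favorable to } A\}}{\#\{\text{all possible outcomes}\}}

% Empirical probability: a limit of observed proportions, where n_A(n)
% counts how many of the first n trials produce the event A.
P(A) = \lim_{n \to \infty} \frac{n_A(n)}{n}
```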

Here I'll make an aside on axiomatic mathematics: this is a great covert whereby mathematicians may completely dodge the hydra of epistemology. Questions like "how do we know it is true?" and "why is that type of reasoning valid?" are banished to the philosopher's (and theologian's) circle. The method merely consists of the declaration of certain delicate axioms, from which the subsequent collection of Theorems, Propositions, Lemmas, and Corollaries are carefully constructed through the operation of logic ("modus ponens"11 and the law of non-contradiction), laid upon a bed of supplemental definitions. No attempt is made to "prove" the axioms, though we may attempt to justify them on an aesthetic or practical basis. We merely ask that one accept the rules of logic (of course, Buddhists may have a problem with this) in order to deduce the resulting mathematics.

One other paradigm, the "subjective" theory of Bayesian probability, conceives of probabilities as perceived "degrees of confidence"; the resulting mathematics is identical with the above axiomatic formulation, but a very different interpretation is placed upon the quantities of interest. I will not comment further on this intriguing theory, but concentrate on Kolmogorov's system.

Modern probability theory has great explanatory power. It is from these axioms that such results as the ergodic theorem were established. This is, however, only the first item among a wealth of propositions. As a passing remark, we observe that the subject got its beginning in the various gambling problems of the Renaissance, bantered back and forth between intellectuals. It seems worthwhile to construct a coterie of examples drawn from a less nefarious context.

The Density Model

Below is a mathematical formulation which gives an acceptable model of the cosmos and is compatible with the above observations. It is an attempt to remove the concepts of chance, randomness, and so forth from the vocabulary of statistics. Following are some essential notations:

w is a "history"

W is the set of all possible "histories" w

w* denotes "our" history, the true state of affairs

C is any function from W to a set of measurable criteria (possible values). Under some subtle "measurability" conditions, it is called a random variable.

C(w) is the observed result of C in history w

C(w*) is the observation of C in our world

Let us unpack this terminology: each w contains an incredible store of cosmic information—somehow it encodes all the facts of our world from beginning to end. Suppose that absolutely every phenomenon in the cosmos was really the value of some function—the outcome C(w); whenever we speak subjunctively of what could have occurred, we are referring to C(w') for a distinct history w'. Note that the possibility of C(w) = C(w') is not excluded for some random variables C—this all depends on the particular phenomenon. Perhaps one way to picture this set of histories is to imagine a tree that is constantly bifurcating in time, according to which many possible cosmic alternatives occur. Our own history w* has countless parallel histories w; these may be similar or even identical in many ways, but in at least one aspect they actually differ (the details of this distinguishability are embedded in the rich tapestry of σ-algebras).
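The "measurability" conditions mentioned in the notation above, and the σ-algebras just invoked, can be made precise in the standard textbook way (see Billingsley 1995); the symbol F below is introduced here only for illustration and does not appear in the essay's notation:

```latex
% A sigma-algebra F is a collection of subsets of W (the admissible "events"),
% containing W itself and closed under complements and countable unions.
% A function C from W into a set of possible values is a random variable when
C^{-1}(B) = \{\, w \in W : C(w) \in B \,\} \in \mathcal{F}
\qquad \text{for every admissible set of values } B.
```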

It follows that the set of all such histories, denoted by W, is unimaginably vast. In probability theory, it is sufficient to leave W as an abstract set—our attention is focused upon the distributions of the random variables. Now the "probability" itself is a measure on the space W, which assigns to each event (each suitable subset of W) a number between 0 and 1; the axioms of Kolmogorov state that certain monotonicity and additivity properties should be satisfied. Let us see how this concept may be applied.
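Spelled out in the essay's own notation, Kolmogorov's requirements on the measure P can be written as follows (a standard statement of the axioms; the monotonicity mentioned above is then a consequence of these):

```latex
P(A) \ge 0 \ \text{ for every event } A, \qquad P(W) = 1,

P\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} P(A_i)
\quad \text{whenever the events } A_1, A_2, \ldots \text{ are pairwise disjoint.}
```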

Consider some cosmic event which we wish to model statistically. Then we generally conceive of the event as some subset of all possible histories w, such that specified outcomes occur. Observe that in our notation, this means that there is a random variable C which measures the phenomenon in which we are interested, and the event may be indicated by C taking on a particular value or values—the w's for which that value occurs constitute the event. Stated another way, we are interested in all possible worlds w for which the phenomenon occurs. The following specific example illustrates these ideas concretely.

Suppose an electron is in one of two compartments (with equal dimensions) of a box—either the right (R) or left (L) side. If at a particular time we measure the location, we can model this by considering the random variable C, with possible values of R or L. The event that "it's on the left side" is equal to the set of all w's (all world histories) in which the electron really is on the left side. In general, C(w) can be either R or L. If we actually measure the value R, then we know that C(w*) = R. Let's denote the event "it's on the left side" by the letter A (so we can write A = {all w such that C(w) = L}). But this event is actually a subset of W, and we may and will apply the probability measure P to it; then P(A) gives us a number between 0 and 1, which is interpreted as the "probability that it's on the left side." This might be modeled as the number 1/2, which means that in one out of two worlds w, the electron will be on the left side.

Interpretations

A probabilist will notice that there is nothing innovative in these definitions, except for the idea of "histories". But let us further imagine that we calculate the probability of an event by taking the following ratio: consider the count of all histories w in which the event occurs, and divide this by the count of all histories w in W. Thus, if we think of each w as a "particle" within the total "object" W, then the probability of an event is simply the density of the corresponding histories within the scope of cosmic possibility. Thus the name "density model".
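As a toy illustration of this density reading (entirely hypothetical, and not part of the original essay: the real W is unimaginably vast, so a small finite stand-in is used here), one can build a finite collection of "histories", evaluate a random variable C on each, and compute the probability of an event as the ratio just described. A Python sketch:

```python
import random

# A toy, finite stand-in for the set W of histories: each "history" records
# only one fact, namely which compartment the electron occupies.
rng = random.Random(42)
W = [{"electron_side": rng.choice(["L", "R"])} for _ in range(10_000)]

def C(w):
    """The random variable: read off the electron's compartment in history w."""
    return w["electron_side"]

# The event A = {w in W : C(w) = "L"}, and its probability as a density.
A = [w for w in W if C(w) == "L"]
print("P(A) is approximately", len(A) / len(W))  # close to 1/2 by construction
```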

With this in mind, we can explain the apparent randomness of our world. Through our senses, we are able to observe the values C(w*) for various random variables C, and we are able to deduce through reason the existence of a single state of things, namely w*. But we are at a loss to determine exactly which w is our w*; worse, we do not know the functions C! When we observe two phenomena C(w*) and Y(w*) which are identically distributed, it may well be (and often is the case) that C(w*) is not equal to Y(w*), even though from a probability standpoint there is no difference.

From the perspective of natural philosophy, the random variables come at us any which way, and there is no possible way of knowing the functions C. From the perspective of supernatural determinism, we should view the space W as grounded outside our cosmos, so that the functions C are movements or mappings from a supernatural realm into a collection of potential universes (not just our cosmos, but all subjunctive cosmoses as well!). Of course w* holds a special place in this story; to a mathematician, we might say it is an element of the dual space of random variables over W. And it is very possible that an entity from beyond could know and determine both w* and the random variables C. The ergodic theorems will guarantee that our w* does not deviate too greatly from the "center of mass" of possible realities.
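The dual-space remark can be unpacked in one line (a standard observation, stated in the essay's notation): the history w* acts on real-valued random variables by evaluation, and this action is linear, so w* may be identified with a linear functional on the space of random variables over W:

```latex
\mathrm{ev}_{w^*}(C) = C(w^*), \qquad
\mathrm{ev}_{w^*}(aC + bY) = a\,C(w^*) + b\,Y(w^*).
```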

San Diego, California

______________________

1 This is a huge topic; see Greene (1999) for a treatment accessible to non-physicists.

2 See Greene (1999).

3 A thorough critique of the theory of evolution can be found in Johnson (1993).

4 Francis Bacon is credited with much of the formulation of the tenets of "modern science."

5 For further argumentation along these lines, see Van Til (1967).

6 As a fellow faculty member pointed out, Judaism and Islam can equally present such a position, as no mention of the Trinity has yet been introduced. Of the three religions, the author feels that Christianity has the most internal coherence.

7 This goes back to mathematics as the underpinnings of this earthly realm, of which the author is thoroughly convicted. Every empirical pattern has a theorem behind it, and behind each theorem is the Primal Mathematician.

8 For an insight into the potential problems (when linked with inevitable foreknowledge), consult the Oedipus Cycle (Grene and Lattimore, 1970).

9 The result stated is usually known as the "Strong Law of Large Numbers." For a rigorous treatment of this large subject, see Billingsley (1995).

10 Andrei Kolmogorov was a brilliant Russian probabilist and a pioneer in the field. See Kolmogorov (1933) for details.

11 An if-then statement, coupled with the protasis, from which the apodosis is concluded.

 

References

Billingsley, Patrick. Probability and Measure (New York: John Wiley & Sons, 1995).

Greene, Brian. The Elegant Universe (New York: Vintage Books, 1999).

Grene, David and Richmond Lattimore, eds. Sophocles I (Chicago and London: University of Chicago Press, 1970).

Johnson, Phillip E. Darwin on Trial (Downers Grove, IL: InterVarsity Press, 1993).

Kolmogorov, Andrei. Grundbegriffe der Wahrscheinlichkeitsrechnung (Berlin: Springer, 1933).

Van Til, Cornelius. The Defense of the Faith (Phillipsburg, NJ: Presbyterian and Reformed Publishing Co., 1967).