Over a succession of Tuesday mornings in the spring of 1962 my brother Robert and I met infinity and took its measure. We wasted no time getting ready for school on Tuesdays, because the early morning math class taught by Mr. R.W. McCarley started at 7:30. Mr. McCarley was younger than most of our teachers. Whether he came in early for us and a few other students from pure enthusiasm, or possibly gained some compensation from the school or our parents, I never thought to question.
Whoever had the idea for that extra class must have received an enthusiastic response from our father. When he was a senior in high school in Oxford, Mississippi, in the 1930s, the faculty decided that he had mastered everything they had to teach. So they let him study books on electronics and physics, aided by tutoring from professors at the University.
In Mr. McCarley’s class we were exposed to Boolean algebra, set theory, and other subjects well beyond the usual high school geometry, algebra, and pre-calculus. Most of all I remember that he introduced us to different orders of infinity. Mr. McCarley told us that the infinite set of rational numbers, which includes all the numbers represented as fractions of two integers, is no larger than the infinite set of integers. This is true despite the fact that between any 2 rational numbers lie infinitely many more. The crucial insight is that all rational numbers can be mapped one-to-one onto the infinite set of integers (see Box).
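That mapping can even be written down as a short program. The sketch below is my own illustration, using the Calkin-Wilf sequence (via Newman's recurrence) rather than whatever construction the Box contained; it lists every positive rational exactly once, so pairing each fraction with its position in the list is the one-to-one correspondence with the integers:

```python
from fractions import Fraction
from itertools import islice

def rationals():
    """Yield every positive rational exactly once (Calkin-Wilf order)."""
    q = Fraction(1)
    while True:
        yield q
        # Newman's recurrence: q -> 1 / (2*floor(q) - q + 1)
        q = 1 / (2 * (q.numerator // q.denominator) - q + 1)

first = list(islice(rationals(), 7))
print([str(q) for q in first])  # ['1', '1/2', '2', '1/3', '3/2', '2/3', '3']
```

Every fraction appears somewhere in this list, already in lowest terms, and none appears twice, which is exactly what "no larger than the set of integers" means.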
Although it might seem that there are more rational numbers than integers, the order of infinity is the same. Not so for the set of real numbers. Real numbers include all rational numbers and also numbers that cannot be expressed as a fraction of 2 integers (an example is “pi”). Real numbers cannot be mapped onto the infinite set of integers. Therefore, the set of real numbers has a higher order of infinity. Wow!
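The standard proof that no such mapping exists is Cantor's diagonal argument, published in 1891; a compressed version runs as follows:

```latex
% Suppose, for contradiction, that the reals in (0,1) could be listed:
\[
  r_1,\; r_2,\; r_3,\; \dots \qquad r_n = 0.\,d_{n1}\,d_{n2}\,d_{n3}\dots
\]
% Build a number x = 0.e_1 e_2 e_3 ... whose nth digit disagrees
% with the diagonal of the list:
\[
  e_n =
  \begin{cases}
    5 & \text{if } d_{nn} \neq 5,\\
    6 & \text{if } d_{nn} = 5.
  \end{cases}
\]
% Then x differs from every r_n in the nth decimal place, so x is
% missing from the list. No integer-indexed list can exhaust the
% reals: they form a strictly higher order of infinity.
```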
Mr. McCarley invited us to try to bisect, then to trisect, an angle drawn on a sheet of paper, using only a compass and straightedge. The first operation is simple. The second, dividing a given angle into 3 equal smaller angles, is impossible. After letting us sweat it out for half an hour (before 8:15!), Mr. McCarley confessed as much, and he added that the impossibility of trisecting an angle had been proven only recently in the history of mathematics.
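That proof, due to Pierre Wantzel in 1837, is algebraic rather than geometric. Here is the core of the argument for the 60° angle, sketched in modern field-theoretic language:

```latex
% Trisecting 60° means constructing cos 20°. The triple-angle identity
\[
  \cos 3\theta = 4\cos^3\theta - 3\cos\theta
\]
% with 3\theta = 60° and x = \cos 20° gives
\[
  4x^3 - 3x = \tfrac{1}{2}, \qquad \text{i.e.} \qquad 8x^3 - 6x - 1 = 0.
\]
% This cubic has no rational root, so it is irreducible over the
% rationals, and cos 20° has algebraic degree 3. But every length
% constructible with compass and straightedge has degree a power
% of 2. Hence cos 20° is not constructible, and the 60° angle
% cannot be trisected.
```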
Counting, arithmetic, and geometry developed all over the world in concert with trading as well as religious observation of the heavens. Multiplication tables appear on clay tablets from Sumer dated as early as 2500 BCE. An Egyptian papyrus from about 1650 BCE is an instructional text for students of arithmetic and geometry. We count from 1 to 10 and then start counting again, 11 to 20, and so on. The Babylonians counted from 1 to 60 before starting over. Today we divide an hour into 60 minutes and a minute into 60 seconds, thanks to the Babylonians. The decimal system of numbers with the critical inclusion of “0” for zero first appeared in India. This knowledge was transferred and enhanced by Persian and Arab scholars before reaching the West. “Algorithm” and “algebra” are words derived from Arabic.
In ancient Greece a remarkable enthusiasm flared up for elucidating angles, figures, and numbers. For some it became much more than a game or pastime. Pythagoras, son of a Mediterranean trader in the late 6th century BCE, may have caught the spark from Egyptian priests or from travelling Chaldeans in Tyre. Either Pythagoras or his followers devised a kind of geometrical algebra. Their contributions included what we still call the Pythagorean theorem, which proves that the square formed on the hypotenuse of a right triangle is equal in area to the sum of the squares formed on the other 2 sides (see header image for this blog).
The Pythagoreans applied measurement and number to the lengths of vibrating strings, demonstrating a quantitative theory of music. They formed secret societies devoted to ritual practices, dietary restrictions, philosophical purification, and geometry. The inner circle, those who learned to prove the theorems, were called mathematikoi, and from them our word “mathematics” derives. Aristotle later wrote, “The Pythagorean … having been brought up in the study of mathematics, thought that things are numbers … and that the whole cosmos is a scale and a number.”
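The string experiments can be restated in a few lines of arithmetic. The sketch below is my illustration, not a historical reconstruction; it uses the classical ratios the Pythagoreans are credited with: halving a string raises its pitch an octave (2:1), and the consonances of the fifth and fourth correspond to 3:2 and 4:3.

```python
from fractions import Fraction

# Pitch (frequency) varies inversely with string length, so the
# classical consonances are small whole-number ratios.
octave = Fraction(2, 1)   # full string vs. half string
fifth = Fraction(3, 2)    # string stopped at two-thirds length
fourth = Fraction(4, 3)   # string stopped at three-quarters length

# A fifth stacked on a fourth spans exactly one octave:
assert fifth * fourth == octave

# Twelve fifths slightly overshoot seven octaves; the gap is the
# "Pythagorean comma", a ratio of about 1.0136.
comma = fifth ** 12 / octave ** 7
print(comma)  # 531441/524288
```

The comma is the small discrepancy that later tuning systems, down to modern equal temperament, were invented to smooth away.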
Mathematics has long been connected with notions of ultimate discovery and true reality. My encounter with infinity in the early morning math class gave me a small sense of what this means. The Pythagoreans took it as far as they could, and others extended the trail. It is no coincidence that René Descartes, who devised the Cartesian coordinate system giving algebraic expression to geometry, also reasoned his way to a set of truths which, he said, could not be doubted.
With the advent of computers, transistors, and integrated circuits during my lifetime, mathematics has escaped almost every imaginable expanding cage of computational limits. Smartphones in the hands of children do billions of calculations per second.
It’s important to remember that mathematics came first and prepared the way for its digital application, as soon as the engineering physics of germanium and silicon, especially silicon, reached a stage that allowed it. For more than 4 millennia before the first electronic computer and the first transistor appeared, mathematics, performed manually as a product of pure human thought, earned its place as the central organizing core of science, and arguably its most advanced expression.
During this time the Pythagorean dream inspired leading thinkers in every age to approach the possibility, subliminal yet guiding, that mathematics might describe ultimate reality.
Today mathematics rises to new heights on a surging crest of technology. In the next blog we’ll examine how the mathematical foundation for our digital age – a transition from classical mathematics to statistics and simulation – was laid in the mid-1800s, more than a century and a half ago.
Epimenides, from the island of Crete in the eastern Mediterranean Sea, broke the rules that constrained human thinking in his time, opening a floodgate of inspiration that swept GSOT to a place and a level that could never have been anticipated.
Epimenides gave us the teasing paradox of the liar, when he wrote “Cretans, always liars.” Was his statement a lie? Is it logically possible that he was telling the truth? The answer depends on whether you take his words to refer to himself as well as other Cretans.
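The logic of that question can be checked mechanically. In the sketch below, a toy model of my own rather than a standard formalization, the claim C is "every statement made by a Cretan is false," uttered by Epimenides, himself a Cretan; we allow one other Cretan utterance U and brute-force every truth assignment:

```python
from itertools import product

consistent = []
for c_true, u_true in product([True, False], repeat=2):
    # C holds exactly when all Cretan statements (C and U) are false.
    claim_holds = (not c_true) and (not u_true)
    # An assignment is consistent when C's assigned truth value
    # matches what the claim actually says about the world.
    if c_true == claim_holds:
        consistent.append((c_true, u_true))

print(consistent)  # [(False, True)]
```

Only one world survives: Epimenides is lying, and some other Cretan utterance is true. But if C were the only Cretan statement ever made, the loop would find no consistent assignment at all; read strictly self-referentially, the sentence is a genuine paradox with no truth value it can coherently take.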
Scholars give differing opinions as to whether Epimenides consciously framed it as a paradox when he wrote the words. I prefer to think that he did. With a wink and a smile he would stimulate your thinking and then move on to the next topic. Certainly by the last few centuries BCE, the Greek world associated him with the Liar’s Paradox, thus acknowledging his contribution to the philosophy of self-reference.
But Epimenides was renowned for much more than that, although very few people today have ever heard or read his name. If only Paul in the New Testament had named this man whom he quoted twice, we might all be better acquainted with him.
Epimenides reportedly wrote 3 major works comprising 15,500 lines of verse in all, but all that we have remaining are scattered lines quoted by later writers including Paul.
Epimenides was said to have spent 57 years asleep in a cave in Crete, passing from young man to venerable sage in the process. The origin of this legend is a matter of speculation. The island is riddled with soluble rock formations known as karst, forming more than 2000 caves.
Caves on the island of Crete. Left: Cave at Lisos. Right top: Cave of Zeus. Right bottom: Cave of Psychros.
Religious rites were held in the caves, and it is possible that Epimenides received his education largely from priests or priestesses associated with the caves. A long absence from public visibility, perhaps hiding in a cave, also suggests that Epimenides may have been out of favor with the ruling authorities. This would be consistent not only with his deprecation of Cretans as liars, but also with his introduction of new, controversial ideas. As Maximus of Tyre wrote in the 2nd century CE, Epimenides “had come into relations with the gods and the oracles of the gods and Truth and Justice.”
Diogenes Laertius’ Lives of Famous Philosophers in the 3rd century CE gives this account of the mission to Athens by Epimenides:
He was considered by the Greeks as a person especially beloved by the Gods, on which account, when the Athenians were afflicted by a plague, and the priestess at Delphi enjoined them to purify their city; they sent a ship and Nicias the son of Niceratus to Crete, to invite Epimenides to Athens; and he, coming there in the forty-sixth Olympiad, purified the city and eradicated the plague for that time, he took some black sheep and some white ones, and led them up to the Areopagus, and from thence he let them go wherever they chose, having ordered the attendants to follow them, and wherever any one of them lay down they were to sacrifice him to the God who was the patron of the spot, and so the evil was stayed; and owing to this one may even now find in the different boroughs of the Athenians altars without names, which are a sort of memorial of the propitiation of the Gods that then took place.
The Book of Acts in the Bible tells of Paul’s reaction on seeing one of those Athenian altars to an unnamed god, a legacy of Epimenides.
In his Gifford lecture of 1916, William Ramsay suggested that Epimenides’ visit to Athens happened around 500 BCE.
Under the Pisistratidae Athens grew from a small town into an important city; but in this too rapid increase it outgrew healthy conditions. The laws of sanitation, which the old religion had prescribed for small social groups, were quite inadequate for a large city. Athens was ripe for a pestilence; and, after the tyrants were expelled, the slackness and want of forethought which attended Athenian democracy aggravated the evils of city management, while party strife distracted attention. The result was as recorded by Maximus, Diogenes, and others; a plague struck the city.
Epimenides succeeded partly because he understood better methods of sanitation, according to Ramsay. To sway public opinion, he appealed to religion, combining older with newer religious sympathies. Ritual satisfied the majority. But Epimenides also taught “new and higher conceptions of the divine nature and its relation to man.”
Ramsay’s estimate of the teaching of Epimenides is based partly on the scant surviving lines of his poem “Cretica.” The poem addresses Zeus, who may be thought of either as the foremost of the Greek gods or as the major male deity gaining prominence in the matriarchal Minoan religion. The following lines are based on the recognition that a quote in the Syriac commentary of Ishodad actually derives from Epimenides:
They fashioned a tomb for you, holy and high one, Cretans, always liars, evil beasts, idle bellies. But you are not dead: you live and abide forever, For in you we live and move and have our being.
These words convict the Cretans who built a tomb for Zeus, who according to local myth was buried there after being killed by a wild boar. In reading these lines 2500 years later, I find it remarkable that Epimenides in his own culture had to contend with the “death of God.” More importantly, Epimenides conceived of Zeus in universal terms, not as a god who merely led the Olympian or Minoan court, but as eternal God, the source of creation. Encountering the last of these 4 lines in the Bible (Acts 17:28) many years ago without any knowledge of its source, I was struck by the contrast with what I had previously understood as primitive Greek mythology.
I wonder if Epimenides consciously saw himself as one of the Cretans who had slandered God (all Cretans always liars?)…or if the old myth of Zeus’ death somehow stimulated him to prophesy that God’s hiddenness might relate to human incapacity for self-recognition, an inability to see that “in you we live and move and have our being.”
When Socrates taught in Athens during the latter half of the 5th century BCE, the older and newer conceptions of the gods continued to demand human decision. Xenophon makes this clear in his Memorabilia of Socrates:
[Socrates] believed that the gods care for men, but not in the way that most people believe they do. They suppose that the gods know some things but not others; but Socrates believed that they know everything, both words and actions and unspoken intentions, and that they are present everywhere and communicate to people about all kinds of human affairs…. Socrates…in his relationship to the gods said and did only what was recognizably consistent with the deepest reverence.
It’s reasonable to speculate that Epimenides, summoned to Athens at a time of crisis, could have influenced Socrates substantially over the span of one or two generations, contributing to a new age of universal wisdom heralded by the concept of omniscient God. If this is pressing the point, then let me say that Epimenides at least emblematized the break with the past caused by the new way of thinking. After 2500 years the passion of the Cretan prophet’s insight elicits wonder still.
Photos: Rocky beach and mountains of Crete at Preveli, Jarek, CC Public domain on Pixabay. Caves of Crete, all CC by SA 3.0, Wikimedia Commons: left, Wolfgang Sauber; top right, Olaf Tausch; bottom right, Torben Schramme. Sheep, Papi, CC Public domain on Pixabay. Areopagus, C Messier, CC by SA 4.0, Wikimedia Commons.
 Anna Strataridaki from the University of Crete suggests that Epimenides intended a broad philosophical point, so that a better translation of his words would be “All humans, ever liars.” Strataridaki, A.I. Epimenides of Crete: some notes on his life, works, and the verse ‘Kretes aei pseustai’, Fortunatae 2 (1991): 207-223. Accessed 8/20/2016 at https://dialnet.unirioja.es/descarga/articulo/163834.pdf.
 Immanuel Kant likewise began his major contributions late in life, describing himself as awakening from a “dogmatic slumber” upon instigation from the Scottish skeptic, David Hume.
How do we bring the problem of self-reference into our search for GSOT? Human perception has challenge enough finding order and regularity in the wild universe. So it almost seems too much to bend vision inward, trying to heed the inscription at Delphi to “Know thyself.” But we can start by looking back to that ancient time when “Know thyself” first appeared in human history.
Julian Jaynes caused quite a stir with his mid-1970s pronouncement that humans became conscious sometime early in the first millennium BCE. Jaynes writes with a firmly and naïvely positivist viewpoint. Nevertheless, he makes an interesting case for a marked shift in the way humans think, occurring shortly before the classic age of Greece and before most of the writing of the Jewish Bible.
Jaynes uses the word “consciousness” to describe the new patterns of human thinking. I feel compelled to comment briefly on “consciousness,” a word for which the clearest role is to help market books. Then we’ll see how valuable his insights are.
Jaynes and other positivists restrict their inquiry into consciousness to the examination of outward expressions of self-awareness, assuming a correspondence between inner thoughts and publicly definable events. They position themselves as external observers in order to derive the necessary measurements and results. The privileged vantage point of external observers arises from rules restricting observation to that which can compel agreement among suitably trained researchers like themselves. As we noted earlier, those rules demand reproducibility over time and space…and among observers, eliminating that which is immediate, personal, and particular.
The problem with thinking about consciousness under these restrictions is the loss of differentiation between me, you, us, him, her, them, it. To illustrate this, let me pose a question: Is there one consciousness or many? For the positivist, the answer is easy – of course, there are many consciousnesses. If there are 3 people in a room, there are 3 consciousnesses. But I would suggest that the answer is far more difficult and complex. I think that consciousness has been much too readily identified with brain, behavior, test responses, etc. As for the 3 people in the room, there is far more at play than the interaction between 3 individual consciousnesses. What about the consciousness of the external observer? There is only one consciousness available in full measure to me; consciousness in you and in other people is a matter of projection and similitude of response. Another question: Is my dog Shep conscious? I am inclined to talk to Shep, care for him, and play with him entirely as if Shep is conscious, and it seems to me Shep responds in like manner.
Despite these misgivings, Julian Jaynes seems to have identified a remarkable shift in the way that humans think, occurring early in the first millennium BCE. He analyzes the epic poem attributed to Homer, the Iliad, which was transmitted orally from the time of the events described around 1230 BCE until it was written down some 400-500 years later.
Jaynes notes the lack of any description of self-awareness among the Greek and Trojan heroes of the Iliad. Their reasons for sailing across the Mediterranean, for sulking in the camp (Achilles), and for venturing out from the safety of the city (Hector), like every act we might consider driven by human emotion or will, are in fact described in the Iliad as the aural command of a god, immediately obeyed.
Great deeds, either heroic or shameful, were automatic responses to a god’s command. Not only in the Iliad of Greece, but also in other cultures, Jaynes identifies the same pattern of a voice from a god commanding the action of humans. This includes the prophecy of Amos, which he considers to be the oldest written book of the Old Testament.
Jaynes comes up with the interesting theory that the voice of the god actually originates in a region of the right cerebral hemisphere analogous to Broca’s area for speech in the left cerebral hemisphere. (The speech area resides dominantly in the left hemisphere in right-handed people and in about half of left-handed people.) Thus an “unconscious” right hemisphere would provide signals heard as a voice by the left hemisphere, which is equally “unconscious” but better connected for external hearing and speaking. Jaynes views this arrangement as an adaptive evolutionary strategy in early humans.
However, as city populations grew and communication became more complex, the previously adaptive strategy of dual-brain responsiveness failed to cope. The environmental pressure of urban living changed the right brain/left brain partnership to a state of left hemispheric dominance. Jaynes called this “the breakdown of the bicameral mind.” The change occurred too rapidly to represent genetic modification or selection, but more likely depended on the extraordinary plasticity of neural connections.
Once speech became consolidated in the left brain, the strange voices, commands from the gods, fell silent or largely so (schizophrenia a well-known exception). This allowed the left cerebral hemisphere to move in a different direction, far more adaptive to an increasingly complex society. Expressions of self-awareness began to appear. Even the Odyssey, composed somewhat later than the Iliad and also ascribed to Homer, began to show contemplation, consideration of alternative future courses, and reflection on the worthiness of a person’s own acts.
By the time of Augustine in approximately 390 CE, the notion of self-awareness could become a central philosophical theme:
Evodius: You have taught me clearly that it is one thing to be alive and quite another to know that one is alive.
Augustine: Which of the two do you think is better?
Evodius: Clearly, the knowledge that one is alive is better.
Jaynes’ psychological theory has great value for its identification of changing patterns of human thought just prior to an extraordinarily creative period in the development of human civilization in the eastern Mediterranean and the Levant. This period included the classic age of Greece, the rise of the highly influential Persian empire, and the time of composition of much of the Old Testament. The right brain/left brain part of Jaynes’ theory is a weaker aspect, which might or might not have some future use in understanding present human psychology and psychiatry.
In the next blog, we’ll look at Epimenides, the almost forgotten, half-legendary hero from the island of Crete whose life and thought helped to shape the remarkable transformation of GSOT described above.
Photo of bust of Homer in the British Museum, London, public domain, uploaded on Wikipedia by JW1805. Trojan war scene drawing based on a lost Greek vase from 540 BCE. A. Rumpf, Chalkidische Vasen (Berlin/Leipzig 1927), pl. 12, public domain. Wikimedia commons.
 Jaynes, J. The Origin of Consciousness in the Breakdown of the Bicameral Mind. Houghton Mifflin, Boston, 1976.
For example, Hofstadter, D.R., and Dennett, D.C. The Mind’s I. Basic Books, New York, 1981; also Dennett, D.C. Consciousness Explained. Little, Brown, Boston, 1991.
 Augustine. On Free Choice of the Will, transl. Thomas Williams. Hackett, Indianapolis, 1993. p. 13.
Rule #2. The overarching viewpoint is not allowed.
Rule #3. If it doesn’t make a difference for somebody’s predisposition to act, then it doesn’t make a difference.
These rules tie me to the earth. I want to cut the ties. Soaring in the sky, I want the overarching view. With eagle’s eye, let me see all things in heaven and earth as they really are.
Freedom means tearing loose from the tethers. Freedom doesn’t care how well-reasoned, obvious, inspired, or agreeable are our chosen rules for GSOT.
The human story has always been about breaking the rules, cutting the cords that keep the mind grounded, and letting new ideas take flight.
Peppering his young followers with questions, Socrates brought them to awareness that their unexamined assumptions might be wrong.
With relentless imagination, Descartes found ways to doubt almost all that he had learned from the Schoolmen. Then just as relentlessly he found ways to restore almost all.
Scientists and positivists broke away from a limited first-person viewpoint, which the latter derisively called subjectivity. The exalted height from which positivists observe the world and all that is in it, even themselves, is the celestial “we” of science, based on methods reproducible in time, place, and person.
Now for the sake of freedom, pushing aside any claim of “progress,” we try to break ties with the past. The great thinkers of ancient times can take us only so far. Rationalists of the past several centuries have been discredited. Both positivists and fundamentalists of the modern age fail in a hubris of certainty.
Through the presentation of the first 3 rules, our answer has been to define our selves. Every sentence is first person. The overarching view is not allowed. Agreement need not be universal, so all discussion is local. Every proposed belief is viewed through the lens of someone’s disposition to act. Now the call is to go beyond ourselves.
We live daily under a set of intersecting and expanding domes: First, the cranium housing each brain. The roofs of our families’ houses and automobiles. Circles of friends with differing levels of conversation – shallow, deep, and lofty. Communities and cities ringed by highways, and traffic helicopters tracing arcs above them. Nations and leagues of nations bounded by economics and politics, and overhead the programmed trajectories of nuclear missiles, hopefully never to fly. These are the referents of “I” and limited “we.” Sometimes a person or a group manages to think globally, but even such bold thinking stays within a dome as the context remains limited in time and space. Each worldview has its horizon, its hard-shelled firmament, expansive but finite.
Accepting these limits for now, let’s look critically at the idea of self. What does it mean to talk about my self? If I with eagle’s eye were to see all things on earth clearly, where would my self appear? The sorry answer is that my self remains on earth and not in the sky, and the eagle’s eye is just an awkward metaphor.
The phrase “down to earth” echoes with wisdom from the past. Yet even on earth, where might I stand to observe my self? As we discussed in the first blog of Rule #2 – there is no easy answer.
“Know thyself,” read the inscription at Delphi, quoted over and again by Socrates. Perhaps the only way to know myself is to break these rules. Let’s take a quick look now (and more later), with eyeballs turned around in our heads, at the critical problem of self-reference in philosophy.
Self-reference conflates subject and object. Am I referring to the self of the observer/presenter, or the self of that which is observed/presented? And does it make any difference to distinguish these relations? If I make a distinction, it would seem that I’ve broken Rule #2 – the overarching viewpoint is not allowed.
As stated earlier, self-reference is exhibited in each of our first 3 rules:
Rule #1 urges the recognition of self-involvement in every symbolic expression of thought.
Rule #2 disallows the overarching viewpoint. But is it not the case that Rule #2 itself assumes an overarching viewpoint?
Rule #3 from the pragmatism of Charles Peirce says that I may know my own beliefs only through recognition of my disposition to act. (While this is cast in first-person singular, it also holds in plural.) Therefore, knowing my own beliefs is a matter of projecting, either actually or virtually, how my beliefs will play out in the public, social, real world.
Self is never recognized as self in isolation. It is largely a social construct.
O wad some Pow’r the giftie gie us
To see oursels as others see us!
wrote Robert Burns, Scottish poet. His inspiration came in a church service, not from sermon or music, but instead from a louse crawling up the bonnet of a lady in the pew ahead. His unspoken advice for the poor woman –
O Jenny, dinna toss your head,
An’ set your beauties a’ abread!
Ye little ken what cursed speed
The blastie’s makin!
Thae winks and finger-ends, I dread,
Are notice takin!
How disparate are the thoughts of various individuals in the room – Burns the poet, the lady in the bonnet, the preacher, and the louse.
Self-reference is difficult for logicians as well as poets and ladies. Rule #4 – Break these rules – makes self-reference explicit and flaunts the dilemma. Perhaps thereby we may negotiate with ourselves and find some kind of accommodation.
Long ago Epimenides from Crete said, “Cretans, always liars.” Was his statement a lie? Is it logically possible that he was telling the truth? Was he trying to break through our human incapacity for self-recognition?
In a time of cultural crisis, the people of Athens summoned Epimenides, who they believed “had come into relations with the gods and the oracles of the gods and Truth and Justice.” He initiated reforms for a city that had outgrown the limits of its prior culture. Within a few generations Athens entered its golden age. Even today Epimenides continues to inspire when we read about Athens in the Bible. The next 2 blogs will describe an ancient revolution in human consciousness and assess the impact of the Cretan prophet.
Pragmatism was the 3rd rule presented in our search for GSOT. The rule of pragmatism can be stated simply: unless it makes a difference in somebody’s disposition to act, it makes no difference.
We have considered whether free will is valid under pragmatism. This does not mean proving that free will is true, probably an impossible task, but rather demonstrating that free will can constitute a clear and uncontradicted belief upon which a person can act.
It has seemed possible that free will might run afoul of a fundamental philosophical postulate – the principle of causation. Therefore, we looked in some detail at the apparent conflict between these 2 propositions:
Every event has its cause.
The ideas I/we bring up and the habits of responsive action I/we form can produce effects in the world.
The first proposition is essentially the same as the Principle of Sufficient Reason put forward by Leibniz as the foundation of all philosophy. The second is a statement of free will, although it could be viewed alternatively as a statement that posits a sense of my self or of our group as a locus of will. I shall call it a statement of free will.
We looked at 5 ways of dealing with this conflict and found all of them wanting.
1. Simply to call it a paradox gains no ground at all.
2. To say that the will emerges at the biologic level of the hierarchy of matter gives false credence to the will, because the same rules of evidence in science-based modes of thinking apply at all levels of the hierarchy (see this blog for 1 and 2).
3. Chaos theory is a mathematical game or tool, not a serious attempt to explain GSOT.
4. The engagement of randomness by quantum physics does not work for free will, because random choice is not what free will means (see this blog for 3 and 4).
5. God is an answer in the sense of trust, but the appeal to God leaves us wondering what we are trusting God to bring about (see this blog).
Our 6th attempt now takes a deeper look at pragmatism. Charles Peirce, we may recall, defined his new philosophy in this way:
…the whole function of thought is to produce habits of action; and that whatever there is connected with a thought, but irrelevant to its purpose, is an accretion to it, but no part of it…. To develop its meaning, we have, therefore, simply to determine what habits it produces, for what a thing means is simply what habits it involves. Now, the identity of a habit depends on how it might lead us to act, not merely under such circumstances as are likely to arise, but under such as might possibly occur, no matter how improbable they may be. What the habit is depends on when and how it causes us to act. As for the when, every stimulus to action is derived from perception; as for the how, every purpose of action is to produce some sensible result. Thus, we come down to what is tangible and practical as the root of every real distinction of thought, no matter how subtile it may be; and there is no distinction of meaning so fine as to consist in anything but a possible difference of practice.
Peirce provides here a model of stimulus and response. The stimulus arrives by perception; the response is one of action (praxis) in the world. The model looks scientific, not surprisingly, because Peirce was a working scientist.
Pragmatism is Peirce’s model for clarity of belief. The model applies to an individual, you or me, or to some group, a family or a people joined by culture or the almost universal community of science. Therefore, pragmatism lends itself readily to science, and yet pragmatism goes far beyond science in its implications. One of those implications is the embedding of free will within the pragmatic model. Let’s look deeper.
Both pragmatism and positivism focus on stimuli and effects that can be touched and seen in the real, external world. But Peirce discusses stimulus and action in the context of a person responding and acting in the world. The locus of observation remains in that person – think you or me – or perhaps in a group – think us. In contrast, positivism appeals to an ideal, universal, reproducible observer. The personal and particular aspects of observer(s) fade away in positivism.
In keeping the focus personal, either singular or plural, Peirce in the first sentence quoted above appropriately uses the word “purpose.” He describes thought and action prompted by an external stimulus, aimed toward an external target, yet in the transition governed by a function of inner responsiveness. “Purpose” for me is appropriate because I am not asking what kind of responses are made by some class of humans under observation; instead I am asking in the midst of living what kind of choices I make. “Purpose” for us is appropriate because we are not asking what kind of responses are made by some externally identified class of humans under observation; instead we are asking in the midst of living what kind of choices we make.
Don’t be misled by Peirce’s use of the word “habit” to describe the response function. It should be understood as “choosing.” The formation of a new habit is the strongest form of choosing. As Peirce puts it, “the whole function of thought is to produce habits of action.” If these habits of action are produced, the clear implication is that they are newly instantiated, and if newly instantiated through a process of thought, then they are chosen.
In times past, when I would arrive at this point in my thinking, an obstruction in the form of an unanswerable question seemed to block the path. I was then concerned to distinguish between a responsive act I might make that would be truly new and not pre-existing, on the one hand, and, on the other, a responsive act that would merely represent a new discovery of something pre-existing within my self – something like a particular function among an overall set of functional responses linking stimulus and action.
Even then, I had the feeling that the unanswerable question, which is the distinction between those two possibilities, might make no difference at all. Now I’m sure that Peirce would agree: it makes no difference. There is no privileged, overarching viewpoint from which such a distinction can be made. We must start from where we are, inhabiting flesh, engaged in lives of action.
Thus it makes no sense to ask whether my human will creates a choice de novo or simply follows a particular functional response specified by sensory inputs interrogating a (possibly infinite) set of pre-existing functional responses. These alternatives form a pragmatic pair. If the only way to discover the particular response among many possible responses is to live it out, then the practical implications are the same.
Is this a philistine, sparse, mechanical view of what might otherwise be called soul or spirit? Again, don’t be misled by the language. It fits rather well with what Immanuel Kant saw when he looked at GSOT and declared,
Two things fill me with ever increasing awe and admiration – the starry sky above me and the moral law within me.
Kant’s moral law is an apt description of the functional responses that constitute human choosing. Here Kant describes the moral law as he discovers it within one individual person – himself. Elsewhere Kant makes the same mistake that Aristotle, Descartes, and other rationalists make, assuming that the pattern of choosing that he finds clearly within himself must be true universally. That’s wrong. There is no requirement for agreement (discussed here).
Even within an individual person, the set of functional responses – discovered and discoverable only by living out each one, then moving to the next – might be infinite, or if finite, as large as the brilliant canopy of heaven.
We cannot know truth as such. Our commerce is with belief, and belief is a product of thoughtful human activity. More specifically, belief is a product of thoughtful decision-making by you and me and us.
Even so, some beliefs turn out to be wrong, and other beliefs turn out to be right – that is, they turn out to predict what happens. Therefore, some beliefs move closer to truth than others.
What is the criterion by which we can judge some beliefs to be more true than others? The criterion is the probability with which beliefs predict the future as we experience events rolling past.
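The idea of grading beliefs by how well they predict unfolding events can be made concrete in a small sketch. Everything below – the toy weather outcomes and the two candidate "beliefs" – is invented purely for illustration and is not from the text; the point is only that predictive success is a frequency we can tally as events roll past.

```python
# Illustrative sketch: comparing two beliefs by their track record of
# prediction. The data here are invented toy examples.

def predictive_score(predictions, outcomes):
    """Fraction of predictions borne out by the events that actually occurred."""
    hits = sum(p == o for p, o in zip(predictions, outcomes))
    return hits / len(outcomes)

# Events as they "rolled past"
outcomes = ["rain", "sun", "rain", "rain", "sun"]

belief_a = ["rain", "rain", "rain", "rain", "rain"]  # "it always rains"
belief_b = ["rain", "sun", "rain", "sun", "sun"]     # a more discriminating belief

score_a = predictive_score(belief_a, outcomes)  # 3 of 5 correct
score_b = predictive_score(belief_b, outcomes)  # 4 of 5 correct
```

On this toy record, belief B moves "closer to truth" than belief A in exactly the pragmatic sense above: not by appeal to any privileged viewpoint, but by a higher frequency of successful prediction.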
You and I choose our beliefs, it would seem, through one of 2 paradigms. The first is a paradigm of observation-experiment-calculation, which might be regarded as the paradigm of science. The second is a paradigm of personal preference arising from within. This has been considered to constitute free will.
But are the two paradigms really separable? For both involve the interaction of a person with the surrounding external world. The paradigm of science never entirely escapes the touch of personal preference, most prominently the inner urge to do science at all. Moreover, no person works generically as a scientist, but each chooses some field of study, as well as the methods, samples, conditions, and degrees of certainty that will satisfy her or him. If someone were to ask why you study mitochondria in yeast, you might well reply that such work in the pursuit of knowledge is fulfilling. Such a reply expresses personal preference.
Likewise the second paradigm obviously involves the interaction of a person with the surrounding external world. It demands repetitions of observation, calculation, and experiment, in which a multitude of scenarios from the world are presented as hypotheses to the inner person, and one of these evokes the strongest response. That is what we call a choice of personal preference. The result becomes a belief about what the inner self wants. Yet the process – interaction of a person with the surrounding external world – has the same fundamental elements as the process of science.
Pragmatism asserts that the paradigm of science and the paradigm of will are not separable in kind. They differ only in degree: in how far agreement and public verification of predictability apply, or conversely in how far an individual spark, or the conjoined effort of a group smaller than the whole, finds meaning through application in the real world.
Let’s go back now to the question that began this discussion several blogs ago. Does pragmatism really have an answer to the paradox that arises when we try to believe both of the following statements?
1. Every event has its cause.
2. The ideas I/we bring up and the habits of responsive action I/we form can produce effects in the world.
Statement (2.) is a straightforward declaration of human agency. All belief must be understood pragmatically through the lens of habits/inclinations of responsive action. Free will is not a conclusion at the end of a theoretical proof. It is instead part of the process by which we assign meaning.
Statement (1.), on the other hand, has no such clear and straightforward interpretation. We have to recognize that “Every event has its cause” is in its barest sense a metaphysical statement. When spoken by a religious believer, it invokes God setting forth a rule by which the universe is going to operate. Without the theological reference, “Every event has its cause” is simply an unprovable statement and just as much an article of faith as belief in God. Thus “Every event has its cause,” unqualified and without acknowledgement of faith, is not a meaningful construct for human understanding.
Let us then decisively choose statement (2.) as the basis for practical living in the world. Recognizing that (2.) is in conflict with (1.), we must just as decisively reject statement (1.). It is neither demonstrably nor self-evidently true that every event has its cause. There is no sufficient reason to accept The Principle of Sufficient Reason as a starting point.
When he was 24 years old, Charles Peirce, in a speech to the Cambridge High School Association, announced his judgment that David Hume, the Scottish skeptic, had invalidated the Principle of Sufficient Reason formulated by Leibniz, as follows:
In [Hume’s] day, the philosophical world was divided between the doctrines of Leibniz and Locke, the former of whom maintained the existence of innate ideas while the latter rejected them. Hume, accepting the latter doctrine, which was prevalent in England, asked, “How do we know that every change has a cause?” He demonstrated by invincible logic that upon Locke’s system it was impossible to prove this, and that it ought not to be admitted as a principle at all.
In rejecting “Every event has its cause,” therefore, we find ourselves in good company.
But it is only when “Every event has its cause” oversteps its limits, when we let it pretend to excise purpose and will from human life, that we must reject it. In ordinary terms “Every event has its cause” blooms with pragmatic meaning, as we consider endeavors of scientists unveiling the forces of nature, advances from engineers deploying technology for human progress, and evidence-based solutions from physicians caring for health. These and other activities enlarge the scope of human freedom.
It is only in the metaphysical sense that “Every event has its cause” can be found to conflict with the affirmation of human free will. Now pragmatism has superseded metaphysics. The principle of causation, also known as the principle of sufficient reason, when pragmatically understood in the context of human activity, functions as a highly reliable interpretive foundation that supports purposeful action.
The Place of Our Age in the History of Civilization, first published in the Cambridge Chronicle, a weekly (Sat., Nov. 21, 1863), vol. 18, no. 47, p. 1; reproduced in Charles S. Peirce: Selected Writings, ed. P.P. Wiener, Dover, New York, p. 7.
If young Christian missionaries deeply encounter a great and ancient non-Christian culture, what happens?
This is a placeholder for a blog that I want to begin now and complete in the future. I’m grateful to cousin Karen Smith who reminded me of the connections between Kate Smallwood’s time in Suzhou, Kate’s best friend there named Alice Longden, and Alice’s son Huston Smith (not related to Karen), the great scholar-practitioner of world religions.
The previous 3 posts (first one here) told the story of Kate Smallwood, who became engaged to Billy Guyton just before leaving for a 5-year mission term in Suzhou, China. When she returned, they married and eventually settled in Oxford, Mississippi. Kate never tired of describing Chinese customs and wisdom to various women's groups in North Mississippi. But what can we say about her religious beliefs?
Kate’s actions over the next 50 years in Oxford as well as testimony from those who knew her best give abundant evidence that she remained actively faithful to Christ throughout her life. On one point, however, uncertainty must be expressed. Did she adhere to Christian exclusivity – the notion that belief in Christ is the exclusive means to salvation?
Kate’s daughter Ruth left Christianity to join the Unitarian Church. Years later Ruth told me that her mother “never showed any resentment toward me for joining a church other than Methodist.”
In Mississippi during the early and mid-20th century, if you had any religious ideas beyond traditional Christianity, you kept them quiet. Ruth told me that her father Billy Guyton once showed her "papers about the Greater Unitarian Fellowship which he had joined and to which he had been sending contributions for many years." Billy never told Kate about that. He adored her. He also respected her faith, but did not fully share it. What would she have said about his joining the Unitarian Church? We don't know.
Further insight may be gained by looking at another family. Alice Longden was Kate’s closest friend during her 5 years in Suzhou. After marrying Wesley Moreland Smith, she returned with him to China where they served 41 years as missionaries. Their son, Huston Smith, was born in Suzhou in 1919 and spent his first 17 years in China.
Huston Smith’s teaching, including 14 books, a public television series with Bill Moyers, and several videos, emphatically expresses a viewpoint that might be called nonexclusive Christianity, which appreciates all of the world’s major religions. The banner at the top of this blog shows a paperback cover of his most significant book along with a handwritten note from my mother confirming Karen Smith’s reminder to me. The Religions of Man was first published in 1958. Later editions have the gender-neutral title The World’s Religions.
At this time in August 2016, Huston Smith's official website states that at age 97 he is now in hospice care. What a contribution he has made!
I want to learn more about Huston Smith's ideas. I'll revisit this blog in the future after reading more of The World's Religions (thus far only the chapter on Confucianism in the 1958 edition) and hopefully his autobiography as well. But don't wait for me. You can check out the official website, another website, or Amazon to benefit from his lifelong exploration of religion with its roots in Methodism and China.
Kate Smallwood accepted Billy Guyton’s proposal of marriage in 1908 and almost immediately left Mississippi for a 5-year term as a Methodist missionary to Suzhou, China. A few months before leaving Suzhou to return home she wrote to her sister, who was also engaged,
You wanted to know Billy’s and my plans. He writes me that he is going home this summer and when you see him, ask him to tell you, because his plans are my plans, and what he does or what he wants me to do, you will find me trying my best to do. I am glad you are in love. I hope you love Mr. Johnson as much as I love Billy, but I doubt it. Maybe you will someday.
I asked Aunt Ruth, Kate's daughter, if her mother was a changed person when she returned to Mississippi from Suzhou. My intent was to learn whether she had changed philosophically or religiously.
Ruth smiled as she recalled some intimate conversation with her mother and replied, “Well, it was touch and go when she came back. I think Daddy had some second thoughts about whether to marry her. But they worked it out.”
They married on Christmas Eve, 1913, and soon left for Orange, New Jersey, where he pursued a second year of internship. Kate’s fond hope was that they would return soon to China to establish a mission clinic there.
But it was never clear that Billy had the heart for mission work or living overseas. And his father’s opinion came across loud and clear. John Franklin Guyton or “Pappy,” a successful farmer, storekeeper, and strict Baptist in the tiny community of Ingomar, Mississippi, had a way of expressing his views strongly. Once he smashed a violin of Billy’s younger brother to direct him away from sinful music.
As he thought of the young couple raising a family so far away, Pappy became very emotional, almost crying. He reminded Billy that his medical career had been funded partly by his parents, who had invested far more in him than would be possible for their younger children.
“Your brothers and sisters may need to depend on you,” he said. “Maybe your parents as well. How can you do that from China?”
The decision was made. The couple would abandon the mission field. They spent a year in Ingomar where Billy began his medical practice. The role of a country doctor did not suit him, however, and perhaps they needed to separate a bit from his parents.
He traveled across the Atlantic to Vienna, Austria, where he spent several months training with a world-renowned ophthalmologist. Then Kate and Billy moved to an apartment in Denver, Colorado, where he pursued further training. They almost bought a house in Denver when he received an offer to join the medical school faculty there. However, Pappy intervened once again, this time more in line with Kate’s wishes, persuading Billy to return to Oxford, Mississippi, home of Ole Miss where his medical education had begun.
I like to learn about decisions, especially those made jointly, and their consequences. So I asked Aunt Ruth if Kate ever forgave her father-in-law for blocking her dream of life on the mission field.
Ruth replied with a smile, “Oh certainly she forgave him. She is a forgiving person.” Then she smiled again and added, “But she never went to visit my grandfather or grandmother at the old house. I can’t remember her ever going back to Ingomar. On Sunday afternoons Daddy would drop Mama off in New Albany to spend the time with her mother, and he would drive on to visit his parents in Ingomar.”
Billy started the Guyton Clinic in Oxford, attracting patients with eye, ear, nose, and throat problems from surrounding towns and counties. Three other specialists joined the practice, eventually including their oldest son Jack, who received his medical degree at Harvard and postdoctoral training in ophthalmology at Johns Hopkins. After a few years Jack moved to Henry Ford Hospital in Detroit.
From another son, William F. Guyton, I gained a mental snapshot of the financial accounts at Billy's clinic around 1930. In order, the receipts were carefully tabulated – $2, $2, 0, 0, $2, $5, 0, 0, $2 – and so on. Both whites and blacks received care at the clinic, though from separate waiting rooms.
In that era removal of the tonsils was thought to give protection from severe sore throats that could cause rheumatic fever. That theory was wrong, and tonsillectomy as practiced then was actually a dangerous operation with appreciable mortality. Yet one day Billy lined up 60 first- and second-graders and pulled out all their tonsils, a heroic effort fortunately without a death.
In time Billy Guyton earned a reputation as one of the most prominent specialists the state of Mississippi could boast. His clinic attracted patients from throughout the northern half of the state and some from Tennessee. He had a part-time teaching position at the 2-year medical school at Ole Miss.
Around 1935 the medical school nearly closed down. Governor Theodore Bilbo, a flamboyant populist and racist, had engineered the firing of the Ole Miss chancellor (later reinstated under pressure from accrediting agencies), the medical school dean, and at least 53 university and college faculty around the state in a dispute about the goals of higher education. The entire university lost its accreditation from 1930 to 1932, and the medical school was on probation.
Politics at the state level was only part of the problem. The American Medical Association Council developed plans to close all nine 2-year medical schools in the U.S. One concern was the prospect of an oversupply of physicians.
Several leaders of higher education asked Billy to accept a one-year term as acting dean of the medical school. He took the temporary job and later the full, established position. One of his first acts was to cancel first-year admissions to the medical school in the fall of 1935.
Together Billy and university chancellor Alfred Benjamin Butts visited legislators in small towns up and down the state, requesting support for the university as the national and state economies recovered. Billy's country upbringing and his grasp of farming and finance helped him connect with rural legislators.
Billy also wrote letters to every AMA Council member and successfully solicited support from key academic leaders. He decided that an immediate attempt to pursue a 4-year school in Mississippi would undoubtedly fail, and he moved to upgrade the 2-year curriculum. Hospitals around north Mississippi were recruited to supply (1) surgical specimens for pathological examination, (2) bodies of the deceased for free autopsies within a 60-70 mile radius of Oxford (refrigerated transport being unavailable), and (3) patients and clinic assignments to teach the basics of histories and physical exams to sophomore medical students.
In a crucial vote, the AMA Council deferred its plans to close 2-year medical schools, and in 1938 full accreditation for the school in Oxford was restored. Billy led the school for another 7 years before stepping down and resuming a full clinical practice. He stayed connected, serving on the first planning committee for the 4-year University of Mississippi Medical School which opened in Jackson in 1955.
During this time Kate stayed active with family, church, and university. Four children arrived, all of whom she guided academically and socially. She planted the large yard of their home on South Lamar Street with jonquils, hyacinths, Japanese quince, forsythia, and bridalwreath. Each spring she hosted a breakfast for women university students. At First Methodist Church she taught Sunday School, led Mission Study and Circle, and served as president of the Women's Society of Christian Service. She was founder and first president of the Oxford PTA, president and member of the University Dames, the AAUW, the DAR, and twice president and 45-year member of the Browning Club. Kate gave many talks to women's groups throughout north Mississippi. She might start with flower arranging and mix in Confucian philosophy. She might discuss Chinese ideals for family life.
Kate kept up with former students in Suzhou by mail. One of them, Tsi Tsung Yang, visited the Guytons in Oxford in the late 1920s (pictured at right). The Laura Haygood School was incorporated into Soochow University, which today ranks in the top 5% of research universities in China.
In 1950 Billy and Kate took a trip around the world. It was the only time she returned to the place where she spent 5 shining years as a young adult. Her students now had much more experience of life than she herself during her time in Suzhou. What a reunion that must have been.
How much can we see of the human will as it works to change the direction of events in the lives of people we know?
An 11-year-old girl loses her father to a heart attack and suddenly discovers that all the adults who knew about the family curse expected it to happen. As she tries to sleep that night, she wonders how the loving God she learned about in Sunday School could allow such a tragedy.
Through teenage years, some combination of grief, confidence, curiosity, and restlessness gels into a desire to learn what she can about science and medicine. She goes to college, remarkable enough for a small town girl in that time, but afterward finds a limited horizon.
Now she has fallen in love with a boy from a nearby farm, and they plan together for the future. He is moving toward a career that will combine finance and farming, but she sees something else in him and for him. Her vision of who he can become – a medical doctor – appeals to him more than his own casual thought, and they begin to share the vision.
Then she learns about an amazing opportunity to travel to the opposite side of the earth, an opportunity to immerse herself in a culture different, older, in many ways richer than anything she has known. By leaving for 5 years, she opens the path for him to pursue the medical dream. This is the next step, and it becomes her plan and their plan.
After half a decade apart, communicating their love only by letters carried on ships across the Pacific, their lives rejoin. With just a brief stumble, they marry. But her grand plan to return to China collapses. He begins his medical practice, and she supports him and their new community, and they begin a family of their own.
A statewide crisis confronts him with more responsibility than he ever could have imagined. He answers, partly because he knows how to share a dream and partly because it all gets back to farming and finance and living better. Families not only around Oxford, but across the state of Mississippi will benefit from graduates of the medical school, as he finds a way to keep the doors from closing.
She shares the wisdom of the East, blended with her Methodist faith, with whomever will listen. Even the indefensible burden of racial oppression lifts just a little from her willingness to connect and converse.
All of this comes from a confluence of will that springs from many sources, one of them the heartbroken confusion of an 11-year-old girl.
Kate and Billy had three sons – Jack, Bill, and Arthur – and then a much anticipated daughter – Ruth. Jack and Arthur went to medical school, Bill became an engineer, and Ruth a statistician.
My siblings and I – Arthur's children – as well as cousin Becky were drawn toward careers in medicine. I had no thought that my own career decision was predetermined or engineered for me. Yet the sense of who I am, making that decision, was never limited to my individual self. I did not try to strip away the family heritage or rebel against it, at least not in this area. It feels right to say that here is where I belong in the world, and this is how we have learned to make our contribution.
This dual sense of who I am and how we choose gained new significance when I learned from Aunt Ruth that the decision for a medical career began with my grandmother, not my grandfather as I had previously supposed.
I was young when Kate Smallwood Guyton passed away. I never had the chance to know her as an adult. I have very few personal memories of things she said or did in my presence. That she always called my grandfather “Bully” rather than “Billy” I readily recall, but never thought to ask why. She kept the green lawn and the pecan trees and the flower beds beautiful in a way that even a young boy could appreciate.
Just one incident sticks in my memory. When I turned 6 years old, she held my birthday party in her beautiful big yard. All of my friends came, and we played games around the yard. When it came time to cut the birthday cake, she served all of them before giving me my piece. I asked my grandmother why, and she said, “You are the host, Johnny. You take yours last.” Was that a Chinese custom, or maybe something that missionaries do? I never figured it out.
Her funeral was the first I ever attended. Aunt Ruth remembered speaking to a black woman there, who told her, “Miss Kate was the only white woman in Oxford who ever invited me into her living room.”
A few years later something nudged me to think back about her funeral, and an odd memory came up. Or something like a memory, but I don't think it really happened. I just remember remembering it – my best guess is a dream, but one with striking effect, and here it is: At the end of the service, just before we would stand and walk out, somebody came by to whisper a few words to each grandchild in turn. These are the words that I remember remembering: "We believe that she lives on in the children."
On the day of her fatal heart attack, she was preparing a talk for the Browning Club of Oxford. This time the planned talk was not about China, but about the great Bengali poet Rabindranath Tagore. On her bedside table, left there in haste as she departed for the hospital in Memphis, were these words from Tagore copied by her hand –
When death comes and whispers to me, "Thy days are ended,"
Let me say to him, "I have lived in love and not in mere time."
He will ask, "Will thy songs remain?"
I shall say, "I know not, but this I know,
That often when I sang, I found my eternity."