2007-06-27, 01:31

The “Cusp of Anomaly”

By

R. Stephen Newberry

ABSTRACT: Deduction vs. Induction ~ Logicism vs. Empiricism ~ Determinism vs. Randomicity/Chaos, and the general topic of Contingency are discussed.

Preamble:

One of the more pleasant aspects of advanced age (I’m seventy-nine) is having the leisure in which to renew old literary acquaintances, and thus to savor again the riches of past encounters. A short list would include such treasures as The Bible (Old Testament, Torah), Shakespeare, the great English essayists of the 17th, 18th, and 19th centuries, and the great logicians of the late 19th and early 20th centuries. Here, I’ll certainly mention Dedekind, Cantor, Frege, Russell, Skolem, Gödel, Gentzen, Herbrand, Church, and (God bless him!) Stephen Cole Kleene, whose Mathematical Logic (Dover edition of 1967) I am now reading again, with a pleasure that verges upon joy.

I had, for now well over forty years, peripatetically been searching for an explanation of the “Cusp of Anomaly” that exists between the Deductive and the Inductive methods of investigation (this study would itself properly be subsumed under the rubric of methodology).

By just under forty years ago I had gathered “all the bits and pieces” to hand, but stubbornly (mulishly?) kept getting confused between the apparently disjoint phenomena of ω-inconsistency, non-standard model theory, the epistemological entailments of deductive vs. inductive reasoning, Logicism vs. Empiricism, determinacy/time-reversibility vs. randomness/dissipative phenomena, and the like. I was particularly fascinated by the failure of almost all of the 20th-century logicians to discuss the role of contingency in classical logic. (Satisfiability and contingency are not synonymous, although the latter does entail the former.)

About five years ago, it began to dawn upon me (finally!) that the goal of my quest was to be found only in the synthesis of these several viewpoints, (“pictures” as Wittgenstein would have put it), and that they were all merely different aspects of this same “Cusp of Anomaly” that had bedeviled my contemplations over so many decades.

Imagine then, my present delight in discovering that Kleene had already put it together, and done it so smoothly and so elegantly, that over the many previous readings, I’d simply missed the point, and had continued to put myself through the utterly unnecessary purgatory of reconstructing the entire edifice ab initio, (including a truly hellish period of “diagonal-crankery”.)

The balance of this essay is an attempt to present “The Cusp” in terms sufficiently unfamiliar that “the picture” may be seen from a perspective that some might find novel, even though it most probably will be, for most readers, a very old story indeed.

My first encounter with “The Cusp” was in the winter of 1961. I was reviewing my competence with high-school algebra and trigonometry as preparation for the SAT exam, with the intention of “going back to school” at CCNY and getting at least a BS in EE. I was at that time 33 years of age, and painfully aware that I had missed my opportunity by not going into the Navy and getting into the “V2” program when I’d had the chance, (which would have led to the same end which now I was intent upon pursuing). [I had instead chosen the Merchant Marine, as had my father and my grandfather, and for myself it was a very bad choice.]

The review process had been going well enough that, in order to keep myself entertained I was also reading some other books on mathematics, primarily of the popularization genre, among which was George Gamow’s “One, Two, Three, . . . , Infinity”. I’d already read Russell’s An Introduction to Mathematical Philosophy, knew something of the Dedekind approach to the foundations of analysis, and had developed a very pretty mental model of the real line. My model was countable, since at that time I had no reason to think otherwise, and already had learned that both the rationals and the algebraic irrationals were countable, and since clearly, there was only one remaining block of the partition, the transcendentals, then the transcendentals must certainly be countable, since they have to “fit-in-between” the rationals and the algebraic irrationals. (Countably-many rational/algebraic-irrationals entails only countably-many places where transcendentals can fit! Hence, the concept of the continuum.)
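The countability of the rationals mentioned above can be made concrete in a few lines; the sketch below (the helper name is illustrative) enumerates the positive rationals by the classic zig-zag over diagonals of constant numerator-plus-denominator:

```python
from fractions import Fraction

def enumerate_rationals(n: int) -> list:
    """First n positive rationals, by the classic zig-zag over the
    diagonals of constant numerator + denominator -- the standard
    demonstration that the rationals are countable."""
    out, seen, s = [], set(), 2
    while len(out) < n:
        for p in range(1, s):
            f = Fraction(p, s - p)
            if f not in seen:            # skip duplicates like 2/4 = 1/2
                seen.add(f)
                out.append(f)
                if len(out) == n:
                    break
        s += 1
    return out

# the enumeration begins 1, 1/2, 2, 1/3, 3, ...
assert enumerate_rationals(5) == [Fraction(1), Fraction(1, 2),
                                  Fraction(2), Fraction(1, 3), Fraction(3)]
```

Every positive rational appears at some finite stage, which is all that countability demands.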

On encountering Gamow’s presentation of Cantor’s “Diagonal Proof” of the uncountability of the reals I was deeply affronted and offended: Gamow’s explanation that the “transcendentals were denser on the real line” than the rationals and the algebraic irrationals was patently hogwash. It took me not more than perhaps five or ten minutes to come up with a constructive refutation, based upon the fact that, given any two transcendentals, taken arbitrarily close together, one can very easily construct a rational interpolant; and having once constructed that initial rational interpolant, arbitrarily many subsequent rational interpolants may just as easily be constructed between the lower of the two transcendentals and the initial rational interpolant; and then again arbitrarily many between the upper of the two transcendentals and the initial rational interpolant; and then again arbitrarily many between all the previous rational interpolants, again and again, ad infinitum. So the concept of “denser-on-the-line” just doesn’t work, and without it, neither does the uncountability of the reals. QED. This was the first glimpse I had of “The Cusp”.
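The rational-interpolant construction described above can be sketched concretely. In the sketch below (helper names are illustrative), exact Fraction conversions of e and π stand in for two transcendentals; any denominator q with 1/q smaller than the gap guarantees a rational p/q strictly inside the interval:

```python
from fractions import Fraction
import math

def rational_between(a: Fraction, b: Fraction) -> Fraction:
    """Return a rational strictly between a and b (assumes a < b):
    once 1/q < b - a, some multiple p/q of 1/q lands in (a, b)."""
    q = 1
    while Fraction(1, q) >= b - a:
        q *= 2
    p = math.floor(a * q) + 1        # first multiple of 1/q above a
    return Fraction(p, q)

def interpolants(a: Fraction, b: Fraction, n: int) -> list:
    """n distinct rational interpolants in (a, b), each constructed
    between the previous interpolant and b -- 'again and again'."""
    out, lo = [], a
    for _ in range(n):
        r = rational_between(lo, b)
        out.append(r)
        lo = r
    return out

# Fraction(math.e) and Fraction(math.pi) are exact binary rationals
# standing in here for two transcendentals taken 'arbitrarily close'
rs = interpolants(Fraction(math.e), Fraction(math.pi), 10)
assert all(Fraction(math.e) < r < Fraction(math.pi) for r in rs)
```

The construction never halts of its own accord, which is the point: the rationals are dense in any interval, so "denser-on-the-line" cannot be what distinguishes the transcendentals.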

So began an indescribably unpleasant twenty years’ bout with that particular form of mental illness known psychiatrically as obsessionalism, or perseveration, and in the mathematical community as “Diagonal Crank”-ism. Not one of my many grad-student friends could find fault with the rational-interpolant construction, but no matter: it was clearly a matter of unchallengeable mathematical faith that the “Diagonal Proof” did in fact demonstrate the greater transfinite cardinality of the transcendentals. (It was on the same metaphysical plane as that of the “immaculate conception” for devout Roman Catholics.) Case closed.

Later, I encountered the “Diagonal” construction occurring in the Gödel proof of 1931, and proofs of the non-recursive-enumerability of the recursively-definable functions, and several other interesting cases, that simply defied any attempt at refutation, and hence must be accepted as true. The “Cusp” gets ‘curiouser and curiouser’. HOW to reconcile the “Cusp”?

It seemed pretty clear that there was some sort of epistemological affinity between recursive non-enumerability and the “non-denumerability of the continuum” (despite the obvious “apples-and-oranges” objection), and I was beginning to suspect that it might sometimes be possible to prove-by-induction propositions that were not universally valid, and hence not syntactically/deductively provable. The “Cusp” again. Gödel proved the deductive completeness (semi-completeness) of FOL, and the so-it-then-seemed deductive incompleteness of the Simple Theory of Types*; then Skolem, and subsequently Henkin, proved the equivalence of (many-sorted) FOL with STT, and the completeness of both FOL and STT (which makes sense); but then Second-order Logic (which I had previously thought to be a subset of STT) is not even semi-decidable. The “Cusp” had me thoroughly confused. [Of course, the “might-sometimes” conjecture was precisely the content of the first Gödel Incompleteness Theorem, but it took me several re-readings to be able to see that, and even then I was uncertain for a time.]

* (Already an error, since Gödel’s “language P” is just STT + PA, and it is PA which is contingent. But that didn’t really “turn on the light-bulb” until much later.)

I’d succeeded in accepting Tarski’s (Hilbert’s?) ω-rule without too much difficulty, because I’d already learned (from Löwenheim) about the existence of n-valid propositions (“fleeing equations” in his parlance), but the fact that the ω-rule induced ω-inconsistency when adjoined to the axioms of standard number theory made some warning buzzers go off in my head, and it seemed that non-standard model theory might be somehow involved there; but all of the non-standard models that I had actually MET were really weird, essentially involving a universe which contains things like “infinite integers” and infinitesimals, and suchlike. (Hmmm . . . ) But!

NEED THAT NECESSARILY BE TRUE OF ALL non-standard models? If a finite set of propositions is non-contradictory then it must have a countable model (Löwenheim again), and if that set of propositions is not true in the Standard Model, then clearly that model must be non-standard; as, indeed, Henkin says that all of the non-full models in his universe of General Models must be; and, by the way, these non-standard, non-full models are all countable, which is fine by me, but how come the full general model is standard and non-countable???

As long as Henkin is leaving out some second-order entities, his general models are non-standard. All he requires of his general models is that the axioms and rules of inference are true in the model, and that the space of entities be closed under Boolean operations. Any Boolean lattice satisfies those conditions. Suppose we had a way of constructing a Boolean lattice in such manner that all-but-only the non-predicatively-definable sets were omitted: that would constitute a (non-full) non-standard General Model, and since I’ve never met (or even heard of) a predicatively-definable uncountable set . . . THIS might be a good approach to resolving the “Cusp” problem!! (And so it proved to be, but I’m getting way ahead of my story . . . )

The CCNY plan fell through. My SATs were fine but my wife died (malignant melanoma, unutterably awful), and although I then had an income of $50 per week, even in those days that was not enough to live on in New York City, and I knew that I would be unable to maintain a full academic load in E.E. at CCNY and at the same time earn a living. (In the vain attempt to “drown my grief”, I was also at that time drinking a bottle of Scotch every three days.) A good friend persuaded the Columbia Physics Department to give me a job that would keep me sober for at least 8 or 10 hours a day, as a technician on a low-energy physics experiment then being carried out at Brookhaven National Laboratory, and it did quite a bit better than that.

BNL had a research library that kept on the shelves, among other goodies, the entire Mathematical Foundations list of the North-Holland Publishing Company, and a full back-issue file of the Journal of Symbolic Logic, so that, not only did I have to stay sober while on the job, but for a good several hours thereafter, reading in the library. And at the end of that, I did not really need a drink in order to get to sleep!

It was also my first encounter with a Xerox machine, and no limit was placed on how much one could use it, so I copied reams of good stuff from the Reviews sections of JSL, as well as several volumes of the North-Holland list, that I could in no way understand at that time, but hoped to understand in future. Those Xeroxes traveled with me to Europe, and lived with me for several years in Lund, Sweden, while I was studying at Matematiska Institut, vid Lunds Universitet, (at which I was never able formally to register, but nonetheless was permitted to attend lectures and take exams.)

Mathematical Swedish doesn’t really have all that many new words, but Swedish academic standards were very high, math is a demanding subject, and I was simply too exhausted from attending lectures in a still-foreign language to be able to take good notes and then to work all the exercises, so I didn’t really make a success of it. (It was in Sweden that I first learned to have examination anxiety!) Several years later, another friend, an American physicist who was then a visiting consultant at the Lunds Physiska Institut, made some remarks about Gödel’s Theorem, which he had encountered at Stanford, and I gently corrected him, explaining that such-and-such was not really the point, but rather so-and-so, and he responded, “Man, you are wasting your time here!”, which was already beginning to become apparent to me. That led to more discussion of what I was still hoping to do with my life, and in consequence of all this, he persuaded me to go back to the US, go to California, attend “one of the easier campuses of the UC or Cal State systems”, and apply for student aid.

Fine: After four years of studying and learning Swedish and still failing my maths exams, it didn’t take too awfully long to persuade me; but then he began to get enthusiastic about the whole thing, and nothing would do but that I should also apply to Stanford University! I said, “WHAAAAT?!?!, how am I going to afford that on fifty dollars a week, when I couldn’t make it at CCNY in NY?” He said, “No problem, they have lots of money at Stanford. If you can get admitted, and survive the freshman year, then they won’t let you fail to continue merely for financial problems.” So I applied, and he coached me on how to go at it, and wrote a letter in my behalf to the head of the Philosophy Department (I’d given up on the EE, but the department was giving Philosophy credit for some Computer Science courses, and that seemed like a better choice), and in the end, I was admitted. And I did survive the freshman year, but Stanford’s generosity did not materialize (by that time I was thirty-nine years old), and for the first two years I lived in a garage on bread and peanut-butter for breakfast, no lunch, and a half-can of catfood with a handful of rice for dinner. Fortunately I had volunteered to grade papers in the CS101 course, and Bill McKeeman (the prof) was so pleased with the job I did that he gave me a job as an R.A. in the Computer Science department, which helped to pay for the catfood. When I attained Junior status I got a Federal Loan that took me through the rest of the way. But the Xeroxes had made the trip back with me, and I had kept on reading Logic. I graduated with a BA degree in Philosophy, having by that time taken, (or at least attended lectures for) almost all the courses in Computer Science, and as well, attended some of the seminars in Logical Foundations of Mathematics conducted by Sol Feferman and George Kreisel.

Sol Feferman (may he live in good health), had accepted the task of being my senior advisor while I was still in my ‘Diagonal Crank’ phase. (He didn’t know that at the time, because I’d learned to keep quiet about it, but by the time that I had finally produced my senior thesis in an acceptable form, he’d realized the mess I was in.) It was only through his virtually super-human patience that I was finally disabused of my compulsive delusion. My relief (BEING a Diagonal Crank is no bed of roses!) was exceeded only by my chagrin and contrition, and I entered that time of life known as “the mid-life crisis”, a two-and-a-half-year episode of acute clinical depression.

In the ensuing nearly thirty years, I caused him no further bother, as I had sworn off of Pure Mathematics, and, in one of the most colossally naïve acts of a life characterized by naïve acts, I gave my entire mathematical library (excepting only Church’s Introduction, Kleene’s Metamathematics, the Woodger edition of Tarski’s Logic, Semantics, Metamathematics, van Heijenoort’s source book, and the Collected Works of Gerhard Gentzen) to Stanford Library on the naively specious assumption that “I can always go back and access them through the Stanford Library System.” Hah! No such luck. My precious library, painfully acquired over a period of more than twelve years, vanished into some black hole, never again to emerge. (During that time the expense of buying books ensured that I had no difficulty in maintaining the same weight and waist measurement that I’d had at age 23.)

In the meantime, I did some tech-writing for money, and went back to my other love, music and the Classical Spanish Guitar, while I devoured food, drank an occasional glass of beer or red-wine, and put on fifty pounds and twenty inches of waistline in the process. I also was fortunate enough to marry another Stanford graduate, a really nice lady who is still my wife, and life was more or less pleasantly normal after that.

Then, one day, as I was innocently walking by my Big Bookshelf, a long, thin, almost membranous, arm reached out from Van’s sourcebook, grabbed me by the ankle, and refused to let go until I took down the book. I was hooked again. No longer possessed by the Diagonal Crank obsession, I was now able to accept, grasp, and even to assimilate ideas which for so many years had somehow triggered the D.C. syndrome, and thereupon had immediately become utterly opaque to me. My infatuation with Predicativity returned; and slowly, painfully, (book prices had increased by an order-of-magnitude), I reacquired most of my mathematics library, and the Don Quixote of Mathematical Philosophy was saddled up again. Heigh Ho, Rosinante!

I occasionally emailed Sol with questions of a technical nature, and he, always the mensch, answered briefly and promptly. [“Are you the same ‘Steve Newberry’ I knew thirty years ago at Stanford?”] In time, I produced an essay which, with much difficulty, and after over a year of whining and waiting, I finally persuaded him reluctantly to consent to read.

The essay constructively demonstrated (a one-line recurrence, followed by the union of the well-ordered, monotone, transitive, unbounded sequence of finite Boolean lattices reified by the recurrence) the existence of a non-standard counter-model to Cantor’s Theorem. Unfortunately, I had neglected adequately to emphasize the non-standard nature of the model, and he, quite understandably (although still the mensch, still the gentleman), became perturbed, and asked me no longer to address my epiphanic visions to his attention, on grounds of a limited life-expectancy. As I am five months older than he, and am still loaded with guilt from having already wasted as much or more of his time than he had been accustomed to expend on advising doctoral candidates with their dissertations, I had no option but to apologize, butt out, and more or less shut up.

Perhaps it is time to explain the Cusp: it arises as the critical and essential difference between deductive reasoning [Logicism] and inductive reasoning [Empiricism]. I like to refer to this as the Empirical Vacuity of deductive reason. Wittgenstein had commented on this at some length in his doctoral thesis, the Tractatus Logico-Philosophicus (as had Kant, over a century earlier). In particular, truths derived from purely empirical data cannot be materially implied by tautologous truths obtained by purely logical/syntactic means, because the latter are in essence empirically vacuous. [That would be “getting something for nothing”, creating new empirical information from no empirical information: not allowed.]

My essay had sought to resolve the “Cusp” by noting that, while inductively-true propositions (n-valid, valid on all-but-only-finite domains) validly imply deductively-true (u-valid, universally valid, tautological) propositions, the converse does not hold. The propositions which are only-inductively-true are contingent (Kleene’s word; Carnap’s term is L-indeterminate), and therefore by definition, necessarily admit of counter-models, which exist only on completed-infinite domains.* The only theories devoid of finite sub-models that are recognized by the Standard Model of Arithmetic are those “elementarily equivalent with/to” Peano Arithmetic, (which essentially defines the Standard Model of Arithmetic), and hence these other (non-elementarily equivalent) theories with no finite sub-models are categorized as being non-standard, and hence, from the perspective of the SMA and classical Set theory, have no legitimate significance.

* That idea is worth a moment’s reflective thought.

Cantor’s theorem, which is true in the SMA, is (only) inductively-true. We know this because it materially-implies other empirically contingent propositions, e.g., the assertion of the existence of infinite sets (non-denumerably many!), hence cannot itself be deductively-true, u-valid. (The “Diagonal” argument is just the recursive construction of an implicitly inductive [“every” = “all”] proof of the falsity of a certain class of propositions. It took me forty years to see that.)

Therefore, Cantor’s theorem must, of necessity, have at least one non-standard counter-model, and the (union of the sequence of) Boolean lattices generated by the unit-sets of ω is it. [It consists of the finite and co-finite subsets of ω, augmented by the limit-sets (unions over all the elementary chains in the ultra-filter); of which chains there are obviously only countably many, since such chains are initiated by only countably-many unit-sets and terminate in only countably-many limit-sets.]
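The finite/co-finite core of this lattice is easy to exhibit in code. The sketch below (names are illustrative, and the limit-sets the essay adds are omitted) stores a set as a finite set of naturals plus a flag meaning either “exactly these elements” or “every natural EXCEPT these elements”, and shows that such sets are closed under complement, union, and intersection:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FinCof:
    """A finite or co-finite subset of the naturals: a finite set of
    elements, plus a flag -- cofinite means 'everything except these'."""
    elems: frozenset
    cofinite: bool = False

    def complement(self) -> "FinCof":
        # complement just flips the flag, so the lattice is closed under it
        return FinCof(self.elems, not self.cofinite)

    def union(self, other: "FinCof") -> "FinCof":
        a, b = self, other
        if not a.cofinite and not b.cofinite:         # fin + fin
            return FinCof(a.elems | b.elems, False)
        if a.cofinite and b.cofinite:                 # cofin + cofin
            return FinCof(a.elems & b.elems, True)
        fin, cof = (a, b) if not a.cofinite else (b, a)
        # (everything except cof.elems) plus fin.elems
        #   = everything except (cof.elems minus fin.elems)
        return FinCof(cof.elems - fin.elems, True)

    def intersection(self, other: "FinCof") -> "FinCof":
        # De Morgan: A meet B = complement(complement(A) join complement(B))
        return self.complement().union(other.complement()).complement()

    def contains(self, n: int) -> bool:
        return (n in self.elems) != self.cofinite

# {1, 2} joined with 'everything but 0' is again co-finite
a = FinCof(frozenset({1, 2}))
b = FinCof(frozenset({0}), cofinite=True)
assert a.union(b) == FinCof(frozenset({0}), cofinite=True)
```

Note what the representation cannot express: a set like the evens, neither finite nor co-finite, has no element here, which is exactly the sense in which a lattice of this kind omits sets and so yields a non-full model.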

This is a non-standard model, a very pretty model, of all-but-only the Predicatively-definable subsets of ω [on the assumption that unions of countable, well-ordered, monotone, transitive, unbounded sequences are predicative, which Sol, several years previously, had assured me “even Poincaré would have accepted as predicative” (verbatim quote!)].

So now we have a robustly predicative countable model of the powerset of ω, and the countable continuum of predicatively-definable real numbers will be defined by Cauchy sequences of rationals (which are themselves predicatively-definable within the model); but the original Diagonal proof of the “uncountability” of the set of all binary sequences is still possible! Since ALL entities definable in this system of STT are countable (and let’s agree, from here on, to call the system ‘PTT’, for “Predicative Theory of Types”, and take it as known that these entities are all-but-only those which ARE predicatively-definable), how do we interpret the significance of this construction of a member of 2^ω which is obviously not a member of the sequence from which it has just been defined?
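The Diagonal construction in question is a one-liner: given any enumeration of binary sequences, flip the k-th bit of the k-th sequence, and the result differs from every enumerated sequence somewhere. A minimal sketch, with a toy enumeration for illustration:

```python
def diagonal(enum):
    """Given an enumeration k -> (n -> 0 or 1) of binary sequences,
    return the anti-diagonal: flip the k-th bit of the k-th sequence,
    so the result differs from every enumerated sequence."""
    return lambda n: 1 - enum(n)(n)

# a toy countable enumeration: the k-th sequence is 1 only at position k
enum = lambda k: (lambda n: 1 if n == k else 0)
d = diagonal(enum)

# d disagrees with the k-th sequence at position k, for every k checked
assert all(d(k) != enum(k)(k) for k in range(1000))
```

The construction is unimpeachable as far as it goes: it shows only that the anti-diagonal is not in the given enumeration, and the dispute is over what that fact signifies.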

Clearly, there must be an alternative interpretation, but what can it be?

Now it is time to think about the set of all Chaitin/Martin-Löf/Solovay Ω-random binary sequences. [citation] This is, by definition, necessarily a subset of 2^ω, and that subset is (also by definition) quintessentially random, and therefore incapable of being well-ordered.

So the Diagonal-proof demonstrates that there is a (countable?) counter-model to the Axiom of Choice, which is hardly surprising, considering the independence proofs of Cohen and Feferman. AxCh is, together with the uncountability of the Cardinality of the Continuum, quintessentially contingent, and that is simply a FACT.

“Everything is a trade-off”: in this case one gives up the non-denumerable continuum in exchange for a denumerable model of the continuum which is categorical (as are all models of theories expressed in the language PTT), and which omits all-and-only the C/M-L/S Ω-randomly generated binary sequences in 2^ω (which may plausibly be conjectured to be the characteristic functions of the uncountable subsets of ω).

This denumerable continuum is known among Pure Mathematicians concerned with Foundational Studies as that which is “provable within Feferman’s system IR of predicative analysis”, or, equivalently, within “the first-order part of ATR₀, and/or ATR₀^set”, as set forth in section VII.3 of Simpson’s SUBSYSTEMS; it properly includes the H-sets of section VIII.3 of Simpson’s SUBSYSTEMS, the Baire space ℕ^ℕ, and the Axiom of Countability, together with much of the work of Gödel developing the relation between constructible sets and the projective hierarchy.

It also includes a very elegant theory of recursive transfinite ordinals up to the limit ordinal Γ₀, as developed (independently) by both Solomon Feferman and Kurt Schütte. Within this formalism can be expressed much of “measure theory, separable Banach space theory, Ramsey theory, matching theory, well quasiordering theory, and countable algebra”. [cite Simpson SUBSYSTEMS, p. 412]

But what about those ‘uncountable sets’ after all? Since it is consistent to assert their existence, they must be possible, and therefore must correspond to some real (i.e., existing) entities/phenomena.

I submit that the correspondence is to be found in the FACT that all sets are either capable of being (simply/well) ordered or not. The former are countable, and the latter are not. This is a question, not primarily of cardinality, but of structure. The disjunction is tautologous, and in any context not dependent upon contingent/empirical truth (“Goldbach’s Conjecture, Riemann’s Hypothesis, Twin-Primes Conjecture”, &c.) the issue is determinate, and the principle of excluded middle applies.

While we’re at it, we might as well dispose of the putative difference between ‘potentially’ and ‘actually’ infinite sets, which has had philosophers at knives drawn as far back as Aristotle. We need merely to stipulate that any (actually-infinite) set M which is capable of being (simply or well) ordered is represented by the unbounded (potentially-infinite) sequence S which contains all-but-only those elements which appear in M, and that *M =df= S; and where ‘U’ denotes “union”, U(S) = M. Here, ‘*’ and ‘U’ are inverse operators. Thus, where ‘X’ may denote the aforementioned M,

*U(X) = U*(X) = U(*X) = U(S); and U(S) = X.

That having been said, it follows very easily that “M is uncountably infinite iff S is unbounded (potentially-infinite) and Ω-randomly-generated”, and “M is countably infinite iff S is unbounded (potentially-infinite) and orderly-generated”, where ‘orderly-generated’ means any one of “simply-ordered”, “well-ordered” or “recursively-generated”.
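The *M = S stipulation has a natural computational reading: an actually-infinite but orderly-generated M is represented by the potentially-infinite process S that enumerates it. A minimal sketch, taking the even naturals for M (the function name is illustrative):

```python
from itertools import count, islice

def S_evens():
    """The potentially-infinite sequence S enumerating the
    actually-infinite set M of even naturals: an unbounded,
    recursively-generated process listing all-but-only
    the elements of M."""
    for n in count():
        yield 2 * n

# 'U(S)' -- collecting what the process emits -- recovers M;
# any finite prefix is a finite approximation to it
prefix = set(islice(S_evens(), 5))
assert prefix == {0, 2, 4, 6, 8}
```

No such enumerating process can exist for an Ω-randomly-generated M, which is the computational face of the countable/uncountable divide drawn above.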

[NB: Cauchy sequences of rationals are highly non-random.] But 2^ω must of necessity contain all the C/M-L/S Ω-randomly-generated infinite binary sequences that exist!

But what does this do to Set Theory?!? We’ve been taught that NBG is predicative, and there certainly are those who maintain that ZFC, since formalized entirely within FOL, must therefore automatically be seen as being predicative. But PTT demonstrates that at least one version of predicativity simply ignores the existence of the entire subject-matter of classical Set theory!

Well, what do the Masters (Skolem, Bernays, Gödel, Kleene et al.) have to say on the matter? They are essentially unanimous that FOL is inadequate for the expression of all truths, whether of Set Theory or of mathematics. That (“the Relativity of Set Theory”) is their explanation for the resolution of the Löwenheim–Skolem “Paradox”, and it works just as well here too. Different views of the Universe are possible, and therefore consistent.

All done, neat and tidy.

THAT IS THE “CUSP OF ANOMALY”, and its RESOLUTION

[Honesty compels me to admit that the correspondence between induction/deduction and standard/non-standard models was not quite as plainly drawn in the essay as it is here.]

Which brings us back to Kleene’s Mathematical Logic: Section 53, “Skolem’s Paradox and non-standard models of arithmetic”, says it all. On all of my (many!) previous readings of this section (and so many others like it), I had kept wondering, “Why are these people so obsessed with Peano Arithmetic? It is as though it were their entire universe!” [You think? Really? Hello? Four years as a Philosophy major at Stanford University, why didn’t I get that sooner?]

I’m tempted to say, “Now I can die in Peace.” But it’s not true. In a life which has been consistently characterized by naïve enthusiasms, philosophical revelations, epiphanic insights, and hypo-manic head-trips, I have never more than now wished to live, at least a little while longer . . . maybe get back to the guitar . . . perhaps try learning to play the violoncello . . . ,

but The Cusp has been laid to rest, amen.

Los Altos, CA, 2007

By

R. Stephen Newberry

ABSTRACT: Deduction vs. Induction ~ Logicism vs. Empiricism ~ Determinism vs. Randomicity/ Chaos, and the

general topic of Contingency are discussed.

Preamble:

One of the more pleasant aspects of advanced age (I’m seventy-nine) is having the leisure in which to renew old literary acquaintances, and thus to savor again the riches of past encounters. A short list would include such treasures as The Bible (Old Testament, Torah), Shakespeare, the great English essayists of the 17th, 18th, and 19th centuries, and the great logicians of the late 19th and early 20th centuries. Here, I’ll certainly mention Dedekind, Cantor, Frege, Russell, Skolem, G\”odel, Gentzen, Herbrand, Church, and (God Bless him!) Stephen Cole Kleene, whose Mathematical Logic (Dover edition of 1967) I am now reading again, with a pleasure that verges upon joy.

I had, for now well over forty years, peripatetically been searching for an explanation of the “Cusp of Anomaly“ that exists between the Deductive and the Inductive methods of investigation, (this study in itself would properly be subsumed under the rubric of methodology.)

By just under forty years ago I had gathered “all the bits and pieces” to hand, but stubbornly (mulelishly?) kept getting confused between the apparently disjoint phenomena of \mathbb{\omega}-inconsistency, non-standard model-theory, the epistemological entailments of deductive vs. inductive reasoning, Logicism vs. Empiricism, determinacy/time-reversibility vs. randomness/dissipative phenomena, and the like. I was particularly fascinated by the failure of almost all of the 20th century logicians to discuss the role of contingency in classical logic. (Satisfiability and contingency are not synonymous, although the latter does entail the former.)

About five years ago, it began to dawn upon me (finally!) that the goal of my quest was to be found only in the synthesis of these several viewpoints, (“pictures” as Wittgenstein would have put it), and that they were all merely different aspects of this same “Cusp of Anomaly” that had bedeviled my contemplations over so many decades.

Imagine then, my present delight in discovering that Kleene had already put it together, and done it so smoothly and so elegantly, that over the many previous readings, I’d simply missed the point, and had continued to put myself through the utterly unnecessary purgatory of reconstructing the entire edifice ab initio, (including a truly hellish period of “diagonal-crankery”.)

The balance of this essay is an attempt to present “The Cusp” in terms sufficiently unfamiliar that “the picture” may be seen from a perspective that some might find novel, even though it most probably will be, for most readers, a very old story indeed.

My first encounter with “The Cusp” was in the winter of 1961. I was reviewing my competence with high-school algebra and trigonometry as preparation for the SAT exam, with the intention of “going back to school” at CCNY and getting at least a BS in EE. I was at that time 33 years of age, and painfully aware that I had missed my opportunity by not going into the Navy and getting into the “V2” program when I’d had the chance, (which would have led to the same end which now I was intent upon pursuing). [I had instead chosen the Merchant Marine, as had my father and my grandfather, and for myself it was a very bad choice.]

The review process had been going well enough that, in order to keep myself entertained I was also reading some other books on mathematics, primarily of the popularization genre, among which was George Gamow’s “One, Two, Three, . . . , Infinity”. I’d already read Russell’s An Introduction to Mathematical Philosophy, knew something of the Dedekind approach to the foundations of analysis, and had developed a very pretty mental model of the real line. My model was countable, since at that time I had no reason to think otherwise, and already had learned that both the rationals and the algebraic irrationals were countable, and since clearly, there was only one remaining block of the partition, the transcendentals, then the transcendentals must certainly be countable, since they have to “fit-in-between” the rationals and the algebraic irrationals. (Countably-many rational/algebraic-irrationals entails only countably-many places where transcendentals can fit! Hence, the concept of the continuum.)

On encountering Gamow’s presentation of Cantor’s “Diagonal Proof” of the uncountability of the reals I was deeply affronted and offended: Gamow’s explanation that the “transcendentals were denser on the real line” than the rationals and the algebraic irrationals was patently hogwash. It took me not more than perhaps five or ten minutes to come up with a constructive refutation, based upon the fact that, given any two transcendentals, taken arbitrarily close together, one can very easily construct a rational interpolant, and having once constructed that initial rational interpolant, then arbitrarily many subsequent rational interpolants may just as easily be constructed between the lower of the two transcendentals and the initial rational interpolant; and then again arbitrarily many subsequent rational interpolants between the upper of the two transcendentals and the initial rational interpolant; and then again arbitrarily many subsequent rational interpolants between all the previous rational interpolants, again and again, ad infinitum. So the concept of “denser-on-the-line” just doesn’t work, and without it, neither does the uncountability of the reals. QED. This is the first glimpse I had of the “The Cusp”.
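The interpolant construction just described is easy to mechanize. The following is a minimal Python sketch (the names `rational_between` and `interpolants` are mine, purely illustrative, and for exactness the gap is bracketed by rational approximations standing in for the two close-together transcendentals): one interpolant splits the gap, and the recursion then fills both sub-gaps, ad infinitum.

```python
from fractions import Fraction

def rational_between(a, b):
    """Return a rational strictly between a and b (assumes a < b).

    Pick a denominator q with 1/q < b - a; then the least p with
    p/q > a satisfies a < p/q < b.
    """
    a, b = Fraction(a), Fraction(b)
    q = 1
    while Fraction(1, q) >= b - a:
        q *= 2
    p = (a.numerator * q) // a.denominator + 1   # least p with p/q > a
    return Fraction(p, q)

def interpolants(a, b, depth):
    """Recursively interpolate rationals between a and b, as the essay
    describes: one interpolant, then recurse into both sub-gaps."""
    if depth == 0:
        return []
    m = rational_between(a, b)
    return interpolants(a, m, depth - 1) + [m] + interpolants(m, b, depth - 1)
```

Of course, as the rest of the story makes plain, the construction is perfectly sound and yet refutes nothing: density and cardinality are independent notions.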

So began an indescribably unpleasant twenty years’ bout with that particular form of mental illness known psychiatrically as obsessionalism, or perseveration, and in the mathematical community as “Diagonal Crank”-ism. Not one of my many grad-student friends could find fault with the rational-interpolant construction, but no matter: it was clearly a matter of unchallengeable mathematical faith that the “Diagonal Proof” did in fact demonstrate the greater transfinite cardinality of the transcendentals. (It was on the same metaphysical plane as that of the Immaculate Conception for devout Roman Catholics.) Case closed.

Later, I encountered the “Diagonal” construction occurring in the G\”odel proof of 1931, and proofs of the non-recursive-enumerability of the recursively-definable functions, and several other interesting cases, that simply defied any attempt at refutation, and hence must be accepted as true. The “Cusp” gets ‘curiouser and curiouser’. HOW to reconcile the “Cusp”?

It seemed pretty clear that there was some sort of epistemological affinity between recursive non-enumerability and the “non-denumerability of the continuum” (despite the obvious “apples-and-oranges” objection), and I was beginning to suspect that it might sometimes be possible to prove-by-induction propositions that were not universally-valid, and hence not syntactically/deductively provable. The “Cusp” again. G\”odel proved the deductive completeness (semi-completeness) of FOL, and the so-it-then-seemed deductive incompleteness of the Simple Theory of Types*; then Skolem, and subsequently Henkin, proved the equivalence of (many-sorted) FOL with STT, and the completeness of both FOL and STT (which makes sense); but then Second-Order Logic (which I had previously thought to be a subset of STT) is not even semi-decidable. The “Cusp” had me thoroughly confused. [Of course, the “might-sometimes” conjecture was precisely the content of the first G\”odel Incompleteness Theorem, but it took me several re-readings to be able to see that, and even then I was uncertain for a time.]

* (Already an error, since G\”odel’s “language P” is just STT + PA, and it is PA which is contingent. But that didn’t really “turn on the light-bulb” until much later.)

I’d succeeded in accepting Tarski’s (Hilbert’s?) \omega-rule without too much difficulty, because I’d already learned (from L\”owenheim) about the existence of n-valid propositions (“fleeing equations” in his parlance), but the fact that the \omega-rule induced \omega-inconsistency when adjoined to the axioms of standard number-theory made some warning buzzers go off in my head, and it seemed that non-standard model-theory might be somehow involved there; but all of the non-standard models that I had actually MET were really weird, essentially involving a universe which contains things like “infinite-integers” and infinitesimals, and suchlike. (Hmmm. . . ) But!

NEED THAT NECESSARILY BE TRUE OF ALL non-standard models? If a finite set of propositions is non-contradictory then it must have a countable model (L\”owenheim again), and if that set of propositions is not true in the Standard Model, then clearly that model must be non-standard; as, indeed, Henkin says that all of the non-full models in his universe of General Models must be; and, by the way, these non-standard, non-full models are all countable, which is fine by me, but how come the full general model is standard and non-countable???

As long as Henkin is leaving out some second-order entities, his general models are non-standard. All he requires of his general models is that the axioms and rules of inference are true in the model, and that the space of entities be closed under Boolean operations. Any Boolean Lattice satisfies those conditions. Suppose we had a way of constructing a Boolean Lattice in such manner that all-but-only the non-predicatively-definable sets were omitted: That would constitute a (non-full) non-standard General Model, and since I’ve never met (or even heard of) a predicatively-definable uncountable set . . . THIS might be a good approach to resolving the “Cusp” problem!! (And so it proved to be, but I’m getting way ahead of my story . . . )

The CCNY plan fell through. My SATs were fine but my wife died (malignant melanoma, unutterably awful), and although I then had an income of $50 per week, even in those days that was not enough to live off of in New York City, and I knew that I would be unable to maintain a full academic load in E.E. at CCNY, and at the same time earn a living. (In the vain attempt to “drown my grief”, I was also at that time drinking a bottle of Scotch every three days.) A good friend persuaded the Columbia Physics Department to give me a job that would keep me sober for at least 8 or 10 hours a day, as a technician on a low-energy physics experiment then being carried out at Brookhaven National Laboratory, and it did quite a bit better than that.

BNL had a research library that kept on the shelves, among other goodies, the entire Mathematical Foundations list of the North-Holland Publishing Company, and a full back-issue file of the Journal of Symbolic Logic, so that, not only did I have to stay sober while on the job, but for a good several hours thereafter, reading in the library. And at the end of that, I did not really need a drink in order to get to sleep!

It was also my first encounter with a Xerox machine, and no limit was placed on how much one could use it, so I copied reams of good stuff from the Reviews sections of JSL, as well as several volumes of the North-Holland list, that I could in no way understand at that time, but hoped to understand in future. Those Xeroxes traveled with me to Europe, and lived with me for several years in Lund, Sweden, while I was studying at Matematiska Institut, vid Lunds Universitet, (at which I was never able formally to register, but nonetheless was permitted to attend lectures and take exams.)

Mathematical Swedish doesn’t really have all that many new words, but Swedish academic standards were very high, math is a demanding subject, and I was simply too exhausted from attending lectures in a still-foreign language to be able to take good notes, and then to work all the exercises, so I didn’t really make a success of it. (It was in Sweden that I first learned to have examination anxiety!) Several years later, another friend, an American physicist who was then a visiting consultant at the Lunds Physiska Institut, made some remarks about G\”odel’s Theorem, which he had encountered at Stanford, and I gently corrected him, explaining that such-and-such was not really the point, but rather so-and-so, and he responded, “Man, you are wasting your time here!”, which was already beginning to become apparent to me. That led to more discussion of what I was still hoping to do with my life, and in consequence of all this, he persuaded me to go back to the US, go to California, attend “one of the easier campuses of the UC or Cal State systems”, and apply for student aid.

Fine: After four years of studying and learning Swedish and still failing my maths exams, it didn’t take too awfully long to persuade me; but then he began to get enthusiastic about the whole thing, and nothing would do but that I should also apply to Stanford University! I said, “WHAAAAT?!?!, how am I going to afford that on fifty dollars a week, when I couldn’t make it at CCNY in NY?” He said, “No problem, they have lots of money at Stanford. If you can get admitted, and survive the freshman year, then they won’t let you fail to continue merely for financial problems.” So I applied, and he coached me on how to go at it, and wrote a letter in my behalf to the head of the Philosophy Department (I’d given up on the EE, but the department was giving Philosophy credit for some Computer Science courses, and that seemed like a better choice), and in the end, I was admitted. And I did survive the freshman year, but Stanford’s generosity did not materialize (by that time I was thirty-nine years old), and for the first two years I lived in a garage on bread and peanut-butter for breakfast, no lunch, and a half-can of catfood with a handful of rice for dinner. Fortunately I had volunteered to grade papers in the CS101 course, and Bill McKeeman (the prof) was so pleased with the job I did that he gave me a job as an R.A. in the Computer Science department, which helped to pay for the catfood. When I attained Junior status I got a Federal Loan that took me through the rest of the way. But the Xeroxes had made the trip back with me, and I had kept on reading Logic. I graduated with a BA degree in Philosophy, having by that time taken, (or at least attended lectures for) almost all the courses in Computer Science, and as well, attended some of the seminars in Logical Foundations of Mathematics conducted by Sol Feferman and George Kreisel.

Sol Feferman (may he live in good health), had accepted the task of being my senior advisor while I was still in my ‘Diagonal Crank’ phase. (He didn’t know that at the time, because I’d learned to keep quiet about it, but by the time that I had finally produced my senior thesis in an acceptable form, he’d realized the mess I was in.) It was only through his virtually super-human patience that I was finally disabused of my compulsive delusion. My relief (BEING a Diagonal Crank is no bed of roses!) was exceeded only by my chagrin and contrition, and I entered that time of life known as “the mid-life crisis”, a two-and-a-half-year episode of acute clinical depression.

In the ensuing nearly thirty years, I caused him no further bother, as I had sworn off of Pure Mathematics, and, in one of the most colossally naïve acts of a life characterized by naïve acts, I gave my entire mathematical library (excepting only Church’s Introduction, Kleene’s Metamathematics, the Woodger edition of Tarski’s Logic, Semantics, Metamathematics, van Heijenoort’s source book, and the Collected Works of Gerhard Gentzen) to Stanford Library on the naively specious assumption that “I can always go back and access them through the Stanford Library System.” Hah! No such luck. My precious library, painfully acquired over a period of more than twelve years, vanished into some black hole, never again to emerge. (During that time the expense of buying books ensured that I had no difficulty in maintaining the same weight and waist measurement that I’d had at age 23.)

In the meantime, I did some tech-writing for money, and went back to my other love, music and the Classical Spanish Guitar, while I devoured food, drank an occasional glass of beer or red-wine, and put on fifty pounds and twenty inches of waistline in the process. I also was fortunate enough to marry another Stanford graduate, a really nice lady who is still my wife, and life was more or less pleasantly normal after that.

Then, one day, as I was innocently walking by my Big Bookshelf, a long, thin, almost membranous, arm reached out from Van’s sourcebook, grabbed me by the ankle, and refused to let go until I took down the book. I was hooked again. No longer possessed by the Diagonal Crank obsession, I was now able to accept, grasp, and even to assimilate ideas which for so many years had somehow triggered the D.C. syndrome, and thereupon had immediately become utterly opaque to me. My infatuation with Predicativity returned; and slowly, painfully, (book prices had increased by an order-of-magnitude), I reacquired most of my mathematics library, and the Don Quixote of Mathematical Philosophy was saddled up again. Heigh Ho, Rosinante!

I occasionally emailed Sol with questions of a technical nature, and he, always the mensch, answered briefly and promptly. [“Are you the same ‘Steve Newberry’ I knew thirty years ago at Stanford?”] In time, I produced an essay which, with much difficulty, and after over a year of whining and waiting, I finally persuaded him reluctantly to consent to read.

The essay constructively demonstrated (a one-line recurrence, followed by the union of the well-ordered, monotone, transitive, unbounded sequence of finite Boolean Lattices reified by the recurrence) the existence of a non-standard counter-model to Cantor’s Theorem. Unfortunately, I had neglected adequately to emphasize the non-standard nature of the model, and he, quite understandably (although still the mensch, still the gentleman), became perturbed, and asked me no longer to address my epiphanic visions to his attention, on grounds of a limited life-expectancy. As I am five months older than he, and am still loaded with guilt from having already wasted as much or more of his time than he had been accustomed to expend on advising doctoral candidates with their dissertations, I had no option but to apologize, butt out, and more or less shut up.

Perhaps it is time to explain the Cusp: It arises as the critical and essential difference between deductive reasoning [Logicism] and inductive reasoning [Empiricism]. I like to refer to this as the Empirical Vacuity of deductive reason. Wittgenstein had commented on this at some length in his doctoral thesis, the Tractatus Logico-Philosophicus (as had Kant more than a century earlier). In particular, truths derived from purely empirical data cannot be materially implied by tautologous truths obtained through purely logical/syntactic means, because the latter are in essence empirically vacuous. [That would be “getting something for nothing”, creating new empirical information from no empirical information: Not allowed.]

My essay had sought to resolve the “Cusp” by noting that, while inductively-true propositions (n-valid, valid on all-but-only-finite domains) validly imply deductively-true (u-valid, universally valid, tautological) propositions, the converse does not hold. The propositions which are only-inductively-true are contingent (Kleene’s word; Carnap’s term is L-indeterminate), and therefore by definition, necessarily admit of counter-models, which exist only on completed-infinite domains.* The only theories devoid of finite sub-models that are recognized by the Standard Model of Arithmetic are those “elementarily equivalent with/to” Peano Arithmetic, (which essentially defines the Standard Model of Arithmetic), and hence these other (non-elementarily equivalent) theories with no finite sub-models are categorized as being non-standard, and hence, from the perspective of the SMA and classical Set theory, have no legitimate significance.

* That idea is worth a moment’s reflective thought.
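The n-valid/u-valid distinction can be made concrete with a standard textbook example (my choice of illustration, not the essay’s): the proposition “every injective self-map of the domain is surjective” is valid on every finite domain, yet fails on \omega. A Python sketch, checking the finite cases exhaustively:

```python
from itertools import product

def injective(f):
    """f is a tuple: f[i] is the image of i. Injective iff no repeats."""
    return len(set(f)) == len(f)

def surjective(f, n):
    """Surjective onto {0,...,n-1} iff every element is hit."""
    return set(f) == set(range(n))

# n-valid: on every finite domain {0,...,n-1}, every injective
# self-map is surjective (verified exhaustively for small n).
for n in range(1, 5):
    for f in product(range(n), repeat=n):   # all maps [n] -> [n], as tuples
        if injective(f):
            assert surjective(f, n)

# ...but not u-valid: on the infinite domain N, the successor map
# k |-> k + 1 is injective yet never hits 0.
def succ(k):
    return k + 1
```

So the proposition holds on all-but-only finite domains, and its counter-models live only on completed-infinite domains, exactly the situation described above.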

Cantor’s theorem, which is true in the SMA, is (only) inductively-true. We know this because it materially-implies other empirically contingent propositions, e.g., the assertion of the existence of infinite sets (non-denumerably many!), hence cannot itself be deductively-true, u-valid. (The “Diagonal” argument is just the recursive construction of an implicitly inductive [“every” = “all”] proof of the falsity of a certain class of propositions. It took me forty years to see that.)
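The recursive construction referred to here fits in a few lines of Python (a sketch of the diagonal construction itself, with my own hypothetical names `diagonalize` and `listing`): against any countable listing of binary sequences, flipping the diagonal yields a sequence the listing omits.

```python
def diagonalize(enum):
    """Given enum(i)(j) = j-th bit of the i-th listed binary sequence,
    return a sequence differing from the i-th one at position i."""
    return lambda n: 1 - enum(n)(n)

# One concrete countable listing: the i-th sequence is the binary
# expansion of i, padded with zeros.
def listing(i):
    return lambda j: (i >> j) & 1

anti = diagonalize(listing)   # differs from listing(n) at position n, for every n
```

Note that the construction is uniform in the listing: it refutes each listing one at a time, position by position, which is just the implicitly inductive [“every” = “all”] character remarked on above.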

Therefore, Cantor’s theorem must, of necessity, have at least one non-standard counter-model, and the (union of the sequence of) Boolean Lattices generated by the unit-sets of \omega is it. [It consists of the finite and co-finite sub-sets of \omega, augmented by the limit-sets (unions over all the elementary chains in the ultra-filter); of which chains there are obviously only countably many, since such chains are initiated by only countably-many unit-sets and terminate in only countably-many limit-sets.]

This is a non-standard model, a very pretty model, of all-but-only the Predicatively-definable subsets of \omega [on the assumption that unions of countable, well-ordered, monotone, transitive, unbounded sequences are predicative, which Sol several years previously had assured me that “even Poincaré would have accepted as predicative”. (verbatim quote!)]
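The finite/co-finite fragment of that lattice (before the limit-sets are adjoined) can even be coded up directly. The following is a toy Python sketch under my own naming (`FinCofin`); it shows only closure under complement, union, and intersection, not the full model.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FinCofin:
    """A finite or co-finite subset of N.

    cofinite=False: the set is exactly `support`;
    cofinite=True:  the set is N minus `support`.
    Every such set is described by finite data, so the whole
    collection is countable.
    """
    cofinite: bool
    support: frozenset

    def __contains__(self, n):
        # n is in the set iff (n in support) xor cofinite
        return (n in self.support) != self.cofinite

    def complement(self):
        return FinCofin(not self.cofinite, self.support)

    def union(self, other):
        if not self.cofinite and not other.cofinite:
            return FinCofin(False, self.support | other.support)   # fin | fin
        if self.cofinite and other.cofinite:
            return FinCofin(True, self.support & other.support)    # cof | cof
        fin, cof = (self, other) if other.cofinite else (other, self)
        return FinCofin(True, cof.support - fin.support)           # fin | cof

    def intersection(self, other):
        # De Morgan: A & B = ~( ~A | ~B )
        return self.complement().union(other.complement()).complement()
```

Closure under the Boolean operations is exactly Henkin’s requirement on a general model’s space of entities; the limit-sets, which this sketch omits, are what the essay’s recurrence adjoins.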

So now we have a robustly predicative countable model of the powerset of \omega, and the countable continuum of predicatively-definable real numbers will be defined by Cauchy-sequences of rationals (which themselves are predicatively-definable within the model); but the original Diagonal-proof of the “uncountability” of the set of all binary sequences is still possible! Since ALL entities definable in this system of STT are countable (and let’s agree, from here on, to call the system ‘PTT’, for “Predicative Theory of Types”, and take it as known that these entities are all-but-only those which ARE predicatively-definable), how do we interpret the significance of this construction of a member of 2^\omega which is obviously not a member of the sequence from which it has just been defined?

Clearly, there must be an alternative interpretation, but what can it be?

Now it is time to think about the set of all Chaitin/Martin-L\”of/Solovay \Omega-random binary sequences. [citation] This is, by definition, necessarily a subset of 2^\omega, and that subset is (also by definition) quintessentially random, and therefore incapable of being well-ordered.

So the Diagonal-proof demonstrates that there is a (countable?) counter-model to the Axiom of Choice, which is hardly surprising, considering the independence proofs of Cohen and Feferman. AxCh is, together with the uncountability of the Cardinality of the Continuum, quintessentially contingent, and that is simply a FACT.

“Everything is a trade-off”: In this case one gives up the non-denumerable continuum in exchange for a denumerable model of the continuum which is categorical (as are all models of theories expressed in the language PTT), and which omits all-and-only the C/M-L/S \Omega-randomly generated binary sequences in 2^\omega (which may plausibly be conjectured to be the characteristic functions of the uncountable subsets of \omega).

This denumerable continuum is known among Pure Mathematicians concerned with Foundational Studies as that which is “provable within Feferman’s system IR of predicative analysis”, or, equivalently, within “the first-order part of ATR_0, and/or ATR_0^{set}”, as set forth in section VII.3 of Simpson’s SUBSYSTEMS, and properly includes the H-sets of section VIII.3 of Simpson’s SUBSYSTEMS, and the Baire space N^N and the Axiom of Countability, together with much of the work of G\”odel developing the relation between constructible sets and the projective hierarchy.

It also includes a very elegant theory of recursive transfinite ordinals up to the limit ordinal \Gamma_0, as developed (independently) by both Solomon Feferman and Kurt Sch\”utte. Within this formalism can be expressed much of “measure theory, separable Banach space theory, Ramsey theory, matching theory, well quasiordering theory, and countable algebra”. [cite Simpson SUBSYSTEMS, p. 412]

But what about those ‘uncountable sets’ after all? Since it is consistent to assert their existence, they must be possible, and therefore must correspond to some real (i.e., existing) entities/phenomena.

I submit that the correspondence is to be found in the FACT that all sets are either capable of being (simply/well) ordered or not. The former are countable, and the latter are not. This is a question, not primarily of cardinality, but of structure. The disjunction is tautologous, and in any context not dependent upon contingent/empirical truth (Goldbach’s Conjecture, Riemann’s Hypothesis, the Twin-Primes Conjecture, &c.) the issue is determinate, and the principle of excluded-middle applies.

While we’re at it, we might as well dispose of the putative difference between ‘potentially’ and ‘actually’ infinite sets, which has had philosophers at-knives-drawn as far back as Aristotle. We need merely to stipulate that any (actually-infinite) set M which is capable of being (simply or well) ordered is represented by the unbounded (potentially-infinite) sequence S which contains all-but-only those elements which appear in M, and that *M =df= S; and where ‘U’ denotes “union”, U(S) = M. Here, ‘*’ and ‘U’ are inverse operators. Thus, where ‘X’ may denote the aforementioned M,

*U(X) = U*(X) = U(*X) = U(S); and U(S) = X.

That having been said, it follows very easily that “M is uncountably infinite iff S is unbounded (potentially-infinite) and \Omega-randomly-generated”, and “M is countably infinite iff S is unbounded (potentially-infinite) and orderly-generated”, where ‘orderly-generated’ means any one of “simply-ordered”, “well-ordered” or “recursively-generated”.
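One computational reading of the *M/U(S) stipulation (my gloss, not the essay’s): never handle the actually-infinite M directly, only the generator of the potentially-infinite sequence S. For an orderly-generated, monotone S, membership in M then becomes decidable by plain enumeration. A small Python sketch with illustrative names:

```python
from itertools import count

def evens():
    """An 'orderly-generated' (recursively-generated, monotone) unbounded
    sequence S, standing in for the actually-infinite set M of even numbers."""
    for n in count():
        yield 2 * n

def member(x, seq_factory):
    """Decide x in U(S) for a monotone orderly-generated S:
    enumerate until we reach x, or pass it."""
    for y in seq_factory():
        if y == x:
            return True
        if y > x:
            return False
```

An \Omega-randomly-generated sequence admits no such procedure: with no monotonicity or recursive rule, enumeration never licenses the answer “no”, which is one concrete face of the countable/uncountable divide drawn above.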

[NB: Cauchy-sequences of rationals are highly non-random.] But 2^\omega must of necessity contain all the C/M-L/S \Omega-randomly-generated infinite binary-sequences that exist!

But what does this do to Set Theory?!? We’ve been taught that NBG is predicative, and there certainly are those who maintain that ZFC, since formalized entirely within FOL, must therefore automatically be seen as being predicative. But PTT demonstrates that at least one version of predicativity simply ignores the existence of the entire subject-matter of classical Set theory!

Well, what do the Masters: Skolem, Bernays, G\”odel, Kleene et al., have to say on the matter? They are essentially unanimous that FOL is inadequate for the expression of all truths, either of Set Theory, or of mathematics. That (“the Relativity of Set Theory”) is their explanation for the resolution of the L\”owenheim-Skolem “Paradox”, and it works just as well here too. Different views of the Universe are possible, and therefore consistent.

All done, neat and tidy.

THAT IS THE “CUSP OF ANOMALY”, and its RESOLUTION

[Honesty compels me to admit that the correspondence between induction/deduction and standard/non-standard models was not quite as plainly drawn in the essay as it is here.]

Which brings us back to Kleene’s Mathematical Logic: Section 53, “Skolem’s Paradox and non-standard models of arithmetic”, says it all. On all of my (many!) previous readings of this section (and so many others like it), I had kept wondering, “Why are these people so obsessed with Peano Arithmetic? It is as though it were their entire universe!” [You think? Really? Hello? Four years as a Philosophy major at Stanford University; why didn’t I get that sooner?]

I’m tempted to say, “Now I can die in Peace.” But it’s not true. In a life which has been consistently characterized by naïve enthusiasms, philosophical revelations, epiphanic insights, and hypo-manic head-trips, I have never more than now wished to live, at least a little while longer . . . maybe get back to the guitar . . . perhaps try learning to play the violoncello . . . ,

but The Cusp has been laid to rest, amen.

Los Altos, CA, 2007