08/16/2008, 01:58 AM

Just a quick update on this subject... the ETF function I defined deals only with binary trees. This corresponds with the Veblen hierarchy, which is generated by the binary Veblen function phi_a(b). Oswald Veblen himself actually went further than this, defining ordinal functions of n variables. These correspond with functions on n-ary trees; it is possible to define an n-ary tree transformation function along lines similar to mine, one that reduces a left-branching tree to a very, very long right-branching one.
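For concreteness, here is a toy Python sketch of a "left-branching to right-branching" rewrite. To be clear, this is plain tree rotation, invented here purely for illustration; it is far weaker than the ETF function described above, which also copies subtrees (that copying is what produces the enormous blow-up in length).

```python
# Toy illustration only: rewrite a binary tree into right-branching form
# by repeatedly applying the rotation ((a, b), c) -> (a, (b, c)).
# Trees are leaves ('o') or 2-tuples of trees.

def rotate_right(t):
    """Return a right-branching tree with the same leaves as t."""
    if not isinstance(t, tuple):
        return t
    left, right = t
    # Rotate until the left child is a leaf.
    while isinstance(left, tuple):
        a, b = left
        left, right = a, (b, right)
    return (left, rotate_right(right))

left_comb = ((('o', 'o'), 'o'), 'o')
print(rotate_right(left_comb))  # ('o', ('o', ('o', 'o')))
```

Unlike ETF, this rotation preserves the number of leaves, so it always terminates trivially; the interesting (and delicate) termination arguments only arise once subtrees get duplicated.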

Now if we take the limit of this process (since there is no end to the ordinals), we have a function that deals with trees with omega children. (Some care has to be taken here to ensure that repeated applications of the function will terminate: this can be assured by substituting a finite, but varying, value for omega; in effect, we're dealing with trees where the number of children is indexed.) This brings us up to something called the "small Veblen ordinal".
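One way to picture the "finite but varying value for omega" trick (a hypothetical sketch of mine, not the actual construction): store an omega-branching node as a *rule* for generating children, and substitute a concrete finite count at evaluation time.

```python
# Hypothetical sketch: an omega-branching node stores a rule for its
# children rather than infinitely many children; expanding at argument n
# substitutes n concrete children, so every expanded tree is finite.

def omega_node(child_rule):
    """child_rule: index i -> the i-th child (a leaf or another node)."""
    return ('omega', child_rule)

def expand(tree, n):
    """Replace every omega node by a finite node with n children."""
    if isinstance(tree, tuple) and tree[0] == 'omega':
        rule = tree[1]
        return tuple(expand(rule(i), n) for i in range(n))
    return tree

t = omega_node(lambda i: 'leaf')
print(expand(t, 3))  # ('leaf', 'leaf', 'leaf')
```

The "varying" part is that n need not be fixed: each application of the tree transformation can plug in a different (typically larger) finite value, which is what keeps termination provable.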

But Veblen went further: now that we've reached omega, the next logical step is to create a function with omega+1 arguments (or equivalently, a tree with omega+1 children per node). Then omega+2, omega+3, ... and so forth. Eventually, we may just define a function that takes an ordinal number of arguments. Once we get to this point, we might as well take the output of our functions, which is an ordinal, and use that as the number of arguments to our next-higher function, and so forth.
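The "one more argument position" step can be illustrated with a tiny bit of bookkeeping for finitary Veblen terms. The tuple representation and the `normalize` helper below are my own illustration; they rely only on the standard convention that leading zero arguments may be dropped: phi(0, a_1, ..., a_n) = phi(a_1, ..., a_n).

```python
# Illustration: a finitary Veblen term phi(a_{n-1}, ..., a_0) stored as a
# tuple of arguments. Leading zeros are redundant by convention, so a
# normalizer strips them; prepending a new nonzero leading argument is
# exactly the "one more variable" step in the text.

def normalize(args):
    """Drop redundant leading zero arguments from a Veblen term."""
    args = list(args)
    while len(args) > 1 and args[0] == 0:
        args.pop(0)
    return tuple(args)

print(normalize((0, 1, 0)))  # (1, 0): phi(0, 1, 0) = phi(1, 0) = epsilon_0
```

In this notation, each genuinely new ordinal comes from making a previously unused (further-left) position nonzero, e.g. phi(1, 0, 0) is Gamma_0, which no binary term phi(a, b) can denote.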

Somewhere along the line, we reach a point where the ordinals generated are so large that, to get any further, we need a function with X arguments, where X is an ordinal larger than any we have generated so far. The smallest such X is called the "large Veblen ordinal", and represents the limit of what is attainable using ordinal functions; equivalently, it represents the limit of computable functions that can be reached by iteration and diagonalization. In other words, it is iteration and diagonalization taken to their logical conclusion.

(As a side note, Jonathan Bowers' Exploding Array Function has the potential to span the ordinals below X. A very impressive feat for someone with not much formal training!)

Iteration and diagonalization will never reach X, so the function that corresponds with X represents the fastest-growing function that can be constructed from the successor function via iteration and diagonalization.
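To make "constructed from the successor function via iteration and diagonalization" concrete, here is a sketch of the familiar fast-growing hierarchy restricted to finite indices (the function names are mine): f_0 is the successor, each f_{n+1} iterates f_n, and the first diagonalization step defines f_omega(x) = f_x(x). These values explode very quickly, so keep the arguments tiny.

```python
# Fast-growing hierarchy, finite indices only:
#   f_0(x)     = x + 1                   (successor)
#   f_{n+1}(x) = f_n applied x times to x  (iteration)
#   f_omega(x) = f_x(x)                  (diagonalization)

def f(n, x):
    if n == 0:
        return x + 1
    result = x
    for _ in range(x):          # iterate the previous level x times
        result = f(n - 1, result)
    return result

def f_omega(x):
    """First diagonalization step: feed x in as the level index."""
    return f(x, x)

print(f(1, 3))      # 6  (f_1(x) = 2x)
print(f(2, 3))      # 24 (f_2(x) = x * 2^x)
print(f_omega(2))   # 8
```

Each step up the hierarchy of ordinal indices corresponds to one more round of iteration or diagonalization, and the claim in the text is that the large Veblen ordinal marks where this whole scheme tops out.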

The kicker is that X lies below the Church-Kleene ordinal. So not only is it computable, it barely scratches the surface of fast-growing computable functions! To get beyond X, we will truly need unconstrained recursion, of a kind for which even iteration and diagonalization do not suffice. I've no idea what such functions might look like, but they are all still computable!

So here lies the frontier in the push toward ever faster-growing computable functions.

(Personally I think simply leaping to uncomputable functions is a cop-out... many people who will quickly name BBN in the arms-race of fast-growing functions have absolutely no idea how unimaginably huge it is... only by exploring what lies before it, can we truly appreciate how incredibly fast growing it is!) </soapbox>
