Tetration Forum

Full Version: Is successor function analytic?
Consider the Ackermann successor function $$\varphi(u)$$ such that $$\varphi(a \uparrow ^m b)=a \uparrow ^{m+1} b$$. Is $$\varphi(u)$$ analytic? If so, then the complex dynamics of $$\varphi^n(u)$$ can be studied where there is a finite real fixed point at $$a\uparrow^\infty\infty$$, for example where $$1 \leq a < e^\frac{1}{e}$$.
(09/22/2022, 12:19 PM)Daniel Wrote: Consider the Ackermann successor function $$\varphi(u)$$ such that $$\varphi(a \uparrow ^m b)=a \uparrow ^{m+1} b$$. 

Is this well-defined? I mean, to define it that way you would need something like: if \(a_1\uparrow^m b_1=a_2\uparrow^m b_2\) then \(a_1=a_2\) and \(b_1=b_2\)?
Hi Daniel.
I encourage you to make your question more precise and to always define the terms of discourse. In mathematics, defining the terms is most of the time like solving 70% of the problem.

The existence of a hypothetical successor function for ranks, i.e. "adding one" to the rank parameter, is not only a very non-trivial matter but also, I claim, the key to solving the whole non-integer ranks problem. This is not a superficial guess: I've centered all my research around the problem of correctly defining such a successor function, and I now know it yields promising results.

As I said, it is non-trivial. It seems you approached this successor function in the naive way. Let me rephrase what I think is evident from your definition.

Definition (Naive rank successor): let \(\uparrow^n:\mathbb N\times \mathbb N\to\mathbb N\) be the natural Goodstein hyperoperation sequence (Knuth's arrows reparametrized so that the original rank \(2\), multiplication, becomes rank \(0\)). Define the naive rank successor function to be the function over the natural numbers
$$\Sigma_{\rm naive}:\mathbb N\to\mathbb N$$
satisfying the functional equation
$$(*)\,\,\,\,\,\,\,\forall a,b,n\in\mathbb N.\, \, \Sigma_{\rm naive}(a\uparrow^n b)=a \uparrow^{n+1} b$$

As bo198214 suspects, this is not a good definition. Remember, a function is a binary relation that is right-univocal (functional) and left-total: left-total means that it is defined for every argument; right-univocal means that we cannot have \(f(a)\neq f(b)\) when \(a=b\)... quite a reasonable requirement, and the very definition of what a function is.

I'll prove that the set of functions satisfying eq. \((*)\) is empty. In other words, the function \(\Sigma_{\rm naive}\) cannot exist.
To do that I'll show that the naive definition contradicts the right-univocal property of \(\Sigma_{\rm naive}\), a property we assume the moment we assume it is a function.

Proposition 1: the function \(\Sigma_{\rm naive}\)  doesn't exist.
Proof: assume that the successor function exists. Then, by definition
$$(*)\,\,\,\,\,\,\, \forall a,b,n\in\mathbb N.\, \, \Sigma_{\rm naive}(a\uparrow^n b)=a \uparrow^{n+1} b$$
but also, if \(x=y\) then \(\Sigma_{\rm naive}(x)=\Sigma_{\rm naive}(y)\). Take \(x=2\uparrow^0 3\) and \(y=6\uparrow^1 1\). Clearly
$$x=y$$
because \(x=2\cdot 3=6\) and \(y=6^1=6\). By functionality we should deduce
$$\begin{align}
\Sigma_{\rm naive}(x)&=\Sigma_{\rm naive}(y)\\
\Sigma_{\rm naive}(2\uparrow^0 3)&=\Sigma_{\rm naive}(6\uparrow^1 1)
\end{align}$$
but by the functional equation defining the successor we must deduce
$$\begin{align}
2\uparrow^1 3&=6\uparrow^2 1\\
2^3&={}^1 6\\
8&=6
\end{align}$$
This concludes the proof: \(\Sigma_{\rm naive}\) is not a function, or if it is, it cannot satisfy eq. \((*)\).
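The counterexample is easy to check by machine. A minimal Python sketch (the helper `arrow` is hypothetical, implementing the reparametrized Knuth family with rank \(0\) = multiplication, as in the definition above):

```python
def arrow(a, n, b):
    """a ↑^n b, with rank 0 = multiplication (the reparametrized Knuth family)."""
    if n == 0:
        return a * b            # rank 0 is multiplication
    if b == 0:
        return 1                # a ↑^n 0 = 1 for every rank n >= 1
    return arrow(a, n - 1, arrow(a, n, b - 1))

# Equal inputs for Sigma_naive...
print(arrow(2, 0, 3), arrow(6, 1, 1))   # 6 6
# ...but the functional equation forces unequal outputs:
print(arrow(2, 1, 3), arrow(6, 2, 1))   # 8 6
```

So \(\Sigma_{\rm naive}\) would have to send the single input \(6\) to both \(8\) and \(6\).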


Note that we can find a much more elegant counterexample if, instead of Knuth's family centered at \(0=\rm multiplication\), we consider the original one centered at \(0=\rm successor\). We can speed-run the proof:
$$\begin{align}
0&=0\\
0+0&=1\cdot 0\\
0[1]0&=1[2]0\\
\Sigma_{\rm naive}(0[1]0)&=\Sigma_{\rm naive}(1[2]0)\\
0[2]0&=1[3]0\\
0\cdot 0&=1^0\\
0&=1
\end{align}$$
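This speed-run can also be machine-checked. A sketch with a hypothetical helper `H` for the original Goodstein family, with rank \(0\) = successor, \(1\) = addition, \(2\) = multiplication, \(3\) = exponentiation:

```python
def H(a, n, b):
    """a [n] b: Goodstein family with rank 0 = successor."""
    if n == 0:
        return b + 1
    if b == 0:
        # base cases: a [1] 0 = a, a [2] 0 = 0, a [n] 0 = 1 for n >= 3
        return a if n == 1 else (0 if n == 2 else 1)
    return H(a, n - 1, H(a, n, b - 1))

print(H(0, 1, 0), H(1, 2, 0))   # 0 0  -> equal inputs: 0+0 = 1*0
print(H(0, 2, 0), H(1, 3, 0))   # 0 1  -> Sigma_naive would force 0 = 1
```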

We can prove that the non-commutativity of the higher hyperoperations is an even greater obstruction to the functionality/existence of the successor function.

Proposition 2: if \(\Sigma_{\rm naive}\) exists, then every \(\uparrow^n\) is commutative.
Proof: proceed by induction. The base case is \(\uparrow^0=\times\), and multiplication is commutative.
Then, assume \(\uparrow^n\) is commutative:
$$\begin{align}
\forall a,b.\, a \uparrow^n b&=b\uparrow^n a\\
\forall a,b.\, \Sigma_{\rm naive}(a \uparrow^n b)&=\Sigma_{\rm naive}(b\uparrow^n a)\\
\forall a,b.\, a \uparrow^{n+1} b&=b\uparrow^{n+1}  a\\
\end{align}$$
using induction we deduce the proposition. Since exponentiation is not commutative, e.g. \(2\uparrow^1 3=8\neq 9=3\uparrow^1 2\), the contrapositive gives yet another proof that \(\Sigma_{\rm naive}\) cannot exist.



Bonus proof: thinking about the successor being defined for every argument, we also obtain a neat proof that such an object cannot do what it promises. In fact, if it were a function, we could trivially evaluate its value at each natural number:

$$\Sigma_{\rm naive}(n)=\Sigma_{\rm naive}(1\cdot n)=\Sigma_{\rm naive}(1\uparrow^0 n)=1\uparrow^1 n=1^n=1$$
Yeah, I now think a (naive) successor function doesn't exist. I thought that the Ackermann function might be accessible from a doubly recursive function, but it seems a diagonalization argument could be used to disprove this.
I can't follow you. Please clarify what you mean by accessible and double recursion, and how you think a diagonalization argument would disprove it.

Anyways, if you are interested in a possible route to defining a non-naive rank successor process, you can begin with this post and follow the links: https://math.eretrandre.org/tetrationfor...71#pid9571

Here is a gentle introduction to the topic: (2015) MphLee, Introduction to the non-integer ranks problem
I think I can clarify some confusion. Daniel is looking for the functor:

$$
\iota f = F\\
$$

Where \(F(s+1) = f(F(s))\).

His asking if this is analytic is a tad imprecise, but a very reasonable question: it asks whether there is some analytic operation which sends \(f\) to \(F\). This would mean we have something like:

$$
\begin{align}
F &= \int_\gamma h(z,f)\,dz\\
F &= \sum_{n=0}^\infty q_n f^n\\
\end{align}
$$

Whereby \(F\) is a function of \(f\); we can write this \(F[f]\). And we ask that:

$$
\lim_{\epsilon \to 0}\frac{F[f + \epsilon h] - F[f]}{\epsilon} = G[f]\\
$$

is a convergent limit for an arbitrary test function \(h\). This is common in functional analysis, and I believe that is what Daniel is asking. By saying analytic, he's basically just abusing the language, but it means the same thing: it is analytic in the function argument, not in the argument of the function.
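This limit can be illustrated numerically. A toy sketch, using \(F[f]=f\circ f\) as a stand-in for the super-function operator (that choice, and the names `F` and `gateaux`, are purely illustrative assumptions):

```python
import math

def F(f):
    """Toy operator sending f to f∘f, standing in for the super-function operator."""
    return lambda x: f(f(x))

def gateaux(F, f, h, x, eps):
    """Finite-difference approximation of the functional derivative of F at f, in direction h."""
    f_eps = lambda t: f(t) + eps * h(t)
    return (F(f_eps)(x) - F(f)(x)) / eps

f, h, x = math.exp, math.sin, 0.3
# The chain rule predicts G[f](x) = h(f(x)) + f'(f(x))*h(x); here f' = exp.
exact = math.sin(math.exp(x)) + math.exp(math.exp(x)) * math.sin(x)
for eps in (1e-2, 1e-4, 1e-6):
    print(eps, abs(gateaux(F, f, h, x, eps) - exact))   # error shrinks with eps
```

The difference quotient converges to a well-defined \(G[f](x)\), which is exactly the kind of limit the display above asks for.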

As an answer: YES, DANIEL. If \(1 < b < \eta\), then the successorship operator is "analytic"! This actually isn't that hard to prove! (Schrödinger mechanics saves us along every step.)




EDIT:

Proof: take

$$
F[f](s) = f^{\circ s}(1) = \frac{d^{s}}{dw^{s}}\Big{|}_{w=0} \sum_{n=0}^\infty f^{\circ n}(1)\frac{w^n}{n!}\\
$$

The operator \(\frac{d^{s}}{dw^{s}}\Big{|}_{w=0}\) is a bounded, and hence continuous, operator on Ramanujan's space; it's just a fancy Mellin transform. If we substitute \(f_\epsilon = f + \epsilon h\), for some entire function \(h\) and arbitrarily small \(\epsilon\), then:

$$
F[f_\epsilon](s) = (f+\epsilon h)^{\circ s}(1) \\
$$

this converges just as well (though the domain problem can be tricky). Additionally, it's really just busywork to show that:

$$
\frac{F[f_\epsilon](s) - F[f](s)}{\epsilon} = G[f] + O (\epsilon)\\
$$

By which, YES, the functor "taking us to the super function" is "analytic". Though your language is wrong, Daniel ;)


To those of you who may still be confused: this is essentially the statement that if \(h\) is an entire function (for our purposes, with \(1 < b < \eta\), it is bounded on \(\Re(z) > 0\)) and \(\epsilon\) is arbitrarily small, then:

$$
(f+\epsilon h)^{\circ s}(1) = f^{\circ s}(1) + O(\epsilon)\\
$$

So epsilon shifts in the base function we are taking the super operator of create epsilon shifts in the super function. We can write this differentially, and since this is in many ways a complex "functional/functor" derivative, we can call it "analytic".

This shit is much more common in physics, in Laplacian stuff or old Eulerian mechanics, but even there they would say "F[f] is analytic in f", though it means something a bit different to them. But essentially the statement is that \(F[f]\) is "complex differentiable" in \(f\), therefore it is "analytic". This is absolutely true; Daniel's just chosen weird language.


 

A good example can be written using \(1 < b < \eta\). Let's write:

$$
f(z) = b \uparrow^n z\\
$$

And:

$$
f_\epsilon(z) = (b+\epsilon) \uparrow^n z\\
$$

One can rigorously show that:

$$
\frac{f_\epsilon(z) - f(z)}{\epsilon} \to G(z)\\
$$

as \(\epsilon \to 0\), which is essentially the statement that:

$$
f_\epsilon(z) = f(z) + \epsilon G(z) + O(\epsilon^2)\\
$$

which is largely considered the statement of differentiability. But we are complex oriented, and differentiability for complex variables is "analyticity".
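For \(n=1\) (exponentiation in this indexing) the limit can be checked directly: \(f(z)=b^z\), and the expected limit is \(G(z)=\partial_b\, b^z = z\,b^{z-1}\). A quick numerical sketch (the names are illustrative):

```python
b = 1.2                                  # 1 < b < eta = e^(1/e) ≈ 1.4447
f = lambda z: b ** z                     # f(z) = b ↑^1 z, i.e. exponentiation
G = lambda z: z * b ** (z - 1)           # the expected limit: d/db of b^z

z = 2.5
for eps in (1e-2, 1e-4, 1e-6):
    f_eps = lambda zz: (b + eps) ** zz   # f_ε(z) = (b+ε) ↑^1 z
    print(eps, abs((f_eps(z) - f(z)) / eps - G(z)))   # -> 0 as eps -> 0
```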

And now when \(n \mapsto n+1\), the same result appears. Whereby:

$$
F[f_\epsilon] = (b+\epsilon)\uparrow^{n+1} z\\
$$

And not only does this display analytic behaviour in \(z\), it also displays analytic behaviour as \(f_\epsilon \to f\), when taking limits in this manner. And we can say:

$$
\begin{align}
F[f_\epsilon] &= F[f] + O(\epsilon)\\
(b+\epsilon) \uparrow^{n+1} z &= b \uparrow^{n+1} z + O(\epsilon)\\
\end{align}
$$

And if we take:

$$
G[f](z) = \lim_{\epsilon \to 0}\frac{F[f_\epsilon](z) - F[f](z)}{\epsilon}\\
$$

Then:

$$
F[f_\epsilon](z) = F[f](z) + \epsilon G[f](z) + O(\epsilon^2)\\
$$

I still agree entirely with everything MphLee and Bo said. They're totally correct in their analysis; Daniel was not clear. But I'm pretty sure this is what he was asking, he just lacked the vocabulary to ask it.

TL;DR

The super function operator is analytic (if you choose an older meaning of analytic), but I only show it for the bounded case.

:)
Just to go deeper into this, we can be more rigorous.

Let's say that \(f(z) : \mathbb{C}_{\Re(z) >0} \to \mathbb{C}_{\Re(z) > 0}\), while additionally \(|f(z)| < M\) is bounded by some constant \(M\). This is our space of functions.

Now the solution:

$$
f^{\circ s}(1)\\
$$

necessarily exists, since \(f\) has an attracting fixed point and all orbits tend to this fixed point. If \(f\) takes a simply connected domain to itself, it has one attracting fixed point. We must additionally assume here that \(f'(z) \neq 0\), which is totally fine for hyper operators.

If I write \(f+\epsilon\), then this is equally a bounded function within the above space. So it has an attracting fixed point, and \((f+\epsilon)^{\circ s}(1)\) is a holomorphic function \(F[f_\epsilon](s) : \mathbb{C}_{\Re(s) >0} \to \mathbb{C}_{\Re(s)>0}\). Additionally, this solution is just as bounded, \(< M\). Therein:

$$
\frac{(f+\epsilon)^{\circ s}(1) - f^{\circ s}(1)}{\epsilon} \to G[f](s)\\
$$

and this statement is entirely rigorous: \(F[f]\) is analytic in \(f\).
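A quick numerical sanity check of this last claim, for the concrete case \(f(z)=b^z\) with \(b=\sqrt 2\) (attracting fixed point \(2\), multiplier \(\ln 2<1\)); the constant perturbation \(f+\epsilon\) is the one used above, and the setup is an illustrative toy, not the full operator:

```python
import math

b, eps = math.sqrt(2.0), 1e-6
f  = lambda z: b ** z            # attracting fixed point 2, multiplier ln 2 < 1
fe = lambda z: b ** z + eps      # the perturbed function f + eps

z, ze = 1.0, 1.0
for _ in range(200):             # iterate both orbits starting from 1
    z, ze = f(z), fe(ze)

print(z)                         # ≈ 2.0: the orbit of f reaches its fixed point
print((ze - z) / eps)            # bounded, ≈ 1/(1 - ln 2): the shift is O(eps)
```

The difference quotient stays bounded rather than blowing up, which is the finite-difference shadow of \((f+\epsilon)^{\circ s}(1) - f^{\circ s}(1) = O(\epsilon)\).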