I think I have figured out, conceptually, how to reconcile the singularities, which probably lie arbitrarily close to the real line, with the fact that we can generate a power series with a seemingly non-zero radius of convergence.
To see this, consider the following contrived function: f(x) = log(log(log(log(log(x+1) + e^(e^e)))))
Note that f(0) = 0: log(0+1) = 0, so the argument of the next log is e^(e^e), and the remaining logs peel that down to e^e, then e, then 1, and finally 0. Also note that you can pick points very close to -1 and the value will barely change at all, because log(x+1) is utterly swamped by e^(e^e), which is roughly 3.8 million; even at x = -1 + 10^-30, the term log(x+1) is only about -69.
Yet f has a genuine singularity at x = -1, where log(x+1) blows up. The function is very, very smooth at x = 0, almost constant in fact, and yet the singularity is definitely there.
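To make this concrete, here is a minimal numerical sketch, assuming Python with the mpmath library (my choice for illustration, not part of the original argument). It evaluates f at 0 and at a point absurdly close to the singularity, and the two values differ by only a few parts in ten million:

```python
# Sketch: how flat f is even absurdly close to its singularity at x = -1.
# Requires the mpmath library (arbitrary-precision arithmetic).
from mpmath import mp, log, exp, e

mp.dps = 50  # 50 significant decimal digits

def f(x):
    # f(x) = log(log(log(log(log(x+1) + e^(e^e)))))
    return log(log(log(log(log(x + 1) + exp(exp(e))))))

print(f(0))                      # 0 (to within rounding)
print(f(-1 + mp.mpf(10)**-30))   # about -4e-7: barely any change
```

The precision and the sample point 10^-30 are arbitrary choices; pushing the point even closer to -1 still barely moves the value.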
I think a similar effect is at work with the base change formula. For any finite n, we could in principle locate the singularities of f_n by careful analysis, and as n increases, these singularities get closer and closer to the real line. Yet they also become less and less perceptible, unless you manage to get really, really close to one.
As such, the power series expansion would seem to have a rather large radius of convergence, because the first few thousand terms of the series for f are determined primarily by the power series developed from f_n, for a rather small value of n.
But if we could compute tens of thousands of derivatives with millions of bits of precision, we should see the apparent radius of convergence slowly drop. Indeed, the full power series likely has radius of convergence 0. (Or is it more accurate to say that for any epsilon > 0, the radius of convergence can be shown to be smaller than epsilon? Is that any different from saying it's 0?)
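As a sketch of what that experiment looks like, using the contrived f above as a stand-in (the base change function itself isn't written out in this post), one can compute Taylor coefficients a_n at 0 and watch the root-test estimate |a_n|^(-1/n) of the radius of convergence:

```python
# Sketch: the root-test estimate |a_n|^(-1/n) of the radius of convergence,
# from numerically differentiated Taylor coefficients of the contrived f
# (a stand-in here for the base change function).
from mpmath import mp, log, exp, e, taylor

mp.dps = 200  # generous precision; high-order numerical derivatives are delicate

def f(x):
    return log(log(log(log(log(x + 1) + exp(exp(e))))))

coeffs = taylor(f, 0, 20)  # a_0 .. a_20 of the expansion about x = 0
for n in (1, 5, 10, 20):
    print(n, abs(coeffs[n]) ** (mp.mpf(-1) / n))  # apparent radius at order n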
It therefore seems rather miraculous that the limit is well-defined for real numbers, when it would seem to be undefined for non-real numbers.
~ Jay Daniel Fox