(09/16/2009, 01:59 PM) Gottfried Wrote:
Quote: In so far the interpolation is "correct".

Yes, maybe my wording was not explicit enough. I didn't mean to say it was incorrect in that sense: the "false" logarithm series is also "correct so far" within its own set of conditions/requirements, and it is completely legitimate in that setting, namely as an interpolation of some function in x. (We have had this argument often: an interpolation is only unique modulo some 1-periodic function.)
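To make the 1-periodic ambiguity concrete, here is a small sketch (a toy example of my own, not the series under discussion): F(z) = 2^z and a perturbed G(z) = 2^(z + 0.1 sin(2 pi z)) interpolate exactly the same values 2^n at the integers, yet differ between them.

```python
import math

def F(z):
    """The 'natural' interpolation of n -> 2**n."""
    return 2.0 ** z

def G(z):
    """Same values at the integers, perturbed by a 1-periodic term."""
    return 2.0 ** (z + 0.1 * math.sin(2 * math.pi * z))

# Both functions agree on the integers ...
for n in range(6):
    assert abs(F(n) - G(n)) < 1e-9

# ... but differ in between, so interpolation alone does not pin
# down a unique function.
assert abs(F(0.25) - G(0.25)) > 0.01
```

Any 1-periodic perturbation of the argument produces another equally valid interpolant, which is exactly the non-uniqueness referred to above.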

It is only in the wider view that this set of conditions/requirements turns out not to be useful for the original intention, which rested on the conceptually wider idea of a *logarithm*, namely to satisfy log(a*b) = log(a) + log(b) and something more.

With "correct" I meant here satisfaction of the superfunction equation F(z+1) = f(F(z)). Indeed it is a surprise, as I wouldn't expect an interpolation to satisfy this equation. But of course it does not suffice for uniqueness, while your logarithmic equation does.
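For reference, the superfunction equation can be checked numerically in a case with a closed form (my choice of toy example, not the tetration case): f(x) = x^2 has the superfunction F(z) = c^(2^z), since squaring F(z) doubles the exponent 2^z.

```python
# Toy verification of the superfunction equation F(z+1) = f(F(z)).
# Here f(x) = x**2 with closed-form superfunction F(z) = c**(2**z);
# c = 1.5 is an arbitrary choice of initial value F(0) = c.

def f(x):
    return x * x

def F(z, c=1.5):
    return c ** (2.0 ** z)

for z in [0.0, 0.3, 1.7, 2.5]:
    # relative tolerance, since F grows double-exponentially
    assert abs(F(z + 1) - f(F(z))) < 1e-9 * F(z + 1)
```

The equation holds identically here because F(z+1) = c^(2^(z+1)) = (c^(2^z))^2 = f(F(z)).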

Quote:Quote:2. the interpolation method seems only to converge if F(n) tends to a limit. Which the logarithm does not satisfy.

Hmmm. How far can this argument be related to that interpolation method in more generality?

My understanding is the following: there is a theorem (the identity theorem) that an analytic function is uniquely determined by its values at infinitely many arguments that have an accumulation point inside the domain of holomorphy.

In our case of having a limit, i.e. lim_{n->oo} F(n) exists, I guess the regular super-exponential is analytic at oo. So under this condition the sequence of natural numbers as arguments has its limit point oo inside the domain of holomorphy, and hence F is uniquely determined by its values on the natural numbers.
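As a numeric illustration of the "having a limit" case (my own check, using base b = sqrt(2), which lies below e^(1/e) ≈ 1.4447): the iterates F(n) = b^b^...^b converge to the lower fixed point a = 2 of b^x, since sqrt(2)^2 = 2.

```python
import math

# Base below e**(1/e), so the iterated exponential converges.
b = math.sqrt(2.0)

x = 1.0            # F(0) = 1
for _ in range(200):
    x = b ** x     # F(n+1) = b ** F(n)

# The limit is the lower fixed point a of b**x, here a = 2.
print(x)  # ≈ 2.0
```

For bases above e^(1/e) the same iteration diverges, which is exactly the situation where the limit-point argument no longer applies.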

The difficult case is where it is not analytic at infinity. However, all methods seem to converge to the same function even in this case; these methods are all non-interpolative. Here it seems that the function stays bounded when approaching infinity within a certain angle/sector: a kind of analyticity not in a whole vicinity of infinity, but only when approaching it through this sector.