06/27/2009, 09:39 AM

(06/26/2009, 10:51 PM)tommy1729 Wrote: as for the radius of convergence: it's not only valid for Andrew's slog but for every slog, and not only for the smallest fixed point but for every fixed point.

let A be the smallest fixed point, i.e. b^A = A

then (Andrew's!) slog(z) with base b should satisfy:

slog(z) = slog(b^z) - 1

=> slog(A) = slog(b^A) - 1

=> slog(A) = slog(A) - 1

No finite value can satisfy x = x - 1, so slog must blow up at A:

=> abs( slog(A) ) = oo

so the radius of convergence should be smaller than or equal to abs(A)
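To put a number on that bound, here is a minimal Python sketch (my addition, assuming base e, where the relevant singularity is the primary complex fixed point of exp; the starting value is just a convenient initial guess): Newton iteration on e^z - z = 0 locates A, and abs(A) is then the conjectured upper bound for the radius of convergence of the series at 0.

```python
import cmath

# Find the primary fixed point A of exp (e^A = A) by Newton iteration
# on f(z) = e^z - z, with f'(z) = e^z - 1.
z = 0.3 + 1.3j  # rough initial guess near the known fixed point
for _ in range(50):
    z = z - (cmath.exp(z) - z) / (cmath.exp(z) - 1)

A = z
print(A)        # roughly 0.31813 + 1.33724j
print(abs(A))   # roughly 1.37456 -> upper bound for the radius of convergence
```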

However, not entirely:

One cannot expect the slog to satisfy slog(e^z) = slog(z) + 1 *everywhere*.

It's a bit like the logarithm: it does not satisfy log(ab) = log(a) + log(b) *everywhere*.

What we can say, however, is that log(ab) = log(a) + log(b) holds *up to branches*, i.e. for every occurring log in the equation there is a suitable branch such that the equation holds.
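A quick numeric illustration of "up to branches" (a sketch using Python's principal-branch `cmath.log`; the choice a = b = -1 + i is mine): the arguments of a and b add up beyond pi, so the principal branch misses by exactly one full turn of 2*pi*i, and shifting by that branch multiple restores the identity.

```python
import cmath, math

a = b = -1 + 1j               # arg(a) = arg(b) = 3*pi/4, their sum exceeds pi
lhs = cmath.log(a * b)        # principal log of the product
rhs = cmath.log(a) + cmath.log(b)

# The two sides differ by an integer multiple of 2*pi*i (here k = -1):
k = round((lhs - rhs).imag / (2 * math.pi))
print(k)                                        # -1
print(abs(lhs - (rhs + 2j * math.pi * k)))      # ~0
```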

The same can be said about the slog equation.

So if we can show that Andrew's slog satisfies slog(e^z) = slog(z) + 1, e.g. for z in a neighbourhood of A (with matching branches), then it must have a singularity at A.

Quote:also this makes me doubt - especially considering that for every base b the slog should 'also' (together with the oo value at the fixed point A mentioned above) have a period (thus abs( slog(A + period) ) = oo too!) - however that's just an emotion and no math of course ...

I showed it in some post before, but don't remember which one: again this holds up to branches.
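The period claim can at least be checked mechanically (Python sketch, base e, my addition): exp has period 2*pi*i, so every point A + 2*pi*i*k is also mapped to the fixed point, and the same derivation as above forces abs(slog) = oo at each of them - on a suitable branch.

```python
import cmath, math

# Primary fixed point A of exp, via Newton on e^z - z = 0.
A = 0.3 + 1.3j
for _ in range(50):
    A = A - (cmath.exp(A) - A) / (cmath.exp(A) - 1)

# exp is 2*pi*i periodic, so every A + 2*pi*i*k also lands on the fixed point:
for k in (-2, -1, 1, 2):
    shifted = A + 2j * math.pi * k
    print(abs(cmath.exp(shifted) - A))   # ~0 for every k
```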

Quote:( btw the video link mentioned in this thread doesn't work for me bo, maybe it isn't online anymore? )

Well, it seems it no longer exists. It didn't give concrete solutions to our problems; it was just interesting that others also deal with the solution of infinite equation systems and their approximation via finite equation systems.
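That finite-system idea can be sketched for Andrew's slog itself (a Python/NumPy sketch of my own; the truncation order N and the test point are my choices, not from the thread): truncate slog(z) = -1 + s_1 z + ... + s_N z^N, expand slog(e^z) using (e^z)^k = e^{kz} = sum_n (k^n/n!) z^n, and match the Taylor coefficients of slog(e^z) = slog(z) + 1 up to order N-1. That gives an N x N linear system for s_1..s_N, a finite approximation of the infinite system.

```python
import numpy as np
from math import factorial, exp

N = 20  # truncation order (assumption; larger N approximates the slog better)

# Unknowns s_1..s_N of slog(z) = -1 + sum s_k z^k  (base e, slog(0) = -1).
# Coefficient of z^n in slog(e^z) is sum_k s_k * k^n / n!, so matching
# slog(e^z) = slog(z) + 1 at orders n = 0..N-1 gives the system M s = rhs:
M = np.zeros((N, N))
for n in range(N):
    for k in range(1, N + 1):
        M[n, k - 1] = k**n / factorial(n) - (1.0 if n == k else 0.0)
rhs = np.zeros(N)
rhs[0] = 1.0   # the n = 0 row says sum s_k = 1, i.e. slog(1) = 0

s = np.linalg.solve(M, rhs)

def slog(z):
    """Truncated power series for the slog at 0."""
    return -1.0 + sum(c * z**(k + 1) for k, c in enumerate(s))

# The functional equation should hold to high accuracy for small z:
print(slog(1.0))                           # ~0
print(slog(exp(0.2)) - slog(0.2) - 1.0)    # tiny residual
```

The residual shrinks as N grows, consistent with the finite systems approximating the infinite one.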