Tommy's Gaussian method.
#21
see also post 199 in the fake function thread :

https://math.eretrandre.org/tetrationfor...60#pid9660

regards

tommy1729
#22
let h(s) = inv.f( exp( f(s) ) ) , where inv.f denotes the inverse function of f.

whenever t(s) is close to 1 , h(s) is supposed to be very close to s+1.

To study h(s) it makes sense to consider its derivative.

The problem with derivatives is that no matter what kind of calculus you use , h-derivative or q-derivative or other types of derivatives , things basically look the same.
By that I mean that , for instance , the chain rule remains the same independent of whether you use the h-derivative or the q-derivative etc.
This makes it hard to estimate things.

Many have had the idea of replacing the derivative with another concept to understand " change " , but not very successfully I think.

Special cases might however show success.
And therefore it is hard to exclude the idea completely.
Ideas are welcome.

Maybe James' compositional calculus might help ??
I still do not understand it.

Anyway , the derivative of h(s) :

h ' (s) = exp( f(s) ) * f ' (s) / f ' ( h(s) )

( notice that if h(s) were EXACTLY s + 1 , then f would have to be EXACTLY tetration )
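As a sanity check of the identity itself (not of the Gaussian method), here is a minimal numerical sketch with the purely illustrative choice f(s) = exp(s), so that inv.f = log and h(s) = exp(s); the names f, fprime, finv, h below are just my stand-ins:

Code:
import math

# Toy check of  h'(s) = exp(f(s)) * f'(s) / f'(h(s))
# with the ILLUSTRATIVE choice f(s) = exp(s) (not the Gaussian-method f),
# so inv.f = log and h(s) = log(exp(f(s))) = exp(s).
f      = math.exp
fprime = math.exp
finv   = math.log

def h(s):
    return finv(math.exp(f(s)))

s, eps = 0.7, 1e-6
numeric = (h(s + eps) - h(s - eps)) / (2 * eps)        # central difference for h'(s)
formula = math.exp(f(s)) * fprime(s) / fprime(h(s))    # the identity above
print(numeric, formula)                                # both close to exp(0.7) = 2.0138...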

This identity is complicated.
It is hard to show that h ' (s) must be close to 1 when t(s) is close to 1.

( Based on nothing but imagination , the idea of finding f , h , t such that h ' (s) = t(s) near the real line is fascinating , but I have no clue )

But what stands out is the division by f ' ( h(s) ).

let h(s) = S.

Then we want to understand 1/ f ' (S).



In particular when is f ' (S) zero or close to zero ??

More generally - or less - forget that S = h(s) and take s instead :

when is f ' (s) zero or close to 0 ?

We know that f(s) =/= 0.

We have Jensen's theorem to relate log f ' (0) to the zeros of f ' (s) within a given radius.
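For reference, the standard form of Jensen's formula (here one would take g = f ' , assuming g(0) =/= 0 , with a_1 , ... , a_n the zeros of g in |z| < r counted with multiplicity):

\[
\log|g(0)| = \frac{1}{2\pi}\int_0^{2\pi} \log\left|g(re^{i\theta})\right|\,d\theta \;-\; \sum_{k=1}^{n} \log\frac{r}{|a_k|}
\]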

Clearly when f ' (S) = 0 and t(s) is close to 1 , we have a singularity for h(s).

( notice I assumed that f ' (s) = f ' (h(s)) = 0 with t(s) close to 1 is not possible )

Also notice that the zeros of f ' (s) also affect where h ' (s) = 0.

So this deserves attention.

regards

tommy1729

Tom Marcel Raes
#23
[§1]

Notice that f(s) is close to tetration and f(s+1) is close to exp(f(s)) , so by the chain rule we get approximately 

f ' (s) = f(s) * f ' (s-1)

which implies some sort of pseudo-1-periodicity of the closeness to zero.
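To spell out that chain-rule step (under the approximation f(s+1) ≈ exp(f(s))):

\[
f'(s+1) \approx \exp(f(s))\,f'(s) = f(s+1)\,f'(s),
\]

and replacing s by s-1 gives f ' (s) ≈ f(s) * f ' (s-1) , which is the identity above.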

***

[§2]

The whole idea of h(s) is based on the " opposite of chaos " :

for a small variation of s , say s + q with q small , exp^[n](s+q) is very different from exp^[n](s).

This implies that when we consider f(s) as s iterations of exp of some starting value , then f(s+q+n) can be very different from f(s+n) , even for small q.

This in turn implies that there exist small values q_2(s) such that f(s+q_2) = exp( f(s) ).

Hence h(s) makes sense.
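A minimal numerical illustration of that sensitivity (plain iterated exp, nothing specific to the Gaussian method; the starting value 0.1 and n = 4 are only chosen so the numbers stay finite):

Code:
import math

def exp_iter(x, n):
    """n-fold iterate exp^[n](x)."""
    for _ in range(n):
        x = math.exp(x)
    return x

x, q, n = 0.1, 1e-6, 4
a = exp_iter(x, n)
b = exp_iter(x + q, n)
print(a)        # about 7.9e8
print(b - a)    # the shift q = 1e-6 is amplified to roughly 5e4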

***

[§3]

Apart from h ' (s) being close to 1 and f ' (s) being close to 0 we can also consider ( things like ) this :

When is h(s) = s + 1 ??


f(s) = exp( t(s) f(s-1) ).

if h(s) = s + 1 then 

exp( f(s-1) ) = exp ( t(s) f(s-1) )

A = f(s-1) 

thus 

exp ( A ) = exp ( t(s) A )

Dividing , exp( (t(s) - 1) * A ) = exp( t(s) A ) / exp( A ) = 1 = exp( 2 pi i * k )

therefore A = (2 pi i * k)/( t(s) - 1 )
for some integer k , provided t(s) - 1 =/= 0.


In other words :

f(s-1) = (2 pi i * k)/( t(s) - 1 ) for some integer k IFF h(s) = s + 1.

Remember that f(s-1) is never 0 , so k = 0 is EXCLUDED !

These are some interesting equations.
Not too complicated.

This gives us some kind of " measure " of closeness to h(s) = s + 1 which might be useful :

m(s) = f(s-1) - (2 pi i * k)/( t(s) - 1 ) , taking the nonzero integer k that makes | m(s) | smallest.
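A small numerical sketch of that measure, assuming you already have numerical values A = f(s-1) and t(s) from the Gaussian method; the concrete inputs below are placeholders, not actual data:

Code:
import math

def nearest_m(A, ts):
    """m(s) = A - 2*pi*i*k/(t(s)-1) for the nonzero integer k minimizing |m(s)|.

    A stands for f(s-1) and ts for t(s); small |m| signals h(s) close to s+1."""
    c = 2j * math.pi / (ts - 1.0)                      # candidate points are k*c, k integer
    k = round((A * c.conjugate()).real / abs(c) ** 2)  # projection of A onto the line through c
    if k == 0:                                         # k = 0 excluded since f(s-1) is never 0
        k = 1 if (A * c.conjugate()).real >= 0 else -1
    return A - k * c, k

# placeholder inputs, purely illustrative:
A_example  = 2.0 + 3.0j      # stands in for f(s-1)
ts_example = 0.97 + 0.01j    # stands in for t(s), close to 1
m, k = nearest_m(A_example, ts_example)
print(abs(m), k)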

Somehow this feels like getting closer to showing that h(s) is about s + 1 , in particular in combination with [§1] and [§2].

It feels like some algebraic or calculus tricks are just around the corner , or hiding like treasures waiting to be discovered and used.

Let M(s) be the absolute value of m(s).

Should we consider integrals of the type integral slog( M(s) + 1 ) or similar ??

Another idea/concern : if t(s) = 1 , the condition is not met. What does that imply ?? A singularity of h(s) ?? A natural boundary in the limit as t(s) goes to 1 ?? EQUIVALENT QUESTION : what happens when m(s) = oo ??

***

Many more ideas exist relating those above , but they tend to get circular or confusing , so I stick with the imo basic ideas for now.

I have been asked , or given the " opportunity " , to rigorously prove the gaussian method first , but I think you can see now that it is not so trivial to formally prove things... as usual imo ...

regards

tommy1729

Tom Marcel Raes
#24
Likewise we could consider trying to solve h(s) = s + 2 and hope " for the best " , hoping it has no solutions , or not many , in the regions we care about.


Or in general , solving h(s) = s + n for integer n.

Keep in mind that h(s) is supposed to be the solution closest to s.

So if h(s) = s + 1 + 1/3 solves the issue , the other solution h(s) = s + 1 + 1 = s + 2 MIGHT not be a problem... so that still leaves hope.

the " MIGHT " part is due to analytic continuation/branches ; we want to consider the " same h(s) ".



regards

tommy1729
#25
Just a quick note, the value \(h'(s)\) satisfies the equation:

\[
h'(s) = \exp(f(s)) f'(s) / f'(h(s))\\
\]

This is then a first-order differential equation, which is related to the compositional calculus, but I'm not sure how it could help.

Either way, if:

\[
u(s,z) = \int_a^s \dfrac{\exp(f(x)) f'(x)}{f'(z)}\,dx \bullet z\\
\]

Then,

\[
\begin{align}
u(a,z) &= z\\
u'(s,z) &= \dfrac{\exp(f(s))f'(s)}{f'(u(s,z))}\\
\end{align}
\]

This leaves us the trouble of finding values \(a\) and \(z\) that match \(h\).

But I'm not so sure how much the compositional calculus would help, unless you are looking for a way to numerically evaluate this. If that is so, then the formula for this is a little involved (just a slightly more esoteric version of Euler's method). Let \(\{t_j\}_{j=0}^n\) be a partition of \([a,s]\) in descending order, such that \(t_{j} - t_{j+1} = \mathcal{O}(1/n)\), and let \(t_{j}  \ge t_j^* \ge t_{j+1}\):

\[
u(s,z) = \lim_{n\to\infty}\Omega_{j=0}^{n-1} z+ \dfrac{\exp(f(t_j^*))f'(t_j^*)}{f'(z)}\left( t_j - t_{j+1}\right)\,\bullet z\\
\]

Which is just saying, if

\[
q_{jn}(z) = z+ \dfrac{\exp(f(t_j^*))f'(t_j^*)}{f'(z)}\left( t_j - t_{j+1}\right)\\
\]

Then,

\[
u(s,z) = \lim_{n\to\infty} q_{0n}(q_{1n}(q_{2n}(...q_{(n-1)n}(z))))\\
\]
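For what it's worth, here is a rough numerical sketch of that compositional Euler scheme, with f = exp standing in as a purely illustrative integrand (not the Gaussian-method f); the endpoints, the starting value z and the step count n are arbitrary choices:

Code:
import math

f      = math.exp   # illustrative stand-in for the Gaussian-method f
fprime = math.exp

def u(a, s, z, n=1000):
    """Approximate u(s,z) by composing the Euler factors q_jn,
    applying the innermost factor q_{(n-1)n} to z first."""
    ts = [s + (a - s) * j / n for j in range(n + 1)]    # descending partition: t_0 = s, ..., t_n = a
    for j in range(n - 1, -1, -1):                      # innermost factor acts on z first
        tstar = 0.5 * (ts[j] + ts[j + 1])               # any point with t_{j+1} <= t_j^* <= t_j
        z = z + math.exp(f(tstar)) * fprime(tstar) / fprime(z) * (ts[j] - ts[j + 1])
    return z

print(u(a=0.0, s=0.2, z=0.5))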

This will converge locally for \(|a-s| < \delta\) and \(z\) almost everywhere (sort of). Luckily \(f'\) is nonzero (I believe so). This definition can be extended, in a rather convoluted manner, to a larger domain; it's just fairly difficult. This would be by the defining property of the compositional integral, which is:

\[
\int_{b}^c g(x,z) \,dx \bullet \int_{a}^b g(x,z)\,dx\bullet z = \int_{a}^c g(x,z)\,dx\bullet z\\
\]

This definition works for complex \(s\) as well; but normally I'd write it as a contour integral. Which is:

\[
\int_\gamma g(w,z)\,dw \bullet z = \int_{0}^1 g(\gamma(x),z)\gamma'(x)\,dx\bullet z\\
\]

For \(\gamma\) an arc which satisfies \(\gamma(0)= a\) and \(\gamma(1) = s\). I'm not so sure if this would really help though. I can't see this adding much more to the discussion than just applying Euler's Method on \(h\). The compositional calculus only really develops a use when you start modding out by equivalence classes. But it is definitely helpful at visualizing the interaction between compositions and integrals.



EDIT:

It seems this is an induced semi-group. Very damned interesting.

The differential equation is separable, so it can be reduced to a semi-group. That is, \(u\) has an alternative representation:

\[
\begin{align}
u(s,z) &= \int_{0}^{A(s)} \frac{dt \bullet z}{f'(z)}\\
A(s) &= \displaystyle \int_a^s \exp(f(x))f'(x)\,dx\\
\end{align}
\]

This is shown by making the substitution \(dt = \exp(f(x))f'(x)\,dx\) in the equation:

\[
u(s,z) = \int_a^s \dfrac{\exp(f(x)) f'(x)}{f'(z)}\,dx \bullet z\\
\]

And if you define:

\[
U(w,z) = \int_{0}^{w} \frac{dt \bullet z}{f'(z)}\\
\]

Then:

\[
\begin{align}
U(0,z) &= z\\
U(w',U(w,z)) &= U(w' + w,z)\\
\end{align}
\]

So, your differential equation actually reduces to a flow equation, which is very nice. Essentially then, all we have to worry about to define \(h\) is the pesky semi-group induced by \(1/f'(z)\). Which shouldn't be too hard.
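As a toy illustration of such an induced flow (taking \(f(z) = e^z\) purely for concreteness, so \(1/f'(z) = e^{-z}\); this is not the tetration \(f\)):

\[
U(w,z) = \log\left(e^{z} + w\right),\qquad \frac{\partial U}{\partial w} = \frac{1}{e^{z}+w} = \frac{1}{f'(U(w,z))},\qquad U(0,z) = z,
\]

and indeed \(U(w',U(w,z)) = \log(e^{z}+w+w') = U(w'+w,z)\).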

Just so you understand what a semi-group induced by \(1/f'(z)\) means: I mean that every semi-group in existence is induced by the identity:

\[
\lim_{\delta\to 0} \frac{U(\delta,z)-z}{\delta} = g(z)\\
\]

Where for \(h\), \(g(z) = 1/f'(z)\); from here, we input \(A(s)\) into the exponent of the semi-group--voila, we have \(h\), so long as we choose \(a\) and \(z\) appropriately. I'm happy to explain this more, because the compositional calculus includes much of the standard literature; it's just a better way of writing it imo, where you can apply Leibniz substitutions/Riemann-Stieltjes integration/flow theory much more compactly. Honestly, it's a well-developed shorthand that I developed that isn't really needed. But once you start modding out by equivalence classes, WOAH, buckle your seats!

Regards, James. Hope I can help, Tommy.
#26
I forgot to mention 

To avoid the use of constants for the gaussian method we can " anchor " the point 0 :

We use this modification 

f(s) = exp( R(s-1) f(s-1) )

Now R(s) = t(s) erf(s)^2 = (1 + erf(s))/2 * erf(s)^2.

So that R(s) = 0 when s = 0.

Therefore 

f(1) = exp(0 * ..) = 1.

This helps numerically and theoretically ; however , R(s) is slightly more complicated.

But R(s) behaves similarly to t(s) on the complex plane , so it works.
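A minimal real-line sketch of this anchored recursion, just to see R(0) = 0 and f(1) = 1 come out; the truncation depth and the seed value are arbitrary choices of mine, and nothing here addresses convergence off the real line:

Code:
import math

def t(s):
    """t(s) = (1 + erf(s)) / 2."""
    return (1.0 + math.erf(s)) / 2.0

def R(s):
    """Anchored weight R(s) = t(s) * erf(s)^2, so R(0) = 0."""
    return t(s) * math.erf(s) ** 2

def f(s, depth=60, seed=1.0):
    """Truncated backward composition of f(s) = exp(R(s-1) * f(s-1)).

    For s - depth far to the left R is essentially 0, so the seed hardly matters."""
    val = seed
    for k in range(depth, 0, -1):
        val = math.exp(R(s - k) * val)
    return val

print(R(0.0))   # 0.0 : the anchor
print(f(1.0))   # 1.0 , since the last step is exp(R(0) * ...) = exp(0)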


Extending this idea is worth considering.


But at least 

sexp(s-1) = lim ln^[n] f(s + n) 

where sexp(0) = 1.


regards

tommy1729

Tom Marcel Raes
#27
An integral representation for R(s) would be nice too.

regards

tommy1729
#28
About post 26 , considering post 14 and post 18.

First , I forgot the constant :

sexp(s) = lim ln^[n] f( s + n + C )

C is necessary even if f(1) = 1.

So we are not completely rid of constants.

Secondly 

I had in mind a function that behaves like t(s) = ( 1 + erf(s))/2 but at integers gives 0 or 1 depending on the sign.

My example gave 0 for 0.

But I wonder about a closed form of an entire function that behaves like t(s) = ( 1 + erf(s))/2 but at integers gives 0 or 1 depending on the sign.

And that function should also behave similarly to t(s) on the complex plane , in particular when Re(s)^2 > Im(s)^2.

I'm not even sure such an entire function exists.
Think of sin(s) growing fast in the imaginary direction.

t(x) sin(v x)^2 + cos(v x)^2 for x > 1 and suitable v does not grow slowly in the complex plane , so I'm not sure a function with all the desired properties actually exists.

Since these properties are lost , maybe it does not actually work !?

This requires more study.

However , enter posts 14 and 18.

We learn from posts 14 and 18 about the periodic function used for the other solutions...

But wait a minute.

The desired function above cannot possibly be t(s + p(s)) for a bounded periodic real function p(s).

So we have a second objection to this modified function.

However , clearly the modified function ( or the one with f(1) = 1 from post 26 ) does give a convergent method on the real line.

So here is the main idea :

if t(s) gives an analytic solution , then the modified version does not , because they cannot both be solutions and hence they cannot both be analytic solutions.

A more generalized conjecture is that

F(s) = exp( a(s-1) exp( a(s-2) ... ) )

can only be analytic if a(s) is 

1) a(s) goes from 0 to 1 on the reals 

2) a(s) is analytic 

AND 

3) a(s) is strictly increasing on the reals.



That is mainly post 14 logic talking ...


However , the ideas , proofs and conjectures are not formal or decisive yet.

More work is needed.



Suppose we take f(s) for our method.

And suppose we take f(2s) for our method.

Now we have two methods and two sexp functions.

Are they equal ?

Are they both analytic ? Both not ?

By the logic from post 14 , f(s) and f(2s) are not related by a periodic function.

but they do both give a solutions on the reals.

and they do satisfy 

1) going from 0 to 1 on the reals 

2) is analytic

AND 

3) a(s) is strictly increasing on the reals.


...


So the issue of acceleration is still a thing ( post 18 ! ).

So we find ourselves again in the realm of weird but interesting conjectures.

regards

tommy1729
#29
I conjecture that

all t(s) with t(s) = ( 1 + u(s) )/2 and phi(s) = exp( t(s-1) exp(...) ) will give analytic tetration solutions if

( sufficient conditions ; a small numerical sketch with an example f follows the list below )

1) u(s) = integral f(t) erf(s t) dt from 0+ to +oo.

2) u(s) and f(s) are analytic 

3) integral f(t) dt from 0+ to +oo = 1 and integral f(t)^2 dt from 0+ to +oo is convergent.

4) f(t) >= 0.
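As announced above, here is a minimal numerical sketch of these conditions for the example density f(t) = exp(-t), which satisfies 1)-4) (the integrals are 1 and 1/2, and f is analytic and nonnegative); this only illustrates what u(s) and the resulting t(s) look like, it proves nothing about the tetration produced:

Code:
import math
from scipy.integrate import quad

# Example density: f(t) = exp(-t) on (0, oo).
# Then integral f dt = 1, integral f^2 dt = 1/2 (convergent), f >= 0, f analytic.
f_density = lambda x: math.exp(-x)

def u(s):
    """u(s) = integral_0^oo f(t) * erf(s*t) dt, evaluated numerically."""
    val, _err = quad(lambda x: f_density(x) * math.erf(s * x), 0.0, math.inf)
    return val

for s in (-3.0, -1.0, 0.0, 1.0, 3.0):
    print(s, (1 + u(s)) / 2)   # the weight t(s) = (1 + u(s))/2, increasing in s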

regards

tommy1729
#30
Hey Tommy, I'm curious about these almost-analytic solutions you are suggesting.

The closest I've got is reproducing Kneser--which can be done using the beta method. But this only happens if we assume that:

\[
F(s) \to L\,\,\text{as}\,\,|s|\to\infty\,\,\pi/2 \le |\arg(s)| < \pi\\
\]

Every other solution I've played with, discovered through this iterative formula, is pretty much always \(C^\infty\); or it could possibly be analytic, but then it's only analytic on a strip. Actually, all this infinite composition stuff with tetration has me convinced Kneser is the way to go. I'm still convinced that the Gaussian method will actually produce Kneser, because it isolates values in the upper half plane, where iterated logarithms tend to \(L\).

By \(L\) I mean the fixed point of \(\exp(z)\) with the smallest imaginary part.

