Searching for an asymptotic to exp[0.5]
Notice how both S9 and the gaussian do well for f(x^a).


Let f(x) satisfy the conditions.


Let g(x) satisfy the conditions.
Let m be a positive integer.
Let s be a nonnegative real.
Let r be a positive real.

Notice min [ (f(x) + s x^m) / x^m ] = min[ f(x)/x^m ] + s.
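This shift identity is easy to check numerically. A minimal sketch, using a finite grid on (0, 50] as a stand-in for the minimum over x > 0, with an illustrative choice f = exp (the helper name is mine):

```python
import math

def min_ratio(h, m, xs):
    """Approximate min over x > 0 of h(x)/x^m on a grid of sample points."""
    return min(h(x) / x**m for x in xs)

# Grid on (0, 50]; f = exp, m = 3, s = 2.5 are illustrative choices.
xs = [i / 100 for i in range(1, 5001)]
f = math.exp
m, s = 3, 2.5

lhs = min_ratio(lambda x: f(x) + s * x**m, m, xs)
rhs = min_ratio(f, m, xs) + s

# (f(x) + s x^m)/x^m = f(x)/x^m + s pointwise, so the minimum shifts by exactly s.
assert abs(lhs - rhs) < 1e-9
```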

Therefore using S9

Fake [ f(x) + s x^m ] = fake [ f(x) ] + s x^m.

Let ++ denote an operation close to ordinary addition.

Then by the above, and by the shift property

fake( f(x) ) => a_n
fake( f(x) x ) => b_n = a_(n-1)

And also

Fake ( r f(x) ) = r fake ( f(x) )

We get

Fake [ f(x) + g(x) ] = Fake [ f(x) ] ++ Fake [ g(x) ]

This is a simple but important fact.
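A quick numerical illustration of why ++ stays close to ordinary addition: since min is superadditive, the S9 coefficient of f + g dominates the sum of the individual coefficients, but for well-behaved inputs it stays numerically close. A sketch with the illustrative choices f = exp(x), g = exp(x/2), and a finite grid standing in for the minimum over x > 0:

```python
import math

def s9_coeff(h, n, xs):
    """S9-style coefficient: min over x > 0 of h(x)/x^n, approximated on a grid."""
    return min(h(x) / x**n for x in xs)

xs = [i / 100 for i in range(1, 5001)]   # grid on (0, 50]
f = math.exp                             # illustrative choices of f and g
g = lambda x: math.exp(x / 2)

n = 10
a_f = s9_coeff(f, n, xs)
a_g = s9_coeff(g, n, xs)
a_fg = s9_coeff(lambda x: f(x) + g(x), n, xs)

# min is superadditive, so the direct coefficient dominates the sum of the
# individual ones, but here ++ stays numerically close to ordinary +:
assert a_f + a_g <= a_fg <= 1.05 * (a_f + a_g)
```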

Notice this is not immediately clear - with respect - from the S9 formulation with all the ln's , exp's , and the functions g and h.

It might be good homework to try to derive these results from the g, h formulation of S9.
Of course the g, h formulation has its OWN benefits.
But imho the simple notation min( f(x) / x^n ) that I still use has been put back on the map.

Guess this should be called

Fake addition theorem

Another road of investigation should be how other methods, like the gaussian or the Tommy-Sheldon iterations, deal with this.

By now the idea of distributivity must have come to mind.

A wild conjecture would be

Tfdl =

Tommy's fake distributive law.

Law , conjecture or theorem ...

This refers to Tommy's generalized distributive law.

See elsewhere on the forum.


For S9 :

Notice fake [ fake( f(x) ) + fake( g(x) ) ] = fake^[2]( f(x) ) + fake^[2]( g(x) ).

Clearly related. fake^[2] means fake fake.

I call that Tommy's fake fake formula.

I think these results will be useful.

For non-S9 methods we can probably replace + on the RHS with ++.
This needs more investigation.

The squeeze theorem and the comparison theorem ( from " limit calculus " ) will be useful here.


Coffee break :p


For the second half of this post I will present an idea that has been in my head since post 9.

In post 16 Sheldon improved the S9 overestimate with a division.

It also seems natural to try a subtraction.

For methods that are general and do not use zeros, it seems unnatural not to use the S9 part.

( e.g. Gaussian = S9 / sqrt .. )

In particular after the properties given in the first half of this post, and in general after rereading this thread.

So this idea occurred :

Tommy's tommynaci sequence

Start with 2 , 5 , ..

And use f(n) = f(n-1) + f(n-2).

Already known as the Evangelist series.

f(n-1) = ((3*sqrt(5)+1)*(((1+sqrt(5))/2)^n)+(3*sqrt(5)-1)*(((1-sqrt(5))/2)^n))/(2*sqrt(5)).

For those interested.
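The recurrence and the closed form are easy to cross-check. A minimal sketch; the helper names are mine:

```python
import math

# Recurrence f(n) = f(n-1) + f(n-2) with f(0) = 2, f(1) = 5 (the Evangelist series).
def evangelist(n):
    a, b = 2, 5
    for _ in range(n):
        a, b = b, a + b
    return a

# Closed form from the post, stated for f(n-1); shifting the exponent by one
# gives f(n) directly.
def closed_form(n):
    s5 = math.sqrt(5)
    phi, psi = (1 + s5) / 2, (1 - s5) / 2
    return ((3 * s5 + 1) * phi**(n + 1) + (3 * s5 - 1) * psi**(n + 1)) / (2 * s5)

first = [evangelist(n) for n in range(7)]   # [2, 5, 7, 12, 19, 31, 50]
assert all(abs(closed_form(n) - evangelist(n)) < 1e-9 for n in range(20))
```

Note that the early terms 2, 5, 7, 12, 19 are exactly the index offsets that appear in the recursion below.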


a_n = min [ ( f(x) - ... - a_(n-19) x^(n-19) - a_(n-12) x^(n-12) - a_(n-7) x^(n-7) - a_(n-5) x^(n-5) - a_(n-2) x^(n-2) ) / x^n ]

Tommy's Evangelist recursion.
Or the tommynaci method.


Notice that for exp(x) we get the exact solution

by using a_n = min [ ( f(x) - a_0 - a_1 x - a_2 x^2 - ... - a_(n-1) x^(n-1) ) / x^n ].
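A numerical sketch of that exact-solution claim. For f = exp the bracketed remainder equals the Taylor tail sum_{k>=n} x^k/k!, and it is evaluated in that form here purely to avoid catastrophic cancellation at small x; the infimum is approached as x -> 0+ and recovers 1/n!:

```python
import math

# For f = exp, a_n = min over x > 0 of (exp(x) - sum_{k<n} x^k/k!) / x^n.
# The remainder is evaluated as its Taylor tail for numerical stability.
def recursion_coeff(n, xs):
    def tail(x):
        return sum(x**k / math.factorial(k) for k in range(n, 40))
    return min(tail(x) / x**n for x in xs)

xs = [i / 1000 for i in range(1, 3001)]  # grid on (0, 3]
coeffs = [recursion_coeff(n, xs) for n in range(1, 8)]

# Each a_n recovers the exact Taylor coefficient 1/n! of exp.
assert all(abs(c - 1 / math.factorial(n)) < 1e-3 / math.factorial(n)
           for n, c in zip(range(1, 8), coeffs))
```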

But that does not work for non-entire f, or for f with some negative derivatives.

Hence Tommy's Evangelist recursion is used instead for such functions.
This implies the method does not perform well for exp.


I had some gaussian variant ideas about the above.
But way too many questions ...


How well does the tommynaci method work ?

Time for medicine.

But as you probably can tell, I'm feeling better and no longer confused.

Unfortunately not at full potential, so I probably missed something trivial to add.

Take care



About the Tommy-Sheldon iterations ..

I designed them for nonnegative g''.

I suggest - with doubt - that in case of a negative value, we remove the minus sign.

In other words, take the absolute value.

Although counterintuitive, notice the second derivative already influences the first.


I'm considering replacing g'' in the gaussian method or the Tommy-Sheldon iterations by ln( e + exp(g'') ).
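A small sketch of why ln( e + exp(t) ) is attractive as a replacement: it is always at least ln(e) = 1, so a negative g'' gets repaired, while for large positive t it is nearly the identity, so a well-behaved g'' is left almost unchanged. The function name is mine:

```python
import math

def surrogate(t):
    """ln(e + exp(t)): smooth, always >= 1, close to the identity for large t."""
    if t > 50:  # evaluate stably to avoid overflow in exp(t)
        return t + math.log1p(math.e * math.exp(-t))
    return math.log(math.e + math.exp(t))

assert abs(surrogate(0) - math.log(math.e + 1)) < 1e-12   # ~ 1.3133
assert abs(surrogate(-100) - 1.0) < 1e-9                  # floor at ln(e) = 1
assert abs(surrogate(30) - 30.0) < 1e-6                   # ~ identity for large t
```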

Notice the condition f'' > 0 implies, for x > 1 :

g'' + g'( g' - 1 ) > 0.


How well the Tommy-Sheldon iterations work depends a lot on how good the gaussian is.

Let's say the gaussian is off by a factor 5.

Then the next iteration gives

new g = ln( 5 f (*) ) = ln(5) + old g.
This change seems too weak.

Funny, because if we consider gaussian( 5 f ), then this weak change is exactly what we need ( to get S9 close to the gaussian, and both close to correct ),
since min( 5 f / x^m ) = 5 min( f / x^m ).

Reconsidering things seems necessary.

Fake function theory is tricky.

However, I assumed here that the gaussian was off by a factor 5.
It is not certain this is possible.

That would then give back confidence in the Tommy-Sheldon iterations.

Or, with the replacement suggested above,

" the exponential Tommy-Sheldon iterations ".

Shorthand: ETS.



In the context of the previous post, I assume the gaussian can be off by about
( inspired by the ETS and the binary partition )
ln( e + exp(0) ) / ln(e) = ln(e + 1) ~ 1.313


Solving g'' + g'( g' - 1 ) = C is very insightful !


Not sure if it's a good idea.
I have not considered it much.

I call it the egyptian method.

min(a) + min(b) <= min(a+b)


1/x = 1/(x+1) + 1/(x^2 + x).

So instead of min( f(x)/x^n )

try min ( f(x) / (x^n + 1) ) + min ( f(x) / (x^(2n) + x^n) ).
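A numerical sketch of the egyptian split. Pointwise f/x^n = f/(x^n + 1) + f/(x^(2n) + x^n), and since min is superadditive the split sum is a lower bound on min( f(x)/x^n ). Illustrative choices f = exp, n = 6, and a finite grid on (0, 50]:

```python
import math

# The egyptian identity 1/x = 1/(x+1) + 1/(x^2 + x), applied with x -> x^n.
def grid_min(h, xs):
    return min(h(x) for x in xs)

xs = [i / 100 for i in range(1, 5001)]
f, n = math.exp, 6   # illustrative choices

direct = grid_min(lambda x: f(x) / x**n, xs)
split = (grid_min(lambda x: f(x) / (x**n + 1), xs)
         + grid_min(lambda x: f(x) / (x**(2 * n) + x**n), xs))

# Pointwise the two pieces sum to f/x^n, so by superadditivity of min
# the split sum can only underestimate the direct coefficient.
assert split <= direct
```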


I need the ratios of the real a_n vs the fake a_n for the function exp(x), where the fake is the limit of the Tommy-Sheldon iterations.

Also related and useful:

1 / ( 1 - a_n / fake a_n ).

[ I assumed the limit of the ratio indeed goes to 1. ]


Identical question for exp^[2].

I'm close to insights.


Interesting is the fake of exp( (2 pi)^{-1} * 0.5 * ln(x)^2 ).

This makes the gaussian agree with the usual S9.

Methods agreeing could indicate good estimates.

So how good is it ? And how true is the indication ?

Will we get the sought O(1/n^2) ??

Need to study this.


Notice how the second iteration of the Tommy-Sheldon iterations gives a nonnegative coefficient for g'', therefore being better than the gaussian in " most " cases.

Naturally in all cases where g'' < 0, but more investigation is needed for g'' > 0.


