Searching for an asymptotic to exp[0.5]
Ok time to get more formal.

As mentioned before, there are links between fake functions and multisections.
That will become clearer with the method presented here.

Also, this method - with a little twist - gives the CORRECT values if all derivatives are already positive ( such as exp ).

So as a general-case method I think this is one of the better ones.

Consider f(x) with the conditions :

f(x) is real-analytic for x >= -1.

for x > 0 we have

f(x) > 0 , f '(x) > 0 and f ''(x) > 0 , and also

f(x) grows faster than any polynomial :

lim x -> +oo x^t / f(x) = 0 for all real t >= 0.
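( For instance f(x) = exp(x) satisfies all of this ; the half-iterate exp^[0.5] that this thread searches an asymptotic for is the intended kind of example too , assuming a solution with these positivity properties. )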

So this f(x) satisfies the conditions needed to get a fake [f(x)] = g(x).

g(x) = a_0 + a_1 x + a_2 x^2 + ... ~ f(x) , with a_j >= 0.

We need a method to find the a_k.

Let i run over the indices with n > i > 2 such that for all x > 0 : D^i f(x) > 0.

Define G_n(x) = SUM over those i of a_i x^i.

Our equations for finding a_j are then :

for all x > 0.
a_0 x^0 <= f(x)

or equivalently , for x > 0 :

a_0 = inf( f(x) )

here we simply get a_0 = f(0) , since f is increasing for x > 0.

further
( x > 0 always , so I will stop mentioning this )

a_1 x <= f(x)

=>

a_1 = inf( f(x) / x )

( notice the similarity to the derivative f '(0) = lim_{x -> 0} ( f(x) - f(0) ) / x. )

a_2 x^2 <= f(x)

=> a_2 = inf ( f(x) / x^2 )

( this looks simpler than all those logs in post 9 ... although it has its use there for tetration type functions of course )

a_3 x^3 <= f(x)

=> a_3 = inf( f(x) / x^3 )
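( a quick worked example of these first infima - my own numbers , not from the post : for f(x) = exp(x) the ratio exp(x)/x^k has its minimum at x = k , so a_1 = e , a_2 = e^2/4 and a_3 = (e/3)^3 ~ 0.744. Note these are larger than the Taylor coefficients 1 , 1/2 , 1/6 ; see the twist for exp further down. )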

a_4 x^4 + G_4(x) <= f(x)

this is a bit more complicated but notice we already have a_0 , a_1 , a_2 and a_3 from the above equations so G_4 is known(*).

( * assuming the condition for i is understood , see def for G_n above )

and it continues like this :

a_n is computed from

a_n x^n + G_n(x) <= f(x)

or equivalently

a_n = inf( ( f(x) - G_n(x) ) / x^n )
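Here is a minimal numerical sketch of this recursion in Python - my own code , not part of the post ; the infima over x > 0 are replaced by minima over a finite grid , and the index range 2 < i < n inside G_n is my literal reading of the definition above :

import numpy as np

def fake_coefficients(f, N, x=None, fixed=None):
    # Approximate a_0 .. a_N of the recursion above for f on a finite grid.
    # The true infima over x > 0 are replaced by minima over the sample
    # points , so this is only a rough numerical sketch.
    if x is None:
        x = np.linspace(1e-3, 50.0, 100000)  # illustrative grid for "x > 0"
    fixed = fixed or {}  # optional hand-chosen coefficients ( see the exp "twist" below )
    fx = f(x)
    a = np.zeros(N + 1)
    # a_0 .. a_3 come from the single-term constraints a_k x^k <= f(x)
    for k in range(min(4, N + 1)):
        a[k] = fixed.get(k, np.min(fx / x**k))
    # for n >= 4 : a_n x^n + G_n(x) <= f(x) , with G_n(x) = SUM_{2 < i < n} a_i x^i
    for n in range(4, N + 1):
        G = sum(a[i] * x**i for i in range(3, n))
        a[n] = fixed.get(n, np.min((fx - G) / x**n))
    return a

# illustrative run
print(fake_coefficients(np.exp, 6))

The fixed argument is only there so low-order coefficients can be forced by hand , as the post does for exp below.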

Notice how this works PERFECTLY for exp , giving fake[exp] = exp , if we replace the equations for a_1 and a_2 with a_1 = 1 and a_2 = 1/2 instead and then go on to solve the others.
Similarly good results hold for sinh(x) , for instance.
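With the sketch above , the twist would look like this ( again purely illustrative ; fake_coefficients and its fixed argument are my own names from that sketch , not part of the post ) :

# force a_1 = 1 and a_2 = 1/2 by hand , then solve the remaining a_n as before
a_exp = fake_coefficients(np.exp, 10, fixed={1: 1.0, 2: 0.5})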

This method is never worse than the method from post 9.

The G_n is an important concept , because equations like

a_0 + a_1 x + a_2 x^2 + a_3 x^3 <= f(x)

could FAIL for some f(x).

The G_n guarantees non-negativity of the a_j : the previous step already gives G_n(x) <= f(x) , so f(x) - G_n(x) >= 0 and hence a_n = inf( ( f(x) - G_n(x) ) / x^n ) >= 0.

This captures most of my ideas in this thread , so I will simply call it :

tommy's fake method.

I will be using this in the future and base my conjectures on this.

Error terms depend on f(x) a lot and I do not yet understand them.

But fake function theory is advanced by this.

My friend mick will post the following problem about tommy's fake method to MSE.

( an old problem considered by me for the record )

Let f(x) be as defined above.

Let F(x) := integral_0^x f(t) dt.

Then

conjecture : Fake[ F(x) ] - integral_0^x Fake[ f(t) ] dt = O(1 + x^3)
where O is big-O notation.

A weaker (related !) version ( post 9 method ) is

for n > 3 :

a_n(f) = inf( f(x) / x^n )
b_(n+1)(F) = inf( F(x) / x^{n+1} )

[ inf( f(x) / ( a_n(f) x^n ) ) ] / [ inf( F(x) / ( a_n(f) x^{n+1} / n ) ) ] ~ 1 +/- O(1/n).

suggesting that integral_0^x a_n(f) t^n dt ~ b_(n+1)(F) x^{n+1}.
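A quick numerical sanity check of this weaker statement - my own sketch , not from the post ; it compares a_n(f)/(n+1) with b_(n+1)(F) for f = exp and F = exp - 1 , using grid minima in place of the infima :

import numpy as np

x = np.linspace(1e-3, 80.0, 200000)  # illustrative grid for "x > 0"
f = np.exp(x)          # f(x) = exp(x)
F = np.exp(x) - 1.0    # F(x) = integral_0^x f(t) dt

for n in range(4, 12):
    a_n = np.min(f / x**n)          # post 9 : a_n(f) = inf( f(x) / x^n )
    b_n1 = np.min(F / x**(n + 1))   # post 9 : b_(n+1)(F) = inf( F(x) / x^{n+1} )
    # integral_0^x a_n t^n dt = a_n x^{n+1} / (n+1) , so compare a_n/(n+1) with b_(n+1)
    print(n, a_n / ((n + 1) * b_n1))

The printed ratios should approach 1 with an error of order 1/n if the weaker statement holds for this example.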

Regards

tommy1729
