sum(e - eta^^k): convergence or divergence?
#1
While fiddling with the slog subject I came across the question whether the following is divergent or convergent:

using

    eta = e^(1/e)

the sum

    S = sum_{k=0..oo} (e - eta^^k)

Clearly the sequence of terms tends to zero, because e is the fixed point of the iteration, and at first guess I thought that the series converges as well. But the convergence of the sequence is slow and one needs a lot of terms to see a promising trend.
What I did is to look at the sequence of partial sums, each taken from k = 0 to 2^n,

    s_n = sum_{k=0..2^n} (e - eta^^k)

and that sequence {s_n} seems to increase, even slightly more than linearly, at least if I look at the partial sums up to n = 12.

Here are the partial sums and the differences of order 1 to 3:
Code:
.  n  s_n          d1_n=s_n - s_(n-1)   d2_n=d1_n-d1_(n-1)   d3_n=d2_n-d2_(n-1)
.  1  4.00875692871  4.00875692871     4.00875692871      4.00875692871
.  2  5.58578587004  1.57702894134    -2.43172798737     -6.44048491607
.  3  7.77673131247  2.19094544242    0.613916501085      3.04564448845
.  4  10.5161613469  2.73943003447    0.548484592045   -0.0654319090397
.  5  13.6651189223  3.14895757539    0.409527540923    -0.138957051122
.  6  17.0811305635  3.41601164122    0.267054065824    -0.142473475099
.  7  20.6561843799  3.57505381638    0.159042175165    -0.108011890659
.  8  24.3207908875  3.66460650761   0.0895526912324   -0.0694894839327
.  9  28.0341817806  3.71339089304   0.0487843854235   -0.0407683058090
. 10  31.7736430757  3.73946129511   0.0260704020760   -0.0227139833474
. 11  35.5268806871  3.75323761140   0.0137763162852   -0.0122940857909
. 12  39.2873487126  3.76046802550  0.00723041410300  -0.00654590218217
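
For anyone who wants to reproduce the table, a minimal numeric sketch in Python (it assumes the usual convention eta^^0 = 1, so s_n runs over k = 0 .. 2^n):
Code:
from math import e, exp

eta = exp(1/e)                      # eta = e^(1/e)

def partial_sum(n):
    """s_n = sum_{k=0..2^n} (e - eta^^k), with eta^^0 = 1."""
    tower = 1.0                     # eta^^0
    s = e - tower
    for k in range(1, 2**n + 1):
        tower = eta ** tower        # eta^^k = eta^(eta^^(k-1))
        s += e - tower
    return s

prev = 0.0
for n in range(1, 13):
    s = partial_sum(n)
    print(n, s, s - prev)           # s_n and first difference d1_n
    prev = s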


How could we prove the divergence/convergence of the series S?

Gottfried
Gottfried Helms, Kassel
#2
Reminds me a bit of the Paris constant. Maybe this is an analogue.

Also, it reminds me of your "tiny limit curiosity"; it seems logical to me that

if the equation doesn't hold for base eta, then the sum should diverge.

But powers of 1 remain 1 ...

Thus it probably diverges.

regards

tommy1729
#3
(07/20/2010, 10:30 AM)Gottfried Wrote: using eta = e^(1/e), the sum S = sum_{k=0..oo} (e - eta^^k). Clearly the sequence of terms tends to zero, because e is the fixed point of the iteration, and at first guess I thought that the series converges as well. But the convergence of the sequence is slow and one needs a lot of terms to see a promising trend.

Perhaps you can post the open problem in the open problems survey.
I don't think it is too difficult; however, I didn't find a proper proof yet.
#4
I almost forgot about this, but when considering the Paris constant I think I stumbled upon a proof.

The assumed proof shows divergence (as I first suspected).

First I show that eta^^k cannot converge to e at an exponential rate.

This is done by noting the Koenigs analogue:

note that, as k -> oo, (eta^^k)^z lies in the neighbourhood of eta^^(k-1) and eta^^(k+1) if z lies in the neighbourhood of eta,

hence, as k -> oo, (eta^^k)^z ~ eta^^k.

The Koenigs limit then becomes

lim k -> oo (eta^^k - e) / Q^k,

where Q is the derivative of eta^x at x = e (since e is the fixed point):

D eta^x = eta^x ln(eta) = eta^x / e  =>  Q = eta^e / e = e/e = 1.

Thus we arrive at Q = 1 and

lim k -> oo (eta^^k - e) / 1,

hence eta^^k - e shrinks slower than exponentially.

(For bases > eta we have Q > 1, and for bases < eta we have Q < 1, hence this can be used to prove divergence or convergence for other bases easily.)
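
For a quick numeric illustration of the sub-exponential decay, a minimal sketch (assuming eta^^0 = 1): the ratio (e - eta^^(k+1)) / (e - eta^^k) creeps up to 1 instead of settling at a constant below 1.
Code:
from math import e, exp

eta = exp(1/e)                 # eta = e^(1/e)

tower = 1.0                    # eta^^0
r_prev = e - tower
for k in range(1, 2001):
    tower = eta ** tower       # eta^^k
    r = e - tower
    if k % 400 == 0:
        print(k, r, r / r_prev)   # ratio tends to 1: sub-exponential decay
    r_prev = r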

Since eta^^k - e shrinks slowly, we can approximate it well with polynomials, and thus we construct a Taylor series.

r_k = -(eta^^k - e) = e - eta^^k

A strictly negative term sum converges iff its corresponding positive term sum converges, and vice versa, where "corresponding" means all terms multiplied by -1.

r_k = e - eta^^k

r_{k+1} = e - eta^^(k+1) = e - eta^(eta^^k)

r_{k+1} = e - eta^(e - r_k)                      (because eta^^k = e - r_k)

r_{k+1} = e - eta^(e - r_k) = e - e*eta^(-r_k)   (because eta^e = e)

r_{k+1} = e (1 - eta^(-r_k))
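
As a sanity check of this recursion, a minimal sketch (assuming eta^^0 = 1, so r_0 = e - 1): iterating r_{k+1} = e*(1 - eta^(-r_k)) reproduces e - eta^^k.
Code:
from math import e, exp

eta = exp(1/e)

r = e - 1.0        # r_0 = e - eta^^0 with eta^^0 = 1
tower = 1.0        # eta^^0
for k in range(1, 11):
    r = e * (1 - eta ** (-r))   # residual recursion
    tower = eta ** tower        # eta^^k computed directly
    print(k, r, e - tower)      # the two columns agree (up to rounding)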

The recursion is a pretty self-referential statement about how fast eta^^k approaches e; now use Taylor, expanding eta^(-r_k) = exp(-r_k/e):

r_{k+1} = e ( r_k/e - r_k^2/(2e^2) + r_k^3/(6e^3) + O(r_k^4) )

so

r_k - r_{k+1} = r_k^2/(2e) - r_k^3/(6e^2) + O(r_k^4).

Compare this with the ansatz r_k ~ C/k: then

r_k - r_{k+1} ~ C/k - C/(k+1) = C/(k(k+1)) ~ C/k^2,

while the recursion above gives

r_k - r_{k+1} ~ r_k^2/(2e) = C^2/(2e k^2).

Matching the two requires C = 2e: the naive guess e/k is off because of the /2 in the recursion, and doubling it to 2e/k fixes that.

Since eta^^k - e shrinks slower than exponentially, r_k is indeed close to 2e/k; more precisely it seems to lie between 2e/sqrt(k^2 + 3k) + C/k^3 and 2e/sqrt(k^2 - 3k) + C/k^3.

Thus r_1 + r_2 + r_3 + ... diverges, with partial sums growing on the order of log(n) (roughly 2e*log(n)).

Q.E.D.
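
A numeric check of the claimed rate, as a minimal sketch (assuming eta^^0 = 1): the product k*r_k should creep up towards 2e = 5.43656...
Code:
from math import e, exp

eta = exp(1/e)

tower = 1.0                    # eta^^0
for k in range(1, 5001):
    tower = eta ** tower       # eta^^k
    if k % 1000 == 0:
        r = e - tower
        print(k, k * r)        # approaches 2e = 5.43656... from below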

tommy1729
#5
In the newsgroup news://sci.math I got a nice and concise answer from Prof. Israel.
On 16.08.2010, under the subject "Series: divergent or convergent?", I stated the problem with t = e = exp(1), b = t^(1/t), and the residual term r_k = t - b^^k.

This is the answer (I inserted the correction of a wrong sign):

Code:
r_{k+1} = t - b^(b^^k) = t - b^(t - r_k)
        = t - t b^(-r_k)  (since t = b^t)
        = t (r_k ln(b) + O(r_k^2))  

By the ratio test, the series will converge if |t ln(b)| < 1.  
Since b^t = t says t ln(b) = ln(t), this is equivalent to 1/e < t < e.
Your case t=e is on the boundary of this, so we need another term.

r_{k+1} = t (r_k ln(b) - r_k^2 ln(b)^2/2 + O(r_k^3))
        = r_k - r_k^2/(2 e) + O(r_k^3)
            
This fits with r_k ~ 2e/k, which would indicate that the sum diverges.

[second post]
In fact, I believe we should have
     r_k ~= 2e/(k + ln(k)/(3 - 1/k))  as k -> infty.
Again, it diverges.

-- Robert Israel
    Department of Mathematics
    University of British Columbia Vancouver, BC, Canada
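
For comparison, a minimal sketch (assuming b^^0 = 1) that tracks the actual residuals against both estimates, the simple 2e/k and the refined formula quoted above:
Code:
from math import e, exp, log

b = exp(1/e)                   # b = eta = e^(1/e), fixed point t = e

tower = 1.0                    # b^^0
for k in range(1, 10001):
    tower = b ** tower         # b^^k
    if k in (100, 1000, 10000):
        r = e - tower
        simple  = 2*e / k
        refined = 2*e / (k + log(k) / (3 - 1/k))
        print(k, r, simple, refined)   # compare r_k with both estimates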

I think that solves the problem. @Henryk: shall I still copy the problem into the TPID section? (Or into the math-facts; does that still exist?)

Gottfried
Gottfried Helms, Kassel
#6
Didn't you read my posts yesterday?

I probably proved two of your statements with basically the same method.

I don't know where 2e/(k + ln(k)/(3 - 1/k)) is coming from, by the way, and I don't see it explained.

It seems Robert didn't show that e - r_k doesn't converge at exponential speed, and hence he merely did a Taylor-series recursion; though I admit, so did I in the end.

Furthermore, sum 1/(k*log^3(k)) converges, so the proof certainly needs to be made more rigorous.

I used the Koenigs analogue to prevent wild solutions to the Taylor-series recursion; however, a strong proof or a construction of a solution to the Taylor-series recursion is needed to rule out e.g. r_k ~~ 1/(k*log^3(k)).

Robert merely gave a recursion; I did a bit more.

But perhaps not enough. I'm thinking about improving what I wrote yesterday.

Maybe replace the Koenigs analogue with a better formula analogue.

And partially replace Taylor (after the r^3 term) with something better.

And Robert is not a full professor, I believe.

Don't get me wrong, I do not wish to belittle Robert; I respect him and have supported him in the past.

But I don't like you skipping my reply ... and ignoring my other potential proof.

Forgive my anger, but I am a man of honor.

Maybe you meant to reply to my posts later ...

Do me a favor to make up for it and read my other potential proof in "tiny limit curiosity".

regards

tommy1729
#7
And I have now read the thread on sci.math, and it seems your Schröder-function idea is "borrowed" from my Koenigs analogue posted here yesterday ...