using sinh(x) ? - Printable Version

+- Tetration Forum (https://math.eretrandre.org/tetrationforum)
+-- Forum: Tetration and Related Topics (https://math.eretrandre.org/tetrationforum/forumdisplay.php?fid=1)
+--- Forum: Mathematical and General Discussion (https://math.eretrandre.org/tetrationforum/forumdisplay.php?fid=3)
+--- Thread: using sinh(x) ? (/showthread.php?tid=424)

RE: using sinh(x) ? - tommy1729 - 03/05/2016

(03/05/2016, 01:27 PM) tommy1729 Wrote: It seems the method has passed the Weierstrass M-test!! Regards, Tommy1729

So ... it is analytic! Here is a sketch of the proof. Let's consider the half-exp for simplicity; the general case follows by analogy.

Proof sketch

1) 2sinh^[1/2](x) = g(x) is analytic on [0,e], so it is analytic for x >= 0.

2) Therefore f = ln(g(exp(x))) is analytic on [1,oo). Write x* = x - eps and x** = x + eps. Let f(x,n) = ln^[n]( g( exp^[n](x) ) ), with continuation (using ln(exp) = id). Clearly, for each finite integer n > 0, f(x,n) is analytic in x for x > 1.

3) We can choose a small eps > 0, independent of n, such that for the points w* within radius eps around x, and points w** around exp(x) within radius exp(x**) - exp(x),

( |f(w*,n)| - |ln( f(w**,n) )| )^2 < ( |f(x*,n)| - |ln( f(exp(x**),n) )| )^2 { follows from the squeeze theorem and analyticity } < 10/n^3 { follows from the fast convergence of the n-th step for real x > 1 }.

Notice that for small enough eps we get a univalent radius.

4) The Weierstrass M-test then gives the sum S =< 10 zeta(3). So we have passed the Weierstrass M-test.

5) Since we have passed the Weierstrass M-test, we have uniform convergence within the radius eps > 0. Notice eps is independent of n, so lim eps(n) > 0; no infinitesimal. So we have a positive radius around some real x > 1 where the M-test is satisfied, and hence the sequence is uniformly convergent there.
6) If, within a radius, a function is the uniform limit of a sequence of functions analytic in that radius, then the limit function is analytic in that radius too. { A classic, but the name and originator are uncertain to me; it resembles the uniform limit theorem and work of Cauchy and Weierstrass. Also uncertain about the date: assume around 1874 ... Feel free to inform me. I don't mind naming it after me, but it probably has a name already ... }

Therefore the 2sinh method is analytic!!

Q.e.d.

Tommy1729

-----------------------------

I'm considering a geometric proof too. I think this proof and the geometric one (yet to be made) are the only possible ones. I had matrix ideas and basic real-calculus ideas, but despite arguments they did not get closer to a proof. This is probably in part due to the importance of error terms over closed forms.

Naturally, this is the beginning and not the end. The natural question of what else is analytic in a provable way is very inspiring.

Notice this extends to all real bases within [exp(1/2), +oo). I believe my exp(2/5) base method is also analytic because of this proof, so maybe we have analytic methods for bases in [exp(2/5), +oo).

I know uniform convergence for matrix functions is much harder. (Maybe one of the top 5 problems in linear algebra.) But maybe we will find a way around it.

Regards

Tommy1729

The master

"Truth is whatever does not go away when you stop believing in it."

RE: using sinh(x) ? - sheldonison - 03/06/2016

(03/05/2016, 11:38 PM) tommy1729 Wrote: So ... it is analytic! Here is a sketch of the proof. Let's consider the half-exp for simplicity; the general case follows by analogy.

Proof sketch

1) 2sinh^[1/2](x) = g(x) is analytic on [0,e], so it is analytic for x >= 0.

2) Therefore f = ln(g(exp(x))) is analytic on [1,oo). Write x* = x - eps and x** = x + eps. Let f(x,n) = ln^[n]( g( exp^[n](x) ) ), with continuation (using ln(exp) = id). Clearly, for each finite integer n > 0, f(x,n) is analytic in x for x > 1.
3) We can choose a small eps > 0, independent of n, such that for the points w* within radius eps around x, and points w** around exp(x) within radius exp(x**) - exp(x),

( |f(w*,n)| - |ln( f(w**,n) )| )^2 < ( |f(x*,n)| - |ln( f(exp(x**),n) )| )^2 { follows from the squeeze theorem and analyticity } < 10/n^3 { follows from the fast convergence of the n-th step for real x > 1 }.

Obviously I don't agree with Tommy's proof. Here is a graph of the logarithmic singularities of f(z,1) = ln(2sinh^[0.5](exp(z))). The singularities occur approximately where $\text{2sinh}^{[0.5]}(z)=n \pi i$.

Now consider f(z,n). For n=1, the radius of convergence is reasonable. For n=3 at z=1, the radius of convergence is about 0.21; there the 41st singularity is the nearest singularity to exp(exp(1)), at 11.8293629488482 + 7.81997442041279i. For n=4 at z=1, the radius is about 0.0012.

I don't expect to convince Tommy, which is perhaps unfortunate. The conjecture is that all of the derivatives of f(z,n) do converge as n gets arbitrarily large, even though the limit function is nowhere analytic.

[attachment=1237]

RE: using sinh(x) ? - tommy1729 - 03/06/2016

When arg 2sinh^[1/2] = pi/4 we can expect singularities. Two reasons (arguments only):

1) ln( 2sinh^[1/2]( exp(pi i/4) A ) ) ~ ln 2sinh^[1/2](-A) => { symmetry of 2sinh } ~ ln( -2sinh^[1/2] ), so we get near the branch cut of ln, so it cannot be analytic.

2) Since we have zeros where Re 2sinh = 0, 2sinh is not close to exp on the imaginary axis. Likewise, since the curve arg = pi/4 for the half-iterate maps to the imaginary axis after two iterations (and we get zeros then), 2sinh^[1/2] cannot both be analytic and close to exp^[1/2] at the same time at arg = pi/4; so there is a branch cut / singularity on that curve.

So basically this more or less agrees with Sheldon and his plot. But the idea was that for some real x > 1 we would get a small radius where it is analytic, such as x = 15 with a radius of 0.00001. I see no objection to that in Sheldon's post.
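On the real axis, both the half-iterate g = 2sinh^[1/2] and the approximants f(x,n) = ln^[n]( g( exp^[n](x) ) ) discussed above can be evaluated numerically. A minimal sketch, assuming the standard Koenigs-style limit at the repelling fixed point 0 of 2sinh (where the multiplier is 2sinh'(0) = 2); the function names, the cutoff steps = 20, and the sample points are my own choices, not from the thread, and complex z, where the disputed singularities live, would need far more care with branches:

```python
import math

def g(x, steps=20):
    """Approximate 2sinh^[1/2](x) for x > 0.

    2sinh has a repelling fixed point at 0 with multiplier 2sinh'(0) = 2,
    so the Koenigs-style limit 2sinh^[n]( 2^(1/2) * 2sinh^[-n](x) )
    converges to the half-iterate as n grows.
    """
    for _ in range(steps):
        x = math.asinh(x / 2.0)   # 2sinh^[-1](y) = asinh(y/2)
    x *= math.sqrt(2.0)           # multiplier 2 raised to the power 1/2
    for _ in range(steps):
        x = 2.0 * math.sinh(x)
    return x

def f(x, n):
    """The n-th approximant f(x,n) = ln^[n]( g( exp^[n](x) ) ) of exp^[1/2]."""
    for _ in range(n):
        x = math.exp(x)
    x = g(x)
    for _ in range(n):
        x = math.log(x)
    return x
```

Applying g twice reproduces 2sinh (the defining property), and the real-line values f(1,1), f(1,2), f(1,3) settle down very quickly, consistent with the fast real-axis convergence both posters rely on; the disagreement is entirely about what happens for complex z. In double precision n is limited to about 3, since exp^[4](1) overflows.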
Assuming I got his idea correct, which I think I did because of the plot, I feel I have explained his reasoning here a bit more.

---

It seems the nature of the 2sinh is such that if the functional equation fails around z (even after continuation, if possible), then f(z,n) - f(z,n-1) ~ 0 fails as well, implying divergence, and hence:

For Re(z) > 1, around z: the functional equation is satisfied IFF the limit is analytic.

That might help to understand the above.

-----

Perhaps not easy to see, but this relates to fake function theory. We had that the fake half-exp also failed the functional equation when arg = pi/4. The resemblance is crystal clear!

Fake semi-exp ~ fake semi-2sinh near real x > 1.

So I conjecture that the ~ also holds for arg between -pi/4 and pi/4, OR between -pi/8 and pi/8. The /8 comes from the analogue of argument 2 at the beginning of the post. It depends on whether the fake can distinguish between exp and 2sinh well (in particular away from the real axis). A few plots like Sheldon did in the fake exp thread would "visually settle this".

-----

Regards

Tommy1729

Ps, edit: I made some typos and oversimplifications. Basically, to keep things short, I mixed up the pi/2, pi/4, pi/8 arguments. Sorry. But I guess you can still see basically what I meant. In short, for Re(z) << 10 and |arg(z)| >= ~ pi/8, all bets are OFF.

Regards

Tommy1729

RE: using sinh(x) ? - tommy1729 - 03/07/2016

I think studying the iterations themselves is nice too.

As for the limit function being analytic, I'm not sure anymore. I found arguments against the squeeze-theorem part, so the M-test might fail. I tried a disproof too, but got trapped in assumptions that turned out to be wrong. More specifically, my assumptions led to the idea that no analytic tetration exists, so they must be wrong. I compared f(exp(x)) with exp(f(x)).

On the other hand, with some luck the squeeze theorem works after all.

Regards

Tommy1729

RE: using sinh(x) ?
- tommy1729 - 03/07/2016

M-test satisfied around 11 with radius 0.002.

Regards

Tommy1729

RE: using sinh(x) ? - tommy1729 - 03/07/2016

I'm considering infinite descent as a lemma. What that means here (outside number theory) is:

a =< b/2
b =< c/2
c =< a/2
D = a^2 + b^2 + c^2.

If D = 0, the M-test works. More later. This might keep me busy for a while though.

Regards

Tommy1729

RE: using sinh(x) ? - tommy1729 - 03/08/2016

Look. Let f(z') = e - 0.2, where f ~ exp^[1/2] ~ 2sinh^[1/2], so 1 < z' < e - 0.2.

Within radius 0.1 around z = z':

| exp(f(z)) - f(exp(z)) | =< e^e

and

| exp(exp(f(z))) - f(exp(exp(z))) | =< e^e^e.

So

ln | exp(f(z)) - f(exp(z)) | =< e

and

ln ln | exp(exp(f(z))) - f(exp(exp(z))) | =< e.

By induction we get

(+1+)   ln^[n] | exp^[n](f(z)) - f(exp^[n](z)) | < e + 1.

Therefore the M-test gives, for (+1+):

| m_1 | + | m_2 | + ... < e + 1.

Notice that if | f(exp^[n](z)) | >= | exp^[n](f(z)) |, then

ln^[n] | 2 f(exp^[n](z)) | < e + 1 + ln(2).

So in this case the M-test for 2sinh gives an upper bound:

| s_1 | + | s_2 | + ... < e + 1 + ln(2).

So the M-test has passed it as analytic.

****

Easy to generalize to e' > e and f(z'') ~ e', with a radius 0.1 / e^(e^e').

****

Easy to show that it holds at infinity too.

****

Regards

Tommy1729

RE: using sinh(x) ? - tommy1729 - 03/11/2016

Now consider the arg. The semi-exp and semi-2sinh map the arguments down, away from 0, to resemble the arg of exp better. There is a rotation point where arg(f(z)) = arg(z) = pi/8. Beyond that point we have down-mapping, and between that point and 0, up-mapping. Take that into account with the abs argument and you get to see it.

Regards

Tommy1729

RE: using sinh(x) ? - tommy1729 - 03/12/2016

I changed my opinion almost 100%.

To do the M-test we need to think in terms of a starting point near, but not on, the real line (and not an infinitesimal imaginary part!). Now consider the iterations as relocations on 2sinh^[r] (r > 0, noninteger).
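The bound | exp(f(z)) - f(exp(z)) | =< e^e from the 03/08 post above can at least be spot-checked on the real line. A hedged sketch, using a Koenigs-style approximation of 2sinh^[1/2] as a stand-in for f ~ exp^[1/2] ~ 2sinh^[1/2]; the helper name and the sample point x = 2 (which lies in the range 1 < z' < e - 0.2 discussed above) are my own choices:

```python
import math

def half_2sinh(x, steps=20):
    # Koenigs-style approximation of 2sinh^[1/2] via the fixed point at 0
    # (multiplier 2sinh'(0) = 2); a real-line stand-in for f above
    for _ in range(steps):
        x = math.asinh(x / 2.0)   # 2sinh^[-1](y) = asinh(y/2)
    x *= math.sqrt(2.0)           # multiplier 2 to the power 1/2
    for _ in range(steps):
        x = 2.0 * math.sinh(x)
    return x

def defect(x):
    """|exp(f(x)) - f(exp(x))|: how badly f fails exp's functional equation."""
    return abs(math.exp(half_2sinh(x)) - half_2sinh(math.exp(x)))
```

At x = 2 the defect comes out of order 1, well under the e^e ~ 15.15 bound claimed above; whether any such bound survives off the real axis is exactly what this thread disputes.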
Then, by the chaotic nature of the iterated exp, we get near fixpoints of 2sinh and exp, cyclic points of exp and 2sinh, ... and also the singularities, or places where the equation fails (double composition not exp).

I used to consider that to be solved by continuation, but it seems that continuation merely resolves ln(exp) = id and the like (removing false periodicity and other invariants, resolving the restriction that the imaginary part of ln is at most 2 pi; I think that is all it does!), at least when starting off the real line. These relocations are apparently not destroyed by continuation but rather imposed. But then the logs of the relocations bring us into trouble.

So seeing my method like this will make the M-test fail, I think. And seeing it correctly will make the M-test impossible: the M-test requires the starting point(s) (the "(s)" coming from a radius or such) to be nonreal!

So the only way is to keep thinking on the real line. Because the chaotic behaviour of exp has no effect there: ln^[oo]( 2sinh( exp^[oo](z) ) ) = exp(z) !!

Similarly, as illustration, ln^[oo]( 3^^oo (z) ) = z + const. BUT it fails the M-test. Yet z + const continues to an analytic, even entire, function.

So I do not believe in the M-test anymore. And I no longer think 2sinh^[r](z) is analytic ... But how to prove it?!?

So it seems my trip into complex analysis failed, and I'm back at real analysis / calculus. This is a confusing function! I originally switched from real to complex analysis because I felt stuck ... But after trying the M-test, Cauchy's contour, Mittag-Leffler and a few others, I feel I might need something completely different from real or complex analysis ... ?!?! What could that be?? I also tried matrix methods, but they also failed.

I believe it is non-analytic now, and thus the n-th derivative must blow up on the real line. But I do not know how to show it.

My apologies to Sheldon.

Regards

Tommy1729

RE: using sinh(x) ? - tommy1729 - 03/12/2016

Also look at the absurdity of the chain rule:

exp(z) = ln ln ln ...
2sinh exp exp ...

Differentiate both sides??

exp(z) dz = 2cosh( exp exp ... ) * exp exp ... * exp ... * ... / ( exp exp exp ... ) dz,

where the "..." are nontrivial, DISTINCT infinities.

Messy :/

Regards

Tommy1729
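For what it's worth, at any finite stage the chain rule is messy but well defined. Writing the n-th approximant as f_n(z) = ln^[n]( g( exp^[n](z) ) ) with g = 2sinh^[1/2], and using the standard facts that the derivative of exp^[n] is the product of the iterates exp^[k] and the derivative of ln^[n](u) is 1/(u * ln u * ... * ln^[n-1] u), one gets (my own rendering, not from the thread):

```latex
f_n'(z) \;=\; g'\!\left(\exp^{[n]}(z)\right)\cdot
\frac{\prod_{k=1}^{n} \exp^{[k]}(z)}
     {\prod_{j=0}^{n-1} \ln^{[j]}\!\left(g\!\left(\exp^{[n]}(z)\right)\right)}
```

Every factor is finite for real z > 1 at finite n, so the distinct infinities above only appear in the formal n -> oo limit; whether this product of huge and tiny factors converges there is another face of the same open question.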