Going in the other direction?

tommy1729 Ultimate Fellow Posts: 1,493 Threads: 356 Joined: Feb 2009
01/26/2021, 01:19 PM

Many methods use something like taking n logs of some function f(s+n), or of exp(exp(... n times)). So basically we take many logs of something that approximates many exps. The sinh method and the recent NBLR method, for instance, use this.

If we consider it from the nonreal perspective, we end up with a dense set of singularities, usually because the exp iterates make many smaller copies close to each other. See Sheldon's recent comments; I won't go into details here.

A potential solution is starting from the real line and then doing analytic continuation. But analytic continuation is usually done from a circle or polygon rather than from a line, so that is a bit controversial.

So what is a logical alternative? Perhaps avoiding these many copies by "going in the other direction". By that I mean exp(exp(... n times ... g(s+n))). However, this requires a function g that grows approximately like slog(x). Series and products (which are usually also interpolation methods, by the way) tend to give nonanalytic solutions.

So I wonder about an infinite composition that grows approximately like slog(x). As James Nixon demonstrated, infinite compositions exist that approximate sexp sufficiently well. So maybe they also exist for slog(x). I got stuck trying to find them. Do they even exist?

Regards

tommy1729

JmsNxn Long Time Fellow Posts: 571 Threads: 95 Joined: Dec 2010
01/28/2021, 01:48 AM (This post was last modified: 01/28/2021, 01:54 AM by JmsNxn.)

Hey, Tommy!

Yes, things like what you are looking for do indeed exist. But we have to go the other way: we have to use outer infinite compositions. I haven't done much work with these, but the same normality theorems arise. So instead of,

$ \phi_1(s,\phi_2(s,...\phi_n(s,z)))\\$

where the limit is taken on the inside, we use,

$ \phi_n(s,\phi_{n-1}(s,...\phi_1(s,z)))\\$

where the limit is now on the outside.
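To make the contrast between the two orders concrete, here is a small numerical sketch (my own illustration, not from either post; the choice $\phi_j(s,w) = w + e^{s-j+w}$ is hypothetical, picked only so that the $j$-th contribution decays roughly like $e^{-j}$ and both limits exist):

```python
import math

def phi(s, j, w):
    # Hypothetical factor whose contribution decays roughly like e^{-j}.
    return w + math.exp(s - j + w)

def inner(s, z, N=60):
    # phi_1(s, phi_2(s, ... phi_N(s, z))): the deepest factor acts first,
    # so the limit N -> infinity is taken "on the inside".
    w = z
    for j in range(N, 0, -1):
        w = phi(s, j, w)
    return w

def outer(s, z, N=60):
    # phi_N(s, phi_{N-1}(s, ... phi_1(s, z))): phi_1 acts first,
    # so the limit is taken "on the outside".
    w = z
    for j in range(1, N + 1):
        w = phi(s, j, w)
    return w

# Both orders converge here (the tail terms are negligible),
# but since the phi_j do not commute, the two limits differ slightly.
print(inner(-2.0, 0.0), outer(-2.0, 0.0))
```

Increasing N past 60 does not change either value to machine precision, which is the numerical face of the convergence theorems mentioned above.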
The same convergence theorems I described earlier do work; they're not optimal. For the single-variable case they converge to a constant, and I'm pretty sure you may be able to reduce them a tad in the bivariable case. This case is actually easier than the inner compositional case; I chose inner because it led to less messy equations.

So, for example, say you want something that looks like $\text{slog}$. I can think of a couple; they're not ideal, mostly because we need to use two variables rather than one. In my construction of $\phi$ I got to throw away the $z$ value, and I haven't thought of a way to do this in the outer case. But for example, consider,

$ H(s,z) =\mathcal{L} _{j=1}^\infty z + e^{s-j+z} \bullet z\\$

where $\mathcal{L}_{j=1}^\infty$ is just the outer composition notation (disclaimer: this isn't the symbology I use; I use an upside-down $\Omega$ for this, but this forum's LaTeX abilities are limited). Then,

$ H(s, z + e^{s+z}) = H(s+1,z)\\$

Now, albeit not perfect, it should be entire in $z,s\in\mathbb{C}$. You can massage this by taking inverses in $s$, maybe switching up the composite a bit, to get something that looks more like $\text{slog}(e^z) = \text{slog}(z) + 1$--but I can't think of it offhand.

The key takeaway is that the functional equation is now on the inside rather than the outside. Rather than solving for $f$ like this,

$ f(s+1,z) = F(s,f(s,z))\\$

we are solving for $f$ like this,

$ f(s,F(s,z)) = f(s+1,z)\\$

It's pretty much the same theory; just use outer compositions.
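The functional equation $H(s, z + e^{s+z}) = H(s+1,z)$ can be sanity-checked numerically by truncating the outer composition at a finite depth; the $j$-th increment $e^{s-j+w}$ shrinks roughly like $e^{-j}$, so partial compositions stabilize quickly. A minimal sketch (my own illustration; the sample point and depth are arbitrary):

```python
import cmath

def H(s, z, N=60):
    # Truncated outer composition phi_N(s, ... phi_1(s, z)) with
    # phi_j(s, w) = w + exp(s - j + w); the j-th increment decays
    # roughly like e^{-j}, so the partial compositions converge.
    w = z
    for j in range(1, N + 1):
        w = w + cmath.exp(s - j + w)
    return w

# Check H(s, z + e^{s+z}) = H(s+1, z) at a sample complex point:
# shifting s by 1 re-indexes the factors, absorbing one extra step
# z -> z + e^{s+z} on the inside, up to a tail term of size ~e^{Re(s)-N}.
s, z = -2.0 + 0.3j, 0.1 - 0.2j
lhs = H(s, z + cmath.exp(s + z))
rhs = H(s + 1, z)
print(abs(lhs - rhs))  # tiny truncation error
```

Complex arguments work as well as real ones here, which is consistent with the claim that $H$ is entire in $z,s\in\mathbb{C}$ for this construction.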