So, this is something I've come across which combines three things we've all seen, and one new thing. That new thing being Bessel functions.

I will remind the reader that the Bessel functions are given by:

$$

J_v(x) = \sum_{k=0}^\infty \frac{(-1)^k}{k!\Gamma(1+k+v)} \left( \frac{x}{2}\right)^{2k+v}\\

$$

We only care about the \(v = 0\) case, for which:

$$

J_0(2\sqrt{x}) = \sum_{k=0}^\infty \frac{(-1)^k x^k}{k!^2}\\

$$
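As a quick numerical sanity check: the series above is, term for term, the expansion of \(J_0(2\sqrt{x})\). A small mpmath sketch (the truncation order and test point are arbitrary choices):

```python
from mpmath import mp, mpf, besselj, sqrt, factorial

mp.dps = 30  # working precision (arbitrary choice)

def J0_series(x, terms=40):
    # partial sum of sum_k (-1)^k x^k / (k!)^2
    return sum((-1)**k * mpf(x)**k / factorial(k)**2 for k in range(terms))

x = mpf('1.7')  # arbitrary test point
print(J0_series(x))
print(besselj(0, 2 * sqrt(x)))  # should agree to working precision
```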

This function has the awesome property that:

$$

\Lambda(s) = \int_0^\infty J_0(2\sqrt{x})x^{s-1}\,dx\\

$$

which can be analytically continued to:

$$

\Lambda(s) = \sum_{k=0}^\infty \frac{(-1)^k}{k!^2(s+k)} + \int_1^\infty J_0(2\sqrt{x})x^{s-1}\,dx\\

$$
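One can test this continuation numerically against the closed form \(\Lambda(s) = \Gamma(s)/\Gamma(1-s)\) that appears later in the post. A sketch with mpmath (the 60-term truncation and the sample point \(s = -1/2\) are arbitrary choices); substituting \(x = t^2\) puts the oscillatory tail integral in a form `quadosc` can handle:

```python
from mpmath import mp, mpf, gamma, besselj, factorial, quadosc, inf

mp.dps = 25
s = mpf('-0.5')  # sample point with Re(s) < 1/4, away from the poles

# pole-bearing series part: sum_k (-1)^k / (k!^2 (s+k))
series = sum((-1)**k / (factorial(k)**2 * (s + k)) for k in range(60))

# tail integral of J0(2*sqrt(x)) x^(s-1) over [1, inf), after x = t^2
tail = quadosc(lambda t: 2 * besselj(0, 2 * t) * t**(2 * s - 1),
               [1, inf], omega=2)

print(series + tail)            # Lambda(-1/2)
print(gamma(s) / gamma(1 - s))  # closed form: equals -4 here
```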

This is meromorphic for \(\Re(s) < \frac{1}{4}\), with simple poles at the non-positive integers; the bound on the tail integral comes from the asymptotic \(J_0(2\sqrt{x}) = O(x^{-1/4})\) as \(x \to \infty\).

We're going to start with the function \(g(x)\) such that \(g : (-\infty, 0] \to (-\infty , 0]\) and \(g(g(x)) = e^{x} -1\). And we are going to notice instantly that \(e^x - 1 \to -1\) as \(x \to -\infty\), so \(g(g(x)) \to -1\); since \(g\) is injective, \(g(x)\) must tend to the finite limit \(g^{-1}(-1)\). By which we have \(g\) is a bounded function. Therefore:

$$

\int_0^\infty g(-x)x^{s-1}\,dx\\

$$

converges for \(-1 < \Re(s) < 0\): near \(0\) because \(g(-x) \sim -x\), and near \(\infty\) because \(g\) is bounded. Up to the normalizing factor \(1/\Gamma(s)\), this is the Mellin transform of \(g(-x)\), which we'll write:

$$

\partial g(s) = \frac{1}{\Gamma(s)}\int_0^\infty g(-x)x^{s-1}\,dx\\

$$
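To see the normalization in action, take the known case \(f(x) = e^x - 1\): then \(\int_0^\infty f(-x)\,x^{s-1}\,dx = \Gamma(s)\) on \(-1 < \Re(s) < 0\), so \(\partial f(s) \equiv 1\). A quick mpmath check at an arbitrary sample point:

```python
from mpmath import mp, mpf, exp, gamma, quad, inf

mp.dps = 25
s = mpf('-0.5')  # arbitrary point in the strip -1 < Re(s) < 0

# Mellin integral of f(-x) = e^{-x} - 1; splitting at 1 helps the quadrature
integral = quad(lambda x: (exp(-x) - 1) * x**(s - 1), [0, 1, inf])

print(integral)             # should be Gamma(-1/2) = -2*sqrt(pi)
print(integral / gamma(s))  # the normalized transform: should be 1
```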

Beautifully, if we take the asymptotic expansion of \(g\) about \(0\), we get (this is Ramanujan's master theorem) that:

$$

\partial g(-k) = k! g_k\\

$$

Where \(g_k\) are the coefficients of the asymptotic expansion \(g(x) \sim \sum_{k=1}^\infty g_k x^k\) at \(0\) (Gottfried's idea from before).
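Gottfried's coefficients themselves can be generated by solving \(g(g(x)) = e^x - 1\) order by order: the coefficient of \(x^n\) in \(g(g(x))\) is \(2g_n\) plus terms involving only lower coefficients, so each \(g_n\) is determined linearly. A self-contained sketch in exact rational arithmetic (the truncation order `N` is an arbitrary choice):

```python
from fractions import Fraction
from math import factorial

N = 8  # truncation order (arbitrary)

def compose(f, h, N):
    """Coefficients of f(h(x)) mod x^(N+1); assumes f[0] == h[0] == 0."""
    out = [Fraction(0)] * (N + 1)
    p = [Fraction(0)] * (N + 1)
    p[0] = Fraction(1)  # running power h^m, starting at h^0 = 1
    for m in range(1, N + 1):
        newp = [Fraction(0)] * (N + 1)
        for i, a in enumerate(p):
            if a:
                for j, b in enumerate(h):
                    if b and i + j <= N:
                        newp[i + j] += a * b
        p = newp  # now p holds the coefficients of h^m
        for n in range(N + 1):
            out[n] += f[m] * p[n]
    return out

# target series: e^x - 1 = sum_{n>=1} x^n / n!
target = [Fraction(0)] + [Fraction(1, factorial(n)) for n in range(1, N + 1)]

# solve order by order; g_1 = 1 for the branch fixing (-inf, 0]
g = [Fraction(0), Fraction(1)] + [Fraction(0)] * (N - 1)
for n in range(2, N + 1):
    v = compose(g, g, N)[n]    # composition with g_n still set to 0
    g[n] = (target[n] - v) / 2  # coeff of x^n in g(g) is 2*g_n + v

print(g[:6])  # [0, 1, 1/4, 1/48, 0, ...]
```

The first few values \(g_2 = 1/4\), \(g_3 = 1/48\), \(g_4 = 0\) can be verified by hand by composing the truncated series.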

We can analytically continue \(\partial g(s)\) to all of \(\Re(s) < 0\) (using standard residue arguments), and here is where the fun begins. Using Mellin inversion, for \(-1 < c < 0\), we have:

$$

\frac{1}{2 \pi i} \int_{c-i\infty}^{c+i\infty} \Gamma(s) \partial g(s) x^{-s} \,ds = g(x) = \sum_{k=1}^\infty g_k x^k\\

$$

Which is valid for \(\Re(x) < 0\).
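Again the toy case \(f(x) = e^x - 1\), with \(\partial f \equiv 1\), makes this concrete: the contour integral of \(\Gamma(s)\,x^{-s}\) along \(\Re(s) = c\) should reproduce \(e^{-x} - 1\) for \(x > 0\) (the negative-argument convention, reflected through the sign). A mpmath sketch; truncating at \(|\Im(s)| = 40\) is harmless because \(\Gamma\) decays like \(e^{-\pi|t|/2}\) on vertical lines:

```python
from mpmath import mp, mpf, mpc, gamma, quad, exp, pi

mp.dps = 25
c = mpf('-0.5')  # any -1 < c < 0 works
x = mpf('1.0')   # evaluate at x = 1

# (1/2pi) * integral of Gamma(c + it) x^{-(c + it)} dt, truncated at |t| = 40
val = quad(lambda t: gamma(mpc(c, t)) * x**(-mpc(c, t)),
           [-40, -5, 0, 5, 40]) / (2 * pi)

print(val.real)     # should be e^{-1} - 1 = -0.632121...
print(exp(-x) - 1)
```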

Now we can totally change the game by introducing Bessel functions. If I write:

$$

\frac{1}{2 \pi i} \int_{c-i\infty}^{c+i\infty} \Lambda(s) \partial g(s) x^{-s} \,ds = \mathcal{B}g(x) = \sum_{k=1}^\infty g_k \frac{x^k}{k!}\\

$$

We are now asking that this object converges for \(x \approx 0\), i.e. that \(\mathcal{B}g\) is an honest holomorphic function and not merely an asymptotic series. That is exactly the statement \(g_k = O(C^k k!)\) for some constant \(C > 0\). Well, wouldn't you know that the Bessel function's Mellin transform \(\Lambda(s)\) is a standard kind of function:

$$

\Lambda(s) = \frac{\Gamma(s)}{\Gamma(1-s)}\\

$$
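As a sanity check on this closed form, the residue of \(\Gamma(s)/\Gamma(1-s)\) at \(s = -k\) is \((-1)^k/k!^2\), matching the Bessel series term by term and supplying the extra \(1/k!\) in \(\mathcal{B}g\). A quick numerical probe with mpmath (the offset `eps` is an arbitrary small number):

```python
from mpmath import mp, mpf, gamma, factorial

mp.dps = 25
eps = mpf('1e-8')  # small offset used to probe each simple pole

for k in range(6):
    # residue at s = -k, approximated by (s + k) * Lambda(s) at s = -k + eps
    res = eps * gamma(-k + eps) / gamma(1 + k - eps)
    expected = mpf(-1)**k / factorial(k)**2
    print(k, res, expected)
```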

So I have reduced Gottfried's problem to showing that, for \(-1 < c < 0\), the following function is holomorphic in \(|x| < \delta\) for some \(\delta > 0\):

$$

\sum_{k=1}^\infty g_k \frac{x^k}{k!} = \frac{1}{2\pi i} \int_{c-i\infty}^{c+i\infty} \frac{\Gamma(s)}{\Gamma(1-s)} \partial g(s) x^{-s}\,ds\\

$$

Where:

$$

\partial g(s) = \frac{1}{\Gamma(s)}\int_0^\infty g(-x)x^{s-1}\,dx\\

$$

Now to solve this problem, we note instantly that \(\partial g(s)\) is bounded in the left half plane, and that the \(\Gamma(s)\) in \(\Lambda(s)\) plays off against \(\Gamma(1-s)\): on the line \(\Re(s) = c\) their ratio leaves a polynomial decay like \(1/|\Im(s)|^{1+\delta}\). This would follow from standard bounds on Mellin transforms, and Gamma function asymptotics (à la Stirling).

This integral then converges absolutely, and uniformly on \(|x| < \delta\), showing that Gottfried's coefficients are \(O(C^k k!)\) for some \(C > 0\).

EDIT:

Okay, so I worked out the actual asymptotics for \(\Lambda\).

Recall the Stirling-type asymptotic for \(\Gamma\) on vertical lines:

$$

|\Gamma(x+iy)| \sim \sqrt{2\pi} |y|^{x-\frac{1}{2}}e^{-\frac{1}{2}\pi |y|}\\

$$

Thereby, if we choose \(-1 < c < 0\), then:

$$

\left|\frac{\Gamma(c+iy)}{\Gamma(1-c-iy)}\right| \sim \frac{|y|^{c-\frac{1}{2}}}{|y|^{1-c-\frac{1}{2}}} \sim |y|^{2c-1}\\

$$
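This ratio asymptotic is easy to confirm numerically (mpmath; the sample values of \(c\) and \(y\) are arbitrary):

```python
from mpmath import mp, mpf, mpc, gamma, fabs

mp.dps = 25
c, y = mpf('-0.5'), mpf('1000')

lhs = fabs(gamma(mpc(c, y)) / gamma(mpc(1 - c, -y)))
rhs = y**(2 * c - 1)  # predicted decay |y|^{2c-1}

print(lhs / rhs)  # tends to 1 as y grows
```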

Therefore, so long as \(c < 0\), we have \(2c - 1 < -1\) and the above integral converges absolutely, provided \(\partial g(s)\) is bounded in the left half plane, which is provable using a similar argument.
