(08/21/2014, 01:15 AM)jaydfox Wrote: Sorry to post two versions in one day, but the performance boost was too huge to ignore. In the table below, the top row shows the number of polynomials I calculated; below it are the times in seconds. As you can see, version 3 cuts the time by 30%-50% compared to version 2. But version 4 blows them both out of the water, cutting the time by more than an order of magnitude.

Code:
`Vers.   60     100    150    200
 v2      0.7    9.1    91     --
 v3      0.5    5.6    49     --
 v4      0.15   0.5    3.2    15.3`

How is this possible? Simple: I got rid of the rational coefficients. (...)

Okay, I updated the code again.

jdf_seq_v5.gp (Size: 9.65 KB / Downloads: 474)

The algorithm is only changed slightly, but due to the manner of the change, the code actually changed quite a bit. (Just in the "details"; the structure remains largely intact.)

Here's the performance for version 5, stacked up against previous versions. Interestingly, there's a rough match in the time to calculate 2n polynomials in version 5, versus n polynomials in version 3.

Code:
`Vers.   60     100    150    200    250    300
 v2      0.7    9.1    91     --     --     --
 v3      0.5    5.6    49     --     --     --
 v4      0.15   0.5    3.2    15.3   53.2   150
 v5      0.15   0.33   1.3    5.4    17.4   46.3`

So what spiffy trick did I come up with this time? Well, I probably should have figured this out a while back, but sometimes you miss the forest for the trees, so to speak.

Anyway, the generator polynomials are either even or odd. That is, they contain only even (or only odd) powers of x. An even polynomial satisfies g(-x) = g(x), and an odd one g(-x) = -g(x), so the values at the negative nodes are determined by the values at the positive nodes.

By taking advantage of this symmetry, you only need to evaluate about half as many points.

But figuring out an efficient way to calculate the coefficients for the Newton polynomials was eluding me (I couldn't reuse previous iterations). Then it occurred to me that I could change the even functions to full polynomials with half the degree, by substituting x^2 in place of x. The odd functions require a pre-multiplication by x, and then the same algorithm can be used.

An example is probably in order. To calculate the 6th-order polynomial, we start by evaluating the 5th order polynomial, in order to generate the interpolation points:

g_5(x) = 128/15*x^5 - 28/3*x^3 + 9/5*x

(Internally, it's stored as 1024*x^5 - 1120*x^3 + 216*x; divide that by 5! to get the proper form.)
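As a quick sanity check, here's a small Python sketch (the attached code is PARI/GP; this is just an illustration) evaluating g_5 from its 5!-scaled integer coefficients:

```python
from fractions import Fraction
from math import factorial

# Internally stored (5!-scaled) coefficients of g_5: 1024*x^5 - 1120*x^3 + 216*x.
# Dividing by 5! = 120 recovers g_5(x) = 128/15*x^5 - 28/3*x^3 + 9/5*x.
scaled = {5: 1024, 3: -1120, 1: 216}

def g5(x):
    """Evaluate g_5 exactly from the scaled integer coefficients."""
    s = sum(c * Fraction(x) ** p for p, c in scaled.items())
    return s / factorial(5)

print(g5(3))  # 1827
print(g5(5))  # 25509
```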

We know that g_6(0) = 0 and g_6(1) = 1. We evaluate g_5(x) at 3 and 5, giving 1827 and 25509. The recurrence relation is:

g_6(k) = g_6(k-1) + g_5(2k-1)

with the negative nodes filled in by the even symmetry, g_6(-k) = g_6(k). This gives:

g_6(-3) = 27337

g_6(-2) = 1828

g_6(-1) = 1

g_6(0) = 0

g_6(1) = 1

g_6(2) = 1828

g_6(3) = 27337
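The values above satisfy g_6(k) = g_6(k-1) + g_5(2k-1): for example, 1828 = 1 + 1827 and 27337 = 1828 + 25509. A Python sketch (the original is PARI/GP) reproducing the table from that relation:

```python
# Reproduce the g_6 node values from g_5 via the recurrence
# g_6(k) = g_6(k-1) + g_5(2k-1), starting from g_6(0) = 0; the negative
# nodes follow from the even symmetry g_6(-k) = g_6(k).

def g5(x):
    # 5!-scaled integer coefficients of g_5, divided back down by 120
    return (1024 * x**5 - 1120 * x**3 + 216 * x) // 120

g6 = {0: 0}
for k in range(1, 4):
    g6[k] = g6[k - 1] + g5(2 * k - 1)
    g6[-k] = g6[k]          # even symmetry

for k in range(-3, 4):
    print(k, g6[k])         # matches the listed node values
```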

Now we substitute t = x^2. Writing h for the half-degree polynomial with g_6(x) = h(x^2), the nodes become:

h(9) = 27337

h(4) = 1828

h(1) = 1

h(0) = 0

h(1) = 1

h(4) = 1828

h(9) = 27337

Notice that the first three nodes are redundant. At this point, it's just a matter of divided differences (taking into account the non-equal spacing), and applying Newton's interpolation formula. Because of the way the problem is constructed, we can reuse most of the calculations for the various interpolation polynomials.
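To make the divided-difference step concrete, here's a Python sketch (an illustration, not the author's PARI/GP code) interpolating the half-degree polynomial h, where g_6(x) = h(x^2), through the non-redundant nodes (0, 0), (1, 1), (4, 1828), (9, 27337), using exact rationals to handle the unequal spacing:

```python
from fractions import Fraction

# Non-redundant nodes of the half-degree polynomial h, with g_6(x) = h(x^2).
xs = [0, 1, 4, 9]
ys = [Fraction(v) for v in (0, 1, 1828, 27337)]

# Build the divided-difference table in place; coef[j] ends up as f[x_0, ..., x_j].
coef = ys[:]
for j in range(1, len(xs)):
    for i in range(len(xs) - 1, j - 1, -1):
        coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])

def h(t):
    """Evaluate Newton's interpolation formula at t (Horner-style)."""
    acc = coef[-1]
    for j in range(len(xs) - 2, -1, -1):
        acc = acc * (t - xs[j]) + coef[j]
    return acc

assert all(h(x) == y for x, y in zip(xs, ys))  # reproduces the nodes
print(coef)  # highest divided difference f[0,1,4,9] = 2048/45 = h's leading coefficient
```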

Also, it's just plain cool. That's the beauty of reducing the degree of the polynomial like this: half the polynomial evaluations to calculate the interpolation points, approximately a quarter of the divided differences, and interpolation polynomials of half the degree. Overall, this speeds up the process by a factor of about 3.

For odd functions, you'll see a pre-multiplication step before it starts calculating the divided differences.
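The odd case can be illustrated with a made-up odd polynomial (not one of the g's): dividing out the factor of x leaves an even polynomial, and substituting t = x^2 halves its degree, so the same machinery applies.

```python
# Hypothetical odd polynomial p(x) = 2*x^5 - 3*x^3 + x, coefficients keyed by power.
p = {5: 2, 3: -3, 1: 1}

# Dividing by x shifts every power down by one, leaving an even polynomial;
# substituting t = x^2 then halves the degree: q(t) = 2*t^2 - 3*t + 1.
q = {(power - 1) // 2: c for power, c in p.items()}

def eval_poly(poly, x):
    return sum(c * x**e for e, c in poly.items())

# Check the factorization p(x) == x * q(x^2) at several points:
for x in range(-4, 5):
    assert eval_poly(p, x) == x * eval_poly(q, x * x)
print("p(x) = x * q(x^2) verified")
```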

Edit: Fixed some formatting, and fixed the recurrence formula for g(k)

~ Jay Daniel Fox