On the existence of rational operators
Tetration Forum (https://math.eretrandre.org/tetrationforum)
+ Forum: Tetration and Related Topics (https://math.eretrandre.org/tetrationforum/forumdisplay.php?fid=1)
+ Forum: Mathematical and General Discussion (https://math.eretrandre.org/tetrationforum/forumdisplay.php?fid=3)
+ Thread: On the existence of rational operators (/showthread.php?tid=546)

RE: On the existence of rational operators - JmsNxn - 12/20/2010

(12/20/2010, 08:28 PM) sheldonison Wrote:
>> (12/20/2010, 07:01 PM) JmsNxn Wrote: A Taylor series expansion could only work if one also has a slog Taylor series expansion. If you give me that, I'd be happy to make a graph over the domain [0, 2].
>
> Here it is: a Taylor series for slog(z) [series omitted], which will converge nicely for z in the range [0..2]. If z < 0, take z1 = exp(z) before generating slog(z1) - 1. If z > 2, iterate z1 = ln(z) n times before generating slog(z1) + n, so that z1 is in the range [0..2].

[Graph: window xmin = 0, xmax = 2, ymin = 0, ymax = 10.]

I'm not liking the way this looks either (don't worry, I used the same algorithm you defined). Are there any other ways of extending tetration that I can try? Hopefully one of them will produce a graph without any angles. If not, I guess it just expresses a very odd connection. The function 2 {x} 3 is defined piecewise, so I guess it's just the way it is :/.

RE: On the existence of rational operators - JmsNxn - 03/11/2011

I've come up with a new idea attached to rational operators. It first involves defining a new set of operators that behave as addition. Tommy_r already noted them and talked about the distribution law; he even got at part of what I'm after, though he didn't fully get at it. For 0 <= q <= 1:

    q:ln(x) = exp^[-q](x)
    x {-q} y = q:ln(q:ln^-1(x) + q:ln^-1(y))

I should prove some lemmas so we have something to work with:

    q:ln(x + x) = q:ln(x) {-q} q:ln(x)
    q:ln(2 * x) = q:ln(x) {1-q} q:ln(2)

    q:ln(x) {-q} q:ln(x) = q:ln(x) {1-q} q:ln(2)

which by induction becomes

    x {1-q} q:ln(n) = x {-q} x {-q} ... {-q} x   (n times)

This is why I chose to index these operations with negative values: they do not obey the fundamental law of recursion.
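[Editor's note] The lemmas above can be checked numerically. The sketch below is mine, not from the thread: it builds the fractional iterates exp^[q] from the crude linear approximation of the super-logarithm (slog(z) = z - 1 on [0, 1]), so it only approximates any "true" analytic iterate, and all names (slog, sexp, q_ln, add_q, mul_q) are hypothetical.

```python
import math

def slog(z):
    """Super-logarithm under the linear approximation: slog(z) = z - 1
    on [0, 1], extended by slog(e^z) = slog(z) + 1 everywhere else."""
    if z < 0:
        return slog(math.exp(z)) - 1
    if z > 1:
        return slog(math.log(z)) + 1
    return z - 1

def sexp(y):
    """Inverse of slog: sexp(y) = y + 1 on [-1, 0]."""
    if y < -1:
        return math.log(sexp(y + 1))
    if y > 0:
        return math.exp(sexp(y - 1))
    return y + 1

def q_ln(x, q):
    """q:ln(x) = exp^[-q](x), the q-th iterate of ln."""
    return sexp(slog(x) - q)

def q_exp(x, q):
    """q:ln^-1(x) = exp^[q](x)."""
    return sexp(slog(x) + q)

def add_q(x, y, q):
    """x {-q} y = q:ln(q:ln^-1(x) + q:ln^-1(y)) -- behaves as addition."""
    return q_ln(q_exp(x, q) + q_exp(y, q), q)

def mul_q(x, y, q):
    """x {1-q} y = q:ln(q:ln^-1(x) * q:ln^-1(y)) -- behaves as multiplication."""
    return q_ln(q_exp(x, q) * q_exp(y, q), q)

q, x, n = 0.5, 3.0, 4
# lemma: q:ln(x + x) = q:ln(x) {-q} q:ln(x)
print(q_ln(x + x, q), add_q(q_ln(x, q), q_ln(x, q), q))
# induction lemma: x {1-q} q:ln(n) = x {-q} x {-q} ... {-q} x (n times)
rhs = x
for _ in range(n - 1):
    rhs = add_q(rhs, x, q)
print(mul_q(x, q_ln(n, q), q), rhs)
```

Each printed pair should agree to floating-point accuracy, since both identities unfold to q:ln(n * q:ln^-1(x)) under the definitions.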
They do, however, behave as addition:

    q:ln(x * (y + a)) = q:ln(xy + xa)
    q:ln(x) {1-q} q:ln(y + a) = q:ln(xy) {-q} q:ln(xa)
    q:ln(x) {1-q} (q:ln(y) {-q} q:ln(a)) = (q:ln(x) {1-q} q:ln(y)) {-q} (q:ln(x) {1-q} q:ln(a))

which simplified is:

    x {1-q} (y {-q} a) = (x {1-q} y) {-q} (x {1-q} a)

Now we have a full ring of operators: exactly as there are addition (+), multiplication (*) and exponentiation (^), there are now {-q}, which behaves as addition, {1-q}, which behaves as multiplication, and {2-q}, which behaves as exponentiation. Given this final lemma we can create a new calculus.

If S(q) is the identity of the operator {-q}:

    q:ln(x + 0) = q:ln(x) {-q} q:ln(0) = q:ln(x)

therefore S(q) = q:ln(0), and therefore there is a pole at S(1) and there is no identity for the operator {-1}.

Now we must prove that as 0 is to multiplication, S(q) is to {1-q} (which holds for q = 1, since technically S(1) is negative infinity, and x + (-inf) = -inf, which is parallel to multiplication by zero equaling zero):

    q:ln^-1(x {1-q} q:ln(0)) = q:ln^-1(x) * 0 = 0

therefore:

    x {1-q} q:ln(0) = q:ln(0) = S(q)

With this, we can now create the fundamental theorem of logarithmic semi-operator calculus:

    q:d/dx f(x) = lim_{h -> S(q)} [f(x {-q} h) }-q{ f(x)] }1-q{ h

where }-q{ and }1-q{ denote the inverses of {-q} and {1-q}. Note that 0:d/dx f(x) = d/dx f(x), and that 1:d/dx f(x) has a pole of negative infinity.

Here are some laws; they turn out exactly the same:

    q:d/dx [f(x) {-q} g(x)] = q:d/dx f(x) {-q} q:d/dx g(x)

which is the law that states differentiation is distributable over addition.

    q:d/dx [f(x) {1-q} g(x)] = [q:d/dx f(x) {1-q} g(x)] {-q} [f(x) {1-q} q:d/dx g(x)]

which is the product rule. The chain rule is applicable; it becomes:

    q:d/dx f(g(x)) = q:d/dg(x) f(g(x)) {1-q} q:d/dx g(x)

With this you can see that for every operation you do with derivatives, you can do a parallel operation with the operator power "lowered".
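[Editor's note] The lowered difference quotient can be evaluated numerically. A sketch under the same assumptions as before (linear-approximation slog/sexp; all names mine): the limit h -> S(q) is taken along h = q:ln(t) with t -> 0+, since q:ln^-1(h) = t and S(q) = q:ln(0).

```python
import math

def slog(z):
    # linear-approximation super-logarithm: slog(z) = z - 1 on [0, 1]
    if z < 0:
        return slog(math.exp(z)) - 1
    if z > 1:
        return slog(math.log(z)) + 1
    return z - 1

def sexp(y):
    # inverse of slog: sexp(y) = y + 1 on [-1, 0]
    if y < -1:
        return math.log(sexp(y + 1))
    if y > 0:
        return math.exp(sexp(y - 1))
    return y + 1

def q_ln(x, q):  return sexp(slog(x) - q)   # q:ln = exp^[-q]
def q_exp(x, q): return sexp(slog(x) + q)   # q:ln^-1 = exp^[q]

def add_q(x, y, q): return q_ln(q_exp(x, q) + q_exp(y, q), q)  # x {-q} y
def sub_q(x, y, q): return q_ln(q_exp(x, q) - q_exp(y, q), q)  # x }-q{ y
def div_q(x, y, q): return q_ln(q_exp(x, q) / q_exp(y, q), q)  # x }1-q{ y

def q_deriv(f, x, q, t=1e-6):
    """q:d/dx f(x) = lim_{h -> S(q)} [f(x {-q} h) }-q{ f(x)] }1-q{ h,
    approximated with h = q:ln(t) for small t > 0."""
    h = q_ln(t, q)
    return div_q(sub_q(f(add_q(x, h, q)), f(x), q), h, q)

# q = 0 recovers the ordinary derivative: compare with cos(1)
print(q_deriv(math.sin, 1.0, 0.0), math.cos(1.0))

# the sum rule q:d/dx [f {-q} g] = q:d/dx f {-q} q:d/dx g, at q = 0.5
q, x0 = 0.5, 1.0
f, g = math.exp, lambda v: v * v
lhs = q_deriv(lambda v: add_q(f(v), g(v), q), x0, q)
rhs = add_q(q_deriv(f, x0, q), q_deriv(g, x0, q), q)
print(lhs, rhs)
```

Both printed pairs should agree to a few decimal places; the sum rule holds exactly in the limit because conjugating by exp^[q] turns {-q} back into ordinary addition.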
Here comes the sweet part: we can now invent lower-operator polynomials, then transform that knowledge into lower-operator Taylor series and invent a definition of semi-operator analytic functions.

    q:d/dx [x {2-q} n] = q:ln(n) {1-q} (x {2-q} (n-1))

This derivative is a result of the normal power-rule proof, but at the end, when you collect n copies of x^(n-1), you must convert that n to q:ln(n), because {-q} is not properly recursive.

Now we can develop a Taylor series. If X(n=0, y, q) A_n = A_0 {-q} A_1 {-q} ... {-q} A_y (the lowered summation), and

    q:n! = q:ln(1) {1-q} q:ln(2) {1-q} ... {1-q} q:ln(n) = q:ln(n!)
    q:0! = S(1-q)
    q:1! = S(1-q)

then

    f(x) = X(n=0, inf, q) [S(1-q) }1-q{ q:n!] {1-q} q:d^n/dx^n f(a) {1-q} [(x }-q{ a) {2-q} n]

We can now create the lower-operator sine function and the lower-operator exponential function, which will have the same relationship:

    exp_q(x) = X(n=0, inf, q) (S(1-q) }1-q{ q:n!) {1-q} (x {2-q} n)
    sin_q(x) = X(n=0, inf, q) ((S(1-q) {2-q} n) }1-q{ q:(2n+1)!) {1-q} (x {2-q} (2n+1))
    cos_q(x) = X(n=0, inf, q) ((S(1-q) {2-q} n) }1-q{ q:(2n)!) {1-q} (x {2-q} 2n)

Which may look messy at first, but it is just the normal sin, cos, and exp Taylor series with every operator lowered by q, and the q:ln taken of the factorial to account for the false recursion in the operator {-q}.

Now create J(q) such that J(q) {q} J(q) = S(q), and therefore J(1) = i and J(0) = 0. In a perfect world J(0.5) = 0.5 * i, but this is false:

    J(q) = S(q) {1+q} 1/2 = q:ln^-1(q:ln(S(q)) * 1/2)

We can now generalize Euler's formula to

    exp_q(x {1-q} J(1-q)) = cos_q(x) {-q} (J(1-q) {1-q} sin_q(x))

as well as

    q:d/dx exp_q(x) = exp_q(x)

I'm trying to look into extending all of analytic calculus into logarithmic semi-operator calculus. So far it's very difficult to take the "q-derivative" of any ordinary function. And because of a certain law, the lowered exponential function, b {2-q} x, is non-differentiable. There, that's all of it. I hope some of you have opinions.
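[Editor's note] The lowered power rule can also be checked numerically. A sketch under the same assumptions as the earlier snippets (linear-approximation slog/sexp, names mine); additionally, x {2-q} n is read here as q:ln(q:ln^-1(x)^n) for natural n, which is one consistent interpretation since it reduces to x^n at q = 0.

```python
import math

def slog(z):
    # linear-approximation super-logarithm: slog(z) = z - 1 on [0, 1]
    if z < 0:
        return slog(math.exp(z)) - 1
    if z > 1:
        return slog(math.log(z)) + 1
    return z - 1

def sexp(y):
    # inverse of slog: sexp(y) = y + 1 on [-1, 0]
    if y < -1:
        return math.log(sexp(y + 1))
    if y > 0:
        return math.exp(sexp(y - 1))
    return y + 1

def q_ln(x, q):  return sexp(slog(x) - q)   # q:ln = exp^[-q]
def q_exp(x, q): return sexp(slog(x) + q)   # q:ln^-1 = exp^[q]

def add_q(x, y, q): return q_ln(q_exp(x, q) + q_exp(y, q), q)  # x {-q} y
def sub_q(x, y, q): return q_ln(q_exp(x, q) - q_exp(y, q), q)  # x }-q{ y
def mul_q(x, y, q): return q_ln(q_exp(x, q) * q_exp(y, q), q)  # x {1-q} y
def div_q(x, y, q): return q_ln(q_exp(x, q) / q_exp(y, q), q)  # x }1-q{ y

def pow_q(x, n, q):
    # x {2-q} n, read as q:ln(q:ln^-1(x)^n) for natural n
    return q_ln(q_exp(x, q) ** n, q)

def q_deriv(f, x, q, t=1e-6):
    # lowered difference quotient, h = q:ln(t) -> S(q) as t -> 0+
    h = q_ln(t, q)
    return div_q(sub_q(f(add_q(x, h, q)), f(x), q), h, q)

# power rule: q:d/dx [x {2-q} n] = q:ln(n) {1-q} (x {2-q} (n-1))
q, x0, n = 0.5, 2.0, 3
print(q_deriv(lambda v: pow_q(v, n, q), x0, q))
print(mul_q(q_ln(n, q), pow_q(x0, n - 1, q), q))
```

The two printed values should agree to a few decimal places, mirroring the ordinary power rule after conjugation by exp^[q].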
PS: I also wanted to ask: does anyone think that choosing a different base for the operators might make the transition of 2 {x} 3 smoother, unlike how it is now? It's a shame really, because every other transformation is very, very nice, like f(x) = 1 }1-q{ x as q is varied bit by bit.

RE: On the existence of rational operators - JmsNxn - 03/19/2011

I have a discovery that starts to legitimize logarithmic semi-operator calculus. Given:

    q:d/dx f(x) = lim_{h -> S(q)} [f(x {-q} h) }-q{ f(x)] }1-q{ h

I can prove that q:d/dx e^x = e^x. In other words, if we lower the operators involved in the difference quotient by any value 0 <= q < 1, and change h's limit from zero (the additive identity) to S(q) (the lowered additive identity), e^x is still its own derivative.
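[Editor's note] This last claim can be checked numerically under the same assumptions as the earlier sketches (linear-approximation slog/sexp, hypothetical names). It comes out cleanly here because these approximate iterates satisfy exp^[q](exp(x)) = exp(exp^[q](x)) exactly, which is what the proof needs.

```python
import math

def slog(z):
    # linear-approximation super-logarithm: slog(z) = z - 1 on [0, 1]
    if z < 0:
        return slog(math.exp(z)) - 1
    if z > 1:
        return slog(math.log(z)) + 1
    return z - 1

def sexp(y):
    # inverse of slog: sexp(y) = y + 1 on [-1, 0]
    if y < -1:
        return math.log(sexp(y + 1))
    if y > 0:
        return math.exp(sexp(y - 1))
    return y + 1

def q_ln(x, q):  return sexp(slog(x) - q)   # q:ln = exp^[-q]
def q_exp(x, q): return sexp(slog(x) + q)   # q:ln^-1 = exp^[q]

def add_q(x, y, q): return q_ln(q_exp(x, q) + q_exp(y, q), q)  # x {-q} y
def sub_q(x, y, q): return q_ln(q_exp(x, q) - q_exp(y, q), q)  # x }-q{ y
def div_q(x, y, q): return q_ln(q_exp(x, q) / q_exp(y, q), q)  # x }1-q{ y

def q_deriv(f, x, q, t=1e-6):
    # lowered difference quotient, h = q:ln(t) -> S(q) as t -> 0+
    h = q_ln(t, q)
    return div_q(sub_q(f(add_q(x, h, q)), f(x), q), h, q)

# q:d/dx e^x should stay close to e^x for every q in [0, 1)
x0 = 1.0
for q in (0.0, 0.25, 0.5, 0.75):
    print(q, q_deriv(math.exp, x0, q), math.exp(x0))
```

For each q the second and third columns should agree to several decimal places, illustrating the claim that e^x remains its own q-derivative.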