Beyond + and -
#1
What I propose here doesn't have a direct relation to tetration, but since I don't really know where else to propose a new extension of math that falls outside the horizon of mainstream math (without being ridiculed at the start), I thought this is the right forum for it, given that I have read many proposals for "experimental" extensions of current math here.

My idea is rooted in the question "How to extend math in the most fundamental way?", that is, not working inside known structures like R or C, yet also not introducing overly complex new structures.
I thought of a way to naturally extend the integers in the same way that the integers extend the natural numbers. As far as I know, no one has written this down yet; maybe no one thought it was interesting or worth thinking about.

Really, it is very simple. But it can seem complicated or plain weird or "wrong" at first - just try to follow my explanation:

Subtraction is simply the opposite of addition. From the positive numbers we go towards zero instead of away from it - and past zero, we go towards increasingly negative numbers.
We can do the same with respect to both positive and negative numbers, which creates a new operation (and a new algebraic sign). I call it "§".
By doing §-tion we go towards zero from both the negative and the positive numbers - and past zero, we go towards the §-numbers.

Let's look at a few examples.
From positive numbers we go towards zero and beyond zero towards §-tive numbers:
+5§3=+2
+1§1=0
+8§3=+5
+3§5=§2

...but also towards zero (and beyond zero towards §-tive numbers) from the negative numbers:
-5§3=-2
-1§1=0
-8§3=-5
-3§5=§2

From 0 or §-numbers we go towards increasingly §-tive numbers:
0§5=§5
§3§5=§8

Note that +1§1=-1§1=0.

I hope the examples make it clear what the operation does.
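
To make this concrete, here is a small sketch in Python of how I picture the operation. The (sign, magnitude) representation and the function name sec are just my own illustration, not part of the proposal:

```python
def sec(x, y):
    """x § y, with x given as (sign, magnitude), sign one of '+', '-', '§',
    magnitude >= 0 (magnitude 0 always means zero), and y >= 0.
    The operation moves x towards zero by y; past zero it continues into the §-numbers."""
    s, m = x
    if s in ('+', '-') and m >= y:
        return (s, m - y)        # e.g. +8 § 3 = +5  and  -8 § 3 = -5
    if s in ('+', '-'):
        return ('§', y - m)      # e.g. +3 § 5 = §2, -3 § 5 = §2, 0 § 5 = §5
    return ('§', m + y)          # e.g. §3 § 5 = §8

print(sec(('+', 5), 3))   # ('+', 2)
print(sec(('-', 1), 1))   # ('-', 0), i.e. zero, just like (+1) § 1
print(sec(('+', 3), 5))   # ('§', 2)
print(sec(('§', 3), 5))   # ('§', 8)
```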

The §-numbers seem to make it possible to solve the equation -X+Y=X+Y by (for example) inserting X=3 and Y=§3, giving -3§3=3§3=0. This also means that §X-§X != 0; instead §X-§X=§X, because otherwise you could derive -3=3 by subtracting Y on both sides above.
This seems to imply §1*-1=0, even though that may be counterintuitive.

I am not yet sure which properties of the integers are preserved and which are not, and I am also not yet sure what §1*§1 is. It would be nice if §1*§1=-1 so that -1 has a square root, but I don't know yet whether this is consistent. It may even be possible that §1*§1 doesn't have a unique solution or can't have a solution within the defined numbers (which I think is unlikely). I also have no clue as to what the practical utility of these numbers is (if there is any).

I really encourage you to study these numbers, since I am unfortunately not remotely as good at figuring out mathematical properties as at inventing new principles ;-), but I suspect they are quite interesting.

If you see any errors in my reasoning here, please correct me!
#2
Your operator is very simple actually:

x§y = sgn(x)*(|x|-y)

Nothing really special about it at all


0 § y is absolutely nothing because sgn(0) is nonexistent

And the solution to your algebraic equation (-X+Y=X+Y) is X = 0
#3
(08/17/2012, 01:25 AM)JmsNxn Wrote: Your operator is very simple actually:

x§y = sgn(x)*(|x|-y)
Interesting, it seems to be true in some cases.

4§3=1=1*(|4|-3)
-4§3=-1=-1*(|-4|-3)=-1

But

3§4 = §1 != -1 = 1*(|3|-4)

So your definition works only in special cases, and is not a general equivalence.
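
Here is a quick check in Python (just my own illustration) comparing your formula with the values § is supposed to give:

```python
def sgn(x):
    return (x > 0) - (x < 0)

def jms(x, y):                  # the proposed formula, over the ordinary integers
    return sgn(x) * (abs(x) - y)

print(jms(4, 3))    # 1   -> matches  4 § 3 = +1
print(jms(-4, 3))   # -1  -> matches -4 § 3 = -1
print(jms(3, 4))    # -1  -> but 3 § 4 should be §1, which is not an ordinary integer
```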

(08/17/2012, 01:25 AM)JmsNxn Wrote: Nothing really special about it at all
I don't know, it was new for me and I found it interesting.

(08/17/2012, 01:25 AM)JmsNxn Wrote: 0 § y is absolutely nothing because sgn(0) is nonexistent
So you give me an incomplete definition that is supposed to be equivalent to mine, and conclude that my definition is false? Seems like a flawed approach to me ;-).

Even if your definition were right, sometimes it is useful to add "nonexistent" things to math - like sqrt(-1). I don't see how my example is any different; maybe it is useless, but I don't see why it should be.
I am not sure, but it might be that § extends the algebraic signs with a sign for 0? It may make sense, since it "neutralizes" numbers, and §1*-1 seems to be 0 (if I didn't make an error above).

Also your approach says nothing about multiplication (and beyond).

(08/17/2012, 01:25 AM)JmsNxn Wrote: And the solution to your algebraic equation (-X+Y=X+Y) is X = 0
Obviously, but now there are infinite solutions where X != 0.
So it solves -1+Y=1+Y, etc...
Also it seems to solve -1/X+1/Y=1/X+1/Y with Y=§1*X (at least for positive X), which is not solvable in the integers.
#4
the problems with § are, in general, missing desired properties and incomplete definitions.

to list the most important

1) define §-x

2) -a§b =?= b§-a ? see 1)

-a§b = -1 * a§b for a > b > 0 but otherwise ?

hence commutativity might not hold. and even anticommutativity might not hold.

3) associativity does not hold and is not defined

(1§1)§2 = 0§2 = §2
1§(1§2) = 1§§1 = ??? (maybe 0?) and that =/= §2 either
(3§3)§2 = §2
3§(3§2) = 3§1 = 2 =/= §2

inconsistency^2 ! :-)


4) we lost linearity because -1§1 = 1§1

5) § does not have a uniquely defined inverse operator, see 4)

6) distributivity does not hold

§1*(-1+1) = §1*0 = 0

but §1*-1 = 0 and thus §1*-1 + §1*1 = §1 =/= 0 !!

7) you use § both as an operator and a sign.
that is valid for - but seems troublesome here.

a§§b = ?? a§§§c = ?? ( see also above )

8) what is §1^2 ?

9) what is 1/§x ??

10) what is log(§2) ??

11) it is clear from the above that we cannot use taylor series and calculus without worries on these objects ... if they even exist.

since we do not have many properties to work with, the general objection is that it is random, without structure, and thus might never solve anything WITH STRUCTURE because it doesn't have any itself.

using special cases just seems to reduce to the reals, and then we have JmsNxn's operator mentioned earlier, which is as classic as can be.


regards

tommy1729
#5
Thanks for your informative reply.
I understand that my definitions are very incomplete and that some properties may be lost compared to the real numbers, but both are very common when introducing new mathematical concepts, so I wouldn't be so quick to disregard the concept on these grounds. For example, complex numbers are not an ordered field, which is arguably a very important property of the real numbers, but this doesn't make them less interesting or coherent. And when we "invented" negative numbers, we first had to figure out many definitions (like -*-, X^-, etc.); the same with i (like X^i or sqrt(i)).
I am not sure that it'll turn out to be useful, but it seems interesting to at least play around with the concept. Who knows what we might be missing? Also, it is IMO fun to try new concepts and find out how to make them coherent.

(08/17/2012, 03:18 PM)tommy1729 Wrote: the problems with § are, in general, missing desired properties and incomplete definitions.

to list the most important

1) define §-x
I honestly have no idea. What could we use to determine that? It may even be possible that there are multiple consistent possibilities.

(08/17/2012, 03:18 PM)tommy1729 Wrote: -a§b = -1 * a§b for a > b > 0 but otherwise ?

hence commutativity might not hold. and even anticommutativity might not hold.
Well, subtraction isn't commutative either. I don't know about anticommutativity.

(08/17/2012, 03:18 PM)tommy1729 Wrote: 3) associativity does not hold and is not defined

(1§1)§2 = 0§2 = §2
1§(1§2) = 1§§1 = ??? (maybe 0?) and that =/= §2 either
(3§3)§2 = §2
3§(3§2) = 3§1 = 2 =/= §2
It indeed does not hold. But even the well-known, much-used subtraction is not associative.
(3-3)-2=-2
3-(3-2)=2

So I can't see a problem with that at all?

(08/17/2012, 03:18 PM)tommy1729 Wrote: inconsistency^2 ! :-)
Well, if lack of associativity or commutativity means inconsistency, then I guess you consider subtraction inconsistent. ;-)

(08/17/2012, 03:18 PM)tommy1729 Wrote: 4) we lost linearity because -1§1 = 1§1
I can't really comment on this, since I don't understand what linearity means.

(08/17/2012, 03:18 PM)tommy1729 Wrote: 5) § does not have a uniquely defined inverse operator, see 4)
Of course it doesn't, since we have defined 3 operators that are, in some sense, all inverse to the other 2. But I don't see how having 2 inverse operators (with +, -, §) is worse than 1 inverse operator (with + and -) or 0 (+ without -).
It seems to be an interesting symmetry to me.

Also, exponentiation has no unique inverse operator (both logarithm and nth root are inverse operators in some sense), but this isn't considered a problem either, right?

(08/17/2012, 03:18 PM)tommy1729 Wrote: 6) distributivity does not hold

§1*(-1+1) = §1*0 = 0

but §1*-1 = 0 and thus §1*-1 + §1*1 = §1 =/= 0 !!
I don't think it is correct that §1*-1=0. This was just a guess that I now think is wrong. Maybe §*- = §?
As I said, it is better to interpret my post as an idea, not as a fully fledged mathematical concept. But everything starts as an idea, right?

At least it is a fully defined set of operators if we just consider +, - and § without multiplication. It remains to be seen where we can go from there.

(08/17/2012, 03:18 PM)tommy1729 Wrote: 7) you use § both as an operator and a sign.
that is valid for - but seems troublesome here.
Why not?

(08/17/2012, 03:18 PM)tommy1729 Wrote: a§§b = ?? a§§§c = ?? ( see also above )
Well, which possibility is coherent or how could we figure out which one is?

(08/17/2012, 03:18 PM)tommy1729 Wrote: 8) what is §1^2 ?
Most probably it makes sense to define it as §1*§1 (whatever that is).

(08/17/2012, 03:18 PM)tommy1729 Wrote: 9) what is 1/§x ??
Analogous to + and - it probably is §(1/x).

(08/17/2012, 03:18 PM)tommy1729 Wrote: 10) what is log(§2) ??
Let's not start with such complicated notions. Even with log(-2) this question is not trivial at all.

(08/17/2012, 03:18 PM)tommy1729 Wrote: 11) it is clear from the above that we cannot use taylor series and calculus without worries on these objects ... if they even exist.
I don't see how that is clear, given that many objections you give above apply to subtraction or exponentiation as well, or are simply open problems.
But I am not very knowledgable in that area, so I really don't know.

(08/17/2012, 03:18 PM)tommy1729 Wrote: since we do not have many properties to work with, the general objection is that it is random, without structure, and thus might never solve anything WITH STRUCTURE because it doesn't have any itself.
This seems to be mainly prejudice (understandably so, given that the proposal is still quite undeveloped). We don't disregard exponentiation or complex numbers just because some properties are missing. I don't see how it is random. For just +, - and § everything seems to be clear, and beyond that it is simply unclear right now what the structure turns out to be, so we can't say it is "random".

(08/17/2012, 03:18 PM)tommy1729 Wrote: using special cases just seems to reduce to the reals, and then we have JmsNxn's operator mentioned earlier, which is as classic as can be.
Well, there are special cases of - or * which reduce to addition as well (like X-0=X+0 or X*1=X+0). Nevertheless, the operator is novel because not all cases can be reduced to + or - (or anything else).


#6
ok.

although minus is not commutative , associative or distributive , its inverse + is !

this § does not have such an inverse. ( as an operator at least )

the issue is people will argue § is + or § is - , compare it to defining a number j , that also has j^2 = -1 but j =/= i or - i.

or k^2 = 1 with k not -1 or +1.

i hope you are familiar with zero-divisors and quaternions.

" tessarines " and analogues ( sometimes other names ) also cross my mind.

usually, these kinds of systems have matrix representations.

these mainly 18th and 19th century ideas were popular in their day, and one of the key ideas is to define a "multiplication table for the units", call it a group or so if you like.

this is to stress the importance of multiplication and the remark that some things CANNOT BE DERIVED, but MUST BE DEFINED.

not that this is strictly necessary, but i think you should take a look at it if you haven't already.



i wrote the above to make clear the "actual reply" below:

let a , b , c be distinct nonnegative reals.

i refuse to take § as an operator at the moment , but as a sign/number i will try to define it :

here x§y is considered as x + §y and also equal to §y + x

+a-b§c => max(a,b,c) * the sign that matches that max (+,- or §)

the cases when some of a, b, c are equal are trivial.
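
a literal reading of that rule in code (the tuple representation and the function name are just illustration, and i take both the magnitude and the sign of the result from the dominating term):

```python
def combine(terms):
    """literal reading of '+a-b§c => max(a,b,c) * the sign that matches that max'.
    terms: (sign, magnitude) pairs, sign in '+', '-', '§', magnitudes pairwise distinct."""
    return max(terms, key=lambda t: t[1])

print(combine([('+', 5), ('-', 3), ('§', 2)]))   # ('+', 5): the + term has the largest magnitude
print(combine([('+', 2), ('-', 3), ('§', 7)]))   # ('§', 7): the § term dominates
```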

so now we only need a multiplication table.

since we don't want § to be + or - (see the intro before the "actual reply", i hope it is clear), it seems wise to define §1^2 to be different from -1, 0, 1.

so we are only left to define (§1)^2 and -1 * §1.

(well, actually there are other solutions than the one in this post, but those are "old ideas" such as group rings etc., not in the spirit of this thread and closer to quaternions and such)

thus we are basically forced to define §1*§1 = §1

3 problems remain

x+y = -x+y

-1 * §1

(-1+1) * §1

it is a kinda basic thing in math to be ABLE TO DO THE SAME THING ON BOTH SIDES OF THE EQUATION.

examples

a = b then a^2 = b^2

although we must note

a^2 = b^2 does not necessarily imply a = b.

so take with a grain of salt.

but for + and - that grain of salt is not needed in standard math.

however with those § and the equation -x+y = x+y we get another situation.

if we cannot freely use subtraction and addition on both sides of the equation AND we do not have commutativity, associativity and distributivity on either hand ... well it's hard to take algebraic steps in a proof !

i'm not trying to shoot this idea down, but i'm pointing out the issues.

on the other hand , the definition of -1 * §1 seems key here.

if we take -1 * §1 to be §1 , then the equation

-x+y = x + y

reduces to

-x = x

if we subtract y on both sides, and then

-x = x has the solutions § (c^2) for any real c.

that seems consistent HOWEVER

if we take

-x +y = x+y

and add x on both sides , we get

y = 2x + y

now subtract y on both sides

0 = 2x

and now x can only be 0 and not the other solution arrived at earlier !!

HENCE it seems that there is no satisfactory solution to -1 * §1 UNLESS WE ALSO DEFINE A 4TH SIGN

&1 = - §1 ( or §-1 )


but that has issues too

3+§4 = §1

§1+&4 = &3

thus (3 + §4) + &4 = &3 however 3+(§4+&4) = 3 + 0 = 3

one CANNOT DEFINE &3 as 3 ( and hence &1 = +1 ) because we get the same with

-3+§4 = §1

§1+&4 = &3

thus (-3 + §4) + &4 = &3 however -3+(§4+&4) = -3 + 0 = -3

and then we must get &3 = -3, but we just defined &3 = 3 above !!

SINCE we cannot identify & with any existing sign ( &1 =/= -1 , &1 =/= 1 , &1 =/= §1 , &1 =/= 0 ), we are left with the fact that if we accept & we lose associativity for addition !

( recall that (-3 + §4) + &4 = &3 however -3+(§4+&4) = -3 + 0 = -3, so we lost associativity of addition )
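
to make the two bracketings explicit, a small sketch in code (the representation and the rule that any two different signs cancel against each other are my reading of the computations above, not an official definition):

```python
def add(x, y):
    """two-term addition over four signs '+', '-', '§', '&', with numbers given as
    (sign, magnitude) pairs and magnitude 0 meaning zero.
    assumption: equal signs add their magnitudes; any two different signs cancel,
    the larger magnitude wins and keeps its sign (this reproduces 3+§4 = §1,
    §1+&4 = &3 and §4+&4 = 0 from above)."""
    (sx, mx), (sy, my) = x, y
    if my == 0:
        return x
    if mx == 0:
        return y
    if sx == sy:
        return (sx, mx + my)
    if mx > my:
        return (sx, mx - my)
    if my > mx:
        return (sy, my - mx)
    return ('+', 0)              # equal magnitudes with different signs cancel to zero

print(add(add(('+', 3), ('§', 4)), ('&', 4)))   # ('&', 3)  =  (3 + §4) + &4
print(add(('+', 3), add(('§', 4), ('&', 4))))   # ('+', 3)  =   3 + (§4 + &4)
print(add(add(('-', 3), ('§', 4)), ('&', 4)))   # ('&', 3)  =  (-3 + §4) + &4
print(add(('-', 3), add(('§', 4), ('&', 4))))   # ('-', 3)  =  -3 + (§4 + &4)
```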

since we have &1 = -1 * §1 and -1 * -1 * §1 = 1*§1 = §1

and §1*§1 = §1

we are forced to have &1^2 = (-1)^2 * (§1)^2 = §1

and &1 * §1 = - §1 = &1 which is ... undesired. ( since §1 =/= 1 )

---

issues/problems or no properties, that seems to be the dilemma ...

hence my lack of enthusiasm, and i think the reason this has not been introduced into math.

good luck anyway.

regards

tommy1729
#7
considering ( associative ) extensions however , mathematicians would never extend q ( q >1 ) units to q+1 units.

and an odd prime number of units is not an extension itself, because a prime has no nontrivial divisors.

see once again examples such as quaternions , multiplication tables etc ( galois is overkill )

if we have a number a1 x1 + a2 x2 + ... + an xn and we add a new sign x(n+1), then we also need to define x1*x(n+1), x2*x(n+1), ... and those need to be distinct, linearly independent, and nonzero; hence we get a multiplicative number of units -> not a prime and not n+1.

lagrange's theorem from group theory, basically. ( i assume everyone knows lagrange's theorem in group theory )

considering that, and since we want 3 signs, it seems we should stop trying to extend the normal + and - and start with 3 totally new signs !!

the addition remains the same, but i propose that the multiplication table makes them act like the 3 roots of unity.
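
for example, if the three signs multiply like the cube roots of unity 1, w and w^2 (with w^3 = 1), the table would look like this (the letter names A, B, C are just placeholders):

```python
SIGNS = ['A', 'B', 'C']                     # stand-ins for 1, w, w^2

def mul_sign(s, t):
    # multiplying powers of w adds the exponents modulo 3
    return SIGNS[(SIGNS.index(s) + SIGNS.index(t)) % 3]

for s in SIGNS:
    print(' '.join(f"{s}*{t}={mul_sign(s, t)}" for t in SIGNS))
# A*A=A A*B=B A*C=C
# B*A=B B*B=C B*C=A
# C*A=C C*B=A C*C=B
```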

regards

tommy1729
#8
(08/22/2012, 11:04 AM)tommy1729 Wrote: considering that, and since we want 3 signs, it seems we should stop trying to extend the normal + and - and start with 3 totally new signs !!

the addition remains the same, but i propose that the multiplication table makes them act like the 3 roots of unity.

That sounds like a cool idea. Maybe the problem isn't §, but the existing definitions for + and -. They are coherent on their own, but not neatly extendable.
#9
one of the main applications of numbers is being able to solve polynomial equations where the coefficients are those numbers.

since all multiplications and additions are defined we could try to solve 2nd and 3rd degree polynomials.

but a polynomial equation of degree n can have more than n solutions over non-complex number systems, although at most n over the real numbers.

also we use properties like unique factorization , distributivity etc for solving polynomials in C.

the situation here is different.

2 questions come up naturally.

1) what properties do we have and lack with these +,-,§ numbers ?

2) how many distinct solutions does an equation of degree n maximally have, as a function of n ?

this is just for self-consistency and basics; questions like usefulness and such are of course not yet considered.

it is interesting to note that degree 1 equations can have 2 solutions.

i'm guessing that an equation of degree n has at most 2n distinct solutions, but that's a wild quick guess.

regards

tommy1729
#10
What exactly do you mean by the 3 roots of unity, and what would the multiplication table look like?

Maybe it would be interesting to plot a graph of a function with those numbers, but how would we do that?
For the "number line" we could perhaps just use 3 directions with an angle of 60° between them? But what do we do to plot X against Y?
Or we could use multiple plots.