
Kinds of Numbers

Natural numbers, integers, rational numbers, real algebraic numbers, computable real numbers, and definable real numbers all have cardinality aleph-0, the countable cardinality. That means that there is the same number of all of them, however counterintuitive it may seem.
It's counterintuitive because it's not true in the ordinary sense of "have the same number of elements".

Start counting them and then come back and tell me the answer once you're done.

Progress in the sciences, logic and mathematics has involved re-defining common notions. We're just no longer talking about the same things. No wonder then.
EB

What is the ordinary sense of "have the same number of elements"? Be precise...
I think he already told you: start counting them and then come back and tell me the answer once you're done. So the answer to "Do the natural numbers and the rational numbers have the same number of elements?" is exactly as undefined as "Zero goes into one how many times?". There has to be something deeply deranged about mathematicians, to call an infinite set "countable". :thinking:
Thanks. Sometimes it's good to have confirmation that what you say is not as difficult to understand as some want to suggest.

To be fair with mathematicians, the whole business of dealing with the infinite met with fierce opposition and dismay at the time. Something perhaps younger mathematicians are now happy to ignore or dismiss.
EB
 
Natural numbers, integers, rational numbers, real algebraic numbers, computable real numbers, and definable real numbers all have cardinality aleph-0, the countable cardinality. That means that there is the same number of all of them, however counterintuitive it may seem.
It's counterintuitive because it's not true in the ordinary sense of "have the same number of elements".

Start counting them and then come back and tell me the answer once you're done.

Progress in the sciences, logic and mathematics has involved re-defining common notions. We're just no longer talking about the same things. No wonder then.
EB

What is the ordinary sense of "have the same number of elements"? Be precise...
I think he already told you: start counting them and then come back and tell me the answer once you're done. So the answer to "Do the natural numbers and the rational numbers have the same number of elements?" is exactly as undefined as "Zero goes into one how many times?". There has to be something deeply deranged about mathematicians, to call an infinite set "countable". :thinking:

The entire point of mathematics is to get correct answers while being as lazy as possible. Don't want to repeatedly add? Multiply. Don't want to repeatedly multiply? Exponentiate. I don't feel like counting every single element, so here's a bijection that will do it for me - oh, and it extends to more situations than the one I invented it for? Awesome, less work for me.

Why anyone would insist on doing it the long way is beyond me. :cool:
Nobody is asking you that. Just don't pretend it's the same process and act surprised that people don't like it. Counterintuitive my ass.
EB
 
Natural numbers, integers, rational numbers, real algebraic numbers, computable real numbers, and definable real numbers all have cardinality aleph-0, the countable cardinality. That means that there is the same number of all of them, however counterintuitive it may seem.
It's counterintuitive because it's not true in the ordinary sense of "have the same number of elements".

Start counting them and then come back and tell me the answer once you're done.

Progress in the sciences, logic and mathematics has involved re-defining common notions. We're just no longer talking about the same things. No wonder then.
EB

What is the ordinary sense of "have the same number of elements"? Be precise...
You know, counting on your fingers, that sort of thing.
EB

What is it about this board that "Be precise" is met with "You know, ..., that sort of thing." as an answer?

Natural numbers, integers, rational numbers, real algebraic numbers, computable real numbers, and definable real numbers all have cardinality aleph-0, the countable cardinality. That means that there is the same number of all of them, however counterintuitive it may seem.
It's counterintuitive because it's not true in the ordinary sense of "have the same number of elements".

Start counting them and then come back and tell me the answer once you're done.

Progress in the sciences, logic and mathematics has involved re-defining common notions. We're just no longer talking about the same things. No wonder then.
EB

What is the ordinary sense of "have the same number of elements"? Be precise...
I think he already told you: start counting them and then come back and tell me the answer once you're done. So the answer to "Do the natural numbers and the rational numbers have the same number of elements?" is exactly as undefined as "Zero goes into one how many times?". There has to be something deeply deranged about mathematicians, to call an infinite set "countable". :thinking:

The entire point of mathematics is to get correct answers while being as lazy as possible. Don't want to repeatedly add? Multiply. Don't want to repeatedly multiply? Exponentiate. I don't feel like counting every single element, so here's a bijection that will do it for me - oh, and it extends to more situations than the one I invented it for? Awesome, less work for me.

Why anyone would insist on doing it the long way is beyond me. :cool:
Nobody is asking you that. Just don't pretend it's the same process and act surprised that people don't like it. Counterintuitive my ass.
EB

You can't explain precisely what counting is, but you know what it isn't? My, my...
 
Juma said:
Take two sets of more than 1E100 apples and count them. Tell me the answer once you're done.
There again you are trying to make me waste my time. You're caught red-handed.
EB

So that is your catch phrase when you have no argument? 'Cause I called your bluff: the time it takes doesn't matter.
Go find two sets of more than 10^100 apples, and once you're done, challenge SP to count them. Hey, just to make it as fair as possible I won't even insist that the two sets be disjoint. Looks to me like you're the one who's bluffing. :devil:
 
Natural numbers, integers, rational numbers, real algebraic numbers, computable real numbers, and definable real numbers all have cardinality aleph-0, the countable cardinality. That means that there is the same number of all of them, however counterintuitive it may seem.
It's counterintuitive because it's not true in the ordinary sense of "have the same number of elements".

Start counting them and then come back and tell me the answer once you're done.

Progress in the sciences, logic and mathematics has involved re-defining common notions. We're just no longer talking about the same things. No wonder then.
EB

What is the ordinary sense of "have the same number of elements"? Be precise...
You know, counting on your fingers, that sort of thing.
EB

What is it about this board that "Be precise" is met with "You know, ..., that sort of thing." as an answer?

Natural numbers, integers, rational numbers, real algebraic numbers, computable real numbers, and definable real numbers all have cardinality aleph-0, the countable cardinality. That means that there is the same number of all of them, however counterintuitive it may seem.
It's counterintuitive because it's not true in the ordinary sense of "have the same number of elements".

Start counting them and then come back and tell me the answer once you're done.

Progress in the sciences, logic and mathematics has involved re-defining common notions. We're just no longer talking about the same things. No wonder then.
EB

What is the ordinary sense of "have the same number of elements"? Be precise...
I think he already told you: start counting them and then come back and tell me the answer once you're done. So the answer to "Do the natural numbers and the rational numbers have the same number of elements?" is exactly as undefined as "Zero goes into one how many times?". There has to be something deeply deranged about mathematicians, to call an infinite set "countable". :thinking:

The entire point of mathematics is to get correct answers while being as lazy as possible. Don't want to repeatedly add? Multiply. Don't want to repeatedly multiply? Exponentiate. I don't feel like counting every single element, so here's a bijection that will do it for me - oh, and it extends to more situations than the one I invented it for? Awesome, less work for me.

Why anyone would insist on doing it the long way is beyond me. :cool:
Nobody is asking you that. Just don't pretend it's the same process and act surprised that people don't like it. Counterintuitive my ass.
EB

You can't explain precisely what counting is, but you know what it isn't? My, my...
Well, if you can't understand the process of counting on one's fingers, I will have to give up on you. Kids do it and humanity arguably started with it. What's wrong with you?
EB
 
There again you are trying to make me waste my time. You're caught red-handed.
EB

So that is your catch phrase when you have no argument? 'Cause I called your bluff: the time it takes doesn't matter.
That's right, the time it takes doesn't matter. But counting takes time and at some point you stop. How long would it take to count to infinity?
Waster.
EB
 
Well, if you can't understand the process of counting on one's fingers, I will have to give up on you. Kids do it and humanity arguably started with it. What's wrong with you?
EB

People can do lots of things even when they don't understand the underlying process. I'm fairly certain you have very little understanding of the specifics of how your posts appear on my screen, but that keeps happening anyway.

In case you weren't counting, this is the second time I'm asking to see if you understand the underlying process behind counting. You haven't given me much hope for your understanding thus far. Hint: It has very little to do with fingers.

Take some time and think about it. And this time, see if you can try to actually be precise.
 
Counting on one's fingers is a simple form of one-to-one correspondence: placing the entities that one is counting in 121C with one's fingers.

Also, in mathematics, it's irrelevant how long one needs to do something.

Mathematicians had some vehement controversies over mathematical infinities in the late 19th and early 20th centuries, but those controversies seem to have died down. The controversies were, as far as I can tell, controversies about the legitimacy of various operations involving infinite sets. Georg Cantor was rather loudly criticized by his colleague Leopold Kronecker for violating what LK considered mathematical propriety. LK believed that the only real mathematical objects were those that could be constructed in a finite number of steps from the natural numbers, and GC's infinite sets violated that principle.

Countably infinite sets are a mild violation of that principle, since each one of their elements can be constructed in a finite number of operations. That is evidently true for the natural numbers, and with 121C rules, that is also true for every other countably-infinite set.
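As an illustration of such a 121C rule, here is a minimal Python sketch, not from any poster in the thread (the names nat_to_int, int_to_nat and cantor_pair are just this example's choices): an explicit bijection between the naturals and the integers, and Cantor's pairing function, the usual first step in showing that the rationals are countable.

Code:
# Sketch: explicit one-to-one correspondences (121C) witnessing countability.
# nat_to_int enumerates the integers as 0, 1, -1, 2, -2, ...
def nat_to_int(n):
    return (n + 1) // 2 if n % 2 else -(n // 2)

def int_to_nat(z):
    return 2 * z - 1 if z > 0 else -2 * z

# Cantor's pairing function: a bijection from pairs of naturals to naturals.
def cantor_pair(i, j):
    return (i + j) * (i + j + 1) // 2 + j

# Round trips and distinctness on a small range -- a spot check, not a proof.
assert all(int_to_nat(nat_to_int(n)) == n for n in range(100))
assert len({cantor_pair(i, j) for i in range(20) for j in range(20)}) == 400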

Uncountably infinite sets are a stronger violation, since they contain some elements that cannot be constructed in a finite number of operations.

BTW, "countable" is often used as shorthand for "countably infinite".
 
I'll now prove some properties of natural-number operations, using Peano's axioms.

Addition is associative -- independent of grouping

We consider two cases, the zero case, ((a+b)+0) = (a+(b+0)) and the successor case, ((a+b)+c) = (a+(b+c)) implies ((a+b)+S(c)) = (a+(b+S(c))). Mathematical induction then does the rest.

For the zero case, use the addition zero axiom on both sides. That's very easy.

For the successor case, use the addition successor axiom on both sides. That goes as follows:
Left: ((a+b)+S(c)) = S((a+b)+c)
Right: (a+(b+S(c))) = (a+S(b+c)) = S(a+(b+c))
By hypothesis, the successor arguments are equal, and from Peano's axioms, the successor values are equal. That proves the successor case.

Addition has a two-sided identity -- 0 on both sides

The addition zero axiom says that 0 is a right identity, but is it also a left identity? Here again, we use mathematical induction, considering a zero case and a successor case.

The zero case is 0 + 0 = 0, and the successor case is 0 + a = a implies 0 + S(a) = S(a).

The zero case is easy, and the successor case has 0 + S(a) = S(0 + a) = S(a) by hypothesis.

Addition is commutative -- independent of order

Here again, we consider a zero case and a successor case, and then use mathematical induction. The zero case, a + 0 = 0 + a, is easy, since 0 was just shown to be a two-sided additive identity. The successor case is (a+b) = (b+a) implies (a+S(b)) = (S(b)+a)

For the successor case, we first note that the successor operation can be expressed as addition:
S(a) = S(a+0) = a + S(0) = a + 1, where 1 = S(0)

We also must prove that addition of 1 is commutative, since we will be using that result later on.
Here again, we start with a zero case and a successor case, and use mathematical induction. The zero case is 1 + 0 = 0 + 1 and the successor case is 1 + a = a + 1 implies 1 + S(a) = S(a) + 1

The zero case is easy, and the successor case goes:
Left side: 1 + S(a) = S(1 + a) = S(a + 1) = S(a + S(0)) = S(S(a+0)) = S(S(a))
Right side: S(a) + 1 = S(a) + S(0) = S(S(a) + 0) = S(S(a))

Back to the general successor case,
Left side: a + S(b) = a + (b + 1) = (a + b) + 1
Right side: S(b) + a = (b + 1) + a = b + (1 + a) = b + (a + 1) = (b + a) + 1 = (a + b) + 1
using succession as adding 1, associativity, and addition of 1 being commutative.
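A minimal Python sketch of this machinery, not from anyone's post (the nested-tuple encoding and the names S, add, from_int are just for illustration): addition defined by nothing but the zero and successor axioms, and the properties above spot-checked on small numbers. A spot check is not a proof; the induction arguments above are what make the properties hold for all naturals.

Code:
# Sketch: Peano-style naturals, with addition defined only by the two axioms
#   a + 0    = a            (addition zero axiom)
#   a + S(b) = S(a + b)     (addition successor axiom)
# Numbers are represented as nested tuples: ZERO, S(ZERO), S(S(ZERO)), ...

ZERO = ()

def S(n):                      # successor
    return (n,)

def add(a, b):
    if b == ZERO:              # a + 0 = a
        return a
    return S(add(a, b[0]))     # a + S(c) = S(a + c)

def from_int(k):               # helper: build S(S(...S(ZERO)...))
    n = ZERO
    for _ in range(k):
        n = S(n)
    return n

# Spot-check associativity, the two-sided identity, and commutativity.
nums = [from_int(k) for k in range(5)]
for a in nums:
    assert add(a, ZERO) == a and add(ZERO, a) == a
    for b in nums:
        assert add(a, b) == add(b, a)
        for c in nums:
            assert add(add(a, b), c) == add(a, add(b, c))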
 
Now the corresponding properties of multiplication.

Multiplication has a two-sided zero -- 0 is the zero of multiplication on both sides.

The multiplication zero axiom establishes that 0 is a right zero. The proof that 0 is a left zero proceeds by mathematical induction, with a zero case, 0*0 = 0, and a successor case, 0*a = 0 implies 0*S(a) = 0.

The zero case is easy; one uses the multiplication zero axiom. The successor case is more involved.
0*S(a) = 0 + 0*a = 0*a
using the multiplication successor axiom

Multiplication has a two-sided identity -- 1 is the multiplicative identity on both sides.

For 1 being a right identity, one uses the multiplication successor axiom: a*S(0) = a + a*0 or a*1 = a
For 1 being a left identity, one does both a zero case and a successor case, and then does mathematical induction. The zero case is 1*0 = 0, and the successor case is 1*a = a implies 1*S(a) = S(a)

For the successor case, 1*S(a) = 1 + 1*a = 1 + a = S(a)
using the multiplication successor axiom again

Multiplication is distributive over addition -- on both sides, right and left

First left distributivity: a*(b + c) = a*b + a*c. Here also, we set up a zero case and a successor case, and use mathematical induction. The zero case is a*(b + 0) = a*b + a*0, and it is very easy. The successor case is a*(b+c) = a*b+a*c implies a*(b+S(c)) = a*b+a*S(c).

For the successor case,
The left side: a*(b+S(c)) = a*S(b+c) = a + a*(b+c) = a + a*b + a*c
The right side: a*b + a*S(c) = a*b + a + a*c = a + a*b + a*c

Now right distributivity: (a+b)*c = a*c + b*c. A zero case and a successor case again, and mathematical induction. The zero case is (a+b)*0 = a*0 + b*0 and it is very easy. The successor case is (a+b)*c = a*c+b*c implies (a+b)*S(c) = a*S(c) + b*S(c).

For the successor case,
The left side: (a+b)*S(c) = a + b + (a+b)*c = a + b + a*c + b*c
The right side: a*S(c) + b*S(c) = a + a*c + b + b*c = a + b + a*c + b*c

Multiplication is associative -- independent of grouping

Here again, we set up a zero case and a successor case, and do mathematical induction. The zero case is (a*b)*0 = a*(b*0), and it is easy. The successor case is (a*b)*c = a*(b*c) implies (a*b)*S(c) = a*(b*S(c)).

For the successor case,
The left side: (a*b)*S(c) = a*b + (a*b)*c
The right side: a*(b*S(c)) = a*(b + b*c) = a*b + a*(b*c) = a*b + (a*b)*c
using the distributive property and the induction hypothesis

Multiplication is commutative -- independent of order

Yet another zero case and a successor case, with mathematical induction. The zero case is a*0 = 0*a, and is easy. The successor case is a*b = b*a implies a*S(b) = S(b)*a

The successor case:
Left side: a*S(b) = a + a*b
Right side: S(b)*a = (b+1)*a = b*a + a = a + b*a = a + a*b
using succession as addition, the distributive property, addition commutation, and the induction hypothesis
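Continuing the same illustrative sketch (again mine, with assumed names), multiplication defined only by its zero and successor axioms, with the properties just proved spot-checked on small numbers:

Code:
# Sketch: Peano-style multiplication, defined only by the two axioms
#   a * 0    = 0             (multiplication zero axiom)
#   a * S(b) = a + a * b     (multiplication successor axiom)

ZERO = ()
def S(n): return (n,)

def add(a, b):
    return a if b == ZERO else S(add(a, b[0]))

def mul(a, b):
    if b == ZERO:                    # a * 0 = 0
        return ZERO
    return add(a, mul(a, b[0]))      # a * S(c) = a + a * c

def from_int(k):
    n = ZERO
    for _ in range(k):
        n = S(n)
    return n

ONE = S(ZERO)
nums = [from_int(k) for k in range(4)]
for a in nums:
    assert mul(a, ZERO) == ZERO and mul(ZERO, a) == ZERO          # two-sided zero
    assert mul(a, ONE) == a and mul(ONE, a) == a                  # two-sided identity
    for b in nums:
        assert mul(a, b) == mul(b, a)                             # commutativity
        for c in nums:
            assert mul(a, add(b, c)) == add(mul(a, b), mul(a, c)) # distributivity
            assert mul(mul(a, b), c) == mul(a, mul(b, c))         # associativity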
 
You can't explain precisely what counting is, but you know what it isn't? My, my...
You say that like there's something abnormal about that sort of situation. I can't explain precisely what correct grammar, but I know what it not. :D
 
Now for ordering of natural numbers.

Stability under addition

a <= b implies a+c <= b+c

a+x = b
Add c to both sides:
(a+c) + x = (b+c)

Stability under multiplication

a <= b implies a*c <= b*c

a+x = b
(a+x)*c = b*c
a*c + x*c = b*c
x*c is also a natural number

Transitivity

a <= b and b <= c implies a <= c

a+x = b and b+y = c
a+x+y = c
x+y is also a natural number

Antisymmetry

a <= b and b <= a implies a = b

a+x = b and b+y = a
a+x+y = a
x+y = 0
Since a successor is never 0, x + y = 0 forces both x and y to be 0, and thus a = b

Reflexivity

a <= a

a+x = a
Satisfied for x = 0, a natural number
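And a sketch of the ordering as defined here, a <= b iff there is some natural x with a + x = b, spot-checked the same way (the bounded search inside leq is only adequate for this toy check):

Code:
# Sketch: the ordering on Peano-style naturals, spot-checked on small numbers.
ZERO = ()
def S(n): return (n,)
def add(a, b): return a if b == ZERO else S(add(a, b[0]))
def mul(a, b): return ZERO if b == ZERO else add(a, mul(a, b[0]))
def from_int(k):
    n = ZERO
    for _ in range(k):
        n = S(n)
    return n

nums = [from_int(k) for k in range(5)]

def leq(a, b):
    # "there is some x with a + x = b" -- searched over the small test range only
    return any(add(a, x) == b for x in nums)

for a in nums[:3]:
    assert leq(a, a)                                     # reflexivity
    for b in nums[:3]:
        if leq(a, b) and leq(b, a):
            assert a == b                                # antisymmetry
        for c in nums[:3]:
            if leq(a, b) and leq(b, c):
                assert leq(a, c)                         # transitivity
            if leq(a, b):
                assert leq(add(a, c), add(b, c))         # stability under addition
                assert leq(mul(a, c), mul(b, c))         # stability under multiplication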
 
Counting on one's fingers is a simple form of one-to-one correspondence: placing the entities that one is counting in 121C with one's fingers.

... The controversies were, as far as I can tell, controversies about the legitimacy of various operations involving infinite sets. Georg Cantor was rather loudly criticized by his colleague Leopold Kronecker for violating what LK considered mathematical propriety.
That's not the controversy here, though -- nobody's denying that finding a one-to-one correspondence is a legitimate method of counting a set. The point is that it isn't counting in the ordinary sense -- it isn't what nonmathematicians mean when they say "count". When we say we've counted the rationals and there are aleph-0 of them we're extending the concept, every bit as much as when we call "-1" a number.

Counting on one's fingers is much more than a simple form of one-to-one correspondence. Suppose you count these things on your fingers:

A B C D E

What you get when you place them into 121C with fingers is a collection of labels, with each letter having a different finger as its label. There are 10! / 5! ways to do that. Even assuming you always use up your fingers in the same order there are 5! ways to do it, 120, about 2^7. When you finish creating your 121C you've accumulated 7 bits of information. When you report the result of counting the set you only deliver 3 bits: 101. That lossy information compression step is part of what ordinary people mean by "count" -- if somebody asks you to count them and you reply with "A-little D-ring B-middle E-index C-thumb", you're going to get a puzzled look.
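A quick check of the arithmetic in that paragraph, in Python:

Code:
from math import factorial, log2

print(factorial(10) // factorial(5))  # 30240 ways to label 5 items with 10 distinct fingers
print(factorial(5))                   # 120 ways if the fingers are used in a fixed order, about 2**7
print(log2(factorial(5)))             # ~6.9 bits accumulated while building the 121C
print(format(5, "b"))                 # '101' -- the 3 bits actually reported when you "count"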

In common-usage "counting", the compression step is to simply report the last correspondence in your 121C and throw away the rest of the information. What number had you counted up to when you ran out of elements? That's why having a last element is an essential part of counting in the ordinary sense. It's great that Cantor et al. figured out that we don't actually need that step and can open up a new world of interesting math using just the 121C; but let's not forget that when we made that conceptual advance we were also making a conceptual shift, and changing the language.
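That "report the last correspondence" reading of ordinary counting is easy to state as a procedure; a sketch (the function name is hypothetical, chosen for this example):

Code:
def ordinary_count(items):
    """Counting in the everyday sense: build the 121C between items and 1, 2, 3, ...,
    throw it away, and report only the last label reached."""
    last = 0
    for label, _ in enumerate(items, start=1):  # the 121C, one label per element
        last = label                            # keep only the most recent correspondence
    return last                                 # the lossy compression step

print(ordinary_count(["A", "B", "C", "D", "E"]))  # 5
# For an infinite iterable the loop never finishes -- the procedure gives no answer,
# which is the sense in which an infinite set is not "countable" in ordinary usage.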

BTW, "countable" is often used as shorthand for "countably infinite".
Where "countably" is simply an assertion that the set is countable. It is in our sense; it isn't in ordinary people's sense. Consider that a set doesn't even need to be infinite for people to call it "countless".
 
You can't explain precisely what counting is, but you know what it isn't? My, my...
You say that like there's something abnormal about that sort of situation. I can't explain precisely what correct grammar, but I know what it not. :D

I never implied that it was rare. In fact, it's depressingly common - and it signifies the lack of a clear definition...
 
Counting on one's fingers is much more than a simple form of one-to-one correspondence. ...
(on all the possible 121C functions for counting on one's fingers...)

That's a separate issue. What's relevant in set theory is whether or not a 121C is possible. One example suffices to demonstrate possibility, no matter how many others may be possible.

BTW, "countable" is often used as shorthand for "countably infinite".
Where "countably" is simply an assertion that the set is countable. It is in our sense; it isn't in ordinary people's sense. Consider that a set doesn't even need to be infinite for people to call it "countless".
So "countable" ought to mean "finite"?
 
(on all the possible 121C functions for counting on one's fingers...)

That's a separate issue. What's relevant in set theory is whether or not a 121C is possible. One example suffices to demonstrate possibility, no matter how many others may be possible.
Quite so. And we use "count" to mean "demonstrate possibility", while normal people use "count" to mean "construct one and tell me the last element".

Where "countably" is simply an assertion that the set is countable. It is in our sense; it isn't in ordinary people's sense. Consider that a set doesn't even need to be infinite for people to call it "countless".
So "countable" ought to mean "finite"?
What am I, a prescriptivist? I'm simply observing that there are two linguistic communities speaking different dialects. :eating_popcorn:
 
For the integers, one can use the subtraction construction to prove a variety of properties about them.

Addition and multiplication both have commutative, associative, and distributive properties, with the additive and multiplicative identities being {0,0} and {1,0} and the multiplicative zero being {0,0}. These properties are easy to prove from the definitions of addition and multiplication for integers.

Integers also have additive inverses: reverse the order in the pair. {a,b} + {b,a} = {a+b,a+b} = {0,0}
We usually write the additive-inverse operation as a unary - operation: - {a,b}.
Subtraction, a - b, is thus a + (-b).
The usual way of writing {a,0} is a, and the usual way of writing {0,a} is -a.

It's easy to show that - {a,b} = {a,b} * {0,1}.

Let's now consider ordering properties. {a,b} <= {c,d} is true if there is some natural number x such that a+d+x = b+c.

One gets stability under addition rather easily, but stability under multiplication is only true some of the time. It's easy to show that if a <= b, then if c >= 0,
a*c <= b*c
otherwise if c <= 0,
a*c >= b*c

The reflexivity, antisymmetry, and transitivity properties also hold.
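A Python sketch of the subtraction construction (the operation rules below are the usual ones for this construction, assumed here; the pair (a, b) stands for a - b, and two pairs are equal when a + d = b + c):

Code:
# Sketch: integers as pairs of naturals under the subtraction construction.
#   (a, b) + (c, d) = (a + c, b + d)
#   (a, b) * (c, d) = (a*c + b*d, a*d + b*c)

def eq(p, q):
    (a, b), (c, d) = p, q
    return a + d == b + c

def add(p, q):
    (a, b), (c, d) = p, q
    return (a + c, b + d)

def mul(p, q):
    (a, b), (c, d) = p, q
    return (a*c + b*d, a*d + b*c)

def neg(p):                      # additive inverse: reverse the pair
    a, b = p
    return (b, a)

def leq(p, q):                   # (a,b) <= (c,d) iff some natural x has a+d+x = b+c
    (a, b), (c, d) = p, q
    return a + d <= b + c

# Spot checks on small pairs of naturals
pairs = [(a, b) for a in range(3) for b in range(3)]
for p in pairs:
    assert eq(add(p, neg(p)), (0, 0))            # p + (-p) = 0
    assert eq(mul(p, (0, 1)), neg(p))            # -p = p * (0,1), since (0,1) stands for -1
    for q in pairs:
        assert eq(add(p, q), add(q, p)) and eq(mul(p, q), mul(q, p))
        if leq(p, q):
            for r in pairs:
                assert leq(add(p, r), add(q, r))  # stability under addition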
 
Likewise, for the rational numbers, one can use the division construction to prove a variety of properties about them.

Addition and multiplication both have commutative, associative, and distributive properties, with the additive and multiplicative identities being {0,1} and {1,1} and the multiplicative zero being {0,1}. Ordering works like ordering for integers, though it is most conveniently done by subtraction and a positive/negative/zero test. That can also be done for ordering of integers.

Much like for subtraction being defined in terms of addition and additive inverses (negation), division can be defined in terms of multiplication and multiplicative inverses (reciprocals).

The multiplicative inverse (reciprocal) of {a,b} is {b,a}. {a,b}*{b,a} = {a*b,a*b} = {1,1}
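Similarly, a sketch of the division construction (the addition and multiplication rules below are the usual ones for fractions, assumed here; the pair (a, b) with b nonzero stands for a/b), cross-checked against Python's Fraction:

Code:
# Sketch: rationals as pairs under the division construction.
#   (a, b) + (c, d) = (a*d + b*c, b*d)
#   (a, b) * (c, d) = (a*c, b*d)
# Two pairs are equal when a*d == b*c.
from fractions import Fraction   # used only to cross-check the sketch

def eq(p, q):
    (a, b), (c, d) = p, q
    return a * d == b * c

def add(p, q):
    (a, b), (c, d) = p, q
    return (a*d + b*c, b*d)

def mul(p, q):
    (a, b), (c, d) = p, q
    return (a*c, b*d)

def recip(p):                    # multiplicative inverse: reverse the pair (numerator nonzero)
    a, b = p
    return (b, a)

p, q = (2, 3), (5, 7)            # 2/3 and 5/7
assert eq(mul(p, recip(p)), (1, 1))                            # p * (1/p) = 1
assert Fraction(*add(p, q)) == Fraction(2, 3) + Fraction(5, 7)
assert Fraction(*mul(p, q)) == Fraction(2, 3) * Fraction(5, 7)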


For real numbers, we use Cauchy sequences of rational numbers. One does arithmetic term-by-term, though one has to be careful to avoid dividing by 0. One can define ordering by subtracting, then checking whether the result is positive, zero, or negative. Zero is easy to test for -- does the sequence converge to 0? Being positive can be tested for by seeing whether there is some positive rational ε and some n such that a(k) > ε for all k >= n, for sequence a. Being negative is parallel, but with a(k) < -ε instead.

One can also use Cauchy sequences of real algebraic numbers instead of rational ones, but rational ones are easier to define.
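A sketch of that term-by-term arithmetic, with a real number represented as a function from an index to a rational approximation (the names const and sqrt2, and the Newton-iteration sequence, are just this example's choices):

Code:
# Sketch: reals as Cauchy sequences of rationals, with term-by-term arithmetic.
from fractions import Fraction

def const(q):                        # a rational q as a constant Cauchy sequence
    return lambda n: Fraction(q)

def add(x, y):
    return lambda n: x(n) + y(n)

def mul(x, y):
    return lambda n: x(n) * y(n)

def sqrt2(n):
    """Newton iteration from 1: a Cauchy sequence of rationals converging to sqrt(2)."""
    a = Fraction(1)
    for _ in range(n):
        a = (a + 2 / a) / 2
    return a

# sqrt(2) * sqrt(2) - 2 is a sequence converging to 0,
# which is the test for the product being equal to the rational 2.
diff = add(mul(sqrt2, sqrt2), const(-2))
for n in range(2, 7):
    assert abs(diff(n)) < abs(diff(n - 1))   # the difference keeps shrinking
print(float(diff(6)))                         # ~0: the two sequences represent the same real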
 
So that is your catch phrase when you have no argument? 'Cause I called your bluff: the time it takes doesn't matter.
That's right, the time it takes doesn't matter. But counting takes time and at some point you stop. How long would it take to count to infinity?
Waster.
EB

It would take you the same time as counting 1E100 apples: the rest of your life, and you would still not be finished.

The point, which obviously was too fine to be noticed, is that this has nothing to do with the time-consuming human act of counting real objects.

The reason why there is a notion "countable" is that there is an opposite notion "uncountable" for sets where each pair of elements always has at least one element between them (as, for example, the real numbers).
 