• Welcome to the new Internet Infidels Discussion Board, formerly Talk Freethought.

Regarding Cantor's Diagonal Argument

You said that you can always add 1 to any k; then I said that k+1 is also a natural number, which conveniently is also how many elements there are.

Yes, and in what way do you think that this contradicts my argument?

k+1 is going to be a natural number. That natural number is also how many elements are in the set.
 
I see that you smuggled this in here. Are you sure about this?

My argument is not necessarily about the set N being finite. It's about it not having an infinite number of elements. Maybe there needs to be a third option, like an undefined set or an unending finite set.

Now, the proof. Please pay attention to the structure and syntax of the proof and contrast it with your posts. I don't assume existence or properties for the objects I define. Every step is explained, using definitions (I left some unstated, but can fill those in if necessary). In the future, you should strive to have your mathematical arguments follow the same pattern.

This is a proof by contradiction - I will assume that the set of natural numbers is finite, that is that there is a bijection f : N -> {1,2,3,...,n} for some number n (since 1 is a natural number, I can discount the empty case). Now, I want to use that assumption to show a contradiction, thereby proving that the set of natural numbers is infinite.

Specifically, look at the bijection f : N -> {1,2,3,...,n}. Since it is a bijection, it is one-to-one and onto. Now, I will define a new function g : {1,2,...,n+1} -> {1,2,3,...,n} by taking g(x) = f(x) for every x in {1,2,...,n+1}. g is called the restriction of f to {1,2,...,n+1}.

First off, g is well-defined - for every number 1,2,...,n+1, f is defined and thus g is defined. Furthermore, g is one-to-one because f is one-to-one, that is, if g(x) = g(y) then f(x) = f(y) and so x = y. Now, g is not necessarily onto because we have restricted the domain of the onto function f, but that's OK, we only need one-to-one-ness.

So we have a function g : {1,2,...,n+1} -> {1,2,...,n} that we have proved to be one-to-one. But the pigeonhole principle states that since {1,2,...,n+1} has more elements than {1,2,...,n}, g can't be one-to-one. This is a contradiction. Therefore, the assumption that N was finite is false. That is, the set N of natural numbers is infinite.
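As a brute-force sanity check (my own addition, not part of the formal proof), the pigeonhole step can be verified computationally for small n: enumerate every function from {1,...,n+1} to {1,...,n} and confirm that none is one-to-one.

```python
# Brute-force check of the pigeonhole principle for small n:
# no function g : {1,...,n+1} -> {1,...,n} is one-to-one.
from itertools import product

def has_injection(m, n):
    # Enumerate every function g : {1,...,m} -> {1,...,n} as the tuple
    # (g(1), ..., g(m)); g is one-to-one iff its values are all distinct.
    return any(len(set(g)) == m for g in product(range(1, n + 1), repeat=m))

for n in range(1, 5):
    assert not has_injection(n + 1, n)  # pigeonhole: no injection exists
    assert has_injection(n, n)          # the identity works when sizes match
```

Of course this only checks finitely many cases; the principle itself is an axiom, not something we prove by exhaustion.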

Yes, I certainly know this is true in today's mathematics. In one of my calculus courses we had to study a proof similar to this one, except it was that no real number can be an upper bound for N.

Anyways, the whole point of this discussion is to investigate the internal mathematical logic of one of its theories, namely Cantor's diagonal argument.
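Since the diagonal argument itself keeps coming up, here is a small illustration of the diagonal construction (my own sketch on a finite sample; the actual argument applies to an infinite list of infinite sequences):

```python
# Diagonal construction on a finite sample: build a 0/1 sequence that
# differs from the i-th listed sequence in its i-th position.
def diagonal(rows):
    return [1 - rows[i][i] for i in range(len(rows))]

rows = [
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 1],
]
d = diagonal(rows)    # [1, 0, 1, 0]
assert d not in rows  # d differs from every listed row
```

By construction d disagrees with row i at position i, so it cannot equal any row in the list, no matter what list you started with.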

How, exactly, were you planning on talking about infinite sets if you can't even recognize the definition of an infinite set? This is the kind of shit I've been talking about. You're just woefully under-prepared to think about this topic but insist on doing it anyway. It's exasperating as all hell...
You must understand what I am doing here before your head explodes, so calm down and pay close attention.

In the beginning of this thread, you and Angra Manyu helped me understand the diagonal argument. I was going to leave for good, but then this recent issue popped up. Now I am taking a stab at trying to understand this.

Maybe the reason why you are frustrated is because you are used to teaching in a clear and orderly way. The objective is clear and the process that your students follow is well defined. But in the messy world outside of academia, there is no specific path for me to take. I start at the problem and work my way down instead of starting at the bottom and working my way up to the problem. Indeed, this is not what you are used to.

It may not be a terrible idea to go outside of a well-structured process and practice finding one in the chaos of the real world.
 
Yes, and in what way do you think that this contradicts my argument?

k+1 is going to be a natural number. That natural number is also how many elements are in the set.

Yes. Every set, k, k+1, k+2, ..., k+24553565468, ... etc. will have a finite number of elements.
BUT
None of these sets contains ALL natural numbers.
 
k+1 is going to be a natural number. That natural number is also how many elements are in the set.

Yes. Every set, k, k+1, k+2, ..., k+24553565468, ... etc. will have a finite number of elements.
BUT
None of these sets contains ALL natural numbers.

Let's think about what you said, "for each natural number k there is a bigger number k+1". Does "each" mean for every n?

Or did you mean that if k is an element of some set S, as well as 1 and k+1, then S has the unending property of the set of natural numbers?
 
Let's think about what you said, "for each natural number k there is a bigger number k+1". Does "each" mean for every n?
No, for every k. That's what I wrote.

Or did you mean that if k is an element of some set S, as well as 1 and k+1, then S has the unending property of the set of natural numbers?

You're overdoing it.

When I write "for each natural number k there is a bigger number k+1" I mean exactly what I write. That is:

for every natural number k you pick there is always at least one natural number that is bigger, for example k+1.
 
Okay, I think I know what's going on now. I was looking through my old notes from a class I took a while ago, http://www.math.ualberta.ca/~bowman/m117/m117.pdf and found an explanation of all of this on page 31.

Summary:

(Please someone correct me if my translation of the notes is wrong)

First the notes inductively define the set of natural numbers, and then they use the Archimedean principle (axiom of Archimedes) with that definition to show a contradiction.

First, to inductively define the set N, we start with 1 ∈ N.

If 1 ∈ N
and if k ∈ N then k + 1 ∈ N,
then N is the set of the naturals.

Then the axiom of Archimedes uses a proof by contradiction based on the above definition. Assume there is a least upper bound for N that we will call b; so b = l.u.b N.

b ≥ k, for any k ∈ N.

b - 1 < k, for k = b (this line is my own interpretation)

b < k + 1

But we defined N to be that if k ∈ N, then k + 1 ∈ N.

Therefore b is not the l.u.b. of N.

So it all goes back to what we defined the set of N to be.

... hmmm ...
 
You must understand what I am doing here before your head explodes, so calm down and pay close attention.

In the beginning of this thread, you and Angra Manyu helped me understand the diagonal argument. I was going to leave for good, but then this recent issue popped up. Now I am taking a stab at trying to understand this.

Maybe the reason why you are frustrated is because you are used to teaching in a clear and orderly way. The objective is clear and the process that your students follow is well defined. But in the messy world outside of academia, there is no specific path for me to take. I start at the problem and work my way down instead of starting at the bottom and working my way up to the problem. Indeed, this is not what you are used to.

It may not be a terrible idea to go outside of a well-structured process and practice finding one in the chaos of the real world.

To be honest, I seriously doubt you understand the diagonal argument. I don't think you really understand what infinite sets or the natural numbers are either and it doesn't seem like you realize it because you think that this is a reasonable response to my post:

I think I've made my position clear on why I think you're having difficulty with this topic, so here's a last-ditch attempt to help you. Others have pointed out where your argument fails and if you don't understand the problems they have shown with your claim, then you need to go back and think about it. I won't repeat the same work. Instead, I will take a different tack - I will prove the opposite claim, one that most people just intuitively accept, that there are infinitely many natural numbers. Only one of us can be right, at least if you accept that the axioms underlying all of mathematics are consistent.

Here is a proof that there are infinitely many natural numbers. Fundamentally, it is a consequence of the principle of mathematical induction, a property so intrinsic to the concept of number that it is one of the fundamental axioms used to define them. I will actually use an equivalent principle that is more intuitive and easier to state - the pigeonhole principle. The pigeonhole principle is "if we place n + 1 objects into n categories then at least one category contains at least 2 objects". Restated in terms of functions, it is "If A and B are finite sets where A has more elements than B, then there is no one-to-one function from A to B." I want to emphasize that this principle is how we DEFINE the natural numbers, so it is essentially inviolate. It should also be really intuitively obvious, like good axioms are, but we don't accept it simply because it is intuitive.

First, the definition: A non-empty set S is finite if and only if there is a bijection between S and {1,2,3,...,n} for some number n in N. In that case we say that S has n elements. For the edge case, we'll call the empty set finite as well. An infinite set is a set that is not finite.

I see that you smuggled this in here. Are you sure about this?

My argument is not necessarily about the set N being finite. It's about it not having an infinite number of elements. Maybe there needs to be a third option, like an undefined set or an unending finite set.
 
I see that you smuggled this in here. Are you sure about this?

My argument is not necessarily about the set N being finite. It's about it not having an infinite number of elements. Maybe there needs to be a third option, like an undefined set or an unending finite set.

Now, the proof. Please pay attention to the structure and syntax of the proof and contrast it with your posts. I don't assume existence or properties for the objects I define. Every step is explained, using definitions (I left some unstated, but can fill those in if necessary). In the future, you should strive to have your mathematical arguments follow the same pattern.

This is a proof by contradiction - I will assume that the set of natural numbers is finite, that is that there is a bijection f : N -> {1,2,3,...,n} for some number n (since 1 is a natural number, I can discount the empty case). Now, I want to use that assumption to show a contradiction, thereby proving that the set of natural numbers is infinite.

Specifically, look at the bijection f : N -> {1,2,3,...,n}. Since it is a bijection, it is one-to-one and onto. Now, I will define a new function g : {1,2,...,n+1} -> {1,2,3,...,n} by taking g(x) = f(x) for every x in {1,2,...,n+1}. g is called the restriction of f to {1,2,...,n+1}.

First off, g is well-defined - for every number 1,2,...,n+1, f is defined and thus g is defined. Furthermore, g is one-to-one because f is one-to-one, that is, if g(x) = g(y) then f(x) = f(y) and so x = y. Now, g is not necessarily onto because we have restricted the domain of the onto function f, but that's OK, we only need one-to-one-ness.

So we have a function g : {1,2,...,n+1} -> {1,2,...,n} that we have proved to be one-to-one. But the pigeonhole principle states that since {1,2,...,n+1} has more elements than {1,2,...,n}, g can't be one-to-one. This is a contradiction. Therefore, the assumption that N was finite is false. That is, the set N of natural numbers is infinite.

Yes, I certainly know this is true in today's mathematics. In one of my calculus courses we had to study a proof similar to this one, except it was that no real number can be an upper bound for N.

Anyways, the whole point of this discussion is to investigate the internal mathematical logic of one of its theories, namely Cantor's diagonal argument.

How, exactly, were you planning on talking about infinite sets if you can't even recognize the definition of an infinite set? This is the kind of shit I've been talking about. You're just woefully under-prepared to think about this topic but insist on doing it anyway. It's exasperating as all hell...
You must understand what I am doing here before your head explodes, so calm down and pay close attention.

In the beginning of this thread, you and Angra Manyu helped me understand the diagonal argument. I was going to leave for good, but then this recent issue popped up. Now I am taking a stab at trying to understand this.

Maybe the reason why you are frustrated is because you are used to teaching in a clear and orderly way. The objective is clear and the process that your students follow is well defined. But in the messy world outside of academia, there is no specific path for me to take. I start at the problem and work my way down instead of starting at the bottom and working my way up to the problem. Indeed, this is not what you are used to.

It may not be a terrible idea to go outside of a well-structured process and practice finding one in the chaos of the real world.

Whether it is a terrible idea or not depends on your objective - if you wish to achieve a better understanding of reality, then it IS a terrible idea.

There are many, many ways to be wrong; and only a relatively small number of ways to be right. Finding the right answers in a structured way is difficult. Finding the right answers in an unstructured way is practically impossible - and even if you got lucky, without a structured approach, you can never know for sure whether you actually are right.

It is the same basic difference between planning to get rich by setting up a small business, and slowly building it up until you are wealthy; and planning to get rich by buying a lottery ticket every week until you win the jackpot.

It's not impossible to win the lottery; but if you go around every week crowing to all your rich friends that this week you will have more money than them, and then getting all snitty when they point out that your ticket probably doesn't have the winning numbers on it, then you shouldn't be surprised when they stop being kind and humouring you, and start telling you to come back AFTER you have checked your ticket.
 
I don't think it's crazy to assume that I could have understood this, seeing as I have taken the course that covers this topic and the one after it too.

But I did learn that my curiosity was more about how the fundamental logic of math leads to the diagonal argument than the argument itself.
 
I don't think it's crazy to assume that I could have understood this, seeing as I have taken the course that covers this topic and the one after it too.

But I did learn that my curiosity was more about how the fundamental logic of math leads to the diagonal argument than the argument itself.

How is it going? Do you agree with my post now?
 
I don't think it's crazy to assume that I could have understood this, seeing as I have taken the course that covers this topic and the one after it too.

But I did learn that my curiosity was more about how the fundamental logic of math leads to the diagonal argument than the argument itself.

How is it going? Do you agree with my post now?

Your post is just the surface of a much deeper area of mathematics. I am just going to chew on all of this for a while.
 
How is it going? Do you agree with my post now?

Your post is just the surface of a much deeper area of mathematics.

Eh. Bullshit. Do you agree with my post or not? If you don't understand it then be honest and say so.

I am just going to chew on all of this for a while.

Let me offer some honest advice:
Your biggest problem is being honest with yourself. To have any chance of understanding stuff like this, you must let your ego go completely and acknowledge any sign that you don't really understand.

You seem to have glossed over way too much and haven't taken these concepts to heart.
 
Your post is just the surface of a much deeper area of mathematics.

Eh. Bullshit. Do you agree with my post or not? If you don't understand it then be honest and say so.

Your post as it is seems quite obvious. But I am finding that there is a lot that supports your post. I am interested in the more fundamental stuff.
I am just going to chew on all of this for a while.

Let me offer some honest advice:
Your biggest problem is being honest with yourself. To have any chance of understanding stuff like this, you must let your ego go completely and acknowledge any sign that you don't really understand.

You seem to have glossed over way too much and haven't taken these concepts to heart.

I accepted it in university when I "learnt" it. But now I am questioning what I had to accept to get through the course.
 
Eh. Bullshit. Do you agree with my post or not? If you don't understand it then be honest and say so.

Your post as it is seems quite obvious. But I am finding that there is a lot that supports your post. I am interested in the more fundamental stuff.
I am just going to chew on all of this for a while.

Let me offer some honest advice:
Your biggest problem is being honest with yourself. To have any chance of understanding stuff like this, you must let your ego go completely and acknowledge any sign that you don't really understand.

You seem to have glossed over way too much and haven't taken these concepts to heart.

I accepted it in university when I "learnt" it. But now I am questioning what I had to accept to get through the course.

"Accepting"? Accepting is the worst thing you can do. Accepting something means that you agree without really understanding it.
 
Your post as it is seems quite obvious. But I am finding that there is a lot that supports your post. I am interested in the more fundamental stuff.
I am just going to chew on all of this for a while.

Let me offer some honest advice:
Your biggest problem is being honest with yourself. To have any chance of understanding stuff like this, you must let your ego go completely and acknowledge any sign that you don't really understand.

You seem to have glossed over way too much and haven't taken these concepts to heart.

I accepted it in university when I "learnt" it. But now I am questioning what I had to accept to get through the course.

"Accepting"? Accepting is the worst thing you can do. Accepting something means that you agree without really understanding it.

Well there is only so much time during a semester. And the deeper understanding wasn't even mentioned in the material.

As for your post, I don't think that's a proper proof. The proof that I summed up from my notes seems to be a very common way to prove that N is not finite.

But I still have some issues with the proof that I have to think more about.
 
(Please someone correct me if my translation of the notes is wrong)

First the notes inductively define the set of natural numbers, and then they use the Archimedean principle (axiom of Archimedes) with that definition to show a contradiction.

First, to inductively define the set N, we start with 1 ∈ N.

If 1 ∈ N
and if k ∈ N then k + 1 ∈ N,
then N is the set of the naturals.

This is not sufficient to define the naturals. 1 is in Z, and if k ∈ Z then k + 1 ∈ Z, but Z is not N. 1 is in R and if k ∈ R then k + 1 ∈ R but R is not N.

The statement of the principle of mathematical induction is:

If 1 is in K and if k ∈ K implies that k + 1 ∈ K then K contains all the natural numbers, so N = {1,2,...} ⊆ K.

Then the axiom of Archimedes uses a proof by contradiction based on the above definition. Assume there is a least upper bound for N that we will call b; so b = l.u.b N.

Least upper bounds are not guaranteed for all sets, in general you must argue that b exists, not simply assume it. Also, we generally want to specify which set the least upper bound comes from. N? R?

b ≥ k, for any k ∈ N.

b - 1 < k, for k = b (this line is my own interpretation)

b < k + 1

But we defined N to be that if k ∈ N, then k + 1 ∈ N.

Therefore b is not the l.u.b. of N.

So it all goes back to what we defined the set of N to be.

... hmmm ...

The standard proof that I think you're trying for is:

Since b is the least upper bound, nothing less than b is an upper bound, so there exists a k such that k > b - 1. That means that for that k, k + 1 > b. But b is an upper bound for N and k + 1 is in N, so k + 1 > b ≥ k + 1, which is a contradiction.
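As a purely numeric illustration (my own sketch, not part of the proof), the Archimedean property says that any real b offered as an upper bound for N is already beaten by a natural number, for example floor(b) + 1:

```python
# For any real b proposed as an upper bound for N, the natural number
# floor(b) + 1 already exceeds it, so no real number can bound N.
import math

def natural_above(b):
    # The smallest obvious natural number strictly greater than b
    # (clamped to 1 so the result is always a natural number).
    return max(math.floor(b) + 1, 1)

for b in [0.5, 7.0, 24553565468.9]:
    k = natural_above(b)
    assert k > b  # b fails to be an upper bound
```

This is just the contradiction in the proof made concrete: whatever b you pick, a k in N with k > b can be exhibited explicitly.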
 
This is not sufficient to define the naturals. 1 is in Z, and if k ∈ Z then k + 1 ∈ Z, but Z is not N. 1 is in R and if k ∈ R then k + 1 ∈ R but R is not N.

The statement of the principle of mathematical induction is:

If 1 is in K and if k ∈ K implies that k + 1 ∈ K then K contains all the natural numbers, so N = {1,2,...} ⊆ K.

Then the axiom of Archimedes uses a proof by contradiction based on the above definition. Assume there is a least upper bound for N that we will call b; so b = l.u.b N.

Least upper bounds are not guaranteed for all sets, in general you must argue that b exists, not simply assume it.
I don't know what you mean here because we are trying to show that b can't exist. Why would we argue for its existence?

Also, we generally want to specify which set the least upper bound comes from. N? R?

Yeah, I should have included that b ∈ R.
b ≥ k, for any k ∈ N.

b - 1 < k, for k = b (this line is my own interpretation)

b < k + 1

But we defined N to be that if k ∈ N, then k + 1 ∈ N.

Therefore b is not the l.u.b. of N.

So it all goes back to what we defined the set of N to be.

... hmmm ...

The standard proof that I think you're trying for is:

Since b is the least upper bound, nothing less than b is an upper bound, so there exists a k such that k > b - 1. That means that for that k, k + 1 > b. But b is an upper bound for N and k + 1 is in N, so k + 1 > b ≥ k + 1, which is a contradiction.

But what about my interpretation? Is there anything wrong with it?
 
Then the axiom of Archimedes uses a proof by contradiction based on the above definition. Assume there is a least upper bound for N that we will call b; so b = l.u.b N.

Least upper bounds are not guaranteed for all sets, in general you must argue that b exists, not simply assume it.
I don't know what you mean here because we are trying to show that b can't exist. Why would we argue for its existence?

A set can be bounded yet still not have a least upper bound (for example, in the rationals, the set of all q with q² < 2 is bounded but has no least upper bound in Q). Showing that b cannot exist does not mean that N is unbounded unless you complete the argument.

Also, we generally want to specify which set the least upper bound comes from. N? R?

Yeah, I should have included that b ∈ R.

Then you can complete the above argument using the least upper bound property of R. Of course, it is still possibly circular and certainly overkill to use properties of the reals to prove properties of the naturals...

b ≥ k, for any k ∈ N.

b - 1 < k, for k = b (this line is my own interpretation)

b < k + 1

But we defined N to be that if k ∈ N, then k + 1 ∈ N.

Therefore b is not the l.u.b. of N.

So it all goes back to what we defined the set of N to be.

... hmmm ...

The standard proof that I think you're trying for is:

Since b is the least upper bound, nothing less than b is an upper bound, so there exists a k such that k > b - 1. That means that for that k, k + 1 > b. But b is an upper bound for N and k + 1 is in N, so k + 1 > b ≥ k + 1, which is a contradiction.

But what about my interpretation? Is there anything wrong with it?

The main issue with what you wrote is that because you are finding the least upper bound in R, you need an argument that b is a natural number in order to choose k = b.
 