
Reasons to disbelieve the Axiom of Choice

The other botch I made was presenting the setup as though each prisoner sees a finite number of hats instead of an infinite number.

~ ~ ~ ~ ~ ~ ~
@ Jarhyn — I did notice that you were pointed at the solution, but you didn't articulate it completely. And you insisted on randomness, which is irrelevant to the puzzle.

I think you were arguing that a solution is impossible, which it is (without the Axiom of Choice). But David Hilbert said "Mathematics is a game played according to certain simple rules with meaningless marks on paper." Play the game! :cool:
Well that's the thing, Swammerdami, I don't think randomness is unrelated to the problem at all, for the reasons I articulated. I think that the AC has a fundamental assumption built into it about the nature of solvability, randomness, and the information derivable from infinities implied by this equivalence class that you describe.

The AC essentially posits that randomness is contradictory and thus all sequences expressed have a finite process description, even if the sequence itself is infinite, and in fact it can only be resolved if the sequence is infinite!

Either the prisoners or the wardens have an impossible task, and it all hinges on whether you can express infinite randomness.

Edit: and as I think about it, this would equate to whether the set theory allows expression of U as a concept: a description of all descriptions would need to describe itself...

Largely, this is seen as a contradiction in rigorous constructions, as I understand it: that U is a made-up nonsense idea, and that if it can be expressed it means a trivialization of the axioms?

So I would reject that randomness can possibly be expressed, and the prisoners actually do win, and the AC holds because an assumption of randomness means the experiment is nonsense.
 
The other botch I made was presenting the setup as though each prisoner sees a finite number of hats instead of an infinite number.

~ ~ ~ ~ ~ ~ ~
@ Jarhyn — I did notice that you were pointed at the solution, but you didn't articulate it completely. And you insisted on randomness, which is irrelevant to the puzzle.

I think you were arguing that a solution is impossible, which it is (without the Axiom of Choice). But David Hilbert said "Mathematics is a game played according to certain simple rules with meaningless marks on paper." Play the game! :cool:
Without rules there is no game. Define the rules exactly and you will know if the game is playable and winnable.
 
Mathematically for an infinite number of possibilities the probability of any one occurrence is zero.

For a continuous probability distribution of real numbers the probability of any finite number is zero. The probability is defined as the limit of the integral between a and b as b - a goes to zero.

The rules of the probability game.
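
A minimal numeric sketch of that last definition (my own example, using a standard normal density, not anything from the thread): the probability assigned to an interval around a point shrinks toward zero as the interval does.

Code:
import math

def normal_pdf(x):
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def prob_within(a, h, steps=10000):
    # crude Riemann sum of the density over [a - h, a + h]
    dx = 2.0 * h / steps
    return sum(normal_pdf(a - h + (i + 0.5) * dx) for i in range(steps)) * dx

for h in (1.0, 0.1, 0.01, 0.001):
    print(h, prob_within(0.0, h))   # shrinks toward 0 as h -> 0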
 
Mathematically for an infinite number of possibilities the probability of any one occurrence is zero.

For a continuous probability distribution of real numbers the probability of any finite number is zero. The probability is defined as the limit of the integral between a and b as b - a goes to zero.

The rules of the probability game.
Yes, but you also need to supply the distributed objects to ask the question. You need to supply f(a...b) to calculate the probability within the distribution. While you can leave it variable, it has to at least be defined in all cases, and math cannot define randomness.

You need to at least be able to express f(x).
 
Mathematically for an infinite number of possibilities the probability of any one occurrence is zero.
That's not the case for a countable infinity, i.e. if the number of possibilities is the same as the number of integers. The probabilities can all be nonzero. For example, the probability can be 50% for the first possibility, 25% for the second, 12.5% for the third, and so forth. The infinite number of finite probabilities must all add up to 100%, which won't be the case if they're all zero.

But for a continuous probability distribution over an uncountable infinity of possibilities, yes, you're correct.
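
A quick sketch of that distribution in Python (the helper name is mine): the partial sums of 1/2 + 1/4 + 1/8 + ... approach 1, and the distribution is easy to sample from with an ordinary coin.

Code:
partial = 0.0
for n in range(1, 31):
    partial += 0.5 ** n
print(partial)   # 0.99999999906..., approaching 1

import random

def sample_index():
    # flip a fair coin until it comes up heads; outcome n has probability 1/2**n
    n = 1
    while random.random() >= 0.5:
        n += 1
    return n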
 
The other botch I made was presenting the setup as though each prisoner sees a finite number of hats instead of an infinite number.
Oh, okay, sorry. I was discounting that one because you'd already corrected it upthread.

The prima facie hole in the argument is here: "During their consultations the night before, prisoners have agreed on a CHOICE function: For each equivalence class, they have assigned a particular representative sequence."

The Axiom of Choice guarantees that such a choice function exists; it doesn't guarantee that prisoners can all agree on one. In general there will be infinitely many choice functions for a prisoner to choose from among. By what mechanism will the prisoners be able to pick out one from all the possibilities and ensure that they're all going to use the same choice function when it comes to deciding what color to say their hats are?
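
To make the objection concrete, here is a toy sketch of my own (not part of the puzzle's setup): restrict attention to sequences that are eventually constant, written as a finite dictionary of exceptional positions plus a constant tail. For these, "differ in only finitely many places" is decidable and a representative can be written down explicitly, with no Axiom of Choice. The difficulty raised above is doing the same thing uniformly for arbitrary sequences, where no such explicit formula is available.

Code:
def value(seq, i):
    overrides, tail = seq
    return overrides.get(i, tail)

def equivalent(s1, s2):
    # two eventually-constant sequences differ at only finitely many places
    # exactly when their tail values agree
    return s1[1] == s2[1]

def representative(seq):
    # canonical choice: the constant sequence with the same tail
    return ({}, seq[1])

a = ({0: 1, 5: 1}, 0)   # 1,0,0,0,0,1,0,0,...
b = ({2: 1}, 0)         # 0,0,1,0,0,0,...
print([value(a, i) for i in range(8)])            # [1, 0, 0, 0, 0, 1, 0, 0]
print(equivalent(a, b))                           # True
print(representative(a) == representative(b))     # True: both classes pick 0,0,0,...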
 
Mathematically for an infinite number of possibilities the probability of any one occurrence is zero.
That's not the case for a countable infinity, i.e. if the number of possibilities is the same as the number of integers. The probabilities can all be nonzero. For example, the probability can be 50% for the first possibility, 25% for the second, 12.5% for the third, and so forth. The infinite number of finite probabilities must all add up to 100%, which won't be the case if they're all zero.

But for a continuous probability distribution over an uncountable infinity, yes, you're correct.
I looked at this a while back in a thread on infinities. I don't think a countable infinity means a finite representation of an infinite set.

Countable and infinite are mutually exclusive. Infinity is not a number and means uncountable.

You can manipulate infinite sets. I am not fluent in set notation.

a = {1.0, 2.0}, b = {3.0, 4.0}, and one might say c = a + b means c is twice the infinity of a or b. That does not mean the sets are countable.

Mathematics as well as science comes down to a physical test. To me, math that cannot in some way be tested in physical reality is an abstraction.

A company I worked for in the 80s sent me to reliability engineering training. As I got into setting up quality and reliability tests, I told the VP, a farmer from Oregon, that statistics does not give me a priori knowledge. You can't apply probabilities without an enumeration.


Adding to my last post: the definition of a probability distribution is that the area under the curve from minus to plus infinity must be 1. For a discrete distribution this simply means the probabilities of all events must sum to 1.

Tossing a die with an infinite number of faces can not have a probability.

You might be able to set up the problem as a limit of a sequence as it goes to infinity. You can test for convergence.

Whenever I was puzzled by a problem I set up a simulation.
 
Mathematically for an infinite number of possibilities the probability of any one occurrence is zero.
That's not the case for a countable infinity, i.e. if the number of possibilities is the same as the number of integers. The probabilities can all be nonzero. For example, the probability can be 50% for the first possibility, 25% for the second, 12.5% for the third, and so forth. The infinite number of finite probabilities must all add up to 100%, which won't be the case if they're all zero.

But for a continuous probability distribution over an uncountable infinity, yes, you're correct.
I looked at this a while back in a thread on infinities. I don't think a countable infinity means a finite representation of an infinite set.

Countable and infinite are mutually exclusive. Infinity is not a number and means uncountable.

You can manipulate infinite sets. I am not fluent in set notation.

a = {1.0, 2.0}, b = {3.0, 4.0}, and one might say c = a + b means c is twice the infinity of a or b. That does not mean the sets are countable.

Mathematics as well as science comes down to a physical test. To me, math that cannot in some way be tested in physical reality is an abstraction.

A company I worked for in the 80s sent me to reliability engineering training. As I got into setting up quality and reliability tests, I told the VP, a farmer from Oregon, that statistics does not give me a priori knowledge. You can't apply probabilities without an enumeration.


Adding to my last post: the definition of a probability distribution is that the area under the curve from minus to plus infinity must be 1. For a discrete distribution this simply means the probabilities of all events must sum to 1.

Tossing a die with an infinite number of faces can not have a probability.

You might be able to set up the problem as a limit of a sequence as it goes to infinity. You can test for convergence.

Whenever I was puzzled by a problem I set up a simulation.
Countable vs uncountable infinity is generally seen as the difference between a series of numbers that can each be enumerated one to the next in series, and a series in which this is impossible because there are things between each of the things, and you can always find another thing to set between any two.

You might wish indeed to say the set of odds is "half as big" as the set of all natural numbers, but it is actually the same size, and this has been found to be true between the sizes of all infinite partitions of the natural numbers.

The way I saw it first presented is to think of a house with infinite rooms. Every room is filled with an "odd" number.

Wish to add the evens? Well, just ask the guy in the second room to move over by one, and so on all the way down the line. There are infinite rooms, after all.

Now there's space to place 2 and count it. But the thing is, 2 is already in a room that you counted when counting all the odds. So you haven't used ANY more rooms counting 1, 2, 3 than counting 1, 3, 5.

It's the same size: countable infinity.

The lecture in which I first saw this treated was in fact a discussion on the axiom of choice!
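
A tiny sketch of that pairing idea (my own illustration): matching room n with the odd number 2n + 1 uses every room exactly once and reaches every odd number, which is what "same size" means here.

Code:
def room_to_odd(n):
    return 2 * n + 1

def odd_to_room(m):
    return (m - 1) // 2

for n in range(5):
    print(n, "<->", room_to_odd(n))
# 0 <-> 1, 1 <-> 3, 2 <-> 5, 3 <-> 7, 4 <-> 9 -- every room used, every odd reached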
 
Mathematically for an infinite number of possibilities the probability of any one occurrence is zero.
That's not the case for a countable infinity, i.e. if the number of possibilities is the same as the number of integers. The probabilities can all be nonzero. For example, the probability can be 50% for the first possibility, 25% for the second, 12.5% for the third, and so forth. The infinite number of finite probabilities must all add up to 100%, which won't be the case if they're all zero.

But for a continuous probability distribution over an uncountable infinity, yes, you're correct.
I looked at this a while back in a thread on infinities. I don't think a countable infinity means a finite representation of an infinite set.

Countable and infinite are mutually exclusive. Infinity is not a number and means uncountable.
I was using technical jargon instead of writing in plain English, sorry. Infinities come in a lot of different sizes. When mathematicians say "countable", what they mean is the smallest kind of infinity there is: an infinity of discrete members that's the same size as the set of integers. They're distinguishing that kind of infinity from all the other kinds of bigger infinities, such as a continuous infinity the same size as the set of real numbers. For historical reasons the smallest infinity is called "countable", but it's a stupid name that always causes confusion.

Tossing a die with an infinite number of faces can not have a probability.
True; but that just means a die is the wrong machine for selecting from an infinite number of discrete possibilities. Instead, use a "Wheel of Fortune" wheel with an infinite number of slots for the flipper-thing to end up in. Each slot has a finite nonzero probability proportional to the angle between its sides. Of course the slot angles can't all be equal. But if you install a slot of some width in the wheel, and then make the next slot take up part of the unused space, and go on in this way, with each slot taking up part of what's unused but never once using up all of it, then you can keep doing that an infinite number of times and end up with a wheel where each slot has a probability. (Never mind that most of the slots are smaller than an atom -- we're doing math here, not engineering. :) )
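
Here is a rough simulation of that wheel (my own sketch; the slot widths here are 1/2, 1/4, 1/8, ... of the circle). A uniform spin settles into some slot after finitely many comparisons, so every one of the infinitely many slots gets a nonzero share.

Code:
import random
from collections import Counter

def spin():
    angle = random.random() * 360.0          # uniform in [0, 360)
    slot, start, remaining = 1, 0.0, 360.0
    while True:
        width = remaining / 2.0              # each slot takes half of what's left
        if angle < start + width:
            return slot
        start += width
        remaining -= width
        slot += 1

print(Counter(spin() for _ in range(100000)).most_common(5))
# roughly 1/2 of spins land in slot 1, 1/4 in slot 2, 1/8 in slot 3, ...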
 
Countable vs uncountable infinity is generally seen as the difference between a series of numbers that can each be enumerated one to the next in series, and a series in which this is impossible because there are things between each of the things, and you can always find another thing to set between any two.
Having more things between any two things isn't what makes a set uncountable. Between any two rational numbers there are more rationals; for instance (A + B) / 2 is in between A and B. But the rational numbers are a countable infinity.
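
A two-line check of that midpoint fact with exact rational arithmetic (my own example values):

Code:
from fractions import Fraction

a, b = Fraction(1, 3), Fraction(1, 2)
mid = (a + b) / 2
print(mid, a < mid < b)   # 5/12 True -- the midpoint of two rationals is rational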
 
Countable vs uncountable infinity is generally seen as the difference between a series of numbers that can each be enumerated one to the next in series, and a series in which this is impossible because there are things between each of the things, and you can always find another thing to set between any two.
Having more things between any two things isn't what makes a set uncountable. Between any two rational numbers there are more rationals; for instance (A + B) / 2 is in between A and B. But the rational numbers are a countable infinity.
And this exercise is similar to setting aside the odd numbers for the evens, setting aside the 1's for the 1/2s, setting aside for the 2/3's and the 1/3's, etc, each also ending up the same set

To get to uncountable you need something that doesn't order as a count but rather as an evolving operation on a count, as per numbers like pi, where 3, into 1, into 4....

So you were right insofar as I described it badly.

We don't end up uncountable until irrational numbers start happening, numbers which are defined in terms of a recursive or internally iterative process.
 
And this exercise is similar to setting aside the odd numbers for the evens, setting aside the 1's for the 1/2s, setting aside for the 2/3's and the 1/3's, etc, each also ending up the same set
Yes, exactly.

To get to uncountable you need something that doesn't order as a count but rather as an evolving operation on a count, as per numbers like pi, where 3, into 1, into 4....

So you were right insofar as I described it badly.

We don't end up uncountable until irrational numbers start happening, numbers which are defined in terms of a recursive or internally iterative process.
Actually even that isn't enough to get to an uncountable set; that only gets you to the "computable numbers". You can still put numbers like pi and the square root of 2 in a discrete sequence, based on putting the programs that systematically generate their digits into alphabetical order. Since there are only a countable infinity of finite programs, there can only be a countable infinity of computable numbers. To get an uncountable set you need to include numbers that need infinitely long programs to generate their digits.
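
A sketch of that "programs in alphabetical order" idea (toy two-letter alphabet, helper name mine): listing all finite strings shortest-first, and alphabetically within each length, puts every finite program somewhere in a single discrete sequence.

Code:
from itertools import count, product

def all_programs(alphabet="ab"):
    # shortlex order: by length, then alphabetically within each length
    for length in count(1):
        for chars in product(alphabet, repeat=length):
            yield "".join(chars)

gen = all_programs()
print([next(gen) for _ in range(10)])
# ['a', 'b', 'aa', 'ab', 'ba', 'bb', 'aaa', 'aab', 'aba', 'abb']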
 
And this exercise is similar to setting aside the odd numbers for the evens, setting aside the 1's for the 1/2s, setting aside for the 2/3's and the 1/3's, etc, each also ending up the same set
Yes, exactly.

To get to uncountable you need something that doesn't order as a count but rather as an evolving operation on a count, as per numbers like pi, where 3, into 1, into 4....

So you were right insofar as I described it badly.

We don't end up uncountable until irrational numbers start happening, numbers which are defined in terms of a recursive or internally iterative process.
Actually even that isn't enough to get to an uncountable set; that only gets you to the "computable numbers". You can still put numbers like pi and the square root of 2 in a discrete sequence, based on putting the programs that systematically generate their digits into alphabetical order. Since there are only a countable infinity of finite programs, there can only be a countable infinity of computable numbers. To get an uncountable set you need to include numbers that need infinitely long programs to generate their digits.
If you need an infinitely long program to generate any digit of a number, is it a number?

A while back I looked into numbers that were not computable, and all the instances I saw indicated that everything considered such a number is along the lines of a universal constant.

The issue is that the geometry of the universe seems such that the properties they derive from are finite in nature, though very large.

This hinges, as I understand it, on the nature of spacetime expansion and the speed of light. At some point, if expansion continues accelerating, eventually each locality will be cleaved, but the sum total of "visibility expansion" will be finite right up until visibility starts ripping away, at which point causal influence shrinks to nothing as every last virtual particle is moving away from every other virtual particle at superluminal speeds.

This means that there will be a finite total of observation by any particle in all of its history from pop to rip.

As such, the granularity and the limitations of observation amount to a situation where the universe we experience is going to produce ultimately calculable states, were you to have a computer big enough to calculate on it.

Godel's Incompleteness Theorem says we can't ever actually do that work, since we can't get a big enough computer to accomplish it.

And this means that even universal constants with very precise values (or perhaps in this case particular constants, although there's little difference at that point) would be "calculable".

So there's some question, in fact, as to whether an incalculable number even exists.
 
The prima facie hole in the argument is here: "During their consultations the night before, prisoners have agreed on a CHOICE function: For each equivalence class, they have assigned a particular representative sequence."

The Axiom of Choice guarantees that such a choice function exists; it doesn't guarantee that prisoners can all agree on one. In general there will be infinitely many choice functions for a prisoner to choose from among. By what mechanism will the prisoners be able to pick out one from all the possibilities and ensure that they're all going to use the same choice function when it comes to deciding what color to say their hats are?

Is this objection to the mathematical abstraction, or to "real-world" limitations? For example, instead of an infinite number of prisoners, could we postulate a Hydra-like creature with a single brain to come up with the choice function, but that had to use his infinitely-many fire-walled brainlets during the guessing phase?

Or maybe you've hit on the fundamental flaw (or elusiveness) in the Axiom of Choice: It's one thing to say "There exists a choice function" and another thing to actually hold that function in your hands, or be able to write it down.

Having more things between any two things isn't what makes a set uncountable. Between any two rational numbers there are more rationals; for instance (A + B) / 2 is in between A and B. But the rational numbers are a countable infinity.
The rationals are proven to be countable if they can be mapped one-to-one to a subset of the natural numbers. An inelegant brute-force way is to map each a/b to 2^a · 3^b. (Multiply this by 5 to represent -a/b.) Insist that a/b be in lowest terms to avoid duplication.

I think that a similar approach can be used to show that each of the equivalence classes I introduced upthread is countable (though this demonstration uses AC at the outset). If 'a' is in the class whose chosen representative is 'b', then 'a' has a finite number of "misguesses" (m_1, m_2, ...), each of the form (i, j) where i and j are natural numbers: 'i' is the location of the misguess, 'j' is the wrong guess.

(i, j) can be represented with the integer m = 2^i · 3^j. 'a' itself can be denoted (given knowledge of its class's representative) by
2^(m_1) · 3^(m_2) · 5^(m_3) · 7^(m_4) · ...
That is, the k'th misguess is encoded as the exponent of the k'th prime.

If this is correct — and by now I'm giving myself at best a 30% chance — then it means we are using only a weak form of the Axiom of Choice in the proof upthread: We are imposing a choice function on an (uncountable) collection of COUNTABLE sets.
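
A small sketch of both encodings above (the helper names are mine); each is an injection into the positive integers, which is all that countability requires.

Code:
from fractions import Fraction

def encode_rational(q):
    # a/b in lowest terms -> 2^a * 3^b, times 5 if the rational is negative
    q = Fraction(q)
    n = 2 ** abs(q.numerator) * 3 ** q.denominator
    return n * 5 if q < 0 else n

def primes():
    # unbounded prime generator; trial division is fine for a sketch
    found, n = [], 2
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

def encode_misguesses(misguesses):
    # each misguess (i, j) becomes m = 2^i * 3^j;
    # the k'th such m is the exponent of the k'th prime
    code = 1
    for (i, j), p in zip(misguesses, primes()):
        code *= p ** (2 ** i * 3 ** j)
    return code

print(encode_rational(Fraction(-3, 4)))      # 2^3 * 3^4 * 5 = 3240
print(encode_misguesses([(1, 2), (0, 1)]))   # 2^18 * 3^3 = 7077888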
 
The prima facie hole in the argument is here: "During their consultations the night before, prisoners have agreed on a CHOICE function: For each equivalence class, they have assigned a particular representative sequence."

The Axiom of Choice guarantees that such a choice function exists; it doesn't guarantee that prisoners can all agree on one. In general there will be infinitely many choice functions for a prisoner to choose from among. By what mechanism will the prisoners be able to pick out one from all the possibilities and ensure that they're all going to use the same choice function when it comes to deciding what color to say their hats are?

Is this objection to the mathematical abstraction, or to "real-world" limitations? For example, instead of an infinite number of prisoners, could we postulate a Hydra-like creature with a single brain to come up with the choice function, but that had to use his infinitely-many fire-walled brainlets during the guessing phase?

Or maybe you've hit on the fundamental flaw (or elusiveness) in the Axiom of Choice: It's one thing to say "There exists a choice function" and another thing to actually hold that function in your hands, or be able to write it down.

Having more things between any two things isn't what makes a set uncountable. Between any two rational numbers there are more rationals; for instance (A + B) / 2 is in between A and B. But the rational numbers are a countable infinity.
The rationals are proven to be countable if they can be mapped one-to-one to a subset of the natural numbers. An inelegant brute-force way is to map each a/b to 2^a · 3^b. (Multiply this by 5 to represent -a/b.) Insist that a/b be in lowest terms to avoid duplication.

I think that a similar approach can be used to show that each of the equivalence classes I introduced upthread is countable (though this demonstration uses AC at the outset). If 'a' is in the class whose chosen representative is 'b', then 'a' has a finite number of "misguesses" (m_1, m_2, ...), each of the form (i, j) where i and j are natural numbers: 'i' is the location of the misguess, 'j' is the wrong guess.

(i, j) can be represented with the integer m = 2^i · 3^j. 'a' itself can be denoted (given knowledge of its class's representative) by
2^(m_1) · 3^(m_2) · 5^(m_3) · 7^(m_4) · ...
That is, the k'th misguess is encoded as the exponent of the k'th prime.

If this is correct — and by now I'm giving myself at best a 30% chance — then it means we are using only a weak form of the Axiom of Choice in the proof upthread: We are imposing a choice function on an (uncountable) collection of COUNTABLE sets.
And this is what I attack as well: that you declare there is a CHOICE function that can create an infinite series that satisfies your requirements for absolute randomness.

My postulate is that the prisoners don't need to even prearrange on the sequence they are looking for. Rather they have to formulate a strategy for effectively attacking infinite randomness until the strategy starts predicting values, and when it starts predicting with high enough statistical certainty, check self.

Because we have postulated a computer that can process infinite sets, we can have perfect certainty. Anything less and it would be a gamble over who halts at lower computational complexity, the guard or the prisoner.

In such a world zero prisoners die.

I'm not quite sure whether there exists a strategy to "lie" for every prisoner given a finite known halting point in check depth: to plan a sequence which, if they checked their solution to (their n)+x=N, will lie about their hat color 100% of the time, and if they checked their solution to N+x would have found the correct sequence for that N. Such that if prisoner 1 saw 2,3,4,6,7,8,9,0,1,2,3... and said "aha, it counts from 1" it wouldn't work because the algorithm cycles in such a way as to miss 1 and actually start with 0, in some epoch, so prisoner 1 dies... And so on such that 2, while seemingly apparent, is at some later stage implied to be a 1, but not in the epoch where it is 2... And so on?

And if so, what is the bound of this halting problem conflict?

There's a lot more interesting math in this than the pinwheel problem.

I really wish I could have drawn that solution that I saw out, and you would have seen how it was a proof...
 
Jarhyn — If I understand correctly, you are using the idea that Non-constructible numbers cannot be constructed. And that since the prison warden constructed the sequence 'b', it must be a constructible (computable) object. Is that basically the idea? Non-computable numbers do not exist?

I think this would put you in good company: The great mathematician 'Leopold Kronecker, one of the early critics of set theory was famously quoted as responding to a paper describing the properties of π by saying, roughly, "Why are you wasting your time studying something which doesn't exist?"' (Although π may be a bad example since it has finite Kolmogorov complexity.)

Although my work has been in simple applied math, and always with simple constructible objects, I guess I'm philosophically more in tune with David Hilbert. Mathematics is just a game; and we can make up our own rules!
 
Jarhyn — If I understand correctly, you are using the idea that Non-constructible numbers cannot be constructed. And that since the prison warden constructed the sequence 'b', it must be a constructible (computable) object. Is that basically the idea? Non-computable numbers do not exist?

I think this would put you in good company: The great mathematician 'Leopold Kronecker, one of the early critics of set theory was famously quoted as responding to a paper describing the properties of π by saying, roughly, "Why are you wasting your time studying something which doesn't exist?"' (Although π may be a bad example since it has finite Kolmogorov complexity.)

Although my work has been in simple applied math, and always with simple constructible objects, I guess I'm philosophically more in tune with David Hilbert. Mathematics is just a game; and we can make up our own rules!
Indeed we can make up our own rules, but we still have to be able to make those rules.
 
OK, thanks Jarhyn.

In the past some on the forum argued an infinity can be equated to a finite count.
Mathematically for an infinite number of possibilities the probability of any one occurrence is zero.
That's not the case for a countable infinity, i.e. if the number of possibilities is the same as the number of integers. The probabilities can all be nonzero. For example, the probability can be 50% for the first possibility, 25% for the second, 12.5% for the third, and so forth. The infinite number of finite probabilities must all add up to 100%, which won't be the case if they're all zero.

But for a continuous probability distribution over an uncountable infinity, yes, you're correct.
I looked at this a while back in a thread on infinities. I don't think a countable infinity means a finite representation of an infinite set.

Countable and infinite are mutually exclusive. Infinity is not a number and means uncountable.
I was using technical jargon instead of writing in plain English, sorry. Infinities come in a lot of different sizes. When mathematicians say "countable", what they mean is the smallest kind of infinity there is: an infinity of discrete members that's the same size as the set of integers. They're distinguishing that kind of infinity from all the other kinds of bigger infinities, such as a continuous infinity the same size as the set of real numbers. For historical reasons the smallest infinity is called "countable", but it's a stupid name that always causes confusion.

Tossing a die with an infinite number of faces can not have a probability.
True; but that just means a die is the wrong machine for selecting from an infinite number of discrete possibilities. Instead, use a "Wheel of Fortune" wheel with an infinite number of slots for the flipper-thing to end up in. Each slot has a finite nonzero probability proportional to the angle between its sides. Of course the slot angles can't all be equal. But if you install a slot of some width in the wheel, and then make the next slot take up part of the unused space, and go on in this way, with each slot taking up part of what's unused but never once using up all of it, then you can keep doing that an infinite number of times and end up with a wheel where each slot has a probability. (Never mind that most of the slots are smaller than an atom -- we're doing math here, not engineering. :) )
Thanks for that. I know little about number theory and set theory, but I do know statistics.

The crux of the problem seems to be assigning a probability to a sample from an infinite set. If you have a bucket of infinitely many objects you can reach in and pick one, but you can't assign a probability.

Put 90 red balls and 10 blue balls in a box. Shake and pick one. Put it back, shake, and pick again. Over time you will get blue balls 10% of the time. If you put an infinite number of red balls in the box and 10 blue, mathematically the probability is undefined.
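
A quick simulation of the finite version of that box (my own sketch):

Code:
import random

box = ["red"] * 90 + ["blue"] * 10
draws = 100000
blue = sum(random.choice(box) == "blue" for _ in range(draws))
print(blue / draws)   # about 0.10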

The probability of any one face of a die is 1/6. The total probability space is 6/6. 1/infinity has no meaning.

I think infinity is a bit like dividing by zero. Electronically we can add, subtract, multiply, and divide with analog and digital circuits. However there is no circuit that can divide by zero. There is also no circuit that can add, subtract, multiply, or divide with an infinity. There is no physical representation. In an analog divider, as the denominator goes to zero, infinity is asymptotically approached, limited by the circuit voltages.

The same as lim x->0 1/x asymptotically goes to infinity.

I'll invoke Turing machines. From the theory of computation, for a problem to be solvable it must be Turing computable. In today's terms that means coding it on a PC.


I'm not a good theoretical math person. If I were to approach the problem I would try to reduce it to an algorithm. My guess would be you will run up against an impossible math condition. Metaphysics does give me a headache :D.
 
My guess would be you will run up against an impossible math condition
My guess is that you are right, on account of my whole point being that if you can't reduce it to an algorithm such that it allows expression of any part of it, what is the basis for even proclaiming it is there? No part of it, by definition, allows expression! Which would mean including itself.
 
@Swammerdami it appears your discussion might touch on the question about the (widely considered nonsensical) hypothetical set U in math, "the set that contains all sets", as this touches on the equivalence class issue, and the idea of a number of such "infinite complexity"
 