Rational numbers == infinitely repeating sequences of digits

Species are categories based on pre-existing entities.

Value is not a pre-existing entity.

It is an imaginary entity.

It doesn't even matter whether species are more real than numbers (they aren't, but that's beside the point).

It's the whole point and your claim here is nonsense.

It is a religious belief.

A species is defined using pre-existing features.

A real number is pure definition. There is nothing but the definition.
 
Therefore, nothing follows; nothing that helps your case for sure. Thread remains closed until you learn to think and make arguments, or the internet ceases to operate. I don't know which will come first, but I have a gut feeling.
 
The thread is closed because you never had a point.

My point stands.

"3" is a symbol that represents a value within a predefined value scheme.

It is not any other value in that scheme.

And no other thing is the same value in that scheme.

Once you have defined a value scheme with operations there are infinite equations that can produce a given value.

They are not the same thing as the value.

The value is not the infinite equations that can produce it.
 
Trying to reinvent all of ontology, semiotics, and mathematics from scratch is an honorable endeavor. It is, however, also rather foolish.
It's like you're reinventing the wheel and haven't even gotten to the point where you realize that it, maybe, shouldn't be square.

Oh, yeah?

Well, if you're so clever, perhaps you can tell us all what colour it's meant to be.


;)
 
"3" is a symbol that represents a value within a predefined value scheme.

It is not any other value in that scheme.
That depends once again on what you mean by "it" and the scheme. If "3" is the digit, it's false: the "3" in "0.3" means three tenths. If by scheme you mean simply numeric representation, it can even mean three sixteenths, in "0.3" read as a hexadecimal string. Pretending you did your homework and defined your terms:
Yeah, what I said: that's part of what makes the scheme useful. Too much ambiguity is a big downer, but it's a property of the representational format, not of the symbol and not of the value.

In other words, this statement is either false or uncontroversial and boring depending on what the hell it is you meant to say.
And no other thing is the same value in that scheme.
Which orifice did you pull this from? A symbol's usefulness depends on its unambiguity, not on it lacking synonyms. The decimal notation is full of synonym pairs, fractional notation even more so.
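For what it's worth, Python's `fractions` module makes the synonym point concrete. A throwaway sketch, not anything from the thread:

```python
from fractions import Fraction

# Distinct strings within one notation scheme can name one and the same value.
assert Fraction("0.5") == Fraction("0.50")                   # decimal synonyms
assert Fraction(1, 3) == Fraction(2, 6) == Fraction(4, 12)   # fractional synonyms
print("synonym pairs name equal values")
```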
Once you have defined a value scheme with operations there are infinite equations that can produce a given value.

They are not the same thing as the value.

The value is not the infinite equations that can produce it.

You just admitted that 0.333... is 1/3. Since 1/3 is also 0.4 in duodecimal, and since the equality relation is transitive and symmetric, 0.333... has one and only one fixed value. QED
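A small Python sketch (the `expand` helper is mine, purely illustrative) shows the same value's digit expansion in three bases:

```python
from fractions import Fraction

def expand(frac, base, places):
    """First `places` digits of a fraction in [0, 1) in the given base."""
    digits, r = [], frac
    for _ in range(places):
        r *= base
        d = int(r)        # integer part is the next digit
        digits.append(d)
        r -= d
    return digits

third = Fraction(1, 3)
print(expand(third, 10, 6))  # [3, 3, 3, 3, 3, 3] -> 0.333... in decimal
print(expand(third, 12, 6))  # [4, 0, 0, 0, 0, 0] -> 0.4 in duodecimal
print(expand(third, 16, 6))  # [5, 5, 5, 5, 5, 5] -> 0.555... in hexadecimal
```

One value, three spellings.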
 
That depends once again on what you mean by "it" and the scheme.

How many times does it have to be explained to you?

The "it" is the one to one correspondence between a symbol in context and a specific value.

The "3" in "0.3" means three tenths.

Context.

And no other thing is the same value in that scheme.

Which orifice did you pull this from? A symbol's usefulness depends on its unambiguity, not on it lacking synonyms. The decimal notation is full of synonym pairs, fractional notation even more so.

What gives a symbol ambiguity is that the context it is within can be changed. But within its context a symbol is associated with only one value.

And of course if you change the value system a symbol will take on a new value within the new system.

You just admitted that 0.333... is 1/3.

Nope.

1 divided by 3 produces 0.333.... It is not the same thing as 0.333..... If they were the same thing then 0.333.... would also produce 1 divided by three, which is gibberish.

0.333.... produces nothing. As defined in context it does not have a final value within the value scheme.
 
How many times does it have to be explained to you?

The "it" is the one to one correspondence between a symbol in context and a specific value.
The one to one correspondence isn't a value, it's a mapping between a label and a value. Substituting it for your "it" gets you a sentence with every bit as much meaning as Chomsky's "colorless green ideas sleep furiously" example.
Context.

And no other thing is the same value in that scheme.

Which orifice did you pull this from? A symbol's usefulness depends on its unambiguity, not on it lacking synonyms. The decimal notation is full of synonym pairs, fractional notation even more so.

What gives a symbol ambiguity is that the context it is within can be changed. But within its context a symbol is associated with only one value.
Even if that were so, an absence of one-to-many mappings doesn't imply an absence of many-to-one mappings. Nothing in what you said precludes that several symbols within one scheme share a referent.
And of course if you change the value system a symbol will take on a new value within the new system.

You just admitted that 0.333... is 1/3.

Nope.

1 divided by 3 produces 0.333....
The convention among people who have passed 3rd year math (of primary school) is that the string "1/3" is ambiguous between referring to an operation and the number it produces. Since you just agreed that the operation produces 0.333..., it necessarily follows that "0.333..." interpreted as a numeral in decimal notation and "1/3" interpreted as a numeral in fractional notation refer to the same value.

Any properties of the value you want to derive from properties of the string will minimally have to make sense for both strings. That alone doesn't, however, guarantee that they make sense. Just like the accidental fact that "dog" and German "Hund" are both monosyllabic tells us nothing interesting about dogs.
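To put the dog/Hund point in code, a throwaway Python comparison of string properties versus the value named:

```python
from fractions import Fraction

a, b = "1/3", "4/12"
assert a != b                        # different strings ("names")
assert len(a) != len(b)              # string properties differ between the names
assert Fraction(a) == Fraction(b)    # yet both names refer to the same value
```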
 
The one to one correspondence isn't a value, it's a mapping between a label and a value.

It isn't a mapping. The values are not something that exist that can be mapped to. It is merely saying that "1" represents this thing called a value.

Then it goes on to say that "2" represents a value of 1 + 1 after the operation of "+" is defined.

And the naming of values goes from there.

"3" represents the value of 2 + 1. "4" represents the value of 3 + 1.

If a civilization has 0, which many did not, then 0 represents a value of 1 - 1.

To get the scheme of positive integers all you have done is said "1" represents a value and defined "+/-". The rest just flows from that.

And this has nothing to do with the real world.

In the real world if you have 6 stones and divide them by 2 you get 2 groups of 3 stones.

But in the imaginary world of values 6/2 = 3, just 3 not 2 groups of 3. Because it is imaginary.

The convention among people who have passed 3rd year math (of primary school) is that the string "1/3" is ambiguous between referring to an operation and the number it produces.

1/3 produces an endless string that has no final value. And you have nothing but an empty claim it doesn't.

I don't care what you think you heard in third grade.
 
The one to one correspondence isn't a value, it's a mapping between a label and a value.

It isn't a mapping. The values are not something that exist that can be mapped to.

How is existing a precondition for being mapped onto?

It is merely saying that 1 represents this thing called a value.

Then it goes on to say that "2" represents a value of 1 + 1 after the operation of "+" is defined.

And the naming of values goes from there.

"3" represents the value of 2 + 1. "4" represents the value of 3 + 1.

If a civilization has 0, which many did not, then 0 represents a value of 1 - 1.

To get the scheme of positive integers all you have done is said "1" represents a value and defined "+/-". The rest just flows from that.

We're getting somewhere :)
However, this is not a definition of the decimal representation scheme, it's a definition of numbers. The value defined as the result of adding ONE to the result of adding ONE to itself is, by definition, the same value whether it's represented with roman numerals as "III", in a positional notation system with base > 3 as "3", or in binary positional notation as "11". Whether and what it has to do with the "real world" doesn't enter consideration: Its definition is, by, well, definition, independent of its representation, which needs to be separately defined.
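A minimal Python sketch of that separation (the names ONE, THREE and the `roman` helper are illustrative, not standard anything):

```python
# Define a value by construction -- repeated application of "+ ONE" --
# then map it to strings through separately defined representation schemes.
ONE = 1

def successor(n):
    return n + 1  # stand-in for the abstract "add ONE" step

THREE = successor(successor(ONE))   # ONE + ONE + ONE

# Representation schemes are defined separately and map the same value
# to different strings:
assert format(THREE, "d") == "3"    # decimal positional notation
assert format(THREE, "b") == "11"   # binary positional notation

def roman(n):
    return "I" * n                  # naive additive Roman-style numerals

assert roman(THREE) == "III"
```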

And this has nothing to do with the real world.

In the real world if you have 6 stones and divide them by 2 you get 2 groups of 3 stones.

But in the imaginary world of values 6/2 = 3, just 3 not 2 groups of 3. Because it is imaginary.

Irrelevant to the fact that numbers and their labels are different entities. A fact you just derived yourself - notice how your definition doesn't include statements like "once the units reach 9, roll over to 0 and add one to the tens", or "numbers greater than 9 need two digits to represent", or "fractions whose simplified form has a denominator whose prime factors are not a subset of {2,5} need infinitely many digits to represent"? As it should be, as those are properties of the decimal notation, not of the numbers. Well done, by the way! A mapping is still a mapping if any or all of the things mapped are imaginary, such as a mapping from strings to the numbers you just defined.

The convention among people who have passed 3rd year math (of primary school) is that the string "1/3" is ambiguous between referring to an operation and the number it produces.

1/3 produces an endless string that has no final value.

According to what you just said, that cannot possibly be so. The operation called division exists in the world of values (or numbers, as the rest of us call them), not in the world of strings. Now of course you could, I guess, interpret "1/3" as an operation on strings, but in that case it doesn't produce anything but a TypeError as the division operation is undefined for strings; or you could refrain from interpreting it at all, in which case it doesn't produce anything (except maybe a chemical reaction in your retina), just sits there as a string occupying three bytes of computer memory to no end. Since the definition of the values gets along just fine without reference to their representation, the value defined as the result of dividing ONE by THREE is necessarily the same whether you represent it as "0.333..." in decimal, "0.4" in duodecimal, "0.555..." in hexadecimal, or "1/3" in fractional notation.
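The TypeError remark checks out literally in Python. A quick sanity sketch:

```python
from fractions import Fraction

# Interpreted as values, division is defined and yields one fixed value:
assert Fraction(1) / Fraction(3) == Fraction(1, 3)

# Interpreted as an operation on the strings themselves, it is undefined:
try:
    result = "1" / "3"
except TypeError:
    result = None  # division is not defined for strings
assert result is None
```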

And you have nothing but an empty claim it doesn't.

An empty claim and the fact that it follows from what you just said (and what everyone else has been saying for the better part of 2 millennia, but hey, we're getting somewhere).
 
How is existing a precondition for being mapped onto?

Mapping is associating one thing with another.

Both have to exist already before any mapping can take place.

The value of 1 does not exist until somebody invents it.

Value itself does not exist until somebody invents the concept.

What exists in the real world are quantities not values. Value is a conception not an observation.

However, this is not a definition of the decimal representation scheme, it's a definition of numbers.

The decimal scheme would flow from other just as simple rules.

But within the decimal scheme 0.003 is a specific value. It is nothing else in the scheme and it is not pointing to some other scheme.

But the decimal scheme and fractions, unlike the natural numbers, do not correspond to the real world. They are ideal.

You can have 1 apple or 2 apples but you can't have exactly 0.3333.... of an apple. You cannot have 1/3 of an apple.

Irrelevant to the fact that numbers and their labels are different entities.

Numbers or values?

The value is created by creating the symbol. 1 + 1 + 1 is a value. And 3 refers to that value.

The value of 3 does not exist until the label is created and defined. It is not out there somewhere. In the real world quantities exist but values do not.

1/3 produces an endless string that has no final value.

According to what you just said, that cannot possibly be so.

So you claim without any rational argument taking us there.

What I said was 0.3 is a value. 0.33 is a different value. 0.333 is a different value from the others.

But 0.3333... is saying there is no final 3. There cannot be a final 3.

So there cannot be a final value.
 
Mapping is associating one thing with another.

Both have to exist already before any mapping can take place.

The value of 1 does not exist until somebody invents it.

Sure.

And once it's invented, you can map arbitrary other "things" to it. Such as the string "1" or the string "I" or the string "αʹ", or the sound waves of the English word [wʌn]. The value is none of those things. You've recognised as much when you argued that the positive integers as used by a civilisation that doesn't have a concept of the value ZERO are the same as the ones used by a civilisation that does: Since positional notations require the concept of ZERO, it is obvious that the value TEN will have a different representation in whatever notation they use, and yet you told us it's the same value.
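In Python terms (a toy mapping; the labels are the ones from above):

```python
# Many labels mapped onto one invented value; the mapping is still a mapping.
names = {"1": 1, "I": 1, "αʹ": 1, "one": 1}
assert len(names) == 4               # four distinct labels...
assert set(names.values()) == {1}    # ...all naming the same value
```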

Value itself does not exist until somebody invents the concept.

What exists in the real world are quantities not values. Value is a conception not an observation.



The decimal scheme would flow from other just as simple rules.

Indeed. But they're different rules. None of the rules involved in defining the decimal system are required to define the numbers themselves.

But within the decimal scheme 0.003 is a specific value.

Not if you use value in the same meaning you did until now.

It is nothing else in the scheme and it is not pointing to some other scheme.

But the decimal scheme and fractions, unlike the natural numbers, do not correspond to the real world. They are ideal.

You can have 1 apple or 2 apples but you can't have exactly 0.3333.... of an apple. You cannot have 1/3 of an apple.

In practice, that is true due to the limitations of our cutting technology, but it's a rather uninteresting fact (and having exactly 0.3 of an apple is, in the very same sense, equally impossible). Whether it is more generally true due to the quantized nature of matter depends on the individual apple: whether the numbers of particles it contains are integer multiples of 3. In this sense, you are however much more likely to find an apple of which you can have exactly 0.333... than one of which you can have exactly 0.3 - as within any given non-minimal range there are over three times as many integers that are multiples of 3 as there are multiples of 10.
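A quick count in Python over an arbitrary sample range backs up that ratio (the range size is my pick):

```python
# Integer multiples of 3 are denser than integer multiples of 10.
N = 10_000
mult3 = sum(1 for k in range(1, N + 1) if k % 3 == 0)    # 3333
mult10 = sum(1 for k in range(1, N + 1) if k % 10 == 0)  # 1000
assert mult3 > 3 * mult10   # just over 3.3 times as many
```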

Irrelevant to the fact that numbers and their labels are different entities.

Numbers or values?

The value is created by creating the symbol. 1 + 1 + 1 is a value. And 3 refers to that value.

The value of 3 does not exist until the label is created and defined. It is not out there somewhere. In the real world quantities exist but values do not.

1/3 produces an endless string that has no final value.

According to what you just said, that cannot possibly be so.

So you claim without any rational argument taking us there.

You just gave us a definition of the natural numbers that doesn't refer at all to how they are represented, acknowledging that their representation is not part of their definition. You defined THREE as emerging from the recursive application of the operation of ADDITION to the primitive value ONE, as in ONE + ONE + ONE. You also told us that, once we define a DIVISION operation, we can produce new values by applying it to previously defined values such as THREE and ONE. No part of that definition refers to the values' representation, which, as you also told us, would have to be derived from other rules. You even said that "in the [...] world of values 6/2 = 3", clearly indicating that the product of the division operation is another value. Since the operation produces a value defined within the scheme of values and agnostic to what label you might pick to refer to it, and not a string, the statement "1/3 produces an endless string that has no final value" is a simple category error analogous to "a dog has four legs and three characters".
 
And once it's invented, you can map arbitrary other "things" to it. Such as the string "1" or the string "I" or the string "αʹ", or the sound waves of the English word [wən].

No you can't. Not within the value scheme "1" has been defined within. There is only one symbol for that value within the scheme and the symbols in one scheme do not refer to any other scheme.

The value is none of those things.

The value exists within a specific scheme. That is how it comes into being, by being defined within the scheme. It does not exist until defined in a scheme. And a value in one scheme has nothing to do with any value in any other scheme.

Since positional notations require the concept of ZERO

The Romans did fine without zero. It is not needed.

Indeed. But they're different rules. None of the rules involved in defining the decimal system are required to define the numbers themselves.

All a decimal system allows is a greater number of possible values.

But 0.00006 has only one value in the scheme and it is not the same value as 0.000060000000000000001.

But within the decimal scheme 0.003 is a specific value.

Not if you use value in the same meaning you did until now.

A value is something that exists within a value scheme.

Within the decimal scheme 0.003 has only one value.

In some other scheme it will have a different value.

You can have 1 apple or 2 apples but you can't have exactly 0.3333.... of an apple. You cannot have 1/3 of an apple.

In practice, that is true due to the limitations of our cutting technology

No it's not. It's because 0.3333.... does not have a final value, there is no final 3, and only slices of finite values can be made.

Reality is finite. The infinite is imaginary. No slice could ever have the value of 0.333..... It is a logical impossibility.

A slice is a finite entity.

You just gave us a definition of the natural numbers that doesn't refer at all to how they are represented

I didn't give a definition. I gave a way for a value system to be constructed.

You defined THREE as emerging from the recursive application of the operation of ADDITION to the primitive value ONE, as in ONE + ONE + ONE.

I gave a way for values to be created within a value system. "3" is just the label for a value created.
 
"1" can be extended by operations like square root and minus. So square root of minus one exists, but, it is imaginary, not real. Often used to define geometries that pop up in finding solutions to differential equations that are useful.

The value schemes that have been invented are a tribute to the rare geniuses that have invented them.

The great thing about being a human is when a genius invents something there is a good chance you could be taught to understand it.

When you say some square root "exist" you have to clarify what you mean. Where does it exist?
 
"1" can be extended by operations like square root and minus. So square root of minus one exists, but, it is imaginary, not real. Often used to define geometries that pop up in finding solutions to differential equations that are useful.

The words 'imaginary' and 'real' in the context of numbers are in no way related to their meanings in common English.

'Imaginary numbers' are no less real than 'real numbers'. And 'real numbers' are no less imaginary than 'imaginary numbers'.

In the same vein, a 'strange quark' is not more strange than other quarks.

Equivocation between different meanings of words is not helpful. Untermensche is confused enough as it is.
 
Nobody ever thought anything differently than that obvious pablum you're spewing.
 
No you can't. Not within the value scheme "1" has been defined within. There is only one symbol for that value within the scheme and the symbols in one scheme do not refer to any other scheme.

You want it to be so because none of your favourite claims would follow otherwise, but it really isn't and they don't. Defining and naming a thing are independent acts, and if you haven't already said so yourself, you got very close. Having a concept of a value doesn't require having a label for it. I'm pretty sure monolingual speakers of a language that doesn't have numerals beyond "two" are perfectly capable of understanding that the guy who came home from his hunt with two rabbits, and then another two, is going to have an easier time getting his family through the week than the guy who only brought two and one more.

The value exists within a specific scheme. That is how it comes into being, by being defined within the scheme. It does not exist until defined in a scheme. And a value in one scheme has nothing to do with any value in any other scheme.

Since positional notations require the concept of ZERO

The Romans did fine without zero. It is not needed.

The Romans didn't have a positional notation; it might help you to look up terms you don't understand every once in a while.
<snip>
But within the decimal scheme 0.003 is a specific value.

Not if you use value in the same meaning you did until now.

A value is something that exists within a value scheme.

Within the decimal scheme 0.003 has only one value.

Is or has?
In some other scheme it will have a different value.

You can have 1 apple or 2 apples but you can't have exactly 0.3333.... of an apple. You cannot have 1/3 of an apple.

In practice, that is true due to the limitations of our cutting technology

No it's not. It's because 0.3333.... does not have a final value, there is no final 3, and only slices of finite values can be made.

The value designated by 0.333... doesn't have digits, final or otherwise, just like dogs don't have characters. A fact that follows from things you've already acknowledged.
 
No you can't. Not within the value scheme "1" has been defined within. There is only one symbol for that value within the scheme and the symbols in one scheme do not refer to any other scheme.
You want it to be so because none of your favourite claims would follow otherwise, but it really isn't and they don't.

Bullshit. The scheme of the positive integers has created the value "1" refers to. And no other symbol within the scheme refers to that value. And talking about other schemes when the value only exists in this scheme because of the scheme is a non-sequitur. An irrational departure.

Within the decimal scheme 0.003 has only one value.

Is or has?

0.003 is a grouping of symbols that creates a context. As a whole it is a symbol that refers to a value that only exists because a scheme of values has been created.

So most correctly it does not have a value and is not a value. It refers to an imaginary conceptual value within ONE invented scheme.

The value designated by 0.333... doesn't have digits, final or otherwise, just like dogs don't have characters. A fact that follows from things you've already acknowledged.

Values don't exist on their own. They don't have anything on their own. This is your religion.

They come into existence only when a value system is created first. And they only exist within that ONE value system. A different system creates different values unless it is just an extension of a pre-existing scheme.

And within the decimal value system every "3" after the decimal point has a value.

Do you disagree with the idea that every "3" after the decimal point designates a different value? Where is any argument disproving this?

If you have an endless string of "3"s you have defined something that has no final value in the scheme.
 
Bullshit.
Not according to what you're telling us.
The scheme of the positive integers has created the value "1" refers to.
Exactly. The value is defined within the "scheme of the positive integers". Its various names are not.
And no other symbol within the scheme refers to that value.
The scheme of positive integers doesn't contain symbols. It's a scheme of values. A representational scheme presumes the values are defined, but is otherwise an independent scheme. Defining and naming an entity are separate acts.
And talking about other schemes when the value only exists in this scheme because of the scheme is a non-sequitur. An irrational departure.
You should tell untermensche he keeps irrelevantly bringing up symbols of the decimal notation scheme in the middle of discussing the abstract conceptual scheme of numeric values.
 
Exactly. The value is defined within the "scheme of the positive integers". Its various names are not.

The symbols refer only to a specific value within the scheme.

The value is created by the scheme.

It does not exist out there somewhere before the scheme exists.

And no other symbol within the scheme refers to that value.

The scheme of positive integers doesn't contain symbols. It's a scheme of values.

Of course it does. A value created is represented by a specific symbol or grouping of symbols. The symbols were already assigned by the time you showed up, so I suppose you don't think they were assigned.

You should tell untermensche he keeps irrelevantly bringing up symbols of the decimal notation scheme in the middle of discussing the abstract conceptual scheme of numeric values.

Nope.

The symbols refer to the abstract values that do not exist until a value system is created.

The symbols refer to values.

0.3 refers to a specific value in the scheme. 0.33 refers to a different value.

0.333... refers to some imaginary entity that never has a final value.
 