
Rational numbers == infinitely repeating sequences of digits

The symbols refer only to a specific value within the scheme.
As per the definition of the decimal system, any well-formed numeral within it refers to a unique value within the scheme of numbers, presuming those values have been defined. Those symbols are however not part of the scheme of numbers; they live in an extra layer atop the numbers. You should know, you gave us an algorithm for the numbers that doesn't make reference to strings.
The value is created by the scheme.

It does not exist out there somewhere before the scheme exists.
This doesn't contradict anything I said.
The scheme of positive integers doesn't contain symbols. It's a scheme of values.

Of course it does. A value created is represented by a specific symbol or grouping of symbols. The symbols were already assigned by the time you showed up, so I suppose you don't think about how they were assigned.
You got this exactly backwards. I'm well aware the mapping from strings to values is arbitrary. You seem to have your reservations.
You should tell untermensche he keeps irrelevantly bringing up symbols of the decimal notation scheme in the middle of discussing the abstract conceptual scheme of numeric values.

Nope.

The symbols refer to the abstract values that do not exist until a value system is created.
True, but that reference relation is not part of the world of values. The value defined as the result of applying the Addition operation to the Unit value is literally by definition the same whether you use the English word "two", Russian "dva" or Arabic "ithnaan", the decimal "2", binary "10", or Roman "II". If it makes you happy, you could call it "Gandalf" or "Captain Kirk" and still be talking about the same abstractly defined value.
The symbols refer to values.
They do, but not vice versa. Their relationship is asymmetric.
0.3 refers to a specific value in the scheme. 0.33 refers to a different value.
They refer to the values resulting from dividing 3 by 10 and 33 by 100, respectively. In any positional notation whose base isn't a multiple of ten, they'd have to be represented with a non-terminating string. That's a property of notations, not of values. What was that about not making reference to entities from other schemes?
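
Incidentally, you can check that claim mechanically. A minimal sketch (Python, using the standard fractions module; the terminates helper is a name of my own invention, not anything from this thread): a fraction has a finite expansion in a base iff every prime factor of its reduced denominator divides that base.
Code:
from fractions import Fraction
from math import gcd

def terminates(value, base):
    # value has a finite expansion in `base` iff every prime factor
    # of its reduced denominator also divides the base
    den = value.denominator
    g = gcd(den, base)
    while g > 1:
        den //= g
        g = gcd(den, base)
    return den == 1

for base in (10, 12, 16, 20):
    print(base, terminates(Fraction(3, 10), base))
# 10 True, 12 False, 16 False, 20 True -- finite exactly in the bases
# that are multiples of ten

Note that the check never looks at digit strings at all; termination is a relation between the value's denominator and the base.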
0.333... refers to some imaginary entity that never has a final value.

It refers to the value resulting from dividing 3 by 9, or 1 by 3. If this couldn't be proved - though it can - we could stipulate it and still be good. We are, after all, talking about definitional schemes - you said it first. Nothing in the scheme of NUMBERS makes 1/3 less final than 1/10; that's a known and inevitable bug of the decimal system, with well-known workarounds.
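
For anyone who wants to see "final" demonstrated in the world of values rather than strings, a quick sketch (Python's exact fractions module; nothing here is specific to any base):
Code:
from fractions import Fraction

one_third = Fraction(1, 3)          # held exactly; no digit strings involved
print(one_third * 3 == 1)           # True: nothing "unfinished" about it
print(Fraction(3, 9) == one_third)  # True: 3/9 names the very same value
print(Fraction(1, 10) - one_third)  # -7/30, an exact answer either way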
 
As per the definition of the decimal system, any well-formed numeral within it refers to a unique value within the scheme of numbers, presuming those values have been defined. Those symbols are however not part of the scheme of numbers; they live in an extra layer atop the numbers. You should know, you gave us an algorithm for the numbers that doesn't make reference to strings.

By what value does 1 increase if 3 is added to it?

How do you claim the symbol has nothing to do with the value?

True, but that reference relation is not part of the world of values. The value defined as the result of applying the Addition operation to the Unit value is literally by definition the same whether you use the English word "two", Russian "dva" or Arabic "ithnaan", the decimal "2", binary "10", or Roman "II". If it makes you happy, you could call it "Gandalf" or "Captain Kirk" and still be talking about the same abstractly defined value.

Values only exist in a scheme and a scheme is only in one language at a time.

The symbols refer to values.

They do, but not vice versa. Their relationship is asymmetric.

There is nothing there without a symbol. The symbol refers to an individual value that exists because of other symbols.

The values exist because there is a symbol "1" creating the concept of the value "1".

0.3 refers to a specific value in the scheme. 0.33 refers to a different value.

They refer to the values resulting from dividing 3 by 10 and 33 by 100, respectively.

It is the same string as dividing 33 by 100 but you don't need to divide 33 by 100 to have it.

But once you have it it designates a separate value from all other values in the scheme.

0.333... refers to some imaginary entity that never has a final value.

It refers to the value resulting from dividing 3 by 9, or 1 by 3.

In the value scheme it refers to something that has no final value.
 
By what value does 1 increase if 3 is added to it?
It doesn't increase, it's a constant, immutable value. If you meant to ask what value results: the same whether you call them "one" and "three" or "waahid" and "thalaatha".
How do you claim the symbol has nothing to do with the value?
I'm following the definition for the values you gave us. Are you now saying it's wrong?
Values only exist in a scheme and a scheme is only in one language at a time.
That sounds like some super-duper strong variant of Sapir-Whorf. Someone who likes to quote Chomsky can be expected to be aware the strong version of the Sapir-Whorf hypothesis has been discredited before they were born. Plus, no part of the definition you gave makes reference to elements of any language. You just want it to be so but can't give any reason.
The symbols refer to values.

They do, but not vice versa. Their relationship is asymmetric.

There is nothing there without a symbol. The symbol refers to an individual value that exists because of other symbols.
I think you meant to say "other values"
The values exist because there is a symbol "1" creating the concept of the value "1".

0.3 refers to a specific value in the scheme. 0.33 refers to a different value.

They refer to the values resulting from dividing 3 by 10 and 33 by 100, respectively.

It is the same string as dividing 33 by 100 but you don't need to divide 33 by 100 to have it.

But once you have it it designates a separate value from all other values in the scheme.

0.333... refers to some imaginary entity that never has a final value.

It refers to the value resulting from dividing 3 by 9, or 1 by 3.

In the value scheme it refers to something that has no final value.

More map-and-territory confusion. Literally no more sensible than "the domestic dog is actually two species: domestic and dog"
 
A simple question:

In what sense is the value resulting from dividing ONE by THREE less final than the value resulting from dividing ONE by FIVE? Can you answer this without leaving the world of VALUES and irrelevantly bringing up strings of the decimal notation?
 
By what value does 1 increase if 3 is added to it?
It doesn't increase, it's a constant, immutable value. If you meant to ask what value results: the same whether you call them "one" and "three" or "waahid" and "thalaatha".

There is a starting value labeled "1" and you add the value labeled "3" to it.

You have a final value.

It is not the same starting "immutable" value. It is a new value.

By what value is the new value greater than the old value?


A simple question:

In what sense is the value resulting from dividing ONE by THREE less final than the value resulting from dividing ONE by FIVE? Can you answer this without leaving the world of VALUES and irrelevantly bringing up strings of the decimal notation?

I can give you 1/3 of 90 cents.

I can't give you 1/3 of a dollar.
 
There is a starting value labeled "1" and you add the value labeled "3" to it.
Sure. What you're missing is that the labels aren't part of the values' definition - the one you provided, ironically. The labels are assigned for convenience after the act of defining, to save us the hassle of reciting the definition every time.
You have a final value.

It is not the same starting "immutable" value. It is a new value.

By what value is the new value greater than the old value?


A simple question:

In what sense is the value resulting from dividing ONE by THREE less final than the value resulting from dividing ONE by FIVE? Can you answer this without leaving the world of VALUES and irrelevantly bringing up strings of the decimal notation?

I can give you 1/3 of 90 cents.

I can't give you 1/3 of a dollar.

The number ONE is not like a dollar. The dollar is decimal-based.

I specifically asked about an argument NOT making reference to the decimal system.

I take it this is your way of admitting you don't have one.

Also, I believe you just admitted 0.333... x 90 has what you call a "final value". Since division is the inverse of multiplication, and you've previously said that 1/3 results in 0.333..., it follows that 90 x 0.333... and 90/3 result in the same value. If 0.333... doesn't have a "final value", that's a contradiction.
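
For the record, exact rational arithmetic bears this out. A sketch (Python's fractions module again; no decimal strings anywhere):
Code:
from fractions import Fraction

one_third = Fraction(1, 3)        # the value the string "0.333..." names
print(one_third * 90)             # 30
print(Fraction(90, 3))            # 30
print(one_third * 90 == Fraction(90, 3))  # True: same value either way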
 
Sure. What you're missing is that the labels aren't part of the values' definition - the one you provided, ironically. The labels are assigned for convenience after the act of defining, to save us the hassle of reciting the definition every time.

There is a starting value labeled "1" and you add the value labeled "3" to it.

You have a final value.

It is not the same starting "immutable" value. It is a new value.

By what value is the new value greater than the old value?

Tell me about the change in value without using the label.

The label is connected to the value. It is what gives the value meaning and illuminates it and separates it from all other values.

How the value can be created is meaningless. Once created the label and the value are inseparable. You can't talk about the value without using the label.

But go ahead, tell me about the change in value without using the label of the value you are using.

I can give you 1/3 of 90 cents.

I can't give you 1/3 of a dollar.

The number ONE is not like a dollar. The dollar is decimal-based.

It is.

1 or 10 or 100 or 10,000,000,000,000,000,000,000 are all the same when it comes to being divided by 3. They all yield an endless string. They all give something that by definition has no final value.

You can't divide and not make use of the decimal system.

You're basically telling me to explain 1/3 without using division.

Religious madness.
 
There is a starting value labeled "1" and you add the value labeled "3" to it.

You have a final value.

That would be a new value, not an updated version of the same value. You asked "By what value does 1 increase if 3 is added to it?"

Maybe that's not what you meant to ask, but I can't second-guess your intentions; I have to go by what you write. 1 doesn't increase, though you might be funny and reassign the label "1" to some other value to confuse everyone.

It is not the same starting "immutable" value. It is a new value.

Indeed, it's a new value, the value of 1 hasn't changed.

By what value is the new value greater than the old value?

By whatever value you have assigned to the string "3" in your symbol system. If you're halfway sticking to convention, that would be the value obtained from applying the ADDITION operation to the unit value, and then once again.

Tell me about the change in value without using the label.

I just did.

The label is connected to the value.

Not in as simple a way as you seem to believe.

It is what gives the value meaning and illuminates it and separates it from all other values.

No, that would be the value's definition.

How the value can be created is meaningless. Once created the label and the value are inseparable. You can't talk about the value without using the label.

Sure you can talk about the value without using the label; it's just a big hassle to repeat the definition every single time, so people invented labels for the more commonly used values to make their lives easier. People are lazy, you know?

But go ahead, tell me about the change in value without using the label of the value you are using.

I just did. Wasn't hard.

I can give you 1/3 of 90 cents.

I can't give you 1/3 of a dollar.

The number ONE is not like a dollar. The dollar is decimal-based.

It is.

1 or 10 or 100 or 10,000,000,000,000,000,000,000 are all the same when it comes to being divided by 3. They all yield an endless string.

That may or may not be a true statement for an operation that yields a string, but for an operation that is defined to yield a value, it's a plain and simple category error.

But even if we allow ourselves to equivocate between numbers and numerals, it's a false claim. It's perfectly possible to represent one third as a finite string, without the use of ellipsis (other than omitting trailing 0s), in a positional notation system - all you have to do is pick the right base. 0.4₁₂ and 0.2₆ are as valid - and natural - representations of that value as 0.333...₁₀. You cannot possibly claim that the value ONE (labelled "1") is defined only in the decimal system, since that leads to circularity: in order to define the concept TEN recursively through addition of the unit value, you need to have a concept of ONE (and TWO through NINE), and the decimal system needs TEN to be defined. The only way out is to accept that the definition of ONE (and for that matter THREE) takes priority over the definition of TEN, and thus, by transitivity, over the decimal system.

They all give something that by definition has no final value.

You can't divide and not make use of the decimal system.

Sure I can. I do it millions, if not billions, of times a day - every time a computer script I wrote performs a division... in the binary system.

You're basically telling me to explain 1/3 without using division.

Not at all. I'm telling you that dividing the value ONE by the value THREE is going to result in a value, not a string, and whatever system of representation you choose to use for your values isn't going to make "the value has X many digits" a meaningful sentence, since digits are not an attribute of values. The value is the same whether you're using the decimal system (where 1/5 but not 1/3 is represented by a finite string), the duodecimal system (base 12, where the opposite is the case), the hexadecimal system (base 16, where neither can be represented without ellipsis, with 1/5 being 0.333... and 1/3 being 0.555...), or even a representation scheme that's unable to handle non-integers.
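
If anyone wants to watch the same value produce different strings, here's a rough sketch (Python; frac_digits is a helper name I made up) that computes fractional digits by exact long division, no floating point:
Code:
from fractions import Fraction

def frac_digits(value, base, count=8):
    # first `count` fractional digits of 0 <= value < 1 in `base`,
    # by exact long division on numerator and denominator
    num, den = value.numerator, value.denominator
    digits = []
    for _ in range(count):
        num *= base
        digits.append(num // den)
        num %= den
    return digits

third, fifth = Fraction(1, 3), Fraction(1, 5)
print(frac_digits(third, 10))  # [3, 3, 3, ...]    "0.333..." in decimal
print(frac_digits(third, 12))  # [4, 0, 0, ...]    "0.4" in duodecimal
print(frac_digits(fifth, 12))  # [2, 4, 9, 7, ...] repeats in duodecimal
print(frac_digits(fifth, 16))  # [3, 3, 3, ...]    "0.333..." in hex
print(frac_digits(third, 16))  # [5, 5, 5, ...]    "0.555..." in hex

Same value in, different digits out: the digits belong to the (value, base) pair, not to the value.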
 
That would be a new value, not an updated version of the same value. You asked "By what value does 1 increase if 3 is added to it?"

A new value you say? Perhaps that is why I asked:

By what value is the new value greater than the old value?

A question you dodged and can't seem to even comprehend.

When you answer it perhaps this can go further.
 
That would be a new value, not an updated version of the same value. You asked "By what value does 1 increase if 3 is added to it?"

A new value you say? Perhaps that is why I asked:

By what value is the new value greater than the old value?

A question you dodged and can't seem to even comprehend.

When you answer it perhaps this can go further.

I answered it.
 
By whatever value you have assigned to the string "3" in your symbol system.

What value is "3" assigned to?

You are dancing around this. I wonder why? Why so coy?

The symbol of the value cannot be separated from the value within the scheme.

You can't speak of the value without referring to the name or symbol.

How many "1" values make up the "3" value?

Tell me, don't show me, without referring to the value.

0.333... cannot be separated from the symbols that make it up.

Without the symbols there is nothing there.

Just like without a "3" there is no answer to what value is designated by "3"?
 
By whatever value you have assigned to the string "3" in your symbol system.

What value is "3" assigned to?

If you're using it in the standard sense, that would be the value obtained from adding the unit value to the result of adding the unit value to itself. You told us so yourself.
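
That construction can be written down without a single numeral string. A toy sketch (Python; UNIT, succ, add and size are illustrative names of my own invention):
Code:
# values built from a unit and a successor operation -- no numerals anywhere
UNIT = ()                     # the unit value; an arbitrary token, not a digit

def succ(v):
    return (v,)               # the value obtained by adding the unit to v

def add(a, b):
    # add the unit value to a, b-many times
    return succ(a) if b == UNIT else succ(add(a, b[0]))

def size(v):
    # count the nesting depth -- only so we humans can read the result
    return 1 if v == UNIT else 1 + size(v[0])

three = succ(succ(UNIT))      # the unit added to (the unit added to itself)
print(size(add(UNIT, three))) # 4 -- the printed numeral is for our eyes only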

You are dancing around this. I wonder why? Why so coy?

Not at all, I told you, again and again - or indeed pointed out that you've done so yourself.

The symbol of the value cannot be separated from the value within the scheme.



You can't speak of the value without referring to the name or symbol.

Sure I can. I just did. You yourself told us how to do it. It's just tiresome to repeat the definition every time, which is the only reason we usually skip it and use the shorthand label instead.

How many "1" values make up the "3" value?

Tell me, don't show me, without referring to the value.

0.333... cannot be separated from the symbols that make it up.

The value referred to by the string "0.333...", interpreted as a symbol of the decimal notation scheme, isn't made up of symbols, so category error once again. The value lives, as you've said yourself, in the world of values. In the world of values, and following the definition of division, 1/3, 3/9, 4/12, and 5/15 result in the same value. The fact that 3/9 and 5/15 are represented as non-terminating strings of 3s/5s in the decimal/hexadecimal representation is a property of those representations, not of the value: it follows from the fact that they are 3 and 5, respectively, divided by (base - 1). If we didn't know otherwise (and we do - you told us so yourself when you talked about the value resulting from dividing 1 by 3 "in the world of values"), we could tell this doesn't point to an intrinsically non-final nature of the value itself by looking at its duodecimal representation: "0.4".
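
A compact check of the (base - 1) point, under the same caveats as before (Python, exact arithmetic):
Code:
from fractions import Fraction

# one value, four quotients:
print(Fraction(1, 3) == Fraction(3, 9) == Fraction(4, 12) == Fraction(5, 15))  # True

# k/(base-1) repeats the digit k: each long-division step multiplies by
# the base and lands back on the same remainder
num, den, base = 3, 9, 10
for _ in range(5):
    num *= base
    print(num // den, end=" ")   # 3 3 3 3 3
    num %= den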

Without the symbols there is nothing there.

Just like without a "3" there is no answer to what value is designated by "3"?

I think what you're trying to say is that the only possible answer to "what is the value of '3'" is "the value designated by '3'". That's patent nonsense: It's like defining a function like this:
Code:
int my_function(int x)
{
    // calls itself forever with the same argument; never returns
    return my_function(x);
}

Say hello to infinite recursion! And this coming from someone who dislikes infinities...
 
More coyness.

How many "unit values" make up "3"?

You are simply wrong.

It is impossible to refer to the value by anything but the symbol or word for the symbol.

In the scheme "3" is the value and the symbol.

They are inseparable.
 
More coyness.

How many "unit values" make up "3"?

You are simply wrong.

It is impossible to refer to the value by anything but the symbol or word for the symbol.

In the scheme "3" is the value and the symbol.

They are inseparable.

You clearly want it to be that way, but following the definition of the natural numbers you yourself provided, this isn't so. Unless you used a very unconventional definition of "definition".

Usually when we say x is defined as y, it implies that they can be used as drop-in replacements for one another while preserving the truth of a statement. Also, what you're saying now buys you infinite recursion.
 
More coyness.

How many "unit values" make up "3"?

You are simply wrong.

It is impossible to refer to the value by anything but the symbol or word for the symbol.

In the scheme "3" is the value and the symbol.

They are inseparable.

You clearly want it to be that way, but following the definition of the natural numbers you yourself provided, this isn't so. Unless you used a very unconventional definition of "definition".

Usually when we say x is defined as y, it implies that they can be used as drop-in replacements for one another while preserving the truth of a statement. Also, what you're saying now buys you infinite recursion.

I gave no definition of "value".

I gave a way to create all the values of the positive integers from one value.

It starts by simply saying that "1" is this thing called a 1 value.

It starts by saying there is no difference between "1" and the value "1".

You have no point.

And you can't talk about any value without talking about the label of the value.

You can say three or you can say one plus one plus one.

All you can talk about are the labels.

There is nothing else.

There is nothing else besides 0.333...

It points to itself and nothing else.
 
More coyness.

How many "unit values" make up "3"?

You are simply wrong.

It is impossible to refer to the value by anything but the symbol or word for the symbol.

In the scheme "3" is the value and the symbol.

They are inseparable.

You clearly want it to be that way, but following the definition of the natural numbers you yourself provided, this isn't so. Unless you used a very unconventional definition of "definition".

Usually when we say x is defined as y, it implies that they can be used as drop-in replacements for one another while preserving the truth of a statement. Also, what you're saying now buys you infinite recursion.

I gave no definition of "value".

I gave a way to create all the values of the positive integers from one value.

Yes, values. The value this gives you for the number 3 doesn't have the string "3" - or any other string - as part of its definition. According to what you gave us.

It starts by simply saying that "1" is this thing called a 1 value.

It starts by saying there is no difference between "1" and the value "1".

Erm, you want it to be that way, but this is neither what you said, nor does it make any sense. And even if it were so, nothing follows for "3", and it definitely doesn't follow that 1 is intrinsically part of the decimal system: that's circular, as the decimal system requires TEN to be defined, which requires NINE to be defined, which requires EIGHT to be defined (you get the picture) if you want to define the natural numbers incrementally by adding one each time. Since TEN cannot possibly be defined without ONE being defined first, 1, the value, being part of the decimal system is a contradiction.
 
I gave no definition of "value".

I gave a way to create all the values of the positive integers from one value.

Yes, values. The value this gives you for the number 3 doesn't have the string "3" - or any other string - as part of its definition. According to what you gave us.

There is no definition of value in any of it.

All it says is "1" is this thing called a "1" value.

It goes on to say "3" is this thing called a "3" value. A "3" value is a thing unto itself once labeled. It stands on its own.

How it came to be is meaningless.

The symbol points to a place within the scheme. And that place corresponds to a unique value within the scheme.

The symbol and the value are inseparable within the scheme.

You cannot talk about any value without also talking about the labels.

You can't say how many "1" values make a "3" value without using the labels.

The value and the label are the same thing.
 
I gave no definition of "value".

I gave a way to create all the values of the positive integers from one value.

Yes, values. The value this gives you for the number 3 doesn't have the string "3" - or any other string - as part of its definition. According to what you gave us.

There is no definition of value in any of it.

All it says is "1" is this thing called a "1" value.

It goes on to say "3" is this thing called a "3" value. A "3" value is a thing unto itself once labeled. It stands on its own.

How it came to be is meaningless.
So what if it is? It still stands that in order for the value resulting from dividing 1 by 3 to be definable, all that's needed is that 1, 3, and the division operation be defined. Since none of them requires TEN to be defined, claiming the value only exists in the decimal system is obvious nonsense, and 0.4₁₂ is as valid a representation as 0.333...₁₀.
The symbol points to a place within the scheme. And that place corresponds to a unique value within the scheme.

The symbol and the value are inseparable within the scheme.

You cannot talk about any value without also talking about the labels.

You can't say how many "1" values make a "3" value without using the labels.
Can't you read? I did.
The value and the label are the same thing.

If that is so, "1" + "3" = "13" -> 1+3=13 is a logical argument to make.
 
There is no definition of value in any of it.

All it says is "1" is this thing called a "1" value.

It goes on to say "3" is this thing called a "3" value. A "3" value is a thing unto itself once labeled. It stands on its own.

How it came to be is meaningless.

So what if it is? It still stands that in order for the value resulting from dividing 1 by 3 to be definable, all that's needed is that 1, 3, and the division operation be defined.

Nothing is defined. No values have been defined. Nothing about value has been defined.

Something has just been labeled a value and a scheme has been devised using that label to create new labels.

All that exists are the labels and their place in the scheme.

The value and the label are the same thing.

If that is so, "1" + "3" = "13" -> 1+3=13 is a logical argument to make.

No.

Adding the value called "1" to the value called "3" gives the value called "4". That is how the scheme was devised. By adding the value called "1" to all previous values and giving the new value just created a label.

In all these things the label and the value are inseparable.
 
Nothing is defined. No values have been defined. Nothing about value has been defined.

Something has just been labeled a value and a scheme has been devised using that label to create new labels.

All that exists are the labels and their place in the scheme.

The value and the label are the same thing.

If that is so, "1" + "3" = "13" -> 1+3=13 is a logical argument to make.

No.

Yes. The labels are strings, whatever they may or may not refer to. The "+" sign, in the context of strings, can only be interpreted as concatenation, and concatenating these two strings results in the string "13".
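
Any programming language that overloads "+" for both strings and numbers makes the point immediately; a sketch in Python:
Code:
print("1" + "3")              # 13 -- "+" on strings is concatenation
print(1 + 3)                  # 4  -- "+" on values is addition
print(("1" + "3" == "13", 1 + 3 == 13))  # (True, False)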

Adding the value called "1" to the value called "3" gives the value called "4".

Indeed, but that presumes that 'the value called "3"' and the label "3" are different entities.

That is how the scheme was devised. By adding the value called "1" to all previous values and giving the new value just created a label.

In all these things the label and the value are inseparable.

You keep saying that, but you also keep producing statements that presume their separation.
 