Jokodo
Veteran Member
As per the definition of the decimal system, any well-formed numeral within it refers to a unique value within the scheme of numbers, presuming those values have been defined. Those symbols are, however, not part of the scheme of numbers; they live in an extra layer atop the numbers. You should know: you gave us an algorithm for the numbers that doesn't make reference to strings.

The symbols refer only to a specific value within the scheme.
This doesn't contradict anything I said.

The value is created by the scheme.
It does not exist out there somewhere before the scheme exists.
You got this exactly backwards. I'm well aware the mapping from strings to values is arbitrary. You seem to have your reservations.

The scheme of positive integers doesn't contain symbols. It's a scheme of values.
Of course it does. A value, once created, is represented by a specific symbol or grouping of symbols. The symbols were already assigned by the time you showed up, so you don't think they were assigned, I suppose.
True, but that reference relation is not part of the world of values. The value defined as the result of applying the Addition operation to the Unit value is literally, by definition, the same whether you use the English word "two", Russian "dva", or Arabic "ithnaan", the decimal "2", binary "10", or Roman "II". If it makes you happy, you could call it "Gandalf" or "Captain Kirk" and still be talking about the same abstractly defined value.

You should tell untermensche he keeps irrelevantly bringing up symbols of the decimal notation scheme in the middle of discussing the abstract conceptual scheme of numeric values.
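To make that concrete, here's a quick sketch in Python. `int()` with a base argument handles the positional notations; the Roman-numeral parser is a toy of my own for illustration, not part of any standard library.

```python
# Three different strings, from three different notation schemes,
# all naming the same abstract value.

def roman_to_int(s):
    """Minimal Roman-numeral parser, for illustration only."""
    vals = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
    total = 0
    for i, ch in enumerate(s):
        v = vals[ch]
        # Subtractive notation: a smaller digit before a larger one (e.g. "IV").
        total += -v if i + 1 < len(s) and vals[s[i + 1]] > v else v
    return total

two_decimal = int("2", 10)   # decimal numeral "2"
two_binary  = int("10", 2)   # binary numeral "10"
two_roman   = roman_to_int("II")

# The strings differ; the value does not.
assert two_decimal == two_binary == two_roman == 2
```

The mapping from strings to the one value is many-to-one and arbitrary; nothing about the value itself changes when you swap notations.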
Nope.
The symbols refer to the abstract values that do not exist until a value system is created.
They do, but not vice versa. Their relationship is asymmetric.

The symbols refer to values.
They refer to the values resulting from dividing 3 by 10 and 33 by 100, respectively. In any positional notation whose base isn't a multiple of ten, they'd have to be represented with a non-terminating string. That's a property of notations, not of values. What was that about not making reference to entities from other schemes?

0.3 refers to a specific value in the scheme. 0.33 refers to a different value.
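Whether a string terminates is base-relative, and that's easy to demonstrate. A minimal sketch in Python, using the standard-library `fractions` module (the `expand` helper is my own, written for this example):

```python
from fractions import Fraction

def expand(frac, base, max_digits=12):
    """Fractional-part digits of frac (0 <= frac < 1) in the given base.

    Returns (digits, terminated): terminated is True if the expansion
    ends within max_digits, False if it keeps going.
    """
    digits = []
    for _ in range(max_digits):
        if frac == 0:
            return digits, True
        frac *= base
        digit = int(frac)
        digits.append(digit)
        frac -= digit
    return digits, False

# The same value, 3/10, terminates in base 10 but repeats in base 2:
print(expand(Fraction(3, 10), 10))  # ([3], True)
print(expand(Fraction(3, 10), 2))   # ([0, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0], False)
```

The value 3/10 didn't change between the two calls; only the notation's ability to finish writing it down did.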
0.333... refers to some imaginary entity that never has a final value.
It refers to the value resulting from dividing 3 by 9, or 1 by 3. If this couldn't be proved - though it can - we could stipulate it and still be good. We are, after all, talking about definitional schemes - you said it first. Nothing in the scheme of NUMBERS makes 1/3 less final than 1/10; the non-terminating string is a known and inevitable bug of the decimal system, with well-known workarounds.