Mapping is associating one thing with another.
Both have to exist already before any mapping can take place.
The value of 1 does not exist until somebody invents it.
Sure.
And once it's invented, you can map arbitrary other "things" to it, such as the string "1" or the string "I" or the string "αʹ", or the sound waves of the English word [wən]. The value is none of those things. You recognised as much when you argued that the positive integers as used by a civilisation that doesn't have a concept of the value ZERO are the same as the ones used by a civilisation that does: since positional notations require the concept of ZERO, it is obvious that the value TEN will have a different representation in whatever notation they use, and yet you told us it's the same value.
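The many-labels-one-value point can be made concrete with a small sketch (mine, not part of the exchange; the notation names are illustrative):

```python
# One value, several mappings onto it. Each entry is a different
# representation scheme's label for the same value TEN.
labels_for_ten = {
    "decimal": "10",    # positional: needs a zero digit
    "binary": "1010",   # positional: needs a zero digit
    "roman": "X",       # additive: no concept of zero required
    "english": "ten",   # a spoken/written word, not a numeral at all
}

# Parsing the positional labels recovers one and the same value:
assert int(labels_for_ten["decimal"], 10) == 10
assert int(labels_for_ten["binary"], 2) == 10
# (Roman and English would need their own parsers; the point is that
# the denoted value is identical while the representations differ.)
```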
Value itself does not exist until somebody invents the concept.
What exists in the real world are quantities, not values. Value is a conception, not an observation.
The decimal scheme would flow from other, just as simple, rules.
Indeed. But they're different rules. None of the rules involved in defining the decimal system are required to define the numbers themselves.
But within the decimal scheme 0.003 is a specific value.
Not if you use value in the same meaning you did until now.
It is nothing else in the scheme and it is not pointing to some other scheme.
But the decimal scheme and fractions, unlike the natural numbers, do not correspond to the real world. They are ideal.
You can have 1 apple or 2 apples but you can't have exactly 0.3333.... of an apple. You cannot have 1/3 of an apple.
In practice, that is true due to the limitations of our cutting technology, but it's a rather uninteresting fact (and having exactly 0.3 of an apple is, in the very same sense, equally impossible). Whether it is more generally true due to the quantized nature of matter depends on the individual apple: whether the numbers of particles it contains are integer multiples of 3. In this sense, however, you are much more likely to find an apple of which you can have exactly 0.333... than one of which you can have exactly 0.3, since within any given non-minimal range there are over 3 times as many integers that are multiples of 3 as there are multiples of 10.
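The density claim is easy to check numerically; here is a quick sketch of my own (the helper name is invented for illustration):

```python
# Within a range, integer multiples of 3 occur roughly 10/3 ≈ 3.33
# times as often as integer multiples of 10.
def count_multiples(k: int, lo: int, hi: int) -> int:
    """Count integers in [lo, hi] that are divisible by k."""
    return hi // k - (lo - 1) // k

lo, hi = 1, 1_000_000
by_3 = count_multiples(3, lo, hi)    # 333,333
by_10 = count_multiples(10, lo, hi)  # 100,000
assert by_3 > 3 * by_10              # "over 3 times as many"
```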
Irrelevant to the fact that numbers and their labels are different entities.
Numbers or values?
The value is created by creating the symbol. 1 + 1 + 1 is a value. And 3 refers to that value.
The value of 3 does not exist until the label is created and defined. It is not out there somewhere. In the real world quantities exist but values do not.
1/3 produces an endless string that has no final value.
According to what you just said, that cannot possibly be so.
So you claim without any rational argument taking us there.
You just gave us a definition of the natural numbers that doesn't refer at all to how they are represented, acknowledging that their representation is not part of their definition. You defined THREE as emerging from the recursive application of the operation of ADDITION to the primitive value ONE, as in ONE + ONE + ONE. You also told us that, once we define a DIVISION operation, we can produce new values by applying it to previously defined values such as THREE and ONE. No part of that definition refers to the values' representation, which, as you also told us, would have to be derived from other rules. You even said that "in the [...] world of values 6/2 = 3", clearly indicating that the product of the division operation is another value. Since the operation produces a value defined within the scheme of values and agnostic to what label you might pick to refer to it, and not a string, the statement "1/3 produces an endless string that has no final value" is a simple category error analogous to "a dog has four legs and three characters".
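The argument can be sketched in code, using Python's `fractions.Fraction` as a stand-in for the scheme of values (my choice of model, not something from the thread): values are built only from ONE and the operations, and no digit string appears in any definition.

```python
from fractions import Fraction

# Values defined by operations (ONE, +, /), never by strings of digits.
ONE = Fraction(1)
THREE = ONE + ONE + ONE   # THREE from recursive addition of ONE
SIX = THREE + THREE

# Division of previously defined values yields another value:
assert SIX / (ONE + ONE) == THREE   # "in the world of values 6/2 = 3"

# 1/3 is likewise one exact value; only its *decimal label* is endless.
third = ONE / THREE
assert third == Fraction(1, 3)
assert str(third) == "1/3"          # a finite label in a different scheme
```

The last line is the category distinction in miniature: the same value admits both an endless decimal label and a three-character fractional label.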