My position is clear. You and I will not agree. There is nothing more for me to get out of the problem.
Your position is far from clear. Do you accept the definition of decimal numbers above, or not? If not, specify what is wrong with the actual definition as it stands. Do not bring up any other issues we may have; just concentrate on the definition as it stands.
<definition>
A "decimal number" is an ordered set of integers A_n (where n ranges over all negative and positive integers, and each A_n satisfies 0 <= A_n <= 9) together with a corresponding value = sum of A_n * 10^n (over all n).
</definition>
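To make the definition concrete, here is a minimal sketch in Python (my own illustration; the function name and the example digits are hypothetical, not part of the definition). It evaluates the value sum of A_n * 10^n for a decimal number with only finitely many nonzero digits, so the sum is an ordinary finite sum:

<code>
from fractions import Fraction

def decimal_value(digits):
    # digits: dict mapping exponent n to digit A_n, with 0 <= A_n <= 9
    assert all(0 <= a <= 9 for a in digits.values())
    return sum(Fraction(a) * Fraction(10) ** n for n, a in digits.items())

# 3.14 written per the definition: A_0 = 3, A_-1 = 1, A_-2 = 4
print(decimal_value({0: 3, -1: 1, -2: 4}))  # 157/50, i.e. 3.14
</code>

A repeating decimal like 0.999... has infinitely many nonzero A_n, so its value is an infinite sum, which is exactly where the disagreement below starts.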
Some around here seem to think they are locked in a life-or-death struggle. For me it is about exploring and staying active. I learned a few things in the debate. It is not a peer competition.
I cannot keep restating these points:
1. Infinite and finite are mutually exclusive. If you dispute that, forget the rest. If you accept it, then 0.999... can never equal a finite 1.
2. The geometric series applied to repeating decimals yields fractional approximations (see the partial-sum sketch after this list). 0.333... goes to 3/9, a fractional approximation; 0.999... goes to 9/9, a fractional approximation. It says to use 1 as an approximation for 0.999...; it does not say 0.999... equals a finite 1.
3. If 0.333... goes to 1/3, which goes back to 0.333... without rounding up, but 0.999... seems to round up, then using that as proof that 0.999... is 1 is questionable; the method is inconsistent on that point. The method works as intended but cannot be relied on for the assumed proof. I could cite examples of when I fell into that kind of trap, but it would not mean anything to you.
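For reference, here is the partial-sum computation that the geometric-series method in point 2 referss to, again a sketch in Python with hypothetical names. Truncating 0.999... after k digits gives the geometric sum 9/10 + 9/100 + ... + 9/10^k, whose closed form is 1 - 10^(-k):

<code>
from fractions import Fraction

def partial_sum(k):
    # Sum of the first k terms 9 * 10^(-n), for n = 1..k.
    return sum(Fraction(9, 10**n) for n in range(1, k + 1))

for k in (1, 2, 5, 10):
    s = partial_sum(k)
    print(k, s, 1 - s)  # the gap to 1 is exactly 10^(-k)
</code>

Whether that vanishing gap means the infinite sum equals 1 is precisely the point in dispute above; the code only shows what the finite truncations do.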
That is about it for me. If it is not clear, I cannot help you. I'd suggest letting go of clinging to a web link and trying some independent thought. Learn to criticize yourself as you would me; introspection is important. Stretch and take some risks; that is what the forum affords.