All you are saying here is that the formula "2 + 2 = 4" can be false when we assign the symbols a different interpretation than their conventional everyday usage.
That isn't completely correct. I highlighted in bold that when we change the rules, an operation like 2 + 2 can have a different result. So the "truths" of mathematics change when we change the rules. It's very arbitrary, actually.
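(To make that claim concrete, here is a minimal illustration of my own; the modulus 3 is an arbitrary choice, not something from the original post. Under the conventional integer rules, 2 + 2 evaluates to 4, but under the rules of arithmetic modulo 3 the same expression evaluates to 1.)

```python
# Conventional integer arithmetic vs. arithmetic modulo 3 ("clock" arithmetic).
a, b = 2, 2
print(a + b)        # 4 under the usual rules for the integers
print((a + b) % 3)  # 1 under the rules of Z_3, where results wrap around at 3
```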
Yes, but you are disagreeing with the "popular" practice of declaring that "2 + 2 = 4" is "objectively and absolutely right".
No, I'm not disagreeing with any popular practices. I'm pointing out that it is incorrect to say that 2 + 2 = 4 is forever, absolutely, and objectively correct. Mathematics ultimately rests on arbitrary rules that people have made up. Many people are obviously unaware of this fact, and I want to set the record straight.
It is when we do not "change the rules".
What is when we do not change the rules?
So I was "completely correct" to say that your argument is essentially a logic chopping exercise.
I hope that I can disagree with your opinion here. The issues I've brought up are not trivial at all; they are fundamental to the practice of mathematics. If we make dogmas out of mathematical assertions, then we stifle creative thought, and that can hobble the progress of mathematics.
Nobody takes the position that the formula means the same thing when you change the rules of interpretation. Normally, we use the conventional interpretation.
Actually, almost everybody on this thread aside from myself seems to be completely ignorant of the fact that mathematics rests on arbitrary, invented rules and definitions. And not only are they ignorant, but they seem to revel in willful ignorance.
Convention is determined by popularity, which accounts for why it is popular these days to claim that the formula is always true.
I suppose that's part of the reason why many people have made math into a set of dogmas, but I see this practice of making a religion out of math as stemming from the human desire for stability and assurance. Since conventional religion is on the wane, some people evidently find comfort in believing that other kinds of claims are "objective and absolute."
This has nothing to do with dogma or religion.
How do you know that? You seem very sure of something you cannot possibly know.
Mathematical symbols are formally defined in advance. If you want to change the definitions, then you are just using the same symbols in a non-normal context.
What is "normal"?
This is not a criticism of the popular claim that "2 + 2 = 4" is always true. It is always true, if you always interpret the symbols as defined by popular convention.
It is true by definition, depending on which definitions you choose to go by. For example, is 0 a natural number? The answer depends on how the set of natural numbers is defined. Some mathematicians include 0 in the set of natural numbers, and other mathematicians don't. Neither party is wrong; they just arbitrarily define the set of natural numbers the way they see fit.
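(A concrete, deliberately simple illustration of my own; neither predicate below is quoted from the thread. Both conventions are in real use, and the answer to "is 0 natural?" flips depending on which one you adopt.)

```python
# Two equally legitimate conventions for "natural number".
def is_natural_with_zero(n: int) -> bool:
    return n >= 0   # convention common in set theory and logic: N = {0, 1, 2, ...}

def is_natural_without_zero(n: int) -> bool:
    return n >= 1   # convention common in number theory: N = {1, 2, 3, ...}

print(is_natural_with_zero(0))     # True
print(is_natural_without_zero(0))  # False
```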
If you don't want to do that, go back in your corner and do as you please. Just don't insist that everyone else join you there.
Yes sir! You can be wrong if you choose to be wrong.
We just don't add "unless we change the definitions of the symbols to an extraordinary usage".
I'm not sure what you mean, but "extraordinary usage" of mathematical concepts is what has advanced mathematics for thousands of years. In addition to modular arithmetic, examples include √(-1), empty sets, and even the number 0.
Well, I think that your time scale here is a little screwy,
I never posted a timescale, but just for the record, Aristotle didn't recognize empty sets, and that's how things stood in logic for about two thousand years. People thought the idea of an empty set was "screwy," as you say, but when some mathematicians and logicians finally went against that convention, math and logic were greatly advanced.
So again, there's nothing trivial about what I'm arguing.
but I understand the desire to make a point as forcefully as one can. We didn't even have a basis for formal logic or math until Aristotle gave us the laws of identity and noncontradiction. Indian scholars created the integer symbols that we now use and the positional practice of representing powers of magnitude with sequences of digits. Muslim scholars passed the advanced notation on to Europe. The equal sign wasn't even invented until the 16th century.
I wonder if those advancements also came under fire by those who feared that their presuppositions weren't so right after all.
However, I do want to congratulate you on providing us with an excellent example of what it means to Chop Logic.
If you have an issue with my reasoning, then take it up with the mathematician David Poole. His book (see the attached file) is what let me know that we can change the rules in math to come up with some very useful techniques.
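(For readers who want a sense of the kind of "changing the rules" technique being alluded to, here is one standard application of modular arithmetic, offered as my own illustration rather than something quoted from Poole's book: the ISBN-10 check digit is computed with arithmetic modulo 11.)

```python
# ISBN-10 check digit: a routine use of "changed rules" (arithmetic mod 11).
def isbn10_check_digit(first_nine: str) -> str:
    # Weight the first nine digits 10 down to 2, then reduce the sum modulo 11.
    total = sum(int(d) * w for d, w in zip(first_nine, range(10, 1, -1)))
    check = (11 - total % 11) % 11
    return "X" if check == 10 else str(check)

print(isbn10_check_digit("030640615"))  # "2", giving the full ISBN 0-306-40615-2
```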
Nobody argues you cannot do that.
You amaze me with that kind of knowledge! You know what everybody argues? Wow! Very impressive.
The problem is that you took issue with what people do when they don't decide to change the rules. Just because you do, don't expect everyone else to start saying that "2 + 2 = 4" isn't always true. It is, as long as you don't change the rules. That is what makes the practice "popular," to use your term: not changing the rules.
No, I took issue with those who insisted that there are mathematical rules that are absolute and objective. If people wallow in that kind of fundamentalism and ignorance, then advancing mathematical knowledge is jeopardized.