Swammerdami
Squadron Leader
I don't think we have any "professional mathematicians" here, but being a mathematician is not a binary proposition. Certainly almost everyone in the thread is very familiar with the notion of modular arithmetic.
IIDB does have a few outstanding mathematicians, but most have steered clear of this thread! I do not have a PhD in math but several math PhDs have asked me for help on their proofs; and decades ago I scored 99.998 percentile on several math contests. I think my opinions have merit.
I think thread participants should apologize to each other, and find a more interesting topic.
Copernicus said:
Mathematical symbols are formally defined in advance. If you want to change the definitions, then you are just using the same symbols in a non-normal context. This is not a criticism of the popular claim that "2 + 2 = 4" is always true. It is always true, if you always interpret the symbols as defined by popular convention. If you don't want to do that, go back in your corner and do as you please. Just don't insist that everyone else join you there.
This is correct, except for the phrase "non-normal context." A phrase like "less common usages" would have been less judgmental. And less likely to provoke Mr. Soldier.
OP's "mathematics-is-neither-absolutely-nor-objectively-right" is much more unfortunate. The rules of arithmetic on the integers ARE absolutely and objectively correct, and so are the rules for other well-founded structures — residue rings, p-adic fields — or even non-commutative division algebras, e.g. the quaternions. Yes, symbols like "+" are overloaded and applied in domains other than integer arithmetic. So what? Does Unknown Soldier think each mathematical algebra needs its own Unicode character set? Leaping from the overloading on a finite keyboard to conclusions in epistemology or ontology seems rather < Deleted by Mod >.
I tried to hint at all this earlier by quoting famous mathematicians, but to no avail. Let me try again.
I cannot endorse the discussions in this thread, but they are vaguely related to the conflicting schools of mathematical philosophy. I've appended a few insightful quotations from pre-eminent mathematicians.
Formalism
David Hilbert said:
Mathematics is a game played according to certain simple rules with meaningless marks on paper.
~ ~ ~ ~ ~ ~ ~
Georg Cantor said:
The essence of mathematics lies in its freedom.
In other news, today or tomorrow I will start a thread with a title like "Did Jesus of Nazareth exist? How to estimate the probability." I hope our mathematicians will read the thread, asking intelligent questions and making other contributions.
Finally, a nitpick:
Boolean Algebra
a && b = c
In the two examples the equal sign has different meaning.
Nitpick: I've never seen "&&" in a mathematical paper; it looks instead like C code. But C code with a syntax error. Did you mean "a && b == c"? (And did you omit parentheses for precedence?)