
Why mathematics is neither absolutely nor objectively "right."

It's very popular these days to see mathematics as one truth we can know is objectively and absolutely right. I must disagree. A common example of this supposed absolute and objective truth is the equation 2 + 2 = 4. We are told that for all times and places, 2 + 2 = 4, no matter what! If human beings went extinct, then 2 + 2 = 4 still holds as true. If the dinosaurs had the brains, then they would have known 2 + 2 = 4. If there's an advanced civilization of extraterrestrials in the Andromeda galaxy, then they know 2 + 2 = 4.

It's not true that 2 + 2 is absolutely 4. Depending on the rules mathematicians are using, 2 + 2 = 0 might be the case (in arithmetic modulo 4, for example). In other systems, 2 + 2 might not even be defined, much less true: in binary notation, for instance, there is no digit 2.

So the truth of 2 + 2 = 4 depends on what arbitrary set of rules you are using. No absolute or objective truth can be so arbitrary. Hence in general, mathematics is neither absolutely nor objectively "right."
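To make this concrete, here is a minimal sketch (Python, my own illustration, not part of any textbook argument) of the same sum under two different sets of rules:

Code:
a, b = 2, 2
print(a + b)        # ordinary integer arithmetic: 4
print((a + b) % 4)  # arithmetic modulo 4: the very same sum comes out 0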

All you are saying here is that the formula "2 + 2 = 4" can be false when we assign a different interpretation to the symbols than is conventional everyday usage. Since convention is determined by popularity, that accounts for why it is popular these days to claim that the formula is always true. We just don't add "unless we change the definitions of the symbols to an extraordinary usage".

However, I do want to congratulate you on providing us with an excellent example of what it means to Chop Logic.
 
It's very popular these days to see mathematics as one truth we can know is objectively and absolutely right. I must disagree. A common example of this supposed absolute and objective truth is the equation 2 + 2 = 4. We are told that for all times and places, 2 + 2 = 4, no matter what! If human beings went extinct, then 2 + 2 = 4 still holds as true. If the dinosaurs had the brains, then they would have known 2 + 2 = 4. If there's an advanced civilization of extraterrestrials in the Andromeda galaxy, then they know 2 + 2 = 4.

It's not true that 2 + 2 is absolutely 4. Depending on the rules mathematicians are using, 2 + 2 = 0 might be the case (in arithmetic modulo 4, for example). In other systems, 2 + 2 might not even be defined, much less true: in binary notation, for instance, there is no digit 2.

So the truth of 2 + 2 = 4 depends on what arbitrary set of rules you are using. No absolute or objective truth can be so arbitrary. Hence in general, mathematics is neither absolutely nor objectively "right."

All you are saying here is that the formula "2 + 2 = 4" can be false when we assign a different interpretation to the symbols than is conventional everyday usage.
That isn't completely correct. I highlighted in bold that when we change the rules, then an operation like 2 + 2 can have a different result. So the "truths" of mathematics change when we change the rules. It's very arbitrary, actually.
Since convention is determined by popularity, that accounts for why it is popular these days to claim that the formula is always true.
I suppose that's part of the reason why many people have made math into a set of dogmas, but I see this practice of making a religion out of math as the human desire to seek stability and assurance. Since conventional religion is on the wane, some people evidently feel some comfort in believing that other kinds of claims are "objective and absolute."
We just don't add "unless we change the definitions of the symbols to an extraordinary usage".
I'm not sure what you mean, but "extraordinary usage" of mathematical concepts is what has advanced mathematics for thousands of years. In addition to modular arithmetic, examples include √(-1), empty sets, and even the number 0.
However, I do want to congratulate you on providing us with an excellent example of what it means to Chop Logic.
If you have an issue with my reasoning, then take it up with the mathematician David Poole. His book (see the attached file) is what let me know that we can change the rules in math to come up with some very useful techniques.
 

Attachments

  • Linear Algebra David Poole Pages 14 and 15.png
Reading books about math is not the same as having comprehension.

You are not reasoning mathematically; you are quote mining, most likely out of context. It's not like I never saw it in the workplace.

Again, Mr. Math: for any integer n, is there always an (n + 1) > n?

You referred to linear algebra to make a point; do you understand something as basic as Gaussian elimination? Without Google assistance, of course.

No, I am not a mathematician. For me, math was a tool. What was more important to me was knowing the properties of materials and components. While I did mathematical analysis as needed and built models for my simulations, the last thing I wanted to do was math.

The word that comes to mind when you post is dilettante. There is a difference between knowing a few facts and comprehension.

I have been around too long for you to make any headway with me. I have seen it all before.
 
In applied math, including engineering, Laplace transforms are ubiquitous.

There is a proof that for any f(s) there is one and only one transform pair. Two different functions cannot have the same transform.

If an exception were ever found, it would have serious consequences in technology. The same can be said of the Fourier transform. Laplace and Fourier transforms appear wherever there are dynamic electrical and mechanical systems.

In part, I'd say this is what mathematicians do. Important work.

I am sure a mathematician can wade through this with ease.


Why do Fourier and Laplace transforms work?
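For reference, the uniqueness claim has a standard textbook form (my paraphrase of what is usually cited as Lerch's theorem, not a quotation from anyone in this thread). The Laplace transform is

$$\mathcal{L}\{f\}(s) = \int_0^{\infty} f(t)\,e^{-st}\,dt,$$

and if $\mathcal{L}\{f\} = \mathcal{L}\{g\}$ on a common region of convergence, then $f(t) = g(t)$ for almost every $t \ge 0$; two functions that genuinely differ (on more than a set of measure zero) cannot share the same transform.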
 
Reading books about math is not the same as having comprehension.
True, but you should try reading math books anyway.
You are not reasoning mathematically; you are quote mining, most likely out of context.
LOL--how can I quote-mine out of context when I post a copy of my source? Well, I suppose I could have Photoshopped it, but all you need to do is get a copy of Poole's book to see that I am completely right about what I say.
It's not like I never saw it in the workplace.
They quote-mine at Walmart?
Again, Mr. Math: for any integer n, is there always an (n + 1) > n?
Yes. That's the rule.
You referred to linear algebra to make a point; do you understand something as basic as Gaussian elimination? Without Google assistance, of course.
Yes. I've been immersed in Gaussian elimination. Do you have a matrix you want me to row-reduce?
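Since Gaussian elimination came up, here is a minimal sketch of the algorithm with partial pivoting (Python, my own illustration, not taken from Poole's book):

Code:
def gaussian_elimination(A, b):
    """Solve A x = b by forward elimination with partial pivoting, then back-substitution."""
    n = len(A)
    # Build the augmented matrix [A | b] as floats.
    M = [[float(v) for v in row] + [float(rhs)] for row, rhs in zip(A, b)]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the entries below the pivot.
        for row in range(col + 1, n):
            factor = M[row][col] / M[col][col]
            for k in range(col, n + 1):
                M[row][k] -= factor * M[col][k]
    # Back-substitution.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Example: x + y + z = 6, 2y + 5z = -4, 2x + 5y - z = 27  ->  [5.0, 3.0, -2.0]
print(gaussian_elimination([[1, 1, 1], [0, 2, 5], [2, 5, -1]], [6, -4, 27]))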
No, I am not a mathematician. For me, math was a tool. What was more important to me was knowing the properties of materials and components. While I did mathematical analysis as needed and built models for my simulations, the last thing I wanted to do was math.
I bet.
The word that comes to mind when you post is dilettante. There is a difference between knowing a few facts and comprehension.
You can build math comprehension by studying the underlying reasoning behind set theory.
I have been around too long for you to make any headway with me. I have seen it all before.
But that was never my goal. I set out to demonstrate that math is neither absolutely nor objectively right. The math fundamentalists respond with outrage.
 
It's very popular these days to see mathematics as one truth we can know is objectively and absolutely right. I must disagree. A common example of this supposed absolute and objective truth is the equation 2 + 2 = 4. We are told that for all times and places, 2 + 2 = 4, no matter what! If human beings went extinct, then 2 + 2 = 4 still holds as true. If the dinosaurs had the brains, then they would have known 2 + 2 = 4. If there's an advanced civilization of extraterrestrials in the Andromeda galaxy, then they know 2 + 2 = 4.

It's not true that 2 + 2 is absolutely 4. Depending on the rules mathematicians are using, 2 + 2 = 0 might be the case (in arithmetic modulo 4, for example). In other systems, 2 + 2 might not even be defined, much less true: in binary notation, for instance, there is no digit 2.

So the truth of 2 + 2 = 4 depends on what arbitrary set of rules you are using. No absolute or objective truth can be so arbitrary. Hence in general, mathematics is neither absolutely nor objectively "right."

All you are saying here is that the formula "2 + 2 = 4" can be false when we assign a different interpretation to the symbols than is conventional everyday usage.
That isn't completely correct. I highlighted in bold that when we change the rules, then an operation like 2 + 2 can have a different result. So the "truths" of mathematics change when we change the rules. It's very arbitrary, actually.

Yes, but you are disagreeing with the "popular" practice of declaring that "2 + 2 = 4" is "objectively and absolutely right". It is when we do not "change the rules". So I was "completely correct" to say that your argument is essentially a logic chopping exercise. Nobody takes the position that the formula means the same thing when you change the rules of interpretation. Normally, we use the conventional interpretation.


Since convention is determined by popularity, that accounts for why it is popular these days to claim that the formula is always true.
I suppose that's part of the reason why many people have made math into a set of dogmas, but I see this practice of making a religion out of math as the human desire to seek stability and assurance. Since conventional religion is on the wane, some people evidently feel some comfort in believing that other kinds of claims are "objective and absolute."

Rubbish. This has nothing to do with dogma or religion. Mathematical symbols are formally defined in advance. If you want to change the definitions, then you are just using the same symbols in a nonnormal context. This is not a criticism of the popular claim that "2 + 2 = 4" is always true. It is always true, if you always interpret the symbols as defined by popular convention. If you don't want to do that, go back in your corner and do as you please. Just don't insist that everyone else join you there.


We just don't add "unless we change the definitions of the symbols to an extraordinary usage".
I'm not sure what you mean, but "extraordinary usage" of mathematical concepts is what has advanced mathematics for thousands of years. In addition to modular arithmetic, examples include √(-1), empty sets, and even the number 0.

Well, I think that your time scale here is a little screwy, but I understand the desire to make a point as forcefully as one can. We didn't even have a basis for formal logic or math until at least Aristotle gave us the laws of identity and noncontradiction. Indian scholars created the integer symbols that we now use and the positional notation in which sequences of digits represent powers of magnitude. Muslim scholars passed the advanced notation on to Europe. The equal sign wasn't even invented until the 16th century.


However, I do want to congratulate you on providing us with an excellent example of what it means to Chop Logic.
If you have an issue with my reasoning, then take it up with the mathematician David Poole. His book (see the attached file) is what let me know that we can change the rules in math to come up with some very useful techniques.

Nobody argues you cannot do that. The problem is that you took issue with what people do when they don't decide to change the rules. Just because you do, don't expect everyone else to start saying that "2 + 2 = 4" isn't always true. It is, as long as you don't change the rules. That is what makes the practice "popular", to use your term--not changing the rules.
 
Define what you mean by absolutely and objectively right.


There are only a finite number of mathematical symbols and words.

Meaning can vary in different areas. Different areas of math have different rules. If that is what you are saying, no kidding.

algebra
a = 2
b = 2
x = a + b

Boolean Algebra
a && b = c

In the two examples the equal sign has different meaning.

Algebraically '=' means a quantitative equivalence.

In Boolean the equal sign assigns a state to the output of a logic function. Not a quantitative equivalence.

4%3 = 1, modulo. The equal sign does not mean quantitative equivalence. It assigns the output of the modulo function to a number.

(4%3) + 1 = 2 does represent quantitative equivalence.


Both algebra and Boolean boil down to definitions of what = means.

Different math systems can use symbols which have different meanings.

What you seem to be doing is saying that math, specifically algebra and arithmetic, is unreliable and not consistent. Which is wrong.
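A rough sketch of the same point in Python (my own illustration): '=' binds a name, '==' tests equivalence, and a Boolean expression yields a truth value rather than a quantity.

Code:
a = 2                  # '=' here is assignment, not a claim of equivalence
b = 2
print(a + b == 4)      # '==' tests quantitative equivalence: True
print(True and False)  # Boolean algebra: the result is a truth value, not a quantity
print(4 % 3 == 1)      # modulo operation followed by an equivalence test: True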
 
All you are saying here is that the formula "2 + 2 = 4" can be false when we assign a different interpretation to the symbols than is conventional everyday usage.
That isn't completely correct. I highlighted in bold that when we change the rules, then an operation like 2 + 2 can have a different result. So the "truths" of mathematics change when we change the rules. It's very arbitrary, actually.

Yes, but you are disagreeing with the "popular" practice of declaring that "2 + 2 = 4" is "objectively and absolutely right".
No, I'm not disagreeing with any popular practices. I'm pointing out that it is incorrect that 2 + 2 = 4 is forever absolutely and objectively correct. Mathematics ultimately rests on arbitrary rules that people have made up. Many people are obviously unaware of this fact, and I want to set the record straight.
It is when we do not "change the rules".
What is when we do not change the rules?
So I was "completely correct" to say that your argument is essentially a logic chopping exercise.
I hope that I can disagree with your opinion here. The issues I've brought up are not trivial at all but are very fundamental to the practice of mathematics. If we make dogmas out of mathematical assertions, then we stifle creative thought, which hobbles the progress of mathematics.
Nobody takes the position that the formula means the same thing when you change the rules of interpretation. Normally, we use the conventional interpretation.
Actually, almost everybody here on this thread aside from myself seems to be completely ignorant of the fact that mathematics rests on arbitrary, invented rules and definitions. And not only are they ignorant, but they seem to revel in willful ignorance.
Since convention is determined by popularity, that accounts for why it is popular these days to claim that the formula is always true.
I suppose that's part of the reason why many people have made math into a set of dogmas, but I see this practice of making a religion out of math as the human desire to seek stability and assurance. Since conventional religion is on the wane, some people evidently feel some comfort in believing that other kinds of claims are "objective and absolute."

This has nothing to do with dogma or religion.
How do you know that? You seem very sure of something you cannot possibly know.
Mathematical symbols are formally defined in advance. If you want to change the definitions, then you are just using the same symbols in a nonnormal context.
What is "normal"?
This is not a criticism of the popular claim that "2 + 2 = 4" is always true. It is always true, if you always interpret the symbols as defined by popular convention.
It is by definition true depending on which definitions you choose to go by. For example, is 0 a natural number? The answer depends on how the set of natural numbers is defined. Some mathematicians include 0 in the set of natural numbers, and other mathematicians don't. Neither party is wrong; they just arbitrarily define the set of natural numbers the way they see fit.
If you don't want to do that, go back in your corner and do as you please. Just don't insist that everyone else join you there.
Yes sir! You can be wrong if you choose to be wrong.
We just don't add "unless we change the definitions of the symbols to an extraordinary usage".
I'm not sure what you mean, but "extraordinary usage" of mathematical concepts is what has advanced mathematics for thousands of years. In addition to modular arithmetic, examples include √(-1), empty sets, and even the number 0.

Well, I think that your time scale here is a little screwy,
I never posted a timescale, but just for the record, Aristotle didn't recognize empty sets, and that's the way set theory went for about two thousand years. People thought the idea of an empty set was "screwy," as you say, but finally, when some mathematicians and logicians went against that convention, math and logic were greatly advanced.

So again, there's nothing trivial about what I'm arguing.
but I understand the desire to make a point as forcefully as one can. We didn't even have a basis for formal logic or math until at least Aristotle gave us the laws of identity and noncontradiction. Indian scholars created the integer symbols that we now use and the positional notation in which sequences of digits represent powers of magnitude. Muslim scholars passed the advanced notation on to Europe. The equal sign wasn't even invented until the 16th century.
I wonder if those advancements also came under fire by those who feared that their presuppositions weren't so right after all.
However, I do want to congratulate you on providing us with an excellent example of what it means to Chop Logic.
If you have an issue with my reasoning, then take it up with the mathematician David Poole. His book (see the attached file) is what let me know that we can change the rules in math to come up with some very useful techniques.

Nobody argues you cannot do that.
You amaze me with that kind of knowledge! You know what everybody argues? Wow! Very impressive.
The problem is that you took issue with what people do when they don't decide to change the rules. Just because you do, don't expect everyone else to start saying that "2 + 2 = 4" isn't always true. It is, as long as you don't change the rules. That is what makes the practice "popular", to use your term--not changing the rules.
No, I took issue with those who insisted that there are mathematical rules that are absolute and objective. If people wallow in that kind of fundamentalism and ignorance, then advancing mathematical knowledge is jeopardized.
 
Within a mathematical system the rules are absolute.

Algebra
Euclidean Geometry
Arithmetic
Boolean Algebra
Calculus
Probability Theory

The Fundamental Theorem of Calculus is Calculus 101. All science and engineering students go through it. It is a definition and a proof.
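For the record, the standard statement (textbook form, not a quotation from this thread): if $f$ is continuous on $[a, b]$ and $F$ is any antiderivative of $f$ (that is, $F' = f$), then

$$\int_a^b f(x)\,dx = F(b) - F(a), \qquad \frac{d}{dx}\int_a^x f(t)\,dt = f(x).$$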


If you expand the question to math as a whole then that becomes a different question, more philosophy than math.

US, you are posting a lot of words but you are not saying anything mathematical. Just unsubstantiated opinion.
 
Or are you one of those people who create those stupid Facebook posts asking what 32 plus 17 divided by half equals, and think you've broken math?
No. I'm a mathematician and have studied these issues for decades. I've found it very common for people to think there's something magical about mathematics. But actually mathematics was invented to get work done--work that is peculiar to people. Math is merely a language and methodology we use to describe and work with all those shapes, measures, quantities, and orders that we fuss over.
What issue? That math has different avenues? Math isn't magic; numbers aren't even magic. Who is saying it is magic? You boast that math isn't objective or absolute. And you have yet to actually demonstrate that, other than by doing math poorly.
 
Actually, almost everybody here on this thread aside from myself seems to be completely ignorant of the fact that mathematics rests on arbitrary, invented rules and definitions. And not only are they ignorant, but they seem to revel in willful ignorance.

Your logic chopping and ad hominems aside, it is clear from reading the thread that everyone understands that, which brings us to your final fallacy: argumentum ad nauseam.
 
Or are you one of those people who create those stupid Facebook posts asking what 32 plus 17 divided by half equals, and think you've broken math?
No. I'm a mathematician and have studied these issues for decades. I've found it very common for people to think there's something magical about mathematics. But actually mathematics was invented to get work done--work that is peculiar to people. Math is merely a language and methodology we use to describe and work with all those shapes, measures, quantities, and orders that we fuss over.
What issue? That math has different avenues? Math isn't magic; numbers aren't even magic. Who is saying it is magic?
Uh--you're taking what I said a bit too literally. I'm not saying that somebody actually said, "math is magic." Nevertheless, the insistence that math is objectively and absolutely true, and that 2 + 2 = 4 is true and that's that, has been voiced on this thread and is as bad as saying that math is magic. This "math fundamentalism," as I like to call it, has its roots not in mathematics but in Platonic philosophy. Plato believed that mathematical truths are part of the very fabric of reality. As such, they weren't invented but discovered by great efforts of the mind. That's nonsense, of course, but it's very popular nonsense that is with us to this day (and on this thread).
You boast that math isn't objective or absolute.
How is stating a fact a boast?
And you have yet to actually demonstrate that, other than by doing math poorly.
Well, it looks like you have your mind made up. But just in case it's possible to pry that mind open to get some knowledge into it, I'd recommend you do some further study if you honestly don't find my treatment of the subject to be satisfactory.
 
Is this what US is talking about? Should anyone care? :unsure: Because I don’t.
 
You are arguing math is not absolute and objective. If the premise is that:
That's about right. What 2 + 2 equals depends on the rules being employed by the mathematician. It's what I've argued from the OP.

You are arguing that math is not objective, which implies math is subjective, meaning results are subjective, depending on how you interpret?

Your premise is that rules change across math. The conclusion is that math is not objective.

The conclusion does not follow from the premise.

Math can be misapplied. Figuring out how to apply math to a problem can be subjective. Math has rules but there are no rules on how to apply math to a problem.


Properly applying the rules of math, be it geometry or algebra, always yields an objective, unambiguous result.

In algebra, 1 + 1 = 2 is always true.
a*(x+y) = ax + ay is always true in algebra.

4 mod 3 is always 1.
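These identities can even be checked mechanically; a minimal sketch using the sympy library (my own illustration, assuming sympy is installed):

Code:
from sympy import symbols, expand

a, x, y = symbols('a x y')
# The distributive law holds identically, for all values of the symbols.
print(expand(a*(x + y)).equals(a*x + a*y))  # True
print(4 % 3 == 1)                           # True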


The proofs, I believe, lie in field theory and the Peano axioms. Theoretical math never interested me. The answer to your question probably lies in assimilating one or two texts and a grad class.

Not something you can just look up on the net.

Your assertion that math is not objective is a subjective, ill-informed opinion.

Part of your problem is you never really learned any math and used it. I don't have a clue as to why you keep referring to a modulus calculator as proof of your assertion.

Your transparent personal attacks are weak. Your ignorance is glaring.

It sounds like you have a nebulous, ill-formed idea which you are trying to present.

Define what you mean by objective and the lack of it.
Define clear premises.
Define a conclusion.

If you can't do that you are just blowing hot air.
 
Is this what US is talking about? Should anyone care? :unsure: Because I don’t.
I am retired and US provides entertainment. Not exactly stimulating conversation but something to engage in.
 