Why, you can make "2" anything you want it to be! The sky won't fall if you do. Likewise, you can have 2 + 2 = 4, or 2 + 2 = 0, or 2 + 2 = The Cat's Meow. There's no law of nature saying you can't. People make all this stuff up. Sure, what math people make up may prove useful, or fashionable, or impressive. And for that matter, much of what math people invent can cause great consternation on internet forums when some guy comes along and tries to tell them it's invented. But no matter how much they protest, calling that revelation things like "a full-on, OCD-triggering horror show for any pure mathematician," it doesn't change the fact that math is neither absolute nor objective.
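To make the "2 + 2 = 0" case concrete, here's a minimal sketch in Python (my illustration, not anyone's claim in this thread) using arithmetic modulo 4, one conventional system in which that sum really does hold:

```python
# In arithmetic modulo 4 (the integers mod 4), addition wraps around,
# so 2 + 2 comes out to 0 rather than 4.
print((2 + 2) % 4)  # 0

# Under the usual axioms for the integers, the same sum is 4.
print(2 + 2)  # 4
```

Same symbols, different rules, different answer. Which set of rules you adopt is a choice.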
Discovered, not invented or made up.
You asserted it. Now prove that assertion. You'll need to demonstrate that math exists prior to its discovery.
That "mathematics is discovered" is not a proposition. It's not something to be proven true or false. It's a heuristic intended to reveal the underlying philosophical commitments of pure mathematicians. So long as you look at mathematics as something that's created or invented, you're blinding yourself to how pure mathematicians approach their subject.
Mathematicians do create descriptions and invent methods, but that's not the mathematics itself. The math is the thing that was out there, self-existent before someone came along to notice it. It's the abstraction that exists even if no one ever discovers it. Math is the a priori that makes descriptions and methods possible.
Certainly, mathematics is fundamentally arbitrary. But to stop there is to miss the entirely objective constraints that mathematics imposes.
There are apocryphal tales that circulate among PhD candidates about a dissertation on a newly-discovered class of algebraic structures that had to be withdrawn when it was shown the class was empty. There's the actual tale of Principia Mathematica, the attempt by Russell and Whitehead to create a complete set-theoretic foundation for mathematics destroyed by Gödel in his "Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme, I."
More prosaically, math is the essential that remains true even when mathematical models show that bumblebees can't fly.
I'd say that math is essentially invented. Laplace Transforms were invented by Laplace, and the Cartesian Coordinate System was invented by Descartes. We can see math being invented throughout the history of mathematics by many different people. All that math wasn't just lying around waiting to be dug up. The knowledge that math is invented is grounded in the practice of basing mathematics on axioms. Axioms are arbitrary rules that people make up. As such, they are the product of human ingenuity and creativity (i.e., inventions). I've demonstrated the role of arbitrary axioms in mathematics in my "What proof is there that 2 + 2 = 4?" thread.
That said, there is a discovery of sorts in math, in which an idea, once invented, later leads to unforeseen conclusions. Circles, for example, were invented, but later the number π was found to be the ratio of the circumference of any circle to the measure of its diameter. Nobody, including the inventors of circles, expected π.
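That convergence to π can even be replayed numerically. Here's a minimal sketch (my illustration, using Archimedes' classic inscribed-polygon method): start from a hexagon inscribed in a unit circle and keep doubling the number of sides; the perimeter-to-diameter ratio closes in on π.

```python
import math

def pi_by_polygons(doublings=20):
    """Archimedes' method: perimeter/diameter of regular polygons
    inscribed in a unit circle, starting from a hexagon (side length 1)."""
    n, s = 6, 1.0
    for _ in range(doublings):
        # Side length after doubling the number of sides, written in a
        # numerically stable form (equivalent to sqrt(2 - sqrt(4 - s*s))).
        s = s / math.sqrt(2.0 + math.sqrt(4.0 - s * s))
        n *= 2
    return n * s / 2  # perimeter divided by the diameter (2)

print(pi_by_polygons())  # close to math.pi
```

Nobody put π into that loop; the ratio was waiting at the end of it.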
Kinda, but you're getting it all bass ackwards. Nobody invented a circle.
A circle is the locus of points equidistant from a chosen center. It's not something that exists more than approximately in the real world. It didn't come into existence the first time some primate rounded off a rock to help it roll better. It always existed. And even that's too restrictive. Its existence lies outside the bounds of space and time, independent of any potential or conceivable universe.
Neither is the ratio between circumference and diameter something that can be measured.
That's the math you're missing by insisting that math depends on its applications. This is the divide between pure and applied mathematics.
The descriptions codified by Laplace (and later Heaviside) and by Descartes are indeed their own inventions. But the relationships those descriptions capture must have existed before Laplace and Descartes created them, or there would have been nothing for them to describe. And those relationships would still exist if Laplace or Descartes had never been born.
Yes, just lying around waiting to be dug up.
And again, mathematicians don't care if the mathematics they discover is useful.
That's flat-out false. Newton, for example, did some major work in developing calculus because he needed it to do his work in physics.
Perhaps not the best example. There's a reason we use Leibniz notation, and no committee secretly chaired by Newton granting him priority is going to change that.
In the sense that even the purest of mathematicians has to eat, yes, we care if somebody's willing to pay us to pursue our research. But pure math is abstract art for geeks. It's paradoxically described as the search for transuniversal truths that some physicist or other scientist won't someday sully with an application.
The search is hopeless, by the way. When algebraic number theory was found to have applications in cybersecurity, for many of us the last of our dreams was crushed. George Boole had a great run, too.
As it turns out, surprisingly enough, it is, but that's irrelevant to the underlying abstractions that we work with.
Certainly it's a fact that math isn't objective and is no more absolute than the arbitrarily accepted axioms upon which theory is based.
Right! So you get it. Why are you arguing with me?
Because your proofs are nonsense.
But it's not because two can't be represented in binary...
The conventional representation of 2 in binary is 10. Somebody cooked that up, of course.
...or because the residues in modular arithmetic aren't equivalence classes, or because of any other counterfactual you've somehow absorbed and can't seem to keep yourself from sharing.
I do like to share my knowledge, that's true.
And you also share your belief that a representation of the number two using positional notation is the number itself. Which is nonsense.
It's one thing to claim something is true. It's another thing to support that claim. Congratulations on the former.
But I do support my claims. By contrast, your earlier claim that math is discovered is completely unsupported. The pot calls the kettle black!
If only you'd stopped when you were ahead.
I can understand why you want me to stop.
Ice cream has no bones. That's a true statement that supports no theorem in mathematics. The digit 2 doesn't exist in binary representations. Neither is it true that 2 + 2 isn't 4 because the digit 2 doesn't exist in binary representations.
Would you like to see a proof?
\(10_2 + 10_2 = 100_2\)
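For anyone who wants to check that mechanically, here's a minimal sketch in Python: build the number from the base-2 numeral "10", add it to itself, and look at the result's base-2 numeral.

```python
# int("10", 2) reads the base-2 numeral "10", which names the number two.
two = int("10", 2)
total = two + two

print(total)       # 4
print(bin(total))  # 0b100 -- the base-2 numeral for four
```

The numerals change with the base; the number they name, and the sum, do not.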
As for me, I can live with invented, subjective, and relative math. It's still elegant, fun, and very challenging. Heck, I study it hours a day--every day. And I've been learning it in school and through self-study for decades. After all that hard work, I can go online to see what others think of it. And no surprise--I see they've made it into an idol.
Read all the books you like. I commend autodidacticism.
I'll keep reading books. I recommend you read some books too.
What is the point of comments like this?
You're not an autodidact.
There's no benefit in reading a book you can't understand, and less than no benefit when your lack of comprehension causes you to learn things that are not true.
I'll let you know if that ever happens!
Only if you discover it, and probably not then, either.
Honestly, many of the books I study can be hard for me to understand. But I would be an idiot to take your advice and stop studying them for that reason! As I see it, if I study ten new concepts and I only understand one of them, then I've learned one concept.
First of all, no, that's not what I'm saying. Yes, read books. But once you've read them, engage in the critical examination necessary to ensure comprehension. The former without the latter leads to the garbled miscomprehensions evident in your posts here.
If the book is a math or science text, it will have exercises listed after every topic, generally with answers to the even-numbered problems in an appendix for the benefit of informal students like yourself.
Use the questions to critically examine whether you've actually learned the concepts. Or ask someone who's already mastered the concept to go over it with you. Or ask the professor who wrote the book. You'd be amazed how open they are to responding to correspondence.
Hell, I've got correspondence from Neil deGrasse Tyson in my email. Because I wrote him out of the blue with no introduction, and he wrote me back.
In any case, self-study has been very beneficial for me. When I was in college I prepared for many of my courses by studying beforehand. I ended up with a four-year degree and a 4.0 GPA.
The scary thing is that that's actually possible.
That's the time to seek help from reliable sources who can steer you away from some of the bizarre claims you've been making.
You're making one of the biggest goofs in mathematics here: you are relying on intuition and rejecting whatever seems strange to you. Many truths in mathematics, as well as in science, are counterintuitive. Truth doesn't care if it makes sense to us.
And those "bizarre claims" you mention are all based in conventional mathematics and logic. I'm not making up anything.
And it's also possible that your miscomprehensions are not original.
That's not personal. It's something that happens to all of us. I had a whole chapter of my dissertation wiped because one member of my committee noticed I'd headed off on a bridge to nowhere. The point being that without independent critical examination, there are no guard rails, and without guard rails, running off the cliff is a question of when, not if.
I can only wonder what they would think of your latest post.
That I'm wasting my time, probably.
I've got plenty of other students to keep me busy.
Send them my way. I've worked as a math tutor. They may well need one.
Yes, they do need tutors, often enough, but no, that's not you, either. We have a budget for tutors. The applicants must at minimum be enrolled in a graduate program.
A great video to watch that explains my position on mathematics is Philosophical Failures of Christian Apologetics, Part 4: Word Games. Note that Christian apologists see mathematics in a way similar to the way you see it.
Because closing with insults is just how you roll, right?
Your position on mathematics lacks the relevance of a mathematician's position on mathematics. Something you can only learn by asking mathematicians about their positions. And something you will never learn so long as you strive to impose your own limitations on others.
Where are your solutions to the questions you posed in the OP?