
Philosophies of Mathematics

beero1000

Here's a thread for discussing the philosophies of mathematics, their merits and faults. Constructivism, formalism, logicism, platonism, etc. What do you think?

IMO, at the surface level, the majority of working mathematicians think of their day-to-day mathematics platonistically, with the understanding that their real foundation lies elsewhere. I would say that most would probably end up backing some flavor of logicism or formalism, but there are a variety of other philosophies that I've seen from the seemingly reasonable constructivism to the seemingly unreasonable ultrafinitism.

I think Phil Scott mentioned that he would be interested in giving his opinions on constructivism here. I'd be interested in seeing that, or any other well thought out opinions.
 
Interesting.

I recently listened to a mathematician on TV saying that to her, math has a kind of separate existence and reality outside the mind. A science mysticism.

To me math is something you learn from what has come before, in part by trial and error, and experience. No different from any other area. The issue with math is that it takes more concentration and work to learn than most things.

To me math is a function of our biological neural net.

I have found some philosophy useful, like Popper and Descartes, along with a few others.

I do not think of any philosophy when doing math and science; it is learned by doing. Maybe that stands as a philosophy.
 
I'd want to make a distinction that I find crucial when discussing philosophies of mathematics. There's a joke that says that mathematicians are platonists during the week and formalists at the weekend, which at least implies that there's very little practical difference between the two stances.

This isn't the case when discussing intuitionists, finitists and classical mathematicians. Each of the three groups thinks the other two are trying to prove the wrong theorems, and they can't chalk up the difference to a particular take on their day job. Their day jobs are different, with the constructivist and finitist working to more restricted reasoning practices and, consequently, proving what they consider to be much stronger theorems.

I do enjoy waxing philosophical about the difference between the weekend formalist and the day-worker platonist, but I put far more importance on the difference between intuitionistic and classical logic, since the maths comes out very differently depending on the approach. I want to see more mathematics done intuitionistically, and don't really care what you get up to at the weekend.

That said, I think I can argue quite well that there's something blatantly platonistic in the language used by modern day classical mathematicians, while there is something blatantly intuitionistic about the language used by the ancient Greeks. Where the Greeks are lacking, I think, is in a constructive account of negation: we know what it means to draw a triangle, but what does it mean to show that a particular geometric figure is not constructible? I think that idea now has a very clear and inescapable formalisation in modern intuitionistic logic, where the classical idea of binary truth values is just a special case.
 
I'd want to make a distinction that I find crucial when discussing philosophies of mathematics. There's a joke that says that mathematicians are platonists during the week and formalists at the weekend, which at least implies that there's very little practical difference between the two stances.

This isn't the case when discussing intuitionists, finitists and classical mathematicians. Each of the three groups thinks the other two are trying to prove the wrong theorems, and they can't chalk up the difference to a particular take on their day job. Their day jobs are different, with the constructivist and finitist working to more restricted reasoning practices and, consequently, proving what they consider to be much stronger theorems.

Do they each think that? I might agree that the 'think they're trying to prove the wrong theorems' could be true from a more restrictive view of less restrictive reasoning, but why the other way? I think you'd be hard-pressed to find a classical mathematician who thinks that about intuitionist math - after all, every intuitionist theorem is also a classical theorem, and finding explicit constructions is useful in its own right. We might look at you a bit funny for making life harder on yourselves, same as the compass-and-ruler people looked at the compass-and-straightedge people (everybody looked at Mohr and Mascheroni funny :D), but theorems are theorems.

I do enjoy waxing philosophical about the difference between the weekend formalist and the day-worker platonist, but I put far more importance on the difference between intuitionistic and classical logic, since the maths comes out very differently depending on the approach. I want to see more mathematics done intuitionistically, and don't really care what you get up to at the weekend.

Ok, can you explain why you think we should be doing math intuitionistically over classically? Yes, the math comes out different, but why do you think one is better than the other? What does 'better' mean in this situation? Is this different than a compass-and-straightedge vs compass-and-ruler situation?

That said, I think I can argue quite well that there's something blatantly platonistic in the language used by modern day classical mathematicians, while there is something blatantly intuitionistic about the language used by the ancient Greeks. Where the Greeks are lacking, I think, is in a constructive account of negation: we know what it means to draw a triangle, but what does it mean to show that a particular geometric figure is not constructible? I think that idea now has a very clear and inescapable formalisation in modern intuitionistic logic, where the classical idea of binary truth values is just a special case.

Can you expand on this? From my readings of the classical Greek mathematicians, that is not the case in the arguments they explicitly used, even if modern-day intuitionist proofs can be found for those results.
 
Interesting thread to think about.

I've always thought of mathematics as a sort of numeric language that describes relationships and functions that exist in reality.

What does a person hope to understand about math via philosophy?
 
Interesting thread to think about.
Yup. :)

I've always thought of mathematics as a sort of numeric language that describes relationships and functions that exist in reality.

Sounds like Platonism to me. In essence, do mathematical entities exist in reality or not? When I talk about an idealized triangle or the number 1, am I referring to something 'out there'? If they exist in reality, where? How can I determine their properties while sitting at home, thinking and scribbling on a piece of paper? What is the relationship between the human notion of "proof" and the "truth" of reality?

What does a person hope to understand about math via philosophy?

The main topic of conversation I'm having with Phil Scott is the distinction between intuitionist mathematics and classical mathematics. The main distinction between them is the answer to 'What kinds of argument should we accept as valid?', which clearly affects what kinds of conclusions we can reach when doing mathematics (and in anything that uses that mathematics). Depending on the rules you allow, you will get different outcomes - things that you can prove with one system might not be provable with another.

There are nuances, but at a basic level intuitionist mathematics is a form of mathematical constructivism, which requires that you explicitly construct a mathematical object in order to claim that it exists. It is not enough to argue that it can't be non-existent, you have to show that it exists (usually by showing how to build or compute it). Basically, this is rejecting the idea that a statement P is the same as the statement not not P, which boils down to rejecting the classical law of excluded middle (which is that either P is true or P is not true).
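To make that asymmetry concrete, here's a tiny sketch in Lean 4, whose core logic happens to be intuitionistic unless you explicitly invoke classical axioms (the example is mine, just for illustration). One direction needs nothing but function application; the other is exactly the step an intuitionist refuses without more information about P.

[code]
-- P → ¬¬P needs nothing beyond function application.
example (P : Prop) : P → ¬¬P :=
  fun p np => np p

-- The converse, ¬¬P → P, has no proof here without a classical
-- axiom (e.g. Classical.em); refusing that step is the intuitionist
-- position.
[/code]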

So the mathematical statement "The full decimal expansion of the number π must contain at least one of the digits 0,1,2,...,9 infinitely many times" would have an easy proof in classical mathematics -- if each digit only showed up finitely many times then the expansion would terminate, making π rational, but π is irrational so they can't all appear only finitely often. A constructive proof would need some way of explicitly showing which digit appears infinitely often to be acceptable under the intuitionist system, and I don't think that argument exists (yet?).

Most mathematicians accept that "they can't all appear finitely many times" is enough to conclude that "at least one appears infinitely many times", but intuitionists hold that the former statement is actually weaker than the latter, and that we wouldn't be able to claim the latter without a better argument.
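If it helps, the logical shape of the π example can be written down in Lean 4 with the predicate occursInfinitely left completely abstract -- it's a hypothetical stand-in for "digit d appears infinitely often in the expansion of π", not something I'm defining here. Read constructively, the irrationality argument hands you only the double-negated existence, not a named digit:

[code]
-- occursInfinitely is a hypothetical placeholder predicate.
-- "They can't all appear only finitely often" constructively yields
-- ¬¬(some digit appears infinitely often), but names no digit.
example (occursInfinitely : Fin 10 → Prop)
    (h : ¬ (∀ d, ¬ occursInfinitely d)) : ¬¬ (∃ d, occursInfinitely d) :=
  fun hne => h (fun d hd => hne ⟨d, hd⟩)
[/code]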
 
Have these different schools ever found any hypothesis that, if tested, would falsify the other?
Wouldn't think so. Thus this is simply a matter of choice. (!)
 
I think you'd be hard-pressed to find a classical mathematician who thinks that about intuitionist math - after all, every intuitionist theorem is also a classical theorem, and finding explicit constructions is useful in its own right. We might look at you a bit funny for making life harder on yourselves, same as the compass-and-ruler people looked at the compass-and-straightedge people (everybody looked at Mohr and Mascheroni funny :D), but theorems are theorems.
Oh, you proof irrelevantists!

Every classical theorem is also an intuitionistic theorem. You can, and you sometimes do, just add a hypothesis on the front that says you have the necessary instances of excluded middle (that you can decide some property), or that you have a suitable choice function.
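As a tiny illustration of that move (my own throwaway sketch in Lean 4, nothing deep): hand yourself the relevant instance of excluded middle as a hypothesis and double negation elimination for that proposition goes through with no classical axiom at all.

[code]
-- With em : P ∨ ¬P taken as a hypothesis, ¬¬P → P is intuitionistically fine.
example (P : Prop) (em : P ∨ ¬P) : ¬¬P → P :=
  fun hnn => em.elim id (fun hn => absurd hn hnn)
[/code]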

But you can be less cheeky. Here are three classical theorems which are not intuitionistically valid:

1) ~~P → P (double negation elimination)
2) P ∨ ~P (excluded middle)
3) ~(∀x. ~P(x)) → (∃x. P(x)) (the classical infinite DeMorgan Law)

But, with slight adjustment, I can make three intuitionistically valid theorems:

1) ~~~P → ~P
2) ~~(P ∨ ~P)
3) ~(∀x. ~P(x)) → ~~(∃x. P(x))
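All three adjusted forms check out in Lean 4 without touching any classical axiom -- a quick sketch of mine, leaning on the fact that Lean's core logic is intuitionistic:

[code]
-- 1) ~~~P → ~P
theorem tripleNegation (P : Prop) : ¬¬¬P → ¬P :=
  fun h p => h (fun np => np p)

-- 2) ~~(P ∨ ~P)
theorem doubleNegatedEM (P : Prop) : ¬¬(P ∨ ¬P) :=
  fun h => h (Or.inr (fun p => h (Or.inl p)))

-- 3) ~(∀x. ~P(x)) → ~~(∃x. P(x))
theorem weakDeMorgan {α : Type} (P : α → Prop) :
    ¬ (∀ x, ¬ P x) → ¬¬ (∃ x, P x) :=
  fun h hne => h (fun x px => hne ⟨x, px⟩)
[/code]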

From here, it's possible to show that you can take any classical theorem, shove a few double negations onto some of the subexpressions, and get a classically equivalent theorem which is also an intuitionistic theorem. This translation is called an "embedding", and it shows that intuitionistic logic is formally more expressive than classical: the semantics is necessarily richer, and any contradiction in classical logic also reliably kills intuitionistic logic, even though the intuitionists have fewer axioms.
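To make "shove a few double negations onto some of the subexpressions" concrete, here's a sketch of one standard negative translation for propositional formulas, again in Lean 4. The Formula datatype and the exact clause choices are my own illustration (translations in the literature differ in detail -- Gödel–Gentzen, Kolmogorov, Kuroda), but the shape is the point: atoms and disjunctions pick up double negations, and everything else is translated structurally.

[code]
inductive Formula where
  | atom   : String → Formula
  | falsum : Formula
  | and    : Formula → Formula → Formula
  | or     : Formula → Formula → Formula
  | imp    : Formula → Formula → Formula

def neg (φ : Formula) : Formula := .imp φ .falsum

-- Double-negate atoms and disjunctions; pass ∧, → and ⊥ through.
def translate : Formula → Formula
  | .atom s  => neg (neg (.atom s))
  | .falsum  => .falsum
  | .and φ ψ => .and (translate φ) (translate ψ)
  | .or φ ψ  => neg (neg (.or (translate φ) (translate ψ)))
  | .imp φ ψ => .imp (translate φ) (translate ψ)
[/code]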

In this sense, I can look at a classical proof and see the theorem as an intuitionistic theorem with choice double negations wrapping some of the expressions, and I'll declare that it's the wrong theorem, and that the theorem I really want is the one without the double negations, which will be stronger.

There are even bigger differences though. If you're going to work constructively, you will sometimes need to choose different representations, so you really end up proving different theorems. And the demand constructivists make is for theorems that end up doing more work: they at least always have a computational interpretation. So you end up with definitions that tell you, for instance, that every function in constructive real analysis is continuous! That's pretty strong.

Ok, can you explain why you think we should be doing math intuitionistically over classically? Yes, the math comes out different, but why do you think one is better than the other? What does 'better' mean in this situation? Is this different than a compass-and-straightedge vs compass-and-ruler situation?
I plan for computer science to eat the entire world, and when we come to eat maths, we'll be making it intuitionistic, because it will force you guys to think in a way that always has a computational interpretation. I'm offering you a head-start before we begin the invasion in full.

Can you expand on this? From my readings of the classical Greek mathematicians, that is not the case in the arguments they explicitly used, even if modern-day intuitionist proofs can be found for those results.
I think I can explain this quite pithily. Take any proof in Euclid which ends QEF. These are the constructive proofs, where his goal is to obtain a geometric figure. You don't have to look hard. Proposition I in Book I is: "to draw an equilateral triangle."

Now imagine that proof beginning: "for suppose I cannot...."

It would obviously be a failure. And the reason is because the logic involved in producing a geometric construction doesn't admit double-negation elimination.

I stole this joke from a slightly different context and this excellent video:

[youtube]https://www.youtube.com/watch?v=zmhd8clDd_Y[/youtube]

Here's another example. Suppose Euclid had a proposition that read "To construct a triangle or to construct a circle." This would be really weird. How could he not know which thing he's going to construct? All I need to do is read to the end of the proof, see whether he ended up with a triangle or a circle, and delete as appropriate from the original proposition.
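That "read to the end of the proof" move is exactly what a constructive proof of a disjunction hands you: in a proof assistant the proof is literally a tagged value, Or.inl with evidence for the left disjunct or Or.inr with evidence for the right. A throwaway Lean 4 example:

[code]
-- The proof term itself announces which disjunct was established.
example : 2 + 2 = 4 ∨ 2 + 2 = 5 := Or.inl rfl
[/code]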

This is not how classical mathematics works, and classically, a bare-faced disjunction is a perfectly normal thing. Indeed, the law of excluded-middle is just such a bare-faced disjunction.

"Either I draw a triangle or I fail to draw a triangle."

"Okay. And how did it turn out?"

This is not to say that all of their logic is constructive, only the logic that's at work when they're talking in deeds: "I can do this", "I can produce that." But if they're just proving a property of a figure "it is the case that", then they can use classical axioms. It's a mixed approach.

The most extreme forms of modern constructivism, by contrast, say that our logic should uniformly be about constructing things, even if it's just the construction of a proof that a figure or other mathematical object has such-and-such a property. Proofs are to be treated as mathematical objects much as anything else.

There is at least one famous computer scientist I've read who works in an intuitionistic formal verifier, but is happy to use classical axioms if he's just proving a property about something, and doesn't really care about the proof as an actual object. This is the sort of mixed approach I take the ancient Greeks to have been using. It's not the way modern mathematics is done, and if you look at, say, Hilbert's modern adaptation of Euclid, you can't immediately tell if the theorems give you a reliable means of constructing a figure, or whether they just tell you that the figure "exists" (where, I don't know).
 
So the mathematical statement "The full decimal expansion of the number π must contain at least one of the digits 0,1,2,...,9 infinitely many times" would have an easy proof in classical mathematics -- if each digit only showed up finitely many times then the expansion would terminate, making π rational, but π is irrational so they can't all appear only finitely often. A constructive proof would need some way of explicitly showing which digit appears infinitely often to be acceptable under the intuitionist system, and I don't think that argument exists (yet?).
This is a nice illustrative example, but it kinda makes me feel the distinction between intuitionism and classical mathematics is pretty weak. I'm happy with the statement "the digits 0..9 do not all appear only finitely many times in the expansion of π", and don't feel cheated here. It's not like I can think of any particular use for the digit if I knew how to obtain it. Though if a use presents itself, then, yeah, I'll be wanting a better proof.

There's perhaps a way to spin this in some sort of metaphysical way. I sometimes feel that, when I try to think classically, I tend to imagine myself having a God's eye view of an infinite decimal expansion, rather than a mere hand-crank that produces digits indefinitely. I can get some way towards bridging the two positions by thinking about hierarchies of Turing oracles: a level-0 Turing oracle is a machine that inputs programs and tells you whether or not they halt. A level-1 Turing oracle inputs programs that make use of level-0 Turing oracles and tells you whether or not those halt. A level-2 does the same for programs that use level-0 and level-1 oracles, and so on. If we could build even a level-0 oracle, it would be the most powerful thing humanity has ever conceived. It would be awesome.

Pretty sure I just need science to get me a level-1 Oracle and then I can give you a digit that occurs infinitely many times in π.

I think a better example for showing up a crucial difference is this one in geometry: if two points A and B both lie on two distinct lines l1 and l2, then A = B is the unique point of intersection. The proof is trivial, and just relies on noting that if A and B were different, then they would determine a third line, which isn't possible.

This doesn't work constructively, and it probably shouldn't. The problem is that A and B might have been constructed by some sort of nasty recursive algorithm that puts them so close together that you can't reliably tell them apart, so you can't reliably figure out whether you can squeeze a third line into the picture. One cheat I've seen used here is to just say that point equality is always decidable: you can always tell points apart. I think that's too strong, and that the way to proceed geometrically is to avoid nasty algorithms that put points so close together.
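If it helps, here's how that looks as a Lean 4 sketch in which Point, Line, liesOn and the "two distinct points determine at most one line" axiom are all hypotheses I've invented for illustration. The constructive content of the classical argument is ¬¬(A = B); discharging that final double negation is precisely where decidable point equality gets smuggled in.

[code]
-- Everything here (Point, Line, liesOn, uniqueLine) is hypothetical.
-- The argument lands on ¬(A ≠ B), i.e. ¬¬(A = B); getting to A = B itself
-- needs decidable equality of points, which is the contested step.
example (Point Line : Type) (liesOn : Point → Line → Prop)
    (uniqueLine : ∀ A B : Point, A ≠ B → ∀ l m : Line,
      liesOn A l → liesOn B l → liesOn A m → liesOn B m → l = m)
    (A B : Point) (l m : Line) (hlm : l ≠ m)
    (hAl : liesOn A l) (hAm : liesOn A m)
    (hBl : liesOn B l) (hBm : liesOn B m) : ¬ (A ≠ B) :=
  fun hAB => hlm (uniqueLine A B hAB l m hAl hBl hAm hBm)
[/code]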
 
Philosophy bakes no bread, math got us to the moon.

Is anyone suggesting one needs a philosophy to learn, apply, and research math?

If so, then does a carpenter need a philosophy? The difference between the skills is a matter of degree; both require knowledge coupled with creativity.

In calculus, integration by parts is taught. It does not have a lot of utility. It is an approach that has no specific rules and does not always work. One learns it by working problems and trial and error. Is there a philosophy to that?

A great little book is How To Read And Do Proofs. Again, there are no specific rules that always ensure success. It is learned by studying what has been done and through experience. The creative, intuitive part is part experience and part our brains. If you want insight, even without a math background, I recommend the book. It is short. Far more entertaining than video games.

Haven't played a video game since around 1983. I'd rather explore math problems.
 
Oh, you proof irrelevantists!

Every classical theorem is also an intuitionistic theorem. You can, and you sometimes do, just add a hypothesis on the front that says you have the necessary instances of excluded middle (that you can decide some property), or that you have a suitable choice function.

But you can be less cheeky. Here are three classical theorems which are not intuitionistically valid:

1) ~~P → P (double negation elimination)
2) P ∨ ~P (excluded middle)
3) ~(∀x. ~P(x)) → (∃x. P(x)) (the classical infinite DeMorgan Law)

But, with slight adjustment, I can make three intuitionistically valid theorems:

1) ~~~P → ~P
2) ~~(P ∨ ~P)
3) ~(∀x. ~P(x)) → ~~(∃x. P(x))

From here, it's possible to show that you can take any classical theorem, shove a few double negations onto some of the subexpressions, and get a classically equivalent theorem which is also an intuitionistic theorem. This translation is called an "embedding", and it shows that intuitionistic logic is formally more expressive than classical: the semantics is necessarily richer, and any contradiction in classical logic also reliably kills intuitionistic logic, even though the intuitionists have fewer axioms.

In this sense, I can look at a classical proof and see the theorem as an intuitionistic theorem with choice double negations wrapping some of the expressions, and I'll declare that it's the wrong theorem, and that the theorem I really want is the one without the double negations, which will be stronger.

Ok, but that definitely feels like a bit of bait-and-switch. It isn't that classical theorems are intuitionist theorems, it's that there are ways of fiddling with the statements of classical theorems to get related theorems that are shared. But intuitionist theorems are literally classical theorems, no reinterpretations or pushing through negations necessary. And it also speaks to my view that it's the intuitionists who think that the classicists are proving the wrong theorems, not the other way around.

There are even bigger differences though. If you're going to work constructively, you will sometimes need to choose different representations, so you really end up proving different theorems. And the demand constructivists make is for theorems that end up doing more work: they at least always have a computational interpretation. So you end up with definitions that tell you, for instance, that every function in constructive real analysis is continuous! That's pretty strong.

And yet it feels like cheating. It's true in constructive real analysis because you've gotten rid of your ability to identify the classical discontinuous functions as 'functions' in the first place. If I call a tail a leg...

I plan for computer science to eat the entire world, and when we come to eat maths, we'll be making it intuitionistic, because it will force you guys to think in a way that always has a computational interpretation. I'm offering you a head-start before we begin the invasion in full.

So the mathematical statement "The full decimal expansion of the number π must contain at least one of the digits 0,1,2,...,9 infinitely many times" would have an easy proof in classical mathematics -- if each digit only showed up finitely many times then the expansion would terminate, making π rational, but π is irrational so they can't all appear only finitely often. A constructive proof would need some way of explicitly showing which digit appears infinitely often to be acceptable under the intuitionist system, and I don't think that argument exists (yet?).
This is a nice illustrative example, but it kinda makes me feel the distinction between intuitionism and classical mathematics is pretty weak. I'm happy with the statement "the digits 0..9 do not all appear only finitely many times in the expansion of π", and don't feel cheated here. It's not like I can think of any particular use for the digit if I knew how to obtain it. Though if a use presents itself, then, yeah, I'll be wanting a better proof.

There's perhaps a way to spin this in some sort of metaphysical way. I sometimes feel that, when I try to think classically, I tend to imagine myself having a God's eye view of an infinite decimal expansion, rather than a mere hand-crank that produces digits indefinitely. I can get some way towards bridging the two positions by thinking about hierarchies of Turing oracles: a level-0 Turing oracle is a machine that inputs programs and tells you whether or not they halt. A level-1 Turing oracle inputs programs that make use of level-0 Turing oracles and tells you whether or not those halt. A level-2 does the same for programs that use level-0 and level-1 oracles, and so on. If we could build even a level-0 oracle, it would be the most powerful thing humanity has ever conceived. It would be awesome.

Pretty sure I just need science to get me a level-1 Oracle and then I can give you a digit that occurs infinitely many times in π.
I think that needs some justification. Why should we have to have a computational interpretation for everything we do?

Can you expand on this? From my readings of the classical Greek mathematicians, that is not the case in the arguments they explicitly used, even if modern-day intuitionist proofs can be found for those results.
I think I can explain this quite pithily. Take any proof in Euclid which ends QEF. These are the constructive proofs, where his goal is to obtain a geometric figure. You don't have to look hard. Proposition I in Book I is: "to draw an equilateral triangle."

Now imagine that proof beginning: "for suppose I cannot...."

It would obviously be a failure. And the reason is because the logic involved in producing a geometric construction doesn't admit double-negation elimination.

I stole this joke from a slightly different context and this excellent video:

[youtube]https://www.youtube.com/watch?v=zmhd8clDd_Y[/youtube]

Here's another example. Suppose Euclid had a proposition that read "To construct a triangle or to construct a circle." This would be really weird. How could he not know which thing he's going to construct? All I need to do is read to the end of the proof, see whether he ended up with a triangle or a circle, and delete as appropriate from the original proposition.

This is not how classical mathematics works, and classically, a bare-faced disjunction is a perfectly normal thing. Indeed, the law of excluded-middle is just such a bare-faced disjunction.

"Either I draw a triangle or I fail to draw a triangle."

"Okay. And how did it turn out?"

This is not to say that all of their logic is constructive, only the logic that's at work when they're talking in deeds: "I can do this", "I can produce that." But if they're just proving a property of a figure "it is the case that", then they can use classical axioms. It's a mixed approach.

Well, yeah, construction theorems are usually constructivist. But Euclid had many theorems that were not explicit constructions, and many of his proofs rely on non-constructivist arguments. IMO, that puts him in the same camp with modern mathematicians who certainly value constructive arguments but don't artificially restrict themselves to only constructive arguments.

The most extreme forms of modern constructivism, by contrast, say that our logic should uniformly be about constructing things, even if it's just the construction of a proof that a figure or other mathematical object has such-and-such a property. Proofs are to be treated as mathematical objects much as anything else.

There is at least one famous computer scientist I've read who works in an intuitionistic formal verifier, but is happy to use classical axioms if he's just proving a property about something, and doesn't really care about the proof as an actual object. This is the sort of mixed approach I take the ancient Greeks to have been using. It's not the way modern mathematics is done, and if you look at, say, Hilbert's modern adaptation of Euclid, you can't immediately tell if the theorems give you a reliable means of constructing a figure, or whether they just tell you that the figure "exists" (where, I don't know).

The mixed approach is certainly how I see most modern mathematics being done, I'm not sure where you're getting the impression that it isn't.

I think a better example for showing up a crucial difference is this one in geometry: if two points A and B both lie on two distinct lines l1 and l2, then A = B is the unique point of intersection. The proof is trivial, and just relies on noting that if A and B were different, then they would determine a third line, which isn't possible.

This doesn't work constructively, and it probably shouldn't. The problem is that A and B might have been constructed by some sort of nasty recursive algorithm that puts them so close together that you can't reliably tell them apart, so you can't reliably figure out whether you can squeeze a third line into the picture. One cheat I've seen used here is to just say that point equality is always decidable: you can always tell points apart. I think that's too strong, and that the way to proceed geometrically is to avoid nasty algorithms that put points so close together.

I like that one too, and I think it also speaks to the above about the ancient Greeks. I don't recall if that exact argument appears in the Elements, but Euclid would have certainly agreed with the classical conclusion and not the constructivist one.
 
Philosophy bakes no bread, math got us to the moon.

Is anyone suggesting one needs a philosophy to learn, apply, and research math?

If so, then does a carpenter need a philosophy? The difference between the skills is a matter of degree; both require knowledge coupled with creativity.

In calculus, integration by parts is taught. It does not have a lot of utility. It is an approach that has no specific rules and does not always work. One learns it by working problems and trial and error. Is there a philosophy to that?

A great little book is How To Read And Do Proofs. Again, there are no specific rules that always ensure success. It is learned by studying what has been done and through experience. The creative, intuitive part is part experience and part our brains. If you want insight, even without a math background, I recommend the book. It is short. Far more entertaining than video games.

Haven't played a video game since around 1983. I'd rather explore math problems.
Philosophy is simply "use your rationality to be wise". As soon as a subject gets worked on enough and gathers a crowd, it is given a specific name, such as maths, economics, etc. All these subjects are philosophy. It's just that what is called philosophy in the US today is the remaining thought-stuff that hasn't yet formed into subjects of its own.
 
And yet it feels like cheating. It's true in constructive real analysis because you've gotten rid of your ability to identify the classical discontinuous functions as 'functions' in the first place. If I call a tail a leg...
Okay. Apologies, but I think this exchange is going to end up one-sided, with me doing all the work, and my previous post was long enough already.

By the way, stuff about "taking over the world" is entirely tongue-in-cheek, and I'm not posting in this thread to try to sell constructivism or present arguments that one philosophy is better than another. I don't think in those terms when it comes to philosophy generally. I do claim that if constructivists were in charge, maths would look very different, and the resulting body of definitions and theorems would not appeal to classical mathematicians the way the current one does. We could just focus on that claim.
 
Philosophy bakes no bread, math got us to the moon.

Is anyone suggesting one needs a philosophy to learn, apply, and research math?

If so, then does a carpenter need a philosophy? The difference between the skills is a matter of degree; both require knowledge coupled with creativity.

In calculus, integration by parts is taught. It does not have a lot of utility. It is an approach that has no specific rules and does not always work. One learns it by working problems and trial and error. Is there a philosophy to that?

A great little book is How To Read And Do Proofs. Again, there are no specific rules that always ensure success. It is learned by studying what has been done and through experience. The creative, intuitive part is part experience and part our brains. If you want insight, even without a math background, I recommend the book. It is short. Far more entertaining than video games.

Haven't played a video game since around 1983. I'd rather explore math problems.
Philosophy is simply "use your rationality to be wise". As soon as a subject gets worked on enough and gathers a crowd, it is given a specific name, such as maths, economics, etc. All these subjects are philosophy. It's just that what is called philosophy in the US today is the remaining thought-stuff that hasn't yet formed into subjects of its own.

That is your contextual spin. For me, science and engineering are Socratic: questions leading to questions leading to truth, hopefully. Nobody consciously called it that; it is how thinking evolved.

Historically there were Doctor of Law, Doctor of Medicine, and Doctor of Philosophy. Philosophy was always a catch-all term for knowledge in general. In the time of the ancient Greeks, specific philosophies functioned much as modern religion does today: a source of morality and how to live. A philosopher was one who loved knowledge and searched for wisdom.

People use the word philosophy much as Christians use god: an ill-defined, amorphous, highly contextual term.

Define philosophy in the context of the OP and how a philosophy would be applied by a mathematician.
 
Philosophy bakes no bread, math got us to the moon.

Is anyone suggesting one needs a philosophy to learn, apply, and research math?

If so, then does a carpenter need a philosophy? The difference between the skills is a matter of degree; both require knowledge coupled with creativity.

In calculus, integration by parts is taught. It does not have a lot of utility. It is an approach that has no specific rules and does not always work. One learns it by working problems and trial and error. Is there a philosophy to that?

A great little book is How To Read And Do Proofs. Again, there are no specific rules that always ensure success. It is learned by studying what has been done and through experience. The creative, intuitive part is part experience and part our brains. If you want insight, even without a math background, I recommend the book. It is short. Far more entertaining than video games.

Haven't played a video game since around 1983. I'd rather explore math problems.
Philosophy is simply "use your rationality to be wise". As soon as a subject gets worked on enough and gathers a crowd, it is given a specific name, such as maths, economics, etc. All these subjects are philosophy. It's just that what is called philosophy in the US today is the remaining thought-stuff that hasn't yet formed into subjects of its own.

That is your contextual spin. For me, science and engineering are Socratic: questions leading to questions leading to truth, hopefully. Nobody consciously called it that; it is how thinking evolved.

Historically there were Doctor of Law, Doctor of Medicine, and Doctor of Philosophy. Philosophy was always a catch-all term for knowledge in general. In the time of the ancient Greeks, specific philosophies functioned much as modern religion does today: a source of morality and how to live. A philosopher was one who loved knowledge and searched for wisdom.

People use the word philosophy much as Christians use god: an ill-defined, amorphous, highly contextual term.

Define philosophy in the context of the OP and how a philosophy would be applied by a mathematician.

You forgot Doctor of Theology, which likely pre-dates the other three, and if nothing else, demonstrates neatly that 'ancient' is not synonymous with 'wise'.
 
The notion "if something exists, it exists somewhere" is faulty.

I had to think about this for a while, because the notion as you stated it is a tautology (if something exists, it exists) and so necessarily true.

Could it be that you meant "'if something can exist (meaning not prohibited by physical laws), it exists somewhere' is faulty"? This would seem to be arguable - what are the chances of something with even an infinitely small chance never occurring in an infinite universe?
 
The notion "if something exists, it exists somewhere" is faulty.

I had to think about this for a while, because the notion as you stated it is a tautology (if something exists, it exists) and so necessarily true.

Could it be that you meant "'if something can exist (meaning not prohibited by physical laws), it exists somewhere' is faulty"? This would seem to be arguable - what are the chances of something with even an infinitely small chance never occurring in an infinite universe?

There exists at least one integer < 10. WHERE are these integers?
 
The notion "if something exists, it exists somewhere" is faulty.

I had to think about this for a while, because the notion as you stated it is a tautology (if something exists, it exists) and so necessarily true.

Could it be that you meant "'if something can exist (meaning not prohibited by physical laws), it exists somewhere' is faulty"? This would seem to be arguable - what are the chances of something with even an infinitely small chance never occurring in an infinite universe?
No, the way you worded it, it's a tautology.

I've left open room for something to exist without location.
 
Left room for something to exist without location. :)

I like it. But surely we need to invoke the word "virtually" at this point. Or maybe "undetectable".
 