
Is Ockham's Razor Bullshit?

Ockham's Razor as a warning against wild speculation is a good way of putting it.

Then why not say that?

Just saying it may not have the same effect. Someone may not believe that their proposition is a matter of wild speculation, and will therefore conclude that a warning against "wild speculation" does not apply to them.

So instead you try to confuse them by cloaking your unconvincing personal insult in what sounds like some sort of philosophical principle?
 

No, just that there are different ways of saying something. Perspective. The simplest explanation may often be the best: Ockham's razor. There is no cloaking.
 
Parsimony is too often presented as almost an aesthetic preference, when in fact it relates to the probable accuracy of an idea.

Here is my understanding of the logic behind it:

Every assumption has a non-zero probability of being false.
Each separate assumption thus adds to the ways a theory can be false, which increases the total probability that the theory is wrong in some essential way.
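A minimal sketch of that arithmetic, assuming (purely for illustration) that each assumption has the same independent chance p of being false:

```python
# Probability that a theory with n independent assumptions contains at least
# one false assumption. The value of p and the independence of the assumptions
# are illustrative simplifications, not part of the original argument.

def p_some_assumption_false(n: int, p: float = 0.1) -> float:
    """Chance that at least one of n independent assumptions is false."""
    return 1 - (1 - p) ** n

for n in range(1, 6):
    print(n, round(p_some_assumption_false(n), 3))
# 1 0.1, 2 0.19, 3 0.271, 4 0.344, 5 0.41 -> more assumptions, more ways to be wrong
```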

However, this negative effect of complexity can be countered, if the additional assumption increases the probability of accuracy via other means, such as explanatory power, precision and/or uniqueness of accurate predictions, utility in revealing previously unknown relationships, and coherence with well established facts and theories.

These are the things being referenced by the phrase "all other things being equal". And it's easy to say "all things are never equal, so it's useless", but that's a strawman. It means that there has to be some increase in these indicators of probable accuracy to offset the cost of added assumptions. So it's fine to go with the more complex theory, so long as doing so increases what you can predict or explain beyond what you could without those assumptions. This also relates to skepticalbip's comment about the assumptions being "unnecessary": in this context, "unnecessary" means that the assumption does nothing to increase these other properties of the theory.

Also, it applies not just to comparing two specific theories but to comparing two overall worldviews. Two people both accept a set of naturalistic explanations for various things, including mechanisms A through F. But for one event Y, the first person says "It is predicted and explained by mechanism B, which we already know exists", while the other person says "No, I think it's a new mechanism X." If assuming mechanism X does not explain or predict things that established mechanisms A through F cannot, then the second person's theory is more likely to be wrong. It doesn't matter that both explanations can account for Y using a single mechanism. The total number of mechanisms the second person assumes exist in the world is greater, and each of them might be false, so a greater number means a greater net probability of overall inaccuracy.

Basically, parsimony in the number of assumptions behind particular theories is tied to the same logic as favoring theories that employ mechanisms already known to explain other things, thereby preserving parsimony at the worldview level.

It is the same basic logic behind why using established known mechanisms to explain something is more rational and more likely to be correct than positing some speculative cause. If you find a body poisoned by mushrooms and see a type of mushroom that you know has killed people, and another type of mushroom that you don't know anything about, which mushroom is more likely to have killed him? Sure, the other one might also be poisonous, but without already knowing that, the odds are higher that it's the thing you know could have caused it.
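To put rough numbers on it (a minimal sketch; the base rate for an unknown mushroom being deadly and the equal chance of the victim having eaten either mushroom are invented assumptions, not data):

```python
# Toy comparison: given that the victim died of mushroom poisoning and ate one
# of the two mushrooms, which one is more likely to be the culprit?

base_rate = 0.03          # assumed chance that an arbitrary unknown mushroom is deadly
p_deadly_known = 1.0      # the first mushroom is already known to have killed people
p_deadly_unknown = base_rate

# Unnormalised weight for each mushroom, assuming either was equally likely eaten.
w_known = 0.5 * p_deadly_known
w_unknown = 0.5 * p_deadly_unknown

total = w_known + w_unknown
print("P(known mushroom did it)   ~", round(w_known / total, 3))    # ~0.971
print("P(unknown mushroom did it) ~", round(w_unknown / total, 3))  # ~0.029
```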
 
That sounds more like a logical fallacy than a logical principle. What you're saying, in effect, is that an unidentified mushroom is less likely to be dangerous than an identified one. If the "probability" of being correct changes based on whether or not you happen to have a mycologist at the crime scene who can identify both mushrooms, then I'd say your estimation isn't very useful even if it happens to be right. Frankly, I'd be suspicious that you were correct in identifying the first mushroom to begin with, if you lack the relevant experience to identify both. Even seasoned harvesters make mistakes. What a scientist would do, or a competent forensic investigator for that matter, would be to identify both mushrooms before trying to talk about "probability" of anything. Who cares what the probability of something "feels" like in the head of some uneducated muffin who knows nothing about mushrooms, while the actual probability is attainable with a bit of rational investigation?

Using your version of the Razor leads one to conclude that the less you certainly know about the subject, the more probable it is that you are correct about particular conclusions relating to it, since the first fact you come to is more "likely" to be a correct basis for inference with respect to the whole, simply because you know it and not the rest. This is anti-scientific and irrational, in my opinion. The scientific method may be slow and not very emotionally satisfying at times, but it gets results. Identify both mushrooms, and you'll have your probability. If you're lucky it might even be close to 100%, provided there are known unique symptoms involved.

I think I'm starting to understand the mental bias that leads to atheism, though. If you are accustomed to reasoning "what little I know says x, therefore there is a higher probability that it is always x rather than sometimes w, y, or z, because I know that x is a fact but not the status of the others", you would naturally be confined to materialist metaphors when talking about existential realities.
 
Is there a reason to overthink this issue? If Ockham's Razor is used as a warning against wild speculation, lending emphasis to the point that the simplest explanation may often be the best, why complicate things unnecessarily? Apply Ockham's Razor to the issue of Ockham's razor. In this instance that is the simplest explanation with no apparent need for complications or wild speculation.
 

Because it's nonsense, and sometimes thinking about things before coming to a conclusion is a good idea.
 
It is the same basic logic behind why using established known mechanisms to explain something is more rational and more likely to be correct than positing some speculative cause. If you find a body poisoned by mushrooms and see a type of mushroom that you know has killed people, and another type of mushroom that you don't know anything about, which mushroom is more likely to have killed him? Sure, the other one might also be poisonous, but without already knowing that, the odds are higher that it's the thing you know could have caused it.

That's not an application of the law of parsimony; that's just an abuse of Bayesian probability.

Whether the known mushroom or the unknown mushroom killed the person, the mechanism is the same: poisonous mushroom.
 
The thing is, I doubt it was ever meant to be taken as seriously as it's being taken in this thread. It's just an occasionally useful heuristic which gives people a baseline for thinking about things that happen to them, not an ipso facto scientific theorem.

Yes, people always say that once it is meaningfully challenged. Which is why I feel free to ignore it. I have no need for "rules of thumb" that don't stand up to scrutiny. It may seem silly to try and measure whether a stitch in time in fact saves nine, but the real silliness to me is following a rule - or using it in support of an otherwise serious philosophical argument - if that rule does not itself have any substantial basis.


I don't know that I'm qualified to defend my position, but since nobody else in the thread has taken my position, I'll at least state my belief:

Occam's razor is fundamental to logical analysis. Without the razor, every question has an infinite number of answers, each of which is equally likely to be true.
 

Only if you ignore the fact that some answers have epistemological justification, and others do not. The whole notion of "assigning" probabilities not based on a set of observed data is not something I feel should be encouraged.
 
People here seem to be inventing their own ideas of what Occam's razor is about then pointing out why it is stupid.

Occam's razor is not about trying to figure out how things are. It is about how the explanation should be phrased after it is figured out. In the explanation, the extraneous, redundant, and irrelevant should be eliminated and only those parts that are necessary and sufficient to explain the event or phenomena should be left.

It isn't about "how science is done" but rather a loose suggestion on how to best present the conclusion.
 
I don't know that I'm qualified to defend my position, but since nobody else in the thread has taken my position, I'll at least state my belief:

Occam's razor is fundamental to logical analysis. Without the razor, every question has an infinite number of answers, each of which is equally likely to be true.

Only if you ignore the fact that some answers have epistemological justification, and others do not. The whole notion of "assigning" probabilities not based on a set of observed data is not something I feel should be encouraged.

I don't understand your response. Perhaps you can rephrase or elaborate?
 
People here seem to be inventing their own ideas of what Occam's razor is about then pointing out why it is stupid.

Occam's razor is not about trying to figure out how things are. It is about how the explanation should be phrased after it is figured out. In the explanation, the extraneous, redundant, and irrelevant should be eliminated and only those parts that are necessary and sufficient to explain the event or phenomena should be left.

It isn't about "how science is done" but rather a loose suggestion on how to best present the conclusion.

1. Complain that everyone is defining the term idiosyncratically.

2. Provide your own highly idiosyncratic "interpretation" of the term.


-----

There are a few potential "correct versions" of Occam's or Ockham's Razor. The vast majority of people know the term through Carl Sagan's book "Contact" or the film based on it, in both of which it was presented as "the easiest explanation tends to be the right one".

The reason it is called Occam's razor is due to its association with William of Ockham, whose clearest statement of the principle was Numquam ponenda est pluralitas sine necessitate. This cannot be translated into English directly, but its general sense is: "Never must one posit plurality without necessity."

There are also much older formulations of similar principles, which William himself may have been attempting to reference, such as Aristotle's "the more limited explanation is always preferable, if adequate to the task".
 
Thanks for making my point. The razor is about how the conclusion is stated, not how the conclusion is reached.

Ockham's version: "Never must one posit plurality without necessity." One mustn't include extraneous, redundant, or irrelevant terms.

Your Sagan quote, if correct, "the easiest explanation tends to be the right one" (though I remember it more as "the most straightforward explanation tends to be the right one"), makes the same point.

And Aristotle's "the more limited explanation is always preferable, if adequate to the task" again eliminates the extraneous and irrelevant, leaving only the terms that are necessary and sufficient.
 
Is there a reason to overthink this issue? If Ockham's Razor is used as a warning against wild speculation, lending emphasis to the point that the simplest explanation may often be the best, why complicate things unnecessarily? Apply Ockham's Razor to the issue of Ockham's razor. In this instance that is the simplest explanation with no apparent need for complications or wild speculation.

Because it's nonsense, and sometimes thinking about things before coming to a conclusion is a good idea.

People do think, and in the process of their thinking they come to all sorts of ideas, notions and conclusions, some far more complex and convoluted than necessary. Which is why we have Ockham's Razor.
 
Indeed so, and it is an easy trap to fall into. Even Einstein included a term in his theory of general relativity (the cosmological constant) for which there was no observational evidence at the time, because he then believed in a steady-state universe. A few years later Hubble's observations showed the universe to be expanding, after which Einstein said that his inclusion of the term was the greatest mistake of his life.
 
That sounds more like a logical fallacy than a logical principle. What you're saying, in effect, is that an unidentified mushroom is less likely to be dangerous than an identified one. If the "probability" of being correct changes based on whether or not you happen to have a mycologist at the crime scene who can identify both mushrooms, then I'd say your estimation isn't very useful even if it happens to be right. Frankly, I'd be suspicious that you were correct in identifying the first mushroom to begin with, if you lack the relevant experience to identify both. Even seasoned harvesters make mistakes. What a scientist would do, or a competent forensic investigator for that matter, would be to identify both mushrooms before trying to talk about "probability" of anything. Who cares what the probability of something "feels" like in the head of some uneducated muffin who knows nothing about mushrooms, while the actual probability is attainable with a bit of rational investigation?

Using your version of the Razor leads one to conclude that the less you certainly know about the subject, the more probable it is that you are correct about particular conclusions relating to it, since the first fact you come to is more "likely" to be a correct basis for inference with respect to the whole, simply because you know it and not the rest. This is anti-scientific and irrational, in my opinion. The scientific method may be slow and not very emotionally satisfying at times, but it gets results. Identify both mushrooms, and you'll have your probability. If you're lucky it might even be close to 100%, provided there are known unique symptoms involved.

Ronburgundy's scenario didn't involve a mycologist, and in some ways it is a better illustration of the relevant issue because of that, imo. It's arguably a typical starting scenario for inquiring humans, because situations in which we don't know enough about the answer are arguably the ones we are mostly talking about (with what skepticalbip said about applying Occam's Razor to our conclusions as a caveat). In some ways we have less need for Occam's Razor when we have sufficient relevant information; or, better to say, the more relevant information we have (eg via the input from a mycologist), the less we need that sort of rule of thumb.

As such, I think he is right to say that, given what the person encountering the dead body does know about mushrooms (that some are edible and some are poisonous), the mushroom known to be poisonous is statistically/probabilistically the more likely of the two to have killed the person (assuming the person was poisoned by a mushroom, of course, but that is part of the hypothetical scenario, even if it wouldn't necessarily be in real life).

However, it would obviously be wrong to turn that guesstimation, or 'initial working hypothesis', into an assumption. I'm not saying that that (the keeping of an open mind) is easy for humans to manage. But one would hope that a good detective, for example, would be trained to apply it whilst narrowing down a list of suspects or causes. Ditto a good scientist, or medical practitioner. And hopefully a good rational sceptic.

I think I'm starting to understand the mental bias that leads to atheism, though. If you are accustomed to reasoning "what little I know says x, therefore there is a higher probability that it is always x rather than sometimes w, y, or z, because I know that x is a fact but not the status of the others", you would naturally be confined to materialist metaphors when talking about existential realities.

I'm not sure that's as accurate a representation as you might like to think, partly for reasons given above.

I am almost tempted to apologise to those concerned, on behalf of both rational scepticism and science, for damaging the tenability of a lot of their dodgy metaphysics.
 
In my general view, and having given it some more thought after reading the OP, Occam's Razor is probably like a whole bunch of other things, useful in some ways but not others, and in some situations more than other situations. So there are going to be times it helps us and times when it hinders us.

And for example, it can be used at the start of investigations (eg ronburgundy's scenario) and perhaps in a slightly different way at their end (as per what skepticalbip said).

It may also, perhaps, be more useful in the 'hard' sciences (or mathematics, perhaps especially statistics and probability) than the 'soft' ones (eg those involving human behaviour), possibly because the latter are the hard ones in another sense (having more inherent complexity and unpredictability). To use a very broad analogy, it's a bit like the difference between investigating a fence being knocked down by a strong wind and a fence being knocked down by an angry person.

Just to confuse matters further, I would suggest that there are probably several forms or varieties of OR (Occam's Razor), so to treat it too simplistically (ie monolithically) would be an example of how not to use it. :)

It is probably a good idea to also take into account the more or less opposite maxim, that things often have a tendency to actually be much more complex than they seem or that we would like. Or as Kant apparently put it (so I read) "the variety of entities should not be rashly diminished". Though here, we might better apply the same advice to causes generally, rather than just entities.

So maybe it is a good idea to keep both rules of thumb in mind and try to use both judiciously and provisionally.

I don't think all of that amounts to a good set of reasons to ignore OR, only a good reason to treat it with caution, understand its limitations and nuances, and be aware of alternative maxims. I doubt it's a good idea to dispense with the general principles of simplicity and parsimony altogether. For example, we're almost certainly all here, and the way we are (including in our thinking), because of evolution, and that process is ruthlessly parsimonious (even in its complexity).
 
It is the same basic logic behind why using established known mechanisms to explain something is more rational and more likely to be correct than positing some speculative cause. If you find a body poisoned by mushrooms and see a type of mushroom that you know has killed people, and another type of mushroom that you don't know anything about, which mushroom is more likely to have killed him? Sure, the other one might also be poisonous, but without already knowing that, the odds are higher that it's the thing you know could have caused it.

That's not an application of the law of parsimony; that's just an abuse of Bayesian probability.

Whether the known mushroom or the unknown mushroom killed the person, the mechanism is the same: poisonous mushroom.

The poisoning mechanism doesn't answer which mushroom. The exact type of mushroom is a feature of the mechanism. If you're choosing which mushroom killed him, then the known deadly one is more likely, and it's more likely b/c it doesn't require adding a new assumption (that the other mushroom is also deadly) beyond what is already known (the first mushroom is deadly).

Also, suppose you didn't know that it was poisoning, but there are mushrooms growing near the body. If the mushroom is unknown, then the probability that it was mushroom poisoning is less than if the mushroom is known to be deadly. Explaining the death with an already established mechanism capable of producing the result is more parsimonious than explaining it with a mechanism (the unknown mushroom) that you don't know is capable of producing the result. If the mushroom is unknown, then you have to increase what you assume is true about the world (that the unknown mushroom is deadly) in order to infer that it was mushroom poisoning. That increase in assumptions relates directly to the lower probability of accuracy.
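A crude numerical version of that point (both numbers are invented for the sketch; the point is only that the extra assumption multiplies straight into the probability of the explanation):

```python
# How likely is "mushroom poisoning" as the explanation at all, before a cause
# of death is established? All figures are illustrative assumptions, not data.

p_ate_mushroom = 0.2      # assumed chance the victim ate the nearby mushroom
base_rate_deadly = 0.03   # assumed chance that an unknown mushroom is deadly

# Known-deadly mushroom nearby: only one uncertain step (did they eat it?).
p_poisoning_known = p_ate_mushroom * 1.0

# Unknown mushroom nearby: the same step PLUS the extra assumption that this
# mushroom is deadly, so the probabilities multiply and the total drops.
p_poisoning_unknown = p_ate_mushroom * base_rate_deadly

print(p_poisoning_known)    # 0.2
print(p_poisoning_unknown)  # 0.006
```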
 
That sounds more like a logical fallacy than a logical principle. What you're saying, in effect, is that an unidentified mushroom is less likely to be dangerous than an identified one.

Not quite. I am saying that a mushroom already known to be deadly is more likely to cause death than a mushroom you know nothing about. So long as it's true that less than 100% of mushrooms are deadly, then what I said has to be true.

If the "probability" of being correct changes based on whether or not you happen to have a mycologist at the crime scene who can identify both mushrooms, then I'd say your estimation isn't very useful even if it happens to be right.


That's like saying that if you're an omniscient God who knows everything, so that no assumptions need ever be made, then there is no use in using general principles to estimate the relative probability of explanations. Well, sure. But in the real world, it is very useful.
Of course probabilities change once new knowns are added. That is always true, but there are always some uncertainties, and thus always some assumptions whose uncertainty determines the a priori probability that a given theory is correct relative to the alternatives. After all, there was a point when there were no mycologists, and when mycologists didn't know all mushrooms, and yet people had knowledge of some mushrooms.
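To illustrate with the same toy numbers as the earlier mushroom sketch (still nothing but illustrative assumptions): new knowns simply update the estimate.

```python
# Reuse the toy setup: posterior that the known-deadly mushroom was the culprit,
# assuming the victim was equally likely to have eaten either mushroom, and see
# what happens once the second mushroom gets identified.

def p_known_mushroom_did_it(p_deadly_unknown: float) -> float:
    """Posterior for the known-deadly mushroom as the culprit."""
    w_known, w_unknown = 0.5 * 1.0, 0.5 * p_deadly_unknown
    return w_known / (w_known + w_unknown)

print(p_known_mushroom_did_it(0.03))  # before identification: ~0.97
print(p_known_mushroom_did_it(1.0))   # mycologist says it's also deadly: 0.5
print(p_known_mushroom_did_it(0.0))   # mycologist says it's harmless: 1.0
```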


Frankly, I'd be suspicious that you were correct in identifying the first mushroom to begin with, if you lack the relevant experience to identify both. Even seasoned harvesters make mistakes. What a scientist would do, or a competent forensic investigator for that matter, would be to identify both mushrooms before trying to talk about "probability" of anything. Who cares what the probability of something "feels" like in the head of some uneducated muffin who knows nothing about mushrooms, while the actual probability is attainable with a bit of rational investigation?

This is a mixture of a red-herring focus on the superficial features of a given example (ignoring the underlying reasoning) and a logically fallacious false dichotomy in which you assume that either a person knows everything of any relevance, and thus assumptions are not needed, or they are an "uneducated muffin who knows nothing". The ignorance and/or dishonesty of this isn't worth responding to.


Using your version of the Razor leads one to conclude that the less you certainly know about the subject, the more probable it is that you are correct about particular conclusions relating to it, since the first fact you come to is more "likely" to be a correct basis for inference with respect to the whole, simply because you know it and not the rest.

No, that is almost the exact opposite of what I said. The more your conclusion follows from what is already known, the fewer assumptions you make beyond what is known, and thus the more likely you are to be correct. Yes, "likely", because all real knowledge, rational thought, and science is about probability and likelihoods, not certainty. Certainty is for faith-based belief and religion, which don't care about being accurate.
And the "more" in "more likely" is also critical b/c it isn't about being likely correct in absolute terms, but about the relative likelihood of accuracy between alternative accounts. Holding off on any conclusion until enough is known to meet some threshold of likelihood is always an option and a great idea when possible. But in the real world, it is very often not possible to set that threshold very high b/c we must act, and going with the relatively most likely idea is better than acting randomly.

This is anti-scientific and irrational, in my opinion. The scientific method may be slow and not very emotionally satisfying at times, but it gets results. Identify both mushrooms, and you'll have your probability. If you're lucky it might even be close to 100%, provided there are known unique symptoms involved.


Your opinion is based on not understanding rationality or science. Science applies these principles constantly, as does all rational decision making. Science progresses by applying known mechanisms as the most likely causes behind what they can most parsimoniously predict and explain. What is true is rarely known. Rather, alternative theories are constructed and comparatively evaluated on their probability, based upon their relative ability to explain the most data and predict with the most accuracy while making the fewest assumptions about the uncertainties that always exist in any rational analysis. Experiments are designed not to establish certainty, but to generate data to test whether theories that differ in assumptions also differ in what future observations they can predict, and to rule out one of the most parsimonious accounts of any co-occurrence or seeming pattern: random chance.

I think I'm starting to understand the mental bias that leads to atheism, though.

And there it is. The basis of your whole position and your rejection of basic principles of evidence-based scientific reasoning. You're a theist and/or theist apologist, so you have to reject principles of rational thought b/c rational thought inherently rejects theism, and theism is about reaching emotionally pleasing absolute certainty rather than coping with the reality that an understanding of relative likelihood, still with a good amount of uncertainty, is all that science and reason can achieve.
 