- Nov 24, 2017
- Northern Ireland
- Basic Beliefs
My original ethical derivations came from a pretty radical idea: that there is some principle in nature, something derived from the context of our existence in the universe, that caused the emergence of ethics in humans, and that our ethical theories are attempts to approximate it; that they are, in fact, mere approximations.
You don't sound that different to Angra at this point.
I really do think that there is a game theoretic approach possible to ethical philosophy, to make it strategic.
Let's look at Tic Tac Toe. There are things that "are": "marks are owned by players", "marks are placed in alternating sequence", "marks are placed on a three by three grid", "marks once placed are set". There is a primary GOAL, "place three of your marks in a line", and a secondary goal, "prevent three marks that are not your own from being placed in a line." From these 'is' things, one can derive an OUGHT wherein every single action made by the player is predetermined. It creates a strategy, and players who use that strategy will invariably achieve the secondary goal (never losing), and if their opponent makes any mistake at all, they will achieve the primary goal as well. This creates an ought: IF your goal is to win (and not lose), you OUGHT to apply that strategy as perfectly as possible.
Of course, I expect such an axiom to be controversial. I expect people to not want it, to reject it. I shifted to a simple game to illustrate the point in a simple context rather than a hellishly complicated one such as ethics, though it was originally thinking about Tic Tac Toe that led me to understand the mechanism by which goals derive oughts from "is".
Other games have different rules. Sometimes those rules imply that there can be no strategy: no way to achieve any particular goal, so the results of the game are effectively random (like the card game War).
So to me this says that some of the fundamental elements of moral philosophy have to be approached through the examination of goals... hence my metagoal. Because it can't just be about what I want, if I want a general strategy.
Sure. 'Goals' and 'strategies' are not arbitrary (evolution and natural selection ensure it) and game theory is a useful citation. But as I see it, the idea that goals or strategies are morally right or wrong in any factual, independent, universal, or realist way is... misguided.
Does that make me a moral relativist? Or a Consequentialist? Or a Utilitarian? Or something else? To be honest, I don't know. First, I never seem to feel I fit any particular label or ism, and second, I change my mind a lot. I think people have argued about morality since the start of recorded history and I think they probably will until the end of it and I'm not sure it will ever be philosophically resolved.
One thing to note: games (in game theory) that feature more cooperation than competition seem to produce better outcomes for the players involved, I believe. Does this suggest that retribution is not the best strategy? If so, where does that leave retributivism?
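The cooperation point can be made concrete with a toy iterated prisoner's dilemma. The payoff values and strategy names below are my own illustrative assumptions, not anything from the discussion above; the sketch just shows that sustained mutual cooperation outscores sustained mutual defection over repeated rounds.

```python
# Standard (assumed) payoffs: mutual cooperation 3 each, mutual
# defection 1 each, lone defector 5 vs. exploited cooperator 0.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def play(strat_a, strat_b, rounds=100):
    """Run two strategies against each other; return their total scores."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strat_a(hist_b)  # each strategy sees the opponent's history
        b = strat_b(hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

def tit_for_tat(opp_history):
    """Cooperate first, then mirror the opponent's last move."""
    return 'C' if not opp_history else opp_history[-1]

def always_defect(opp_history):
    return 'D'
```

Two tit-for-tat players settle into permanent cooperation and score 300 each over 100 rounds, while two永 always-defectors score only 100 each; the cooperative pairing does better even though defection "wins" any single exchange. Note the wrinkle for the retribution question: tit-for-tat itself retaliates, just briefly and forgivingly.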