Usually when I try to bring up the Euthyphro dilemma with a theist, no matter how carefully I explain it or how many links I provide, they just keep re-asserting that an external authority is the only possible source of objective morality. They don't want to discuss it. What am I doing wrong?
Their objection is to moral relativism; that is, the contrast to "objective" morality is "relative" morality. By "relative morality" they mean that each group gets to decide what is moral, even when it is obviously immoral. ("If morality is relative, what would stop a community from deciding pedophilia is wonderful? As in the forced marriage of children in Nigeria. [See today's news about the 14-year-old child bride who poisoned her new husband and a couple of his friends.]")
Not that the US doesn't have its own history. In Delaware in 1895 the age of consent was 7, although most states had followed English Common Law: 10-12. Almost all states raised the minimum to 16 by 1920.
In the theist's world of objective morality from the Bible we find no age of consent, but at least one example of a child of three being "taken to wife." Nevertheless the theist asserts that Biblical morality is Best. Somehow, though, the examples of Biblical immorality are blithely ignored: how to do slavery right, baby brides, and capital punishment by stoning (cruel but not unusual then) for a teen sassing the 'rents or for gathering firewood on the Sabbath.
Theists say: if there is no God of Perfect Justice, then others (not me, of course) have "gotten away with" injustice and will never be punished. Wouldn't it be great if that bad guy over there suffered for his sadistic cruelty in the end?
Well, yes, for the other guy, they say. But me, I have adopted the God of Perfect Forgiveness. I am a serial sinner. I always feel bad about my sin and beg forgiveness (always granted) and do pretty well at not repeating sins, instead finding new and novel ones.
Morality is subjective. Can a robot be cruel to another robot? Can a robot be cruel to a human being? Moral judgment involves subjective intent, a first-person conscious intent, on the part of an actor. The actor had a choice; there was a real sense in which the actor could have done differently. Because his or her choice could have been predicted to cause harm to someone [else], we judge that act immoral. (Libertarians insert the "else.")
Because morality is a judgment, a real judgment call, it is necessarily self-generated, as all judgments are.