fast
Contributor
I think that you first need to establish what you think it means to "make a choice".
Its original usage was specific to biological beings capable of making decisions. An expanded, deviated usage that extends the meaning to include non-biological entities doesn't alter its meaning but rather creates ambiguity. Metaphorical usage of a word that becomes a part of common usage merely adds additional uses. Take, for instance, a computer's brain. Before the word "brain" was bastardized, no one would honestly think that a computer has a brain--how silly is that? Even "thinking" has been bastardized. There are numerous words that have been taken to such extremes that we are amidst an array of words adrift in a sea of ambiguity. And that accounts for a lot of conflict.
See what I mean? It's so intrusive. A machine with goals? Are machines now capable of aspirations? People have goals. I might even go so far as to say animals have goals. If we keep going and talk like those in biology class, we might even (since we're crossing lines) say why certain plants chose to evolve as they did. Chose. See, once you start crossing the line and treat metaphorical usage as interchangeable with original usage, distinctions are lost.

"Machines can be programmed with a list of goals and a set of priorities."
This brings me to the issue in the dangers of denial. If it's indeed acceptable to say computers have goals, then it's certainly of a variety in stark contrast to a biological being having goals. Thus, I could say they do and say they don't while only giving the appearance of contradiction--kind of like how one can talk past another while using a different sense or meaning of a word that has multiple meanings.
I admit, it's difficult to talk without the helping hand of metaphorical usage, but we have to be careful not to forget what's mimicking and what's mimicked. It's not the same!--unless redefinitions are broadened to the extreme. If we replaced every human cell with an artificial cell, a human we would not have. At some point, a human brain would not be a brain, yet with ambiguity ever-present to rear its ugly head, it would be a 'brain'--but certainly not a biological one.

"When goals conflict, they can calculate the likely outcomes of choices and which outcomes best satisfy priorities. They can also be programmed to adjust future priorities on the basis of trial and error. That is because AI programmers deliberately design choice-making programs to mimic human thought processes."
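For what it's worth, the mechanism that quoted passage describes (a goal list, weighted priorities, scoring outcomes, trial-and-error adjustment) can be sketched in a few lines. This is a minimal, hypothetical illustration; the names (`choose`, `adjust`, the particular goals and weights) are mine, not any real system's:

```python
# Hypothetical sketch: a machine "choosing" by scoring predicted outcomes
# against weighted priorities, then nudging those weights by trial and error.

priorities = {"speed": 0.7, "safety": 0.3}  # assumed goal weights

# Each option predicts how well it satisfies each goal (0..1).
options = {
    "route_a": {"speed": 0.9, "safety": 0.4},
    "route_b": {"speed": 0.5, "safety": 0.9},
}

def score(outcome, weights):
    """Weighted sum: how well an outcome satisfies the priorities."""
    return sum(weights[goal] * outcome[goal] for goal in weights)

def choose(options, weights):
    """Pick the option whose predicted outcome best satisfies priorities."""
    return max(options, key=lambda name: score(options[name], weights))

def adjust(weights, goal, feedback, rate=0.1):
    """Trial-and-error update: shift weight toward a goal that paid off,
    then renormalize so the weights still sum to 1."""
    updated = dict(weights)
    updated[goal] = max(0.0, updated[goal] + rate * feedback)
    total = sum(updated.values())
    return {g: w / total for g, w in updated.items()}

pick = choose(options, priorities)        # "route_a": 0.75 beats 0.62
priorities = adjust(priorities, "safety", feedback=1.0)  # safety mattered more than predicted
```

Which is exactly the point: the "choice" here is a weighted comparison the programmer set up in advance, mimicking deliberation rather than being it.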
I'm pressed for time at the moment; I will return to follow up on the other things you were kind enough to respond to.