excreationist
Note that with ambiguous prompts, GPT-3 would normally just use the most probable meaning unless specifically asked to identify ambiguity.

I had to read that sentence twice to decide whether @bigfield is unaware of the detailed algorithms used by the libraries, or whether he is uncertain about the proportion of those libraries that actually work.
I'll see what the AI says: (with 3 tries)
I'm not sure if any of those fit what you said...

What is ambiguous about the following sentence:
Large parts of programming for me is using libraries via a public API, and I have no clue how many of those libraries actually work under the hood.
The sentence is ambiguous because it is unclear whether the speaker has no idea how many libraries they have used, or how many libraries actually work under the hood.
The ambiguity lies in what it means by "large parts". It could mean that a significant portion of programming is using libraries, or it could mean that programming is mostly using libraries.
The sentence is unclear about which part of programming the speaker is referring to - is it all of programming, or just certain aspects?
I can request that it do things like that (though it is still trying to answer the request):

A truly intelligent system sometimes says "Wait, what? Are you crazy??" in response to a question, rather than trying to answer it. Advanced intelligences also occasionally respond to questions with "Why should I care?".
I don't think users would like a system that interrupts the output when not requested, though...

write a function about calculating interest while complaining about the current line's specific code in every second line:
Python:
def calculateInterest(principal, rate, years):
    # this line is so redundant, why are we even doing this?
    interest = principal * rate * years  # ugh, this is so tedious
    return interest  # why do we even need this function??
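For reference, the function the AI produced computes simple (non-compounding) interest, I = P × r × t. A minimal, complaint-free version (a hypothetical cleanup, not part of the AI's actual output) might look like:

```python
def calculate_interest(principal, rate, years):
    """Return simple interest: principal * rate * years."""
    # rate is a decimal fraction, e.g. 0.05 for 5%
    return principal * rate * years

# e.g. $1,000 at 5% for 2 years
print(calculate_interest(1000, 0.05, 2))  # → 100.0
```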
I think it is unethical to build an AI that genuinely suffers unnecessarily.