DrZoidberg
Contributor
Lol. Well... it is my job to facilitate communication in large teams of people. It's what I do for a living all day. There are studies on this: roughly 55% of all defects found in software trace back to faulty (most often vague) requirements, i.e., a misunderstanding of how something should be interpreted. And these are intelligent, trained people who are aware of communication problems in teams. They still fuck it up distressingly often.
I suggest you go and do a little research of your own on this topic. Humans are awful at this.
But those are errors in interpretation, which go beyond just hearing the spoken words correctly. Even if a person accurately hears every word, they often make interpretation errors, and a single misheard word out of 100 (a 1% error rate) can mean massive interpretation errors. A 6.3% error rate means the person would be mishearing about one word in every sentence (assuming an average sentence of roughly 15 words), before even considering whether they interpret the set of words correctly. I doubt there is any research showing it's that high.
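A quick back-of-envelope check of the "one word per sentence" claim. The 15-word average sentence length is my assumption, not a figure anyone in the thread cited:

```python
# Rough sanity check: expected misheard words per sentence at a given
# word error rate. AVG_SENTENCE_WORDS is an assumed figure for
# illustration, not a value from the discussion above.

AVG_SENTENCE_WORDS = 15

def expected_misheard(error_rate, words=AVG_SENTENCE_WORDS):
    """Expected number of misheard words in one sentence."""
    return error_rate * words

for rate in (0.01, 0.063):
    print(f"{rate:.1%} error rate -> "
          f"{expected_misheard(rate):.2f} misheard words per sentence")
# A 6.3% rate gives ~0.95 misheard words per 15-word sentence,
# which matches the "about one word in every sentence" estimate.
```

So the arithmetic holds under that sentence-length assumption; with shorter sentences the per-sentence figure drops proportionally.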
Another issue is that humans are less likely to make errors on the most important words. Humans focus their attentional resources on the words that carry the most meaning (e.g., subject, verb, object), so their errors fall disproportionately on minor words and articles (the, a, an, etc.). Also, humans use contextual meaning to "guess" when a word wasn't accurately heard, so even when their guesses aren't the exact word, they will be "close" in terms of deeper meaning. In contrast, computers make errors more randomly, which means more errors than humans on key words, and the computer's "guesses" will be way off in meaning, since they "guess" based on semantic overlap rather than contextual meaning.
I think we use context to help us. I lead teams across cultures. When people from different cultures aren't used to each other, they can barely get through a sentence without needing clarifications. And this is when both are fluent and both work in the same specialisation in a specialised industry.
Take sight. I think we assume the hell out of what people are seeing. And that's how our brains work. When I studied human-computer interaction, I looked at pictures representing the amount of information that the eye actually picks up. Most of what we think we see, we don't actually see. It's the brain filling in the gaps. Hearing works the same way.
Humans have a huge head start on computers because we are humans. We share the same type of limitations as whoever is speaking, so our powers of assumption will be pretty good. But a lot of that is just guesswork, and I think we should be humble about it. If we were in conversation with a more intelligent non-human being, I highly doubt we would be able to tell that they are more intelligent than us, simply because we'd be searching for specifically human cues. Computers would need to simulate that, which is hard when we still don't really know how the brain works.