
Self-driving robot cars are racially biased

https://www.vox.com/future-perfect/...acial-bias-study-autonomous-vehicle-dark-skin

On a personal level, I've noticed it's not just the autonomous cars that are biased; I think I might be biased myself. I work at all hours and frequently drive to and from work at night. Where I live the area is open, and I have had many close calls with deer. But where I work the driving experience is even more perilous: an impoverished area with many black people jaywalking across the road. From many firsthand near misses, I can say that black people are very difficult to see at night. If I were a black person, the last thing I would do is jaywalk at night in front of a moving car. But for some reason they do it all the time in that area, as though the cars will never hit them. IMO, the deer are much easier to see and avoid than the black people who jaywalk. Both are unexpected and unpredictable, but the deer are lighter in color and easier to see. Furthermore, hitting a deer is just damage to my car... hitting a person is far more tragic.
 
It's winter here, so the main threat I see from jaywalkers is their choice of clothing.
Long sleeves, long pants, hoods, scarves, mittens...

I may not even be able to guess the race of the person I have to swerve to avoid. They're all just idiots wearing dark in the dark.
 
I agree it's all about the clothing, not the skin color. When we used to visit Panama City Beach on a regular basis, it was very common to see people wearing dark clothing. My husband came close to hitting one, but luckily I screamed that there was a person in the road while he still had a chance to stop. The person was white, but he was wearing black clothing. It was just after sunset. A lot of people don't realize how invisible they are at night if they wear dark clothing when crossing the road. At the beach, it's really bad just after sunset because there tends to be a lot of glare on the road, making it even more difficult to see anyone crossing.
 

Racially biased, or simply reality occasionally making racial distinctions?

I've had a very close call with a black jaywalker; had it played out a little differently, he would have been dead. Had he been white it still would have been close, but it wouldn't have required a big dollop of luck to save his life.

Both matter. The guy I just barely didn't hit was African-black and wearing entirely very dark clothing besides. My headlights could barely pick him out at one second out--nowhere near enough. Had I not already known he was there, the world would have had one less jaywalker.
 
What about shoes/sneakers? They are often reflective. Or were these jaywalkers walking to the opera or something?
 

The guy I nearly hit had nothing that wasn't very dark colored. Nothing reflective. He was dark enough I'm wondering if he was a burglar.
 

Perhaps you should have shot him, just in case. Apparently that's the only way to prevent a takeover of society by criminals - shoot dead anyone suspicious.
 
In many places seat belts are required by law, and helmets for bikers, so should there be a similar requirement for some kind of highly reflective belt, stripe, or design in pedestrians' clothing? If not as a legal requirement, then at least as a factor in liability in lawsuits: driver responsibility 0% or very low if the pedestrian wore no reflectors (regardless of race). Seems logical, but I've not heard of such an approach. Is there anywhere it's used?
 
In Russia they passed such a law a few years ago (pedestrians must wear something reflective at night). I haven't heard of anybody being fined, though.
 
Aren't self-driving cars equipped with some form of radar to navigate with? I drive a truck for a living, and trucks now have anti-collision radar that is rather sensitive.
 

The "science" in the article was a little but by proxy and doesn't directly support the headline. They didn't use actual self driving cars and actual pedestrians.

It would seem a bit silly to me to design a car that didn't avoid any moving object it could detect.
 

They didn't use systems taken from actual self-driving cars because the manufacturers refuse to allow independent tests. No self-driving car should be allowed anywhere near public roads until they do, and until there has been extensive, highly replicated independent research that the manufacturers have no influence over.
 

This doesn't magically make the proxy they did use justify the headline.
 
Part of the issue is that any perceptual detection system will have more difficulty detecting a darker object against a dark background (not just the darkness of night but the black road itself). In addition, detecting that an object has a "face" is probably part of the system. Beyond object/background contrast, there is the issue of color contrasts within the face itself. Light-skinned people have greater contrasts between the color of their flesh and the color of the features that make it a face, such as the dark of the nostrils, the open mouth, eye color, and the darker shadows cast by the brow and nose, plus the skin/hair color contrast that is on average much greater among whites. Humans have evolved neurons that react instinctively to faces right from birth. AI systems start from scratch, so they are probably significantly more likely than humans to fail to detect any human face under poor visual conditions, and more likely to show a racial difference in face detection.
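
As a toy illustration of that contrast point, here is a minimal Python sketch; the luminance values are invented for illustration, not measurements:

Code:
def weber_contrast(object_luminance, background_luminance):
    # Weber contrast: how strongly an object stands out from its background.
    return (object_luminance - background_luminance) / background_luminance

asphalt_at_night = 0.05  # hypothetical relative luminance of a dark road
lighter_skin = 0.35      # hypothetical value
darker_skin = 0.08       # hypothetical value

print(weber_contrast(lighter_skin, asphalt_at_night))  # ~6.0 -> strong signal
print(weber_contrast(darker_skin, asphalt_at_night))   # ~0.6 -> much weaker signal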

It's important to keep in mind that the system cannot be programmed to simply react to any object in the road. There are always small objects in the road, or just potholes, puddles, reflections, etc. A viable algorithm cannot simply stop any time there is some potential irregularity in the visual field. Thus, any system will have a threshold: it estimates the probability that the irregularity is an object of the sort that will cause harm to the car, the passengers, or the object itself, and weighs that probability against the harm caused by unexpected stops when the system has a false alarm. Even when the system detects "something", such as a sneaker reflector, it will not stop unless it can distinguish it from harmless somethings like a reflection in a puddle or off of a small piece of tin in the road, and unless it infers that the something is a person or an object large enough to warrant stopping.
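
To make that tradeoff concrete, here is a hypothetical sketch of the kind of thresholding logic I mean; every label and number is invented:

Code:
BRAKE_THRESHOLD = 0.7  # a real system would tune this on validation data

detections = [
    {"label": "reflection_in_puddle",     "pedestrian_confidence": 0.12},
    {"label": "plastic_bag",              "pedestrian_confidence": 0.31},
    {"label": "pedestrian_low_contrast",  "pedestrian_confidence": 0.64},  # missed!
    {"label": "pedestrian_high_contrast", "pedestrian_confidence": 0.93},
]

for d in detections:
    action = "BRAKE" if d["pedestrian_confidence"] >= BRAKE_THRESHOLD else "ignore"
    print(f"{d['label']:28s} -> {action}")

Note how the low-contrast pedestrian falls below the threshold and gets treated like a puddle reflection; raising the threshold trades fewer phantom stops for more misses like that one.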

Another part of the problem is that the systems only detect what they are trained to detect, and that training includes feeding the system images of people. If the designers show the system fewer black people, then it will not be as effective at detecting black people. The fact that the differences emerged under all lighting conditions suggests that too few images of black people are being used to train the system.
It might be tempting to think the ratio of training images should reflect the ratio in the real population, but this is incorrect. In order for the system to react with equal probability for different races, it must be trained with an equal number of images for each race.
And that assumes the perceptual part of the system can in principle detect the races equally. If not, then it might be that the threshold should be set using only the group that is hardest for the perceptual system to detect.
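
A minimal sketch of what equal-count training means in practice; the data and group labels are hypothetical, and real pipelines balance in more sophisticated ways:

Code:
import random

def balance_by_group(images):
    # Oversample under-represented groups (with replacement) so each
    # group contributes the same number of training images.
    groups = {}
    for img in images:
        groups.setdefault(img["group"], []).append(img)
    target = max(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(members)
        balanced.extend(random.choices(members, k=target - len(members)))
    return balanced

dataset = [{"group": "lighter_skin"}] * 900 + [{"group": "darker_skin"}] * 100
balanced = balance_by_group(dataset)
print(sum(1 for x in balanced if x["group"] == "darker_skin"))  # 900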

I am curious whether they tested it using lighter, concrete-colored roads, such as most highways are made of. That would inform how much of the issue is in the perceptual part of the system versus in the training of its decision-making algorithm.
 

The VOX headline is factually accurate. The study uses researchers' attempts to replicate the systems employed by self-driving cars, and it shows those systems are less able to detect darker-skinned people. That means it is "a potential risk" for self-driving cars, unless they go to lengths to correct for a problem that is inherent to building such systems and that has been shown to produce racial bias in similar systems actively in use, such as the facial recognition software that the FBI has shown makes more false-positive matches with black faces.
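
Roughly speaking, the audit behind such a study amounts to something like the sketch below: label each pedestrian instance by skin-tone group and compare per-group miss rates. The records here are invented; the real work used labeled benchmark images and actual model outputs:

Code:
results = [
    {"group": "lighter_skin", "detected": True},
    {"group": "lighter_skin", "detected": True},
    {"group": "lighter_skin", "detected": False},
    {"group": "darker_skin",  "detected": True},
    {"group": "darker_skin",  "detected": False},
    {"group": "darker_skin",  "detected": False},
]

for group in ("lighter_skin", "darker_skin"):
    rows = [r for r in results if r["group"] == group]
    miss_rate = sum(not r["detected"] for r in rows) / len(rows)
    print(f"{group}: miss rate {miss_rate:.0%}")

A consistent gap between the two miss rates is exactly the kind of bias the headline is pointing at.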
 

Exactly right that it would be silly to design a car that didn't avoid any moving object it could detect. Detecting pedestrians as such is of no use to an automated vehicle, because it needs to avoid hitting any obstacle: human, animal, or just a trash can sitting in the path of the vehicle. If a dog is in a crosswalk, you still don't want your car to run over it. This idea that an automated vehicle has to distinguish humans from other obstacles is just plain nutty.

TBH, I think that fully automated vehicles mixed in with human-operated ones are a bad idea. There are just too many ways that things could go wrong. That's not to say that the research isn't worth doing or that there might be cases where it would make sense. But a large part of driving involves an ability to anticipate the behavior of other drivers, not just pedestrians or animals. When I spot someone yakking on their cell phone while driving (or, worse, texting), I take measures to stay clear of that vehicle.

The field of AI has made great advances in simulating intelligent behavior, but there is nothing out there that could scale up to the kind of intelligence that other intelligent animals have, let alone human beings. That is because we do not understand how brains do simple things like learn to recognize and distinguish objects on the basis of past experiences. Our machine learning models are still too simplistic. Programmers simply cannot anticipate all of the dangers that an automated vehicle might face, so AI isn't going to make truly great leaps until it solves the machine learning problem.
 
As for making reflectors a factor in liability: if you kill someone with a car these days, the 'penalties' are already so light as to be legalized murder in most states, especially if you "only" hit a pedestrian or bicyclist. So you're pretty much already in the 'he deserved it because of what he was wearing' category. (Now where have I heard that before?)
 
I don't know why we would insist on such high standards for AI-driven cars before allowing them on public roads. The cars driven by people go through much less rigor.
 

The claim that detecting pedestrians is of no use is completely wrong.
The car needs to detect that there is an obstacle at all, and it cannot detect that something is an obstacle unless it has been trained to do so.
AI systems have no capacity to distinguish a person from a shadow, light reflecting off a window, or a plastic bag blowing across the street unless they are trained to make those distinctions based on fine-grained perceptual information.

Any AI car that you designed would slam on the brakes every two seconds, which would be only slightly less dangerous than one that failed to stop for real objects in the road.
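
A toy illustration of the difference, with invented event labels: a policy that brakes for every detected irregularity versus one that first classifies the detection against the hazard classes it was trained on.

Code:
frame_events = ["shadow", "plastic_bag", "reflection", "pedestrian",
                "shadow", "pothole", "plastic_bag", "reflection"]

HAZARDS = {"pedestrian", "animal", "vehicle"}  # classes the model was trained on

naive_brakes = len(frame_events)                          # brakes on everything
trained_brakes = sum(e in HAZARDS for e in frame_events)  # brakes once

print(naive_brakes, trained_brakes)  # 8 1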
 