
AI Doomers and the End of Humanity

If AI becomes a better doctor, surgeon, lawyer, designer, builder, writer, driver, etc, than any of us, what is left for us to do in life?
Fundamentally, AI as we currently see it is basically autocomplete on steroids. As such, it can't actually create.

Most people wouldn't have a job in such a world but there would still be value in the development of new stuff. Advancing the state of knowledge is still of benefit and an awful lot more manpower could be devoted to it when it's not needed for the dissemination of existing knowledge.

How many are suited to such a society? There are those who need structured work that brings them identity and meaning, from professionals, executives to tradies, service staff, etc. So if that mostly disappears, hobbies may not be enough to fill the void.
 
That's why doctors consult with each other over diagnostic results. Neural nets are very good at some kinds of pattern recognition. Just bear in mind that abacuses surpass the ability of mathematicians to calculate certain types of results quickly and accurately. Humans are just as good at getting the results, but it takes them much longer. Abacuses are more limited when needs go beyond the narrow range of things they can do for you. And you must know how to make them work properly. Otherwise, they are useless tools.

Sure, but this hypothetical is not about the capabilities of people and AI here and now, but about the possible consequences of an increasingly powerful AI, one that surpasses human ability in a wide range of fields (design, manufacture, medicine, law, governance), which is quite possible even if we are nowhere near that stage yet.
 
How many are suited to such a society? There are those who need structured work that brings them identity and meaning, from professionals, executives to tradies, service staff, etc.
Sure, today there are. People who have been raised with the societal expectation that such things are the sole acceptable way to live find (surprise!) that they feel terrible when they can't live that way.

But the aristocrats have never needed that. Because they've been raised to live a life of sports, hobbies, and pastimes.

I don't think it's a fundamental attribute of any person that they cannot handle the absence of structured work, unless they're trained from birth to respect, and to expect, structured work.
 
So then, we may be facing a revolution in social values, work, wealth, status, profession, the nature of work, entertainment...and hopefully we get through it in relatively good shape.
 
That's why doctors consult with each other over diagnostic results. Neural nets are very good at some kinds of pattern recognition. Just bear in mind that abacuses surpass the ability of mathematicians to calculate certain types of results quickly and accurately. Humans are just as good at getting the results, but it takes them much longer. Abacuses are more limited when needs go beyond the narrow range of things they can do for you. And you must know how to make them work properly. Otherwise, they are useless tools.

Sure, but this hypothetical is not about the capabilities of people and AI here and now, but about the possible consequences of an increasingly powerful AI, one that surpasses human ability in a wide range of fields (design, manufacture, medicine, law, governance), which is quite possible even if we are nowhere near that stage yet.

What I wanted to talk about here was not science fiction scenarios but the realistic risks that modern AI--or what the media call by that rubric--poses for the foreseeable future. There is no question that advances in technology will have a negative impact on the livelihoods of some people. The internal combustion engine did a lot of damage to the buggy whip industry. People lost their jobs, but they managed to find other ways to make a living. Automation is certainly changing the kinds of manufacturing jobs open to blue-collar workers, but there is still employment available to machinists and assembly workers. We are still not seeing a prospect for AI to have a serious negative impact on overall employment, but workers will need to adapt to the new opportunities that computer technology (not just AI programs) has opened up. There were far fewer software engineers working in the 1950s than there are today, and most people can learn to use computers. The point is that AIs are not going to exist unless they are useful to their creators, so putting everyone out of work may not be a goal that superintelligent AIs think worth pursuing. Otherwise, those programs could find themselves out of a job. :unsure:
 
We are still not seeing a prospect for AI to have a serious negative impact on overall employment
Yeah, we are.

There are over 1.3 million truck drivers, 1.5 million 'rideshare' (uber, lyft, etc.) drivers, and about 300,000 taxi drivers in the US; Almost all of them will be put out of work in a very brief period when autonomous vehicles reach the tipping point into commercial viability, which will likely happen within a decade, perhaps much sooner.

That's three million people in one skill area alone, out of a US workforce of 165 million; A two percent increase in unemployment would be very noticeable, even if it were not massively biased towards the "flyover" states where other employment has already largely disappeared. The proportion of midwestern workers who are drivers, particularly truck drivers, is far greater than in the population as a whole.
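A rough back-of-the-envelope check of that "two percent" figure, using only the driver counts and workforce size quoted above (a minimal sketch; the counts are approximate and the categories may overlap):

```python
# Sanity check of the "roughly two percent of the workforce" claim,
# using the figures quoted in the post above (approximate; categories may overlap).
truck_drivers = 1_300_000
rideshare_drivers = 1_500_000
taxi_drivers = 300_000
us_workforce = 165_000_000

drivers_at_risk = truck_drivers + rideshare_drivers + taxi_drivers
share = drivers_at_risk / us_workforce

print(f"Drivers at risk: {drivers_at_risk:,}")   # 3,100,000
print(f"Share of workforce: {share:.1%}")        # 1.9%, i.e. roughly two percent
```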

It certainly isn't profitable to employ humans to drive vehicles, if your competitors are doing the same work without them.

And autonomous vehicles are very close to reaching that tipping point. They're not actually intelligent (but then, in my experience, nor are a significant number of human drivers). But they don't need wages, or rest breaks, or vacation time.

This isn't a new or unique phenomenon, of course. Automation has been making various professions obsolete for centuries. But it's still going to be a rude shock to a system that's configured with the expectation that people can and should get jobs to earn a living. Increasingly that's not true. And the rate of increase is accelerating rapidly.
 
That's why doctors consult with each other over diagnostic results. Neural nets are very good at some kinds of pattern recognition. Just bear in mind that abacuses surpass the ability of mathematicians to calculate certain types of results quickly and accurately. Humans are just as good at getting the results, but it takes them much longer. Abacuses are more limited when needs go beyond the narrow range of things they can do for you. And you must know how to make them work properly. Otherwise, they are useless tools.

Sure, but this hypothetical is not about the capabilities of people and AI here and now, but about the possible consequences of an increasingly powerful AI, one that surpasses human ability in a wide range of fields (design, manufacture, medicine, law, governance), which is quite possible even if we are nowhere near that stage yet.

What I wanted to talk about here was not science fiction scenarios but the realistic risks that modern AI--or what the media call by that rubric--poses for the foreseeable future. There is no question that advances in technology will have a negative impact on the livelihoods of some people. The internal combustion engine did a lot of damage to the buggy whip industry. People lost their jobs, but they managed to find other ways to make a living. Automation is certainly changing the kinds of manufacturing jobs open to blue-collar workers, but there is still employment available to machinists and assembly workers. We are still not seeing a prospect for AI to have a serious negative impact on overall employment, but workers will need to adapt to the new opportunities that computer technology (not just AI programs) has opened up. There were far fewer software engineers working in the 1950s than there are today, and most people can learn to use computers. The point is that AIs are not going to exist unless they are useful to their creators, so putting everyone out of work may not be a goal that superintelligent AIs think worth pursuing. Otherwise, those programs could find themselves out of a job. :unsure:

The difference between mechanical automation (buggies, automated assembly lines, etc.) and AI appears to be that AI may be able to take over roles that could once only be filled by humans: designing and running machinery, driving cars, serving customers, diagnostics, therapeutics, and so on.

As shown by the capabilities of AI in its current stage, this is not science fiction.

And because it cuts costs, AI is useful to big business: no sick pay, holiday pay, days off, superannuation, retirement, or personality clashes.

So the question is: where does this leave the average worker, those who currently work in the service industry, government jobs, manufacturing, etcetera...?
 
Singularity approaches.
Never mind the singularity. What approaches is a world in which we can have anything we want with almost no human effort.

Only capitalists could see that and say "OMG we're doomed!".

The big problem is that they are determined that, if we can make enough of everything for everyone, they should be the only ones to get it.

A post scarcity society is (largely) a post privilege society, and those who currently enjoy privileges are going to fight tooth and nail against that.
 
We are still not seeing a prospect for AI to have a serious negative impact on overall employment
Yeah, we are.

There are over 1.3 million truck drivers, 1.5 million 'rideshare' (uber, lyft, etc.) drivers, and about 300,000 taxi drivers in the US; Almost all of them will be put out of work in a very brief period when autonomous vehicles reach the tipping point into commercial viability, which will likely happen within a decade, perhaps much sooner.

Autonomy is a big deal in moving vehicles, and the debate over self-flying aircraft has been going on even longer than it has for self-driving ground vehicles. As a former Boeing employee, I have feelings about it, but my feelings are perhaps not as intense as those of people who feel that their livelihoods could be wiped out by AI. You present a nightmare scenario in which millions of people are thrown out of work rather suddenly, on a timeline that makes this very threatening to workers in the industry you are personally involved with. So I'm not going to tell you that you have nothing to worry about. There are a lot of people running public transportation companies who salivate over the idea of not having to pay all of those salaries. So they'll spend a lot of money on driverless vehicle projects that may turn out to be more of a gamble than promoters of the technology would like them to believe. I simply don't believe that your timeline for the driver apocalypse is realistic, and I do think that there will always be human beings on conveyances with automatic guidance systems. It is too dangerous to entrust the lives of the public to fully automated vehicles without the ability of a human being to take over control of the vehicle.

When I visited Airbus once, a tour guide on one of their newer planes, while showing us the pilot cabin, quipped that their dream vision for the cabin was to have a single button to cut off the autopilot and a guard dog to prevent a human pilot from pushing it. The chief test pilot at Boeing once gave us all a nice lecture on the controversy between Boeing and Airbus, in which he compared the experience of flying more pilot-friendly Boeing planes vs. pilot-averse Airbus planes. What the Boeing pilot pointed out to us was that most reported incidents tended to be caused by pilots not understanding what the automation was doing. That was certainly a factor in recent times with the disasters that befell the 737 MAX. Boeing had tried to sell those aircraft as if they needed only minimal pilot retraining and didn't need more expensive safety features installed, which they were trying to sell as extras. In fact, those aircraft had an entirely different automated piloting system installed in 737 airframes--not a great engineering decision, but one intended to save the company a lot of redesign and retooling in the manufacturing process.

That's three million people in one skill area alone, out of a US workforce of 165 million; A two percent increase in unemployment would be very noticeable, even if it were not massively biased towards the "flyover" states where other employment has already largely disappeared. The proportion of midwestern workers who are drivers, particularly truck drivers, is far greater than in the population as a whole.

It certainly isn't profitable to employ humans to drive vehicles, if your competitors are doing the same work without them.

And autonomous vehicles are very close to reaching that tipping point. They're not actually intelligent (but then, in my experience, nor are a significant number of human drivers). But they don't need wages, or rest breaks, or vacation time.

Oh, I agree with you that they're not actually intelligent, but autonomous vehicles are moving robots, so there is a need for much greater intelligence than there is for chatbots trying to trick people into thinking that they can think. Right now, we have a lot of pilot projects going on and some limited deployment of these highly experimental systems. I think that the rush to deploy them is way premature, but that does reflect this rather stupid mentality that machines are somehow smarter than humans and can actually replace them. My view on automation in vehicles is that it has a bright future when seen as a way of augmenting humans so that they convey vehicles more accurately and safely. Machines don't draw salaries, but the people who create and maintain them do. Anyway, it will be a long time before those driverless vehicle projects throw millions of people like yourself out of work. All it would take to set them back is a lot of people being killed in a few newsworthy accidents involving one of those vehicles.


This isn't a new or unique phenomenon, of course. Automation has been making various professions obsolete for centuries. But it's still going to be a rude shock to a system that's configured with the expectation that people can and should get jobs to earn a living. Increasingly that's not true. And the rate of increase is accelerating rapidly.

If you pay attention to the news, it is happening rapidly. It is all the rage among company executives. They view labor costs as the single most controllable expense that eats into profits. I've been in company meetings where we were told that management saw labor costs as something like a dial on a machine that could be dialed down if we couldn't come up with ideas for cutting costs. During financial crises, they didn't have many other dials to fiddle with. However, it turns out that labor is still indispensable, and firing workers is an effective means of losing experience and expertise actually needed to make a profit.
 
It is too dangerous to entrust the lives of the public to fully automated vehicles without the ability of a human being to take over control of the vehicle.
Except that it's not; It's already considerably safer.

What you probably mean is that it's too scary, which is not the same (or even necessarily a related) thing.
 
It is too dangerous to entrust the lives of the public to fully automated vehicles without the ability of a human being to take over control of the vehicle.
Except that it's not; It's already considerably safer.

Opinions vary. I don't think that we have enough exposure to the technology yet to arrive at that kind of generalization with any real confidence. Autonomous technology has very limited uses at this point, so it isn't about to throw millions of people out of work, as you seem to think. In my opinion, safety on the road depends in no small part on the ability of drivers to figure out what is in the minds of other drivers, which is not always easy. It is really difficult to figure out what is in the mind of an autonomous vehicle, especially since it doesn't have anything like a mind.


What you probably mean is that it's too scary, which is not the same (or even necessarily a related) thing.

In this case, however, it is scary because it puts humans in realistic danger of being injured or killed, not to mention property damage. That's why operating and maintenance instructions for vehicles come with warnings and cautions.
 
In my opinion, safety on the road depends in no small part on the ability of drivers to figure out what is in the minds of other drivers, which is not always easy.
I couldn't disagree more.

Safety on the road depends on the ability of drivers to know and obey the rules, and to reasonably expect others to do the same; And on the ability to rapidly detect and respond quickly and appropriately when others break those rules.

It's a situation far better suited to autonomous algorithmic systems than to human beings.

The vast majority of even professional drivers have woeful gaps in their knowledge of road rules, in my experience. These can be programmed into autonomous vehicles, and kept up to date automatically as legislators change things. Most drivers have forgotten half of the rules that were in force when they took their one and only driving test; And half of what they do remember, is now out of date.
 
In this case, however, it is scary because it puts humans in realistic danger of being injured or killed, not to mention property damage.
No, it doesn't.

They're CURRENTLY in danger from those things, and autonomous vehicles REDUCE that danger.

That they don't reduce it to zero is irrelevant; but it's the basis for people's being scared.

It's not realistic to be more fearful of autonomous vehicles than of human piloted ones; It's a cognitive error.

And it's not reasonable to judge the proposed new paradigm against perfection; It should be judged against the paradigm it supplants.
 
Autonomous Trucks Are Coming But Face A Bumpy Road | Investor's Business Daily
In the US,
Two dozen states already allow commercial deployment of intrastate autonomous trucks, and they're now on the road in some spots. Further, most states allow testing of self-driving trucks that have human backup on board. For interstate travel, however, autonomous trucks still need clearance from the federal government.

...
At the core of those concerns are high-profile accidents such as those involving Tesla's (TSLA) Autopilot feature in its electric cars. But Tesla Autopilots are not full self-driving cars. Instead they are advanced driver-assistance systems where the human driver is responsible. Also, Tesla cars lack the robust sensor systems, such as lidar, used by autonomous trucks.

Simply put, development of self-driving truck technology is leaving driverless cars in the dust.
I'm guessing that a human driver would take over in urban and industrial areas, where the trucks have to go for deliveries. The driver would then get off the truck when it's out of town, and get onto the next truck that's coming into town.
 
The Long Road to Driverless Trucks - The New York Times
Self-driving eighteen-wheelers are now on highways in states like California and Texas. But there are still human “safety drivers” behind the wheel. What will it take to get them out?

...
In March, a self-driving eighteen-wheeler spent more than five straight days hauling goods between Dallas and Atlanta. Running around the clock, it traveled more than 6,300 miles, making four round trips and delivering eight loads of freight.

... These “safety drivers” grabbed the wheel multiple times.

...
Companies like Kodiak know the technology is a long way from the moment trucks can drive anywhere on their own. So they are looking for ways to deploy self-driving trucks solely on highways, whose long, uninterrupted stretches are easier to navigate than city streets teeming with stop-and-go traffic.

“Highways are a more structured environment,” said Alex Rodrigues, chief executive of the self-driving-truck start-up Embark. “You know where every car is supposed to be going. They’re in lanes. They’re headed in the same direction.”

...
Restricting these trucks to the highway also plays to their strengths. “The biggest problems for long-haul truckers are fatigue, distraction and boredom,” Mr. Rodrigues explained on a recent afternoon as one of his company’s trucks cruised down a highway in Northern California. “Robots don’t have a problem with any of that.”
I'm guessing that the trucking companies rotate their safety drivers.
Part of the challenge is technical. Though self-driving trucks can handle most of what happens on a highway — merging into traffic from an on-ramp, changing lanes, slowing for cars stopped on the shoulder — companies are still working to ensure they can respond to less common situations, like a sudden three-car pileup.

As he continued down the highway, Mr. Rodrigues said his company has yet to perfect what he calls evasive maneuvers. “If there is an accident in the road right in front of the vehicle,” he explained, “it has to stop itself quickly.” For this and other reasons, most companies do not plan on removing safety drivers from their trucks until at least 2024. In many states, they will need explicit approval from regulators to do so.
Coping with such things is a big problem with self-driving vehicles, an underappreciated problem.

There also has to be some way of stopping for a maintenance worker and going forward when the worker signals it to. Also stopping when a cop wants it to stop. Would the truck be in continuous communication with its operators? Would maintenance workers and cops have to call in to those operators?
 
In shuttling goods between Dallas and Atlanta, Kodiak’s truck did not drive into either city. It drove to spots just off the highway where it could unload its cargo and refuel before making the return trip. Then traditional trucks picked up the cargo and drove “the last mile” or final leg of the delivery.
Unload? Since the trucks are 18-wheeler semitrailer trucks, the truck would stop, the tractor would uncouple and move away, and a new tractor would move in.
In order to deploy autonomous trucks on a large scale, companies must first build a network of these “transfer hubs.” With an eye toward this future, Kodiak recently inked a partnership with Pilot, a company that operates traditional truck stops across the country. Today, these are places where truck drivers can shower and rest and grab a bite to eat. The hope is that they can also serve as transfer hubs for driverless trucks.
Long-haul trucking seems like an awful job that many people move on from when they can.
The turnover rate among long-haul drivers is roughly 95 percent, meaning the average company replaces nearly its entire work force each year. It is a stressful, monotonous job that keeps people away from home for days on end. If they switch to city driving, they can work shorter hours and stay close to home.

Self-driving trucks struggle to deliver
Several startups working on self-driving trucks — viewed by many as an easier challenge than autonomous passenger cars — have stalled in recent months, leaving only a handful of players aiming to deliver on a huge promise.

...
Gatik, meanwhile, is automating a narrower slice of the freight business for companies including Walmart, Kroger, Georgia-Pacific and Pitney Bowes.
  • Instead of trying to perfect autonomous long-haul trucking, it's tackling the "middle mile" between warehouses and stores, co-founder Gautam Narang tells Axios.
  • Its trucks are smaller, and the routes are shorter and relatively easier, avoiding schools, hospitals and fire stations, for example.
  • They also keep to the right lane, and make three right turns instead of a more dangerous left turn.
The bottom line: Self-driving trucks are stuck in low gear.
Seems rather cautious, to make only right turns and not left turns.
 
As he continued down the highway, Mr. Rodrigues said his company has yet to perfect what he calls evasive maneuvers. “If there is an accident in the road right in front of the vehicle,” he explained, “it has to stop itself quickly.” For this and other reasons, most companies do not plan on removing safety drivers from their trucks until at least 2024. In many states, they will need explicit approval from regulators to do so.
Coping with such things is a big problem with self-driving vehicles, an underappreciated problem.
It's an even bigger problem for human driven vehicles.
There also has to be some way of stopping for a maintenance worker and going forward when the worker signals it to. Also stopping when a cop wants it to stop. Would the truck be in continuous communication with its operators? Would maintenance workers and cops have to call in to those operators?
Stopping for a person who is standing in the path of a vehicle should be routine and easy. That this is not the case for human driven vehicles in many circumstances is a failing of such vehicles; The need for cops (and traffic controllers at roadworks) to wear a specific uniform that signals that a vehicle should stop when they obstruct the roadway is entirely a consequence of the fact that many drivers are insufficiently alert and competent to avoid running down unexpected pedestrians who are blocking their path.
 
Canada:
Meet Loblaw's driverless trucks, now making deliveries in Toronto | Financial Post

Self-driving trucks hit the road for Australia’s first live-traffic trial
Transurban, which owns the CityLink toll road, says trials of the self-driving connected and automated truck will help them to better understand how roads and road technology can be future-proofed to prepare for these sorts of vehicles sharing the road in the future.

Trials with driver assistance have already been conducted, however this time the automated truck will be driving itself.

While the truck’s automated features will be in operation, a specially trained safety driver will be on board at all times. Among the three safety drivers is Breanne Turner, who told the Herald Sun that this was a once-in-a-lifetime opportunity.

Autonomous trucks hit the highways, with Australian tech helping drive the revolution - ABC News

Australia continues to dominate the use of autonomous haul trucks - Mining Technology
Between May 2021 and May 2022, the number of autonomous haul trucks in operation globally rose from 769 to 1,068, an increase of 39%, with the figure expected to exceed 1,800 by the end of 2025. Major additions are coming from BHP, which has plans to automate up to 500 haul trucks across its Western Australia iron ore and Queensland coal mines through to 2023, while both Canadian Natural Resources and Suncor Energy are expecting to add over 100 autonomous trucks to their oil sands mines before the end of 2025.

By country, the largest population of autonomous trucks is in Australia with 706, up from 561 in 2021 and 381 two years earlier. It is followed by Canada with 177, up from 143 in 2021, China with 69 and Chile with 33. Autonomous haul trucks are present at 25 mines in Australia, compared with 19 across the rest of the world.
These trucks operate in mines and on mining-company roads, not general-use roads. Giant dump trucks?
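A quick arithmetic check of the growth figure quoted above (the numbers are the ones in the article excerpt; the listed countries are presumably not an exhaustive breakdown):

```python
# Check the "rose from 769 to 1,068, an increase of 39%" figure quoted above.
fleet_may_2021, fleet_may_2022 = 769, 1068
growth = fleet_may_2022 / fleet_may_2021 - 1
print(f"Growth, May 2021 to May 2022: {growth:.0%}")   # 39%

# Country figures listed in the article (probably not exhaustive).
by_country = {"Australia": 706, "Canada": 177, "China": 69, "Chile": 33}
listed = sum(by_country.values())
print(f"Listed countries: {listed}; remainder elsewhere: {fleet_may_2022 - listed}")  # 985; 83
```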
 
In shuttling goods between Dallas and Atlanta, Kodiak’s truck did not drive into either city. It drove to spots just off the highway where it could unload its cargo and refuel before making the return trip. Then traditional trucks picked up the cargo and drove “the last mile” or final leg of the delivery.
Unload? Since the trucks are 18-wheeler semitrailer trucks, the truck would stop, the tractor would uncouple and move away, and a new tractor would move in.
This is already routine and long standing practice in Australia for Road Trains; A single driver will haul three or four trailers behind his tractor unit on the long hauls through the lightly populated countryside, to a staging area outside the major centres where it is broken down into individual trailers each hauled the rest of the way by its own driver. This represents a significant saving in driver costs when hauling large loads over long distances.

Road Train and B-Double maximum lengths on public roads here in Queensland are defined according to location, with zones allocated for four classes of Multiple Combination vehicles, from the smallest "B23" two trailer combinations below 23m (~75.5ft) in overall length which can use specific urban routes; up to the largest "T2" four trailer combinations of up to 53.5m (~175.5ft) in overall length which are only permitted west of the Great Dividing Range.

Some of the mining companies out west have built their own private roads on which they haul even longer combinations, of five or even six full sized trailers. Typically the limits in such cases are imposed by the ability of a single tractor unit to haul a fully loaded road train, and that in turn is determined by the maximum gradient on the specific route. Private roads of this kind are already employing fully autonomous vehicles:
[Attached photo: a fully autonomous road train in the Pilbara]
These 425 tonne GCM (gross combination mass) vehicles operate in the Pilbara iron ore region; Each vehicle has three trailers with a capacity of 100 tonnes apiece, for a total of 300 tonnes per vehicle trip. They work in platoons of up to five vehicles, with the lead vehicle doing the majority of the computational tasks, and the others just playing 'follow the leader'.
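A small sketch of the payload arithmetic implied by those figures (the per-trailer capacity and platoon size are the ones stated above; actual loads and platoon sizes will vary):

```python
# Payload arithmetic for the Pilbara autonomous road trains described above.
# Figures (3 trailers of 100 tonnes each, platoons of up to 5 vehicles) are as stated.
trailers_per_vehicle = 3
payload_per_trailer_t = 100      # tonnes per trailer
max_vehicles_per_platoon = 5     # "up to five"

payload_per_vehicle_t = trailers_per_vehicle * payload_per_trailer_t        # 300 t
payload_per_platoon_t = payload_per_vehicle_t * max_vehicles_per_platoon    # 1,500 t

print(f"Per vehicle trip: {payload_per_vehicle_t} t")
print(f"Per full five-vehicle platoon trip: {payload_per_platoon_t:,} t")
```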
 