AI is just the latest form of cheap labor the rich use to get richer, and, as always, they've convinced the poor that the laborer (AI), not the system, is the problem. If AI can map the internet to learn every artist’s style, then surely it can remember whose work it learned from. IMHO the problem isn’t that it’s impossible to track, it’s that it’s inconvenient to pay.
Side note:
* Of course, being on school grounds might automatically constitute such a lack of permission, but if a rule exists saying that weapons are prohibited on school grounds, that rule is almost certainly itself unconstitutional. Which is insane, and maybe even a good reason to eliminate the second amendment, but it is nonetheless the plain consequence of the Bill of Rights as it currently reads.
America: “You have the right to bear arms, except where the people you elected might see them.”
But tech people have never thought to teach AI that respecting someone else’s intellectual property means not merely attributing the work to the appropriate party or parties but actually obtaining permission before using it. Because such things never occur to those who have no interest in anything except: more.
That didn’t slip past me just because I didn’t mention it.

Anyway, I’m honestly more tired of white people treating the success of other white people as their own than I am of AI. AI can stay as far as I’m concerned.
I’m almost certain AI is color blind, as was my comment. People are people are people.
Those who control AI (for now) care only for green, or gold, perhaps.
God knows people have stolen not just lives but accomplishments and inventions and art and music and science and mathematics from all who could not prevent such thefts.
Trust me: I’m not concerned about white people losing more than I am concerned about anyone else—everyone else.
Even if I had not burned with rage when someone dared alter my own work, childish as it was decades ago, I’m certain I’ve read too much sci-fi and speculative fiction not to see the many ways this will go very badly. For all of us.
I’m honestly not sure I understand either of those replies, maybe because my point wasn’t clear. What I said is that AI is the new cheap labor the wealthy are exploiting (and you’ve even hinted at that yourself). Instead of focusing on the exploitation itself (the lack of consent in using artists’ work, or the absence of systems to pay them), most of the discussion I see in the media and online is about AI replacing jobs. In other words, people are blaming the cheap laborer instead of the people exploiting it.
It reminds me of slavery debates, where the question wasn’t “should people be exploited?” but “should slaves be free?” The moral framing was aimed at the wrong subject. Same thing here.
And just to clarify, my comment about being more tired of white people claiming the success of other white people than I am of AI wasn’t meant to start a debate; it was a tongue-in-cheek response to the thread title. But I’m happy to explain why I’m not tired of AI if that isn’t already obvious.
For example: if AI is being trained on data from real people, say, accounts receivable records or any other human-generated information, why shouldn’t those contributors be paid when that data creates value? Musicians get royalties every time their work is played or used commercially. Data is no different. Every data point ultimately traces back to human labor or creativity, and if AI depends on that human input, the humans behind it deserve compensation too.
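Purely as an illustration of the royalty analogy, here’s a minimal sketch of what traceable, usage-proportional payouts for data contributors could look like. Everything here is hypothetical (the class name, the split rule, the example contributors); it’s just a toy, not anyone’s actual proposal or system.

```python
from collections import defaultdict


class RoyaltyLedger:
    """Toy ledger: track whose data was used, then split a payout
    proportionally to usage, like per-play music royalties."""

    def __init__(self):
        # contributor name -> number of times their data was used
        self.usage = defaultdict(int)

    def record_use(self, contributor):
        """Log one use of a contributor's data (e.g. one training pass)."""
        self.usage[contributor] += 1

    def split(self, payout):
        """Divide a payout among contributors in proportion to usage."""
        total = sum(self.usage.values())
        return {name: payout * count / total
                for name, count in self.usage.items()}


ledger = RoyaltyLedger()
for contributor in ["alice", "alice", "bob", "carol"]:
    ledger.record_use(contributor)

# alice's data was used twice, so she gets half of a 100-unit payout
print(ledger.split(100.0))  # {'alice': 50.0, 'bob': 25.0, 'carol': 25.0}
```

The point is only that once usage is recorded, the accounting is trivial; the hard part, as with music royalties, is getting the recording mandated in the first place.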
Yet the conversation keeps being “OMG AI is taking our jobs!” instead of recognizing that we could be setting up a fair system of exchange now, while AI is still scaling up, so that it benefits us early. But no, it’s easier to bash AI than to challenge the exploiters using it. Personally, I don’t care if it’s AI or something else entirely that ends up leveling the playing field; what matters is that we start building a structure that works for everyone, not just those profiting from other people’s work.
Bombs level playing fields. Everybody loses, even those too stupid to know it.
So who is to be compensated? The people who aggregate the data? Analyze it? Those who ARE the data (or whose efforts are the data)?
People in the ‘in groups’ have been stealing from and exploiting those in the out groups for millennia. This includes but is not limited to taking credit for intellectual and creative work not originating with them. It does not, imo, make it better that this will happen to people who have, traditionally, been the in groups.
You are much more optimistic than I if you believe this levels the playing field in any way that does not resemble the work of a bomb.
I am not concerned that AI is ‘taking our jobs’ so much as AI is taking our humanity. How many people in this thread have referred to humans as what’s the term? Meat suits? Makes my skin crawl. But at least I’m alive and know it.
I get what you’re saying, and I don’t disagree that exploitation has always been part of the human story. But that’s exactly why I’m arguing we should design systems that make it harder, not easier, to keep repeating that pattern. I don’t see how my use of the phrase “level the playing field” warranted an equivalence to bombs.

I’m also not suggesting my idea is some magic wand that will erase exploitation. I mean equity through structure, like compensation models that recognize all human input in data creation. We’re talking about data here, which, if handled correctly (as in the music industry), is traceable. Exploitation in the music industry hasn’t stopped, but without the copyright system, it’s arguable that not a single artist would be guaranteed any compensation at all.
The question of who gets compensated is valid, but it’s one we can only answer if we don’t throw our hands up just because people will always find ways to exploit others. They will, with or without solutions. But at least with a solution, someone deserving of compensation gets paid.
What I’m saying is that the general public keeps focusing on AI “taking their jobs,” which, in my view, signals to the people actually building and deploying AI that doing it the right way isn’t a priority, because no one’s really talking about that. I agree that AI risks dehumanizing us, but honestly, what doesn’t when humans are behind the wheel? That’s exactly why we should be rethinking the system around it now, instead of surrendering to the same old hierarchies, letting them roll out “improved” versions built on human labor, only to cut those same humans out of the corporate rewards once the system matures.
I guess we really can’t learn from the past after all. ¯\_(ツ)_/¯