Writers: Generative AI models were built on our stolen works, are deeply unethical, and risk devaluing our entire profession.
-
@jamesthomson Except the programmers, who know that coding is just as creative as writing and "art".
@davebauerart @jamesthomson
as someone who studied software localisation: programmers who actually know what they're doing have always been a rare breed...
-
Writers: Generative AI models were built on our stolen works, are deeply unethical, and risk devaluing our entire profession.
Artists: Generative AI models were built on our stolen works, are deeply unethical, and risk devaluing our entire profession.
Developers: Wheeeeeeeeee!
@jamesthomson publishing and music companies: We’ll sign any contract with AI companies, especially if it requires us to hand over our writers’/artists’ works with little to no compensation for them.
-
@jamesthomson not all developers

Seriously though… if it wasn’t mandated by management, I wonder how many would stick with it, considering how bad a lot of the outcomes are
-
@jamesthomson @ryanvade and AI is now used massively in hiring, to the point where AI is involved in the decision-making process.
We got to this point really quickly.
@schwa @jamesthomson @ryanvade
It feels like part of the tech bro / Trump coup, and planned for years.
-
@the_other_jon @jamesthomson I was confused about this change as well. But they use Apple products while being sceptical about the company. What’s wrong with using AI as a useful tool while knowing its problems?
@owlex Are the training sets licensed, or just strip-mined from the web/Reddit/GitHub/SourceForge? This was the cause for their “AI is theft” statement.
From a technical standpoint: Are these training sets free from bugs? If you use an AI tool to generate tests, are they useful tests? A useful test is one that tries to break the code instead of showing that the code “works”. Tests that merely exercise the interfaces or cover the code tend not to be “useful” tests.
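The distinction between a test that merely shows the code "works" and one that tries to break it can be sketched in a few lines. Everything here is made up for illustration (the `parse_percentage` function and its tests are hypothetical, not from any real project):

```python
def parse_percentage(text: str) -> float:
    """Parse a string like '42%' into a float in [0, 100]."""
    value = float(text.rstrip("%"))
    if not 0.0 <= value <= 100.0:
        raise ValueError(f"out of range: {value}")
    return value

# Coverage-style test: exercises the happy path and passes, but
# would also pass for many broken implementations.
assert parse_percentage("42%") == 42.0

def raises_value_error(text: str) -> bool:
    """Helper: does parse_percentage reject this input?"""
    try:
        parse_percentage(text)
        return False
    except ValueError:
        return True

# Breaking-style tests: probe the edges where bugs actually live.
assert raises_value_error("150%")        # out of range
assert raises_value_error("abc")         # not a number at all
assert parse_percentage("0%") == 0.0     # boundary values still accepted
assert parse_percentage("100%") == 100.0
```

Generated tests tend to look like the first assert: they restate what the code already does rather than probe what it should refuse.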
-
@jamesthomson I can't begin to state how many times I've seen open-source developers (mostly web developers, to the surprise of nobody) just vibe code all of the new functionality in their libraries/frameworks without thinking through the ethical (and copyright) implications
it's stupid, and I'm getting tired of it.
when will this AI bubble pop...
-
@jamesthomson My personal theory about why this is has to do with the emphasis on output in development. Writing and art take time. Doing it right and getting all the details right really matters. GenAI can't replace that, for the sole reason that each person's art style is different and unique.
Development, on the other hand, focuses on fast production. Code changes, bug fixes, new features: these can all happen much more quickly. Add the need to be "first" on top of that and you get the current scene.
-
@lorimolson @jamesthomson Came here to say the same. Developers were also among those whose content was slurped up by big tech to feed their copyright laundering apparatus.
-
@jamesthomson Given what we've learned about the ethics of many software developers over the past several months, I wonder how many companies are currently shipping closed source code that's stolen from GPLed projects.
-
@jamesthomson writers and artists have been dealing with their craft being devalued for an incredibly long time, so they get it. Developers have never really had to deal with that, and generally speaking a lot of developers look down on anyone not in tech, so they're just not going to trust what non-tech people say. Developers are also, generally speaking, really bad at thinking about labor and organizing. I've spoken to countless developers who said they'd never join a union, even if it was an option.
-
@jamesthomson developers: 'I hope no one realises our code can no longer be protected by copyright!'
-
@the_other_jon I'm aware of these problems, and many more (energy waste, OpenAI's exploitation of workers in Africa for manual training, copyright theft, data mining).
My question stands: Why is it wrong to use something critically while being aware of its problems? Especially when we're in the middle of such a massive technological shift that we should understand it. And when capitalism is forcing it into everything anyway, isn't informed usage better than ignorance?
And it's not even just about American companies anymore. We're in a global race for AI dominance now. This whole topic is incredibly complex.
I respect you for having these principles, but I think taking it out on a podcast that reports on technology is a little weird. Though it's your decision.

-
@owlex @the_other_jon If it is incredibly complex, then shouldn't the technology be democratically controlled? Shouldn't all tech that has such a massive impact on our lives be democratically controlled? I believe it should.
-
@jamesthomson When you realize that developers have always 'secretly' been jealous of creative types, it will make sense.
-
@jamesthomson as a Developer: Generative AI models were built on our stolen works, are deeply unethical, and risk devaluing our entire profession.
-
@jamesthomson this makes me so angry.
-
@jamesthomson @ryanvade yeah there is a lot of managerial pressure and a lot of “use AI or your career will be over” FOMO.
It does suck.
@paulhebert @jamesthomson @ryanvade our company has a dashboard that shows how many times you used GenAI this month, and anyone with fewer than N uses gets put on blast. Almost everyone in software engineering knows that measuring lines of code is stupid, because one can simply insert many pointless lines of code to boost the metric. For some reason, though, AI is different. That many of us are using these tools to generate N+1 shitposts a month would be funny if it weren't contributing to the climate polycrisis.
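The lines-of-code point is easy to demonstrate with a toy metric. This is a hypothetical sketch, not any real dashboard's logic; `loc_added` and the diffs are invented for illustration:

```python
def loc_added(diff: list) -> int:
    """Toy 'productivity' metric: count added lines in a unified diff."""
    return sum(1 for line in diff if line.startswith("+"))

# A genuine one-line fix.
real_fix = ["+    return a + b"]

# The same fix padded with 50 pointless lines to juice the metric.
padded_fix = real_fix + [
    f"+    unused_{i} = None  # padding" for i in range(50)
]

assert loc_added(real_fix) == 1
assert loc_added(padded_fix) == 51  # 50x "more productive", zero extra value
```

A use-count dashboard for GenAI has the same failure mode: the cheapest way to move the number is activity that adds nothing.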
-
@paulhebert @jamesthomson @ryanvade You guys have a choice? For artists it's just "Your career is now over." Perhaps that's the difference here.
-
I'm not sure we need to democratically control the technology itself, but we absolutely need to hold companies accountable for their methods. And since these models are built on OUR collective knowledge, we should demand open-weight models and not be forbidden from using them.
The true impact of LLMs is still unfolding. If they turn out to be like the telephone or internet, then yes, strong regulatory control is needed. But if they're more like one compiler among many, maybe not.
What's clear: We need to close the legal loopholes that let companies profit parasitically from society without giving back. Democratic control means informed engagement, not avoidance.
-
@jamesthomson Not all developers are flapping their arms with joy amid these developments.
For example, I stopped sharing my photos AND my code, and I'm not using AI in any capacity.