Writers: Generative AI models were built on our stolen works, are deeply unethical, and risk devaluing our entire profession.
Artists: Generative AI models were built on our stolen works, are deeply unethical, and risk devaluing our entire profession.
Developers: Wheeeeeeeeee!
@jamesthomson OTOH, we give away tons of our results for free, some of which you're using daily.
-
I appreciate this pushback.
You're right that I could be more informed, and I'm actively working on it. For example: I recently learned about OpenAI's ties to ICE and have since switched to other models (local models) because of it. That's exactly what 'informed use' looks like to me: I am trying to learn about specific harms and adjusting accordingly.
But here's where I still disagree with the Luddite comparison: The Luddites had a real choice to reject the machines. I don't have that choice anymore: I’m required to use AI at work, and personally, it helps me function with ADHD in ways that nothing else does.
So my question remains: If I can't opt out entirely, isn't 'informed use and demanding regulation' better than 'uninformed use and silence'? I'm genuinely trying to navigate this topic, not to justify myself.
Also, I am really curious why your colleagues are all against generative AI. Would you please expand on that?
Btw, the Luddites didn't really have a choice, either. As history shows, the machines were forced on them after all.
It wasn't the machines they were against; it was the abuse of them to create a power imbalance that let the machines' owners build something resembling the slavery that was then on its way out.
There are parallels to how the means of computing are now seized and used against the very people who built them in the first place.
-
@jamesthomson
A question that comes to mind: if AI-generated content (graphics, music, plot lines, or the code that binds it all together) can't be copyrighted, can we, theoretically speaking, freely copy modern games that used AI in development?
-
@owlex @the_other_jon I wish you the best. While we are on opposite sides of this particular struggle, I can respect your need to fix the broken system from the inside. Just be careful out there. https://steve-yegge.medium.com/the-ai-vampire-eda6e4f07163
@firepoet Thank you, Stephen. While we still don't share the same position, this post captures what I am looking out for in my employees and myself.
-
@jaredwhite @jamesthomson All LLM-generated code is in the public domain. The commercial companies just protect it all behind private repos. If you could force them to release it, that would be what you'd need.
@colincornaby
@jaredwhite @jamesthomson A post suggesting precisely this:
https://zomglol.wtf/@jamie/116059523957674208
-
@jamesthomson I’m not sure what exactly surprises you. If you look at cultural norms of the trades this attitude appears to be downright inevitable.
Writers basically invented copyright to legally prevent others from using their works. Nowadays writers don't edit each other's work or lift parts of others' works. All of that is relegated to fanfic, which is deemed extremely unserious, a training exercise at best.
Visual artists are similarly cagey about ownership. Copying is somewhat accepted only as a training exercise. Even remote similarities in a finished work would immediately be pointed out. They even have the concept of a forgery, an exact copy, which is an absolute no-no.
Meanwhile programmers from the earliest days felt very little attachment to the code they produced.
Bob: my dudes, look what I came up with over the weekend!
Dave: very cool! There was a bug, here's a patch.
Programmers are much more collectivist about code. They invented a license that legally binds others to give away their code.
As an example of this difference in attitude, look at id Software. They open-sourced the engine code for their games fairly quickly, while the assets (which are mostly art: graphics, music) remain restricted to this day.
So I don't see what's so surprising about them not caring much about the plagiarism issue now, given that they never really did.
-
Speak for yourself, I have the same message as artists and writers. AI-generated code feels super gross, uncomfortable, and stolen. Not worshipping the ground LLMs walk on is likely one of the reasons I was laid off a year ago. LLMs were directly the reason my non-coder friend didn't hire me to fix his Wix site. I feel like I'm forced to use them against my will or risk never getting a paycheck again.
-
@owlex Are the training sets licensed, or just strip-mined from the web/Reddit/GitHub/SourceForge? That was the cause of their "AI is theft" statement.
From a technical standpoint: are these training sets free from bugs? If you use an AI tool to generate tests, are they useful tests? A useful test is one that tries to break the code instead of showing that the code "works". Tests that merely exercise the interfaces or cover the code tend not to be useful tests.
@the_other_jon @owlex uh, I definitely disagree with your stance on tests: a test that checks that something is working is really useful when you need to refactor and want to be sure your changes haven't affected existing behaviour.
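To illustrate the two kinds of tests the exchange above is contrasting, here is a minimal sketch in Python. The `slugify` function and its tests are hypothetical, not from the thread: the first test pins down existing behaviour (valuable during refactoring, as the reply argues), while the second actively probes inputs the implementation may not have considered (the "tries to break the code" style).

```python
def slugify(title: str) -> str:
    """Turn a title into a URL slug (illustrative function, not from the thread)."""
    return "-".join(title.lower().split())

def test_slugify_behaviour():
    # Regression-style test: asserts concrete observed behaviour, so a
    # refactor that silently changes behaviour will fail loudly.
    assert slugify("Hello World") == "hello-world"
    assert slugify("  leading   spaces ") == "leading-spaces"
    assert slugify("") == ""

def test_slugify_adversarial():
    # "Break it" style test: feeds inputs the author may not have thought
    # about (tabs, newlines) rather than restating the happy path.
    assert slugify("Tabs\tand\nnewlines") == "tabs-and-newlines"

test_slugify_behaviour()
test_slugify_adversarial()
```

Both styles have a place: the regression tests make refactoring safe, and the adversarial tests are the ones most likely to surface real bugs.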
-
@jamesthomson As a developer who hates AI, the one pushback I would make against this framing is that it was a mistake to grant computer code "literary copyright protection" in the first place. It's literally computer instructions, and just as recipe instructions are not copyrightable, computer instructions should not be copyrightable. Patentable, sure, but not copyrightable.
-
@jamesthomson I just need to trick Claude into organizing everyone who uses it into a union.
-