Writers: Generative AI models were built on our stolen works, are deeply unethical, and risk devaluing our entire profession.
Artists: Generative AI models were built on our stolen works, are deeply unethical, and risk devaluing our entire profession.
Developers: Wheeeeeeeeee!
-
@jamesthomson I stopped supporting ATP because of their change from “AI is theft” to “you should pay $20 per month for ChatGPT.”
-
@jamesthomson how dare you
-
@jamesthomson the amount of podcasts I listen to is dwindling at this rate. The amount of “yeah but look Claude my best pal made an app for me so it’s all good” is making my blood boil
-
@jamesthomson "risk devaluing our entire profession..." To the people developing these models, this isn't a bug, its intended behavior.
-
@jamesthomson DragThing now by ChatGPT.
-
@jamesthomson "risk devaluing our entire profession..." To the people developing these models, this isn't a bug, its intended behavior.
@xenonchromatic Indeed.
-
I know a manager who suggested someone use AI to write a simple description of a piece of code.
Nuts.
-
@amyinorbit I mean, aside from the personal, ethical, societal, financial, and environmental issues, it's just great.
-
@jamesthomson I think the problem is developers don't really consider any unattributed use of open source as stealing - just a mild grey area. (They should consider it stealing.)
-
@the_other_jon @jamesthomson John Siracusa seems to offer the most balanced perspective there, but yeah it's pretty grim. I mostly don't listen any more. And don't get me started on MacStories omfg

-
@jamesthomson Also developers: Vibe-coding generative AI models were built on our stolen source code, we're all being laid off, please hire us.
-
@jamesthomson
I would be interested in training a model based on my own code. I spend a decent amount of time looking through my own code to find something I know I’ve done before.
-
@colincornaby @jamesthomson Open source != public domain, and free software != free (it's free speech, not free beer), but apparently many developers are clueless re: all those nuances.
Perhaps if all LLM-generated code was legally automatically placed in public domain, we'd see a bit of a light bulb moment.

-
@jamesthomson I’ve been struggling with this cognitive dissonance for years.
-
@estranged My understanding is that there isn't enough data in a small set like that to actually train usefully, as a standalone thing. So it's always going to be based on other models, with your training on top.
-
@jaredwhite @jamesthomson All LLM-generated code is in the public domain. The commercial companies just protect it all behind private repos. If you could force them to release it, that would be what you’d need.
-
@Drwave Let me know if you come to any conclusions…
-
@jamesthomson I feel like most devs don't see code as art, but as something ephemeral and disposable, a means to an end.
Like I quit IT years ago and mostly code for my own use, and I'm closer to writers and artists on this one.
-
@jamesthomson I touched on this earlier… https://m.phase.org/@parsingphase/116076869609984951 & linked