RE: https://mstdn.ca/@drikanis/116107120926277506
I'd like to comment on the common "AI is just a tool" thing: I'm a woodworker by training & that means a lot of machines - but almost every craftsperson knows how to do their job with hand tools, or "lesser" machines.
Similarly, a writer can write without a text editor - just as well, only slower.
If loss of a tool = loss of your skill & knowledge, then that tool isn't an asset, it's a liability. You're signing over your ability to do business to whoever sells & maintains that tool.
@jwcph @redmer I learned how to draw with pen & ink. There's no undo; every stroke on the paper is final. It teaches you to draw with confidence and to deal with mistakes. That skill translates to drawing digitally. And it's noticeable when artists only learned to draw digitally and are too dependent on digital tools like undo or specific brushes.
-
@jwcph I second this!

In my line of work I (as the IT guy) am implementing AI tools in our company, because in that sense we then have "control" over which tools the employees are using. But no, it is a false sense of control: we cannot control which other AI tools are in use, and so the whole AI landscape is our contemporary Wild West. 🫣
-
@jwcph Also, as a baseline: the assumption is that if you apply a skill to a tool, you receive a specific output. That is just not true, and cannot be true, for any current LLM/AI technology - the randomness IS the key function.
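The nondeterminism point can be sketched in a few lines: with temperature sampling, the same input and the same model can yield different outputs, while greedy (argmax) decoding is the deterministic behavior the post contrasts with. A toy illustration in Python - the three-token vocabulary and logits here are made up for illustration, not taken from any real model:

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Scale logits by temperature, then normalize into probabilities.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Toy next-token distribution for one fixed "prompt" (hypothetical values).
tokens = ["door", "window", "table"]
logits = [2.0, 1.5, 0.5]

# Temperature sampling: identical input, varying output.
samples = [random.choices(tokens, weights=softmax(logits))[0]
           for _ in range(10)]
print(samples)

# Greedy decoding removes the randomness: always the highest-logit token.
greedy = tokens[logits.index(max(logits))]
print(greedy)  # always "door" for these logits
```

With sampling in the loop, "apply the same skill, get the same output" no longer holds; the variability is by design, not a defect.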
-
@perigee @chirpbirb This kind of feeling is no small part of why, when I made a career change a few years ago, I decided that whatever my new job would be, I would be working primarily with people. Which I do now & it's awesome - even with a pro-AI company policy, my department, which is the odd one out in the org, has a first-line reason to be & remain good at exactly the things AI can never even approach

@jwcph @chirpbirb can you talk more about your career change?
-
-
@jwcph Funnily enough, I wanted to challenge you by saying it's unlikely a programmer would be able to write a complex program "by hand" (in assembly), but... a sufficiently motivated programmer probably could. It would be an absolutely miserable experience, and you'd have to invent a lot from first principles, but in the end it's all system calls, and the documentation is out there.
The point was, that vibe coders can't "code" at all without AI. And pre-AI, people already wrote software. So ... What was your point again?
-
> "If loss of a tool = loss of your skill & knowledge, then that tool isn't an asset, it's a liability."
So, dump trucks are a liability?
-
@art_codesmith @fedithom @jwcph this is an interesting point and maybe reflective of why llm adoption has been somewhat less controversial among programmers than writers (or woodworkers); the vast majority of programmers already had a near-total dependence on tools, so another level of abstraction is less of a bridge too far
-
@jwcph Yes. That's also why the Luddites were breaking machines. It's not like they were breaking an electric hole maker because they wanted to keep using the hand-cranked one. They were breaking tools that required no knowledge of the work in order to be used. Put a piece of wood in on one side, turn a crank, get a door out the other side.
Nothing to learn.
And if you learn nothing, there's no reason to pay you more as time passes, and no reason to keep you if you make trouble.
-
But can we keep the distinction between using something as a tool and using something as the only means to get any work done? @jwcph
-
@fedithom @art_codesmith @jwcph Agreed that this is a meaningful distinction; I'm saying that for the vast majority of programmers, compilers fall into the category of "things without which it's not possible to get any work done". Writing any machine code at all is a fairly rare skill, and developing non-trivial applications with it is almost non-existent outside of certain specialized sub-domains. This seems to make programming unlike many other arts/crafts, where it's the other way around: only certain sub-domains basically require specialized tools, while many others are doable by hand by most practitioners.
-
@jwcph Yes!!!
-
> "If loss of a tool = loss of your skill & knowledge, then that tool isn't an asset, it's a liability."
So, dump trucks are a liability?
@Downes So, you're an idiot?
-
> "If loss of a tool = loss of your skill & knowledge, then that tool isn't an asset, it's a liability."
So, dump trucks are a liability?
-
@jwcph A similar concern is the ongoing availability of a tool. Building your workflows around a tool with sustainability issues, or one controlled solely by subscriptions to a single manufacturer, has hurt other crafts time and time again (e.g. Adobe products).
@fundamental @jwcph I am an iOS developer, I *know*
-
@art_codesmith @jwcph @fedithom
1. I think it is safe to say that competent #software engineers know their tools, and an early step in any non-trivial project is to gather tools, or write new ones if needed. But we don't (and cannot) write all of them from scratch, because it is too much to keep in our heads AND there are smarter people out there who've already done the work. We can do what we do only by leveraging the work of others.
2. A tool created by automatic programming is just as useful as one created by a human. If you trust it to work in your use case, then an AI-created tool is no different.
3. The question to be answered is the same for any software tool: why do I trust it? If you are super-rigorous, you will want to use a formal logic-checking tool to prove the software correct. That's really hard, and computationally intractable for non-trivial software.
4. ALL software contains residual errors, and our ways of justifying trust in software are incomplete; they involve some kind of inductive leap that, in the best case, leaves you with a quantifiable idea of the risk of failure.
#AI is just software. Do with it what you do with any other software.
-
@Ponygirl You know, there are a lot of people who would respond to that with a bunch of hemming & hawing about how useful it can/will be for the right applications - but right now I'd say they have the burden of proof &, to my knowledge, they're not meeting it.
I'm with you.
@jwcph @Ponygirl
"AI" is not "AI". I hate that "AI" has become the term people use to refer to ChatGPT or Gemini.You have to distinguish LLMs and other genAI that are being hyped by big tech from the kind of AI that's being used in science and has been used in science for decades.
For example, I use a neural network model to denoise my astrophotography.
"AI" should never have been made available to the general public. This is a thing for science and science alone.
-
AodeRelay shared this topic