@inthehands One of the factors in this mess is the heavily boosted notion that LLMs contain facts or knowledge. They do, coincidentally, sort of, but not really. A safer mental model is to think of them as a fuzzy virtual machine of sorts, not unlike a vibe-y JVM, but programmed in something dressed up as plain language. Garbage in, garbage out. Often anything in, garbage out.
mirth@mastodon.sdf.org