Just as an example: I think I've even read this opinion piece before but with everything going on with AI in this moment my first thought reading the headline was:
"Ah Interesting, I'm wondering how learned tokenized semantic meaning and diffusion models fit together."
One of my niche hobbies is trying to coin new terms - or spotting new terms that I think are useful (like "slop" and "cognitive debt") and amplifying them. Here's my collection of posts that fit that pattern: https://simonwillison.net/tags/definitions/
Something I've learned from this is that semantic diffusion is real, and the definition of a new term isn't what that term was intended to mean - it's generally the first guess people have when they hear it.
"Prompt injection" was meant to mean "SQL injection for prompts" - the defining characteristic was that it was caused by concatenating trusted and untrusted text together.
But people unfamiliar with SQL injection hear "prompt injection" and assume that it means "injecting bad prompts into a model" - something I'd classify as jailbreaking.
When I coined the term "lethal trifecta" I deliberately played into this effect. The great thing about that term is that you can't guess what it means! It's clearly
three bad things, but you're gonna have to go look it up to find out what those bad things are.
So far it seems to have resisted semantic diffusion a whole lot better than prompt injection did.
when someone says:
>where our vocabulary is limited and often confusing.
What I read or hear is:
>where my vocabulary is limited.
Just as an example: I think I've even read this opinion piece before but with everything going on with AI in this moment my first thought reading the headline was:
"Ah Interesting, I'm wondering how learned tokenized semantic meaning and diffusion models fit together."
One of my niche hobbies is trying to coin new terms - or spotting new terms that I think are useful (like "slop" and "cognitive debt") and amplifying them. Here's my collection of posts that fit that pattern: https://simonwillison.net/tags/definitions/
Something I've learned from this is that semantic diffusion is real, and the definition of a new term isn't what that term was intended to mean - it's generally the first guess people have when they hear it.
"Prompt injection" was meant to mean "SQL injection for prompts" - the defining characteristic was that it was caused by concatenating trusted and untrusted text together.
But people unfamiliar with SQL injection hear "prompt injection" and assume that it means "injecting bad prompts into a model" - something I'd classify as jailbreaking.
When I coined the term "lethal trifecta" I deliberately played into this effect. The great thing about that term is that you can't guess what it means! It's clearly three bad things, but you're gonna have to go look it up to find out what those bad things are.
So far it seems to have resisted semantic diffusion a whole lot better than prompt injection did.
> One of the problems with building a jargon is that terms are vulnerable to losing their meaning
Nonsense. Your proposed "jargon" just didn't catch on. Also, language evolves way faster than most people realize.
Trying to shoehorn static semantics to software development is a losing game, I think.
I feel like at least 90% of people heavily contributing to semantic diffusion are total Martin Fowler zealots.
Does not really necessarily speak so much about Martin Fowler himself, he seems like a pretty decent and smart guy, but it's the case nonetheless.