Google’s AI-generated search results include justifications for slavery and cooking tips for Amanita ocreata, a poisonous mushroom known as the “angel of death.” The results are part of Google’s AI-powered Search Generative Experience, or SGE.
You’d think a “language model” would have “connotation” high on its list of priorities, given that connotation is a huge part of the form and function of language.
Not how it works.
It’s just a fancy version of that “predict the next word” feature smartphones have. Like if you just kept tapping the next word.
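To make the analogy concrete, here’s a minimal toy sketch of that idea in Python: count which word follows which in some text, then repeatedly “tap” the most frequent suggestion. The corpus and everything else here is made up for illustration; real LLMs learn the same next-word objective with a neural network over far more context.

```python
from collections import Counter, defaultdict

# Toy "predict the next word": count which word follows which
# in a (made-up) corpus, then always pick the top suggestion.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def predict_next(word):
    # Most common word seen after `word` in the corpus.
    return followers[word].most_common(1)[0][0]

# Keep "tapping" the suggestion, starting from "the".
word = "the"
for _ in range(5):
    print(word, end=" ")
    word = predict_next(word)
print(word)  # quickly falls into a loop, much like tapping autocomplete
```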
They don’t even have real parameters, only black box bullshit hidden parameters.
I know, I was pointing out the irony.
I’m convinced its only purpose is actually to give tech C-levels and VPs some bullshit to say for the next 18-36 months, now that “blockchain” and “pandemic disruption” are dead.
Exactly correct, I agree. LLMs will change the world, but 90% of purported use cases are nothing but hot air.
But when you can tell your phone “go find a picture of an eggplant, put a smiley face on it, and send it to Bill”, that’s going to be pretty neat. And it’s coming in the next decade. Of course that requires a different kind of model than we have now (text-to-instruction, not text-to-text). But it’s coming.
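For what it’s worth, here’s a hypothetical sketch of what “text-to-instruction” could look like: instead of replying with prose, the model emits a structured plan that the phone executes step by step. Every action name and helper below is invented for illustration, not any real phone API.

```python
import json

# Hypothetical: a text-to-instruction model emits a structured plan
# rather than prose. All action names here are made up.
model_output = json.dumps([
    {"action": "find_photo",  "args": {"query": "eggplant"}},
    {"action": "add_sticker", "args": {"sticker": "smiley"}},
    {"action": "send_to",     "args": {"contact": "Bill"}},
])

def execute(plan_json):
    for step in json.loads(plan_json):
        # A real phone OS would dispatch each step to the right app here.
        print(f"executing {step['action']} with {step['args']}")

execute(model_output)
```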
LLMs don’t know what words mean, much less the connotations they have.