31
Tweaked a training prompt for a chatbot and got a weirdly good poem
I was messing around with a local model, trying to get it to summarize news articles. Instead of a summary, it gave me a haiku about the stock market. The whole thing was off-topic but surprisingly creative. Anyone else get a totally unexpected output that made you rethink how you prompt?
3 comments
kellyg14 · 29d ago
Last week I asked my phone's assistant for the weather and it gave me a recipe for chili instead. These systems seem to pick up on stray words and run with them, like they're making weird connections. It makes you wonder what's going on under the hood, doesn't it?
9
derek78 · 29d ago
"Making weird connections" is right... maybe they're just guessing.
4
wilson.claire · 6d ago
I read somewhere that these systems work by predicting the most likely next word based on patterns, not actually understanding what you're asking. So if "chili" appeared in your request history or in some random training data near "weather", it just went with that.
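If it helps, here's a toy sketch of that "predict the most likely next word" idea. This is just a bigram counter over a made-up mini corpus (real models are vastly more complex), but it shows how a word like "chili" sitting near "weather" in the data can steer the prediction:

```python
from collections import Counter, defaultdict

# Made-up mini corpus standing in for training data.
# Note that "chili" happens to appear near "weather".
corpus = (
    "the weather today is cold so i made chili . "
    "the weather tomorrow looks warm . "
    "i made chili for dinner ."
).split()

# Count which word follows which (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    if word not in following:
        return None
    return following[word].most_common(1)[0][0]

print(predict_next("made"))   # "chili" follows "made" most often here
print(predict_next("weather"))
```

No understanding anywhere, just counting. Scale the corpus up to the whole internet and the patterns get eerily good, which is roughly why the output can feel creative while still being a statistical guess.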
3