nifty@lemmy.world to Technology@lemmy.world · English · 5 months ago
Google AI making up recalls that didn’t happen (lemmy.world)
Shardikprime@lemmy.world · English · 5 months ago
I mean, LLMs are not for getting exact information. Do people ever read up on the stuff they use?
mint_tamas@lemmy.world · English · 5 months ago
Theoretically, what would the utility of AI summaries in Google Search be, if not getting exact information?
Malfeasant@lemmy.world · English · 5 months ago
Steering your eyes toward ads, of course, what a silly question.
Patch@feddit.uk · English · 5 months ago
This feels like something you should go tell Google about rather than the rest of us. They’re the ones who have embedded LLM-generated answers to random search queries.