schizoidman@lemm.ee to Technology@lemmy.world · English · 3 days ago

Nvidia falls 14% in premarket trading as China's DeepSeek triggers global tech sell-off (www.cnbc.com)

cross-posted to: news@lemmy.world, technology@lemmy.ml
Naia@lemmy.blahaj.zone · English · 1 day ago
If you are blindly asking it questions without grounding resources, you're going to get nonsense eventually unless they're really simple questions.
They aren’t infinite knowledge repositories. The training method is lossy when it comes to memory, just like our own memory.
Give it documentation or some other context and it can summarize pretty well, and even link things across documents or other sources.
The problem is that people are misusing the technology, not that the tech has no use or merit, even if it’s just from an academic perspective.
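The grounding idea above can be as simple as pasting the documents into the prompt before the question. A minimal sketch (the function name and prompt wording are my own, not any particular library's API; the resulting string would be sent to whatever chat model you use):

```python
def build_grounded_prompt(documents, question):
    """Assemble a prompt that grounds the model in supplied documents
    instead of relying on its lossy trained-in memory."""
    # Label each document so the model can cite or link across them.
    context = "\n\n".join(
        f"[Document {i + 1}]\n{doc}" for i, doc in enumerate(documents)
    )
    return (
        "Answer using ONLY the documents below. "
        "If the answer is not in them, say you don't know.\n\n"
        f"{context}\n\nQuestion: {question}"
    )

docs = ["The service's API rate limit is 100 requests per minute."]
prompt = build_grounded_prompt(docs, "What is the API rate limit?")
print(prompt)
```

Telling the model to answer only from the supplied context (and to admit when the context doesn't cover it) is what cuts down the nonsense compared to asking from blank memory.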