LLMs hallucinate and are generally willing to go down rabbit holes. so if you have some crazy theory then you’re more likely to get a false positive from ChatGPT.
So i think it just exacerbates things more than the alternatives do
I have no professional skills in this area, but I would speculate that the fellow was already predisposed to schizophrenia and the LLM just triggered it (can happen with other things too like psychedelic drugs).
I’d say it either triggered on its own or was potentially triggered by drugs, and then he started using an LLM and found all the patterns to feed that schizophrenic paranoia. it’s a very self-reinforcing loop
isn’t this just paranoid schizophrenia? i don’t think chatgpt can cause that