The typical pattern for leaders is to get “second opinions” from advisors who end up telling them whatever they want to hear, so… maybe asking the equivalent of a magic 8 ball is a marginal improvement?
Most LLMs are literally "tell you whatever you want to hear" machines, unfortunately. I’ve gotten high praise from ChatGPT for all my ideas until I go “but hang on, wouldn’t this factor stop it from being feasible?” and then it agrees with me that my original idea was a bit shit lmao
I would rather have the politicians consult a plain old magic 8 ball than one controlled by Scam Altman.