When all the humans you’re “supposed” to open up to about your mental health charge hundreds of dollars a session, need to be booked months in advance, and require you to actively convince them you’re suffering or else they dismiss you, yeah, I wonder why.
Not to mention the fact that you could get a mandatory visit to a padded room if you tell them the wrong thing, sometimes at your own expense.
This reminds me of ELIZA, a natural language processing program from the 1960s that induced what came to be called the ELIZA effect, the tendency to anthropomorphize the computer. Joseph Weizenbaum, the computer scientist who created ELIZA, later wrote Computer Power and Human Reason: From Judgment to Calculation, in which he contends that while artificial intelligence may be possible, we should never allow computers to make important decisions, because they will always lack human qualities such as compassion and wisdom.
The danger is not that AI will become self-aware and turn against humanity; it is that people will not realize it has already been turned against us by its masters.