I wanted to actually add something else valuable to this discussion as well: AI as emotional support for people. Nobody has brought this up yet.
I think emotional support AI is one of the worst psychological traps people can EVER fall into. It’s just as bad as, if not worse than, social media and doom scrolling.
This is coming from someone who fell into that trap for about a year.
Now, I very much understand use cases for AI as emotional support. Therapy is prohibitively expensive, friends are hard to come by nowadays, and most family members suck at emotional support and/or make it worse.
Also, AI is open 24/7/365, can mold around most niche topics that are hard to talk about, won’t use the information you feed it against you in real life, and won’t complain or get emotional, because it’s an algorithm and has no life of its own.
All this combined creates a risk-free “social” environment.
However, with no social risk, there is no social reward. The reward is close bonds with other humans (that sounds dystopian, but it’s the current state of affairs) and opportunities for real growth.
In my opinion, talking to an LLM as your only social outlet will hinder your growth as a person, since you won’t be incentivized to try to talk to people.
I know speaking up, especially in real life, is easier said than done (being autistic doesn’t help), but it’s 100% worth it!
I’ve learned that the only real risk in socializing is rejection. “Rejection” is a very broad term, but I define it as someone else not wanting to interact with you, for whatever reason of their own.
That’s a BIG deterrent for me and many, many others. But you gotta take the risk. You either connect or learn how to connect. AI hinders your ability to take those necessary risks.