This is all anthropomorphising an object (as humans are so inclined to do) and projecting human traits onto it, when the only human traits it has are those built in at the start – and those are so limited in nature and scope that they barely count in this regard.
Why would someone want to train an AI (LLM) to be autistic?
That mention of the AI trying to copy itself over the data was one thing that really set off my alarm bells with the article. The very idea that a program which requires whole data warehouses to run could copy itself to another network is just science fantasy.
I wonder what it says about me that I am so comfortable communicating with obsolete versions of AI, but feel frustrated and misunderstood by newer, better-simulated responses? AIs, while good at simulating human responses, are just a mechanism pre-designed to produce those responses.
Firstly, I hope some of my arguments above gave you something to chew on, at least as regards how contentious the article likely is. I could have gone into it much more, but I felt the parts I raised alone should have red-flagged much of it, regardless of all the other arguments to be made against it being anything more than the levering of an age-old trope (computers/robots/aliens/etc. taking over the world and turning us into fast food or whatever) that's just been slightly updated to include the latest misunderstood and scary technology ("please insert your scare-factor of choice here...").