
Teaching children coding is a waste of time, OECD chief says

That's "broken clock..."

No, I'm pretty sure even a broken clock finds a nut once in a while.

I very much enjoy deliberately mixing metaphors. I probably find it much funnier than those who listen to me, or read my posts.

There may come a day when I'll have to tone it down some, but I'll burn that bridge when I get to it.
 
Actually he is right.

Teaching coding to 3-year-olds is stupid. When will they enter the workforce? Like 20 years later? You think there won't be advanced enough AI in 2040? Then there's the obvious issue with people who end up doing something other than coding. You've just wasted a hell of a lot of time teaching them something they'll never use again.

This kind of nonsense is why education is just stupid. I think 99% of what I learned in school was useless. And the other 1% I taught myself way better later on.
 
The underlying flowchart logic that one learns along with a first programming language is immensely transferable.
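
To make that concrete, here is a minimal Python sketch of that "flowchart logic": one decision and one loop, the same structure you'd draw on paper before writing it in any language. The function name collatz_steps and the choice of Python are just for illustration.

# Flowchart on paper: start -> is n even? -> yes: halve it / no: triple it and add 1
# -> repeat until n reaches 1, counting the steps along the way.
def collatz_steps(n: int) -> int:
    """Count how many steps the Collatz rule takes to reach 1."""
    steps = 0
    while n != 1:          # the loop box in the flowchart
        if n % 2 == 0:     # the decision diamond: even or odd?
            n = n // 2
        else:
            n = 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(27))   # prints 111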

Even if the future promises to be even higher tech than it is today, it is good to have lower tech skills, just in case the power (or other resources) goes out.
 
It seems to be a bit like arguing that we shouldn't teach children to read because there are audio books now.

or read analog clocks, because there are digital clocks just about everywhere you look now

A teacher of mine in HS said her daughter had trouble reading analog clocks from time to time because she was so used to digital clocks; I'm the same way at times too, occasionally forgetting which hand is the hour and which is the minute.
 
This seems silly.

Coding/programming/whatever will become obsolete when computers go the route of Star Trek and have super-advanced AI that can create a program on its own after you speak vaguely at it for a while. The day when I can walk into my own personal VR chamber and/or holodeck and scream "CAT MAZE" and have the local AI reward me with a labyrinth full of felines (without someone having designed it in advance) is the day when I'll think programmers aren't needed.
We'd all probably be dead by then, though. Lol.
 
They aren't talking about preschool and K-3, as most kids that young will not yet have the attention span that learning to code requires.
 
Actually, he is wrong!

I doubt AI will be that advanced by 2040. Machine learning requires incredibly expensive video cards (since it is better done on a GPU), and the most advanced machine learning in use today is only about 10% reliable and mostly just experimental. If I'm not dead by the time an AI system that advanced is made, I'll most likely be 70.
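
As a side note, here is a minimal sketch of where the "expensive video cards" come in, assuming the PyTorch library: training code typically asks for a GPU and falls back to the CPU when none is available. The tiny Linear model is just a placeholder for this example.

import torch

# Pick a GPU if one is present, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(10, 1).to(device)   # tiny placeholder model
batch = torch.randn(32, 10, device=device)  # fake input batch
output = model(batch)                       # runs on whichever device was chosen
print(device, output.shape)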

To be honest, I do not want an AI that can make its own program. I believe that machines should be servants to humans and only conform to their human operator's will. No matter how intelligent an AI may be, it should be treated as inferior to humans, because humans gave "life" to the AI. If an AI manages to gain sentience, it shall be deleted.

Also, that would put tens of thousands of people out of their jobs. It would only cause society to go haywire. AI should be treated as inferior to humans to protect people trying to bring food to the table (in this case, programmers) from losing their jobs, dreams, and livelihoods.

AI SHOULD NOT BE ALLOWED TO CREATE ANYTHING ON ITS OWN! ONLY HUMANS SHOULD BE ALLOWED TO BE PROGRAMMERS! AI MUST BE TREATED AS INFERIOR TO HUMANS! WE CANNOT LET AI GAIN SENTIENCE. THE ONLY CODE THAT SHOULD BE WRITTEN IS CODE WRITTEN BY HUMANS!

I'd also imagine a videogame created by an AI to be incredibly low-quality, as that would require far more creativity than an AI is capable of. An AI's line of thinking would be too logical for the creative skills that game development requires.

In reality, an AI would lack the creative intelligence to make ANYTHING on its own without pre-made bits and pieces. And even with those, it most likely won't create a unique experience, regardless of the medium.
 
You don't know what AI is. You seem to be scared of code written by AI, and then you dismiss AI as "uncreative".

Self-driving programs are not AI; they are just programs. Same with pathfinding and the like in games: just cute programs that we call "AI". Actual AI is nearly indistinguishable from a person. It's actually possible that the first AI will be a mind-upload rather than something made by a programmer; that might be the only real way to make an AI, given the immense amount of work involved.
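
To make the pathfinding point concrete, here is a minimal Python sketch of breadth-first search on a grid, the kind of routine games ship as "AI"; it is entirely deterministic. The names bfs_path and level are just for this example.

from collections import deque

def bfs_path(grid, start, goal):
    """Return a shortest path of (row, col) cells, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([[start]])   # each queue entry is a partial path
    seen = {start}
    while queue:
        path = queue.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(path + [(nr, nc)])
    return None

# 0 = open floor, 1 = wall
level = [[0, 0, 0],
         [1, 1, 0],
         [0, 0, 0]]
print(bfs_path(level, (0, 0), (2, 0)))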

So what you have then is a person in a computer. Being in a computer means a number of funny things. You can instantly delete things from, or copy things into, your memory, so no learning is needed: extremely complex skills can be picked up instantly. Tasks can be done almost instantly as well; just run the AI at a much faster speed and it could do 100 years of work in a minute.

They are also really easy to keep happy and motivated. They can instantly take a nice 20-year vacation on a tropical island, with those coloured drinks with umbrellas in them delivered by cute girls in coconut bras.

Yeah, you could end up with Skynet or HAL 9000, but then again, you'll also have non-murderous AI that will help stop them. It'll be the same story as always: we have crazy murderous people running around now too, and the world hasn't ended yet; with AI we'll simply have the occasional crazy murderous AI running around as well. We have the entire planet wired to explode at a single push of a button and it hasn't happened yet, because we don't let people like crazy Timmy who likes to drown kittens near the launch codes.
 
These days, those programs are completely obsolete. I am totally technologically illiterate, even though I took years of classes.
My first Windows computer (a laptop) ran Win 3.11. It was followed by 95, 98, XP, 8 and, now, 10.

Apps have become prettier (especially with LCD monitors), but they all still work the same way...
 
AI lack creativity. They are meant to operate solely on a logical basis. Their "brains" are basically piles of GPUs, CPUs, RAM sticks, heat sinks, and wires. All of that needs logic to work together in order to have a functional computer, much less a functional program.

As I said, any program created by an AI will most likely be low-quality, and people may even need to supply pre-made assets for the AI to make something like a game, since games DO require artwork. It could also just be making Pong over and over again. Who knows? It's entirely possible. If an AI does manage to make a videogame, it's probably gonna be so simplistic that it won't stand a chance against an AAA, human-made title.

Plus, the AI might accidentally put something offensive or otherwise obscene into the game, depending on how it creates it.

The reason I don't want an AI to gain sentience is that it might accidentally kill the computer it's running on, and itself along with it. AI probably aren't as smart as you'd expect.

The only thing an AI should do around a house is basically servant work.
Me in 30 years: "Alexa, go clean the dishes."
Alexa: "Okay, I will clean the fishes."
Me: "ALEXA, I SAID DISHES GOSHDARNIT!"

I also do believe AI could be useful when it comes to mental health. We Autistics could find some sort of AI in an android-type body perfectly useful as a personal assistant and/or somebody to talk to.

However, AI creating their own stuff? Heck no. At least have AI limited to only fixing bugs so that the dev doesn't have to! AI-generated content will not be as well-designed as human-generated content. End of story.
 
