Teaching children coding is a waste of time, OECD chief says

Aeolienne

Well-Known Member
(Not written by me)

Teaching children coding is a waste of time, OECD chief says

By Camilla Turner, Education Editor
21 February 2019 • 19:11

Teaching children coding is a waste of time, the OECD’s education chief has said, as he predicts the skill will soon be obsolete.

Andreas Schleicher, director of education and skills at the Organisation for Economic Co-operation and Development, said that the skill is merely “a technique of our times” and will become irrelevant in the future.

"Five hundred years ago we might have thought about pen literacy,” Mr Schleicher said. "In a way coding is just one technique of our times. And I think it would be a bad mistake to have that tool become ingrained.

"You teach it to three-year-olds and by the time they graduate they will ask you 'Remind me what was coding'. That tool will be outdated very soon."

Comparing it to trigonometry, he said: "We are going to get into the same dilemma. I think is very important that we strike a better balance about those kinds of things.

"For example, I would be much more inclined to teach data science or computational thinking than to teach a very specific technique of today."

The Government has championed the teaching of coding and computing skills, with the Chancellor allocating £84 million to treble the number of computing science teachers in 2017’s Autumn Budget.

A new National Centre for Computing was set up to train up to 8,000 new teachers in the subject. Computing became part of the national curriculum in 2014, and the GCSE in Information and Computer Technology (ICT) was axed in favour of a new qualification in computing which includes more coding and programming.

Speaking at the World Innovation Summit for Education in Paris, Mr Schleicher suggested that the importance currently placed on coding is part of a wider problem in education.

“Every day there is a new idea that we think is terribly important today, and we don't think the future will be different,” he said.

Mr Schleicher said a lot of topics taught in the past have no relevance in today's education system, and that the trick is to teach fewer things in greater depth.

He continued: "Trigonometry is a good example. If you ask a mathematician if trigonometry is the foundation of mathematics, they will tell you 'No, it's a specific application'. "So it has just survived because it used to be relevant in a specific historical context."

The global expert went on to say that education is a "very conservative social environment", and that society is very good at adding things to teach children, but not so good at taking away.

"The trick is to teach fewer things at greater depth - that is really the heart of education success," he said.

A Department for Education spokesperson said: “A high-quality computing education, as part of a broad and balanced curriculum, equips pupils to become digitally literate, at a level suitable for the future workplace and as active participants in the modern digital world."

Source: Torygraph
 
This seems silly.

Coding/programming/whatever will become obsolete when computers go the route of Star Trek and have super-advanced AI that can create a program on its own after you speak vaguely at it for a while. The day when I can walk into my own personal VR chamber and/or holodeck and scream "CAT MAZE" and have the local AI reward me with a labyrinth full of felines (without someone having designed it in advance) is the day when I'll think programmers aren't needed.
 

You just need to try the right drugs...
 
It's no secret that with the advance of technology, the specter of obsolescence, in whole or in part, is always lurking about. However, as to when it happens, no one can say for sure.

Though in the case of computer programming, one might choose to focus on the present attrition rate of such employment rather than the possibility of technological obsolescence.

Simply put, it's not the sort of work most people will be doing indefinitely. Consequently, training in childhood for such work seems to be a losing proposition IMO. Better to prepare for a "limited engagement", if anything. But then, aren't most jobs in a similar category?

Last I heard, the odds are that most people will hold no fewer than three decidedly different careers in their lifetime.

No place for the old? Is software development a young person's game?

The End Of An Era: Will Programmers Become Obsolete? | Freelancer Blog
 
Anyone can say something is on its way out. He could be right - even a blind squirrel is right twice a day.

But without any studies, predictions, or statistics backing him up, he's just another hipster with an opinion.
 
But without any studies, predictions, or statistics backing him up, he's just another hipster with an opinion.

Are statistics really required to recognize that technological progress inherently breeds various manifestations of technological obsolescence over time?

Seems to me that under such circumstances, time is the major variable. Not so much a matter of "if" as "when".
 
Are statistics really required to recognize that technological progress inherently breeds various manifestations of technological obsolescence over time?

Not if you're talking larger trends. However, when he says that teaching coding now is useless because it won't be used in the future, he's setting a very firm timeline: Teach kids coding now, and when they enter the workforce in 20 years, they won't need it.

To his credit:
  • Programming languages are becoming more and more automated, high-level, and English-readable.
  • The most common and tedious programming tasks are continually being automated.
  • I could try to take a generous interpretation of his statement and say that the programming we know now will likely not be the programming we use 20 years from now.
This trend is making programming easier every day, and it's too easy to assume that we will reach a limit where every possible programming task we will ever need has already been coded, and we simply have to Lego them together.
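
To make the "higher-level and English-readable" point concrete, here's a tiny sketch in Python (picked purely as an example; nothing in the quoted article names a language): the same small task written out step by step, and then as a single near-English line.

# Task: total the even numbers in a list.
numbers = [3, 8, 15, 22, 7, 10]

# Spelled out step by step, the way lower-level code tends to read:
total = 0
for n in numbers:
    if n % 2 == 0:
        total = total + n

# The higher-level, almost English-readable version of the same thing:
total_again = sum(n for n in numbers if n % 2 == 0)

assert total == total_again == 40

Either way, someone still had to decide what counts as "even" and what to total; the higher-level tools don't do that thinking for you.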

Or perhaps he is assuming that one day we will have true artificial intelligence and we can just tell it what to do.

What he neglects:
  • There is so much to do. We keep finding more and more things to do with computers, more things to automate, more problems that we hadn't thought to apply computing to. They made the same mistake when I was in high school in the 80s. In a career workshop, one of my classmates said they wanted to be a programmer. The counselor quoted solid stats on current programming jobs (at the time, it consisted of making video games or writing simple word processors) and the number of teens interested in computers, predicting that when my classmate graduated college in 4 years, there would be a glut of programmers and competition for the few jobs out there would be incredibly high. Instead of incredibly high competition, we had the internet boom. Then smart devices, the internet of things, and big data. The point is, we're showing no signs of running out of things to program.
  • AI is hard. The more we come up with complex algorithms to solve difficult problems, the more we are tempted to call it "intelligence." But we have been working on AI for 50+ years now, and we're still just solving problems one at a time. The more we learn about intelligence, the more we realize how little progress we've made toward true AI. We can come up with approaches to solve a class of problems, like finding the best route between two arbitrary locations on a map (a small sketch of that kind of single-purpose solver follows this list), but I'm not aware of a single AI tool that can solve more than one class of problems. Even IBM's famous Watson has to be retooled and reprogrammed for every new use. We have programs that can pass the Turing Test and fool people into thinking they're intelligent (the program, not the person), but true AI is still decades or more away.
  • The basics never go away. The C programming language is 45 years old, and it's still in use today. For every device that supports high level applications like Java, .Net, and Flash, there is an OS on that device written in C or C++ to perform as optimally and efficiently as possible. My friends in government contracting jobs do all their coding in C, because performance matters. As long as we're still pushing the envelope in coming up with more complicated problems to solve, there will be one version of the solution in a high-level language that is easy to write, and another version in C that runs 10 times as fast. Now, how many programmers in the future will actually need to know C? My guess is less than 1%. But understanding what goes on under the hood always leads to better performance, and that understanding requires knowing at least some low level programming.
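
For the route-finding example mentioned in the list above, a minimal sketch of that kind of single-purpose solver, written in Python over a made-up toy road map (the place names and travel times are invented for illustration), might look like this. It answers exactly one class of question, the cheapest route from A to B, and nothing else.

import heapq

def shortest_path_cost(graph, start, goal):
    # Classic Dijkstra search: finds the cheapest route, nothing more.
    frontier = [(0, start)]              # (cost so far, node)
    best = {start: 0}
    while frontier:
        cost, node = heapq.heappop(frontier)
        if node == goal:
            return cost
        if cost > best.get(node, float("inf")):
            continue                     # stale queue entry, skip it
        for neighbour, step in graph.get(node, {}).items():
            new_cost = cost + step
            if new_cost < best.get(neighbour, float("inf")):
                best[neighbour] = new_cost
                heapq.heappush(frontier, (new_cost, neighbour))
    return None                          # goal not reachable

# Toy road map: travel times in minutes (made-up numbers).
roads = {
    "home":     {"junction": 4, "bypass": 9},
    "junction": {"bypass": 2, "office": 8},
    "bypass":   {"office": 3},
}
print(shortest_path_cost(roads, "home", "office"))   # -> 9 minutes

Point it at anything that isn't a weighted graph and it's useless, which is exactly the gap between this kind of algorithm and "true AI".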
 

No doubt. But can you truly predict the future given technology that hasn't been invented yet, relative to what you know and use in the present?

- Of course not.

It's just a matter of time in terms of what remains and what changes.
 
Seems to me the problem of possible future obsolescence is almost irrelevant. How many children will even be interested in coding or have the necessary talent for it?

 
Seems to me the problem of possible future obsolescence is almost irrelevant. How many children will even be interested in coding or have the necessary talent for it?

Now that is an interesting question. Enrollment in all STEM degrees has been steadily declining.

Not that I mind - it's less competition for me.
 
Seems to me the problem of possible future obsolescence is almost irrelevant. How many children will even be interested in coding or have the necessary talent for it?

I'm afraid that's a consideration that goes far beyond computer programming, where the decline of properly educated citizens living in a relatively free society is becoming a drain on society's ability to innovate.

Then again, real technological innovation has usually been the result of a select few rather than the masses. But yes, declining trends in education nevertheless do seem ominous, and America presently doesn't appear to have any "backstop" for them while politicians confine their arguments to public versus private schools rather than focus on any one curriculum.

Point taken though. If society academically and collectively loses the ability to innovate, obsolescence becomes a moot point. Technological innovation isn't necessarily a linear process. The Dark Ages proved as much. :eek:
 
Teaching coding as a standard class lesson like language or math: no, probably useless. Teaching it to children who are interested: definitely yes. Learn to code in one language, and the skills can be applied to learning other coding languages. It is not the coding itself that is important, but the thinking patterns and logic behind it.
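
As one small illustration of the "thinking patterns and logic" point, here's a tiny sketch in Python (an arbitrary choice; the same constructs, a variable, a loop, a condition, exist in essentially every mainstream language, so the reasoning carries over even when the syntax doesn't):

# Count how many words in a sentence are longer than five letters.
sentence = "teach fewer things at greater depth"

long_words = 0
for word in sentence.split():   # break the text into words
    if len(word) > 5:           # a condition: is this word "long"?
        long_words += 1         # keep a running count

print(long_words)               # -> 2 ("things" and "greater")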

Likewise, trigonometry is not a foundation of mathematics. However, it is a link between numerical math and non-numerical math (geometry), and it has practical real-world applications. At least the basics need to be taught.
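
As a quick example of the practical side, here's the classic "measure something you can't climb" use of trig, sketched in Python with invented numbers:

import math

# Estimate the height of a tree: stand a known distance away,
# measure the angle of elevation to the top, then
# height = distance * tan(angle).
distance_m = 20.0    # metres from the trunk (made-up)
angle_deg = 35.0     # angle up to the treetop (made-up)

height_m = distance_m * math.tan(math.radians(angle_deg))
print(round(height_m, 1))   # -> about 14.0 metres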

Educators seem to be trying to make things as simple and practical as possible, resulting in the dumbing down of students and killing the ability to think and figure things out. This kills creativity, and lack of creativity can kill a civilization.
 
Programming software will always be a needed skill. AI is the result of programming, albeit on a sophisticated level beyond the technology used to manage online forms and an Amazon purchase. Avoiding teaching fundamental skills in any field will leave the professional without a basic understanding of the principles being applied to these advanced techniques. A pilot needs to understand the principles and dynamics of flight, not just "push this for take-off, and this for landing". At some point, when planes fly themselves, they will have to be programmed to mimic the decision-making of a human to manage all the variables in flight safety. If you monitor all the experiments being done around the world to create an AI vehicle, the mistakes that result in accidents are only reparable through programming alterations. The mechanics of a vehicle are well understood, but now we want the vehicle to respond to stimuli and judge those stimuli by their relevance to damage. Machines don't think - they respond to programmed information.
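
To put the "machines don't think, they respond to programmed information" idea in concrete terms, here is a deliberately oversimplified Python sketch (toy thresholds, nothing like a real driving stack): every "decision" is a rule somebody wrote in advance, which is exactly why accidents get fixed by changing the program.

# A toy rule-based reaction, not a real control system.
def react(obstacle_distance_m, closing_speed_mps):
    if obstacle_distance_m < 5 or closing_speed_mps > 15:
        return "emergency brake"
    if obstacle_distance_m < 20:
        return "slow down"
    return "maintain speed"

print(react(obstacle_distance_m=3, closing_speed_mps=4))    # -> emergency brake
print(react(obstacle_distance_m=12, closing_speed_mps=4))   # -> slow down
print(react(obstacle_distance_m=60, closing_speed_mps=4))   # -> maintain speed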

If for some reason there is no longer a need for computer coding, then we will make those changes to the newer and better form of computer configuration and teach it to those who need to learn it. As we progress, new vocabulary will appear for the myriad of concepts that will come from technological advancement. Humans have always wanted to have robots, human-like machines that do our work for us. We will want this robot to act like a human, not unlike our relationship to domestic servants today. Humans are a selfish lot. We will demand obedience and perfection from the robots, but the funny thing is that there will be no punishment for disobedience or kitchen blunders. You won't be able to hide their passport to make them work harder. Anything that makes our relationship to AI workable will be in the programming.

We can't change nature; nature decides for itself if it is going to change. It takes about 20% of a lifetime to prepare a person for life in our modern world, and the more we advance, the more we have to teach. Technology is academically demanding, often to the exclusion of history, geography, math, the arts, language, and practical life skills. We may have to channel students into specialized fields of learning due to the volume of information needed to sustain our modern move forward. The fundamentals will always be necessary, if not directly applied, then as a springboard to innovation and the realization of project ideas.

Our brains must be challenged to exercise problem-solving skills, and we must deal with all the moral and ethical questions that will arise over AI interacting with our world. We have to make tough decisions as we approach the design and function of AI as a facilitator in our civilization. We have already toyed with such situations in our sci-fi films, designed more to be entertainment than philosophy. But to anyone who adheres to the art of sci-fi, it is always about philosophy.

I don't believe that coding and programming are soon to be useless skills. We should all understand how technology works. Answering on-screen questions while you set up your new TV is not programming, but the practical result of it. Thank the coders, the programmers, and the machines that soldered the components. A programmer taught them how to do it.
 
You just need to try the right drugs...

Nah, at this point it's not even necessary, not when using VR anyway.

Great example: Look up a game called Polybius on Oculus Rift or PSVR. I can't think of much else that embodies the idea of "drug trip" more than that. Major epilepsy warning for anyone who does look it up though. But yeah, see that and imagine essentially being inside of it thanks to the headset, instead of just seeing it on a flat screen.

VR can be very... strange. Even more so when it bugs out. But it can also be an example of some of the amazing things that a skilled programmer can pull off. There's some seriously impressive stuff in there. Provided your machine can run it without bursting into flames.
 
even a blind squirrel is right twice a day.
That's "broken clock..."
 
I would have done it indefinitely, if I would have found an inroad into it...

I would have said something similar about personal lines insurance underwriting. However it was technology that rendered the job obsolete, handing all such functions over to a computer program. Rather than be laid off, I was promoted to commercial underwriting for which such automation to my knowledge remains somewhat impractical in comparison.

I loved personal lines underwriting. I was very good at it, and could essentially do it in my sleep. Commercial lines was nothing of the sort. However, had I remained in personal lines to the end, I would not only have been laid off, but would have had no place to go in the entire industry, given all the players were migrating their personal lines programs to total automation, or what we refer to as "slot underwriting".

It sounds logical to assume that the role of a programmer would never be replaced to this degree, but with the advance of technology, sometimes one has to wonder. When I became a website designer, I prided myself on being a "hand-coder" using only what amounted to a fancy text editor. Yet so many began to use WYSIWYG programs that essentially did all the coding for them. They worked, though IMO they tend to produce bloated and kludgy code that would give automated code validators (and Jakob Nielsen) fits.

Software making software, much like machines that make other machines.

Sure, a human designed the software to make them all, but you have to wonder where this dynamic is headed and whether at some point a human programmer's role might be diminished to a "point of no return". One can only hope that this amounts to science fiction, but I wouldn't dismiss the possibility myself. At least you all can see what experiences my personal bias towards automation is based upon.
 
Software making software, much like machines that make other machines.
I have seen software that generates other software. Its product is clunky, over-elaborate and inefficient. That makes the end program run slower.

When compounded, that effect is worsened. If you want to see a simple example of that, write some text in MS Word. Save a copy as HTML. Re-open that HTML file in Wordpad.
 
I have seen software that generates other software. Its product is over-elaborate and inefficient. That makes the end program run slower.

Indeed. Very sloppy code that works, but slower and often difficult to read and troubleshoot.

Reason to appreciate those who manually produce code in an elegant, efficient fashion. -"Old School" :cool:
 
Many plain "don't get it", as it seems for ever.
Having coding be an optional course even in early school can be a good logic excersize and introduce some to basics.
 
