
Research and Development For Robot Surgeons Should be Perma-Banned

I didn't say the military didn't care; I said weapons manufacturers and politicians don't. Armies are given the weapons the politicians buy for them. There's also a big difference between an ordinary enlisted soldier and a general. You might care if your squad mate gets killed on your watch; to a general it's just a statistic. As long as more of "them" die than "us", they're judged as having done a good job.
The comparison between a relatively simple device intended to kill and a highly sophisticated system intended to save lives is completely invalid. The quality control on a mass-produced machine costing a few hundred dollars is nowhere near as rigorous as that of a hand-built dialysis machine or MRI scanner costing tens of thousands, let alone a multi-million-dollar AI surgical robot.
 
Simple fact about computer programs or AI: they all have bugs. *ALL* of them. Not even one single exception. Period.

And no, testing doesn't catch all bugs. In fact, it doesn't even come close.

And no, having big resources and big research doesn't stop it either. There WILL be bugs. Period.

Worse, the more advanced these things get, the higher the chance for MORE bugs (and this is the bit that people always forget). AND, on top of that, it gets exponentially harder to CATCH the bugs. People always have this odd tendency to think that if a program/game/whatever is buggy, it must be because of laziness. No, it's because it's a hyper-complicated process. They also have this tendency to think that if you simply fling enough time and money at something, it will be perfected. Nope. Doesn't work that way.

WORSE STILL is the sheer ridiculous complexity that would be required to create a proper, truly functional robot surgeon capable of performing properly. Because "properly" would mean a vast amount of CONSTANT adaptation: constant encounters with situations that were not directly programmed into the thing. The human body is a strange thing that isn't even fully understood yet (not even remotely close). On top of THAT, differences occur in each individual; a mere slight size difference could be enough to set off problems in an AI. Really, even the absolute most advanced AIs in the world are, in reality, dumber than a sack of hammers. The tech simply is not there yet. And even if it were... again, the slightest bug could be a problem.

And here's the thing: even a rarely occurring bug could be a problem. That's the thing with computer programs of any sort: they're just fine until, suddenly, they aren't.
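
To make that "fine until suddenly it isn't" point concrete, here's a tiny hypothetical sketch (the function, the numbers and the sensor scenario are all made up purely for illustration). It passes every test anyone is likely to write, and still falls over the first time it meets an input nobody thought to try:

```python
# Hypothetical example of a "rare" bug: correct-looking code that passes its
# tests but crashes on one unusual input nobody thought to try.

def smoothed_reading(samples, window=5):
    """Average the most recent `window` samples from a sensor feed."""
    recent = samples[-window:]
    return sum(recent) / len(recent)   # divides by zero the first time samples == []

# The obvious tests all pass:
assert smoothed_reading([10, 10, 11, 12, 10]) == 10.6
assert smoothed_reading([7]) == 7.0

# The failure only shows up when the feed starts out empty -- say, a sensor
# that comes online a moment late. Rare on the test bench, routine in the field.
# smoothed_reading([])   # -> ZeroDivisionError
```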

It reminds me of a moment in one of Arthur C. Clarke's books... one of the Rama books, I believe. There's a point in the story at which one of the characters suddenly experiences a medical emergency. However, it's the sort of thing that surgery can easily and quickly fix. They have a robot surgeon, because of course they do. SUPPOSEDLY infallible (because people always think things like this are). During the surgery, an unexpected event occurs: some sort of impact that jolts the entire place. Not a constant quake or anything... one jolt, and not even enough of one to knock someone over. But it trips something within the robot, during a very specific part of the thing's program. A million-to-one chance of something like that ever occurring... but it happens anyway. The thing enters an endless loop, which, in that situation, means it simply doesn't stop cutting. This ends about as well as it sounds like it does.

The situation that occurred may have been super rare, but it still revealed a bug or oversight. And when it comes to freaking surgery, ONE glitch could kill. Easily. It doesn't need to be an endless loop. One tiny mistake... hell, a mistake that might not even be noticeable right away!... is enough (which of course is an issue that normal surgeons must face too).
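
And since I'm on the subject, here's a hypothetical, stripped-down sketch of that Rama-style failure (to be clear, this isn't modelled on any real surgical controller, and every name and number in it is invented). A stop condition that checks for an exact value works perfectly right up until one disturbed reading jumps straight past it, and after that there's no state in which the loop ever decides it's finished:

```python
# Hypothetical sketch of the Rama-style failure: an exact-equality stop
# condition that a single disturbed sensor reading can skip right over.

def run_cut(depth_readings, target=12.0):
    """Keep cutting until the measured depth equals the target exactly."""
    cuts = 0
    for depth in depth_readings:
        if depth == target:          # exact equality: the latent bug
            return cuts              # target reached, stop cutting
        cuts += 1                    # otherwise, cut again
    return cuts                      # never saw "done" -- kept cutting to the end

# Normal run: one reading lands exactly on the target, so it stops after two cuts.
print(run_cut([11.0, 11.5, 12.0]))               # -> 2

# One jolt makes a reading jump from 11.9 straight past 12.0. The stop
# condition is never met, so it cuts through every remaining reading.
print(run_cut([11.0, 11.9, 12.4, 13.1, 14.0]))   # -> 5
```

The fix (a >= check, a sanity limit, a watchdog timeout) is trivial once you've seen the failure; the point is that nobody sees it until the one-in-a-million reading actually arrives.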

Now obviously that's just a story. Nobody's doing robot surgeries on some sort of alien station. But IRL, there are a bazillion things that could go wrong and impact the robot like that. "Bugs" don't even have to be within the program itself. A bazillion possible engineering mistakes/errors/oversights could cause it. Or improper maintenance. Any given machine has about a million surrounding (and often unaccounted-for) variables that can get in the way. There's a reason why normal computers spaz out so often from hardware issues alone, never mind software.

Also, a human surgeon is capable of dealing with many things, in many different ways. If a problem were to occur, fast reactions or corrections are possible, AND in ways requiring heavy improvisation, even if that improvisation requires things or people that aren't in the immediate area. Again, even the most advanced AI currently possible can't even touch that.

On top of ALL OF THAT STUFF, any highly advanced operation by a robot/machine MUST be constantly supervised by someone who knows what they're doing. In the case of freaking surgery, the sheer level of skill/knowledge necessary would be extreme (and whoever possesses that knowledge probably ISN'T a surgeon themselves... they'd be an engineer of some sort. Imagine the possible communication screwups between such a person and the medical staff). Yet another layer of potential problems.

All in all, the tech necessary to create a truly reliable thing of this nature simply doesn't exist. People can't even make SIMPLE computers without them going bloody bonkers frequently. We aren't ready for freaking robot surgeons.

Oh, and that's all without even going into the possibility of malicious interference. At least a human surgeon can't be remotely hacked from a distance. And you freaking know that people WILL try that. It's not even a question of "if"; it's a question of "how quickly can they find a way".

There, those are my useless thoughts on it, because I'm bloody bored. I find the concepts of AI fascinating, but after three freaking decades of computer use, I never, ever trust programs of any sort, no matter how expensive and advanced. I've had too much experience for that.
 
I don't think anyone has suggested that an AI surgeon would be infallible or that there's no possibility of bugs, only that such a system would have to outperform human surgeons before it was allowed into widespread use.
We're a long way from functional tech of that nature, but it will come, and when it does it will already be at least as safe as current techniques.
 
In my opinion, this is one of those issues that falls under a particular category that probably has a name, but I've never heard it mentioned outside my brain, so I don't know it. It's this: I believe that our stance on certain issues is decided in an instant, with no logical basis, and the reasoning then follows to support what's already been decided.

I won't go into the details of my examples because they're controversial and I don't want to cause any problems, but issues I think this happens with are things like gay marriage, abortion, trans rights, etc.

I don't think being for or against the research and development of robot surgeons has too much to do with reasoning. I think it has everything to do with our preconceptions of robots. Some are naturally disgusted by any attempt to integrate robots into society, something they often express through a phone, computer, or television, ironically.

"Robots doing someone's job."

Whatever emotion is triggered by that statement determines the position. Then we think of reasons so we can explain the emotion out loud, with words supposedly devoid of emotion.

I could be completely wrong! That's just my fun little theory. :)
 
Machines are increasingly used in war. Hypothetically speaking, if you were in a war zone in the near future, a robotic machine might come and kill you. What, then, is wrong with having another machine fix you up? In fact, why do we need you at all?
 
"Robots doing someone's job."

Whatever emotion is triggered by that statement determines the position. Then we think of reasons so we can explain the emotion out loud, with words supposedly devoid of emotion.

Eh, it depends on the person. Some people get triggered about jobs in general. Dunno about everyone else here, but I personally simply don't care. To be honest though, as much as I don't trust machines, I'd rather interact with them than with some random jerk.

I may not trust that robotic cashier thing, but I trust it more than I do an actual cashier (also it takes less time and doesn't ask me inane questions). Maybe that's just me being negative. Or maybe that's because people are bloody stupid. Hard to tell!
 
I'd never heard of a robot surgeon, and I don't like the idea either.
But unlike video games, especially current ones, I'm sure there are fewer variables; in video games, especially open-world ones, you have so many things to deal with at the same time...

I did know about "remote" surgery though: machines that can be operated by a surgeon from a distance, so you can save time in emergency situations.

And real humans make mistakes as well; the most common is that sometimes they forget stuff inside the patient, lol.
 
The reality of today reflects a great deal of such technology already being applied day to day in situations with significant risk to human life.

But then the same thing could be said prior to the digital age. It's all relative given the possibility of human error, whether it happens at the end of a skilled surgeon's hands or through a computer glitch.

The thing to observe, though, is that collective societal progress does not cower in the face of the possibility of injury, death or property loss, whether individuals approve of it or not. Though that's not to say that research and development can't be a very bumpy road in the process. Enough for me to cringe at in certain circumstances.
 
I didn't say the military didn't care; I said weapons manufacturers and politicians don't. Armies are given the weapons the politicians buy for them.
Dude, you are British; I'm an American. Don't pretend you know that much about military weaponry. The military WOULD care about their firearms and the quality of their weapons: how complex their weapons are, how durable (also referred to as 'robust') they are, how often they jam, how hard it is to get the gun's insides dirty, and more. Militaries all around the world have been conducting trials on their firearms ever since the first rifle that could fire a unitary, all-metal cartridge was introduced, especially during and after WWI. After all, a soldier doesn't want to be caught with a jammed rifle or handgun. It's the militaries that adopt and purchase the weapons; the government just helps them pay the companies that make those weapons by setting spending budgets for those militaries.

Sorry, but do more research. There is a reason why, when you look up a weapon that is kind of iconic, such as the AK-47, M1911, M16, or M4, it will say "it has been adopted by whatever-nation's military." You, sir, are incorrect. The military conducts weapons trials on all weapons designed for them, commissioned BY the military or not. The site you listed as a resource was also an anti-gun site, so...
 
Also, have you all been ignoring that a single da Vinci unit costs around $1.2 million USD, and that does not include buying more add-ons to add variety to the instruments it is using? That's too expensive for hospitals outside of major cities to afford. Like, way too expensive. Also, I heard a story online where a man's rectum was burned during an electrical failure in the machine while he was having a cancerous tumor removed from his prostate. That didn't kill him, but he did sue, because that HAD TO REALLY HURT!
 

Hate to break the bad news to you, but the sheer cost alone of operating a hospital in non-metropolitan areas is a crisis unto itself, no matter what medical technologies they have or can afford.

There's never any parity between hospitals or private practices in terms of the services and technologies they may offer. That's a dynamic not likely to change, though don't expect it to impede technological advancements of this sort.

I still recall how my cousin had to go to another state just to get a specific diagnostic procedure unavailable in the entire state of Nevada.

Healthcare Trends for 2018

Rethinking Rural Health Solutions To Save Patients And Communities
 
Simple fact about computer programs or AI: they all have bugs. *ALL* of them. Not even one single exception. Period.

TRUE!
 
Dude, you are British; I'm an American. Don't pretend you know that much about military weaponry.

Sorry, but what in blazes does my nationality have to do with my understanding, or lack thereof, of military procedure and standards? You know nothing of me or my education, and clearly even less about my country. You also clearly don't know that I grew up in a military family in a city dominated by its naval presence.
For the record, we have as bloody a history of conflict as your own nation in recent years, and even worse when you consider how much longer we've been doing it.

I have known far too many people who've seen active service and been mutilated because of the failure of their equipment to ignore it, even people I grew up with. Far more than were injured by enemy fire, I can tell you. That includes American-made equipment that was used by both your and our armed forces and passed by both.

Of course they conduct trials, but they also cut corners, BIG corners sometimes. If they can get 10,000 of something for the same cost as 7,000 of another, but the subpar quality means a few injuries or deaths, it's acceptable collateral damage. Your claim that the generals and politicians would not allow such faulty or inferior equipment into active service is a massive disrespect to our servicemen, veterans and dead who were killed or injured, in and out of combat, by tools that weren't fit for purpose.
 

I'm just saying the politicians DO care. They spend as much as possible on their weapons. They even spend a ton on the best ammo brands.
 

That's backpedalling. You have disrespected thousands of brave servicemen, past and present. At least have the humility to accept you are mistaken, in the memory of the fallen if nothing else.

If you'd seen what I'd seen, helped people deal with the life-shattering consequences of unstable munitions, mortars that exploded in the launcher, guns that blew fingers off, or even looked past the propaganda fed to you daily, then you might understand how far wrong some of your ideas are.
 
No, I am not. I'm just correcting your inaccurate information. How am I being fed propaganda?
 
Where do you get the information that you think backs up these opinions you keep claiming as fact?
What made you think you were justified in claiming that my nationality rendered my knowledge, gained over more than three times your lifespan, inferior to yours?
How do you justify the disrespect that you have shown to the servicemen, including people I have known and helped to cope with their disabilities, who were murdered or mutilated by the supposedly infallible weapons, politicians and generals you have so much faith in?
What evidence do you have that contradicts the well-known and widely documented knowledge (such as that already provided by @Judge earlier in this thread) that governments and military administrations, including yours, frequently value economy and deadly force over the safety of their men?

Answer a few of my questions factually and accurately, and I might be better disposed to answer yours.
 
 
I believe it's useful to note that one of the participants in this argument is fifteen years old. Arguments like this make more sense when you see them coming out of the mouth of the youngun, so it's more confusing online.

I'm basing this on my students who are this age! I enjoy talking to them and even playfully debating with them, but it should always be within the context of their age. :)
 

Thanks for pointing it out, and I'm well aware of that. It's important for us all to learn that if we make sweeping or insensitive statements claimed as "truth" and can't back them up, it may backfire on us. It's better to learn that in a safe online environment such as this than in a face-to-face engagement with less tolerant people, where one runs the risk of unpleasant consequences. Like most of us, I've been there too ;)

I bear no hard feelings, but I do know some people who would be dreadfully offended by some of the comments made herein: people who have suffered the horrendous consequences of military incompetence. Having helped some of those people regain a measure of their independence, I'm somewhat hurt myself, but we should all have the opportunity to correct our mistakes and our offences.
 