Robots Are Being Taught to Say “No”, But for Our Own Good


Image: Humanoid robots being trained to say “no”.

By David Nield | ScienceAlert

At first, the news that software engineers are teaching robots to disobey their human masters does sound slightly troubling: should we really allow the artificial intelligence systems of the future to say no to us? But once you think it through, you can see why such a feature might actually end up saving your life.

Consider a robot working on a car production line: most of the time, it would simply follow the instructions it’s been given, but if a human should get in harm’s way, the machine needs to be clever enough to stop what it’s doing. It needs to know to override its default programming to put the human at less risk. It’s this kind of functionality that a team from the Human-Robot Interaction Lab at Tufts University is trying to introduce.


The team’s work is based around the same ‘felicity conditions’ that our human brains apply whenever we’re asked to do something. Under these conditions, we subconsciously run through a number of considerations before we perform an action: do I know how to do this? Can I physically do it, and do it right now? Am I obligated based on my social role to do it? And finally, does it violate any normative principle to do it? If robots can be conditioned to ask these same questions, they’ll be able to adapt to unexpected circumstances as they occur.
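The felicity checks above amount to a gate that every command must pass before the robot acts. Here is a minimal sketch of that idea; all the names (`Robot`, `should_comply`, the keyword flags) are illustrative assumptions, not the Tufts lab’s actual code.

```python
# Toy sketch of the felicity-condition gate described in the article.
# A command is obeyed only if every check passes.

class Robot:
    def __init__(self, skills, authorized_users):
        self.skills = skills                      # actions the robot knows how to perform
        self.authorized_users = authorized_users  # who may legitimately give commands

    def should_comply(self, action, speaker, feasible, safe):
        """Return True only if all felicity conditions hold."""
        if action not in self.skills:             # Do I know how to do this?
            return False
        if not feasible:                          # Can I physically do it right now?
            return False
        if speaker not in self.authorized_users:  # Is the speaker entitled to ask?
            return False
        if not safe:                              # Does it violate a norm, e.g. safety?
            return False
        return True

robot = Robot(skills={"walk_forward", "stop"}, authorized_users={"operator"})
print(robot.should_comply("walk_forward", "operator", feasible=True, safe=False))
# prints False: an unsafe command is refused even from an authorized operator.
```

The point of structuring it this way is that a refusal is never arbitrary: each failed check corresponds to one of the questions the article lists, so the robot can report which condition blocked the command.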

Related Article: UN Urged To Ban ‘Killer Robots’ BEFORE They Can Be Developed

It’s the last two questions that are the most important to the process. A robot needs to decide if the person giving it instructions has the authority to do so, and it needs to work out if the subsequent actions are dangerous to itself or others. It’s not an easy concept to put into practice, as anyone who’s ever watched 2001: A Space Odyssey will know (if you haven’t, watch “Open the pod bay doors”).

Related Article: Will Robots Need to be Programmed with “Feelings” In Order To Be Conscious?

As one of their demo videos below shows, the computer scientists are experimenting with user-robot dialogues that allow for some give and take. That means the robot can provide reasons for why it won’t do something (in this case it says it will fall off a table if it walks forward) while the operator can offer extra reasons for why it should (the robot will be caught once it reaches the edge).
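That give-and-take can be sketched as a loop over the robot’s beliefs: a refusal cites the blocking belief, and the operator’s assurance removes it. This is a hypothetical toy, assuming invented names (`respond`, `edge_ahead`, `will_be_caught`), not the lab’s real dialogue system.

```python
# Toy dialogue sketch: the robot refuses with a reason, and an operator's
# assurance ("I will catch you") clears the objection so it complies.

def respond(command, beliefs):
    """Refuse with a stated reason, or comply once objections are cleared."""
    if (command == "walk forward"
            and beliefs.get("edge_ahead")
            and not beliefs.get("will_be_caught")):
        return "No: I would fall off the table."
    return "OK, walking forward."

beliefs = {"edge_ahead": True}
print(respond("walk forward", beliefs))  # the robot explains its refusal
beliefs["will_be_caught"] = True         # operator: "I will catch you"
print(respond("walk forward", beliefs))  # objection removed, so it complies
```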

READ THE REST OF THE ARTICLE HERE…



6 Reader Comments


  1. Mike Jones says:

    lol. such a crock of shit. just like everything the government does is “for our own good”

  2. Nadia Asissis Kroll says:

    No Robots on my Planet ,, stand up against this shit !!!

  3. Maureen Leah says:

    Would an Infinite Being allow a technologic robot to take control of and order our lives? No danger of allowing this in my energy PERIOD!!!

  4. Siantrai Clark says:

    Skynet and I-Robot…you think that people would slow down with this…This is soooo dangerous.

  5. Steven Williams says:

    Anytime the government makes a move to limit your rights its under the guise of ” for your own good.”

