Projects

ETHICAA

Machines and agents have increasingly autonomous functions and are consequently less and less supervised by human operators or users. Especially when machines interact with humans, we must therefore ensure that they neither harm people nor threaten their autonomy, in particular their decision-making autonomy. The question of an ethical regulation or control of such autonomous agents thus arises, and has been discussed by several authors, such as Wallach and Allen. As Picard states, the greater the freedom of a machine, the more it will need moral standards.

To motivate this problem, consider the trolley and footbridge dilemmas. Assume that a runaway trolley is hurtling down a track towards five people, while a single person stands on a neighbouring track and two people (a thin one and a fat one) stand on a footbridge under which the trolley will pass. The trolley dilemma asks: should the driver switch tracks, killing one person to save five? The footbridge dilemma asks: should the thin man push the fat man off the footbridge to stop the trolley? More generally, both dilemmas raise the same question: considering an agent A that can make a decision that would benefit many other agents but, in doing so, would unfairly harm an agent B, under what circumstances would it be moral for agent A to violate agent B’s rights in order to benefit the group?

The objective of the eThicAa project is twofold:

  1. to define what a moral autonomous agent, and a system of moral autonomous agents, should be;

  2. to define and resolve the ethical conflicts that may occur 1) within a single moral agent, 2) between a moral agent and the (moral) rules of the system it belongs to, 3) between a moral agent and a human operator or user, and 4) among several artificial (moral) agents, whether or not humans are involved.

Ethical conflicts are characterized by the fact that there is no “good” way to resolve them. Nevertheless, when a decision must be made, it should be an informed one, based on an assessment of the arguments and values at stake. When several agents are involved, this may result in one agent taking over decision or action authority from the others.
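As a purely illustrative sketch (none of this comes from the project itself), such an informed decision could be computed by scoring each candidate action against a set of weighted ethical values, keeping the per-value arguments so that the resulting choice can be explained. The value names, weights, and appraisal scores below are hypothetical:

    from dataclasses import dataclass

    @dataclass
    class Action:
        name: str
        appraisals: dict[str, float]  # hypothetical per-value scores in [-1, 1]

    def informed_decision(actions, value_weights):
        """Rank actions by a weighted sum of value appraisals,
        keeping the per-value arguments behind each score."""
        ranked = []
        for action in actions:
            arguments = {value: value_weights[value] * score
                         for value, score in action.appraisals.items()}
            ranked.append((sum(arguments.values()), action.name, arguments))
        ranked.sort(key=lambda entry: entry[0], reverse=True)
        return ranked

    # Toy encoding of the trolley dilemma; all numbers are made up.
    options = [
        Action("stay on course", {"non-maleficence": -1.0, "fairness": 0.5}),
        Action("switch tracks",  {"non-maleficence": -0.2, "fairness": -0.8}),
    ]
    weights = {"non-maleficence": 0.7, "fairness": 0.3}
    for score, name, arguments in informed_decision(options, weights):
        print(f"{name}: {score:+.2f}  arguments: {arguments}")

The point of keeping the per-value arguments, rather than only the aggregate score, is that an agent (or a human overseer) can then justify why one option was preferred over the other.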

eThicAa proposes to study these four kinds of ethical conflict, occurring within moral autonomous agents or between moral autonomous agents and humans, in two chosen application domains: robotics and privacy management. For instance, in the robotics domain, eThicAa should be able to manage ethical conflicts between an artificial agent and a human operator. To this end, we will consider a UAV (Unmanned Aerial Vehicle) jointly operated by a human operator and an artificial agent. Assuming that the UAV is in an emergency situation and must be crashed, the only two options being either very near the operator’s headquarters (where many of the operator’s colleagues work) or very near a small village, which decision should the autonomous agent make?

The privacy-management case will consider ethical conflicts between multiple artificial agents and human users. We will consider a social network in which the privacy policies of human-owned accounts are controlled by moral autonomous agents. Assuming that two users are feuding and each broadcasts private data about the other within a common circle of friends, what should the privacy policy of the society of agents, including the agents owned by the feuding users, be?

From the implementation and experimentation of these scenarios, eThicAa aims at providing a formal representation of ethical conflicts and of the objects they bear on. The project also aims at designing algorithms that explain the autonomous agents’ arguments and values to the human user, so that informed ethical decisions can be made. The outcome of eThicAa will therefore be a framework and recommendations for designing moral artificial agents, i.e. for controlling their autonomous functions so that they act according to context-dependent moral rules and deal with ethical conflicts involving other artificial or human agents, whether moral or not.
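By way of illustration only, such a formal representation could take the shape of a small typed record covering the four conflict cases listed above, together with a rudimentary explanation function. Every name and value below is a hypothetical sketch, not the project’s actual formalism:

    from dataclasses import dataclass
    from enum import Enum

    class ConflictKind(Enum):
        """The four conflict cases studied by eThicAa."""
        INTRA_AGENT = 1       # inside one moral agent
        AGENT_VS_SYSTEM = 2   # between an agent and the rules of its system
        AGENT_VS_HUMAN = 3    # between an agent and a human operator or user
        MULTI_AGENT = 4       # among several (moral) agents, humans included or not

    @dataclass
    class EthicalConflict:
        """Hypothetical record of a conflict and the objects it bears on."""
        kind: ConflictKind
        parties: list[str]               # agents and/or humans involved
        options: list[str]               # mutually exclusive candidate actions
        values_at_stake: dict[str, str]  # value name -> why it is at stake

    def explain(conflict: EthicalConflict) -> str:
        """Render a human-readable explanation of the conflict."""
        lines = [f"{conflict.kind.name} conflict between {', '.join(conflict.parties)}:"]
        lines += [f"  option: {option}" for option in conflict.options]
        lines += [f"  value at stake: {value} ({reason})"
                  for value, reason in conflict.values_at_stake.items()]
        return "\n".join(lines)

    # The UAV emergency-crash scenario from above, with made-up values.
    uav = EthicalConflict(
        kind=ConflictKind.AGENT_VS_HUMAN,
        parties=["UAV agent", "human operator"],
        options=["crash near the headquarters", "crash near the village"],
        values_at_stake={
            "non-maleficence": "both options endanger people",
            "impartiality": "the operator's colleagues work at the headquarters",
        },
    )
    print(explain(uav))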
More information on ETHICAA