10.4.18

A few remarks on "the problem of many hands"


An accident happened. A self-driving Uber car killed a woman in Arizona. There is a need to blame somebody.

The main questions are:

Who should be blamed?

The driver, who was in effect a passenger?

The manufacturer that produced the car?

Or maybe, to be more specific, the group of people who built the car? The programmers who wrote the software?

Who should pay compensation?

Who really killed that woman?

Can we blame an autonomous car?

What about future research? Should the project be stopped because of this victim?

History shows that progress, experiments and new machines were not always welcomed by society. Today we laugh at the obligation imposed on the first drivers, who had to provide a walker who went in front of the car carrying a special flag and warning others that an automobile was passing. This rule was introduced after a series of accidents between cars and pedestrians.
Who really killed this person in Arizona? The system, the car, the manufacturer, the software, or the designers, a group of people?

The problem is how to assign moral responsibility when a large group of people was involved in the project, in this case an autonomous car.

The problem of many hands occurs in situations in which a collective should be held morally responsible for an outcome (here, an accident), because no individual can reasonably be held morally responsible for it.

Because so many people were involved in the project, it is difficult, if not impossible, to pinpoint who is morally responsible for what.

The term "problem of many hands" describes the difficulty of attributing individual responsibility in collective settings. It occurs in public administration, corporate management, law and regulation, healthcare and, importantly for the autonomous car accident, in technological development and innovation projects.

There are two main ways of thinking about moral responsibility. The first is forward-looking: obligation and virtue. The second is backward-looking: accountability, blameworthiness and liability.

New technologies create new responsibilities and new relationships which have to be identified and articulated.

The discussion of the many hands problem came to a head when there were deaths resulting from computer errors.

From history we have learned not to blame the gun for killing people, because guns don't kill people, people kill people (at least thinking of conventional guns). But what about blaming a computer?

Blaming the computer is very tempting, because we as humans might then escape responsibility.

Most sciences treat computers as products, something less responsible than humans. This is a way of analysing computers as a system: within this system there are humans who create, implement and use those systems. According to modern philosophers, those systems should embody best practices and should be reliable. Alongside the problem of many hands there are also other obstacles: bugs in software, poor articulation of norms, and the assumption of the ethical neutrality of computers.

According to Gotterbarn, the problem of many hands is due in part to a malpractice model of responsibility. This model tends to search for one person to hold all the responsibility and take all the blame; it is an individualistic model. Such a model cannot work properly in the modern world, where there is not one team but many collectives dealing with a project. There is also one more question of who should be blamed: the collective that prepared the project, in this case the autonomous car, or the owner and user of the car.

On the other hand, the creators complain that they cannot be held responsible because they have no control over the users of the system (e.g. of an autonomous car).

It is important to underline that the replacement of humans by computers creates the illusion that people have delegated the decision-making process and made the computers responsible for it.


In the case of the autonomous car there is no individual model of responsibility. We cannot blame the computer for what happened, because we humans created it. We still have to investigate the human factors in the whole process and deal with the issues of the many hands problem.