Ethics

25 Apr 2020

“The main object of conciliation lies in reaching a solution to a case based upon morals and with a warm heart” - Confucius

What does ethics mean in software engineering?

I took an ethics class before, which went over several ideas about how one should reason about ethics, covering deontological ethics and topics such as utilitarianism. Most cases aren't black and white; instead they fall into gray areas where no one can quite say which side, if any, is right. The broader the perspective, the bigger the gray area becomes, making it harder to distinguish right from wrong.

Putting this into the perspective of software engineering, these not-so-clear-cut gray areas start to show up as well. The work we implement can resolve one problem while also creating another problem, or at least the possibility of one. It becomes a game of finding the different situations that could arise and deciding whether any of them are bad enough to warrant concern or a solution. This especially came out when autonomous cars were being introduced as a possible mode of public transportation.

The gray line of autonomous driving

The problem that came to light from having autonomous cars on the road was that these cars could crash. The road will always present situations where crashing becomes inevitable, whether due to environmental factors, other objects on the road, or the car's artificial intelligence failing to catch the obstacle in time. If the car reaches a situation where a crash is 100% certain, what should it do? Given a situation where the car can only crash into either a pedestrian or a wall, should the driver be forcefully sacrificed, or does the pedestrian have to take the hit? What if there was more than one person? What if it was an elderly person with a child? What if the pedestrian light was red? There are many factors that affect the situation, and they ultimately leave the car to decide who to kill. Surveys even show that many people wouldn't mind self-sacrifice as an option, as long as the situation didn't affect them personally. More complex situations left respondents more split about the scenario as a whole.
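To make the discomfort concrete, here is a purely hypothetical toy sketch in Python. It is not any real manufacturer's logic; the Outcome fields, the scoring rule, and every weight in it are assumptions I made up for illustration. The point is that whatever rule the car ends up following, someone has to write it down as code that puts numbers on people.

```python
# Hypothetical toy sketch only: how an "unavoidable crash" decision might
# look if it had to be written down as code. All fields and weights are
# made-up assumptions for illustration, not any real vehicle's logic.

from dataclasses import dataclass


@dataclass
class Outcome:
    description: str
    occupants_harmed: int        # people inside the car
    pedestrians_harmed: int      # people outside the car
    pedestrian_light_red: bool = False


def choose_outcome(options: list[Outcome]) -> Outcome:
    """Pick the 'least bad' outcome by a crude utilitarian score.

    Every number below is an ethical judgment someone had to make,
    which is exactly the gray area the post is talking about.
    """
    def score(o: Outcome) -> float:
        harm = float(o.occupants_harmed + o.pedestrians_harmed)
        # Example of a gray-area rule: does a red pedestrian light
        # make pedestrian harm "count less"? Arbitrary assumption.
        if o.pedestrian_light_red:
            harm -= 0.5 * o.pedestrians_harmed
        return harm

    return min(options, key=score)


if __name__ == "__main__":
    options = [
        Outcome("swerve into wall", occupants_harmed=1, pedestrians_harmed=0),
        Outcome("continue straight", occupants_harmed=0, pedestrians_harmed=2,
                pedestrian_light_red=True),
    ]
    print(choose_outcome(options).description)
```

Notice that a purely utilitarian score like this and a deontological rule such as "never actively swerve into a person" can pick opposite outcomes in the very same scenario, which is the split the surveys show up in people's answers.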

Conclusion

I once looked into this case when the topic first came up, which was also while I was taking the ethics class. Seeing it again gives a real-world application of the trolley problem, where you would choose either the deontological path of judging by the action itself or the utilitarian path of judging by the outcome. Given this situation, I can't give a clear-cut answer about which path is correct. I believe one of the recent solutions was to let the driver take full control at any time. Making the driver responsible negates most of the situations that would otherwise be put on the car and gives full responsibility to the driver.
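As a rough illustration of that hand-off idea, here is a minimal sketch of a control loop that always defers to the driver when they intervene. The interfaces are hypothetical placeholders, not a real autonomy stack; it only shows the design choice of shifting responsibility back to the driver the moment they act.

```python
# Minimal, hypothetical sketch of "the driver can take over at any time":
# whenever manual input is present, it wins over the autonomous plan.

def control_step(driver_input, autonomous_plan):
    """Return (command, responsible_party) for this control tick.

    If the driver is actively steering or braking, their input is used
    and responsibility shifts to them; otherwise the autonomous plan runs.
    """
    if driver_input is not None:   # driver touched the wheel or pedals
        return driver_input, "driver"
    return autonomous_plan, "system"
```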