Ethics (CDIO Assignment #3)
In this post, we will discuss the ethical issues of autonomous vehicles (AVs): if an accident harms a human (a driver or a pedestrian), inside or outside the AV, who is to blame: the driver, the pedestrian, the seller, the manufacturer, or the state (the government)?
As an engineering student, I cannot claim to analyze this problem properly from an ethical point of view; instead, I will approach it from a technical point of view and share some of my thoughts.
Say such an accident happens as described in the pre-assignment requirement. The first thing to do is to find out the real cause of the accident: was the vehicle actually behaving as it should? Or was it the poor road and environmental conditions?
Of course, the driver Alice should take some responsibility for this accident. It seems she put too much trust in this autonomous vehicle and even switched it into its most aggressive mode in bad environmental conditions.
It could also be that the seller Bob did not fully explain the vehicle's functions, and perhaps even hid important information about the autonomous driving function just to attract buyers.
But I think the greatest responsibility lies with the engineers who designed this function. In the spirit of Asimov's "Three Laws of Robotics", the rule of thumb for researchers and developers, including manufacturers, is to maximize safety, fit in with social norms, and encourage trust. Based on these principles, the aggressive mode should not be activatable in bad conditions. This does not mean that autonomous driving would be 100% safe in good conditions (even human drivers make mistakes, and are perhaps even more prone to them), but safety should always be the first consideration, and engineers should always design for the worst case to guarantee human safety.
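To make this concrete, here is a minimal sketch in Python of how such a mode gate could work. All the names, sensor values, and thresholds below are my own assumptions for illustration, not anything from the assignment or a real AV stack:

```python
from dataclasses import dataclass
from enum import Enum

class DriveMode(Enum):
    CONSERVATIVE = 0
    NORMAL = 1
    AGGRESSIVE = 2

@dataclass
class EnvironmentState:
    visibility_m: float   # estimated visibility in meters
    road_friction: float  # 0.0 (ice) .. 1.0 (dry asphalt)

# Hypothetical minimum requirements per mode; real values would come from testing.
MIN_VISIBILITY_M = {DriveMode.NORMAL: 100.0, DriveMode.AGGRESSIVE: 300.0}
MIN_FRICTION = {DriveMode.NORMAL: 0.4, DriveMode.AGGRESSIVE: 0.7}

def gate_mode(requested: DriveMode, env: EnvironmentState) -> DriveMode:
    """Return the safest mode not exceeding the driver's request.

    The driver may ask for AGGRESSIVE, but the gate downgrades the
    request until the environment meets that mode's requirements.
    """
    for mode in (requested, DriveMode.NORMAL, DriveMode.CONSERVATIVE):
        if mode.value > requested.value:
            continue  # never upgrade beyond what the driver asked for
        vis_ok = env.visibility_m >= MIN_VISIBILITY_M.get(mode, 0.0)
        grip_ok = env.road_friction >= MIN_FRICTION.get(mode, 0.0)
        if vis_ok and grip_ok:
            return mode
    return DriveMode.CONSERVATIVE

# Example: Alice requests aggressive mode in fog on a wet road.
print(gate_mode(DriveMode.AGGRESSIVE,
                EnvironmentState(visibility_m=80.0, road_friction=0.5)))
# -> DriveMode.CONSERVATIVE
```

The point of this design is that the driver's request is only an upper bound: the vehicle downgrades it whenever the environment does not meet that mode's requirements, so the worst case decides.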
I believe the ethical problem is a gap between real humans and intelligent machines. It is true that building an ethical robot can be difficult, but with the help of fast-developing machine-learning algorithms, robots and intelligent vehicles will learn from experience and use cases: the more you use them, the smarter they become. That could be one solution, and probably the most widely used one, to the ethical debate. No system will ever be 100% safe, but there should be a certain threshold for safety. Until we technologically reach that level, governments should make laws to prevent too high a level of automation.
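As a sketch of what "a certain threshold for safety" could mean in practice, one standard statistical tool is an upper confidence bound on the crash rate observed during testing. The baseline figure and function names below are made up for illustration; the Poisson bound itself is a textbook technique, and nothing here reflects how any real regulator actually sets thresholds:

```python
from scipy.stats import chi2

def crash_rate_upper_bound(crashes: int, miles: float, conf: float = 0.95) -> float:
    """Exact Poisson upper confidence bound on crashes per mile.

    Even with zero observed crashes, limited test mileage leaves
    uncertainty, so the bound never reaches zero.
    """
    return chi2.ppf(conf, 2 * (crashes + 1)) / (2 * miles)

# Hypothetical baseline: one human-caused crash per 500,000 miles.
HUMAN_BASELINE = 1 / 500_000

def automation_allowed(crashes: int, miles: float) -> bool:
    """Permit the automation level only if we are confident it beats humans."""
    return crash_rate_upper_bound(crashes, miles) < HUMAN_BASELINE

print(automation_allowed(crashes=0, miles=1_000_000))  # False: too few test miles
print(automation_allowed(crashes=0, miles=3_000_000))  # True: bound ~1.0e-6 < 2.0e-6
```

Under a rule like this, lawmakers would not need to judge individual accidents; they would only set the baseline and the confidence level, leaving the burden of accumulating evidence with the manufacturer.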
Trust in such autonomous vehicles is really hard to build, but it can easily be destroyed by a single accident or crash. That demands even more attention to safety before launching them onto the market.
You are right that the ethical issues are complex and may take us out of our comfort zones as engineers. Thank you for your thoughts on this assignment. - The CDIO Academy Team