Tuesday, January 1, 2013

MORAL MACHINES RAISE KILLER QUESTIONS



“Who should take responsibility for decisions made by intelligent machines like killer drones or autonomous cars?” BBC FUTURE
By: Tom Chatfield
Published: BBC Future; 7 December, 2012.  If you have trouble with the BBC website, google the title and writer’s name.
Level of Difficulty: **
Note to the Student: Furkan Polat has written in to say that the film "Echelon Conspiracy" covers the same dilemmas as this reading passage, so it might be a good idea to watch it.
BEFORE YOU READ
It is suggested that you watch a video on drones and driverless cars and think about or discuss the implications.
WATCH, LISTEN AND CONSIDER

·  Miles Brundage - Limitations and Risks of Machine Ethics - Oxford Winter Intelligence

https://www.youtube.com/watch?v=mhJSVr5SEgI
QUESTIONS
1.       The example of the kid-tracking drone is provided to support the writer’s concern about ………………………………………………………………………………………………………………….
2.       Read the example of the driverless car in paragraph three. The main argument in leaving the decision to the computer is the fact that ………………………………………………………………………….
3.       If you were asked to select a phrase from the text to serve as a subtitle for paragraph four, what would you choose?
4.       Read the comparison of the trolley experiment and the driverless car. It is implied in this paragraph that both cases are similar with respect to the fact that ……………………………………….
5.       Look at the last sentence of the paragraph beginning “Marcus’ driverless car scenario”. What do the first and the second “it” refer to?
6.       The writer says “Programming this into cars is one thing. Weapons are quite another.” Explain in your own words why this is so.
7.       What is special about X-47B?
8.       The writer says “As much as anything, it’s the relationship between these human operators and their subject that is most disturbing”. Why is this the case?
9.       Read the last sentence of the paragraph beginning “And by the time you reach autonomous systems…” What are the implications of this for the way wars are fought?
10.   Read the end of the text and discuss or think about the moral dilemmas. When you have determined where you stand, proceed to the writing task.
WRITING TASK
Write an essay discussing to what extent machines that take responsibility for themselves should be allowed. Determine your standpoint and state it clearly in your thesis statement. Then support your view and refute any possible counterarguments with information from the text and any videos you watch.

MORAL MACHINES RAISE KILLER QUESTIONS KEY AND TEACHER’S NOTES
This text concerns a newly emerging dilemma that modern technology has given birth to: the relationship between smart machines and their operators or programmers, and who exactly is responsible for what, and to what extent. It therefore has novelty value in that it has never been exploited before. I predict it will fly, but you should dig out some videos too.
1.       The increasing delegation of not only daily tasks to machines but also potentially life changing decisions themselves.
2.       The decision must be made in milliseconds.
3.       Minimizing Harm
4.       People make the decision OR people decide.
5.       The person who decided which program to write; the decision itself
6.       Possible answer: weapons are destructive and dangerous and have the potential to kill people.
7.       It is designed to take split-second decisions on its own initiative while remaining under the overall control of human operators.
8.       Because thought experiments like the trolley experiment demonstrate something evident but extremely significant in moral thinking: how our sense of obligation is modified by distance and immediacy.
9.       Possible answer: People will feel morally comfortable and perform more horrific deeds than they do now with no sense of guilt or responsibility. They will feel they are at a moral distance as someone else did the programming ages ago.
10.   Open ended
