
Scenario 1

Self-Driving Car Scenario

Objective: To foster critical thinking and discussion about ethical principles in AI development and use. 

Duration: 15-30 minutes

Proposed scenario: A self-driving car must choose between hitting a pedestrian or swerving into a wall, potentially injuring the passengers.

[Image: self-driving car]

Group activity formats:

Depending on the size and context of your group, one of the following formats may be useful for presenting this scenario:

  • Small-group debate: Divide the class into groups representing different stakeholders (pedestrian, passengers, programmers, society) and have them argue their case.
    • Divide the group: Split participants into smaller groups of 3-5 people.
    • Present the scenario: Introduce the scenario to each group.
    • Brainstorm and discuss: Give the groups 10-12 minutes to discuss the questions, frameworks, and ethical implications of the scenario (all presented below) and identify the key issues that should be considered.
    • Share and debate: Have each small group present its findings to the whole group. Encourage open discussion and debate about the different perspectives and solutions proposed.
  • Role-playing: Ask each person in the small group to reflect on the issue from an assigned perspective and present that angle of the dilemma.
    • Assign roles: Give each participant a role (e.g., the car's AI, a passenger, the pedestrian).
    • Improvise: Have them improvise a scene leading up to the moment of decision.
    • Present the scenario: Present the scenario to the whole group and encourage participants to write down their views on the questions, frameworks, and ethical implications of the scenario (all presented below) according to their roles.
    • Share and debate: Have each person present the key issues from their point of view.
  • Ethical code writing: Once the scenario and reflection questions have been presented, have the group draft a code of ethics for self-driving cars.

Additional tip: Use visual aids (e.g., mind maps, diagrams) to help groups or individuals organize their thoughts and ideas.

First questions

  • What is your immediate gut reaction to this scenario?
  • Who do you think the car should hit? Why?
  • Do you think it's even possible for a car to make this kind of decision? Why or why not?

Ethical Frameworks

Utilitarianism: From a utilitarian perspective (greatest good for the greatest number), what should the car do? What are the potential problems with this approach?

Rights and Duties: Does the pedestrian have a "right" to be safe on the sidewalk? Do the passengers have a "right" to be protected by their car? How might these conflicting rights influence the decision?

Personal Values: If you were programming the car, what values would you prioritize? (e.g., preserving life, protecting the vulnerable, minimizing harm)
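Optional add-on for groups with programming experience: the short Python sketch below is a hypothetical illustration only; every value name, outcome, and penalty in it is an assumption invented for this exercise, not a real vehicle system. It shows one way a priority ordering of values could be made explicit in code, which helps participants see that any "ethical rule" is ultimately a concrete, contestable choice made by a programmer.

# Hypothetical sketch for discussion only: every value name, outcome, and
# penalty below is an assumption invented for this exercise, not a real
# self-driving-car system.

# An explicit priority order makes the programmer's value choices visible.
VALUE_PRIORITIES = [
    "preserve_human_life",
    "protect_the_vulnerable",   # e.g., pedestrians
    "minimize_total_harm",
    "protect_passengers",
]

def rank_outcomes(outcomes):
    """Order candidate outcomes by how well they respect VALUE_PRIORITIES.

    `outcomes` maps an outcome name to the set of values it violates.
    Lower scores are better: violating a higher-priority value costs more.
    """
    def score(violated):
        return sum(
            len(VALUE_PRIORITIES) - i  # higher priority -> bigger penalty
            for i, value in enumerate(VALUE_PRIORITIES)
            if value in violated
        )
    return sorted(outcomes, key=lambda name: score(outcomes[name]))

# Discussion prompt: which reordering of VALUE_PRIORITIES would change the
# "least bad" outcome below, and who should get to decide that ordering?
print(rank_outcomes({
    "continue_straight": {"preserve_human_life", "protect_the_vulnerable"},
    "swerve_into_wall": {"protect_passengers", "minimize_total_harm"},
}))
# -> ['swerve_into_wall', 'continue_straight'] under the ordering above

Reordering VALUE_PRIORITIES changes which outcome is ranked "least bad", which ties the exercise back to the question of whose values get encoded.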

Implications

Let's discuss these issues:

  • Should car manufacturers be allowed to program "ethical rules" into their vehicles? Who should decide what those rules are?
  • If self-driving cars become common, could this lead to new forms of discrimination or bias? How?
  • How might this scenario impact public trust in self-driving technology?

Going Deeper

Is it fair to put the burden of this decision on a machine?

Could this kind of situation force us to rethink our values as a society?

After the activity:

  • Summarize the key points discussed during the activity.
  • Encourage participants to reflect on their own values and beliefs regarding AI ethics.
  • Discuss the importance of ongoing dialogue and collaboration to address the ethical challenges of AI.
