Casa ESL · C1 Advanced · Unit 8 of 20 · Step 1

Artificial Intelligence Ethics

Complex Relative Clauses and Reduced Relatives

Produce complex relative clauses with prepositions and quantifiers
Use reduced relative clauses (participle clauses) to create concise, formal prose
Discuss ethical issues surrounding AI using sophisticated sentence structures

Name

Date

algorithm

noun

A process or set of rules to be followed in calculations or problem-solving, especially by a computer.

"The algorithm was found to exhibit racial bias in its predictions."

bias

noun

Prejudice in favour of or against one thing, person, or group compared with another.

"Algorithmic bias can perpetuate existing social inequalities."

autonomous

adjective

Having the freedom to act independently; self-governing, especially of machines.

"Autonomous vehicles raise questions about liability in the event of an accident."

surveillance

noun

Close observation, especially of a suspected person or group.

"Mass surveillance programmes have been criticised on civil liberties grounds."

accountability

noun

The fact or condition of being responsible for one's actions and their consequences.

"Accountability in AI development is an unresolved issue."

opaque

adjective

Not transparent; difficult to understand or see through, especially regarding processes.

"Many AI decision-making processes remain opaque even to their creators."

mitigate

verb

To make less severe, serious, or painful.

"Steps must be taken to mitigate the risks of automated decision-making."

proliferation

noun

Rapid increase in the number or amount of something.

"The proliferation of AI applications has outpaced regulatory frameworks."

Complex relative clauses and reduced relatives

Complex relative clauses use preposition + which/whom ('the framework within which AI operates'), quantifiers ('many of whom/which'), or sentential relatives ('which suggests…'). Reduced relatives replace the relative pronoun and auxiliary with a participle: 'The data, collected over five years, suggests…' (= which was collected). Present participle: 'Companies developing AI face scrutiny' (= which develop). Past participle: 'The regulations, introduced in 2024, address bias' (= which were introduced). These structures are essential for academic density and precision.

The data, collected over five years from multiple institutions, suggests a clear trend.

Researchers, many of whom specialise in machine learning, raised concerns about the methodology.

The framework within which autonomous systems operate lacks adequate safeguards.

Companies developing AI applications must ensure transparency, a requirement often overlooked in practice.

Exercise 1

Combine or reduce the relative clause in each sentence to create a more concise, formal version.

1. The report, which was published last month, highlights key risks. → The report, ___ last month, highlights key risks.

2. There are several ethical frameworks. AI policy operates within these frameworks. → There are several ethical frameworks within ___ AI policy operates.

3. The engineers, most of ___ had no ethics training, were unaware of the implications.

4. The algorithm, which processes millions of data points daily, has been found to exhibit bias. → The algorithm, ___ millions of data points daily, has been found to exhibit bias.

5. The guidelines, which were introduced in response to public pressure, require annual auditing. → The guidelines, ___ in response to public pressure, require annual auditing.

Exercise 2

Choose the correct relative or reduced relative construction.

1. The committee, ___ reviewed the evidence, recommended immediate action.

2. The users, many of ___ were unaware of the data collection, filed complaints.

3. The regulations, ___ in 2023, are already considered outdated.

The Black Box Problem

Artificial intelligence systems, deployed across sectors ranging from healthcare to criminal justice, are making decisions that profoundly affect human lives. The algorithms driving these systems, many of which were trained on historical data reflecting existing societal biases, can perpetuate and even amplify discrimination. A hiring tool developed by a major technology company, designed to screen job applications automatically, was found to penalise candidates from certain demographic groups — a bias inherited from the data on which it had been trained. The opacity of such systems, often described as 'black boxes', poses a fundamental challenge to accountability. Decisions made by algorithms, affecting millions of people daily, frequently cannot be explained even by the engineers who created them. Regulatory frameworks introduced in recent years have attempted to address these concerns, requiring companies developing AI applications to conduct impact assessments and provide explanations for automated decisions. Nevertheless, the pace of technological proliferation continues to outstrip the capacity of regulators, many of whom lack the technical expertise to evaluate the systems under their purview. The question facing society is not whether AI should be regulated, but whether meaningful regulation is achievable given the speed and complexity of the technology involved.

1. What example does the passage give of algorithmic bias, and what caused it?

2. Why does the passage suggest meaningful regulation may be difficult to achieve?

Discuss these questions with a partner or your teacher.

1. Describe an AI application you use regularly, explaining how it works and what ethical concerns it raises. Use at least three reduced relative clauses in your description.

2. Debate: 'AI systems, designed to be efficient rather than fair, should not be used in criminal justice.' Use complex and reduced relative clauses to structure your arguments.

Write a formal paragraph (6–8 sentences) about an ethical issue related to AI. Use at least three reduced relative clauses and one complex relative clause with a preposition or quantifier.

Example: Facial recognition technology, deployed by law enforcement agencies in numerous countries, raises serious questions about privacy. The databases on which these systems rely, compiled without explicit consent in many cases, contain millions of images. Citizens affected by misidentification, many of whom belong to minority groups, have limited recourse. Regulations introduced in the European Union represent a step forward, but enforcement remains inconsistent.

Answer Key — For Teacher Use

Exercise 1

1. published · 2. which · 3. whom · 4. processing · 5. introduced

Exercise 2

1. having · 2. whom · 3. introduced

Reading Comprehension

1. A hiring tool developed by a major tech company was found to penalise candidates from certain demographic groups. The bias was inherited from historical training data reflecting existing societal prejudices. · 2. Because the pace of technological proliferation outstrips regulators' capacity, and many regulators lack the technical expertise to evaluate the AI systems under their purview.