07 September 2023
Almost 120 civil society organisations, including Statewatch, are calling on MEPs to close a massive loophole in the proposed Artificial Intelligence Act, introduced under pressure from big tech lobbyists. Without changes, the law will allow developers of AI systems to decide whether the systems they produce should be considered "high-risk" - an obvious invitation to decide that they are not, in order to avoid the extra safeguards the Act should impose.
Image: Ben Tilley, CC BY 2.0
The statement was coordinated by European Digital Rights.
EU legislators must close dangerous loophole in AI Act
The European Union is entering the final stage of negotiations on its Artificial Intelligence Act (AI Act), but Big Tech and other industry players have lobbied to introduce a major loophole to the high-risk classification process, undermining the entire legislation. We call on EU legislators to remove this loophole and maintain a high level of protection in the AI Act.
The EU AI Act has the potential to improve protections for people impacted by AI systems. In its original form, it outlined a list of ‘high-risk uses’ of AI, including AI systems used to monitor students, to assess consumers’ creditworthiness, to evaluate job-seekers, and to determine who gets access to welfare benefits.
The legislation requires developers and deployers of such ‘high-risk’ AI to ensure that their systems are safe and free from discriminatory bias, and to provide publicly accessible information about how their systems work. However, these benefits will be undermined by a dangerous loophole introduced into the high-risk classification process in Article 6.
In the original draft from the European Commission, an AI system was considered ‘high-risk’ if it was to be used for one of the high-risk purposes listed in Annex III. However, the Council and the European Parliament have introduced a loophole that would allow developers of these systems to decide for themselves whether they believe the system is ‘high-risk’. The same company that would be subject to the law is given the power to unilaterally decide whether or not it should apply to them.
These changes to Article 6 must be rejected and the European Commission’s original risk-classification process must be restored. There must be an objective, coherent and legally certain process in the AI Act to determine which AI systems are ‘high-risk’.
If the changes to Article 6 are not reversed, the AI Act will enable AI developers to decide to exempt themselves from all of the substantive rules for high-risk systems.
We urge lawmakers to reverse these changes and restore the Commission’s original language in Article 6. The AI Act must prioritise the rights of people affected by AI systems and ensure that AI development and use is both accountable and transparent.
Signed,
Secret negotiations between the Council of the EU, European Parliament and European Commission on the Artificial Intelligence Act have begun, more than two years after the legislation was proposed. A statement signed by more than 150 civil society organisations, including Statewatch, calls for fundamental rights to be put at the centre of the talks.
With the European Parliament and Council of the EU heading for secret trilogue negotiations on the Artificial Intelligence Act, an open letter signed by 61 organisations - including Statewatch - calls on the Spanish Presidency of the Council to make amendments to the proposal that will ensure the protection of fundamental rights.