Expert Opinion »The General Equal Treatment Act and the Protection against Discrimination by Algorithmic Decision Systems« [German]

Legal Opinion on behalf of the German Federal Anti-Discrimination Agency [in German]

Presented by
Prof. Dr. iur. Indra Spiecker gen. Döhmann, LL.M. (Georgetown Univ.)
Prof. Dr. iur. Emanuel V. Towfigh

Summary of the Results:

The use of ADM systems poses challenges to anti-discrimination law, above all to the General Equal Treatment Act (AGG), which in its current form can meet them only to a limited extent.

The main task of these systems is to use statistical methods to identify large numbers of correlations and thereby establish relationships between variables. Despite their undisputed discriminatory potential, ADM systems are associated with the promise of objective, neutral decisions uninfluenced by human bias, and of more efficient and better decisions, not least because human error, bias and cognitive limitations could supposedly be eliminated or at least reduced (Section 1 of the Opinion).

The use of such systems covers almost all areas of private and public life: pricing, access to and participation in public and private services, marketing, contract terms, diagnostic and therapeutic decisions, and allocation decisions when resources are scarce.

The way ADM systems function makes it possible, among other things, to make probabilistic statements about individuals. Through the attribution of group characteristics, large numbers of automated (selection) decisions are made, or contracts are optimized and made more efficient with the help of mass individualization. From the point of view of anti-discrimination law, it is precisely this group attribution that is problematic.

Discrimination by statistics results from the attribution of characteristics obtained by statistical means, based on (actual or assumed) average values of a group. The reference to these average group characteristics is meant to overcome uncertainty about the individual characteristics of a single person. The social mechanism underlying such an assessment is of no interest; causality is neither claimed nor proven. In this way, (historical) structural inequalities are perpetuated and new ones are created.
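
To illustrate this mechanism, the following minimal Python sketch may help; all group names, rates and thresholds are hypothetical illustrations, not taken from the Opinion. It shows how substituting an (assumed) group average for an unknown individual characteristic produces different decisions for otherwise identical persons:

    # Minimal sketch of "discrimination by statistics": an unknown individual
    # characteristic (here, credit risk) is replaced by the (actual or assumed)
    # average value of a group to which the person is assigned.
    # All names and numbers are hypothetical illustrations.

    ASSUMED_GROUP_DEFAULT_RATE = {
        "postal_code_A": 0.04,  # historically affluent area
        "postal_code_B": 0.18,  # historically disadvantaged area
    }

    APPROVAL_THRESHOLD = 0.10  # approve if the estimated default risk is below this

    def estimate_default_risk(applicant: dict) -> float:
        # The group average is attributed to the individual; no causal
        # relationship is claimed or proven.
        return ASSUMED_GROUP_DEFAULT_RATE[applicant["postal_code"]]

    def decide(applicant: dict) -> str:
        risk = estimate_default_risk(applicant)
        return "approve" if risk < APPROVAL_THRESHOLD else "reject"

    # Two applicants identical in every individual respect except group membership:
    alice = {"income": 42_000, "postal_code": "postal_code_A"}
    bob = {"income": 42_000, "postal_code": "postal_code_B"}

    print(decide(alice))  # approve
    print(decide(bob))    # reject -- solely because of the attributed group average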

The quality of an ADM system's decisions depends essentially on the quantity, quality, modeling and evaluation of the data used. As a result, the discriminatory potential of ADM systems may already be inherent in the system itself. In addition, opacity is inherent in the way ADM systems operate. Determining responsibility for discriminatory elements in ADM systems is difficult because of the diversity of those involved in their programming, further development, use and re-use, for example in network structures and individually adapted standard algorithms. In practice, those potentially responsible can often exculpate themselves. The technological progress of digital evaluation methods and technologies places virtually no technical limits on the sharing, re-use and merging of large amounts of data, making the dissemination and use of discriminatory data sets effectively uncontrollable. It cannot be assumed that discrimination can be identified on a case-by-case basis.
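
How discriminatory potential can be inherent in the data themselves can be made concrete with a second minimal Python sketch, again with purely hypothetical data: a system that merely learns historical decision rates reproduces past bias even though the protected characteristic is never recorded, because a correlated proxy feature carries it along:

    # Minimal sketch (hypothetical data): a system "trained" on historically
    # biased decisions reproduces the bias although the protected characteristic
    # is never stored -- the correlated proxy feature "district" carries it along.

    history = [
        {"district": "A", "hired": True},
        {"district": "A", "hired": True},
        {"district": "A", "hired": False},
        {"district": "B", "hired": False},
        {"district": "B", "hired": False},
        {"district": "B", "hired": True},
    ]

    # "Training": estimate the historical hiring rate per district.
    outcomes = {}
    for record in history:
        outcomes.setdefault(record["district"], []).append(record["hired"])
    learned_rate = {d: sum(v) / len(v) for d, v in outcomes.items()}

    # "Decision": recommend only candidates from districts with a high past rate.
    def recommend(candidate: dict) -> bool:
        return learned_rate[candidate["district"]] >= 0.5

    print(recommend({"district": "A"}))  # True
    print(recommend({"district": "B"}))  # False -- the historical bias is reproduced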

The greatest challenge to legally effective protection against discrimination by ADM systems lies in the AGG's enforcement deficits. These fundamental deficits are well known and are not, in the first instance, specific to discrimination by ADM systems. Due to the particularly pronounced power and information asymmetry, however, they are amplified: among other things, the frequently encountered black-box character of ADM systems, and the fact that neither the use of such systems nor their functioning can be inferred from their decisions, make it practically impossible for those affected, who lack the necessary resources, to track down the causes of discrimination (Section 2 of the Opinion).

What is lacking:

    • clear discrimination provisions in the AGG that also cover ADM system-specific discrimination, in particular group discrimination;
    • information and disclosure requirements in the AGG that provide insight into the specific functioning and data of an ADM system;
    • effective measures in the AGG to provide substantive and institutional support to stakeholders in the detection and prosecution of potential sources of error in ADM systems;
    • classic provisions for effective law enforcement in the draft AI Regulation (for example, a reversal of the burden of proof or an easing of the causality requirement).

Therefore, in order to ensure effective protection against discrimination by ADM systems and to overcome the deficiencies in enforcement that are contrary to Union law, the following measures should be considered (Section 3 of the Opinion):

    • fundamental reorientation of the AGG with regard to the role of the Federal Anti-Discrimination Agency (ADS):
      • granting of comprehensive rights of information and investigation;
      • granting of own rights of action to the ADS, in the manner of an association's right of action;
      • establishment of an independent arbitration board at the ADS;
    • granting of legal standing for anti-discrimination associations;
    • extension of the protected characteristics of § 1 AGG to include the characteristic of relationships;
    • supplementation of the legal definition in § 3 (2) AGG;
    • expansion of the scope of the AGG to include developers and service providers of ADM systems;
    • adjustments in the interpretation of the reversal of the burden of proof in § 22 AGG;
    • inclusion of the ADS in the scope of application of the AI Regulation.

 

Press Breakfast of the Federal Anti-Discrimination Agency. Berlin, 08/30/2023.
(Photo credit: ADS / Thomas Trutschel)

 


Media echo

    • Amann, »Und dann hat er halt so einen Hitlergruß gezeigt« [»And then he just showed a Hitler salute like that«], Spiegel Online, 08/30/2023
    • Beuth, »KI macht vieles leichter – leider auch Diskriminierung« [»AI makes many things easier – unfortunately also discrimination«], Spiegel Online, 08/30/2023
    • Bergt, Der Algorithmus sagt Nein [The algorithm says No], taz, 08/30/2023
    • Fürstenau, KI-Gutachten: Diskriminierung programmiert? [AI expert opinion: Discrimination programmed?], Deutsche Welle, 08/30/2023 = Germany highlights discrimination risks of AI, Deutsche Welle = Inteligencia artificial: ¿discriminación programada? [Artificial intelligence: programmed discrimination?], Cambio and Deutsche Welle = Alemanha: sistemas de IA alimentam discriminação programada [Germany: AI systems feed programmed discrimination], UOL, Deutsche Welle and Terra = Yapay zekada ayrımcılık tehlikesi [The danger of discrimination in artificial intelligence], Haberdar, Deutsche Welle and YeniAsya
    • Reuter, Gleichbehandlungsgesetz soll für automatisierte Entscheidungen angepasst werden [Equal treatment law to be adapted for automated decisions], Netzpolitik, 08/30/2023
    • Soliak, Vom Algorithmus gedisst [Dissed by the algorithm], LTO, 08/31/2023
    • Specht, Warnung vor hoher Diskriminierungsgefahr durch Algorithmen [Warning of high risk of discrimination by algorithms], Handelsblatt, 08/31/2023
    • Walter, »In drastischen Fällen können KI und Algorithmen Existenzen und Leben zerstören« [»In drastic cases, AI and algorithms can destroy livelihoods and lives«], Die Welt, 08/30/2023
    • Antidiskriminierungsbeauftragte will Menschen vor KI-Diskriminierung schützen [Anti-discrimination commissioner wants to protect people from AI discrimination], aerzteblatt.de, 08/30/2023
    • Ataman warnt vor digitaler Diskriminierung bei zunehmendem Einsatz von KI [Ataman warns of digital discrimination as use of AI increases], TrendyOne = Yahoo News = MSN = Unternehmen-Heute, 08/30/2023
    • Ataman: Gesetz soll vor digitaler Diskriminierung schützen [Ataman: Law should protect against digital discrimination], Newstral = Migazin, 08/30/2023
    • Ataman fordert Regulierung beim Einsatz von künstlicher Intelligenz [Ataman calls for regulation in the use of artificial intelligence], Focus Online, 08/30/2023
    • Weitere Gefahren durch künstliche Intelligenz? Ataman warnt vor Diskriminierung [More dangers from artificial intelligence? Ataman warns against discrimination], Berliner-Zeitung, 08/30/2023
    • Antidiskriminierungsbeauftragte will Schutz vor digitaler Diskriminierung ausweiten [Anti-discrimination commissioner wants to extend protection against digital discrimination], antidiskriminierungsstelle.de, 08/30/2023

[Translated by DeepL]