Digitale Gesellschaft

Perspectives on the Power of Algorithms and Data

eBook - ePub

  1. 290 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android

About This Book

Algorithms should be regarded not as a technical structure but as a social phenomenon: they embed themselves, at present still very subtly, into our political and social system. Algorithms shape human behavior on various levels: they influence not only the aesthetic reception of the world but also the well-being and social interaction of their users. They act and intervene in a political and social context. Because algorithms influence individual behavior in these social and political situations, their power should be the subject of critical discourse - or even provoke active disobedience and the development of appropriate tools and methods for breaking algorithmic power.

Digitale Gesellschaft by Sven Quadflieg, Klaus Neuburg and Simon Nestler is available in PDF and ePUB format, in Social Sciences & Media Studies.

Information

Year
2022
ISBN
9783732857630
Edition
1
“Accountability requires human judgement, and only humans can perform the critical function of making sure that, as our social relations become ever more automated, domination and discrimination aren’t built invisibly into their code”1 (Frank Pasquale).
The analysis2 focuses on the opportunities and challenges of algorithms3 and on risks in the “algorithmic age”4, and explores avenues to address the impact of algorithms5 on gender equality (GE) law with regard to bias and discrimination.

I. Obey and Disobey—the Terms Imposed by Behavior Changing Algorithms and Gender-based Discrimination

In most online activities6, consumers’ human intelligence7 is confronted with the decisions of algorithms. Consumers have to obey or dis-obey. Often there is no real choice: not accepting the terms and conditions imposed by companies means exclusion from the service, a dynamic best captured by the term behavior changing algorithms8. Some platforms face no competition, exercise monopoly9 or “algorithmic power”, and could be viewed as gatekeepers10. In a democracy, nobody should be discriminated against because of gender when using services11. But what are algorithms? Barocas defines an algorithm as “a formally specified sequence of logical operations that provides step-by-step instructions for computers to act on data and thus automate decisions”12. Algorithms, understood as lists of step-by-step instructions fed with real-world data, have an objective and follow instructions or mathematical operations to achieve the defined aim13. Fry groups algorithms into four main categories according to their tasks: 1) prioritization14, 2) classification15, 3) association16 and 4) filtering17. These algorithms come in the shape of either “rule-based algorithms”, whose instructions are programmed by a human, or “machine-learning algorithms”18. This article will mostly refer to algorithms in general19.
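Fry’s four task categories can be made concrete with minimal rule-based sketches; the function names, data and thresholds below are invented for illustration only:

```python
# Minimal rule-based sketches of Fry's four algorithmic task categories.
# All names, data and thresholds are invented for illustration.

def prioritize(items, score):
    """Prioritization: rank items by a scoring rule."""
    return sorted(items, key=score, reverse=True)

def classify(value, threshold):
    """Classification: assign a label via a fixed rule."""
    return "high" if value >= threshold else "low"

def associate(purchases, catalog):
    """Association: naively link each purchased item to every other catalog item."""
    return {item: [c for c in catalog if c != item] for item in purchases}

def filter_items(items, predicate):
    """Filtering: keep only items matching a condition."""
    return [i for i in items if predicate(i)]

ranked = prioritize([3, 1, 2], score=lambda x: x)   # [3, 2, 1]
label = classify(0.8, threshold=0.5)                # "high"
```

Each function is a “formally specified sequence of logical operations” in Barocas’ sense; a machine-learning algorithm would instead derive the scoring rule or threshold from data.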
Obey shall be understood in two ways: first, humans must obey the terms imposed by companies in order to use their systems; second, the state can impose regulation that companies need to obey. Dis-obey shall be understood as humans dis-obeying in order to preserve their rights20, notably in the absence of legal rules, or as companies dis-obeying regulatory attempts in order to preserve their business model. The dis-obey approach could inspire consumers to adopt rights-preserving behavior, such as data-poor approaches, favoring data-friendly companies, introducing “noise” into their data supply or avoiding digital services that potentially discriminate21. Considering this tension between obey and dis-obey, regulators have been reflecting on rules for fair and non-discriminatory algorithms. The European Commission (EC) published a draft Regulation (Artificial Intelligence Act)22 on 21 April 2021, following the adoption of the Digital Services Act (DSA)23 and the Digital Markets Act (DMA)24. Many international bodies have adopted standards on AI (OECD25, Council of Europe26 or UNESCO27). The EC’s Advisory Committee on Equal Opportunities for Women and Men adopted an opinion on AI and GE28, containing recommendations to address algorithmic biases and prevent gender-based discrimination.
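The “noise” strategy mentioned above can be illustrated with randomized response, a classic survey technique in which an individual answer is randomly perturbed, so that no single response is trustworthy while aggregate rates remain estimable. This is only a sketch, and the parameter name is my own:

```python
import random

def randomized_response(true_answer: bool, p_truth: float = 0.5) -> bool:
    """With probability p_truth report the true answer,
    otherwise report a fair coin flip. No single reported
    answer reveals the individual's true answer."""
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5

# Aggregate de-noising: if r is the observed 'yes' rate,
# the true rate can be estimated as (r - (1 - p_truth) * 0.5) / p_truth.
```

The same idea underlies formal differential-privacy mechanisms: the individual injects noise, and only population-level statistics survive.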
A case of discrimination usually concerns an individual, but the impact can reach societal scale when patterns of algorithmic discrimination evolve and reinforce biases and discrimination29. Each discriminated individual will be reflected in the datasets and contribute to creating future risks of discrimination for women and men as categorized and classified by algorithms. However, humans also rely on automatic processing of data, schematizing and grouping people into boxes, for example by sex or race30. Such classification and generalization could base decisions on a group of women or men to the detriment of an individual, which impacts the well-being of consumers using products and services31 that rely on technology and of workers accessing the labor market32. Moreover, one of the problems is the opaque decision-making of algorithms, the “black box” in Pasquale’s terms, describing the fact that the inner workings of an algorithm are sometimes difficult to grasp, especially for potential victims of discrimination.
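How a formally “gender-blind” decision rule can still disadvantage one group through correlated features can be sketched in a few lines; the applicants, the part-time feature and the weights below are entirely invented for illustration:

```python
# Illustrative sketch: a scoring rule that never reads gender,
# yet penalizes a feature (part-time work history) that in
# practice often correlates with gender. All data are invented.

applicants = [
    {"name": "A", "gender": "f", "part_time_history": 1, "experience": 8},
    {"name": "B", "gender": "m", "part_time_history": 0, "experience": 8},
]

def score(applicant):
    # Gender is never consulted, but the part-time penalty
    # can act as a proxy for it.
    return applicant["experience"] - 5 * applicant["part_time_history"]

ranking = sorted(applicants, key=score, reverse=True)
# Equally experienced applicants end up ranked differently.
```

The rule is transparent here; in an opaque “black box” model the proxy effect is the same, but far harder for a potential victim to detect.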
Relying on the literature and current institutional proposals, the article assesses the opportunities and risks of both regulating and using algorithms. Dealing with the topic of AI and gender from the angles of “regulatory object” and “useful tool” will shed new light and contribute to an ethical33 and fair framework34 for enforcing GE laws.

II. From Classical Discrimination towards Discrimination by Correlation

Before explaining discrimination by correlation (3) and giving examples (4), I will present the relevant EU law and discuss the concept of discrimination (1) as well as the relationship between algorithms and bias (2).
1) Some Reflections on EU Law and Gender-based Discrimination
EU anti-discrimination law works with the concept of protected characteristics (e.g. gender or age). However, this becomes increasingly difficult when decisive elements in decision-making result not from humans but from algorithms. Current laws were adopted before the age of algorithms and are not equipped to deal with all new legal challenges, even if they are formulated in an abstract and general way to cover (un)foreseen situations35. Judges will have to interpret existing laws in light of technological developments, which could accommodate AI. EU law distinguishes between direct and indirect discrimination. Direct discrimination in EU law36 exists “where one person is treated less favorably on grounds of sex than another is, has been or would be treated in a comparable situation”37. Indirect discrimination exists “where an apparently neutral provision, criterion or practice would put persons of one sex at a particular disadvantage compared with persons of the other sex, unless that provision, criterion or practice is objectively justified by a legitimate aim, and the means of achieving that aim are appropriate and necessary”38. While direct discrimination cannot in principle be justified, a possibility of justification exists for indirect discrimination. A different treatment is no...
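The indirect-discrimination test asks whether an apparently neutral criterion puts one sex at a particular disadvantage. One minimal statistical screen for this, borrowed from US employment practice (the “four-fifths rule”, not a rule of EU law), compares selection rates between groups; the data and threshold below are invented for illustration:

```python
# Sketch of a disparate-impact screen (US "four-fifths rule").
# Not an EU legal test; data and threshold invented for illustration.

def selection_rate(selected: int, total: int) -> float:
    """Share of a group that was selected by the criterion."""
    return selected / total

def adverse_impact(rate_a: float, rate_b: float, threshold: float = 0.8) -> bool:
    """Flag possible indirect discrimination when the lower
    selection rate falls below `threshold` times the higher one."""
    low, high = sorted([rate_a, rate_b])
    return low / high < threshold

women = selection_rate(20, 100)    # 0.20
men = selection_rate(40, 100)      # 0.40
flag = adverse_impact(women, men)  # True: 0.20 / 0.40 = 0.5 < 0.8
```

Such a screen only raises a presumption; under EU law the decisive question remains whether the criterion is objectively justified by a legitimate aim and the means are appropriate and necessary.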

Table of contents

  1. Cover
  2. Title
  3. Copyright
  4. Contents
  5. Sven Quadflieg / Klaus Neuburg / Simon Nestler
  6. Florian Arnold
  7. Johanna Mellentin / Francesca Schmidt
  8. Fabian Weiss
  9. Carolin Höfler
  10. Moritz Ahlert
  11. Harald Trapp / Robert Thum
  12. Christina Hecht
  13. Victoria Guijarro Santos
  14. Katja Dill
  15. Fabian Lütz
  16. Matthias Pfeffer
  17. Lotte Houwing
  18. Bernd Friedrich Schon
  19. About the Authors