A text, algorithmic text, artificial text
“Through algorithms, one can see through human society—especially one aspect: inequality.”
Algorithms are considered neutral. In fact they are not, because they are man-made tools: social, political, and personal factors play a significant role in how a tool is produced and used. Moreover, data, the learning material for algorithms, mirrors society, including its inequalities.
For these reasons, algorithms and AI can perform a task that no one expected of them: discriminating against humans. Yet not everyone is discriminated against by them, but rather those who are neither male, white, heterosexual, nor indigenous. This is called "algorithmic bias."
How would an algorithm "express its opinion" about the texts I have collected on the subject of "algorithmic bias"? I converted this collection of texts into a dataset, and an algorithm (a recurrent neural network, RNN) generated four texts from it. Based on the same dataset, I have also written a text covering, among other points: blind trust in technology, fragile and non-inclusive data, examples of and reasons for discriminatory algorithms and AI, and paths to digital and analog democracy.
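The pipeline described above, a corpus of collected texts used as training material for a text generator, can be illustrated in miniature. The sketch below uses a character-level Markov chain rather than an RNN (a deliberate simplification; the book's actual model is not shown here), but the principle is the same: the generator can only recombine patterns present in its source data, so any bias in the collected texts carries over into the generated ones. The corpus string is a placeholder, not the book's dataset.

```python
import random
from collections import defaultdict

def build_model(corpus: str, order: int = 2) -> dict:
    """Map each n-character context to the list of characters that follow it."""
    model = defaultdict(list)
    for i in range(len(corpus) - order):
        context = corpus[i:i + order]
        model[context].append(corpus[i + order])
    return model

def generate(model: dict, seed: str, length: int = 80) -> str:
    """Extend the seed one character at a time by sampling from the model."""
    order = len(seed)
    out = seed
    for _ in range(length):
        followers = model.get(out[-order:])
        if not followers:  # context never seen in the corpus: stop
            break
        out += random.choice(followers)
    return out

# Placeholder corpus; in the book's setup this would be the collected texts.
corpus = "algorithms are considered neutral but algorithms are man-made tools"
model = build_model(corpus, order=2)
print(generate(model, "al", length=40))
```

Because every generated character is drawn from contexts observed in the corpus, the output is a statistical echo of the input, which is precisely why a skewed dataset yields skewed text.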
By writing this book with an algorithm, I, a non-programmer, wanted to ask myself the following questions: What is an algorithm? What influence do algorithms and AI have on me and on (international) society? What do generated texts mean? What might the future of books look like? And above all, how can we create objective algorithms in order to build an equal and digitally democratic society?
October 2019 – February 2020
Prof. Matthias Görlich, Peter Hermans
Download the PDF version of the book