With ongoing digital transformation and the spread of technologies such as artificial intelligence (AI) and automated decision-making systems, the question of these systems' supposed objectivity keeps resurfacing. In practice, they are often nowhere near as neutral as they appear or are claimed to be. Who is judged, and by what criteria? Who is included and who excluded? Who is responsible for these choices? Drawing on the discrimination people have experienced because of their skin colour, gender, disability or income, there is a growing demand to build non-discrimination into systems from the design stage onwards. RightsCon, an international conference on human rights in the digital age, took place in early June; this year's event featured a debate on why design matters for human rights. The Design Justice Network campaigns to raise awareness of this topic.

Using “Design Justice” as a catchphrase, Jenny Genzmer and Dennis Kogel hosted two Deutschlandfunk Kultur shows – “The Struggle for Non-Discriminatory Technologies” and “Visions for the Internet of the Future” – about cookie banners, dark patterns and various, mostly unnoticed, forms of digital discrimination. Both episodes are available to download as podcasts in German. Among other things, they make clear that algorithms do not discriminate by themselves; the people who design them do. Activists are therefore calling for further regulation and internationally binding treaties to safeguard human rights in technological development – above all in automated decision-making systems.
