{"id":73574,"date":"2021-01-25T16:30:00","date_gmt":"2021-01-25T15:30:00","guid":{"rendered":"https:\/\/www.hiig.de\/?p=73574"},"modified":"2023-03-28T14:05:52","modified_gmt":"2023-03-28T12:05:52","slug":"opening-match-the-battle-for-inclusion-in-algorithmic-systems","status":"publish","type":"post","link":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/","title":{"rendered":"Opening match: the battle for inclusion in algorithmic systems"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\">Team civil society and team industry go head-to-head on conditions and rules for inclusive design<\/h2>\n\n\n\n<p><em><span style=\"font-weight: 400;\">How can the increasing automation of infrastructures be made more inclusive and sustainable and be brought into accordance with human rights? The <a href=\"https:\/\/www.hiig.de\/en\/research\/ai-and-society-lab\/\">AI &amp; Society Lab<\/a> pursues this core issue by facilitating exchange between academia, industry and civil society while experimenting with different formats and approaches. As one of its initial ventures, it hosted a series of roundtables in cooperation with the Representation of the European Commission in Germany to work on the implementation and operationalisation of the commission&#8217;s White Paper on AI.<\/span><\/em><\/p>\n\n\n\n<hr class=\"wp-block-separator\"\/>\n\n\n\n<p><span style=\"font-weight: 400;\">To extend and sustain the societal debate on inclusive AI, the topic of our third roundtable, we challenged two stakeholder groups to a ping pong match, the world&#8217;s fastest return sport \u2013 but digitally, and with the AI &amp; Society Lab hitting the first serve. Playing for team civil society was Lajla Fetic, scientist and co-author of the <a href=\"https:\/\/algorules.org\/de\/startseite\">Algo.Rules<\/a>, a practical guide for the design of algorithmic systems. 
Facing her on the other side of the net was Finn Grotheer, AI business development fellow at <a href=\"https:\/\/www.merantix.com\">Merantix<\/a>, a Berlin-based AI venture studio. On your marks, get set, go!<\/span><\/p>\n\n\n\n<figure class=\"wp-block-gallery columns-2 is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\"><ul class=\"blocks-gallery-grid\"><li class=\"blocks-gallery-item\"><figure><img loading=\"lazy\" decoding=\"async\" width=\"1200\" height=\"800\" src=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-1200x800.jpg\" alt=\"\" data-id=\"73749\" class=\"wp-image-73749\" srcset=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-1200x800.jpg 1200w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-800x533.jpg 800w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-60x40.jpg 60w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-768x512.jpg 768w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-180x120.jpg 180w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-600x400.jpg 600w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-50x33.jpg 50w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-540x360.jpg 540w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-1536x1024.jpg 1536w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-2048x1365.jpg 2048w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Finn-Grotheer-1320x880.jpg 1320w\" sizes=\"auto, (max-width: 1200px) 100vw, 1200px\" \/><\/figure><\/li><li class=\"blocks-gallery-item\"><figure><img loading=\"lazy\" decoding=\"async\" width=\"899\" height=\"1200\" src=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-899x1200.jpeg\" alt=\"\" data-id=\"73753\" 
data-full-url=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-scaled.jpeg\" data-link=\"https:\/\/www.hiig.de\/en\/?attachment_id=73753\" class=\"wp-image-73753\" srcset=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-899x1200.jpeg 899w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-599x800.jpeg 599w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-45x60.jpeg 45w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-768x1025.jpeg 768w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-135x180.jpeg 135w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-37x50.jpeg 37w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-270x360.jpeg 270w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-600x801.jpeg 600w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-1151x1536.jpeg 1151w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-1534x2048.jpeg 1534w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-1320x1762.jpeg 1320w, https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/Foto-Lajla-Fetic-1-scaled.jpeg 1918w\" sizes=\"auto, (max-width: 899px) 100vw, 899px\" \/><\/figure><\/li><\/ul><\/figure>\n\n\n\n<p><strong><i>What AI topic won&#8217;t let you sleep at night?<\/i><\/strong><\/p>\n\n\n\n<p>Finn: In particular, so-called GANs (generative adversarial networks) are a major societal challenge. They can artificially generate videos and soundtracks that are not recognisable as fakes. In light of our social media culture and its influence on society and politics, we can only begin to imagine their effect.<\/p>\n\n\n\n<p>Lajla: The hype about AI does not give me nightmares. What I ponder are the questions behind it: how can all people benefit equally from technology? 
How can marginalised groups find a hearing in the design of AI? If women, people with disabilities, people with migration experiences or without a university degree can participate equally in debates and in the development of AI, I will sleep even better.<\/p>\n\n\n\n<p><strong><i>What AI topic has not yet been sufficiently discussed?<\/i><\/strong><\/p>\n\n\n\n<p>Finn: Competitiveness. It is neither sexy nor does it spark enthusiasm. But we will only be able to implement our ethical standards if we \u2013 Germany and Europe \u2013 take the lead in developing, testing and scaling the best AI applications. Otherwise, we run the risk of repeating the experiences we are currently having with the American internet giants.<\/p>\n\n\n\n<p><span style=\"font-weight: 400;\">Lajla: The &#8220;why&#8221; question is too seldom asked in this highly charged debate. For what purpose are we developing algorithmic systems and at what cost? As a rule of thumb: <\/span><i><span style=\"font-weight: 400;\">More<\/span><\/i><span style=\"font-weight: 400;\"> is not always <\/span><i><span style=\"font-weight: 400;\">better<\/span><\/i><span style=\"font-weight: 400;\">. Algorithmic systems offer a lot of (unused) potential. And we have to discuss the conditions under which the latest developments are coming about. Training complex machine learning systems takes a lot of energy. Tools that make invisible CO2 costs visible are a good first step in talking about common good-oriented goals of technology design.<\/span><\/p>\n\n\n\n<p><strong><i>What should AI be able to do today rather than tomorrow?<\/i><\/strong><\/p>\n\n\n\n<p><span style=\"font-weight: 400;\">Finn: What AI <em>can<\/em> do isn&#8217;t necessarily the bottleneck. Much of its potential is simply still untapped. In healthcare, for example, people around the world still die of treatable diseases because they have no access to diagnostics and treatment. 
We underestimate how much we could already achieve today with a worldwide penetration of tested AI applications.<\/span><\/p>\n\n\n\n<p>Lajla: It is important to me that, today rather than tomorrow, we develop solutions for how we can use AI as a tool in a meaningful way. This requires an understanding of its possibilities and limitations, and of how the interaction between human and machine really works.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p>&#8220;We can learn to deal with bias in mind and code through critical reflection&#8221;.<\/p><\/blockquote>\n\n\n\n<p><strong><i>Lajla, developing inclusive and non-discriminatory AI \u2013 is that even possible?<\/i><\/strong><\/p>\n\n\n\n<p>The inclusive design of AI is a big task for the coming years \u2013 there will never be discrimination-free AI. How could there be? Are we humans free of prejudice? But we can learn to deal with bias in our heads and code through critical reflection. Rules for the design of algorithmic systems help us to do so. For example, it is only through good documentation of the processed data and evaluation criteria that we can determine whether a certain group of people fares worse due to the use of technology.<\/p>\n\n\n\n<p>Finn: <span style=\"font-weight: 400;\">Definitely: design can be conducted <\/span><i><span style=\"font-weight: 400;\">inclusively<\/span><\/i><span style=\"font-weight: 400;\">, and that is an undisputed priority. However, the fact that AI systems will never be <\/span><i><span style=\"font-weight: 400;\">free of all discrimination<\/span><\/i><span style=\"font-weight: 400;\"> cannot be attributed solely to human biases during the design process. Self-learning systems are trained on data sets that, in the first step, were usually not subject to conscious human selection. Instead, they are a product of their environment: available images or speech samples, for example. This is precisely where the challenge lies. 
Therefore, an important measure is to institutionalise mechanisms that flag up biases noticed in production.<\/span><\/p>\n\n\n\n<p>Lajla: <span style=\"font-weight: 400;\">To ensure that AI does not reproduce and scale existing prejudices, those developing and implementing AI must take responsibility from the very beginning. The training data sets are compiled, curated and labelled in advance. During this process, many things \u2013 man-made \u2013 can go wrong. An experiment by Kate Crawford showed how the well-known ImageNet data set (a set with more than 14 million images that is the basis for many object recognition systems) produced misogynistic results \u2013 due to bad labels. In order to select good data sets and avoid possible (gender) data gaps, we need measures that address the garbage-in-garbage-out phenomenon at an early stage. A first step would be to assemble more diverse and sensitised developer teams. Another would be to introduce quality standards for data sets that also pay attention to representation, such as the Data Nutrition Label from Harvard University and the MIT Media Lab.<\/span><\/p>\n\n\n\n<p>Finn: There is no disagreement here. Developer teams are happy to take on the responsibility. It&#8217;s not as if there aren&#8217;t any quality standards; the relevance of training data is especially well known. All serious companies work daily on the most representative and unbiased data sets possible. Diversity in teams is a very helpful maxim on an individual level, but in the AI industry as a whole, teams can of course only reflect the degree of diversity that comes from universities.<\/p>\n\n\n\n<p>Lajla: It\u2019s not that easy with quality standards. The documentation \u2013 for instance on training datasets for ML models \u2013 is not comprehensive in many cases. Often start-ups and smaller companies cannot afford the expense of adequate documentation and ethical due diligence. 
Therefore, mandatory minimum standards help us to firmly anchor what is socially desirable in corporate practice. In terms of diversity, I agree with you: we need to start much earlier, increase the proportion of female students in STEM subjects, break down social barriers and take the shortage of skilled workers seriously.<\/p>\n\n\n\n<p>Finn: When it comes to minimum standards, I wonder whether a law can reflect the complexity, diversity and pace of the AI industry. We work a lot with industrial clients \u2013 manufacturing, e-commerce, synthetic biology \u2013 and use their data to develop customised systems, for instance for improving production or quality control. Each industry has its own characteristics and not every B2B use case raises ethical questions. I would be interested to hear what legal minimum standards would look like specifically and under which circumstances they would apply. <\/p>\n\n\n\n<p><strong><i>AI systems could potentially make discrimination visible, but they are mostly used for reasons of efficiency and in some cases specifically for selection and discrimination (in a current example from Germany, they are used by people who want to change their energy supplier). In addition, there is a justified need to experiment and drive development forward \u2013 how can we build trust and not gamble it away?<\/i><\/strong><\/p>\n\n\n\n<p>Finn: By pointing out potential, sharing success stories and assessing the ratio of unobjectionable to problematic applications. Industrial applications, for example, do not use data from private individuals and they help to make work better and safer. For every application we discussed, there are four that have tacitly changed things for the better \u2013 from curing diseases to mitigating environmental disasters. 
<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p>&#8220;In the use of today&#8217;s algorithmic applications, we still face many questions, both technically and socially&#8221;<\/p><\/blockquote>\n\n\n\n<p>Lajla: When using today&#8217;s algorithmic applications, we still face many technical and social questions, e.g. about human-technology interaction. Therefore, we have to take a close look at particularly sensitive areas (personnel selection, health and public services). Certification in these areas could create more security for users and those affected.<\/p>\n\n\n\n<p><strong><i>Where do you see a particularly urgent need for action in European legislation? Do you find the approach of exclusively risk-related regulation sensible? How could the EU as a legislator perhaps even send a positive signal now?<\/i><\/strong><\/p>\n\n\n\n<p>Finn: With its exclusive focus on the most conscientious regulation possible, the EU will find it difficult to retrospectively impose its own standards on foreign companies that drive innovation \u2013 as was the case with the tech giants of the 2000s. The local AI ecosystem must be backed much more rigorously: by public partnerships and funds, investments in education and professorships, the opening of test fields and the clarification of ambiguous regulation. The so-called ecosystem of excellence is notoriously under-emphasised.<\/p>\n\n\n\n<p>Lajla: With the GDPR, Europe has shown that it can play a pioneering role in tech regulation. Future AI regulation at European level can add another chapter to this success story if it creates binding standards for applications. This can also offer small and medium-sized companies and start-ups a secure framework for innovation. Risk-related regulation combines innovation promotion and necessary standards. 
Nevertheless, trustworthy AI requires not only laws but also supervisory institutions and contact points for citizens.<\/p>\n\n\n\n<p><strong><i>Good solutions require both the provision of and access to (personal) data to a greater extent than before. As a society, we need to be more understanding and willing to disclose this data in the future. Do you agree with this statement?<\/i><\/strong><\/p>\n\n\n\n<p><span style=\"font-weight: 400;\">Finn: With a self-critical glance at our social media use, I fail to see a lack of willingness to share data. And while there certainly is a trend towards mass data collection, not all of it is profitable. It will increasingly be a matter of awareness: where is my data? What data do I never want to reveal? With regard to questions of transparency and liability, I envision a strong role for the regulator. And governments have started to pick this up.<\/span> <\/p>\n\n\n\n<p><span style=\"font-weight: 400;\">Lajla: Agreed, Finn! Already today, there are zettabytes of data lying around unused on servers. But who owns them? Large foreign tech companies. I would like citizens and civil society to harness the potential of this data for themselves. This primarily requires intelligent data-sharing models and more examples of how data can be used for joint projects, such as the <\/span><i><span style=\"font-weight: 400;\">Gie\u00df den Kiez<\/span><\/i><span style=\"font-weight: 400;\"> tree-watering project by CityLAB Berlin. 
<\/span><\/p>\n\n\n\n<hr class=\"wp-block-separator\"\/>\n\n\n\n<p>This interview was first published in our annual research magazine, <em><a href=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/encore2020_magazine.pdf\" target=\"_blank\" rel=\"noreferrer noopener\">encore<\/a>.<\/em><\/p>","protected":false},"excerpt":{"rendered":"<p>How can the increasing automation of infrastructures be made more inclusive and sustainable and be brought into accordance with human 
rights?<\/p>\n","protected":false},"author":271,"featured_media":73783,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1289,1582],"tags":[1263,1081,686],"class_list":["post-73574","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence","category-ftif-ai-and-society","tag-diversitat","tag-inklusion-2","tag-ki-2"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.6 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Opening match: the battle for inclusion in algorithmic systems &#8211; Digital Society Blog<\/title>\n<meta name=\"description\" content=\"How can the increasing automation of infrastructures be made more inclusive and sustainable and be brought into accordance with human rights?\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Opening match: the battle for inclusion in algorithmic systems &#8211; Digital Society Blog\" \/>\n<meta property=\"og:description\" content=\"How can the increasing automation of infrastructures be made more inclusive and sustainable and be brought into accordance with human rights?\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/\" \/>\n<meta property=\"og:site_name\" content=\"HIIG\" \/>\n<meta property=\"article:published_time\" content=\"2021-01-25T15:30:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2023-03-28T12:05:52+00:00\" \/>\n<meta property=\"og:image\" 
content=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/austris-augusts-52p1K0d0euM-unsplash.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"800\" \/>\n\t<meta property=\"og:image:height\" content=\"448\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Juliane Henn\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Juliane Henn\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"9 minutes\" \/>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Opening match: the battle for inclusion in algorithmic systems &#8211; Digital Society Blog","description":"How can the increasing automation of infrastructures be made more inclusive and sustainable and be brought into accordance with human rights?","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/","og_locale":"en_US","og_type":"article","og_title":"Opening match: the battle for inclusion in algorithmic systems &#8211; Digital Society Blog","og_description":"How can the increasing automation of infrastructures be made more inclusive and sustainable and be brought into accordance with human rights?","og_url":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/","og_site_name":"HIIG","article_published_time":"2021-01-25T15:30:00+00:00","article_modified_time":"2023-03-28T12:05:52+00:00","og_image":[{"width":800,"height":448,"url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/austris-augusts-52p1K0d0euM-unsplash.jpg","type":"image\/jpeg"}],"author":"Juliane 
Henn","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Juliane Henn","Est. reading time":"9 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/#article","isPartOf":{"@id":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/"},"author":{"name":"Juliane Henn","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/47de3cfd611befae947bfad5fc1684de"},"headline":"Opening match: the battle for inclusion in algorithmic systems","datePublished":"2021-01-25T15:30:00+00:00","dateModified":"2023-03-28T12:05:52+00:00","mainEntityOfPage":{"@id":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/"},"wordCount":1814,"publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"image":{"@id":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/austris-augusts-52p1K0d0euM-unsplash.jpg","keywords":["Diversit\u00e4t","inklusion","KI"],"articleSection":["Artificial Intelligence","ftif AI and Society"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/","url":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/","name":"Opening match: the battle for inclusion in algorithmic systems &#8211; Digital Society 
Blog","isPartOf":{"@id":"https:\/\/www.hiig.de\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/#primaryimage"},"image":{"@id":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/austris-augusts-52p1K0d0euM-unsplash.jpg","datePublished":"2021-01-25T15:30:00+00:00","dateModified":"2023-03-28T12:05:52+00:00","description":"How can the increasing automation of infrastructures be made more inclusive and sustainable and be brought into accordance with human rights?","breadcrumb":{"@id":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/#primaryimage","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/austris-augusts-52p1K0d0euM-unsplash.jpg","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/01\/austris-augusts-52p1K0d0euM-unsplash.jpg","width":800,"height":448},{"@type":"BreadcrumbList","@id":"https:\/\/www.hiig.de\/en\/opening-match-the-battle-for-inclusion-in-algorithmic-systems\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.hiig.de\/en\/"},{"@type":"ListItem","position":2,"name":"Opening match: the battle for inclusion in algorithmic systems"}]},{"@type":"WebSite","@id":"https:\/\/www.hiig.de\/#website","url":"https:\/\/www.hiig.de\/","name":"HIIG","description":"Alexander von Humboldt Institute for Internet and 
Society","publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.hiig.de\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.hiig.de\/#organization","name":"HIIG","url":"https:\/\/www.hiig.de\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","width":320,"height":80,"caption":"HIIG"},"image":{"@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/47de3cfd611befae947bfad5fc1684de","name":"Juliane Henn"}]}},"_links":{"self":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/73574","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/users\/271"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/comments?post=73574"}],"version-history":[{"count":7,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/73574\/revisions"}],"predecessor-version":[{"id":79648,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/73574\/revisions\/79648"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media\/73783"}],"wp:attachment":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media?parent=73574"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/categories?post=73574"},{"taxonomy":"post_tag","emb
eddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/tags?post=73574"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}