{"id":80032,"date":"2021-10-26T09:00:00","date_gmt":"2021-10-26T07:00:00","guid":{"rendered":"https:\/\/www.hiig.de\/?p=80032"},"modified":"2023-03-28T14:03:13","modified_gmt":"2023-03-28T12:03:13","slug":"toolkit-intersectional-ai","status":"publish","type":"post","link":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/","title":{"rendered":"New toolkit collects easy tips for intersectional AI"},"content":{"rendered":"\n<p><strong>By drawing on marginalised practices to fundamentally reshape the development and use of AI technologies, intersectional approaches to AI (IAI) are key to ensuring greater inclusiveness. Our new toolkit provides an introductory guide to IAI and argues that anyone should be able to understand what AI is and what AI ought to be.<\/strong><\/p>\n\n\n\n<p><strong>AI bias reinforces discrimination<\/strong><\/p>\n\n\n\n<p>AI systems have made the way some of us work, move and socialise much easier. However, their promise to enhance user experiences and provide opportunities has not held true equally for everyone. On the contrary: for many, AI systems have further widened the gaps of inequality and worsened discrimination instead of tackling them at their roots. Even so-called intelligent systems merely reproduce the existing analogue world, including its underlying power structures. This means that AI applications \u2013 like any technology \u2013 are never neutral. Allowing only a small but powerful fraction of society to design and implement AI systems means that power imbalances remain or are even amplified by computation. 
Unfair internet infrastructures will continue to be passed off as impartial ones \u2014 and with no one else to say otherwise, we may never be able to imagine them any other way.<\/p>\n\n\n\n<p><strong>Why we need inclusive AI<\/strong><\/p>\n\n\n\n<p>Communities that are already marginalised are often left out of conversations about what kinds of AI systems should and should not exist, and how they should be created and used \u2013 despite the fact that these groups are disproportionately affected by the harmful impacts of AI systems. Scholars like <a href=\"https:\/\/www.poetofcode.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">Joy Buolamwini<\/a> and <a href=\"https:\/\/www.macfound.org\/fellows\/class-of-2021\/safiya-noble\" target=\"_blank\" rel=\"noreferrer noopener\">2021 MacArthur Fellow Safiya Noble<\/a> document the dangers of algorithmic injustice through insidious but widespread examples, from shadow banning to predictive policing.<\/p>\n\n\n\n<p>With the increasing automation of public and private infrastructures, future AI systems should be made by diverse, interdisciplinary and intersectional communities rather than by a select few. These communities need support in addressing the adverse effects they face; at the same time, system designers can improve AI for everyone by listening to the knowledge gained from many perspectives. Diverse groups \u2014 Black feminists, and queer and disability theorists, for example \u2014 have long considered the very questions that problematic AI now exacerbates. 
We can and must rely on a broader variety of perspectives if we are to shift the course of AI\u2019s future toward more inclusive systems.<\/p>\n\n\n\n<p>Building on its <a href=\"https:\/\/www.hiig.de\/en\/project\/public-interest-ai\/\" target=\"_blank\" rel=\"noreferrer noopener\">research on public interest AI<\/a>, HIIG\u2019s <a href=\"https:\/\/www.hiig.de\/en\/research\/ai-and-society-lab\/\" target=\"_blank\" rel=\"noreferrer noopener\">AI &amp; Society Lab<\/a> puts a strong focus on questions in this area: How can AI and other technologies be made more approachable for everyone, so that people better understand AI systems and how these systems affect them? What do particularly marginalised communities wish to change about AI, and how can we support them in doing so?<\/p>\n\n\n\n<p><strong>How Intersectional AI can help<\/strong><\/p>\n\n\n\n<p>The <a href=\"http:\/\/intersectionalai.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">Intersectional AI Toolkit<\/a> helps answer these questions by connecting communities in order to create introductory guides to AI from multiple, approachable perspectives. Developed by Sarah Ciston during a virtual fellowship at the AI &amp; Society Lab, the Intersectional AI Toolkit argues that anyone can and should be able to understand what AI is and what AI ought to be.<\/p>\n\n\n\n<p>Intersectionality describes how power operates structurally and how multiple forms of discrimination have compounding, interdependent effects. American lawyer Kimberl\u00e9 Crenshaw introduced the term in 1989, using the image of an intersection where paths of power cross to illustrate the interwoven nature of social inequalities.<\/p>\n\n\n\n<p>As imagined by this toolkit, Intersectional AI will bring decades of work on intersectional ideas, ethics, and tactics to the issues of inequality faced by AI. 
By drawing on established ideas and practices, and understanding how to combine them, intersectionality can help reshape AI in fundamental ways. Through its layered, structural approach, Intersectional AI connects the dots between concepts \u2014 as seen from different disciplines and operating across systems \u2014 so that individuals and researchers can help address the gaps that others could not see.<\/p>\n\n\n\n<p><strong>A toolkit that helps to think about intersectionality and code inclusive AI<\/strong><\/p>\n\n\n\n<p>The Intersectional AI Toolkit is a collection of small magazines (or zines) that offer practical, accessible guides to both AI and intersectionality. They are written for engineers, artists, activists, academics, makers and anyone else who wants to understand the automated systems that affect them. By sharing key concepts, tactics, and resources, they serve as jumping-off points to inspire readers\u2019 own further research and conversation across disciplines and communities, asking questions like \u201cIs decolonizing AI possible?\u201d or \u201cWhat does it mean to learn to code?\u201d<\/p>\n\n\n\n<p>The toolkit is available as a digital resource that continues to grow with community contributions, and as printable zines that can be folded, shared, and discussed offline. With issues such as a two-sided glossary (\u201cIAI A-to-Z\u201d), strategy flashcards (\u201cTactics for Intersectional AI\u201d) and a guide to concepts for skeptics (\u201cHelp Me Understand Intersectionality\u201d), the zine collection focuses on plain language and tangible impact.<\/p>\n\n\n\n<p>This toolkit is not the first or only resource on intersectionality or AI. 
Instead, it gathers some of the amazing people, ideas, and forces working to re-examine the foundational assumptions built into these technologies, such as Catherine D\u2019Ignazio and Lauren Klein\u2019s work on \u201c<a href=\"https:\/\/mitpress.mit.edu\/books\/data-feminism\" target=\"_blank\" rel=\"noreferrer noopener\">Data Feminism<\/a>\u201d or Ruha Benjamin\u2019s \u201c<a href=\"https:\/\/www.ruhabenjamin.com\/race-after-technology\" target=\"_blank\" rel=\"noreferrer noopener\">Race After Technology<\/a>\u201d. It also looks at which people are (not) involved when AI is developed, and which processes and safeguards exist or ought to exist. In this way, it helps us understand power and aims to link AI development back to democratic processes.<\/p>\n\n\n\n<p><strong>Why is the future of AI intersectional?<\/strong><\/p>\n\n\n\n<p>Current approaches to AI fail to address two major problems. First: those who create AI systems \u2013 from code to policy to infrastructure \u2013 fail to listen to the needs or wisdom of the marginalised communities most injured by those systems. Second: current language and tools for AI put up intimidating barriers that prevent outsiders from understanding, building, or changing these systems. If we want improved, inclusive AI systems, we must consider a broader range of people\u2019s needs as much as a broader range of people\u2019s knowledge. Otherwise, we face a future that perpetuates the same problems under the guise of fairness and automation.<\/p>\n\n\n\n<p>The Intersectional AI Toolkit tries to intervene by facilitating much-needed exchange between different groups around these issues. The AI &amp; Society Lab hosted the launch of the Toolkit as an <a href=\"https:\/\/www.hiig.de\/en\/events\/edit-a-thon-intersectional-ai-toolkit\/\" target=\"_blank\" rel=\"noreferrer noopener\">Edit-a-thon<\/a> workshop in order to gain multiple valuable perspectives through diverse public participation. 
Over the coming months, more digital and in-person zine-making workshops are <a href=\"https:\/\/ki-convention.com\/en\/ki-wir-convention-20\/\" target=\"_blank\" rel=\"noreferrer noopener\">planned<\/a> to keep building the Toolkit while advocating for intersectional approaches to AI in areas such as AI governance.<\/p>\n\n\n\n<p>All AI systems are socio-technical; they interconnect humans and machines. Intersectionality reminds us how power imbalances affect those connections. By addressing the gap between those who want to understand and shape AI and those who already make and regulate it, <a href=\"http:\/\/intersectionalai.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">Intersectional AI<\/a> can help us find the shared language we need to reimagine AI together.<\/p>\n\n\n\n<p><strong>References<\/strong><\/p>\n\n\n\n<p>Crenshaw, K. (1989). Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics. University of Chicago Legal Forum, 1989(1), 139\u2013167.<\/p>\n\n\n\n<p><strong>tl;dr<\/strong><\/p>\n\n\n\n<p>The Intersectional AI Toolkit will remain accessible for contributions and comments at <a href=\"http:\/\/intersectionalai.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">intersectionalai.com<\/a>.<br>The Intersectional AI Toolkit Edit-a-thon took place on Sep 1, 2021 and was hosted by HIIG\u2019s AI &amp; Society Lab in collaboration with our partners <a href=\"https:\/\/motif-institute.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">MOTIF<\/a>, <a href=\"https:\/\/netzforma.org\/\" target=\"_blank\" rel=\"noreferrer noopener\">netzforma* e.V.<\/a>, <a href=\"https:\/\/superrr.net\/\" target=\"_blank\" rel=\"noreferrer noopener\">SUPERRR<\/a> and the <a href=\"https:\/\/hans-bredow-institut.de\/en\" target=\"_blank\" rel=\"noreferrer noopener\">Leibniz Institute for Media Research | Hans-Bredow-Institut (HBI)<\/a>.<\/p>\n<div class=\"shariff shariff-align-flex-start 
shariff-widget-align-flex-start\"><\/div>","protected":false},"excerpt":{"rendered":"<p>Intersectional AI (IAI) is the key to greater inclusivity. 
It draws on marginalised practices and perspectives to fundamentally reshape the development, design and use of AI.<\/p>\n","protected":false},"author":356,"featured_media":80029,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1289,1582],"tags":[1108,1263,1309,1251,686,1252],"class_list":["post-80032","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence","category-ftif-ai-and-society","tag-diskriminierung-2","tag-diversitat","tag-feministische-ki-2","tag-intersektionalitat","tag-ki-2","tag-queer-en"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.6 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>New toolkit collects easy tips for intersectional AI &#8211; Digital Society Blog<\/title>\n<meta name=\"description\" content=\"AI systems reinforce discrimination. Our intersectional AI toolkit explains why and draws on the perspectives of marginalised communities.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"New toolkit collects easy tips for intersectional AI &#8211; Digital Society Blog\" \/>\n<meta property=\"og:description\" content=\"AI systems reinforce discrimination. 
Our intersectional AI toolkit explains why and draws on the perspectives of marginalised communities.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/\" \/>\n<meta property=\"og:site_name\" content=\"HIIG\" \/>\n<meta property=\"article:published_time\" content=\"2021-10-26T07:00:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2023-03-28T12:03:13+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/10\/Webseite_banner_intersectional_AI_blogpost.png\" \/>\n\t<meta property=\"og:image:width\" content=\"800\" \/>\n\t<meta property=\"og:image:height\" content=\"450\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Hauke Odendahl\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Hauke Odendahl\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"6 minutes\" \/>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"New toolkit collects easy tips for intersectional AI &#8211; Digital Society Blog","description":"AI systems reinforce discrimination. Our intersectional AI toolkit explains why and draws on the perspectives of marginalised communities.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/","og_locale":"en_US","og_type":"article","og_title":"New toolkit collects easy tips for intersectional AI &#8211; Digital Society Blog","og_description":"AI systems reinforce discrimination. 
Our intersectional AI toolkit explains why and draws on the perspectives of marginalised communities.","og_url":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/","og_site_name":"HIIG","article_published_time":"2021-10-26T07:00:00+00:00","article_modified_time":"2023-03-28T12:03:13+00:00","og_image":[{"width":800,"height":450,"url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/10\/Webseite_banner_intersectional_AI_blogpost.png","type":"image\/png"}],"author":"Hauke Odendahl","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Hauke Odendahl","Est. reading time":"6 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/#article","isPartOf":{"@id":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/"},"author":{"name":"Hauke Odendahl","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/91b3ac6e9a08e6cc5c739126166f02da"},"headline":"New toolkit collects easy tips for intersectional AI","datePublished":"2021-10-26T07:00:00+00:00","dateModified":"2023-03-28T12:03:13+00:00","mainEntityOfPage":{"@id":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/"},"wordCount":1228,"publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"image":{"@id":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/10\/Webseite_banner_intersectional_AI_blogpost.png","keywords":["diskriminierung","Diversit\u00e4t","feministische KI","Intersektionalit\u00e4t","KI","Queer"],"articleSection":["Artificial Intelligence","ftif AI and Society"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/","url":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/","name":"New toolkit collects easy tips for intersectional AI &#8211; Digital Society 
Blog","isPartOf":{"@id":"https:\/\/www.hiig.de\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/#primaryimage"},"image":{"@id":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/10\/Webseite_banner_intersectional_AI_blogpost.png","datePublished":"2021-10-26T07:00:00+00:00","dateModified":"2023-03-28T12:03:13+00:00","description":"AI systems reinforce discrimination. Our intersectional AI toolkit explains why and draws on the perspectives of marginalised communities.","breadcrumb":{"@id":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/#primaryimage","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/10\/Webseite_banner_intersectional_AI_blogpost.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2021\/10\/Webseite_banner_intersectional_AI_blogpost.png","width":800,"height":450,"caption":"Eine gro\u00dfe Kreuzung mit viel Verkehr."},{"@type":"BreadcrumbList","@id":"https:\/\/www.hiig.de\/en\/toolkit-intersectional-ai\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.hiig.de\/en\/"},{"@type":"ListItem","position":2,"name":"New toolkit collects easy tips for intersectional AI"}]},{"@type":"WebSite","@id":"https:\/\/www.hiig.de\/#website","url":"https:\/\/www.hiig.de\/","name":"HIIG","description":"Alexander von Humboldt Institute for Internet and 
Society","publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.hiig.de\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.hiig.de\/#organization","name":"HIIG","url":"https:\/\/www.hiig.de\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","width":320,"height":80,"caption":"HIIG"},"image":{"@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/91b3ac6e9a08e6cc5c739126166f02da","name":"Hauke Odendahl"}]}},"_links":{"self":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/80032","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/users\/356"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/comments?post=80032"}],"version-history":[{"count":15,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/80032\/revisions"}],"predecessor-version":[{"id":83594,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/80032\/revisions\/83594"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media\/80029"}],"wp:attachment":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media?parent=80032"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/categories?post=80032"},{"taxonomy":"post_tag","
embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/tags?post=80032"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}