{"id":87301,"date":"2022-08-23T11:31:33","date_gmt":"2022-08-23T09:31:33","guid":{"rendered":"https:\/\/www.hiig.de\/?p=87301"},"modified":"2022-09-28T17:11:54","modified_gmt":"2022-09-28T15:11:54","slug":"content-moderation","status":"publish","type":"post","link":"https:\/\/www.hiig.de\/en\/content-moderation\/","title":{"rendered":"Content Moderation \u2013 What can stay, what must go?"},"content":{"rendered":"\n<p><strong>Automated deletion on social networks is putting freedom of expression at risk. That\u2019s why we need a few rules. The research project &#8220;Ethics of Digitalisation&#8221; has worked out what these might be.<\/strong><\/p>\n\n\n\n<p>Guest article by Alexandra Borchardt<\/p>\n\n\n\n<p>Should Donald Trump be allowed to tweet? Should the state broadcaster <em>Russia Today<\/em> be allowed to spread war propaganda on Facebook, Instagram and YouTube? For a long time, those in charge of the large social media platform corporations such as Facebook\/Meta or Google have evaded these and similar questions. This was not just out of ignorance or the na\u00efve belief that having more opinions on the internet would automatically guarantee diversity. Above all, they wanted to make money and relied on a simple formula: more speech = more money. Moreover, they did not want to have to play a policing role in public discourse. But the more tense the political climate has become and the louder the discussion about hate, incitement to hate, violence and lies on the internet has grown, the more resistance has crumbled. This did not always happen voluntarily. Laws like the German Network Enforcement Act (Netzwerkdurchsetzungsgesetz or NetzDG for short) have increased the pressure.&nbsp; Platforms such as Facebook or YouTube face fines if they do not remove certain content quickly, and they are required to be more transparent. 
But when the US Capitol building was stormed on 6 January 2021, many doubters in the corporate world had to concede: staying neutral is not an option in the face of such calls for violence. Hate on the web can put democracy in danger.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Why do machines often systematically delete the wrong content?<\/strong><\/h2>\n\n\n\n<p>Today, corporations systematically delete content that violates their internal rules and the law. Systematically means they leave the task of deleting and hiding content to machines, at least initially. Such automated moderation programs, some of which are self-learning, can decide on their own that this can stay but that must go. The technical term for this is automated content moderation.<\/p>\n\n\n\n<p>But software is not as smart as many people think and perhaps even fear. It can only compare content to something it has previously seen, but it cannot put it into context, especially when it comes to cultural idiosyncrasies or humour. For example, an image that might thrill the art scene in one country could be considered pornographic in another. The platforms\u2019 algorithms are only really well-trained in a few languages, because most platform corporations are based in the USA. And they often interpret certain formats poorly \u2013 for example, visual symbols such as memes and GIFs. That is why deletion often goes wrong: posts that clearly violate the law stay up, but harmless or even important statements are taken down. This, in turn, puts freedom of expression at risk. What needs to happen so that social networks can continue to be central forums for debating and sharing information without providing a platform for propagandists? 
<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">The scientific debate on content moderation<\/h2>\n\n\n\n<p>This question is being addressed by the civil society organisations that have developed the Santa Clara Principles on Transparency and Accountability in Content Moderation and, in particular, by the EU, with its <a href=\"https:\/\/digital-strategy.ec.europa.eu\/en\/policies\/digital-services-act-package\">Digital Services Act<\/a>. However, scientific input is needed to create good regulations. In the research project <a href=\"https:\/\/www.hiig.de\/en\/project\/the-ethics-of-digitalisation\/\">\u201cEthics of Digitalisation\u201d<\/a>, which is supported by internet institutes worldwide and financed by the Mercator Foundation under the patronage of Federal President Frank-Walter Steinmeier, 13 researchers from nine countries in seven time zones and various disciplines have been working intensively on the topic of automated content moderation. From August to October 2020, they analysed the situation in so-called research sprints \u2013 with the support of mentors. The Alexander von Humboldt Institute for Internet and Society (HIIG) in Berlin was in charge of this project. As a result, recommendations have been developed for policy-makers.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Nothing works without people as monitors<\/strong><\/h2>\n\n\n\n<p>The researchers made several assumptions. First, in the internet\u2019s communication channels, algorithms do the sorting out and removal of content. This will not change, because the sheer volume of content makes anything else unthinkable. Second, there are huge knowledge gaps among all stakeholders about how these algorithms operate and learn. This makes it particularly difficult to develop appropriate and effective regulation. 
Third, up to now it has often been unclear who is responsible in the world of digital information channels and who is not only called upon to act but also bears liability for their actions. And fourth, software will never be able to sort content perfectly. When it is tasked with doing so, fundamental rights, especially freedom of expression, are put at risk. Policy-makers will not be able to find perfect answers to these questions, because in most cases, it is the context that matters when it comes to content. A quote, a film clip, or a picture can be interpreted very differently depending on who posts it and in what context.<\/p>\n\n\n\n<p>Experts advise caution when states begin to remove content that can be classified as merely \u201cproblematic\u201d or \u201charmful\u201d but not illegal. The reason for this is that such vague categories open the door to censorship. However, hardly any software will be able to correctly classify the legal situation at all times and in all places. Therefore, nothing will work without people to monitor the process. In their proposals, researchers in the Ethics of Digitalisation project have developed a few principles that could guide all parties involved \u2013 not just governments and parliamentarians but also the platform corporations.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>We need facts and the opportunity to fight back<\/strong><\/h2>\n\n\n\n<p>The first concern is far-reaching <em>transparency<\/em>. This demand is directed at the platform corporations on the one hand and the regulators on the other. Google, Facebook and similar companies should be obliged to disclose how their systems work and regularly verify that they respect fundamental rights such as freedom of expression and privacy. This demand has largely been addressed by the Digital Services Act. Legislators, on the other hand, should above all disclose their intentions, justify them and offer reliability. 
Once the regulatory process is underway, everyone should know how it is being conducted, what its aims are and what its results are. Regulations should be based on findings from research and be flexible enough to be adapted to new technical developments. This requires a broad social debate on how content should be sorted and automatically filtered.<\/p>\n\n\n\n<p>Currently, platform companies\u2019 algorithms largely optimise content according to its chances of attracting attention. Behind this is an ad-driven business model that relies on views \u2013 the so-called \u201cbattle for the eyeballs\u201d. By applying this model, the companies automatically encourage the posting of all kinds of nonsense \u2013 which then has to be checked for its legality. However, positive selection would also be possible: algorithms could give preference to posts and information whose factual accuracy has been checked or that come from sources certified as reputable. The <a href=\"https:\/\/www.journalismtrustinitiative.org\/\">Journalism Trust Initiative<\/a> of the organisation Reporters Without Borders is campaigning for a system like this, with the aim of helping to make quality journalism more visible. Other content that is less conducive to constructive debate then automatically moves further down the list and becomes barely visible.<\/p>\n\n\n\n<p>But because machines and people inevitably make mistakes, it is not enough to sort content by output. <em>Citizens and institutions must also be given simple, fast and unbureaucratic ways to enforce their rights<\/em> when they feel they have been treated unfairly, i.e. when they have been blocked or censored for no apparent reason. Platform corporations should be obliged to create such structures \u2013 for example, they should create options to appeal incorrect decisions with just a few clicks. 
The researchers recommend that an independent ombudsperson, or platform advisory board, be appointed as the final authority to arbitrate in disputes and make binding decisions for all parties. In addition, of course, there is always the option of going to the national and international courts.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Who actually monitors the algorithms that monitor me?<\/strong><\/h2>\n\n\n\n<p>In addition, the researchers suggest that the <em>algorithms themselves should be subject to regular monitoring<\/em> and that audits should be established for this purpose, i.e. a kind of algorithm MOT. These audits are intended to ensure, among other things, that the algorithms comply with the law \u2013 for example, that they do not discriminate. Discrimination can arise quickly, because artificial intelligence \u201clearns\u201d what works best. If these algorithms are not kept under scrutiny and regularly checked, stereotypes will not only be perpetuated but possibly even reinforced. In recent years, there has been a growing social debate about what algorithms must be able to do and what they are used for. Whereas, initially, algorithms were primarily concerned with solving tasks with the greatest possible efficiency \u2013 for example, granting loans, selecting applicants or personalising content \u2013 values now play an increasingly important role. It is now considered crucial to define from the beginning what goals automated selection should achieve and to check whether they are achieved.<\/p>\n\n\n\n<p>However, enforcing such an MOT will be a challenge, because algorithms are the modern equivalent of the Coca-Cola formula: the platform companies regard them as trade secrets; they want to keep others in the dark when it comes to optimising their sorting software and don\u2019t want to give up control over it. Critics find this unacceptable. 
After all, these are powerful instruments that influence public debates and may mean that life-defining information is not spread widely enough. The researchers have therefore established four basic principles for such audits: auditor independence, access to data, publication of results and sufficient resources.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Who will safeguard my freedom of expression now: the state, industry or civil society?<\/strong><\/h2>\n\n\n\n<p>Audits would not be entirely new. This instrument already exists in European law \u2013 for example, in the GDPR data protection package, where \u201cdata protection audits\u201d are included as a control option. Another possibility would be public registers for algorithms to disclose the basis on which automated decisions are made. In Europe, this tool is currently being tested by authorities in Amsterdam, Helsinki and Nantes. Such registers could also be set up for the private sector.<\/p>\n\n\n\n<p>However, the researchers admit that such regulation could also open the door to abuse. Governments could use such audits as a pretext to restrict privacy, suppress dissent or prevent people from exercising other fundamental rights, as is happening in Russia, for example. \u201cLike any regulation, audits would need to be prudently established to protect against abuse, misuse, politicisation and disproportionate interference,\u201d the team writes. 
The relevant processes would therefore have to be set up in such a way that neither the states nor the industry alone could exert undue influence over them, either because both have equal voting rights or because civil society actors are involved.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Companies should provide data<\/h2>\n\n\n\n<p>In a world that is changing rapidly \u2013 not least due to technological developments \u2013 it is important in any event to involve all actors and groups with the requisite knowledge and experience in political processes. Lengthy, hierarchically controlled decision-making processes do not do justice to dynamic developments. Flexibility and adaptability are needed. Making policy on algorithmic content management increasingly resembles open-heart surgery. More than ever, it is important to shorten the pathways between action and impact through practice-oriented research. For this to happen, however, scientists need access to data. There is certainly no shortage of this data, but there likely is a lack of willingness on the part of influential companies to make it available. In a society built on knowledge and facts, knowledge cannot be shared quickly enough. 
Everyone is called upon to contribute to this.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>What could rules for algorithmic content moderation in social networks look like? 
This guest article by Alexandra Borchardt examines researchers&#8217; suggestions.<\/p>\n","protected":false},"author":9999998,"featured_media":87391,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[],"tags":[],"class_list":["post-87301","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.6 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Content Moderation \u2013 What can stay, what must go? Digital Society Blog<\/title>\n<meta name=\"description\" content=\"The research project &quot;Ethics of Digitalisation&quot; has developed rules for algorithmic content moderation in social networks.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.hiig.de\/en\/content-moderation\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Content Moderation \u2013 What can stay, what must go? 
Digital Society Blog\" \/>\n<meta property=\"og:description\" content=\"The research project &quot;Ethics of Digitalisation&quot; has developed rules for algorhithmic content moderation in social networks.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.hiig.de\/en\/content-moderation\/\" \/>\n<meta property=\"og:site_name\" content=\"HIIG\" \/>\n<meta property=\"article:published_time\" content=\"2022-08-23T09:31:33+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2022-09-28T15:11:54+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2022\/08\/Blog-Titelbild-\u2013-8-1.png\" \/>\n\t<meta property=\"og:image:width\" content=\"800\" \/>\n\t<meta property=\"og:image:height\" content=\"450\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Stefanie Barth\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Stefanie Barth\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"11 minutes\" \/>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Content Moderation \u2013 What can stay, what must go? Digital Society Blog","description":"The research project \"Ethics of Digitalisation\" has developed rules for algorhithmic content moderation in social networks.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.hiig.de\/en\/content-moderation\/","og_locale":"en_US","og_type":"article","og_title":"Content Moderation \u2013 What can stay, what must go? 
Digital Society Blog","og_description":"The research project \"Ethics of Digitalisation\" has developed rules for algorhithmic content moderation in social networks.","og_url":"https:\/\/www.hiig.de\/en\/content-moderation\/","og_site_name":"HIIG","article_published_time":"2022-08-23T09:31:33+00:00","article_modified_time":"2022-09-28T15:11:54+00:00","og_image":[{"width":800,"height":450,"url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2022\/08\/Blog-Titelbild-\u2013-8-1.png","type":"image\/png"}],"author":"Stefanie Barth","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Stefanie Barth","Est. reading time":"11 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.hiig.de\/en\/content-moderation\/#article","isPartOf":{"@id":"https:\/\/www.hiig.de\/en\/content-moderation\/"},"author":{"name":"Stefanie Barth","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/a07aa81c80d1dbd4ef1ab5c1cd9c10fd"},"headline":"Content Moderation \u2013 What can stay, what must go?","datePublished":"2022-08-23T09:31:33+00:00","dateModified":"2022-09-28T15:11:54+00:00","mainEntityOfPage":{"@id":"https:\/\/www.hiig.de\/en\/content-moderation\/"},"wordCount":1941,"publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"image":{"@id":"https:\/\/www.hiig.de\/en\/content-moderation\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2022\/08\/Blog-Titelbild-\u2013-8-1.png","inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.hiig.de\/en\/content-moderation\/","url":"https:\/\/www.hiig.de\/en\/content-moderation\/","name":"Content Moderation \u2013 What can stay, what must go? 
Digital Society Blog","isPartOf":{"@id":"https:\/\/www.hiig.de\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.hiig.de\/en\/content-moderation\/#primaryimage"},"image":{"@id":"https:\/\/www.hiig.de\/en\/content-moderation\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2022\/08\/Blog-Titelbild-\u2013-8-1.png","datePublished":"2022-08-23T09:31:33+00:00","dateModified":"2022-09-28T15:11:54+00:00","description":"The research project \"Ethics of Digitalisation\" has developed rules for algorhithmic content moderation in social networks.","breadcrumb":{"@id":"https:\/\/www.hiig.de\/en\/content-moderation\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.hiig.de\/en\/content-moderation\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/en\/content-moderation\/#primaryimage","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2022\/08\/Blog-Titelbild-\u2013-8-1.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2022\/08\/Blog-Titelbild-\u2013-8-1.png","width":800,"height":450,"caption":"Tweets f\u00fcr die Tonne"},{"@type":"BreadcrumbList","@id":"https:\/\/www.hiig.de\/en\/content-moderation\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.hiig.de\/en\/"},{"@type":"ListItem","position":2,"name":"Content Moderation \u2013 What can stay, what must go?"}]},{"@type":"WebSite","@id":"https:\/\/www.hiig.de\/#website","url":"https:\/\/www.hiig.de\/","name":"HIIG","description":"Alexander von Humboldt Institute for Internet and 
Society","publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.hiig.de\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.hiig.de\/#organization","name":"HIIG","url":"https:\/\/www.hiig.de\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","width":320,"height":80,"caption":"HIIG"},"image":{"@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/a07aa81c80d1dbd4ef1ab5c1cd9c10fd","name":"Stefanie Barth"}]}},"_links":{"self":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/87301","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/users\/9999998"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/comments?post=87301"}],"version-history":[{"count":4,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/87301\/revisions"}],"predecessor-version":[{"id":87901,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/87301\/revisions\/87901"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media\/87391"}],"wp:attachment":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media?parent=87301"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/categories?post=87301"},{"taxonomy":"post_tag
","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/tags?post=87301"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}