{"id":18290,"date":"2014-06-30T19:23:26","date_gmt":"2014-06-30T17:23:26","guid":{"rendered":"https:\/\/www.hiig.de\/?p=18290"},"modified":"2021-07-22T13:35:19","modified_gmt":"2021-07-22T11:35:19","slug":"all-the-worlds-a-laboratory-on-facebooks-emotional-contagion-experiment-and-user-rights","status":"publish","type":"post","link":"https:\/\/www.hiig.de\/en\/all-the-worlds-a-laboratory-on-facebooks-emotional-contagion-experiment-and-user-rights\/","title":{"rendered":"All the world\u2019s a laboratory? On Facebook\u2019s emotional contagion experiment and user rights"},"content":{"rendered":"<p><em>by\u00a0<a href=\"http:\/\/cbpuschmann.net\/\">Cornelius Puschmann<\/a> and HIIG Fellow\u00a0<a href=\"http:\/\/ethicsandtechnology.eu\/bozdag\">Engin Bozdag<\/a><\/em><\/p>\n<p>How significant is the impact of what we read on Facebook on what we post there, specifically on our emotions? And to what extent can we trust what we read on social media sites more generally, when what we see is increasingly filtered using algorithmic criteria that are largely opaque? The first question is addressed in <a href=\"http:\/\/www.pnas.org\/content\/111\/24\/8788.full\">a controversial study recently published in the journal PNAS<\/a>, the second we want to critically discuss in this blog post.<\/p>\n<p>The article <a href=\"http:\/\/www.pnas.org\/content\/111\/24\/8788.full\"><i>Experimental evidence of massive-scale emotional contagion through social networks<\/i><\/a> by Adam D. Kramer (Facebook), Jamie E. Guillory (University of California) and Jeffrey T. Hancock (Cornell University) recently provoked some very strong reactions both on international news sites and among scholars and bloggers (e.g. 
<a href=\"http:\/\/www.theatlantic.com\/technology\/archive\/2014\/06\/everything-we-know-about-facebooks-secret-mood-manipulation-experiment\/373648\/\"><i>The Atlantic<\/i><\/a><i>, <\/i><a href=\"http:\/\/www.forbes.com\/sites\/kashmirhill\/2014\/06\/28\/facebook-manipulated-689003-users-emotions-for-science\/\"><i>Forbes<\/i><\/a><i>, <\/i><a href=\"http:\/\/venturebeat.com\/2014\/06\/28\/facebook-secretly-experimented-with-the-moods-of-700000-of-its-users\/\"><i>Venture Beat<\/i><\/a><i>, <\/i><a href=\"http:\/\/www.independent.co.uk\/life-style\/gadgets-and-tech\/facebook-manipulated-users-moods-in-secret-experiment-9571004.html\"><i>The Independent<\/i><\/a><i>, <\/i><a href=\"http:\/\/www.nytimes.com\/2014\/06\/30\/technology\/facebook-tinkers-with-users-emotions-in-news-feed-experiment-stirring-outcry.html\"><i>The New York Times<\/i><\/a>; <a href=\"http:\/\/laboratorium.net\/archive\/2014\/06\/28\/as_flies_to_wanton_boys\">James Grimmelman<\/a>, <a href=\"http:\/\/psychcentral.com\/blog\/archives\/2014\/06\/23\/emotional-contagion-on-facebook-more-like-bad-research-methods\/\">John Grohol<\/a>, <a href=\"http:\/\/www.talyarkoni.org\/blog\/2014\/06\/28\/in-defense-of-facebook\/\">Tal Yarkoni<\/a>, <a href=\"https:\/\/medium.com\/message\/engineering-the-public-289c91390225\">Zeynep Tufekci<\/a>, <a href=\"http:\/\/www.thefacultylounge.org\/2014\/06\/how-an-irb-could-have-legitimately-approved-the-facebook-experimentand-why-that-may-be-a-good-thing.html\">Michelle N. Meyer<\/a>, <a href=\"http:\/\/thomasleeper.com\/2014\/06\/facebook-ethics\/\">Thomas J. 
Leeper<\/a>, <a href=\"http:\/\/www.brianckeegan.com\/2014\/06\/the-beneficence-of-mobs-a-facebook-apologia\/\">Brian Keegan<\/a>, <a href=\"http:\/\/www.sciencebasedmedicine.org\/did-facebook-and-pnas-violate-human-research-protections-in-an-unethical-experiment\/?utm_source=rss&amp;utm_medium=rss&amp;utm_campaign=did-facebook-and-pnas-violate-human-research-protections-in-an-unethical-experiment\">David Gorski<\/a>). <i>The New York Times<\/i>\u2019 Vindu Goel surmises that \u201cto Facebook, we are all lab rats\u201d and <i>The Atlantic<\/i>\u2019s Robinson Meyer calls the study a \u201csecret mood manipulation experiment\u201d. Responses from scholars have been somewhat more mixed: several have noted that the research design and the magnitude of the experiment have been poorly represented by the media, while others argue that there has been a massive breach of research ethics. First author Adam D. Kramer has <a href=\"https:\/\/www.facebook.com\/akramer\/posts\/10152987150867796\">responded to the criticism with a Facebook post<\/a> in which he explains the team\u2019s aims and apologizes for the distress that the study has caused.<\/p>\n<p>So what is the issue? The paper tests the assumption that basic emotions, positive and negative, are contagious, i.e. that they spread from person to person by exposure. This has been tested for face-to-face communication in laboratory settings before, but not online. The authors studied roughly three million English language posts written by close to 700,000 users in January 2012. The researchers adjusted the Facebook News Feed of these users to randomly filter out specific posts with positive and negative emotion words the users would normally have been exposed to and then studied the emotional content of the subjects\u2019 posts in the following period. 
Kramer and colleagues stress that no content was <i>added<\/i> to anyone\u2019s News Feed, and that the percentage of posts filtered out in this way from the News Feed was very small. The basis for the filtering decision was <a href=\"http:\/\/www.liwc.net\/\">the LIWC software package<\/a>, developed by <a href=\"http:\/\/homepage.psy.utexas.edu\/HomePage\/Faculty\/Pennebaker\/Home2000\/JWPhome.htm\">James Pennebaker<\/a> and colleagues at the University of Texas, which is used to correlate physical well-being with word usage. LIWC\u2019s origins lie in clinical environments and originally the approach was tested using diaries and other very personal (and fairly wordy) genres, rather than short Facebook status updates, a potential methodological issue that <a href=\"http:\/\/psychcentral.com\/blog\/archives\/2014\/06\/23\/emotional-contagion-on-facebook-more-like-bad-research-methods\/\">John Grohol points out in his blog post<\/a>.<\/p>\n<p>What did Kramer and his co-authors discover? The study\u2019s central finding is that basic emotions are in fact contagious, though the influence the authors measured is relatively small. However, they note that given the large sample, the global effect is still important, and argue that emotional contagion had not previously been observed in a computer-mediated setting based purely on textual content. Psychologist <a href=\"http:\/\/www.talyarkoni.org\/blog\/2014\/06\/28\/in-defense-of-facebook\/\">Tal Yarkoni<\/a> has responded that given the small size of the observed effect, speaking of manipulation is really overblown &#8212; that similar \u2018nudges\u2019 are made in online platforms all the time without the knowledge or consent of users.<\/p>\n<p>So much for the results &#8212; it is rather the ethical aspects of the experiment that unsurprisingly provoked a strong response. 
The 689,003 users whose News Feeds were changed between January 11 and January 18, 2012 were not aware of their participation in the experiment and had no way of knowing how their News Feeds were adjusted. In their defense, Kramer and colleagues point out that the content omitted from the News Feed as part of the experiment was still available by going directly to the user\u2019s Wall (1), that the percentage of omitted content was very small (2) and that the content of the News Feed is generally the product of algorithmic filtering (3), rather than a verbatim reproduction of everything that your friends are posting &#8212; in other words, that they merely added a filter to the News Feed and conducted <a href=\"http:\/\/en.wikipedia.org\/wiki\/A\/B_testing\">an A\/B test<\/a> for the impact of the filtering. Furthermore, they stress that no content was examined manually, that is, read by a human researcher, but that all classification was achieved by LIWC automatically. This step was taken both to achieve the large scale of the study and to ensure that no breach of privacy took place. From <a href=\"http:\/\/www.pnas.org\/content\/111\/24\/8788.full\">the paper<\/a>:<\/p>\n<p><i>&#8220;LIWC was adapted to run on the Hadoop Map\/Reduce system (11) and in the News Feed filtering system, such that no text was seen by the researchers. 
As such, it was consistent with Facebook\u2019s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research&#8221;<\/i><\/p>\n<p>It is a subject of intense debate whether agreeing to the Facebook Terms of Service constitutes <b>informed consent to an experiment<\/b> in which the News Feed is manipulated in the described way &#8212; what seems certain is that this kind of research raises a whole host of questions, from the responsibility of institutional review boards (IRBs), which are charged with ensuring that academic research is ethically acceptable, to Facebook\u2019s right to conduct such research in the first place. As has been pointed out, internet companies change their algorithms and filtering mechanisms all the time without informing anyone about it, and legally there is no need for them to do so, but for many commentators a line has apparently been crossed, from optimizing a product to studying and influencing behavior without seeking consent.<\/p>\n<p>While the study has provoked strong reactions, it is worth pointing out that this is not the first time that Facebook has filtered users\u2019 News Feeds for research purposes without acquiring prior consent. In <a href=\"http:\/\/dl.acm.org\/citation.cfm?id=2187907\">a 2012 study on information diffusion<\/a>, Facebook researchers Eytan Bakshy, Itamar Rosenn, Cameron Marlow, and Lada Adamic found that novel information in online platforms mainly propagates through weak ties, in other words that news travels through groups of relatively informal acquaintances. To study this effect in more detail than had previously been possible, the researchers randomly blocked some status updates from the News Feeds of a pool of some 250 million users, many more than in the emotion contagion experiment.<\/p>\n<p>While the blame has focused on Facebook, it is by no means the only company that performs such experiments. 
A\/B testing is a standard practice among internet companies to improve their products. <a href=\"https:\/\/support.google.com\/analytics\/answer\/1745147?hl=en\">Google provides a set of tools to conduct A\/B tests<\/a> for website optimization, as <a href=\"https:\/\/developer.amazon.com\/sdk\/ab-testing.html\">does Amazon<\/a>. Beyond A\/B testing to improve the quality of search results, issues become yet more complicated when experiments around information exposure are conducted with social improvement in mind and without explicit consent. In another recent experiment, researchers at Microsoft <a href=\"http:\/\/ssc.sagepub.com\/content\/32\/2\/145\">changed search engine results in order to promote civil discourse<\/a>. In the study in question, the authors modified the search results that were displayed when users entered specific political search queries, so that users entering the query <i>obamacare<\/i> would be exposed to both liberal and conservative sources, rather than just to content biased in one ideological direction. In light of the discrepancy between the ethical standards of academic research on human subjects and the entirely different requirements of building and optimizing social media platforms and search engines, it is tempting, but simplistic, to single out Facebook for filtering content algorithmically. But the public outcry underlines that there is increasingly <b>an expectation of more transparency regarding how content is filtered and presented<\/b>, rather than a \u2018take it or leave it\u2019 attitude. 
Social media platforms cater to consumers, not to citizens, but they increasingly carry a responsibility for vital information that influences people\u2019s decisions, and for the transparency of the mechanisms used to filter that information.<\/p>\n<p>What the storm of criticism in response to the research clearly shows is the implicit expectation of users, scholars and the media alike that <b>what we see on Facebook should be a reflection of what our friends and acquaintances are actually saying, rather than an opaquely curated experience<\/b>, even when the curation is to our supposed benefit. The paper\u2019s editor, Susan Fiske (Princeton), noted the complexity of the situation in a response to <i>The Atlantic<\/i>, pointing out that the institutional review boards of the authors\u2019 institutions approved the research, and arguing that Facebook could not be held to the same standards as academic institutions. Kramer and colleagues clearly saw their experiment as in line with Facebook\u2019s continued efforts to optimize the News Feed. Again, from <a href=\"http:\/\/www.pnas.org\/content\/111\/24\/8788.full\">the paper<\/a>:<\/p>\n<p><i>&#8220;In Facebook, people frequently express emotions, which are later seen by their friends via Facebook\u2019s \u201cNews Feed\u201d product &#8230; Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging. One such test is reported in this study: A test of whether posts with emotional content are more engaging.&#8221;<\/i><\/p>\n<p>This characterization clarifies that the approach taken in the study is quite consistent with Facebook&#8217;s goal of producing a more engaging user experience. 
Increasing engagement is also consistent with the goal of achieving higher click-through rates for targeted advertisements, which is in the legitimate interest of any social media platform. What is less clear is whether the same approach is also consistent with social science research ethics and with the public\u2019s moral expectations towards companies with virtually unlimited access to their data. There is justifiably the expectation that we are all subject to the same basic type of filtering and that Facebook should be open about how it determines the content of the News Feed. The deterministic argument made by some that users should simply accept that platform providers can run their service as they please seems short-sighted in light of these expectations, which have grown just as the reach of these platforms has grown.<\/p>\n<p>In the following we summarize and discuss five arguments that have been made in defense of Facebook\u2019s (and other companies\u2019) approach to this type of experimental research.<\/p>\n<h3>1. No manipulation has occurred because the researchers did not insert messages, but just filtered existing ones, and the effects were minimal<\/h3>\n<p>It has been argued that because Facebook did not insert emotional messages into the News Feed, but only hid certain posts for certain users, the experiment does not constitute manipulation. However, according to some scholars who have worked on persuasion (<a href=\"http:\/\/www.tue.nl\/en\/publication\/ep\/p\/d\/ep-uid\/271479\/\">Smids<\/a>, <a href=\"http:\/\/link.springer.com\/article\/10.1007\/s11948-011-9278-y\">Spahn<\/a>), if persuasion does not happen voluntarily and if the persuader does not reveal their intentions before the persuading act takes place, this is to be considered manipulative. 
<a href=\"http:\/\/dl.acm.org\/citation.cfm?id=301410\">Others<\/a> argue that involuntary persuasion is acceptable only if there is a very significant benefit for society that would outweigh possible harms. In the case of Facebook, it is difficult to justify the action, as it was not voluntary and the benefits hardly seem to outweigh the harms. The lack of transparency towards the participants is likely to weigh more strongly in the eyes of most users than the small size of the effect and the details of how the filtering was conducted.<\/p>\n<h3>2. The News Feed is the result of algorithmic filtering anyway<\/h3>\n<p>Another claim is that the News Feed is constantly being adjusted and improved as a result of countless ongoing experiments. For instance, Facebook already personalizes various aspects of the platform in order to keep the experience interesting so that users are more likely to return to the site frequently. Since Facebook use is always an experiment, it is argued, this is not a special case and the complaint is therefore unjustified. However, Facebook <a href=\"http:\/\/www.thefilterbubble.com\/\">has previously been criticized<\/a> for tailoring its News Feed without properly explaining the criteria, and this criticism is unlikely to just disappear. While personalization is an instrument to counter information overload, <a href=\"http:\/\/socialmediacollective.org\/2014\/06\/26\/corrupt-personalization\/\">corrupt personalization<\/a> seems an increasingly relevant issue. An average Facebook post reaches only <a href=\"http:\/\/techcrunch.com\/2012\/02\/29\/facebook-post-reach-16-friends\/\">12% of a user\u2019s followers<\/a>. As of January 2014, community and organization pages cannot reach their subscribers through regular status updates; they are instead <a href=\"http:\/\/techcrunch.com\/2014\/04\/03\/the-filtered-feed-problem\/\">forced to use ad campaigns<\/a>. 
Tailoring content in a user\u2019s best interest and then making a profit from this service is one thing; prioritizing commercial content over regular content is another (<a href=\"https:\/\/medium.com\/@lydialaurenson\/google-censorship-and-salesmanship-the-epic-smackdown-of-rapgenius-b0c49f6853ca\">Google has also been accused of this<\/a>). If Facebook makes a profit, but also aims to serve the public\u2019s best interest, then its algorithms must be transparent enough for the public to judge them on that.<\/p>\n<h3>3. Social media companies are not bound to the same standards as publicly-funded research<\/h3>\n<p>Another widely-held argument is that private companies can make changes to their services as they see fit. End users have generally accepted the Terms of Service and should accordingly accept the consequences. However, Facebook, like Google, is not just any Internet company. It has reached a dominant market position in which there is barely any competition, and it in many respects acts as a public service that many people completely rely on. Public goods are important for society and democracy, and are therefore often regulated.<\/p>\n<p>Another argument along these lines is that social media companies are under constant pressure to improve their products and such experiments are the most efficient way of achieving this. However, there are inherent risks associated with systematic user profiling. Different people react differently to strategic persuasion, and companies are therefore likely to devise <a href=\"http:\/\/hbr.org\/2001\/10\/harnessing-the-science-of-persuasion\/ar\/1\">different persuasion strategies<\/a>. These include authority (users value the opinion of an expert), consensus (users do as others do), and liking (users say yes to people they like). Different strategies can be applied to different persuasion profiles. 
Depending on a user\u2019s susceptibility to a particular strategy, the system can be tailored to achieve persuasiveness.<\/p>\n<p>Once a platform provider knows which persuasion strategy works for a particular user, such a persuasion profile can be sold to third parties or used for other purposes, such as political advertisement. <a href=\"http:\/\/www.ncbi.nlm.nih.gov\/pubmed\/22972300\">In a 61 million user experiment<\/a> in 2010, Facebook users were shown messages at the top of their News Feeds that encouraged them to vote, pointed to nearby polling places, offered a place to click \u201cI Voted\u201d and displayed images of select friends who had already voted (the \u201csocial message\u201d). The data suggest that the Facebook social message increased turnout by about 340,000 votes. Recently <a href=\"http:\/\/www.newrepublic.com\/article\/117878\/information-fiduciary-solution-facebook-digital-gerrymandering\">Jonathan Zittrain argued<\/a> that, if Facebook can persuade users to vote, it can also persuade them to vote for a certain candidate.<\/p>\n<h3>4. Experiments are constantly performed by social media companies<\/h3>\n<p>Some claim that online experiments should be accepted as a fact of life, since every social media company conducts them. However, just because this is how it is does not mean this is how it should be. In <a href=\"http:\/\/ebooks.iospress.nl\/publication\/29860\">a paper on nanotechnologies<\/a>, Ibo van de Poel lists a number of criteria that must be fulfilled in order to justify a social experiment: A social experiment is only acceptable when (1) there is an absence of alternatives, (2) the experiment is controllable, (3) users give their informed consent, (4) the hazards and benefits are proportional, (5) the experiment is approved by democratically legitimized bodies, (6) subjects can influence the set-up, carrying out and stopping of the experiment if needed, and (7) vulnerable subjects are either not involved or protected. 
Clearly many online intermediaries do not adhere to most of these principles. In the case of Facebook it is clear that the company has done this testing for its own purposes, but as we have mentioned, it can also be done with <a href=\"http:\/\/ssc.sagepub.com\/content\/32\/2\/145\">the intention of improving society<\/a>, with implications that are no less problematic. It follows that all actors involved need to jointly discuss and devise criteria for the ethics of online experiments and big data research using human subjects in accordance with existing guidelines.<\/p>\n<h3>5. Criticism will lead to less open publication of industry research results<\/h3>\n<p>There is a very real danger that the wave of public outrage (and in some cases very personal attacks on the authors of the study) will lead to less cooperation between industry and academia. This may ensure that researchers at publicly funded institutions do not participate in research with potentially questionable aims, but it also has a number of problematic consequences. If social media industry research becomes a complete black box, ethical standards in industry research are likely to suffer, rather than improve. There is a potential for real cross-fertilization between the two, but it lies in <b>industry research becoming more like academia<\/b>, rather than the other way around. Perhaps this is illusory, but it seems certain that industry research will grow in coming years, as the user bases of social media platforms and other online services continue to grow.<\/p>\n<p>Data science must follow stricter standards &#8212; both methodologically and ethically &#8212; to deliver on its many promises. Laboratories, regardless of their size, are governed by rules ensuring that the research conducted in them is not just legal, but also ethical. We need to start devising similar rules for online research as well. 
Hiding behind the ToS will not do.<\/p>\n<hr \/>\n<p><em>Image: <a href=\"https:\/\/www.flickr.com\/photos\/dullhunk\/5261568726\/in\/photolist-8pfreV-aUnJn8-4dThpv-4CiUM8-8TCUJ6-5BwY81-91WV5b-bJgLsk-e3z2Af-4S9B7p-8TFGvW-822bu2-93Coc9-9Y9rJo-9wSz2J-8TG37J-dzYfbV-7yysow-4aBKbu-8VH9YR-7VmLeK-dzJCiw-e1BbSx-66JvbF-8TCTBB-7k7oRm-dUnPaB-dcnNhi-2arT74-ar19LD-8TCqbP-3bMSAz-8em2UZ-74nZEq-5sJqaW-c39g6C-7Jph4H-9Y9tKy-abD2ku-dZxizs-ddWm8V-9yaonN-4x7MLg-97UDHf-gGiUEk-8TG2aq-3W6sxM-8e6g3c-8TCxvT-cduBpG\/lightbox\/\">Flickr<\/a>,\u00a0Paul Butler, Planet Facebook or Planet Earth?<\/em><\/p>\n<h4>Cornelius Puschmann is an associated researcher of the Alexander von Humboldt Institute for Internet and Society, Engin Bozdag is an HIIG Fellow. The post does not necessarily represent the view of the Institute itself. For more information about the topics of these articles and associated research projects, please contact <a href=\"mailto:presse@hiig.de\">presse@hiig.de<\/a>.<\/h4>","protected":false},"excerpt":{"rendered":"<p>by\u00a0Cornelius Puschmann and HIIG Fellow\u00a0Engin Bozdag How significant is the impact of what we read on Facebook on what we post there, specifically on our emotions? And to what extent can we trust what we read on social media sites more generally, when what we see is increasingly filtered using algorithmic criteria that are largely&hellip;<\/p>\n","protected":false},"author":61,"featured_media":18295,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[227,226],"tags":[],"class_list":["post-18290","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-everyday-life","category-knowledge"],"acf":[]}
On Facebook\u2019s emotional contagion experiment and user rights"}]},{"@type":"WebSite","@id":"https:\/\/www.hiig.de\/#website","url":"https:\/\/www.hiig.de\/","name":"HIIG","description":"Alexander von Humboldt Institute for Internet and Society","publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.hiig.de\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.hiig.de\/#organization","name":"HIIG","url":"https:\/\/www.hiig.de\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","width":320,"height":80,"caption":"HIIG"},"image":{"@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/4a89dbc4b47086d2dce5016f16146789","name":"Cornelius 
Puschmann"}]}},"_links":{"self":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/18290","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/users\/61"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/comments?post=18290"}],"version-history":[{"count":8,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/18290\/revisions"}],"predecessor-version":[{"id":78036,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/18290\/revisions\/78036"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media\/18295"}],"wp:attachment":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media?parent=18290"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/categories?post=18290"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/tags?post=18290"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}