169 HD – AI decides what to see
19 May 2021 | doi: 10.5281/zenodo.4751739

Myth: AI algorithms decide what you see online

There’s more than one myth about algorithmic visibility regimes: one posits that AI algorithms are tools used unilaterally by corporations to control what we see; the other argues that these algorithms are mere mirrors, and that we are the ones who control what we see online.

Myth

AI algorithms decide what you see online.

Some say that AI algorithms decide what we see online; others argue that algorithms merely do what we tell them to. Neither idea is really accurate. What we see online results from relationships between many actors and things — users and algorithms, but also platforms, coders, data, interfaces and so on. The key is to understand the inequality that marks these relationships.
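This co-production can be made concrete with a deliberately simplified sketch. The function and all signal names and weights below are hypothetical — they do not describe any real platform’s system — but they show how a feed ranking emerges from platform-chosen priorities, user behaviour, and the data attached to each post, with the platform holding far more leverage than any single user:

```python
def visibility_score(post: dict, user_signals: dict, platform_weights: dict) -> float:
    """Score one post for one user's feed (toy model)."""
    score = 0.0
    for signal, weight in platform_weights.items():  # priorities set by the platform
        affinity = user_signals.get(signal, 0.0)     # inferred from the user's behaviour
        content = post.get(signal, 0.0)              # produced by coders and data pipelines
        score += weight * affinity * content
    return score

posts = [
    {"id": "a", "recency": 0.9, "engagement": 0.2},
    {"id": "b", "recency": 0.3, "engagement": 0.8},
]
user = {"recency": 0.5, "engagement": 1.0}      # this user tends to reward engagement
platform = {"recency": 1.0, "engagement": 2.0}  # the platform rewards it even more

feed = sorted(posts, key=lambda p: visibility_score(p, user, platform), reverse=True)
print([p["id"] for p in feed])  # → ['b', 'a']
```

Note that no single party "decides" the outcome: changing the platform’s weights, the user’s behaviour, or the post data all reorder the feed — but only the platform can change the weights for everyone at once, which is the inequality the paragraph above points to.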


Materials

Presentation
KEY LITERATURE

Bucher, T. (2018). IF…THEN: Algorithmic power and politics. Oxford: Oxford University Press.

Gillespie, T. (2013). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–193). Cambridge, MA: MIT Press.

Introna, L. D., & Nissenbaum, H. (2000). Shaping the web: Why the politics of search engines matters. The Information Society, 16(3), 1–17.
UNICORN IN THE FIELD

The Social Media Collective is ‘a network of social science and humanistic researchers’, funded by Microsoft but working on their own independent agendas. Much of what they do concerns the broad field of platforms’ algorithmic visibility, and often helps steer debates on the theme.

About the author


João Carlos Magalhães

Senior Researcher at the Alexander von Humboldt Institute for Internet and Society

Much of João’s work explores the political and moral ramifications of algorithmic media and technologies. At the HIIG, he heads an EU-funded project that is mapping out social media platforms’ governance structures, with a focus on copyright policies and automated filters. In 2020, he was awarded a fellowship from the Wikimedia Foundation to help create an open database of platforms’ policies.


Why, AI?

This post is part of our project “Why, AI?”, a learning space that helps you find out more about the myths and truths surrounding automation, algorithms, society and ourselves. It is continuously being filled with new contributions.

Explore all myths


This post represents the view of the author and does not necessarily represent the view of the institute itself. For more information about the topics of these articles and associated research projects, please contact info@hiig.de.


