
How to identify bias in Natural Language Processing

Authors: Hewett, F., & Nenno, S.
Published in: Digital society blog
Year: 2021
Type: Other publications

Why do translation programmes and chatbots so often exhibit discriminatory tendencies regarding gender or race? Here is an accessible guide to how bias in natural language processing arises. We explain why sexist technologies such as search engines are not just an unfortunate coincidence.
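As a purely illustrative sketch (not taken from the blog post itself), the toy example below shows how such a bias can be made measurable: in a word-embedding model, profession words can sit closer to "he" than to "she", which a simple cosine-similarity check exposes. The vectors and word list here are hypothetical placeholders, not real model output.

import math

# Toy word vectors (hypothetical values, for illustration only).
# In practice these would come from a trained embedding model.
vectors = {
    "he":       [0.9, 0.1, 0.3],
    "she":      [0.1, 0.9, 0.3],
    "engineer": [0.8, 0.2, 0.4],
    "nurse":    [0.2, 0.8, 0.4],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# A profession word is "gender-skewed" if it is noticeably closer
# to one pronoun than to the other.
for word in ("engineer", "nurse"):
    gap = cosine(vectors[word], vectors["he"]) - cosine(vectors[word], vectors["she"])
    print(f"{word}: he-vs-she similarity gap = {gap:+.2f}")

A positive gap means the profession vector leans towards "he", a negative one towards "she"; established diagnostics such as the Word Embedding Association Test (WEAT) apply the same principle to larger word sets.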


Connected HIIG researchers

Sami Nenno

Researcher: AI & Society Lab

Freya Hewett

Researcher: AI & Society Lab
