Artificial intelligence, explained in human terms
Technical systems that use so-called artificial intelligence are not only regarded as a new stage of technological evolution; through their interaction with social processes, they also drive far-reaching social change. The EU Commission therefore formulates the following requirement for the use of AI systems: AI must be "explainable", "interpretable" and "comprehensible". This makes the comprehensibility of AI for a broad civil society - and not merely its explainability for experts - a central necessity.
The project “Artificial Intelligence, explained in human terms” uses an interdisciplinary approach to develop explanatory models for artificial intelligence (AI) systems that are designed for direct interaction and cooperation with civil society. The research subject is therefore not confined to the technical level alone; its focus on civil society calls for a distinctly application-oriented and participatory research design.
The first workshop tackles how computers think and how we can explain this in human terms.
2 September 2022 – Alexander von Humboldt Institut für Internet und Gesellschaft