Pius Camenzind

Trust in AI

Taking a human-centered approach to mitigate bias and discrimination in hiring.

What?

In the recruitment of job candidates, AI hiring tools are gaining popularity among employers. Besides saving resources, companies hope that using AI will reduce bias in the hiring process. Nevertheless, bias and discrimination remain a major problem in hiring, because AI systems can reinforce existing biases.


How?

The problem was addressed with a human-centered approach. Following qualitative research, an intervention concept was prototyped with the goal of mitigating the problem. Iterative user testing allowed the prototype to evolve into a goal-focused design intervention.


Why?

The concept gives users access to a platform that supports them throughout the hiring process. The core idea is to enable people to help people. The goal is to foster trust by allowing users to take transparency, a key factor for trust in AI, into their own hands. By sharing their experiences with one another, users get a much better idea of what to expect from different AI hiring tools and how to deal with them.


For Whom?

The intervention is designed for job applicants who don't trust AI hiring tools, or who don't feel competent enough to deal with them when applying.
