Queer Perspectives on Automated Facial Analysis Technology (Queer AI)
Research group
The Queer AI research group examines the risks and opportunities of AI-driven automated facial analysis (FA) technology from a queer perspective.
Automated facial analysis (FA) technologies - such as facial detection and facial recognition - stand out as central in discussions about the impact of Artificial Intelligence (AI) on human beings. AI is intertwined with both the most mundane and the most critical aspects of human life, and image detection and classification represent a pertinent domain where human identity and computation are tightly coupled. Previous research on automatic gender recognition, the classification of gender by FA technologies, has raised concerns about racial and gender bias (e.g. Scheuerman et al. 2019). Research on FA from queer perspectives is still limited, but findings from studies focusing on queer identities and perspectives point towards potential risks. For example, Scheuerman et al. (2019) found that FA services performed consistently worse on transgender individuals and were universally unable to classify non-binary genders.
Publications:
Danielsson, K., Tubella, A. A., Liliequist, E., & Cocq, C. (2023). Queer Eye on AI: binary systems versus fluid identities. In Handbook of Critical Studies of Artificial Intelligence (pp. 595-606). Edward Elgar Publishing.
Liliequist, E., Tubella, A. A., Danielsson, K., & Cocq, C. (2023). Beyond the Binary - Queering AI for an Inclusive Future. interactions, 30(3), 31-33.