CARE-Kids

CARE-Kids tackles this challenge by empowering children (aged 8 to 12) to think critically about AI-generated content, educating them to recognise stereotypes in these images, and helping them become aware of their beliefs about themselves and others. CARE-Kids aims to deconstruct stereotypes perpetuated by biased AI algorithms (and society) by fostering children’s active engagement to mitigate their impact on children’s lives. We will develop a web app that enables children to interact with AI-generated content, enhancing their critical thinking and awareness of gender biases. This project has been funded by the INCLUDE+ network.
Check out the CARE-Kids website for more info.

Here are some publications (more to come :)):

  • Gail Collyer-Hoar, Elisa Rubegni, Bernard Tomczyk, Alexander Baines, and Lidia Gruia. 2025. “Suits as Masculine and Flowers as Feminine”: Investigating Gender Expression in AI-Generated Imagery. https://doi.org/10.1145/3715336.3735749
  • Gail Collyer-Hoar, Aurora Castellani, Lala Guluzade, Ben Tomczyk, Hania Bilal, and Elisa Rubegni. 2025. Experts Unite, Kids Delight: Co-Designing an Inclusive AI Literacy Educational Tool for Children. In Proceedings of the 24th Annual ACM Interaction Design and Children Conference (IDC ’25). https://doi.org/10.1145/3713043.3731495
  • Alexander Baines, Lidia Gruia, Gail Collyer-Hoar, and Elisa Rubegni. 2024. Playgrounds and Prejudices: Exploring Biases in Generative AI For Children. In Proceedings of the 23rd Annual ACM Interaction Design and Children Conference (IDC ’24). Association for Computing Machinery, New York, NY, USA, 839–843. https://doi.org/10.1145/3628516.3659404