How Parents and Kids Really Feel About AI-Generated Images in Children’s Books

A new study finds that while some parents are opposed to the use of AI-generated images in children’s stories, most are willing to accept these images if the text is human-authored and the images have been reviewed by educators, librarians or other experts. However, depending on the nature of the stories, parents and children did raise concerns about errors in the illustrations that might encourage unsafe behavior or lack real-world accuracy.

“We know that generative AI tools are being used to create illustrations for children’s stories, but there has been very little work on how parents and children feel about these AI-generated images,” says Qiao Jin, first author of a paper on the work and an assistant professor of computer science at North Carolina State University. “We wanted to explore how and whether AI-generated images affected the experience of reading stories for parents and kids.”

For this study, researchers worked with 13 parent-child groups, with each group consisting of one child (aged 4-8) and at least one parent.

Each parent-child group read two out of three stories presented by researchers, adhering as closely as possible to their normal story-reading routine. All of the stories included AI-generated art, human art augmented using AI, and art made entirely by people.

After reading each story, children were asked to rate how much they liked the experience of reading the story, and the related images, using age-appropriate language. Parents were interviewed at greater length about the images, with the goal of capturing their preferences and concerns regarding images that accompany stories in general, and about these stories in particular.

“We found children were more sensitive than their parents to the emotional content of the illustrations and were more likely to notice any disconnect between the emotions being conveyed by the images and the emotions being conveyed by the text,” Jin says. “These disconnects stemmed from problems AI has with interpreting emotional cues from the stories.”

The concerns of parents and children often varied, depending on the nature of the stories they were reading.

“For example, parents and older children were more concerned about real-world accuracy if the stories were realistic or about science, rather than fables,” Jin says. “Older children noticed when AI images contained size or behavior errors, while parents were particularly concerned about errors that may encourage unsafe behavior.”

While most parents were open to the use of AI-generated images if the images were screened by people with expertise in children’s literature, some had fundamental objections to using AI to replace human artists, or disliked the “artificial” look of AI-generated images. Most parents were not comfortable with the idea of AI being used to generate the text of stories.

The researchers also experimented with placing small labels under each image to note whether the image was AI-generated. Most parents and children neither noticed nor used the labels, and several reported that the labels were distracting while reading.

“Parents preferred a clear notification on the cover of the story making clear whether AI had been used to create a story’s images so they could make an informed decision about whether to purchase the book,” Jin says. “However, they were more concerned about whether the story was authored by a human.

“Our findings highlight the importance of three things,” Jin says. “First, parents want a simple cover label, not page-level flags, stating whether AI was used to illustrate the book, so they can decide whether to purchase it. Second, certain types of errors in AI-generated images can pose problems for parents and children, depending on the nature of the stories. And lastly, if a book uses AI to generate illustrations, it is important to bring in experts to screen the images to ensure they’re appropriate.”

The paper, “‘They all look mad with each other’: Understanding the Needs and Preferences of Children and Parents in AI-Generated Images for Stories,” is published in the International Journal of Child–Computer Interaction. Corresponding author of the paper is Irene Ye Yuan of McMaster University.

This work was done with support from the OpenAI Researcher Access Program.

“‘They all look mad with each other’: Understanding the Needs and Preferences of Children and Parents in AI-Generated Images for Stories”

Authors: Qiao Jin, North Carolina State University; Irene Ye Yuan, McMaster University

Published: Nov. 12, International Journal of Child–Computer Interaction

DOI: 10.1016/j.ijcci.2025.100787
