News Poynter May 2026

Poynter: A critic gave five stars to an AI-generated novel without knowing it

Rachel Meghan, writing for Poynter on May 1, 2026, describes reviewing “Shy Girl” by Mia Ballard — a novel she praised on NetGalley and again in Rue Morgue magazine — and subsequently learning through a New York Times investigation that the book was 78% AI-generated. Hachette, the publisher, pulled the title after the story broke.

Meghan is candid about what makes the experience worth examining: she describes herself as “an avid AI hater” who failed to catch the AI origin of the text in a professional critical read. She does not treat this as evidence that AI writing is undetectable; instead, she examines what the conditions of current book reviewing make visible and what they don’t. Review copies arrive in high volumes, critics work under time pressure, and AI detection tools applied to long-form narrative have known false-positive and false-negative rates that make them unreliable for verification.

The broader pattern she documents involves AI-generated books entering the publishing pipeline as human-authored works, without disclosure. She cites examples of romance authors reportedly generating over 200 books annually — a volume that suggests production automation rather than individual authorship.

The piece asks questions the publishing industry has not yet answered: what disclosure obligations should authors and publishers have when AI is used substantially in writing, and what standards should critics apply in the absence of reliable verification? Hachette’s response — pulling the book only after public exposure — reflects an industry reacting to individual incidents rather than operating under a coherent framework.

For writers, editors, and content professionals, this case is a reference point for conversations about attribution, transparency, and what counts as disclosure of AI involvement in published work.