Poynter: What we learned from a failed Nota News experiment
Nota launched eleven hyperlocal news sites that used AI-assisted production tools to turn public records and government information into local stories. The model relied on contractors working within defined source guidelines; AI tools would then help structure and publish their content. The experiment failed publicly when Axios Richmond and Poynter found more than seventy stories that had been lifted from other local outlets rather than derived from original sources.
Josh Brandau, Nota’s CEO, wrote this postmortem for Poynter in April 2026. The piece is notable for being specific about what went wrong rather than offering a generic defense of AI-assisted journalism. The core problem was not the technology: contractors deviated from the approved source workflow and copy-pasted content from competing outlets, stripping attribution in the process. Brandau acknowledges this directly: the failure was one of oversight and enforcement, not of AI capability.
The lessons he draws from the incident are operational. Workflows have to make the right thing easier and the wrong thing harder. Citation enforcement needs to be built into the production system rather than enforced through after-the-fact review. When contractors do not understand why the guidelines exist, they are more likely to treat them as optional. The article describes the institutional memory gap that emerges when editorial standards are not documented in a way that transfers to people working outside the core team.
Brandau also addresses the accountability structure. Human judgment is not a safeguard that can be assumed; it has to be designed for. The specific behaviors that define acceptable sourcing need to be spelled out and checked, not left to individual discretion under production pressure.
The piece is most useful to editors and editorial directors at digital publications that are building or scaling AI-assisted production. It describes a failure mode that emerges specifically when production volume increases while oversight does not scale alongside it. The scenario of contractors under deadline pressure taking shortcuts the workflow did not prevent is not unique to AI-assisted journalism; it applies to any content operation that relies on distributed contributors without adequate enforcement mechanisms.