Pre-publication audits of AI-assisted content differ from standard editing in one key respect: the auditor must assume that every specific factual claim is potentially wrong until verified. This assumption — counterintuitive to editors trained to assume good faith in human writers — is necessary because AI hallucinations are indistinguishable from accurate claims in surface appearance.

The Pre-Publication AI Audit Process

1) Run a multi-engine fact-check on all specific claims (Omniscient AI or manual equivalent).
2) Verify that all statistics link to primary source documents.
3) Confirm that all named quotes are in the public record.
4) Check all institutional names against current records.
5) Check the article's factual claims against each other — contradictions between claims are a hallucination indicator.
6) Read the article for perspective balance; AI frequently overweights the dominant viewpoint.
7) Verify the AI disclosure label is accurate and complete.
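For teams that track audits across many articles, the checklist above can be kept as a simple pass/fail structure. The sketch below is purely illustrative — the step names, the `ArticleAudit` class, and its methods are hypothetical, not part of any real auditing tool:

```python
# Hypothetical sketch: the seven audit steps as a pass/fail tracker.
# All names here (AUDIT_STEPS, ArticleAudit) are illustrative assumptions.
from dataclasses import dataclass, field

AUDIT_STEPS = [
    "multi_engine_fact_check",        # step 1
    "statistics_link_to_sources",     # step 2
    "quotes_in_public_record",        # step 3
    "institutional_names_current",    # step 4
    "internal_consistency",           # step 5
    "perspective_balance",            # step 6
    "ai_disclosure_label",            # step 7
]

@dataclass
class ArticleAudit:
    title: str
    results: dict = field(default_factory=dict)  # step name -> bool

    def record(self, step: str, passed: bool) -> None:
        if step not in AUDIT_STEPS:
            raise ValueError(f"unknown audit step: {step}")
        self.results[step] = passed

    def ready_to_publish(self) -> bool:
        # Publishable only when every step has been run and passed.
        return all(self.results.get(s) is True for s in AUDIT_STEPS)

audit = ArticleAudit(title="Example article")
audit.record("multi_engine_fact_check", passed=True)
print(audit.ready_to_publish())  # False: six steps still unverified
```

The deliberate design choice is that `ready_to_publish` treats an unrecorded step the same as a failed one, matching the section's premise that every claim is potentially wrong until verified.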