ChatGPT

Negative parallelism examples

Mark Into
April 23, 2026
AI use

You asked for examples of negative parallelism from before 2020, trying to find an article that uses the construction but was not written by AI.

ChatGPT responded:

Negative parallelism is older than generative AI and appears constantly in pre-2020 human writing. People usually mean patterns like "not X, but Y" or "neither A nor B."

Examples discussed included:

The New Yorker (2014): "Not because we felt so alone in our grief, but because we were so together in it."

The New Yorker (2018): "not because its material was too sensitive for students ... but because it was too insensitive."

The Washington Post (2018): "not because of what the Supreme Court nominee did back then but because of what he says about it now."

You asked which reads more like AI.

ChatGPT said the cleanest "not because X, but because Y" constructions read most AI-like to modern ears because they are symmetrical, polished, and easy for models to reproduce.

You then asked for something sycophantic, servant-like, or with a positive Y clause.

ChatGPT suggested:

"Not because it sounds so hokey, but because, deep down, it is something all of us already know."

It also searched for warmer or more noun-heavy examples, suggesting passages built on abstract nouns such as truth, narrative, facts, and story, as well as examples from older Atlantic writing.
