A now-removed set of locally circulated social media posts, featuring AI-altered images that violated at least one platform community guideline, got me wondering how much of the content we're served on social platforms deserves a harsher eye than most of us give it.
In 2025, almost one in five social media posts contained AI-generated or AI-altered images, according to the Reuters Institute.
The images in question looked realistic enough. Trees and greenery had been digitally removed, and vehicle lanes that would never exist had been added. Existing infrastructure was modified to make a space feel more closed off and congested than it actually is. The tell-tale Google Gemini "star" in the bottom right of the images didn't draw a single comment. None of the changes were glaring, and that's the point.
The goal of subtle image alteration isn’t to fool someone who is looking closely, it’s to move the emotional needle before looking closely ever crosses anyone’s mind. Research shows that visuals are incredibly effective at triggering both emotion and a sense of credibility simultaneously, and this is exactly what is being weaponized to get people to feel something before they think critically about what they are looking at.
Here’s the thing that I keep coming back to, though. My kids do those “spot the differences” activities in Highlights Magazine every month. In the younger version, it’s two drawings. In the teen version, it’s two photographs side by side. They are genuinely good at it and treat it as a fun activity. It turns out the difficulty that even adults have with what is effectively a children’s game has been studied extensively under a phenomenon researchers call “change blindness,” or our surprisingly poor ability to detect visual changes that, in theory, we should catch easily.
Adults generally outperform children in controlled detection tasks studying change blindness, but most adults significantly overestimate their own ability to catch these changes. The gap between our confidence and our actual detection rate is exactly where subtle image manipulation is designed to live. Perhaps we should all treat the Highlights Magazine exercise as monthly practice for navigating an environment that has decided altered images are a legitimate tool for making a point.
But whose responsibility is all of this, really? In an ideal world, it sits squarely with the person distributing the content. Fact check before you post. Don’t intentionally share altered images to support your position. That’s the right standard and it’s worth drawing this line in the sand. But the uncomfortable reality is that this responsibility almost always migrates to the person scrolling through Facebook or Nextdoor on a weeknight, catching up between making dinner and getting kids to bed, who is absolutely not approaching their community feed as an adversarial environment requiring verification workflows. So, here we are.
There are some habits worth building. I write about this in my upcoming third book “Raising Digitally Resilient Kids in the AI Era” and it comes down to building muscle around these three skills: pause, verify, then react.
Running a reverse image search through Google Images or TinEye takes less than a minute and can surface whether an image has appeared elsewhere in a different context or in an earlier, unaltered form. A strong emotional reaction — particularly anger or urgency — is worth treating as a signal to slow down rather than share or react. That reaction isn’t incidental; it is exactly the outcome for which the content was engineered. If an image shows a place you’ve actually been to, ask yourself whether what you’re seeing matches your own experience of that location. And if you look more closely, lighting inconsistencies around the edges of things that may have been added or removed are often visible, even without any special tools.
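The value of finding an earlier, unaltered version of an image is that comparison then becomes mechanical: the subtle edits that slip past human change blindness are trivial for software to flag. As a toy illustration (not any real forensic tool), here is a sketch that compares two hypothetical grayscale pixel grids and reports where they differ; real photos would be loaded with an imaging library such as Pillow, but the idea is the same.

```python
# Toy "spot the differences" detector: a sketch of why subtle edits are
# easy for software to flag even when human eyes miss them entirely.
# The "images" here are hypothetical 2D grids of grayscale values (0-255).

def diff_regions(original, altered, threshold=10):
    """Return (row, col) coordinates where the two grids differ by more
    than `threshold` -- candidate spots for manipulation."""
    changed = []
    for r, (row_a, row_b) in enumerate(zip(original, altered)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                changed.append((r, c))
    return changed

# A 4x4 "photo" and a copy with one subtly brightened pixel.
original = [[100, 100, 100, 100],
            [100, 120, 120, 100],
            [100, 120, 120, 100],
            [100, 100, 100, 100]]
altered = [row[:] for row in original]
altered[1][2] += 40  # a change small enough to slip past a quick glance

print(diff_regions(original, altered))  # the single edited spot: [(1, 2)]
```

The catch, of course, is that this only works when an unaltered reference exists, which is exactly what a reverse image search can surface.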
What I keep thinking about, however, is the longer-term damage. Local online platforms are where real people in our communities form real opinions about planning decisions, school policy, street design, elections, businesses, products and the people we pass walking down our streets. When altered images circulate in those spaces, particularly on issues where the community is already divided, the harm extends well beyond whoever was fooled by a specific post. It accumulates, chipping away at the shared visual reality that neighbors need in order to agree or disagree with each other in good faith. It makes every image slightly more suspect and every conversation a little more exhausting.
What matters more than any single post is what communities choose to do with the information they're given. Altered images circulate because they work, and they will keep working until enough people build the habit of pausing to think and verify before reacting. That is a small investment for each of us to make in our community.
