Data Doctors: Why Google Photos search feels broken

Q: Why is Google Photos AI search so bad and can I turn it off?

A: If searching your photos — and especially your videos — feels worse than it used to, you’re not imagining it. Many users can no longer find pictures or clips they know are there, even though the same searches worked perfectly fine in the past.

The issue isn’t that your content disappeared; it’s that Google Photos changed how its search works — and not in a way that always matches how we remember things.

From keywords to confidence scores

Google Photos used to behave like a filing system. If a photo had location data, dates or simple tags, search would show it — even if the match wasn’t perfect.

Now, search is driven by artificial intelligence that tries to interpret what’s actually in your photos and videos. It looks for faces, objects, scenes, text and activities, then assigns an internal confidence score. Only photos and videos that clear its confidence threshold appear in search results.

That’s the key change, and the main source of frustration for users.

Why simple searches work better than specific ones

When you search for one word, such as “dolphin,” Google only has to answer a single question: Is there a dolphin here? That’s often easy, so results appear.

But when you search “dolphins in Greece,” Google treats that as one combined idea. Now it must be confident about several things at once: that there are dolphins, possibly more than one, and that the photo or video was taken in Greece. If any part of that is uncertain (weak GPS data, offshore shots or older images), the result gets filtered out.

Nothing is missing. The AI just isn’t confident enough to show it.
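Conceptually, the filtering described above behaves something like the sketch below. Everything here is hypothetical, including the threshold value and the score names; Google has not published how its scoring actually works. The point is only that a multi-concept query requires every concept to clear the bar, so one weak score hides the whole result.

```python
# Hypothetical sketch of confidence-threshold search (not Google's code).
# Each item carries AI-assigned confidence scores per concept, from 0.0 to 1.0.
THRESHOLD = 0.8  # assumed cutoff; the real value is not public

photos = [
    {"id": "IMG_001", "scores": {"dolphin": 0.95, "greece": 0.60}},  # weak GPS
    {"id": "IMG_002", "scores": {"dolphin": 0.90, "greece": 0.85}},
]

def matches(photo, concepts, threshold=THRESHOLD):
    # A combined query only matches if EVERY concept clears the threshold.
    return all(photo["scores"].get(c, 0.0) >= threshold for c in concepts)

print([p["id"] for p in photos if matches(p, ["dolphin"])])
# → ['IMG_001', 'IMG_002']  (single concept: both pass)
print([p["id"] for p in photos if matches(p, ["dolphin", "greece"])])
# → ['IMG_002']  (combined concept: the uncertain-location shot vanishes)
```

Note that IMG_001 isn’t gone; it simply fails the combined test, which is exactly the “nothing is missing, the AI just isn’t confident” behavior users run into.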

Why videos are even worse

Videos suffer the most under this system. A photo is judged from one moment. A video contains hundreds or thousands of frames, and Google has to decide what the video is about.

If a dolphin appears briefly in a long ocean clip, the AI may decide it’s an “ocean video,” not a “dolphin video.” If a sunset only dominates part of the clip, it may not count as a sunset at all. If the main subject isn’t obvious across most frames, the video often won’t show up in a search.

That’s why scrolling by date and scrubbing through videos still works — search hides uncertainty, scrolling shows everything.
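One plausible way to picture the video problem is per-frame labeling followed by a dominance rule. The sketch below is an illustration under assumed numbers, not Google’s actual pipeline: if a concept only covers a small fraction of the frames, it never becomes a tag for the whole clip.

```python
# Hypothetical sketch: a video gets tagged by what dominates its frames,
# so a brief dolphin appearance in a long ocean clip is averaged away.
from collections import Counter

frame_labels = ["ocean"] * 950 + ["dolphin"] * 50  # dolphin in 5% of frames
DOMINANCE = 0.5  # assumed: a concept must cover half the clip to tag it

counts = Counter(frame_labels)
total = len(frame_labels)
video_tags = {label for label, n in counts.items() if n / total >= DOMINANCE}

print(video_tags)               # → {'ocean'}
print("dolphin" in video_tags)  # → False: the clip won't surface for "dolphin"
```

Scrolling by date sidesteps this entirely because it lists every clip regardless of tags, which is why manual scrubbing still finds what search hides.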

There’s no way to go back, but you can make searching better

Many people look for a setting to turn the AI off. In most versions of Google Photos, there isn’t one anymore. AI-based search is now baked into the core of the app. The old keyword-driven system is gone.

So instead of changing the app, you have to change how you search in it.

Start broad, then narrow:

  • Search for the strongest visual concept (“dolphin,” “sunset,” “receipt”)
  • Then, narrow by date or suggested location
  • Avoid stacking multiple ideas into one search

For anything that really matters:

  • Add a short caption (for example “dolphins in Greece”)
  • Put photos and videos into albums with descriptions
  • Mark key items as favorites

Captions and albums don’t rely on AI confidence — they give the system anchors it won’t second-guess.


© 2026 WTOP. All Rights Reserved. This website is not intended for users located within the European Economic Area.
