Search engines find new content by crawling. A bot visits your site periodically, follows links, and adds what it discovers to the index. On a busy site that's fine. On most sites, that bot might not come back for days or weeks. Pages you published today might not show up in search results until next week, if you're lucky.
IndexNow fixes that. Instead of waiting for the crawler to show up, you push a notification to search engines the moment something changes. The engines pull the page themselves within hours.
How IndexNow Works
IndexNow is an open protocol co-developed by Microsoft and Yandex. The idea is simple:
- You generate a small text file (your key file) and place it at the root of your site.
- When you publish or update a page, you send a single HTTP POST to any IndexNow endpoint with the list of changed URLs.
- That endpoint shares the notification with all participating engines.
One POST reaches Bing, Yandex, DuckDuckGo, Naver, and others at once. You don't need to call each engine separately.
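The submission itself is a single JSON POST. A minimal sketch in Python, using only the standard library (the host, key, and URLs below are placeholders, and api.indexnow.org is one of several participating endpoints):

```python
import json
import urllib.request

def build_submission(host: str, key: str, urls: list[str]) -> dict:
    # keyLocation tells the engines where to fetch your key file
    # so they can verify you own the host.
    return {
        "host": host,
        "key": key,
        "keyLocation": f"https://{host}/{key}.txt",
        "urlList": urls,
    }

def submit(payload: dict, endpoint: str = "https://api.indexnow.org/indexnow") -> int:
    # One POST to any participating endpoint; it relays the
    # notification to the other engines.
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # 200 = submitted; 202 = accepted, key verification pending
        return resp.status
```

A batch of up to 10,000 URLs can go in one `urlList`, so even a large publishing day is a single request.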
Why It Matters
Faster indexing means faster traffic
If you run a news site, a job board, or anything where freshness matters, the difference between "indexed in hours" and "indexed in two weeks" is real traffic. A product page you pushed live today can appear in Bing results the same day.
Google uses it too
Google initially sat out IndexNow but began testing support in 2023. As of 2026, they process IndexNow submissions, though they still reserve the right to independently decide when to actually index a page. For Bing and Yandex the turnaround is consistently fast.
It reduces crawler load
Crawlers consume server resources. IndexNow replaces broad crawling with targeted fetches. Your server handles fewer wasted bot visits, and crawlers spend their budget on pages that actually changed.
What You Need to Use IndexNow
Three things:
- A key file — a .txt file at https://yourdomain.com/{key}.txt containing your key.
- A way to detect when pages change — typically by monitoring your sitemap.
- An HTTP client — to POST the changed URLs to an IndexNow endpoint.
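The key file is the easiest part. A sketch of generating one (the protocol allows keys of 8–128 characters drawn from a-z, A-Z, 0-9, and dashes; a 32-character hex string is a common choice):

```python
import secrets
from pathlib import Path

# Generate a random 32-character hexadecimal key.
key = secrets.token_hex(16)

# The file contains nothing but the key itself. It's written to the
# current directory here; deploy it to your site's web root so it is
# reachable at https://yourdomain.com/{key}.txt.
Path(f"{key}.txt").write_text(key)
```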
The protocol is simple enough to implement in an afternoon. The harder part is the detection layer: how do you know when a page was added or updated? How do you avoid re-submitting URLs that haven't changed? How do you handle retries when the endpoint is slow?
The Sitemap Approach
Most sites already have a sitemap listing every URL with a <lastmod> timestamp. That sitemap is the source of truth for what's changed. The workflow:
- Fetch the sitemap on a schedule.
- Diff it against the previous snapshot.
- Submit only the new or updated URLs to IndexNow.
- Store the new snapshot for next time.
Every page you publish or update gets submitted to search engines within minutes, without touching your publishing workflow.
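The diff step above can be sketched with the standard library's XML parser. This assumes a standard sitemap where each entry carries a lastmod timestamp; the previous snapshot could be stored as JSON between runs:

```python
import xml.etree.ElementTree as ET

# Sitemap XML places <url>, <loc>, and <lastmod> in this namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def parse_sitemap(xml_text: str) -> dict[str, str]:
    # Map each URL to its lastmod timestamp ("" if the tag is missing).
    root = ET.fromstring(xml_text)
    return {
        url.findtext("sm:loc", default="", namespaces=NS).strip():
            url.findtext("sm:lastmod", default="", namespaces=NS).strip()
        for url in root.findall("sm:url", NS)
    }

def changed_urls(current: dict[str, str], previous: dict[str, str]) -> list[str]:
    # New URLs, plus existing ones whose lastmod differs from the snapshot.
    return [u for u, mod in current.items() if previous.get(u) != mod]
```

On each run: fetch the sitemap, parse it, diff against the stored snapshot, submit whatever `changed_urls` returns, and save the current parse as the new snapshot. Pages without a lastmod simply get submitted once and then ignored until the tag appears or changes.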
Pingmap runs this loop for you: sitemap polling, diff detection, and IndexNow submission on a schedule you set. You point it at your sitemap once and it handles the rest.
Do It Manually or Automate It
Bing Webmaster Tools has a URL submission interface where you paste URLs one at a time. Fine for a quick one-off, not realistic if you publish regularly.
WordPress plugins like RankMath and Yoast have IndexNow integrations that fire on publish. If you're already in that ecosystem, it's the obvious choice. Just make sure you configure it.
Other platforms (Webflow, Framer, Ghost, custom static sites) don't have a built-in option. You either write the integration yourself or use a dedicated tool.
Pingmap is that tool. It watches your sitemap on a schedule, diffs each fetch against the last snapshot, and pushes new and updated URLs to IndexNow automatically. Setup takes about five minutes.
If your site publishes regularly and you're not using IndexNow, you're just waiting for crawlers to show up on their own schedule. The protocol's free and the setup is short.