How Google crawls locale-adaptive pages
If your site has locale-adaptive pages (that is, your site returns different content
based on the perceived country or preferred language of the visitor), Google might not crawl,
index, or rank all your content for different locales. This is because the default IP
addresses of the Googlebot crawler appear to be based in the USA. In addition, the crawler
sends HTTP requests without setting the Accept-Language request header.
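To illustrate why this matters, here is a minimal, hypothetical sketch of a locale-adaptive handler. The content store, locale tags, and the pick_locale helper are illustrative assumptions, not part of any Google or server API; the point is only that a request from a US-based IP with no Accept-Language header gets the default variant.

```python
# Hypothetical locale-adaptive handler: the content store, locale tags, and
# pick_locale helper are illustrative assumptions, not a real API.

DEFAULT_LOCALE = "en-US"

CONTENT = {
    "en-US": "US English page",
    "en-AU": "Australian English page",
    "de-DE": "German page",
}

def pick_locale(client_country, accept_language):
    """Choose a content variant from the apparent country and Accept-Language."""
    if accept_language:
        # Take the first language tag, e.g. "de-DE,de;q=0.9" -> "de-DE".
        candidate = accept_language.split(",")[0].strip()
        if candidate in CONTENT:
            return candidate
    # No usable Accept-Language: fall back to a country-based default.
    country_defaults = {"AU": "en-AU", "DE": "de-DE"}
    return country_defaults.get(client_country, DEFAULT_LOCALE)

# Googlebot's default configuration: US-based IP, no Accept-Language header.
# Such a server only ever returns the default variant, so the other locales
# may never be crawled or indexed.
print(pick_locale(client_country="US", accept_language=None))              # en-US
print(pick_locale(client_country="DE", accept_language="de-DE,de;q=0.9"))  # de-DE
```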
Geo-distributed crawling
Googlebot crawls with IP addresses based outside the USA, in addition to the US-based IP addresses.
As we have always recommended, when Googlebot appears to come from a certain country, treat
it like you would treat any other user from that country. This means that if you block
USA-based users from accessing your content, but allow visitors from Australia to see it,
your server should block Googlebot if it appears to be coming from the USA, but allow access
to Googlebot if it appears to come from Australia.
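As a rough sketch of that rule (the geo_lookup helper and the blocked-country policy below are hypothetical placeholders for whatever geo-IP service and access policy your site actually uses), the access decision should depend only on the client's apparent country, with no special case for Googlebot's user agent:

```python
# Hypothetical access rule: geo_lookup and BLOCKED_COUNTRIES stand in for
# whatever geo-IP service and policy your site actually uses.

BLOCKED_COUNTRIES = {"US"}  # example policy: US visitors cannot see this content

def geo_lookup(ip_address: str) -> str:
    """Placeholder: resolve an IP address to an ISO 3166-1 country code."""
    raise NotImplementedError("plug in your geo-IP database or service here")

def is_allowed(ip_address: str, user_agent: str) -> bool:
    """Apply the same country rule to every client. The user agent is ignored,
    so Googlebot is blocked or allowed exactly like any other visitor from
    the same apparent country."""
    country = geo_lookup(ip_address)
    return country not in BLOCKED_COUNTRIES
```

The point is the absence of a Googlebot branch: a crawl from a US IP is blocked like a US visitor, and a crawl from an Australian IP is allowed like an Australian visitor.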
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-02-04 UTC."],[[["Googlebot's default IP addresses appear to be US-based, which may impact the crawling and indexing of locale-adaptive pages targeting other regions."],["It is recommended to use separate locale URL configurations with `rel=\"alternate\"` hreflang annotations for better localization."],["Googlebot crawls from various global locations, so treat it like any other user based on its apparent location, including access restrictions."],["Ensure consistent robots exclusion protocol (robots.txt and meta tags) across all locales to avoid unintended crawling restrictions."]]],["Google crawls locale-adaptive pages using IP addresses from various locations, not just the USA. When Googlebot appears to be from a specific country, treat it like a user from that region. For locale-adaptive sites, using separate URL configurations with `rel=\"alternate\"` hreflang annotations is recommended. Ensure consistent application of robots exclusion protocols, such as robots.txt and meta tags, across all locales. You can verify Googlebot's geo-distributed crawls through reverse DNS lookups.\n"]]