Locations & Languages
Retrieve the list of supported locations, ISO codes, and languages for each
search engine to use in your SERP crawl requests.
Overview
The SerpWatch API supports crawling search results from multiple countries and languages.
Use the engines endpoint to get the complete list of available configurations for
Google, Bing, and Yahoo.
Access Requirements
The engines endpoint may require specific account permissions. If you receive
an authorization error, contact support to enable access for your account.
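One way to surface that case cleanly is to translate a 401/403 from the engines endpoint into an explicit error. A minimal sketch (the `fetch_engines` helper and its error message are illustrative, not part of the API):

```python
import os
import requests

def fetch_engines(system: str) -> dict:
    """Fetch engine configurations for a search system.

    Raises PermissionError on 401/403 so callers know the engines
    endpoint has not been enabled for their account.
    """
    response = requests.get(
        f"https://engine.v2.serpwatch.io/api/v1/common/engines/{system}",
        headers={"Authorization": f"Bearer {os.environ.get('SERPWATCH_API_KEY', '')}"},
    )
    if response.status_code in (401, 403):
        raise PermissionError(
            "Engines endpoint not enabled for this account; contact support."
        )
    response.raise_for_status()  # surface any other HTTP error
    return response.json()
```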
List Search Engines
`GET /api/v1/common/engines/{system}`
Returns all available search engine configurations for the specified system.
Path Parameters
| Parameter | Type | Description |
|---|---|---|
| `system` | string | `google`, `bing`, or `yahoo` |
Example Request
```bash
curl "https://engine.v2.serpwatch.io/api/v1/common/engines/google" \
  -H "Authorization: Bearer YOUR_API_KEY"
```
```python
import requests
import os

response = requests.get(
    "https://engine.v2.serpwatch.io/api/v1/common/engines/google",
    headers={"Authorization": f"Bearer {os.environ['SERPWATCH_API_KEY']}"}
)
engines = response.json()

for engine in engines["items"]:
    print(f"{engine['country_name']} ({engine['iso_code']}) - {engine['language']}")
```
```javascript
const response = await fetch(
  "https://engine.v2.serpwatch.io/api/v1/common/engines/google",
  {
    headers: { "Authorization": `Bearer ${process.env.SERPWATCH_API_KEY}` }
  }
);
const engines = await response.json();

engines.items.forEach(engine => {
  console.log(`${engine.country_name} (${engine.iso_code}) - ${engine.language}`);
});
```
Example Response
```json
{
  "num_items": 3,
  "items": [
    {
      "id": 2840,
      "name": "Google United States",
      "type_name": "desktop",
      "system_name": "google",
      "country_name": "United States",
      "iso_code": "US",
      "language": "English",
      "localization": "en"
    },
    {
      "id": 2841,
      "name": "Google United States Mobile",
      "type_name": "mobile",
      "system_name": "google",
      "country_name": "United States",
      "iso_code": "US",
      "language": "English",
      "localization": "en"
    },
    {
      "id": 2276,
      "name": "Google Germany",
      "type_name": "desktop",
      "system_name": "google",
      "country_name": "Germany",
      "iso_code": "DE",
      "language": "German",
      "localization": "de"
    }
  ],
  "cursor": null
}
```
Response Fields
| Field | Type | Description |
|---|---|---|
| `num_items` | integer | Total number of engines returned |
| `items` | array | List of search engine configurations |
| `cursor` | string \| null | Pagination cursor (`null` if no more results) |
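When `cursor` is non-null, fetch the next page until it comes back `null`. A minimal pagination loop might look like this (passing the cursor back as a `cursor` query parameter is an assumption; confirm the exact parameter name against the API reference):

```python
import os
import requests

def iter_engines(system: str):
    """Yield every engine configuration, following the pagination cursor.

    Assumes the cursor is sent back as a `cursor` query parameter
    (hypothetical; verify against the API reference).
    """
    url = f"https://engine.v2.serpwatch.io/api/v1/common/engines/{system}"
    headers = {"Authorization": f"Bearer {os.environ.get('SERPWATCH_API_KEY', '')}"}
    cursor = None
    while True:
        params = {"cursor": cursor} if cursor else {}
        page = requests.get(url, headers=headers, params=params).json()
        yield from page["items"]
        cursor = page.get("cursor")
        if cursor is None:  # last page reached
            break
```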
Engine Object
| Field | Type | Description |
|---|---|---|
| `id` | integer | Unique engine identifier |
| `name` | string | Human-readable engine name |
| `type_name` | string | Device type: `desktop` or `mobile` |
| `system_name` | string | Search engine: `google`, `bing`, or `yahoo` |
| `country_name` | string | Full country name |
| `iso_code` | string | ISO 3166-1 alpha-2 country code |
| `language` | string | Language name |
| `localization` | string | Language/locale code |
Using in SERP Requests
When making SERP crawl requests, map the engine fields to request parameters:
| Engine Field | Request Parameter | Example |
|---|---|---|
| `country_name` | `location_name` | `"United States"` |
| `iso_code` | `iso_code` | `"US"` |
| `localization` | `language_code` | `"en"` |
| `type_name` | `device` | `"desktop"` or `"mobile"` |
Example: Using Engine Data
```python
import requests
import os

API_KEY = os.environ["SERPWATCH_API_KEY"]
BASE_URL = "https://engine.v2.serpwatch.io"

# Step 1: Get available engines for Google
engines_response = requests.get(
    f"{BASE_URL}/api/v1/common/engines/google",
    headers={"Authorization": f"Bearer {API_KEY}"}
)
engines = engines_response.json()["items"]

# Step 2: Find the Germany desktop engine
germany = next(
    e for e in engines
    if e["iso_code"] == "DE" and e["type_name"] == "desktop"
)

# Step 3: Use the engine data in a crawl request
crawl_response = requests.post(
    f"{BASE_URL}/api/v2/serp/crawl",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json"
    },
    json={
        "keyword": "beste restaurants",
        "location_name": germany["country_name"],  # "Germany"
        "iso_code": germany["iso_code"],           # "DE"
        "language_code": germany["localization"],  # "de"
        "device": germany["type_name"]             # "desktop"
    }
)
print(crawl_response.json())
```
```javascript
const API_KEY = process.env.SERPWATCH_API_KEY;
const BASE_URL = "https://engine.v2.serpwatch.io";

// Step 1: Get available engines for Google
const enginesResponse = await fetch(
  `${BASE_URL}/api/v1/common/engines/google`,
  { headers: { "Authorization": `Bearer ${API_KEY}` } }
);
const { items: engines } = await enginesResponse.json();

// Step 2: Find the Germany desktop engine
const germany = engines.find(
  e => e.iso_code === "DE" && e.type_name === "desktop"
);

// Step 3: Use the engine data in a crawl request
const crawlResponse = await fetch(
  `${BASE_URL}/api/v2/serp/crawl`,
  {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${API_KEY}`,
      "Content-Type": "application/json"
    },
    body: JSON.stringify({
      keyword: "beste restaurants",
      location_name: germany.country_name, // "Germany"
      iso_code: germany.iso_code,          // "DE"
      language_code: germany.localization, // "de"
      device: germany.type_name            // "desktop"
    })
  }
);
console.log(await crawlResponse.json());
```
Cache Engine Data
The list of engines rarely changes. Cache the response locally to avoid repeated API calls, and refresh it periodically (e.g., weekly) to pick up any new locations.
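A simple file-based cache with a weekly refresh might look like this (the `engines_cache.json` path, the `get_engines_cached` helper, and the seven-day max age are illustrative choices, not part of the API):

```python
import json
import os
import time
import requests

CACHE_PATH = "engines_cache.json"       # hypothetical local cache file
MAX_AGE = 7 * 24 * 3600                 # refresh weekly

def get_engines_cached(system: str) -> list:
    """Return engine configs, reading from a local file cache when fresh."""
    if os.path.exists(CACHE_PATH):
        age = time.time() - os.path.getmtime(CACHE_PATH)
        if age < MAX_AGE:
            with open(CACHE_PATH) as f:
                return json.load(f)
    # Cache is missing or stale: fetch from the API and rewrite it
    response = requests.get(
        f"https://engine.v2.serpwatch.io/api/v1/common/engines/{system}",
        headers={"Authorization": f"Bearer {os.environ.get('SERPWATCH_API_KEY', '')}"},
    )
    engines = response.json()["items"]
    with open(CACHE_PATH, "w") as f:
        json.dump(engines, f)
    return engines
```

The second call within the max-age window reads from disk and makes no HTTP request at all.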
Related Topics
Google SERP (v2)
Complete reference for Google SERP crawling.
Multi-Location
Track rankings across different locations.