Authon Blog
tutorial · 7 min read

Google Search Console Has a Full API. Why Is Nobody Using It from Their IDE?

Alan West
Authon Team

I published a blog post, waited three days for Google to index it, then realized I could have just... asked it to. Literally. There's an API endpoint that lets you ping Google and say "hey, this URL exists now, please come look at it." I'd been sitting there refreshing the Google Search Console dashboard like it was a deployment pipeline, waiting for a green checkmark that I could have triggered myself.

That was the moment I realized my entire SEO workflow was broken. Not technically broken -- just comically manual for something that has a perfectly good API behind it.

The SEO Waiting Game

Here's what the publish-and-pray cycle actually looks like. You write a post. You deploy it. You go to Google Search Console, paste the URL into the inspection tool, and click "Request Indexing." Then you wait. Maybe a few hours, maybe a few days. You check back. It's still not indexed. You resubmit. You wait some more.

Now multiply that by every page on your site. Got a new blog with 30 posts? That's 30 manual submissions. Changed your URL structure during a migration? Enjoy manually requesting re-indexing for every single page while praying the old ones get removed from the index before someone clicks a dead link.

And that's just the indexing side. The analytics side is worse. Google Search Console's dashboard loads like it's running on a 2014 server. You click "Performance," wait five seconds for the chart to render, filter by page, wait again, filter by query, wait again. Want to compare two date ranges? That's another round of waiting. Want to export the data? CSV download, open in a spreadsheet, pivot table. For data that Google already has in a structured API response.

The tools exist. The API has been there for years. Nobody wraps it in a way that developers actually want to use.

gsc-mcp

gsc-mcp is an MCP server that puts the entire Google Search Console API surface into your AI agent's hands. It exposes 13 tools across 5 categories: site management, sitemaps, search analytics, URL inspection, and the Indexing API. It covers everything the dashboard does, minus the loading spinners.

Setup takes about two minutes. You need a Google Cloud project with the Search Console API enabled and OAuth credentials. Drop this into your MCP config:

```json
{
  "mcpServers": {
    "gsc": {
      "command": "npx",
      "args": ["-y", "@mikusnuz/gsc-mcp"],
      "env": {
        "GSC_CLIENT_ID": "your-client-id",
        "GSC_CLIENT_SECRET": "your-client-secret",
        "GSC_REFRESH_TOKEN": "your-refresh-token"
      }
    }
  }
}
```

Standard OAuth2 flow. You create credentials in the Google Cloud Console, run through the consent screen once, grab your refresh token, and you're done. Nothing exotic.
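If you're curious what the server does with that refresh token: it's the standard OAuth2 refresh exchange against Google's token endpoint, nothing more. A minimal sketch (the `build_token_request` helper is mine for illustration, not part of gsc-mcp):

```python
import urllib.parse
import urllib.request

# Google's standard OAuth2 token endpoint.
TOKEN_URL = "https://oauth2.googleapis.com/token"

def build_token_request(client_id: str, client_secret: str,
                        refresh_token: str) -> urllib.request.Request:
    """Build the refresh-token -> access-token exchange request."""
    body = urllib.parse.urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
    }).encode()
    return urllib.request.Request(TOKEN_URL, data=body, method="POST")

# The JSON response carries "access_token" and "expires_in"; the server
# sends the token as "Authorization: Bearer <access_token>" on API calls.
```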

What You Can Actually Do With This

The first thing I did was ask a question that would have taken me four clicks and two loading screens in the dashboard:

```text
Me: What are my top 10 queries by clicks for the last 7 days?

Agent: Here are your top queries for the past week:
  1. "mcp server tutorial" - 342 clicks, 12,400 impressions, 2.8% CTR, pos 4.2
  2. "model context protocol" - 289 clicks, 8,900 impressions, 3.2% CTR, pos 3.8
  3. "npm publish automation" - 156 clicks, 5,200 impressions, 3.0% CTR, pos 6.1
  ...
```

Instant. No dashboard. No waiting for charts to render. And because it's a conversation, I can immediately follow up: "Which of those dropped in position compared to last month?" or "Show me which pages rank for 'mcp server tutorial'." Try doing that kind of drill-down in the GSC dashboard without losing your mind.
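Under the hood there's no magic: the Search Analytics call is a single POST to `.../sites/{siteUrl}/searchAnalytics/query`, and the "top queries" answer is a sort over the rows it returns. A sketch of both halves, assuming the row shape the public API documents (`keys`, `clicks`, `impressions`, `ctr`, `position`); the summarizer function is my own, not a gsc-mcp internal:

```python
from datetime import date, timedelta
from typing import Dict, List

def analytics_request_body(days: int = 7, limit: int = 10) -> Dict:
    """Request body for POST .../sites/{siteUrl}/searchAnalytics/query."""
    end = date.today()
    start = end - timedelta(days=days)
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["query"],
        "rowLimit": limit,
    }

def top_queries(rows: List[Dict], n: int = 10) -> List[str]:
    """Rank API rows by clicks and format them like the agent's answer."""
    ranked = sorted(rows, key=lambda r: r["clicks"], reverse=True)[:n]
    return [
        f'{i}. "{r["keys"][0]}" - {r["clicks"]} clicks, '
        f'{r["impressions"]:,} impressions, {r["ctr"] * 100:.1f}% CTR, '
        f'pos {r["position"]:.1f}'
        for i, r in enumerate(ranked, 1)
    ]
```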

URL inspection is another one that used to eat my time. Instead of pasting URLs one at a time into the inspection bar:

```text
Me: Is /blog/deploy-mcp-server indexed? Is it mobile-friendly?

Agent: URL inspection for /blog/deploy-mcp-server:
  - Indexing: Indexed, last crawled March 21
  - Mobile usability: No issues detected
  - Rich results: FAQ schema detected and valid
  - Crawl: Successful, 200 response, 1.2s load time
```

Indexing status, crawl info, rich results validation, mobile usability -- all in one response. In the dashboard, that's four different sections you'd have to click through.
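The inspection itself is one request to the URL Inspection API (`POST https://searchconsole.googleapis.com/v1/urlInspection/index:inspect`). A sketch of building that request and collapsing the result into one line, assuming the result field names from the public API docs (`indexStatusResult`, `mobileUsabilityResult`); the `summarize` helper is illustrative:

```python
from typing import Dict

def inspect_request_body(url: str, site: str) -> Dict:
    """Body for POST .../v1/urlInspection/index:inspect."""
    return {"inspectionUrl": url, "siteUrl": site}

def summarize(result: Dict) -> str:
    """Collapse an inspectionResult into a one-line status."""
    idx = result.get("indexStatusResult", {})
    mobile = result.get("mobileUsabilityResult", {})
    return (
        f"indexing: {idx.get('coverageState', 'unknown')}; "
        f"mobile: {mobile.get('verdict', 'unknown')}"
    )
```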

The Batch Indexing Play

Here's where it gets genuinely useful for anyone running a blog, a docs site, or any content-heavy project. The Indexing API supports batch operations -- up to 100 URLs in a single request.

Think about what that means for your deploy workflow. You push a new version of your docs site with 15 updated pages. Instead of manually submitting each one, you just tell your agent:

```text
Me: I just deployed updated docs. Submit these URLs for indexing:
    /docs/getting-started, /docs/api-reference, /docs/authentication,
    /docs/webhooks, /docs/migration-guide, /docs/changelog

Agent: Batch indexing request submitted for 6 URLs.
  All 6 URLs notified successfully.
  Google will prioritize crawling these URLs.
```

You could even hook this into your CI/CD pipeline. Deploy triggers a build, build generates a sitemap diff, agent submits the changed URLs to the Indexing API. Your pages show up in search results hours after deploy instead of days. For a blog that publishes daily, that's the difference between traffic hitting on day one versus day four.
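The plumbing for that pipeline step is small: one `{"url": ..., "type": "URL_UPDATED"}` notification per changed page (the body shape the Indexing API's publish endpoint documents), chunked to respect the 100-URL batch limit. A sketch, with helper names of my own choosing:

```python
from typing import Dict, Iterator, List

BATCH_LIMIT = 100  # max URLs per batch request to the Indexing API

def notifications(paths: List[str], site_root: str) -> List[Dict]:
    """One URL_UPDATED notification body per changed page."""
    return [{"url": site_root + p, "type": "URL_UPDATED"} for p in paths]

def batches(items: List[Dict], size: int = BATCH_LIMIT) -> Iterator[List[Dict]]:
    """Yield chunks no larger than the batch limit."""
    for i in range(0, len(items), size):
        yield items[i:i + size]
```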

Sitemap management works the same way. Submit a new sitemap, check the status of existing ones, see which URLs Google has picked up from your sitemap versus which ones it's ignoring. All conversational. All instant.
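For reference, sitemap submission is also a single call: a PUT to `.../sites/{siteUrl}/sitemaps/{feedpath}` on the Search Console API, with both path segments URL-encoded. A sketch of building that endpoint (the helper name is mine):

```python
from urllib.parse import quote

def sitemap_submit_url(site: str, sitemap: str) -> str:
    """Endpoint for PUT .../sites/{siteUrl}/sitemaps/{feedpath}.
    Both segments must be fully URL-encoded, slashes included."""
    return (
        "https://www.googleapis.com/webmasters/v3/sites/"
        f"{quote(site, safe='')}/sitemaps/{quote(sitemap, safe='')}"
    )
```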

The Honest Limitations

I should be clear: the Indexing API doesn't guarantee instant indexing. It tells Google "this URL changed, please re-crawl it soon." Google still decides when and whether to actually index it. But in my experience, it cuts the wait from days to hours for most pages. That's a meaningful difference.

The Search Analytics data also has the same lag as the dashboard -- usually 2-3 days behind real-time. That's a Google limitation, not a tool limitation. You're not getting fresher data through the API than you would through the web UI.

And if your site has thousands of pages, the 100-URL batch limit on the Indexing API means you'll need multiple requests for large-scale reindexing jobs. Not a dealbreaker, but worth knowing.

SEO Shouldn't Require Staring at Dashboards

I've been saying this about every developer tool category for the past year, but it especially applies to SEO: if there's an API, there's no reason you should be clicking through a web interface to do routine work. Google Search Console has had a comprehensive API for years. The dashboard just happened to be the default way people interacted with it, because nobody built a better interface.

MCP changes that equation. Your AI agent becomes the interface. You describe what you want in plain language, and it figures out which API calls to make. No memorizing query parameters for the Search Analytics API. No manually constructing URL inspection requests. No batch-submitting sitemaps through a web form.

I check my search performance every morning now. Not because I became more disciplined about SEO -- I just ask while I'm already in my terminal. "How did yesterday's post perform?" takes three seconds. That question used to take two minutes of clicking and loading. The small friction was enough to make me skip it most days. Now I never skip it.

The best SEO workflow is the one you actually do consistently. And you're a lot more consistent when the tool lives where you already work.
