Decodo MCP Server

This repo contains the Decodo MCP server, which enables MCP clients to interface with services offered by Decodo.

Quick start via Smithery (Recommended)

Visit decodo-mcp-server on Smithery, select your favourite MCP client and generate installation instructions.

(Screenshot: Smithery interface)

Obtain Scraper API credentials

A Decodo Scraper API Web Advanced plan is required - a trial is available on the dashboard.

Once you have a Web Advanced plan activated, take a note of your generated username and password:

(Screenshot: Decodo dashboard)

Running MCP server locally (manual)

Prerequisites:

  1. Clone this repo and run:

     npm install
     npm run build

  2. Take note of your build location:

     cd build/
     pwd

     Appending index.js to this path gives your build file location, which will look something like this:

     /Users/your.user/projects/decodo-mcp/build/index.js

  3. Update your MCP client with the server information:

Claude Desktop

Follow the guide here to find the setup file, then update claude_desktop_config.json to look like this:

{
  "mcpServers": {
    "decodo-mcp": {
      "command": "node",
      "args": ["/Users/your.user/projects/decodo-mcp/build/index.js"],
      "env": {
        "SCRAPER_API_USERNAME": "your_username",
        "SCRAPER_API_PASSWORD": "your_password"
      }
    }
  }
}

Cursor

See Cursor documentation for how to install.
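
As a sketch, the entry mirrors the Claude Desktop config above. Cursor reads MCP servers from an mcp.json file (globally in ~/.cursor/mcp.json, or per project in .cursor/mcp.json); the exact location can vary by Cursor version, so treat the path as an assumption and defer to the Cursor docs:

{
  "mcpServers": {
    "decodo-mcp": {
      "command": "node",
      "args": ["/Users/your.user/projects/decodo-mcp/build/index.js"],
      "env": {
        "SCRAPER_API_USERNAME": "your_username",
        "SCRAPER_API_PASSWORD": "your_password"
      }
    }
  }
}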

Tools

The server exposes the following tools:

| Tool                 | Description                                                          | Example prompt                                                   |
| -------------------- | -------------------------------------------------------------------- | ---------------------------------------------------------------- |
| scrape               | Scrapes any target URL; expects a URL to be given via prompt.        | Scrape peacock.com from a US ip address and tell me the pricing  |
| google_search_parsed | Scrapes Google Search for a given query and returns parsed results.  | Scrape google search for shoes and tell me the top position      |
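
For reference, MCP clients invoke these tools through the protocol's standard tools/call request. The sketch below shows roughly what such a call could look like for the scrape tool; the argument names and values are illustrative assumptions - the authoritative input schema is the one the server advertises when the client lists its tools:

{
  "method": "tools/call",
  "params": {
    "name": "scrape",
    "arguments": {
      "url": "https://www.peacock.com",
      "geo": "US"
    }
  }
}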

Parameters

The following parameters are inferred from user prompts:

| Parameter    | Description                                                                            |
| ------------ | -------------------------------------------------------------------------------------- |
| jsRender     | Renders the target URL in a headless browser.                                           |
| geo          | Sets the country from which the request will originate.                                 |
| locale       | Sets the locale of the request.                                                         |
| tokenLimit   | Truncates the response content to this limit. Useful if the context window is small.    |
| fullResponse | Skips automatic truncation and returns the full content. May throw warnings if the context window is small. |
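
For example, a prompt like "Scrape peacock.com from a US ip address, render JavaScript, return 50k tokens" could be inferred into roughly the following arguments. The exact field names and value formats (e.g. how geo is spelled) are assumptions for illustration:

{
  "url": "https://www.peacock.com",
  "jsRender": true,
  "geo": "US",
  "tokenLimit": 50000
}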

Examples

Scraping geo-restricted content

Query your AI agent with the following prompt:

Scrape peacock.com from a German ip address and tell me the pricing

The response will say that peacock.com is geo-restricted. To get around the geo-restriction:

Scrape peacock.com from a US ip address and tell me the pricing

Limiting number of response tokens

If your agent has a small context window, the content returned from scraping will be automatically truncated to avoid overflowing the context. You can increase the number of tokens returned:

Scrape hacker news, return 50k tokens

If your agent has a large context window, tell it to return the full content:

Scrape hacker news, return full content
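
In terms of the parameters above, these two prompts roughly correspond to setting tokenLimit versus fullResponse; the values below are illustrative assumptions:

{ "tokenLimit": 50000 }

{ "fullResponse": true }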