We just open-sourced a server that implements the Model Context Protocol (MCP), designed to connect LLMs and AI apps (like Claude Desktop, Cursor, and Windsurf) directly to Decodo’s platform — no glue code required.
This makes it easy for agents and copilots to access the web, retrieve structured data, and run real scraping tasks through a common protocol. The server includes built-in tools for things like:
- Scraping websites as Markdown
- Running Google or Amazon searches and returning parsed results
- Controlling geo-location, language, and token limits via prompt
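Under the hood, MCP clients invoke tools like these with a standard JSON-RPC `tools/call` request. As a rough sketch, here is what such a request might look like; the tool name `scrape_as_markdown` and the `geo` argument are illustrative placeholders, so check the repo's README for the actual tool names and parameters:

```python
import json

# Hypothetical tool name and arguments -- the real names are defined
# by the server; see the repo's README. The envelope itself
# (jsonrpc / method / params.name / params.arguments) follows the
# MCP tools/call request shape.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "scrape_as_markdown",      # assumed tool name
        "arguments": {
            "url": "https://example.com",
            "geo": "de",                   # assumed geo-targeting parameter
        },
    },
}

# Serialize for transport (stdio or HTTP, depending on the client).
payload = json.dumps(request)
```

Most MCP clients build and send this envelope for you; the sketch is only meant to show what travels over the wire when an agent calls a tool.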
It’s privacy-conscious, region-flexible, and works out of the box with most MCP clients. You can run it locally, or install it in one click via Smithery.
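For clients like Claude Desktop, wiring up a local MCP server usually comes down to one entry in the client's config file (`claude_desktop_config.json`). The `command` and `args` below are placeholders for whatever launch command the repo's README specifies, not the verified invocation:

```json
{
  "mcpServers": {
    "decodo-mcp": {
      "command": "npx",
      "args": ["-y", "decodo-mcp-server"]
    }
  }
}
```

Smithery's one-click install writes an equivalent entry for you, so editing the config by hand is only needed for the local route.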
We’d love feedback from developers working on:
- LLM agents or autonomous tooling
- AI-based research or retrieval systems
- Scraping pipelines or data orchestration tools
Here’s the repo: https://github.com/Decodo/decodo-mcp-server
Let us know what features you’d like to see added, or if you run into any integration issues. We're actively improving this based on early use cases and real-world friction.
— The Decodo team