JigsawStack MCP Servers: Bridging LLMs with Context and Tools


Large Language Models (LLMs) are incredibly powerful, but they don't know everything out of the box. They rely on the context we provide, which can be limited by prompt sizes or outdated training data. Imagine if your AI assistant could plug into external knowledge, remember information between conversations, or perform tasks like web browsing on command – all from a standard interface. This is where Model Context Protocol (MCP) comes in.

We'll explore what MCP is (in simple terms), how it helps manage context for LLMs, and how you can use it. We'll also introduce JigsawStack, a developer-friendly AI platform with a range of APIs, and specifically dive into our very own JigsawStack MCP server – an open-source server that brings our end-to-end capabilities to your LLM via MCP. By the end, you'll see why this is exciting for developers looking to give their AI applications better memory, understanding, and efficiency.

What is Model Context Protocol (MCP)?

Model Context Protocol (MCP) is an open standard that allows AI models to connect to external data sources and tools consistently, acting as a universal adapter for AI applications. It defines a client-server architecture where an AI assistant (client) communicates with MCP servers that provide access to resources or perform specific actions.

Without MCP, developers must build separate integrations for each tool or data source, which is tedious and doesn't scale well. MCP solves this by providing a universal protocol for context, making integrations easier to manage. An AI agent can request information or actions from an MCP server, and the server responds in a standardized format, allowing the model to maintain a richer conversation context.
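To make that concrete, here is a sketch of the JSON-RPC 2.0 messages MCP uses under the hood; the tool name and field values are illustrative, not taken from any specific server.

```typescript
// A server advertises its tools (via tools/list) with JSON Schema inputs.
const toolDescriptor = {
  name: "web_search",
  description: "Search the web and return ranked results",
  inputSchema: {
    type: "object",
    properties: { query: { type: "string" } },
    required: ["query"],
  },
};

// A client invokes a tool with a standard JSON-RPC 2.0 request...
const callRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "web_search", arguments: { query: "latest MCP news" } },
};

// ...and the server replies in a standardized envelope, so the model can
// consume results from any tool without bespoke parsing.
const callResponse = {
  jsonrpc: "2.0",
  id: 1,
  result: { content: [{ type: "text", text: "...search results..." }] },
};
```

Because every request and response follows this shape, adding a new tool changes nothing about how the client parses results.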

How do MCPs Help LLMs?

Using MCP involves running MCP servers and an MCP-compatible client. This setup allows AI models to fetch information or perform tasks on demand, extending their memory and contextual understanding. MCP servers can serve as external memory banks or knowledge bases, enabling models to retrieve details as needed. They also allow AI to connect to live data sources, ensuring up-to-date and context-rich responses.

So, how does this help an LLM in concrete terms? Here are some key benefits of MCP for LLMs:

  • Better “memory” and extended context: MCP servers can serve as external memory banks or knowledge bases for an AI. Instead of the model being limited to its fixed context window, it can retrieve details on demand.

  • Richer, up-to-date contextual understanding: Because MCP lets an AI connect to live or dynamic data sources, the model’s knowledge stays current and relevant. Even advanced language models are often trained on data that might be incomplete or outdated. By connecting them to live data sources (like documents, databases, or APIs), MCP “helps ensure the model’s answers are up-to-date, context-rich, and domain-specific.”

  • Efficiency and modularity: MCP encourages a modular design where each server is a focused integration. This makes the system more efficient in multiple ways. For one, the LLM doesn't have to be fed huge prompts containing all possibly relevant info – it can ask a server for a specific piece of data only when needed, saving token space and compute. Additionally, from a developer standpoint, it's far more efficient to reuse standardized MCP connectors than to write custom glue code for each tool.

In short, MCP can be used by running compatible servers and connecting your LLM to them (for example, Anthropic's Claude has built-in support for MCP servers, and there are libraries for other frameworks). The payoff is that your AI assistant can augment its limited built-in knowledge with external intelligence, remember things longer, and operate more efficiently. As developers, we get a plug-and-play way to add new capabilities to our AI apps without constantly reinventing the wheel.
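For instance, Claude Desktop discovers MCP servers through its claude_desktop_config.json file; a minimal entry might look like the sketch below (the script path and API key are placeholders):

```json
{
  "mcpServers": {
    "jigsawstack-web-search": {
      "command": "node",
      "args": ["/path/to/jigsawstack-mcp-server/ai-web-search/dist/index.js"],
      "env": { "JIGSAWSTACK_API_KEY": "your_api_key" }
    }
  }
}
```

Claude Desktop launches the configured command and routes tool calls to it over stdio.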

What is JigsawStack?

We at JigsawStack build an AI platform that provides a suite of small, specialized models through easy-to-use APIs. In our own words: "AI infrastructure for your tech stack – a suite of custom small models plugged into the right infrastructure to power your AI applications."

Essentially, we aim to be the missing pieces of the puzzle for AI developers – you can pick the functionality you need and easily drop it into your app, rather than building everything from scratch.

// First, install the SDK: npm install jigsawstack
import { JigsawStack } from "jigsawstack";

// Initialize with your API key
const jigsaw = JigsawStack({ apiKey: "YOUR_JIGSAWSTACK_API_KEY" });

// Example usages of various APIs:
const searchResults = await jigsaw.web.search({ 
  query: "What is the weather in San Francisco?" 
});

const structuredOCR = await jigsaw.vision.vocr({ 
  url: "https://jigsawstack.com/preview/vocr-example.jpg",
  prompt: ["total_price", "tax", "highlighted_item_name"]  // fields to extract from image
});

const scrapedData = await jigsaw.web.ai_scrape({ 
  url: "https://news.ycombinator.com/show", 
  element_prompts: ["post title", "post points"]  // what elements to extract from the page
});

const audioFile = await jigsaw.audio.text_to_speech({ 
  text: "Hello, world!" 
});

const translatedText = await jigsaw.translate({ 
  text: ["Hey, how are you?"], 
  target_language: "zh"  // translate to Chinese
});

Our tools cover a lot of ground – web, vision, audio, language, etc. The code is quite straightforward: you initialize the client with an API key and call the method for the task you need.

What is the JigsawStack MCP Server?

The JigsawStack MCP server is an open-source implementation of an MCP server that exposes our AI capabilities to any MCP-compatible AI client. In simpler terms, it's like a bridge between an AI assistant and our APIs. Instead of calling our API services directly in your application code (as shown above), you can run the MCP server and let your LLM call those services through the MCP protocol.

This is incredibly useful if you're building an AI agent that should autonomously decide when to use a tool: the agent can simply invoke the MCP server's tools as needed.

Our MCP server essentially provides a suite of small, fast models (the same ones we discussed) to the AI in a standardized way.

The server is built in TypeScript (Node.js) and is open source on GitHub: JigsawStack/jigsawstack-mcp-server. This means you can self-host it.

So, when you run a particular MCP server, your AI assistant could, for example, call a web_search tool with a query and get back search results, call a web_scraper tool with a URL and extraction prompts to get structured data, or call an image_generation tool to create an image from a prompt.

Because it's an MCP server, it's not tied to any specific AI model – any MCP-compliant client can connect. For instance, Anthropic’s Claude (via Claude Desktop) can use it, or a developer could use an MCP client library in a custom Python script or another environment to let GPT-4 or other models use these tools.

Take a look at smithery.ai, a platform that helps developers find and ship language model extensions compatible with the Model Context Protocol specification.

How to use JigsawStack MCP Server?

Simply put, the JigsawStack MCP Server is an implementation of the Model Context Protocol (MCP) that lets your AI agent call our services—like web search, web scraping, and image generation—without directly embedding API calls in your code.

Our server is built with Node.js and Express.js, and each tool lives in its own directory as a distinct server for modularity. This means you can easily add, update, or remove tools without impacting the overall system.
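As an illustration of that modularity (a sketch, not the server's actual code), each tool can be thought of as an entry in a registry that a dispatcher routes to; the handler names mirror the tools listed below, and the handler bodies are stand-ins.

```typescript
type ToolHandler = (args: Record<string, unknown>) => Promise<unknown>;

// Each tool directory contributes one focused handler; the registry is the
// only shared surface, so tools can be added or removed independently.
const registry: Record<string, ToolHandler> = {
  ai_web_search: async (args) => ({ results: `searched for ${args.query}` }),
  image_generation: async () => ({ image: "base64-data..." }),
};

// Route an incoming tools/call by name to the matching handler.
async function dispatch(name: string, args: Record<string, unknown>) {
  const handler = registry[name];
  if (!handler) throw new Error(`Unknown tool: ${name}`);
  return handler(args);
}
```

Dropping a tool is then just deleting its directory and registry entry; nothing else needs to change.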

Below is a quick guide to get you started for AI Web Search:

Quick Setup

  1. Clone the Repository:

    git clone https://github.com/JigsawStack/jigsawstack-mcp-server.git
    cd jigsawstack-mcp-server/ai-web-search
    
  2. Install Dependencies:

    npm install
    

    (Alternatively, you can use yarn.)

  3. Configure Your API Key:

    Obtain your JIGSAWSTACK_API_KEY from our dashboard, then export it:

    export JIGSAWSTACK_API_KEY=your_api_key
    
  4. Test Your Setup Using A Mock Client Script:

    // Demo for ai_web_search; save this script inside ai-web-search/
    import { spawn } from "child_process";
    
    // Start the server process first
    const serverProc = spawn("node", ["dist/index.js"]);
    
    // The JSON-RPC request we want to send to the server
    const request = JSON.stringify({
      jsonrpc: "2.0",
      method: "tools/call",
      params: {
        name: "ai_web_search",
        arguments: {
          query: "how old is the earth?",
          ai_overview: true,
        },
      },
      id: 1,
    });
    
    // Log the request being sent
    console.log("Sending request:", request);
    
    // Listen to the server's stdout for the response
    serverProc.stdout.on("data", (data) => {
      console.log("Server Response:", data.toString());
    });
    
    // Send the request to the server
    serverProc.stdin.write(request + "\n");
    
    // Listen to the server's stderr for errors, if any
    serverProc.stderr.on("data", (data) => {
      console.error("Server Error:", data.toString());
    });
    
    

    With the installation complete and the mock client script saved, running the script establishes SSE communication with the server and prints its response.

    Alternatively, you can use smithery.ai to install the JigsawStack MCP server as a prebuilt integration for clients like Claude Desktop:

    npx -y @smithery/cli@latest install @JigsawStack/ai-web-search --client claude
    

    Or use MCP Inspector, a visual debugger for MCP servers. Let's try it with the JigsawStack AI Web Search MCP server:

    cd jigsawstack-mcp-server/ai-web-search
    npm install
    export JIGSAWSTACK_API_KEY=sk....
    npm run inspector
    

    Within the UI, continue by clicking on Connect → List Tools → AI Web Search

What’s Inside?

  • ai-web-scraper: Let AI scrape the internet for you.

  • ai-web-search: Leverage AI to handle complex search queries.

  • image-generation: Generate images from prompts, returning a base64 string.

  • vOCR: Extract data from any document type into a consistent structure, using fine-tuned vLLMs for the highest accuracy.

  • translation: Translate any text, conversation, or document into any language in real time, while preserving context, tone, and meaning with high accuracy and reliability. We support 180+ language pairs!
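Since the image-generation tool returns its result as a base64 string, here is a minimal sketch of persisting that output to disk; the payload shape and field names are illustrative, not the server's exact response format.

```typescript
import { writeFileSync } from "node:fs";

// Stand-in for a tool result carrying base64 image data (not a real image).
const toolResult = {
  content: [{ type: "text", text: Buffer.from("png-bytes").toString("base64") }],
};

// Decode the base64 payload back into raw bytes and write it to a file.
const imageBytes = Buffer.from(toolResult.content[0].text, "base64");
writeFileSync("generated.png", imageBytes);
```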

This setup empowers your LLM to autonomously decide when to use a tool, providing a standardized, efficient way to tap into our AI capabilities. Whether you're self-hosting for full control or opting for a managed solution through Smithery.ai, our MCP Server makes integration seamless.

Conclusion

Bringing it all together, the JigsawStack MCP server makes JigsawStack's tools (backed by services) available to your AI through MCP. This means you can have an AI agent that on its own can decide to search the web, scrape a site, or create an image when needed – all by using the MCP server as the go-between. It's open source and accessible: you can run it yourself or quickly install it via smithery.ai, and it works with any MCP-aware client or platform.

Why should developers care?

If you're building an AI-driven application or agent, using MCP servers like ours can dramatically speed up development and expand functionality. You don't have to reinvent solutions for common tasks; you can leverage a growing ecosystem of MCP servers. JigsawStack MCP server, in particular, gives you a grab-bag of useful skills in one package. It exemplifies how a well-designed MCP server can boost an AI's usefulness – from answering more questions (thanks to live search) to automating web tasks and generating creative content.

As MCP gains adoption, we expect to see more servers and tools become available, making AI systems even more extensible. So, if you're excited about building smarter AI assistants or just want to tinker with the latest in AI infrastructure, give our JigsawStack MCP server a try. With a few commands, you'll equip your LLM with a new set of superpowers.

👥 Join the JigsawStack Community

Have questions or want to show off what you’ve built? Join the JigsawStack developer community on Discord and X/Twitter. Let’s build something amazing together!