Scalr's MCP Server: The AI Assistant That Does the Heavy Lifting

Use the Scalr MCP server to make your platform teams significantly more productive when analyzing and managing your Terraform ecosystem.

What is a Model Context Protocol (MCP) Server?

The Model Context Protocol (MCP) Server is a crucial link that enhances how large language models (LLMs) interact with external, dynamic data sources.

While models like Claude and Gemini possess vast inherent knowledge, that knowledge is typically static, based on their initial training data. The MCP Server addresses this limitation by acting as an intelligent, standardized bridge.

This server is a lightweight, standalone service. Its primary function is to expose current, real-time context—including up-to-date data, access to tools, and enterprise resources (such as live databases, APIs, or internal files)—directly to the AI application (the MCP Client).

By establishing this open and standardized interface, the MCP Server enables the LLM to access information beyond its training set and execute specific actions based on current reality. This capability is what ultimately transforms the AI from a general knowledge base into a powerful, context-aware agent.
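Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages over a standardized transport. A tool invocation, for example, looks roughly like the following (the tool name and arguments here are illustrative, not actual Scalr tool names):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "list_workspaces",
    "arguments": { "environment": "production" }
  }
}
```

The client sends messages like this on the model's behalf, and the server responds with structured results the model can reason over.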

The Scalr MCP Server: AI for Infrastructure as Code


For years, managing cloud environments required infrastructure engineers to write complex CLI commands and navigate dense APIs just to manage and understand their infrastructure. Tasks like pulling usage reports, auditing security tokens, or creating a new workspace were manual, time-consuming efforts requiring deep, specific knowledge.

What if all that work could be done with a simple, conversational request?

The Scalr MCP Server fundamentally changes this by letting an AI assistant translate conversational requests into powerful Infrastructure as Code (IaC) actions. The Scalr MCP Server is the brain that securely connects your AI client (such as Claude Desktop or VS Code) directly to your Scalr platform via the Scalr API. It runs locally as a containerized service, translating your natural language requests into the precise API calls needed to manage your infrastructure.


How the Scalr MCP Server Boosts Efficiency

The core value proposition of the Scalr MCP Server is simple: it dramatically reduces the cognitive load and manual effort of daily infrastructure operations, making your teams more efficient and Scalr's powerful API accessible to everyone.

Effortless Compliance and Audits

The AI assistant quickly handles complex governance and security requests that previously required custom scripting and data aggregation from multiple tools:

  • "Review token usage in my account. Show tokens that don't have owners, have never been rotated, and don't have a description."
  • "What policy checks have failed the most in the production environment?"
  • "List all users and teams that have the admin access policy assigned to them."

Rapid Discovery and Management

Forget digging through documentation for the right CLI flag or API endpoint. You can now gain a holistic view and manage your infrastructure instantly just by asking:

  • "Show me the top 10 workspaces with the highest failure rates."
  • "Show me all workspaces without drift detection in the production environment."

Reduced Cognitive Load for DevOps Teams

By putting an intelligent layer between the user and the complexity of the API, the MCP approach means your team no longer needs to memorize specific API endpoints, query parameters, or filter syntax. The AI handles the translation from plain English to the required API calls, lowering the barrier to entry and accelerating your entire infrastructure team.


Quick Start: Get Running in Minutes

Setting up the Scalr MCP Server is straightforward, requiring only Docker and a Scalr API Token.

  1. Get Your API Token: Create a Personal Access Token or Service Account Token in your Scalr account settings with the required permissions.
  2. Configure Your AI Client: Add the Scalr configuration to your client's configuration file (e.g., claude_desktop_config.json). This configuration tells your client how to launch the server locally via Docker.
{
  "mcpServers": {
    "scalr": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "--pull=always",
        "--env",
        "SCALR_API_TOKEN=your_api_token_here",
        "--env",
        "SCALR_API_URL=https://your-account.scalr.io",
        "scalr/mcp-server:latest"
      ]
    }
  }
}
  3. Start Using! After restarting your AI client, you can immediately begin managing and querying your infrastructure using natural language.
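If you already have other MCP servers configured, you can merge the Scalr entry into the existing file rather than editing JSON by hand. A minimal sketch (the config path below is the macOS default for Claude Desktop, as an assumption to verify for your OS):

```python
import json
from pathlib import Path

# macOS default location for Claude Desktop's config -- adjust for your OS.
CONFIG_PATH = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"

# Same entry as the quick-start snippet above.
SCALR_SERVER = {
    "command": "docker",
    "args": [
        "run", "--rm", "-i", "--pull=always",
        "--env", "SCALR_API_TOKEN=your_api_token_here",
        "--env", "SCALR_API_URL=https://your-account.scalr.io",
        "scalr/mcp-server:latest",
    ],
}

def add_scalr_server(config: dict) -> dict:
    """Return a copy of the client config with the Scalr server merged in,
    preserving any MCP servers that are already configured."""
    merged = dict(config)
    servers = dict(merged.get("mcpServers", {}))
    servers["scalr"] = SCALR_SERVER
    merged["mcpServers"] = servers
    return merged

# Example: merge into an existing config that already has another server.
existing = {"mcpServers": {"other": {"command": "npx"}}}
print(json.dumps(add_scalr_server(existing), indent=2))
```

Restart the client after writing the file so it picks up the new server entry.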

Learn More: Ready to start talking to your Scalr control plane? Explore the full details in the Scalr MCP Server Documentation.