From the beginning, Checkly has bet on Monitoring as Code, which lets you create and control your entire monitoring infrastructure in code. Checkly constructs can express all of your monitoring resources and their properties.
api.check.ts
import { ApiCheck, AssertionBuilder } from "checkly/constructs"

new ApiCheck("api-health-check", {
  name: "API Health Check",
  request: {
    url: "https://danube-web.shop/api/books",
    method: "GET",
    assertions: [
      AssertionBuilder.statusCode().equals(200),
    ],
  },
})
All your monitoring resources can be updated, tested and deployed via the Checkly CLI.
# test your monitoring configuration
npx checkly test

# deploy and update your monitoring setup
npx checkly deploy
The Monitoring as Code workflow is AI-native by default: LLMs are excellent at writing and editing Checkly construct code, and modern AI agents can easily execute CLI commands. You only need to provide the necessary context about Checkly and your monitoring setup to your AI agent of choice.

Add custom Checkly rules to your AI conversation

The checkly.rules.md file includes best practices, example code and required CLI commands to give your AI workflow enough context to perform Checkly-related tasks. Once the Checkly rules are included in your AI context window, your agent can effectively assist you in managing your monitoring setup. It will be able to:

Create new checks, alert channels or other constructs

“Can you create a new BrowserCheck monitoring example.com?” (see the sketch after this list)

Gather information about the current monitoring setup

“What are the currently used monitoring locations?”

Bulk-update your monitoring resources

“Can you change all checks to run every 5 minutes instead of every 10 minutes?”
With enough application context, you can even create checks for your specific use cases.

Analyze application code and create the monitoring setup

“Can you create new API Checks for the application API endpoints?”
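
For instance, an agent responding to the BrowserCheck prompt above, combined with a request to run every five minutes, might generate a construct roughly like the following (a minimal sketch; the logical ID, check name, and the Playwright spec file referenced by entrypoint are illustrative assumptions):
browser.check.ts
import { BrowserCheck, Frequency } from "checkly/constructs"

// A Browser check that runs the referenced Playwright spec every 5 minutes.
// The spec file name is an assumption; point it at your own test.
new BrowserCheck("example-homepage-check", {
  name: "Example Homepage Check",
  frequency: Frequency.EVERY_5M,
  code: {
    entrypoint: "./example-homepage.spec.ts",
  },
})
As with any other construct, the agent can verify the new check with npx checkly test before rolling it out with npx checkly deploy.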
Find a live session automating Checkly monitoring below.

Why is there no Checkly MCP server?

The Model Context Protocol (MCP) is often used to enable LLMs to interact with external systems. It acts as a bridge between the AI model and the target system, translating natural-language commands into actionable API calls or code snippets. With Monitoring as Code, Checkly already provides a native way to control your monitoring infrastructure via code and the command line. Whether you need to create new resources or update existing ones, an AI agent can write and update the necessary construct files and execute the Checkly CLI commands autonomously.
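
To make this concrete, here is a rough sketch of how an agent might extend the API check from earlier with an email alert channel, purely by editing construct code; the channel's logical ID and address are illustrative assumptions:
api.check.ts
import { ApiCheck, AssertionBuilder, EmailAlertChannel } from "checkly/constructs"

// Alert channel added by the agent; the address is an illustrative placeholder.
const emailChannel = new EmailAlertChannel("email-alert-primary", {
  address: "ops@example.com",
})

new ApiCheck("api-health-check", {
  name: "API Health Check",
  // Route alerts for this check to the email channel defined above.
  alertChannels: [emailChannel],
  request: {
    url: "https://danube-web.shop/api/books",
    method: "GET",
    assertions: [
      AssertionBuilder.statusCode().equals(200),
    ],
  },
})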

Additional Resources

Follow these guides if you use one of the popular AI coding tools: