MCP Setup
Use Bublik's MCP (Model Context Protocol) server to connect AI agents and assistants to your Bublik instance.
What You Need
Before connecting a client:

- Your Bublik instance should be updated to v2.10.4 or newer.
- The MCP endpoint must be reachable from the machine where your AI client runs.
- Replace `<bublik_url>` in the examples below with your public Bublik URL.
Endpoint
Bublik exposes the MCP server at:

```
https://<bublik_url>/mcp
```
Client Configuration
OpenCode
Add this server definition to your OpenCode MCP configuration:
```json
{
  "bublik-mcp": {
    "type": "remote",
    "url": "https://<bublik_url>/mcp"
  }
}
```
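In a typical OpenCode setup, this definition lives under the top-level `mcp` key of an `opencode.json` config file. The sketch below shows that layout; the file name, the `mcp` key, and the `enabled` flag are assumptions based on OpenCode's general configuration conventions, not Bublik-specific facts:

```json
{
  "mcp": {
    "bublik-mcp": {
      "type": "remote",
      "url": "https://<bublik_url>/mcp",
      "enabled": true
    }
  }
}
```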
Official docs: OpenCode MCP servers
Claude
Add this server definition to your Claude MCP configuration:
```json
{
  "bublik-mcp": {
    "type": "http",
    "url": "https://<bublik_url>/mcp"
  }
}
```
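For Claude Code, one common place for this definition is a project-scoped `.mcp.json` file with the server listed under `mcpServers`. The layout below is an assumed sketch based on Claude Code's MCP configuration conventions:

```json
{
  "mcpServers": {
    "bublik-mcp": {
      "type": "http",
      "url": "https://<bublik_url>/mcp"
    }
  }
}
```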
Official docs: Claude Code MCP
Codex
Codex supports MCP in both the CLI and the IDE extension. Add the server to ~/.codex/config.toml or to a project-scoped .codex/config.toml in a trusted project:
```toml
[mcp_servers.bublik-mcp]
url = "https://<bublik_url>/mcp"
```
Official docs: Codex MCP
Server Enablement
If you manage your own Bublik deployment, use the initial MCP rollout instructions from the v2.10.4 release notes to enable the server on backend or Docker installations.
Docker .env Options
For Docker deployments, MCP-related settings can be added to your .env file:
```
BUBLIK_DOCKER_MCP_HOST=127.0.0.1
BUBLIK_DOCKER_MCP_PORT=8001
```
Options:
- `BUBLIK_DOCKER_MCP_HOST`: Host interface used by the MCP service inside the Docker deployment. The initial setup uses `127.0.0.1`.
- `BUBLIK_DOCKER_MCP_PORT`: Port used by the MCP service inside the Docker deployment. The initial setup uses `8001`.
If these variables are missing, add them before running your Docker update steps from the release notes.
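The check-and-add step above can be sketched as a small idempotent shell snippet. `ENV_FILE` is an illustrative override for where your deployment's `.env` lives; the values match the initial setup defaults:

```shell
# Add the MCP variables to the Docker .env file only if they are missing,
# so re-running the snippet never duplicates lines.
ENV_FILE="${ENV_FILE:-.env}"
touch "$ENV_FILE"
grep -q '^BUBLIK_DOCKER_MCP_HOST=' "$ENV_FILE" \
  || echo 'BUBLIK_DOCKER_MCP_HOST=127.0.0.1' >> "$ENV_FILE"
grep -q '^BUBLIK_DOCKER_MCP_PORT=' "$ENV_FILE" \
  || echo 'BUBLIK_DOCKER_MCP_PORT=8001' >> "$ENV_FILE"
```

Running it twice leaves the file unchanged, so it is safe to include in an update script.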
Verify the Setup
After configuration:

- Restart or reload your AI client if needed.
- Confirm the client can connect to `https://<bublik_url>/mcp`.
- Ask a simple Bublik-related question from the client to verify that the MCP server responds.
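The connection check in step 2 can also be done from a terminal. The first message an MCP client sends is a JSON-RPC 2.0 `initialize` request; the snippet below builds one. The `protocolVersion` value and the `curl` headers are assumptions based on MCP's streamable HTTP transport, not Bublik-specific facts:

```shell
# JSON-RPC 2.0 "initialize" request -- the first message an MCP client sends.
INIT_REQUEST='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"manual-probe","version":"0.0.1"}}}'
echo "$INIT_REQUEST"

# Manual probe (replace <bublik_url> with your instance before running):
# curl -sS -X POST "https://<bublik_url>/mcp" \
#   -H 'Content-Type: application/json' \
#   -H 'Accept: application/json, text/event-stream' \
#   -d "$INIT_REQUEST"
```

A JSON response (rather than a connection error or an HTML error page) indicates the endpoint is reachable and speaking MCP.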
For an early example of usage, see the shared conversation.