Last updated: June 2025
Use MCP (Model Context Protocol) to connect your AI agent to GitKraken CLI
Model Context Protocol (MCP) allows you to connect the GitKraken CLI to your Large Language Model (LLM) client. With this enabled, you can manage your PRs, Issues, Work Items, and more directly from your AI Desktop Agent.
Pre-Installation Checklist
- Download the GitKraken CLI
- Follow the GitKraken CLI Installation Guide
- Authenticate to your GitKraken Account
- Synchronize your GitKraken Integrations with the CLI (see the command sketch below)
Please see our MCP blog post with example workflows that can be enabled with this powerful tool.
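If you prefer to work through the checklist from a terminal, the sketch below shows one possible sequence. The subcommand names (gk auth login for authentication, gk provider add for connecting an integration) are assumptions based on current CLI releases; run gk help to confirm the exact commands for your version.

# Confirm the CLI is installed and on your PATH
gk version

# Authenticate to your GitKraken account
gk auth login

# Connect an integration (GitHub shown here) so PRs, issues, and work items
# are available through MCP. The exact subcommand may differ by CLI version.
gk provider add github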
MCP Setup
The GitKraken CLI connects locally with most LLM clients through a standard MCP server entry that runs the gk mcp command. Add the following configuration to your client's MCP settings:
{
  "mcpServers": {
    "gitkraken": {
      "command": "gk",
      "args": ["mcp"]
    }
  }
}
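Before pointing a client at this entry, you can confirm the server launches by running the same command the client will execute. The server speaks MCP over stdio, so it will sit waiting for input rather than print a prompt; stop it with Ctrl+C.

# Start the MCP server manually as a sanity check, then press Ctrl+C to exit
gk mcp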
Example Integrations
Claude
- Open Settings from the sidebar or use keyboard shortcuts.
- Select the Developer tab in the pop-up window.
- Click Edit Config to open the claude_desktop_config.json file.
- Add the following MCP configuration:
{
  "mcpServers": {
    "gitkraken": {
      "command": "gk",
      "args": ["mcp"]
    }
  }
}
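The location of claude_desktop_config.json varies by platform. The paths below are the usual Claude Desktop defaults; the Edit Config button above always opens the correct file. Restart Claude Desktop after saving so the new server is loaded.

# Typical Claude Desktop config locations:
#   macOS:   ~/Library/Application Support/Claude/claude_desktop_config.json
#   Windows: %APPDATA%\Claude\claude_desktop_config.json
open "$HOME/Library/Application Support/Claude/claude_desktop_config.json"   # macOS example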
Windsurf
- Locate the mcp_config.json file. The default path is ~/.codeium/windsurf/mcp_config.json.
- Add the GitKraken CLI entry:
{
  "mcpServers": {
    "gitkraken": {
      "command": "gk",
      "args": ["mcp"]
    }
  }
}
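If the file does not exist yet, you can create it before pasting the entry above. The snippet below is a macOS/Linux convenience sketch; on Windows, create the file under the equivalent path in your user profile.

# Create the Windsurf config directory and open the MCP config in your editor
mkdir -p ~/.codeium/windsurf
${EDITOR:-nano} ~/.codeium/windsurf/mcp_config.json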
GitHub Copilot
- Click the Copilot icon.
- Select Edit Preferences from the menu.
- In the left panel, expand Copilot Chat and click MCP.
- Add the following to the JSON configuration file:
{
  "servers": {
    "gitkraken": {
      "type": "stdio",
      "command": "gk",
      "args": ["mcp"]
    }
  }
}
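Because the entry uses "type": "stdio", Copilot launches gk as a subprocess, so the binary must be discoverable on the PATH your editor uses. A quick check:

# Verify gk resolves on your PATH
which gk     # macOS/Linux
where gk     # Windows (cmd)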