Configure Your fault MCP Agent Server
This guide will take you through configuring the fault MCP server.
Prerequisites
- Install fault
  If you haven't installed fault yet, follow the installation instructions. Make sure the fault binary can be found in your PATH (see the quick check below).
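If you prefer to verify this programmatically, here is a minimal sketch using only the Python standard library; it mirrors the shutil.which lookup used by the FastMCP example later in this guide:

```python
# Minimal sketch: confirm the fault binary is discoverable on your PATH.
import shutil

path = shutil.which("fault")
print(f"fault found at {path}" if path else "fault is not on your PATH")
```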
Tip
fault implements the MCP server interface. It currently relies on the stdio transport, which should be supported by any MCP-aware client.
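If you want to exercise the stdio transport outside of an editor, the sketch below drives fault with the official MCP Python SDK. The mcp package is an assumption of this example, not something fault requires:

```python
# Minimal sketch, assuming the `mcp` Python SDK is installed (pip install mcp).
# It spawns `fault agent tool` over stdio and prints the tools it exposes.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Pass env=... to StdioServerParameters if your tools need OPENAI_API_KEY.
    server = StdioServerParameters(command="fault", args=["agent", "tool"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


if __name__ == "__main__":
    asyncio.run(main())
```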
Cursor
- Configure the MCP settings for Cursor
Add the following section to your global ~/.cursor/mcp.json file:
{ "mcpServers": { "fault": { "type": "stdio", "command": "fault", "disabled": false, "args": [ "agent", "tool" ], "env": { "OPENAI_API_KEY": "..." } } } }
Tip
We are using the default OpenAI API and therefore expect the OPENAI_API_KEY environment variable. If you switch to ollama or OpenRouter, these settings may differ. Do not commit this file if you copy your key into it.
You may also want to enable a log file for the fault MCP server:
```json
{
  "mcpServers": {
    "fault": {
      "type": "stdio",
      "command": "fault",
      "disabled": false,
      "args": ["--log-file", "/tmp/fault.log", "--log-level", "debug", "agent", "tool"],
      "env": {
        "OPENAI_API_KEY": "..."
      }
    }
  }
}
```
You may want to explore the Cursor documentation for more information.
If you want to use ollama instead, for instance with the gemma3 model, you can do so as follows:
{ "mcpServers": { "fault": { "type": "stdio", "command": "fault", "disabled": false, "args": [ "--log-file", "/tmp/fault.log", "--log-level", "debug", "agent", "--llm-client", "ollama", "--llm-prompt-reasoning-model", "gemma3:4b", "--llm-prompt-chat-model", "gemma3:4b", "--llm-embed-model", "mxbai-embed-large", "tool" ] } } }
Kilo Code
- Configure the MCP settings for Kilo Code
Add the following section to the .kilocode/mcp.json file at the root directory of any project:
{ "mcpServers": { "fault": { "type": "stdio", "command": "fault", "disabled": false, "args": [ "agent", "tool" ], "env": { "OPENAI_API_KEY": "..." } } } }
Note
You may need to restart the Visual Studio Code instance for the changes to take effect.
Tip
We are using the default OpenAI API and therefore expect the OPENAI_API_KEY environment variable. If you switch to ollama or OpenRouter, these settings may differ. Do not commit this file if you copy your key into it.
You may also want to enable a log file for the fault MCP server:
```json
{
  "mcpServers": {
    "fault": {
      "type": "stdio",
      "command": "fault",
      "disabled": false,
      "args": ["--log-file", "/tmp/fault.log", "--log-level", "debug", "agent", "tool"],
      "env": {
        "OPENAI_API_KEY": "..."
      }
    }
  }
}
```
You may want to explore the Kilo Code documentation for more information.
If you want to use ollama instead, for instance with the gemma3 model, you can do so as follows:
{ "mcpServers": { "fault": { "type": "stdio", "command": "fault", "disabled": false, "args": [ "--log-file", "/tmp/fault.log", "--log-level", "debug", "agent", "--llm-client", "ollama", "--llm-prompt-reasoning-model", "gemma3:4b", "--llm-prompt-chat-model", "gemma3:4b", "--llm-embed-model", "mxbai-embed-large", "tool" ] } } }
Kwaak
- Configure the MCP settings for Kwaak
Add the following section to the kwaak.toml file at the root directory of any project:
```toml
[[mcp]]
name = "fault"
command = "fault"
args = ["--log-file", "/tmp/fault.log", "--log-level", "debug", "agent", "tool"]
env = [["OPENAI_API_KEY", "env:OPENAI_API_KEY"]]
```
Tip
We are using the default OpenAI API and therefore expect the OPENAI_API_KEY environment variable. If you switch to ollama or OpenRouter, these settings may differ. Do not commit this file if you copy your key into it.
Zed
- Configure the MCP settings for Zed
Add the following section to your ~/.zed/settings.json file:
{ "context_servers": { "fault": { "source": "custom", "command": { "path": "fault", "args": ["agent", "tool"], "env": { "OPENAI_API_KEY": "..." } }, "settings": {} } } }
Tip
We are using the default OpenAI API and therefore expect the OPENAI_API_KEY environment variable. If you switch to ollama or OpenRouter, these settings may differ. Do not commit this file if you copy your key into it.
You may also want to enable a log file for the fault MCP server:
```json
{
  "context_servers": {
    "fault": {
      "source": "custom",
      "command": {
        "path": "fault",
        "args": ["--log-file", "/tmp/fault.log", "--log-level", "debug", "agent", "tool"],
        "env": {
          "OPENAI_API_KEY": "..."
        }
      },
      "settings": {}
    }
  }
}
```
You may want to explore the Zed documentation for more information.
FastMCP
- Configure the MCP settings for FastMCP
FastMCP is configured programmatically. Build the client configuration as follows:
```python
import os
import shutil

from fastmcp import Client


async def list_fault_tools() -> None:
    config = {
        "mcpServers": {
            "local": {
                "command": shutil.which("fault"),
                "args": [
                    "--log-file", "/tmp/fault.log",
                    "--log-level", "debug",
                    "agent",
                    "tool"
                ],
                "env": {
                    "OPENAI_API_KEY": os.getenv("OPENAI_API_KEY")
                }
            },
        }
    }

    async with Client(config) as client:
        # fault_tools holds the MCP tool definitions exposed by fault
        fault_tools = await client.list_tools()
```
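To actually run this coroutine, wrap it with asyncio, for instance with a minimal driver like this:

```python
# Minimal sketch: execute the coroutine above. Add a print inside
# list_fault_tools() (e.g. of [t.name for t in fault_tools]) to see the tools.
import asyncio

if __name__ == "__main__":
    asyncio.run(list_fault_tools())
```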
Next Steps
You've successfully deployed the fault MCP server in your favourite AI code editor.
- Explore our MCP tools to learn how to first use the agent.