MCP
Interact with the Commonware Library via Model Context Protocol.
Why Provide an MCP Server?
LLMs are trained on code that is months out of date. Web search (the default fallback for finding missing information) returns GitHub links that must be crawled file-by-file to extract anything relevant (if the crawler isn't rate-limited first). And the results you do find probably don't match the version you're building against.
We built our own MCP server to make LLMs building with the Commonware Library more effective. mcp.commonware.xyz provides unlimited access to a version-pinned index of all source code and documentation, along with a ranked search tool that surfaces more relevant snippets than grep (with surrounding context).
Claude Code
claude mcp add --transport http commonware-library https://mcp.commonware.xyz
Cursor
Add to ~/.cursor/mcp.json (global) or .cursor/mcp.json (project):
{
  "mcpServers": {
    "commonware-library": {
      "url": "https://mcp.commonware.xyz"
    }
  }
}
Other Clients
Use the Streamable HTTP endpoint: https://mcp.commonware.xyz
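For clients you wire up yourself, the first message sent over Streamable HTTP is a JSON-RPC initialize request POSTed to the endpoint. The sketch below builds that payload in Python; the protocol version string and the client name/version are assumptions for illustration (check the MCP specification for the version your client targets), and the request is constructed but not sent.

```python
import json

# The Streamable HTTP endpoint from this page.
ENDPOINT = "https://mcp.commonware.xyz"

# JSON-RPC "initialize" request per the MCP handshake. The
# protocolVersion and clientInfo values here are placeholders.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Streamable HTTP POSTs JSON-RPC messages and must accept either a
# plain JSON response or an SSE stream in reply.
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}

body = json.dumps(initialize_request)
```

Any HTTP client (curl, `requests`, fetch) can then POST `body` with `headers` to the endpoint and read the response.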
If your AI assistant does not support MCP, you can use llms.txt to provide context manually.