One Go codebase, multiple binaries. Each binary reads OpenAPI specs at build time and turns them into a typed CLI + MCP server. You get commands, flags, help text, and auth handling for free. No runtime dependencies.
| Binary | What it talks to | Config |
|---|---|---|
| hai-compute | Portal (sandboxes, agents) | configs/compute.yaml |
| hai-social | X, LinkedIn, Instagram, Web | configs/social.yaml |
| hai-renderer | PDF rendering, image processing | configs/renderer.yaml |
| hai-documents | Document indexing and search | configs/documents.yaml |
Each config is just a YAML file pointing at OpenAPI spec URLs. Adding a new service is adding a new YAML file. That's it.
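As a rough sketch, a config might look something like this (the field names and spec URLs here are illustrative guesses, not the actual schema — see the files in configs/ for the real shape):

```yaml
# configs/social.yaml (hypothetical shape)
binary: hai-social
auth:
  env: HEDWIGAI_AUTH_TOKEN
specs:
  - name: x
    url: https://api.hedwigai.com/x/openapi.json
  - name: linkedin
    url: https://api.hedwigai.com/linkedin/openapi.json
```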
Pick a config, run make:
```sh
make build CONFIG_FILE=configs/social.yaml
```

This fetches the OpenAPI specs, generates Go code from them, and compiles a single static binary. You now have `./hai-social` in your directory.
Cross-compile for all platforms:
```sh
make build-one CONFIG_FILE=configs/social.yaml
```

Build everything:

```sh
make build-all
```

Set your token and go:
```sh
export HEDWIGAI_AUTH_TOKEN="your-token"
./hai-social x get-x-profile --username karpathy
./hai-social web search --query "transformer architecture"
./hai-compute portal health-check
```

Every command has `--help`. The flag names, types, and descriptions all come straight from the OpenAPI specs.
Three ways to run it. Pick the one that matches your setup.
The simplest option. Add to `.mcp.json`:

```json
{
  "mcpServers": {
    "hai-social": {
      "command": "./hai-social",
      "args": ["mcp", "server", "--stdio"],
      "env": {
        "HEDWIGAI_AUTH_TOKEN": "your-token"
      }
    }
  }
}
```

Claude Code talks to the binary over stdin/stdout. Tools show up immediately.
```sh
./hai-social mcp server --port 8080
```

Clients connect to:

- `POST /mcp` - streamable HTTP (Claude Code, newer clients)
- `GET /sse` + `POST /message` - legacy SSE (Claude Desktop)

Auth goes in the `Authorization: Bearer` header. OAuth discovery at `/.well-known/oauth-protected-resource`.
This is what runs at mcp.hedwigai.com. One server, all configs:
```sh
./hai-compute mcp server --config-dir configs/ --port 8080
```

Each config gets its own prefix:

- `/compute/mcp` - portal tools
- `/social/mcp` - X, LinkedIn, Instagram, Web tools
- `/renderer/mcp` - PDF and image tools
- `/documents/mcp` - document tools
Hit `GET /` for a JSON index of all available servers and tool counts.
See what you've got:

```sh
./hai-social mcp list
```

```
x          x_get-x-profile                         Get user profile
x          x_get-x-posts                           Get user posts
x          x_search-x                              Search posts
linkedin   linkedin_get-linked-in-member-profile   Get member profile
...
```
HedwigAI runs the multi-config server at mcp.hedwigai.com. No local binary needed. Point your MCP client at the right prefix:
| Service | Endpoint |
|---|---|
| Compute | https://mcp.hedwigai.com/compute/mcp |
| Social | https://mcp.hedwigai.com/social/mcp |
| Renderer | https://mcp.hedwigai.com/renderer/mcp |
| Documents | https://mcp.hedwigai.com/documents/mcp |
Auth is Bearer token in the Authorization header. The server supports OAuth discovery (RFC 9728) so compliant MCP clients can negotiate auth automatically via identity.hedwigai.com.
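RFC 9728 defines the shape of that discovery document: a small JSON object naming the resource and its authorization servers. An illustrative response (values assumed, not captured from the live endpoint):

```json
{
  "resource": "https://mcp.hedwigai.com/social/mcp",
  "authorization_servers": ["https://identity.hedwigai.com"]
}
```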
OneCLI is an open-source credential vault for AI agents. Instead of scattering JWTs across env vars and config files, you store them once in OneCLI and it injects them transparently into outbound requests.
git clone https://github.com/onecli/onecli.git && cd onecli
```sh
docker compose -f docker/docker-compose.yml up
```

Dashboard at `localhost:10254`. Gateway at `localhost:10255`.
- Create an agent in the OneCLI dashboard
- Store your `HEDWIGAI_AUTH_TOKEN`, matched to host `*.hedwigai.com`
- Point the CLI at the gateway:

```sh
export HTTPS_PROXY=http://localhost:10255
./hai-social x get-x-posts --username karpathy
```

The gateway intercepts the request, swaps in the real token, and forwards it to HedwigAI. The CLI never sees the actual credential.
For MCP servers, same idea:
```json
{
  "mcpServers": {
    "hai-social": {
      "command": "./hai-social",
      "args": ["mcp", "server", "--stdio"],
      "env": {
        "HTTPS_PROXY": "http://localhost:10255"
      }
    }
  }
}
```

No `HEDWIGAI_AUTH_TOKEN` in the config. OneCLI handles it.
- Tokens stay in one encrypted vault, not in dotfiles and env vars everywhere
- Rotate once, all agents pick it up
- Audit trail of which agent called what
- Without OneCLI everything works exactly the same (proxy support is a no-op when `HTTPS_PROXY` is unset)
```sh
make build-one CONFIG_FILE=configs/social.yaml
make release-one CONFIG_FILE=configs/social.yaml
```

This cross-compiles for linux/darwin/windows (amd64+arm64), generates checksums, creates an install script, and pushes to GitHub Releases.
- `configs/*.yaml` - what to build (binary name, auth, OpenAPI spec URLs)
- `tools/generate/` - reads specs at build time, writes `generated/operations.go`
- `internal/spec/` - fetches and parses OpenAPI specs
- `internal/command/` - Cobra CLI builder + HTTP executor
- `internal/mcpbridge/` - MCP server (stdio, HTTP, multi-config)
- `generated/operations.go` - auto-generated, don't touch
The basic idea: OpenAPI spec goes in, CLI commands and MCP tools come out. Everything interesting happens at build time. The binary ships with no spec files, no config, no runtime dependencies. Just commands.