# Feather

A lightweight chat interface for LLMs. Designed as a minimalist alternative to Open WebUI or LibreChat.
## Quick Start

1. Install dependencies:

   ```bash
   npm install
   ```

2. Configure backend:

   ```bash
   cd server
   cp .env.example .env
   # Edit .env and add your API keys
   ```

3. Start development servers:

   ```bash
   # From project root
   npm run dev
   ```

4. Open browser:

   - Frontend: http://localhost:5173
   - Backend: http://localhost:3001
## Environment Variables

```env
# server/.env
PORT=3001
OPENAI_API_KEY=your_openai_key_here
ANTHROPIC_API_KEY=your_anthropic_key_here
GOOGLE_API_KEY=your_google_key_here

# client .env (Vite)
VITE_API_URL=http://localhost:3001
```

## Tech Stack

- Frontend: React + TypeScript + Vite + Tailwind CSS + Zustand
- Backend: Express + Vercel AI SDK (see the route sketch below the table)

| Layer | Technology |
|---|---|
| Build Tool | Vite |
| Framework | React |
| Language | TypeScript |
| Styling | Tailwind CSS |
| Components | Shadcn/UI |
| State | Zustand + useChat hook (messages) |
| Backend Framework | Express |
| AI Integration | Vercel AI SDK |
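
The table names Express plus the Vercel AI SDK on the backend. As a rough illustration of how those two fit together, here is a minimal sketch of a chat route. The route path (`/api/chat`), the provider map, and the default model are assumptions rather than this repo's actual code, and it assumes AI SDK v4-style APIs (`streamText`, `pipeDataStreamToResponse`):

```ts
// Hypothetical chat route: an illustration of Express + Vercel AI SDK,
// not this repo's actual server code.
import express from "express";
import { streamText, type LanguageModel } from "ai";
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";

const app = express();
app.use(express.json());

// Map a provider name sent by the client to an AI SDK model factory.
// The provider packages read OPENAI_API_KEY / ANTHROPIC_API_KEY from the environment.
const providers: Record<string, (modelId: string) => LanguageModel> = {
  openai: (modelId) => openai(modelId),
  anthropic: (modelId) => anthropic(modelId),
};

app.post("/api/chat", (req, res) => {
  const { messages, provider = "openai", model = "gpt-4o-mini" } = req.body;

  const result = streamText({
    model: providers[provider](model),
    messages,
  });

  // Stream tokens back in the data-stream format that useChat consumes.
  result.pipeDataStreamToResponse(res);
});

app.listen(3001, () => console.log("API listening on http://localhost:3001"));
```
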
## Project Structure

```
/feather
├── /client # React frontend
│ ├── /src
│ │ ├── /components
│ │ │ ├── /ui # Shadcn primitives
│ │ │ └── /chat # Chat components
│ │ ├── /lib # Utilities & types
│ │ ├── /store # Zustand config store
│ │ └── App.tsx # Main app with useChat
│ └── package.json
│
├── /server # Express proxy
│ ├── /src
│ │ ├── index.ts # Main server
│ │ ├── providers.ts # AI SDK provider registry
│ │ ├── config.ts # Environment config
│ │ ├── types.ts # TypeScript interfaces
│ │ ├── /routes # API routes
│ │ └── /lib # Utilities
│ ├── /data # db.json, uploads
│ ├── /tests # Vitest tests
│ └── package.json
│
└── package.json # Root orchestrator
```
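
The tree above points at a Zustand config store and an `App.tsx` built around `useChat`. Purely as an illustration (the store shape, field names, and model IDs are made up here, and it assumes the AI SDK v4-era `useChat` from `ai/react`), the two might be wired together roughly like this:

```tsx
// Illustrative only -- not the repo's actual App.tsx or store.
import { create } from "zustand";
import { useChat } from "ai/react";

// Zustand store holding chat configuration (which provider/model to use).
interface ConfigState {
  provider: string;
  model: string;
  setModel: (provider: string, model: string) => void;
}

export const useConfigStore = create<ConfigState>((set) => ({
  provider: "openai",
  model: "gpt-4o-mini",
  setModel: (provider, model) => set({ provider, model }),
}));

export default function App() {
  const { provider, model } = useConfigStore();

  // useChat owns the message list and streaming state; the extra body
  // fields tell the backend which provider/model to route to.
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: `${import.meta.env.VITE_API_URL}/api/chat`,
    body: { provider, model },
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <p key={m.id}>
          <b>{m.role}:</b> {m.content}
        </p>
      ))}
      <input value={input} onChange={handleInputChange} placeholder="Send a message" />
    </form>
  );
}
```
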

## Prerequisites

- Node.js 18+
- npm or yarn
- At least one LLM provider API key

## Scripts

```bash
# Install all dependencies
npm install

# Start both client and server
npm run dev

# Build for production
npm run build

# Run the test suite (Vitest)
npm test
```

## API Keys

- OpenAI: get your API key from platform.openai.com
- Anthropic: get your API key from console.anthropic.com
- Google: get your API key from makersuite.google.com
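
Only one of these keys is required (any single provider is enough, per the prerequisites). As a hedged illustration, a server could fail fast at startup if none is set; this sketch is not the repo's actual `config.ts`:

```ts
// Illustrative startup check, not the actual server/src/config.ts.
const providerKeys = ["OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GOOGLE_API_KEY"];
const configured = providerKeys.filter((key) => !!process.env[key]);

if (configured.length === 0) {
  // Fail fast: without at least one provider key the chat endpoints can't work.
  console.error(`Set at least one of ${providerKeys.join(", ")} in server/.env`);
  process.exit(1);
}

console.log(`LLM providers configured: ${configured.join(", ")}`);
```
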

## License

MIT
