A modern, responsive, and engaging chat interface designed for LM Studio. This web application connects to your local LLMs (Large Language Models) to provide a seamless chat experience with support for markdown rendering, code highlighting, and tables.
- ⚡ Fast, Local Intelligence: Connects directly to LM Studio's local server. No data leaves your machine unless you use external tools.
- 👁️ Model Status Display: Shows the currently loaded model name in the sidebar status area.
- 📊 Rich Content Support:
  - Full Markdown support (headers, lists, tables).
  - Syntax highlighting for code blocks.
  - GitHub Flavored Markdown (strikethrough, tables, task lists).
- 📱 Fully Responsive: Optimized for both Desktop and Mobile devices with a smooth, collapsible sidebar drawer.
- 💾 Smart Session Management:
  - Auto-saves chat history to `localStorage` (see the persistence sketch after this list).
  - Auto-Titles: Uses the LLM to generate concise titles for new chats automatically.
  - Create new chats and delete old ones easily.
- 💨 Smooth Streaming: Simulates a smooth typing effect for AI responses, creating a natural reading experience.
- ⚙️ Configurable: Easily change the LM Studio Base URL from the settings.
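The session persistence above boils down to serializing the chat list. A minimal sketch, assuming a `chats` array of `{ id, title, messages }` objects and a storage key of our own choosing (both are illustrative, not the app's actual schema):

```js
// Hypothetical persistence helpers; the storage key and chat shape are
// assumptions for illustration, not the app's actual schema.
const STORAGE_KEY = "lm-studio-chat-sessions";

export function saveSessions(chats) {
  // chats: [{ id, title, messages: [{ role, content }] }]
  localStorage.setItem(STORAGE_KEY, JSON.stringify(chats));
}

export function loadSessions() {
  try {
    return JSON.parse(localStorage.getItem(STORAGE_KEY)) ?? [];
  } catch {
    return []; // missing or corrupted data falls back to an empty history
  }
}
```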
- Node.js: Ensure you have Node.js installed (v16+).
- LM Studio: Download and install LM Studio.
  - Load a model (e.g., Llama 3, Mistral).
  - Go to the "Local Server" tab (the `<->` icon).
  - Start the server and make sure Cross-Origin Resource Sharing (CORS) is enabled (it usually is by default). You can verify the server is reachable with the snippet after this list.
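Once the server is running, you can sanity-check the connection from a browser console or a Node script. This assumes LM Studio's OpenAI-compatible REST endpoint on its default port 1234; adjust the URL if you changed the port:

```js
// Quick reachability check against LM Studio's OpenAI-compatible endpoint.
// Port 1234 is LM Studio's default; change it if you configured another one.
const BASE_URL = "http://localhost:1234";

fetch(`${BASE_URL}/v1/models`)
  .then((res) => res.json())
  .then((data) => console.log("Models reported by LM Studio:", data))
  .catch((err) => console.error("LM Studio server not reachable:", err));
```

If this succeeds from the terminal but fails with a CORS error when run from a page on another origin, re-check the CORS setting in LM Studio.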
- Clone the repository:
  ```bash
  git clone https://github.com/Sundareeshwaran/lm-studio-chat-agent.git
  cd lm-studio-chat-agent
  ```
- Install Dependencies:
  ```bash
  npm install
  # or
  pnpm install
  # or
  yarn install
  ```
- Run Development Server:
  ```bash
  npm run dev
  ```
- Open in Browser: Navigate to `http://localhost:5173` (or the URL shown in your terminal).
- Chatting: Type your query in the input box and hit Enter or click Send.
- New Chat: Click "New Chat" in the sidebar to start fresh. The AI will automatically name it after your first message.
- Settings: Click "Settings" in the sidebar to change the LM Studio connection URL (default: `ws://localhost:1234` or `http://localhost:1234`); see the connection sketch after this list.
- Model Info: Hover over the model name in the sidebar status bar to see the full path of the loaded model.
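For context on what the Settings URL feeds into: a minimal sketch of talking to LM Studio through `@lmstudio/sdk` with a configurable base URL. The model identifier below is a placeholder, and the app's actual `useLMStudio` hook is more involved than this:

```js
import { LMStudioClient } from "@lmstudio/sdk";

// The base URL is what the Settings panel changes; ws://localhost:1234 is the default.
const client = new LMStudioClient({ baseUrl: "ws://localhost:1234" });

export async function ask(prompt) {
  // "your-model-identifier" is a placeholder for whatever model you loaded in LM Studio.
  const model = await client.llm.model("your-model-identifier");
  let reply = "";
  // The prediction can be consumed fragment by fragment, which is what
  // makes a streaming/typing effect possible in the UI.
  for await (const fragment of model.respond(prompt)) {
    reply += fragment.content;
  }
  return reply;
}
```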
- Frontend: React.js, Tailwind CSS
- Animations: Framer Motion
- Icons: Lucide React
- Markdown: React Markdown, Remark GFM (see the rendering sketch after this list)
- AI Integration: `@lmstudio/sdk`
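As a point of reference, GFM-flavored rendering with these two libraries typically looks like the sketch below. The component name is hypothetical, and the app's actual `ChatMessages.jsx` does more (e.g., syntax highlighting):

```jsx
import ReactMarkdown from "react-markdown";
import remarkGfm from "remark-gfm";

// Hypothetical message renderer: GitHub Flavored Markdown adds tables,
// strikethrough, and task lists on top of standard Markdown.
export function MessageBody({ content }) {
  return <ReactMarkdown remarkPlugins={[remarkGfm]}>{content}</ReactMarkdown>;
}
```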
```
src/
├── components/
│   ├── Sidebar.jsx       # Side navigation & history
│   ├── ChatMessages.jsx  # Message list & rendering
│   └── ChatInput.jsx     # Input area
├── hooks/
│   └── useLMStudio.js    # Hook for LM Studio connection (sketch below)
├── App.jsx               # Main application layout
└── main.jsx              # Entry point
```
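To show how these pieces typically fit together, here is a rough, hypothetical skeleton of what a hook like `useLMStudio.js` might expose to `App.jsx`. The names and shape are assumptions, not the actual implementation:

```jsx
import { useCallback, useState } from "react";

// Hypothetical hook skeleton: the real useLMStudio.js differs, but a connection
// hook generally exposes the message list, a send function, and a busy flag.
export function useLMStudioSketch(askModel) {
  const [messages, setMessages] = useState([]);
  const [isGenerating, setIsGenerating] = useState(false);

  const sendMessage = useCallback(
    async (content) => {
      setMessages((prev) => [...prev, { role: "user", content }]);
      setIsGenerating(true);
      try {
        // askModel could be the ask() helper sketched earlier.
        const reply = await askModel(content);
        setMessages((prev) => [...prev, { role: "assistant", content: reply }]);
      } finally {
        setIsGenerating(false);
      }
    },
    [askModel]
  );

  return { messages, sendMessage, isGenerating };
}
```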
Contributions are welcome! Please feel free to submit a Pull Request.
Built with ❤️ for the AI Community.