A lightweight frontend for LM Studio local server APIs. Built using React, Vite, and Tailwind CSS with full support for streaming responses and GitHub Flavored Markdown.

LM Studio AI Agent

A modern, responsive, and engaging chat interface designed for LM Studio. This web application connects to your local LLMs (Large Language Models) to provide a seamless chat experience with support for markdown rendering, code highlighting, and tables.

✨ Features

  • ⚡ Fast, Local Inference: Connects directly to LM Studio's local server; no data leaves your machine unless you use external tools.
  • 👁️ Model Status Display: Shows the currently loaded model name in the sidebar status area.
  • 📊 Rich Content Support:
    • Full Markdown support (Headers, lists, tables).
    • Syntax highlighting for code blocks.
    • GitHub Flavored Markdown (strikethrough, tables, tasks).
  • 📱 Fully Responsive: Optimized for both Desktop and Mobile devices with a smooth, collapsible sidebar drawer.
  • 💾 Smart Session Management:
    • Auto-saves chat history to localStorage.
    • Auto-Titles: Uses the LLM to generate concise titles for new chats automatically.
    • Create new chats and delete old ones easily.
  • 💨 Smooth Streaming: Simulates a smooth typing effect for AI responses, creating a natural reading experience.
  • ⚙️ Configurable: Easily change the LM Studio Base URL from the settings.

🚀 Getting Started

Prerequisites

  1. Node.js: Ensure you have Node.js installed (v16+).
  2. LM Studio: Download and install LM Studio.
    • Load a model (e.g., Llama 3, Mistral, etc.).
    • Go to the "Local Server" tab (<-> icon).
    • Start the server. Ensure Cross-Origin Resource Sharing (CORS) is enabled (it usually is by default).
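
Before launching the frontend, you can confirm the server is reachable. Assuming the default port 1234, LM Studio's OpenAI-compatible endpoint lists the loaded models:

```shell
# Lists the models currently loaded in LM Studio.
# Assumes the local server is running on the default port 1234.
curl http://localhost:1234/v1/models
```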

Installation

  1. Clone the repository:

    git clone https://github.com/Sundareeshwaran/lm-studio-chat-agent.git
    cd lm-studio-chat-agent
  2. Install Dependencies:

    npm install
    # or
    pnpm install
    # or
    yarn install
  3. Run Development Server:

    npm run dev
  4. Open in Browser: Navigate to http://localhost:5173 (or the URL shown in your terminal).

💡 How to Use

  1. Chatting: Type your query in the input box and hit Enter or click Send.
  2. New Chat: Click "New Chat" in the sidebar to start fresh. The AI will automatically name it after your first message.
  3. Settings: Click "Settings" in the sidebar to change the LM Studio connection URL (default: ws://localhost:1234, since the @lmstudio/sdk connects over WebSocket; the OpenAI-compatible REST API is served at http://localhost:1234).
  4. Model Info: Hover over the model name in the sidebar status bar to see the full path of the loaded model.
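
The auto-naming in step 2 can be sketched as asking the model for a short title, with a plain truncation fallback if that request fails. The prompt wording and the names `buildTitlePrompt`/`fallbackTitle` are illustrative assumptions, not the app's exact code:

```javascript
// Hypothetical sketch of auto-titling a new chat from its first user message.
// buildTitlePrompt produces an OpenAI-style message array that could be sent
// to LM Studio's chat endpoint; fallbackTitle is used if that request fails.
function buildTitlePrompt(firstUserMessage) {
  return [
    {
      role: "system",
      content:
        "Summarize the user's message as a chat title of at most five words. Reply with the title only.",
    },
    { role: "user", content: firstUserMessage },
  ];
}

function fallbackTitle(firstUserMessage, maxLength = 30) {
  const trimmed = firstUserMessage.trim();
  return trimmed.length <= maxLength
    ? trimmed
    : trimmed.slice(0, maxLength).trimEnd() + "…";
}

console.log(fallbackTitle("Hi there")); // → "Hi there"
```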

🛠️ Technology Stack

  • Frontend: React.js, Tailwind CSS
  • Animations: Framer Motion
  • Icons: Lucide React
  • Markdown: React Markdown, Remark GFM
  • AI Integration: @lmstudio/sdk

📦 Project Structure

src/
├── components/
│   ├── Sidebar.jsx       # Side navigation & history
│   ├── ChatMessages.jsx  # Message list & rendering
│   └── ChatInput.jsx     # Input area
├── hooks/
│   └── useLMStudio.js    # Hook for LM Studio connection
├── App.jsx               # Main application layout
└── main.jsx              # Entry point

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.


Built with ❤️ for the AI Community.
