Welcome to the AI-Chat-Linux-Client wiki!

Auto Bot Solutions edited this page Apr 26, 2026 · 1 revision

Welcome to the AI Chat Linux Client documentation wiki. This is a privacy-first, multi-provider AI desktop client for Linux that unifies hosted AI providers and local models in a single conversational interface.

Features

  • Multi-Provider Support: OpenAI, Ollama (local), Groq, HuggingFace, OpenRouter
  • Offline Capability: Full functionality with local Ollama models
  • Streaming Responses: Real-time token-by-token response rendering
  • Privacy-First: No telemetry, local key storage, optional encryption
  • Intelligent Routing: Automatic model selection based on requirements
  • Extensible Architecture: Plugin system for custom providers and tools
  • Modern UI: Dark theme with PyQt6 interface
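The "Intelligent Routing" feature above can be pictured as a simple requirements-to-provider mapping. The sketch below is illustrative only: the `Requirements` fields and `choose_provider` function are invented here and are not the client's actual API, though the provider names mirror the supported list.

```python
from dataclasses import dataclass

@dataclass
class Requirements:
    offline: bool = False      # must work without network access
    low_latency: bool = False  # prefer a fast hosted provider

def choose_provider(req: Requirements) -> str:
    """Pick a provider name from simple requirement flags (illustrative)."""
    if req.offline:
        return "ollama"        # local models keep everything on-device
    if req.low_latency:
        return "groq"          # hosted provider geared toward fast inference
    return "openai"            # sensible default

print(choose_provider(Requirements(offline=True)))  # ollama
```

The key design point is that routing happens before any network call, so an offline requirement can always be satisfied by the local Ollama backend.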

System Requirements

  • Python: 3.8 or higher
  • Operating System: Linux (Ubuntu 20.04+, Fedora 35+, Arch Linux)
  • Memory: 4GB RAM minimum (8GB recommended)
  • Storage: 500MB free space
  • Optional: Ollama for local AI models

Getting Started

  1. Install the application
  2. Configure your API keys
  3. Start chatting
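Step 2 above, configuring API keys, could look like the following sketch, assuming keys live in a local JSON file in keeping with the "local key storage" feature. The path and schema here are assumptions for illustration, not the client's documented format.

```python
import json
from pathlib import Path
from typing import Optional

# Assumed config location; the real client may use a different path.
CONFIG = Path.home() / ".config" / "ai-chat-linux-client" / "keys.json"

def save_key(provider: str, key: str, path: Path = CONFIG) -> None:
    """Store one provider's API key in a local JSON file."""
    path.parent.mkdir(parents=True, exist_ok=True)
    data = json.loads(path.read_text()) if path.exists() else {}
    data[provider] = key
    path.write_text(json.dumps(data, indent=2))

def load_key(provider: str, path: Path = CONFIG) -> Optional[str]:
    """Return the stored key for a provider, or None if absent."""
    if not path.exists():
        return None
    return json.loads(path.read_text()).get(provider)
```

Keys saved this way never leave the machine, which matches the no-telemetry, privacy-first design described above.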

Support

For issues and questions:

  • Check the Troubleshooting section
  • Run system checks for diagnostics: python main.py --check-system
  • Create an issue on the project repository
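A `--check-system` style diagnostic like the one mentioned above might verify checks along these lines. The check names and return shape here are assumptions for illustration, not the actual output of `main.py`.

```python
import platform
import sys

def check_system() -> dict:
    """Run minimal environment diagnostics (illustrative sketch)."""
    return {
        "python_ok": sys.version_info >= (3, 8),      # 3.8+ requirement above
        "linux": platform.system() == "Linux",        # Linux-only client
        "python_version": platform.python_version(),  # for bug reports
    }

for name, result in check_system().items():
    print(f"{name}: {result}")
```

Including the diagnostic output when filing an issue makes environment problems much faster to triage.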

License

This project is licensed under the MIT License; see the LICENSE file for details.