Welcome to the AI-Chat-Linux-Client wiki!
Auto Bot Solutions edited this page Apr 26, 2026
Welcome to the Chat Linux Client documentation wiki. This is a privacy-first, multi-provider AI desktop client for Linux systems that unifies multiple AI providers and local models into a single conversational interface.
Documentation pages:
- Installation - How to install and set up the application
- Configuration - Configure API keys and settings
- Usage - How to use the application
- API Providers - Supported AI providers and how to use them
- Troubleshooting - Common issues and solutions
- Development - Development setup and contribution guide
- Architecture - System architecture and design
- FAQ - Frequently asked questions
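Configuration covers API keys for the hosted providers. As a minimal sketch of key handling via environment variables (the variable names here are illustrative assumptions; the client's actual configuration schema is documented on the Configuration page):

```python
import os

# Illustrative environment-variable names; the real client may read keys
# from a config file instead (see the Configuration page).
PROVIDER_KEY_VARS = {
    "openai": "OPENAI_API_KEY",
    "groq": "GROQ_API_KEY",
    "huggingface": "HF_TOKEN",
    "openrouter": "OPENROUTER_API_KEY",
}

def load_api_keys() -> dict:
    """Collect whichever provider keys are present in the environment."""
    return {
        provider: os.environ[var]
        for provider, var in PROVIDER_KEY_VARS.items()
        if var in os.environ
    }
```

Providers without a key simply stay unconfigured, which matches the privacy-first design: nothing is required beyond the providers you actually use.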
Key features:
- Multi-Provider Support: OpenAI, Ollama (local), Groq, HuggingFace, OpenRouter
- Offline Capability: Full functionality with local Ollama models
- Streaming Responses: Real-time token-by-token response rendering
- Privacy-First: No telemetry, local key storage, optional encryption
- Intelligent Routing: Automatic model selection based on requirements
- Extensible Architecture: Plugin system for custom providers and tools
- Modern UI: Dark theme with PyQt6 interface
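Intelligent routing can be pictured as a small rules function that maps a request's requirements to one of the supported providers. The rules below are a hypothetical sketch to illustrate the idea, not the client's actual selection policy:

```python
from dataclasses import dataclass

@dataclass
class Requirements:
    offline: bool = False       # must work without network access
    low_latency: bool = False   # prefer a fast hosted provider
    large_context: bool = False # prefer long-context models

def route(req: Requirements) -> str:
    """Pick a provider name for a request. Rules are illustrative only."""
    if req.offline:
        return "ollama"       # local models keep everything on-device
    if req.low_latency:
        return "groq"         # assumed here as the low-latency option
    if req.large_context:
        return "openrouter"   # assumed route to long-context models
    return "openai"           # hypothetical default provider
```

For example, `route(Requirements(offline=True))` returns `"ollama"`, which is what makes the offline capability above possible: requests that must stay local never touch a hosted provider.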
System requirements:
- Python: 3.8 or higher
- Operating System: Linux (Ubuntu 20.04+, Fedora 35+, Arch Linux)
- Memory: 4GB RAM minimum (8GB recommended)
- Storage: 500MB free space
- Optional: Ollama for local AI models
For issues and questions:
- Check the Troubleshooting section
- Run the built-in system check for diagnostics: `python main.py --check-system`
- Create an issue on the project repository
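The `--check-system` flag presumably verifies requirements like those listed above. A hedged sketch of what such a diagnostic might check (the real command's checks may differ):

```python
import shutil
import sys

def check_system() -> list:
    """Return a list of problems found; an empty list means the basics look OK."""
    problems = []
    # Python 3.8+ per the system requirements above
    if sys.version_info < (3, 8):
        problems.append("Python 3.8+ is required")
    # 500MB free space per the system requirements above
    free_bytes = shutil.disk_usage(".").free
    if free_bytes < 500 * 1024 * 1024:
        problems.append("less than 500MB of free disk space")
    # Ollama is optional; flag it without treating it as an error
    if shutil.which("ollama") is None:
        problems.append("optional: ollama not found (local models unavailable)")
    return problems
```

Including the check output when filing an issue makes environment problems much easier to diagnose.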
This project is licensed under the MIT License - see the LICENSE file for details.