OpenAI Models Viewer
A web-based interface for browsing and interacting with OpenAI-compatible API endpoints. This application allows you to manage multiple server connections, view available models, and chat with AI models directly from your browser.
Features
- Multi-Server Management: Add and manage multiple OpenAI-compatible endpoints
- Model Discovery: Browse and view all available models from configured servers
- Interactive Chat: Chat directly with AI models through a clean web interface
- Local Storage: Stores server configurations and API keys in your browser's localStorage (no server-side persistence)
- Responsive Design: Works on desktop and mobile devices
Project Structure
├── app.py              # Flask backend server
├── requirements.txt    # Python dependencies
└── static/
    ├── index.html      # Main HTML interface
    ├── script.js       # Client-side JavaScript logic
    └── style.css       # Styling and layout
Getting Started
Prerequisites
- Python 3.7+
- pip (Python package manager)
Installation
- Clone the repository:
  git clone <repository-url>
  cd openai-models-viewer
- Install Python dependencies:
  pip install -r requirements.txt
- Run the application:
  python app.py
The application will start on http://localhost:5000 by default.
Usage
- Add a Server:
  - Click the gear icon next to the server selector
  - Enter a server name (e.g., "OpenAI Production")
  - Enter the API endpoint URL (e.g., https://api.openai.com)
  - Enter your API key
  - Click "Add Server"
- View Models:
  - Select a configured server from the dropdown
  - Models load and display automatically
- Chat with Models:
  - Click any model to open the chat interface
  - Type your message and press Enter or click Send
  - View the AI's response in real time
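Under the hood, the chat step above boils down to sending an OpenAI-style chat-completions request body. A minimal Python sketch of how such a payload is assembled (the field names follow the OpenAI API; the model name here is a placeholder, not one the app guarantees):

```python
import json


def build_chat_payload(model, user_message, history=None):
    """Build an OpenAI-style chat-completions request body.

    `history` is an optional list of prior {"role", "content"} messages,
    so multi-turn conversations carry their context forward.
    """
    messages = list(history or [])
    messages.append({"role": "user", "content": user_message})
    return {"model": model, "messages": messages}


payload = build_chat_payload("gpt-4o-mini", "Hello!")
print(json.dumps(payload))
```

Each new message is appended to the running history, so the server sees the full conversation on every turn.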
Development
Running with Custom Port
python app.py 8080
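One way app.py might support this — a sketch only, assuming the port is taken from the first command-line argument and falls back to 5000:

```python
import sys


def parse_port(argv, default=5000):
    """Return the port from the first CLI argument, or the default."""
    if len(argv) > 1:
        try:
            return int(argv[1])
        except ValueError:
            pass  # non-numeric argument: fall back to the default
    return default


port = parse_port(sys.argv)
# e.g. app.run(host="0.0.0.0", port=port) in a Flask app
```

Falling back silently on a bad argument keeps the command forgiving; raising an error instead would be equally reasonable.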
Docker Support
Build and run with Docker:
docker build -t openai-models-viewer .
docker run -p 5000:5000 openai-models-viewer
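The Dockerfile itself isn't shown here; a minimal one consistent with the commands above might look like this (base image and layout are assumptions, not taken from the repository):

```dockerfile
FROM python:3.11-slim
WORKDIR /app
# Install dependencies first so this layer is cached between code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 5000
CMD ["python", "app.py"]
```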
API Endpoints
The application connects to standard OpenAI-compatible endpoints:
- /models - List available models
- /v1/chat/completions - Chat completion endpoint
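A client calling these endpoints joins the path onto the configured base URL and passes the API key as a bearer token. A minimal sketch (offline — it only builds the URL and headers, no request is sent; `sk-...` is a placeholder):

```python
from urllib.parse import urljoin


def endpoint_request(base_url, path, api_key):
    """Return the full URL and headers for an OpenAI-compatible API call."""
    # Normalize slashes so "https://host" + "/path" joins cleanly
    url = urljoin(base_url.rstrip("/") + "/", path.lstrip("/"))
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    return url, headers


url, headers = endpoint_request("https://api.openai.com",
                                "/v1/chat/completions", "sk-...")
```

The same helper works for any configured server, which is what makes the multi-server setup above possible: only `base_url` and `api_key` change per server.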
Security
- API keys are stored locally in browser localStorage
- All communication happens directly between your browser and the API endpoints
- No server-side storage of sensitive information
Contributing
- Fork the repository
- Create your feature branch
- Commit your changes
- Push to the branch
- Open a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- Built with Flask (Python web framework)
- Uses modern JavaScript for client-side functionality
- Designed with responsive CSS for cross-device compatibility