# OpenAI Models Viewer

A web-based interface for browsing and interacting with OpenAI-compatible API endpoints. This application lets you manage multiple server connections, view available models, and chat with AI models directly from your browser.

## Features

- **Multi-Server Management**: Add and manage multiple OpenAI-compatible endpoints
- **Model Discovery**: Browse and view all available models from configured servers
- **Interactive Chat**: Chat directly with AI models through a clean web interface
- **Local Storage**: Stores server configurations in browser localStorage (nothing is persisted server-side)
- **Responsive Design**: Works on desktop and mobile devices

## Project Structure

```
├── app.py             # Flask backend server
├── requirements.txt   # Python dependencies
└── static/
    ├── index.html     # Main HTML interface
    ├── script.js      # Client-side JavaScript logic
    └── style.css      # Styling and layout
```

## Getting Started

### Prerequisites

- Python 3.7+
- pip (Python package manager)

### Installation

1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd openai-models-viewer
   ```

2. Install Python dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Run the application:

   ```bash
   python app.py
   ```

The application starts on `http://localhost:5000` by default.

### Usage

1. **Add a Server**:
   - Click the gear icon next to the server selector
   - Enter a server name (e.g., "OpenAI Production")
   - Enter the API endpoint URL (e.g., `https://api.openai.com`)
   - Enter your API key
   - Click "Add Server"

2. **View Models**:
   - Select a configured server from the dropdown
   - Models load and display automatically

3. **Chat with Models**:
   - Click on any model to open the chat interface
   - Type your message and press Enter or click Send
   - View the AI's response in real time

## Development

### Running with a Custom Port

```bash
python app.py 8080
```

### Docker Support

Build and run with Docker:

```bash
docker build -t openai-models-viewer .
docker run -p 5000:5000 openai-models-viewer
```

## API Endpoints

The application connects to standard OpenAI-compatible endpoints:

- `/models` - List available models
- `/v1/chat/completions` - Chat completion endpoint

## Security

- API keys are stored locally in browser localStorage
- All communication happens directly between your browser and the API endpoints
- No server-side storage of sensitive information

## Contributing

1. Fork the repository
2. Create your feature branch
3. Commit your changes
4. Push to the branch
5. Open a pull request

## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Acknowledgments

- Built with Flask (Python web framework)
- Uses modern JavaScript for client-side functionality
- Designed with responsive CSS for cross-device compatibility
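
## Appendix: Example Requests

As a reference for the endpoints listed above, here is a minimal sketch of how a client might assemble requests against an OpenAI-compatible server. The helper names (`build_models_request`, `build_chat_request`) are illustrative only and are not part of this project's code; the chat payload follows the standard OpenAI chat-completions schema.

```python
# Sketch: assembling requests for an OpenAI-compatible API.
# Helper names are hypothetical; they are not taken from app.py.

def build_models_request(base_url: str, api_key: str) -> dict:
    """Return the URL and headers for listing models via GET /models."""
    return {
        "url": f"{base_url.rstrip('/')}/models",
        "headers": {"Authorization": f"Bearer {api_key}"},
    }

def build_chat_request(base_url: str, api_key: str,
                       model: str, message: str) -> dict:
    """Return the URL, headers, and JSON body for a chat completion."""
    return {
        "url": f"{base_url.rstrip('/')}/v1/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": message}],
        },
    }
```

These dictionaries can be passed directly to an HTTP client, e.g. `requests.post(**build_chat_request(...))`.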