update docs

Juan José Gutiérrez de Quevedo Pérez 2026-01-30 12:07:48 +01:00
parent 90cc336ec2
commit cf6f4dc9be
Signed by: ps
GPG key ID: D7026C21E81584BC


@@ -10,77 +10,13 @@ A web-based interface for browsing and interacting with OpenAI-compatible API endpoints
- **Local Storage**: Securely stores server configurations in browser localStorage
- **Responsive Design**: Works on desktop and mobile devices
## Project Structure
```
├── app.py # Flask backend server
├── requirements.txt # Python dependencies
├── static/
│ ├── index.html # Main HTML interface
│ ├── script.js # Client-side JavaScript logic
│ └── style.css # Styling and layout
```
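The backend's job is mostly to serve the static front end. As a rough sketch of what `app.py` might look like given this layout (an illustration based on the structure above, not the actual source):
```python
# Illustrative sketch based on the project layout above, not the actual app.py.
from flask import Flask, send_from_directory

app = Flask(__name__, static_folder="static")

@app.route("/")
def index():
    # Serve the main HTML interface from static/
    return send_from_directory(app.static_folder, "index.html")

if __name__ == "__main__":
    # Default port used in the examples below
    app.run(host="0.0.0.0", port=5000)
```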
## Getting Started
### Prerequisites
- Python 3.7+
- pip (Python package manager)
### Installation
1. Clone the repository:
```bash
git clone <repository-url>
cd openai-models-viewer
```
2. Install Python dependencies:
```bash
pip install -r requirements.txt
```
3. Run the application:
```bash
python app.py
```
The application will start on `http://localhost:5000` by default.
### Usage
1. **Add a Server**:
- Click the gear icon next to the server selector
- Enter a server name (e.g., "OpenAI Production")
- Enter the API endpoint URL (e.g., `https://api.openai.com`)
- Enter your API key
- Click "Add Server"
2. **View Models**:
- Select a configured server from the dropdown
- Models will automatically load and display
3. **Chat with Models**:
- Click on any model to open the chat interface
- Type your message and press Enter or click Send
- View the AI's response in real-time (the sketch after this list shows the underlying request)
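Under the hood, the chat step maps to a standard OpenAI-style chat completion request against the configured endpoint. A rough Python equivalent of what the chat interface sends (the base URL, API key, and model name below are placeholders for whatever you configured and selected):
```python
# Rough equivalent of the request behind the chat interface; values are placeholders.
import requests

BASE_URL = "https://api.openai.com"  # the endpoint URL added in step 1
API_KEY = "sk-..."                   # the API key added in step 1

response = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-4o-mini",  # the model clicked in step 3
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=30,
)
print(response.json()["choices"][0]["message"]["content"])
```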
## Development
### Running with Custom Port
```bash
python app.py 8080
```
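How `app.py` reads that argument is not shown here; one plausible implementation, assuming the port is passed as the first positional argument:
```python
# Hypothetical sketch of the port handling; the real app.py may differ.
import sys

from flask import Flask

app = Flask(__name__, static_folder="static")

if __name__ == "__main__":
    # Allow "python app.py 8080" to override the default port of 5000.
    port = int(sys.argv[1]) if len(sys.argv) > 1 else 5000
    app.run(host="0.0.0.0", port=port)
```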
### Docker Support
Build and run with Docker:
```bash
docker build -t openai-models-viewer .
docker run -p 8000:8000 openai-models-viewer
```
## API Endpoints
@@ -94,21 +30,3 @@ The application connects to standard OpenAI-compatible endpoints:
- API keys are stored locally in browser localStorage
- All communication happens directly between your browser and the API endpoints (see the sketch after this list)
- No server-side storage of sensitive information
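For reference, "standard OpenAI-compatible endpoints" means the usual `/v1/models` and `/v1/chat/completions` routes. A minimal sketch of listing models from such a server (shown in Python for illustration; the app itself issues these requests from the browser, and the URL and key are placeholders):
```python
# Minimal sketch: list models from an OpenAI-compatible server.
# BASE_URL and API_KEY are placeholders for a configured server.
import requests

BASE_URL = "https://api.openai.com"
API_KEY = "sk-..."

resp = requests.get(
    f"{BASE_URL}/v1/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()
for model in resp.json()["data"]:
    print(model["id"])
```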
## Contributing
1. Fork the repository
2. Create your feature branch
3. Commit your changes
4. Push to the branch
5. Open a pull request
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
- Built with Flask (Python web framework)
- Uses modern JavaScript for client-side functionality
- Designed with responsive CSS for cross-device compatibility