Ollama init container for automatic model pulling (embeddinggemma)

parent 86f7178b2b
commit 1864370236

2 changed files with 51 additions and 11 deletions

@@ -24,37 +24,56 @@ NEXTAUTH_URL=https://blink.your-domain.com
NEXT_PUBLIC_BASE_URL=https://blink.your-domain.com
```

## Services

| Service | Description | Ports |
|---------|-------------|-------|
| `blinko-website` | Next.js web UI | 1110 → 1111 |
| `postgres` | PostgreSQL database | 5435 → 5432 (internal use recommended) |
| `ollama` | AI model server | Internal (no port exposed) |
| `ollama-proxy` | Caddy reverse proxy for Ollama | - |
| `ollama-models` | **Init container** – pulls `embeddinggemma` automatically | Runs once, then exits |

## Ports

| External port | Service | Description |
|---------------|---------|-------------|
| 1110 | blinko-website | Web UI |
| 5435 | postgres | Database (internal use recommended!) |

## Caddy

For external access to Blinko:

```caddyfile
blink.example.com {
    reverse_proxy localhost:1110
}
```

Ollama is only reachable internally (via `ollama-proxy` on the Docker network).

## Ollama Models

### Automatic (post-deploy)

The stack automatically pulls `embeddinggemma` after startup via the `ollama-models` service.
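
The `ollama-models` service simply polls the Ollama API (`/api/tags`) until it responds, then runs the pull. As a standalone sketch, the same readiness loop can be generalized and given a retry limit, so a broken Ollama service fails fast instead of blocking forever (the function name and parameters here are illustrative, not part of the stack):

```shell
# Generic readiness poll, modeled on the loop in the ollama-models init
# container. Unlike the compose version, it gives up after max_tries attempts.
wait_for() {
  probe="$1"           # command that succeeds once the service is ready,
                       # e.g. 'curl -s http://ollama:11434/api/tags'
  max_tries="${2:-30}" # attempts before giving up
  delay="${3:-2}"      # seconds between attempts
  i=0
  until $probe > /dev/null 2>&1; do
    i=$((i + 1))
    if [ "$i" -ge "$max_tries" ]; then
      echo "service not ready after $max_tries attempts" >&2
      return 1
    fi
    sleep "$delay"
  done
}

# Example: wait_for "curl -s http://ollama:11434/api/tags" 30 2
```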

### Additional models (manual)

Additional models can be pulled manually:

```bash
docker exec -it blinko-ollama ollama pull llama3.2
docker exec -it blinko-ollama ollama pull nomic-embed-text
```
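
If extra models should also be installed automatically, the `ollama-models` init command in the compose file could be extended, for example (a sketch only; the model list is illustrative):

```yaml
# Sketch: pull several models in the ollama-models init container.
# OLLAMA_HOST points the CLI at the ollama service, since no server
# runs inside this container.
command:
  - |
    until curl -s http://ollama:11434/api/tags > /dev/null; do sleep 2; done
    for model in embeddinggemma nomic-embed-text llama3.2; do
      OLLAMA_HOST=http://ollama:11434 ollama pull "$model"
    done
```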

### Checking available models

```bash
docker exec -it blinko-ollama ollama list
```

## Volumes

- `blinko_data` → `/app/.blinko` (notes & data)

@@ -68,6 +68,27 @@ services:
    networks:
      - blinko-network

  ollama-models:
    image: ollama/ollama:latest
    container_name: blinko-ollama-models
    # Override the default entrypoint so this container runs the pull
    # script once instead of starting its own server.
    entrypoint: ["/bin/sh", "-c"]
    # The ollama CLI talks to 127.0.0.1:11434 by default; point it at the
    # ollama service, since no server runs inside this init container.
    environment:
      - OLLAMA_HOST=http://ollama:11434
    command:
      - |
        echo "⏳ Waiting for the Ollama API..."
        until curl -s http://ollama:11434/api/tags > /dev/null; do
          sleep 2
        done
        echo "✅ Ollama ready! Pulling embeddinggemma..."
        ollama pull embeddinggemma
        echo "🎉 Model installed successfully!"
    volumes:
      - ollama_data:/root/.ollama
    depends_on:
      - ollama
    networks:
      - blinko-network
    restart: "no"

volumes:
  blinko_data:
  postgres_data: