
feat: add llm provider settings foundation

master
Jan Svabenik, 1 month ago
parent
commit
839589c040
9 changed files with 353 additions and 29 deletions
  1. +3 −1 README.md
  2. +2 −0 docs/TARGET_STATE_AND_ROADMAP.md
  3. +11 −0 internal/app/app.go
  4. +108 −0 internal/domain/llm_settings.go
  5. +15 −7 internal/domain/models.go
  6. +81 −17 internal/httpserver/handlers/ui.go
  7. +23 −0 internal/store/sqlite/migrations/006_add_llm_settings.sql
  8. +43 −3 internal/store/sqlite/store.go
  9. +67 −1 web/templates/settings.gohtml

+3 −1 README.md View File

@@ -10,6 +10,7 @@ The app can currently:
- Create and update drafts and move them through the statuses `draft` -> `reviewed` -> `submitted`.
- Process external draft intake via `POST /api/drafts/intake` (master data + optional website/style context, no direct build).
- Maintain a global master prompt in Settings and configure prompt blocks as defaults for the later LLM flow.
- Maintain the base LLM configuration in the Settings/Config area: active provider, active model, base URL for Ollama/compatible endpoints, and separate API-key storage per provider (OpenAI, Anthropic, Google, xAI, Ollama).
- Focus the user flow in the draft/build UI on master data, intake/website context, style selection, and template fields; prompt internals live in Settings.
- Map internal semantic target slots (e.g. `hero.title`, `service_items[n].description`) to template fields in preparation for later LLM autofill.
- Separate repeated sections in semantic slots by block/role (e.g. services/team/testimonials per item instead of one collective slot).
@@ -21,6 +22,7 @@ The app can currently:
Important:
- Leadharvester delivers only intake data (master data + optional context) into drafts.
- LLM autofill remains an assist in the review flow: suggestions are stored separately and applied manually; on LLM failure, a deterministic rule-based fallback takes over.
- The new provider/model configuration is the Phase A foundation for later routing; the existing LLM suggestions runtime path is unchanged in this step.

## Local Start

@@ -36,7 +38,7 @@ Important:
## Persistence

The default is SQLite.
Settings, templates, manifests/fields, drafts, and site builds are stored.
Settings (incl. prompt config and the LLM provider/model/key foundations), templates, manifests/fields, drafts, and site builds are stored.

## Draft/Review Flow



+2 −0 docs/TARGET_STATE_AND_ROADMAP.md View File

@@ -42,6 +42,7 @@ Current state:
- Semantic target slots (e.g. `hero.title`, `service_items[n].description`) are mapped internally to concrete template fields in preparation for later LLM autofill.
- Repeated sections (incl. services/team/testimonials) are separated per item by block and role type in the slot preview instead of collapsing into collective slots.
- An LLM-first suggestion state for the draft/build UI is in place: suggestions are stored separately from field values and controlled explicitly via generate/regenerate/apply (globally and per field); rule-based stays active as a fallback/test path.
- The settings foundation for later provider selection is in place: active LLM provider, active model, base URL for Ollama/compatible endpoints, and separate API-key fields per provider (OpenAI, Anthropic, Google, xAI, Ollama) are persisted in `app_settings`.
- Technical field details (e.g. field paths/slots/suggestion metadata) can optionally be shown in the UI via a debug toggle.
- Starting a build already requires a template manifest status of `reviewed`/`validated`.
- Process-level review gates (e.g. approval policy, roles, mandatory checks per field) are not yet fully built out.
@@ -108,6 +109,7 @@ Status markers:
- [-] Style-profile logic that accounts for `businessType` + tonality (context is passed into the LLM path; quality/governance polish still open).
- [-] Prompt/system control (master prompt + prompt blocks) in Settings wired into the LLM suggestion path; build flow without prominent prompt internals.
- [x] Semantic slot mappings between template fields and target roles actively used as a bridge for LLM autofill (incl. improved separation in repeated sections).
- [-] Phase A provider/model settings foundation implemented in Settings/UI/persistence (incl. provider-specific key storage); productive runtime switching per provider/model follows in later phases.

### F) Security and operational readiness
- [ ] Binding secret strategy (encrypted storage instead of simple placeholder logic).


+11 −0 internal/app/app.go View File

@@ -84,10 +84,20 @@ func New(cfg config.Config) (*App, error) {
        LanguageOutputMode:     "EN",
        JobPollIntervalSeconds: cfg.PollIntervalSeconds,
        JobPollTimeoutSeconds:  cfg.PollTimeoutSeconds,
        LLMActiveProvider:      domain.DefaultLLMProvider(),
        LLMActiveModel:         domain.NormalizeLLMModel(domain.DefaultLLMProvider(), ""),
        MasterPrompt:           domain.SeedMasterPrompt,
        PromptBlocks:           domain.DefaultPromptBlocks(),
    }
    if existing, err := settingsStore.GetSettings(context.Background()); err == nil && existing != nil {
        baseSettings.LLMActiveProvider = existing.LLMActiveProvider
        baseSettings.LLMActiveModel = existing.LLMActiveModel
        baseSettings.LLMBaseURL = existing.LLMBaseURL
        baseSettings.OpenAIAPIKeyEncrypted = existing.OpenAIAPIKeyEncrypted
        baseSettings.AnthropicAPIKeyEncrypted = existing.AnthropicAPIKeyEncrypted
        baseSettings.GoogleAPIKeyEncrypted = existing.GoogleAPIKeyEncrypted
        baseSettings.XAIAPIKeyEncrypted = existing.XAIAPIKeyEncrypted
        baseSettings.OllamaAPIKeyEncrypted = existing.OllamaAPIKeyEncrypted
        baseSettings.MasterPrompt = existing.MasterPrompt
        baseSettings.PromptBlocks = existing.PromptBlocks
    }
@@ -102,6 +112,7 @@ func New(cfg config.Config) (*App, error) {
    server := httpserver.New(cfg.HTTPAddr, logger, func(r chi.Router) {
        r.Get("/", ui.Home)
        r.Get("/settings", ui.Settings)
        r.Post("/settings/llm", ui.SaveLLMSettings)
        r.Post("/settings/prompt", ui.SavePromptSettings)
        r.Get("/templates", ui.Templates)
        r.Post("/templates/sync", ui.SyncTemplates)


+108 −0 internal/domain/llm_settings.go View File

@@ -0,0 +1,108 @@
package domain

import "strings"

const (
    LLMProviderOpenAI    = "openai"
    LLMProviderAnthropic = "anthropic"
    LLMProviderGoogle    = "google"
    LLMProviderXAI       = "xai"
    LLMProviderOllama    = "ollama"
)

type LLMModelOption struct {
    Value string
    Label string
}

type LLMProviderOption struct {
    Value  string
    Label  string
    Models []LLMModelOption
}

func DefaultLLMProvider() string {
    return LLMProviderOpenAI
}

func LLMProviderOptions() []LLMProviderOption {
    return []LLMProviderOption{
        {
            Value: LLMProviderOpenAI,
            Label: "OpenAI",
            Models: []LLMModelOption{
                {Value: "gpt-5.2", Label: "gpt-5.2"},
                {Value: "gpt-5.4", Label: "gpt-5.4"},
            },
        },
        {
            Value: LLMProviderAnthropic,
            Label: "Anthropic",
            Models: []LLMModelOption{
                {Value: "claude-sonnet-4-5", Label: "claude-sonnet-4-5"},
                {Value: "claude-opus-4-1", Label: "claude-opus-4-1"},
            },
        },
        {
            Value: LLMProviderGoogle,
            Label: "Google",
            Models: []LLMModelOption{
                {Value: "gemini-2.5-pro", Label: "gemini-2.5-pro"},
                {Value: "gemini-2.5-flash", Label: "gemini-2.5-flash"},
            },
        },
        {
            Value: LLMProviderXAI,
            Label: "xAI",
            Models: []LLMModelOption{
                {Value: "grok-4", Label: "grok-4"},
                {Value: "grok-3-mini", Label: "grok-3-mini"},
            },
        },
        {
            Value: LLMProviderOllama,
            Label: "Ollama",
            Models: []LLMModelOption{
                {Value: "llama3.2", Label: "llama3.2"},
                {Value: "qwen2.5", Label: "qwen2.5"},
                {Value: "mistral", Label: "mistral"},
            },
        },
    }
}

func LLMModelsByProvider(provider string) []LLMModelOption {
    normalized := NormalizeLLMProvider(provider)
    for _, option := range LLMProviderOptions() {
        if option.Value == normalized {
            out := make([]LLMModelOption, len(option.Models))
            copy(out, option.Models)
            return out
        }
    }
    return nil
}

func NormalizeLLMProvider(provider string) string {
    value := strings.ToLower(strings.TrimSpace(provider))
    for _, option := range LLMProviderOptions() {
        if option.Value == value {
            return value
        }
    }
    return DefaultLLMProvider()
}

func NormalizeLLMModel(provider, model string) string {
    models := LLMModelsByProvider(provider)
    if len(models) == 0 {
        return ""
    }
    value := strings.TrimSpace(model)
    for _, option := range models {
        if option.Value == value {
            return value
        }
    }
    return models[0].Value
}
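The normalization pair above always degrades to a safe default: an unknown provider falls back to the default provider, and an unknown model falls back to the provider's first listed model. A minimal standalone sketch of that contract (provider/model tables inlined and trimmed for illustration; `normalizeProvider`/`normalizeModel` are hypothetical stand-ins for the domain functions):

```go
package main

import (
    "fmt"
    "strings"
)

// Illustrative, trimmed copy of the provider/model tables from llm_settings.go.
var models = map[string][]string{
    "openai":    {"gpt-5.2", "gpt-5.4"},
    "anthropic": {"claude-sonnet-4-5", "claude-opus-4-1"},
}

// normalizeProvider mirrors NormalizeLLMProvider: unknown input -> default provider.
func normalizeProvider(p string) string {
    p = strings.ToLower(strings.TrimSpace(p))
    if _, ok := models[p]; ok {
        return p
    }
    return "openai"
}

// normalizeModel mirrors NormalizeLLMModel: unknown model -> provider's first option.
func normalizeModel(provider, model string) string {
    opts := models[normalizeProvider(provider)]
    model = strings.TrimSpace(model)
    for _, m := range opts {
        if m == model {
            return model
        }
    }
    return opts[0]
}

func main() {
    fmt.Println(normalizeProvider("  Anthropic "))              // anthropic
    fmt.Println(normalizeModel("anthropic", "claude-opus-4-1")) // claude-opus-4-1 (kept)
    fmt.Println(normalizeModel("mystery", "whatever"))          // gpt-5.2 (double fallback)
}
```

Because both form input and stored rows pass through this pair, a stale or hand-edited value can never leave the settings page in an inconsistent provider/model combination.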

+15 −7 internal/domain/models.go View File

@@ -146,11 +146,19 @@ type DraftContext struct {
}

type AppSettings struct {
    QCBaseURL              string              `json:"qcBaseUrl"`
    QCBearerTokenEncrypted string              `json:"qcBearerTokenEncrypted"`
    LanguageOutputMode     string              `json:"languageOutputMode"`
    JobPollIntervalSeconds int                 `json:"jobPollIntervalSeconds"`
    JobPollTimeoutSeconds  int                 `json:"jobPollTimeoutSeconds"`
    MasterPrompt           string              `json:"masterPrompt,omitempty"`
    PromptBlocks           []PromptBlockConfig `json:"promptBlocks,omitempty"`
    QCBaseURL                string              `json:"qcBaseUrl"`
    QCBearerTokenEncrypted   string              `json:"qcBearerTokenEncrypted"`
    LanguageOutputMode       string              `json:"languageOutputMode"`
    JobPollIntervalSeconds   int                 `json:"jobPollIntervalSeconds"`
    JobPollTimeoutSeconds    int                 `json:"jobPollTimeoutSeconds"`
    LLMActiveProvider        string              `json:"llmActiveProvider,omitempty"`
    LLMActiveModel           string              `json:"llmActiveModel,omitempty"`
    LLMBaseURL               string              `json:"llmBaseUrl,omitempty"`
    OpenAIAPIKeyEncrypted    string              `json:"openAiApiKeyEncrypted,omitempty"`
    AnthropicAPIKeyEncrypted string              `json:"anthropicApiKeyEncrypted,omitempty"`
    GoogleAPIKeyEncrypted    string              `json:"googleApiKeyEncrypted,omitempty"`
    XAIAPIKeyEncrypted       string              `json:"xaiApiKeyEncrypted,omitempty"`
    OllamaAPIKeyEncrypted    string              `json:"ollamaApiKeyEncrypted,omitempty"`
    MasterPrompt             string              `json:"masterPrompt,omitempty"`
    PromptBlocks             []PromptBlockConfig `json:"promptBlocks,omitempty"`
}

+81 −17 internal/httpserver/handlers/ui.go View File

@@ -54,14 +54,24 @@ type homePageData struct {

type settingsPageData struct {
    pageData
    QCBaseURL           string
    PollIntervalSeconds int
    PollTimeoutSeconds  int
    PollMaxConcurrent   int
    TokenConfigured     bool
    LanguageOutputMode  string
    MasterPrompt        string
    PromptBlocks        []domain.PromptBlockConfig
    QCBaseURL              string
    PollIntervalSeconds    int
    PollTimeoutSeconds     int
    PollMaxConcurrent      int
    TokenConfigured        bool
    LanguageOutputMode     string
    LLMProviderOptions     []domain.LLMProviderOption
    LLMModelOptions        []domain.LLMModelOption
    LLMActiveProvider      string
    LLMActiveModel         string
    LLMBaseURL             string
    OpenAIKeyConfigured    bool
    AnthropicKeyConfigured bool
    GoogleKeyConfigured    bool
    XAIKeyConfigured       bool
    OllamaKeyConfigured    bool
    MasterPrompt           string
    PromptBlocks           []domain.PromptBlockConfig
}

type templatesPageData struct {
@@ -234,16 +244,28 @@ func (u *UI) Home(w http.ResponseWriter, r *http.Request) {

func (u *UI) Settings(w http.ResponseWriter, r *http.Request) {
    settings := u.loadPromptSettings(r.Context())
    activeProvider := domain.NormalizeLLMProvider(settings.LLMActiveProvider)
    modelOptions := domain.LLMModelsByProvider(activeProvider)
    u.render.Render(w, "settings", settingsPageData{
        pageData:            basePageData(r, "Settings", "/settings"),
        QCBaseURL:           u.cfg.QCBaseURL,
        PollIntervalSeconds: u.cfg.PollIntervalSeconds,
        PollTimeoutSeconds:  u.cfg.PollTimeoutSeconds,
        PollMaxConcurrent:   u.cfg.PollMaxConcurrent,
        TokenConfigured:     strings.TrimSpace(u.cfg.QCToken) != "",
        LanguageOutputMode:  "EN",
        MasterPrompt:        settings.MasterPrompt,
        PromptBlocks:        settings.PromptBlocks,
        pageData:               basePageData(r, "Settings", "/settings"),
        QCBaseURL:              u.cfg.QCBaseURL,
        PollIntervalSeconds:    u.cfg.PollIntervalSeconds,
        PollTimeoutSeconds:     u.cfg.PollTimeoutSeconds,
        PollMaxConcurrent:      u.cfg.PollMaxConcurrent,
        TokenConfigured:        strings.TrimSpace(u.cfg.QCToken) != "",
        LanguageOutputMode:     "EN",
        LLMProviderOptions:     domain.LLMProviderOptions(),
        LLMModelOptions:        modelOptions,
        LLMActiveProvider:      activeProvider,
        LLMActiveModel:         domain.NormalizeLLMModel(activeProvider, settings.LLMActiveModel),
        LLMBaseURL:             strings.TrimSpace(settings.LLMBaseURL),
        OpenAIKeyConfigured:    strings.TrimSpace(settings.OpenAIAPIKeyEncrypted) != "",
        AnthropicKeyConfigured: strings.TrimSpace(settings.AnthropicAPIKeyEncrypted) != "",
        GoogleKeyConfigured:    strings.TrimSpace(settings.GoogleAPIKeyEncrypted) != "",
        XAIKeyConfigured:       strings.TrimSpace(settings.XAIAPIKeyEncrypted) != "",
        OllamaKeyConfigured:    strings.TrimSpace(settings.OllamaAPIKeyEncrypted) != "",
        MasterPrompt:           settings.MasterPrompt,
        PromptBlocks:           settings.PromptBlocks,
    })
}

@@ -262,6 +284,37 @@ func (u *UI) SavePromptSettings(w http.ResponseWriter, r *http.Request) {
    http.Redirect(w, r, "/settings?msg=prompt+settings+saved", http.StatusSeeOther)
}

func (u *UI) SaveLLMSettings(w http.ResponseWriter, r *http.Request) {
    if err := r.ParseForm(); err != nil {
        http.Redirect(w, r, "/settings?err=invalid+form", http.StatusSeeOther)
        return
    }
    settings := u.loadPromptSettings(r.Context())
    settings.LLMActiveProvider = domain.NormalizeLLMProvider(r.FormValue("llm_provider"))
    settings.LLMActiveModel = domain.NormalizeLLMModel(settings.LLMActiveProvider, r.FormValue("llm_model"))
    settings.LLMBaseURL = strings.TrimSpace(r.FormValue("llm_base_url"))
    if value := strings.TrimSpace(r.FormValue("llm_api_key_openai")); value != "" {
        settings.OpenAIAPIKeyEncrypted = value
    }
    if value := strings.TrimSpace(r.FormValue("llm_api_key_anthropic")); value != "" {
        settings.AnthropicAPIKeyEncrypted = value
    }
    if value := strings.TrimSpace(r.FormValue("llm_api_key_google")); value != "" {
        settings.GoogleAPIKeyEncrypted = value
    }
    if value := strings.TrimSpace(r.FormValue("llm_api_key_xai")); value != "" {
        settings.XAIAPIKeyEncrypted = value
    }
    if value := strings.TrimSpace(r.FormValue("llm_api_key_ollama")); value != "" {
        settings.OllamaAPIKeyEncrypted = value
    }
    if err := u.settings.UpsertSettings(r.Context(), settings); err != nil {
        http.Redirect(w, r, "/settings?err="+urlQuery(err.Error()), http.StatusSeeOther)
        return
    }
    http.Redirect(w, r, "/settings?msg=llm+settings+saved", http.StatusSeeOther)
}
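SaveLLMSettings treats the API-key inputs as write-only: a non-empty submission replaces the stored key, while a blank field leaves it untouched, so the form never has to echo secrets back to the browser. The core of that pattern, extracted into a hypothetical helper `mergeKey` for illustration:

```go
package main

import (
    "fmt"
    "strings"
)

// mergeKey sketches the write-only secret pattern used per provider in
// SaveLLMSettings: a non-empty (trimmed) submission wins, a blank one
// keeps the previously stored value.
func mergeKey(stored, submitted string) string {
    if v := strings.TrimSpace(submitted); v != "" {
        return v
    }
    return stored
}

func main() {
    fmt.Println(mergeKey("sk-old", ""))          // sk-old (blank form keeps old key)
    fmt.Println(mergeKey("sk-old", "  sk-new ")) // sk-new (new value wins, trimmed)
}
```

Combined with the template's password inputs that render only a "configured / not configured" flag, stored keys are updated without ever being displayed.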

func (u *UI) Templates(w http.ResponseWriter, r *http.Request) {
    templates, err := u.templateSvc.ListTemplates(r.Context())
    if err != nil {
@@ -1598,12 +1651,15 @@ func buildDraftContextFromForm(form buildFormInput, globalData map[string]any) *
}

func (u *UI) loadPromptSettings(ctx context.Context) domain.AppSettings {
    defaultProvider := domain.DefaultLLMProvider()
    settings := domain.AppSettings{
        QCBaseURL:              u.cfg.QCBaseURL,
        QCBearerTokenEncrypted: u.cfg.QCToken,
        LanguageOutputMode:     "EN",
        JobPollIntervalSeconds: u.cfg.PollIntervalSeconds,
        JobPollTimeoutSeconds:  u.cfg.PollTimeoutSeconds,
        LLMActiveProvider:      defaultProvider,
        LLMActiveModel:         domain.NormalizeLLMModel(defaultProvider, ""),
        MasterPrompt:           domain.SeedMasterPrompt,
        PromptBlocks:           domain.DefaultPromptBlocks(),
    }
@@ -1629,6 +1685,14 @@ func (u *UI) loadPromptSettings(ctx context.Context) domain.AppSettings {
    if stored.JobPollTimeoutSeconds > 0 {
        settings.JobPollTimeoutSeconds = stored.JobPollTimeoutSeconds
    }
    settings.LLMActiveProvider = domain.NormalizeLLMProvider(stored.LLMActiveProvider)
    settings.LLMActiveModel = domain.NormalizeLLMModel(settings.LLMActiveProvider, stored.LLMActiveModel)
    settings.LLMBaseURL = strings.TrimSpace(stored.LLMBaseURL)
    settings.OpenAIAPIKeyEncrypted = strings.TrimSpace(stored.OpenAIAPIKeyEncrypted)
    settings.AnthropicAPIKeyEncrypted = strings.TrimSpace(stored.AnthropicAPIKeyEncrypted)
    settings.GoogleAPIKeyEncrypted = strings.TrimSpace(stored.GoogleAPIKeyEncrypted)
    settings.XAIAPIKeyEncrypted = strings.TrimSpace(stored.XAIAPIKeyEncrypted)
    settings.OllamaAPIKeyEncrypted = strings.TrimSpace(stored.OllamaAPIKeyEncrypted)
    settings.MasterPrompt = domain.NormalizeMasterPrompt(stored.MasterPrompt)
    settings.PromptBlocks = domain.NormalizePromptBlocks(stored.PromptBlocks)
    return settings


+23 −0 internal/store/sqlite/migrations/006_add_llm_settings.sql View File

@@ -0,0 +1,23 @@
ALTER TABLE app_settings
ADD COLUMN llm_active_provider TEXT NOT NULL DEFAULT 'openai';

ALTER TABLE app_settings
ADD COLUMN llm_active_model TEXT NOT NULL DEFAULT '';

ALTER TABLE app_settings
ADD COLUMN llm_base_url TEXT NOT NULL DEFAULT '';

ALTER TABLE app_settings
ADD COLUMN openai_api_key_encrypted TEXT NOT NULL DEFAULT '';

ALTER TABLE app_settings
ADD COLUMN anthropic_api_key_encrypted TEXT NOT NULL DEFAULT '';

ALTER TABLE app_settings
ADD COLUMN google_api_key_encrypted TEXT NOT NULL DEFAULT '';

ALTER TABLE app_settings
ADD COLUMN xai_api_key_encrypted TEXT NOT NULL DEFAULT '';

ALTER TABLE app_settings
ADD COLUMN ollama_api_key_encrypted TEXT NOT NULL DEFAULT '';
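The migration adds one column per statement because SQLite's `ALTER TABLE` accepts only a single `ADD COLUMN` at a time. The eight statements follow one shape; a sketch generating them from a column list (the `alterStatements` helper is hypothetical, for illustration only):

```go
package main

import "fmt"

// alterStatements rebuilds the migration's ALTER statements from a column
// list, making the shared TEXT NOT NULL DEFAULT pattern explicit. SQLite
// requires one ALTER TABLE ... ADD COLUMN per column, hence one statement each.
func alterStatements() []string {
    cols := []struct{ name, def string }{
        {"llm_active_provider", "TEXT NOT NULL DEFAULT 'openai'"},
        {"llm_active_model", "TEXT NOT NULL DEFAULT ''"},
        {"llm_base_url", "TEXT NOT NULL DEFAULT ''"},
        {"openai_api_key_encrypted", "TEXT NOT NULL DEFAULT ''"},
        {"anthropic_api_key_encrypted", "TEXT NOT NULL DEFAULT ''"},
        {"google_api_key_encrypted", "TEXT NOT NULL DEFAULT ''"},
        {"xai_api_key_encrypted", "TEXT NOT NULL DEFAULT ''"},
        {"ollama_api_key_encrypted", "TEXT NOT NULL DEFAULT ''"},
    }
    out := make([]string, 0, len(cols))
    for _, c := range cols {
        out = append(out, fmt.Sprintf("ALTER TABLE app_settings ADD COLUMN %s %s;", c.name, c.def))
    }
    return out
}

func main() {
    for _, s := range alterStatements() {
        fmt.Println(s)
    }
}
```

The `NOT NULL DEFAULT` clauses matter for the existing singleton row: the pre-migration `app_settings` row immediately gets valid values ('openai' and empty strings) without a backfill step.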

+43 −3 internal/store/sqlite/store.go View File

@@ -410,16 +410,29 @@ func (s *Store) UpsertSettings(ctx context.Context, settings domain.AppSettings)
    if err != nil {
        return fmt.Errorf("marshal prompt blocks: %w", err)
    }
    provider := domain.NormalizeLLMProvider(settings.LLMActiveProvider)
    model := domain.NormalizeLLMModel(provider, settings.LLMActiveModel)
    _, err = s.db.ExecContext(ctx, `
        INSERT INTO app_settings (
            id, qc_base_url, qc_bearer_token_encrypted, language_output_mode, job_poll_interval_seconds, job_poll_timeout_seconds, master_prompt, prompt_blocks_json, updated_at
        ) VALUES (1, ?, ?, ?, ?, ?, ?, ?, ?)
            id, qc_base_url, qc_bearer_token_encrypted, language_output_mode, job_poll_interval_seconds, job_poll_timeout_seconds,
            llm_active_provider, llm_active_model, llm_base_url,
            openai_api_key_encrypted, anthropic_api_key_encrypted, google_api_key_encrypted, xai_api_key_encrypted, ollama_api_key_encrypted,
            master_prompt, prompt_blocks_json, updated_at
        ) VALUES (1, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
        ON CONFLICT(id) DO UPDATE SET
            qc_base_url = excluded.qc_base_url,
            qc_bearer_token_encrypted = excluded.qc_bearer_token_encrypted,
            language_output_mode = excluded.language_output_mode,
            job_poll_interval_seconds = excluded.job_poll_interval_seconds,
            job_poll_timeout_seconds = excluded.job_poll_timeout_seconds,
            llm_active_provider = excluded.llm_active_provider,
            llm_active_model = excluded.llm_active_model,
            llm_base_url = excluded.llm_base_url,
            openai_api_key_encrypted = excluded.openai_api_key_encrypted,
            anthropic_api_key_encrypted = excluded.anthropic_api_key_encrypted,
            google_api_key_encrypted = excluded.google_api_key_encrypted,
            xai_api_key_encrypted = excluded.xai_api_key_encrypted,
            ollama_api_key_encrypted = excluded.ollama_api_key_encrypted,
            master_prompt = excluded.master_prompt,
            prompt_blocks_json = excluded.prompt_blocks_json,
            updated_at = excluded.updated_at`,
@@ -428,6 +441,14 @@ func (s *Store) UpsertSettings(ctx context.Context, settings domain.AppSettings)
        defaultString(settings.LanguageOutputMode, "EN"),
        settings.JobPollIntervalSeconds,
        settings.JobPollTimeoutSeconds,
        provider,
        model,
        strings.TrimSpace(settings.LLMBaseURL),
        strings.TrimSpace(settings.OpenAIAPIKeyEncrypted),
        strings.TrimSpace(settings.AnthropicAPIKeyEncrypted),
        strings.TrimSpace(settings.GoogleAPIKeyEncrypted),
        strings.TrimSpace(settings.XAIAPIKeyEncrypted),
        strings.TrimSpace(settings.OllamaAPIKeyEncrypted),
        domain.NormalizeMasterPrompt(settings.MasterPrompt),
        promptBlocksRaw,
        time.Now().UTC().Format(time.RFC3339Nano),
@@ -437,7 +458,10 @@ func (s *Store) UpsertSettings(ctx context.Context, settings domain.AppSettings)

func (s *Store) GetSettings(ctx context.Context) (*domain.AppSettings, error) {
    row := s.db.QueryRowContext(ctx, `
        SELECT qc_base_url, qc_bearer_token_encrypted, language_output_mode, job_poll_interval_seconds, job_poll_timeout_seconds, master_prompt, prompt_blocks_json
        SELECT qc_base_url, qc_bearer_token_encrypted, language_output_mode, job_poll_interval_seconds, job_poll_timeout_seconds,
            llm_active_provider, llm_active_model, llm_base_url,
            openai_api_key_encrypted, anthropic_api_key_encrypted, google_api_key_encrypted, xai_api_key_encrypted, ollama_api_key_encrypted,
            master_prompt, prompt_blocks_json
        FROM app_settings
        WHERE id = 1`)
    var settings domain.AppSettings
@@ -448,6 +472,14 @@ func (s *Store) GetSettings(ctx context.Context) (*domain.AppSettings, error) {
        &settings.LanguageOutputMode,
        &settings.JobPollIntervalSeconds,
        &settings.JobPollTimeoutSeconds,
        &settings.LLMActiveProvider,
        &settings.LLMActiveModel,
        &settings.LLMBaseURL,
        &settings.OpenAIAPIKeyEncrypted,
        &settings.AnthropicAPIKeyEncrypted,
        &settings.GoogleAPIKeyEncrypted,
        &settings.XAIAPIKeyEncrypted,
        &settings.OllamaAPIKeyEncrypted,
        &settings.MasterPrompt,
        &promptBlocksRaw,
    ); err != nil {
@@ -460,6 +492,14 @@ func (s *Store) GetSettings(ctx context.Context) (*domain.AppSettings, error) {
    if len(promptBlocksRaw) > 0 {
        _ = json.Unmarshal(promptBlocksRaw, &settings.PromptBlocks)
    }
    settings.LLMActiveProvider = domain.NormalizeLLMProvider(settings.LLMActiveProvider)
    settings.LLMActiveModel = domain.NormalizeLLMModel(settings.LLMActiveProvider, settings.LLMActiveModel)
    settings.LLMBaseURL = strings.TrimSpace(settings.LLMBaseURL)
    settings.OpenAIAPIKeyEncrypted = strings.TrimSpace(settings.OpenAIAPIKeyEncrypted)
    settings.AnthropicAPIKeyEncrypted = strings.TrimSpace(settings.AnthropicAPIKeyEncrypted)
    settings.GoogleAPIKeyEncrypted = strings.TrimSpace(settings.GoogleAPIKeyEncrypted)
    settings.XAIAPIKeyEncrypted = strings.TrimSpace(settings.XAIAPIKeyEncrypted)
    settings.OllamaAPIKeyEncrypted = strings.TrimSpace(settings.OllamaAPIKeyEncrypted)
    settings.PromptBlocks = domain.NormalizePromptBlocks(settings.PromptBlocks)
    return &settings, nil
}


+67 −1 web/templates/settings.gohtml View File

@@ -10,7 +10,7 @@
  {{if .Msg}}<div class="flash flash-ok">{{.Msg}}</div>{{end}}
  {{if .Err}}<div class="flash flash-err">{{.Err}}</div>{{end}}
  <h1>Settings</h1>
  <p>QC settings plus global prompt/system control for the later LLM flow.</p>
  <p>QC settings plus LLM and global prompt/system control for the later LLM flow.</p>
  <table>
    <tr><th>QC Base URL</th><td class="mono">{{.QCBaseURL}}</td></tr>
    <tr><th>Bearer token configured</th><td>{{if .TokenConfigured}}yes{{else}}no{{end}}</td></tr>
@@ -20,6 +20,60 @@
    <tr><th>Language output mode</th><td>{{.LanguageOutputMode}}</td></tr>
  </table>

  <h2>LLM Provider / Model</h2>
  <p><small>Phase A foundation: provider, model, optional base URL (Ollama/compatible), and provider-specific API keys.</small></p>
  <form method="post" action="/settings/llm">
    <div>
      <label>Provider
        <select id="llm-provider" name="llm_provider">
          {{range .LLMProviderOptions}}
          <option value="{{.Value}}" {{if eq $.LLMActiveProvider .Value}}selected{{end}}>{{.Label}}</option>
          {{end}}
        </select>
      </label>
    </div>
    <div>
      <label>Model
        <select name="llm_model">
          {{range .LLMModelOptions}}
          <option value="{{.Value}}" {{if eq $.LLMActiveModel .Value}}selected{{end}}>{{.Label}}</option>
          {{end}}
        </select>
      </label>
    </div>
    <div id="llm-base-url-wrap" {{if ne .LLMActiveProvider "ollama"}}style="display:none;"{{end}}>
      <label>Base URL (Ollama / compatible endpoints only)
        <input type="url" name="llm_base_url" placeholder="http://localhost:11434/v1" value="{{.LLMBaseURL}}">
      </label>
    </div>
    <div>
      <label>OpenAI API Key ({{if .OpenAIKeyConfigured}}configured{{else}}not configured{{end}})
        <input type="password" name="llm_api_key_openai" placeholder="leave empty = unchanged">
      </label>
    </div>
    <div>
      <label>Anthropic API Key ({{if .AnthropicKeyConfigured}}configured{{else}}not configured{{end}})
        <input type="password" name="llm_api_key_anthropic" placeholder="leave empty = unchanged">
      </label>
    </div>
    <div>
      <label>Google API Key ({{if .GoogleKeyConfigured}}configured{{else}}not configured{{end}})
        <input type="password" name="llm_api_key_google" placeholder="leave empty = unchanged">
      </label>
    </div>
    <div>
      <label>xAI API Key ({{if .XAIKeyConfigured}}configured{{else}}not configured{{end}})
        <input type="password" name="llm_api_key_xai" placeholder="leave empty = unchanged">
      </label>
    </div>
    <div>
      <label>Ollama API Key (optional; {{if .OllamaKeyConfigured}}configured{{else}}not configured{{end}})
        <input type="password" name="llm_api_key_ollama" placeholder="leave empty = unchanged">
      </label>
    </div>
    <button type="submit">Save LLM settings</button>
  </form>

  <h2>Global Master Prompt</h2>
  <p><small>These settings apply system-wide and are no longer edited directly in the normal build/review form.</small></p>
  <form method="post" action="/settings/prompt">
@@ -43,6 +97,18 @@
    {{end}}
    <button type="submit">Save prompt settings</button>
  </form>
  <script>
    (function () {
      var provider = document.getElementById('llm-provider');
      var baseUrlWrap = document.getElementById('llm-base-url-wrap');
      if (!provider || !baseUrlWrap) return;
      var syncBaseURLVisibility = function () {
        baseUrlWrap.style.display = provider.value === 'ollama' ? '' : 'none';
      };
      provider.addEventListener('change', syncBaseURLVisibility);
      syncBaseURLVisibility();
    })();
  </script>
</body>
</html>
{{end}}
