
feat: add autofill debug logging and log api

master
Jan Svabenik, 1 month ago
commit 591a5aae2e
9 changed files with 755 additions and 11 deletions
  1. README.md (+4 −0)
  2. docs/TARGET_STATE_AND_ROADMAP.md (+4 −0)
  3. internal/app/app.go (+7 −3)
  4. internal/httpserver/handlers/handlers.go (+19 −1)
  5. internal/httpserver/handlers/ui.go (+155 −4)
  6. internal/logging/logger.go (+258 −2)
  7. internal/mapping/provider_suggestion_generator.go (+114 −1)
  8. internal/mapping/suggestion_generator.go (+190 −0)
  9. internal/mapping/suggestions.go (+4 −0)

README.md (+4 −0)

@@ -13,12 +13,15 @@ The app can currently:
- Maintain the base LLM configuration in the Settings/Config area: active provider, active model (provider-aware static selection list), base URL for Ollama/compatible endpoints, temperature/max tokens, and separate API key storage per provider (OpenAI, Anthropic, Google, xAI, Ollama).
- Validate the LLM provider configuration in Settings via a lightweight validate action (active provider/model/key/base URL checked with a short runtime request).
- OpenAI-compatible runtime requests internally pick the model-compatible token-limit parameter (`max_completion_tokens` for OpenAI GPT-5 models, otherwise `max_tokens`), including in the Settings validate action.
- OpenAI-compatible runtime responses are extracted robustly across several chat/GPT-5-compatible content shapes (among others `choices[].message.content` as string or part array, as well as `output_text`/`output[].content`); on an empty result, only safe structural diagnostics (keys/types) are returned, never prompt or secret contents.
- The draft/build UI focuses the user flow on master data, intake/website context, style selection, and template fields; prompt internals live in Settings.
- Internal semantic target slots (e.g. `hero.title`, `service_items[n].description`) are mapped onto template fields in preparation for later LLM autofill.
- Repeated areas in semantic slots are separated per block/role (e.g. services/team/testimonials per item instead of one collective slot).
- LLM-first autofill suggestions via a provider-aware runtime (OpenAI, Anthropic, Google, xAI, Ollama/compatible) with the active provider/model selection from Settings, structured field mapping onto `fieldPath`/slot, and a rule-based fallback for failure/test cases.
- Suggestion workflow kept separate from field values (preview), including `Generate all`, `Regenerate all`, `Apply all to empty`, plus per-field `Apply`/`Regenerate` in the draft/build UI.
- Technical field details (e.g. `fieldPath`, suggestion metadata, slot preview) are hidden in the UI by default and only visible via a debug toggle.
- Structured debug logs for the autofill/LLM flow, including the provider path, QC fallback, rule-based fallback, and validate action (short metadata, error summary, duration, suggestion count).
- A small internal log API `GET /api/logs?limit=<n>` for recent in-memory log entries (ring buffer, newest first).
- Start builds from reviewed data, poll job status, and fetch the editor URL afterwards.

Important:
@@ -34,6 +37,7 @@ Important:
- `DB_URL=data/qctextbuilder.db` (default)
- `QC_BASE_URL=https://qc-api.yggdrasil.dev-mono.net/api/v1`
- `QC_TOKEN=<bearer token>`
- optional: `LOG_FILE=logs/qctextbuilder.log` for an additional JSON log file (stdout remains active)
2. Start:
- `go run ./cmd/qctextbuilder`



docs/TARGET_STATE_AND_ROADMAP.md (+4 −0)

@@ -44,9 +44,12 @@ Current state:
- LLM-first suggestion state for the draft/build UI is in place: suggestions are stored separately from field values and controlled explicitly via Generate/Regenerate/Apply (globally and per field); rule-based remains active as the final fallback/test path.
- The provider-aware suggestion runtime is active: settings (`llm_active_provider`, `llm_active_model`, `llm_temperature`, `llm_max_tokens`, the provider-specific API key, `llm_base_url` for Ollama/compatible endpoints) drive the primary runtime path; the existing QC path remains as a compatibility fallback.
- OpenAI-compatible requests internally use the model-dependent token-limit parameter (`max_completion_tokens` for OpenAI GPT-5 models, otherwise `max_tokens`), including in the Settings validate path.
- OpenAI-compatible runtime response extraction is more robust for newer GPT-5/OpenAI-compatible shapes (string/part content in `choices[].message.content` as well as `output_text`/`output[].content`) and returns safe shape diagnostics without content dumps when the content is empty.
- Settings include a lightweight validate action for the active provider configuration (a short runtime check) without bypassing the draft/review flow.
- Model selection is implemented as provider-aware static lists, structured so that dynamic model lists/refresh can be attached later.
- Technical field details (e.g. field paths/slots/suggestion metadata) can optionally be shown in the UI via a debug toggle.
- Structured debug logs for the autofill/LLM path are active (provider-aware request/parse, QC fallback, rule-based fallback, validate action; without prompt/secret dumps).
- A small internal log API (`GET /api/logs`) serves recent structured log entries from an in-memory ring buffer.
- Starting a build already requires a template manifest status of `reviewed`/`validated`.
- Process-level review gates (e.g. approval policy, roles, mandatory checks per field) are not yet fully built out.

@@ -114,6 +117,7 @@ Status markers:
- [x] Semantic slot mappings between template fields and target roles actively used as a bridge for LLM autofill (including improved separation in repeated areas).
- [x] Phase A/B provider/model settings foundation implemented, including production runtime switching (provider/model choice + provider-specific keys + base URL for Ollama/compatible endpoints drive suggestions directly).
- [x] Phase C comfort/quality implemented: temperature/max tokens in settings + runtime, Settings validate action, more robust provider response/error handling, and statically provider-aware model UX with a later expansion path.
- [x] Focused debugging for autofill/LLM: structured flow logs + an internal recent-log API endpoint for fast root-cause analysis.

### F) Security and operational readiness
- [ ] A binding secret strategy (encrypted storage instead of simple placeholder logic).


internal/app/app.go (+7 −3)

@@ -4,6 +4,7 @@ import (
"context"
"errors"
"fmt"
"log/slog"
"net/http"
"strings"
"time"
@@ -35,7 +36,9 @@ type App struct {
}

func New(cfg config.Config) (*App, error) {
-logger := logging.New()
+logSetup := logging.Setup()
+logger := logSetup.Logger
+slog.SetDefault(logger)

var (
templateStore store.TemplateStore
@@ -81,7 +84,7 @@ func New(cfg config.Config) (*App, error) {
),
)
pollingSvc := polling.New(buildSvc, buildStore, time.Duration(cfg.PollIntervalSeconds)*time.Second, cfg.PollMaxConcurrent, logger)
-api := handlers.NewAPI(templateSvc, onboardSvc, draftSvc, buildSvc)
+api := handlers.NewAPI(templateSvc, onboardSvc, draftSvc, buildSvc, logSetup.Recent)

baseSettings := domain.AppSettings{
QCBaseURL: cfg.QCBaseURL,
@@ -116,7 +119,7 @@ func New(cfg config.Config) (*App, error) {
if err != nil {
return nil, fmt.Errorf("init renderer: %w", err)
}
-ui := handlers.NewUI(templateSvc, onboardSvc, draftSvc, buildSvc, settingsStore, suggestionGenerator, cfg, renderer)
+ui := handlers.NewUI(templateSvc, onboardSvc, draftSvc, buildSvc, settingsStore, suggestionGenerator, cfg, renderer, logger)

server := httpserver.New(cfg.HTTPAddr, logger, func(r chi.Router) {
r.Get("/", ui.Home)
@@ -152,6 +155,7 @@ func New(cfg config.Config) (*App, error) {
r.Get("/site-builds/{id}", api.GetBuild)
r.Post("/site-builds/{id}/poll", api.PollBuildOnce)
r.Post("/site-builds/{id}/fetch-editor-url", api.FetchBuildEditorURL)
r.Get("/logs", api.ListLogs)
})
})



internal/httpserver/handlers/handlers.go (+19 −1)

@@ -11,6 +11,7 @@ import (
"qctextbuilder/internal/buildsvc"
"qctextbuilder/internal/domain"
"qctextbuilder/internal/draftsvc"
"qctextbuilder/internal/logging"
"qctextbuilder/internal/onboarding"
"qctextbuilder/internal/templatesvc"
)
@@ -20,14 +21,16 @@ type API struct {
onboardSvc *onboarding.Service
draftSvc *draftsvc.Service
buildSvc buildsvc.Service
recentLogs *logging.RecentStore
}

-func NewAPI(templateSvc *templatesvc.Service, onboardSvc *onboarding.Service, draftSvc *draftsvc.Service, buildSvc buildsvc.Service) *API {
+func NewAPI(templateSvc *templatesvc.Service, onboardSvc *onboarding.Service, draftSvc *draftsvc.Service, buildSvc buildsvc.Service, recentLogs *logging.RecentStore) *API {
return &API{
templateSvc: templateSvc,
onboardSvc: onboardSvc,
draftSvc: draftSvc,
buildSvc: buildSvc,
recentLogs: recentLogs,
}
}

@@ -358,6 +361,21 @@ func (a *API) FetchBuildEditorURL(w http.ResponseWriter, r *http.Request) {
writeJSON(w, http.StatusOK, build)
}

func (a *API) ListLogs(w http.ResponseWriter, r *http.Request) {
limit, _ := strconv.Atoi(strings.TrimSpace(r.URL.Query().Get("limit")))
if limit <= 0 {
limit = 100
}
if limit > 500 {
limit = 500
}
logs := a.recentLogs.List(limit)
writeJSON(w, http.StatusOK, map[string]any{
"count": len(logs),
"logs": logs,
})
}

func writeJSON(w http.ResponseWriter, status int, v any) {
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(status)


internal/httpserver/handlers/ui.go (+155 −4)

@@ -4,6 +4,7 @@ import (
"context"
"encoding/json"
"fmt"
"log/slog"
"net/http"
"net/url"
"regexp"
@@ -35,6 +36,7 @@ type UI struct {
suggestionGenerator mapping.SuggestionGenerator
cfg config.Config
render htmlRenderer
logger *slog.Logger
}

type htmlRenderer interface {
@@ -223,7 +225,10 @@ type buildDetailPageData struct {
AutoRefreshSeconds int
}

-func NewUI(templateSvc *templatesvc.Service, onboardSvc *onboarding.Service, draftSvc *draftsvc.Service, buildSvc buildsvc.Service, settings store.SettingsStore, suggestionGenerator mapping.SuggestionGenerator, cfg config.Config, render htmlRenderer) *UI {
+func NewUI(templateSvc *templatesvc.Service, onboardSvc *onboarding.Service, draftSvc *draftsvc.Service, buildSvc buildsvc.Service, settings store.SettingsStore, suggestionGenerator mapping.SuggestionGenerator, cfg config.Config, render htmlRenderer, logger *slog.Logger) *UI {
if logger == nil {
logger = slog.Default()
}
return &UI{
templateSvc: templateSvc,
onboardSvc: onboardSvc,
@@ -233,6 +238,7 @@ func NewUI(templateSvc *templatesvc.Service, onboardSvc *onboarding.Service, dra
suggestionGenerator: suggestionGenerator,
cfg: cfg,
render: render,
logger: logger,
}
}

@@ -316,7 +322,7 @@ func (u *UI) ValidateLLMSettings(w http.ResponseWriter, r *http.Request) {
http.Redirect(w, r, "/settings?err="+urlQuery(err.Error()), http.StatusSeeOther)
return
}
-if err := validateLLMProviderConfig(r.Context(), settings); err != nil {
+if err := validateLLMProviderConfig(r.Context(), settings, u.logger); err != nil {
http.Redirect(w, r, "/settings?err="+urlQuery(err.Error()), http.StatusSeeOther)
return
}
@@ -627,8 +633,23 @@ func (u *UI) AutofillDraft(w http.ResponseWriter, r *http.Request) {
action, targetFieldPath := parseAutofillAction(strings.TrimSpace(r.FormValue("autofill_action")))
focusFieldPath := targetFieldPath
now := time.Now().UTC()
activeSettings := u.loadPromptSettings(r.Context())
activeProvider := domain.NormalizeLLMProvider(activeSettings.LLMActiveProvider)
activeModel := domain.NormalizeLLMModel(activeProvider, activeSettings.LLMActiveModel)
autofillStart := time.Now()
u.logger.InfoContext(r.Context(), "autofill action",
"component", "autofill",
"step", "action_start",
"status", "start",
"action", action,
"provider", activeProvider,
"model", activeModel,
"draft_id", strings.TrimSpace(form.DraftID),
"template_id", templateID,
)
req := mapping.SuggestionRequest{
TemplateID: templateID,
DraftID: strings.TrimSpace(form.DraftID),
Fields: detail.Fields,
GlobalData: globalData,
DraftContext: draftContext,
@@ -660,6 +681,20 @@ func (u *UI) AutofillDraft(w http.ResponseWriter, r *http.Request) {
default:
msg = "unknown autofill action"
}
sourceCounts := summarizeSuggestionSources(suggestionState)
u.logger.InfoContext(r.Context(), "autofill action",
"component", "autofill",
"step", "action_finish",
"status", "success",
"action", action,
"provider", activeProvider,
"model", activeModel,
"draft_id", strings.TrimSpace(form.DraftID),
"template_id", templateID,
"suggestion_count", len(suggestionState.ByFieldPath),
"sources", sourceCounts,
"duration_ms", time.Since(autofillStart).Milliseconds(),
)

if strings.TrimSpace(form.DraftID) != "" {
_, _ = u.draftSvc.SaveDraft(r.Context(), draftsvc.UpsertDraftRequest{
@@ -804,27 +839,79 @@ func applyLLMSettingsForm(settings domain.AppSettings, r *http.Request) (domain.
return next, nil
}

-func validateLLMProviderConfig(ctx context.Context, settings domain.AppSettings) error {
+func validateLLMProviderConfig(ctx context.Context, settings domain.AppSettings, logger *slog.Logger) error {
if logger == nil {
logger = slog.Default()
}
started := time.Now()
provider := domain.NormalizeLLMProvider(settings.LLMActiveProvider)
model := domain.NormalizeLLMModel(provider, settings.LLMActiveModel)
baseURL := strings.TrimSpace(settings.LLMBaseURL)
logger.InfoContext(ctx, "validate llm provider config",
"component", "autofill",
"step", "validate_provider",
"status", "start",
"provider", provider,
"model", model,
"base_url", baseURL,
)
if strings.TrimSpace(model) == "" {
logger.WarnContext(ctx, "validate llm provider config",
"component", "autofill",
"step", "validate_provider",
"status", "failed",
"provider", provider,
"model", model,
"base_url", baseURL,
"error", "no active model configured",
"duration_ms", time.Since(started).Milliseconds(),
)
return fmt.Errorf("no active model configured")
}
-baseURL := strings.TrimSpace(settings.LLMBaseURL)
if baseURL != "" {
parsed, err := url.Parse(baseURL)
if err != nil || strings.TrimSpace(parsed.Scheme) == "" || strings.TrimSpace(parsed.Host) == "" {
logger.WarnContext(ctx, "validate llm provider config",
"component", "autofill",
"step", "validate_provider",
"status", "failed",
"provider", provider,
"model", model,
"base_url", baseURL,
"error", "invalid llm base url",
"duration_ms", time.Since(started).Milliseconds(),
)
return fmt.Errorf("invalid llm base url")
}
}
apiKey := domain.LLMAPIKeyForProvider(provider, settings)
if provider != domain.LLMProviderOllama && strings.TrimSpace(apiKey) == "" {
logger.WarnContext(ctx, "validate llm provider config",
"component", "autofill",
"step", "validate_provider",
"status", "failed",
"provider", provider,
"model", model,
"base_url", baseURL,
"error", "missing api key",
"duration_ms", time.Since(started).Milliseconds(),
)
return fmt.Errorf("api key for provider %s is not configured", provider)
}

runtimeFactory := llmruntime.NewFactory(10 * time.Second)
client, err := runtimeFactory.ClientFor(provider)
if err != nil {
logger.WarnContext(ctx, "validate llm provider config",
"component", "autofill",
"step", "validate_provider",
"status", "failed",
"provider", provider,
"model", model,
"base_url", baseURL,
"error", shortError(err),
"duration_ms", time.Since(started).Milliseconds(),
)
return err
}

@@ -848,11 +935,41 @@ func validateLLMProviderConfig(ctx context.Context, settings domain.AppSettings)
UserPrompt: "Return OK",
})
if err != nil {
logger.WarnContext(ctx, "validate llm provider config",
"component", "autofill",
"step", "validate_provider",
"status", "failed",
"provider", provider,
"model", model,
"base_url", baseURL,
"error", shortError(err),
"duration_ms", time.Since(started).Milliseconds(),
)
return fmt.Errorf("provider validation failed (%s/%s): %w", provider, model, err)
}
if strings.TrimSpace(resp) == "" {
logger.WarnContext(ctx, "validate llm provider config",
"component", "autofill",
"step", "validate_provider",
"status", "failed",
"provider", provider,
"model", model,
"base_url", baseURL,
"error", "empty response",
"duration_ms", time.Since(started).Milliseconds(),
)
return fmt.Errorf("provider validation failed (%s/%s): empty response", provider, model)
}
logger.InfoContext(ctx, "validate llm provider config",
"component", "autofill",
"step", "validate_provider",
"status", "success",
"provider", provider,
"model", model,
"base_url", baseURL,
"response_snippet", trimSnippet(resp, 40),
"duration_ms", time.Since(started).Milliseconds(),
)
return nil
}

@@ -1605,6 +1722,40 @@ func parseDebugMode(r *http.Request) bool {
}
}

func summarizeSuggestionSources(state domain.DraftSuggestionState) map[string]int {
if len(state.ByFieldPath) == 0 {
return map[string]int{}
}
out := map[string]int{}
for _, suggestion := range state.ByFieldPath {
source := strings.TrimSpace(suggestion.Source)
if source == "" {
source = "unknown"
}
out[source]++
}
return out
}

func shortError(err error) string {
if err == nil {
return ""
}
message := strings.TrimSpace(err.Error())
if len(message) > 180 {
return message[:180] + "..."
}
return message
}

func trimSnippet(value string, max int) string {
trimmed := strings.TrimSpace(value)
if max <= 0 || len(trimmed) <= max {
return trimmed
}
return trimmed[:max] + "..."
}

func fieldAnchorID(fieldPath string) string {
path := strings.TrimSpace(strings.ToLower(fieldPath))
if path == "" {


internal/logging/logger.go (+258 −2)

@@ -1,10 +1,266 @@
package logging

import (
"context"
"fmt"
"io"
"log/slog"
"os"
"strings"
"sync"
"time"
)

-func New() *slog.Logger {
-return slog.New(slog.NewJSONHandler(os.Stdout, nil))
type Entry struct {
Timestamp time.Time `json:"timestamp"`
Level string `json:"level"`
Message string `json:"message"`
Fields map[string]any `json:"fields,omitempty"`
}

type RecentStore struct {
mu sync.RWMutex
entries []Entry
next int
size int
capacity int
}

func NewRecentStore(capacity int) *RecentStore {
if capacity <= 0 {
capacity = 200
}
return &RecentStore{
entries: make([]Entry, capacity),
capacity: capacity,
}
}

func (s *RecentStore) Add(entry Entry) {
if s == nil {
return
}
s.mu.Lock()
defer s.mu.Unlock()
s.entries[s.next] = entry
s.next = (s.next + 1) % s.capacity
if s.size < s.capacity {
s.size++
}
}

func (s *RecentStore) List(limit int) []Entry {
if s == nil {
return nil
}
s.mu.RLock()
defer s.mu.RUnlock()
if s.size == 0 {
return nil
}
if limit <= 0 || limit > s.size {
limit = s.size
}
out := make([]Entry, 0, limit)
for i := 0; i < limit; i++ {
idx := (s.next - 1 - i + s.capacity) % s.capacity
out = append(out, s.entries[idx])
}
return out
}

type recentHandler struct {
store *RecentStore
level slog.Leveler
attrs []slog.Attr
group []string
}

func NewRecentHandler(store *RecentStore, level slog.Leveler) slog.Handler {
if level == nil {
level = slog.LevelInfo
}
return &recentHandler{store: store, level: level}
}

func (h *recentHandler) Enabled(_ context.Context, level slog.Level) bool {
if h == nil || h.level == nil {
return level >= slog.LevelInfo
}
return level >= h.level.Level()
}

func (h *recentHandler) Handle(_ context.Context, r slog.Record) error {
if h == nil || h.store == nil {
return nil
}
fields := map[string]any{}
for _, attr := range h.attrs {
addField(fields, h.group, attr)
}
r.Attrs(func(attr slog.Attr) bool {
addField(fields, h.group, attr)
return true
})
if len(fields) == 0 {
fields = nil
}
h.store.Add(Entry{
Timestamp: r.Time.UTC(),
Level: r.Level.String(),
Message: strings.TrimSpace(r.Message),
Fields: fields,
})
return nil
}

func (h *recentHandler) WithAttrs(attrs []slog.Attr) slog.Handler {
next := &recentHandler{
store: h.store,
level: h.level,
group: append([]string(nil), h.group...),
}
next.attrs = append(append([]slog.Attr(nil), h.attrs...), attrs...)
return next
}

func (h *recentHandler) WithGroup(name string) slog.Handler {
next := &recentHandler{
store: h.store,
level: h.level,
attrs: append([]slog.Attr(nil), h.attrs...),
group: append([]string(nil), h.group...),
}
if trimmed := strings.TrimSpace(name); trimmed != "" {
next.group = append(next.group, trimmed)
}
return next
}

func addField(fields map[string]any, groups []string, attr slog.Attr) {
if attr.Equal(slog.Attr{}) {
return
}
key := strings.TrimSpace(attr.Key)
if key == "" {
return
}
prefix := strings.Join(groups, ".")
if prefix != "" {
key = prefix + "." + key
}
val := attr.Value.Resolve()
if val.Kind() == slog.KindGroup {
for _, nested := range val.Group() {
addField(fields, append(groups, strings.TrimSpace(attr.Key)), nested)
}
return
}
fields[key] = slogValueToAny(val)
}

func slogValueToAny(v slog.Value) any {
switch v.Kind() {
case slog.KindString:
return v.String()
case slog.KindBool:
return v.Bool()
case slog.KindInt64:
return v.Int64()
case slog.KindUint64:
return v.Uint64()
case slog.KindFloat64:
return v.Float64()
case slog.KindDuration:
return v.Duration().Milliseconds()
case slog.KindTime:
return v.Time().UTC().Format(time.RFC3339Nano)
case slog.KindAny:
return v.Any()
default:
return fmt.Sprint(v.Any())
}
}

type teeHandler struct {
handlers []slog.Handler
}

func newTeeHandler(handlers ...slog.Handler) slog.Handler {
nonNil := make([]slog.Handler, 0, len(handlers))
for _, h := range handlers {
if h != nil {
nonNil = append(nonNil, h)
}
}
return &teeHandler{handlers: nonNil}
}

func (h *teeHandler) Enabled(ctx context.Context, level slog.Level) bool {
for _, handler := range h.handlers {
if handler.Enabled(ctx, level) {
return true
}
}
return false
}

func (h *teeHandler) Handle(ctx context.Context, r slog.Record) error {
for _, handler := range h.handlers {
if !handler.Enabled(ctx, r.Level) {
continue
}
if err := handler.Handle(ctx, r.Clone()); err != nil {
return err
}
}
return nil
}

func (h *teeHandler) WithAttrs(attrs []slog.Attr) slog.Handler {
next := make([]slog.Handler, 0, len(h.handlers))
for _, handler := range h.handlers {
next = append(next, handler.WithAttrs(attrs))
}
return &teeHandler{handlers: next}
}

func (h *teeHandler) WithGroup(name string) slog.Handler {
next := make([]slog.Handler, 0, len(h.handlers))
for _, handler := range h.handlers {
next = append(next, handler.WithGroup(name))
}
return &teeHandler{handlers: next}
}

type SetupResult struct {
Logger *slog.Logger
Recent *RecentStore
Close func() error
}

func Setup() SetupResult {
recent := NewRecentStore(400)
stdoutHandler := slog.NewJSONHandler(os.Stdout, nil)
handlers := []slog.Handler{stdoutHandler, NewRecentHandler(recent, slog.LevelInfo)}

closers := make([]io.Closer, 0, 1)
if path := strings.TrimSpace(os.Getenv("LOG_FILE")); path != "" {
if file, err := os.OpenFile(path, os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0o644); err == nil {
handlers = append(handlers, slog.NewJSONHandler(file, nil))
closers = append(closers, file)
}
}

logger := slog.New(newTeeHandler(handlers...))
return SetupResult{
Logger: logger,
Recent: recent,
Close: func() error {
for _, closer := range closers {
_ = closer.Close()
}
return nil
},
}
}

internal/mapping/provider_suggestion_generator.go (+114 −1)

@@ -5,6 +5,7 @@ import (
"encoding/json"
"fmt"
"strings"
"time"

"qctextbuilder/internal/domain"
"qctextbuilder/internal/llmruntime"
@@ -27,21 +28,69 @@ func NewProviderAwareSuggestionGenerator(settings SettingsReader, runtimeFactory
}

func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req SuggestionRequest) (SuggestionResult, error) {
started := time.Now()
if g == nil || g.settings == nil || g.runtimeFactory == nil {
mappingLogger().WarnContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_aware_config",
"status", "failed",
"template_id", req.TemplateID,
"error", "provider-aware generator is not configured",
)
return SuggestionResult{}, fmt.Errorf("provider-aware generator is not configured")
}

settings, err := g.settings.GetSettings(ctx)
if err != nil || settings == nil {
mappingLogger().WarnContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_aware_settings",
"status", "failed",
"template_id", req.TemplateID,
"error", "llm settings are not available",
"duration_ms", time.Since(started).Milliseconds(),
)
return SuggestionResult{}, fmt.Errorf("llm settings are not available")
}
provider := domain.NormalizeLLMProvider(settings.LLMActiveProvider)
model := domain.NormalizeLLMModel(provider, settings.LLMActiveModel)
baseURL := strings.TrimSpace(settings.LLMBaseURL)
mappingLogger().InfoContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_aware_request",
"status", "start",
"provider", provider,
"model", model,
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
)
if strings.TrimSpace(model) == "" {
mappingLogger().WarnContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_aware_request",
"status", "failed",
"provider", provider,
"model", model,
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"error", "no active model configured",
"duration_ms", time.Since(started).Milliseconds(),
)
return SuggestionResult{}, fmt.Errorf("no active model configured")
}
apiKey := domain.LLMAPIKeyForProvider(provider, *settings)
if provider != domain.LLMProviderOllama && strings.TrimSpace(apiKey) == "" {
mappingLogger().WarnContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_aware_request",
"status", "failed",
"provider", provider,
"model", model,
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"error", "missing api key",
"duration_ms", time.Since(started).Milliseconds(),
)
return SuggestionResult{}, fmt.Errorf("api key for provider %s is not configured in settings", provider)
}

@@ -56,6 +105,17 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug

providerClient, err := g.runtimeFactory.ClientFor(provider)
if err != nil {
mappingLogger().WarnContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_aware_request",
"status", "failed",
"provider", provider,
"model", model,
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"error", shortErr(err),
"duration_ms", time.Since(started).Milliseconds(),
)
return SuggestionResult{}, err
}
systemPrompt, userPrompt := buildProviderPrompts(req, targets)
@@ -64,7 +124,7 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug
raw, err := providerClient.Generate(ctx, llmruntime.Request{
Provider: provider,
Model: model,
-BaseURL: strings.TrimSpace(settings.LLMBaseURL),
+BaseURL: baseURL,
APIKey: strings.TrimSpace(apiKey),
Temperature: &temperature,
MaxTokens: &maxTokens,
@@ -72,13 +132,55 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug
UserPrompt: userPrompt,
})
if err != nil {
mappingLogger().WarnContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_aware_request",
"status", "failed",
"provider", provider,
"model", model,
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"error", shortErr(err),
"duration_ms", time.Since(started).Milliseconds(),
)
return SuggestionResult{}, fmt.Errorf("provider request failed (provider=%s model=%s): %w", provider, model, err)
}
mappingLogger().InfoContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_aware_request",
"status", "success",
"provider", provider,
"model", model,
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"response_chars", len(strings.TrimSpace(raw)),
)

parsed, err := parseProviderSuggestions(raw)
if err != nil {
mappingLogger().WarnContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_parse",
"status", "failed",
"provider", provider,
"model", model,
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"error", shortErr(err),
"duration_ms", time.Since(started).Milliseconds(),
)
return SuggestionResult{}, fmt.Errorf("provider returned invalid suggestions json (provider=%s model=%s): %w", provider, model, err)
}
mappingLogger().InfoContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_parse",
"status", "success",
"provider", provider,
"model", model,
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"parsed_count", len(parsed),
)

out := SuggestionResult{
Suggestions: make([]Suggestion, 0, len(parsed)),
@@ -107,6 +209,17 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug
out.Suggestions = append(out.Suggestions, suggestion)
out.ByFieldPath[fieldPath] = suggestion
}
mappingLogger().InfoContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_aware_result",
"status", "success",
"provider", provider,
"model", model,
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"suggestion_count", len(out.Suggestions),
"duration_ms", time.Since(started).Milliseconds(),
)
return out, nil
}



internal/mapping/suggestion_generator.go (+190 −0)

@@ -5,6 +5,7 @@ import (
"fmt"
"sort"
"strings"
"time"

"qctextbuilder/internal/domain"
"qctextbuilder/internal/qcclient"
@@ -22,6 +23,14 @@ func NewRuleBasedSuggestionGenerator() *RuleBasedSuggestionGenerator {

func (g *RuleBasedSuggestionGenerator) Generate(_ context.Context, req SuggestionRequest) (SuggestionResult, error) {
result := SuggestFieldValuesRuleBased(req)
mappingLogger().Info("rule-based suggestion used",
"component", "autofill",
"step", "rule_based_fallback",
"status", "success",
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"suggestion_count", len(result.Suggestions),
)
return result, nil
}

@@ -45,10 +54,47 @@ func NewQCLLMSuggestionGenerator(qc qcclient.Client) *LLMSuggestionGenerator {
}

func (g *LLMSuggestionGenerator) Generate(ctx context.Context, req SuggestionRequest) (SuggestionResult, error) {
started := time.Now()
source := ""
if g != nil {
source = strings.TrimSpace(g.source)
}
step := "qc_fallback_request"
if source == domain.DraftSuggestionSourceLLM {
step = "llm_request"
}
mappingLogger().InfoContext(ctx, "llm suggestion request",
"component", "autofill",
"step", step,
"status", "start",
"source", source,
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
)
if g == nil || g.qc == nil {
mappingLogger().WarnContext(ctx, "llm suggestion request",
"component", "autofill",
"step", step,
"status", "failed",
"source", source,
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"error", "llm generator is not configured",
"duration_ms", time.Since(started).Milliseconds(),
)
return SuggestionResult{}, fmt.Errorf("llm generator is not configured")
}
if req.TemplateID <= 0 {
mappingLogger().WarnContext(ctx, "llm suggestion request",
"component", "autofill",
"step", step,
"status", "failed",
"source", source,
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"error", "template id is required for llm suggestions",
"duration_ms", time.Since(started).Milliseconds(),
)
return SuggestionResult{}, fmt.Errorf("template id is required for llm suggestions")
}

@@ -84,6 +130,16 @@ func (g *LLMSuggestionGenerator) Generate(ctx context.Context, req SuggestionReq
CustomTemplateData: customTemplateData,
})
if err != nil {
mappingLogger().WarnContext(ctx, "llm suggestion request",
"component", "autofill",
"step", step,
"status", "failed",
"source", source,
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"error", shortErr(err),
"duration_ms", time.Since(started).Milliseconds(),
)
return SuggestionResult{}, err
}

@@ -118,6 +174,16 @@ func (g *LLMSuggestionGenerator) Generate(ctx context.Context, req SuggestionReq
out.Suggestions = append(out.Suggestions, suggestion)
out.ByFieldPath[target.FieldPath] = suggestion
}
mappingLogger().InfoContext(ctx, "llm suggestion request",
"component", "autofill",
"step", step,
"status", "success",
"source", source,
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"suggestion_count", len(out.Suggestions),
"duration_ms", time.Since(started).Milliseconds(),
)
return out, nil
}

@@ -134,26 +200,90 @@ func NewCompositeSuggestionGenerator(primary, fallback SuggestionGenerator) *Com
}

func (g *CompositeSuggestionGenerator) Generate(ctx context.Context, req SuggestionRequest) (SuggestionResult, error) {
started := time.Now()
if g == nil {
return SuggestionResult{}, fmt.Errorf("suggestion generator is not configured")
}
if g.Primary == nil {
mappingLogger().InfoContext(ctx, "autofill fallback",
"component", "autofill",
"step", "fallback_attempt",
"status", "attempted",
"fallback_generator", generatorLabel(g.Fallback),
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
)
return generateFallback(ctx, g.Fallback, req)
}

primaryResult, err := g.Primary.Generate(ctx, req)
if err != nil {
mappingLogger().WarnContext(ctx, "autofill fallback",
"component", "autofill",
"step", "primary_failed",
"status", "failed",
"primary_generator", generatorLabel(g.Primary),
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"error", shortErr(err),
)
mappingLogger().InfoContext(ctx, "autofill fallback",
"component", "autofill",
"step", "qc_fallback",
"status", "attempted",
"fallback_generator", generatorLabel(g.Fallback),
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
)
return generateFallback(ctx, g.Fallback, req)
}
primaryResult = normalizeSuggestionResult(primaryResult, req.Fields, req.Existing, req.IncludeFilled)
if g.Fallback == nil {
mappingLogger().InfoContext(ctx, "autofill result",
"component", "autofill",
"step", "final",
"status", "success",
"source_path", "primary_only",
"suggestion_count", len(primaryResult.Suggestions),
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"duration_ms", time.Since(started).Milliseconds(),
)
return primaryResult, nil
}

fallbackResult, fbErr := g.Fallback.Generate(ctx, req)
if fbErr != nil {
mappingLogger().WarnContext(ctx, "autofill fallback",
"component", "autofill",
"step", "qc_fallback",
"status", "failed",
"fallback_generator", generatorLabel(g.Fallback),
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"error", shortErr(fbErr),
)
mappingLogger().InfoContext(ctx, "autofill result",
"component", "autofill",
"step", "final",
"status", "success",
"source_path", "primary_only_fallback_failed",
"suggestion_count", len(primaryResult.Suggestions),
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"duration_ms", time.Since(started).Milliseconds(),
)
return primaryResult, nil
}
mappingLogger().InfoContext(ctx, "autofill fallback",
"component", "autofill",
"step", "qc_fallback",
"status", "success",
"fallback_generator", generatorLabel(g.Fallback),
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"suggestion_count", len(fallbackResult.Suggestions),
)
fallbackResult = normalizeSuggestionResult(fallbackResult, req.Fields, req.Existing, req.IncludeFilled)
merged := primaryResult
if merged.ByFieldPath == nil {
@@ -169,6 +299,36 @@ func (g *CompositeSuggestionGenerator) Generate(ctx context.Context, req Suggest
sort.SliceStable(merged.Suggestions, func(i, j int) bool {
return merged.Suggestions[i].FieldPath < merged.Suggestions[j].FieldPath
})
sourcePath := "primary_plus_fallback"
if len(primaryResult.Suggestions) == 0 && len(fallbackResult.Suggestions) > 0 {
sourcePath = "fallback_only"
}
ruleBasedCount := 0
for _, suggestion := range merged.Suggestions {
if strings.EqualFold(strings.TrimSpace(suggestion.Source), domain.DraftSuggestionSourceFallbackRuleBased) {
ruleBasedCount++
}
}
if ruleBasedCount > 0 {
mappingLogger().InfoContext(ctx, "autofill fallback",
"component", "autofill",
"step", "rule_based_fallback",
"status", "used",
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"suggestion_count", ruleBasedCount,
)
}
mappingLogger().InfoContext(ctx, "autofill result",
"component", "autofill",
"step", "final",
"status", "success",
"source_path", sourcePath,
"suggestion_count", len(merged.Suggestions),
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"duration_ms", time.Since(started).Milliseconds(),
)
return merged, nil
}

@@ -176,10 +336,40 @@ func generateFallback(ctx context.Context, fallback SuggestionGenerator, req Sug
if fallback == nil {
return SuggestionResult{}, fmt.Errorf("fallback suggestion generator is not configured")
}
started := time.Now()
label := generatorLabel(fallback)
mappingLogger().InfoContext(ctx, "autofill fallback",
"component", "autofill",
"step", "fallback_attempt",
"status", "attempted",
"fallback_generator", label,
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
)
result, err := fallback.Generate(ctx, req)
if err != nil {
mappingLogger().WarnContext(ctx, "autofill fallback",
"component", "autofill",
"step", "fallback_attempt",
"status", "failed",
"fallback_generator", label,
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"error", shortErr(err),
"duration_ms", time.Since(started).Milliseconds(),
)
return SuggestionResult{}, err
}
mappingLogger().InfoContext(ctx, "autofill fallback",
"component", "autofill",
"step", "fallback_attempt",
"status", "success",
"fallback_generator", label,
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"suggestion_count", len(result.Suggestions),
"duration_ms", time.Since(started).Milliseconds(),
)
return normalizeSuggestionResult(result, req.Fields, req.Existing, req.IncludeFilled), nil
}



+4  -0  internal/mapping/suggestions.go  View file

@@ -12,6 +12,7 @@ import (

type SuggestionRequest struct {
TemplateID int64
DraftID string
Fields []domain.TemplateField
GlobalData map[string]any
DraftContext *domain.DraftContext
@@ -292,6 +293,7 @@ func GenerateAllSuggestions(ctx context.Context, generator SuggestionGenerator,
}
generated, err := suggestionResultWithFallback(ctx, generator, SuggestionRequest{
TemplateID: req.TemplateID,
DraftID: req.DraftID,
Fields: req.Fields,
GlobalData: req.GlobalData,
DraftContext: req.DraftContext,
@@ -318,6 +320,7 @@ func RegenerateAllSuggestions(ctx context.Context, generator SuggestionGenerator
next.ByFieldPath = map[string]domain.DraftSuggestion{}
generated, err := suggestionResultWithFallback(ctx, generator, SuggestionRequest{
TemplateID: req.TemplateID,
DraftID: req.DraftID,
Fields: req.Fields,
GlobalData: req.GlobalData,
DraftContext: req.DraftContext,
@@ -347,6 +350,7 @@ func RegenerateFieldSuggestion(ctx context.Context, generator SuggestionGenerato
}
generated, err := suggestionResultWithFallback(ctx, generator, SuggestionRequest{
TemplateID: req.TemplateID,
DraftID: req.DraftID,
Fields: req.Fields,
GlobalData: req.GlobalData,
DraftContext: req.DraftContext,

