
chore: add configurable log levels for diagnostics

master
Jan Svabenik, 1 month ago
parent commit 9ed72a6db5
9 changed files with 130 additions and 18 deletions
  1. README.md (+3 -2)
  2. docs/TARGET_STATE_AND_ROADMAP.md (+1 -1)
  3. internal/app/app.go (+1 -1)
  4. internal/config/config.go (+2 -0)
  5. internal/httpserver/handlers/ui.go (+45 -1)
  6. internal/llmruntime/runtime.go (+4 -4)
  7. internal/logging/logger.go (+21 -4)
  8. internal/mapping/provider_suggestion_generator.go (+31 -3)
  9. internal/mapping/suggestions.go (+22 -2)

README.md (+3 -2)

@@ -21,8 +21,8 @@ The app can currently:
  - Composite fallback runs only for real target-field gaps: no fallback call at full primary coverage, and at partial coverage only for missing `fieldPath`s; primary sources stay prioritized during merge.
  - Suggestion workflow separated from field values (preview), incl. `Generate all`, `Regenerate all`, `Apply all to empty` and per-field `Apply`/`Regenerate` in the draft/build UI.
  - Technical field details (e.g. `fieldPath`, suggestion metadata, slot preview) are hidden by default in the UI and only visible via the debug toggle.
- - Structured debug logs for the autofill/LLM flow incl. provider path, QC fallback, rule-based fallback, and validate action (short metadata, error summary, duration, suggestion count).
- - Deepened provider diagnostics in the provider-aware LLM path (esp. OpenAI-compatible): request-level metadata incl. provider/model/base URL, prompt/payload snippets, raw response snippets plus shape hints, and extract/parse/suggestion samples; each sensibly truncated and without API-key/Authorization leaks.
+ - Structured autofill/LLM logs with operational summaries at INFO (provider path, fallback status, validate action, duration, suggestion count).
+ - Deepened provider/runtime diagnostics (prompt/payload/response/extract/parse snippets, samples, field-transition details) remain available and are visible via `LOG_LEVEL=debug`; API-key/Authorization data stays redacted.
  - Small internal log API `GET /api/logs?limit=<n>` for recent in-memory log entries (ring buffer, newest first).
  - Start builds from reviewed data, poll job status, and fetch the editor URL.

@@ -39,6 +39,7 @@ Important:
  - `DB_URL=data/qctextbuilder.db` (default)
  - `QC_BASE_URL=https://qc-api.yggdrasil.dev-mono.net/api/v1`
  - `QC_TOKEN=<bearer token>`
+ - optional: `LOG_LEVEL=info|debug|warn|error` (default `info`; use `debug` for deep autofill/provider diagnostics)
  - optional: `LOG_FILE=logs/qctextbuilder.log` for an additional JSON log file (stdout stays active)
 2. Start:
  - `go run ./cmd/qctextbuilder`


docs/TARGET_STATE_AND_ROADMAP.md (+1 -1)

@@ -50,7 +50,7 @@ Current state:
  - Model selection is implemented provider-aware and statically, structured so that dynamic model lists/refresh can be attached later.
  - Technical field details (e.g. field paths/slots/suggestion metadata) can optionally be shown in the UI via the debug toggle.
  - Structured debug logs for the autofill/LLM path are active (provider-aware request/parse, QC fallback, rule-based fallback, validate action; no prompt/secret dumps).
- - Focused debugging in the provider-aware runtime path was deepened: truncated request/prompt/payload, raw-response, shape, and extract/parse samples for faster root-cause analysis of provider/output-shape mismatches; API-key/Authorization data stays redacted.
+ - Focused debugging in the provider-aware runtime path was deepened and tiered by log level: operational summaries run at INFO; deep request/prompt/payload, raw-response, shape, and extract/parse samples plus field-transition details at DEBUG; API-key/Authorization data stays redacted.
  - A small internal log API (`GET /api/logs`) serves recent structured log entries from an in-memory ring buffer.
  - Starting a build already requires a template-manifest status of `reviewed`/`validated`.
  - Process-level review gates (e.g. approval policy, roles, mandatory per-field checks) are not yet fully built out.


internal/app/app.go (+1 -1)

@@ -36,7 +36,7 @@ type App struct {
 }

 func New(cfg config.Config) (*App, error) {
-	logSetup := logging.Setup()
+	logSetup := logging.Setup(cfg.LogLevel)
 	logger := logSetup.Logger
 	slog.SetDefault(logger)



internal/config/config.go (+2 -0)

@@ -10,6 +10,7 @@ type Config struct {
 	DBDriver            string
 	DBURL               string
 	AppSecret           string
+	LogLevel            string
 	QCBaseURL           string
 	QCToken             string
 	PollIntervalSeconds int
@@ -23,6 +24,7 @@ func Load() Config {
 		DBDriver:            getenv("DB_DRIVER", "sqlite"),
 		DBURL:               getenv("DB_URL", "data/qctextbuilder.db"),
 		AppSecret:           os.Getenv("APP_SECRET"),
+		LogLevel:            getenv("LOG_LEVEL", "info"),
 		QCBaseURL:           getenv("QC_BASE_URL", "https://qc-api.yggdrasil.dev-mono.net/api/v1"),
 		QCToken:             os.Getenv("QC_TOKEN"),
 		PollIntervalSeconds: getenvInt("POLL_INTERVAL_SECONDS", 5),


internal/httpserver/handlers/ui.go (+45 -1)

@@ -681,6 +681,19 @@ func (u *UI) AutofillDraft(w http.ResponseWriter, r *http.Request) {
 	default:
 		msg = "unknown autofill action"
 	}
+	u.logger.InfoContext(r.Context(), "autofill state transition",
+		"component", "autofill",
+		"step", "pre_action_summary",
+		"action", action,
+		"state_count", len(suggestionState.ByFieldPath),
+		"state_sources", summarizeSuggestionSources(suggestionState),
+	)
+	u.logger.DebugContext(r.Context(), "autofill state transition",
+		"component", "autofill",
+		"step", "pre_action_summary",
+		"action", action,
+		"sample_sources", sampleSuggestionSources(suggestionState, 5),
+	)
 	sourceCounts := summarizeSuggestionSources(suggestionState)
 	u.logger.InfoContext(r.Context(), "autofill action",
 		"component", "autofill",
@@ -967,9 +980,17 @@ func validateLLMProviderConfig(ctx context.Context, settings domain.AppSettings,
 		"provider", provider,
 		"model", model,
 		"base_url", baseURL,
-		"response_snippet", trimSnippet(resp, 40),
 		"duration_ms", time.Since(started).Milliseconds(),
 	)
+	logger.DebugContext(ctx, "validate llm provider config",
+		"component", "autofill",
+		"step", "validate_provider",
+		"status", "success",
+		"provider", provider,
+		"model", model,
+		"base_url", baseURL,
+		"response_snippet", trimSnippet(resp, 40),
+	)
 	return nil
 }
@@ -1737,6 +1758,29 @@ func summarizeSuggestionSources(state domain.DraftSuggestionState) map[string]in
 	return out
 }

+func sampleSuggestionSources(state domain.DraftSuggestionState, limit int) map[string]string {
+	if limit <= 0 || len(state.ByFieldPath) == 0 {
+		return map[string]string{}
+	}
+	paths := make([]string, 0, len(state.ByFieldPath))
+	for path := range state.ByFieldPath {
+		paths = append(paths, path)
+	}
+	sort.Strings(paths)
+	if len(paths) > limit {
+		paths = paths[:limit]
+	}
+	out := make(map[string]string, len(paths))
+	for _, path := range paths {
+		source := strings.TrimSpace(state.ByFieldPath[path].Source)
+		if source == "" {
+			source = "unknown"
+		}
+		out[path] = source
+	}
+	return out
+}
+
 func shortError(err error) string {
 	if err == nil {
 		return ""
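The `sampleSuggestionSources` helper above keeps DEBUG payloads small and deterministic: sort the keys, cap at `limit`, normalize empty sources. The same idea with a plain map standing in for `domain.DraftSuggestionState` (a sketch, not the project's code):

```go
package main

import (
	"fmt"
	"sort"
)

// sampleKeys mirrors the bounded-sample helper above: sort keys for
// deterministic output, then cap at limit so a DEBUG log attribute
// stays small even for large suggestion states. Generic sketch.
func sampleKeys(m map[string]string, limit int) map[string]string {
	if limit <= 0 || len(m) == 0 {
		return map[string]string{}
	}
	keys := make([]string, 0, len(m))
	for k := range m {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	if len(keys) > limit {
		keys = keys[:limit]
	}
	out := make(map[string]string, len(keys))
	for _, k := range keys {
		out[k] = m[k]
	}
	return out
}

func main() {
	state := map[string]string{"c": "llm", "a": "qc", "b": "rule", "d": "llm"}
	fmt.Println(sampleKeys(state, 2)) // map[a:qc b:rule]
}
```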


internal/llmruntime/runtime.go (+4 -4)

@@ -96,7 +96,7 @@ func (c *openAICompatibleClient) Generate(ctx context.Context, req Request) (str
 	payload[maxTokensField] = optionalInt(req.MaxTokens, 1200)
 	payloadRaw, _ := json.Marshal(payload)
 	payloadSnippet := redactSecrets(snippet(string(payloadRaw), runtimePayloadSnippetLimit), req.APIKey)
-	runtimeLogger().InfoContext(ctx, "llm runtime",
+	runtimeLogger().DebugContext(ctx, "llm runtime",
 		"component", "autofill",
 		"step", "provider_http_request",
 		"provider", provider,
@@ -116,7 +116,7 @@ func (c *openAICompatibleClient) Generate(ctx context.Context, req Request) (str
 		return "", err
 	}
 	rawResponse := strings.TrimSpace(string(body))
-	runtimeLogger().InfoContext(ctx, "llm runtime",
+	runtimeLogger().DebugContext(ctx, "llm runtime",
 		"component", "autofill",
 		"step", "provider_http_response",
 		"provider", provider,
@@ -130,7 +130,7 @@ func (c *openAICompatibleClient) Generate(ctx context.Context, req Request) (str
 		return "", fmt.Errorf("decode openai-compatible response: %w", err)
 	}
 	shape := describeOpenAICompatibleShape(response)
-	runtimeLogger().InfoContext(ctx, "llm runtime",
+	runtimeLogger().DebugContext(ctx, "llm runtime",
 		"component", "autofill",
 		"step", "provider_http_response_shape",
 		"provider", provider,
@@ -138,7 +138,7 @@ func (c *openAICompatibleClient) Generate(ctx context.Context, req Request) (str
 		"response_shape_hint", snippet(shape, runtimeShapeSnippetLimit),
 	)
 	content := extractOpenAICompatibleContent(response)
-	runtimeLogger().InfoContext(ctx, "llm runtime",
+	runtimeLogger().DebugContext(ctx, "llm runtime",
 		"component", "autofill",
 		"step", "provider_extract",
 		"provider", provider,


internal/logging/logger.go (+21 -4)

@@ -239,15 +239,17 @@ type SetupResult struct {
 	Close func() error
 }

-func Setup() SetupResult {
+func Setup(levelRaw string) SetupResult {
+	level := parseLevel(levelRaw)
+	handlerOptions := &slog.HandlerOptions{Level: level}
 	recent := NewRecentStore(400)
-	stdoutHandler := slog.NewJSONHandler(os.Stdout, nil)
-	handlers := []slog.Handler{stdoutHandler, NewRecentHandler(recent, slog.LevelInfo)}
+	stdoutHandler := slog.NewJSONHandler(os.Stdout, handlerOptions)
+	handlers := []slog.Handler{stdoutHandler, NewRecentHandler(recent, level)}

 	closers := make([]io.Closer, 0, 1)
 	if path := strings.TrimSpace(os.Getenv("LOG_FILE")); path != "" {
 		if file, err := os.OpenFile(path, os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0o644); err == nil {
-			handlers = append(handlers, slog.NewJSONHandler(file, nil))
+			handlers = append(handlers, slog.NewJSONHandler(file, handlerOptions))
 			closers = append(closers, file)
 		}
 	}
@@ -264,3 +266,18 @@ func Setup() SetupResult {
 		},
 	}
 }
+
+func parseLevel(raw string) slog.Level {
+	switch strings.ToLower(strings.TrimSpace(raw)) {
+	case "trace":
+		return slog.LevelDebug - 4
+	case "debug":
+		return slog.LevelDebug
+	case "warn", "warning":
+		return slog.LevelWarn
+	case "error":
+		return slog.LevelError
+	default:
+		return slog.LevelInfo
+	}
+}

internal/mapping/provider_suggestion_generator.go (+31 -3)

@@ -123,7 +123,7 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug
 	systemPrompt, userPrompt := buildProviderPrompts(req, targets)
 	temperature := domain.NormalizeLLMTemperature(settings.LLMTemperature)
 	maxTokens := domain.NormalizeLLMMaxTokens(settings.LLMMaxTokens)
-	mappingLogger().InfoContext(ctx, "provider-aware suggestion",
+	mappingLogger().DebugContext(ctx, "provider-aware suggestion",
 		"component", "autofill",
 		"step", "provider_aware_request_payload",
 		"status", "start",
@@ -170,9 +170,18 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug
 		"template_id", req.TemplateID,
 		"draft_id", strings.TrimSpace(req.DraftID),
 		"response_chars", len(strings.TrimSpace(raw)),
 	)
+	mappingLogger().DebugContext(ctx, "provider-aware suggestion",
+		"component", "autofill",
+		"step", "provider_aware_request",
+		"status", "success",
+		"provider", provider,
+		"model", model,
+		"template_id", req.TemplateID,
+		"draft_id", strings.TrimSpace(req.DraftID),
+		"response_snippet", providerLogSnippet(raw, 4000),
+	)
-	mappingLogger().InfoContext(ctx, "provider-aware suggestion",
+	mappingLogger().DebugContext(ctx, "provider-aware suggestion",
 		"component", "autofill",
 		"step", "provider_parse_input",
 		"status", "start",
@@ -208,6 +217,15 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug
 		"template_id", req.TemplateID,
 		"draft_id", strings.TrimSpace(req.DraftID),
 		"parsed_count", len(parsed),
 	)
+	mappingLogger().DebugContext(ctx, "provider-aware suggestion",
+		"component", "autofill",
+		"step", "provider_parse",
+		"status", "success",
+		"provider", provider,
+		"model", model,
+		"template_id", req.TemplateID,
+		"draft_id", strings.TrimSpace(req.DraftID),
+		"parsed_sample", providerSuggestionSample(parsed, 5),
+	)
@@ -247,9 +265,19 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug
 		"template_id", req.TemplateID,
 		"draft_id", strings.TrimSpace(req.DraftID),
 		"suggestion_count", len(out.Suggestions),
 		"sources", summarizeResultSources(out),
 		"duration_ms", time.Since(started).Milliseconds(),
 	)
+	mappingLogger().DebugContext(ctx, "provider-aware suggestion",
+		"component", "autofill",
+		"step", "provider_aware_result",
+		"status", "success",
+		"provider", provider,
+		"model", model,
+		"template_id", req.TemplateID,
+		"draft_id", strings.TrimSpace(req.DraftID),
+		"suggestion_sample_sources", sampleResultSources(out, 5),
+		"suggestion_sample", suggestionLogSample(out, 5),
+		"duration_ms", time.Since(started).Milliseconds(),
+	)
 	return out, nil
 }


internal/mapping/suggestions.go (+22 -2)

@@ -311,6 +311,11 @@ func GenerateAllSuggestions(ctx context.Context, generator SuggestionGenerator,
 		"action", "generate_all",
 		"generated_count", len(generated.ByFieldPath),
 		"generated_sources", summarizeResultSources(generated),
 	)
+	mappingLogger().DebugContext(ctx, "autofill state transition",
+		"component", "autofill",
+		"step", "post_generate_result",
+		"action", "generate_all",
+		"sample_sources", sampleResultSources(generated, 5),
+	)
 	for _, s := range generated.Suggestions {
@@ -329,7 +334,7 @@
 		if explicitSource := strings.TrimSpace(s.Source); explicitSource != "" {
 			stored.Source = explicitSource
 		}
-		mappingLogger().InfoContext(ctx, "autofill state transition",
+		mappingLogger().DebugContext(ctx, "autofill state transition",
 			"component", "autofill",
 			"step", "apply_field_transition",
 			"action", "generate_all",
@@ -346,6 +351,11 @@
 		"action", "generate_all",
 		"state_count", len(next.ByFieldPath),
 		"state_sources", summarizeDraftSuggestionSources(next),
 	)
+	mappingLogger().DebugContext(ctx, "autofill state transition",
+		"component", "autofill",
+		"step", "post_generate_apply_state",
+		"action", "generate_all",
+		"sample_sources", sampleDraftSuggestionSources(next, 5),
+	)
 	next.UpdatedAt = now.UTC()
@@ -375,6 +385,11 @@ func RegenerateAllSuggestions(ctx context.Context, generator SuggestionGenerator
 		"action", "regenerate_all",
 		"generated_count", len(generated.ByFieldPath),
 		"generated_sources", summarizeResultSources(generated),
 	)
+	mappingLogger().DebugContext(ctx, "autofill state transition",
+		"component", "autofill",
+		"step", "post_generate_result",
+		"action", "regenerate_all",
+		"sample_sources", sampleResultSources(generated, 5),
+	)
 	for _, s := range generated.Suggestions {
@@ -388,7 +403,7 @@
 		if explicitSource := strings.TrimSpace(s.Source); explicitSource != "" {
 			stored.Source = explicitSource
 		}
-		mappingLogger().InfoContext(ctx, "autofill state transition",
+		mappingLogger().DebugContext(ctx, "autofill state transition",
 			"component", "autofill",
 			"step", "apply_field_transition",
 			"action", "regenerate_all",
@@ -405,6 +420,11 @@
 		"action", "regenerate_all",
 		"state_count", len(next.ByFieldPath),
 		"state_sources", summarizeDraftSuggestionSources(next),
 	)
+	mappingLogger().DebugContext(ctx, "autofill state transition",
+		"component", "autofill",
+		"step", "post_generate_apply_state",
+		"action", "regenerate_all",
+		"sample_sources", sampleDraftSuggestionSources(next, 5),
+	)
 	next.UpdatedAt = now.UTC()

