
fix: preserve provider suggestions over stale fallback state

master
Jan Svabenik, 1 month ago
parent commit 4f7d80c06b
14 files changed, 895 additions and 24 deletions
  1. +3 -1 README.md
  2. +3 -1 docs/TARGET_STATE_AND_ROADMAP.md
  3. +28 -0 internal/domain/llm_settings_test.go
  4. +42 -0 internal/httpserver/handlers/logs_api_test.go
  5. +122 -7 internal/llmruntime/runtime.go
  6. +7 -1 internal/llmruntime/runtime_test.go
  7. +45 -0 internal/logging/logger_test.go
  8. +45 -0 internal/mapping/logging.go
  9. +36 -0 internal/mapping/logging_test.go
  10. +102 -0 internal/mapping/provider_suggestion_generator.go
  11. +73 -5 internal/mapping/suggestion_generator.go
  12. +167 -7 internal/mapping/suggestions.go
  13. +214 -0 internal/mapping/suggestions_test.go
  14. +8 -2 run-local.ps1

+3 -1 README.md

@@ -13,14 +13,16 @@ The app can currently:
- Maintain the LLM base configuration in the Settings/Config area: active provider, active model (provider-aware static selection list), base URL for Ollama/compatible endpoints, temperature/max tokens, plus separate API key stores per provider (OpenAI, Anthropic, Google, xAI, Ollama).
- Validate the LLM provider configuration in Settings via a lightweight validate action (active provider/model/key/base URL checked with a short runtime request).
- OpenAI-compatible runtime requests internally pick the token-limit parameter in a model-compatible way (`max_completion_tokens` for OpenAI GPT-5 models, otherwise `max_tokens`), including the Settings validate action.
- OpenAI-compatible runtime responses are extracted robustly across several chat/GPT-5-compatible content shapes (including `choices[].message.content` as string/part array as well as `output_text`/`output[].content`); on an empty result, only safe structure diagnostics (keys/types) are returned, never prompt or secret contents.
- OpenAI-compatible runtime responses are extracted robustly across several chat/GPT-5-compatible content shapes (including `choices[].message.content` as string/part array as well as `output_text`/`output[].content`); on an empty result, only safe, prioritized structure diagnostics (including `choices`/`message` shapes, `message.content` type/length, and `finish_reason` if present) are returned, never prompt or secret contents.
- Focus the user flow in the draft/build UI on master data, intake/website context, style selection, and template fields; prompt internals live in Settings.
- Map internal semantic target slots (e.g. `hero.title`, `service_items[n].description`) onto template fields in preparation for later LLM autofill.
- Repeated areas in semantic slots are separated by block/role (e.g. services/team/testimonials per item instead of one collective slot).
- LLM-first autofill suggestions via a provider-aware runtime (OpenAI, Anthropic, Google, xAI, Ollama/compatible) with the active provider/model selection from Settings, structured field mapping onto `fieldPath`/slot, and a rule-based fallback for outage/test cases.
- The composite fallback runs only for genuine target-field gaps: no fallback call at full primary coverage, and at partial coverage only for the missing `fieldPath`s; primary sources keep priority during the merge.
- Suggestion workflow kept separate from field values (preview), including `Generate all`, `Regenerate all`, `Apply all to empty`, plus per-field `Apply`/`Regenerate` in the draft/build UI.
- Technical field details (e.g. `fieldPath`, suggestion metadata, slot preview) are hidden in the UI by default and only visible via a debug toggle.
- Structured debug logs for the autofill/LLM flow including the provider path, QC fallback, rule-based fallback, and validate action (short metadata, error summary, duration, suggestion count).
- Deepened provider diagnostics in the provider-aware LLM path (especially OpenAI-compatible): request-level metadata including provider/model/base URL, prompt/payload snippets, raw response snippets plus shape hints, and extract/parse/suggestion samples; all sensibly truncated and without API-key/authorization leaks.
- A small internal log API `GET /api/logs?limit=<n>` for recent in-memory log entries (ring buffer, newest first).
- Start builds from reviewed data, poll job status, and load the editor URL.
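
The model-aware token-limit choice described above can be sketched as follows. This is a minimal standalone sketch, not the project's implementation: the real logic lives in `openAICompatibleMaxTokensField` in `internal/llmruntime/runtime.go`, and the exact GPT-5 matching rule here (a case-insensitive `gpt-5` prefix for the `openai` provider) is an assumption for illustration.

```go
package main

import (
	"fmt"
	"strings"
)

// maxTokensField sketches the model-compatible parameter choice: OpenAI
// GPT-5-family models expect `max_completion_tokens`, while other
// OpenAI-compatible targets keep the classic `max_tokens` field.
// The prefix rule below is a simplified assumption.
func maxTokensField(provider, model string) string {
	p := strings.ToLower(strings.TrimSpace(provider))
	m := strings.ToLower(strings.TrimSpace(model))
	if p == "openai" && strings.HasPrefix(m, "gpt-5") {
		return "max_completion_tokens"
	}
	return "max_tokens"
}

func main() {
	fmt.Println(maxTokensField("openai", "gpt-5.4-mini")) // max_completion_tokens
	fmt.Println(maxTokensField("ollama", "llama3"))       // max_tokens
}
```

The same helper is reused by the Settings validate action, so a misconfigured model cannot silently send the wrong field there either.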



+3 -1 docs/TARGET_STATE_AND_ROADMAP.md

@@ -43,12 +43,14 @@ Current state:
- Repeated sections (services/team/testimonials, among others) are separated per item by block and role type in the slot preview instead of collapsing into collective slots.
- LLM-first suggestion state for the draft/build UI is in place: suggestions are stored separately from field values and controlled explicitly via generate/regenerate/apply (globally and per field); rule-based remains active as the last fallback/test path.
- The provider-aware suggestion runtime is active: settings (`llm_active_provider`, `llm_active_model`, `llm_temperature`, `llm_max_tokens`, provider-specific API key, `llm_base_url` for Ollama/compatible endpoints) drive the primary runtime path; the existing QC path remains as a compatibility fallback.
- The composite suggestion merge uses coverage-based fallback logic: the fallback is only queried for missing target field paths, and primary suggestions always keep priority over fallback values in the merge.
- OpenAI-compatible requests internally use the model-appropriate token-limit parameter (`max_completion_tokens` for OpenAI GPT-5 models, otherwise `max_tokens`), including in the Settings validate path.
- OpenAI-compatible runtime response extraction is more robust for newer GPT-5/OpenAI-compatible shapes (string/part content in `choices[].message.content` as well as `output_text`/`output[].content`) and on empty content returns safe shape diagnostics without content dumps.
- OpenAI-compatible runtime response extraction is more robust for newer GPT-5/OpenAI-compatible shapes (string/part content in `choices[].message.content` as well as `output_text`/`output[].content`) and on empty content returns safe, prioritized shape diagnostics (including `choices`/`message`, `message.content` type/length, `finish_reason` if present) without content dumps.
- Settings include a lightweight validate action for the active provider configuration (short runtime check) without bypassing the draft/review flow.
- Model selection is implemented provider-aware and statically, structured so that dynamic model lists/refresh can be attached later.
- Technical field details (e.g. field paths/slots/suggestion metadata) can optionally be shown in the UI via a debug toggle.
- Structured debug logs for the autofill/LLM path are active (provider-aware request/parse, QC fallback, rule-based fallback, validate action; without prompt/secret dumps).
- Focused debugging in the provider-aware runtime path has been deepened: truncated request/prompt/payload, raw-response, shape, and extract/parse samples for faster root-cause analysis of provider/output-shape mismatches; API-key/authorization data stays redacted.
- A small internal log API (`GET /api/logs`) serves recent structured log entries from an in-memory ring buffer.
- Starting a build already requires a template manifest status of `reviewed`/`validated`.
- Process-level review gates (e.g. approval policy, roles, mandatory per-field checks) are not yet fully built out.
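
The coverage-based fallback logic above can be condensed into the following sketch. Types are simplified to plain strings here; the real implementation is `missingSuggestionFieldPaths`/`narrowedSuggestionRequest` in `internal/mapping/suggestion_generator.go`, which operate on `SemanticSlotTarget` and `SuggestionRequest`.

```go
package main

import "fmt"

// missingPaths sketches the coverage check: the fallback generator is only
// consulted for target field paths the primary result left uncovered;
// duplicates and empty paths are skipped. An empty result means full primary
// coverage, so no fallback call is made at all.
func missingPaths(targets []string, primary map[string]string) []string {
	missing := make([]string, 0, len(targets))
	seen := map[string]struct{}{}
	for _, path := range targets {
		if path == "" {
			continue
		}
		if _, dup := seen[path]; dup {
			continue
		}
		seen[path] = struct{}{}
		if _, covered := primary[path]; covered {
			continue
		}
		missing = append(missing, path)
	}
	return missing
}

func main() {
	targets := []string{"hero.title", "hero.subtitle", "service_items[0].description"}
	primary := map[string]string{"hero.title": "Primary headline"}
	// Partial coverage: the fallback would only be asked for the two
	// uncovered paths, and the primary value for hero.title keeps priority.
	fmt.Println(missingPaths(targets, primary))
}
```

Because primary suggestions always win in the merge, the fallback can never overwrite a provider-generated value; it only fills the gaps listed here.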


+28 -0 internal/domain/llm_settings_test.go

@@ -0,0 +1,28 @@
package domain

import "testing"

func TestLLMProviderOptions_OpenAIIncludesGPT54Family(t *testing.T) {
	t.Parallel()

	models := LLMModelsByProvider(LLMProviderOpenAI)
	if len(models) == 0 {
		t.Fatalf("expected openai model list")
	}

	required := map[string]bool{
		"gpt-5.4":      false,
		"gpt-5.4-mini": false,
		"gpt-5.4-nano": false,
	}
	for _, model := range models {
		if _, ok := required[model.Value]; ok {
			required[model.Value] = true
		}
	}
	for model, present := range required {
		if !present {
			t.Fatalf("missing openai model option: %s", model)
		}
	}
}

+42 -0 internal/httpserver/handlers/logs_api_test.go

@@ -0,0 +1,42 @@
package handlers

import (
	"encoding/json"
	"net/http"
	"net/http/httptest"
	"testing"
	"time"

	"qctextbuilder/internal/logging"
)

func TestListLogsHonorsLimit(t *testing.T) {
	t.Parallel()

	recent := logging.NewRecentStore(10)
	recent.Add(logging.Entry{Timestamp: time.Now().UTC(), Level: "INFO", Message: "older"})
	recent.Add(logging.Entry{Timestamp: time.Now().UTC(), Level: "INFO", Message: "newer"})

	api := NewAPI(nil, nil, nil, nil, recent)
	req := httptest.NewRequest(http.MethodGet, "/api/logs?limit=1", nil)
	w := httptest.NewRecorder()

	api.ListLogs(w, req)
	if w.Code != http.StatusOK {
		t.Fatalf("unexpected status: %d", w.Code)
	}

	var payload struct {
		Count int             `json:"count"`
		Logs  []logging.Entry `json:"logs"`
	}
	if err := json.Unmarshal(w.Body.Bytes(), &payload); err != nil {
		t.Fatalf("decode response: %v", err)
	}
	if payload.Count != 1 || len(payload.Logs) != 1 {
		t.Fatalf("unexpected payload: %+v", payload)
	}
	if payload.Logs[0].Message != "newer" {
		t.Fatalf("expected newest log entry, got %q", payload.Logs[0].Message)
	}
}

+122 -7 internal/llmruntime/runtime.go

@@ -6,6 +6,7 @@ import (
"encoding/json"
"fmt"
"io"
"log/slog"
"net/http"
"net/url"
"sort"
@@ -24,6 +25,13 @@ type Request struct {
	UserPrompt string
}

const (
	runtimeSnippetLimit        = 4000
	runtimePromptSnippetLimit  = 1500
	runtimePayloadSnippetLimit = 5000
	runtimeShapeSnippetLimit   = 1200
)

type Client interface {
	Generate(ctx context.Context, req Request) (string, error)
}
@@ -60,9 +68,11 @@ type openAICompatibleClient struct {
}

func (c *openAICompatibleClient) Generate(ctx context.Context, req Request) (string, error) {
provider := strings.ToLower(strings.TrimSpace(req.Provider))
model := strings.TrimSpace(req.Model)
baseURL := strings.TrimRight(strings.TrimSpace(req.BaseURL), "/")
if baseURL == "" {
switch strings.ToLower(strings.TrimSpace(req.Provider)) {
switch provider {
case "xai":
baseURL = "https://api.x.ai"
case "ollama":
@@ -72,26 +82,70 @@ func (c *openAICompatibleClient) Generate(ctx context.Context, req Request) (str
}
}

systemPrompt := strings.TrimSpace(req.SystemPrompt)
userPrompt := strings.TrimSpace(req.UserPrompt)
payload := map[string]any{
"model": strings.TrimSpace(req.Model),
"model": model,
"temperature": optionalFloat64(req.Temperature, 0),
"messages": []map[string]string{
{"role": "system", "content": strings.TrimSpace(req.SystemPrompt)},
{"role": "user", "content": strings.TrimSpace(req.UserPrompt)},
{"role": "system", "content": systemPrompt},
{"role": "user", "content": userPrompt},
},
}
payload[openAICompatibleMaxTokensField(req.Provider, req.Model)] = optionalInt(req.MaxTokens, 1200)
maxTokensField := openAICompatibleMaxTokensField(provider, model)
payload[maxTokensField] = optionalInt(req.MaxTokens, 1200)
payloadRaw, _ := json.Marshal(payload)
payloadSnippet := redactSecrets(snippet(string(payloadRaw), runtimePayloadSnippetLimit), req.APIKey)
runtimeLogger().InfoContext(ctx, "llm runtime",
"component", "autofill",
"step", "provider_http_request",
"provider", provider,
"model", model,
"base_url", safeBaseURL(baseURL),
"max_tokens_field", maxTokensField,
"system_prompt_chars", len(systemPrompt),
"system_prompt_snippet", snippet(systemPrompt, runtimePromptSnippetLimit),
"user_prompt_chars", len(userPrompt),
"user_prompt_snippet", snippet(userPrompt, runtimePromptSnippetLimit),
"request_payload_chars", len(payloadRaw),
"request_payload_snippet", payloadSnippet,
)

body, err := doJSON(ctx, c.httpClient, http.MethodPost, baseURL+"/v1/chat/completions", req.APIKey, nil, payload)
if err != nil {
return "", err
}
rawResponse := strings.TrimSpace(string(body))
runtimeLogger().InfoContext(ctx, "llm runtime",
"component", "autofill",
"step", "provider_http_response",
"provider", provider,
"model", model,
"raw_response_chars", len(rawResponse),
"raw_response_snippet", redactSecrets(snippet(rawResponse, runtimeSnippetLimit), req.APIKey),
)

var response map[string]any
if err := json.Unmarshal(body, &response); err != nil {
return "", fmt.Errorf("decode openai-compatible response: %w", err)
}
shape := describeOpenAICompatibleShape(response)
runtimeLogger().InfoContext(ctx, "llm runtime",
"component", "autofill",
"step", "provider_http_response_shape",
"provider", provider,
"model", model,
"response_shape_hint", snippet(shape, runtimeShapeSnippetLimit),
)
content := extractOpenAICompatibleContent(response)
runtimeLogger().InfoContext(ctx, "llm runtime",
"component", "autofill",
"step", "provider_extract",
"provider", provider,
"model", model,
"extracted_content_chars", len(content),
"extracted_content_snippet", redactSecrets(snippet(content, runtimeSnippetLimit), req.APIKey),
)
if content == "" {
return "", fmt.Errorf("empty openai-compatible response content (%s)", describeOpenAICompatibleShape(response))
}
@@ -412,17 +466,34 @@ func extractTextFromContentValue(raw any) string {
}

func describeOpenAICompatibleShape(response map[string]any) string {
parts := make([]string, 0, 8)
parts = append(parts, "top="+describeMapKeys(response))
parts := make([]string, 0, 14)

if choices, ok := response["choices"].([]any); ok {
parts = append(parts, fmt.Sprintf("choices_len=%d", len(choices)))
if len(choices) > 0 {
if choice, ok := choices[0].(map[string]any); ok {
parts = append(parts, "choices0="+describeMapKeys(choice))
if finishReason, exists := choice["finish_reason"]; exists {
if reason, ok := finishReason.(string); ok {
parts = append(parts, "choices0_finish_reason="+strings.TrimSpace(reason))
} else {
parts = append(parts, "choices0_finish_reason_type="+valueType(finishReason))
}
}
if message, ok := choice["message"].(map[string]any); ok {
parts = append(parts, "message="+describeMapKeys(message))
parts = append(parts, "message_content_type="+valueType(message["content"]))
if content, ok := message["content"].([]any); ok {
parts = append(parts, fmt.Sprintf("message_content_len=%d", len(content)))
if len(content) > 0 {
parts = append(parts, "message_content0_type="+valueType(content[0]))
if first, ok := content[0].(map[string]any); ok {
parts = append(parts, "message_content0="+describeMapKeys(first))
}
}
}
} else if _, exists := choice["message"]; exists {
parts = append(parts, "message_type="+valueType(choice["message"]))
}
}
}
@@ -445,6 +516,7 @@ func describeOpenAICompatibleShape(response map[string]any) string {
parts = append(parts, "output_type="+valueType(response["output"]))
}

parts = append(parts, "top="+describeMapKeys(response))
return strings.Join(parts, "; ")
}

@@ -482,3 +554,46 @@ func valueType(raw any) string {
return fmt.Sprintf("%T", raw)
}
}

func runtimeLogger() *slog.Logger {
	return slog.Default()
}

func snippet(value string, limit int) string {
	trimmed := strings.TrimSpace(value)
	if trimmed == "" || limit <= 0 {
		return ""
	}
	runes := []rune(trimmed)
	if len(runes) <= limit {
		return trimmed
	}
	return strings.TrimSpace(string(runes[:limit])) + "...(truncated)"
}

func redactSecrets(value string, secrets ...string) string {
	out := value
	for _, secret := range secrets {
		trimmed := strings.TrimSpace(secret)
		if trimmed == "" {
			continue
		}
		out = strings.ReplaceAll(out, trimmed, "[REDACTED]")
	}
	return out
}

func safeBaseURL(value string) string {
	trimmed := strings.TrimSpace(value)
	if trimmed == "" {
		return ""
	}
	parsed, err := url.Parse(trimmed)
	if err != nil || parsed.Scheme == "" || parsed.Host == "" {
		return trimmed
	}
	parsed.User = nil
	parsed.RawQuery = ""
	parsed.Fragment = ""
	return strings.TrimRight(parsed.String(), "/")
}

+7 -1 internal/llmruntime/runtime_test.go

@@ -152,7 +152,7 @@ func TestOpenAICompatibleClient_EmptyContentIncludesShapeDiagnostics(t *testing.
t.Parallel()

server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
_, _ = w.Write([]byte(`{"choices":[{"message":{"content":[]}}]}`))
_, _ = w.Write([]byte(`{"id":"chatcmpl_x","choices":[{"index":0,"finish_reason":"stop","message":{"role":"assistant","content":[]}}]}`))
}))
defer server.Close()

@@ -178,6 +178,12 @@ func TestOpenAICompatibleClient_EmptyContentIncludesShapeDiagnostics(t *testing.
if !strings.Contains(err.Error(), "message_content_type=array") {
t.Fatalf("expected shape diagnostics in error: %v", err)
}
if !strings.Contains(err.Error(), "message_content_len=0") {
t.Fatalf("expected message content length diagnostics in error: %v", err)
}
if !strings.Contains(err.Error(), "choices0_finish_reason=stop") {
t.Fatalf("expected finish reason diagnostics in error: %v", err)
}
}

func TestExtractProviderErrorMessage(t *testing.T) {


+45 -0 internal/logging/logger_test.go

@@ -0,0 +1,45 @@
package logging

import (
	"log/slog"
	"testing"
	"time"
)

func TestRecentStoreListNewestFirst(t *testing.T) {
	t.Parallel()

	store := NewRecentStore(3)
	store.Add(Entry{Message: "first", Timestamp: time.Unix(1, 0)})
	store.Add(Entry{Message: "second", Timestamp: time.Unix(2, 0)})
	store.Add(Entry{Message: "third", Timestamp: time.Unix(3, 0)})
	store.Add(Entry{Message: "fourth", Timestamp: time.Unix(4, 0)})

	got := store.List(2)
	if len(got) != 2 {
		t.Fatalf("expected 2 entries, got %d", len(got))
	}
	if got[0].Message != "fourth" || got[1].Message != "third" {
		t.Fatalf("unexpected order: %+v", got)
	}
}

func TestRecentHandlerCapturesFields(t *testing.T) {
	t.Parallel()

	store := NewRecentStore(5)
	h := NewRecentHandler(store, slog.LevelInfo)
	logger := slog.New(h)
	logger.Info("test event", "component", "autofill", "suggestion_count", 7)

	items := store.List(1)
	if len(items) != 1 {
		t.Fatalf("expected 1 entry, got %d", len(items))
	}
	if items[0].Fields["component"] != "autofill" {
		t.Fatalf("missing component field: %+v", items[0].Fields)
	}
	if items[0].Fields["suggestion_count"] != int64(7) {
		t.Fatalf("missing suggestion_count field: %+v", items[0].Fields)
	}
}

+45 -0 internal/mapping/logging.go

@@ -0,0 +1,45 @@
package mapping

import (
	"fmt"
	"log/slog"
	"strings"
)

func mappingLogger() *slog.Logger {
	return slog.Default()
}

func shortErr(err error) string {
	if err == nil {
		return ""
	}
	message := strings.TrimSpace(err.Error())
	limit := 180
	if strings.Contains(message, "empty openai-compatible response content (") {
		limit = 420
	}
	if len(message) > limit {
		return message[:limit] + "..."
	}
	return message
}

func generatorLabel(generator SuggestionGenerator) string {
	switch value := generator.(type) {
	case *ProviderAwareSuggestionGenerator:
		return "provider-aware"
	case *LLMSuggestionGenerator:
		source := strings.TrimSpace(value.source)
		if source == "" {
			return "llm"
		}
		return source
	case *RuleBasedSuggestionGenerator:
		return "rule-based"
	case *CompositeSuggestionGenerator:
		return "composite"
	default:
		return strings.TrimSpace(fmt.Sprintf("%T", generator))
	}
}

+36 -0 internal/mapping/logging_test.go

@@ -0,0 +1,36 @@
package mapping

import (
	"errors"
	"strings"
	"testing"
)

func TestShortErr_DefaultLimit(t *testing.T) {
	t.Parallel()

	msg := "x" + strings.Repeat("a", 260)
	got := shortErr(errors.New(msg))
	if len(got) != 183 {
		t.Fatalf("expected default truncated len 183, got %d", len(got))
	}
	if !strings.HasSuffix(got, "...") {
		t.Fatalf("expected ellipsis suffix, got %q", got)
	}
}

func TestShortErr_OpenAICompatibleEmptyContentUsesLongerLimit(t *testing.T) {
	t.Parallel()

	prefix := "empty openai-compatible response content (choices_len=1; choices0={index:number,finish_reason:string,message:object}; message={content:array,role:string}; message_content_type=array; message_content_len=0; top={choices:array,id:string}) "
	got := shortErr(errors.New(prefix + strings.Repeat("x", 260)))
	if len(got) != 423 {
		t.Fatalf("expected openai-compatible truncated len 423, got %d", len(got))
	}
	if !strings.HasPrefix(got, "empty openai-compatible response content (") {
		t.Fatalf("expected openai-compatible prefix, got %q", got)
	}
	if !strings.HasSuffix(got, "...") {
		t.Fatalf("expected ellipsis suffix, got %q", got)
	}
}

+102 -0 internal/mapping/provider_suggestion_generator.go

@@ -4,6 +4,8 @@ import (
"context"
"encoding/json"
"fmt"
"net/url"
"sort"
"strings"
"time"

@@ -121,6 +123,20 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug
systemPrompt, userPrompt := buildProviderPrompts(req, targets)
temperature := domain.NormalizeLLMTemperature(settings.LLMTemperature)
maxTokens := domain.NormalizeLLMMaxTokens(settings.LLMMaxTokens)
mappingLogger().InfoContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_aware_request_payload",
"status", "start",
"provider", provider,
"model", model,
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"base_url", llmruntimeSafeBaseURL(baseURL),
"system_prompt_chars", len(systemPrompt),
"system_prompt_snippet", providerLogSnippet(systemPrompt, 1500),
"user_prompt_chars", len(userPrompt),
"user_prompt_snippet", providerLogSnippet(userPrompt, 2000),
)
raw, err := providerClient.Generate(ctx, llmruntime.Request{
Provider: provider,
Model: model,
@@ -154,6 +170,18 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"response_chars", len(strings.TrimSpace(raw)),
"response_snippet", providerLogSnippet(raw, 4000),
)
mappingLogger().InfoContext(ctx, "provider-aware suggestion",
"component", "autofill",
"step", "provider_parse_input",
"status", "start",
"provider", provider,
"model", model,
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"extracted_content_chars", len(strings.TrimSpace(raw)),
"extracted_content_snippet", providerLogSnippet(raw, 4000),
)

parsed, err := parseProviderSuggestions(raw)
@@ -180,6 +208,7 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"parsed_count", len(parsed),
"parsed_sample", providerSuggestionSample(parsed, 5),
)

out := SuggestionResult{
@@ -218,6 +247,8 @@ func (g *ProviderAwareSuggestionGenerator) Generate(ctx context.Context, req Sug
"template_id", req.TemplateID,
"draft_id", strings.TrimSpace(req.DraftID),
"suggestion_count", len(out.Suggestions),
"suggestion_sample_sources", sampleResultSources(out, 5),
"suggestion_sample", suggestionLogSample(out, 5),
"duration_ms", time.Since(started).Milliseconds(),
)
return out, nil
@@ -388,3 +419,74 @@ func anyToString(raw any) string {
return fmt.Sprintf("%v", value)
}
}

func providerLogSnippet(value string, limit int) string {
	trimmed := strings.TrimSpace(value)
	if trimmed == "" || limit <= 0 {
		return ""
	}
	runes := []rune(trimmed)
	if len(runes) <= limit {
		return trimmed
	}
	return strings.TrimSpace(string(runes[:limit])) + "...(truncated)"
}

func providerSuggestionSample(items []providerSuggestion, limit int) []map[string]string {
	if len(items) == 0 || limit <= 0 {
		return []map[string]string{}
	}
	if len(items) > limit {
		items = items[:limit]
	}
	out := make([]map[string]string, 0, len(items))
	for _, item := range items {
		out = append(out, map[string]string{
			"fieldPath": strings.TrimSpace(item.FieldPath),
			"slot":      strings.TrimSpace(item.Slot),
			"value":     providerLogSnippet(item.Value, 200),
			"reason":    providerLogSnippet(item.Reason, 120),
		})
	}
	return out
}

func suggestionLogSample(result SuggestionResult, limit int) []map[string]string {
	if limit <= 0 || len(result.ByFieldPath) == 0 {
		return []map[string]string{}
	}
	paths := make([]string, 0, len(result.ByFieldPath))
	for path := range result.ByFieldPath {
		paths = append(paths, path)
	}
	sort.Strings(paths)
	if len(paths) > limit {
		paths = paths[:limit]
	}
	out := make([]map[string]string, 0, len(paths))
	for _, path := range paths {
		item := result.ByFieldPath[path]
		out = append(out, map[string]string{
			"fieldPath": strings.TrimSpace(item.FieldPath),
			"source":    strings.TrimSpace(item.Source),
			"slot":      strings.TrimSpace(item.Slot),
			"value":     providerLogSnippet(item.Value, 200),
		})
	}
	return out
}

func llmruntimeSafeBaseURL(value string) string {
	trimmed := strings.TrimSpace(value)
	if trimmed == "" {
		return ""
	}
	parsed, err := url.Parse(trimmed)
	if err != nil || parsed.Scheme == "" || parsed.Host == "" {
		return trimmed
	}
	parsed.User = nil
	parsed.RawQuery = ""
	parsed.Fragment = ""
	return strings.TrimRight(parsed.String(), "/")
}

+73 -5 internal/mapping/suggestion_generator.go

@@ -252,7 +252,24 @@ func (g *CompositeSuggestionGenerator) Generate(ctx context.Context, req Suggest
return primaryResult, nil
}

fallbackResult, fbErr := g.Fallback.Generate(ctx, req)
targets := collectSuggestionTargets(req.Fields, req.Existing, req.IncludeFilled)
missingFieldPaths := missingSuggestionFieldPaths(targets, primaryResult.ByFieldPath)
if len(missingFieldPaths) == 0 {
mappingLogger().InfoContext(ctx, "autofill result",
"component", "autofill",
"step", "final",
"status", "success",
"source_path", "primary_only",
"suggestion_count", len(primaryResult.Suggestions),
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"duration_ms", time.Since(started).Milliseconds(),
)
return primaryResult, nil
}

fallbackReq := narrowedSuggestionRequest(req, missingFieldPaths)
fallbackResult, fbErr := g.Fallback.Generate(ctx, fallbackReq)
if fbErr != nil {
mappingLogger().WarnContext(ctx, "autofill fallback",
"component", "autofill",
@@ -261,6 +278,7 @@ func (g *CompositeSuggestionGenerator) Generate(ctx context.Context, req Suggest
"fallback_generator", generatorLabel(g.Fallback),
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"missing_target_count", len(missingFieldPaths),
"error", shortErr(fbErr),
)
mappingLogger().InfoContext(ctx, "autofill result",
@@ -282,9 +300,10 @@ func (g *CompositeSuggestionGenerator) Generate(ctx context.Context, req Suggest
"fallback_generator", generatorLabel(g.Fallback),
"draft_id", strings.TrimSpace(req.DraftID),
"template_id", req.TemplateID,
"missing_target_count", len(missingFieldPaths),
"suggestion_count", len(fallbackResult.Suggestions),
)
fallbackResult = normalizeSuggestionResult(fallbackResult, req.Fields, req.Existing, req.IncludeFilled)
fallbackResult = normalizeSuggestionResult(fallbackResult, fallbackReq.Fields, fallbackReq.Existing, fallbackReq.IncludeFilled)
merged := primaryResult
if merged.ByFieldPath == nil {
merged.ByFieldPath = map[string]Suggestion{}
@@ -332,6 +351,57 @@ func (g *CompositeSuggestionGenerator) Generate(ctx context.Context, req Suggest
return merged, nil
}

func missingSuggestionFieldPaths(targets []SemanticSlotTarget, byFieldPath map[string]Suggestion) []string {
	if byFieldPath == nil {
		byFieldPath = map[string]Suggestion{}
	}
	missing := make([]string, 0, len(targets))
	seen := map[string]struct{}{}
	for _, target := range targets {
		path := strings.TrimSpace(target.FieldPath)
		if path == "" {
			continue
		}
		if _, ok := seen[path]; ok {
			continue
		}
		seen[path] = struct{}{}
		if _, ok := byFieldPath[path]; ok {
			continue
		}
		missing = append(missing, path)
	}
	return missing
}

func narrowedSuggestionRequest(req SuggestionRequest, targetFieldPaths []string) SuggestionRequest {
	allowed := map[string]struct{}{}
	for _, path := range targetFieldPaths {
		trimmed := strings.TrimSpace(path)
		if trimmed == "" {
			continue
		}
		allowed[trimmed] = struct{}{}
	}
	filteredFields := make([]domain.TemplateField, 0, len(req.Fields))
	for _, field := range req.Fields {
		if _, ok := allowed[strings.TrimSpace(field.Path)]; !ok {
			continue
		}
		filteredFields = append(filteredFields, field)
	}
	filteredExisting := make(map[string]string, len(req.Existing))
	for path, value := range req.Existing {
		if _, ok := allowed[strings.TrimSpace(path)]; !ok {
			continue
		}
		filteredExisting[path] = value
	}
	req.Fields = filteredFields
	req.Existing = filteredExisting
	return req
}

func generateFallback(ctx context.Context, fallback SuggestionGenerator, req SuggestionRequest) (SuggestionResult, error) {
if fallback == nil {
return SuggestionResult{}, fmt.Errorf("fallback suggestion generator is not configured")
@@ -404,9 +474,7 @@ func normalizeSuggestionResult(result SuggestionResult, fields []domain.Template
normalized.Slot = target.Slot
}
normalized.Value = value
if strings.TrimSpace(normalized.Source) == "" {
normalized.Source = domain.DraftSuggestionSourceFallbackRuleBased
}
normalized.Source = strings.TrimSpace(normalized.Source)
if _, exists := out.ByFieldPath[fieldPath]; exists {
continue
}


+167 -7 internal/mapping/suggestions.go

@@ -305,12 +305,49 @@ func GenerateAllSuggestions(ctx context.Context, generator SuggestionGenerator,
if err != nil {
return next
}
mappingLogger().InfoContext(ctx, "autofill state transition",
"component", "autofill",
"step", "post_generate_result",
"action", "generate_all",
"generated_count", len(generated.ByFieldPath),
"generated_sources", summarizeResultSources(generated),
"sample_sources", sampleResultSources(generated, 5),
)
for _, s := range generated.Suggestions {
if _, exists := next.ByFieldPath[s.FieldPath]; exists {
continue
sliceSource := strings.TrimSpace(s.Source)
canonicalSource := ""
if canonical, ok := generated.ByFieldPath[strings.TrimSpace(s.FieldPath)]; ok {
canonicalSource = strings.TrimSpace(canonical.Source)
s = canonical
}
next.ByFieldPath[s.FieldPath] = toDraftSuggestion(s, now)
}
if existing, exists := next.ByFieldPath[s.FieldPath]; exists {
if !shouldReplaceExistingSuggestion(existing, s) {
continue
}
}
stored := toDraftSuggestion(s, now)
if explicitSource := strings.TrimSpace(s.Source); explicitSource != "" {
stored.Source = explicitSource
}
mappingLogger().InfoContext(ctx, "autofill state transition",
"component", "autofill",
"step", "apply_field_transition",
"action", "generate_all",
"field_path", strings.TrimSpace(s.FieldPath),
"slice_source", firstNonEmpty(sliceSource, "unknown"),
"canonical_source", firstNonEmpty(canonicalSource, "unknown"),
"stored_source", firstNonEmpty(strings.TrimSpace(stored.Source), "unknown"),
)
next.ByFieldPath[s.FieldPath] = stored
}
mappingLogger().InfoContext(ctx, "autofill state transition",
"component", "autofill",
"step", "post_generate_apply_state",
"action", "generate_all",
"state_count", len(next.ByFieldPath),
"state_sources", summarizeDraftSuggestionSources(next),
"sample_sources", sampleDraftSuggestionSources(next, 5),
)
next.UpdatedAt = now.UTC()
return next
}
@@ -332,9 +369,44 @@ func RegenerateAllSuggestions(ctx context.Context, generator SuggestionGenerator
if err != nil {
return next
}
mappingLogger().InfoContext(ctx, "autofill state transition",
"component", "autofill",
"step", "post_generate_result",
"action", "regenerate_all",
"generated_count", len(generated.ByFieldPath),
"generated_sources", summarizeResultSources(generated),
"sample_sources", sampleResultSources(generated, 5),
)
for _, s := range generated.Suggestions {
next.ByFieldPath[s.FieldPath] = toDraftSuggestion(s, now)
}
sliceSource := strings.TrimSpace(s.Source)
canonicalSource := ""
if canonical, ok := generated.ByFieldPath[strings.TrimSpace(s.FieldPath)]; ok {
canonicalSource = strings.TrimSpace(canonical.Source)
s = canonical
}
stored := toDraftSuggestion(s, now)
if explicitSource := strings.TrimSpace(s.Source); explicitSource != "" {
stored.Source = explicitSource
}
mappingLogger().InfoContext(ctx, "autofill state transition",
"component", "autofill",
"step", "apply_field_transition",
"action", "regenerate_all",
"field_path", strings.TrimSpace(s.FieldPath),
"slice_source", firstNonEmpty(sliceSource, "unknown"),
"canonical_source", firstNonEmpty(canonicalSource, "unknown"),
"stored_source", firstNonEmpty(strings.TrimSpace(stored.Source), "unknown"),
)
next.ByFieldPath[s.FieldPath] = stored
}
mappingLogger().InfoContext(ctx, "autofill state transition",
"component", "autofill",
"step", "post_generate_apply_state",
"action", "regenerate_all",
"state_count", len(next.ByFieldPath),
"state_sources", summarizeDraftSuggestionSources(next),
"sample_sources", sampleDraftSuggestionSources(next, 5),
)
next.UpdatedAt = now.UTC()
return next
}
@@ -455,7 +527,7 @@ func toDraftSuggestion(s Suggestion, now time.Time) domain.DraftSuggestion {
ts := now.UTC()
source := strings.TrimSpace(s.Source)
if source == "" {
source = "unknown"
}
return domain.DraftSuggestion{
FieldPath: strings.TrimSpace(s.FieldPath),
@@ -469,9 +541,97 @@ func toDraftSuggestion(s Suggestion, now time.Time) domain.DraftSuggestion {
}
}

func shouldReplaceExistingSuggestion(existing domain.DraftSuggestion, generated Suggestion) bool {
existingSource := strings.TrimSpace(existing.Source)
generatedSource := strings.TrimSpace(generated.Source)
if generatedSource == "" {
return false
}
if generatedSource == domain.DraftSuggestionSourceFallbackRuleBased {
return false
}
return existingSource == domain.DraftSuggestionSourceFallbackRuleBased
}

func suggestionResultWithFallback(ctx context.Context, generator SuggestionGenerator, req SuggestionRequest) (SuggestionResult, error) {
if generator == nil {
return NewRuleBasedSuggestionGenerator().Generate(ctx, req)
}
return generator.Generate(ctx, req)
}

func summarizeResultSources(result SuggestionResult) map[string]int {
if len(result.ByFieldPath) == 0 {
return map[string]int{}
}
out := map[string]int{}
for _, suggestion := range result.ByFieldPath {
source := strings.TrimSpace(suggestion.Source)
if source == "" {
source = "unknown"
}
out[source]++
}
return out
}

func summarizeDraftSuggestionSources(state domain.DraftSuggestionState) map[string]int {
if len(state.ByFieldPath) == 0 {
return map[string]int{}
}
out := map[string]int{}
for _, suggestion := range state.ByFieldPath {
source := strings.TrimSpace(suggestion.Source)
if source == "" {
source = "unknown"
}
out[source]++
}
return out
}

func sampleResultSources(result SuggestionResult, limit int) map[string]string {
if limit <= 0 || len(result.ByFieldPath) == 0 {
return map[string]string{}
}
paths := make([]string, 0, len(result.ByFieldPath))
for path := range result.ByFieldPath {
paths = append(paths, path)
}
sort.Strings(paths)
if len(paths) > limit {
paths = paths[:limit]
}
out := make(map[string]string, len(paths))
for _, path := range paths {
source := strings.TrimSpace(result.ByFieldPath[path].Source)
if source == "" {
source = "unknown"
}
out[path] = source
}
return out
}

func sampleDraftSuggestionSources(state domain.DraftSuggestionState, limit int) map[string]string {
if limit <= 0 || len(state.ByFieldPath) == 0 {
return map[string]string{}
}
paths := make([]string, 0, len(state.ByFieldPath))
for path := range state.ByFieldPath {
paths = append(paths, path)
}
sort.Strings(paths)
if len(paths) > limit {
paths = paths[:limit]
}
out := make(map[string]string, len(paths))
for _, path := range paths {
source := strings.TrimSpace(state.ByFieldPath[path].Source)
if source == "" {
source = "unknown"
}
out[path] = source
}
return out
}

+ 214 - 0   internal/mapping/suggestions_test.go

@@ -258,6 +258,204 @@ func TestGenerateAllSuggestions_FallsBackWhenLLMReturnsInvalidValueType(t *testi
}
}

func TestGenerateAllSuggestions_PreservesSourceFromByFieldPathOnStateApply(t *testing.T) {
t.Parallel()

fields := []domain.TemplateField{
{Path: "text.textTitle_m1710_1", Section: "text", KeyName: "textTitle_m1710_1", FieldKind: "text", IsEnabled: true, WebsiteSection: domain.WebsiteSectionHero},
{Path: "text.buttonText_c1165_1", Section: "text", KeyName: "buttonText_c1165_1", FieldKind: "text", IsEnabled: true, WebsiteSection: domain.WebsiteSectionCTA},
}
generator := &stubSuggestionGenerator{
result: SuggestionResult{
Suggestions: []Suggestion{
{FieldPath: "text.textTitle_m1710_1", Value: "Provider Hero", Source: ""},
{FieldPath: "text.buttonText_c1165_1", Value: "Fallback CTA", Source: ""},
},
ByFieldPath: map[string]Suggestion{
"text.textTitle_m1710_1": {FieldPath: "text.textTitle_m1710_1", Value: "Provider Hero", Source: domain.LLMProviderOpenAI},
"text.buttonText_c1165_1": {FieldPath: "text.buttonText_c1165_1", Value: "Fallback CTA", Source: ""},
},
},
}

state := GenerateAllSuggestions(context.Background(), generator, SuggestionRequest{
Fields: fields,
Existing: map[string]string{},
IncludeFilled: true,
}, domain.DraftSuggestionState{}, time.Now().UTC())

if got := state.ByFieldPath["text.textTitle_m1710_1"].Source; got != domain.LLMProviderOpenAI {
t.Fatalf("expected provider source preserved from generated result, got %q", got)
}
if got := state.ByFieldPath["text.buttonText_c1165_1"].Source; got != "unknown" {
t.Fatalf("expected unknown source only when suggestion source is empty, got %q", got)
}
}

func TestGenerateAllSuggestions_ReplacesStaleRuleBasedSourceWithProviderSource(t *testing.T) {
t.Parallel()

fields := []domain.TemplateField{
{Path: "text.textTitle_m1710_1", Section: "text", KeyName: "textTitle_m1710_1", FieldKind: "text", IsEnabled: true, WebsiteSection: domain.WebsiteSectionHero},
}
current := domain.DraftSuggestionState{
ByFieldPath: map[string]domain.DraftSuggestion{
"text.textTitle_m1710_1": {
FieldPath: "text.textTitle_m1710_1",
Value: "Old fallback hero",
Source: domain.DraftSuggestionSourceFallbackRuleBased,
Status: domain.DraftSuggestionStatusSuggested,
},
},
}
generator := &stubSuggestionGenerator{
result: SuggestionResult{
Suggestions: []Suggestion{
{FieldPath: "text.textTitle_m1710_1", Value: "Provider Hero", Source: domain.LLMProviderOpenAI},
},
ByFieldPath: map[string]Suggestion{
"text.textTitle_m1710_1": {FieldPath: "text.textTitle_m1710_1", Value: "Provider Hero", Source: domain.LLMProviderOpenAI},
},
},
}

state := GenerateAllSuggestions(context.Background(), generator, SuggestionRequest{
Fields: fields,
Existing: map[string]string{},
IncludeFilled: true,
}, current, time.Now().UTC())

hero := state.ByFieldPath["text.textTitle_m1710_1"]
if hero.Source != domain.LLMProviderOpenAI {
t.Fatalf("expected stale fallback source to be replaced by provider source, got %q", hero.Source)
}
if hero.Value != "Provider Hero" {
t.Fatalf("expected provider value to replace stale fallback value, got %q", hero.Value)
}
}

func TestCompositeSuggestionGenerator_NoFallbackWhenPrimaryCoversAllTargets(t *testing.T) {
t.Parallel()

fields := []domain.TemplateField{
{Path: "text.textTitle_m1710_1", Section: "text", KeyName: "textTitle_m1710_1", FieldKind: "text", IsEnabled: true, WebsiteSection: domain.WebsiteSectionHero},
{Path: "text.buttonText_c1165_1", Section: "text", KeyName: "buttonText_c1165_1", FieldKind: "text", IsEnabled: true, WebsiteSection: domain.WebsiteSectionCTA},
}
primary := &stubSuggestionGenerator{
result: SuggestionResult{
Suggestions: []Suggestion{
{FieldPath: "text.textTitle_m1710_1", Value: "Primary Hero", Source: domain.DraftSuggestionSourceLLM},
{FieldPath: "text.buttonText_c1165_1", Value: "Primary CTA", Source: domain.DraftSuggestionSourceLLM},
},
},
}
fallback := &stubSuggestionGenerator{
result: SuggestionResult{
Suggestions: []Suggestion{
{FieldPath: "text.textTitle_m1710_1", Value: "Fallback Hero", Source: domain.DraftSuggestionSourceFallbackRuleBased},
},
},
}
generator := NewCompositeSuggestionGenerator(primary, fallback)

result, err := generator.Generate(context.Background(), SuggestionRequest{
Fields: fields,
Existing: map[string]string{},
IncludeFilled: true,
})
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if fallback.callCount != 0 {
t.Fatalf("expected no fallback call, got %d", fallback.callCount)
}
if len(result.Suggestions) != 2 {
t.Fatalf("expected 2 suggestions, got %d", len(result.Suggestions))
}
}

func TestCompositeSuggestionGenerator_FallbackReceivesOnlyMissingTargets(t *testing.T) {
t.Parallel()

fields := []domain.TemplateField{
{Path: "text.textTitle_m1710_1", Section: "text", KeyName: "textTitle_m1710_1", FieldKind: "text", IsEnabled: true, WebsiteSection: domain.WebsiteSectionHero},
{Path: "text.buttonText_c1165_1", Section: "text", KeyName: "buttonText_c1165_1", FieldKind: "text", IsEnabled: true, WebsiteSection: domain.WebsiteSectionCTA},
}
primary := &stubSuggestionGenerator{
result: SuggestionResult{
Suggestions: []Suggestion{
{FieldPath: "text.textTitle_m1710_1", Value: "Primary Hero", Source: domain.DraftSuggestionSourceLLM},
},
},
}
fallback := &stubSuggestionGenerator{
result: SuggestionResult{
Suggestions: []Suggestion{
{FieldPath: "text.buttonText_c1165_1", Value: "Fallback CTA", Source: domain.DraftSuggestionSourceFallbackRuleBased},
},
},
}
generator := NewCompositeSuggestionGenerator(primary, fallback)

result, err := generator.Generate(context.Background(), SuggestionRequest{
Fields: fields,
Existing: map[string]string{},
IncludeFilled: true,
})
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if fallback.callCount != 1 {
t.Fatalf("expected one fallback call, got %d", fallback.callCount)
}
if len(fallback.lastReq.Fields) != 1 || fallback.lastReq.Fields[0].Path != "text.buttonText_c1165_1" {
t.Fatalf("expected fallback request only for missing CTA field, got %+v", fallback.lastReq.Fields)
}
if got := result.ByFieldPath["text.textTitle_m1710_1"].Source; got != domain.DraftSuggestionSourceLLM {
t.Fatalf("expected primary source on hero, got %q", got)
}
if got := result.ByFieldPath["text.buttonText_c1165_1"].Source; got != domain.DraftSuggestionSourceFallbackRuleBased {
t.Fatalf("expected fallback source on cta, got %q", got)
}
}

func TestCompositeSuggestionGenerator_PrimaryWinsOverFallbackForSameField(t *testing.T) {
t.Parallel()

fields := []domain.TemplateField{
{Path: "text.textTitle_m1710_1", Section: "text", KeyName: "textTitle_m1710_1", FieldKind: "text", IsEnabled: true, WebsiteSection: domain.WebsiteSectionHero},
{Path: "text.buttonText_c1165_1", Section: "text", KeyName: "buttonText_c1165_1", FieldKind: "text", IsEnabled: true, WebsiteSection: domain.WebsiteSectionCTA},
}
primary := &stubSuggestionGenerator{
result: SuggestionResult{
Suggestions: []Suggestion{
{FieldPath: "text.textTitle_m1710_1", Value: "Primary Hero", Source: domain.DraftSuggestionSourceLLM},
},
},
}
fallback := &stubSuggestionGenerator{
result: SuggestionResult{
Suggestions: []Suggestion{
{FieldPath: "text.textTitle_m1710_1", Value: "Fallback Hero", Source: domain.DraftSuggestionSourceFallbackRuleBased},
{FieldPath: "text.buttonText_c1165_1", Value: "Fallback CTA", Source: domain.DraftSuggestionSourceFallbackRuleBased},
},
},
}
generator := NewCompositeSuggestionGenerator(primary, fallback)

result, err := generator.Generate(context.Background(), SuggestionRequest{
Fields: fields,
Existing: map[string]string{},
IncludeFilled: true,
})
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if got := result.ByFieldPath["text.textTitle_m1710_1"]; got.Value != "Primary Hero" || got.Source != domain.DraftSuggestionSourceLLM {
t.Fatalf("expected primary hero suggestion to win, got %+v", got)
}
}

type stubQCClient struct {
generateContent qcclient.GenerateContentData
generateErr error
@@ -295,3 +493,19 @@ func (s *stubQCClient) GetJob(context.Context, int64) (*qcclient.JobStatusData,
func (s *stubQCClient) GetEditorURL(context.Context, int64) (*qcclient.SiteEditorLoginData, json.RawMessage, error) {
return nil, nil, nil
}

type stubSuggestionGenerator struct {
result SuggestionResult
err error
callCount int
lastReq SuggestionRequest
}

func (s *stubSuggestionGenerator) Generate(_ context.Context, req SuggestionRequest) (SuggestionResult, error) {
s.callCount++
s.lastReq = req
if s.err != nil {
return SuggestionResult{}, s.err
}
return s.result, nil
}

+ 8 - 2   run-local.ps1

@@ -2,11 +2,17 @@ $ErrorActionPreference = 'Stop'

$projectRoot = Split-Path -Parent $MyInvocation.MyCommand.Path
$envFile = Join-Path $projectRoot '.env.local'
$distDir = Join-Path $projectRoot 'dist'
$exeFile = Join-Path $distDir 'qctextbuilder.exe'

if (-not (Test-Path $envFile)) {
throw "Missing .env.local at $envFile"
}

if (-not (Test-Path $exeFile)) {
throw "Missing built executable at $exeFile. Run .\build-local.ps1 first."
}

Get-Content $envFile | ForEach-Object {
$line = $_.Trim()
if (-not $line -or $line.StartsWith('#')) { return }
@@ -19,7 +25,7 @@ Get-Content $envFile | ForEach-Object {
[System.Environment]::SetEnvironmentVariable($name, $value, 'Process')
}

Write-Host "Starting QC Text Builder on $env:HTTP_ADDR using $exeFile" -ForegroundColor Green
Set-Location $projectRoot

& $exeFile
