Fix temperature param + add thinking for anthropic

The temperature parameter was not being passed to the LLM.
Anthropic models can now also be run in extended 'thinking' mode.
Jochen
2025-12-02 17:24:55 +11:00
parent 9ee0468b87
commit ae16243f49
4 changed files with 98 additions and 4 deletions

@@ -24,6 +24,8 @@ temperature = 0.3 # Slightly higher temperature for more creative implementation
# Options: "ephemeral", "5minute", "1hour"
# Reduces costs and latency for repeated prompts. Uses Anthropic's prompt caching with different TTLs.
# enable_1m_context = true # optional, more expensive
# thinking_budget_tokens = 10000 # Optional: Enable extended thinking mode with token budget
# Allows the model to "think" before responding. Useful for complex reasoning tasks.
# Multiple OpenAI-compatible providers can be configured with custom names
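The two config keys this commit touches (`temperature` and `thinking_budget_tokens`) have to be translated into request arguments for the Anthropic API. A minimal sketch of that mapping, assuming a plain config dict; the function name and config shape are illustrative, not the repository's actual code:

```python
def build_request_kwargs(config: dict) -> dict:
    """Translate config entries into Anthropic messages-API kwargs.

    Hypothetical helper illustrating the two fixes in this commit:
    forwarding `temperature`, and enabling extended thinking when
    `thinking_budget_tokens` is set.
    """
    kwargs = {
        # The bug fixed here: temperature must be forwarded explicitly,
        # otherwise the API default is silently used.
        "temperature": config.get("temperature", 0.3),
    }
    budget = config.get("thinking_budget_tokens")
    if budget is not None:
        # Extended thinking lets the model reason before responding,
        # spending up to `budget` tokens on internal thought.
        kwargs["thinking"] = {"type": "enabled", "budget_tokens": budget}
        # Anthropic's API rejects a custom temperature together with
        # extended thinking, so drop it in that case.
        kwargs.pop("temperature", None)
    return kwargs
```

Note the interaction between the two options: when extended thinking is enabled, the Anthropic API does not accept a modified temperature, so the sketch drops it rather than sending both.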