Commit Graph

3 Commits

Author SHA1 Message Date
Tai An
67c34bbb96 fix(middleware): parse OpenAI-spec tool_choice in /v1/chat/completions (#9559)
* fix(middleware): parse OpenAI-spec tool_choice in /v1/chat/completions

Follows up on #9526 (the 3-site setter fix) by addressing the remaining
clause in #9508 — string mode and OpenAI-spec specific-function shape both
silently failed in the /v1/chat/completions parsing path.

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* fix(middleware): restore LF endings and cover tool_choice parsing with specs

The previous commit on this branch saved core/http/middleware/request.go
with CRLF line endings, ballooning the diff against master to +684/−651
lines for what is in reality a ~50-line parsing change. Restore LF
(matching .editorconfig end_of_line = lf).

Add 11 Ginkgo specs under "SetModelAndConfig tool_choice parsing
(chat completions)" that parallel the existing MergeOpenResponsesConfig
specs from #9509. They drive the full middleware chain (SetModelAndConfig
+ SetOpenAIRequest) and assert:

  * "required"  -> ShouldUseFunctions=true, no specific name
  * "none"      -> ShouldUseFunctions=false (tools disabled per OpenAI spec)
  * "auto"      -> default, tools available, no specific name
  * {type:function, function:{name:X}}  (spec)    -> X is forced
  * {type:function, name:X}             (legacy)  -> X is forced
  * nested wins when both forms are present
  * malformed shapes (no type, wrong type, no name, empty name) are no-ops

Update the inline comment on the string case to describe the actual
mechanism: "none" reaches SetFunctionCallString("none") downstream and
is then honored by ShouldUseFunctions() returning false. Before this PR,
json.Unmarshal([]byte("none"), &functions.Tool{}) failed silently, so
"none" was ignored; making "none" actually work is a real behavior fix
in this PR.

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Assisted-by: Claude:opus-4-7 [Claude Code]

* fix(middleware): preserve pre-#9559 support for JSON-string-encoded tool_choice

Some non-spec clients send tool_choice as a JSON-encoded string of an
object form, e.g. "{\"type\":\"function\",\"function\":{\"name\":\"X\"}}".
The pre-#9559 code accepted this by accident: its case string: branch
ran json.Unmarshal([]byte(content), &functions.Tool{}), which succeeded
for that double-encoded shape even though it failed for the legitimate
plain string modes "auto" / "none" / "required".

The first version of this PR routed every string straight to
SetFunctionCallString as a mode, which fixed the plain-string cases but
silently regressed the double-encoded one (funcs.Select("{...}") returns
nothing). Restore the fallback: when a string looks like a JSON object,
try parsing it as a tool_choice map first; fall through to mode-string
handling only when no usable name comes out.

Factor the map-name extraction into a small helper
(extractToolChoiceFunctionName) so the string-fallback and the regular
map case go through identical code, and accept both the OpenAI-spec
nested shape and the legacy/Anthropic flat shape from either entry
point.
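The fallback logic can be sketched as follows. This is an illustrative sketch, not the actual LocalAI implementation: the helper name extractToolChoiceFunctionName comes from the commit message, while resolveToolChoiceString is a hypothetical name for the string entry point.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// extractToolChoiceFunctionName returns the forced function name from a
// decoded tool_choice map, accepting both the OpenAI-spec nested shape
// and the legacy/Anthropic flat shape. Nested wins when both are present.
func extractToolChoiceFunctionName(m map[string]any) string {
	if m["type"] != "function" {
		return ""
	}
	// OpenAI-spec nested shape: {"type":"function","function":{"name":X}}
	if fn, ok := m["function"].(map[string]any); ok {
		if name, ok := fn["name"].(string); ok && name != "" {
			return name
		}
	}
	// Legacy flat shape: {"type":"function","name":X}
	if name, ok := m["name"].(string); ok && name != "" {
		return name
	}
	return ""
}

// resolveToolChoiceString handles the string entry point: a string that
// looks like a JSON object is first tried as a double-encoded tool_choice
// map; only when no usable name comes out does it fall through to
// plain mode-string handling ("auto"/"none"/"required").
func resolveToolChoiceString(s string) (name, mode string) {
	if strings.HasPrefix(strings.TrimSpace(s), "{") {
		var m map[string]any
		if err := json.Unmarshal([]byte(s), &m); err == nil {
			if n := extractToolChoiceFunctionName(m); n != "" {
				return n, ""
			}
		}
	}
	return "", s
}

func main() {
	name, mode := resolveToolChoiceString(`{"type":"function","function":{"name":"get_weather"}}`)
	fmt.Printf("double-encoded nested: name=%q mode=%q\n", name, mode)
	name, mode = resolveToolChoiceString("required")
	fmt.Printf("plain mode string:     name=%q mode=%q\n", name, mode)
}
```

Routing the map case and the string fallback through the same helper keeps the two entry points from drifting apart, which is the point of the factoring described above.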

Add 3 Ginkgo specs covering the double-encoded case (nested form, legacy
form, and the fall-through when the JSON has no usable name).

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Assisted-by: Claude:opus-4-7 [Claude Code]

* test(middleware): silence errcheck on AfterEach os.RemoveAll

The new tool_choice parsing tests added a second AfterEach that calls
os.RemoveAll(modelDir) without checking the error; errcheck flagged it.
Suppress with the standard _ = idiom. The pre-existing AfterEach on the
earlier Describe still skips the error check the same way it did before;
it is left untouched to keep this commit minimal.

Assisted-by: Claude:opus-4-7 [Claude Code]
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Co-authored-by: Ettore Di Giacinto <mudler@localai.io>
2026-05-14 00:14:38 +02:00
walcz-de
f877942d97 fix(openresponses): parse OpenAI-spec nested tool_choice + use correct setter (#9509)
Two bugs in MergeOpenResponsesConfig (/v1/responses + WebSocket, *not*
/v1/chat/completions — that has a separate, working path via Tool
unmarshal + SetFunctionCallNameString):

1. **Shape mismatch.** OpenAI's specific-function tool_choice nests the
   name under "function":
       {"type": "function", "function": {"name": "my_function"}}
   The legacy flat shape was:
       {"type": "function", "name": "my_function"}
   Only the flat shape was handled. OpenAI-compliant clients that reach
   /v1/responses (openai-python with the Responses API, Stainless-generated
   SDKs, …) silently failed to force the function.

2. **Wrong setter.** The code called SetFunctionCallString(name), which
   writes the mode field (functionCallString: "none"/"auto"/"required").
   The specific-function name lives in a separate field
   (functionCallNameString), read by ShouldCallSpecificFunction and
   FunctionToCall. Net effect: a correctly-formed tool_choice never
   engaged grammar-based forcing.

The fix preserves backward compatibility by accepting both shapes
(nested preferred, flat as fallback) and routes to the correct setter.
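Why the wrong setter never engages forcing can be shown with a deliberately simplified model of the two fields. The field and method names follow the commit message; the struct layout is a hypothetical stand-in, not the actual LocalAI config type.

```go
package main

import "fmt"

// modelConfig models only the two fields the bug concerns.
type modelConfig struct {
	functionCallString     string // mode: "none" / "auto" / "required"
	functionCallNameString string // specific function to force
}

func (c *modelConfig) SetFunctionCallString(s string)     { c.functionCallString = s }
func (c *modelConfig) SetFunctionCallNameString(s string) { c.functionCallNameString = s }

// ShouldCallSpecificFunction reads only the name field, so a function
// name written into the mode field via the wrong setter is invisible to
// grammar-based forcing.
func (c *modelConfig) ShouldCallSpecificFunction() bool {
	return c.functionCallNameString != ""
}

func main() {
	var buggy modelConfig
	buggy.SetFunctionCallString("get_weather") // the "wrong setter" bug
	fmt.Println("wrong setter forces function:", buggy.ShouldCallSpecificFunction())

	var fixed modelConfig
	fixed.SetFunctionCallNameString("get_weather") // the fix
	fmt.Println("right setter forces function:", fixed.ShouldCallSpecificFunction())
}
```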

Note: The same "wrong setter" pattern appears at three other sites —
anthropic/messages.go:883, openai/realtime_model.go:171, and
openresponses/responses.go:776 — and /v1/chat/completions has its own
issue parsing tool_choice="required" as a string (json.Unmarshal on a
raw string fails silently). Those are filed as a tracking issue rather
than bundled here to keep this PR focused.

## Test plan
9 new Ginkgo specs under "MergeOpenResponsesConfig tool_choice parsing":
  - string modes: "required" / "auto" / "none"
  - OpenAI-spec nested shape: {type:function, function:{name}}
  - Legacy Anthropic-compat flat shape: {type:function, name}
  - Shape-preference: nested wins over flat when both present
  - Malformed: missing type, wrong type, missing name, empty name, nil

$ go test ./core/http/middleware/ -count=1 -run TestMiddleware
  Ran 28 of 28 Specs in 0.003 seconds -- PASS

## Repro (against /v1/responses)

    curl -N http://localai/v1/responses \
         -H 'Content-Type: application/json' \
         -d '{"model":"qwen3.6-35b-a3b-apex",
              "input":"Weather in Berlin?",
              "tools":[{"type":"function","name":"get_weather",
                        "parameters":{"type":"object",
                          "properties":{"city":{"type":"string"}},
                          "required":["city"]}}],
              "tool_choice":{"type":"function",
                             "function":{"name":"get_weather"}}}'

Before: grammar-based forcing silently inactive; model free-texts.
After : grammar forces get_weather invocation; output contains
        tool_calls with function:{name:"get_weather", arguments:{...}}.
2026-04-23 18:30:05 +02:00
Richard Palethorpe
efdcbbe332 feat(api): Return 404 when model is not found except for model names in HF format (#9133)
Signed-off-by: Richard Palethorpe <io@richiejp.com>
2026-03-31 10:48:21 +02:00