Ollama Workaround: DeepSeek R1 Tool Support

TeeTracker
3 min read · Feb 1, 2025


Until the R1 model supports tool calls natively and reliably, we can use a workaround in LangChain that leverages OllamaFunctions.

DeepSeek lists tool calling for R1 as an experimental feature that is not yet stable, and JSON output is not officially supported either. This post describes a workaround for both.

from langchain_experimental.llms.ollama_functions import OllamaFunctions

llm = OllamaFunctions(
    model="deepseek-r1:14b",
    format="json",  # DOES matter for bind_tools
)

llm.bind_tools([naming_generator_from_int])

or

llm.with_structured_output(
    NamingGeneratorFromInt,
    include_raw=True,  # doesn't matter
)
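The post never shows the definitions of naming_generator_from_int and NamingGeneratorFromInt. A minimal sketch of what they might look like follows; the names and the field x are taken from the outputs shown below, while the Pydantic schema, the name-generation logic, and the docstrings are assumptions (in a real LangChain setup, the function would typically be wrapped with the @tool decorator from langchain_core.tools):

```python
from pydantic import BaseModel, Field


class NamingGeneratorFromInt(BaseModel):
    """Schema for with_structured_output: an artificial name derived from an integer."""

    x: int = Field(description="The integer to derive a name from")


def naming_generator_from_int(x: int) -> str:
    """Generate an artificial name from the integer x (hypothetical logic)."""
    return f"name-{x}"


# Usage sketch: the schema validates the integer, the function produces a name.
parsed = NamingGeneratorFromInt(x=11)
name = naming_generator_from_int(parsed.x)
```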

Notice

The format “json” must be set for bind_tools; with_structured_output works even without it.

with_structured_output

include_raw=False will return:

NamingGeneratorFromInt(x=11)

otherwise it returns:

{
│ 'raw': AIMessage(
│ │ content='',
│ │ additional_kwargs={},
│ │ response_metadata={},
│ │ id='run-1703a177-49b7-4202-972f-5d7be5ccabdd-0',
│ │ tool_calls=[
│ │ │ {
│ │ │ │ 'name': 'NamingGeneratorFromInt',
│ │ │ │ 'args': {'x': 11},
│ │ │ │ 'id': 'call_5c1c22c422e3455ebca56ff0c649d131',
│ │ │ │ 'type': 'tool_call'
│ │ │ }
│ │ ]
│ ),
│ 'parsed': NamingGeneratorFromInt(x=11),
│ 'parsing_error': None
}

bind_tools

AIMessage(
│ content='',
│ additional_kwargs={},
│ response_metadata={},
│ id='run-d0bc7e99-0bc3-426c-ab54-2182f44926b4-0',
│ tool_calls=[
│ │ {
│ │ │ 'name': 'naming_generator_from_int',
│ │ │ 'args': {'x': 11},
│ │ │ 'id': 'call_b16dec1f60664a0da6a00c32f562e76f',
│ │ │ 'type': 'tool_call'
│ │ }
│ ]
)
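Once bind_tools yields an AIMessage like the one above, the entries in tool_calls can be dispatched to the actual Python functions. A minimal sketch, where the tool_calls dict is copied from the output above and the dispatch table plus the name-generation logic are assumptions, not part of the original post:

```python
def naming_generator_from_int(x: int) -> str:
    """Generate an artificial name from the integer x (hypothetical logic)."""
    return f"name-{x}"


# Map tool names (as they appear in tool_calls) to local callables.
TOOL_REGISTRY = {"naming_generator_from_int": naming_generator_from_int}

# tool_calls as returned in the AIMessage above.
tool_calls = [
    {
        "name": "naming_generator_from_int",
        "args": {"x": 11},
        "id": "call_b16dec1f60664a0da6a00c32f562e76f",
        "type": "tool_call",
    }
]

# Look up each requested tool and invoke it with the model-supplied args.
results = [TOOL_REGISTRY[call["name"]](**call["args"]) for call in tool_calls]
print(results)  # ['name-11']
```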

Without the “json” format

Error

'deepseek-r1:14b' did not respond with valid JSON. 
Please try again.
Response: <think>
Okay, so the user wants me to generate a name from the integer 11. Let me think about how I can approach this.

First, I recall that there's a tool called "naming_generator_from_int" which is designed exactly for this purpose. It takes an integer as input and generates an artificial name based on it.

I need to make sure I'm using the correct parameters. The tool expects an integer under the key "x". So, in this case, x should be 11.

I don't see any other tools that would be more appropriate here. The __conversational_response tool is for when no specific action is needed, but since we have a naming generator available, I'll use that.

I should structure the response as a JSON object with "tool" set to "naming_generator_from_int" and "tool_input" containing the parameters { "x": 11 }.

Let me double-check if there are any constraints or specific rules for generating names from integers. The tool's description says it's an artificial name, so I don't need to worry about real names or anything else.

I think that's all. Time to put it together.
</think>

{
"tool": "naming_generator_from_int",
"tool_input": {
"x": 11
}
}
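Note that the raw response above is actually salvageable: strip the <think>…</think> reasoning block that R1 prepends and parse the remainder as JSON. This recovery step is a sketch of my own, not part of the original post:

```python
import json
import re

# Abbreviated version of the raw model output shown above.
raw = """<think>
Okay, so the user wants me to generate a name from the integer 11.
</think>

{
"tool": "naming_generator_from_int",
"tool_input": {
"x": 11
}
}"""

# Drop the <think>...</think> block, then parse what remains as JSON.
payload = re.sub(r"<think>.*?</think>", "", raw, flags=re.DOTALL).strip()
call = json.loads(payload)
print(call["tool"], call["tool_input"])  # naming_generator_from_int {'x': 11}
```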

ChatOllama

Although OllamaFunctions will be deprecated soon in favor of ChatOllama, this workaround still depends on OllamaFunctions. ChatOllama fails with this model, so it is not recommended for now.

from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="deepseek-r1:14b",
    format="json",  # doesn't matter
)

llm.bind_tools([naming_generator_from_int])

or

llm.with_structured_output(
    NamingGeneratorFromInt,
    include_raw=True,  # doesn't matter
)

You get an error like this:

ResponseError: registry.ollama.ai/library/deepseek-r1:14b does not support tools (status code: 400)
