Function Calling in Nebius Token Factory
Prerequisites
- A Nebius API key. Sign up for free at Token Factory.
1 - Setup
1.1 - If running on Google Colab
Add NEBIUS_API_KEY to Colab Secrets as follows:

1.2 - If running locally
Create a .env file with NEBIUS_API_KEY as follows:
NEBIUS_API_KEY=your_api_key_goes_here
2 - Install Dependencies
[1]
NOT running on Colab
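A minimal sketch of the runtime check behind cell [1], assuming Colab is detected by the presence of the google.colab package:

```python
import importlib.util

# Colab ships the google.colab package; its absence means a local run
RUNNING_IN_COLAB = importlib.util.find_spec("google.colab") is not None

print("Running on Colab" if RUNNING_IN_COLAB else "NOT running on Colab")
```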
[2]
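Cell [2] installs the packages used below; the exact list is an assumption, but a sketch of the install step would be:

```shell
# openai for the Nebius OpenAI-compatible API, python-dotenv for .env loading,
# pydantic for the tool schema in section 5
pip install -q openai python-dotenv pydantic
```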
3 - Load Configuration
[3]
✅ NEBIUS_API_KEY found
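A stdlib-only sketch of the configuration step in cell [3]; python-dotenv's load_dotenv does the same job, and the helper name load_env here is hypothetical:

```python
import os

def load_env(path=".env"):
    # Minimal .env parser: KEY=value lines, comments and blanks skipped
    if os.path.exists(path):
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line and not line.startswith("#") and "=" in line:
                    key, _, value = line.partition("=")
                    os.environ.setdefault(key.strip(), value.strip())

load_env()
api_key = os.getenv("NEBIUS_API_KEY")
print("✅ NEBIUS_API_KEY found" if api_key else "❌ NEBIUS_API_KEY not set")
```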
4 - Pick a Model
We will pick a model that supports function calling.
- Go to models tab in tokenfactory.nebius.com
- Select text to text models
- Select function calling filter
- Copy the model name. For example
openai/gpt-oss-20b
See screenshot here:

Recommended models:
- Qwen3 family
- openai/gpt-oss-20b
- Qwen/Qwen3-235B-A22B
- DeepSeek family
- deepseek-ai/DeepSeek-R1-0528
- Llama
- meta-llama/Llama-3.3-70B-Instruct
5 - Define Function Call
Here we will use pydantic to define the tool's argument schema.
[4]
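A sketch of what cell [4] might contain, assuming pydantic v2 (with a v1 fallback); the tool name get_current_weather and its fields match the transcript in section 6, but the field descriptions are illustrative:

```python
from pydantic import BaseModel, Field

class GetCurrentWeather(BaseModel):
    """Arguments for the get_current_weather tool."""
    city: str = Field(description="Name of the city, e.g. 'San Francisco'")
    unit: str = Field(default="fahrenheit", description="'celsius' or 'fahrenheit'")

try:
    params = GetCurrentWeather.model_json_schema()  # pydantic v2
except AttributeError:
    params = GetCurrentWeather.schema()             # pydantic v1 fallback

# Wrap the pydantic-generated JSON schema in the OpenAI tool format
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": params,
    },
}]
```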
6 - Tool Calling
[5]
CPU times: user 490 ms, sys: 65.7 ms, total: 555 ms Wall time: 612 ms
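A sketch of the request in cell [5]. Nebius exposes an OpenAI-compatible API, so the openai client works as-is; the base URL and the helper name request_tool_call are assumptions, so check the Token Factory docs for the real endpoint:

```python
import os

# Assumed base URL and model; verify against the Token Factory documentation
NEBIUS_BASE_URL = "https://api.studio.nebius.com/v1/"
MODEL = "openai/gpt-oss-20b"

messages = [
    {"role": "user",
     "content": "Can you tell me what the temperature will be in San Francisco?"}
]

def request_tool_call(messages, tools):
    # Imported lazily so the sketch loads even without the openai package
    from openai import OpenAI
    client = OpenAI(base_url=NEBIUS_BASE_URL, api_key=os.environ["NEBIUS_API_KEY"])
    response = client.chat.completions.create(
        model=MODEL,
        messages=messages,
        tools=tools,            # the tool list defined in section 5
        tool_choice="auto",     # let the model decide whether to call the tool
    )
    return response.choices[0].message
```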
[6]
The weather in San Francisco is 72 degrees fahrenheit. It is sunny, with highs in the 80's.
[7]
[{'role': 'user',
  'content': 'Can you tell me what the temperature will be in San Francisco?'},
 {'role': 'assistant',
  'tool_calls': [ChatCompletionMessageToolCall(id='chatcmpl-tool-0c160206cdea4ca6b10cc37e4b592ad7', function=Function(arguments='{"city": "San Francisco", "unit": "celsius"}', name='get_current_weather'), type='function')]},
 {'role': 'tool',
  'content': "The weather in San Francisco is 72 degrees fahrenheit. It is sunny, with highs in the 80's.",
  'tool_call_id': 'chatcmpl-tool-0c160206cdea4ca6b10cc37e4b592ad7',
  'name': 'get_current_weather'}]
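The transcript above shows the full loop: the model requests a tool call, we execute it locally, and the result goes back as a 'tool' message. A sketch of that dispatch step, where get_current_weather is a hypothetical stub returning the canned reply from this run:

```python
import json

def get_current_weather(city, unit="fahrenheit"):
    # Stand-in for a real weather lookup
    return ("The weather in San Francisco is 72 degrees fahrenheit. "
            "It is sunny, with highs in the 80's.")

def run_tool_call(tool_call, messages):
    # Parse the model-supplied JSON arguments, run the tool,
    # and append the result so the model can compose its final answer
    args = json.loads(tool_call["function"]["arguments"])
    result = get_current_weather(**args)
    messages.append({
        "role": "tool",
        "content": result,
        "tool_call_id": tool_call["id"],
        "name": tool_call["function"]["name"],
    })
    return messages
```

After appending the 'tool' message, send the whole messages list back to the model in a second request to get the natural-language answer shown above.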