Working with functions in Azure OpenAI

This notebook shows how to use the Chat Completions API in combination with functions to extend the current capabilities of GPT models. GPT models do not inherently support real-time interaction with external systems, databases, or files; functions can be used to bridge that gap.

Overview:
tools (previously called functions) is an optional parameter in the Chat Completions API which can be used to provide function specifications. This allows models to generate function arguments for the specifications provided by the user.

Note: The API will not execute any function calls. Executing function calls using the output arguments must be done by developers.

Setup

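The setup cell isn't included here; a minimal sketch, assuming the openai Python package (v1+) and the environment variables AZURE_OPENAI_API_KEY and AZURE_OPENAI_ENDPOINT (the deployment name "gpt-4" below is a placeholder for your own deployment), might look like:

```python
import os

# Assumed deployment name; substitute the name of your Azure OpenAI deployment.
deployment_name = "gpt-4"

# Only construct the client when credentials are available in the environment.
if os.getenv("AZURE_OPENAI_API_KEY") and os.getenv("AZURE_OPENAI_ENDPOINT"):
    from openai import AzureOpenAI

    client = AzureOpenAI(
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_version="2024-02-01",
    )
```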

1.0 Test functions

This code calls the model with the user query and the set of functions defined in the functions parameter. The model can then choose whether to call a function. If a function is called, the content will be a stringified JSON object; the function call that should be made and its arguments are located in response.choices[0].message.function_call.

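The cell contents aren't reproduced above; a sketch of such a call, using a hypothetical get_current_weather specification (the guard skips the network call when no credentials are set), could look like this:

```python
import json
import os

# Hypothetical function specification, for illustration only.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City name, e.g. Seattle",
                }
            },
            "required": ["location"],
        },
    }
]

messages = [{"role": "user", "content": "What's the weather like in Seattle?"}]

if os.getenv("AZURE_OPENAI_API_KEY") and os.getenv("AZURE_OPENAI_ENDPOINT"):
    from openai import AzureOpenAI

    client = AzureOpenAI(
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_version="2024-02-01",
    )
    response = client.chat.completions.create(
        model="gpt-4",  # assumed deployment name
        messages=messages,
        functions=functions,
    )
    call = response.choices[0].message.function_call
    if call:
        # The model chose a function; its arguments arrive as stringified JSON.
        print(call.name, json.loads(call.arguments))
```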

Forcing the use of a specific function or no function

By changing the value of the tool_choice parameter you can allow the model to decide what function to use, force the model to use a specific function, or force the model to use no function.

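The cell isn't shown; the three modes can be sketched as alternative tool_choice values, here wrapping the same hypothetical weather function in the tools-style schema:

```python
# The hypothetical weather function wrapped in the tools-style schema.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        },
    }
]

# Let the model decide whether (and which function) to call:
choice_auto = "auto"
# Force the model to call this specific function:
choice_forced = {"type": "function", "function": {"name": "get_current_weather"}}
# Forbid function calls entirely:
choice_none = "none"

# Each value would be passed as tool_choice=... in
# client.chat.completions.create(..., tools=tools, tool_choice=choice_auto)
```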

2.0 Defining functions

Now that we know how to work with functions, let's define some functions in code so that we can walk through the process of using functions end to end.

Function #1: Get current time

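The implementation isn't shown; a sketch using the standard-library zoneinfo module (the original may use a different timezone library) could be:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9


def get_current_time(location: str) -> str:
    """Return the current time in an IANA timezone, e.g. 'America/New_York'."""
    try:
        now = datetime.now(ZoneInfo(location))
        return now.strftime("%I:%M %p")
    except Exception:
        # Invalid or unknown timezone names fall through to an error message.
        return f"Unrecognized timezone: {location}"


print(get_current_time("America/New_York"))
```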

Function #2: Get stock market data

For simplicity, we're just hard coding some stock market data but you could easily edit the code to call out to an API to retrieve real-time data.

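The cell isn't reproduced; a sketch with illustrative, made-up numbers (not real quotes) might be:

```python
import json


def get_stock_market_data(index: str) -> str:
    """Return hard-coded index data; swap in a real market-data API as needed."""
    # Illustrative stub values only, not real market data.
    stub_data = {
        "S&P 500": {"close": 5000.25, "change_percent": 0.4},
        "NASDAQ Composite": {"close": 16000.10, "change_percent": -0.2},
    }
    if index not in stub_data:
        return json.dumps({"error": f"No data for index: {index}"})
    return json.dumps(stub_data[index])
```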

Function #3: Calculator

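The cell isn't shown; one way to sketch a basic calculator that takes two numbers and an operator string is:

```python
def calculator(num1: float, num2: float, operator: str) -> str:
    """Apply a basic arithmetic operator to two numbers, returning a string."""
    operations = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    if operator not in operations:
        return f"Unsupported operator: {operator}"
    if operator == "/" and num2 == 0:
        return "Error: division by zero"
    return str(operations[operator](num1, num2))
```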

3.0 Calling a function using GPT

Steps for Function Calling:

  1. Call the model with the user query and a set of functions defined in the functions parameter.
  2. The model can choose to call a function; if so, the content will be a stringified JSON object adhering to your custom schema (note: the model may generate invalid JSON or hallucinate parameters).
  3. Parse the string into JSON in your code, and call your function with the provided arguments if they exist.
  4. Call the model again by appending the function response as a new message, and let the model summarize the results back to the user.

3.1 Describe the functions so that the model knows how to call them

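The specification cell isn't reproduced; sketched specs for the three functions above (the parameter names are assumptions matching the sketches earlier) might look like:

```python
# Assumed function specifications for the three local functions.
functions = [
    {
        "name": "get_current_time",
        "description": "Get the current time in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "IANA timezone name, e.g. America/New_York",
                }
            },
            "required": ["location"],
        },
    },
    {
        "name": "get_stock_market_data",
        "description": "Get data for a stock market index",
        "parameters": {
            "type": "object",
            "properties": {
                "index": {"type": "string", "description": "Index name, e.g. S&P 500"}
            },
            "required": ["index"],
        },
    },
    {
        "name": "calculator",
        "description": "Perform basic arithmetic on two numbers",
        "parameters": {
            "type": "object",
            "properties": {
                "num1": {"type": "number"},
                "num2": {"type": "number"},
                "operator": {"type": "string", "enum": ["+", "-", "*", "/"]},
            },
            "required": ["num1", "num2", "operator"],
        },
    },
]
```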

3.2 Define a helper function to validate the function call

It's possible that the model could generate incorrect function calls, so it's important to validate them. Here we define a simple helper function to validate the function call, although you could apply more complex validation for your use case.

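The helper isn't shown; a sketch (the name validate_function_call is an assumption) that checks for a known function name, parseable JSON arguments, and no missing or unexpected parameters could be:

```python
import json


def validate_function_call(functions, name, arguments_json):
    """Validate a model-generated call: known name, parseable JSON arguments,
    all required parameters present, no unexpected parameters."""
    spec = next((f for f in functions if f["name"] == name), None)
    if spec is None:
        return False, f"Unknown function: {name}"
    try:
        args = json.loads(arguments_json)
    except json.JSONDecodeError:
        return False, "Arguments are not valid JSON"
    params = spec.get("parameters", {})
    missing = [p for p in params.get("required", []) if p not in args]
    if missing:
        return False, f"Missing required arguments: {missing}"
    unexpected = [k for k in args if k not in params.get("properties", {})]
    if unexpected:
        return False, f"Unexpected arguments: {unexpected}"
    return True, args


# Hypothetical spec for demonstration.
specs = [
    {
        "name": "get_current_time",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    }
]
print(validate_function_call(specs, "get_current_time", '{"location": "UTC"}'))
```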

4.0 Calling multiple functions together

In some cases, you may want to string together multiple function calls to get the desired result. We modified the run_conversation() function above to allow multiple function calls to be made.

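The modified run_conversation() isn't shown; one way to sketch the loop (names such as available_functions are assumptions, the registry entry is a stub, and no network call happens unless a client is supplied) is:

```python
import json

# Hypothetical registry mapping function names to local implementations;
# the lambda is a stub standing in for the real get_current_time.
available_functions = {
    "get_current_time": lambda location: "10:00 AM",
}


def run_conversation(client, deployment, messages, functions, max_calls=5):
    """Let the model chain several function calls before answering."""
    for _ in range(max_calls):
        response = client.chat.completions.create(
            model=deployment, messages=messages, functions=functions
        )
        msg = response.choices[0].message
        if not msg.function_call:
            return msg.content  # plain answer: conversation complete
        name = msg.function_call.name
        args = json.loads(msg.function_call.arguments)
        result = available_functions[name](**args)
        # Append the call and its result so the model sees them next turn.
        messages.append(
            {
                "role": "assistant",
                "content": None,
                "function_call": {
                    "name": name,
                    "arguments": msg.function_call.arguments,
                },
            }
        )
        messages.append({"role": "function", "name": name, "content": result})
    return None  # gave up after max_calls rounds
```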