Tenjin Ideas Portal
Status Shipped
Created by Guest
Created on Oct 5, 2023

Add support for GPT functions

GPT functions let the LLM infer whether it has captured enough information in a conversation to produce a pre-defined JSON object that can be passed to an external system. We have tested this concept with a helpdesk capture flow: a ChatGPT conversation runs until the model infers it has enough information for a ticket, then builds the corresponding object and returns it.

This is a great new feature and would allow us to implement plug-in-style functionality for dynamic ChatGPT conversations.

Maybe we could add an advanced option in the action (alongside temperature etc.) to determine whether to pass a functions array or not. Or it may be better to have a dedicated 'completion with function' action that returns the function name and object as separate properties if triggered (or a standard chat reply if not).

https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/function-calling
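To illustrate the proposed 'completion with function' action, here is a minimal sketch of the two pieces involved: an OpenAI-style function definition and a helper that splits a model message into either (function name, parsed object) or a standard chat reply. The schema fields and helper names are illustrative assumptions, not Tenjin's actual action configuration or API.

```python
import json

# Illustrative helpdesk-ticket function definition in the OpenAI
# function-calling schema (field names here are assumptions, not a
# real Tenjin configuration).
ticket_function = {
    "name": "create_helpdesk_ticket",
    "description": "Create a helpdesk ticket once enough detail is captured",
    "parameters": {
        "type": "object",
        "properties": {
            "summary": {"type": "string"},
            "priority": {"type": "string", "enum": ["low", "medium", "high"]},
        },
        "required": ["summary"],
    },
}

def parse_completion(message: dict):
    """Return (function_name, arguments_dict) if the model triggered a
    function, otherwise (None, reply_text) for a standard chat reply."""
    call = message.get("function_call")
    if call:
        # The model returns arguments as a JSON string; decode it into
        # the pre-defined object for the external system.
        return call["name"], json.loads(call["arguments"])
    return None, message.get("content")

# Example: a model message that triggered the function.
msg = {
    "function_call": {
        "name": "create_helpdesk_ticket",
        "arguments": '{"summary": "VPN down", "priority": "high"}',
    }
}
name, obj = parse_completion(msg)
```

The action could expose `name` and `obj` as separate output properties, falling back to the plain reply text when `name` is `None`.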


I have tested this inference with GPT-4 and it performs well (better than 3.5), so, coupled with additional models, we'd have a broad suite of powerful tools.
