Flow

The AI / LLM functionality is available within Flow as a Flow Action called LLM Text Generation (`apex-seven20__FlowLlmTextGenAction`), which can be found in the AI category of Flow Builder's actions.

Simply add the action element to your flow, build the inputs to suit your use case, and invoke the action. Refer to the documentation of the model you are using for best practices and advice on how to structure a request.

warning

As the action performs an HTTP callout to the LLM, it cannot be used in synchronous record-triggered flows.

Inputs

All inputs for the action are required and are described below:

| Input Name | Data Type | Description |
| --- | --- | --- |
| LLM Model | Picklist | Model to use for generation. Any implementer of `seven20.LargeLanguageModelTextGenerator.Request` is selectable |
| Prompts | `seven20.FlowLlmTextGenMessage[]` | A collection of prompts detailing the request to the LLM |

Prompts

Prompts are given to the LLM to generate a response. The prompts are a collection of `seven20.FlowLlmTextGenMessage` Apex objects, which have the following properties:

| Property Name | Data Type | Description |
| --- | --- | --- |
| `role` | String | The role this message plays within the request. For example, `system` could define general instructions for the LLM, while `user` defines the user's input |
| `content` | String | The content of the message. This could be instructions for the LLM on how to handle the response, the input from the user, or supplementary context |

Multiple prompts can, and should, be included to provide the LLM with general instructions alongside the user's input. The messages are assumed to be sorted chronologically, so the user's input should be the last message in the collection, unless you are building a conversation.

warning

Check the documentation for the model in use to determine the valid values for role and any requirements or nuances for content that are specific to the model.
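As a minimal sketch, the prompt collection described above could be assembled in Apex before being passed into the flow. The property names `role` and `content` are documented; the no-argument constructor and the `system`/`user` role values shown here are assumptions, so check your model's documentation for the valid roles:

```apex
// Hypothetical sketch: build the Prompts input for the LLM Text Generation action.
// Assumes seven20.FlowLlmTextGenMessage exposes a no-arg constructor and
// writable role/content properties.
List<seven20.FlowLlmTextGenMessage> prompts = new List<seven20.FlowLlmTextGenMessage>();

// General instructions for the LLM come first.
seven20.FlowLlmTextGenMessage systemMsg = new seven20.FlowLlmTextGenMessage();
systemMsg.role = 'system';
systemMsg.content = 'You are a concise assistant that summarises support cases.';
prompts.add(systemMsg);

// The user's input goes last, since messages are assumed to be in chronological order.
seven20.FlowLlmTextGenMessage userMsg = new seven20.FlowLlmTextGenMessage();
userMsg.role = 'user';
userMsg.content = 'Summarise this case for me.';
prompts.add(userMsg);
```

For a multi-turn conversation, the same pattern applies: append each prior `user` and assistant message in order, ending with the newest user message.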

Outputs

The outputs for the Flow action are as follows:

| Output Name | Data Type | Description |
| --- | --- | --- |
| Is Success? | Boolean | True means generation completed without incident. Error causes could be network-related, or relayed back from the LLM itself |
| Value | String | The generated text, or, if the action was not successful, the error message |

danger

Always check Is Success? and handle any error gracefully before displaying Value to the user.
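In a flow this is typically a Decision element on the Is Success? output; the equivalent branching, if you consume the action's result from Apex, might look like the following sketch (the variable names and `showMessage` helper are illustrative, not part of the package):

```apex
// Hypothetical handling of the action's outputs.
// isSuccess maps to "Is Success?", generatedValue maps to "Value".
if (isSuccess) {
    // Safe to show the generated text to the user.
    showMessage(generatedValue);
} else {
    // On failure, Value holds the error message: log it for admins,
    // and show the user a friendly fallback instead of the raw error.
    System.debug(LoggingLevel.ERROR, 'LLM call failed: ' + generatedValue);
    showMessage('Something went wrong generating a response. Please try again.');
}
```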