Reference
The following is a reference for the Apex classes involved in invoking the AI/LLM functionality within Seven20. All methods defined on these classes are global; any implementer of an interface must also be declared global, as must the methods it implements from that interface.
All classes must be prefixed with the seven20 namespace when used, e.g. seven20.LargeLanguageModelService
LargeLanguageModelService
Class which provides the mechanism for invoking the LLM functionality by passing it a fully formed request.
| Method | Return Type | Description |
|---|---|---|
| invokeRequest(LargeLanguageModelTextGenerator.Request request) | LargeLanguageModelTextGenerator.Response | Invokes the LLM request. This is a synchronous callout and will block until a response is received |
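As an illustrative sketch only: `ExampleModelRequest` is a hypothetical implementer of the Request interface (not part of the package), and the method is assumed to be static on the service class, which the table above does not specify.

```apex
// Hypothetical invocation; ExampleModelRequest is an illustrative
// implementer of LargeLanguageModelTextGenerator.Request.
seven20.LargeLanguageModelTextGenerator.Request request =
    new ExampleModelRequest('Summarise the attached case history.');

// Synchronous callout: this blocks until the LLM endpoint responds.
seven20.LargeLanguageModelTextGenerator.Response response =
    seven20.LargeLanguageModelService.invokeRequest(request);

if (response.isSuccess()) {
    for (seven20.LargeLanguageModelTextGenerator.Result result : response.getResult()) {
        System.debug(result.getMessage());
    }
}
```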
LargeLanguageModelTextGenerator.Request
Interface which defines a new model to be used for a request.
| Method | Return Type | Description |
|---|---|---|
| responseType() | System.Type | Returns the Type of the response, for use in parsing. This type must implement the LargeLanguageModelTextGenerator.Response interface |
| getHttpRequest() | HttpRequest | Returns the HTTP request to be sent to the LLM endpoint. This should be a valid request for the model being used |
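A minimal sketch of an implementer is below. The class name, endpoint, named credential, and JSON body shape are all assumptions for illustration, not part of the Seven20 package.

```apex
// Illustrative sketch only: names and endpoint details are assumptions.
global class ExampleModelRequest implements seven20.LargeLanguageModelTextGenerator.Request {

    private String prompt;

    global ExampleModelRequest(String prompt) {
        this.prompt = prompt;
    }

    // The type used to parse the response; it must implement
    // LargeLanguageModelTextGenerator.Response.
    global System.Type responseType() {
        return ExampleModelResponse.class;
    }

    // A fully formed HTTP request for the target model's endpoint.
    global HttpRequest getHttpRequest() {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:Example_LLM/v1/generate'); // assumed named credential
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(new Map<String, Object>{ 'prompt' => prompt }));
        return req;
    }
}
```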
LargeLanguageModelTextGenerator.Response
Interface which defines the response from an LLM request.
| Method | Return Type | Description |
|---|---|---|
| processResponse(HttpResponse response) | void | Processes the response from the LLM request. This should parse the response and set the properties of the class. This method is called after the object has been initialised and should handle any errors/exceptions that may be thrown during parsing |
| isSuccess() | Boolean | Returns true if the request was successful, which should be determined in the processResponse() method call |
| getErrors() | List<LargeLanguageModelTextGenerator.Error> | Returns a list of errors that occurred during the request. This should be set in the processResponse() method call. If there are no errors, this should return an empty list |
| getResult() | List<LargeLanguageModelTextGenerator.Result> | Returns the results of the request. This should be set in the processResponse() method call. If there is no result, this should return an empty list |
The response type can be reused across multiple LLM requests if they require the same post-request parsing.
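A hedged sketch of an implementer follows. The class name, the `text` field in the JSON payload, and the inner Result/Error implementers are assumptions for illustration only.

```apex
// Illustrative sketch only: class name, JSON field names, and
// parsing logic are assumptions, not part of the Seven20 package.
global class ExampleModelResponse implements seven20.LargeLanguageModelTextGenerator.Response {

    private Boolean success = false;
    private List<seven20.LargeLanguageModelTextGenerator.Error> errors =
        new List<seven20.LargeLanguageModelTextGenerator.Error>();
    private List<seven20.LargeLanguageModelTextGenerator.Result> results =
        new List<seven20.LargeLanguageModelTextGenerator.Result>();

    // Called after construction; handles its own parsing exceptions.
    global void processResponse(HttpResponse response) {
        try {
            Map<String, Object> body =
                (Map<String, Object>) JSON.deserializeUntyped(response.getBody());
            // 'text' is an assumed field name in the model's payload
            results.add(new InnerResult((String) body.get('text')));
            success = response.getStatusCode() == 200;
        } catch (Exception e) {
            errors.add(new InnerError(e.getMessage()));
            success = false;
        }
    }

    global Boolean isSuccess() { return success; }

    global List<seven20.LargeLanguageModelTextGenerator.Error> getErrors() { return errors; }

    global List<seven20.LargeLanguageModelTextGenerator.Result> getResult() { return results; }

    // Minimal inner implementers of the Result and Error interfaces.
    global class InnerResult implements seven20.LargeLanguageModelTextGenerator.Result {
        private String message;
        global InnerResult(String message) { this.message = message; }
        global String getMessage() { return message; }
    }

    global class InnerError implements seven20.LargeLanguageModelTextGenerator.Error {
        private String message;
        global InnerError(String message) { this.message = message; }
        global String getMessage() { return message; }
    }
}
```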
LargeLanguageModelTextGenerator.ChatMessage
Interface which defines a message to be sent to the LLM.
| Method | Return Type | Description |
|---|---|---|
| getMessage() | Object | Returns the message to be sent to the LLM. This should be a JSON serializable object |
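A minimal sketch of an implementer, assuming a role/content message shape (the class name and field names are illustrative, not part of the package):

```apex
// Illustrative ChatMessage implementer; the role/content shape is an assumption.
global class ExampleChatMessage implements seven20.LargeLanguageModelTextGenerator.ChatMessage {

    private String role;
    private String content;

    global ExampleChatMessage(String role, String content) {
        this.role = role;
        this.content = content;
    }

    // Any JSON-serialisable object is acceptable here.
    global Object getMessage() {
        return new Map<String, String>{ 'role' => role, 'content' => content };
    }
}
```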
LargeLanguageModelTextGenerator.Result
Interface which defines a result message from an LLM request.
| Method | Return Type | Description |
|---|---|---|
| getMessage() | String | Returns the message from the LLM |
If the LLM's output is intended to be user-facing, it's best to return a human-readable string, otherwise return JSON for further processing downstream.
LargeLanguageModelTextGenerator.Error
Interface which defines an error from an LLM request.
| Method | Return Type | Description |
|---|---|---|
| getMessage() | String | Returns the error message |
FlowLlmTextGenAction
Invocable Apex class which can be called within a Flow, or directly within Apex.
| Method | Return Type | Description |
|---|---|---|
| invoke(List<FlowLlmTextGenAction.Request> requests) | List<FlowLlmTextGenAction.Response> | Invokes the LLM requests. Result indexes match the input indexes when more than one input request is provided |
FlowLlmTextGenAction.Request
Class defining the inputs of an LLM request. All values are required.
| Property | Flow Name | Type | Description |
|---|---|---|---|
| model | Model | String | The model to be used for the request. This must be the name of a class that implements the LargeLanguageModelTextGenerator.Request interface |
| messages | Generation Request Prompts | List<FlowLlmTextGenMessage> | The messages to be sent to the LLM. All list members must be valid |
FlowLlmTextGenAction.Response
Class defining the response of an LLM request.
| Property | Flow Name | Type | Description |
|---|---|---|---|
| isSuccess | Is Success? | Boolean | True if the request was successful |
| value | Value | String | Message from the LLM, or if an error occurred, the error message |
FlowLlmTextGenMessage
Class defining a message to be sent to the LLM. All values are required.
| Property | Type | Description |
|---|---|---|
| role | String | The role of the message. Valid values depend on the model being invoked; see that model's documentation for reference |
| content | String | The content of the message |
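Putting the Flow action pieces together, a direct Apex invocation might look like the sketch below. `ExampleModelRequest` is a hypothetical implementer name, the `user` role and prompt text are illustrative, and the seven20 namespace prefix is assumed as described at the top of this reference.

```apex
// Illustrative usage only; names in quotes are assumptions.
seven20.FlowLlmTextGenMessage message = new seven20.FlowLlmTextGenMessage();
message.role = 'user'; // valid roles depend on the target model
message.content = 'Summarise this account in two sentences.';

seven20.FlowLlmTextGenAction.Request request = new seven20.FlowLlmTextGenAction.Request();
request.model = 'ExampleModelRequest'; // name of a Request implementer
request.messages = new List<seven20.FlowLlmTextGenMessage>{ message };

List<seven20.FlowLlmTextGenAction.Response> responses =
    seven20.FlowLlmTextGenAction.invoke(
        new List<seven20.FlowLlmTextGenAction.Request>{ request });

// Responses align index-for-index with the input requests.
if (responses[0].isSuccess) {
    System.debug(responses[0].value);
}
```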