
Technical Documentation – RunLocalAgent_Core

Overview

RunLocalAgent_Core is a reusable OutSystems module that integrates OutSystems applications with locally hosted large language models (LLMs) served through the Ollama API (e.g., LLaMA3, Phi3).


Main Functionality

Server Action: Call_LLMLocalAPI

Description

Performs an HTTP request to a local API that exposes an LLM and returns the model's generated response.

Input Parameters

Name         Type     Description
Model        Text     Name of the model to use (e.g., "llama3").
Prompt       Text     The prompt/question to send to the model.
MaxTokens    Integer  Maximum number of tokens to generate in the response.
Temperature  Decimal  Controls the randomness of the output (e.g., 0.7).
Stream       Boolean  Whether the output should be streamed in real time.
ApiURL       Text     Public URL of the locally running LLM API (e.g., https://xyz.ngrok.io/api/chat).

Output Parameters

Name          Type     Description
Success       Boolean  Indicates whether the request was successful.
ResponseText  Text     The text returned by the model.
ErrorMessage  Text     Error message in case the request fails.
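For reference, the action wraps a plain HTTP POST to Ollama's /api/chat endpoint. Below is a minimal Python sketch of an equivalent non-streaming call (not the module's actual code); the mapping of MaxTokens to Ollama's num_predict option is an assumption, and call_llm_local_api is a hypothetical helper name.

import requests

def call_llm_local_api(model, prompt, max_tokens, temperature, api_url):
    """Sketch of a non-streaming Call_LLMLocalAPI equivalent."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # non-streaming: the whole reply arrives as one JSON object
        # Standard Ollama options; mapping MaxTokens to num_predict is an assumption.
        "options": {"num_predict": max_tokens, "temperature": temperature},
    }
    try:
        resp = requests.post(api_url, json=payload, timeout=120)
        resp.raise_for_status()
        content = resp.json()["message"]["content"]
        return {"Success": True, "ResponseText": content, "ErrorMessage": ""}
    except requests.RequestException as exc:
        return {"Success": False, "ResponseText": "", "ErrorMessage": str(exc)}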

Example Usage

Sample Input:

Model: llama3  
Prompt: Explain what ChatGPT is.  
MaxTokens: 200  
Temperature: 0.7  
Stream: False  
ApiURL: https://your-endpoint.ngrok.io/api/chat

Expected Output:

{
  "ResponseText": "ChatGPT is a language model developed by OpenAI...",
  "Success": true,
  "ErrorMessage": ""
}
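When Stream is True, Ollama switches /api/chat to streaming mode and returns one JSON object per line until a chunk with "done": true arrives. A hedged Python sketch of consuming that stream directly, outside OutSystems (the URL is a placeholder):

import json
import requests

API_URL = "https://your-endpoint.ngrok.io/api/chat"  # placeholder tunnel URL

payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Explain what ChatGPT is."}],
    "stream": True,  # Ollama then emits one JSON object per line
}

with requests.post(API_URL, json=payload, stream=True, timeout=120) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if not line:
            continue
        chunk = json.loads(line)
        # Each chunk carries a partial assistant message until "done" is true.
        print(chunk.get("message", {}).get("content", ""), end="", flush=True)
        if chunk.get("done"):
            break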

Requirements

  • OutSystems Platform version 11.55 or higher.

  • Ollama API running locally and exposed using Ngrok or a similar tunneling tool (a scripted example follows this list).

  • A supported LLM model installed and running (e.g., ollama run llama3).
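One way to satisfy the tunneling requirement from a script (an assumption; any tunnel that yields a public HTTPS URL works) is the third-party pyngrok package, exposing Ollama's default port 11434:

from pyngrok import ngrok  # pip install pyngrok

# Open a tunnel to the local Ollama server (default port 11434).
# With ngrok v3 agents the public URL is HTTPS by default.
tunnel = ngrok.connect(11434, "http")
print("Use this as ApiURL:", tunnel.public_url + "/api/chat")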


Module Structure

  • Server Action: Call_LLMLocalAPI

  • Exposed Submodules: None

  • External Dependencies: None

  • Recommended Tool: Ngrok, for exposing localhost over HTTPS


Important Notes

OutSystems does not support direct calls to localhost for security reasons.
To make local APIs accessible from your application, use a tunneling service like Ngrok and ensure the API is accessible via HTTPS.

Refer to the official Ollama documentation: "How can I allow additional web origins".
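For browser-originated calls, Ollama must also be told to accept the tunnel's origin. The FAQ entry above describes the OLLAMA_ORIGINS environment variable; a minimal sketch of starting the server with it set (the URL is a placeholder):

import os
import subprocess

# Start "ollama serve" with an extra allowed web origin.
# Replace the placeholder with your actual tunnel URL.
env = os.environ.copy()
env["OLLAMA_ORIGINS"] = "https://your-endpoint.ngrok.io"
subprocess.run(["ollama", "serve"], env=env, check=True)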


Author

Adão Pedro
Version: 1.0.0
License: BSD-3-Clause