RunLocalAgent_Core is a reusable OutSystems module that integrates OutSystems applications with locally hosted large language models (LLMs) exposed through the Ollama API (e.g., Llama 3, Phi-3).
Call_LLMLocalAPI
Performs an HTTP request to a local API that exposes an LLM and returns the model's generated response.
Input Parameters:
- Model: name of the model to run (e.g., "llama3")
- Prompt: text prompt sent to the model
- MaxTokens: maximum number of tokens to generate
- Temperature: sampling temperature that controls response randomness
- Stream: whether the response is streamed back or returned in full
- ApiURL: URL of the exposed chat endpoint (e.g., https://xyz.ngrok.io/api/chat)

Output Parameters:
- Success: True when the call completes without errors
- ResponseText: text generated by the model
- ErrorMessage: error details when the call fails
Sample Input:
Model: llama3
Prompt: Explain what ChatGPT is.
MaxTokens: 200
Temperature: 0.7
Stream: False
ApiURL: https://your-endpoint.ngrok.io/api/chat
Expected Output:
{ "ResponseText": "ChatGPT is a language model developed by OpenAI...", "Success": true, "ErrorMessage": "" }
Requirements:
- OutSystems Platform version 11.55 or higher.
- Ollama API running locally and exposed using Ngrok or a similar tunneling tool.
- A supported LLM model installed and running (e.g., ollama run llama3); see the check below.
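To verify the last two requirements, you can query Ollama's /api/tags endpoint, which lists the models that have been pulled locally. A minimal sketch, assuming Ollama is listening on its default local port 11434:

```python
import requests

# Assumption: Ollama is listening on its default local port, 11434.
OLLAMA_LOCAL_URL = "http://localhost:11434"

# /api/tags lists every model that has been pulled into the local Ollama instance.
tags = requests.get(f"{OLLAMA_LOCAL_URL}/api/tags", timeout=10).json()
installed = [model["name"] for model in tags.get("models", [])]

print("Installed models:", installed)
if not any(name.startswith("llama3") for name in installed):
    print("llama3 is missing; pull it first with: ollama pull llama3")
```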
Server Action: Call_LLMLocalAPI
Exposed Submodules: None
External Dependencies: None
Recommended Tool: Ngrok, for exposing localhost over HTTPS
OutSystems does not support direct calls to localhost for security reasons. To make local APIs accessible from your application, use a tunneling service such as Ngrok and ensure the API is reachable over HTTPS.
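Before entering the tunneled address as the ApiURL input, a quick check that the public HTTPS URL actually forwards to Ollama can save debugging time. A minimal sketch with a placeholder Ngrok URL; Ollama's root endpoint replies with the plain-text message "Ollama is running" when everything is wired up:

```python
import requests

# Placeholder for the public HTTPS address created by Ngrok (e.g., via: ngrok http 11434).
PUBLIC_URL = "https://your-endpoint.ngrok.io"

try:
    # Ollama's root endpoint answers with the plain-text message "Ollama is running".
    resp = requests.get(PUBLIC_URL, timeout=10)
    if resp.ok and "Ollama is running" in resp.text:
        print(f"Tunnel OK: use {PUBLIC_URL}/api/chat as the ApiURL input.")
    else:
        print(f"Unexpected reply ({resp.status_code}): {resp.text[:200]}")
except requests.RequestException as exc:
    print(f"Tunnel not reachable: {exc}")
```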
Refer to the official Ollama documentation: "How can I allow additional web origins".
Author: Adão Pedro
Version: 1.0.0
License: BSD-3-Clause