RunLocalAgent_Core is a reusable OutSystems module that allows developers to connect their applications to locally hosted Large Language Models (LLMs), such as Llama 3 or Phi-3, through an Ollama-compatible API.
This core module includes a server-side action (Call_LLMLocalAPI) that encapsulates communication with a local AI model using configurable parameters such as the prompt, model name, temperature, max tokens, system message, and streaming option.
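To make the parameter mapping concrete, here is a minimal Python sketch of the kind of HTTP request an action like Call_LLMLocalAPI would issue against an Ollama-compatible /api/generate endpoint. The function name, default values, and URL below are illustrative assumptions, not part of the module itself; it assumes non-streaming mode, where the endpoint returns a single JSON object.

```python
import requests


def call_llm_local_api(
    base_url: str,
    model: str,
    prompt: str,
    system: str = "",
    temperature: float = 0.7,
    max_tokens: int = 256,
) -> str:
    """Send a non-streaming generation request to an Ollama-compatible endpoint."""
    payload = {
        "model": model,
        "prompt": prompt,
        "system": system,
        "stream": False,  # a single JSON response instead of NDJSON chunks
        "options": {
            "temperature": temperature,
            "num_predict": max_tokens,  # Ollama's name for the max-tokens limit
        },
    }
    response = requests.post(f"{base_url}/api/generate", json=payload, timeout=120)
    response.raise_for_status()
    return response.json()["response"]


if __name__ == "__main__":
    # Ollama listens on port 11434 by default.
    print(call_llm_local_api("http://localhost:11434", "llama3", "Say hello."))
```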
The module is ideal for developers who want to embed private, offline, or edge-based AI functionality into their OutSystems applications, while maintaining full control over prompt and model configuration.
Because the OutSystems server cannot call services running on localhost directly, this module is designed to work with tunneling services such as Ngrok, which expose the local LLM API over a secure public URL.
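As a sketch of that setup: once the local Ollama server (default port 11434) has been exposed with `ngrok http 11434`, the module's base URL setting would point at the public forwarding address instead of localhost. The URL below is a placeholder, not a real endpoint.

```python
import requests

# Hypothetical Ngrok forwarding URL for the tunneled Ollama server.
NGROK_URL = "https://example-tunnel.ngrok-free.app"

resp = requests.post(
    f"{NGROK_URL}/api/generate",
    json={"model": "llama3", "prompt": "Ping", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```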
This asset does not include any UI or demo screens; it is intended to be consumed by other applications or demos such as RunLocalAgent_Dem.