
RunLocalAgent_Core

version 1.0.0 (Compatible with OutSystems 11)
Uploaded on 9 Jun (4 weeks ago)

Details
RunLocalAgent_Core is a reusable module that enables integration with locally hosted Large Language Models (LLMs) compatible with the Ollama standard (e.g., LLaMA3, Phi3). It exposes a server action called Call_LLMLocalAPI, which takes parameters such as model, prompt, temperature, streaming option, and more, allowing full customization of local model behavior. Ideal for offline environments or use cases requiring full privacy and local AI control. ⚠️ Note: since OutSystems blocks calls to localhost, it is recommended to use a tunneling tool such as Ngrok to expose your local API.

RunLocalAgent_Core is a reusable OutSystems module that allows developers to connect their applications to locally hosted Large Language Models (LLMs), using the Ollama-compatible API (e.g., LLaMA3, Phi3, etc.).

This core module includes a server-side action (Call_LLMLocalAPI) that encapsulates communication with a local AI model using configurable parameters like prompt, model name, temperature, max tokens, system message, and streaming options.
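
For reference, the kind of HTTP exchange this implies is sketched below against an Ollama-compatible endpoint. The endpoint path, field names, and defaults follow the public Ollama /api/generate contract and are assumptions about what Call_LLMLocalAPI wraps, not the module's actual internals.

```python
# Minimal sketch (assumed mapping): how the inputs described above translate
# to an Ollama-compatible /api/generate request. Not the module's own code.
import json
import urllib.request

def call_local_llm(base_url, model, prompt, system="", temperature=0.7,
                   max_tokens=256, stream=False):
    payload = {
        "model": model,                  # e.g. "llama3" or "phi3"
        "prompt": prompt,
        "system": system,                # optional system message
        "stream": stream,                # False -> single JSON response object
        "options": {
            "temperature": temperature,
            "num_predict": max_tokens,   # Ollama's name for the max-token limit
        },
    }
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# base_url would typically be an Ngrok forwarding URL rather than localhost
# (hypothetical URL shown here).
print(call_local_llm("https://example.ngrok-free.app", "llama3", "Hello!"))
```

In the module, this exchange happens server-side inside the action, so consumers only supply the action's input parameters.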

The module is ideal for developers who want to embed private, offline, or edge-based AI functionality into their OutSystems applications, while maintaining full control over prompt and model configuration.

Because OutSystems does not support direct calls to localhost, this module is designed to work with tunneling services such as Ngrok to expose the local LLM API securely.
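
As a concrete example (assumed setup, not part of the module): Ollama serves its API on port 11434 by default, so a local tunnel can be opened with

```sh
ngrok http 11434
```

and the public forwarding URL that Ngrok prints is then used as the API base URL in place of localhost.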

This asset does not include any UI or demo screens; it is intended to be consumed by other applications or demo assets such as RunLocalAgent_Demo.
