Agentic AI use cases and benefits in ODC over O11

Hi All,

I’m looking for a detailed and practical comparison between OutSystems Developer Cloud (ODC) and OutSystems 11 (O11) to make informed decisions about modernization and AI adoption.

1. ODC vs O11
What are the key benefits of ODC compared to O11?
Why should a client choose ODC for new app development?
In which cases is it recommended to migrate O11 apps to ODC?

2. AI & Agentic Path
What are the practical benefits of using AI Agents in ODC?
In which real use cases should AI Agents be used?
Why is the ODC + Agentic approach preferred?

3. AI Model Cost & Security
How is AI model cost calculated (token consumption, input size, etc.)? 
How do AI models process data in the backend? 
How is data security handled when AI agents access application data? 

Looking for clear guidance, best practices, and real‑world insights. 

Thanks

2025-12-22 13:50:43
Sherif El-Habibi
Champion

Hello,

  1. There are several benefits, including the architectural differences. ODC promotes a more loosely coupled architecture with the introduction of apps and libraries, instead of the traditional modules and applications structure in O11. Another important advantage is containerization, where applications and their libraries are packaged together during deployment. This makes deployments faster and more efficient, without the need to wait for the entire infrastructure to be deployed to another environment.
  2. The benefits of using AI agents include the ability to handle large processing tasks with minimal effort and time. Many things you would normally implement manually in your application, such as workflows or automation logic, can be handled by agents when they are properly configured. This is also one of the reasons why the ODC + agentic approach is often preferred. While you can integrate AI in O11, you usually have to consume external APIs, manage structured responses, and handle most of the setup yourself. ODC provides many of these tools out of the box, meaning you do not have to start everything from scratch.
  3. The cost of AI is usually calculated based on token consumption. Every token in the user input counts, the agent's internal processing (such as reasoning or decision-making) consumes tokens as well, and so does the response generated by the model. All of this can be monitored and controlled in the ODC Agent Workbench.

Regarding security, it mainly depends on how the system is configured. If you give the agent access to sensitive information, there is always some level of risk. Even with the best AI systems, there is always a very small chance of error, so careful handling and proper security controls are necessary.

2026-03-26 14:15:22
Afonso Metello
Champion

Hi Ajit, happy to give a more detailed breakdown across all three areas.

1. ODC vs O11 — Architecture and When to Migrate

ODC is a fundamentally different runtime from O11, not just an upgrade.

ODC runtime:

  • AWS EKS (Kubernetes) + Linux containers, Aurora Serverless v2 (PostgreSQL-compatible)
  • Each customer gets dedicated Kubernetes namespaces and a dedicated Aurora database — no shared infrastructure between tenants
  • Container image is compiled once in Dev and promoted unchanged through QA and Prod (no recompile per stage)
  • Publish to Dev: typically ~30-90 seconds. Stage promotion: ~1-3 minutes (in our experience)
  • Auto-scaling is handled by Kubernetes and Aurora Serverless v2 — developers configure nothing
  • Zero-downtime deployments via Kubernetes rolling updates
  • Data encrypted at rest with per-tenant AES-256 keys (AWS KMS)
  • Multiple global regions, selected at provisioning — data residency is locked to your chosen region

O11 runtime:

  • IIS + .NET Framework on Windows, SQL Server or Oracle
  • Manual scaling (you or your ops team provisions infrastructure)
  • Shared infrastructure on cloud, or customer-managed on self-hosted
  • No equivalent to the container promotion model — each stage may recompile or repackage

When to migrate:

New apps: start on ODC unless you have a hard dependency on something ODC doesn't yet support (some integration patterns, legacy connectors, or specific SAP/Salesforce accelerators that exist in O11 but not yet in ODC).

Existing O11 apps: migrate when the app would genuinely benefit from cloud-native auto-scaling, faster deployment pipelines, or the native AI capabilities covered below. Don't migrate just to migrate — ODC is newer, and some of the tooling and marketplace components you rely on in O11 may not have equivalent ODC versions yet.

2. AI Capabilities: ODC vs O11

This is where the gap is most significant.

What ODC has natively:

AI Agent Builder — visual tool inside ODC Studio. You build GenAI agents with an LLM backbone and optionally RAG. The agent compiles to a Server Action, callable from any workflow or screen like any other action.

AI Model Management — configured in ODC Portal > AI Connections. Supported providers: Azure OpenAI, OpenAI, Anthropic, AWS Bedrock, Databricks, Mistral, Gemini — plus custom connections for any provider that implements the ODC contract. API keys are stored as Secrets (AWS Secrets Manager, encrypted at rest, not visible after saving).

MCP (Model Context Protocol) — connect agents to external tools via MCP servers. Supports SSE streaming and service-to-service authentication. For MCP servers running in private networks, you use Private Gateway.

Agent Workbench — multi-agent orchestration layer. Agents can use tools: REST endpoints, Data Fabric entities, Service Actions, MCP servers. Supports RAG against external document stores. Agents can be embedded as activities inside workflows.

Mentor — AI digital worker covering the full SDLC. Includes App Generator (describe an app in natural language and get a scaffold), App Editor, and Code Reviews. GA since early 2025.

What O11 has natively:

None of the above. In O11 you consume external AI APIs via REST integrations, parse structured responses manually, and build all orchestration yourself with custom logic. It's possible — teams do it — but you're building the plumbing from scratch every time.
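To make "building the plumbing from scratch" concrete, here is a minimal sketch of what that hand-rolled integration looks like in a general-purpose language. The endpoint shape follows the widely used OpenAI-style chat completions contract; in O11 itself the equivalent would be a Consume REST API integration plus JSON structures for serialization and deserialization. All names here are illustrative, not an ODC or O11 API.

```python
import json

def build_chat_request(system_prompt: str, user_message: str,
                       model: str = "gpt-4o-mini") -> str:
    """Serialize a chat-completion request body by hand."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }
    return json.dumps(payload)

def parse_chat_response(raw: str) -> dict:
    """Extract the assistant text and token usage from a response body."""
    data = json.loads(raw)
    return {
        "text": data["choices"][0]["message"]["content"],
        "prompt_tokens": data["usage"]["prompt_tokens"],
        "completion_tokens": data["usage"]["completion_tokens"],
    }

# Abridged response shape as returned by OpenAI-compatible APIs:
sample = json.dumps({
    "choices": [{"message": {"role": "assistant", "content": "Approved."}}],
    "usage": {"prompt_tokens": 120, "completion_tokens": 5},
})
print(parse_chat_response(sample)["text"])  # Approved.
```

Every piece of this (request construction, response parsing, token accounting, plus retries, error handling, and prompt management not shown) is what ODC's AI Agent Builder and model management give you out of the box.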

Real use cases where this matters:

  • Document processing: upload PDFs/docs, agent answers questions against them using RAG
  • Customer support chatbots embedded in applications
  • Automated underwriting or approval workflows where the agent evaluates unstructured input
  • Data extraction from emails, forms, or documents into structured entities
  • Multi-agent pipelines where one orchestrator agent delegates to specialist agents
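The last use case, orchestrator plus specialists, can be sketched as a simple routing pattern. This is a toy illustration with hypothetical names and a keyword router standing in for LLM-driven intent classification; in ODC, the Agent Workbench provides this orchestration layer and specialists are exposed to the orchestrator as tools.

```python
from typing import Callable, Dict

Specialist = Callable[[str], str]

def route(task: str) -> str:
    """Naive routing step; a real orchestrator agent would ask the LLM."""
    if "invoice" in task.lower():
        return "document_extraction"
    if "refund" in task.lower():
        return "customer_support"
    return "general"

def orchestrate(task: str, specialists: Dict[str, Specialist]) -> str:
    """Delegate the task to the matching specialist, with a fallback."""
    agent = specialists.get(route(task), specialists["general"])
    return agent(task)

# Hypothetical specialists; in practice each would be its own agent.
specialists = {
    "document_extraction": lambda t: f"[extractor] parsed: {t}",
    "customer_support": lambda t: f"[support] handling: {t}",
    "general": lambda t: f"[general] answer: {t}",
}

print(orchestrate("Process this invoice PDF", specialists))
```

The design point is separation of concerns: the orchestrator only decides who handles the task, while each specialist owns one narrow capability, which keeps prompts small and behavior easier to test.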

3. AI Model Costs and Data Security

Costs:

Token-based billing. Every input token (user prompt + system instructions + retrieved context from RAG) and every output token (model response) is billed by the provider. Models that expose internal reasoning (like o1/o3) also bill for reasoning tokens. ODC's Agent Workbench shows token usage so you can monitor per-agent consumption. You configure the API key and billing is between you and the provider.

The cost variable that surprises people most is RAG context: if you retrieve 10 large document chunks per query, all of that goes into the input token count. Design your retrieval carefully.
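The RAG effect on cost is easy to quantify with back-of-envelope arithmetic. The per-1k-token prices below are illustrative placeholders, not any provider's actual rate card; check your provider's current pricing.

```python
def estimate_cost_usd(prompt_tokens: int, rag_chunks: int,
                      tokens_per_chunk: int, output_tokens: int,
                      in_price_per_1k: float = 0.0025,
                      out_price_per_1k: float = 0.01) -> float:
    """Input = user prompt + retrieved RAG context; output billed separately."""
    input_tokens = prompt_tokens + rag_chunks * tokens_per_chunk
    return (input_tokens / 1000) * in_price_per_1k \
         + (output_tokens / 1000) * out_price_per_1k

# Same 200-token question and 300-token answer, but 3 vs 10 retrieved
# chunks of ~800 tokens each:
lean = estimate_cost_usd(200, 3, 800, 300)
heavy = estimate_cost_usd(200, 10, 800, 300)
print(f"3 chunks: ${lean:.4f}  vs  10 chunks: ${heavy:.4f}")
```

At these placeholder prices, retrieval dominates the bill: going from 3 to 10 chunks roughly doubles the per-query cost even though the question and answer are unchanged, which is why trimming retrieval (fewer, smaller, better-ranked chunks) is usually the highest-leverage cost optimization.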

Security:

  • API keys stored as Secrets in AWS Secrets Manager — encrypted at rest, not retrievable in plaintext after saving
  • TLS on all channels: browser-to-app, app-to-DB, app-to-model
  • Data residency: your ODC region is selected at provisioning and data stays in that region
  • If you use Azure OpenAI: data stays within your Azure tenant, Microsoft does not use it for training under the default enterprise agreement
  • Compliance certifications: SOC 2 Type II, ISO 27001, HIPAA-eligible, PCI DSS-eligible, GDPR

Important caveat that gets glossed over:

When an agent retrieves sensitive data (customer records, medical data, financial data) to answer a question, that data is sent to the LLM provider as part of the prompt. The platform secures the API key and the transport — but the data itself goes to the provider you configured. If your data is sensitive, choose your provider accordingly. Azure OpenAI gives you the most control (your Azure tenant, enterprise data processing terms). Generic OpenAI is fine for many use cases but check your data processing agreements before using it with regulated data.

Worth keeping an eye on the March 31 event from Lisbon — OutSystems is teasing a major announcement around AI-powered development that could further widen the ODC advantage.

Happy to dive deeper into any of these areas.
