Prompt Assembler

Prompt Assembler (ODC)

Stable version 0.1.0 (Compatible with ODC)
Uploaded on 28 Mar by DB Results Labs
Documentation

Overview

The Prompt Assembler is a lightweight utility for OutSystems Developer Cloud (ODC) that eliminates the "Compilation Tax": by moving AI instructions out of hardcoded server actions and into dynamic templates, you can iterate on your prompts in real time without waiting for long republication cycles. To keep things scalable as you grow, the system uses a three-tier hierarchy: Projects contain Templates and Groups, and each Group combines one or more Prompt Templates.


A. Configuration: Projects, Groups, and Templates

The management UI is designed to keep your AI assets organized as your factory scales.

Step 1: Define your Project

Projects are the top-level organizational unit. Use them to separate different applications, departments, or major AI initiatives.

  • Navigate to the Projects section.

  • Create a new Project to house your related prompts and groups.


Step 2: Build your Prompt Templates

  • Go to Templates and click Add Template.

  • Key: Enter a unique identifier (e.g., "customerSupport_reply"). This is the mandatory key your logic will use to fetch the template.

  • Text: Write your instructions using double angle brackets for dynamic placeholders.

    • Example: "Summarize this: <<Content>>. Tone: <<Tone>>."

  • Save: The template is now live. Any updates here take effect immediately in your apps without a republish.
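To make the placeholder syntax concrete, here is a minimal Python sketch of how a template with double-angle-bracket tokens gets filled at runtime. The template text and the `<<Content>>`/`<<Tone>>` token names are illustrative assumptions; the real assembly happens inside the component.

```python
import re

# Hypothetical template text, as it would be stored in the Prompt Assembler UI.
template = "Summarize this: <<Content>>. Tone: <<Tone>>."

# Runtime values keyed by placeholder name (no angle brackets in the key).
variables = {"Content": "Quarterly sales grew 12%", "Tone": "formal"}

# Replace each <<Token>> with its value; unknown tokens are left untouched.
assembled = re.sub(
    r"<<(\w+)>>",
    lambda m: variables.get(m.group(1), m.group(0)),
    template,
)
print(assembled)
# → "Summarize this: Quarterly sales grew 12%. Tone: formal."
```

Because the template text lives in the database rather than in your module, editing it in the UI changes the assembled output on the very next call.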


Optional: Create Prompt Groups

Groups allow you to combine multiple related prompt templates under a single category within a Project.

  • Go to the Template Group tab.

  • Create a Group (e.g., "support_bot_phase_1") to act as a logical container for your templates.

  • Add one or more templates to the group.


B. Testing: Using the Playground

Before integrating a template into your application, use the Playground to verify the assembly logic.

  1. Input Key: Type the template/group key you want to test.

  2. Define Parameter Key/Value: Enter the values for your placeholder keys in JSON format.

  3. Download "sample JSON": Download a sample JSON file showing the expected format for your Parameter Key/Value input.

  4. Review the Result: The playground will show you exactly how the final string looks. This allows you to spot any typos in your placeholders or missing data in your context list before you start debugging in ODC Studio.
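The downloadable sample JSON is the authoritative reference for the parameter format. As an assumed illustration only, a payload mirroring the KeyValueItem structure (PlaceHolderKey plus one or more Values) might look like this:

```python
import json

# Assumed shape of the Playground parameter payload; the placeholder names
# ("Content", "Tone") are hypothetical. Check the downloaded sample JSON
# for the component's actual schema.
params = [
    {"PlaceHolderKey": "Content", "Values": ["Quarterly sales grew 12%"]},
    {"PlaceHolderKey": "Tone", "Values": ["formal"]},
]
print(json.dumps(params, indent=2))
```

Note that Values is a list even for a single value, matching the multi-value placeholder support described in the Usage section.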


C. Usage: Integrating with your Application Logic

Once your template is configured in the UI, you need to "assemble" it within your application's workflow.

  1. Add the Dependency: In ODC Studio, open the Dependencies window (Ctrl+Q) and reference the Assemble_Prompt service action from the Prompt Assembler source.

  2. Add to Logic Flow: Drag the Assemble_Prompt action into your Server Action or Data Action.

  3. Configure Input Parameters:

    • Key (Text): Enter the unique code you defined in the UI (for example: "customerSupport_reply"). This is a mandatory field that tells the engine which template to fetch.

    • Variables (KeyValueItem List): Runtime data used for placeholder substitution within prompt text. Each record in the list requires:

      • PlaceHolderKey: The exact name of the placeholder token defined in the prompt template, without the angle-bracket syntax. For example, if the template contains <<ListSystem>>, the PlaceHolderKey value should be ListSystem. This must match the token name exactly.

      • Values: The runtime data to substitute into the corresponding placeholder. Accepts multiple values to support scenarios where a placeholder represents a list of items (e.g., a list of systems, users, or options). The assembler will inject all values into the template at the position of the matching token.

  4. Handle the Output: The action returns an Assembled Prompt string. This is the finalized text with all placeholders replaced by your live data, ready to be sent to your LLM connector.
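The contract described in the steps above can be sketched in Python. The function name, in-memory template store, and the ", " separator used to join multiple Values are all assumptions for illustration; only the input/output shape (Key plus a KeyValueItem list in, assembled string out) comes from the documentation.

```python
import re

# Hypothetical in-memory store; the component fetches templates by Key
# from its own database at runtime.
TEMPLATES = {
    "customerSupport_reply": "Systems to check: <<ListSystem>>. Reply politely.",
}

def assemble_prompt(key: str, variables: list[dict]) -> str:
    """Sketch of the Assemble_Prompt contract, not the real implementation."""
    template = TEMPLATES[key]
    lookup = {v["PlaceHolderKey"]: v["Values"] for v in variables}
    # A placeholder with several Values receives all of them; the separator
    # used here (", ") is an assumption.
    return re.sub(
        r"<<(\w+)>>",
        lambda m: ", ".join(lookup.get(m.group(1), [m.group(0)])),
        template,
    )

result = assemble_prompt(
    "customerSupport_reply",
    [{"PlaceHolderKey": "ListSystem", "Values": ["CRM", "Billing", "ERP"]}],
)
print(result)
# → "Systems to check: CRM, Billing, ERP. Reply politely."
```

The multi-value case here also previews the List Handling behavior described in the next section.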


D. Execution and LLM Connection


Once the Assemble_Prompt action runs, the engine performs a regex-based replacement to swap your placeholders with the data provided in the Variables list. Because the template is fetched and assembled at runtime, your application remains completely decoupled from the specific wording of the instructions.

  1. Direct Integration: Take the Assembled Prompt output string and map it directly to the "Prompt" or "System Message" input of your agent.

  2. Immediate Iteration: If you find the AI's response isn't meeting your requirements, you do not need to change a single line of code or republish your application. Simply go back to the Management UI, adjust the template text, and the very next time your logic runs, the updated instructions will be used.

  3. List Handling: If you passed multiple items into the Values field for a single placeholder, the assembler will inject them into the template at that position. This is particularly useful for passing lists of system names, user roles, or data points that the LLM needs to process as a group.


E. Best Practices

  • Case Sensitivity: Double-check that the PlaceHolderKey in your logic matches the casing of the token in the UI exactly. If your template has <<SystemName>>, the key must be SystemName.

  • JSON Stability: When instructing the AI to return a JSON response, try to keep the structure of that JSON consistent. While you can change the tone or detail of the "instructions" freely, changing the expected keys in the AI's output might require you to update the parsing logic in your ODC app.

  • Unique Keys: Use clear, descriptive naming for your template Keys to keep your library organized as you add more AI features.
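The case-sensitivity point above is worth seeing in action. Assuming case-sensitive matching (a sketch, not the component's actual implementation), a key with the wrong casing simply leaves the token unreplaced in the output:

```python
import re

template = "Check <<SystemName>> before replying."
variables = {"systemname": "Billing"}  # wrong casing: lowercase key

# Case-sensitive lookup: "SystemName" != "systemname", so the token survives
# in the assembled string instead of being replaced.
assembled = re.sub(
    r"<<(\w+)>>",
    lambda m: variables.get(m.group(1), m.group(0)),
    template,
)
print(assembled)
# → "Check <<SystemName>> before replying."
```

A literal `<<...>>` token appearing in your LLM's input is the telltale sign of a casing or naming mismatch, and the Playground is the fastest place to catch it.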