OTelLogExporter

OTelLogExporter (ODC)

Stable version 0.1.0 (Compatible with ODC)
Uploaded on 30 May 2025 by Xebia USA Inc.

Component Name: OTelLogExporter


Overview:

The OTelLogExporter is a .NET-based, reusable server-side external logic component that converts custom logs from OutSystems Developer Cloud (ODC) into OpenTelemetry-compliant JSON. It lets teams standardize and export logs to observability platforms such as SigNoz, Splunk, and Datadog.

The external logic receives a JSON string of custom logs and outputs a JSON string formatted according to the OpenTelemetry logs data model. It also returns any errors encountered during parsing or conversion.


Why OpenTelemetry?

OpenTelemetry ensures log interoperability across systems using the OTLP format. If your cloud-based app interacts with external or non-low-code systems, this connector helps centralize logs for seamless integration with APM tools—enhancing observability and performance tracking.


Key Features:

  • Accepts custom logs in JSON: Simple JSON input format for custom application logs.

  • Outputs OpenTelemetry-compliant JSON: Full compliance with the OTel structured log format.

  • Error Reporting: Clear error messages for invalid or malformed input.

  • Distributed Tracing Support: Includes trace ID and span ID handling for traceability.

  • Efficient Server Execution: Lightweight .NET implementation designed for server-side performance.


External Logic Info:

This extension exposes one key server action that can be seamlessly consumed by any OutSystems application. The screenshot below illustrates the action:


🔹 CustomLogsToOtelJson

Inputs:

  • LogsJson (Text)
    ➤ JSON string of one or more custom logs.

Outputs:

  • OtelJson (Text)
    ➤ Output JSON string in OpenTelemetry format.

  • Error (Text)
    ➤ Error message string if the conversion fails.


Expected Input JSON Format:

```json
[
  {
    "Id": "LOG001",
    "Message": "User login successful",
    "Level": "Information",
    "Category": "Authentication",
    "Source": "AuthService",
    "Timestamp": "2025-05-22T09:15:30Z",
    "UserId": "user123",
    "Environment": "Production",
    "Platform": "Web",
    "Module": "UserManagement",
    "TraceId": "a3d1f7e8b9c04232a0b5c6d7e8f90123",
    "SpanId": "f1e2d3c4b5a69788"
  },
  {
    "Id": "LOG002",
    "Message": "Failed to load user profile",
    "Level": "Error",
    "Category": "UserProfile",
    "Source": "UserService",
    "Timestamp": "2025-05-22T09:20:45Z",
    "UserId": "user456",
    "Environment": "Production",
    "Platform": "Mobile",
    "Module": "UserManagement",
    "TraceId": "b4e2c3d1a6f87901b2c3d4e5f6a7b8c9",
    "SpanId": "c2b3a4f5d6e79890"
  }
]
```


Sample Output (OpenTelemetry JSON):

```json
{
  "resourceLogs": [
    {
      "resource": {
        "attributes": [
          { "key": "service.name", "value": { "stringValue": "ODCLogTraceService" } },
          { "key": "service.version", "value": { "stringValue": "1.0.0" } }
        ]
      },
      "scopeLogs": [
        {
          "scope": { "name": "CustomLogTracing", "version": "1.0" },
          "logRecords": [
            {
              "traceId": "a3d1f7e8b9c04232a0b5c6d7e8f90123",
              "spanId": "f1e2d3c4b5a69788",
              "traceFlags": 0,
              "timeUnixNano": 1747905330000000000,
              "severityText": "Information",
              "severityNumber": 9,
              "body": { "stringValue": "User login successful" },
              "attributes": [
                { "key": "id", "value": { "stringValue": "LOG001" } },
                { "key": "category", "value": { "stringValue": "Authentication" } },
                { "key": "source", "value": { "stringValue": "AuthService" } },
                { "key": "userId", "value": { "stringValue": "user123" } },
                { "key": "environment", "value": { "stringValue": "Production" } },
                { "key": "platform", "value": { "stringValue": "Web" } },
                { "key": "module", "value": { "stringValue": "UserManagement" } }
              ]
            },
            {
              "traceId": "b4e2c3d1a6f87901b2c3d4e5f6a7b8c9",
              "spanId": "c2b3a4f5d6e79890",
              "traceFlags": 0,
              "timeUnixNano": 1747905645000000000,
              "severityText": "Error",
              "severityNumber": 17,
              "body": { "stringValue": "Failed to load user profile" },
              "attributes": [
                { "key": "id", "value": { "stringValue": "LOG002" } },
                { "key": "category", "value": { "stringValue": "UserProfile" } },
                { "key": "source", "value": { "stringValue": "UserService" } },
                { "key": "userId", "value": { "stringValue": "user456" } },
                { "key": "environment", "value": { "stringValue": "Production" } },
                { "key": "platform", "value": { "stringValue": "Mobile" } },
                { "key": "module", "value": { "stringValue": "UserManagement" } }
              ]
            }
          ]
        }
      ]
    }
  ]
}
```
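For reference, the mapping shown in the samples can be sketched outside the component as well. The following Python sketch is illustrative only (the actual implementation is .NET and its internals are not published); it shows the OTel severity-number mapping and the ISO 8601 → `timeUnixNano` conversion visible in the sample above:

```python
from datetime import datetime

# OpenTelemetry severity numbers per the OTel logs data model:
# Debug=5, Information=9, Warning=13, Error=17, Fatal=21
SEVERITY_NUMBERS = {
    "Debug": 5, "Information": 9, "Warning": 13, "Error": 17, "Fatal": 21,
}

def to_log_record(log: dict) -> dict:
    """Map one custom log entry to an OTLP-style logRecord dict."""
    # Parse the ISO 8601 timestamp and convert to Unix nanoseconds.
    ts = datetime.fromisoformat(log["Timestamp"].replace("Z", "+00:00"))
    time_unix_nano = int(ts.timestamp() * 1_000_000_000)

    # The remaining fields become string attributes with camelCase keys.
    attr_keys = ["Id", "Category", "Source", "UserId",
                 "Environment", "Platform", "Module"]
    attributes = [
        {"key": k[0].lower() + k[1:], "value": {"stringValue": log[k]}}
        for k in attr_keys if k in log
    ]

    return {
        "traceId": log.get("TraceId", ""),
        "spanId": log.get("SpanId", ""),
        "traceFlags": 0,
        "timeUnixNano": time_unix_nano,
        "severityText": log["Level"],
        "severityNumber": SEVERITY_NUMBERS.get(log["Level"], 0),
        "body": {"stringValue": log["Message"]},
        "attributes": attributes,
    }
```

Running this on the first sample input log reproduces the first `logRecords` entry in the sample output, including `timeUnixNano = 1747905330000000000`.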


Limitations:

  • No schema validation is performed on the input JSON

  • Timestamps must be in ISO 8601 format

  • Trace/span support is limited to what is provided in the input
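Because the action performs no schema validation, a caller can pre-check the payload before invoking it. A minimal sketch (the required-field set here is an assumption; the component's exact expectations are not documented):

```python
import json

# Assumed minimal field set; adjust to match your log schema.
REQUIRED_FIELDS = {"Id", "Message", "Level", "Timestamp"}

def validate_logs_json(logs_json: str) -> list:
    """Return a list of validation problems; an empty list means OK."""
    try:
        logs = json.loads(logs_json)
    except json.JSONDecodeError as exc:
        return [f"Malformed JSON: {exc}"]
    if not isinstance(logs, list):
        return ["Top-level value must be a JSON array of log objects"]
    problems = []
    for i, log in enumerate(logs):
        missing = REQUIRED_FIELDS - set(log)
        if missing:
            problems.append(f"Log {i}: missing fields {sorted(missing)}")
    return problems
```

Calling this before `CustomLogsToOtelJson` surfaces structural issues earlier than the action's own error output.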


Error Examples:

| Error | Cause | Suggestion |
| --- | --- | --- |
| Input JSON is empty or null | No input given | Ensure a JSON string is passed to the action |
| Failed to parse log JSON | Malformed JSON | Validate your JSON using a formatter |
| Error during OTel JSON conversion: [details] | Runtime error | Review and debug individual fields |



Developer Notes:

  1. Custom Log Service
    Set up a custom logging service that writes application logs to your custom log database.

  2. Filter and Convert Logs
    Before converting, filter logs by criteria such as timestamp, severity, or type (for example, only the day's critical errors). Convert the filtered logs to the OTLP-compliant format, then group them into a single JSON file for export.

  3. Export to APM Tool
    Once the log file is ready, it can be pushed to an APM tool in several ways; open-source code snippets are available online to help with this. We are also working on an OutSystems connector to simplify this process, so stay tuned!

  4. Log Visualization
    Once the logs are available in the APM tool, you can visualize them, apply filters, and run queries to gain comprehensive insights.

  5. Enhanced Traceability
    You can also add attributes such as traceId and spanId to each log record. APM tools use these to trace the execution flow of a single request across distributed systems.
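As a starting point for the export step, many collectors and APM tools accept OTLP over HTTP, where the standard logs ingest path is `/v1/logs`. A minimal Python sketch (the endpoint URL below is an assumption; substitute your tool's actual ingest URL and any authentication headers it requires):

```python
import urllib.request

# Assumed local OTLP/HTTP collector endpoint; the standard logs path
# is /v1/logs. Replace with your APM tool's ingest URL.
OTLP_LOGS_URL = "http://localhost:4318/v1/logs"

def build_export_request(otel_json: str, url: str = OTLP_LOGS_URL):
    """Build an HTTP POST carrying the OTel JSON produced by the action."""
    return urllib.request.Request(
        url,
        data=otel_json.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def export_logs(otel_json: str) -> int:
    """Send the payload to the collector and return the HTTP status code."""
    with urllib.request.urlopen(build_export_request(otel_json)) as resp:
        return resp.status
```

The `OtelJson` output of `CustomLogsToOtelJson` can be passed directly as `otel_json`, since it is already a complete OTLP logs payload.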