Allow text streaming (e.g. for AI responses)
Backend

Allow server actions and integrations (e.g. AI use cases) to stream text to the UI incrementally instead of waiting for the full response.

This is standard in any AI-powered app, and since OutSystems is pushing AI features, I think this is a must-have. We don't want our users to have to wait for the full LLM response before getting any feedback.

One way to implement this is to support Server-Sent Events (SSE), which would also enable other use cases, especially in ODC with event-driven architectures.

This is a great idea and aligns well with the related idea for native SSE support.

In addition, I think streaming capabilities should not be limited to text. Supporting streaming of binary data could improve performance and user experience for a wider range of use cases (e.g. streaming binary data from the database).