Espace Publish Timeout due to large Entity

Hi all,

Lately I've been having a problem publishing an eSpace that contains an Entity with many Attributes (Platform
That Entity has just under 142,000 rows, but has many indexes due to all its attributes of type "....Identifier".
For your information, SQL Server reports that this particular table occupies 51,296 KB of data space and 49,952 KB of index space.

A few days ago we added 4 new attributes (all of type Decimal), and the Publish process timed out during the database script step. We downloaded the script, and the problem was with one of the UPDATE statements, which set the new attribute to zero.

The UPDATE applied to ALL rows and was taking too long to execute, so we commented out that part of the script and ran it again. It worked fine.

Now, the question: is this because the table has too many attributes of the "....Identifier" type (because the index size is so large)?

Hi Gonçalo,

The issue you had when changing your Entity was probably related to its size (~142,000 rows) and the 4 new attributes you added to it.

In OutSystems we never store "NULL" values in the database (1, 2). As such, when you add a new attribute (database column) to your Entity (database table), the OutSystems Platform generates scripts that create the new column in the database and then update it with the default non-null value (" ", 0, 1900-01-01 00:00:00, ...).
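As a rough illustration, the generated script for a new Decimal attribute might look something like the sketch below. The table and column names here are made up for the example; the actual platform-generated script will differ:

```sql
-- Hypothetical example: adding a new Decimal attribute "Discount" to an
-- entity whose physical table is named [OSUSR_ABC_ORDER] (illustrative only).
ALTER TABLE [OSUSR_ABC_ORDER]
    ADD [DISCOUNT] NUMERIC(37,8) NULL;

-- The platform then replaces NULLs with the type's default value (0 for
-- decimals). This touches every row in one statement, which is the part
-- that can time out on large tables.
UPDATE [OSUSR_ABC_ORDER]
    SET [DISCOUNT] = 0
    WHERE [DISCOUNT] IS NULL;
```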

This operation probably caused the update query to time out. To overcome the problem, you can increase the "update query timeout" in the Configuration Tool to a value high enough for the query to complete.

Please let us know if this worked for you.

Nuno Parreira
Nuno -

We've seen similar problems in the past as well. I think this is something that needs to be addressed; we've had serious issues with deployments to PROD because of this. :(


Hi Nuno,

Thanks for your response.
I understand your explanation.

142,000 rows seems like a small number to be generating a timeout. If it were a million rows I could understand, but 142,000?
Could this also be related to the fact that that particular table has many indexes?

It's probably due to the new fields, although 142,000 records isn't a lot... Indexes shouldn't be changed at all, unless you modified them to include the new columns.

I used to manage a system that had some tables with over 200M records, and when we needed to add fields to them, we'd just run the scripts to create them manually a day before putting the modified application live, so our publish operation wouldn't run outside the nightly maintenance window.
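When running such a backfill manually, a common way to avoid one huge long-running transaction is to update in batches. Here's a T-SQL sketch using the same hypothetical names as above (this is not the platform's generated script):

```sql
-- Hypothetical batched backfill: updates 10,000 rows per iteration so each
-- transaction stays short and locks/log growth remain bounded.
DECLARE @rows INT = 1;
WHILE @rows > 0
BEGIN
    UPDATE TOP (10000) [OSUSR_ABC_ORDER]
        SET [DISCOUNT] = 0
        WHERE [DISCOUNT] IS NULL;
    SET @rows = @@ROWCOUNT;  -- 0 once no NULL rows remain, ending the loop
END;
```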