I know this is a pretty old thread, but we're running into the same issue. We have several tables in our app that contain a large number of rows. If we add columns to those tables in dev and deploy to prod, the platform runs a script that writes the default values for the new columns into every row. In our case that prod update can run for 15+ minutes and causes an almost complete loss of use of our production system while the db does this massive single update transaction. Has anyone come up with a good, reliable way to work around this? At this point we can't add any new columns to these tables without facing this issue - I'd hate for the database design to be driven by a platform limitation.
Anybody else work around this? Is it feasible to add the new columns ourselves and set the defaults in a more controlled way (say, 1M rows at a time, avoiding an extended outage on the db) before doing the deploy?
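For what it's worth, the batching idea above is a standard pattern: backfill the new column in small chunks, committing after each one, so no single transaction holds locks for long. Here's a minimal sketch using Python's sqlite3 just to illustrate the loop - the table name (`orders`), column (`status`), batch size, and default value are all made up for the example, and on a real production database you'd use the actual vendor's batched-UPDATE idiom instead.

```python
import sqlite3

# Hypothetical setup: a big table with a newly added column whose
# default has not yet been backfilled (it is NULL everywhere).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany(
    "INSERT INTO orders (id, status) VALUES (?, NULL)",
    ((i,) for i in range(1, 10_001)),
)
conn.commit()

BATCH_SIZE = 1_000  # tune so each transaction stays short

while True:
    # Update only a limited slice of the remaining NULL rows.
    cur = conn.execute(
        """UPDATE orders
           SET status = 'new'
           WHERE id IN (SELECT id FROM orders
                        WHERE status IS NULL LIMIT ?)""",
        (BATCH_SIZE,),
    )
    conn.commit()  # short transaction per batch; locks release quickly
    if cur.rowcount == 0:
        break  # nothing left to backfill

remaining = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE status IS NULL"
).fetchone()[0]
print(remaining)  # 0 once the backfill is complete
```

Between batches you can also sleep briefly or throttle during peak hours, which keeps the production impact negligible at the cost of a longer overall backfill.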
Following is the OutSystems response from a ticket we had open on this:
We tested this scenario to clarify if not defining the Default value when creating a new attribute in an Entity would avoid the creation of the update script:
Hey Greg,
The best way to avoid the timeout is to create a new entity with a 1-to-1 relation to the big entity. But if you really need to add the new column to the big entity, then the only way is to do what I did: run the db update script manually.
regards
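The 1-to-1 entity the support engineer suggests works because the new attribute lives in a separate small table keyed on the big table's primary key, so the existing rows are never rewritten. A minimal sketch of the idea, again in sqlite3 with invented table/column names (`orders`, `orders_ext`, `priority`) purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany(
    "INSERT INTO orders (id, status) VALUES (?, 'new')",
    ((i,) for i in range(1, 6)),
)

# Instead of ALTERing the big table (which triggers the full-table
# update script), put the new attribute in an extension table with a
# 1-to-1 relation to the big table's primary key.
conn.execute("""
    CREATE TABLE orders_ext (
        order_id INTEGER PRIMARY KEY REFERENCES orders(id),
        priority TEXT NOT NULL DEFAULT 'normal'
    )
""")

# Reads LEFT JOIN the extension table; rows with no extension record
# fall back to the default via COALESCE.
rows = conn.execute("""
    SELECT o.id, COALESCE(e.priority, 'normal') AS priority
    FROM orders AS o
    LEFT JOIN orders_ext AS e ON e.order_id = o.id
""").fetchall()
print(rows[0])  # (1, 'normal')
```

The trade-off is an extra join on every read, but deploys become instant because creating the empty extension table touches no existing data.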