If you have a table with millions of rows and you want to update one of its columns, one problem you can run into is that the update causes the transaction log file to grow so large that you run out of disk space.
One work-around to the log file growing very large is to update the table in small batches. (Especially when the database uses the simple recovery model, this can keep the log file from growing much at all, because the log space used by each committed batch can be reused by the next one.)
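For reference, here is one way to check which recovery model a database is using, and to switch it to simple if that fits your backup strategy. [MyDatabase] is a placeholder name, and note that the simple recovery model gives up point-in-time restore.

-- Check the current recovery model. 'MyDatabase' is a placeholder;
-- substitute your own database name.
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'MyDatabase';

-- Switch to the simple recovery model, if that is acceptable for
-- your backup strategy.
ALTER DATABASE [MyDatabase] SET RECOVERY SIMPLE;
GO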
Here is how I did it:
1. Added a column called Updated to the table. (This column keeps track of whether a row has been updated yet. Depending on your table's structure you may not need it; see the alternative sketch after the script.) The column is dropped at the end of the script.
2. Update the table in batches using the TOP (1000) option, committing after each batch.
3. Drop the column that was used to keep track of which rows were updated.
ALTER TABLE [dbo].[sample] ADD [updated] BIT NULL;
-- The GO is required so the new column is visible to the batch below;
-- without it the UPDATE fails to compile with "Invalid column name".
GO

WHILE (1 = 1)
BEGIN
    BEGIN TRANSACTION;

    UPDATE TOP (1000) [dbo].[sample]
    SET [mySampleData] = newValue,  -- newValue: whatever you are setting the column to
        [updated] = 1
    WHERE [updated] IS NULL;

    -- This check must come before the COMMIT, because
    -- COMMIT TRANSACTION resets @@ROWCOUNT to 0.
    IF (@@ROWCOUNT = 0) BREAK;

    COMMIT TRANSACTION;
END

-- The BREAK above exits the loop with one (empty) transaction
-- still open; close it here.
COMMIT TRANSACTION;
GO

ALTER TABLE [dbo].[sample] DROP COLUMN [updated];
GO
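As mentioned in step 1, the tracking column is only one way to remember where you left off. If the table already has an ascending integer primary key, you can batch on key ranges instead and skip the extra column entirely. The sketch below assumes a hypothetical integer key column [id] on [dbo].[sample]; batches will be uneven if the key has gaps, but the end result is the same.

-- Batch by key range instead of a tracking column.
-- Assumes [id] is an ascending integer key (hypothetical; adjust to your schema).
DECLARE @lastId INT = 0;
DECLARE @maxId  INT;

SELECT @maxId = MAX([id]) FROM [dbo].[sample];

WHILE (@lastId < @maxId)
BEGIN
    -- Each UPDATE runs in its own autocommit transaction, so under the
    -- simple recovery model its log space can be reused between batches.
    UPDATE [dbo].[sample]
    SET [mySampleData] = newValue
    WHERE [id] > @lastId AND [id] <= @lastId + 1000;

    SET @lastId = @lastId + 1000;
END
GO

This version also avoids the explicit BEGIN/COMMIT TRANSACTION bookkeeping, since each batch is a single statement.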