Monday, March 26, 2012

Making changes to a table with large amounts of data. Timeout?!

Hello,

I have a table that is fairly large, and I need to make a change to one of its columns. Namely, I need to change the datatype and rename that column. When I try to save the updated table, I keep getting a timeout error that says:

'eligibility (dbo)' table
- Unable to create index 'PK_eligibility'.
Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

Any ideas on how to make the table change more efficient, or how to change the timeout period? I have to keep the existing data in the table. I am using SQL Server Management Studio (2005) connected to a SQL Server 2000 database.

Thanks!

This sort of thing always happens to me when using EM. I use only code now.

If you look at the script EM generates behind the scenes, it creates a temp table, shoves all the data into that table, re-creates the original table, and pushes all the data back. (If you tell it to save the script when you make the change, you'll see what I mean.)
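Roughly, the generated script follows the pattern below. This is only a sketch of the idea, not the exact script EM emits, and the column names (member_id, old_status, new_status) are made up for the example; only the eligibility table and the PK_eligibility index come from your error message.

BEGIN TRANSACTION

-- build a temp copy of the table with the new column definition
CREATE TABLE dbo.Tmp_eligibility
(
    member_id  INT         NOT NULL,
    new_status VARCHAR(10) NULL  -- new datatype / new name
)

-- copy every row across in one shot
INSERT INTO dbo.Tmp_eligibility (member_id, new_status)
SELECT member_id, CONVERT(VARCHAR(10), old_status)
FROM dbo.eligibility

-- swap the tables
DROP TABLE dbo.eligibility
EXECUTE sp_rename 'dbo.Tmp_eligibility', 'eligibility', 'OBJECT'

-- rebuilding the primary key on the full copy is the step that times out
ALTER TABLE dbo.eligibility
    ADD CONSTRAINT PK_eligibility PRIMARY KEY CLUSTERED (member_id)

COMMIT

On a big table, that single INSERT plus the index rebuild is exactly where the timeout hits.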

For a large table, I would probably create a new column, update the values in the new column with what is in the old column, then drop the old column. The only time-consuming step is the updates; however, you can space those out and update based on a range of values from one of your other columns (e.g., UPDATE table SET newcolumn = oldcolumn WHERE datefield BETWEEN '1/1/2001' AND '2/1/2001'). There is a sketch of the whole sequence below.

This way it shouldn't have to create the index.
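As a rough sketch, assuming the new column is called new_status, the old one old_status, and that datefield is a column you can slice ranges on (all of those names are placeholders):

-- 1) add the replacement column with the new datatype and name
ALTER TABLE dbo.eligibility
    ADD new_status VARCHAR(10) NULL

-- 2) copy the data across in slices so no single statement runs long enough to time out
UPDATE dbo.eligibility
SET    new_status = CONVERT(VARCHAR(10), old_status)
WHERE  datefield BETWEEN '1/1/2001' AND '2/1/2001'

UPDATE dbo.eligibility
SET    new_status = CONVERT(VARCHAR(10), old_status)
WHERE  datefield BETWEEN '2/1/2001' AND '3/1/2001'

-- ...repeat for the remaining ranges...

-- 3) once every row is copied, drop the old column
ALTER TABLE dbo.eligibility
    DROP COLUMN old_status

Since you also want to rename the column, you can simply give the new column its final name up front; if you would rather rename it afterwards, sp_rename 'dbo.eligibility.new_status', 'status', 'COLUMN' works on SQL Server 2000 as well.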

If you are trying to do this on a column that has a constraint, you will have to drop the constraint first. If you are changing the PK, and your PK is the clustered index, it's probably going to be messy any way you go.
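For the constraint case, the usual order is drop the constraint, make the change, put the constraint back. A minimal sketch, again assuming member_id is the key column:

-- drop the constraint that blocks the change
ALTER TABLE dbo.eligibility
    DROP CONSTRAINT PK_eligibility

-- make the column change (or drop the column) here
ALTER TABLE dbo.eligibility
    ALTER COLUMN member_id INT NOT NULL

-- put the constraint back; if the PK is the clustered index, this rebuild
-- rewrites the whole table and is the slow, messy part
ALTER TABLE dbo.eligibility
    ADD CONSTRAINT PK_eligibility PRIMARY KEY CLUSTERED (member_id)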

One thing a script won't let you do is reorder columns; new columns are always added at the end of the table.

