Best way to update table schema for huge tables (SQL Server)?

We had a similar problem, and the fastest approach I found was: export the data to delimited files in chunks (sized by row width; in our case each file held 500,000 rows), apply any transforms during the export, drop and recreate the table with the new schema, and then bcp import from the files.

The load should go much faster with no indexes in place during the build; recreate them afterward.
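As a rough sketch of that cycle, something like the following could work. All server, database, table, column, and file names here are placeholders, the transform in the `SELECT` is just an illustration, and the chunked export via `OFFSET`/`FETCH` assumes SQL Server 2012 or later:

```shell
# Placeholders -- substitute your own server/database/table names.
SERVER="myserver"
DB="mydb"
CHUNK=500000

# 1. Export in 500,000-row chunks; queryout lets you apply
#    transforms in the SELECT as the data leaves the table.
for i in 0 1 2 3; do
  OFFSET=$((i * CHUNK))
  bcp "SELECT col1, UPPER(col2) AS col2 FROM dbo.BigTable ORDER BY id OFFSET $OFFSET ROWS FETCH NEXT $CHUNK ROWS ONLY" \
      queryout "chunk_$i.dat" -S "$SERVER" -d "$DB" -T -c -t "|"
done

# 2. Drop and recreate the table with the new schema (no indexes yet).
sqlcmd -S "$SERVER" -d "$DB" -E -i recreate_bigtable.sql

# 3. Bulk-load each chunk; a batch size (-b) plus the TABLOCK hint
#    keeps locking coarse and logging minimal.
for f in chunk_*.dat; do
  bcp dbo.BigTable in "$f" -S "$SERVER" -d "$DB" -T -c -t "|" -b 50000 -h "TABLOCK"
done

# 4. Recreate indexes only after the load completes.
```

The key flags are `-c` (character mode, matching the delimited export), `-t "|"` (field terminator), and `-T` (trusted connection); adjust for your authentication setup.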

