SQLTeam.com | Weblogs | Forums

Changing Data Type in Azure DB on a table with many rows



I'm looking for some advice on the best way to change a column's data type from INT to BIGINT on a table with 220 million rows, as the column has reached the upper limit of the INT data type (2,147,483,647).

The database is an Azure SQL Database. The column has a clustered index with a primary key constraint, is referenced by a foreign key constraint, and appears in a non-clustered index. The referencing table also has 3 non-clustered indexes which include the key.
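For concreteness, a setup like the one described might look something like this. All table, column, and index names here are hypothetical, invented purely for illustration:

```sql
-- Hypothetical parent table: the INT key has hit its ceiling.
CREATE TABLE dbo.Orders
(
    OrderID    INT       NOT NULL,  -- column to be widened to BIGINT
    CustomerID INT       NOT NULL,
    OrderDate  DATETIME2 NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderID)
);

-- The non-clustered index mentioned above.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID);

-- Hypothetical referencing table; its foreign key column must be
-- widened in step with the parent, and it carries non-clustered
-- indexes that include the key.
CREATE TABLE dbo.OrderItems
(
    OrderItemID INT NOT NULL,
    OrderID     INT NOT NULL,
    CONSTRAINT PK_OrderItems PRIMARY KEY CLUSTERED (OrderItemID),
    CONSTRAINT FK_OrderItems_Orders
        FOREIGN KEY (OrderID) REFERENCES dbo.Orders (OrderID)
);

CREATE NONCLUSTERED INDEX IX_OrderItems_OrderID ON dbo.OrderItems (OrderID);
```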

What is the fastest/most efficient way to achieve this? Ideally one that minimizes the length of time the table is inaccessible.

I am planning to test two approaches: dropping the constraints and indexes, altering the column, and recreating them; and copying the data into a new table and creating the appropriate constraints and indexes there. I've also seen a few other ideas online, such as moving the data in batches, but I'm not sure what will work best, so I wanted to ask for advice first.
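As a rough sketch of the drop/alter/recreate approach (object names are hypothetical, and this should be tested on a copy of the database in a maintenance window first — note the foreign key column on the referencing side must be widened too, or the constraint can't be recreated):

```sql
-- 1. Drop everything that depends on the INT column.
ALTER TABLE dbo.OrderItems DROP CONSTRAINT FK_OrderItems_Orders;
DROP INDEX IX_OrderItems_OrderID ON dbo.OrderItems;
DROP INDEX IX_Orders_CustomerID  ON dbo.Orders;
ALTER TABLE dbo.Orders DROP CONSTRAINT PK_Orders;  -- clustered PK

-- 2. Widen the column on both sides of the relationship.
ALTER TABLE dbo.Orders     ALTER COLUMN OrderID BIGINT NOT NULL;
ALTER TABLE dbo.OrderItems ALTER COLUMN OrderID BIGINT NOT NULL;

-- 3. Recreate the constraints and indexes.
ALTER TABLE dbo.Orders
    ADD CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderID);
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID  ON dbo.Orders (CustomerID);
CREATE NONCLUSTERED INDEX IX_OrderItems_OrderID ON dbo.OrderItems (OrderID);
ALTER TABLE dbo.OrderItems
    ADD CONSTRAINT FK_OrderItems_Orders
        FOREIGN KEY (OrderID) REFERENCES dbo.Orders (OrderID);
```

For the copy-to-new-table route, the batched variant would presumably insert into the new BIGINT table in keyed ranges (e.g. `INSERT ... SELECT` with `WHERE OrderID > @lastKey` and a `TOP` clause) to keep each transaction and the log small, then swap the tables with `sp_rename` at the end.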

Thanks in advance.