
DBCC SHRINKFILE

Dear friends, I have an MDF file (FileID 1) with about 200 GB of free space(!).
Its size is 1,555,135 MB (~1.5 TB).
As a test I ran: DBCC SHRINKFILE (1, 1555130) (trying to shrink by only 5 MB).
It never finishes.
Please help me. Thank you so much. Yossi Moses
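A common first step here is to confirm how much free space the file actually has, and then shrink in small increments so each step finishes quickly. A sketch, using the file ID from the post:

```sql
-- Check the file's current size and free space (size is in 8 KB pages)
SELECT name,
       size / 128 AS size_mb,
       (size - FILEPROPERTY(name, 'SpaceUsed')) / 128 AS free_mb
FROM sys.database_files
WHERE file_id = 1;

-- Shrink in small steps toward the target instead of one huge jump;
-- each DBCC SHRINKFILE call can then finish (or be cancelled) quickly
DBCC SHRINKFILE (1, 1550000);  -- target size in MB
```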

Why are you trying to shrink the file? Saving 200GB out of 1.5TB isn't saving anything - and will just cause issues when the file needs to grow to support future growth and normal index maintenance.


First, thank you so much for your reply. Yes, what you say is the usual advice; I know it.
But consider my situation: I am going to move all the big (huge) tables to separate files, one per table, and partition them. So the .MDF will contain very little. Meanwhile I don't have enough space, so I must work table by table: create a file and shrink the .MDF, create another file and shrink the .MDF again. In the end the .MDF will be no more than 200 GB. I'd be happy to hear any ideas, so thanks.
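The "one file per table" step described above can be done by adding a filegroup and file, then rebuilding the table's clustered index onto it, which moves the data. A sketch; the database, filegroup, table, and path names are hypothetical:

```sql
-- Hypothetical names: database BigDb, filegroup FG_Orders, table dbo.Orders
ALTER DATABASE BigDb ADD FILEGROUP FG_Orders;

ALTER DATABASE BigDb
ADD FILE (NAME = OrdersData,
          FILENAME = 'D:\Data\OrdersData.ndf',
          SIZE = 100GB)
TO FILEGROUP FG_Orders;

-- Rebuilding the clustered index onto the new filegroup moves the table there
CREATE UNIQUE CLUSTERED INDEX PK_Orders
ON dbo.Orders (OrderID)
WITH (DROP_EXISTING = ON)
ON FG_Orders;

-- Then reclaim the freed space from the primary file without moving pages
DBCC SHRINKFILE (1, TRUNCATEONLY);
```

TRUNCATEONLY only releases free space at the end of the file, so it avoids the page movement that makes a normal shrink slow.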

Very interesting: I tried again just now and it works (not fast, but...).
I think the problem may be related to the file system, not the SQL engine.
Maybe when disk active time is high, the shrink runs very slowly.
Thank you all, jeffw8713 in particular!

What is the purpose of partitioning your 'huge' tables? I doubt there is going to be any benefit to doing this unless the goal is to implement archive/purge processes for those 'big' tables.


Dear Jeffrey, thank you so much for treating me so nicely (sorry for my bad English).
Yes, that is exactly what I need. I have to keep 12 months of data, so with partitioning I can simply truncate a partition (one month). More than that, index maintenance (such as rebuilds) can be done on a specific partition, which is indeed very nice.
Thank you so much, Yossi Moses
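Both operations mentioned above have direct T-SQL forms. A sketch, reusing the hypothetical dbo.Orders table and PK_Orders index from earlier; the partition numbers are placeholders:

```sql
-- Drop the oldest month by truncating just that partition
-- (TRUNCATE ... WITH PARTITIONS requires SQL Server 2016 or later)
TRUNCATE TABLE dbo.Orders WITH (PARTITIONS (1));

-- Rebuild only one partition's index instead of the whole table
ALTER INDEX PK_Orders ON dbo.Orders
REBUILD PARTITION = 5;
```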

Thank you for the update - and those are good reasons for considering partitioning. If this were my project, I would see if I could get new storage and move the data there instead of shrinking the existing file. Shrinking a file can cause index fragmentation - but if new storage isn't possible, then your approach is probably the only way to go.
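Since a shrink moves pages from the end of the file into earlier free space, it tends to scramble index page order. One way to measure the damage afterwards is a sketch like this (threshold is an assumption, not a rule):

```sql
-- List indexes with heavy logical fragmentation after a shrink
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30;
```

Indexes that show up here can then be rebuilt (which also undoes the fragmentation the shrink introduced).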


Thank you very very much Jeffrey!