How Much Memory to Allocate for Heavy SSIS Usage

Hi. SQL Server 2014 Enterprise on Windows Server 2012, RAM = 132 GB.

I have allocated 124 GB of memory to SQL Server (via the max server memory setting), but SSIS ETL loads are running slow. Do SSIS processes use that 124 GB, or do they use the remaining 8 GB I have reserved for the operating system? I think SSIS uses the 124 GB, but I need confirmation. Thanks

SQL is a pig. It will take all the memory you give it.

124GB is too much to give SQL if the box has 132GB. You're likely starving Windows and other processes of memory.

Try giving SQL 108GB instead and see if everything runs faster.

The max server memory setting caps the memory SQL Server can set aside for its buffer pool: memory used for query execution, cached data pages, cached plans, etc. SSIS runs as an external process, so it is not covered by the max memory setting.
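To confirm or change that cap, sp_configure exposes it as 'max server memory (MB)'. It is an advanced option, and the value is specified in MB, not GB. A sketch using the 108GB figure suggested above:

```sql
-- Show advanced options so 'max server memory (MB)' is visible/settable
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- View the current cap (see the run_value column, in MB)
EXEC sp_configure 'max server memory (MB)';

-- Example: cap the buffer pool at 108 GB (108 * 1024 = 110592 MB)
EXEC sp_configure 'max server memory (MB)', 110592;
RECONFIGURE;
```

The change takes effect immediately; no restart is required, though SQL Server releases memory gradually rather than all at once.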

While the SQL Server engine may not use all of the memory allowed by the max value, what it does use it tends not to give back. That said, under exceptional load, memory demand from external processes can force SQL Server to trim its buffer memory.

The general rule of thumb I work with for a server with plenty of memory and only running 1 SQL instance is to keep 4GB back for the OS and 2GB for other processes. On a server with 128GB RAM that means I would set my max memory in SQL Server to 122GB.

Now SSIS can be a beast in that it wants to load most of the ETL data into memory to run faster. You don't want it struggling for memory (in the same way you don't want SQL Server running short either). For this reason we run SSIS on a completely separate server from our production databases. Obviously this requires extra licensing, so it may not be feasible for you.

I would suggest looking at what SQL Server is currently using (and check how long the instance has been running; a recently restarted instance won't have filled its buffers yet). This should give you an idea of how much it actually needs. You could then trim max memory to a few GB over the current value, which frees up the rest for your SSIS ETL process.
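For instance, sys.dm_os_process_memory shows what the engine is actually holding, and sys.dm_os_sys_info tells you when the instance last started:

```sql
-- Physical memory the SQL Server process is currently using
SELECT physical_memory_in_use_kb / 1024 AS memory_in_use_mb,
       memory_utilization_percentage
FROM sys.dm_os_process_memory;

-- How long the instance has been running (a recent restart
-- means the buffer pool may not have warmed up yet)
SELECT sqlserver_start_time
FROM sys.dm_os_sys_info;
```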

Keep your eye on the memory stats, though; you don't want to swing the other way and find you are getting heavy IO because SQL Server doesn't have enough memory to cache data.

Another thing we have done is run our SSIS packages only on a dedicated "BI" server that hosts things such as SSRS, SSIS, the Power BI gateway, etc.

You also have to allow space for CLR, linked servers (for the work they do outside SQL Server's memory), and so on.

Again, I'd recommend no more than 108GB to start. Then check whether SQL Server is showing any memory pressure; if it is, give it another 2GB or so and check again.
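Two quick pressure checks from sys.dm_os_performance_counters: Page Life Expectancy (a low and falling value suggests buffer pool pressure; the old rule of thumb was 300 seconds, though on a box this size you'd expect far more) and Memory Grants Pending (anything consistently above 0 means queries are waiting on memory):

```sql
SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE (counter_name = 'Page life expectancy'
       AND object_name LIKE '%Buffer Manager%')
   OR (counter_name = 'Memory Grants Pending'
       AND object_name LIKE '%Memory Manager%');
```

Sample these over time rather than once; a single snapshot right after a big load can be misleading.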

If you just throw all the memory possible at it, you'll never know how much spare RAM you have for later, when SQL's needs inevitably grow.

Go through the following link and understand how much is too much memory for SQL Server:

Thanks for all the comments, everyone. We do have SSIS running on a dedicated server. The ETLs pull data from external sources. I suppose I will reduce the max memory setting for MSSQL to 108 GB or so.