OutOfMemoryException when importing some files but not all

Error message below. This only started happening recently. Some files import and others do not.

============================================================
******************************************************************************
ClearTrace Error
******************************************************************************
Version: 1.0.51
Last Exception: Exception of type 'System.OutOfMemoryException' was thrown.
Last Target Site: System.Threading.Tasks.Task TdsExecuteRPC(System.Data.SqlClient.SqlCommand, System.Data.SqlClient._SqlRPC[], Int32, Boolean, System.Data.Sql.SqlNotificationRequest, System.Data.SqlClient.TdsParserStateObject, Boolean, Boolean, System.Threading.Tasks.TaskCompletionSource`1[System.Object], Int32, Int32)
******************************************************************************

Exception Type: System.Reflection.TargetInvocationException
Exception Message: Exception has been thrown by the target of an invocation.
   at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor)
   at System.Reflection.RuntimeMethodInfo.UnsafeInvokeInternal(Object obj, Object[] parameters, Object[] arguments)
   at System.Delegate.DynamicInvokeImpl(Object[] args)
   at System.Windows.Forms.Control.InvokeMarshaledCallbackDo(ThreadMethodEntry tme)
   at System.Windows.Forms.Control.InvokeMarshaledCallbackHelper(Object obj)
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
   at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
   at System.Windows.Forms.Control.InvokeMarshaledCallback(ThreadMethodEntry tme)
   at System.Windows.Forms.Control.InvokeMarshaledCallbacks()
   at System.Windows.Forms.Control.WndProc(Message& m)
   at System.Windows.Forms.Control.ControlNativeWindow.OnMessage(Message& m)
   at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m)
   at System.Windows.Forms.NativeWindow.DebuggableCallback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)
   at System.Windows.Forms.UnsafeNativeMethods.DispatchMessageW(MSG& msg)
   at System.Windows.Forms.Application.ComponentManager.System.Windows.Forms.UnsafeNativeMethods.IMsoComponentManager.FPushMessageLoop(IntPtr dwComponentID, Int32 reason, Int32 pvLoopData)
   at System.Windows.Forms.Application.ThreadContext.RunMessageLoopInner(Int32 reason, ApplicationContext context)
   at System.Windows.Forms.Application.ThreadContext.RunMessageLoop(Int32 reason, ApplicationContext context)
   at System.Windows.Forms.Application.Run(Form mainForm)
   at ClearTraceGui.Program.Main()

================================
Inner Exception
================================
Exception Type: System.Reflection.TargetInvocationException
Exception Message: An exception occurred during the operation, making the result invalid.  Check InnerException for exception details.
   at System.ComponentModel.AsyncCompletedEventArgs.RaiseExceptionIfNecessary()
   at ClearTraceGui.MainForm.backgroundWorker1_RunWorkerCompleted(Object sender, RunWorkerCompletedEventArgs e)
   at System.ComponentModel.BackgroundWorker.OnRunWorkerCompleted(RunWorkerCompletedEventArgs e)
   at System.ComponentModel.BackgroundWorker.AsyncOperationCompleted(Object arg)

================================
Inner Exception
================================
Exception Type: System.OutOfMemoryException
Exception Message: Exception of type 'System.OutOfMemoryException' was thrown.
   at System.Data.SqlClient.TdsParser.TdsExecuteRPC(SqlCommand cmd, _SqlRPC[] rpcArray, Int32 timeout, Boolean inSchema, SqlNotificationRequest notificationRequest, TdsParserStateObject stateObj, Boolean isCommandProc, Boolean sync, TaskCompletionSource`1 completion, Int32 startRpc, Int32 startParam)
   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)
   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)
   at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(TaskCompletionSource`1 completion, String methodName, Boolean sendToPipe, Int32 timeout, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)
   at System.Data.SqlClient.SqlCommand.ExecuteNonQuery()
   at ClearData.ClearTrace.Helper.ExecuteNonQuery(SqlConnection sqlConnection, SqlCommand cmd)
   at ClearData.ClearTrace.Helper.ExecuteNonQuery(String connectionString, SqlCommand cmd)
   at ClearData.ClearTrace.TextDataDimension.AddItem(String connectionString, Int64 TKey, TextDataDimensionItem TValue)
   at ClearData.ClearTrace.SqlTrace.ProcessFiles(BackgroundWorker worker, DoWorkEventArgs eventArgs)
   at ClearTraceGui.MainForm.backgroundWorker1_DoWork(Object sender, DoWorkEventArgs e)
   at System.ComponentModel.BackgroundWorker.OnDoWork(DoWorkEventArgs e)
   at System.ComponentModel.BackgroundWorker.WorkerThreadStart(Object argument)

This suggests you are running out of memory. You do not say how you are doing the import, but I suspect you need to set the batch size; maybe start with something like 20,000 rows.

Thank you for the reply. The out-of-memory exception seems inaccurate, even though that is what ClearTrace is reporting. I've monitored memory and CPU; both are far from maxing out during the import. ClearTrace will process the file before and the file after a failing file with no problem.

How is the batch size set? I don't see a setting for that in the GUI or the command line.

I'm going to tag @graz here since it's a ClearTrace error; he'd be the best person to answer.

You have still not said how you are doing the import, so search the documentation for whatever tool you are using. For example:

  - BCP: use the -b batch_size switch
  - T-SQL BULK INSERT: use the BATCHSIZE option
  - System.Data.SqlClient: use the SqlBulkCopy.BatchSize property (sketched below)
  - etc.
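
If the import goes through System.Data.SqlClient, a minimal sketch of the SqlBulkCopy case looks like this (the table name and source reader are placeholders, not ClearTrace's actual code):

    using System.Data;
    using System.Data.SqlClient;

    // Hypothetical example: stream rows to SQL Server in fixed-size batches
    // so the client never buffers an entire file's worth of rows at once.
    static void BulkLoad(string connectionString, IDataReader source)
    {
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "dbo.TraceRows"; // placeholder table
            bulkCopy.BatchSize = 20000;   // commit every 20,000 rows
            bulkCopy.BulkCopyTimeout = 0; // no timeout for long-running loads
            bulkCopy.WriteToServer(source);
        }
    }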

How do you monitor memory: Windows memory or SQL Server memory? If you only allow SQL Server to use 32 GB of the machine's 64 GB, Windows memory can look far from maxed out while SQL Server itself is out of memory.
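
If you want to check that from the client side, the server's own DMVs show both the configured cap and current usage. A sketch, assuming VIEW SERVER STATE permission:

    using System;
    using System.Data.SqlClient;

    // Hypothetical check: compare SQL Server's configured memory cap
    // ('max server memory') with what the process is actually using.
    static void ShowSqlMemory(string connectionString)
    {
        const string sql = @"
            SELECT c.value_in_use                      AS cap_mb,
                   pm.physical_memory_in_use_kb / 1024 AS in_use_mb
            FROM sys.configurations c
            CROSS JOIN sys.dm_os_process_memory pm
            WHERE c.name = 'max server memory (MB)';";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine($"cap: {reader["cap_mb"]} MB, in use: {reader["in_use_mb"]} MB");
            }
        }
    }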

Yes, I have monitored memory, and neither Windows nor SQL Server memory comes near being maxed out during the import. The number of failing files varies from run to run: last time, 23 of 71 files failed to process with the 'out of memory exception' message, while the run before that all but 10 files processed successfully. Each failed file fails at a different number of rows read. There seems to be no pattern, and this only started happening two or three months ago.

Have you checked which other processes are running when you experience the memory errors? How many SPIDs are active? Do you see PLE (page life expectancy) drops? Which locks are being held? It could be that you are blocking yourself.

We are using a server dedicated to ClearTrace processing; no other activity is occurring. Files are processed one at a time. Some files fail early in the reading process and others near the end. It never happens twice in a row: at minimum, ClearTrace will successfully read the next file after an error, and sometimes it will process the next ten files before the next error.

Try checking for deadlocks.

The problem is with a .NET Windows Forms client application, not SQL Server.

So the error here is unlikely to be with SQL Server and more likely to be in the ClearTrace code. ClearTrace reads trace or Extended Events files, summarizes them, and writes the results back to SQL Server.

  1. You say it is intermittent?
  2. If you process one file does it ever happen?
  3. How big are the trace/XE files?
  4. How big is your ClearTrace database?
  5. Are all of the files that error over a certain size?

This is likely in the code that bulk inserts the results, although the stack trace is a little ambiguous.

My only quick advice is to test with smaller files. A second possibility is that some of the lookup tables may be quite large, and that could be causing ClearTrace to use up memory.

  1. By intermittent I mean sometimes no file imports fail, sometimes it's five to ten, and sometimes it's 20+.

  2. If I reprocess a failed file by itself, it still fails.

  3. Our trace files are 5 GB. I tried 4 GB files and encountered the same issue.

  4. The ClearTrace database is currently 14 GB.

  5. With the 5 GB files, the failed files ranged from 4.86 GB to 5.12 GB. With the 4 GB files, they ranged from 1.25 GB to 4.09 GB.

The largest table in the ClearTrace DB is CTTraceSummary at 8.77 GB with a row count of 49,524,276.
The next largest is CTTextData at 1.85 GB with a row count of 169,090.

All other ClearTrace DB tables are very small.

That is the key part of the exception. ClearTrace is building a hash table of all the "fixed up" SQL statements it sees, which generally means all the parameterized SQL statements, and that table is held in memory.

My guess is that even after parameterizing, you have LOTS of unique SQL statements. That means the hash table is very large. Does that seem true?
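
To picture why that matters, here is a rough sketch of the pattern (illustrative only, with made-up names, not ClearTrace's actual code):

    using System.Collections.Generic;

    // Illustrative only: a lookup that maps each normalized ("fixed up")
    // SQL statement to a surrogate key. If the workload produces many
    // unique statements even after parameter values are stripped, this
    // dictionary grows for the whole run and never shrinks.
    class TextDataCache
    {
        private readonly Dictionary<string, long> _keys = new Dictionary<string, long>();
        private long _nextKey;

        public long GetOrAddKey(string normalizedSql)
        {
            long key;
            if (!_keys.TryGetValue(normalizedSql, out key))
            {
                key = ++_nextKey;
                _keys.Add(normalizedSql, key); // held in memory until the app exits
            }
            return key;
        }
    }

If most statements stay unique even after the literal values are stripped, every file adds new entries and the table only grows.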

I think there is some housekeeping done between files that cleans that up. Can you try smaller trace/XE files? I know that's not convenient, but I think it will fix the issue.

That does seem possible. I've tested with 4 GB files, but that didn't solve it. I'll try 3 GB next.

Here is an article I wrote a LONG time ago about processing large trace files:

ClearTrace Performance on 170GB of Trace Files (sqlteam.com)

I was using 200 MB trace files.

I know that creates a LOT more trace files. Let me know how the smaller files go.

As I think about this more, it makes more sense. I'm not sure which version of SMO finally supported 64-bit, but I'm not convinced it was a version that still supported trace. That would mean ClearTrace runs as a 32-bit process with a memory cap of roughly 2 GB, and that is likely what you are hitting.
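
One quick way to confirm that theory is to check whether the importer is actually running as a 32-bit process; a sketch you could drop into any .NET app:

    using System;

    // A 32-bit process is capped at roughly 2 GB of user address space,
    // no matter how much physical memory the machine has free.
    Console.WriteLine($"64-bit process: {Environment.Is64BitProcess}");
    Console.WriteLine($"64-bit OS:      {Environment.Is64BitOperatingSystem}");
    Console.WriteLine($"Working set:    {Environment.WorkingSet / (1024 * 1024)} MB");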

We did traces with 1 GB files and 2 GB files. For the 1 GB files, 1 file out of 38 failed to import. For the 2 GB files, 37 out of 262 failed to import. Smaller files don't seem to help.

Well, the success ratio is better with the smaller files (1 of 38 versus 37 of 262). Have you tried files under 1 GB?

We had almost 300 2 GB files. If we ran 1 GB files for the full two hours, we'd probably have 500+ files. Going smaller is not practical for us.