How to improve ADO performance


I am using ADO/Visual C++ 2008 to access SQL Server 2008

I mainly use the Execute method of the ADO Connection object to
perform the following tasks against SQL Server:

Create 40,000 tables by executing "CREATE TABLE …"
statements.

Insert about 3,000,000 records into these tables by executing
"INSERT INTO …" statements. Each invocation of the Execute method runs only one
SQL statement.

Now I find my program is very slow, and the bottleneck is
the Execute method, which accounts for about
90% of the total run time.

So I want to improve the performance.

I am considering the following strategies:

Can the Execute method be invoked asynchronously?
If so, then after I invoke Execute to submit the SQL query to the SQL
Server engine, the method could return before the query completes, and I could
run other parts of the program in the meantime, which would save a lot of time.
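On the asynchronous question: ADO does support this via the adAsyncExecute option to Execute (and adExecuteNoRecords is worth passing for INSERT statements in any case, since it skips creating a Recordset). A platform-neutral sketch of the overlap idea, using std::async and a hypothetical stand-in for the blocking call (executeStatement below is not a real ADO API, just a placeholder for illustration):

```cpp
#include <chrono>
#include <future>
#include <string>
#include <thread>

// Hypothetical stand-in for a blocking ADO Execute call. In the real
// program this body would be something like
//   pConn->Execute(sql, &rowsAffected, adCmdText | adExecuteNoRecords);
// run on a worker thread (or replaced entirely by passing
// adAsyncExecute to Execute itself).
int executeStatement(const std::string& sql)
{
    (void)sql; // the statement text is unused in this simulation
    std::this_thread::sleep_for(std::chrono::milliseconds(20)); // simulate server round trip
    return 1; // pretend one row was affected
}

// Submit the statement on a worker thread; the caller keeps decoding
// the next chunk of the binary file while the server works, and only
// blocks when it calls .get() on the returned future.
std::future<int> executeAsync(std::string sql)
{
    return std::async(std::launch::async, executeStatement, std::move(sql));
}
```

Note that one Connection object cannot run two statements at once, so true pipelining would need either adAsyncExecute or a second connection for the worker thread.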

Another solution is to merge several SQL
statements and submit them together in one invocation of the Execute method.
But when I tested this, it took longer than executing one statement at a time,
so this approach does not seem feasible either.
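On that batching test: simply concatenating many separate INSERT statements still makes the server parse one statement per row. What usually helps on SQL Server is a single multi-row VALUES list (capped at 1000 rows per statement), ideally wrapped in one transaction per batch. A sketch of building such statements, with hypothetical table/column names and value tuples assumed to be already escaped and formatted:

```cpp
#include <algorithm>
#include <cstddef>
#include <sstream>
#include <string>
#include <vector>

// Build multi-row INSERT statements: one statement per batch of up to
// maxRows rows (SQL Server allows at most 1000 rows in one VALUES
// list). Each entry of rowTuples is a pre-formatted tuple such as
// "(1, 'a')"; escaping is the caller's responsibility.
std::vector<std::string> buildBatchInserts(
    const std::string& table,
    const std::string& columns,
    const std::vector<std::string>& rowTuples,
    std::size_t maxRows = 1000)
{
    std::vector<std::string> statements;
    for (std::size_t i = 0; i < rowTuples.size(); i += maxRows) {
        std::ostringstream sql;
        sql << "INSERT INTO " << table << " (" << columns << ") VALUES ";
        const std::size_t end = std::min(i + maxRows, rowTuples.size());
        for (std::size_t j = i; j < end; ++j) {
            if (j > i) sql << ", ";
            sql << rowTuples[j];
        }
        statements.push_back(sql.str());
    }
    return statements;
}
```

Each returned statement would then be passed to a single Execute call, ideally between explicit BEGIN TRANSACTION / COMMIT statements so the log is flushed once per batch rather than once per row.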

Please tell me whether the above strategies are feasible, and
whether there are other solutions to my problem.

The best way to improve the "ADO performance" in this case would be to stop using ADO to do it all. Obviously, you have some sort of algorithm to determine which tables should be created (seriously??? 40,000 tables??? What are they for and why do you think you need 40,000 of them???), as well as an algorithm for adding the 3,000,000 rows of data to them. The killer is that you have RBAR on steroids happening: you're not only doing one bloody insert at a time, you've invoked Execute 3 million times.

Stop using ADO for this. Write a stored procedure to do it all using SET-based methodologies. Your run time will drop to seconds, literally. I know this because I did some work for a company that fell into the same trap as you. I got them to rewrite it as a stored procedure that took just a single line of 8 parameters for input, and it dropped their 2-4 hour run to sub-second times (they only needed 1 table with 250 thousand rows).

If you don't know how to convert it all to run in a stored procedure, I strongly suggest you find someone who does. A day of consultant time will be well worth not disappointing bosses and customers.

Hi, JeffModen,

Thank you very much for your suggestion.

However, the data to be added come from a source binary file that I need C++ code to decode. The task is too complex for a stored procedure to handle.

Hi, Alan here. Here are some tips that can help you improve the performance of SQL Server with ADO:
1. The database design should be good.
2. Understand how ADO retrieves data.
3. Check the ADO Recordset object's cursor and concurrency settings.
You should go through these links to learn more about SQL Server performance.

Possibly, but you'd be surprised what you can actually do in SQL Server, even with so-called binary data. What does the data contain, and what is the process that you don't believe can be done in a stored procedure?

And do you know what SQLCLR is, if it truly is impossible to do in T-SQL?

I would be inclined to use C# and SQLCLR.

Here is an old, but useful, article by Solomon Rutzky:

I have similar thoughts about SQL as @JeffModen - that you can do almost everything you want to do using T-SQL and, if required, SQLCLR. Some things could be done better outside of SQL, but they can still be done using SQL. Now, I will freely admit that I think this way because I have a hammer, and so everything looks like a nail to me.

For the OP, if the computational logic needs to be done in the C++ code, one possibility might be to write the output to a text file and then use SSIS or BULK INSERT to load that data into the SQL database. But having to create 40,000 tables suggests there is room for optimization/improvement/consolidation there.
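A minimal sketch of the text-file half of this approach, assuming a hypothetical Record struct that stands in for whatever the C++ binary decoder actually produces. The file layout matches a BULK INSERT with a pipe field terminator, and field values are assumed not to contain the delimiter:

```cpp
#include <fstream>
#include <string>
#include <vector>

// Hypothetical decoded record; in the real program this would come
// from the C++ binary-file decoder.
struct Record {
    int id;
    std::string name;
};

// Write records as pipe-delimited rows so SQL Server can load them
// with something like:
//   BULK INSERT dbo.TargetTable FROM '\\server\share\out.txt'
//   WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');
// (the table name and UNC path are illustrative placeholders).
bool writeBulkFile(const std::string& path, const std::vector<Record>& rows)
{
    std::ofstream out(path);
    if (!out) return false;
    for (const Record& r : rows)
        out << r.id << '|' << r.name << '\n';
    return static_cast<bool>(out);
}
```

The file must be written to a location the SQL Server service account can read (hence the UNC path), which is the one operational wrinkle this approach adds.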

Heh... wrong hammer there. If you look at his article, you do all of that setup, have to worry about memory, performance, etc., etc., and to do what? Import from a text file. Most applications, especially those that do things like Invoice/Invoice Detail, do NOT use any kind of file. They just pass the information.

You'll also notice that he raced against BCP, and it beat ALL of the contrived methods for performance. The only reason it didn't beat everything else for memory is because he likely loaded the whole file at once instead of using a batch size.

AND, BCP is NOT the fastest kid on the block; BULK INSERT is. And guess how much programming one would have to do to use BULK INSERT? One tiny little proc that lets you pass a UNC path to the system, where a bit of dynamic code does the BULK INSERT.

Not doubting your prowess at C#, but it would take someone a heck of a long time to write code as fast as BULK INSERT with all the features that come with it, such as checking each row as it's inserted into whatever table and, if a row's not right, being able to sequester such bad rows in an error file without interrupting the rest of the import.

My recommendation would be to understand that you're actually trying to drive a nail and, for that, you actually do need a hammer. Then, you need to realize that you don't have to dig iron ore from a mine, smelt it, form the hammer head, temper it, cut down a tree, build a handle for the hammer, put it all together, and then find out that you also need the hammer to pull a nail but you didn't build a claw into it. :wink:

Fair point, but a bit of metallurgy and woodwork can be quite interesting. It is also sometimes necessary if access to the store is restricted.

What does "access to the store" have to do with anything here? If access is restricted, it will affect everything. Playing with metallurgy and woodwork is always a great thing, so long as people realize that building wooden rocket ships and metal toilet paper is probably not the best of ideas.