SQLTeam.com | Weblogs | Forums

Failure to execute a SQL command asynchronously

sql2008

#1

Hi,

I want to invoke a SQL command “INSERT INTO …” 3,000,000 times to insert 3,000,000 records via ADO.

Besides the BULK INSERT command, I also want to try the ADO Command object to execute the SQL command asynchronously.

So I do as follows:

  _bstr_t strSQLChange("INSERT INTO MyTable VALUES (1, 2, 3);");

  _ConnectionPtr pConnection = NULL;
  _CommandPtr pCmdChange = NULL;

  // Open connection.
  pConnection.CreateInstance(__uuidof(Connection));
  pConnection->Open(strCnn, "", "", adConnectUnspecified);

  // Create command object.
  pCmdChange.CreateInstance(__uuidof(Command));
  pCmdChange->ActiveConnection = pConnection;
  pCmdChange->CommandText = strSQLChange;

  for (UINT uIndex = 0; uIndex < 3000000; uIndex++)
     pCmdChange->Execute(NULL, NULL, adCmdText | adAsyncExecute | adExecuteNoRecords);

The first execution is OK, but the second one causes the following error:

“Operation cannot be performed while connecting asynchronously”

Does that mean the ADO command object cannot execute the SQL command asynchronously?
Thanks
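For what it's worth, that error usually means the previous asynchronous Execute has not finished yet when the next one is issued. A minimal sketch of one possible fix (an assumption, not from the post: it presumes the usual msado15.dll #import on Windows and simply polls the command's State until it goes idle):

```cpp
// Sketch only: assumes Windows and the standard ADO #import, e.g.
// #import "C:\Program Files\Common Files\System\ado\msado15.dll" \
//     no_namespace rename("EOF", "EndOfFile")
// #include <windows.h>

void ExecuteManyAsync(_CommandPtr pCmdChange)
{
    for (UINT uIndex = 0; uIndex < 3000000; uIndex++)
    {
        pCmdChange->Execute(NULL, NULL,
            adCmdText | adAsyncExecute | adExecuteNoRecords);

        // Wait until the command is no longer executing before re-issuing;
        // otherwise the second Execute hits the error above.
        while (pCmdChange->State & adStateExecuting)
            Sleep(0);   // yield the time slice; an event callback would be better
    }
}
```

Of course, waiting like this serialises the statements, so adAsyncExecute gains little throughput here; it mainly keeps the calling thread free to do other work.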


#2

Not the answer to your question, but this will be horribly slow. Maybe it is a performance test of some sort? If not, I would program it a different way and perform the INSERT in "bulk".


#3

Yes, I am trying to do a bulk insert. But since the bulk insert has some problems, we will also try the non-bulk way in the meantime. Of course, the actual rows to be inserted are not identical to each other; they will be calculated with some algorithms. I just used that statement as a simple demonstration.
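Since the real rows will differ, one option is a parameterized command, so the SQL text is prepared once and only the values change per row. A sketch only (same msado15.dll #import assumed as in the original post; the parameter name "c1" and the integer column types are placeholders):

```cpp
// Sketch: parameterized INSERT; table layout and types are placeholders.
_CommandPtr pCmd;
pCmd.CreateInstance(__uuidof(Command));
pCmd->ActiveConnection = pConnection;
pCmd->CommandText = "INSERT INTO MyTable VALUES (?, ?, ?)";

// One parameter per "?"; repeat for the second and third columns.
_ParameterPtr p1 = pCmd->CreateParameter(_bstr_t("c1"), adInteger,
                                         adParamInput, sizeof(long),
                                         _variant_t(0L));
pCmd->Parameters->Append(p1);

for (UINT uIndex = 0; uIndex < 3000000; uIndex++)
{
    // Values computed by your algorithm go here.
    p1->Value = _variant_t((long)uIndex);
    pCmd->Execute(NULL, NULL, adCmdText | adExecuteNoRecords);
}
```

This runs synchronously; it avoids re-parsing the statement text each time rather than addressing the async error.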


#4

I'm not sure that ASYNC will help you much (though it might, I don't have experience of it) - SQL will still be dancing around inserting row-by-agonising-row. If you can't get the ASYNC syntax sorted out easily, perhaps use multiple connections and round-robin the connection on each INSERT?
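The round-robin idea might look something like this (a sketch only, under the same #import assumptions as the original post; NUM_CONN is an arbitrary choice):

```cpp
// Sketch: spread async INSERTs across several connections round-robin.
const UINT NUM_CONN = 4;                 // arbitrary; tune to taste
_ConnectionPtr conns[NUM_CONN];
_CommandPtr    cmds[NUM_CONN];

for (UINT i = 0; i < NUM_CONN; i++)
{
    conns[i].CreateInstance(__uuidof(Connection));
    conns[i]->Open(strCnn, "", "", adConnectUnspecified);
    cmds[i].CreateInstance(__uuidof(Command));
    cmds[i]->ActiveConnection = conns[i];
    cmds[i]->CommandText = strSQLChange;
}

for (UINT uIndex = 0; uIndex < 3000000; uIndex++)
{
    // Each connection gets every NUM_CONN-th INSERT, so one asynchronous
    // Execute can overlap with the others on their own connections.
    cmds[uIndex % NUM_CONN]->Execute(NULL, NULL,
        adCmdText | adAsyncExecute | adExecuteNoRecords);
    // Note: the chosen command must still be idle before it is reused,
    // so some wait-for-completion check is needed per slot.
}
```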

My priorities (in case it helps, or just for discussion :slightly_smiling: ) would be:

Organise the source data into Clustered Index (ascending) order

Set the clustered index to 100% fill

(I suppose it will still split as each index block becomes full; I'm not sure what actually happens then when inserting an increasing key, so perhaps an index rebuild is needed before there is any benefit. However, when your Bulk Insert is ready, a HINT can be given that the data is pre-sorted on the clustered index, which will help.)

DROP all secondary indexes (and recreate them after the import)

Could you output the data to a delimited file, instead of 3M x INSERTS? and then just BCP in the file? That is where I would want to wind up, and (to me) the BCP part seems straightforward. My BCP command (not included Clustered Index Sort hint) would be something like:

BCP MyDatabaseName.dbo.MyTable 
    in x:\path\MySourceDataFilename 
    -e x:\MyLogPath\MyTable.ERR 
    -o x:\MyLogPath\MyTable.OUT 
    -S ServerName        (use "." for the currently connected server)
    -T -E -b 100000
    >>x:\MyLogPath\MyTable.ER2

If you use TAB for the separator you should be able to use -c or -w (character or unicode) for the format. We use -N (Native Mode), but that (AFAIK) requires exporting the data from a SQL database, also using -N, which avoids needing a separator and thus allows columns to include TABs and/or line breaks etc.

SSIS would be an alternative to BCP - I'm an old dinosaur and familiar with BCP, but the GUI interface of SSIS probably appeals to others. I just find it harder to diagnose when something doesn't work, but it's probably improved in the decades since I last used it in anger!