Help please with updating large amounts of data on SQL Server 2000

Post by Georgie » Sat, 29 Apr 2006 20:30:02


Hi all,

I am writing an application to look through an Interbase back office
database, detect any changes (by checking a timestamp) since the last
'synch', and write them to the SQL Server 2000 DB. In general it works fine
with smaller quantities of data, but when I get into the 10,000s of records
it all grinds to a halt.

I'm assuming this is a transaction log problem. I've tried batching the
records into transactions (typically 1,000 updates each), but this doesn't
seem to speed things up much.
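
For reference, each batch looks roughly like this (a simplified sketch; the
table and column names are made up, not my actual schema):

    BEGIN TRANSACTION
        -- one parameterised UPDATE per changed row detected in Interbase,
        -- issued by the application roughly 1,000 times per batch
        UPDATE Customers
        SET Name = ?, LastModified = ?
        WHERE CustomerID = ?
    COMMIT TRANSACTION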

To do the initial load of the database I used a DTS package - and that was
astoundingly fast. But in live use I need to trickle changes through rather
than truncate the table and bulk copy it.

Any ideas where I'm going wrong? Any help greatly appreciated!

Help please with updating large amounts of data on SQL Server 2000

Post by Omnibuzz » Sat, 29 Apr 2006 21:17:01

Can you tell us how you are updating in batches of 1,000: using cursors, or
from the application calling stored procedures? Also, try backing up the log
before you run this mass update; see the sketch below.
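
Something along these lines, with a placeholder database name and path:

    -- back up (and thereby truncate) the transaction log so it doesn't
    -- keep growing while the mass update runs
    BACKUP LOG MyDatabase TO DISK = 'C:\Backups\MyDatabase_log.bak'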


Help please with updating large amounts of data on SQL Server 2000

Post by Dan Guzman » Sat, 29 Apr 2006 21:22:01

The fastest way to get large volumes of data into SQL Server is to use a bulk
insert technique. DTS uses the SQLOLEDB IRowsetFastLoad interface, and that
API can be used from C++ (and perhaps Delphi).

Other bulk insert methods:

SQL Server bcp command-line utility
BULK INSERT Transact-SQL statement
ODBC BCP API
SqlBulkCopy (.NET 2.0)
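
For example, a minimal BULK INSERT statement (the table, file path, and
delimiters below are placeholders; match them to your export file):

    BULK INSERT MyDatabase.dbo.Customers
    FROM 'C:\exports\customers.txt'
    WITH (
        FIELDTERMINATOR = ',',  -- column delimiter in the data file
        ROWTERMINATOR = '\n',   -- row delimiter
        TABLOCK                 -- table lock; required for minimally logged loads
    )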

--
Hope this helps.

Dan Guzman
SQL Server MVP