Importing a huge amount of data

Post by Tim Almon » Thu, 25 Sep 2003 00:24:22


I am currently looking at making a C# program run more efficiently.

The program currently uses the #ziplib (SharpZipLib) classes to open a zip
file, read each of the records from it, write them to a file, and then run a
DTS package against that file.
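
For anyone following along, the pipeline described above might look roughly
like this, assuming SharpZipLib's ZipInputStream (the archive name and buffer
size are illustrative only):

```csharp
using System.IO;
using ICSharpCode.SharpZipLib.Zip;

class ZipToDiskImport
{
    static void Main()
    {
        byte[] buffer = new byte[8192];

        // Extract every entry in the archive to disk.
        using (ZipInputStream zis = new ZipInputStream(File.OpenRead("import.zip")))
        {
            ZipEntry entry;
            while ((entry = zis.GetNextEntry()) != null)
            {
                using (FileStream fs = File.Create(entry.Name))
                {
                    int read;
                    while ((read = zis.Read(buffer, 0, buffer.Length)) > 0)
                        fs.Write(buffer, 0, read);
                }
            }
        }
        // ...and then the DTS package is run against the extracted file(s).
    }
}
```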

I'm fairly sure that writing the unzipped file out to disk is not very
efficient, and that there must be a way of doing something like building a
set of record objects in memory and then committing the whole set to the
database.
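
One way to sketch that idea: read the records straight off the decompression
stream and commit them in a single transaction, skipping the intermediate
file entirely. The table name, column names, record layout, and connection
string below are all invented for illustration:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;
using ICSharpCode.SharpZipLib.Zip;

class ZipToDatabase
{
    static void Main()
    {
        using (ZipInputStream zis = new ZipInputStream(File.OpenRead("import.zip")))
        using (SqlConnection conn = new SqlConnection(
            "server=(local);database=Import;trusted_connection=yes"))
        {
            conn.Open();
            zis.GetNextEntry(); // position on the first (only) entry

            // Read text records directly from the decompression stream.
            StreamReader reader = new StreamReader(zis);

            SqlTransaction tx = conn.BeginTransaction();
            SqlCommand cmd = new SqlCommand(
                "INSERT INTO ImportRecords (Field1, Field2) VALUES (@f1, @f2)",
                conn, tx);
            cmd.Parameters.Add("@f1", SqlDbType.VarChar, 50);
            cmd.Parameters.Add("@f2", SqlDbType.VarChar, 50);

            string line;
            while ((line = reader.ReadLine()) != null)
            {
                string[] fields = line.Split(','); // assumes comma-delimited records
                cmd.Parameters["@f1"].Value = fields[0];
                cmd.Parameters["@f2"].Value = fields[1];
                cmd.ExecuteNonQuery();
            }
            tx.Commit(); // commit the whole set at once
        }
    }
}
```

Whether row-at-a-time inserts like this actually beat writing the file and
letting DTS bulk-load it is exactly the question at issue here.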

Anyone done this?

Importing a huge amount of data

Post by Bill » Thu, 25 Sep 2003 00:44:02

No, I expect it's as efficient as you can get (or very nearly so). None of
the data access interfaces are designed to do bulk copy--they are query
interfaces. I would focus on making the process of storing the data in the
zip file and rehydrating it as fast as possible.

--
____________________________________
Bill Vaughn
MVP, hRD
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________




Importing a huge amount of data

Post by Jerry » Thu, 25 Sep 2003 02:21:34

What is the native form of the file in the zip file? Maybe you can just
unzip it and run the DTS package against this file.
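
If the native format is something DTS can read directly (a delimited text
file, say), the whole job reduces to extracting the entry and shelling out to
dtsrun, the SQL Server 2000 package runner. Server and package names here are
placeholders:

```csharp
using System.Diagnostics;

class RunDts
{
    static void Main()
    {
        // /S = server, /N = package name, /E = trusted connection.
        ProcessStartInfo psi = new ProcessStartInfo(
            "dtsrun", "/S (local) /N ImportPackage /E");
        psi.UseShellExecute = false;

        using (Process p = Process.Start(psi))
        {
            p.WaitForExit();
        }
    }
}
```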


