Minor database help needed (ADO)

Post by dgk » Mon, 11 Jul 2005 11:55:29


I need to update 700,000 records in an Access database with data from
another database, which is in a flat comma delimited file. Is it
better to pull back all the Access records at one time and update them
and then update the database, or open the recordset over and over,
each time selecting a single record?

Thanks.
 
 
 

Minor database help needed (ADO)

Post by » Mon, 11 Jul 2005 12:52:06

Ask for help on the XML newsgroups; they might be able to help.
XML is used to store databases in a format that nearly all database programs
can read.

Hope it helps!
Joppe

"dgk" < XXXX@XXXXX.COM > kirjoitti
viestiss? XXXX@XXXXX.COM ...

 
 
 

Minor database help needed (ADO)

Post by spacewar » Mon, 11 Jul 2005 14:11:48

Since you have that many records to update, I think opening the
recordset over and over will incur too much overhead.

In case of any doubt, you could always time both techniques. :-)
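Timing both approaches is easy to do. The thread is about VB/ADO, but the idea is language-agnostic; here is a minimal Python sketch of a timing harness, where the two strategy functions are hypothetical stand-ins for whatever the real update code does:

```python
import time

def time_it(label, fn):
    """Run fn once and report elapsed wall-clock time in seconds."""
    start = time.perf_counter()
    fn()
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.2f}s")
    return elapsed

# Hypothetical stand-ins for the two strategies being compared.
def update_one_big_recordset():
    pass  # open one recordset covering all rows, update in place

def update_one_record_at_a_time():
    pass  # open and close a one-row recordset per record

time_it("single large recordset", update_one_big_recordset)
time_it("record-by-record", update_one_record_at_a_time)
```

With the real update code plugged in, running this once against a copy of the database settles the question empirically.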
 
 
 

Minor database help needed (ADO)

Post by Steve Gerr » Mon, 11 Jul 2005 15:40:25


Neither :)

The number that floats around is that 5,000 to 10,000 records at a time
works best. Too few at a time takes too long; too many at a time uses
too much memory.

I would import the CSV file first, and make it an Access table. This will let
you wrestle with any data conversion or missing data issues that might arise,
and you can also index it.

Assuming you have a key in both tables by which to match them up for the update,
you can make an update query that accepts two parameters: a lowest key and
highest key, for selecting the range to update. Then run the query repeatedly,
entering 1 and 5000, then 5001 and 10000, 10001 and 15000, etc. This can be done
in code, of course, with a nice little status bar update showing progress...
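The range-batching idea above can be sketched in a few lines. Access itself isn't scriptable here, so this Python example uses an in-memory sqlite3 database as a stand-in; the table name, column names, and batch size are illustrative assumptions, but the technique (one UPDATE per key range rather than one giant statement or one per row) is the same:

```python
import sqlite3

BATCH = 5000  # records per update query, per the 5,000-10,000 rule of thumb

def update_in_ranges(conn, max_key, batch=BATCH):
    """Run one UPDATE per key range [lo, hi] instead of one giant UPDATE
    or hundreds of thousands of single-row updates."""
    lo = 1
    while lo <= max_key:
        hi = min(lo + batch - 1, max_key)
        conn.execute(
            "UPDATE main_table SET flag = 1 WHERE id BETWEEN ? AND ?",
            (lo, hi),
        )
        conn.commit()  # keep each transaction (and its locks) small
        print(f"updated {lo}..{hi}")
        lo = hi + 1

# Demo with an in-memory database standing in for the Access table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE main_table (id INTEGER PRIMARY KEY, flag INTEGER)")
conn.executemany("INSERT INTO main_table VALUES (?, 0)",
                 [(i,) for i in range(1, 12001)])
update_in_ranges(conn, 12000)
```

The print call is where a status-bar update would go in the real program.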
 
 
 

Minor database help needed (ADO)

Post by marks54200 » Tue, 12 Jul 2005 01:38:58


It really depends on the size of the records and the structure of the
database tables. If it is a single table with relatively few fields I
would simply open the database and read sequentially through the
records, updating them as you go.

This assumes you can sort the flat file into the same order.

With a major update I would be tempted to export all records from the
database, update the resulting flat file, and then reload the database.
 
 
 

Minor database help needed (ADO)

Post by dgk » Tue, 12 Jul 2005 23:32:44


There are no fields in common, at least no easy numeric ID fields. I
think that I need to use the names and addresses: read them from one
database and look up a match in the other. The main database is a
single table with maybe 20 fields. The other also has around 20 fields
but is just a comma-delimited flat file. I'll have to check how it is
sorted; I should have done that already.

Maybe just sucking the entire database over is ok. I'll give it a
whirl. I guess 750,000 records of maybe 2000 bytes each will take
about 1.5 gigs of RAM. Hmm. That might not work out so well. Virtual
memory here I come.

Maybe I can just grab records by the first letter of the last name.
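The letter-at-a-time idea keeps memory bounded: build a lookup keyed on normalized name and address for one letter's worth of the flat file, match, then move to the next letter. A Python sketch, with a hypothetical normalization and made-up column names (real name/address matching is messier than this):

```python
import csv
import io

def norm(last, first, street):
    """Hypothetical match key: there's no shared numeric ID, so fall back
    to normalized name + address fields."""
    return (last.strip().lower(), first.strip().lower(), street.strip().lower())

# Tiny stand-in for the 750,000-row comma-delimited file.
csv_text = """last,first,street,phone
Smith,Ann,12 Oak St,555-0101
Stone,Bob,9 Elm Ave,555-0102
Jones,Cy,3 Pine Rd,555-0103
"""

def lookup_for_letter(letter, csv_file):
    """Index only the rows whose last name starts with `letter`, so the
    whole file never has to sit in memory at once."""
    index = {}
    for row in csv.DictReader(csv_file):
        if row["last"].strip().lower().startswith(letter):
            index[norm(row["last"], row["first"], row["street"])] = row
    return index

s_index = lookup_for_letter("s", io.StringIO(csv_text))
print(sorted(k[0] for k in s_index))  # → ['smith', 'stone']
```

Each pass over the Access records for that letter can then probe `s_index` with the same normalized key instead of re-querying per record.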
 
 
 

Minor database help needed (ADO)

Post by KwikOn » Thu, 14 Jul 2005 02:33:39

> Maybe just sucking the entire database over is ok. I'll give it a
That's why you should look at processing 5000-10000 records at a time.