Importing huge amounts of data, how to


Post by Val Mazur » Thu, 09 Oct 2003 12:07:32


Hi Frank,

Are you sure that Access is a good choice in your case? It has a limit of
2 GB. After several months of updates you may face a situation where you
cannot add any records to the database anymore.

--
Val Mazur
Microsoft MVP
Check Virus Alert, stay updated
http://www.yqcomputer.com/
 
 
 


Post by Bill » Thu, 09 Oct 2003 13:58:07

I agree with Val. Using Jet/Access for a database of this size is asking for
trouble. You'll find that maintenance and compaction will be a never-ending
problem. SQL Server (even MSDE) would be far better suited to your needs.

--
____________________________________
Bill Vaughn
MVP, hRD
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________





Post by Janne Kej » Fri, 10 Oct 2003 23:31:15

How about creating import tables in the summarized database? These tables
would have the same structure as the "real" tables. Every time you update the
summarized database, you start by emptying the import tables (actually they
should already be empty, since you empty them at the end of each update).
Then you load the new data into the import tables, and after that you can
update the "real" tables with a few SQL commands. At the end you empty the
import tables again.

You could even do this by keeping the import tables in a separate database,
or, if the SQL commands don't get too complicated, you could do the update
directly without any import tables.
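To illustrate the pattern: one UPDATE for rows that already exist, one INSERT
for rows that don't, then empty the import table. This sketch uses SQLite in
place of Jet/Access (the SQL dialect differs slightly), and the table and
column names (Totals, ImportTotals, Id, Amount) are made up for the example.

```python
# Staging-table update pattern: load new data into an import table,
# then fold it into the "real" table with set-based SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Totals (Id INTEGER PRIMARY KEY, Amount REAL)")
cur.execute("CREATE TABLE ImportTotals (Id INTEGER PRIMARY KEY, Amount REAL)")

# Existing summary data.
cur.executemany("INSERT INTO Totals VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

# Step 1: the import table starts empty; load this month's data into it.
cur.executemany("INSERT INTO ImportTotals VALUES (?, ?)",
                [(2, 5.0), (3, 30.0)])

# Step 2: update rows that already exist (here: add figures to the field).
cur.execute("""
    UPDATE Totals
    SET Amount = Amount + (SELECT Amount FROM ImportTotals i
                           WHERE i.Id = Totals.Id)
    WHERE Id IN (SELECT Id FROM ImportTotals)
""")

# Step 3: insert rows that are new.
cur.execute("""
    INSERT INTO Totals (Id, Amount)
    SELECT Id, Amount FROM ImportTotals
    WHERE Id NOT IN (SELECT Id FROM Totals)
""")

# Step 4: empty the import table again.
cur.execute("DELETE FROM ImportTotals")
conn.commit()

print(sorted(cur.execute("SELECT * FROM Totals").fetchall()))
# [(1, 10.0), (2, 25.0), (3, 30.0)]
```

Because both statements operate on whole sets of rows, this avoids the
row-by-row recordset lookups the original post was worried about.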

Janne


>> I have designed an Access based application which holds
>> summarized information from a larger database (many Gb).
>> The summary base will be between 500 Mb and 1 Gb, so
>> Access seems a good choice as the application will be
>> deployed to several disconnected computers. They will
>> only use it for reading, no updates.
>>
>> Every month the database needs to be updated with new,
>> updated and summarized information in the various tables.
>> Now, I am wondering how I can best perform these data
>> imports. For the tests for database size I just used the
>> import function that is part of Access (file menu
>> import). However, when it comes to the real imports I
>> need to check for existing records in the target base and
>> update these, sometimes replacing data, sometimes adding
>> figures to existing field values. The intrinsic import
>> function of Access does not seem capable of handling
>> that.
>>
>> This seems to leave me with ADO or DAO through VB, or is
>> there any other loader utility or method that I could use?
>> Also, if I use ADO, which would be the better way to go
>> about it?
>>
>> Consider that my data file will have several millions of
>> rows. For each row I will need to search for an existing
>> record and update it, if it exists. If it doesn't, the
>> record needs to be added. I have only just started
>> working with ADO, and mostly with recordsets. However,
>> the process of creating a new recordset (to query for an
>> existing record) for each row in my data file does not
>> seem very efficient performance-wise.
>>
>> Does anyone know of a good method?
>>
>> With kind regards,
>>
>> Frank
 
 
 


Post by anonymous » Sat, 18 Oct 2003 19:34:08

Hi Val,

I think it will be okay with Access, as I know the data
volume. There will never be more than 2 years of data, so
some will be deleted every month.

regards,

Frank




Post by anonymous » Sat, 18 Oct 2003 19:37:41

Yeah, I am considering doing the update directly.
I think I will use ADO.NET to get the records
that could be affected into a DataSet, update that,
and at the end write the updates back to the database.
I think that will work for the summarized info.
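The same fetch-update-write-back shape can be sketched outside ADO.NET. In
this rough illustration a plain dict stands in for the DataSet and SQLite
stands in for Jet/Access; all table and column names (Summary, Id, Figure)
are invented for the example.

```python
# Fetch only the affected records, apply changes in memory,
# then write everything back in two batched statements.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Summary (Id INTEGER PRIMARY KEY, Figure REAL)")
cur.executemany("INSERT INTO Summary VALUES (?, ?)", [(1, 100.0), (2, 200.0)])

incoming = {1: 7.0, 3: 50.0}  # this month's rows, keyed by Id

# 1. Pull only the records that could be affected into memory.
ids = tuple(incoming)
placeholders = ",".join("?" * len(ids))
dataset = dict(cur.execute(
    f"SELECT Id, Figure FROM Summary WHERE Id IN ({placeholders})", ids))

# 2. Apply the changes in memory: add figures to existing rows,
#    collect rows that are missing for insertion.
updates, inserts = [], []
for rid, amount in incoming.items():
    if rid in dataset:
        updates.append((dataset[rid] + amount, rid))
    else:
        inserts.append((rid, amount))

# 3. Write everything back in two batched statements.
cur.executemany("UPDATE Summary SET Figure = ? WHERE Id = ?", updates)
cur.executemany("INSERT INTO Summary VALUES (?, ?)", inserts)
conn.commit()

print(sorted(cur.execute("SELECT * FROM Summary").fetchall()))
# [(1, 107.0), (2, 200.0), (3, 50.0)]
```

The point of the pattern is that existence checks happen against the
in-memory set rather than one query per input row.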


Frank
