Manage huge amounts of data

Post by Rakesh » Sat, 20 May 2006 20:07:02


Requirement:

1 billion records to be inserted every day
180 days of data to be retained
180 * 1 billion = approx. 180 billion records, which defines the size of the database

How should the database and the load process be designed to maintain such a huge volume of data?


- R

Manage huge amounts of data

Post by Tony Rogerson » Sat, 20 May 2006 20:13:03

Hi Rakesh,

Need more info: an idea of the number of tables, the row sizes, and the querying, which will determine what indexes are required, etc...

You will need some good kit though, with lots of disks. What sort of fault tolerance are you looking at? Backups are going to be a problem.

--
Tony Rogerson
SQL Server MVP
http://www.yqcomputer.com/
Server Consultant
http://www.yqcomputer.com/ - free video tutorials


Manage huge amounts of data

Post by Dan Guzman » Sat, 20 May 2006 20:48:07

> 1 billion records to be inserted every day

With an average row size of 100 bytes, that works out to about 18 TB of
usable space, not including index overhead. You'll also need to sustain a
rate of over 10,000 inserts per second, 24x7.
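
The arithmetic is easy to sanity-check in a query window (the 100-byte
average row size is just an assumption, not a measured figure):

SELECT
    180 * 1E9 * 100 / 1E12 AS approx_usable_TB,   -- 180 days * 1 billion rows * 100 bytes = 18 TB
    1E9 / 86400            AS inserts_per_second; -- 1 billion rows / 86,400 seconds/day = ~11,574/sec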

Very large tables are often partitioned for manageability and to address
backup issues like those Tony mentioned. Attention to detail is very
important. Unless you have experience working with very large databases, I
suggest you engage consultants with VLDB experience to help you out. We can
help with specific questions, but a project of this magnitude requires
dedicated resources with specialized experience.
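
For what it's worth, a minimal sketch of date-based partitioning on SQL
Server 2005 might look like the following. The table, column, function, and
scheme names here are made up for illustration, and a real design would
spread the partitions across dedicated filegroups rather than PRIMARY:

CREATE PARTITION FUNCTION pfDaily (datetime)
AS RANGE RIGHT FOR VALUES ('20060520', '20060521', '20060522');  -- one boundary per day

CREATE PARTITION SCHEME psDaily
AS PARTITION pfDaily ALL TO ([PRIMARY]);  -- use real filegroups in practice

CREATE TABLE dbo.DailyFeed (
    FeedDate datetime NOT NULL,
    Payload  char(92) NOT NULL  -- ~100-byte rows, per the estimate above
) ON psDaily (FeedDate);

-- Expired days can then be switched out in seconds instead of deleted row by row:
-- ALTER TABLE dbo.DailyFeed SWITCH PARTITION 1 TO dbo.DailyFeedArchive;

A sliding-window scheme like this is what makes the 180-day retention
manageable: each day you add a new boundary and switch the oldest partition
out to an archive table.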

--
Hope this helps.

Dan Guzman
SQL Server MVP