We are having an issue with a custom integration we have built.
We are working on a scheduled DTS package that migrates data from a
legacy system into MSCRM. The migration runs nightly and needs to
update close to 40,000 records in MSCRM.
We have built a DTS package (on the SQL Server machine) that pulls the
data from the legacy Oracle server, computes the records that need to
be updated, and then calls into a serviced component running in COM+
(also on the SQL Server machine), passing in the records to update as
an XML string. The serviced component is called with 1,000 records at
a time; we cannot pass the entire 40,000 records in one shot because
of memory exceptions. The serviced component in turn calls a web
service (on the CRM machine), passing batches of 20 records at a
time. The web service parses the XML with an XML reader and updates
account information in MSCRM through the MSCRM object model, using
the information in the XML.
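To make the batch sizes concrete, the chunking we do looks roughly like this (an illustrative sketch only -- our actual code is a .NET serviced component and web service, and the function names here are made up):

```python
# Sketch of the two-level batching described above. The names
# (chunks, run_migration, send_batch) are hypothetical; only the
# batch sizes (1,000 and 20) come from our actual setup.

def chunks(records, size):
    """Yield successive fixed-size slices of the record list."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def run_migration(records, send_batch):
    # The DTS package hands the serviced component 1,000 records at a
    # time (passing all 40,000 at once caused memory exceptions).
    for outer in chunks(records, 1000):
        # The serviced component forwards batches of 20 records to the
        # web service on the CRM machine, which parses the XML and
        # updates MSCRM through the object model.
        for inner in chunks(outer, 20):
            send_batch(inner)
```

So for a full nightly run of 40,000 records the web service ends up being called 2,000 times with 20 records each.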
The process runs through fine when we try it with a low volume of
records (hundreds to a few thousand). Initially the memory held by
the w3wp.exe process increases, and after the first 2,000 records it
grows to about 150MB (without the integration running it stays around
80MB with people using MSCRM). It then shows only meager increases
for a while, after which it starts to grow faster. Once the w3wp.exe
process is using 200MB of memory, the MSCRM site stops responding
when hit using IE, but the integration continues to run. The memory
held by the w3wp.exe process keeps climbing until it reaches 800MB
and the process finally stops with a System.OutOfMemoryException.
This usually happens around the 14K to 16K records processed.
The hardware configuration that we have in the test environment is:
- SQL machine: 1 GB RAM
- CRM machine: 1 GB RAM
The DTS package and the serviced component run on the SQL machine,
and the serviced component calls into a web service located on the
CRM machine to migrate the data into MSCRM.
Any ideas on what we could try to stop the memory from growing?