Hi there.
I'm currently working on a system that uses typed datasets to represent business entities.
We have quite a few batch jobs that are intended to run at certain times during the day.
Some of these batch jobs handle quite a lot of data and run for more than 24 hours at a time. The problem is that we run out of memory from time to time.
It seems to be related to how the CLR handles typed datasets. I have made a small app to reproduce the behaviour.
I'm running the test app in VB.Net 2.0 on a Windows XP x86 machine (3 GHz, 2.5 GB RAM).
The test app is a simple console application that iterates 20 times. Each iteration queues 1000 work items via ThreadPool.QueueUserWorkItem.
Each work item loads 3 typed datasets with approx. 70 kB of XML data each, calls DataSet.Copy on each of the 3 datasets (just to emulate some memory consumption), and then does a Thread.Sleep(200) to simulate some waiting time.
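Roughly, each work item looks like this (a simplified sketch; MyTypedDataSet stands in for one of our generated typed datasets, data.xml for the ~70 kB test file, and the XML is loaded via ReadXml here):

    Imports System.Data
    Imports System.Threading

    Module ReproApp

        Sub Main()
            For iteration As Integer = 1 To 20
                For i As Integer = 1 To 1000
                    ThreadPool.QueueUserWorkItem(AddressOf DoWork)
                Next
                ' crude pacing between iterations in this sketch
                Thread.Sleep(10000)
            Next
        End Sub

        Private Sub DoWork(ByVal state As Object)
            For i As Integer = 1 To 3
                ' MyTypedDataSet is a placeholder for one of our generated typed datasets
                Dim ds As New MyTypedDataSet()
                ds.ReadXml("data.xml")            ' approx. 70 kB of XML
                Dim copy As DataSet = ds.Copy()   ' just to emulate some memory consumption
            Next
            Thread.Sleep(200)                     ' simulate some waiting time
        End Sub

    End Module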
gcServer is set to "true" to run server garbage collection.
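For reference, that is done in the app.config:

    <configuration>
      <runtime>
        <gcServer enabled="true"/>
      </runtime>
    </configuration>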
When I run the app and look at some perfmon counters, I see the following:
(.Net CLR Memory)
# GC Handles increases linearly and never decreases.
# Bytes in all Heaps increases; it has a few "dips" during GC, but the overall trend is upward.
Gen 0 heap size is fairly stable
Gen 1 heap size is fairly stable
Gen 2 heap size increases; it has a few "dips" during GC, but the overall trend is upward.
(Process)
Private Bytes is slowly increasing and never decreases
When I replace the typed datasets with untyped ones, the picture is totally different.
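The only change in that run is that each work item creates plain DataSet objects instead (same XML file, same Copy call); roughly:

    Dim ds As New DataSet()
    ds.ReadXml("data.xml")            ' same ~70 kB XML, now into an untyped DataSet
    Dim copy As DataSet = ds.Copy()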
(.Net CLR Memory)
# GC Handles bumps up at each iteration (when the work items are queued) and drops back as each work item completes (as it should).
# Bytes in all Heaps is fairly stable.
Gen 0 heap size is fairly stable
Gen 1 heap size is fairly stable
Gen 2 heap size is fairly stable
(Process)
Private Bytes is fairly stable
Any ideas?
Regards
Patrik