Recently, a batch job terminated with an error. It hung at one dataflow for many hours before failing with:
Warning: Your system is running low on process virtual memory space. Available virtual memory is <46> megabytes.
DATAFLOW: Data flow < x > is terminated due to error <50505>.
The batch job was executed with the following options:
Enable auditing
Enable recovery
Use collected statistics
Cache statistics determined that data flow < x > uses <5> caches with a total size of <353398268> bytes. This is less than(or equal to) the virtual memory <4290772992> bytes available for caches. Statistics is switching the cache type to IN MEMORY.
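As far as I can tell from that log line, the optimizer simply compares the predicted total cache size against the virtual memory available for caches and switches to IN MEMORY when it fits. A minimal Python sketch of that comparison is below; the function name and the exact rule are my assumption based only on the log wording, not documented Data Integrator internals.

# Sketch of the cache-type decision the log line appears to describe.
# Names and threshold rule are assumptions, not product internals.

IN_MEMORY = "IN MEMORY"
PAGEABLE = "PAGEABLE"

def choose_cache_type(total_cache_bytes: int, available_vm_bytes: int) -> str:
    """Pick IN MEMORY when the predicted caches fit within available virtual memory."""
    if total_cache_bytes <= available_vm_bytes:
        return IN_MEMORY
    return PAGEABLE

# Values from the log line above: 5 caches totalling 353,398,268 bytes
# versus 4,290,772,992 bytes of virtual memory reported as available.
print(choose_cache_type(353398268, 4290772992))  # -> "IN MEMORY"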
Without changing anything else, rerunning with the following options lets the batch job finish successfully:
Enable auditing
Recover from last failed execution
Collect statistics for optimization
Use collected statistics
Data flow < x > using PAGEABLE Cache with <4092 MB> buffer pool.
We are using Data Integrator XI 3.0, and this is a daily job. Every dataflow's Cache type property is set to Pageable. Every time the virtual memory issue hits, we just rerun with "Collect statistics for optimization" to get past it.
If yesterday's run was executed with "Collect statistics for optimization", then today's run should have statistics available for optimization, so why does it still fail with the memory issue?
How do I check whether the cache statistics were collected properly? Is there any setting I need to change?