
Loading a flat file to HANA is running out of physical memory on server


Dear SAP forum,

 

I'm running a test at my client site, loading a 150 million row flat file into a HANA table to gather performance metrics. I have created the simplest possible data flow, connecting the flat file format directly to a HANA table (no transformations at all, just a straight mapping). What happens is that I eventually run out of physical memory about halfway through the file. My first reaction is: why is anything being buffered in memory at all when this is a straight migration? Watching the Windows Task Manager, I can see that when I start the job, memory usage climbs little by little until the job crashes around the 90 million row mark. When I use SQL Server as the target instead, I get no memory issues at all; utilization stays flat, as I would expect.
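For anyone who wants to reproduce the observation outside Task Manager, here is a minimal Python monitoring sketch that logs the job server engine's memory over time. It assumes the engine process is named al_engine.exe (this may vary by installation) and requires the psutil package:

import time
import psutil

PROC_NAME = "al_engine.exe"  # assumption: name of the DS job server engine process

while True:  # stop with Ctrl+C once the job finishes or crashes
    for proc in psutil.process_iter(attrs=["name", "memory_info"]):
        name = proc.info.get("name")
        if name and name.lower() == PROC_NAME:
            mem = proc.info["memory_info"]
            # rss = physical memory in use, vms = virtual memory committed
            print("pid=%d rss=%.0f MiB vms=%.0f MiB"
                  % (proc.pid, mem.rss / 2**20, mem.vms / 2**20))
    time.sleep(10)  # sample every 10 seconds while the job runs

Sampling like this should make the steady climb with the HANA target, and the flat line with SQL Server, easy to see and compare.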

 

The CSV file is located on the job server.

 

The real question to me is: why does Data Services consume physical memory when the target is HANA but not when the target is SQL Server?

 

After 30 million rows, physical memory is exhausted (99% utilization) with HANA; at the same point with SQL Server, memory does not move from 35% utilization.

 

Does this mean that all data loaded into HANA through Data Services gets staged in physical memory on the job server?
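Just as a baseline (not a claim about how Data Services loads internally): a plain client-side loader that streams the file in fixed-size batches only ever holds one batch in memory. Here is a minimal sketch using SAP's hdbcli Python driver; the table, columns, and connection details are hypothetical:

import csv
from itertools import islice
from hdbcli import dbapi

BATCH = 10_000  # rows held in client memory per round trip

conn = dbapi.connect(address="hanahost", port=30015,
                     user="LOADER", password="secret")
cur = conn.cursor()

with open("bigfile.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip header row (assumption: file has one)
    while True:
        batch = list(islice(reader, BATCH))  # read only BATCH rows at a time
        if not batch:
            break
        cur.executemany(
            "INSERT INTO MYSCHEMA.MYTABLE (COL1, COL2, COL3) VALUES (?, ?, ?)",
            batch)
        conn.commit()  # release the batch before reading the next one

cur.close()
conn.close()

If something like this stays flat while the Data Services job keeps climbing, that would point at the HANA target configuration in the data flow (e.g., bulk-loading options) rather than at the file reader.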

 

Does anyone have a clue as to why this is happening?

 

I'm using the new Data Services 4.2 SP01, and this also happened with version 4.1.

 

I appreciate any ideas.

 

Best regards,

 

Michael Ardizzone

itelligence group

