Channel: SCN : Unanswered Discussions - Data Services and Data Quality

"can not use PAGEABLE because it has nested schema"


Hello everyone,

 

After migrating our Data Services repositories from XI 3.2 to DS 4.1, we have problems executing one DS job on the new machine.

The job contains dataflows which call SAP BAPIs. The dataflows try to use the PAGEABLE cache, but the DS system cannot, as shown by this message in the job log:


"Data flow <df_NAME> can not use PAGEABLE because it has nested schema. Switching cache type to IN MEMORY."


This is a little strange, because on the old DS release the same dataflows could use this cache type. Here is the message from the job log of the old machine:


"Data flow <df_NAME> using PAGEABLE Cache with <3531MB> buffer pool."
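On the new machine, the dataflows that fall back to IN MEMORY can be listed by grepping the job's trace log for the switch message (a generic sketch, not DS tooling; the trace file name and location depend on your job server installation, so a hypothetical sample file is used here):

```shell
# Hypothetical sample trace line; real trace logs live under the job
# server's log directory of your installation.
printf 'Data flow <df_NAME> can not use PAGEABLE because it has nested schema. Switching cache type to IN MEMORY.\n' > trace_sample.txt

# Print only the dataflow names that were switched to IN MEMORY.
grep 'Switching cache type to IN MEMORY' trace_sample.txt | grep -o 'Data flow <[^>]*>'
```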


 

So it uses RAM for 4 dataflows (there are actually 10 dataflows that run in parallel on the old DS release; we reduced the number for test purposes).

But already at the third dataflow the job aborts with the error "Insufficient memory in allocation attempt. Details: <trying to allocate 752 * 100 bytes>".


At the OS level you can see that four al_engine processes consume the entire available memory (64 GB RAM), up to 15 GB each. The processes then allocate some additional heap (about 2 GB), and after that the job fails.

 

********************************************************************************************************

PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND

6210 p7sadm 20 0 13.9g 13g 39m S 109 21.3 4:39.12 al_engine

5363 p7sadm 20 0 14.2g 13g 39m S 109 21.7 4:45.71 al_engine

5361 p7sadm 20 0 14.5g 13g 39m S 97 22.0 4:49.72 al_engine

5364 p7sadm 20 0 14.0g 13g 39m S 78 21.5 4:42.15 al_engine

32196 p7sadm 20 0 306m 5180 2824 S 1 0.0 9:09.90 al_jobserver

27754 p7sadm 20 0 15.7g 3976 996 S 0 0.0 28:28.93 al_engine

32216 p7sadm 20 0 353m 4444 2376 S 0 0.0 15:36.50 al_jobserver

32231 p7sadm 20 0 658m 26m 1004 S 0 0.0 42:48.55 java

32250 p7sadm 20 0 401m 3552 2308 S 0 0.0 9:46.89 AL_AccessServer

********************************************************************************************************
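To track how much resident memory the engines take in total while the job runs, a generic ps/awk one-liner can be used (a sketch, not DS tooling; the process name al_engine is taken from the output above):

```shell
# Sum the resident set size (RSS, reported by ps in KB) of all
# al_engine processes and print the total in GB.
ps -C al_engine -o rss= | awk '{ total += $1 } END { printf "al_engine total RSS: %.1f GB\n", total / 1024 / 1024 }'
```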

 

Our developers have tried running the dataflows one after another, and the job finished successfully, but it ran for too long.
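A middle ground between fully parallel and fully serial would be to cap how many dataflow jobs run at once. At the OS level, if each dataflow can be started by its own launch script (the job_*.sh names below are hypothetical placeholders for such scripts), something like xargs -P limits concurrency:

```shell
# Run the launch scripts at most 2 at a time instead of all 10 at once.
# job_*.sh are hypothetical stand-ins for per-dataflow launch scripts.
printf '%s\n' job_*.sh | xargs -n 1 -P 2 sh
```

This only helps if the job can be split so that each piece is independently launchable; it does not change the cache behavior itself.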

 

 

It seems that SAP has changed something in the engine / processing logic. That is fatal for us, because this job requires a huge amount of RAM.

 

Does anyone have the same problem, an idea, or a workaround?

 

Thanks in advance

