Hello,
We have several jobs that involve very large volumes of data which are joined/filtered down to a small result (the output is a ~5 MB text file, but the input tables are many GB).
Our jobs were initially written as standard dataflows that pulled the data from the SAP tables and processed it in Data Services, but this was too slow. As a result, our team created function modules on the source ECC systems to pre-filter/join the data and send only the result.
This is time-intensive and requires far more work than keeping the query/filter logic in DS.
Our investigation suggests ABAP dataflows are the best approach here (and they appear to be SAP's recommended practice for this situation).
Our security team is concerned about granting the DS SAP user the broad access required for the "Generate and Execute" permission. As we understand it, with this permission BODS automatically converts all query logic within the ABAP dataflow into an ABAP program, uploads it to the target system, and then executes it. This would be desirable because we would need no ABAP developer involvement and could move more easily between our dev, quality, and production systems.
If they cannot provide this, we hope to at least obtain the "Execute Preloaded" permission, but this will require us to export the dataflow logic to ABAP and then have another person import it into each landscape, and again with each update. This is not as good, but still acceptable.
My question is: what are the best practices for SAP security with ABAP dataflows? Do people generally grant the DS user ID full "Generate and Execute" rights on the source system? Or is it more common to allow only "Execute Preloaded" and have someone manually install the ABAP program?