Channel: SCN : Unanswered Discussions - Data Services and Data Quality
Viewing all 3719 articles

SAP BODS Stage job Performance issue


Hello All,

 

I am facing an issue with one of my stage jobs in SAP BODS.

 

I have a stage job where I pull data from source to stage. The source is a BW table and the stage is an Oracle table.

 

The mapping is just one-to-one, but the source table holds almost 1 crore (10 million) rows and the count is increasing day by day, so the job is taking almost 1 hour 30 minutes to execute.

 

This is causing a significant performance impact.

 

I have tried the options below.

 

1) Increased the array fetch size of the table and also increased the rows per commit, but no improvement.

2) Increased the array fetch size, changed the bulk loader option to API, and adjusted the rows per commit, but no luck.

3) Tried the Degree of Parallelism option, but no luck.

4) Tried changing the dataflow pageable cache to in-memory, but no luck.
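Beyond the options above, a common workaround for a one-to-one load of this size is to split the extract into contiguous key ranges and run one dataflow per range in parallel. The key column and bounds below are made-up placeholders, not from the original job; this is only a sketch of generating the range predicates:

```python
def key_range_predicates(min_key, max_key, partitions, key_col="CUST_ID"):
    """Split [min_key, max_key] into contiguous ranges and emit one
    WHERE-clause predicate per partition, e.g. for parallel dataflows."""
    span = max_key - min_key + 1
    size = -(-span // partitions)  # ceiling division
    preds = []
    lo = min_key
    while lo <= max_key:
        hi = min(lo + size - 1, max_key)
        preds.append(f"{key_col} BETWEEN {lo} AND {hi}")
        lo = hi + 1
    return preds

# Example: split 10 million keys across 4 parallel readers
for p in key_range_predicates(1, 10_000_000, 4):
    print(p)
```

Each predicate would go into the source filter of its own dataflow; whether this helps depends on the BW-side extraction, so treat it as one experiment among the others listed.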

 

Can anyone please help me resolve this issue?

 

BODS version: 4.2

OS: Windows

 

Regards,

Prasanna


Assign and combine operation in unique id


Hi all,

 

   I performed cleansing and matching on Table-1 and generated a unique ID using the assign operation.

   The next day, Table-2 is passed; records that match between Table-1 and Table-2 should get the same unique ID, and unique records should get a new unique ID.

 

   Will the assign-combine operation suit this requirement?

   If it does, how do I pass Table-2 in to match records against Table-1, and which transforms do I have to use to compare Table-1 and Table-2?


Thank you.



Central Repository migration


Hi All,

 

We need to migrate our central repository from one database to another.

What is the best way to achieve this?

I tried to export the Central Repository as an ATL file, but couldn't find a way to do so.

 

Thanks and Regards,

Bhupendra

Map Operation Transformation Error?


Hi All,

 

I have created a Map_Operation. In this scenario, the input contains four rows with Cust_Id values 1, 2, 3, and 6, and I filter on Cust_Id = 1 in the Query transform.

The inserted Cust_Id = 1 rows are converted to deletes using Map_Operation, and the deleted Cust_Id = 1 rows are inserted into another table.

My issue is that the deleted Cust_Id = 1 rows are inserted into the other table correctly, but the deleted Cust_Id = 1 and remaining rows are not inserted into the target table.

Screenshots are attached below.

Data Flow and Source File as shown below:

Data_Flow.PNG

Source_File.PNG

Thanks,

Chandra

How to run ODP Delta as a SAP Background job (SM37)


Hello Experts!

 

In our project we are re-using our SAP BW datasources for extraction via DataServices into a new Oracle database.

I am familiar with delta extraction and we are also using this functionality via the new ODP API 2.0.

 

While running the Delta Initialization or the Full upload a batch job is created in the SAP source system.

This can be seen in transaction SM37 with job name ODQR_*

 

However, while running a Delta upload, no batch job is created in SM37. It seems that the complete delta upload is executed in a dialog process.

Also in transaction ODQMON there is no reference to a batch job.

 

 

Does anyone know how we can run the Delta uploads in a batch process in the source system?

It is really necessary to do so, both for performance and auditing reasons!

 

 

Thanks in advance!

Steven Groot

 

Please see the attached screenshot (modified) from transaction ODQMON.

 

ODQMON_screenshot.jpg

Unable to Login to Repository using BODS designer Client


Hi ,

We get the error message below when trying to log in to the BODS repository using BODS Designer from a client system. The same works fine on the server: there, using BODS Designer with the same repository, we are able to log in.

BODS_ERROR.JPG

 

Can someone help me with this?

RFC CallReceive error RFC_ABAP_MESSAGE - No authorization for this action


Hi All,

 

I have created a datastore TEST_SAP. When I try to import or search for any table from the datastore, I get the following error.

 

Error: Cannot load metadata table <name= >.

 

RFC CallReceive error <Function /BODS/TABLE_SEARCH:

RFC_ABAP_MESSAGE- No authorization for this action[SAP NWRFC

720][SAP Partner 46C][QUA][hostname][test_usr][1100]>.

(BODI-1112346)

 

Can you please help me here?

 

User test_usr has been given all the needed authorizations; we even tried giving it SAP_ALL.

 

The /BODS/* functions are available in R/3 46C.

Strange Workflow icon and missed out data during loading


Hi All,

 

I would just like to know if anyone has an idea about this seemingly strange workflow icon in Data Services.

ex.png

I would assume that this indicates an error, but what it means is confusing.

 

Another thing: data is missed during the load, and a rerun fixes the issue. How do I isolate the cause, considering that the error is not reproducible?


Unable to login into a repository


Hi,

 

We have a valid repository hosted on SQL Server.

We can successfully open the database, as well as register the same in CMC.

However, when trying to login into the repository through Designer, we get an error like

'ERROR : ODBC Call <SQLDriverConnect> for data source <servername> failed: <[Microsoft][ODBC SQL Server Driver][DBNETLIB]SQL Server does not exist or Access denied>'.

 

Does anyone know why this is happening?

We have other databases on the server too, but none of them give this error.

 

Thanks,

Bhupendra

Questions on SAP BODS


Hi,

I'm new to BODS.

 

1. Can someone explain the best comparison method in the Table Comparison transform, and why it is considered the best?

2. What is the difference between an HDFS file and a flat file?

3. Without the History Preserving transform, how can I implement SCD Type 2?
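On question 3: without the History Preserving transform, SCD Type 2 is usually built with a Table Comparison (or a manual lookup) followed by logic that closes the current row and inserts a new version. Purely as an illustration of that close-and-insert logic (the row layout and column names here are made up, not a DS API), a Python sketch:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional "open" end date

def apply_scd2(target_rows, source_row, key, today):
    """Close the current target row for `key` if its attributes changed,
    then append a new current row (valid_from=today, valid_to=HIGH_DATE)."""
    current = [r for r in target_rows
               if r["key"] == key and r["valid_to"] == HIGH_DATE]
    if current and current[0]["attrs"] == source_row["attrs"]:
        return target_rows  # no change: keep history as-is
    if current:
        current[0]["valid_to"] = today  # close the old version
    target_rows.append({"key": key, "attrs": dict(source_row["attrs"]),
                        "valid_from": today, "valid_to": HIGH_DATE})
    return target_rows
```

In a dataflow the same effect comes from mapping the comparison result to an update (close old row) plus an insert (new row), e.g. via Map_Operation.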

 

 

 

 

 

 

Thanks ,

David king J.

How to Specify Regular Expression in Excel Source File Name


Hi,

 

I want to load one or more Excel files from a folder into a table. The files may have either the .xls or the .xlsx extension.

So, in the file format, I need to specify a file name pattern that picks up both .xls and .xlsx files, and only those.

Currently I'm specifying it with the wildcard *.xls*, but that also matches names such as .xlsb or .xls.bak. Is there a way to keep it more specific?

Can we use regular expressions here, beyond the basic wildcards? Expecting your valuable suggestions.
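If the file format itself only accepts basic wildcards, the pattern can at least be tightened wherever the file list is built by a script. Assuming a place that accepts regular expressions, `.*\.xlsx?$` accepts exactly .xls and .xlsx; a sketch of the difference in Python:

```python
import re

# Matches names ending in .xls or .xlsx only (case-insensitive);
# the wildcard *.xls* would also accept .xlsb, .xls.bak, etc.
XLS_PATTERN = re.compile(r".*\.xlsx?$", re.IGNORECASE)

def is_excel_file(name):
    """True only for file names ending in .xls or .xlsx."""
    return bool(XLS_PATTERN.match(name))

files = ["sales.xls", "sales.xlsx", "sales.xlsb", "sales.xls.bak", "notes.txt"]
print([f for f in files if is_excel_file(f)])  # ['sales.xls', 'sales.xlsx']
```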

 

Thanks,

Dav

Verify the copying of Multiple files


Hi,

 

I need to copy one or more files from one folder to another.

There is no specific file count we can expect; the file names share only a common suffix.

After copying, how can I verify that all the files were copied successfully?

So far I've used the file_exists() function, but only to verify the copy of a single file with a known name.
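One approach beyond file_exists() is to compare the file lists (and sizes) of the source and destination folders after the copy; the suffix parameter here corresponds to the common suffix mentioned above, and the whole function is only an illustrative sketch, not a DS built-in:

```python
import os

def verify_copy(src_dir, dst_dir, suffix=""):
    """Return the files (matching `suffix`) that are present in src_dir
    but missing from dst_dir or copied with a different size."""
    src_files = [f for f in sorted(os.listdir(src_dir)) if f.endswith(suffix)]
    problems = []
    for name in src_files:
        src_path = os.path.join(src_dir, name)
        dst_path = os.path.join(dst_dir, name)
        if not os.path.isfile(dst_path):
            problems.append(name)          # file never arrived
        elif os.path.getsize(dst_path) != os.path.getsize(src_path):
            problems.append(name)          # arrived truncated
    return problems  # empty list means every file copied completely
```

A script step could call this after the copy and raise an error when the returned list is non-empty.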

 

Thanks,

Dav

File functions and UNC paths


Hi experts,

 

We are currently facing an issue with the file functions file_copy, file_move and file_delete in Data Services version 4.2 SP06 patch 2.

 

When we use a filename based on a drive letter (like C:\temp\file.txt), everything works as expected.

But as soon as we use a file based on a UNC path (e.g. \\vmware-host\Shared Folders\C\temp\file1.txt), we receive an error saying that the input filename is invalid.

 

The strange thing: before we call file_copy, we check whether the file exists using file_exists, and file_exists returns 1, meaning the file can be found even when we use the UNC path!
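As a workaround while this is open, the same check-then-copy step can be done in an external script invoked from the job; Python's shutil does handle UNC paths when the executing account has access to the share. A sketch (the paths in the comment are placeholders echoing the question):

```python
import os
import shutil

def copy_if_exists(src, dst):
    """Mirror the file_exists + file_copy pattern: copy only when the
    source is present; return True on success, False when it is missing."""
    if not os.path.isfile(src):
        return False
    dst_dir = os.path.dirname(dst)
    if dst_dir:
        os.makedirs(dst_dir, exist_ok=True)  # ensure target folder exists
    shutil.copy2(src, dst)  # copies data and timestamps
    return True

# UNC-style source as in the question (placeholder share name):
# copy_if_exists(r"\\vmware-host\Shared Folders\C\temp\file1.txt",
#                r"C:\temp\file1.txt")
```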

 

For testing we used a job containing only the following simple script:

script.jpg

 

In the trace file we can see that file_exists finds the file:

trace_file.jpg

 

But the error log shows that file_copy has a problem with the filename:

error_log.jpg

 

By the way: the Data Services service runs under a user account that has permission to the UNC path.

 

Are UNC paths not supported by the file functions file_copy, file_move and file_delete?

Or did we make a mistake?

 

Thanks a lot for your support.

 

Best regards

Marcus

BODS - BW - HANA compatibility


Hi,

 

We have an issue with the upgrade of our landscape.

 

We are using SAP Data Services 4.2 SP2 to export data

=> from BW 7.01 on an Oracle DB

=> to a data warehouse on an Oracle DB

=> using the following SAP Data Services features: RFC, Open Hub, and ABAP dataflows.

It's working perfectly.

 

We are planning to upgrade our BW to 7.4 on SAP HANA 1.0 SP11 (BW upgrade from 7.01 to 7.4, and Oracle to SAP HANA 1.0 SP11).

 

If we check the SAP Data Services 4.2 PAM:

  • under Application Connectivity support by operating system, SAP BW 7.4 is supported with SAP Data Services 4.2 SP2
  • under DBMS Source & Target support by operating system, SAP HANA 1.0 SP11 is not supported with SAP Data Services SP2 (it is supported with SP6)

 

In our case, can we upgrade BW to 7.4 on HANA 1.0 SP11 without upgrading SAP Data Services to SP6?

Is the application connectivity support sufficient, or does the DBMS Source & Target support entry imply a SAP Data Services upgrade?

 

If anyone has knowledge of this issue, any input would be appreciated.

 

Thanks,

 

Guillaume

RFC_ABAP_EXCEPTION-(Exception_Key:FU_NOT_FOUND,SY-MSGTY:E


Hello All,

 

We have created a datastore ZDS_APO_BW, and when trying to import metadata from SAP BW (7.0) we get the following error:

Error: Cannot load metadata table <name= >.

Error creating RFC Function </BODS/TABLE_SEARCH>:

<RFC_ABAP_EXCEPTOIN- (Exception_key: FU_NOT_FOUND,SY-MSGTY:E,

SY-MSGID:FL , SY-MSGNO:046, SY-MSGV1:


Error [SAP BusinessObjects][ODBC SQL Server Wire Protocol driver][Microsoft SQL Server]Incorrect syntax near 'Colume_Name'.>.


Hi Experts

 

I have rebuilt a new BODS job (SQL Server 2008) from an existing old BODS job (SQL Server 2000), as the source system has been upgraded from SQL Server 2000 to SQL Server 2008.

 

But when I try to execute the new BODS job (SQL Server 2008), it throws an error; please find the error log below.

 

Error Log :

 

|Data flow DF_LS_BW_BLK|Reader search+Results+Property+Solicitor+tblSearchAddOns1
SQL submitted to ODBC data source <LandSearch_UAT> resulted in error <[SAP BusinessObjects][ODBC SQL Server Wire Protocol
driver][Microsoft SQL Server]Incorrect syntax near 'column name'.>. The SQL submitted is <SELECT  { fn substring(
"table"."column name " , 1, 60)  }  ,  { fn substring( "table"."column name" , 61, 40)  }  ,  { fn substring(
"table"."column name1" , 1, 60)  }  ,  { fn substring( "table"."column name2" , 61, 20)  }  , 

Error BODS Execution


Hi, I have 3 existing datastores:

Datastore 1 - Source

D2 - Target

D3 - SAP tables

 

 

251193130234624DBS-0703004/8/2016 3:34:16 AM|Data flow DF_FIN_CON002_Cost_Center_Get_Data|Loader QRY_GET_DATA_FIN_CON002_COST_CENTER_STG
251193130234624DBS-0703004/8/2016 3:34:16 AMSQL submitted to Oracle Server <DI1> resulted in error <ORA-00942: table or view does not exist
251193130234624DBS-0703004/8/2016 3:34:16 AM>. The SQL submitted is <select "ZSOURCE_SYSTEM", "KOKRS", "KOSTL", "DATAB", "DATBI", "KTEXT", "LTEXT", "VERAK_USER", "VERAK",
251193130234624DBS-0703004/8/2016 3:34:16 AM"ABTEI", "KOSAR", "KHINR", "BUKRS", "FUNC_AREA", "WAERS", "PRCTR", "SPRAS", "MGEFL", "BKZKP", "PKZKP", "BKZKS", "BKZER",
251193130234624DBS-0703004/8/2016 3:34:16 AM"BKZOB", "PKZKS", "PKZER", "VMETH", "ZZZBHLINE", "ZZZPLCUST"
251193130234624DBS-0703004/8/2016 3:34:16 AMFROM "SAPDSSTGPG"."FIN_CON002_COST_CENTER_STG">.
251101364907808DBS-0703004/8/2016 3:34:32 AM|Data flow DF_FIN_CON002_Cost_Center_Get_Data|Loader QRY_GET_DATA_FIN_CON002_COST_CENTER_STG
251101364907808DBS-0703004/8/2016 3:34:32 AMSQL submitted to Oracle Server <DI1> resulted in error <ORA-00942: table or view does not exist
251101364907808DBS-0703004/8/2016 3:34:32 AM>. The SQL submitted is <select "ZSOURCE_SYSTEM", "KOKRS", "KOSTL", "DATAB", "DATBI", "KTEXT", "LTEXT", "VERAK_USER", "VERAK",
251101364907808DBS-0703004/8/2016 3:34:32 AM"ABTEI", "KOSAR", "KHINR", "BUKRS", "FUNC_AREA", "WAERS", "PRCTR", "SPRAS", "MGEFL", "BKZKP", "PKZKP", "BKZKS", "BKZER",
251101364907808DBS-0703004/8/2016 3:34:32 AM"BKZOB", "PKZKS", "PKZER", "VMETH", "ZZZBHLINE", "ZZZPLCUST"
251101364907808DBS-0703004/8/2016 3:34:32 AMFROM "SAPDSSTGPG"."FIN_CON002_COST_CENTER_STG">.

ODBC vs RFC connectivity - which is faster?


Hi,

 

I have an SAP Application Datastore with two connections: one uses ODBC to connect to the underlying database, and the other is an RFC connection where I use ABAP Data Flows with RFC data transfer.


I have a dummy job that directly pulls data from the Open Hub table and dumps it into a template table. My ODBC job runs at twice the speed of my RFC job, but the Basis team is adamant that RFC is faster than ODBC and wants us to move to RFC (mainly due to security concerns). Could you shed some light on this? Am I doing the ABAP development wrong? Or, as we believe, is an ODBC connection faster than an RFC connection with BODS 4.2?

error while running a demo real time job


Hi all,

I have created a test real-time job (details below).

When running the job as a test from Designer, I get the error message below; however, validation is successful, and both files have been given full access at the job server location.

Can anyone please help with this?

 

error message:

error.JPG

 

The file contents are below.

source xml file : emp.xml

 

<?xml version="1.0"?>

<employee>

  <EMPNO>8888</EMPNO>

</employee>
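Since validation passes but execution fails, one quick sanity check outside Data Services is to parse the source file and confirm it is well-formed with EMPNO where the schema expects it; a small illustrative check (the XML is inlined from the post):

```python
import xml.etree.ElementTree as ET

# Contents of emp.xml as posted above
EMP_XML = """<?xml version="1.0"?>
<employee>
  <EMPNO>8888</EMPNO>
</employee>"""

root = ET.fromstring(EMP_XML)          # raises ParseError if malformed
print(root.tag, root.findtext("EMPNO"))  # employee 8888
```

If this parses cleanly, the problem is more likely in the XSD binding or the real-time service setup than in the file itself.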

 

target xml file: emp_loc.xml - empty

 

source xml schema : emp.xsd

 

source_schema.JPG


target xml schema : emp_loc.xsd

target_xsd.JPG


Job design and dataflow:

job_overview.JPG

variables.JPG

dataflow.JPG

I have hardcoded some test values for this test job in the Query transform.

query.JPG

 

source message type details

emp.JPG

target message type details

emp_loc.JPG

Access on the job server: full permissions.

 

job_server_access.JPG

How to add soap header in java


Hi experts!

 

I have a problem: I can no longer add a SOAP header to the WSDL call in Java.

 

Do you know a way to add the header via a JAX-WS SOAP header?

 

 

 

Regards
