Channel: SCN : Unanswered Discussions - Data Services and Data Quality

RFC call receive error: SAP system failure while calling DS remote function


Hi Experts,

 

I am getting the following error while executing a job in my project that contains an ABAP data flow:

RFC call receive error: SAP system failure while calling DS remote function.

I have more than 200,000 (2 lakh) records to load; the job loads about 5,000 records into the target and then terminates with the error above. I am using DS 4.2, with the ABAP data flow inside a normal data flow. Kindly suggest a solution.

 

Thanks in Advance.

subbu.


Calling BAPI_CUSTOMER_CHANGEFROMDATA1 from Data Services


I’m having a couple of issues trying to use Data Services to update R/3 via BAPI_CUSTOMER_CHANGEFROMDATA1.

 

I’m new to SAP and relatively new to BODS too, so apologies in advance if I'm doing something silly.

 

So, the object of the exercise overall is to pick up changes in the source system - salesforce.com (SFDC), if it matters - and update the corresponding customer record in SAP.

 

Here’s where I’m up to at the moment: in SFDC, I can change, say, the contact’s telephone number and then see the change go through my Data Services dataflow.  There’s a custom field in SFDC that holds the SAP customer number, so by the time the data gets through as far as the BAPI, I have the SAP customer number, the changed telephone number, and a few other fields like LastName, and I think I should be able to assign those values to fields in the BAPI function, and have the SAP record updated.

 

The SAP log reports that the job completed successfully, but it also throws up an error:

 

Error calling custom RFC function <BAPI_CUSTOMER_CHANGEFROMDATA1 RFC return error! F2 892 Internal error: Sales Organization does not exist in master record  000000    >

 

The change to the telephone number has not been successfully applied to SAP.

 

The Sales Organization that I’m passing to the BAPI is ‘UK10’ (which I’ve tried passing through the dataflow and which I’ve also tried hard-coding into the BAPI function), and this is the correct Sales Organization for the customer that I’m trying to update.

 

I read a post on SCN that said you can’t update the company level data and the personal data via the BAPI at the same time.  So I tried to change the BAPI call to only update the personal-level data (and also AL_LANGUAGE, CUSTOMERNO, PI_DISTR_CHAN, PI_DIVISION, PI_SALESORG).  This gave me a different error when I tried to run the job:

 

Error calling custom RFC function <BAPI_CUSTOMER_CHANGEFROMDATA1 RFC return error! F2 827 Internal error: Make an entry in all fields where required   000000    >

 

I tried to figure out what the required fields were from the BAPI documentation, but I couldn't find the relevant information.  A post on SCN suggested that the required fields are LASTNAME, CITY, POSTL_COD1, STREET, COUNTRY, LANGU_P and CURRENCY.  All these are set in my BAPI function, and I’ve also set the corresponding ‘X’ field for each of these.
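In case it helps to reproduce the problem: the call can be tested outside Data Services to rule the dataflow in or out. Below is a minimal sketch using SAP's open-source PyRFC library; the connection details and the exact field names in the two structures are assumptions to be adapted from the BAPI's interface in SE37, not a verified field list.

    from pyrfc import Connection  # SAP's open-source PyRFC library

    # Placeholder connection details -- replace with your system's.
    conn = Connection(ashost="r3host.example.com", sysnr="00",
                      client="100", user="TESTUSER", passwd="secret")

    # The X-structure flags which fields the BAPI should actually change.
    result = conn.call(
        "BAPI_CUSTOMER_CHANGEFROMDATA1",
        CUSTOMERNO="0000012345",
        PI_SALESORG="UK10",
        PI_PERSONALDATA={"LASTNAME": "Smith", "TELEPHONE": "01234 567890"},
        PI_PERSONALDATAX={"LASTNAME": "X", "TELEPHONE": "X"},
    )
    print(result.get("RETURN"))  # should surface the same F2 messages DS reports

    # BAPIs do not commit on their own; persist the change explicitly.
    conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")

If the direct call returns the same F2 892 message, the problem lies in the BAPI input (or the customer's sales area data in SAP) rather than in the Data Services plumbing.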

 

I’ve tried Google and I’ve tried re-reading the BAPI documentation, but it hasn’t helped me.

 

If you do have any ideas, or suggestions for where I could look next, I’d be very grateful.

 

Thanks

 

Steve

Timeout on DataServices server


We have been seeing the following issue occasionally and need some advice on where to look. The logs around the time of the failure do not show anything unusual, and when we rerun the jobs they complete without issue.

 

 

SQL submitted to ODBC data source <US5023SQL> resulted in error <[Microsoft][SQL Server Native Client 10.0]Communication link failure>. The SQL submitted is <select OBJECT_KEY, MACHINE, SERVER, TYPE, SERVICE, CONTAINER, INST_MACHINE, INST_SERVER, INST_SERVER_PORT, INST_REPO, INST_EXEC, INST_EXEC_KEY, RUN_SEQ, START_TIME, END_TIME, EXECUTION_TIME, STATUS, HAS_ERROR, IGNORE_ERROR,  NORM_MACHINE, NORM_SERVER, NORM_TYPE, NORM_SERVICE, NORM_CONTAINER, NORM_INST_MACHINE,  NORM_INST_SERVER, NORM_INST_REPO, NORM_INST_EXEC, SERVICE_ID  from AL_HISTORY where OBJECT_KEY = 101238>.
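Since the failure is intermittent and clears on rerun, one way to narrow it down is to poll the repository with the same statement from outside Data Services and see whether the link itself drops. A rough sketch using pyodbc; the repository database name and authentication are assumptions:

    import time
    import pyodbc  # pip install pyodbc

    # Server name is taken from the error above; database/credentials are placeholders.
    conn_str = ("DRIVER={SQL Server Native Client 10.0};"
                "SERVER=US5023SQL;DATABASE=DS_REPO;Trusted_Connection=yes")

    while True:
        try:
            conn = pyodbc.connect(conn_str, timeout=5)
            conn.execute("select OBJECT_KEY from AL_HISTORY where OBJECT_KEY = 101238")
            conn.close()
        except pyodbc.Error as exc:
            print(time.strftime("%Y-%m-%d %H:%M:%S"), "link failure:", exc)
        time.sleep(60)

If this logs failures at the same times the jobs die, the cause is on the network/SQL Server side (failovers, idle-session resets and the like) rather than inside DS.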

 

Thanks,

 

Ken

DS 4.2 Invalid range Excel


Good day!

We are using SAP Data Services 4.2 SP03.

I am trying to load an Excel 2010 file into a temp table, and I have to choose the custom range AAB5:AAZ323 (Extend range: Yes).

When I execute the job, I get the error "Range <AAB5:AAZ323> is invalid" (my log is not in English, so this is my translation).

Could anybody tell me how to use this range?
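For what it's worth, the range looks syntactically valid (AAB5:AAZ323 is 25 columns by 319 rows), and that can be confirmed outside DS with a short script. A sketch using the openpyxl library; the file path is a placeholder:

    from openpyxl import load_workbook  # pip install openpyxl

    wb = load_workbook(r"\\share\data\source.xlsx")  # placeholder path
    ws = wb.active  # or wb["SheetName"] for a specific worksheet

    cells = ws["AAB5:AAZ323"]  # the same custom range the job uses
    print(len(cells), "rows x", len(cells[0]), "columns")  # expect 319 x 25

If the script reads the range without complaint, the problem is in how DS parses the range rather than in the workbook itself.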

Thank you!

SAP DS job fails with error LOAD_PROGRAM_NOT_FOUND, only on the application server


Hello,

 

When I start the batch job for Business Objects in "generate_and_execute" mode, it runs perfectly as long as all the jobs run on the central instance. When the jobs run on the application server, there is a chance that a job gets canceled/aborted. The strange thing is that it is not always the same job that aborts; some jobs run perfectly on the application server.

 

Here are some parts of the job log:

 

Runtime error      LOAD_PROGRAM_NOT_FOUND
Date and time      26.02.2015 14:40:26

 

Short text

Program "ZW16000660" not found.

 

What happened?

There are several possibilities:

 

Error in the ABAP Application Program

 

The current ABAP program "RSBTCRTE" had to be terminated because it has come across a statement that unfortunately cannot be executed.
or
Error in the SAP kernel.

 

The current ABAP "RSBTCRTE" program had to be terminated because the
ABAP processor detected an internal system error.

 

434 * was successful or not:
435
436 * d023157  13.4.2004  commented out; see message 215477 / 2004
437
438 *     LOAD REPORT STEP_TBL-PROGNAME PART 'HEAD' INTO ABAP_PROG_LOAD.
439 *     IF SY-SUBRC > 0.
440 *        IF SY-SUBRC = 4. "Load not there or not updated
441 *           MESSAGE I524.
442 *        ENDIF.
443 *        GENERATE REPORT STEP_TBL-PROGNAME.
444 *            IF SY-SUBRC > 0.
445 *               MESSAGE A525.
446 *            ENDIF.
447 *     ENDIF. "report exists and is generated
448 * end of prolog
449 * d023157  13.4.2004
450
451 * passing the recent date for archiving
452     MOVE sy-datum TO out_archive_params-datum.
453
454     SUBMIT (step_tbl-progname)
455       TO SAP-SPOOL WITHOUT SPOOL DYNPRO
456       USER step_tbl-authcknam
457       USING SELECTION-SET step_tbl-variant
458       SPOOL PARAMETERS out_print_params
459       ARCHIVE PARAMETERS out_archive_params
460
461       AND RETURN.
462
463 * insert WO
>>>>     IF trace_level > btc_trace_level1.
465       CALL 'WriteTrace'
466        ID 'CALL' FIELD caller
467        ID 'PAR1' FIELD sy-langu.
468     ENDIF.
469
470 * d023157  13.4.2004
471     IF sy-subrc NE 0.
472       MESSAGE i645 WITH 'SUBMIT: sy-subrc =' sy-subrc.      "#EC NOTEXT
473       MESSAGE a525.
474     ENDIF.
475 * d023157  13.4.2004
476
477   ENDIF.
478   SET EXTENDED CHECK OFF. "ToDo:in applications this is not allowed
479   FREE MEMORY.
480   SET EXTENDED CHECK ON.
481 ENDLOOP.
482
483 * Begin c5020400  15.4.2004

Metadata table for 'Delete data before loading' option


Guys, which metadata table stores the 'Delete data from table before loading' option for a target table?

Data Services OS technical deployment possibilities and licensing


I'm new to Data Services. My questions are:

 

1. Is it technically possible to run some of the server components on Solaris and some on Linux, e.g. host the web tier on Linux while the Job Server and Access Server are on Solaris?

 

2. If this is technically possible, would it require two licenses - one for Solaris and another for Linux? (I've seen other products that are licensed by operating system, where unbundling is prohibited. Is that the case here, and does it apply to the scenario described above?)

 

Thanks

 


Way to see all of a job at once?


Is there a way to either 'zoom out' on a job so that you can see what all the workflows inside it are doing on one screen, instead of having to drill down into each workflow to see its contents,

 

OR

 

A way to see everything that a job is doing in some kind of text format?  I know you can see the SQL that's being pushed down.  Is there a way to see what is happening with the statements that AREN'T being pushed down?


Question on Installing Information Steward


I installed Data Services and IPS on one server. I am planning to install Information Steward (IS) on a separate server, but the IS installation did not allow me to move to the next step because the prerequisites were not met: IPS is not found on the IS server.

 

My idea was to install IS on a separate server and share the web components with the IPS/DS server.

 

Questions: Is this scenario possible or not?

 

Do I need to install IS where DS and IPS are installed?

 

I cannot install IS on the DS server due to the licensing model we have.

 

Any ideas?

Template tables are becoming permanent tables in Data Services


I create a template table as a target object, and then if I log out and log back in to Data Services, it has become a permanent table.

Is this a new feature? Is there some way to disable it?

 

Data Services version 4.2

Master Data cleansing, deduplication


Hi Folks,

 

I would like to download the pre-built ATL files for cleansing and deduplication of master data (Customer and Vendor master files). Do I need to purchase a license to download those ATL files? If so, what package do I need to install? It would be great if you could provide the path to the location of the ATL files. I am using Data Services 4.2.

 

Appreciate your help.

 

Thanks,

How to extract data from SharePoint using BODS


Hi,

 

Existing Scenario:

Data is extracted from SharePoint in xls format and placed on a shared path, from where it gets picked up by BODS and loaded into the target system. This extraction from SharePoint is a manual activity: each day the file is extracted by hand and placed for BODS.

 

Desired Scenario:

To avoid this manual activity, we need to extract the data directly from SharePoint via BODS and load it into the target system without any manual intervention.

 

Can this be achieved?
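One common approach, assuming the SharePoint library is reachable over HTTP(S), is to script the download so the file lands on the same shared path BODS already polls, and then run the script from a scheduler or a DS script step. A sketch using the requests and requests_ntlm libraries; the URL, account, and paths are placeholders:

    import requests  # pip install requests requests-ntlm
    from requests_ntlm import HttpNtlmAuth

    # Placeholder document URL and target share -- adjust to your site and path.
    url = "https://sharepoint.example.com/sites/team/Shared%20Documents/daily.xlsx"
    target = r"\\fileserver\bods_inbox\daily.xlsx"

    resp = requests.get(url, auth=HttpNtlmAuth("DOMAIN\\svc_bods", "secret"))
    resp.raise_for_status()  # fail loudly if SharePoint rejects the request

    with open(target, "wb") as f:
        f.write(resp.content)

The BODS job itself then stays unchanged; only the manual download step is replaced.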

 

Any pointers will be helpful.

 

Thanks,

Sumant

 


DS Migration 4.2.4 - variable value showing special characters


Hi,

After migrating from DS 4.2.3 to 4.2.4, one of the data flow conditional variable values is passed as '¿¿¿' instead of the actual characters.

I have tried the following to see whether the issue resolves:

 

1. Global variable value set in a script and printed - OK

2. Run in debug mode - OK

3. Change the scope to local and pass the parameter value - not working

4. Pass the variable as a condition in the data flow - not working; in fact it converts the actual value into inverted question marks.
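One more check worth adding to the list above: dump the raw bytes of the value at the point where it goes wrong, to confirm whether the characters are really substituted in storage (a codepage/locale conversion on the job server) or only rendered incorrectly by the viewer. A sketch of the idea in Python; inside DS the equivalent is printing the variable from a script:

    # If the received value truly contains U+00BF (inverted question mark),
    # the conversion happened upstream (job server locale/codepage settings),
    # not in the display layer.
    value = "¿¿¿"  # paste in the value the downstream step received
    print(value.encode("utf-8").hex())  # "c2bf" repeated -> genuine substitution characters

If the bytes really are substitution characters, comparing the job server locale/codepage settings between the 4.2.3 and 4.2.4 installations would be the next step.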

 

Any ideas or help will be much appreciated.

 

Thanks

Jagan

Fault-tolerance-login-failed (while connecting to Salesforce)


Hi,

I have been getting this error for the past couple of weeks when connecting to Salesforce. This is the same connectivity I had been using for the past few years without any issue. I checked this site for solutions and saw that changing the JVM options is the suggested resolution, but my server does not sit behind any proxy and the installation was all default. Not sure how to get this resolved. Any advice?

 

Error:

Fault-tolerance-login-failed:There was a communication error when talking to Salesforce.com: (0)null

Retry:1. Waiting 5 mins before trying to login to Salesforce.com again.

There was a communication error when talking to Salesforce.com: (0)null

 

Solutions tried and failed:

1) I updated the JVM options to see if it would work (see the note below on the proxy settings format).

-Xms512m -Xmx1024m -Dhttp.proxyHost=https://www.salesforce.com/services/Soap/u/16.0 -Dhttp.proxyPort=8080


2) Tried using SOAP API version 18 or higher (the adapter would not accept this version).
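A side note on attempt 1: the standard JVM proxy properties expect a bare proxy host name and port rather than a service URL, so if a proxy were in play the options would conventionally look more like the following (proxy.example.com is a placeholder):

    -Xms512m -Xmx1024m -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080
    -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080

Since this server sits behind no proxy, those settings should be unnecessary either way.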


Thanks.

Slow extract SAP ECC 6


Good day everyone,

 

I've set up an SAP connection as described here:

RFC transport method - Enterprise Information Management - SCN Wiki

 

That seems to work, but when extracting from tables VBRP, VBRK and T006A the query is really slow. The source is SQL Server 2008 R2, and I did not find that the SAP DS query is one of the expensive queries in SQL Server Management Studio.

 

There are about 4 processes on the SAP production system running the report /BODS/SAPLBODS.

 

The monitor shows about 289,000 rows processed in about 35 minutes, and then the job throws the following error.

 

2584    6072    R3C-151001    3/2/2015 10:50:45 AM    Error calling RFC function to get table data: <RFC_ABAP_RUNTIME_FAILURE-(Exception_Key: TIME_OUT)- Time limit exceeded.>.

 

SAPDS version is 4.2
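The TIME_OUT exception usually means the read ran past the dialog work process limit (rdisp/max_wprun_time). One way to test whether slicing the read keeps each call under the limit is to pull the table over plain RFC in chunks; a rough sketch using SAP's open-source PyRFC library and the standard RFC_READ_TABLE module (connection details are placeholders, and note RFC_READ_TABLE caps each row at 512 bytes):

    from pyrfc import Connection  # SAP's open-source PyRFC library

    # Placeholder connection details -- replace with your system's.
    conn = Connection(ashost="sapecc.example.com", sysnr="00",
                      client="100", user="EXTRACT", passwd="secret")

    chunk, skip = 50000, 0
    while True:
        result = conn.call("RFC_READ_TABLE",
                           QUERY_TABLE="VBRK",
                           DELIMITER="|",
                           ROWCOUNT=chunk,
                           ROWSKIPS=skip)
        rows = result["DATA"]  # each entry is {"WA": "field1|field2|..."}
        if not rows:
            break
        # ... append the rows to a staging file or table here ...
        skip += chunk

If each slice returns quickly, the longer-term fix is usually to run the extraction in background/batch mode or raise the timeout on the SAP side, rather than reading the whole table in one dialog call.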

 

Any idea or suggestion?

 

Thanks in advance.


ECC Data Extract - RFC or OData, which is faster?


Hi Experts.

 

For SAP ECC data extraction, we can use RFC, an ABAP program, or OData.

Does OData perform better?

Thanks in advance.

 

Baroni

Connection of Crystal Reports 11 with Sybase 15.7 on Windows


Greetings, does somebody have a step-by-step guide on how to connect Crystal Reports 11 with Sybase 15.7 on Windows?

I appreciate all your help.

Thanks

Robinson

How to capture error_message() when I use the "Error Handling"/"Use Overflow file" option in a target table


Hi,

 

I am using the "Error Handling"/"Use Overflow file" option in a target table to capture errors like unique constraint violations.

So when a duplicate key arrives, the row is written to the overflow file, the job still executes successfully, and the error is shown in the error file.

But if I use this option, the catch block is never executed, even when there are errors.

Here I want to capture error_message() into a global variable (like $ERROR = error_message()), but that can be done only in a catch block.

 

Can someone explain how to capture the error message when we use this option on a target table?
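A possible workaround, since the job finishes successfully and the rejects land in the overflow file: add a script step after the data flow that inspects the overflow file and raises its own error (or sets the global variable). A sketch of the idea in Python; the path is a placeholder, and in DS script the same check can be built from file_exists() and raise_exception():

    import os

    # Placeholder: the overflow file configured in the target table's options.
    overflow = r"\\fileserver\bods\overflow\tgt_customer.ovf"

    if os.path.exists(overflow) and os.path.getsize(overflow) > 0:
        with open(overflow, encoding="utf-8") as f:
            first = f.readline().strip()
            count = 1 + sum(1 for _ in f)
        # Surface the reject details where a scheduler or audit table can see them.
        raise RuntimeError(f"{count} rows rejected; first reject: {first}")

This way the overflow behavior is kept, but the failure still becomes visible to whatever monitors the job.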

 

 

Thanks ,

Amar.

Most elegant way to combine Table_Comparison with Row_Generation


I have a situation where I want to load a dimension table in a similar way to the one described on this wiki page. Now I'm trying to achieve the same thing, but using a Table_Comparison and a Key_Generation transform, and I'm not sure how to do it. I either get primary key violations when I merge at the end, just before the target table, or my generated keys are ignored and replaced by the Key_Generation. Any suggestions?

BODS Job scheduling issue


Hi,

 

I have created and scheduled a job in BODS. I have now deleted the schedule and the job from BODS, but the job is still executing, and I want to stop it.

I don't know how to do that; can anyone please help me solve the issue?

 

 

Thanks and regards,

Ranjith.
