Channel: SCN : Unanswered Discussions - Data Services and Data Quality
Viewing all 3719 articles

Data Services: job_run_id() in a real-time job script doesn't work


Hi experts,

 

I implemented the following SQL statement in an SAP BODS script. The script is placed in a catch block of a real-time job.

 

     sql('database_name','INSERT INTO table (ID, MESSAGE, SUCCESS) VALUES ([job_run_id()],\'text\',\'true\')');

 


When I execute the same script in a Batch Job (also in a catch block) everything works fine.

 

If I replace the job_run_id() function with a default value (for example '1234'), it works in the real-time job.

 

I already tried storing job_run_id() in a variable and using the variable in the SQL statement, with no success...


Does anybody have an idea?


SAP Data Services version is 4.2.4.
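The variable-based attempt looked roughly like this (a sketch; $v_run_id is a local variable I declared, and the square brackets substitute its value into the SQL string without quotes):

```
# Sketch of the variable-based attempt (table and datastore names are placeholders)
$v_run_id = job_run_id();
sql('database_name', 'INSERT INTO table (ID, MESSAGE, SUCCESS) VALUES ([$v_run_id], \'text\', \'true\')');
```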


Thank you in advance!

Best Regards


Depreciation Simulation Report Data Extract


I am setting up a new model for Fixed Asset Planning. We are on version 10.06 for Microsoft. I am using a Data Services batch job to extract actual values posted to fixed assets, including depreciation. For simulated depreciation, is there any way to extract the data automatically out of ECC or with some function in Data Services so that I can import it into BPC? The data is available in ECC with report S_ALR_87012936.

Uploading files to Google Drive


Hi,

 

I know Python can be used for creating a UDT. But is there a way to FTP files directly to Google Drive by making use of Python in DS?

 

Thx,

DATAFLOW:Data flow is terminated due to error .


Hi experts,

 

We have a job in HCI which would create a file in some folder location.

 

The project has 2 tasks which generate 2 files respectively. The first file is generated fine and its task completes cleanly.

But the second file is not generated, and the task ends with the error below.

 

       (6696:6864)          READER:Waiting for 10 seconds before downloading data for reader XYZ. Retry 1 of 10.

       (6696:6864)          READER:Waiting for 10 seconds before downloading data for reader XYZ. Retry 2 of 10.

       (6696:6864)          READER:Waiting for 10 seconds before downloading data for reader XYZ. Retry 3 of 10.

       (6696:6864)          READER:Waiting for 10 seconds before downloading data for reader XYZ. Retry 4 of 10.

       (6696:6864)          READER:Waiting for 10 seconds before downloading data for reader XYZ. Retry 5 of 10.

       (6696:6864)          READER:Waiting for 10 seconds before downloading data for reader XYZ. Retry 6 of 10.

       (6696:6864)          READER:Waiting for 10 seconds before downloading data for reader XYZ. Retry 7 of 10.

       (6696:6864)          READER:Waiting for 10 seconds before downloading data for reader XYZ. Retry 9 of 10.

       (6696:6864)          READER:Waiting for 10 seconds before downloading data for reader XYZ. Retry 10 of 10.

       (6696:5824)        DATAFLOW:Data flow <data_flow_name> is terminated due to error <50011>.

       (6696:5824)        DATAFLOW:Process to execute data flow <data_flow_name> is completed.

       (6884:7144)             JOB:Job <job_name> is terminated due to error <50011>.

 

 

Can someone give me better insight into this problem and error code 50011? I can share more details.

 

Thanks in Advance

Anil Rudrappa

Need to identify and load the missed out records


Hi All,

 

I have a source table with 3000 records. While loading those records into the target table, 6 records were not loaded.

Now I need to identify the missed records and load only those records in the next run.

Is there any way by which I can achieve this?
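One common approach is sketched below, assuming source and target sit in the same database and share the key column ID (all datastore, table, and column names here are placeholders):

```
# Hypothetical: insert only the source rows whose key is missing from the target
sql('target_ds',
    'INSERT INTO target_table
     SELECT * FROM source_table s
     WHERE NOT EXISTS (SELECT 1 FROM target_table t WHERE t.ID = s.ID)');
```

If the tables live in different databases, the same comparison can instead be done inside a dataflow with a Table_Comparison transform or lookup_ext().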

 

Ankit

What does $NEEDS_RECOVERY stand for?


Does anybody know this variable?

I cannot even find it on Google (but it appears in an SAP DS script in my project...)

 

Thanks in advance

 

Andrzej

Can UDT Custom Options be referenced in python


Hi All,

I'm trying to apply configurable settings to a UDT so it can be changed easily at implementation time, without including them in the input record, since the value is constant for all rows.

The Custom Options are available for user creation and appear in the Python editor (e.g. [$$ISDEBUG]), but I cannot work out how to address them in Python!

I've looked through locals() but cannot see it, and I don't know whether it can be addressed some other way, maybe as an object property.

Has anyone found a way of using the Custom Options?

 

Regards

    Pat

RESTful Web Services


Hi.

I am trying to configure a datastore for a RESTful web service in DS 4.2.

I am using a WADL, but when I look at the function, its input and output parameters are empty.

Why does this happen?

 

image1.jpg


multi-threaded file reader error


I am getting this error " The multi-threaded file reader should be used only if  the maximum row size is less than <100000>. Check the file for bad data, or redefine the input schema for the file by editing  the file format in the UI."

 

I am able to load the files when they are small, but not when they are huge.

Could you please help me with how to deal with this error?

 

Thanks in Advance.

SAP ASE to IQ CDC using Sybase Replication server in SAP DS


Hi ,

 

We are using CDC on a Sybase ASE database in SAP Data Services, loading the data into Sybase IQ.

 

We are using Sybase Replication Server with the PowerDesigner option to implement the CDC.

 

We successfully configured the Replication Server, and CDC is working properly.

 

But we have one issue with our requirement.

 

We plan to use CDC in near real time and need to run the job multiple times per day. We may get bulk changes in the source transactions at month end, as this is the banking sector.

 

With PowerDesigner CDC, a procedure in the background deletes the CDC data based on DS_CHANGE_RETENTION (default is 5 days). The procedure runs once per day (8 AM to 8 AM), and after 5 days the CDC table data is deleted.

 

We need control over the retention period while running multiple times per day, and we must not delete the data before a certain period. But if I do not delete the data and run the job multiple times per day, it tries to insert old CDC data again and we get a primary key violation.

 

If there were a flag in the CDC table that we could mark once a CDC record is loaded into the target, we could keep the CDC data for a few days per the requirement and still run multiple times per day, since only unmarked records would be loaded into the target.
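That marking step might look roughly like this (a hedged sketch; the processed_flag column, cdc_table, and cdc_ds datastore are hypothetical additions to the CDC schema):

```
# Hypothetical post-load script: mark the CDC rows just loaded as processed
sql('cdc_ds', 'UPDATE cdc_table SET processed_flag = \'Y\' WHERE processed_flag = \'N\'');
# The job would then read only rows WHERE processed_flag = 'N'.
```

In practice the update would have to be restricted to the rows actually loaded (for example by a batch ID) so that rows arriving mid-load are not marked by mistake.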

 

We cannot use Sybase Replication Server with the continuous-workflow method, because we cannot store CDC data in intermediate database CDC tables, and we cannot define a stop process for the workflow since it may change often depending on requirements. That is why we opted for the PowerDesigner option for CDC.

 

Could you please help me resolve this issue? Any good approach for this requirement would be appreciated.

 

Thanks & Regards,

Ramana.

Downloading a zip file from FTP using BODS


Hi guys

 

I need to download a zip file from an FTP location using Data Services. I have gone through a few forum threads but am having difficulty understanding them.

 

It would be very kind of you if anyone can help me to understand the code.

 

Data Services 4.2 is installed on Windows.

 

I am trying to use Paul Kessler's reply in the thread http://scn.sap.com/thread/1809259

 

Script in data services dataflow:-

Program executable:  \\ds server name\foldername\myftp.bat
User name:  Intralocal (FTP username)
Password:   Welcome123
Arguments:  $AW_USER $AW_PASSWORD '\\ds server name\foldername\' 'zipfoldner_name.zip' 'ftp host address'

('\\ds server name\foldername' is the path to the FTP executable on your Windows server.)

 

Do I need to give the user name and password again in the arguments?

 

In the job, Data Services will execute this command string:

 

Myftp.cmd FTPUSER <password> 'local_path' 'zipfoldner_name.zip' 'ftp host address'

 

 


I have created the Myftp.cmd file using the code below (copied from the Data Services Designer Guide):

@echo off
rem Build an ftp command file from the arguments, then run ftp in scripted mode
set USER=%1
set PASSWORD=%2
set LOCAL_DIR=%3
set FILE_NAME=%4
set LITERAL_HOST_NAME=%5
set INP_FILE=ftp.inp
echo %USER%>%INP_FILE%
echo %PASSWORD%>>%INP_FILE%
echo lcd %LOCAL_DIR%>>%INP_FILE%
echo get %FILE_NAME%>>%INP_FILE%
echo bye>>%INP_FILE%
rem -s: requires the colon, and the variable name must match INP_FILE (the original had %INPT_FILE%)
ftp -s:%INP_FILE% %LITERAL_HOST_NAME%>ftp.out

 

Question: do I need to give the user name and password details in the cmd file? If yes, where do I specify them (e.g. username = Intralocal)?

SAP DS delete inserts


Hi ,

 

First, I want to delete the target record and then insert the record from the source into the target.

 

Source (first run)                       Target (first run)

ID  Name                                 ID  Name
1   A                                    1   A


Source (second run)                      Target (second run)

ID  Name                                 ID  Name
1   B                                    1   A   (delete)
                                         1   B   (insert)

How can I achieve this?
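One common pattern is sketched below (datastore, table, and key names are placeholders): delete the matching keys in a script before the dataflow runs, then load the new rows as plain inserts.

```
# Hypothetical pre-load script: remove target rows whose keys arrive in this run
sql('target_ds', 'DELETE FROM target_table WHERE ID IN (SELECT ID FROM staging_table)');
# The dataflow then loads the source rows as normal inserts.
```

An alternative inside a dataflow would be a Map_Operation that turns matching rows into deletes, but the script-plus-insert approach is usually simpler to reason about.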

 

Thanks & Regards,

Ramana.

Where (and how) to place the Data_Transfer to push down an aggregate function


Hi there,

 

In my dataflow, a PIVOT is followed by a QUERY.

The Query contains an aggregate function.

Now I want to push down the aggregate function using a Data_Transfer transform; I have never done this before.

 

Should I insert a Data_Transfer between the PIVOT and the QUERY? And how exactly do I realize it?

Last question: after implementing it, how can I check whether the function was really pushed down? Can I see this in the interactive debugger, in the trace or monitor log, or somewhere else?

 

Thanks so much.. (I am a BODS beginner...).

 

Andrzej

How to load data from Oracle & ECC systems to a flat file (SFTP) as target


Hi All,


We got a requirement

1.) Load the data from the SAP ECC system, using a custom ABAP program, to a flat file (.txt) on an SFTP server (the target system).

 

and

 

2.) Load data from an Oracle DB to a flat file (.txt) on an SFTP server (the target system).

 

I know how to extract the data, but the problem is loading the data into flat files on the target SFTP server.

 

We are using BODS version 4.2.

 

Source systems: ECC 6.0 & Oracle 11g

Target system: SFTP file server

 

 

your efforts would be appreciated.

 

Thanks in advance

BODS with BI Tools


Hello,

 

We have a client who has acquired BI tools (Webi, Dashboards, Crystal Reports, Lumira) along with BODS. The source system is SAP ECC. I want to know where BODS would fit in this.

 

Is BODS required between ECC and the BI tools? Or will any scenario like this ever come up, and of what type?

I am not sure whether BODS can feed the BI tools, or do ETL for them. Is this possible?

 

many thanks


IN clause in ABAP dataflow


Hi All,

 

I have an ABAP dataflow which has the below condition in the where clause of a query transform:

 

KNA1.KUNNR in ($G_CustNo)

 

I have tried setting global variable to following values.

$G_CustNo = 123,456,789

$G_CustNo = '123','456','789'


Still, the ABAP dataflow takes only the first customer number into consideration. IN does not work; it behaves like the = operator on the first value.


Any idea how to resolve this?
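A bound global variable is typically substituted as a single value, which would explain the = behavior. One hedged workaround sketch (not verified on 4.2; names are placeholders) is to split the list yourself with word_ext() and process one value per iteration:

```
# Hypothetical: iterate over the comma-separated list in $G_CustNo
$v_i = 1;
$v_cust = word_ext($G_CustNo, $v_i, ',');
while ($v_cust IS NOT NULL)
begin
    # run the extraction for this single customer number,
    # e.g. pass $v_cust as a parameter into the ABAP dataflow
    $v_i = $v_i + 1;
    $v_cust = word_ext($G_CustNo, $v_i, ',');
end
```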


Thanks in advance.

Designer workspace: Display object descriptions


Hi guys,

 

In the training we were shown how to enable the display of object descriptions in the Designer workspace.

I think there are three settings to be made... unfortunately I cannot find my notes on this topic.

 

Can someone provide the solution?

 

Thanks,

 

Andrzej

Centralized Logging x SAP BODS


Hi,

 

I want to know if there is any component or configuration that allows integrating SAP BODS ETL execution logs with centralized logging; in my company that is the ELK Stack (Elasticsearch, Logstash, and Kibana).

 

 

 

I am asking because my company has determined that all processes should use the centralized logging infrastructure, but in my search I did not see anything specific for this situation. I am thinking we will need to develop a specific component for it.

 

 

 

I'll appreciate any advice.

 

 

 

Regards

Performance issue: how to find the bottleneck?


Hi there,

 

I just started with SAP DS and have a performance issue in my dataflow.

 

In SAP ERP I would run a trace (transaction ST05) to see which process takes most time.

 

Where can I check for the performance issue in SAP DS?

 

In the "interactive debugger" or in the "trace log" or somewhere else?

 

Thanks and regards,

 

Philip

Please help me with the below BODS error


Hi Experts,

 

Below is my script to find and insert the job name, job ID, server name, start date, end date, and target row count into a table. I use the script before my DF; only one DF exists in my job.

 

$GV_Job_Name = job_name();
$Batch_ID = GRSAP_GET_JOB_ID();   # custom function
$GV_Start_date = sysdate();
$GV_End_date = sysdate();
$GV_Target_count = sql('KPI_REP', 'Select count(*) from KPIRPT_USR.KPI_INTRADAY');
$GV_SERVERNAME = host_name();

sql('KPI_STG', 'insert into KPISTG_USR.TBJOBCTL (JOBNAME, JOBID, SERVER, STARTDATETIME, ENDDATETIME, ROWCOUNT)
values ({$GV_Job_Name}, {$Batch_ID}, {$GV_SERVERNAME}, {$GV_Start_date}, {$GV_End_date}, {$GV_Target_count})');

##

 

 

I'm unable to troubleshoot the below error. Kindly help me with what the issue is.

 

|Session REP_ATOMIC_MEASURES_EYENET_TO_KPI_INTRADAY Oracle <(DESCRIPTION =(ADDRESS_LIST =(ADDRESS = (PROTOCOL = TCP)(HOST = usaldblxd012)(PORT = 1521)))(CONNECT_DATA =(SID=COKPIUAT)))> error message for operation <OCIStmtExecute>: <ORA-01861: literal does not match format string >.

 

(14.2) 02-17-16 14:14:29 (E) (26432:7488) RUN-050304: |Session REP_ATOMIC_MEASURES_EYENET_TO_KPI_INTRADAY Function call <sql ( KPI_STG, insert into KPISTG_USR.TBJOBCTL(JOBNAME,JOBID,SERVER,STARTDATETIME,ENDDATETIME,ROWCOUNT) values('REP_ATOMIC_MEASURES_EYENET_TO_KPI_INTRADAY','02_17_2016_14_14_25_54988__7ce40fd2_a5f4_49c8_93f0_6dd273d103f9','USALVULXT 010','2016.02.17','2016.02.17','3') ) > failed, due to error <70301>: <Oracle <(DESCRIPTION =(ADDRESS_LIST =(ADDRESS = (PROTOCOL = TCP)(HOST = usaldblxd012)(PORT = 1521)))(CONNECT_DATA =(SID=COKPIUAT)))> error message for operation <OCIStmtExecute>: <ORA-01861: literal does not match format string >.>.

 

(14.2) 02-17-16 14:14:29 (E) (26432:7488) RUN-053008: |Session REP_ATOMIC_MEASURES_EYENET_TO_KPI_INTRADAY INFO: The above error occurs in the context <|Session REP_ATOMIC_MEASURES_EYENET_TO_KPI_INTRADAY|sql(...) Function Body|>.

 

(14.2) 02-17-16 14:14:29 (E) (26432:7488) DBS-070301: |Session REP_ATOMIC_MEASURES_EYENET_TO_KPI_INTRADAY Oracle <(DESCRIPTION =(ADDRESS_LIST =(ADDRESS = (PROTOCOL = TCP)(HOST = usaldblxd012)(PORT = 1521)))(CONNECT_DATA =(SID=COKPIUAT)))> error message for operation <OCIStmtExecute>: <ORA-01861: literal does not match format string >.
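ORA-01861 generally means a date/time literal does not match the expected format mask; in the failing statement the datetimes arrive as the plain string '2016.02.17'. One hedged fix sketch (assuming STARTDATETIME/ENDDATETIME are Oracle DATE columns and the $GV_* variables are redeclared as varchar) is to format the values explicitly on both sides:

```
# Hypothetical fix: format the DS datetime as text, then convert it back
# in Oracle with an explicit TO_DATE mask so both sides agree
$GV_Start_date = to_char(sysdate(), 'yyyy.mm.dd hh24:mi:ss');
$GV_End_date   = to_char(sysdate(), 'yyyy.mm.dd hh24:mi:ss');
sql('KPI_STG', 'insert into KPISTG_USR.TBJOBCTL (JOBNAME, JOBID, SERVER, STARTDATETIME, ENDDATETIME, ROWCOUNT)
values ({$GV_Job_Name}, {$Batch_ID}, {$GV_SERVERNAME},
        TO_DATE({$GV_Start_date}, \'YYYY.MM.DD HH24:MI:SS\'),
        TO_DATE({$GV_End_date}, \'YYYY.MM.DD HH24:MI:SS\'),
        {$GV_Target_count})');
```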

 

 

 

Thanks in advance,

Venkat.
