Channel: SCN : Unanswered Discussions - Data Services and Data Quality
Viewing all 3719 articles

Stored procs through SQL transforms as source: huge Cartesian activity visible in Monitor


Simple scenario

 

Three SQL transforms execute stored procedures in SQL Server as sources.

A Query transform then joins the three result sets, and the flow continues with other processing.

 

Before anyone asks why we don't push this down to the database and so on: that is not possible. There is a lot of complexity behind this, and the job runs as part of an application that splits the data into batches, running one at a time, with large data volumes. Much of it is out of our control.

 

The problem is that the join is initially ignored entirely: the monitor log shows the row count running into billions for data sets in the region of 20,000-40,000 rows.

The job then applies the simple join, returns the correct data, and completes the rest of the batch without issue, except that a ridiculous amount of time has passed.

 

Is there any way to avoid this? Any ideas?


No delta records flowing in from ECC extractor 2lis_06_inv


Hi all,

I have a requirement to extract data from extractor 2lis_06_inv.

I am doing the ETL in BODS.

 

I ran the setup-table load in ECC, and the initial load from the BODS job was fine.

But when I try to run the V3 job in ECC to extract delta data, I see that the queue for application 06 is missing in LBWQ.

So running V3 will not bring any records into the delta queue.

Can one of you please help me with how to run V3 and bring in delta data for this extractor?

 

snapshot : lbwq

Capture.JPG

snapshot : V3 for application-06

1.JPG

snapshot: BODS Dataflow

2.JPG

SAP DS integration with Peoplesoft


Hi Experts

 

I am working on migrating PeopleSoft HCM data. I heard that SAP provided Rapid Marts for PeopleSoft, but they no longer seem to be available. Can anyone suggest where I can get those Rapid Marts to extract data from PeopleSoft HCM?

 

Thanks

Prasannakumar.

HANA & Essbase Connection through BODS


Hi All,

 

 

I went through many SDN forums on the HANA and Essbase connection through BODS but couldn't get a solid answer. Can you please help me understand whether we can connect HANA and Essbase through BODS?

 

Right now, we have planned a file transfer through SFTP: BODS will pull the data from HANA and write a file on the BODS server, and from there the file will be SFTP'ed to the Essbase server. But is there a more optimized way of transferring the data between these systems through BODS?

 

Thanks

Ganapathi.

Script to delete today's data in the target table


Hi All,

 

I loaded the data below into a table, adding a new column LOAD_DATE.

 

How do I write a script to delete the data if it was already loaded the same day?

For example, I loaded the data today and want to rerun the job today. Today's data should be deleted and then loaded again.

What is the script for this?

Thanks in advance.


This is the source data.

EMPNO | ENAME | JOB | MGR | HIREDATE | SAL | COMM | DEPTNO
1001 | SMITH | CLERK | 7902 | 17-Dec-80 | 800 | NULL | 20
1002 | ALLEN | SALESMAN | 7698 | 20-Feb-81 | 1600 | 300 | 30
1003 | WARD | SALESMAN | 7698 | 22-Feb-81 | 1250 | 500 | 30
1004 | JONES | MANAGER | 7839 | 02-Apr-81 | 2975 | NULL | 20
1005 | MARTIN | SALESMAN | 7698 | 28-Sep-81 | 1250 | 1400 | 30
1006 | BLAKE | MANAGER | 7839 | 01-May-81 | 2850 | NULL | 30
1007 | CLARK | MANAGER | 7839 | 09-Jun-81 | 2450 | NULL | 10
1008 | SCOTT | ANALYST | 7566 | 09-Dec-82 | 3000 | NULL | 20
1009 | KING | PRESIDENT | NULL | 17-Nov-81 | 5000 | NULL | 10
1010 | TURNER | SALESMAN | 7698 | 08-Sep-81 | 1500 | 0 | 30
1011 | ADAMS | CLERK | 7788 | 12-Jan-83 | 1100 | NULL | 20
1012 | JAMES | CLERK | 7698 | 03-Dec-81 | 950 | NULL | 30
1013 | FORD | ANALYST | 7566 | 03-Dec-81 | 3000 | NULL | 20
1014 | MILLER | CLERK | 7782 | 23-Jan-82 | 1300 | NULL | 10
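A minimal sketch of the kind of pre-load step meant here, as a Data Services script object placed before the dataflow. The datastore name DS_TARGET and table name EMP_TGT are hypothetical, LOAD_DATE is assumed to be a DATE column, and the CURRENT_DATE syntax varies by target database:

```text
# DS script step, run before the dataflow: delete any rows already loaded
# today, then let the dataflow reload them. Names are hypothetical.
sql('DS_TARGET', 'DELETE FROM EMP_TGT WHERE LOAD_DATE = CURRENT_DATE');
```

With this in place, the job can be rerun on the same day without duplicating that day's rows.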

Creating Hive Adapter in Data Services 4.2 SP03


I have installed Data Services 4.2 SP03 on the Hadoop cluster.

Can anyone help with creating the Hive adapter?

 

I get this error message:

Adapter connection failed. Additional information

 

Adapter client host not known

you need to start the adapter synchronously to be able to connect to it

 

Any help?

 

Felix

SAP DBTech JDBC: [270]: not enough values site


Hi All,

 

I am trying to execute the query below but am getting the error SAP DBTech JDBC: [270]: not enough values.

 

INSERT INTO TEDPOC.Z_MFG_ORDER_ITEM_SHIP_HISTORY
SELECT
  CURRENT_TIMESTAMP,
  tA.*
FROM TEDPOC.Z_ORDER_ITEM_SHIPMENT tA
WHERE FISCAL_YEAR = 2014
  AND FISCAL_QUARTER = 'Q1';

 

Am I missing anything in this query? Please advise.
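On HANA, error 270 ("not enough values") usually means the SELECT list supplies fewer values than the target table has columns, so here the history table likely has one or more columns beyond the timestamp plus everything in Z_ORDER_ITEM_SHIPMENT. A hedged sketch (the column names are hypothetical) that makes the two lists match explicitly:

```sql
-- Hypothetical column names: list the target columns explicitly so each
-- SELECT expression maps to exactly one target column.
INSERT INTO TEDPOC.Z_MFG_ORDER_ITEM_SHIP_HISTORY
  (SNAPSHOT_TS, ORDER_ID, ITEM_ID, FISCAL_YEAR, FISCAL_QUARTER)
SELECT
  CURRENT_TIMESTAMP,
  tA.ORDER_ID,
  tA.ITEM_ID,
  tA.FISCAL_YEAR,
  tA.FISCAL_QUARTER
FROM TEDPOC.Z_ORDER_ITEM_SHIPMENT tA
WHERE tA.FISCAL_YEAR = 2014
  AND tA.FISCAL_QUARTER = 'Q1';
```

Comparing an explicit list like this against the target table's definition shows which column the original statement left unfilled.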

 

Thanks,

Abdulrasheed.

Huge memory consumption when loading to Hive from ODP


Hi BODS Gurus,

 

     My scenario is loading data into Hive from ODP via Data Services. Before loading into Hive, Data Services generates a large temp file very slowly, and at the same time Java threads consume a huge amount of memory, up to 10 GB.

     When I run 4 jobs at the same time, the job server crashes with the following error:

     -------------------------------------------------------------------------------------------------------------------------------- 

      /usr/local/hive/bin/hive-config.sh: fork: retry: Resource temporarily unavailable
     
/usr/local/hadoop/bin/../libexec/hadoop-config.sh: fork: retry: Resource temporarily unavailable
     
/usr/local/hadoop/bin/../libexec/hadoop-config.sh: fork: retry: Resource temporarily unavailable

     
Logging initialized using configuration in file:/usr/local/hive/conf/hive-log4j.properties
     
Exception in thread "main" java.lang.OutOfMemoryError: unable to create new native thread

     ---------------------------------------------------------------------------------------------------------------------------------

error.gif

   

     1. How could I split the large temp file into several smaller temp files and load them into Hive?

     2. What causes the Java threads to consume so much memory, and how can I reduce the memory consumption?

 

Thanks in advance.

 

Best Regards,

 

Andy


sleep for 15 mins - configuration


Hi All,

 

I have a batch job that loads CSV files. The files arrive between 11 pm and 12 am.

 

I want to run the job whenever a file is received.

 

How do I configure a 15-minute sleep between checks?
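One common pattern is a script object with a polling loop before the dataflow. A minimal sketch, assuming a hypothetical inbound path and file name; note that Data Services' sleep() takes milliseconds, and the built-in wait_for_file() function is an alternative worth looking at:

```text
# DS script: poll every 15 minutes until the file shows up, then fall
# through to the dataflow that loads it. Path and name are hypothetical.
while (file_exists('/data/inbound/orders.csv') = 0)
begin
   sleep(900000);   # 900,000 ms = 15 minutes
end
```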

 

Thanks in advance.

Data Services 4.2 Profiler Configuration?


Hi,

 

I am attempting to configure a Data Services Data Profiler. I found SAP Note 0001336441 ("How to configure Data Profiler in Data Services") and am attempting to follow the same steps, but obvious differences between the XI version and 4.2 SP5 are causing issues.

 

Installation & Software

SQL Server 2012 R2 - Database Server

Server 1 - SAP BusinessObjects BI Platform 4.1 Support Pack 6 Patch 2

Server 1 - SAP Information Steward Version 4.2 Support Pack 5 Patch 3

Server 2 - SAP Data Services Version 4.2 Support Pack 5 Patch 3

 

Note: Server 1 shares the administrative load for Information Steward and Data Services. The Data Services Job Server required for the Information Steward installation on Server 1 is not used, to comply with license restrictions.

 

Steps taken so far:

  1. Create the profile repository database (Get Version response: BODI-320034: The profiler repository version: <14.2.5.0>)
  2. Define the profile repository database on the Job Server (Server 2); see attached JobServer_Profiler_Configuration.JPG
  3. Define the profile repository database on the CMS (Server 1)

CMS_Profiler_Configuration.JPG

    1. Profiler Server Host Name: Server 2
    2. Same database server as set for Job Server
    3. Same user name/password as set for Job Server

 

Due to security requirements the installation needs to use encrypted IP traffic between the servers.

 

Any suggestions would be appreciated.

 

Thanks

David Cooper

Data load management in HANA studio


After I finished the related SLT configuration, I got my SLT schema in HANA Studio, and I found that the SAP system basic tables (DD08L, DD02L, DD02T) were scheduled to load into HANA.

After a long wait (almost a day and a night), the data for those three tables was in the HANA database, and I continued using "Load" in Data Load Management. The data-loading job is scheduled, but its status never changes. I checked the SLT monitor in SAP, which is connected to HANA, and the BGD job is paused. Is there a problem with my SAP and SLT servers' resource allocation?

 

Actually, my SAP and SLT systems share the same server. Could that be slowing down the transfer? If so, how can I fix it? If not, what is my problem?

 

If you have any clue about this issue, please share it with me. Thank you very much.

 

Regards.

Tina

Issue while importing SAP function into Data Services


I am trying to import the SAP RFC GL_ACCT_MASTER_SAVE_RFC into SAP BO Data Services. It gives the error: Cannot import function. Parser detected an unknown data type. Notify Customer Support (BODI-1112335).

 

I see no issue opening this RFC in SAP S/4HANA and test-running it with SE37.

 

Any pointers to solve this would be helpful.

 

 

Thanks,

Partha

Triggering BW Interrupt with BODS.


Hi Experts,

 

Can anyone please advise how to trigger a BW interrupt using BODS? We have tried importing the Function Module; however, we receive the RFC error RFC_NO_AUTHORITY.

 

We have followed the method provided in the below thread:

http://scn.sap.com/community/data-warehousing/bw/blog/2014/04/07/how-to-trigger-bw-process-chain-from-bods

 

We have tried triggering events using the FM ROPC_EVENT_RAISE, but there is no option in the function to provide a destination.

 

Please let us know if anyone has implemented the same.

 

Also, please advise whether we can trigger a program in any RFC destination using BODS.

 

Thanks,

Mahesh.

SAP Data Services


Hello Everyone,

 

I am planning to learn and move to SAP Data Services (DS).

 

What are the opportunities for DS, and can you point me to startup material to learn from?

Excel source file closing Data Services


I'm experiencing some issues with Data Services 4.2 when working with the BPDM > RDM templates.

 

So now I'm searching for help to figure out the issue instead of changing the templates I'm using, of course assuming a solution exists for this little problem.

 

Detailing what I'm doing and what is happening:

 

I'm trying to use the template source file for Material Master (MaterialMaster.xls) provided with the BPDM > RDM templates; the file is located in the "Migration_ERP\Source_Files_DI_IN\" folder.

 

So, I changed the global variable path ($G_SourceData_Path) to the corresponding folder (\\server\...\Migration_ERP\Source_Files_DI_IN\), and then I tried to open the file using the magnifying glass to view its contents. When I click the magnifying glass, Data Services closes without any error message.

 

I tried some other things, such as replacing the variable $G_SourceData_Path in each file with the full path instead of using variables. This works well, and I experience no issues that way.

I also tried Substitution Parameters at repo level, using the parameter as the path. This also works well.

 

But what I really need is a way to make the global variable $G_SourceData_Path work, and after trying to do that for some days without success, I decided to write this post to check what I'm missing, or whether this is a Data Services bug.

 

In advance, thanks for any kind of help you can provide on this subject.

 

Regards,

--JOnAS--


SAP Data Services Code page issue


Hi ,

 

I have an Oracle source, and Sybase IQ is the target. Oracle has the character set AR8ISO08859P6, and IQ has the Windows-1256 charset.


We have set NLS_LANG = American_AMERICA.AR8ISO08859P6.


In SAP Data Services, we have set the code page to ISO8859-6 and the language to Arabic. For IQ we used code page CP1256, with the language also Arabic.


Oracle source :



IQ Target:


 

But we are still getting junk characters.

 

Please let me know what additional settings I have to make.

 

What is the correct code page in SAP Data Services for this Oracle character set?

 

Are my settings correct? If not, please suggest the right code page.

 

Thanks & Regards,

Venkata Ramana Paidi

OpenHub - no response from web service


I'm in the process of migrating a client from DS 4.0 to DS 4.2 SP6 Patch 2. We've managed to get most of the jobs working correctly, but we're seeing the "no response from web service" error with jobs that contain OpenHub tables, and only in one repository in the Dev environment; they all work fine from the other Dev repos.


The repos where this works were migrated from the old Dev environment to the new Dev environment. The one where it doesn't was migrated into the new Dev environment from the old Production environment, and the datastore configurations were then modified to connect to the Dev systems. The funny thing is that when I migrate the datastore and job from a repository where this works to the one where it doesn't, it still doesn't work!


The only SAP Note I've found on this issue, 1811249, doesn't apply, because the datastore connects to the same URL in all of the repositories.


Does anyone have any thoughts about why this might be happening?


Thanks!


-Dell

Unable to connect to an SFTP server using Dataservices File Location Object


Dear All,

 

It will be helpful if anybody can help me to resolve the issue.

 

I am trying to connect to an SFTP site, providing the required information; however, I am getting the error below.

 

1.jpg

 

I am able to connect to the site through SFTP client tools like WinSCP, but not through BODS.

 

As you can see in the screenshot below, the connection is through a public key.

2.jpg

Thanks & Regards,

Neel

Alternative solution for the query involving CTE and Temp table


Hi All

      I need to display the count of jobs running in the SAP BODS repository table at the Start_time and End_time of each job present in the table.

The table structure consists of :

job_details.png

 

The Expected output should be in the format:-


RunID | JobName | Start_time | End_Time | JobsCount at StartTime | JobsCount at EndTime |


The following query was tried on SQL Server 2012:


declare @count int = 1,
        @max int,
        @stime datetime,
        @endtime datetime,
        @runid int,
        @jobname varchar(1000);

with cte as (
    select distinct RUN_ID, JOB_NAME, START_TIME, END_TIME, EXECUTION_TIME
    from COMP_HIS_TBL (nolock)
    where STATUS = 'Failure'
),
temp_tab as (
    select ROW_NUMBER() over (order by RUN_ID) as row_number, *
    from cte
)
select * into #temp from temp_tab
order by 1, EXECUTION_TIME desc;

select @max = MAX(row_number) from #temp;

while (@count <= @max)
begin
    select @stime = START_TIME, @endtime = END_TIME, @runid = RUN_ID, @jobname = JOB_NAME
    from #temp
    where row_number = @count;

    select @runid as RUN_ID, @jobname as JOB_NAME, @stime as Start_time, @endtime as End_time;

    select count(RUN_ID) as JOBS_AT_START_TIME
    from COMP_HIS_TBL (nolock)
    where @stime between START_TIME and END_TIME;

    select count(RUN_ID) as JOBS_AT_END_TIME
    from COMP_HIS_TBL (nolock)
    where @endtime between START_TIME and END_TIME;

    set @count = @count + 1;
end

--DROP TABLE #TEMP


This works fine for a few jobs, but it ends with the warning "Query completed with errors!" and hardly 1 row retrieved. Please suggest an alternative solution. Is the same requirement really not possible in other ways, such as with joins?
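As a set-based alternative with no CTE, temp table, or loop, the two per-row counts can be expressed as correlated subqueries. This is only a sketch against the same COMP_HIS_TBL columns shown above, mirroring the loop's logic (failed runs in the outer query, counts taken against all rows):

```sql
-- Set-based sketch: for each failed run, count how many runs' intervals
-- cover its start time and how many cover its end time.
SELECT t.RUN_ID, t.JOB_NAME, t.START_TIME, t.END_TIME,
       (SELECT COUNT(*) FROM COMP_HIS_TBL s
         WHERE t.START_TIME BETWEEN s.START_TIME AND s.END_TIME) AS JOBS_AT_START_TIME,
       (SELECT COUNT(*) FROM COMP_HIS_TBL e
         WHERE t.END_TIME BETWEEN e.START_TIME AND e.END_TIME) AS JOBS_AT_END_TIME
FROM COMP_HIS_TBL t
WHERE t.STATUS = 'Failure';
```

This returns one row per failed run with both counts, instead of many small result sets.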



BODS job execution issue from BW InfoPackage


Dear All,

 

We have an issue while executing the job from InfoPackages in BW; the screenshot is attached for reference.

 

error in bods.PNG

 

Thanks,

Rahul


