Channel: SCN : Unanswered Discussions - Data Services and Data Quality
Viewing all 3719 articles

Job Server Issue during upgrade


Hi.

We are running an upgrade from Data Services 4.1 with a mixed landscape (BO + DS) to Data Services 4.2 with a separate landscape (BO // IPS + DS).

 

We successfully followed the upgrade guide until section 4.4: Migrate Data Services Repositories.

 

During this step we realised that only one job server was registered in Server Manager.

We managed to identify the cause: the initial configuration.

Before upgrading we were using two job servers: one located on the source server (BO + DS), and another located on the future server (the upgraded IPS + DS server).

While preparing the upgrade, we unfortunately deleted that job server from Server Manager.

 

 

So now, when trying to recreate the job server configuration from Server Manager on the new server (IPS + DS), we cannot reproduce the initial configuration. I receive the following error: "The repository named 'DATABASEHOST\SCHEMA_REPONAME_USERi' already exist" (BODI-310079).

I can't find any way to reproduce the initial configuration and recover the job server.

 

 

Currently:

I can see all the repositories in the Management Console.

I cannot associate the repositories to which the deleted job server was attached.

 

 

I have a lot of schedules running, so I don't see any way to change the job server information in the repository tables if I then have to recreate all the schedules.

 

How can I recover the deleted job server information? Or remove the information preventing me from associating the repositories to the job server in the new Server Manager?

 

 

Thanks for your help


Performance Issues with SAP Data Services 4.2


Hello All,

 

We are having significant performance issues with our SAP Data Services 4.2 environment. The underlying database is DB2. Below are the times noted for logging into the DS Designer from the local laptops.

 

Login Time: 870 seconds

Getting Job from Central Time: 963 seconds

 

Also, a test was performed to launch the designer directly from the server and the performance was as expected. Please see below.

 

Login Time: 6 seconds

Getting Job from Central Time: 60 seconds

60 Seconds

 

While this seems to be a network issue, we are able to launch and execute queries from the DB2 client without any problems. I also noticed that there were similar issues with DS 4.1, which were resolved with the release of a support pack and subsequently fixed in 4.2. Any help will be appreciated.

 

-Chaitanya

Problem with BO Data Services


Hello

 

 

I have a problem when I launch al_designer.exe (BusinessObjects Data Services Repository Login).

 

 

A message appears:

 

 

BODI-1111340:ERROR: OCI call <OCIEnvCreate> for connection < BASE_SI.world> failed: <cptmap>

 

 

We have the following entry in the log file

 

 

C:\Program Files\Business Objects\BusinessObjects Data Services\Log\errorlog.txt

 

 

 

 

(12.0) 10-23-14 10:10:58 (E) (7792:4972) CON-120404: OCI call <OCIEnvCreate> for connection <BASE_SI.world> failed: <cptmap>.

 

 

Thank you for helping me.

If statement in Sap BODS


How is the syntax below correct for an IF statement inside a script?

 

If(<Condition>) <True>; else <False>

 

Please explain with an example.
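For reference, a minimal sketch of the block form that Data Services scripts use for if/else (the variable names and values here are made up for illustration, and this is a sketch rather than a definitive answer):

```
# Illustrative script, assuming a global variable $G_LOAD_TYPE set elsewhere
if ($G_LOAD_TYPE = 'FULL')
begin
    print('Starting full load');
end
else
begin
    print('Starting delta load');
end

# For a simple value choice, the ifthenelse() function is the expression form:
# $G_LABEL = ifthenelse($G_LOAD_TYPE = 'FULL', 'Full', 'Delta');
```

The one-line `If(<Condition>) <True>; else <False>` shape in the question reads more like the ifthenelse() function than like the scripting language's if statement.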

 

Thanks in advance.

Performance improvement options on windows 7 machine


Hi All,

 

1. As we know, the read-ahead size is by default set to 4-8 KB.

We need to set the read-ahead size to 64 KB to make I/O operations faster.

2. Turn on asynchronous I/O to make input/output operations as fast as possible.

 

Where do I change these options on a Windows 7 machine?

 

Thanks in advance.

Run job from webservices


Hi experts,

 

I have to run a job from a webservices call, using java.

 

Here is what I've done so far.

 

Imported the web services into my project using Apache CXF 2.x, and I have all the Java code in the project.

 

I tried the following code.

RunBatchJobRequest runBatchJobRequest = new RunBatchJobRequest();
runBatchJobRequest.setJobName("VERI_ALIQUOTA");
runBatchJobRequest.setRepoName("LOCAL");
runBatchJobRequest.setJobServer("CORSDEV0403");
Batch_Job_AdminImpl batchAdmin = new Batch_Job_AdminImpl();
BatchJobResponse res = batchAdmin.runBatchJob(runBatchJobRequest);

That code does not throw any errors, but the job does not start either.

 

Do I have to initiate a session first, or something like that?

 

Can you guys share simple working code for this?

 

Best

Leandro

Brazil_AddressCleanse: Delivery point directories not found. Update directories or certification will


Hello guys,

During migration of SSIS packages to BODS, I decided to use Data Quality.
But using the component Global_Address_Cleanse (Brazil_AddressCleanse), I could not perform address correction by postcode.
The execution log returned the following message:


14520    13216    DQX-058306    10/22/2014 7:06:49 PM    Transform <Brazil_AddressCleanse>: <GAC0000>: [Delivery point directories not found.  Update directories or certification will

Is there any directory I need, to feed a metadata repository with the complete listing of Brazilian addresses, postcodes, etc.?

Thanks for the constant aid here in the forum.

Calling web services in BODS with nested schemas?


Hi community

 

I'm trying to call a web service in bods, as it is shown in the following example:

http://www.sdn.sap.com/irj/boc/go/portal/prtroot/docs/library/uuid/20bd5e60-11f9-2b10-2bbb-b5109cceff08?QuickLink=index&overridelayout=true&39698382721890

 

Now I'm facing the challenge of mapping the input schema, with the variable post code, to the request schema defined in the WSDL document of the web service. In the example the web service has only one level; my web service has two nested levels, and I cannot nest the input parameters that way. This leads to an error saying that the number of columns between input and output does not correspond. But it is not possible to map the aRequest node from the input schema directly to the aRequest schema of the output:

 

webservice.png

 

 

Does anyone have experience with this function who can help me?

 

Kind regards

Matthias


BODS unable to import clustered table - /IRM/GCOND


Hi,

 

I am trying to import a clustered table, /IRM/GCOND, from SAP ECC, which sits on a DB2 server.

When I connect to the underlying DB2 server in the BODS Designer, I am unable to view/import the table /IRM/GCOND.

Now, I understand that a clustered table is in temporary memory (like some sort of cache in SAP), so it makes sense that when I connect to the DB2 server I am not able to view this particular table.

 

My question is: if I connect to the overlying SAP application server, using a SAP Applications type datastore connection, will I be able to view and import this clustered table (/IRM/GCOND)?

 

I am still waiting for the SAP App server credentials from the Basis team.

It would be really nice if I could get an answer to this issue.

 

 

Thanks

Arun.

How To Monitor CPU Usage through BODS


Could anyone explain how we can monitor CPU usage during a BODS job execution?

During my initial investigation I listed two probable options.

1. Through an OS command (like SAR -O, but I don't know how I can use it here)

2. Through Solution Manager (but I am unable to find out how to use this feature)

If anyone can explain these, it will be helpful.
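For the OS-command route, outside of BODS itself, one generic way to sample CPU usage on a Linux job server host is to diff the counters in /proc/stat, which is similar in spirit to what `sar -u` reports. A Linux-only sketch (the function name is made up; this is an illustration, not a BODS feature):

```python
import time

def cpu_busy_percent(interval=1.0):
    """Sample overall CPU busy % on Linux by diffing /proc/stat counters
    across `interval` seconds, similar in spirit to `sar -u`."""
    def snapshot():
        # First line of /proc/stat: cpu user nice system idle iowait irq ...
        with open("/proc/stat") as f:
            fields = [int(x) for x in f.readline().split()[1:]]
        idle = fields[3] + fields[4]  # idle + iowait count as not-busy
        return idle, sum(fields)

    idle1, total1 = snapshot()
    time.sleep(interval)
    idle2, total2 = snapshot()
    d_total = total2 - total1
    return 100.0 * (1 - (idle2 - idle1) / d_total) if d_total else 0.0

print(f"CPU busy: {cpu_busy_percent(0.5):.1f}%")
```

Running this in a loop (or via cron) for the duration of a job execution gives a simple usage trace without extra tooling.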

Buffer size csv file


Hi experts,

 

I got an issue while attempting to load a CSV file into a table.

 

The CSV file has very large rows, so I think that's why I get the following error:

 

data_services_error.png

 

It says: "The buffer size (in octets) is not sufficient to contain..."

 

My question is :

- How can I increase the buffer size?

 

Kind regards,

to_decimal_ext function


Hi All,

The Reference Guide, page 950, shows to_decimal_ext('99,567.99', '.', ',', 38, 3) = 99567.990.

How?

Please explain.
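One way to read that example: the second argument ('.') is the decimal separator and the third (',') is the thousands separator, so '99,567.99' parses to 99567.99, and a scale of 3 pads the result to three decimal places, giving 99567.990. A rough Python analogue of that parsing (an illustration only, not the Data Services implementation; the function name is made up):

```python
from decimal import Decimal

def to_decimal_ext_sketch(text, decimal_sep, thousand_sep, precision, scale):
    """Illustrative analogue: drop the thousands separators, normalise the
    decimal separator to '.', then render the value with `scale` decimal
    places. (Checking against the 38-digit precision limit is omitted.)"""
    cleaned = text.replace(thousand_sep, "").replace(decimal_sep, ".")
    return Decimal(cleaned).quantize(Decimal(1).scaleb(-scale))

print(to_decimal_ext_sketch('99,567.99', '.', ',', 38, 3))  # prints 99567.990
```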

 

Thanks in advance.

How to display cst timezone by load date of data in kst timezone in SAP DS


Hi,

 

I am loading data in the KST timezone and I need to display it in the CST timezone, with daylight saving applied, at the target in SAP Data Services.

Please help me achieve this.
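To pin down the expected results (not a Data Services answer in itself): KST is UTC+9 with no daylight saving, while US Central time flips between UTC-6 and UTC-5, so conversions should go via the IANA zone names rather than fixed offsets. A sketch in Python, assuming the load timestamps are naive KST values (the function name is made up):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

def kst_to_us_central(naive_load_ts):
    """Interpret a naive timestamp as KST (Asia/Seoul) and convert it to
    US Central time; America/Chicago applies daylight saving automatically."""
    kst_ts = naive_load_ts.replace(tzinfo=ZoneInfo("Asia/Seoul"))
    return kst_ts.astimezone(ZoneInfo("America/Chicago"))

# A summer date falls in Central Daylight Time (UTC-5):
print(kst_to_us_central(datetime(2014, 7, 1, 12, 0)))   # 2014-06-30 22:00:00-05:00
# A winter date falls in Central Standard Time (UTC-6):
print(kst_to_us_central(datetime(2014, 1, 15, 12, 0)))  # 2014-01-14 21:00:00-06:00
```

Whatever mechanism is used inside DS (a custom function, or conversion in the database), its output for sample dates on both sides of the DST switch should match a reference like this.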

 

 

Thanks & Regards,

Vinodh Seemakurthi

Log files are not maintaining data


Hi Experts,

 

I have a job in DS which is scheduled on a daily basis, but when I try to see the log file of that job, there is no data maintained.

We did not face this problem previously; from about a week ago onwards the data is not being maintained.

 

No settings were changed. I checked, and the settings are as below:

 

 

CMC-> Applications -> Data services -> settings

 

Settings are as below

 

 

 

In the database I checked the tables below, which the log data populates; these tables are empty.

dbo.AL_HISTORY

 

ALVW_HISTORY

 

AL_HISTORY_INFO

 

AL_STATISTICS

 

Can you please suggest how to overcome this?

SNC connection error


An SAP BW datastore has been created and tested successfully. However, when SNC is enabled, the following error occurs:

 

Cannot connect to SAP Applications datastore. 

 

CRFC error: RFC_INVALID_HANDLE - Invalid RFC connection handle: 247649536[SAP Partner].  Please make sure the SAP server is running and the login information is correct. (BODI-1111348).

 

All parameters in the RFC destination and in the DS RFC connection have been rechecked, and they appear to be correct.  Does anyone have any suggestions for further analysis?

 

Thanks,

 

Paul


Use of Webservices in SAP BODS 4.2


Hello Experts

 

Have a scenario: there are three subsystems which contain customer details in different templates/formats, while this is the same customer. In order to get a unified 360-degree view, we need to produce a common template along with the subsystem-specific details for this customer. The customer will then be known by a unique CID. In a nutshell: a unique customer, along with subsystem-specific details and a specific unique template, needs to be maintained. All the subsystem data will be collected in the form of files/batch files/real-time files, etc.

Further, these details need to be pushed to a portal for maintaining a dashboard (the 360-degree view of the customer).

 

 

BODS use?

We are planning to use BODS as the integration tool for the above scenario; with some auto-ID mechanism, the unique customer ID (CID) would be generated. Further, through web services it will be pushed to the portal team to consume in real time.

Inputs on proceeding with this are appreciated. Also, if someone can put up specifics on how to use/consume a web service within BODS, that would be helpful. Thanks in advance.

 

 

Best !

Deep

BODS/Teradata TPT question


Hi everyone, we are very new to both BODS and Teradata and are currently in a proof-of-concept phase to see how this all works. One of the steps we are stuck on: when we create a BODS program and call Teradata TPT using a *.bat file, we are unable to see any kind of execution results (log file, trace file, error file, display statements, etc.).

 

I am assuming we have some type of setup that is incorrect, but I have not been able to find what needs to be changed or where. Attached are screen shots of our BODS program, the corresponding *.bat file, and the job log file display in Data Designer.

 

 

1) screen shot of a very basic BODS job. All we are doing is executing a *.bat file

BODS_job.jpg

2) screen shot of bat file that is being called

BAT_file.jpg

3) Data Designer log file screen

results.jpg

 

 

Does anyone have any suggestions on what we should change in order to have our log and trace files show up? We have been able to view some output files using the TLOGVIEW command, but would prefer to see the results in Data Designer. So far, no luck with any of the BODS support documentation. Just curious if there is anyone who would know how to resolve this?
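One pattern worth trying when the .bat is launched from a script with exec(): ask exec() to wait and return the command's console output, then print it, so whatever TPT writes to stdout/stderr lands in the DS trace log. A sketch (the path and variable name are hypothetical, and the flag value is our reading of the exec() reference, so verify it against your version's documentation):

```
# Hypothetical script step: run the TPT wrapper and echo its output to the trace log.
# Flag 8 is described as making exec() wait and return the program's output.
$G_TPT_OUTPUT = exec('cmd.exe', '/c D:\\tpt\\run_load.bat 2>&1', 8);
print('TPT output: ' || $G_TPT_OUTPUT);
```

Independently of that, redirecting tbuild's output to a file inside the .bat itself gives a persistent log even when DS shows nothing.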

 

Thanks in advance

Could not Serialize – Transaction aborted


We have three jobs that are scheduled to run at the same time. We are in the process of upgrading DS from 3.2 to 4.2.

 

In 4.2, when the jobs run, I get the below error from the Netezza database:

"Could not Serialize – Transaction aborted".

 

The jobs basically issue an update statement which updates a record in a control table; the records are unique to each job.

 

We also came to know that serialization is set to FALSE in Netezza to avoid aborting the second transaction.

 

Does anyone have any thoughts on why this is happening in 4.2 but not in 3.2?

 

DS :14.2.1.700

Netezza: 7.0

Disable the warnings- Error Number 151206


I am facing the same issue even after appending SAP_FUNCTION_MISMATCH = FALSE in DSConfig under the AL_Engine section for error message 151206.


SAP DS: 14.2.1.700

Is anyone facing a similar problem?

Delta load performance


Hi All,

I am using delta load and full load with a conditional.

 

Fact table data flow is like this:

 

DF1: Oracle --> Staging

 

DF2: Staging--> DWH

 

I am passing the dimension table value by using the lookup_ext transform at Staging --> DWH.

The number of rows in the dimension table is 200; the number of rows in the fact table is 125,000.

The full load takes 3 minutes, whereas the delta load takes 30 minutes.


Please check below sample job screens.

 

Delta_Load.JPG

TBL_Comparison.JPG

 

Kindly advise how to improve the performance.

Thanks in advance.

 

BODS Version is XI3.1
