Channel: SCN : Unanswered Discussions - Data Services and Data Quality
Viewing all 3719 articles

Need help in parsing flat files with dynamic column names


Hi guys,

 

I need your advice. We have a requirement to read a set of flat files with dynamic column names (taken from each file's header row) and load them into a fixed staging table. As an example, I can have three files with the following header rows:


File 1 - NAME,ADDRESS,PHONE,GENDER

File 2 - NAME,PHONE,ADDRESS

File 3 - NAME,ADDRESS,GENDER

 

I would then need to load the above into a table with the following columns: NAME, PHONE, ADDRESS, GENDER

 

Right now we have no idea which columns will appear beyond the ones we already have, but we do know which columns we need, so we can probably set up the job to just ignore "unknown" columns.
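Outside DS, the header-driven mapping idea can be sketched as follows; the CSV delimiter and file layout are assumptions based on the example above:

```python
import csv

# Fixed target schema: columns missing from a file come through as None,
# and columns not in this list are ignored.
TARGET_COLUMNS = ["NAME", "PHONE", "ADDRESS", "GENDER"]

def load_file(path):
    """Read a delimited file whose first row names its columns and
    project every record onto the fixed target schema."""
    rows = []
    with open(path, newline="") as f:
        for record in csv.DictReader(f):  # the header row drives the mapping
            rows.append({col: record.get(col) for col in TARGET_COLUMNS})
    return rows
```

So File 2 above (NAME,PHONE,ADDRESS) would load with GENDER as None, and any extra "unknown" column would simply be dropped.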

 

Any help would be greatly appreciated.

 

Thanks!


SAP DS continuous workflow method


Hi,

 

I am planning to use Sybase replication server CDC for Sybase ASE database.

 

I am planning to use Continuous workflow method as it is simple to maintain and real time.

 

I need a few clarifications on this method.

 

I need to maintain CDC for 270+ tables. If I keep everything in a single data flow, is there any impact? As I understand it, this method has no staging database to hold the CDC records.

If I run the data flows sequentially, how will the CDC records be held until the previous data flows have completed?

 

One more thing: when we use a continuous workflow, the log file size keeps growing. How can I manage log files in this scenario?

 

Please help me with this.

 

Thanks & Regards,

Ramana.

SAP Data Services with LSMW for data migration into SAP ERP


Hi all,

 

In our last roll-out project we used LSMW for data migration (DMIG) into SAP ERP.


Now the IT department has acquired SAP Data Services, and we are required to use it.

 

For the new country roll-out we actually just wanted to reuse all our LSMW projects, but now we have to use SAP DS for extract and transform (E and T) and LSMW for load (L).

 

I have made some assumptions and perhaps someone can confirm if my understanding is correct.

 

1. SAP DS will extract and transform the data and leave the files on the server, where LSMW will pick them up for loading.

 

2. Since the transform step will be moved from LSMW to SAP DS, we have to assign a 1:1 MOVE conversion for all fields in LSMW (step "Maintain Field Mapping and Conversion Rules").

 

3. I am afraid we cannot reuse the conversion rules by copy&paste from LSMW to SAP DS.

 

Therefore I would like to go through all the conversion types used in our LSMW projects:

 

1. Translation Conversion tables

 

I guess I have to recreate the conversion tables we used in LSMW as permanent DB tables in MySQL, or as Excel files that I import into the SAP DS Designer through a datastore, right?

 

2. Fixed values / Constants

 

In SAP DS, in the query transforms, I have to apply the coding in the corresponding fields of the Schema Out, right?

Is there an alternative to hard-coding in SAP DS? If I use a fixed value for our principal company code BUKRS and this value changes some day, I do not want to go into every data flow query in SAP DS and replace it...
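For what it is worth, DS provides global variables and substitution parameters for exactly this kind of centrally maintained value. The underlying idea, resolving constants from one central place instead of hard-coding them in each query, can be sketched like this (the constants table and all names are hypothetical):

```python
# Hypothetical central constants table: change the company code here once
# instead of editing every data flow query.
CONSTANTS = {
    "BUKRS": "1000",  # principal company code (example value)
}

def apply_constants(record, mapping):
    """Fill target fields from the central constants table rather than
    from literals hard-coded in each mapping."""
    out = dict(record)
    for target_field, constant_name in mapping.items():
        out[target_field] = CONSTANTS[constant_name]
    return out
```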

 

3. Translations: ABAP coding

 

We have lots of ABAP coding in our LSMW programs. Mainly it checks against one or two customizing tables in SAP ERP and retrieves the right value.

For example: look up the old customer number in KNA1 to get the new customer number, then take the contact person's name from another table such as KNVV, and migrate that contact person's name.

 

So if we take the complete coding out of LSMW in SAP ERP and place it into a transform in SAP DS (using the SAP DS scripting language or Python), which kind of transform should I use, and how do I make the connection for these calls to SAP ERP? Do I use an SAP target datastore? Or is there something like an RFC-enabled function module in SAP DS that I can trigger to retrieve the required info from the SAP ERP system?

 

Or should I try to convince management to leave all the ABAP conversion coding in LSMW in SAP ERP?

 

I hope this challenge catches someone's attention; there must be more projects than mine where LSMW is being replaced by SAP DS.

I checked for existing postings on this but could not find the answers to my questions.

 

Thanks for any recommendation.

 

Andrzej

How to load an XML file source with multiple XML declarations


Dear Experts,

 

I want to load an XML source file that has the following format:

 

<?xml version="1.0" encoding="UTF-8"?>

<Rootelement>

     Content

</Rootelement>

 

<?xml version="1.0" encoding="UTF-8"?>

<Rootelement>

     Content

</Rootelement>

 

<?xml version="1.0" encoding="UTF-8"?>

<Rootelement>

     Content

</Rootelement>

 

<?xml version="1.0" encoding="UTF-8"?>

<Rootelement>

     Content

</Rootelement>

 

 

In a normal load I am getting the following error:

"invalid content after root element end tag"

 

Please suggest.

How to create a WADL for Data Services (RESTful)?


Hi

 

What are the steps to create a WADL for Data Services 4.2?

 

or

 

What should be the structure of the WADL?

 

or

 

What are the steps to import a WADL?

 

I am using SoapUI to create a WADL from a URL, but when I import the function into a REST datastore, I get an empty function or an XML error.

 

[Screenshot: Captura22.JPG]

BODS mass-start batch jobs


Is there a way to start all batch jobs grouped under a certain project?

Thanks.

Migrating code from 4.0 to 4.2


We have a requirement to move code from 4.0 to 4.2. 4.2 is already installed, and we have jobs running on it smoothly.

 

Please let me know the required steps, best practices, and checklist.

 

Will importing the ATL file be enough?

 

Also let me know if there is any thread related to this.

 

Thanks

Sandeep

Records missing in target table when loading data with Data Services


I created a data flow to load data from a file format into an SAP HANA table. The job executes successfully without errors, but the number of records in the target table is always lower, while the row count in the Monitor is correct. I suspect this is related to the bulk loader, because the record count is always a multiple of the default commit size, 10,000.

 

Could someone explain this and tell me how to solve it? Thanks!
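The commit-size suspicion can be checked with simple arithmetic: if the target count always equals the Monitor count rounded down to a multiple of the commit size, the final partial batch is apparently never committed. A sketch of that check, using the 10,000 commit size mentioned above:

```python
def uncommitted_tail(monitor_rows, target_rows, commit_size=10_000):
    """If the target row count is the monitor count rounded down to the
    commit size (the pattern described above), return how many rows the
    missing final batch holds; otherwise return None."""
    if target_rows == (monitor_rows // commit_size) * commit_size:
        return monitor_rows - target_rows
    return None
```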


Failed to establish FTP session to host in SAP BODS


Why are we getting the error "Failed to establish FTP session to host"? Could you please tell me what actions we need to take and how to resolve this issue?

Decimal value getting updated as blank instead of 0.0000000


Hi All,

 

One field has the input value 0.0000000, which gets updated as blank in the target, whereas if the same field has a non-zero value it is updated correctly. The data type is decimal.

 

Can anyone help?

 

Regards,

Ankit

Real-time job execution does not produce the proper results


Hi all,

 

We are trying to create a web service in BODS 4.2 in order to receive a few parameters via XML and then use them as filters in multiple data flows.

After that, the expected result is to have all the database tables updated.

We have noted that the XML is accepted after the SOAP invocation, and I could also return the same parameter in the XML output.

Still, our impression is that the SQL flows are not executed, because the tables do not have their information updated.

But if the same workflow is executed as a batch job, everything works fine.

 

I appreciate any information you can give us,

 

thanks in advance.

Transporting DataSources for BW


Hello.

 

I'm loading data into BW using Data Services, and I created the RFC connections and the source system in BW to communicate with Data Services.

I also created the DataSources in BW and replicated them in Data Services, and everything works fine. All of this is in development.

 

When moving to QA, I transported the DataSources from DEV to QA in BW.

In Data Services I promoted the job with the connection (datastore) and all the replicated DataSources as well. The promotion finished without any errors.

 

When test-running the job in QA, I got an error that says:

"There is no hierarchy available for Infosource = <DS_DATASOURCE> and source system = <DS_DV>"

 

After several hours of trial and error I was able to figure out the issue: everything is related to the fact that the DataSource (which was promoted from Data Services DEV to QA) is tied to the source system. "DS_DV" is the source system of BW DEV, and it should be DS_QA for BW QA.

 

I had to reimport the DataSource in Data Services QA, modify the job, and replace the DataSource (which was pointing to DEV) with the replicated one.

After this I was able to push data to BW.

 

My question is this: is there a way to map source systems the way you do in BW? In BW there is a table where you can translate source systems during a transport request, so that what is called DS_DV in DEV is renamed to DS_QA when transported to QA. I'm looking for this kind of configuration in Data Services, to be able to replace or map the DataSource to the right environment after a promotion.

 

 

Any help?

Thanks.

AL_ENGINE is not Starting


When I execute jobs, they show up in the DS Management Console but do not process any data. I see that when a job starts, no AL_ENGINE process is launched. Basically, the Job Servers are able to launch jobs, but because no AL_ENGINE process starts, no processing is possible.

SAP Data Services landscape and architecture


Hi there,

 

simple question:

 

A data migration has to be done from one SAP ERP source system to an SAP ERP  target system.

 

Source and target systems have DEV - QA - PRD environment.

 

Do we also require DEV / QA / PRD environments for SAP DS, and do we then transport projects and repositories via ATL files from one environment to another?

 

Or is one SAP DS box enough, since datastores allow multiple configurations for different source and target systems and we can switch between those configurations?

 

What is usual and what is best practice?

 

Thanks for the explanation!

 

Andrzej

SAP BODS With Success Factors


[Screenshot: SFDS.png]

Hi All,

 

I am unable to connect SAP BODS to the SuccessFactors test server. We are using SAP BODS 4.2, and in the admin console we can also see the SuccessFactors adapter. When we try to open the SAP DS job server connections, we get an error. Please refer to the screenshot above.

 

 

 

If anyone has worked on BODS with SuccessFactors, please help me with this.

 

Drop me an email: Pandeyanuj21@gmail.com

Call me: 971529278497

 

Thanks

Anuj


Overwrite Schema by uploading a Flat File in SAP Data Services


Hello,

 

I want to upload a flat file. During my e-learning training, the system always asked in the file format editor whether I wanted to overwrite the default schema with the schema of the flat file I was uploading.

 

But when I load a flat file now, the system does not ask whether I want to overwrite the schema, and it uploads only the first two columns of my ten-field table.

 

Does anyone have an idea how I can solve this problem?

 

 

Greetings, Philp

DS Management Console: Data Validation: There is no data available


Hi,

 

I have created a job with a data flow containing a Validation transform and added one simple rule:

 

If the COUNTRY field value is not equal to DE, the record goes to the FAIL file.

If the COUNTRY field value is equal to DE, the record goes to the PASS file.

 

The job works fine.

I see the failed records in the right file and in the rule-violation output.
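For reference, the rule above behaves like a plain two-way split; sketched outside DS with the field and values from the rule:

```python
def validate(records, field="COUNTRY", expected="DE"):
    """Route records the way the Validation transform does here:
    COUNTRY equal to DE passes, anything else fails."""
    passed, failed = [], []
    for rec in records:
        (passed if rec.get(field) == expected else failed).append(rec)
    return passed, failed
```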

 

When I execute the job, I check the following options in the pop-up:

 

X collect statistics for optimization

X collect statistics for monitoring

 

I then expected to see something in the Data Services Management Console under DATA VALIDATION.

However, it loads for a while and then displays the message "There is no data available".

 

Tried it several times.

 

Which setting is missing here?

 

Thanks,

 

Andrzej

PDF format to DB table


Hello,

 

Is there any standard functionality in BODS or BO to read PDF files directly and export the data to a DB table? What is the best workaround to automate this process if there is no standard functionality for PDF files?

 

Thanks and regards,

ETL Recovery methods


Hi All,

For example, an ETL job fails for some reason in the middle of a run.

How can I rerun the ETL job so that it loads only those records that were not loaded?

Thanks in advance.
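Besides DS's own recovery options (such as enabling automatic recovery on the job), a common hand-rolled pattern is to persist the last committed key and skip past it on the rerun. A sketch with SQLite; the table and column names are assumptions:

```python
import sqlite3

def load_with_checkpoint(src_rows, conn):
    """Resume an interrupted load: skip every source row at or below the
    last committed checkpoint, then advance the checkpoint together with
    each newly committed row."""
    conn.execute("CREATE TABLE IF NOT EXISTS checkpoint (last_key INTEGER)")
    conn.execute("CREATE TABLE IF NOT EXISTS target (id INTEGER PRIMARY KEY, payload TEXT)")
    row = conn.execute("SELECT last_key FROM checkpoint").fetchone()
    last_key = row[0] if row else 0
    for key, payload in src_rows:
        if key <= last_key:
            continue  # already loaded before the failure
        conn.execute("INSERT INTO target VALUES (?, ?)", (key, payload))
        conn.execute("DELETE FROM checkpoint")
        conn.execute("INSERT INTO checkpoint VALUES (?)", (key,))
        conn.commit()  # row and checkpoint commit in the same transaction
```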

BODS job failed due to Invalid value for date


Hi team,

 

I recently joined this BO DS project and I am just a beginner with it.

I need to know where and how I can check the source data to validate the error.

One of the DS jobs failed due to the error below:

Invalid value <Month: 20> for date <01042012>. Context: Column <>.

 

I need to find the root cause; it seems a record in the source system caused this error.

 

But I don't know how to proceed.

Please help!
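One way to locate the offending source records is to re-parse the date column with the format the job expects and list every value that fails. In this sketch the DDMMYYYY format is an assumption based on the <01042012> value in the error message:

```python
from datetime import datetime

def find_bad_dates(values, fmt="%d%m%Y"):
    """Return the values that do not parse with the expected date format;
    these are the source records that make the job fail."""
    bad = []
    for v in values:
        try:
            datetime.strptime(v, fmt)
        except ValueError:
            bad.append(v)
    return bad
```

Running this over an extract of the source column would flag values such as one whose month digits are 20, matching the "Invalid value <Month: 20>" in the error.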
