Channel: SCN : Unanswered Discussions - Data Services and Data Quality

BODS Script and SQL Transform - Performance


For one SELECT result set, I'm unable to pull the data directly from the source tables because of a few complexities in the query. I therefore used a SQL transform, which works correctly but hurts performance, as I understand a SQL transform cannot be fully pushed down. The result set is then sorted (ORDER BY) with the help of a Data_Transfer transform to push the sort down.

 

As a workaround, I created a BODS script that runs the complex SELECT (the one used in the SQL transform above) and inserts the result into a separate table. That table is then used as a source in the next data flow, joined with a few more source tables, and finally the ORDER BY is performed. Estimated record count: 80 million.

 

Even the BODS script takes almost the same time. Does this mean that neither the SQL transform nor the script pushes the operation down to the database?

 

As an alternative approach, if a database package/function is created to do this complex select-and-insert, and BODS then uses the new table as a source for the subsequent ORDER BY, will that show a significant performance improvement?

 

For reference, the queries are below:

 

Original SQL Transform:

Select '0000' || Schema1_TBL_A.act_key as OBJ_KEY, tokenize.*
from (
  select comment_lines, element_no, P_NMBR, comment_typ, comment_typ_seq, load_date, load_type, row_num
  from (
    WITH ilv AS (
        SELECT COMMENT_LINE || ';'                                            AS COMMENT_LINE
        ,      (LENGTH(COMMENT_LINE) - LENGTH(REPLACE(COMMENT_LINE, ';'))) + 1 AS no_of_elements
        ,      P_NMBR, comment_typ, comment_typ_seq, load_date, load_type, row_num
        FROM   Schema2_TBL_B
    )
    SELECT
    --     RTRIM(COMMENT_LINE, ';')                              AS original_string
           SUBSTR(COMMENT_LINE, start_pos, (next_pos - start_pos)) AS comment_lines
    ,      element_no, P_NMBR, comment_typ, comment_typ_seq, load_date, load_type, row_num
    FROM  (
          SELECT ilv.COMMENT_LINE
          ,      P_NMBR, comment_typ, comment_typ_seq, load_date, load_type, row_num
          ,      nt.column_value AS element_no
          ,      INSTR(
                    ilv.COMMENT_LINE,
                    ';',
                    DECODE(nt.column_value, 1, 0, 1),
                    DECODE(nt.column_value, 1, 1, nt.column_value - 1)) + 1 AS start_pos
          ,      INSTR(
                    ilv.COMMENT_LINE,
                    ';',
                    1,
                    DECODE(nt.column_value, 1, 1, nt.column_value)) AS next_pos
          FROM   ilv
          ,      TABLE(
                    CAST(
                       MULTISET(
                          SELECT ROWNUM FROM dual CONNECT BY ROWNUM <= ilv.no_of_elements
                       ) AS Schema2.number_ct)) nt
         )
  ) where comment_lines is not null
) tokenize
left outer join Schema1_TBL_A
  on  tokenize.P_NMBR = Schema1_TBL_A.P_NMBR
  and tokenize.COMMENT_TYP = Schema1_TBL_A.COMMENT_TYP
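For clarity on what the query does: the MULTISET/CONNECT BY part generates one row per semicolon-delimited element of COMMENT_LINE, and the INSTR/SUBSTR arithmetic cuts out each element. The same splitting logic, sketched in Python purely for illustration (the sample strings are made up, not from my data):

```python
def tokenize(comment_line):
    """Mirror the SQL logic: append a trailing ';' (COMMENT_LINE || ';'),
    count the delimiters (no_of_elements), then emit one (element_no, token)
    pair per element, skipping empty tokens (WHERE comment_lines IS NOT NULL)."""
    line = comment_line + ";"
    no_of_elements = line.count(";")
    rows = []
    start_pos = 0
    for element_no in range(1, no_of_elements + 1):
        next_pos = line.index(";", start_pos)   # INSTR(..., ';', ...)
        token = line[start_pos:next_pos]        # SUBSTR(..., start_pos, next_pos - start_pos)
        if token:
            rows.append((element_no, token))
        start_pos = next_pos + 1
    return rows

print(tokenize("alpha;beta;gamma"))  # [(1, 'alpha'), (2, 'beta'), (3, 'gamma')]
```

So for 80 million input rows the transform is fanning out to many more output rows, which is one reason neither the SQL transform nor the script version gets cheap: the row generation itself runs in the database, but everything after it flows through the engine.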

