Hi all,
We have a requirement to update a custom table during a BODS job execution.
Details of the BODS job:
Source: CSV files
Target: Oracle table
The dataflow is a simple flow from the source file format to a Query transform to the target table.
In the source file format, I have set the file name to *.csv so that all files are read.
In the audit function, we update the custom table with the values from the audit labels
to fill the fields SRC_ROW_COUNT and TRG_ROW_COUNT of the custom table.
Code used in the dataflow audit script:
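# $Count_SRC_FILE and $Count_TARGET_TABLE hold the row counts from the audit labels;
# $ID and $RUN_ID hold the job id and job run id used in the WHERE clause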
sql('TEST_TGT_TABLE','update USER_1.CUSTOM_TABLE_NAME set SRC_ROW_COUNT = {$Count_SRC_FILE}, TRG_ROW_COUNT ={$Count_TARGET_TABLE} where JOB_ID = {$ID} and JOB_RUN_ID = {$RUN_ID}') is NULL
This script does not update the fields.
But when I hard-code the variable values in the WHERE clause and execute it, the table is updated (script below):
sql('TEST_TGT_TABLE','update USER_1.CUSTOM_TABLE_NAME set SRC_ROW_COUNT = {$Count_SRC_FILE}, TRG_ROW_COUNT ={$Count_TARGET_TABLE} where JOB_ID = \'10\' and JOB_RUN_ID = \'85\'') is NULL
So it looks like the variable values are not being substituted here.
However, when I print the values of the variables $ID and $RUN_ID, they show the expected results.
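For reference, the check is just a print of the two variables, roughly like this (the exact message text is illustrative):
# print the current values of the variables used in the WHERE clause
print('JOB_ID = ' || $ID || ', JOB_RUN_ID = ' || $RUN_ID);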
I cannot hard-code the values, though, because some of the column values are generated at job execution time.
Can anyone please suggest a solution?