
Performance stats and no. of rows in source do not match

Posted: Wed Jun 25, 2014 12:54 am
by srini.dw
Hi,

I have a job that reads from an Oracle stage and writes to a Hash file.

When I compile and run the job, it shows 256 rows processed, but sometimes, when it is run from a sequencer, it shows only 20 rows processed.

The source has 256 rows.
No errors or warnings appear in the log.

In the Oracle stage it's a simple SQL query.

Any ideas?
Thanks,

Posted: Wed Jun 25, 2014 1:33 am
by ray.wurlod
Hashed files (and please note it's not "hash" file) destructively overwrite records if the key (as identified in metadata) is the same.

Do you have duplicate keys coming from source?

Are there any other constraints in your job design that might limit the number of rows to 20?
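
To illustrate the point outside DataStage, here is a toy Python sketch (an analogy only, with made-up column names, not any particular job's data) of how keyed, last-write-wins storage collapses duplicate keys:

    # 256 input rows, but only 20 distinct key values (hypothetical data)
    rows = [{"cust_id": i % 20, "amount": i} for i in range(256)]

    hashed_file = {}                        # stands in for the hashed file
    for row in rows:
        hashed_file[row["cust_id"]] = row   # same key -> destructive overwrite

    print(len(rows), "rows written,", len(hashed_file), "rows left to read back")
    # -> 256 rows written, 20 rows left to read back

So 256 rows can go in while only 20 distinct keys remain to be read back out.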

Posted: Wed Jun 25, 2014 6:50 am
by chulett
However, even with a destructive overwrite down to 20 going on it would still show 256 going in, which is what I'm assuming is meant by 'processed'. The surprise people get is when they pull the records back out. :wink:

As to the issue, if I had to guess (and I do), the fact that the problem shows up when run 'from a sequencer' usually implies a parameter-passing problem, which could mean you are connecting to a different instance / database than expected - one that only has 20 records in the source. That, or the source simply isn't static and is changing on you; not like we've never seen that before. Have you verified your connection parameters when you see this issue?
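
If you want to test the wrong-database theory directly, a quick check (a sketch only, assuming the python-oracledb client and hypothetical connection details and table name) is to run the job's count query against both candidate connections:

    import oracledb  # python-oracledb thin client

    # Hypothetical DSNs: the one the job should use and the one the
    # sequencer parameter may actually be resolving to.
    dsns = {"expected": "prodhost:1521/PRODDB", "suspect": "devhost:1521/DEVDB"}

    for label, dsn in dsns.items():
        with oracledb.connect(user="etl_user", password="changeme", dsn=dsn) as conn:
            with conn.cursor() as cur:
                cur.execute("SELECT COUNT(*) FROM source_table")  # same simple query as the job
                print(label, dsn, "rows:", cur.fetchone()[0])

If one of them reports 20 rows, the sequencer is handing the job the wrong parameter values.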

Posted: Wed Jun 25, 2014 4:01 pm
by ray.wurlod
Hence my second question.

Posted: Thu Jun 26, 2014 7:23 am
by srini.dw
Thanks for the reply, it was an environment issue.

Thanks,

Posted: Thu Jun 26, 2014 8:42 am
by chulett
Meaning what, exactly?

Posted: Thu Jun 26, 2014 11:04 am
by PaulVL
A "PEBKAC" issue.

Posted: Thu Jun 26, 2014 4:25 pm
by ray.wurlod
:lol: