Need DataStage to abort rather than loading 0 rows

Posted: Fri Aug 10, 2018 12:42 pm
by Joel in KC
Running a job, and after 4 warnings I get the following:
"No further reports will be generated from this partition until a successful import."

The job finishes but does not load any data.

How can I get the job to abort rather than to complete the load?

My warnings are set to 50.

Any help is appreciated!

Joel

Posted: Sun Aug 12, 2018 7:48 am
by chulett
Welcome.

How about describing your job design? That might help folks help you out. That or let us know what stage is generating that message. I'm assuming it's related to a sequential file but would like confirmation.

Another option: do an exact search here for your error message. It turned up 15 other posts for me, and something in there should help you, I would imagine. At least on the why of this, perhaps not on the core question. We'll see.

Posted: Tue Aug 14, 2018 12:09 pm
by UCDI
Answering the subject and not the body...

You can drop your extract to a dataset, count the number of records in the dataset with a routine, and use that to gracefully exit or proceed depending on whether you have data. This is nice for efficiency as well in complex jobs that do a whole lot of nothing, like extracting lookup tables and such for nonexistent data.

There are several other ways to count as well. We use a lot of datasets so checking there is natural for us.
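The "count then decide" idea above can be sketched in plain shell. This is only an illustration: in a real job the count would come from something like `dsrecords -n extract.ds` (the dataset name here is a placeholder), but the decision logic itself is ordinary shell.

```shell
proceed_if_data() {
  # $1 is the record count, e.g. captured from:  dsrecords -n extract.ds
  # Returns 0 (proceed with the load) when there is at least one record,
  # and 1 (exit gracefully) when the extract produced nothing.
  if [ "$1" -gt 0 ]; then
    echo "data present: run the load"
    return 0
  else
    echo "no rows extracted: exit gracefully"
    return 1
  fi
}
```

A controlling script or sequence can branch on the return code: zero means run the downstream job, nonzero means finish cleanly without loading.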

Posted: Tue Aug 14, 2018 3:01 pm
by Joel in KC
I appreciate the feedback.

I was hoping there was a parameter that controlled the number of records that could be processed before the application replies "No further reports will be generated from this partition until a successful import."

Apparently that is a hard-coded number of records... set to 4? Or is there a parameter to control it?

Then I could turn the parameter up to over 50, and the warnings would cause the job to abort.

Again, thank you much for your replies and insights.

Joel

Posted: Wed Aug 15, 2018 4:42 am
by qt_ky
I am not aware of such a parameter or setting for that; however, you would not have to run your job with the default of aborting after 50 warnings. You could set it to abort after 1 or 2 warnings at run time. If this is a scheduled job, then you would need to reschedule it to alter the setting.
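If the job is launched from the command line rather than the scheduler, the warning limit can be lowered per run. A hedged sketch, assuming the `dsjob -run -warn n` form of the client (project and job names below are placeholders); this helper just assembles the command string so the override is visible:

```shell
# Build a dsjob invocation with an explicit warning limit.
# MyProject and LoadJob in the usage example are hypothetical names.
build_run_cmd() {
  project=$1; job=$2; warn_limit=$3
  echo "dsjob -run -warn $warn_limit $project $job"
}

# Example: abort LoadJob after the first warning.
# build_run_cmd MyProject LoadJob 1
```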

Posted: Wed Aug 15, 2018 5:27 am
by ray.wurlod
One job to prepare the data, loading the result into a Data Set. Another job to load the data from the Data Set into its real target. These two jobs are run from a controlling sequence that determines whether there are any records in the data set and only runs the second job on that basis.

The command to count records in a Data Set is

Code:

    dsrecords -n dataset.ds
Capture the value it generates via the $CommandOutput activity variable of an Execute Command activity in the sequence job.
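The sequence-level decision can be sketched in plain shell. In the real sequence, $CommandOutput of the Execute Command activity holds the dsrecords output; here it is passed in as an argument, and the whitespace trimming mirrors what you would do with Trim() in the sequence. Job names are not assumed; the function only decides run versus skip.

```shell
count_from_output() {
  # Strip whitespace/newlines that dsrecords may emit around the number.
  echo "$1" | tr -d '[:space:]'
}

run_second_job_if_needed() {
  # $1 is the raw captured command output, e.g. "  42\n".
  count=$(count_from_output "$1")
  if [ "${count:-0}" -gt 0 ]; then
    echo "run load job"
  else
    echo "skip load job"
  fi
}
```

In the sequence itself this maps to a Nested Condition or trigger expression on the trimmed $CommandOutput value.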