Need DataStage to abort rather than loading 0 rows

Joel in KC
Participant
Posts: 3
Joined: Fri Aug 10, 2018 12:23 pm


Post by Joel in KC »

Running a job, and after 4 warnings I get the following:
"No further reports will be generated from this partition until a successful import."

The job finishes but does not load any data.

How can I get the job to abort rather than complete without loading anything?

My warning limit is set to 50.

Any help is appreciated!

Joel
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Welcome.

How about describing your job design? That might help folks help you out. Or at least let us know which stage is generating that message; I'm assuming it's related to a Sequential File stage but would like confirmation.

Another option: do an exact search here for your error message. It turned up 15 other posts for me, and something in there should help you, I would imagine. At least on the why of this, perhaps not on the core question. We'll see.
-craig

"You can never have too many knives" -- Logan Nine Fingers
UCDI
Premium Member
Posts: 383
Joined: Mon Mar 21, 2016 2:00 pm

Post by UCDI »

Answering the subject and not the body...

You can drop your extract to a Data Set, count the number of records in the Data Set with a routine, and use that to gracefully exit or proceed depending on whether you have data. This is also good for efficiency in complex jobs that otherwise do a whole lot of nothing, like extracting lookup tables and such for nonexistent data.

There are several other ways to count as well. We use a lot of Data Sets, so checking there is natural for us.
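A minimal sketch of that count-then-branch pattern in shell, assuming the extract job has already written the Data Set and that the counting utility reports the record count as the first field of its output. The dataset path, job names, and the sample output string are all hypothetical; the real command output is stubbed here so the branching logic runs standalone.

```shell
# Hypothetical Data Set written by the extract job.
DATASET=/data/staging/extract.ds

# Stubbed here so the logic runs standalone; in real use it would be:
#   COUNT=$(dsrecords "$DATASET" | awk '{print $1}')
DSRECORDS_OUTPUT="0 records"   # assumed output format of the count utility
COUNT=$(echo "$DSRECORDS_OUTPUT" | awk '{print $1}')

if [ "$COUNT" -gt 0 ]; then
  echo "Proceed: $COUNT records to load"
  # dsjob -run MyProject LoadJob   # hypothetical load job invocation
else
  echo "Graceful exit: no records extracted"
fi
```

The same decision can live in a job sequence instead of a script; the point is simply that the load step only fires when the count is non-zero.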
Joel in KC
Participant
Posts: 3
Joined: Fri Aug 10, 2018 12:23 pm

Post by Joel in KC »

I appreciate the feedback.

I was hoping there was a parameter controlling the number of records that can be processed before the application reports "No further reports will be generated from this partition until a successful import."

Apparently that is a hard-coded number of records... set to 4? Or is there a parameter to control it?

Then I could turn the parameter up past 50, and the warnings would cause the job to abort.

Again, thank you much for your replies and insights.

Joel
qt_ky
Premium Member
Posts: 2895
Joined: Wed Aug 03, 2011 6:16 am
Location: USA

Post by qt_ky »

I am not aware of a parameter or setting for that. However, you don't have to run your job with the default of aborting after 50 warnings; you could set it to abort after 1 or 2 warnings at run time. If this is a scheduled job, you would need to reschedule it to alter the setting.
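For instance, when launching from the command line, the dsjob client accepts a -warn option to override the warning limit per run. A small sketch (the project and job names are made up; the script only builds and prints the command rather than executing it):

```shell
# Hypothetical project and job names.
PROJECT=MyProject
JOB=MyLoadJob

# -warn 1 would make the run abort on the first warning
# instead of the default limit of 50.
WARN_LIMIT=1
CMD="dsjob -run -warn $WARN_LIMIT $PROJECT $JOB"
echo "$CMD"   # in real use, execute this instead of echoing it
```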
Choose a job you love, and you will never have to work a day in your life. - Confucius
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

One job to prepare the data, loading the result into a Data Set. Another job to load the data from the Data Set into its real target. These two jobs are run from a controlling sequence that determines whether there are any records in the data set and only runs the second job on that basis.

The command to count records in a Data Set is

Code:

dsrecords -n dataset.ds
Use the $CommandOutput activity variable of an Execute Command activity in the sequence job to capture the value it generates.
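The check the controlling sequence performs amounts to the following shell sketch. The sample string stands in for whatever the Execute Command activity captures in $CommandOutput; the exact output format of dsrecords is an assumption here.

```shell
# Stand-in for the captured dsrecords output ($CommandOutput in the sequence).
COMMAND_OUTPUT="12345 records"   # assumed output format
COUNT=$(echo "$COMMAND_OUTPUT" | awk '{print $1}')

# Only trigger the second (load) job when the Data Set is non-empty.
if [ "$COUNT" -gt 0 ]; then
  RUN_LOAD=yes
else
  RUN_LOAD=no
fi
echo "run load job: $RUN_LOAD"
```

In the actual sequence this branching is done with a trigger expression on the Execute Command activity rather than a script, but the logic is the same.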
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.