Running a job, and after 4 warnings I get the following:
"No further reports will be generated from this partition until a successful import."
The job finishes but does not load any data.
How can I get the job to abort rather than to complete the load?
My warnings are set to 50.
Any help is appreciated!
Joel
Need DataStage to abort rather than loading 0 rows
Welcome.
How about describing your job design? That might help folks help you out. That or let us know what stage is generating that message. I'm assuming it's related to a sequential file but would like confirmation.
Another option, do an exact search here for your error message, it turned up 15 other posts for me and something in there should help you I would imagine. At least on the why of this, perhaps not on the core question. We'll see.
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
Answering the subject and not the body...
You can drop your extract to a dataset, count the # of records in the dataset with a routine, and use that to gracefully exit or proceed depending on having data or not. This is nice for efficiency as well for complex jobs that do a whole lot of nothing like extracting lookup tables and such for nonexistent data...
There are several other ways to count as well. We use a lot of datasets so checking there is natural for us.
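A rough sketch of that check as a shell routine (names are hypothetical; this assumes `dsrecords` output of the form "1234 records"):

```shell
# Hypothetical sketch: decide whether downstream processing is worthwhile.
# In a live job the input line would come from:  dsrecords -n extract.ds
dataset_row_count() {
    # Keep only the leading number from the dsrecords output line.
    echo "$1" | awk '{print $1 + 0}'
}

rows=$(dataset_row_count "0 records")   # stand-in for real dsrecords output

if [ "$rows" -eq 0 ]; then
    echo "no data extracted: exiting gracefully"
else
    echo "proceeding with $rows rows"
fi
```

The same number can of course feed a sequence trigger instead of an if/else in a script; the point is only that an empty extract is detected before the load runs.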
I appreciate the feedback.
I was hoping there was a parameter that controlled the number of records that could be processed before the application replies "No further reports will be generated from this partition until a successful import."
Apparently that is a hard-coded number of records, set to 4? Or is there a parameter to control it?
Then I could turn that parameter up above 50, and the warnings would cause the job to abort.
Again, thank you much for your replies and insights.
Joel
I am not aware of any such parameter or setting, but you would not have to run your job with the default of aborting after 50 warnings. You could set it to abort after 1 or 2 warnings at run time. If this is a scheduled job, you would need to reschedule it to alter the setting.
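For example, when a run is launched from the command line, the warning limit can be set per run with `dsjob -run -warn n` (the project and job names below are made up):

```shell
# Hypothetical invocation: limit this run to 1 warning so it aborts early.
# -warn n sets the per-run warning limit; -jobstatus waits for completion.
PROJECT="MyProject"
JOB="MyLoadJob"
CMD="dsjob -run -jobstatus -warn 1 $PROJECT $JOB"
echo "$CMD"    # a scheduler would execute this command rather than echo it
```

A scheduler that calls `dsjob` directly can carry the `-warn` option without any change inside the job itself.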
Choose a job you love, and you will never have to work a day in your life. - Confucius
One job to prepare the data, loading the result into a Data Set. Another job to load the data from the Data Set into its real target. These two jobs are run from a controlling sequence that determines whether there are any records in the data set and only runs the second job on that basis.
The command to count records in a Data Set is
Code: Select all
dsrecords -n dataset.ds
Use the $CommandOutput activity variable of an Execute Command activity in the sequence to capture the value it generates.
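As a rough illustration (the string below stands in for the Execute Command activity's $CommandOutput, and the job names are invented), the sequence's decision amounts to:

```shell
# Hypothetical sketch of the controlling sequence's decision.
# $CommandOutput from "dsrecords -n dataset.ds" looks like "1234 records".
records_in_output() {
    # Extract the leading number from the captured command output.
    echo "$1" | awk '{print $1 + 0}'
}

command_output="57 records"             # stand-in for $CommandOutput
count=$(records_in_output "$command_output")

if [ "$count" -gt 0 ]; then
    # The real sequence would trigger the load job here, e.g.:
    #   dsjob -run -jobstatus MyProject LoadJob
    echo "run load job"
else
    echo "skip load job"
fi
```

In the actual sequence this branch would be a trigger expression on the Execute Command activity rather than a shell `if`, but the logic is the same.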
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.