Pass Parameters through Autosys to DS

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

just4geeks
Premium Member
Posts: 644
Joined: Sat Aug 26, 2006 3:59 pm
Location: Mclean, VA

Pass Parameters through Autosys to DS

Post by just4geeks »

I have a DS job that processes different input files, and I use Autosys to schedule the job at different times of the month. However, if the job fails because of a corrupt input file, how do I force-start the job for that specific input file?

In other words, can I pass the name of the input file through Autosys so that it is in turn passed to dsjob when I force-start the job?

Thanks in advance......
Attitude is everything....
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Usually the DataStage job developer takes care of parameter assignment. If a job fails, just change the input file name, which will have to be parameterized, and re-run that job. If you have to explore doing that with Autosys, you will have to talk to the Autosys folks to see how they can pass the parameter, which I doubt they will do; they are going to push this back to you.
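For what it is worth, the dsjob command line does accept parameter assignments, so whatever fires the job can hand it the file name. A minimal sketch of a manual re-run; the InputFile parameter and the dstage1/ProcessFile project and job names are made up for illustration:

Code: Select all

#!/bin/sh
# Re-run the job by hand for one specific input file.
# Assumes the job defines a parameter named InputFile.
$DSHOME/bin/dsjob -run -jobstatus \
    -param InputFile=/data/incoming/badfile.dat \
    dstage1 ProcessFile

The -jobstatus flag makes dsjob wait for the job to finish and set its exit code from the job's finishing status, which is handy when a scheduler is watching the return code.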
Last edited by DSguru2B on Mon Jan 29, 2007 9:31 am, edited 1 time in total.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

How are you getting the filenames now? If it is some flavor of automagic, won't it simply pick up with the last file it failed on? :?
-craig

"You can never have too many knives" -- Logan Nine Fingers
just4geeks
Premium Member
Posts: 644
Joined: Sat Aug 26, 2006 3:59 pm
Location: Mclean, VA

Post by just4geeks »

chulett wrote:How are you getting the filenames now? If it is some flavor of automagic, won't it simply pick up with the last file it failed on? :?
Currently, the files are dumped by a process into a folder at different times. The DS job simply picks up whatever file it finds in the folder, processes it, and then moves the file to a destination folder. Since the job is run at different times, it reads a different file each time it runs.

The only workaround I can think of now is to move the failed files to a 'failed job' folder and then write another DS job that processes the files in this folder. I can then force-start this job in Autosys.

I was planning to modify the current DS job so that I could control from Autosys which file it reads.
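Something along these lines is what I had in mind: a small wrapper script that Autosys invokes, with an optional file argument for a force-start. Every name below (script path, folders, parameter, project, and job) is invented for the example:

Code: Select all

#!/bin/sh
# wrapper.sh -- the Autosys JIL "command" attribute would point here,
# e.g.  command: /app/scripts/wrapper.sh /data/failed/bad.dat
# With no argument, pick up whatever is waiting in the incoming folder.
INFILE=${1:-$(ls /data/incoming/* 2>/dev/null | head -1)}
if [ -z "$INFILE" ]; then
    echo "no input file found" >&2
    exit 1
fi
$DSHOME/bin/dsjob -run -jobstatus -param InputFile="$INFILE" dstage1 ProcessFile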

Let me know if you can think of a better solution.
Attitude is everything....
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

You can keep the same job; just make it point to the 'failed job' directory. How you do that depends on how you are currently controlling your process: are you using a job sequence, a custom control job, or a UNIX script? Regardless, however you are specifying parameters now, you will pass the parameters for the "cover up" or "failed job" run the same way; see the sketch below.
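In script terms it is the same dsjob call with a different value plugged in. A rough sketch, where the SourceDir parameter, the directories, and the project/job names are all assumptions:

Code: Select all

#!/bin/sh
# The regular run and the recovery run differ only in the
# directory handed to the (assumed) SourceDir job parameter.
case "$1" in
    recover) SRCDIR=/data/failed   ;;
    *)       SRCDIR=/data/incoming ;;
esac
$DSHOME/bin/dsjob -run -jobstatus -param SourceDir="$SRCDIR" dstage1 ProcessFile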
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
just4geeks
Premium Member
Posts: 644
Joined: Sat Aug 26, 2006 3:59 pm
Location: Mclean, VA

Post by just4geeks »

DSguru2B wrote:You can keep the same job; just make it point to the 'failed job' directory. How you do that depends on how you are currently controlling your process: are you using a job sequence, a custom control job, or a UNIX script? Regardless, however you are specifying parameters now, you will pass the parameters for the "cover up" or "failed job" run the same way.
Thanks for your reply....

I am using a UNIX script. Also, I fail to understand what it means to "point to the 'failed job' directory." Can I do it in Autosys? I mean, can I specify which folders to look in within the Autosys JIL script?
Attitude is everything....
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

No. Autosys will just fire off your script, which will do everything; this is at a lower level than what Autosys can handle, so you will have to deal with it within your script.
"Pointing to the failed directory" means that your script needs to check whether the previous run of the job failed; if it did, pass in the path of the 'failed' folder you mentioned, the one that will contain the file to be reprocessed. It is a matter of setting the correct parameter before running your job, along these lines:
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
just4geeks
Premium Member
Posts: 644
Joined: Sat Aug 26, 2006 3:59 pm
Location: Mclean, VA

Post by just4geeks »

DSguru2B wrote:No. Autosys will just fire off your script, which will do everything; this is at a lower level than what Autosys can handle, so you will have to deal with it within your script.
"Pointing to the failed directory" means that your script needs to check whether the previous run of the job failed; if it did, pass in the path of the 'failed' folder you mentioned, the one that will contain the file to be reprocessed. It is a matter of setting the correct parameter before running your job.
Thanks, I get it now. But wouldn't it create problems when multiple instances of the UNIX script run at once, one handling the failed file and the other handling a regular file that has just arrived?

I am beginning to think that creating a separate DS job specifically to handle failed files is simpler in design, though not as elegant.
Attitude is everything....
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Don't run the script that handles failed files at the same time as your regular run; it will create confusion. Keep it plain, keep it simple: that is easier to manage and maintain.
Run your failed files at the end. Or, if you think a separate job will be easier for you to manage, go for it. Just make sure you have enough annotations and documentation to explain why you have a second, identical job; otherwise developers like me start wondering what's going on :)
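If you want a belt-and-braces guard against overlapping runs, a simple lock in the wrapper script is a common, scheduler-independent trick; the lock path here is an arbitrary choice:

Code: Select all

#!/bin/sh
# Crude mutual exclusion so a recovery run and a regular run
# can never overlap.  mkdir is atomic, so only one caller wins.
LOCK=/tmp/processfile.lock
if ! mkdir "$LOCK" 2>/dev/null; then
    echo "another run is in progress, exiting" >&2
    exit 1
fi
trap 'rmdir "$LOCK"' EXIT
# ... the dsjob call goes here ...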
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
just4geeks
Premium Member
Posts: 644
Joined: Sat Aug 26, 2006 3:59 pm
Location: Mclean, VA

Post by just4geeks »

DSguru2B wrote:Don't run the script that handles failed files at the same time as your regular run; it will create confusion. Keep it plain, keep it simple: that is easier to manage and maintain.
Run your failed files at the end. Or, if you think a separate job will be easier for you to manage, go for it. Just make sure you have enough annotations and documentation to explain why you have a second, identical job; otherwise developers like me start wondering what's going on :)
Thanks, DSguru2B. I will take note of what you said. I guess it's easier to go with a separate job. Let me go ahead and mark this topic "WorkAround".
Attitude is everything....