I currently have a job that writes to two sequential files, then concatenates the two files into one file, reads from that file, and continues processing. What I'm doing now is writing both sequential file streams into the same Sequential File stage (but different files). The output link from that Seq File stage reads from /dev/null and goes to a Transformer that doesn't do anything, but has a before-stage routine that concatenates the two files into a third file. The Transformer has an output link to another Sequential File stage; this link writes to /dev/null. The output link from that second sequential stage reads the file that was built by concatenating the two previous files. And processing continues...

This structure let me make sure that both files were done before anything else later in the job happened. So, how can I redesign this process so that I don't read and write to /dev/null in the same process, and still keep the second half of the job from trying to read the third file before it's ready? About the only thing I can think of is to split this up into two jobs and run it from its own sequencer. Any ideas?
I would go with your sequencer idea. Split the current job into three jobs and a routine. Jobs 1 and 2 run in parallel creating files A and B, respectively. In your sequence, use a sequencer stage to wait for both to complete, then use a routine activity stage to run a routine that runs a cat command via DSExecute.
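The routine body itself can be tiny. As a sketch (the file names here are illustrative stand-ins, not the real job's paths), the command you would hand to DSExecute with the "UNIX" shell type is just a cat, plus a sanity check so the routine can fail loudly if the result is missing:

```shell
# Stand-ins for the two files produced by Jobs 1 and 2 (names are illustrative)
printf 'a1\na2\n' > /tmp/fileA.txt
printf 'b1\n'     > /tmp/fileB.txt

# The command the routine would pass to DSExecute("UNIX", ...):
cat /tmp/fileA.txt /tmp/fileB.txt > /tmp/fileC.txt

# Fail the routine if the concatenated file did not materialize
[ -s /tmp/fileC.txt ] || exit 1
```

Because the sequencer stage waits for both job activities to finish before the routine activity fires, the cat can never run against a half-written input.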
I don't understand why you can't put the concatenated filename into the output link of the seq stage with the two input files. In the transformer on that seq stage put a before-transformer command to cat the two files together.
What happens is that the seq output won't start until the two inputs are finished. Then, the seq output will begin after the transformer finishes its before-transformer command, which happens to place the concatenated file right where it needs it.
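One defensive refinement you could add (my own suggestion, not something the before-transformer command strictly requires): build the concatenated file under a temporary name and then rename it into place. A rename on the same filesystem is atomic, so nothing downstream can ever open a partially written file. Paths here are illustrative:

```shell
# Stand-ins for the two input files (substitute the job's real paths)
printf 'a1\na2\n' > /tmp/fileA.txt
printf 'b1\n'     > /tmp/fileB.txt

# Write to a temp name first, then mv into place; the final name
# only ever appears once the file is fully built
cat /tmp/fileA.txt /tmp/fileB.txt > /tmp/fileC.txt.tmp \
  && mv /tmp/fileC.txt.tmp /tmp/fileC.txt
```

This whole pipeline fits on one line, so it works as the ExecSH input value of a before-stage subroutine.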
Kenneth Bland
Rank: Sempai
Belt: First degree black
Fight name: Captain Hook
Signature knockout: right upper cut followed by left hook
Signature submission: Crucifix combined with leg triangle
I guess I thought it would try to open the file before the Transformer's before-stage routine kicked off, so I never tried it. I'll try it and report my results back here.
Nope, it works, and I do it all the time. As long as you don't Validate the job, the before commands always happen before the link opens for processing.
A passive stage cannot open its output(s) until all its inputs are closed (the IPC stage being a notable exception). This is why it is possible to pre-load a hashed file in the same job that uses it.
An active stage cannot open its output(s) unless and until all its inputs are opened.
Before-stage subroutines are executed before the generic "Open" function is issued to any link.
After-stage subroutines are not executed until after the generic "Close" function has been issued to all links.
Before-job subroutines are executed before the generic "Run" method is invoked on any active stage.
After-job subroutines are not executed until all child processes have notified completion.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.