Our provider sends up to 300 files in a batch to be processed. I have a job that processes a single file, but calling it 300 times sequentially takes forever because of the DataStage job-control overhead. I need to kick off 300 instances of the same job (using the file name as a parameter) simultaneously. The code below captures the strategy, using the asynchronous nature of DSRunJob with an array of job handles, but it appears I cannot start two instances of the job at the same time.
The error is:
Control1..JobControl (fatal error from DSRunJob): Job control fatal error (-2)
(DSRunJob) Job Untitled1 is not in a runnable state
I am using the following code:
* Start job instances
for i = 1 to 50
   print "File Number ":i:" is "
   Call DSLogInfo("Processing File ":i, "Job Control")
   Call DSLogInfo("Calling DataStage job to load file", "Job Control")
   hJob1<i> = DSAttachJob("Untitled1", DSJ.ERRFATAL)
   ErrCode = DSRunJob(hJob1<i>, DSJ.RUNNORMAL)
next i

* Wait for all instances to finish
for i = 1 to 50
   ErrCode = DSWaitForJob(hJob1<i>)
   Status = DSGetJobInfo(hJob1<i>, DSJ.JOBSTATUS)
   If Status = DSJS.RUNFAILED Then
      * Fatal error - no return
      Call DSLogFatal("Job Failed: Untitled1", "JobControl")
   End
next i
Simultaneous Processing of multiple sequential files
Make your job multiple-instance and, in your code, change the line

Code: Select all
hJob1 = DSAttachJob("Untitled1", DSJ.ERRFATAL)

to

Code: Select all
hJob1 = DSAttachJob("Untitled1.":i, DSJ.ERRFATAL)

in order to fire off multiple instances with distinct invocation IDs.

You will somehow have to differentiate the input file for each instance of your job; perhaps you need to add a job parameter and pass in the file name through it.
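Putting those two suggestions together, the launch loop might look like the sketch below. This assumes the job has been recompiled as multi-instance and defines a job parameter here called FileName; both that parameter name and the file-naming pattern are hypothetical, not from the original post.

Code: Select all
* Sketch only: assumes "Untitled1" is compiled as a multi-instance job
* and has a job parameter named FileName (hypothetical name).
for i = 1 to 300
   * Attach instance i using a distinct invocation ID
   hJob1<i> = DSAttachJob("Untitled1.":i, DSJ.ERRFATAL)
   * Pass this instance its own input file (hypothetical naming pattern)
   ErrCode = DSSetParam(hJob1<i>, "FileName", "file":i:".txt")
   ErrCode = DSRunJob(hJob1<i>, DSJ.RUNNORMAL)
next i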
This is not a good name for a job; it is the default name, and meaningful names are preferred. Make sure also that you pass the file name as a parameter to each invocation, and that you verify that the attach was successful. A job that is not in a runnable state has possibly never been compiled, or it aborted, crashed, or was stopped the last time it ran. All of these situations need to be corrected.
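One way to verify the attach, as a sketch: attach with DSJ.ERRNONE instead of DSJ.ERRFATAL so a failure can be handled in code rather than aborting the controlling job (the job name LoadFile here is hypothetical, chosen only to illustrate a meaningful name).

Code: Select all
* Sketch only: attach without automatic abort so the handle can be tested.
hJob = DSAttachJob("LoadFile.":i, DSJ.ERRNONE)
If NOT(hJob) Then
   * Attach failed - job may be uncompiled, aborted, or not multi-instance
   Call DSLogFatal("Cannot attach to job instance ":i, "JobControl")
End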
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I agree with Craig. Even if you manage to start all 300 instances at the same time, you will exhaust the machine's resources. Collect the files using a Link Collector stage and then process them, or use an OS-level command (copy/cat) to concatenate the files into one.
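The concatenation approach could be driven from job control with DSExecute, roughly as below; the directory and file pattern are assumptions, not from the original post.

Code: Select all
* Sketch only: directory and file pattern are hypothetical.
Call DSExecute("UNIX", "cat /data/incoming/file*.dat > /data/incoming/combined.dat", Output, SystemReturnCode)
If SystemReturnCode <> 0 Then
   Call DSLogFatal("Concatenation failed: ":Output, "JobControl")
End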
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.