So what is the resolution? I have the same requirement: running a multi-instance job N times simultaneously, based on the job parameter N.
Is it possible to do this with a job sequence, or does it have to be a job control job?
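One common approach (outside a sequence) is to drive the multi-instance job from the `dsjob` command line, giving each run its own invocation ID. The sketch below is illustrative only: the project name, job name, and the INSTANCE parameter are invented placeholders, not taken from the thread, and it builds the commands rather than executing them.

```python
# Hypothetical sketch: build one `dsjob -run` command per instance of a
# multi-instance job, each with a unique invocation ID so the runs can
# proceed in parallel. Names ("myproj", "MyMultiJob", INSTANCE) are
# placeholders for illustration.

def build_dsjob_commands(project, job, n):
    """Return one `dsjob -run` command per instance (job.invocationid)."""
    commands = []
    for i in range(1, n + 1):
        invocation = f"{job}.{i}"  # multi-instance jobs run as job.invocationid
        commands.append(
            f"dsjob -run -mode NORMAL -param INSTANCE={i} {project} {invocation}"
        )
    return commands

if __name__ == "__main__":
    for cmd in build_dsjob_commands("myproj", "MyMultiJob", 3):
        print(cmd)
```

In practice these commands could be launched from an ExecCommand activity in a sequence or from a shell loop, with a wait on each invocation before checking statuses.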
Search found 161 matches
- Wed Jun 01, 2016 12:49 pm
- Forum: General
- Topic: Datastage Multiple instance job to run 'n' times using parm
- Replies: 16
- Views: 25983
- Wed Jun 01, 2016 12:48 pm
- Forum: General
- Topic: Datastage Multiple instance job to run 'n' times using parm
- Replies: 16
- Views: 25983
- Mon Feb 08, 2016 11:31 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Is it possible to generate the schema file using ds job
- Replies: 10
- Views: 24289
I can't find this feature in my table definitions; I am using a 9.1 parallel job. Am I doing something wrong? I want to create schema files from many database tables. ray.wurlod wrote: Right click in the columns grid of the table definition you already have in DataStage and choose Save As to save it as a schema file. ...
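For many tables, the Save As step can be tedious, so generating the schema files programmatically from database metadata is a common alternative. The sketch below is a minimal, assumption-laden model: the column tuples and parallel-engine type names are invented examples, and a real script would pull them from the database catalog.

```python
# Hedged sketch: render a parallel-engine (OSH-style) record schema from
# a list of (name, type, nullable) tuples. The columns and type names
# here are illustrative, not taken from any real table.

def to_schema(columns):
    """Render columns as a record schema string, marking nullable fields."""
    lines = ["record", "("]
    for name, engine_type, nullable in columns:
        null_prefix = "nullable " if nullable else ""
        lines.append(f"  {name}: {null_prefix}{engine_type};")
    lines.append(")")
    return "\n".join(lines)

if __name__ == "__main__":
    cols = [("cust_id", "int32", False), ("cust_name", "string[max=50]", True)]
    print(to_schema(cols))
```

Writing each rendered string to its own `.schema` file then gives one schema per table without opening each table definition by hand.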
- Tue May 19, 2015 7:11 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: XML stage error: dbx not found
- Replies: 0
- Views: 1634
XML stage error: dbx not found
Hi, I am getting the following runtime error in the test environment, while the job ran fine many times in the dev environment (same box, different project): Xml1,0: sh: dbx: not found. The above message is logged as a warning, and the next entry in the log is an error with the following message: Xml1,0: Operato...
- Fri Apr 04, 2014 11:18 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Meaning of the "USER" environment variable in the
- Replies: 8
- Views: 3944
Thanks again Ray. We have SunOS, so I could not find the environment folder under /etc (or my userid does not allow me to see it). The interesting thing is that when I log in to my Unix account via a shell and do echo $USER it shows my userid; however, if I create a dummy parallel job and use the same co...
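The discrepancy described above usually comes down to where the "user" value is read from: USER and LOGNAME are just inherited environment variables (and on SunOS non-login shells USER may not be exported at all), while the operating system's own account lookup is authoritative. This small illustrative check, not specific to DataStage, shows the different sources side by side:

```python
# Illustrative comparison of user-identity sources. USER/LOGNAME are
# environment variables inherited from whatever process started us (and
# can be absent or overridden); getpass.getuser() falls back through
# those variables and then the OS account database.

import os
import getpass

def identity_report():
    return {
        "env_USER": os.environ.get("USER"),        # may be unset or stale
        "env_LOGNAME": os.environ.get("LOGNAME"),  # POSIX login name variable
        "getpass": getpass.getuser(),              # env vars, then OS lookup
    }

if __name__ == "__main__":
    print(identity_report())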
- Thu Apr 03, 2014 7:44 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Meaning of the "USER" environment variable in the
- Replies: 8
- Views: 3944
- Wed Apr 02, 2014 5:55 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Meaning of the "USER" environment variable in the
- Replies: 8
- Views: 3944
- Wed Apr 02, 2014 4:26 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Meaning of the "USER" environment variable in the
- Replies: 8
- Views: 3944
Thanks Ray. I guess it can be set to any value and cannot be trusted (it does not mean much), because in our case the value of USER is the same for all the job logs, no matter who runs the job (ISUSER shows that information). It is also not the person who started the DataStage engine; it is not the...
- Wed Apr 02, 2014 1:58 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Meaning of the "USER" environment variable in the
- Replies: 8
- Views: 3944
Meaning of the "USER" environment variable in the
Hi Everybody, I was wondering if anyone knows what the "USER=XXXXX" in the Director's second log entry means? In my log, it appears just before the environment variable "WHO", which represents the project name. I could not find what it represents; it does not represent who ra...
- Wed Nov 21, 2012 10:54 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Oracle Connector stage does not recognize the SQL statement
- Replies: 1
- Views: 5422
- Wed Nov 21, 2012 10:16 am
- Forum: IBM® Infosphere DataStage Server Edition
- Topic: Oracle Connector stage does not recognize the SQL statement
- Replies: 1
- Views: 5422
Oracle Connector stage does not recognize the SQL statement
Hi Everybody, has anyone come across the Oracle Connector stage not recognizing custom SQL when used in a server job? I just created a simple job that reads from Oracle through the Oracle Connector and writes to a sequential file. The interesting part is that I can view the data. Here is exact...
- Fri Sep 30, 2011 4:24 pm
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: A bug in Dquote function?
- Replies: 4
- Views: 3175
That is exactly what I do NOT want. I did not intend to treat it as two fields; I intend it to be treated as one field. It comes in as one field and I want it to leave the transformer as one field. I just want the incoming field enclosed in double quotes, that's all. I think that is the definition of this ...
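The behaviour being asked for is simple enclosure with no escaping, which is why the usual workaround in a parallel transformer is plain concatenation ('"' : link.col : '"') instead of DQuote(). The Python model below only illustrates that intended behaviour; it is not the DataStage implementation.

```python
# Illustrative model of "just wrap it": surround the value with double
# quotes and leave any embedded quotes exactly as they arrived, with no
# extra escape characters introduced.

def dquote(value):
    """Wrap value in double quotes without touching embedded quotes."""
    return '"' + value + '"'

if __name__ == "__main__":
    print(dquote('ABC"",""EFG'))  # prints "ABC"",""EFG" - quotes added only at the ends
```

If a stage downstream is adding escapes, the concatenation workaround in the transformer derivation sidesteps the function entirely while producing the same enclosed field.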
- Fri Sep 30, 2011 11:08 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: A bug in Dquote function?
- Replies: 4
- Views: 3175
A bug in Dquote function?
Have any of you experienced the DQuote function not working properly inside a parallel transformer? Here "is not working properly" means that it is introducing extra escape characters. For example, mylink.mycolumn's value is ABC"",""EFG; when I use Dquote(mylin...
- Fri Aug 19, 2011 11:55 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: is it possible to make some job parameters mandatory?
- Replies: 3
- Views: 2709
- Fri Aug 19, 2011 10:57 am
- Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: is it possible to make some job parameters mandatory?
- Replies: 3
- Views: 2709
is it possible to make some job parameters mandatory?
Hi All, I was wondering if it is possible to make some or all of the job (sequence) parameters mandatory to fill in; otherwise the job should not proceed (abort as soon as it runs). The reason I need this is that our QA testers are testing the jobs by clearing out some of the default parameters and se...
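One way to get this effect is a validation step that runs first and aborts when a required parameter is blank; in DataStage this would typically sit in a before-job subroutine or the sequence's first activity. The sketch below only models that check; the parameter names SRC_DIR and RUN_DATE are invented for illustration.

```python
# Hedged sketch: abort the run immediately when any required parameter
# is missing or blank. REQUIRED and the parameter names are invented
# placeholders, not from the original post.

REQUIRED = ("SRC_DIR", "RUN_DATE")

def check_mandatory(params):
    """Raise (i.e. abort the job) if a required parameter is empty."""
    missing = [p for p in REQUIRED if not str(params.get(p, "")).strip()]
    if missing:
        raise ValueError(f"Mandatory parameters not set: {', '.join(missing)}")

if __name__ == "__main__":
    check_mandatory({"SRC_DIR": "/data/in", "RUN_DATE": "2011-08-19"})  # passes
```

Because the check fails fast, a tester who clears a default gets an immediate abort with a message naming the missing parameter, rather than a job that runs with bad values.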