Parameter Set with Environment Variables in it

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.


PhilHibbs
Premium Member
Posts: 1044
Joined: Wed Sep 29, 2004 3:30 am
Location: Nottingham, UK

Parameter Set with Environment Variables in it

Post by PhilHibbs »

The project that I am working on has a convention whereby all jobs must have a Parameter Set containing a few environment variables. There is a single Value Set, "Standard", which is set up in every environment; the shell script that invokes dsjob specifies this value set name, and the value set is passed on from Sequence Jobs to Parallel Jobs.
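
For context, the wrapper script ends up doing something roughly like this. The parameter set, project and job names are made up, and I am quoting the dsjob syntax from memory, so treat it as a sketch rather than gospel:

    # "psStandardEnv" stands in for our parameter set of environment variables;
    # assigning "Standard" to it picks up the value set of that name in this environment.
    dsjob -run -mode NORMAL -wait \
          -param psStandardEnv=Standard \
          MyProject seqMainSequence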

I am wondering why it was set up this way. Nobody here seems to know; the people who set up the standard conventions are all gone, so it's a mystery. I wanted to override the value of $APT_DUMP_SCORE in a job, so I added it as a parameter to the job, which then failed (I can't remember whether it failed to compile or failed to run) because $APT_DUMP_SCORE is in the Parameter Set. I therefore had to override it at run time rather than compiling in a TRUE value.
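
If you do it from the command line, the run-time override looks something like this. Again the parameter set name is made up and the dot notation for overriding a single element of a parameter set is as I recall it from the documentation, so double-check before relying on it:

    # Override one environment-variable parameter inside the set for this run;
    # the "Standard" value set still supplies everything else.
    dsjob -run \
          -param psStandardEnv=Standard \
          -param 'psStandardEnv.$APT_DUMP_SCORE=True' \
          MyProject MyParallelJob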

Is this a common convention? Can anyone think of a good reason to do this?

The only one that I can make sense of is $APT_CONFIG_FILE, which would allow us to have different .apt config files for different sub-projects; for example, payroll data set files could be written to a separate volume with additional security. There are no such concerns on this project, though, as all data within a DataStage project is of equal confidentiality.
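
If we ever did want that, I imagine the wrapper script could pick a configuration file per sub-project and push it through the same Parameter Set. Everything below is invented for illustration: the sub-project check, the parameter set name and the file paths (SUBPROJECT and JOBNAME would be set earlier in the script):

    # Hypothetical: choose a locked-down configuration for the payroll sub-project,
    # then pass it in as the $APT_CONFIG_FILE element of the parameter set.
    case "$SUBPROJECT" in
        payroll) APT_CONFIG=/opt/IBM/InformationServer/Server/Configurations/payroll_secure.apt ;;
        *)       APT_CONFIG=/opt/IBM/InformationServer/Server/Configurations/default.apt ;;
    esac

    dsjob -run \
          -param psStandardEnv=Standard \
          -param "psStandardEnv.\$APT_CONFIG_FILE=$APT_CONFIG" \
          MyProject "$JOBNAME"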

The Value Set "Standard" is my amendment to the process, as it avoids the need to re-compile jobs when deploying them from dev to another project.

Posting under General as the same convention may apply to Server Job environments.
Phil Hibbs | Capgemini
Technical Consultant