Project Name in environment variable?

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

jackson.eyton
Premium Member
Posts: 145
Joined: Thu Oct 26, 2017 10:43 am

Project Name in environment variable?

Post by jackson.eyton »

So this is likely a dumb question, but I have thus far been unable to find an answer to it. I have a sequence job that needs to call another sequence job and then end. The best way I could find to call the other job and then end, without waiting for the other job to complete, is to use dsjob.exe to call it. The arguments for this require the project name and job name, of course. In the process of developing and testing this, I will need to package and move the jobs for my specific task from Dev to Test, and then from Test to Prod. These are all different projects and, in the case of Prod, a different server set entirely. I'm hoping the project name where the job resides is available as an environment variable so that I don't have to change the Execute Command stage arguments for each move. Something like $DSJProj, maybe? Obviously not a HUGE issue, more of a convenience.
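
Roughly what I'm after, sketched in shell (the job name Seq_NextStep is made up for illustration, and $PROJECT_NAME stands in for whatever variable would hold the current project, if such a thing exists):

Code:

# Hypothetical: PROJECT_NAME would need to come from somewhere --
# ideally a built-in variable, otherwise a per-project parameter.
# Without -wait or -jobstatus, dsjob kicks the job off and returns.
dsjob -run -mode NORMAL ${PROJECT_NAME} Seq_NextStep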
-Me
PaulVL
Premium Member
Posts: 1315
Joined: Fri Dec 17, 2010 4:36 pm

Post by PaulVL »

I would never execute a dsjob command to run a job from within another job. Your first job will end and give a return code to the calling script, but the second job will run with nobody to notify upon failure or success. Danger, danger.

Not sure why you would EVER want to fire off a job in prod without return code validation.
jackson.eyton
Premium Member
Posts: 145
Joined: Thu Oct 26, 2017 10:43 am

Post by jackson.eyton »

Well, the jobs being executed are sequence jobs with exception handling, so I would assume that would still work and I would get the designed email in case of failure. Additionally, I can't think of a better way to do what I need. The scenario is as follows: the sequence jobs are replicas of each other. Once the sequence is started (running), it waits for a file to appear in a certain directory. This file can come at any time, as it's submitted to us by one of our customers, and multiple files could be submitted in any given time period. Once the file is there, the sequence runs a series of parallel jobs and server commands that read the contents of the file and store them in the database, then move the file to a processed folder with the datetime appended to the file name. The task then needs to go back "on hold" for the next file, whenever it may come.

I tried several ways to do this last part. First, I tried simply having a Job Activity stage run the sequence itself again, but this seemed to cause issues. Second, I created the copy sequence job and had each one use a Job Activity stage at the end to call the other, but this caused a loop: whichever job was run first would never actually finish because it was waiting for the other sequence to complete, which in turn would never finish because it was waiting for the first, and so on. So this third option uses Execute Command at the end to "run" the copy sequence; this way the two sequence jobs go back and forth waiting for the next file to process, but each one does successfully END once the copy job is called. I am more than open to ideas if there is a better way to approach this.

I suppose I could add an additional Notification Activity at the very end of the sequence that only emails if the preceding Execute Command stage's dsjob command fails...
-Me
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

To address the original question, I don't recall the current project name being stored anywhere, i.e. a macro or system variable or anything of that nature... but it has been a while. You can get a list of all of the projects through the API, but I think that if you need the current one, you'll need to create a user-defined environment variable in each project with that value.
-craig

"You can never have too many knives" -- Logan Nine Fingers
FranklinE
Premium Member
Posts: 739
Joined: Tue Nov 25, 2008 2:19 pm
Location: Malvern, PA

Post by FranklinE »

Sounds like a design problem (ahem, call me Dr. Obvious) but not at the DataStage level. You need a scheduler with cyclic capabilities to control your job runs.

If you don't have an external scheduler, the following might help you find a workable, if less convenient, alternative in DS.

We use Control-M. I would put the file watcher in the CM process, which would invoke your DS process each time a file came in, accept the return code, and either throw a failure alert or reset for the next file. You would need to find something to put in the invocation id to create distinct logs in Director, or relax the log line limit on the job, because there would be dozens of messages from a single job running multiple times.
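
As a rough sketch of the invocation id idea (the sequence would need to be compiled with "Allow multiple instance" enabled; the project variable and job name below are made-up placeholders):

Code:

# Use a timestamp as the invocation id so each run gets its own
# log in Director. Requires the sequence to allow multiple instances.
RUN_ID=$(date +%Y%m%d%H%M%S)
dsjob -run -jobstatus ${PROJECT_NAME} "Seq_ProcessCustomerFile.${RUN_ID}"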

Schedulers are just too useful to ignore. They give you an enhanced flexibility while taking on workload (and design efforts) that won't be easy or efficient in DS.

EDIT: re project name. If you use an ssi file in your dsjob script, create a project name variable there. You should have a naming-convention standard for project names, something with a pattern to it. We use three parts in each name: Unix group name, project app prefix, syslevel, e.g. ABC_XYZ_PRD. Having those also be parameters for the dsjob command line keeps us honest. :wink:
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson

Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
PaulVL
Premium Member
Posts: 1315
Joined: Fri Dec 17, 2010 4:36 pm

Post by PaulVL »

So how about a loop in your master sequencer?

Test for file, if not present sleep 5 mins, loop to start. If present, continue to execute jobs, loop back to start when done...
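
Just to illustrate the control flow (in a sequence this would be a Start Loop, a Wait For File or Execute Command check, your processing jobs, then an End Loop; the shell below, with made-up paths and names, only shows the idea):

Code:

# Illustration only: the wait-and-process loop, not an actual sequence
while true; do
    if ls /data/incoming/customer_*.csv >/dev/null 2>&1; then
        # a file arrived -- run the processing sequence and wait for it
        dsjob -run -jobstatus ${PROJECT_NAME} Seq_ProcessCustomerFile
    else
        sleep 300   # nothing yet, check again in 5 minutes
    fi
done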
FranklinE
Premium Member
Posts: 739
Joined: Tue Nov 25, 2008 2:19 pm
Location: Malvern, PA

Post by FranklinE »

PaulVL wrote:So how about a loop in your master sequencer?
How would he exit the loop? It's an assumption, but he indicates that the job is time based.
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson

Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

FranklinE wrote:EDIT: re project name. If you use an ssi file in your dsjob script, create a project name variable there.
Sorry... "ssi file"?
-craig

"You can never have too many knives" -- Logan Nine Fingers
FranklinE
Premium Member
Posts: 739
Joined: Tue Nov 25, 2008 2:19 pm
Location: Malvern, PA

Post by FranklinE »

"Server side includes." The general use is to source in script lines that are common to scripts in use. In my experience, they are common function definitions -- get server name, verify existence of directories, etc. -- and in our DataStage environment they contain required common variables and parameters at the server level. We customize them to include the environment variables, and use a utility script to update the project when we make changes to them.

Example lines from a project ssi file, with names changed to protect the innocent:

Code:

# Placeholder: the environment level for this server (e.g. dev, tst, prd)
G_DS_ENV=(syslevel)
# Unix group that owns the project
G_DS_UNIX_GROUP=abc
# Upper-case copies used to assemble the project name
ENV_CAPS=$(echo ${G_DS_ENV} | tr '[a-z]' '[A-Z]')
GRP_CAPS=$(echo ${G_DS_UNIX_GROUP} | tr '[a-z]' '[A-Z]')
# Application prefix for this project
G_DS_PRJ=XYZ

# Assembled project name, e.g. ABC_XYZ_PRD
G_DS_PROJ_NAME=${GRP_CAPS}_${G_DS_PRJ}_${ENV_CAPS}
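
And tying it back to the original question, a minimal sketch of how that variable feeds the dsjob call once the ssi file has been sourced (the include path and sequence name here are made up):

Code:

# Pull in the project ssi file (path is illustrative only)
. /path/to/project.ssi

# Launch the follow-on sequence without waiting for it to finish
dsjob -run -mode NORMAL ${G_DS_PROJ_NAME} Seq_ProcessCustomerFile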
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson

Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
jackson.eyton
Premium Member
Posts: 145
Joined: Thu Oct 26, 2017 10:43 am

Post by jackson.eyton »

Forgive my ignorance, I am not familiar with any schedulers such as you have referred to :-( I am pretty new to InfoSphere and ETL in general. Could you elaborate on the dangers of using dsjob to execute another job? When doing so, I can see the job running in Director and review its logs. Also, the Notification Activity stages configured in the jobs will let me know if there's ever a problem processing, so I'm having a hard time understanding what exactly the drawback is. I do apologize...
-Me
FranklinE
Premium Member
Posts: 739
Joined: Tue Nov 25, 2008 2:19 pm
Location: Malvern, PA

Post by FranklinE »

Told to me by my Cobol instructor: never apologize for ignorance. It gives those of us with a little bit of knowledge a chance to look like geniuses. :lol:

DataStage has a scheduling utility. Look it up in the manuals. It's rather primitive compared to external schedulers, and might suit you if your shop doesn't have or won't get one.

In the case of Control-M, it's a job which has calendar and time attributes, and facilitates building cycles of jobs which you can control in various ways. It's used in many larger shops with mainframes and a long list of jobs.

The CM job's core is a command line. In my case, it invokes a script which builds and submits a dsjob command line. It has error handling and notifications built in, and in our large shop is monitored by our Ops crew. When abends happen, they go to documentation for what to do or whom to call.
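
A stripped-down sketch of what such a wrapper might look like (paths, job names and the notification command are placeholders, not our actual script, and the exact -jobstatus exit codes should be checked against your dsjob documentation):

Code:

#!/bin/sh
# Skeleton wrapper invoked by the scheduler. All names are placeholders.
PROJECT="$1"
JOB="$2"

# -jobstatus makes dsjob wait and reflect the job's finishing status
# in its exit code (typically 1 = OK, 2 = finished with warnings).
dsjob -run -jobstatus "${PROJECT}" "${JOB}"
RC=$?

if [ "${RC}" -ne 1 ] && [ "${RC}" -ne 2 ]; then
    echo "Job ${JOB} in ${PROJECT} failed, dsjob rc=${RC}" | \
        mail -s "DataStage failure: ${JOB}" ops-team@example.com
    exit 1
fi
exit 0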

In short, it covers all of the tasks you have to build into the DS jobs, in a standardized fashion which removes the necessity of doing it in every individual DS job.

It's more a trade-off than a drawback, and part of the trade-off is the expense of having another software app in addition to DS.
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson

Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
FranklinE
Premium Member
Posts: 739
Joined: Tue Nov 25, 2008 2:19 pm
Location: Malvern, PA

Post by FranklinE »

BMC Control-M "about" and usage manuals documentation:
https://docs.bmc.com/docs/display/publi ... umentation

I checked the links, you don't have to log in to view them.
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson

Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

We moved from a POS internal "scheduler" (I use the term loosely) to a full blown enterprise-wide Control-M environment and it does way more things than we would ever need. All for one high high price. :wink:

And thanks for the SSI details, familiar with the concept if not the name...
-craig

"You can never have too many knives" -- Logan Nine Fingers
FranklinE
Premium Member
Posts: 739
Joined: Tue Nov 25, 2008 2:19 pm
Location: Malvern, PA

Post by FranklinE »

You're welcome, Craig. We have thousands of jobs on multiple platforms -- z/OS, Unix and Windows -- which run nightly and are dependent on variations due to holidays and time constraints. Our ROI for Control-M is rather good. I don't know if BMC has "light" versions for smaller shops, but it's worth a look.
Franklin Evans
"Shared pain is lessened, shared joy increased. Thus do we refute entropy." -- Spider Robinson

Using mainframe data FAQ: viewtopic.php?t=143596 Using CFF FAQ: viewtopic.php?t=157872
jackson.eyton
Premium Member
Posts: 145
Joined: Thu Oct 26, 2017 10:43 am

Post by jackson.eyton »

Thanks, gentlemen, for your replies, MUCH appreciated. We are rather locked down on the software we have at the moment, but I will absolutely keep this in mind for future recommendations to the budgeting committee. For now it sounds like there's no other way to accomplish what I need beyond what I already have. In regard to the original question, I will just create a user-defined variable holding the project name in each project for now; not a huge deal.
-Me