What are the engines involved in Server and Parallel Jobs?

Post questions here relating to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

chandankambli
Participant
Posts: 14
Joined: Sun Jun 11, 2006 2:16 pm

What are the engines involved in Server and Parallel Jobs?

Post by chandankambli »

Hello Experts:

Please help me understand the engines/mechanisms involved in running Server and Parallel Jobs in a UNIX environment.

Thanks.
Thanks experts.
datastage_learner
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Server jobs execute in what is effectively a virtual machine called the DataStage Run Machine. This is what you install as the "DataStage server" and is also used to execute job sequences. The DataStage Run Machine is written in a combination of C and DataStage BASIC.

Parallel jobs execute something called osh (for Orchestrate shell), which is the primary executable for the parallel execution technology that Ascential acquired when they purchased Torrent Systems. This is a "pure C" (well, C++ is included) environment.
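
For illustration only: a compiled parallel job is essentially a generated osh script that the engine runs against the configuration file named by APT_CONFIG_FILE. A minimal sketch of invoking osh directly follows; the generator and peek operators and the -schema/-records options are quoted from memory of the Orchestrate operator set and may differ between releases.

    $APT_ORCHHOME/bin/osh "generator -schema record(x:int32;) -records 10 | peek"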
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chandankambli
Participant
Posts: 14
Joined: Sun Jun 11, 2006 2:16 pm

Post by chandankambli »

ray.wurlod wrote:Server jobs execute in what is effectively a virtual machine called the DataStage Run Machine. This is what you install as the "DataStage server" and is also used to execute job sequences. The DataS ...
Thanks for the reply Ray.

But how do we execute Server Jobs and Parallel Jobs from the UNIX command line?

Do these types of jobs require different executables?

Say, is DSRunJob <args> used for executing both types of job, or is there something different for server and parallel jobs?
Thanks experts.
datastage_learner
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

From the command line you use dsjob for all job types. DataStage figures out which executable to use.
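
For example (the project and job names below are just placeholders, and the exact options available depend on your release):

    dsjob -run -mode NORMAL -jobstatus dstage1 MyServerJob
    dsjob -run -mode NORMAL -jobstatus dstage1 MyParallelJob

The same dsjob client runs either kind of job; -jobstatus makes it wait for completion and return the job's finishing status.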
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
DSguru2B
Charter Member
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

ray.wurlod wrote: The DataStage Run Machine is written in a combination of C and DataStage BASIC.
I didn't know that C was used too. Good to know.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Primarily for the memory management aspects - there are no functions in DataStage BASIC for this. The STREAMS I/O module is used to increase performance of the Sequential File stage.

Anywhere you see a subroutine call (for example when stage tracing) where the subroutine name begins with "$", a call to a C function is involved. Thus, for example, $DSD.SeqPut() is a call to a C function, but DSD.BCIPut is "pure" DataStage BASIC.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
DSguru2B
Charter Member
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

Neat. Thanks Ray.
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.