Archive job design suggestion

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

ag_ram
Premium Member
Posts: 524
Joined: Wed Feb 28, 2007 3:51 am

Archive job design suggestion

Post by ag_ram »

hi,

I have a series of server jobs:

DRS stage ----- Transformer ----- Sequential File

There would be 33 similar jobs like the one above, each for a different table.

These jobs are called from a set of sequence jobs.

The requirement is to archive old data. Each table holds approximately a million rows to archive. The after-job subroutine ExecDOS is configured in the job properties of each job to create a zip file of the data in a specified directory.
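For illustration, a minimal sketch of the kind of command an ExecDOS after-job subroutine might run. The directory, file names, and archiver are assumptions, not from the post; real jobs would use DataStage parameter substitution (e.g. #ArchiveDir#), for which plain shell variables stand in here so the sketch is runnable:

```shell
# Hypothetical stand-ins for job parameters such as #ArchiveDir# and #TableName#
ArchiveDir=$(mktemp -d)
TableName=CUSTOMER

# Create a sample extract file, as the Sequential File stage would have written
printf 'row1\nrow2\n' > "$ArchiveDir/${TableName}_archive.txt"

# Compress the extract; gzip stands in for whatever zip utility is installed
gzip -f "$ArchiveDir/${TableName}_archive.txt"

ls "$ArchiveDir"
```

The same one-liner (archiver plus parameterised path) would go in the ExecDOS command box, with DataStage expanding the job parameters at run time.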

My concerns with this design are:

a. Is scheduling more than 15 jobs on a single server canvas feasible, and what would the implications be?

b. What are the limitations of handling millions of records and moving them between directories using DataStage, in terms of log issues?

Any suggestion is welcome; I am not content with this design.

thanks,
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

(a) It's feasible PROVIDED THAT your hardware supports that many processes. The total load is equivalent to running fifteen simple jobs (one equivalent to each stream) simultaneously. I would prefer individual jobs and either a sequence or a job control routine in which I could set an upper limit on the number processing simultaneously.
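One way to cap the number of jobs processing simultaneously, sketched from the shell rather than a DS BASIC job control routine. The run_table_job function is a placeholder; in a real script it would invoke something like `dsjob -run -jobstatus $PROJECT ArchiveJob_$1`, and the table list, batch size, and job names are all assumptions:

```shell
LOGFILE=$(mktemp)

# Placeholder for running one archive job; a real version would call dsjob
run_table_job() {
  sleep 0.1                         # stand-in for the actual job run time
  echo "finished $1" >> "$LOGFILE"  # record completion
}

MAX_PARALLEL=4                      # upper limit on simultaneous jobs
count=0
for table in T01 T02 T03 T04 T05 T06 T07 T08; do
  run_table_job "$table" &          # launch in the background
  count=$((count + 1))
  if [ "$count" -ge "$MAX_PARALLEL" ]; then
    wait                            # block until the current batch finishes
    count=0
  fi
done
wait                                # wait for any remaining background jobs
```

A sequence job can achieve the same effect by wiring the 33 jobs into a fixed number of serial chains, so that no more than that many streams run at once.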

(b) There are no log issues with very many rows; the stage statistics simply have larger numbers. Moving between directories should not impact the log at all, unless you explicitly log the results of each cd command.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.