Oracle Enterprise Stage

Post questions here related to DataStage Enterprise/PX Edition in areas such as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

KStack
Participant
Posts: 1
Joined: Wed Apr 23, 2003 8:42 am
Location: United Kingdom

Oracle Enterprise Stage

Post by KStack »

Any suggestions on restarting Oracle Enterprise Stages? Is it possible to capture the number of commit points and use this to restrict the records for the restart?

The easiest way is to have no commits and let it roll back on failure; I was wondering if anyone had a better way.

Thanks

Kevin
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

KStack,

As you said, by far the easiest approach is to have one big commit at the end of job execution. Should the job abort for some reason, restarting is essentially just re-running the job. Easy, simple and straightforward.

Unfortunately, it is not fast. For jobs that run for hours this is not a good solution, and for big jobs you might run into timeout issues as well as commit-size constraints.

If your input data is in the same order each run (i.e. a sequential file as opposed to a SELECT on a table), then you can program around a restart in DataStage by getting the number of rows processed by the previous run from the job's log and restarting from that row count rounded down to the nearest multiple of whatever commit frequency you choose. This can be done without too much effort and can also be fully automated.
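
To make the "rounded down to the nearest multiple" step concrete, here is a minimal PL/SQL sketch of the arithmetic only; the row count and commit frequency are made-up values, and in practice the row count would come from the job log as described above:

    SET SERVEROUTPUT ON
    DECLARE
       rows_processed PLS_INTEGER := 123456;  -- taken from the previous run's log (made-up value)
       commit_rows    PLS_INTEGER := 10000;   -- whatever commit frequency you chose (made-up value)
       restart_from   PLS_INTEGER;
    BEGIN
       -- Drop back to the last fully committed row boundary.
       restart_from := rows_processed - MOD(rows_processed, commit_rows);
       -- Here restart_from is 120000: the restarted job skips that many input rows.
       DBMS_OUTPUT.PUT_LINE('Skip the first ' || restart_from || ' rows on restart.');
    END;
    /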

So the choice is yours - but I would prefer to keep the jobs simple and wouldn't program around the restartability issue unless I really had a reason to do so.
bibhudc
Charter Member
Posts: 20
Joined: Thu Jun 19, 2003 12:26 pm

Post by bibhudc »

One way you could avoid re-processing all the rows is to store the records in a flat file before you insert into the Oracle table. This of course gets complicated if you have multiple targets in a sequential processing situation.
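
If that flat file sits on the database server, one way (not necessarily what bibhudc had in mind) to let a restarted run load straight from it is an Oracle external table. This is only a sketch: the table, column, directory object, and file names are all hypothetical, and it assumes the directory object already exists.

    CREATE TABLE stg_rows_ext (
       id     NUMBER,
       amount NUMBER
    )
    ORGANIZATION EXTERNAL (
       TYPE ORACLE_LOADER
       DEFAULT DIRECTORY stage_dir       -- hypothetical directory object pointing at the file's location
       ACCESS PARAMETERS (
          RECORDS DELIMITED BY NEWLINE
          FIELDS TERMINATED BY ','
       )
       LOCATION ('stg_rows.dat')         -- the flat file written before the insert
    )
    REJECT LIMIT UNLIMITED;
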
Bibhu
dsxuserrio
Participant
Posts: 82
Joined: Thu Dec 02, 2004 10:27 pm
Location: INDIA

Post by dsxuserrio »

Another way of addressing the problem would be to load into an empty work/staging table. The move from the staging/work table to the target (actual) table could then be done entirely by Oracle, using a package. That works great for us with huge volumes.
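
A minimal sketch of that pattern with hypothetical table names (the real version would usually live inside a PL/SQL package, as described above):

    TRUNCATE TABLE work_orders;

    -- DataStage bulk-loads work_orders here, committing as often as it
    -- likes; if the load fails, simply truncate and reload the work
    -- table. The target table is never touched until the load is clean.

    INSERT /*+ APPEND */ INTO orders
    SELECT * FROM work_orders;
    COMMIT;
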
dsxuserrio

Kannan.N
Bangalore,INDIA
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

dsxuserrio's approach is a very efficient one; I'd like to add that if you can have your DBA partition your target tables in such a way that each load goes into a specific partition, then you wouldn't need to copy the data from the "staging" or interim tables; you would just activate the partition.
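
One common Oracle mechanism for "activating" the loaded data is EXCHANGE PARTITION, which swaps the interim table into the partition as a single dictionary operation rather than a data copy. A hedged sketch, with hypothetical table and partition names:

    -- work_orders must match the partition's column layout; after the
    -- swap the loaded rows belong to the partition and work_orders
    -- holds the partition's previous contents.
    ALTER TABLE orders
       EXCHANGE PARTITION p_2005_01 WITH TABLE work_orders
       INCLUDING INDEXES WITHOUT VALIDATION;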