Job design

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

vgundavarapu
Premium Member
Posts: 22
Joined: Wed May 08, 2013 8:38 am

Job design

Post by vgundavarapu »

Hi,

I have a requirement to load data into a target fact table without first landing the data in datasets. With 10 lookups and 10 million records, how can we achieve this?

We don't have enough space in the dataset directories.
Thanks,
Venkata Gundavarapu
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

By creating job designs that don't load data to Data Sets?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
sainath
Premium Member
Posts: 138
Joined: Fri Nov 19, 2004 3:57 pm

Post by sainath »

Hi,

From my understanding, if you source from a database and load into a target database in the same job, performance will be very slow. Do you have a sample design approach that could be followed?

Thanks
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Your understanding is... incorrect. Or, at best, incomplete.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

ETL from and to the same table can be slow (mainly due to locks or, worse, self-deadlocks).

ETL from and to the same database should not be a problem, if source and target are different tables and the database server has sufficient capacity for the number of connections being requested.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
UCDI
Premium Member
Posts: 383
Joined: Mon Mar 21, 2016 2:00 pm

Post by UCDI »

If you need to, can you load a temporary table in your target, source, or another database?

You can run straight through, of course, if nothing in your job runs out of resources. If something does, you may have to find a workaround (like the temp table).

You can also change your extract to get the data in pieces, and run the job a few times with different WHERE clauses.
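The "run it in pieces" approach can be scripted around the `dsjob` command-line interface. A minimal dry-run sketch follows, assuming the job exposes two parameters (here called `START_KEY` and `END_KEY`) that its source stage substitutes into the extract's WHERE clause; the project name, job name, and parameter names are hypothetical and would need to match your own job design:

```shell
#!/bin/sh
# Sketch: invoke the same parallel job in key-range slices so no single
# run has to land all 10 million rows at once. START_KEY/END_KEY are
# hypothetical job parameters assumed to drive the source WHERE clause.
run_slices() {
    project=$1 job=$2 total=$3 chunk=$4
    start=1
    while [ "$start" -le "$total" ]; do
        end=$((start + chunk - 1))
        # Dry run: echo the command instead of executing it. Remove the
        # leading "echo" to actually run; -wait blocks until each slice
        # finishes before the next one starts.
        echo dsjob -run -wait \
            -param START_KEY="$start" -param END_KEY="$end" \
            "$project" "$job"
        start=$((end + 1))
    done
}

# Prints five dsjob commands covering keys 1..10,000,000 in 2M slices
run_slices MyProject LoadFactTable 10000000 2000000
```

Each slice keeps the working set (and any scratch/dataset space the job does use) to a fifth of the full volume, at the cost of paying the 10 lookups' reference-data load once per run.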