Hi,
I have a requirement where I load data into a target fact table. I have to load the fact table directly, without landing the data into datasets first, because we don't have enough space in the dataset directories. With 10 lookups and 10 million records, how can we achieve this?
Thanks
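For context, the general idea being asked about can be sketched as: keep the (small) lookup tables in memory and stream source rows straight into the fact table, so nothing is landed on disk in between. This is a minimal illustration only — all table and column names below are invented, and it assumes the lookup tables fit in memory:

```python
import sqlite3

# Hypothetical sketch (table and column names invented): resolve lookups
# from in-memory dicts and stream source rows straight into the fact
# table, with no intermediate dataset landed on disk.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src (id INTEGER, cust_code TEXT, amount REAL);
    CREATE TABLE dim_customer (cust_code TEXT, cust_key INTEGER);
    CREATE TABLE fact_sales (id INTEGER, cust_key INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 'A', 10.0), (2, 'B', 20.0), (3, 'A', 5.0);
    INSERT INTO dim_customer VALUES ('A', 100), ('B', 200);
""")

# Load each (small) lookup table into a dict once; with 10 lookups you
# would build 10 such dicts before the main pass.
cust_lookup = dict(cur.execute("SELECT cust_code, cust_key FROM dim_customer"))

# Stream source rows through the lookup and into the fact table,
# using a second cursor for the read side.
read_cur = conn.cursor()
rows = (
    (sid, cust_lookup.get(code), amount)
    for sid, code, amount in read_cur.execute(
        "SELECT id, cust_code, amount FROM src")
)
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", rows)
conn.commit()
print(cur.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0])  # → 3
```

In a DataStage parallel job the equivalent would be Lookup stages fed directly from the dimension tables, with the stream link going straight from source connector to target connector.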
Job design
Thanks,
Venkata Gundavarapu
ETL from and to the same table can be slow (mainly due to locks or, worse, self-deadlocks).
ETL from and to the same database should not be a problem, if source and target are different tables and the database server has sufficient capacity for the number of connections being requested.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
If you need to, can you load a temporary table in your target, your source, or another database?
You can run straight through, of course, if there isn't anything in your job that runs out of resources. If there is, you may have to find a workaround (like the temp table).
You can also change your extract to get the data in pieces, and run the job a few times with different WHERE clauses.
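The "pieces with different WHERE clauses" idea can be sketched as a mod-based partition on a numeric key, so each run processes one slice and the slices together cover every row exactly once. This is illustrative only — the table, column, and chunk count below are invented:

```python
import sqlite3

# Hypothetical sketch: split one big extract into N passes using a
# mod-based WHERE clause, so each run handles only a slice of the rows.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src (id INTEGER, val TEXT)")
cur.executemany("INSERT INTO src VALUES (?, ?)",
                [(i, f"row{i}") for i in range(10)])

N_CHUNKS = 3  # run the job N_CHUNKS times, once per slice
seen = []
for chunk in range(N_CHUNKS):
    # Each pass selects only rows where id % N_CHUNKS == chunk.
    rows = cur.execute(
        "SELECT id, val FROM src WHERE id % ? = ?", (N_CHUNKS, chunk)
    ).fetchall()
    seen.extend(rows)

print(len(seen))  # → 10, every row processed exactly once across passes
```

Ranges on a date or key column work just as well as a modulus; the point is only that the predicates must partition the data with no overlap and no gaps.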