Hi
We have a very high volume of source data, around 20 million records. How can we effectively load the target table?
Is splitting the files the only option? Can we use loops as an alternative?
Thanks
Split files in DataStage
High volume is relative ... we have loaded more than 5x that; it just takes time. In other words, you are not pushing any limits of the DataStage tool if your environment/server can handle it.
Did you configure the database connector load stage for a bulk load instead of insert mode?
What kind of DB is the target?
Loops work, splitting up the work can do it, and there may be other system-specific tricks. What do you need from it? A long-running large load isn't always bad if you don't mind waiting for it.
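If you do go the file-splitting route, a minimal sketch on a Unix server would be to cut the extract into fixed-size chunks with the standard `split` utility and feed each chunk to a separate job invocation. File names here are hypothetical, and the row count is shrunk for illustration:

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Simulate a large source extract (only 1000 rows here for illustration;
# yours would be ~20 million)
seq 1 1000 > source_extract.txt

# Split into chunks of 250 lines each: chunk_aa, chunk_ab, chunk_ac, chunk_ad
split -l 250 source_extract.txt chunk_

# Each chunk could then drive a separate (multi-instance) job run,
# e.g. one dsjob invocation per chunk file
ls chunk_*
```

Whether this actually helps depends on where the bottleneck is; if the target database load is the slow part, parallel chunks mostly shift contention rather than remove it.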