I am using DB2 as the target and the volume of data is large. The target is in Append mode.
I need to commit only once all records have been successfully read from the source.
The problem I am facing: suppose there are 10 lakh (1,000,000) records in the source and 9 lakh of them are written to the target successfully. If there is a problem with record 9,00,001 and the job aborts, then when I restart the job from the sequence, those 9 lakh records are inserted again, which results in duplicates.
I think I need to set some property on the target stage, but I am not sure which one.
Commit clarification
Well, heck... gonna go out on a limb and say it's the DB2 Connector. From the documentation:
Auto commit mode
Specifies whether the connector commits transactions manually or allows the database to commit transactions automatically at its discretion. Choose one of the following:
Off Set this property to Off for the connector to commit transactions manually after processing the number of records specified by the Record count property.
On Set this property to On for the database to commit transactions automatically at its discretion.
-craig
"You can never have too many knives" -- Logan Nine Fingers
Having switched off Auto Commit mode, you also need to specify the number of rows that constitute a transaction. The value 0 (zero) means "all rows".
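Putting the two properties together, the relevant part of the DB2 Connector stage would look roughly like this (the property names come from the documentation quoted above; the layout and comments are illustrative and may differ slightly between versions):

    Auto-commit mode = Off    -- connector commits manually
    Record count     = 0      -- 0 = a single commit covering all rows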
Note, however, that you will need to work closely with your DB2 DBA to make sure that the database can cope with a transaction of this size.
If it can't, then your next best bet is to load into a staging table, which you can truncate if not all rows were successfully inserted. Only once all rows have been inserted would you move the rows from the staging table to the real table.
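The staging-table pattern can be sketched like this (Python with SQLite purely to illustrate the logic; table names and columns are placeholders — the real job would do this through the DB2 Connector and DB2 SQL, where the truncate would be `TRUNCATE TABLE ... IMMEDIATE`):

```python
import sqlite3

def load_with_staging(conn, rows):
    """All-or-nothing load: stage every row first, then promote them
    to the real table in one transaction. On any failure the staging
    table is emptied and nothing reaches the target table."""
    cur = conn.cursor()
    cur.execute("DELETE FROM stg_orders")          # truncate staging
    try:
        cur.executemany("INSERT INTO stg_orders VALUES (?, ?)", rows)
    except Exception:
        conn.rollback()
        cur.execute("DELETE FROM stg_orders")      # leave staging clean
        conn.commit()
        raise
    # Every row staged successfully -- move them in a single transaction.
    cur.execute("INSERT INTO orders SELECT * FROM stg_orders")
    cur.execute("DELETE FROM stg_orders")
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (id INTEGER, amt REAL)")
conn.execute("CREATE TABLE orders (id INTEGER, amt REAL)")
load_with_staging(conn, [(1, 9.5), (2, 3.0)])
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # prints 2
```

A restart after an abort then simply re-runs the load from the top: the staging table is cleared first, so no duplicates can reach the real table.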
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.