Commit clarification

Post questions here relating to DataStage Enterprise/PX Edition, for areas such as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

vamsi_4a6
Participant
Posts: 95
Joined: Wed Jun 04, 2014 12:06 am

Commit clarification

Post by vamsi_4a6 »

I am using DB2 as the target, the volume of data is large, and the target is in Append mode.
I need to commit only when all records have been successfully read from the source and written to the target.
The problem I am facing: assume the source has 10 lakh records and 9 lakh of them have already been written to the target. If there is a problem with record 9 lakh + 1 and the job aborts, then when I restart the job from the sequence those 9 lakh records are inserted again, which results in duplicates.
I think I need to set some property on the target stage, but I am not sure which one.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I could guess but would rather not... clarify for us exactly what target stage you are using.
-craig

"You can never have too many knives" -- Logan Nine Fingers
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Well, heck... gonna go out on a limb and say it's the DB2 Connector. From the documentation:

Auto commit mode
Specifies whether the connector commits transactions manually or allows the database to commit transactions automatically at its discretion. Choose one of the following:

Off
Set this property to Off for the connector to commit transactions manually after processing the number of records specified by the Record count property.
On
Set this property to On for the database to commit transactions automatically at its discretion.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Having switched off Auto Commit mode, you also need to specify, in the Record count property, the number of rows that constitute a transaction. The value 0 (zero) means "all rows".

Note, however, that you will need to work closely with your DB2 DBA to make sure that the database can cope with a transaction of this size.

If it can't, then your next best bet is to load into a staging table, which you can truncate if not all rows were successfully inserted. Only once all rows have been inserted would you move the rows from the staging table to the real table.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
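For completeness, a minimal sketch of the staging-table fallback described above, again in Python against a generic DB-API style connection. STAGE_TBL and TARGET_TBL are hypothetical names, and in a real DataStage setup the load step would of course be the parallel job writing to the staging table rather than Python code; the sketch only shows the promote-after-complete-load pattern.

# Hypothetical sketch of the staging-table approach: load everything into
# STAGE_TBL first, and only promote to TARGET_TBL once the full load succeeded.

def promote_via_staging(conn, rows):
    cur = conn.cursor()
    try:
        # 1. Start from an empty staging table (a real job would truncate it).
        cur.execute("DELETE FROM STAGE_TBL")
        conn.commit()

        # 2. Load everything into staging. Intermediate commits are harmless
        #    here, because a failed run just clears staging and starts over.
        cur.executemany("INSERT INTO STAGE_TBL (COL1, COL2) VALUES (?, ?)", rows)
        conn.commit()

        # 3. Only after the complete load, move the rows to the real table
        #    in one transaction.
        cur.execute("INSERT INTO TARGET_TBL SELECT * FROM STAGE_TBL")
        conn.commit()
    except Exception:
        conn.rollback()
        raise
    finally:
        cur.close()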
vamsi_4a6
Participant
Posts: 95
Joined: Wed Jun 04, 2014 12:06 am

Post by vamsi_4a6 »

chulett wrote:I could guess but would rather not... clarify for us exactly what target stage you are using.
I am using the DB2 Connector.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Riiiiiight... you did see that both of the actual answers posted so far assumed that, correct? Did you get this figured out, have a resolution you can post?
-craig

"You can never have too many knives" -- Logan Nine Fingers