Hi DataStage folks,
I have a job that reads from an Oracle table called "Customer_Data"
and inserts the records into a UniData table. After each record has
been inserted into the UniData table, the job updates a flag field in
the source "Customer_Data" table in Oracle. The flag indicates that
the record has already been fetched and inserted into the UniData table.
My problem is with the update to the Oracle stage: the job takes more
than 30 minutes to finish for 2,000 records if I set the array size to
1 and the transaction size to 1. If I set the array size to 3,000 and
the transaction size to 6,000, the job finishes in roughly 5 minutes.
Obviously the second option gives far better performance.
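For anyone wondering why the two settings differ so much, here is a minimal sketch of what array size and transaction size do. It uses Python's sqlite3 purely as a stand-in for the Oracle stage (the table name and row counts are taken from my job; everything else is illustrative, not DataStage internals):

```python
import sqlite3

def setup():
    # Illustrative stand-in for the Oracle "Customer_Data" source table.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE customer_data (id INTEGER PRIMARY KEY, flag TEXT)")
    con.executemany("INSERT INTO customer_data (id, flag) VALUES (?, 'N')",
                    [(i,) for i in range(2000)])
    con.commit()
    return con

def update_flags(con, ids, array_size, transaction_size):
    """Send updates in batches of `array_size` rows and commit every
    `transaction_size` rows -- mirroring the two stage properties.
    array_size=1, transaction_size=1 means one round trip and one
    commit per row, which is why that setting is so slow."""
    done = 0
    for start in range(0, len(ids), array_size):
        batch = [(i,) for i in ids[start:start + array_size]]
        con.executemany("UPDATE customer_data SET flag = 'Y' WHERE id = ?", batch)
        done += len(batch)
        if done % transaction_size == 0:
            con.commit()   # one commit per transaction_size rows
    con.commit()           # commit any tail smaller than transaction_size

con = setup()
update_flags(con, list(range(2000)), array_size=3000, transaction_size=6000)
flagged = con.execute(
    "SELECT COUNT(*) FROM customer_data WHERE flag = 'Y'").fetchone()[0]
print(flagged)  # 2000
```

With array size 3,000 the whole 2,000-row run fits in a single batch and a single commit, instead of 2,000 of each.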
However, my concern is that with array size 3,000 and transaction size
6,000, the flag update is not committed to Oracle immediately after a
record is inserted into the UniData table. If the job fails after, say,
100 records have been inserted into UniData, those 100 flag updates may
not have been committed to Oracle, because the stage normally waits
until 6,000 rows have accumulated before committing the transaction.
The records inserted into UniData would then be out of sync with the
Oracle table, whose flag field is supposed to show that they have been
fetched. So I am concerned about data integrity.
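The failure scenario I am worried about can be sketched the same way. Again this is only an illustration with sqlite3 standing in for Oracle (the 100-row failure point matches my example above; the error and table names are made up):

```python
import sqlite3

# Simulate the job dying after 100 flag updates, before the
# transaction size (6,000 rows) is ever reached: every update since
# the last commit is rolled back, even though the matching UniData
# inserts have already happened on the other side.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customer_data (id INTEGER PRIMARY KEY, flag TEXT DEFAULT 'N')")
con.executemany("INSERT INTO customer_data (id) VALUES (?)",
                [(i,) for i in range(100)])
con.commit()

try:
    for i in range(100):
        con.execute("UPDATE customer_data SET flag = 'Y' WHERE id = ?", (i,))
        if i == 99:  # hypothetical mid-run failure, still short of 6,000 rows
            raise RuntimeError("job aborted")
    con.commit()
except RuntimeError:
    con.rollback()   # an uncommitted Oracle transaction is discarded the same way

flagged = con.execute(
    "SELECT COUNT(*) FROM customer_data WHERE flag = 'Y'").fetchone()[0]
print(flagged)  # 0 -- none of the 100 updates survived the failure
```

So after a crash, none of the flags are set in Oracle even though the rows exist in UniData, which is exactly the inconsistency I described.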
So I am in a dilemma between performance and data consistency. This
job is supposed to run at 15-minute intervals.
Could anybody suggest how to handle this problem?
Regards,
Melvin
Update to Oracle problem
Archive of postings to DataStageUsers@Oliver.com. This forum intended only as a reference and cannot be posted to.