I am trying to design a job that does the following.
1) Take a source flat file (approx. 1-2 million records) and split it into two files (file1, file2) based on category.
2) The key columns are the same in both files. Load the file1 records into table1.
File2 has a variable number of comma-delimited fields (up to 20) plus the key columns.
I have to convert each non-key column into a row and then load the data into table2.
The table1-to-table2 relationship is 1 to 0..* (zero-to-many).
What am I trying to do?
Since I have a large volume of data in the database, I don't want to look it up in the database at runtime or load a huge volume into a hashed file; that way I can reduce the load and lookup time.
Instead: load the data into table1 first and, if there are any failures, write them to a hashed file FILE-A (possibly with duplicates, i.e. load all master records first).
Then read each file2 record, look it up in FILE-A and, if it is not found, insert the record into table2 (detail records).
Is this the right way to load the data? Am I supposed to load a master and its corresponding details at once?
Suggest a better design.
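Outside of DataStage, the column-to-row conversion in step 2 can be sketched in Python. This is only an illustration of the pivot logic, not the job itself; the two-key-column layout and the sequence-number output column are assumptions:

```python
import csv

def pivot_detail_rows(lines, n_key_cols=2):
    """Turn each record of (key cols + up to 20 value cols) into
    one output row per non-empty value column: (keys..., seq, value)."""
    out = []
    for rec in csv.reader(lines):
        keys, values = rec[:n_key_cols], rec[n_key_cols:]
        for seq, val in enumerate(values, start=1):
            if val != "":  # variable field count: skip empty trailing fields
                out.append(keys + [str(seq), val])
    return out

# "K1,K2,a,b,c" becomes three detail rows, one per value column
rows = pivot_detail_rows(["K1,K2,a,b,c", "K9,K8,x"])
print(rows)
```

Each detail row keeps the key columns, so it can be matched back to its master record in table1.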
Re: Suggest better design
Hi,
I think we have met this scenario. Instead of converting rows into columns and transforming:
Take <F1> and <F2>.
For <F2>, use a Transformer and split it into different categories (e.g. 10 categories).
Join all 10 categories with <F1>.
That gives you the result.
Regards,
Manoj
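The split-by-category-then-join idea above can be sketched in Python (the record layouts — F1 as (key, master_data) and F2 as (key, category, detail) — are assumptions for illustration):

```python
from collections import defaultdict

def split_and_join(f1_rows, f2_rows):
    """Split F2 rows by category, then join each category's rows
    back to F1 on the key column."""
    by_cat = defaultdict(list)
    for key, cat, detail in f2_rows:
        by_cat[cat].append((key, detail))
    masters = dict(f1_rows)  # key -> master_data
    joined = {}
    for cat, rows in by_cat.items():
        # inner join: keep only detail rows whose key exists in F1
        joined[cat] = [(k, masters[k], d) for k, d in rows if k in masters]
    return joined

result = split_and_join([("1", "M1")], [("1", "A", "d1"), ("2", "A", "d2")])
```

Here the detail row with key "2" is dropped because it has no master in F1, which mirrors a join stage per category.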
In terms of the original poster's final question, you can control a "grouped transaction" in an ODBC stage. As to the rest, please provide an English description of what you are trying to achieve.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I was inserting the records into the parent table first and collecting the rejected records in a hashed file. While inserting the child-table records, I look up the rejected records in the hashed file, and the corresponding child records are eliminated from the insertion.
Moreover, the parent and child records are not in a single transaction.
Hari
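The reject-filter pattern described in this thread can be sketched in Python. A set stands in for the hashed file of rejected keys, and `insert_ok` is a hypothetical stand-in for the database insert; both are assumptions for illustration:

```python
def load_parent(parent_rows, insert_ok):
    """Try each parent insert; collect the keys of rejects
    (this set plays the role of hashed file FILE-A)."""
    rejects = set()
    for key, data in parent_rows:
        if not insert_ok(key, data):
            rejects.add(key)
    return rejects

def load_children(child_rows, rejects):
    """Keep only child rows whose parent key is NOT in the reject file."""
    return [row for row in child_rows if row[0] not in rejects]

# parent "2" fails its insert, so its child row is eliminated
rejects = load_parent([("1", "a"), ("2", "b")], lambda k, d: k != "2")
kept = load_children([("1", "x"), ("2", "y")], rejects)
```

Because only the rejects are kept in the lookup, the hashed file stays small even when the master load is 1-2 million records, which is the point of the design.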