DataStage Bulk Load to Oracle DB - JSON data
Posted: Tue Sep 04, 2018 1:37 pm
Hi All -
I am facing an issue with a DataStage job that reads data (including fields containing JSON, defined as a BLOB datatype in the Oracle table) and performs a bulk load into the Oracle table. Note that this table has no constraints defined, so ideally the load should run faster.
The job takes 15 minutes to load 5 million records. The commit count/array size gets defaulted to 1 because of the LongVarBinary field. Can someone please suggest a better way to handle this scenario so that the loads run faster?
Flow:
Sequential File ----> Oracle Load (using Oracle Connector)
JSON data ---> Defined as LongVarBinary in DataStage
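
For what it's worth, here is a minimal standalone sketch (Python with the oracledb driver, not DataStage itself) of the array-binding effect I mean. The connection details and the json_stage table are made-up placeholders; the point is just that batching rows into one round trip is what an array size greater than 1 buys you, and what the LOB column is currently preventing:

import oracledb

# Sample payloads; in the real job these come from the sequential file.
rows = [(i, b'{"id": %d}' % i) for i in range(10_000)]

with oracledb.connect(user="scott", password="tiger",
                      dsn="dbhost/orclpdb1") as conn:
    with conn.cursor() as cur:
        # executemany() ships the whole batch in one round trip,
        # roughly what an array size greater than 1 does in the
        # Oracle Connector, instead of the forced row-by-row inserts.
        cur.executemany(
            "INSERT INTO json_stage (id, payload) VALUES (:1, :2)",
            rows,
        )
    conn.commit()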
Thanks
Freddie