Loading large record into Teradata table
Moderators: chulett, rschirm, roy
-
- Premium Member
- Posts: 236
- Joined: Sun Apr 01, 2007 7:41 am
- Location: Michigan
Loading large record into Teradata table
Hello,
I am trying to load data into a Teradata table from a file. One field has a size of 90K and is being loaded into a column defined as CLOB. I have defined it as LongVarChar in DataStage. I am able to load records smaller than 64K, but it fails for records larger than 64K.
The job aborts with the following error:
The message's length (91669) exceeds the limit (65466) and it can't be broken at newlines; the message is truncated.
thank you
Thank you for the response.
Following is the error:
TD_etl_unq,0: Field col contains a value that has a byte length of 91,456 which exceeds the allocated length of 64,000 (the field value is '{"ptyLocRtf":[{"actnCd":"I","locUsgId":"0","addrSeqNbr":"453564706","callInd":"Y","policy":null,"persId":null,"addrTyp":null,"ptyLocEffDt":"2003-06-01","ptyLocEndDt":"9999-12-31","ptyKeyTyp":"MEMBER","ptySrcKey":"IL1^000055^296099546^....
The log prints the entire value of the field and ends with ...[Message split, exceeds 64000 characters]...
Next, the log shows:
TD_etl_rtf_unq,0: RDBMS code 3514: User-generated transaction ABORT. SQL statement: ROLLBACK
TD_etl_rtf_unq,0: Number of rows inserted: 0
buffer(0),0: Fatal Error: APT_BufferOperator::writeAllData() write failed. This is probably due to a downstream operator failure.
You haven't specified your target stage - is it a Teradata Connector or one of the legacy stages? This may help in either case. If you are going to need more help, you'll need to provide more details about the stage and settings / properties you are using on the target side.
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
1. I am using the Teradata Connector stage.
2. I am using the Immediate option with an array size of 1; the Bulk option is not working.
3. Write mode: Insert
4. Table action: Append
5. Record count: 100
6. Enable LOB references: No
7. The target column that has the issue is defined as CLOB. I am able to insert fields smaller than 64K but not larger. The same insert works from a BTEQ script but fails when run through DataStage.
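As a sanity check on the source side, a quick scan of the input file can confirm exactly which records exceed the 64,000-byte limit reported in the job log. A minimal sketch (the file name `source_file.dat` is a placeholder; `awk length` counts bytes in a C locale):

```shell
# Demo input standing in for the real source file:
# one short record and one 70,000-byte record.
printf 'short\n%070000d\n' 0 > source_file.dat

# Print the record number and byte length of every record
# larger than the 64,000-byte limit from the job log.
awk 'length($0) > 64000 { print NR ": " length($0) " bytes" }' source_file.dat
```

Running this against the real file before the job starts makes it easy to tell whether a failure is data-driven or a stage-configuration issue.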
thanks
-
- Premium Member
- Posts: 1735
- Joined: Thu Mar 01, 2007 5:44 am
- Location: Troy, MI
The Bulk method only supports up to 64 KB in an array; if a single record exceeds that, you may not be able to use Bulk at all.
This link should help you.
Priyadarshi Kunal
Genius may have its limitations, but stupidity is not thus handicapped.
Thanks for the reply.
1. Yes, I am aware of the limitation in Bulk mode.
2. Yes, I did set the APT_DEFAULT_TRANSPORT_BLOCK_SIZE environment variable, with a value of 1280000.
3. I also set APT_TSORT_STRESS_BLOCKSIZE to 3000000, as per the IBM link below.
4. On the buffering side I tried:
a) Default
b) Auto buffer: 6145728, buffer free run 95%, disk increment 2 MB
c) Buffer: 6145728, buffer free run 95%, disk increment 2 MB
It still fails with the same error.
http://www-01.ibm.com/support/docview.w ... wg21660153
thanks
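For reference, the environment variables from points 2 and 3 might be set in the engine environment (for example in `dsenv`; the exact file location depends on the install) roughly like this, using the values quoted above:

```shell
# Parallel-engine transport block size, in bytes (value tried above).
export APT_DEFAULT_TRANSPORT_BLOCK_SIZE=1280000

# Tsort block size, in bytes, per the IBM technote linked above.
export APT_TSORT_STRESS_BLOCKSIZE=3000000
```

These only affect the parallel engine's internal buffering; as noted in the next reply, the connector's own buffer limit is a separate stage property.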
there should be "max buffer size" row in teradata connector properties under limits section, which is by default set to 0 to use default buffer size of 64KB. You will have to change that to push the record larger than 64kb.
Priyadarshi Kunal
Genius may have its limitations, but stupidity is not thus handicapped.