I have a rather simple job pulling some data from an Excel file and placing it in a table. No real transformations are done to the data, short of trimming leading and trailing spaces. The column definitions coming in from the spreadsheet are varchar(256), and those going out to my table are also varchar(256). The issue is that when the job runs, it attempts to create the table with varchar(64000) fields, which then leads to an error that the table exceeds the 65535-byte internal limit. Full error here:
STAGE_GEOGRAPHIC_AREA_HIERARCHY: Failed to execute SQL statement: CREATE TABLE STAGE_GEOGRAPHIC_AREA_HIERARCHY (Population_Timestamp TIMESTAMP NOT NULL, PROCESSED_LAST_DATE DATE, Country_Short_Code VARCHAR(64000), Zip_Code VARCHAR(64000), Ttype VARCHAR(64000), Primary_City VARCHAR(64000), State_Code VARCHAR(64000), County_Name VARCHAR(64000), ISO_Code VARCHAR(64000), Country_Name VARCHAR(64000), State_Abbreviation VARCHAR(64000), State_Name VARCHAR(64000)) DISTRIBUTE ON RANDOM. Reason: [SQLCODE=HY000][Native=46] ERROR: Table 'STAGE_GEOGRAPHIC_AREA_HIERARCHY' record size 640012 exceeds internal limit of 65535 bytes (CC_NZConnectionRep::executeDirect, file CC_NZConnectionRep.cpp, line 201)

The job's OSH can be reviewed here: https://github.com/jacksoneyton/DataSta ... master/OSH
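For comparison, here is roughly the DDL I expected the connector to generate, with the column names taken from the error above and the VARCHAR lengths set to the 256 defined in the job (a sketch of what I expected, not the actual generated statement):

```sql
-- Expected DDL: same columns as the failing CREATE TABLE, but with the
-- VARCHAR(256) lengths defined in the job instead of VARCHAR(64000).
-- At 256 bytes per varchar column, the record size stays well under
-- the 65535-byte internal limit reported in the error.
CREATE TABLE STAGE_GEOGRAPHIC_AREA_HIERARCHY (
    Population_Timestamp TIMESTAMP NOT NULL,
    PROCESSED_LAST_DATE  DATE,
    Country_Short_Code   VARCHAR(256),
    Zip_Code             VARCHAR(256),
    Ttype                VARCHAR(256),
    Primary_City         VARCHAR(256),
    State_Code           VARCHAR(256),
    County_Name          VARCHAR(256),
    ISO_Code             VARCHAR(256),
    Country_Name         VARCHAR(256),
    State_Abbreviation   VARCHAR(256),
    State_Name           VARCHAR(256)
) DISTRIBUTE ON RANDOM;
```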
I don't understand why it's deciding that the fields are varchar(64000), so I'm unsure how to resolve this. Any advice would be greatly appreciated, thanks!