SQL Datatype issues (LongNVarChar and Bit)

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

wteater
Participant
Posts: 5
Joined: Mon Jul 23, 2018 10:17 am

SQL Datatype issues (LongNVarChar and Bit)

Post by wteater »

Hello,
I am somewhat new to DataStage and am running into an issue. I have not been able to find any helpful information about it, so I wanted to post here to see what kind of help I can get.
I have a job that reads from a SQL Server table. Some of the datatypes on the source table are causing the job to fail (that much I could find online): LongNVarchar and BIT. If I copy the table and change the copied table's datatypes to something more usable (e.g., char and int), the job works fine. Unfortunately, that is not an option, as an external application depends on this table keeping these datatypes.

My questions are: how can I write a DS job that uses these columns and writes the data to another table and/or a CSV file? Can I do some sort of conversion on the columns with these datatypes (for example, to_char() or something similar)? What is the best way to resolve the datatype issue so that I can move forward?
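For example, would converting in the source SELECT work, something like this (just a sketch; the column names here are invented)?

SELECT CONVERT(NVARCHAR(4000), LongTextCol) AS LongTextCol,
       CAST(BitCol AS INT) AS BitCol
FROM dbo.SourceTable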
Thank you
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Can you post the errors you are seeing in the job's log?
-craig

"You can never have too many knives" -- Logan Nine Fingers
wteater
Participant
Posts: 5
Joined: Mon Jul 23, 2018 10:17 am

Post by wteater »

Here are the error messages that I am seeing after attempting to run the DS job:

source,0: The array size must be set to 1 when reading LOBs (CC_OdbcDBRecordDataSetProducer::dbsAllocateBindBuffers, file CC_OdbcDBRecordDataSetProducer.cpp, line 587)

source,0: The runLocally() of the operator failed.

source,0: Operator terminated abnormally: runLocally() did not return APT_StatusOk

node_node1: Player 1 terminated unexpectedly.

main_program: APT_PMsectionLeader(1, node1), player 1 - Unexpected termination by Unix signal 15(SIGTERM).


Thanks for your help!
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

So, the ODBC Connector. Sorry, but I have to ask: did you try setting the Array Size to 1? Is there only a single LOB column in the table? I also believe you'll need to move the LOB field to the end of the metadata, but others can confirm that.
-craig

"You can never have too many knives" -- Logan Nine Fingers
wteater
Participant
Posts: 5
Joined: Mon Jul 23, 2018 10:17 am

Post by wteater »

Hi,

Unfortunately, I do not have access to the server to check that specific ODBC connection. I have checked the ODBC connection that I configured on my PC, and I do not see any "array size" setting. Would this setting only exist on the server's ODBC connection?

Thanks
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Unfortunately, I haven't had access to the product in years, having been using a different tool lately. I think it should be either a property in the connector itself or perhaps an APT environment variable?
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

It's a property of the Connector stage.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
wteater
Participant
Posts: 5
Joined: Mon Jul 23, 2018 10:17 am

Post by wteater »

Thank you all for your responses.

I found where to change the array size: it is in the ODBC Connector within the DS job. I have a lookup table and a target table in my job, so each needed to be updated with an array size of 1. I am now able to insert new records, but when I change the job's write action to update existing rows, then insert new rows, the job fails with the error messages provided previously. Why is this?
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I can't really answer that with any authority, but then I don't generally advocate the use of those "dual action" write modes. If you are doing a lookup, don't you already know whether a given row should be an update versus an insert? In that case, direct the result to two separate targets, one for each action, as sketched below.
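For instance (a sketch only; the table and column names are invented, and the ORCHESTRATE placeholders assume user-defined SQL in the ODBC Connector), each target would carry a single-action statement:

INSERT INTO dbo.Target (KeyCol, TextCol)
VALUES (ORCHESTRATE.KeyCol, ORCHESTRATE.TextCol)

UPDATE dbo.Target
SET TextCol = ORCHESTRATE.TextCol
WHERE KeyCol = ORCHESTRATE.KeyCol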
-craig

"You can never have too many knives" -- Logan Nine Fingers
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany

Post by ArndW »

One approach might be to create a view of your table using "Standard" datatypes and work with the view.
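For example (a sketch; the view, table, and column names are invented), on SQL Server that might look like:

CREATE VIEW dbo.MyTable_ForDS AS
SELECT KeyCol,
       CAST(LongTextCol AS NVARCHAR(4000)) AS LongTextCol, -- expose LongNVarchar as a bounded nvarchar; truncates past 4000
       CAST(BitCol AS INT) AS BitCol                       -- expose BIT as int
FROM dbo.MyTable

The DataStage job would then read from dbo.MyTable_ForDS instead of the base table, and the external application keeps its original datatypes.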
wteater
Participant
Posts: 5
Joined: Mon Jul 23, 2018 10:17 am

Post by wteater »

Thank you all for your responses!

I am trying a different approach to test a few things out.
I have been successful in loading the contents of a CSV file through a Transformer stage to an ODBC Connector target. All columns insert as expected, except for the date information in the CSV file.
The date information in the CSV file is in the following format: yyyy-mm-dd hh:mm:ss. I have tried changing the target table (SQL Server) to use datetime and timestamp datatypes, but no luck. The following is an example of the logic that I have tried.

StringToTimestamp(DSLink4.CreateDate,"%yyyy-%mm-%dd %hh:%nn:%ss.3")
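Since the incoming values carry no fractional seconds, I am also wondering whether the ".3" microsecond extension in that mask is the problem, and whether this variant (untested) would match:

StringToTimestamp(DSLink4.CreateDate, "%yyyy-%mm-%dd %hh:%nn:%ss")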

How does one take incoming date information from a CSV file and load it into a SQL Server table with a datetime datatype? I have been racking my brain and searching Google, but no luck yet.

Thanks!