
Basic Transformer Numeric datatype

Posted: Fri Sep 30, 2016 1:54 pm
by pavankvk
We have some server jobs migrated from the 7.5 days, now running on 11.5.0.1. These jobs were using the DRS plugin, which has since been replaced by the DRS connector.

This actually exposed a bug(?) in the BASIC Transformer stage.

Essentially, we have two Numeric fields defined as (19,8). These are multiplied and the result is stored in another Numeric (19,8) field:


365.76000000*0.75357800*0.8

This results in 220.5029514240000000, and when it is written to a Numeric (19,8) field we expect the transformer to convert it to 220.50295142. However, until the value is actually written to a file or a target database, the transformer does not seem to convert it.
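For illustration, the same arithmetic can be reproduced with exact decimal types (a Python sketch, not DataStage code) to show the full-precision intermediate value and the rounding a (19,8) field implies:

```python
from decimal import Decimal, ROUND_HALF_UP

# Exact decimal arithmetic keeps every digit of the product.
result = Decimal("365.76000000") * Decimal("0.75357800") * Decimal("0.8")

# Rounding to scale 8 is the conversion a Numeric (19,8) target expects.
scaled = result.quantize(Decimal("0.00000001"), rounding=ROUND_HALF_UP)
print(scaled)  # 220.50295142
```

The point of contention in the thread is that this rounding step happens only at a typed boundary, not inside the transformer itself.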

When we write to a file the value is 220.50295142, but when we write to the target we get this error from the ODBC driver: "String data, right truncated".

If we write to a file and then read that file back to load the same table using the same driver, it works fine.

Is this an old bug? Please let me know.

Posted: Fri Sep 30, 2016 2:47 pm
by Mike
Bug? No. That is just the nature of server jobs.

There is no strong typing like you get in parallel jobs.

Everything in a server job is essentially a string. Conversion to and from a number is pretty fluid, and lengths don't really matter.

Add a number to a string:

Code: Select all

"1" + 0
Concatenate a number with a string:

Code: Select all

123 : "ABC"
DataStage BASIC is just fine doing that.

Data type and length restrictions only come into play when using a source or target stage that has stronger typing.
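To make that concrete, here is a hypothetical sketch (in Python, not DataStage BASIC) of the kind of check a typed target performs. The function name and rule are assumptions for illustration only: a value fits NUMERIC(19,8) when it has at most 8 fractional digits and at most 19 digits overall, which is why the unrounded string fails but the rounded one succeeds.

```python
def fits_numeric_19_8(s: str) -> bool:
    # Sketch of a driver-side NUMERIC(19,8) check (assumption, not the
    # actual ODBC driver logic): scale <= 8 and total precision <= 19.
    whole, _, frac = s.partition(".")
    return len(frac) <= 8 and len(whole.lstrip("-")) + len(frac) <= 19

print(fits_numeric_19_8("220.5029514240000000"))  # False -> truncation error
print(fits_numeric_19_8("220.50295142"))          # True  -> loads fine
```

Inside the server job nothing ever runs this check, so the oversized string travels through the transformer untouched.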

Mike

Posted: Fri Sep 30, 2016 3:52 pm
by chulett
Right. I don't recall where it is stored (uvconfig?) but the default maximum precision is 15, so that is probably coming into play here. It is not something you want to increase without making sure you understand all the ramifications of doing so.

You can also look into the "string math" functions available in a Server job for doing math on those: SADD, SSUB, etc.
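The idea behind those functions is that the operands and the result are decimal strings, so the math never passes through a limited-precision numeric type. A rough Python analogue (an illustration of the concept, not the actual SMUL implementation) might look like this:

```python
from decimal import Decimal

def smul(a: str, b: str) -> str:
    """Loose analogue of DataStage BASIC's SMUL string-multiply:
    decimal strings in, decimal string out, no float precision loss."""
    return str(Decimal(a) * Decimal(b))

print(smul("365.76", "0.753578"))  # 275.62868928
```

Chaining these keeps every digit exact until you deliberately round for the target column.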

Posted: Fri Sep 30, 2016 8:08 pm
by pavankvk
Thanks for your responses. At this point we will just offload the data to a hashed file to enforce data typing, and then load the database by reading it back in the same job. Looks good so far.