Bad or Wrong data type

devsonali
Premium Member
Posts: 155
Joined: Sun Mar 18, 2007 2:31 pm

Bad or Wrong data type

Post by devsonali »

Hello all,

I have two data sources (Oracle Connector stages); the job extracts from both, combines them with a simple Funnel, and writes the result to a dataset.

I see the following error:
The OCI function OCIStmtFetch returned status -1. Error code: 1,722, Error message: ORA-01722:

To debug further, I designed a simple job that extracts data from the problematic source (above) and passes it through a Transformer to a Peek stage.

It gives this error:
Fatal Error: APT_Decimal::assignFromString: invalid format for the source string. Expecting decimal separator or digit got 'X'

This gives me a fair idea that it is getting a character 'X' where a decimal is expected (at least that's what it sounds like to me).

Now, this table is huge, both in rows and in decimal columns. My question is: how can I narrow this down to the problematic column?

I was not sure what to search for with this type of query, so I did not spend much time searching on the error text (since I already know what the problem is).

Any help is greatly appreciated.
Thanks
pavi
Premium Member
Posts: 34
Joined: Mon Jun 03, 2013 2:34 pm

Post by pavi »

Which method did you use to import the metadata (table definition)? Try importing it via the Connector Import Wizard or as an orchestrate table definition; that will rule out any data type compatibility issues.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

ORA-01722 is an "invalid number" error. Depending on what your source SQL looks like, Oracle can throw that error even if your metadata for the result is correct. Any chance you are joining tables in the connectors and doing an implicit cast of types in the join? Just wanted to rule that out before you went too much further down the rabbit hole.
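For illustration only (the table and column names here are made up), a join like this forces an implicit conversion on the character side, and ORA-01722 appears as soon as Oracle hits a non-numeric row:

Code: Select all

-- hypothetical tables: ORDERS.CUST_ID is VARCHAR2, CUSTOMERS.CUST_ID is NUMBER
select o.*
  from ORDERS o
  join CUSTOMERS c
    on o.CUST_ID = c.CUST_ID  -- Oracle implicitly applies TO_NUMBER() to the VARCHAR2 side;
                              -- any non-numeric value raises ORA-01722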

Also note that the decimal values in the source tables themselves cannot be invalid, as Oracle would not store (for example) an 'X' in a NUMBER column. So look for mismatches between your source and the metadata used in the job; somewhere a VARCHAR2 value is ending up in a field the job treats as a number.
-craig

"You can never have too many knives" -- Logan Nine Fingers
devsonali
Premium Member
Posts: 155
Joined: Sun Mar 18, 2007 2:31 pm

Post by devsonali »

I think I did miss one fact here.

My target reads one of these varchar columns as a number. The source data type for a set of columns is VARCHAR2 while the target is DECIMAL (NUMBER), so when I read these columns I CAST them to DECIMAL in the source SQL so that I do not have to do it in the transformation logic. And no, there are no joins as such.

Now, as you rightly pointed out, somewhere a varchar value is being read as a decimal (as part of the CAST) and is probably throwing the error. I would like to see which record is doing this, i.e. which record holds a character value that I am trying to cast to decimal. I was trying to find out whether there is a way to identify that record, since DataStage does not give it to me.
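To make that concrete, the source SQL is doing something along these lines (the column and table names below are just placeholders):

Code: Select all

-- SRC_TABLE.AMOUNT_TXT is VARCHAR2; the job metadata expects a Decimal
select CAST(AMOUNT_TXT as NUMBER) as AMOUNT
  from SRC_TABLE
-- a row holding a value such as 'X' makes the CAST fail with ORA-01722,
-- and Oracle does not say which row or column it was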
devsonali
Premium Member
Posts: 155
Joined: Sun Mar 18, 2007 2:31 pm

Post by devsonali »

pavi wrote:Which method did you use to import the metadata (table definition)? Try importing it via the Connector Import Wizard or as an orchestrate table definition; that will rule out any data type compatibility issues.
Yes, I am using the connector wizard.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Well... one way would be to do the conversion inside the job. Part of that conversion could be an IsValid check so that you can easily identify and handle any values that will not convert properly.
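As a rough sketch (the link and column names are invented), the Transformer could send anything that fails the IsValid() test down a reject link and convert everything else:

Code: Select all

Reject link constraint:
  Not(IsValid("decimal", TrimLeadingTrailing(lnk_src.AMOUNT_TXT)))

Output derivation for the decimal column:
  StringToDecimal(TrimLeadingTrailing(lnk_src.AMOUNT_TXT))

The rejected rows can then go to a Peek or a flat file so you can see exactly which records and values are the problem.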
-craig

"You can never have too many knives" -- Logan Nine Fingers
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

P.S. Oracle will never tell you which record / column / value is throwing the error - you need to find it yourself. Sometimes you can find them by sorting the VARCHAR2 columns descending; non-numeric values will appear first... depending on their position in the string, of course.
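For example (again, the names below are placeholders):

Code: Select all

-- with a plain binary sort, letters collate after digits, so values that
-- start with a non-numeric character float to the top of a descending sort
select AMOUNT_TXT
  from SRC_TABLE
 order by AMOUNT_TXT desc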

You could also play with Oracle's TRANSLATE function, which is a bit like EReplace but works character by character. You can strip all of the valid digits out of a string; any values that come back non-null are candidates for being problem children, depending on what is left over:

Code: Select all

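-- translate() turns every digit into a space; after trim(), only values that
-- contain something other than a digit (including signs and decimal points) survive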
select COLUMN from TABLE
where trim(translate(COLUMN,'0123456789','          ')) is not null
Should get you close.

Still think your proper approach is to do the conversion in the job.
-craig

"You can never have too many knives" -- Logan Nine Fingers
devsonali
Premium Member
Posts: 155
Joined: Sun Mar 18, 2007 2:31 pm

Post by devsonali »

Thank you Craig

I just went the conventional way of querying the table with DISTINCT and a sort (as you proposed) and found the problem area. IsValid sounds like another fine alternative for these scenarios.

However, I feel DataStage should have the ability to write the problem record to the log in these scenarios as well, but I guess we do not have that at this point.

This is now resolved.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

DataStage is a client; it interfaces with Oracle (or any other database, for that matter) the same as any other client tool does. If the database doesn't support something or doesn't communicate something back to the client, there's nothing the client can do about it.
-craig

"You can never have too many knives" -- Logan Nine Fingers