Search found 82 matches

by atulgoel
Thu Jun 15, 2017 12:22 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Fatal Error: Attempt to setIsNull() on the accessor interfac
Replies: 13
Views: 10087

Here is the output for $OSH_PRINT_SCHEMA main_program: Schemas: Data set "ODBC_Connector_0:DSLink2.v": record ( BICOD: string[max=10]; BIBKC: string[max=10]; BIBTN: string[max=10]; BIANO: string[max=10]; BICUR: string[max=10]; BITRN: string[max=10]; BIAMT: string[max=10]; BICHQ: string[max...
by atulgoel
Wed Jun 14, 2017 11:58 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Fatal Error: Attempt to setIsNull() on the accessor interfac
Replies: 13
Views: 10087

Hi Craig,

I described my job design in my previous comment. Please let me know what further details you need to investigate the issue.
by atulgoel
Wed Jun 14, 2017 11:57 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Fatal Error: Attempt to setIsNull() on the accessor interfac
Replies: 13
Views: 10087

Hi Mike, Below is my job design: ODBC Stage ---> Copy Stage ---> Peek Stage 1) There is no metadata defined in the ODBC Connector. 2) My SELECT SQL statement has a CAST function; I am converting all columns to VARCHAR with it. The column for which I am getting this error is of Numeric type. 3)...
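As a minimal sketch of that kind of statement (the table name and VARCHAR lengths here are hypothetical; the column names are taken from the schema dump quoted earlier in this thread), casting every column keeps the runtime schema all-string under RCP:

```sql
-- Hypothetical table name; the CASTs mirror the approach described above.
SELECT CAST(BICOD AS VARCHAR(10)) AS BICOD,
       CAST(BIAMT AS VARCHAR(10)) AS BIAMT   -- BIAMT is the Numeric column
FROM   SOURCE_TABLE
```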
by atulgoel
Wed Jun 14, 2017 11:53 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Fatal Error: Attempt to setIsNull() on the accessor interfac
Replies: 13
Views: 10087

Yes, I tried the environment variable $OSH_PRINT_SCHEMAS, but I am still getting the error; it has not resolved my issue.
by atulgoel
Wed Jun 14, 2017 4:29 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Fatal Error: Attempt to setIsNull() on the accessor interfac
Replies: 13
Views: 10087

Fatal Error: Attempt to setIsNull() on the accessor interfac

Hi, I am facing the error below while reading data from the source using the ODBC Connector. In the source table all columns are NOT NULL, and I am reading the table using RCP. While executing, it gives the following error: Fatal Error: Attempt to setIsNull() on the accessor interfacing to non-nullable field Can an...
by atulgoel
Tue Jun 06, 2017 1:25 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Player terminated unexpectedly.Unable to initialize communic
Replies: 9
Views: 32039

Hi pieterh54,

Please let me know whether you were able to resolve the issue. I am getting the same issue as well.
by atulgoel
Wed Feb 01, 2017 2:37 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error in Using File Connector stage
Replies: 11
Views: 13543

Some configuration settings needed to be done on the Hive administration side. After that, it is working fine now.
by atulgoel
Wed Feb 01, 2017 1:47 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: String to Double DataType Conversion
Replies: 6
Views: 8567

The specification below is working fine. Using it, I can load String data from the source into a Double column in the Hive target.

<ColName>:DFLOAT=<ColName>
by atulgoel
Wed Feb 01, 2017 1:43 am
Forum: General
Topic: Looping on Value file name in Parameter Set
Replies: 4
Views: 21208

Yes, it is the same job, which needs to be run with a different value file at each iteration.

To resolve the issue, I created a file containing the values of the value file set, read it using an Execute Command stage, and then passed the required value to the loop.
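The approach described above can be sketched as a small shell fragment for the Execute Command stage (the directory and file names here are stand-ins; a real job would point at the parameter set's value-file directory on the engine tier):

```shell
# Stand-in for the parameter set's value-file directory
# (e.g. .../Projects/<project>/ParameterSets/<set_name>).
VFDIR=$(mktemp -d)
touch "$VFDIR/ValueFileName1" "$VFDIR/ValueFileName2" "$VFDIR/ValueFileName3"

# Join the value file names into one comma-delimited line that the
# sequencer's Start Loop activity can iterate over.
VFLIST=$(ls "$VFDIR" | paste -sd, -)
echo "$VFLIST"    # ValueFileName1,ValueFileName2,ValueFileName3
```

The Execute Command stage's output line then becomes the delimited list fed to the Start Loop activity.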
by atulgoel
Wed Feb 01, 2017 1:35 am
Forum: General
Topic: (40521) Unable to retrieve the link details for the stage
Replies: 1
Views: 2552

(40521) Unable to retrieve the link details for the stage

Can someone help with the above error, which I am getting while opening DataStage jobs and because of which I am losing link details in the job? This happens very frequently and I have to rework the code every time. Even if I take a backup of the code, the error keeps coming back after importing the ...
by atulgoel
Tue Jan 24, 2017 9:19 pm
Forum: General
Topic: Looping on Value file name in Parameter Set
Replies: 4
Views: 21208

Looping on Value file name in Parameter Set

I have a parameter set with 6 parameters and 20 value files. I need to run the job in a sequence 20 times, with a different value file name each time. I have created a delimited list of value file names as below: ValueFileName1,ValueFileName2,ValueFileName3.... and so on... Now I want to pass Valu...
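One way to pick the Nth name out of such a delimited list inside the sequencer loop is the Field() function in the Job Activity's parameter expression; a minimal sketch, assuming the list is held in a variable called ValueFileList and the loop activity is named StartLoop_Activity:

```
Field(ValueFileList, ",", StartLoop_Activity.$Counter)
```

The numeric loop counter then walks the list from 1 to 20, handing one value file name to each job invocation.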
by atulgoel
Tue Jan 10, 2017 6:37 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: String to Double DataType Conversion
Replies: 6
Views: 8567

String to Double DataType Conversion

Hi,

My source and target are Hive databases. I need to convert a column's datatype from String to Double. I am using Runtime Column Propagation.

Is it possible to convert a column from String to Double using the Modify stage?
by atulgoel
Sat Jan 07, 2017 1:16 pm
Forum: General
Topic: Parallelism in Looping within sequencer
Replies: 1
Views: 1903

Parallelism in Looping within sequencer

Hi,

Is there any way to execute a multi-instance job in parallel for all the parameters (read from a file) using looping in a sequencer, with only one Job Activity between the Start Loop and End Loop?

Or is there any other way, apart from looping in a sequencer?
by atulgoel
Mon Dec 26, 2016 11:13 pm
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error in Using File Connector stage
Replies: 11
Views: 13543

Hi,

Is there any document or website where I can find the step-by-step configuration settings required to use the File Connector to read from Hive?
by atulgoel
Fri Dec 23, 2016 6:45 am
Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Maximum Number of Value Files in Parameter Set
Replies: 6
Views: 4089

OK, thanks... Marking this post as resolved... :)