DSXchange: DataStage and IBM Websphere Data Integration Forum
Search found 36 matches
 Topic: Issue while reading sequential file
rohit_mca2003

Replies: 5
Views: 443

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Sun Mar 04, 2018 8:25 pm   Subject: Issue while reading sequential file
To answer the queries:

1. We actually have a mechanism to create the schema file from the file's metadata. Since the metadata says this is a DOS-format file, the schema file automatically takes '\r\n' as the record ...
Workaround Reported
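For reference, a minimal sketch of a schema-file record header that declares the DOS-style '\r\n' record delimiter; the column names and types below are placeholders, not the actual file layout:

    record {record_delim_string='\r\n', delim=',', quote=double, final_delim=end}
    (
        cust_id: int32;
        cust_name: nullable string[max=50];
    )

With '\r\n' declared at the record level, the reader consumes the carriage return as part of the record delimiter instead of leaving it attached to the last field.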
 Topic: How to prevent CHECKSUM stage to re-arrange column names
rohit_mca2003

Replies: 7
Views: 938

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Fri Mar 02, 2018 3:24 am   Subject: How to prevent CHECKSUM stage to re-arrange column names
To answer everyone's query: we have resolved this issue, and it may be a good use case for the future.

I am aware that DataStage puts '|' after each field passed to the 'checksum' operator and it uses H ...
Workaround Reported
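To illustrate why the column order matters, using the '|' behaviour described above and made-up values:

    columns ordered A, B, C  ->  buffer hashed: valA|valB|valC|
    columns ordered B, A, C  ->  buffer hashed: valB|valA|valC|

The two buffers differ, so the MD5 values differ even though the row holds the same data.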
 Topic: How to prevent CHECKSUM stage to re-arrange column names
rohit_mca2003

Replies: 7
Views: 938

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Fri Mar 02, 2018 3:07 am   Subject: How to prevent CHECKSUM stage to re-arrange column names
Best to open a support case then; the documentation doesn't seem to show that as an option. Out of curiosity, what 'existing application' was used to generate the checksum and what did it use to gener ...
 Topic: Issue while reading sequential file
rohit_mca2003

Replies: 5
Views: 443

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Fri Mar 02, 2018 3:01 am   Subject: Issue while reading sequential file
DataStage® Release: 9x
Job Type: Parallel
OS: Unix
Additional info: Warning - Input buffer overrun at field
Hi,

This is a very common error, but I did not find a suitable answer in the other entries, so I am posting this as a new query.

I am trying to read a CSV file which has '\r\n' as the record delimiter. It is wind ...
Workaround Reported
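A sketch of the equivalent fix when the file is read through a Sequential File stage rather than a schema file (property names as they appear on the stage's Format tab; exact labels can vary by version):

    Record level -> Record delimiter string : DOS format ('\r\n')
    Field defaults -> Delimiter             : comma
    Field defaults -> Quote                 : double

If the '\r' is not consumed as part of the record delimiter it stays attached to the last column, and the import can then fail with warnings like the "Input buffer overrun at field" message quoted above.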
 Topic: How to prevent CHECKSUM stage to re-arrange column names
rohit_mca2003

Replies: 7
Views: 938

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Mon Jan 15, 2018 10:10 pm   Subject: How to prevent CHECKSUM stage to re-arrange column names
Thanks, Craig.
I need to generate the CHECKSUM based on the column order, as there is a requirement to have the keys in a specific order.
The existing application is running in production, where the hash was generated ba ...
 Topic: SORT:Restrict Memory Usage
rohit_mca2003

Replies: 5
Views: 3453

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Mon Jan 15, 2018 4:46 am   Subject: SORT:Restrict Memory Usage
Try using the environment variable APT_OLD_BOUNDED_LENGTH and setting it to 'True'.
Workaround Reported
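A minimal sketch of how the variable can be set for a single job, assuming it is added as an environment-variable job parameter (a project-wide default via the Administrator client works as well):

    Job Properties -> Parameters -> Add Environment Variable... -> APT_OLD_BOUNDED_LENGTH
    $APT_OLD_BOUNDED_LENGTH = True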
 Topic: How to prevent CHECKSUM stage to re-arrange column names
rohit_mca2003

Replies: 7
Views: 938

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Mon Jan 15, 2018 4:37 am   Subject: How to prevent CHECKSUM stage to re-arrange column names
DataStage® Release: 9x
Job Type: Parallel
OS: Unix
Additional info: CHECKSUM stage re-arranges the columns while computing hash
Hi Everyone,

I have an issue while computing a hash value using the 'CHECKSUM' stage. It seems that the CHECKSUM stage re-arranges the columns by their names while computing the hash value.

Example:
--- ...
Resolved
 Topic: How to stop default type conversion in MODIFY stage
rohit_mca2003

Replies: 3
Views: 378

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Thu Jan 11, 2018 10:31 pm   Subject: How to stop default type conversion in MODIFY stage
This issue was resolved when I defined the conversion as below:

Output_Col:string[max=20]=string_trim[in_col]

If I assigned the type as string[20], it was treating it as CHAR and adding spaces, so ...
Resolved
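For reference, the two forms of the MODIFY specification side by side; the notes in parentheses are annotations describing the behaviour reported above, not part of the specification syntax:

    Output_Col:string[20]=string_trim[in_col]        (fixed-length: the result is space-padded to 20 characters)
    Output_Col:string[max=20]=string_trim[in_col]    (bounded VarChar: the result keeps its trimmed length, up to 20)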
 Topic: How to stop default type conversion in MODIFY stage
rohit_mca2003

Replies: 3
Views: 378

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Thu Jan 11, 2018 10:08 pm   Subject: How to stop default type conversion in MODIFY stage
This option does not restrict the length of output_col. We have used this parameter earlier to restrict the usage of disk and scratch space.
Resolved
 Topic: How to stop default type conversion in MODIFY stage
rohit_mca2003

Replies: 3
Views: 378

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Thu Jan 11, 2018 8:15 am   Subject: How to stop default type conversion in MODIFY stage
DataStage® Release: 9x
Job Type: Parallel
OS: Unix
Additional info: Datatype length exceeds while NULL handling and String Trim
Hi,

I have a generic job where I am handling NULLs and performing a TRIM for string-type columns.
For the output columns of these operations, if I do not assign a data length for the output, then by default it increases the ...
 Topic: How to handle double quote in Column Name (Teradata)
rohit_mca2003

Replies: 1
Views: 623

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Thu Sep 07, 2017 9:41 pm   Subject: How to handle double quote in Column Name (Teradata)
DataStage® Release: 9x
Job Type: Parallel
OS: Unix
Additional info: Teradata column name have double quote
Hi,
I have a requirement to handle double quotes in a Teradata column name. Since Teradata does not accept column names that are the same as reserved words, the column has been created as "TITLE".

I have an RCP ...
 Topic: HASH Partition not working for Checksum values
rohit_mca2003

Replies: 5
Views: 873

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Mon Jul 31, 2017 9:01 pm   Subject: HASH Partition not working for Checksum values
When I say the join is not happening properly, I mean that if I run the join in sequential mode or with the Entire partitioning method then it works fine,
but with HASH partitioning the partitioning does not seem to be working fine ...
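For context, a sketch of the usual setup for a keyed join on a checksum column (a general pattern with placeholder names, not the actual job):

    Join stage key:      checksum_col
    Left input link:     Hash partition on checksum_col, sort on checksum_col
    Right input link:    Hash partition on checksum_col, sort on checksum_col

Identical key values always hash to the same partition, so when a hash-partitioned join drops matches it usually points to the key values differing subtly (case, trailing spaces, NULL handling) rather than to the partitioner itself.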
 Topic: HASH Partition not working for Checksum values
rohit_mca2003

Replies: 5
Views: 873

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Mon Jul 31, 2017 4:25 am   Subject: HASH Partition not working for Checksum values
DataStage® Release: 9x
Job Type: Parallel
OS: Unix
Additional info: Passing MD5 hash value to HASH Partition
Hi,

I need to join the columns (using the Join stage) which hold MD5 hash values (generated using the Checksum stage).

I have the same data in source and target, so I expected all the records to match, but the join ...
 Topic: Join the columns having HASH values
rohit_mca2003

Replies: 3
Views: 943

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Thu Jul 27, 2017 2:38 am   Subject: Join the columns having HASH values
Hi,

I checked the checksum values and they are the same as whatever is in the target. Also, if DataStage generated different checksums for the same value, then it should not be used.

Thanks.
 Topic: Join the columns having HASH values
rohit_mca2003

Replies: 3
Views: 943

Forum: IBM® DataStage Enterprise Edition (Formerly Parallel Extender/PX)   Posted: Thu Jul 27, 2017 1:36 am   Subject: Join the columns having HASH values
DataStage® Release: 9x
Job Type: Parallel
OS: Unix
Additional info: Join does not work when columns have hash value
Hi,

We have a requirement to join columns that hold HASH values (these hash values have been computed by Checksum).

First I tried a 'Hash partition' and a sort on this column (which has the hash value) ...
 
