Search found 38 matches
- Mon Jul 03, 2017 10:16 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Load Pack for BW_PX_Stage - Unable to see custom infosource
- Replies: 0
- Views: 2072
Load Pack for BW_PX_Stage - Unable to see custom infosource
Dear Experts, We are using the BW Load Stage to load data from a table into SAP BW. All the required configuration is done and we are able to connect to SAP BW through the BW Load Stage, but we are unable to see the InfoSource created in SAP BW, whereas we can see the system-defined transaction InfoSource type (Starts with 0 and...
- Fri Mar 28, 2014 1:49 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Array Size - DTS Stage
- Replies: 1
- Views: 1700
Array Size - DTS Stage
Hello, I need clarification on the Array Size option in the DTS Stage. Our job design reads messages from a queue, parses them, and loads the transactions into the target tables. The DTS stage is used to ensure transaction control. The question is to understand: if we give a value of 2,000 for Array Size, does it signi...
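The usual meaning of an "array size" in database loading can be illustrated with a generic batching sketch. This is not the DTS stage's internal code, just a sketch of the common interpretation, with hypothetical names:

```python
# Generic illustration of an "array size": rows are grouped into fixed-size
# batches, and each batch is sent to the target in a single round trip. An
# Array Size of 2,000 would then mean up to 2,000 rows per send, not a cap
# on the total number of rows processed.
def batches(rows, array_size):
    """Yield successive slices of `rows`, each at most `array_size` long."""
    for i in range(0, len(rows), array_size):
        yield rows[i:i + array_size]
```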
- Mon Feb 24, 2014 5:29 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: MQ connector disconnection delay
- Replies: 0
- Views: 1291
MQ connector disconnection delay
All, We are running the MQ connector in multinode mode (4 nodes, Conductor/Compute configuration). The purpose of this job is to read data from the source queue, move it to a work queue, and transform the data from the source queue. In case of any failure, the data from the work queue will be moved back to the source queue and the job will ...
- Mon Jun 17, 2013 1:15 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ANSI to UTF-8
- Replies: 9
- Views: 11942
- Thu Jun 13, 2013 11:26 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ANSI to UTF-8
- Replies: 9
- Views: 11942
- Thu Jun 13, 2013 3:58 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ANSI to UTF-8
- Replies: 9
- Views: 11942
Thanks, it's still the same. I have explicitly set the extended property across all the stages, yet the output is still 'ASCII TEXT'. I assume the setting given in the stage will override any setting given in the Job/Project/UVCONFIG. In the stage/job/project the NLS is set to UTF-8. The reason is, when I c...
- Tue Jun 11, 2013 12:46 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ANSI to UTF-8
- Replies: 9
- Views: 11942
The problem is not resolved; I have responded to chulett's question. The issue is that I am unable to create a UTF-8 CSV file in the target. I have even tested this with a sample job using a Row Generator and a Sequential File stage. Is there anything I am missing? UTF-8 is set at the project level, job level...
- Mon Jun 10, 2013 7:24 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ANSI to UTF-8
- Replies: 9
- Views: 11942
Thanks for your response. I tried a sample job, as the original job (which transforms the XML to a CSV file) is quite complex. For the sample job, I created the file in Windows with the encoding type set to 'ANSI' and moved it to AIX using an FTP client in binary mode. I am able to see the format as ' A...
- Mon Jun 10, 2013 6:31 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: ANSI to UTF-8
- Replies: 9
- Views: 11942
ANSI to UTF-8
Hello, I want to convert a sequential file from ANSI to UTF-8 format. I have tried setting the NLS map to UTF-8 at the project level, and the NLS map at the stage level is also set to UTF-8 just to make sure. The record delimiter is set to UNIX newline. The file is not getting created in the UTF-8 f...
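Outside DataStage, the intended conversion can be sketched in Python, assuming "ANSI" here means the Windows cp1252 code page (the function and file names are hypothetical):

```python
# Re-encode a cp1252 ("ANSI") file as UTF-8 with Unix newlines. This is a
# minimal stand-alone sketch of the desired result, not the DataStage job.
def ansi_to_utf8(src: str, dst: str) -> None:
    with open(src, encoding="cp1252") as f:
        text = f.read()
    # newline="\n" enforces the UNIX newline record delimiter mentioned above
    with open(dst, "w", encoding="utf-8", newline="\n") as f:
        f.write(text)
```

A quick way to confirm the result is to inspect the raw bytes: the cp1252 byte 0xE9 ('é') should come out as the two-byte UTF-8 sequence 0xC3 0xA9.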
- Wed Jul 14, 2010 2:36 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: MQ Message splitup
- Replies: 6
- Views: 3330
- Wed Jul 14, 2010 3:39 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: MQ Message splitup
- Replies: 6
- Views: 3330
Thanks, I had a similar thought, but I couldn't find the execution option (Advanced tab) where we change the mode between Sequential and Parallel in the MQ connector. I have even tried changing the execution mode to Sequential in the Transformer, as we have a Transformer stage before the MQ connector, but no luck. Can you please hel...
- Tue Jul 13, 2010 9:29 am
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: MQ Message splitup
- Replies: 6
- Views: 3330
MQ Message splitup
Hi, I'd appreciate any help on this! When I use a multi-node configuration file in DataStage, the message in MQ gets split into multiple chunks depending on the number of nodes given in the config file. Is there an option whereby the message will be loaded as one single file even though we use a multi-node c...
- Sun Jul 04, 2010 3:41 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: Truncate only record exists
- Replies: 2
- Views: 2297
Truncate only record exists
Hi, I want to truncate the target table only when records come from the source system. My source table is in an Oracle database and the target database is DB2. The requirement is to move records from source to target if records exist in the source tables; if no records exist in the source then we nee...
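The conditional-truncate logic described above can be sketched in Python, using sqlite3 as a stand-in for the Oracle source and DB2 target; the table and column names (src_orders, tgt_orders, id, amount) are hypothetical:

```python
import sqlite3

# Sketch of the requirement: clear and reload the target only when the
# source actually returns rows; an empty source leaves the target untouched.
def load_with_conditional_truncate(con: sqlite3.Connection) -> int:
    rows = con.execute("SELECT id, amount FROM src_orders").fetchall()
    if rows:  # source has records: truncate target, then load
        con.execute("DELETE FROM tgt_orders")
        con.executemany("INSERT INTO tgt_orders (id, amount) VALUES (?, ?)", rows)
    con.commit()
    return len(rows)
```

In a real job the row-count check would typically be a separate lookup or a job-sequence step whose result gates the truncate, since the truncate and the load run against different databases.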
- Tue Jun 29, 2010 4:22 pm
- Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
- Topic: DB2 Configuration - Multiple Servers
- Replies: 2
- Views: 1762
Hi, Thanks. Is it possible to pass more than one config file to each job? Currently we define a parameter set with different config files and pass the appropriate value through a sequencer activity. In the same transform job I need to use both the dataset created using config file 1 and ...