Search found 21 matches

by kura
Mon Mar 05, 2007 2:22 pm
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Help on dsexport
Replies: 40
Views: 20029

dsexport multiple jobs to export

Can we export multiple jobs to the same .dsx file from the command line using dsexport?

Can somebody help with the syntax?
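One possible approach is a sketch only; dsexport is a Windows client tool, the host, credentials, project, and job names below are placeholders, and flag availability (in particular /APPEND) varies by DataStage client version, so verify against your version's documentation:

```bat
REM Export one job per invocation; /APPEND (where supported) adds each
REM subsequent export to the same .dsx file instead of overwriting it.
dsexport.exe /H=myhost /U=myuser /P=mypass /JOB=Job1 myproject C:\export\jobs.dsx
dsexport.exe /H=myhost /U=myuser /P=mypass /JOB=Job2 /APPEND myproject C:\export\jobs.dsx
```

If /APPEND is not available in your client, the usual fallback is exporting each job to its own .dsx file.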
by kura
Wed Feb 07, 2007 2:59 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Carriage return
Replies: 9
Views: 4539

Re: Carriage return

Did you try the "DOS line delimiter" option in the Sequential File stage?
by kura
Wed Feb 07, 2007 2:56 pm
Forum: IBM<sup>®</sup> Infosphere DataStage Server Edition
Topic: Transformer Stage with zero input link
Replies: 14
Views: 4773

Re: Transformer Stage with zero input link

You can use a Sequential File stage before the Transformer, reading the "/dev/null" file.
by kura
Mon Feb 05, 2007 4:22 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Error in join stage
Replies: 17
Views: 14218

Re: Error in join stage

This error occurs when there is a limit on the size of files that can be created. Can you check with your Unix admin what the file size limit is for the user that runs this job? [quote="somu_june"]Hi, Iam having a job which reads data from three source tables using DB2 API stage and I have join stage and I am joinning...
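Checking that limit yourself is straightforward with the standard ulimit shell builtin:

```shell
# Show the maximum size of files the current shell's user may create
# (reported in 512-byte blocks, or "unlimited").
ulimit -f

# Show all per-process resource limits for this session.
ulimit -a
```

Run these as the same user that executes the DataStage job, since limits are set per user and per session.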
by kura
Thu Aug 03, 2006 11:58 am
Forum: General
Topic: Datastage PX Libraries
Replies: 4
Views: 5141

Re: Datastage PX Libraries

Do you have an entry for APT_ORCHHOME in the dsenv file?

I am thinking it might be missing, or pointing to an incorrect PXEngine directory.
by kura
Thu Aug 03, 2006 11:50 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: compile error in Transformer
Replies: 9
Views: 4707

Re: compile error in Transformer

Sometimes, when the transformations are very complex, you will get this kind of error.
Here are some guidelines:

Avoid heavy null handling and conversions. Try adding derivations one by one until the error starts appearing.

I would suggest several modifications to avoid this kind of problem.
by kura
Sun Jul 30, 2006 11:30 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Modify to "truncate" timestamp
Replies: 6
Views: 3887

Re: Modify to "truncate" timestamp

Yes, nested functions are not supported in the Modify stage.

Depending on how many transformations you do and the volume of the data, I would choose between Modify and Transformer.

From my experience, I recommend Modify.
by kura
Sun Jul 30, 2006 11:25 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Lookup is failing
Replies: 5
Views: 2447

Re: Lookup is failing

I doubt [] will work. If the data type is "DATE", then there is no "DATE" of length 8 versus "DATE" of length 10; "DATE" is just Date, and length has no significance for it. My assumption is that you are using two fields, one char(8) and the other char(10). Then you must make sa...
by kura
Sun Jul 30, 2006 11:21 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Warning Message in Change Capture Stage
Replies: 3
Views: 2750

Re: Warning Message in Change Capture Stage

Which change mode did you use? Depending on the change mode, adjust the rest of the options. Then you should be able to avoid this warning. [quote="ShilpaBharadwaj"]Dear All, I am facing the following warning in Change Capture stage in a parallel job - Change_Capture: When checking operator: Defaulting "TYPE_...
by kura
Sun Jul 30, 2006 10:58 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: how to decrease the retrieval time from database
Replies: 5
Views: 2300

Re: how to decrease the retrieval time from database

I agree with kcbland. The best solution is to tune your SQL. Otherwise, dump each table's data into separate datasets and then perform the joins in DataStage. That approach helped us a lot. [quote="sravanthi"]hi, I'm using db2 udb api stage for selecting columns from a table. The time taken is too high.the job ...
by kura
Sun Jul 30, 2006 10:50 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: Strange Warning in DB2 EE Stage
Replies: 3
Views: 2233

Re: Strange Warning in DB2 EE Stage

Usually you get this error when you use the EE stage as both input and output in the same job. If that is the case, split it into two jobs. [quote="Nageshsunkoji"]Hi All, I am facing the following warning in DB2 EE stage db2HEW_CHANL_USAGE_SUMMins: The environment variable DB2INSTANCE is currently set to `db2in...
by kura
Sun Jul 30, 2006 10:47 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: regarding the DB2 EE stage
Replies: 3
Views: 2077

Re: regarding the DB2 EE stage

Hi Divya, the special characters being loaded could be empty strings. If that is the case: when you use the API stage, it automatically pads character fields with spaces; the EE stage does not. Explicitly add spaces at the end. [quote="srividya"]Hi This is in regards to the DB2 EE Stage. W...
by kura
Sat Oct 02, 2004 8:44 am
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: data type convertion in modify stage
Replies: 7
Views: 6342

[quote="ray.wurlod"]Not every ten digit number is an integer.
The largest (four byte (32 bit)) signed twos-complement integer is 2147483647; the largest unsigned is 4294967295.[/quote]

Yes, you are right. If data such as 3123456789 comes in, then obviously int32 will not accept it.

vijay
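The int32 ceiling discussed above is easy to verify; a quick sketch of the arithmetic:

```python
# A signed 32-bit integer tops out at 2**31 - 1 = 2147483647, while the
# largest unsigned 32-bit value is 2**32 - 1 = 4294967295. A 10-digit
# value like 3123456789 overflows signed int32 but fits in int64 (BigInt).
INT32_MAX = 2**31 - 1
UINT32_MAX = 2**32 - 1
INT64_MAX = 2**63 - 1

value = 3123456789
print(value > INT32_MAX)    # does not fit in a signed int32
print(value <= UINT32_MAX)  # would fit unsigned, but PX int32 is signed
print(value <= INT64_MAX)   # fits comfortably in a signed int64
```

This is why a 10-digit source field needs a BigInt (int64) column rather than Integer (int32).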
by kura
Fri Oct 01, 2004 3:42 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: data type convertion in modify stage
Replies: 7
Views: 6342

Even though it is a binding error, if you read to the end of the message you will see a range limitation error. In PX I observed that a 10-digit input will not fit in the integer data type. You have to declare it as BigInt.
by kura
Thu Sep 30, 2004 4:57 pm
Forum: IBM<sup>®</sup> DataStage Enterprise Edition (Formerly Parallel Extender/PX)
Topic: data type convertion in modify stage
Replies: 7
Views: 6342

Re: data type convertion in modify stage

Try int64_from_decimal, and declare your output data type as BigInt instead of Integer.

Vijay
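A minimal sketch of the Modify stage specification this suggests; the column names DEC_COL (decimal input) and BIG_COL (int64 output) are hypothetical, and the exact conversion-function spelling should be checked against your version's documentation:

```
BIG_COL:int64 = int64_from_decimal(DEC_COL)
```

The general shape is `new_column:new_type = conversion(old_column)`, one specification per converted field.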