Search found 3045 matches

by vmcburney
Wed Nov 20, 2002 6:19 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Job Sequencer
Replies: 2
Views: 2404

Here are some more examples. You can use a conditional trigger to read a job parameter that affects the sequence path. (You don't need to enclose parameters in # marks.) E.g. DoLoadLookups=-1 DoLoadLookups=0 The first trigger reads the value of the job parameter DoLoadLookups and if it is true it move...
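The two custom trigger expressions described above might look like this on the two output links of the activity (parameter name taken from the post; -1 is DataStage's value for true):

```
Link1 (custom trigger): DoLoadLookups = -1
Link2 (custom trigger): DoLoadLookups = 0
```

Only the link whose expression evaluates true fires, so the parameter value steers the sequence down one path or the other.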
by vmcburney
Thu Oct 17, 2002 6:57 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Handling Primary Key conflict
Replies: 1
Views: 1206

Your second job will have to write a value to all four NOT NULL fields. For empty numeric fields try writing a 0; for text fields try a blank space. If you leave these fields out of job 2 it will try to insert a NULL value, which will trigger a database error.
by vmcburney
Thu Oct 17, 2002 6:54 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Parameters passed into SQL
Replies: 2
Views: 1806

There are two ways to add parameters. Some database stages allow you to add a where clause that gets appended to the generated SQL. You should see a "Where" tab on your ODBC Selection screen. You can put parameters directly into that where clause using the # delimiters. If you don't have a w...
by vmcburney
Mon Oct 14, 2002 9:59 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Unknown DataStage Abort Error
Replies: 4
Views: 1764

Looks like when you exported the job you left out one or more routines that are used by that job. Have a look at the transformers within the job and get a list of the routines it uses. Move those routines into production. Alternatively, use the Release Management tool available from this site to autom...
by vmcburney
Sun Aug 25, 2002 9:15 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Passing parameters in Job Sequencer?
Replies: 1
Views: 962

There are a few ways to use parameters within a sequence job. - Firstly, you can set up parameters in the sequence job properties screen; these parameters can then be passed to any stages within that sequence. I don't know of any way to change the value of these parameters while the job is running. -...
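Outside a sequence, parameters can also be pushed into a job from job-control BASIC; a minimal sketch along those lines (the job and parameter names here are hypothetical, but DSAttachJob, DSSetParam and the related calls are the standard job-control API):

```
* Attach the target job, set a parameter, run it and wait for completion
      hJob = DSAttachJob("LoadLookups", DSJ.ERRFATAL)
      ErrCode = DSSetParam(hJob, "DoLoadLookups", "-1")
      ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
      ErrCode = DSWaitForJob(hJob)
      Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
      ErrCode = DSDetachJob(hJob)
```

Note the parameter value is fixed at the moment DSRunJob is called, which matches the point above about not being able to change it mid-run.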
by vmcburney
Sun Aug 25, 2002 9:05 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Problems with Data Stage 5.2
Replies: 1
Views: 1267

Use the search function on this forum on terms such as "schedule" and "error" to see some other postings on this topic. Other people have had problems scheduling jobs and it usually comes down to ensuring the jobs are being executed by the correct user with the correct rights. In...
by vmcburney
Sun Aug 25, 2002 8:54 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Datastage Corss-Reference
Replies: 2
Views: 1958

As Ray says a reference link seems to be what you are after. Another option is to cross reference your tables within the source database if they both exist in the same database. This can be done by writing a custom SQL select statement in your database stage or by writing a database view that joins ...
by vmcburney
Tue Jul 16, 2002 7:53 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Datastage 3 NT to Datastage 5 HP11i
Replies: 1
Views: 768

If your Unix box has multiple processors you need to get your jobs running in parallel. Normally a single job utilizes just one processor and can leave much of the server resources idle. There are a few techniques for getting maximum utilization of the server: - run independent jobs in p...
by vmcburney
Tue Jul 16, 2002 7:42 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Stage Sequence
Replies: 3
Views: 1634

Endy, normally if you have two independent paths within a job DataStage will try to run them in parallel without giving you much control over the order. One way around this is to turn one of the paths into a dummy reference link. DataStage will first run those data streams that build a reference fil...
by vmcburney
Thu Jul 04, 2002 7:42 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Data Migration Issue
Replies: 1
Views: 1239

There are some tips for making your job run faster: - Do not attempt to use an Oracle table as a lookup/reference stage. If tables A and B are from the same database you can join them in the SELECT statement or create a view that joins them. Otherwise dump table B into a hash file and use this as a loo...
by vmcburney
Thu Jun 20, 2002 6:01 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Upgrade datastage from 4.0 to 5.2 problems
Replies: 2
Views: 1203

I have seen similar problems in 5.1 with a couple of possible causes: - Does your job have a transform which uses routines? If a routine accepts an input field as an argument and then modifies this argument you could get an abnormal termination. In version 5 the arguments are passed by reference, no...
by vmcburney
Mon Jun 17, 2002 8:11 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Incremental Migration
Replies: 1
Views: 721

Updates can be very difficult to find unless you can make a change to your legacy database to tag them. For example on your legacy database add a modified date to each table along with a trigger to populate the field whenever an add or modify occurs. Get the trigger to write the primary key to a del...
by vmcburney
Mon Jun 10, 2002 7:15 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: How to set job status in Transformer
Replies: 1
Views: 2527

1) Here are a couple of options which use a routine to set the job status. Example 1. A field inQuantity is passed through the transformer to the output field outQuantity. A business rule states if Quantity is greater than 100 set the job user status to warning, else set it to okay. In the tr...
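A sketch of such a transformer routine (the routine name and the status strings are assumptions; DSSetUserStatus is the documented BASIC call for setting the job's user status):

```
* Routine SetQtyStatus(Arg1): pass the quantity through unchanged
* while setting the job user status as a side effect
      If Arg1 > 100 Then
         Call DSSetUserStatus("WARNING")
      End Else
         Call DSSetUserStatus("OKAY")
      End
      Ans = Arg1
```

In the transformer, the derivation of outQuantity would then be SetQtyStatus(inQuantity), so the status is set row by row as data flows through.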
by vmcburney
Thu May 30, 2002 7:17 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: Hashed File
Replies: 2
Views: 1658

If you are using DataStage 5 you can check the "Create File" check box in the plug-in properties, which ensures the hash file gets created whenever the job is run. You do not need to change any of the settings for the hash file; it is created as a dynamic hash file which in most cases will ...
by vmcburney
Thu May 16, 2002 9:22 pm
Forum: IBM® InfoSphere DataStage Server Edition
Topic: execute shell command
Replies: 5
Views: 3275

This is a known defect in 5.1: the execute stage does not allow the use of job parameters in the execute parameter field. The workaround is to use a routine stage, pass it the command and parameters, and run the command from the routine. This has the advantage of centralising your execute code so yo...
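A minimal sketch of such a routine (the routine name and logging are assumptions; DSExecute is the standard BASIC call for running OS commands, and job parameters can be resolved into the command string before it is passed in):

```
* Routine ExecCommand(Cmd): run an OS command and return its exit code
      Call DSExecute("UNIX", Cmd, Output, SysRet)
      Call DSLogInfo("Command output: " : Output, "ExecCommand")
      Ans = SysRet
```

Because the command arrives as an ordinary routine argument, the calling stage can build it from job parameters freely, sidestepping the execute-stage limitation.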