Problems with QualityStage metadata on the server.

Infosphere's Quality Product


vmcburney
Participant
Posts: 3593
Joined: Thu Jan 23, 2003 5:25 pm
Location: Australia, Melbourne
Contact:


Post by vmcburney »

I'm having a problem with the import of metadata from QualityStage into DataStage. When I change a data file definition that has already been imported into DataStage I am unable to update the DataStage definition.

For example if I import a table from QualityStage into DataStage:

Code:

FieldName   Length  Start
CUSTID      12        1
ACCTNO      12        13
I deploy a QualityStage job using this file and then import it into DataStage via the Metadata import. I then change the length of one of the fields in QualityStage and deploy the job again:

Code:

FieldName   Length  Start
CUSTID      12        1
ACCTNO      20        13
I import it again into DataStage via metadata import. It brings ACCTNO in with a length of 12; it does not recognise that the column length has changed to 20. If I set the column length to 20 manually in DataStage, the job fails with:
Link column 'ACCTNO' length mismatch in link 'q1'
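To illustrate why the stale definition breaks the job, here is a hypothetical sketch (not DataStage/QualityStage code) of parsing a fixed-width record against the two layouts above; the field names and record contents are made up for the example:

```python
# (name, start, length) for each field, using 1-based start positions
# as in the QualityStage definition above.
OLD_LAYOUT = [("CUSTID", 1, 12), ("ACCTNO", 13, 12)]
NEW_LAYOUT = [("CUSTID", 1, 12), ("ACCTNO", 13, 20)]  # ACCTNO widened to 20

def parse_record(record, layout):
    """Slice a fixed-width record according to a layout definition."""
    return {name: record[start - 1:start - 1 + length]
            for name, start, length in layout}

# A record written by the new job, with a 20-byte ACCTNO:
record = "CUST00000001" + "ACCT0000000000000020"

print(parse_record(record, NEW_LAYOUT))  # full 20-byte ACCTNO
print(parse_record(record, OLD_LAYOUT))  # stale layout truncates ACCTNO to 12 bytes
```

Reading with the old layout silently truncates the widened field, which is why DataStage refuses to run when the link's column length disagrees with the definition it has cached.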

The only way I've been able to successfully move a column change from QualityStage to DataStage is to rename the data file whenever a change is made; DataStage then imports the correct columns. This is creating havoc with our naming conventions. Does anyone know where DataStage is pulling the old column definition from? It has been updated in the local repository and on the server in the DIC directory, yet DataStage is still finding an old definition somewhere.

Post by vmcburney »

After a few days I still don't have a resolution from Ascential tech support, or even confirmation that they can reproduce the problem. At the moment the only workaround we have is to append a version code to every QualityStage Datafile definition: every time the definition changes, we increment the code, import the new table name into DataStage, and hook the job up to the new table. DataStage imports the new definition correctly under a new name, but is unable to import it under the old name.
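The versioning workaround above could be scripted rather than done by hand. A minimal sketch, assuming a `_Vn` suffix convention (the suffix format is my assumption, not anything QualityStage mandates):

```python
import re

def next_versioned_name(name):
    """Return the next versioned Datafile name.

    CUSTFILE -> CUSTFILE_V1, CUSTFILE_V1 -> CUSTFILE_V2, and so on.
    The _Vn suffix is a hypothetical naming convention for this sketch.
    """
    m = re.match(r"^(.*)_V(\d+)$", name)
    if m:
        return f"{m.group(1)}_V{int(m.group(2)) + 1}"
    return f"{name}_V1"

print(next_versioned_name("CUSTFILE"))     # CUSTFILE_V1
print(next_versioned_name("CUSTFILE_V1"))  # CUSTFILE_V2
```

Keeping the increment in one small function at least makes the renaming consistent across the team until the underlying import problem is fixed.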
timwalsh
Participant
Posts: 29
Joined: Tue Mar 04, 2003 7:48 am

Post by timwalsh »

This might be a simple question, but does your QS procedure run fine "stand alone"?

I find that I have to re-stage a procedure and run it via QS manager before I re-import the metadata into DataStage and update/re-run the DS job.

If you've already done all this with ASCL support, then I can't help you out any further.

Good luck and keep us posted!

Tim

PS: In QS v7.0, is the metadata exchange truly bi-directional? This is not possible with version 6.