DSXchange: DataStage and IBM Websphere Data Integration Forum
admin
Site Admin

Joined: 12 Jan 2003
Posts: 8720

Posted: Thu Feb 26, 2004 1:04 pm

Hi, my friends.

I'm working with DataStage 5.2 on a Windows 2000 platform.

I want to know the execution order of the stages in a server job. I would also like to know whether I can set this order, and how I can control it.

Thanks in advance.


Guillermo P. Barsky - gbarsky@osde.com.ar
Gerencia de Sistemas - Desarrollo

OSDE Binario - Filial Metropolitana
Alem 1067, Piso 16
TE (5411)4510-4330, Fax (5411)4510-5480
http://www.osde.com.ar

_______________________________________________
datastage-users mailing list
datastage-users@oliver.com
http://www.oliver.com/mailman/listinfo/datastage-users

_________________
PLEASE READ
Do not contact admin unless you have technical support or account questions. Do not send email or Private Messages about discussion topics to ADMIN. Contact the webmaster concerning abusive or offensive posts.
admin
Site Admin

Posted: Thu Feb 26, 2004 2:11 pm

Execution order is as specified in your design.
Look in the log for "active stage starting" events.
(Note that active stages are closed in the reverse order.)
admin
Site Admin

Posted: Thu Feb 26, 2004 2:34 pm

Ray:

Thank you for your answer.

The problem I'm having is that a sequential file is fed from an aggregator and from two other transformers at the same time. All records are "appended" to the file. For this run (and for the test data) there are no records, so it seems that the three links arrive at the sequential file at the same time, and an error occurs: "DSD.SEQOpen Unable to create file .......".

But the file named in the error was created by a previous batch job (job A), which first creates the file and then calls the server job (job B).

I really don't know why the server job is trying to create the file.

Could you give me any hint to follow up on and solve the error?

Thanks again.


Guillermo P. Barsky - gbarsky@osde.com.ar






admin
Site Admin

Posted: Thu Feb 26, 2004 8:44 pm

Guillermo,

This isn't an 'execution order' problem. The error is caused by trying to
write three output streams to the *same* sequential file simultaneously.
This cannot be done, period. And before you ask, this has nothing to do with
DataStage but is due to the fundamental nature of a 'sequential' file.

Change your job to write to three separate files and then combine them in an
after-job routine. Or, write the records to a hash file instead and then
spin the hash file out to a sequential file after it's been populated.

-craig
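Craig's first suggestion (separate files plus an after-job combine) can be sketched like this; the file names and sample rows are made up for illustration, not taken from the original job:

```python
# Hedged sketch of the "three separate files, then combine" workaround.
# File and row names are illustrative only.
link_files = ["out_agg.txt", "out_xfm1.txt", "out_xfm2.txt"]
rows = ["agg row\n", "xfm1 row\n", "xfm2 row\n"]

# Each link writes its own file: one writer per file, so no contention.
for name, row in zip(link_files, rows):
    with open(name, "w") as f:
        f.write(row)

# After-job routine: concatenate the pieces in a deterministic order.
with open("combined.txt", "w") as out:
    for name in link_files:
        with open(name) as f:
            out.write(f.read())
```

In a real job the combine step would live in an after-job routine or an OS-level copy command; the point is that each sequential file only ever has exactly one writer.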


admin
Site Admin

Posted: Fri Feb 27, 2004 1:57 pm

Craig:

Thank you. I followed your instructions and, replacing the seq file with a hash file, it worked fine.

But I still do not understand what "... write three output streams to the *same* sequential file simultaneously ..." means. Are they really simultaneous, or are they perhaps so close in time (with a difference of microseconds) that the machine "thinks" they are simultaneous?

Are the three links arriving at the seq file really arriving at the same time? How is this translated and interpreted when the job is compiled?

Thanks, and have a nice weekend.


Guillermo P. Barsky - gbarsky@osde.com.ar




admin
Site Admin

Posted: Fri Feb 27, 2004 2:58 pm

Sequential files are not databases. There is no concept of a "row" of data; it is just bytes of data. A carriage return or line feed is just data. If you have two processes spooling data into the same file, the operating system (if it lets you) will merge on a block-of-bytes basis, not on rows. This is readily verifiable with some simple testing.

So, when DataStage tries to write output to the same file twice simultaneously via completely separate and independent streams, your data will probably be garbage. This is not the same as two output links writing to the same file; I'm talking about independent streams, which can be achieved with simultaneous jobs or with a single job using an IPC stage that detaches processes for simultaneity.

Since the solution of using a hash file is a database construct, your operation will work.
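The byte-level (not row-level) merging described here can be demonstrated with a small Python sketch; the file name and contents are illustrative, and DataStage is not involved. Two independent handles on the same file each track their own offset, so neither writer's "row" survives intact:

```python
# Two independent writers on the same file, as with two detached streams.
# Each handle keeps its own offset; the OS does not respect "rows".
f1 = open("shared.txt", "w")
f2 = open("shared.txt", "w")    # opening with "w" truncates the file again

f1.write("AAAAAAAAAA")          # writer 1's "row", written at offset 0
f1.flush()
f2.write("BB")                  # writer 2 also starts at offset 0...
f2.flush()                      # ...and clobbers writer 1's first bytes

f1.close()
f2.close()

print(open("shared.txt").read())  # BBAAAAAAAA: neither row is intact
```

With true concurrent appends the interleaving is timing-dependent, which is exactly why the result is unpredictable garbage rather than three clean sets of rows.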




admin
Site Admin

Posted: Fri Feb 27, 2004 4:25 pm

Craig:

If two records come in with the same key values, isn't it true that only the latest record will be retained in the hash file?

Thanks,
Bibhu

DW Engg, Rewards
503 220 3891 (PDX)




admin
Site Admin

Posted: Fri Feb 27, 2004 9:41 pm

Yes, that is true. Hash file writes are "destructive overwrites", meaning
the last one in wins. Sometimes this can be leveraged to remove duplicates
from data, or to end up with the "most recent" record by properly sorting
the source data. That's why it is very important to define your hash key(s)
appropriately for the data being stored *and* for the purpose it is being
stored for.

-craig
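The "last one in wins" behaviour can be sketched with a Python dict, which is keyed the same way a hash file is keyed (the key and column names below are made up):

```python
# Destructive-overwrite semantics of a hash file, modelled with a dict.
rows = [
    {"cust_id": 1, "status": "old"},
    {"cust_id": 2, "status": "active"},
    {"cust_id": 1, "status": "new"},   # same key as the first row
]

hash_file = {}
for row in rows:
    hash_file[row["cust_id"]] = row    # later write replaces the earlier one

print(hash_file[1]["status"])  # new: only the latest record is retained
print(len(hash_file))          # 2: duplicates on the key collapse
```

As Craig notes, sorting the source so the "most recent" record arrives last turns this overwrite into a deduplication step.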

