DSXchange: DataStage and IBM Websphere Data Integration Forum
Gius
Participant



Joined: 09 Mar 2015
Posts: 30

Points: 450

Posted: Mon Sep 14, 2020 8:32 pm

DataStage® Release: 11x
Job Type: Parallel
OS: Windows
Hello,
Is it possible, from a Sequential job, to read an input file like this:
Person1, email1@,email2@
Person2, email1@,email2@
Person3, email1@,email2@
Person4, email1@,email2@
and then loop over the records, passing one record at a time to a Parallel job as parameters (pPerson, pEmail1, pEmail2)?

Thank you.
qt_ky



Group memberships:
Premium Members

Joined: 03 Aug 2011
Posts: 2897
Location: USA
Points: 21971

Posted: Tue Sep 15, 2020 2:55 am

Yes. I will assume you meant to say the "Server Job" type. In the Transformer stage, you can find a derivation under the Utilities category (if I recall correctly). There will be a function named something like UtilityRunJob that you can use for each record to call another job.
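If you go that route, the derivation would look roughly like the sketch below. None of it is from the original post: the job name LoadPerson, the input link InLink and its columns are made up, and the argument order and the "name=value|name=value" parameter-string format are from memory, so check the definition of the UtilityRunJob transform (Utility category) in your own project before relying on it.

    UtilityRunJob("LoadPerson", "pPerson=" : InLink.Person : "|pEmail1=" : InLink.Email1 : "|pEmail2=" : InLink.Email2, 0, 0)

The expression would normally sit in a stage variable or output column derivation so it fires once per input row; as far as I recall, the trailing 0, 0 means no row limit and no warning limit on the called job.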

_________________
Choose a job you love, and you will never have to work a day in your life. - Confucius
ray.wurlod

Premium Poster
Participant

Group memberships:
Premium Members, Inner Circle, Australia Usergroup, Server to Parallel Transition Group

Joined: 23 Oct 2002
Posts: 54595
Location: Sydney, Australia
Points: 296053

Posted: Thu Oct 08, 2020 11:19 am

In the sequence job read the file into a delimited list using an Execute Command activity (perhaps tr to convert the newlines to delimiters). Use a User Variables stage within the loop to parse t ...
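Fleshing that out a little (none of this is from the original post; the path /data/persons.txt, the activity names and the pipe delimiter are assumptions):

Execute Command activity (Read_List), run before the loop:

    # Turn the newline-separated records into one pipe-delimited list.
    # /data/persons.txt is a hypothetical path; the pipe is chosen as the
    # list delimiter because the records themselves already contain commas.
    tr '\n' '|' < /data/persons.txt

Start Loop activity (Loop_Records): list type = delimited values, delimiter |, list = Read_List.$CommandOutput (you may need to strip a trailing newline or delimiter from $CommandOutput first).

User Variables activity inside the loop, splitting the current list element (assuming it is exposed as Loop_Records.$Counter) on the comma:

    pPerson = Field(Loop_Records.$Counter, ",", 1)
    pEmail1 = Trim(Field(Loop_Records.$Counter, ",", 2))
    pEmail2 = Trim(Field(Loop_Records.$Counter, ",", 3))

The three user variables are then mapped to the pPerson, pEmail1 and pEmail2 parameters of the parallel job in the Job Activity.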

_________________
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
rumu
Participant



Joined: 06 Jun 2005
Posts: 286

Points: 2871

Posted: Thu Oct 22, 2020 5:02 am

I implemented an almost similar requirement, where I had to read each record from a file and create one output file per record in JSON format. I used a loop in the sequence: an Execute Command activity before the loop reads the number of records and passes that count to the loop as the number of iterations. I then passed the iteration number from the loop to the Job Activity as a parameter and used that parameter in the Sequential File stage's 'Filter' option to run a command like the one below:

cat <filename> | head -<#iterationnum#> | tail -1

This way, the job reads the nth record in each iteration and passes it on for further transformation.
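For reference, a sketch of the two commands that approach needs (the path /data/persons.txt is a placeholder; the parameter name mirrors the #iterationnum# above):

Execute Command activity before the loop, to get the record count that drives the number of iterations:

    wc -l < /data/persons.txt

Filter command in the parallel job's Sequential File stage, selecting record number iterationnum:

    cat /data/persons.txt | head -#iterationnum# | tail -1

Note that this re-reads the file on every iteration, which is fine for small files but scales poorly as the record count grows.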

Regards,
Rumu

_________________
Rumu
IT Consultant