Write code to extract records from multiple tables to files.

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

JITeam
Premium Member
Posts: 17
Joined: Mon Mar 24, 2008 9:43 am

Write code to extract records from multiple tables to files.

Post by JITeam »

Hi,

I have to extract records from multiple tables and write them to different CSV files.
The extraction itself is simple (SELECT * FROM table). In Server Edition, is there a way to write code in DataStage where we define the table name and the database connection string, and the records get extracted to a CSV file along with the table definition?

EX:

Table   File
ABC     ABC.CSV
BCA     BCA.CSV
CBA     CBA.CSV

Thanks
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

There really isn't the concept of "select *" in Server since everything is metadata driven. So the typical answer would be one job per table.

How many tables are "multiple"?
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Provided that the record structure is identical, you could use one job with the table/file name a job parameter.
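
Something along these lines in a job control routine is the usual pattern (the job name ExtractTable and its parameter names are only examples; it assumes the extraction job references #TableName# and #FileName# in its source and target stages):

      TableList = "ABC,BCA,CBA"
      NumTables = DCOUNT(TableList, ",")
      FOR I = 1 TO NumTables
         TableName = FIELD(TableList, ",", I)
         FileName = TableName : ".CSV"

         * Attach the parameterized extraction job and set its parameters
         hJob = DSAttachJob("ExtractTable", DSJ.ERRFATAL)
         ErrCode = DSSetParam(hJob, "TableName", TableName)
         ErrCode = DSSetParam(hJob, "FileName", FileName)

         * Run it and wait for completion before moving on to the next table
         ErrCode = DSRunJob(hJob, DSJ.RUNNORMAL)
         ErrCode = DSWaitForJob(hJob)
         Status = DSGetJobInfo(hJob, DSJ.JOBSTATUS)
         ErrCode = DSDetachJob(hJob)
      NEXT I

But again: this only works if every table has the same column layout, because the job's stages still need fixed column definitions at design time.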
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
JITeam
Premium Member
Posts: 17
Joined: Mon Mar 24, 2008 9:43 am

Post by JITeam »

Thanks guys. The tables have different metadata and there are approximately 95 of them; in EE we could have used RCP, I guess.

So is writing a job with 100 extracts, or that many different jobs, the only option?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

No, you don't need 100 extracts. For 95 tables each with different metadata, 95 jobs should do it.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
babaojo
Premium Member
Posts: 2
Joined: Wed Aug 13, 2008 3:49 pm

Post by babaojo »

ray.wurlod wrote:No, you don't need 100 extracts. For 95 tables each with different metadata, 95 jobs should do it. ...

I think the question JITeam was trying to ask is similar to mine, which is:
Is there a way to parameterize the table definition in a job? Or is it possible to specify the definition as an external (operating system) object or as a DS_METADATA object, rather than explicitly listing each column in the job?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

And the answer is, and remains, not in server jobs.

Parallel jobs give more flexibility through the use of schema files, but things are always metadata driven.
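
A schema file is just a plain text file describing the record layout, along these lines (the columns shown are purely an example):

      record
      {final_delim=end, delim=',', quote=double}
      (
          EMP_ID: int32;
          EMP_NAME: string[max=50];
          HIRE_DATE: date;
      )

With runtime column propagation enabled, downstream stages can then carry whatever columns the schema file describes without those columns being defined in the job design.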

Just don't expect any "T" in your "ETL".
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Maybe you can concatenate the columns into a single comma-separated column. That way one job will do.
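
For example, in the source stage's user-defined SQL (column names are placeholders, the || operator depends on the database, and non-character columns would need explicit casting or formatting):

      SELECT COL1 || ',' || COL2 || ',' || COL3 AS REC
      FROM   ABC

The job then only ever sees a single VARCHAR column, so the same parameterized job could serve every table, although the SELECT list still has to be written out per table.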