Number of lines restriction

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

Mun
Charter Member
Posts: 33
Joined: Tue May 02, 2006 10:02 am

Number of lines restriction

Post by Mun »

Hi, is there a restriction on the number of lines for Before/After SQL or User-Defined SQL? I have a 600-line SQL script that I need to put into either the Before SQL, After SQL or User-Defined SQL section. The reason I need to put it in an ETL job is that we cannot promote a SQL script (political issues).

When I put the 600 lines of SQL in the Before/After SQL, only half of it fits and the other half gets chopped off.

What other approaches can I use?
Krazykoolrohit
Charter Member
Posts: 560
Joined: Wed Jul 13, 2005 5:36 am
Location: Ohio

Post by Krazykoolrohit »

600 lines of SQL.
I wonder why it can't be implemented using a job.
Mun
Charter Member
Posts: 33
Joined: Tue May 02, 2006 10:02 am

Post by Mun »

Krazykoolrohit wrote:600 lines of SQL.
I wonder why it can't be implemented using a job.
---

Hi, what do you mean by implemented using a job?
Krazykoolrohit
Charter Member
Posts: 560
Joined: Wed Jul 13, 2005 5:36 am
Location: Ohio

Post by Krazykoolrohit »

What is your SQL trying to do?

DataStage jobs are meant to break the SQL out into separate data flows. If you look at it crudely, everything that can be done in DataStage can be done in SQL, hence we wouldn't need DataStage at all.

But the crux is that DataStage does it better, faster, easier and with optimal results. That's why we do it in DataStage.

So you can join two tables, aggregate, basically do anything SQL can do.
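
To take a made-up example (the table and column names here are invented for illustration), a query like

SELECT c.cust_id, c.region, SUM(o.amount) AS total_amount
FROM customers c
JOIN orders o ON o.cust_id = c.cust_id
GROUP BY c.cust_id, c.region;

is nothing more than two source stages feeding a join (or lookup) and an Aggregator stage in a job.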
DSguru2B
Charter Member
Posts: 6854
Joined: Wed Feb 09, 2005 3:44 pm
Location: Houston, TX

Post by DSguru2B »

You can also use a before-job ExecDOS (OS command) routine to invoke your database's SQL command-line utility, connect to your database and execute the script. That is another way to do it.
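
For example, assuming an Oracle database and SQL*Plus installed on the DataStage server (substitute your own database's command-line client), the ExecDOS command could be something along the lines of:

sqlplus -S myuser/mypassword@MYDB @D:\scripts\my_inserts.sql

The user id, password, connect string and script path there are just placeholders for your own values.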
Creativity is allowing yourself to make mistakes. Art is knowing which ones to keep.
narasimha
Charter Member
Posts: 1236
Joined: Fri Oct 22, 2004 8:59 am
Location: Staten Island, NY

Post by narasimha »

Everything has to have a limit; looks like you have reached the limit on the number of characters allowed in that particular stage.
You could have wrapped the SQL script into a file and run it as a before/after script, but that is ruled out in your case, right?
Is loading the script into a file and using "Load SQL from a file at run time" an option?
Can you provide details as to what database/stage you want to use this SQL in?
What are you doing in these SQL statements?
Maybe you can implement the same in the job.
Narasimha Kade

Finding answers is simple, all you need to do is come up with the correct questions.
Mun
Charter Member
Posts: 33
Joined: Tue May 02, 2006 10:02 am

Post by Mun »

narasimha wrote:Everything has to have a limit; looks like you have reached the limit on the number of characters allowed in that particular stage.
You could have wrapped the SQL script into a file and run it as a before/after script, but that is ruled out in your case, right?
Is loading the script into a file and using "Load SQL from a file at run time" an option?
Can you provide details as to what database/stage you want to use this SQL in?
What are you doing in these SQL statements?
Maybe you can implement the same in the job.
---


There are about 500 lines of insert statements. A business user created this SQL script file and ran it successfully, and they want to promote the script file into production. The admin created a rule not to allow SQL scripts, so the quickest way to get this script into production is to put it in an ETL job and promote that. The script will create a table that the cube developer can use without much modification on his side.

I hope that explains it a little more clearly.
narasimha
Charter Member
Posts: 1236
Joined: Fri Oct 22, 2004 8:59 am
Location: Staten Island, NY

Post by narasimha »

Extract the data from your insert script and put it in a file.
Design a job to read this file as an input and insert into whatever database you want!
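
For example (the table and column names here are just made up for illustration), a line in your script like

INSERT INTO sales_summary (cust_id, region, amount) VALUES (1001, 'EAST', 250.00);

would simply become a delimited record in the data file, say

1001|EAST|250.00

and a Sequential File stage can read those records and feed them to your database stage.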
Narasimha Kade

Finding answers is simple, all you need to do is come up with the correct questions.
narasimha
Charter Member
Posts: 1236
Joined: Fri Oct 22, 2004 8:59 am
Location: Staten Island, NY

Post by narasimha »

Ok, if your issue is that you cannot move a file into production, break your insert script into two and run your job twice.
If breaking the insert statements into two does not help, break them further and run the job that many times....
I feel your 'Admin' should be more flexible with such things; explain the situation to him and he might come up with a workaround. :wink:
Narasimha Kade

Finding answers is simple, all you need to do is come up with the correct questions.
Mun
Charter Member
Posts: 33
Joined: Tue May 02, 2006 10:02 am

Post by Mun »

narasimha wrote:Ok, if your issue is that you cannot move a file into production, break your insert script into two and run your job twice.
If breaking the insert statements into two does not help, break them further and run the job that many times....
I feel your 'Admin' should be more flexible with such things; explain the situation to him and he might come up with a workaround. :wink:
---

Looks like I am able to break the SQL into two smaller chunks. One loads the first 50 columns and the other loads the remaining 50 columns. How can I merge them into one table?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Inserts on first pass, updates on second pass.
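
As a rough sketch (table, key and column names are made up here, and this assumes both halves of your SQL share the same key column), the two passes would look something like:

-- First pass: insert the key plus the first group of columns
INSERT INTO target_table (cust_id, col_01, col_02 /* ... first 50 columns ... */)
VALUES (1001, 'A', 'B');

-- Second pass: update the same rows with the remaining columns, matching on the key
UPDATE target_table
SET col_51 = 'C', col_52 = 'D' /* ... remaining 50 columns ... */
WHERE cust_id = 1001;

In job terms that is an insert-only write on the first pass and an update-existing-rows-only write on the second.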
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.