Changing case of parameters on the fly

A forum for discussing DataStage® basics. If you're not sure where your question goes, start here.

Moderators: chulett, rschirm, roy

jdmiceli
Premium Member
Posts: 309
Joined: Wed Feb 22, 2006 10:03 am
Location: Urbandale, IA

Changing case of parameters on the fly

Post by jdmiceli »

Hi all!

I have done some searching and haven't found anything like this question yet. If it is out there and I missed it, my apologies in advance.

I have a large project that runs common code for 19 different companies at the moment (and growing). Everything I can parameterize to prevent or minimize the need for company-specific code has been done, including the naming of files created during processing. This is where my problem lies.

I have a simple job that pulls data from a SQL Server 2000 table and runs it into a sequential file and one of the parameters is called 'CompanyName'. It is passed in from the shell script that runs job control. It is capitalized when passed in and is used in that format most of the time. The one place that it isn't is in a file name that is defined as:

Code:

       '#DirScripts#/#CompanyName#_#Environment#.csv'  
The rest of the code is expecting the filename to look like this:

Code:

      '/datastage/int/bts/ldr/scripts/lmc_int.csv'
but the file name ends up like this:

Code:

      '/datastage/int/bts/ldr/scripts/LMC_int.csv'
On a case-sensitive system, this is a problem. Is there a way to apply the Downcase() function to the CompanyName parameter where it needs to be lower case? Most of the time it needs to be upper case, as that is how job control passes it in, but for the filename it needs to be lower case.

Trying this:

Code:

         '#DirScripts#/downcase(#CompanyName#)_#Environment#.csv'
gives me this:

Code:

        'downcase(LMC)_int.csv'
I am probably just having a mental speedbump with this, but how do I get that stupid thing lower case on the fly where I need it, without affecting its overall state within the job?

Thanks in advance!
Bestest!

John Miceli
System Specialist, MCP, MCDBA
Berkley Technology Services


"Good Morning. This is God. I will be handling all your problems today. I will not need your help. So have a great day!"
Sainath.Srinivasan
Participant
Posts: 3337
Joined: Mon Jan 17, 2005 4:49 am
Location: United Kingdom

Post by Sainath.Srinivasan »

Code:

'#DirScripts#/' : downcase(#CompanyName#) : '_#Environment#.csv' 
jdmiceli
Premium Member
Posts: 309
Joined: Wed Feb 22, 2006 10:03 am
Location: Urbandale, IA

Post by jdmiceli »

Thanks Sainath for the suggestion, but setting it up that way got me a file called:

Code:

      : downcase(LMC) : _int.csv
Is there something I have to do to get the interpreter to actually process the function in this location, since it isn't a normal Derivation spot? Maybe enclose it in {} or something? I'll give that a try.

Any other ideas to try?
Bestest!

John Miceli
System Specialist, MCP, MCDBA
Berkley Technology Services


"Good Morning. This is God. I will be handling all your problems today. I will not need your help. So have a great day!"
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Things like the concatenation operator or functions are not supported in the Filename property of the Sequential File stage. So, you have a couple of options as I see it...

First, is this parameter in this job used in multiple places or just in the filename? If the latter, you'll need to downcase it before you pass it to the job parameter. If you need to use both flavors in the same job and the other areas (like derivations) support functions, then downcase the passed in value and then upcase it in your derivations.

Seems to me if you need both flavors in the job in areas that do not support functions then you'll need to pass both into the job as separate parameters.
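A minimal shell sketch of that second option, deriving both flavors before the job is invoked (the parameter names, project, and job names here are invented for illustration, and the dsjob call is shown commented out rather than run):

```shell
#!/bin/sh
# CompanyName arrives upper-cased from job control, e.g. "LMC".
COMPANY_NAME="LMC"

# Derive a lower-case copy for the file name parameter.
COMPANY_NAME_LC=$(echo "$COMPANY_NAME" | tr '[:upper:]' '[:lower:]')

echo "$COMPANY_NAME $COMPANY_NAME_LC"   # → LMC lmc

# Pass both flavors as separate job parameters (names illustrative):
# dsjob -run -param CompanyName=$COMPANY_NAME \
#            -param CompanyNameLC=$COMPANY_NAME_LC \
#            MyProject SeqFileExtract
```

The job then uses CompanyNameLC in the Filename property and CompanyName everywhere else, so no function support is needed at the stage level.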
-craig

"You can never have too many knives" -- Logan Nine Fingers
jdmiceli
Premium Member
Posts: 309
Joined: Wed Feb 22, 2006 10:03 am
Location: Urbandale, IA

Post by jdmiceli »

Hi Craig,

I have been experimenting with downcasing the variable as it is passed in from the shell script, and that part works OK now. However, I do still have another place that requires the upper-case version in the same job. Changing the case for passing from the shell is no biggie; the problem is when I need both in the same job, because I did not allow for multiple CompanyName parameters in the same job. I'll keep pounding.

Thanks for your input!
Bestest!

John Miceli
System Specialist, MCP, MCDBA
Berkley Technology Services


"Good Morning. This is God. I will be handling all your problems today. I will not need your help. So have a great day!"
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Sounds like what you really need is two job parameters - one for the file name and one for "elsewhere in the job". Your script/job sequence can take care of the casing.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
bollinenik
Participant
Posts: 111
Joined: Thu Jun 01, 2006 5:12 am
Location: Detroit

Re: Changing case of parameters on the fly

Post by bollinenik »

Hi,
The easiest way is to save the values for the source file name and the company name in a file, and pass them to the job parameters on the fly. You can do this either way: read the file before triggering the job and pass the values in, or read the file from within the job using a routine and set the parameters there.
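A rough shell sketch of the "read the file before triggering the job" variant (the parameter file's name and NAME=value layout are assumptions, not something the thread specifies, and the dsjob call is only illustrative):

```shell
#!/bin/sh
# Hypothetical parameter file, one NAME=value pair per line.
PARAM_FILE=$(mktemp)
cat > "$PARAM_FILE" <<'EOF'
CompanyName=LMC
SrcFile=lmc_int.csv
EOF

# Turn each pair into a -param argument for dsjob.
PARAMS=""
while IFS='=' read -r name value; do
    PARAMS="$PARAMS -param $name=$value"
done < "$PARAM_FILE"
rm -f "$PARAM_FILE"

echo "$PARAMS"
# dsjob -run $PARAMS MyProject SeqFileExtract   # illustrative call
```

Since the file is plain text, the script that builds it can apply whatever casing each consumer needs before the job ever starts.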
sbass1
Premium Member
Posts: 211
Joined: Wed Jan 28, 2009 9:00 pm
Location: Sydney, Australia

Post by sbass1 »

You could split the job logically as follows:

1) GetParameters (read your metadata to get your parameters)
2) UseParameters (your current "downstream" job)

Then call both jobs within a sequencer job.

Job 1) would read your table and set UserStatus based on your table data; the job sequence would then parse UserStatus and dynamically set the parameters for 2).

See viewtopic.php?t=125264&highlight= for more details.

I think it would require 2 parameters though.

However, as Craig said, you may be able to use one parameter, as long as it is in the format the file name needs (since functions aren't available at that point), and use functions to, for example, upcase it within your transformations.
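On the shell side, that handoff could look roughly like this. Note the "UPPER|lower" user-status layout, the field delimiter, and all job and parameter names are assumptions for illustration, not from the linked topic:

```shell
#!/bin/sh
# Hypothetical: GetParameters has finished and left its user status as
# "UPPER|lower". In a real setup you would fetch it via dsjob -jobinfo
# and extract the user status line; here it is hard-coded for the sketch.
USER_STATUS="LMC|lmc"

COMPANY_UC=$(echo "$USER_STATUS" | cut -d'|' -f1)
COMPANY_LC=$(echo "$USER_STATUS" | cut -d'|' -f2)

echo "$COMPANY_UC $COMPANY_LC"   # → LMC lmc

# dsjob -run -param CompanyName=$COMPANY_UC \
#            -param CompanyNameLC=$COMPANY_LC \
#            MyProject UseParameters   # illustrative call
```

In a sequence job the same split would be done with the sequence's expression support instead of shell, but the idea is identical: one source value, two pre-cased parameters.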

HTH,
Scott