Hi all,
I have a requirement to convert the existing query below into a DataStage job.
Please find below the sample query:
INSERT INTO Table_Name (A, B, C, D, E, F, G, H)
VALUES ((SELECT MAX(id) + 1 FROM id_table), '', 5, 'ABBEY', '', 1, 1, 'EFG');
How can I achieve inserting the first value (the SELECT from another table) in the DataStage job?
Kindly help.
Job from a Query
Thanks
Nibu Mathew Babu
Thanks Craig for the swift response, but there is a small change in the requirement.
If you look at the query above: the value for field A should come from the SELECT query, the values for B, C, D, E, F will come from a source file, and columns G and H will again come from the transformer (hard-coded).
The whole idea is that there are some fields from an input file, and we need to add some more fields with static values (hard-coded in the transformer) plus a key value for the row (which is our SELECT MAX(id) query).
Kindly suggest.
Thanks
Nibu Mathew Babu
For multiple records that need to start with MAX+1 and (I assume) keep incrementing with each row, I'd suggest a different approach. I'd look into sending that current MAX value into the job as a job parameter via a values file, storing it in the Initial Value of a stage variable and incrementing it for each row.
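To make the idea concrete, here's a minimal plain-Python sketch of that logic (this is illustrative only, not DataStage syntax; the column values and the starting MAX are assumed for the example):

```python
# In the real job, max_id would arrive as a job parameter read from a
# values file, populated beforehand by something like
# "SELECT MAX(id) FROM id_table". Hard-coded here for illustration.
max_id = 100

# Rows from the source file supply columns B..F (sample values assumed).
source_rows = [
    ("", 5, "ABBEY", "", 1),
    ("", 7, "CASTLE", "", 2),
]

output = []
next_id = max_id  # plays the role of the stage variable's Initial Value
for b, c, d, e, f in source_rows:
    next_id += 1  # incremented once per row, like the stage variable
    # G and H are the hard-coded transformer values from the query
    output.append((next_id, b, c, d, e, f, 1, "EFG"))

print(output)  # first row gets key MAX+1, second gets MAX+2, and so on
```

The point is just that the key never comes from a per-row subquery: the MAX is fetched once up front, and each row derives its key from the previous one.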
And, of course, take care here due to parallel processing. Heck, maybe even use a Server job for this; nothing about this sounds like anything that needs to run in parallel.
-craig
"You can never have too many knives" -- Logan Nine Fingers