Accepting SOAP over HTTP requests into DataStage
-
- Participant
- Posts: 104
- Joined: Sat Dec 24, 2005 1:26 am
- Location: Bengaluru
Hi all,
A DataStage job is deployed and exposed as a service, and will be invoked on demand by an external web service client.
I have a few questions:
1) Which DataStage stage should be used in order to accept the SOAP over HTTP requests?
2) Say the input request contains 5 columns; should the DataStage job then have 5 parameters defined?
3) Since the SOAP requests will carry an array/XML payload, which DataStage stage can be used to first read the SOAP, e.g. XML Input?
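To make question 2 concrete, here is a minimal sketch of the kind of SOAP envelope a client might send for a 5-column request, and how the five fields would be pulled back out on the receiving side. The service namespace, operation name, and column names below are hypothetical; the real names come from the WSDL that ISD generates for the deployed job.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/isd/MyService"  # hypothetical service namespace

# A hypothetical 5-column request body inside a standard SOAP 1.1 envelope.
envelope = f"""<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="{SOAP_NS}" xmlns:svc="{SVC_NS}">
  <soapenv:Body>
    <svc:processRow>
      <svc:col1>A</svc:col1>
      <svc:col2>B</svc:col2>
      <svc:col3>C</svc:col3>
      <svc:col4>D</svc:col4>
      <svc:col5>E</svc:col5>
    </svc:processRow>
  </soapenv:Body>
</soapenv:Envelope>"""

# Parse the body back out, as the service side conceptually does,
# yielding one value per defined column.
root = ET.fromstring(envelope)
body = root.find(f"{{{SOAP_NS}}}Body")
row = body.find(f"{{{SVC_NS}}}processRow")
columns = {child.tag.split("}")[1]: child.text for child in row}
print(columns)
```

So yes: one request element per column, and the structure (flat row vs. array) is fixed by how the operation is defined at deployment time.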
-
- Participant
- Posts: 104
- Joined: Sat Dec 24, 2005 1:26 am
- Location: Bengaluru
Putting it simply:
Are job parameters the only way to accept data from SOAP requests?
I am struggling to get the structured data into DataStage, so for the time being I am not 'grouping into a structure' at the server console, and I am accepting data successfully (as job parameters).
So the challenge is: if we expose the DataStage job to accept SOAP over HTTP requests 'grouped into a structure', how do we accept the data?
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
Not sure what you mean by "accept" here. Reading your question at face value I would have suggested the ISD Input stage, and don't understand how job parameters come into the picture at all.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
-
- Participant
- Posts: 104
- Joined: Sat Dec 24, 2005 1:26 am
- Location: Bengaluru
I thought "always on" was an option when you deployed the job rather than something you are forced into... however it's been a long dang time since I've played with them.
Anyway, why not have it always on? It can still be invoked "on demand" and you don't have to wait for the job to start and stop each time. Is there something about the job design that precludes that?
-craig
"You can never have too many knives" -- Logan Nine Fingers
-
- Participant
- Posts: 104
- Joined: Sat Dec 24, 2005 1:26 am
- Location: Bengaluru
Before deploying, I tried the options under 'Provider Properties' at the Server console, i.e.:
1) Active job instances or JDBC connections: min=1 (default), max=1
2) Idle time: min=60 min; max=0 sec (no changes)
3) Activation threshold: service requests=1
4) Request limit=1
Let me know if I missed any other setting... All I need is to run "on demand", though using the ISD Input stage.
You can't. If you want always on...use the ISDInput....if you want "start on demand" (meaning ...start the job, for EACH request coming in, and only when the request arrives), then use some other Stage as the starting point.
....in that case, where do you get the data from? Well, it can come from a database or sequential file source...or, what it sounds like you would like to do, is pass the whole "chunk" of data from the client.
You can do it, but it takes a technique. It won't be as simple as asking ISD to construct the array and proper WSDL for you [as it does when you use ISDinput for "always on"].
One technique, as we discussed above, is to have a dummy input, like a sequential stage that just reads a single row from a dummy file, and then in a downstream transformer, have a Job Parameter as the whole derivation...into a large varchar. Then, downstream from that, break it out any way you want, using pivot, xml, etc. ...all depending on how you want to construct that "array". Of course, the client tooling will have to package that "array" into a single row call, stuffing the "array" into a single textual value for passing as your "Job Parameter". The WSDL will NOT reflect that you are using arrays, but it's still do-able.
How often are you calling this Operation? Doing this makes sense when you call it very seldom (a few times a day only). Otherwise, use ISDinput...the response time will be many times faster.
Ernie
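The "stuff the whole array into one Job Parameter" technique Ernie describes above can be sketched in a few lines. The delimiters, column layout, and helper names here are assumptions for illustration only; the real choice of separators must be characters that cannot occur in the data, and the "unpack" side would actually be done in the job with a transformer, pivot, or XML stage.

```python
# Hypothetical sketch: the client flattens its "array" of rows into one
# delimited string and passes it as a single Job Parameter; the job later
# splits it back into rows and columns downstream.
ROW_SEP = "|"   # assumed row delimiter -- must never occur in the data
COL_SEP = ","   # assumed column delimiter

def pack(rows):
    """Client side: flatten a list of rows into one large varchar value."""
    return ROW_SEP.join(COL_SEP.join(cols) for cols in rows)

def unpack(param):
    """Job side: what a downstream transformer/pivot would reconstruct."""
    return [row.split(COL_SEP) for row in param.split(ROW_SEP)]

data = [["1", "Acme", "NY", "USA", "open"],
        ["2", "Initech", "TX", "USA", "closed"]]
param = pack(data)      # one textual value, suitable for a Job Parameter
assert unpack(param) == data
```

As Ernie notes, the generated WSDL will only show a single string parameter, not an array, so the client and the job must agree on this packing convention out of band.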
Ernie Ostic
blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
-
- Participant
- Posts: 104
- Joined: Sat Dec 24, 2005 1:26 am
- Location: Bengaluru
Thanks a lot, Ernie, but the issue if not using the ISD Input stage is that I can't see the "Accept Array" option while grouping into a structure.
I am able to accept one record (comma delimited) using a Sequential File stage / Row Generator in the job design ---> there I have the option to accept comma-delimited data.
I am now really feeling the need for premium content access... but I just need the right stage to start with (either to accept an array of data, or comma-delimited).
-
- Participant
- Posts: 104
- Joined: Sat Dec 24, 2005 1:26 am
- Location: Bengaluru
Updated in a similar topic at the PX forum...
Finally, I am able to accept an array of data from web services through the ISD Input stage, with the below provider settings at the IIS console to run the job only on demand:
Active job instances = 1 (max)
Service requests and request limit = 1 (max)
thus restricting the job from running always.
But there is a weird job status in the log, even though the job completed successfully:
Finished Job abcd.1
Attempting to Cleanup after ABORT raised in job abcd.1
Job abcd.1 aborted.
Let me know how to get rid of the last two log entries.
-
- Participant
- Posts: 104
- Joined: Sat Dec 24, 2005 1:26 am
- Location: Bengaluru
Jobs deployed under ISD as services are "multiple instance" and will be started and stopped continually, depending on traffic coming into your application. There are no particular stats in the Designer that you can expect to use...it's not a "batch" job in the common sense. Monitoring of Web Services (ISD) traffic is a whole other subject...
...each instance, when finishing, shouldn't abort, but that might just be a clean-up artifact, and might depend on what you are doing in the Job and its Stages.... is your client tool receiving the data as expected?
Ernie
Ernie Ostic
blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
...by the way....what kinds of SOAP clients are calling your ISD services?
Ernie
Ernie
Ernie Ostic
blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>