deploying the webservice

Dedicated to DataStage and DataStage TX editions featuring IBM<sup>®</sup> Service-Oriented Architectures.

Moderators: chulett, rschirm

Post Reply
kennyapril
Participant
Posts: 248
Joined: Fri Jul 30, 2010 9:04 am

deploying the webservice

Post by kennyapril »

Hello,
I designed a webservice in Information Services Director by invoking a job from DataStage. The job contains 3 ODBC stages connected to the database.
My question is: do I need to redeploy the service whenever the data in the database is updated?
I see a strange issue with the service: after data is loaded or updated in the database for my related tables, I cannot see any change in the response of the webservice.

Once I redeploy it, I can see the new data.

Is this the way it is supposed to work?
Regards,
Kenny
eostic
Premium Member
Posts: 3838
Joined: Mon Oct 17, 2005 9:34 am

Post by eostic »

We would need more details, but it sounds like (maybe) you have a parallel job that is using lookups......

by default, lookups in EE load all their data into memory when the job is initialized......

there is a lot going on here....it sounds like you might have a job with a WISD Input stage (which means it is "always on"), combined with regular lookups that bring all the data into memory. Indeed....they will not get updated data unless you disable/enable the job (you don't need to re-deploy...hit the edit button while in the Deployed App Workspace and you will see an enable/disable button at the bottom --- this will just allow you to stop and then restart the job).

Still...there are other patterns you can use. The simplest is to change the lookups to "sparse lookups" and they will always be current. Do some searches through the forum here if you haven't used Sparse Lookups...there are many entries on that subject.
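The difference Ernie describes can be sketched in plain Python, using sqlite3 as a stand-in for the ODBC source (the table and function names here are illustrative, not the DataStage API):

```python
import sqlite3

# Stand-in for the ODBC reference source (illustrative, not DataStage API).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE person (entity_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO person VALUES (1, 'Alice')")
conn.commit()

# Normal lookup: reference data is read into memory ONCE, at job start.
cache = dict(conn.execute("SELECT entity_id, name FROM person"))

def normal_lookup(entity_id):
    return cache.get(entity_id)      # never sees later database updates

# Sparse lookup: the database is queried once per incoming row.
def sparse_lookup(entity_id):
    row = conn.execute(
        "SELECT name FROM person WHERE entity_id = ?", (entity_id,)
    ).fetchone()
    return row[0] if row else None   # always current

# Data changes while the "always on" job is running...
conn.execute("UPDATE person SET name = 'Alice B.' WHERE entity_id = 1")
conn.commit()

print(normal_lookup(1))  # Alice     (stale in-memory copy)
print(sparse_lookup(1))  # Alice B.  (fresh per-row query)
```

This is why an always-on ISD job with normal lookups serves stale data until it is disabled and re-enabled, while a sparse lookup stays current at the cost of one query per request.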

Ernie
Ernie Ostic

blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
kennyapril
Participant
Posts: 248
Joined: Fri Jul 30, 2010 9:04 am

Post by kennyapril »

thanks,
I see the sparse lookup option only in the ODBC stage. Is there any other way to use a sparse lookup, i.e. by changing settings in the Lookup stage itself?
As I am using the general lookup, the data is stored in memory, so I need to disable and enable the service whenever the data is loaded.

If I use a sparse lookup I need not do that; I can just leave it as it is and the output will be the current data.

Right?
Regards,
Kenny
kennyapril
Participant
Posts: 248
Joined: Fri Jul 30, 2010 9:04 am

Post by kennyapril »

Is using the sparse lookup option in the ODBC stage the only change I have to make?
Regards,
Kenny
kennyapril
Participant
Posts: 248
Joined: Fri Jul 30, 2010 9:04 am

Post by kennyapril »

I used a sparse lookup and everything seems to be fine.

But why is it taking so long to deploy completely?

I monitored the job in Director and it took around 60 minutes to get deployed completely.

Can you please let me know what the cause of that is?
Regards,
Kenny
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

I'm afraid you might be confused about what exactly a web service is and the meaning of "deploy" in this context. It seems to me that when you say "deploy" you actually mean "run", especially if you are saying that step takes an hour. Sounds like you have just a regular PX job rather than anything truly WISD based. :?

Can you confirm your job design for us?
-craig

"You can never have too many knives" -- Logan Nine Fingers
kennyapril
Participant
Posts: 248
Joined: Fri Jul 30, 2010 9:04 am

Post by kennyapril »

The job I designed has WISD Input and WISD Output stages with lookups, ODBC connectors, and Transformers.
It is used as an information provider for a service from the Information Server console.
When I deploy the service, I see the job in Director always running, but I cannot test the service until all the records are completed
(I mean when I check it in the job monitor in the Director client).

Please correct me if I am wrong.
Regards,
Kenny
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

So... still confusing. Are you saying that you can only see the records in the target when you stop/disable the service? If that is the issue, can you confirm for us how you are handling commits in the target? Typically in a service you'd have to issue an explicit commit (or rollback) action when each 'packet' of data / unit of work has been completed or has encountered an error; you can't wait for the job to end like you would in traditional batch processing. Luckily the ODBC stage makes this fairly easy.

By the way, what actually is your target? All you've said so far is "the database". What kind of data are you processing? What response does it return? As Ernie noted earlier, we would need more details to be able to provide much in the way of cogent help.
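The per-unit-of-work commit pattern Craig describes can be sketched like this (sqlite3 as a stand-in for the target database; the function name is hypothetical):

```python
import sqlite3

# Stand-in target database (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE audit (request_id INTEGER, detail TEXT)")

def handle_request(request_id, rows):
    """Process one service request as one unit of work (hypothetical name)."""
    try:
        for detail in rows:
            conn.execute("INSERT INTO audit VALUES (?, ?)", (request_id, detail))
        # Commit per request, not at job end: an always-on job never "ends",
        # so uncommitted rows would stay invisible until the job is stopped.
        conn.commit()
    except Exception:
        conn.rollback()   # undo only this unit of work
        raise

handle_request(1, ["dob", "address"])
count = conn.execute("SELECT COUNT(*) FROM audit").fetchone()[0]
print(count)  # 2
```

The key point is the placement of `commit()`: inside the request handler rather than after the whole job, which is what batch-style jobs implicitly rely on.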
-craig

"You can never have too many knives" -- Logan Nine Fingers
kennyapril
Participant
Posts: 248
Joined: Fri Jul 30, 2010 9:04 am

Post by kennyapril »

thanks,

The target is not a database, it is a WISD Output stage. The data is not loaded anywhere; it is just a response to a request.

The request is an entity_id and the response is the details of that person.
I used two ODBC stages to pull the data for all entity_ids, and when an entity_id is given as a request to the webservice, it does a lookup against the details in ODBC and sends the details as a response.

As I said earlier, since the job is always running once I deploy it, I get the response when I send the request after deployment is complete.
I used the soapUI tool to test the service.
This works fine, but when I disable the service to make changes and enable it again, it takes another 60 minutes to deploy completely.
What I mean is: only once the job monitor shows it is complete can I test the service.
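For reference, the kind of request soapUI sends to an ISD endpoint is an ordinary SOAP envelope; a minimal sketch of building one follows. The operation name, namespace, and field name here are illustrative guesses, not the actual generated WSDL:

```python
from xml.etree import ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/isd/personService"   # hypothetical namespace

def build_request(entity_id):
    # Assemble a SOAP 1.1 envelope with one operation in the body.
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{SVC_NS}}}getPersonDetails")  # hypothetical op
    eid = ET.SubElement(op, f"{{{SVC_NS}}}entity_id")
    eid.text = str(entity_id)
    return ET.tostring(env, encoding="unicode")

xml = build_request(42)
print("42" in xml)  # True
```

soapUI generates an equivalent envelope from the deployed service's WSDL, so the names above will differ from what the tool actually shows.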

My issues are:

why does it take as long as 60 minutes to deploy the service?
and
I used a sparse lookup wherever the reference is an ODBC stage, and the stream for that lookup is also ODBC.
I also used a couple of Copy stages.

Could any of these be the reason for the issue?
Regards,
Kenny
lstsaur
Participant
Posts: 1139
Joined: Thu Oct 21, 2004 9:59 pm

Post by lstsaur »

The WISD Output stage is the exit point of your job, returning one or more rows to the client (consumer) as a service response (in your case, the details of that person).
So, where are "the details of that person" stored? (That's what Craig was referring to as the TARGET.) Are they stored in a flat file or a database?
kennyapril
Participant
Posts: 248
Joined: Fri Jul 30, 2010 9:04 am

Post by kennyapril »

The details are stored in the database and accessed through ODBC. There are two different ODBC stages used, and both are sparse lookup references.

In one ODBC stage the details of active entity_ids are stored, and in the other the details of inactive entity_ids are stored.

In my job, the lookup is done between the two ODBC stages, and only the reference link can be sparse, so reading all the data on the stream link of the lookup takes a long time.

Can I change anything in that case?
Regards,
Kenny
eostic
Premium Member
Posts: 3838
Joined: Mon Oct 17, 2005 9:34 am

Post by eostic »

Let's make sure we fully understand the Job and the terminology/functions that you are using in ISD.....

You say that you have two ODBC stages. My understanding is that they are both used as lookups.....and both are used as "sparse" lookups only. And there are no others.....

In ISD, you "deploy".....and then you can also "enable/disable".

Which is taking the hour --- "deploy" (when the little green thing is going back and forth) or "enable" [the first time, it does both...deploy and start the job...but after that if you "disable"...how long does it take to "enable"?]....

The only things that I've ever seen that will make deployment take a long time would be if the job was huge and the machine was tiny (hundreds of stages perhaps, on a 2 way older cpu box)......or a non-sparse lookup with a huge file or large number of stages leading to the lookup for the reference link..... are you sure things are working correctly as sparse?

Once it is up and running, how long does it take to get a response in SOAP UI?

Ernie
Ernie Ostic

blogit!
<a href="https://dsrealtime.wordpress.com/2015/0 ... ere/">Open IGC is Here!</a>
kennyapril
Participant
Posts: 248
Joined: Fri Jul 30, 2010 9:04 am

Post by kennyapril »

actually my job has two lookups:
for the first lookup, stream ------> ODBC and reference ------> ODBC,
so for the reference ODBC I used the sparse option, but for the stream ODBC I cannot use sparse.
for the second lookup, stream ------> Copy and reference ------> ODBC,
so for the reference ODBC I used the sparse option.


And on the point of the job running for a long time:
when I click DEPLOY, the small green indicator takes just 2 to 3 minutes,
and enable & disable also take just 1 to 2 minutes.

But when I look in the monitor of the job (in Director) that is used in this deployed service, it takes a long time to process the records from the ODBC stage on the stream link of the lookup.
That is what I meant by "long time" in my post!
Regards,
Kenny
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

But the problem is you've been using the terminology wrong, which confuses people and makes your actual issue harder to troubleshoot as we chase wild geese instead. So, now that things are a little clearer...

I don't know about anyone else but I'd appreciate it if we can take a step back and get an explanation of exactly what this 'service' is meant to do. What kind of information is sent to it? Your lookups are against what size of tables? How much (and what type) of information does your service return? I'm also wondering when you tested this initially without the WISD stages, how long did the job run with the same volume/type of input data?

Also, I'm still unsure what your job actually looks like and what you mean by phrases like "stream of lookup". Any chance you can draw us a "picture" of the job in ASCII art like you've seen many others do here in the past? That or upload a screenshot of the canvas to a file sharing site and then link it into your post? See this post for an example of what I mean.
-craig

"You can never have too many knives" -- Logan Nine Fingers
kennyapril
Participant
Posts: 248
Joined: Fri Jul 30, 2010 9:04 am

Post by kennyapril »

Actually, the job without the WISD stages also took the same time.

The lookup has 2 input links, so one is the stream and the other is the reference.

In my particular job, the stream and reference are both ODBC stages. I could only find the sparse option on the reference ODBC; there is no sparse option on the stream ODBC.

I assume that is the reason this job takes a long time to run completely.

My service returns all the details of a person who has the provided entity_id, which includes DOB, name, address, phone, code, ID, etc. All of these are pulled from two ODBC stages, and to include both ODBC stages I used a lookup (one ODBC for the stream of the lookup and the other for the reference of the lookup).

Code:

            COPY(reject)--------->transformer------------+
                 |                                       |
                 |                                       v
wisdinput--->lookup----->transformer-------------------->funnel--->wisdoutput
             |       |
             |       |
      ODBC(ref)   ODBC(stream)

I could not show it properly in the drawing: both the ODBC (reference) and the ODBC (stream) feed into the lookup, and on the reference ODBC I used sparse. Although it may look like two funnels, only one funnel is used, and that is linked to the WISD Output.

As you can see from that ODBC stream stage, the job takes a long time to read all the records. Each of the two ODBC stages holds around 1.5 million records.
Regards,
Kenny