
Getting Job Statistics

Posted: Tue Aug 31, 2004 12:38 pm
by CelesteT
I am creating a parallel job that takes data from a sequential file, passes it through a Transformer and puts it into an Informix database.

The transformer also contains a link to a sequential file that captures all the rejected records.

I need to create a sequential file that captures the job name, the job's start and end times, the total number of records read from the sequential file and the number of records loaded into the database successfully. I need to capture these statistics for about 100 similar jobs, so I need something generic that I can reuse for all of them.

Is there a built-in component I can reuse? What is the best strategy to do this?

Using User defined Routine

Posted: Tue Aug 31, 2004 9:50 pm
by mandyli
Hi CelesteT ,


This is possible using a user-defined routine. Use the following code to get the job name and the job's start and end times:

[code]JobStarted = Convert(" :-", "_", DSGetJobInfo(JobHandle, DSJ.JOBSTARTTIMESTAMP))
Alias = ""
If JobInvId EQ "" Then
   Alias = DSGetIdForJob(JobName, ErrorCode)
End
If Alias NE "" Then
   FileName = JobName : "_" : Alias
End Else
   FileName = JobName : "_" : JobStarted
End[/code]
--- Getting the required counts from the links of the job

[code]SourceCount = DSGetLinkInfo(JobHandle, "SEQL_File", "InTo_TRNS", DSJ.LINKROWCOUNT)
TargetCount = DSGetLinkInfo(JobHandle, "TRNS_Loading", "Out_Insert", DSJ.LINKROWCOUNT)

FileName := Suffix

ReportText = DSMakeJobReport(JobHandle, ReportType, "LF")

Audit_info = "Source count" : " " : SourceCount : " " : "Target count" : " " : TargetCount

* Alternatively, parse the required fields out of the job report:
Audit_info = '"' : Field(Trims(Field(ReportText, Char(10), 3)), " ", 5) : '",'
Audit_info := '"' : Right(Trims(Field(ReportText, Char(10), 5)), 19) : '",'
Audit_info := '"' : Right(Trims(Field(ReportText, Char(10), 6)), 19) : '",'
Audit_info := '"' : Right(Field(Trims(Field(ReportText, Char(10), 8)), " ", 2), 1) : '"'[/code]

--- Here we write all the information to the sequential file

[code]OpenPath DirName To t.fvar Then
   Write Audit_info To t.fvar, FileName Else
      Call DSLogWarn("Failed to write file " : FileName : " to directory " : DirName, "DoJobReport")
   End
End Else
   Call DSLogWarn("Failed to open directory " : DirName, "DoJobReport")
End[/code]

Please try this.

Re: Using User defined Routine

Posted: Wed Sep 01, 2004 6:46 am
by CelesteT
I am still new to DS... how do I create a user-defined routine?

Thanks!

Posted: Wed Sep 01, 2004 4:12 pm
by ray.wurlod
Mandy's code can be implemented either in job control (in a server job) or in a Routine, preferably of the before/after genre. It cannot be implemented directly in a parallel job.

Typically you run the parallel job, then use this kind of processing from job control (or through a Routine Activity in a Job Sequence) to interrogate the job (and, possibly, its stages, links and parameters) to discover what happened.
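
A before/after routine of that genre is just a BASIC subroutine with an input argument and an error code; inside it, DSJ.ME refers to the job that is running. A minimal after-job sketch (the routine, stage and link names here are illustrative, not taken from the posts above):

[code]Subroutine CaptureStats(InputArg, ErrorCode)
* After-job subroutine sketch: DSJ.ME addresses the current job.
ErrorCode = 0
JobName = DSGetJobInfo(DSJ.ME, DSJ.JOBNAME)
Started = DSGetJobInfo(DSJ.ME, DSJ.JOBSTARTTIMESTAMP)
Ended = DSGetJobInfo(DSJ.ME, DSJ.JOBLASTTIMESTAMP)
* Stage and link names below are placeholders for your own.
SourceCount = DSGetLinkInfo(DSJ.ME, "SEQL_File", "InTo_TRNS", DSJ.LINKROWCOUNT)
Call DSLogInfo(JobName : " " : Started : " " : Ended : " " : SourceCount, "CaptureStats")
Return[/code]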

job control

Posted: Wed Sep 08, 2004 7:43 am
by paulhill20
[quote="ray.wurlod"]Mandy's code can be implemented either in job control (in a server job) or in a Routine, preferably of the before/after genre. It cannot be implemented directly in a parallel job.

Typically you run the parallel job, then use this kind of processing from job control (or through a Routine Activity in a Job Sequence) to interrogate the job (and, possibly, its stages, links and parameters) to discover what happened.[/quote]

Ray,
I read the second sentence, and as a DataStage newbie using Parallel only, my questions are:
Is the job control written in C, C++ or BASIC? (The standard programmatic example from DataStage is in C for Server, not Parallel.)
If it is in C, does it use the same approach as the standard programmatic example (since dsapi.h only exists in the Server include directory, not the Parallel one)?
Do you have a sample you could share (something as simple as opening the project and querying for the jobs in it) for Parallel?
I appreciate any assistance you can send my way.
Stephen de Vries

Posted: Wed Sep 08, 2004 4:41 pm
by ray.wurlod
The job control code, if hand-crafted, is performed in a Server Job, on the Job Control tab in the job properties window. It's written in DataStage BASIC.

Within job control, you can attach jobs, set their parameter values, request their run, wait for them to finish, interrogate their status and statistics, and detach them. These controlled jobs can be any kind of job: job sequence, server or parallel.
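
For illustration, a minimal hand-crafted job-control sketch along those lines (the job, parameter, stage and link names are placeholders; error checking is omitted):

[code]* Run a parallel job from server-job job control and collect its stats.
JobHandle = DSAttachJob("MyParallelJob", DSJ.ERRFATAL)
ErrCode = DSSetParam(JobHandle, "TargetDB", "dev_informix")
ErrCode = DSRunJob(JobHandle, DSJ.RUNNORMAL)
ErrCode = DSWaitForJob(JobHandle)

* Interrogate the finished job.
Status = DSGetJobInfo(JobHandle, DSJ.JOBSTATUS)
Started = DSGetJobInfo(JobHandle, DSJ.JOBSTARTTIMESTAMP)
RowsIn = DSGetLinkInfo(JobHandle, "SEQL_File", "InTo_TRNS", DSJ.LINKROWCOUNT)

ErrCode = DSDetachJob(JobHandle)[/code]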

Querying the project for the names of jobs is not something you'd ordinarily do in a job control routine, but you can.

Code:

DSGetProjectInfo(DSJ.JOBLIST)
returns a comma-delimited list of jobs in the project. You can process the job names individually to determine whether they are server, parallel or sequence jobs (for example by querying the DS_JOBS hashed file, JOBTYPE or JOBTYPEIND column). In the following example, error processing has been omitted for clarity.

Code:

Open "DS_JOBS" To DSJobs.fvar
Then

   * Obtain list of job names in project and convert to dynamic array
   JobList = Convert(",", @FM, DSGetProjectInfo(DSJ.JOBLIST))

   * Process job names one at a time.
   Loop

      * Get next job name from dynamic array.
      Remove JobName From JobList Setting MoreJobs

      * Read that job's record from DS_JOBS file opened earlier.
      * This is not a read for update, so no lock is obtained.

      Read DSJobsRecord From DSJobs.fvar, JobName
      Then

         JobType = DSJobsRecord<5> ; * from file dictionary

         * Add whatever other processing you need in the Case block.

         Begin Case

            Case JobType = 0 Or JobType = ""
               * job JobName is server job

            Case JobType = 1
               * job JobName is parallel job

            Case JobType = 2
               * job JobName is Job Sequence

         End Case

      End

   While MoreJobs
   Repeat

   Close DSJobs.fvar

End

Posted: Wed Sep 08, 2004 6:09 pm
by kduke
If you go to my web site below or to ADN, you can download jobs that will get all the row counts from all of your jobs (unless you just want to learn all this yourself). EtlStats.zip has the dsx and all the create-table scripts to post these to an Oracle table, but you can easily switch this to any other database.