Lookup job aborts

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

Hope
Participant
Posts: 97
Joined: Sun May 13, 2007 2:51 pm
Contact:

Lookup job aborts

Post by Hope »

I am having a lookup problem in my job. The reference lookup file contains about 30,000 records. The job works fine for a few thousand records, but when the file size is increased the job aborts. I tried loading the data from the file into a table and doing a sparse lookup, but that didn't work either.
This is the error
LKP_SAP_No,0: Could not map table file "D:/IBM_DQS_Config/Dataset/lookuptable.20090626.rknphzd (size 1261666600 bytes)": Not enough space.
LKP_SAP_No,0: Error finalizing / saving table /D=/ISDS/dynLUT405668aa2ae1

Please advise.
Last edited by Hope on Fri Jun 26, 2009 7:39 am, edited 1 time in total.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

You seem to have run out of disk space in your Dataset directory at runtime. Increase the space, or use an additional mount point with enough room for your datasets.
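As a quick sanity check (a generic sketch, not a DataStage utility; `shutil.disk_usage` is standard Python), you can compare the free space on the drive holding the dataset directory against the table size reported in the job log (1,261,666,600 bytes):

```python
import shutil

# Size of the lookup table file reported in the job log, in bytes
table_size = 1_261_666_600

# Free space on the drive holding the dataset directory.
# "/" is a placeholder; on the Windows server use e.g. "D:/".
usage = shutil.disk_usage("/")
print(f"free: {usage.free / 2**30:.1f} GiB, "
      f"needed: {table_size / 2**30:.1f} GiB")
print("enough space" if usage.free > table_size else "NOT enough space")
```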
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

How does "DEV" relate to "D:/IBM_DQS_Config/Dataset/"?
Hope
Participant
Posts: 97
Joined: Sun May 13, 2007 2:51 pm
Contact:

Post by Hope »

I mean to say that we have 70GB of space on D:/, and that 70GB covers all parent and child folders. Do we need to have space in the "Dataset" folder specifically? Please advise.
priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

Hope wrote:I mean to say that we have 70GB of space on D:/, and that 70GB covers all parent and child folders. Do we need to have space in the "Dataset" folder specifically? Please advise.

Monitor space while the job is running.
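A minimal way to do that (a hypothetical helper, not part of DataStage) is to sample the free space on the dataset and scratch locations while the job runs:

```python
import shutil
import time

def monitor_free_space(paths, interval=5, samples=3):
    """Print free space for each path every `interval` seconds."""
    for _ in range(samples):
        for p in paths:
            free_gib = shutil.disk_usage(p).free / 2**30
            print(f"{p}: {free_gib:.1f} GiB free")
        time.sleep(interval)

# Paths are examples; point these at the dataset and scratch directories.
monitor_free_space(["/"], interval=1, samples=1)
```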
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
priyadarshikunal
Premium Member
Posts: 1735
Joined: Thu Mar 01, 2007 5:44 am
Location: Troy, MI

Post by priyadarshikunal »

also which drive is used as Scratch?

There are a lot of factors which can create such problems.

Other jobs/ other stages on the same job may fill the available space.
Priyadarshi Kunal

Genius may have its limitations, but stupidity is not thus handicapped. :wink:
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

And just to address the last question: there is no such thing as space "in a folder". On Windows, space is managed at the drive-letter level and is available to any folder on that drive.
-craig

"You can never have too many knives" -- Logan Nine Fingers
siauchun84
Participant
Posts: 63
Joined: Mon Oct 20, 2008 12:01 am
Location: Malaysia

Post by siauchun84 »

May I know how many nodes you are using for your job?
miwinter
Participant
Posts: 396
Joined: Thu Jun 22, 2006 7:00 am
Location: England, UK

Post by miwinter »

Hope - sounds like you are doing a database lookup, correct?

Leonie - can you confirm what kind of lookup you were doing too please (Was it sparse or normal? Was it to dataset or database?)

Mark
Mark Winter
<i>Nothing appeases a troubled mind more than <b>good</b> music</i>
Leonie
Participant
Posts: 16
Joined: Thu Jul 31, 2003 7:16 am
Location: South Africa
Contact:

Post by Leonie »

Hi Mark

All sources are datasets or sequential files, no databases.

Leonie
miwinter
Participant
Posts: 396
Joined: Thu Jun 22, 2006 7:00 am
Location: England, UK

Post by miwinter »

OK, thanks Leonie. I was looking at the angle of this being related to the temp directory rather than to the dataset or scratch space. Could you check what TMPDIR is set to and verify the space there during the run?
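One way to verify that (a generic sketch; `tempfile.gettempdir` honours TMPDIR on most platforms) is:

```python
import os
import shutil
import tempfile

# TMPDIR if set, otherwise the platform's default temp directory
tmpdir = os.environ.get("TMPDIR") or tempfile.gettempdir()
free_gib = shutil.disk_usage(tmpdir).free / 2**30
print(f"temp directory: {tmpdir} ({free_gib:.1f} GiB free)")
```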

Mark
Mark Winter
<i>Nothing appeases a troubled mind more than <b>good</b> music</i>
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Here is a tip to reduce the size of your lookup so that it might fit within the 512MB block limit: remove the length specifications from the VarChar fields.
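The numbers in the original error message hint at why this helps: 30,000 reference rows producing a 1.26 GB table works out to roughly 42 KB per row, which suggests the VarChar columns are being stored at their full declared length rather than their actual length. A quick check of that arithmetic:

```python
table_bytes = 1_261_666_600   # size from the job log
rows = 30_000                 # approximate reference row count

bytes_per_row = table_bytes / rows
print(f"~{bytes_per_row:,.0f} bytes per row")  # ~42,056 bytes per row
```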
miwinter
Participant
Posts: 396
Joined: Thu Jun 22, 2006 7:00 am
Location: England, UK

Post by miwinter »

Good point Arnd, same principle as with any dataset.

What is the 512MB limit though? What imposes this?
Mark Winter
<i>Nothing appeases a troubled mind more than <b>good</b> music</i>
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

That limit is set in the executable and can be modified (with ldedit on AIX), but I am sure there are drawbacks to doing so; otherwise the limit would be set higher in all jobs.