Filesize restrictions on filesets

Nic
Charter Member
Posts: 24
Joined: Mon Sep 26, 2005 1:08 pm
Location: UK

Filesize restrictions on filesets

Post by Nic »

I am trying to create a lookup file set. This normally works (the average source is around 5 million rows), but it fails when I try to create it from an 11 million row source. The job reports that it cannot map the table and that there isn't enough space, yet when we check the space on the server/project there is more than enough free space. Are there any limitations on creating file sets?
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

No size limit for creating File Sets (well, actually there is: no more than 10,000 physical files, each no larger than 2 GB, per processing node, but you're nowhere near that!).
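For a sense of scale, those two caps multiply out to a very large per-node ceiling. A quick back-of-envelope check (plain shell arithmetic, treating "GB" loosely as the limit is stated):

```shell
#!/bin/sh
# Theoretical File Set ceiling per processing node:
# 10,000 physical files x 2 GB each.
files_per_node=10000
gb_per_file=2
echo "$((files_per_node * gb_per_file)) GB per node"   # prints: 20000 GB per node
```

An 11 million row source is nowhere near 20,000 GB per node, so the on-disk limit is not the problem here.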

"Map" suggests that there's some kind of problem with memory capacity: possibly you're sorting, or possibly it's in building the index for the lookup file set (LFS). Does the total volume of data in the LFS fit in memory? (That's usually one of the criteria for using a Lookup stage.) Can you monitor memory usage while this job is running?
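One rough way to watch memory while the job runs is to take periodic samples of system free memory and the resident size of the engine processes. A minimal sketch (assuming a Linux node and that the PX engine processes are named "osh", which may differ on your install):

```shell
#!/bin/sh
# One sample of memory pressure on the node; run it repeatedly
# (e.g. via `watch -n 5 ./memsample.sh`) while the job executes.

# System-wide free memory, in MB.
free -m

# Total resident set size (RSS) of all processes whose command name
# matches "osh" (assumed PX engine process name), in KB.
ps -eo rss,comm | awk '$2 ~ /osh/ {sum += $1} END {printf "osh RSS total: %d KB\n", sum+0}'
```

If the sampled RSS climbs toward the node's physical memory around the point the job fails, that supports the "index doesn't fit in memory" theory.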
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.