Load all Hashed file into memory
Moderators: chulett, rschirm, roy
Hello,
We have enough memory to load our entire project.
1) Is there a way to load all hashed files in the project directory into memory before we start our ETL run?
2) If yes, would it by any chance alter the way our jobs behave? We have a few jobs that just read the hashed files, a few that just write to them, and a few that read and write to the same hashed file in the same job.
3) Once our ETL run completes is there a way to write all the hashed file contents present in memory back to the disk?
Thanks for your help
-Mav
Re: Load all Hashed file into memory
I doubt .....No
DS User
-
- Participant
- Posts: 54607
- Joined: Wed Oct 23, 2002 10:52 pm
- Location: Sydney, Australia
- Contact:
There's no limit until you start running out of memory. However, every byte of memory you use for cached hashed files is memory that is not available for any other purpose. So the more you use, the less you have for other tasks. The whole thing's a trade-off - supply and demand.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
I'm really curious what the goal is here and I'm wondering if all this curious energy isn't being misdirected. I get the impression you may not understand the use of "shared cache"... from what I recall from my play time with it many years ago, it is a way to cache hashed files into memory that many jobs would typically have open simultaneously. That way, rather than having five jobs caching five separate copies of hashed file X into memory, they could all share one copy. It's not really meant to be a place where every single hashed file you might want to use over the course of a run could be "preloaded".
At least that's what I recall.
At least that's what I recall.
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
Craig,
When I initially posted my question I wanted to load every single hashed file into memory, but after I stumbled upon "System caching is not intended to be used if only a single stage is creating or reading the file." in the PDF, I realized my approach was wrong.
Craig/Ray,
A few more questions:
1) Is this private link caching? If so, can the hashed file be preloaded into memory? Caching via the Hashed File stage gives you 999 MB as the limit.
2) Is the 999 MB limit for a single Hashed File stage or for the entire project?
Thank you.
This is an interesting question. I'm waiting for my premium membership to kick in to read all the responses so far, so I don't really know what answer you have got in detail just yet.
You asked about cataloging hashed files in bulk. What do you mean by catalog in this case?
Choose a job you love, and you will never have to work a day in your life. - Confucius
@Eric,
Sorry for my late response. By catalog I presume loading a hashed file into memory is meant. Please read through the "Disk Caching Guide" manual for more information. I'm still an amateur on this topic, hence I'm trying to get clarifications from the DS gurus by asking a lot of (possibly dumb) questions.
@Ray/Craig,
I have a job which looks up against a huge hashed file (< 999 MB). This hashed file is used only by this job, hence I'm using private link caching and have also increased the Read Cache Size to 999 MB in Administrator. Can this hashed file be cataloged (preloaded into memory) even before the job starts? If so, can you please tell me the steps?
Thank you
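For what it's worth, one crude trick that is sometimes used outside DataStage entirely is to read the hashed file once before the run, so the operating system's page cache is already warm when the job opens it. A sketch, assuming a dynamic (type 30) hashed file whose data lives in `DATA.30` and `OVER.30` under the hashed file's directory (that layout is an assumption; verify against your own files):

```python
import os

def warm_file(path, chunk=1 << 20):
    """Read a file sequentially and discard the data, so the OS
    page cache holds it when the real job opens it later."""
    total = 0
    with open(path, "rb") as f:
        while True:
            buf = f.read(chunk)
            if not buf:
                break
            total += len(buf)
    return total

def warm_hashed_file(hf_dir):
    # A dynamic hashed file is a directory; DATA.30 and OVER.30 hold
    # the data and overflow groups (assumed layout -- check locally).
    warmed = 0
    for name in ("DATA.30", "OVER.30"):
        p = os.path.join(hf_dir, name)
        if os.path.exists(p):
            warmed += warm_file(p)
    return warmed
```

Note this only warms the operating system's file cache, not DataStage's read cache, and the pages can be evicted again under memory pressure, so treat it as a best-effort hint rather than a guarantee of preloading.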