I am kind of new to DataStage.
In one of my jobs we read a sequential file and write it to 3 hash files. When I load a test file with a record count of 15,000 it runs fine, but when I load the whole file of around 55,000 records, the job aborts after 50 records are found with more columns than expected. I know it's not appropriate to just raise the warning limit. The production jobs are running fine; I am only getting this error in dev while testing, because I want to test using the whole file. Is it OK to ignore the warnings as long as the job runs successfully? This is the log:
DataStage Job 356 Phantom 18791
Job Aborted after 50 errors logged.
Attempting to Cleanup after ABORT raised in stage CBCOMBCreateHashfiles..T1
DataStage Phantom Aborting with @ABORT.CODE = 1
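One way to narrow this down might be to scan the file outside DataStage first and find the records carrying extra columns. A rough Python sketch of that check; the file name, delimiter, and expected column count below are placeholders, not my real layout:

```python
# find_bad_rows.py -- hypothetical names; adjust to the real file layout.
# Prints the line number and column count of every record that has more
# columns than expected, so the offending rows can be inspected directly.

EXPECTED_COLS = 12        # placeholder: the file's actual column count
DELIMITER = "|"           # placeholder: the file's actual delimiter

with open("input.seq", encoding="utf-8") as f:   # placeholder file name
    for lineno, line in enumerate(f, start=1):
        cols = line.rstrip("\n").split(DELIMITER)
        if len(cols) > EXPECTED_COLS:
            print(f"line {lineno}: {len(cols)} columns")
```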
Thanks all of you.
Mukta
:-)