IQ12 bulk load on Linux

Post questions here relative to DataStage Server Edition for such areas as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

Tania
Participant
Posts: 22
Joined: Tue Jul 13, 2004 7:54 am
Location: Johannesburg

IQ12 bulk load on Linux

Post by Tania »

Hi,

DataStage version 7.5.1A
OS: RedHat Ent Linux 3 update 3

I have a server job that reads from a sequential file, through a transformer, to a Sybase IQ 12 bulk load stage. The load stage is set for manual load. According to Director, all rows are written successfully, but if I look at the data in the LOADFILE it only has 4834 records out of 7671, and the last record has been truncated.

Sybase IQ reports a warning in the message log: "Warning: Partial input record (24 bytes) skipped at EOF".

If I run the same job but writing to a sequential file all rows are written with no issues.


Cheers
Tania
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Hello Tania,

my first thought would be to look at your sequential file output at the line that was truncated in the bulk loader and see if there are any non-displayable characters or other oddities in that line. You can use your favoured binary editor or (my usual method) "cat -v" on that data. My suspicion is that the issue is data related. Also, how large is the file? (Some magical power-of-two value might suggest a possible reason as well.)
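For example (a sketch only; the file name is a placeholder, and the demo data below merely mimics the symptom of a stray control character plus a truncated final record), "cat -v" and "od -c" make the problem bytes visible:

```shell
# Build a small demo file with an embedded BEL control character and a
# truncated last record (no trailing newline), mimicking the symptom.
printf '2005-01-01\n2005-01-02\a\n2005-01-0' > /tmp/loadfile_demo

# cat -v renders non-printable bytes visibly (the BEL shows up as ^G):
cat -v /tmp/loadfile_demo

# od -c dumps the raw bytes of the final (partial) record:
tail -c 16 /tmp/loadfile_demo | od -c
```

In practice you would run these against the real LOADFILE, focusing on the line where the truncation occurs.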
Tania
Participant
Posts: 22
Joined: Tue Jul 13, 2004 7:54 am
Location: Johannesburg

Post by Tania »

Hi Arnd,

As far as I can tell there aren't any non-displayable characters or other strange things in the source file. There is only one column, containing dates to build a calendar. In the transformer we then use the dates to create some other date fields: year, quarter, etc.

The file is not large at all, about 90 KB. If I replace the bulk loader with a sequential file stage, everything works fine.

Thanks for the help
Tania
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Does it abort at the same place each time? If you change the source file order by sorting, does the job abort in the same place?
Tania
Participant
Posts: 22
Joined: Tue Jul 13, 2004 7:54 am
Location: Johannesburg

Post by Tania »

Yes, the job stops writing to the load file at the same place relative to the amount of data in the source file. The job itself doesn't abort or give warnings.

As the source file is a calendar, it grows by one row every day. About 62% of the rows are written by the bulk loader. Also, sometimes some of the columns in the load file are padded with spaces instead of data :?

I suspect the problem lies with the bulk loader and the OS. The same job runs perfectly on a Windows box.
ArndW
Participant
Posts: 16318
Joined: Tue Nov 16, 2004 9:08 am
Location: Germany
Contact:

Post by ArndW »

Tania,

there must be some other factor causing this problem; it isn't a normal limitation of the bulk loader file creation to simply stop writing.

- What about disk space in the directory the file is being created in?
- If you clear the &PH& directory in the project and look at the files created during a run, are there any additional messages visible?
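A sketch of that second check (the project path below is a placeholder for your actual project directory):

```shell
# List the newest phantom logs in the project's &PH& directory after a run.
# The project path is a placeholder - substitute your own.
PHDIR='/opt/Ascential/DataStage/Projects/myproj/&PH&'
ls -t "$PHDIR" | head -5                         # five newest phantom logs
tail -n 20 "$PHDIR/$(ls -t "$PHDIR" | head -1)"  # messages from the latest one
```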
roy
Participant
Posts: 2598
Joined: Wed Jul 30, 2003 2:05 am
Location: Israel

Post by roy »

Hi,
Since you also say it works fine on a Windows install, could it be you've found a bug in the Linux version of DS?!
I'd recommend contacting your DS support.

Please post your final findings,
Roy R.
Time is money but when you don't have money time is all you can afford.

Search before posting:)

Join the DataStagers team effort at:
http://www.worldcommunitygrid.org
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Do you preserve the bulk loader data file created by DataStage? If so, does it contain all the rows? If so, the problem is in loading Sybase IQ, and you need to check the bulk loader log file to find out why.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Tania
Participant
Posts: 22
Joined: Tue Jul 13, 2004 7:54 am
Location: Johannesburg

Post by Tania »

"Do you preserve the bulk loader data file created by DataStage?"
Yes. I have the bulk loader set to manual load and look at the file before doing the load; it doesn't contain all of the rows. When I do the load into Sybase IQ, all of the rows in the data file are written, so I don't think the problem is with the loading. It seems to go wrong when the bulk load stage creates the data file.
"could it be you found a bug in the linux version you have of DS?!"
I really hope not :cry: but I will contact DS support as soon as possible.

Thanks for the help - I'll update with any findings

Tania
Tania
Participant
Posts: 22
Joined: Tue Jul 13, 2004 7:54 am
Location: Johannesburg

Post by Tania »

Hi,

The latest on this issue: the client where I had this problem decided to go with a different OS, so we never got anywhere with Ascential's support.

Now the same thing has happened at another site. This site is committed to going the Linux route. The same job ran fine on an earlier version, DS 7.5, but on 7.5.1A there are problems with the Sybase IQ 12 Bulk Load stage.

I set the Bulk Load stage to 'manual' load. The Director log claims about 133000 rows were written. The data file only contains about 400 rows, and some of the fields are NULL/empty when they shouldn't be.

I can load the data file into Sybase IQ with no complaints, but I'm still short about 132600 rows.

It would be great to know if anyone else has experienced this sort of behaviour. If Ascential comes back with anything I'll update this post, but it looks like we're going to downgrade from 7.5.1A to 7.5A :(

Cheers

Tania
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

The Director log tells you how many rows were written to the data file. Can you check this and confirm (or otherwise) that all the lines are there? Use the wc command.

If they are, the problem is not in DataStage per se; it's either in the bulk loader itself or in how DataStage invokes it. The easiest way to eliminate one of these is to execute the bulk load command manually, using the control file generated by DataStage and/or a control file that you create yourself.
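Something along these lines (the file name is a demo placeholder, and the table/column names in the commented LOAD statement are purely illustrative):

```shell
# Demo: count the rows actually present in the data file and compare the
# result against the row count the Director log reports.
printf '2005-01-01\n2005-01-02\n2005-01-03\n' > /tmp/loadfile_demo.dat
wc -l < /tmp/loadfile_demo.dat   # 3 here; should equal the Director count

# If the counts match, try the load outside DataStage, e.g. an IQ
# LOAD TABLE statement run from dbisql (names are illustrative):
#   LOAD TABLE calendar ( cal_date '\n' )
#   FROM '/tmp/loadfile_demo.dat'
#   ESCAPES OFF QUOTES OFF;
```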
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Tania
Participant
Posts: 22
Joined: Tue Jul 13, 2004 7:54 am
Location: Johannesburg

Post by Tania »

Hi,

The Director says that 133000 rows were written to the data file.

The data file only has 452 rows in it; I just checked again.

When I manually load the data file into the database, only 452 rows are loaded, so everything that is in the data file makes it into the database.

By "execute the bulk load command manually" you mean execute the SQL to load the database - right?

Thank you!
Tania
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia
Contact:

Post by ray.wurlod »

Sounds like a problem with the stage then. I don't want to insult your intelligence, but I have to ask: can you please verify that you're checking the same data file the stage is writing to?
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Tania
Participant
Posts: 22
Joined: Tue Jul 13, 2004 7:54 am
Location: Johannesburg

Post by Tania »

Hi Ray,

No insult taken, and yes, I am looking at the same data file that the job is writing to.

Still no news from the support people on this.

Thanks
Tania
dweller
Participant
Posts: 6
Joined: Fri Jan 06, 2006 1:43 am

Post by dweller »

I've seen this before at a customer site as well. Given that IBM has taken over Ascential and all its good tools, and the uncertainty over whether IBM will fix this any time soon, we've decided to go to a Windows platform instead. What a pity. All is running fine on Windows, though. I'll be waiting for IBM's response to this issue. :shock:

Dweller.
I did it right the first time !
Post Reply