NLS Mapping Error

Post questions here related to DataStage Server Edition, covering areas such as Server job design, DS Basic, Routines, Job Sequences, etc.

Moderators: chulett, rschirm, roy

tonystark622
Premium Member
Posts: 483
Joined: Thu Jun 12, 2003 4:47 pm
Location: St. Louis, Missouri USA

NLS Mapping Error

Post by tonystark622 »

I'm getting the following warning message when writing to a text file:

LoadPartEffectivity..part_app_debug.part_app: nls_map_buffer_out() - NLS mapping error, row 106 (approx), row = "81349 M39029/1-101 F/A-18E/F A05FAAAXAAAE99 01 P "
I am not explicitly using any maps other than the default, and as far as I know the source file doesn't have any unusual characters in it. But I may be mistaken about that...

I copied the line above into a hex editor and saw that the odd characters visible at the end of the line are hex 0x19. I'm not sure where these are coming from...
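In case it's useful to anyone, a quick server routine along these lines can dump the character codes in a field so control characters stand out (just a sketch, using the usual Arg1 argument name):

   * Sketch of a test routine: returns the decimal character codes
   * in Arg1, separated by spaces, so control characters stand out.
   Ans = ''
   For i = 1 To Len(Arg1)
      Ans := Seq(Arg1[i, 1]) : ' '
   Next i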

How can I tell which characters are causing the problem?

Thanks for your help,
Tony
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

0x19 is Ctrl-Y. Did whoever created the data have to use this key to control when/where data are placed in the application and, if so, can this be prevented?
It's very surprising that Ctrl-Y generates a mapping error; as it's in the C0 control set, it should be included in every possible map. What map is specified:
(a) for the column
(b) for the stage
(c) for the job
(d) as the project default
One possibility is that a lead byte has been found where a trailing byte of a multi-byte character was expected, or vice versa. You can determine this from your hex dump.
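If you'd rather check inside DataStage than in an external hex editor, an Oconv in a Transformer derivation will render the value as hexadecimal (link and column names here are illustrative):

   * Hex-dump a column so a stray lead byte is easy to spot:
   Oconv(DSLink3.TheColumn, 'MX0C')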

Ray Wurlod
Education and Consulting Services
ABN 57 092 448 518
tonystark622
Premium Member
Posts: 483
Joined: Thu Jun 12, 2003 4:47 pm
Location: St. Louis, Missouri USA

Post by tonystark622 »

Hi Ray, thanks for the reply.

The mapping is as follows:

1) column - Allow per-column mapping is not checked and I have not explicitly set any mapping on these columns.

2) stage - The NLS tab on the stage says "Project Default (ISO8859-1)"

3) job - The NLS tab on the job properties says "Project Default (ISO8859-1)"

These are all on error-record links (I'm catching error rows), so perhaps the column doesn't have good data in it. I'm seriously considering disabling mapping for these error rows: with mapping enabled I get errors both when writing to the flat files I'm using for troubleshooting and when writing these rows from the error-log hash file to a flat-file log.

I just hate to have a problem like this in the job and not resolve it. I feel like it will come back and bite me later. :)

Ah, well. Have a good day, Ray.

Tony
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

The correct mapping for hashed files is always NONE, since these are an internal part of DataStage (no further mapping is required). This is not part of your problem - just an observation. The ISO8859-1 map should be able to handle Ctrl-Y quite happily.
tonystark622
Premium Member
Posts: 483
Joined: Thu Jun 12, 2003 4:47 pm
Location: St. Louis, Missouri USA

Post by tonystark622 »

Ray,
I constructed a simple job that read the source file, ran it through a transformer and wrote it to a flat file. No problems.
SourceFile-->Transform-->OutputFile

I modified my test job to read the source file, write it to a hash file, then read the hash file back through a transformer and write it to a flat file. I got the "NLS Mapping error".
SourceFile->Transform->HashFile->Transform->OutputFile

The source file had some fields that were empty for the rows that caused the mapping error, so I added code to the derivation of those fields before they were written to the hash file. The code tested for null and wrote a space in the field; otherwise it wrote the input data. No errors.
SourceFile->TransformWithCode->HashFile->Transform->OutputFile
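The derivation was along these lines (the link and column names here are illustrative, not the real ones):

   * Replace a null field with a space before the hash file write:
   If IsNull(DSLink3.PART_NO) Then ' ' Else DSLink3.PART_NO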

I honestly don't understand why I get the problem when I put the data through a hash file, but not when I don't.

Any thoughts?

Thanks for your help,
Tony
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

Which Transformer stage reported the NLS mapping error, the one that was writing to the hashed file or the one that was reading from the hashed file? Was the NLS map for each link NONE?
tonystark622
Premium Member
Posts: 483
Joined: Thu Jun 12, 2003 4:47 pm
Location: St. Louis, Missouri USA

Post by tonystark622 »

Ray,

I get an NLS mapping error on the last sequential file stage, not on a transformer. The message is something like:

TonyTest..Sequential_File_2.DSLink9: nls_map_buffer_out() - NLS mapping error, row 18862 (approx), row =

The mapping on both the Sequential file input stage and the Sequential file output stage is Project Default (ISO8859-1). I don't see any place to specify an NLS map name on either of the transformers or the hash file stage.

I didn't change the mapping on anything when I built this job. All the NLS settings are whatever their default values are.

Thanks again for your help.

Tony