Error in Using File Connector stage

Post questions here relative to DataStage Enterprise/PX Edition for such areas as Parallel job design, Parallel datasets, BuildOps, Wrappers, etc.

Moderators: chulett, rschirm, roy

atulgoel
Participant
Posts: 84
Joined: Tue Feb 03, 2009 1:09 am
Location: Bangalore, India

Error in Using File Connector stage

Post by atulgoel »

Hi ,

I am getting the fatal error below while trying to use the File Connector stage. Does anyone know about it?

I am trying to read from Hive tables (HDFS) using the File Connector stage.

File_Connector_0: com.ascential.e2.common.CC_Exception: java.lang.Exception: java.lang.RuntimeException: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.FileSystem.initialize(FileSystem.java:540)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.initializeProcessor(CC_JavaAdapter.java:1030)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.getSerializedData(CC_JavaAdapter.java:704)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.getAuthCookie(WebHDFSSPNEGO.java:81)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFS.init(WebHDFS.java:112)
at com.ibm.iis.cc.filesystem.FileSystem.initialize(FileSystem.java:533)
... 2 more
Caused by: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.readToken(WebHDFSSPNEGO.java:251)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.access$300(WebHDFSSPNEGO.java:47)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO$2.run(WebHDFSSPNEGO.java:211)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO$2.run(WebHDFSSPNEGO.java:137)
at java.security.AccessController.doPrivileged(AccessController.java:488)
at javax.security.auth.Subject.doAs(Subject.java:572)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.doSpnego(WebHDFSSPNEGO.java:233)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.getAuthCookie(WebHDFSSPNEGO.java:77)
... 4 more
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.initializeProcessor(CC_JavaAdapter.java:1045)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.getSerializedData(CC_JavaAdapter.java:704)
Atul
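For anyone landing here from a search: the trace above shows the WebHDFS client inside the File Connector failing SPNEGO (Kerberos) negotiation, with the NameNode answering HTTP 403 (Forbidden). One way to rule the stage itself out is to reproduce the REST call outside DataStage. A minimal sketch, where the host, port, path, and user are placeholders for your cluster's values; plain urllib only works against a non-Kerberized cluster using the `user.name` query parameter, so on a Kerberized cluster you would `kinit` first and use a SPNEGO-capable HTTP client instead:

```python
# Minimal WebHDFS probe to reproduce the 403 outside DataStage.
# HOST, PORT, and the HDFS path are placeholders -- substitute your
# NameNode address and a path the job reads. The SPNEGO classes in the
# stack trace suggest a Kerberized cluster; this plain-HTTP sketch only
# applies to simple (user.name) authentication.
from urllib.parse import urlencode
from urllib.request import urlopen

def webhdfs_url(host, port, path, op, **params):
    """Build a WebHDFS REST URL for the given operation."""
    query = urlencode(dict(params, op=op))
    return "http://%s:%d/webhdfs/v1%s?%s" % (host, port, path, query)

if __name__ == "__main__":
    # Placeholder values -- adjust for your cluster.
    url = webhdfs_url("namenode.example.com", 50070, "/user/hive/warehouse",
                      "LISTSTATUS", **{"user.name": "dsadm"})
    print(url)
    # Uncomment to actually probe (requires network access to the NameNode):
    # print(urlopen(url).status)
```

If this probe also returns 403, the problem is cluster-side (authentication or HDFS permissions) rather than anything in the stage configuration.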
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

https://en.wikipedia.org/wiki/HTTP_403

Not sure why you're getting that but that is what it means, in case you hadn't looked up the error. I would double-check your configuration in the stage, make sure all of the values are appropriate and your running user has whatever privs are needed to access things.

Is this actually the Big Data File connector?
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

chulett wrote: Is this actually the Big Data File connector?
Probably not. The File Connector stage is a feature in version 11.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
chulett
Charter Member
Posts: 43085
Joined: Tue Nov 12, 2002 4:34 pm
Location: Denver, CO

Post by chulett »

Only reason I asked is because I was under the impression you needed to use that particular one for Hive.
-craig

"You can never have too many knives" -- Logan Nine Fingers
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
atulgoel
Participant
Posts: 84
Joined: Tue Feb 03, 2009 1:09 am
Location: Bangalore, India

Post by atulgoel »

Hi,

Is there any document or website where I can find the step-by-step configuration settings required to use the File Connector to read from Hive?
Atul
atulgoel
Participant
Posts: 84
Joined: Tue Feb 03, 2009 1:09 am
Location: Bangalore, India

Post by atulgoel »

There were some configuration settings that needed to be done by the Hive administrator. After that, it's working fine now.
Atul
satheesh123
Participant
Posts: 3
Joined: Mon Dec 27, 2010 1:12 am

Post by satheesh123 »

Hi Atul, I'm using the File Connector stage to connect to an HDFS system but am receiving the same error, "java.lang.RuntimeException: HTTP status = 403". Can you let us know the configuration settings that needed to be done by the Hive administrator? Thanks!
Regards,
Satheesh
ray.wurlod
Participant
Posts: 54607
Joined: Wed Oct 23, 2002 10:52 pm
Location: Sydney, Australia

Post by ray.wurlod »

atulgoel wrote: There were some configuration settings that needed to be done by the Hive administrator. After that, it's working fine now.
Please let us know what those configuration changes are.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
TNZL_BI
Premium Member
Posts: 24
Joined: Mon Aug 20, 2012 5:15 am
Location: NZ

Post by TNZL_BI »

Hey All,

I am trying to use the File Connector stage to put a file on HDFS. I am getting the following error when I write a file to HDFS:

File_Connector_35,0: An exception occurred: java.lang.Exception: Failed to write to file /appl/iip_data/ssd01/data/abhi.txt: HTTP/1.1 403 Forbidden

Atul / Satheesh - what did your Hive admin do to fix this?
TNZL_BI
Premium Member
Posts: 24
Joined: Mon Aug 20, 2012 5:15 am
Location: NZ

Post by TNZL_BI »

Hi All, just wanted to close this thread by saying that I can now connect to Hadoop using the File Connector stage.

There were two main issues. Our Hadoop cluster had one NameNode and two DataNodes. Ideally, opening the firewall for the NameNode and the corresponding NameNode port should suffice. However, the job was failing with an error that it couldn't connect to the DataNode port, which seemed strange. We opened up all ports for all nodes (NameNode and DataNodes) anyway. Doing this fixed the issue and the job started running fine.

Also note that it is important to know which Hadoop distribution you are working with. In our case it was Apache Hadoop, which was neither SSL enabled nor Kerberos enabled, which actually made it a straightforward process.
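For future readers, the DataNode-port behaviour described above is expected with WebHDFS rather than strange: a file write (CREATE) is a two-step protocol in which the NameNode answers the first PUT with a 307 redirect whose Location header names a DataNode, and the client must then connect to that DataNode directly to send the file contents. So a firewall that only admits the NameNode port fails in exactly this way. A small sketch for pulling the DataNode endpoint out of such a redirect to test reachability; the sample Location value below is illustrative only:

```python
# WebHDFS CREATE is two-step: the NameNode's 307 redirect points the
# client at a DataNode, which must be directly reachable. This helper
# extracts the (host, port) from a redirect Location header so you can
# verify firewall access to that DataNode.
from urllib.parse import urlsplit

def datanode_endpoint(location_header):
    """Extract (host, port) from a WebHDFS redirect Location header."""
    parts = urlsplit(location_header)
    return parts.hostname, parts.port

if __name__ == "__main__":
    # Example redirect as a NameNode might return it (placeholder host/port).
    loc = ("http://datanode1.example.com:50075/webhdfs/v1/appl/data/abhi.txt"
           "?op=CREATE&namenoderpcaddress=namenode:8020&overwrite=false")
    host, port = datanode_endpoint(loc)
    print("check firewall access to %s:%d" % (host, port))
```

If that host:port is not reachable from the DataStage engine tier, the write will fail with exactly the 403/Forbidden-style errors seen earlier in this thread even though the NameNode itself is reachable.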
TNZL_BI
Premium Member
Posts: 24
Joined: Mon Aug 20, 2012 5:15 am
Location: NZ

Post by TNZL_BI »

Moreover, personally I think it is a lot simpler to use the File Connector stage than the BDFS stage. The BDFS stage is in fact better suited to the BigInsights Hadoop distribution from IBM (that doesn't mean it can't connect to other Hadoop distributions). The File Connector stage also gives you the option to convert the file contents into a Hive table, which is very cool.