Error in Using File Connector stage
Moderators: chulett, rschirm, roy
Hi,
I am getting the fatal error below while trying to use the File Connector stage. Does anyone know what it means?
I am trying to read from Hive tables (HDFS) using the File Connector stage.
File_Connector_0: com.ascential.e2.common.CC_Exception: java.lang.Exception: java.lang.RuntimeException: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.FileSystem.initialize(FileSystem.java:540)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.initializeProcessor(CC_JavaAdapter.java:1030)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.getSerializedData(CC_JavaAdapter.java:704)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.getAuthCookie(WebHDFSSPNEGO.java:81)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFS.init(WebHDFS.java:112)
at com.ibm.iis.cc.filesystem.FileSystem.initialize(FileSystem.java:533)
... 2 more
Caused by: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.readToken(WebHDFSSPNEGO.java:251)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.access$300(WebHDFSSPNEGO.java:47)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO$2.run(WebHDFSSPNEGO.java:211)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO$2.run(WebHDFSSPNEGO.java:137)
at java.security.AccessController.doPrivileged(AccessController.java:488)
at javax.security.auth.Subject.doAs(Subject.java:572)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.doSpnego(WebHDFSSPNEGO.java:233)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.getAuthCookie(WebHDFSSPNEGO.java:77)
... 4 more
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.initializeProcessor(CC_JavaAdapter.java:1045)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.getSerializedData(CC_JavaAdapter.java:704)
Atul
https://en.wikipedia.org/wiki/HTTP_403
Not sure why you're getting that but that is what it means, in case you hadn't looked up the error. I would double-check your configuration in the stage, make sure all of the values are appropriate and your running user has whatever privs are needed to access things.
Is this actually the Big Data File connector?
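One more thing: the trace shows the 403 coming back from WebHDFSSPNEGO, i.e. while the connector negotiates Kerberos (SPNEGO) with WebHDFS, so I'd first check the ticket/keytab of the user the job runs as. You can also take the stage out of the picture and call WebHDFS yourself. A minimal sketch in Java for a simple-auth (non-kerberized) cluster; the host, port, path and user are placeholders, swap in your own:

import java.net.HttpURLConnection;
import java.net.URL;

// Minimal WebHDFS read probe for a simple-auth cluster.
// namenode.example.com, 50070 (the classic default WebHDFS port; yours may
// differ), the path and the user are all placeholders.
public class WebHdfsReadProbe {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://namenode.example.com:50070/webhdfs/v1"
                + "/user/dsadm?op=LISTSTATUS&user.name=dsadm");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        // 200 means the user can list the path; 403 reproduces the stage
        // error outside DataStage, pointing at permissions or authentication
        // rather than the job design.
        System.out.println("HTTP status = " + conn.getResponseCode());
        conn.disconnect();
    }
}

If the direct call also returns 403, the problem is on the Hadoop side rather than in the stage.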
-craig
"You can never have too many knives" -- Logan Nine Fingers
"You can never have too many knives" -- Logan Nine Fingers
Read all about it!
http://www.ibm.com/support/knowledgecen ... arent.html
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
atulgoel wrote: There were some configuration settings that needed to be done by the Hive administrator. After that, it's working fine now.
Please let us know what those configuration changes are.
IBM Software Services Group
Any contribution to this forum is my own opinion and does not necessarily reflect any position that IBM may hold.
Hey All,
I am trying to use the File Connector Stage to put a file on HDFS. I am getting the following issue when I write a file on HDFS
File_Connector_35,0: An exception occurred: java.lang.Exception: Failed to write to file /appl/iip_data/ssd01/data/abhi.txt: HTTP/1.1 403 Forbidden
Atul / Satish - what did your Hive admin do to fix this?
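One thing you can check in the meantime: WebHDFS reports HDFS permission errors (AccessControlException) as HTTP 403, so the first suspect on a write is whether the user your job connects as is allowed to write to /appl/iip_data/ssd01/data. A small sketch to inspect the directory's owner and permission bits outside DataStage (simple auth assumed; host, port and user are placeholders):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Print the owner, group and permission of the target directory so they
// can be compared with the user the File Connector job runs as.
// namenode.example.com, 50070 and dsadm are placeholders.
public class WebHdfsPermCheck {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://namenode.example.com:50070/webhdfs/v1"
                + "/appl/iip_data/ssd01/data?op=GETFILESTATUS&user.name=dsadm");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        System.out.println("HTTP status = " + conn.getResponseCode());
        // The JSON response carries "owner", "group" and "permission" fields.
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
        conn.disconnect();
    }
}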
Hi All, just wanted to close this thread by saying that I can now connect to Hadoop using the File Connector stage.
There were 2 main issues. The Hadoop cluster had 1 name node and 2 data nodes. Ideally, opening the firewall for the name node and its port should have sufficed. However, the job kept failing saying it couldn't connect to the data node port, which seemed strange at the time (the sketch below shows why). We ended up opening all ports for all nodes (name node and data nodes), which fixed the issue, and the job started running fine.
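The data node part is less strange than it looks once you see how a WebHDFS write works: the name node only brokers the request, answering the CREATE call with a 307 redirect whose Location header points at a data node, and the client then ships the bytes to that data node directly. So the machine running the job needs network access to the data node ports as well, not just the name node port. A rough sketch of the exchange (placeholder host, port, path and user; simple auth with no SSL or Kerberos assumed):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Sketch of the two-step WebHDFS write. namenode.example.com, port 50070,
// the path and user dsadm are placeholders.
public class WebHdfsWriteProbe {
    public static void main(String[] args) throws Exception {
        // Step 1: ask the name node where to write. It replies with a 307
        // redirect whose Location header names a data node.
        URL createUrl = new URL("http://namenode.example.com:50070/webhdfs/v1"
                + "/tmp/probe.txt?op=CREATE&overwrite=true&user.name=dsadm");
        HttpURLConnection nn = (HttpURLConnection) createUrl.openConnection();
        nn.setRequestMethod("PUT");
        nn.setInstanceFollowRedirects(false);
        System.out.println("name node status = " + nn.getResponseCode());
        String dataNodeUrl = nn.getHeaderField("Location");
        System.out.println("redirected to: " + dataNodeUrl);
        nn.disconnect();

        // Step 2: send the bytes to the data node. If a firewall blocks the
        // data node port, this is the step that fails even though the name
        // node call succeeded.
        HttpURLConnection dn = (HttpURLConnection) new URL(dataNodeUrl).openConnection();
        dn.setRequestMethod("PUT");
        dn.setDoOutput(true);
        try (OutputStream out = dn.getOutputStream()) {
            out.write("probe".getBytes("UTF-8"));
        }
        System.out.println("data node status = " + dn.getResponseCode()); // 201 Created on success
        dn.disconnect();
    }
}

In our case step 1 succeeded and step 2 could not reach the data node, which is exactly the failure the job reported.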
Also note that it is important to know which Hadoop distribution you are working with. In our case it was Apache Hadoop with neither SSL nor Kerberos enabled, which made it a straightforward process.
Moreover, I personally think the File Connector stage is a lot simpler to use than the BDFS stage. The BDFS stage is really more suited to IBM's BigInsights Hadoop distribution (that doesn't mean it can't connect to other Hadoop versions). The File Connector stage also gives you the option to convert the file contents into a Hive table, which is very cool.