
Error in Using File Connector stage

Posted: Fri Dec 23, 2016 6:33 am
by atulgoel
Hi ,

I am getting the fatal error below while trying to use the File Connector stage. Does anyone know what it means?

I am trying to read from Hive tables (HDFS) using the File Connector stage.

File_Connector_0: com.ascential.e2.common.CC_Exception: java.lang.Exception: java.lang.RuntimeException: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.FileSystem.initialize(FileSystem.java:540)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.initializeProcessor(CC_JavaAdapter.java:1030)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.getSerializedData(CC_JavaAdapter.java:704)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.getAuthCookie(WebHDFSSPNEGO.java:81)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFS.init(WebHDFS.java:112)
at com.ibm.iis.cc.filesystem.FileSystem.initialize(FileSystem.java:533)
... 2 more
Caused by: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.readToken(WebHDFSSPNEGO.java:251)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.access$300(WebHDFSSPNEGO.java:47)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO$2.run(WebHDFSSPNEGO.java:211)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO$2.run(WebHDFSSPNEGO.java:137)
at java.security.AccessController.doPrivileged(AccessController.java:488)
at javax.security.auth.Subject.doAs(Subject.java:572)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.doSpnego(WebHDFSSPNEGO.java:233)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.getAuthCookie(WebHDFSSPNEGO.java:77)
... 4 more
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.initializeProcessor(CC_JavaAdapter.java:1045)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.getSerializedData(CC_JavaAdapter.java:704)
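
The 403 is raised inside the WebHDFS SPNEGO handshake (WebHDFSSPNEGO.getAuthCookie), so one way to narrow it down is to reproduce the same REST call outside DataStage. A minimal sketch, assuming a Kerberized cluster; the host, port, and user names here are hypothetical placeholders, not values from the original post:

```shell
# Hypothetical values -- replace with your cluster's NameNode host,
# WebHDFS port (50070 is the Hadoop 2.x default; 9870 in 3.x) and user.
NAMENODE=namenode.example.com
PORT=50070
HDFS_USER=atul

# If the cluster is Kerberized, obtain a ticket first:
#   kinit "$HDFS_USER@EXAMPLE.COM"

# Build the same kind of WebHDFS call the File Connector makes.
URL="http://$NAMENODE:$PORT/webhdfs/v1/user/$HDFS_USER?op=LISTSTATUS"
echo "$URL"

# Run this against your own cluster; --negotiate performs the SPNEGO
# handshake that is returning 403 in the stack trace above:
#   curl --negotiate -u : "$URL"
```

If the curl call also returns 403, the problem is on the Hadoop/Kerberos side rather than in the stage configuration.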

Posted: Fri Dec 23, 2016 8:38 am
by chulett
https://en.wikipedia.org/wiki/HTTP_403

Not sure why you're getting that, but that is what it means, in case you hadn't looked up the error. I would double-check your configuration in the stage: make sure all of the values are appropriate and that your running user has whatever privileges are needed to access things.

Is this actually the Big Data File connector?

Posted: Sat Dec 24, 2016 5:12 pm
by ray.wurlod
chulett wrote: Is this actually the Big Data File connector?
Probably not. The File Connector stage is a feature in version 11.

Posted: Sat Dec 24, 2016 7:46 pm
by chulett
The only reason I asked is that I was under the impression you needed to use that particular one for Hive.

Posted: Sat Dec 24, 2016 9:35 pm
by ray.wurlod

Posted: Mon Dec 26, 2016 11:13 pm
by atulgoel
Hi,

Is there any document or website where I can find the step-by-step configuration required to use the File Connector to read from Hive?

Posted: Wed Feb 01, 2017 2:37 am
by atulgoel
Some configuration settings needed to be done by the Hive administrator. After that, it is working fine now.

Posted: Fri Feb 17, 2017 12:49 am
by satheesh123
Hi Atul, I'm using the File Connector stage to connect to HDFS but am receiving the same error: "java.lang.RuntimeException: HTTP status = 403". Can you let us know the configuration settings that need to be done by the Hive administrator? Thanks!

Posted: Thu Feb 23, 2017 12:35 am
by ray.wurlod
atulgoel wrote: Some configuration settings needed to be done by the Hive administrator. After that, it is working fine now.
Please let us know what those configuration changes are.
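
For what it's worth, one common cause of a WebHDFS 403 on a secured cluster is a missing proxy-user entry in core-site.xml on the Hadoop side: the account DataStage connects as must be allowed to impersonate the requesting user. This is only a sketch of that setting, with a hypothetical service user "dsadm" and host, and not necessarily what atulgoel's administrator actually changed:

```xml
<!-- core-site.xml on the Hadoop cluster. "dsadm" and the host name
     are hypothetical -- substitute the account and server you
     actually connect from. -->
<property>
  <name>hadoop.proxyuser.dsadm.hosts</name>
  <value>datastage-server.example.com</value>
</property>
<property>
  <name>hadoop.proxyuser.dsadm.groups</name>
  <value>*</value>
</property>
```

The NameNode must be restarted (or the configuration refreshed) for proxy-user changes to take effect.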

Posted: Sun Apr 23, 2017 10:46 pm
by TNZL_BI
Hey All,

I am trying to use the File Connector stage to put a file on HDFS. I am getting the following issue when I write a file to HDFS:

File_Connector_35,0: An exception occurred: java.lang.Exception: Failed to write to file /appl/iip_data/ssd01/data/abhi.txt: HTTP/1.1 403 Forbidden

Atul / Satheesh - what were the things that your Hive admins did to fix this?

Posted: Sun Apr 30, 2017 5:56 pm
by TNZL_BI
Hi All, just wanted to close this thread by saying that I can now connect to Hadoop using the File Connector stage.

There were two main issues. Our Hadoop cluster had one NameNode and two DataNodes. Ideally, opening the firewall for the NameNode and the corresponding NameNode port should suffice; however, the job kept failing, saying it couldn't connect to the DataNode port, which seemed strange. In the end we opened all ports for all nodes (NameNode and DataNodes), which fixed the issue, and the job started running fine.
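
The DataNode-port behaviour described above is inherent to the WebHDFS protocol: a write is two HTTP calls, and only the first one goes to the NameNode. A sketch with hypothetical host names and the Hadoop 2.x default ports (the user.name value "dsadm" is also hypothetical):

```shell
# Step 1: ask the NameNode where to write. It does not accept the
# file data itself; it answers 307 Temporary Redirect with a
# Location header naming a DataNode host and port.
NAMENODE=namenode.example.com:50070
CREATE_URL="http://$NAMENODE/webhdfs/v1/appl/iip_data/ssd01/data/abhi.txt?op=CREATE&user.name=dsadm"
echo "$CREATE_URL"
# Run against your own cluster:
#   curl -i -X PUT "$CREATE_URL"

# Step 2: PUT the actual bytes to the DataNode address returned in
# the Location header (default DataNode HTTP port 50075). This is
# the hop that fails when the firewall only allows the NameNode port:
#   curl -i -X PUT -T abhi.txt "http://<datanode-from-location>:50075/webhdfs/v1/...?op=CREATE&..."
```

So opening only the NameNode port lets step 1 succeed while step 2 still times out, which matches the symptom above.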

Also note that it is important to know which Hadoop distribution you are working with. In our case it was Apache Hadoop, which was neither SSL-enabled nor Kerberos-enabled, which made it quite a straightforward process.

Posted: Sun Apr 30, 2017 7:29 pm
by TNZL_BI
Moreover , personally I think it is lot more simpler to use the file connector stage in comparison to the BDFS stage. the BDFS stage in fact is more suited for the Big Insight Hadoop version distributed by IBM ( that doesn't mean it cant connect to other Hadoop versions ). The file connector stage also gives you an option to covert the file contents into a hive table which is like very cool.