DSXchange: DataStage and IBM Websphere Data Integration Forum
This topic has been marked "Resolved."
atulgoel
Participant



Joined: 03 Feb 2009
Posts: 73
Location: Bangalore, India
Points: 820

Posted: Fri Dec 23, 2016 6:33 am

DataStage® Release: 11x
Job Type: Parallel
OS: Unix
Hi,

I am getting the fatal error below while trying to use the File Connector stage. Does anyone know what it means?

I am trying to read from Hive tables (HDFS) using the File Connector stage.

File_Connector_0: com.ascential.e2.common.CC_Exception: java.lang.Exception: java.lang.RuntimeException: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.FileSystem.initialize(FileSystem.java:540)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.initializeProcessor(CC_JavaAdapter.java:1030)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.getSerializedData(CC_JavaAdapter.java:704)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.getAuthCookie(WebHDFSSPNEGO.java:81)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFS.init(WebHDFS.java:112)
at com.ibm.iis.cc.filesystem.FileSystem.initialize(FileSystem.java:533)
... 2 more
Caused by: java.lang.RuntimeException: HTTP status = 403
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.readToken(WebHDFSSPNEGO.java:251)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.access$300(WebHDFSSPNEGO.java:47)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO$2.run(WebHDFSSPNEGO.java:211)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO$2.run(WebHDFSSPNEGO.java:137)
at java.security.AccessController.doPrivileged(AccessController.java:488)
at javax.security.auth.Subject.doAs(Subject.java:572)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.doSpnego(WebHDFSSPNEGO.java:233)
at com.ibm.iis.cc.filesystem.impl.webhdfs.WebHDFSSPNEGO.getAuthCookie(WebHDFSSPNEGO.java:77)
... 4 more
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.initializeProcessor(CC_JavaAdapter.java:1045)
at com.ibm.is.cc.javastage.connector.CC_JavaAdapter.getSerializedData(CC_JavaAdapter.java:704)
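The 403 is raised during the WebHDFS SPNEGO handshake, so it can usually be reproduced outside DataStage with a plain HTTP call. A minimal sketch (hostname, port, and user below are placeholders; substitute your cluster's values):

```shell
# Hypothetical values -- substitute your NameNode host, WebHDFS port
# (50070 is the Hadoop 2.x HTTP default) and HDFS user.
NAMENODE=namenode.example.com
PORT=50070
HDFS_USER=dsadm

# Same kind of LISTSTATUS call the connector makes; a 403 here confirms
# the problem is HDFS-side authorization, not the DataStage job design.
URL="http://${NAMENODE}:${PORT}/webhdfs/v1/user/${HDFS_USER}?op=LISTSTATUS&user.name=${HDFS_USER}"
echo "$URL"
# curl --negotiate -u : -i "$URL"   # --negotiate is needed on Kerberos/SPNEGO clusters
```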

_________________
Atul
chulett

Premium Poster


since January 2006

Group memberships:
Premium Members, Inner Circle, Server to Parallel Transition Group

Joined: 12 Nov 2002
Posts: 42105
Location: Denver, CO
Points: 216139

Posted: Fri Dec 23, 2016 8:38 am

https://en.wikipedia.org/wiki/HTTP_403

Not sure why you're getting that but that is what it means, in case you hadn't looked up the error. I would double-check your configuration in the stage, make sure all of the values are appropriate and your running user has whatever privs are needed to access things.
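For the privileges part, a quick sanity check from the engine tier, assuming the hdfs client is installed there and using a placeholder target path:

```shell
# Placeholder path -- adjust for your environment.
TARGET=/user/dsadm

# Who does Hadoop think you are? (On Kerberos clusters, also check the
# ticket with klist.)
whoami

# Can this user actually list/read the target? Permission failures here
# surface as HTTP 403 when going through WebHDFS. Uncomment to run
# against a live cluster:
echo "hdfs dfs -ls ${TARGET}"
# hdfs dfs -ls "$TARGET"
```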

Is this actually the Big Data File connector?

_________________
-craig

Can't keep my eyes from the circling skies
Tongue tied and twisted just an earth bound misfit, I
ray.wurlod

Premium Poster
Participant

Group memberships:
Premium Members, Inner Circle, Australia Usergroup, Server to Parallel Transition Group

Joined: 23 Oct 2002
Posts: 53978
Location: Sydney, Australia
Points: 292855

Posted: Sat Dec 24, 2016 5:12 pm

chulett wrote:
Is this actually the Big Data File connector?

Probably not. The File Connector stage is a feature in version 11.

_________________
RXP Services Ltd
Melbourne | Canberra | Sydney | Hong Kong | Hobart | Brisbane
currently hiring: Canberra, Sydney and Melbourne
chulett


Posted: Sat Dec 24, 2016 7:46 pm

Only reason I asked is because I was under the impression you needed to use that particular one for Hive.

ray.wurlod


Posted: Sat Dec 24, 2016 9:35 pm

Read all about it!
http://www.ibm.com/support/knowledgecenter/SSZJPZ_11.3.0/com.ibm.swg.im.iis.conn.filecon.usage.doc/topics/filecon_parent.html

atulgoel

Posted: Mon Dec 26, 2016 11:13 pm

Hi,

Is there any document or website where I can find the step-by-step configuration settings required to use the File Connector to read from Hive?

atulgoel

Posted: Wed Feb 01, 2017 2:37 am

Some configuration settings needed to be done by the Hive administration team. After that, it's working fine now.

satheesh123
Participant



Joined: 27 Dec 2010
Posts: 3

Points: 16

Posted: Fri Feb 17, 2017 12:49 am

Hi Atul, I'm using the File Connector stage to connect to HDFS but am receiving the same error, "java.lang.RuntimeException: HTTP status = 403". Can you let us know the configuration settings that needed to be done by the Hive administration team? Thanks!

_________________
Regards,
Satheesh
ray.wurlod


Posted: Thu Feb 23, 2017 12:35 am

atulgoel wrote:
Some configuration settings needed to be done by the Hive administration team. After that, it's working fine now.

Please let us know what those configuration changes are.

TNZL_BI



Group memberships:
Premium Members

Joined: 20 Aug 2012
Posts: 24
Location: NZ
Points: 243

Posted: Sun Apr 23, 2017 10:46 pm

Hey All,

I am trying to use the File Connector Stage to put a file on HDFS. I am getting the following issue when I write a file on HDFS

File_Connector_35,0: An exception occurred: java.lang.Exception: Failed to write to file /appl/iip_data/ssd01/data/abhi.txt: HTTP/1.1 403 Forbidden

Atul / Satish - what were the things that your Hive admin did to fix this ??
TNZL_BI




Posted: Sun Apr 30, 2017 5:56 pm

Hi All, just wanted to close this thread by saying that I can now connect to Hadoop using the File Connector stage.

There were 2 main issues. The Hadoop cluster had 1 NameNode and 2 DataNodes. Ideally, opening the firewall for the NameNode and the corresponding NameNode port should suffice. However, the job was failing saying that it couldn't connect to the DataNode port, which seemed strange. We opened up all ports for all nodes (NameNode and DataNodes) anyway. Doing this fixed the issue and the job started running fine.

Also note that it is important to know which Hadoop distribution you are working with. In our case it was Apache Hadoop, which was neither SSL-enabled nor Kerberos-enabled, which made it a straightforward process.
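The DataNode-port behaviour is actually expected with WebHDFS: a write request goes to the NameNode first, which answers with a 307 redirect to a DataNode, and the client then streams the data there. So the DataStage engine needs network access to the DataNode ports too, not just the NameNode. A sketch of the two-step write (hosts and user are placeholders; the path is the one from the error above):

```shell
# Placeholders -- substitute your cluster's values.
NAMENODE=namenode.example.com:50070
DEST=/appl/iip_data/ssd01/data/abhi.txt
HDFS_USER=dsadm

# Step 1: the NameNode returns "307 Temporary Redirect" with a Location
# header pointing at a DataNode (default data-transfer HTTP port 50075).
STEP1="curl -i -X PUT \"http://${NAMENODE}/webhdfs/v1${DEST}?op=CREATE&user.name=${HDFS_USER}\""
echo "$STEP1"

# Step 2: PUT the file body to the URL from that Location header.
# If the DataNode port is firewalled, this is the step that fails.
# curl -i -X PUT -T localfile.txt "<Location-header-URL>"
```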
TNZL_BI




Posted: Sun Apr 30, 2017 7:29 pm

Moreover, I personally think it is a lot simpler to use the File Connector stage compared to the BDFS stage. The BDFS stage is, in fact, more suited to the BigInsights Hadoop distribution from IBM (that doesn't mean it can't connect to other Hadoop distributions). The File Connector stage also gives you the option to convert the file contents into a Hive table, which is very cool.