
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism

When starting spark-shell, it fails with the following error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism(Ljava/lang/String;)V
    at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:84)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:318)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:303)
    at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:1827)
    at org.apache.hadoop.security.UserGroupInformation.createLoginUser(UserGroupInformation.java:709)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:659)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:570)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
    at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2422)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2422)
    at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:79)
    at org.apache.spark.deploy.SparkSubmit.secMgr$lzycompute$1(SparkSubmit.scala:348)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$secMgr$1(SparkSubmit.scala:348)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
    at org.apache.spark.deploy.SparkSubmit$$anonfun$prepareSubmitEnvironment$7.apply(SparkSubmit.scala:356)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:355)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:784)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:930)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:939)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Solution

This is caused by a bad configuration entry that makes an external set of Hadoop jars conflict with the Hadoop jars bundled with Spark. A NoSuchMethodError like this almost always means two different versions of the same library are on the classpath at once: the UserGroupInformation class that gets loaded calls a HadoopKerberosName.setRuleMechanism method that the (older) HadoopKerberosName class actually loaded does not have.
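One way to confirm the mix-up is to check which jar each class in the failing call chain is actually loaded from. Below is a minimal diagnostic sketch; the WhichJar class is my own illustration, not from the original post, and it assumes you run it with the same classpath spark-shell resolves. If the classes resolve to jars from different Hadoop versions, the classpath is mixed.

    // Prints which jar each Hadoop security class from the stack trace
    // was loaded from, exposing a mixed-version classpath.
    public class WhichJar {
        public static void main(String[] args) throws Exception {
            String[] names = {
                "org.apache.hadoop.security.HadoopKerberosName",
                "org.apache.hadoop.security.UserGroupInformation",
                // setRuleMechanism is declared on KerberosName (hadoop-auth)
                // in newer Hadoop releases
                "org.apache.hadoop.security.authentication.util.KerberosName"
            };
            for (String name : names) {
                Class<?> c = Class.forName(name);
                java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
                System.out.println(name + " -> "
                    + (src == null ? "bootstrap" : src.getLocation()));
            }
        }
    }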

It turned out that, at some point, a setting in spark-defaults.conf had been changed without anyone noticing.


So the fix is just as simple: delete that setting and the conflict goes away.
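The original post only showed the offending spark-defaults.conf entry in a screenshot, so the exact property is not recoverable. As a purely hypothetical illustration, an entry of this shape is a common cause, since it splices an external Hadoop installation's jars in front of the Hadoop jars that ship with Spark:

    # Hypothetical example only -- the property names are real Spark settings,
    # but the path is illustrative, not the exact entry from the original post.
    # Any extraClassPath entry that pulls in a second Hadoop version can
    # trigger the error above.
    spark.driver.extraClassPath     /opt/hadoop-3.2.0/share/hadoop/common/*
    spark.executor.extraClassPath   /opt/hadoop-3.2.0/share/hadoop/common/*

After deleting or commenting out the entry, start spark-shell again; the SecurityManager initialization shown in the trace should now succeed.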

