
17. [SparkSQL] org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism exception

  • Background
    When connecting from a local IDEA project to the Spark service on the server, the job still failed with org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism, even though hive-site.xml had been copied into the project and the corresponding dependencies were provided.
    The dependencies were as follows (a minimal driver sketch follows the list):

      <dependencies>
          <dependency>
              <groupId>org.apache.spark</groupId>
              <artifactId>spark-core_2.12</artifactId>
              <version>3.1.2</version>
          </dependency>
          <dependency>
              <groupId>org.apache.spark</groupId>
              <artifactId>spark-sql_2.12</artifactId>
              <version>3.1.2</version>
          </dependency>
    
          <dependency>
              <groupId>mysql</groupId>
              <artifactId>mysql-connector-java</artifactId>
              <version>8.0.25</version>
          </dependency>
    
          <dependency>
              <groupId>org.apache.hive</groupId>
              <artifactId>hive-exec</artifactId>
              <version>3.1.2</version>
          </dependency>
    
          <dependency>
              <groupId>org.apache.spark</groupId>
              <artifactId>spark-hive_2.12</artifactId>
              <version>3.1.2</version>
          </dependency>
    
          <dependency>
              <groupId>org.apache.hadoop</groupId>
              <artifactId>hadoop-auth</artifactId>
              <version>3.2.0</version>
          </dependency>
      </dependencies>
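
    To reproduce the setup locally, a driver along the lines of the sketch below can be run from IDEA. It is only a sketch under the assumption that hive-site.xml (and, if needed, core-site.xml/hdfs-site.xml) is on the classpath, e.g. under src/main/resources; the object name RemoteHiveCheck is a placeholder, not code from the original project.

      import org.apache.spark.sql.SparkSession

      // Placeholder object name; adjust to the actual project layout.
      object RemoteHiveCheck {
        def main(args: Array[String]): Unit = {
          // enableHiveSupport() makes Spark read hive-site.xml from the classpath
          // and talk to the remote metastore configured there.
          val spark = SparkSession.builder()
            .appName("remote-hive-check")
            .master("local[*]") // the driver runs locally inside the IDE
            .enableHiveSupport()
            .getOrCreate()

          spark.sql("SHOW DATABASES").show()
          spark.stop()
        }
      }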
    
    
  • Analysis
    When accessing Hadoop remotely, authentication information has to be supplied to the cluster (see the Kerberos login sketch after the links):

    https://hadoop.apache.org/docs/stable/hadoop-auth/Examples.html
    https://blog.csdn.net/csr_hema/article/details/8147590
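
    The sketch below illustrates how such authentication information is commonly supplied through Hadoop's UserGroupInformation API when the cluster uses Kerberos. It is an assumption-laden example: the principal user@EXAMPLE.COM and the keytab path are placeholders, not values from the original environment.

      import org.apache.hadoop.conf.Configuration
      import org.apache.hadoop.security.UserGroupInformation

      // Illustrative Kerberos login; principal and keytab path are placeholders.
      object KerberosLoginSketch {
        def main(args: Array[String]): Unit = {
          val conf = new Configuration()
          // Declare Kerberos as the authentication mechanism before logging in.
          conf.set("hadoop.security.authentication", "kerberos")
          UserGroupInformation.setConfiguration(conf)
          UserGroupInformation.loginUserFromKeytab("user@EXAMPLE.COM", "/path/to/user.keytab")
          // Print the logged-in user to confirm the login succeeded.
          println(UserGroupInformation.getCurrentUser)
        }
      }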

  • Fix
    Add the following dependency to the Maven POM (a quick classpath check follows the snippet):

          <dependency>
              <groupId>org.apache.hadoop</groupId>
              <artifactId>hadoop-auth</artifactId>
              <version>3.2.0</version>
          </dependency>
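
    If the error still appears, it is worth checking whether an older hadoop-auth jar is being pulled in transitively and shadowing the 3.2.0 one. The reflection sketch below prints which jar actually provides hadoop-auth's KerberosName class; the object name is again just a placeholder.

      // Placeholder diagnostic: print the jar that supplies hadoop-auth's KerberosName.
      object HadoopAuthJarCheck {
        def main(args: Array[String]): Unit = {
          val cls = Class.forName("org.apache.hadoop.security.authentication.util.KerberosName")
          println(cls.getProtectionDomain.getCodeSource.getLocation)
        }
      }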
    

