
How do I create a new directory in HDFS using Java?

import java.io.IOException;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TestConnection {

    public static void main(String[] args) throws IOException, URISyntaxException {
        Configuration config = new Configuration();
        config.set("fs.default.name", "hdfs://127.0.0.1:50070/dfshealth.jsp");

        FileSystem dfs = FileSystem.get(config);
        String dirName = "TestDirectory";
        Path src = new Path(dfs.getWorkingDirectory() + "/" + dirName);

        dfs.mkdirs(src);
    }
}

It throws this exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
    at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:217)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:185)
    at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:237)
    at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:79)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:210)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:185)
    at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:237)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:468)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1519)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1420)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
    at com.TestConnection.main(TestConnection.java:21)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 17 more

What is the problem? Any help would be appreciated!

Solution:

For your problem, you have to add the commons-configuration-1.6.jar to your build path.

I have listed the necessary JARs below the code.

public static void main(String[] args) throws IOException {
    Configuration config = new Configuration();
    config.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
    config.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

    config.set("fs.hdfs.impl",
            org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
    config.set("fs.file.impl",
            org.apache.hadoop.fs.LocalFileSystem.class.getName());

    FileSystem dfs = FileSystem.get(config);
    String dirName = "TestDirectory";
    System.out.println(dfs.getWorkingDirectory() + " this is from /n/n");
    Path src = new Path(dfs.getWorkingDirectory() + "/" + dirName);

    dfs.mkdirs(src);
}
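This is not part of the original answer, but as a quick sanity check you can confirm that mkdirs() actually took effect by listing the working directory afterwards. A minimal sketch, assuming the same config file locations as above (the class name VerifyDirectory is made up for illustration):

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class VerifyDirectory {
    public static void main(String[] args) throws IOException {
        Configuration config = new Configuration();
        config.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        config.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

        FileSystem dfs = FileSystem.get(config);
        Path src = new Path(dfs.getWorkingDirectory(), "TestDirectory");

        // mkdirs() returns true if the directory was created or already exists.
        boolean created = dfs.mkdirs(src);
        System.out.println("mkdirs returned " + created);

        // List the working directory to confirm the new entry is visible.
        for (FileStatus status : dfs.listStatus(dfs.getWorkingDirectory())) {
            System.out.println(status.getPath());
        }
    }
}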

You must add the following JARs to your build path:

commons-cli-1.2.jar

commons-collections-3.2.1.jar

commons-configuration-1.6.jar

commons-lang-2.5.jar

commons-logging-1.1.1.jar

guava-11.0.2.jar

hadoop-auth.jar

hadoop-common.jar

protobuf-java-2.4.0a.jar

slf4j-api-1.6.1.jar

log4j-1.2.17.jar

hadoop-hdfs.jar

If it is a Cloudera distribution, all of these JARs can be found in the hadoop/lib folder.
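If you are not sure whether the jar actually ended up on the runtime classpath, a quick way to check is to try loading the missing class by name. This is just a sketch, not from the original answer; the class name ClasspathCheck is made up for illustration:

public class ClasspathCheck {
    public static void main(String[] args) {
        try {
            // This is the class the NoClassDefFoundError complains about.
            Class.forName("org.apache.commons.configuration.Configuration");
            System.out.println("commons-configuration is on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("commons-configuration is missing from the classpath");
        }
    }
}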

