
Flink Troubleshooting Notes

1. Running the ./yarn-session.sh command fails with an error

[hadoop@hadoop002 bin]$ ./yarn-session.sh --help
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/exceptions/YarnException
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
	at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
	at java.lang.Class.getMethod0(Class.java:3018)
	at java.lang.Class.getMethod(Class.java:1784)
	at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
	at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.exceptions.YarnException
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 7 more

Solution: the NoClassDefFoundError means the Hadoop YARN classes are not on Flink's classpath. Copy flink-shaded-hadoop-2-uber-2.6.0-cdh5.15.1-7.0.jar, built by the flink-shaded-hadoop-2-uber module of flink-shaded, into Flink's lib directory:

[hadoop@hadoop002 target]$ pwd
/home/hadoop/software/hadoop/hadoop2.6.0-cdh5.15.1/flink-shaded/flink-shaded-hadoop-2-uber/target
[hadoop@hadoop002 target]$ cp flink-shaded-hadoop-2-uber-2.6.0-cdh5.15.1-7.0.jar /home/hadoop/app/flink-1.9-SNAPSHOT/lib
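If the uber jar has not been built yet, building flink-shaded against the CDH Hadoop version is what produces it under flink-shaded-hadoop-2-uber/target. A sketch (not a captured session), assuming the flink-shaded 7.0 sources sit in the directory shown above and that the Cloudera Maven repository is configured so the CDH artifacts can be resolved:

[hadoop@hadoop002 flink-shaded]$ # build the shaded modules against the CDH Hadoop version
[hadoop@hadoop002 flink-shaded]$ mvn clean install -DskipTests -Dhadoop.version=2.6.0-cdh5.15.1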

2. Running the ./yarn-session.sh command still fails

[hadoop@hadoop002 flink-1.9-SNAPSHOT]$ ./bin/yarn-session.sh 
2020-02-02 11:10:38,705 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: jobmanager.rpc.address, localhost
2020-02-02 11:10:38,706 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: jobmanager.rpc.port, 6123
2020-02-02 11:10:38,706 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: jobmanager.heap.size, 1024m
2020-02-02 11:10:38,706 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: taskmanager.heap.size, 1024m
2020-02-02 11:10:38,706 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: taskmanager.numberOfTaskSlots, 1
2020-02-02 11:10:38,707 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: parallelism.default, 1
2020-02-02 11:10:38,707 INFO  org.apache.flink.configuration.GlobalConfiguration            - Loading configuration property: jobmanager.execution.failover-strategy, region
2020-02-02 11:10:38,710 ERROR org.apache.flink.yarn.cli.FlinkYarnSessionCli                 - Error while running the Flink Yarn session.
java.lang.NoSuchMethodError: org.apache.commons.cli.Option.builder(Ljava/lang/String;)Lorg/apache/commons/cli/Option$Builder;
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.<init>(FlinkYarnSessionCli.java:197)
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.<init>(FlinkYarnSessionCli.java:173)
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.main(FlinkYarnSessionCli.java:836)

------------------------------------------------------------
 The program finished with the following exception:

java.lang.NoSuchMethodError: org.apache.commons.cli.Option.builder(Ljava/lang/String;)Lorg/apache/commons/cli/Option$Builder;
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.<init>(FlinkYarnSessionCli.java:197)
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.<init>(FlinkYarnSessionCli.java:173)
	at org.apache.flink.yarn.cli.FlinkYarnSessionCli.main(FlinkYarnSessionCli.java:836)

This happens because the uber jar bundles the older commons-cli 1.2 pulled in by the Hadoop dependencies, and Option.builder(String) only exists from commons-cli 1.3 onwards. Add the following dependency to the dependencyManagement section of flink-shaded-7.0/flink-shaded-hadoop-2-uber/pom.xml:

<dependency>
    <groupId>commons-cli</groupId>
    <artifactId>commons-cli</artifactId>
    <version>1.3.1</version>
</dependency>

After rebuilding, copy flink-shaded-hadoop-2-uber-2.6.0-cdh5.15.1-7.0.jar from the flink-shaded-hadoop-2-uber module into Flink's lib directory again; a sketch of the rebuild and copy follows.
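A minimal sketch of the rebuild and copy, assuming the same flink-shaded source directory and Flink installation paths as in step 1:

[hadoop@hadoop002 flink-shaded]$ # rebuild, now with commons-cli 1.3.1 managed in the uber pom
[hadoop@hadoop002 flink-shaded]$ mvn clean install -DskipTests -Dhadoop.version=2.6.0-cdh5.15.1
[hadoop@hadoop002 flink-shaded]$ # replace the jar copied earlier into Flink's lib
[hadoop@hadoop002 flink-shaded]$ cp flink-shaded-hadoop-2-uber/target/flink-shaded-hadoop-2-uber-2.6.0-cdh5.15.1-7.0.jar /home/hadoop/app/flink-1.9-SNAPSHOT/lib/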
