When running a Hive query from the SQL Client, I get an error: even though flink-connector-hive_2.11-1.12.0.jar is already in place, it still throws java.lang.ClassNotFoundException: org.apache.flink.connectors.hive.HiveSource. Could someone take a look?
Error message:
Flink SQL> select count(*) from zxw_test_1225_01;
2020-12-30 16:20:42,518 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.spark.client.submit.timeout.interval does not exist
2020-12-30 16:20:42,519 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.support.sql11.reserved.keywords does not exist
2020-12-30 16:20:42,520 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.spark.client.rpc.server.address.use.ip does not exist
2020-12-30 16:20:42,520 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.enforce.bucketing does not exist
2020-12-30 16:20:42,520 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.server2.enable.impersonation does not exist
2020-12-30 16:20:42,520 WARN  org.apache.hadoop.hive.conf.HiveConf [] - HiveConf of name hive.run.timeout.seconds does not exist
2020-12-30 16:20:43,065 WARN  org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory [] - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2020-12-30 16:20:43,245 INFO  org.apache.hadoop.mapred.FileInputFormat [] - Total input files to process : 24
[ERROR] Could not execute SQL statement. Reason:
java.lang.ClassNotFoundException: org.apache.flink.connectors.hive.HiveSource
Contents of the lib directory:
lib
├── flink-connector-hive_2.11-1.12.0.jar
├── flink-csv-1.12.0.jar
├── flink-dist_2.11-1.12.0.jar
├── flink-hadoop-compatibility_2.11-1.12.0.jar
├── flink-json-1.12.0.jar
├── flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
├── flink-shaded-zookeeper-3.4.14.jar
├── flink-table_2.11-1.12.0.jar
├── flink-table-blink_2.11-1.12.0.jar
├── hive-exec-2.3.4.jar
├── log4j-1.2-api-2.12.1.jar
├── log4j-api-2.12.1.jar
├── log4j-core-2.12.1.jar
└── log4j-slf4j-impl-2.12.1.jar

*From the Flink mailing list archive, organized by volunteers.
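Since the connector jar is present in lib, a first diagnostic step is to confirm which jar on the classpath actually contains the missing class. A minimal sketch, assuming a standard `unzip` is available; `find_class` is a hypothetical helper name, not part of Flink:

```shell
# Diagnostic sketch: scan every jar in a directory for a given class file path.
# find_class is a hypothetical helper, not a Flink tool.
find_class() {
  local dir=$1 cls=$2 jar
  for jar in "$dir"/*.jar; do
    # unzip -l lists archive contents without extracting them
    if unzip -l "$jar" 2>/dev/null | grep -q "$cls"; then
      echo "found in $jar"
    fi
  done
}

# Example (adjust the path to your Flink installation):
# find_class /opt/flink-1.12.0/lib "org/apache/flink/connectors/hive/HiveSource"
```

If no jar contains the class, the connector jar may not be the one you think it is; if a jar does contain it, a common cause of this error is that the cluster or SQL Client was started before the jar was added, since changes to lib/ only take effect after a restart.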