Hi, I need some help: I'm trying to connect to Hive from the Flink SQL Client. Hive itself works fine — `beeline -u jdbc:hive2://x.x.x.x:10000` connects normally.
My sql-client-defaults.yaml:

tables: []
functions: []

catalogs:
  - name: myhive
    type: hive
    hive-conf-dir: /home/hive/flink-1.11.1/conf
    default-database: default

execution:
  planner: blink
  type: streaming
  time-characteristic: event-time
  periodic-watermarks-interval: 200
  result-mode: table
  max-table-result-rows: 1000000
  parallelism: 1
  max-parallelism: 128
  min-idle-state-retention: 0
  max-idle-state-retention: 0
  restart-strategy:
    type: fallback

deployment:
  response-timeout: 5000
  gateway-address: ""
  gateway-port: 0
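For reference, the equivalent catalog registration can also be expressed in the Table API (a sketch for Flink 1.11 — the class name here is made up; the catalog name, default database, and hive-conf-dir mirror the YAML above, and flink-connector-hive plus the Hive client jars are assumed to be on the classpath):

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class RegisterHiveCatalog {
    public static void main(String[] args) {
        // Blink planner in streaming mode, matching the "execution" section of the YAML
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inStreamingMode()
                .build();
        TableEnvironment tableEnv = TableEnvironment.create(settings);

        // Same settings as the YAML catalog entry: name, default database, hive-conf-dir
        HiveCatalog hive = new HiveCatalog("myhive", "default", "/home/hive/flink-1.11.1/conf");
        tableEnv.registerCatalog("myhive", hive);   // opens the catalog, i.e. connects to the metastore
        tableEnv.useCatalog("myhive");
    }
}
```

Running this directly can give a cleaner stack trace than the SQL Client wrapper, since registerCatalog fails at the same HiveCatalog.open() call.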
Starting the SQL Client then fails:

$ ./bin/sql-client.sh embedded
The final error:

Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
	at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:870)
	at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227)
	at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
Caused by: org.apache.flink.table.catalog.exceptions.CatalogException: Failed to determine whether database default exists or not
	at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:335)
	at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227)
	at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191)
	at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627)
	at java.util.HashMap.forEach(HashMap.java:1289)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136)
	at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859)
	... 3 more
Caused by: org.apache.thrift.transport.TTransportException
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
	at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
	at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208)
	at com.sun.proxy.$Proxy28.getDatabase(Unknown Source)
	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107)
	at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330)
	... 15 more
Appendix — the complete log output:

Searching for '/home/hive/flink-1.11.1/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/home/hive/flink-1.11.1/conf/sql-client-defaults.yaml
No session environment specified.
2020-10-27 09:48:14,533 INFO  org.apache.hadoop.hive.conf.HiveConf [] - Found configuration file file:/home/hive/flink-1.11.1/conf/hive-site.xml
2020-10-27 09:48:15,144 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Trying to connect to metastore with URI thrift://x.x.x.x:10000
2020-10-27 09:48:15,168 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Opened a connection to metastore, current connections: 1
2020-10-27 09:48:15,240 WARN  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException: null
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4787) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4773) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:534) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:224) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_251]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_251]
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_251]
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_251]
	at org.apache.hadoop.hive.metastore.utils.JavaUtils.newInstance(JavaUtils.java:84) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:95) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:148) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_251]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_251]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_251]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
	at org.apache.flink.table.catalog.hive.client.HiveShimV310.getHiveMetastoreClient(HiveShimV310.java:103) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.createMetastoreClient(HiveMetastoreClientWrapper.java:240) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.<init>(HiveMetastoreClientWrapper.java:71) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientFactory.create(HiveMetastoreClientFactory.java:35) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:223) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
2020-10-27 09:48:15,247 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Connected to metastore.
2020-10-27 09:48:15,247 INFO  org.apache.hadoop.hive.metastore.RetryingMetaStoreClient [] - RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.metastore.HiveMetaStoreClient ugi=hive (auth:SIMPLE) retries=1 delay=1 lifetime=0
2020-10-27 09:48:15,364 WARN  org.apache.hadoop.hive.metastore.RetryingMetaStoreClient [] - MetaStoreClient lost connection. Attempting to reconnect (1 of 1) after 1s.
getDatabase
org.apache.thrift.transport.TTransportException: null
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:1135) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:1122) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1511) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:1506) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_251]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_251]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_251]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_251]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:208) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at com.sun.proxy.$Proxy28.getDatabase(Unknown Source) ~[?:?]
	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
2020-10-27 09:48:16,365 INFO  org.apache.hadoop.hive.metastore.RetryingMetaStoreClient [] - RetryingMetaStoreClient trying reconnect as hive (auth:SIMPLE)
2020-10-27 09:48:16,375 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Closed a connection to metastore, current connections: 0
2020-10-27 09:48:16,375 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Trying to connect to metastore with URI thrift://x.x.x.x:10000
2020-10-27 09:48:16,376 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Opened a connection to metastore, current connections: 1
2020-10-27 09:48:16,436 WARN  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
org.apache.thrift.transport.TTransportException: null
	at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:4787) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:4773) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:534) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.reconnect(HiveMetaStoreClient.java:379) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient$1.run(RetryingMetaStoreClient.java:187) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_251]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_251]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836) ~[flink-shaded-hadoop-2-uber-2.8.3-10.0.jar:2.8.3-10.0]
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:183) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at com.sun.proxy.$Proxy28.getDatabase(Unknown Source) ~[?:?]
	at org.apache.flink.table.catalog.hive.client.HiveMetastoreClientWrapper.getDatabase(HiveMetastoreClientWrapper.java:107) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.catalog.hive.HiveCatalog.databaseExists(HiveCatalog.java:330) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.catalog.hive.HiveCatalog.open(HiveCatalog.java:227) ~[flink-sql-connector-hive-3.1.2_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.catalog.CatalogManager.registerCatalog(CatalogManager.java:191) ~[flink-table_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.api.internal.TableEnvironmentImpl.registerCatalog(TableEnvironmentImpl.java:337) ~[flink-table_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:627) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_251]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:625) ~[flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:264) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:624) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:523) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:183) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:136) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:859) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:227) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201) [flink-sql-client_2.12-1.11.1.jar:1.11.1]
2020-10-27 09:48:16,438 INFO  org.apache.hadoop.hive.metastore.HiveMetaStoreClient [] - Connected to metastore.
Thanks! *From volunteer-compiled Flink material
Hi, the log shows you are connecting to port 10000, which is the HiveServer2 (HS2) port. Flink's HiveCatalog needs to connect to the Hive Metastore (HMS) instead — try starting an HMS service and pointing the catalog at it.
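To make this concrete, a minimal sketch of the relevant setting (the host is a placeholder; 9083 is the Hive Metastore's default thrift port, while 10000 is HiveServer2's default, which is what beeline talks to). In the hive-site.xml under hive-conf-dir, hive.metastore.uris should point at the HMS endpoint, not HS2:

```xml
<!-- hive-site.xml (excerpt): point Flink's HiveCatalog at the metastore thrift service -->
<property>
  <name>hive.metastore.uris</name>
  <!-- 9083 is the default HMS port; 10000 is HiveServer2 -->
  <value>thrift://x.x.x.x:9083</value>
</property>
```

The standalone metastore service can be started with `hive --service metastore` if it is not already running.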
Copyright notice: this content was contributed by a registered Alibaba Cloud user and copyright remains with the original author; the Alibaba Cloud Developer Community does not own it and assumes no legal liability. See the Alibaba Cloud Developer Community User Service Agreement and Intellectual Property Protection Guidelines for details; suspected plagiarism can be reported via the infringement complaint form and will be removed once verified.