
Writing a field defined as numeric(5,2) in the PostgreSQL sink DDL (yidun_score) throws an exception

One of the fields in a table I defined (yidun_score) is of type numeric(5,2), and the yidun_score field in the DDL of the PostgreSQL sink table is also defined as numeric(5,2), but writing to it throws an exception.

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Field types of query result and registered TableSink [Result] do not match.
Query result schema: [user_new_id: Long, total_credit_score: Integer, total_order_count: Integer, loss_total_order_count: Integer, yidun_score: BigDecimal, is_delete: Boolean]
TableSink schema: [user_new_id: Long, total_credit_score: Integer, total_order_count: Integer, loss_total_order_count: Integer, yidun_score: BigDecimal, is_delete: Boolean]
	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:593)
	at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:438)
	at org.apache.flink.client.program.OptimizerPlanEnvironment.getOptimizedPlan(OptimizerPlanEnvironment.java:83)
	at org.apache.flink.client.program.PackagedProgramUtils.createJobGraph(PackagedProgramUtils.java:80)
	at org.apache.flink.client.program.PackagedProgramUtils.createJobGraph(PackagedProgramUtils.java:122)
	at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:227)
	at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:205)
	at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1010)
	at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1083)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
	at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
	at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1083)
Caused by: org.apache.flink.table.api.ValidationException: Field types of query result and registered TableSink [Result] do not match.
Query result schema: [user_new_id: Long, total_credit_score: Integer, total_order_count: Integer, loss_total_order_count: Integer, yidun_score: BigDecimal, is_delete: Boolean]
TableSink schema: [user_new_id: Long, total_credit_score: Integer, total_order_count: Integer, loss_total_order_count: Integer, yidun_score: BigDecimal, is_delete: Boolean]
	at org.apache.flink.table.planner.sinks.TableSinkUtils$.validateSink(TableSinkUtils.scala:69)
	at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$2.apply(PlannerBase.scala:179)
	at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$2.apply(PlannerBase.scala:178)
	at scala.Option.map(Option.scala:146)
	at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:178)
	at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:146)
	at org.apache.flink.table.planner.delegation.PlannerBase$$anonfun$1.apply(PlannerBase.scala:146)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.Iterator$class.foreach(Iterator.scala:891)

I later changed the numeric(5,2) types to double, and the exception went away; the data could then be written to PG normally. Is this case a bug? *From the volunteer-compiled Flink mailing list archive
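For context, a minimal sketch of the kind of sink setup described above, assuming Flink 1.9/1.10 with the Java Table API and the legacy 'connector.*'-style JDBC options; the table names, JDBC URL, and credentials are hypothetical placeholders, since the actual job was not posted:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;

public class DecimalSinkSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // Sink table: yidun_score is declared as DECIMAL(5, 2), matching the
        // numeric(5,2) column on the PostgreSQL side. The connector options
        // below are placeholders in the legacy property style.
        tableEnv.sqlUpdate(
            "CREATE TABLE pg_sink (" +
            "  user_new_id BIGINT," +
            "  total_credit_score INT," +
            "  total_order_count INT," +
            "  loss_total_order_count INT," +
            "  yidun_score DECIMAL(5, 2)," +
            "  is_delete BOOLEAN" +
            ") WITH (" +
            "  'connector.type' = 'jdbc'," +
            "  'connector.url' = 'jdbc:postgresql://localhost:5432/mydb'," +
            "  'connector.table' = 'user_score'," +
            "  'connector.username' = 'postgres'," +
            "  'connector.password' = '***'" +
            ")");

        // The reported ValidationException is raised when the query result schema
        // is validated against this sink schema, e.g. on a statement like:
        // tableEnv.sqlUpdate("INSERT INTO pg_sink SELECT ... FROM source_table");
        tableEnv.execute("decimal sink sketch");
    }
}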

玛丽莲梦嘉 2021-12-02 16:32:05
1 answer
  • Are you using 1.9? Have you tried the blink planner in 1.10.0? *From the volunteer-compiled Flink mailing list archive

    2021-12-02 17:24:46
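For reference, a minimal sketch of the planner switch suggested in the answer above, assuming Flink 1.10.x and the Java Table API; the class and job names are illustrative only:

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.java.StreamTableEnvironment;

public class BlinkPlannerSketch {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Explicitly select the blink planner (in 1.10 the old planner is still the default).
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inStreamingMode()
                .build();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env, settings);

        // Register the same DDL and INSERT INTO statements against this tableEnv
        // and re-run the job to check whether the DECIMAL(5,2) validation error
        // still occurs under the blink planner.
    }
}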