Flink version: 1.8.0
cdc flink-sql-connector-oracle-cdc-3.0.1.jar
flink-cdc-pipeline-connector-doris-3.1.0.jar
When extracting from an Oracle database whose character set is UTF8, both the table structure and the data come through.
When the character set is ZHS16GBK, only the structure is extracted and no data arrives; the job reports the following:
2024-10-11 09:15:54
org.apache.flink.util.FlinkRuntimeException: Exceeded checkpoint tolerable failure threshold. The latest checkpoint failed due to Asynchronous task checkpoint failed., view the Checkpoint History tab or the Job Manager log to find out why continuous checkpoints failed.
at org.apache.flink.runtime.checkpoint.CheckpointFailureManager.checkFailureAgainstCounter(CheckpointFailureManager.java:212)
at org.apache.flink.runtime.checkpoint.CheckpointFailureManager.handleTaskLevelCheckpointException(CheckpointFailureManager.java:191)
at org.apache.flink.runtime.checkpoint.CheckpointFailureManager.handleCheckpointException(CheckpointFailureManager.java:124)
at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.abortPendingCheckpoint(CheckpointCoordinator.java:2151)
at org.apache.flink.runtime.checkpoint.CheckpointCoordinator.receiveDeclineMessage(CheckpointCoordinator.java:1100)
at org.apache.flink.runtime.scheduler.ExecutionGraphHandler.lambda$declineCheckpoint$2(ExecutionGraphHandler.java:103)
at org.apache.flink.runtime.scheduler.ExecutionGraphHandler.lambda$processCheckpointCoordinatorMessage$3(ExecutionGraphHandler.java:119)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
Does anyone know the cause? When I first tried to sync the ZHS16GBK database, the job failed with org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Unsupported character set (add orai18n.jar to your classpath): ZHS16GBK. After I put orai18n.jar into the lib directory that error went away, but there is still no data. Any help would be appreciated.
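One thing worth double-checking (a minimal sketch, assuming a standalone cluster and that FLINK_HOME points at the install used by bin/flink run): jars under lib/ are only picked up when the Flink processes start, so after adding orai18n.jar the JobManager and every TaskManager need to be restarted, and the jar has to be present on every TaskManager node, not just the machine that submits the job.

# Confirm the NLS jar is actually in lib/ on this node
ls $FLINK_HOME/lib | grep -i orai18n

# lib/ is read at process startup, so restart the cluster after adding the jar
# (standalone scripts shown here; adjust for YARN or Kubernetes deployments)
$FLINK_HOME/bin/stop-cluster.sh
$FLINK_HOME/bin/start-cluster.sh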
The command I ran:
bin/flink run \
-Dexecution.checkpointing.interval=30s \
-Dparallelism.default=1 \
-c org.apache.doris.flink.tools.cdc.CdcTools \
lib/flink-cdc-pipeline-connector-doris-3.1.0.jar \
oracle-sync-database \
--database cs_info_dw \
--oracle-conf url=jdbc:oracle:thin:@172.17.8.235:1521/csdwuat \
--oracle-conf username=cs_info_stg \
--oracle-conf password=info_stg01 \
--oracle-conf database-name=CSDWUAT \
--oracle-conf schema-name=CS_INFO_STG \
--including-tables "PERSONS_1" \
--sink-conf fenodes=172.17.9.174:8030 \
--sink-conf username=root \
--sink-conf password= \
--sink-conf jdbc-url=jdbc:mysql://172.17.9.174:9030 \
--sink-conf sink.label-prefix=label \
--table-conf replication_num=1
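The JobManager-side stack trace above only says that the asynchronous checkpoint failed; the exception that actually caused the Oracle CDC source task to fail its checkpoint is normally logged on the TaskManager side. A rough way to locate it, assuming a standalone deployment with the default log directory (file names differ under YARN or Kubernetes):

# Search the TaskManager logs for the underlying exception behind
# "Asynchronous task checkpoint failed"
grep -B 5 -A 30 -i "checkpoint" $FLINK_HOME/log/flink-*-taskexecutor-*.log

The Checkpoint History tab in the Web UI also shows the per-subtask failure message, which may point at the real error, for example a decoding failure while reading the ZHS16GBK data.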