Date: Wed, 1 Jan 2025 13:04:21 +0800
Subject: [PATCH 57/73] chore: adjust badges
---
README.md | 10 ++++------
1 file changed, 4 insertions(+), 6 deletions(-)
diff --git a/README.md b/README.md
index ed04fafee1..e4814cee67 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,3 @@
-
-
-[](https://github.com/taosdata/TDengine/actions/workflows/taosd-ci-build.yml)
+[](https://github.com/taosdata/TDengine/actions/workflows/taosd-ci-build.yml)
[](https://coveralls.io/github/taosdata/TDengine?branch=3.0)
+
+
+

[](https://bestpractices.coreinfrastructure.org/projects/4201)
@@ -20,9 +21,6 @@
[](https://discord.com/invite/VZdSuUg4pS)
[](https://www.linkedin.com/company/tdengine)
[](https://stackoverflow.com/questions/tagged/tdengine)
-
-
-
English | [简体中文](README-CN.md) | [TDengine Cloud](https://cloud.tdengine.com) | [Learn more about TSDB](https://tdengine.com/tsdb/)
From b999303ac868d76f0e05f3496836b1a1e702035e Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Wed, 1 Jan 2025 13:14:35 +0800
Subject: [PATCH 58/73] docs: add FAQ problems 33~35
---
docs/en/27-train-faq/index.md | 19 +++++++++++++++++++
docs/zh/27-train-faq/01-faq.md | 10 ++++++++++
2 files changed, 29 insertions(+)
diff --git a/docs/en/27-train-faq/index.md b/docs/en/27-train-faq/index.md
index ca6cd91714..aa85729102 100644
--- a/docs/en/27-train-faq/index.md
+++ b/docs/en/27-train-faq/index.md
@@ -297,3 +297,22 @@ Reporting this error indicates that the first connection to the cluster was succ
Therefore, first check whether all ports on the server and cluster (default 6030 for native connections and 6041 for HTTP connections) are open; next, check whether the client's hosts file has configured the FQDN and IP information for all dnodes in the cluster.
If the issue still cannot be resolved, contact TDengine technical support.
+
+### 32 Why is the original database lost and a new cluster ID generated when the data directory (dataDir) remains unchanged on the same server?
+Background: When the TDengine server process (taosd) starts, if there are no valid data file subdirectories (such as mnode, dnode, and vnode) under the data directory (dataDir, specified in the configuration file taos.cfg), these directories are created automatically. When a new mnode directory is created, a new cluster ID is allocated and a new cluster is generated.
+
+Cause analysis: The data directory dataDir of taosd can point to multiple different mount points. If these mount points are not configured for automatic mounting in the fstab file, then after the server restarts, dataDir exists only as a normal directory on the local disk and no longer points to the mounted disk as expected. If the taosd service is then started, it creates new subdirectories under dataDir and generates a new cluster.
+
+Impact of the problem: After the server restarts, the original database is lost (note: it is not really lost; the original data disk is simply not mounted and temporarily not visible) and the cluster ID changes, making the original database inaccessible. For enterprise users who have been authorized by cluster ID, the machine code of the cluster server remains unchanged, yet the original authorization has expired. If the problem is not monitored, or not found and handled in time, the user will not notice that the original database has been lost, resulting in losses and increased operation and maintenance costs.
+
+Problem solving: Configure automatic mounting of the dataDir directory in the fstab file to ensure that dataDir always points to the expected mount point and directory. Restarting the server then restores the original database and cluster. In a future version, we will add a check so that taosd detects a dataDir change across restarts, exits during the startup phase, and reports a corresponding error.
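+
+As a minimal sketch (the device UUID, filesystem type, and mount point below are placeholders; replace them with your own values), an fstab entry that keeps the dataDir mount persistent across reboots could look like this:
+
+```text
+# /etc/fstab -- mount the data disk at the directory that dataDir in taos.cfg points to
+UUID=your-disk-uuid  /var/lib/taos  ext4  defaults,nofail  0  2
+```
+
+After editing fstab, running `mount -a` verifies that the entry mounts cleanly before the next reboot.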
+
+### 33 How to resolve a missing msvcp140.dll when running TDengine on Windows?
+1. Reinstall Microsoft Visual C++ Redistributable: As msvcp140.dll is part of Microsoft Visual C++ Redistributable, reinstalling this package usually resolves most issues. You can download the corresponding version from the official Microsoft website and install it.
+2. Manually download and replace the msvcp140.dll file: You can download the msvcp140.dll file from a reliable source and copy it to the corresponding system directory. Ensure that the downloaded file matches your system architecture (32-bit or 64-bit) and that the source is trustworthy.
+
+### 34 Which is faster: querying a supertable with a TAG filter, or querying a child table directly?
+Querying a child table directly is faster. Querying a supertable with a TAG filter is designed for convenience: it can filter data across multiple child tables at once. If the goal is performance and the target child table is already known, querying that child table directly achieves higher performance.
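+
+As an illustration (the supertable meters, the tag location, and the child table d1001 are hypothetical names used only for this sketch), the two query styles look like this:
+
+```sql
+-- Convenient: scans matching child tables of the supertable, filtered by tag
+SELECT * FROM meters WHERE location = 'California.SanFrancisco';
+
+-- Faster when the target child table is already known: reads that table directly
+SELECT * FROM d1001;
+```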
+
+### 35 How to view data compression ratio indicators?
+Currently, TDengine only provides compression ratios at the table level, not for databases or the entire system. To view the compression ratio, execute the `SHOW TABLE DISTRIBUTED table_name;` command in the TDengine CLI (taos). The table_name can be a supertable, regular table, or subtable. For details [Click Here](https://docs.tdengine.com/tdengine-reference/sql-manual/show-commands/#show-table-distributed)
\ No newline at end of file
diff --git a/docs/zh/27-train-faq/01-faq.md b/docs/zh/27-train-faq/01-faq.md
index 8b4cabf520..1b2f2c9fcc 100644
--- a/docs/zh/27-train-faq/01-faq.md
+++ b/docs/zh/27-train-faq/01-faq.md
@@ -302,3 +302,13 @@ TDinsight插件中展示的数据是通过taosKeeper和taosAdapter服务收集
Impact of the problem: After the server restarts, the original database is lost (note: it is not really lost; the original data disk is simply not mounted and temporarily not visible) and the cluster ID changes, making the original database inaccessible. For enterprise edition users who have been authorized by cluster ID, the machine code of the cluster server remains unchanged, yet the original authorization has expired. If the problem is not monitored, or not found and handled in time, the user will not notice that the original database has been lost, resulting in losses and increased operation and maintenance costs.
Problem solving: Configure automatic mounting of the dataDir directory in the fstab file to ensure that dataDir always points to the expected mount point and directory. Restarting the server then restores the original database and cluster. In a future version, we will add a check so that taosd detects a dataDir change across restarts, exits during the startup phase, and reports a corresponding error.
+
+### 33 How to resolve a missing msvcp140.dll when running TDengine on Windows?
+1. Reinstall Microsoft Visual C++ Redistributable: As msvcp140.dll is part of Microsoft Visual C++ Redistributable, reinstalling this package usually resolves most issues. You can download the corresponding version from the official Microsoft website and install it.
+2. Manually download and replace the msvcp140.dll file: You can download the msvcp140.dll file from a reliable source and copy it to the corresponding system directory. Ensure that the downloaded file matches your system architecture (32-bit or 64-bit) and that the source is trustworthy.
+
+### 34 Which is faster: querying a supertable with a TAG filter, or querying a child table directly?
+Querying a child table directly is faster. Querying a supertable with a TAG filter is designed for convenience: it can filter data across multiple child tables at once. If the goal is performance and the target child table is already known, querying that child table directly achieves higher performance.
+
+### 35 How to view data compression ratio indicators?
+Currently, TDengine only provides compression ratios at the table level, not for databases or the entire system. To view the compression ratio, execute the `SHOW TABLE DISTRIBUTED table_name;` command in the TDengine CLI (taos). The table_name can be a supertable, regular table, or subtable. For details [see here](https://docs.taosdata.com/reference/taos-sql/show/#show-table-distributed)
\ No newline at end of file
From 06a28cbe330f683154c0f3c89d06058bff08cbbf Mon Sep 17 00:00:00 2001
From: WANG Xu
Date: Wed, 1 Jan 2025 13:26:09 +0800
Subject: [PATCH 59/73] chore: correct typo in workflow name
---
.github/workflows/taoskeeper-ci.yml | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/.github/workflows/taoskeeper-ci.yml b/.github/workflows/taoskeeper-ci.yml
index fbc662ffb2..7f84eaa401 100644
--- a/.github/workflows/taoskeeper-ci.yml
+++ b/.github/workflows/taoskeeper-ci.yml
@@ -1,4 +1,4 @@
-name: TaosKeeper CI
+name: taosKeeper CI
on:
push:
From b941f3654312d860d86352d07bcb2de04e5436a3 Mon Sep 17 00:00:00 2001
From: WANG Xu
Date: Wed, 1 Jan 2025 13:26:38 +0800
Subject: [PATCH 60/73] chore: delete lgtm config
---
.lgtm.yml | 402 ------------------------------------------------------
1 file changed, 402 deletions(-)
delete mode 100644 .lgtm.yml
diff --git a/.lgtm.yml b/.lgtm.yml
deleted file mode 100644
index fbcedead43..0000000000
--- a/.lgtm.yml
+++ /dev/null
@@ -1,402 +0,0 @@
-##########################################################################################
-# Customize file classifications. #
-# Results from files under any classifier will be excluded from LGTM #
-# statistics. #
-##########################################################################################
-
-##########################################################################################
-# Use the `path_classifiers` block to define changes to the default classification of #
-# files. #
-##########################################################################################
-
-path_classifiers:
- # docs:
- # Identify the top-level file called `generate_javadoc.py` as documentation-related.
- test:
- # Override LGTM's default classification of test files by excluding all files.
- - exclude: /
- # Classify all files in the top-level directories tests/ and testsuites/ as test code.
- - tests
- # - testsuites
- # Classify all files with suffix `.test` as test code.
- # Note: use only forward slash / as a path separator.
- # Use ** to indicate an arbitrary parent path.
- # Use * to indicate any sequence of characters excluding /.
- # Always enclose the expression in double quotes if it includes *.
- # - "**/*.test"
- # Refine the classifications above by excluding files in test/util/.
- # - exclude: test/util
- # The default behavior is to tag all files created during the
- # build as `generated`. Results are hidden for generated code. You can tag
- # further files as being generated by adding them to the `generated` section.
- generated:
- # Exclude all `*.c` files under the `ui/` directory from classification as
- # generated code.
- # - exclude: ui/**/*.c
- # By default, all files not checked into the repository are considered to be
- # 'generated'.
- # The default behavior is to tag library code as `library`. Results are hidden
- # for library code. You can tag further files as being library code by adding them
- # to the `library` section.
- library:
- - exclude: deps/
- # The default behavior is to tag template files as `template`. Results are hidden
- # for template files. You can tag further files as being template files by adding
- # them to the `template` section.
- template:
- #- exclude: path/to/template/code/**/*.c
- # Define your own category, for example: 'some_custom_category'.
- some_custom_category:
- # Classify all files in the top-level directory tools/ (or the top-level file
- # called tools).
- # - tools
-
-#########################################################################################
-# Use the `queries` block to change the default display of query results. #
-#########################################################################################
-
- # queries:
- # Start by hiding the results of all queries.
- # - exclude: "*"
- # Then include all queries tagged 'security' and 'correctness', and with a severity of
- # 'error'.
- # - include:
- # tags:
- # - "security"
- # - "correctness"
- # severity: "error"
- # Specifically hide the results of two queries.
- # - exclude: cpp/use-of-goto
- # - exclude: java/equals-on-unrelated-types
- # Refine by including the `java/command-line-injection` query.
- # - include: java/command-line-injection
-
-#########################################################################################
-# Define changes to the default code extraction process. #
-# Each block configures the extraction of a single language, and modifies actions in a #
-# named step. Every named step includes automatic default actions, #
-# except for the 'prepare' step. The steps are performed in the following sequence: #
-# prepare #
-# after_prepare #
-# configure (C/C++ only) #
-# python_setup (Python only) #
-# before_index #
-# index #
-##########################################################################################
-
-#########################################################################################
-# Environment variables available to the steps: #
-#########################################################################################
-
-# LGTM_SRC
-# The root of the source tree.
-# LGTM_WORKSPACE
-# An existing (initially empty) folder outside the source tree.
-# Used for temporary download and setup commands.
-
-#########################################################################################
-# Use the extraction block to define changes to the default code extraction process #
-# for one or more languages. The settings for each language are defined in a child #
-# block, with one or more steps. #
-#########################################################################################
-
-extraction:
- # Define settings for C/C++ analysis
- #####################################
- cpp:
- # The `prepare` step exists for customization on LGTM.com only.
- prepare:
- # # The `packages` section is valid for LGTM.com only. It names Ubuntu packages to
- # # be installed.
- packages:
- - cmake
- # Add an `after-prepare` step if you need to run commands after the prepare step.
- # Each command should be listed on a separate line.
- # This step is useful for C/C++ analysis where you want to prepare the environment
- # for the `configure` step without changing the default behavior for that step.
- # after_prepare:
- #- export GNU_MAKE=make
- #- export GIT=true
- # The `configure` step generates build configuration files which the `index` step
- # then uses to build the codebase.
- configure:
- command:
- - mkdir build
- - cd build
- - cmake ..
- # - ./prepare_deps
- # Optional step. You should add a `before_index` step if you need to run commands
- # before the `index` step.
- # before_index:
- # - export BOOST_DIR=$LGTM_SRC/boost
- # - export GTEST_DIR=$LGTM_SRC/googletest
- # - export HUNSPELL_DIR=$LGTM_SRC/hunspell
- # - export CRYPTOPP_DIR=$LGTM_SRC/cryptopp
- # The `index` step builds the code and extracts information during the build
- # process.
- index:
- # Override the autobuild process by specifying a list of custom build commands
- # to use instead.
- build_command:
- - cd build
- - make
- # - $GNU_MAKE -j2 -s
- # Specify that all project or solution files should be used for extraction.
- # Default: false.
- # all_solutions: true
- # Specify a list of one or more project or solution files for extraction.
- # Default: LGTM chooses the file closest to the root of the repository (this may
- # fail if there are multiple candidates).
- # solution:
- # - myProject.sln
- # Specify MSBuild settings
- # msbuild:
- # Specify a list of additional arguments to MSBuild. Default: empty.
- # arguments: /p:Platform=x64 /p:Configuration=Release
- # Specify the MSBuild configuration to use, for example, debug or release.
- # Default: read from the solution file or files.
- # configuration:
- # Specify the platform to target, for example: x86, x64, or Any CPU.
- # Default: read from the solution file or files.
- # platform:
- # Specify the MSBuild target. Default: rebuild.
- # target:
- # Specify whether or not to perform a NuGet restore for extraction. Default: true.
- # nuget_restore: false
- # Specify a version of Microsoft Visual Studio to use for MSBuild or any custom
- # build commands (build_command). For example:
- # 10 for Visual Studio 2010
- # 12 for Visual Studio 2012
- # 14 for Visual Studio 2015
- # 15 for Visual Studio 2017
- # Default: read from project files.
- # vstools_version: 10
-
- # Define settings for C# analysis
- ##################################
- # csharp:
- # The `prepare` step exists for customization on LGTM.com only.
- # prepare:
- # packages:
- # - example_package
- # Add an `after-prepare` step if you need to run commands after the `prepare` step.
- # Each command should be listed on a separate line.
- # after_prepare:
- # - export PATH=$LGTM_WORKSPACE/tools:$PATH
- # The `index` step builds the code and extracts information during the build
- # process.
- #index:
- # Specify that all project or solution files should be used for extraction.
- # Default: false.
- # all_solutions: true
- # Specify a list of one or more project or solution files for extraction.
- # Default: LGTM chooses the file closest to the root of the repository (this may
- # fail if there are multiple candidates).
- # solution:
- # - myProject.sln
- # Override the autobuild process by specifying a list of custom build commands
- # to use instead.
- # build_command:
- # - ./example-compile-all.sh
- # By default, LGTM analyzes the code by building it. You can override this,
- # and tell LGTM not to build the code. Beware that this can lead
- # to less accurate results.
- # buildless: true
- # Specify .NET Core settings.
- # dotnet:
- # Specify additional arguments to `dotnet build`.
- # Default: empty.
- # arguments: "example_arg"
- # Specify the version of .NET Core SDK to use.
- # Default: The version installed on the build machine.
- # version: 2.1
- # Specify MSBuild settings.
- # msbuild:
- # Specify a list of additional arguments to MSBuild. Default: empty.
- # arguments: /P:WarningLevel=2
- # Specify the MSBuild configuration to use, for example, debug or release.
- # Default: read from the solution file or files.
- # configuration: release
- # Specify the platform to target, for example: x86, x64, or Any CPU.
- # Default: read from the solution file or files.
- # platform: x86
- # Specify the MSBuild target. Default: rebuild.
- # target: notest
- # Specify whether or not to perform a NuGet restore for extraction. Default: true.
- # nuget_restore: false
- # Specify a version of Microsoft Visual Studio to use for MSBuild or any custom
- # build commands (build_command). For example:
- # 10 for Visual Studio 2010
- # 12 for Visual Studio 2012
- # 14 for Visual Studio 2015
- # 15 for Visual Studio 2017
- # Default: read from project files
- # vstools_version: 10
- # Specify additional options for the extractor,
- # for example --fast to perform a faster extraction that produces a smaller
- # database.
- # extractor: "--fast"
-
- # Define settings for Go analysis
- ##################################
- # go:
- # The `prepare` step exists for customization on LGTM.com only.
- # prepare:
- # packages:
- # - example_package
- # Add an `after-prepare` step if you need to run commands after the `prepare` step.
- # Each command should be listed on a separate line.
- # after_prepare:
- # - export PATH=$LGTM_WORKSPACE/tools:$PATH
- # The `index` step builds the code and extracts information during the build
- # process.
- # index:
- # Override the autobuild process by specifying a list of custom build commands
- # to use instead.
- # build_command:
- # - ./compile-all.sh
-
- # Define settings for Java analysis
- ####################################
- # java:
- # The `prepare` step exists for customization on LGTM.com only.
- # prepare:
- # packages:
- # - example_package
- # Add an `after-prepare` step if you need to run commands after the prepare step.
- # Each command should be listed on a separate line.
- # after_prepare:
- # - export PATH=$LGTM_WORKSPACE/tools:$PATH
- # The `index` step extracts information from the files in the codebase.
- # index:
- # Specify Gradle settings.
- # gradle:
- # Specify the required Gradle version.
- # Default: determined automatically.
- # version: 4.4
- # Override the autobuild process by specifying a list of custom build commands
- # to use instead.
- # build_command: ./compile-all.sh
- # Specify the Java version required to build the project.
- # java_version: 11
- # Specify whether to extract Java .properties files
- # Default: false
- # properties_files: true
- # Specify Maven settings.
- # maven:
- # Specify the path (absolute or relative) of a Maven settings file to use.
- # Default: Maven uses a settings file in the default location, if it exists.
- # settings_file: /opt/share/settings.xml
- # Specify the path of a Maven toolchains file.
- # Default: Maven uses a toolchains file in the default location, if it exists.
- # toolchains_file: /opt/share/toolchains.xml
- # Specify the required Maven version.
- # Default: the Maven version is determined automatically, where feasible.
- # version: 3.5.2
- # Specify how XML files should be extracted:
- # all = extract all XML files.
- # default = only extract XML files named `AndroidManifest.xml`, `pom.xml`, and `web.xml`.
- # disabled = do not extract any XML files.
- # xml_mode: all
-
- # Define settings for JavaScript analysis
- ##########################################
- # javascript:
- # The `prepare` step exists for customization on LGTM.com only.
- # prepare:
- # packages:
- # - example_package
- # Add an `after-prepare` step if you need to run commands after the prepare step.
- # Each command should be listed on a separate line.
- # after_prepare:
- # - export PATH=$LGTM_WORKSPACE/tools:$PATH
- # The `index` step extracts information from the files in the codebase.
- # index:
- # Specify a list of files and folders to extract.
- # Default: The project root directory.
- # include:
- # - src/js
- # Specify a list of files and folders to exclude from extraction.
- # exclude:
- # - thirdparty/lib
- # You can add additional file types for LGTM to extract, by mapping file
- # extensions (including the leading dot) to file types. The usual
- # include/exclude patterns apply, so, for example, `.jsm` files under
- # `thirdparty/lib` will not be extracted.
- # filetypes:
- # ".jsm": "js"
- # ".tmpl": "html"
- # Specify a list of glob patterns to include/exclude files from extraction; this
- # is applied on top of the include/exclude paths from above; patterns are
- # processed in the same way as for path classifiers above.
- # Default: include all files with known extensions (such as .js, .ts and .html),
- # but exclude files ending in `-min.js` or `.min.js` and folders named `node_modules`
- # or `bower_components`
- # filters:
- # exclude any *.ts files anywhere.
- # - exclude: "**/*.ts"
- # but include *.ts files under src/js/typescript.
- # - include: "src/js/typescript/**/*.ts"
- # Specify how TypeScript files should be extracted:
- # none = exclude all TypeScript files.
- # basic = extract syntactic information from TypeScript files.
- # full = extract syntactic and type information from TypeScript files.
- # Default: full.
- # typescript: basic
- # By default, LGTM doesn't extract any XML files. You can override this by
- # using the `xml_mode` property and setting it to `all`.
- # xml_mode: all
-
- # Define settings for Python analysis
- ######################################
- # python:
- # # The `prepare` step exists for customization on LGTM.com only.
- # # prepare:
- # # # The `packages` section is valid for LGTM.com only. It names packages to
- # # # be installed.
- # # packages: libpng-dev
- # # This step is useful for Python analysis where you want to prepare the
- # # environment for the `python_setup` step without changing the default behavior
- # # for that step.
- # after_prepare:
- # - export PATH=$LGTM_WORKSPACE/tools:$PATH
- # # This sets up the Python interpreter and virtual environment, ready for the
- # # `index` step to extract the codebase.
- # python_setup:
- # # Specify packages that should NOT be installed despite being mentioned in the
- # # requirements.txt file.
- # # Default: no package marked for exclusion.
- # exclude_requirements:
- # - pywin32
- # # Specify a list of pip packages to install.
- # # If any of these packages cannot be installed, the extraction will fail.
- # requirements:
- # - Pillow
- # # Specify a list of requirements text files to use to set up the environment,
- # # or false for none. Default: any requirements.txt, test-requirements.txt,
- # # and similarly named files identified in the codebase are used.
- # requirements_files:
- # - required-packages.txt
- # # Specify a setup.py file to use to set up the environment, or false for none.
- # # Default: any setup.py files identified in the codebase are used in preference
- # # to any requirements text files.
- # setup_py: new-setup.py
- # # Override the version of the Python interpreter used for setup and extraction
- # # Default: Python 3.
- # version: 2
- # # Optional step. You should add a `before_index` step if you need to run commands
- # # before the `index` step.
- # before_index:
- # - antlr4 -Dlanguage=Python3 Grammar.g4
- # # The `index` step extracts information from the files in the codebase.
- # index:
- # # Specify a list of files and folders to exclude from extraction.
- # # Default: Git submodules and Subversion externals.
- # exclude:
- # - legacy-implementation
- # - thirdparty/libs
- # filters:
- # - exclude: "**/documentation/examples/snippets/*.py"
- # - include: "**/documentation/examples/test_application/*"
- # include:
- # - example/to/include
From 5bef1dc3bd4c7a5d96307e4809ab68fe7f544bf5 Mon Sep 17 00:00:00 2001
From: menshibin
Date: Wed, 1 Jan 2025 16:24:35 +0800
Subject: [PATCH 61/73] modify flink connector docs param note
---
docs/en/10-third-party/01-collection/flink.md | 205 +++++++++---------
.../10-third-party/01-collection/12-flink.md | 107 ++++-----
2 files changed, 158 insertions(+), 154 deletions(-)
diff --git a/docs/en/10-third-party/01-collection/flink.md b/docs/en/10-third-party/01-collection/flink.md
index e716d5a757..f586b36d0e 100644
--- a/docs/en/10-third-party/01-collection/flink.md
+++ b/docs/en/10-third-party/01-collection/flink.md
@@ -26,7 +26,7 @@ Flink Connector supports all platforms that can run Flink 1.19 and above version
| Flink Connector Version | Major Changes | TDengine Version|
|-------------------------| ------------------------------------ | ---------------- |
-| 2.0.0 | 1. Support SQL queries on data in TDengine database 2 Support CDC subscription to data in TDengine database 3 Supports reading and writing to TDengine database using Table SQL | 3.3.5.0 and above versions|
+| 2.0.0 | 1. Support SQL queries on data in TDengine database <br> 2. Support CDC subscription to data in TDengine database <br> 3. Support reading and writing to TDengine database using Table SQL | 3.3.5.0 and above versions|
| 1.0.0 | Support Sink function to write data from other sources into TDengine| 3.3.2.0 and above versions|
## Exception and error codes
@@ -36,39 +36,39 @@ Please refer to:
| Error Code | Description | Suggested Actions |
| ---------------- |------------------------------------------------------- | -------------------- |
-|0xa000 | connection param error | connector parameter error
-|0xa001 | The groupid parameter of CDC is incorrect | The groupid parameter of CDC is incorrect|
-|0xa002 | wrong topic parameter for CDC | The topic parameter for CDC is incorrect|
-|0xa010 | database name configuration error | database name configuration error|
-|0xa011 | Table name configuration error | Table name configuration error|
-|0xa012 | No data was obtained from the data source | Failed to retrieve data from the data source|
-|0xa013 | value.deserializer parameter not set | No serialization method set|
-|0xa014 | List of column names for target table not set | List of column names for target table not set ||
-|0x2301 | Connection already closed | The connection has been closed. Check the connection status or create a new connection to execute the relevant instructions|
-|0x2302 | this operation is NOT supported currently | The current interface is not supported, you can switch to other connection methods|
-|0x2303 | invalid variables | The parameter is invalid. Please check the corresponding interface specification and adjust the parameter type and size|
-|0x2304 | Statement is closed | Statement has already been closed. Please check if the statement is closed and reused, or if the connection is working properly|
-|0x2305 | ResultSet is closed | The ResultSet has been released. Please check if the ResultSet has been released and used again|
-|0x230d | parameter index out of range | parameter out of range, please check the reasonable range of the parameter|
-|0x230e | Connection already closed | The connection has been closed. Please check if the connection is closed and used again, or if the connection is working properly|
-|0x230f | unknown SQL type in TDengine | Please check the Data Type types supported by TDengine|
-|0x2315 | unknown tao type in TDengine | Did the correct TDengine data type be specified when converting TDengine data type to JDBC data type|
-|0x2319 | user is required | Username information is missing when creating a connection|
-|0x231a | password is required | Password information is missing when creating a connection|
-|0x231d | can't create connection with server within | Increase connection time by adding the parameter httpConnectTimeout, or check the connection status with taosAdapter|
-|0x231e | failed to complete the task within the specified time | Increase execution time by adding the parameter messageWaitTimeout, or check the connection with taosAdapter|
-|0x2352 | Unsupported encoding | An unsupported character encoding set was specified under the local connection|
-|0x2353 |internal error of database, Please see taoslog for more details | An error occurred while executing prepareStatement on the local connection. Please check the taoslog for problem localization|
-|0x2354 | Connection is NULL | Connection has already been closed while executing the command on the local connection. Please check the connection with TDengine|
-|0x2355 | result set is NULL | Local connection to obtain result set, result set exception, please check connection status and retry|
-|0x2356 | invalid num of fields | The meta information obtained from the local connection result set does not match|
-|0x2357 | empty SQL string | Fill in the correct SQL for execution|
-|0x2371 |consumer properties must not be null | When creating a subscription, the parameter is empty. Please fill in the correct parameter|
-|0x2375 | Topic reference has been destroyed | During the process of creating a data subscription, the topic reference was released. Please check the connection with TDengine|
-|0x2376 |failed to set consumer topic, Topic name is empty | During the process of creating a data subscription, the subscription topic name is empty. Please check if the specified topic name is filled in correctly|
-|0x2377 | Consumer reference has been destroyed | The subscription data transmission channel has been closed, please check the connection with TDengine|
-|0x2378 | Consumer create error | Failed to create data subscription. Please check the taos log based on the error message to locate the problem|
-|0x237a | vGroup not found in result set VGroup | Not assigned to the current consumer, due to the Rebalance mechanism, the relationship between Consumer and VGroup is not bound|
+|0xa000 | connection param error | Connector parameter error.|
+|0xa001 | the groupid parameter of CDC is incorrect | The groupid parameter of CDC is incorrect.|
+|0xa002 | wrong topic parameter for CDC | The topic parameter for CDC is incorrect.|
+|0xa010 | database name configuration error | Database name configuration error.|
+|0xa011 | table name configuration error | Table name configuration error.|
+|0xa012 | no data was obtained from the data source | Failed to retrieve data from the data source.|
+|0xa013 | value.deserializer parameter not set | No serialization method set.|
+|0xa014 | list of column names set incorrectly | List of column names for target table not set. |
+|0x2301 | connection already closed | The connection has been closed. Check the connection status or create a new connection to execute the relevant instructions.|
+|0x2302 | this operation is NOT supported currently | The current interface is not supported, you can switch to other connection methods.|
+|0x2303 | invalid variables | The parameter is invalid. Please check the corresponding interface specification and adjust the parameter type and size.|
+|0x2304 | statement is closed | Statement has already been closed. Please check if the statement is closed and reused, or if the connection is working properly.|
+|0x2305 | resultSet is closed | The ResultSet has been released. Please check if the ResultSet has been released and used again.|
+|0x230d | parameter index out of range | The parameter index is out of range. Please check the valid range of the parameter.|
+|0x230e | connection already closed | The connection has been closed. Please check whether a closed connection was reused, or whether the connection is working properly.|
+|0x230f | unknown SQL type in TDengine | Please check the data types supported by TDengine.|
+|0x2315 | unknown taos type in TDengine | Check whether the correct TDengine data type was specified when converting a TDengine data type to a JDBC data type.|
+|0x2319 | user is required | Username information is missing when creating a connection.|
+|0x231a | password is required | Password information is missing when creating a connection.|
+|0x231d | can't create connection with server within | Increase the connection timeout via the httpConnectTimeout parameter, or check the connection to taosAdapter.|
+|0x231e | failed to complete the task within the specified time | Increase the execution timeout via the messageWaitTimeout parameter, or check the connection to taosAdapter.|
+|0x2352 | unsupported encoding | An unsupported character encoding set was specified under the local connection.|
+|0x2353 | internal error of database, Please see taoslog for more details | An error occurred while executing prepareStatement on the local connection. Please check the taoslog for problem localization.|
+|0x2354 | connection is NULL | Connection has already been closed while executing the command on the local connection. Please check the connection with TDengine.|
+|0x2355 | result set is NULL | The result set obtained over the local connection is abnormal. Please check the connection status and retry.|
+|0x2356 | invalid num of fields | The meta information obtained from the local connection result set does not match.|
+|0x2357 | empty SQL string | Fill in the correct SQL for execution.|
+|0x2371 | consumer properties must not be null | When creating a subscription, the parameter is empty. Please fill in the correct parameter.|
+|0x2375 | topic reference has been destroyed | During the process of creating a data subscription, the topic reference was released. Please check the connection with TDengine.|
+|0x2376 | failed to set consumer topic, Topic name is empty | During the process of creating a data subscription, the subscription topic name is empty. Please check if the specified topic name is filled in correctly.|
+|0x2377 | consumer reference has been destroyed | The subscription data transmission channel has been closed. Please check the connection with TDengine.|
+|0x2378 | consumer create error | Failed to create the data subscription. Please check the taos log based on the error message to locate the problem.|
+|0x237a | vGroup not found in result set | The VGroup is not assigned to the current consumer; due to the rebalance mechanism, the relationship between consumer and VGroup is not bound.|
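
As a minimal sketch of consuming these codes (the helper name `JdbcErrorAdvice` and the advice strings are ours, not part of the connector), a caller can dispatch on `SQLException.getErrorCode()` against the values in the table above:

```java
import java.sql.SQLException;

public class JdbcErrorAdvice {
    // Map a taos-jdbcdriver error code (from the table above) to a remediation hint.
    static String adviceFor(int errorCode) {
        switch (errorCode) {
            case 0x2301: return "connection already closed: open a new connection before retrying";
            case 0x2305: return "resultSet is closed: do not reuse a released ResultSet";
            case 0x231d: return "connect timeout: raise httpConnectTimeout or check taosAdapter";
            case 0x231e: return "task timeout: raise messageWaitTimeout or check taosAdapter";
            default:     return "unhandled code 0x" + Integer.toHexString(errorCode)
                                + ": consult the taos log";
        }
    }

    public static void main(String[] args) {
        // In real use, the code comes from a caught exception via e.getErrorCode().
        SQLException e = new SQLException("connection already closed", null, 0x2301);
        System.out.println(adviceFor(e.getErrorCode()));
    }
}
```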
## Data type mapping
@@ -96,13 +96,13 @@ TDengine currently supports timestamp, number, character, and boolean types, and
The reasons for using At Least Once semantics are:
--TDengine currently does not support transactions and cannot perform frequent checkpoint operations and complex transaction coordination.
--Due to TDengine's use of timestamps as primary keys, downstream operators of duplicate data can perform filtering operations to avoid duplicate calculations.
--Using At Least One (at least once) to ensure high data processing performance and low data latency, the setting method is as follows:
+- TDengine currently does not support transactions and cannot perform frequent checkpoint operations and complex transaction coordination.
+- Due to TDengine's use of timestamps as primary keys, downstream operators of duplicate data can perform filtering operations to avoid duplicate calculations.
+- Using At Least Once ensures high data processing performance and low data latency. It can be configured as follows:
Instructions:
-```text
+```java
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.enableCheckpointing(5000);
env.getCheckpointConfig().setCheckpointingMode(CheckpointingMode.AT_LEAST_ONCE);
@@ -121,7 +121,7 @@ If using Maven to manage a project, simply add the following dependencies in pom
The parameters for establishing a connection include URL and Properties.
The URL specification format is:
-`jdbc: TAOS-WS://[host_name]:[port]/[database_name]? [user={user}|&password={password}|&timezone={timezone}]`
+`jdbc:TAOS-WS://[host_name]:[port]/[database_name]?[user={user}|&password={password}|&timezone={timezone}]`
Parameter description:
@@ -142,18 +142,17 @@ By setting the parallelism of the data source, multiple threads can read data fr
The configuration parameters in Properties are as follows:
-|Parameter Name | Type | Parameter Description | Remarks|
-| ----------------------- | :-----: | ------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| TDengineConfigParams.PROPERTYKEYUSER | string | Login TDengine username, default value 'root' ||
-| TDengineConfigParams.PROPERTYKEY-PASSWORD | string | User login password, default value 'taosdata' ||
-| TDengineConfigParams.If the downstream operator receives data of RowData type, it only needs to be set to RowData. If the user needs to customize the type, the complete class path needs to be set here|
-| TDengineConfigParams.TD_STACTMODE | boolean | This parameter is used to batch push data to downstream operators. If set to True, when creating a TDengine Source object, the data type needs to be specified as SourceRecords \| The type here is the type used to receive data from downstream operators|
-| TDengineConfigParams.PROPERTYKEY_CARSET | string | The character set used by the client, with the default value being the system character set. ||
-| TDengineConfigParams.PROPERTYKEY.MSSAGE_maIT_TIMEOUT | integer | Message timeout, in milliseconds, default value is 60000 ||
-| TDengineConfigParams.Whether compression is enabled during the transmission process. true: Enable, false: Not enabled. Default is false ||
-| TDengineConfigParams.Whether to enable automatic reconnection or not. true: Enable, false: Not enabled. Default to false||
-| TDengineConfigParams.PROPERTYKEY-RECONNECT-RETR_COUNT | integer | number of automatic reconnection retries, default value 3 | only takes effect when PROPERTYKEY-INABLE AUTO-RECONNECT is true|
-| TDengineConfigParams.PROPERTYKEYDISABLE_SSL_CERTVNet | boolean | Disable SSL certificate verification. true: close, false: Not closed. The default is false||
+- TDengineConfigParams.PROPERTY_KEY_USER: Username for logging in to TDengine; default value is 'root'.
+- TDengineConfigParams.PROPERTY_KEY_PASSWORD: User login password; default value is 'taosdata'.
+- TDengineConfigParams.VALUE_DESERIALIZER: The deserialization method for the result set received by the downstream operator. If the received result set type is `RowData` of `Flink`, it only needs to be set to `RowData`. It is also possible to inherit `TDengineRecordDeserialization` and implement the `convert` and `getProducedType` methods to customize deserialization based on the `ResultSet` of `SQL`.
+- TDengineConfigParams.TD_BATCH_MODE: This parameter is used to push data to downstream operators in batches. If set to True, when creating the `TDengine Source` object, the data type must be specified as the `SourceRecords` template type.
+- TDengineConfigParams.PROPERTY_KEY_MESSAGE_WAIT_TIMEOUT: Message timeout, in milliseconds; default value is 60000.
+- TDengineConfigParams.PROPERTY_KEY_ENABLE_COMPRESSION: Whether compression is enabled during transmission. true: enabled, false: not enabled. The default is false.
+- TDengineConfigParams.PROPERTY_KEY_ENABLE_AUTO_RECONNECT: Whether to enable automatic reconnection. true: enabled, false: not enabled. The default is false.
+- TDengineConfigParams.PROPERTY_KEY_RECONNECT_INTERVAL_MS: Automatic reconnection retry interval, in milliseconds; default value 2000. It only takes effect when `PROPERTY_KEY_ENABLE_AUTO_RECONNECT` is true.
+- TDengineConfigParams.PROPERTY_KEY_RECONNECT_RETRY_COUNT: The number of automatic reconnection retries; default value 3. It only takes effect when `PROPERTY_KEY_ENABLE_AUTO_RECONNECT` is true.
+- TDengineConfigParams.PROPERTY_KEY_DISABLE_SSL_CERT_VALIDATION: Whether to disable SSL certificate verification. true: disabled, false: enabled. The default is false.
+
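Assuming the `TDengineConfigParams` constants resolve to string property keys (the literal key strings below are hypothetical placeholders, not confirmed connector values), the parameters above could be assembled into a `Properties` object like this:

```java
import java.util.Properties;

public class SourceProps {
    // Build the connection Properties described above. The literal key strings
    // are hypothetical stand-ins for the TDengineConfigParams constants named
    // in the comments.
    static Properties build() {
        Properties props = new Properties();
        props.setProperty("td.connect.user", "root");          // PROPERTY_KEY_USER
        props.setProperty("td.connect.password", "taosdata");  // PROPERTY_KEY_PASSWORD
        props.setProperty("value.deserializer", "RowData");    // VALUE_DESERIALIZER
        props.setProperty("messageWaitTimeout", "60000");      // PROPERTY_KEY_MESSAGE_WAIT_TIMEOUT
        props.setProperty("enableAutoReconnect", "true");      // PROPERTY_KEY_ENABLE_AUTO_RECONNECT
        props.setProperty("reconnectIntervalMs", "2000");      // PROPERTY_KEY_RECONNECT_INTERVAL_MS
        props.setProperty("reconnectRetryCount", "3");         // PROPERTY_KEY_RECONNECT_RETRY_COUNT
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("td.connect.user"));
    }
}
```

In the real connector these Properties would be passed when constructing the `TDengine Source` object, together with the query SQL and the deserializer type.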
#### Split by time
@@ -209,27 +208,32 @@ Example of custom data type query result: