From b1e9c7e6e6c145f78efe1de591d0e1a1a7a4719f Mon Sep 17 00:00:00 2001
From: sheyanjie-qq <249478495@qq.com>
Date: Wed, 25 Dec 2024 16:43:09 +0800
Subject: [PATCH 01/35] update jdbc stmt2 doc and sample code
---
docs/en/07-develop/01-connect.md | 2 +-
docs/en/07-develop/05-stmt.md | 9 +-
docs/en/14-reference/05-connector/14-java.md | 155 +++++++++---------
docs/examples/JDBC/JDBCDemo/pom.xml | 2 +-
docs/examples/JDBC/SpringJdbcTemplate/pom.xml | 2 +-
docs/examples/JDBC/connectionPools/pom.xml | 2 +-
docs/examples/JDBC/consumer-demo/pom.xml | 2 +-
docs/examples/JDBC/mybatisplus-demo/pom.xml | 2 +-
docs/examples/JDBC/springbootdemo/pom.xml | 2 +-
docs/examples/JDBC/taosdemo/pom.xml | 2 +-
docs/examples/java/pom.xml | 2 +-
...WSParameterBindingExtendInterfaceDemo.java | 87 ++++++++++
...> WSParameterBindingStdInterfaceDemo.java} | 38 ++---
.../src/test/java/com/taos/test/TestAll.java | 9 +-
docs/zh/07-develop/01-connect/index.md | 2 +-
docs/zh/07-develop/05-stmt.md | 8 +-
docs/zh/14-reference/05-connector/14-java.mdx | 1 +
17 files changed, 215 insertions(+), 112 deletions(-)
create mode 100644 docs/examples/java/src/main/java/com/taos/example/WSParameterBindingExtendInterfaceDemo.java
rename docs/examples/java/src/main/java/com/taos/example/{WSParameterBindingBasicDemo.java => WSParameterBindingStdInterfaceDemo.java} (61%)
diff --git a/docs/en/07-develop/01-connect.md b/docs/en/07-develop/01-connect.md
index af5f171f8c..9b1fbad6dd 100644
--- a/docs/en/07-develop/01-connect.md
+++ b/docs/en/07-develop/01-connect.md
@@ -109,7 +109,7 @@ If you are using Maven to manage your project, simply add the following dependen
<groupId>com.taosdata.jdbc</groupId>
<artifactId>taos-jdbcdriver</artifactId>
- <version>3.4.0</version>
+ <version>3.5.0</version>
```
diff --git a/docs/en/07-develop/05-stmt.md b/docs/en/07-develop/05-stmt.md
index 4503bb8bd3..4e49145628 100644
--- a/docs/en/07-develop/05-stmt.md
+++ b/docs/en/07-develop/05-stmt.md
@@ -28,8 +28,15 @@ Next, we continue to use smart meters as an example to demonstrate the efficient
+
+There are two interfaces for parameter binding: the standard JDBC interface and an extended interface. The extended interface offers better performance.
+
```java
-{{#include docs/examples/java/src/main/java/com/taos/example/WSParameterBindingBasicDemo.java:para_bind}}
+{{#include docs/examples/java/src/main/java/com/taos/example/WSParameterBindingStdInterfaceDemo.java:para_bind}}
+```
+
+```java
+{{#include docs/examples/java/src/main/java/com/taos/example/WSParameterBindingExtendInterfaceDemo.java:para_bind}}
```
This is a [more detailed parameter binding example](https://github.com/taosdata/TDengine/blob/main/docs/examples/java/src/main/java/com/taos/example/WSParameterBindingFullDemo.java)
diff --git a/docs/en/14-reference/05-connector/14-java.md b/docs/en/14-reference/05-connector/14-java.md
index 10e4ec6d42..f52b7e71f7 100644
--- a/docs/en/14-reference/05-connector/14-java.md
+++ b/docs/en/14-reference/05-connector/14-java.md
@@ -30,33 +30,34 @@ The JDBC driver implementation for TDengine strives to be consistent with relati
## Version History
-| taos-jdbcdriver Version | Major Changes | TDengine Version |
-| ------------------ | ---------------------------------------------------------------------------------------------------------------------------------------------------- | ---------------- |
-| 3.4.0 | 1. Replaced fastjson library with jackson. <br/> 2. WebSocket uses a separate protocol identifier. <br/> 3. Optimized background thread usage to avoid user misuse leading to timeouts. | - |
-| 3.3.4 | Fixed getInt error when data type is float. | - |
-| 3.3.3 | Fixed memory leak caused by closing WebSocket statement. | - |
-| 3.3.2 | 1. Optimized parameter binding performance under WebSocket connection. <br/> 2. Improved support for mybatis. | - |
-| 3.3.0 | 1. Optimized data transmission performance under WebSocket connection. <br/> 2. Supports skipping SSL verification, off by default. | 3.3.2.0 and higher |
-| 3.2.11 | Fixed a bug in closing result set in Native connection. | - |
-| 3.2.10 | 1. REST/WebSocket connections support data compression during transmission. <br/> 2. WebSocket automatic reconnection mechanism, off by default. <br/> 3. Connection class provides methods for schemaless writing. <br/> 4. Optimized data fetching performance for native connections. <br/> 5. Fixed some known issues. <br/> 6. Metadata retrieval functions can return a list of supported functions. | - |
-| 3.2.9 | Fixed bug in closing WebSocket prepareStatement. | - |
-| 3.2.8 | 1. Optimized auto-commit. <br/> 2. Fixed manual commit bug in WebSocket. <br/> 3. Optimized WebSocket prepareStatement using a single connection. <br/> 4. Metadata supports views. | - |
-| 3.2.7 | 1. Supports VARBINARY and GEOMETRY types. <br/> 2. Added timezone setting support for native connections. <br/> 3. Added WebSocket automatic reconnection feature. | 3.2.0.0 and higher |
-| 3.2.5 | Data subscription adds committed() and assignment() methods. | 3.1.0.3 and higher |
-| 3.2.4 | Data subscription adds enable.auto.commit parameter under WebSocket connection, as well as unsubscribe() method. | - |
-| 3.2.3 | Fixed ResultSet data parsing failure in some cases. | - |
-| 3.2.2 | New feature: Data subscription supports seek function. | 3.0.5.0 and higher |
-| 3.2.1 | 1. WebSocket connection supports schemaless and prepareStatement writing. <br/> 2. Consumer poll returns result set as ConsumerRecord, which can be accessed through value() method. | 3.0.3.0 and higher |
-| 3.2.0 | Connection issues, not recommended for use. | - |
-| 3.1.0 | WebSocket connection supports subscription function. | - |
-| 3.0.1 - 3.0.4 | Fixed data parsing errors in result sets under some conditions. 3.0.1 compiled in JDK 11 environment, other versions recommended for JDK 8. | - |
-| 3.0.0 | Supports TDengine 3.0 | 3.0.0.0 and higher |
-| 2.0.42 | Fixed wasNull interface return value in WebSocket connection. | - |
-| 2.0.41 | Fixed username and password encoding method in REST connection. | - |
-| 2.0.39 - 2.0.40 | Added REST connection/request timeout settings. | - |
-| 2.0.38 | JDBC REST connection adds batch fetching function. | - |
-| 2.0.37 | Added support for json tag. | - |
-| 2.0.36 | Added support for schemaless writing. | - |
+| taos-jdbcdriver Version | Major Changes | TDengine Version |
+| ----------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------ |
+| 3.5.0 | 1. Optimized the performance of WebSocket connection parameter binding, supporting parameter binding queries using binary data. <br/> 2. Optimized the performance of small queries in WebSocket connection. <br/> 3. Added support for setting time zone on WebSocket connection. | 3.3.5.0 and higher |
+| 3.4.0 | 1. Replaced fastjson library with jackson. <br/> 2. WebSocket uses a separate protocol identifier. <br/> 3. Optimized background thread usage to avoid user misuse leading to timeouts. | - |
+| 3.3.4 | Fixed getInt error when data type is float. | - |
+| 3.3.3 | Fixed memory leak caused by closing WebSocket statement. | - |
+| 3.3.2 | 1. Optimized parameter binding performance under WebSocket connection. <br/> 2. Improved support for mybatis. | - |
+| 3.3.0 | 1. Optimized data transmission performance under WebSocket connection. <br/> 2. Supports skipping SSL verification, off by default. | 3.3.2.0 and higher |
+| 3.2.11 | Fixed a bug in closing result set in Native connection. | - |
+| 3.2.10 | 1. REST/WebSocket connections support data compression during transmission. <br/> 2. WebSocket automatic reconnection mechanism, off by default. <br/> 3. Connection class provides methods for schemaless writing. <br/> 4. Optimized data fetching performance for native connections. <br/> 5. Fixed some known issues. <br/> 6. Metadata retrieval functions can return a list of supported functions. | - |
+| 3.2.9 | Fixed bug in closing WebSocket prepareStatement. | - |
+| 3.2.8 | 1. Optimized auto-commit. <br/> 2. Fixed manual commit bug in WebSocket. <br/> 3. Optimized WebSocket prepareStatement using a single connection. <br/> 4. Metadata supports views. | - |
+| 3.2.7 | 1. Supports VARBINARY and GEOMETRY types. <br/> 2. Added timezone setting support for native connections. <br/> 3. Added WebSocket automatic reconnection feature. | 3.2.0.0 and higher |
+| 3.2.5 | Data subscription adds committed() and assignment() methods. | 3.1.0.3 and higher |
+| 3.2.4 | Data subscription adds enable.auto.commit parameter under WebSocket connection, as well as unsubscribe() method. | - |
+| 3.2.3 | Fixed ResultSet data parsing failure in some cases. | - |
+| 3.2.2 | New feature: Data subscription supports seek function. | 3.0.5.0 and higher |
+| 3.2.1 | 1. WebSocket connection supports schemaless and prepareStatement writing. <br/> 2. Consumer poll returns result set as ConsumerRecord, which can be accessed through value() method. | 3.0.3.0 and higher |
+| 3.2.0 | Connection issues, not recommended for use. | - |
+| 3.1.0 | WebSocket connection supports subscription function. | - |
+| 3.0.1 - 3.0.4 | Fixed data parsing errors in result sets under some conditions. 3.0.1 compiled in JDK 11 environment, other versions recommended for JDK 8. | - |
+| 3.0.0 | Supports TDengine 3.0 | 3.0.0.0 and higher |
+| 2.0.42 | Fixed wasNull interface return value in WebSocket connection. | - |
+| 2.0.41 | Fixed username and password encoding method in REST connection. | - |
+| 2.0.39 - 2.0.40 | Added REST connection/request timeout settings. | - |
+| 2.0.38 | JDBC REST connection adds batch fetching function. | - |
+| 2.0.37 | Added support for json tag. | - |
+| 2.0.36 | Added support for schemaless writing. | - |
## Exceptions and Error Codes
@@ -75,47 +76,47 @@ The error codes that the JDBC connector may report include 4 types:
Please refer to the specific error codes:
-| Error Code | Description | Suggested Actions |
-| ---------- | --------------------------------------------------------------- | ----------------------------------------------------------------------------------------- |
-| 0x2301 | connection already closed | The connection is already closed, check the connection status, or recreate the connection to execute related commands. |
-| 0x2302 | this operation is NOT supported currently! | The current interface is not supported, consider switching to another connection method. |
-| 0x2303 | invalid variables | Invalid parameters, please check the interface specifications and adjust the parameter types and sizes. |
-| 0x2304 | statement is closed | The statement is already closed, check if the statement was used after being closed, or if the connection is normal. |
-| 0x2305 | resultSet is closed | The resultSet has been released, check if the resultSet was used after being released. |
-| 0x2306 | Batch is empty! | Add parameters to prepareStatement before executing executeBatch. |
-| 0x2307 | Can not issue data manipulation statements with executeQuery() | Use executeUpdate() for update operations, not executeQuery(). |
-| 0x2308 | Can not issue SELECT via executeUpdate() | Use executeQuery() for query operations, not executeUpdate(). |
-| 0x230d | parameter index out of range | Parameter out of bounds, check the reasonable range of parameters. |
-| 0x230e | connection already closed | The connection is already closed, check if the Connection was used after being closed, or if the connection is normal. |
-| 0x230f | unknown sql type in tdengine | Check the Data Type types supported by TDengine. |
-| 0x2310 | can't register JDBC-JNI driver | Cannot register JNI driver, check if the url is correctly filled. |
-| 0x2312 | url is not set | Check if the REST connection url is correctly filled. |
-| 0x2314 | numeric value out of range | Check if the correct interface was used for numeric types in the result set. |
-| 0x2315 | unknown taos type in tdengine | When converting TDengine data types to JDBC data types, check if the correct TDengine data type was specified. |
-| 0x2317 | | Incorrect request type used in REST connection. |
-| 0x2318 | | Data transmission error occurred in REST connection, check the network situation and retry. |
-| 0x2319 | user is required | Username information is missing when creating a connection. |
-| 0x231a | password is required | Password information is missing when creating a connection. |
-| 0x231c | httpEntity is null, sql: | An exception occurred in REST connection execution. |
-| 0x231d | can't create connection with server within | Increase the httpConnectTimeout parameter to extend the connection time, or check the connection with taosAdapter. |
-| 0x231e | failed to complete the task within the specified time | Increase the messageWaitTimeout parameter to extend the execution time, or check the connection with taosAdapter. |
-| 0x2350 | unknown error | Unknown exception, please provide feedback to the developers on github. |
-| 0x2352 | Unsupported encoding | An unsupported character encoding set was specified in the local connection. |
-| 0x2353 | internal error of database, please see taoslog for more details | An error occurred while executing prepareStatement in local connection, check taos log for troubleshooting. |
-| 0x2354 | JNI connection is NULL | The Connection was already closed when executing commands in local connection. Check the connection with TDengine. |
-| 0x2355 | JNI result set is NULL | The result set is abnormal in local connection, check the connection and retry. |
-| 0x2356 | invalid num of fields | The meta information of the result set obtained in local connection does not match. |
-| 0x2357 | empty sql string | Fill in the correct SQL for execution. |
-| 0x2359 | JNI alloc memory failed, please see taoslog for more details | Memory allocation error in local connection, check taos log for troubleshooting. |
-| 0x2371 | consumer properties must not be null! | Parameters are null when creating a subscription, fill in the correct parameters. |
-| 0x2372 | configs contain empty key, failed to set consumer property | The parameter key contains empty values, fill in the correct parameters. |
-| 0x2373 | failed to set consumer property, | The parameter value contains empty values, fill in the correct parameters. |
-| 0x2375 | topic reference has been destroyed | During the data subscription process, the topic reference was released. Check the connection with TDengine. |
+| Error Code | Description | Suggested Actions |
+| ---------- | --------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------------------------- |
+| 0x2301 | connection already closed | The connection is already closed, check the connection status, or recreate the connection to execute related commands. |
+| 0x2302 | this operation is NOT supported currently! | The current interface is not supported, consider switching to another connection method. |
+| 0x2303 | invalid variables | Invalid parameters, please check the interface specifications and adjust the parameter types and sizes. |
+| 0x2304 | statement is closed | The statement is already closed, check if the statement was used after being closed, or if the connection is normal. |
+| 0x2305 | resultSet is closed | The resultSet has been released, check if the resultSet was used after being released. |
+| 0x2306 | Batch is empty! | Add parameters to prepareStatement before executing executeBatch. |
+| 0x2307 | Can not issue data manipulation statements with executeQuery() | Use executeUpdate() for update operations, not executeQuery(). |
+| 0x2308 | Can not issue SELECT via executeUpdate() | Use executeQuery() for query operations, not executeUpdate(). |
+| 0x230d | parameter index out of range | Parameter out of bounds, check the reasonable range of parameters. |
+| 0x230e | connection already closed | The connection is already closed, check if the Connection was used after being closed, or if the connection is normal. |
+| 0x230f | unknown sql type in tdengine | Check the Data Type types supported by TDengine. |
+| 0x2310 | can't register JDBC-JNI driver | Cannot register JNI driver, check if the url is correctly filled. |
+| 0x2312 | url is not set | Check if the REST connection url is correctly filled. |
+| 0x2314 | numeric value out of range | Check if the correct interface was used for numeric types in the result set. |
+| 0x2315 | unknown taos type in tdengine | When converting TDengine data types to JDBC data types, check if the correct TDengine data type was specified. |
+| 0x2317 | | Incorrect request type used in REST connection. |
+| 0x2318 | | Data transmission error occurred in REST connection, check the network situation and retry. |
+| 0x2319 | user is required | Username information is missing when creating a connection. |
+| 0x231a | password is required | Password information is missing when creating a connection. |
+| 0x231c | httpEntity is null, sql: | An exception occurred in REST connection execution. |
+| 0x231d | can't create connection with server within | Increase the httpConnectTimeout parameter to extend the connection time, or check the connection with taosAdapter. |
+| 0x231e | failed to complete the task within the specified time | Increase the messageWaitTimeout parameter to extend the execution time, or check the connection with taosAdapter. |
+| 0x2350 | unknown error | Unknown exception, please provide feedback to the developers on github. |
+| 0x2352 | Unsupported encoding | An unsupported character encoding set was specified in the local connection. |
+| 0x2353 | internal error of database, please see taoslog for more details | An error occurred while executing prepareStatement in local connection, check taos log for troubleshooting. |
+| 0x2354 | JNI connection is NULL | The Connection was already closed when executing commands in local connection. Check the connection with TDengine. |
+| 0x2355 | JNI result set is NULL | The result set is abnormal in local connection, check the connection and retry. |
+| 0x2356 | invalid num of fields | The meta information of the result set obtained in local connection does not match. |
+| 0x2357 | empty sql string | Fill in the correct SQL for execution. |
+| 0x2359 | JNI alloc memory failed, please see taoslog for more details | Memory allocation error in local connection, check taos log for troubleshooting. |
+| 0x2371 | consumer properties must not be null! | Parameters are null when creating a subscription, fill in the correct parameters. |
+| 0x2372 | configs contain empty key, failed to set consumer property | The parameter key contains empty values, fill in the correct parameters. |
+| 0x2373 | failed to set consumer property, | The parameter value contains empty values, fill in the correct parameters. |
+| 0x2375 | topic reference has been destroyed | During the data subscription process, the topic reference was released. Check the connection with TDengine. |
| 0x2376 | failed to set consumer topic, topic name is empty | During the data subscription process, the subscription topic name is empty. Check if the specified topic name is correctly filled. |
-| 0x2377 | consumer reference has been destroyed | The data transmission channel for the subscription has been closed, check the connection with TDengine. |
-| 0x2378 | consumer create error | Data subscription creation failed, check the error information and taos log for troubleshooting. |
-| 0x2379 | seek offset must not be a negative number | The seek interface parameter must not be negative, use the correct parameters. |
-| 0x237a | vGroup not found in result set | VGroup not assigned to the current consumer, due to the Rebalance mechanism causing the Consumer and VGroup to be unbound. |
+| 0x2377 | consumer reference has been destroyed | The data transmission channel for the subscription has been closed, check the connection with TDengine. |
+| 0x2378 | consumer create error | Data subscription creation failed, check the error information and taos log for troubleshooting. |
+| 0x2379 | seek offset must not be a negative number | The seek interface parameter must not be negative, use the correct parameters. |
+| 0x237a | vGroup not found in result set | VGroup not assigned to the current consumer, due to the Rebalance mechanism causing the Consumer and VGroup to be unbound. |
- [TDengine Java Connector Error Code](https://github.com/taosdata/taos-connector-jdbc/blob/main/src/main/java/com/taosdata/jdbc/TSDBErrorNumbers.java)
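
For reference, a minimal sketch of how these codes surface in application code: the connector reports them through `SQLException.getErrorCode()`, so they can be logged next to the error message (the URL, credentials, and SQL below are placeholders):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class ErrorCodeDemo {
    public static void main(String[] args) {
        String jdbcUrl = "jdbc:TAOS-WS://127.0.0.1:6041";
        try (Connection conn = DriverManager.getConnection(jdbcUrl, "root", "taosdata");
             Statement stmt = conn.createStatement()) {
            // Placeholder statement; a failing SQL surfaces one of the error codes listed above
            stmt.executeUpdate("INSERT INTO test.t_not_exist VALUES (NOW, 1)");
        } catch (SQLException e) {
            // getErrorCode() returns values such as 0x2301, 0x230d, ... from the table above
            System.out.printf("ErrCode: 0x%x, ErrMessage: %s%n", e.getErrorCode(), e.getMessage());
        }
    }
}
```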
@@ -489,16 +490,16 @@ For example: if the password is specified as taosdata in the URL and as taosdemo
List of interface methods that return `true` for supported features, others not explicitly mentioned return `false`.
-| Interface Method | Description |
-|--------------------------------------------------------|-----------------------------------------------------|
-| `boolean nullsAreSortedAtStart()` | Determines if `NULL` values are sorted at the start |
-| `boolean storesLowerCaseIdentifiers()` | Determines if the database stores identifiers in lowercase |
-| `boolean supportsAlterTableWithAddColumn()` | Determines if the database supports adding columns with `ALTER TABLE` |
-| `boolean supportsAlterTableWithDropColumn()` | Determines if the database supports dropping columns with `ALTER TABLE` |
-| `boolean supportsColumnAliasing()` | Determines if the database supports column aliasing |
-| `boolean supportsGroupBy()` | Determines if the database supports `GROUP BY` statements |
-| `boolean isCatalogAtStart()` | Determines if the catalog name appears at the start of the fully qualified name in the database |
-| `boolean supportsCatalogsInDataManipulation()` | Determines if the database supports catalog names in data manipulation statements |
+| Interface Method | Description |
+| ---------------------------------------------- | ----------------------------------------------------------------------------------------------- |
+| `boolean nullsAreSortedAtStart()` | Determines if `NULL` values are sorted at the start |
+| `boolean storesLowerCaseIdentifiers()` | Determines if the database stores identifiers in lowercase |
+| `boolean supportsAlterTableWithAddColumn()` | Determines if the database supports adding columns with `ALTER TABLE` |
+| `boolean supportsAlterTableWithDropColumn()` | Determines if the database supports dropping columns with `ALTER TABLE` |
+| `boolean supportsColumnAliasing()` | Determines if the database supports column aliasing |
+| `boolean supportsGroupBy()` | Determines if the database supports `GROUP BY` statements |
+| `boolean isCatalogAtStart()` | Determines if the catalog name appears at the start of the fully qualified name in the database |
+| `boolean supportsCatalogsInDataManipulation()` | Determines if the database supports catalog names in data manipulation statements |
### Connection Features
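
As a minimal illustration (assuming taos-jdbcdriver is on the classpath and taosAdapter is reachable at the default address; host and credentials are placeholders), the capability flags listed above can be read through the standard `DatabaseMetaData` API:

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;
import java.sql.SQLException;

public class MetaDataFeatureCheck {
    public static void main(String[] args) throws SQLException {
        String jdbcUrl = "jdbc:TAOS-WS://127.0.0.1:6041";
        try (Connection conn = DriverManager.getConnection(jdbcUrl, "root", "taosdata")) {
            DatabaseMetaData meta = conn.getMetaData();
            // Methods from the table above; features not listed simply return false
            System.out.println("supportsGroupBy: " + meta.supportsGroupBy());
            System.out.println("supportsColumnAliasing: " + meta.supportsColumnAliasing());
            System.out.println("supportsAlterTableWithAddColumn: " + meta.supportsAlterTableWithAddColumn());
        }
    }
}
```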
diff --git a/docs/examples/JDBC/JDBCDemo/pom.xml b/docs/examples/JDBC/JDBCDemo/pom.xml
index 4b3e1ab675..e0c17ffbac 100644
--- a/docs/examples/JDBC/JDBCDemo/pom.xml
+++ b/docs/examples/JDBC/JDBCDemo/pom.xml
@@ -19,7 +19,7 @@
<groupId>com.taosdata.jdbc</groupId>
<artifactId>taos-jdbcdriver</artifactId>
- <version>3.4.0</version>
+ <version>3.5.0</version>
<groupId>org.locationtech.jts</groupId>
diff --git a/docs/examples/JDBC/SpringJdbcTemplate/pom.xml b/docs/examples/JDBC/SpringJdbcTemplate/pom.xml
index 34719dc135..b48f17acce 100644
--- a/docs/examples/JDBC/SpringJdbcTemplate/pom.xml
+++ b/docs/examples/JDBC/SpringJdbcTemplate/pom.xml
@@ -47,7 +47,7 @@
<groupId>com.taosdata.jdbc</groupId>
<artifactId>taos-jdbcdriver</artifactId>
- <version>3.4.0</version>
+ <version>3.5.0</version>
diff --git a/docs/examples/JDBC/connectionPools/pom.xml b/docs/examples/JDBC/connectionPools/pom.xml
index e3ef30d2f8..8fca6ce1b2 100644
--- a/docs/examples/JDBC/connectionPools/pom.xml
+++ b/docs/examples/JDBC/connectionPools/pom.xml
@@ -18,7 +18,7 @@
<groupId>com.taosdata.jdbc</groupId>
<artifactId>taos-jdbcdriver</artifactId>
- <version>3.4.0</version>
+ <version>3.5.0</version>
diff --git a/docs/examples/JDBC/consumer-demo/pom.xml b/docs/examples/JDBC/consumer-demo/pom.xml
index 709f87d9c1..997643c3de 100644
--- a/docs/examples/JDBC/consumer-demo/pom.xml
+++ b/docs/examples/JDBC/consumer-demo/pom.xml
@@ -17,7 +17,7 @@
<groupId>com.taosdata.jdbc</groupId>
<artifactId>taos-jdbcdriver</artifactId>
- <version>3.4.0</version>
+ <version>3.5.0</version>
<groupId>com.google.guava</groupId>
diff --git a/docs/examples/JDBC/mybatisplus-demo/pom.xml b/docs/examples/JDBC/mybatisplus-demo/pom.xml
index 2077e31d8d..27e62695bc 100644
--- a/docs/examples/JDBC/mybatisplus-demo/pom.xml
+++ b/docs/examples/JDBC/mybatisplus-demo/pom.xml
@@ -47,7 +47,7 @@
<groupId>com.taosdata.jdbc</groupId>
<artifactId>taos-jdbcdriver</artifactId>
- <version>3.4.0</version>
+ <version>3.5.0</version>
diff --git a/docs/examples/JDBC/springbootdemo/pom.xml b/docs/examples/JDBC/springbootdemo/pom.xml
index df8a3f5d61..f42799119b 100644
--- a/docs/examples/JDBC/springbootdemo/pom.xml
+++ b/docs/examples/JDBC/springbootdemo/pom.xml
@@ -70,7 +70,7 @@
<groupId>com.taosdata.jdbc</groupId>
<artifactId>taos-jdbcdriver</artifactId>
- <version>3.4.0</version>
+ <version>3.5.0</version>
diff --git a/docs/examples/JDBC/taosdemo/pom.xml b/docs/examples/JDBC/taosdemo/pom.xml
index c36973947b..936a133e31 100644
--- a/docs/examples/JDBC/taosdemo/pom.xml
+++ b/docs/examples/JDBC/taosdemo/pom.xml
@@ -67,7 +67,7 @@
<groupId>com.taosdata.jdbc</groupId>
<artifactId>taos-jdbcdriver</artifactId>
- <version>3.4.0</version>
+ <version>3.5.0</version>
diff --git a/docs/examples/java/pom.xml b/docs/examples/java/pom.xml
index e1a9504249..9b3a8c147d 100644
--- a/docs/examples/java/pom.xml
+++ b/docs/examples/java/pom.xml
@@ -22,7 +22,7 @@
<groupId>com.taosdata.jdbc</groupId>
<artifactId>taos-jdbcdriver</artifactId>
- <version>3.4.0</version>
+ <version>3.5.0</version>
diff --git a/docs/examples/java/src/main/java/com/taos/example/WSParameterBindingExtendInterfaceDemo.java b/docs/examples/java/src/main/java/com/taos/example/WSParameterBindingExtendInterfaceDemo.java
new file mode 100644
index 0000000000..8a83c27bbc
--- /dev/null
+++ b/docs/examples/java/src/main/java/com/taos/example/WSParameterBindingExtendInterfaceDemo.java
@@ -0,0 +1,87 @@
+package com.taos.example;
+
+import com.taosdata.jdbc.ws.TSWSPreparedStatement;
+
+import java.sql.*;
+import java.util.ArrayList;
+import java.util.Random;
+
+// ANCHOR: para_bind
+public class WSParameterBindingExtendInterfaceDemo {
+
+ // modify host to your own
+ private static final String host = "127.0.0.1";
+ private static final Random random = new Random(System.currentTimeMillis());
+ private static final int numOfSubTable = 10, numOfRow = 10;
+
+ public static void main(String[] args) throws SQLException {
+
+ String jdbcUrl = "jdbc:TAOS-WS://" + host + ":6041";
+ try (Connection conn = DriverManager.getConnection(jdbcUrl, "root", "taosdata")) {
+ init(conn);
+
+ String sql = "INSERT INTO ? USING power.meters TAGS(?,?) VALUES (?,?,?,?)";
+
+ try (TSWSPreparedStatement pstmt = conn.prepareStatement(sql).unwrap(TSWSPreparedStatement.class)) {
+
+ for (int i = 1; i <= numOfSubTable; i++) {
+ // set table name
+ pstmt.setTableName("d_bind_" + i);
+
+ // set tags
+ pstmt.setTagInt(0, i);
+ pstmt.setTagString(1, "location_" + i);
+
+ // set column ts
+ ArrayList<Long> tsList = new ArrayList<>();
+ long current = System.currentTimeMillis();
+ for (int j = 0; j < numOfRow; j++)
+ tsList.add(current + j);
+ pstmt.setTimestamp(0, tsList);
+
+ // set column current
+ ArrayList<Float> currentList = new ArrayList<>();
+ for (int j = 0; j < numOfRow; j++)
+ currentList.add(random.nextFloat() * 30);
+ pstmt.setFloat(1, currentList);
+
+ // set column voltage
+ ArrayList<Integer> voltageList = new ArrayList<>();
+ for (int j = 0; j < numOfRow; j++)
+ voltageList.add(random.nextInt(300));
+ pstmt.setInt(2, voltageList);
+
+ // set column phase
+ ArrayList<Float> phaseList = new ArrayList<>();
+ for (int j = 0; j < numOfRow; j++)
+ phaseList.add(random.nextFloat());
+ pstmt.setFloat(3, phaseList);
+ // add the bound data of this subtable to the batch
+ pstmt.columnDataAddBatch();
+ }
+ // execute the batch for all subtables
+ pstmt.columnDataExecuteBatch();
+ System.out.println("Successfully inserted " + (numOfSubTable * numOfRow) + " rows to power.meters.");
+ }
+ } catch (Exception ex) {
+ // please refer to the JDBC specifications for detailed exceptions info
+ System.out.printf("Failed to insert to table meters using stmt, %sErrMessage: %s%n",
+ ex instanceof SQLException ? "ErrCode: " + ((SQLException) ex).getErrorCode() + ", " : "",
+ ex.getMessage());
+ // Print stack trace for context in examples. Use logging in production.
+ ex.printStackTrace();
+ throw ex;
+ }
+ }
+
+ private static void init(Connection conn) throws SQLException {
+ try (Statement stmt = conn.createStatement()) {
+ stmt.execute("CREATE DATABASE IF NOT EXISTS power");
+ stmt.execute("USE power");
+ stmt.execute(
+ "CREATE STABLE IF NOT EXISTS power.meters (ts TIMESTAMP, current FLOAT, voltage INT, phase FLOAT) TAGS (groupId INT, location BINARY(24))");
+ }
+ }
+}
+// ANCHOR_END: para_bind
diff --git a/docs/examples/java/src/main/java/com/taos/example/WSParameterBindingBasicDemo.java b/docs/examples/java/src/main/java/com/taos/example/WSParameterBindingStdInterfaceDemo.java
similarity index 61%
rename from docs/examples/java/src/main/java/com/taos/example/WSParameterBindingBasicDemo.java
rename to docs/examples/java/src/main/java/com/taos/example/WSParameterBindingStdInterfaceDemo.java
index 1353ebbddc..7f0e523b97 100644
--- a/docs/examples/java/src/main/java/com/taos/example/WSParameterBindingBasicDemo.java
+++ b/docs/examples/java/src/main/java/com/taos/example/WSParameterBindingStdInterfaceDemo.java
@@ -1,12 +1,10 @@
package com.taos.example;
-import com.taosdata.jdbc.ws.TSWSPreparedStatement;
-
import java.sql.*;
import java.util.Random;
// ANCHOR: para_bind
-public class WSParameterBindingBasicDemo {
+public class WSParameterBindingStdInterfaceDemo {
// modify host to your own
private static final String host = "127.0.0.1";
@@ -19,31 +17,29 @@ public class WSParameterBindingBasicDemo {
try (Connection conn = DriverManager.getConnection(jdbcUrl, "root", "taosdata")) {
init(conn);
- String sql = "INSERT INTO ? USING power.meters TAGS(?,?) VALUES (?,?,?,?)";
+ // If you are certain that the child tables already exist, you can skip binding the tag columns to improve performance.
+ String sql = "INSERT INTO power.meters (tbname, groupid, location, ts, current, voltage, phase) VALUES (?,?,?,?,?,?,?)";
- try (TSWSPreparedStatement pstmt = conn.prepareStatement(sql).unwrap(TSWSPreparedStatement.class)) {
+ try (PreparedStatement pstmt = conn.prepareStatement(sql)) {
+ long current = System.currentTimeMillis();
for (int i = 1; i <= numOfSubTable; i++) {
- // set table name
- pstmt.setTableName("d_bind_" + i);
-
- // set tags
- pstmt.setTagInt(0, i);
- pstmt.setTagString(1, "location_" + i);
-
- // set columns
- long current = System.currentTimeMillis();
for (int j = 0; j < numOfRow; j++) {
- pstmt.setTimestamp(1, new Timestamp(current + j));
- pstmt.setFloat(2, random.nextFloat() * 30);
- pstmt.setInt(3, random.nextInt(300));
- pstmt.setFloat(4, random.nextFloat());
+ pstmt.setString(1, "d_bind_" + i);
+
+ pstmt.setInt(2, i);
+ pstmt.setString(3, "location_" + i);
+
+ pstmt.setTimestamp(4, new Timestamp(current + j));
+ pstmt.setFloat(5, random.nextFloat() * 30);
+ pstmt.setInt(6, random.nextInt(300));
+ pstmt.setFloat(7, random.nextFloat());
pstmt.addBatch();
}
- int[] exeResult = pstmt.executeBatch();
- // you can check exeResult here
- System.out.println("Successfully inserted " + exeResult.length + " rows to power.meters.");
}
+ int[] exeResult = pstmt.executeBatch();
+ // you can check exeResult here
+ System.out.println("Successfully inserted " + exeResult.length + " rows to power.meters.");
}
} catch (Exception ex) {
// please refer to the JDBC specifications for detailed exceptions info
diff --git a/docs/examples/java/src/test/java/com/taos/test/TestAll.java b/docs/examples/java/src/test/java/com/taos/test/TestAll.java
index a92ddd116c..f73b9d42ee 100644
--- a/docs/examples/java/src/test/java/com/taos/test/TestAll.java
+++ b/docs/examples/java/src/test/java/com/taos/test/TestAll.java
@@ -118,9 +118,14 @@ public class TestAll {
}
@Test
- public void testWsStmtBasic() throws Exception {
+ public void testWsStmtStd() throws Exception {
dropDB("power");
- WSParameterBindingBasicDemo.main(args);
+ WSParameterBindingStdInterfaceDemo.main(args);
+ }
+ @Test
+ public void testWsStmtExtend() throws Exception {
+ dropDB("power");
+ WSParameterBindingExtendInterfaceDemo.main(args);
}
@Test
diff --git a/docs/zh/07-develop/01-connect/index.md b/docs/zh/07-develop/01-connect/index.md
index 2381c49d93..e81a9e08c7 100644
--- a/docs/zh/07-develop/01-connect/index.md
+++ b/docs/zh/07-develop/01-connect/index.md
@@ -89,7 +89,7 @@ TDengine 提供了丰富的应用程序开发接口,为了便于用户快速
<groupId>com.taosdata.jdbc</groupId>
<artifactId>taos-jdbcdriver</artifactId>
- <version>3.4.0</version>
+ <version>3.5.0</version>
```
diff --git a/docs/zh/07-develop/05-stmt.md b/docs/zh/07-develop/05-stmt.md
index 74b44ba8e6..f6c67f98fe 100644
--- a/docs/zh/07-develop/05-stmt.md
+++ b/docs/zh/07-develop/05-stmt.md
@@ -26,10 +26,16 @@ import TabItem from "@theme/TabItem";
## WebSocket 连接
+
+参数绑定有两种接口使用方式,一种是 JDBC 标准接口,一种是扩展接口,扩展接口性能更好一些。
+
```java
-{{#include docs/examples/java/src/main/java/com/taos/example/WSParameterBindingBasicDemo.java:para_bind}}
+{{#include docs/examples/java/src/main/java/com/taos/example/WSParameterBindingStdInterfaceDemo.java:para_bind}}
```
+```java
+{{#include docs/examples/java/src/main/java/com/taos/example/WSParameterBindingExtendInterfaceDemo.java:para_bind}}
+```
这是一个[更详细的参数绑定示例](https://github.com/taosdata/TDengine/blob/main/docs/examples/java/src/main/java/com/taos/example/WSParameterBindingFullDemo.java)
diff --git a/docs/zh/14-reference/05-connector/14-java.mdx b/docs/zh/14-reference/05-connector/14-java.mdx
index 27f43676f3..ab7494707b 100644
--- a/docs/zh/14-reference/05-connector/14-java.mdx
+++ b/docs/zh/14-reference/05-connector/14-java.mdx
@@ -33,6 +33,7 @@ TDengine 的 JDBC 驱动实现尽可能与关系型数据库驱动保持一致
| taos-jdbcdriver 版本 | 主要变化 | TDengine 版本 |
| ------------------| ---------------------------------------------------------------------------------------------------------------------------------------------------- | ---------------- |
+| 3.5.0 | 1. 优化了 WebSocket 连接参数绑定性能,支持参数绑定查询使用二进制数据 <br/> 2. 优化了 WebSocket 连接在小查询上的性能 <br/> 3. WebSocket 连接上支持设置时区 | 3.3.5.0 及更高版本 |
| 3.4.0 | 1. 使用 jackson 库替换 fastjson 库 <br/> 2. WebSocket 采用独立协议标识 <br/> 3. 优化后台拉取线程使用,避免用户误用导致超时 | - |
| 3.3.4 | 解决了 getInt 在数据类型为 float 报错 | - |
| 3.3.3 | 解决了 WebSocket statement 关闭导致的内存泄漏 | - |
From 29089af5926186af06cec79b6fcca665cff25b82 Mon Sep 17 00:00:00 2001
From: sheyanjie-qq <249478495@qq.com>
Date: Fri, 27 Dec 2024 15:26:41 +0800
Subject: [PATCH 02/35] add jdbc time zone and app info doc
---
docs/en/14-reference/05-connector/14-java.md | 19 +++++++++++--------
docs/zh/14-reference/05-connector/14-java.mdx | 18 +++++++++++-------
2 files changed, 22 insertions(+), 15 deletions(-)
diff --git a/docs/en/14-reference/05-connector/14-java.md b/docs/en/14-reference/05-connector/14-java.md
index f52b7e71f7..48302b9d3b 100644
--- a/docs/en/14-reference/05-connector/14-java.md
+++ b/docs/en/14-reference/05-connector/14-java.md
@@ -32,7 +32,7 @@ The JDBC driver implementation for TDengine strives to be consistent with relati
| taos-jdbcdriver Version | Major Changes | TDengine Version |
| ----------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------ |
-| 3.5.0 | 1. Optimized the performance of WebSocket connection parameter binding, supporting parameter binding queries using binary data. <br/> 2. Optimized the performance of small queries in WebSocket connection. <br/> 3. Added support for setting time zone on WebSocket connection. | 3.3.5.0 and higher |
+| 3.5.0 | 1. Optimized the performance of WebSocket connection parameter binding, supporting parameter binding queries using binary data. <br/> 2. Optimized the performance of small queries in WebSocket connection. <br/> 3. Added support for setting time zone and app info on WebSocket connection. | 3.3.5.0 and higher |
| 3.4.0 | 1. Replaced fastjson library with jackson. <br/> 2. WebSocket uses a separate protocol identifier. <br/> 3. Optimized background thread usage to avoid user misuse leading to timeouts. | - |
| 3.3.4 | Fixed getInt error when data type is float. | - |
| 3.3.3 | Fixed memory leak caused by closing WebSocket statement. | - |
@@ -245,13 +245,13 @@ For WebSocket connections, the configuration parameters in the URL are as follow
- user: Login username for TDengine, default value 'root'.
- password: User login password, default value 'taosdata'.
-- charset: Specifies the character set for parsing string data when batch fetching is enabled.
- batchErrorIgnore: true: Continues executing the following SQL if one SQL fails during the execution of Statement's executeBatch. false: Does not execute any statements after a failed SQL. Default value: false.
- httpConnectTimeout: Connection timeout in ms, default value 60000.
- messageWaitTimeout: Message timeout in ms, default value 60000.
- useSSL: Whether SSL is used in the connection.
+- timezone: Client time zone, default is the current system time zone. It is recommended not to set this parameter, as using the system time zone provides better performance.
-**Note**: Some configuration items (such as: locale, timezone) do not take effect in WebSocket connections.
+**Note**: Some configuration items (such as locale and charset) do not take effect in WebSocket connections.
**REST Connection**
Using JDBC REST connection does not depend on the client driver. Compared to native JDBC connections, you only need to:
@@ -264,14 +264,13 @@ For REST connections, the configuration parameters in the URL are as follows:
- user: Login username for TDengine, default value 'root'.
- password: User login password, default value 'taosdata'.
-- charset: Specifies the character set for parsing string data when batch fetching is enabled.
- batchErrorIgnore: true: Continues executing the following SQL if one SQL fails during the execution of Statement's executeBatch. false: Does not execute any statements after a failed SQL. Default value: false.
- httpConnectTimeout: Connection timeout in ms, default value 60000.
- httpSocketTimeout: Socket timeout in ms, default value 60000.
- useSSL: Whether SSL is used in the connection.
- httpPoolSize: REST concurrent request size, default 20.
-**Note**: Some configuration items (such as: locale, timezone) do not take effect in REST connections.
+**Note**: Some configuration items (such as locale, charset, and timezone) do not take effect in REST connections.
:::note
@@ -295,7 +294,9 @@ The configuration parameters in properties are as follows:
- TSDBDriver.PROPERTY_KEY_CONFIG_DIR: Effective only when using native JDBC connections. Client configuration file directory path, default value on Linux OS is `/etc/taos`, on Windows OS is `C:/TDengine/cfg`.
- TSDBDriver.PROPERTY_KEY_CHARSET: Character set used by the client, default value is the system character set.
- TSDBDriver.PROPERTY_KEY_LOCALE: Effective only when using native JDBC connections. Client locale, default value is the current system locale.
-- TSDBDriver.PROPERTY_KEY_TIME_ZONE: Effective only when using native JDBC connections. Client time zone, default value is the current system time zone. Due to historical reasons, we only support part of the POSIX standard, such as UTC-8 (representing Shanghai, China), GMT-8, Asia/Shanghai.
+- TSDBDriver.PROPERTY_KEY_TIME_ZONE:
+ - Native connections: Client time zone, default value is the current system time zone. Effective globally. Due to historical reasons, we only support part of the POSIX standard, such as UTC-8 (representing Shanghai, China), GMT-8, Asia/Shanghai.
+ - WebSocket connections: Client time zone, default value is the current system time zone. Effective on the connection. Only IANA time zones are supported, such as Asia/Shanghai. It is recommended not to set this parameter, as using the system time zone provides better performance.
- TSDBDriver.HTTP_CONNECT_TIMEOUT: Connection timeout, in ms, default value is 60000. Effective only in REST connections.
- TSDBDriver.HTTP_SOCKET_TIMEOUT: Socket timeout, in ms, default value is 60000. Effective only in REST connections and when batchfetch is set to false.
- TSDBDriver.PROPERTY_KEY_MESSAGE_WAIT_TIMEOUT: Message timeout, in ms, default value is 60000. Effective only under WebSocket connections.
@@ -304,12 +305,14 @@ The configuration parameters in properties are as follows:
- TSDBDriver.PROPERTY_KEY_ENABLE_COMPRESSION: Whether to enable compression during transmission. Effective only when using REST/WebSocket connections. true: enabled, false: not enabled. Default is false.
- TSDBDriver.PROPERTY_KEY_ENABLE_AUTO_RECONNECT: Whether to enable auto-reconnect. Effective only when using WebSocket connections. true: enabled, false: not enabled. Default is false.
-> **Note**: Enabling auto-reconnect is only effective for simple SQL execution, schema-less writing, and data subscription. It is ineffective for parameter binding. Auto-reconnect is only effective for connections established through parameters specifying the database, and ineffective for later `use db` statements to switch databases.
+ > **Note**: Enabling auto-reconnect is only effective for simple SQL execution, schema-less writing, and data subscription. It is ineffective for parameter binding. Auto-reconnect is only effective for connections established through parameters specifying the database, and ineffective for later `use db` statements to switch databases.
- TSDBDriver.PROPERTY_KEY_RECONNECT_INTERVAL_MS: Auto-reconnect retry interval, in milliseconds, default value 2000. Effective only when PROPERTY_KEY_ENABLE_AUTO_RECONNECT is true.
- TSDBDriver.PROPERTY_KEY_RECONNECT_RETRY_COUNT: Auto-reconnect retry count, default value 3, effective only when PROPERTY_KEY_ENABLE_AUTO_RECONNECT is true.
- TSDBDriver.PROPERTY_KEY_DISABLE_SSL_CERT_VALIDATION: Disable SSL certificate validation. Effective only when using WebSocket connections. true: enabled, false: not enabled. Default is false.
-
+- TSDBDriver.PROPERTY_KEY_APP_NAME: App name, shown in the `show connections` query result. Effective only when using WebSocket connections. Default value is java.
+- TSDBDriver.PROPERTY_KEY_APP_IP: App IP address, shown in the `show connections` query result. Effective only when using WebSocket connections. Default value is empty.
+
Additionally, for native JDBC connections, other parameters such as log level and SQL length can be specified by specifying the URL and Properties.
**Priority of Configuration Parameters**
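
A minimal sketch of how the WebSocket connection settings described above might be supplied through `Properties` (assuming taos-jdbcdriver 3.5.0 on the classpath; the host, time zone, app name, and app IP values are placeholders):

```java
import com.taosdata.jdbc.TSDBDriver;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Properties;

public class WSConnectionPropertiesDemo {
    public static void main(String[] args) throws SQLException {
        String jdbcUrl = "jdbc:TAOS-WS://127.0.0.1:6041";
        Properties props = new Properties();
        props.setProperty(TSDBDriver.PROPERTY_KEY_USER, "root");
        props.setProperty(TSDBDriver.PROPERTY_KEY_PASSWORD, "taosdata");
        // IANA time zone, effective on this WebSocket connection (optional; the system default usually performs best)
        props.setProperty(TSDBDriver.PROPERTY_KEY_TIME_ZONE, "Asia/Shanghai");
        // Shown in the server-side `show connections` result
        props.setProperty(TSDBDriver.PROPERTY_KEY_APP_NAME, "my-app");
        props.setProperty(TSDBDriver.PROPERTY_KEY_APP_IP, "192.168.1.10");
        try (Connection conn = DriverManager.getConnection(jdbcUrl, props)) {
            System.out.println("Connected with time zone and app info set.");
        }
    }
}
```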
diff --git a/docs/zh/14-reference/05-connector/14-java.mdx b/docs/zh/14-reference/05-connector/14-java.mdx
index ab7494707b..c61bf51c82 100644
--- a/docs/zh/14-reference/05-connector/14-java.mdx
+++ b/docs/zh/14-reference/05-connector/14-java.mdx
@@ -33,7 +33,7 @@ TDengine 的 JDBC 驱动实现尽可能与关系型数据库驱动保持一致
| taos-jdbcdriver 版本 | 主要变化 | TDengine 版本 |
| ------------------| ---------------------------------------------------------------------------------------------------------------------------------------------------- | ---------------- |
-| 3.5.0 | 1. 优化了 WebSocket 连接参数绑定性能,支持参数绑定查询使用二进制数据 <br/> 2. 优化了 WebSocket 连接在小查询上的性能 <br/> 3. WebSocket 连接上支持设置时区 | 3.3.5.0 及更高版本 |
+| 3.5.0 | 1. 优化了 WebSocket 连接参数绑定性能,支持参数绑定查询使用二进制数据 <br/> 2. 优化了 WebSocket 连接在小查询上的性能 <br/> 3. WebSocket 连接上支持设置时区和应用信息 | 3.3.5.0 及更高版本 |
| 3.4.0 | 1. 使用 jackson 库替换 fastjson 库 <br/> 2. WebSocket 采用独立协议标识 <br/> 3. 优化后台拉取线程使用,避免用户误用导致超时 | - |
| 3.3.4 | 解决了 getInt 在数据类型为 float 报错 | - |
| 3.3.3 | 解决了 WebSocket statement 关闭导致的内存泄漏 | - |
@@ -244,13 +244,13 @@ TDengine 中,只要保证 firstEp 和 secondEp 中一个节点有效,就可
对于 WebSocket 连接,url 中的配置参数如下:
- user:登录 TDengine 用户名,默认值 'root'。
- password:用户登录密码,默认值 'taosdata'。
-- charset: 当开启批量拉取数据时,指定解析字符串数据的字符集。
- batchErrorIgnore:true:在执行 Statement 的 executeBatch 时,如果中间有一条 SQL 执行失败,继续执行下面的 SQL 了。false:不再执行失败 SQL 后的任何语句。默认值为:false。
- httpConnectTimeout: 连接超时时间,单位 ms, 默认值为 60000。
- messageWaitTimeout: 消息超时时间, 单位 ms, 默认值为 60000。
- useSSL: 连接中是否使用 SSL。
+- timezone:客户端使用的时区,连接上生效,默认值为系统时区。推荐不设置,使用系统时区性能更好。
-**注意**:部分配置项(比如:locale、timezone)在 WebSocket 连接中不生效。
+**注意**:部分配置项(比如:locale、charset)在 WebSocket 连接中不生效。
**REST 连接**
使用 JDBC REST 连接,不需要依赖客户端驱动。与 JDBC 原生连接相比,仅需要:
@@ -262,14 +262,13 @@ TDengine 中,只要保证 firstEp 和 secondEp 中一个节点有效,就可
对于 REST 连接,url 中的配置参数如下:
- user:登录 TDengine 用户名,默认值 'root'。
- password:用户登录密码,默认值 'taosdata'。
-- charset: 当开启批量拉取数据时,指定解析字符串数据的字符集。
- batchErrorIgnore:true:在执行 Statement 的 executeBatch 时,如果中间有一条 SQL 执行失败,继续执行下面的 SQL 了。false:不再执行失败 SQL 后的任何语句。默认值为:false。
- httpConnectTimeout: 连接超时时间,单位 ms, 默认值为 60000。
- httpSocketTimeout: socket 超时时间,单位 ms,默认值为 60000。
- useSSL: 连接中是否使用 SSL。
- httpPoolSize: REST 并发请求大小,默认 20。
-**注意**:部分配置项(比如:locale、timezone)在 REST 连接中不生效。
+**注意**:部分配置项(比如:locale、charset 和 timezone)在 REST 连接中不生效。
:::note
@@ -292,7 +291,9 @@ properties 中的配置参数如下:
- TSDBDriver.PROPERTY_KEY_CONFIG_DIR:仅在使用 JDBC 原生连接时生效。客户端配置文件目录路径,Linux OS 上默认值 `/etc/taos`,Windows OS 上默认值 `C:/TDengine/cfg`。
- TSDBDriver.PROPERTY_KEY_CHARSET:客户端使用的字符集,默认值为系统字符集。
- TSDBDriver.PROPERTY_KEY_LOCALE:仅在使用 JDBC 原生连接时生效。 客户端语言环境,默认值系统当前 locale。
-- TSDBDriver.PROPERTY_KEY_TIME_ZONE:仅在使用 JDBC 原生连接时生效。 客户端使用的时区,默认值为系统当前时区。因为历史的原因,我们只支持POSIX标准的部分规范,如UTC-8(代表中国上上海), GMT-8,Asia/Shanghai 这几种形式。
+- TSDBDriver.PROPERTY_KEY_TIME_ZONE:
+ - 原生连接:客户端使用的时区,默认值为系统当前时区,全局生效。因为历史的原因,我们只支持 POSIX 标准的部分规范,如 UTC-8(代表中国上海)、GMT-8、Asia/Shanghai 这几种形式。
+ - WebSocket 连接:客户端使用的时区,连接上生效,默认值为系统时区。仅支持 IANA 时区,即 Asia/Shanghai 这种形式。推荐不设置,使用系统时区性能更好。
- TSDBDriver.HTTP_CONNECT_TIMEOUT: 连接超时时间,单位 ms, 默认值为 60000。仅在 REST 连接时生效。
- TSDBDriver.HTTP_SOCKET_TIMEOUT: socket 超时时间,单位 ms,默认值为 60000。仅在 REST 连接且 batchfetch 设置为 false 时生效。
- TSDBDriver.PROPERTY_KEY_MESSAGE_WAIT_TIMEOUT: 消息超时时间, 单位 ms, 默认值为 60000。 仅 WebSocket 连接下有效。
@@ -300,12 +301,15 @@ properties 中的配置参数如下:
- TSDBDriver.HTTP_POOL_SIZE: REST 并发请求大小,默认 20。
- TSDBDriver.PROPERTY_KEY_ENABLE_COMPRESSION: 传输过程是否启用压缩。仅在使用 REST/WebSocket 连接时生效。true: 启用,false: 不启用。默认为 false。
- TSDBDriver.PROPERTY_KEY_ENABLE_AUTO_RECONNECT: 是否启用自动重连。仅在使用 WebSocket 连接时生效。true: 启用,false: 不启用。默认为 false。
-> **注意**:启用自动重连仅对简单执行 SQL 语句以及 无模式写入、数据订阅有效。对于参数绑定无效。自动重连仅对连接建立时通过参数指定数据库有效,对后面的 `use db` 语句切换数据库无效。
+ > **注意**:启用自动重连仅对简单执行 SQL 语句以及 无模式写入、数据订阅有效。对于参数绑定无效。自动重连仅对连接建立时通过参数指定数据库有效,对后面的 `use db` 语句切换数据库无效。
- TSDBDriver.PROPERTY_KEY_RECONNECT_INTERVAL_MS: 自动重连重试间隔,单位毫秒,默认值 2000。仅在 PROPERTY_KEY_ENABLE_AUTO_RECONNECT 为 true 时生效。
- TSDBDriver.PROPERTY_KEY_RECONNECT_RETRY_COUNT: 自动重连重试次数,默认值 3,仅在 PROPERTY_KEY_ENABLE_AUTO_RECONNECT 为 true 时生效。
- TSDBDriver.PROPERTY_KEY_DISABLE_SSL_CERT_VALIDATION: 关闭 SSL 证书验证 。仅在使用 WebSocket 连接时生效。true: 启用,false: 不启用。默认为 false。
+- TSDBDriver.PROPERTY_KEY_APP_NAME: App 名称,可用于 `show connections` 查询结果显示。仅在使用 WebSocket 连接时生效。默认值为 java。
+- TSDBDriver.PROPERTY_KEY_APP_IP: App IP,可用于 `show connections` 查询结果显示。仅在使用 WebSocket 连接时生效。默认值为空。
+
此外对 JDBC 原生连接,通过指定 URL 和 Properties 还可以指定其他参数,比如日志级别、SQL 长度等。
**配置参数的优先级**
From 014363954c39fc2021224431fd5be47422a809a6 Mon Sep 17 00:00:00 2001
From: jiajingbin
Date: Sun, 29 Dec 2024 22:27:39 +0800
Subject: [PATCH 03/35] feat: add setup scripts for build env
---
packaging/setup_env.sh | 2055 ++++++++++++++++++++++++++++++++++++++++
1 file changed, 2055 insertions(+)
create mode 100644 packaging/setup_env.sh
diff --git a/packaging/setup_env.sh b/packaging/setup_env.sh
new file mode 100644
index 0000000000..cce54b2839
--- /dev/null
+++ b/packaging/setup_env.sh
@@ -0,0 +1,2055 @@
+#!/bin/bash
+# define default timezone
+DEFAULT_TIMEZONE="Asia/Shanghai"
+
+# Define default DNS server
+DEFAULT_DNS="192.168.1.252"
+
+# Define the packages to be installed
+SYSTEM_APT_TOOLS="git wget vim gdb screen tmux ntp tree atop iotop sysstat fio tcpdump iperf3 qemu-guest-agent dstat linux-tools-common linux-tools-generic jq zip unzip cloud-guest-utils nfs-kernel-server nfs-common"
+SYSTEM_YUM_TOOLS="git wget vim gdb screen tmux ntp tree atop iotop sysstat fio tcpdump iperf3 qemu-guest-agent dstat jq zip unzip cloud-utils-growpart python3-devel nfs-utils rpm-build automake autoconf libevent-devel ncurses-devel"
+
+# Define the packages to be installed for build TDinternal
+BUILD_APT_TOOLS="llvm gcc make cmake perl g++ lzma curl locales psmisc libgeos-dev libgoogle-glog-dev valgrind rsync libjemalloc-dev openssh-server sshpass net-tools dirmngr gnupg apt-transport-https \
+ ca-certificates software-properties-common r-base iputils-ping build-essential git libssl-dev libgflags2.2 libgflags-dev libjansson-dev libsnappy-dev liblzma-dev libz-dev zlib1g pkg-config"
+BUILD_YUM_TOOLS="gcc make cmake3 perl gcc-c++ xz curl psmisc geos glog valgrind rsync jemalloc openssh-server sshpass net-tools gnupg2 libarchive snappy-devel pkgconfig libatomic perl-IPC-Cmd libcurl-devel libxml2-devel\
+ ca-certificates libicu-devel R-core iputils bison flex glibc-static libstdc++-static libstdc++-devel openssl-devel gflags jansson jansson-devel snappy xz-devel zlib-devel zlib bzip2-devel zlib-static libs3"
+
+# Define the packages to be installed via pip
+PIP_PKGS="wheel setuptools-rust pandas psutil fabric2 requests faker simplejson toml pexpect tzlocal distro decorator loguru hyperloglog taospy numpy poetry"
+
+# Gcc version to be updated
+GCC_VERSION="9.5.0"
+
+# Define the version of the Ubuntu release
+# Define jdk version to be installed
+if [ -f /etc/debian_version ]; then
+ DIST_VERSION=$(lsb_release -sr)
+ JDK_VERSION="openjdk-17-jdk"
+elif [ -f /etc/redhat-release ]; then
+ DIST_VERSION=$(grep -oP '\d+\.\d+' < /etc/redhat-release)
+ JDK_VERSION="java-1.8.0-openjdk"
+else
+ echo "Unsupported Linux distribution."
+ exit 1
+fi
+
+# Define the path where the core dump files should be stored
+COREPATH="/corefile"
+
+# Define the path where the repository should be cloned
+REPOPATH="$HOME/repos"
+
+# Define the path to the script directory
+SCRIPT_DIR=$(dirname "$(realpath "$0")")
+
+# Define the path to the .bashrc file
+BASH_RC=$HOME/.bashrc
+
+# Define the path to the Cargo configuration file
+CARGO_CONFIG_FILE=$HOME/.cargo/config.toml
+
+# Define jmeter version to be installed
+JMETER_VERSION="5.6.3"
+
+# Define the path where the Prometheus binary should exist
+PROMETHEUS_BINARY="/usr/local/bin/prometheus"
+
+# Define the path where the Node Exporter binary should exist
+NODE_EXPORTER_BINARY="/usr/local/bin/node_exporter"
+
+# Define the path where the Process Exporter binary should exist
+PROCESS_EXPORTER_BINARY="/usr/local/bin/process-exporter"
+
+# Define fstab input
+FSTAB_LINE="share-server.platform.tdengine.dev:/mnt/share_server /mnt/share_server nfs rw,sync,_netdev 0 0"
+
+# ANSI color codes
+GREEN='\033[0;32m' # Green color
+RED='\033[0;31m' # Red color
+NO_COLOR='\033[0m' # Reset to default color
+YELLOW='\033[0;33m' # Yellow color
+
+# read -r -d '' CLOUD_INIT_CONFIG << 'EOF'
+# datasource:
+# NoCloud:
+# seedfrom: /var/lib/cloud/seed/nocloud/
+# meta-data: {}
+# user-data: {}
+# vendor-data: {}
+# ConfigDrive: {}
+# None: {}
+# datasource_list: [ NoCloud, ConfigDrive, None ]
+# EOF
+
+read -r -d '' CUSTOM_SETTINGS <<'EOF'
+export LC_CTYPE="en_US.UTF-8"
+export LANG="en_US.UTF-8"
+export HISTTIMEFORMAT="%d/%m/%y %T "
+parse_git_branch() {
+ git branch 2> /dev/null | sed -e '/^[^*]/d' -e 's/* \(.*\)/(\1)/'
+}
+export PS1="\u@\h \[\e[32m\]\w \[\e[91m\]\$(parse_git_branch)\[\e[00m\]$ "
+EOF
+
+read -r -d '' CARGO_CONFIG <<'EOF'
+[source.crates-io]
+replace-with = 'rsproxy-sparse'
+[source.rsproxy]
+registry = "https://rsproxy.cn/crates.io-index"
+[source.rsproxy-sparse]
+registry = "sparse+https://rsproxy.cn/index/"
+[registries.rsproxy]
+index = "https://rsproxy.cn/crates.io-index"
+[net]
+git-fetch-with-cli = true
+[source.tuna]
+registry = "https://mirrors.tuna.tsinghua.edu.cn/git/crates.io-index.git"
+[source.ustc]
+registry = "git://mirrors.ustc.edu.cn/crates.io-index"
+[source.sjtu]
+registry = "https://mirrors.sjtug.sjtu.edu.cn/git/crates.io-index"
+[source.rustcc]
+registry = "git://crates.rustcc.cn/crates.io-index"
+EOF
+
+# Help function to display usage information
+help() {
+ echo "Usage: $0 [option]"
+ echo "Options:"
+ echo " --help - Display this help and exit"
+ echo " setup_all - Setup all configurations and installations"
+ echo " TDasset - Prepare TDasset env"
+ echo " TDinternal - Prepare TDinternal env"
+ echo " TDgpt - Prepare TDgpt env"
+ echo " taostest - Prepare taostest env"
+ echo " system_config - Perform system configuration"
+ echo " deploy_pure - Deploy Pure environment"
+ echo " deploy_dev - Deploy development environment"
+ echo " deploy_cmake - Deploy CMake"
+ echo " update_redhat_gcc - Update GCC on Red Hat or CentOS"
+ echo " update_redhat_tmux - Update tmux on Red Hat or CentOS"
+ echo " deploy_tmux - Deploy tmux"
+ echo " config_ssh - Configure SSH settings"
+ echo " disable_firewalld - Disable firewalld"
+ echo " config_cloud_init - Set cloud initialization parameters"
+ echo " deploy_git - Deploy git repositories"
+ echo " replace_sources - Replace package sources"
+ echo " update - Update the system"
+ echo " upgrade - Upgrade the system"
+ echo " config_timezone - Configure the system timezone"
+ echo " config_dns - Set DNS configurations"
+ echo " config_custom_settings - Add custom settings to your shell configuration"
+ echo " config_share_server - Configure share server"
+ echo " install_packages - Install specified packages"
+ echo " config_system_limits - Configure system limits and kernel parameters"
+ echo " config_coredump - Configure core dump settings"
+ echo " disable_service - Disable specified services"
+ echo " install_python - Install Python and pip"
+ echo " install_java - Install Java"
+ echo " install_maven - Install Maven"
+ echo " deploy_go - Deploy Go environment"
+ echo " deploy_rust - Deploy Rust environment"
+ echo " install_node - Install Node via package manager or binary"
+ echo " install_node_via_nvm - Install Node via NVM"
+ echo " install_pnpm - Install PNPM, node version >=v18.12.00 required"
+ echo " deploy_node_exporter - Deploy Node Exporter for Prometheus"
+ echo " deploy_process_exporter - Deploy Process Exporter"
+ echo " deploy_prometheus - Deploy Prometheus"
+ echo " deploy_grafana - Deploy Grafana"
+ echo " deploy_jmeter - Deploy JMeter"
+ echo " install_nginx - Install NGINX"
+ echo " config_qemu_guest_agent - Configure QEMU guest agent"
+ echo " deploy_docker - Deploy Docker"
+ echo " deploy_docker_compose - Deploy Docker Compose"
+ echo " clone_enterprise - Clone the enterprise repository"
+ echo " clone_community - Clone the community repository"
+ echo " clone_taosx - Clone TaosX repository"
+ echo " clone_taoskeeper - Clone TaosKeeper repository"
+ echo " clone_taostest - Clone TaosTest repository"
+ echo " clone_operation - Clone operation tools repository"
+}
+
+replace_apt_sources() {
+ # Define the codename of the Ubuntu release
+ local CODENAME
+ CODENAME=$(lsb_release -sc)
+ if grep -q "mirrors.aliyun.com" /etc/apt/sources.list; then
+ echo "The Aliyun mirror is already set."
+ else
+ echo "Backing up the original sources.list..."
+ cp /etc/apt/sources.list /etc/apt/sources.list.bak
+
+ echo "Replacing sources.list with the Aliyun mirror..."
+ tee /etc/apt/sources.list << EOF
+deb http://mirrors.aliyun.com/ubuntu/ $CODENAME main restricted universe multiverse
+deb http://mirrors.aliyun.com/ubuntu/ $CODENAME-security main restricted universe multiverse
+deb http://mirrors.aliyun.com/ubuntu/ $CODENAME-updates main restricted universe multiverse
+deb http://mirrors.aliyun.com/ubuntu/ $CODENAME-proposed main restricted universe multiverse
+deb http://mirrors.aliyun.com/ubuntu/ $CODENAME-backports main restricted universe multiverse
+deb-src http://mirrors.aliyun.com/ubuntu/ $CODENAME main restricted universe multiverse
+deb-src http://mirrors.aliyun.com/ubuntu/ $CODENAME-security main restricted universe multiverse
+deb-src http://mirrors.aliyun.com/ubuntu/ $CODENAME-updates main restricted universe multiverse
+deb-src http://mirrors.aliyun.com/ubuntu/ $CODENAME-proposed main restricted universe multiverse
+deb-src http://mirrors.aliyun.com/ubuntu/ $CODENAME-backports main restricted universe multiverse
+EOF
+ fi
+ echo "Updating repositories..."
+ apt-get update -y
+ echo "The sources have been replaced and updated successfully."
+}
+
+replace_yum_sources() {
+ if grep -q "mirrors.aliyun.com" /etc/yum.repos.d/CentOS-Base.repo; then
+ echo "The Aliyun mirror is already set."
+ else
+ echo "Backing up the original CentOS-Base.repo..."
+ cp /etc/yum.repos.d/CentOS-Base.repo /etc/yum.repos.d/CentOS-Base.repo.bak
+
+ echo "Replacing CentOS-Base.repo with the Aliyun mirror..."
+ tee /etc/yum.repos.d/CentOS-Base.repo << 'EOF'
+[base]
+name=CentOS-$releasever - Base - Aliyun
+baseurl=http://mirrors.aliyun.com/centos/$releasever/os/$basearch/
+gpgcheck=1
+gpgkey=http://mirrors.aliyun.com/centos/RPM-GPG-KEY-CentOS-7
+
+#released updates
+[updates]
+name=CentOS-$releasever - Updates - Aliyun
+baseurl=http://mirrors.aliyun.com/centos/$releasever/updates/$basearch/
+gpgcheck=1
+gpgkey=http://mirrors.aliyun.com/centos/RPM-GPG-KEY-CentOS-7
+
+#additional packages that may be useful
+[extras]
+name=CentOS-$releasever - Extras - Aliyun
+baseurl=http://mirrors.aliyun.com/centos/$releasever/extras/$basearch/
+gpgcheck=1
+gpgkey=http://mirrors.aliyun.com/centos/RPM-GPG-KEY-CentOS-7
+EOF
+ fi
+ echo "Updating repositories..."
+ yum makecache fast
+ yum install epel-release -y
+ yum update -y
+
+ echo "The sources have been replaced and updated successfully."
+}
+
+replace_sources() {
+ if [ -f /etc/debian_version ]; then
+ # Debian or Ubuntu
+ echo "Replacing sources for APT package manager."
+ replace_apt_sources
+ elif [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ echo "Replacing sources for YUM package manager."
+ replace_yum_sources
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+}
+
+update() {
+ echo "Updating ..."
+ if [ -f /etc/debian_version ]; then
+ # Debian or Ubuntu
+ echo "Using APT package manager."
+ apt update -y
+ elif [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ echo "Using YUM package manager."
+ yum install epel-release -y
+ yum update -y
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+}
+
+upgrade() {
+ echo "Upgrading ..."
+ if [ -f /etc/debian_version ]; then
+ # Debian or Ubuntu
+ echo "Using APT package manager."
+ apt upgrade -y
+ elif [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ echo "Using YUM package manager."
+ yum upgrade -y
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+}
+
+config_frontend() {
+ # only needed on Ubuntu/Debian
+ if [ -f /etc/debian_version ]; then
+ echo "Configuring frontend..."
+ add_config_if_not_exist "export DEBIAN_FRONTEND=noninteractive" "$BASH_RC"
+ fi
+ update
+ upgrade
+ systemctl restart dbus.service networkd-dispatcher.service
+
+}
+
+# Adds a configuration to a file if it does not already exist
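+# Usage: add_config_if_not_exist "<line>" <file>, e.g. add_config_if_not_exist "ulimit -c unlimited" "$BASH_RC"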
+add_config_if_not_exist() {
+ local config="$1"
+ local file="$2"
+ grep -qF -- "$config" "$file" || echo "$config" >> "$file"
+}
+
+# General error handling function
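+# Usage: <command>; check_status "<failure message>" "<success message>" $? -- exits the script when the exit code is non-zero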
+check_status() {
+ local message_on_failure="$1"
+ local message_on_success="$2"
+ local exit_code="$3"
+
+ if [ "${exit_code:-0}" -ne 0 ]; then
+ echo -e "${RED}${message_on_failure}${NO_COLOR}"
+ exit 1
+ else
+ echo -e "${GREEN}${message_on_success}${NO_COLOR}"
+ fi
+}
+
+# Config Share-NFS server
+config_share_server() {
+ echo "Configuring share server..."
+ if [ -f /etc/debian_version ]; then
+ # Debian or Ubuntu
+ echo "Using APT package manager."
+ install_package "nfs-common"
+ elif [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ echo "Using YUM package manager."
+ install_package "nfs-utils"
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+ mkdir -p /mnt/share_server
+ add_config_if_not_exist "$FSTAB_LINE" /etc/fstab
+ mount -a
+ check_status "Failed to configure share server" "Share server configured successfully." $?
+}
+
+# Init environment
+init_env() {
+ export DEBIAN_FRONTEND=noninteractive
+ export LC_CTYPE="en_US.UTF-8"
+}
+
+# Install packages
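+# Installs the tool sets referenced by SYSTEM_APT_TOOLS/BUILD_APT_TOOLS (apt) or SYSTEM_YUM_TOOLS/BUILD_YUM_TOOLS (yum)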
+install_packages() {
+ echo "Installing $package..."
+ if [ -f /etc/debian_version ]; then
+ # Debian or Ubuntu
+ echo "Using APT package manager."
+ install_package $SYSTEM_APT_TOOLS
+ install_package $BUILD_APT_TOOLS
+ elif [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ echo "Using YUM package manager."
+ yum install epel-release -y
+ install_package $SYSTEM_YUM_TOOLS
+ install_package $BUILD_YUM_TOOLS
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+}
+
+# Install package
+install_package() {
+ if [ -f /etc/debian_version ]; then
+ # Debian or Ubuntu
+ echo "Using APT package manager."
+ install_via_apt "$@"
+ elif [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ echo "Using YUM package manager."
+ install_via_yum "$@"
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+}
+
+# Install package via apt
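+# The Dpkg --force-confdef/--force-confold options keep existing configuration files during non-interactive installs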
+install_via_apt() {
+ echo -e "${YELLOW}Installing packages: $*...${NO_COLOR}"
+ if DEBIAN_FRONTEND=noninteractive apt-get install -y -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" "$@"; then
+ echo -e "${GREEN}Installed packages successfully.${NO_COLOR}"
+ else
+ echo -e "${RED}Failed to install packages.${NO_COLOR}"
+ return 1
+ fi
+}
+
+# Install package via yum
+install_via_yum() {
+ echo -e "${YELLOW}Installing packages: $*...${NO_COLOR}"
+ if yum install -y "$@"; then
+ echo -e "${GREEN}Installed packages successfully.${NO_COLOR}"
+ else
+ echo -e "${RED}Failed to install packages.${NO_COLOR}"
+ return 1
+ fi
+}
+
+# disable and stop firewalld
+disable_firewalld() {
+ if [ -f /etc/debian_version ]; then
+ # Only Debian or Ubuntu
+ echo "ufw disable..."
+ ufw disable
+ check_status "Failed to disable ufw" "Ufw disabled successfully." $?
+ elif [ -f /etc/redhat-release ]; then
+ # Only Red Hat or CentOS
+ echo "Disabling firewalld..."
+ systemctl stop firewalld
+ systemctl disable firewalld
+ check_status "Failed to disable firewalld" "Firewalld disabled successfully." $?
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+}
+
+ # Modifies SSH configuration to allow root login
+config_ssh() {
+ echo "Configuring SSH settings..."
+ sed -i 's/^#PermitRootLogin.*/PermitRootLogin yes/' /etc/ssh/sshd_config
+ sed -i 's/^PermitRootLogin.*/PermitRootLogin yes/' /etc/ssh/sshd_config
+
+ echo "Restarting SSH..."
+ if [ "$DIST_VERSION" = "24.04" ]; then
+ systemctl restart ssh
+ else
+ systemctl restart sshd
+ fi
+ check_status "Failed to restart SSH" "SSH restarted successfully." $?
+}
+
+# Sets the timezone
+config_timezone() {
+ echo "Setting timezone to $DEFAULT_TIMEZONE..."
+ timedatectl set-timezone "$DEFAULT_TIMEZONE"
+ check_status "Failed to set timezone" "Timezone set to $DEFAULT_TIMEZONE successfully." $?
+}
+
+# Disables service
+disable_service() {
+ if [ -f /etc/debian_version ]; then
+ # Only Debian or Ubuntu
+ echo "Stop and disable and related services..."
+ systemctl stop apt-daily.service apt-daily-upgrade.service apt-daily-upgrade.timer apt-daily.timer unattended-upgrades
+ systemctl disable apt-daily.service apt-daily-upgrade.service apt-daily-upgrade.timer apt-daily.timer unattended-upgrades
+ check_status "Failed to disable related services" "Related services disabled successfully." $?
+ fi
+}
+
+# Config dns for Red Hat or CentOS
+config_redhat_dns() {
+ local DEFAULT_DNS="192.168.2.99"
+ echo "Configuring DNS settings to use $INTERNAL_DNS and $DEFAULT_DNS..."
+ echo "nameserver $INTERNAL_DNS" > /etc/resolv.conf
+ echo "nameserver $DEFAULT_DNS" >> /etc/resolv.conf
+ check_status "Failed to configure DNS" "DNS configured to use $INTERNAL_DNS and $DEFAULT_DNS successfully." $?
+}
+
+# Config dns for Debian or Ubuntu
+config_debian_dns() {
+ local DEFAULT_DNS="192.168.2.99"
+ echo "Configuring DNS settings to use $INTERNAL_DNS and $DEFAULT_DNS..."
+ systemctl stop systemd-resolved.service
+ echo "[Resolve]" > /etc/systemd/resolved.conf
+ echo "DNS=$INTERNAL_DNS" >> /etc/systemd/resolved.conf
+ echo "DNS=$DEFAULT_DNS" >> /etc/systemd/resolved.conf
+ systemctl restart systemd-resolved.service
+ ln -sf /run/systemd/resolve/resolv.conf /etc/resolv.conf
+ check_status "Failed to configure DNS" "DNS configured to use $INTERNAL_DNS and $DEFAULT_DNS successfully." $?
+}
+
+# Config DNS settings
+config_dns() {
+ if [ -f /etc/debian_version ]; then
+ # Debian or Ubuntu
+ echo "Configuring DNS settings for Debian or Ubuntu..."
+ config_debian_dns
+ elif [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ echo "Configuring DNS settings for Red Hat or CentOS..."
+ config_redhat_dns
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+}
+
+# Config qemu-guest-agent
+config_qemu_guest_agent() {
+ install_package "qemu-guest-agent"
+ echo "Configuring qemu-guest-agent..."
+ systemctl enable qemu-guest-agent
+ systemctl start qemu-guest-agent
+ check_status "Failed to configure qemu-guest-agent" "Qemu-guest-agent configured successfully." $?
+}
+
+# Config custom settings
+config_custom_settings() {
+ echo "Configuring custom settings..."
+ marker="parse_git_branch"
+
+ if grep -qF "$marker" "$BASH_RC"; then
+ echo "Configuration already exists in ""$BASH_RC""."
+ else
+ echo "Adding configuration to ""$BASH_RC""."
+ echo "$CUSTOM_SETTINGS" >>"$BASH_RC"
+ echo "Custom settings have been updated in your $BASH_RC file."
+ fi
+ check_status "Failed to apply custom settings" "Custom settings configured successfully." $?
+}
+
+# Config core dump settings
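+# kernel.core_pattern names core files with the executable name (%e) and PID (%p) under $COREPATH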
+config_coredump() {
+ echo "Configuring core dump directory..."
+ mkdir -p $COREPATH
+ add_config_if_not_exist "kernel.core_pattern=$COREPATH/core_%e-%p" /etc/sysctl.conf
+ add_config_if_not_exist "ulimit -n 600000" "$BASH_RC"
+ add_config_if_not_exist "ulimit -c unlimited" "$BASH_RC"
+ sysctl -p
+ check_status "Failed to apply core dump settings" "Core path:$COREPATH applied successfully." $?
+}
+
+# Modifies system resource limits and TCP/IP core parameters
+config_system_limits() {
+ echo "Configuring system limits and kernel parameters..."
+ local sysctl_conf="/etc/sysctl.conf"
+ local limits_conf="/etc/security/limits.conf"
+
+ add_config_if_not_exist "fs.nr_open = 1048576" $sysctl_conf
+ add_config_if_not_exist "net.core.somaxconn=10240" $sysctl_conf
+ add_config_if_not_exist "net.core.netdev_max_backlog=20480" $sysctl_conf
+ add_config_if_not_exist "net.ipv4.tcp_max_syn_backlog=10240" $sysctl_conf
+ add_config_if_not_exist "net.ipv4.tcp_retries2=5" $sysctl_conf
+ add_config_if_not_exist "net.ipv4.tcp_syn_retries=2" $sysctl_conf
+ add_config_if_not_exist "net.ipv4.tcp_synack_retries=2" $sysctl_conf
+ add_config_if_not_exist "net.ipv4.tcp_tw_reuse=1" $sysctl_conf
+ add_config_if_not_exist "net.ipv4.tcp_keepalive_time=600" $sysctl_conf
+ add_config_if_not_exist "net.ipv4.tcp_abort_on_overflow=1" $sysctl_conf
+ add_config_if_not_exist "net.ipv4.tcp_max_tw_buckets=5000" $sysctl_conf
+
+ sysctl -p
+ check_status "Failed to apply sysctl settings" "Apply sysctl settings successfully." $?
+
+ for limit in "soft nproc 65536" "soft nofile 65536" "soft stack 65536" "hard nproc 65536" "hard nofile 65536" "hard stack 65536"; do
+ add_config_if_not_exist "* $limit" $limits_conf
+ add_config_if_not_exist "root $limit" $limits_conf
+ done
+ check_status "Failed to apply limits settings" "Apply limits settings successfully." $?
+}
+
+# Check the operating system version
+centos_skip_check() {
+ # Check if the operating system is CentOS 7
+ if [[ -f /etc/redhat-release ]]; then
+ if grep -q "CentOS Linux release 7" /etc/redhat-release; then
+ echo "This platform requires you to manually upgrade gcc and glibc."
+ exit 1
+ fi
+ fi
+}
+
+# Other logic can go here...
+
+# Deploy cmake
+deploy_cmake() {
+ # Check if cmake is installed
+ if command -v cmake >/dev/null 2>&1; then
+ echo "Cmake is already installed. Skipping installation."
+ cmake --version
+ return
+ fi
+ install_package "cmake3"
+ ln -sf /usr/bin/cmake3 /usr/bin/cmake
+ check_status "Failed to install cmake" "Install cmake successfully" $?
+}
+
+
+# install pkg via pip
+install_pip_pkg() {
+ if [ "$DIST_VERSION" != "24.04" ]; then
+ echo "Installing $PIP_PKGS ..."
+ pip3 install --upgrade pip setuptools -i https://pypi.tuna.tsinghua.edu.cn/simple
+ install_via_pip "$PIP_PKGS"
+ fi
+}
+
+install_via_pip() {
+ echo "pip install $*..."
+ if pip3 install $* -i https://pypi.tuna.tsinghua.edu.cn/simple; then
+ echo "pip install packages successfully."
+ else
+ echo "Failed to install packages."
+ return 1
+ fi
+}
+
+
+# Compile Python from source
+download_and_compile_python() {
+ if [ -f /etc/debian_version ]; then
+ install_package gcc make build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev libsqlite3-dev wget curl llvm libncurses5-dev libncursesw5-dev xz-utils tk-dev libffi-dev liblzma-dev
+ elif [ -f /etc/redhat-release ]; then
+ # install_package gcc patch libffi libffi-devel python-devel zlib zlib-devel bzip2-devel openssl-devel openssl11 openssl11-devel ncurses-devel sqlite-devel readline-devel tk-devel gdbm-devel db4-devel libpcap-devel xz-devel
+ install_package gcc zlib zlib-devel libffi libffi-devel readline-devel openssl-devel openssl11 openssl11-devel
+ # CFLAGS=$(pkg-config --cflags openssl11)
+ # export CFLAGS
+ # LDFLAGS=$(pkg-config --libs openssl11)
+ # export LDFLAGS
+ export CFLAGS=$(pkg-config --cflags openssl11)
+ export LDFLAGS=$(pkg-config --libs openssl11)
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+
+ local VERSION="$1"
+ local DOWNLOAD_URL="https://www.python.org/ftp/python/$VERSION/Python-$VERSION.tgz"
+
+
+ echo "Downloading Python $VERSION from $DOWNLOAD_URL..."
+ wget "$DOWNLOAD_URL" -O "/tmp/Python-$VERSION.tgz"
+ check_status "Failed to download Python source." "Python source downloaded successfully." $?
+
+ echo "Extracting Python $VERSION..."
+ cd /tmp || exit
+ tar -xzf "Python-$VERSION.tgz"
+ cd "Python-$VERSION" || exit
+
+ echo "Compiling and installing Python $VERSION..."
+ ./configure --enable-optimizations --prefix=/usr/local/python"$VERSION"
+ make -j"$(nproc)"
+ make altinstall
+
+ cd ..
+ rm -rf "Python-$VERSION" "Python-$VERSION.tgz"
+
+ MAJOR_MINOR_VERSION=$(echo "$VERSION" | cut -d '.' -f 1-2)
+
+ # files=(
+ # "/usr/bin/python3"
+ # "/usr/bin/python"
+ # "/usr/bin/pip3"
+ # "/usr/bin/pip"
+ # )
+
+ # for file in "${files[@]}"; do
+ # if [ -e "$file" ]; then
+ # backup_file="$file.bak"
+ # echo "Backing up $file to $backup_file"
+ # cp "$file" "$backup_file"
+ # else
+ # echo "$file does not exist, skipping."
+ # fi
+ # done
+
+ ln -sf /usr/local/python"$VERSION"/bin/python"$MAJOR_MINOR_VERSION" /usr/local/bin/python3
+ ln -sf /usr/local/python"$VERSION"/bin/pip"$MAJOR_MINOR_VERSION" /usr/local/bin/pip3
+ python3 --version
+ check_status "Failed to install Python $VERSION" "Python $VERSION installed successfully." $?
+}
+
+upgrade_pip() {
+ echo "Upgrading pip..."
+ python3 -m pip install --upgrade pip
+ check_status "Failed to upgrade pip" "Pip upgraded successfully." $?
+}
+
+# Install Python via package_manager
+install_python_via_package_manager() {
+ if [ -n "$1" ]; then
+ PYTHON_PACKAGE="$1"
+ else
+ PYTHON_PACKAGE="python3"
+ fi
+ install_package "$PYTHON_PACKAGE"
+ install_package "python3-pip"
+ upgrade_pip
+ python3 --version
+}
+
+# Install Python and pip
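+# Usage: install_python [version], e.g. install_python 3.10.12; with no argument the distribution's default python3 package is installed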
+# shellcheck disable=SC2120
+install_python() {
+ echo -e "${YELLOW}Installing Python...${NO_COLOR}"
+ # Specify the Python version to install, e.g. 3.10.12; with no argument the distribution's default python3 package is installed
+ if [ -n "$1" ]; then
+ PYTHON_VERSION="$1"
+ else
+ install_python_via_package_manager "$PYTHON_PACKAGE"
+ return 0
+ fi
+ MAJOR_MINOR_VERSION=$(echo "$PYTHON_VERSION" | cut -d '.' -f 1-2)
+ # Check if the Python package is available in the repository
+ if [ -f /etc/debian_version ]; then
+ PYTHON_PACKAGE="python${MAJOR_MINOR_VERSION}"
+ if apt-cache search "$PYTHON_PACKAGE" | grep -w "$PYTHON_PACKAGE"; then
+ install_python_via_package_manager "$PYTHON_PACKAGE"
+ return 0
+ else
+ echo -e "${RED}Failed to install Python using package manager.${NO_COLOR}"
+ fi
+ elif [ -f /etc/redhat-release ]; then
+ PYTHON_PACKAGE="python${MAJOR_MINOR_VERSION//./}"
+ if yum list available | grep -w "$PYTHON_PACKAGE"; then
+ install_python_via_package_manager "$PYTHON_PACKAGE"
+ return 0
+ else
+ echo -e "${RED}Failed to install Python using package manager.${NO_COLOR}"
+ fi
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+
+ echo -e "${YELLOW}$PYTHON_PACKAGE not found in source repository. Attempting to download and install manually....${NO_COLOR}"
+ download_and_compile_python "$PYTHON_VERSION"
+ upgrade_pip
+
+ # Check installation status
+ INSTALLED_VERSION=$(python3 --version 2>&1)
+ if echo "$INSTALLED_VERSION" | grep -q "$MAJOR_MINOR_VERSION"; then
+ echo -e "${GREEN}Python $MAJOR_MINOR_VERSION installed successfully.${NO_COLOR}"
+ else
+ echo -e "${YELLOW}Python version not match.${NO_COLOR}"
+ exit 1
+ fi
+}
+
+update_redhat_gcc() {
+ echo "Updating the system..."
+ update
+
+ echo "Installing dependencies..."
+ yum groupinstall -y "Development Tools"
+ install_package gmp-devel mpfr-devel libmpc-devel wget
+
+ echo "Downloading GCC $GCC_VERSION..."
+ cd /usr/local/src || exit
+ wget https://ftp.gnu.org/gnu/gcc/gcc-$GCC_VERSION/gcc-$GCC_VERSION.tar.gz
+ wget https://ftp.gnu.org/gnu/gcc/gcc-$GCC_VERSION/gcc-$GCC_VERSION.tar.gz.sig
+
+ echo "Extracting GCC $GCC_VERSION..."
+ tar -xzf gcc-$GCC_VERSION.tar.gz
+ cd gcc-$GCC_VERSION || exit
+
+ echo "Downloading necessary dependencies for GCC..."
+ ./contrib/download_prerequisites
+
+ mkdir build
+ cd build || exit
+
+ echo "Configuring GCC..."
+ ../configure --enable-languages=c,c++ --disable-multilib --prefix=/usr
+
+ echo "Compiling GCC, this may take a while..."
+ make -j"$(nproc)"
+ make install
+
+ echo "Cleaning up downloaded files..."
+ cd /usr/local/src || exit
+ rm -rf gcc-$GCC_VERSION gcc-$GCC_VERSION.tar.gz gcc-$GCC_VERSION.tar.gz.sig
+ echo "Cleanup completed."
+
+ echo "GCC installation completed. Verifying installation..."
+ gcc --version
+ check_status "Failed to install GCC" "GCC $GCC_VERSION installed successfully." $?
+
+ # Backup
+ if [ -f "/lib64/libstdc++.so.6.0.28-gdb.py" ]; then
+ # Copy the file
+ mv -f /lib64/libstdc++.so.6.0.28-gdb.py /tmp/libstdc++.so.6.0.28-gdb.py
+ echo "File has been successfully moved to /tmp/libstdc++.so.6.0.28-gdb.py"
+ else
+ echo "File /lib64/libstdc++.so.6.0.28-gdb.py does not exist, cannot perform copy operation."
+ fi
+}
+
+update_redhat_tmux() {
+ echo "Downloading the latest version of tmux..."
+ cd /usr/local/src || exit
+ latest_tmux_version=$(curl -s https://api.github.com/repos/tmux/tmux/releases/latest | grep -Po '"tag_name": "\K.*?(?=")')
+ wget https://github.com/tmux/tmux/releases/download/"${latest_tmux_version}"/tmux-"${latest_tmux_version}".tar.gz
+
+ echo "Extracting tmux ${latest_tmux_version}..."
+ tar -xzf tmux-"${latest_tmux_version}".tar.gz
+ cd tmux-"${latest_tmux_version}" || exit
+
+ echo "Configuring tmux..."
+ ./configure --prefix=/usr
+
+ echo "Compiling tmux, this may take a while..."
+ make -j"$(nproc)"
+ make install
+
+ echo "Cleaning up downloaded files..."
+ cd /usr/local/src || exit
+ rm -rf tmux-"${latest_tmux_version}" tmux-"${latest_tmux_version}".tar.gz
+
+ echo "Cleanup completed."
+
+ echo "tmux installation completed. Verifying installation..."
+ tmux -V
+ check_status "Failed to install tmux" "tmux ${latest_tmux_version} installed successfully." $?
+}
+
+deploy_tmux() {
+ if [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ update_redhat_tmux
+ fi
+ echo "Copying configuration file..."
+
+ cp "$SCRIPT_DIR/../conf/tmux.conf" ~/.tmux.conf
+
+ echo "Configuration file copied to ~/.tmux.conf."
+}
+
+
+# Install Java
+# install_java() {
+# if command -v java >/dev/null 2>&1; then
+# echo "Java is already installed. Skipping installation."
+# java -version
+# return
+# else
+# echo "Installing $JDK_VERSION..."
+# install_package "$JDK_VERSION"
+# check_status "Failed to install Java" "Install Java successfully" $?
+# fi
+# }
+
+# Install Java
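+# Usage: install_java [major_version], e.g. install_java 17; falls back to downloading OpenJDK from jdk.java.net when no package is available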
+install_java() {
+ echo -e "${YELLOW}Installing Java...${NO_COLOR}"
+ # Specify the major JDK version to search for; default is set to 17 if not specified
+ if [ -n "$1" ]; then
+ DEFAULT_JDK_VERSION="$1"
+ else
+ DEFAULT_JDK_VERSION="17"
+ fi
+
+ # Check if the JDK package is available in the repository
+ if [ -f /etc/debian_version ]; then
+ JDK_PACKAGE="openjdk-$DEFAULT_JDK_VERSION-jdk"
+ if apt-cache search "$JDK_PACKAGE" | grep -q "$JDK_PACKAGE"; then
+ echo "Installing $JDK_PACKAGE using apt..."
+ fi
+ elif [ -f /etc/redhat-release ]; then
+ JDK_PACKAGE="java-$DEFAULT_JDK_VERSION-openjdk"
+ if yum list available | grep -q "$JDK_PACKAGE"; then
+ echo "Installing $JDK_PACKAGE using yum..."
+ fi
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+
+ # Try installing via the package manager first
+ if ! install_package "$JDK_PACKAGE"; then
+ echo -e "${RED}Failed to install Java using package manager.${NO_COLOR}"
+ else
+ echo -e "${GREEN}Java installed successfully.${NO_COLOR}"
+ java -version
+ return
+ fi
+
+ echo -e "${YELLOW}$JDK_PACKAGE not found in $PACKAGE_MANAGER repository. Attempting to download and install manually....${NO_COLOR}"
+
+ # URL of the archive page to search
+ ARCHIVE_URL="https://jdk.java.net/archive/"
+
+ # Get the latest minor version number
+ LATEST_VERSION=$(curl --retry 10 --retry-delay 5 --retry-max-time 120 -s "$ARCHIVE_URL" | \
+ grep -o "jdk${DEFAULT_JDK_VERSION}\.[0-9]\+\.[0-9]\+" | \
+ sort -V | \
+ tail -n 1)
+ JDK_VERSION_NUM="${LATEST_VERSION#jdk}"
+ # Confirm the latest version found
+ if [[ $LATEST_VERSION =~ jdk([0-9]+)\.([0-9]+)\.([0-9]+) ]]; then
+ # Print the latest version found
+ echo -e "${YELLOW}Latest JDK version found: $LATEST_VERSION${NO_COLOR}"
+ MATCH_URL="https://download.java.net/java/GA/${LATEST_VERSION}/[^ ]*linux-x64_bin.tar.gz"
+ JDK_DOWNLOAD_URL=$(curl --retry 10 --retry-delay 5 --retry-max-time 120 -s "$ARCHIVE_URL" | \
+ grep -o "$MATCH_URL" | \
+ head -n 1)
+ else
+ echo -e "${RED}Failed to find the JDK version: $LATEST_VERSION.${NO_COLOR}"
+ exit 1
+ fi
+ # Download JDK
+ echo "Downloading OpenJDK $LATEST_VERSION from $JDK_DOWNLOAD_URL..."
+ wget "$JDK_DOWNLOAD_URL" -O /tmp/"${LATEST_VERSION}"_linux-x64_bin.tar.gz
+ check_status "Failed to download OpenJDK." "OpenJDK downloaded successfully." $?
+
+ # Extract and install
+ echo "Extracting OpenJDK..."
+ if [ -d "/usr/local/jdk-${JDK_VERSION_NUM}" ]; then
+ rm -rf "/usr/local/jdk-${JDK_VERSION_NUM}"
+ fi
+ tar -xzf /tmp/"${LATEST_VERSION}"_linux-x64_bin.tar.gz -C /usr/local/
+ rm -rf /tmp/"${LATEST_VERSION}"_linux-x64_bin.tar.gz
+ # Configure environment variables
+ echo "Configuring environment variables..."
+ add_config_if_not_exist "export JAVA_HOME=/usr/local/jdk-${JDK_VERSION_NUM}" "$BASH_RC"
+ add_config_if_not_exist "export PATH=\$PATH:\$JAVA_HOME/bin" "$BASH_RC"
+ # shellcheck source=/dev/null
+ export JAVA_HOME=/usr/local/jdk-${JDK_VERSION_NUM}
+ export PATH=$PATH:$JAVA_HOME/bin
+ INSTALLED_VERSION=$("$JAVA_HOME"/bin/java --version 2>&1)
+ if echo "$INSTALLED_VERSION" | grep -q "openjdk $DEFAULT_JDK_VERSION"; then
+ echo -e "${GREEN}Java installed successfully.${NO_COLOR}"
+ else
+ echo -e "${YELLOW}Java version not match.${NO_COLOR}"
+ exit 1
+ fi
+}
+
+# Install sdkman
+install_sdkman() {
+ install_package zip unzip
+ if [ -d "$HOME/.sdkman" ]; then
+ echo -e "${GREEN}SDKMAN is already installed.${NO_COLOR}"
+ else
+ echo -e "${YELLOW}Installing SDKMAN...${NO_COLOR}"
+ curl -s "https://get.sdkman.io" | bash
+ fi
+
+}
+
+# Install Maven
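+# Usage: install_maven [version], e.g. install_maven 3.9.9 (installed via SDKMAN); with no argument the distribution's maven package is used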
+# shellcheck disable=SC2120
+install_maven() {
+ echo -e "${YELLOW}Installing maven...${NO_COLOR}"
+ if [ -n "$1" ]; then
+ DEFAULT_MVN_VERSION="$1"
+ install_sdkman
+ if [ -f "$HOME/.sdkman/bin/sdkman-init.sh" ]; then
+ source "$HOME/.sdkman/bin/sdkman-init.sh"
+ fi
+ # 3.2.5
+ sdk install maven "$DEFAULT_MVN_VERSION"
+ else
+ install_package "maven"
+ fi
+ mvn -version
+ check_status "Failed to install maven" "Maven installed successfully." $?
+}
+
+# Install Go
+deploy_go() {
+ # Define the installation location for Go
+ GO_INSTALL_DIR="/usr/local/go"
+ GOPATH_DIR="/root/go"
+
+ # Check if Go is installed
+ if command -v go >/dev/null 2>&1; then
+ echo "Go is already installed. Skipping installation."
+ return
+ fi
+
+ # Fetch the latest version number of Go
+ GO_LATEST_DATA=$(curl --retry 10 --retry-delay 5 --retry-max-time 120 -s https://golang.google.cn/VERSION?m=text)
+ GO_LATEST_VERSION=$(echo "$GO_LATEST_DATA" | grep -oP 'go[0-9]+\.[0-9]+\.[0-9]+')
+ # Download and install the latest version of Go
+ echo "Installing $GO_LATEST_VERSION..."
+ wget https://golang.google.cn/dl/$GO_LATEST_VERSION.linux-amd64.tar.gz -O go.tar.gz
+
+ # Extract to the specified directory
+ tar -C /usr/local -xzf go.tar.gz
+ rm -rf go.tar.gz
+
+ # Configure environment variables using the helper function
+ add_config_if_not_exist "export GOROOT=$GO_INSTALL_DIR" "$BASH_RC"
+ add_config_if_not_exist "export GOPATH=$GOPATH_DIR" "$BASH_RC"
+ add_config_if_not_exist "export PATH=\$PATH:\$GOROOT/bin" "$BASH_RC"
+ add_config_if_not_exist "export GO111MODULE=on" "$BASH_RC"
+ add_config_if_not_exist "export GOPROXY=https://goproxy.cn,direct" "$BASH_RC"
+
+ # Apply the environment variables
+ $GO_INSTALL_DIR/bin/go version
+ check_status "Failed to install GO" "Install GO successfully" $?
+}
+
+# Function to install Rust and Cargo
+deploy_rust() {
+ # Check if Rust is already installed
+ if ! command -v rustc &> /dev/null; then
+ # add_config_if_not_exist "export RUSTUP_DIST_SERVER=http://mirrors.ustc.edu.cn/rust-static" $BASH_RC
+ # export RUSTUP_DIST_SERVER=http://mirrors.ustc.edu.cn/rust-static
+ if [ -f /etc/debian_version ]; then
+ # Debian or Ubuntu
+ echo "Using APT package manager."
+ install_package build-essential
+ elif [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ echo "Using YUM package manager."
+ yum groupinstall -y "Development Tools"
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+
+ add_config_if_not_exist "export RUSTUP_DIST_SERVER=\"https://rsproxy.cn\"" "$BASH_RC"
+ add_config_if_not_exist "export RUSTUP_UPDATE_ROOT=\"https://rsproxy.cn/rustup\"" "$BASH_RC"
+ export RUSTUP_DIST_SERVER="https://rsproxy.cn"
+ export RUSTUP_UPDATE_ROOT="https://rsproxy.cn/rustup"
+
+ echo "Rust is not installed. Installing Rust and Cargo..."
+ # Download and install Rust and Cargo
+ curl --retry 10 --retry-delay 5 --retry-max-time 120 --proto '=https' --tlsv1.2 -sSf https://rsproxy.cn/rustup-init.sh | sh -s -- -y
+
+ echo "Cargo settings..."
+ marker="git-fetch-with-cli"
+
+ if grep -qF "$marker" "$CARGO_CONFIG_FILE"; then
+ echo "Configuration already exists in ""$CARGO_CONFIG_FILE""."
+ else
+ echo "Adding configuration to ""$CARGO_CONFIG_FILE""."
+ echo "$CARGO_CONFIG" >>"$CARGO_CONFIG_FILE"
+ echo "Cargo config have been updated in your $CARGO_CONFIG_FILE file."
+ fi
+
+ # Source the Cargo environment script to update the current shell
+ if [ -f "$HOME/.cargo/env" ]; then
+ source "$HOME/.cargo/env"
+ fi
+ # Check if the installation was successful
+ rustc --version
+ # Install cargo-make
+ cargo install cargo-make
+ check_status "Failed to install Rust" "Install Rust successfully" $?
+ else
+ echo "Rust is already installed."
+ fi
+}
+
+# Update GCC for Ubuntu 18.04
+update_ubuntu_gcc_18.04() {
+ echo -e "${YELLOW}Updating GCC for Ubuntu 18.04...${NO_COLOR}"
+ install_package software-properties-common
+ yes | add-apt-repository ppa:ubuntu-toolchain-r/test
+ update
+ install_package gcc-9 g++-9
+ update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-9 60 --slave /usr/bin/g++ g++ /usr/bin/g++-9
+ check_status "Failed to update GCC" "GCC updated successfully." $?
+}
+
+install_node_in_ubuntu18.04() {
+ # builds Node v22.0.0 from source; very slow and not tested
+ if [ -n "$1" ]; then
+ DEFAULT_NODE_VERSION="$1"
+ else
+ DEFAULT_NODE_VERSION="14"
+ fi
+ NODE_DISTRO="node-v$DEFAULT_NODE_VERSION-linux-x64"
+ update_ubuntu_gcc_18.04
+ echo "Installing Node..."
+ curl -O https://nodejs.org/dist/v22.0.0/node-v22.0.0.tar.gz
+ tar -xzf node-v22.0.0.tar.gz
+ cd node-v22.0.0 || exit
+ ./configure
+ make
+ make install
+}
+
+
+# Install pnpm
+install_pnpm() {
+ echo -e "${YELLOW}Installing pnpm...${NO_COLOR}"
+ centos_skip_check
+ NODE_VERSION=$(node -v)
+
+ MAJOR_VERSION=$(echo "$NODE_VERSION" | cut -d '.' -f 1 | tr -d 'v')
+ MINOR_VERSION=$(echo "$NODE_VERSION" | cut -d '.' -f 2)
+ PATCH_VERSION=$(echo "$NODE_VERSION" | cut -d '.' -f 3)
+
+ VERSION_NUMBER=$((MAJOR_VERSION * 10000 + MINOR_VERSION * 100 + PATCH_VERSION))
+
+ REQUIRED_VERSION=181200 # v18.12.00
+
+ if [ $VERSION_NUMBER -ge $REQUIRED_VERSION ]; then
+ echo "Node version is $NODE_VERSION, installing pnpm..."
+ npm install --global pnpm
+ pnpm --version
+ check_status "Failed to install pnpm" "pnpm installed successfully." $?
+ else
+ echo "Node version is $NODE_VERSION, skipping pnpm installation."
+ fi
+
+}
+
+# Install Node via nvm
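+# Usage: install_node_via_nvm [version], e.g. install_node_via_nvm 22.0.0; with no argument the latest Node release is installed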
+# shellcheck disable=SC2120
+install_node_via_nvm () {
+ echo -e "${YELLOW}Installing Node via NVM...${NO_COLOR}"
+ if [ -n "$1" ]; then
+ DEFAULT_NODE_VERSION="$1"
+ else
+ DEFAULT_NODE_VERSION=""
+ fi
+
+ if [[ -f /etc/redhat-release ]]; then
+ if [[ "$1" != "16.20.2" ]]; then
+ centos_skip_check
+ fi
+ fi
+
+ # Install NVM
+ if ! command -v nvm &> /dev/null; then
+ NVM_VERSION=$(curl -s https://api.github.com/repos/nvm-sh/nvm/releases/latest | grep -oP '"tag_name": "\K(.*)(?=")')
+ curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/"$NVM_VERSION"/install.sh | bash
+ export NVM_DIR="$HOME/.nvm"
+ [ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
+ [ -s "$NVM_DIR/bash_completion" ] && \. "$NVM_DIR/bash_completion"
+ echo -e "${GREEN}NVM installed successfully.${NO_COLOR}"
+ else
+ echo -e "${GREEN}NVM is already installed.${NO_COLOR}"
+ fi
+
+ # Check if version is specified
+ if [ -n "$1" ]; then
+ NODE_VERSION="$1" # use specified version
+ echo -e "${YELLOW}Installing Node version $NODE_VERSION...${NO_COLOR}"
+ nvm install "$NODE_VERSION"
+ else
+ echo -e "${YELLOW}Installing the latest version of Node...${NO_COLOR}"
+ nvm install node # use latest version
+ fi
+ nvm alias default node # set default version
+
+ echo -e "${GREEN}Node installed successfully.${NO_COLOR}"
+
+ npm config set fetch-retry-maxtimeout 120000
+ npm config set fetch-retry-factor 5
+ npm config set registry=https://registry.npmmirror.com
+ npm install --global yarn
+ npm install --global pnpm
+ # NPM_BIN_DIR=$(npm bin -g)
+ # if [ -d "$NPM_BIN_DIR" ]; then
+ # ln -sf "$NPM_BIN_DIR/yarn" /usr/bin/yarn
+ # ln -sf "$NPM_BIN_DIR/yarnpkg" /usr/bin/yarnpkg
+ # echo -e "${GREEN}Yarn installed successfully.${NO_COLOR}"
+ # else
+ # echo -e "${RED}Failed to find npm global bin directory. Yarn installation may not be complete.${NO_COLOR}"
+ # fi
+ node --version
+ check_status "Failed to install Node" "Node installed successfully." $?
+ npm --version
+ check_status "Failed to install npm" "npm installed successfully." $?
+ yarn --version
+ check_status "Failed to install Yarn" "Yarn installed successfully." $?
+}
+
+# Install Node npm Yarn
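+# Usage: install_node [version], e.g. install_node 16.20.2 (official binary tarball); with no argument the distribution's nodejs/npm packages are used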
+# shellcheck disable=SC2120
+install_node() {
+ echo -e "${YELLOW}Installing Node...${NO_COLOR}"
+ if [ -n "$1" ]; then
+ DEFAULT_NODE_VERSION="$1"
+ if [[ -f /etc/redhat-release ]]; then
+ if [[ "$1" != "16.20.2" ]]; then
+ centos_skip_check
+ fi
+ fi
+ echo -e "${YELLOW}Installing Node version $DEFAULT_NODE_VERSION from source...${NO_COLOR}"
+ NODE_DISTRO="node-v$DEFAULT_NODE_VERSION-linux-x64"
+ wget "https://nodejs.org/dist/v$DEFAULT_NODE_VERSION/$NODE_DISTRO.tar.xz" -O "/tmp/$NODE_DISTRO.tar.xz"
+ tar -xf "/tmp/$NODE_DISTRO.tar.xz" -C /usr/local/lib/
+ ln -sf "/usr/local/lib/$NODE_DISTRO/bin/node" /usr/bin/node
+ ln -sf "/usr/local/lib/$NODE_DISTRO/bin/npm" /usr/bin/npm
+ ln -sf "/usr/local/lib/$NODE_DISTRO/bin/npx" /usr/bin/npx
+ rm -rf "/tmp/$NODE_DISTRO.tar.xz"
+ node --version
+ echo -e "${GREEN}Node version $DEFAULT_NODE_VERSION installed successfully.${NO_COLOR}"
+ else
+ install_package "nodejs"
+ install_package "npm"
+ fi
+ node --version
+ check_status "Failed to install Node" "Node installed successfully." $?
+
+ npm --version
+ check_status "Failed to install npm" "npm installed successfully." $?
+ npm config set fetch-retry-maxtimeout 120000
+ npm config set fetch-retries 5
+ npm config set fetch-retry-factor 5
+ npm config set registry=https://registry.npmmirror.com
+ npm install --global yarn
+ NPM_BIN_DIR=$(npm bin -g)
+ if [ -d "$NPM_BIN_DIR" ]; then
+ ln -sf "$NPM_BIN_DIR/yarn" /usr/bin/yarn
+ ln -sf "$NPM_BIN_DIR/yarnpkg" /usr/bin/yarnpkg
+ echo -e "${GREEN}Yarn installed successfully.${NO_COLOR}"
+ else
+ echo -e "${RED}Failed to find npm global bin directory. Yarn installation may not be complete.${NO_COLOR}"
+ fi
+ yarn --version
+ check_status "Failed to install Yarn" "Yarn installed successfully." $?
+}
+
+
+
+# Deploy Git
+deploy_git() {
+ install_package "git"
+ git --version
+ check_status "Failed to install Git" "Git installed successfully." $?
+ git config --global user.name "taos-support"
+ git config --global user.email "it@taosdata.com"
+ git config --global credential.helper store
+}
+
+deploy_node_exporter() {
+ if [ ! -f "$NODE_EXPORTER_BINARY" ]; then
+ echo "Node Exporter is not installed. Installing now..."
+
+ echo "Fetching the latest version of Node Exporter..."
+ LATEST_URL=$(curl --retry 10 --retry-delay 5 --retry-max-time 120 -s https://api.github.com/repos/prometheus/node_exporter/releases/latest | jq -r '.assets[] | select(.name | test("node_exporter-.*linux-amd64.tar.gz")) | .browser_download_url')
+
+ if [ -z "$LATEST_URL" ]; then
+ echo "Failed to fetch the latest Node Exporter release URL. Exiting."
+ exit 1
+ fi
+
+ echo "Downloading Node Exporter from $LATEST_URL..."
+ wget "$LATEST_URL" -O node_exporter.tar.gz
+
+ echo "Extracting Node Exporter..."
+ tar -xzf node_exporter.tar.gz
+ cd node_exporter-*.linux-amd64 || exit
+
+ echo "Copying binary..."
+ cp node_exporter /usr/local/bin/
+
+ echo "Creating systemd service..."
+ cat << EOF > /etc/systemd/system/node_exporter.service
+[Unit]
+Description=Node Exporter
+
+[Service]
+ExecStart=/usr/local/bin/node_exporter
+
+[Install]
+WantedBy=default.target
+EOF
+
+ # Start Node Exporter and enable it to run on startup
+ systemctl daemon-reload
+ systemctl start node_exporter
+ systemctl enable node_exporter
+
+ # Clean up the downloaded tar to save space
+ cd ..
+ rm -rf node_exporter*.tar.gz node_exporter-*.linux-amd64
+ node_exporter --version
+ check_status "Failed to install Node Exporter" "Node Exporter installed successfully." $?
+ else
+ echo "Node Exporter is already installed."
+ fi
+}
+
+deploy_process_exporter() {
+ if [ ! -f "$PROCESS_EXPORTER_BINARY" ]; then
+ echo "Process Exporter is not installed. Installing now..."
+
+ echo "Fetching the latest version of Process Exporter..."
+ LATEST_URL=$(curl --retry 10 --retry-delay 5 --retry-max-time 120 -s https://api.github.com/repos/ncabatoff/process-exporter/releases/latest | jq -r '.assets[] | select(.name | test("process-exporter-.*linux-amd64.tar.gz")) | .browser_download_url')
+
+ if [ -z "$LATEST_URL" ]; then
+ echo "Failed to fetch the latest Process Exporter release URL. Exiting."
+ exit 1
+ fi
+
+ echo "Downloading Process Exporter from $LATEST_URL..."
+ wget "$LATEST_URL" -O process-exporter.tar.gz
+
+ echo "Extracting Process Exporter..."
+ tar -xzf process-exporter.tar.gz
+ cd process-exporter-*.linux-amd64 || exit
+
+ echo "Copying binary..."
+ cp process-exporter /usr/local/bin/process-exporter
+
+ echo "Creating configuration file..."
+ cat << EOF > /etc/process_exporter.yml
+process_names:
+ - name: "{{.Comm}}"
+ cmdline:
+ - taosd
+EOF
+
+ echo "Creating systemd service..."
+ cat << EOF > /etc/systemd/system/process_exporter.service
+[Unit]
+Description=Process Exporter
+
+[Service]
+ExecStart=/usr/local/bin/process-exporter --config.path /etc/process_exporter.yml
+
+[Install]
+WantedBy=default.target
+EOF
+
+ # Start Process Exporter and enable it to run on startup
+ systemctl daemon-reload
+ systemctl start process_exporter
+ systemctl enable process_exporter
+
+ # Clean up the downloaded tar to save space
+ cd ..
+ rm -rf process-exporter*.tar.gz process-exporter-*.linux-amd64
+ process-exporter --version
+ check_status "Failed to install Process Exporter" "Process Exporter installed successfully." $?
+ else
+ echo "Process Exporter is already installed."
+ fi
+}
+
+deploy_prometheus() {
+ # Check if Prometheus binary exists
+ if [ ! -f "$PROMETHEUS_BINARY" ]; then
+ echo "Prometheus is not installed. Installing now..."
+
+ echo "Fetching the latest version of Prometheus..."
+ LATEST_URL=$(curl --retry 10 --retry-delay 5 --retry-max-time 120 -s https://api.github.com/repos/prometheus/prometheus/releases/latest | jq -r '.assets[] | select(.name | test("prometheus-.*linux-amd64.tar.gz")) | .browser_download_url')
+
+ if [ -z "$LATEST_URL" ]; then
+ echo "Failed to fetch the latest Prometheus release URL. Exiting."
+ exit 1
+ fi
+
+ echo "Downloading Prometheus from $LATEST_URL..."
+ wget "$LATEST_URL" -O prometheus.tar.gz
+
+ echo "Extracting Prometheus..."
+ tar -xzf prometheus.tar.gz
+ cd prometheus-*.linux-amd64 || exit
+
+ echo "Creating directories..."
+ mkdir -p /etc/prometheus /var/lib/prometheus
+
+ echo "Copying binaries and configuration..."
+ cp prometheus promtool /usr/local/bin/
+
+ echo "Setting up Prometheus configuration..."
+ cat << EOF > /etc/prometheus/prometheus.yml
+global:
+ scrape_interval: 15s
+scrape_configs:
+ - job_name: 'node_exporter'
+ static_configs:
+ - targets: ['localhost:9100']
+EOF
+
+ echo "Creating systemd service..."
+ cat << EOF > /etc/systemd/system/prometheus.service
+[Unit]
+Description=Prometheus Service
+
+[Service]
+ExecStart=/usr/local/bin/prometheus \\
+ --config.file /etc/prometheus/prometheus.yml \\
+ --storage.tsdb.path /var/lib/prometheus/ \\
+ --web.console.templates=/etc/prometheus/consoles \\
+ --web.console.libraries=/etc/prometheus/console_libraries
+
+[Install]
+WantedBy=default.target
+EOF
+
+ # Start Prometheus and enable it to run on startup
+ systemctl daemon-reload
+ systemctl start prometheus
+ systemctl enable prometheus
+
+ # Clean up the downloaded tar to save space
+ cd ..
+ rm -rf prometheus*.tar.gz prometheus-*.linux-amd64
+ prometheus --version
+ check_status "Failed to install Prometheus" "Prometheus installed successfully." $?
+ else
+ echo "Prometheus is already installed."
+ fi
+}
+
+# Install Grafana using a downloaded .deb package
+deploy_grafana() {
+ if [ -f /etc/debian_version ]; then
+ # Debian or Ubuntu
+ deploy_debian_grafana
+ elif [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ deploy_redhat_grafana
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+}
+
+# Install Grafana for ubuntu/debian
+deploy_debian_grafana() {
+ # Check if Grafana is already installed
+ if ! dpkg -s "grafana" &> /dev/null; then
+ echo "Downloading the latest Grafana .deb package..."
+ # Download the latest Grafana .deb package
+ wget https://dl.grafana.com/oss/release/grafana_latest_amd64.deb -O grafana.deb
+ # install the required fontconfig package
+ install_package libfontconfig1
+ echo "Installing Grafana..."
+ # Install the .deb package
+ dpkg -i grafana.deb
+
+ # Clean up the downloaded .deb package to save space
+ rm -rf grafana.deb
+
+ # Start the Grafana server and enable it to run on startup
+ systemctl start grafana-server
+ systemctl enable grafana-server
+
+ # Check if Grafana was installed successfully
+ if [ $? -eq 0 ]; then
+ echo "Grafana was installed successfully."
+ else
+ echo "Failed to install Grafana."
+ fi
+ else
+ echo "Grafana is already installed."
+ fi
+}
+
+# Install Grafana for centos/redhat
+deploy_redhat_grafana() {
+ # Check if Grafana is already installed
+ if ! rpm -q grafana &> /dev/null; then
+ echo "Downloading the latest Grafana .rpm package..."
+ # Download the latest Grafana .rpm package
+ wget https://dl.grafana.com/oss/release/grafana-8.5.2-1.x86_64.rpm -O grafana.rpm
+
+ # Install the required fontconfig package
+ yum install -y fontconfig
+
+ echo "Installing Grafana..."
+ # Install the .rpm package
+ rpm -ivh grafana.rpm
+
+ # Clean up the downloaded .rpm package to save space
+ rm -rf grafana.rpm
+
+ # Start the Grafana server and enable it to run on startup
+ systemctl start grafana-server
+ systemctl enable grafana-server
+
+ # Check if Grafana was installed successfully
+ if [ $? -eq 0 ]; then
+ echo "Grafana was installed successfully."
+ else
+ echo "Failed to install Grafana."
+ fi
+ else
+ echo "Grafana is already installed."
+ fi
+}
+
+# Install Nginx
+install_nginx() {
+ install_package "nginx"
+ nginx -v
+ check_status "Failed to install Nginx" "Nginx installed successfully." $?
+}
+
+# Deploy JMeter
+deploy_jmeter() {
+ if ! command -v jmeter &> /dev/null; then
+ echo "Installing JMeter..."
+ install_java
+ wget -P /opt https://mirrors.aliyun.com/apache/jmeter/binaries/apache-jmeter-$JMETER_VERSION.tgz
+ tar -xvzf /opt/apache-jmeter-$JMETER_VERSION.tgz -C /opt/
+ ln -sf /opt/apache-jmeter-$JMETER_VERSION/bin/jmeter /usr/local/bin/jmeter
+ rm -rf /opt/apache-jmeter-$JMETER_VERSION.tgz
+ jmeter --version
+ check_status "Failed to install JMeter" "JMeter installed successfully." $?
+ else
+ echo "JMeter is already installed."
+ fi
+}
+
+# Deploy Docker
+deploy_docker() {
+ if [ -f /etc/debian_version ]; then
+ # Debian or Ubuntu
+ deploy_debian_docker
+ elif [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ deploy_redhat_docker
+ else
+ echo "Unsupported Linux distribution."
+ exit 1
+ fi
+}
+
+# Deploy Docker for centos/redhat
+deploy_redhat_docker() {
+ # Check if Docker is already installed
+ if ! command -v docker &> /dev/null; then
+ echo "Docker is not installed. Installing now..."
+
+ # Set up the repository for Docker
+ echo "Setting up the Docker repository..."
+ install_package yum-utils
+
+ echo "Adding Docker's official repository..."
+ yum-config-manager --add-repo https://mirrors.aliyun.com/docker-ce/linux/centos/docker-ce.repo
+
+ # Install Docker CE
+ echo "Installing Docker CE..."
+ install_package docker-ce docker-ce-cli containerd.io
+
+ # Enable and start Docker
+ echo "Enabling and starting Docker..."
+ systemctl enable docker
+ systemctl start docker
+
+ # Adding current user to the Docker group
+ usermod -aG docker "$USER"
+
+ # Print Docker version
+ docker --version
+
+ # Check the installation status
+ if [ $? -eq 0 ]; then
+ echo "Docker installed successfully."
+ else
+ echo "Failed to install Docker."
+ fi
+ else
+ echo "Docker is already installed."
+ fi
+}
+
+# Deploy docker for ubuntu/debian
+deploy_debian_docker() {
+ # Check if Docker is already installed
+ if ! command -v docker &> /dev/null; then
+ echo "Docker is not installed. Installing now..."
+
+ # Set up the repository for Docker
+ echo "Setting up the Docker repository..."
+ update
+ install_package apt-transport-https ca-certificates curl software-properties-common gnupg lsb-release
+
+ echo "Adding Docker's official GPG key from Aliyun..."
+ curl --retry 10 --retry-delay 5 --retry-max-time 120 -fsSL https://mirrors.aliyun.com/docker-ce/linux/ubuntu/gpg | {
+ if [ -f /usr/share/keyrings/docker-archive-keyring.gpg ]; then
+ rm /usr/share/keyrings/docker-archive-keyring.gpg
+ fi
+ gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg
+ }
+ echo "Setting up stable repository using Aliyun..."
+ echo \
+ "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://mirrors.aliyun.com/docker-ce/linux/ubuntu \
+ $(lsb_release -cs) stable" | tee /etc/apt/sources.list.d/docker.list > /dev/null
+
+ # Install Docker CE
+ echo "Installing Docker CE..."
+ update
+ install_package docker-ce docker-ce-cli containerd.io
+
+ # Enable and start Docker
+ echo "Enabling and starting Docker..."
+ systemctl enable docker
+ systemctl start docker
+
+ # Adding current user to the Docker group
+ usermod -aG docker "$USER"
+
+ # Print Docker version
+ docker --version
+ check_status "Failed to install Docker" "Docker installed successfully." $?
+ else
+ echo "Docker is already installed."
+ fi
+}
+
+# Deploy Docker Compose
+deploy_docker_compose() {
+ # Check if Docker Compose is installed
+ if ! command -v docker-compose &> /dev/null; then
+ echo "Docker Compose is not installed. Installing now..."
+
+ # Install Docker Compose
+ curl --retry 10 --retry-delay 5 --retry-max-time 120 -L "https://github.com/docker/compose/releases/latest/download/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
+ chmod +x /usr/local/bin/docker-compose
+
+ # Print Docker Compose version
+ docker-compose --version
+ check_status "Failed to install Docker Compose" "Docker Compose installed successfully." $?
+ else
+ echo "Docker Compose is already installed."
+ fi
+}
+
+# Reconfigure cloud-init
+reconfig_cloud_init() {
+ echo "Reconfiguring cloud-init..."
+ apt remove -y cloud-init && apt purge -y cloud-init
+ rm -rf /var/lib/cloud /etc/cloud
+ apt update -y
+ install_package cloud-init
+ sed -i '/package[-_]update[-_]upgrade[-_]install/s/^/#/' /etc/cloud/cloud.cfg
+}
+
+# Config cloud-init
+config_cloud_init() {
+ if [ "$DIST_VERSION" = "7.9" ];then
+ install_package "cloud-init"
+ sed -i '/ssh_pwauth.*/s/^/#/' /etc/cloud/cloud.cfg
+ else
+ reconfig_cloud_init
+ fi
+ check_status "Failed to configure cloud-init" "Cloud-init configured successfully and you need reboot manually." $?
+ # if [ "$DIST_VERSION" = "18.04" ] || [ "$DIST_VERSION" = "20.04" ]; then
+ # reconfig_cloud_init
+ # elif [ "$DIST_VERSION" = "7.9" ];then
+ # install_package "cloud-init"
+ # sed -i '/ssh_pwauth.*/s/^/#/' /etc/cloud/cloud.cfg
+ # else
+ # echo "Configuring cloud-init..."
+ # add_config_if_not_exist "$CLOUD_INIT_CONFIG" "/etc/cloud/cloud.cfg"
+
+ # marker="NoCloud"
+
+ # if grep -qF "$marker" "/etc/cloud/cloud.cfg"; then
+ # echo "cloud-init settings already exists in /etc/cloud/cloud.cfg."
+ # else
+ # echo "Adding configuration to /etc/cloud/cloud.cfg."
+ # echo "$CLOUD_INIT_CONFIG" >> "/etc/cloud/cloud.cfg"
+ # echo "cloud-init settings have been updated in /etc/cloud/cloud.cfg."
+ # fi
+
+ # mkdir -p /var/lib/cloud/seed/nocloud/
+ # cd /var/lib/cloud/seed/nocloud/ || exit
+ # touch meta-data
+ # touch user-data
+ # add_config_if_not_exist "hostname: \${name}" user-data
+ # add_config_if_not_exist "manage_etc_hosts: true" user-data
+ # fi
+ # cloud-init clean --logs
+}
+
+# Clone a repository with a specified target directory
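+# Usage: clone_repo_with_rename <repo_url> [target_dir] [branch]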
+clone_repo_with_rename() {
+ local repo_url="$1"
+ local target_dir="$2"
+ local branch_name="$3"
+
+ if [ -z "$target_dir" ]; then
+ target_dir=$(basename -s .git "$repo_url")
+ fi
+
+ cd "$REPOPATH" || exit
+
+ if [ -d "$target_dir" ]; then
+ echo "Directory $target_dir already exists. Skipping clone."
+ else
+ echo "Cloning into $target_dir..."
+ if [ -n "$branch_name" ]; then
+ git clone -b "$branch_name" "$repo_url" "$target_dir"
+ else
+ git clone "$repo_url" "$target_dir"
+ fi
+ fi
+}
+
+# Clone enterprise repository
+clone_enterprise() {
+ cd "$REPOPATH" || exit
+ clone_repo_with_rename https://github.com/taosdata/TDinternal
+ clone_repo_with_rename git@github.com:taosdata/TDengine.git TDinternal/community
+}
+
+# Clone community repository
+clone_community() {
+ cd "$REPOPATH" || exit
+ clone_repo_with_rename https://github.com/taosdata/TDengine.git
+}
+
+# Clone TaosX repository
+clone_taosx() {
+ cd "$REPOPATH" || exit
+ clone_repo_with_rename https://github.com/taosdata/taosx
+}
+
+# Clone TaosKeeper repository
+clone_taoskeeper() {
+ cd "$REPOPATH" || exit
+ clone_repo_with_rename https://github.com/taosdata/taoskeeper
+}
+
+# Clone TaosTest repository
+clone_taostest() {
+ cd "$REPOPATH" || exit
+ clone_repo_with_rename https://github.com/taosdata/taos-test-framework "" "master"
+}
+
+# Clone TestNG repository
+clone_testng() {
+ cd "$REPOPATH" || exit
+ clone_repo_with_rename https://github.com/taosdata/TestNG "" "master"
+}
+
+# Clone operation tools repository
+clone_operation() {
+ cd "$REPOPATH" || exit
+ clone_repo_with_rename https://github.com/taosdata/operation.git
+}
+
+# init system
+system_config() {
+ disable_service
+ config_dns
+ replace_sources
+ config_cloud_init
+ config_ssh
+ config_custom_settings
+ config_timezone
+ config_share_server
+ disable_firewalld
+ config_frontend
+ config_system_limits
+ config_coredump
+ check_status "Failed to config system" "Config system successfully" $?
+}
+
+# Clone all the repositories
+clone_repos() {
+ clone_enterprise
+ clone_community
+ clone_taosx
+ clone_taoskeeper
+ clone_taostest
+ clone_operation
+}
+
+new_funcs() {
+ echo "Adding test..."
+ install_python 3.10.12
+ # install_java 21
+ # install_node 16.20.2
+ # install_maven 3.2.5
+}
+
+# deploy TDasset
+TDasset() {
+ install_java 21
+ install_maven 3.9.9
+ # not supported in centos7 because of the old version of glibc
+ # install_node 22.0.0
+ install_node_via_nvm 22.0.0
+ install_pnpm
+}
+
+# deploy TDinternal/TDengine/taosx
+TDinternal() {
+ deploy_go
+ deploy_rust
+ install_java 17
+ install_node_via_nvm 16.20.2
+ install_python 3.10.12
+}
+
+# deploy TDgpt
+TDgpt() {
+ install_python 3.10.12
+}
+
+# deploy taos-test-framework
+taostest() {
+ if [ ! -d "$REPOPATH/taos-test-framework" ]; then
+ echo "Cloning TaosTest repository..."
+ clone_taostest
+ else
+ echo "TaosTest repository already exists. Skipping clone."
+ fi
+ check_status "Failed to clone TaosTest repository" "TaosTest repository cloned successfully." $?
+
+ if [ ! -d "$REPOPATH/TestNG" ]; then
+ echo "Cloning TestNG repository..."
+ clone_testng
+ else
+ echo "TestNG repository already exists. Skipping clone."
+ fi
+ check_status "Failed to clone TestNG repository" "TestNG repository cloned successfully." $?
+
+ # Configure environment variables
+ echo "Configuring TaosTest environment variables..."
+ mkdir -p "$HOME"/.taostest
+ add_config_if_not_exist "TEST_ROOT=$REPOPATH/TestNG" "$HOME"/.taostest/.env
+
+ # Install TaosTest
+ echo "Installing TaosTest..."
+ cd "$REPOPATH"/taos-test-framework || exit
+ install_package "python3-pip"
+ install_via_pip "poetry"
+ yes | ./reinstall.sh
+ check_status "Failed to install TaosTest" "TaosTest installed successfully." $?
+
+ # Configure passwdless login
+ echo "Configuring passwdless login..."
+ yes | ssh-keygen -t rsa -b 2048 -N "" -f "$HOME/.ssh/testng"
+ cat "$HOME"/.ssh/testng.pub >> "$HOME"/.ssh/authorized_keys
+}
+
+
+# Deploy pure environment
+deploy_pure() {
+ disable_service
+ config_dns
+ if [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ echo "Replacing sources for YUM package manager."
+ replace_yum_sources
+ fi
+ config_cloud_init
+ config_ssh
+ config_custom_settings
+ config_timezone
+ config_share_server
+ disable_firewalld
+ install_package "jq"
+ install_package "wget"
+ deploy_node_exporter
+ check_status "Failed to config pure system" "Config pure system successfully" $?
+}
+
+# Deploy development environment
+deploy_dev() {
+ install_packages
+ deploy_cmake
+ if [ -f /etc/redhat-release ]; then
+ # Red Hat or CentOS
+ update_redhat_gcc
+ fi
+ deploy_tmux
+ deploy_git
+ install_python
+ install_pip_pkg
+ install_java
+ install_maven
+ deploy_go
+ deploy_rust
+ install_node
+ deploy_node_exporter
+ deploy_process_exporter
+ deploy_prometheus
+ deploy_grafana
+ deploy_jmeter
+ install_nginx
+ deploy_docker
+ deploy_docker_compose
+ check_status "Failed to deploy some tools" "Deploy all tools successfully" $?
+}
+
+# Setup all configurations
+setup_all() {
+ system_config
+ deploy_dev
+}
+
+# More installation functions can be added here following the above examples
+
+# Main execution function
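+# Functions run in the order given on the command line, e.g.: bash setup_env.sh replace_sources system_config deploy_dev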
+main() {
+ # Check if at least one argument is provided
+ if [ $# -eq 0 ]; then
+ echo "Error: No arguments provided."
+ echo "Please try $0 --help"
+ exit 1
+ fi
+ init_env
+ for arg in "$@"; do
+ case $arg in
+ --help)
+ help
+ exit 0
+ ;;
+ setup_all)
+ setup_all
+ ;;
+ config_ssh)
+ config_ssh
+ ;;
+ disable_firewalld)
+ disable_firewalld
+ ;;
+ config_cloud_init)
+ config_cloud_init
+ ;;
+ deploy_git)
+ deploy_git
+ ;;
+ replace_sources)
+ replace_sources
+ ;;
+ upgrade)
+ upgrade
+ ;;
+ config_timezone)
+ config_timezone
+ ;;
+ config_dns)
+ config_dns
+ ;;
+ config_custom_settings)
+ config_custom_settings
+ ;;
+ install_packages)
+ install_packages
+ ;;
+ config_system_limits)
+ config_system_limits
+ ;;
+ config_coredump)
+ config_coredump
+ ;;
+ disable_service)
+ disable_service
+ ;;
+ install_python)
+ install_python
+ ;;
+ install_pip_pkg)
+ install_pip_pkg
+ ;;
+ install_java)
+ install_java
+ ;;
+ install_maven)
+ install_maven
+ ;;
+ deploy_cmake)
+ deploy_cmake
+ ;;
+ update_redhat_gcc)
+ update_redhat_gcc
+ ;;
+ update_redhat_tmux)
+ update_redhat_tmux
+ ;;
+ deploy_tmux)
+ deploy_tmux
+ ;;
+ deploy_go)
+ deploy_go
+ ;;
+ deploy_rust)
+ deploy_rust
+ ;;
+ install_node)
+ install_node
+ ;;
+ install_node_via_nvm)
+ install_node_via_nvm
+ ;;
+ install_pnpm)
+ install_pnpm
+ ;;
+ deploy_node_exporter)
+ deploy_node_exporter
+ ;;
+ deploy_process_exporter)
+ deploy_process_exporter
+ ;;
+ deploy_prometheus)
+ deploy_prometheus
+ ;;
+ deploy_grafana)
+ deploy_grafana
+ ;;
+ deploy_jmeter)
+ deploy_jmeter
+ ;;
+ install_nginx)
+ install_nginx
+ ;;
+ config_qemu_guest_agent)
+ config_qemu_guest_agent
+ ;;
+ config_share_server)
+ config_share_server
+ ;;
+ deploy_docker)
+ deploy_docker
+ ;;
+ deploy_docker_compose)
+ deploy_docker_compose
+ ;;
+ clone_enterprise)
+ clone_enterprise
+ ;;
+ clone_community)
+ clone_community
+ ;;
+ clone_taosx)
+ clone_taosx
+ ;;
+ clone_taoskeeper)
+ clone_taoskeeper
+ ;;
+ clone_taostest)
+ clone_taostest
+ ;;
+ clone_operation)
+ clone_operation
+ ;;
+ system_config)
+ system_config
+ ;;
+ deploy_pure)
+ deploy_pure
+ ;;
+ deploy_dev)
+ deploy_dev
+ ;;
+ TDasset)
+ TDasset
+ ;;
+ TDinternal)
+ TDinternal
+ ;;
+ TDgpt)
+ TDgpt
+ ;;
+ taostest)
+ taostest
+ ;;
+ new_funcs)
+ new_funcs
+ ;;
+ *)
+ echo "Unknown function: $arg"
+ ;;
+ esac
+ done
+}
+
+# Execute the script with specified function arguments
+main "$@"
From 8dedf8f21d723177aef8441785791747224523aa Mon Sep 17 00:00:00 2001
From: Shengliang Guan
Date: Sun, 29 Dec 2024 22:34:24 +0800
Subject: [PATCH 04/35] minor changes
---
source/libs/parser/src/parTranslater.c | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/source/libs/parser/src/parTranslater.c b/source/libs/parser/src/parTranslater.c
index 87c648989e..8f23daf8dd 100755
--- a/source/libs/parser/src/parTranslater.c
+++ b/source/libs/parser/src/parTranslater.c
@@ -7782,7 +7782,7 @@ static int32_t buildCreateDbReq(STranslateContext* pCxt, SCreateDatabaseStmt* pS
static int32_t checkRangeOption(STranslateContext* pCxt, int32_t code, const char* pName, int64_t val, int64_t minVal,
int64_t maxVal, bool skipUndef) {
- if (skipUndef ? ((val >= 0 | val < -2) && (val < minVal || val > maxVal)) : (val < minVal || val > maxVal)) {
+ if (skipUndef ? ((val >= 0 || val < -2) && (val < minVal || val > maxVal)) : (val < minVal || val > maxVal)) {
return generateSyntaxErrMsgExt(&pCxt->msgBuf, code,
"Invalid option %s: %" PRId64 ", valid range: [%" PRId64 ", %" PRId64 "]", pName,
val, minVal, maxVal);
From 65330a5b47637a6546e2c3d50836e02f3be53b62 Mon Sep 17 00:00:00 2001
From: jiajingbin
Date: Sun, 29 Dec 2024 22:41:49 +0800
Subject: [PATCH 05/35] enh: update
---
packaging/setup_env.sh | 3 +--
1 file changed, 1 insertion(+), 2 deletions(-)
diff --git a/packaging/setup_env.sh b/packaging/setup_env.sh
index cce54b2839..42ae6b3814 100644
--- a/packaging/setup_env.sh
+++ b/packaging/setup_env.sh
@@ -1761,8 +1761,7 @@ new_funcs() {
TDasset() {
install_java 21
install_maven 3.9.9
- # not supported in centos7 because of the old version of glibc
- # install_node 22.0.0
+ # not supported in centos7/ubuntu18 because of the old version of glibc
install_node_via_nvm 22.0.0
install_pnpm
}
From cf7701af6722f23a2ce56b14391cf54a669e8f91 Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 09:47:03 +0800
Subject: [PATCH 06/35] fix: limit can not over childtable_count
---
.../5-taos-tools/taosbenchmark/json/custom_col_tag.json | 2 +-
.../5-taos-tools/taosbenchmark/json/taosc_insert_alltypes.json | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/tests/develop-test/5-taos-tools/taosbenchmark/json/custom_col_tag.json b/tests/develop-test/5-taos-tools/taosbenchmark/json/custom_col_tag.json
index fec5775cd6..b844f7b1b8 100644
--- a/tests/develop-test/5-taos-tools/taosbenchmark/json/custom_col_tag.json
+++ b/tests/develop-test/5-taos-tools/taosbenchmark/json/custom_col_tag.json
@@ -36,7 +36,7 @@
"insert_mode": "taosc",
"line_protocol": "line",
"childtable_limit": -10,
- "childtable_offset": 10,
+ "childtable_offset": 0,
"insert_rows": 20,
"insert_interval": 0,
"interlace_rows": 0,
diff --git a/tests/develop-test/5-taos-tools/taosbenchmark/json/taosc_insert_alltypes.json b/tests/develop-test/5-taos-tools/taosbenchmark/json/taosc_insert_alltypes.json
index 5694b58407..5ba870a3bd 100644
--- a/tests/develop-test/5-taos-tools/taosbenchmark/json/taosc_insert_alltypes.json
+++ b/tests/develop-test/5-taos-tools/taosbenchmark/json/taosc_insert_alltypes.json
@@ -36,7 +36,7 @@
"insert_mode": "taosc",
"line_protocol": "line",
"childtable_limit": -10,
- "childtable_offset": 10,
+ "childtable_offset": 0,
"insert_rows": 20,
"insert_interval": 0,
"interlace_rows": 0,
From 118b2b98779b2f248db8a4a2586a5ecb3c6e6702 Mon Sep 17 00:00:00 2001
From: t_max <1172915550@qq.com>
Date: Mon, 30 Dec 2024 09:34:50 +0800
Subject: [PATCH 07/35] docs: go connector support native stmt2 binding
---
docs/en/07-develop/05-stmt.md | 10 +++
docs/en/14-reference/05-connector/20-go.md | 37 +++++++++
docs/examples/go/stmt2/native/main.go | 84 +++++++++++++++++++++
docs/zh/07-develop/05-stmt.md | 11 +++
docs/zh/14-reference/05-connector/20-go.mdx | 31 ++++++++
5 files changed, 173 insertions(+)
create mode 100644 docs/examples/go/stmt2/native/main.go
diff --git a/docs/en/07-develop/05-stmt.md b/docs/en/07-develop/05-stmt.md
index 4503bb8bd3..614d867c9e 100644
--- a/docs/en/07-develop/05-stmt.md
+++ b/docs/en/07-develop/05-stmt.md
@@ -96,9 +96,19 @@ This is a [more detailed parameter binding example](https://github.com/taosdata/
+
+The example code for binding parameters with stmt2 (Go connector v3.6.0 and above, TDengine v3.3.5.0 and above) is as follows:
+
+```go
+{{#include docs/examples/go/stmt2/native/main.go}}
+```
+
+The example code for binding parameters with stmt is as follows:
+
```go
{{#include docs/examples/go/stmt/native/main.go}}
```
+
diff --git a/docs/en/14-reference/05-connector/20-go.md b/docs/en/14-reference/05-connector/20-go.md
index dd32df2c5b..aecad76a2e 100644
--- a/docs/en/14-reference/05-connector/20-go.md
+++ b/docs/en/14-reference/05-connector/20-go.md
@@ -493,6 +493,43 @@ The `af` package provides more interfaces using native connections for parameter
* **Interface Description**: Closes the statement.
* **Return Value**: Error information.
+Starting from version 3.6.0, the `stmt2` interface for parameter binding is provided; a condensed usage sketch follows the interface list.
+
+* `func (conn *Connector) Stmt2(reqID int64, singleTableBindOnce bool) *Stmt2`
+ * **Interface Description**: Returns a Stmt2 object bound to this connection.
+ * **Parameter Description**:
+ * `reqID`: Request ID.
+ * `singleTableBindOnce`: Indicates whether a single child table is bound only once during a single execution.
+ * **Return Value**: Stmt2 object.
+
+* `func (s *Stmt2) Prepare(sql string) error`
+ * **Interface Description**: Prepares an SQL statement.
+ * **Parameter Description**:
+ * `sql`: The statement for parameter binding.
+ * **Return Value**: Error information.
+
+* `func (s *Stmt2) Bind(params []*stmt.TaosStmt2BindData) error`
+ * **Interface Description**: Binds data to the prepared statement.
+ * **Parameter Description**:
+ * `params`: The data to bind.
+ * **Return Value**: Error information.
+
+* `func (s *Stmt2) Execute() error`
+ * **Interface Description**: Executes the batch.
+ * **Return Value**: Error information.
+
+* `func (s *Stmt2) GetAffectedRows() int`
+ * **Interface Description**: Gets the number of affected rows (only valid for insert statements).
+ * **Return Value**: Number of affected rows.
+
+* `func (s *Stmt2) UseResult() (driver.Rows, error)`
+ * **Interface Description**: Retrieves the result set (only valid for query statements).
+ * **Return Value**: Result set Rows object, error information.
+
+* `func (s *Stmt2) Close() error`
+ * **Interface Description**: Closes the statement.
+ * **Return Value**: Error information.
+
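+A condensed sketch of the typical stmt2 workflow is shown below. It is for illustration only: the host, credentials, child table name, and tag/column values are placeholders, and error handling is kept minimal. See the full runnable example in `docs/examples/go/stmt2/native/main.go`.
+
+```go
+package main
+
+import (
+    "database/sql/driver"
+    "fmt"
+    "log"
+    "time"
+
+    "github.com/taosdata/driver-go/v3/af"
+    "github.com/taosdata/driver-go/v3/common"
+    "github.com/taosdata/driver-go/v3/common/stmt"
+)
+
+func main() {
+    // assumes the database `power` and the super table `meters` already exist
+    conn, err := af.Open("127.0.0.1", "root", "taosdata", "power", 0)
+    if err != nil {
+        log.Fatal(err)
+    }
+    defer conn.Close()
+    // create the stmt2 object and prepare the insert statement
+    stmt2 := conn.Stmt2(common.GetReqID(), false)
+    if err = stmt2.Prepare("INSERT INTO ? USING meters TAGS(?,?) VALUES (?,?,?,?)"); err != nil {
+        log.Fatal(err)
+    }
+    defer stmt2.Close()
+    // bind one row for one child table; column data is organized column-wise
+    bindData := []*stmt.TaosStmt2BindData{{
+        TableName: "d_bind_1",
+        Tags:      []driver.Value{int32(1), []byte("California.SanFrancisco")},
+        Cols: [][]driver.Value{
+            {time.Now()},
+            {float32(10.3)},
+            {int32(219)},
+            {float32(0.31)},
+        },
+    }}
+    if err = stmt2.Bind(bindData); err != nil {
+        log.Fatal(err)
+    }
+    // execute the batch and report the number of inserted rows
+    if err = stmt2.Execute(); err != nil {
+        log.Fatal(err)
+    }
+    fmt.Println("affected rows:", stmt2.GetAffectedRows())
+}
+```
+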
The `ws/stmt` package provides interfaces for parameter binding via WebSocket
* `func (c *Connector) Init() (*Stmt, error)`
diff --git a/docs/examples/go/stmt2/native/main.go b/docs/examples/go/stmt2/native/main.go
new file mode 100644
index 0000000000..fc3ab763c2
--- /dev/null
+++ b/docs/examples/go/stmt2/native/main.go
@@ -0,0 +1,84 @@
+package main
+
+import (
+ "database/sql/driver"
+ "fmt"
+ "log"
+ "math/rand"
+ "time"
+
+ "github.com/taosdata/driver-go/v3/af"
+ "github.com/taosdata/driver-go/v3/common"
+ "github.com/taosdata/driver-go/v3/common/stmt"
+)
+
+func main() {
+ host := "127.0.0.1"
+ numOfSubTable := 10
+ numOfRow := 10
+ db, err := af.Open(host, "root", "taosdata", "", 0)
+ if err != nil {
+ log.Fatalln("Failed to connect to " + host + "; ErrMessage: " + err.Error())
+ }
+ defer db.Close()
+ // prepare database and table
+ _, err = db.Exec("CREATE DATABASE IF NOT EXISTS power")
+ if err != nil {
+ log.Fatalln("Failed to create database power, ErrMessage: " + err.Error())
+ }
+ _, err = db.Exec("USE power")
+ if err != nil {
+ log.Fatalln("Failed to use database power, ErrMessage: " + err.Error())
+ }
+ _, err = db.Exec("CREATE STABLE IF NOT EXISTS meters (ts TIMESTAMP, current FLOAT, voltage INT, phase FLOAT) TAGS (groupId INT, location BINARY(24))")
+ if err != nil {
+ log.Fatalln("Failed to create stable meters, ErrMessage: " + err.Error())
+ }
+ // prepare statement
+ sql := "INSERT INTO ? USING meters TAGS(?,?) VALUES (?,?,?,?)"
+ reqID := common.GetReqID()
+ stmt2 := db.Stmt2(reqID, false)
+ err = stmt2.Prepare(sql)
+ if err != nil {
+ log.Fatalln("Failed to prepare sql, sql: " + sql + ", ErrMessage: " + err.Error())
+ }
+ for i := 1; i <= numOfSubTable; i++ {
+ // generate column data
+ current := time.Now()
+ columns := make([][]driver.Value, 4)
+ for j := 0; j < numOfRow; j++ {
+ columns[0] = append(columns[0], current.Add(time.Millisecond*time.Duration(j)))
+ columns[1] = append(columns[1], rand.Float32()*30)
+ columns[2] = append(columns[2], rand.Int31n(300))
+ columns[3] = append(columns[3], rand.Float32())
+ }
+ // generate bind data
+ tableName := fmt.Sprintf("d_bind_%d", i)
+ tags := []driver.Value{int32(i), []byte(fmt.Sprintf("location_%d", i))}
+ bindData := []*stmt.TaosStmt2BindData{
+ {
+ TableName: tableName,
+ Tags: tags,
+ Cols: columns,
+ },
+ }
+ // bind params
+ err = stmt2.Bind(bindData)
+ if err != nil {
+ log.Fatalln("Failed to bind params, ErrMessage: " + err.Error())
+ }
+ // execute batch
+ err = stmt2.Execute()
+ if err != nil {
+ log.Fatalln("Failed to exec, ErrMessage: " + err.Error())
+ }
+ // get affected rows
+ affected := stmt2.GetAffectedRows()
+ // you can check the affected rows here
+ fmt.Printf("Successfully inserted %d rows to %s.\n", affected, tableName)
+ }
+ err = stmt2.Close()
+ if err != nil {
+ log.Fatal("failed to close statement, err:", err)
+ }
+}
diff --git a/docs/zh/07-develop/05-stmt.md b/docs/zh/07-develop/05-stmt.md
index 74b44ba8e6..2cc5413a03 100644
--- a/docs/zh/07-develop/05-stmt.md
+++ b/docs/zh/07-develop/05-stmt.md
@@ -91,9 +91,20 @@ import TabItem from "@theme/TabItem";
```
+
+stmt2 绑定参数的示例代码如下(go 连接器 v3.6.0 及以上,TDengine v3.3.5.0 及以上):
+
+```go
+{{#include docs/examples/go/stmt2/native/main.go}}
+```
+
+stmt 绑定参数的示例代码如下:
+
```go
{{#include docs/examples/go/stmt/native/main.go}}
```
+
+
diff --git a/docs/zh/14-reference/05-connector/20-go.mdx b/docs/zh/14-reference/05-connector/20-go.mdx
index 85c65a5fb8..a4d7b9be71 100644
--- a/docs/zh/14-reference/05-connector/20-go.mdx
+++ b/docs/zh/14-reference/05-connector/20-go.mdx
@@ -494,6 +494,37 @@ Prepare 允许使用预编译的 SQL 语句,可以提高性能并提供参数
- **接口说明**:关闭语句。
- **返回值**:错误信息。
+从 3.6.0 版本开始,提供 stmt2 绑定参数的接口
+
+- `func (conn *Connector) Stmt2(reqID int64, singleTableBindOnce bool) *Stmt2`
+ - **接口说明**:从连接创建 stmt2。
+ - **参数说明**:
+ - `reqID`:请求 ID。
+ - `singleTableBindOnce`:单个子表在单次执行中只有一次数据绑定。
+ - **返回值**:stmt2 对象。
+- `func (s *Stmt2) Prepare(sql string) error`
+ - **接口说明**:绑定 sql 语句。
+ - **参数说明**:
+ - `sql`:要绑定的 sql 语句。
+ - **返回值**:错误信息。
+- `func (s *Stmt2) Bind(params []*stmt.TaosStmt2BindData) error`
+ - **接口说明**:绑定数据。
+ - **参数说明**:
+ - `params`:要绑定的数据。
+ - **返回值**:错误信息。
+- `func (s *Stmt2) Execute() error`
+ - **接口说明**:执行语句。
+ - **返回值**:错误信息。
+- `func (s *Stmt2) GetAffectedRows() int`
+ - **接口说明**:获取受影响行数(只在插入语句有效)。
+ - **返回值**:受影响行数。
+- `func (s *Stmt2) UseResult() (driver.Rows, error)`
+ - **接口说明**:获取结果集(只在查询语句有效)。
+ - **返回值**:结果集 Rows 对象,错误信息。
+- `func (s *Stmt2) Close() error`
+ - **接口说明**:关闭 stmt2。
+ - **返回值**:错误信息。
+
`ws/stmt` 包提供了通过 WebSocket 进行参数绑定的接口
- `func (c *Connector) Init() (*Stmt, error)`
From 922766c1457c9af2df3cedfe5d2fed1c1c4fdee3 Mon Sep 17 00:00:00 2001
From: Haolin Wang
Date: Fri, 27 Dec 2024 17:16:00 +0800
Subject: [PATCH 08/35] fix: infinite loop when reading CSV file EOF on Windows
---
source/os/src/osFile.c | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/source/os/src/osFile.c b/source/os/src/osFile.c
index 8a2606c4c2..b1198e1cb2 100644
--- a/source/os/src/osFile.c
+++ b/source/os/src/osFile.c
@@ -1403,7 +1403,7 @@ int64_t taosGetLineFile(TdFilePtr pFile, char **__restrict ptrBuf) {
}
(*ptrBuf)[totalBytesRead] = '\0';
- ret = totalBytesRead;
+ ret = (totalBytesRead > 0 ? totalBytesRead : -1); // -1 means EOF
#else
size_t len = 0;
ret = getline(ptrBuf, &len, pFile->fp);
From af434a28318d69e8c16dc46e485fd840877e39b4 Mon Sep 17 00:00:00 2001
From: t_max <1172915550@qq.com>
Date: Mon, 30 Dec 2024 10:49:41 +0800
Subject: [PATCH 09/35] docs: go connector v3.6.0
---
docs/en/14-reference/05-connector/20-go.md | 39 +++++++++++----------
docs/zh/14-reference/05-connector/20-go.mdx | 39 +++++++++++----------
2 files changed, 42 insertions(+), 36 deletions(-)
diff --git a/docs/en/14-reference/05-connector/20-go.md b/docs/en/14-reference/05-connector/20-go.md
index aecad76a2e..bf0e6dd979 100644
--- a/docs/en/14-reference/05-connector/20-go.md
+++ b/docs/en/14-reference/05-connector/20-go.md
@@ -21,24 +21,25 @@ Supports Go 1.14 and above.
## Version History
-| driver-go Version | Major Changes | TDengine Version |
-|------------------|------------------------------------------------------------------|-------------------|
-| v3.5.8 | Fixed null pointer exception. | - |
-| v3.5.7 | taosWS and taosRestful support passing request id. | - |
-| v3.5.6 | Improved websocket query and insert performance. | 3.3.2.0 and higher |
-| v3.5.5 | Restful supports skipping SSL certificate check. | - |
-| v3.5.4 | Compatible with TDengine 3.3.0.0 tmq raw data. | - |
-| v3.5.3 | Refactored taosWS. | - |
-| v3.5.2 | Websocket compression and optimized tmq subscription performance. | 3.2.3.0 and higher |
-| v3.5.1 | Native stmt query and geometry type support. | 3.2.1.0 and higher |
-| v3.5.0 | Support tmq get assignment and seek offset. | 3.0.5.0 and higher |
-| v3.3.1 | Schemaless protocol insert based on websocket. | 3.0.4.1 and higher |
-| v3.1.0 | Provided Kafka-like subscription API. | - |
-| v3.0.4 | Added request id related interfaces. | 3.0.2.2 and higher |
-| v3.0.3 | Websocket-based statement insert. | - |
-| v3.0.2 | Websocket-based data query and insert. | 3.0.1.5 and higher |
-| v3.0.1 | Websocket-based message subscription. | - |
-| v3.0.0 | Adapted to TDengine 3.0 query and insert. | 3.0.0.0 and higher |
+| driver-go Version | Major Changes | TDengine Version |
+|-------------------|-------------------------------------------------------------------------------------------------|--------------------|
+| v3.6.0 | stmt2 native interface, DSN supports passwords containing special characters (url.QueryEscape). | 3.3.5.0 and higher |
+| v3.5.8 | Fixed null pointer exception. | - |
+| v3.5.7 | taosWS and taosRestful support passing request id. | - |
+| v3.5.6 | Improved websocket query and insert performance. | 3.3.2.0 and higher |
+| v3.5.5 | Restful supports skipping SSL certificate check. | - |
+| v3.5.4 | Compatible with TDengine 3.3.0.0 tmq raw data. | - |
+| v3.5.3 | Refactored taosWS. | - |
+| v3.5.2 | Websocket compression and optimized tmq subscription performance. | 3.2.3.0 and higher |
+| v3.5.1 | Native stmt query and geometry type support. | 3.2.1.0 and higher |
+| v3.5.0 | Support tmq get assignment and seek offset. | 3.0.5.0 and higher |
+| v3.3.1 | Schemaless protocol insert based on websocket. | 3.0.4.1 and higher |
+| v3.1.0 | Provided Kafka-like subscription API. | - |
+| v3.0.4 | Added request id related interfaces. | 3.0.2.2 and higher |
+| v3.0.3 | Websocket-based statement insert. | - |
+| v3.0.2 | Websocket-based data query and insert. | 3.0.1.5 and higher |
+| v3.0.1 | Websocket-based message subscription. | - |
+| v3.0.0 | Adapted to TDengine 3.0 query and insert. | 3.0.0.0 and higher |
## Exceptions and Error Codes
@@ -136,6 +137,8 @@ Full form of DSN:
username:password@protocol(address)/dbname?param=value
```
+When the password contains special characters, it needs to be escaped using `url.QueryEscape`.
+
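+For example, a DSN can be assembled as follows. The values are illustrative (a hypothetical password and the WebSocket protocol/address form `ws(localhost:6041)`); only the use of `url.QueryEscape` matters here:
+
+```go
+package main
+
+import (
+    "fmt"
+    "net/url"
+)
+
+func main() {
+    user := "root"
+    // hypothetical password containing characters that cannot appear raw in the DSN
+    rawPassword := "ta/os@data#1"
+    // escape the password before assembling the DSN
+    dsn := fmt.Sprintf("%s:%s@ws(localhost:6041)/test", user, url.QueryEscape(rawPassword))
+    fmt.Println(dsn)
+    // prints: root:ta%2Fos%40data%231@ws(localhost:6041)/test
+}
+```
+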
##### Native Connection
Import the driver:
diff --git a/docs/zh/14-reference/05-connector/20-go.mdx b/docs/zh/14-reference/05-connector/20-go.mdx
index a4d7b9be71..210b0e438d 100644
--- a/docs/zh/14-reference/05-connector/20-go.mdx
+++ b/docs/zh/14-reference/05-connector/20-go.mdx
@@ -23,24 +23,25 @@ import RequestId from "./_request_id.mdx";
## 版本历史
-| driver-go 版本 | 主要变化 | TDengine 版本 |
-|-------------|-------------------------------------|---------------|
-| v3.5.8 | 修复空指针异常 | - |
-| v3.5.7 | taosWS 和 taosRestful 支持传入 request id | - |
-| v3.5.6 | 提升 websocket 查询和写入性能 | 3.3.2.0 及更高版本 |
-| v3.5.5 | restful 支持跳过 ssl 证书检查 | - |
-| v3.5.4 | 兼容 TDengine 3.3.0.0 tmq raw data | - |
-| v3.5.3 | 重构 taosWS | - |
-| v3.5.2 | websocket 压缩和优化消息订阅性能 | 3.2.3.0 及更高版本 |
-| v3.5.1 | 原生 stmt 查询和 geometry 类型支持 | 3.2.1.0 及更高版本 |
-| v3.5.0 | 获取消费进度及按照指定进度开始消费 | 3.0.5.0 及更高版本 |
-| v3.3.1 | 基于 websocket 的 schemaless 协议写入 | 3.0.4.1 及更高版本 |
-| v3.1.0 | 提供贴近 kafka 的订阅 api | - |
-| v3.0.4 | 新增 request id 相关接口 | 3.0.2.2 及更高版本 |
-| v3.0.3 | 基于 websocket 的 statement 写入 | - |
-| v3.0.2 | 基于 websocket 的数据查询和写入 | 3.0.1.5 及更高版本 |
-| v3.0.1 | 基于 websocket 的消息订阅 | - |
-| v3.0.0 | 适配 TDengine 3.0 查询和写入 | 3.0.0.0 及更高版本 |
+| driver-go 版本 | 主要变化 | TDengine 版本 |
+|--------------|--------------------------------------------|---------------|
+| v3.6.0 | stmt2 原生接口,DSN 支持密码包含特殊字符(url.QueryEscape) | 3.3.5.0 及更高版本 |
+| v3.5.8 | 修复空指针异常 | - |
+| v3.5.7 | taosWS 和 taosRestful 支持传入 request id | - |
+| v3.5.6 | 提升 websocket 查询和写入性能 | 3.3.2.0 及更高版本 |
+| v3.5.5 | restful 支持跳过 ssl 证书检查 | - |
+| v3.5.4 | 兼容 TDengine 3.3.0.0 tmq raw data | - |
+| v3.5.3 | 重构 taosWS | - |
+| v3.5.2 | websocket 压缩和优化消息订阅性能 | 3.2.3.0 及更高版本 |
+| v3.5.1 | 原生 stmt 查询和 geometry 类型支持 | 3.2.1.0 及更高版本 |
+| v3.5.0 | 获取消费进度及按照指定进度开始消费 | 3.0.5.0 及更高版本 |
+| v3.3.1 | 基于 websocket 的 schemaless 协议写入 | 3.0.4.1 及更高版本 |
+| v3.1.0 | 提供贴近 kafka 的订阅 api | - |
+| v3.0.4 | 新增 request id 相关接口 | 3.0.2.2 及更高版本 |
+| v3.0.3 | 基于 websocket 的 statement 写入 | - |
+| v3.0.2 | 基于 websocket 的数据查询和写入 | 3.0.1.5 及更高版本 |
+| v3.0.1 | 基于 websocket 的消息订阅 | - |
+| v3.0.0 | 适配 TDengine 3.0 查询和写入 | 3.0.0.0 及更高版本 |
## 异常和错误码
@@ -137,6 +138,8 @@ WKB规范请参考[Well-Known Binary (WKB)](https://libgeos.org/specifications/w
username:password@protocol(address)/dbname?param=value
```
+当密码中包含特殊字符时,需要使用 `url.QueryEscape` 进行转义。
+
##### 原生连接
导入驱动:
From 4167f0d40f1e461cbe94858a7a0adb84dc26ca16 Mon Sep 17 00:00:00 2001
From: t_max <1172915550@qq.com>
Date: Mon, 30 Dec 2024 11:06:33 +0800
Subject: [PATCH 10/35] test: add go stmt2 example test
---
docs/examples/go/go.mod | 2 +-
docs/examples/go/go.sum | 4 ++--
tests/docs-examples-test/go.sh | 5 +++++
3 files changed, 8 insertions(+), 3 deletions(-)
diff --git a/docs/examples/go/go.mod b/docs/examples/go/go.mod
index ed8fde2d9f..2dd5d4ec2b 100644
--- a/docs/examples/go/go.mod
+++ b/docs/examples/go/go.mod
@@ -2,7 +2,7 @@ module goexample
go 1.17
-require github.com/taosdata/driver-go/v3 v3.5.6
+require github.com/taosdata/driver-go/v3 v3.6.0
require (
github.com/google/uuid v1.3.0 // indirect
diff --git a/docs/examples/go/go.sum b/docs/examples/go/go.sum
index 61841429ee..ee2ad55588 100644
--- a/docs/examples/go/go.sum
+++ b/docs/examples/go/go.sum
@@ -18,8 +18,8 @@ github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
github.com/stretchr/testify v1.7.0 h1:nwc3DEeHmmLAfoZucVR881uASk0Mfjw8xYJ99tb5CcY=
github.com/stretchr/testify v1.7.0/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
-github.com/taosdata/driver-go/v3 v3.5.6 h1:LDVtMyT3B9p2VREsd5KKM91D4Y7P4kSdh2SQumXi8bk=
-github.com/taosdata/driver-go/v3 v3.5.6/go.mod h1:H2vo/At+rOPY1aMzUV9P49SVX7NlXb3LAbKw+MCLrmU=
+github.com/taosdata/driver-go/v3 v3.6.0 h1:4dRXMl01DhIS5xBXUvtkkB+MjL8g64zN674xKd+ojTE=
+github.com/taosdata/driver-go/v3 v3.6.0/go.mod h1:H2vo/At+rOPY1aMzUV9P49SVX7NlXb3LAbKw+MCLrmU=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c h1:dUUwHk2QECo/6vqA44rthZ8ie2QXMNeKRTHCNY2nXvo=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
diff --git a/tests/docs-examples-test/go.sh b/tests/docs-examples-test/go.sh
index 606265435d..fa02b33b10 100644
--- a/tests/docs-examples-test/go.sh
+++ b/tests/docs-examples-test/go.sh
@@ -61,6 +61,11 @@ check_transactions || exit 1
reset_cache || exit 1
go run ./stmt/ws/main.go
+taos -s "drop database if exists power"
+check_transactions || exit 1
+reset_cache || exit 1
+go run ./stmt2/native/main.go
+
taos -s "drop database if exists power"
check_transactions || exit 1
reset_cache || exit 1
From b80a20a4a13b7ec9768e92ffb0af8837078a75dc Mon Sep 17 00:00:00 2001
From: dapan1121
Date: Mon, 30 Dec 2024 12:11:11 +0800
Subject: [PATCH 11/35] fix: session list issue
---
source/util/inc/tmempoolInt.h | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/source/util/inc/tmempoolInt.h b/source/util/inc/tmempoolInt.h
index 8d4cb57ddc..013838e6df 100755
--- a/source/util/inc/tmempoolInt.h
+++ b/source/util/inc/tmempoolInt.h
@@ -222,7 +222,7 @@ typedef struct SMPSessionChunk {
} SMPSessionChunk;
typedef struct SMPSession {
- // SMPListNode list;
+ SMPListNode list;
char* sessionId;
SMPJob* pJob;
From 549d26be95add84aef6cb416e8a6dda95c2ab8b3 Mon Sep 17 00:00:00 2001
From: dapan1121
Date: Mon, 30 Dec 2024 13:12:10 +0800
Subject: [PATCH 12/35] fix: abort preprocess query issue
---
source/libs/qworker/src/qworker.c | 14 ++++++++++++--
1 file changed, 12 insertions(+), 2 deletions(-)
diff --git a/source/libs/qworker/src/qworker.c b/source/libs/qworker/src/qworker.c
index 4a9eea66e2..5dd43ca064 100644
--- a/source/libs/qworker/src/qworker.c
+++ b/source/libs/qworker/src/qworker.c
@@ -746,9 +746,19 @@ _return:
}
int32_t qwAbortPrerocessQuery(QW_FPARAMS_DEF) {
- QW_ERR_RET(qwDropTask(QW_FPARAMS()));
+ int32_t code = TSDB_CODE_SUCCESS;
+ SQWTaskCtx *ctx = NULL;
- return TSDB_CODE_SUCCESS;
+ QW_ERR_RET(qwAcquireTaskCtx(QW_FPARAMS(), &ctx));
+
+ QW_LOCK(QW_WRITE, &ctx->lock);
+ QW_ERR_JRET(qwDropTask(QW_FPARAMS()));
+
+_return:
+
+ QW_UNLOCK(QW_WRITE, &ctx->lock);
+
+ return code;
}
int32_t qwPreprocessQuery(QW_FPARAMS_DEF, SQWMsg *qwMsg) {
From 49e1e05f2166e39177feca8a10edffc8150bbafd Mon Sep 17 00:00:00 2001
From: Shengliang Guan
Date: Mon, 30 Dec 2024 14:08:07 +0800
Subject: [PATCH 13/35] docs: username
---
docs/zh/14-reference/03-taos-sql/19-limit.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/zh/14-reference/03-taos-sql/19-limit.md b/docs/zh/14-reference/03-taos-sql/19-limit.md
index 4e2fa69a14..e5c03db2fd 100644
--- a/docs/zh/14-reference/03-taos-sql/19-limit.md
+++ b/docs/zh/14-reference/03-taos-sql/19-limit.md
@@ -37,6 +37,6 @@ description: 合法字符集和命名中的限制规则
- 库的数目,超级表的数目、表的数目,系统不做限制,仅受系统资源限制
- 数据库的副本数只能设置为 1 或 3
- 用户名的最大长度是 23 字节
-- 用户密码的最大长度是 31 字节
+- 用户密码的长度范围是 8-16 字节
- 总数据行数取决于可用资源
- 单个数据库的虚拟结点数上限为 1024
From 1742dd3c0f1dbf5281270c008498c779f7d172fb Mon Sep 17 00:00:00 2001
From: WANG Xu
Date: Mon, 30 Dec 2024 14:15:24 +0800
Subject: [PATCH 14/35] ci: add paths-ignore
---
.github/workflows/taosd-ci-build.yml | 9 ++++++---
1 file changed, 6 insertions(+), 3 deletions(-)
diff --git a/.github/workflows/taosd-ci-build.yml b/.github/workflows/taosd-ci-build.yml
index 0876f5b731..8d8a120d76 100644
--- a/.github/workflows/taosd-ci-build.yml
+++ b/.github/workflows/taosd-ci-build.yml
@@ -6,6 +6,10 @@ on:
- 'main'
- '3.0'
- '3.1'
+ paths-ignore:
+ - 'docs/**'
+ - 'packaging/**'
+ - 'tests/**'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -14,7 +18,7 @@ concurrency:
jobs:
build:
runs-on: ubuntu-latest
- name: Run unit tests
+ name: Build and test
steps:
- name: Checkout the repository
@@ -32,12 +36,11 @@ jobs:
libgeos-dev libjansson-dev libsnappy-dev liblzma-dev libz-dev \
zlib1g pkg-config libssl-dev gawk
-
- name: Build and install TDengine
run: |
mkdir debug && cd debug
cmake .. -DBUILD_HTTP=false -DBUILD_JDBC=false \
- -DBUILD_TOOLS=true -DBUILD_TEST=off \
+ -DBUILD_TOOLS=true -DBUILD_TEST=false \
-DBUILD_KEEPER=true -DBUILD_DEPENDENCY_TESTS=false
make -j 4
sudo make install
From aea32365978df8defb117d2b9372a260f4bd2abb Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 14:44:49 +0800
Subject: [PATCH 15/35] stmt2 document for python connector
---
docs/examples/python/insert_with_stmt2.py | 626 ++++++++++++++++++
.../14-reference/05-connector/30-python.mdx | 131 ++--
tests/docs-examples-test/python.sh | 2 +
3 files changed, 688 insertions(+), 71 deletions(-)
create mode 100644 docs/examples/python/insert_with_stmt2.py
diff --git a/docs/examples/python/insert_with_stmt2.py b/docs/examples/python/insert_with_stmt2.py
new file mode 100644
index 0000000000..433b283fe3
--- /dev/null
+++ b/docs/examples/python/insert_with_stmt2.py
@@ -0,0 +1,626 @@
+# encoding:UTF-8
+from ctypes import *
+from datetime import datetime
+# geometry support
+from shapely.wkb import dumps, loads
+from shapely.wkt import dumps as wkt_dumps, loads as wkt_loads
+
+import taos
+import math
+import traceback
+from taos.statement2 import *
+from taos.constants import FieldType
+from taos import log
+from taos import bind2
+
+# input WKT return WKB (bytes object)
+def WKB(wkt, hex = False):
+ if wkt is None:
+ return None
+ wkb = wkt_loads(wkt)
+ wkb_bytes = dumps(wkb, hex)
+ return wkb_bytes
+
+def compareLine(oris, rows):
+ n = len(oris)
+ if len(rows) != n:
+ return False
+ log.debug(f" len is {n} oris={oris} rows={rows}")
+ for i in range(n):
+ if oris[i] != rows[i]:
+ if type(rows[i]) == bool:
+ if bool(oris[i]) != rows[i]:
+ log.debug1(f" diff bool i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
+ return False
+ else:
+ log.debug1(f" float i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
+ elif type(rows[i]) == float:
+ if math.isclose(oris[i], rows[i], rel_tol=1e-3) is False:
+ log.debug1(f" diff float i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
+ return False
+ else:
+ log.debug1(f" float i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
+ else:
+ log.debug1(f" diff i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
+ return False
+ else:
+ log.debug1(f" i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
+
+ return True
+
+
+def checkResultCorrect(conn, sql, tagsTb, datasTb):
+ # column to rows
+ log.debug(f"check sql correct: {sql}\n")
+ oris = []
+ ncol = len(datasTb)
+ nrow = len(datasTb[0])
+
+ for i in range(nrow):
+ row = []
+ for j in range(ncol):
+ if j == 0:
+ # ts column
+ c0 = datasTb[j][i]
+ if type(c0) is int :
+ row.append(datasTb[j][i])
+ else:
+ ts = int(bind2.datetime_to_timestamp(c0, PrecisionEnum.Milliseconds).value)
+ row.append(ts)
+ else:
+ row.append(datasTb[j][i])
+
+ if tagsTb is not None:
+ row += tagsTb
+ oris.append(row)
+
+ # fetch all
+ lres = []
+ log.debug(sql)
+ res = conn.query(sql)
+ i = 0
+ for row in res:
+ lrow = list(row)
+ lrow[0] = int(lrow[0].timestamp()*1000)
+ if compareLine(oris[i], lrow) is False:
+ log.info(f"insert data differet. i={i} expect ori data={oris[i]} query from db ={lrow}")
+ raise(BaseException("check insert data correct failed."))
+ else:
+ log.debug(f"i={i} origin data same with get from db\n")
+ log.debug(f" origin data = {oris[i]} \n")
+ log.debug(f" get from db = {lrow} \n")
+ i += 1
+
+
+def checkResultCorrects(conn, dbname, stbname, tbnames, tags, datas):
+ count = len(tbnames)
+ for i in range(count):
+ if stbname is None:
+ sql = f"select * from {dbname}.{tbnames[i]} "
+ else:
+ sql = f"select * from {dbname}.{stbname} where tbname='{tbnames[i]}' "
+
+ checkResultCorrect(conn, sql, tags[i], datas[i])
+
+ print("insert data check correct ..................... ok\n")
+
+
+def prepare(conn, dbname, stbname, ntb1, ntb2):
+ conn.execute("drop database if exists %s" % dbname)
+ conn.execute("create database if not exists %s precision 'ms' " % dbname)
+ conn.select_db(dbname)
+ # stable
+ sql = f"create table if not exists {dbname}.{stbname}(ts timestamp, name binary(32), sex bool, score int) tags(grade nchar(8), class int)"
+ conn.execute(sql)
+ # normal table
+ sql = f"create table if not exists {dbname}.{ntb1} (ts timestamp, name varbinary(32), sex bool, score float, geo geometry(128))"
+ conn.execute(sql)
+ sql = f"create table if not exists {dbname}.{ntb2} (ts timestamp, name varbinary(32), sex bool, score float, geo geometry(128))"
+ conn.execute(sql)
+
+
+# performance is high
+def insert_bind_param(conn, stmt2, dbname, stbname):
+ #
+ # table info: write 6 rows to each of 3 child tables d1, d2, d3 of the super table
+ #
+ tbnames = ["d1","d2","d3"]
+
+ tags = [
+ ["grade1", 1],
+ ["grade1", None],
+ [None , 3]
+ ]
+ datas = [
+ # class 1
+ [
+ # student
+ [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004,1601481600005],
+ ["Mary", "Tom", "Jack", "Jane", "alex" ,None ],
+ [0, 1, 1, 0, 1 ,None ],
+ [98, 80, 60, 100, 99 ,None ]
+ ],
+ # class 2
+ [
+ # student
+ [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004,1601481600005],
+ ["Mary2", "Tom2", "Jack2", "Jane2", "alex2" ,None ],
+ [0, 1, 1, 0, 1 ,0 ],
+ [298, 280, 260, 2100, 299 ,None ]
+ ],
+ # class 3
+ [
+ # student
+ [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004,1601481600005],
+ ["Mary3", "Tom3", "Jack3", "Jane3", "alex3" ,"Mark" ],
+ [0, 1, 1, 0, 1 ,None ],
+ [398, 380, 360, 3100, 399 ,None ]
+ ]
+ ]
+
+ stmt2.bind_param(tbnames, tags, datas)
+ stmt2.execute()
+
+ # check correct
+ checkResultCorrects(conn, dbname, stbname, tbnames, tags, datas)
+
+
+def insert_bind_param_normal_tables(conn, stmt2, dbname, ntb):
+ tbnames = [ntb]
+ tags = None
+ wkts = [None, b"POINT(121.213 31.234)", b"POINT(122.22 32.222)", None, b"POINT(124.22 34.222)"]
+ wkbs = [WKB(wkt) for wkt in wkts]
+
+ datas = [
+ # table 1
+ [
+ # student
+ [1601481600000,1601481600004,"2024-09-19 10:00:00", "2024-09-19 10:00:01.123", datetime(2024,9,20,10,11,12,456)],
+ [b"Mary", b"tom", b"Jack", b"Jane", None ],
+ [0, 3.14, True, 0, 1 ],
+ [98, 99.87, 60, 100, 99 ],
+ wkbs
+ ]
+ ]
+
+ stmt2.bind_param(tbnames, tags, datas)
+ stmt2.execute()
+
+ # check correct
+ checkResultCorrects(conn, dbname, None, tbnames, [None], datas)
+
+def insert_bind_param_with_table(conn, stmt2, dbname, stbname, ctb):
+
+ tbnames = None
+ tags = [
+ ["grade2", 1]
+ ]
+
+ # prepare data
+ datas = [
+ # table 1
+ [
+ # student
+ [1601481600000,1601481600004,"2024-09-19 10:00:00", "2024-09-19 10:00:01.123", datetime(2024,9,20,10,11,12,456)],
+ ["Mary", "Tom", "Jack", "Jane", "alex" ],
+ [0, 1, 1, 0, 1 ],
+ [98, 80, 60, 100, 99 ]
+ ]
+ ]
+
+ stmt2.bind_param(tbnames, tags, datas)
+ stmt2.execute()
+
+ # check correct
+ checkResultCorrects(conn, dbname, stbname, [ctb], tags, datas)
+
+
+# insert with single table (performance is lower)
+def insert_bind_param_with_tables(conn, stmt2, dbname, stbname):
+
+ tbnames = ["t1", "t2", "t3"]
+ tags = [
+ ["grade2", 1],
+ ["grade2", 2],
+ ["grade2", 3]
+ ]
+
+ # prepare data
+ datas = [
+ # table 1
+ [
+ # student
+ [1601481600000,1601481600004,"2024-09-19 10:00:00", "2024-09-19 10:00:01.123", datetime(2024,9,20,10,11,12,456)],
+ ["Mary", "Tom", "Jack", "Jane", "alex" ],
+ [0, 1, 1, 0, 1 ],
+ [98, 80, 60, 100, 99 ]
+ ],
+ # table 2
+ [
+ # student
+ [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004],
+ ["Mary2", "Tom2", "Jack2", "Jane2", "alex2" ],
+ [0, 1, 1, 0, 1 ],
+ [298, 280, 260, 2100, 299 ]
+ ],
+ # table 3
+ [
+ # student
+ [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004],
+ ["Mary3", "Tom3", "Jack3", "Jane3", "alex3" ],
+ [0, 1, 1, 0, 1 ],
+ [398, 380, 360, 3100, 399 ]
+ ]
+ ]
+
+ table0 = BindTable(tbnames[0], tags[0])
+ table1 = BindTable(tbnames[1], tags[1])
+ table2 = BindTable(tbnames[2], tags[2])
+
+ for data in datas[0]:
+ table0.add_col_data(data)
+ for data in datas[1]:
+ table1.add_col_data(data)
+ for data in datas[2]:
+ table2.add_col_data(data)
+
+ # bind with single table
+ stmt2.bind_param_with_tables([table0, table1, table2])
+ stmt2.execute()
+
+ # check correct
+ checkResultCorrects(conn, dbname, stbname, tbnames, tags, datas)
+
+
+def do_check_invalid(stmt2, tbnames, tags, datas):
+ table0 = BindTable(tbnames[0], tags[0])
+ table1 = BindTable(tbnames[1], tags[1])
+ table2 = BindTable(tbnames[2], tags[2])
+
+ for data in datas[0]:
+ table0.add_col_data(data)
+ for data in datas[1]:
+ table1.add_col_data(data)
+ for data in datas[2]:
+ table2.add_col_data(data)
+
+ # bind with single table
+ try:
+ stmt2.bind_param_with_tables([table0, table1, table2])
+ stmt2.execute()
+ except Exception as err:
+ #traceback.print_stack()
+ print(f"failed to do_check_invalid. err={err}")
+ return
+
+ print(f"input invalid data passed , unexpect. \ntbnames={tbnames}\ntags={tags} \ndatas={datas} \n")
+ assert False
+
+
+def check_input_invalid_param(conn, stmt2, dbname, stbname):
+
+ tbnames = ["t1", "t2", "t3"]
+ tags = [
+ ["grade2", 1],
+ ["grade2", 2],
+ ["grade2", 3]
+ ]
+
+ # prepare data
+ datas = [
+ # table 1
+ [
+ # student
+ [1601481600000,1601481600004,"2024-09-19 10:00:00", "2024-09-19 10:00:01.123", datetime(2024,9,20,10,11,12,456)],
+ ["Mary", "Tom", "Jack", "Jane", "alex" ],
+ [0, 1, 1, 0, 1 ],
+ [98, 80, 60, 100, 99 ]
+ ],
+ # table 2
+ [
+ # student
+ [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004],
+ ["Mary2", "Tom2", "Jack2", "Jane2", "alex2" ],
+ [0, 1, 1, 0, 1 ],
+ [298, 280, 260, 2100, 299 ]
+ ],
+ # table 3
+ [
+ # student
+ [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004],
+ ["Mary3", "Tom3", "Jack3", "Jane3", "alex3" ],
+ [0, 1, 1, 0, 1 ],
+ [398, 380, 360, 3100, 399 ]
+ ]
+ ]
+
+ # some tags is none
+ tags1 = [ ["grade2", 1], None, ["grade2", 3] ]
+ do_check_invalid(stmt2, tbnames, tags1, datas)
+
+ # timestamp is over range
+ origin = datas[0][0][0]
+ datas[0][0][0] = 100000000000000000000000
+ do_check_invalid(stmt2, tbnames, tags, datas)
+ datas[0][0][0] = origin # restore
+
+
+# insert with single table (performance is lower)
+def insert_with_normal_tables(conn, stmt2, dbname, ntb):
+
+ tbnames = [ntb]
+ tags = [None]
+ # prepare data
+
+ wkts = [None, "POINT(121.213 31.234)", "POINT(122.22 32.222)", None, "POINT(124.22 34.222)"]
+ wkbs = [WKB(wkt) for wkt in wkts]
+
+ datas = [
+ # table 1
+ [
+ # student
+ [1601481600000,1601481600004,"2024-09-19 10:00:00", "2024-09-19 10:00:01.123", datetime(2024,9,20,10,11,12,456)],
+ [b"Mary", b"tom", b"Jack", b"Jane", None ],
+ [0, 3.14, True, 0, 1 ],
+ [98, 99.87, 60, 100, 99 ],
+ wkbs
+ ]
+ ]
+
+ table0 = BindTable(tbnames[0], tags[0])
+ for data in datas[0]:
+ table0.add_col_data(data)
+
+ # bind with single table
+ stmt2.bind_param_with_tables([table0])
+ stmt2.execute()
+
+ # check correct
+ checkResultCorrects(conn, dbname, None, tbnames, tags, datas)
+
+
+def test_stmt2_prepare_empty_sql(conn):
+ if not IS_V3:
+ print(" test_stmt2_prepare_empty_sql not support TDengine 2.X version.")
+ return
+
+ try:
+ # prepare
+ stmt2 = conn.statement2()
+ stmt2.prepare(sql='')
+
+ # should not run here
+ conn.close()
+ print("prepare empty sql ............................. failed\n")
+ assert False
+
+ except StatementError as err:
+ print("prepare empty sql ............................. ok\n")
+ conn.close()
+
+
+def test_bind_invalid_tbnames_type():
+ if not IS_V3:
+ print(" test_bind_invalid_tbnames_type not support TDengine 2.X version.")
+ return
+
+ dbname = "stmt2"
+ stbname = "stmt2_stable"
+ subtbname = "stmt2_subtable"
+
+ try:
+ conn = taos.connect()
+ conn.execute(f"drop database if exists {dbname}")
+ conn.execute(f"create database {dbname}")
+ conn.select_db(dbname)
+ conn.execute(f"create stable {stbname} (ts timestamp, a int) tags (b int);")
+ conn.execute(f"create table {subtbname} using {stbname} tags(0);")
+
+ stmt2 = conn.statement2(f"insert into ? using {dbname}.{stbname} tags(?) values(?,?)")
+
+ tags = [[1]]
+ datas = [[[1626861392589], [1]]]
+
+ stmt2.bind_param(subtbname, tags, datas)
+
+ # should not run here
+ conn.close()
+ print("bind invalid tbnames type ..................... failed\n")
+ assert False
+
+ except StatementError as err:
+ print("bind invalid tbnames type ..................... ok\n")
+ conn.close()
+
+
+#
+# insert
+#
+def test_stmt2_insert(conn):
+ if not IS_V3:
+ print(" test_stmt2_query not support TDengine 2.X version.")
+ return
+
+ dbname = "stmt2"
+ stbname = "meters"
+ ntb1 = "ntb1"
+ ntb2 = "ntb2"
+
+ try:
+ prepare(conn, dbname, stbname, ntb1, ntb2)
+
+ ctb = 'ctb' # child table
+ stmt2 = conn.statement2(f"insert into {dbname}.{ctb} using {dbname}.{stbname} tags (?,?) values(?,?,?,?)")
+ insert_bind_param_with_table(conn, stmt2, dbname, stbname, ctb)
+ print("insert child table ........................... ok\n")
+ stmt2.close()
+
+ # # prepare
+ # stmt2 = conn.statement2(f"insert into ? using {dbname}.{stbname} tags(?,?) values(?,?,?,?)")
+ # print("insert prepare sql ............................ ok\n")
+ #
+ # # insert with table
+ # insert_bind_param_with_tables(conn, stmt2, dbname, stbname)
+ # print("insert bind with tables ....................... ok\n")
+ # check_input_invalid_param(conn, stmt2, dbname, stbname)
+ # print("check input invalid params .................... ok\n")
+ #
+ # # insert with split args
+ # insert_bind_param(conn, stmt2, dbname, stbname)
+ # print("insert bind ................................... ok\n")
+ # print("insert execute ................................ ok\n")
+ # stmt2.close()
+
+ # ntb1
+ stmt2 = conn.statement2(f"insert into {dbname}.{ntb1} values(?,?,?,?,?)")
+ insert_with_normal_tables(conn, stmt2, dbname, ntb1)
+ print("insert normal tables .......................... ok\n")
+ stmt2.close()
+
+ # ntb2
+ stmt2 = conn.statement2(f"insert into {dbname}.{ntb2} values(?,?,?,?,?)")
+ insert_bind_param_normal_tables(conn, stmt2, dbname, ntb2)
+ print("insert normal tables (bind param) ............. ok\n")
+ stmt2.close()
+
+ conn.close()
+ print("test_stmt2_insert ............................. [passed]\n")
+ except Exception as err:
+ #conn.execute("drop database if exists %s" % dbname)
+ print("test_stmt2_insert ............................. failed\n")
+ conn.close()
+ raise err
+
+
+#
+# ------------------------ query -------------------
+#
+def query_bind_param(conn, stmt2):
+ # set param
+ #tbnames = ["d2"]
+ tbnames = None
+ tags = None
+ datas = [
+ # class 1
+ [
+ # where name in ('Tom2','alex2') or score > 1000;"
+ ["Tom2"],
+ [1000]
+ ]
+ ]
+
+ # set param
+ types = [FieldType.C_BINARY, FieldType.C_INT]
+ stmt2.set_columns_type(types)
+
+ # bind
+ stmt2.bind_param(tbnames, tags, datas)
+
+
+# compare
+def compare_result(conn, sql2, res2):
+ lres1 = []
+ lres2 = []
+
+ # show res2
+ for row in res2:
+ log.debug(f" res2 rows = {row} \n")
+ lres2.append(row)
+
+ res1 = conn.query(sql2)
+ for row in res1:
+ log.debug(f" res1 rows = {row} \n")
+ lres1.append(row)
+
+ row1 = len(lres1)
+ row2 = len(lres2)
+ col1 = len(lres1[0])
+ col2 = len(lres2[0])
+
+ # check number
+ if row1 != row2:
+ err = f"two results row count different. row1={row1} row2={row2}"
+ raise(BaseException(err))
+ if col1 != col2:
+ err = f" two results column count different. col1={col1} col2={col2}"
+ raise(BaseException(err))
+
+ for i in range(row1):
+ for j in range(col1):
+ if lres1[i][j] != lres2[i][j]:
+ raise(f" two results data different. i={i} j={j} data1={res1[i][j]} data2={res2[i][j]}\n")
+
+# query
+def test_stmt2_query(conn):
+ if not IS_V3:
+ print(" test_stmt2_query not support TDengine 2.X version.")
+ return
+
+ dbname = "stmt2"
+ stbname = "meters"
+ ntb1 = "ntb1"
+ ntb2 = "ntb2"
+ sql1 = f"select * from {dbname}.d2 where name in (?) or score > ? ;"
+ sql2 = f"select * from {dbname}.d2 where name in ('Tom2') or score > 1000;"
+
+ try:
+ # prepare
+ prepare(conn, dbname, stbname, ntb1, ntb2)
+
+ # prepare
+ # stmt2 = conn.statement2(f"insert into ? using {dbname}.{stbname} tags(?,?) values(?,?,?,?)")
+ # insert_bind_param_with_tables(conn, stmt2, dbname, stbname)
+ # insert_bind_param(conn, stmt2, dbname, stbname)
+ # stmt2.close()
+ # print("insert bind & execute ......................... ok\n")
+
+ conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.000', 'Mary2', false, 298)")
+ conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.001', 'Tom2', true, 280)")
+ conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.002', 'Jack2', true, 260)")
+ conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.003', 'Jane2', false, 2100)")
+ conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.004', 'alex2', true, 299)")
+ conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.005', NULL, false, NULL)")
+
+
+ # statement2
+ stmt2 = conn.statement2(sql1)
+ print("query prepare sql ............................. ok\n")
+
+
+ # insert with table
+ #insert_bind_param_with_tables(conn, stmt2)
+
+
+ # bind
+ query_bind_param(conn, stmt2)
+ print("query bind param .............................. ok\n")
+
+ # query execute
+ stmt2.execute()
+
+ # fetch result
+ res2 = stmt2.result()
+
+ # check result
+ compare_result(conn, sql2, res2)
+ print("query check corrent ........................... ok\n")
+
+ #conn.execute("drop database if exists %s" % dbname)
+ stmt2.close()
+ conn.close()
+ print("test_stmt2_query .............................. [passed]\n")
+
+ except Exception as err:
+ print("query ......................................... failed\n")
+ conn.close()
+ raise err
+
+
+if __name__ == "__main__":
+ print("start stmt2 test case...\n")
+ taos.log.setting(True, True, True, True, True, False)
+ # insert
+ test_stmt2_insert(taos.connect())
+ # query
+ test_stmt2_query(taos.connect())
+ print("end stmt2 test case.\n")
\ No newline at end of file
diff --git a/docs/zh/14-reference/05-connector/30-python.mdx b/docs/zh/14-reference/05-connector/30-python.mdx
index d724fc796c..c91e9775f4 100644
--- a/docs/zh/14-reference/05-connector/30-python.mdx
+++ b/docs/zh/14-reference/05-connector/30-python.mdx
@@ -246,43 +246,43 @@ TaosResult 对象可以通过循环遍历获取查询到的数据。
- `reqId`: 用于问题追踪。
- **异常**:操作失败抛出 `DataError` 或 `OperationalError` 异常。
-#### 参数绑定
-- `fn statement(&self) -> PyResult`
- - **接口说明**:使用 连接 对象创建 stmt 对象。
- - **返回值**:stmt 对象。
+#### 参数绑定 STMT2
+- `def statement2(self, sql=None, option=None)`
+ - **接口说明**:使用连接对象创建 stmt2 对象
+ - **参数说明**
+ - `sql`: 绑定的 SQL 语句,如果不为空会调用`prepare`函数
+ - `option`: 传入 TaosStmt2Option 类实例选项
+ - **返回值**:stmt2 对象。
- **异常**:操作失败抛出 `ConnectionError` 异常。
-- `fn prepare(&mut self, sql: &str) -> PyResult<()>`
- - **接口说明**:绑定预编译 sql 语句。
+- `def prepare(self, sql)`
+ - **接口说明**:绑定预编译 sql 语句
- **参数说明**:
- - `sql`: 预编译的 SQL 语句。
+ - `sql`: 绑定的 SQL 语句
- **异常**:操作失败抛出 `ProgrammingError` 异常。
-- `fn set_tbname(&mut self, table_name: &str) -> PyResult<()>`
- - **接口说明**:设置将要写入数据的表名。
+- `def bind_param(self, tbnames, tags, datas)`
+ - **接口说明**:以独立数组方式绑定数据
- **参数说明**:
- - `tableName`: 表名,如果需要指定数据库, 例如: `db_name.table_name` 即可。
- - **异常**:操作失败抛出 `ProgrammingError` 异常。
-- `fn set_tags(&mut self, tags: Vec) -> PyResult<()>`
- - **接口说明**:设置表 Tags 数据, 用于自动建表。
+ - `tbnames`: 绑定表名数组,数据类型为 list
+ - `tags`: 绑定 tag 列值数组,数据类型为 list
+ - `datas`: 绑定普通列值数组,数据类型为 list
+ - **异常**:操作失败抛出 `StatementError` 异常
+- `def bind_param_with_tables(self, tables)`
+ - **接口说明**:以独立表方式绑定数据,独立表是以表为组织单位,每张表中有表名,TAG 值及普通列数值属性
- **参数说明**:
- - `paramsArray`: Tags 数据。
- - **异常**:操作失败抛出 `ProgrammingError` 异常。
-- `fn bind_param(&mut self, params: Vec) -> PyResult<()>`
- - **接口说明**:绑定数据。
- - **参数说明**:
- - `paramsArray`: 绑定数据。
- - **异常**:操作失败抛出 `ProgrammingError` 异常。
-- `fn add_batch(&mut self) -> PyResult<()>`
- - **接口说明**:提交绑定数据。
- - **异常**:操作失败抛出 `ProgrammingError` 异常。
-- `fn execute(&mut self) -> PyResult`
- - **接口说明**:执行将绑定的数据全部写入。
- - **返回值**:写入条数。
+ - `tables`: `BindTable` 独立表对象数组
+ - **异常**:操作失败抛出 `StatementError` 异常。
+- `def execute(self) -> int:`
+ - **接口说明**:执行将绑定数据全部写入
+ - **返回值**:影响行数
- **异常**:操作失败抛出 `QueryError` 异常。
-- `fn affect_rows(&mut self) -> PyResult`
- - **接口说明**: 获取写入条数。
- - **返回值**:写入条数。
-- `fn close(&self) -> PyResult<()>`
- - **接口说明**: 关闭 stmt 对象。
+- `def result(self)`
+ - **接口说明**: 获取参数绑定查询结果集
+ - **返回值**:返回 TaosResult 对象
+- `def close(self)`
+ - **接口说明**: 关闭 stmt2 对象
+
+[示例](http://example.code.stmt2)
+
#### 数据订阅
- **创建消费者支持属性列表**:
@@ -423,51 +423,40 @@ TaosResult 对象可以通过循环遍历获取查询到的数据。
- **返回值**:影响的条数。
- **异常**:操作失败抛出 `SchemalessError` 异常。
-#### 参数绑定
-- `def statement(self, sql=None)`
- - **接口说明**:使用连接对象创建 stmt 对象, 如果 sql 不空会进行调用 prepare。
- - `sql`: 预编译的 SQL 语句。
- - **返回值**:stmt 对象。
- - **异常**:操作失败抛出 `StatementError` 异常。
+#### 参数绑定 STMT2
+- `def statement2(self, sql=None, option=None)`
+ - **接口说明**:使用连接对象创建 stmt2 对象
+ - **参数说明**
+ - `sql`: 绑定的 SQL 语句,如果不为空会调用`prepare`函数
+ - `option`: 传入 TaosStmt2Option 类实例选项
+ - **返回值**:stmt2 对象。
+ - **异常**:操作失败抛出 `ConnectionError` 异常。
- `def prepare(self, sql)`
- - **接口说明**:绑定预编译 sql 语句。
+ - **接口说明**:绑定预编译 sql 语句
- **参数说明**:
- - `sql`: 预编译的 SQL 语句。
- - **异常**:操作失败抛出 `StatementError` 异常。
-- `def set_tbname(self, name)`
- - **接口说明**:设置将要写入数据的表名。
+ - `sql`: 绑定的 SQL 语句
+ - **异常**:操作失败抛出 `ProgrammingError` 异常。
+- `def bind_param(self, tbnames, tags, datas)`
+ - **接口说明**:以独立数组方式绑定数据
- **参数说明**:
- - `name`: 表名,如果需要指定数据库, 例如: `db_name.table_name` 即可。
- - **异常**:操作失败抛出 `StatementError` 异常。
-- `def set_tbname_tags(self, name, tags):`
- - **接口说明**:设置表和 Tags 数据, 用于自动建表。
+ - `tbnames`: 绑定表名数组,数据类型为 list
+ - `tags`: 绑定 tag 列值数组,数据类型为 list
+ - `datas`: 绑定普通列值数组,数据类型为 list
+ - **异常**:操作失败抛出 `StatementError` 异常
+- `def bind_param_with_tables(self, tables)`
+ - **接口说明**:以独立表方式绑定数据,独立表是以表为组织单位,每张表中有表名,TAG 值及普通列数值属性
- **参数说明**:
- - `name`: 表名,如果需要指定数据库, 例如: `db_name.table_name` 即可。
- - `tags`: Tags 数据。
+ - `tables`: `BindTable` 独立表对象数组
- **异常**:操作失败抛出 `StatementError` 异常。
-- `def bind_param(self, params, add_batch=True)`
- - **接口说明**:绑定一组数据并提交。
- - **参数说明**:
- - `params`: 绑定数据。
- - `add_batch`: 是否提交绑定数据。
- - **异常**:操作失败抛出 `StatementError` 异常。
-- `def bind_param_batch(self, binds, add_batch=True)`
- - **接口说明**:绑定多组数据并提交。
- - **参数说明**:
- - `binds`: 绑定数据。
- - `add_batch`: 是否提交绑定数据。
- - **异常**:操作失败抛出 `StatementError` 异常。
-- `def add_batch(self)`
- - **接口说明**:提交绑定数据。
- - **异常**:操作失败抛出 `StatementError` 异常。
-- `def execute(self)`
- - **接口说明**:执行将绑定的数据全部写入。
- - **异常**:操作失败抛出 `StatementError` 异常。
-- `def affected_rows(self)`
- - **接口说明**: 获取写入条数。
- - **返回值**:写入条数。
-- `def close(&self)`
- - **接口说明**: 关闭 stmt 对象。
+- `def execute(self) -> int:`
+ - **接口说明**:执行将绑定数据全部写入
+ - **返回值**:影响行数
+ - **异常**:操作失败抛出 `QueryError` 异常。
+- `def result(self)`
+ - **接口说明**: 获取参数绑定查询结果集
+ - **返回值**:返回 TaosResult 对象
+- `def close(self)`
+ - **接口说明**: 关闭 stmt2 对象
#### 数据订阅
- **创建消费者支持属性列表**:
diff --git a/tests/docs-examples-test/python.sh b/tests/docs-examples-test/python.sh
index 3a9812637c..f7f94db1f2 100644
--- a/tests/docs-examples-test/python.sh
+++ b/tests/docs-examples-test/python.sh
@@ -196,3 +196,5 @@ check_transactions || exit 1
reset_cache || exit 1
python3 tmq_websocket_example.py
+# stmt2
+python3 insert_with_stmt2.py
\ No newline at end of file
From e6ec05e155e8ef7f0088595ca7187176db664a59 Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 14:59:26 +0800
Subject: [PATCH 16/35] docs: update document for python
---
docs/zh/14-reference/05-connector/30-python.mdx | 8 +++++---
1 file changed, 5 insertions(+), 3 deletions(-)
diff --git a/docs/zh/14-reference/05-connector/30-python.mdx b/docs/zh/14-reference/05-connector/30-python.mdx
index c91e9775f4..0c15d866d5 100644
--- a/docs/zh/14-reference/05-connector/30-python.mdx
+++ b/docs/zh/14-reference/05-connector/30-python.mdx
@@ -258,7 +258,7 @@ TaosResult 对象可以通过循环遍历获取查询到的数据。
- **接口说明**:绑定预编译 sql 语句
- **参数说明**:
- `sql`: 绑定的 SQL 语句
- - **异常**:操作失败抛出 `ProgrammingError` 异常。
+ - **异常**:操作失败抛出 `StatementError` 异常。
- `def bind_param(self, tbnames, tags, datas)`
- **接口说明**:以独立数组方式绑定数据
- **参数说明**:
@@ -281,7 +281,7 @@ TaosResult 对象可以通过循环遍历获取查询到的数据。
- `def close(self)`
- **接口说明**: 关闭 stmt2 对象
-[示例](http://example.code.stmt2)
+[使用示例](https://github.com/taosdata/TDengine/tree/main/docs/examples/python/insert_with_stmt2.py)
#### 数据订阅
@@ -435,7 +435,7 @@ TaosResult 对象可以通过循环遍历获取查询到的数据。
- **接口说明**:绑定预编译 sql 语句
- **参数说明**:
- `sql`: 绑定的 SQL 语句
- - **异常**:操作失败抛出 `ProgrammingError` 异常。
+ - **异常**:操作失败抛出 `StatementError` 异常。
- `def bind_param(self, tbnames, tags, datas)`
- **接口说明**:以独立数组方式绑定数据
- **参数说明**:
@@ -458,6 +458,8 @@ TaosResult 对象可以通过循环遍历获取查询到的数据。
- `def close(self)`
- **接口说明**: 关闭 stmt2 对象
+[使用示例](https://github.com/taosdata/TDengine/tree/main/docs/examples/python/insert_with_stmt2.py)
+
#### 数据订阅
- **创建消费者支持属性列表**:
- td.connect.ip:主机地址。
From defa7ffc42c8bd2ae08ac7106dea5566c815bf03 Mon Sep 17 00:00:00 2001
From: WANG Xu
Date: Mon, 30 Dec 2024 15:00:41 +0800
Subject: [PATCH 17/35] chore: remove wrong build option
skip-checks: true
---
.github/workflows/taosd-ci-build.yml | 8 +++++---
1 file changed, 5 insertions(+), 3 deletions(-)
diff --git a/.github/workflows/taosd-ci-build.yml b/.github/workflows/taosd-ci-build.yml
index 8d8a120d76..cd5f1eeeae 100644
--- a/.github/workflows/taosd-ci-build.yml
+++ b/.github/workflows/taosd-ci-build.yml
@@ -39,9 +39,11 @@ jobs:
- name: Build and install TDengine
run: |
mkdir debug && cd debug
- cmake .. -DBUILD_HTTP=false -DBUILD_JDBC=false \
- -DBUILD_TOOLS=true -DBUILD_TEST=false \
- -DBUILD_KEEPER=true -DBUILD_DEPENDENCY_TESTS=false
+ cmake .. -DBUILD_TOOLS=true \
+ -DBUILD_KEEPER=true \
+ -DBUILD_HTTP=false \
+ -DBUILD_TEST=false \
+ -DBUILD_DEPENDENCY_TESTS=false
make -j 4
sudo make install
which taosd
From f0627fd7e8042102a9843f205d741928062a0342 Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 15:25:25 +0800
Subject: [PATCH 18/35] fix: remove tdengine case to python-connector repo
---
docs/examples/python/insert_with_stmt2.py | 626 ------------------
.../14-reference/05-connector/30-python.mdx | 8 +-
tests/docs-examples-test/python.sh | 2 -
3 files changed, 4 insertions(+), 632 deletions(-)
delete mode 100644 docs/examples/python/insert_with_stmt2.py
diff --git a/docs/examples/python/insert_with_stmt2.py b/docs/examples/python/insert_with_stmt2.py
deleted file mode 100644
index 433b283fe3..0000000000
--- a/docs/examples/python/insert_with_stmt2.py
+++ /dev/null
@@ -1,626 +0,0 @@
-# encoding:UTF-8
-from ctypes import *
-from datetime import datetime
-# geometry support
-from shapely.wkb import dumps, loads
-from shapely.wkt import dumps as wkt_dumps, loads as wkt_loads
-
-import taos
-import math
-import traceback
-from taos.statement2 import *
-from taos.constants import FieldType
-from taos import log
-from taos import bind2
-
-# input WKT return WKB (bytes object)
-def WKB(wkt, hex = False):
- if wkt is None:
- return None
- wkb = wkt_loads(wkt)
- wkb_bytes = dumps(wkb, hex)
- return wkb_bytes
-
-def compareLine(oris, rows):
- n = len(oris)
- if len(rows) != n:
- return False
- log.debug(f" len is {n} oris={oris} rows={rows}")
- for i in range(n):
- if oris[i] != rows[i]:
- if type(rows[i]) == bool:
- if bool(oris[i]) != rows[i]:
- log.debug1(f" diff bool i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
- return False
- else:
- log.debug1(f" float i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
- elif type(rows[i]) == float:
- if math.isclose(oris[i], rows[i], rel_tol=1e-3) is False:
- log.debug1(f" diff float i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
- return False
- else:
- log.debug1(f" float i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
- else:
- log.debug1(f" diff i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
- return False
- else:
- log.debug1(f" i={i} oris[i] == rows[i] {oris[i]} == {rows[i]}")
-
- return True
-
-
-def checkResultCorrect(conn, sql, tagsTb, datasTb):
- # column to rows
- log.debug(f"check sql correct: {sql}\n")
- oris = []
- ncol = len(datasTb)
- nrow = len(datasTb[0])
-
- for i in range(nrow):
- row = []
- for j in range(ncol):
- if j == 0:
- # ts column
- c0 = datasTb[j][i]
- if type(c0) is int :
- row.append(datasTb[j][i])
- else:
- ts = int(bind2.datetime_to_timestamp(c0, PrecisionEnum.Milliseconds).value)
- row.append(ts)
- else:
- row.append(datasTb[j][i])
-
- if tagsTb is not None:
- row += tagsTb
- oris.append(row)
-
- # fetch all
- lres = []
- log.debug(sql)
- res = conn.query(sql)
- i = 0
- for row in res:
- lrow = list(row)
- lrow[0] = int(lrow[0].timestamp()*1000)
- if compareLine(oris[i], lrow) is False:
- log.info(f"insert data differet. i={i} expect ori data={oris[i]} query from db ={lrow}")
- raise(BaseException("check insert data correct failed."))
- else:
- log.debug(f"i={i} origin data same with get from db\n")
- log.debug(f" origin data = {oris[i]} \n")
- log.debug(f" get from db = {lrow} \n")
- i += 1
-
-
-def checkResultCorrects(conn, dbname, stbname, tbnames, tags, datas):
- count = len(tbnames)
- for i in range(count):
- if stbname is None:
- sql = f"select * from {dbname}.{tbnames[i]} "
- else:
- sql = f"select * from {dbname}.{stbname} where tbname='{tbnames[i]}' "
-
- checkResultCorrect(conn, sql, tags[i], datas[i])
-
- print("insert data check correct ..................... ok\n")
-
-
-def prepare(conn, dbname, stbname, ntb1, ntb2):
- conn.execute("drop database if exists %s" % dbname)
- conn.execute("create database if not exists %s precision 'ms' " % dbname)
- conn.select_db(dbname)
- # stable
- sql = f"create table if not exists {dbname}.{stbname}(ts timestamp, name binary(32), sex bool, score int) tags(grade nchar(8), class int)"
- conn.execute(sql)
- # normal table
- sql = f"create table if not exists {dbname}.{ntb1} (ts timestamp, name varbinary(32), sex bool, score float, geo geometry(128))"
- conn.execute(sql)
- sql = f"create table if not exists {dbname}.{ntb2} (ts timestamp, name varbinary(32), sex bool, score float, geo geometry(128))"
- conn.execute(sql)
-
-
-# performance is high
-def insert_bind_param(conn, stmt2, dbname, stbname):
- #
- # table info: write 6 rows to each of 3 child tables d1, d2, d3 of the super table
- #
- tbnames = ["d1","d2","d3"]
-
- tags = [
- ["grade1", 1],
- ["grade1", None],
- [None , 3]
- ]
- datas = [
- # class 1
- [
- # student
- [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004,1601481600005],
- ["Mary", "Tom", "Jack", "Jane", "alex" ,None ],
- [0, 1, 1, 0, 1 ,None ],
- [98, 80, 60, 100, 99 ,None ]
- ],
- # class 2
- [
- # student
- [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004,1601481600005],
- ["Mary2", "Tom2", "Jack2", "Jane2", "alex2" ,None ],
- [0, 1, 1, 0, 1 ,0 ],
- [298, 280, 260, 2100, 299 ,None ]
- ],
- # class 3
- [
- # student
- [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004,1601481600005],
- ["Mary3", "Tom3", "Jack3", "Jane3", "alex3" ,"Mark" ],
- [0, 1, 1, 0, 1 ,None ],
- [398, 380, 360, 3100, 399 ,None ]
- ]
- ]
-
- stmt2.bind_param(tbnames, tags, datas)
- stmt2.execute()
-
- # check correct
- checkResultCorrects(conn, dbname, stbname, tbnames, tags, datas)
-
-
-def insert_bind_param_normal_tables(conn, stmt2, dbname, ntb):
- tbnames = [ntb]
- tags = None
- wkts = [None, b"POINT(121.213 31.234)", b"POINT(122.22 32.222)", None, b"POINT(124.22 34.222)"]
- wkbs = [WKB(wkt) for wkt in wkts]
-
- datas = [
- # table 1
- [
- # student
- [1601481600000,1601481600004,"2024-09-19 10:00:00", "2024-09-19 10:00:01.123", datetime(2024,9,20,10,11,12,456)],
- [b"Mary", b"tom", b"Jack", b"Jane", None ],
- [0, 3.14, True, 0, 1 ],
- [98, 99.87, 60, 100, 99 ],
- wkbs
- ]
- ]
-
- stmt2.bind_param(tbnames, tags, datas)
- stmt2.execute()
-
- # check correct
- checkResultCorrects(conn, dbname, None, tbnames, [None], datas)
-
-def insert_bind_param_with_table(conn, stmt2, dbname, stbname, ctb):
-
- tbnames = None
- tags = [
- ["grade2", 1]
- ]
-
- # prepare data
- datas = [
- # table 1
- [
- # student
- [1601481600000,1601481600004,"2024-09-19 10:00:00", "2024-09-19 10:00:01.123", datetime(2024,9,20,10,11,12,456)],
- ["Mary", "Tom", "Jack", "Jane", "alex" ],
- [0, 1, 1, 0, 1 ],
- [98, 80, 60, 100, 99 ]
- ]
- ]
-
- stmt2.bind_param(tbnames, tags, datas)
- stmt2.execute()
-
- # check correct
- checkResultCorrects(conn, dbname, stbname, [ctb], tags, datas)
-
-
-# insert with single table (performance is lower)
-def insert_bind_param_with_tables(conn, stmt2, dbname, stbname):
-
- tbnames = ["t1", "t2", "t3"]
- tags = [
- ["grade2", 1],
- ["grade2", 2],
- ["grade2", 3]
- ]
-
- # prepare data
- datas = [
- # table 1
- [
- # student
- [1601481600000,1601481600004,"2024-09-19 10:00:00", "2024-09-19 10:00:01.123", datetime(2024,9,20,10,11,12,456)],
- ["Mary", "Tom", "Jack", "Jane", "alex" ],
- [0, 1, 1, 0, 1 ],
- [98, 80, 60, 100, 99 ]
- ],
- # table 2
- [
- # student
- [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004],
- ["Mary2", "Tom2", "Jack2", "Jane2", "alex2" ],
- [0, 1, 1, 0, 1 ],
- [298, 280, 260, 2100, 299 ]
- ],
- # table 3
- [
- # student
- [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004],
- ["Mary3", "Tom3", "Jack3", "Jane3", "alex3" ],
- [0, 1, 1, 0, 1 ],
- [398, 380, 360, 3100, 399 ]
- ]
- ]
-
- table0 = BindTable(tbnames[0], tags[0])
- table1 = BindTable(tbnames[1], tags[1])
- table2 = BindTable(tbnames[2], tags[2])
-
- for data in datas[0]:
- table0.add_col_data(data)
- for data in datas[1]:
- table1.add_col_data(data)
- for data in datas[2]:
- table2.add_col_data(data)
-
- # bind with single table
- stmt2.bind_param_with_tables([table0, table1, table2])
- stmt2.execute()
-
- # check correct
- checkResultCorrects(conn, dbname, stbname, tbnames, tags, datas)
-
-
-def do_check_invalid(stmt2, tbnames, tags, datas):
- table0 = BindTable(tbnames[0], tags[0])
- table1 = BindTable(tbnames[1], tags[1])
- table2 = BindTable(tbnames[2], tags[2])
-
- for data in datas[0]:
- table0.add_col_data(data)
- for data in datas[1]:
- table1.add_col_data(data)
- for data in datas[2]:
- table2.add_col_data(data)
-
- # bind with single table
- try:
- stmt2.bind_param_with_tables([table0, table1, table2])
- stmt2.execute()
- except Exception as err:
- #traceback.print_stack()
- print(f"failed to do_check_invalid. err={err}")
- return
-
- print(f"input invalid data passed , unexpect. \ntbnames={tbnames}\ntags={tags} \ndatas={datas} \n")
- assert False
-
-
-def check_input_invalid_param(conn, stmt2, dbname, stbname):
-
- tbnames = ["t1", "t2", "t3"]
- tags = [
- ["grade2", 1],
- ["grade2", 2],
- ["grade2", 3]
- ]
-
- # prepare data
- datas = [
- # table 1
- [
- # student
- [1601481600000,1601481600004,"2024-09-19 10:00:00", "2024-09-19 10:00:01.123", datetime(2024,9,20,10,11,12,456)],
- ["Mary", "Tom", "Jack", "Jane", "alex" ],
- [0, 1, 1, 0, 1 ],
- [98, 80, 60, 100, 99 ]
- ],
- # table 2
- [
- # student
- [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004],
- ["Mary2", "Tom2", "Jack2", "Jane2", "alex2" ],
- [0, 1, 1, 0, 1 ],
- [298, 280, 260, 2100, 299 ]
- ],
- # table 3
- [
- # student
- [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004],
- ["Mary3", "Tom3", "Jack3", "Jane3", "alex3" ],
- [0, 1, 1, 0, 1 ],
- [398, 380, 360, 3100, 399 ]
- ]
- ]
-
- # some tags is none
- tags1 = [ ["grade2", 1], None, ["grade2", 3] ]
- do_check_invalid(stmt2, tbnames, tags1, datas)
-
- # timestamp is over range
- origin = datas[0][0][0]
- datas[0][0][0] = 100000000000000000000000
- do_check_invalid(stmt2, tbnames, tags, datas)
- datas[0][0][0] = origin # restore
-
-
-# insert with single table (performance is lower)
-def insert_with_normal_tables(conn, stmt2, dbname, ntb):
-
- tbnames = [ntb]
- tags = [None]
- # prepare data
-
- wkts = [None, "POINT(121.213 31.234)", "POINT(122.22 32.222)", None, "POINT(124.22 34.222)"]
- wkbs = [WKB(wkt) for wkt in wkts]
-
- datas = [
- # table 1
- [
- # student
- [1601481600000,1601481600004,"2024-09-19 10:00:00", "2024-09-19 10:00:01.123", datetime(2024,9,20,10,11,12,456)],
- [b"Mary", b"tom", b"Jack", b"Jane", None ],
- [0, 3.14, True, 0, 1 ],
- [98, 99.87, 60, 100, 99 ],
- wkbs
- ]
- ]
-
- table0 = BindTable(tbnames[0], tags[0])
- for data in datas[0]:
- table0.add_col_data(data)
-
- # bind with single table
- stmt2.bind_param_with_tables([table0])
- stmt2.execute()
-
- # check correct
- checkResultCorrects(conn, dbname, None, tbnames, tags, datas)
-
-
-def test_stmt2_prepare_empty_sql(conn):
- if not IS_V3:
- print(" test_stmt2_prepare_empty_sql not support TDengine 2.X version.")
- return
-
- try:
- # prepare
- stmt2 = conn.statement2()
- stmt2.prepare(sql='')
-
- # should not run here
- conn.close()
- print("prepare empty sql ............................. failed\n")
- assert False
-
- except StatementError as err:
- print("prepare empty sql ............................. ok\n")
- conn.close()
-
-
-def test_bind_invalid_tbnames_type():
- if not IS_V3:
- print(" test_bind_invalid_tbnames_type not support TDengine 2.X version.")
- return
-
- dbname = "stmt2"
- stbname = "stmt2_stable"
- subtbname = "stmt2_subtable"
-
- try:
- conn = taos.connect()
- conn.execute(f"drop database if exists {dbname}")
- conn.execute(f"create database {dbname}")
- conn.select_db(dbname)
- conn.execute(f"create stable {stbname} (ts timestamp, a int) tags (b int);")
- conn.execute(f"create table {subtbname} using {stbname} tags(0);")
-
- stmt2 = conn.statement2(f"insert into ? using {dbname}.{stbname} tags(?) values(?,?)")
-
- tags = [[1]]
- datas = [[[1626861392589], [1]]]
-
- stmt2.bind_param(subtbname, tags, datas)
-
- # should not run here
- conn.close()
- print("bind invalid tbnames type ..................... failed\n")
- assert False
-
- except StatementError as err:
- print("bind invalid tbnames type ..................... ok\n")
- conn.close()
-
-
-#
-# insert
-#
-def test_stmt2_insert(conn):
- if not IS_V3:
- print(" test_stmt2_query not support TDengine 2.X version.")
- return
-
- dbname = "stmt2"
- stbname = "meters"
- ntb1 = "ntb1"
- ntb2 = "ntb2"
-
- try:
- prepare(conn, dbname, stbname, ntb1, ntb2)
-
- ctb = 'ctb' # child table
- stmt2 = conn.statement2(f"insert into {dbname}.{ctb} using {dbname}.{stbname} tags (?,?) values(?,?,?,?)")
- insert_bind_param_with_table(conn, stmt2, dbname, stbname, ctb)
- print("insert child table ........................... ok\n")
- stmt2.close()
-
- # # prepare
- # stmt2 = conn.statement2(f"insert into ? using {dbname}.{stbname} tags(?,?) values(?,?,?,?)")
- # print("insert prepare sql ............................ ok\n")
- #
- # # insert with table
- # insert_bind_param_with_tables(conn, stmt2, dbname, stbname)
- # print("insert bind with tables ....................... ok\n")
- # check_input_invalid_param(conn, stmt2, dbname, stbname)
- # print("check input invalid params .................... ok\n")
- #
- # # insert with split args
- # insert_bind_param(conn, stmt2, dbname, stbname)
- # print("insert bind ................................... ok\n")
- # print("insert execute ................................ ok\n")
- # stmt2.close()
-
- # ntb1
- stmt2 = conn.statement2(f"insert into {dbname}.{ntb1} values(?,?,?,?,?)")
- insert_with_normal_tables(conn, stmt2, dbname, ntb1)
- print("insert normal tables .......................... ok\n")
- stmt2.close()
-
- # ntb2
- stmt2 = conn.statement2(f"insert into {dbname}.{ntb2} values(?,?,?,?,?)")
- insert_bind_param_normal_tables(conn, stmt2, dbname, ntb2)
- print("insert normal tables (bind param) ............. ok\n")
- stmt2.close()
-
- conn.close()
- print("test_stmt2_insert ............................. [passed]\n")
- except Exception as err:
- #conn.execute("drop database if exists %s" % dbname)
- print("test_stmt2_insert ............................. failed\n")
- conn.close()
- raise err
-
-
-#
-# ------------------------ query -------------------
-#
-def query_bind_param(conn, stmt2):
- # set param
- #tbnames = ["d2"]
- tbnames = None
- tags = None
- datas = [
- # class 1
- [
- # where name in ('Tom2','alex2') or score > 1000;"
- ["Tom2"],
- [1000]
- ]
- ]
-
- # set param
- types = [FieldType.C_BINARY, FieldType.C_INT]
- stmt2.set_columns_type(types)
-
- # bind
- stmt2.bind_param(tbnames, tags, datas)
-
-
-# compare
-def compare_result(conn, sql2, res2):
- lres1 = []
- lres2 = []
-
-    # show res2
- for row in res2:
- log.debug(f" res2 rows = {row} \n")
- lres2.append(row)
-
- res1 = conn.query(sql2)
- for row in res1:
- log.debug(f" res1 rows = {row} \n")
- lres1.append(row)
-
- row1 = len(lres1)
- row2 = len(lres2)
- col1 = len(lres1[0])
- col2 = len(lres2[0])
-
- # check number
- if row1 != row2:
- err = f"two results row count different. row1={row1} row2={row2}"
- raise(BaseException(err))
- if col1 != col2:
- err = f" two results column count different. col1={col1} col2={col2}"
- raise(BaseException(err))
-
- for i in range(row1):
- for j in range(col1):
- if lres1[i][j] != lres2[i][j]:
-                raise(BaseException(f" two results data different. i={i} j={j} data1={lres1[i][j]} data2={lres2[i][j]}\n"))
-
-# query
-def test_stmt2_query(conn):
- if not IS_V3:
- print(" test_stmt2_query not support TDengine 2.X version.")
- return
-
- dbname = "stmt2"
- stbname = "meters"
- ntb1 = "ntb1"
- ntb2 = "ntb2"
- sql1 = f"select * from {dbname}.d2 where name in (?) or score > ? ;"
- sql2 = f"select * from {dbname}.d2 where name in ('Tom2') or score > 1000;"
-
- try:
- # prepare
- prepare(conn, dbname, stbname, ntb1, ntb2)
-
- # prepare
- # stmt2 = conn.statement2(f"insert into ? using {dbname}.{stbname} tags(?,?) values(?,?,?,?)")
- # insert_bind_param_with_tables(conn, stmt2, dbname, stbname)
- # insert_bind_param(conn, stmt2, dbname, stbname)
- # stmt2.close()
- # print("insert bind & execute ......................... ok\n")
-
- conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.000', 'Mary2', false, 298)")
- conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.001', 'Tom2', true, 280)")
- conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.002', 'Jack2', true, 260)")
- conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.003', 'Jane2', false, 2100)")
- conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.004', 'alex2', true, 299)")
- conn.execute(f"insert into d2 using {stbname} tags('grade1', 2) values('2020-10-01 00:00:00.005', NULL, false, NULL)")
-
-
- # statement2
- stmt2 = conn.statement2(sql1)
- print("query prepare sql ............................. ok\n")
-
-
- # insert with table
- #insert_bind_param_with_tables(conn, stmt2)
-
-
- # bind
- query_bind_param(conn, stmt2)
- print("query bind param .............................. ok\n")
-
- # query execute
- stmt2.execute()
-
- # fetch result
- res2 = stmt2.result()
-
- # check result
- compare_result(conn, sql2, res2)
-        print("query check correct ........................... ok\n")
-
- #conn.execute("drop database if exists %s" % dbname)
- stmt2.close()
- conn.close()
- print("test_stmt2_query .............................. [passed]\n")
-
- except Exception as err:
- print("query ......................................... failed\n")
- conn.close()
- raise err
-
-
-if __name__ == "__main__":
- print("start stmt2 test case...\n")
- taos.log.setting(True, True, True, True, True, False)
- # insert
- test_stmt2_insert(taos.connect())
- # query
- test_stmt2_query(taos.connect())
- print("end stmt2 test case.\n")
\ No newline at end of file
diff --git a/docs/zh/14-reference/05-connector/30-python.mdx b/docs/zh/14-reference/05-connector/30-python.mdx
index 0c15d866d5..cdf63f918c 100644
--- a/docs/zh/14-reference/05-connector/30-python.mdx
+++ b/docs/zh/14-reference/05-connector/30-python.mdx
@@ -52,7 +52,8 @@ Python Connector 历史版本(建议使用最新版本的 `taospy`):
| Python Connector 版本 | 主要变化 | TDengine 版本 |
| -------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------- |
-| 2.7.18 | 支持 Apache Superset BI 产品 | - |
+| 2.7.19 | 支持 Apache Superset 连接 TDengine Cloud 数据源 | - |
+| 2.7.18 | 支持 Apache Superset 产品连接本地 TDengine 数据源 | - |
| 2.7.16 | 新增订阅配置 (session.timeout.ms, max.poll.interval.ms) | - |
| 2.7.15 | 新增 VARBINARY 和 GEOMETRY 类型支持 | - |
| 2.7.14 | 修复已知问题 | - |
@@ -131,7 +132,8 @@ TDengine 目前支持时间戳、数字、字符、布尔类型,与 Python 对
| [json_tag.py](https://github.com/taosdata/taos-connector-python/blob/main/examples/json-tag.py) | 使用 JSON 类型的标签 |
| [tmq_consumer.py](https://github.com/taosdata/taos-connector-python/blob/main/examples/tmq_consumer.py) | tmq 订阅 |
| [native_all_type_query.py](https://github.com/taosdata/taos-connector-python/blob/main/examples/native_all_type_query.py) | 支持全部类型示例 |
-| [native_all_type_stmt.py](https://github.com/taosdata/taos-connector-python/blob/main/examples/native_all_type_stmt.py) | 参数绑定支持全部类型示例 |
+| [native_all_type_stmt.py](https://github.com/taosdata/taos-connector-python/blob/main/examples/native_all_type_stmt.py) | 参数绑定 stmt 全部类型示例 |
+| [insert_with_stmt2.py](https://github.com/taosdata/taos-connector-python/blob/main/tests/test_stmt2.py) | 参数绑定 stmt2 写入示例 |
示例程序源码请参考:
@@ -281,7 +283,6 @@ TaosResult 对象可以通过循环遍历获取查询到的数据。
- `def close(self)`
- **接口说明**: 关闭 stmt2 对象
-[使用示例](http://https://github.com/taosdata/TDengine/tree/main/docs/examples/python/insert_with_stmt2.py)
#### 数据订阅
@@ -458,7 +459,6 @@ TaosResult 对象可以通过循环遍历获取查询到的数据。
- `def close(self)`
- **接口说明**: 关闭 stmt2 对象
-[使用示例](http://https://github.com/taosdata/TDengine/tree/main/docs/examples/python/insert_with_stmt2.py)
#### 数据订阅
- **创建消费者支持属性列表**:
diff --git a/tests/docs-examples-test/python.sh b/tests/docs-examples-test/python.sh
index f7f94db1f2..3a9812637c 100644
--- a/tests/docs-examples-test/python.sh
+++ b/tests/docs-examples-test/python.sh
@@ -196,5 +196,3 @@ check_transactions || exit 1
reset_cache || exit 1
python3 tmq_websocket_example.py
-# stmt2
-python3 insert_with_stmt2.py
\ No newline at end of file
From bbfa55a90ffb3a2ed09441419580fa8bfbb008e8 Mon Sep 17 00:00:00 2001
From: Shengliang Guan
Date: Mon, 30 Dec 2024 16:05:02 +0800
Subject: [PATCH 19/35] doc: password
---
docs/zh/08-operation/14-user.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/zh/08-operation/14-user.md b/docs/zh/08-operation/14-user.md
index a894570b46..3a080619aa 100644
--- a/docs/zh/08-operation/14-user.md
+++ b/docs/zh/08-operation/14-user.md
@@ -17,7 +17,7 @@ create user user_name pass'password' [sysinfo {1|0}] [createdb {1|0}]
相关参数说明如下。
- user_name:用户名最长不超过 23 个字节。
-- password:密码长度必须为 8 到 16 位,并且至少包含大写字母、小写字母、数字、特殊字符中的三类。特殊字符包括 `! @ # $ % ^ & * ( ) - _ + = [ ] { } : ; > < ? | ~ , .`。
+- password:密码长度必须为 8 到 16 位,并且至少包含大写字母、小写字母、数字、特殊字符中的三类。特殊字符包括 `! @ # $ % ^ & * ( ) - _ + = [ ] { } : ; > < ? | ~ , .`。(始自 3.3.5.0 版本)
- sysinfo :用户是否可以查看系统信息。1 表示可以查看,0 表示不可以查看。系统信息包括服务端配置信息、服务端各种节点信息,如 dnode、查询节点(qnode)等,以及与存储相关的信息等。默认为可以查看系统信息。
- createdb:用户是否可以创建数据库。1 表示可以创建,0 表示不可以创建。缺省值为 0。// 从 TDengine 企业版 3.3.2.0 开始支持
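A quick illustration of the password rule documented above (8-16 characters, drawing on at least three of: upper-case letters, lower-case letters, digits, special characters). This is a hedged sketch, not part of the patch, using the Python connector that appears elsewhere in this series; the user name and password are made up for the example, and the CREATEDB option requires TDengine Enterprise 3.3.2.0 or later as noted above.

```python
import taos

# Illustrative only: "example_user" and its password are invented for this sketch.
# "Ab1!meter9" is 10 characters and mixes upper case, lower case, digits and a
# special character, so it satisfies the documented password rule.
conn = taos.connect(user="root", password="taosdata", host="localhost", port=6030)
try:
    # SYSINFO 1: the new user may view system information;
    # CREATEDB 0: the new user may not create databases (Enterprise 3.3.2.0+ option).
    conn.execute("CREATE USER example_user PASS 'Ab1!meter9' SYSINFO 1 CREATEDB 0")
finally:
    conn.close()
```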
From 2c35ec8ae9a8859871cc4330add7872e9f257dcb Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 16:11:11 +0800
Subject: [PATCH 20/35] docs: add stmt2_native.py example
---
docs/en/07-develop/05-stmt.md | 2 +-
docs/examples/python/stmt2_native.py | 78 ++++++++++++++++++++++++++++
docs/zh/07-develop/05-stmt.md | 2 +-
tests/docs-examples-test/python.sh | 1 +
4 files changed, 81 insertions(+), 2 deletions(-)
create mode 100644 docs/examples/python/stmt2_native.py
diff --git a/docs/en/07-develop/05-stmt.md b/docs/en/07-develop/05-stmt.md
index 485315bc93..11b055bcf9 100644
--- a/docs/en/07-develop/05-stmt.md
+++ b/docs/en/07-develop/05-stmt.md
@@ -98,7 +98,7 @@ This is a [more detailed parameter binding example](https://github.com/taosdata/
```python
-{{#include docs/examples/python/stmt_native.py}}
+{{#include docs/examples/python/stmt2_native.py}}
```
diff --git a/docs/examples/python/stmt2_native.py b/docs/examples/python/stmt2_native.py
new file mode 100644
index 0000000000..7e6d59466e
--- /dev/null
+++ b/docs/examples/python/stmt2_native.py
@@ -0,0 +1,78 @@
+import taos
+from datetime import datetime
+import random
+
+conn = None
+stmt2 = None
+host="localhost"
+port=6030
+try:
+ # 1 connect
+ conn = taos.connect(
+ user="root",
+ password="taosdata",
+ host=host,
+ port=port,
+ )
+
+ # 2 create db and table
+ conn.execute("CREATE DATABASE IF NOT EXISTS power")
+ conn.execute("USE power")
+ conn.execute(
+ "CREATE TABLE IF NOT EXISTS `meters` (`ts` TIMESTAMP, `current` FLOAT, `voltage` INT, `phase` FLOAT) TAGS (`groupid` INT, `location` BINARY(16))"
+ )
+
+ # 3 prepare
+ sql = "INSERT INTO ? USING meters (groupid, location) TAGS(?,?) VALUES (?,?,?,?)"
+ stmt2 = conn.statement2(sql)
+
+ # table name array
+ tbnames = ["d0","d1","d2"]
+ # tag data array
+ tags = [
+ [1, "BeiJing"],
+ [2, None],
+ [3, "ShangHai"]
+ ]
+ # column data array
+ datas = [
+ # d0 tabled
+ [
+ [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004,1601481600005],
+ [10.1, 10.2, 10.3, 10.4, 10.5 ,None ],
+ [98, None, 60, 100, 99 ,128 ],
+ [0, 1, 0, 0, 1 ,0 ]
+ ],
+ # d1 tabled
+ [
+ [1601481700000,1601481700001,1601481700002,1601481700003,1601481700004,1601481700005],
+ [10.1, 10.2, 10.3, 10.4, 10.5 ,11.2 ],
+ [98, 80, 60, 100, 99 ,128 ],
+ [0, 1, 0, 0, 1 ,0 ]
+ ],
+ # d2 tabled
+ [
+ [1601481800000,1601481800001,1601481800002,1601481800003,1601481800004,1601481800005],
+ [10.1, 10.2, 10.3, 10.4, 10.5 ,13.4 ],
+ [98, 80, 60, 100, 99 ,128 ],
+ [0, 1, 0, None, 1 ,0 ]
+ ],
+ ]
+
+ # 4 bind param
+ stmt2.bind_param(tbnames, tags, datas)
+
+ # 5 execute
+ stmt2.execute()
+
+ # show
+ print(f"Successfully inserted with stmt2 to power.meters.")
+
+except Exception as err:
+ print(f"Failed to insert to table meters using stmt2, ErrMessage:{err}")
+ raise err
+finally:
+ if stmt2:
+ stmt2.close()
+ if conn:
+ conn.close()
diff --git a/docs/zh/07-develop/05-stmt.md b/docs/zh/07-develop/05-stmt.md
index 045126b333..1917a86e74 100644
--- a/docs/zh/07-develop/05-stmt.md
+++ b/docs/zh/07-develop/05-stmt.md
@@ -93,7 +93,7 @@ import TabItem from "@theme/TabItem";
```python
-{{#include docs/examples/python/stmt_native.py}}
+{{#include docs/examples/python/stmt2_native.py}}
```
diff --git a/tests/docs-examples-test/python.sh b/tests/docs-examples-test/python.sh
index 3a9812637c..536155437b 100644
--- a/tests/docs-examples-test/python.sh
+++ b/tests/docs-examples-test/python.sh
@@ -196,3 +196,4 @@ check_transactions || exit 1
reset_cache || exit 1
python3 tmq_websocket_example.py
+python3 stmt2_native.py
\ No newline at end of file
From 5fe895c6d228938f3c98083586057f3786819257 Mon Sep 17 00:00:00 2001
From: qevolg <2227465945@qq.com>
Date: Mon, 30 Dec 2024 16:31:11 +0800
Subject: [PATCH 21/35] fix(keeper): add gitinfo
---
tools/CMakeLists.txt | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/tools/CMakeLists.txt b/tools/CMakeLists.txt
index d058d7a52f..1ee2bc4ce6 100644
--- a/tools/CMakeLists.txt
+++ b/tools/CMakeLists.txt
@@ -251,7 +251,7 @@ IF(TD_BUILD_KEEPER)
PATCH_COMMAND
COMMAND git clean -f -d
BUILD_COMMAND
- COMMAND go build -a -ldflags "-X 'github.com/taosdata/taoskeeper/version.Version=${taos_version}' -X 'github.com/taosdata/taoskeeper/version.CommitID=${taoskeeper_commit_sha1}' -X 'github.com/taosdata/taoskeeper/version.BuildInfo=${TD_VER_OSTYPE}-${TD_VER_CPUTYPE} ${TD_VER_DATE}'"
+ COMMAND go build -a -ldflags "-X 'github.com/taosdata/taoskeeper/version.Version=${taos_version}' -X 'github.com/taosdata/taoskeeper/version.Gitinfo=${taoskeeper_commit_sha1}' -X 'github.com/taosdata/taoskeeper/version.CommitID=${taoskeeper_commit_sha1}' -X 'github.com/taosdata/taoskeeper/version.BuildInfo=${TD_VER_OSTYPE}-${TD_VER_CPUTYPE} ${TD_VER_DATE}'"
INSTALL_COMMAND
COMMAND cmake -E echo "Comparessing taoskeeper.exe"
COMMAND cmake -E time upx taoskeeper.exe
@@ -278,7 +278,7 @@ IF(TD_BUILD_KEEPER)
PATCH_COMMAND
COMMAND git clean -f -d
BUILD_COMMAND
- COMMAND go build -a -ldflags "-X 'github.com/taosdata/taoskeeper/version.Version=${taos_version}' -X 'github.com/taosdata/taoskeeper/version.CommitID=${taoskeeper_commit_sha1}' -X 'github.com/taosdata/taoskeeper/version.BuildInfo=${TD_VER_OSTYPE}-${TD_VER_CPUTYPE} ${TD_VER_DATE}'"
+ COMMAND go build -a -ldflags "-X 'github.com/taosdata/taoskeeper/version.Version=${taos_version}' -X 'github.com/taosdata/taoskeeper/version.Gitinfo=${taoskeeper_commit_sha1}' -X 'github.com/taosdata/taoskeeper/version.CommitID=${taoskeeper_commit_sha1}' -X 'github.com/taosdata/taoskeeper/version.BuildInfo=${TD_VER_OSTYPE}-${TD_VER_CPUTYPE} ${TD_VER_DATE}'"
INSTALL_COMMAND
COMMAND cmake -E echo "Copy taoskeeper"
COMMAND cmake -E copy taoskeeper ${CMAKE_BINARY_DIR}/build/bin
From f6b180ff569aab9ba859a6ec84be019a7fc5c4fd Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 16:57:16 +0800
Subject: [PATCH 22/35] fix: use old case model
---
docs/examples/python/stmt2_native.py | 59 ++++++++++++----------------
1 file changed, 26 insertions(+), 33 deletions(-)
diff --git a/docs/examples/python/stmt2_native.py b/docs/examples/python/stmt2_native.py
index 7e6d59466e..72f01c9038 100644
--- a/docs/examples/python/stmt2_native.py
+++ b/docs/examples/python/stmt2_native.py
@@ -2,6 +2,9 @@ import taos
from datetime import datetime
import random
+numOfSubTable = 10
+numOfRow = 10
+
conn = None
stmt2 = None
host="localhost"
@@ -26,38 +29,28 @@ try:
sql = "INSERT INTO ? USING meters (groupid, location) TAGS(?,?) VALUES (?,?,?,?)"
stmt2 = conn.statement2(sql)
- # table name array
- tbnames = ["d0","d1","d2"]
- # tag data array
- tags = [
- [1, "BeiJing"],
- [2, None],
- [3, "ShangHai"]
- ]
- # column data array
- datas = [
- # d0 tabled
- [
- [1601481600000,1601481600001,1601481600002,1601481600003,1601481600004,1601481600005],
- [10.1, 10.2, 10.3, 10.4, 10.5 ,None ],
- [98, None, 60, 100, 99 ,128 ],
- [0, 1, 0, 0, 1 ,0 ]
- ],
- # d1 tabled
- [
- [1601481700000,1601481700001,1601481700002,1601481700003,1601481700004,1601481700005],
- [10.1, 10.2, 10.3, 10.4, 10.5 ,11.2 ],
- [98, 80, 60, 100, 99 ,128 ],
- [0, 1, 0, 0, 1 ,0 ]
- ],
- # d2 tabled
- [
- [1601481800000,1601481800001,1601481800002,1601481800003,1601481800004,1601481800005],
- [10.1, 10.2, 10.3, 10.4, 10.5 ,13.4 ],
- [98, 80, 60, 100, 99 ,128 ],
- [0, 1, 0, None, 1 ,0 ]
- ],
- ]
+ tbnames = []
+ tags = []
+ datas = []
+
+ for i in range(numOfSubTable):
+ # tbnames
+ tbnames.append(f"d_bind_{i}")
+ # tags
+ tags.append([i, f"location_{i}"])
+ # datas
+ current = int(datetime.now().timestamp() * 1000)
+ timestamps = []
+ currents = []
+ voltages = []
+ phases = []
+ for j in range (numOfRow):
+ timestamps.append(current + i*1000 + j)
+ currents.append(float(random.random() * 30))
+ voltages.append(random.randint(100, 300))
+ phases.append(float(random.random()))
+ data = [timestamps, currents, voltages, phases]
+ datas.append(data)
# 4 bind param
stmt2.bind_param(tbnames, tags, datas)
@@ -66,7 +59,7 @@ try:
stmt2.execute()
# show
- print(f"Successfully inserted with stmt2 to power.meters.")
+ print(f"Successfully inserted with stmt2 to power.meters. child={numOfSubTable} rows={numOfRow} \n")
except Exception as err:
print(f"Failed to insert to table meters using stmt2, ErrMessage:{err}")
From 62d3df49d1d5513433b114dc02cc20dd22417c13 Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 17:16:23 +0800
Subject: [PATCH 23/35] docs: remove STMT2
---
.../14-reference/05-connector/30-python.mdx | 65 ++++++++++---------
1 file changed, 34 insertions(+), 31 deletions(-)
diff --git a/docs/zh/14-reference/05-connector/30-python.mdx b/docs/zh/14-reference/05-connector/30-python.mdx
index cdf63f918c..ad43ce19df 100644
--- a/docs/zh/14-reference/05-connector/30-python.mdx
+++ b/docs/zh/14-reference/05-connector/30-python.mdx
@@ -52,6 +52,7 @@ Python Connector 历史版本(建议使用最新版本的 `taospy`):
| Python Connector 版本 | 主要变化 | TDengine 版本 |
| -------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------- |
+| 2.7.20 | Native 支持 STMT2 写入 | - |
| 2.7.19 | 支持 Apache Superset 连接 TDengine Cloud 数据源 | - |
| 2.7.18 | 支持 Apache Superset 产品连接本地 TDengine 数据源 | - |
| 2.7.16 | 新增订阅配置 (session.timeout.ms, max.poll.interval.ms) | - |
@@ -248,41 +249,43 @@ TaosResult 对象可以通过循环遍历获取查询到的数据。
- `reqId`: 用于问题追踪。
- **异常**:操作失败抛出 `DataError` 或 `OperationalError` 异常。
-#### 参数绑定 STMT2
-- `def statement2(self, sql=None, option=None)`
- - **接口说明**:使用连接对象创建 stmt2 对象
- - **参数说明**
- - `sql`: 绑定的 SQL 语句,如果不为空会调用`prepare`函数
- - `option` 传入 TaosStmt2Option 类实例选项
- - **返回值**:stmt2 对象。
+#### 参数绑定
+- `fn statement(&self) -> PyResult`
+ - **接口说明**:使用 连接 对象创建 stmt 对象。
+ - **返回值**:stmt 对象。
- **异常**:操作失败抛出 `ConnectionError` 异常。
-- `def prepare(self, sql)`
- - **接口说明**:绑定预编译 sql 语句
+- `fn prepare(&mut self, sql: &str) -> PyResult<()>`
+ - **接口说明**:绑定预编译 sql 语句。
- **参数说明**:
- - `sql`: 绑定的 SQL 语句
- - **异常**:操作失败抛出 `StatementError` 异常。
-- `def bind_param(self, tbnames, tags, datas)`
- - **接口说明**:以独立数组方式绑定数据
+ - `sql`: 预编译的 SQL 语句。
+ - **异常**:操作失败抛出 `ProgrammingError` 异常。
+- `fn set_tbname(&mut self, table_name: &str) -> PyResult<()>`
+ - **接口说明**:设置将要写入数据的表名。
- **参数说明**:
- - `tbnames`: 绑定表名数组,数据类型为 list
- - `tags`: 绑定 tag 列值数组,数据类型为 list
- - `tags`: 绑定普通列值数组,数据类型为 list
- - **异常**:操作失败抛出 `StatementError` 异常
-- `def bind_param_with_tables(self, tables)`
- - **接口说明**:以独立表方式绑定数据,独立表是以表为组织单位,每张表中有表名,TAG 值及普通列数值属性
+ - `tableName`: 表名,如果需要指定数据库, 例如: `db_name.table_name` 即可。
+ - **异常**:操作失败抛出 `ProgrammingError` 异常。
+- `fn set_tags(&mut self, tags: Vec) -> PyResult<()>`
+ - **接口说明**:设置表 Tags 数据, 用于自动建表。
- **参数说明**:
- - `tables`: `BindTable` 独立表对象数组
- - **异常**:操作失败抛出 `StatementError` 异常。
-- `def execute(self) -> int:`
- - **接口说明**:执行将绑定数据全部写入
- - **返回值**:影响行数
+ - `paramsArray`: Tags 数据。
+ - **异常**:操作失败抛出 `ProgrammingError` 异常。
+- `fn bind_param(&mut self, params: Vec) -> PyResult<()>`
+ - **接口说明**:绑定数据。
+ - **参数说明**:
+ - `paramsArray`: 绑定数据。
+ - **异常**:操作失败抛出 `ProgrammingError` 异常。
+- `fn add_batch(&mut self) -> PyResult<()>`
+ - **接口说明**:提交绑定数据。
+ - **异常**:操作失败抛出 `ProgrammingError` 异常。
+- `fn execute(&mut self) -> PyResult`
+ - **接口说明**:执行将绑定的数据全部写入。
+ - **返回值**:写入条数。
- **异常**:操作失败抛出 `QueryError` 异常。
-- `def result(self)`
- - **接口说明**: 获取参数绑定查询结果集
- - **返回值**:返回 TaosResult 对象
-- `def close(self)`
- - **接口说明**: 关闭 stmt2 对象
-
+- `fn affect_rows(&mut self) -> PyResult`
+ - **接口说明**: 获取写入条数。
+ - **返回值**:写入条数。
+- `fn close(&self) -> PyResult<()>`
+ - **接口说明**: 关闭 stmt 对象。
#### 数据订阅
@@ -424,7 +427,7 @@ TaosResult 对象可以通过循环遍历获取查询到的数据。
- **返回值**:影响的条数。
- **异常**:操作失败抛出 `SchemalessError` 异常。
-#### 参数绑定 STMT2
+#### 参数绑定
- `def statement2(self, sql=None, option=None)`
- **接口说明**:使用连接对象创建 stmt2 对象
- **参数说明**
From e8232129651cd56ae9a7ce4cf7a142b90c09741a Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 17:53:28 +0800
Subject: [PATCH 24/35] docs: update english documents
---
.../en/14-reference/05-connector/30-python.md | 81 +++++++++----------
.../14-reference/05-connector/30-python.mdx | 2 +-
2 files changed, 37 insertions(+), 46 deletions(-)
diff --git a/docs/en/14-reference/05-connector/30-python.md b/docs/en/14-reference/05-connector/30-python.md
index 58cff75c90..d1e922da8d 100644
--- a/docs/en/14-reference/05-connector/30-python.md
+++ b/docs/en/14-reference/05-connector/30-python.md
@@ -55,6 +55,8 @@ Python Connector historical versions (it is recommended to use the latest versio
|Python Connector Version | Major Changes | TDengine Version|
| -------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------- |
+|2.7.20 | Native supports STMT2 writing | - |
+|2.7.19 | Support Apache Superset connection to TDengine Cloud data source | - |
|2.7.18 | Support Apache Superset BI tools. | - |
|2.7.16 | Add subscription configuration (session.timeout.ms, max.poll.interval.ms). | - |
|2.7.15 | Added support for VARBINARY and GEOMETRY types. | - |
@@ -136,7 +138,7 @@ TDengine currently supports timestamp, numeric, character, boolean types, and th
| [tmq_consumer.py](https://github.com/taosdata/taos-connector-python/blob/main/examples/tmq_consumer.py) | tmq subscription |
| [native_all_type_query.py](https://github.com/taosdata/taos-connector-python/blob/main/examples/native_all_type_query.py) | Example supporting all types |
| [native_all_type_stmt.py](https://github.com/taosdata/taos-connector-python/blob/main/examples/native_all_type_stmt.py) | Parameter binding example supporting all types |
-
+| [test_stmt2.py](https://github.com/taosdata/taos-connector-python/blob/main/tests/test_stmt2.py) | Example of STMT2 writing |
Example program source code can be found at:
1. [More native example programs](https://github.com/taosdata/taos-connector-python/tree/main/examples)
@@ -429,51 +431,40 @@ TaosResult object can be iterated over to retrieve queried data.
- **Exceptions**: Throws `SchemalessError` if operation fails.
#### Parameter Binding
-
-- `def statement(self, sql=None)`
- - **Interface Description**: Creates a stmt object using the connection object, if sql is not empty it will call prepare.
- - `sql`: Precompiled SQL statement.
- - **Return Value**: stmt object.
- - **Exception**: Throws `StatementError` exception on failure.
+- `def statement2(self, sql=None, option=None)`
+ - **Interface Description**:Creating an STMT2 object using a connection object
+ - **Parameter Description**
+ - `sql`: The bound SQL statement will call the `prepare` function if it is not empty
+    - `option` Pass in TaosStmt2Option class instance options
+ - **Return Value**:stmt2 object
+ - **Exception**:Throws `ConnectionError` on failure
- `def prepare(self, sql)`
- - **Interface Description**: Binds a precompiled sql statement.
- - **Parameter Description**:
- - `sql`: Precompiled SQL statement.
- - **Exception**: Throws `StatementError` exception on failure.
-- `def set_tbname(self, name)`
- - **Interface Description**: Sets the table name for data to be written to.
- - **Parameter Description**:
- - `name`: Table name, if you need to specify a database, for example: `db_name.table_name`.
- - **Exception**: Throws `StatementError` exception on failure.
-- `def set_tbname_tags(self, name, tags):`
- - **Interface Description**: Sets the table and Tags data, used for automatic table creation.
- - **Parameter Description**:
- - `name`: Table name, if you need to specify a database, for example: `db_name.table_name`.
- - `tags`: Tags data.
- - **Exception**: Throws `StatementError` exception on failure.
-- `def bind_param(self, params, add_batch=True)`
- - **Interface Description**: Binds a set of data and submits.
- - **Parameter Description**:
- - `params`: Data to bind.
- - `add_batch`: Whether to submit the bound data.
- - **Exception**: Throws `StatementError` exception on failure.
-- `def bind_param_batch(self, binds, add_batch=True)`
- - **Interface Description**: Binds multiple sets of data and submits.
- - **Parameter Description**:
- - `binds`: Data to bind.
- - `add_batch`: Whether to submit the bound data.
- - **Exception**: Throws `StatementError` exception on failure.
-- `def add_batch(self)`
- - **Interface Description**: Submits the bound data.
- - **Exception**: Throws `StatementError` exception on failure.
-- `def execute(self)`
- - **Interface Description**: Executes and writes all the bound data.
- - **Exception**: Throws `StatementError` exception on failure.
-- `def affected_rows(self)`
- - **Interface Description**: Gets the number of rows written.
- - **Return Value**: Number of rows written.
-- `def close(&self)`
- - **Interface Description**: Closes the stmt object.
+ - **Interface Description**:Bind a precompiled SQL statement
+ - **Parameter Description**:
+ - `sql`: Precompiled SQL statement
+ - **Exception**:Throws `StatementError` on failure
+- `def bind_param(self, tbnames, tags, datas)`
+  - **Interface Description**:Bind data as independent arrays
+ - **Parameter Description**:
+ - `tbnames`:Bind table name array, data type is list
+ - `tags`: Bind tag column value array, data type is list
+ - `tags`: Bind a regular column value array with a data type of list
+ - **Exception**:Throws `StatementError` on failure
+- `def bind_param_with_tables(self, tables)`
+ - **Interface Description**:Bind data in an independent table format. Independent tables are organized by table units, with table names, TAG values, and regular column numerical attributes in each table
+ - **Parameter Description**:
+ - `tables`: `BindTable` Independent Table Object Array
+ - **Exception**:Throws `StatementError` on failure
+- `def execute(self) -> int:`
+ - **Interface Description**:Execute to write all bound data
+  - **Return Value**:Number of affected rows
+ - **Exception**:Throws `QueryError` on failure
+- `def result(self)`
+ - **Interface Description**:Get parameter binding query result set
+ - **Return Value**:Returns the TaosResult object
+- `def close(self)`
+ - **Interface Description**: close the STMT2 object
+
#### Data Subscription
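To tie the parameter-binding interfaces above together, here is a condensed usage sketch. It is not part of the patch; it simply mirrors the stmt2_native.py example added earlier in this series, with made-up subtable names, tag values, and column data:

```python
import taos

conn = None
stmt2 = None
try:
    conn = taos.connect(user="root", password="taosdata", host="localhost", port=6030)

    conn.execute("CREATE DATABASE IF NOT EXISTS power")
    conn.execute("USE power")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS `meters` (`ts` TIMESTAMP, `current` FLOAT, `voltage` INT, `phase` FLOAT) "
        "TAGS (`groupid` INT, `location` BINARY(16))"
    )

    # statement2(sql) creates the stmt2 object and prepares the statement in one step.
    stmt2 = conn.statement2(
        "INSERT INTO ? USING meters (groupid, location) TAGS(?,?) VALUES (?,?,?,?)"
    )

    # bind_param(tbnames, tags, datas): three parallel lists, one entry per subtable.
    tbnames = ["d_sketch_0", "d_sketch_1"]  # made-up subtable names
    tags = [[1, "California.SanFrancisco"], [2, "California.LosAngeles"]]
    datas = [
        # per subtable: [timestamps, currents, voltages, phases]
        [[1601481600000, 1601481600001], [10.1, 10.2], [219, 221], [0.31, 0.33]],
        [[1601481600000, 1601481600001], [11.5, 11.8], [218, 222], [0.35, 0.32]],
    ]
    stmt2.bind_param(tbnames, tags, datas)

    affected = stmt2.execute()  # returns the number of rows written, per the reference above
    print(f"inserted {affected} rows")
finally:
    if stmt2:
        stmt2.close()
    if conn:
        conn.close()
```

The key points are that `statement2()` prepares a single INSERT with `?` placeholders for the subtable name, tags, and columns, and that `bind_param()` takes parallel arrays for table names, per-table tag values, and per-table column data before one `execute()` writes everything.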
diff --git a/docs/zh/14-reference/05-connector/30-python.mdx b/docs/zh/14-reference/05-connector/30-python.mdx
index ad43ce19df..a1fedfd92c 100644
--- a/docs/zh/14-reference/05-connector/30-python.mdx
+++ b/docs/zh/14-reference/05-connector/30-python.mdx
@@ -134,7 +134,7 @@ TDengine 目前支持时间戳、数字、字符、布尔类型,与 Python 对
| [tmq_consumer.py](https://github.com/taosdata/taos-connector-python/blob/main/examples/tmq_consumer.py) | tmq 订阅 |
| [native_all_type_query.py](https://github.com/taosdata/taos-connector-python/blob/main/examples/native_all_type_query.py) | 支持全部类型示例 |
| [native_all_type_stmt.py](https://github.com/taosdata/taos-connector-python/blob/main/examples/native_all_type_stmt.py) | 参数绑定 stmt 全部类型示例 |
-| [insert_with_stmt2.py](https://github.com/taosdata/taos-connector-python/blob/main/tests/test_stmt2.py) | 参数绑定 stmt2 写入示例 |
+| [test_stmt2.py](https://github.com/taosdata/taos-connector-python/blob/main/tests/test_stmt2.py) | 参数绑定 stmt2 写入示例 |
示例程序源码请参考:
From eb42773328c7d25baf286e1f8dae50b9c0ff79dd Mon Sep 17 00:00:00 2001
From: WANG Xu
Date: Mon, 30 Dec 2024 18:08:47 +0800
Subject: [PATCH 25/35] chore: remove useless ci config
---
.appveyor.yml | 49 --------
.circleci/config.yml | 13 ---
.drone.yml | 266 -------------------------------------------
3 files changed, 328 deletions(-)
delete mode 100644 .appveyor.yml
delete mode 100644 .circleci/config.yml
delete mode 100644 .drone.yml
diff --git a/.appveyor.yml b/.appveyor.yml
deleted file mode 100644
index e7802b3d0d..0000000000
--- a/.appveyor.yml
+++ /dev/null
@@ -1,49 +0,0 @@
-version: 1.0.{build}
-image:
- - Visual Studio 2015
- - macos
-environment:
- matrix:
- - ARCH: amd64
- - ARCH: x86
-matrix:
- exclude:
- - image: macos
- ARCH: x86
-for:
- -
- matrix:
- only:
- - image: Visual Studio 2015
- clone_folder: c:\dev\TDengine
- clone_depth: 1
-
- init:
- - call "C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\vcvarsall.bat" %ARCH%
-
- before_build:
- - cd c:\dev\TDengine
- - md build
-
- build_script:
- - cd build
- - cmake -G "NMake Makefiles" .. -DBUILD_JDBC=false
- - nmake install
- -
- matrix:
- only:
- - image: macos
- clone_depth: 1
-
- build_script:
- - mkdir debug
- - cd debug
- - cmake .. > /dev/null
- - make > /dev/null
-notifications:
-- provider: Email
- to:
- - sangshuduo@gmail.com
- on_build_success: true
- on_build_failure: true
- on_build_status_changed: true
diff --git a/.circleci/config.yml b/.circleci/config.yml
deleted file mode 100644
index 6f98693add..0000000000
--- a/.circleci/config.yml
+++ /dev/null
@@ -1,13 +0,0 @@
-# Use the latest 2.1 version of CircleCI pipeline process engine. See: https://circleci.com/docs/2.0/configuration-reference
-version: 2.1
-# Use a package of configuration called an orb.
-orbs:
- # Declare a dependency on the welcome-orb
- welcome: circleci/welcome-orb@0.4.1
-# Orchestrate or schedule a set of jobs
-workflows:
- # Name the workflow "welcome"
- welcome:
- # Run the welcome/run job in its own container
- jobs:
- - welcome/run
diff --git a/.drone.yml b/.drone.yml
deleted file mode 100644
index d35c104830..0000000000
--- a/.drone.yml
+++ /dev/null
@@ -1,266 +0,0 @@
----
-kind: pipeline
-name: test_amd64
-
-platform:
- os: linux
- arch: amd64
-
-steps:
-- name: build
- image: gcc
- commands:
- - apt-get update
- - apt-get install -y cmake build-essential
- - mkdir debug
- - cd debug
- - cmake ..
- - make -j4
- trigger:
- event:
- - pull_request
- when:
- branch:
- - develop
- - master
- - 2.0
- - 3.0
----
-kind: pipeline
-name: test_arm64_bionic
-
-platform:
- os: linux
- arch: arm64
-steps:
-- name: build
- image: arm64v8/ubuntu:bionic
- commands:
- - apt-get update
- - apt-get install -y cmake build-essential
- - mkdir debug
- - cd debug
- - cmake .. -DCPUTYPE=aarch64 > /dev/null
- - make -j4
- trigger:
- event:
- - pull_request
- when:
- branch:
- - develop
- - master
- - 2.0
- - 3.0
----
-kind: pipeline
-name: test_arm64_focal
-
-platform:
- os: linux
- arch: arm64
-
-steps:
-- name: build
- image: arm64v8/ubuntu:focal
- commands:
- - echo 'debconf debconf/frontend select Noninteractive' | debconf-set-selections
- - apt-get update
- - apt-get install -y -qq cmake build-essential
- - mkdir debug
- - cd debug
- - cmake .. -DCPUTYPE=aarch64 > /dev/null
- - make -j4
- trigger:
- event:
- - pull_request
- when:
- branch:
- - develop
- - master
- - 2.0
- - 3.0
----
-kind: pipeline
-name: test_arm64_centos7
-
-platform:
- os: linux
- arch: arm64
-
-steps:
-- name: build
- image: arm64v8/centos:7
- commands:
- - yum install -y gcc gcc-c++ make cmake git
- - mkdir debug
- - cd debug
- - cmake .. -DCPUTYPE=aarch64 > /dev/null
- - make -j4
- trigger:
- event:
- - pull_request
- when:
- branch:
- - develop
- - master
- - 2.0
- - 3.0
----
-kind: pipeline
-name: test_arm64_centos8
-
-platform:
- os: linux
- arch: arm64
-
-steps:
-- name: build
- image: arm64v8/centos:8
- commands:
- - dnf install -y gcc gcc-c++ make cmake epel-release git libarchive
- - mkdir debug
- - cd debug
- - cmake .. -DCPUTYPE=aarch64 > /dev/null
- - make -j4
- trigger:
- event:
- - pull_request
- when:
- branch:
- - develop
- - master
- - 2.0
- - 3.0
----
-kind: pipeline
-name: test_arm_bionic
-
-platform:
- os: linux
- arch: arm
-
-steps:
-- name: build
- image: arm32v7/ubuntu:bionic
- commands:
- - apt-get update
- - apt-get install -y cmake build-essential
- - mkdir debug
- - cd debug
- - cmake .. -DCPUTYPE=aarch32 > /dev/null
- - make -j4
- trigger:
- event:
- - pull_request
- when:
- branch:
- - develop
- - master
- - 2.0
- - 3.0
----
-kind: pipeline
-name: build_trusty
-
-platform:
- os: linux
- arch: amd64
-
-steps:
-- name: build
- image: ubuntu:trusty
- commands:
- - apt-get update
- - apt-get install -y gcc cmake3 build-essential git binutils-2.26
-
- - mkdir debug
- - cd debug
- - cmake ..
- - make -j4
- trigger:
- event:
- - pull_request
- when:
- branch:
- - develop
- - master
- - 2.0
- - 3.0
----
-kind: pipeline
-name: build_xenial
-
-platform:
- os: linux
- arch: amd64
-
-steps:
-- name: build
- image: ubuntu:xenial
- commands:
- - apt-get update
- - apt-get install -y gcc cmake build-essential
- - mkdir debug
- - cd debug
- - cmake ..
- - make -j4
- trigger:
- event:
- - pull_request
- when:
- branch:
- - develop
- - master
- - 2.0
- - 3.0
----
-kind: pipeline
-name: build_bionic
-platform:
- os: linux
- arch: amd64
-
-steps:
-- name: build
- image: ubuntu:bionic
- commands:
- - apt-get update
- - apt-get install -y gcc cmake build-essential
- - mkdir debug
- - cd debug
- - cmake ..
- - make -j4
- trigger:
- event:
- - pull_request
- when:
- branch:
- - develop
- - master
- - 2.0
- - 3.0
----
-kind: pipeline
-name: build_centos7
-platform:
- os: linux
- arch: amd64
-
-steps:
-- name: build
- image: ansible/centos7-ansible
- commands:
- - yum install -y gcc gcc-c++ make cmake
- - mkdir debug
- - cd debug
- - cmake ..
- - make -j4
- trigger:
- event:
- - pull_request
- when:
- branch:
- - develop
- - master
- - 2.0
- - 3.0
\ No newline at end of file
From 3e55011e983ce2724ad0c3bf79fc91a7a6281f55 Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 18:13:59 +0800
Subject: [PATCH 26/35] fix: add top level to 4
---
docs/en/14-reference/05-connector/30-python.md | 1 +
1 file changed, 1 insertion(+)
diff --git a/docs/en/14-reference/05-connector/30-python.md b/docs/en/14-reference/05-connector/30-python.md
index d1e922da8d..05ae323f9a 100644
--- a/docs/en/14-reference/05-connector/30-python.md
+++ b/docs/en/14-reference/05-connector/30-python.md
@@ -1,4 +1,5 @@
---
+toc_max_heading_level: 4
sidebar_label: Python
title: Python Client Library
slug: /tdengine-reference/client-libraries/python
From 9e701eaf2a47006835ac661849370e2cb3c1bcae Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 18:18:43 +0800
Subject: [PATCH 27/35] add toc_max_heading_level 4
---
docs/en/14-reference/05-connector/10-cpp.md | 1 +
docs/en/14-reference/05-connector/14-java.md | 1 +
docs/en/14-reference/05-connector/20-go.md | 1 +
docs/en/14-reference/05-connector/26-rust.md | 1 +
docs/en/14-reference/05-connector/35-node.md | 1 +
docs/en/14-reference/05-connector/40-csharp.md | 1 +
6 files changed, 6 insertions(+)
diff --git a/docs/en/14-reference/05-connector/10-cpp.md b/docs/en/14-reference/05-connector/10-cpp.md
index fe7574d416..940d4c359e 100644
--- a/docs/en/14-reference/05-connector/10-cpp.md
+++ b/docs/en/14-reference/05-connector/10-cpp.md
@@ -1,4 +1,5 @@
---
+toc_max_heading_level: 4
sidebar_label: C/C++
title: C/C++ Client Library
slug: /tdengine-reference/client-libraries/cpp
diff --git a/docs/en/14-reference/05-connector/14-java.md b/docs/en/14-reference/05-connector/14-java.md
index 48302b9d3b..2781f26c24 100644
--- a/docs/en/14-reference/05-connector/14-java.md
+++ b/docs/en/14-reference/05-connector/14-java.md
@@ -1,4 +1,5 @@
---
+toc_max_heading_level: 4
sidebar_label: Java
title: Java Client Library
slug: /tdengine-reference/client-libraries/java
diff --git a/docs/en/14-reference/05-connector/20-go.md b/docs/en/14-reference/05-connector/20-go.md
index bf0e6dd979..578150f0fa 100644
--- a/docs/en/14-reference/05-connector/20-go.md
+++ b/docs/en/14-reference/05-connector/20-go.md
@@ -1,4 +1,5 @@
---
+toc_max_heading_level: 4
sidebar_label: Go
title: Go Client Library
slug: /tdengine-reference/client-libraries/go
diff --git a/docs/en/14-reference/05-connector/26-rust.md b/docs/en/14-reference/05-connector/26-rust.md
index 637d009b8c..8de5d628de 100644
--- a/docs/en/14-reference/05-connector/26-rust.md
+++ b/docs/en/14-reference/05-connector/26-rust.md
@@ -1,4 +1,5 @@
---
+toc_max_heading_level: 4
sidebar_label: Rust
title: Rust Client Library
slug: /tdengine-reference/client-libraries/rust
diff --git a/docs/en/14-reference/05-connector/35-node.md b/docs/en/14-reference/05-connector/35-node.md
index 19dae0357f..49b1d200cf 100644
--- a/docs/en/14-reference/05-connector/35-node.md
+++ b/docs/en/14-reference/05-connector/35-node.md
@@ -1,4 +1,5 @@
---
+toc_max_heading_level: 4
sidebar_label: Node.js
title: Node.js Client Library
slug: /tdengine-reference/client-libraries/node
diff --git a/docs/en/14-reference/05-connector/40-csharp.md b/docs/en/14-reference/05-connector/40-csharp.md
index 8e51cb319b..c9c9f95228 100644
--- a/docs/en/14-reference/05-connector/40-csharp.md
+++ b/docs/en/14-reference/05-connector/40-csharp.md
@@ -1,4 +1,5 @@
---
+toc_max_heading_level: 4
sidebar_label: C#
title: C# Client Library
slug: /tdengine-reference/client-libraries/csharp
From a5bafca5fbe2443a6d680349c6e599a7ba477fda Mon Sep 17 00:00:00 2001
From: Haolin Wang
Date: Mon, 30 Dec 2024 14:18:55 +0800
Subject: [PATCH 28/35] test: insert into table from csv file
---
tests/parallel_test/cases.task | 1 +
tests/system-test/1-insert/insert_from_csv.py | 47 +++++++++++++++++++
.../1-insert/test_insert_from_csv.csv | 5 ++
tests/system-test/simpletest.bat | 1 +
4 files changed, 54 insertions(+)
create mode 100644 tests/system-test/1-insert/insert_from_csv.py
create mode 100644 tests/system-test/1-insert/test_insert_from_csv.csv
diff --git a/tests/parallel_test/cases.task b/tests/parallel_test/cases.task
index 9047b0f3af..3eeb583c11 100644
--- a/tests/parallel_test/cases.task
+++ b/tests/parallel_test/cases.task
@@ -443,6 +443,7 @@
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/InsertFuturets.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/insert_wide_column.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/insert_column_value.py
+,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/insert_from_csv.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/rowlength64k_benchmark.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/rowlength64k.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/rowlength64k.py -R
diff --git a/tests/system-test/1-insert/insert_from_csv.py b/tests/system-test/1-insert/insert_from_csv.py
new file mode 100644
index 0000000000..38f6a8b09d
--- /dev/null
+++ b/tests/system-test/1-insert/insert_from_csv.py
@@ -0,0 +1,47 @@
+import taos
+import sys
+import datetime
+import inspect
+import random
+from util.dnodes import TDDnode
+from util.dnodes import tdDnodes
+
+from util.log import *
+from util.sql import *
+from util.cases import *
+
+class TDTestCase:
+ def init(self, conn, logSql, replicaVar=1):
+ self.replicaVar = int(replicaVar)
+ tdLog.debug(f"start to excute {__file__}")
+ tdSql.init(conn.cursor(), True)
+
+ self.testcasePath = os.path.split(__file__)[0]
+ self.testcasePath = self.testcasePath.replace('\\', '//')
+ self.database = "test_insert_csv_db"
+ self.table = "test_insert_csv_tbl"
+
+ def insert_from_csv(self):
+ tdSql.execute(f"drop database if exists {self.database}")
+ tdSql.execute(f"create database {self.database}")
+ tdSql.execute(f"use {self.database}")
+ tdSql.execute(f"create table {self.table} (ts timestamp, c1 nchar(16), c2 double, c3 int)")
+ tdSql.execute(f"insert into {self.table} file '{self.testcasePath}//test_insert_from_csv.csv'")
+ tdSql.query(f"select count(*) from {self.table}")
+ tdSql.checkData(0, 0, 5)
+
+ def run(self):
+ tdSql.prepare()
+
+ startTime_all = time.time()
+ self.insert_from_csv()
+ endTime_all = time.time()
+ print("total time %ds" % (endTime_all - startTime_all))
+
+ def stop(self):
+ tdSql.close()
+ tdLog.success("%s successfully executed" % __file__)
+
+
+tdCases.addWindows(__file__, TDTestCase())
+tdCases.addLinux(__file__, TDTestCase())
diff --git a/tests/system-test/1-insert/test_insert_from_csv.csv b/tests/system-test/1-insert/test_insert_from_csv.csv
new file mode 100644
index 0000000000..966af8c27a
--- /dev/null
+++ b/tests/system-test/1-insert/test_insert_from_csv.csv
@@ -0,0 +1,5 @@
+'2024-12-13 09:30:00.050','ABCDEF68',24.774736842805263,200
+'2024-12-13 09:30:00.060','ABCDEF68',24.774736842805263,201
+'2024-12-13 09:30:00.080','ABCDEF68',24.774736842805263,202
+'2024-12-13 09:30:00.100','ABCDEF68',24.774736842805263,203
+'2024-12-13 09:30:00.110','ABCDEF68',24.774736842805263,204
diff --git a/tests/system-test/simpletest.bat b/tests/system-test/simpletest.bat
index a1f7273ad4..5948c7fc80 100644
--- a/tests/system-test/simpletest.bat
+++ b/tests/system-test/simpletest.bat
@@ -13,6 +13,7 @@ python3 .\test.py -f 0-others\cachemodel.py
@REM python3 .\test.py -f 0-others\fsync.py
python3 .\test.py -f 1-insert\influxdb_line_taosc_insert.py
+python3 .\test.py -f 1-insert\insert_from_csv.py
@REM python3 .\test.py -f 1-insert\opentsdb_telnet_line_taosc_insert.py
@REM python3 .\test.py -f 1-insert\opentsdb_json_taosc_insert.py
@REM #python3 .\test.py -f 1-insert\test_stmt_muti_insert_query.py
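The new test above exercises TDengine's `INSERT INTO ... FILE` bulk-load path. As a companion, a minimal standalone sketch of the same operation with the Python connector; the database name, table name, and CSV path below are illustrative, and each CSV row must match the target table's column order:

```python
import taos

# Illustrative sketch of the CSV bulk-load path exercised by the test above.
# Database name, table name and CSV path are made up for this example; each CSV
# row must match the column order of the target table (ts, c1, c2, c3).
conn = taos.connect(user="root", password="taosdata", host="localhost", port=6030)
try:
    conn.execute("DROP DATABASE IF EXISTS csv_demo")
    conn.execute("CREATE DATABASE csv_demo")
    conn.execute("USE csv_demo")
    conn.execute("CREATE TABLE demo_tbl (ts TIMESTAMP, c1 NCHAR(16), c2 DOUBLE, c3 INT)")

    # Bulk-load all rows from the CSV file in a single statement.
    conn.execute("INSERT INTO demo_tbl FILE '/tmp/test_insert_from_csv.csv'")

    result = conn.query("SELECT COUNT(*) FROM demo_tbl")
    for row in result:
        print(f"rows loaded: {row[0]}")
finally:
    conn.close()
```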
From 1126f97853d59627982bf6fe729642d704bed77a Mon Sep 17 00:00:00 2001
From: WANG Xu
Date: Mon, 30 Dec 2024 18:25:20 +0800
Subject: [PATCH 29/35] chore: add two more badges [skip ci]
---
README.md | 4 ++++
1 file changed, 4 insertions(+)
diff --git a/README.md b/README.md
index ff72412434..ed04fafee1 100644
--- a/README.md
+++ b/README.md
@@ -12,6 +12,7 @@
[](https://github.com/taosdata/TDengine/actions/workflows/taosd-ci-build.yml)
[](https://coveralls.io/github/taosdata/TDengine?branch=3.0)
+
[](https://bestpractices.coreinfrastructure.org/projects/4201)
[](https://twitter.com/tdenginedb)
@@ -19,6 +20,9 @@
[](https://discord.com/invite/VZdSuUg4pS)
[](https://www.linkedin.com/company/tdengine)
[](https://stackoverflow.com/questions/tagged/tdengine)
+
+
+
English | [简体中文](README-CN.md) | [TDengine Cloud](https://cloud.tdengine.com) | [Learn more about TSDB](https://tdengine.com/tsdb/)
From 24f6828b9f90b5ef571756dbeabb6881455b78ba Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 18:27:39 +0800
Subject: [PATCH 30/35] docs: fix tags double mistaken
---
docs/en/14-reference/05-connector/30-python.md | 10 +++++-----
1 file changed, 5 insertions(+), 5 deletions(-)
diff --git a/docs/en/14-reference/05-connector/30-python.md b/docs/en/14-reference/05-connector/30-python.md
index 05ae323f9a..7cf5858c2f 100644
--- a/docs/en/14-reference/05-connector/30-python.md
+++ b/docs/en/14-reference/05-connector/30-python.md
@@ -436,8 +436,8 @@ TaosResult object can be iterated over to retrieve queried data.
- **Interface Description**:Creating an STMT2 object using a connection object
- **Parameter Description**
- `sql`: The bound SQL statement will call the `prepare` function if it is not empty
-    - `option` Pass in TaosStmt2Option class instance options
- - **Return Value**:stmt2 object
+    - `option` Pass in `TaosStmt2Option` class instance
+ - **Return Value**:STMT2 object
- **Exception**:Throws `ConnectionError` on failure
- `def prepare(self, sql)`
- **Interface Description**:Bind a precompiled SQL statement
@@ -449,12 +449,12 @@ TaosResult object can be iterated over to retrieve queried data.
- **Parameter Description**:
- `tbnames`:Bind table name array, data type is list
- `tags`: Bind tag column value array, data type is list
- - `tags`: Bind a regular column value array with a data type of list
+    - `datas`: Bind data column value array, data type is list
- **Exception**:Throws `StatementError` on failure
- `def bind_param_with_tables(self, tables)`
- - **Interface Description**:Bind data in an independent table format. Independent tables are organized by table units, with table names, TAG values, and regular column numerical attributes in each table
+  - **Interface Description**:Bind data in an independent table format. Independent tables are organized by table unit; each table object carries its table name, TAG values, and data column values
- **Parameter Description**:
- - `tables`: `BindTable` Independent Table Object Array
+ - `tables`: `BindTable` Independent table object array
- **Exception**:Throws `StatementError` on failure
- `def execute(self) -> int:`
- **Interface Description**:Execute to write all bound data
From 7c36eda698dabe5064a42d856e5c08cbc904fc01 Mon Sep 17 00:00:00 2001
From: Shengliang Guan
Date: Mon, 30 Dec 2024 19:36:25 +0800
Subject: [PATCH 31/35] Revert "Fix infinite loop when insert from CSV file on
Windows"
---
source/os/src/osFile.c | 2 +-
tests/parallel_test/cases.task | 1 -
tests/system-test/1-insert/insert_from_csv.py | 47 -------------------
.../1-insert/test_insert_from_csv.csv | 5 --
tests/system-test/simpletest.bat | 1 -
5 files changed, 1 insertion(+), 55 deletions(-)
delete mode 100644 tests/system-test/1-insert/insert_from_csv.py
delete mode 100644 tests/system-test/1-insert/test_insert_from_csv.csv
diff --git a/source/os/src/osFile.c b/source/os/src/osFile.c
index b1198e1cb2..8a2606c4c2 100644
--- a/source/os/src/osFile.c
+++ b/source/os/src/osFile.c
@@ -1403,7 +1403,7 @@ int64_t taosGetLineFile(TdFilePtr pFile, char **__restrict ptrBuf) {
}
(*ptrBuf)[totalBytesRead] = '\0';
- ret = (totalBytesRead > 0 ? totalBytesRead : -1); // -1 means EOF
+ ret = totalBytesRead;
#else
size_t len = 0;
ret = getline(ptrBuf, &len, pFile->fp);
diff --git a/tests/parallel_test/cases.task b/tests/parallel_test/cases.task
index 3eeb583c11..9047b0f3af 100644
--- a/tests/parallel_test/cases.task
+++ b/tests/parallel_test/cases.task
@@ -443,7 +443,6 @@
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/InsertFuturets.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/insert_wide_column.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/insert_column_value.py
-,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/insert_from_csv.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/rowlength64k_benchmark.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/rowlength64k.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/rowlength64k.py -R
diff --git a/tests/system-test/1-insert/insert_from_csv.py b/tests/system-test/1-insert/insert_from_csv.py
deleted file mode 100644
index 38f6a8b09d..0000000000
--- a/tests/system-test/1-insert/insert_from_csv.py
+++ /dev/null
@@ -1,47 +0,0 @@
-import taos
-import sys
-import datetime
-import inspect
-import random
-from util.dnodes import TDDnode
-from util.dnodes import tdDnodes
-
-from util.log import *
-from util.sql import *
-from util.cases import *
-
-class TDTestCase:
- def init(self, conn, logSql, replicaVar=1):
- self.replicaVar = int(replicaVar)
- tdLog.debug(f"start to excute {__file__}")
- tdSql.init(conn.cursor(), True)
-
- self.testcasePath = os.path.split(__file__)[0]
- self.testcasePath = self.testcasePath.replace('\\', '//')
- self.database = "test_insert_csv_db"
- self.table = "test_insert_csv_tbl"
-
- def insert_from_csv(self):
- tdSql.execute(f"drop database if exists {self.database}")
- tdSql.execute(f"create database {self.database}")
- tdSql.execute(f"use {self.database}")
- tdSql.execute(f"create table {self.table} (ts timestamp, c1 nchar(16), c2 double, c3 int)")
- tdSql.execute(f"insert into {self.table} file '{self.testcasePath}//test_insert_from_csv.csv'")
- tdSql.query(f"select count(*) from {self.table}")
- tdSql.checkData(0, 0, 5)
-
- def run(self):
- tdSql.prepare()
-
- startTime_all = time.time()
- self.insert_from_csv()
- endTime_all = time.time()
- print("total time %ds" % (endTime_all - startTime_all))
-
- def stop(self):
- tdSql.close()
- tdLog.success("%s successfully executed" % __file__)
-
-
-tdCases.addWindows(__file__, TDTestCase())
-tdCases.addLinux(__file__, TDTestCase())
diff --git a/tests/system-test/1-insert/test_insert_from_csv.csv b/tests/system-test/1-insert/test_insert_from_csv.csv
deleted file mode 100644
index 966af8c27a..0000000000
--- a/tests/system-test/1-insert/test_insert_from_csv.csv
+++ /dev/null
@@ -1,5 +0,0 @@
-'2024-12-13 09:30:00.050','ABCDEF68',24.774736842805263,200
-'2024-12-13 09:30:00.060','ABCDEF68',24.774736842805263,201
-'2024-12-13 09:30:00.080','ABCDEF68',24.774736842805263,202
-'2024-12-13 09:30:00.100','ABCDEF68',24.774736842805263,203
-'2024-12-13 09:30:00.110','ABCDEF68',24.774736842805263,204
diff --git a/tests/system-test/simpletest.bat b/tests/system-test/simpletest.bat
index 5948c7fc80..a1f7273ad4 100644
--- a/tests/system-test/simpletest.bat
+++ b/tests/system-test/simpletest.bat
@@ -13,7 +13,6 @@ python3 .\test.py -f 0-others\cachemodel.py
@REM python3 .\test.py -f 0-others\fsync.py
python3 .\test.py -f 1-insert\influxdb_line_taosc_insert.py
-python3 .\test.py -f 1-insert\insert_from_csv.py
@REM python3 .\test.py -f 1-insert\opentsdb_telnet_line_taosc_insert.py
@REM python3 .\test.py -f 1-insert\opentsdb_json_taosc_insert.py
@REM #python3 .\test.py -f 1-insert\test_stmt_muti_insert_query.py
From 977a452ea17e3d163abe6e55e4bc2ee2e921bc70 Mon Sep 17 00:00:00 2001
From: Haolin Wang
Date: Mon, 30 Dec 2024 19:41:41 +0800
Subject: [PATCH 32/35] Revert "Revert "Fix infinite loop when insert from CSV
file on Windows""
---
source/os/src/osFile.c | 2 +-
tests/parallel_test/cases.task | 1 +
tests/system-test/1-insert/insert_from_csv.py | 47 +++++++++++++++++++
.../1-insert/test_insert_from_csv.csv | 5 ++
tests/system-test/simpletest.bat | 1 +
5 files changed, 55 insertions(+), 1 deletion(-)
create mode 100644 tests/system-test/1-insert/insert_from_csv.py
create mode 100644 tests/system-test/1-insert/test_insert_from_csv.csv
diff --git a/source/os/src/osFile.c b/source/os/src/osFile.c
index 8a2606c4c2..b1198e1cb2 100644
--- a/source/os/src/osFile.c
+++ b/source/os/src/osFile.c
@@ -1403,7 +1403,7 @@ int64_t taosGetLineFile(TdFilePtr pFile, char **__restrict ptrBuf) {
}
(*ptrBuf)[totalBytesRead] = '\0';
- ret = totalBytesRead;
+ ret = (totalBytesRead > 0 ? totalBytesRead : -1); // -1 means EOF
#else
size_t len = 0;
ret = getline(ptrBuf, &len, pFile->fp);
diff --git a/tests/parallel_test/cases.task b/tests/parallel_test/cases.task
index 9047b0f3af..3eeb583c11 100644
--- a/tests/parallel_test/cases.task
+++ b/tests/parallel_test/cases.task
@@ -443,6 +443,7 @@
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/InsertFuturets.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/insert_wide_column.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/insert_column_value.py
+,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/insert_from_csv.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/rowlength64k_benchmark.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/rowlength64k.py
,,y,system-test,./pytest.sh python3 ./test.py -f 1-insert/rowlength64k.py -R
diff --git a/tests/system-test/1-insert/insert_from_csv.py b/tests/system-test/1-insert/insert_from_csv.py
new file mode 100644
index 0000000000..38f6a8b09d
--- /dev/null
+++ b/tests/system-test/1-insert/insert_from_csv.py
@@ -0,0 +1,47 @@
+import taos
+import sys
+import datetime
+import inspect
+import random
+from util.dnodes import TDDnode
+from util.dnodes import tdDnodes
+
+from util.log import *
+from util.sql import *
+from util.cases import *
+
+class TDTestCase:
+ def init(self, conn, logSql, replicaVar=1):
+ self.replicaVar = int(replicaVar)
+ tdLog.debug(f"start to excute {__file__}")
+ tdSql.init(conn.cursor(), True)
+
+ self.testcasePath = os.path.split(__file__)[0]
+ self.testcasePath = self.testcasePath.replace('\\', '//')
+ self.database = "test_insert_csv_db"
+ self.table = "test_insert_csv_tbl"
+
+ def insert_from_csv(self):
+ tdSql.execute(f"drop database if exists {self.database}")
+ tdSql.execute(f"create database {self.database}")
+ tdSql.execute(f"use {self.database}")
+ tdSql.execute(f"create table {self.table} (ts timestamp, c1 nchar(16), c2 double, c3 int)")
+ tdSql.execute(f"insert into {self.table} file '{self.testcasePath}//test_insert_from_csv.csv'")
+ tdSql.query(f"select count(*) from {self.table}")
+ tdSql.checkData(0, 0, 5)
+
+ def run(self):
+ tdSql.prepare()
+
+ startTime_all = time.time()
+ self.insert_from_csv()
+ endTime_all = time.time()
+ print("total time %ds" % (endTime_all - startTime_all))
+
+ def stop(self):
+ tdSql.close()
+ tdLog.success("%s successfully executed" % __file__)
+
+
+tdCases.addWindows(__file__, TDTestCase())
+tdCases.addLinux(__file__, TDTestCase())
diff --git a/tests/system-test/1-insert/test_insert_from_csv.csv b/tests/system-test/1-insert/test_insert_from_csv.csv
new file mode 100644
index 0000000000..966af8c27a
--- /dev/null
+++ b/tests/system-test/1-insert/test_insert_from_csv.csv
@@ -0,0 +1,5 @@
+'2024-12-13 09:30:00.050','ABCDEF68',24.774736842805263,200
+'2024-12-13 09:30:00.060','ABCDEF68',24.774736842805263,201
+'2024-12-13 09:30:00.080','ABCDEF68',24.774736842805263,202
+'2024-12-13 09:30:00.100','ABCDEF68',24.774736842805263,203
+'2024-12-13 09:30:00.110','ABCDEF68',24.774736842805263,204
diff --git a/tests/system-test/simpletest.bat b/tests/system-test/simpletest.bat
index a1f7273ad4..5948c7fc80 100644
--- a/tests/system-test/simpletest.bat
+++ b/tests/system-test/simpletest.bat
@@ -13,6 +13,7 @@ python3 .\test.py -f 0-others\cachemodel.py
@REM python3 .\test.py -f 0-others\fsync.py
python3 .\test.py -f 1-insert\influxdb_line_taosc_insert.py
+python3 .\test.py -f 1-insert\insert_from_csv.py
@REM python3 .\test.py -f 1-insert\opentsdb_telnet_line_taosc_insert.py
@REM python3 .\test.py -f 1-insert\opentsdb_json_taosc_insert.py
@REM #python3 .\test.py -f 1-insert\test_stmt_muti_insert_query.py
From 6bcbf6cc37a0243e28a868d0dd5b92d2d0e17a92 Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 20:22:29 +0800
Subject: [PATCH 33/35] fix: taospy version update to 2.7.21
---
docs/en/14-reference/05-connector/30-python.md | 2 +-
docs/zh/14-reference/05-connector/30-python.mdx | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/docs/en/14-reference/05-connector/30-python.md b/docs/en/14-reference/05-connector/30-python.md
index 7cf5858c2f..19247e5364 100644
--- a/docs/en/14-reference/05-connector/30-python.md
+++ b/docs/en/14-reference/05-connector/30-python.md
@@ -56,7 +56,7 @@ Python Connector historical versions (it is recommended to use the latest versio
|Python Connector Version | Major Changes | TDengine Version|
| -------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------- |
-|2.7.20 | Native supports STMT2 writing | - |
+|2.7.21 | Native supports STMT2 writing | - |
|2.7.19 | Support Apache Superset connection to TDengine Cloud data source | - |
|2.7.18 | Support Apache Superset BI tools. | - |
|2.7.16 | Add subscription configuration (session.timeout.ms, max.poll.interval.ms). | - |
diff --git a/docs/zh/14-reference/05-connector/30-python.mdx b/docs/zh/14-reference/05-connector/30-python.mdx
index a1fedfd92c..42ed67e927 100644
--- a/docs/zh/14-reference/05-connector/30-python.mdx
+++ b/docs/zh/14-reference/05-connector/30-python.mdx
@@ -52,7 +52,7 @@ Python Connector 历史版本(建议使用最新版本的 `taospy`):
| Python Connector 版本 | 主要变化 | TDengine 版本 |
| -------------------- | ---------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------- |
-| 2.7.20 | Native 支持 STMT2 写入 | - |
+| 2.7.21 | Native 支持 STMT2 写入 | - |
| 2.7.19 | 支持 Apache Superset 连接 TDengine Cloud 数据源 | - |
| 2.7.18 | 支持 Apache Superset 产品连接本地 TDengine 数据源 | - |
| 2.7.16 | 新增订阅配置 (session.timeout.ms, max.poll.interval.ms) | - |
From 9bf4e232895c384582569cd91a46aefe459b4cd4 Mon Sep 17 00:00:00 2001
From: Alex Duan <417921451@qq.com>
Date: Mon, 30 Dec 2024 20:53:06 +0800
Subject: [PATCH 34/35] fix: set latest taospy version
---
tests/parallel_test/cases.task | 20 ++++++++++----------
tests/parallel_test/run_case.sh | 2 +-
2 files changed, 11 insertions(+), 11 deletions(-)
diff --git a/tests/parallel_test/cases.task b/tests/parallel_test/cases.task
index 9047b0f3af..a3e87a5334 100644
--- a/tests/parallel_test/cases.task
+++ b/tests/parallel_test/cases.task
@@ -6,6 +6,16 @@
,,n,unit-test,bash test.sh
+#docs-examples test
+,,n,docs-examples-test,bash c.sh
+,,n,docs-examples-test,bash python.sh
+,,n,docs-examples-test,bash node.sh
+,,n,docs-examples-test,bash csharp.sh
+,,n,docs-examples-test,bash jdbc.sh
+,,n,docs-examples-test,bash rust.sh
+,,n,docs-examples-test,bash go.sh
+,,n,docs-examples-test,bash test_R.sh
+
#
# army-test
#
@@ -1661,13 +1671,3 @@
,,n,develop-test,python3 ./test.py -f 5-taos-tools/taosbenchmark/sml_json_alltypes.py
,,n,develop-test,python3 ./test.py -f 5-taos-tools/taosbenchmark/taosdemoTestQueryWithJson.py -R
,,n,develop-test,python3 ./test.py -f 5-taos-tools/taosbenchmark/telnet_tcp.py -R
-
-#docs-examples test
-,,n,docs-examples-test,bash c.sh
-,,n,docs-examples-test,bash python.sh
-,,n,docs-examples-test,bash node.sh
-,,n,docs-examples-test,bash csharp.sh
-,,n,docs-examples-test,bash jdbc.sh
-,,n,docs-examples-test,bash rust.sh
-,,n,docs-examples-test,bash go.sh
-,,n,docs-examples-test,bash test_R.sh
diff --git a/tests/parallel_test/run_case.sh b/tests/parallel_test/run_case.sh
index a78d0aa4a4..5dc1cef673 100755
--- a/tests/parallel_test/run_case.sh
+++ b/tests/parallel_test/run_case.sh
@@ -77,7 +77,7 @@ md5sum /usr/lib/libtaos.so.1
md5sum /home/TDinternal/debug/build/lib/libtaos.so
#get python connector and update: taospy 2.7.16 taos-ws-py 0.3.5
-pip3 install taospy==2.7.16
+pip3 install taospy==2.7.21
pip3 install taos-ws-py==0.3.5
$TIMEOUT_CMD $cmd
RET=$?
From 7304750f47b5c4e0b0caa7b1efa208c0fe4045f7 Mon Sep 17 00:00:00 2001
From: Shengliang Guan
Date: Tue, 31 Dec 2024 09:07:39 +0800
Subject: [PATCH 35/35] doc: update s3 support
---
docs/zh/08-operation/12-multi.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/zh/08-operation/12-multi.md b/docs/zh/08-operation/12-multi.md
index d18957d7d2..122fb9c2f3 100644
--- a/docs/zh/08-operation/12-multi.md
+++ b/docs/zh/08-operation/12-multi.md
@@ -60,7 +60,7 @@ dataDir /mnt/data6 2 0
## 对象存储
-本节介绍在 TDengine Enterprise 如何使用 S3 对象存储,本功能基于通用 S3 SDK 实现,对各个 S3 平台的访问参数进行了兼容适配,可以访问如 minio,腾讯云 COS,Amazon S3 等对象存储服务。通过适当的参数配置,可以把大部分较冷的时序数据存储到 S3 服务中。
+本节介绍在 TDengine Enterprise 如何使用 S3 对象存储,本功能基于通用 S3 SDK 实现,对各个 S3 平台的访问参数进行了兼容适配,可以访问如 Amazon S3、Azure Blob、华为 OBS、腾讯云 COS、阿里云 OSS、minio等对象存储服务。通过适当的参数配置,可以把大部分较冷的时序数据存储到 S3 服务中。
**注意** 在配合多级存储使用时,每一级存储介质上保存的数据都有可能被按规则备份到远程对象存储中并删除本地数据文件。