Merge pull request #30302 from taosdata/merge/mainto3.0
merge: from main to 3.0 branch
This commit is contained in: commit 3936682f6b
@@ -0,0 +1,180 @@
---
sidebar_label: Perspective
title: Integration With Perspective
toc_max_heading_level: 4
---

Perspective is a powerful open-source data visualization library developed by [Prospective.co](https://www.perspective.co/). Built on WebAssembly and Web Workers, it enables interactive real-time data analysis in web applications and provides high-performance visualization in the browser. With it, developers can build dashboards and charts that update in real time, and users can easily interact with the data, filtering, sorting, and exploring it as needed. It is highly flexible, adapting to various data formats and business scenarios; fast, keeping interaction smooth even with large-scale data; and easy to use, so both beginners and professional developers can quickly build visualization interfaces.

For data connectivity, Perspective fully supports TDengine data sources through the TDengine Python connector. It can efficiently retrieve various kinds of data from TDengine, such as massive time-series datasets, and provides real-time features including complex chart rendering, in-depth statistical analysis, and trend prediction, helping users gain insight into the value of their data. It is an ideal choice for building applications with demanding requirements for real-time data visualization and analysis.



## Prerequisites

Perform the following setup on a Linux system:

- TDengine is installed and running normally (both Enterprise and Community editions work).
- taosAdapter is running normally; see [taosAdapter Reference](../../../tdengine-reference/components/taosadapter/).
- Python 3.10 or later is installed (if not, see [Python Installation](https://docs.python.org/)).
- Download or clone the [perspective-connect-demo](https://github.com/taosdata/perspective-connect-demo) project. From the project root directory, run the `install.sh` script to download and install the TDengine client library and related dependencies locally.

## Data Analysis

**Step 1**: Run the `run.sh` script in the root directory of the [perspective-connect-demo](https://github.com/taosdata/perspective-connect-demo) project to start the Perspective service. The service fetches data from the TDengine database every 300 milliseconds and streams it to the web-based `Perspective Viewer`.

```shell
sh run.sh
```

**Step 2**: Start a static web server, then open the `prsp-viewer.html` resource in a browser to display the visualized data.

```shell
python -m http.server 8081
```



## Usage Instructions

### Write Data to TDengine

The `producer.py` script in the root directory of the [perspective-connect-demo](https://github.com/taosdata/perspective-connect-demo) project uses the TDengine Python connector to periodically insert data into the TDengine database. The script generates random data and inserts it, simulating the arrival of real-time data. It performs the following steps:

1. Establish a connection to TDengine.
2. Create the `power` database and the `meters` table.
3. Generate random data every 300 milliseconds and write it into the TDengine database.

For details on writing data with the Python connector, see [Python Parameter Binding](../../../tdengine-reference/client-libraries/python/#parameter-binding).
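To illustrate the kind of statement such a producer issues, here is a minimal sketch under the schema above. It is an assumption for illustration only: the child-table naming (`power.d{groupid}`) and tag values are hypothetical, and the real `producer.py` may use parameter binding rather than raw SQL.

```python
import random
import time

# sample tag values (hypothetical; the demo project may use different ones)
LOCATIONS = ["California.SanFrancisco", "California.LosAngeles"]

def make_insert_sql() -> str:
    """Build one INSERT statement with a random meter reading."""
    groupid = random.randint(1, 3)
    location = random.choice(LOCATIONS)
    ts = int(time.time() * 1000)                   # millisecond timestamp
    current = round(random.uniform(8.0, 12.0), 2)  # amperes
    voltage = random.randint(210, 230)             # volts
    phase = round(random.uniform(0.28, 0.32), 2)
    # write into a child table of the `meters` supertable, creating it if needed
    return (
        f"INSERT INTO power.d{groupid} USING power.meters "
        f"TAGS ('{location}', {groupid}) "
        f"VALUES ({ts}, {current}, {voltage}, {phase})"
    )

print(make_insert_sql())
```

A statement built this way can then be passed to the connector's `execute` call on each timer tick.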
### Load Data from TDengine

The `perspective_server.py` script in the root directory of the [perspective-connect-demo](https://github.com/taosdata/perspective-connect-demo) project starts a Perspective server that reads data from TDengine and streams it to a Perspective table over a Tornado WebSocket.

1. Start a Perspective server.
2. Establish a connection to TDengine.
3. Create a Perspective table (the table schema must match the column types of the table in the TDengine database).
4. Call `tornado.ioloop.PeriodicCallback` to start a scheduled task that updates the data in the Perspective table. Sample code:

```python
def perspective_thread(perspective_server: perspective.Server, tdengine_conn: taosws.Connection):
    """
    Create a new Perspective table and update it with new data every PERSPECTIVE_REFRESH_RATE ms
    """
    # create a new Perspective table
    client = perspective_server.new_local_client()
    schema = {
        "timestamp": datetime,
        "location": str,
        "groupid": int,
        "current": float,
        "voltage": int,
        "phase": float,
    }
    # define the table schema
    table = client.table(
        schema,
        limit=1000,  # maximum number of rows in the table
        name=PERSPECTIVE_TABLE_NAME,  # table name; use this with perspective-viewer on the client side
    )
    logger.info("Created new Perspective table")

    # update with new data
    def updater():
        data = read_tdengine(tdengine_conn)
        table.update(data)
        logger.debug(f"Updated Perspective table: {len(data)} rows")

    logger.info(f"Starting tornado ioloop update loop every {PERSPECTIVE_REFRESH_RATE} milliseconds")
    # start the periodic callback to update the table data
    callback = tornado.ioloop.PeriodicCallback(callback=updater, callback_time=PERSPECTIVE_REFRESH_RATE)
    callback.start()
```
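The `read_tdengine` helper is not shown above. Perspective's `Table.update` accepts several data shapes, including a list of row dictionaries keyed by the schema's column names, so one minimal way to adapt a query result is the sketch below. It assumes, for illustration, that the connector yields rows as tuples in the same order as the schema; the actual helper in the demo may differ.

```python
from datetime import datetime

# column order assumed to match the Perspective schema defined above
SCHEMA_COLUMNS = ["timestamp", "location", "groupid", "current", "voltage", "phase"]

def rows_to_records(rows):
    """Turn an iterable of row tuples into the list-of-dicts shape
    accepted by Perspective's Table.update()."""
    return [dict(zip(SCHEMA_COLUMNS, row)) for row in rows]

# fake rows standing in for a taosws query result
fake_rows = [
    (datetime(2025, 1, 1, 0, 0, 0), "California.SanFrancisco", 2, 10.3, 219, 0.31),
    (datetime(2025, 1, 1, 0, 0, 1), "California.LosAngeles", 3, 11.8, 221, 0.29),
]
records = rows_to_records(fake_rows)
print(records[0]["voltage"])  # → 219
```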
### HTML Page Configuration

The `prsp-viewer.html` file in the root directory of the [perspective-connect-demo](https://github.com/taosdata/perspective-connect-demo) project embeds the `Perspective Viewer` in an HTML page. It connects to the Perspective server over a WebSocket and displays real-time data according to the chart configuration.

- Configure the charts to display and the rules for data analysis.
- Establish a WebSocket connection with the Perspective server.
- Import the Perspective library, connect to the Perspective server via WebSocket, and load the `meters_values` table to display dynamic data.

```html
<script type="module">
    // import the Perspective library
    import perspective from "https://unpkg.com/@finos/perspective@3.1.3/dist/cdn/perspective.js";

    document.addEventListener("DOMContentLoaded", async function () {
        // an asynchronous function for loading the view
        async function load_viewer(viewerId, config) {
            const table_name = "meters_values";
            try {
                const viewer = document.getElementById(viewerId);
                // connect to the Perspective WebSocket server
                const websocket = await perspective.websocket("ws://localhost:8085/websocket");
                // open the server-side table
                const server_table = await websocket.open_table(table_name);
                // load the table into the viewer
                await viewer.load(server_table);
                // apply the view configuration
                await viewer.restore(config);
            } catch (error) {
                console.error(`Failed to get data from ${table_name}, err: ${error}`);
            }
        }

        // configuration of the view
        const config = {
            "version": "3.3.1",          // Perspective library version (compatibility identifier)
            "plugin": "Datagrid",        // view mode: Datagrid (table) or D3FC (chart)
            "plugin_config": {           // plugin-specific configuration
                "columns": {
                    "current": {
                        "width": 150     // column width in pixels
                    }
                },
                "edit_mode": "READ_ONLY", // edit mode: READ_ONLY (immutable) or EDIT (editable)
                "scroll_lock": false      // whether to lock the scroll position
            },
            "columns_config": {},        // custom column configuration (colors, formatting, etc.)
            "settings": true,            // whether to show the settings panel
            "theme": "Power Meters",     // custom theme name (must be pre-defined)
            "title": "Meters list data", // view title
            "group_by": ["location", "groupid"], // row grouping fields (equivalent to `row_pivots`)
            "split_by": [],              // column grouping fields (equivalent to `column_pivots`)
            "columns": [                 // columns to display (in order)
                "timestamp",
                "location",
                "current",
                "voltage",
                "phase"
            ],
            "filter": [],                // filter conditions (array of triplets)
            "sort": [],                  // sorting rules (format: [field, direction])
            "expressions": {},           // custom expressions (e.g., calculated columns)
            "aggregates": {              // aggregation configuration
                "timestamp": "last",     // last: take the most recent value
                "voltage": "last",
                "phase": "last",
                "current": "last"
            }
        };

        // load the first view
        await load_viewer("prsp-viewer-1", config);
    });
</script>

<!-- Define the HTML structure of the dashboard -->
<div id="dashboard">
    <div class="viewer-container">
        <perspective-viewer id="prsp-viewer-1" theme="Pro Dark"></perspective-viewer>
    </div>
</div>
```

## Reference Materials

- [Perspective Docs](https://perspective.finos.org/)
- [TDengine Python Connector](../../../tdengine-reference/client-libraries/python/)
- [TDengine Stream Processing](../../../advanced-features/stream-processing/)
Binary file not shown. (After: size 50 KiB)
Binary file not shown. (After: size 62 KiB)
@@ -379,6 +379,7 @@ Specify the configuration parameters for tag and data columns in `super_tables`

`query_times` specifies the number of times to run the query; numeric type.

**Note:** from version 3.3.5.6 onward, configuring both `specified_table_query` and `super_table_query` in the same JSON file is no longer supported.

For other common parameters, see [General Configuration Parameters](#general-configuration-parameters).
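For instance, `query_times` sits at the top level of a query-type configuration file (an illustrative fragment assembled from fields used elsewhere in this reference, not a complete file):

```json
{
    "filetype": "query",
    "databases": "test",
    "query_times": 100,
    "query_mode": "taosc",
    "specified_table_query": {
        "concurrent": 3,
        "sqls": [
            {
                "sql": "select last_row(*) from meters",
                "result": "./query_res.txt"
            }
        ]
    }
}
```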
@@ -508,6 +509,15 @@ Note: Data types in the taosBenchmark configuration file must be in lowercase to

</details>

<details>
<summary>queryStb.json</summary>

```json
{{#include /TDengine/tools/taos-tools/example/queryStb.json}}
```

</details>

#### Subscription Example

<details>
@@ -1,4 +1,3 @@
const { sleep } = require("@tdengine/websocket");
const taos = require("@tdengine/websocket");

const db = 'power';
@@ -290,6 +290,8 @@ taosBenchmark -f <json file>

For other common parameters, see [General Configuration Parameters](#通用配置参数).

**Note:** from v3.3.5.6 onward, configuring both `specified_table_query` and `super_table_query` in the same JSON file is no longer supported.

#### Run Specified Query Statements

Configuration parameters for querying specified tables (supertables, child tables, or basic tables can all be specified) are set in `specified_table_query`.
@@ -416,6 +418,15 @@ taosBenchmark -f <json file>

</details>

<details>
<summary>queryStb.json</summary>

```json
{{#include /TDengine/tools/taos-tools/example/queryStb.json}}
```

</details>

### Subscription JSON Example

<details>
@@ -10,7 +10,7 @@
"query_times": 100,
"query_mode": "taosc",
"specified_table_query": {
    "concurrent": 3,
    "threads": 3,
    "sqls": [
        {
            "sql": "select last_row(*) from meters"

@@ -10,7 +10,7 @@
"query_times": 100,
"query_mode": "taosc",
"specified_table_query": {
    "concurrent": 4,
    "threads": 4,
    "mixed_query": "yes",
    "sqls": [
        {

@@ -10,7 +10,7 @@
"query_times": 5,
"query_mode": "taosc",
"specified_table_query": {
    "concurrent": 5,
    "threads": 5,
    "query_interval": 1000,
    "mixed_query": "yes",
    "batch_query": "yes",

@@ -10,7 +10,7 @@
"query_times": 5,
"query_mode": "taosc",
"specified_table_query": {
    "concurrent": 5,
    "threads": 5,
    "query_interval": 100,
    "mixed_query": "yes",
    "batch_query": "no",

@@ -10,7 +10,7 @@
"query_times": 100,
"query_mode": "rest",
"specified_table_query": {
    "concurrent": 3,
    "threads": 3,
    "mixed_query": "yes",
    "sqls": [
        {

@@ -10,7 +10,7 @@
"query_times": 100,
"query_mode": "rest",
"specified_table_query": {
    "concurrent": 3,
    "threads": 3,
    "sqls": [
        {
            "sql": "select last_row(*) from meters"

@@ -12,7 +12,7 @@
"query_mode": "taosc",
"super_table_query": {
    "stblname": "meters",
    "concurrent": 3,
    "threads": 3,
    "query_interval": 0,
    "sqls": [
        {

@@ -12,7 +12,7 @@
"query_mode": "rest",
"super_table_query": {
    "stblname": "meters",
    "concurrent": 3,
    "threads": 3,
    "query_interval": 0,
    "sqls": [
        {
@@ -0,0 +1,257 @@
###################################################################
# Copyright (c) 2016 by TAOS Technologies, Inc.
# All rights reserved.
#
# This file is proprietary and confidential to TAOS Technologies.
# No part of this file may be reproduced, stored, transmitted,
# disclosed or used in any form or by any means other than as
# expressly provided by the written permission from Jianhui Tao
#
###################################################################

# -*- coding: utf-8 -*-
import os
import json
import sys
import time
import datetime
import platform
import subprocess

import frame
import frame.eos
import frame.etool
from frame.log import *
from frame.cases import *
from frame.sql import *
from frame.caseBase import *
from frame import *


# remove single and double quotation marks
def removeQuotation(origin):
    value = ""
    for c in origin:
        if c != '\'' and c != '"':
            value += c

    return value


class TDTestCase(TBase):
    def caseDescription(self):
        """
        taosBenchmark query -> basic test cases
        """

    def runSeconds(self, command, timeout=180):
        tdLog.info(f"runSeconds {command} ...")
        process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        process.wait(timeout)

        # get output
        output = process.stdout.read().decode(encoding="gbk")
        error = process.stderr.read().decode(encoding="gbk")
        return output, error

    def getKeyValue(self, content, key, end):
        # find key
        s = content.find(key)
        if s == -1:
            return False, ""

        # skip the key itself
        s += len(key)
        # skip blanks
        while s < len(content):
            if content[s] != " ":
                break
            s += 1

        # end check
        if s + 1 == len(content):
            return False, ""

        # find end
        if len(end) == 0:
            e = -1
        else:
            e = content.find(end, s)

        # get value
        if e == -1:
            value = content[s:]
        else:
            value = content[s:e]

        return True, value

    def getDbRows(self, times):
        sql = f"select count(*) from test.meters"
        tdSql.waitedQuery(sql, 1, times)
        dbRows = tdSql.getData(0, 0)
        return dbRows

    def checkItem(self, output, key, end, expect, equal):
        ret, value = self.getKeyValue(output, key, end)
        if ret == False:
            tdLog.exit(f"not found key:{key}. end:{end} output:\n{output}")

        fval = float(value)
        # compare
        if equal and fval != expect:
            tdLog.exit(f"check not expect. expect:{expect} real:{fval}, key:'{key}' end:'{end}' output:\n{output}")
        elif equal == False and fval <= expect:
            tdLog.exit(f"failed because {fval} <= {expect}, key:'{key}' end:'{end}' output:\n{output}")
        else:
            # succ
            if equal:
                tdLog.info(f"check successfully. key:'{key}' expect:{expect} real:{fval}")
            else:
                tdLog.info(f"check successfully. key:'{key}' {fval} > {expect}")

    def checkAfterRun(self, benchmark, jsonFile, specMode, tbCnt):
        # run
        cmd = f"{benchmark} -f {jsonFile}"
        output, error = self.runSeconds(cmd)

        if specMode:
            label = "specified_table_query"
        else:
            label = "super_table_query"

        #
        # check insert result
        #
        with open(jsonFile, "r") as file:
            data = json.load(file)

        queryTimes = data["query_times"]
        # continue_if_fail
        try:
            continueIfFail = data["continue_if_fail"]
        except:
            continueIfFail = "no"

        threads = data[label]["threads"]
        sqls = data[label]["sqls"]

        # batch_query
        try:
            batchQuery = data[label]["batch_query"]
        except:
            batchQuery = "no"

        # mixed_query
        try:
            mixedQuery = data[label]["mixed_query"]
        except:
            mixedQuery = "no"

        tdLog.info(f"queryTimes={queryTimes} threads={threads} mixedQuery={mixedQuery} "
                   f"batchQuery={batchQuery} len(sqls)={len(sqls)} label={label}\n")

        totalQueries = 0
        threadQueries = 0
        QPS = 10

        if continueIfFail.lower() == "yes":
            allEnd = " "
        else:
            allEnd = "\n"

        if specMode and mixedQuery.lower() != "yes":
            # spec
            threadQueries = queryTimes * threads
            totalQueries = queryTimes * threads * len(sqls)
            threadKey = f"complete query with {threads} threads and "
            qpsKey = "QPS: "
            avgKey = "query delay avg: "
            minKey = "min:"
        else:
            # spec mixed or super

            if specMode:
                totalQueries = queryTimes * len(sqls)
                # spec mixed
                if batchQuery.lower() == "yes":
                    # batch
                    threadQueries = len(sqls)
                    QPS = 2
                else:
                    threadQueries = totalQueries
            else:
                # super
                totalQueries = queryTimes * len(sqls) * tbCnt
                threadQueries = totalQueries

            nSql = len(sqls)
            if specMode and nSql < threads:
                tdLog.info(f"set threads = {nSql} because len(sqls) < threads")
                threads = nSql
            threadKey = f"using {threads} threads complete query "
            qpsKey = ""
            avgKey = "avg delay:"
            minKey = "min delay:"

        items = [
            [threadKey, " ", threadQueries, True],
            [qpsKey, " ", 5, False],  # qps need > 1
            [avgKey, "s", 0, False],
            [minKey, "s", 0, False],
            ["max: ", "s", 0, False],
            ["p90: ", "s", 0, False],
            ["p95: ", "s", 0, False],
            ["p99: ", "s", 0, False],
            ["INFO: Spend ", " ", 0, False],
            ["completed total queries: ", ",", totalQueries, True],
            ["the QPS of all threads:", allEnd, QPS, False]  # all qps need > 5
        ]

        # check
        for item in items:
            if len(item[0]) > 0:
                self.checkItem(output, item[0], item[1], item[2], item[3])

    # tmq check
    def checkTmqJson(self, benchmark, json):
        OK_RESULT = "Consumed total msgs: 30, total rows: 300000"
        cmd = benchmark + " -f " + json
        output, error = frame.eos.run(cmd, 600)
        if output.find(OK_RESULT) != -1:
            tdLog.info(f"succ: {cmd} found '{OK_RESULT}'")
        else:
            tdLog.exit(f"failed: {cmd} not found {OK_RESULT} in:\n{output} \nerror:{error}")

    def run(self):
        tbCnt = 10
        benchmark = etool.benchMarkFile()

        # insert
        json = "../../tools/taos-tools/example/insert.json"
        self.insertBenchJson(json, checkStep=True)

        # query
        json = "../../tools/taos-tools/example/query.json"
        self.checkAfterRun(benchmark, json, True, tbCnt)
        json = "../../tools/taos-tools/example/queryStb.json"
        self.checkAfterRun(benchmark, json, False, tbCnt)

        # tmq
        json = "../../tools/taos-tools/example/tmq.json"
        self.checkTmqJson(benchmark, json)

    def stop(self):
        tdSql.close()
        tdLog.success("%s successfully executed" % __file__)


tdCases.addWindows(__file__, TDTestCase())
tdCases.addLinux(__file__, TDTestCase())
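The key/value extraction that `getKeyValue` performs on taosBenchmark output can be sketched standalone; the version below is a simplified, framework-free illustration of the same idea, not the class method itself.

```python
def get_key_value(content: str, key: str, end: str):
    """Return (found, value): the text between `key` and `end`
    (or end-of-string when `end` is empty or absent), leading blanks skipped."""
    s = content.find(key)
    if s == -1:
        return False, ""
    s += len(key)                 # skip the key itself
    while s < len(content) and content[s] == " ":
        s += 1                    # skip blanks
    e = content.find(end, s) if end else -1
    value = content[s:] if e == -1 else content[s:e]
    return True, value

# e.g. pull the QPS figure out of a benchmark summary line
ok, qps = get_key_value("complete query with 3 threads QPS: 123.45 avg: 0.01s", "QPS: ", " ")
print(ok, qps)  # → True 123.45
```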
@@ -98,6 +98,19 @@
# benchmark 66 cases
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/commandline.py
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/commandline-partial-col-numpy.py
# benchmark 64 cases
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/websiteCase.py
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/rest_insert_alltypes_json.py -R
,,n,army,python3 ./test.py -f tools/benchmark/basic/taosdemoTestQueryWithJson-mixed-query.py -R
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/stmt_sample_csv_json.py
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/taosdemoTestInsertWithJsonStmt-otherPara.py
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/sml_telnet_insert_alltypes-same-min-max.py
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/default_tmq_json.py
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/reuse-exist-stb.py
,,n,army,python3 ./test.py -f tools/benchmark/basic/sml_interlace.py
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/stmt2_insert.py
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/stmt_offset_json.py
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/json_tag.py
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/commandline-sml-rest.py -R
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/commandline-single-table.py
,,y,army,./pytest.sh python3 ./test.py -f tools/benchmark/basic/commandline-supplement-insert.py
@@ -27,7 +27,7 @@
{
    "name": "meters",
    "child_table_exists": "no",
    "childtable_count": 1000,
    "childtable_count": 10,
    "childtable_prefix": "d",
    "auto_create_table": "no",
    "batch_create_tbl_num": 5,
@@ -6,13 +6,13 @@
"user": "root",
"password": "taosdata",
"confirm_parameter_prompt": "no",
"continue_if_fail": "yes",
"continue_if_fail": "yes",
"databases": "test",
"query_times": 10,
"query_mode": "taosc",
"specified_table_query": {
    "query_interval": 1,
    "concurrent": 3,
    "threads": 3,
    "sqls": [
        {
            "sql": "select last_row(*) from meters",
@@ -23,16 +23,5 @@
            "result": "./query_res1.txt"
        }
    ]
},
"super_table_query": {
    "stblname": "meters",
    "query_interval": 1,
    "threads": 3,
    "sqls": [
        {
            "sql": "select last_row(ts) from xxxx",
            "result": "./query_res2.txt"
        }
    ]
}
}
@@ -0,0 +1,24 @@
{
    "filetype": "query",
    "cfgdir": "/etc/taos",
    "host": "127.0.0.1",
    "port": 6030,
    "user": "root",
    "password": "taosdata",
    "confirm_parameter_prompt": "no",
    "continue_if_fail": "yes",
    "databases": "test",
    "query_times": 10,
    "query_mode": "taosc",
    "super_table_query": {
        "stblname": "meters",
        "query_interval": 1,
        "threads": 3,
        "sqls": [
            {
                "sql": "select last_row(ts) from xxxx",
                "result": "./query_res2.txt"
            }
        ]
    }
}