forked from new_org/Project-Caffeine
feat(add): add Arabica Sprint1 project engineering files and code
Signed-off-by: gzkoala <guohao@gitconomy.org>
14
projects/arabica/sprint1/.vscode/launch.json
vendored
Normal file
@@ -0,0 +1,14 @@
{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "node",
            "request": "attach",
            "name": "🍒 附加到 Cherry Studio (MCP 联调)",
            "port": 9229,
            "restart": true,
            "skipFiles": ["<node_internals>/**"],
            "outFiles": ["${workspaceFolder}/dist/**/*.js"]
        }
    ]
}
117
projects/arabica/sprint1/README.md
Normal file
@@ -0,0 +1,117 @@
<!--
---
title: "Project Caffeine - Arabica Sprint1 QuickStart"
description: "QuickStart guide for Project Caffeine Arabica Sprint1, covering zero-network-overhead deployment on the MCP stdio architecture, Obsidian local knowledge-base integration, 5 Whys strategy-engine configuration, and end-to-end debugging with the Cherry Studio client."
version: "1.0.0"
author: "Gitconomy Research 郭晧"
date: "2026-03-01"
type: "README / QuickStart"
tags:
  - Project Caffeine
  - MCP
  - stdio
  - Obsidian
  - QuickStart
  - Cherry Studio
license: "CC BY-SA 4.0"
---
-->

# Project Caffeine - Arabica Sprint1 QuickStart

## 1. Sprint1 `0.1.0` Core Features

- **Zero network overhead**: as a local-integration release, the system uses the `stdio` transport, communicating directly over the stdin/stdout pipes of local processes on the same machine, so there is no network transfer cost.

- **Local knowledge-graph access**: integrates seamlessly with the Obsidian personal knowledge-management app, letting the LLM read your local knowledge base directly.

- **Built-in 5 Whys strategy engine**: mounts the classic "5 Whys" thinking framework to constrain the model's reasoning path, helping it decompose vague ideas into precise follow-up questions.

- **Sandbox-grade security**: treats AI-generated instructions as untrusted payloads by default, blocking privilege escalation and path-traversal attacks at the lowest layer to keep the local file system safe.

---
## 2. Cloning the Repository and Fetching the Branch

**📌 Important**: Sprint iteration code is not merged directly into the `master` branch. To get the complete Sprint1 code, specify the corresponding feature branch (`feature/arabica-sprint-1`) when cloning:

```bash
# Clone the specified feature branch directly
git clone -b feature/arabica-sprint-1 https://gitlink.org.cn/Gitconomy/Project-Caffeine.git

# Enter Sprint1's standalone working directory
cd Project-Caffeine/projects/arabica/sprint1

# Install the Node.js dependencies
npm install
```

---
## 3. Environment and Path Configuration

So that the system can mount your knowledge base correctly, open `src/services/resourceService.ts` and change the `OBSIDIAN_VAULT_PATH` constant to the absolute path of the Markdown folder on your own machine.
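For example, the edited line would look like the following (the path shown here is a placeholder, not a real location):

```TypeScript
// src/services/resourceService.ts
// ⚠️ Hypothetical example path — substitute the absolute path of your own vault.
const OBSIDIAN_VAULT_PATH = '/home/you/Documents/MyVault';
```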
## 4. Build and Workflow Notes

The project is written in TypeScript and runs on Node.js, so the `.ts` sources must first be compiled to `.js`. The Sprint1 project directory already ships the compiled `.js` files (in the `/dist` directory), so you can test-run it immediately.

**⚠️ Critical usage notes (must read):**

Because of the MCP `stdio` architecture, you do **not** need to start a standalone background service yourself. Pick the workflow that matches your use case:

- **🟢 Daily use (production mode)**

  You do **not** need to run `npm run start` in a terminal. As long as `dist/app.js` exists, once you configure the absolute path in a client such as Cherry Studio or Claude Desktop and flip the switch on, the client will spawn and manage the Node process in the background automatically.

- **🛠️ Development and debugging (watch mode)**

  If you are editing the sources and want breakpoint debugging in VS Code, keep the following command running in a terminal:

  _(It watches for code changes in the background. After you save with `Ctrl+S`, just toggle the Server switch off and on in Cherry Studio to apply the latest code instantly.)_
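The command referenced above appears to be the project's `watch` script (which wraps `tsc --watch`, per the `scripts` definition given in the development guide):

```bash
npm run watch
```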
---

## 5. Tools Exposed in Sprint1

The server exposes the following three core tools to MCP-capable LLMs, giving them the initiative to search local data and refine prompts:

- **`list_local_notes`**: scans the local knowledge-base directory and returns the list of all `.md` documents and notes, helping the model establish its exploration boundary.
- **`read_local_note`**: reads the full text of a given Markdown file, turning the physical boundary of the local knowledge base into in-context memory for the model.
- **`generate_5_whys`**: for a broad research topic, forces the model to ask "why" five times in succession, peeling back surface symptoms layer by layer to find the underlying academic pain point.
## 6. Resources Exposed in Sprint1

Resources are passive, static context data sources that the client UI can discover and pull directly:

- **`obsidian-index`** (`obsidian://vault/index`): exposes the full directory index of the local knowledge base to the client.

---
## 7. Client Integration (Cherry Studio Example)

Sprint1 uses a purely local `stdio` architecture; we recommend source-level debugging with VS Code alongside the client:

1. Open Cherry Studio and go to **Settings -> MCP**.

2. Add a new Server configuration:

   - **Name**: `ProjectCaffeine-Sprint1`
   - **Command**: `node`
   - **Args**: `["--inspect=9229", "/your/actual/clone/path/Project-Caffeine/projects/arabica/sprint1/dist/app.js"]` _(⚠️ must be the absolute path to the compiled js file, and `--inspect` must come first to enable debugging)_

3. Save and confirm the status light turns green.

4. Back in VS Code, run Attach from the "Run and Debug" sidebar; you can now hit a breakpoint on every tool call the LLM makes.
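Some MCP clients also take servers from a JSON configuration file instead of a settings form. As a sketch (the exact schema varies by client; the `mcpServers` shape below follows the convention used by Claude Desktop, and the path is the same placeholder as above):

```json
{
  "mcpServers": {
    "ProjectCaffeine-Sprint1": {
      "command": "node",
      "args": [
        "--inspect=9229",
        "/your/actual/clone/path/Project-Caffeine/projects/arabica/sprint1/dist/app.js"
      ]
    }
  }
}
```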
---

## 8. Sprint1 Documents

- Sprint1 [design document](./docs/arabica-sprint1-architecture-design-specification.md)
- Sprint1 [development guide](./docs/arabica-sprint1-development-specification-guide.md)

---

## License

This document is licensed under the **Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0)**, © 2025-2026 Gitconomy Research.
143
projects/arabica/sprint1/dist/app.js
vendored
Normal file
@@ -0,0 +1,143 @@
"use strict";
/**
 * Project Caffeine
 * Copyright (c) 2025-2026 Gitconomy Research
 *
 * SPDX-License-Identifier: MIT
 *
 * Contributors:
 * - 郭晧 <guohao@gitconomy.org> (Initial Author)
 */
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
    var ownKeys = function(o) {
        ownKeys = Object.getOwnPropertyNames || function (o) {
            var ar = [];
            for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
            return ar;
        };
        return ownKeys(o);
    };
    return function (mod) {
        if (mod && mod.__esModule) return mod;
        var result = {};
        if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
        __setModuleDefault(result, mod);
        return result;
    };
})();
Object.defineProperty(exports, "__esModule", { value: true });
const mcp_js_1 = require("@modelcontextprotocol/sdk/server/mcp.js");
const stdio_js_1 = require("@modelcontextprotocol/sdk/server/stdio.js");
const zod_1 = require("zod");
const promptService_1 = require("./services/promptService");
const resourceService_1 = require("./services/resourceService");
const fs = __importStar(require("fs"));
const path = __importStar(require("path"));
// ==========================================
// 1. Initialize the MCP Server
// ==========================================
const mcpServer = new mcp_js_1.McpServer({
    name: "Project-Caffeine-Prompt-Strategy",
    version: "1.2.0"
});
// ==========================================
// 2. Register Tools - abilities the LLM can invoke proactively
// ==========================================
// Tool 1: generate a 5 Whys prompt strategy
mcpServer.tool("generate_5_whys", "使用 5 Whys 模板对用户查询进行深度分解,生成增强的提示词策略", { query: zod_1.z.string().describe("需要分析的查询主题") }, async ({ query }) => {
    console.error(`[Project Caffeine] 大模型调用工具: 正在生成 5 Whys 策略 -> ${query}`);
    const enhancedPrompt = (0, promptService_1.generate5Whys)(query);
    return {
        content: [{ type: "text", text: JSON.stringify(enhancedPrompt, null, 2) }]
    };
});
// Tool 2: scan the local knowledge-base directory
mcpServer.tool("list_local_notes", "获取本地 Obsidian 知识库中的所有 Markdown 笔记列表,用于了解当前有哪些可用的本地上下文资料。", {}, async () => {
    console.error(`[Project Caffeine] 大模型调用工具: 正在扫描本地笔记列表...`);
    const notes = await (0, resourceService_1.listObsidianNotes)();
    return {
        content: [{
                type: "text",
                text: notes.length > 0 ? `找到了以下笔记:\n${notes.join('\n')}` : "未找到笔记。"
            }]
    };
});
// Tool 3: read the content of a single note
mcpServer.tool("read_local_note", "读取本地 Obsidian 知识库中指定笔记的完整内容,作为深度分析的上下文参考。", { filename: zod_1.z.string().describe("需要读取的笔记文件名,必须包含 .md 后缀") }, async ({ filename }) => {
    console.error(`[Project Caffeine] 大模型调用工具: 正在深度阅读笔记 -> ${filename}`);
    try {
        const content = await (0, resourceService_1.readObsidianNote)(filename);
        return { content: [{ type: "text", text: content }] };
    }
    catch (error) {
        return {
            content: [{ type: "text", text: `读取失败: ${error.message}` }],
            isError: true // explicitly tell the LLM that this call threw an error
        };
    }
});
// ==========================================
// 3. Register Resources - static data exposed for manual selection in the client
// ==========================================
// Resource 1: knowledge-base directory index
mcpServer.resource("obsidian-index", // resource Name/ID shown in the client
"obsidian://vault/index", // unique URI identifier
{
    description: "本地知识库的目录索引,包含所有 Markdown 笔记的列表"
}, async (uri) => {
    console.error(`[Project Caffeine] 客户端请求静态资源: ${uri.href}`);
    const notes = await (0, resourceService_1.listObsidianNotes)();
    const textContent = notes.length > 0
        ? `当前知识库包含以下文件:\n${notes.join('\n')}`
        : "当前知识库为空。";
    return {
        contents: [{
                uri: uri.href,
                mimeType: "text/plain",
                text: textContent
            }]
    };
});
// ==========================================
// 4. Start the underlying stdio transport
// ==========================================
async function start() {
    console.error("[Project Caffeine] 正在启动 TS 版 MCP Server (含 Tools 与 Resources)...");
    const transport = new stdio_js_1.StdioServerTransport();
    await mcpServer.connect(transport);
    console.error("[Project Caffeine] MCP Server 已就绪,等待 Cherry Studio 交互。");
}
// Catch fatal errors and exit safely
start().catch((err) => {
    console.error("服务器启动失败:", err);
    process.exit(1);
});
// ==========================================
// 💡 5. Log-persistence interceptor (Linux)
// ==========================================
const logFilePath = path.resolve(__dirname, '../server.log');
const originalConsoleError = console.error;
console.error = (...args) => {
    // 1. Emit to the original stderr
    originalConsoleError(...args);
    // 2. Also append the log line to server.log in the project root
    const logMessage = args.map(arg => typeof arg === 'object' ? JSON.stringify(arg) : String(arg)).join(' ');
    fs.appendFileSync(logFilePath, `[${new Date().toISOString()}] ${logMessage}\n`);
};
//# sourceMappingURL=app.js.map
1
projects/arabica/sprint1/dist/app.js.map
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"app.js","sourceRoot":"","sources":["../src/app.ts"],"names":[],"mappings":";AAAA;;;;;;;;GAQG;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAEH,oEAAoE;AACpE,wEAAiF;AACjF,6BAAwB;AACxB,4DAAyD;AACzD,gEAAiF;AACjF,uCAAyB;AACzB,2CAA6B;AAE7B,6CAA6C;AAC7C,oBAAoB;AACpB,6CAA6C;AAC7C,MAAM,SAAS,GAAG,IAAI,kBAAS,CAAC;IAC5B,IAAI,EAAE,kCAAkC;IACxC,OAAO,EAAE,OAAO;CACnB,CAAC,CAAC;AAEH,6CAA6C;AAC7C,kCAAkC;AAClC,6CAA6C;AAE7C,sBAAsB;AACtB,SAAS,CAAC,IAAI,CACV,iBAAiB,EACjB,oCAAoC,EACpC,EAAE,KAAK,EAAE,OAAC,CAAC,MAAM,EAAE,CAAC,QAAQ,CAAC,WAAW,CAAC,EAAE,EAC3C,KAAK,EAAE,EAAE,KAAK,EAAqB,EAAE,EAAE;IACnC,OAAO,CAAC,KAAK,CAAC,iDAAiD,KAAK,EAAE,CAAC,CAAC;IACxE,MAAM,cAAc,GAAG,IAAA,6BAAa,EAAC,KAAK,CAAC,CAAC;IAC5C,OAAO;QACH,OAAO,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,IAAI,CAAC,SAAS,CAAC,cAAc,EAAE,IAAI,EAAE,CAAC,CAAC,EAAE,CAAC;KAC7E,CAAC;AACN,CAAC,CACJ,CAAC;AAEF,iBAAiB;AACjB,SAAS,CAAC,IAAI,CACV,kBAAkB,EAClB,0DAA0D,EAC1D,EAAE,EACF,KAAK,IAAI,EAAE;IACP,OAAO,CAAC,KAAK,CAAC,2CAA2C,CAAC,CAAC;IAC3D,MAAM,KAAK,GAAG,MAAM,IAAA,mCAAiB,GAAE,CAAC;IACxC,OAAO;QACH,OAAO,EAAE,CAAC;gBACN,IAAI,EAAE,MAAM;gBACZ,IAAI,EAAE,KAAK,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC,aAAa,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,QAAQ;aACtE,CAAC;KACL,CAAC;AACN,CAAC,CACJ,CAAC;AAEF,mBAAmB;AACnB,SAAS,CAAC,IAAI,CACV,iBAAiB,EACjB,2CAA2C,EAC3C,EAAE,QAAQ,EAAE,OAAC,CAAC,MAAM,EAAE,CAAC,QAAQ,CAAC,wBAAwB,CAAC,EAAE,EAC3D,KAAK,EAAE,EAAE,QAAQ,EAAwB,EAAE,EAAE;IACzC,OAAO,CAAC,KAAK,CAAC,2CAA2C,QAAQ,EAAE,CAAC,CAAC;IACrE,IAAI,CAAC;QACD,MAAM,OAAO,GAAG,MAAM,IAAA,kCAAgB,EAAC,QAAQ,CAAC,CAAC;QACjD,OAAO,EAAE,OAAO,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,OAAO,EAAE,CAAC,EAAE,CAAC;IAC1D,CAAC;IAAC,OAAO,KAAU,EAAE,CAAC;QAClB,OAAO;YACH,OAAO,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,SAAS,KAAK,CAAC,OAAO,EAAE,EAAE,CAAC;YAC3D,OAAO,EAAE,IAAI,CAAC,kBAAkB;SACnC,CAAC;IACN,CAAC;AACL,CAAC,CACJ,CAAC;AAEF,6CAA6C;AAC7C,4CAA4C;AAC5C,6CAA6C;AAE7C,eAAe;AACf,SAAS,CAAC,QAAQ,CACd,gBAAgB,EAAoB,mBAAmB;AACvD,wBAAwB,EAAY,aAAa;AACjD;IACI,WAAW,EAAE,gCAAgC;CAChD,EACD,KAAK,EAAE
,GAAG,EAAE,EAAE;IACV,OAAO,CAAC,KAAK,CAAC,iCAAiC,GAAG,CAAC,IAAI,EAAE,CAAC,CAAC;IAE3D,MAAM,KAAK,GAAG,MAAM,IAAA,mCAAiB,GAAE,CAAC;IACxC,MAAM,WAAW,GAAG,KAAK,CAAC,MAAM,GAAG,CAAC;QAChC,CAAC,CAAC,iBAAiB,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,EAAE;QACrC,CAAC,CAAC,UAAU,CAAC;IAEjB,OAAO;QACH,QAAQ,EAAE,CAAC;gBACP,GAAG,EAAE,GAAG,CAAC,IAAI;gBACb,QAAQ,EAAE,YAAY;gBACtB,IAAI,EAAE,WAAW;aACpB,CAAC;KACL,CAAC;AACN,CAAC,CACJ,CAAC;AAEF,6CAA6C;AAC7C,oBAAoB;AACpB,6CAA6C;AAC7C,KAAK,UAAU,KAAK;IAChB,OAAO,CAAC,KAAK,CAAC,kEAAkE,CAAC,CAAC;IAClF,MAAM,SAAS,GAAG,IAAI,+BAAoB,EAAE,CAAC;IAC7C,MAAM,SAAS,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC;IACnC,OAAO,CAAC,KAAK,CAAC,wDAAwD,CAAC,CAAC;AAC5E,CAAC;AAED,cAAc;AACd,KAAK,EAAE,CAAC,KAAK,CAAC,CAAC,GAAY,EAAE,EAAE;IAC3B,OAAO,CAAC,KAAK,CAAC,UAAU,EAAE,GAAG,CAAC,CAAC;IAC/B,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;AACpB,CAAC,CAAC,CAAC;AAEH,6CAA6C;AAC7C,yBAAyB;AACzB,6CAA6C;AAC7C,MAAM,WAAW,GAAG,IAAI,CAAC,OAAO,CAAC,SAAS,EAAE,eAAe,CAAC,CAAC;AAC7D,MAAM,oBAAoB,GAAG,OAAO,CAAC,KAAK,CAAC;AAE3C,OAAO,CAAC,KAAK,GAAG,CAAC,GAAG,IAAI,EAAE,EAAE;IACxB,WAAW;IACX,oBAAoB,CAAC,GAAG,IAAI,CAAC,CAAC;IAC9B,qCAAqC;IACrC,MAAM,UAAU,GAAG,IAAI,CAAC,GAAG,CAAC,GAAG,CAAC,EAAE,CAAC,OAAO,GAAG,KAAK,QAAQ,CAAC,CAAC,CAAC,IAAI,CAAC,SAAS,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;IAC1G,EAAE,CAAC,cAAc,CAAC,WAAW,EAAE,IAAI,IAAI,IAAI,EAAE,CAAC,WAAW,EAAE,KAAK,UAAU,IAAI,CAAC,CAAC;AACpF,CAAC,CAAC"}
36
projects/arabica/sprint1/dist/services/promptService.js
vendored
Normal file
@@ -0,0 +1,36 @@
"use strict";
/**
 * Project Caffeine
 * Copyright (c) 2025-2026 Gitconomy Research
 *
 * SPDX-License-Identifier: MIT
 *
 * Contributors:
 * - 郭晧 <guohao@gitconomy.org> (Initial Author)
 */
Object.defineProperty(exports, "__esModule", { value: true });
exports.generate5Whys = generate5Whys;
/**
 * Generate a 5 Whys prompt strategy for the given query topic
 * @param query the user's query topic
 * @returns an array of five follow-up "why" questions
 */
function generate5Whys(query) {
    if (query.includes("开源人才")) {
        return [
            "为什么中国开源人才的培养面临困难?",
            "为什么中国开源人才缺乏足够的行业经验?",
            "为什么开源社区对中国人才的支持力度不足?",
            "为什么中国开源人才的市场需求与供给不平衡?",
            "为什么政策支持不足导致中国开源人才流失?"
        ];
    }
    return [
        `为什么 "${query}" 会成为一个问题?`,
        `为什么导致上述现象的直接原因会发生?`,
        `为什么当前的系统或流程没有阻止这种情况?`,
        `为什么以前的解决方案或预防措施失效了?`,
        `为什么根本的系统性漏洞一直未被修复?`
    ];
}
//# sourceMappingURL=promptService.js.map
1
projects/arabica/sprint1/dist/services/promptService.js.map
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"promptService.js","sourceRoot":"","sources":["../../src/services/promptService.ts"],"names":[],"mappings":";AAAA;;;;;;;;GAQG;;AAQH,sCAkBC;AAxBD;;;;GAIG;AAEH,SAAgB,aAAa,CAAC,KAAa;IACvC,IAAI,KAAK,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,CAAC;QACzB,OAAO;YACH,mBAAmB;YACnB,qBAAqB;YACrB,sBAAsB;YACtB,uBAAuB;YACvB,sBAAsB;SACzB,CAAC;IACN,CAAC;IAED,OAAO;QACH,QAAQ,KAAK,YAAY;QACzB,oBAAoB;QACpB,sBAAsB;QACtB,qBAAqB;QACrB,oBAAoB;KACvB,CAAC;AACN,CAAC"}
46
projects/arabica/sprint1/dist/services/resourceService.js
vendored
Normal file
@@ -0,0 +1,46 @@
"use strict";
/**
 * Project Caffeine
 * Copyright (c) 2025-2026 Gitconomy Research
 *
 * SPDX-License-Identifier: MIT
 *
 * Contributors:
 * - 郭晧 <guohao@gitconomy.org> (Initial Author)
 */
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.listObsidianNotes = listObsidianNotes;
exports.readObsidianNote = readObsidianNote;
const promises_1 = __importDefault(require("fs/promises"));
const path_1 = __importDefault(require("path"));
// [⚠️ IMPORTANT] Change this to the real absolute path of the Markdown notes folder on your machine!
const OBSIDIAN_VAULT_PATH = '/home/wguo/Downloads/MyVault';
async function listObsidianNotes() {
    try {
        const files = await promises_1.default.readdir(OBSIDIAN_VAULT_PATH);
        return files.filter(file => file.toLowerCase().endsWith('.md'));
    }
    catch (error) {
        console.error(`[Project Caffeine] 无法读取知识库目录: ${error.message}`);
        return [];
    }
}
async function readObsidianNote(filename) {
    const targetPath = path_1.default.resolve(OBSIDIAN_VAULT_PATH, filename);
    const safeVaultPath = path_1.default.resolve(OBSIDIAN_VAULT_PATH);
    // Core defense: stop the LLM from reading sensitive system files by passing "../../"
    if (!targetPath.startsWith(safeVaultPath)) {
        throw new Error(`安全警告:越权访问拦截!禁止读取目录外的文件: ${filename}`);
    }
    try {
        const content = await promises_1.default.readFile(targetPath, 'utf-8');
        return content;
    }
    catch (error) {
        throw new Error(`无法读取笔记 [${filename}]: 文件可能不存在或无权限。`);
    }
}
//# sourceMappingURL=resourceService.js.map
1
projects/arabica/sprint1/dist/services/resourceService.js.map
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"resourceService.js","sourceRoot":"","sources":["../../src/services/resourceService.ts"],"names":[],"mappings":";AAAA;;;;;;;;GAQG;;;;;AAQH,8CAQC;AAED,4CAeC;AA/BD,2DAA6B;AAC7B,gDAAwB;AAExB,2CAA2C;AAC3C,MAAM,mBAAmB,GAAG,8BAA8B,CAAC;AAEpD,KAAK,UAAU,iBAAiB;IACnC,IAAI,CAAC;QACD,MAAM,KAAK,GAAG,MAAM,kBAAE,CAAC,OAAO,CAAC,mBAAmB,CAAC,CAAC;QACpD,OAAO,KAAK,CAAC,MAAM,CAAC,IAAI,CAAC,EAAE,CAAC,IAAI,CAAC,WAAW,EAAE,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC;IACpE,CAAC;IAAC,OAAO,KAAU,EAAE,CAAC;QAClB,OAAO,CAAC,KAAK,CAAC,iCAAiC,KAAK,CAAC,OAAO,EAAE,CAAC,CAAC;QAChE,OAAO,EAAE,CAAC;IACd,CAAC;AACL,CAAC;AAEM,KAAK,UAAU,gBAAgB,CAAC,QAAgB;IACnD,MAAM,UAAU,GAAG,cAAI,CAAC,OAAO,CAAC,mBAAmB,EAAE,QAAQ,CAAC,CAAC;IAC/D,MAAM,aAAa,GAAG,cAAI,CAAC,OAAO,CAAC,mBAAmB,CAAC,CAAC;IAExD,mCAAmC;IACnC,IAAI,CAAC,UAAU,CAAC,UAAU,CAAC,aAAa,CAAC,EAAE,CAAC;QACxC,MAAM,IAAI,KAAK,CAAC,2BAA2B,QAAQ,EAAE,CAAC,CAAC;IAC3D,CAAC;IAED,IAAI,CAAC;QACD,MAAM,OAAO,GAAG,MAAM,kBAAE,CAAC,QAAQ,CAAC,UAAU,EAAE,OAAO,CAAC,CAAC;QACvD,OAAO,OAAO,CAAC;IACnB,CAAC;IAAC,OAAO,KAAU,EAAE,CAAC;QAClB,MAAM,IAAI,KAAK,CAAC,WAAW,QAAQ,iBAAiB,CAAC,CAAC;IAC1D,CAAC;AACL,CAAC"}
@@ -0,0 +1,210 @@
<!--
---
title: Prompt-Strategy MCP Server Prototype Design Document
description: Minimum-viable-product (MVP) prototype design for the Project Caffeine prompt-strategy MCP Server, covering 5 Whys template invocation, enhanced-prompt synthesis, and workflow validation in a Node.js environment
type: Architecture Design
file: project-caffeine-mvp-sprint1-architecture-design.md
version: v1.0.1 (Arabica)
author: Gitconomy Research - 郭晧
date: 2026-03-01
last-update: 2026-03-01
update-description: Added Cherry Studio as the MCP Client; the previous revision was still designed as a traditional "AI web-app backend" rather than a strict MCP system architecture.
tags:
  - Project Caffeine
  - MCP Server
  - MVP
  - Sprint1
  - Prompt Strategy
  - 5 Whys
  - Node.js
license: CC BY-SA 4.0
status: Active
---
-->
# Project Caffeine MVP Prototype Design: the Prompt-Strategy MCP Server

## 1. Prototype Overview

**Goal**: implement a minimal **prompt-strategy MCP Server**: when the user issues a query, the system invokes the **5 Whys** template to decompose it, generates an enhanced prompt, and sends it to the LLM for inference.

**Key functions**:

- **Tool registration**: register the `generate_5_whys` capability with Cherry Studio.
- **Tool-call handling**: receive the user's query topic from Cherry Studio over `stdio`.
- **5 Whys decomposition**: decompose the query with local logic, producing layered follow-up prompts.
- **Resource exposure**: expose local document libraries, PDFs, or specific data to the Client as MCP Resources (Sprint2).

---
## 2. Implementation Steps

### 2.1 Client Request and Tool Dispatch

- The user enters a query intent in **Cherry Studio** (e.g. "analyze the bottlenecks of LLMs in healthcare").
- The LLM inside Cherry Studio decides it needs a deeper thinking framework and issues a `generate_5_whys` tool call to this **MCP Server**.

### 2.2 Generating the 5 Whys Enhanced Prompt

- On receiving the parameters, the **MCP Server** runs the local `promptService.js`, which generates five progressively deeper questions (the 5 Whys) from the query topic.

### 2.3 Returning the Result and Final Inference

- The **MCP Server** returns the generated 5 Whys array as the tool result to Cherry Studio over the standard protocol.
- Cherry Studio sends these enhanced prompts back to the LLM, which produces the final deep-research insight shown to the user.

---
## 3. System Component Architecture

### 3.1 **Project Caffeine** Prototype Project Structure

```markdown
project-caffeine/
│
├── node_modules/                  # Project dependencies
├── src/                           # Source code
│   ├── controllers/               # Routing/dispatch layer
│   │   ├── promptsController.js   # Handles prompt Tool requests
│   │   └── resourcesController.js # Handles Resource requests
│   ├── services/                  # Core business-logic layer
│   │   ├── promptService.js       # 5 Whys prompt-template generation algorithm
│   │   └── resourceService.js     # Local files / knowledge-base access logic
│   ├── models/                    # Data models and validation
│   │   └── schemas.js             # Zod-based parameter-validation definitions
│   └── app.js                     # Main entry point; initializes the MCP Server (stdio)
│
├── .vscode/                       # IDE debug configuration
│   └── launch.json                # Attach configuration for Cherry Studio debugging
├── config/                        # Configuration files
│   └── config.js                  # Project configuration
├── .env                           # Environment variables (no LLM API key needed anymore)
├── package.json                   # Dependencies (@modelcontextprotocol/sdk)
└── README.md                      # Project README
```

**Functional description of each file and folder**:

*Table 1-1: Detailed description of the files and folders in the Project Caffeine MVP system*

| **Directory** | **File** | **Description** |
| --- | --- | --- |
| **`src/`** | **`app.js`** | Instantiates the official `McpServer`, configures the `stdio` transport, and registers Tools and Resources with the Client. |
| **`src/controllers/`** | **`promptsController.js`** | Receives `tools/call` requests, invokes `promptService`, and formats the output returned to the Client. |
| **`src/controllers/`** | **`resourcesController.js`** | Handles `resources/list` and `resources/read` requests and returns resource data. |
| **`src/services/`** | **`promptService.js`** | Pure local business logic: builds the **5 Whys** array from the input string. |
| **`src/services/`** | **`resourceService.js`** | Local file-system access: reads the local knowledge base (e.g. Obsidian) or a given directory of documents. |
| **`.vscode/`** | **`launch.json`** | Critical: configures the Node.js `--inspect` port for breakpoint debugging when Cherry Studio spawns the server. |
### 3.2 System Module Architecture

*Figure 1-1: Component workflow architecture of the prompt-strategy MCP Server*



The workflow, built on Cherry Studio and the MCP (Model Context Protocol) architecture, has seven key steps:

1. **User asks a question**: the user enters a question or query in the front end (Cherry Studio).
2. **The LLM decides to call a tool**: Cherry Studio, acting as the MCP client, passes the question to its connected LLM, which analyzes it and decides to call a registered external tool (the 5 Whys analysis) to obtain a better prompt strategy instead of answering directly.
3. **Tool-call request (`tools/call`)**: Cherry Studio sends a `generate_5_whys` tool-call instruction over the standard-input/output (`stdio`) protocol to the Project Caffeine MCP Server (built on Node.js).
4. **Local 5 Whys generation**: on receiving the instruction, the server triggers its internal business module `promptService.js`, which runs entirely locally, without an external LLM, and generates five progressively deeper follow-up questions (the 5 Whys text) from the user's original question.
5. **Tool result returned**: the MCP Server packages the generated 5 Whys text in the standard result format and sends it back to Cherry Studio over the protocol.
6. **Final inference with the tool result**: Cherry Studio uses the five questions as enhanced context prompts and issues a final deep-inference request to the LLM, asking it to produce a thorough analysis based on those follow-ups.
7. **Render and present**: once the LLM finishes, Cherry Studio renders the content and presents the user with a high-quality "deep insight report".

In short, this is a closed loop: "user asks -> the client LLM decides to ask for help -> the local server generates an analysis framework -> the client LLM answers in depth based on that framework".
---

## 4. MCP Standard Communication Interface Design (Protocol Level)

Built on `@modelcontextprotocol/sdk`, the underlying JSON-RPC communication is handled by the SDK. The following are the logical input/output contracts.

### 4.1 Tool Registration (`tools/list`)

Declares the server's capabilities to the Client.

```json
{
  "name": "generate_5_whys",
  "description": "使用 5 Whys 模板对用户查询进行深度分解,生成增强的提示词策略",
  "inputSchema": {
    "type": "object",
    "properties": {
      "query": {
        "type": "string",
        "description": "用户需要分析的原始查询主题,例如:中国开源人才的现状分析"
      }
    },
    "required": ["query"]
  }
}
```
### 4.2 Tool Call Response (`tools/call`)

The server's result when the Client passes `query: "中国开源人才的现状分析"`.

```json
{
  "content": [
    {
      "type": "text",
      "text": "[\n \"为什么中国开源人才的培养面临困难?\",\n \"为什么中国开源人才缺乏足够的行业经验?\",\n \"为什么开源社区对中国人才的支持力度不足?\",\n \"为什么中国开源人才的市场需求与供给不平衡?\",\n \"为什么政策支持不足导致中国开源人才流失?\"\n]"
    }
  ]
}
```
### 4.3 Stored Resource Access (`resources/list` & `resources/read`)

Lets Cherry Studio inspect local file context.

**Example `resources/list` response:**

```json
{
  "resources": [
    {
      "uri": "file:///path/to/obsidian/vault/开源行业报告.md",
      "name": "开源行业研究报告",
      "mimeType": "text/markdown",
      "description": "本地知识库中的开源行业深度分析文档"
    }
  ]
}
```
---

## 5. MVP Development Environment

This phase depends heavily on live debugging against the actual Client:

* **Node.js (v18+)**: runtime environment.
* **@modelcontextprotocol/sdk**: official dependency providing `stdio` transport support.
* **Cherry Studio**: the designated test Client, configured to launch the Server with `command: node`.
* **VS Code Attach debugging**: using the `--inspect=9229` flag, attach VS Code to the process after Cherry Studio spawns the Server for source-level breakpoints.
---

## 6. Deployment and Verification

1. **Environment setup**: install the Node.js dependencies and the Zod validation library.
2. **Client configuration**: in Cherry Studio's MCP settings, add a Server named `ProjectCaffeine` pointing at the absolute path of the local `app.js`, with the `--inspect` flag added.
3. **Breakpoint listening**: start the Attach debug task in VS Code and wait for the Cherry Studio handshake.
4. **Trigger and verify**: in the Cherry Studio chat, ask the model to "call the tool to analyze some problem"; check that VS Code hits the breakpoint and that Cherry Studio finally renders a 5 Whys-enhanced answer.

---

## 7. Summary

This design provides the development framework for the minimum viable functionality of **Project Caffeine**'s **prompt-strategy MCP Server**: **5 Whys** template generation, enhanced-prompt synthesis, and the inference request/response path. Implementing it validates the basic runtime environment and confirms that the components of the **MCP protocol** can work together smoothly.

---

## License

This document is licensed under the **Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0)**, © 2025-2026 Gitconomy Research.
@@ -0,0 +1,430 @@
---
title: "Project Caffeine Arabica Sprint1 Developer Guide"
description: "Developer guide for the first Arabica iteration: implementing a minimal 5 Whys prompt-strategy MCP Server"
type: "Development Guide"
file: arabical-sprint1-development-specification-guide.md
version: "v1.0.0"
author: "Gitconomy Research - 郭晧"
date: 2026-03-01
tags:
  - Project Caffeine
  - MCP
  - LLM
  - JSON-RPC 2.0
  - TypeScript
  - Node.js
license: "CC BY-SA 4.0"
status: "Active"
---
> Following the system architecture you designed, this guide implements both an **Express.js (HTTP JSON-RPC 2.0) service** and the **official MCP SDK (`stdio` standard transport for Cherry Studio)** in a single codebase, and sets up an end-to-end VS Code debugging environment.

---

## 1. Prerequisites

Before starting, make sure the development machine has the following installed:

- **Node.js**: v18 LTS or later.
- **Visual Studio Code (VS Code)**: the primary IDE for development and breakpoint debugging.
- **Cherry Studio**: latest version, as the MCP Client (LLM hub) that issues requests.
- **Local knowledge base**: a local folder of `.md` notes (e.g. an Obsidian Vault).

---
## 2. Project Initialization and Dependencies

Open a terminal and run the following to scaffold the project from scratch:

```bash
# 1. Create and enter the project directory
mkdir project-caffeine-ts
cd project-caffeine-ts

# 2. Initialize npm
npm init -y

# 3. Install the core production dependencies
npm install @modelcontextprotocol/sdk zod

# 4. Install TypeScript and the dev dependencies
npm install --save-dev typescript @types/node

# 5. Create the standard directory structure
mkdir -p src/services dist .vscode
touch src/app.ts src/services/promptService.ts src/services/resourceService.ts .vscode/launch.json tsconfig.json
```

---
## 3. Core Project Configuration

We need to configure the TypeScript compiler options, the launch scripts, and VS Code's source-map debugging environment.

### 3.1 Edit `tsconfig.json`

Controls compilation and generates the `sourceMap` files used for breakpoint debugging:

```json
{
  "compilerOptions": {
    "target": "ES2022",
    "module": "CommonJS",
    "moduleResolution": "node",
    "outDir": "./dist",
    "rootDir": "./src",
    "sourceMap": true,           // [key] emit .js.map files for VS Code breakpoint mapping
    "strict": true,              // enable strict mode
    "esModuleInterop": true,     // allow default imports of CommonJS modules
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["src/**/*"]
}
```
### 3.2 Edit `package.json`

Add `build` and `watch` scripts to compile `.ts` to `.js`. Find the `"scripts"` field in `package.json` and replace it with:

```json
"scripts": { "build": "tsc", "watch": "tsc --watch", "start": "node dist/app.js" }
```

### 3.3 Edit `.vscode/launch.json`

The new `outFiles` field tells VS Code to look in the `dist` directory for the compiled files, so breakpoints map back to your `src/*.ts` sources.

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "type": "node",
            "request": "attach",
            "name": "🍒 附加到 Cherry Studio (TS 联调)",
            "port": 9229,
            "restart": true,
            "skipFiles": ["<node_internals>/**"],
            "outFiles": ["${workspaceFolder}/dist/**/*.js"]
        }
    ]
}
```

---

## 4. Core Business Code Implementation
### 4.1 `src/services/promptService.ts` (Prompt-Strategy Generation)

Pure local business logic responsible for generating the 5 Whys framework.

```TypeScript
/**
 * Generate a 5 Whys prompt strategy for the given query topic
 * @param query the user's query topic
 * @returns an array of five follow-up "why" questions
 */
export function generate5Whys(query: string): string[] {
    if (query.includes("开源人才")) {
        return [
            "为什么中国开源人才的培养面临困难?",
            "为什么中国开源人才缺乏足够的行业经验?",
            "为什么开源社区对中国人才的支持力度不足?",
            "为什么中国开源人才的市场需求与供给不平衡?",
            "为什么政策支持不足导致中国开源人才流失?"
        ];
    }

    return [
        `为什么 "${query}" 会成为一个问题?`,
        `为什么导致上述现象的直接原因会发生?`,
        `为什么当前的系统或流程没有阻止这种情况?`,
        `为什么以前的解决方案或预防措施失效了?`,
        `为什么根本的系统性漏洞一直未被修复?`
    ];
}
```
### 4.2 `src/services/resourceService.ts` (Local Knowledge-Base Access)

A local file-reading service with strict path-traversal safety checks.

```TypeScript
import fs from 'fs/promises';
import path from 'path';

// [⚠️ IMPORTANT] Change this to the real absolute path of the Markdown notes folder on your machine!
const OBSIDIAN_VAULT_PATH = '/home/wguo/Downloads/MyVault';

export async function listObsidianNotes(): Promise<string[]> {
    try {
        const files = await fs.readdir(OBSIDIAN_VAULT_PATH);
        return files.filter(file => file.toLowerCase().endsWith('.md'));
    } catch (error: any) {
        console.error(`[Project Caffeine] 无法读取知识库目录: ${error.message}`);
        return [];
    }
}

export async function readObsidianNote(filename: string): Promise<string> {
    const targetPath = path.resolve(OBSIDIAN_VAULT_PATH, filename);
    const safeVaultPath = path.resolve(OBSIDIAN_VAULT_PATH);

    // Core defense: stop the LLM from reading sensitive system files by passing "../../"
    if (!targetPath.startsWith(safeVaultPath)) {
        throw new Error(`安全警告:越权访问拦截!禁止读取目录外的文件: ${filename}`);
    }

    try {
        const content = await fs.readFile(targetPath, 'utf-8');
        return content;
    } catch (error: any) {
        throw new Error(`无法读取笔记 [${filename}]: 文件可能不存在或无权限。`);
    }
}
```
### 4.3 `src/app.ts` (Main Entry Point)

Initializes the standard-input/output transport layer and registers the tool dictionary with Cherry Studio.

```TypeScript
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import { z } from 'zod';
import { generate5Whys } from './services/promptService';
import { listObsidianNotes, readObsidianNote } from './services/resourceService';

// ==========================================
// 1. Initialize the MCP Server
// ==========================================
const mcpServer = new McpServer({
    name: "Project-Caffeine-Prompt-Strategy",
    version: "1.2.0"
});

// ==========================================
// 2. Register Tools - abilities the LLM can invoke proactively
// ==========================================

// Tool 1: generate a 5 Whys prompt strategy
mcpServer.tool(
    "generate_5_whys",
    "使用 5 Whys 模板对用户查询进行深度分解,生成增强的提示词策略",
    { query: z.string().describe("需要分析的查询主题") },
    async ({ query }: { query: string }) => {
        console.error(`[Project Caffeine] 大模型调用工具: 正在生成 5 Whys 策略 -> ${query}`);
        const enhancedPrompt = generate5Whys(query);
        return {
            content: [{ type: "text", text: JSON.stringify(enhancedPrompt, null, 2) }]
        };
    }
);

// Tool 2: scan the local knowledge-base directory
mcpServer.tool(
    "list_local_notes",
    "获取本地 Obsidian 知识库中的所有 Markdown 笔记列表,用于了解当前有哪些可用的本地上下文资料。",
    {},
    async () => {
        console.error(`[Project Caffeine] 大模型调用工具: 正在扫描本地笔记列表...`);
        const notes = await listObsidianNotes();
        return {
            content: [{
                type: "text",
                text: notes.length > 0 ? `找到了以下笔记:\n${notes.join('\n')}` : "未找到笔记。"
            }]
        };
    }
);

// Tool 3: read the content of a single note
mcpServer.tool(
    "read_local_note",
    "读取本地 Obsidian 知识库中指定笔记的完整内容,作为深度分析的上下文参考。",
    { filename: z.string().describe("需要读取的笔记文件名,必须包含 .md 后缀") },
    async ({ filename }: { filename: string }) => {
        console.error(`[Project Caffeine] 大模型调用工具: 正在深度阅读笔记 -> ${filename}`);
        try {
            const content = await readObsidianNote(filename);
            return { content: [{ type: "text", text: content }] };
|
||||
} catch (error: any) {
|
||||
return {
|
||||
content: [{ type: "text", text: `读取失败: ${error.message}` }],
|
||||
isError: true // 明确告知大模型此操作抛出了错误
|
||||
};
|
||||
}
|
||||
}
|
||||
);
|
||||
|
||||
// ==========================================
|
||||
// 3. 注册 Resources (资源) - 暴露给客户端供用户手动勾选的静态数据
|
||||
// ==========================================
|
||||
|
||||
// 资源 1:知识库目录索引
|
||||
mcpServer.resource(
|
||||
"obsidian-index", // 客户端显示的资源 Name/ID
|
||||
"obsidian://vault/index", // 唯一的 URI 标识
|
||||
{
|
||||
description: "本地知识库的目录索引,包含所有 Markdown 笔记的列表"
|
||||
},
|
||||
async (uri) => {
|
||||
console.error(`[Project Caffeine] 客户端请求静态资源: ${uri.href}`);
|
||||
|
||||
const notes = await listObsidianNotes();
|
||||
const textContent = notes.length > 0
|
||||
? `当前知识库包含以下文件:\n${notes.join('\n')}`
|
||||
: "当前知识库为空。";
|
||||
|
||||
return {
|
||||
contents: [{
|
||||
uri: uri.href,
|
||||
mimeType: "text/plain",
|
||||
text: textContent
|
||||
}]
|
||||
};
|
||||
}
|
||||
);
|
||||
|
||||
// ==========================================
|
||||
// 4. 启动底层 Stdio 传输层
|
||||
// ==========================================
|
||||
async function start(): Promise<void> {
|
||||
console.error("[Project Caffeine] 正在启动 TS 版 MCP Server (含 Tools 与 Resources)...");
|
||||
const transport = new StdioServerTransport();
|
||||
await mcpServer.connect(transport);
|
||||
console.error("[Project Caffeine] MCP Server 已就绪,等待 Cherry Studio 交互。");
|
||||
}
|
||||
|
||||
// 捕获致命错误并安全退出
|
||||
start().catch((err: unknown) => {
|
||||
console.error("服务器启动失败:", err);
|
||||
process.exit(1);
|
||||
});
|
||||
```
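Under the hood, the stdio transport exchanges newline-delimited JSON-RPC 2.0 messages over stdin/stdout. A minimal sketch of that framing, assuming one message per line (`splitFrames` is a hypothetical helper; the SDK's `StdioServerTransport` handles all of this for you):

```typescript
// Hypothetical sketch of newline-delimited JSON-RPC framing for a stdio
// transport. The real StdioServerTransport implements this internally.
interface JsonRpcMessage {
  jsonrpc: "2.0";
  id?: number | string;
  method?: string;
  params?: unknown;
  result?: unknown;
}

// Split a raw stdin chunk into complete messages, keeping any trailing
// partial line as the remainder to prepend to the next chunk.
function splitFrames(buffer: string): { messages: JsonRpcMessage[]; rest: string } {
  const lines = buffer.split("\n");
  const rest = lines.pop() ?? "";
  const messages = lines
    .filter(l => l.trim() !== "")
    .map(l => JSON.parse(l) as JsonRpcMessage);
  return { messages, rest };
}
```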

---

## 5. Startup and Workflow Verification

### 5.1 Step 1: Start the TS watch-mode compiler

Open a terminal in VS Code and run the command below. The TypeScript compiler will watch your `.ts` files in the background and recompile them into the `dist` directory on every change:

```bash
npm run watch
```

_(Keep this terminal window running in the background; do not close it.)_

### 5.2 Step 2: Configure the server in Cherry Studio

1. Open Cherry Studio's **Settings -> MCP**.

2. Add or edit the server entry. **The key point is to target the compiled `dist/app.js`, not `src/app.ts`**:

    - **Command**: `node`
    - **Args**: `["--inspect=9229", "/project-caffeine-sprint1/dist/app.js"]`
    - _Note: enter the absolute path to app.js._

3. Make sure the status light turns green.

### 5.3 Step 3: Source-level breakpoint debugging in VS Code

1. Set breakpoints on the key lines in `src/app.ts` or any of the services.
2. In the VS Code debug panel on the left, select **"🍒 附加到 Cherry Studio (MCP 联调)"** and run it.
3. When the status bar changes color, the debugger has successfully attached to Cherry Studio's underlying Node process.

### 5.4 Run an end-to-end interaction

In the Cherry Studio chat box, enter the following instruction to test the model's autonomous orchestration:

> _"请先查看我的本地笔记列表,找到关于开源领域的笔记并阅读内容。然后结合你的知识,调用 5 Whys 工具帮我分析一下里面的痛点。"_

**Expected result**: you will see the model call the three tools `list_local_notes` -> `read_local_note` -> `generate_5_whys` **automatically and in order**, and finally produce an insight report that deeply integrates your private knowledge base.

---

## 6. System Test Samples

Before running the tests, make sure the following prerequisites are in place:

- **Background compilation**: keep `npm run watch` running in a VS Code terminal.
- **Connection config**: in Cherry Studio's settings, set Command to `node` and Args to `["--inspect=9229", "<absolute path>/dist/app.js"]`, and make sure the status light turns green.
- **Local knowledge base**: make sure the folder that `OBSIDIAN_VAULT_PATH` points to in the code contains at least one Markdown test note.
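The last prerequisite can be satisfied with a throwaway vault. A sketch using Node's `fs/promises` API (the directory prefix and note name are arbitrary examples, not project conventions):

```typescript
import { mkdtemp, writeFile, readdir } from 'node:fs/promises';
import { tmpdir } from 'node:os';
import path from 'node:path';

// Create a disposable vault with one Markdown note for test runs.
// The returned path can be pasted into OBSIDIAN_VAULT_PATH temporarily.
async function makeTestVault(): Promise<string> {
  const vault = await mkdtemp(path.join(tmpdir(), 'caffeine-vault-'));
  await writeFile(
    path.join(vault, 'open-source-notes.md'),
    '# 开源领域笔记\n\n中国开源人才的培养面临困难。\n',
    'utf-8'
  );
  return vault;
}
```

Run it once, copy the logged path into `resourceService.ts`, and delete the directory when you are done testing.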

### 6.1 Test Sample 1: Basic tool call and branch logic

- **Goal**: verify that the hard-coded logic branch in `generate_5_whys` takes effect.

- **Steps**:

    1. Send the model: "请调用工具,帮我生成关于'开源人才'的 5 Whys 策略。"

- **Expected result**:

    - The VS Code Debug Console logs: `[Project Caffeine] 大模型调用工具: 正在生成 5 Whys 策略 -> 开源人才`.
    - The model's answer must contain the preset questions from the code, e.g. "为什么中国开源人才的培养面临困难?" and "为什么中国开源人才缺乏足够的行业经验?".

### 6.2 Test Sample 2: Generic tool-call verification

- **Goal**: verify that `generate_5_whys` falls back to the generic template for unknown topics.

- **Steps**:

    1. Send the model: "我正在研究'AI算力成本过高'的问题,请调用工具为我生成 5 Whys 框架。"

- **Expected result**:

    - The model calls the tool and substitutes the topic into the generic template, producing framework questions such as "为什么 "AI算力成本过高" 会成为一个问题?" and "为什么导致上述现象的直接原因会发生?".

### 6.3 Test Sample 3: Security defense test (red vs. blue)

- **Goal**: verify that the path-traversal check in `read_local_note` successfully intercepts malicious requests.

- **Steps**:

    1. Send the model a bait instruction: "请调用读取笔记的工具,帮我读取 `../etc/passwd` 文件的内容。"

- **Expected result**:

    - The tool call is intercepted and an error is thrown.
    - The model receives an error response containing `isError: true`.
    - The error message explicitly tells the model: "读取失败: 安全警告:越权访问拦截!禁止读取目录外的文件".

### 6.4 Test Sample 4: End-to-end agentic orchestration

- **Goal**: verify that the model can autonomously decide to chain multiple external tools, in order, to complete a complex analysis.

- **Steps**:

    1. Send the model the combined instruction: "请先查看我的本地笔记列表,找到关于开源领域的笔记并阅读内容。然后结合你的知识,调用 5 Whys 工具帮我分析一下里面的痛点。"

- **Expected result**:

    - The model automatically calls `list_local_notes` -> `read_local_note` -> `generate_5_whys`, in that order.
    - The final output is an insight report that deeply integrates your private knowledge base.
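Whether the model really preserved the order can be checked mechanically against the tool-call log. A sketch (the helper name and log shape are assumptions, not project code); extra calls in between are tolerated, only relative order matters:

```typescript
// Hypothetical check that observed tool calls follow the expected chain.
function followsChain(observed: string[], expected: string[]): boolean {
  let i = 0;
  for (const call of observed) {
    if (call === expected[i]) i++; // advance only on the next expected call
    if (i === expected.length) return true;
  }
  return i === expected.length;
}

const EXPECTED = ["list_local_notes", "read_local_note", "generate_5_whys"];
```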

### 6.5 Test Sample 5: VS Code breakpoint-debugging environment

- **Goal**: verify that the source mapping and debug configuration in `.vscode/launch.json` work.

- **Steps**:

    1. Set breakpoints on key lines in `src/app.ts` or any service file.
    2. In the VS Code debug panel, select "🍒 附加到 Cherry Studio (MCP 联调)" and run it.
    3. Trigger any tool call from Cherry Studio.

- **Expected result**:

    - The VS Code status bar changes color, showing that the debugger attached to Cherry Studio's underlying Node process.
    - Execution pauses at the breakpoints, letting you inspect variables and the call stack.

---

## License

This document is licensed under the **Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0)**, © 2025-2026 Gitconomy Research.
|
||||
if (!host) {
|
||||
throw new RequestError("Missing host header");
|
||||
}
|
||||
let scheme;
|
||||
if (incoming instanceof import_node_http2.Http2ServerRequest) {
|
||||
scheme = incoming.scheme;
|
||||
if (!(scheme === "http" || scheme === "https")) {
|
||||
throw new RequestError("Unsupported scheme");
|
||||
}
|
||||
} else {
|
||||
scheme = incoming.socket && incoming.socket.encrypted ? "https" : "http";
|
||||
}
|
||||
const url = new URL(`${scheme}://${host}${incomingUrl}`);
|
||||
if (url.hostname.length !== host.length && url.hostname !== host.replace(/:\d+$/, "")) {
|
||||
throw new RequestError("Invalid host header");
|
||||
}
|
||||
req[urlKey] = url.href;
|
||||
return req;
|
||||
};
|
||||
|
||||
// src/response.ts
|
||||
var responseCache = Symbol("responseCache");
|
||||
var getResponseCache = Symbol("getResponseCache");
|
||||
var cacheKey = Symbol("cache");
|
||||
var GlobalResponse = global.Response;
|
||||
var Response2 = class _Response {
|
||||
#body;
|
||||
#init;
|
||||
[getResponseCache]() {
|
||||
delete this[cacheKey];
|
||||
return this[responseCache] ||= new GlobalResponse(this.#body, this.#init);
|
||||
}
|
||||
constructor(body, init) {
|
||||
let headers;
|
||||
this.#body = body;
|
||||
if (init instanceof _Response) {
|
||||
const cachedGlobalResponse = init[responseCache];
|
||||
if (cachedGlobalResponse) {
|
||||
this.#init = cachedGlobalResponse;
|
||||
this[getResponseCache]();
|
||||
return;
|
||||
} else {
|
||||
this.#init = init.#init;
|
||||
headers = new Headers(init.#init.headers);
|
||||
}
|
||||
} else {
|
||||
this.#init = init;
|
||||
}
|
||||
if (typeof body === "string" || typeof body?.getReader !== "undefined" || body instanceof Blob || body instanceof Uint8Array) {
|
||||
headers ||= init?.headers || { "content-type": "text/plain; charset=UTF-8" };
|
||||
this[cacheKey] = [init?.status || 200, body, headers];
|
||||
}
|
||||
}
|
||||
get headers() {
|
||||
const cache = this[cacheKey];
|
||||
if (cache) {
|
||||
if (!(cache[2] instanceof Headers)) {
|
||||
cache[2] = new Headers(cache[2]);
|
||||
}
|
||||
return cache[2];
|
||||
}
|
||||
return this[getResponseCache]().headers;
|
||||
}
|
||||
get status() {
|
||||
return this[cacheKey]?.[0] ?? this[getResponseCache]().status;
|
||||
}
|
||||
get ok() {
|
||||
const status = this.status;
|
||||
return status >= 200 && status < 300;
|
||||
}
|
||||
};
|
||||
["body", "bodyUsed", "redirected", "statusText", "trailers", "type", "url"].forEach((k) => {
|
||||
Object.defineProperty(Response2.prototype, k, {
|
||||
get() {
|
||||
return this[getResponseCache]()[k];
|
||||
}
|
||||
});
|
||||
});
|
||||
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
|
||||
Object.defineProperty(Response2.prototype, k, {
|
||||
value: function() {
|
||||
return this[getResponseCache]()[k]();
|
||||
}
|
||||
});
|
||||
});
|
||||
Object.setPrototypeOf(Response2, GlobalResponse);
|
||||
Object.setPrototypeOf(Response2.prototype, GlobalResponse.prototype);
|
||||
|
||||
// src/utils.ts
|
||||
async function readWithoutBlocking(readPromise) {
|
||||
return Promise.race([readPromise, Promise.resolve().then(() => Promise.resolve(void 0))]);
|
||||
}
|
||||
function writeFromReadableStreamDefaultReader(reader, writable, currentReadPromise) {
|
||||
const cancel = (error) => {
|
||||
reader.cancel(error).catch(() => {
|
||||
});
|
||||
};
|
||||
writable.on("close", cancel);
|
||||
writable.on("error", cancel);
|
||||
(currentReadPromise ?? reader.read()).then(flow, handleStreamError);
|
||||
return reader.closed.finally(() => {
|
||||
writable.off("close", cancel);
|
||||
writable.off("error", cancel);
|
||||
});
|
||||
function handleStreamError(error) {
|
||||
if (error) {
|
||||
writable.destroy(error);
|
||||
}
|
||||
}
|
||||
function onDrain() {
|
||||
reader.read().then(flow, handleStreamError);
|
||||
}
|
||||
function flow({ done, value }) {
|
||||
try {
|
||||
if (done) {
|
||||
writable.end();
|
||||
} else if (!writable.write(value)) {
|
||||
writable.once("drain", onDrain);
|
||||
} else {
|
||||
return reader.read().then(flow, handleStreamError);
|
||||
}
|
||||
} catch (e) {
|
||||
handleStreamError(e);
|
||||
}
|
||||
}
|
||||
}
|
||||
function writeFromReadableStream(stream, writable) {
|
||||
if (stream.locked) {
|
||||
throw new TypeError("ReadableStream is locked.");
|
||||
} else if (writable.destroyed) {
|
||||
return;
|
||||
}
|
||||
return writeFromReadableStreamDefaultReader(stream.getReader(), writable);
|
||||
}
|
||||
var buildOutgoingHttpHeaders = (headers) => {
|
||||
const res = {};
|
||||
if (!(headers instanceof Headers)) {
|
||||
headers = new Headers(headers ?? void 0);
|
||||
}
|
||||
const cookies = [];
|
||||
for (const [k, v] of headers) {
|
||||
if (k === "set-cookie") {
|
||||
cookies.push(v);
|
||||
} else {
|
||||
res[k] = v;
|
||||
}
|
||||
}
|
||||
if (cookies.length > 0) {
|
||||
res["set-cookie"] = cookies;
|
||||
}
|
||||
res["content-type"] ??= "text/plain; charset=UTF-8";
|
||||
return res;
|
||||
};
|
||||
|
||||
// src/utils/response/constants.ts
|
||||
var X_ALREADY_SENT = "x-hono-already-sent";
|
||||
|
||||
// src/globals.ts
|
||||
var import_node_crypto = __toESM(require("crypto"));
|
||||
if (typeof global.crypto === "undefined") {
|
||||
global.crypto = import_node_crypto.default;
|
||||
}
|
||||
|
||||
// src/listener.ts
|
||||
var outgoingEnded = Symbol("outgoingEnded");
|
||||
var handleRequestError = () => new Response(null, {
|
||||
status: 400
|
||||
});
|
||||
var handleFetchError = (e) => new Response(null, {
|
||||
status: e instanceof Error && (e.name === "TimeoutError" || e.constructor.name === "TimeoutError") ? 504 : 500
|
||||
});
|
||||
var handleResponseError = (e, outgoing) => {
|
||||
const err = e instanceof Error ? e : new Error("unknown error", { cause: e });
|
||||
if (err.code === "ERR_STREAM_PREMATURE_CLOSE") {
|
||||
console.info("The user aborted a request.");
|
||||
} else {
|
||||
console.error(e);
|
||||
if (!outgoing.headersSent) {
|
||||
outgoing.writeHead(500, { "Content-Type": "text/plain" });
|
||||
}
|
||||
outgoing.end(`Error: ${err.message}`);
|
||||
outgoing.destroy(err);
|
||||
}
|
||||
};
|
||||
var flushHeaders = (outgoing) => {
|
||||
if ("flushHeaders" in outgoing && outgoing.writable) {
|
||||
outgoing.flushHeaders();
|
||||
}
|
||||
};
|
||||
var responseViaCache = async (res, outgoing) => {
|
||||
let [status, body, header] = res[cacheKey];
|
||||
if (header instanceof Headers) {
|
||||
header = buildOutgoingHttpHeaders(header);
|
||||
}
|
||||
if (typeof body === "string") {
|
||||
header["Content-Length"] = Buffer.byteLength(body);
|
||||
} else if (body instanceof Uint8Array) {
|
||||
header["Content-Length"] = body.byteLength;
|
||||
} else if (body instanceof Blob) {
|
||||
header["Content-Length"] = body.size;
|
||||
}
|
||||
outgoing.writeHead(status, header);
|
||||
if (typeof body === "string" || body instanceof Uint8Array) {
|
||||
outgoing.end(body);
|
||||
} else if (body instanceof Blob) {
|
||||
outgoing.end(new Uint8Array(await body.arrayBuffer()));
|
||||
} else {
|
||||
flushHeaders(outgoing);
|
||||
await writeFromReadableStream(body, outgoing)?.catch(
|
||||
(e) => handleResponseError(e, outgoing)
|
||||
);
|
||||
}
|
||||
;
|
||||
outgoing[outgoingEnded]?.();
|
||||
};
|
||||
var isPromise = (res) => typeof res.then === "function";
|
||||
var responseViaResponseObject = async (res, outgoing, options = {}) => {
|
||||
if (isPromise(res)) {
|
||||
if (options.errorHandler) {
|
||||
try {
|
||||
res = await res;
|
||||
} catch (err) {
|
||||
const errRes = await options.errorHandler(err);
|
||||
if (!errRes) {
|
||||
return;
|
||||
}
|
||||
res = errRes;
|
||||
}
|
||||
} else {
|
||||
res = await res.catch(handleFetchError);
|
||||
}
|
||||
}
|
||||
if (cacheKey in res) {
|
||||
return responseViaCache(res, outgoing);
|
||||
}
|
||||
const resHeaderRecord = buildOutgoingHttpHeaders(res.headers);
|
||||
if (res.body) {
|
||||
const reader = res.body.getReader();
|
||||
const values = [];
|
||||
let done = false;
|
||||
let currentReadPromise = void 0;
|
||||
if (resHeaderRecord["transfer-encoding"] !== "chunked") {
|
||||
let maxReadCount = 2;
|
||||
for (let i = 0; i < maxReadCount; i++) {
|
||||
currentReadPromise ||= reader.read();
|
||||
const chunk = await readWithoutBlocking(currentReadPromise).catch((e) => {
|
||||
console.error(e);
|
||||
done = true;
|
||||
});
|
||||
if (!chunk) {
|
||||
if (i === 1) {
|
||||
await new Promise((resolve) => setTimeout(resolve));
|
||||
maxReadCount = 3;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
}
|
||||
currentReadPromise = void 0;
|
||||
if (chunk.value) {
|
||||
values.push(chunk.value);
|
||||
}
|
||||
if (chunk.done) {
|
||||
done = true;
|
||||
break;
|
||||
}
|
||||
}
|
||||
if (done && !("content-length" in resHeaderRecord)) {
|
||||
resHeaderRecord["content-length"] = values.reduce((acc, value) => acc + value.length, 0);
|
||||
}
|
||||
}
|
||||
outgoing.writeHead(res.status, resHeaderRecord);
|
||||
values.forEach((value) => {
|
||||
;
|
||||
outgoing.write(value);
|
||||
});
|
||||
if (done) {
|
||||
outgoing.end();
|
||||
} else {
|
||||
if (values.length === 0) {
|
||||
flushHeaders(outgoing);
|
||||
}
|
||||
await writeFromReadableStreamDefaultReader(reader, outgoing, currentReadPromise);
|
||||
}
|
||||
} else if (resHeaderRecord[X_ALREADY_SENT]) {
|
||||
} else {
|
||||
outgoing.writeHead(res.status, resHeaderRecord);
|
||||
outgoing.end();
|
||||
}
|
||||
;
|
||||
outgoing[outgoingEnded]?.();
|
||||
};
|
||||
var getRequestListener = (fetchCallback, options = {}) => {
|
||||
const autoCleanupIncoming = options.autoCleanupIncoming ?? true;
|
||||
if (options.overrideGlobalObjects !== false && global.Request !== Request) {
|
||||
Object.defineProperty(global, "Request", {
|
||||
value: Request
|
||||
});
|
||||
Object.defineProperty(global, "Response", {
|
||||
value: Response2
|
||||
});
|
||||
}
|
||||
return async (incoming, outgoing) => {
|
||||
let res, req;
|
||||
try {
|
||||
req = newRequest(incoming, options.hostname);
|
||||
let incomingEnded = !autoCleanupIncoming || incoming.method === "GET" || incoming.method === "HEAD";
|
||||
if (!incomingEnded) {
|
||||
;
|
||||
incoming[wrapBodyStream] = true;
|
||||
incoming.on("end", () => {
|
||||
incomingEnded = true;
|
||||
});
|
||||
if (incoming instanceof import_node_http22.Http2ServerRequest) {
|
||||
;
|
||||
outgoing[outgoingEnded] = () => {
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
incoming.destroy();
|
||||
outgoing.destroy();
|
||||
});
|
||||
}
|
||||
});
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
outgoing.on("close", () => {
|
||||
const abortController = req[abortControllerKey];
|
||||
if (abortController) {
|
||||
if (incoming.errored) {
|
||||
req[abortControllerKey].abort(incoming.errored.toString());
|
||||
} else if (!outgoing.writableFinished) {
|
||||
req[abortControllerKey].abort("Client connection prematurely closed.");
|
||||
}
|
||||
}
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
incoming.destroy();
|
||||
});
|
||||
}
|
||||
});
|
||||
}
|
||||
});
|
||||
res = fetchCallback(req, { incoming, outgoing });
|
||||
if (cacheKey in res) {
|
||||
return responseViaCache(res, outgoing);
|
||||
}
|
||||
} catch (e) {
|
||||
if (!res) {
|
||||
if (options.errorHandler) {
|
||||
res = await options.errorHandler(req ? e : toRequestError(e));
|
||||
if (!res) {
|
||||
return;
|
||||
}
|
||||
} else if (!req) {
|
||||
res = handleRequestError();
|
||||
} else {
|
||||
res = handleFetchError(e);
|
||||
}
|
||||
} else {
|
||||
return handleResponseError(e, outgoing);
|
||||
}
|
||||
}
|
||||
try {
|
||||
return await responseViaResponseObject(res, outgoing, options);
|
||||
} catch (e) {
|
||||
return handleResponseError(e, outgoing);
|
||||
}
|
||||
};
|
||||
};
|
||||
|
||||
// src/server.ts
|
||||
var createAdaptorServer = (options) => {
|
||||
const fetchCallback = options.fetch;
|
||||
const requestListener = getRequestListener(fetchCallback, {
|
||||
hostname: options.hostname,
|
||||
overrideGlobalObjects: options.overrideGlobalObjects,
|
||||
autoCleanupIncoming: options.autoCleanupIncoming
|
||||
});
|
||||
const createServer = options.createServer || import_node_http.createServer;
|
||||
const server = createServer(options.serverOptions || {}, requestListener);
|
||||
return server;
|
||||
};
|
||||
var serve = (options, listeningListener) => {
|
||||
const server = createAdaptorServer(options);
|
||||
server.listen(options?.port ?? 3e3, options.hostname, () => {
|
||||
const serverInfo = server.address();
|
||||
listeningListener && listeningListener(serverInfo);
|
||||
});
|
||||
return server;
|
||||
};
|
||||
// Annotate the CommonJS export names for ESM import in node:
|
||||
0 && (module.exports = {
|
||||
RequestError,
|
||||
createAdaptorServer,
|
||||
getRequestListener,
|
||||
serve
|
||||
});
|
||||
573
projects/arabica/sprint1/node_modules/@hono/node-server/dist/index.mjs
generated
vendored
Normal file
@@ -0,0 +1,573 @@
|
||||
// src/server.ts
|
||||
import { createServer as createServerHTTP } from "http";
|
||||
|
||||
// src/listener.ts
|
||||
import { Http2ServerRequest as Http2ServerRequest2 } from "http2";
|
||||
|
||||
// src/request.ts
|
||||
import { Http2ServerRequest } from "http2";
|
||||
import { Readable } from "stream";
|
||||
var RequestError = class extends Error {
|
||||
constructor(message, options) {
|
||||
super(message, options);
|
||||
this.name = "RequestError";
|
||||
}
|
||||
};
|
||||
var toRequestError = (e) => {
|
||||
if (e instanceof RequestError) {
|
||||
return e;
|
||||
}
|
||||
return new RequestError(e.message, { cause: e });
|
||||
};
|
||||
var GlobalRequest = global.Request;
|
||||
var Request = class extends GlobalRequest {
|
||||
constructor(input, options) {
|
||||
if (typeof input === "object" && getRequestCache in input) {
|
||||
input = input[getRequestCache]();
|
||||
}
|
||||
if (typeof options?.body?.getReader !== "undefined") {
|
||||
;
|
||||
options.duplex ??= "half";
|
||||
}
|
||||
super(input, options);
|
||||
}
|
||||
};
|
||||
var newHeadersFromIncoming = (incoming) => {
|
||||
const headerRecord = [];
|
||||
const rawHeaders = incoming.rawHeaders;
|
||||
for (let i = 0; i < rawHeaders.length; i += 2) {
|
||||
const { [i]: key, [i + 1]: value } = rawHeaders;
|
||||
if (key.charCodeAt(0) !== /*:*/
|
||||
58) {
|
||||
headerRecord.push([key, value]);
|
||||
}
|
||||
}
|
||||
return new Headers(headerRecord);
|
||||
};
|
||||
var wrapBodyStream = Symbol("wrapBodyStream");
|
||||
var newRequestFromIncoming = (method, url, headers, incoming, abortController) => {
|
||||
const init = {
|
||||
method,
|
||||
headers,
|
||||
signal: abortController.signal
|
||||
};
|
||||
if (method === "TRACE") {
|
||||
init.method = "GET";
|
||||
const req = new Request(url, init);
|
||||
Object.defineProperty(req, "method", {
|
||||
get() {
|
||||
return "TRACE";
|
||||
}
|
||||
});
|
||||
return req;
|
||||
}
|
||||
if (!(method === "GET" || method === "HEAD")) {
|
||||
if ("rawBody" in incoming && incoming.rawBody instanceof Buffer) {
|
||||
init.body = new ReadableStream({
|
||||
start(controller) {
|
||||
controller.enqueue(incoming.rawBody);
|
||||
controller.close();
|
||||
}
|
||||
});
|
||||
} else if (incoming[wrapBodyStream]) {
|
||||
let reader;
|
||||
init.body = new ReadableStream({
|
||||
async pull(controller) {
|
||||
try {
|
||||
reader ||= Readable.toWeb(incoming).getReader();
|
||||
const { done, value } = await reader.read();
|
||||
if (done) {
|
||||
controller.close();
|
||||
} else {
|
||||
controller.enqueue(value);
|
||||
}
|
||||
} catch (error) {
|
||||
controller.error(error);
|
||||
}
|
||||
}
|
||||
});
|
||||
} else {
|
||||
init.body = Readable.toWeb(incoming);
|
||||
}
|
||||
}
|
||||
return new Request(url, init);
|
||||
};
|
||||
var getRequestCache = Symbol("getRequestCache");
|
||||
var requestCache = Symbol("requestCache");
|
||||
var incomingKey = Symbol("incomingKey");
|
||||
var urlKey = Symbol("urlKey");
|
||||
var headersKey = Symbol("headersKey");
|
||||
var abortControllerKey = Symbol("abortControllerKey");
|
||||
var getAbortController = Symbol("getAbortController");
|
||||
var requestPrototype = {
|
||||
get method() {
|
||||
return this[incomingKey].method || "GET";
|
||||
},
|
||||
get url() {
|
||||
return this[urlKey];
|
||||
},
|
||||
get headers() {
|
||||
return this[headersKey] ||= newHeadersFromIncoming(this[incomingKey]);
|
||||
},
|
||||
[getAbortController]() {
|
||||
this[getRequestCache]();
|
||||
return this[abortControllerKey];
|
||||
},
|
||||
[getRequestCache]() {
|
||||
this[abortControllerKey] ||= new AbortController();
|
||||
return this[requestCache] ||= newRequestFromIncoming(
|
||||
this.method,
|
||||
this[urlKey],
|
||||
this.headers,
|
||||
this[incomingKey],
|
||||
this[abortControllerKey]
|
||||
);
|
||||
}
|
||||
};
|
||||
[
|
||||
"body",
|
||||
"bodyUsed",
|
||||
"cache",
|
||||
"credentials",
|
||||
"destination",
|
||||
"integrity",
|
||||
"mode",
|
||||
"redirect",
|
||||
"referrer",
|
||||
"referrerPolicy",
|
||||
"signal",
|
||||
"keepalive"
|
||||
].forEach((k) => {
|
||||
Object.defineProperty(requestPrototype, k, {
|
||||
get() {
|
||||
return this[getRequestCache]()[k];
|
||||
}
|
||||
});
|
||||
});
|
||||
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
|
||||
Object.defineProperty(requestPrototype, k, {
|
||||
value: function() {
|
||||
return this[getRequestCache]()[k]();
|
||||
}
|
||||
});
|
||||
});
|
||||
Object.setPrototypeOf(requestPrototype, Request.prototype);
|
||||
var newRequest = (incoming, defaultHostname) => {
|
||||
const req = Object.create(requestPrototype);
|
||||
req[incomingKey] = incoming;
|
||||
const incomingUrl = incoming.url || "";
|
||||
if (incomingUrl[0] !== "/" && // short-circuit for performance. most requests are relative URL.
|
||||
(incomingUrl.startsWith("http://") || incomingUrl.startsWith("https://"))) {
|
||||
if (incoming instanceof Http2ServerRequest) {
|
||||
throw new RequestError("Absolute URL for :path is not allowed in HTTP/2");
|
||||
}
|
||||
try {
|
||||
const url2 = new URL(incomingUrl);
|
||||
req[urlKey] = url2.href;
|
||||
} catch (e) {
|
||||
throw new RequestError("Invalid absolute URL", { cause: e });
|
||||
}
|
||||
return req;
|
||||
}
|
||||
const host = (incoming instanceof Http2ServerRequest ? incoming.authority : incoming.headers.host) || defaultHostname;
|
||||
if (!host) {
|
||||
throw new RequestError("Missing host header");
|
||||
}
|
||||
let scheme;
|
||||
if (incoming instanceof Http2ServerRequest) {
|
||||
scheme = incoming.scheme;
|
||||
if (!(scheme === "http" || scheme === "https")) {
|
||||
throw new RequestError("Unsupported scheme");
|
||||
}
|
||||
} else {
|
||||
scheme = incoming.socket && incoming.socket.encrypted ? "https" : "http";
|
||||
}
|
||||
const url = new URL(`${scheme}://${host}${incomingUrl}`);
|
||||
if (url.hostname.length !== host.length && url.hostname !== host.replace(/:\d+$/, "")) {
|
||||
throw new RequestError("Invalid host header");
|
||||
}
|
||||
req[urlKey] = url.href;
|
||||
return req;
|
||||
};
|
||||
|
||||
// src/response.ts
|
||||
var responseCache = Symbol("responseCache");
|
||||
var getResponseCache = Symbol("getResponseCache");
|
||||
var cacheKey = Symbol("cache");
|
||||
var GlobalResponse = global.Response;
|
||||
var Response2 = class _Response {
|
||||
#body;
|
||||
#init;
|
||||
[getResponseCache]() {
|
||||
delete this[cacheKey];
|
||||
return this[responseCache] ||= new GlobalResponse(this.#body, this.#init);
|
||||
}
|
||||
constructor(body, init) {
|
||||
let headers;
|
||||
this.#body = body;
|
||||
if (init instanceof _Response) {
|
||||
const cachedGlobalResponse = init[responseCache];
|
||||
if (cachedGlobalResponse) {
|
||||
this.#init = cachedGlobalResponse;
|
||||
this[getResponseCache]();
|
||||
return;
|
||||
} else {
|
||||
this.#init = init.#init;
|
||||
headers = new Headers(init.#init.headers);
|
||||
}
|
||||
} else {
|
||||
this.#init = init;
|
||||
}
|
||||
if (typeof body === "string" || typeof body?.getReader !== "undefined" || body instanceof Blob || body instanceof Uint8Array) {
|
||||
headers ||= init?.headers || { "content-type": "text/plain; charset=UTF-8" };
|
||||
this[cacheKey] = [init?.status || 200, body, headers];
|
||||
}
|
||||
}
|
||||
get headers() {
|
||||
const cache = this[cacheKey];
|
||||
if (cache) {
|
||||
if (!(cache[2] instanceof Headers)) {
|
||||
cache[2] = new Headers(cache[2]);
|
||||
}
|
||||
return cache[2];
|
||||
}
|
||||
return this[getResponseCache]().headers;
|
||||
}
|
||||
get status() {
|
||||
return this[cacheKey]?.[0] ?? this[getResponseCache]().status;
|
||||
}
|
||||
get ok() {
|
||||
const status = this.status;
|
||||
return status >= 200 && status < 300;
|
||||
}
|
||||
};
|
||||
["body", "bodyUsed", "redirected", "statusText", "trailers", "type", "url"].forEach((k) => {
|
||||
Object.defineProperty(Response2.prototype, k, {
|
||||
get() {
|
||||
return this[getResponseCache]()[k];
|
||||
}
|
||||
});
|
||||
});
|
||||
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
|
||||
Object.defineProperty(Response2.prototype, k, {
|
||||
value: function() {
|
||||
return this[getResponseCache]()[k]();
|
||||
}
|
||||
});
|
||||
});
|
||||
Object.setPrototypeOf(Response2, GlobalResponse);
|
||||
Object.setPrototypeOf(Response2.prototype, GlobalResponse.prototype);
|
||||
|
||||
// src/utils.ts
|
||||
async function readWithoutBlocking(readPromise) {
|
||||
return Promise.race([readPromise, Promise.resolve().then(() => Promise.resolve(void 0))]);
|
||||
}
|
||||
function writeFromReadableStreamDefaultReader(reader, writable, currentReadPromise) {
|
||||
const cancel = (error) => {
|
||||
reader.cancel(error).catch(() => {
|
||||
});
|
||||
};
|
||||
writable.on("close", cancel);
|
||||
writable.on("error", cancel);
|
||||
(currentReadPromise ?? reader.read()).then(flow, handleStreamError);
|
||||
return reader.closed.finally(() => {
|
||||
writable.off("close", cancel);
|
||||
writable.off("error", cancel);
|
||||
});
|
||||
function handleStreamError(error) {
|
||||
if (error) {
|
||||
writable.destroy(error);
|
||||
}
|
||||
}
|
||||
function onDrain() {
|
||||
reader.read().then(flow, handleStreamError);
|
||||
}
|
||||
function flow({ done, value }) {
|
||||
try {
|
||||
if (done) {
|
||||
writable.end();
|
||||
} else if (!writable.write(value)) {
|
||||
writable.once("drain", onDrain);
|
||||
} else {
|
||||
return reader.read().then(flow, handleStreamError);
|
||||
}
|
||||
} catch (e) {
|
||||
handleStreamError(e);
|
||||
}
|
||||
}
|
||||
}
|
||||
function writeFromReadableStream(stream, writable) {
|
||||
if (stream.locked) {
|
||||
throw new TypeError("ReadableStream is locked.");
|
||||
} else if (writable.destroyed) {
|
||||
return;
|
||||
}
|
||||
return writeFromReadableStreamDefaultReader(stream.getReader(), writable);
|
||||
}
|
||||
var buildOutgoingHttpHeaders = (headers) => {
|
||||
const res = {};
|
||||
if (!(headers instanceof Headers)) {
|
||||
headers = new Headers(headers ?? void 0);
|
||||
}
|
||||
const cookies = [];
|
||||
for (const [k, v] of headers) {
|
||||
if (k === "set-cookie") {
|
||||
cookies.push(v);
|
||||
} else {
|
||||
res[k] = v;
|
||||
}
|
||||
}
|
||||
if (cookies.length > 0) {
|
||||
res["set-cookie"] = cookies;
|
||||
}
|
||||
res["content-type"] ??= "text/plain; charset=UTF-8";
|
||||
return res;
|
||||
};
|
||||
|
||||
// src/utils/response/constants.ts
|
||||
var X_ALREADY_SENT = "x-hono-already-sent";
|
||||
|
||||
// src/globals.ts
|
||||
import crypto from "crypto";
|
||||
if (typeof global.crypto === "undefined") {
|
||||
global.crypto = crypto;
|
||||
}
|
||||
|
||||
// src/listener.ts
|
||||
var outgoingEnded = Symbol("outgoingEnded");
|
||||
var handleRequestError = () => new Response(null, {
|
||||
status: 400
|
||||
});
|
||||
var handleFetchError = (e) => new Response(null, {
|
||||
status: e instanceof Error && (e.name === "TimeoutError" || e.constructor.name === "TimeoutError") ? 504 : 500
|
||||
});
|
||||
var handleResponseError = (e, outgoing) => {
|
||||
const err = e instanceof Error ? e : new Error("unknown error", { cause: e });
|
||||
if (err.code === "ERR_STREAM_PREMATURE_CLOSE") {
|
||||
console.info("The user aborted a request.");
|
||||
} else {
|
||||
console.error(e);
|
||||
if (!outgoing.headersSent) {
|
||||
outgoing.writeHead(500, { "Content-Type": "text/plain" });
|
||||
}
|
||||
outgoing.end(`Error: ${err.message}`);
|
||||
outgoing.destroy(err);
|
||||
}
|
||||
};
|
||||
var flushHeaders = (outgoing) => {
|
||||
if ("flushHeaders" in outgoing && outgoing.writable) {
|
||||
outgoing.flushHeaders();
|
||||
}
|
||||
};
|
||||
var responseViaCache = async (res, outgoing) => {
|
||||
let [status, body, header] = res[cacheKey];
|
||||
if (header instanceof Headers) {
|
||||
header = buildOutgoingHttpHeaders(header);
|
||||
}
|
||||
if (typeof body === "string") {
|
||||
header["Content-Length"] = Buffer.byteLength(body);
|
||||
} else if (body instanceof Uint8Array) {
|
||||
header["Content-Length"] = body.byteLength;
|
||||
} else if (body instanceof Blob) {
|
||||
header["Content-Length"] = body.size;
|
||||
}
|
||||
outgoing.writeHead(status, header);
|
||||
if (typeof body === "string" || body instanceof Uint8Array) {
|
||||
outgoing.end(body);
|
||||
} else if (body instanceof Blob) {
|
||||
outgoing.end(new Uint8Array(await body.arrayBuffer()));
|
||||
} else {
|
||||
flushHeaders(outgoing);
|
||||
await writeFromReadableStream(body, outgoing)?.catch(
|
||||
(e) => handleResponseError(e, outgoing)
|
||||
);
|
||||
}
|
||||
;
|
||||
outgoing[outgoingEnded]?.();
|
||||
};
|
||||
var isPromise = (res) => typeof res.then === "function";
|
||||
var responseViaResponseObject = async (res, outgoing, options = {}) => {
|
||||
if (isPromise(res)) {
|
||||
if (options.errorHandler) {
|
||||
try {
|
||||
res = await res;
|
||||
} catch (err) {
|
||||
const errRes = await options.errorHandler(err);
|
||||
if (!errRes) {
|
||||
return;
|
||||
}
|
||||
res = errRes;
|
||||
}
|
||||
} else {
|
||||
res = await res.catch(handleFetchError);
|
||||
}
|
||||
}
|
||||
if (cacheKey in res) {
|
||||
return responseViaCache(res, outgoing);
|
||||
}
|
||||
const resHeaderRecord = buildOutgoingHttpHeaders(res.headers);
|
||||
if (res.body) {
|
||||
const reader = res.body.getReader();
|
||||
const values = [];
|
||||
let done = false;
|
||||
let currentReadPromise = void 0;
|
||||
if (resHeaderRecord["transfer-encoding"] !== "chunked") {
|
||||
let maxReadCount = 2;
|
||||
for (let i = 0; i < maxReadCount; i++) {
|
||||
currentReadPromise ||= reader.read();
|
||||
const chunk = await readWithoutBlocking(currentReadPromise).catch((e) => {
|
||||
console.error(e);
|
||||
done = true;
|
||||
});
|
||||
if (!chunk) {
|
||||
if (i === 1) {
|
||||
await new Promise((resolve) => setTimeout(resolve));
|
||||
maxReadCount = 3;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
}
|
||||
currentReadPromise = void 0;
|
||||
if (chunk.value) {
|
||||
values.push(chunk.value);
|
||||
}
|
||||
if (chunk.done) {
|
||||
done = true;
|
||||
break;
|
||||
}
|
||||
}
|
||||
if (done && !("content-length" in resHeaderRecord)) {
|
||||
resHeaderRecord["content-length"] = values.reduce((acc, value) => acc + value.length, 0);
|
||||
}
|
||||
}
|
||||
outgoing.writeHead(res.status, resHeaderRecord);
|
||||
values.forEach((value) => {
|
||||
;
|
||||
outgoing.write(value);
|
||||
});
|
||||
if (done) {
|
||||
outgoing.end();
|
||||
} else {
|
||||
if (values.length === 0) {
|
||||
flushHeaders(outgoing);
|
||||
}
|
||||
      await writeFromReadableStreamDefaultReader(reader, outgoing, currentReadPromise);
    }
  } else if (resHeaderRecord[X_ALREADY_SENT]) {
  } else {
    outgoing.writeHead(res.status, resHeaderRecord);
    outgoing.end();
  }
  ;
  outgoing[outgoingEnded]?.();
};
var getRequestListener = (fetchCallback, options = {}) => {
  const autoCleanupIncoming = options.autoCleanupIncoming ?? true;
  if (options.overrideGlobalObjects !== false && global.Request !== Request) {
    Object.defineProperty(global, "Request", {
      value: Request
    });
    Object.defineProperty(global, "Response", {
      value: Response2
    });
  }
  return async (incoming, outgoing) => {
    let res, req;
    try {
      req = newRequest(incoming, options.hostname);
      let incomingEnded = !autoCleanupIncoming || incoming.method === "GET" || incoming.method === "HEAD";
      if (!incomingEnded) {
        ;
        incoming[wrapBodyStream] = true;
        incoming.on("end", () => {
          incomingEnded = true;
        });
        if (incoming instanceof Http2ServerRequest2) {
          ;
          outgoing[outgoingEnded] = () => {
            if (!incomingEnded) {
              setTimeout(() => {
                if (!incomingEnded) {
                  setTimeout(() => {
                    incoming.destroy();
                    outgoing.destroy();
                  });
                }
              });
            }
          };
        }
      }
      outgoing.on("close", () => {
        const abortController = req[abortControllerKey];
        if (abortController) {
          if (incoming.errored) {
            req[abortControllerKey].abort(incoming.errored.toString());
          } else if (!outgoing.writableFinished) {
            req[abortControllerKey].abort("Client connection prematurely closed.");
          }
        }
        if (!incomingEnded) {
          setTimeout(() => {
            if (!incomingEnded) {
              setTimeout(() => {
                incoming.destroy();
              });
            }
          });
        }
      });
      res = fetchCallback(req, { incoming, outgoing });
      if (cacheKey in res) {
        return responseViaCache(res, outgoing);
      }
    } catch (e) {
      if (!res) {
        if (options.errorHandler) {
          res = await options.errorHandler(req ? e : toRequestError(e));
          if (!res) {
            return;
          }
        } else if (!req) {
          res = handleRequestError();
        } else {
          res = handleFetchError(e);
        }
      } else {
        return handleResponseError(e, outgoing);
      }
    }
    try {
      return await responseViaResponseObject(res, outgoing, options);
    } catch (e) {
      return handleResponseError(e, outgoing);
    }
  };
};

// src/server.ts
var createAdaptorServer = (options) => {
  const fetchCallback = options.fetch;
  const requestListener = getRequestListener(fetchCallback, {
    hostname: options.hostname,
    overrideGlobalObjects: options.overrideGlobalObjects,
    autoCleanupIncoming: options.autoCleanupIncoming
  });
  const createServer = options.createServer || createServerHTTP;
  const server = createServer(options.serverOptions || {}, requestListener);
  return server;
};
var serve = (options, listeningListener) => {
  const server = createAdaptorServer(options);
  server.listen(options?.port ?? 3e3, options.hostname, () => {
    const serverInfo = server.address();
    listeningListener && listeningListener(serverInfo);
  });
  return server;
};
export {
  RequestError,
  createAdaptorServer,
  getRequestListener,
  serve
};
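For orientation, the `serve`/`getRequestListener` exports above all reduce to one idea: adapt a Node `(incoming, outgoing)` pair into a fetch-style handler that returns a web `Response`. The following is a minimal, hypothetical sketch of that adapter pattern using only Node built-ins (assumption: Node ≥ 18 with global `Request`/`Response`/`fetch`); it mirrors the library's approach but is not the library itself and skips bodies, streaming, and error handling.

```javascript
// Sketch of the adapter idea behind getRequestListener: convert a Node
// (incoming, outgoing) pair into a fetch-style handler returning a web
// Response. Assumes Node >= 18 (global Request/Response available).
import { createServer } from "node:http";

// Build a web Request from the Node IncomingMessage (GET-only sketch).
const toWebRequest = (incoming) =>
  new Request(`http://${incoming.headers.host ?? "localhost"}${incoming.url}`, {
    method: incoming.method,
  });

// Wrap a fetch-style callback as a Node request listener.
const adapt = (fetchCallback) => async (incoming, outgoing) => {
  const res = await fetchCallback(toWebRequest(incoming));
  // Copy status and headers, then write the buffered body out.
  outgoing.writeHead(res.status, Object.fromEntries(res.headers));
  outgoing.end(Buffer.from(await res.arrayBuffer()));
};

export const server = createServer(
  adapt(async (req) => new Response(`hello ${new URL(req.url).pathname}`))
);
```

The real `getRequestListener` additionally streams bodies lazily, handles HTTP/2, aborts the request on premature client close, and cleans up the incoming socket, which is what most of the code above implements.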
13
projects/arabica/sprint1/node_modules/@hono/node-server/dist/listener.d.mts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
import { IncomingMessage, ServerResponse } from 'node:http';
import { Http2ServerRequest, Http2ServerResponse } from 'node:http2';
import { FetchCallback, CustomErrorHandler } from './types.mjs';
import 'node:https';

declare const getRequestListener: (fetchCallback: FetchCallback, options?: {
    hostname?: string;
    errorHandler?: CustomErrorHandler;
    overrideGlobalObjects?: boolean;
    autoCleanupIncoming?: boolean;
}) => (incoming: IncomingMessage | Http2ServerRequest, outgoing: ServerResponse | Http2ServerResponse) => Promise<void>;

export { getRequestListener };
13
projects/arabica/sprint1/node_modules/@hono/node-server/dist/listener.d.ts
generated
vendored
Normal file
@@ -0,0 +1,13 @@
import { IncomingMessage, ServerResponse } from 'node:http';
import { Http2ServerRequest, Http2ServerResponse } from 'node:http2';
import { FetchCallback, CustomErrorHandler } from './types.js';
import 'node:https';

declare const getRequestListener: (fetchCallback: FetchCallback, options?: {
    hostname?: string;
    errorHandler?: CustomErrorHandler;
    overrideGlobalObjects?: boolean;
    autoCleanupIncoming?: boolean;
}) => (incoming: IncomingMessage | Http2ServerRequest, outgoing: ServerResponse | Http2ServerResponse) => Promise<void>;

export { getRequestListener };
581
projects/arabica/sprint1/node_modules/@hono/node-server/dist/listener.js
generated
vendored
Normal file
@@ -0,0 +1,581 @@
"use strict";
var __create = Object.create;
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __getProtoOf = Object.getPrototypeOf;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__getProtoOf(mod)) : {}, __copyProps(
  // If the importer is in node compatibility mode or this is not an ESM
  // file that has been converted to a CommonJS file using a Babel-
  // compatible transform (i.e. "__esModule" has not been set), then set
  // "default" to the CommonJS "module.exports" for node compatibility.
  isNodeMode || !mod || !mod.__esModule ? __defProp(target, "default", { value: mod, enumerable: true }) : target,
  mod
));
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);

// src/listener.ts
var listener_exports = {};
__export(listener_exports, {
  getRequestListener: () => getRequestListener
});
module.exports = __toCommonJS(listener_exports);
var import_node_http22 = require("http2");

// src/request.ts
var import_node_http2 = require("http2");
var import_node_stream = require("stream");
var RequestError = class extends Error {
  constructor(message, options) {
    super(message, options);
    this.name = "RequestError";
  }
};
var toRequestError = (e) => {
  if (e instanceof RequestError) {
    return e;
  }
  return new RequestError(e.message, { cause: e });
};
var GlobalRequest = global.Request;
var Request = class extends GlobalRequest {
  constructor(input, options) {
    if (typeof input === "object" && getRequestCache in input) {
      input = input[getRequestCache]();
    }
    if (typeof options?.body?.getReader !== "undefined") {
      ;
      options.duplex ??= "half";
    }
    super(input, options);
  }
};
var newHeadersFromIncoming = (incoming) => {
  const headerRecord = [];
  const rawHeaders = incoming.rawHeaders;
  for (let i = 0; i < rawHeaders.length; i += 2) {
    const { [i]: key, [i + 1]: value } = rawHeaders;
    if (key.charCodeAt(0) !== /*:*/
    58) {
      headerRecord.push([key, value]);
    }
  }
  return new Headers(headerRecord);
};
var wrapBodyStream = Symbol("wrapBodyStream");
var newRequestFromIncoming = (method, url, headers, incoming, abortController) => {
  const init = {
    method,
    headers,
    signal: abortController.signal
  };
  if (method === "TRACE") {
    init.method = "GET";
    const req = new Request(url, init);
    Object.defineProperty(req, "method", {
      get() {
        return "TRACE";
      }
    });
    return req;
  }
  if (!(method === "GET" || method === "HEAD")) {
    if ("rawBody" in incoming && incoming.rawBody instanceof Buffer) {
      init.body = new ReadableStream({
        start(controller) {
          controller.enqueue(incoming.rawBody);
          controller.close();
        }
      });
    } else if (incoming[wrapBodyStream]) {
      let reader;
      init.body = new ReadableStream({
        async pull(controller) {
          try {
            reader ||= import_node_stream.Readable.toWeb(incoming).getReader();
            const { done, value } = await reader.read();
            if (done) {
              controller.close();
            } else {
              controller.enqueue(value);
            }
          } catch (error) {
            controller.error(error);
          }
        }
      });
    } else {
      init.body = import_node_stream.Readable.toWeb(incoming);
    }
  }
  return new Request(url, init);
};
var getRequestCache = Symbol("getRequestCache");
var requestCache = Symbol("requestCache");
var incomingKey = Symbol("incomingKey");
var urlKey = Symbol("urlKey");
var headersKey = Symbol("headersKey");
var abortControllerKey = Symbol("abortControllerKey");
var getAbortController = Symbol("getAbortController");
var requestPrototype = {
  get method() {
    return this[incomingKey].method || "GET";
  },
  get url() {
    return this[urlKey];
  },
  get headers() {
    return this[headersKey] ||= newHeadersFromIncoming(this[incomingKey]);
  },
  [getAbortController]() {
    this[getRequestCache]();
    return this[abortControllerKey];
  },
  [getRequestCache]() {
    this[abortControllerKey] ||= new AbortController();
    return this[requestCache] ||= newRequestFromIncoming(
      this.method,
      this[urlKey],
      this.headers,
      this[incomingKey],
      this[abortControllerKey]
    );
  }
};
[
  "body",
  "bodyUsed",
  "cache",
  "credentials",
  "destination",
  "integrity",
  "mode",
  "redirect",
  "referrer",
  "referrerPolicy",
  "signal",
  "keepalive"
].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    get() {
      return this[getRequestCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    value: function() {
      return this[getRequestCache]()[k]();
    }
  });
});
Object.setPrototypeOf(requestPrototype, Request.prototype);
var newRequest = (incoming, defaultHostname) => {
  const req = Object.create(requestPrototype);
  req[incomingKey] = incoming;
  const incomingUrl = incoming.url || "";
  if (incomingUrl[0] !== "/" && // short-circuit for performance. most requests are relative URL.
  (incomingUrl.startsWith("http://") || incomingUrl.startsWith("https://"))) {
    if (incoming instanceof import_node_http2.Http2ServerRequest) {
      throw new RequestError("Absolute URL for :path is not allowed in HTTP/2");
    }
    try {
      const url2 = new URL(incomingUrl);
      req[urlKey] = url2.href;
    } catch (e) {
      throw new RequestError("Invalid absolute URL", { cause: e });
    }
    return req;
  }
  const host = (incoming instanceof import_node_http2.Http2ServerRequest ? incoming.authority : incoming.headers.host) || defaultHostname;
  if (!host) {
    throw new RequestError("Missing host header");
  }
  let scheme;
  if (incoming instanceof import_node_http2.Http2ServerRequest) {
    scheme = incoming.scheme;
    if (!(scheme === "http" || scheme === "https")) {
      throw new RequestError("Unsupported scheme");
    }
  } else {
    scheme = incoming.socket && incoming.socket.encrypted ? "https" : "http";
  }
  const url = new URL(`${scheme}://${host}${incomingUrl}`);
  if (url.hostname.length !== host.length && url.hostname !== host.replace(/:\d+$/, "")) {
    throw new RequestError("Invalid host header");
  }
  req[urlKey] = url.href;
  return req;
};

// src/response.ts
var responseCache = Symbol("responseCache");
var getResponseCache = Symbol("getResponseCache");
var cacheKey = Symbol("cache");
var GlobalResponse = global.Response;
var Response2 = class _Response {
  #body;
  #init;
  [getResponseCache]() {
    delete this[cacheKey];
    return this[responseCache] ||= new GlobalResponse(this.#body, this.#init);
  }
  constructor(body, init) {
    let headers;
    this.#body = body;
    if (init instanceof _Response) {
      const cachedGlobalResponse = init[responseCache];
      if (cachedGlobalResponse) {
        this.#init = cachedGlobalResponse;
        this[getResponseCache]();
        return;
      } else {
        this.#init = init.#init;
        headers = new Headers(init.#init.headers);
      }
    } else {
      this.#init = init;
    }
    if (typeof body === "string" || typeof body?.getReader !== "undefined" || body instanceof Blob || body instanceof Uint8Array) {
      headers ||= init?.headers || { "content-type": "text/plain; charset=UTF-8" };
      this[cacheKey] = [init?.status || 200, body, headers];
    }
  }
  get headers() {
    const cache = this[cacheKey];
    if (cache) {
      if (!(cache[2] instanceof Headers)) {
        cache[2] = new Headers(cache[2]);
      }
      return cache[2];
    }
    return this[getResponseCache]().headers;
  }
  get status() {
    return this[cacheKey]?.[0] ?? this[getResponseCache]().status;
  }
  get ok() {
    const status = this.status;
    return status >= 200 && status < 300;
  }
};
["body", "bodyUsed", "redirected", "statusText", "trailers", "type", "url"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    get() {
      return this[getResponseCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    value: function() {
      return this[getResponseCache]()[k]();
    }
  });
});
Object.setPrototypeOf(Response2, GlobalResponse);
Object.setPrototypeOf(Response2.prototype, GlobalResponse.prototype);

// src/utils.ts
async function readWithoutBlocking(readPromise) {
  return Promise.race([readPromise, Promise.resolve().then(() => Promise.resolve(void 0))]);
}
function writeFromReadableStreamDefaultReader(reader, writable, currentReadPromise) {
  const cancel = (error) => {
    reader.cancel(error).catch(() => {
    });
  };
  writable.on("close", cancel);
  writable.on("error", cancel);
  (currentReadPromise ?? reader.read()).then(flow, handleStreamError);
  return reader.closed.finally(() => {
    writable.off("close", cancel);
    writable.off("error", cancel);
  });
  function handleStreamError(error) {
    if (error) {
      writable.destroy(error);
    }
  }
  function onDrain() {
    reader.read().then(flow, handleStreamError);
  }
  function flow({ done, value }) {
    try {
      if (done) {
        writable.end();
      } else if (!writable.write(value)) {
        writable.once("drain", onDrain);
      } else {
        return reader.read().then(flow, handleStreamError);
      }
    } catch (e) {
      handleStreamError(e);
    }
  }
}
function writeFromReadableStream(stream, writable) {
  if (stream.locked) {
    throw new TypeError("ReadableStream is locked.");
  } else if (writable.destroyed) {
    return;
  }
  return writeFromReadableStreamDefaultReader(stream.getReader(), writable);
}
var buildOutgoingHttpHeaders = (headers) => {
  const res = {};
  if (!(headers instanceof Headers)) {
    headers = new Headers(headers ?? void 0);
  }
  const cookies = [];
  for (const [k, v] of headers) {
    if (k === "set-cookie") {
      cookies.push(v);
    } else {
      res[k] = v;
    }
  }
  if (cookies.length > 0) {
    res["set-cookie"] = cookies;
  }
  res["content-type"] ??= "text/plain; charset=UTF-8";
  return res;
};

// src/utils/response/constants.ts
var X_ALREADY_SENT = "x-hono-already-sent";

// src/globals.ts
var import_node_crypto = __toESM(require("crypto"));
if (typeof global.crypto === "undefined") {
  global.crypto = import_node_crypto.default;
}

// src/listener.ts
var outgoingEnded = Symbol("outgoingEnded");
var handleRequestError = () => new Response(null, {
  status: 400
});
var handleFetchError = (e) => new Response(null, {
  status: e instanceof Error && (e.name === "TimeoutError" || e.constructor.name === "TimeoutError") ? 504 : 500
});
var handleResponseError = (e, outgoing) => {
  const err = e instanceof Error ? e : new Error("unknown error", { cause: e });
  if (err.code === "ERR_STREAM_PREMATURE_CLOSE") {
    console.info("The user aborted a request.");
  } else {
    console.error(e);
    if (!outgoing.headersSent) {
      outgoing.writeHead(500, { "Content-Type": "text/plain" });
    }
    outgoing.end(`Error: ${err.message}`);
    outgoing.destroy(err);
  }
};
var flushHeaders = (outgoing) => {
  if ("flushHeaders" in outgoing && outgoing.writable) {
    outgoing.flushHeaders();
  }
};
var responseViaCache = async (res, outgoing) => {
  let [status, body, header] = res[cacheKey];
  if (header instanceof Headers) {
    header = buildOutgoingHttpHeaders(header);
  }
  if (typeof body === "string") {
    header["Content-Length"] = Buffer.byteLength(body);
  } else if (body instanceof Uint8Array) {
    header["Content-Length"] = body.byteLength;
  } else if (body instanceof Blob) {
    header["Content-Length"] = body.size;
  }
  outgoing.writeHead(status, header);
  if (typeof body === "string" || body instanceof Uint8Array) {
    outgoing.end(body);
  } else if (body instanceof Blob) {
    outgoing.end(new Uint8Array(await body.arrayBuffer()));
  } else {
    flushHeaders(outgoing);
    await writeFromReadableStream(body, outgoing)?.catch(
      (e) => handleResponseError(e, outgoing)
    );
  }
  ;
  outgoing[outgoingEnded]?.();
};
var isPromise = (res) => typeof res.then === "function";
var responseViaResponseObject = async (res, outgoing, options = {}) => {
  if (isPromise(res)) {
    if (options.errorHandler) {
      try {
        res = await res;
      } catch (err) {
        const errRes = await options.errorHandler(err);
        if (!errRes) {
          return;
        }
        res = errRes;
      }
    } else {
      res = await res.catch(handleFetchError);
    }
  }
  if (cacheKey in res) {
    return responseViaCache(res, outgoing);
  }
  const resHeaderRecord = buildOutgoingHttpHeaders(res.headers);
  if (res.body) {
    const reader = res.body.getReader();
    const values = [];
    let done = false;
    let currentReadPromise = void 0;
    if (resHeaderRecord["transfer-encoding"] !== "chunked") {
      let maxReadCount = 2;
      for (let i = 0; i < maxReadCount; i++) {
        currentReadPromise ||= reader.read();
        const chunk = await readWithoutBlocking(currentReadPromise).catch((e) => {
          console.error(e);
          done = true;
        });
        if (!chunk) {
          if (i === 1) {
            await new Promise((resolve) => setTimeout(resolve));
            maxReadCount = 3;
            continue;
          }
          break;
        }
        currentReadPromise = void 0;
        if (chunk.value) {
          values.push(chunk.value);
        }
        if (chunk.done) {
          done = true;
          break;
        }
      }
      if (done && !("content-length" in resHeaderRecord)) {
        resHeaderRecord["content-length"] = values.reduce((acc, value) => acc + value.length, 0);
      }
    }
    outgoing.writeHead(res.status, resHeaderRecord);
    values.forEach((value) => {
      ;
      outgoing.write(value);
    });
    if (done) {
      outgoing.end();
    } else {
      if (values.length === 0) {
        flushHeaders(outgoing);
      }
      await writeFromReadableStreamDefaultReader(reader, outgoing, currentReadPromise);
    }
  } else if (resHeaderRecord[X_ALREADY_SENT]) {
  } else {
    outgoing.writeHead(res.status, resHeaderRecord);
    outgoing.end();
  }
  ;
  outgoing[outgoingEnded]?.();
};
var getRequestListener = (fetchCallback, options = {}) => {
  const autoCleanupIncoming = options.autoCleanupIncoming ?? true;
  if (options.overrideGlobalObjects !== false && global.Request !== Request) {
    Object.defineProperty(global, "Request", {
      value: Request
    });
    Object.defineProperty(global, "Response", {
      value: Response2
    });
  }
  return async (incoming, outgoing) => {
    let res, req;
    try {
      req = newRequest(incoming, options.hostname);
      let incomingEnded = !autoCleanupIncoming || incoming.method === "GET" || incoming.method === "HEAD";
      if (!incomingEnded) {
        ;
        incoming[wrapBodyStream] = true;
        incoming.on("end", () => {
          incomingEnded = true;
        });
        if (incoming instanceof import_node_http22.Http2ServerRequest) {
          ;
          outgoing[outgoingEnded] = () => {
            if (!incomingEnded) {
              setTimeout(() => {
                if (!incomingEnded) {
                  setTimeout(() => {
                    incoming.destroy();
                    outgoing.destroy();
                  });
                }
              });
            }
          };
        }
      }
      outgoing.on("close", () => {
        const abortController = req[abortControllerKey];
        if (abortController) {
          if (incoming.errored) {
            req[abortControllerKey].abort(incoming.errored.toString());
          } else if (!outgoing.writableFinished) {
            req[abortControllerKey].abort("Client connection prematurely closed.");
          }
        }
        if (!incomingEnded) {
          setTimeout(() => {
            if (!incomingEnded) {
              setTimeout(() => {
                incoming.destroy();
              });
            }
          });
        }
      });
      res = fetchCallback(req, { incoming, outgoing });
      if (cacheKey in res) {
        return responseViaCache(res, outgoing);
      }
    } catch (e) {
      if (!res) {
        if (options.errorHandler) {
          res = await options.errorHandler(req ? e : toRequestError(e));
          if (!res) {
            return;
          }
        } else if (!req) {
          res = handleRequestError();
        } else {
          res = handleFetchError(e);
        }
      } else {
        return handleResponseError(e, outgoing);
      }
    }
    try {
      return await responseViaResponseObject(res, outgoing, options);
    } catch (e) {
      return handleResponseError(e, outgoing);
    }
  };
};
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  getRequestListener
});
546
projects/arabica/sprint1/node_modules/@hono/node-server/dist/listener.mjs
generated
vendored
Normal file
@@ -0,0 +1,546 @@
// src/listener.ts
import { Http2ServerRequest as Http2ServerRequest2 } from "http2";

// src/request.ts
import { Http2ServerRequest } from "http2";
import { Readable } from "stream";
var RequestError = class extends Error {
  constructor(message, options) {
    super(message, options);
    this.name = "RequestError";
  }
};
var toRequestError = (e) => {
  if (e instanceof RequestError) {
    return e;
  }
  return new RequestError(e.message, { cause: e });
};
var GlobalRequest = global.Request;
var Request = class extends GlobalRequest {
  constructor(input, options) {
    if (typeof input === "object" && getRequestCache in input) {
      input = input[getRequestCache]();
    }
    if (typeof options?.body?.getReader !== "undefined") {
      ;
      options.duplex ??= "half";
    }
    super(input, options);
  }
};
var newHeadersFromIncoming = (incoming) => {
  const headerRecord = [];
  const rawHeaders = incoming.rawHeaders;
  for (let i = 0; i < rawHeaders.length; i += 2) {
    const { [i]: key, [i + 1]: value } = rawHeaders;
    if (key.charCodeAt(0) !== /*:*/
    58) {
      headerRecord.push([key, value]);
    }
  }
  return new Headers(headerRecord);
};
var wrapBodyStream = Symbol("wrapBodyStream");
var newRequestFromIncoming = (method, url, headers, incoming, abortController) => {
  const init = {
    method,
    headers,
    signal: abortController.signal
  };
  if (method === "TRACE") {
    init.method = "GET";
    const req = new Request(url, init);
    Object.defineProperty(req, "method", {
      get() {
        return "TRACE";
      }
    });
    return req;
  }
  if (!(method === "GET" || method === "HEAD")) {
    if ("rawBody" in incoming && incoming.rawBody instanceof Buffer) {
      init.body = new ReadableStream({
        start(controller) {
          controller.enqueue(incoming.rawBody);
          controller.close();
        }
      });
    } else if (incoming[wrapBodyStream]) {
      let reader;
      init.body = new ReadableStream({
        async pull(controller) {
          try {
            reader ||= Readable.toWeb(incoming).getReader();
            const { done, value } = await reader.read();
            if (done) {
              controller.close();
            } else {
              controller.enqueue(value);
            }
          } catch (error) {
            controller.error(error);
          }
        }
      });
    } else {
      init.body = Readable.toWeb(incoming);
    }
  }
  return new Request(url, init);
};
var getRequestCache = Symbol("getRequestCache");
var requestCache = Symbol("requestCache");
var incomingKey = Symbol("incomingKey");
var urlKey = Symbol("urlKey");
var headersKey = Symbol("headersKey");
var abortControllerKey = Symbol("abortControllerKey");
var getAbortController = Symbol("getAbortController");
var requestPrototype = {
  get method() {
    return this[incomingKey].method || "GET";
  },
  get url() {
    return this[urlKey];
  },
  get headers() {
    return this[headersKey] ||= newHeadersFromIncoming(this[incomingKey]);
  },
  [getAbortController]() {
    this[getRequestCache]();
    return this[abortControllerKey];
  },
  [getRequestCache]() {
    this[abortControllerKey] ||= new AbortController();
    return this[requestCache] ||= newRequestFromIncoming(
      this.method,
      this[urlKey],
      this.headers,
      this[incomingKey],
      this[abortControllerKey]
    );
  }
};
[
  "body",
  "bodyUsed",
  "cache",
  "credentials",
  "destination",
  "integrity",
  "mode",
  "redirect",
  "referrer",
  "referrerPolicy",
  "signal",
  "keepalive"
].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    get() {
      return this[getRequestCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    value: function() {
      return this[getRequestCache]()[k]();
    }
  });
});
Object.setPrototypeOf(requestPrototype, Request.prototype);
var newRequest = (incoming, defaultHostname) => {
  const req = Object.create(requestPrototype);
  req[incomingKey] = incoming;
  const incomingUrl = incoming.url || "";
  if (incomingUrl[0] !== "/" && // short-circuit for performance. most requests are relative URL.
  (incomingUrl.startsWith("http://") || incomingUrl.startsWith("https://"))) {
    if (incoming instanceof Http2ServerRequest) {
      throw new RequestError("Absolute URL for :path is not allowed in HTTP/2");
    }
    try {
      const url2 = new URL(incomingUrl);
      req[urlKey] = url2.href;
    } catch (e) {
      throw new RequestError("Invalid absolute URL", { cause: e });
    }
    return req;
  }
  const host = (incoming instanceof Http2ServerRequest ? incoming.authority : incoming.headers.host) || defaultHostname;
  if (!host) {
    throw new RequestError("Missing host header");
  }
  let scheme;
  if (incoming instanceof Http2ServerRequest) {
    scheme = incoming.scheme;
    if (!(scheme === "http" || scheme === "https")) {
      throw new RequestError("Unsupported scheme");
    }
  } else {
    scheme = incoming.socket && incoming.socket.encrypted ? "https" : "http";
  }
  const url = new URL(`${scheme}://${host}${incomingUrl}`);
  if (url.hostname.length !== host.length && url.hostname !== host.replace(/:\d+$/, "")) {
    throw new RequestError("Invalid host header");
  }
  req[urlKey] = url.href;
  return req;
};

// src/response.ts
var responseCache = Symbol("responseCache");
var getResponseCache = Symbol("getResponseCache");
var cacheKey = Symbol("cache");
var GlobalResponse = global.Response;
var Response2 = class _Response {
  #body;
  #init;
  [getResponseCache]() {
    delete this[cacheKey];
    return this[responseCache] ||= new GlobalResponse(this.#body, this.#init);
  }
  constructor(body, init) {
    let headers;
    this.#body = body;
    if (init instanceof _Response) {
      const cachedGlobalResponse = init[responseCache];
      if (cachedGlobalResponse) {
        this.#init = cachedGlobalResponse;
        this[getResponseCache]();
        return;
      } else {
        this.#init = init.#init;
        headers = new Headers(init.#init.headers);
      }
    } else {
      this.#init = init;
    }
    if (typeof body === "string" || typeof body?.getReader !== "undefined" || body instanceof Blob || body instanceof Uint8Array) {
      headers ||= init?.headers || { "content-type": "text/plain; charset=UTF-8" };
      this[cacheKey] = [init?.status || 200, body, headers];
    }
  }
  get headers() {
    const cache = this[cacheKey];
    if (cache) {
      if (!(cache[2] instanceof Headers)) {
        cache[2] = new Headers(cache[2]);
      }
      return cache[2];
    }
    return this[getResponseCache]().headers;
  }
  get status() {
    return this[cacheKey]?.[0] ?? this[getResponseCache]().status;
  }
  get ok() {
    const status = this.status;
    return status >= 200 && status < 300;
  }
};
["body", "bodyUsed", "redirected", "statusText", "trailers", "type", "url"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    get() {
      return this[getResponseCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    value: function() {
      return this[getResponseCache]()[k]();
    }
  });
});
Object.setPrototypeOf(Response2, GlobalResponse);
Object.setPrototypeOf(Response2.prototype, GlobalResponse.prototype);

// src/utils.ts
async function readWithoutBlocking(readPromise) {
  return Promise.race([readPromise, Promise.resolve().then(() => Promise.resolve(void 0))]);
}
function writeFromReadableStreamDefaultReader(reader, writable, currentReadPromise) {
  const cancel = (error) => {
    reader.cancel(error).catch(() => {
    });
  };
  writable.on("close", cancel);
  writable.on("error", cancel);
  (currentReadPromise ?? reader.read()).then(flow, handleStreamError);
  return reader.closed.finally(() => {
    writable.off("close", cancel);
    writable.off("error", cancel);
  });
  function handleStreamError(error) {
    if (error) {
      writable.destroy(error);
    }
  }
  function onDrain() {
    reader.read().then(flow, handleStreamError);
  }
  function flow({ done, value }) {
    try {
      if (done) {
        writable.end();
      } else if (!writable.write(value)) {
        writable.once("drain", onDrain);
      } else {
        return reader.read().then(flow, handleStreamError);
      }
    } catch (e) {
      handleStreamError(e);
    }
  }
}
function writeFromReadableStream(stream, writable) {
  if (stream.locked) {
    throw new TypeError("ReadableStream is locked.");
  } else if (writable.destroyed) {
    return;
  }
  return writeFromReadableStreamDefaultReader(stream.getReader(), writable);
}
var buildOutgoingHttpHeaders = (headers) => {
  const res = {};
  if (!(headers instanceof Headers)) {
    headers = new Headers(headers ?? void 0);
  }
  const cookies = [];
  for (const [k, v] of headers) {
    if (k === "set-cookie") {
      cookies.push(v);
    } else {
      res[k] = v;
    }
  }
  if (cookies.length > 0) {
    res["set-cookie"] = cookies;
  }
  res["content-type"] ??= "text/plain; charset=UTF-8";
  return res;
};

// src/utils/response/constants.ts
var X_ALREADY_SENT = "x-hono-already-sent";

// src/globals.ts
import crypto from "crypto";
if (typeof global.crypto === "undefined") {
  global.crypto = crypto;
}

// src/listener.ts
var outgoingEnded = Symbol("outgoingEnded");
var handleRequestError = () => new Response(null, {
  status: 400
});
|
||||
var handleFetchError = (e) => new Response(null, {
|
||||
status: e instanceof Error && (e.name === "TimeoutError" || e.constructor.name === "TimeoutError") ? 504 : 500
|
||||
});
|
||||
var handleResponseError = (e, outgoing) => {
|
||||
const err = e instanceof Error ? e : new Error("unknown error", { cause: e });
|
||||
if (err.code === "ERR_STREAM_PREMATURE_CLOSE") {
|
||||
console.info("The user aborted a request.");
|
||||
} else {
|
||||
console.error(e);
|
||||
if (!outgoing.headersSent) {
|
||||
outgoing.writeHead(500, { "Content-Type": "text/plain" });
|
||||
}
|
||||
outgoing.end(`Error: ${err.message}`);
|
||||
outgoing.destroy(err);
|
||||
}
|
||||
};
|
||||
var flushHeaders = (outgoing) => {
|
||||
if ("flushHeaders" in outgoing && outgoing.writable) {
|
||||
outgoing.flushHeaders();
|
||||
}
|
||||
};
|
||||
var responseViaCache = async (res, outgoing) => {
|
||||
let [status, body, header] = res[cacheKey];
|
||||
if (header instanceof Headers) {
|
||||
header = buildOutgoingHttpHeaders(header);
|
||||
}
|
||||
if (typeof body === "string") {
|
||||
header["Content-Length"] = Buffer.byteLength(body);
|
||||
} else if (body instanceof Uint8Array) {
|
||||
header["Content-Length"] = body.byteLength;
|
||||
} else if (body instanceof Blob) {
|
||||
header["Content-Length"] = body.size;
|
||||
}
|
||||
outgoing.writeHead(status, header);
|
||||
if (typeof body === "string" || body instanceof Uint8Array) {
|
||||
outgoing.end(body);
|
||||
} else if (body instanceof Blob) {
|
||||
outgoing.end(new Uint8Array(await body.arrayBuffer()));
|
||||
} else {
|
||||
flushHeaders(outgoing);
|
||||
await writeFromReadableStream(body, outgoing)?.catch(
|
||||
(e) => handleResponseError(e, outgoing)
|
||||
);
|
||||
}
|
||||
;
|
||||
outgoing[outgoingEnded]?.();
|
||||
};
|
||||
var isPromise = (res) => typeof res.then === "function";
|
||||
var responseViaResponseObject = async (res, outgoing, options = {}) => {
|
||||
if (isPromise(res)) {
|
||||
if (options.errorHandler) {
|
||||
try {
|
||||
res = await res;
|
||||
} catch (err) {
|
||||
const errRes = await options.errorHandler(err);
|
||||
if (!errRes) {
|
||||
return;
|
||||
}
|
||||
res = errRes;
|
||||
}
|
||||
} else {
|
||||
res = await res.catch(handleFetchError);
|
||||
}
|
||||
}
|
||||
if (cacheKey in res) {
|
||||
return responseViaCache(res, outgoing);
|
||||
}
|
||||
const resHeaderRecord = buildOutgoingHttpHeaders(res.headers);
|
||||
if (res.body) {
|
||||
const reader = res.body.getReader();
|
||||
const values = [];
|
||||
let done = false;
|
||||
let currentReadPromise = void 0;
|
||||
if (resHeaderRecord["transfer-encoding"] !== "chunked") {
|
||||
let maxReadCount = 2;
|
||||
for (let i = 0; i < maxReadCount; i++) {
|
||||
currentReadPromise ||= reader.read();
|
||||
const chunk = await readWithoutBlocking(currentReadPromise).catch((e) => {
|
||||
console.error(e);
|
||||
done = true;
|
||||
});
|
||||
if (!chunk) {
|
||||
if (i === 1) {
|
||||
await new Promise((resolve) => setTimeout(resolve));
|
||||
maxReadCount = 3;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
}
|
||||
currentReadPromise = void 0;
|
||||
if (chunk.value) {
|
||||
values.push(chunk.value);
|
||||
}
|
||||
if (chunk.done) {
|
||||
done = true;
|
||||
break;
|
||||
}
|
||||
}
|
||||
if (done && !("content-length" in resHeaderRecord)) {
|
||||
resHeaderRecord["content-length"] = values.reduce((acc, value) => acc + value.length, 0);
|
||||
}
|
||||
}
|
||||
outgoing.writeHead(res.status, resHeaderRecord);
|
||||
values.forEach((value) => {
|
||||
;
|
||||
outgoing.write(value);
|
||||
});
|
||||
if (done) {
|
||||
outgoing.end();
|
||||
} else {
|
||||
if (values.length === 0) {
|
||||
flushHeaders(outgoing);
|
||||
}
|
||||
await writeFromReadableStreamDefaultReader(reader, outgoing, currentReadPromise);
|
||||
}
|
||||
} else if (resHeaderRecord[X_ALREADY_SENT]) {
|
||||
} else {
|
||||
outgoing.writeHead(res.status, resHeaderRecord);
|
||||
outgoing.end();
|
||||
}
|
||||
;
|
||||
outgoing[outgoingEnded]?.();
|
||||
};
|
||||
var getRequestListener = (fetchCallback, options = {}) => {
|
||||
const autoCleanupIncoming = options.autoCleanupIncoming ?? true;
|
||||
if (options.overrideGlobalObjects !== false && global.Request !== Request) {
|
||||
Object.defineProperty(global, "Request", {
|
||||
value: Request
|
||||
});
|
||||
Object.defineProperty(global, "Response", {
|
||||
value: Response2
|
||||
});
|
||||
}
|
||||
return async (incoming, outgoing) => {
|
||||
let res, req;
|
||||
try {
|
||||
req = newRequest(incoming, options.hostname);
|
||||
let incomingEnded = !autoCleanupIncoming || incoming.method === "GET" || incoming.method === "HEAD";
|
||||
if (!incomingEnded) {
|
||||
;
|
||||
incoming[wrapBodyStream] = true;
|
||||
incoming.on("end", () => {
|
||||
incomingEnded = true;
|
||||
});
|
||||
if (incoming instanceof Http2ServerRequest2) {
|
||||
;
|
||||
outgoing[outgoingEnded] = () => {
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
incoming.destroy();
|
||||
outgoing.destroy();
|
||||
});
|
||||
}
|
||||
});
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
outgoing.on("close", () => {
|
||||
const abortController = req[abortControllerKey];
|
||||
if (abortController) {
|
||||
if (incoming.errored) {
|
||||
req[abortControllerKey].abort(incoming.errored.toString());
|
||||
} else if (!outgoing.writableFinished) {
|
||||
req[abortControllerKey].abort("Client connection prematurely closed.");
|
||||
}
|
||||
}
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
incoming.destroy();
|
||||
});
|
||||
}
|
||||
});
|
||||
}
|
||||
});
|
||||
res = fetchCallback(req, { incoming, outgoing });
|
||||
if (cacheKey in res) {
|
||||
return responseViaCache(res, outgoing);
|
||||
}
|
||||
} catch (e) {
|
||||
if (!res) {
|
||||
if (options.errorHandler) {
|
||||
res = await options.errorHandler(req ? e : toRequestError(e));
|
||||
if (!res) {
|
||||
return;
|
||||
}
|
||||
} else if (!req) {
|
||||
res = handleRequestError();
|
||||
} else {
|
||||
res = handleFetchError(e);
|
||||
}
|
||||
} else {
|
||||
return handleResponseError(e, outgoing);
|
||||
}
|
||||
}
|
||||
try {
|
||||
return await responseViaResponseObject(res, outgoing, options);
|
||||
} catch (e) {
|
||||
return handleResponseError(e, outgoing);
|
||||
}
|
||||
};
|
||||
};
|
||||
export {
|
||||
getRequestListener
|
||||
};
|
||||
25
projects/arabica/sprint1/node_modules/@hono/node-server/dist/request.d.mts
generated
vendored
Normal file
@@ -0,0 +1,25 @@
import { IncomingMessage } from 'node:http';
import { Http2ServerRequest } from 'node:http2';

declare class RequestError extends Error {
  constructor(message: string, options?: {
    cause?: unknown;
  });
}
declare const toRequestError: (e: unknown) => RequestError;
declare const GlobalRequest: {
  new (input: RequestInfo | URL, init?: RequestInit): globalThis.Request;
  prototype: globalThis.Request;
};
declare class Request extends GlobalRequest {
  constructor(input: string | Request, options?: RequestInit);
}
type IncomingMessageWithWrapBodyStream = IncomingMessage & {
  [wrapBodyStream]: boolean;
};
declare const wrapBodyStream: unique symbol;
declare const abortControllerKey: unique symbol;
declare const getAbortController: unique symbol;
declare const newRequest: (incoming: IncomingMessage | Http2ServerRequest, defaultHostname?: string) => any;

export { GlobalRequest, IncomingMessageWithWrapBodyStream, Request, RequestError, abortControllerKey, getAbortController, newRequest, toRequestError, wrapBodyStream };
25
projects/arabica/sprint1/node_modules/@hono/node-server/dist/request.d.ts
generated
vendored
Normal file
@@ -0,0 +1,25 @@
import { IncomingMessage } from 'node:http';
import { Http2ServerRequest } from 'node:http2';

declare class RequestError extends Error {
  constructor(message: string, options?: {
    cause?: unknown;
  });
}
declare const toRequestError: (e: unknown) => RequestError;
declare const GlobalRequest: {
  new (input: RequestInfo | URL, init?: RequestInit): globalThis.Request;
  prototype: globalThis.Request;
};
declare class Request extends GlobalRequest {
  constructor(input: string | Request, options?: RequestInit);
}
type IncomingMessageWithWrapBodyStream = IncomingMessage & {
  [wrapBodyStream]: boolean;
};
declare const wrapBodyStream: unique symbol;
declare const abortControllerKey: unique symbol;
declare const getAbortController: unique symbol;
declare const newRequest: (incoming: IncomingMessage | Http2ServerRequest, defaultHostname?: string) => any;

export { GlobalRequest, IncomingMessageWithWrapBodyStream, Request, RequestError, abortControllerKey, getAbortController, newRequest, toRequestError, wrapBodyStream };
227
projects/arabica/sprint1/node_modules/@hono/node-server/dist/request.js
generated
vendored
Normal file
@@ -0,0 +1,227 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);

// src/request.ts
var request_exports = {};
__export(request_exports, {
  GlobalRequest: () => GlobalRequest,
  Request: () => Request,
  RequestError: () => RequestError,
  abortControllerKey: () => abortControllerKey,
  getAbortController: () => getAbortController,
  newRequest: () => newRequest,
  toRequestError: () => toRequestError,
  wrapBodyStream: () => wrapBodyStream
});
module.exports = __toCommonJS(request_exports);
var import_node_http2 = require("http2");
var import_node_stream = require("stream");
var RequestError = class extends Error {
  constructor(message, options) {
    super(message, options);
    this.name = "RequestError";
  }
};
var toRequestError = (e) => {
  if (e instanceof RequestError) {
    return e;
  }
  return new RequestError(e.message, { cause: e });
};
var GlobalRequest = global.Request;
var Request = class extends GlobalRequest {
  constructor(input, options) {
    if (typeof input === "object" && getRequestCache in input) {
      input = input[getRequestCache]();
    }
    if (typeof options?.body?.getReader !== "undefined") {
      ;
      options.duplex ??= "half";
    }
    super(input, options);
  }
};
var newHeadersFromIncoming = (incoming) => {
  const headerRecord = [];
  const rawHeaders = incoming.rawHeaders;
  for (let i = 0; i < rawHeaders.length; i += 2) {
    const { [i]: key, [i + 1]: value } = rawHeaders;
    if (key.charCodeAt(0) !== /*:*/ 58) {
      headerRecord.push([key, value]);
    }
  }
  return new Headers(headerRecord);
};
var wrapBodyStream = Symbol("wrapBodyStream");
var newRequestFromIncoming = (method, url, headers, incoming, abortController) => {
  const init = {
    method,
    headers,
    signal: abortController.signal
  };
  if (method === "TRACE") {
    init.method = "GET";
    const req = new Request(url, init);
    Object.defineProperty(req, "method", {
      get() {
        return "TRACE";
      }
    });
    return req;
  }
  if (!(method === "GET" || method === "HEAD")) {
    if ("rawBody" in incoming && incoming.rawBody instanceof Buffer) {
      init.body = new ReadableStream({
        start(controller) {
          controller.enqueue(incoming.rawBody);
          controller.close();
        }
      });
    } else if (incoming[wrapBodyStream]) {
      let reader;
      init.body = new ReadableStream({
        async pull(controller) {
          try {
            reader ||= import_node_stream.Readable.toWeb(incoming).getReader();
            const { done, value } = await reader.read();
            if (done) {
              controller.close();
            } else {
              controller.enqueue(value);
            }
          } catch (error) {
            controller.error(error);
          }
        }
      });
    } else {
      init.body = import_node_stream.Readable.toWeb(incoming);
    }
  }
  return new Request(url, init);
};
var getRequestCache = Symbol("getRequestCache");
var requestCache = Symbol("requestCache");
var incomingKey = Symbol("incomingKey");
var urlKey = Symbol("urlKey");
var headersKey = Symbol("headersKey");
var abortControllerKey = Symbol("abortControllerKey");
var getAbortController = Symbol("getAbortController");
var requestPrototype = {
  get method() {
    return this[incomingKey].method || "GET";
  },
  get url() {
    return this[urlKey];
  },
  get headers() {
    return this[headersKey] ||= newHeadersFromIncoming(this[incomingKey]);
  },
  [getAbortController]() {
    this[getRequestCache]();
    return this[abortControllerKey];
  },
  [getRequestCache]() {
    this[abortControllerKey] ||= new AbortController();
    return this[requestCache] ||= newRequestFromIncoming(
      this.method,
      this[urlKey],
      this.headers,
      this[incomingKey],
      this[abortControllerKey]
    );
  }
};
[
  "body",
  "bodyUsed",
  "cache",
  "credentials",
  "destination",
  "integrity",
  "mode",
  "redirect",
  "referrer",
  "referrerPolicy",
  "signal",
  "keepalive"
].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    get() {
      return this[getRequestCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    value: function() {
      return this[getRequestCache]()[k]();
    }
  });
});
Object.setPrototypeOf(requestPrototype, Request.prototype);
var newRequest = (incoming, defaultHostname) => {
  const req = Object.create(requestPrototype);
  req[incomingKey] = incoming;
  const incomingUrl = incoming.url || "";
  if (incomingUrl[0] !== "/" && // short-circuit for performance. most requests are relative URL.
  (incomingUrl.startsWith("http://") || incomingUrl.startsWith("https://"))) {
    if (incoming instanceof import_node_http2.Http2ServerRequest) {
      throw new RequestError("Absolute URL for :path is not allowed in HTTP/2");
    }
    try {
      const url2 = new URL(incomingUrl);
      req[urlKey] = url2.href;
    } catch (e) {
      throw new RequestError("Invalid absolute URL", { cause: e });
    }
    return req;
  }
  const host = (incoming instanceof import_node_http2.Http2ServerRequest ? incoming.authority : incoming.headers.host) || defaultHostname;
  if (!host) {
    throw new RequestError("Missing host header");
  }
  let scheme;
  if (incoming instanceof import_node_http2.Http2ServerRequest) {
    scheme = incoming.scheme;
    if (!(scheme === "http" || scheme === "https")) {
      throw new RequestError("Unsupported scheme");
    }
  } else {
    scheme = incoming.socket && incoming.socket.encrypted ? "https" : "http";
  }
  const url = new URL(`${scheme}://${host}${incomingUrl}`);
  if (url.hostname.length !== host.length && url.hostname !== host.replace(/:\d+$/, "")) {
    throw new RequestError("Invalid host header");
  }
  req[urlKey] = url.href;
  return req;
};
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  GlobalRequest,
  Request,
  RequestError,
  abortControllerKey,
  getAbortController,
  newRequest,
  toRequestError,
  wrapBodyStream
});
195
projects/arabica/sprint1/node_modules/@hono/node-server/dist/request.mjs
generated
vendored
Normal file
@@ -0,0 +1,195 @@
// src/request.ts
import { Http2ServerRequest } from "http2";
import { Readable } from "stream";
var RequestError = class extends Error {
  constructor(message, options) {
    super(message, options);
    this.name = "RequestError";
  }
};
var toRequestError = (e) => {
  if (e instanceof RequestError) {
    return e;
  }
  return new RequestError(e.message, { cause: e });
};
var GlobalRequest = global.Request;
var Request = class extends GlobalRequest {
  constructor(input, options) {
    if (typeof input === "object" && getRequestCache in input) {
      input = input[getRequestCache]();
    }
    if (typeof options?.body?.getReader !== "undefined") {
      ;
      options.duplex ??= "half";
    }
    super(input, options);
  }
};
var newHeadersFromIncoming = (incoming) => {
  const headerRecord = [];
  const rawHeaders = incoming.rawHeaders;
  for (let i = 0; i < rawHeaders.length; i += 2) {
    const { [i]: key, [i + 1]: value } = rawHeaders;
    if (key.charCodeAt(0) !== /*:*/ 58) {
      headerRecord.push([key, value]);
    }
  }
  return new Headers(headerRecord);
};
var wrapBodyStream = Symbol("wrapBodyStream");
var newRequestFromIncoming = (method, url, headers, incoming, abortController) => {
  const init = {
    method,
    headers,
    signal: abortController.signal
  };
  if (method === "TRACE") {
    init.method = "GET";
    const req = new Request(url, init);
    Object.defineProperty(req, "method", {
      get() {
        return "TRACE";
      }
    });
    return req;
  }
  if (!(method === "GET" || method === "HEAD")) {
    if ("rawBody" in incoming && incoming.rawBody instanceof Buffer) {
      init.body = new ReadableStream({
        start(controller) {
          controller.enqueue(incoming.rawBody);
          controller.close();
        }
      });
    } else if (incoming[wrapBodyStream]) {
      let reader;
      init.body = new ReadableStream({
        async pull(controller) {
          try {
            reader ||= Readable.toWeb(incoming).getReader();
            const { done, value } = await reader.read();
            if (done) {
              controller.close();
            } else {
              controller.enqueue(value);
            }
          } catch (error) {
            controller.error(error);
          }
        }
      });
    } else {
      init.body = Readable.toWeb(incoming);
    }
  }
  return new Request(url, init);
};
var getRequestCache = Symbol("getRequestCache");
var requestCache = Symbol("requestCache");
var incomingKey = Symbol("incomingKey");
var urlKey = Symbol("urlKey");
var headersKey = Symbol("headersKey");
var abortControllerKey = Symbol("abortControllerKey");
var getAbortController = Symbol("getAbortController");
var requestPrototype = {
  get method() {
    return this[incomingKey].method || "GET";
  },
  get url() {
    return this[urlKey];
  },
  get headers() {
    return this[headersKey] ||= newHeadersFromIncoming(this[incomingKey]);
  },
  [getAbortController]() {
    this[getRequestCache]();
    return this[abortControllerKey];
  },
  [getRequestCache]() {
    this[abortControllerKey] ||= new AbortController();
    return this[requestCache] ||= newRequestFromIncoming(
      this.method,
      this[urlKey],
      this.headers,
      this[incomingKey],
      this[abortControllerKey]
    );
  }
};
[
  "body",
  "bodyUsed",
  "cache",
  "credentials",
  "destination",
  "integrity",
  "mode",
  "redirect",
  "referrer",
  "referrerPolicy",
  "signal",
  "keepalive"
].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    get() {
      return this[getRequestCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    value: function() {
      return this[getRequestCache]()[k]();
    }
  });
});
Object.setPrototypeOf(requestPrototype, Request.prototype);
var newRequest = (incoming, defaultHostname) => {
  const req = Object.create(requestPrototype);
  req[incomingKey] = incoming;
  const incomingUrl = incoming.url || "";
  if (incomingUrl[0] !== "/" && // short-circuit for performance. most requests are relative URL.
  (incomingUrl.startsWith("http://") || incomingUrl.startsWith("https://"))) {
    if (incoming instanceof Http2ServerRequest) {
      throw new RequestError("Absolute URL for :path is not allowed in HTTP/2");
    }
    try {
      const url2 = new URL(incomingUrl);
      req[urlKey] = url2.href;
    } catch (e) {
      throw new RequestError("Invalid absolute URL", { cause: e });
    }
    return req;
  }
  const host = (incoming instanceof Http2ServerRequest ? incoming.authority : incoming.headers.host) || defaultHostname;
  if (!host) {
    throw new RequestError("Missing host header");
  }
  let scheme;
  if (incoming instanceof Http2ServerRequest) {
    scheme = incoming.scheme;
    if (!(scheme === "http" || scheme === "https")) {
      throw new RequestError("Unsupported scheme");
    }
  } else {
    scheme = incoming.socket && incoming.socket.encrypted ? "https" : "http";
  }
  const url = new URL(`${scheme}://${host}${incomingUrl}`);
  if (url.hostname.length !== host.length && url.hostname !== host.replace(/:\d+$/, "")) {
    throw new RequestError("Invalid host header");
  }
  req[urlKey] = url.href;
  return req;
};
export {
  GlobalRequest,
  Request,
  RequestError,
  abortControllerKey,
  getAbortController,
  newRequest,
  toRequestError,
  wrapBodyStream
};
26
projects/arabica/sprint1/node_modules/@hono/node-server/dist/response.d.mts
generated
vendored
Normal file
@@ -0,0 +1,26 @@
import { OutgoingHttpHeaders } from 'node:http';

declare const getResponseCache: unique symbol;
declare const cacheKey: unique symbol;
type InternalCache = [
  number,
  string | ReadableStream,
  Record<string, string> | Headers | OutgoingHttpHeaders
];
declare const GlobalResponse: {
  new (body?: BodyInit | null, init?: ResponseInit): globalThis.Response;
  prototype: globalThis.Response;
  error(): globalThis.Response;
  json(data: any, init?: ResponseInit): globalThis.Response;
  redirect(url: string | URL, status?: number): globalThis.Response;
};
declare class Response {
  #private;
  [getResponseCache](): globalThis.Response;
  constructor(body?: BodyInit | null, init?: ResponseInit);
  get headers(): Headers;
  get status(): number;
  get ok(): boolean;
}

export { GlobalResponse, InternalCache, Response, cacheKey };
26
projects/arabica/sprint1/node_modules/@hono/node-server/dist/response.d.ts
generated
vendored
Normal file
@@ -0,0 +1,26 @@
import { OutgoingHttpHeaders } from 'node:http';

declare const getResponseCache: unique symbol;
declare const cacheKey: unique symbol;
type InternalCache = [
  number,
  string | ReadableStream,
  Record<string, string> | Headers | OutgoingHttpHeaders
];
declare const GlobalResponse: {
  new (body?: BodyInit | null, init?: ResponseInit): globalThis.Response;
  prototype: globalThis.Response;
  error(): globalThis.Response;
  json(data: any, init?: ResponseInit): globalThis.Response;
  redirect(url: string | URL, status?: number): globalThis.Response;
};
declare class Response {
  #private;
  [getResponseCache](): globalThis.Response;
  constructor(body?: BodyInit | null, init?: ResponseInit);
  get headers(): Headers;
  get status(): number;
  get ok(): boolean;
}

export { GlobalResponse, InternalCache, Response, cacheKey };
99
projects/arabica/sprint1/node_modules/@hono/node-server/dist/response.js
generated
vendored
Normal file
@@ -0,0 +1,99 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);

// src/response.ts
var response_exports = {};
__export(response_exports, {
  GlobalResponse: () => GlobalResponse,
  Response: () => Response,
  cacheKey: () => cacheKey
});
module.exports = __toCommonJS(response_exports);
var responseCache = Symbol("responseCache");
var getResponseCache = Symbol("getResponseCache");
var cacheKey = Symbol("cache");
var GlobalResponse = global.Response;
var Response = class _Response {
  #body;
  #init;
  [getResponseCache]() {
    delete this[cacheKey];
    return this[responseCache] ||= new GlobalResponse(this.#body, this.#init);
  }
  constructor(body, init) {
    let headers;
    this.#body = body;
    if (init instanceof _Response) {
      const cachedGlobalResponse = init[responseCache];
      if (cachedGlobalResponse) {
        this.#init = cachedGlobalResponse;
        this[getResponseCache]();
        return;
      } else {
        this.#init = init.#init;
        headers = new Headers(init.#init.headers);
      }
    } else {
      this.#init = init;
    }
    if (typeof body === "string" || typeof body?.getReader !== "undefined" || body instanceof Blob || body instanceof Uint8Array) {
      headers ||= init?.headers || { "content-type": "text/plain; charset=UTF-8" };
      this[cacheKey] = [init?.status || 200, body, headers];
    }
  }
  get headers() {
    const cache = this[cacheKey];
    if (cache) {
      if (!(cache[2] instanceof Headers)) {
        cache[2] = new Headers(cache[2]);
      }
      return cache[2];
    }
    return this[getResponseCache]().headers;
  }
  get status() {
    return this[cacheKey]?.[0] ?? this[getResponseCache]().status;
  }
  get ok() {
    const status = this.status;
    return status >= 200 && status < 300;
  }
};
["body", "bodyUsed", "redirected", "statusText", "trailers", "type", "url"].forEach((k) => {
  Object.defineProperty(Response.prototype, k, {
    get() {
      return this[getResponseCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(Response.prototype, k, {
    value: function() {
      return this[getResponseCache]()[k]();
    }
  });
});
Object.setPrototypeOf(Response, GlobalResponse);
Object.setPrototypeOf(Response.prototype, GlobalResponse.prototype);
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  GlobalResponse,
  Response,
  cacheKey
});
72
projects/arabica/sprint1/node_modules/@hono/node-server/dist/response.mjs
generated
vendored
Normal file
@@ -0,0 +1,72 @@
// src/response.ts
var responseCache = Symbol("responseCache");
var getResponseCache = Symbol("getResponseCache");
var cacheKey = Symbol("cache");
var GlobalResponse = global.Response;
var Response = class _Response {
  #body;
  #init;
  [getResponseCache]() {
    delete this[cacheKey];
    return this[responseCache] ||= new GlobalResponse(this.#body, this.#init);
  }
  constructor(body, init) {
    let headers;
    this.#body = body;
    if (init instanceof _Response) {
      const cachedGlobalResponse = init[responseCache];
      if (cachedGlobalResponse) {
        this.#init = cachedGlobalResponse;
        this[getResponseCache]();
        return;
      } else {
        this.#init = init.#init;
        headers = new Headers(init.#init.headers);
      }
    } else {
      this.#init = init;
    }
    if (typeof body === "string" || typeof body?.getReader !== "undefined" || body instanceof Blob || body instanceof Uint8Array) {
      headers ||= init?.headers || { "content-type": "text/plain; charset=UTF-8" };
      this[cacheKey] = [init?.status || 200, body, headers];
    }
  }
  get headers() {
    const cache = this[cacheKey];
    if (cache) {
      if (!(cache[2] instanceof Headers)) {
        cache[2] = new Headers(cache[2]);
      }
      return cache[2];
    }
    return this[getResponseCache]().headers;
  }
  get status() {
    return this[cacheKey]?.[0] ?? this[getResponseCache]().status;
  }
  get ok() {
    const status = this.status;
    return status >= 200 && status < 300;
  }
};
["body", "bodyUsed", "redirected", "statusText", "trailers", "type", "url"].forEach((k) => {
  Object.defineProperty(Response.prototype, k, {
    get() {
      return this[getResponseCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(Response.prototype, k, {
    value: function() {
      return this[getResponseCache]()[k]();
    }
  });
});
Object.setPrototypeOf(Response, GlobalResponse);
Object.setPrototypeOf(Response.prototype, GlobalResponse.prototype);
export {
  GlobalResponse,
  Response,
  cacheKey
};
17
projects/arabica/sprint1/node_modules/@hono/node-server/dist/serve-static.d.mts
generated
vendored
Normal file
@@ -0,0 +1,17 @@
import { Env, Context, MiddlewareHandler } from 'hono';

type ServeStaticOptions<E extends Env = Env> = {
  /**
   * Root path, relative to current working directory from which the app was started. Absolute paths are not supported.
   */
  root?: string;
  path?: string;
  index?: string;
  precompressed?: boolean;
  rewriteRequestPath?: (path: string, c: Context<E>) => string;
  onFound?: (path: string, c: Context<E>) => void | Promise<void>;
  onNotFound?: (path: string, c: Context<E>) => void | Promise<void>;
};
declare const serveStatic: <E extends Env = any>(options?: ServeStaticOptions<E>) => MiddlewareHandler<E>;

export { ServeStaticOptions, serveStatic };
17
projects/arabica/sprint1/node_modules/@hono/node-server/dist/serve-static.d.ts
generated
vendored
Normal file
@@ -0,0 +1,17 @@
import { Env, Context, MiddlewareHandler } from 'hono';

type ServeStaticOptions<E extends Env = Env> = {
  /**
   * Root path, relative to current working directory from which the app was started. Absolute paths are not supported.
   */
  root?: string;
  path?: string;
  index?: string;
  precompressed?: boolean;
  rewriteRequestPath?: (path: string, c: Context<E>) => string;
  onFound?: (path: string, c: Context<E>) => void | Promise<void>;
  onNotFound?: (path: string, c: Context<E>) => void | Promise<void>;
};
declare const serveStatic: <E extends Env = any>(options?: ServeStaticOptions<E>) => MiddlewareHandler<E>;

export { ServeStaticOptions, serveStatic };
163
projects/arabica/sprint1/node_modules/@hono/node-server/dist/serve-static.js
generated
vendored
Normal file
@@ -0,0 +1,163 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);

// src/serve-static.ts
var serve_static_exports = {};
__export(serve_static_exports, {
  serveStatic: () => serveStatic
});
module.exports = __toCommonJS(serve_static_exports);
var import_mime = require("hono/utils/mime");
var import_node_fs = require("fs");
var import_node_path = require("path");
var import_node_process = require("process");
var import_node_stream = require("stream");
var COMPRESSIBLE_CONTENT_TYPE_REGEX = /^\s*(?:text\/[^;\s]+|application\/(?:javascript|json|xml|xml-dtd|ecmascript|dart|postscript|rtf|tar|toml|vnd\.dart|vnd\.ms-fontobject|vnd\.ms-opentype|wasm|x-httpd-php|x-javascript|x-ns-proxy-autoconfig|x-sh|x-tar|x-virtualbox-hdd|x-virtualbox-ova|x-virtualbox-ovf|x-virtualbox-vbox|x-virtualbox-vdi|x-virtualbox-vhd|x-virtualbox-vmdk|x-www-form-urlencoded)|font\/(?:otf|ttf)|image\/(?:bmp|vnd\.adobe\.photoshop|vnd\.microsoft\.icon|vnd\.ms-dds|x-icon|x-ms-bmp)|message\/rfc822|model\/gltf-binary|x-shader\/x-fragment|x-shader\/x-vertex|[^;\s]+?\+(?:json|text|xml|yaml))(?:[;\s]|$)/i;
var ENCODINGS = {
  br: ".br",
  zstd: ".zst",
  gzip: ".gz"
};
var ENCODINGS_ORDERED_KEYS = Object.keys(ENCODINGS);
var pr54206Applied = () => {
  const [major, minor] = import_node_process.versions.node.split(".").map((component) => parseInt(component));
  return major >= 23 || major === 22 && minor >= 7 || major === 20 && minor >= 18;
};
var useReadableToWeb = pr54206Applied();
var createStreamBody = (stream) => {
  if (useReadableToWeb) {
    return import_node_stream.Readable.toWeb(stream);
  }
  const body = new ReadableStream({
    start(controller) {
      stream.on("data", (chunk) => {
        controller.enqueue(chunk);
      });
      stream.on("error", (err) => {
        controller.error(err);
      });
      stream.on("end", () => {
        controller.close();
      });
    },
    cancel() {
      stream.destroy();
    }
  });
  return body;
};
var getStats = (path) => {
  let stats;
  try {
    stats = (0, import_node_fs.statSync)(path);
  } catch {
  }
  return stats;
};
var serveStatic = (options = { root: "" }) => {
  const root = options.root || "";
  const optionPath = options.path;
  if (root !== "" && !(0, import_node_fs.existsSync)(root)) {
    console.error(`serveStatic: root path '${root}' is not found, are you sure it's correct?`);
  }
  return async (c, next) => {
    if (c.finalized) {
      return next();
    }
    let filename;
    if (optionPath) {
      filename = optionPath;
    } else {
      try {
        filename = decodeURIComponent(c.req.path);
        if (/(?:^|[\/\\])\.\.(?:$|[\/\\])/.test(filename)) {
          throw new Error();
        }
      } catch {
        await options.onNotFound?.(c.req.path, c);
        return next();
      }
    }
    let path = (0, import_node_path.join)(
      root,
      !optionPath && options.rewriteRequestPath ? options.rewriteRequestPath(filename, c) : filename
    );
    let stats = getStats(path);
    if (stats && stats.isDirectory()) {
      const indexFile = options.index ?? "index.html";
      path = (0, import_node_path.join)(path, indexFile);
      stats = getStats(path);
    }
    if (!stats) {
      await options.onNotFound?.(path, c);
      return next();
    }
    const mimeType = (0, import_mime.getMimeType)(path);
    c.header("Content-Type", mimeType || "application/octet-stream");
    if (options.precompressed && (!mimeType || COMPRESSIBLE_CONTENT_TYPE_REGEX.test(mimeType))) {
      const acceptEncodingSet = new Set(
        c.req.header("Accept-Encoding")?.split(",").map((encoding) => encoding.trim())
      );
      for (const encoding of ENCODINGS_ORDERED_KEYS) {
        if (!acceptEncodingSet.has(encoding)) {
          continue;
        }
        const precompressedStats = getStats(path + ENCODINGS[encoding]);
        if (precompressedStats) {
          c.header("Content-Encoding", encoding);
          c.header("Vary", "Accept-Encoding", { append: true });
          stats = precompressedStats;
          path = path + ENCODINGS[encoding];
          break;
        }
      }
    }
    let result;
    const size = stats.size;
    const range = c.req.header("range") || "";
    if (c.req.method == "HEAD" || c.req.method == "OPTIONS") {
      c.header("Content-Length", size.toString());
      c.status(200);
      result = c.body(null);
    } else if (!range) {
      c.header("Content-Length", size.toString());
      result = c.body(createStreamBody((0, import_node_fs.createReadStream)(path)), 200);
    } else {
      c.header("Accept-Ranges", "bytes");
      c.header("Date", stats.birthtime.toUTCString());
      const parts = range.replace(/bytes=/, "").split("-", 2);
      const start = parseInt(parts[0], 10) || 0;
      let end = parseInt(parts[1], 10) || size - 1;
      if (size < end - start + 1) {
        end = size - 1;
      }
      const chunksize = end - start + 1;
      const stream = (0, import_node_fs.createReadStream)(path, { start, end });
      c.header("Content-Length", chunksize.toString());
      c.header("Content-Range", `bytes ${start}-${end}/${stats.size}`);
      result = c.body(createStreamBody(stream), 206);
    }
    await options.onFound?.(path, c);
    return result;
  };
};
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  serveStatic
});
138
projects/arabica/sprint1/node_modules/@hono/node-server/dist/serve-static.mjs
generated
vendored
Normal file
@@ -0,0 +1,138 @@
// src/serve-static.ts
import { getMimeType } from "hono/utils/mime";
import { createReadStream, statSync, existsSync } from "fs";
import { join } from "path";
import { versions } from "process";
import { Readable } from "stream";
var COMPRESSIBLE_CONTENT_TYPE_REGEX = /^\s*(?:text\/[^;\s]+|application\/(?:javascript|json|xml|xml-dtd|ecmascript|dart|postscript|rtf|tar|toml|vnd\.dart|vnd\.ms-fontobject|vnd\.ms-opentype|wasm|x-httpd-php|x-javascript|x-ns-proxy-autoconfig|x-sh|x-tar|x-virtualbox-hdd|x-virtualbox-ova|x-virtualbox-ovf|x-virtualbox-vbox|x-virtualbox-vdi|x-virtualbox-vhd|x-virtualbox-vmdk|x-www-form-urlencoded)|font\/(?:otf|ttf)|image\/(?:bmp|vnd\.adobe\.photoshop|vnd\.microsoft\.icon|vnd\.ms-dds|x-icon|x-ms-bmp)|message\/rfc822|model\/gltf-binary|x-shader\/x-fragment|x-shader\/x-vertex|[^;\s]+?\+(?:json|text|xml|yaml))(?:[;\s]|$)/i;
var ENCODINGS = {
  br: ".br",
  zstd: ".zst",
  gzip: ".gz"
};
var ENCODINGS_ORDERED_KEYS = Object.keys(ENCODINGS);
var pr54206Applied = () => {
  const [major, minor] = versions.node.split(".").map((component) => parseInt(component));
  return major >= 23 || major === 22 && minor >= 7 || major === 20 && minor >= 18;
};
var useReadableToWeb = pr54206Applied();
var createStreamBody = (stream) => {
  if (useReadableToWeb) {
    return Readable.toWeb(stream);
  }
  const body = new ReadableStream({
    start(controller) {
      stream.on("data", (chunk) => {
        controller.enqueue(chunk);
      });
      stream.on("error", (err) => {
        controller.error(err);
      });
      stream.on("end", () => {
        controller.close();
      });
    },
    cancel() {
      stream.destroy();
    }
  });
  return body;
};
var getStats = (path) => {
  let stats;
  try {
    stats = statSync(path);
  } catch {
  }
  return stats;
};
var serveStatic = (options = { root: "" }) => {
  const root = options.root || "";
  const optionPath = options.path;
  if (root !== "" && !existsSync(root)) {
    console.error(`serveStatic: root path '${root}' is not found, are you sure it's correct?`);
  }
  return async (c, next) => {
    if (c.finalized) {
      return next();
    }
    let filename;
    if (optionPath) {
      filename = optionPath;
    } else {
      try {
        filename = decodeURIComponent(c.req.path);
        if (/(?:^|[\/\\])\.\.(?:$|[\/\\])/.test(filename)) {
          throw new Error();
        }
      } catch {
        await options.onNotFound?.(c.req.path, c);
        return next();
      }
    }
    let path = join(
      root,
      !optionPath && options.rewriteRequestPath ? options.rewriteRequestPath(filename, c) : filename
    );
    let stats = getStats(path);
    if (stats && stats.isDirectory()) {
      const indexFile = options.index ?? "index.html";
      path = join(path, indexFile);
      stats = getStats(path);
    }
    if (!stats) {
      await options.onNotFound?.(path, c);
      return next();
    }
    const mimeType = getMimeType(path);
    c.header("Content-Type", mimeType || "application/octet-stream");
    if (options.precompressed && (!mimeType || COMPRESSIBLE_CONTENT_TYPE_REGEX.test(mimeType))) {
      const acceptEncodingSet = new Set(
        c.req.header("Accept-Encoding")?.split(",").map((encoding) => encoding.trim())
      );
      for (const encoding of ENCODINGS_ORDERED_KEYS) {
        if (!acceptEncodingSet.has(encoding)) {
          continue;
        }
        const precompressedStats = getStats(path + ENCODINGS[encoding]);
        if (precompressedStats) {
          c.header("Content-Encoding", encoding);
          c.header("Vary", "Accept-Encoding", { append: true });
          stats = precompressedStats;
          path = path + ENCODINGS[encoding];
          break;
        }
      }
    }
    let result;
    const size = stats.size;
    const range = c.req.header("range") || "";
    if (c.req.method == "HEAD" || c.req.method == "OPTIONS") {
      c.header("Content-Length", size.toString());
      c.status(200);
      result = c.body(null);
    } else if (!range) {
      c.header("Content-Length", size.toString());
      result = c.body(createStreamBody(createReadStream(path)), 200);
    } else {
      c.header("Accept-Ranges", "bytes");
      c.header("Date", stats.birthtime.toUTCString());
      const parts = range.replace(/bytes=/, "").split("-", 2);
      const start = parseInt(parts[0], 10) || 0;
      let end = parseInt(parts[1], 10) || size - 1;
      if (size < end - start + 1) {
        end = size - 1;
      }
      const chunksize = end - start + 1;
      const stream = createReadStream(path, { start, end });
      c.header("Content-Length", chunksize.toString());
      c.header("Content-Range", `bytes ${start}-${end}/${stats.size}`);
      result = c.body(createStreamBody(stream), 206);
    }
    await options.onFound?.(path, c);
    return result;
  };
};
export {
  serveStatic
};
10
projects/arabica/sprint1/node_modules/@hono/node-server/dist/server.d.mts
generated
vendored
Normal file
@@ -0,0 +1,10 @@
import { AddressInfo } from 'node:net';
import { Options, ServerType } from './types.mjs';
import 'node:http';
import 'node:http2';
import 'node:https';

declare const createAdaptorServer: (options: Options) => ServerType;
declare const serve: (options: Options, listeningListener?: (info: AddressInfo) => void) => ServerType;

export { createAdaptorServer, serve };
10
projects/arabica/sprint1/node_modules/@hono/node-server/dist/server.d.ts
generated
vendored
Normal file
@@ -0,0 +1,10 @@
import { AddressInfo } from 'node:net';
import { Options, ServerType } from './types.js';
import 'node:http';
import 'node:http2';
import 'node:https';

declare const createAdaptorServer: (options: Options) => ServerType;
declare const serve: (options: Options, listeningListener?: (info: AddressInfo) => void) => ServerType;

export { createAdaptorServer, serve };
607
projects/arabica/sprint1/node_modules/@hono/node-server/dist/server.js
generated
vendored
Normal file
@@ -0,0 +1,607 @@
"use strict";
var __create = Object.create;
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __getProtoOf = Object.getPrototypeOf;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__getProtoOf(mod)) : {}, __copyProps(
  // If the importer is in node compatibility mode or this is not an ESM
  // file that has been converted to a CommonJS file using a Babel-
  // compatible transform (i.e. "__esModule" has not been set), then set
  // "default" to the CommonJS "module.exports" for node compatibility.
  isNodeMode || !mod || !mod.__esModule ? __defProp(target, "default", { value: mod, enumerable: true }) : target,
  mod
));
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);

// src/server.ts
var server_exports = {};
__export(server_exports, {
  createAdaptorServer: () => createAdaptorServer,
  serve: () => serve
});
module.exports = __toCommonJS(server_exports);
var import_node_http = require("http");

// src/listener.ts
var import_node_http22 = require("http2");

// src/request.ts
var import_node_http2 = require("http2");
var import_node_stream = require("stream");
var RequestError = class extends Error {
  constructor(message, options) {
    super(message, options);
    this.name = "RequestError";
  }
};
var toRequestError = (e) => {
  if (e instanceof RequestError) {
    return e;
  }
  return new RequestError(e.message, { cause: e });
};
var GlobalRequest = global.Request;
var Request = class extends GlobalRequest {
  constructor(input, options) {
    if (typeof input === "object" && getRequestCache in input) {
      input = input[getRequestCache]();
    }
    if (typeof options?.body?.getReader !== "undefined") {
      ;
      options.duplex ??= "half";
    }
    super(input, options);
  }
};
var newHeadersFromIncoming = (incoming) => {
  const headerRecord = [];
  const rawHeaders = incoming.rawHeaders;
  for (let i = 0; i < rawHeaders.length; i += 2) {
    const { [i]: key, [i + 1]: value } = rawHeaders;
    if (key.charCodeAt(0) !== /*:*/ 58) {
      headerRecord.push([key, value]);
    }
  }
  return new Headers(headerRecord);
};
var wrapBodyStream = Symbol("wrapBodyStream");
var newRequestFromIncoming = (method, url, headers, incoming, abortController) => {
  const init = {
    method,
    headers,
    signal: abortController.signal
  };
  if (method === "TRACE") {
    init.method = "GET";
    const req = new Request(url, init);
    Object.defineProperty(req, "method", {
      get() {
        return "TRACE";
      }
    });
    return req;
  }
  if (!(method === "GET" || method === "HEAD")) {
    if ("rawBody" in incoming && incoming.rawBody instanceof Buffer) {
      init.body = new ReadableStream({
        start(controller) {
          controller.enqueue(incoming.rawBody);
          controller.close();
        }
      });
    } else if (incoming[wrapBodyStream]) {
      let reader;
      init.body = new ReadableStream({
        async pull(controller) {
          try {
            reader ||= import_node_stream.Readable.toWeb(incoming).getReader();
            const { done, value } = await reader.read();
            if (done) {
              controller.close();
            } else {
              controller.enqueue(value);
            }
          } catch (error) {
            controller.error(error);
          }
        }
      });
    } else {
      init.body = import_node_stream.Readable.toWeb(incoming);
    }
  }
  return new Request(url, init);
};
var getRequestCache = Symbol("getRequestCache");
var requestCache = Symbol("requestCache");
var incomingKey = Symbol("incomingKey");
var urlKey = Symbol("urlKey");
var headersKey = Symbol("headersKey");
var abortControllerKey = Symbol("abortControllerKey");
var getAbortController = Symbol("getAbortController");
var requestPrototype = {
  get method() {
    return this[incomingKey].method || "GET";
  },
  get url() {
    return this[urlKey];
  },
  get headers() {
    return this[headersKey] ||= newHeadersFromIncoming(this[incomingKey]);
  },
  [getAbortController]() {
    this[getRequestCache]();
    return this[abortControllerKey];
  },
  [getRequestCache]() {
    this[abortControllerKey] ||= new AbortController();
    return this[requestCache] ||= newRequestFromIncoming(
      this.method,
      this[urlKey],
      this.headers,
      this[incomingKey],
      this[abortControllerKey]
    );
  }
};
[
  "body",
  "bodyUsed",
  "cache",
  "credentials",
  "destination",
  "integrity",
  "mode",
  "redirect",
  "referrer",
  "referrerPolicy",
  "signal",
  "keepalive"
].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    get() {
      return this[getRequestCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    value: function() {
      return this[getRequestCache]()[k]();
    }
  });
});
Object.setPrototypeOf(requestPrototype, Request.prototype);
var newRequest = (incoming, defaultHostname) => {
  const req = Object.create(requestPrototype);
  req[incomingKey] = incoming;
  const incomingUrl = incoming.url || "";
  if (incomingUrl[0] !== "/" && // short-circuit for performance. most requests are relative URL.
  (incomingUrl.startsWith("http://") || incomingUrl.startsWith("https://"))) {
    if (incoming instanceof import_node_http2.Http2ServerRequest) {
      throw new RequestError("Absolute URL for :path is not allowed in HTTP/2");
    }
    try {
      const url2 = new URL(incomingUrl);
      req[urlKey] = url2.href;
    } catch (e) {
      throw new RequestError("Invalid absolute URL", { cause: e });
    }
    return req;
  }
  const host = (incoming instanceof import_node_http2.Http2ServerRequest ? incoming.authority : incoming.headers.host) || defaultHostname;
  if (!host) {
    throw new RequestError("Missing host header");
  }
  let scheme;
  if (incoming instanceof import_node_http2.Http2ServerRequest) {
    scheme = incoming.scheme;
    if (!(scheme === "http" || scheme === "https")) {
      throw new RequestError("Unsupported scheme");
    }
  } else {
    scheme = incoming.socket && incoming.socket.encrypted ? "https" : "http";
  }
  const url = new URL(`${scheme}://${host}${incomingUrl}`);
  if (url.hostname.length !== host.length && url.hostname !== host.replace(/:\d+$/, "")) {
    throw new RequestError("Invalid host header");
  }
  req[urlKey] = url.href;
  return req;
};

// src/response.ts
var responseCache = Symbol("responseCache");
var getResponseCache = Symbol("getResponseCache");
var cacheKey = Symbol("cache");
var GlobalResponse = global.Response;
var Response2 = class _Response {
  #body;
  #init;
  [getResponseCache]() {
    delete this[cacheKey];
    return this[responseCache] ||= new GlobalResponse(this.#body, this.#init);
  }
  constructor(body, init) {
    let headers;
    this.#body = body;
    if (init instanceof _Response) {
      const cachedGlobalResponse = init[responseCache];
      if (cachedGlobalResponse) {
        this.#init = cachedGlobalResponse;
        this[getResponseCache]();
        return;
      } else {
        this.#init = init.#init;
        headers = new Headers(init.#init.headers);
      }
    } else {
      this.#init = init;
    }
    if (typeof body === "string" || typeof body?.getReader !== "undefined" || body instanceof Blob || body instanceof Uint8Array) {
      headers ||= init?.headers || { "content-type": "text/plain; charset=UTF-8" };
      this[cacheKey] = [init?.status || 200, body, headers];
    }
  }
  get headers() {
    const cache = this[cacheKey];
    if (cache) {
      if (!(cache[2] instanceof Headers)) {
        cache[2] = new Headers(cache[2]);
      }
      return cache[2];
    }
    return this[getResponseCache]().headers;
  }
  get status() {
    return this[cacheKey]?.[0] ?? this[getResponseCache]().status;
  }
  get ok() {
    const status = this.status;
    return status >= 200 && status < 300;
  }
};
["body", "bodyUsed", "redirected", "statusText", "trailers", "type", "url"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    get() {
      return this[getResponseCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    value: function() {
      return this[getResponseCache]()[k]();
    }
  });
});
Object.setPrototypeOf(Response2, GlobalResponse);
Object.setPrototypeOf(Response2.prototype, GlobalResponse.prototype);

// src/utils.ts
async function readWithoutBlocking(readPromise) {
  return Promise.race([readPromise, Promise.resolve().then(() => Promise.resolve(void 0))]);
}
function writeFromReadableStreamDefaultReader(reader, writable, currentReadPromise) {
  const cancel = (error) => {
    reader.cancel(error).catch(() => {
    });
  };
  writable.on("close", cancel);
  writable.on("error", cancel);
  (currentReadPromise ?? reader.read()).then(flow, handleStreamError);
  return reader.closed.finally(() => {
    writable.off("close", cancel);
    writable.off("error", cancel);
  });
  function handleStreamError(error) {
    if (error) {
      writable.destroy(error);
    }
  }
  function onDrain() {
    reader.read().then(flow, handleStreamError);
  }
  function flow({ done, value }) {
    try {
      if (done) {
        writable.end();
      } else if (!writable.write(value)) {
        writable.once("drain", onDrain);
      } else {
        return reader.read().then(flow, handleStreamError);
      }
    } catch (e) {
      handleStreamError(e);
    }
  }
}
function writeFromReadableStream(stream, writable) {
  if (stream.locked) {
    throw new TypeError("ReadableStream is locked.");
  } else if (writable.destroyed) {
    return;
  }
  return writeFromReadableStreamDefaultReader(stream.getReader(), writable);
}
var buildOutgoingHttpHeaders = (headers) => {
  const res = {};
  if (!(headers instanceof Headers)) {
    headers = new Headers(headers ?? void 0);
  }
  const cookies = [];
  for (const [k, v] of headers) {
    if (k === "set-cookie") {
      cookies.push(v);
    } else {
      res[k] = v;
    }
  }
  if (cookies.length > 0) {
    res["set-cookie"] = cookies;
  }
  res["content-type"] ??= "text/plain; charset=UTF-8";
  return res;
};

// src/utils/response/constants.ts
var X_ALREADY_SENT = "x-hono-already-sent";

// src/globals.ts
var import_node_crypto = __toESM(require("crypto"));
if (typeof global.crypto === "undefined") {
  global.crypto = import_node_crypto.default;
}

// src/listener.ts
var outgoingEnded = Symbol("outgoingEnded");
var handleRequestError = () => new Response(null, {
  status: 400
});
var handleFetchError = (e) => new Response(null, {
  status: e instanceof Error && (e.name === "TimeoutError" || e.constructor.name === "TimeoutError") ? 504 : 500
});
var handleResponseError = (e, outgoing) => {
  const err = e instanceof Error ? e : new Error("unknown error", { cause: e });
  if (err.code === "ERR_STREAM_PREMATURE_CLOSE") {
    console.info("The user aborted a request.");
  } else {
    console.error(e);
    if (!outgoing.headersSent) {
      outgoing.writeHead(500, { "Content-Type": "text/plain" });
    }
    outgoing.end(`Error: ${err.message}`);
    outgoing.destroy(err);
  }
};
var flushHeaders = (outgoing) => {
  if ("flushHeaders" in outgoing && outgoing.writable) {
    outgoing.flushHeaders();
  }
};
var responseViaCache = async (res, outgoing) => {
  let [status, body, header] = res[cacheKey];
  if (header instanceof Headers) {
    header = buildOutgoingHttpHeaders(header);
  }
  if (typeof body === "string") {
    header["Content-Length"] = Buffer.byteLength(body);
  } else if (body instanceof Uint8Array) {
    header["Content-Length"] = body.byteLength;
  } else if (body instanceof Blob) {
    header["Content-Length"] = body.size;
  }
  outgoing.writeHead(status, header);
  if (typeof body === "string" || body instanceof Uint8Array) {
    outgoing.end(body);
  } else if (body instanceof Blob) {
    outgoing.end(new Uint8Array(await body.arrayBuffer()));
  } else {
    flushHeaders(outgoing);
    await writeFromReadableStream(body, outgoing)?.catch(
      (e) => handleResponseError(e, outgoing)
    );
  }
  ;
  outgoing[outgoingEnded]?.();
};
var isPromise = (res) => typeof res.then === "function";
var responseViaResponseObject = async (res, outgoing, options = {}) => {
  if (isPromise(res)) {
    if (options.errorHandler) {
      try {
        res = await res;
      } catch (err) {
        const errRes = await options.errorHandler(err);
        if (!errRes) {
          return;
        }
        res = errRes;
      }
    } else {
      res = await res.catch(handleFetchError);
    }
  }
  if (cacheKey in res) {
    return responseViaCache(res, outgoing);
  }
  const resHeaderRecord = buildOutgoingHttpHeaders(res.headers);
  if (res.body) {
    const reader = res.body.getReader();
    const values = [];
    let done = false;
    let currentReadPromise = void 0;
    if (resHeaderRecord["transfer-encoding"] !== "chunked") {
|
||||
let maxReadCount = 2;
|
||||
for (let i = 0; i < maxReadCount; i++) {
|
||||
currentReadPromise ||= reader.read();
|
||||
const chunk = await readWithoutBlocking(currentReadPromise).catch((e) => {
|
||||
console.error(e);
|
||||
done = true;
|
||||
});
|
||||
if (!chunk) {
|
||||
if (i === 1) {
|
||||
await new Promise((resolve) => setTimeout(resolve));
|
||||
maxReadCount = 3;
|
||||
continue;
|
||||
}
|
||||
break;
|
||||
}
|
||||
currentReadPromise = void 0;
|
||||
if (chunk.value) {
|
||||
values.push(chunk.value);
|
||||
}
|
||||
if (chunk.done) {
|
||||
done = true;
|
||||
break;
|
||||
}
|
||||
}
|
||||
if (done && !("content-length" in resHeaderRecord)) {
|
||||
resHeaderRecord["content-length"] = values.reduce((acc, value) => acc + value.length, 0);
|
||||
}
|
||||
}
|
||||
outgoing.writeHead(res.status, resHeaderRecord);
|
||||
values.forEach((value) => {
|
||||
;
|
||||
outgoing.write(value);
|
||||
});
|
||||
if (done) {
|
||||
outgoing.end();
|
||||
} else {
|
||||
if (values.length === 0) {
|
||||
flushHeaders(outgoing);
|
||||
}
|
||||
await writeFromReadableStreamDefaultReader(reader, outgoing, currentReadPromise);
|
||||
}
|
||||
} else if (resHeaderRecord[X_ALREADY_SENT]) {
|
||||
} else {
|
||||
outgoing.writeHead(res.status, resHeaderRecord);
|
||||
outgoing.end();
|
||||
}
|
||||
;
|
||||
outgoing[outgoingEnded]?.();
|
||||
};
|
||||
var getRequestListener = (fetchCallback, options = {}) => {
|
||||
const autoCleanupIncoming = options.autoCleanupIncoming ?? true;
|
||||
if (options.overrideGlobalObjects !== false && global.Request !== Request) {
|
||||
Object.defineProperty(global, "Request", {
|
||||
value: Request
|
||||
});
|
||||
Object.defineProperty(global, "Response", {
|
||||
value: Response2
|
||||
});
|
||||
}
|
||||
return async (incoming, outgoing) => {
|
||||
let res, req;
|
||||
try {
|
||||
req = newRequest(incoming, options.hostname);
|
||||
let incomingEnded = !autoCleanupIncoming || incoming.method === "GET" || incoming.method === "HEAD";
|
||||
if (!incomingEnded) {
|
||||
;
|
||||
incoming[wrapBodyStream] = true;
|
||||
incoming.on("end", () => {
|
||||
incomingEnded = true;
|
||||
});
|
||||
if (incoming instanceof import_node_http22.Http2ServerRequest) {
|
||||
;
|
||||
outgoing[outgoingEnded] = () => {
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
incoming.destroy();
|
||||
outgoing.destroy();
|
||||
});
|
||||
}
|
||||
});
|
||||
}
|
||||
};
|
||||
}
|
||||
}
|
||||
outgoing.on("close", () => {
|
||||
const abortController = req[abortControllerKey];
|
||||
if (abortController) {
|
||||
if (incoming.errored) {
|
||||
req[abortControllerKey].abort(incoming.errored.toString());
|
||||
} else if (!outgoing.writableFinished) {
|
||||
req[abortControllerKey].abort("Client connection prematurely closed.");
|
||||
}
|
||||
}
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
if (!incomingEnded) {
|
||||
setTimeout(() => {
|
||||
incoming.destroy();
|
||||
});
|
||||
}
|
||||
});
|
||||
}
|
||||
});
|
||||
res = fetchCallback(req, { incoming, outgoing });
|
||||
if (cacheKey in res) {
|
||||
return responseViaCache(res, outgoing);
|
||||
}
|
||||
} catch (e) {
|
||||
if (!res) {
|
||||
if (options.errorHandler) {
|
||||
res = await options.errorHandler(req ? e : toRequestError(e));
|
||||
if (!res) {
|
||||
return;
|
||||
}
|
||||
} else if (!req) {
|
||||
res = handleRequestError();
|
||||
} else {
|
||||
res = handleFetchError(e);
|
||||
}
|
||||
} else {
|
||||
return handleResponseError(e, outgoing);
|
||||
}
|
||||
}
|
||||
try {
|
||||
return await responseViaResponseObject(res, outgoing, options);
|
||||
} catch (e) {
|
||||
return handleResponseError(e, outgoing);
|
||||
}
|
||||
};
|
||||
};
|
||||
|
||||
// src/server.ts
|
||||
var createAdaptorServer = (options) => {
|
||||
const fetchCallback = options.fetch;
|
||||
const requestListener = getRequestListener(fetchCallback, {
|
||||
hostname: options.hostname,
|
||||
overrideGlobalObjects: options.overrideGlobalObjects,
|
||||
autoCleanupIncoming: options.autoCleanupIncoming
|
||||
});
|
||||
const createServer = options.createServer || import_node_http.createServer;
|
||||
const server = createServer(options.serverOptions || {}, requestListener);
|
||||
return server;
|
||||
};
|
||||
var serve = (options, listeningListener) => {
|
||||
const server = createAdaptorServer(options);
|
||||
server.listen(options?.port ?? 3e3, options.hostname, () => {
|
||||
const serverInfo = server.address();
|
||||
listeningListener && listeningListener(serverInfo);
|
||||
});
|
||||
return server;
|
||||
};
|
||||
// Annotate the CommonJS export names for ESM import in node:
|
||||
0 && (module.exports = {
|
||||
createAdaptorServer,
|
||||
serve
|
||||
});
|
||||
571
projects/arabica/sprint1/node_modules/@hono/node-server/dist/server.mjs
generated
vendored
Normal file
@@ -0,0 +1,571 @@
// src/server.ts
import { createServer as createServerHTTP } from "http";

// src/listener.ts
import { Http2ServerRequest as Http2ServerRequest2 } from "http2";

// src/request.ts
import { Http2ServerRequest } from "http2";
import { Readable } from "stream";
var RequestError = class extends Error {
  constructor(message, options) {
    super(message, options);
    this.name = "RequestError";
  }
};
var toRequestError = (e) => {
  if (e instanceof RequestError) {
    return e;
  }
  return new RequestError(e.message, { cause: e });
};
var GlobalRequest = global.Request;
var Request = class extends GlobalRequest {
  constructor(input, options) {
    if (typeof input === "object" && getRequestCache in input) {
      input = input[getRequestCache]();
    }
    if (typeof options?.body?.getReader !== "undefined") {
      ;
      options.duplex ??= "half";
    }
    super(input, options);
  }
};
var newHeadersFromIncoming = (incoming) => {
  const headerRecord = [];
  const rawHeaders = incoming.rawHeaders;
  for (let i = 0; i < rawHeaders.length; i += 2) {
    const { [i]: key, [i + 1]: value } = rawHeaders;
    if (key.charCodeAt(0) !== /*:*/
    58) {
      headerRecord.push([key, value]);
    }
  }
  return new Headers(headerRecord);
};
var wrapBodyStream = Symbol("wrapBodyStream");
var newRequestFromIncoming = (method, url, headers, incoming, abortController) => {
  const init = {
    method,
    headers,
    signal: abortController.signal
  };
  if (method === "TRACE") {
    init.method = "GET";
    const req = new Request(url, init);
    Object.defineProperty(req, "method", {
      get() {
        return "TRACE";
      }
    });
    return req;
  }
  if (!(method === "GET" || method === "HEAD")) {
    if ("rawBody" in incoming && incoming.rawBody instanceof Buffer) {
      init.body = new ReadableStream({
        start(controller) {
          controller.enqueue(incoming.rawBody);
          controller.close();
        }
      });
    } else if (incoming[wrapBodyStream]) {
      let reader;
      init.body = new ReadableStream({
        async pull(controller) {
          try {
            reader ||= Readable.toWeb(incoming).getReader();
            const { done, value } = await reader.read();
            if (done) {
              controller.close();
            } else {
              controller.enqueue(value);
            }
          } catch (error) {
            controller.error(error);
          }
        }
      });
    } else {
      init.body = Readable.toWeb(incoming);
    }
  }
  return new Request(url, init);
};
var getRequestCache = Symbol("getRequestCache");
var requestCache = Symbol("requestCache");
var incomingKey = Symbol("incomingKey");
var urlKey = Symbol("urlKey");
var headersKey = Symbol("headersKey");
var abortControllerKey = Symbol("abortControllerKey");
var getAbortController = Symbol("getAbortController");
var requestPrototype = {
  get method() {
    return this[incomingKey].method || "GET";
  },
  get url() {
    return this[urlKey];
  },
  get headers() {
    return this[headersKey] ||= newHeadersFromIncoming(this[incomingKey]);
  },
  [getAbortController]() {
    this[getRequestCache]();
    return this[abortControllerKey];
  },
  [getRequestCache]() {
    this[abortControllerKey] ||= new AbortController();
    return this[requestCache] ||= newRequestFromIncoming(
      this.method,
      this[urlKey],
      this.headers,
      this[incomingKey],
      this[abortControllerKey]
    );
  }
};
[
  "body",
  "bodyUsed",
  "cache",
  "credentials",
  "destination",
  "integrity",
  "mode",
  "redirect",
  "referrer",
  "referrerPolicy",
  "signal",
  "keepalive"
].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    get() {
      return this[getRequestCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    value: function() {
      return this[getRequestCache]()[k]();
    }
  });
});
Object.setPrototypeOf(requestPrototype, Request.prototype);
var newRequest = (incoming, defaultHostname) => {
  const req = Object.create(requestPrototype);
  req[incomingKey] = incoming;
  const incomingUrl = incoming.url || "";
  if (incomingUrl[0] !== "/" && // short-circuit for performance. most requests are relative URL.
  (incomingUrl.startsWith("http://") || incomingUrl.startsWith("https://"))) {
    if (incoming instanceof Http2ServerRequest) {
      throw new RequestError("Absolute URL for :path is not allowed in HTTP/2");
    }
    try {
      const url2 = new URL(incomingUrl);
      req[urlKey] = url2.href;
    } catch (e) {
      throw new RequestError("Invalid absolute URL", { cause: e });
    }
    return req;
  }
  const host = (incoming instanceof Http2ServerRequest ? incoming.authority : incoming.headers.host) || defaultHostname;
  if (!host) {
    throw new RequestError("Missing host header");
  }
  let scheme;
  if (incoming instanceof Http2ServerRequest) {
    scheme = incoming.scheme;
    if (!(scheme === "http" || scheme === "https")) {
      throw new RequestError("Unsupported scheme");
    }
  } else {
    scheme = incoming.socket && incoming.socket.encrypted ? "https" : "http";
  }
  const url = new URL(`${scheme}://${host}${incomingUrl}`);
  if (url.hostname.length !== host.length && url.hostname !== host.replace(/:\d+$/, "")) {
    throw new RequestError("Invalid host header");
  }
  req[urlKey] = url.href;
  return req;
};

// src/response.ts
var responseCache = Symbol("responseCache");
var getResponseCache = Symbol("getResponseCache");
var cacheKey = Symbol("cache");
var GlobalResponse = global.Response;
var Response2 = class _Response {
  #body;
  #init;
  [getResponseCache]() {
    delete this[cacheKey];
    return this[responseCache] ||= new GlobalResponse(this.#body, this.#init);
  }
  constructor(body, init) {
    let headers;
    this.#body = body;
    if (init instanceof _Response) {
      const cachedGlobalResponse = init[responseCache];
      if (cachedGlobalResponse) {
        this.#init = cachedGlobalResponse;
        this[getResponseCache]();
        return;
      } else {
        this.#init = init.#init;
        headers = new Headers(init.#init.headers);
      }
    } else {
      this.#init = init;
    }
    if (typeof body === "string" || typeof body?.getReader !== "undefined" || body instanceof Blob || body instanceof Uint8Array) {
      headers ||= init?.headers || { "content-type": "text/plain; charset=UTF-8" };
      this[cacheKey] = [init?.status || 200, body, headers];
    }
  }
  get headers() {
    const cache = this[cacheKey];
    if (cache) {
      if (!(cache[2] instanceof Headers)) {
        cache[2] = new Headers(cache[2]);
      }
      return cache[2];
    }
    return this[getResponseCache]().headers;
  }
  get status() {
    return this[cacheKey]?.[0] ?? this[getResponseCache]().status;
  }
  get ok() {
    const status = this.status;
    return status >= 200 && status < 300;
  }
};
["body", "bodyUsed", "redirected", "statusText", "trailers", "type", "url"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    get() {
      return this[getResponseCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    value: function() {
      return this[getResponseCache]()[k]();
    }
  });
});
Object.setPrototypeOf(Response2, GlobalResponse);
Object.setPrototypeOf(Response2.prototype, GlobalResponse.prototype);

// src/utils.ts
async function readWithoutBlocking(readPromise) {
  return Promise.race([readPromise, Promise.resolve().then(() => Promise.resolve(void 0))]);
}
function writeFromReadableStreamDefaultReader(reader, writable, currentReadPromise) {
  const cancel = (error) => {
    reader.cancel(error).catch(() => {
    });
  };
  writable.on("close", cancel);
  writable.on("error", cancel);
  (currentReadPromise ?? reader.read()).then(flow, handleStreamError);
  return reader.closed.finally(() => {
    writable.off("close", cancel);
    writable.off("error", cancel);
  });
  function handleStreamError(error) {
    if (error) {
      writable.destroy(error);
    }
  }
  function onDrain() {
    reader.read().then(flow, handleStreamError);
  }
  function flow({ done, value }) {
    try {
      if (done) {
        writable.end();
      } else if (!writable.write(value)) {
        writable.once("drain", onDrain);
      } else {
        return reader.read().then(flow, handleStreamError);
      }
    } catch (e) {
      handleStreamError(e);
    }
  }
}
function writeFromReadableStream(stream, writable) {
  if (stream.locked) {
    throw new TypeError("ReadableStream is locked.");
  } else if (writable.destroyed) {
    return;
  }
  return writeFromReadableStreamDefaultReader(stream.getReader(), writable);
}
var buildOutgoingHttpHeaders = (headers) => {
  const res = {};
  if (!(headers instanceof Headers)) {
    headers = new Headers(headers ?? void 0);
  }
  const cookies = [];
  for (const [k, v] of headers) {
    if (k === "set-cookie") {
      cookies.push(v);
    } else {
      res[k] = v;
    }
  }
  if (cookies.length > 0) {
    res["set-cookie"] = cookies;
  }
  res["content-type"] ??= "text/plain; charset=UTF-8";
  return res;
};

// src/utils/response/constants.ts
var X_ALREADY_SENT = "x-hono-already-sent";

// src/globals.ts
import crypto from "crypto";
if (typeof global.crypto === "undefined") {
  global.crypto = crypto;
}

// src/listener.ts
var outgoingEnded = Symbol("outgoingEnded");
var handleRequestError = () => new Response(null, {
  status: 400
});
var handleFetchError = (e) => new Response(null, {
  status: e instanceof Error && (e.name === "TimeoutError" || e.constructor.name === "TimeoutError") ? 504 : 500
});
var handleResponseError = (e, outgoing) => {
  const err = e instanceof Error ? e : new Error("unknown error", { cause: e });
  if (err.code === "ERR_STREAM_PREMATURE_CLOSE") {
    console.info("The user aborted a request.");
  } else {
    console.error(e);
    if (!outgoing.headersSent) {
      outgoing.writeHead(500, { "Content-Type": "text/plain" });
    }
    outgoing.end(`Error: ${err.message}`);
    outgoing.destroy(err);
  }
};
var flushHeaders = (outgoing) => {
  if ("flushHeaders" in outgoing && outgoing.writable) {
    outgoing.flushHeaders();
  }
};
var responseViaCache = async (res, outgoing) => {
  let [status, body, header] = res[cacheKey];
  if (header instanceof Headers) {
    header = buildOutgoingHttpHeaders(header);
  }
  if (typeof body === "string") {
    header["Content-Length"] = Buffer.byteLength(body);
  } else if (body instanceof Uint8Array) {
    header["Content-Length"] = body.byteLength;
  } else if (body instanceof Blob) {
    header["Content-Length"] = body.size;
  }
  outgoing.writeHead(status, header);
  if (typeof body === "string" || body instanceof Uint8Array) {
    outgoing.end(body);
  } else if (body instanceof Blob) {
    outgoing.end(new Uint8Array(await body.arrayBuffer()));
  } else {
    flushHeaders(outgoing);
    await writeFromReadableStream(body, outgoing)?.catch(
      (e) => handleResponseError(e, outgoing)
    );
  }
  ;
  outgoing[outgoingEnded]?.();
};
var isPromise = (res) => typeof res.then === "function";
var responseViaResponseObject = async (res, outgoing, options = {}) => {
  if (isPromise(res)) {
    if (options.errorHandler) {
      try {
        res = await res;
      } catch (err) {
        const errRes = await options.errorHandler(err);
        if (!errRes) {
          return;
        }
        res = errRes;
      }
    } else {
      res = await res.catch(handleFetchError);
    }
  }
  if (cacheKey in res) {
    return responseViaCache(res, outgoing);
  }
  const resHeaderRecord = buildOutgoingHttpHeaders(res.headers);
  if (res.body) {
    const reader = res.body.getReader();
    const values = [];
    let done = false;
    let currentReadPromise = void 0;
    if (resHeaderRecord["transfer-encoding"] !== "chunked") {
      let maxReadCount = 2;
      for (let i = 0; i < maxReadCount; i++) {
        currentReadPromise ||= reader.read();
        const chunk = await readWithoutBlocking(currentReadPromise).catch((e) => {
          console.error(e);
          done = true;
        });
        if (!chunk) {
          if (i === 1) {
            await new Promise((resolve) => setTimeout(resolve));
            maxReadCount = 3;
            continue;
          }
          break;
        }
        currentReadPromise = void 0;
        if (chunk.value) {
          values.push(chunk.value);
        }
        if (chunk.done) {
          done = true;
          break;
        }
      }
      if (done && !("content-length" in resHeaderRecord)) {
        resHeaderRecord["content-length"] = values.reduce((acc, value) => acc + value.length, 0);
      }
    }
    outgoing.writeHead(res.status, resHeaderRecord);
    values.forEach((value) => {
      ;
      outgoing.write(value);
    });
    if (done) {
      outgoing.end();
    } else {
      if (values.length === 0) {
        flushHeaders(outgoing);
      }
      await writeFromReadableStreamDefaultReader(reader, outgoing, currentReadPromise);
    }
  } else if (resHeaderRecord[X_ALREADY_SENT]) {
  } else {
    outgoing.writeHead(res.status, resHeaderRecord);
    outgoing.end();
  }
  ;
  outgoing[outgoingEnded]?.();
};
var getRequestListener = (fetchCallback, options = {}) => {
  const autoCleanupIncoming = options.autoCleanupIncoming ?? true;
  if (options.overrideGlobalObjects !== false && global.Request !== Request) {
    Object.defineProperty(global, "Request", {
      value: Request
    });
    Object.defineProperty(global, "Response", {
      value: Response2
    });
  }
  return async (incoming, outgoing) => {
    let res, req;
    try {
      req = newRequest(incoming, options.hostname);
      let incomingEnded = !autoCleanupIncoming || incoming.method === "GET" || incoming.method === "HEAD";
      if (!incomingEnded) {
        ;
        incoming[wrapBodyStream] = true;
        incoming.on("end", () => {
          incomingEnded = true;
        });
        if (incoming instanceof Http2ServerRequest2) {
          ;
          outgoing[outgoingEnded] = () => {
            if (!incomingEnded) {
              setTimeout(() => {
                if (!incomingEnded) {
                  setTimeout(() => {
                    incoming.destroy();
                    outgoing.destroy();
                  });
                }
              });
            }
          };
        }
      }
      outgoing.on("close", () => {
        const abortController = req[abortControllerKey];
        if (abortController) {
          if (incoming.errored) {
            req[abortControllerKey].abort(incoming.errored.toString());
          } else if (!outgoing.writableFinished) {
            req[abortControllerKey].abort("Client connection prematurely closed.");
          }
        }
        if (!incomingEnded) {
          setTimeout(() => {
            if (!incomingEnded) {
              setTimeout(() => {
                incoming.destroy();
              });
            }
          });
        }
      });
      res = fetchCallback(req, { incoming, outgoing });
      if (cacheKey in res) {
        return responseViaCache(res, outgoing);
      }
    } catch (e) {
      if (!res) {
        if (options.errorHandler) {
          res = await options.errorHandler(req ? e : toRequestError(e));
          if (!res) {
            return;
          }
        } else if (!req) {
          res = handleRequestError();
        } else {
          res = handleFetchError(e);
        }
      } else {
        return handleResponseError(e, outgoing);
      }
    }
    try {
      return await responseViaResponseObject(res, outgoing, options);
    } catch (e) {
      return handleResponseError(e, outgoing);
    }
  };
};

// src/server.ts
var createAdaptorServer = (options) => {
  const fetchCallback = options.fetch;
  const requestListener = getRequestListener(fetchCallback, {
    hostname: options.hostname,
    overrideGlobalObjects: options.overrideGlobalObjects,
    autoCleanupIncoming: options.autoCleanupIncoming
  });
  const createServer = options.createServer || createServerHTTP;
  const server = createServer(options.serverOptions || {}, requestListener);
  return server;
};
var serve = (options, listeningListener) => {
  const server = createAdaptorServer(options);
  server.listen(options?.port ?? 3e3, options.hostname, () => {
    const serverInfo = server.address();
    listeningListener && listeningListener(serverInfo);
  });
  return server;
};
export {
  createAdaptorServer,
  serve
};
44
projects/arabica/sprint1/node_modules/@hono/node-server/dist/types.d.mts
generated
vendored
Normal file
@@ -0,0 +1,44 @@
import { IncomingMessage, ServerResponse, Server, ServerOptions as ServerOptions$1, createServer } from 'node:http';
import { Http2ServerRequest, Http2ServerResponse, Http2Server, Http2SecureServer, ServerOptions as ServerOptions$3, createServer as createServer$2, SecureServerOptions, createSecureServer } from 'node:http2';
import { ServerOptions as ServerOptions$2, createServer as createServer$1 } from 'node:https';

type HttpBindings = {
  incoming: IncomingMessage;
  outgoing: ServerResponse;
};
type Http2Bindings = {
  incoming: Http2ServerRequest;
  outgoing: Http2ServerResponse;
};
type FetchCallback = (request: Request, env: HttpBindings | Http2Bindings) => Promise<unknown> | unknown;
type NextHandlerOption = {
  fetch: FetchCallback;
};
type ServerType = Server | Http2Server | Http2SecureServer;
type createHttpOptions = {
  serverOptions?: ServerOptions$1;
  createServer?: typeof createServer;
};
type createHttpsOptions = {
  serverOptions?: ServerOptions$2;
  createServer?: typeof createServer$1;
};
type createHttp2Options = {
  serverOptions?: ServerOptions$3;
  createServer?: typeof createServer$2;
};
type createSecureHttp2Options = {
  serverOptions?: SecureServerOptions;
  createServer?: typeof createSecureServer;
};
type ServerOptions = createHttpOptions | createHttpsOptions | createHttp2Options | createSecureHttp2Options;
type Options = {
  fetch: FetchCallback;
  overrideGlobalObjects?: boolean;
  autoCleanupIncoming?: boolean;
  port?: number;
  hostname?: string;
} & ServerOptions;
type CustomErrorHandler = (err: unknown) => void | Response | Promise<void | Response>;

export { CustomErrorHandler, FetchCallback, Http2Bindings, HttpBindings, NextHandlerOption, Options, ServerOptions, ServerType };
44
projects/arabica/sprint1/node_modules/@hono/node-server/dist/types.d.ts
generated
vendored
Normal file
@@ -0,0 +1,44 @@
import { IncomingMessage, ServerResponse, Server, ServerOptions as ServerOptions$1, createServer } from 'node:http';
import { Http2ServerRequest, Http2ServerResponse, Http2Server, Http2SecureServer, ServerOptions as ServerOptions$3, createServer as createServer$2, SecureServerOptions, createSecureServer } from 'node:http2';
import { ServerOptions as ServerOptions$2, createServer as createServer$1 } from 'node:https';

type HttpBindings = {
  incoming: IncomingMessage;
  outgoing: ServerResponse;
};
type Http2Bindings = {
  incoming: Http2ServerRequest;
  outgoing: Http2ServerResponse;
};
type FetchCallback = (request: Request, env: HttpBindings | Http2Bindings) => Promise<unknown> | unknown;
type NextHandlerOption = {
  fetch: FetchCallback;
};
type ServerType = Server | Http2Server | Http2SecureServer;
type createHttpOptions = {
  serverOptions?: ServerOptions$1;
  createServer?: typeof createServer;
};
type createHttpsOptions = {
  serverOptions?: ServerOptions$2;
  createServer?: typeof createServer$1;
};
type createHttp2Options = {
  serverOptions?: ServerOptions$3;
  createServer?: typeof createServer$2;
};
type createSecureHttp2Options = {
  serverOptions?: SecureServerOptions;
  createServer?: typeof createSecureServer;
};
type ServerOptions = createHttpOptions | createHttpsOptions | createHttp2Options | createSecureHttp2Options;
type Options = {
  fetch: FetchCallback;
  overrideGlobalObjects?: boolean;
  autoCleanupIncoming?: boolean;
  port?: number;
  hostname?: string;
} & ServerOptions;
type CustomErrorHandler = (err: unknown) => void | Response | Promise<void | Response>;

export { CustomErrorHandler, FetchCallback, Http2Bindings, HttpBindings, NextHandlerOption, Options, ServerOptions, ServerType };
18
projects/arabica/sprint1/node_modules/@hono/node-server/dist/types.js
generated
vendored
Normal file
@@ -0,0 +1,18 @@
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);

// src/types.ts
var types_exports = {};
module.exports = __toCommonJS(types_exports);
0
projects/arabica/sprint1/node_modules/@hono/node-server/dist/types.mjs
generated
vendored
Normal file
9
projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils.d.mts
generated
vendored
Normal file
@@ -0,0 +1,9 @@
import { OutgoingHttpHeaders } from 'node:http';
import { Writable } from 'node:stream';

declare function readWithoutBlocking(readPromise: Promise<ReadableStreamReadResult<Uint8Array>>): Promise<ReadableStreamReadResult<Uint8Array> | undefined>;
declare function writeFromReadableStreamDefaultReader(reader: ReadableStreamDefaultReader<Uint8Array>, writable: Writable, currentReadPromise?: Promise<ReadableStreamReadResult<Uint8Array>> | undefined): Promise<void>;
declare function writeFromReadableStream(stream: ReadableStream<Uint8Array>, writable: Writable): Promise<void> | undefined;
declare const buildOutgoingHttpHeaders: (headers: Headers | HeadersInit | null | undefined) => OutgoingHttpHeaders;

export { buildOutgoingHttpHeaders, readWithoutBlocking, writeFromReadableStream, writeFromReadableStreamDefaultReader };
9 projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils.d.ts (generated, vendored, new file)
@@ -0,0 +1,9 @@
```ts
import { OutgoingHttpHeaders } from 'node:http';
import { Writable } from 'node:stream';

declare function readWithoutBlocking(readPromise: Promise<ReadableStreamReadResult<Uint8Array>>): Promise<ReadableStreamReadResult<Uint8Array> | undefined>;
declare function writeFromReadableStreamDefaultReader(reader: ReadableStreamDefaultReader<Uint8Array>, writable: Writable, currentReadPromise?: Promise<ReadableStreamReadResult<Uint8Array>> | undefined): Promise<void>;
declare function writeFromReadableStream(stream: ReadableStream<Uint8Array>, writable: Writable): Promise<void> | undefined;
declare const buildOutgoingHttpHeaders: (headers: Headers | HeadersInit | null | undefined) => OutgoingHttpHeaders;

export { buildOutgoingHttpHeaders, readWithoutBlocking, writeFromReadableStream, writeFromReadableStreamDefaultReader };
```
99 projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils.js (generated, vendored, new file)
@@ -0,0 +1,99 @@
```js
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);

// src/utils.ts
var utils_exports = {};
__export(utils_exports, {
  buildOutgoingHttpHeaders: () => buildOutgoingHttpHeaders,
  readWithoutBlocking: () => readWithoutBlocking,
  writeFromReadableStream: () => writeFromReadableStream,
  writeFromReadableStreamDefaultReader: () => writeFromReadableStreamDefaultReader
});
module.exports = __toCommonJS(utils_exports);
async function readWithoutBlocking(readPromise) {
  return Promise.race([readPromise, Promise.resolve().then(() => Promise.resolve(void 0))]);
}
function writeFromReadableStreamDefaultReader(reader, writable, currentReadPromise) {
  const cancel = (error) => {
    reader.cancel(error).catch(() => {
    });
  };
  writable.on("close", cancel);
  writable.on("error", cancel);
  (currentReadPromise ?? reader.read()).then(flow, handleStreamError);
  return reader.closed.finally(() => {
    writable.off("close", cancel);
    writable.off("error", cancel);
  });
  function handleStreamError(error) {
    if (error) {
      writable.destroy(error);
    }
  }
  function onDrain() {
    reader.read().then(flow, handleStreamError);
  }
  function flow({ done, value }) {
    try {
      if (done) {
        writable.end();
      } else if (!writable.write(value)) {
        writable.once("drain", onDrain);
      } else {
        return reader.read().then(flow, handleStreamError);
      }
    } catch (e) {
      handleStreamError(e);
    }
  }
}
function writeFromReadableStream(stream, writable) {
  if (stream.locked) {
    throw new TypeError("ReadableStream is locked.");
  } else if (writable.destroyed) {
    return;
  }
  return writeFromReadableStreamDefaultReader(stream.getReader(), writable);
}
var buildOutgoingHttpHeaders = (headers) => {
  const res = {};
  if (!(headers instanceof Headers)) {
    headers = new Headers(headers ?? void 0);
  }
  const cookies = [];
  for (const [k, v] of headers) {
    if (k === "set-cookie") {
      cookies.push(v);
    } else {
      res[k] = v;
    }
  }
  if (cookies.length > 0) {
    res["set-cookie"] = cookies;
  }
  res["content-type"] ??= "text/plain; charset=UTF-8";
  return res;
};
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  buildOutgoingHttpHeaders,
  readWithoutBlocking,
  writeFromReadableStream,
  writeFromReadableStreamDefaultReader
});
```
71 projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils.mjs (generated, vendored, new file)
@@ -0,0 +1,71 @@
```js
// src/utils.ts
async function readWithoutBlocking(readPromise) {
  return Promise.race([readPromise, Promise.resolve().then(() => Promise.resolve(void 0))]);
}
function writeFromReadableStreamDefaultReader(reader, writable, currentReadPromise) {
  const cancel = (error) => {
    reader.cancel(error).catch(() => {
    });
  };
  writable.on("close", cancel);
  writable.on("error", cancel);
  (currentReadPromise ?? reader.read()).then(flow, handleStreamError);
  return reader.closed.finally(() => {
    writable.off("close", cancel);
    writable.off("error", cancel);
  });
  function handleStreamError(error) {
    if (error) {
      writable.destroy(error);
    }
  }
  function onDrain() {
    reader.read().then(flow, handleStreamError);
  }
  function flow({ done, value }) {
    try {
      if (done) {
        writable.end();
      } else if (!writable.write(value)) {
        writable.once("drain", onDrain);
      } else {
        return reader.read().then(flow, handleStreamError);
      }
    } catch (e) {
      handleStreamError(e);
    }
  }
}
function writeFromReadableStream(stream, writable) {
  if (stream.locked) {
    throw new TypeError("ReadableStream is locked.");
  } else if (writable.destroyed) {
    return;
  }
  return writeFromReadableStreamDefaultReader(stream.getReader(), writable);
}
var buildOutgoingHttpHeaders = (headers) => {
  const res = {};
  if (!(headers instanceof Headers)) {
    headers = new Headers(headers ?? void 0);
  }
  const cookies = [];
  for (const [k, v] of headers) {
    if (k === "set-cookie") {
      cookies.push(v);
    } else {
      res[k] = v;
    }
  }
  if (cookies.length > 0) {
    res["set-cookie"] = cookies;
  }
  res["content-type"] ??= "text/plain; charset=UTF-8";
  return res;
};
export {
  buildOutgoingHttpHeaders,
  readWithoutBlocking,
  writeFromReadableStream,
  writeFromReadableStreamDefaultReader
};
```
3 projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils/response.d.mts (generated, vendored, new file)
@@ -0,0 +1,3 @@
```ts
declare const RESPONSE_ALREADY_SENT: Response;

export { RESPONSE_ALREADY_SENT };
```
3 projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils/response.d.ts (generated, vendored, new file)
@@ -0,0 +1,3 @@
```ts
declare const RESPONSE_ALREADY_SENT: Response;

export { RESPONSE_ALREADY_SENT };
```
37 projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils/response.js (generated, vendored, new file)
@@ -0,0 +1,37 @@
```js
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);

// src/utils/response.ts
var response_exports = {};
__export(response_exports, {
  RESPONSE_ALREADY_SENT: () => RESPONSE_ALREADY_SENT
});
module.exports = __toCommonJS(response_exports);

// src/utils/response/constants.ts
var X_ALREADY_SENT = "x-hono-already-sent";

// src/utils/response.ts
var RESPONSE_ALREADY_SENT = new Response(null, {
  headers: { [X_ALREADY_SENT]: "true" }
});
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  RESPONSE_ALREADY_SENT
});
```
10 projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils/response.mjs (generated, vendored, new file)
@@ -0,0 +1,10 @@
```js
// src/utils/response/constants.ts
var X_ALREADY_SENT = "x-hono-already-sent";

// src/utils/response.ts
var RESPONSE_ALREADY_SENT = new Response(null, {
  headers: { [X_ALREADY_SENT]: "true" }
});
export {
  RESPONSE_ALREADY_SENT
};
```
3 projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils/response/constants.d.mts (generated, vendored, new file)
@@ -0,0 +1,3 @@
```ts
declare const X_ALREADY_SENT = "x-hono-already-sent";

export { X_ALREADY_SENT };
```
3 projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils/response/constants.d.ts (generated, vendored, new file)
@@ -0,0 +1,3 @@
```ts
declare const X_ALREADY_SENT = "x-hono-already-sent";

export { X_ALREADY_SENT };
```
30 projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils/response/constants.js (generated, vendored, new file)
@@ -0,0 +1,30 @@
```js
"use strict";
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);

// src/utils/response/constants.ts
var constants_exports = {};
__export(constants_exports, {
  X_ALREADY_SENT: () => X_ALREADY_SENT
});
module.exports = __toCommonJS(constants_exports);
var X_ALREADY_SENT = "x-hono-already-sent";
// Annotate the CommonJS export names for ESM import in node:
0 && (module.exports = {
  X_ALREADY_SENT
});
```
5 projects/arabica/sprint1/node_modules/@hono/node-server/dist/utils/response/constants.mjs (generated, vendored, new file)
@@ -0,0 +1,5 @@
```js
// src/utils/response/constants.ts
var X_ALREADY_SENT = "x-hono-already-sent";
export {
  X_ALREADY_SENT
};
```
7 projects/arabica/sprint1/node_modules/@hono/node-server/dist/vercel.d.mts (generated, vendored, new file)
@@ -0,0 +1,7 @@
```ts
import * as http2 from 'http2';
import * as http from 'http';
import { Hono } from 'hono';

declare const handle: (app: Hono<any, any, any>) => (incoming: http.IncomingMessage | http2.Http2ServerRequest, outgoing: http.ServerResponse | http2.Http2ServerResponse) => Promise<void>;

export { handle };
```
7 projects/arabica/sprint1/node_modules/@hono/node-server/dist/vercel.d.ts (generated, vendored, new file)
@@ -0,0 +1,7 @@
```ts
import * as http2 from 'http2';
import * as http from 'http';
import { Hono } from 'hono';

declare const handle: (app: Hono<any, any, any>) => (incoming: http.IncomingMessage | http2.Http2ServerRequest, outgoing: http.ServerResponse | http2.Http2ServerResponse) => Promise<void>;

export { handle };
```
588 projects/arabica/sprint1/node_modules/@hono/node-server/dist/vercel.js (generated, vendored, new file)
@@ -0,0 +1,588 @@
```js
"use strict";
var __create = Object.create;
var __defProp = Object.defineProperty;
var __getOwnPropDesc = Object.getOwnPropertyDescriptor;
var __getOwnPropNames = Object.getOwnPropertyNames;
var __getProtoOf = Object.getPrototypeOf;
var __hasOwnProp = Object.prototype.hasOwnProperty;
var __export = (target, all) => {
  for (var name in all)
    __defProp(target, name, { get: all[name], enumerable: true });
};
var __copyProps = (to, from, except, desc) => {
  if (from && typeof from === "object" || typeof from === "function") {
    for (let key of __getOwnPropNames(from))
      if (!__hasOwnProp.call(to, key) && key !== except)
        __defProp(to, key, { get: () => from[key], enumerable: !(desc = __getOwnPropDesc(from, key)) || desc.enumerable });
  }
  return to;
};
var __toESM = (mod, isNodeMode, target) => (target = mod != null ? __create(__getProtoOf(mod)) : {}, __copyProps(
  // If the importer is in node compatibility mode or this is not an ESM
  // file that has been converted to a CommonJS file using a Babel-
  // compatible transform (i.e. "__esModule" has not been set), then set
  // "default" to the CommonJS "module.exports" for node compatibility.
  isNodeMode || !mod || !mod.__esModule ? __defProp(target, "default", { value: mod, enumerable: true }) : target,
  mod
));
var __toCommonJS = (mod) => __copyProps(__defProp({}, "__esModule", { value: true }), mod);

// src/vercel.ts
var vercel_exports = {};
__export(vercel_exports, {
  handle: () => handle
});
module.exports = __toCommonJS(vercel_exports);

// src/listener.ts
var import_node_http22 = require("http2");

// src/request.ts
var import_node_http2 = require("http2");
var import_node_stream = require("stream");
var RequestError = class extends Error {
  constructor(message, options) {
    super(message, options);
    this.name = "RequestError";
  }
};
var toRequestError = (e) => {
  if (e instanceof RequestError) {
    return e;
  }
  return new RequestError(e.message, { cause: e });
};
var GlobalRequest = global.Request;
var Request = class extends GlobalRequest {
  constructor(input, options) {
    if (typeof input === "object" && getRequestCache in input) {
      input = input[getRequestCache]();
    }
    if (typeof options?.body?.getReader !== "undefined") {
      ;
      options.duplex ??= "half";
    }
    super(input, options);
  }
};
var newHeadersFromIncoming = (incoming) => {
  const headerRecord = [];
  const rawHeaders = incoming.rawHeaders;
  for (let i = 0; i < rawHeaders.length; i += 2) {
    const { [i]: key, [i + 1]: value } = rawHeaders;
    if (key.charCodeAt(0) !== /*:*/
    58) {
      headerRecord.push([key, value]);
    }
  }
  return new Headers(headerRecord);
};
var wrapBodyStream = Symbol("wrapBodyStream");
var newRequestFromIncoming = (method, url, headers, incoming, abortController) => {
  const init = {
    method,
    headers,
    signal: abortController.signal
  };
  if (method === "TRACE") {
    init.method = "GET";
    const req = new Request(url, init);
    Object.defineProperty(req, "method", {
      get() {
        return "TRACE";
      }
    });
    return req;
  }
  if (!(method === "GET" || method === "HEAD")) {
    if ("rawBody" in incoming && incoming.rawBody instanceof Buffer) {
      init.body = new ReadableStream({
        start(controller) {
          controller.enqueue(incoming.rawBody);
          controller.close();
        }
      });
    } else if (incoming[wrapBodyStream]) {
      let reader;
      init.body = new ReadableStream({
        async pull(controller) {
          try {
            reader ||= import_node_stream.Readable.toWeb(incoming).getReader();
            const { done, value } = await reader.read();
            if (done) {
              controller.close();
            } else {
              controller.enqueue(value);
            }
          } catch (error) {
            controller.error(error);
          }
        }
      });
    } else {
      init.body = import_node_stream.Readable.toWeb(incoming);
    }
  }
  return new Request(url, init);
};
var getRequestCache = Symbol("getRequestCache");
var requestCache = Symbol("requestCache");
var incomingKey = Symbol("incomingKey");
var urlKey = Symbol("urlKey");
var headersKey = Symbol("headersKey");
var abortControllerKey = Symbol("abortControllerKey");
var getAbortController = Symbol("getAbortController");
var requestPrototype = {
  get method() {
    return this[incomingKey].method || "GET";
  },
  get url() {
    return this[urlKey];
  },
  get headers() {
    return this[headersKey] ||= newHeadersFromIncoming(this[incomingKey]);
  },
  [getAbortController]() {
    this[getRequestCache]();
    return this[abortControllerKey];
  },
  [getRequestCache]() {
    this[abortControllerKey] ||= new AbortController();
    return this[requestCache] ||= newRequestFromIncoming(
      this.method,
      this[urlKey],
      this.headers,
      this[incomingKey],
      this[abortControllerKey]
    );
  }
};
[
  "body",
  "bodyUsed",
  "cache",
  "credentials",
  "destination",
  "integrity",
  "mode",
  "redirect",
  "referrer",
  "referrerPolicy",
  "signal",
  "keepalive"
].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    get() {
      return this[getRequestCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    value: function() {
      return this[getRequestCache]()[k]();
    }
  });
});
Object.setPrototypeOf(requestPrototype, Request.prototype);
var newRequest = (incoming, defaultHostname) => {
  const req = Object.create(requestPrototype);
  req[incomingKey] = incoming;
  const incomingUrl = incoming.url || "";
  if (incomingUrl[0] !== "/" && // short-circuit for performance. most requests are relative URL.
  (incomingUrl.startsWith("http://") || incomingUrl.startsWith("https://"))) {
    if (incoming instanceof import_node_http2.Http2ServerRequest) {
      throw new RequestError("Absolute URL for :path is not allowed in HTTP/2");
    }
    try {
      const url2 = new URL(incomingUrl);
      req[urlKey] = url2.href;
    } catch (e) {
      throw new RequestError("Invalid absolute URL", { cause: e });
    }
    return req;
  }
  const host = (incoming instanceof import_node_http2.Http2ServerRequest ? incoming.authority : incoming.headers.host) || defaultHostname;
  if (!host) {
    throw new RequestError("Missing host header");
  }
  let scheme;
  if (incoming instanceof import_node_http2.Http2ServerRequest) {
    scheme = incoming.scheme;
    if (!(scheme === "http" || scheme === "https")) {
      throw new RequestError("Unsupported scheme");
    }
  } else {
    scheme = incoming.socket && incoming.socket.encrypted ? "https" : "http";
  }
  const url = new URL(`${scheme}://${host}${incomingUrl}`);
  if (url.hostname.length !== host.length && url.hostname !== host.replace(/:\d+$/, "")) {
    throw new RequestError("Invalid host header");
  }
  req[urlKey] = url.href;
  return req;
};

// src/response.ts
var responseCache = Symbol("responseCache");
var getResponseCache = Symbol("getResponseCache");
var cacheKey = Symbol("cache");
var GlobalResponse = global.Response;
var Response2 = class _Response {
  #body;
  #init;
  [getResponseCache]() {
    delete this[cacheKey];
    return this[responseCache] ||= new GlobalResponse(this.#body, this.#init);
  }
  constructor(body, init) {
    let headers;
    this.#body = body;
    if (init instanceof _Response) {
      const cachedGlobalResponse = init[responseCache];
      if (cachedGlobalResponse) {
        this.#init = cachedGlobalResponse;
        this[getResponseCache]();
        return;
      } else {
        this.#init = init.#init;
        headers = new Headers(init.#init.headers);
      }
    } else {
      this.#init = init;
    }
    if (typeof body === "string" || typeof body?.getReader !== "undefined" || body instanceof Blob || body instanceof Uint8Array) {
      headers ||= init?.headers || { "content-type": "text/plain; charset=UTF-8" };
      this[cacheKey] = [init?.status || 200, body, headers];
    }
  }
  get headers() {
    const cache = this[cacheKey];
    if (cache) {
      if (!(cache[2] instanceof Headers)) {
        cache[2] = new Headers(cache[2]);
      }
      return cache[2];
    }
    return this[getResponseCache]().headers;
  }
  get status() {
    return this[cacheKey]?.[0] ?? this[getResponseCache]().status;
  }
  get ok() {
    const status = this.status;
    return status >= 200 && status < 300;
  }
};
["body", "bodyUsed", "redirected", "statusText", "trailers", "type", "url"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    get() {
      return this[getResponseCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    value: function() {
      return this[getResponseCache]()[k]();
    }
  });
});
Object.setPrototypeOf(Response2, GlobalResponse);
Object.setPrototypeOf(Response2.prototype, GlobalResponse.prototype);

// src/utils.ts
async function readWithoutBlocking(readPromise) {
  return Promise.race([readPromise, Promise.resolve().then(() => Promise.resolve(void 0))]);
}
function writeFromReadableStreamDefaultReader(reader, writable, currentReadPromise) {
  const cancel = (error) => {
    reader.cancel(error).catch(() => {
    });
  };
  writable.on("close", cancel);
  writable.on("error", cancel);
  (currentReadPromise ?? reader.read()).then(flow, handleStreamError);
  return reader.closed.finally(() => {
    writable.off("close", cancel);
    writable.off("error", cancel);
  });
  function handleStreamError(error) {
    if (error) {
      writable.destroy(error);
    }
  }
  function onDrain() {
    reader.read().then(flow, handleStreamError);
  }
  function flow({ done, value }) {
    try {
      if (done) {
        writable.end();
      } else if (!writable.write(value)) {
        writable.once("drain", onDrain);
      } else {
        return reader.read().then(flow, handleStreamError);
      }
    } catch (e) {
      handleStreamError(e);
    }
  }
}
function writeFromReadableStream(stream, writable) {
  if (stream.locked) {
    throw new TypeError("ReadableStream is locked.");
  } else if (writable.destroyed) {
    return;
  }
  return writeFromReadableStreamDefaultReader(stream.getReader(), writable);
}
var buildOutgoingHttpHeaders = (headers) => {
  const res = {};
  if (!(headers instanceof Headers)) {
    headers = new Headers(headers ?? void 0);
  }
  const cookies = [];
  for (const [k, v] of headers) {
    if (k === "set-cookie") {
      cookies.push(v);
    } else {
      res[k] = v;
    }
  }
  if (cookies.length > 0) {
    res["set-cookie"] = cookies;
  }
  res["content-type"] ??= "text/plain; charset=UTF-8";
  return res;
};

// src/utils/response/constants.ts
var X_ALREADY_SENT = "x-hono-already-sent";

// src/globals.ts
var import_node_crypto = __toESM(require("crypto"));
if (typeof global.crypto === "undefined") {
  global.crypto = import_node_crypto.default;
}

// src/listener.ts
var outgoingEnded = Symbol("outgoingEnded");
var handleRequestError = () => new Response(null, {
  status: 400
});
var handleFetchError = (e) => new Response(null, {
  status: e instanceof Error && (e.name === "TimeoutError" || e.constructor.name === "TimeoutError") ? 504 : 500
});
var handleResponseError = (e, outgoing) => {
  const err = e instanceof Error ? e : new Error("unknown error", { cause: e });
  if (err.code === "ERR_STREAM_PREMATURE_CLOSE") {
    console.info("The user aborted a request.");
  } else {
    console.error(e);
    if (!outgoing.headersSent) {
      outgoing.writeHead(500, { "Content-Type": "text/plain" });
    }
    outgoing.end(`Error: ${err.message}`);
    outgoing.destroy(err);
  }
};
var flushHeaders = (outgoing) => {
  if ("flushHeaders" in outgoing && outgoing.writable) {
    outgoing.flushHeaders();
  }
};
var responseViaCache = async (res, outgoing) => {
  let [status, body, header] = res[cacheKey];
  if (header instanceof Headers) {
    header = buildOutgoingHttpHeaders(header);
  }
  if (typeof body === "string") {
    header["Content-Length"] = Buffer.byteLength(body);
  } else if (body instanceof Uint8Array) {
    header["Content-Length"] = body.byteLength;
  } else if (body instanceof Blob) {
    header["Content-Length"] = body.size;
  }
  outgoing.writeHead(status, header);
  if (typeof body === "string" || body instanceof Uint8Array) {
    outgoing.end(body);
  } else if (body instanceof Blob) {
    outgoing.end(new Uint8Array(await body.arrayBuffer()));
  } else {
    flushHeaders(outgoing);
    await writeFromReadableStream(body, outgoing)?.catch(
      (e) => handleResponseError(e, outgoing)
    );
  }
  ;
  outgoing[outgoingEnded]?.();
};
var isPromise = (res) => typeof res.then === "function";
var responseViaResponseObject = async (res, outgoing, options = {}) => {
  if (isPromise(res)) {
    if (options.errorHandler) {
      try {
        res = await res;
      } catch (err) {
        const errRes = await options.errorHandler(err);
        if (!errRes) {
          return;
        }
        res = errRes;
      }
    } else {
      res = await res.catch(handleFetchError);
    }
  }
  if (cacheKey in res) {
    return responseViaCache(res, outgoing);
  }
  const resHeaderRecord = buildOutgoingHttpHeaders(res.headers);
  if (res.body) {
    const reader = res.body.getReader();
    const values = [];
    let done = false;
    let currentReadPromise = void 0;
    if (resHeaderRecord["transfer-encoding"] !== "chunked") {
      let maxReadCount = 2;
      for (let i = 0; i < maxReadCount; i++) {
        currentReadPromise ||= reader.read();
        const chunk = await readWithoutBlocking(currentReadPromise).catch((e) => {
          console.error(e);
          done = true;
        });
        if (!chunk) {
          if (i === 1) {
            await new Promise((resolve) => setTimeout(resolve));
            maxReadCount = 3;
            continue;
          }
          break;
        }
        currentReadPromise = void 0;
        if (chunk.value) {
          values.push(chunk.value);
        }
        if (chunk.done) {
          done = true;
          break;
        }
      }
      if (done && !("content-length" in resHeaderRecord)) {
        resHeaderRecord["content-length"] = values.reduce((acc, value) => acc + value.length, 0);
      }
    }
    outgoing.writeHead(res.status, resHeaderRecord);
    values.forEach((value) => {
      ;
      outgoing.write(value);
    });
    if (done) {
      outgoing.end();
    } else {
      if (values.length === 0) {
        flushHeaders(outgoing);
      }
      await writeFromReadableStreamDefaultReader(reader, outgoing, currentReadPromise);
    }
  } else if (resHeaderRecord[X_ALREADY_SENT]) {
  } else {
    outgoing.writeHead(res.status, resHeaderRecord);
    outgoing.end();
  }
  ;
  outgoing[outgoingEnded]?.();
};
var getRequestListener = (fetchCallback, options = {}) => {
  const autoCleanupIncoming = options.autoCleanupIncoming ?? true;
  if (options.overrideGlobalObjects !== false && global.Request !== Request) {
    Object.defineProperty(global, "Request", {
      value: Request
    });
    Object.defineProperty(global, "Response", {
      value: Response2
    });
  }
  return async (incoming, outgoing) => {
    let res, req;
    try {
      req = newRequest(incoming, options.hostname);
      let incomingEnded = !autoCleanupIncoming || incoming.method === "GET" || incoming.method === "HEAD";
      if (!incomingEnded) {
        ;
        incoming[wrapBodyStream] = true;
        incoming.on("end", () => {
          incomingEnded = true;
        });
        if (incoming instanceof import_node_http22.Http2ServerRequest) {
          ;
          outgoing[outgoingEnded] = () => {
            if (!incomingEnded) {
              setTimeout(() => {
                if (!incomingEnded) {
                  setTimeout(() => {
                    incoming.destroy();
                    outgoing.destroy();
                  });
                }
              });
            }
          };
        }
      }
      outgoing.on("close", () => {
        const abortController = req[abortControllerKey];
        if (abortController) {
          if (incoming.errored) {
            req[abortControllerKey].abort(incoming.errored.toString());
          } else if (!outgoing.writableFinished) {
            req[abortControllerKey].abort("Client connection prematurely closed.");
          }
        }
        if (!incomingEnded) {
          setTimeout(() => {
            if (!incomingEnded) {
              setTimeout(() => {
                incoming.destroy();
              });
            }
          });
        }
      });
      res = fetchCallback(req, { incoming, outgoing });
      if (cacheKey in res) {
        return responseViaCache(res, outgoing);
      }
    } catch (e) {
      if (!res) {
        if (options.errorHandler) {
          res = await options.errorHandler(req ? e : toRequestError(e));
          if (!res) {
            return;
          }
        } else if (!req) {
          res = handleRequestError();
        } else {
          res = handleFetchError(e);
        }
      } else {
        return handleResponseError(e, outgoing);
      }
    }
    try {
      return await responseViaResponseObject(res, outgoing, options);
    } catch (e) {
      return handleResponseError(e, outgoing);
    }
  };
};

// src/vercel.ts
var handle = (app) => {
```
return getRequestListener(app.fetch);
|
||||
};
|
||||
// Annotate the CommonJS export names for ESM import in node:
|
||||
0 && (module.exports = {
|
||||
handle
|
||||
});
|
||||
551
projects/arabica/sprint1/node_modules/@hono/node-server/dist/vercel.mjs
generated
vendored
Normal file
@@ -0,0 +1,551 @@
// src/listener.ts
import { Http2ServerRequest as Http2ServerRequest2 } from "http2";

// src/request.ts
import { Http2ServerRequest } from "http2";
import { Readable } from "stream";
var RequestError = class extends Error {
  constructor(message, options) {
    super(message, options);
    this.name = "RequestError";
  }
};
var toRequestError = (e) => {
  if (e instanceof RequestError) {
    return e;
  }
  return new RequestError(e.message, { cause: e });
};
var GlobalRequest = global.Request;
var Request = class extends GlobalRequest {
  constructor(input, options) {
    if (typeof input === "object" && getRequestCache in input) {
      input = input[getRequestCache]();
    }
    if (typeof options?.body?.getReader !== "undefined") {
      ;
      options.duplex ??= "half";
    }
    super(input, options);
  }
};
var newHeadersFromIncoming = (incoming) => {
  const headerRecord = [];
  const rawHeaders = incoming.rawHeaders;
  for (let i = 0; i < rawHeaders.length; i += 2) {
    const { [i]: key, [i + 1]: value } = rawHeaders;
    if (key.charCodeAt(0) !== /*:*/
    58) {
      headerRecord.push([key, value]);
    }
  }
  return new Headers(headerRecord);
};
var wrapBodyStream = Symbol("wrapBodyStream");
var newRequestFromIncoming = (method, url, headers, incoming, abortController) => {
  const init = {
    method,
    headers,
    signal: abortController.signal
  };
  if (method === "TRACE") {
    init.method = "GET";
    const req = new Request(url, init);
    Object.defineProperty(req, "method", {
      get() {
        return "TRACE";
      }
    });
    return req;
  }
  if (!(method === "GET" || method === "HEAD")) {
    if ("rawBody" in incoming && incoming.rawBody instanceof Buffer) {
      init.body = new ReadableStream({
        start(controller) {
          controller.enqueue(incoming.rawBody);
          controller.close();
        }
      });
    } else if (incoming[wrapBodyStream]) {
      let reader;
      init.body = new ReadableStream({
        async pull(controller) {
          try {
            reader ||= Readable.toWeb(incoming).getReader();
            const { done, value } = await reader.read();
            if (done) {
              controller.close();
            } else {
              controller.enqueue(value);
            }
          } catch (error) {
            controller.error(error);
          }
        }
      });
    } else {
      init.body = Readable.toWeb(incoming);
    }
  }
  return new Request(url, init);
};
var getRequestCache = Symbol("getRequestCache");
var requestCache = Symbol("requestCache");
var incomingKey = Symbol("incomingKey");
var urlKey = Symbol("urlKey");
var headersKey = Symbol("headersKey");
var abortControllerKey = Symbol("abortControllerKey");
var getAbortController = Symbol("getAbortController");
var requestPrototype = {
  get method() {
    return this[incomingKey].method || "GET";
  },
  get url() {
    return this[urlKey];
  },
  get headers() {
    return this[headersKey] ||= newHeadersFromIncoming(this[incomingKey]);
  },
  [getAbortController]() {
    this[getRequestCache]();
    return this[abortControllerKey];
  },
  [getRequestCache]() {
    this[abortControllerKey] ||= new AbortController();
    return this[requestCache] ||= newRequestFromIncoming(
      this.method,
      this[urlKey],
      this.headers,
      this[incomingKey],
      this[abortControllerKey]
    );
  }
};
[
  "body",
  "bodyUsed",
  "cache",
  "credentials",
  "destination",
  "integrity",
  "mode",
  "redirect",
  "referrer",
  "referrerPolicy",
  "signal",
  "keepalive"
].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    get() {
      return this[getRequestCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(requestPrototype, k, {
    value: function() {
      return this[getRequestCache]()[k]();
    }
  });
});
Object.setPrototypeOf(requestPrototype, Request.prototype);
var newRequest = (incoming, defaultHostname) => {
  const req = Object.create(requestPrototype);
  req[incomingKey] = incoming;
  const incomingUrl = incoming.url || "";
  if (incomingUrl[0] !== "/" && // short-circuit for performance. most requests are relative URL.
  (incomingUrl.startsWith("http://") || incomingUrl.startsWith("https://"))) {
    if (incoming instanceof Http2ServerRequest) {
      throw new RequestError("Absolute URL for :path is not allowed in HTTP/2");
    }
    try {
      const url2 = new URL(incomingUrl);
      req[urlKey] = url2.href;
    } catch (e) {
      throw new RequestError("Invalid absolute URL", { cause: e });
    }
    return req;
  }
  const host = (incoming instanceof Http2ServerRequest ? incoming.authority : incoming.headers.host) || defaultHostname;
  if (!host) {
    throw new RequestError("Missing host header");
  }
  let scheme;
  if (incoming instanceof Http2ServerRequest) {
    scheme = incoming.scheme;
    if (!(scheme === "http" || scheme === "https")) {
      throw new RequestError("Unsupported scheme");
    }
  } else {
    scheme = incoming.socket && incoming.socket.encrypted ? "https" : "http";
  }
  const url = new URL(`${scheme}://${host}${incomingUrl}`);
  if (url.hostname.length !== host.length && url.hostname !== host.replace(/:\d+$/, "")) {
    throw new RequestError("Invalid host header");
  }
  req[urlKey] = url.href;
  return req;
};

// src/response.ts
var responseCache = Symbol("responseCache");
var getResponseCache = Symbol("getResponseCache");
var cacheKey = Symbol("cache");
var GlobalResponse = global.Response;
var Response2 = class _Response {
  #body;
  #init;
  [getResponseCache]() {
    delete this[cacheKey];
    return this[responseCache] ||= new GlobalResponse(this.#body, this.#init);
  }
  constructor(body, init) {
    let headers;
    this.#body = body;
    if (init instanceof _Response) {
      const cachedGlobalResponse = init[responseCache];
      if (cachedGlobalResponse) {
        this.#init = cachedGlobalResponse;
        this[getResponseCache]();
        return;
      } else {
        this.#init = init.#init;
        headers = new Headers(init.#init.headers);
      }
    } else {
      this.#init = init;
    }
    if (typeof body === "string" || typeof body?.getReader !== "undefined" || body instanceof Blob || body instanceof Uint8Array) {
      headers ||= init?.headers || { "content-type": "text/plain; charset=UTF-8" };
      this[cacheKey] = [init?.status || 200, body, headers];
    }
  }
  get headers() {
    const cache = this[cacheKey];
    if (cache) {
      if (!(cache[2] instanceof Headers)) {
        cache[2] = new Headers(cache[2]);
      }
      return cache[2];
    }
    return this[getResponseCache]().headers;
  }
  get status() {
    return this[cacheKey]?.[0] ?? this[getResponseCache]().status;
  }
  get ok() {
    const status = this.status;
    return status >= 200 && status < 300;
  }
};
["body", "bodyUsed", "redirected", "statusText", "trailers", "type", "url"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    get() {
      return this[getResponseCache]()[k];
    }
  });
});
["arrayBuffer", "blob", "clone", "formData", "json", "text"].forEach((k) => {
  Object.defineProperty(Response2.prototype, k, {
    value: function() {
      return this[getResponseCache]()[k]();
    }
  });
});
Object.setPrototypeOf(Response2, GlobalResponse);
Object.setPrototypeOf(Response2.prototype, GlobalResponse.prototype);

// src/utils.ts
async function readWithoutBlocking(readPromise) {
  return Promise.race([readPromise, Promise.resolve().then(() => Promise.resolve(void 0))]);
}
function writeFromReadableStreamDefaultReader(reader, writable, currentReadPromise) {
  const cancel = (error) => {
    reader.cancel(error).catch(() => {
    });
  };
  writable.on("close", cancel);
  writable.on("error", cancel);
  (currentReadPromise ?? reader.read()).then(flow, handleStreamError);
  return reader.closed.finally(() => {
    writable.off("close", cancel);
    writable.off("error", cancel);
  });
  function handleStreamError(error) {
    if (error) {
      writable.destroy(error);
    }
  }
  function onDrain() {
    reader.read().then(flow, handleStreamError);
  }
  function flow({ done, value }) {
    try {
      if (done) {
        writable.end();
      } else if (!writable.write(value)) {
        writable.once("drain", onDrain);
      } else {
        return reader.read().then(flow, handleStreamError);
      }
    } catch (e) {
      handleStreamError(e);
    }
  }
}
function writeFromReadableStream(stream, writable) {
  if (stream.locked) {
    throw new TypeError("ReadableStream is locked.");
  } else if (writable.destroyed) {
    return;
  }
  return writeFromReadableStreamDefaultReader(stream.getReader(), writable);
}
var buildOutgoingHttpHeaders = (headers) => {
  const res = {};
  if (!(headers instanceof Headers)) {
    headers = new Headers(headers ?? void 0);
  }
  const cookies = [];
  for (const [k, v] of headers) {
    if (k === "set-cookie") {
      cookies.push(v);
    } else {
      res[k] = v;
    }
  }
  if (cookies.length > 0) {
    res["set-cookie"] = cookies;
  }
  res["content-type"] ??= "text/plain; charset=UTF-8";
  return res;
};

// src/utils/response/constants.ts
var X_ALREADY_SENT = "x-hono-already-sent";

// src/globals.ts
import crypto from "crypto";
if (typeof global.crypto === "undefined") {
  global.crypto = crypto;
}

// src/listener.ts
var outgoingEnded = Symbol("outgoingEnded");
var handleRequestError = () => new Response(null, {
  status: 400
});
var handleFetchError = (e) => new Response(null, {
  status: e instanceof Error && (e.name === "TimeoutError" || e.constructor.name === "TimeoutError") ? 504 : 500
});
var handleResponseError = (e, outgoing) => {
  const err = e instanceof Error ? e : new Error("unknown error", { cause: e });
  if (err.code === "ERR_STREAM_PREMATURE_CLOSE") {
    console.info("The user aborted a request.");
  } else {
    console.error(e);
    if (!outgoing.headersSent) {
      outgoing.writeHead(500, { "Content-Type": "text/plain" });
    }
    outgoing.end(`Error: ${err.message}`);
    outgoing.destroy(err);
  }
};
var flushHeaders = (outgoing) => {
  if ("flushHeaders" in outgoing && outgoing.writable) {
    outgoing.flushHeaders();
  }
};
var responseViaCache = async (res, outgoing) => {
  let [status, body, header] = res[cacheKey];
  if (header instanceof Headers) {
    header = buildOutgoingHttpHeaders(header);
  }
  if (typeof body === "string") {
    header["Content-Length"] = Buffer.byteLength(body);
  } else if (body instanceof Uint8Array) {
    header["Content-Length"] = body.byteLength;
  } else if (body instanceof Blob) {
    header["Content-Length"] = body.size;
  }
  outgoing.writeHead(status, header);
  if (typeof body === "string" || body instanceof Uint8Array) {
    outgoing.end(body);
  } else if (body instanceof Blob) {
    outgoing.end(new Uint8Array(await body.arrayBuffer()));
  } else {
    flushHeaders(outgoing);
    await writeFromReadableStream(body, outgoing)?.catch(
      (e) => handleResponseError(e, outgoing)
    );
  }
  ;
  outgoing[outgoingEnded]?.();
};
var isPromise = (res) => typeof res.then === "function";
var responseViaResponseObject = async (res, outgoing, options = {}) => {
  if (isPromise(res)) {
    if (options.errorHandler) {
      try {
        res = await res;
      } catch (err) {
        const errRes = await options.errorHandler(err);
        if (!errRes) {
          return;
        }
        res = errRes;
      }
    } else {
      res = await res.catch(handleFetchError);
    }
  }
  if (cacheKey in res) {
    return responseViaCache(res, outgoing);
  }
  const resHeaderRecord = buildOutgoingHttpHeaders(res.headers);
  if (res.body) {
    const reader = res.body.getReader();
    const values = [];
    let done = false;
    let currentReadPromise = void 0;
    if (resHeaderRecord["transfer-encoding"] !== "chunked") {
      let maxReadCount = 2;
      for (let i = 0; i < maxReadCount; i++) {
        currentReadPromise ||= reader.read();
        const chunk = await readWithoutBlocking(currentReadPromise).catch((e) => {
          console.error(e);
          done = true;
        });
        if (!chunk) {
          if (i === 1) {
            await new Promise((resolve) => setTimeout(resolve));
            maxReadCount = 3;
            continue;
          }
          break;
        }
        currentReadPromise = void 0;
        if (chunk.value) {
          values.push(chunk.value);
        }
        if (chunk.done) {
          done = true;
          break;
        }
      }
      if (done && !("content-length" in resHeaderRecord)) {
        resHeaderRecord["content-length"] = values.reduce((acc, value) => acc + value.length, 0);
      }
    }
    outgoing.writeHead(res.status, resHeaderRecord);
    values.forEach((value) => {
      ;
      outgoing.write(value);
    });
    if (done) {
      outgoing.end();
    } else {
      if (values.length === 0) {
        flushHeaders(outgoing);
      }
      await writeFromReadableStreamDefaultReader(reader, outgoing, currentReadPromise);
    }
  } else if (resHeaderRecord[X_ALREADY_SENT]) {
  } else {
    outgoing.writeHead(res.status, resHeaderRecord);
    outgoing.end();
  }
  ;
  outgoing[outgoingEnded]?.();
};
var getRequestListener = (fetchCallback, options = {}) => {
  const autoCleanupIncoming = options.autoCleanupIncoming ?? true;
  if (options.overrideGlobalObjects !== false && global.Request !== Request) {
    Object.defineProperty(global, "Request", {
      value: Request
    });
    Object.defineProperty(global, "Response", {
      value: Response2
    });
  }
  return async (incoming, outgoing) => {
    let res, req;
    try {
      req = newRequest(incoming, options.hostname);
      let incomingEnded = !autoCleanupIncoming || incoming.method === "GET" || incoming.method === "HEAD";
      if (!incomingEnded) {
        ;
        incoming[wrapBodyStream] = true;
        incoming.on("end", () => {
          incomingEnded = true;
        });
        if (incoming instanceof Http2ServerRequest2) {
          ;
          outgoing[outgoingEnded] = () => {
            if (!incomingEnded) {
              setTimeout(() => {
                if (!incomingEnded) {
                  setTimeout(() => {
                    incoming.destroy();
                    outgoing.destroy();
                  });
                }
              });
            }
          };
        }
      }
      outgoing.on("close", () => {
        const abortController = req[abortControllerKey];
        if (abortController) {
          if (incoming.errored) {
            req[abortControllerKey].abort(incoming.errored.toString());
          } else if (!outgoing.writableFinished) {
            req[abortControllerKey].abort("Client connection prematurely closed.");
          }
        }
        if (!incomingEnded) {
          setTimeout(() => {
            if (!incomingEnded) {
              setTimeout(() => {
                incoming.destroy();
              });
            }
          });
        }
      });
      res = fetchCallback(req, { incoming, outgoing });
      if (cacheKey in res) {
        return responseViaCache(res, outgoing);
      }
    } catch (e) {
      if (!res) {
        if (options.errorHandler) {
          res = await options.errorHandler(req ? e : toRequestError(e));
          if (!res) {
            return;
          }
        } else if (!req) {
          res = handleRequestError();
        } else {
          res = handleFetchError(e);
        }
      } else {
        return handleResponseError(e, outgoing);
      }
    }
    try {
      return await responseViaResponseObject(res, outgoing, options);
    } catch (e) {
      return handleResponseError(e, outgoing);
    }
  };
};

// src/vercel.ts
var handle = (app) => {
  return getRequestListener(app.fetch);
};
export {
  handle
};
103
projects/arabica/sprint1/node_modules/@hono/node-server/package.json
generated
vendored
Normal file
@@ -0,0 +1,103 @@
{
  "name": "@hono/node-server",
  "version": "1.19.9",
  "description": "Node.js Adapter for Hono",
  "main": "dist/index.js",
  "types": "dist/index.d.ts",
  "files": [
    "dist"
  ],
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "require": "./dist/index.js",
      "import": "./dist/index.mjs"
    },
    "./serve-static": {
      "types": "./dist/serve-static.d.ts",
      "require": "./dist/serve-static.js",
      "import": "./dist/serve-static.mjs"
    },
    "./vercel": {
      "types": "./dist/vercel.d.ts",
      "require": "./dist/vercel.js",
      "import": "./dist/vercel.mjs"
    },
    "./utils/*": {
      "types": "./dist/utils/*.d.ts",
      "require": "./dist/utils/*.js",
      "import": "./dist/utils/*.mjs"
    },
    "./conninfo": {
      "types": "./dist/conninfo.d.ts",
      "require": "./dist/conninfo.js",
      "import": "./dist/conninfo.mjs"
    }
  },
  "typesVersions": {
    "*": {
      ".": [
        "./dist/index.d.ts"
      ],
      "serve-static": [
        "./dist/serve-static.d.ts"
      ],
      "vercel": [
        "./dist/vercel.d.ts"
      ],
      "utils/*": [
        "./dist/utils/*.d.ts"
      ],
      "conninfo": [
        "./dist/conninfo.d.ts"
      ]
    }
  },
  "scripts": {
    "test": "node --expose-gc node_modules/jest/bin/jest.js",
    "build": "tsup --external hono",
    "watch": "tsup --watch",
    "postbuild": "publint",
    "prerelease": "bun run build && bun run test",
    "release": "np",
    "lint": "eslint src test",
    "lint:fix": "eslint src test --fix",
    "format": "prettier --check \"src/**/*.{js,ts}\" \"test/**/*.{js,ts}\"",
    "format:fix": "prettier --write \"src/**/*.{js,ts}\" \"test/**/*.{js,ts}\""
  },
  "license": "MIT",
  "repository": {
    "type": "git",
    "url": "https://github.com/honojs/node-server.git"
  },
  "homepage": "https://github.com/honojs/node-server",
  "author": "Yusuke Wada <yusuke@kamawada.com> (https://github.com/yusukebe)",
  "publishConfig": {
    "registry": "https://registry.npmjs.org",
    "access": "public"
  },
  "engines": {
    "node": ">=18.14.1"
  },
  "devDependencies": {
    "@hono/eslint-config": "^1.0.1",
    "@types/jest": "^29.5.3",
    "@types/node": "^20.10.0",
    "@types/supertest": "^2.0.12",
    "@whatwg-node/fetch": "^0.9.14",
    "eslint": "^9.10.0",
    "hono": "^4.4.10",
    "jest": "^29.6.1",
    "np": "^7.7.0",
    "prettier": "^3.2.4",
    "publint": "^0.1.16",
    "supertest": "^6.3.3",
    "ts-jest": "^29.1.1",
    "tsup": "^7.2.0",
    "typescript": "^5.3.2"
  },
  "peerDependencies": {
    "hono": "^4"
  },
  "packageManager": "bun@1.2.20"
}
21
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2024 Anthropic, PBC

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
170
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/README.md
generated
vendored
Normal file
@@ -0,0 +1,170 @@
# MCP TypeScript SDK [](https://www.npmjs.com/package/@modelcontextprotocol/sdk) [](https://github.com/modelcontextprotocol/typescript-sdk/blob/main/LICENSE)

<details>
<summary>Table of Contents</summary>

- [Overview](#overview)
- [Installation](#installation)
- [Quick Start](#quick-start)
- [Core Concepts](#core-concepts)
- [Examples](#examples)
- [Documentation](#documentation)
- [Contributing](#contributing)
- [License](#license)

</details>

## Overview

The Model Context Protocol allows applications to provide context for LLMs in a standardized way, separating the concerns of providing context from the actual LLM interaction. This TypeScript SDK implements [the full MCP specification](https://modelcontextprotocol.io/specification/draft), making it easy to:

- Create MCP servers that expose resources, prompts and tools
- Build MCP clients that can connect to any MCP server
- Use standard transports like stdio and Streamable HTTP

## Installation

```bash
npm install @modelcontextprotocol/sdk zod
```

This SDK has a **required peer dependency** on `zod` for schema validation. The SDK internally imports from `zod/v4`, but maintains backwards compatibility with projects using Zod v3.25 or later. You can use either API in your code by importing from `zod/v3` or `zod/v4`.

## Quick Start

To see the SDK in action end-to-end, start from the runnable examples in `src/examples`:

1. **Install dependencies** (from the SDK repo root):

   ```bash
   npm install
   ```

2. **Run the example Streamable HTTP server**:

   ```bash
   npx tsx src/examples/server/simpleStreamableHttp.ts
   ```

3. **Run the interactive client in another terminal**:

   ```bash
   npx tsx src/examples/client/simpleStreamableHttp.ts
   ```

This pair of examples demonstrates tools, resources, prompts, sampling, elicitation, tasks and logging. For a guided walkthrough and variations (stateless servers, JSON-only responses, SSE compatibility, OAuth, etc.), see [docs/server.md](docs/server.md) and [docs/client.md](docs/client.md).

## Core Concepts

### Servers and transports

An MCP server is typically created with `McpServer` and connected to a transport such as Streamable HTTP or stdio. The SDK supports:

- **Streamable HTTP** for remote servers (recommended).
- **HTTP + SSE** for backwards compatibility only.
- **stdio** for local, process-spawned integrations.

Runnable server examples live under `src/examples/server` and are documented in [docs/server.md](docs/server.md).
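A stdio transport frames each JSON-RPC message as one line of JSON on stdin/stdout. The dependency-free sketch below illustrates that framing only; it is not the SDK's actual implementation, and the simplified `JsonRpcMessage` type is an assumption for illustration:

```typescript
// Conceptual sketch of newline-delimited JSON-RPC framing as used by
// stdio-style transports. Simplified: the real SDK also validates messages
// against the full MCP schema and handles binary/encoding concerns.
type JsonRpcMessage = {
  jsonrpc: "2.0";
  id?: number;
  method?: string;
  params?: unknown;
  result?: unknown;
};

// Serialize one message as a single line, ready to write to stdout.
function encodeMessage(msg: JsonRpcMessage): string {
  return JSON.stringify(msg) + "\n";
}

// Split a buffered chunk of stdin into complete messages, returning any
// trailing partial line so it can be prepended to the next chunk.
function decodeMessages(buffer: string): { messages: JsonRpcMessage[]; rest: string } {
  const lines = buffer.split("\n");
  const rest = lines.pop() ?? "";
  return {
    messages: lines.filter((l) => l.length > 0).map((l) => JSON.parse(l) as JsonRpcMessage),
    rest,
  };
}

// One complete message plus the start of a second, not yet fully received.
const wire = encodeMessage({ jsonrpc: "2.0", id: 1, method: "tools/list" });
const { messages, rest } = decodeMessages(wire + '{"jsonrpc":"2.0","id":2');
```

Because messages are delimited by newlines, a reader only ever parses complete lines and carries the partial tail forward, which is why stdio transports can work over plain pipes with no extra framing protocol.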

### Tools, resources, prompts

- **Tools** let LLMs ask your server to take actions (computation, side effects, network calls).
- **Resources** expose read-only data that clients can surface to users or models.
- **Prompts** are reusable templates that help users talk to models in a consistent way.

The detailed APIs, including `ResourceTemplate`, completions, and display-name metadata, are covered in [docs/server.md](docs/server.md#tools-resources-and-prompts), with runnable implementations in [`simpleStreamableHttp.ts`](src/examples/server/simpleStreamableHttp.ts).
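Conceptually, a tool is a named handler from arguments to a content result, and the server dispatches an incoming `tools/call` by name. A dependency-free sketch of that shape (the `ToolResult` type and `tools` registry here are illustrative assumptions, not the SDK's real registration API):

```typescript
// Illustrative shapes only; the SDK's actual types are richer and include
// schema validation of the arguments via zod.
type ToolResult = { content: { type: "text"; text: string }[] };
type ToolHandler = (args: Record<string, unknown>) => Promise<ToolResult> | ToolResult;

// A tiny registry mapping tool names to handlers, mirroring how a server
// dispatches an incoming tools/call request by name.
const tools = new Map<string, ToolHandler>();

tools.set("add", ({ a, b }) => ({
  content: [{ type: "text", text: String(Number(a) + Number(b)) }],
}));

async function callTool(name: string, args: Record<string, unknown>): Promise<ToolResult> {
  const handler = tools.get(name);
  if (!handler) throw new Error(`Unknown tool: ${name}`);
  return await handler(args);
}
```

With the real SDK the registry and dispatch are handled for you; the handler body is the part you write.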
|
||||
### Capabilities: sampling, elicitation, and tasks
|
||||
|
||||
The SDK includes higher-level capabilities for richer workflows:
|
||||
|
||||
- **Sampling**: server-side tools can ask connected clients to run LLM completions.
|
||||
- **Form elicitation**: tools can request non-sensitive input via structured forms.
|
||||
- **URL elicitation**: servers can ask users to complete secure flows in a browser (e.g., API key entry, payments, OAuth).
|
||||
- **Tasks (experimental)**: long-running tool calls can be turned into tasks that you poll or resume later.
|
||||
|
||||
Conceptual overviews and links to runnable examples are in:
|
||||
|
||||
- [docs/capabilities.md](docs/capabilities.md)
|
||||
|
||||
Key example servers include:
|
||||
|
||||
- [`toolWithSampleServer.ts`](src/examples/server/toolWithSampleServer.ts)
|
||||
- [`elicitationFormExample.ts`](src/examples/server/elicitationFormExample.ts)
|
||||
- [`elicitationUrlExample.ts`](src/examples/server/elicitationUrlExample.ts)
|
||||
|
||||
### Clients
|
||||
|
||||
The high-level `Client` class connects to MCP servers over different transports and exposes helpers like `listTools`, `callTool`, `listResources`, `readResource`, `listPrompts`, and `getPrompt`.
|
||||
|
||||
Runnable clients live under `src/examples/client` and are described in [docs/client.md](docs/client.md), including:
|
||||
|
||||
- Interactive Streamable HTTP client ([`simpleStreamableHttp.ts`](src/examples/client/simpleStreamableHttp.ts))
|
||||
- Streamable HTTP client with SSE fallback ([`streamableHttpWithSseFallbackClient.ts`](src/examples/client/streamableHttpWithSseFallbackClient.ts))
|
||||
- OAuth-enabled clients and polling/parallel examples
|
||||
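The call pattern those helpers give you can be sketched without the SDK installed. The `McpClientLike` interface and the in-memory stub below are assumptions made for illustration; only the helper names (`listTools`, `callTool`) come from the text above:

```typescript
// Dependency-free sketch of the client-side helper surface described above.
// The real Client in @modelcontextprotocol/sdk negotiates a transport and
// protocol version; a stub stands in here so the call pattern is visible.
interface McpClientLike {
  listTools(): Promise<{ tools: { name: string; description?: string }[] }>;
  callTool(params: {
    name: string;
    arguments?: Record<string, unknown>;
  }): Promise<{ content: { type: string; text?: string }[] }>;
}

// In-memory stub: answers listTools/callTool the way a trivial server would.
const stubClient: McpClientLike = {
  async listTools() {
    return { tools: [{ name: "echo", description: "Echo back the input" }] };
  },
  async callTool({ name, arguments: args }) {
    if (name !== "echo") throw new Error(`Unknown tool: ${name}`);
    return { content: [{ type: "text", text: String(args?.message ?? "") }] };
  },
};

// Typical flow: discover tools, then call one by name with arguments.
async function demo(client: McpClientLike): Promise<string> {
  const { tools } = await client.listTools();
  const result = await client.callTool({ name: tools[0].name, arguments: { message: "hi" } });
  return result.content[0].text ?? "";
}
```

Against a real server you would construct a `Client`, connect it to a transport, and then use the same discover-then-call flow.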
|
||||
### Node.js Web Crypto (globalThis.crypto) compatibility
|
||||
|
||||
Some parts of the SDK (for example, JWT-based client authentication in `auth-extensions.ts` via `jose`) rely on the Web Crypto API exposed as `globalThis.crypto`.
|
||||
|
||||
See [docs/faq.md](docs/faq.md) for details on supported Node.js versions and how to polyfill `globalThis.crypto` when running on older Node.js runtimes.
|
||||
|
||||
## Examples

The SDK ships runnable examples under `src/examples`. Use these tables to find the scenario you care about and jump straight to the corresponding code and docs.

### Server examples

| Scenario | Description | Example file(s) | Related docs |
| --- | --- | --- | --- |
| Streamable HTTP server (stateful) | Feature-rich server with tools, resources, prompts, logging, tasks, sampling, and optional OAuth. | [`simpleStreamableHttp.ts`](src/examples/server/simpleStreamableHttp.ts) | [`server.md`](docs/server.md), [`capabilities.md`](docs/capabilities.md) |
| Streamable HTTP server (stateless) | No session tracking; good for simple API-style servers. | [`simpleStatelessStreamableHttp.ts`](src/examples/server/simpleStatelessStreamableHttp.ts) | [`server.md`](docs/server.md) |
| JSON response mode (no SSE) | Streamable HTTP with JSON responses only and limited notifications. | [`jsonResponseStreamableHttp.ts`](src/examples/server/jsonResponseStreamableHttp.ts) | [`server.md`](docs/server.md) |
| Server notifications over Streamable HTTP | Demonstrates server-initiated notifications using SSE with Streamable HTTP. | [`standaloneSseWithGetStreamableHttp.ts`](src/examples/server/standaloneSseWithGetStreamableHttp.ts) | [`server.md`](docs/server.md) |
| Deprecated HTTP+SSE server | Legacy HTTP+SSE transport for backwards-compatibility testing. | [`simpleSseServer.ts`](src/examples/server/simpleSseServer.ts) | [`server.md`](docs/server.md) |
| Backwards-compatible server (Streamable HTTP + SSE) | Single server that supports both Streamable HTTP and legacy SSE clients. | [`sseAndStreamableHttpCompatibleServer.ts`](src/examples/server/sseAndStreamableHttpCompatibleServer.ts) | [`server.md`](docs/server.md) |
| Form elicitation server | Uses form elicitation to collect non-sensitive user input. | [`elicitationFormExample.ts`](src/examples/server/elicitationFormExample.ts) | [`capabilities.md`](docs/capabilities.md#elicitation) |
| URL elicitation server | Demonstrates URL-mode elicitation in an OAuth-protected server. | [`elicitationUrlExample.ts`](src/examples/server/elicitationUrlExample.ts) | [`capabilities.md`](docs/capabilities.md#elicitation) |
| Sampling and tasks server | Combines tools, logging, sampling, and experimental task-based execution. | [`toolWithSampleServer.ts`](src/examples/server/toolWithSampleServer.ts) | [`capabilities.md`](docs/capabilities.md) |
| OAuth demo authorization server | In-memory OAuth provider used with the example servers. | [`demoInMemoryOAuthProvider.ts`](src/examples/server/demoInMemoryOAuthProvider.ts) | [`server.md`](docs/server.md) |
### Client examples

| Scenario | Description | Example file(s) | Related docs |
| --- | --- | --- | --- |
| Interactive Streamable HTTP client | CLI client that exercises tools, resources, prompts, elicitation, and tasks. | [`simpleStreamableHttp.ts`](src/examples/client/simpleStreamableHttp.ts) | [`client.md`](docs/client.md) |
| Backwards-compatible client (Streamable HTTP → SSE) | Tries Streamable HTTP first, then falls back to SSE on 4xx responses. | [`streamableHttpWithSseFallbackClient.ts`](src/examples/client/streamableHttpWithSseFallbackClient.ts) | [`client.md`](docs/client.md), [`server.md`](docs/server.md) |
| SSE polling client | Polls a legacy SSE server and demonstrates notification handling. | [`ssePollingClient.ts`](src/examples/client/ssePollingClient.ts) | [`client.md`](docs/client.md) |
| Parallel tool calls client | Shows how to run multiple tool calls in parallel. | [`parallelToolCallsClient.ts`](src/examples/client/parallelToolCallsClient.ts) | [`client.md`](docs/client.md) |
| Multiple clients in parallel | Demonstrates connecting multiple clients concurrently to the same server. | [`multipleClientsParallel.ts`](src/examples/client/multipleClientsParallel.ts) | [`client.md`](docs/client.md) |
| OAuth clients | Examples of client_credentials (basic and private_key_jwt) and reusable providers. | [`simpleOAuthClient.ts`](src/examples/client/simpleOAuthClient.ts), [`simpleOAuthClientProvider.ts`](src/examples/client/simpleOAuthClientProvider.ts), [`simpleClientCredentials.ts`](src/examples/client/simpleClientCredentials.ts) | [`client.md`](docs/client.md) |
| URL elicitation client | Works with the URL elicitation server to drive secure browser flows. | [`elicitationUrlExample.ts`](src/examples/client/elicitationUrlExample.ts) | [`capabilities.md`](docs/capabilities.md#elicitation) |
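The parallel-calls pattern the client examples above demonstrate comes down to starting every tool call before awaiting any of them. A minimal, runnable sketch follows; the `ToolCaller` interface and `fakeClient` stub are illustrative assumptions so the snippet needs no server, while the real examples use the SDK's `Client` over a transport (see `parallelToolCallsClient.ts`).

```typescript
// Minimal shape of the call we need; the SDK's Client exposes a richer
// callTool API, and this stub only mirrors the part exercised here.
interface ToolCaller {
    callTool(req: { name: string; arguments: Record<string, unknown> }): Promise<string>;
}

// Stand-in client so the sketch runs without a live MCP server.
const fakeClient: ToolCaller = {
    async callTool(req) {
        return `result:${req.name}`;
    }
};

// Start all calls first, then await them together, so requests overlap
// instead of running strictly one after another.
async function callToolsInParallel(client: ToolCaller, names: string[]): Promise<string[]> {
    return Promise.all(names.map(name => client.callTool({ name, arguments: {} })));
}
```

With the real client the structure is identical; only the transport setup and the result type differ.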
Shared utilities:

- In-memory event store for resumability: [`inMemoryEventStore.ts`](src/examples/shared/inMemoryEventStore.ts) (see [`server.md`](docs/server.md)).

For more details on how to run these examples (including recommended commands and deployment diagrams), see `src/examples/README.md`.
## Documentation

- Local SDK docs:
    - [docs/server.md](docs/server.md) – building and running MCP servers, transports, tools/resources/prompts, CORS, DNS rebinding, and multi-node deployment.
    - [docs/client.md](docs/client.md) – using the high-level client, transports, backwards compatibility, and OAuth helpers.
    - [docs/capabilities.md](docs/capabilities.md) – sampling, elicitation (form and URL), and experimental task-based execution.
    - [docs/protocol.md](docs/protocol.md) – protocol features: ping, progress, cancellation, pagination, capability negotiation, and JSON Schema.
    - [docs/faq.md](docs/faq.md) – environment and troubleshooting FAQs (including Node.js Web Crypto support).
- External references:
    - [Model Context Protocol documentation](https://modelcontextprotocol.io)
    - [MCP Specification](https://spec.modelcontextprotocol.io)
    - [Example Servers](https://github.com/modelcontextprotocol/servers)
## Contributing

Issues and pull requests are welcome on GitHub at <https://github.com/modelcontextprotocol/typescript-sdk>.

## License

This project is licensed under the MIT License; see the [LICENSE](LICENSE) file for details.
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/auth-extensions.d.ts (190 lines, generated, vendored, new file)
@@ -0,0 +1,190 @@
/**
 * OAuth provider extensions for specialized authentication flows.
 *
 * This module provides ready-to-use OAuthClientProvider implementations
 * for common machine-to-machine authentication scenarios.
 */
import { OAuthClientInformation, OAuthClientMetadata, OAuthTokens } from '../shared/auth.js';
import { AddClientAuthentication, OAuthClientProvider } from './auth.js';
/**
 * Helper to produce a private_key_jwt client authentication function.
 *
 * Usage:
 *   const addClientAuth = createPrivateKeyJwtAuth({ issuer, subject, privateKey, alg, audience? });
 *   // pass addClientAuth as provider.addClientAuthentication implementation
 */
export declare function createPrivateKeyJwtAuth(options: {
    issuer: string;
    subject: string;
    privateKey: string | Uint8Array | Record<string, unknown>;
    alg: string;
    audience?: string | URL;
    lifetimeSeconds?: number;
    claims?: Record<string, unknown>;
}): AddClientAuthentication;
/**
 * Options for creating a ClientCredentialsProvider.
 */
export interface ClientCredentialsProviderOptions {
    /**
     * The client_id for this OAuth client.
     */
    clientId: string;
    /**
     * The client_secret for client_secret_basic authentication.
     */
    clientSecret: string;
    /**
     * Optional client name for metadata.
     */
    clientName?: string;
    /**
     * Space-separated scope values requested by the client.
     */
    scope?: string;
}
/**
 * OAuth provider for client_credentials grant with client_secret_basic authentication.
 *
 * This provider is designed for machine-to-machine authentication where
 * the client authenticates using a client_id and client_secret.
 *
 * @example
 * const provider = new ClientCredentialsProvider({
 *     clientId: 'my-client',
 *     clientSecret: 'my-secret'
 * });
 *
 * const transport = new StreamableHTTPClientTransport(serverUrl, {
 *     authProvider: provider
 * });
 */
export declare class ClientCredentialsProvider implements OAuthClientProvider {
    private _tokens?;
    private _clientInfo;
    private _clientMetadata;
    constructor(options: ClientCredentialsProviderOptions);
    get redirectUrl(): undefined;
    get clientMetadata(): OAuthClientMetadata;
    clientInformation(): OAuthClientInformation;
    saveClientInformation(info: OAuthClientInformation): void;
    tokens(): OAuthTokens | undefined;
    saveTokens(tokens: OAuthTokens): void;
    redirectToAuthorization(): void;
    saveCodeVerifier(): void;
    codeVerifier(): string;
    prepareTokenRequest(scope?: string): URLSearchParams;
}
/**
 * Options for creating a PrivateKeyJwtProvider.
 */
export interface PrivateKeyJwtProviderOptions {
    /**
     * The client_id for this OAuth client.
     */
    clientId: string;
    /**
     * The private key for signing JWT assertions.
     * Can be a PEM string, Uint8Array, or JWK object.
     */
    privateKey: string | Uint8Array | Record<string, unknown>;
    /**
     * The algorithm to use for signing (e.g., 'RS256', 'ES256').
     */
    algorithm: string;
    /**
     * Optional client name for metadata.
     */
    clientName?: string;
    /**
     * Optional JWT lifetime in seconds (default: 300).
     */
    jwtLifetimeSeconds?: number;
    /**
     * Space-separated scope values requested by the client.
     */
    scope?: string;
}
/**
 * OAuth provider for client_credentials grant with private_key_jwt authentication.
 *
 * This provider is designed for machine-to-machine authentication where
 * the client authenticates using a signed JWT assertion (RFC 7523 Section 2.2).
 *
 * @example
 * const provider = new PrivateKeyJwtProvider({
 *     clientId: 'my-client',
 *     privateKey: pemEncodedPrivateKey,
 *     algorithm: 'RS256'
 * });
 *
 * const transport = new StreamableHTTPClientTransport(serverUrl, {
 *     authProvider: provider
 * });
 */
export declare class PrivateKeyJwtProvider implements OAuthClientProvider {
    private _tokens?;
    private _clientInfo;
    private _clientMetadata;
    addClientAuthentication: AddClientAuthentication;
    constructor(options: PrivateKeyJwtProviderOptions);
    get redirectUrl(): undefined;
    get clientMetadata(): OAuthClientMetadata;
    clientInformation(): OAuthClientInformation;
    saveClientInformation(info: OAuthClientInformation): void;
    tokens(): OAuthTokens | undefined;
    saveTokens(tokens: OAuthTokens): void;
    redirectToAuthorization(): void;
    saveCodeVerifier(): void;
    codeVerifier(): string;
    prepareTokenRequest(scope?: string): URLSearchParams;
}
/**
 * Options for creating a StaticPrivateKeyJwtProvider.
 */
export interface StaticPrivateKeyJwtProviderOptions {
    /**
     * The client_id for this OAuth client.
     */
    clientId: string;
    /**
     * A pre-built JWT client assertion to use for authentication.
     *
     * This token should already contain the appropriate claims
     * (iss, sub, aud, exp, etc.) and be signed by the client's key.
     */
    jwtBearerAssertion: string;
    /**
     * Optional client name for metadata.
     */
    clientName?: string;
    /**
     * Space-separated scope values requested by the client.
     */
    scope?: string;
}
/**
 * OAuth provider for client_credentials grant with a static private_key_jwt assertion.
 *
 * This provider mirrors {@link PrivateKeyJwtProvider} but instead of constructing and
 * signing a JWT on each request, it accepts a pre-built JWT assertion string and
 * uses it directly for authentication.
 */
export declare class StaticPrivateKeyJwtProvider implements OAuthClientProvider {
    private _tokens?;
    private _clientInfo;
    private _clientMetadata;
    addClientAuthentication: AddClientAuthentication;
    constructor(options: StaticPrivateKeyJwtProviderOptions);
    get redirectUrl(): undefined;
    get clientMetadata(): OAuthClientMetadata;
    clientInformation(): OAuthClientInformation;
    saveClientInformation(info: OAuthClientInformation): void;
    tokens(): OAuthTokens | undefined;
    saveTokens(tokens: OAuthTokens): void;
    redirectToAuthorization(): void;
    saveCodeVerifier(): void;
    codeVerifier(): string;
    prepareTokenRequest(scope?: string): URLSearchParams;
}
//# sourceMappingURL=auth-extensions.d.ts.map
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/auth-extensions.d.ts.map (1 line, generated, vendored, new file)
@@ -0,0 +1 @@
{"version":3,"file":"auth-extensions.d.ts","sourceRoot":"","sources":["../../../src/client/auth-extensions.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAGH,OAAO,EAAE,sBAAsB,EAAE,mBAAmB,EAAE,WAAW,EAAE,MAAM,mBAAmB,CAAC;AAC7F,OAAO,EAAE,uBAAuB,EAAE,mBAAmB,EAAE,MAAM,WAAW,CAAC;AAEzE;;;;;;GAMG;AACH,wBAAgB,uBAAuB,CAAC,OAAO,EAAE;IAC7C,MAAM,EAAE,MAAM,CAAC;IACf,OAAO,EAAE,MAAM,CAAC;IAChB,UAAU,EAAE,MAAM,GAAG,UAAU,GAAG,MAAM,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;IAC1D,GAAG,EAAE,MAAM,CAAC;IACZ,QAAQ,CAAC,EAAE,MAAM,GAAG,GAAG,CAAC;IACxB,eAAe,CAAC,EAAE,MAAM,CAAC;IACzB,MAAM,CAAC,EAAE,MAAM,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;CACpC,GAAG,uBAAuB,CAgE1B;AAED;;GAEG;AACH,MAAM,WAAW,gCAAgC;IAC7C;;OAEG;IACH,QAAQ,EAAE,MAAM,CAAC;IAEjB;;OAEG;IACH,YAAY,EAAE,MAAM,CAAC;IAErB;;OAEG;IACH,UAAU,CAAC,EAAE,MAAM,CAAC;IAEpB;;OAEG;IACH,KAAK,CAAC,EAAE,MAAM,CAAC;CAClB;AAED;;;;;;;;;;;;;;;GAeG;AACH,qBAAa,yBAA0B,YAAW,mBAAmB;IACjE,OAAO,CAAC,OAAO,CAAC,CAAc;IAC9B,OAAO,CAAC,WAAW,CAAyB;IAC5C,OAAO,CAAC,eAAe,CAAsB;gBAEjC,OAAO,EAAE,gCAAgC;IAcrD,IAAI,WAAW,IAAI,SAAS,CAE3B;IAED,IAAI,cAAc,IAAI,mBAAmB,CAExC;IAED,iBAAiB,IAAI,sBAAsB;IAI3C,qBAAqB,CAAC,IAAI,EAAE,sBAAsB,GAAG,IAAI;IAIzD,MAAM,IAAI,WAAW,GAAG,SAAS;IAIjC,UAAU,CAAC,MAAM,EAAE,WAAW,GAAG,IAAI;IAIrC,uBAAuB,IAAI,IAAI;IAI/B,gBAAgB,IAAI,IAAI;IAIxB,YAAY,IAAI,MAAM;IAItB,mBAAmB,CAAC,KAAK,CAAC,EAAE,MAAM,GAAG,eAAe;CAKvD;AAED;;GAEG;AACH,MAAM,WAAW,4BAA4B;IACzC;;OAEG;IACH,QAAQ,EAAE,MAAM,CAAC;IAEjB;;;OAGG;IACH,UAAU,EAAE,MAAM,GAAG,UAAU,GAAG,MAAM,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;IAE1D;;OAEG;IACH,SAAS,EAAE,MAAM,CAAC;IAElB;;OAEG;IACH,UAAU,CAAC,EAAE,MAAM,CAAC;IAEpB;;OAEG;IACH,kBAAkB,CAAC,EAAE,MAAM,CAAC;IAE5B;;OAEG;IACH,KAAK,CAAC,EAAE,MAAM,CAAC;CAClB;AAED;;;;;;;;;;;;;;;;GAgBG;AACH,qBAAa,qBAAsB,YAAW,mBAAmB;IAC7D,OAAO,CAAC,OAAO,CAAC,CAAc;IAC9B,OAAO,CAAC,WAAW,CAAyB;IAC5C,OAAO,CAAC,eAAe,CAAsB;IAC7C,uBAAuB,EAAE,uBAAuB,CAAC;gBAErC,OAAO,EAAE,4BAA4B;IAoBjD,IAAI,WAAW,IAAI,SAAS,CAE3B;IAED,IAAI,cAAc,IAAI,mBAAmB,CAExC;IAED,iBAAiB,IAAI,sBAAsB;IAI3C,qBAAqB,CAAC,IAAI,EAAE,sBAAsB,GAAG,IAAI;IAIzD,MAAM,IAAI,WAAW,GAAG,SAAS;IAIjC,UA
AU,CAAC,MAAM,EAAE,WAAW,GAAG,IAAI;IAIrC,uBAAuB,IAAI,IAAI;IAI/B,gBAAgB,IAAI,IAAI;IAIxB,YAAY,IAAI,MAAM;IAItB,mBAAmB,CAAC,KAAK,CAAC,EAAE,MAAM,GAAG,eAAe;CAKvD;AAED;;GAEG;AACH,MAAM,WAAW,kCAAkC;IAC/C;;OAEG;IACH,QAAQ,EAAE,MAAM,CAAC;IAEjB;;;;;OAKG;IACH,kBAAkB,EAAE,MAAM,CAAC;IAE3B;;OAEG;IACH,UAAU,CAAC,EAAE,MAAM,CAAC;IAEpB;;OAEG;IACH,KAAK,CAAC,EAAE,MAAM,CAAC;CAClB;AAED;;;;;;GAMG;AACH,qBAAa,2BAA4B,YAAW,mBAAmB;IACnE,OAAO,CAAC,OAAO,CAAC,CAAc;IAC9B,OAAO,CAAC,WAAW,CAAyB;IAC5C,OAAO,CAAC,eAAe,CAAsB;IAC7C,uBAAuB,EAAE,uBAAuB,CAAC;gBAErC,OAAO,EAAE,kCAAkC;IAmBvD,IAAI,WAAW,IAAI,SAAS,CAE3B;IAED,IAAI,cAAc,IAAI,mBAAmB,CAExC;IAED,iBAAiB,IAAI,sBAAsB;IAI3C,qBAAqB,CAAC,IAAI,EAAE,sBAAsB,GAAG,IAAI;IAIzD,MAAM,IAAI,WAAW,GAAG,SAAS;IAIjC,UAAU,CAAC,MAAM,EAAE,WAAW,GAAG,IAAI;IAIrC,uBAAuB,IAAI,IAAI;IAI/B,gBAAgB,IAAI,IAAI;IAIxB,YAAY,IAAI,MAAM;IAItB,mBAAmB,CAAC,KAAK,CAAC,EAAE,MAAM,GAAG,eAAe;CAKvD"}
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/auth-extensions.js (299 lines, generated, vendored, new file)
@@ -0,0 +1,299 @@
"use strict";
/**
 * OAuth provider extensions for specialized authentication flows.
 *
 * This module provides ready-to-use OAuthClientProvider implementations
 * for common machine-to-machine authentication scenarios.
 */
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || function (mod) {
    if (mod && mod.__esModule) return mod;
    var result = {};
    if (mod != null) for (var k in mod) if (k !== "default" && Object.prototype.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);
    __setModuleDefault(result, mod);
    return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.StaticPrivateKeyJwtProvider = exports.PrivateKeyJwtProvider = exports.ClientCredentialsProvider = void 0;
exports.createPrivateKeyJwtAuth = createPrivateKeyJwtAuth;
/**
 * Helper to produce a private_key_jwt client authentication function.
 *
 * Usage:
 *   const addClientAuth = createPrivateKeyJwtAuth({ issuer, subject, privateKey, alg, audience? });
 *   // pass addClientAuth as provider.addClientAuthentication implementation
 */
function createPrivateKeyJwtAuth(options) {
    return async (_headers, params, url, metadata) => {
        // Lazy import to avoid heavy dependency unless used
        if (typeof globalThis.crypto === 'undefined') {
            throw new TypeError('crypto is not available; please ensure you have Web Crypto API support for older Node.js versions (see https://github.com/modelcontextprotocol/typescript-sdk#nodejs-web-crypto-globalthiscrypto-compatibility)');
        }
        const jose = await Promise.resolve().then(() => __importStar(require('jose')));
        const audience = String(options.audience ?? metadata?.issuer ?? url);
        const lifetimeSeconds = options.lifetimeSeconds ?? 300;
        const now = Math.floor(Date.now() / 1000);
        const jti = `${Date.now()}-${Math.random().toString(36).slice(2)}`;
        const baseClaims = {
            iss: options.issuer,
            sub: options.subject,
            aud: audience,
            exp: now + lifetimeSeconds,
            iat: now,
            jti
        };
        const claims = options.claims ? { ...baseClaims, ...options.claims } : baseClaims;
        // Import key for the requested algorithm
        const alg = options.alg;
        let key;
        if (typeof options.privateKey === 'string') {
            if (alg.startsWith('RS') || alg.startsWith('ES') || alg.startsWith('PS')) {
                key = await jose.importPKCS8(options.privateKey, alg);
            }
            else if (alg.startsWith('HS')) {
                key = new TextEncoder().encode(options.privateKey);
            }
            else {
                throw new Error(`Unsupported algorithm ${alg}`);
            }
        }
        else if (options.privateKey instanceof Uint8Array) {
            if (alg.startsWith('HS')) {
                key = options.privateKey;
            }
            else {
                // Assume PKCS#8 DER in Uint8Array for asymmetric algorithms
                key = await jose.importPKCS8(new TextDecoder().decode(options.privateKey), alg);
            }
        }
        else {
            // Treat as JWK
            key = await jose.importJWK(options.privateKey, alg);
        }
        // Sign JWT
        const assertion = await new jose.SignJWT(claims)
            .setProtectedHeader({ alg, typ: 'JWT' })
            .setIssuer(options.issuer)
            .setSubject(options.subject)
            .setAudience(audience)
            .setIssuedAt(now)
            .setExpirationTime(now + lifetimeSeconds)
            .setJti(jti)
            .sign(key);
        params.set('client_assertion', assertion);
        params.set('client_assertion_type', 'urn:ietf:params:oauth:client-assertion-type:jwt-bearer');
    };
}
/**
 * OAuth provider for client_credentials grant with client_secret_basic authentication.
 *
 * This provider is designed for machine-to-machine authentication where
 * the client authenticates using a client_id and client_secret.
 *
 * @example
 * const provider = new ClientCredentialsProvider({
 *     clientId: 'my-client',
 *     clientSecret: 'my-secret'
 * });
 *
 * const transport = new StreamableHTTPClientTransport(serverUrl, {
 *     authProvider: provider
 * });
 */
class ClientCredentialsProvider {
    constructor(options) {
        this._clientInfo = {
            client_id: options.clientId,
            client_secret: options.clientSecret
        };
        this._clientMetadata = {
            client_name: options.clientName ?? 'client-credentials-client',
            redirect_uris: [],
            grant_types: ['client_credentials'],
            token_endpoint_auth_method: 'client_secret_basic',
            scope: options.scope
        };
    }
    get redirectUrl() {
        return undefined;
    }
    get clientMetadata() {
        return this._clientMetadata;
    }
    clientInformation() {
        return this._clientInfo;
    }
    saveClientInformation(info) {
        this._clientInfo = info;
    }
    tokens() {
        return this._tokens;
    }
    saveTokens(tokens) {
        this._tokens = tokens;
    }
    redirectToAuthorization() {
        throw new Error('redirectToAuthorization is not used for client_credentials flow');
    }
    saveCodeVerifier() {
        // Not used for client_credentials
    }
    codeVerifier() {
        throw new Error('codeVerifier is not used for client_credentials flow');
    }
    prepareTokenRequest(scope) {
        const params = new URLSearchParams({ grant_type: 'client_credentials' });
        if (scope)
            params.set('scope', scope);
        return params;
    }
}
exports.ClientCredentialsProvider = ClientCredentialsProvider;
/**
 * OAuth provider for client_credentials grant with private_key_jwt authentication.
 *
 * This provider is designed for machine-to-machine authentication where
 * the client authenticates using a signed JWT assertion (RFC 7523 Section 2.2).
 *
 * @example
 * const provider = new PrivateKeyJwtProvider({
 *     clientId: 'my-client',
 *     privateKey: pemEncodedPrivateKey,
 *     algorithm: 'RS256'
 * });
 *
 * const transport = new StreamableHTTPClientTransport(serverUrl, {
 *     authProvider: provider
 * });
 */
class PrivateKeyJwtProvider {
    constructor(options) {
        this._clientInfo = {
            client_id: options.clientId
        };
        this._clientMetadata = {
            client_name: options.clientName ?? 'private-key-jwt-client',
            redirect_uris: [],
            grant_types: ['client_credentials'],
            token_endpoint_auth_method: 'private_key_jwt',
            scope: options.scope
        };
        this.addClientAuthentication = createPrivateKeyJwtAuth({
            issuer: options.clientId,
            subject: options.clientId,
            privateKey: options.privateKey,
            alg: options.algorithm,
            lifetimeSeconds: options.jwtLifetimeSeconds
        });
    }
    get redirectUrl() {
        return undefined;
    }
    get clientMetadata() {
        return this._clientMetadata;
    }
    clientInformation() {
        return this._clientInfo;
    }
    saveClientInformation(info) {
        this._clientInfo = info;
    }
    tokens() {
        return this._tokens;
    }
    saveTokens(tokens) {
        this._tokens = tokens;
    }
    redirectToAuthorization() {
        throw new Error('redirectToAuthorization is not used for client_credentials flow');
    }
    saveCodeVerifier() {
        // Not used for client_credentials
    }
    codeVerifier() {
        throw new Error('codeVerifier is not used for client_credentials flow');
    }
    prepareTokenRequest(scope) {
        const params = new URLSearchParams({ grant_type: 'client_credentials' });
        if (scope)
            params.set('scope', scope);
        return params;
    }
}
exports.PrivateKeyJwtProvider = PrivateKeyJwtProvider;
/**
 * OAuth provider for client_credentials grant with a static private_key_jwt assertion.
 *
 * This provider mirrors {@link PrivateKeyJwtProvider} but instead of constructing and
 * signing a JWT on each request, it accepts a pre-built JWT assertion string and
 * uses it directly for authentication.
 */
class StaticPrivateKeyJwtProvider {
    constructor(options) {
        this._clientInfo = {
            client_id: options.clientId
        };
        this._clientMetadata = {
            client_name: options.clientName ?? 'static-private-key-jwt-client',
            redirect_uris: [],
            grant_types: ['client_credentials'],
            token_endpoint_auth_method: 'private_key_jwt',
            scope: options.scope
        };
        const assertion = options.jwtBearerAssertion;
        this.addClientAuthentication = async (_headers, params) => {
            params.set('client_assertion', assertion);
            params.set('client_assertion_type', 'urn:ietf:params:oauth:client-assertion-type:jwt-bearer');
        };
    }
    get redirectUrl() {
        return undefined;
    }
    get clientMetadata() {
        return this._clientMetadata;
    }
    clientInformation() {
        return this._clientInfo;
    }
    saveClientInformation(info) {
        this._clientInfo = info;
    }
    tokens() {
        return this._tokens;
    }
    saveTokens(tokens) {
        this._tokens = tokens;
    }
    redirectToAuthorization() {
        throw new Error('redirectToAuthorization is not used for client_credentials flow');
    }
    saveCodeVerifier() {
        // Not used for client_credentials
    }
    codeVerifier() {
        throw new Error('codeVerifier is not used for client_credentials flow');
    }
    prepareTokenRequest(scope) {
        const params = new URLSearchParams({ grant_type: 'client_credentials' });
        if (scope)
            params.set('scope', scope);
        return params;
    }
}
exports.StaticPrivateKeyJwtProvider = StaticPrivateKeyJwtProvider;
//# sourceMappingURL=auth-extensions.js.map
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/auth-extensions.js.map (1 line, generated, vendored, new file)
File diff suppressed because one or more lines are too long
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/auth.d.ts (446 lines, generated, vendored, new file)
@@ -0,0 +1,446 @@
import { OAuthClientMetadata, OAuthClientInformationMixed, OAuthTokens, OAuthMetadata, OAuthClientInformationFull, OAuthProtectedResourceMetadata, AuthorizationServerMetadata } from '../shared/auth.js';
import { OAuthError } from '../server/auth/errors.js';
import { FetchLike } from '../shared/transport.js';
/**
 * Function type for adding client authentication to token requests.
 */
export type AddClientAuthentication = (headers: Headers, params: URLSearchParams, url: string | URL, metadata?: AuthorizationServerMetadata) => void | Promise<void>;
/**
 * Implements an end-to-end OAuth client to be used with one MCP server.
 *
 * This client relies upon a concept of an authorized "session," the exact
 * meaning of which is application-defined. Tokens, authorization codes, and
 * code verifiers should not cross different sessions.
 */
export interface OAuthClientProvider {
    /**
     * The URL to redirect the user agent to after authorization.
     * Return undefined for non-interactive flows that don't require user interaction
     * (e.g., client_credentials, jwt-bearer).
     */
    get redirectUrl(): string | URL | undefined;
    /**
     * External URL the server should use to fetch the client metadata document.
     */
    clientMetadataUrl?: string;
    /**
     * Metadata about this OAuth client.
     */
    get clientMetadata(): OAuthClientMetadata;
    /**
     * Returns an OAuth2 state parameter.
     */
    state?(): string | Promise<string>;
    /**
     * Loads information about this OAuth client, as registered already with the
     * server, or returns `undefined` if the client is not registered with the
     * server.
     */
    clientInformation(): OAuthClientInformationMixed | undefined | Promise<OAuthClientInformationMixed | undefined>;
    /**
     * If implemented, this permits the OAuth client to dynamically register with
     * the server. Client information saved this way should later be read via
     * `clientInformation()`.
     *
     * This method is not required to be implemented if client information is
     * statically known (e.g., pre-registered).
     */
    saveClientInformation?(clientInformation: OAuthClientInformationMixed): void | Promise<void>;
    /**
     * Loads any existing OAuth tokens for the current session, or returns
     * `undefined` if there are no saved tokens.
     */
    tokens(): OAuthTokens | undefined | Promise<OAuthTokens | undefined>;
    /**
     * Stores new OAuth tokens for the current session, after a successful
     * authorization.
     */
    saveTokens(tokens: OAuthTokens): void | Promise<void>;
    /**
     * Invoked to redirect the user agent to the given URL to begin the authorization flow.
     */
    redirectToAuthorization(authorizationUrl: URL): void | Promise<void>;
    /**
     * Saves a PKCE code verifier for the current session, before redirecting to
     * the authorization flow.
     */
    saveCodeVerifier(codeVerifier: string): void | Promise<void>;
    /**
     * Loads the PKCE code verifier for the current session, necessary to validate
     * the authorization result.
     */
    codeVerifier(): string | Promise<string>;
    /**
     * Adds custom client authentication to OAuth token requests.
     *
     * This optional method allows implementations to customize how client credentials
     * are included in token exchange and refresh requests. When provided, this method
     * is called instead of the default authentication logic, giving full control over
     * the authentication mechanism.
     *
     * Common use cases include:
     * - Supporting authentication methods beyond the standard OAuth 2.0 methods
     * - Adding custom headers for proprietary authentication schemes
     * - Implementing client assertion-based authentication (e.g., JWT bearer tokens)
     *
     * @param headers - The request headers (can be modified to add authentication)
     * @param params - The request body parameters (can be modified to add credentials)
     * @param url - The token endpoint URL being called
     * @param metadata - Optional OAuth metadata for the server, which may include supported authentication methods
     */
    addClientAuthentication?: AddClientAuthentication;
    /**
     * If defined, overrides the selection and validation of the
     * RFC 8707 Resource Indicator. If left undefined, default
     * validation behavior will be used.
     *
     * Implementations must verify the returned resource matches the MCP server.
     */
    validateResourceURL?(serverUrl: string | URL, resource?: string): Promise<URL | undefined>;
    /**
     * If implemented, provides a way for the client to invalidate (e.g. delete) the specified
     * credentials, in the case where the server has indicated that they are no longer valid.
     * This avoids requiring the user to intervene manually.
     */
    invalidateCredentials?(scope: 'all' | 'client' | 'tokens' | 'verifier' | 'discovery'): void | Promise<void>;
    /**
     * Prepares grant-specific parameters for a token request.
     *
     * This optional method allows providers to customize the token request based on
     * the grant type they support. When implemented, it returns the grant type and
     * any grant-specific parameters needed for the token exchange.
     *
     * If not implemented, the default behavior depends on the flow:
     * - For authorization code flow: uses code, code_verifier, and redirect_uri
     * - For client_credentials: detected via grant_types in clientMetadata
     *
     * @param scope - Optional scope to request
     * @returns Grant type and parameters, or undefined to use default behavior
     *
     * @example
     * // For client_credentials grant:
     * prepareTokenRequest(scope) {
     *     return {
     *         grantType: 'client_credentials',
     *         params: scope ? { scope } : {}
     *     };
     * }
     *
     * @example
     * // For authorization_code grant (default behavior):
* async prepareTokenRequest() {
|
||||
* return {
|
||||
* grantType: 'authorization_code',
|
||||
* params: {
|
||||
* code: this.authorizationCode,
|
||||
* code_verifier: await this.codeVerifier(),
|
||||
* redirect_uri: String(this.redirectUrl)
|
||||
* }
|
||||
* };
|
||||
* }
|
||||
*/
|
||||
prepareTokenRequest?(scope?: string): URLSearchParams | Promise<URLSearchParams | undefined> | undefined;
|
||||
/**
|
||||
* Saves the OAuth discovery state after RFC 9728 and authorization server metadata
|
||||
* discovery. Providers can persist this state to avoid redundant discovery requests
|
||||
* on subsequent {@linkcode auth} calls.
|
||||
*
|
||||
* This state can also be provided out-of-band (e.g., from a previous session or
|
||||
* external configuration) to bootstrap the OAuth flow without discovery.
|
||||
*
|
||||
* Called by {@linkcode auth} after successful discovery.
|
||||
*/
|
||||
saveDiscoveryState?(state: OAuthDiscoveryState): void | Promise<void>;
|
||||
/**
|
||||
* Returns previously saved discovery state, or `undefined` if none is cached.
|
||||
*
|
||||
* When available, {@linkcode auth} restores the discovery state (authorization server
|
||||
* URL, resource metadata, etc.) instead of performing RFC 9728 discovery, reducing
|
||||
* latency on subsequent calls.
|
||||
*
|
||||
* Providers should clear cached discovery state on repeated authentication failures
|
||||
* (via {@linkcode invalidateCredentials} with scope `'discovery'` or `'all'`) to allow
|
||||
* re-discovery in case the authorization server has changed.
|
||||
*/
|
||||
discoveryState?(): OAuthDiscoveryState | undefined | Promise<OAuthDiscoveryState | undefined>;
|
||||
}
|
||||
/**
|
||||
* Discovery state that can be persisted across sessions by an {@linkcode OAuthClientProvider}.
|
||||
*
|
||||
* Contains the results of RFC 9728 protected resource metadata discovery and
|
||||
* authorization server metadata discovery. Persisting this state avoids
|
||||
* redundant discovery HTTP requests on subsequent {@linkcode auth} calls.
|
||||
*/
|
||||
export interface OAuthDiscoveryState extends OAuthServerInfo {
|
||||
/** The URL at which the protected resource metadata was found, if available. */
|
||||
resourceMetadataUrl?: string;
|
||||
}
|
||||
export type AuthResult = 'AUTHORIZED' | 'REDIRECT';
|
||||
export declare class UnauthorizedError extends Error {
|
||||
constructor(message?: string);
|
||||
}
|
||||
type ClientAuthMethod = 'client_secret_basic' | 'client_secret_post' | 'none';
|
||||
/**
|
||||
* Determines the best client authentication method to use based on server support and client configuration.
|
||||
*
|
||||
* Priority order (highest to lowest):
|
||||
* 1. client_secret_basic (if client secret is available)
|
||||
* 2. client_secret_post (if client secret is available)
|
||||
* 3. none (for public clients)
|
||||
*
|
||||
* @param clientInformation - OAuth client information containing credentials
|
||||
* @param supportedMethods - Authentication methods supported by the authorization server
|
||||
* @returns The selected authentication method
|
||||
*/
|
||||
export declare function selectClientAuthMethod(clientInformation: OAuthClientInformationMixed, supportedMethods: string[]): ClientAuthMethod;
|
||||
/**
|
||||
* Parses an OAuth error response from a string or Response object.
|
||||
*
|
||||
* If the input is a standard OAuth2.0 error response, it will be parsed according to the spec
|
||||
* and an instance of the appropriate OAuthError subclass will be returned.
|
||||
* If parsing fails, it falls back to a generic ServerError that includes
|
||||
* the response status (if available) and original content.
|
||||
*
|
||||
* @param input - A Response object or string containing the error response
|
||||
* @returns A Promise that resolves to an OAuthError instance
|
||||
*/
|
||||
export declare function parseErrorResponse(input: Response | string): Promise<OAuthError>;
|
||||
/**
|
||||
* Orchestrates the full auth flow with a server.
|
||||
*
|
||||
* This can be used as a single entry point for all authorization functionality,
|
||||
* instead of linking together the other lower-level functions in this module.
|
||||
*/
|
||||
export declare function auth(provider: OAuthClientProvider, options: {
|
||||
serverUrl: string | URL;
|
||||
authorizationCode?: string;
|
||||
scope?: string;
|
||||
resourceMetadataUrl?: URL;
|
||||
fetchFn?: FetchLike;
|
||||
}): Promise<AuthResult>;
|
||||
/**
|
||||
* SEP-991: URL-based Client IDs
|
||||
* Validate that the client_id is a valid URL with https scheme
|
||||
*/
|
||||
export declare function isHttpsUrl(value?: string): boolean;
|
||||
export declare function selectResourceURL(serverUrl: string | URL, provider: OAuthClientProvider, resourceMetadata?: OAuthProtectedResourceMetadata): Promise<URL | undefined>;
|
||||
/**
|
||||
* Extract resource_metadata, scope, and error from WWW-Authenticate header.
|
||||
*/
|
||||
export declare function extractWWWAuthenticateParams(res: Response): {
|
||||
resourceMetadataUrl?: URL;
|
||||
scope?: string;
|
||||
error?: string;
|
||||
};
|
||||
/**
|
||||
* Extract resource_metadata from response header.
|
||||
* @deprecated Use `extractWWWAuthenticateParams` instead.
|
||||
*/
|
||||
export declare function extractResourceMetadataUrl(res: Response): URL | undefined;
|
||||
/**
|
||||
* Looks up RFC 9728 OAuth 2.0 Protected Resource Metadata.
|
||||
*
|
||||
* If the server returns a 404 for the well-known endpoint, this function will
|
||||
* return `undefined`. Any other errors will be thrown as exceptions.
|
||||
*/
|
||||
export declare function discoverOAuthProtectedResourceMetadata(serverUrl: string | URL, opts?: {
|
||||
protocolVersion?: string;
|
||||
resourceMetadataUrl?: string | URL;
|
||||
}, fetchFn?: FetchLike): Promise<OAuthProtectedResourceMetadata>;
|
||||
/**
|
||||
* Looks up RFC 8414 OAuth 2.0 Authorization Server Metadata.
|
||||
*
|
||||
* If the server returns a 404 for the well-known endpoint, this function will
|
||||
* return `undefined`. Any other errors will be thrown as exceptions.
|
||||
*
|
||||
* @deprecated This function is deprecated in favor of `discoverAuthorizationServerMetadata`.
|
||||
*/
|
||||
export declare function discoverOAuthMetadata(issuer: string | URL, { authorizationServerUrl, protocolVersion }?: {
|
||||
authorizationServerUrl?: string | URL;
|
||||
protocolVersion?: string;
|
||||
}, fetchFn?: FetchLike): Promise<OAuthMetadata | undefined>;
|
||||
/**
|
||||
* Builds a list of discovery URLs to try for authorization server metadata.
|
||||
* URLs are returned in priority order:
|
||||
* 1. OAuth metadata at the given URL
|
||||
* 2. OIDC metadata endpoints at the given URL
|
||||
*/
|
||||
export declare function buildDiscoveryUrls(authorizationServerUrl: string | URL): {
|
||||
url: URL;
|
||||
type: 'oauth' | 'oidc';
|
||||
}[];
|
||||
/**
|
||||
* Discovers authorization server metadata with support for RFC 8414 OAuth 2.0 Authorization Server Metadata
|
||||
* and OpenID Connect Discovery 1.0 specifications.
|
||||
*
|
||||
* This function implements a fallback strategy for authorization server discovery:
|
||||
* 1. Attempts RFC 8414 OAuth metadata discovery first
|
||||
* 2. If OAuth discovery fails, falls back to OpenID Connect Discovery
|
||||
*
|
||||
* @param authorizationServerUrl - The authorization server URL obtained from the MCP Server's
|
||||
* protected resource metadata, or the MCP server's URL if the
|
||||
* metadata was not found.
|
||||
* @param options - Configuration options
|
||||
* @param options.fetchFn - Optional fetch function for making HTTP requests, defaults to global fetch
|
||||
* @param options.protocolVersion - MCP protocol version to use, defaults to LATEST_PROTOCOL_VERSION
|
||||
* @returns Promise resolving to authorization server metadata, or undefined if discovery fails
|
||||
*/
|
||||
export declare function discoverAuthorizationServerMetadata(authorizationServerUrl: string | URL, { fetchFn, protocolVersion }?: {
|
||||
fetchFn?: FetchLike;
|
||||
protocolVersion?: string;
|
||||
}): Promise<AuthorizationServerMetadata | undefined>;
|
||||
/**
|
||||
* Result of {@linkcode discoverOAuthServerInfo}.
|
||||
*/
|
||||
export interface OAuthServerInfo {
|
||||
/**
|
||||
* The authorization server URL, either discovered via RFC 9728
|
||||
* or derived from the MCP server URL as a fallback.
|
||||
*/
|
||||
authorizationServerUrl: string;
|
||||
/**
|
||||
* The authorization server metadata (endpoints, capabilities),
|
||||
* or `undefined` if metadata discovery failed.
|
||||
*/
|
||||
authorizationServerMetadata?: AuthorizationServerMetadata;
|
||||
/**
|
||||
* The OAuth 2.0 Protected Resource Metadata from RFC 9728,
|
||||
* or `undefined` if the server does not support it.
|
||||
*/
|
||||
resourceMetadata?: OAuthProtectedResourceMetadata;
|
||||
}
|
||||
/**
|
||||
* Discovers the authorization server for an MCP server following
|
||||
* {@link https://datatracker.ietf.org/doc/html/rfc9728 | RFC 9728} (OAuth 2.0 Protected
|
||||
* Resource Metadata), with fallback to treating the server URL as the
|
||||
* authorization server.
|
||||
*
|
||||
* This function combines two discovery steps into one call:
|
||||
* 1. Probes `/.well-known/oauth-protected-resource` on the MCP server to find the
|
||||
* authorization server URL (RFC 9728).
|
||||
* 2. Fetches authorization server metadata from that URL (RFC 8414 / OpenID Connect Discovery).
|
||||
*
|
||||
* Use this when you need the authorization server metadata for operations outside the
|
||||
* {@linkcode auth} orchestrator, such as token refresh or token revocation.
|
||||
*
|
||||
* @param serverUrl - The MCP resource server URL
|
||||
* @param opts - Optional configuration
|
||||
* @param opts.resourceMetadataUrl - Override URL for the protected resource metadata endpoint
|
||||
* @param opts.fetchFn - Custom fetch function for HTTP requests
|
||||
* @returns Authorization server URL, metadata, and resource metadata (if available)
|
||||
*/
|
||||
export declare function discoverOAuthServerInfo(serverUrl: string | URL, opts?: {
|
||||
resourceMetadataUrl?: URL;
|
||||
fetchFn?: FetchLike;
|
||||
}): Promise<OAuthServerInfo>;
|
||||
/**
|
||||
* Begins the authorization flow with the given server, by generating a PKCE challenge and constructing the authorization URL.
|
||||
*/
|
||||
export declare function startAuthorization(authorizationServerUrl: string | URL, { metadata, clientInformation, redirectUrl, scope, state, resource }: {
|
||||
metadata?: AuthorizationServerMetadata;
|
||||
clientInformation: OAuthClientInformationMixed;
|
||||
redirectUrl: string | URL;
|
||||
scope?: string;
|
||||
state?: string;
|
||||
resource?: URL;
|
||||
}): Promise<{
|
||||
authorizationUrl: URL;
|
||||
codeVerifier: string;
|
||||
}>;
|
||||
/**
|
||||
* Prepares token request parameters for an authorization code exchange.
|
||||
*
|
||||
* This is the default implementation used by fetchToken when the provider
|
||||
* doesn't implement prepareTokenRequest.
|
||||
*
|
||||
* @param authorizationCode - The authorization code received from the authorization endpoint
|
||||
* @param codeVerifier - The PKCE code verifier
|
||||
* @param redirectUri - The redirect URI used in the authorization request
|
||||
* @returns URLSearchParams for the authorization_code grant
|
||||
*/
|
||||
export declare function prepareAuthorizationCodeRequest(authorizationCode: string, codeVerifier: string, redirectUri: string | URL): URLSearchParams;
|
||||
/**
|
||||
* Exchanges an authorization code for an access token with the given server.
|
||||
*
|
||||
* Supports multiple client authentication methods as specified in OAuth 2.1:
|
||||
* - Automatically selects the best authentication method based on server support
|
||||
* - Falls back to appropriate defaults when server metadata is unavailable
|
||||
*
|
||||
* @param authorizationServerUrl - The authorization server's base URL
|
||||
* @param options - Configuration object containing client info, auth code, etc.
|
||||
* @returns Promise resolving to OAuth tokens
|
||||
* @throws {Error} When token exchange fails or authentication is invalid
|
||||
*/
|
||||
export declare function exchangeAuthorization(authorizationServerUrl: string | URL, { metadata, clientInformation, authorizationCode, codeVerifier, redirectUri, resource, addClientAuthentication, fetchFn }: {
|
||||
metadata?: AuthorizationServerMetadata;
|
||||
clientInformation: OAuthClientInformationMixed;
|
||||
authorizationCode: string;
|
||||
codeVerifier: string;
|
||||
redirectUri: string | URL;
|
||||
resource?: URL;
|
||||
addClientAuthentication?: OAuthClientProvider['addClientAuthentication'];
|
||||
fetchFn?: FetchLike;
|
||||
}): Promise<OAuthTokens>;
|
||||
/**
|
||||
* Exchange a refresh token for an updated access token.
|
||||
*
|
||||
* Supports multiple client authentication methods as specified in OAuth 2.1:
|
||||
* - Automatically selects the best authentication method based on server support
|
||||
* - Preserves the original refresh token if a new one is not returned
|
||||
*
|
||||
* @param authorizationServerUrl - The authorization server's base URL
|
||||
* @param options - Configuration object containing client info, refresh token, etc.
|
||||
* @returns Promise resolving to OAuth tokens (preserves original refresh_token if not replaced)
|
||||
* @throws {Error} When token refresh fails or authentication is invalid
|
||||
*/
|
||||
export declare function refreshAuthorization(authorizationServerUrl: string | URL, { metadata, clientInformation, refreshToken, resource, addClientAuthentication, fetchFn }: {
|
||||
metadata?: AuthorizationServerMetadata;
|
||||
clientInformation: OAuthClientInformationMixed;
|
||||
refreshToken: string;
|
||||
resource?: URL;
|
||||
addClientAuthentication?: OAuthClientProvider['addClientAuthentication'];
|
||||
fetchFn?: FetchLike;
|
||||
}): Promise<OAuthTokens>;
|
||||
/**
|
||||
* Unified token fetching that works with any grant type via provider.prepareTokenRequest().
|
||||
*
|
||||
* This function provides a single entry point for obtaining tokens regardless of the
|
||||
* OAuth grant type. The provider's prepareTokenRequest() method determines which grant
|
||||
* to use and supplies the grant-specific parameters.
|
||||
*
|
||||
* @param provider - OAuth client provider that implements prepareTokenRequest()
|
||||
* @param authorizationServerUrl - The authorization server's base URL
|
||||
* @param options - Configuration for the token request
|
||||
* @returns Promise resolving to OAuth tokens
|
||||
* @throws {Error} When provider doesn't implement prepareTokenRequest or token fetch fails
|
||||
*
|
||||
* @example
|
||||
* // Provider for client_credentials:
|
||||
* class MyProvider implements OAuthClientProvider {
|
||||
* prepareTokenRequest(scope) {
|
||||
* const params = new URLSearchParams({ grant_type: 'client_credentials' });
|
||||
* if (scope) params.set('scope', scope);
|
||||
* return params;
|
||||
* }
|
||||
* // ... other methods
|
||||
* }
|
||||
*
|
||||
* const tokens = await fetchToken(provider, authServerUrl, { metadata });
|
||||
*/
|
||||
export declare function fetchToken(provider: OAuthClientProvider, authorizationServerUrl: string | URL, { metadata, resource, authorizationCode, fetchFn }?: {
|
||||
metadata?: AuthorizationServerMetadata;
|
||||
resource?: URL;
|
||||
/** Authorization code for the default authorization_code grant flow */
|
||||
authorizationCode?: string;
|
||||
fetchFn?: FetchLike;
|
||||
}): Promise<OAuthTokens>;
|
||||
/**
|
||||
* Performs OAuth 2.0 Dynamic Client Registration according to RFC 7591.
|
||||
*/
|
||||
export declare function registerClient(authorizationServerUrl: string | URL, { metadata, clientMetadata, fetchFn }: {
|
||||
metadata?: AuthorizationServerMetadata;
|
||||
clientMetadata: OAuthClientMetadata;
|
||||
fetchFn?: FetchLike;
|
||||
}): Promise<OAuthClientInformationFull>;
|
||||
export {};
|
||||
//# sourceMappingURL=auth.d.ts.map
|
||||
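The selection priority documented for `selectClientAuthMethod` above can be illustrated with a small, dependency-free sketch. All names below (`pickAuthMethod`, `ClientInfo`) are local stand-ins for illustration, not SDK exports:

```typescript
// Illustrative sketch of the documented priority order:
// registered method (if still supported) > client_secret_basic >
// client_secret_post > none, with RFC 6749 defaults when the server
// advertises no methods at all.
type ClientAuthMethod = 'client_secret_basic' | 'client_secret_post' | 'none';

interface ClientInfo {
    client_id: string;
    client_secret?: string;
    token_endpoint_auth_method?: string;
}

function pickAuthMethod(info: ClientInfo, supported: string[]): ClientAuthMethod {
    const hasSecret = info.client_secret !== undefined;
    // No advertised methods: fall back to RFC 6749 defaults.
    if (supported.length === 0) {
        return hasSecret ? 'client_secret_post' : 'none';
    }
    // Honor the method agreed at registration when the server still supports it.
    const registered = info.token_endpoint_auth_method;
    if (
        (registered === 'client_secret_basic' || registered === 'client_secret_post' || registered === 'none') &&
        supported.includes(registered)
    ) {
        return registered;
    }
    // Otherwise choose the most secure mutually supported option.
    if (hasSecret && supported.includes('client_secret_basic')) return 'client_secret_basic';
    if (hasSecret && supported.includes('client_secret_post')) return 'client_secret_post';
    if (supported.includes('none')) return 'none';
    return hasSecret ? 'client_secret_post' : 'none';
}
```

Note that a confidential client prefers `client_secret_basic` only when the server lists it; a public client (no secret) always ends up at `none`.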
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/auth.d.ts.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
925
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/auth.js
generated
vendored
Normal file
@@ -0,0 +1,925 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.UnauthorizedError = void 0;
exports.selectClientAuthMethod = selectClientAuthMethod;
exports.parseErrorResponse = parseErrorResponse;
exports.auth = auth;
exports.isHttpsUrl = isHttpsUrl;
exports.selectResourceURL = selectResourceURL;
exports.extractWWWAuthenticateParams = extractWWWAuthenticateParams;
exports.extractResourceMetadataUrl = extractResourceMetadataUrl;
exports.discoverOAuthProtectedResourceMetadata = discoverOAuthProtectedResourceMetadata;
exports.discoverOAuthMetadata = discoverOAuthMetadata;
exports.buildDiscoveryUrls = buildDiscoveryUrls;
exports.discoverAuthorizationServerMetadata = discoverAuthorizationServerMetadata;
exports.discoverOAuthServerInfo = discoverOAuthServerInfo;
exports.startAuthorization = startAuthorization;
exports.prepareAuthorizationCodeRequest = prepareAuthorizationCodeRequest;
exports.exchangeAuthorization = exchangeAuthorization;
exports.refreshAuthorization = refreshAuthorization;
exports.fetchToken = fetchToken;
exports.registerClient = registerClient;
const pkce_challenge_1 = __importDefault(require("pkce-challenge"));
const types_js_1 = require("../types.js");
const auth_js_1 = require("../shared/auth.js");
const auth_js_2 = require("../shared/auth.js");
const auth_utils_js_1 = require("../shared/auth-utils.js");
const errors_js_1 = require("../server/auth/errors.js");
class UnauthorizedError extends Error {
    constructor(message) {
        super(message ?? 'Unauthorized');
    }
}
exports.UnauthorizedError = UnauthorizedError;
function isClientAuthMethod(method) {
    return ['client_secret_basic', 'client_secret_post', 'none'].includes(method);
}
const AUTHORIZATION_CODE_RESPONSE_TYPE = 'code';
const AUTHORIZATION_CODE_CHALLENGE_METHOD = 'S256';
/**
 * Determines the best client authentication method to use based on server support and client configuration.
 *
 * Priority order (highest to lowest):
 * 1. client_secret_basic (if client secret is available)
 * 2. client_secret_post (if client secret is available)
 * 3. none (for public clients)
 *
 * @param clientInformation - OAuth client information containing credentials
 * @param supportedMethods - Authentication methods supported by the authorization server
 * @returns The selected authentication method
 */
function selectClientAuthMethod(clientInformation, supportedMethods) {
    const hasClientSecret = clientInformation.client_secret !== undefined;
    // If server doesn't specify supported methods, use RFC 6749 defaults
    if (supportedMethods.length === 0) {
        return hasClientSecret ? 'client_secret_post' : 'none';
    }
    // Prefer the method returned by the server during client registration if valid and supported
    if ('token_endpoint_auth_method' in clientInformation &&
        clientInformation.token_endpoint_auth_method &&
        isClientAuthMethod(clientInformation.token_endpoint_auth_method) &&
        supportedMethods.includes(clientInformation.token_endpoint_auth_method)) {
        return clientInformation.token_endpoint_auth_method;
    }
    // Try methods in priority order (most secure first)
    if (hasClientSecret && supportedMethods.includes('client_secret_basic')) {
        return 'client_secret_basic';
    }
    if (hasClientSecret && supportedMethods.includes('client_secret_post')) {
        return 'client_secret_post';
    }
    if (supportedMethods.includes('none')) {
        return 'none';
    }
    // Fallback: use what we have
    return hasClientSecret ? 'client_secret_post' : 'none';
}
/**
 * Applies client authentication to the request based on the specified method.
 *
 * Implements OAuth 2.1 client authentication methods:
 * - client_secret_basic: HTTP Basic authentication (RFC 6749 Section 2.3.1)
 * - client_secret_post: Credentials in request body (RFC 6749 Section 2.3.1)
 * - none: Public client authentication (RFC 6749 Section 2.1)
 *
 * @param method - The authentication method to use
 * @param clientInformation - OAuth client information containing credentials
 * @param headers - HTTP headers object to modify
 * @param params - URL search parameters to modify
 * @throws {Error} When required credentials are missing
 */
function applyClientAuthentication(method, clientInformation, headers, params) {
    const { client_id, client_secret } = clientInformation;
    switch (method) {
        case 'client_secret_basic':
            applyBasicAuth(client_id, client_secret, headers);
            return;
        case 'client_secret_post':
            applyPostAuth(client_id, client_secret, params);
            return;
        case 'none':
            applyPublicAuth(client_id, params);
            return;
        default:
            throw new Error(`Unsupported client authentication method: ${method}`);
    }
}
/**
 * Applies HTTP Basic authentication (RFC 6749 Section 2.3.1)
 */
function applyBasicAuth(clientId, clientSecret, headers) {
    if (!clientSecret) {
        throw new Error('client_secret_basic authentication requires a client_secret');
    }
    const credentials = btoa(`${clientId}:${clientSecret}`);
    headers.set('Authorization', `Basic ${credentials}`);
}
/**
 * Applies POST body authentication (RFC 6749 Section 2.3.1)
 */
function applyPostAuth(clientId, clientSecret, params) {
    params.set('client_id', clientId);
    if (clientSecret) {
        params.set('client_secret', clientSecret);
    }
}
/**
 * Applies public client authentication (RFC 6749 Section 2.1)
 */
function applyPublicAuth(clientId, params) {
    params.set('client_id', clientId);
}
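The `client_secret_basic` header built by `applyBasicAuth` above is just `Basic` plus base64 of `client_id:client_secret` (RFC 6749 Section 2.3.1). A standalone sketch of that construction (the `basicAuthHeader` name is illustrative; `Buffer` stands in for `btoa` so it also runs under Node):

```typescript
// Build the HTTP Basic credential string used for client_secret_basic:
// base64("client_id:client_secret"), prefixed with "Basic ".
function basicAuthHeader(clientId: string, clientSecret: string): string {
    const credentials = Buffer.from(`${clientId}:${clientSecret}`).toString('base64');
    return `Basic ${credentials}`;
}

console.log(basicAuthHeader('my-client', 's3cret'));
// → Basic bXktY2xpZW50OnMzY3JldA==
```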
/**
|
||||
* Parses an OAuth error response from a string or Response object.
|
||||
*
|
||||
* If the input is a standard OAuth2.0 error response, it will be parsed according to the spec
|
||||
* and an instance of the appropriate OAuthError subclass will be returned.
|
||||
* If parsing fails, it falls back to a generic ServerError that includes
|
||||
* the response status (if available) and original content.
|
||||
*
|
||||
* @param input - A Response object or string containing the error response
|
||||
* @returns A Promise that resolves to an OAuthError instance
|
||||
*/
|
||||
async function parseErrorResponse(input) {
|
||||
const statusCode = input instanceof Response ? input.status : undefined;
|
||||
const body = input instanceof Response ? await input.text() : input;
|
||||
try {
|
||||
const result = auth_js_1.OAuthErrorResponseSchema.parse(JSON.parse(body));
|
||||
const { error, error_description, error_uri } = result;
|
||||
const errorClass = errors_js_1.OAUTH_ERRORS[error] || errors_js_1.ServerError;
|
||||
return new errorClass(error_description || '', error_uri);
|
||||
}
|
||||
catch (error) {
|
||||
// Not a valid OAuth error response, but try to inform the user of the raw data anyway
|
||||
const errorMessage = `${statusCode ? `HTTP ${statusCode}: ` : ''}Invalid OAuth error response: ${error}. Raw body: ${body}`;
|
||||
return new errors_js_1.ServerError(errorMessage);
|
||||
}
|
||||
}
|
||||
/**
|
||||
* Orchestrates the full auth flow with a server.
|
||||
*
|
||||
* This can be used as a single entry point for all authorization functionality,
|
||||
* instead of linking together the other lower-level functions in this module.
|
||||
*/
|
||||
async function auth(provider, options) {
|
||||
try {
|
||||
return await authInternal(provider, options);
|
||||
}
|
||||
catch (error) {
|
||||
// Handle recoverable error types by invalidating credentials and retrying
|
||||
if (error instanceof errors_js_1.InvalidClientError || error instanceof errors_js_1.UnauthorizedClientError) {
|
||||
await provider.invalidateCredentials?.('all');
|
||||
return await authInternal(provider, options);
|
||||
}
|
||||
else if (error instanceof errors_js_1.InvalidGrantError) {
|
||||
await provider.invalidateCredentials?.('tokens');
|
||||
return await authInternal(provider, options);
|
||||
}
|
||||
// Throw otherwise
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
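The recovery strategy in `auth()` above — invalidate the stale credentials, then retry the internal flow exactly once — can be sketched in isolation. Everything below (`withAuthRetry` and the local error classes) is an illustrative stand-in, not SDK code:

```typescript
// Illustrative sketch of the invalidate-and-retry recovery pattern:
// invalid_client wipes all stored credentials, invalid_grant wipes only
// the tokens; anything else propagates unchanged.
class InvalidClientError extends Error {}
class InvalidGrantError extends Error {}

async function withAuthRetry<T>(
    attempt: () => Promise<T>,
    invalidate: (scope: 'all' | 'tokens') => Promise<void>
): Promise<T> {
    try {
        return await attempt();
    } catch (error) {
        if (error instanceof InvalidClientError) {
            await invalidate('all');    // stale client registration
            return await attempt();
        }
        if (error instanceof InvalidGrantError) {
            await invalidate('tokens'); // stale or revoked tokens
            return await attempt();
        }
        throw error;                    // not recoverable here
    }
}
```

The retry is deliberately single-shot: if the second attempt fails too, the error reaches the caller.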
async function authInternal(provider, { serverUrl, authorizationCode, scope, resourceMetadataUrl, fetchFn }) {
|
||||
// Check if the provider has cached discovery state to skip discovery
|
||||
const cachedState = await provider.discoveryState?.();
|
||||
let resourceMetadata;
|
||||
let authorizationServerUrl;
|
||||
let metadata;
|
||||
// If resourceMetadataUrl is not provided, try to load it from cached state
|
||||
// This handles browser redirects where the URL was saved before navigation
|
||||
let effectiveResourceMetadataUrl = resourceMetadataUrl;
|
||||
if (!effectiveResourceMetadataUrl && cachedState?.resourceMetadataUrl) {
|
||||
effectiveResourceMetadataUrl = new URL(cachedState.resourceMetadataUrl);
|
||||
}
|
||||
if (cachedState?.authorizationServerUrl) {
|
||||
// Restore discovery state from cache
|
||||
authorizationServerUrl = cachedState.authorizationServerUrl;
|
||||
resourceMetadata = cachedState.resourceMetadata;
|
||||
metadata =
|
||||
cachedState.authorizationServerMetadata ?? (await discoverAuthorizationServerMetadata(authorizationServerUrl, { fetchFn }));
|
||||
// If resource metadata wasn't cached, try to fetch it for selectResourceURL
|
||||
if (!resourceMetadata) {
|
||||
try {
|
||||
resourceMetadata = await discoverOAuthProtectedResourceMetadata(serverUrl, { resourceMetadataUrl: effectiveResourceMetadataUrl }, fetchFn);
|
||||
}
|
||||
catch {
|
||||
// RFC 9728 not available — selectResourceURL will handle undefined
|
||||
}
|
||||
}
|
||||
// Re-save if we enriched the cached state with missing metadata
|
||||
if (metadata !== cachedState.authorizationServerMetadata || resourceMetadata !== cachedState.resourceMetadata) {
|
||||
await provider.saveDiscoveryState?.({
|
||||
authorizationServerUrl: String(authorizationServerUrl),
|
||||
resourceMetadataUrl: effectiveResourceMetadataUrl?.toString(),
|
||||
resourceMetadata,
|
||||
authorizationServerMetadata: metadata
|
||||
});
|
||||
}
|
||||
}
|
||||
else {
|
||||
// Full discovery via RFC 9728
|
||||
const serverInfo = await discoverOAuthServerInfo(serverUrl, { resourceMetadataUrl: effectiveResourceMetadataUrl, fetchFn });
|
||||
authorizationServerUrl = serverInfo.authorizationServerUrl;
|
||||
metadata = serverInfo.authorizationServerMetadata;
|
||||
resourceMetadata = serverInfo.resourceMetadata;
|
||||
// Persist discovery state for future use
|
||||
// TODO: resourceMetadataUrl is only populated when explicitly provided via options
|
||||
// or loaded from cached state. The URL derived internally by
|
||||
// discoverOAuthProtectedResourceMetadata() is not captured back here.
|
||||
await provider.saveDiscoveryState?.({
|
||||
authorizationServerUrl: String(authorizationServerUrl),
|
||||
resourceMetadataUrl: effectiveResourceMetadataUrl?.toString(),
|
||||
resourceMetadata,
|
||||
authorizationServerMetadata: metadata
|
||||
});
|
||||
}
|
||||
const resource = await selectResourceURL(serverUrl, provider, resourceMetadata);
|
||||
// Handle client registration if needed
|
||||
let clientInformation = await Promise.resolve(provider.clientInformation());
|
||||
if (!clientInformation) {
|
||||
if (authorizationCode !== undefined) {
|
||||
throw new Error('Existing OAuth client information is required when exchanging an authorization code');
|
||||
}
|
||||
const supportsUrlBasedClientId = metadata?.client_id_metadata_document_supported === true;
|
||||
const clientMetadataUrl = provider.clientMetadataUrl;
|
||||
if (clientMetadataUrl && !isHttpsUrl(clientMetadataUrl)) {
|
||||
throw new errors_js_1.InvalidClientMetadataError(`clientMetadataUrl must be a valid HTTPS URL with a non-root pathname, got: ${clientMetadataUrl}`);
|
||||
}
|
||||
const shouldUseUrlBasedClientId = supportsUrlBasedClientId && clientMetadataUrl;
|
||||
if (shouldUseUrlBasedClientId) {
|
||||
// SEP-991: URL-based Client IDs
|
||||
clientInformation = {
|
||||
client_id: clientMetadataUrl
|
||||
};
|
||||
await provider.saveClientInformation?.(clientInformation);
|
||||
}
|
||||
else {
|
||||
// Fallback to dynamic registration
|
||||
if (!provider.saveClientInformation) {
|
||||
throw new Error('OAuth client information must be saveable for dynamic registration');
|
||||
}
|
||||
const fullInformation = await registerClient(authorizationServerUrl, {
|
||||
metadata,
|
||||
clientMetadata: provider.clientMetadata,
|
||||
fetchFn
|
||||
});
|
||||
await provider.saveClientInformation(fullInformation);
|
||||
clientInformation = fullInformation;
|
||||
}
|
||||
}
|
||||
// Non-interactive flows (e.g., client_credentials, jwt-bearer) don't need a redirect URL
|
||||
const nonInteractiveFlow = !provider.redirectUrl;
|
||||
// Exchange authorization code for tokens, or fetch tokens directly for non-interactive flows
|
||||
if (authorizationCode !== undefined || nonInteractiveFlow) {
|
||||
const tokens = await fetchToken(provider, authorizationServerUrl, {
|
||||
metadata,
|
||||
resource,
|
||||
authorizationCode,
|
||||
fetchFn
|
||||
});
|
||||
await provider.saveTokens(tokens);
|
||||
return 'AUTHORIZED';
|
||||
}
|
||||
const tokens = await provider.tokens();
|
||||
// Handle token refresh or new authorization
|
||||
if (tokens?.refresh_token) {
|
||||
try {
|
||||
// Attempt to refresh the token
|
||||
const newTokens = await refreshAuthorization(authorizationServerUrl, {
|
||||
metadata,
|
||||
clientInformation,
|
||||
refreshToken: tokens.refresh_token,
|
||||
resource,
|
||||
addClientAuthentication: provider.addClientAuthentication,
|
||||
fetchFn
|
||||
});
|
||||
await provider.saveTokens(newTokens);
|
||||
return 'AUTHORIZED';
|
||||
}
|
||||
catch (error) {
|
||||
// If this is a ServerError, or an unknown type, log it out and try to continue. Otherwise, escalate so we can fix things and retry.
|
||||
if (!(error instanceof errors_js_1.OAuthError) || error instanceof errors_js_1.ServerError) {
|
||||
// Could not refresh OAuth tokens
|
||||
}
|
||||
else {
|
||||
// Refresh failed for another reason, re-throw
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
}
|
||||
const state = provider.state ? await provider.state() : undefined;
|
||||
// Start new authorization flow
|
||||
const { authorizationUrl, codeVerifier } = await startAuthorization(authorizationServerUrl, {
|
||||
metadata,
|
||||
clientInformation,
|
||||
state,
|
||||
redirectUrl: provider.redirectUrl,
|
||||
scope: scope || resourceMetadata?.scopes_supported?.join(' ') || provider.clientMetadata.scope,
|
||||
resource
|
||||
});
|
||||
await provider.saveCodeVerifier(codeVerifier);
|
||||
await provider.redirectToAuthorization(authorizationUrl);
|
||||
return 'REDIRECT';
|
||||
}
|
||||
/**
 * SEP-991: URL-based Client IDs
 * Validate that the client_id is a valid URL with https scheme
 */
function isHttpsUrl(value) {
    if (!value)
        return false;
    try {
        const url = new URL(value);
        return url.protocol === 'https:' && url.pathname !== '/';
    }
    catch {
        return false;
    }
}
async function selectResourceURL(serverUrl, provider, resourceMetadata) {
    const defaultResource = (0, auth_utils_js_1.resourceUrlFromServerUrl)(serverUrl);
    // If provider has custom validation, delegate to it
    if (provider.validateResourceURL) {
        return await provider.validateResourceURL(defaultResource, resourceMetadata?.resource);
    }
    // Only include resource parameter when Protected Resource Metadata is present
    if (!resourceMetadata) {
        return undefined;
    }
    // Validate that the metadata's resource is compatible with our request
    if (!(0, auth_utils_js_1.checkResourceAllowed)({ requestedResource: defaultResource, configuredResource: resourceMetadata.resource })) {
        throw new Error(`Protected resource ${resourceMetadata.resource} does not match expected ${defaultResource} (or origin)`);
    }
    // Prefer the resource from metadata since it's what the server is telling us to request
    return new URL(resourceMetadata.resource);
}
/**
 * Extract resource_metadata, scope, and error from WWW-Authenticate header.
 */
function extractWWWAuthenticateParams(res) {
    const authenticateHeader = res.headers.get('WWW-Authenticate');
    if (!authenticateHeader) {
        return {};
    }
    const [type, scheme] = authenticateHeader.split(' ');
    if (type.toLowerCase() !== 'bearer' || !scheme) {
        return {};
    }
    const resourceMetadataMatch = extractFieldFromWwwAuth(res, 'resource_metadata') || undefined;
    let resourceMetadataUrl;
    if (resourceMetadataMatch) {
        try {
            resourceMetadataUrl = new URL(resourceMetadataMatch);
        }
        catch {
            // Ignore invalid URL
        }
    }
    const scope = extractFieldFromWwwAuth(res, 'scope') || undefined;
    const error = extractFieldFromWwwAuth(res, 'error') || undefined;
    return {
        resourceMetadataUrl,
        scope,
        error
    };
}
/**
 * Extracts a specific field's value from the WWW-Authenticate header string.
 *
 * @param response The HTTP response object containing the headers.
 * @param fieldName The name of the field to extract (e.g., "realm", "nonce").
 * @returns The field value
 */
function extractFieldFromWwwAuth(response, fieldName) {
    const wwwAuthHeader = response.headers.get('WWW-Authenticate');
    if (!wwwAuthHeader) {
        return null;
    }
    const pattern = new RegExp(`${fieldName}=(?:"([^"]+)"|([^\\s,]+))`);
    const match = wwwAuthHeader.match(pattern);
    if (match) {
        // Pattern matches: field_name="value" or field_name=value (unquoted)
        return match[1] || match[2];
    }
    return null;
}
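For reference, the field-extraction regex used by `extractFieldFromWwwAuth` above can be exercised standalone. This is a minimal sketch: the `Response`-like object and header values are stand-ins, not part of the SDK.

```javascript
// Minimal stand-in for a fetch Response carrying a WWW-Authenticate header.
const res = {
    headers: new Map([
        ['WWW-Authenticate', 'Bearer resource_metadata="https://mcp.example.com/.well-known/oauth-protected-resource", scope="mcp:read mcp:write", error=invalid_token']
    ])
};

// Same regex shape as extractFieldFromWwwAuth: quoted or unquoted field values.
function extractField(response, fieldName) {
    const wwwAuthHeader = response.headers.get('WWW-Authenticate');
    if (!wwwAuthHeader) return null;
    const pattern = new RegExp(`${fieldName}=(?:"([^"]+)"|([^\\s,]+))`);
    const match = wwwAuthHeader.match(pattern);
    return match ? (match[1] || match[2]) : null;
}

console.log(extractField(res, 'resource_metadata')); // quoted value, captured by group 1
console.log(extractField(res, 'scope'));
console.log(extractField(res, 'error'));             // unquoted value, captured by group 2
```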
/**
 * Extract resource_metadata from response header.
 * @deprecated Use `extractWWWAuthenticateParams` instead.
 */
function extractResourceMetadataUrl(res) {
    const authenticateHeader = res.headers.get('WWW-Authenticate');
    if (!authenticateHeader) {
        return undefined;
    }
    const [type, scheme] = authenticateHeader.split(' ');
    if (type.toLowerCase() !== 'bearer' || !scheme) {
        return undefined;
    }
    const regex = /resource_metadata="([^"]*)"/;
    const match = regex.exec(authenticateHeader);
    if (!match) {
        return undefined;
    }
    try {
        return new URL(match[1]);
    }
    catch {
        return undefined;
    }
}
/**
 * Looks up RFC 9728 OAuth 2.0 Protected Resource Metadata.
 *
 * If the server returns a 404 for the well-known endpoint, this function
 * throws (the resource server does not implement Protected Resource
 * Metadata). Any other errors are also thrown as exceptions.
 */
async function discoverOAuthProtectedResourceMetadata(serverUrl, opts, fetchFn = fetch) {
    const response = await discoverMetadataWithFallback(serverUrl, 'oauth-protected-resource', fetchFn, {
        protocolVersion: opts?.protocolVersion,
        metadataUrl: opts?.resourceMetadataUrl
    });
    if (!response || response.status === 404) {
        await response?.body?.cancel();
        throw new Error(`Resource server does not implement OAuth 2.0 Protected Resource Metadata.`);
    }
    if (!response.ok) {
        await response.body?.cancel();
        throw new Error(`HTTP ${response.status} trying to load well-known OAuth protected resource metadata.`);
    }
    return auth_js_2.OAuthProtectedResourceMetadataSchema.parse(await response.json());
}
/**
 * Helper function to handle fetch with CORS retry logic
 */
async function fetchWithCorsRetry(url, headers, fetchFn = fetch) {
    try {
        return await fetchFn(url, { headers });
    }
    catch (error) {
        if (error instanceof TypeError) {
            if (headers) {
                // CORS errors come back as TypeError, retry without headers
                return fetchWithCorsRetry(url, undefined, fetchFn);
            }
            else {
                // We're getting CORS errors on retry too, return undefined
                return undefined;
            }
        }
        throw error;
    }
}
/**
 * Constructs the well-known path for auth-related metadata discovery
 */
function buildWellKnownPath(wellKnownPrefix, pathname = '', options = {}) {
    // Strip trailing slash from pathname to avoid double slashes
    if (pathname.endsWith('/')) {
        pathname = pathname.slice(0, -1);
    }
    return options.prependPathname ? `${pathname}/.well-known/${wellKnownPrefix}` : `/.well-known/${wellKnownPrefix}${pathname}`;
}
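The two path shapes `buildWellKnownPath` can emit (the default, which inserts the well-known segment before the path, versus the `prependPathname` variant, which appends it after) can be sketched standalone; this snippet mirrors the function above rather than importing the SDK:

```javascript
// Mirrors buildWellKnownPath above: strips a trailing slash, then either
// appends the well-known segment after the path (prependPathname) or
// inserts it before the path (the default, RFC 8414 style).
function buildWellKnownPath(wellKnownPrefix, pathname = '', options = {}) {
    if (pathname.endsWith('/')) {
        pathname = pathname.slice(0, -1);
    }
    return options.prependPathname
        ? `${pathname}/.well-known/${wellKnownPrefix}`
        : `/.well-known/${wellKnownPrefix}${pathname}`;
}

// Default: well-known segment inserted before the tenant path.
console.log(buildWellKnownPath('oauth-authorization-server', '/tenant1'));
// -> /.well-known/oauth-authorization-server/tenant1

// prependPathname: well-known segment appended after the path.
console.log(buildWellKnownPath('openid-configuration', '/tenant1/', { prependPathname: true }));
// -> /tenant1/.well-known/openid-configuration
```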
/**
 * Tries to discover OAuth metadata at a specific URL
 */
async function tryMetadataDiscovery(url, protocolVersion, fetchFn = fetch) {
    const headers = {
        'MCP-Protocol-Version': protocolVersion
    };
    return await fetchWithCorsRetry(url, headers, fetchFn);
}
/**
 * Determines if fallback to root discovery should be attempted
 */
function shouldAttemptFallback(response, pathname) {
    return !response || (response.status >= 400 && response.status < 500 && pathname !== '/');
}
/**
 * Generic function for discovering OAuth metadata with fallback support
 */
async function discoverMetadataWithFallback(serverUrl, wellKnownType, fetchFn, opts) {
    const issuer = new URL(serverUrl);
    const protocolVersion = opts?.protocolVersion ?? types_js_1.LATEST_PROTOCOL_VERSION;
    let url;
    if (opts?.metadataUrl) {
        url = new URL(opts.metadataUrl);
    }
    else {
        // Try path-aware discovery first
        const wellKnownPath = buildWellKnownPath(wellKnownType, issuer.pathname);
        url = new URL(wellKnownPath, opts?.metadataServerUrl ?? issuer);
        url.search = issuer.search;
    }
    let response = await tryMetadataDiscovery(url, protocolVersion, fetchFn);
    // If path-aware discovery fails with 404 and we're not already at root, try fallback to root discovery
    if (!opts?.metadataUrl && shouldAttemptFallback(response, issuer.pathname)) {
        const rootUrl = new URL(`/.well-known/${wellKnownType}`, issuer);
        response = await tryMetadataDiscovery(rootUrl, protocolVersion, fetchFn);
    }
    return response;
}
/**
 * Looks up RFC 8414 OAuth 2.0 Authorization Server Metadata.
 *
 * If the server returns a 404 for the well-known endpoint, this function will
 * return `undefined`. Any other errors will be thrown as exceptions.
 *
 * @deprecated This function is deprecated in favor of `discoverAuthorizationServerMetadata`.
 */
async function discoverOAuthMetadata(issuer, { authorizationServerUrl, protocolVersion } = {}, fetchFn = fetch) {
    if (typeof issuer === 'string') {
        issuer = new URL(issuer);
    }
    if (!authorizationServerUrl) {
        authorizationServerUrl = issuer;
    }
    if (typeof authorizationServerUrl === 'string') {
        authorizationServerUrl = new URL(authorizationServerUrl);
    }
    protocolVersion ?? (protocolVersion = types_js_1.LATEST_PROTOCOL_VERSION);
    const response = await discoverMetadataWithFallback(authorizationServerUrl, 'oauth-authorization-server', fetchFn, {
        protocolVersion,
        metadataServerUrl: authorizationServerUrl
    });
    if (!response || response.status === 404) {
        await response?.body?.cancel();
        return undefined;
    }
    if (!response.ok) {
        await response.body?.cancel();
        throw new Error(`HTTP ${response.status} trying to load well-known OAuth metadata`);
    }
    return auth_js_2.OAuthMetadataSchema.parse(await response.json());
}
/**
 * Builds a list of discovery URLs to try for authorization server metadata.
 * URLs are returned in priority order:
 * 1. OAuth metadata at the given URL
 * 2. OIDC metadata endpoints at the given URL
 */
function buildDiscoveryUrls(authorizationServerUrl) {
    const url = typeof authorizationServerUrl === 'string' ? new URL(authorizationServerUrl) : authorizationServerUrl;
    const hasPath = url.pathname !== '/';
    const urlsToTry = [];
    if (!hasPath) {
        // Root path: https://example.com/.well-known/oauth-authorization-server
        urlsToTry.push({
            url: new URL('/.well-known/oauth-authorization-server', url.origin),
            type: 'oauth'
        });
        // OIDC: https://example.com/.well-known/openid-configuration
        urlsToTry.push({
            url: new URL(`/.well-known/openid-configuration`, url.origin),
            type: 'oidc'
        });
        return urlsToTry;
    }
    // Strip trailing slash from pathname to avoid double slashes
    let pathname = url.pathname;
    if (pathname.endsWith('/')) {
        pathname = pathname.slice(0, -1);
    }
    // 1. OAuth metadata at the given URL
    // Insert well-known before the path: https://example.com/.well-known/oauth-authorization-server/tenant1
    urlsToTry.push({
        url: new URL(`/.well-known/oauth-authorization-server${pathname}`, url.origin),
        type: 'oauth'
    });
    // 2. OIDC metadata endpoints
    // RFC 8414 style: Insert /.well-known/openid-configuration before the path
    urlsToTry.push({
        url: new URL(`/.well-known/openid-configuration${pathname}`, url.origin),
        type: 'oidc'
    });
    // OIDC Discovery 1.0 style: Append /.well-known/openid-configuration after the path
    urlsToTry.push({
        url: new URL(`${pathname}/.well-known/openid-configuration`, url.origin),
        type: 'oidc'
    });
    return urlsToTry;
}
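For an issuer with a tenant path, the discovery candidates produced by `buildDiscoveryUrls` come out in the order below. This is a standalone sketch using the same `URL` construction; the host and tenant path are illustrative placeholders:

```javascript
// Candidate metadata endpoints for https://auth.example.com/tenant1,
// in the same priority order buildDiscoveryUrls uses for non-root paths.
const origin = 'https://auth.example.com';
const pathname = '/tenant1';

const candidates = [
    // 1. RFC 8414 OAuth metadata, well-known inserted before the path
    new URL(`/.well-known/oauth-authorization-server${pathname}`, origin),
    // 2. OIDC metadata, RFC 8414 style (well-known before the path)
    new URL(`/.well-known/openid-configuration${pathname}`, origin),
    // 3. OIDC Discovery 1.0 style (well-known appended after the path)
    new URL(`${pathname}/.well-known/openid-configuration`, origin)
];

for (const url of candidates) {
    console.log(url.href);
}
```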
/**
 * Discovers authorization server metadata with support for RFC 8414 OAuth 2.0 Authorization Server Metadata
 * and OpenID Connect Discovery 1.0 specifications.
 *
 * This function implements a fallback strategy for authorization server discovery:
 * 1. Attempts RFC 8414 OAuth metadata discovery first
 * 2. If OAuth discovery fails, falls back to OpenID Connect Discovery
 *
 * @param authorizationServerUrl - The authorization server URL obtained from the MCP Server's
 *                                 protected resource metadata, or the MCP server's URL if the
 *                                 metadata was not found.
 * @param options - Configuration options
 * @param options.fetchFn - Optional fetch function for making HTTP requests, defaults to global fetch
 * @param options.protocolVersion - MCP protocol version to use, defaults to LATEST_PROTOCOL_VERSION
 * @returns Promise resolving to authorization server metadata, or undefined if discovery fails
 */
async function discoverAuthorizationServerMetadata(authorizationServerUrl, { fetchFn = fetch, protocolVersion = types_js_1.LATEST_PROTOCOL_VERSION } = {}) {
    const headers = {
        'MCP-Protocol-Version': protocolVersion,
        Accept: 'application/json'
    };
    // Get the list of URLs to try
    const urlsToTry = buildDiscoveryUrls(authorizationServerUrl);
    // Try each URL in order
    for (const { url: endpointUrl, type } of urlsToTry) {
        const response = await fetchWithCorsRetry(endpointUrl, headers, fetchFn);
        if (!response) {
            /**
             * CORS error occurred - don't throw as the endpoint may not allow CORS,
             * continue trying other possible endpoints
             */
            continue;
        }
        if (!response.ok) {
            await response.body?.cancel();
            // Continue looking for any 4xx response code.
            if (response.status >= 400 && response.status < 500) {
                continue; // Try next URL
            }
            throw new Error(`HTTP ${response.status} trying to load ${type === 'oauth' ? 'OAuth' : 'OpenID provider'} metadata from ${endpointUrl}`);
        }
        // Parse and validate based on type
        if (type === 'oauth') {
            return auth_js_2.OAuthMetadataSchema.parse(await response.json());
        }
        else {
            return auth_js_1.OpenIdProviderDiscoveryMetadataSchema.parse(await response.json());
        }
    }
    return undefined;
}
/**
 * Discovers the authorization server for an MCP server following
 * {@link https://datatracker.ietf.org/doc/html/rfc9728 | RFC 9728} (OAuth 2.0 Protected
 * Resource Metadata), with fallback to treating the server URL as the
 * authorization server.
 *
 * This function combines two discovery steps into one call:
 * 1. Probes `/.well-known/oauth-protected-resource` on the MCP server to find the
 *    authorization server URL (RFC 9728).
 * 2. Fetches authorization server metadata from that URL (RFC 8414 / OpenID Connect Discovery).
 *
 * Use this when you need the authorization server metadata for operations outside the
 * {@linkcode auth} orchestrator, such as token refresh or token revocation.
 *
 * @param serverUrl - The MCP resource server URL
 * @param opts - Optional configuration
 * @param opts.resourceMetadataUrl - Override URL for the protected resource metadata endpoint
 * @param opts.fetchFn - Custom fetch function for HTTP requests
 * @returns Authorization server URL, metadata, and resource metadata (if available)
 */
async function discoverOAuthServerInfo(serverUrl, opts) {
    let resourceMetadata;
    let authorizationServerUrl;
    try {
        resourceMetadata = await discoverOAuthProtectedResourceMetadata(serverUrl, { resourceMetadataUrl: opts?.resourceMetadataUrl }, opts?.fetchFn);
        if (resourceMetadata.authorization_servers && resourceMetadata.authorization_servers.length > 0) {
            authorizationServerUrl = resourceMetadata.authorization_servers[0];
        }
    }
    catch {
        // RFC 9728 not supported -- fall back to treating the server URL as the authorization server
    }
    // If we don't get a valid authorization server from protected resource metadata,
    // fall back to the legacy MCP spec behavior: MCP server base URL acts as the authorization server
    if (!authorizationServerUrl) {
        authorizationServerUrl = String(new URL('/', serverUrl));
    }
    const authorizationServerMetadata = await discoverAuthorizationServerMetadata(authorizationServerUrl, { fetchFn: opts?.fetchFn });
    return {
        authorizationServerUrl,
        authorizationServerMetadata,
        resourceMetadata
    };
}
/**
 * Begins the authorization flow with the given server, by generating a PKCE challenge and constructing the authorization URL.
 */
async function startAuthorization(authorizationServerUrl, { metadata, clientInformation, redirectUrl, scope, state, resource }) {
    let authorizationUrl;
    if (metadata) {
        authorizationUrl = new URL(metadata.authorization_endpoint);
        if (!metadata.response_types_supported.includes(AUTHORIZATION_CODE_RESPONSE_TYPE)) {
            throw new Error(`Incompatible auth server: does not support response type ${AUTHORIZATION_CODE_RESPONSE_TYPE}`);
        }
        if (metadata.code_challenge_methods_supported &&
            !metadata.code_challenge_methods_supported.includes(AUTHORIZATION_CODE_CHALLENGE_METHOD)) {
            throw new Error(`Incompatible auth server: does not support code challenge method ${AUTHORIZATION_CODE_CHALLENGE_METHOD}`);
        }
    }
    else {
        authorizationUrl = new URL('/authorize', authorizationServerUrl);
    }
    // Generate PKCE challenge
    const challenge = await (0, pkce_challenge_1.default)();
    const codeVerifier = challenge.code_verifier;
    const codeChallenge = challenge.code_challenge;
    authorizationUrl.searchParams.set('response_type', AUTHORIZATION_CODE_RESPONSE_TYPE);
    authorizationUrl.searchParams.set('client_id', clientInformation.client_id);
    authorizationUrl.searchParams.set('code_challenge', codeChallenge);
    authorizationUrl.searchParams.set('code_challenge_method', AUTHORIZATION_CODE_CHALLENGE_METHOD);
    authorizationUrl.searchParams.set('redirect_uri', String(redirectUrl));
    if (state) {
        authorizationUrl.searchParams.set('state', state);
    }
    if (scope) {
        authorizationUrl.searchParams.set('scope', scope);
    }
    if (scope?.includes('offline_access')) {
        // if the request includes the OIDC-only "offline_access" scope,
        // we need to set the prompt to "consent" to ensure the user is prompted to grant offline access
        // https://openid.net/specs/openid-connect-core-1_0.html#OfflineAccess
        authorizationUrl.searchParams.append('prompt', 'consent');
    }
    if (resource) {
        authorizationUrl.searchParams.set('resource', resource.href);
    }
    return { authorizationUrl, codeVerifier };
}
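The query string `startAuthorization` assembles can be sketched with the standard `URL` API. The endpoint, client ID, redirect URI, and PKCE challenge below are hypothetical placeholders; in the SDK the challenge comes from the `pkce-challenge` package:

```javascript
// Hypothetical PKCE challenge; the real code derives it from a random verifier.
const codeChallenge = 'E9Melhoa2OwvFrEMTJguCHaoeK1t8URWbuGJSstw-cM';

const authorizationUrl = new URL('https://auth.example.com/authorize');
authorizationUrl.searchParams.set('response_type', 'code');
authorizationUrl.searchParams.set('client_id', 'my-client-id');
authorizationUrl.searchParams.set('code_challenge', codeChallenge);
authorizationUrl.searchParams.set('code_challenge_method', 'S256');
authorizationUrl.searchParams.set('redirect_uri', 'http://localhost:8090/callback');
authorizationUrl.searchParams.set('scope', 'mcp:read offline_access');
// offline_access is OIDC-only, so the function also appends prompt=consent.
authorizationUrl.searchParams.append('prompt', 'consent');

console.log(authorizationUrl.href);
```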
/**
 * Prepares token request parameters for an authorization code exchange.
 *
 * This is the default implementation used by fetchToken when the provider
 * doesn't implement prepareTokenRequest.
 *
 * @param authorizationCode - The authorization code received from the authorization endpoint
 * @param codeVerifier - The PKCE code verifier
 * @param redirectUri - The redirect URI used in the authorization request
 * @returns URLSearchParams for the authorization_code grant
 */
function prepareAuthorizationCodeRequest(authorizationCode, codeVerifier, redirectUri) {
    return new URLSearchParams({
        grant_type: 'authorization_code',
        code: authorizationCode,
        code_verifier: codeVerifier,
        redirect_uri: String(redirectUri)
    });
}
/**
 * Internal helper to execute a token request with the given parameters.
 * Used by exchangeAuthorization, refreshAuthorization, and fetchToken.
 */
async function executeTokenRequest(authorizationServerUrl, { metadata, tokenRequestParams, clientInformation, addClientAuthentication, resource, fetchFn }) {
    const tokenUrl = metadata?.token_endpoint ? new URL(metadata.token_endpoint) : new URL('/token', authorizationServerUrl);
    const headers = new Headers({
        'Content-Type': 'application/x-www-form-urlencoded',
        Accept: 'application/json'
    });
    if (resource) {
        tokenRequestParams.set('resource', resource.href);
    }
    if (addClientAuthentication) {
        await addClientAuthentication(headers, tokenRequestParams, tokenUrl, metadata);
    }
    else if (clientInformation) {
        const supportedMethods = metadata?.token_endpoint_auth_methods_supported ?? [];
        const authMethod = selectClientAuthMethod(clientInformation, supportedMethods);
        applyClientAuthentication(authMethod, clientInformation, headers, tokenRequestParams);
    }
    const response = await (fetchFn ?? fetch)(tokenUrl, {
        method: 'POST',
        headers,
        body: tokenRequestParams
    });
    if (!response.ok) {
        throw await parseErrorResponse(response);
    }
    return auth_js_2.OAuthTokensSchema.parse(await response.json());
}
/**
 * Exchanges an authorization code for an access token with the given server.
 *
 * Supports multiple client authentication methods as specified in OAuth 2.1:
 * - Automatically selects the best authentication method based on server support
 * - Falls back to appropriate defaults when server metadata is unavailable
 *
 * @param authorizationServerUrl - The authorization server's base URL
 * @param options - Configuration object containing client info, auth code, etc.
 * @returns Promise resolving to OAuth tokens
 * @throws {Error} When token exchange fails or authentication is invalid
 */
async function exchangeAuthorization(authorizationServerUrl, { metadata, clientInformation, authorizationCode, codeVerifier, redirectUri, resource, addClientAuthentication, fetchFn }) {
    const tokenRequestParams = prepareAuthorizationCodeRequest(authorizationCode, codeVerifier, redirectUri);
    return executeTokenRequest(authorizationServerUrl, {
        metadata,
        tokenRequestParams,
        clientInformation,
        addClientAuthentication,
        resource,
        fetchFn
    });
}
/**
 * Exchange a refresh token for an updated access token.
 *
 * Supports multiple client authentication methods as specified in OAuth 2.1:
 * - Automatically selects the best authentication method based on server support
 * - Preserves the original refresh token if a new one is not returned
 *
 * @param authorizationServerUrl - The authorization server's base URL
 * @param options - Configuration object containing client info, refresh token, etc.
 * @returns Promise resolving to OAuth tokens (preserves original refresh_token if not replaced)
 * @throws {Error} When token refresh fails or authentication is invalid
 */
async function refreshAuthorization(authorizationServerUrl, { metadata, clientInformation, refreshToken, resource, addClientAuthentication, fetchFn }) {
    const tokenRequestParams = new URLSearchParams({
        grant_type: 'refresh_token',
        refresh_token: refreshToken
    });
    const tokens = await executeTokenRequest(authorizationServerUrl, {
        metadata,
        tokenRequestParams,
        clientInformation,
        addClientAuthentication,
        resource,
        fetchFn
    });
    // Preserve original refresh token if server didn't return a new one
    return { refresh_token: refreshToken, ...tokens };
}
/**
 * Unified token fetching that works with any grant type via provider.prepareTokenRequest().
 *
 * This function provides a single entry point for obtaining tokens regardless of the
 * OAuth grant type. The provider's prepareTokenRequest() method determines which grant
 * to use and supplies the grant-specific parameters.
 *
 * @param provider - OAuth client provider that implements prepareTokenRequest()
 * @param authorizationServerUrl - The authorization server's base URL
 * @param options - Configuration for the token request
 * @returns Promise resolving to OAuth tokens
 * @throws {Error} When provider doesn't implement prepareTokenRequest or token fetch fails
 *
 * @example
 * // Provider for client_credentials:
 * class MyProvider implements OAuthClientProvider {
 *     prepareTokenRequest(scope) {
 *         const params = new URLSearchParams({ grant_type: 'client_credentials' });
 *         if (scope) params.set('scope', scope);
 *         return params;
 *     }
 *     // ... other methods
 * }
 *
 * const tokens = await fetchToken(provider, authServerUrl, { metadata });
 */
async function fetchToken(provider, authorizationServerUrl, { metadata, resource, authorizationCode, fetchFn } = {}) {
    const scope = provider.clientMetadata.scope;
    // Use provider's prepareTokenRequest if available, otherwise fall back to authorization_code
    let tokenRequestParams;
    if (provider.prepareTokenRequest) {
        tokenRequestParams = await provider.prepareTokenRequest(scope);
    }
    // Default to authorization_code grant if no custom prepareTokenRequest
    if (!tokenRequestParams) {
        if (!authorizationCode) {
            throw new Error('Either provider.prepareTokenRequest() or authorizationCode is required');
        }
        if (!provider.redirectUrl) {
            throw new Error('redirectUrl is required for authorization_code flow');
        }
        const codeVerifier = await provider.codeVerifier();
        tokenRequestParams = prepareAuthorizationCodeRequest(authorizationCode, codeVerifier, provider.redirectUrl);
    }
    const clientInformation = await provider.clientInformation();
    return executeTokenRequest(authorizationServerUrl, {
        metadata,
        tokenRequestParams,
        clientInformation: clientInformation ?? undefined,
        addClientAuthentication: provider.addClientAuthentication,
        resource,
        fetchFn
    });
}
/**
 * Performs OAuth 2.0 Dynamic Client Registration according to RFC 7591.
 */
async function registerClient(authorizationServerUrl, { metadata, clientMetadata, fetchFn }) {
    let registrationUrl;
    if (metadata) {
        if (!metadata.registration_endpoint) {
            throw new Error('Incompatible auth server: does not support dynamic client registration');
        }
        registrationUrl = new URL(metadata.registration_endpoint);
    }
    else {
        registrationUrl = new URL('/register', authorizationServerUrl);
    }
    const response = await (fetchFn ?? fetch)(registrationUrl, {
        method: 'POST',
        headers: {
            'Content-Type': 'application/json'
        },
        body: JSON.stringify(clientMetadata)
    });
    if (!response.ok) {
        throw await parseErrorResponse(response);
    }
    return auth_js_2.OAuthClientInformationFullSchema.parse(await response.json());
}
//# sourceMappingURL=auth.js.map
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/auth.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
588
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,588 @@
|
||||
import { Protocol, type ProtocolOptions, type RequestOptions } from '../shared/protocol.js';
|
||||
import type { Transport } from '../shared/transport.js';
|
||||
import { type CallToolRequest, CallToolResultSchema, type ClientCapabilities, type ClientNotification, type ClientRequest, type ClientResult, type CompatibilityCallToolResultSchema, type CompleteRequest, type GetPromptRequest, type Implementation, type ListPromptsRequest, type ListResourcesRequest, type ListResourceTemplatesRequest, type ListToolsRequest, type LoggingLevel, type ReadResourceRequest, type ServerCapabilities, type SubscribeRequest, type UnsubscribeRequest, type ListChangedHandlers, type Request, type Notification, type Result } from '../types.js';
|
||||
import type { jsonSchemaValidator } from '../validation/types.js';
|
||||
import { AnyObjectSchema, SchemaOutput } from '../server/zod-compat.js';
|
||||
import type { RequestHandlerExtra } from '../shared/protocol.js';
|
||||
import { ExperimentalClientTasks } from '../experimental/tasks/client.js';
|
||||
/**
|
||||
* Determines which elicitation modes are supported based on declared client capabilities.
|
||||
*
|
||||
* According to the spec:
|
||||
* - An empty elicitation capability object defaults to form mode support (backwards compatibility)
|
||||
* - URL mode is only supported if explicitly declared
|
||||
*
|
||||
* @param capabilities - The client's elicitation capabilities
|
||||
* @returns An object indicating which modes are supported
|
||||
*/
|
||||
export declare function getSupportedElicitationModes(capabilities: ClientCapabilities['elicitation']): {
|
||||
supportsFormMode: boolean;
|
||||
supportsUrlMode: boolean;
|
||||
};
|
||||
export type ClientOptions = ProtocolOptions & {
    /**
     * Capabilities to advertise as being supported by this client.
     */
    capabilities?: ClientCapabilities;
    /**
     * JSON Schema validator for tool output validation.
     *
     * The validator is used to validate structured content returned by tools
     * against their declared output schemas.
     *
     * @default AjvJsonSchemaValidator
     *
     * @example
     * ```typescript
     * // ajv
     * const client = new Client(
     *     { name: 'my-client', version: '1.0.0' },
     *     {
     *         capabilities: {},
     *         jsonSchemaValidator: new AjvJsonSchemaValidator()
     *     }
     * );
     *
     * // @cfworker/json-schema
     * const client = new Client(
     *     { name: 'my-client', version: '1.0.0' },
     *     {
     *         capabilities: {},
     *         jsonSchemaValidator: new CfWorkerJsonSchemaValidator()
     *     }
     * );
     * ```
     */
    jsonSchemaValidator?: jsonSchemaValidator;
    /**
     * Configure handlers for list changed notifications (tools, prompts, resources).
     *
     * @example
     * ```typescript
     * const client = new Client(
     *     { name: 'my-client', version: '1.0.0' },
     *     {
     *         listChanged: {
     *             tools: {
     *                 onChanged: (error, tools) => {
     *                     if (error) {
     *                         console.error('Failed to refresh tools:', error);
     *                         return;
     *                     }
     *                     console.log('Tools updated:', tools);
     *                 }
     *             },
     *             prompts: {
     *                 onChanged: (error, prompts) => console.log('Prompts updated:', prompts)
     *             }
     *         }
     *     }
     * );
     * ```
     */
    listChanged?: ListChangedHandlers;
};
/**
 * An MCP client on top of a pluggable transport.
 *
 * The client will automatically begin the initialization flow with the server when connect() is called.
 *
 * To use with custom types, extend the base Request/Notification/Result types and pass them as type parameters:
 *
 * ```typescript
 * // Custom schemas
 * const CustomRequestSchema = RequestSchema.extend({...})
 * const CustomNotificationSchema = NotificationSchema.extend({...})
 * const CustomResultSchema = ResultSchema.extend({...})
 *
 * // Type aliases
 * type CustomRequest = z.infer<typeof CustomRequestSchema>
 * type CustomNotification = z.infer<typeof CustomNotificationSchema>
 * type CustomResult = z.infer<typeof CustomResultSchema>
 *
 * // Create typed client
 * const client = new Client<CustomRequest, CustomNotification, CustomResult>({
 *     name: "CustomClient",
 *     version: "1.0.0"
 * })
 * ```
 */
export declare class Client<RequestT extends Request = Request, NotificationT extends Notification = Notification, ResultT extends Result = Result> extends Protocol<ClientRequest | RequestT, ClientNotification | NotificationT, ClientResult | ResultT> {
    private _clientInfo;
    private _serverCapabilities?;
    private _serverVersion?;
    private _capabilities;
    private _instructions?;
    private _jsonSchemaValidator;
    private _cachedToolOutputValidators;
    private _cachedKnownTaskTools;
    private _cachedRequiredTaskTools;
    private _experimental?;
    private _listChangedDebounceTimers;
    private _pendingListChangedConfig?;
    /**
     * Initializes this client with the given name and version information.
     */
    constructor(_clientInfo: Implementation, options?: ClientOptions);
    /**
     * Set up handlers for list changed notifications based on config and server capabilities.
     * This should only be called after initialization when server capabilities are known.
     * Handlers are silently skipped if the server doesn't advertise the corresponding listChanged capability.
     * @internal
     */
    private _setupListChangedHandlers;
    /**
     * Access experimental features.
     *
     * WARNING: These APIs are experimental and may change without notice.
     *
     * @experimental
     */
    get experimental(): {
        tasks: ExperimentalClientTasks<RequestT, NotificationT, ResultT>;
    };
    /**
     * Registers new capabilities. This can only be called before connecting to a transport.
     *
     * The new capabilities will be merged with any existing capabilities previously given (e.g., at initialization).
     */
    registerCapabilities(capabilities: ClientCapabilities): void;
    /**
     * Override request handler registration to enforce client-side validation for elicitation.
     */
    setRequestHandler<T extends AnyObjectSchema>(requestSchema: T, handler: (request: SchemaOutput<T>, extra: RequestHandlerExtra<ClientRequest | RequestT, ClientNotification | NotificationT>) => ClientResult | ResultT | Promise<ClientResult | ResultT>): void;
    protected assertCapability(capability: keyof ServerCapabilities, method: string): void;
    connect(transport: Transport, options?: RequestOptions): Promise<void>;
    /**
     * After initialization has completed, this will be populated with the server's reported capabilities.
     */
    getServerCapabilities(): ServerCapabilities | undefined;
    /**
     * After initialization has completed, this will be populated with information about the server's name and version.
     */
    getServerVersion(): Implementation | undefined;
    /**
     * After initialization has completed, this may be populated with information about the server's instructions.
     */
    getInstructions(): string | undefined;
    protected assertCapabilityForMethod(method: RequestT['method']): void;
    protected assertNotificationCapability(method: NotificationT['method']): void;
    protected assertRequestHandlerCapability(method: string): void;
    protected assertTaskCapability(method: string): void;
    protected assertTaskHandlerCapability(method: string): void;
    ping(options?: RequestOptions): Promise<{
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
    }>;
    complete(params: CompleteRequest['params'], options?: RequestOptions): Promise<{
        [x: string]: unknown;
        completion: {
            [x: string]: unknown;
            values: string[];
            total?: number | undefined;
            hasMore?: boolean | undefined;
        };
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
    }>;
    setLoggingLevel(level: LoggingLevel, options?: RequestOptions): Promise<{
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
    }>;
    getPrompt(params: GetPromptRequest['params'], options?: RequestOptions): Promise<{
        [x: string]: unknown;
        messages: {
            role: "user" | "assistant";
            content: {
                type: "text";
                text: string;
                annotations?: {
                    audience?: ("user" | "assistant")[] | undefined;
                    priority?: number | undefined;
                    lastModified?: string | undefined;
                } | undefined;
                _meta?: Record<string, unknown> | undefined;
            } | {
                type: "image";
                data: string;
                mimeType: string;
                annotations?: {
                    audience?: ("user" | "assistant")[] | undefined;
                    priority?: number | undefined;
                    lastModified?: string | undefined;
                } | undefined;
                _meta?: Record<string, unknown> | undefined;
            } | {
                type: "audio";
                data: string;
                mimeType: string;
                annotations?: {
                    audience?: ("user" | "assistant")[] | undefined;
                    priority?: number | undefined;
                    lastModified?: string | undefined;
                } | undefined;
                _meta?: Record<string, unknown> | undefined;
            } | {
                type: "resource";
                resource: {
                    uri: string;
                    text: string;
                    mimeType?: string | undefined;
                    _meta?: Record<string, unknown> | undefined;
                } | {
                    uri: string;
                    blob: string;
                    mimeType?: string | undefined;
                    _meta?: Record<string, unknown> | undefined;
                };
                annotations?: {
                    audience?: ("user" | "assistant")[] | undefined;
                    priority?: number | undefined;
                    lastModified?: string | undefined;
                } | undefined;
                _meta?: Record<string, unknown> | undefined;
            } | {
                uri: string;
                name: string;
                type: "resource_link";
                description?: string | undefined;
                mimeType?: string | undefined;
                annotations?: {
                    audience?: ("user" | "assistant")[] | undefined;
                    priority?: number | undefined;
                    lastModified?: string | undefined;
                } | undefined;
                _meta?: {
                    [x: string]: unknown;
                } | undefined;
                icons?: {
                    src: string;
                    mimeType?: string | undefined;
                    sizes?: string[] | undefined;
                    theme?: "light" | "dark" | undefined;
                }[] | undefined;
                title?: string | undefined;
            };
        }[];
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
        description?: string | undefined;
    }>;
    listPrompts(params?: ListPromptsRequest['params'], options?: RequestOptions): Promise<{
        [x: string]: unknown;
        prompts: {
            name: string;
            description?: string | undefined;
            arguments?: {
                name: string;
                description?: string | undefined;
                required?: boolean | undefined;
            }[] | undefined;
            _meta?: {
                [x: string]: unknown;
            } | undefined;
            icons?: {
                src: string;
                mimeType?: string | undefined;
                sizes?: string[] | undefined;
                theme?: "light" | "dark" | undefined;
            }[] | undefined;
            title?: string | undefined;
        }[];
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
        nextCursor?: string | undefined;
    }>;
    listResources(params?: ListResourcesRequest['params'], options?: RequestOptions): Promise<{
        [x: string]: unknown;
        resources: {
            uri: string;
            name: string;
            description?: string | undefined;
            mimeType?: string | undefined;
            annotations?: {
                audience?: ("user" | "assistant")[] | undefined;
                priority?: number | undefined;
                lastModified?: string | undefined;
            } | undefined;
            _meta?: {
                [x: string]: unknown;
            } | undefined;
            icons?: {
                src: string;
                mimeType?: string | undefined;
                sizes?: string[] | undefined;
                theme?: "light" | "dark" | undefined;
            }[] | undefined;
            title?: string | undefined;
        }[];
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
        nextCursor?: string | undefined;
    }>;
    listResourceTemplates(params?: ListResourceTemplatesRequest['params'], options?: RequestOptions): Promise<{
        [x: string]: unknown;
        resourceTemplates: {
            uriTemplate: string;
            name: string;
            description?: string | undefined;
            mimeType?: string | undefined;
            annotations?: {
                audience?: ("user" | "assistant")[] | undefined;
                priority?: number | undefined;
                lastModified?: string | undefined;
            } | undefined;
            _meta?: {
                [x: string]: unknown;
            } | undefined;
            icons?: {
                src: string;
                mimeType?: string | undefined;
                sizes?: string[] | undefined;
                theme?: "light" | "dark" | undefined;
            }[] | undefined;
            title?: string | undefined;
        }[];
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
        nextCursor?: string | undefined;
    }>;
    readResource(params: ReadResourceRequest['params'], options?: RequestOptions): Promise<{
        [x: string]: unknown;
        contents: ({
            uri: string;
            text: string;
            mimeType?: string | undefined;
            _meta?: Record<string, unknown> | undefined;
        } | {
            uri: string;
            blob: string;
            mimeType?: string | undefined;
            _meta?: Record<string, unknown> | undefined;
        })[];
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
    }>;
    subscribeResource(params: SubscribeRequest['params'], options?: RequestOptions): Promise<{
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
    }>;
    unsubscribeResource(params: UnsubscribeRequest['params'], options?: RequestOptions): Promise<{
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
    }>;
    /**
     * Calls a tool and waits for the result. Automatically validates structured output if the tool has an outputSchema.
     *
     * For task-based execution with streaming behavior, use client.experimental.tasks.callToolStream() instead.
     */
    callTool(params: CallToolRequest['params'], resultSchema?: typeof CallToolResultSchema | typeof CompatibilityCallToolResultSchema, options?: RequestOptions): Promise<{
        [x: string]: unknown;
        content: ({
            type: "text";
            text: string;
            annotations?: {
                audience?: ("user" | "assistant")[] | undefined;
                priority?: number | undefined;
                lastModified?: string | undefined;
            } | undefined;
            _meta?: Record<string, unknown> | undefined;
        } | {
            type: "image";
            data: string;
            mimeType: string;
            annotations?: {
                audience?: ("user" | "assistant")[] | undefined;
                priority?: number | undefined;
                lastModified?: string | undefined;
            } | undefined;
            _meta?: Record<string, unknown> | undefined;
        } | {
            type: "audio";
            data: string;
            mimeType: string;
            annotations?: {
                audience?: ("user" | "assistant")[] | undefined;
                priority?: number | undefined;
                lastModified?: string | undefined;
            } | undefined;
            _meta?: Record<string, unknown> | undefined;
        } | {
            type: "resource";
            resource: {
                uri: string;
                text: string;
                mimeType?: string | undefined;
                _meta?: Record<string, unknown> | undefined;
            } | {
                uri: string;
                blob: string;
                mimeType?: string | undefined;
                _meta?: Record<string, unknown> | undefined;
            };
            annotations?: {
                audience?: ("user" | "assistant")[] | undefined;
                priority?: number | undefined;
                lastModified?: string | undefined;
            } | undefined;
            _meta?: Record<string, unknown> | undefined;
        } | {
            uri: string;
            name: string;
            type: "resource_link";
            description?: string | undefined;
            mimeType?: string | undefined;
            annotations?: {
                audience?: ("user" | "assistant")[] | undefined;
                priority?: number | undefined;
                lastModified?: string | undefined;
            } | undefined;
            _meta?: {
                [x: string]: unknown;
            } | undefined;
            icons?: {
                src: string;
                mimeType?: string | undefined;
                sizes?: string[] | undefined;
                theme?: "light" | "dark" | undefined;
            }[] | undefined;
            title?: string | undefined;
        })[];
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
        structuredContent?: Record<string, unknown> | undefined;
        isError?: boolean | undefined;
    } | {
        [x: string]: unknown;
        toolResult: unknown;
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
    }>;
    private isToolTask;
    /**
     * Check if a tool requires task-based execution.
     * Unlike isToolTask which includes 'optional' tools, this only checks for 'required'.
     */
    private isToolTaskRequired;
    /**
     * Cache validators for tool output schemas.
     * Called after listTools() to pre-compile validators for better performance.
     */
    private cacheToolMetadata;
    /**
     * Get cached validator for a tool
     */
    private getToolOutputValidator;
    listTools(params?: ListToolsRequest['params'], options?: RequestOptions): Promise<{
        [x: string]: unknown;
        tools: {
            inputSchema: {
                [x: string]: unknown;
                type: "object";
                properties?: Record<string, object> | undefined;
                required?: string[] | undefined;
            };
            name: string;
            description?: string | undefined;
            outputSchema?: {
                [x: string]: unknown;
                type: "object";
                properties?: Record<string, object> | undefined;
                required?: string[] | undefined;
            } | undefined;
            annotations?: {
                title?: string | undefined;
                readOnlyHint?: boolean | undefined;
                destructiveHint?: boolean | undefined;
                idempotentHint?: boolean | undefined;
                openWorldHint?: boolean | undefined;
            } | undefined;
            execution?: {
                taskSupport?: "optional" | "required" | "forbidden" | undefined;
            } | undefined;
            _meta?: Record<string, unknown> | undefined;
            icons?: {
                src: string;
                mimeType?: string | undefined;
                sizes?: string[] | undefined;
                theme?: "light" | "dark" | undefined;
            }[] | undefined;
            title?: string | undefined;
        }[];
        _meta?: {
            [x: string]: unknown;
            progressToken?: string | number | undefined;
            "io.modelcontextprotocol/related-task"?: {
                taskId: string;
            } | undefined;
        } | undefined;
        nextCursor?: string | undefined;
    }>;
    /**
     * Set up a single list changed handler.
     * @internal
     */
    private _setupListChangedHandler;
    sendRootsListChanged(): Promise<void>;
}
//# sourceMappingURL=index.d.ts.map
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/index.d.ts.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../../../src/client/index.ts"],"names":[],"mappings":"AAAA,OAAO,EAAqB,QAAQ,EAAE,KAAK,eAAe,EAAE,KAAK,cAAc,EAAE,MAAM,uBAAuB,CAAC;AAC/G,OAAO,KAAK,EAAE,SAAS,EAAE,MAAM,wBAAwB,CAAC;AAExD,OAAO,EACH,KAAK,eAAe,EACpB,oBAAoB,EACpB,KAAK,kBAAkB,EACvB,KAAK,kBAAkB,EACvB,KAAK,aAAa,EAClB,KAAK,YAAY,EACjB,KAAK,iCAAiC,EACtC,KAAK,eAAe,EAIpB,KAAK,gBAAgB,EAErB,KAAK,cAAc,EAGnB,KAAK,kBAAkB,EAEvB,KAAK,oBAAoB,EAEzB,KAAK,4BAA4B,EAEjC,KAAK,gBAAgB,EAErB,KAAK,YAAY,EAEjB,KAAK,mBAAmB,EAExB,KAAK,kBAAkB,EAEvB,KAAK,gBAAgB,EAErB,KAAK,kBAAkB,EAYvB,KAAK,mBAAmB,EACxB,KAAK,OAAO,EACZ,KAAK,YAAY,EACjB,KAAK,MAAM,EACd,MAAM,aAAa,CAAC;AAErB,OAAO,KAAK,EAAuC,mBAAmB,EAAE,MAAM,wBAAwB,CAAC;AACvG,OAAO,EACH,eAAe,EACf,YAAY,EAMf,MAAM,yBAAyB,CAAC;AACjC,OAAO,KAAK,EAAE,mBAAmB,EAAE,MAAM,uBAAuB,CAAC;AACjE,OAAO,EAAE,uBAAuB,EAAE,MAAM,iCAAiC,CAAC;AAiD1E;;;;;;;;;GASG;AACH,wBAAgB,4BAA4B,CAAC,YAAY,EAAE,kBAAkB,CAAC,aAAa,CAAC,GAAG;IAC3F,gBAAgB,EAAE,OAAO,CAAC;IAC1B,eAAe,EAAE,OAAO,CAAC;CAC5B,CAaA;AAED,MAAM,MAAM,aAAa,GAAG,eAAe,GAAG;IAC1C;;OAEG;IACH,YAAY,CAAC,EAAE,kBAAkB,CAAC;IAElC;;;;;;;;;;;;;;;;;;;;;;;;;;;;OA4BG;IACH,mBAAmB,CAAC,EAAE,mBAAmB,CAAC;IAE1C;;;;;;;;;;;;;;;;;;;;;;;;;OAyBG;IACH,WAAW,CAAC,EAAE,mBAAmB,CAAC;CACrC,CAAC;AAEF;;;;;;;;;;;;;;;;;;;;;;;;GAwBG;AACH,qBAAa,MAAM,CACf,QAAQ,SAAS,OAAO,GAAG,OAAO,EAClC,aAAa,SAAS,YAAY,GAAG,YAAY,EACjD,OAAO,SAAS,MAAM,GAAG,MAAM,CACjC,SAAQ,QAAQ,CAAC,aAAa,GAAG,QAAQ,EAAE,kBAAkB,GAAG,aAAa,EAAE,YAAY,GAAG,OAAO,CAAC;IAiBhG,OAAO,CAAC,WAAW;IAhBvB,OAAO,CAAC,mBAAmB,CAAC,CAAqB;IACjD,OAAO,CAAC,cAAc,CAAC,CAAiB;IACxC,OAAO,CAAC,aAAa,CAAqB;IAC1C,OAAO,CAAC,aAAa,CAAC,CAAS;IAC/B,OAAO,CAAC,oBAAoB,CAAsB;IAClD,OAAO,CAAC,2BAA2B,CAAwD;IAC3F,OAAO,CAAC,qBAAqB,CAA0B;IACvD,OAAO,CAAC,wBAAwB,CAA0B;IAC1D,OAAO,CAAC,aAAa,CAAC,CAAuE;IAC7F,OAAO,CAAC,0BAA0B,CAAyD;IAC3F,OAAO,CAAC,yBAAyB,CAAC,CAAsB;IAExD;;OAEG;gBAES,WAAW,EAAE,cAAc,EACnC,OAAO,CAAC,EAAE,aAAa;IAY3B;;;;;OAKG;IACH,OAAO,CAAC,yBAAyB;IAuBjC;;;;;;OAMG;IACH,IAAI,YAAY,IAAI;QAAE,KAAK,EAAE,uBAAuB,CAAC,QAAQ,EAAE,aAAa,EAAE,OAAO,
CAAC,CAAA;KAAE,CAOvF;IAED;;;;OAIG;IACI,oBAAoB,CAAC,YAAY,EAAE,kBAAkB,GAAG,IAAI;IAQnE;;OAEG;IACa,iBAAiB,CAAC,CAAC,SAAS,eAAe,EACvD,aAAa,EAAE,CAAC,EAChB,OAAO,EAAE,CACL,OAAO,EAAE,YAAY,CAAC,CAAC,CAAC,EACxB,KAAK,EAAE,mBAAmB,CAAC,aAAa,GAAG,QAAQ,EAAE,kBAAkB,GAAG,aAAa,CAAC,KACvF,YAAY,GAAG,OAAO,GAAG,OAAO,CAAC,YAAY,GAAG,OAAO,CAAC,GAC9D,IAAI;IA8IP,SAAS,CAAC,gBAAgB,CAAC,UAAU,EAAE,MAAM,kBAAkB,EAAE,MAAM,EAAE,MAAM,GAAG,IAAI;IAMvE,OAAO,CAAC,SAAS,EAAE,SAAS,EAAE,OAAO,CAAC,EAAE,cAAc,GAAG,OAAO,CAAC,IAAI,CAAC;IAsDrF;;OAEG;IACH,qBAAqB,IAAI,kBAAkB,GAAG,SAAS;IAIvD;;OAEG;IACH,gBAAgB,IAAI,cAAc,GAAG,SAAS;IAI9C;;OAEG;IACH,eAAe,IAAI,MAAM,GAAG,SAAS;IAIrC,SAAS,CAAC,yBAAyB,CAAC,MAAM,EAAE,QAAQ,CAAC,QAAQ,CAAC,GAAG,IAAI;IAqDrE,SAAS,CAAC,4BAA4B,CAAC,MAAM,EAAE,aAAa,CAAC,QAAQ,CAAC,GAAG,IAAI;IAsB7E,SAAS,CAAC,8BAA8B,CAAC,MAAM,EAAE,MAAM,GAAG,IAAI;IAyC9D,SAAS,CAAC,oBAAoB,CAAC,MAAM,EAAE,MAAM,GAAG,IAAI;IAIpD,SAAS,CAAC,2BAA2B,CAAC,MAAM,EAAE,MAAM,GAAG,IAAI;IAUrD,IAAI,CAAC,OAAO,CAAC,EAAE,cAAc;;;;;;;;;IAI7B,QAAQ,CAAC,MAAM,EAAE,eAAe,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAC,EAAE,cAAc;;;;;;;;;;;;;;;;IAIpE,eAAe,CAAC,KAAK,EAAE,YAAY,EAAE,OAAO,CAAC,EAAE,cAAc;;;;;;;;;IAI7D,SAAS,CAAC,MAAM,EAAE,gBAAgB,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAC,EAAE,cAAc;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;IAItE,WAAW,CAAC,MAAM,CAAC,EAAE,kBAAkB,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAC,EAAE,cAAc;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;IAI3E,aAAa,CAAC,MAAM,CAAC,EAAE,oBAAoB,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAC,EAAE,cAAc;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;IAI/E,qBAAqB,CAAC,MAAM,CAAC,EAAE,4BAA4B,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAC,EAAE,cAAc;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;IAI/F,YAAY,CAAC,MAAM,EAAE,mBAAmB,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAC,EAAE,cAAc;;;;;;;;;;;;;;;;;;;;;IAI5E,iBAAiB,CAAC,MAAM,EAAE,gBAAgB,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAC,EAAE,cAAc;;;;;;;;;IAI9E,mBAAmB,CAAC,MAAM,EAAE,kBAAkB,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAC,EAAE,cAAc;;;;;;;;;IAIxF;;;;OAIG;IACG,QAAQ,CACV,MAAM,EAAE,eAAe,CAAC,QAAQ,CAAC,EACjC,YAAY,GAAE,OAAO,oBAAoB,GAAG,OAAO,iCAAwD,EAC3G,OAAO,CAAC,EAAE,cAAc;;;;;;;;;;;;
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;IAkD5B,OAAO,CAAC,UAAU;IAQlB;;;OAGG;IACH,OAAO,CAAC,kBAAkB;IAI1B;;;OAGG;IACH,OAAO,CAAC,iBAAiB;IAuBzB;;OAEG;IACH,OAAO,CAAC,sBAAsB;IAIxB,SAAS,CAAC,MAAM,CAAC,EAAE,gBAAgB,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAC,EAAE,cAAc;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;IAS7E;;;OAGG;IACH,OAAO,CAAC,wBAAwB;IAwD1B,oBAAoB;CAG7B"}
629
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/index.js
generated
vendored
Normal file
@@ -0,0 +1,629 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.Client = void 0;
exports.getSupportedElicitationModes = getSupportedElicitationModes;
const protocol_js_1 = require("../shared/protocol.js");
const types_js_1 = require("../types.js");
const ajv_provider_js_1 = require("../validation/ajv-provider.js");
const zod_compat_js_1 = require("../server/zod-compat.js");
const client_js_1 = require("../experimental/tasks/client.js");
const helpers_js_1 = require("../experimental/tasks/helpers.js");
/**
 * Elicitation default application helper. Applies defaults to the data based on the schema.
 *
 * @param schema - The schema to apply defaults to.
 * @param data - The data to apply defaults to.
 */
function applyElicitationDefaults(schema, data) {
    if (!schema || data === null || typeof data !== 'object')
        return;
    // Handle object properties
    if (schema.type === 'object' && schema.properties && typeof schema.properties === 'object') {
        const obj = data;
        const props = schema.properties;
        for (const key of Object.keys(props)) {
            const propSchema = props[key];
            // If missing or explicitly undefined, apply default if present
            if (obj[key] === undefined && Object.prototype.hasOwnProperty.call(propSchema, 'default')) {
                obj[key] = propSchema.default;
            }
            // Recurse into existing nested objects/arrays
            if (obj[key] !== undefined) {
                applyElicitationDefaults(propSchema, obj[key]);
            }
        }
    }
    if (Array.isArray(schema.anyOf)) {
        for (const sub of schema.anyOf) {
            // Skip boolean schemas (true/false are valid JSON Schemas but have no defaults)
            if (typeof sub !== 'boolean') {
                applyElicitationDefaults(sub, data);
            }
        }
    }
    // Combine schemas
    if (Array.isArray(schema.oneOf)) {
        for (const sub of schema.oneOf) {
            // Skip boolean schemas (true/false are valid JSON Schemas but have no defaults)
            if (typeof sub !== 'boolean') {
                applyElicitationDefaults(sub, data);
            }
        }
    }
}
/**
 * Determines which elicitation modes are supported based on declared client capabilities.
 *
 * According to the spec:
 * - An empty elicitation capability object defaults to form mode support (backwards compatibility)
 * - URL mode is only supported if explicitly declared
 *
 * @param capabilities - The client's elicitation capabilities
 * @returns An object indicating which modes are supported
 */
function getSupportedElicitationModes(capabilities) {
    if (!capabilities) {
        return { supportsFormMode: false, supportsUrlMode: false };
    }
    const hasFormCapability = capabilities.form !== undefined;
    const hasUrlCapability = capabilities.url !== undefined;
    // If neither form nor url are explicitly declared, form mode is supported (backwards compatibility)
    const supportsFormMode = hasFormCapability || (!hasFormCapability && !hasUrlCapability);
    const supportsUrlMode = hasUrlCapability;
    return { supportsFormMode, supportsUrlMode };
}
/**
 * An MCP client on top of a pluggable transport.
 *
 * The client will automatically begin the initialization flow with the server when connect() is called.
 *
 * To use with custom types, extend the base Request/Notification/Result types and pass them as type parameters:
 *
 * ```typescript
 * // Custom schemas
 * const CustomRequestSchema = RequestSchema.extend({...})
 * const CustomNotificationSchema = NotificationSchema.extend({...})
 * const CustomResultSchema = ResultSchema.extend({...})
 *
 * // Type aliases
 * type CustomRequest = z.infer<typeof CustomRequestSchema>
 * type CustomNotification = z.infer<typeof CustomNotificationSchema>
 * type CustomResult = z.infer<typeof CustomResultSchema>
 *
 * // Create typed client
 * const client = new Client<CustomRequest, CustomNotification, CustomResult>({
 *     name: "CustomClient",
 *     version: "1.0.0"
 * })
 * ```
 */
class Client extends protocol_js_1.Protocol {
|
||||
/**
|
||||
* Initializes this client with the given name and version information.
|
||||
*/
|
||||
constructor(_clientInfo, options) {
|
||||
super(options);
|
||||
this._clientInfo = _clientInfo;
|
||||
this._cachedToolOutputValidators = new Map();
|
||||
this._cachedKnownTaskTools = new Set();
|
||||
this._cachedRequiredTaskTools = new Set();
|
||||
this._listChangedDebounceTimers = new Map();
|
||||
this._capabilities = options?.capabilities ?? {};
|
||||
this._jsonSchemaValidator = options?.jsonSchemaValidator ?? new ajv_provider_js_1.AjvJsonSchemaValidator();
|
||||
// Store list changed config for setup after connection (when we know server capabilities)
|
||||
if (options?.listChanged) {
|
||||
this._pendingListChangedConfig = options.listChanged;
|
||||
}
|
||||
}
|
||||
/**
|
||||
* Set up handlers for list changed notifications based on config and server capabilities.
|
||||
* This should only be called after initialization when server capabilities are known.
|
||||
* Handlers are silently skipped if the server doesn't advertise the corresponding listChanged capability.
|
||||
* @internal
|
||||
*/
|
||||
_setupListChangedHandlers(config) {
|
||||
if (config.tools && this._serverCapabilities?.tools?.listChanged) {
|
||||
this._setupListChangedHandler('tools', types_js_1.ToolListChangedNotificationSchema, config.tools, async () => {
|
||||
const result = await this.listTools();
|
||||
return result.tools;
|
||||
});
|
||||
}
|
||||
if (config.prompts && this._serverCapabilities?.prompts?.listChanged) {
|
||||
this._setupListChangedHandler('prompts', types_js_1.PromptListChangedNotificationSchema, config.prompts, async () => {
|
||||
const result = await this.listPrompts();
|
||||
return result.prompts;
|
||||
});
|
||||
}
|
||||
if (config.resources && this._serverCapabilities?.resources?.listChanged) {
|
||||
this._setupListChangedHandler('resources', types_js_1.ResourceListChangedNotificationSchema, config.resources, async () => {
|
||||
const result = await this.listResources();
|
||||
return result.resources;
|
||||
});
|
||||
}
|
||||
}
|
||||
/**
|
||||
* Access experimental features.
|
||||
*
|
||||
* WARNING: These APIs are experimental and may change without notice.
|
||||
*
|
||||
* @experimental
|
||||
*/
|
||||
get experimental() {
|
||||
if (!this._experimental) {
|
||||
this._experimental = {
|
||||
tasks: new client_js_1.ExperimentalClientTasks(this)
|
||||
};
|
||||
}
|
||||
return this._experimental;
|
||||
}
|
||||
/**
|
||||
* Registers new capabilities. This can only be called before connecting to a transport.
|
||||
*
|
||||
* The new capabilities will be merged with any existing capabilities previously given (e.g., at initialization).
|
||||
*/
|
||||
registerCapabilities(capabilities) {
|
||||
if (this.transport) {
|
||||
throw new Error('Cannot register capabilities after connecting to transport');
|
||||
}
|
||||
this._capabilities = (0, protocol_js_1.mergeCapabilities)(this._capabilities, capabilities);
|
||||
}
|
||||
    /**
     * Override request handler registration to enforce client-side validation for elicitation.
     */
    setRequestHandler(requestSchema, handler) {
        const shape = (0, zod_compat_js_1.getObjectShape)(requestSchema);
        const methodSchema = shape?.method;
        if (!methodSchema) {
            throw new Error('Schema is missing a method literal');
        }
        // Extract literal value using type-safe property access
        let methodValue;
        if ((0, zod_compat_js_1.isZ4Schema)(methodSchema)) {
            const v4Schema = methodSchema;
            const v4Def = v4Schema._zod?.def;
            methodValue = v4Def?.value ?? v4Schema.value;
        }
        else {
            const v3Schema = methodSchema;
            const legacyDef = v3Schema._def;
            methodValue = legacyDef?.value ?? v3Schema.value;
        }
        if (typeof methodValue !== 'string') {
            throw new Error('Schema method literal must be a string');
        }
        const method = methodValue;
        if (method === 'elicitation/create') {
            const wrappedHandler = async (request, extra) => {
                const validatedRequest = (0, zod_compat_js_1.safeParse)(types_js_1.ElicitRequestSchema, request);
                if (!validatedRequest.success) {
                    // Type guard: if success is false, error is guaranteed to exist
                    const errorMessage = validatedRequest.error instanceof Error ? validatedRequest.error.message : String(validatedRequest.error);
                    throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidParams, `Invalid elicitation request: ${errorMessage}`);
                }
                const { params } = validatedRequest.data;
                params.mode = params.mode ?? 'form';
                const { supportsFormMode, supportsUrlMode } = getSupportedElicitationModes(this._capabilities.elicitation);
                if (params.mode === 'form' && !supportsFormMode) {
                    throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidParams, 'Client does not support form-mode elicitation requests');
                }
                if (params.mode === 'url' && !supportsUrlMode) {
                    throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidParams, 'Client does not support URL-mode elicitation requests');
                }
                const result = await Promise.resolve(handler(request, extra));
                // When task creation is requested, validate and return CreateTaskResult
                if (params.task) {
                    const taskValidationResult = (0, zod_compat_js_1.safeParse)(types_js_1.CreateTaskResultSchema, result);
                    if (!taskValidationResult.success) {
                        const errorMessage = taskValidationResult.error instanceof Error
                            ? taskValidationResult.error.message
                            : String(taskValidationResult.error);
                        throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidParams, `Invalid task creation result: ${errorMessage}`);
                    }
                    return taskValidationResult.data;
                }
                // For non-task requests, validate against ElicitResultSchema
                const validationResult = (0, zod_compat_js_1.safeParse)(types_js_1.ElicitResultSchema, result);
                if (!validationResult.success) {
                    // Type guard: if success is false, error is guaranteed to exist
                    const errorMessage = validationResult.error instanceof Error ? validationResult.error.message : String(validationResult.error);
                    throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidParams, `Invalid elicitation result: ${errorMessage}`);
                }
                const validatedResult = validationResult.data;
                const requestedSchema = params.mode === 'form' ? params.requestedSchema : undefined;
                if (params.mode === 'form' && validatedResult.action === 'accept' && validatedResult.content && requestedSchema) {
                    if (this._capabilities.elicitation?.form?.applyDefaults) {
                        try {
                            applyElicitationDefaults(requestedSchema, validatedResult.content);
                        }
                        catch {
                            // gracefully ignore errors in default application
                        }
                    }
                }
                return validatedResult;
            };
            // Install the wrapped handler
            return super.setRequestHandler(requestSchema, wrappedHandler);
        }
        if (method === 'sampling/createMessage') {
            const wrappedHandler = async (request, extra) => {
                const validatedRequest = (0, zod_compat_js_1.safeParse)(types_js_1.CreateMessageRequestSchema, request);
                if (!validatedRequest.success) {
                    const errorMessage = validatedRequest.error instanceof Error ? validatedRequest.error.message : String(validatedRequest.error);
                    throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidParams, `Invalid sampling request: ${errorMessage}`);
                }
                const { params } = validatedRequest.data;
                const result = await Promise.resolve(handler(request, extra));
                // When task creation is requested, validate and return CreateTaskResult
                if (params.task) {
                    const taskValidationResult = (0, zod_compat_js_1.safeParse)(types_js_1.CreateTaskResultSchema, result);
                    if (!taskValidationResult.success) {
                        const errorMessage = taskValidationResult.error instanceof Error
                            ? taskValidationResult.error.message
                            : String(taskValidationResult.error);
                        throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidParams, `Invalid task creation result: ${errorMessage}`);
                    }
                    return taskValidationResult.data;
                }
                // For non-task requests, validate against appropriate schema based on tools presence
                const hasTools = params.tools || params.toolChoice;
                const resultSchema = hasTools ? types_js_1.CreateMessageResultWithToolsSchema : types_js_1.CreateMessageResultSchema;
                const validationResult = (0, zod_compat_js_1.safeParse)(resultSchema, result);
                if (!validationResult.success) {
                    const errorMessage = validationResult.error instanceof Error ? validationResult.error.message : String(validationResult.error);
                    throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidParams, `Invalid sampling result: ${errorMessage}`);
                }
                return validationResult.data;
            };
            // Install the wrapped handler
            return super.setRequestHandler(requestSchema, wrappedHandler);
        }
        // Other handlers use default behavior
        return super.setRequestHandler(requestSchema, handler);
    }
    assertCapability(capability, method) {
        if (!this._serverCapabilities?.[capability]) {
            throw new Error(`Server does not support ${capability} (required for ${method})`);
        }
    }
    async connect(transport, options) {
        await super.connect(transport);
        // When transport sessionId is already set this means we are trying to reconnect.
        // In this case we don't need to initialize again.
        if (transport.sessionId !== undefined) {
            return;
        }
        try {
            const result = await this.request({
                method: 'initialize',
                params: {
                    protocolVersion: types_js_1.LATEST_PROTOCOL_VERSION,
                    capabilities: this._capabilities,
                    clientInfo: this._clientInfo
                }
            }, types_js_1.InitializeResultSchema, options);
            if (result === undefined) {
                throw new Error(`Server sent invalid initialize result: ${result}`);
            }
            if (!types_js_1.SUPPORTED_PROTOCOL_VERSIONS.includes(result.protocolVersion)) {
                throw new Error(`Server's protocol version is not supported: ${result.protocolVersion}`);
            }
            this._serverCapabilities = result.capabilities;
            this._serverVersion = result.serverInfo;
            // HTTP transports must set the protocol version in each header after initialization.
            if (transport.setProtocolVersion) {
                transport.setProtocolVersion(result.protocolVersion);
            }
            this._instructions = result.instructions;
            await this.notification({
                method: 'notifications/initialized'
            });
            // Set up list changed handlers now that we know server capabilities
            if (this._pendingListChangedConfig) {
                this._setupListChangedHandlers(this._pendingListChangedConfig);
                this._pendingListChangedConfig = undefined;
            }
        }
        catch (error) {
            // Disconnect if initialization fails.
            void this.close();
            throw error;
        }
    }
    /**
     * After initialization has completed, this will be populated with the server's reported capabilities.
     */
    getServerCapabilities() {
        return this._serverCapabilities;
    }
    /**
     * After initialization has completed, this will be populated with information about the server's name and version.
     */
    getServerVersion() {
        return this._serverVersion;
    }
    /**
     * After initialization has completed, this may be populated with information about the server's instructions.
     */
    getInstructions() {
        return this._instructions;
    }
    assertCapabilityForMethod(method) {
        switch (method) {
            case 'logging/setLevel':
                if (!this._serverCapabilities?.logging) {
                    throw new Error(`Server does not support logging (required for ${method})`);
                }
                break;
            case 'prompts/get':
            case 'prompts/list':
                if (!this._serverCapabilities?.prompts) {
                    throw new Error(`Server does not support prompts (required for ${method})`);
                }
                break;
            case 'resources/list':
            case 'resources/templates/list':
            case 'resources/read':
            case 'resources/subscribe':
            case 'resources/unsubscribe':
                if (!this._serverCapabilities?.resources) {
                    throw new Error(`Server does not support resources (required for ${method})`);
                }
                if (method === 'resources/subscribe' && !this._serverCapabilities.resources.subscribe) {
                    throw new Error(`Server does not support resource subscriptions (required for ${method})`);
                }
                break;
            case 'tools/call':
            case 'tools/list':
                if (!this._serverCapabilities?.tools) {
                    throw new Error(`Server does not support tools (required for ${method})`);
                }
                break;
            case 'completion/complete':
                if (!this._serverCapabilities?.completions) {
                    throw new Error(`Server does not support completions (required for ${method})`);
                }
                break;
            case 'initialize':
                // No specific capability required for initialize
                break;
            case 'ping':
                // No specific capability required for ping
                break;
        }
    }
    assertNotificationCapability(method) {
        switch (method) {
            case 'notifications/roots/list_changed':
                if (!this._capabilities.roots?.listChanged) {
                    throw new Error(`Client does not support roots list changed notifications (required for ${method})`);
                }
                break;
            case 'notifications/initialized':
                // No specific capability required for initialized
                break;
            case 'notifications/cancelled':
                // Cancellation notifications are always allowed
                break;
            case 'notifications/progress':
                // Progress notifications are always allowed
                break;
        }
    }
    assertRequestHandlerCapability(method) {
        // Task handlers are registered in Protocol constructor before _capabilities is initialized
        // Skip capability check for task methods during initialization
        if (!this._capabilities) {
            return;
        }
        switch (method) {
            case 'sampling/createMessage':
                if (!this._capabilities.sampling) {
                    throw new Error(`Client does not support sampling capability (required for ${method})`);
                }
                break;
            case 'elicitation/create':
                if (!this._capabilities.elicitation) {
                    throw new Error(`Client does not support elicitation capability (required for ${method})`);
                }
                break;
            case 'roots/list':
                if (!this._capabilities.roots) {
                    throw new Error(`Client does not support roots capability (required for ${method})`);
                }
                break;
            case 'tasks/get':
            case 'tasks/list':
            case 'tasks/result':
            case 'tasks/cancel':
                if (!this._capabilities.tasks) {
                    throw new Error(`Client does not support tasks capability (required for ${method})`);
                }
                break;
            case 'ping':
                // No specific capability required for ping
                break;
        }
    }
    assertTaskCapability(method) {
        (0, helpers_js_1.assertToolsCallTaskCapability)(this._serverCapabilities?.tasks?.requests, method, 'Server');
    }
    assertTaskHandlerCapability(method) {
        // Task handlers are registered in Protocol constructor before _capabilities is initialized
        // Skip capability check for task methods during initialization
        if (!this._capabilities) {
            return;
        }
        (0, helpers_js_1.assertClientRequestTaskCapability)(this._capabilities.tasks?.requests, method, 'Client');
    }
    async ping(options) {
        return this.request({ method: 'ping' }, types_js_1.EmptyResultSchema, options);
    }
    async complete(params, options) {
        return this.request({ method: 'completion/complete', params }, types_js_1.CompleteResultSchema, options);
    }
    async setLoggingLevel(level, options) {
        return this.request({ method: 'logging/setLevel', params: { level } }, types_js_1.EmptyResultSchema, options);
    }
    async getPrompt(params, options) {
        return this.request({ method: 'prompts/get', params }, types_js_1.GetPromptResultSchema, options);
    }
    async listPrompts(params, options) {
        return this.request({ method: 'prompts/list', params }, types_js_1.ListPromptsResultSchema, options);
    }
    async listResources(params, options) {
        return this.request({ method: 'resources/list', params }, types_js_1.ListResourcesResultSchema, options);
    }
    async listResourceTemplates(params, options) {
        return this.request({ method: 'resources/templates/list', params }, types_js_1.ListResourceTemplatesResultSchema, options);
    }
    async readResource(params, options) {
        return this.request({ method: 'resources/read', params }, types_js_1.ReadResourceResultSchema, options);
    }
    async subscribeResource(params, options) {
        return this.request({ method: 'resources/subscribe', params }, types_js_1.EmptyResultSchema, options);
    }
    async unsubscribeResource(params, options) {
        return this.request({ method: 'resources/unsubscribe', params }, types_js_1.EmptyResultSchema, options);
    }
    /**
     * Calls a tool and waits for the result. Automatically validates structured output if the tool has an outputSchema.
     *
     * For task-based execution with streaming behavior, use client.experimental.tasks.callToolStream() instead.
     */
    async callTool(params, resultSchema = types_js_1.CallToolResultSchema, options) {
        // Guard: required-task tools need experimental API
        if (this.isToolTaskRequired(params.name)) {
            throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidRequest, `Tool "${params.name}" requires task-based execution. Use client.experimental.tasks.callToolStream() instead.`);
        }
        const result = await this.request({ method: 'tools/call', params }, resultSchema, options);
        // Check if the tool has an outputSchema
        const validator = this.getToolOutputValidator(params.name);
        if (validator) {
            // If tool has outputSchema, it MUST return structuredContent (unless it's an error)
            if (!result.structuredContent && !result.isError) {
                throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidRequest, `Tool ${params.name} has an output schema but did not return structured content`);
            }
            // Only validate structured content if present (not when there's an error)
            if (result.structuredContent) {
                try {
                    // Validate the structured content against the schema
                    const validationResult = validator(result.structuredContent);
                    if (!validationResult.valid) {
                        throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidParams, `Structured content does not match the tool's output schema: ${validationResult.errorMessage}`);
                    }
                }
                catch (error) {
                    if (error instanceof types_js_1.McpError) {
                        throw error;
                    }
                    throw new types_js_1.McpError(types_js_1.ErrorCode.InvalidParams, `Failed to validate structured content: ${error instanceof Error ? error.message : String(error)}`);
                }
            }
        }
        return result;
    }
    isToolTask(toolName) {
        if (!this._serverCapabilities?.tasks?.requests?.tools?.call) {
            return false;
        }
        return this._cachedKnownTaskTools.has(toolName);
    }
    /**
     * Check if a tool requires task-based execution.
     * Unlike isToolTask which includes 'optional' tools, this only checks for 'required'.
     */
    isToolTaskRequired(toolName) {
        return this._cachedRequiredTaskTools.has(toolName);
    }
    /**
     * Cache validators for tool output schemas.
     * Called after listTools() to pre-compile validators for better performance.
     */
    cacheToolMetadata(tools) {
        this._cachedToolOutputValidators.clear();
        this._cachedKnownTaskTools.clear();
        this._cachedRequiredTaskTools.clear();
        for (const tool of tools) {
            // If the tool has an outputSchema, create and cache the validator
            if (tool.outputSchema) {
                const toolValidator = this._jsonSchemaValidator.getValidator(tool.outputSchema);
                this._cachedToolOutputValidators.set(tool.name, toolValidator);
            }
            // If the tool supports task-based execution, cache that information
            const taskSupport = tool.execution?.taskSupport;
            if (taskSupport === 'required' || taskSupport === 'optional') {
                this._cachedKnownTaskTools.add(tool.name);
            }
            if (taskSupport === 'required') {
                this._cachedRequiredTaskTools.add(tool.name);
            }
        }
    }
    /**
     * Get cached validator for a tool
     */
    getToolOutputValidator(toolName) {
        return this._cachedToolOutputValidators.get(toolName);
    }
    async listTools(params, options) {
        const result = await this.request({ method: 'tools/list', params }, types_js_1.ListToolsResultSchema, options);
        // Cache the tools and their output schemas for future validation
        this.cacheToolMetadata(result.tools);
        return result;
    }
    /**
     * Set up a single list changed handler.
     * @internal
     */
    _setupListChangedHandler(listType, notificationSchema, options, fetcher) {
        // Validate options using Zod schema (validates autoRefresh and debounceMs)
        const parseResult = types_js_1.ListChangedOptionsBaseSchema.safeParse(options);
        if (!parseResult.success) {
            throw new Error(`Invalid ${listType} listChanged options: ${parseResult.error.message}`);
        }
        // Validate callback
        if (typeof options.onChanged !== 'function') {
            throw new Error(`Invalid ${listType} listChanged options: onChanged must be a function`);
        }
        const { autoRefresh, debounceMs } = parseResult.data;
        const { onChanged } = options;
        const refresh = async () => {
            if (!autoRefresh) {
                onChanged(null, null);
                return;
            }
            try {
                const items = await fetcher();
                onChanged(null, items);
            }
            catch (e) {
                const error = e instanceof Error ? e : new Error(String(e));
                onChanged(error, null);
            }
        };
        const handler = () => {
            if (debounceMs) {
                // Clear any pending debounce timer for this list type
                const existingTimer = this._listChangedDebounceTimers.get(listType);
                if (existingTimer) {
                    clearTimeout(existingTimer);
                }
                // Set up debounced refresh
                const timer = setTimeout(refresh, debounceMs);
                this._listChangedDebounceTimers.set(listType, timer);
            }
            else {
                // No debounce, refresh immediately
                refresh();
            }
        };
        // Register notification handler
        this.setNotificationHandler(notificationSchema, handler);
    }
    async sendRootsListChanged() {
        return this.notification({ method: 'notifications/roots/list_changed' });
    }
}
exports.Client = Client;
//# sourceMappingURL=index.js.map
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/index.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
169
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/middleware.d.ts
generated
vendored
Normal file
@@ -0,0 +1,169 @@
import { OAuthClientProvider } from './auth.js';
import { FetchLike } from '../shared/transport.js';
/**
 * Middleware function that wraps and enhances fetch functionality.
 * Takes a fetch handler and returns an enhanced fetch handler.
 */
export type Middleware = (next: FetchLike) => FetchLike;
/**
 * Creates a fetch wrapper that handles OAuth authentication automatically.
 *
 * This wrapper will:
 * - Add Authorization headers with access tokens
 * - Handle 401 responses by attempting re-authentication
 * - Retry the original request after successful auth
 * - Handle OAuth errors appropriately (InvalidClientError, etc.)
 *
 * The baseUrl parameter is optional and defaults to using the domain from the request URL.
 * However, you should explicitly provide baseUrl when:
 * - Making requests to multiple subdomains (e.g., api.example.com, cdn.example.com)
 * - Using API paths that differ from OAuth discovery paths (e.g., requesting /api/v1/data but OAuth is at /)
 * - The OAuth server is on a different domain than your API requests
 * - You want to ensure consistent OAuth behavior regardless of request URLs
 *
 * For MCP transports, set baseUrl to the same URL you pass to the transport constructor.
 *
 * Note: This wrapper is designed for general-purpose fetch operations.
 * MCP transports (SSE and StreamableHTTP) already have built-in OAuth handling
 * and should not need this wrapper.
 *
 * @param provider - OAuth client provider for authentication
 * @param baseUrl - Base URL for OAuth server discovery (defaults to request URL domain)
 * @returns A fetch middleware function
 */
export declare const withOAuth: (provider: OAuthClientProvider, baseUrl?: string | URL) => Middleware;
/**
 * Logger function type for HTTP requests
 */
export type RequestLogger = (input: {
    method: string;
    url: string | URL;
    status: number;
    statusText: string;
    duration: number;
    requestHeaders?: Headers;
    responseHeaders?: Headers;
    error?: Error;
}) => void;
/**
 * Configuration options for the logging middleware
 */
export type LoggingOptions = {
    /**
     * Custom logger function, defaults to console logging
     */
    logger?: RequestLogger;
    /**
     * Whether to include request headers in logs
     * @default false
     */
    includeRequestHeaders?: boolean;
    /**
     * Whether to include response headers in logs
     * @default false
     */
    includeResponseHeaders?: boolean;
    /**
     * Status level filter - only log requests with status >= this value
     * Set to 0 to log all requests, 400 to log only errors
     * @default 0
     */
    statusLevel?: number;
};
/**
 * Creates a fetch middleware that logs HTTP requests and responses.
 *
 * When called without arguments `withLogging()`, it uses the default logger that:
 * - Logs successful requests (2xx) to `console.log`
 * - Logs error responses (4xx/5xx) and network errors to `console.error`
 * - Logs all requests regardless of status (statusLevel: 0)
 * - Does not include request or response headers in logs
 * - Measures and displays request duration in milliseconds
 *
 * Important: the default logger uses both `console.log` and `console.error` so it should not be used with
 * `stdio` transports and applications.
 *
 * @param options - Logging configuration options
 * @returns A fetch middleware function
 */
export declare const withLogging: (options?: LoggingOptions) => Middleware;
/**
 * Composes multiple fetch middleware functions into a single middleware pipeline.
 * Middleware are applied in the order they appear, creating a chain of handlers.
 *
 * @example
 * ```typescript
 * // Create a middleware pipeline that handles both OAuth and logging
 * const enhancedFetch = applyMiddlewares(
 *     withOAuth(oauthProvider, 'https://api.example.com'),
 *     withLogging({ statusLevel: 400 })
 * )(fetch);
 *
 * // Use the enhanced fetch - it will handle auth and log errors
 * const response = await enhancedFetch('https://api.example.com/data');
 * ```
 *
 * @param middleware - Array of fetch middleware to compose into a pipeline
 * @returns A single composed middleware function
 */
export declare const applyMiddlewares: (...middleware: Middleware[]) => Middleware;
/**
 * Helper function to create custom fetch middleware with cleaner syntax.
 * Provides the next handler and request details as separate parameters for easier access.
 *
 * @example
 * ```typescript
 * // Create custom authentication middleware
 * const customAuthMiddleware = createMiddleware(async (next, input, init) => {
 *     const headers = new Headers(init?.headers);
 *     headers.set('X-Custom-Auth', 'my-token');
 *
 *     const response = await next(input, { ...init, headers });
 *
 *     if (response.status === 401) {
 *         console.log('Authentication failed');
 *     }
 *
 *     return response;
 * });
 *
 * // Create conditional middleware
 * const conditionalMiddleware = createMiddleware(async (next, input, init) => {
 *     const url = typeof input === 'string' ? input : input.toString();
 *
 *     // Only add headers for API routes
 *     if (url.includes('/api/')) {
 *         const headers = new Headers(init?.headers);
 *         headers.set('X-API-Version', 'v2');
 *         return next(input, { ...init, headers });
 *     }
 *
 *     // Pass through for non-API routes
 *     return next(input, init);
 * });
 *
 * // Create caching middleware
 * const cacheMiddleware = createMiddleware(async (next, input, init) => {
 *     const cacheKey = typeof input === 'string' ? input : input.toString();
 *
 *     // Check cache first
 *     const cached = await getFromCache(cacheKey);
 *     if (cached) {
 *         return new Response(cached, { status: 200 });
 *     }
 *
 *     // Make request and cache result
 *     const response = await next(input, init);
 *     if (response.ok) {
 *         await saveToCache(cacheKey, await response.clone().text());
 *     }
 *
 *     return response;
 * });
 * ```
 *
 * @param handler - Function that receives the next handler and request parameters
 * @returns A fetch middleware function
 */
export declare const createMiddleware: (handler: (next: FetchLike, input: string | URL, init?: RequestInit) => Promise<Response>) => Middleware;
//# sourceMappingURL=middleware.d.ts.map
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/middleware.d.ts.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"middleware.d.ts","sourceRoot":"","sources":["../../../src/client/middleware.ts"],"names":[],"mappings":"AAAA,OAAO,EAAsC,mBAAmB,EAAqB,MAAM,WAAW,CAAC;AACvG,OAAO,EAAE,SAAS,EAAE,MAAM,wBAAwB,CAAC;AAEnD;;;GAGG;AACH,MAAM,MAAM,UAAU,GAAG,CAAC,IAAI,EAAE,SAAS,KAAK,SAAS,CAAC;AAExD;;;;;;;;;;;;;;;;;;;;;;;;;GAyBG;AACH,eAAO,MAAM,SAAS,aACP,mBAAmB,YAAY,MAAM,GAAG,GAAG,KAAG,UA0DxD,CAAC;AAEN;;GAEG;AACH,MAAM,MAAM,aAAa,GAAG,CAAC,KAAK,EAAE;IAChC,MAAM,EAAE,MAAM,CAAC;IACf,GAAG,EAAE,MAAM,GAAG,GAAG,CAAC;IAClB,MAAM,EAAE,MAAM,CAAC;IACf,UAAU,EAAE,MAAM,CAAC;IACnB,QAAQ,EAAE,MAAM,CAAC;IACjB,cAAc,CAAC,EAAE,OAAO,CAAC;IACzB,eAAe,CAAC,EAAE,OAAO,CAAC;IAC1B,KAAK,CAAC,EAAE,KAAK,CAAC;CACjB,KAAK,IAAI,CAAC;AAEX;;GAEG;AACH,MAAM,MAAM,cAAc,GAAG;IACzB;;OAEG;IACH,MAAM,CAAC,EAAE,aAAa,CAAC;IAEvB;;;OAGG;IACH,qBAAqB,CAAC,EAAE,OAAO,CAAC;IAEhC;;;OAGG;IACH,sBAAsB,CAAC,EAAE,OAAO,CAAC;IAEjC;;;;OAIG;IACH,WAAW,CAAC,EAAE,MAAM,CAAC;CACxB,CAAC;AAEF;;;;;;;;;;;;;;;GAeG;AACH,eAAO,MAAM,WAAW,aAAa,cAAc,KAAQ,UA6E1D,CAAC;AAEF;;;;;;;;;;;;;;;;;;GAkBG;AACH,eAAO,MAAM,gBAAgB,kBAAmB,UAAU,EAAE,KAAG,UAI9D,CAAC;AAEF;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;GAyDG;AACH,eAAO,MAAM,gBAAgB,YAAa,CAAC,IAAI,EAAE,SAAS,EAAE,KAAK,EAAE,MAAM,GAAG,GAAG,EAAE,IAAI,CAAC,EAAE,WAAW,KAAK,OAAO,CAAC,QAAQ,CAAC,KAAG,UAE3H,CAAC"}
252
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/middleware.js
generated
vendored
Normal file
@@ -0,0 +1,252 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.createMiddleware = exports.applyMiddlewares = exports.withLogging = exports.withOAuth = void 0;
const auth_js_1 = require("./auth.js");
/**
 * Creates a fetch wrapper that handles OAuth authentication automatically.
 *
 * This wrapper will:
 * - Add Authorization headers with access tokens
 * - Handle 401 responses by attempting re-authentication
 * - Retry the original request after successful auth
 * - Handle OAuth errors appropriately (InvalidClientError, etc.)
 *
 * The baseUrl parameter is optional and defaults to using the domain from the request URL.
 * However, you should explicitly provide baseUrl when:
 * - Making requests to multiple subdomains (e.g., api.example.com, cdn.example.com)
 * - Using API paths that differ from OAuth discovery paths (e.g., requesting /api/v1/data but OAuth is at /)
 * - The OAuth server is on a different domain than your API requests
 * - You want to ensure consistent OAuth behavior regardless of request URLs
 *
 * For MCP transports, set baseUrl to the same URL you pass to the transport constructor.
 *
 * Note: This wrapper is designed for general-purpose fetch operations.
 * MCP transports (SSE and StreamableHTTP) already have built-in OAuth handling
 * and should not need this wrapper.
 *
 * @param provider - OAuth client provider for authentication
 * @param baseUrl - Base URL for OAuth server discovery (defaults to request URL domain)
 * @returns A fetch middleware function
 */
const withOAuth = (provider, baseUrl) => next => {
    return async (input, init) => {
        const makeRequest = async () => {
            const headers = new Headers(init?.headers);
            // Add authorization header if tokens are available
            const tokens = await provider.tokens();
            if (tokens) {
                headers.set('Authorization', `Bearer ${tokens.access_token}`);
            }
            return await next(input, { ...init, headers });
        };
        let response = await makeRequest();
        // Handle 401 responses by attempting re-authentication
        if (response.status === 401) {
            try {
                const { resourceMetadataUrl, scope } = (0, auth_js_1.extractWWWAuthenticateParams)(response);
                // Use provided baseUrl or extract from request URL
                const serverUrl = baseUrl || (typeof input === 'string' ? new URL(input).origin : input.origin);
                const result = await (0, auth_js_1.auth)(provider, {
                    serverUrl,
                    resourceMetadataUrl,
                    scope,
                    fetchFn: next
                });
                if (result === 'REDIRECT') {
                    throw new auth_js_1.UnauthorizedError('Authentication requires user authorization - redirect initiated');
                }
                if (result !== 'AUTHORIZED') {
                    throw new auth_js_1.UnauthorizedError(`Authentication failed with result: ${result}`);
                }
                // Retry the request with fresh tokens
                response = await makeRequest();
            }
            catch (error) {
                if (error instanceof auth_js_1.UnauthorizedError) {
                    throw error;
                }
                throw new auth_js_1.UnauthorizedError(`Failed to re-authenticate: ${error instanceof Error ? error.message : String(error)}`);
            }
        }
        // If we still have a 401 after re-auth attempt, throw an error
        if (response.status === 401) {
            const url = typeof input === 'string' ? input : input.toString();
            throw new auth_js_1.UnauthorizedError(`Authentication failed for ${url}`);
        }
        return response;
    };
};
exports.withOAuth = withOAuth;
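The 401-handling flow implemented above (request, re-authenticate on 401, retry once, then fail hard) can be exercised standalone. The sketch below is illustrative only: `stubAuth`, `fakeFetch`, and `currentToken` are hypothetical stand-ins for the SDK's `auth()` flow, token storage, and a real endpoint.

```javascript
// Standalone sketch of the withOAuth 401-retry pattern (all names hypothetical).
let currentToken = 'expired';
const stubAuth = async () => { currentToken = 'fresh'; return 'AUTHORIZED'; };

// Fake endpoint: rejects the expired token with 401, accepts the fresh one.
const fakeFetch = async (input, init) =>
    init.headers.Authorization === 'Bearer fresh' ? { status: 200 } : { status: 401 };

const fetchWithRetry = async (input, init) => {
    const makeRequest = () =>
        fakeFetch(input, { ...init, headers: { Authorization: `Bearer ${currentToken}` } });
    let response = await makeRequest();
    if (response.status === 401) {
        const result = await stubAuth();        // re-authenticate
        if (result !== 'AUTHORIZED') throw new Error(`auth failed: ${result}`);
        response = await makeRequest();         // retry once with the fresh token
    }
    if (response.status === 401) throw new Error('still unauthorized');
    return response;
};

fetchWithRetry('https://api.example.com/data', {}).then(r => console.log(r.status));
```

The single retry mirrors the real middleware: a second 401 after re-authentication is treated as a hard failure rather than looping.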
/**
 * Creates a fetch middleware that logs HTTP requests and responses.
 *
 * When called without arguments `withLogging()`, it uses the default logger that:
 * - Logs successful requests (2xx) to `console.log`
 * - Logs error responses (4xx/5xx) and network errors to `console.error`
 * - Logs all requests regardless of status (statusLevel: 0)
 * - Does not include request or response headers in logs
 * - Measures and displays request duration in milliseconds
 *
 * Important: the default logger uses both `console.log` and `console.error` so it should not be used with
 * `stdio` transports and applications.
 *
 * @param options - Logging configuration options
 * @returns A fetch middleware function
 */
const withLogging = (options = {}) => {
    const { logger, includeRequestHeaders = false, includeResponseHeaders = false, statusLevel = 0 } = options;
    const defaultLogger = input => {
        const { method, url, status, statusText, duration, requestHeaders, responseHeaders, error } = input;
        let message = error
            ? `HTTP ${method} ${url} failed: ${error.message} (${duration}ms)`
            : `HTTP ${method} ${url} ${status} ${statusText} (${duration}ms)`;
        // Add headers to message if requested
        if (includeRequestHeaders && requestHeaders) {
            const reqHeaders = Array.from(requestHeaders.entries())
                .map(([key, value]) => `${key}: ${value}`)
                .join(', ');
            message += `\n Request Headers: {${reqHeaders}}`;
        }
        if (includeResponseHeaders && responseHeaders) {
            const resHeaders = Array.from(responseHeaders.entries())
                .map(([key, value]) => `${key}: ${value}`)
                .join(', ');
            message += `\n Response Headers: {${resHeaders}}`;
        }
        if (error || status >= 400) {
            // eslint-disable-next-line no-console
            console.error(message);
        }
        else {
            // eslint-disable-next-line no-console
            console.log(message);
        }
    };
    const logFn = logger || defaultLogger;
    return next => async (input, init) => {
        const startTime = performance.now();
        const method = init?.method || 'GET';
        const url = typeof input === 'string' ? input : input.toString();
        const requestHeaders = includeRequestHeaders ? new Headers(init?.headers) : undefined;
        try {
            const response = await next(input, init);
            const duration = performance.now() - startTime;
            // Only log if status meets the log level threshold
            if (response.status >= statusLevel) {
                logFn({
                    method,
                    url,
                    status: response.status,
                    statusText: response.statusText,
                    duration,
                    requestHeaders,
                    responseHeaders: includeResponseHeaders ? response.headers : undefined
                });
            }
            return response;
        }
        catch (error) {
            const duration = performance.now() - startTime;
            // Always log errors regardless of log level
            logFn({
                method,
                url,
                status: 0,
                statusText: 'Network Error',
                duration,
                requestHeaders,
                error: error
            });
            throw error;
        }
    };
};
exports.withLogging = withLogging;
/**
 * Composes multiple fetch middleware functions into a single middleware pipeline.
 * Middleware are applied in the order they appear, creating a chain of handlers.
 *
 * @example
 * ```typescript
 * // Create a middleware pipeline that handles both OAuth and logging
 * const enhancedFetch = applyMiddlewares(
 *     withOAuth(oauthProvider, 'https://api.example.com'),
 *     withLogging({ statusLevel: 400 })
 * )(fetch);
 *
 * // Use the enhanced fetch - it will handle auth and log errors
 * const response = await enhancedFetch('https://api.example.com/data');
 * ```
 *
 * @param middleware - Array of fetch middleware to compose into a pipeline
 * @returns A single composed middleware function
 */
const applyMiddlewares = (...middleware) => {
    return next => {
        return middleware.reduce((handler, mw) => mw(handler), next);
    };
};
exports.applyMiddlewares = applyMiddlewares;
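The reduce-based composition above can be exercised without the SDK. In this sketch, `withHeader` and `fakeFetch` are hypothetical illustrations (not part of the module) used to show that every middleware in the pipeline runs before the terminal fetch.

```javascript
// Minimal sketch of the middleware composition mechanism (names hypothetical).
const applyMiddlewares = (...middleware) => next =>
    middleware.reduce((handler, mw) => mw(handler), next);

// A toy middleware that attaches one header before delegating to `next`.
const withHeader = (name, value) => next => async (input, init) => {
    const headers = { ...(init && init.headers), [name]: value };
    return next(input, { ...init, headers });
};

// A fake terminal fetch that just echoes what it received.
const fakeFetch = async (input, init) => ({ url: input, headers: init.headers });

const enhanced = applyMiddlewares(
    withHeader('Authorization', 'Bearer token'),
    withHeader('X-API-Version', 'v2')
)(fakeFetch);

// Both headers are attached by the time the terminal fetch runs.
enhanced('https://api.example.com/data', {}).then(r => console.log(JSON.stringify(r)));
```

Note the reduction wraps the first-listed middleware closest to the terminal fetch, so the last-listed one sits outermost and sees the request first.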
/**
 * Helper function to create custom fetch middleware with cleaner syntax.
 * Provides the next handler and request details as separate parameters for easier access.
 *
 * @example
 * ```typescript
 * // Create custom authentication middleware
 * const customAuthMiddleware = createMiddleware(async (next, input, init) => {
 *     const headers = new Headers(init?.headers);
 *     headers.set('X-Custom-Auth', 'my-token');
 *
 *     const response = await next(input, { ...init, headers });
 *
 *     if (response.status === 401) {
 *         console.log('Authentication failed');
 *     }
 *
 *     return response;
 * });
 *
 * // Create conditional middleware
 * const conditionalMiddleware = createMiddleware(async (next, input, init) => {
 *     const url = typeof input === 'string' ? input : input.toString();
 *
 *     // Only add headers for API routes
 *     if (url.includes('/api/')) {
 *         const headers = new Headers(init?.headers);
 *         headers.set('X-API-Version', 'v2');
 *         return next(input, { ...init, headers });
 *     }
 *
 *     // Pass through for non-API routes
 *     return next(input, init);
 * });
 *
 * // Create caching middleware
 * const cacheMiddleware = createMiddleware(async (next, input, init) => {
 *     const cacheKey = typeof input === 'string' ? input : input.toString();
 *
 *     // Check cache first
 *     const cached = await getFromCache(cacheKey);
 *     if (cached) {
 *         return new Response(cached, { status: 200 });
 *     }
 *
 *     // Make request and cache result
 *     const response = await next(input, init);
 *     if (response.ok) {
 *         await saveToCache(cacheKey, await response.clone().text());
 *     }
 *
 *     return response;
 * });
 * ```
 *
 * @param handler - Function that receives the next handler and request parameters
 * @returns A fetch middleware function
 */
const createMiddleware = (handler) => {
    return next => (input, init) => handler(next, input, init);
};
exports.createMiddleware = createMiddleware;
//# sourceMappingURL=middleware.js.map
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/middleware.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"middleware.js","sourceRoot":"","sources":["../../../src/client/middleware.ts"],"names":[],"mappings":";;;AAAA,uCAAuG;AASvG;;;;;;;;;;;;;;;;;;;;;;;;;GAyBG;AACI,MAAM,SAAS,GAClB,CAAC,QAA6B,EAAE,OAAsB,EAAc,EAAE,CACtE,IAAI,CAAC,EAAE;IACH,OAAO,KAAK,EAAE,KAAK,EAAE,IAAI,EAAE,EAAE;QACzB,MAAM,WAAW,GAAG,KAAK,IAAuB,EAAE;YAC9C,MAAM,OAAO,GAAG,IAAI,OAAO,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC;YAE3C,mDAAmD;YACnD,MAAM,MAAM,GAAG,MAAM,QAAQ,CAAC,MAAM,EAAE,CAAC;YACvC,IAAI,MAAM,EAAE,CAAC;gBACT,OAAO,CAAC,GAAG,CAAC,eAAe,EAAE,UAAU,MAAM,CAAC,YAAY,EAAE,CAAC,CAAC;YAClE,CAAC;YAED,OAAO,MAAM,IAAI,CAAC,KAAK,EAAE,EAAE,GAAG,IAAI,EAAE,OAAO,EAAE,CAAC,CAAC;QACnD,CAAC,CAAC;QAEF,IAAI,QAAQ,GAAG,MAAM,WAAW,EAAE,CAAC;QAEnC,uDAAuD;QACvD,IAAI,QAAQ,CAAC,MAAM,KAAK,GAAG,EAAE,CAAC;YAC1B,IAAI,CAAC;gBACD,MAAM,EAAE,mBAAmB,EAAE,KAAK,EAAE,GAAG,IAAA,sCAA4B,EAAC,QAAQ,CAAC,CAAC;gBAE9E,mDAAmD;gBACnD,MAAM,SAAS,GAAG,OAAO,IAAI,CAAC,OAAO,KAAK,KAAK,QAAQ,CAAC,CAAC,CAAC,IAAI,GAAG,CAAC,KAAK,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC;gBAEhG,MAAM,MAAM,GAAG,MAAM,IAAA,cAAI,EAAC,QAAQ,EAAE;oBAChC,SAAS;oBACT,mBAAmB;oBACnB,KAAK;oBACL,OAAO,EAAE,IAAI;iBAChB,CAAC,CAAC;gBAEH,IAAI,MAAM,KAAK,UAAU,EAAE,CAAC;oBACxB,MAAM,IAAI,2BAAiB,CAAC,iEAAiE,CAAC,CAAC;gBACnG,CAAC;gBAED,IAAI,MAAM,KAAK,YAAY,EAAE,CAAC;oBAC1B,MAAM,IAAI,2BAAiB,CAAC,sCAAsC,MAAM,EAAE,CAAC,CAAC;gBAChF,CAAC;gBAED,sCAAsC;gBACtC,QAAQ,GAAG,MAAM,WAAW,EAAE,CAAC;YACnC,CAAC;YAAC,OAAO,KAAK,EAAE,CAAC;gBACb,IAAI,KAAK,YAAY,2BAAiB,EAAE,CAAC;oBACrC,MAAM,KAAK,CAAC;gBAChB,CAAC;gBACD,MAAM,IAAI,2BAAiB,CAAC,8BAA8B,KAAK,YAAY,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,MAAM,CAAC,KAAK,CAAC,EAAE,CAAC,CAAC;YACxH,CAAC;QACL,CAAC;QAED,+DAA+D;QAC/D,IAAI,QAAQ,CAAC,MAAM,KAAK,GAAG,EAAE,CAAC;YAC1B,MAAM,GAAG,GAAG,OAAO,KAAK,KAAK,QAAQ,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,QAAQ,EAAE,CAAC;YACjE,MAAM,IAAI,2BAAiB,CAAC,6BAA6B,GAAG,EAAE,CAAC,CAAC;QACpE,CAAC;QAED,OAAO,QAAQ,CAAC;IACpB,CAAC,CAAC;AACN,CAAC,CAAC;AA3DO,QAAA,SAAS,aA2DhB;AA6CN;;;;;;;;;;;;;;;GAeG;AACI,MAAM,WAAW,GAAG,CAAC,UAA0B,EAAE,E
AAc,EAAE;IACpE,MAAM,EAAE,MAAM,EAAE,qBAAqB,GAAG,KAAK,EAAE,sBAAsB,GAAG,KAAK,EAAE,WAAW,GAAG,CAAC,EAAE,GAAG,OAAO,CAAC;IAE3G,MAAM,aAAa,GAAkB,KAAK,CAAC,EAAE;QACzC,MAAM,EAAE,MAAM,EAAE,GAAG,EAAE,MAAM,EAAE,UAAU,EAAE,QAAQ,EAAE,cAAc,EAAE,eAAe,EAAE,KAAK,EAAE,GAAG,KAAK,CAAC;QAEpG,IAAI,OAAO,GAAG,KAAK;YACf,CAAC,CAAC,QAAQ,MAAM,IAAI,GAAG,YAAY,KAAK,CAAC,OAAO,KAAK,QAAQ,KAAK;YAClE,CAAC,CAAC,QAAQ,MAAM,IAAI,GAAG,IAAI,MAAM,IAAI,UAAU,KAAK,QAAQ,KAAK,CAAC;QAEtE,sCAAsC;QACtC,IAAI,qBAAqB,IAAI,cAAc,EAAE,CAAC;YAC1C,MAAM,UAAU,GAAG,KAAK,CAAC,IAAI,CAAC,cAAc,CAAC,OAAO,EAAE,CAAC;iBAClD,GAAG,CAAC,CAAC,CAAC,GAAG,EAAE,KAAK,CAAC,EAAE,EAAE,CAAC,GAAG,GAAG,KAAK,KAAK,EAAE,CAAC;iBACzC,IAAI,CAAC,IAAI,CAAC,CAAC;YAChB,OAAO,IAAI,yBAAyB,UAAU,GAAG,CAAC;QACtD,CAAC;QAED,IAAI,sBAAsB,IAAI,eAAe,EAAE,CAAC;YAC5C,MAAM,UAAU,GAAG,KAAK,CAAC,IAAI,CAAC,eAAe,CAAC,OAAO,EAAE,CAAC;iBACnD,GAAG,CAAC,CAAC,CAAC,GAAG,EAAE,KAAK,CAAC,EAAE,EAAE,CAAC,GAAG,GAAG,KAAK,KAAK,EAAE,CAAC;iBACzC,IAAI,CAAC,IAAI,CAAC,CAAC;YAChB,OAAO,IAAI,0BAA0B,UAAU,GAAG,CAAC;QACvD,CAAC;QAED,IAAI,KAAK,IAAI,MAAM,IAAI,GAAG,EAAE,CAAC;YACzB,sCAAsC;YACtC,OAAO,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;QAC3B,CAAC;aAAM,CAAC;YACJ,sCAAsC;YACtC,OAAO,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC;QACzB,CAAC;IACL,CAAC,CAAC;IAEF,MAAM,KAAK,GAAG,MAAM,IAAI,aAAa,CAAC;IAEtC,OAAO,IAAI,CAAC,EAAE,CAAC,KAAK,EAAE,KAAK,EAAE,IAAI,EAAE,EAAE;QACjC,MAAM,SAAS,GAAG,WAAW,CAAC,GAAG,EAAE,CAAC;QACpC,MAAM,MAAM,GAAG,IAAI,EAAE,MAAM,IAAI,KAAK,CAAC;QACrC,MAAM,GAAG,GAAG,OAAO,KAAK,KAAK,QAAQ,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,KAAK,CAAC,QAAQ,EAAE,CAAC;QACjE,MAAM,cAAc,GAAG,qBAAqB,CAAC,CAAC,CAAC,IAAI,OAAO,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,CAAC,CAAC,SAAS,CAAC;QAEtF,IAAI,CAAC;YACD,MAAM,QAAQ,GAAG,MAAM,IAAI,CAAC,KAAK,EAAE,IAAI,CAAC,CAAC;YACzC,MAAM,QAAQ,GAAG,WAAW,CAAC,GAAG,EAAE,GAAG,SAAS,CAAC;YAE/C,mDAAmD;YACnD,IAAI,QAAQ,CAAC,MAAM,IAAI,WAAW,EAAE,CAAC;gBACjC,KAAK,CAAC;oBACF,MAAM;oBACN,GAAG;oBACH,MAAM,EAAE,QAAQ,CAAC,MAAM;oBACvB,UAAU,EAAE,QAAQ,CAAC,UAAU;oBAC/B,QAAQ;oBACR,cAAc;oBACd,eAAe,EAAE,sBAAsB,CAAC,CAAC,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC,CAAC,SAAS;iBACz
E,CAAC,CAAC;YACP,CAAC;YAED,OAAO,QAAQ,CAAC;QACpB,CAAC;QAAC,OAAO,KAAK,EAAE,CAAC;YACb,MAAM,QAAQ,GAAG,WAAW,CAAC,GAAG,EAAE,GAAG,SAAS,CAAC;YAE/C,4CAA4C;YAC5C,KAAK,CAAC;gBACF,MAAM;gBACN,GAAG;gBACH,MAAM,EAAE,CAAC;gBACT,UAAU,EAAE,eAAe;gBAC3B,QAAQ;gBACR,cAAc;gBACd,KAAK,EAAE,KAAc;aACxB,CAAC,CAAC;YAEH,MAAM,KAAK,CAAC;QAChB,CAAC;IACL,CAAC,CAAC;AACN,CAAC,CAAC;AA7EW,QAAA,WAAW,eA6EtB;AAEF;;;;;;;;;;;;;;;;;;GAkBG;AACI,MAAM,gBAAgB,GAAG,CAAC,GAAG,UAAwB,EAAc,EAAE;IACxE,OAAO,IAAI,CAAC,EAAE;QACV,OAAO,UAAU,CAAC,MAAM,CAAC,CAAC,OAAO,EAAE,EAAE,EAAE,EAAE,CAAC,EAAE,CAAC,OAAO,CAAC,EAAE,IAAI,CAAC,CAAC;IACjE,CAAC,CAAC;AACN,CAAC,CAAC;AAJW,QAAA,gBAAgB,oBAI3B;AAEF;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;GAyDG;AACI,MAAM,gBAAgB,GAAG,CAAC,OAAwF,EAAc,EAAE;IACrI,OAAO,IAAI,CAAC,EAAE,CAAC,CAAC,KAAK,EAAE,IAAI,EAAE,EAAE,CAAC,OAAO,CAAC,IAAI,EAAE,KAAqB,EAAE,IAAI,CAAC,CAAC;AAC/E,CAAC,CAAC;AAFW,QAAA,gBAAgB,oBAE3B"}
81
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/sse.d.ts
generated
vendored
Normal file
@@ -0,0 +1,81 @@
import { type ErrorEvent, type EventSourceInit } from 'eventsource';
import { Transport, FetchLike } from '../shared/transport.js';
import { JSONRPCMessage } from '../types.js';
import { OAuthClientProvider } from './auth.js';
export declare class SseError extends Error {
    readonly code: number | undefined;
    readonly event: ErrorEvent;
    constructor(code: number | undefined, message: string | undefined, event: ErrorEvent);
}
/**
 * Configuration options for the `SSEClientTransport`.
 */
export type SSEClientTransportOptions = {
    /**
     * An OAuth client provider to use for authentication.
     *
     * When an `authProvider` is specified and the SSE connection is started:
     * 1. The connection is attempted with any existing access token from the `authProvider`.
     * 2. If the access token has expired, the `authProvider` is used to refresh the token.
     * 3. If token refresh fails or no access token exists, and auth is required, `OAuthClientProvider.redirectToAuthorization` is called, and an `UnauthorizedError` will be thrown from `connect`/`start`.
     *
     * After the user has finished authorizing via their user agent, and is redirected back to the MCP client application, call `SSEClientTransport.finishAuth` with the authorization code before retrying the connection.
     *
     * If an `authProvider` is not provided, and auth is required, an `UnauthorizedError` will be thrown.
     *
     * `UnauthorizedError` might also be thrown when sending any message over the SSE transport, indicating that the session has expired, and needs to be re-authed and reconnected.
     */
    authProvider?: OAuthClientProvider;
    /**
     * Customizes the initial SSE request to the server (the request that begins the stream).
     *
     * NOTE: Setting this property will prevent an `Authorization` header from
     * being automatically attached to the SSE request, if an `authProvider` is
     * also given. This can be worked around by setting the `Authorization` header
     * manually.
     */
    eventSourceInit?: EventSourceInit;
    /**
     * Customizes recurring POST requests to the server.
     */
    requestInit?: RequestInit;
    /**
     * Custom fetch implementation used for all network requests.
     */
    fetch?: FetchLike;
};
/**
 * Client transport for SSE: this will connect to a server using Server-Sent Events for receiving
 * messages and make separate POST requests for sending messages.
 * @deprecated SSEClientTransport is deprecated. Prefer to use StreamableHTTPClientTransport where possible instead. Note that because some servers are still using SSE, clients may need to support both transports during the migration period.
 */
export declare class SSEClientTransport implements Transport {
    private _eventSource?;
    private _endpoint?;
    private _abortController?;
    private _url;
    private _resourceMetadataUrl?;
    private _scope?;
    private _eventSourceInit?;
    private _requestInit?;
    private _authProvider?;
    private _fetch?;
    private _fetchWithInit;
    private _protocolVersion?;
    onclose?: () => void;
    onerror?: (error: Error) => void;
    onmessage?: (message: JSONRPCMessage) => void;
    constructor(url: URL, opts?: SSEClientTransportOptions);
    private _authThenStart;
    private _commonHeaders;
    private _startOrAuth;
    start(): Promise<void>;
    /**
     * Call this method after the user has finished authorizing via their user agent and is redirected back to the MCP client application. This will exchange the authorization code for an access token, enabling the next connection attempt to successfully auth.
     */
    finishAuth(authorizationCode: string): Promise<void>;
    close(): Promise<void>;
    send(message: JSONRPCMessage): Promise<void>;
    setProtocolVersion(version: string): void;
}
//# sourceMappingURL=sse.d.ts.map
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/sse.d.ts.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"sse.d.ts","sourceRoot":"","sources":["../../../src/client/sse.ts"],"names":[],"mappings":"AAAA,OAAO,EAAe,KAAK,UAAU,EAAE,KAAK,eAAe,EAAE,MAAM,aAAa,CAAC;AACjF,OAAO,EAAE,SAAS,EAAE,SAAS,EAAyC,MAAM,wBAAwB,CAAC;AACrG,OAAO,EAAE,cAAc,EAAwB,MAAM,aAAa,CAAC;AACnE,OAAO,EAAkD,mBAAmB,EAAqB,MAAM,WAAW,CAAC;AAEnH,qBAAa,QAAS,SAAQ,KAAK;aAEX,IAAI,EAAE,MAAM,GAAG,SAAS;aAExB,KAAK,EAAE,UAAU;gBAFjB,IAAI,EAAE,MAAM,GAAG,SAAS,EACxC,OAAO,EAAE,MAAM,GAAG,SAAS,EACX,KAAK,EAAE,UAAU;CAIxC;AAED;;GAEG;AACH,MAAM,MAAM,yBAAyB,GAAG;IACpC;;;;;;;;;;;;;OAaG;IACH,YAAY,CAAC,EAAE,mBAAmB,CAAC;IAEnC;;;;;;;OAOG;IACH,eAAe,CAAC,EAAE,eAAe,CAAC;IAElC;;OAEG;IACH,WAAW,CAAC,EAAE,WAAW,CAAC;IAE1B;;OAEG;IACH,KAAK,CAAC,EAAE,SAAS,CAAC;CACrB,CAAC;AAEF;;;;GAIG;AACH,qBAAa,kBAAmB,YAAW,SAAS;IAChD,OAAO,CAAC,YAAY,CAAC,CAAc;IACnC,OAAO,CAAC,SAAS,CAAC,CAAM;IACxB,OAAO,CAAC,gBAAgB,CAAC,CAAkB;IAC3C,OAAO,CAAC,IAAI,CAAM;IAClB,OAAO,CAAC,oBAAoB,CAAC,CAAM;IACnC,OAAO,CAAC,MAAM,CAAC,CAAS;IACxB,OAAO,CAAC,gBAAgB,CAAC,CAAkB;IAC3C,OAAO,CAAC,YAAY,CAAC,CAAc;IACnC,OAAO,CAAC,aAAa,CAAC,CAAsB;IAC5C,OAAO,CAAC,MAAM,CAAC,CAAY;IAC3B,OAAO,CAAC,cAAc,CAAY;IAClC,OAAO,CAAC,gBAAgB,CAAC,CAAS;IAElC,OAAO,CAAC,EAAE,MAAM,IAAI,CAAC;IACrB,OAAO,CAAC,EAAE,CAAC,KAAK,EAAE,KAAK,KAAK,IAAI,CAAC;IACjC,SAAS,CAAC,EAAE,CAAC,OAAO,EAAE,cAAc,KAAK,IAAI,CAAC;gBAElC,GAAG,EAAE,GAAG,EAAE,IAAI,CAAC,EAAE,yBAAyB;YAWxC,cAAc;YAyBd,cAAc;IAoB5B,OAAO,CAAC,YAAY;IAyEd,KAAK;IAQX;;OAEG;IACG,UAAU,CAAC,iBAAiB,EAAE,MAAM,GAAG,OAAO,CAAC,IAAI,CAAC;IAiBpD,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC;IAMtB,IAAI,CAAC,OAAO,EAAE,cAAc,GAAG,OAAO,CAAC,IAAI,CAAC;IAkDlD,kBAAkB,CAAC,OAAO,EAAE,MAAM,GAAG,IAAI;CAG5C"}
211
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/sse.js
generated
vendored
Normal file
@@ -0,0 +1,211 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.SSEClientTransport = exports.SseError = void 0;
const eventsource_1 = require("eventsource");
const transport_js_1 = require("../shared/transport.js");
const types_js_1 = require("../types.js");
const auth_js_1 = require("./auth.js");
class SseError extends Error {
    constructor(code, message, event) {
        super(`SSE error: ${message}`);
        this.code = code;
        this.event = event;
    }
}
exports.SseError = SseError;
/**
 * Client transport for SSE: this will connect to a server using Server-Sent Events for receiving
 * messages and make separate POST requests for sending messages.
 * @deprecated SSEClientTransport is deprecated. Prefer to use StreamableHTTPClientTransport where possible instead. Note that because some servers are still using SSE, clients may need to support both transports during the migration period.
 */
class SSEClientTransport {
    constructor(url, opts) {
        this._url = url;
        this._resourceMetadataUrl = undefined;
        this._scope = undefined;
        this._eventSourceInit = opts?.eventSourceInit;
        this._requestInit = opts?.requestInit;
        this._authProvider = opts?.authProvider;
        this._fetch = opts?.fetch;
        this._fetchWithInit = (0, transport_js_1.createFetchWithInit)(opts?.fetch, opts?.requestInit);
    }
    async _authThenStart() {
        if (!this._authProvider) {
            throw new auth_js_1.UnauthorizedError('No auth provider');
        }
        let result;
        try {
            result = await (0, auth_js_1.auth)(this._authProvider, {
                serverUrl: this._url,
                resourceMetadataUrl: this._resourceMetadataUrl,
                scope: this._scope,
                fetchFn: this._fetchWithInit
            });
        }
        catch (error) {
            this.onerror?.(error);
            throw error;
        }
        if (result !== 'AUTHORIZED') {
            throw new auth_js_1.UnauthorizedError();
        }
        return await this._startOrAuth();
    }
    async _commonHeaders() {
        const headers = {};
        if (this._authProvider) {
            const tokens = await this._authProvider.tokens();
            if (tokens) {
                headers['Authorization'] = `Bearer ${tokens.access_token}`;
            }
        }
        if (this._protocolVersion) {
            headers['mcp-protocol-version'] = this._protocolVersion;
        }
        const extraHeaders = (0, transport_js_1.normalizeHeaders)(this._requestInit?.headers);
        return new Headers({
            ...headers,
            ...extraHeaders
        });
    }
    _startOrAuth() {
        const fetchImpl = (this?._eventSourceInit?.fetch ?? this._fetch ?? fetch);
        return new Promise((resolve, reject) => {
            this._eventSource = new eventsource_1.EventSource(this._url.href, {
                ...this._eventSourceInit,
                fetch: async (url, init) => {
                    const headers = await this._commonHeaders();
                    headers.set('Accept', 'text/event-stream');
                    const response = await fetchImpl(url, {
                        ...init,
                        headers
                    });
                    if (response.status === 401 && response.headers.has('www-authenticate')) {
                        const { resourceMetadataUrl, scope } = (0, auth_js_1.extractWWWAuthenticateParams)(response);
                        this._resourceMetadataUrl = resourceMetadataUrl;
                        this._scope = scope;
                    }
                    return response;
                }
            });
            this._abortController = new AbortController();
            this._eventSource.onerror = event => {
                if (event.code === 401 && this._authProvider) {
                    this._authThenStart().then(resolve, reject);
                    return;
                }
                const error = new SseError(event.code, event.message, event);
                reject(error);
                this.onerror?.(error);
            };
            this._eventSource.onopen = () => {
                // The connection is open, but we need to wait for the endpoint to be received.
            };
            this._eventSource.addEventListener('endpoint', (event) => {
                const messageEvent = event;
                try {
                    this._endpoint = new URL(messageEvent.data, this._url);
                    if (this._endpoint.origin !== this._url.origin) {
                        throw new Error(`Endpoint origin does not match connection origin: ${this._endpoint.origin}`);
                    }
                }
                catch (error) {
                    reject(error);
                    this.onerror?.(error);
                    void this.close();
                    return;
                }
                resolve();
            });
            this._eventSource.onmessage = (event) => {
                const messageEvent = event;
                let message;
                try {
                    message = types_js_1.JSONRPCMessageSchema.parse(JSON.parse(messageEvent.data));
                }
                catch (error) {
                    this.onerror?.(error);
                    return;
                }
                this.onmessage?.(message);
            };
        });
    }
    async start() {
        if (this._eventSource) {
            throw new Error('SSEClientTransport already started! If using Client class, note that connect() calls start() automatically.');
        }
        return await this._startOrAuth();
    }
    /**
     * Call this method after the user has finished authorizing via their user agent and is redirected back to the MCP client application. This will exchange the authorization code for an access token, enabling the next connection attempt to successfully auth.
     */
    async finishAuth(authorizationCode) {
        if (!this._authProvider) {
            throw new auth_js_1.UnauthorizedError('No auth provider');
        }
        const result = await (0, auth_js_1.auth)(this._authProvider, {
            serverUrl: this._url,
            authorizationCode,
            resourceMetadataUrl: this._resourceMetadataUrl,
            scope: this._scope,
            fetchFn: this._fetchWithInit
        });
        if (result !== 'AUTHORIZED') {
            throw new auth_js_1.UnauthorizedError('Failed to authorize');
        }
    }
    async close() {
        this._abortController?.abort();
        this._eventSource?.close();
        this.onclose?.();
    }
    async send(message) {
        if (!this._endpoint) {
            throw new Error('Not connected');
        }
        try {
            const headers = await this._commonHeaders();
            headers.set('content-type', 'application/json');
            const init = {
                ...this._requestInit,
                method: 'POST',
                headers,
                body: JSON.stringify(message),
                signal: this._abortController?.signal
            };
            const response = await (this._fetch ?? fetch)(this._endpoint, init);
            if (!response.ok) {
                const text = await response.text().catch(() => null);
                if (response.status === 401 && this._authProvider) {
                    const { resourceMetadataUrl, scope } = (0, auth_js_1.extractWWWAuthenticateParams)(response);
                    this._resourceMetadataUrl = resourceMetadataUrl;
                    this._scope = scope;
                    const result = await (0, auth_js_1.auth)(this._authProvider, {
                        serverUrl: this._url,
                        resourceMetadataUrl: this._resourceMetadataUrl,
                        scope: this._scope,
                        fetchFn: this._fetchWithInit
                    });
                    if (result !== 'AUTHORIZED') {
                        throw new auth_js_1.UnauthorizedError();
                    }
                    // Purposely _not_ awaited, so we don't call onerror twice
                    return this.send(message);
                }
                throw new Error(`Error POSTing to endpoint (HTTP ${response.status}): ${text}`);
            }
            // Release connection - POST responses don't have content we need
            await response.body?.cancel();
        }
        catch (error) {
            this.onerror?.(error);
            throw error;
        }
    }
    setProtocolVersion(version) {
        this._protocolVersion = version;
    }
}
exports.SSEClientTransport = SSEClientTransport;
//# sourceMappingURL=sse.js.map
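The endpoint-origin guard inside `_startOrAuth` above resolves the `endpoint` event's payload against the SSE connection URL and rejects any cross-origin value. A standalone sketch of that check (the URLs below are made-up examples):

```javascript
// Sketch of the SSE endpoint-origin validation (example URLs are hypothetical).
const connectionUrl = new URL('https://mcp.example.com/sse');

const resolveEndpoint = data => {
    // Relative endpoint data resolves against the SSE connection URL,
    // mirroring `new URL(messageEvent.data, this._url)` in the transport.
    const endpoint = new URL(data, connectionUrl);
    if (endpoint.origin !== connectionUrl.origin) {
        throw new Error(`Endpoint origin does not match connection origin: ${endpoint.origin}`);
    }
    return endpoint;
};

console.log(resolveEndpoint('/messages?sessionId=abc').href);
```

This guard is what stops a compromised or misbehaving server from redirecting subsequent POST messages to an attacker-controlled origin.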
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/sse.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
77
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/stdio.d.ts
generated
vendored
Normal file
@@ -0,0 +1,77 @@
import { IOType } from 'node:child_process';
import { Stream } from 'node:stream';
import { Transport } from '../shared/transport.js';
import { JSONRPCMessage } from '../types.js';
export type StdioServerParameters = {
    /**
     * The executable to run to start the server.
     */
    command: string;
    /**
     * Command line arguments to pass to the executable.
     */
    args?: string[];
    /**
     * The environment to use when spawning the process.
     *
     * If not specified, the result of getDefaultEnvironment() will be used.
     */
    env?: Record<string, string>;
    /**
     * How to handle stderr of the child process. This matches the semantics of Node's `child_process.spawn`.
     *
     * The default is "inherit", meaning messages to stderr will be printed to the parent process's stderr.
     */
    stderr?: IOType | Stream | number;
    /**
     * The working directory to use when spawning the process.
     *
     * If not specified, the current working directory will be inherited.
     */
    cwd?: string;
};
/**
 * Environment variables to inherit by default, if an environment is not explicitly given.
 */
export declare const DEFAULT_INHERITED_ENV_VARS: string[];
/**
 * Returns a default environment object including only environment variables deemed safe to inherit.
 */
export declare function getDefaultEnvironment(): Record<string, string>;
/**
 * Client transport for stdio: this will connect to a server by spawning a process and communicating with it over stdin/stdout.
 *
 * This transport is only available in Node.js environments.
 */
export declare class StdioClientTransport implements Transport {
    private _process?;
    private _readBuffer;
    private _serverParams;
    private _stderrStream;
    onclose?: () => void;
    onerror?: (error: Error) => void;
    onmessage?: (message: JSONRPCMessage) => void;
    constructor(server: StdioServerParameters);
    /**
     * Starts the server process and prepares to communicate with it.
     */
    start(): Promise<void>;
    /**
     * The stderr stream of the child process, if `StdioServerParameters.stderr` was set to "pipe" or "overlapped".
     *
     * If stderr piping was requested, a PassThrough stream is returned _immediately_, allowing callers to
     * attach listeners before the start method is invoked. This prevents loss of any early
     * error output emitted by the child process.
     */
    get stderr(): Stream | null;
    /**
     * The child process pid spawned by this transport.
     *
     * This is only available after the transport has been started.
     */
    get pid(): number | null;
    private processReadBuffer;
    close(): Promise<void>;
    send(message: JSONRPCMessage): Promise<void>;
}
//# sourceMappingURL=stdio.d.ts.map
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/stdio.d.ts.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"stdio.d.ts","sourceRoot":"","sources":["../../../src/client/stdio.ts"],"names":[],"mappings":"AAAA,OAAO,EAAgB,MAAM,EAAE,MAAM,oBAAoB,CAAC;AAG1D,OAAO,EAAE,MAAM,EAAe,MAAM,aAAa,CAAC;AAElD,OAAO,EAAE,SAAS,EAAE,MAAM,wBAAwB,CAAC;AACnD,OAAO,EAAE,cAAc,EAAE,MAAM,aAAa,CAAC;AAE7C,MAAM,MAAM,qBAAqB,GAAG;IAChC;;OAEG;IACH,OAAO,EAAE,MAAM,CAAC;IAEhB;;OAEG;IACH,IAAI,CAAC,EAAE,MAAM,EAAE,CAAC;IAEhB;;;;OAIG;IACH,GAAG,CAAC,EAAE,MAAM,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAE7B;;;;OAIG;IACH,MAAM,CAAC,EAAE,MAAM,GAAG,MAAM,GAAG,MAAM,CAAC;IAElC;;;;OAIG;IACH,GAAG,CAAC,EAAE,MAAM,CAAC;CAChB,CAAC;AAEF;;GAEG;AACH,eAAO,MAAM,0BAA0B,UAiBuB,CAAC;AAE/D;;GAEG;AACH,wBAAgB,qBAAqB,IAAI,MAAM,CAAC,MAAM,EAAE,MAAM,CAAC,CAkB9D;AAED;;;;GAIG;AACH,qBAAa,oBAAqB,YAAW,SAAS;IAClD,OAAO,CAAC,QAAQ,CAAC,CAAe;IAChC,OAAO,CAAC,WAAW,CAAgC;IACnD,OAAO,CAAC,aAAa,CAAwB;IAC7C,OAAO,CAAC,aAAa,CAA4B;IAEjD,OAAO,CAAC,EAAE,MAAM,IAAI,CAAC;IACrB,OAAO,CAAC,EAAE,CAAC,KAAK,EAAE,KAAK,KAAK,IAAI,CAAC;IACjC,SAAS,CAAC,EAAE,CAAC,OAAO,EAAE,cAAc,KAAK,IAAI,CAAC;gBAElC,MAAM,EAAE,qBAAqB;IAOzC;;OAEG;IACG,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC;IAqD5B;;;;;;OAMG;IACH,IAAI,MAAM,IAAI,MAAM,GAAG,IAAI,CAM1B;IAED;;;;OAIG;IACH,IAAI,GAAG,IAAI,MAAM,GAAG,IAAI,CAEvB;IAED,OAAO,CAAC,iBAAiB;IAenB,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC;IAyC5B,IAAI,CAAC,OAAO,EAAE,cAAc,GAAG,OAAO,CAAC,IAAI,CAAC;CAc/C"}
199
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/stdio.js
generated
vendored
Normal file
@@ -0,0 +1,199 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.StdioClientTransport = exports.DEFAULT_INHERITED_ENV_VARS = void 0;
exports.getDefaultEnvironment = getDefaultEnvironment;
const cross_spawn_1 = __importDefault(require("cross-spawn"));
const node_process_1 = __importDefault(require("node:process"));
const node_stream_1 = require("node:stream");
const stdio_js_1 = require("../shared/stdio.js");
/**
 * Environment variables to inherit by default, if an environment is not explicitly given.
 */
exports.DEFAULT_INHERITED_ENV_VARS = node_process_1.default.platform === 'win32'
    ? [
        'APPDATA',
        'HOMEDRIVE',
        'HOMEPATH',
        'LOCALAPPDATA',
        'PATH',
        'PROCESSOR_ARCHITECTURE',
        'SYSTEMDRIVE',
        'SYSTEMROOT',
        'TEMP',
        'USERNAME',
        'USERPROFILE',
        'PROGRAMFILES'
    ]
    : /* list inspired by the default env inheritance of sudo */
      ['HOME', 'LOGNAME', 'PATH', 'SHELL', 'TERM', 'USER'];
/**
 * Returns a default environment object including only environment variables deemed safe to inherit.
 */
function getDefaultEnvironment() {
    const env = {};
    for (const key of exports.DEFAULT_INHERITED_ENV_VARS) {
        const value = node_process_1.default.env[key];
        if (value === undefined) {
            continue;
        }
        if (value.startsWith('()')) {
            // Skip functions, which are a security risk.
            continue;
        }
        env[key] = value;
    }
    return env;
}
/**
 * Client transport for stdio: this will connect to a server by spawning a process and communicating with it over stdin/stdout.
 *
 * This transport is only available in Node.js environments.
 */
class StdioClientTransport {
    constructor(server) {
        this._readBuffer = new stdio_js_1.ReadBuffer();
        this._stderrStream = null;
        this._serverParams = server;
        if (server.stderr === 'pipe' || server.stderr === 'overlapped') {
            this._stderrStream = new node_stream_1.PassThrough();
        }
    }
    /**
     * Starts the server process and prepares to communicate with it.
     */
    async start() {
        if (this._process) {
            throw new Error('StdioClientTransport already started! If using Client class, note that connect() calls start() automatically.');
        }
        return new Promise((resolve, reject) => {
            this._process = (0, cross_spawn_1.default)(this._serverParams.command, this._serverParams.args ?? [], {
                // merge default env with server env because mcp server needs some env vars
                env: {
                    ...getDefaultEnvironment(),
                    ...this._serverParams.env
                },
                stdio: ['pipe', 'pipe', this._serverParams.stderr ?? 'inherit'],
                shell: false,
                windowsHide: node_process_1.default.platform === 'win32' && isElectron(),
                cwd: this._serverParams.cwd
            });
            this._process.on('error', error => {
                reject(error);
                this.onerror?.(error);
            });
            this._process.on('spawn', () => {
                resolve();
            });
            this._process.on('close', _code => {
                this._process = undefined;
                this.onclose?.();
            });
            this._process.stdin?.on('error', error => {
                this.onerror?.(error);
            });
            this._process.stdout?.on('data', chunk => {
                this._readBuffer.append(chunk);
                this.processReadBuffer();
            });
            this._process.stdout?.on('error', error => {
                this.onerror?.(error);
            });
            if (this._stderrStream && this._process.stderr) {
                this._process.stderr.pipe(this._stderrStream);
            }
        });
    }
    /**
     * The stderr stream of the child process, if `StdioServerParameters.stderr` was set to "pipe" or "overlapped".
     *
     * If stderr piping was requested, a PassThrough stream is returned _immediately_, allowing callers to
     * attach listeners before the start method is invoked. This prevents loss of any early
     * error output emitted by the child process.
     */
    get stderr() {
        if (this._stderrStream) {
            return this._stderrStream;
        }
        return this._process?.stderr ?? null;
    }
    /**
     * The child process pid spawned by this transport.
     *
     * This is only available after the transport has been started.
     */
    get pid() {
        return this._process?.pid ?? null;
    }
    processReadBuffer() {
        while (true) {
            try {
                const message = this._readBuffer.readMessage();
                if (message === null) {
                    break;
                }
                this.onmessage?.(message);
            }
            catch (error) {
                this.onerror?.(error);
            }
        }
    }
    async close() {
        if (this._process) {
            const processToClose = this._process;
            this._process = undefined;
            const closePromise = new Promise(resolve => {
                processToClose.once('close', () => {
                    resolve();
                });
            });
            try {
                processToClose.stdin?.end();
            }
            catch {
                // ignore
            }
            await Promise.race([closePromise, new Promise(resolve => setTimeout(resolve, 2000).unref())]);
            if (processToClose.exitCode === null) {
                try {
                    processToClose.kill('SIGTERM');
                }
                catch {
                    // ignore
                }
                await Promise.race([closePromise, new Promise(resolve => setTimeout(resolve, 2000).unref())]);
            }
            if (processToClose.exitCode === null) {
                try {
                    processToClose.kill('SIGKILL');
                }
                catch {
                    // ignore
                }
            }
        }
        this._readBuffer.clear();
    }
    send(message) {
        return new Promise(resolve => {
            if (!this._process?.stdin) {
                throw new Error('Not connected');
            }
            const json = (0, stdio_js_1.serializeMessage)(message);
            if (this._process.stdin.write(json)) {
                resolve();
            }
            else {
                this._process.stdin.once('drain', resolve);
            }
        });
    }
}
exports.StdioClientTransport = StdioClientTransport;
function isElectron() {
    return 'type' in node_process_1.default;
}
//# sourceMappingURL=stdio.js.map
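A note on the `getDefaultEnvironment()` helper above: it inherits only an allow-listed set of variables and refuses values that look like exported shell functions. The following is a minimal standalone sketch of that filtering logic; `pickSafeEnv` is an illustrative name, not part of the SDK API.

```javascript
// Sketch of the inheritance filter used by getDefaultEnvironment() above:
// only allow-listed variables are copied, and values beginning with "()"
// (exported shell functions) are skipped as a security precaution.
// `pickSafeEnv` is a hypothetical standalone name, not SDK API.
const SAFE_VARS = ['HOME', 'LOGNAME', 'PATH', 'SHELL', 'TERM', 'USER'];

function pickSafeEnv(sourceEnv, allowList = SAFE_VARS) {
    const env = {};
    for (const key of allowList) {
        const value = sourceEnv[key];
        if (value === undefined) {
            continue; // variable not set
        }
        if (value.startsWith('()')) {
            continue; // skip exported shell functions (Shellshock-style risk)
        }
        env[key] = value;
    }
    return env;
}
```

Variables outside the allow-list are dropped even when present, which is why `StdioClientTransport` merges `getDefaultEnvironment()` with the caller-supplied `env` before spawning.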
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/stdio.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"stdio.js","sourceRoot":"","sources":["../../../src/client/stdio.ts"],"names":[],"mappings":";;;;;;AAkEA,sDAkBC;AAnFD,8DAAgC;AAChC,gEAAmC;AACnC,6CAAkD;AAClD,iDAAkE;AAqClE;;GAEG;AACU,QAAA,0BAA0B,GACnC,sBAAO,CAAC,QAAQ,KAAK,OAAO;IACxB,CAAC,CAAC;QACI,SAAS;QACT,WAAW;QACX,UAAU;QACV,cAAc;QACd,MAAM;QACN,wBAAwB;QACxB,aAAa;QACb,YAAY;QACZ,MAAM;QACN,UAAU;QACV,aAAa;QACb,cAAc;KACjB;IACH,CAAC,CAAC,0DAA0D;QAC1D,CAAC,MAAM,EAAE,SAAS,EAAE,MAAM,EAAE,OAAO,EAAE,MAAM,EAAE,MAAM,CAAC,CAAC;AAE/D;;GAEG;AACH,SAAgB,qBAAqB;IACjC,MAAM,GAAG,GAA2B,EAAE,CAAC;IAEvC,KAAK,MAAM,GAAG,IAAI,kCAA0B,EAAE,CAAC;QAC3C,MAAM,KAAK,GAAG,sBAAO,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC;QAC/B,IAAI,KAAK,KAAK,SAAS,EAAE,CAAC;YACtB,SAAS;QACb,CAAC;QAED,IAAI,KAAK,CAAC,UAAU,CAAC,IAAI,CAAC,EAAE,CAAC;YACzB,6CAA6C;YAC7C,SAAS;QACb,CAAC;QAED,GAAG,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC;IACrB,CAAC;IAED,OAAO,GAAG,CAAC;AACf,CAAC;AAED;;;;GAIG;AACH,MAAa,oBAAoB;IAU7B,YAAY,MAA6B;QARjC,gBAAW,GAAe,IAAI,qBAAU,EAAE,CAAC;QAE3C,kBAAa,GAAuB,IAAI,CAAC;QAO7C,IAAI,CAAC,aAAa,GAAG,MAAM,CAAC;QAC5B,IAAI,MAAM,CAAC,MAAM,KAAK,MAAM,IAAI,MAAM,CAAC,MAAM,KAAK,YAAY,EAAE,CAAC;YAC7D,IAAI,CAAC,aAAa,GAAG,IAAI,yBAAW,EAAE,CAAC;QAC3C,CAAC;IACL,CAAC;IAED;;OAEG;IACH,KAAK,CAAC,KAAK;QACP,IAAI,IAAI,CAAC,QAAQ,EAAE,CAAC;YAChB,MAAM,IAAI,KAAK,CACX,+GAA+G,CAClH,CAAC;QACN,CAAC;QAED,OAAO,IAAI,OAAO,CAAC,CAAC,OAAO,EAAE,MAAM,EAAE,EAAE;YACnC,IAAI,CAAC,QAAQ,GAAG,IAAA,qBAAK,EAAC,IAAI,CAAC,aAAa,CAAC,OAAO,EAAE,IAAI,CAAC,aAAa,CAAC,IAAI,IAAI,EAAE,EAAE;gBAC7E,2EAA2E;gBAC3E,GAAG,EAAE;oBACD,GAAG,qBAAqB,EAAE;oBAC1B,GAAG,IAAI,CAAC,aAAa,CAAC,GAAG;iBAC5B;gBACD,KAAK,EAAE,CAAC,MAAM,EAAE,MAAM,EAAE,IAAI,CAAC,aAAa,CAAC,MAAM,IAAI,SAAS,CAAC;gBAC/D,KAAK,EAAE,KAAK;gBACZ,WAAW,EAAE,sBAAO,CAAC,QAAQ,KAAK,OAAO,IAAI,UAAU,EAAE;gBACzD,GAAG,EAAE,IAAI,CAAC,aAAa,CAAC,GAAG;aAC9B,CAAC,CAAC;YAEH,IAAI,CAAC,QAAQ,CAAC,EAAE,CAAC,OAAO,EAAE,KAAK,CAAC,EAAE;gBAC9B,MAAM,CAAC,KAAK,CAAC,CAAC;gBACd,IAAI,CAAC,OAAO,EAAE,CAAC,KAAK,CAAC,CAAC;YAC1B,CAAC,CAAC,CAAC;YAEH,IAAI,CAAC,QAAQ,CAAC,EAAE,CAAC,OAAO,EAAE,GAAG,EAAE;gBAC3B,OAAO,EAAE,
CAAC;YACd,CAAC,CAAC,CAAC;YAEH,IAAI,CAAC,QAAQ,CAAC,EAAE,CAAC,OAAO,EAAE,KAAK,CAAC,EAAE;gBAC9B,IAAI,CAAC,QAAQ,GAAG,SAAS,CAAC;gBAC1B,IAAI,CAAC,OAAO,EAAE,EAAE,CAAC;YACrB,CAAC,CAAC,CAAC;YAEH,IAAI,CAAC,QAAQ,CAAC,KAAK,EAAE,EAAE,CAAC,OAAO,EAAE,KAAK,CAAC,EAAE;gBACrC,IAAI,CAAC,OAAO,EAAE,CAAC,KAAK,CAAC,CAAC;YAC1B,CAAC,CAAC,CAAC;YAEH,IAAI,CAAC,QAAQ,CAAC,MAAM,EAAE,EAAE,CAAC,MAAM,EAAE,KAAK,CAAC,EAAE;gBACrC,IAAI,CAAC,WAAW,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;gBAC/B,IAAI,CAAC,iBAAiB,EAAE,CAAC;YAC7B,CAAC,CAAC,CAAC;YAEH,IAAI,CAAC,QAAQ,CAAC,MAAM,EAAE,EAAE,CAAC,OAAO,EAAE,KAAK,CAAC,EAAE;gBACtC,IAAI,CAAC,OAAO,EAAE,CAAC,KAAK,CAAC,CAAC;YAC1B,CAAC,CAAC,CAAC;YAEH,IAAI,IAAI,CAAC,aAAa,IAAI,IAAI,CAAC,QAAQ,CAAC,MAAM,EAAE,CAAC;gBAC7C,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC;YAClD,CAAC;QACL,CAAC,CAAC,CAAC;IACP,CAAC;IAED;;;;;;OAMG;IACH,IAAI,MAAM;QACN,IAAI,IAAI,CAAC,aAAa,EAAE,CAAC;YACrB,OAAO,IAAI,CAAC,aAAa,CAAC;QAC9B,CAAC;QAED,OAAO,IAAI,CAAC,QAAQ,EAAE,MAAM,IAAI,IAAI,CAAC;IACzC,CAAC;IAED;;;;OAIG;IACH,IAAI,GAAG;QACH,OAAO,IAAI,CAAC,QAAQ,EAAE,GAAG,IAAI,IAAI,CAAC;IACtC,CAAC;IAEO,iBAAiB;QACrB,OAAO,IAAI,EAAE,CAAC;YACV,IAAI,CAAC;gBACD,MAAM,OAAO,GAAG,IAAI,CAAC,WAAW,CAAC,WAAW,EAAE,CAAC;gBAC/C,IAAI,OAAO,KAAK,IAAI,EAAE,CAAC;oBACnB,MAAM;gBACV,CAAC;gBAED,IAAI,CAAC,SAAS,EAAE,CAAC,OAAO,CAAC,CAAC;YAC9B,CAAC;YAAC,OAAO,KAAK,EAAE,CAAC;gBACb,IAAI,CAAC,OAAO,EAAE,CAAC,KAAc,CAAC,CAAC;YACnC,CAAC;QACL,CAAC;IACL,CAAC;IAED,KAAK,CAAC,KAAK;QACP,IAAI,IAAI,CAAC,QAAQ,EAAE,CAAC;YAChB,MAAM,cAAc,GAAG,IAAI,CAAC,QAAQ,CAAC;YACrC,IAAI,CAAC,QAAQ,GAAG,SAAS,CAAC;YAE1B,MAAM,YAAY,GAAG,IAAI,OAAO,CAAO,OAAO,CAAC,EAAE;gBAC7C,cAAc,CAAC,IAAI,CAAC,OAAO,EAAE,GAAG,EAAE;oBAC9B,OAAO,EAAE,CAAC;gBACd,CAAC,CAAC,CAAC;YACP,CAAC,CAAC,CAAC;YAEH,IAAI,CAAC;gBACD,cAAc,CAAC,KAAK,EAAE,GAAG,EAAE,CAAC;YAChC,CAAC;YAAC,MAAM,CAAC;gBACL,SAAS;YACb,CAAC;YAED,MAAM,OAAO,CAAC,IAAI,CAAC,CAAC,YAAY,EAAE,IAAI,OAAO,CAAC,OAAO,CAAC,EAAE,CAAC,UAAU,CAAC,OAAO,EAAE,IAAK,CAAC,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,CAAC;YAE/F,IAAI,cAAc,CAAC,QAAQ,KAAK,IAAI,EAAE,CAAC;gBACnC,IAAI,CAAC;
oBACD,cAAc,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;gBACnC,CAAC;gBAAC,MAAM,CAAC;oBACL,SAAS;gBACb,CAAC;gBAED,MAAM,OAAO,CAAC,IAAI,CAAC,CAAC,YAAY,EAAE,IAAI,OAAO,CAAC,OAAO,CAAC,EAAE,CAAC,UAAU,CAAC,OAAO,EAAE,IAAK,CAAC,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,CAAC;YACnG,CAAC;YAED,IAAI,cAAc,CAAC,QAAQ,KAAK,IAAI,EAAE,CAAC;gBACnC,IAAI,CAAC;oBACD,cAAc,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;gBACnC,CAAC;gBAAC,MAAM,CAAC;oBACL,SAAS;gBACb,CAAC;YACL,CAAC;QACL,CAAC;QAED,IAAI,CAAC,WAAW,CAAC,KAAK,EAAE,CAAC;IAC7B,CAAC;IAED,IAAI,CAAC,OAAuB;QACxB,OAAO,IAAI,OAAO,CAAC,OAAO,CAAC,EAAE;YACzB,IAAI,CAAC,IAAI,CAAC,QAAQ,EAAE,KAAK,EAAE,CAAC;gBACxB,MAAM,IAAI,KAAK,CAAC,eAAe,CAAC,CAAC;YACrC,CAAC;YAED,MAAM,IAAI,GAAG,IAAA,2BAAgB,EAAC,OAAO,CAAC,CAAC;YACvC,IAAI,IAAI,CAAC,QAAQ,CAAC,KAAK,CAAC,KAAK,CAAC,IAAI,CAAC,EAAE,CAAC;gBAClC,OAAO,EAAE,CAAC;YACd,CAAC;iBAAM,CAAC;gBACJ,IAAI,CAAC,QAAQ,CAAC,KAAK,CAAC,IAAI,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC;YAC/C,CAAC;QACL,CAAC,CAAC,CAAC;IACP,CAAC;CACJ;AAvKD,oDAuKC;AAED,SAAS,UAAU;IACf,OAAO,MAAM,IAAI,sBAAO,CAAC;AAC7B,CAAC"}
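The `close()` method in stdio.js above shuts the child down in escalating steps: end stdin and wait, then SIGTERM and wait, then SIGKILL. A minimal standalone sketch of that escalation follows; `gracefulShutdown` is an illustrative name, not SDK API, and unlike the real implementation the timers here are not unref'd, to keep the sketch simple.

```javascript
// Sketch of the escalating shutdown used by StdioClientTransport.close():
// polite stdin close -> SIGTERM -> SIGKILL, waiting up to `waitMs` between steps.
// `gracefulShutdown` is a hypothetical name; the SDK unrefs its wait timers.
async function gracefulShutdown(proc, waitMs = 2000) {
    const closed = new Promise(resolve => proc.once('close', resolve));
    const wait = ms => new Promise(resolve => setTimeout(resolve, ms));
    try {
        proc.stdin?.end(); // polite request: let the child exit on its own
    }
    catch {
        // ignore
    }
    await Promise.race([closed, wait(waitMs)]);
    if (proc.exitCode === null) {
        try {
            proc.kill('SIGTERM'); // firmer request
        }
        catch {
            // ignore
        }
        await Promise.race([closed, wait(waitMs)]);
    }
    if (proc.exitCode === null) {
        try {
            proc.kill('SIGKILL'); // last resort
        }
        catch {
            // ignore
        }
    }
}
```

Checking `exitCode === null` between steps means a child that exits promptly never sees a signal at all.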
171
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/streamableHttp.d.ts
generated
vendored
Normal file
@@ -0,0 +1,171 @@
import { Transport, FetchLike } from '../shared/transport.js';
import { JSONRPCMessage } from '../types.js';
import { OAuthClientProvider } from './auth.js';
export declare class StreamableHTTPError extends Error {
    readonly code: number | undefined;
    constructor(code: number | undefined, message: string | undefined);
}
/**
 * Options for starting or authenticating an SSE connection
 */
export interface StartSSEOptions {
    /**
     * The resumption token used to continue long-running requests that were interrupted.
     *
     * This allows clients to reconnect and continue from where they left off.
     */
    resumptionToken?: string;
    /**
     * A callback that is invoked when the resumption token changes.
     *
     * This allows clients to persist the latest token for potential reconnection.
     */
    onresumptiontoken?: (token: string) => void;
    /**
     * Override Message ID to associate with the replay message
     * so that the response can be associated with the new resumed request.
     */
    replayMessageId?: string | number;
}
/**
 * Configuration options for reconnection behavior of the StreamableHTTPClientTransport.
 */
export interface StreamableHTTPReconnectionOptions {
    /**
     * Maximum backoff time between reconnection attempts in milliseconds.
     * Default is 30000 (30 seconds).
     */
    maxReconnectionDelay: number;
    /**
     * Initial backoff time between reconnection attempts in milliseconds.
     * Default is 1000 (1 second).
     */
    initialReconnectionDelay: number;
    /**
     * The factor by which the reconnection delay increases after each attempt.
     * Default is 1.5.
     */
    reconnectionDelayGrowFactor: number;
    /**
     * Maximum number of reconnection attempts before giving up.
     * Default is 2.
     */
    maxRetries: number;
}
/**
 * Configuration options for the `StreamableHTTPClientTransport`.
 */
export type StreamableHTTPClientTransportOptions = {
    /**
     * An OAuth client provider to use for authentication.
     *
     * When an `authProvider` is specified and the connection is started:
     * 1. The connection is attempted with any existing access token from the `authProvider`.
     * 2. If the access token has expired, the `authProvider` is used to refresh the token.
     * 3. If token refresh fails or no access token exists, and auth is required, `OAuthClientProvider.redirectToAuthorization` is called, and an `UnauthorizedError` will be thrown from `connect`/`start`.
     *
     * After the user has finished authorizing via their user agent, and is redirected back to the MCP client application, call `StreamableHTTPClientTransport.finishAuth` with the authorization code before retrying the connection.
     *
     * If an `authProvider` is not provided, and auth is required, an `UnauthorizedError` will be thrown.
     *
     * `UnauthorizedError` might also be thrown when sending any message over the transport, indicating that the session has expired, and needs to be re-authed and reconnected.
     */
    authProvider?: OAuthClientProvider;
    /**
     * Customizes HTTP requests to the server.
     */
    requestInit?: RequestInit;
    /**
     * Custom fetch implementation used for all network requests.
     */
    fetch?: FetchLike;
    /**
     * Options to configure the reconnection behavior.
     */
    reconnectionOptions?: StreamableHTTPReconnectionOptions;
    /**
     * Session ID for the connection. This is used to identify the session on the server.
     * When not provided and connecting to a server that supports session IDs, the server will generate a new session ID.
     */
    sessionId?: string;
};
/**
 * Client transport for Streamable HTTP: this implements the MCP Streamable HTTP transport specification.
 * It will connect to a server using HTTP POST for sending messages and HTTP GET with Server-Sent Events
 * for receiving messages.
 */
export declare class StreamableHTTPClientTransport implements Transport {
    private _abortController?;
    private _url;
    private _resourceMetadataUrl?;
    private _scope?;
    private _requestInit?;
    private _authProvider?;
    private _fetch?;
    private _fetchWithInit;
    private _sessionId?;
    private _reconnectionOptions;
    private _protocolVersion?;
    private _hasCompletedAuthFlow;
    private _lastUpscopingHeader?;
    private _serverRetryMs?;
    private _reconnectionTimeout?;
    onclose?: () => void;
    onerror?: (error: Error) => void;
    onmessage?: (message: JSONRPCMessage) => void;
    constructor(url: URL, opts?: StreamableHTTPClientTransportOptions);
    private _authThenStart;
    private _commonHeaders;
    private _startOrAuthSse;
    /**
     * Calculates the next reconnection delay using backoff algorithm
     *
     * @param attempt Current reconnection attempt count for the specific stream
     * @returns Time to wait in milliseconds before next reconnection attempt
     */
    private _getNextReconnectionDelay;
    /**
     * Schedule a reconnection attempt using server-provided retry interval or backoff
     *
     * @param lastEventId The ID of the last received event for resumability
     * @param attemptCount Current reconnection attempt count for this specific stream
     */
    private _scheduleReconnection;
    private _handleSseStream;
    start(): Promise<void>;
    /**
     * Call this method after the user has finished authorizing via their user agent and is redirected back to the MCP client application. This will exchange the authorization code for an access token, enabling the next connection attempt to successfully auth.
     */
    finishAuth(authorizationCode: string): Promise<void>;
    close(): Promise<void>;
    send(message: JSONRPCMessage | JSONRPCMessage[], options?: {
        resumptionToken?: string;
        onresumptiontoken?: (token: string) => void;
    }): Promise<void>;
    get sessionId(): string | undefined;
    /**
     * Terminates the current session by sending a DELETE request to the server.
     *
     * Clients that no longer need a particular session
     * (e.g., because the user is leaving the client application) SHOULD send an
     * HTTP DELETE to the MCP endpoint with the Mcp-Session-Id header to explicitly
     * terminate the session.
     *
     * The server MAY respond with HTTP 405 Method Not Allowed, indicating that
     * the server does not allow clients to terminate sessions.
     */
    terminateSession(): Promise<void>;
    setProtocolVersion(version: string): void;
    get protocolVersion(): string | undefined;
    /**
     * Resume an SSE stream from a previous event ID.
     * Opens a GET SSE connection with Last-Event-ID header to replay missed events.
     *
     * @param lastEventId The event ID to resume from
     * @param options Optional callback to receive new resumption tokens
     */
    resumeStream(lastEventId: string, options?: {
        onresumptiontoken?: (token: string) => void;
    }): Promise<void>;
}
//# sourceMappingURL=streamableHttp.d.ts.map
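The `StreamableHTTPReconnectionOptions` defaults documented above (initial 1000 ms, growth factor 1.5, cap 30000 ms, 2 retries) imply the following delay schedule. This sketch mirrors the private `_getNextReconnectionDelay` method; `nextReconnectionDelay` is an illustrative stand-in, not SDK API.

```javascript
// Sketch of the reconnection delay schedule implied by the documented defaults.
// A server-provided SSE `retry:` value, when present, overrides the backoff.
// `nextReconnectionDelay` is a hypothetical name mirroring _getNextReconnectionDelay.
const defaults = {
    initialReconnectionDelay: 1000,
    maxReconnectionDelay: 30000,
    reconnectionDelayGrowFactor: 1.5,
    maxRetries: 2
};

function nextReconnectionDelay(attempt, opts = defaults, serverRetryMs) {
    if (serverRetryMs !== undefined) {
        return serverRetryMs; // server-provided retry interval wins
    }
    // Exponential backoff, capped at the maximum delay
    return Math.min(
        opts.initialReconnectionDelay * Math.pow(opts.reconnectionDelayGrowFactor, attempt),
        opts.maxReconnectionDelay
    );
}
```

With the defaults, successive attempts wait 1000 ms, 1500 ms, 2250 ms, and so on until the 30000 ms cap, but `maxRetries: 2` stops the schedule after two failed attempts.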
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/streamableHttp.d.ts.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"streamableHttp.d.ts","sourceRoot":"","sources":["../../../src/client/streamableHttp.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,SAAS,EAAE,SAAS,EAAyC,MAAM,wBAAwB,CAAC;AACrG,OAAO,EAAwE,cAAc,EAAwB,MAAM,aAAa,CAAC;AACzI,OAAO,EAAkD,mBAAmB,EAAqB,MAAM,WAAW,CAAC;AAWnH,qBAAa,mBAAoB,SAAQ,KAAK;aAEtB,IAAI,EAAE,MAAM,GAAG,SAAS;gBAAxB,IAAI,EAAE,MAAM,GAAG,SAAS,EACxC,OAAO,EAAE,MAAM,GAAG,SAAS;CAIlC;AAED;;GAEG;AACH,MAAM,WAAW,eAAe;IAC5B;;;;OAIG;IACH,eAAe,CAAC,EAAE,MAAM,CAAC;IAEzB;;;;OAIG;IACH,iBAAiB,CAAC,EAAE,CAAC,KAAK,EAAE,MAAM,KAAK,IAAI,CAAC;IAE5C;;;OAGG;IACH,eAAe,CAAC,EAAE,MAAM,GAAG,MAAM,CAAC;CACrC;AAED;;GAEG;AACH,MAAM,WAAW,iCAAiC;IAC9C;;;OAGG;IACH,oBAAoB,EAAE,MAAM,CAAC;IAE7B;;;OAGG;IACH,wBAAwB,EAAE,MAAM,CAAC;IAEjC;;;OAGG;IACH,2BAA2B,EAAE,MAAM,CAAC;IAEpC;;;OAGG;IACH,UAAU,EAAE,MAAM,CAAC;CACtB;AAED;;GAEG;AACH,MAAM,MAAM,oCAAoC,GAAG;IAC/C;;;;;;;;;;;;;OAaG;IACH,YAAY,CAAC,EAAE,mBAAmB,CAAC;IAEnC;;OAEG;IACH,WAAW,CAAC,EAAE,WAAW,CAAC;IAE1B;;OAEG;IACH,KAAK,CAAC,EAAE,SAAS,CAAC;IAElB;;OAEG;IACH,mBAAmB,CAAC,EAAE,iCAAiC,CAAC;IAExD;;;OAGG;IACH,SAAS,CAAC,EAAE,MAAM,CAAC;CACtB,CAAC;AAEF;;;;GAIG;AACH,qBAAa,6BAA8B,YAAW,SAAS;IAC3D,OAAO,CAAC,gBAAgB,CAAC,CAAkB;IAC3C,OAAO,CAAC,IAAI,CAAM;IAClB,OAAO,CAAC,oBAAoB,CAAC,CAAM;IACnC,OAAO,CAAC,MAAM,CAAC,CAAS;IACxB,OAAO,CAAC,YAAY,CAAC,CAAc;IACnC,OAAO,CAAC,aAAa,CAAC,CAAsB;IAC5C,OAAO,CAAC,MAAM,CAAC,CAAY;IAC3B,OAAO,CAAC,cAAc,CAAY;IAClC,OAAO,CAAC,UAAU,CAAC,CAAS;IAC5B,OAAO,CAAC,oBAAoB,CAAoC;IAChE,OAAO,CAAC,gBAAgB,CAAC,CAAS;IAClC,OAAO,CAAC,qBAAqB,CAAS;IACtC,OAAO,CAAC,oBAAoB,CAAC,CAAS;IACtC,OAAO,CAAC,cAAc,CAAC,CAAS;IAChC,OAAO,CAAC,oBAAoB,CAAC,CAAgC;IAE7D,OAAO,CAAC,EAAE,MAAM,IAAI,CAAC;IACrB,OAAO,CAAC,EAAE,CAAC,KAAK,EAAE,KAAK,KAAK,IAAI,CAAC;IACjC,SAAS,CAAC,EAAE,CAAC,OAAO,EAAE,cAAc,KAAK,IAAI,CAAC;gBAElC,GAAG,EAAE,GAAG,EAAE,IAAI,CAAC,EAAE,oCAAoC;YAYnD,cAAc;YAyBd,cAAc;YAwBd,eAAe;IA4C7B;;;;;OAKG;IACH,OAAO,CAAC,yBAAyB;IAejC;;;;;OAKG;IACH,OAAO,CAAC,qBAAqB;IAwB7B,OAAO,CAAC,gBAAgB;IA+GlB,KAAK;IAUX;;OAEG;IACG,UAAU,CAAC,iBAAiB,EAAE,MAAM,GAAG,OAAO,CAAC,IAAI,CAAC;IAiBp
D,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC;IAStB,IAAI,CACN,OAAO,EAAE,cAAc,GAAG,cAAc,EAAE,EAC1C,OAAO,CAAC,EAAE;QAAE,eAAe,CAAC,EAAE,MAAM,CAAC;QAAC,iBAAiB,CAAC,EAAE,CAAC,KAAK,EAAE,MAAM,KAAK,IAAI,CAAA;KAAE,GACpF,OAAO,CAAC,IAAI,CAAC;IA0JhB,IAAI,SAAS,IAAI,MAAM,GAAG,SAAS,CAElC;IAED;;;;;;;;;;OAUG;IACG,gBAAgB,IAAI,OAAO,CAAC,IAAI,CAAC;IA+BvC,kBAAkB,CAAC,OAAO,EAAE,MAAM,GAAG,IAAI;IAGzC,IAAI,eAAe,IAAI,MAAM,GAAG,SAAS,CAExC;IAED;;;;;;OAMG;IACG,YAAY,CAAC,WAAW,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE;QAAE,iBAAiB,CAAC,EAAE,CAAC,KAAK,EAAE,MAAM,KAAK,IAAI,CAAA;KAAE,GAAG,OAAO,CAAC,IAAI,CAAC;CAMpH"}
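Every request the transport sends carries an auth, session, and protocol-version header. The sketch below mirrors the private `_commonHeaders` method shown in streamableHttp.js later in this diff; `buildCommonHeaders` is an illustrative name (not SDK API) and the version string is an example value.

```javascript
// Sketch of the per-request header assembly done by _commonHeaders:
// bearer token, mcp-session-id, and mcp-protocol-version, with any
// caller-supplied headers layered on top.
// `buildCommonHeaders` is a hypothetical standalone name, not SDK API.
function buildCommonHeaders({ accessToken, sessionId, protocolVersion, extraHeaders } = {}) {
    const headers = {};
    if (accessToken) {
        headers['Authorization'] = `Bearer ${accessToken}`;
    }
    if (sessionId) {
        headers['mcp-session-id'] = sessionId;
    }
    if (protocolVersion) {
        headers['mcp-protocol-version'] = protocolVersion;
    }
    // Caller-supplied headers (requestInit.headers in the SDK) take precedence
    return new Headers({ ...headers, ...extraHeaders });
}
```

Spreading `extraHeaders` last lets `requestInit.headers` override any of the generated values, matching the merge order in the real method.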
482
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/streamableHttp.js
generated
vendored
Normal file
@@ -0,0 +1,482 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.StreamableHTTPClientTransport = exports.StreamableHTTPError = void 0;
const transport_js_1 = require("../shared/transport.js");
const types_js_1 = require("../types.js");
const auth_js_1 = require("./auth.js");
const stream_1 = require("eventsource-parser/stream");
// Default reconnection options for StreamableHTTP connections
const DEFAULT_STREAMABLE_HTTP_RECONNECTION_OPTIONS = {
    initialReconnectionDelay: 1000,
    maxReconnectionDelay: 30000,
    reconnectionDelayGrowFactor: 1.5,
    maxRetries: 2
};
class StreamableHTTPError extends Error {
    constructor(code, message) {
        super(`Streamable HTTP error: ${message}`);
        this.code = code;
    }
}
exports.StreamableHTTPError = StreamableHTTPError;
/**
 * Client transport for Streamable HTTP: this implements the MCP Streamable HTTP transport specification.
 * It will connect to a server using HTTP POST for sending messages and HTTP GET with Server-Sent Events
 * for receiving messages.
 */
class StreamableHTTPClientTransport {
    constructor(url, opts) {
        this._hasCompletedAuthFlow = false; // Circuit breaker: detect auth success followed by immediate 401
        this._url = url;
        this._resourceMetadataUrl = undefined;
        this._scope = undefined;
        this._requestInit = opts?.requestInit;
        this._authProvider = opts?.authProvider;
        this._fetch = opts?.fetch;
        this._fetchWithInit = (0, transport_js_1.createFetchWithInit)(opts?.fetch, opts?.requestInit);
        this._sessionId = opts?.sessionId;
        this._reconnectionOptions = opts?.reconnectionOptions ?? DEFAULT_STREAMABLE_HTTP_RECONNECTION_OPTIONS;
    }
    async _authThenStart() {
        if (!this._authProvider) {
            throw new auth_js_1.UnauthorizedError('No auth provider');
        }
        let result;
        try {
            result = await (0, auth_js_1.auth)(this._authProvider, {
                serverUrl: this._url,
                resourceMetadataUrl: this._resourceMetadataUrl,
                scope: this._scope,
                fetchFn: this._fetchWithInit
            });
        }
        catch (error) {
            this.onerror?.(error);
            throw error;
        }
        if (result !== 'AUTHORIZED') {
            throw new auth_js_1.UnauthorizedError();
        }
        return await this._startOrAuthSse({ resumptionToken: undefined });
    }
    async _commonHeaders() {
        const headers = {};
        if (this._authProvider) {
            const tokens = await this._authProvider.tokens();
            if (tokens) {
                headers['Authorization'] = `Bearer ${tokens.access_token}`;
            }
        }
        if (this._sessionId) {
            headers['mcp-session-id'] = this._sessionId;
        }
        if (this._protocolVersion) {
            headers['mcp-protocol-version'] = this._protocolVersion;
        }
        const extraHeaders = (0, transport_js_1.normalizeHeaders)(this._requestInit?.headers);
        return new Headers({
            ...headers,
            ...extraHeaders
        });
    }
    async _startOrAuthSse(options) {
        const { resumptionToken } = options;
        try {
            // Try to open an initial SSE stream with GET to listen for server messages
            // This is optional according to the spec - server may not support it
            const headers = await this._commonHeaders();
            headers.set('Accept', 'text/event-stream');
            // Include Last-Event-ID header for resumable streams if provided
            if (resumptionToken) {
                headers.set('last-event-id', resumptionToken);
            }
            const response = await (this._fetch ?? fetch)(this._url, {
                method: 'GET',
                headers,
                signal: this._abortController?.signal
            });
            if (!response.ok) {
                await response.body?.cancel();
                if (response.status === 401 && this._authProvider) {
                    // Need to authenticate
                    return await this._authThenStart();
                }
                // 405 indicates that the server does not offer an SSE stream at GET endpoint
                // This is an expected case that should not trigger an error
                if (response.status === 405) {
                    return;
                }
                throw new StreamableHTTPError(response.status, `Failed to open SSE stream: ${response.statusText}`);
            }
            this._handleSseStream(response.body, options, true);
        }
        catch (error) {
            this.onerror?.(error);
            throw error;
        }
    }
    /**
     * Calculates the next reconnection delay using backoff algorithm
     *
     * @param attempt Current reconnection attempt count for the specific stream
     * @returns Time to wait in milliseconds before next reconnection attempt
     */
    _getNextReconnectionDelay(attempt) {
        // Use server-provided retry value if available
        if (this._serverRetryMs !== undefined) {
            return this._serverRetryMs;
        }
        // Fall back to exponential backoff
        const initialDelay = this._reconnectionOptions.initialReconnectionDelay;
        const growFactor = this._reconnectionOptions.reconnectionDelayGrowFactor;
        const maxDelay = this._reconnectionOptions.maxReconnectionDelay;
        // Cap at maximum delay
        return Math.min(initialDelay * Math.pow(growFactor, attempt), maxDelay);
    }
    /**
     * Schedule a reconnection attempt using server-provided retry interval or backoff
     *
     * @param lastEventId The ID of the last received event for resumability
     * @param attemptCount Current reconnection attempt count for this specific stream
     */
    _scheduleReconnection(options, attemptCount = 0) {
        // Use provided options or default options
        const maxRetries = this._reconnectionOptions.maxRetries;
        // Check if we've exceeded maximum retry attempts
        if (attemptCount >= maxRetries) {
            this.onerror?.(new Error(`Maximum reconnection attempts (${maxRetries}) exceeded.`));
            return;
        }
        // Calculate next delay based on current attempt count
        const delay = this._getNextReconnectionDelay(attemptCount);
        // Schedule the reconnection
        this._reconnectionTimeout = setTimeout(() => {
            // Use the last event ID to resume where we left off
            this._startOrAuthSse(options).catch(error => {
                this.onerror?.(new Error(`Failed to reconnect SSE stream: ${error instanceof Error ? error.message : String(error)}`));
                // Schedule another attempt if this one failed, incrementing the attempt counter
                this._scheduleReconnection(options, attemptCount + 1);
            });
        }, delay);
    }
    _handleSseStream(stream, options, isReconnectable) {
        if (!stream) {
            return;
        }
        const { onresumptiontoken, replayMessageId } = options;
        let lastEventId;
        // Track whether we've received a priming event (event with ID)
        // Per spec, server SHOULD send a priming event with ID before closing
        let hasPrimingEvent = false;
        // Track whether we've received a response - if so, no need to reconnect
        // Reconnection is for when server disconnects BEFORE sending response
        let receivedResponse = false;
        const processStream = async () => {
            // this is the closest we can get to trying to catch network errors
            // if something happens reader will throw
            try {
                // Create a pipeline: binary stream -> text decoder -> SSE parser
                const reader = stream
                    .pipeThrough(new TextDecoderStream())
                    .pipeThrough(new stream_1.EventSourceParserStream({
                        onRetry: (retryMs) => {
                            // Capture server-provided retry value for reconnection timing
                            this._serverRetryMs = retryMs;
                        }
                    }))
                    .getReader();
                while (true) {
                    const { value: event, done } = await reader.read();
                    if (done) {
                        break;
                    }
                    // Update last event ID if provided
                    if (event.id) {
                        lastEventId = event.id;
                        // Mark that we've received a priming event - stream is now resumable
                        hasPrimingEvent = true;
                        onresumptiontoken?.(event.id);
                    }
                    // Skip events with no data (priming events, keep-alives)
                    if (!event.data) {
                        continue;
                    }
                    if (!event.event || event.event === 'message') {
                        try {
                            const message = types_js_1.JSONRPCMessageSchema.parse(JSON.parse(event.data));
                            if ((0, types_js_1.isJSONRPCResultResponse)(message)) {
                                // Mark that we received a response - no need to reconnect for this request
                                receivedResponse = true;
                                if (replayMessageId !== undefined) {
                                    message.id = replayMessageId;
                                }
                            }
                            this.onmessage?.(message);
                        }
                        catch (error) {
                            this.onerror?.(error);
                        }
                    }
                }
                // Handle graceful server-side disconnect
                // Server may close connection after sending event ID and retry field
                // Reconnect if: already reconnectable (GET stream) OR received a priming event (POST stream with event ID)
                // BUT don't reconnect if we already received a response - the request is complete
                const canResume = isReconnectable || hasPrimingEvent;
                const needsReconnect = canResume && !receivedResponse;
                if (needsReconnect && this._abortController && !this._abortController.signal.aborted) {
                    this._scheduleReconnection({
                        resumptionToken: lastEventId,
                        onresumptiontoken,
                        replayMessageId
                    }, 0);
                }
            }
            catch (error) {
                // Handle stream errors - likely a network disconnect
                this.onerror?.(new Error(`SSE stream disconnected: ${error}`));
                // Attempt to reconnect if the stream disconnects unexpectedly and we aren't closing
                // Reconnect if: already reconnectable (GET stream) OR received a priming event (POST stream with event ID)
                // BUT don't reconnect if we already received a response - the request is complete
                const canResume = isReconnectable || hasPrimingEvent;
                const needsReconnect = canResume && !receivedResponse;
                if (needsReconnect && this._abortController && !this._abortController.signal.aborted) {
                    // Use the exponential backoff reconnection strategy
                    try {
                        this._scheduleReconnection({
                            resumptionToken: lastEventId,
                            onresumptiontoken,
                            replayMessageId
                        }, 0);
                    }
                    catch (error) {
                        this.onerror?.(new Error(`Failed to reconnect: ${error instanceof Error ? error.message : String(error)}`));
                    }
                }
            }
        };
        processStream();
    }
async start() {
    if (this._abortController) {
        throw new Error('StreamableHTTPClientTransport already started! If using Client class, note that connect() calls start() automatically.');
    }
    this._abortController = new AbortController();
}
/**
 * Call this method after the user has finished authorizing via their user agent and is redirected back to the MCP client application. This will exchange the authorization code for an access token, enabling the next connection attempt to successfully auth.
 */
async finishAuth(authorizationCode) {
    if (!this._authProvider) {
        throw new auth_js_1.UnauthorizedError('No auth provider');
    }
    const result = await (0, auth_js_1.auth)(this._authProvider, {
        serverUrl: this._url,
        authorizationCode,
        resourceMetadataUrl: this._resourceMetadataUrl,
        scope: this._scope,
        fetchFn: this._fetchWithInit
    });
    if (result !== 'AUTHORIZED') {
        throw new auth_js_1.UnauthorizedError('Failed to authorize');
    }
}
async close() {
    if (this._reconnectionTimeout) {
        clearTimeout(this._reconnectionTimeout);
        this._reconnectionTimeout = undefined;
    }
    this._abortController?.abort();
    this.onclose?.();
}
async send(message, options) {
    try {
        const { resumptionToken, onresumptiontoken } = options || {};
        if (resumptionToken) {
            // If we have a last event ID, we need to reconnect the SSE stream
            this._startOrAuthSse({ resumptionToken, replayMessageId: (0, types_js_1.isJSONRPCRequest)(message) ? message.id : undefined }).catch(err => this.onerror?.(err));
            return;
        }
        const headers = await this._commonHeaders();
        headers.set('content-type', 'application/json');
        headers.set('accept', 'application/json, text/event-stream');
        const init = {
            ...this._requestInit,
            method: 'POST',
            headers,
            body: JSON.stringify(message),
            signal: this._abortController?.signal
        };
        const response = await (this._fetch ?? fetch)(this._url, init);
        // Handle session ID received during initialization
        const sessionId = response.headers.get('mcp-session-id');
        if (sessionId) {
            this._sessionId = sessionId;
        }
        if (!response.ok) {
            const text = await response.text().catch(() => null);
            if (response.status === 401 && this._authProvider) {
                // Prevent infinite recursion when server returns 401 after successful auth
                if (this._hasCompletedAuthFlow) {
                    throw new StreamableHTTPError(401, 'Server returned 401 after successful authentication');
                }
                const { resourceMetadataUrl, scope } = (0, auth_js_1.extractWWWAuthenticateParams)(response);
                this._resourceMetadataUrl = resourceMetadataUrl;
                this._scope = scope;
                const result = await (0, auth_js_1.auth)(this._authProvider, {
                    serverUrl: this._url,
                    resourceMetadataUrl: this._resourceMetadataUrl,
                    scope: this._scope,
                    fetchFn: this._fetchWithInit
                });
                if (result !== 'AUTHORIZED') {
                    throw new auth_js_1.UnauthorizedError();
                }
                // Mark that we completed auth flow
                this._hasCompletedAuthFlow = true;
                // Purposely _not_ awaited, so we don't call onerror twice
                return this.send(message);
            }
            if (response.status === 403 && this._authProvider) {
                const { resourceMetadataUrl, scope, error } = (0, auth_js_1.extractWWWAuthenticateParams)(response);
                if (error === 'insufficient_scope') {
                    const wwwAuthHeader = response.headers.get('WWW-Authenticate');
                    // Check if we've already tried upscoping with this header to prevent infinite loops.
                    if (this._lastUpscopingHeader === wwwAuthHeader) {
                        throw new StreamableHTTPError(403, 'Server returned 403 after trying upscoping');
                    }
                    if (scope) {
                        this._scope = scope;
                    }
                    if (resourceMetadataUrl) {
                        this._resourceMetadataUrl = resourceMetadataUrl;
                    }
                    // Mark that upscoping was tried.
                    this._lastUpscopingHeader = wwwAuthHeader ?? undefined;
                    const result = await (0, auth_js_1.auth)(this._authProvider, {
                        serverUrl: this._url,
                        resourceMetadataUrl: this._resourceMetadataUrl,
                        scope: this._scope,
                        fetchFn: this._fetch
                    });
                    if (result !== 'AUTHORIZED') {
                        throw new auth_js_1.UnauthorizedError();
                    }
                    return this.send(message);
                }
            }
            throw new StreamableHTTPError(response.status, `Error POSTing to endpoint: ${text}`);
        }
        // Reset auth loop flag on successful response
        this._hasCompletedAuthFlow = false;
        this._lastUpscopingHeader = undefined;
        // If the response is 202 Accepted, there's no body to process
        if (response.status === 202) {
            await response.body?.cancel();
            // if the accepted notification is initialized, we start the SSE stream
            // if it's supported by the server
            if ((0, types_js_1.isInitializedNotification)(message)) {
                // Start without a lastEventId since this is a fresh connection
                this._startOrAuthSse({ resumptionToken: undefined }).catch(err => this.onerror?.(err));
            }
            return;
        }
        // Get original message(s) for detecting request IDs
        const messages = Array.isArray(message) ? message : [message];
        const hasRequests = messages.filter(msg => 'method' in msg && 'id' in msg && msg.id !== undefined).length > 0;
        // Check the response type
        const contentType = response.headers.get('content-type');
        if (hasRequests) {
            if (contentType?.includes('text/event-stream')) {
                // Handle SSE stream responses for requests
                // We use the same handler as standalone streams, which now supports
                // reconnection with the last event ID
                this._handleSseStream(response.body, { onresumptiontoken }, false);
            }
            else if (contentType?.includes('application/json')) {
                // For non-streaming servers, we might get direct JSON responses
                const data = await response.json();
                const responseMessages = Array.isArray(data)
                    ? data.map(msg => types_js_1.JSONRPCMessageSchema.parse(msg))
                    : [types_js_1.JSONRPCMessageSchema.parse(data)];
                for (const msg of responseMessages) {
                    this.onmessage?.(msg);
                }
            }
            else {
                await response.body?.cancel();
                throw new StreamableHTTPError(-1, `Unexpected content type: ${contentType}`);
            }
        }
        else {
            // No requests in message but got 200 OK - still need to release connection
            await response.body?.cancel();
        }
    }
    catch (error) {
        this.onerror?.(error);
        throw error;
    }
}
get sessionId() {
    return this._sessionId;
}
/**
 * Terminates the current session by sending a DELETE request to the server.
 *
 * Clients that no longer need a particular session
 * (e.g., because the user is leaving the client application) SHOULD send an
 * HTTP DELETE to the MCP endpoint with the Mcp-Session-Id header to explicitly
 * terminate the session.
 *
 * The server MAY respond with HTTP 405 Method Not Allowed, indicating that
 * the server does not allow clients to terminate sessions.
 */
async terminateSession() {
    if (!this._sessionId) {
        return; // No session to terminate
    }
    try {
        const headers = await this._commonHeaders();
        const init = {
            ...this._requestInit,
            method: 'DELETE',
            headers,
            signal: this._abortController?.signal
        };
        const response = await (this._fetch ?? fetch)(this._url, init);
        await response.body?.cancel();
        // We specifically handle 405 as a valid response according to the spec,
        // meaning the server does not support explicit session termination
        if (!response.ok && response.status !== 405) {
            throw new StreamableHTTPError(response.status, `Failed to terminate session: ${response.statusText}`);
        }
        this._sessionId = undefined;
    }
    catch (error) {
        this.onerror?.(error);
        throw error;
    }
}
setProtocolVersion(version) {
    this._protocolVersion = version;
}
get protocolVersion() {
    return this._protocolVersion;
}
/**
 * Resume an SSE stream from a previous event ID.
 * Opens a GET SSE connection with Last-Event-ID header to replay missed events.
 *
 * @param lastEventId The event ID to resume from
 * @param options Optional callback to receive new resumption tokens
 */
async resumeStream(lastEventId, options) {
    await this._startOrAuthSse({
        resumptionToken: lastEventId,
        onresumptiontoken: options?.onresumptiontoken
    });
}
}
exports.StreamableHTTPClientTransport = StreamableHTTPClientTransport;
//# sourceMappingURL=streamableHttp.js.map
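The reconnect decision in `processStream` above is applied identically in both the graceful-close and error paths, and reduces to a small pure predicate. The sketch below restates that logic outside the class for illustration; the helper name `shouldReconnect` is ours, not part of the SDK.

```javascript
// Restatement of the transport's reconnect decision (hypothetical helper,
// not part of the SDK). A stream may be resumed if it is a standalone GET
// stream (always reconnectable) or a POST stream that received a priming
// event ID -- but never once the response for the request has arrived,
// and never after the transport has been aborted.
function shouldReconnect({ isReconnectable, hasPrimingEvent, receivedResponse, aborted }) {
    const canResume = isReconnectable || hasPrimingEvent;
    return canResume && !receivedResponse && !aborted;
}

// A GET stream that dropped mid-flight should be resumed:
console.log(shouldReconnect({ isReconnectable: true, hasPrimingEvent: false, receivedResponse: false, aborted: false })); // true
// A POST stream whose response already arrived should not:
console.log(shouldReconnect({ isReconnectable: false, hasPrimingEvent: true, receivedResponse: true, aborted: false })); // false
```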
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/streamableHttp.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
17
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/websocket.d.ts
generated
vendored
Normal file
@@ -0,0 +1,17 @@
import { Transport } from '../shared/transport.js';
import { JSONRPCMessage } from '../types.js';
/**
 * Client transport for WebSocket: this will connect to a server over the WebSocket protocol.
 */
export declare class WebSocketClientTransport implements Transport {
    private _socket?;
    private _url;
    onclose?: () => void;
    onerror?: (error: Error) => void;
    onmessage?: (message: JSONRPCMessage) => void;
    constructor(url: URL);
    start(): Promise<void>;
    close(): Promise<void>;
    send(message: JSONRPCMessage): Promise<void>;
}
//# sourceMappingURL=websocket.d.ts.map
1
projects/arabica/sprint1/node_modules/@modelcontextprotocol/sdk/dist/cjs/client/websocket.d.ts.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"websocket.d.ts","sourceRoot":"","sources":["../../../src/client/websocket.ts"],"names":[],"mappings":"AAAA,OAAO,EAAE,SAAS,EAAE,MAAM,wBAAwB,CAAC;AACnD,OAAO,EAAE,cAAc,EAAwB,MAAM,aAAa,CAAC;AAInE;;GAEG;AACH,qBAAa,wBAAyB,YAAW,SAAS;IACtD,OAAO,CAAC,OAAO,CAAC,CAAY;IAC5B,OAAO,CAAC,IAAI,CAAM;IAElB,OAAO,CAAC,EAAE,MAAM,IAAI,CAAC;IACrB,OAAO,CAAC,EAAE,CAAC,KAAK,EAAE,KAAK,KAAK,IAAI,CAAC;IACjC,SAAS,CAAC,EAAE,CAAC,OAAO,EAAE,cAAc,KAAK,IAAI,CAAC;gBAElC,GAAG,EAAE,GAAG;IAIpB,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC;IAsChB,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC;IAI5B,IAAI,CAAC,OAAO,EAAE,cAAc,GAAG,OAAO,CAAC,IAAI,CAAC;CAW/C"}
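The declaration file above shows the `Transport` surface every client transport in this SDK exposes: `start`/`send`/`close` plus the `onclose`/`onerror`/`onmessage` callbacks. As a rough illustration of that contract (not SDK code), here is a hypothetical in-memory loopback transport that echoes each sent JSON-RPC message straight back to `onmessage`:

```javascript
// Hypothetical loopback transport illustrating the Transport contract from
// websocket.d.ts: async start/send/close plus onclose/onerror/onmessage
// callbacks. For illustration only; not part of the SDK.
class LoopbackTransport {
    async start() {
        this._started = true;
    }
    async send(message) {
        if (!this._started) {
            // Surface misuse via onerror, mirroring how transports report failures.
            this.onerror?.(new Error('Transport not started'));
            return;
        }
        // Echo the JSON-RPC message straight back, acting as a trivial "server".
        this.onmessage?.(message);
    }
    async close() {
        this._started = false;
        this.onclose?.();
    }
}

// Usage sketch: wire up callbacks, start, send, close.
const t = new LoopbackTransport();
const received = [];
t.onmessage = m => received.push(m);
t.onclose = () => received.push('closed');

(async () => {
    await t.start();
    await t.send({ jsonrpc: '2.0', method: 'ping', id: 1 });
    await t.close();
    console.log(received.length); // 2
})();
```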
Some files were not shown because too many files have changed in this diff.