Initial commit
docs/DEVELOPMENT_PLAN.md (new file, 64 lines)
@@ -0,0 +1,64 @@
# InsightReply Development Plan (Task Breakdown)

Based on the PRD (v1.0), development is split into the following core modules and phases, suitable for entry as Epics/Stories in a task manager such as Notion or Jira.

---

## 🏁 Phase 1: Core MVP (estimated 2–4 weeks)

**Core goal**: complete the end-to-end flow of "fetch tweet -> generate multi-strategy replies -> one-click copy".

### Epic 1: Project Infrastructure

- [ ] **Frontend scaffolding**: set up a Chrome extension base template with Vue 3 + Tailwind CSS, and configure the build tool (e.g. Plasmo or Vite).
- [ ] **Backend selection and setup**: create the base framework for the Go (Golang) backend service.
- [ ] **Database setup**: design and create the base tables (Users, MonitoredKeywords, Tweets, GeneratedReplies).
- [ ] **LLM API integration**: obtain access to the OpenAI (GPT-4) API or another LLM and establish the request pipeline.

### Epic 2: Browser Extension Core

- [ ] **Sidebar/popup UI**: inject the frontend component into X (Twitter) pages to display the InsightReply panel.
- [ ] **Tweet extraction**: capture the text and context of the tweet currently being viewed.
- [ ] **Manual generation flow**: on a manual "Generate" click, call the backend API and return reply suggestions.
- [ ] **Result display and copy**: show the candidate replies and support one-click copy.

### Epic 3: Reply Generation Engine (basic)

- [ ] **Prompt engineering**: write base prompts that reliably generate replies in the 5 distinct styles (cognitive-upgrade, contrarian, data-backed, empathy, founder-experience).
- [ ] **Identity presets**: support basic user identity presets (e.g. AI founder / SaaS Builder), passed to the LLM together with the tweet content.
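A minimal sketch of how identity and strategy might be assembled into a prompt (the strategy names and template wording here are illustrative placeholders, not the actual tuned prompts this Epic will produce):

```typescript
// Hypothetical prompt assembly: combines identity preset, strategy, and tweet text.
// The exact production wording is the tuning task described above.
type Strategy =
  | "cognitive-upgrade"
  | "contrarian"
  | "data-backed"
  | "empathy"
  | "founder-experience";

function buildPrompt(tweet: string, strategy: Strategy, identity: string): string {
  return [
    `You are a ${identity} replying on X (Twitter).`,
    `Write a short, insightful reply in the "${strategy}" style.`,
    `Structure: hook + position + insight, kept brief.`,
    `Tweet: """${tweet}"""`,
  ].join("\n");
}
```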

### Epic 4: Basic Keyword Monitoring

- [ ] **Monitoring rules**: a frontend/admin page lets the user enter their first few core keywords.
- [ ] **Scheduled fetch job**: the server periodically pulls tweets matching the keywords via API/rules and caches them in the database.

---

## 🚀 Phase 2: Automation and Radar Engine (advanced)

**Core goal**: targeted account monitoring, an initial tweet heat score, and a more complete set of reply strategies.

### Epic 5: Account and Combined Monitoring

- [ ] **Account monitoring**: targeted monitoring of key accounts (with real-time fetching).
- [ ] **Combined rule filtering**: cross-filtering by account + keyword and AND/OR multi-condition queries.

### Epic 6: Heat Scoring System

- [ ] **Metric collection**: capture the rate of change of a tweet's likes, retweets, and replies.
- [ ] **Heat formula**: implement `heat = like growth rate * 0.4 + retweet growth rate * 0.3 + reply growth rate * 0.3`.
- [ ] **Boost factors**: add verified-badge detection, follower-count weighting, and trending-topic matching.
- [ ] **Threshold alerts**: tweets whose computed heat exceeds the configured threshold appear in the extension's "high-potential candidates" list.

### Epic 7: Web Admin Console

- [ ] **Web UI**: build a full data dashboard with Nuxt.js / Vue 3 or a similar frontend framework.
- [ ] **Strategy tuning and history**: users can browse all previously generated replies and adjust their personal style tag library.

---

## 💎 Phase 3: Monetization Loop and Data Optimization (complete)

**Core goal**: validate effectiveness through a data feedback loop, launch paid subscriptions, and strengthen the moat.

### Epic 8: Payments and Entitlements

- [ ] **Payment integration**: integrate a mainstream subscription platform such as Stripe.
- [ ] **Tiered plans**: enforce and isolate "generation quota", "monitored keyword cap", and "account cap" by Free/Pro/Premium tier.

### Epic 9: Reply Performance Feedback (V2)

- [ ] **Social metric polling**: periodically query the real like and reply counts of comments the user has posted.
- [ ] **Performance dashboard**: visualize "most effective interaction styles" and "best posting times" in the web console.

### Epic 10: Personalized AI Learning

- [ ] **Style feedback tuning**: optimize the user's dedicated prompt based on the reply styles that earn frequent likes.
- [ ] **Long-term assets**: build industry trend packs and vertical knowledge bases, using RAG to deepen generated content.

docs/PRD.md (new file, 390 lines)
@@ -0,0 +1,390 @@
# InsightReply Product Requirements (PRD v1.0)

> Positioning: a trending-topic reply enhancement system for founders / indie developers
> Form factor: browser extension + web console
> Category: AI writing assistant (not an automated marketing tool)

---

# 1. Product Overview

## 1.1 Vision

Help founders and indie developers, amid industry trends on X (Twitter):

* quickly find topics worth joining
* produce more insightful replies
* grow their personal brand
* refine their voice and interaction quality

---

## 1.2 Positioning

InsightReply is not a "traffic bot"; it is:

> a social expression enhancement system (Social Insight Copilot)

---

## 1.3 Target Users

### Core users

* Indie developers
* SaaS founders
* AI entrepreneurs
* Investors
* Technical creators

### Non-target users

* Bulk marketing accounts
* Automated account-farm operations
* Grey-market traffic operators

---

# 2. Core Value

1. Industry trend filtering
2. Multi-strategy reply generation
3. Personalized expression enhancement
4. Reply performance feedback
5. Long-term influence optimization

---

# 3. Module Design

---

# Module 1: Industry Radar (monitoring)

## 3.1 Account Monitoring

### Description

Users add specific accounts to monitor.

### Features

* Add X accounts
* Set a heat threshold
* Fetch new tweets in real time
* Tweets above the threshold enter the recommendation list

---

## 3.2 Keyword Monitoring

### Description

Watch for tweets containing specific keywords.

### Capabilities

* Multiple keywords
* Heat-based sorting
* Noise filtering
* Industry tagging

Example keywords:

* AI agent
* SaaS
* GPT
* Indie Hacker
* Crypto

---

## 3.3 Combined Monitoring

Supported logical combinations:

* Specific account + keyword
* Multi-keyword AND / OR logic
* Industry filters

Examples:

* @a16z + AI
* Crypto + ETF
* Indie Hacker + Funding
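One way to sketch the combined-rule matching (the field and type names here are assumptions for illustration, not the shipped schema):

```typescript
// Hypothetical combined-monitoring filter: account + keywords with AND / OR logic.
interface MonitorRule {
  account?: string;     // optional account constraint, e.g. "@a16z"
  keywords: string[];   // e.g. ["AI"] or ["Crypto", "ETF"]
  mode: "AND" | "OR";   // how multiple keywords combine
}

interface TweetLike {
  authorHandle: string;
  content: string;
}

function matchesRule(tweet: TweetLike, rule: MonitorRule): boolean {
  // Account constraint is always ANDed with the keyword condition.
  if (rule.account && tweet.authorHandle.toLowerCase() !== rule.account.toLowerCase()) {
    return false;
  }
  const text = tweet.content.toLowerCase();
  const hits = rule.keywords.map((k) => text.includes(k.toLowerCase()));
  return rule.mode === "AND" ? hits.every(Boolean) : hits.some(Boolean);
}
```

For instance, `{ account: "@a16z", keywords: ["AI"], mode: "AND" }` would express the "@a16z + AI" example above.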

---

# Module 2: Reply Enhancement Engine

## 4.1 Input

* Original tweet text
* The user's positioning tags
* Industry type

---

## 4.2 Output

Replies in 5 strategies:

1. Cognitive-upgrade
2. Contrarian
3. Data-backed
4. Empathy
5. Founder-experience

Each strategy yields 1–2 candidate replies.

---

## 4.3 Reply Structure Formula

```
Hook
+
Position
+
Insight
+
Brevity
```
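The formula can be read as assembling three text parts under a length constraint; a toy sketch, where treating "Brevity" as a character bound and using X's 280-character limit are both assumptions:

```typescript
// Toy illustration of the Hook + Position + Insight + Brevity formula.
// "Brevity" is modeled as a length constraint rather than a text part.
interface ReplyParts {
  hook: string;      // grabs attention
  position: string;  // states a stance
  insight: string;   // adds the non-obvious point
}

const MAX_REPLY_CHARS = 280; // assumed brevity bound (X's reply limit)

function assembleReply({ hook, position, insight }: ReplyParts): string {
  const reply = `${hook} ${position} ${insight}`.trim();
  return reply.length <= MAX_REPLY_CHARS
    ? reply
    : reply.slice(0, MAX_REPLY_CHARS - 1) + "…";
}
```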

---

## 4.4 Example Output Structure

### Cognitive-upgrade

> Most people miss this part...

### Contrarian

> Unpopular opinion:

### Founder-experience

> We faced this building our product...

---

# Module 3: Personal Positioning

## 5.1 Identity Tags

Users can choose:

* AI founder
* SaaS Builder
* Investor
* Indie developer
* Technical analyst

The chosen identity shapes the reply's tone and logic.

---

# Module 4: Reply Performance Feedback (V2)

## 6.1 Data Collection

Record:

* Reply posting time
* Like count
* Reply count
* Interaction rate

---

## 6.2 Analysis

Output:

* High-interaction style analysis
* Best reply time suggestions
* Ranked style recommendations

---

# 7. Heat Scoring Model

## 7.1 Base Formula

```
heat =
  like growth rate × 0.4
+ retweet growth rate × 0.3
+ reply growth rate × 0.3
```
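The base formula translates directly into code; how the growth rates themselves are normalized (per-hour deltas, percentages, etc.) is left to the heat service and assumed precomputed here:

```typescript
// Direct implementation of:
// heat = likeGrowthRate * 0.4 + retweetGrowthRate * 0.3 + replyGrowthRate * 0.3
interface GrowthRates {
  likeGrowthRate: number;
  retweetGrowthRate: number;
  replyGrowthRate: number;
}

function heatScore(g: GrowthRates): number {
  return g.likeGrowthRate * 0.4 + g.retweetGrowthRate * 0.3 + g.replyGrowthRate * 0.3;
}
```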

---

## 7.2 Boost Factors

* Follower-count weight
* Verified badge
* Controversial topic
* Trending tag

---

# 8. Product Form

---

## 8.1 Browser Extension (priority)

### User flow

1. Open X
2. View a tweet
3. The extension sidebar shows:

* Heat index
* Reply suggestions
* One-click copy

---

## 8.2 Web Console

Features:

* Monitoring management
* History
* Statistics
* Style configuration

---

# 9. Plan Tiers

---

## Free

* 3 keywords
* 3 accounts
* 10 generations per day
* Basic reply styles

---

## Pro ($29/month)

* 20 keywords
* 20 accounts
* Unlimited generations
* All strategy styles
* Heat radar

---

## Premium ($59/month)

* 50 keywords
* Advanced analytics
* Reply optimization suggestions
* Viral-probability prediction
* Personal brand growth report
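The numeric limits of the three tiers can be captured in a single config table that the backend consults for entitlement checks. This is a sketch: the limit names are assumptions, and since Premium's account and generation caps are not specified above, they are assumed unlimited here:

```typescript
// Tier limits from the plan descriptions; Infinity encodes "unlimited".
type Tier = "Free" | "Pro" | "Premium";

interface TierLimits {
  keywords: number;
  accounts: number;
  dailyGenerations: number;
}

const TIER_LIMITS: Record<Tier, TierLimits> = {
  Free:    { keywords: 3,  accounts: 3,        dailyGenerations: 10 },
  Pro:     { keywords: 20, accounts: 20,       dailyGenerations: Infinity },
  // Assumption: Premium specifies only the keyword cap; the rest treated as unlimited.
  Premium: { keywords: 50, accounts: Infinity, dailyGenerations: Infinity },
};

function canAddKeyword(tier: Tier, currentCount: number): boolean {
  return currentCount < TIER_LIMITS[tier].keywords;
}
```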

---

# 10. Suggested Architecture

## Frontend

* Browser extension (Chrome / Edge)
* Vue 3 + Tailwind CSS

---

## Backend

* Go (Golang)
* LLM API calls
* Scheduled jobs
* Heat computation service

---

## Database

Main tables:

* Users
* MonitoredAccounts
* MonitoredKeywords
* Tweets
* GeneratedReplies
* ReplyPerformance
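On the extension/console side, these tables map naturally onto TypeScript interfaces. The sketch below is hand-written to mirror the columns in docs/schema.sql (camelCase names and ISO-string timestamps are conventions assumed here, not generated types):

```typescript
// Rows of the main tables as the frontend might consume them.
interface User {
  id: string;                                   // UUID
  email: string;
  subscriptionTier: "Free" | "Pro" | "Premium";
  identityLabel?: string;                       // e.g. "SaaS Builder"
}

interface Tweet {
  id: string;                                   // internal UUID
  xTweetId: string;                             // X's tweet id
  authorHandle?: string;
  content: string;
  likeCount: number;
  retweetCount: number;
  replyCount: number;
  heatScore: number;
}

interface GeneratedReply {
  id: string;
  userId: string;
  tweetId: string;
  strategyType: string;                         // one of the five strategies
  content: string;
  status: "draft" | "copied" | "posted";
}
```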

---

# 11. Compliance Principles

Must hold:

* No automated posting of replies
* No control of multiple accounts
* No simulated user behavior
* No automatic promotional links

Positioning:

> AI writing enhancement tool

---

# 12. MVP Plan

---

## Phase 1 (2–4 weeks)

* Generate replies from manually entered tweets
* Keyword monitoring
* Browser extension popup

---

## Phase 2

* Account monitoring
* Heat scoring
* Multi-strategy generation

---

## Phase 3

* Data feedback
* Style optimization
* Monetization upgrades

---

# 13. Success Metrics (KPIs)

* Daily active users
* Replies generated
* User retention
* Pro conversion rate
* Reply interaction growth

---

# 14. Long-Term Moat

* Accumulated reply-style data
* Industry trend database
* User voice profiles
* Personalized generation models

---

# 15. One-Line Description

> InsightReply is an AI assistant that helps founders produce more insightful replies on trending industry topics.

docs/schema.sql (new file, 91 lines)
@@ -0,0 +1,91 @@
-- users: application users
CREATE TABLE IF NOT EXISTS users (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    email VARCHAR(255) UNIQUE NOT NULL,
    password_hash VARCHAR(255),
    subscription_tier VARCHAR(50) DEFAULT 'Free', -- Free, Pro, Premium
    identity_label VARCHAR(100), -- e.g. AI founder, SaaS Builder
    created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);

-- monitored_accounts: X accounts each user monitors
CREATE TABLE IF NOT EXISTS monitored_accounts (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
    x_account_id VARCHAR(255),
    x_handle VARCHAR(255) NOT NULL,
    is_active BOOLEAN DEFAULT TRUE,
    created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    UNIQUE (user_id, x_handle)
);

-- monitored_keywords: keywords each user monitors
CREATE TABLE IF NOT EXISTS monitored_keywords (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
    keyword VARCHAR(255) NOT NULL,
    is_active BOOLEAN DEFAULT TRUE,
    created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    UNIQUE (user_id, keyword)
);

-- tweets: shared tweet pool; context for AI reply generation
CREATE TABLE IF NOT EXISTS tweets (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    x_tweet_id VARCHAR(255) UNIQUE NOT NULL,
    author_id VARCHAR(255),
    author_handle VARCHAR(255),
    content TEXT NOT NULL,
    posted_at TIMESTAMP WITH TIME ZONE,
    like_count INTEGER DEFAULT 0,
    retweet_count INTEGER DEFAULT 0,
    reply_count INTEGER DEFAULT 0,
    heat_score FLOAT DEFAULT 0.0,
    is_processed BOOLEAN DEFAULT FALSE,
    created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX IF NOT EXISTS idx_tweets_x_tweet_id ON tweets(x_tweet_id);
CREATE INDEX IF NOT EXISTS idx_tweets_heat_score ON tweets(heat_score DESC);

-- generated_replies: AI-generated reply records
CREATE TABLE IF NOT EXISTS generated_replies (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
    tweet_id UUID NOT NULL REFERENCES tweets(id) ON DELETE CASCADE,
    strategy_type VARCHAR(100) NOT NULL, -- cognitive-upgrade, contrarian, data-backed, empathy, founder-experience
    content TEXT NOT NULL,
    status VARCHAR(50) DEFAULT 'draft', -- draft, copied, posted
    created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX IF NOT EXISTS idx_generated_replies_user_id ON generated_replies(user_id);
CREATE INDEX IF NOT EXISTS idx_generated_replies_tweet_id ON generated_replies(tweet_id);

-- reply_performance: performance snapshots for posted replies
CREATE TABLE IF NOT EXISTS reply_performance (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    reply_id UUID NOT NULL REFERENCES generated_replies(id) ON DELETE CASCADE,
    like_count_increase INTEGER DEFAULT 0,
    reply_count_increase INTEGER DEFAULT 0,
    interaction_rate FLOAT DEFAULT 0.0,
    check_time TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX IF NOT EXISTS idx_reply_performance_reply_id ON reply_performance(reply_id);

-- trigger function to keep updated_at current
CREATE OR REPLACE FUNCTION update_modified_column()
RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at = CURRENT_TIMESTAMP;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- attach the trigger to users
CREATE TRIGGER update_users_modtime
BEFORE UPDATE ON users
FOR EACH ROW
EXECUTE FUNCTION update_modified_column();
extension/.gitignore (vendored, new file, 24 lines)
@@ -0,0 +1,24 @@
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
lerna-debug.log*

node_modules
dist
dist-ssr
*.local

# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?
extension/.vscode/extensions.json (vendored, new file, 3 lines)
@@ -0,0 +1,3 @@
{
  "recommendations": ["Vue.volar"]
}
extension/README.md (new file, 5 lines)
@@ -0,0 +1,5 @@
# Vue 3 + TypeScript + Vite

This template should help get you started developing with Vue 3 and TypeScript in Vite. The template uses Vue 3 `<script setup>` SFCs; check out the [script setup docs](https://v3.vuejs.org/api/sfc-script-setup.html#sfc-script-setup) to learn more.

Learn more about the recommended Project Setup and IDE Support in the [Vue Docs TypeScript Guide](https://vuejs.org/guide/typescript/overview.html#project-setup).
extension/index.html (new file, 13 lines)
@@ -0,0 +1,13 @@
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>extension</title>
  </head>
  <body>
    <div id="app"></div>
    <script type="module" src="/src/main.ts"></script>
  </body>
</html>
extension/manifest.json (new file, 33 lines)
@@ -0,0 +1,33 @@
{
  "manifest_version": 3,
  "name": "InsightReply",
  "version": "1.0.0",
  "description": "InsightReply 是一个帮助创始人在行业热点中增强社交表达并且输出高质评论的助手",
  "action": {
    "default_popup": "index.html"
  },
  "background": {
    "service_worker": "src/background/index.ts",
    "type": "module"
  },
  "permissions": [
    "storage",
    "activeTab"
  ],
  "host_permissions": [
    "https://twitter.com/*",
    "https://x.com/*"
  ],
  "content_scripts": [
    {
      "js": [
        "src/content/index.ts",
        "src/content/sidebar-mount.ts"
      ],
      "matches": [
        "https://twitter.com/*",
        "https://x.com/*"
      ]
    }
  ]
}
extension/package-lock.json (generated, new file, 3139 lines; diff suppressed because it is too large)
extension/package.json (new file, 30 lines)
@@ -0,0 +1,30 @@
{
  "name": "extension",
  "private": true,
  "version": "0.0.0",
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "vue-tsc -b && vite build",
    "preview": "vite preview"
  },
  "dependencies": {
    "clsx": "^2.1.1",
    "tailwind-merge": "^3.5.0",
    "vue": "^3.5.25"
  },
  "devDependencies": {
    "@crxjs/vite-plugin": "^2.0.0-beta.33",
    "@tailwindcss/postcss": "^4.2.1",
    "@types/chrome": "^0.1.37",
    "@types/node": "^24.10.1",
    "@vitejs/plugin-vue": "^6.0.2",
    "@vue/tsconfig": "^0.8.1",
    "autoprefixer": "^10.4.27",
    "postcss": "^8.5.6",
    "tailwindcss": "^4.2.1",
    "typescript": "~5.9.3",
    "vite": "^7.3.1",
    "vue-tsc": "^3.1.5"
  }
}
extension/postcss.config.js (new file, 6 lines)
@@ -0,0 +1,6 @@
export default {
  plugins: {
    '@tailwindcss/postcss': {},
    autoprefixer: {},
  },
}
extension/public/vite.svg (new file, 1 line)
@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="iconify iconify--logos" width="31.88" height="32" preserveAspectRatio="xMidYMid meet" viewBox="0 0 256 257"><defs><linearGradient id="IconifyId1813088fe1fbc01fb466" x1="-.828%" x2="57.636%" y1="7.652%" y2="78.411%"><stop offset="0%" stop-color="#41D1FF"></stop><stop offset="100%" stop-color="#BD34FE"></stop></linearGradient><linearGradient id="IconifyId1813088fe1fbc01fb467" x1="43.376%" x2="50.316%" y1="2.242%" y2="89.03%"><stop offset="0%" stop-color="#FFEA83"></stop><stop offset="8.333%" stop-color="#FFDD35"></stop><stop offset="100%" stop-color="#FFA800"></stop></linearGradient></defs><path fill="url(#IconifyId1813088fe1fbc01fb466)" d="M255.153 37.938L134.897 252.976c-2.483 4.44-8.862 4.466-11.382.048L.875 37.958c-2.746-4.814 1.371-10.646 6.827-9.67l120.385 21.517a6.537 6.537 0 0 0 2.322-.004l117.867-21.483c5.438-.991 9.574 4.796 6.877 9.62Z"></path><path fill="url(#IconifyId1813088fe1fbc01fb467)" d="M185.432.063L96.44 17.501a3.268 3.268 0 0 0-2.634 3.014l-5.474 92.456a3.268 3.268 0 0 0 3.997 3.378l24.777-5.718c2.318-.535 4.413 1.507 3.936 3.838l-7.361 36.047c-.495 2.426 1.782 4.5 4.151 3.78l15.304-4.649c2.372-.72 4.652 1.36 4.15 3.788l-11.698 56.621c-.732 3.542 3.979 5.473 5.943 2.437l1.313-2.028l72.516-144.72c1.215-2.423-.88-5.186-3.54-4.672l-25.505 4.922c-2.396.462-4.435-1.77-3.759-4.114l16.646-57.705c.677-2.35-1.37-4.583-3.769-4.113Z"></path></svg>
extension/src/App.vue (new file, 59 lines)
@@ -0,0 +1,59 @@
<script setup lang="ts">
import { ref } from 'vue'
import { cn } from './lib/utils'

const isLoading = ref(false)

const triggerMockLoading = () => {
  isLoading.value = true
  setTimeout(() => {
    isLoading.value = false
  }, 1000)
}
</script>

<template>
  <div class="w-[400px] h-[600px] bg-[#0A0A0A]/90 backdrop-blur-xl border border-white/10 text-[#E5E5E5] p-6 flex flex-col font-sans">

    <!-- Title Area -->
    <div class="mb-6">
      <h1 class="text-xl font-medium tracking-tight bg-gradient-to-r from-violet-500 to-blue-500 bg-clip-text text-transparent inline-block">
        InsightReply
      </h1>
      <p class="text-xs text-zinc-400 mt-1">Social Insight Copilot</p>
    </div>

    <!-- Main Content Area -->
    <div class="flex-1 overflow-y-auto pr-2 space-y-4">

      <!-- Example Heat Score Card -->
      <div class="bg-[#171717] rounded-xl p-4 border border-white/5 shadow-lg shadow-black/50">
        <div class="flex justify-between items-center mb-2">
          <span class="text-sm font-medium text-zinc-300">Current Tweet Heat</span>
          <span class="text-xs font-bold px-2 py-0.5 rounded-full bg-orange-500/20 text-orange-400 border border-orange-500/20">Hot</span>
        </div>
        <div class="flex items-end gap-2">
          <span class="text-3xl font-semibold tracking-tighter">87.5</span>
          <span class="text-xs text-zinc-500 mb-1">/ 100</span>
        </div>
      </div>

      <!-- Action Button -->
      <button
        @click="triggerMockLoading"
        :disabled="isLoading"
        :class="cn(
          'w-full py-2.5 rounded-lg text-sm font-medium transition-all duration-200 ease-in-out',
          'flex items-center justify-center gap-2',
          isLoading
            ? 'bg-[#171717] text-zinc-500 border border-white/5 cursor-not-allowed'
            : 'bg-brand-primary hover:bg-brand-primary/90 text-white shadow-lg shadow-brand-primary/20 hover:scale-[0.98]'
        )"
      >
        <span v-if="isLoading" class="animate-spin inline-block w-4 h-4 border-2 border-white/20 border-t-white rounded-full"></span>
        {{ isLoading ? 'Generating Insights...' : 'Generate Replies' }}
      </button>

    </div>
  </div>
</template>
extension/src/assets/tailwind.css (new file, 16 lines)
@@ -0,0 +1,16 @@
@import "tailwindcss";

@theme {
  --color-brand-primary: #8B5CF6;
  --color-brand-secondary: #3B82F6;
}

@layer base {
  body {
    @apply bg-[#0A0A0A] text-[#E5E5E5] antialiased;
  }

  ::selection {
    background-color: rgb(139 92 246 / 0.3);
  }
}
extension/src/assets/vue.svg (new file, 1 line)
@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="iconify iconify--logos" width="37.07" height="36" preserveAspectRatio="xMidYMid meet" viewBox="0 0 256 198"><path fill="#41B883" d="M204.8 0H256L128 220.8L0 0h97.92L128 51.2L157.44 0h47.36Z"></path><path fill="#41B883" d="m0 0l128 220.8L256 0h-51.2L128 132.48L50.56 0H0Z"></path><path fill="#35495E" d="M50.56 0L128 133.12L204.8 0h-47.36L128 51.2L97.92 0H50.56Z"></path></svg>
extension/src/background/index.ts (new file, 39 lines)
@@ -0,0 +1,39 @@
/**
 * InsightReply Background Script
 * Relays messages, manages OAuth, and talks to the Go backend API.
 */

console.log('InsightReply Background Script Loaded');

const API_BASE = 'http://localhost:8080/api/v1';

chrome.runtime.onMessage.addListener((message: { type: string; payload?: any }, _sender: chrome.runtime.MessageSender, sendResponse: (response?: any) => void) => {
  if (message.type === 'SHOW_INSIGHT') {
    console.log('Received tweet data in background:', message.payload);
  }

  if (message.type === 'GENERATE_REPLY') {
    const { tweetContent, strategy, identity } = message.payload;

    fetch(`${API_BASE}/ai/generate`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        tweet_content: tweetContent,
        strategy: strategy,
        identity: identity || 'Independent Developer / Founder'
      })
    })
      .then(resp => resp.json())
      .then(data => {
        sendResponse({ success: true, data: data.data });
      })
      .catch(err => {
        console.error('API Error:', err);
        sendResponse({ success: false, error: err.message });
      });

    return true; // Keep channel open for async response
  }
  return true;
});
extension/src/components/HelloWorld.vue (new file, 41 lines)
@@ -0,0 +1,41 @@
<script setup lang="ts">
import { ref } from 'vue'

defineProps<{ msg: string }>()

const count = ref(0)
</script>

<template>
  <h1>{{ msg }}</h1>

  <div class="card">
    <button type="button" @click="count++">count is {{ count }}</button>
    <p>
      Edit
      <code>components/HelloWorld.vue</code> to test HMR
    </p>
  </div>

  <p>
    Check out
    <a href="https://vuejs.org/guide/quick-start.html#local" target="_blank"
      >create-vue</a
    >, the official Vue + Vite starter
  </p>
  <p>
    Learn more about IDE Support for Vue in the
    <a
      href="https://vuejs.org/guide/scaling-up/tooling.html#ide-support"
      target="_blank"
      >Vue Docs Scaling up Guide</a
    >.
  </p>
  <p class="read-the-docs">Click on the Vite and Vue logos to learn more</p>
</template>

<style scoped>
.read-the-docs {
  color: #888;
}
</style>
extension/src/content/Sidebar.vue (new file, 143 lines)
@@ -0,0 +1,143 @@
<script setup lang="ts">
import { ref } from 'vue'

const props = defineProps<{
  tweetData?: {
    author: string;
    content: string;
    stats: {
      replies: string;
      retweets: string;
      likes: string;
    }
  }
}>()

const isVisible = ref(true)
const selectedStrategy = ref('Insightful')
const generatedReply = ref('')
const isGenerating = ref(false)

const strategies = [
  { id: 'Insightful', label: '认知升级型', icon: '🧠' },
  { id: 'Humorous', label: '幽默风趣型', icon: '😄' },
  { id: 'Professional', label: '专业严谨型', icon: '⚖️' },
  { id: 'Supportive', label: '共鸣支持型', icon: '❤️' },
  { id: 'Critical', label: '锐评批判型', icon: '🔥' }
]

const generate = () => {
  if (!props.tweetData) return

  isGenerating.value = true
  chrome.runtime.sendMessage({
    type: 'GENERATE_REPLY',
    payload: {
      tweetContent: props.tweetData.content,
      strategy: selectedStrategy.value,
      identity: 'Independent Developer / Founder' // Could be dynamic later
    }
  }, (response) => {
    isGenerating.value = false
    if (response && response.success) {
      generatedReply.value = response.data.reply
    } else {
      generatedReply.value = 'Failed to generate reply. Please check your connection or API key.'
    }
  })
}

const copyToClipboard = () => {
  navigator.clipboard.writeText(generatedReply.value)
}
</script>

<template>
  <div v-if="isVisible" class="fixed right-4 top-20 w-[360px] max-h-[85vh] flex flex-col bg-[#0A0A0A]/90 backdrop-blur-2xl border border-white/10 rounded-2xl shadow-2xl text-[#E5E5E5] font-sans z-[9999] overflow-hidden">

    <!-- Header -->
    <div class="p-4 border-b border-white/5 flex justify-between items-center bg-white/5">
      <div class="flex items-center gap-2">
        <div class="w-2 h-2 rounded-full bg-brand-primary animate-pulse"></div>
        <span class="text-sm font-medium tracking-wide">InsightReply AI</span>
      </div>
      <button @click="isVisible = false" class="text-zinc-500 hover:text-white transition-colors">
        <svg width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2"><path d="M18 6L6 18M6 6l12 12"/></svg>
      </button>
    </div>

    <!-- Content -->
    <div class="p-5 flex-1 overflow-y-auto space-y-6">

      <!-- Tweet Context (Small Preview) -->
      <div v-if="tweetData" class="bg-white/5 rounded-xl p-3 border border-white/5">
        <div class="text-[10px] text-zinc-500 mb-1 font-mono uppercase">Context</div>
        <div class="text-xs text-zinc-400 line-clamp-2 italic">&quot; {{ tweetData.content }} &quot;</div>
      </div>

      <!-- Strategy Selector -->
      <div class="space-y-3">
        <span class="text-xs font-semibold text-zinc-500 uppercase tracking-widest">Generation Strategy</span>
        <div class="grid grid-cols-1 gap-2">
          <button
            v-for="s in strategies"
            :key="s.id"
            @click="selectedStrategy = s.id"
            :class="[
              'flex items-center gap-3 p-3 rounded-xl border transition-all duration-200 text-sm group',
              selectedStrategy === s.id
                ? 'bg-brand-primary/20 border-brand-primary text-white shadow-lg shadow-brand-primary/10'
                : 'bg-white/5 border-transparent hover:bg-white/10 text-zinc-400'
            ]"
          >
            <span class="text-lg">{{ s.icon }}</span>
            <span class="flex-1 text-left">{{ s.label }}</span>
            <div v-if="selectedStrategy === s.id" class="w-1.5 h-1.5 rounded-full bg-brand-primary"></div>
          </button>
        </div>
      </div>

      <!-- Result Area -->
      <div v-if="generatedReply" class="space-y-3 animate-in fade-in slide-in-from-bottom-2 duration-500">
        <div class="flex justify-between items-center">
          <span class="text-xs font-semibold text-zinc-500 uppercase tracking-widest">Assistant Suggestion</span>
          <button @click="copyToClipboard" class="text-[10px] text-brand-primary hover:underline">Copy Result</button>
        </div>
        <div class="bg-[#171717] rounded-xl p-4 border border-white/10 text-sm leading-relaxed whitespace-pre-wrap">
          {{ generatedReply }}
        </div>
      </div>

    </div>

    <!-- Footer Action -->
    <div class="p-4 bg-white/5 border-t border-white/5">
      <button
        @click="generate"
        :disabled="isGenerating"
        class="w-full py-3 bg-gradient-to-r from-violet-600 to-blue-600 hover:from-violet-500 hover:to-blue-500 disabled:from-zinc-800 disabled:to-zinc-800 disabled:text-zinc-500 disabled:cursor-not-allowed text-white rounded-xl text-sm font-semibold transition-all shadow-xl shadow-brand-primary/20 flex items-center justify-center gap-2"
      >
        <div v-if="isGenerating" class="w-4 h-4 border-2 border-white/20 border-t-white rounded-full animate-spin"></div>
        {{ isGenerating ? 'AI Thinking...' : 'Generate High-Quality Reply' }}
      </button>
    </div>

  </div>
</template>

<style scoped>
/* Scoped styles to keep the sidebar content from overflowing */
::-webkit-scrollbar {
  width: 4px;
}
::-webkit-scrollbar-track {
  background: transparent;
}
::-webkit-scrollbar-thumb {
  background: rgba(255, 255, 255, 0.1);
  border-radius: 10px;
}
::-webkit-scrollbar-thumb:hover {
  background: rgba(255, 255, 255, 0.2);
}
</style>
extension/src/content/index.ts (new file, 107 lines)
@@ -0,0 +1,107 @@
/**
 * InsightReply Content Script
 * Parses the X (Twitter) page DOM, extracts tweet content, and injects the action button.
 */

console.log('InsightReply Content Script Loaded');

// 1. Tweet data shape
interface TweetData {
  id: string;
  author: string;
  content: string;
  stats: {
    replies: string;
    retweets: string;
    likes: string;
  };
}

// 2. Tweet extraction logic
const extractTweetData = (tweetElement: HTMLElement): TweetData | null => {
  try {
    // Extract based on X's DOM structure (may break when Twitter updates)
    const textElement = tweetElement.querySelector('[data-testid="tweetText"]');
    const authorElement = tweetElement.querySelector('[data-testid="User-Name"]');
    const linkElement = tweetElement.querySelector('time')?.parentElement as HTMLAnchorElement;

    // Engagement stats
    const getStat = (testid: string) => {
      const el = tweetElement.querySelector(`[data-testid="${testid}"]`);
      return el?.getAttribute('aria-label') || '0';
    };

    if (!textElement || !authorElement || !linkElement) return null;

    const tweetId = linkElement.href.split('/').pop() || '';

    return {
      id: tweetId,
      author: authorElement.textContent || 'Unknown',
      content: textElement.textContent || '',
      stats: {
        replies: getStat('reply'),
        retweets: getStat('retweet'),
        likes: getStat('like'),
      }
    };
  } catch (e) {
    console.error('Failed to extract tweet data:', e);
    return null;
  }
};

// 3. Inject the "Insight" button
const injectInsightButton = (tweetElement: HTMLElement) => {
  // Find the actions bar
  const actionBar = tweetElement.querySelector('[role="group"]');
  if (!actionBar || actionBar.querySelector('.insight-reply-btn')) return;

  // Create the button
  const btnContainer = document.createElement('div');
  btnContainer.className = 'insight-reply-btn';
  btnContainer.style.display = 'flex';
  btnContainer.style.alignItems = 'center';
  btnContainer.style.marginLeft = '12px';
  btnContainer.style.cursor = 'pointer';

  // Button icon (simple version)
  btnContainer.innerHTML = `
    <div style="padding: 4px; border-radius: 9999px; transition: background 0.2s;" onmouseover="this.style.background='rgba(139, 92, 246, 0.1)'" onmouseout="this.style.background='transparent'">
      <svg viewBox="0 0 24 24" width="20" height="20" fill="currentColor" style="color: #8B5CF6;">
        <path d="M12 2C6.48 2 2 6.48 2 12s4.48 10 10 10 10-4.48 10-10S17.52 2 12 2zm1 15h-2v-2h2v2zm0-4h-2V7h2v6z"></path>
      </svg>
    </div>
  `;

  btnContainer.onclick = (e) => {
    e.stopPropagation();
    const data = extractTweetData(tweetElement);
    console.log('Target Tweet Data:', data);

    // Send a message to the extension sidebar/popup (to be implemented)
    if (data) {
      chrome.runtime.sendMessage({ type: 'SHOW_INSIGHT', payload: data });
    }
  };

  actionBar.appendChild(btnContainer);
};

// 4. Scan on a schedule or on DOM changes
const scanTweets = () => {
  const tweets = document.querySelectorAll('article[data-testid="tweet"]');
  tweets.forEach((tweet) => {
    injectInsightButton(tweet as HTMLElement);
  });
};

// Watch dynamically loaded content with a MutationObserver
const observer = new MutationObserver(() => {
  scanTweets();
});

observer.observe(document.body, { childList: true, subtree: true });

// Initial scan
scanTweets();
51
extension/src/content/sidebar-mount.ts
Normal file
@@ -0,0 +1,51 @@
import { createApp } from 'vue'
import Sidebar from './Sidebar.vue'
import '../assets/tailwind.css' // We might need to handle this specially for Shadow DOM

const MOUNT_ID = 'insight-reply-sidebar-root'

function initSidebar(tweetData?: any) {
  if (document.getElementById(MOUNT_ID)) return

  // 1. Create Host Element
  const host = document.createElement('div')
  host.id = MOUNT_ID
  document.body.appendChild(host)

  // 2. Create Shadow Root
  const shadowRoot = host.attachShadow({ mode: 'open' })

  // 3. Create Container for Vue
  const container = document.createElement('div')
  container.id = 'app'
  shadowRoot.appendChild(container)

  // 4. Inject Styles into Shadow DOM
  // Note: In development/build, we need to find the generated CSS and inject it.
  // CRXJS usually puts CSS in <link> tags in the head for content scripts.
  // For Shadow DOM, we need to move or clone them into the shadow root.
  const injectStyles = () => {
    const styles = document.querySelectorAll('style, link[rel="stylesheet"]')
    styles.forEach(style => {
      // Only clone styles that look like they belong to our extension.
      // This is a heuristic; in a real build we'd use the asset URL.
      shadowRoot.appendChild(style.cloneNode(true))
    })
  }

  // Initial injection
  injectStyles()

  // 5. Create Vue App
  const app = createApp(Sidebar, { tweetData })
  app.mount(container)

  console.log('InsightReply Sidebar Mounted in Shadow DOM');
}

// Listen for messages to show/hide or update data
chrome.runtime.onMessage.addListener((message) => {
  if (message.type === 'SHOW_INSIGHT') {
    initSidebar(message.payload)
  }
});
6
extension/src/lib/utils.ts
Normal file
@@ -0,0 +1,6 @@
import { clsx, type ClassValue } from "clsx"
import { twMerge } from "tailwind-merge"

export function cn(...inputs: ClassValue[]) {
  return twMerge(clsx(inputs))
}
5
extension/src/main.ts
Normal file
@@ -0,0 +1,5 @@
import { createApp } from 'vue'
import './assets/tailwind.css'
import App from './App.vue'

createApp(App).mount('#app')
79
extension/src/style.css
Normal file
@@ -0,0 +1,79 @@
:root {
  font-family: system-ui, Avenir, Helvetica, Arial, sans-serif;
  line-height: 1.5;
  font-weight: 400;

  color-scheme: light dark;
  color: rgba(255, 255, 255, 0.87);
  background-color: #242424;

  font-synthesis: none;
  text-rendering: optimizeLegibility;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
}

a {
  font-weight: 500;
  color: #646cff;
  text-decoration: inherit;
}
a:hover {
  color: #535bf2;
}

body {
  margin: 0;
  display: flex;
  place-items: center;
  min-width: 320px;
  min-height: 100vh;
}

h1 {
  font-size: 3.2em;
  line-height: 1.1;
}

button {
  border-radius: 8px;
  border: 1px solid transparent;
  padding: 0.6em 1.2em;
  font-size: 1em;
  font-weight: 500;
  font-family: inherit;
  background-color: #1a1a1a;
  cursor: pointer;
  transition: border-color 0.25s;
}
button:hover {
  border-color: #646cff;
}
button:focus,
button:focus-visible {
  outline: 4px auto -webkit-focus-ring-color;
}

.card {
  padding: 2em;
}

#app {
  max-width: 1280px;
  margin: 0 auto;
  padding: 2rem;
  text-align: center;
}

@media (prefers-color-scheme: light) {
  :root {
    color: #213547;
    background-color: #ffffff;
  }
  a:hover {
    color: #747bff;
  }
  button {
    background-color: #f9f9f9;
  }
}
7
extension/src/vite-env.d.ts
vendored
Normal file
@@ -0,0 +1,7 @@
/// <reference types="vite/client" />

declare module "*.vue" {
  import type { DefineComponent } from "vue";
  const component: DefineComponent<{}, {}, any>;
  export default component;
}
16
extension/tsconfig.app.json
Normal file
@@ -0,0 +1,16 @@
{
  "extends": "@vue/tsconfig/tsconfig.dom.json",
  "compilerOptions": {
    "tsBuildInfoFile": "./node_modules/.tmp/tsconfig.app.tsbuildinfo",
    "types": ["vite/client", "chrome"],

    /* Linting */
    "strict": true,
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "erasableSyntaxOnly": true,
    "noFallthroughCasesInSwitch": true,
    "noUncheckedSideEffectImports": true
  },
  "include": ["src/**/*.ts", "src/**/*.tsx", "src/**/*.vue"]
}
7
extension/tsconfig.json
Normal file
@@ -0,0 +1,7 @@
{
  "files": [],
  "references": [
    { "path": "./tsconfig.app.json" },
    { "path": "./tsconfig.node.json" }
  ]
}
26
extension/tsconfig.node.json
Normal file
@@ -0,0 +1,26 @@
{
  "compilerOptions": {
    "tsBuildInfoFile": "./node_modules/.tmp/tsconfig.node.tsbuildinfo",
    "target": "ES2023",
    "lib": ["ES2023"],
    "module": "ESNext",
    "types": ["node"],
    "skipLibCheck": true,

    /* Bundler mode */
    "moduleResolution": "bundler",
    "allowImportingTsExtensions": true,
    "verbatimModuleSyntax": true,
    "moduleDetection": "force",
    "noEmit": true,

    /* Linting */
    "strict": true,
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "erasableSyntaxOnly": true,
    "noFallthroughCasesInSwitch": true,
    "noUncheckedSideEffectImports": true
  },
  "include": ["vite.config.ts"]
}
12
extension/vite.config.ts
Normal file
@@ -0,0 +1,12 @@
import { defineConfig } from 'vite'
import vue from '@vitejs/plugin-vue'
import { crx } from '@crxjs/vite-plugin'
import manifest from './manifest.json' assert { type: 'json' }

// https://vite.dev/config/
export default defineConfig({
  plugins: [
    vue(),
    crx({ manifest }),
  ],
})
153
node_modules/.package-lock.json
generated
vendored
Normal file
@@ -0,0 +1,153 @@
{
  "name": "InsightReply",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "node_modules/pg": {
      "version": "8.19.0",
      "resolved": "https://registry.npmjs.org/pg/-/pg-8.19.0.tgz",
      "integrity": "sha512-QIcLGi508BAHkQ3pJNptsFz5WQMlpGbuBGBaIaXsWK8mel2kQ/rThYI+DbgjUvZrIr7MiuEuc9LcChJoEZK1xQ==",
      "license": "MIT",
      "dependencies": {
        "pg-connection-string": "^2.11.0",
        "pg-pool": "^3.12.0",
        "pg-protocol": "^1.12.0",
        "pg-types": "2.2.0",
        "pgpass": "1.0.5"
      },
      "engines": {
        "node": ">= 16.0.0"
      },
      "optionalDependencies": {
        "pg-cloudflare": "^1.3.0"
      },
      "peerDependencies": {
        "pg-native": ">=3.0.1"
      },
      "peerDependenciesMeta": {
        "pg-native": {
          "optional": true
        }
      }
    },
    "node_modules/pg-cloudflare": {
      "version": "1.3.0",
      "resolved": "https://registry.npmjs.org/pg-cloudflare/-/pg-cloudflare-1.3.0.tgz",
      "integrity": "sha512-6lswVVSztmHiRtD6I8hw4qP/nDm1EJbKMRhf3HCYaqud7frGysPv7FYJ5noZQdhQtN2xJnimfMtvQq21pdbzyQ==",
      "license": "MIT",
      "optional": true
    },
    "node_modules/pg-connection-string": {
      "version": "2.11.0",
      "resolved": "https://registry.npmjs.org/pg-connection-string/-/pg-connection-string-2.11.0.tgz",
      "integrity": "sha512-kecgoJwhOpxYU21rZjULrmrBJ698U2RxXofKVzOn5UDj61BPj/qMb7diYUR1nLScCDbrztQFl1TaQZT0t1EtzQ==",
      "license": "MIT"
    },
    "node_modules/pg-int8": {
      "version": "1.0.1",
      "resolved": "https://registry.npmjs.org/pg-int8/-/pg-int8-1.0.1.tgz",
      "integrity": "sha512-WCtabS6t3c8SkpDBUlb1kjOs7l66xsGdKpIPZsg4wR+B3+u9UAum2odSsF9tnvxg80h4ZxLWMy4pRjOsFIqQpw==",
      "license": "ISC",
      "engines": {
        "node": ">=4.0.0"
      }
    },
    "node_modules/pg-pool": {
      "version": "3.12.0",
      "resolved": "https://registry.npmjs.org/pg-pool/-/pg-pool-3.12.0.tgz",
      "integrity": "sha512-eIJ0DES8BLaziFHW7VgJEBPi5hg3Nyng5iKpYtj3wbcAUV9A1wLgWiY7ajf/f/oO1wfxt83phXPY8Emztg7ITg==",
      "license": "MIT",
      "peerDependencies": {
        "pg": ">=8.0"
      }
    },
    "node_modules/pg-protocol": {
      "version": "1.12.0",
      "resolved": "https://registry.npmjs.org/pg-protocol/-/pg-protocol-1.12.0.tgz",
      "integrity": "sha512-uOANXNRACNdElMXJ0tPz6RBM0XQ61nONGAwlt8da5zs/iUOOCLBQOHSXnrC6fMsvtjxbOJrZZl5IScGv+7mpbg==",
      "license": "MIT"
    },
    "node_modules/pg-types": {
      "version": "2.2.0",
      "resolved": "https://registry.npmjs.org/pg-types/-/pg-types-2.2.0.tgz",
      "integrity": "sha512-qTAAlrEsl8s4OiEQY69wDvcMIdQN6wdz5ojQiOy6YRMuynxenON0O5oCpJI6lshc6scgAY8qvJ2On/p+CXY0GA==",
      "license": "MIT",
      "dependencies": {
        "pg-int8": "1.0.1",
        "postgres-array": "~2.0.0",
        "postgres-bytea": "~1.0.0",
        "postgres-date": "~1.0.4",
        "postgres-interval": "^1.1.0"
      },
      "engines": {
        "node": ">=4"
      }
    },
    "node_modules/pgpass": {
      "version": "1.0.5",
      "resolved": "https://registry.npmjs.org/pgpass/-/pgpass-1.0.5.tgz",
      "integrity": "sha512-FdW9r/jQZhSeohs1Z3sI1yxFQNFvMcnmfuj4WBMUTxOrAyLMaTcE1aAMBiTlbMNaXvBCQuVi0R7hd8udDSP7ug==",
      "license": "MIT",
      "dependencies": {
        "split2": "^4.1.0"
      }
    },
    "node_modules/postgres-array": {
      "version": "2.0.0",
      "resolved": "https://registry.npmjs.org/postgres-array/-/postgres-array-2.0.0.tgz",
      "integrity": "sha512-VpZrUqU5A69eQyW2c5CA1jtLecCsN2U/bD6VilrFDWq5+5UIEVO7nazS3TEcHf1zuPYO/sqGvUvW62g86RXZuA==",
      "license": "MIT",
      "engines": {
        "node": ">=4"
      }
    },
    "node_modules/postgres-bytea": {
      "version": "1.0.1",
      "resolved": "https://registry.npmjs.org/postgres-bytea/-/postgres-bytea-1.0.1.tgz",
      "integrity": "sha512-5+5HqXnsZPE65IJZSMkZtURARZelel2oXUEO8rH83VS/hxH5vv1uHquPg5wZs8yMAfdv971IU+kcPUczi7NVBQ==",
      "license": "MIT",
      "engines": {
        "node": ">=0.10.0"
      }
    },
    "node_modules/postgres-date": {
      "version": "1.0.7",
      "resolved": "https://registry.npmjs.org/postgres-date/-/postgres-date-1.0.7.tgz",
      "integrity": "sha512-suDmjLVQg78nMK2UZ454hAG+OAW+HQPZ6n++TNDUX+L0+uUlLywnoxJKDou51Zm+zTCjrCl0Nq6J9C5hP9vK/Q==",
      "license": "MIT",
      "engines": {
        "node": ">=0.10.0"
      }
    },
    "node_modules/postgres-interval": {
      "version": "1.2.0",
      "resolved": "https://registry.npmjs.org/postgres-interval/-/postgres-interval-1.2.0.tgz",
      "integrity": "sha512-9ZhXKM/rw350N1ovuWHbGxnGh/SNJ4cnxHiM0rxE4VN41wsg8P8zWn9hv/buK00RP4WvlOyr/RBDiptyxVbkZQ==",
      "license": "MIT",
      "dependencies": {
        "xtend": "^4.0.0"
      },
      "engines": {
        "node": ">=0.10.0"
      }
    },
    "node_modules/split2": {
      "version": "4.2.0",
      "resolved": "https://registry.npmjs.org/split2/-/split2-4.2.0.tgz",
      "integrity": "sha512-UcjcJOWknrNkF6PLX83qcHM6KHgVKNkV62Y8a5uYDVv9ydGQVwAHMKqHdJje1VTWpljG0WYpCDhrCdAOYH4TWg==",
      "license": "ISC",
      "engines": {
        "node": ">= 10.x"
      }
    },
    "node_modules/xtend": {
      "version": "4.0.2",
      "resolved": "https://registry.npmjs.org/xtend/-/xtend-4.0.2.tgz",
      "integrity": "sha512-LKYU1iAXJXUgAXn9URjiu+MWhyUXHsvfp7mcuYm9dSUKK0/CjtrUwFAxD82/mCWbtLsGjFIad0wIsod4zrTAEQ==",
      "license": "MIT",
      "engines": {
        "node": ">=0.4"
      }
    }
  }
}
21
node_modules/pg-cloudflare/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2010 - 2021 Brian Carlson

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
112
node_modules/pg-cloudflare/README.md
generated
vendored
Normal file
@@ -0,0 +1,112 @@
# pg-cloudflare

`pg-cloudflare` makes it easier to take an existing package that relies on `tls` and `net`, and make it work in environments where only `connect()` is supported, such as Cloudflare Workers.

`pg-cloudflare` wraps `connect()`, the [TCP Socket API](https://github.com/wintercg/proposal-sockets-api) proposed within WinterCG, and implemented in [Cloudflare Workers](https://developers.cloudflare.com/workers/runtime-apis/tcp-sockets/), and exposes an interface with methods similar to what the `net` and `tls` modules in Node.js expose. (ex: `net.connect(path[, options][, callback])`). This minimizes the number of changes needed in order to make an existing package work across JavaScript runtimes.

## Installation

```
npm i --save-dev pg-cloudflare
```

The package uses conditional exports to support bundlers that don't know about `cloudflare:sockets`, so the consumer code by default imports an empty file. To enable the package, resolve to the `cloudflare` condition in your bundler's config. For example:

- `webpack.config.js`
  ```js
  export default {
    ...,
    resolve: { conditionNames: [..., "workerd"] },
    plugins: [
      // ignore cloudflare:sockets imports
      new webpack.IgnorePlugin({
        resourceRegExp: /^cloudflare:sockets$/,
      }),
    ],
  }
  ```
- `vite.config.js`

  > [!NOTE]
  > If you are using the [Cloudflare Vite plugin](https://www.npmjs.com/package/@cloudflare/vite-plugin) then the following configuration is not necessary.

  ```js
  export default defineConfig({
    ...,
    resolve: {
      conditions: [..., "workerd"],
    },
    build: {
      ...,
      // don't try to bundle cloudflare:sockets
      rollupOptions: {
        external: [..., 'cloudflare:sockets'],
      },
    },
  })
  ```

- `rollup.config.js`
  ```js
  export default defineConfig({
    ...,
    plugins: [..., nodeResolve({ exportConditions: [..., 'workerd'] })],
    // don't try to bundle cloudflare:sockets
    external: [..., 'cloudflare:sockets'],
  })
  ```
- `esbuild.config.js`
  ```js
  await esbuild.build({
    ...,
    conditions: [..., 'workerd'],
  })
  ```

The concrete examples can be found in `packages/pg-bundler-test`.

## How to use conditionally, in non-Node.js environments

As implemented in `pg` [here](https://github.com/brianc/node-postgres/commit/07553428e9c0eacf761a5d4541a3300ff7859578#diff-34588ad868ebcb232660aba7ee6a99d1e02f4bc93f73497d2688c3f074e60533R5-R13), a typical use case might look as follows, where in a Node.js environment the `net` module is used, while in a non-Node.js environment, where `net` is unavailable, `pg-cloudflare` is used instead, providing an equivalent interface:

```js
module.exports.getStream = function getStream(ssl = false) {
  const net = require('net')
  if (typeof net.Socket === 'function') {
    return net.Socket()
  }
  const { CloudflareSocket } = require('pg-cloudflare')
  return new CloudflareSocket(ssl)
}
```

## Node.js implementation of the Socket API proposal

If you're looking for a way to rely on `connect()` as the interface you use to interact with raw sockets, but need this interface to be available in a Node.js environment, [`@arrowood.dev/socket`](https://github.com/Ethan-Arrowood/socket) provides a Node.js implementation of the Socket API.

### license

The MIT License (MIT)

Copyright (c) 2023 Brian M. Carlson

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
2
node_modules/pg-cloudflare/dist/empty.d.ts
generated
vendored
Normal file
@@ -0,0 +1,2 @@
declare const _default: {};
export default _default;
6
node_modules/pg-cloudflare/dist/empty.js
generated
vendored
Normal file
@@ -0,0 +1,6 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
// This is an empty module that is served up when outside of a workerd environment
// See the `exports` field in package.json
exports.default = {};
//# sourceMappingURL=empty.js.map
1
node_modules/pg-cloudflare/dist/empty.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"empty.js","sourceRoot":"","sources":["../src/empty.ts"],"names":[],"mappings":";;AAAA,kFAAkF;AAClF,0CAA0C;AAC1C,kBAAe,EAAE,CAAA"}
31
node_modules/pg-cloudflare/dist/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,31 @@
/// <reference types="node" />
/// <reference types="node" />
/// <reference types="node" />
import { TlsOptions } from 'cloudflare:sockets';
import { EventEmitter } from 'events';
/**
 * Wrapper around the Cloudflare built-in socket that can be used by the `Connection`.
 */
export declare class CloudflareSocket extends EventEmitter {
    readonly ssl: boolean;
    writable: boolean;
    destroyed: boolean;
    private _upgrading;
    private _upgraded;
    private _cfSocket;
    private _cfWriter;
    private _cfReader;
    constructor(ssl: boolean);
    setNoDelay(): this;
    setKeepAlive(): this;
    ref(): this;
    unref(): this;
    connect(port: number, host: string, connectListener?: (...args: unknown[]) => void): Promise<this | undefined>;
    _listen(): Promise<void>;
    _listenOnce(): Promise<void>;
    write(data: Uint8Array | string, encoding?: BufferEncoding, callback?: (...args: unknown[]) => void): true | void;
    end(data?: Buffer, encoding?: BufferEncoding, callback?: (...args: unknown[]) => void): this;
    destroy(reason: string): this;
    startTls(options: TlsOptions): void;
    _addClosedHandler(): void;
}
152
node_modules/pg-cloudflare/dist/index.js
generated
vendored
Normal file
@@ -0,0 +1,152 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.CloudflareSocket = void 0;
const events_1 = require("events");
/**
 * Wrapper around the Cloudflare built-in socket that can be used by the `Connection`.
 */
class CloudflareSocket extends events_1.EventEmitter {
    constructor(ssl) {
        super();
        this.ssl = ssl;
        this.writable = false;
        this.destroyed = false;
        this._upgrading = false;
        this._upgraded = false;
        this._cfSocket = null;
        this._cfWriter = null;
        this._cfReader = null;
    }
    setNoDelay() {
        return this;
    }
    setKeepAlive() {
        return this;
    }
    ref() {
        return this;
    }
    unref() {
        return this;
    }
    async connect(port, host, connectListener) {
        try {
            log('connecting');
            if (connectListener)
                this.once('connect', connectListener);
            const options = this.ssl ? { secureTransport: 'starttls' } : {};
            const mod = await import('cloudflare:sockets');
            const connect = mod.connect;
            this._cfSocket = connect(`${host}:${port}`, options);
            this._cfWriter = this._cfSocket.writable.getWriter();
            this._addClosedHandler();
            this._cfReader = this._cfSocket.readable.getReader();
            if (this.ssl) {
                this._listenOnce().catch((e) => this.emit('error', e));
            }
            else {
                this._listen().catch((e) => this.emit('error', e));
            }
            await this._cfWriter.ready;
            log('socket ready');
            this.writable = true;
            this.emit('connect');
            return this;
        }
        catch (e) {
            this.emit('error', e);
        }
    }
    async _listen() {
        // eslint-disable-next-line no-constant-condition
        while (true) {
            log('awaiting receive from CF socket');
            const { done, value } = await this._cfReader.read();
            log('CF socket received:', done, value);
            if (done) {
                log('done');
                break;
            }
            this.emit('data', Buffer.from(value));
        }
    }
    async _listenOnce() {
        log('awaiting first receive from CF socket');
        const { done, value } = await this._cfReader.read();
        log('First CF socket received:', done, value);
        this.emit('data', Buffer.from(value));
    }
    write(data, encoding = 'utf8', callback = () => { }) {
        if (data.length === 0)
            return callback();
        if (typeof data === 'string')
            data = Buffer.from(data, encoding);
        log('sending data direct:', data);
        this._cfWriter.write(data).then(() => {
            log('data sent');
            callback();
        }, (err) => {
            log('send error', err);
            callback(err);
        });
        return true;
    }
    end(data = Buffer.alloc(0), encoding = 'utf8', callback = () => { }) {
        log('ending CF socket');
        this.write(data, encoding, (err) => {
            this._cfSocket.close();
            if (callback)
                callback(err);
        });
        return this;
    }
    destroy(reason) {
        log('destroying CF socket', reason);
        this.destroyed = true;
        return this.end();
    }
    startTls(options) {
        if (this._upgraded) {
            // Don't try to upgrade again.
            this.emit('error', 'Cannot call `startTls()` more than once on a socket');
            return;
        }
        this._cfWriter.releaseLock();
        this._cfReader.releaseLock();
        this._upgrading = true;
        this._cfSocket = this._cfSocket.startTls(options);
        this._cfWriter = this._cfSocket.writable.getWriter();
        this._cfReader = this._cfSocket.readable.getReader();
        this._addClosedHandler();
        this._listen().catch((e) => this.emit('error', e));
    }
    _addClosedHandler() {
        this._cfSocket.closed.then(() => {
            if (!this._upgrading) {
                log('CF socket closed');
                this._cfSocket = null;
                this.emit('close');
            }
            else {
                this._upgrading = false;
                this._upgraded = true;
            }
        }).catch((e) => this.emit('error', e));
    }
}
exports.CloudflareSocket = CloudflareSocket;
const debug = false;
function dump(data) {
    if (data instanceof Uint8Array || data instanceof ArrayBuffer) {
        const hex = Buffer.from(data).toString('hex');
        const str = new TextDecoder().decode(data);
        return `\n>>> STR: "${str.replace(/\n/g, '\\n')}"\n>>> HEX: ${hex}\n`;
    }
    else {
        return data;
    }
}
function log(...args) {
    debug && console.log(...args.map(dump));
}
//# sourceMappingURL=index.js.map
1
node_modules/pg-cloudflare/dist/index.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":";;;AACA,mCAAqC;AAErC;;GAEG;AACH,MAAa,gBAAiB,SAAQ,qBAAY;IAUhD,YAAqB,GAAY;QAC/B,KAAK,EAAE,CAAA;QADY,QAAG,GAAH,GAAG,CAAS;QATjC,aAAQ,GAAG,KAAK,CAAA;QAChB,cAAS,GAAG,KAAK,CAAA;QAET,eAAU,GAAG,KAAK,CAAA;QAClB,cAAS,GAAG,KAAK,CAAA;QACjB,cAAS,GAAkB,IAAI,CAAA;QAC/B,cAAS,GAAuC,IAAI,CAAA;QACpD,cAAS,GAAuC,IAAI,CAAA;IAI5D,CAAC;IAED,UAAU;QACR,OAAO,IAAI,CAAA;IACb,CAAC;IACD,YAAY;QACV,OAAO,IAAI,CAAA;IACb,CAAC;IACD,GAAG;QACD,OAAO,IAAI,CAAA;IACb,CAAC;IACD,KAAK;QACH,OAAO,IAAI,CAAA;IACb,CAAC;IAED,KAAK,CAAC,OAAO,CAAC,IAAY,EAAE,IAAY,EAAE,eAA8C;QACtF,IAAI;YACF,GAAG,CAAC,YAAY,CAAC,CAAA;YACjB,IAAI,eAAe;gBAAE,IAAI,CAAC,IAAI,CAAC,SAAS,EAAE,eAAe,CAAC,CAAA;YAE1D,MAAM,OAAO,GAAkB,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,eAAe,EAAE,UAAU,EAAE,CAAC,CAAC,CAAC,EAAE,CAAA;YAC9E,MAAM,GAAG,GAAG,MAAM,MAAM,CAAC,oBAAoB,CAAC,CAAA;YAC9C,MAAM,OAAO,GAAG,GAAG,CAAC,OAAO,CAAA;YAC3B,IAAI,CAAC,SAAS,GAAG,OAAO,CAAC,GAAG,IAAI,IAAI,IAAI,EAAE,EAAE,OAAO,CAAC,CAAA;YACpD,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC,SAAS,CAAC,QAAQ,CAAC,SAAS,EAAE,CAAA;YACpD,IAAI,CAAC,iBAAiB,EAAE,CAAA;YAExB,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC,SAAS,CAAC,QAAQ,CAAC,SAAS,EAAE,CAAA;YACpD,IAAI,IAAI,CAAC,GAAG,EAAE;gBACZ,IAAI,CAAC,WAAW,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,EAAE,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,OAAO,EAAE,CAAC,CAAC,CAAC,CAAA;aACvD;iBAAM;gBACL,IAAI,CAAC,OAAO,EAAE,CAAC,KAAK,CAAC,CAAC,CAAC,EAAE,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,OAAO,EAAE,CAAC,CAAC,CAAC,CAAA;aACnD;YAED,MAAM,IAAI,CAAC,SAAU,CAAC,KAAK,CAAA;YAC3B,GAAG,CAAC,cAAc,CAAC,CAAA;YACnB,IAAI,CAAC,QAAQ,GAAG,IAAI,CAAA;YACpB,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,CAAA;YAEpB,OAAO,IAAI,CAAA;SACZ;QAAC,OAAO,CAAC,EAAE;YACV,IAAI,CAAC,IAAI,CAAC,OAAO,EAAE,CAAC,CAAC,CAAA;SACtB;IACH,CAAC;IAED,KAAK,CAAC,OAAO;QACX,iDAAiD;QACjD,OAAO,IAAI,EAAE;YACX,GAAG,CAAC,iCAAiC,CAAC,CAAA;YACtC,MAAM,EAAE,IAAI,EAAE,KAAK,EAAE,GAAG,MAAM,IAAI,CAAC,SAAU,CAAC,IAAI,EAAE,CAAA;YACpD,GAAG,CAAC,qBAAqB,EAAE,IAAI,EAAE,KAAK,CAAC,CAAA;YACvC,IAAI,IAAI,EAAE;gBACR,GAAG,CAAC,MAAM,CAAC,CAAA;gBAC
(remainder of the generated sourcemap `node_modules/pg-cloudflare/dist/index.js.map`; machine-generated mappings omitted)
#### node_modules/pg-cloudflare/esm/index.mjs (new file, 3 lines, generated, vendored)

```js
import cf from '../dist/index.js'

export const CloudflareSocket = cf.CloudflareSocket
```
#### node_modules/pg-cloudflare/package.json (new file, 39 lines, generated, vendored)

```json
{
  "name": "pg-cloudflare",
  "version": "1.3.0",
  "description": "A socket implementation that can run on Cloudflare Workers using native TCP connections.",
  "main": "dist/index.js",
  "types": "dist/index.d.ts",
  "license": "MIT",
  "devDependencies": {
    "ts-node": "^8.5.4",
    "typescript": "^4.0.3"
  },
  "exports": {
    ".": {
      "workerd": {
        "import": "./esm/index.mjs",
        "require": "./dist/index.js"
      },
      "default": "./dist/empty.js"
    },
    "./package.json": "./package.json"
  },
  "scripts": {
    "build": "tsc",
    "build:watch": "tsc --watch",
    "prepublish": "yarn build",
    "test": "echo e2e test in pg package"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/brianc/node-postgres.git",
    "directory": "packages/pg-cloudflare"
  },
  "files": [
    "/dist/*{js,ts,map}",
    "/src",
    "/esm"
  ],
  "gitHead": "d10e09c888f94abf77382aba6f353ca665a1cf09"
}
```
#### node_modules/pg-cloudflare/src/empty.ts (new file, 3 lines, generated, vendored)

```ts
// This is an empty module that is served up when outside of a workerd environment
// See the `exports` field in package.json
export default {}
```
#### node_modules/pg-cloudflare/src/index.ts (new file, 166 lines, generated, vendored)

```ts
import { SocketOptions, Socket, TlsOptions } from 'cloudflare:sockets'
import { EventEmitter } from 'events'

/**
 * Wrapper around the Cloudflare built-in socket that can be used by the `Connection`.
 */
export class CloudflareSocket extends EventEmitter {
  writable = false
  destroyed = false

  private _upgrading = false
  private _upgraded = false
  private _cfSocket: Socket | null = null
  private _cfWriter: WritableStreamDefaultWriter | null = null
  private _cfReader: ReadableStreamDefaultReader | null = null

  constructor(readonly ssl: boolean) {
    super()
  }

  setNoDelay() {
    return this
  }
  setKeepAlive() {
    return this
  }
  ref() {
    return this
  }
  unref() {
    return this
  }

  async connect(port: number, host: string, connectListener?: (...args: unknown[]) => void) {
    try {
      log('connecting')
      if (connectListener) this.once('connect', connectListener)

      const options: SocketOptions = this.ssl ? { secureTransport: 'starttls' } : {}
      const mod = await import('cloudflare:sockets')
      const connect = mod.connect
      this._cfSocket = connect(`${host}:${port}`, options)
      this._cfWriter = this._cfSocket.writable.getWriter()
      this._addClosedHandler()

      this._cfReader = this._cfSocket.readable.getReader()
      if (this.ssl) {
        this._listenOnce().catch((e) => this.emit('error', e))
      } else {
        this._listen().catch((e) => this.emit('error', e))
      }

      await this._cfWriter!.ready
      log('socket ready')
      this.writable = true
      this.emit('connect')

      return this
    } catch (e) {
      this.emit('error', e)
    }
  }

  async _listen() {
    // eslint-disable-next-line no-constant-condition
    while (true) {
      log('awaiting receive from CF socket')
      const { done, value } = await this._cfReader!.read()
      log('CF socket received:', done, value)
      if (done) {
        log('done')
        break
      }
      this.emit('data', Buffer.from(value))
    }
  }

  async _listenOnce() {
    log('awaiting first receive from CF socket')
    const { done, value } = await this._cfReader!.read()
    log('First CF socket received:', done, value)
    this.emit('data', Buffer.from(value))
  }

  write(
    data: Uint8Array | string,
    encoding: BufferEncoding = 'utf8',
    callback: (...args: unknown[]) => void = () => {}
  ) {
    if (data.length === 0) return callback()
    if (typeof data === 'string') data = Buffer.from(data, encoding)

    log('sending data direct:', data)
    this._cfWriter!.write(data).then(
      () => {
        log('data sent')
        callback()
      },
      (err) => {
        log('send error', err)
        callback(err)
      }
    )
    return true
  }

  end(data = Buffer.alloc(0), encoding: BufferEncoding = 'utf8', callback: (...args: unknown[]) => void = () => {}) {
    log('ending CF socket')
    this.write(data, encoding, (err) => {
      this._cfSocket!.close()
      if (callback) callback(err)
    })
    return this
  }

  destroy(reason: string) {
    log('destroying CF socket', reason)
    this.destroyed = true
    return this.end()
  }

  startTls(options: TlsOptions) {
    if (this._upgraded) {
      // Don't try to upgrade again.
      this.emit('error', 'Cannot call `startTls()` more than once on a socket')
      return
    }
    this._cfWriter!.releaseLock()
    this._cfReader!.releaseLock()
    this._upgrading = true
    this._cfSocket = this._cfSocket!.startTls(options)
    this._cfWriter = this._cfSocket.writable.getWriter()
    this._cfReader = this._cfSocket.readable.getReader()
    this._addClosedHandler()
    this._listen().catch((e) => this.emit('error', e))
  }

  _addClosedHandler() {
    this._cfSocket!.closed.then(() => {
      if (!this._upgrading) {
        log('CF socket closed')
        this._cfSocket = null
        this.emit('close')
      } else {
        this._upgrading = false
        this._upgraded = true
      }
    }).catch((e) => this.emit('error', e))
  }
}

const debug = false

function dump(data: unknown) {
  if (data instanceof Uint8Array || data instanceof ArrayBuffer) {
    const hex = Buffer.from(data).toString('hex')
    const str = new TextDecoder().decode(data)
    return `\n>>> STR: "${str.replace(/\n/g, '\\n')}"\n>>> HEX: ${hex}\n`
  } else {
    return data
  }
}

function log(...args: unknown[]) {
  debug && console.log(...args.map(dump))
}
```
#### node_modules/pg-cloudflare/src/types.d.ts (new file, 25 lines, generated, vendored)

```ts
declare module 'cloudflare:sockets' {
  export class Socket {
    public readonly readable: any
    public readonly writable: any
    public readonly closed: Promise<void>
    public close(): Promise<void>
    public startTls(options: TlsOptions): Socket
  }

  export type TlsOptions = {
    expectedServerHostname?: string
  }

  export type SocketAddress = {
    hostname: string
    port: number
  }

  export type SocketOptions = {
    secureTransport?: 'off' | 'on' | 'starttls'
    allowHalfOpen?: boolean
  }

  export function connect(address: string | SocketAddress, options?: SocketOptions): Socket
}
```
#### node_modules/pg-connection-string/LICENSE (new file, 21 lines, generated, vendored)

```text
The MIT License (MIT)

Copyright (c) 2014 Iced Development

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
#### node_modules/pg-connection-string/README.md (new file, 105 lines, generated, vendored)

````markdown
pg-connection-string
====================

[](https://nodei.co/npm/pg-connection-string/)

Functions for dealing with a PostgresSQL connection string

`parse` method taken from [node-postgres](https://github.com/brianc/node-postgres.git)
Copyright (c) 2010-2014 Brian Carlson (brian.m.carlson@gmail.com)
MIT License

## Usage

```js
const parse = require('pg-connection-string').parse;

const config = parse('postgres://someuser:somepassword@somehost:381/somedatabase')
```

The resulting config contains a subset of the following properties:

* `user` - User with which to authenticate to the server
* `password` - Corresponding password
* `host` - Postgres server hostname or, for UNIX domain sockets, the socket filename
* `port` - port on which to connect
* `database` - Database name within the server
* `client_encoding` - string encoding the client will use
* `ssl`, either a boolean or an object with properties
  * `rejectUnauthorized`
  * `cert`
  * `key`
  * `ca`
* any other query parameters (for example, `application_name`) are preserved intact.

### ClientConfig Compatibility for TypeScript

The pg-connection-string `ConnectionOptions` interface is not compatible with the `ClientConfig` interface that [pg.Client](https://node-postgres.com/apis/client) expects. To remedy this, use the `parseIntoClientConfig` function instead of `parse`:

```ts
import { ClientConfig } from 'pg';
import { parseIntoClientConfig } from 'pg-connection-string';

const config: ClientConfig = parseIntoClientConfig('postgres://someuser:somepassword@somehost:381/somedatabase')
```

You can also use `toClientConfig` to convert an existing `ConnectionOptions` interface into a `ClientConfig` interface:

```ts
import { ClientConfig } from 'pg';
import { parse, toClientConfig } from 'pg-connection-string';

const config = parse('postgres://someuser:somepassword@somehost:381/somedatabase')
const clientConfig: ClientConfig = toClientConfig(config)
```

## Connection Strings

The short summary of acceptable URLs is:

* `socket:<path>?<query>` - UNIX domain socket
* `postgres://<user>:<password>@<host>:<port>/<database>?<query>` - TCP connection

But see below for more details.

### UNIX Domain Sockets

When user and password are not given, the socket path follows `socket:`, as in `socket:/var/run/pgsql`.
This form can be shortened to just a path: `/var/run/pgsql`.

When user and password are given, they are included in the typical URL positions, with an empty `host`, as in `socket://user:pass@/var/run/pgsql`.

Query parameters follow a `?` character, including the following special query parameters:

* `db=<database>` - sets the database name (urlencoded)
* `encoding=<encoding>` - sets the `client_encoding` property

### TCP Connections

TCP connections to the Postgres server are indicated with `pg:` or `postgres:` schemes (in fact, any scheme but `socket:` is accepted).
If username and password are included, they should be urlencoded.
The database name, however, should *not* be urlencoded.

Query parameters follow a `?` character, including the following special query parameters:
* `host=<host>` - sets `host` property, overriding the URL's host
* `encoding=<encoding>` - sets the `client_encoding` property
* `ssl=1`, `ssl=true`, `ssl=0`, `ssl=false` - sets `ssl` to true or false, accordingly
* `uselibpqcompat=true` - use libpq semantics
* `sslmode=<sslmode>` when `uselibpqcompat=true` is not set
  * `sslmode=disable` - sets `ssl` to false
  * `sslmode=no-verify` - sets `ssl` to `{ rejectUnauthorized: false }`
  * `sslmode=prefer`, `sslmode=require`, `sslmode=verify-ca`, `sslmode=verify-full` - sets `ssl` to true
* `sslmode=<sslmode>` when `uselibpqcompat=true`
  * `sslmode=disable` - sets `ssl` to false
  * `sslmode=prefer` - sets `ssl` to `{ rejectUnauthorized: false }`
  * `sslmode=require` - sets `ssl` to `{ rejectUnauthorized: false }` unless `sslrootcert` is specified, in which case it behaves like `verify-ca`
  * `sslmode=verify-ca` - sets `ssl` to `{ checkServerIdentity: no-op }` (verify CA, but not server identity). This verifies the presented certificate against the effective CA specified in sslrootcert.
  * `sslmode=verify-full` - sets `ssl` to `{}` (verify CA and server identity)
* `sslcert=<filename>` - reads data from the given file and includes the result as `ssl.cert`
* `sslkey=<filename>` - reads data from the given file and includes the result as `ssl.key`
* `sslrootcert=<filename>` - reads data from the given file and includes the result as `ssl.ca`

A bare relative URL, such as `salesdata`, will indicate a database name while leaving other properties empty.

> [!CAUTION]
> Choosing an sslmode other than verify-full has serious security implications. Please read https://www.postgresql.org/docs/current/libpq-ssl.html#LIBPQ-SSL-SSLMODE-STATEMENTS to understand the trade-offs.
````
#### node_modules/pg-connection-string/esm/index.mjs (new file, 8 lines, generated, vendored)

```js
// ESM wrapper for pg-connection-string
import connectionString from '../index.js'

// Re-export the parse function
export default connectionString.parse
export const parse = connectionString.parse
export const toClientConfig = connectionString.toClientConfig
export const parseIntoClientConfig = connectionString.parseIntoClientConfig
```
#### node_modules/pg-connection-string/index.d.ts (new file, 36 lines, generated, vendored)

```ts
import { ClientConfig } from 'pg'

export function parse(connectionString: string, options?: Options): ConnectionOptions

export interface Options {
  // Use libpq semantics when interpreting the connection string
  useLibpqCompat?: boolean
}

interface SSLConfig {
  ca?: string
  cert?: string | null
  key?: string
  rejectUnauthorized?: boolean
}

export interface ConnectionOptions {
  host: string | null
  password?: string
  user?: string
  port?: string | null
  database: string | null | undefined
  client_encoding?: string
  ssl?: boolean | string | SSLConfig

  application_name?: string
  fallback_application_name?: string
  options?: string
  keepalives?: number

  // We allow any other options to be passed through
  [key: string]: unknown
}

export function toClientConfig(config: ConnectionOptions): ClientConfig
export function parseIntoClientConfig(connectionString: string): ClientConfig
```
#### node_modules/pg-connection-string/index.js (new file, 231 lines, generated, vendored)

```js
'use strict'

//Parse method copied from https://github.com/brianc/node-postgres
//Copyright (c) 2010-2014 Brian Carlson (brian.m.carlson@gmail.com)
//MIT License

//parses a connection string
function parse(str, options = {}) {
  //unix socket
  if (str.charAt(0) === '/') {
    const config = str.split(' ')
    return { host: config[0], database: config[1] }
  }

  // Check for empty host in URL

  const config = {}
  let result
  let dummyHost = false
  if (/ |%[^a-f0-9]|%[a-f0-9][^a-f0-9]/i.test(str)) {
    // Ensure spaces are encoded as %20
    str = encodeURI(str).replace(/%25(\d\d)/g, '%$1')
  }

  try {
    try {
      result = new URL(str, 'postgres://base')
    } catch (e) {
      // The URL is invalid so try again with a dummy host
      result = new URL(str.replace('@/', '@___DUMMY___/'), 'postgres://base')
      dummyHost = true
    }
  } catch (err) {
    // Remove the input from the error message to avoid leaking sensitive information
    err.input && (err.input = '*****REDACTED*****')
    throw err
  }

  // We'd like to use Object.fromEntries() here but Node.js 10 does not support it
  for (const entry of result.searchParams.entries()) {
    config[entry[0]] = entry[1]
  }

  config.user = config.user || decodeURIComponent(result.username)
  config.password = config.password || decodeURIComponent(result.password)

  if (result.protocol == 'socket:') {
    config.host = decodeURI(result.pathname)
    config.database = result.searchParams.get('db')
    config.client_encoding = result.searchParams.get('encoding')
    return config
  }
  const hostname = dummyHost ? '' : result.hostname
  if (!config.host) {
    // Only set the host if there is no equivalent query param.
    config.host = decodeURIComponent(hostname)
  } else if (hostname && /^%2f/i.test(hostname)) {
    // Only prepend the hostname to the pathname if it is not a URL encoded Unix socket host.
    result.pathname = hostname + result.pathname
  }
  if (!config.port) {
    // Only set the port if there is no equivalent query param.
    config.port = result.port
  }

  const pathname = result.pathname.slice(1) || null
  config.database = pathname ? decodeURI(pathname) : null

  if (config.ssl === 'true' || config.ssl === '1') {
    config.ssl = true
  }

  if (config.ssl === '0') {
    config.ssl = false
  }

  if (config.sslcert || config.sslkey || config.sslrootcert || config.sslmode) {
    config.ssl = {}
  }

  // Only try to load fs if we expect to read from the disk
  const fs = config.sslcert || config.sslkey || config.sslrootcert ? require('fs') : null

  if (config.sslcert) {
    config.ssl.cert = fs.readFileSync(config.sslcert).toString()
  }

  if (config.sslkey) {
    config.ssl.key = fs.readFileSync(config.sslkey).toString()
  }

  if (config.sslrootcert) {
    config.ssl.ca = fs.readFileSync(config.sslrootcert).toString()
  }

  if (options.useLibpqCompat && config.uselibpqcompat) {
    throw new Error('Both useLibpqCompat and uselibpqcompat are set. Please use only one of them.')
  }

  if (config.uselibpqcompat === 'true' || options.useLibpqCompat) {
    switch (config.sslmode) {
      case 'disable': {
        config.ssl = false
        break
      }
      case 'prefer': {
        config.ssl.rejectUnauthorized = false
        break
      }
      case 'require': {
        if (config.sslrootcert) {
          // If a root CA is specified, behavior of `sslmode=require` will be the same as that of `verify-ca`
          config.ssl.checkServerIdentity = function () {}
        } else {
          config.ssl.rejectUnauthorized = false
        }
        break
      }
      case 'verify-ca': {
        if (!config.ssl.ca) {
          throw new Error(
            'SECURITY WARNING: Using sslmode=verify-ca requires specifying a CA with sslrootcert. If a public CA is used, verify-ca allows connections to a server that somebody else may have registered with the CA, making you vulnerable to Man-in-the-Middle attacks. Either specify a custom CA certificate with sslrootcert parameter or use sslmode=verify-full for proper security.'
          )
        }
        config.ssl.checkServerIdentity = function () {}
        break
      }
      case 'verify-full': {
        break
      }
    }
  } else {
    switch (config.sslmode) {
      case 'disable': {
        config.ssl = false
        break
      }
      case 'prefer':
      case 'require':
      case 'verify-ca':
      case 'verify-full': {
        if (config.sslmode !== 'verify-full') {
          deprecatedSslModeWarning(config.sslmode)
        }
        break
      }
      case 'no-verify': {
        config.ssl.rejectUnauthorized = false
        break
      }
    }
  }

  return config
}

// convert pg-connection-string ssl config to a ClientConfig.ConnectionOptions
function toConnectionOptions(sslConfig) {
  const connectionOptions = Object.entries(sslConfig).reduce((c, [key, value]) => {
    // we explicitly check for undefined and null instead of `if (value)` because some
    // options accept falsy values. Example: `ssl.rejectUnauthorized = false`
    if (value !== undefined && value !== null) {
      c[key] = value
    }

    return c
  }, {})

  return connectionOptions
}

// convert pg-connection-string config to a ClientConfig
function toClientConfig(config) {
  const poolConfig = Object.entries(config).reduce((c, [key, value]) => {
    if (key === 'ssl') {
      const sslConfig = value

      if (typeof sslConfig === 'boolean') {
        c[key] = sslConfig
      }

      if (typeof sslConfig === 'object') {
        c[key] = toConnectionOptions(sslConfig)
      }
    } else if (value !== undefined && value !== null) {
      if (key === 'port') {
        // when port is not specified, it is converted into an empty string
        // we want to avoid NaN or empty string as a values in ClientConfig
        if (value !== '') {
          const v = parseInt(value, 10)
          if (isNaN(v)) {
            throw new Error(`Invalid ${key}: ${value}`)
          }

          c[key] = v
        }
      } else {
        c[key] = value
      }
    }

    return c
  }, {})

  return poolConfig
}

// parses a connection string into ClientConfig
function parseIntoClientConfig(str) {
  return toClientConfig(parse(str))
}

function deprecatedSslModeWarning(sslmode) {
  if (!deprecatedSslModeWarning.warned && typeof process !== 'undefined' && process.emitWarning) {
    deprecatedSslModeWarning.warned = true
    process.emitWarning(`SECURITY WARNING: The SSL modes 'prefer', 'require', and 'verify-ca' are treated as aliases for 'verify-full'.
In the next major version (pg-connection-string v3.0.0 and pg v9.0.0), these modes will adopt standard libpq semantics, which have weaker security guarantees.

To prepare for this change:
- If you want the current behavior, explicitly use 'sslmode=verify-full'
- If you want libpq compatibility now, use 'uselibpqcompat=true&sslmode=${sslmode}'

See https://www.postgresql.org/docs/current/libpq-ssl.html for libpq SSL mode definitions.`)
  }
}

module.exports = parse

parse.parse = parse
parse.toClientConfig = toClientConfig
parse.parseIntoClientConfig = parseIntoClientConfig
```
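The `parse()` implementation above rides on the WHATWG `URL` class for the common TCP case. As a quick illustration only (this is a hand-written sketch, not the vendored pg-connection-string code, and it skips query params, unix sockets, percent-encoding repair, and the `sslmode` handling shown above), the core extraction looks like this:

```javascript
// Sketch of the simple TCP-URL case: a WHATWG URL already carries the
// user, password, host, port and database path that parse() extracts.
function parseSimple(str) {
  const url = new URL(str)
  return {
    user: decodeURIComponent(url.username),
    password: decodeURIComponent(url.password),
    host: url.hostname,
    port: url.port, // kept as a string, as pg-connection-string does
    database: url.pathname.slice(1) || null,
  }
}

const config = parseSimple('postgres://someuser:somepassword@somehost:381/somedatabase')
console.log(config.host, config.port, config.database)
// → somehost 381 somedatabase
```

The real parser layers the query-parameter overrides, file reads for `sslcert`/`sslkey`/`sslrootcert`, and the libpq-compat `sslmode` matrix on top of this.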
#### node_modules/pg-connection-string/package.json (new file, 52 lines, generated, vendored)

```json
{
  "name": "pg-connection-string",
  "version": "2.11.0",
  "description": "Functions for dealing with a PostgresSQL connection string",
  "main": "./index.js",
  "types": "./index.d.ts",
  "exports": {
    ".": {
      "types": "./index.d.ts",
      "import": "./esm/index.mjs",
      "require": "./index.js",
      "default": "./index.js"
    }
  },
  "scripts": {
    "test": "nyc --reporter=lcov mocha && npm run check-coverage",
    "check-coverage": "nyc check-coverage --statements 100 --branches 100 --lines 100 --functions 100"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/brianc/node-postgres.git",
    "directory": "packages/pg-connection-string"
  },
  "keywords": [
    "pg",
    "connection",
    "string",
    "parse"
  ],
  "author": "Blaine Bublitz <blaine@iceddev.com> (http://iceddev.com/)",
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/brianc/node-postgres/issues"
  },
  "homepage": "https://github.com/brianc/node-postgres/tree/master/packages/pg-connection-string",
  "devDependencies": {
    "@types/pg": "^8.12.0",
    "chai": "^4.1.1",
    "coveralls": "^3.0.4",
    "istanbul": "^0.4.5",
    "mocha": "^10.5.2",
    "nyc": "^15",
    "tsx": "^4.19.4",
    "typescript": "^4.0.3"
  },
  "files": [
    "index.js",
    "index.d.ts",
    "esm"
  ],
  "gitHead": "fc4de3c62ad350d0e1b392a0d132aff906d1cec6"
}
```
#### node_modules/pg-int8/LICENSE (new file, 13 lines, generated, vendored)

```text
Copyright © 2017, Charmander <~@charmander.me>

Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED “AS IS” AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH
REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,
INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM
LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR
OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR
PERFORMANCE OF THIS SOFTWARE.
```
#### node_modules/pg-int8/README.md (new file, 16 lines, generated, vendored)

````markdown
[![Build status][ci image]][ci]

64-bit big-endian signed integer-to-string conversion designed for [pg][].

```js
const readInt8 = require('pg-int8');

readInt8(Buffer.from([0, 1, 2, 3, 4, 5, 6, 7]))
// '283686952306183'
```


[pg]: https://github.com/brianc/node-postgres

[ci]: https://travis-ci.org/charmander/pg-int8
[ci image]: https://api.travis-ci.org/charmander/pg-int8.svg
````
#### node_modules/pg-int8/index.js (new file, 100 lines, generated, vendored)

```js
'use strict';

// selected so (BASE - 1) * 0x100000000 + 0xffffffff is a safe integer
var BASE = 1000000;

function readInt8(buffer) {
  var high = buffer.readInt32BE(0);
  var low = buffer.readUInt32BE(4);
  var sign = '';

  if (high < 0) {
    high = ~high + (low === 0);
    low = (~low + 1) >>> 0;
    sign = '-';
  }

  var result = '';
  var carry;
  var t;
  var digits;
  var pad;
  var l;
  var i;

  {
    carry = high % BASE;
    high = high / BASE >>> 0;

    t = 0x100000000 * carry + low;
    low = t / BASE >>> 0;
    digits = '' + (t - BASE * low);

    if (low === 0 && high === 0) {
      return sign + digits + result;
    }

    pad = '';
    l = 6 - digits.length;

    for (i = 0; i < l; i++) {
      pad += '0';
    }

    result = pad + digits + result;
  }

  {
    carry = high % BASE;
    high = high / BASE >>> 0;

    t = 0x100000000 * carry + low;
    low = t / BASE >>> 0;
    digits = '' + (t - BASE * low);

    if (low === 0 && high === 0) {
      return sign + digits + result;
    }

    pad = '';
    l = 6 - digits.length;

    for (i = 0; i < l; i++) {
      pad += '0';
    }

    result = pad + digits + result;
  }

  {
    carry = high % BASE;
    high = high / BASE >>> 0;

    t = 0x100000000 * carry + low;
    low = t / BASE >>> 0;
    digits = '' + (t - BASE * low);

    if (low === 0 && high === 0) {
      return sign + digits + result;
    }

    pad = '';
    l = 6 - digits.length;

    for (i = 0; i < l; i++) {
      pad += '0';
    }

    result = pad + digits + result;
  }

  {
    carry = high % BASE;
    t = 0x100000000 * carry + low;
    digits = '' + t % BASE;

    return sign + digits + result;
  }
}

module.exports = readInt8;
```
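The four repeated blocks in `pg-int8/index.js` above are an unrolled base-10⁶ long division over the 64-bit value `high * 2^32 + low`, peeling off one six-digit group per block. As a compact illustration (a rewrite for clarity, not the vendored code), the same digit-group peeling can be written as a loop:

```javascript
// Sketch of pg-int8's approach: treat the 8-byte big-endian value as
// high * 2^32 + low, and peel off base-1e6 digit groups, least significant
// first. Each intermediate t stays below 1e6 * 2^32, a safe integer.
const BASE = 1000000;

function int8ToString(buffer) {
  let high = buffer.readInt32BE(0);
  let low = buffer.readUInt32BE(4);
  let sign = '';

  if (high < 0) {
    // Two's-complement negate across both 32-bit halves.
    high = ~high + (low === 0);
    low = (~low + 1) >>> 0;
    sign = '-';
  }

  let result = '';
  for (;;) {
    const carry = high % BASE;
    high = (high / BASE) >>> 0;
    const t = 0x100000000 * carry + low;
    low = (t / BASE) >>> 0;
    const digits = String(t - BASE * low);
    if (high === 0 && low === 0) return sign + digits + result;
    // Inner groups are zero-padded to six digits.
    result = digits.padStart(6, '0') + result;
  }
}

console.log(int8ToString(Buffer.from([0, 1, 2, 3, 4, 5, 6, 7])));
// → '283686952306183'
```

The vendored file unrolls this loop by hand (at most four groups are needed for a 64-bit value), presumably to avoid loop overhead on old Node versions.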
#### node_modules/pg-int8/package.json (new file, 24 lines, generated, vendored)

```json
{
  "name": "pg-int8",
  "version": "1.0.1",
  "description": "64-bit big-endian signed integer-to-string conversion",
  "bugs": "https://github.com/charmander/pg-int8/issues",
  "license": "ISC",
  "files": [
    "index.js"
  ],
  "repository": {
    "type": "git",
    "url": "https://github.com/charmander/pg-int8"
  },
  "scripts": {
    "test": "tap test"
  },
  "devDependencies": {
    "@charmander/eslint-config-base": "1.0.2",
    "tap": "10.7.3"
  },
  "engines": {
    "node": ">=4.0.0"
  }
}
```
21
node_modules/pg-pool/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2017 Brian M. Carlson

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
357
node_modules/pg-pool/README.md
generated
vendored
Normal file
@@ -0,0 +1,357 @@
# pg-pool

[](https://travis-ci.org/brianc/node-pg-pool)

A connection pool for node-postgres

## install

```sh
npm i pg-pool pg
```

## use

### create

to use pg-pool you must first create an instance of a pool

```js
const Pool = require('pg-pool')

// by default the pool uses the same
// configuration as whatever `pg` version you have installed
const pool = new Pool()

// you can pass properties to the pool
// these properties are passed unchanged to both the node-postgres Client constructor
// and the pool constructor, allowing you to fully configure the behavior of both
const pool2 = new Pool({
  database: 'postgres',
  user: 'brianc',
  password: 'secret!',
  port: 5432,
  ssl: true,
  max: 20, // set pool max size to 20
  idleTimeoutMillis: 1000, // close idle clients after 1 second
  connectionTimeoutMillis: 1000, // return an error after 1 second if connection could not be established
  maxUses: 7500, // close (and replace) a connection after it has been used 7500 times (see below for discussion)
})

// you can supply a custom client constructor
// if you want to use the native postgres client
const NativeClient = require('pg').native.Client
const nativePool = new Pool({ Client: NativeClient })

// you can even pool pg-native clients directly
const PgNativeClient = require('pg-native')
const pgNativePool = new Pool({ Client: PgNativeClient })
```

##### Note:

The Pool constructor does not support passing a Database URL as the parameter. To use pg-pool on heroku, for example, you need to parse the URL into a config object. Here is an example of how to parse a Database URL.

```js
const Pool = require('pg-pool')
const url = require('url')

const params = url.parse(process.env.DATABASE_URL)
const auth = params.auth.split(':')

const config = {
  user: auth[0],
  password: auth[1],
  host: params.hostname,
  port: params.port,
  database: params.pathname.split('/')[1],
  ssl: true,
}

const pool = new Pool(config)

/*
  Transforms, 'postgres://DBuser:secret@DBHost:#####/myDB', into
  config = {
    user: 'DBuser',
    password: 'secret',
    host: 'DBHost',
    port: '#####',
    database: 'myDB',
    ssl: true
  }
*/
```

### acquire clients with a promise

pg-pool supports a fully promise-based api for acquiring clients

```js
const pool = new Pool()
pool.connect().then((client) => {
  client
    .query('select $1::text as name', ['pg-pool'])
    .then((res) => {
      client.release()
      console.log('hello from', res.rows[0].name)
    })
    .catch((e) => {
      client.release()
      console.error('query error', e.message, e.stack)
    })
})
```

### plays nice with async/await

this ends up looking much nicer if you're using [co](https://github.com/tj/co) or async/await:

```js
// with async/await
;(async () => {
  const pool = new Pool()
  const client = await pool.connect()
  try {
    const result = await client.query('select $1::text as name', ['brianc'])
    console.log('hello from', result.rows[0])
  } finally {
    client.release()
  }
})().catch((e) => console.error(e.message, e.stack))

// with co
co(function* () {
  const client = yield pool.connect()
  try {
    const result = yield client.query('select $1::text as name', ['brianc'])
    console.log('hello from', result.rows[0])
  } finally {
    client.release()
  }
}).catch((e) => console.error(e.message, e.stack))
```

### your new favorite helper method

because it's so common to just run a query and return the client to the pool afterward pg-pool has this built-in:

```js
const pool = new Pool()
const time = await pool.query('SELECT NOW()')
const name = await pool.query('select $1::text as name', ['brianc'])
console.log(name.rows[0].name, 'says hello at', time.rows[0].now)
```

you can also use a callback here if you'd like:

```js
const pool = new Pool()
pool.query('SELECT $1::text as name', ['brianc'], function (err, res) {
  console.log(res.rows[0].name) // brianc
})
```

**pro tip:** unless you need to run a transaction (which requires a single client for multiple queries) or you have some other edge case like [streaming rows](https://github.com/brianc/node-pg-query-stream) or using a [cursor](https://github.com/brianc/node-pg-cursor) you should almost always just use `pool.query`. It's easy, it does the right thing :tm:, and won't ever forget to return clients back to the pool after the query is done.

### drop-in backwards compatible

pg-pool still and will always support the traditional callback api for acquiring a client. This is the exact API node-postgres has shipped with for years:

```js
const pool = new Pool()
pool.connect((err, client, done) => {
  if (err) return done(err)

  client.query('SELECT $1::text as name', ['pg-pool'], (err, res) => {
    done()
    if (err) {
      return console.error('query error', err.message, err.stack)
    }
    console.log('hello from', res.rows[0].name)
  })
})
```

### shut it down

When you are finished with the pool, if all the clients are idle the pool will close them after `config.idleTimeoutMillis` and your app will shut down gracefully. If you don't want to wait for the timeout you can end the pool as follows:

```js
const pool = new Pool()
const client = await pool.connect()
console.log(await client.query('select now()'))
client.release()
await pool.end()
```

### a note on instances

The pool should be a **long-lived object** in your application. Generally you'll want to instantiate one pool when your app starts up and use the same instance of the pool throughout the lifetime of your application. If you are frequently creating a new pool within your code you likely don't have your pool initialization code in the correct place. Example:

```js
// assume this is a file in your program at ./your-app/lib/db.js

// correct usage: create the pool and let it live
// 'globally' here, controlling access to it through exported methods
const pool = new pg.Pool()

// this is the right way to export the query method
module.exports.query = (text, values) => {
  console.log('query:', text, values)
  return pool.query(text, values)
}

// this would be the WRONG way to export the connect method
module.exports.connect = () => {
  // notice how we would be creating a pool instance here
  // every time we called 'connect' to get a new client?
  // that's a bad thing & results in creating an unbounded
  // number of pools & therefore connections
  const aPool = new pg.Pool()
  return aPool.connect()
}
```

### events

Every instance of a `Pool` is an event emitter. These instances emit the following events:

#### error

Emitted whenever an idle client in the pool encounters an error. This is common when your PostgreSQL server shuts down, reboots, or a network partition otherwise causes it to become unavailable while your pool has connected clients.

Example:

```js
const Pool = require('pg-pool')
const pool = new Pool()

// attach an error handler to the pool for when a connected, idle client
// receives an error by being disconnected, etc
pool.on('error', function (error, client) {
  // handle this in the same way you would treat process.on('uncaughtException')
  // it is supplied the error as well as the idle client which received the error
})
```

#### connect

Fired whenever the pool creates a **new** `pg.Client` instance and successfully connects it to the backend.

Example:

```js
const Pool = require('pg-pool')
const pool = new Pool()

let count = 0

pool.on('connect', (client) => {
  client.count = count++
})

pool
  .connect()
  .then((client) => {
    return client
      .query('SELECT $1::int AS "clientCount"', [client.count])
      .then((res) => console.log(res.rows[0].clientCount)) // outputs 0
      .then(() => client)
  })
  .then((client) => client.release())
```

#### acquire

Fired whenever a client is acquired from the pool

Example:

This allows you to count the number of clients which have ever been acquired from the pool.

```js
const Pool = require('pg-pool')
const pool = new Pool()

let acquireCount = 0
pool.on('acquire', function (client) {
  acquireCount++
})

let connectCount = 0
pool.on('connect', function () {
  connectCount++
})

for (let i = 0; i < 200; i++) {
  pool.query('SELECT NOW()')
}

setTimeout(function () {
  console.log('connect count:', connectCount) // output: connect count: 10
  console.log('acquire count:', acquireCount) // output: acquire count: 200
}, 100)
```

### environment variables

pg-pool & node-postgres support some of the same environment variables as `psql` supports. The most common are:

```
PGDATABASE=my_db
PGUSER=username
PGPASSWORD="my awesome password"
PGPORT=5432
PGSSLMODE=require
```
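As a sketch of how these variables could be mapped into an explicit config object — the `PG*` names are the standard ones listed above, but the fallback defaults here are illustrative and are not pg's exact resolution logic:

```javascript
// Build a pool config from the standard PG* environment variables.
// Fallback values are illustrative defaults, NOT pg's exact resolution order.
function configFromEnv(env) {
  return {
    database: env.PGDATABASE || 'postgres',
    user: env.PGUSER || 'postgres',
    password: env.PGPASSWORD || undefined,
    host: env.PGHOST || 'localhost',
    port: parseInt(env.PGPORT || '5432', 10),
    // only 'require' is checked here; real sslmode handling is richer
    ssl: env.PGSSLMODE === 'require',
  }
}

const config = configFromEnv(process.env)
console.log('connecting to', config.host + ':' + config.port)
```

In practice you would pass `configFromEnv(process.env)` (or simply nothing, letting pg read the variables itself) to `new Pool(...)`.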

Usually I will export these into my local environment via a `.env` file with environment settings or export them in `~/.bash_profile` or something similar. This way I get configurability which works with both the postgres suite of tools (`psql`, `pg_dump`, `pg_restore`) and node, I can vary the environment variables locally and in production, and it supports the concept of a [12-factor app](http://12factor.net/) out of the box.

## maxUses and read-replica autoscaling (e.g. AWS Aurora)

The maxUses config option can help an application instance rebalance load against a replica set that has been auto-scaled after the connection pool is already full of healthy connections.

The mechanism here is that a connection is considered "expended" after it has been acquired and released `maxUses` number of times. Depending on the load on your system, this means there will be an approximate time in which any given connection will live, thus creating a window for rebalancing.

Imagine a scenario where you have 10 app instances providing an API running against a replica cluster of 3 that are accessed via a round-robin DNS entry. Each instance runs a connection pool size of 20. With an ambient load of 50 requests per second, the connection pool will likely fill up in a few minutes with healthy connections.

If you have weekly bursts of traffic which peak at 1,000 requests per second, you might want to grow your replicas to 10 during this period. Without setting `maxUses`, the new replicas will not be adopted by the app servers without an intervention -- namely, restarting each in turn in order to build up new connection pools that are balanced against all the replicas. Adding additional app server instances will help to some extent because they will adopt all the replicas in an even way, but the initial app servers will continue to focus additional load on the original replicas.

This is where the `maxUses` configuration option comes into play. Setting `maxUses` to 7500 will ensure that over a period of 30 minutes or so the new replicas will be adopted as the pre-existing connections are closed and replaced with new ones, thus creating a window for eventual balance.

You'll want to test based on your own scenarios, but one way to make a first guess at `maxUses` is to identify an acceptable window for rebalancing and then solve for the value:

```
maxUses = rebalanceWindowSeconds * totalRequestsPerSecond / numAppInstances / poolSize
```

In the example above, assuming we acquire and release 1 connection per request and we are aiming for a 30 minute rebalancing window:

```
maxUses = rebalanceWindowSeconds * totalRequestsPerSecond / numAppInstances / poolSize
7200 = 1800 * 1000 / 10 / 25
```
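The back-of-envelope formula above can be wrapped in a tiny helper. The function and parameter names here are illustrative, not part of pg-pool's API; the pool only consumes the resulting number as its `maxUses` option:

```javascript
// Solve the rebalancing formula for maxUses.
// maxUses = rebalanceWindowSeconds * totalRequestsPerSecond / numAppInstances / poolSize
function estimateMaxUses({ rebalanceWindowSeconds, totalRequestsPerSecond, numAppInstances, poolSize }) {
  return Math.round((rebalanceWindowSeconds * totalRequestsPerSecond) / numAppInstances / poolSize)
}

// The worked example: a 30-minute window (1800 s) at 1,000 req/s,
// spread over 10 app instances with a pool size of 25.
console.log(estimateMaxUses({
  rebalanceWindowSeconds: 1800,
  totalRequestsPerSecond: 1000,
  numAppInstances: 10,
  poolSize: 25,
})) // 7200
```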

## tests

To run tests clone the repo, `npm i` in the working dir, and then run `npm test`

## contributions

I love contributions. Please make sure they have tests, and submit a PR. If you're not sure if the issue is worth it or will be accepted it never hurts to open an issue to begin the conversation. If you're interested in keeping up with node-postgres related stuff, you can follow me on twitter at [@briancarlson](https://twitter.com/briancarlson) - I generally announce any noteworthy updates there.

## license

The MIT License (MIT)
Copyright (c) 2016 Brian M. Carlson

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
5
node_modules/pg-pool/esm/index.mjs
generated
vendored
Normal file
@@ -0,0 +1,5 @@
// ESM wrapper for pg-pool
import Pool from '../index.js'

// Export as default only to match CJS module
export default Pool
487
node_modules/pg-pool/index.js
generated
vendored
Normal file
@@ -0,0 +1,487 @@
'use strict'
const EventEmitter = require('events').EventEmitter

const NOOP = function () {}

const removeWhere = (list, predicate) => {
  const i = list.findIndex(predicate)

  return i === -1 ? undefined : list.splice(i, 1)[0]
}

class IdleItem {
  constructor(client, idleListener, timeoutId) {
    this.client = client
    this.idleListener = idleListener
    this.timeoutId = timeoutId
  }
}

class PendingItem {
  constructor(callback) {
    this.callback = callback
  }
}

function throwOnDoubleRelease() {
  throw new Error('Release called on client which has already been released to the pool.')
}

function promisify(Promise, callback) {
  if (callback) {
    return { callback: callback, result: undefined }
  }
  let rej
  let res
  const cb = function (err, client) {
    err ? rej(err) : res(client)
  }
  const result = new Promise(function (resolve, reject) {
    res = resolve
    rej = reject
  }).catch((err) => {
    // replace the stack trace that leads to `TCP.onStreamRead` with one that leads back to the
    // application that created the query
    Error.captureStackTrace(err)
    throw err
  })
  return { callback: cb, result: result }
}

function makeIdleListener(pool, client) {
  return function idleListener(err) {
    err.client = client

    client.removeListener('error', idleListener)
    client.on('error', () => {
      pool.log('additional client error after disconnection due to error', err)
    })
    pool._remove(client)
    // TODO - document that once the pool emits an error
    // the client has already been closed & purged and is unusable
    pool.emit('error', err, client)
  }
}

class Pool extends EventEmitter {
  constructor(options, Client) {
    super()
    this.options = Object.assign({}, options)

    if (options != null && 'password' in options) {
      // "hiding" the password so it doesn't show up in stack traces
      // or if the client is console.logged
      Object.defineProperty(this.options, 'password', {
        configurable: true,
        enumerable: false,
        writable: true,
        value: options.password,
      })
    }
    if (options != null && options.ssl && options.ssl.key) {
      // "hiding" the ssl->key so it doesn't show up in stack traces
      // or if the client is console.logged
      Object.defineProperty(this.options.ssl, 'key', {
        enumerable: false,
      })
    }

    this.options.max = this.options.max || this.options.poolSize || 10
    this.options.min = this.options.min || 0
    this.options.maxUses = this.options.maxUses || Infinity
    this.options.allowExitOnIdle = this.options.allowExitOnIdle || false
    this.options.maxLifetimeSeconds = this.options.maxLifetimeSeconds || 0
    this.log = this.options.log || function () {}
    this.Client = this.options.Client || Client || require('pg').Client
    this.Promise = this.options.Promise || global.Promise

    if (typeof this.options.idleTimeoutMillis === 'undefined') {
      this.options.idleTimeoutMillis = 10000
    }

    this._clients = []
    this._idle = []
    this._expired = new WeakSet()
    this._pendingQueue = []
    this._endCallback = undefined
    this.ending = false
    this.ended = false
  }

  _isFull() {
    return this._clients.length >= this.options.max
  }

  _isAboveMin() {
    return this._clients.length > this.options.min
  }

  _pulseQueue() {
    this.log('pulse queue')
    if (this.ended) {
      this.log('pulse queue ended')
      return
    }
    if (this.ending) {
      this.log('pulse queue on ending')
      if (this._idle.length) {
        this._idle.slice().map((item) => {
          this._remove(item.client)
        })
      }
      if (!this._clients.length) {
        this.ended = true
        this._endCallback()
      }
      return
    }

    // if we don't have any waiting, do nothing
    if (!this._pendingQueue.length) {
      this.log('no queued requests')
      return
    }
    // if we don't have any idle clients and we have no more room do nothing
    if (!this._idle.length && this._isFull()) {
      return
    }
    const pendingItem = this._pendingQueue.shift()
    if (this._idle.length) {
      const idleItem = this._idle.pop()
      clearTimeout(idleItem.timeoutId)
      const client = idleItem.client
      client.ref && client.ref()
      const idleListener = idleItem.idleListener

      return this._acquireClient(client, pendingItem, idleListener, false)
    }
    if (!this._isFull()) {
      return this.newClient(pendingItem)
    }
    throw new Error('unexpected condition')
  }

  _remove(client, callback) {
    const removed = removeWhere(this._idle, (item) => item.client === client)

    if (removed !== undefined) {
      clearTimeout(removed.timeoutId)
    }

    this._clients = this._clients.filter((c) => c !== client)
    const context = this
    client.end(() => {
      context.emit('remove', client)

      if (typeof callback === 'function') {
        callback()
      }
    })
  }

  connect(cb) {
    if (this.ending) {
      const err = new Error('Cannot use a pool after calling end on the pool')
      return cb ? cb(err) : this.Promise.reject(err)
    }

    const response = promisify(this.Promise, cb)
    const result = response.result

    // if we don't have to connect a new client, don't do so
    if (this._isFull() || this._idle.length) {
      // if we have idle clients schedule a pulse immediately
      if (this._idle.length) {
        process.nextTick(() => this._pulseQueue())
      }

      if (!this.options.connectionTimeoutMillis) {
        this._pendingQueue.push(new PendingItem(response.callback))
        return result
      }

      const queueCallback = (err, res, done) => {
        clearTimeout(tid)
        response.callback(err, res, done)
      }

      const pendingItem = new PendingItem(queueCallback)

      // set connection timeout on checking out an existing client
      const tid = setTimeout(() => {
        // remove the callback from pending waiters because
        // we're going to call it with a timeout error
        removeWhere(this._pendingQueue, (i) => i.callback === queueCallback)
        pendingItem.timedOut = true
        response.callback(new Error('timeout exceeded when trying to connect'))
      }, this.options.connectionTimeoutMillis)

      if (tid.unref) {
        tid.unref()
      }

      this._pendingQueue.push(pendingItem)
      return result
    }

    this.newClient(new PendingItem(response.callback))

    return result
  }

  newClient(pendingItem) {
    const client = new this.Client(this.options)
    this._clients.push(client)
    const idleListener = makeIdleListener(this, client)

    this.log('checking client timeout')

    // connection timeout logic
    let tid
    let timeoutHit = false
    if (this.options.connectionTimeoutMillis) {
      tid = setTimeout(() => {
        if (client.connection) {
          this.log('ending client due to timeout')
          timeoutHit = true
          client.connection.stream.destroy()
        } else if (!client.isConnected()) {
          this.log('ending client due to timeout')
          timeoutHit = true
          // force kill the node driver, and let libpq do its teardown
          client.end()
        }
      }, this.options.connectionTimeoutMillis)
    }

    this.log('connecting new client')
    client.connect((err) => {
      if (tid) {
        clearTimeout(tid)
      }
      client.on('error', idleListener)
      if (err) {
        this.log('client failed to connect', err)
        // remove the dead client from our list of clients
        this._clients = this._clients.filter((c) => c !== client)
        if (timeoutHit) {
          err = new Error('Connection terminated due to connection timeout', { cause: err })
        }

        // this client won't be released, so move on immediately
        this._pulseQueue()

        if (!pendingItem.timedOut) {
          pendingItem.callback(err, undefined, NOOP)
        }
      } else {
        this.log('new client connected')

        if (this.options.maxLifetimeSeconds !== 0) {
          const maxLifetimeTimeout = setTimeout(() => {
            this.log('ending client due to expired lifetime')
            this._expired.add(client)
            const idleIndex = this._idle.findIndex((idleItem) => idleItem.client === client)
            if (idleIndex !== -1) {
              this._acquireClient(
                client,
                new PendingItem((err, client, clientRelease) => clientRelease()),
                idleListener,
                false
              )
            }
          }, this.options.maxLifetimeSeconds * 1000)

          maxLifetimeTimeout.unref()
          client.once('end', () => clearTimeout(maxLifetimeTimeout))
        }

        return this._acquireClient(client, pendingItem, idleListener, true)
      }
    })
  }

  // acquire a client for a pending work item
  _acquireClient(client, pendingItem, idleListener, isNew) {
    if (isNew) {
      this.emit('connect', client)
    }

    this.emit('acquire', client)

    client.release = this._releaseOnce(client, idleListener)

    client.removeListener('error', idleListener)

    if (!pendingItem.timedOut) {
      if (isNew && this.options.verify) {
        this.options.verify(client, (err) => {
          if (err) {
            client.release(err)
            return pendingItem.callback(err, undefined, NOOP)
          }

          pendingItem.callback(undefined, client, client.release)
        })
      } else {
        pendingItem.callback(undefined, client, client.release)
      }
    } else {
      if (isNew && this.options.verify) {
        this.options.verify(client, client.release)
      } else {
        client.release()
      }
    }
  }

  // returns a function that wraps _release and throws if called more than once
  _releaseOnce(client, idleListener) {
    let released = false

    return (err) => {
      if (released) {
        throwOnDoubleRelease()
      }

      released = true
      this._release(client, idleListener, err)
    }
  }

  // release a client back to the pool, include an error
  // to remove it from the pool
  _release(client, idleListener, err) {
    client.on('error', idleListener)

    client._poolUseCount = (client._poolUseCount || 0) + 1

    this.emit('release', err, client)

    // TODO(bmc): expose a proper, public interface _queryable and _ending
    if (err || this.ending || !client._queryable || client._ending || client._poolUseCount >= this.options.maxUses) {
      if (client._poolUseCount >= this.options.maxUses) {
        this.log('remove expended client')
      }

      return this._remove(client, this._pulseQueue.bind(this))
    }

    const isExpired = this._expired.has(client)
    if (isExpired) {
      this.log('remove expired client')
      this._expired.delete(client)
      return this._remove(client, this._pulseQueue.bind(this))
    }

    // idle timeout
    let tid
    if (this.options.idleTimeoutMillis && this._isAboveMin()) {
      tid = setTimeout(() => {
        if (this._isAboveMin()) {
          this.log('remove idle client')
          this._remove(client, this._pulseQueue.bind(this))
        }
      }, this.options.idleTimeoutMillis)

      if (this.options.allowExitOnIdle) {
        // allow Node to exit if this is all that's left
        tid.unref()
      }
    }

    if (this.options.allowExitOnIdle) {
      client.unref()
    }

    this._idle.push(new IdleItem(client, idleListener, tid))
    this._pulseQueue()
  }

  query(text, values, cb) {
    // guard clause against passing a function as the first parameter
    if (typeof text === 'function') {
      const response = promisify(this.Promise, text)
      setImmediate(function () {
        return response.callback(new Error('Passing a function as the first parameter to pool.query is not supported'))
      })
      return response.result
    }

    // allow plain text query without values
    if (typeof values === 'function') {
      cb = values
      values = undefined
    }
    const response = promisify(this.Promise, cb)
    cb = response.callback

    this.connect((err, client) => {
      if (err) {
        return cb(err)
      }

      let clientReleased = false
      const onError = (err) => {
        if (clientReleased) {
          return
        }
        clientReleased = true
        client.release(err)
        cb(err)
      }

      client.once('error', onError)
      this.log('dispatching query')
      try {
        client.query(text, values, (err, res) => {
          this.log('query dispatched')
          client.removeListener('error', onError)
          if (clientReleased) {
            return
          }
          clientReleased = true
          client.release(err)
          if (err) {
            return cb(err)
          }
          return cb(undefined, res)
        })
      } catch (err) {
        client.release(err)
        return cb(err)
      }
    })
    return response.result
  }

  end(cb) {
    this.log('ending')
    if (this.ending) {
      const err = new Error('Called end on pool more than once')
      return cb ? cb(err) : this.Promise.reject(err)
    }
    this.ending = true
    const promised = promisify(this.Promise, cb)
    this._endCallback = promised.callback
    this._pulseQueue()
    return promised.result
  }

  get waitingCount() {
    return this._pendingQueue.length
  }

  get idleCount() {
    return this._idle.length
  }

  get expiredCount() {
    return this._clients.reduce((acc, client) => acc + (this._expired.has(client) ? 1 : 0), 0)
  }

  get totalCount() {
    return this._clients.length
  }
}
module.exports = Pool
51
node_modules/pg-pool/package.json
generated
vendored
Normal file
@@ -0,0 +1,51 @@
{
  "name": "pg-pool",
  "version": "3.12.0",
  "description": "Connection pool for node-postgres",
  "main": "index.js",
  "exports": {
    ".": {
      "import": "./esm/index.mjs",
      "require": "./index.js",
      "default": "./index.js"
    }
  },
  "directories": {
    "test": "test"
  },
  "scripts": {
    "test": " node_modules/.bin/mocha"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/brianc/node-postgres.git",
    "directory": "packages/pg-pool"
  },
  "keywords": [
    "pg",
    "postgres",
    "pool",
    "database"
  ],
  "author": "Brian M. Carlson",
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/brianc/node-postgres/issues"
  },
  "homepage": "https://github.com/brianc/node-postgres/tree/master/packages/pg-pool#readme",
  "devDependencies": {
    "bluebird": "3.7.2",
    "co": "4.6.0",
    "expect.js": "0.3.1",
    "lodash": "^4.17.11",
    "mocha": "^10.5.2"
  },
  "peerDependencies": {
    "pg": ">=8.0"
  },
  "files": [
    "index.js",
    "esm"
  ],
  "gitHead": "f2d7d1146cc87024a5fa503dce13c59ff5196d26"
}
21
node_modules/pg-protocol/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2010 - 2021 Brian Carlson

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
3
node_modules/pg-protocol/README.md
generated
vendored
Normal file
@@ -0,0 +1,3 @@
# pg-protocol

Low level postgres wire protocol parser and serializer written in Typescript. Used by node-postgres. Needs more documentation. :smile:
1
node_modules/pg-protocol/dist/b.d.ts
generated
vendored
Normal file
@@ -0,0 +1 @@
export {};
23
node_modules/pg-protocol/dist/b.js
generated
vendored
Normal file
@@ -0,0 +1,23 @@
"use strict";
// file for microbenchmarking
Object.defineProperty(exports, "__esModule", { value: true });
const buffer_reader_1 = require("./buffer-reader");
const LOOPS = 1000;
let count = 0;
const start = performance.now();
const reader = new buffer_reader_1.BufferReader();
const buffer = Buffer.from([33, 33, 33, 33, 33, 33, 33, 0]);
const run = () => {
    if (count > LOOPS) {
        console.log(performance.now() - start);
        return;
    }
    count++;
    for (let i = 0; i < LOOPS; i++) {
        reader.setBuffer(0, buffer);
        reader.cstring();
    }
    setImmediate(run);
};
run();
//# sourceMappingURL=b.js.map
1
node_modules/pg-protocol/dist/b.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"b.js","sourceRoot":"","sources":["../src/b.ts"],"names":[],"mappings":";AAAA,6BAA6B;;AAE7B,mDAA8C;AAE9C,MAAM,KAAK,GAAG,IAAI,CAAA;AAClB,IAAI,KAAK,GAAG,CAAC,CAAA;AACb,MAAM,KAAK,GAAG,WAAW,CAAC,GAAG,EAAE,CAAA;AAE/B,MAAM,MAAM,GAAG,IAAI,4BAAY,EAAE,CAAA;AACjC,MAAM,MAAM,GAAG,MAAM,CAAC,IAAI,CAAC,CAAC,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE,CAAC,CAAC,CAAC,CAAA;AAE3D,MAAM,GAAG,GAAG,GAAG,EAAE;IACf,IAAI,KAAK,GAAG,KAAK,EAAE;QACjB,OAAO,CAAC,GAAG,CAAC,WAAW,CAAC,GAAG,EAAE,GAAG,KAAK,CAAC,CAAA;QACtC,OAAM;KACP;IACD,KAAK,EAAE,CAAA;IACP,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,KAAK,EAAE,CAAC,EAAE,EAAE;QAC9B,MAAM,CAAC,SAAS,CAAC,CAAC,EAAE,MAAM,CAAC,CAAA;QAC3B,MAAM,CAAC,OAAO,EAAE,CAAA;KACjB;IACD,YAAY,CAAC,GAAG,CAAC,CAAA;AACnB,CAAC,CAAA;AAED,GAAG,EAAE,CAAA"}
15
node_modules/pg-protocol/dist/buffer-reader.d.ts
generated
vendored
Normal file
@@ -0,0 +1,15 @@
/// <reference types="node" />
export declare class BufferReader {
    private offset;
    private buffer;
    private encoding;
    constructor(offset?: number);
    setBuffer(offset: number, buffer: Buffer): void;
    int16(): number;
    byte(): number;
    int32(): number;
    uint32(): number;
    string(length: number): string;
    cstring(): string;
    bytes(length: number): Buffer;
}
55
node_modules/pg-protocol/dist/buffer-reader.js
generated
vendored
Normal file
@@ -0,0 +1,55 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.BufferReader = void 0;
class BufferReader {
    constructor(offset = 0) {
        this.offset = offset;
        this.buffer = Buffer.allocUnsafe(0);
        // TODO(bmc): support non-utf8 encoding?
        this.encoding = 'utf-8';
    }
    setBuffer(offset, buffer) {
        this.offset = offset;
        this.buffer = buffer;
    }
    int16() {
        const result = this.buffer.readInt16BE(this.offset);
        this.offset += 2;
        return result;
    }
    byte() {
        const result = this.buffer[this.offset];
        this.offset++;
        return result;
    }
    int32() {
        const result = this.buffer.readInt32BE(this.offset);
        this.offset += 4;
        return result;
    }
    uint32() {
        const result = this.buffer.readUInt32BE(this.offset);
        this.offset += 4;
        return result;
    }
    string(length) {
        const result = this.buffer.toString(this.encoding, this.offset, this.offset + length);
        this.offset += length;
        return result;
    }
    cstring() {
        const start = this.offset;
        let end = start;
        // eslint-disable-next-line no-empty
        while (this.buffer[end++] !== 0) { }
        this.offset = end;
        return this.buffer.toString(this.encoding, start, end - 1);
    }
    bytes(length) {
        const result = this.buffer.slice(this.offset, this.offset + length);
        this.offset += length;
        return result;
    }
}
exports.BufferReader = BufferReader;
//# sourceMappingURL=buffer-reader.js.map
1
node_modules/pg-protocol/dist/buffer-reader.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"buffer-reader.js","sourceRoot":"","sources":["../src/buffer-reader.ts"],"names":[],"mappings":";;;AAAA,MAAa,YAAY;IAMvB,YAAoB,SAAiB,CAAC;QAAlB,WAAM,GAAN,MAAM,CAAY;QAL9B,WAAM,GAAW,MAAM,CAAC,WAAW,CAAC,CAAC,CAAC,CAAA;QAE9C,wCAAwC;QAChC,aAAQ,GAAW,OAAO,CAAA;IAEO,CAAC;IAEnC,SAAS,CAAC,MAAc,EAAE,MAAc;QAC7C,IAAI,CAAC,MAAM,GAAG,MAAM,CAAA;QACpB,IAAI,CAAC,MAAM,GAAG,MAAM,CAAA;IACtB,CAAC;IAEM,KAAK;QACV,MAAM,MAAM,GAAG,IAAI,CAAC,MAAM,CAAC,WAAW,CAAC,IAAI,CAAC,MAAM,CAAC,CAAA;QACnD,IAAI,CAAC,MAAM,IAAI,CAAC,CAAA;QAChB,OAAO,MAAM,CAAA;IACf,CAAC;IAEM,IAAI;QACT,MAAM,MAAM,GAAG,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,CAAA;QACvC,IAAI,CAAC,MAAM,EAAE,CAAA;QACb,OAAO,MAAM,CAAA;IACf,CAAC;IAEM,KAAK;QACV,MAAM,MAAM,GAAG,IAAI,CAAC,MAAM,CAAC,WAAW,CAAC,IAAI,CAAC,MAAM,CAAC,CAAA;QACnD,IAAI,CAAC,MAAM,IAAI,CAAC,CAAA;QAChB,OAAO,MAAM,CAAA;IACf,CAAC;IAEM,MAAM;QACX,MAAM,MAAM,GAAG,IAAI,CAAC,MAAM,CAAC,YAAY,CAAC,IAAI,CAAC,MAAM,CAAC,CAAA;QACpD,IAAI,CAAC,MAAM,IAAI,CAAC,CAAA;QAChB,OAAO,MAAM,CAAA;IACf,CAAC;IAEM,MAAM,CAAC,MAAc;QAC1B,MAAM,MAAM,GAAG,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,IAAI,CAAC,QAAQ,EAAE,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC,CAAA;QACrF,IAAI,CAAC,MAAM,IAAI,MAAM,CAAA;QACrB,OAAO,MAAM,CAAA;IACf,CAAC;IAEM,OAAO;QACZ,MAAM,KAAK,GAAG,IAAI,CAAC,MAAM,CAAA;QACzB,IAAI,GAAG,GAAG,KAAK,CAAA;QACf,oCAAoC;QACpC,OAAO,IAAI,CAAC,MAAM,CAAC,GAAG,EAAE,CAAC,KAAK,CAAC,EAAE,GAAE;QACnC,IAAI,CAAC,MAAM,GAAG,GAAG,CAAA;QACjB,OAAO,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,IAAI,CAAC,QAAQ,EAAE,KAAK,EAAE,GAAG,GAAG,CAAC,CAAC,CAAA;IAC5D,CAAC;IAEM,KAAK,CAAC,MAAc;QACzB,MAAM,MAAM,GAAG,IAAI,CAAC,MAAM,CAAC,KAAK,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC,CAAA;QACnE,IAAI,CAAC,MAAM,IAAI,MAAM,CAAA;QACrB,OAAO,MAAM,CAAA;IACf,CAAC;CACF;AAzDD,oCAyDC"}
16
node_modules/pg-protocol/dist/buffer-writer.d.ts
generated
vendored
Normal file
@@ -0,0 +1,16 @@
/// <reference types="node" />
export declare class Writer {
    private size;
    private buffer;
    private offset;
    private headerPosition;
    constructor(size?: number);
    private ensure;
    addInt32(num: number): Writer;
    addInt16(num: number): Writer;
    addCString(string: string): Writer;
    addString(string?: string): Writer;
    add(otherBuffer: Buffer): Writer;
    private join;
    flush(code?: number): Buffer;
}
81
node_modules/pg-protocol/dist/buffer-writer.js
generated
vendored
Normal file
@@ -0,0 +1,81 @@
"use strict";
//binary data writer tuned for encoding binary specific to the postgres binary protocol
Object.defineProperty(exports, "__esModule", { value: true });
exports.Writer = void 0;
class Writer {
    constructor(size = 256) {
        this.size = size;
        this.offset = 5;
        this.headerPosition = 0;
        this.buffer = Buffer.allocUnsafe(size);
    }
    ensure(size) {
        const remaining = this.buffer.length - this.offset;
        if (remaining < size) {
            const oldBuffer = this.buffer;
            // exponential growth factor of around ~ 1.5
            // https://stackoverflow.com/questions/2269063/buffer-growth-strategy
            const newSize = oldBuffer.length + (oldBuffer.length >> 1) + size;
            this.buffer = Buffer.allocUnsafe(newSize);
            oldBuffer.copy(this.buffer);
        }
    }
    addInt32(num) {
        this.ensure(4);
        this.buffer[this.offset++] = (num >>> 24) & 0xff;
        this.buffer[this.offset++] = (num >>> 16) & 0xff;
        this.buffer[this.offset++] = (num >>> 8) & 0xff;
        this.buffer[this.offset++] = (num >>> 0) & 0xff;
        return this;
    }
    addInt16(num) {
        this.ensure(2);
        this.buffer[this.offset++] = (num >>> 8) & 0xff;
        this.buffer[this.offset++] = (num >>> 0) & 0xff;
        return this;
    }
    addCString(string) {
        if (!string) {
            this.ensure(1);
        }
        else {
            const len = Buffer.byteLength(string);
            this.ensure(len + 1); // +1 for null terminator
            this.buffer.write(string, this.offset, 'utf-8');
            this.offset += len;
        }
        this.buffer[this.offset++] = 0; // null terminator
        return this;
    }
    addString(string = '') {
        const len = Buffer.byteLength(string);
        this.ensure(len);
        this.buffer.write(string, this.offset);
        this.offset += len;
        return this;
    }
    add(otherBuffer) {
        this.ensure(otherBuffer.length);
        otherBuffer.copy(this.buffer, this.offset);
        this.offset += otherBuffer.length;
        return this;
    }
    join(code) {
        if (code) {
            this.buffer[this.headerPosition] = code;
            //length is everything in this packet minus the code
            const length = this.offset - (this.headerPosition + 1);
            this.buffer.writeInt32BE(length, this.headerPosition + 1);
        }
        return this.buffer.slice(code ? 0 : 5, this.offset);
    }
    flush(code) {
        const result = this.join(code);
        this.offset = 5;
        this.headerPosition = 0;
        this.buffer = Buffer.allocUnsafe(this.size);
        return result;
    }
}
exports.Writer = Writer;
//# sourceMappingURL=buffer-writer.js.map
1
node_modules/pg-protocol/dist/buffer-writer.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"buffer-writer.js","sourceRoot":"","sources":["../src/buffer-writer.ts"],"names":[],"mappings":";AAAA,uFAAuF;;;AAEvF,MAAa,MAAM;IAIjB,YAAoB,OAAO,GAAG;QAAV,SAAI,GAAJ,IAAI,CAAM;QAFtB,WAAM,GAAW,CAAC,CAAA;QAClB,mBAAc,GAAW,CAAC,CAAA;QAEhC,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC,WAAW,CAAC,IAAI,CAAC,CAAA;IACxC,CAAC;IAEO,MAAM,CAAC,IAAY;QACzB,MAAM,SAAS,GAAG,IAAI,CAAC,MAAM,CAAC,MAAM,GAAG,IAAI,CAAC,MAAM,CAAA;QAClD,IAAI,SAAS,GAAG,IAAI,EAAE;YACpB,MAAM,SAAS,GAAG,IAAI,CAAC,MAAM,CAAA;YAC7B,4CAA4C;YAC5C,qEAAqE;YACrE,MAAM,OAAO,GAAG,SAAS,CAAC,MAAM,GAAG,CAAC,SAAS,CAAC,MAAM,IAAI,CAAC,CAAC,GAAG,IAAI,CAAA;YACjE,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC,WAAW,CAAC,OAAO,CAAC,CAAA;YACzC,SAAS,CAAC,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC,CAAA;SAC5B;IACH,CAAC;IAEM,QAAQ,CAAC,GAAW;QACzB,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,CAAA;QACd,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,EAAE,CAAC,GAAG,CAAC,GAAG,KAAK,EAAE,CAAC,GAAG,IAAI,CAAA;QAChD,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,EAAE,CAAC,GAAG,CAAC,GAAG,KAAK,EAAE,CAAC,GAAG,IAAI,CAAA;QAChD,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,EAAE,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC,CAAC,GAAG,IAAI,CAAA;QAC/C,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,EAAE,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC,CAAC,GAAG,IAAI,CAAA;QAC/C,OAAO,IAAI,CAAA;IACb,CAAC;IAEM,QAAQ,CAAC,GAAW;QACzB,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,CAAA;QACd,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,EAAE,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC,CAAC,GAAG,IAAI,CAAA;QAC/C,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,EAAE,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC,CAAC,GAAG,IAAI,CAAA;QAC/C,OAAO,IAAI,CAAA;IACb,CAAC;IAEM,UAAU,CAAC,MAAc;QAC9B,IAAI,CAAC,MAAM,EAAE;YACX,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,CAAA;SACf;aAAM;YACL,MAAM,GAAG,GAAG,MAAM,CAAC,UAAU,CAAC,MAAM,CAAC,CAAA;YACrC,IAAI,CAAC,MAAM,CAAC,GAAG,GAAG,CAAC,CAAC,CAAA,CAAC,yBAAyB;YAC9C,IAAI,CAAC,MAAM,CAAC,KAAK,CAAC,MAAM,EAAE,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC,CAAA;YAC/C,IAAI,CAAC,MAAM,IAAI,GAAG,CAAA;SACnB;QAED,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,EAAE,CAAC,GAAG,CAAC,CAAA,CAAC,kBAAkB;QACjD,OAAO,IAAI,CAAA;IACb,CAAC;IAEM,SAAS,CAAC,SAAiB,EAAE;QAClC,MAAM,GAAG,GAAG,MAAM,CAAC,UAAU,CAAC,MAAM,CAAC,CAAA;QACrC,IAAI,CAAC,MAAM,CAAC,GAAG,CAAC,CAAA;QAChB,IAAI,CAAC,MAAM,CAAC,KAAK,CAAC,MAAM,EAAE,IAAI,CAAC,MAAM,CAAC,CAAA;QACtC,IAAI,CAAC,MAAM,IAAI,GAAG,CAAA;QAClB,OAAO,IAAI,CAAA;IACb,CAAC;IAEM,GAAG,CAAC,WAAmB;QAC5B,IAAI,CAAC,MAAM,CAAC,WAAW,CAAC,MAAM,CAAC,CAAA;QAC/B,WAAW,CAAC,IAAI,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,MAAM,CAAC,CAAA;QAC1C,IAAI,CAAC,MAAM,IAAI,WAAW,CAAC,MAAM,CAAA;QACjC,OAAO,IAAI,CAAA;IACb,CAAC;IAEO,IAAI,CAAC,IAAa;QACxB,IAAI,IAAI,EAAE;YACR,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,cAAc,CAAC,GAAG,IAAI,CAAA;YACvC,oDAAoD;YACpD,MAAM,MAAM,GAAG,IAAI,CAAC,MAAM,GAAG,CAAC,IAAI,CAAC,cAAc,GAAG,CAAC,CAAC,CAAA;YACtD,IAAI,CAAC,MAAM,CAAC,YAAY,CAAC,MAAM,EAAE,IAAI,CAAC,cAAc,GAAG,CAAC,CAAC,CAAA;SAC1D;QACD,OAAO,IAAI,CAAC,MAAM,CAAC,KAAK,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,IAAI,CAAC,MAAM,CAAC,CAAA;IACrD,CAAC;IAEM,KAAK,CAAC,IAAa;QACxB,MAAM,MAAM,GAAG,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,CAAA;QAC9B,IAAI,CAAC,MAAM,GAAG,CAAC,CAAA;QACf,IAAI,CAAC,cAAc,GAAG,CAAC,CAAA;QACvB,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC,WAAW,CAAC,IAAI,CAAC,IAAI,CAAC,CAAA;QAC3C,OAAO,MAAM,CAAA;IACf,CAAC;CACF;AAlFD,wBAkFC"}
1
node_modules/pg-protocol/dist/inbound-parser.test.d.ts
generated
vendored
Normal file
@@ -0,0 +1 @@
export {};
530
node_modules/pg-protocol/dist/inbound-parser.test.js
generated
vendored
Normal file
@@ -0,0 +1,530 @@
"use strict";
var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
    function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
    return new (P || (P = Promise))(function (resolve, reject) {
        function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }
        function rejected(value) { try { step(generator["throw"](value)); } catch (e) { reject(e); } }
        function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }
        step((generator = generator.apply(thisArg, _arguments || [])).next());
    });
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
const test_buffers_1 = __importDefault(require("./testing/test-buffers"));
const buffer_list_1 = __importDefault(require("./testing/buffer-list"));
const _1 = require(".");
const assert_1 = __importDefault(require("assert"));
const stream_1 = require("stream");
const parser_1 = require("./parser");
const authOkBuffer = test_buffers_1.default.authenticationOk();
const paramStatusBuffer = test_buffers_1.default.parameterStatus('client_encoding', 'UTF8');
const readyForQueryBuffer = test_buffers_1.default.readyForQuery();
const backendKeyDataBuffer = test_buffers_1.default.backendKeyData(1, 2);
const commandCompleteBuffer = test_buffers_1.default.commandComplete('SELECT 3');
const parseCompleteBuffer = test_buffers_1.default.parseComplete();
const bindCompleteBuffer = test_buffers_1.default.bindComplete();
const portalSuspendedBuffer = test_buffers_1.default.portalSuspended();
const row1 = {
    name: 'id',
    tableID: 1,
    attributeNumber: 2,
    dataTypeID: 3,
    dataTypeSize: 4,
    typeModifier: 5,
    formatCode: 0,
};
const oneRowDescBuff = test_buffers_1.default.rowDescription([row1]);
row1.name = 'bang';
const twoRowBuf = test_buffers_1.default.rowDescription([
    row1,
    {
        name: 'whoah',
        tableID: 10,
        attributeNumber: 11,
        dataTypeID: 12,
        dataTypeSize: 13,
        typeModifier: 14,
        formatCode: 0,
    },
]);
const rowWithBigOids = {
    name: 'bigoid',
    tableID: 3000000001,
    attributeNumber: 2,
    dataTypeID: 3000000003,
    dataTypeSize: 4,
    typeModifier: 5,
    formatCode: 0,
};
const bigOidDescBuff = test_buffers_1.default.rowDescription([rowWithBigOids]);
const emptyRowFieldBuf = test_buffers_1.default.dataRow([]);
const oneFieldBuf = test_buffers_1.default.dataRow(['test']);
const expectedAuthenticationOkayMessage = {
    name: 'authenticationOk',
    length: 8,
};
const expectedParameterStatusMessage = {
    name: 'parameterStatus',
    parameterName: 'client_encoding',
    parameterValue: 'UTF8',
    length: 25,
};
const expectedBackendKeyDataMessage = {
    name: 'backendKeyData',
    processID: 1,
    secretKey: 2,
};
const expectedReadyForQueryMessage = {
    name: 'readyForQuery',
    length: 5,
    status: 'I',
};
const expectedCommandCompleteMessage = {
    name: 'commandComplete',
    length: 13,
    text: 'SELECT 3',
};
const emptyRowDescriptionBuffer = new buffer_list_1.default()
    .addInt16(0) // number of fields
    .join(true, 'T');
const expectedEmptyRowDescriptionMessage = {
    name: 'rowDescription',
    length: 6,
    fieldCount: 0,
    fields: [],
};
const expectedOneRowMessage = {
    name: 'rowDescription',
    length: 27,
    fieldCount: 1,
    fields: [
        {
            name: 'id',
            tableID: 1,
            columnID: 2,
            dataTypeID: 3,
            dataTypeSize: 4,
            dataTypeModifier: 5,
            format: 'text',
        },
    ],
};
const expectedTwoRowMessage = {
    name: 'rowDescription',
    length: 53,
    fieldCount: 2,
    fields: [
        {
            name: 'bang',
            tableID: 1,
            columnID: 2,
            dataTypeID: 3,
            dataTypeSize: 4,
            dataTypeModifier: 5,
            format: 'text',
        },
        {
            name: 'whoah',
            tableID: 10,
            columnID: 11,
            dataTypeID: 12,
            dataTypeSize: 13,
            dataTypeModifier: 14,
            format: 'text',
        },
    ],
};
const expectedBigOidMessage = {
    name: 'rowDescription',
    length: 31,
    fieldCount: 1,
    fields: [
        {
            name: 'bigoid',
            tableID: 3000000001,
            columnID: 2,
            dataTypeID: 3000000003,
            dataTypeSize: 4,
            dataTypeModifier: 5,
            format: 'text',
        },
    ],
};
const emptyParameterDescriptionBuffer = new buffer_list_1.default()
    .addInt16(0) // number of parameters
    .join(true, 't');
const oneParameterDescBuf = test_buffers_1.default.parameterDescription([1111]);
const twoParameterDescBuf = test_buffers_1.default.parameterDescription([2222, 3333]);
const expectedEmptyParameterDescriptionMessage = {
    name: 'parameterDescription',
    length: 6,
    parameterCount: 0,
    dataTypeIDs: [],
};
const expectedOneParameterMessage = {
    name: 'parameterDescription',
    length: 10,
    parameterCount: 1,
    dataTypeIDs: [1111],
};
const expectedTwoParameterMessage = {
    name: 'parameterDescription',
    length: 14,
    parameterCount: 2,
    dataTypeIDs: [2222, 3333],
};
const testForMessage = function (buffer, expectedMessage) {
    it('receives and parses ' + expectedMessage.name, () => __awaiter(this, void 0, void 0, function* () {
        const messages = yield parseBuffers([buffer]);
        const [lastMessage] = messages;
        for (const key in expectedMessage) {
            assert_1.default.deepEqual(lastMessage[key], expectedMessage[key]);
        }
    }));
};
const plainPasswordBuffer = test_buffers_1.default.authenticationCleartextPassword();
const md5PasswordBuffer = test_buffers_1.default.authenticationMD5Password();
const SASLBuffer = test_buffers_1.default.authenticationSASL();
const SASLContinueBuffer = test_buffers_1.default.authenticationSASLContinue();
const SASLFinalBuffer = test_buffers_1.default.authenticationSASLFinal();
const expectedPlainPasswordMessage = {
    name: 'authenticationCleartextPassword',
};
const expectedMD5PasswordMessage = {
    name: 'authenticationMD5Password',
    salt: Buffer.from([1, 2, 3, 4]),
};
const expectedSASLMessage = {
    name: 'authenticationSASL',
    mechanisms: ['SCRAM-SHA-256'],
};
const expectedSASLContinueMessage = {
    name: 'authenticationSASLContinue',
    data: 'data',
};
const expectedSASLFinalMessage = {
    name: 'authenticationSASLFinal',
    data: 'data',
};
const notificationResponseBuffer = test_buffers_1.default.notification(4, 'hi', 'boom');
const expectedNotificationResponseMessage = {
    name: 'notification',
    processId: 4,
    channel: 'hi',
    payload: 'boom',
};
const parseBuffers = (buffers) => __awaiter(void 0, void 0, void 0, function* () {
    const stream = new stream_1.PassThrough();
    for (const buffer of buffers) {
        stream.write(buffer);
    }
    stream.end();
    const msgs = [];
    yield (0, _1.parse)(stream, (msg) => msgs.push(msg));
    return msgs;
});
describe('PgPacketStream', function () {
    testForMessage(authOkBuffer, expectedAuthenticationOkayMessage);
    testForMessage(plainPasswordBuffer, expectedPlainPasswordMessage);
    testForMessage(md5PasswordBuffer, expectedMD5PasswordMessage);
    testForMessage(SASLBuffer, expectedSASLMessage);
    testForMessage(SASLContinueBuffer, expectedSASLContinueMessage);
    // this exercises a found bug in the parser:
    // https://github.com/brianc/node-postgres/pull/2210#issuecomment-627626084
    // and adds a test which is deterministic, rather than relying on network packet chunking
    const extendedSASLContinueBuffer = Buffer.concat([SASLContinueBuffer, Buffer.from([1, 2, 3, 4])]);
    testForMessage(extendedSASLContinueBuffer, expectedSASLContinueMessage);
    testForMessage(SASLFinalBuffer, expectedSASLFinalMessage);
    // this exercises a found bug in the parser:
    // https://github.com/brianc/node-postgres/pull/2210#issuecomment-627626084
    // and adds a test which is deterministic, rather than relying on network packet chunking
    const extendedSASLFinalBuffer = Buffer.concat([SASLFinalBuffer, Buffer.from([1, 2, 4, 5])]);
    testForMessage(extendedSASLFinalBuffer, expectedSASLFinalMessage);
    testForMessage(paramStatusBuffer, expectedParameterStatusMessage);
    testForMessage(backendKeyDataBuffer, expectedBackendKeyDataMessage);
    testForMessage(readyForQueryBuffer, expectedReadyForQueryMessage);
    testForMessage(commandCompleteBuffer, expectedCommandCompleteMessage);
    testForMessage(notificationResponseBuffer, expectedNotificationResponseMessage);
    testForMessage(test_buffers_1.default.emptyQuery(), {
        name: 'emptyQuery',
        length: 4,
    });
    testForMessage(Buffer.from([0x6e, 0, 0, 0, 4]), {
        name: 'noData',
    });
    describe('rowDescription messages', function () {
        testForMessage(emptyRowDescriptionBuffer, expectedEmptyRowDescriptionMessage);
        testForMessage(oneRowDescBuff, expectedOneRowMessage);
        testForMessage(twoRowBuf, expectedTwoRowMessage);
        testForMessage(bigOidDescBuff, expectedBigOidMessage);
    });
    describe('parameterDescription messages', function () {
        testForMessage(emptyParameterDescriptionBuffer, expectedEmptyParameterDescriptionMessage);
        testForMessage(oneParameterDescBuf, expectedOneParameterMessage);
        testForMessage(twoParameterDescBuf, expectedTwoParameterMessage);
    });
    describe('parsing rows', function () {
        describe('parsing empty row', function () {
            testForMessage(emptyRowFieldBuf, {
                name: 'dataRow',
                fieldCount: 0,
            });
        });
        describe('parsing data row with fields', function () {
            testForMessage(oneFieldBuf, {
                name: 'dataRow',
                fieldCount: 1,
                fields: ['test'],
            });
        });
    });
    describe('notice message', function () {
        // this uses the same logic as error message
        const buff = test_buffers_1.default.notice([{ type: 'C', value: 'code' }]);
        testForMessage(buff, {
            name: 'notice',
            code: 'code',
        });
    });
    testForMessage(test_buffers_1.default.error([]), {
        name: 'error',
    });
    describe('with all the fields', function () {
        const buffer = test_buffers_1.default.error([
            {
                type: 'S',
                value: 'ERROR',
            },
            {
                type: 'C',
                value: 'code',
            },
            {
                type: 'M',
                value: 'message',
            },
            {
                type: 'D',
                value: 'details',
            },
            {
                type: 'H',
                value: 'hint',
            },
            {
                type: 'P',
                value: '100',
            },
            {
                type: 'p',
                value: '101',
            },
            {
                type: 'q',
                value: 'query',
            },
            {
                type: 'W',
                value: 'where',
            },
            {
                type: 'F',
                value: 'file',
            },
            {
                type: 'L',
                value: 'line',
            },
            {
                type: 'R',
                value: 'routine',
            },
            {
                type: 'Z',
                value: 'alsdkf',
            },
        ]);
        testForMessage(buffer, {
            name: 'error',
            severity: 'ERROR',
            code: 'code',
            message: 'message',
            detail: 'details',
            hint: 'hint',
            position: '100',
            internalPosition: '101',
            internalQuery: 'query',
            where: 'where',
            file: 'file',
            line: 'line',
            routine: 'routine',
        });
    });
    testForMessage(parseCompleteBuffer, {
        name: 'parseComplete',
    });
    testForMessage(bindCompleteBuffer, {
        name: 'bindComplete',
    });
    testForMessage(bindCompleteBuffer, {
        name: 'bindComplete',
    });
    testForMessage(test_buffers_1.default.closeComplete(), {
        name: 'closeComplete',
    });
    describe('parses portal suspended message', function () {
        testForMessage(portalSuspendedBuffer, {
            name: 'portalSuspended',
        });
    });
    describe('parses replication start message', function () {
        testForMessage(Buffer.from([0x57, 0x00, 0x00, 0x00, 0x04]), {
            name: 'replicationStart',
            length: 4,
        });
    });
    describe('copy', () => {
        testForMessage(test_buffers_1.default.copyIn(0), {
            name: 'copyInResponse',
            length: 7,
            binary: false,
            columnTypes: [],
        });
        testForMessage(test_buffers_1.default.copyIn(2), {
            name: 'copyInResponse',
            length: 11,
            binary: false,
            columnTypes: [0, 1],
        });
        testForMessage(test_buffers_1.default.copyOut(0), {
            name: 'copyOutResponse',
            length: 7,
            binary: false,
            columnTypes: [],
        });
        testForMessage(test_buffers_1.default.copyOut(3), {
            name: 'copyOutResponse',
            length: 13,
            binary: false,
            columnTypes: [0, 1, 2],
        });
        testForMessage(test_buffers_1.default.copyDone(), {
            name: 'copyDone',
            length: 4,
        });
        testForMessage(test_buffers_1.default.copyData(Buffer.from([5, 6, 7])), {
            name: 'copyData',
            length: 7,
            chunk: Buffer.from([5, 6, 7]),
        });
    });
    // since the data message on a stream can randomly divide the incomming
    // tcp packets anywhere, we need to make sure we can parse every single
    // split on a tcp message
    describe('split buffer, single message parsing', function () {
        const fullBuffer = test_buffers_1.default.dataRow([null, 'bang', 'zug zug', null, '!']);
        it('parses when full buffer comes in', function () {
            return __awaiter(this, void 0, void 0, function* () {
                const messages = yield parseBuffers([fullBuffer]);
                const message = messages[0];
                assert_1.default.equal(message.fields.length, 5);
                assert_1.default.equal(message.fields[0], null);
                assert_1.default.equal(message.fields[1], 'bang');
                assert_1.default.equal(message.fields[2], 'zug zug');
                assert_1.default.equal(message.fields[3], null);
                assert_1.default.equal(message.fields[4], '!');
            });
        });
        const testMessageReceivedAfterSplitAt = function (split) {
            return __awaiter(this, void 0, void 0, function* () {
                const firstBuffer = Buffer.alloc(fullBuffer.length - split);
                const secondBuffer = Buffer.alloc(fullBuffer.length - firstBuffer.length);
                fullBuffer.copy(firstBuffer, 0, 0);
                fullBuffer.copy(secondBuffer, 0, firstBuffer.length);
                const messages = yield parseBuffers([firstBuffer, secondBuffer]);
                const message = messages[0];
                assert_1.default.equal(message.fields.length, 5);
                assert_1.default.equal(message.fields[0], null);
                assert_1.default.equal(message.fields[1], 'bang');
                assert_1.default.equal(message.fields[2], 'zug zug');
                assert_1.default.equal(message.fields[3], null);
                assert_1.default.equal(message.fields[4], '!');
            });
        };
        it('parses when split in the middle', function () {
            return testMessageReceivedAfterSplitAt(6);
        });
        it('parses when split at end', function () {
            return testMessageReceivedAfterSplitAt(2);
        });
        it('parses when split at beginning', function () {
            return Promise.all([
                testMessageReceivedAfterSplitAt(fullBuffer.length - 2),
                testMessageReceivedAfterSplitAt(fullBuffer.length - 1),
                testMessageReceivedAfterSplitAt(fullBuffer.length - 5),
            ]);
        });
    });
    describe('split buffer, multiple message parsing', function () {
        const dataRowBuffer = test_buffers_1.default.dataRow(['!']);
        const readyForQueryBuffer = test_buffers_1.default.readyForQuery();
|
||||
const fullBuffer = Buffer.alloc(dataRowBuffer.length + readyForQueryBuffer.length);
|
||||
dataRowBuffer.copy(fullBuffer, 0, 0);
|
||||
readyForQueryBuffer.copy(fullBuffer, dataRowBuffer.length, 0);
|
||||
const verifyMessages = function (messages) {
|
||||
assert_1.default.strictEqual(messages.length, 2);
|
||||
assert_1.default.deepEqual(messages[0], {
|
||||
name: 'dataRow',
|
||||
fieldCount: 1,
|
||||
length: 11,
|
||||
fields: ['!'],
|
||||
});
|
||||
assert_1.default.equal(messages[0].fields[0], '!');
|
||||
assert_1.default.deepEqual(messages[1], {
|
||||
name: 'readyForQuery',
|
||||
length: 5,
|
||||
status: 'I',
|
||||
});
|
||||
};
|
||||
// sanity check
|
||||
it('receives both messages when packet is not split', function () {
|
||||
return __awaiter(this, void 0, void 0, function* () {
|
||||
const messages = yield parseBuffers([fullBuffer]);
|
||||
verifyMessages(messages);
|
||||
});
|
||||
});
|
||||
const splitAndVerifyTwoMessages = function (split) {
|
||||
return __awaiter(this, void 0, void 0, function* () {
|
||||
const firstBuffer = Buffer.alloc(fullBuffer.length - split);
|
||||
const secondBuffer = Buffer.alloc(fullBuffer.length - firstBuffer.length);
|
||||
fullBuffer.copy(firstBuffer, 0, 0);
|
||||
fullBuffer.copy(secondBuffer, 0, firstBuffer.length);
|
||||
const messages = yield parseBuffers([firstBuffer, secondBuffer]);
|
||||
verifyMessages(messages);
|
||||
});
|
||||
};
|
||||
describe('receives both messages when packet is split', function () {
|
||||
it('in the middle', function () {
|
||||
return splitAndVerifyTwoMessages(11);
|
||||
});
|
||||
it('at the front', function () {
|
||||
return Promise.all([
|
||||
splitAndVerifyTwoMessages(fullBuffer.length - 1),
|
||||
splitAndVerifyTwoMessages(fullBuffer.length - 4),
|
||||
splitAndVerifyTwoMessages(fullBuffer.length - 6),
|
||||
]);
|
||||
});
|
||||
it('at the end', function () {
|
||||
return Promise.all([splitAndVerifyTwoMessages(8), splitAndVerifyTwoMessages(1)]);
|
||||
});
|
||||
});
|
||||
});
|
||||
it('cleans up the reader after handling a packet', function () {
|
||||
const parser = new parser_1.Parser();
|
||||
parser.parse(oneFieldBuf, () => { });
|
||||
assert_1.default.strictEqual(parser.reader.buffer.byteLength, 0);
|
||||
});
|
||||
});
|
||||
//# sourceMappingURL=inbound-parser.test.js.map
|
||||
1
node_modules/pg-protocol/dist/inbound-parser.test.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
6
node_modules/pg-protocol/dist/index.d.ts
generated
vendored
Normal file
@@ -0,0 +1,6 @@
/// <reference types="node" />
import { DatabaseError } from './messages';
import { serialize } from './serializer';
import { MessageCallback } from './parser';
export declare function parse(stream: NodeJS.ReadableStream, callback: MessageCallback): Promise<void>;
export { serialize, DatabaseError };
15
node_modules/pg-protocol/dist/index.js
generated
vendored
Normal file
@@ -0,0 +1,15 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.DatabaseError = exports.serialize = exports.parse = void 0;
const messages_1 = require("./messages");
Object.defineProperty(exports, "DatabaseError", { enumerable: true, get: function () { return messages_1.DatabaseError; } });
const serializer_1 = require("./serializer");
Object.defineProperty(exports, "serialize", { enumerable: true, get: function () { return serializer_1.serialize; } });
const parser_1 = require("./parser");
function parse(stream, callback) {
const parser = new parser_1.Parser();
stream.on('data', (buffer) => parser.parse(buffer, callback));
return new Promise((resolve) => stream.on('end', () => resolve()));
}
exports.parse = parse;
//# sourceMappingURL=index.js.map
1
node_modules/pg-protocol/dist/index.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"index.js","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":";;;AAAA,yCAA0C;AAUtB,8FAVX,wBAAa,OAUW;AATjC,6CAAwC;AAS/B,0FATA,sBAAS,OASA;AARlB,qCAAkD;AAElD,SAAgB,KAAK,CAAC,MAA6B,EAAE,QAAyB;IAC5E,MAAM,MAAM,GAAG,IAAI,eAAM,EAAE,CAAA;IAC3B,MAAM,CAAC,EAAE,CAAC,MAAM,EAAE,CAAC,MAAc,EAAE,EAAE,CAAC,MAAM,CAAC,KAAK,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC,CAAA;IACrE,OAAO,IAAI,OAAO,CAAC,CAAC,OAAO,EAAE,EAAE,CAAC,MAAM,CAAC,EAAE,CAAC,KAAK,EAAE,GAAG,EAAE,CAAC,OAAO,EAAE,CAAC,CAAC,CAAA;AACpE,CAAC;AAJD,sBAIC"}
162
node_modules/pg-protocol/dist/messages.d.ts
generated
vendored
Normal file
@@ -0,0 +1,162 @@
/// <reference types="node" />
export declare type Mode = 'text' | 'binary';
export declare type MessageName = 'parseComplete' | 'bindComplete' | 'closeComplete' | 'noData' | 'portalSuspended' | 'replicationStart' | 'emptyQuery' | 'copyDone' | 'copyData' | 'rowDescription' | 'parameterDescription' | 'parameterStatus' | 'backendKeyData' | 'notification' | 'readyForQuery' | 'commandComplete' | 'dataRow' | 'copyInResponse' | 'copyOutResponse' | 'authenticationOk' | 'authenticationMD5Password' | 'authenticationCleartextPassword' | 'authenticationSASL' | 'authenticationSASLContinue' | 'authenticationSASLFinal' | 'error' | 'notice';
export interface BackendMessage {
name: MessageName;
length: number;
}
export declare const parseComplete: BackendMessage;
export declare const bindComplete: BackendMessage;
export declare const closeComplete: BackendMessage;
export declare const noData: BackendMessage;
export declare const portalSuspended: BackendMessage;
export declare const replicationStart: BackendMessage;
export declare const emptyQuery: BackendMessage;
export declare const copyDone: BackendMessage;
interface NoticeOrError {
message: string | undefined;
severity: string | undefined;
code: string | undefined;
detail: string | undefined;
hint: string | undefined;
position: string | undefined;
internalPosition: string | undefined;
internalQuery: string | undefined;
where: string | undefined;
schema: string | undefined;
table: string | undefined;
column: string | undefined;
dataType: string | undefined;
constraint: string | undefined;
file: string | undefined;
line: string | undefined;
routine: string | undefined;
}
export declare class DatabaseError extends Error implements NoticeOrError {
readonly length: number;
readonly name: MessageName;
severity: string | undefined;
code: string | undefined;
detail: string | undefined;
hint: string | undefined;
position: string | undefined;
internalPosition: string | undefined;
internalQuery: string | undefined;
where: string | undefined;
schema: string | undefined;
table: string | undefined;
column: string | undefined;
dataType: string | undefined;
constraint: string | undefined;
file: string | undefined;
line: string | undefined;
routine: string | undefined;
constructor(message: string, length: number, name: MessageName);
}
export declare class CopyDataMessage {
readonly length: number;
readonly chunk: Buffer;
readonly name = "copyData";
constructor(length: number, chunk: Buffer);
}
export declare class CopyResponse {
readonly length: number;
readonly name: MessageName;
readonly binary: boolean;
readonly columnTypes: number[];
constructor(length: number, name: MessageName, binary: boolean, columnCount: number);
}
export declare class Field {
readonly name: string;
readonly tableID: number;
readonly columnID: number;
readonly dataTypeID: number;
readonly dataTypeSize: number;
readonly dataTypeModifier: number;
readonly format: Mode;
constructor(name: string, tableID: number, columnID: number, dataTypeID: number, dataTypeSize: number, dataTypeModifier: number, format: Mode);
}
export declare class RowDescriptionMessage {
readonly length: number;
readonly fieldCount: number;
readonly name: MessageName;
readonly fields: Field[];
constructor(length: number, fieldCount: number);
}
export declare class ParameterDescriptionMessage {
readonly length: number;
readonly parameterCount: number;
readonly name: MessageName;
readonly dataTypeIDs: number[];
constructor(length: number, parameterCount: number);
}
export declare class ParameterStatusMessage {
readonly length: number;
readonly parameterName: string;
readonly parameterValue: string;
readonly name: MessageName;
constructor(length: number, parameterName: string, parameterValue: string);
}
export declare class AuthenticationMD5Password implements BackendMessage {
readonly length: number;
readonly salt: Buffer;
readonly name: MessageName;
constructor(length: number, salt: Buffer);
}
export declare class BackendKeyDataMessage {
readonly length: number;
readonly processID: number;
readonly secretKey: number;
readonly name: MessageName;
constructor(length: number, processID: number, secretKey: number);
}
export declare class NotificationResponseMessage {
readonly length: number;
readonly processId: number;
readonly channel: string;
readonly payload: string;
readonly name: MessageName;
constructor(length: number, processId: number, channel: string, payload: string);
}
export declare class ReadyForQueryMessage {
readonly length: number;
readonly status: string;
readonly name: MessageName;
constructor(length: number, status: string);
}
export declare class CommandCompleteMessage {
readonly length: number;
readonly text: string;
readonly name: MessageName;
constructor(length: number, text: string);
}
export declare class DataRowMessage {
length: number;
fields: any[];
readonly fieldCount: number;
readonly name: MessageName;
constructor(length: number, fields: any[]);
}
export declare class NoticeMessage implements BackendMessage, NoticeOrError {
readonly length: number;
readonly message: string | undefined;
constructor(length: number, message: string | undefined);
readonly name = "notice";
severity: string | undefined;
code: string | undefined;
detail: string | undefined;
hint: string | undefined;
position: string | undefined;
internalPosition: string | undefined;
internalQuery: string | undefined;
where: string | undefined;
schema: string | undefined;
table: string | undefined;
column: string | undefined;
dataType: string | undefined;
constraint: string | undefined;
file: string | undefined;
line: string | undefined;
routine: string | undefined;
}
export {};
160
node_modules/pg-protocol/dist/messages.js
generated
vendored
Normal file
@@ -0,0 +1,160 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.NoticeMessage = exports.DataRowMessage = exports.CommandCompleteMessage = exports.ReadyForQueryMessage = exports.NotificationResponseMessage = exports.BackendKeyDataMessage = exports.AuthenticationMD5Password = exports.ParameterStatusMessage = exports.ParameterDescriptionMessage = exports.RowDescriptionMessage = exports.Field = exports.CopyResponse = exports.CopyDataMessage = exports.DatabaseError = exports.copyDone = exports.emptyQuery = exports.replicationStart = exports.portalSuspended = exports.noData = exports.closeComplete = exports.bindComplete = exports.parseComplete = void 0;
exports.parseComplete = {
name: 'parseComplete',
length: 5,
};
exports.bindComplete = {
name: 'bindComplete',
length: 5,
};
exports.closeComplete = {
name: 'closeComplete',
length: 5,
};
exports.noData = {
name: 'noData',
length: 5,
};
exports.portalSuspended = {
name: 'portalSuspended',
length: 5,
};
exports.replicationStart = {
name: 'replicationStart',
length: 4,
};
exports.emptyQuery = {
name: 'emptyQuery',
length: 4,
};
exports.copyDone = {
name: 'copyDone',
length: 4,
};
class DatabaseError extends Error {
constructor(message, length, name) {
super(message);
this.length = length;
this.name = name;
}
}
exports.DatabaseError = DatabaseError;
class CopyDataMessage {
constructor(length, chunk) {
this.length = length;
this.chunk = chunk;
this.name = 'copyData';
}
}
exports.CopyDataMessage = CopyDataMessage;
class CopyResponse {
constructor(length, name, binary, columnCount) {
this.length = length;
this.name = name;
this.binary = binary;
this.columnTypes = new Array(columnCount);
}
}
exports.CopyResponse = CopyResponse;
class Field {
constructor(name, tableID, columnID, dataTypeID, dataTypeSize, dataTypeModifier, format) {
this.name = name;
this.tableID = tableID;
this.columnID = columnID;
this.dataTypeID = dataTypeID;
this.dataTypeSize = dataTypeSize;
this.dataTypeModifier = dataTypeModifier;
this.format = format;
}
}
exports.Field = Field;
class RowDescriptionMessage {
constructor(length, fieldCount) {
this.length = length;
this.fieldCount = fieldCount;
this.name = 'rowDescription';
this.fields = new Array(this.fieldCount);
}
}
exports.RowDescriptionMessage = RowDescriptionMessage;
class ParameterDescriptionMessage {
constructor(length, parameterCount) {
this.length = length;
this.parameterCount = parameterCount;
this.name = 'parameterDescription';
this.dataTypeIDs = new Array(this.parameterCount);
}
}
exports.ParameterDescriptionMessage = ParameterDescriptionMessage;
class ParameterStatusMessage {
constructor(length, parameterName, parameterValue) {
this.length = length;
this.parameterName = parameterName;
this.parameterValue = parameterValue;
this.name = 'parameterStatus';
}
}
exports.ParameterStatusMessage = ParameterStatusMessage;
class AuthenticationMD5Password {
constructor(length, salt) {
this.length = length;
this.salt = salt;
this.name = 'authenticationMD5Password';
}
}
exports.AuthenticationMD5Password = AuthenticationMD5Password;
class BackendKeyDataMessage {
constructor(length, processID, secretKey) {
this.length = length;
this.processID = processID;
this.secretKey = secretKey;
this.name = 'backendKeyData';
}
}
exports.BackendKeyDataMessage = BackendKeyDataMessage;
class NotificationResponseMessage {
constructor(length, processId, channel, payload) {
this.length = length;
this.processId = processId;
this.channel = channel;
this.payload = payload;
this.name = 'notification';
}
}
exports.NotificationResponseMessage = NotificationResponseMessage;
class ReadyForQueryMessage {
constructor(length, status) {
this.length = length;
this.status = status;
this.name = 'readyForQuery';
}
}
exports.ReadyForQueryMessage = ReadyForQueryMessage;
class CommandCompleteMessage {
constructor(length, text) {
this.length = length;
this.text = text;
this.name = 'commandComplete';
}
}
exports.CommandCompleteMessage = CommandCompleteMessage;
class DataRowMessage {
constructor(length, fields) {
this.length = length;
this.fields = fields;
this.name = 'dataRow';
this.fieldCount = fields.length;
}
}
exports.DataRowMessage = DataRowMessage;
class NoticeMessage {
constructor(length, message) {
this.length = length;
this.message = message;
this.name = 'notice';
}
}
exports.NoticeMessage = NoticeMessage;
//# sourceMappingURL=messages.js.map
1
node_modules/pg-protocol/dist/messages.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"file":"messages.js","sourceRoot":"","sources":["../src/messages.ts"],"names":[],"mappings":";;;AAoCa,QAAA,aAAa,GAAmB;IAC3C,IAAI,EAAE,eAAe;IACrB,MAAM,EAAE,CAAC;CACV,CAAA;AAEY,QAAA,YAAY,GAAmB;IAC1C,IAAI,EAAE,cAAc;IACpB,MAAM,EAAE,CAAC;CACV,CAAA;AAEY,QAAA,aAAa,GAAmB;IAC3C,IAAI,EAAE,eAAe;IACrB,MAAM,EAAE,CAAC;CACV,CAAA;AAEY,QAAA,MAAM,GAAmB;IACpC,IAAI,EAAE,QAAQ;IACd,MAAM,EAAE,CAAC;CACV,CAAA;AAEY,QAAA,eAAe,GAAmB;IAC7C,IAAI,EAAE,iBAAiB;IACvB,MAAM,EAAE,CAAC;CACV,CAAA;AAEY,QAAA,gBAAgB,GAAmB;IAC9C,IAAI,EAAE,kBAAkB;IACxB,MAAM,EAAE,CAAC;CACV,CAAA;AAEY,QAAA,UAAU,GAAmB;IACxC,IAAI,EAAE,YAAY;IAClB,MAAM,EAAE,CAAC;CACV,CAAA;AAEY,QAAA,QAAQ,GAAmB;IACtC,IAAI,EAAE,UAAU;IAChB,MAAM,EAAE,CAAC;CACV,CAAA;AAsBD,MAAa,aAAc,SAAQ,KAAK;IAiBtC,YACE,OAAe,EACC,MAAc,EACd,IAAiB;QAEjC,KAAK,CAAC,OAAO,CAAC,CAAA;QAHE,WAAM,GAAN,MAAM,CAAQ;QACd,SAAI,GAAJ,IAAI,CAAa;IAGnC,CAAC;CACF;AAxBD,sCAwBC;AAED,MAAa,eAAe;IAE1B,YACkB,MAAc,EACd,KAAa;QADb,WAAM,GAAN,MAAM,CAAQ;QACd,UAAK,GAAL,KAAK,CAAQ;QAHf,SAAI,GAAG,UAAU,CAAA;IAI9B,CAAC;CACL;AAND,0CAMC;AAED,MAAa,YAAY;IAEvB,YACkB,MAAc,EACd,IAAiB,EACjB,MAAe,EAC/B,WAAmB;QAHH,WAAM,GAAN,MAAM,CAAQ;QACd,SAAI,GAAJ,IAAI,CAAa;QACjB,WAAM,GAAN,MAAM,CAAS;QAG/B,IAAI,CAAC,WAAW,GAAG,IAAI,KAAK,CAAC,WAAW,CAAC,CAAA;IAC3C,CAAC;CACF;AAVD,oCAUC;AAED,MAAa,KAAK;IAChB,YACkB,IAAY,EACZ,OAAe,EACf,QAAgB,EAChB,UAAkB,EAClB,YAAoB,EACpB,gBAAwB,EACxB,MAAY;QANZ,SAAI,GAAJ,IAAI,CAAQ;QACZ,YAAO,GAAP,OAAO,CAAQ;QACf,aAAQ,GAAR,QAAQ,CAAQ;QAChB,eAAU,GAAV,UAAU,CAAQ;QAClB,iBAAY,GAAZ,YAAY,CAAQ;QACpB,qBAAgB,GAAhB,gBAAgB,CAAQ;QACxB,WAAM,GAAN,MAAM,CAAM;IAC3B,CAAC;CACL;AAVD,sBAUC;AAED,MAAa,qBAAqB;IAGhC,YACkB,MAAc,EACd,UAAkB;QADlB,WAAM,GAAN,MAAM,CAAQ;QACd,eAAU,GAAV,UAAU,CAAQ;QAJpB,SAAI,GAAgB,gBAAgB,CAAA;QAMlD,IAAI,CAAC,MAAM,GAAG,IAAI,KAAK,CAAC,IAAI,CAAC,UAAU,CAAC,CAAA;IAC1C,CAAC;CACF;AATD,sDASC;AAED,MAAa,2BAA2B;IAGtC,YACkB,MAAc,EACd,cAAsB;QADtB,WAAM,GAAN,MAAM,CAAQ;QACd,mBAAc,GAAd,cAAc,CAAQ;QAJxB,SAAI,GAAgB,sBAAsB,CAAA;QAMxD,IAAI,CAAC,WAAW,GAAG,IAAI,KAAK,CAAC,IAAI,CAAC,cAAc,CAAC,CAAA;IACnD,CAAC;CACF;AATD,kEASC;AAED,MAA
a,sBAAsB;IAEjC,YACkB,MAAc,EACd,aAAqB,EACrB,cAAsB;QAFtB,WAAM,GAAN,MAAM,CAAQ;QACd,kBAAa,GAAb,aAAa,CAAQ;QACrB,mBAAc,GAAd,cAAc,CAAQ;QAJxB,SAAI,GAAgB,iBAAiB,CAAA;IAKlD,CAAC;CACL;AAPD,wDAOC;AAED,MAAa,yBAAyB;IAEpC,YACkB,MAAc,EACd,IAAY;QADZ,WAAM,GAAN,MAAM,CAAQ;QACd,SAAI,GAAJ,IAAI,CAAQ;QAHd,SAAI,GAAgB,2BAA2B,CAAA;IAI5D,CAAC;CACL;AAND,8DAMC;AAED,MAAa,qBAAqB;IAEhC,YACkB,MAAc,EACd,SAAiB,EACjB,SAAiB;QAFjB,WAAM,GAAN,MAAM,CAAQ;QACd,cAAS,GAAT,SAAS,CAAQ;QACjB,cAAS,GAAT,SAAS,CAAQ;QAJnB,SAAI,GAAgB,gBAAgB,CAAA;IAKjD,CAAC;CACL;AAPD,sDAOC;AAED,MAAa,2BAA2B;IAEtC,YACkB,MAAc,EACd,SAAiB,EACjB,OAAe,EACf,OAAe;QAHf,WAAM,GAAN,MAAM,CAAQ;QACd,cAAS,GAAT,SAAS,CAAQ;QACjB,YAAO,GAAP,OAAO,CAAQ;QACf,YAAO,GAAP,OAAO,CAAQ;QALjB,SAAI,GAAgB,cAAc,CAAA;IAM/C,CAAC;CACL;AARD,kEAQC;AAED,MAAa,oBAAoB;IAE/B,YACkB,MAAc,EACd,MAAc;QADd,WAAM,GAAN,MAAM,CAAQ;QACd,WAAM,GAAN,MAAM,CAAQ;QAHhB,SAAI,GAAgB,eAAe,CAAA;IAIhD,CAAC;CACL;AAND,oDAMC;AAED,MAAa,sBAAsB;IAEjC,YACkB,MAAc,EACd,IAAY;QADZ,WAAM,GAAN,MAAM,CAAQ;QACd,SAAI,GAAJ,IAAI,CAAQ;QAHd,SAAI,GAAgB,iBAAiB,CAAA;IAIlD,CAAC;CACL;AAND,wDAMC;AAED,MAAa,cAAc;IAGzB,YACS,MAAc,EACd,MAAa;QADb,WAAM,GAAN,MAAM,CAAQ;QACd,WAAM,GAAN,MAAM,CAAO;QAHN,SAAI,GAAgB,SAAS,CAAA;QAK3C,IAAI,CAAC,UAAU,GAAG,MAAM,CAAC,MAAM,CAAA;IACjC,CAAC;CACF;AATD,wCASC;AAED,MAAa,aAAa;IACxB,YACkB,MAAc,EACd,OAA2B;QAD3B,WAAM,GAAN,MAAM,CAAQ;QACd,YAAO,GAAP,OAAO,CAAoB;QAE7B,SAAI,GAAG,QAAQ,CAAA;IAD5B,CAAC;CAkBL;AAtBD,sCAsBC"}
1
node_modules/pg-protocol/dist/outbound-serializer.test.d.ts
generated
vendored
Normal file
@@ -0,0 +1 @@
export {};
252
node_modules/pg-protocol/dist/outbound-serializer.test.js
generated
vendored
Normal file
@@ -0,0 +1,252 @@
"use strict";
var __importDefault = (this && this.__importDefault) || function (mod) {
return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
const assert_1 = __importDefault(require("assert"));
const serializer_1 = require("./serializer");
const buffer_list_1 = __importDefault(require("./testing/buffer-list"));
describe('serializer', () => {
it('builds startup message', function () {
const actual = serializer_1.serialize.startup({
user: 'brian',
database: 'bang',
});
assert_1.default.deepEqual(actual, new buffer_list_1.default()
.addInt16(3)
.addInt16(0)
.addCString('user')
.addCString('brian')
.addCString('database')
.addCString('bang')
.addCString('client_encoding')
.addCString('UTF8')
.addCString('')
.join(true));
});
it('builds password message', function () {
const actual = serializer_1.serialize.password('!');
assert_1.default.deepEqual(actual, new buffer_list_1.default().addCString('!').join(true, 'p'));
});
it('builds request ssl message', function () {
const actual = serializer_1.serialize.requestSsl();
const expected = new buffer_list_1.default().addInt32(80877103).join(true);
assert_1.default.deepEqual(actual, expected);
});
it('builds SASLInitialResponseMessage message', function () {
const actual = serializer_1.serialize.sendSASLInitialResponseMessage('mech', 'data');
assert_1.default.deepEqual(actual, new buffer_list_1.default().addCString('mech').addInt32(4).addString('data').join(true, 'p'));
});
it('builds SCRAMClientFinalMessage message', function () {
const actual = serializer_1.serialize.sendSCRAMClientFinalMessage('data');
assert_1.default.deepEqual(actual, new buffer_list_1.default().addString('data').join(true, 'p'));
});
it('builds query message', function () {
const txt = 'select * from boom';
const actual = serializer_1.serialize.query(txt);
assert_1.default.deepEqual(actual, new buffer_list_1.default().addCString(txt).join(true, 'Q'));
});
describe('parse message', () => {
it('builds parse message', function () {
const actual = serializer_1.serialize.parse({ text: '!' });
const expected = new buffer_list_1.default().addCString('').addCString('!').addInt16(0).join(true, 'P');
assert_1.default.deepEqual(actual, expected);
});
it('builds parse message with named query', function () {
const actual = serializer_1.serialize.parse({
name: 'boom',
text: 'select * from boom',
types: [],
});
const expected = new buffer_list_1.default().addCString('boom').addCString('select * from boom').addInt16(0).join(true, 'P');
assert_1.default.deepEqual(actual, expected);
});
it('with multiple parameters', function () {
const actual = serializer_1.serialize.parse({
name: 'force',
text: 'select * from bang where name = $1',
types: [1, 2, 3, 4],
});
const expected = new buffer_list_1.default()
.addCString('force')
.addCString('select * from bang where name = $1')
.addInt16(4)
.addInt32(1)
.addInt32(2)
.addInt32(3)
.addInt32(4)
.join(true, 'P');
assert_1.default.deepEqual(actual, expected);
});
});
describe('bind messages', function () {
it('with no values', function () {
const actual = serializer_1.serialize.bind();
const expectedBuffer = new buffer_list_1.default()
.addCString('')
.addCString('')
.addInt16(0)
.addInt16(0)
.addInt16(1)
.addInt16(0)
.join(true, 'B');
assert_1.default.deepEqual(actual, expectedBuffer);
});
it('with named statement, portal, and values', function () {
const actual = serializer_1.serialize.bind({
portal: 'bang',
statement: 'woo',
values: ['1', 'hi', null, 'zing'],
});
const expectedBuffer = new buffer_list_1.default()
.addCString('bang') // portal name
.addCString('woo') // statement name
.addInt16(4)
.addInt16(0)
.addInt16(0)
.addInt16(0)
.addInt16(0)
.addInt16(4)
.addInt32(1)
.add(Buffer.from('1'))
.addInt32(2)
.add(Buffer.from('hi'))
.addInt32(-1)
.addInt32(4)
.add(Buffer.from('zing'))
.addInt16(1)
.addInt16(0)
.join(true, 'B');
assert_1.default.deepEqual(actual, expectedBuffer);
});
});
it('with custom valueMapper', function () {
const actual = serializer_1.serialize.bind({
portal: 'bang',
statement: 'woo',
values: ['1', 'hi', null, 'zing'],
valueMapper: () => null,
});
const expectedBuffer = new buffer_list_1.default()
.addCString('bang') // portal name
.addCString('woo') // statement name
.addInt16(4)
.addInt16(0)
.addInt16(0)
.addInt16(0)
.addInt16(0)
.addInt16(4)
.addInt32(-1)
.addInt32(-1)
.addInt32(-1)
.addInt32(-1)
.addInt16(1)
.addInt16(0)
.join(true, 'B');
assert_1.default.deepEqual(actual, expectedBuffer);
});
it('with named statement, portal, and buffer value', function () {
const actual = serializer_1.serialize.bind({
portal: 'bang',
statement: 'woo',
values: ['1', 'hi', null, Buffer.from('zing', 'utf8')],
});
const expectedBuffer = new buffer_list_1.default()
.addCString('bang') // portal name
.addCString('woo') // statement name
.addInt16(4) // value count
.addInt16(0) // string
.addInt16(0) // string
.addInt16(0) // string
.addInt16(1) // binary
.addInt16(4)
.addInt32(1)
.add(Buffer.from('1'))
.addInt32(2)
.add(Buffer.from('hi'))
.addInt32(-1)
.addInt32(4)
.add(Buffer.from('zing', 'utf-8'))
.addInt16(1)
.addInt16(0)
.join(true, 'B');
assert_1.default.deepEqual(actual, expectedBuffer);
});
describe('builds execute message', function () {
it('for unamed portal with no row limit', function () {
const actual = serializer_1.serialize.execute();
const expectedBuffer = new buffer_list_1.default().addCString('').addInt32(0).join(true, 'E');
assert_1.default.deepEqual(actual, expectedBuffer);
});
it('for named portal with row limit', function () {
const actual = serializer_1.serialize.execute({
portal: 'my favorite portal',
rows: 100,
});
const expectedBuffer = new buffer_list_1.default().addCString('my favorite portal').addInt32(100).join(true, 'E');
assert_1.default.deepEqual(actual, expectedBuffer);
});
});
it('builds flush command', function () {
const actual = serializer_1.serialize.flush();
const expected = new buffer_list_1.default().join(true, 'H');
assert_1.default.deepEqual(actual, expected);
});
it('builds sync command', function () {
const actual = serializer_1.serialize.sync();
const expected = new buffer_list_1.default().join(true, 'S');
assert_1.default.deepEqual(actual, expected);
});
it('builds end command', function () {
const actual = serializer_1.serialize.end();
const expected = Buffer.from([0x58, 0, 0, 0, 4]);
assert_1.default.deepEqual(actual, expected);
});
describe('builds describe command', function () {
it('describe statement', function () {
const actual = serializer_1.serialize.describe({ type: 'S', name: 'bang' });
const expected = new buffer_list_1.default().addChar('S').addCString('bang').join(true, 'D');
assert_1.default.deepEqual(actual, expected);
});
it('describe unnamed portal', function () {
const actual = serializer_1.serialize.describe({ type: 'P' });
const expected = new buffer_list_1.default().addChar('P').addCString('').join(true, 'D');
assert_1.default.deepEqual(actual, expected);
});
});
describe('builds close command', function () {
it('describe statement', function () {
const actual = serializer_1.serialize.close({ type: 'S', name: 'bang' });
const expected = new buffer_list_1.default().addChar('S').addCString('bang').join(true, 'C');
assert_1.default.deepEqual(actual, expected);
});
it('describe unnamed portal', function () {
const actual = serializer_1.serialize.close({ type: 'P' });
const expected = new buffer_list_1.default().addChar('P').addCString('').join(true, 'C');
assert_1.default.deepEqual(actual, expected);
});
});
describe('copy messages', function () {
it('builds copyFromChunk', () => {
const actual = serializer_1.serialize.copyData(Buffer.from([1, 2, 3]));
const expected = new buffer_list_1.default().add(Buffer.from([1, 2, 3])).join(true, 'd');
assert_1.default.deepEqual(actual, expected);
});
it('builds copy fail', () => {
const actual = serializer_1.serialize.copyFail('err!');
const expected = new buffer_list_1.default().addCString('err!').join(true, 'f');
assert_1.default.deepEqual(actual, expected);
});
it('builds copy done', () => {
const actual = serializer_1.serialize.copyDone();
const expected = new buffer_list_1.default().join(true, 'c');
assert_1.default.deepEqual(actual, expected);
});
});
it('builds cancel message', () => {
const actual = serializer_1.serialize.cancel(3, 4);
const expected = new buffer_list_1.default().addInt16(1234).addInt16(5678).addInt32(3).addInt32(4).join(true);
assert_1.default.deepEqual(actual, expected);
});
});
//# sourceMappingURL=outbound-serializer.test.js.map
1 node_modules/pg-protocol/dist/outbound-serializer.test.js.map generated vendored Normal file
File diff suppressed because one or more lines are too long
24 node_modules/pg-protocol/dist/parser.d.ts generated vendored Normal file
@@ -0,0 +1,24 @@
/// <reference types="node" />
/// <reference types="node" />
import { TransformOptions } from 'stream';
import { Mode, BackendMessage } from './messages';
export declare type Packet = {
    code: number;
    packet: Buffer;
};
declare type StreamOptions = TransformOptions & {
    mode: Mode;
};
export declare type MessageCallback = (msg: BackendMessage) => void;
export declare class Parser {
    private buffer;
    private bufferLength;
    private bufferOffset;
    private reader;
    private mode;
    constructor(opts?: StreamOptions);
    parse(buffer: Buffer, callback: MessageCallback): void;
    private mergeBuffer;
    private handlePacket;
}
export {};
324 node_modules/pg-protocol/dist/parser.js generated vendored Normal file
@@ -0,0 +1,324 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.Parser = void 0;
const messages_1 = require("./messages");
const buffer_reader_1 = require("./buffer-reader");
// every message is prefixed with a single byte
const CODE_LENGTH = 1;
// every message has an int32 length which includes itself but does
// NOT include the code in the length
const LEN_LENGTH = 4;
const HEADER_LENGTH = CODE_LENGTH + LEN_LENGTH;
// A placeholder for a `BackendMessage`’s length value that will be set after construction.
const LATEINIT_LENGTH = -1;
const emptyBuffer = Buffer.allocUnsafe(0);
class Parser {
    constructor(opts) {
        this.buffer = emptyBuffer;
        this.bufferLength = 0;
        this.bufferOffset = 0;
        this.reader = new buffer_reader_1.BufferReader();
        if ((opts === null || opts === void 0 ? void 0 : opts.mode) === 'binary') {
            throw new Error('Binary mode not supported yet');
        }
        this.mode = (opts === null || opts === void 0 ? void 0 : opts.mode) || 'text';
    }
    parse(buffer, callback) {
        this.mergeBuffer(buffer);
        const bufferFullLength = this.bufferOffset + this.bufferLength;
        let offset = this.bufferOffset;
        while (offset + HEADER_LENGTH <= bufferFullLength) {
            // code is 1 byte long - it identifies the message type
            const code = this.buffer[offset];
            // length is 1 Uint32BE - it is the length of the message EXCLUDING the code
            const length = this.buffer.readUInt32BE(offset + CODE_LENGTH);
            const fullMessageLength = CODE_LENGTH + length;
            if (fullMessageLength + offset <= bufferFullLength) {
                const message = this.handlePacket(offset + HEADER_LENGTH, code, length, this.buffer);
                callback(message);
                offset += fullMessageLength;
            }
            else {
                break;
            }
        }
        if (offset === bufferFullLength) {
            // No more use for the buffer
            this.buffer = emptyBuffer;
            this.bufferLength = 0;
            this.bufferOffset = 0;
        }
        else {
            // Adjust the cursors of remainingBuffer
            this.bufferLength = bufferFullLength - offset;
            this.bufferOffset = offset;
        }
    }
    mergeBuffer(buffer) {
        if (this.bufferLength > 0) {
            const newLength = this.bufferLength + buffer.byteLength;
            const newFullLength = newLength + this.bufferOffset;
            if (newFullLength > this.buffer.byteLength) {
                // We can't concat the new buffer with the remaining one
                let newBuffer;
                if (newLength <= this.buffer.byteLength && this.bufferOffset >= this.bufferLength) {
                    // We can move the relevant part to the beginning of the buffer instead of allocating a new buffer
                    newBuffer = this.buffer;
                }
                else {
                    // Allocate a new larger buffer
                    let newBufferLength = this.buffer.byteLength * 2;
                    while (newLength >= newBufferLength) {
                        newBufferLength *= 2;
                    }
                    newBuffer = Buffer.allocUnsafe(newBufferLength);
                }
                // Move the remaining buffer to the new one
                this.buffer.copy(newBuffer, 0, this.bufferOffset, this.bufferOffset + this.bufferLength);
                this.buffer = newBuffer;
                this.bufferOffset = 0;
            }
            // Concat the new buffer with the remaining one
            buffer.copy(this.buffer, this.bufferOffset + this.bufferLength);
            this.bufferLength = newLength;
        }
        else {
            this.buffer = buffer;
            this.bufferOffset = 0;
            this.bufferLength = buffer.byteLength;
        }
    }
    handlePacket(offset, code, length, bytes) {
        const { reader } = this;
        // NOTE: This undesirably retains the buffer in `this.reader` if the `parse*Message` calls below throw. However, those should only throw in the case of a protocol error, which normally results in the reader being discarded.
        reader.setBuffer(offset, bytes);
        let message;
        switch (code) {
            case 50 /* MessageCodes.BindComplete */:
                message = messages_1.bindComplete;
                break;
            case 49 /* MessageCodes.ParseComplete */:
                message = messages_1.parseComplete;
                break;
            case 51 /* MessageCodes.CloseComplete */:
                message = messages_1.closeComplete;
                break;
            case 110 /* MessageCodes.NoData */:
                message = messages_1.noData;
                break;
            case 115 /* MessageCodes.PortalSuspended */:
                message = messages_1.portalSuspended;
                break;
            case 99 /* MessageCodes.CopyDone */:
                message = messages_1.copyDone;
                break;
            case 87 /* MessageCodes.ReplicationStart */:
                message = messages_1.replicationStart;
                break;
            case 73 /* MessageCodes.EmptyQuery */:
                message = messages_1.emptyQuery;
                break;
            case 68 /* MessageCodes.DataRow */:
                message = parseDataRowMessage(reader);
                break;
            case 67 /* MessageCodes.CommandComplete */:
                message = parseCommandCompleteMessage(reader);
                break;
            case 90 /* MessageCodes.ReadyForQuery */:
                message = parseReadyForQueryMessage(reader);
                break;
            case 65 /* MessageCodes.NotificationResponse */:
                message = parseNotificationMessage(reader);
                break;
            case 82 /* MessageCodes.AuthenticationResponse */:
                message = parseAuthenticationResponse(reader, length);
                break;
            case 83 /* MessageCodes.ParameterStatus */:
                message = parseParameterStatusMessage(reader);
                break;
            case 75 /* MessageCodes.BackendKeyData */:
                message = parseBackendKeyData(reader);
                break;
            case 69 /* MessageCodes.ErrorMessage */:
                message = parseErrorMessage(reader, 'error');
                break;
            case 78 /* MessageCodes.NoticeMessage */:
                message = parseErrorMessage(reader, 'notice');
                break;
            case 84 /* MessageCodes.RowDescriptionMessage */:
                message = parseRowDescriptionMessage(reader);
                break;
            case 116 /* MessageCodes.ParameterDescriptionMessage */:
                message = parseParameterDescriptionMessage(reader);
                break;
            case 71 /* MessageCodes.CopyIn */:
                message = parseCopyInMessage(reader);
                break;
            case 72 /* MessageCodes.CopyOut */:
                message = parseCopyOutMessage(reader);
                break;
            case 100 /* MessageCodes.CopyData */:
                message = parseCopyData(reader, length);
                break;
            default:
                return new messages_1.DatabaseError('received invalid response: ' + code.toString(16), length, 'error');
        }
        reader.setBuffer(0, emptyBuffer);
        message.length = length;
        return message;
    }
}
exports.Parser = Parser;
const parseReadyForQueryMessage = (reader) => {
    const status = reader.string(1);
    return new messages_1.ReadyForQueryMessage(LATEINIT_LENGTH, status);
};
const parseCommandCompleteMessage = (reader) => {
    const text = reader.cstring();
    return new messages_1.CommandCompleteMessage(LATEINIT_LENGTH, text);
};
const parseCopyData = (reader, length) => {
    const chunk = reader.bytes(length - 4);
    return new messages_1.CopyDataMessage(LATEINIT_LENGTH, chunk);
};
const parseCopyInMessage = (reader) => parseCopyMessage(reader, 'copyInResponse');
const parseCopyOutMessage = (reader) => parseCopyMessage(reader, 'copyOutResponse');
const parseCopyMessage = (reader, messageName) => {
    const isBinary = reader.byte() !== 0;
    const columnCount = reader.int16();
    const message = new messages_1.CopyResponse(LATEINIT_LENGTH, messageName, isBinary, columnCount);
    for (let i = 0; i < columnCount; i++) {
        message.columnTypes[i] = reader.int16();
    }
    return message;
};
const parseNotificationMessage = (reader) => {
    const processId = reader.int32();
    const channel = reader.cstring();
    const payload = reader.cstring();
    return new messages_1.NotificationResponseMessage(LATEINIT_LENGTH, processId, channel, payload);
};
const parseRowDescriptionMessage = (reader) => {
    const fieldCount = reader.int16();
    const message = new messages_1.RowDescriptionMessage(LATEINIT_LENGTH, fieldCount);
    for (let i = 0; i < fieldCount; i++) {
        message.fields[i] = parseField(reader);
    }
    return message;
};
const parseField = (reader) => {
    const name = reader.cstring();
    const tableID = reader.uint32();
    const columnID = reader.int16();
    const dataTypeID = reader.uint32();
    const dataTypeSize = reader.int16();
    const dataTypeModifier = reader.int32();
    const mode = reader.int16() === 0 ? 'text' : 'binary';
    return new messages_1.Field(name, tableID, columnID, dataTypeID, dataTypeSize, dataTypeModifier, mode);
};
const parseParameterDescriptionMessage = (reader) => {
    const parameterCount = reader.int16();
    const message = new messages_1.ParameterDescriptionMessage(LATEINIT_LENGTH, parameterCount);
    for (let i = 0; i < parameterCount; i++) {
        message.dataTypeIDs[i] = reader.int32();
    }
    return message;
};
const parseDataRowMessage = (reader) => {
    const fieldCount = reader.int16();
    const fields = new Array(fieldCount);
    for (let i = 0; i < fieldCount; i++) {
        const len = reader.int32();
        // a -1 for length means the value of the field is null
        fields[i] = len === -1 ? null : reader.string(len);
    }
    return new messages_1.DataRowMessage(LATEINIT_LENGTH, fields);
};
const parseParameterStatusMessage = (reader) => {
    const name = reader.cstring();
    const value = reader.cstring();
    return new messages_1.ParameterStatusMessage(LATEINIT_LENGTH, name, value);
};
const parseBackendKeyData = (reader) => {
    const processID = reader.int32();
    const secretKey = reader.int32();
    return new messages_1.BackendKeyDataMessage(LATEINIT_LENGTH, processID, secretKey);
};
const parseAuthenticationResponse = (reader, length) => {
    const code = reader.int32();
    // TODO(bmc): maybe better types here
    const message = {
        name: 'authenticationOk',
        length,
    };
    switch (code) {
        case 0: // AuthenticationOk
            break;
        case 3: // AuthenticationCleartextPassword
            if (message.length === 8) {
                message.name = 'authenticationCleartextPassword';
            }
            break;
        case 5: // AuthenticationMD5Password
            if (message.length === 12) {
                message.name = 'authenticationMD5Password';
                const salt = reader.bytes(4);
                return new messages_1.AuthenticationMD5Password(LATEINIT_LENGTH, salt);
            }
            break;
        case 10: // AuthenticationSASL
            {
                message.name = 'authenticationSASL';
                message.mechanisms = [];
                let mechanism;
                do {
                    mechanism = reader.cstring();
                    if (mechanism) {
                        message.mechanisms.push(mechanism);
                    }
                } while (mechanism);
            }
            break;
        case 11: // AuthenticationSASLContinue
            message.name = 'authenticationSASLContinue';
            message.data = reader.string(length - 8);
            break;
        case 12: // AuthenticationSASLFinal
            message.name = 'authenticationSASLFinal';
            message.data = reader.string(length - 8);
            break;
        default:
            throw new Error('Unknown authenticationOk message type ' + code);
    }
    return message;
};
const parseErrorMessage = (reader, name) => {
    const fields = {};
    let fieldType = reader.string(1);
    while (fieldType !== '\0') {
        fields[fieldType] = reader.cstring();
        fieldType = reader.string(1);
    }
    const messageValue = fields.M;
    const message = name === 'notice'
        ? new messages_1.NoticeMessage(LATEINIT_LENGTH, messageValue)
        : new messages_1.DatabaseError(messageValue, LATEINIT_LENGTH, name);
    message.severity = fields.S;
    message.code = fields.C;
    message.detail = fields.D;
    message.hint = fields.H;
    message.position = fields.P;
    message.internalPosition = fields.p;
    message.internalQuery = fields.q;
    message.where = fields.W;
    message.schema = fields.s;
    message.table = fields.t;
    message.column = fields.c;
    message.dataType = fields.d;
    message.constraint = fields.n;
    message.file = fields.F;
    message.line = fields.L;
    message.routine = fields.R;
    return message;
};
//# sourceMappingURL=parser.js.map
1 node_modules/pg-protocol/dist/parser.js.map generated vendored Normal file
File diff suppressed because one or more lines are too long
42 node_modules/pg-protocol/dist/serializer.d.ts generated vendored Normal file
@@ -0,0 +1,42 @@
declare type ParseOpts = {
    name?: string;
    types?: number[];
    text: string;
};
declare type ValueMapper = (param: any, index: number) => any;
declare type BindOpts = {
    portal?: string;
    binary?: boolean;
    statement?: string;
    values?: any[];
    valueMapper?: ValueMapper;
};
declare type ExecOpts = {
    portal?: string;
    rows?: number;
};
declare type PortalOpts = {
    type: 'S' | 'P';
    name?: string;
};
declare const serialize: {
    startup: (opts: Record<string, string>) => Buffer;
    password: (password: string) => Buffer;
    requestSsl: () => Buffer;
    sendSASLInitialResponseMessage: (mechanism: string, initialResponse: string) => Buffer;
    sendSCRAMClientFinalMessage: (additionalData: string) => Buffer;
    query: (text: string) => Buffer;
    parse: (query: ParseOpts) => Buffer;
    bind: (config?: BindOpts) => Buffer;
    execute: (config?: ExecOpts) => Buffer;
    describe: (msg: PortalOpts) => Buffer;
    close: (msg: PortalOpts) => Buffer;
    flush: () => Buffer;
    sync: () => Buffer;
    end: () => Buffer;
    copyData: (chunk: Buffer) => Buffer;
    copyDone: () => Buffer;
    copyFail: (message: string) => Buffer;
    cancel: (processID: number, secretKey: number) => Buffer;
};
export { serialize };
189 node_modules/pg-protocol/dist/serializer.js generated vendored Normal file
@@ -0,0 +1,189 @@
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
exports.serialize = void 0;
const buffer_writer_1 = require("./buffer-writer");
const writer = new buffer_writer_1.Writer();
const startup = (opts) => {
    // protocol version
    writer.addInt16(3).addInt16(0);
    for (const key of Object.keys(opts)) {
        writer.addCString(key).addCString(opts[key]);
    }
    writer.addCString('client_encoding').addCString('UTF8');
    const bodyBuffer = writer.addCString('').flush();
    // this message is sent without a code
    const length = bodyBuffer.length + 4;
    return new buffer_writer_1.Writer().addInt32(length).add(bodyBuffer).flush();
};
const requestSsl = () => {
    const response = Buffer.allocUnsafe(8);
    response.writeInt32BE(8, 0);
    response.writeInt32BE(80877103, 4);
    return response;
};
const password = (password) => {
    return writer.addCString(password).flush(112 /* code.startup */);
};
const sendSASLInitialResponseMessage = function (mechanism, initialResponse) {
    // 0x70 = 'p'
    writer.addCString(mechanism).addInt32(Buffer.byteLength(initialResponse)).addString(initialResponse);
    return writer.flush(112 /* code.startup */);
};
const sendSCRAMClientFinalMessage = function (additionalData) {
    return writer.addString(additionalData).flush(112 /* code.startup */);
};
const query = (text) => {
    return writer.addCString(text).flush(81 /* code.query */);
};
const emptyArray = [];
const parse = (query) => {
    // expect something like this:
    // { name: 'queryName',
    // text: 'select * from blah',
    // types: ['int8', 'bool'] }
    // normalize missing query names to allow for null
    const name = query.name || '';
    if (name.length > 63) {
        console.error('Warning! Postgres only supports 63 characters for query names.');
        console.error('You supplied %s (%s)', name, name.length);
        console.error('This can cause conflicts and silent errors executing queries');
    }
    const types = query.types || emptyArray;
    const len = types.length;
    const buffer = writer
        .addCString(name) // name of query
        .addCString(query.text) // actual query text
        .addInt16(len);
    for (let i = 0; i < len; i++) {
        buffer.addInt32(types[i]);
    }
    return writer.flush(80 /* code.parse */);
};
const paramWriter = new buffer_writer_1.Writer();
const writeValues = function (values, valueMapper) {
    for (let i = 0; i < values.length; i++) {
        const mappedVal = valueMapper ? valueMapper(values[i], i) : values[i];
        if (mappedVal == null) {
            // add the param type (string) to the writer
            writer.addInt16(0 /* ParamType.STRING */);
            // write -1 to the param writer to indicate null
            paramWriter.addInt32(-1);
        }
        else if (mappedVal instanceof Buffer) {
            // add the param type (binary) to the writer
            writer.addInt16(1 /* ParamType.BINARY */);
            // add the buffer to the param writer
            paramWriter.addInt32(mappedVal.length);
            paramWriter.add(mappedVal);
        }
        else {
            // add the param type (string) to the writer
            writer.addInt16(0 /* ParamType.STRING */);
            paramWriter.addInt32(Buffer.byteLength(mappedVal));
            paramWriter.addString(mappedVal);
        }
    }
};
const bind = (config = {}) => {
    // normalize config
    const portal = config.portal || '';
    const statement = config.statement || '';
    const binary = config.binary || false;
    const values = config.values || emptyArray;
    const len = values.length;
    writer.addCString(portal).addCString(statement);
    writer.addInt16(len);
    writeValues(values, config.valueMapper);
    writer.addInt16(len);
    writer.add(paramWriter.flush());
    // all results use the same format code
    writer.addInt16(1);
    // format code
    writer.addInt16(binary ? 1 /* ParamType.BINARY */ : 0 /* ParamType.STRING */);
    return writer.flush(66 /* code.bind */);
};
const emptyExecute = Buffer.from([69 /* code.execute */, 0x00, 0x00, 0x00, 0x09, 0x00, 0x00, 0x00, 0x00, 0x00]);
const execute = (config) => {
    // this is the happy path for most queries
    if (!config || (!config.portal && !config.rows)) {
        return emptyExecute;
    }
    const portal = config.portal || '';
    const rows = config.rows || 0;
    const portalLength = Buffer.byteLength(portal);
    const len = 4 + portalLength + 1 + 4;
    // one extra byte for the code
    const buff = Buffer.allocUnsafe(1 + len);
    buff[0] = 69 /* code.execute */;
    buff.writeInt32BE(len, 1);
    buff.write(portal, 5, 'utf-8');
    buff[portalLength + 5] = 0; // null terminate portal cString
    buff.writeUInt32BE(rows, buff.length - 4);
    return buff;
};
const cancel = (processID, secretKey) => {
    const buffer = Buffer.allocUnsafe(16);
    buffer.writeInt32BE(16, 0);
    buffer.writeInt16BE(1234, 4);
    buffer.writeInt16BE(5678, 6);
    buffer.writeInt32BE(processID, 8);
    buffer.writeInt32BE(secretKey, 12);
    return buffer;
};
const cstringMessage = (code, string) => {
    const stringLen = Buffer.byteLength(string);
    const len = 4 + stringLen + 1;
    // one extra byte for the code
    const buffer = Buffer.allocUnsafe(1 + len);
    buffer[0] = code;
    buffer.writeInt32BE(len, 1);
    buffer.write(string, 5, 'utf-8');
    buffer[len] = 0; // null terminate cString
    return buffer;
};
const emptyDescribePortal = writer.addCString('P').flush(68 /* code.describe */);
const emptyDescribeStatement = writer.addCString('S').flush(68 /* code.describe */);
const describe = (msg) => {
    return msg.name
        ? cstringMessage(68 /* code.describe */, `${msg.type}${msg.name || ''}`)
        : msg.type === 'P'
            ? emptyDescribePortal
            : emptyDescribeStatement;
};
const close = (msg) => {
    const text = `${msg.type}${msg.name || ''}`;
    return cstringMessage(67 /* code.close */, text);
};
const copyData = (chunk) => {
    return writer.add(chunk).flush(100 /* code.copyFromChunk */);
};
const copyFail = (message) => {
    return cstringMessage(102 /* code.copyFail */, message);
};
const codeOnlyBuffer = (code) => Buffer.from([code, 0x00, 0x00, 0x00, 0x04]);
const flushBuffer = codeOnlyBuffer(72 /* code.flush */);
const syncBuffer = codeOnlyBuffer(83 /* code.sync */);
const endBuffer = codeOnlyBuffer(88 /* code.end */);
const copyDoneBuffer = codeOnlyBuffer(99 /* code.copyDone */);
const serialize = {
    startup,
    password,
    requestSsl,
    sendSASLInitialResponseMessage,
    sendSCRAMClientFinalMessage,
    query,
    parse,
    bind,
    execute,
    describe,
    close,
    flush: () => flushBuffer,
    sync: () => syncBuffer,
    end: () => endBuffer,
    copyData,
    copyDone: () => copyDoneBuffer,
    copyFail,
    cancel,
};
exports.serialize = serialize;
//# sourceMappingURL=serializer.js.map
1 node_modules/pg-protocol/dist/serializer.js.map generated vendored Normal file
File diff suppressed because one or more lines are too long
11 node_modules/pg-protocol/esm/index.js generated vendored Normal file
@@ -0,0 +1,11 @@
// ESM wrapper for pg-protocol
import * as protocol from '../dist/index.js'

// Re-export all the properties
export const DatabaseError = protocol.DatabaseError
export const SASL = protocol.SASL
export const serialize = protocol.serialize
export const parse = protocol.parse

// Re-export the default
export default protocol
45 node_modules/pg-protocol/package.json generated vendored Normal file
@@ -0,0 +1,45 @@
{
  "name": "pg-protocol",
  "version": "1.12.0",
  "description": "The postgres client/server binary protocol, implemented in TypeScript",
  "main": "dist/index.js",
  "types": "dist/index.d.ts",
  "exports": {
    ".": {
      "import": "./esm/index.js",
      "require": "./dist/index.js",
      "default": "./dist/index.js"
    },
    "./dist/*": "./dist/*.js",
    "./dist/*.js": "./dist/*.js"
  },
  "license": "MIT",
  "devDependencies": {
    "@types/chai": "^4.2.7",
    "@types/mocha": "^10.0.7",
    "@types/node": "^12.12.21",
    "chai": "^4.2.0",
    "chunky": "^0.0.0",
    "mocha": "^10.5.2",
    "ts-node": "^8.5.4",
    "typescript": "^4.0.3"
  },
  "scripts": {
    "test": "mocha dist/**/*.test.js",
    "build": "tsc",
    "build:watch": "tsc --watch",
    "prepublish": "yarn build",
    "pretest": "yarn build"
  },
  "repository": {
    "type": "git",
    "url": "git://github.com/brianc/node-postgres.git",
    "directory": "packages/pg-protocol"
  },
  "files": [
    "/dist/*{js,ts,map}",
    "/src",
    "/esm"
  ],
  "gitHead": "f2d7d1146cc87024a5fa503dce13c59ff5196d26"
}
25 node_modules/pg-protocol/src/b.ts generated vendored Normal file
@@ -0,0 +1,25 @@
// file for microbenchmarking

import { BufferReader } from './buffer-reader'

const LOOPS = 1000
let count = 0
const start = performance.now()

const reader = new BufferReader()
const buffer = Buffer.from([33, 33, 33, 33, 33, 33, 33, 0])

const run = () => {
  if (count > LOOPS) {
    console.log(performance.now() - start)
    return
  }
  count++
  for (let i = 0; i < LOOPS; i++) {
    reader.setBuffer(0, buffer)
    reader.cstring()
  }
  setImmediate(run)
}

run()
58 node_modules/pg-protocol/src/buffer-reader.ts generated vendored Normal file
@@ -0,0 +1,58 @@
export class BufferReader {
  private buffer: Buffer = Buffer.allocUnsafe(0)

  // TODO(bmc): support non-utf8 encoding?
  private encoding: string = 'utf-8'

  constructor(private offset: number = 0) {}

  public setBuffer(offset: number, buffer: Buffer): void {
    this.offset = offset
    this.buffer = buffer
  }

  public int16(): number {
    const result = this.buffer.readInt16BE(this.offset)
    this.offset += 2
    return result
  }

  public byte(): number {
    const result = this.buffer[this.offset]
    this.offset++
    return result
  }

  public int32(): number {
    const result = this.buffer.readInt32BE(this.offset)
    this.offset += 4
    return result
  }

  public uint32(): number {
    const result = this.buffer.readUInt32BE(this.offset)
    this.offset += 4
    return result
  }

  public string(length: number): string {
    const result = this.buffer.toString(this.encoding, this.offset, this.offset + length)
    this.offset += length
    return result
  }

  public cstring(): string {
    const start = this.offset
    let end = start
    // eslint-disable-next-line no-empty
    while (this.buffer[end++] !== 0) {}
    this.offset = end
    return this.buffer.toString(this.encoding, start, end - 1)
  }

  public bytes(length: number): Buffer {
    const result = this.buffer.slice(this.offset, this.offset + length)
    this.offset += length
    return result
  }
}
85
node_modules/pg-protocol/src/buffer-writer.ts
generated
vendored
Normal file
85
node_modules/pg-protocol/src/buffer-writer.ts
generated
vendored
Normal file
@@ -0,0 +1,85 @@
// binary data writer tuned for encoding binary specific to the postgres binary protocol

export class Writer {
  private buffer: Buffer
  private offset: number = 5
  private headerPosition: number = 0
  constructor(private size = 256) {
    this.buffer = Buffer.allocUnsafe(size)
  }

  private ensure(size: number): void {
    const remaining = this.buffer.length - this.offset
    if (remaining < size) {
      const oldBuffer = this.buffer
      // exponential growth factor of around ~ 1.5
      // https://stackoverflow.com/questions/2269063/buffer-growth-strategy
      const newSize = oldBuffer.length + (oldBuffer.length >> 1) + size
      this.buffer = Buffer.allocUnsafe(newSize)
      oldBuffer.copy(this.buffer)
    }
  }

  public addInt32(num: number): Writer {
    this.ensure(4)
    this.buffer[this.offset++] = (num >>> 24) & 0xff
    this.buffer[this.offset++] = (num >>> 16) & 0xff
    this.buffer[this.offset++] = (num >>> 8) & 0xff
    this.buffer[this.offset++] = (num >>> 0) & 0xff
    return this
  }

  public addInt16(num: number): Writer {
    this.ensure(2)
    this.buffer[this.offset++] = (num >>> 8) & 0xff
    this.buffer[this.offset++] = (num >>> 0) & 0xff
    return this
  }

  public addCString(string: string): Writer {
    if (!string) {
      this.ensure(1)
    } else {
      const len = Buffer.byteLength(string)
      this.ensure(len + 1) // +1 for null terminator
      this.buffer.write(string, this.offset, 'utf-8')
      this.offset += len
    }

    this.buffer[this.offset++] = 0 // null terminator
    return this
  }

  public addString(string: string = ''): Writer {
    const len = Buffer.byteLength(string)
    this.ensure(len)
    this.buffer.write(string, this.offset)
    this.offset += len
    return this
  }

  public add(otherBuffer: Buffer): Writer {
    this.ensure(otherBuffer.length)
    otherBuffer.copy(this.buffer, this.offset)
    this.offset += otherBuffer.length
    return this
  }

  private join(code?: number): Buffer {
    if (code) {
      this.buffer[this.headerPosition] = code
      // length is everything in this packet minus the code
      const length = this.offset - (this.headerPosition + 1)
      this.buffer.writeInt32BE(length, this.headerPosition + 1)
    }
    return this.buffer.slice(code ? 0 : 5, this.offset)
  }

  public flush(code?: number): Buffer {
    const result = this.join(code)
    this.offset = 5
    this.headerPosition = 0
    this.buffer = Buffer.allocUnsafe(this.size)
    return result
  }
}
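The `join`/`flush` logic above frames each outbound packet as a one-byte message code followed by a big-endian int32 length that counts everything after the code byte (including the length field itself). A minimal standalone sketch of that framing, assuming a simple Query (`Q`) message with illustrative query text — this is not pg-protocol code:

```typescript
// Frame a Query ('Q') message: [code][int32 length][null-terminated text].
// The length field counts the int32 itself plus the payload, not the code byte.
const payload = Buffer.from('select 1\0', 'utf-8') // 9 bytes, null-terminated
const message = Buffer.alloc(1 + 4 + payload.length)
message.write('Q', 0) // message code
message.writeInt32BE(4 + payload.length, 1) // length = int32 itself + payload
payload.copy(message, 5)
console.log(message.length, message.readInt32BE(1)) // 14 13
```

The `offset = 5` starting position in `Writer` reserves exactly this 5-byte code-plus-length header, which `join` backfills once the payload size is known.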
575
node_modules/pg-protocol/src/inbound-parser.test.ts
generated
vendored
Normal file
@@ -0,0 +1,575 @@
import buffers from './testing/test-buffers'
import BufferList from './testing/buffer-list'
import { parse } from '.'
import assert from 'assert'
import { PassThrough } from 'stream'
import { BackendMessage } from './messages'
import { Parser } from './parser'

const authOkBuffer = buffers.authenticationOk()
const paramStatusBuffer = buffers.parameterStatus('client_encoding', 'UTF8')
const readyForQueryBuffer = buffers.readyForQuery()
const backendKeyDataBuffer = buffers.backendKeyData(1, 2)
const commandCompleteBuffer = buffers.commandComplete('SELECT 3')
const parseCompleteBuffer = buffers.parseComplete()
const bindCompleteBuffer = buffers.bindComplete()
const portalSuspendedBuffer = buffers.portalSuspended()

const row1 = {
  name: 'id',
  tableID: 1,
  attributeNumber: 2,
  dataTypeID: 3,
  dataTypeSize: 4,
  typeModifier: 5,
  formatCode: 0,
}
const oneRowDescBuff = buffers.rowDescription([row1])
row1.name = 'bang'

const twoRowBuf = buffers.rowDescription([
  row1,
  {
    name: 'whoah',
    tableID: 10,
    attributeNumber: 11,
    dataTypeID: 12,
    dataTypeSize: 13,
    typeModifier: 14,
    formatCode: 0,
  },
])

const rowWithBigOids = {
  name: 'bigoid',
  tableID: 3000000001,
  attributeNumber: 2,
  dataTypeID: 3000000003,
  dataTypeSize: 4,
  typeModifier: 5,
  formatCode: 0,
}
const bigOidDescBuff = buffers.rowDescription([rowWithBigOids])

const emptyRowFieldBuf = buffers.dataRow([])

const oneFieldBuf = buffers.dataRow(['test'])

const expectedAuthenticationOkayMessage = {
  name: 'authenticationOk',
  length: 8,
}

const expectedParameterStatusMessage = {
  name: 'parameterStatus',
  parameterName: 'client_encoding',
  parameterValue: 'UTF8',
  length: 25,
}

const expectedBackendKeyDataMessage = {
  name: 'backendKeyData',
  processID: 1,
  secretKey: 2,
}

const expectedReadyForQueryMessage = {
  name: 'readyForQuery',
  length: 5,
  status: 'I',
}

const expectedCommandCompleteMessage = {
  name: 'commandComplete',
  length: 13,
  text: 'SELECT 3',
}
const emptyRowDescriptionBuffer = new BufferList()
  .addInt16(0) // number of fields
  .join(true, 'T')

const expectedEmptyRowDescriptionMessage = {
  name: 'rowDescription',
  length: 6,
  fieldCount: 0,
  fields: [],
}
const expectedOneRowMessage = {
  name: 'rowDescription',
  length: 27,
  fieldCount: 1,
  fields: [
    { name: 'id', tableID: 1, columnID: 2, dataTypeID: 3, dataTypeSize: 4, dataTypeModifier: 5, format: 'text' },
  ],
}

const expectedTwoRowMessage = {
  name: 'rowDescription',
  length: 53,
  fieldCount: 2,
  fields: [
    { name: 'bang', tableID: 1, columnID: 2, dataTypeID: 3, dataTypeSize: 4, dataTypeModifier: 5, format: 'text' },
    { name: 'whoah', tableID: 10, columnID: 11, dataTypeID: 12, dataTypeSize: 13, dataTypeModifier: 14, format: 'text' },
  ],
}
const expectedBigOidMessage = {
  name: 'rowDescription',
  length: 31,
  fieldCount: 1,
  fields: [
    {
      name: 'bigoid',
      tableID: 3000000001,
      columnID: 2,
      dataTypeID: 3000000003,
      dataTypeSize: 4,
      dataTypeModifier: 5,
      format: 'text',
    },
  ],
}

const emptyParameterDescriptionBuffer = new BufferList()
  .addInt16(0) // number of parameters
  .join(true, 't')

const oneParameterDescBuf = buffers.parameterDescription([1111])

const twoParameterDescBuf = buffers.parameterDescription([2222, 3333])

const expectedEmptyParameterDescriptionMessage = {
  name: 'parameterDescription',
  length: 6,
  parameterCount: 0,
  dataTypeIDs: [],
}

const expectedOneParameterMessage = {
  name: 'parameterDescription',
  length: 10,
  parameterCount: 1,
  dataTypeIDs: [1111],
}

const expectedTwoParameterMessage = {
  name: 'parameterDescription',
  length: 14,
  parameterCount: 2,
  dataTypeIDs: [2222, 3333],
}

const testForMessage = function (buffer: Buffer, expectedMessage: any) {
  it('receives and parses ' + expectedMessage.name, async () => {
    const messages = await parseBuffers([buffer])
    const [lastMessage] = messages

    for (const key in expectedMessage) {
      assert.deepEqual((lastMessage as any)[key], expectedMessage[key])
    }
  })
}

const plainPasswordBuffer = buffers.authenticationCleartextPassword()
const md5PasswordBuffer = buffers.authenticationMD5Password()
const SASLBuffer = buffers.authenticationSASL()
const SASLContinueBuffer = buffers.authenticationSASLContinue()
const SASLFinalBuffer = buffers.authenticationSASLFinal()

const expectedPlainPasswordMessage = {
  name: 'authenticationCleartextPassword',
}

const expectedMD5PasswordMessage = {
  name: 'authenticationMD5Password',
  salt: Buffer.from([1, 2, 3, 4]),
}

const expectedSASLMessage = {
  name: 'authenticationSASL',
  mechanisms: ['SCRAM-SHA-256'],
}

const expectedSASLContinueMessage = {
  name: 'authenticationSASLContinue',
  data: 'data',
}

const expectedSASLFinalMessage = {
  name: 'authenticationSASLFinal',
  data: 'data',
}

const notificationResponseBuffer = buffers.notification(4, 'hi', 'boom')
const expectedNotificationResponseMessage = {
  name: 'notification',
  processId: 4,
  channel: 'hi',
  payload: 'boom',
}

const parseBuffers = async (buffers: Buffer[]): Promise<BackendMessage[]> => {
  const stream = new PassThrough()
  for (const buffer of buffers) {
    stream.write(buffer)
  }
  stream.end()
  const msgs: BackendMessage[] = []
  await parse(stream, (msg) => msgs.push(msg))
  return msgs
}

describe('PgPacketStream', function () {
  testForMessage(authOkBuffer, expectedAuthenticationOkayMessage)
  testForMessage(plainPasswordBuffer, expectedPlainPasswordMessage)
  testForMessage(md5PasswordBuffer, expectedMD5PasswordMessage)
  testForMessage(SASLBuffer, expectedSASLMessage)
  testForMessage(SASLContinueBuffer, expectedSASLContinueMessage)

  // this exercises a found bug in the parser:
  // https://github.com/brianc/node-postgres/pull/2210#issuecomment-627626084
  // and adds a test which is deterministic, rather than relying on network packet chunking
  const extendedSASLContinueBuffer = Buffer.concat([SASLContinueBuffer, Buffer.from([1, 2, 3, 4])])
  testForMessage(extendedSASLContinueBuffer, expectedSASLContinueMessage)

  testForMessage(SASLFinalBuffer, expectedSASLFinalMessage)

  // this exercises a found bug in the parser:
  // https://github.com/brianc/node-postgres/pull/2210#issuecomment-627626084
  // and adds a test which is deterministic, rather than relying on network packet chunking
  const extendedSASLFinalBuffer = Buffer.concat([SASLFinalBuffer, Buffer.from([1, 2, 4, 5])])
  testForMessage(extendedSASLFinalBuffer, expectedSASLFinalMessage)

  testForMessage(paramStatusBuffer, expectedParameterStatusMessage)
  testForMessage(backendKeyDataBuffer, expectedBackendKeyDataMessage)
  testForMessage(readyForQueryBuffer, expectedReadyForQueryMessage)
  testForMessage(commandCompleteBuffer, expectedCommandCompleteMessage)
  testForMessage(notificationResponseBuffer, expectedNotificationResponseMessage)
  testForMessage(buffers.emptyQuery(), {
    name: 'emptyQuery',
    length: 4,
  })

  testForMessage(Buffer.from([0x6e, 0, 0, 0, 4]), {
    name: 'noData',
  })

  describe('rowDescription messages', function () {
    testForMessage(emptyRowDescriptionBuffer, expectedEmptyRowDescriptionMessage)
    testForMessage(oneRowDescBuff, expectedOneRowMessage)
    testForMessage(twoRowBuf, expectedTwoRowMessage)
    testForMessage(bigOidDescBuff, expectedBigOidMessage)
  })

  describe('parameterDescription messages', function () {
    testForMessage(emptyParameterDescriptionBuffer, expectedEmptyParameterDescriptionMessage)
    testForMessage(oneParameterDescBuf, expectedOneParameterMessage)
    testForMessage(twoParameterDescBuf, expectedTwoParameterMessage)
  })

  describe('parsing rows', function () {
    describe('parsing empty row', function () {
      testForMessage(emptyRowFieldBuf, {
        name: 'dataRow',
        fieldCount: 0,
      })
    })

    describe('parsing data row with fields', function () {
      testForMessage(oneFieldBuf, {
        name: 'dataRow',
        fieldCount: 1,
        fields: ['test'],
      })
    })
  })

  describe('notice message', function () {
    // this uses the same logic as error message
    const buff = buffers.notice([{ type: 'C', value: 'code' }])
    testForMessage(buff, {
      name: 'notice',
      code: 'code',
    })
  })

  testForMessage(buffers.error([]), {
    name: 'error',
  })

  describe('with all the fields', function () {
    const buffer = buffers.error([
      { type: 'S', value: 'ERROR' },
      { type: 'C', value: 'code' },
      { type: 'M', value: 'message' },
      { type: 'D', value: 'details' },
      { type: 'H', value: 'hint' },
      { type: 'P', value: '100' },
      { type: 'p', value: '101' },
      { type: 'q', value: 'query' },
      { type: 'W', value: 'where' },
      { type: 'F', value: 'file' },
      { type: 'L', value: 'line' },
      { type: 'R', value: 'routine' },
      { type: 'Z', value: 'alsdkf' }, // ignored
    ])

    testForMessage(buffer, {
      name: 'error',
      severity: 'ERROR',
      code: 'code',
      message: 'message',
      detail: 'details',
      hint: 'hint',
      position: '100',
      internalPosition: '101',
      internalQuery: 'query',
      where: 'where',
      file: 'file',
      line: 'line',
      routine: 'routine',
    })
  })

  testForMessage(parseCompleteBuffer, {
    name: 'parseComplete',
  })

  testForMessage(bindCompleteBuffer, {
    name: 'bindComplete',
  })

  testForMessage(bindCompleteBuffer, {
    name: 'bindComplete',
  })

  testForMessage(buffers.closeComplete(), {
    name: 'closeComplete',
  })

  describe('parses portal suspended message', function () {
    testForMessage(portalSuspendedBuffer, {
      name: 'portalSuspended',
    })
  })

  describe('parses replication start message', function () {
    testForMessage(Buffer.from([0x57, 0x00, 0x00, 0x00, 0x04]), {
      name: 'replicationStart',
      length: 4,
    })
  })

  describe('copy', () => {
    testForMessage(buffers.copyIn(0), {
      name: 'copyInResponse',
      length: 7,
      binary: false,
      columnTypes: [],
    })

    testForMessage(buffers.copyIn(2), {
      name: 'copyInResponse',
      length: 11,
      binary: false,
      columnTypes: [0, 1],
    })

    testForMessage(buffers.copyOut(0), {
      name: 'copyOutResponse',
      length: 7,
      binary: false,
      columnTypes: [],
    })

    testForMessage(buffers.copyOut(3), {
      name: 'copyOutResponse',
      length: 13,
      binary: false,
      columnTypes: [0, 1, 2],
    })

    testForMessage(buffers.copyDone(), {
      name: 'copyDone',
      length: 4,
    })

    testForMessage(buffers.copyData(Buffer.from([5, 6, 7])), {
      name: 'copyData',
      length: 7,
      chunk: Buffer.from([5, 6, 7]),
    })
  })

  // since the data message on a stream can randomly divide the incoming
  // tcp packets anywhere, we need to make sure we can parse every single
  // split on a tcp message
  describe('split buffer, single message parsing', function () {
    const fullBuffer = buffers.dataRow([null, 'bang', 'zug zug', null, '!'])

    it('parses when full buffer comes in', async function () {
      const messages = await parseBuffers([fullBuffer])
      const message = messages[0] as any
      assert.equal(message.fields.length, 5)
      assert.equal(message.fields[0], null)
      assert.equal(message.fields[1], 'bang')
      assert.equal(message.fields[2], 'zug zug')
      assert.equal(message.fields[3], null)
      assert.equal(message.fields[4], '!')
    })

    const testMessageReceivedAfterSplitAt = async function (split: number) {
      const firstBuffer = Buffer.alloc(fullBuffer.length - split)
      const secondBuffer = Buffer.alloc(fullBuffer.length - firstBuffer.length)
      fullBuffer.copy(firstBuffer, 0, 0)
      fullBuffer.copy(secondBuffer, 0, firstBuffer.length)
      const messages = await parseBuffers([firstBuffer, secondBuffer])
      const message = messages[0] as any
      assert.equal(message.fields.length, 5)
      assert.equal(message.fields[0], null)
      assert.equal(message.fields[1], 'bang')
      assert.equal(message.fields[2], 'zug zug')
      assert.equal(message.fields[3], null)
      assert.equal(message.fields[4], '!')
    }

    it('parses when split in the middle', function () {
      return testMessageReceivedAfterSplitAt(6)
    })

    it('parses when split at end', function () {
      return testMessageReceivedAfterSplitAt(2)
    })

    it('parses when split at beginning', function () {
      return Promise.all([
        testMessageReceivedAfterSplitAt(fullBuffer.length - 2),
        testMessageReceivedAfterSplitAt(fullBuffer.length - 1),
        testMessageReceivedAfterSplitAt(fullBuffer.length - 5),
      ])
    })
  })

  describe('split buffer, multiple message parsing', function () {
    const dataRowBuffer = buffers.dataRow(['!'])
    const readyForQueryBuffer = buffers.readyForQuery()
    const fullBuffer = Buffer.alloc(dataRowBuffer.length + readyForQueryBuffer.length)
    dataRowBuffer.copy(fullBuffer, 0, 0)
    readyForQueryBuffer.copy(fullBuffer, dataRowBuffer.length, 0)

    const verifyMessages = function (messages: any[]) {
      assert.strictEqual(messages.length, 2)
      assert.deepEqual(messages[0], {
        name: 'dataRow',
        fieldCount: 1,
        length: 11,
        fields: ['!'],
      })
      assert.equal(messages[0].fields[0], '!')
      assert.deepEqual(messages[1], {
        name: 'readyForQuery',
        length: 5,
        status: 'I',
      })
    }
    // sanity check
    it('receives both messages when packet is not split', async function () {
      const messages = await parseBuffers([fullBuffer])
      verifyMessages(messages)
    })

    const splitAndVerifyTwoMessages = async function (split: number) {
      const firstBuffer = Buffer.alloc(fullBuffer.length - split)
      const secondBuffer = Buffer.alloc(fullBuffer.length - firstBuffer.length)
      fullBuffer.copy(firstBuffer, 0, 0)
      fullBuffer.copy(secondBuffer, 0, firstBuffer.length)
      const messages = await parseBuffers([firstBuffer, secondBuffer])
      verifyMessages(messages)
    }

    describe('receives both messages when packet is split', function () {
      it('in the middle', function () {
        return splitAndVerifyTwoMessages(11)
      })
      it('at the front', function () {
        return Promise.all([
          splitAndVerifyTwoMessages(fullBuffer.length - 1),
          splitAndVerifyTwoMessages(fullBuffer.length - 4),
          splitAndVerifyTwoMessages(fullBuffer.length - 6),
        ])
      })

      it('at the end', function () {
        return Promise.all([splitAndVerifyTwoMessages(8), splitAndVerifyTwoMessages(1)])
      })
    })
  })

  it('cleans up the reader after handling a packet', function () {
    const parser = new Parser()
    parser.parse(oneFieldBuf, () => {})
    assert.strictEqual((parser as any).reader.buffer.byteLength, 0)
  })
})
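The `length` values in the expected messages above follow the protocol framing: length = 4 (the int32 length field itself) plus the payload bytes, where each string payload is null-terminated. A standalone arithmetic check (not pg-protocol code) for two of the fixtures:

```typescript
// A backend message's length field counts the int32 itself plus the payload.
function cStringLen(s: string): number {
  return Buffer.byteLength(s) + 1 // +1 for the null terminator
}
// parameterStatus carries two C strings: parameter name and value.
const paramStatusLen = 4 + cStringLen('client_encoding') + cStringLen('UTF8')
console.log(paramStatusLen) // 25, matching expectedParameterStatusMessage
// commandComplete carries one C string: the command tag.
const commandCompleteLen = 4 + cStringLen('SELECT 3')
console.log(commandCompleteLen) // 13, matching expectedCommandCompleteMessage
```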
11
node_modules/pg-protocol/src/index.ts
generated
vendored
Normal file
@@ -0,0 +1,11 @@
import { DatabaseError } from './messages'
import { serialize } from './serializer'
import { Parser, MessageCallback } from './parser'

export function parse(stream: NodeJS.ReadableStream, callback: MessageCallback): Promise<void> {
  const parser = new Parser()
  stream.on('data', (buffer: Buffer) => parser.parse(buffer, callback))
  return new Promise((resolve) => stream.on('end', () => resolve()))
}

export { serialize, DatabaseError }
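The `parse` entry point above uses a common stream pattern: consume `'data'` events as they arrive and resolve a promise on `'end'`. A self-contained sketch of the same pattern with a trivial chunk collector standing in for the parser (`collect` is a hypothetical helper, not part of pg-protocol):

```typescript
import { PassThrough } from 'stream'

// Same shape as parse(): react to each 'data' chunk, resolve on 'end'.
function collect(stream: NodeJS.ReadableStream): Promise<Buffer[]> {
  const chunks: Buffer[] = []
  stream.on('data', (chunk: Buffer) => chunks.push(chunk))
  return new Promise((resolve) => stream.on('end', () => resolve(chunks)))
}

const stream = new PassThrough()
stream.write(Buffer.from('ab'))
stream.write(Buffer.from('cd'))
stream.end()
collect(stream).then((chunks) => console.log(Buffer.concat(chunks).toString())) // abcd
```

Because the callback fires per chunk rather than per message, the real `Parser` must buffer partial packets internally, which is exactly what the split-buffer tests above exercise.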
262
node_modules/pg-protocol/src/messages.ts
generated
vendored
Normal file
@@ -0,0 +1,262 @@
export type Mode = 'text' | 'binary'

export type MessageName =
  | 'parseComplete'
  | 'bindComplete'
  | 'closeComplete'
  | 'noData'
  | 'portalSuspended'
  | 'replicationStart'
  | 'emptyQuery'
  | 'copyDone'
  | 'copyData'
  | 'rowDescription'
  | 'parameterDescription'
  | 'parameterStatus'
  | 'backendKeyData'
  | 'notification'
  | 'readyForQuery'
  | 'commandComplete'
  | 'dataRow'
  | 'copyInResponse'
  | 'copyOutResponse'
  | 'authenticationOk'
  | 'authenticationMD5Password'
  | 'authenticationCleartextPassword'
  | 'authenticationSASL'
  | 'authenticationSASLContinue'
  | 'authenticationSASLFinal'
  | 'error'
  | 'notice'

export interface BackendMessage {
  name: MessageName
  length: number
}

export const parseComplete: BackendMessage = { name: 'parseComplete', length: 5 }

export const bindComplete: BackendMessage = { name: 'bindComplete', length: 5 }

export const closeComplete: BackendMessage = { name: 'closeComplete', length: 5 }

export const noData: BackendMessage = { name: 'noData', length: 5 }

export const portalSuspended: BackendMessage = { name: 'portalSuspended', length: 5 }

export const replicationStart: BackendMessage = { name: 'replicationStart', length: 4 }

export const emptyQuery: BackendMessage = { name: 'emptyQuery', length: 4 }

export const copyDone: BackendMessage = { name: 'copyDone', length: 4 }

interface NoticeOrError {
  message: string | undefined
  severity: string | undefined
  code: string | undefined
  detail: string | undefined
  hint: string | undefined
  position: string | undefined
  internalPosition: string | undefined
  internalQuery: string | undefined
  where: string | undefined
  schema: string | undefined
  table: string | undefined
  column: string | undefined
  dataType: string | undefined
  constraint: string | undefined
  file: string | undefined
  line: string | undefined
  routine: string | undefined
}

export class DatabaseError extends Error implements NoticeOrError {
  public severity: string | undefined
  public code: string | undefined
  public detail: string | undefined
  public hint: string | undefined
  public position: string | undefined
  public internalPosition: string | undefined
  public internalQuery: string | undefined
  public where: string | undefined
  public schema: string | undefined
  public table: string | undefined
  public column: string | undefined
  public dataType: string | undefined
  public constraint: string | undefined
  public file: string | undefined
  public line: string | undefined
  public routine: string | undefined
  constructor(
    message: string,
    public readonly length: number,
    public readonly name: MessageName
  ) {
    super(message)
  }
}

export class CopyDataMessage {
  public readonly name = 'copyData'
  constructor(
    public readonly length: number,
    public readonly chunk: Buffer
  ) {}
}

export class CopyResponse {
  public readonly columnTypes: number[]
  constructor(
    public readonly length: number,
    public readonly name: MessageName,
    public readonly binary: boolean,
    columnCount: number
  ) {
    this.columnTypes = new Array(columnCount)
  }
}

export class Field {
  constructor(
    public readonly name: string,
    public readonly tableID: number,
    public readonly columnID: number,
    public readonly dataTypeID: number,
    public readonly dataTypeSize: number,
    public readonly dataTypeModifier: number,
    public readonly format: Mode
  ) {}
}

export class RowDescriptionMessage {
  public readonly name: MessageName = 'rowDescription'
  public readonly fields: Field[]
  constructor(
    public readonly length: number,
    public readonly fieldCount: number
  ) {
    this.fields = new Array(this.fieldCount)
  }
}

export class ParameterDescriptionMessage {
  public readonly name: MessageName = 'parameterDescription'
  public readonly dataTypeIDs: number[]
  constructor(
    public readonly length: number,
    public readonly parameterCount: number
  ) {
    this.dataTypeIDs = new Array(this.parameterCount)
  }
}

export class ParameterStatusMessage {
  public readonly name: MessageName = 'parameterStatus'
  constructor(
    public readonly length: number,
    public readonly parameterName: string,
    public readonly parameterValue: string
  ) {}
}

export class AuthenticationMD5Password implements BackendMessage {
  public readonly name: MessageName = 'authenticationMD5Password'
  constructor(
    public readonly length: number,
    public readonly salt: Buffer
  ) {}
}

export class BackendKeyDataMessage {
  public readonly name: MessageName = 'backendKeyData'
  constructor(
    public readonly length: number,
    public readonly processID: number,
    public readonly secretKey: number
  ) {}
}

export class NotificationResponseMessage {
  public readonly name: MessageName = 'notification'
  constructor(
    public readonly length: number,
    public readonly processId: number,
    public readonly channel: string,
    public readonly payload: string
  ) {}
}

export class ReadyForQueryMessage {
  public readonly name: MessageName = 'readyForQuery'
  constructor(
    public readonly length: number,
    public readonly status: string
  ) {}
}

export class CommandCompleteMessage {
  public readonly name: MessageName = 'commandComplete'
  constructor(
    public readonly length: number,
    public readonly text: string
  ) {}
}

export class DataRowMessage {
  public readonly fieldCount: number
  public readonly name: MessageName = 'dataRow'
  constructor(
    public length: number,
    public fields: any[]
  ) {
    this.fieldCount = fields.length
  }
}

export class NoticeMessage implements BackendMessage, NoticeOrError {
  constructor(
    public readonly length: number,
    public readonly message: string | undefined
  ) {}
  public readonly name = 'notice'
  public severity: string | undefined
  public code: string | undefined
  public detail: string | undefined
  public hint: string | undefined
  public position: string | undefined
  public internalPosition: string | undefined
  public internalQuery: string | undefined
  public where: string | undefined
  public schema: string | undefined
  public table: string | undefined
  public column: string | undefined
  public dataType: string | undefined
  public constraint: string | undefined
  public file: string | undefined
  public line: string | undefined
  public routine: string | undefined
}
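`DatabaseError` above layers the protocol's error fields onto a real `Error` via TypeScript parameter properties, so the length and message name travel with the thrown value. A minimal standalone sketch of the same pattern (`ProtocolError` is a hypothetical stand-in, not part of pg-protocol):

```typescript
// Parameter properties declare and assign the fields in one place;
// super(message) wires up the standard Error message.
class ProtocolError extends Error {
  public code: string | undefined
  constructor(
    message: string,
    public readonly length: number,
    public readonly name: string
  ) {
    super(message)
  }
}

const err = new ProtocolError('relation "x" does not exist', 98, 'error')
console.log(err.message, err.length, err.name)
```

Declaring `name` as a parameter property shadows `Error.prototype.name`, which is why the parsed message name (`'error'` or `'notice'`) survives on the instance.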
276
node_modules/pg-protocol/src/outbound-serializer.test.ts
generated
vendored
Normal file
@@ -0,0 +1,276 @@
import assert from 'assert'
import { serialize } from './serializer'
import BufferList from './testing/buffer-list'

describe('serializer', () => {
  it('builds startup message', function () {
    const actual = serialize.startup({
      user: 'brian',
      database: 'bang',
    })
    assert.deepEqual(
      actual,
      new BufferList()
        .addInt16(3)
        .addInt16(0)
        .addCString('user')
        .addCString('brian')
        .addCString('database')
        .addCString('bang')
        .addCString('client_encoding')
        .addCString('UTF8')
        .addCString('')
        .join(true)
    )
  })

  it('builds password message', function () {
    const actual = serialize.password('!')
    assert.deepEqual(actual, new BufferList().addCString('!').join(true, 'p'))
  })

  it('builds request ssl message', function () {
    const actual = serialize.requestSsl()
    const expected = new BufferList().addInt32(80877103).join(true)
    assert.deepEqual(actual, expected)
  })

  it('builds SASLInitialResponseMessage message', function () {
    const actual = serialize.sendSASLInitialResponseMessage('mech', 'data')
    assert.deepEqual(actual, new BufferList().addCString('mech').addInt32(4).addString('data').join(true, 'p'))
  })

  it('builds SCRAMClientFinalMessage message', function () {
    const actual = serialize.sendSCRAMClientFinalMessage('data')
    assert.deepEqual(actual, new BufferList().addString('data').join(true, 'p'))
  })

  it('builds query message', function () {
    const txt = 'select * from boom'
    const actual = serialize.query(txt)
    assert.deepEqual(actual, new BufferList().addCString(txt).join(true, 'Q'))
  })

  describe('parse message', () => {
    it('builds parse message', function () {
      const actual = serialize.parse({ text: '!' })
      const expected = new BufferList().addCString('').addCString('!').addInt16(0).join(true, 'P')
      assert.deepEqual(actual, expected)
    })

    it('builds parse message with named query', function () {
      const actual = serialize.parse({
        name: 'boom',
        text: 'select * from boom',
        types: [],
      })
      const expected = new BufferList().addCString('boom').addCString('select * from boom').addInt16(0).join(true, 'P')
      assert.deepEqual(actual, expected)
    })

    it('with multiple parameters', function () {
      const actual = serialize.parse({
        name: 'force',
        text: 'select * from bang where name = $1',
        types: [1, 2, 3, 4],
      })
      const expected = new BufferList()
        .addCString('force')
        .addCString('select * from bang where name = $1')
        .addInt16(4)
        .addInt32(1)
        .addInt32(2)
        .addInt32(3)
        .addInt32(4)
        .join(true, 'P')
      assert.deepEqual(actual, expected)
    })
  })

  describe('bind messages', function () {
    it('with no values', function () {
      const actual = serialize.bind()

      const expectedBuffer = new BufferList()
        .addCString('')
        .addCString('')
        .addInt16(0)
        .addInt16(0)
        .addInt16(1)
        .addInt16(0)
        .join(true, 'B')
      assert.deepEqual(actual, expectedBuffer)
    })

    it('with named statement, portal, and values', function () {
      const actual = serialize.bind({
        portal: 'bang',
        statement: 'woo',
        values: ['1', 'hi', null, 'zing'],
      })
      const expectedBuffer = new BufferList()
        .addCString('bang') // portal name
        .addCString('woo') // statement name
        .addInt16(4)
        .addInt16(0)
        .addInt16(0)
        .addInt16(0)
        .addInt16(0)
        .addInt16(4)
        .addInt32(1)
        .add(Buffer.from('1'))
        .addInt32(2)
        .add(Buffer.from('hi'))
        .addInt32(-1)
        .addInt32(4)
        .add(Buffer.from('zing'))
        .addInt16(1)
        .addInt16(0)
        .join(true, 'B')
      assert.deepEqual(actual, expectedBuffer)
    })
  })

  it('with custom valueMapper', function () {
    const actual = serialize.bind({
      portal: 'bang',
      statement: 'woo',
      values: ['1', 'hi', null, 'zing'],
      valueMapper: () => null,
    })
    const expectedBuffer = new BufferList()
      .addCString('bang') // portal name
      .addCString('woo') // statement name
      .addInt16(4)
      .addInt16(0)
      .addInt16(0)
      .addInt16(0)
      .addInt16(0)
      .addInt16(4)
      .addInt32(-1)
      .addInt32(-1)
      .addInt32(-1)
      .addInt32(-1)
      .addInt16(1)
      .addInt16(0)
      .join(true, 'B')
    assert.deepEqual(actual, expectedBuffer)
  })

  it('with named statement, portal, and buffer value', function () {
    const actual = serialize.bind({
      portal: 'bang',
      statement: 'woo',
      values: ['1', 'hi', null, Buffer.from('zing', 'utf8')],
    })
    const expectedBuffer = new BufferList()
      .addCString('bang') // portal name
      .addCString('woo') // statement name
      .addInt16(4) // value count
      .addInt16(0) // string
      .addInt16(0) // string
      .addInt16(0) // string
      .addInt16(1) // binary
      .addInt16(4)
      .addInt32(1)
      .add(Buffer.from('1'))
      .addInt32(2)
      .add(Buffer.from('hi'))
      .addInt32(-1)
      .addInt32(4)
      .add(Buffer.from('zing', 'utf-8'))
      .addInt16(1)
      .addInt16(0)
      .join(true, 'B')
    assert.deepEqual(actual, expectedBuffer)
  })

  describe('builds execute message', function () {
    it('for unamed portal with no row limit', function () {
      const actual = serialize.execute()
      const expectedBuffer = new BufferList().addCString('').addInt32(0).join(true, 'E')
      assert.deepEqual(actual, expectedBuffer)
    })

    it('for named portal with row limit', function () {
|
||||
const actual = serialize.execute({
|
||||
portal: 'my favorite portal',
|
||||
rows: 100,
|
||||
})
|
||||
const expectedBuffer = new BufferList().addCString('my favorite portal').addInt32(100).join(true, 'E')
|
||||
assert.deepEqual(actual, expectedBuffer)
|
||||
})
|
||||
})
|
||||
|
||||
it('builds flush command', function () {
|
||||
const actual = serialize.flush()
|
||||
const expected = new BufferList().join(true, 'H')
|
||||
assert.deepEqual(actual, expected)
|
||||
})
|
||||
|
||||
it('builds sync command', function () {
|
||||
const actual = serialize.sync()
|
||||
const expected = new BufferList().join(true, 'S')
|
||||
assert.deepEqual(actual, expected)
|
||||
})
|
||||
|
||||
it('builds end command', function () {
|
||||
const actual = serialize.end()
|
||||
const expected = Buffer.from([0x58, 0, 0, 0, 4])
|
||||
assert.deepEqual(actual, expected)
|
||||
})
|
||||
|
||||
describe('builds describe command', function () {
|
||||
it('describe statement', function () {
|
||||
const actual = serialize.describe({ type: 'S', name: 'bang' })
|
||||
const expected = new BufferList().addChar('S').addCString('bang').join(true, 'D')
|
||||
assert.deepEqual(actual, expected)
|
||||
})
|
||||
|
||||
it('describe unnamed portal', function () {
|
||||
const actual = serialize.describe({ type: 'P' })
|
||||
const expected = new BufferList().addChar('P').addCString('').join(true, 'D')
|
||||
assert.deepEqual(actual, expected)
|
||||
})
|
||||
})
|
||||
|
||||
describe('builds close command', function () {
|
||||
it('describe statement', function () {
|
||||
const actual = serialize.close({ type: 'S', name: 'bang' })
|
||||
const expected = new BufferList().addChar('S').addCString('bang').join(true, 'C')
|
||||
assert.deepEqual(actual, expected)
|
||||
})
|
||||
|
||||
it('describe unnamed portal', function () {
|
||||
const actual = serialize.close({ type: 'P' })
|
||||
const expected = new BufferList().addChar('P').addCString('').join(true, 'C')
|
||||
assert.deepEqual(actual, expected)
|
||||
})
|
||||
})
|
||||
|
||||
describe('copy messages', function () {
|
||||
it('builds copyFromChunk', () => {
|
||||
const actual = serialize.copyData(Buffer.from([1, 2, 3]))
|
||||
const expected = new BufferList().add(Buffer.from([1, 2, 3])).join(true, 'd')
|
||||
assert.deepEqual(actual, expected)
|
||||
})
|
||||
|
||||
it('builds copy fail', () => {
|
||||
const actual = serialize.copyFail('err!')
|
||||
const expected = new BufferList().addCString('err!').join(true, 'f')
|
||||
assert.deepEqual(actual, expected)
|
||||
})
|
||||
|
||||
it('builds copy done', () => {
|
||||
const actual = serialize.copyDone()
|
||||
const expected = new BufferList().join(true, 'c')
|
||||
assert.deepEqual(actual, expected)
|
||||
})
|
||||
})
|
||||
|
||||
it('builds cancel message', () => {
|
||||
const actual = serialize.cancel(3, 4)
|
||||
const expected = new BufferList().addInt16(1234).addInt16(5678).addInt32(3).addInt32(4).join(true)
|
||||
assert.deepEqual(actual, expected)
|
||||
})
|
||||
})
|
||||
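The tests above all assert the same frontend framing: a one-byte message code, an Int32BE length that counts itself but not the code, then the payload. A minimal sketch using only plain Node `Buffer` calls (no pg-protocol imports; `frameQuery` is a hypothetical illustration, not a library export):

```typescript
// Hypothetical helper showing the framing the tests assert:
// [code byte][Int32BE length, includes itself but NOT the code][payload]
function frameQuery(text: string): Buffer {
  const payload = Buffer.from(text + '\0', 'utf8') // query text is null-terminated
  const buf = Buffer.alloc(1 + 4 + payload.length)
  buf[0] = 0x51 // 'Q' — the simple Query message code
  buf.writeInt32BE(4 + payload.length, 1) // 4 bytes for the length field itself + payload
  payload.copy(buf, 5)
  return buf
}
```

This is the same layout `BufferList.join(true, 'Q')` produces in the expected values above: the `true` prepends the self-inclusive length, and the character prepends the code byte.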
413 node_modules/pg-protocol/src/parser.ts (generated, vendored) Normal file
@@ -0,0 +1,413 @@
import { TransformOptions } from 'stream'
import {
  Mode,
  bindComplete,
  parseComplete,
  closeComplete,
  noData,
  portalSuspended,
  copyDone,
  replicationStart,
  emptyQuery,
  ReadyForQueryMessage,
  CommandCompleteMessage,
  CopyDataMessage,
  CopyResponse,
  NotificationResponseMessage,
  RowDescriptionMessage,
  ParameterDescriptionMessage,
  Field,
  DataRowMessage,
  ParameterStatusMessage,
  BackendKeyDataMessage,
  DatabaseError,
  BackendMessage,
  MessageName,
  AuthenticationMD5Password,
  NoticeMessage,
} from './messages'
import { BufferReader } from './buffer-reader'

// every message is prefixed with a single byte code
const CODE_LENGTH = 1
// every message has an int32 length which includes itself but does
// NOT include the code in the length
const LEN_LENGTH = 4

const HEADER_LENGTH = CODE_LENGTH + LEN_LENGTH

// A placeholder for a `BackendMessage`'s length value that will be set after construction.
const LATEINIT_LENGTH = -1

export type Packet = {
  code: number
  packet: Buffer
}

const emptyBuffer = Buffer.allocUnsafe(0)

type StreamOptions = TransformOptions & {
  mode: Mode
}

const enum MessageCodes {
  DataRow = 0x44, // D
  ParseComplete = 0x31, // 1
  BindComplete = 0x32, // 2
  CloseComplete = 0x33, // 3
  CommandComplete = 0x43, // C
  ReadyForQuery = 0x5a, // Z
  NoData = 0x6e, // n
  NotificationResponse = 0x41, // A
  AuthenticationResponse = 0x52, // R
  ParameterStatus = 0x53, // S
  BackendKeyData = 0x4b, // K
  ErrorMessage = 0x45, // E
  NoticeMessage = 0x4e, // N
  RowDescriptionMessage = 0x54, // T
  ParameterDescriptionMessage = 0x74, // t
  PortalSuspended = 0x73, // s
  ReplicationStart = 0x57, // W
  EmptyQuery = 0x49, // I
  CopyIn = 0x47, // G
  CopyOut = 0x48, // H
  CopyDone = 0x63, // c
  CopyData = 0x64, // d
}

export type MessageCallback = (msg: BackendMessage) => void

export class Parser {
  private buffer: Buffer = emptyBuffer
  private bufferLength: number = 0
  private bufferOffset: number = 0
  private reader = new BufferReader()
  private mode: Mode

  constructor(opts?: StreamOptions) {
    if (opts?.mode === 'binary') {
      throw new Error('Binary mode not supported yet')
    }
    this.mode = opts?.mode || 'text'
  }

  public parse(buffer: Buffer, callback: MessageCallback) {
    this.mergeBuffer(buffer)
    const bufferFullLength = this.bufferOffset + this.bufferLength
    let offset = this.bufferOffset
    while (offset + HEADER_LENGTH <= bufferFullLength) {
      // code is 1 byte long - it identifies the message type
      const code = this.buffer[offset]
      // length is 1 Uint32BE - it is the length of the message EXCLUDING the code
      const length = this.buffer.readUInt32BE(offset + CODE_LENGTH)
      const fullMessageLength = CODE_LENGTH + length
      if (fullMessageLength + offset <= bufferFullLength) {
        const message = this.handlePacket(offset + HEADER_LENGTH, code, length, this.buffer)
        callback(message)
        offset += fullMessageLength
      } else {
        break
      }
    }
    if (offset === bufferFullLength) {
      // No more use for the buffer
      this.buffer = emptyBuffer
      this.bufferLength = 0
      this.bufferOffset = 0
    } else {
      // Adjust the cursors of remainingBuffer
      this.bufferLength = bufferFullLength - offset
      this.bufferOffset = offset
    }
  }

  private mergeBuffer(buffer: Buffer): void {
    if (this.bufferLength > 0) {
      const newLength = this.bufferLength + buffer.byteLength
      const newFullLength = newLength + this.bufferOffset
      if (newFullLength > this.buffer.byteLength) {
        // We can't concat the new buffer with the remaining one
        let newBuffer: Buffer
        if (newLength <= this.buffer.byteLength && this.bufferOffset >= this.bufferLength) {
          // We can move the relevant part to the beginning of the buffer instead of allocating a new buffer
          newBuffer = this.buffer
        } else {
          // Allocate a new larger buffer
          let newBufferLength = this.buffer.byteLength * 2
          while (newLength >= newBufferLength) {
            newBufferLength *= 2
          }
          newBuffer = Buffer.allocUnsafe(newBufferLength)
        }
        // Move the remaining buffer to the new one
        this.buffer.copy(newBuffer, 0, this.bufferOffset, this.bufferOffset + this.bufferLength)
        this.buffer = newBuffer
        this.bufferOffset = 0
      }
      // Concat the new buffer with the remaining one
      buffer.copy(this.buffer, this.bufferOffset + this.bufferLength)
      this.bufferLength = newLength
    } else {
      this.buffer = buffer
      this.bufferOffset = 0
      this.bufferLength = buffer.byteLength
    }
  }

  private handlePacket(offset: number, code: number, length: number, bytes: Buffer): BackendMessage {
    const { reader } = this

    // NOTE: This undesirably retains the buffer in `this.reader` if the `parse*Message` calls below throw.
    // However, those should only throw in the case of a protocol error, which normally results in the
    // reader being discarded.
    reader.setBuffer(offset, bytes)

    let message: BackendMessage

    switch (code) {
      case MessageCodes.BindComplete:
        message = bindComplete
        break
      case MessageCodes.ParseComplete:
        message = parseComplete
        break
      case MessageCodes.CloseComplete:
        message = closeComplete
        break
      case MessageCodes.NoData:
        message = noData
        break
      case MessageCodes.PortalSuspended:
        message = portalSuspended
        break
      case MessageCodes.CopyDone:
        message = copyDone
        break
      case MessageCodes.ReplicationStart:
        message = replicationStart
        break
      case MessageCodes.EmptyQuery:
        message = emptyQuery
        break
      case MessageCodes.DataRow:
        message = parseDataRowMessage(reader)
        break
      case MessageCodes.CommandComplete:
        message = parseCommandCompleteMessage(reader)
        break
      case MessageCodes.ReadyForQuery:
        message = parseReadyForQueryMessage(reader)
        break
      case MessageCodes.NotificationResponse:
        message = parseNotificationMessage(reader)
        break
      case MessageCodes.AuthenticationResponse:
        message = parseAuthenticationResponse(reader, length)
        break
      case MessageCodes.ParameterStatus:
        message = parseParameterStatusMessage(reader)
        break
      case MessageCodes.BackendKeyData:
        message = parseBackendKeyData(reader)
        break
      case MessageCodes.ErrorMessage:
        message = parseErrorMessage(reader, 'error')
        break
      case MessageCodes.NoticeMessage:
        message = parseErrorMessage(reader, 'notice')
        break
      case MessageCodes.RowDescriptionMessage:
        message = parseRowDescriptionMessage(reader)
        break
      case MessageCodes.ParameterDescriptionMessage:
        message = parseParameterDescriptionMessage(reader)
        break
      case MessageCodes.CopyIn:
        message = parseCopyInMessage(reader)
        break
      case MessageCodes.CopyOut:
        message = parseCopyOutMessage(reader)
        break
      case MessageCodes.CopyData:
        message = parseCopyData(reader, length)
        break
      default:
        return new DatabaseError('received invalid response: ' + code.toString(16), length, 'error')
    }

    reader.setBuffer(0, emptyBuffer)

    message.length = length
    return message
  }
}

const parseReadyForQueryMessage = (reader: BufferReader) => {
  const status = reader.string(1)
  return new ReadyForQueryMessage(LATEINIT_LENGTH, status)
}

const parseCommandCompleteMessage = (reader: BufferReader) => {
  const text = reader.cstring()
  return new CommandCompleteMessage(LATEINIT_LENGTH, text)
}

const parseCopyData = (reader: BufferReader, length: number) => {
  const chunk = reader.bytes(length - 4)
  return new CopyDataMessage(LATEINIT_LENGTH, chunk)
}

const parseCopyInMessage = (reader: BufferReader) => parseCopyMessage(reader, 'copyInResponse')

const parseCopyOutMessage = (reader: BufferReader) => parseCopyMessage(reader, 'copyOutResponse')

const parseCopyMessage = (reader: BufferReader, messageName: MessageName) => {
  const isBinary = reader.byte() !== 0
  const columnCount = reader.int16()
  const message = new CopyResponse(LATEINIT_LENGTH, messageName, isBinary, columnCount)
  for (let i = 0; i < columnCount; i++) {
    message.columnTypes[i] = reader.int16()
  }
  return message
}

const parseNotificationMessage = (reader: BufferReader) => {
  const processId = reader.int32()
  const channel = reader.cstring()
  const payload = reader.cstring()
  return new NotificationResponseMessage(LATEINIT_LENGTH, processId, channel, payload)
}

const parseRowDescriptionMessage = (reader: BufferReader) => {
  const fieldCount = reader.int16()
  const message = new RowDescriptionMessage(LATEINIT_LENGTH, fieldCount)
  for (let i = 0; i < fieldCount; i++) {
    message.fields[i] = parseField(reader)
  }
  return message
}

const parseField = (reader: BufferReader) => {
  const name = reader.cstring()
  const tableID = reader.uint32()
  const columnID = reader.int16()
  const dataTypeID = reader.uint32()
  const dataTypeSize = reader.int16()
  const dataTypeModifier = reader.int32()
  const mode = reader.int16() === 0 ? 'text' : 'binary'
  return new Field(name, tableID, columnID, dataTypeID, dataTypeSize, dataTypeModifier, mode)
}

const parseParameterDescriptionMessage = (reader: BufferReader) => {
  const parameterCount = reader.int16()
  const message = new ParameterDescriptionMessage(LATEINIT_LENGTH, parameterCount)
  for (let i = 0; i < parameterCount; i++) {
    message.dataTypeIDs[i] = reader.int32()
  }
  return message
}

const parseDataRowMessage = (reader: BufferReader) => {
  const fieldCount = reader.int16()
  const fields: any[] = new Array(fieldCount)
  for (let i = 0; i < fieldCount; i++) {
    const len = reader.int32()
    // a -1 for length means the value of the field is null
    fields[i] = len === -1 ? null : reader.string(len)
  }
  return new DataRowMessage(LATEINIT_LENGTH, fields)
}

const parseParameterStatusMessage = (reader: BufferReader) => {
  const name = reader.cstring()
  const value = reader.cstring()
  return new ParameterStatusMessage(LATEINIT_LENGTH, name, value)
}

const parseBackendKeyData = (reader: BufferReader) => {
  const processID = reader.int32()
  const secretKey = reader.int32()
  return new BackendKeyDataMessage(LATEINIT_LENGTH, processID, secretKey)
}

const parseAuthenticationResponse = (reader: BufferReader, length: number) => {
  const code = reader.int32()
  // TODO(bmc): maybe better types here
  const message: BackendMessage & any = {
    name: 'authenticationOk',
    length,
  }

  switch (code) {
    case 0: // AuthenticationOk
      break
    case 3: // AuthenticationCleartextPassword
      if (message.length === 8) {
        message.name = 'authenticationCleartextPassword'
      }
      break
    case 5: // AuthenticationMD5Password
      if (message.length === 12) {
        message.name = 'authenticationMD5Password'
        const salt = reader.bytes(4)
        return new AuthenticationMD5Password(LATEINIT_LENGTH, salt)
      }
      break
    case 10: // AuthenticationSASL
      {
        message.name = 'authenticationSASL'
        message.mechanisms = []
        let mechanism: string
        do {
          mechanism = reader.cstring()
          if (mechanism) {
            message.mechanisms.push(mechanism)
          }
        } while (mechanism)
      }
      break
    case 11: // AuthenticationSASLContinue
      message.name = 'authenticationSASLContinue'
      message.data = reader.string(length - 8)
      break
    case 12: // AuthenticationSASLFinal
      message.name = 'authenticationSASLFinal'
      message.data = reader.string(length - 8)
      break
    default:
      throw new Error('Unknown authenticationOk message type ' + code)
  }
  return message
}

const parseErrorMessage = (reader: BufferReader, name: MessageName) => {
  const fields: Record<string, string> = {}
  let fieldType = reader.string(1)
  while (fieldType !== '\0') {
    fields[fieldType] = reader.cstring()
    fieldType = reader.string(1)
  }

  const messageValue = fields.M

  const message =
    name === 'notice'
      ? new NoticeMessage(LATEINIT_LENGTH, messageValue)
      : new DatabaseError(messageValue, LATEINIT_LENGTH, name)

  message.severity = fields.S
  message.code = fields.C
  message.detail = fields.D
  message.hint = fields.H
  message.position = fields.P
  message.internalPosition = fields.p
  message.internalQuery = fields.q
  message.where = fields.W
  message.schema = fields.s
  message.table = fields.t
  message.column = fields.c
  message.dataType = fields.d
  message.constraint = fields.n
  message.file = fields.F
  message.line = fields.L
  message.routine = fields.R
  return message
}
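The scan loop in `Parser.parse` above can be sketched in isolation: a message is consumed only once all `CODE_LENGTH + length` bytes have arrived, and any partial tail is carried over for the next chunk. Here `splitPackets` is a hypothetical, stateless stand-in for the class (same header layout, none of the buffer-reuse optimizations from `mergeBuffer`):

```typescript
// Hypothetical stateless version of the Parser.parse scan loop above.
function splitPackets(buffer: Buffer): { packets: Array<{ code: number; body: Buffer }>; rest: Buffer } {
  const packets: Array<{ code: number; body: Buffer }> = []
  let offset = 0
  // need at least the 5-byte header (1 code byte + 4 length bytes) to look at a message
  while (offset + 5 <= buffer.length) {
    const code = buffer[offset] // 1-byte message code
    const length = buffer.readUInt32BE(offset + 1) // counts itself but not the code
    if (offset + 1 + length > buffer.length) break // partial message: wait for more data
    packets.push({ code, body: buffer.subarray(offset + 5, offset + 1 + length) })
    offset += 1 + length
  }
  return { packets, rest: buffer.subarray(offset) } // rest would be merged into the next chunk
}
```

For example, a `ReadyForQuery` message is the 6 bytes `5a 00 00 00 05 49` ('Z', length 5, status 'I'); feeding only its first 4 bytes yields no packets and a 4-byte `rest`.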
274 node_modules/pg-protocol/src/serializer.ts (generated, vendored) Normal file
@@ -0,0 +1,274 @@
import { Writer } from './buffer-writer'

const enum code {
  startup = 0x70,
  query = 0x51,
  parse = 0x50,
  bind = 0x42,
  execute = 0x45,
  flush = 0x48,
  sync = 0x53,
  end = 0x58,
  close = 0x43,
  describe = 0x44,
  copyFromChunk = 0x64,
  copyDone = 0x63,
  copyFail = 0x66,
}

const writer = new Writer()

const startup = (opts: Record<string, string>): Buffer => {
  // protocol version
  writer.addInt16(3).addInt16(0)
  for (const key of Object.keys(opts)) {
    writer.addCString(key).addCString(opts[key])
  }

  writer.addCString('client_encoding').addCString('UTF8')

  const bodyBuffer = writer.addCString('').flush()
  // this message is sent without a code

  const length = bodyBuffer.length + 4

  return new Writer().addInt32(length).add(bodyBuffer).flush()
}

const requestSsl = (): Buffer => {
  const response = Buffer.allocUnsafe(8)
  response.writeInt32BE(8, 0)
  response.writeInt32BE(80877103, 4)
  return response
}

const password = (password: string): Buffer => {
  return writer.addCString(password).flush(code.startup)
}

const sendSASLInitialResponseMessage = function (mechanism: string, initialResponse: string): Buffer {
  // 0x70 = 'p'
  writer.addCString(mechanism).addInt32(Buffer.byteLength(initialResponse)).addString(initialResponse)

  return writer.flush(code.startup)
}

const sendSCRAMClientFinalMessage = function (additionalData: string): Buffer {
  return writer.addString(additionalData).flush(code.startup)
}

const query = (text: string): Buffer => {
  return writer.addCString(text).flush(code.query)
}

type ParseOpts = {
  name?: string
  types?: number[]
  text: string
}

const emptyArray: any[] = []

const parse = (query: ParseOpts): Buffer => {
  // expect something like this:
  // { name: 'queryName',
  //   text: 'select * from blah',
  //   types: ['int8', 'bool'] }

  // normalize missing query names to allow for null
  const name = query.name || ''
  if (name.length > 63) {
    console.error('Warning! Postgres only supports 63 characters for query names.')
    console.error('You supplied %s (%s)', name, name.length)
    console.error('This can cause conflicts and silent errors executing queries')
  }

  const types = query.types || emptyArray

  const len = types.length

  const buffer = writer
    .addCString(name) // name of query
    .addCString(query.text) // actual query text
    .addInt16(len)

  for (let i = 0; i < len; i++) {
    buffer.addInt32(types[i])
  }

  return writer.flush(code.parse)
}

type ValueMapper = (param: any, index: number) => any

type BindOpts = {
  portal?: string
  binary?: boolean
  statement?: string
  values?: any[]
  // optional map from JS value to postgres value per parameter
  valueMapper?: ValueMapper
}

const paramWriter = new Writer()

// make this a const enum so typescript will inline the value
const enum ParamType {
  STRING = 0,
  BINARY = 1,
}

const writeValues = function (values: any[], valueMapper?: ValueMapper): void {
  for (let i = 0; i < values.length; i++) {
    const mappedVal = valueMapper ? valueMapper(values[i], i) : values[i]
    if (mappedVal == null) {
      // add the param type (string) to the writer
      writer.addInt16(ParamType.STRING)
      // write -1 to the param writer to indicate null
      paramWriter.addInt32(-1)
    } else if (mappedVal instanceof Buffer) {
      // add the param type (binary) to the writer
      writer.addInt16(ParamType.BINARY)
      // add the buffer to the param writer
      paramWriter.addInt32(mappedVal.length)
      paramWriter.add(mappedVal)
    } else {
      // add the param type (string) to the writer
      writer.addInt16(ParamType.STRING)
      paramWriter.addInt32(Buffer.byteLength(mappedVal))
      paramWriter.addString(mappedVal)
    }
  }
}

const bind = (config: BindOpts = {}): Buffer => {
  // normalize config
  const portal = config.portal || ''
  const statement = config.statement || ''
  const binary = config.binary || false
  const values = config.values || emptyArray
  const len = values.length

  writer.addCString(portal).addCString(statement)
  writer.addInt16(len)

  writeValues(values, config.valueMapper)

  writer.addInt16(len)
  writer.add(paramWriter.flush())

  // all results use the same format code
  writer.addInt16(1)
  // format code
  writer.addInt16(binary ? ParamType.BINARY : ParamType.STRING)
  return writer.flush(code.bind)
}

type ExecOpts = {
  portal?: string
  rows?: number
}

const emptyExecute = Buffer.from([code.execute, 0x00, 0x00, 0x00, 0x09, 0x00, 0x00, 0x00, 0x00, 0x00])

const execute = (config?: ExecOpts): Buffer => {
  // this is the happy path for most queries
  if (!config || (!config.portal && !config.rows)) {
    return emptyExecute
  }

  const portal = config.portal || ''
  const rows = config.rows || 0

  const portalLength = Buffer.byteLength(portal)
  const len = 4 + portalLength + 1 + 4
  // one extra byte for the message code
  const buff = Buffer.allocUnsafe(1 + len)
  buff[0] = code.execute
  buff.writeInt32BE(len, 1)
  buff.write(portal, 5, 'utf-8')
  buff[portalLength + 5] = 0 // null terminate portal cString
  buff.writeUInt32BE(rows, buff.length - 4)
  return buff
}

const cancel = (processID: number, secretKey: number): Buffer => {
  const buffer = Buffer.allocUnsafe(16)
  buffer.writeInt32BE(16, 0)
  buffer.writeInt16BE(1234, 4)
  buffer.writeInt16BE(5678, 6)
  buffer.writeInt32BE(processID, 8)
  buffer.writeInt32BE(secretKey, 12)
  return buffer
}

type PortalOpts = {
  type: 'S' | 'P'
  name?: string
}

const cstringMessage = (code: code, string: string): Buffer => {
  const stringLen = Buffer.byteLength(string)
  const len = 4 + stringLen + 1
  // one extra byte for the message code
  const buffer = Buffer.allocUnsafe(1 + len)
  buffer[0] = code
  buffer.writeInt32BE(len, 1)
  buffer.write(string, 5, 'utf-8')
  buffer[len] = 0 // null terminate cString
  return buffer
}

const emptyDescribePortal = writer.addCString('P').flush(code.describe)
const emptyDescribeStatement = writer.addCString('S').flush(code.describe)

const describe = (msg: PortalOpts): Buffer => {
  return msg.name
    ? cstringMessage(code.describe, `${msg.type}${msg.name || ''}`)
    : msg.type === 'P'
      ? emptyDescribePortal
      : emptyDescribeStatement
}

const close = (msg: PortalOpts): Buffer => {
  const text = `${msg.type}${msg.name || ''}`
  return cstringMessage(code.close, text)
}

const copyData = (chunk: Buffer): Buffer => {
  return writer.add(chunk).flush(code.copyFromChunk)
}

const copyFail = (message: string): Buffer => {
  return cstringMessage(code.copyFail, message)
}

const codeOnlyBuffer = (code: code): Buffer => Buffer.from([code, 0x00, 0x00, 0x00, 0x04])

const flushBuffer = codeOnlyBuffer(code.flush)
const syncBuffer = codeOnlyBuffer(code.sync)
const endBuffer = codeOnlyBuffer(code.end)
const copyDoneBuffer = codeOnlyBuffer(code.copyDone)

const serialize = {
  startup,
  password,
  requestSsl,
  sendSASLInitialResponseMessage,
  sendSCRAMClientFinalMessage,
  query,
  parse,
  bind,
  execute,
  describe,
  close,
  flush: () => flushBuffer,
  sync: () => syncBuffer,
  end: () => endBuffer,
  copyData,
  copyDone: () => copyDoneBuffer,
  copyFail,
  cancel,
}

export { serialize }
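The Bind parameter layout that `writeValues` above emits can be shown per value: an Int32BE byte length followed by the raw bytes, with SQL NULL encoded as length -1 and no bytes at all. Here `encodeParam` is a hypothetical illustration of that per-value layout, not a library export:

```typescript
// Hypothetical per-value encoder matching the Bind value layout writeValues produces.
function encodeParam(value: string | null): Buffer {
  if (value == null) {
    const nul = Buffer.alloc(4)
    nul.writeInt32BE(-1, 0) // length -1 means SQL NULL; no value bytes follow
    return nul
  }
  const bytes = Buffer.from(value, 'utf8')
  const out = Buffer.alloc(4 + bytes.length)
  out.writeInt32BE(bytes.length, 0) // byte length of the value
  bytes.copy(out, 4)
  return out
}
```

This is why the bind tests earlier expect `.addInt32(-1)` with no following buffer for the `null` value, and `.addInt32(2).add(Buffer.from('hi'))` for the string `'hi'`.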
67 node_modules/pg-protocol/src/testing/buffer-list.ts (generated, vendored) Normal file
@@ -0,0 +1,67 @@
export default class BufferList {
  constructor(public buffers: Buffer[] = []) {}

  public add(buffer: Buffer, front?: boolean) {
    this.buffers[front ? 'unshift' : 'push'](buffer)
    return this
  }

  public addInt16(val: number, front?: boolean) {
    return this.add(Buffer.from([val >>> 8, val >>> 0]), front)
  }

  public getByteLength() {
    return this.buffers.reduce(function (previous, current) {
      return previous + current.length
    }, 0)
  }

  public addInt32(val: number, first?: boolean) {
    return this.add(
      Buffer.from([(val >>> 24) & 0xff, (val >>> 16) & 0xff, (val >>> 8) & 0xff, (val >>> 0) & 0xff]),
      first
    )
  }

  public addCString(val: string, front?: boolean) {
    const len = Buffer.byteLength(val)
    const buffer = Buffer.alloc(len + 1)
    buffer.write(val)
    buffer[len] = 0
    return this.add(buffer, front)
  }

  public addString(val: string, front?: boolean) {
    const len = Buffer.byteLength(val)
    const buffer = Buffer.alloc(len)
    buffer.write(val)
    return this.add(buffer, front)
  }

  public addChar(char: string, first?: boolean) {
    return this.add(Buffer.from(char, 'utf8'), first)
  }

  public addByte(byte: number) {
    return this.add(Buffer.from([byte]))
  }

  public join(appendLength?: boolean, char?: string): Buffer {
    let length = this.getByteLength()
    if (appendLength) {
      this.addInt32(length + 4, true)
      return this.join(false, char)
    }
    if (char) {
      this.addChar(char, true)
      length++
    }
    const result = Buffer.alloc(length)
    let index = 0
    this.buffers.forEach(function (buffer) {
      buffer.copy(result, index, 0)
      index += buffer.length
    })
    return result
  }
}
166 node_modules/pg-protocol/src/testing/test-buffers.ts (generated, vendored) Normal file
@@ -0,0 +1,166 @@
// https://www.postgresql.org/docs/current/protocol-message-formats.html
import BufferList from './buffer-list'

const buffers = {
  readyForQuery: function () {
    return new BufferList().add(Buffer.from('I')).join(true, 'Z')
  },

  authenticationOk: function () {
    return new BufferList().addInt32(0).join(true, 'R')
  },

  authenticationCleartextPassword: function () {
    return new BufferList().addInt32(3).join(true, 'R')
  },

  authenticationMD5Password: function () {
    return new BufferList()
      .addInt32(5)
      .add(Buffer.from([1, 2, 3, 4]))
      .join(true, 'R')
  },

  authenticationSASL: function () {
    return new BufferList().addInt32(10).addCString('SCRAM-SHA-256').addCString('').join(true, 'R')
  },

  authenticationSASLContinue: function () {
    return new BufferList().addInt32(11).addString('data').join(true, 'R')
  },

  authenticationSASLFinal: function () {
    return new BufferList().addInt32(12).addString('data').join(true, 'R')
  },

  parameterStatus: function (name: string, value: string) {
    return new BufferList().addCString(name).addCString(value).join(true, 'S')
  },

  backendKeyData: function (processID: number, secretKey: number) {
    return new BufferList().addInt32(processID).addInt32(secretKey).join(true, 'K')
  },

  commandComplete: function (string: string) {
    return new BufferList().addCString(string).join(true, 'C')
  },

  rowDescription: function (fields: any[]) {
    fields = fields || []
    const buf = new BufferList()
    buf.addInt16(fields.length)
    fields.forEach(function (field) {
      buf
        .addCString(field.name)
        .addInt32(field.tableID || 0)
        .addInt16(field.attributeNumber || 0)
        .addInt32(field.dataTypeID || 0)
        .addInt16(field.dataTypeSize || 0)
        .addInt32(field.typeModifier || 0)
        .addInt16(field.formatCode || 0)
    })
    return buf.join(true, 'T')
  },

  parameterDescription: function (dataTypeIDs: number[]) {
    dataTypeIDs = dataTypeIDs || []
    const buf = new BufferList()
    buf.addInt16(dataTypeIDs.length)
    dataTypeIDs.forEach(function (dataTypeID) {
      buf.addInt32(dataTypeID)
    })
    return buf.join(true, 't')
  },

  dataRow: function (columns: any[]) {
    columns = columns || []
    const buf = new BufferList()
    buf.addInt16(columns.length)
    columns.forEach(function (col) {
      if (col == null) {
        buf.addInt32(-1)
      } else {
        const strBuf = Buffer.from(col, 'utf8')
        buf.addInt32(strBuf.length)
        buf.add(strBuf)
      }
    })
    return buf.join(true, 'D')
  },

  error: function (fields: any) {
    return buffers.errorOrNotice(fields).join(true, 'E')
  },

  notice: function (fields: any) {
    return buffers.errorOrNotice(fields).join(true, 'N')
  },

  errorOrNotice: function (fields: any) {
    fields = fields || []
    const buf = new BufferList()
    fields.forEach(function (field: any) {
      buf.addChar(field.type)
      buf.addCString(field.value)
    })
    return buf.add(Buffer.from([0])) // terminator
  },

  parseComplete: function () {
    return new BufferList().join(true, '1')
  },

  bindComplete: function () {
    return new BufferList().join(true, '2')
  },

  notification: function (id: number, channel: string, payload: string) {
    return new BufferList().addInt32(id).addCString(channel).addCString(payload).join(true, 'A')
  },

  emptyQuery: function () {
    return new BufferList().join(true, 'I')
  },

  portalSuspended: function () {
    return new BufferList().join(true, 's')
  },

  closeComplete: function () {
    return new BufferList().join(true, '3')
  },

  copyIn: function (cols: number) {
    const list = new BufferList()
      // text mode
      .addByte(0)
      // column count
      .addInt16(cols)
    for (let i = 0; i < cols; i++) {
      list.addInt16(i)
    }
    return list.join(true, 'G')
  },

  copyOut: function (cols: number) {
    const list = new BufferList()
      // text mode
      .addByte(0)
      // column count
      .addInt16(cols)
    for (let i = 0; i < cols; i++) {
      list.addInt16(i)
    }
    return list.join(true, 'H')
  },

  copyData: function (bytes: Buffer) {
    return new BufferList().add(bytes).join(true, 'd')
  },

  copyDone: function () {
    return new BufferList().join(true, 'c')
  },
}

export default buffers
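The `dataRow` helper above shows the DataRow ('D') layout: an int16 column count, then per column an int32 byte length (-1 for NULL) followed by the raw bytes, all framed by `join(true, 'D')`. A standalone sketch of the same layout for a single text column `'hello'`, built with plain `Buffer` calls (hypothetical illustration, not part of the vendored module):

```typescript
// Body: int16 column count, int32 column byte length, column bytes.
const col = Buffer.from('hello', 'utf8')
const body = Buffer.alloc(2 + 4 + col.length)
body.writeInt16BE(1, 0)          // one column
body.writeInt32BE(col.length, 2) // column byte length (5)
col.copy(body, 6)                // column bytes

// Header: type byte 'D', then an int32 length counting itself + body
// (the type byte is excluded), matching BufferList.join(true, 'D').
const header = Buffer.alloc(5)
header.write('D', 0)
header.writeInt32BE(body.length + 4, 1)
const msg = Buffer.concat([header, body])
// layout: 'D' | 00 00 00 0f | 00 01 | 00 00 00 05 | 'hello'
```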
||||
1
node_modules/pg-protocol/src/types/chunky.d.ts
generated
vendored
Normal file
@@ -0,0 +1 @@
declare module 'chunky'