Mirror of https://github.com/mountain-loop/yaak.git (synced 2026-02-02 02:32:07 -05:00)

Comparing 1 commit (0cb42dedc9): actions-sy... → copilot/cr...
@@ -1,72 +0,0 @@
# Claude Context: Detaching Tauri from Yaak

## Goal

Make Yaak runnable as a standalone CLI without Tauri as a dependency. The core Rust crates in `crates/` should be usable independently, while Tauri-specific code lives in `crates-tauri/`.

## Project Structure

```
crates/        # Core crates - should NOT depend on Tauri
crates-tauri/  # Tauri-specific crates (yaak-app, yaak-tauri-utils, etc.)
crates-cli/    # CLI crate (yaak-cli)
```

## Completed Work

### 1. Folder Restructure

- Moved Tauri-dependent app code to `crates-tauri/yaak-app/`
- Created `crates-tauri/yaak-tauri-utils/` for shared Tauri utilities (window traits, api_client, error handling)
- Created `crates-cli/yaak-cli/` for the standalone CLI

### 2. Decoupled Crates (no longer depend on Tauri)

- **yaak-models**: Uses the `init_standalone()` pattern for CLI database access
- **yaak-http**: Removed the Tauri plugin; HttpConnectionManager is initialized in yaak-app setup
- **yaak-common**: Contains only Tauri-free utilities (serde, platform)
- **yaak-crypto**: Removed the Tauri plugin; EncryptionManager is initialized in yaak-app setup; commands moved to yaak-app
- **yaak-grpc**: Replaced AppHandle with a GrpcConfig struct; uses tokio::process::Command instead of the Tauri sidecar

### 3. CLI Implementation

- Basic CLI at `crates-cli/yaak-cli/src/main.rs`
- Commands: workspaces, requests, send (by ID), get (ad-hoc URL), create
- Uses the same database as the Tauri app via `yaak_models::init_standalone()`

## Remaining Work

### Crates Still Depending on Tauri (in `crates/`)

1. **yaak-git** (3 files) - Moderate complexity
2. **yaak-plugins** (13 files) - **Hardest** - deeply integrated with Tauri for plugin-to-window communication
3. **yaak-sync** (4 files) - Moderate complexity
4. **yaak-ws** (5 files) - Moderate complexity

### Pattern for Decoupling

1. Remove the Tauri plugin `init()` function from the crate
2. Move commands to `yaak-app/src/commands.rs` or keep them inline in `lib.rs`
3. Move extension traits (e.g., `SomethingManagerExt`) to yaak-app or yaak-tauri-utils
4. Initialize managers in yaak-app's `.setup()` block (see the sketch after this list)
5. Remove `tauri` from the Cargo.toml dependencies
6. Update `crates-tauri/yaak-app/capabilities/default.json` to remove the plugin permission
7. Replace `tauri::async_runtime::block_on` with `tokio::runtime::Handle::current().block_on()`
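To make steps 3 and 4 concrete, here is a minimal sketch using a hypothetical `FooManager` standing in for EncryptionManager, HttpConnectionManager, etc. (the trait shape mirrors the `EncryptionManagerExt` trait that appears further down in this diff):

```rust
// crates/yaak-foo/src/lib.rs -- core crate, no Tauri dependency:
pub struct FooManager;

// crates-tauri/yaak-app/src/lib.rs (or yaak-tauri-utils) -- Tauri glue:
use tauri::{Manager, Runtime, State};

/// Step 3: an extension trait so any Tauri `Manager` can fetch the
/// manager from managed state (the `SomethingManagerExt` pattern).
pub trait FooManagerExt<'a, R> {
    fn foo(&'a self) -> State<'a, FooManager>;
}

impl<'a, R: Runtime, M: Manager<R>> FooManagerExt<'a, R> for M {
    fn foo(&'a self) -> State<'a, FooManager> {
        self.state::<FooManager>()
    }
}

// Step 4: initialize the manager in yaak-app's `.setup()` block instead
// of a plugin `init()` function:
fn build() -> tauri::Builder<tauri::Wry> {
    tauri::Builder::default().setup(|app| {
        app.manage(FooManager);
        Ok(())
    })
}
```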
## Key Files

- `crates-tauri/yaak-app/src/lib.rs` - Main Tauri app; the setup block initializes managers
- `crates-tauri/yaak-app/src/commands.rs` - Migrated Tauri commands
- `crates-tauri/yaak-app/src/models_ext.rs` - Database plugin and extension traits
- `crates-tauri/yaak-tauri-utils/src/window.rs` - WorkspaceWindowTrait for window state
- `crates/yaak-models/src/lib.rs` - Contains `init_standalone()` for CLI usage

## Git Branch

Working on the `detach-tauri` branch.

## Recent Commits

```
c40cff40 Remove Tauri dependencies from yaak-crypto and yaak-grpc
df495f1d Move Tauri utilities from yaak-common to yaak-tauri-utils
481e0273 Remove Tauri dependencies from yaak-http and yaak-common
10568ac3 Add HTTP request sending to yaak-cli
bcb7d600 Add yaak-cli stub with basic database access
e718a5f1 Refactor models_ext to use init_standalone from yaak-models
```

## Testing

- Run `cargo check -p <crate>` to verify a crate builds without Tauri
- Run `npm run app-dev` to test that the Tauri app still works
- Run `cargo run -p yaak-cli -- --help` to test the CLI
@@ -1,51 +0,0 @@
---
description: Review a PR in a new worktree
allowed-tools: Bash(git worktree:*), Bash(gh pr:*)
---

Review a GitHub pull request in a new git worktree.

## Usage

```
/review-pr <PR_NUMBER>
```

## What to do

1. List all open pull requests and ask the user to select one
2. Get the PR information using `gh pr view <PR_NUMBER> --json number,headRefName`
3. Extract the branch name from the PR
4. Create a new worktree at `../yaak-worktrees/pr-<PR_NUMBER>` using `git worktree add`, with a timeout of at least 300000ms (5 minutes), since the post-checkout hook runs a bootstrap script
5. Check out the PR branch in the new worktree using `gh pr checkout <PR_NUMBER>` (steps 2-5 are sketched below)
6. The post-checkout hook will automatically:
   - Create `.env.local` with unique ports
   - Copy editor config folders
   - Run `npm install && npm run bootstrap`
7. Inform the user:
   - Where the worktree was created
   - What ports were assigned
   - How to access it (cd command)
   - How to run the dev server
   - How to remove the worktree when done
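A rough shell equivalent of steps 2-5 (PR number 123 is a placeholder; assumes `gh` and `git` are on the PATH):

```bash
PR=123
# Steps 2-3: fetch the PR metadata and extract the head branch name
BRANCH=$(gh pr view "$PR" --json headRefName --jq '.headRefName')
echo "PR #$PR -> branch $BRANCH"

# Step 4: create the worktree (the post-checkout hook bootstraps it, so allow ~5 minutes)
git worktree add "../yaak-worktrees/pr-$PR"

# Step 5: check out the PR branch inside the new worktree
cd "../yaak-worktrees/pr-$PR"
gh pr checkout "$PR"
```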
## Example Output

```
Created worktree for PR #123 at ../yaak-worktrees/pr-123
Branch: feature-auth
Ports: Vite (1421), MCP (64344)

To start working:
cd ../yaak-worktrees/pr-123
npm run app-dev

To remove when done:
git worktree remove ../yaak-worktrees/pr-123
```

## Error Handling

- If the PR doesn't exist, show a helpful error
- If the worktree already exists, inform the user and ask if they want to remove and recreate it
- If the `gh` CLI is not available, tell the user to install it
@@ -1,47 +0,0 @@
---
description: Generate formatted release notes for Yaak releases
allowed-tools: Bash(git tag:*)
---

Generate formatted release notes for Yaak releases by analyzing git history and pull request descriptions.

## What to do

1. Identify the version tag and the previous version (sketched below)
2. Retrieve all commits between the versions
   - For a beta version, retrieve commits between it and the previous beta version
   - For a stable version, retrieve commits between it and the previous stable version
3. Fetch PR descriptions for linked issues to find:
   - Feedback URLs (feedback.yaak.app)
   - Additional context and descriptions
   - Installation links for plugins
4. Format the release notes using the standard Yaak format:
   - Changelog badge at the top
   - Bulleted list of changes with PR links
   - Feedback links where available
   - Full changelog comparison link at the bottom
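For step 1, the version tags can be listed with `git tag` (a sketch; assumes tags follow v-prefixed semver with `-beta` suffixes for prereleases):

```bash
# Newest tags first; the current and previous versions are the top entries
git tag --sort=-v:refname | head -n 5

# Beta tags can be filtered explicitly, e.g.:
git tag --sort=-v:refname | grep -- '-beta' | head -n 2
```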
## Output Format

The skill generates markdown-formatted release notes following this structure:

```markdown
[](https://yaak.app/changelog/VERSION)

- Feature/fix description by @username in [#123](https://github.com/mountain-loop/yaak/pull/123)
- [Linked feedback item](https://feedback.yaak.app/p/item) by @username in [#456](https://github.com/mountain-loop/yaak/pull/456)
- A simple item that doesn't have a feedback or PR link

**Full Changelog**: https://github.com/mountain-loop/yaak/compare/vPREV...vCURRENT
```

**IMPORTANT**: Always add blank lines around the markdown code fence and output the markdown code block last
**IMPORTANT**: PRs by `@gschier` should not mention the @username

## After Generating Release Notes

After outputting the release notes, ask the user if they would like to create a draft GitHub release with these notes. If they confirm, create the release using:

```bash
gh release create <tag> --draft --prerelease --title "<tag>" --notes '<release notes>'
```
@@ -1,27 +0,0 @@
# Project Rules

## General Development

- **NEVER** commit or push without explicit confirmation

## Build and Lint

- **ALWAYS** run `npm run lint` after modifying TypeScript or JavaScript files
- Run `npm run bootstrap` after changing plugin runtime or MCP server code

## Plugin System

### Backend Constraints

- Always use `UpdateSource::Plugin` when calling database methods from plugin events
- Never send timestamps (`createdAt`, `updatedAt`) from TypeScript - the Rust backend controls these
- The backend uses `NaiveDateTime` (no timezone), so avoid sending ISO timestamp strings (see the sketch below)
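As an illustration of the timestamp rule, a plugin-side update might look like this (`ctx.httpRequest.update` is an assumed API shape for illustration, not a confirmed one):

```typescript
// Update a request from a plugin event. The Rust backend owns
// createdAt/updatedAt (NaiveDateTime), so no timestamps are sent.
await ctx.httpRequest.update({
  id: request.id,
  name: 'Renamed from plugin',
  // createdAt / updatedAt deliberately omitted
});
```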
### MCP Server

- The MCP server has **no active window context** - it cannot call `window.workspaceId()`
- Get the workspace ID from `workspaceCtx.yaak.workspace.list()` instead, as sketched below
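A minimal sketch of the workspace lookup (assumes `list()` resolves to an array of workspace objects with an `id` field):

```typescript
// In MCP server code there is no active window, so resolve the
// workspace from the context instead of calling window.workspaceId().
const workspaces = await workspaceCtx.yaak.workspace.list();
const workspaceId = workspaces[0]?.id;
if (workspaceId == null) {
  throw new Error('No workspace available');
}
```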
## Rust Type Generation

- Run `cargo test --package yaak-plugins` (and the equivalent for other crates) to regenerate TypeScript bindings after modifying Rust event types
@@ -1,35 +0,0 @@
# Worktree Management Skill

## Creating Worktrees

When creating git worktrees for this project, ALWAYS use the path format:

```
../yaak-worktrees/<NAME>
```

For example:
- `git worktree add ../yaak-worktrees/feature-auth`
- `git worktree add ../yaak-worktrees/bugfix-login`
- `git worktree add ../yaak-worktrees/refactor-api`

## What Happens Automatically

The post-checkout hook will automatically:
1. Create `.env.local` with unique ports (YAAK_DEV_PORT and YAAK_PLUGIN_MCP_SERVER_PORT)
2. Copy gitignored editor config folders (.zed, .idea, etc.)
3. Run `npm install && npm run bootstrap`

## Deleting Worktrees

```bash
git worktree remove ../yaak-worktrees/<NAME>
```

## Port Assignments

- Main worktree: 1420 (Vite), 64343 (MCP)
- First worktree: 1421, 64344
- Second worktree: 1422, 64345
- etc.

Each worktree can run `npm run app-dev` simultaneously without conflicts; an example `.env.local` is shown below.
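For example, the generated `.env.local` for the first worktree would presumably contain (values from the port table above, assuming YAAK_DEV_PORT maps to the Vite port):

```
YAAK_DEV_PORT=1421
YAAK_PLUGIN_MCP_SERVER_PORT=64344
```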
.gitattributes (8 changed lines)

@@ -1,7 +1,5 @@
crates-tauri/yaak-app/vendored/**/* linguist-generated=true
crates-tauri/yaak-app/gen/schemas/**/* linguist-generated=true
**/bindings/* linguist-generated=true
crates/yaak-templates/pkg/* linguist-generated=true
src-tauri/vendored/**/* linguist-generated=true
src-tauri/gen/schemas/**/* linguist-generated=true

# Ensure consistent line endings for test files that check exact content
crates/yaak-http/tests/test.txt text eol=lf
src-tauri/yaak-http/tests/test.txt text eol=lf
.github/workflows/ci.yml (3 changed lines)

@@ -18,13 +18,14 @@ jobs:
      - uses: dtolnay/rust-toolchain@stable
      - uses: Swatinem/rust-cache@v2
        with:
          workspaces: 'src-tauri'
          shared-key: ci
          cache-on-failure: true

      - run: npm ci
      - run: npm run bootstrap
      - run: npm run lint
      - name: Run JS Tests
        run: npm test
      - name: Run Rust Tests
        run: cargo test --all
        working-directory: src-tauri
.github/workflows/claude.yml (50 changed lines)

@@ -1,50 +0,0 @@
name: Claude Code

on:
  issue_comment:
    types: [created]
  pull_request_review_comment:
    types: [created]
  issues:
    types: [opened, assigned]
  pull_request_review:
    types: [submitted]

jobs:
  claude:
    if: |
      (github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
      (github.event_name == 'pull_request_review_comment' && contains(github.event.comment.body, '@claude')) ||
      (github.event_name == 'pull_request_review' && contains(github.event.review.body, '@claude')) ||
      (github.event_name == 'issues' && (contains(github.event.issue.body, '@claude') || contains(github.event.issue.title, '@claude')))
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: read
      issues: read
      id-token: write
      actions: read # Required for Claude to read CI results on PRs
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 1

      - name: Run Claude Code
        id: claude
        uses: anthropics/claude-code-action@v1
        with:
          claude_code_oauth_token: ${{ secrets.CLAUDE_CODE_OAUTH_TOKEN }}

          # This is an optional setting that allows Claude to read CI results on PRs
          additional_permissions: |
            actions: read

          # Optional: Give a custom prompt to Claude. If this is not specified, Claude will perform the instructions specified in the comment that tagged it.
          # prompt: 'Update the pull request description to include a summary of changes.'

          # Optional: Add claude_args to customize behavior and configuration
          # See https://github.com/anthropics/claude-code-action/blob/main/docs/usage.md
          # or https://code.claude.com/docs/en/cli-reference for available options
          # claude_args: '--allowed-tools Bash(gh pr:*)'
.github/workflows/release.yml (98 changed lines)

@@ -1,7 +1,7 @@
name: Generate Artifacts
on:
  push:
    tags: [v*]
    tags: [ v* ]

jobs:
  build-artifacts:
@@ -13,37 +13,37 @@ jobs:
      fail-fast: false
      matrix:
        include:
          - platform: "macos-latest" # for Arm-based Macs (M1 and above).
            args: "--target aarch64-apple-darwin"
            yaak_arch: "arm64"
            os: "macos"
            targets: "aarch64-apple-darwin"
          - platform: "macos-latest" # for Intel-based Macs.
            args: "--target x86_64-apple-darwin"
            yaak_arch: "x64"
            os: "macos"
            targets: "x86_64-apple-darwin"
          - platform: "ubuntu-22.04"
            args: ""
            yaak_arch: "x64"
            os: "ubuntu"
            targets: ""
          - platform: "ubuntu-22.04-arm"
            args: ""
            yaak_arch: "arm64"
            os: "ubuntu"
            targets: ""
          - platform: "windows-latest"
            args: ""
            yaak_arch: "x64"
            os: "windows"
            targets: ""
          - platform: 'macos-latest' # for Arm-based Macs (M1 and above).
            args: '--target aarch64-apple-darwin'
            yaak_arch: 'arm64'
            os: 'macos'
            targets: 'aarch64-apple-darwin'
          - platform: 'macos-latest' # for Intel-based Macs.
            args: '--target x86_64-apple-darwin'
            yaak_arch: 'x64'
            os: 'macos'
            targets: 'x86_64-apple-darwin'
          - platform: 'ubuntu-22.04'
            args: ''
            yaak_arch: 'x64'
            os: 'ubuntu'
            targets: ''
          - platform: 'ubuntu-22.04-arm'
            args: ''
            yaak_arch: 'arm64'
            os: 'ubuntu'
            targets: ''
          - platform: 'windows-latest'
            args: ''
            yaak_arch: 'x64'
            os: 'windows'
            targets: ''
          # Windows ARM64
          - platform: "windows-latest"
            args: "--target aarch64-pc-windows-msvc"
            yaak_arch: "arm64"
            os: "windows"
            targets: "aarch64-pc-windows-msvc"
          - platform: 'windows-latest'
            args: '--target aarch64-pc-windows-msvc'
            yaak_arch: 'arm64'
            os: 'windows'
            targets: 'aarch64-pc-windows-msvc'
    runs-on: ${{ matrix.platform }}
    timeout-minutes: 40
    steps:
@@ -60,6 +60,7 @@ jobs:

      - uses: Swatinem/rust-cache@v2
        with:
          workspaces: 'src-tauri'
          shared-key: ci
          cache-on-failure: true

@@ -88,43 +89,18 @@ jobs:
          & $exe --version

      - run: npm ci
      - run: npm run bootstrap
        env:
          YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
      - run: npm run lint
      - name: Run JS Tests
        run: npm test
      - name: Run Rust Tests
        run: cargo test --all
        working-directory: src-tauri

      - name: Set version
        run: npm run replace-version
        env:
          YAAK_VERSION: ${{ github.ref_name }}

      - name: Sign vendored binaries (macOS only)
        if: matrix.os == 'macos'
        env:
          APPLE_CERTIFICATE: ${{ secrets.APPLE_CERTIFICATE }}
          APPLE_CERTIFICATE_PASSWORD: ${{ secrets.APPLE_CERTIFICATE_PASSWORD }}
          APPLE_SIGNING_IDENTITY: ${{ secrets.APPLE_SIGNING_IDENTITY }}
          KEYCHAIN_PASSWORD: ${{ secrets.KEYCHAIN_PASSWORD }}
        run: |
          # Create keychain
          KEYCHAIN_PATH=$RUNNER_TEMP/app-signing.keychain-db
          security create-keychain -p "$KEYCHAIN_PASSWORD" $KEYCHAIN_PATH
          security set-keychain-settings -lut 21600 $KEYCHAIN_PATH
          security unlock-keychain -p "$KEYCHAIN_PASSWORD" $KEYCHAIN_PATH

          # Import certificate
          echo "$APPLE_CERTIFICATE" | base64 --decode > certificate.p12
          security import certificate.p12 -P "$APPLE_CERTIFICATE_PASSWORD" -A -t cert -f pkcs12 -k $KEYCHAIN_PATH
          security list-keychain -d user -s $KEYCHAIN_PATH

          # Sign vendored binaries with hardened runtime and their specific entitlements
          codesign --force --options runtime --entitlements crates-tauri/yaak-app/macos/entitlements.yaakprotoc.plist --sign "$APPLE_SIGNING_IDENTITY" crates-tauri/yaak-app/vendored/protoc/yaakprotoc || true
          codesign --force --options runtime --entitlements crates-tauri/yaak-app/macos/entitlements.yaaknode.plist --sign "$APPLE_SIGNING_IDENTITY" crates-tauri/yaak-app/vendored/node/yaaknode || true

      - uses: tauri-apps/tauri-action@v0
        env:
          YAAK_TARGET_ARCH: ${{ matrix.yaak_arch }}
@@ -147,9 +123,9 @@ jobs:
          AZURE_CLIENT_SECRET: ${{ matrix.os == 'windows' && secrets.AZURE_CLIENT_SECRET }}
          AZURE_TENANT_ID: ${{ matrix.os == 'windows' && secrets.AZURE_TENANT_ID }}
        with:
          tagName: "v__VERSION__"
          releaseName: "Release __VERSION__"
          releaseBody: "[Changelog __VERSION__](https://yaak.app/blog/__VERSION__)"
          tagName: 'v__VERSION__'
          releaseName: 'Release __VERSION__'
          releaseBody: '[Changelog __VERSION__](https://yaak.app/blog/__VERSION__)'
          releaseDraft: true
          prerelease: true
          args: "${{ matrix.args }} --config ./crates-tauri/yaak-app/tauri.release.conf.json"
          args: '${{ matrix.args }} --config ./src-tauri/tauri.release.conf.json'
.gitignore (8 changed lines)

@@ -36,11 +36,3 @@ out
tmp
.zed
codebook.toml
target

# Per-worktree Tauri config (generated by post-checkout hook)
crates-tauri/yaak-app/tauri.worktree.conf.json

# Tauri auto-generated permission files
**/permissions/autogenerated
**/permissions/schemas
@@ -1 +0,0 @@
node scripts/git-hooks/post-checkout.mjs "$@"
Cargo.toml (73 changed lines)

@@ -1,73 +0,0 @@
[workspace]
resolver = "2"
members = [
    # Shared crates (no Tauri dependency)
    "crates/yaak-actions",
    "crates/yaak-actions-builtin",
    "crates/yaak-core",
    "crates/yaak-common",
    "crates/yaak-crypto",
    "crates/yaak-git",
    "crates/yaak-grpc",
    "crates/yaak-http",
    "crates/yaak-models",
    "crates/yaak-plugins",
    "crates/yaak-sse",
    "crates/yaak-sync",
    "crates/yaak-templates",
    "crates/yaak-tls",
    "crates/yaak-ws",
    # CLI crates
    "crates-cli/yaak-cli",
    # Tauri-specific crates
    "crates-tauri/yaak-app",
    "crates-tauri/yaak-fonts",
    "crates-tauri/yaak-license",
    "crates-tauri/yaak-mac-window",
    "crates-tauri/yaak-tauri-utils",
]

[workspace.dependencies]
chrono = "0.4.42"
hex = "0.4.3"
keyring = "3.6.3"
log = "0.4.29"
reqwest = "0.12.20"
rustls = { version = "0.23.34", default-features = false }
rustls-platform-verifier = "0.6.2"
serde = "1.0.228"
serde_json = "1.0.145"
sha2 = "0.10.9"
tauri = "2.9.5"
tauri-plugin = "2.5.2"
tauri-plugin-dialog = "2.4.2"
tauri-plugin-shell = "2.3.3"
thiserror = "2.0.17"
tokio = "1.48.0"
ts-rs = "11.1.0"

# Internal crates - shared
yaak-actions = { path = "crates/yaak-actions" }
yaak-actions-builtin = { path = "crates/yaak-actions-builtin" }
yaak-core = { path = "crates/yaak-core" }
yaak-common = { path = "crates/yaak-common" }
yaak-crypto = { path = "crates/yaak-crypto" }
yaak-git = { path = "crates/yaak-git" }
yaak-grpc = { path = "crates/yaak-grpc" }
yaak-http = { path = "crates/yaak-http" }
yaak-models = { path = "crates/yaak-models" }
yaak-plugins = { path = "crates/yaak-plugins" }
yaak-sse = { path = "crates/yaak-sse" }
yaak-sync = { path = "crates/yaak-sync" }
yaak-templates = { path = "crates/yaak-templates" }
yaak-tls = { path = "crates/yaak-tls" }
yaak-ws = { path = "crates/yaak-ws" }

# Internal crates - Tauri-specific
yaak-fonts = { path = "crates-tauri/yaak-fonts" }
yaak-license = { path = "crates-tauri/yaak-license" }
yaak-mac-window = { path = "crates-tauri/yaak-mac-window" }
yaak-tauri-utils = { path = "crates-tauri/yaak-tauri-utils" }

[profile.release]
strip = false
@@ -1,6 +1,6 @@
<p align="center">
<a href="https://github.com/JamesIves/github-sponsors-readme-action">
<img width="200px" src="https://github.com/mountain-loop/yaak/raw/main/crates-tauri/yaak-app/icons/icon.png">
<img width="200px" src="https://github.com/mountain-loop/yaak/raw/main/src-tauri/icons/icon.png">
</a>
</p>

@@ -19,7 +19,7 @@

<p align="center">
<!-- sponsors-premium --><a href="https://github.com/MVST-Solutions"><img src="https://github.com/MVST-Solutions.png" width="80px" alt="User avatar: MVST-Solutions" /></a> <a href="https://github.com/dharsanb"><img src="https://github.com/dharsanb.png" width="80px" alt="User avatar: dharsanb" /></a> <a href="https://github.com/railwayapp"><img src="https://github.com/railwayapp.png" width="80px" alt="User avatar: railwayapp" /></a> <a href="https://github.com/caseyamcl"><img src="https://github.com/caseyamcl.png" width="80px" alt="User avatar: caseyamcl" /></a> <a href="https://github.com/bytebase"><img src="https://github.com/bytebase.png" width="80px" alt="User avatar: bytebase" /></a> <a href="https://github.com/"><img src="https://raw.githubusercontent.com/JamesIves/github-sponsors-readme-action/dev/.github/assets/placeholder.png" width="80px" alt="User avatar: " /></a> <!-- sponsors-premium -->
<!-- sponsors-premium --><a href="https://github.com/MVST-Solutions"><img src="https://github.com/MVST-Solutions.png" width="80px" alt="User avatar: MVST-Solutions" /></a> <a href="https://github.com/dharsanb"><img src="https://github.com/dharsanb.png" width="80px" alt="User avatar: dharsanb" /></a> <a href="https://github.com/railwayapp"><img src="https://github.com/railwayapp.png" width="80px" alt="User avatar: railwayapp" /></a> <a href="https://github.com/caseyamcl"><img src="https://github.com/caseyamcl.png" width="80px" alt="User avatar: caseyamcl" /></a> <a href="https://github.com/"><img src="https://raw.githubusercontent.com/JamesIves/github-sponsors-readme-action/dev/.github/assets/placeholder.png" width="80px" alt="User avatar: " /></a> <!-- sponsors-premium -->
</p>
<p align="center">
<!-- sponsors-base --><a href="https://github.com/seanwash"><img src="https://github.com/seanwash.png" width="50px" alt="User avatar: seanwash" /></a> <a href="https://github.com/jerath"><img src="https://github.com/jerath.png" width="50px" alt="User avatar: jerath" /></a> <a href="https://github.com/itsa-sh"><img src="https://github.com/itsa-sh.png" width="50px" alt="User avatar: itsa-sh" /></a> <a href="https://github.com/dmmulroy"><img src="https://github.com/dmmulroy.png" width="50px" alt="User avatar: dmmulroy" /></a> <a href="https://github.com/timcole"><img src="https://github.com/timcole.png" width="50px" alt="User avatar: timcole" /></a> <a href="https://github.com/VLZH"><img src="https://github.com/VLZH.png" width="50px" alt="User avatar: VLZH" /></a> <a href="https://github.com/terasaka2k"><img src="https://github.com/terasaka2k.png" width="50px" alt="User avatar: terasaka2k" /></a> <a href="https://github.com/andriyor"><img src="https://github.com/andriyor.png" width="50px" alt="User avatar: andriyor" /></a> <a href="https://github.com/majudhu"><img src="https://github.com/majudhu.png" width="50px" alt="User avatar: majudhu" /></a> <a href="https://github.com/axelrindle"><img src="https://github.com/axelrindle.png" width="50px" alt="User avatar: axelrindle" /></a> <a href="https://github.com/jirizverina"><img src="https://github.com/jirizverina.png" width="50px" alt="User avatar: jirizverina" /></a> <a href="https://github.com/chip-well"><img src="https://github.com/chip-well.png" width="50px" alt="User avatar: chip-well" /></a> <a href="https://github.com/GRAYAH"><img src="https://github.com/GRAYAH.png" width="50px" alt="User avatar: GRAYAH" /></a> <!-- sponsors-base -->
@@ -64,7 +64,7 @@ visit [`DEVELOPMENT.md`](DEVELOPMENT.md) for tips on setting up your environment
## Useful Resources

- [Feedback and Bug Reports](https://feedback.yaak.app)
- [Documentation](https://yaak.app/docs)
- [Documentation](https://feedback.yaak.app/help)
- [Yaak vs Postman](https://yaak.app/alternatives/postman)
- [Yaak vs Bruno](https://yaak.app/alternatives/bruno)
- [Yaak vs Insomnia](https://yaak.app/alternatives/insomnia)
biome.json (12 changed lines)

@@ -1,5 +1,5 @@
{
  "$schema": "https://biomejs.dev/schemas/2.3.11/schema.json",
  "$schema": "https://biomejs.dev/schemas/2.3.7/schema.json",
  "linter": {
    "enabled": true,
    "rules": {
@@ -38,16 +38,14 @@
      "!**/node_modules",
      "!**/dist",
      "!**/build",
      "!target",
      "!scripts",
      "!crates",
      "!crates-tauri",
      "!packages/plugin-runtime",
      "!packages/plugin-runtime-types",
      "!src-tauri",
      "!src-web/tailwind.config.cjs",
      "!src-web/postcss.config.cjs",
      "!src-web/vite.config.ts",
      "!src-web/routeTree.gen.ts",
      "!packages/plugin-runtime-types/lib",
      "!**/bindings"
      "!src-web/routeTree.gen.ts"
    ]
  }
}
@@ -1,24 +0,0 @@
[package]
name = "yaak-cli"
version = "0.1.0"
edition = "2024"
publish = false

[[bin]]
name = "yaakcli"
path = "src/main.rs"

[dependencies]
clap = { version = "4", features = ["derive"] }
dirs = "6"
env_logger = "0.11"
log = { workspace = true }
serde_json = { workspace = true }
tokio = { workspace = true, features = ["rt-multi-thread", "macros"] }
yaak-actions = { workspace = true }
yaak-actions-builtin = { workspace = true }
yaak-crypto = { workspace = true }
yaak-http = { workspace = true }
yaak-models = { workspace = true }
yaak-plugins = { workspace = true }
yaak-templates = { workspace = true }
@@ -1,290 +0,0 @@
use clap::{Parser, Subcommand};
use std::path::PathBuf;
use std::sync::Arc;
use tokio::sync::mpsc;
use yaak_http::sender::{HttpSender, ReqwestSender};
use yaak_http::types::{SendableHttpRequest, SendableHttpRequestOptions};
use yaak_models::models::HttpRequest;
use yaak_models::util::UpdateSource;
use yaak_plugins::events::PluginContext;
use yaak_plugins::manager::PluginManager;

#[derive(Parser)]
#[command(name = "yaakcli")]
#[command(about = "Yaak CLI - API client from the command line")]
struct Cli {
    /// Use a custom data directory
    #[arg(long, global = true)]
    data_dir: Option<PathBuf>,

    /// Environment ID to use for variable substitution
    #[arg(long, short, global = true)]
    environment: Option<String>,

    /// Enable verbose logging
    #[arg(long, short, global = true)]
    verbose: bool,

    #[command(subcommand)]
    command: Commands,
}

#[derive(Subcommand)]
enum Commands {
    /// List all workspaces
    Workspaces,
    /// List requests in a workspace
    Requests {
        /// Workspace ID
        workspace_id: String,
    },
    /// Send an HTTP request by ID
    Send {
        /// Request ID
        request_id: String,
    },
    /// Send a GET request to a URL
    Get {
        /// URL to request
        url: String,
    },
    /// Create a new HTTP request
    Create {
        /// Workspace ID
        workspace_id: String,
        /// Request name
        #[arg(short, long)]
        name: String,
        /// HTTP method
        #[arg(short, long, default_value = "GET")]
        method: String,
        /// URL
        #[arg(short, long)]
        url: String,
    },
}

#[tokio::main]
async fn main() {
    let cli = Cli::parse();

    // Initialize logging
    if cli.verbose {
        env_logger::Builder::from_env(env_logger::Env::default().default_filter_or("info")).init();
    }

    // Use the same app_id for both data directory and keyring
    let app_id = if cfg!(debug_assertions) { "app.yaak.desktop.dev" } else { "app.yaak.desktop" };

    let data_dir = cli.data_dir.unwrap_or_else(|| {
        dirs::data_dir().expect("Could not determine data directory").join(app_id)
    });

    let db_path = data_dir.join("db.sqlite");
    let blob_path = data_dir.join("blobs.sqlite");

    let (query_manager, _blob_manager, _rx) =
        yaak_models::init_standalone(&db_path, &blob_path).expect("Failed to initialize database");

    let db = query_manager.connect();

    // Initialize plugin manager for template functions
    let vendored_plugin_dir = data_dir.join("vendored-plugins");
    let installed_plugin_dir = data_dir.join("installed-plugins");

    // Use system node for CLI (must be in PATH)
    let node_bin_path = PathBuf::from("node");

    // Find the plugin runtime - check YAAK_PLUGIN_RUNTIME env var, then fallback to development path
    let plugin_runtime_main =
        std::env::var("YAAK_PLUGIN_RUNTIME").map(PathBuf::from).unwrap_or_else(|_| {
            // Development fallback: look relative to crate root
            PathBuf::from(env!("CARGO_MANIFEST_DIR"))
                .join("../../crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs")
        });

    // Create plugin manager (plugins may not be available in CLI context)
    let plugin_manager = Arc::new(
        PluginManager::new(
            vendored_plugin_dir.clone(),
            installed_plugin_dir.clone(),
            node_bin_path.clone(),
            plugin_runtime_main,
            false,
        )
        .await,
    );

    // Initialize plugins from database
    let plugins = db.list_plugins().unwrap_or_default();
    if !plugins.is_empty() {
        let errors =
            plugin_manager.initialize_all_plugins(plugins, &PluginContext::new_empty()).await;
        for (plugin_dir, error_msg) in errors {
            eprintln!("Warning: Failed to initialize plugin '{}': {}", plugin_dir, error_msg);
        }
    }

    match cli.command {
        Commands::Workspaces => {
            let workspaces = db.list_workspaces().expect("Failed to list workspaces");
            if workspaces.is_empty() {
                println!("No workspaces found");
            } else {
                for ws in workspaces {
                    println!("{} - {}", ws.id, ws.name);
                }
            }
        }
        Commands::Requests { workspace_id } => {
            let requests = db.list_http_requests(&workspace_id).expect("Failed to list requests");
            if requests.is_empty() {
                println!("No requests found in workspace {}", workspace_id);
            } else {
                for req in requests {
                    println!("{} - {} {}", req.id, req.method, req.name);
                }
            }
        }
        Commands::Send { request_id } => {
            use yaak_actions::{
                ActionExecutor, ActionId, ActionParams, ActionResult, ActionTarget, CurrentContext,
            };
            use yaak_actions_builtin::{BuiltinActionDependencies, register_http_actions};

            // Create dependencies
            let deps = BuiltinActionDependencies::new_standalone(
                &db_path,
                &blob_path,
                &app_id,
                vendored_plugin_dir.clone(),
                installed_plugin_dir.clone(),
                node_bin_path.clone(),
            )
            .await
            .expect("Failed to initialize dependencies");

            // Create executor and register actions
            let executor = ActionExecutor::new();
            executor.register_builtin_groups().await.expect("Failed to register groups");
            register_http_actions(&executor, &deps).await.expect("Failed to register HTTP actions");

            // Prepare context
            let context = CurrentContext {
                target: Some(ActionTarget::HttpRequest { id: request_id.clone() }),
                environment_id: cli.environment.clone(),
                workspace_id: None,
                has_window: false,
                can_prompt: false,
            };

            // Prepare params
            let params = ActionParams {
                data: serde_json::json!({
                    "render": true,
                    "follow_redirects": false,
                    "timeout_ms": 30000,
                }),
            };

            // Invoke action
            let action_id = ActionId::builtin("http", "send-request");
            let result = executor.invoke(&action_id, context, params).await.expect("Action failed");

            // Handle result
            match result {
                ActionResult::Success { data, message } => {
                    if let Some(msg) = message {
                        println!("{}", msg);
                    }
                    if let Some(data) = data {
                        println!("{}", serde_json::to_string_pretty(&data).unwrap());
                    }
                }
                ActionResult::RequiresInput { .. } => {
                    eprintln!("Action requires input (not supported in CLI)");
                }
                ActionResult::Cancelled => {
                    eprintln!("Action cancelled");
                }
            }
        }
        Commands::Get { url } => {
            if cli.verbose {
                println!("> GET {}", url);
            }

            // Build a simple GET request
            let sendable = SendableHttpRequest {
                url: url.clone(),
                method: "GET".to_string(),
                headers: vec![],
                body: None,
                options: SendableHttpRequestOptions::default(),
            };

            // Create event channel for progress
            let (event_tx, mut event_rx) = mpsc::channel(100);

            // Spawn task to print events if verbose
            let verbose = cli.verbose;
            let verbose_handle = if verbose {
                Some(tokio::spawn(async move {
                    while let Some(event) = event_rx.recv().await {
                        println!("{}", event);
                    }
                }))
            } else {
                tokio::spawn(async move { while event_rx.recv().await.is_some() {} });
                None
            };

            // Send the request
            let sender = ReqwestSender::new().expect("Failed to create HTTP client");
            let response = sender.send(sendable, event_tx).await.expect("Failed to send request");

            if let Some(handle) = verbose_handle {
                let _ = handle.await;
            }

            // Print response
            if verbose {
                println!();
            }
            println!(
                "HTTP {} {}",
                response.status,
                response.status_reason.as_deref().unwrap_or("")
            );

            if verbose {
                for (name, value) in &response.headers {
                    println!("{}: {}", name, value);
                }
                println!();
            }

            // Print body
            let (body, _stats) = response.text().await.expect("Failed to read response body");
            println!("{}", body);
        }
        Commands::Create { workspace_id, name, method, url } => {
            let request = HttpRequest {
                workspace_id,
                name,
                method: method.to_uppercase(),
                url,
                ..Default::default()
            };

            let created = db
                .upsert_http_request(&request, &UpdateSource::Sync)
                .expect("Failed to create request");

            println!("Created request: {}", created.id);
        }
    }

    // Terminate plugin manager gracefully
    plugin_manager.terminate().await;
}
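For reference, the subcommands defined above can be exercised like this (IDs are placeholders):

```bash
cargo run -p yaak-cli -- --help
cargo run -p yaak-cli -- workspaces
cargo run -p yaak-cli -- requests <WORKSPACE_ID>
cargo run -p yaak-cli -- send <REQUEST_ID> --environment <ENVIRONMENT_ID>
cargo run -p yaak-cli -- get https://example.com --verbose
cargo run -p yaak-cli -- create <WORKSPACE_ID> --name "Example" --url https://example.com
```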
@@ -1,3 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

export type WatchResult = { unlistenEvent: string, };
@@ -1,5 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

export type PluginUpdateInfo = { name: string, currentVersion: string, latestVersion: string, };

export type PluginUpdateNotification = { updateCount: number, plugins: Array<PluginUpdateInfo>, };
@@ -1,13 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Enable for NodeJS/V8 JIT compiler -->
    <key>com.apple.security.cs.allow-unsigned-executable-memory</key>
    <true/>

    <!-- Allow loading plugins signed with different Team IDs (e.g., 1Password) -->
    <key>com.apple.security.cs.disable-library-validation</key>
    <true/>
</dict>
</plist>
@@ -1,6 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
</dict>
</plist>
@@ -1,115 +0,0 @@
use crate::PluginContextExt;
use crate::error::Result;
use std::sync::Arc;
use tauri::{AppHandle, Manager, Runtime, State, WebviewWindow, command};
use tauri_plugin_dialog::{DialogExt, MessageDialogKind};
use yaak_crypto::manager::EncryptionManager;
use yaak_models::models::HttpRequestHeader;
use yaak_models::queries::workspaces::default_headers;
use yaak_plugins::events::GetThemesResponse;
use yaak_plugins::manager::PluginManager;
use yaak_plugins::native_template_functions::{
    decrypt_secure_template_function, encrypt_secure_template_function,
};

/// Extension trait for accessing the EncryptionManager from Tauri Manager types.
pub trait EncryptionManagerExt<'a, R> {
    fn crypto(&'a self) -> State<'a, EncryptionManager>;
}

impl<'a, R: Runtime, M: Manager<R>> EncryptionManagerExt<'a, R> for M {
    fn crypto(&'a self) -> State<'a, EncryptionManager> {
        self.state::<EncryptionManager>()
    }
}

#[command]
pub(crate) async fn cmd_show_workspace_key<R: Runtime>(
    window: WebviewWindow<R>,
    workspace_id: &str,
) -> Result<()> {
    let key = window.crypto().reveal_workspace_key(workspace_id)?;
    window
        .dialog()
        .message(format!("Your workspace key is \n\n{}", key))
        .kind(MessageDialogKind::Info)
        .show(|_v| {});
    Ok(())
}

#[command]
pub(crate) async fn cmd_decrypt_template<R: Runtime>(
    window: WebviewWindow<R>,
    template: &str,
) -> Result<String> {
    let encryption_manager = window.app_handle().state::<EncryptionManager>();
    let plugin_context = window.plugin_context();
    Ok(decrypt_secure_template_function(&encryption_manager, &plugin_context, template)?)
}

#[command]
pub(crate) async fn cmd_secure_template<R: Runtime>(
    app_handle: AppHandle<R>,
    window: WebviewWindow<R>,
    template: &str,
) -> Result<String> {
    let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
    let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
    let plugin_context = window.plugin_context();
    Ok(encrypt_secure_template_function(
        plugin_manager,
        encryption_manager,
        &plugin_context,
        template,
    )?)
}

#[command]
pub(crate) async fn cmd_get_themes<R: Runtime>(
    window: WebviewWindow<R>,
    plugin_manager: State<'_, PluginManager>,
) -> Result<Vec<GetThemesResponse>> {
    Ok(plugin_manager.get_themes(&window.plugin_context()).await?)
}

#[command]
pub(crate) async fn cmd_enable_encryption<R: Runtime>(
    window: WebviewWindow<R>,
    workspace_id: &str,
) -> Result<()> {
    window.crypto().ensure_workspace_key(workspace_id)?;
    window.crypto().reveal_workspace_key(workspace_id)?;
    Ok(())
}

#[command]
pub(crate) async fn cmd_reveal_workspace_key<R: Runtime>(
    window: WebviewWindow<R>,
    workspace_id: &str,
) -> Result<String> {
    Ok(window.crypto().reveal_workspace_key(workspace_id)?)
}

#[command]
pub(crate) async fn cmd_set_workspace_key<R: Runtime>(
    window: WebviewWindow<R>,
    workspace_id: &str,
    key: &str,
) -> Result<()> {
    window.crypto().set_human_key(workspace_id, key)?;
    Ok(())
}

#[command]
pub(crate) async fn cmd_disable_encryption<R: Runtime>(
    window: WebviewWindow<R>,
    workspace_id: &str,
) -> Result<()> {
    window.crypto().disable_encryption(workspace_id)?;
    Ok(())
}

#[command]
pub(crate) fn cmd_default_headers() -> Vec<HttpRequestHeader> {
    default_headers()
}
@@ -1,130 +0,0 @@
//! Tauri-specific extensions for yaak-git.
//!
//! This module provides the Tauri commands for git functionality.

use crate::error::Result;
use std::path::{Path, PathBuf};
use tauri::command;
use yaak_git::{
    BranchDeleteResult, CloneResult, GitCommit, GitRemote, GitStatusSummary, PullResult,
    PushResult, git_add, git_add_credential, git_add_remote, git_checkout_branch, git_clone,
    git_commit, git_create_branch, git_delete_branch, git_delete_remote_branch, git_fetch_all,
    git_init, git_log, git_merge_branch, git_pull, git_push, git_remotes, git_rename_branch,
    git_rm_remote, git_status, git_unstage,
};

// NOTE: All of these commands are async to prevent blocking work from locking up the UI

#[command]
pub async fn cmd_git_checkout(dir: &Path, branch: &str, force: bool) -> Result<String> {
    Ok(git_checkout_branch(dir, branch, force).await?)
}

#[command]
pub async fn cmd_git_branch(dir: &Path, branch: &str, base: Option<&str>) -> Result<()> {
    Ok(git_create_branch(dir, branch, base).await?)
}

#[command]
pub async fn cmd_git_delete_branch(
    dir: &Path,
    branch: &str,
    force: Option<bool>,
) -> Result<BranchDeleteResult> {
    Ok(git_delete_branch(dir, branch, force.unwrap_or(false)).await?)
}

#[command]
pub async fn cmd_git_delete_remote_branch(dir: &Path, branch: &str) -> Result<()> {
    Ok(git_delete_remote_branch(dir, branch).await?)
}

#[command]
pub async fn cmd_git_merge_branch(dir: &Path, branch: &str) -> Result<()> {
    Ok(git_merge_branch(dir, branch).await?)
}

#[command]
pub async fn cmd_git_rename_branch(dir: &Path, old_name: &str, new_name: &str) -> Result<()> {
    Ok(git_rename_branch(dir, old_name, new_name).await?)
}

#[command]
pub async fn cmd_git_status(dir: &Path) -> Result<GitStatusSummary> {
    Ok(git_status(dir)?)
}

#[command]
pub async fn cmd_git_log(dir: &Path) -> Result<Vec<GitCommit>> {
    Ok(git_log(dir)?)
}

#[command]
pub async fn cmd_git_initialize(dir: &Path) -> Result<()> {
    Ok(git_init(dir)?)
}

#[command]
pub async fn cmd_git_clone(url: &str, dir: &Path) -> Result<CloneResult> {
    Ok(git_clone(url, dir).await?)
}

#[command]
pub async fn cmd_git_commit(dir: &Path, message: &str) -> Result<()> {
    Ok(git_commit(dir, message).await?)
}

#[command]
pub async fn cmd_git_fetch_all(dir: &Path) -> Result<()> {
    Ok(git_fetch_all(dir).await?)
}

#[command]
pub async fn cmd_git_push(dir: &Path) -> Result<PushResult> {
    Ok(git_push(dir).await?)
}

#[command]
pub async fn cmd_git_pull(dir: &Path) -> Result<PullResult> {
    Ok(git_pull(dir).await?)
}

#[command]
pub async fn cmd_git_add(dir: &Path, rela_paths: Vec<PathBuf>) -> Result<()> {
    for path in rela_paths {
        git_add(dir, &path)?;
    }
    Ok(())
}

#[command]
pub async fn cmd_git_unstage(dir: &Path, rela_paths: Vec<PathBuf>) -> Result<()> {
    for path in rela_paths {
        git_unstage(dir, &path)?;
    }
    Ok(())
}

#[command]
pub async fn cmd_git_add_credential(
    remote_url: &str,
    username: &str,
    password: &str,
) -> Result<()> {
    Ok(git_add_credential(remote_url, username, password).await?)
}

#[command]
pub async fn cmd_git_remotes(dir: &Path) -> Result<Vec<GitRemote>> {
    Ok(git_remotes(dir)?)
}

#[command]
pub async fn cmd_git_add_remote(dir: &Path, name: &str, url: &str) -> Result<GitRemote> {
    Ok(git_add_remote(dir, name, url)?)
}

#[command]
pub async fn cmd_git_rm_remote(dir: &Path, name: &str) -> Result<()> {
    Ok(git_rm_remote(dir, name)?)
}
@@ -1,709 +0,0 @@
use crate::PluginContextExt;
use crate::error::Error::GenericError;
use crate::error::Result;
use crate::models_ext::BlobManagerExt;
use crate::models_ext::QueryManagerExt;
use crate::render::render_http_request;
use log::{debug, warn};
use std::pin::Pin;
use std::sync::Arc;
use std::sync::atomic::{AtomicI32, Ordering};
use std::time::{Duration, Instant};
use tauri::{AppHandle, Manager, Runtime, WebviewWindow};
use tokio::fs::{File, create_dir_all};
use tokio::io::{AsyncRead, AsyncReadExt, AsyncWriteExt};
use tokio::sync::watch::Receiver;
use tokio_util::bytes::Bytes;
use yaak_crypto::manager::EncryptionManager;
use yaak_http::client::{
    HttpConnectionOptions, HttpConnectionProxySetting, HttpConnectionProxySettingAuth,
};
use yaak_http::cookies::CookieStore;
use yaak_http::manager::{CachedClient, HttpConnectionManager};
use yaak_http::sender::ReqwestSender;
use yaak_http::tee_reader::TeeReader;
use yaak_http::transaction::HttpTransaction;
use yaak_http::types::{
    SendableBody, SendableHttpRequest, SendableHttpRequestOptions, append_query_params,
};
use yaak_models::blob_manager::BodyChunk;
use yaak_models::models::{
    CookieJar, Environment, HttpRequest, HttpResponse, HttpResponseEvent, HttpResponseHeader,
    HttpResponseState, ProxySetting, ProxySettingAuth,
};
use yaak_models::util::UpdateSource;
use yaak_plugins::events::{
    CallHttpAuthenticationRequest, HttpHeader, PluginContext, RenderPurpose,
};
use yaak_plugins::manager::PluginManager;
use yaak_plugins::template_callback::PluginTemplateCallback;
use yaak_templates::RenderOptions;
use yaak_tls::find_client_certificate;

/// Chunk size for storing request bodies (1MB)
const REQUEST_BODY_CHUNK_SIZE: usize = 1024 * 1024;

/// Context for managing response state during HTTP transactions.
/// Handles both persisted responses (stored in DB) and ephemeral responses (in-memory only).
struct ResponseContext<R: Runtime> {
    app_handle: AppHandle<R>,
    response: HttpResponse,
    update_source: UpdateSource,
}

impl<R: Runtime> ResponseContext<R> {
    fn new(app_handle: AppHandle<R>, response: HttpResponse, update_source: UpdateSource) -> Self {
        Self { app_handle, response, update_source }
    }

    /// Whether this response is persisted (has a non-empty ID)
    fn is_persisted(&self) -> bool {
        !self.response.id.is_empty()
    }

    /// Update the response state. For persisted responses, fetches from DB, applies the
    /// closure, and updates the DB. For ephemeral responses, just applies the closure
    /// to the in-memory response.
    fn update<F>(&mut self, func: F) -> Result<()>
    where
        F: FnOnce(&mut HttpResponse),
    {
        if self.is_persisted() {
            let r = self.app_handle.with_tx(|tx| {
                let mut r = tx.get_http_response(&self.response.id)?;
                func(&mut r);
                tx.update_http_response_if_id(&r, &self.update_source)?;
                Ok(r)
            })?;
            self.response = r;
            Ok(())
        } else {
            func(&mut self.response);
            Ok(())
        }
    }

    /// Get the current response state
    fn response(&self) -> &HttpResponse {
        &self.response
    }
}

pub async fn send_http_request<R: Runtime>(
    window: &WebviewWindow<R>,
    unrendered_request: &HttpRequest,
    og_response: &HttpResponse,
    environment: Option<Environment>,
    cookie_jar: Option<CookieJar>,
    cancelled_rx: &mut Receiver<bool>,
) -> Result<HttpResponse> {
    send_http_request_with_context(
        window,
        unrendered_request,
        og_response,
        environment,
        cookie_jar,
        cancelled_rx,
        &window.plugin_context(),
    )
    .await
}

pub async fn send_http_request_with_context<R: Runtime>(
    window: &WebviewWindow<R>,
    unrendered_request: &HttpRequest,
    og_response: &HttpResponse,
    environment: Option<Environment>,
    cookie_jar: Option<CookieJar>,
    cancelled_rx: &Receiver<bool>,
    plugin_context: &PluginContext,
) -> Result<HttpResponse> {
    let app_handle = window.app_handle().clone();
    let update_source = UpdateSource::from_window_label(window.label());
    let mut response_ctx =
        ResponseContext::new(app_handle.clone(), og_response.clone(), update_source);

    // Execute the inner send logic and handle errors consistently
    let start = Instant::now();
    let result = send_http_request_inner(
        window,
        unrendered_request,
        environment,
        cookie_jar,
        cancelled_rx,
        plugin_context,
        &mut response_ctx,
    )
    .await;

    match result {
        Ok(response) => Ok(response),
        Err(e) => {
            let error = e.to_string();
            let elapsed = start.elapsed().as_millis() as i32;
            warn!("Failed to send request: {error:?}");
            let _ = response_ctx.update(|r| {
                r.state = HttpResponseState::Closed;
                r.elapsed = elapsed;
                if r.elapsed_headers == 0 {
                    r.elapsed_headers = elapsed;
                }
                r.error = Some(error);
            });
            Ok(response_ctx.response().clone())
        }
    }
}

async fn send_http_request_inner<R: Runtime>(
    window: &WebviewWindow<R>,
    unrendered_request: &HttpRequest,
    environment: Option<Environment>,
    cookie_jar: Option<CookieJar>,
    cancelled_rx: &Receiver<bool>,
    plugin_context: &PluginContext,
    response_ctx: &mut ResponseContext<R>,
) -> Result<HttpResponse> {
    let app_handle = window.app_handle().clone();
    let plugin_manager = Arc::new((*app_handle.state::<PluginManager>()).clone());
    let encryption_manager = Arc::new((*app_handle.state::<EncryptionManager>()).clone());
    let connection_manager = app_handle.state::<HttpConnectionManager>();
    let settings = window.db().get_settings();
    let workspace_id = &unrendered_request.workspace_id;
    let folder_id = unrendered_request.folder_id.as_deref();
    let environment_id = environment.map(|e| e.id);
    let workspace = window.db().get_workspace(workspace_id)?;
    let (resolved, auth_context_id) = resolve_http_request(window, unrendered_request)?;
    let cb = PluginTemplateCallback::new(
        plugin_manager.clone(),
        encryption_manager.clone(),
        &plugin_context,
        RenderPurpose::Send,
    );
    let env_chain =
        window.db().resolve_environments(&workspace.id, folder_id, environment_id.as_deref())?;
    let request = render_http_request(&resolved, env_chain, &cb, &RenderOptions::throw()).await?;

    // Build the sendable request using the new SendableHttpRequest type
    let options = SendableHttpRequestOptions {
        follow_redirects: workspace.setting_follow_redirects,
        timeout: if workspace.setting_request_timeout > 0 {
            Some(Duration::from_millis(workspace.setting_request_timeout.unsigned_abs() as u64))
        } else {
            None
        },
    };
    let mut sendable_request = SendableHttpRequest::from_http_request(&request, options).await?;

    debug!("Sending request to {} {}", sendable_request.method, sendable_request.url);

    let proxy_setting = match settings.proxy {
        None => HttpConnectionProxySetting::System,
        Some(ProxySetting::Disabled) => HttpConnectionProxySetting::Disabled,
        Some(ProxySetting::Enabled { http, https, auth, bypass, disabled }) => {
            if disabled {
                HttpConnectionProxySetting::System
            } else {
                HttpConnectionProxySetting::Enabled {
                    http,
                    https,
                    bypass,
                    auth: match auth {
                        None => None,
                        Some(ProxySettingAuth { user, password }) => {
                            Some(HttpConnectionProxySettingAuth { user, password })
                        }
                    },
                }
            }
        }
    };

    let client_certificate =
        find_client_certificate(&sendable_request.url, &settings.client_certificates);

    // Create cookie store if a cookie jar is specified
    let maybe_cookie_store = match cookie_jar.clone() {
        Some(CookieJar { id, .. }) => {
            // NOTE: We need to refetch the cookie jar because a chained request might have
            // updated cookies when we rendered the request.
            let cj = window.db().get_cookie_jar(&id)?;
            let cookie_store = CookieStore::from_cookies(cj.cookies.clone());
            Some((cookie_store, cj))
        }
        None => None,
    };

    let cached_client = connection_manager
        .get_client(&HttpConnectionOptions {
            id: plugin_context.id.clone(),
            validate_certificates: workspace.setting_validate_certificates,
            proxy: proxy_setting,
            client_certificate,
            dns_overrides: workspace.setting_dns_overrides.clone(),
        })
        .await?;

    // Apply authentication to the request
    apply_authentication(
        &window,
        &mut sendable_request,
        &request,
        auth_context_id,
        &plugin_manager,
        plugin_context,
    )
    .await?;

    let cookie_store = maybe_cookie_store.as_ref().map(|(cs, _)| cs.clone());
    let result = execute_transaction(
        cached_client,
        sendable_request,
        response_ctx,
        cancelled_rx.clone(),
        cookie_store,
    )
    .await;

    // Wait for blob writing to complete and check for errors
    let final_result = match result {
        Ok((response, maybe_blob_write_handle)) => {
            // Check if blob writing failed
            if let Some(handle) = maybe_blob_write_handle {
                if let Ok(Err(e)) = handle.await {
                    // Update response with the storage error
                    let _ = response_ctx.update(|r| {
                        let error_msg =
                            format!("Request succeeded but failed to store request body: {}", e);
                        r.error = Some(match &r.error {
                            Some(existing) => format!("{}; {}", existing, error_msg),
                            None => error_msg,
                        });
                    });
                }
            }
            Ok(response)
        }
        Err(e) => Err(e),
    };

    // Persist cookies back to the database after the request completes
    if let Some((cookie_store, mut cj)) = maybe_cookie_store {
        let cookies = cookie_store.get_all_cookies();
        cj.cookies = cookies;
        if let Err(e) = window.db().upsert_cookie_jar(&cj, &UpdateSource::Background) {
            warn!("Failed to persist cookies to database: {}", e);
        }
    }

    final_result
}

pub fn resolve_http_request<R: Runtime>(
    window: &WebviewWindow<R>,
    request: &HttpRequest,
) -> Result<(HttpRequest, String)> {
    let mut new_request = request.clone();

    let (authentication_type, authentication, authentication_context_id) =
        window.db().resolve_auth_for_http_request(request)?;
    new_request.authentication_type = authentication_type;
    new_request.authentication = authentication;

    let headers = window.db().resolve_headers_for_http_request(request)?;
    new_request.headers = headers;

    Ok((new_request, authentication_context_id))
}

async fn execute_transaction<R: Runtime>(
    cached_client: CachedClient,
    mut sendable_request: SendableHttpRequest,
    response_ctx: &mut ResponseContext<R>,
    mut cancelled_rx: Receiver<bool>,
    cookie_store: Option<CookieStore>,
) -> Result<(HttpResponse, Option<tauri::async_runtime::JoinHandle<Result<()>>>)> {
    let app_handle = &response_ctx.app_handle.clone();
    let response_id = response_ctx.response().id.clone();
    let workspace_id = response_ctx.response().workspace_id.clone();
    let is_persisted = response_ctx.is_persisted();

    // Keep a reference to the resolver for DNS timing events
    let resolver = cached_client.resolver.clone();

    let sender = ReqwestSender::with_client(cached_client.client);
    let transaction = match cookie_store {
        Some(cs) => HttpTransaction::with_cookie_store(sender, cs),
        None => HttpTransaction::new(sender),
    };
    let start = Instant::now();

    // Capture request headers before sending
    let request_headers: Vec<HttpResponseHeader> = sendable_request
        .headers
        .iter()
        .map(|(name, value)| HttpResponseHeader { name: name.clone(), value: value.clone() })
        .collect();

    // Update response with headers info
    response_ctx.update(|r| {
        r.url = sendable_request.url.clone();
        r.request_headers = request_headers;
    })?;

    // Create bounded channel for receiving events and spawn a task to store them in DB
    // Buffer size of 100 events provides back pressure if DB writes are slow
    let (event_tx, mut event_rx) =
        tokio::sync::mpsc::channel::<yaak_http::sender::HttpResponseEvent>(100);

    // Set the event sender on the DNS resolver so it can emit DNS timing events
    resolver.set_event_sender(Some(event_tx.clone())).await;

    // Shared state to capture DNS timing from the event processing task
|
||||
let dns_elapsed = Arc::new(AtomicI32::new(0));
|
||||
|
||||
// Write events to DB in a task (only for persisted responses)
|
||||
if is_persisted {
|
||||
let response_id = response_id.clone();
|
||||
let app_handle = app_handle.clone();
|
||||
let update_source = response_ctx.update_source.clone();
|
||||
let workspace_id = workspace_id.clone();
|
||||
let dns_elapsed = dns_elapsed.clone();
|
||||
tokio::spawn(async move {
|
||||
while let Some(event) = event_rx.recv().await {
|
||||
// Capture DNS timing when we see a DNS event
|
||||
if let yaak_http::sender::HttpResponseEvent::DnsResolved { duration, .. } = &event {
|
||||
dns_elapsed.store(*duration as i32, Ordering::SeqCst);
|
||||
}
|
||||
let db_event = HttpResponseEvent::new(&response_id, &workspace_id, event.into());
|
||||
let _ = app_handle.db().upsert_http_response_event(&db_event, &update_source);
|
||||
}
|
||||
});
|
||||
} else {
|
||||
// For ephemeral responses, just drain the events but still capture DNS timing
|
||||
let dns_elapsed = dns_elapsed.clone();
|
||||
tokio::spawn(async move {
|
||||
while let Some(event) = event_rx.recv().await {
|
||||
if let yaak_http::sender::HttpResponseEvent::DnsResolved { duration, .. } = &event {
|
||||
dns_elapsed.store(*duration as i32, Ordering::SeqCst);
|
||||
}
|
||||
}
|
||||
});
|
||||
};
|
||||
|
||||
// Capture request body as it's sent (only for persisted responses)
|
||||
let body_id = format!("{}.request", response_id);
|
||||
let maybe_blob_write_handle = match sendable_request.body {
|
||||
Some(SendableBody::Bytes(bytes)) => {
|
||||
if is_persisted {
|
||||
write_bytes_to_db_sync(response_ctx, &body_id, bytes.clone())?;
|
||||
}
|
||||
sendable_request.body = Some(SendableBody::Bytes(bytes));
|
||||
None
|
||||
}
|
||||
Some(SendableBody::Stream(stream)) => {
|
||||
// Wrap stream with TeeReader to capture data as it's read
|
||||
// Use unbounded channel to ensure all data is captured without blocking the HTTP request
|
||||
let (body_chunk_tx, body_chunk_rx) = tokio::sync::mpsc::unbounded_channel::<Vec<u8>>();
|
||||
let tee_reader = TeeReader::new(stream, body_chunk_tx);
|
||||
let pinned: Pin<Box<dyn AsyncRead + Send + 'static>> = Box::pin(tee_reader);
|
||||
|
||||
let handle = if is_persisted {
|
||||
// Spawn task to write request body chunks to blob DB
|
||||
let app_handle = app_handle.clone();
|
||||
let response_id = response_id.clone();
|
||||
let workspace_id = workspace_id.clone();
|
||||
let body_id = body_id.clone();
|
||||
let update_source = response_ctx.update_source.clone();
|
||||
Some(tauri::async_runtime::spawn(async move {
|
||||
write_stream_chunks_to_db(
|
||||
app_handle,
|
||||
&body_id,
|
||||
&workspace_id,
|
||||
&response_id,
|
||||
&update_source,
|
||||
body_chunk_rx,
|
||||
)
|
||||
.await
|
||||
}))
|
||||
} else {
|
||||
// For ephemeral responses, just drain the body chunks
|
||||
tauri::async_runtime::spawn(async move {
|
||||
let mut rx = body_chunk_rx;
|
||||
while rx.recv().await.is_some() {}
|
||||
});
|
||||
None
|
||||
};
|
||||
|
||||
sendable_request.body = Some(SendableBody::Stream(pinned));
|
||||
handle
|
||||
}
|
||||
None => {
|
||||
sendable_request.body = None;
|
||||
None
|
||||
}
|
||||
};
|
||||
|
||||
// Execute the transaction with cancellation support
|
||||
// This returns the response with headers, but body is not yet consumed
|
||||
// Events (headers, settings, chunks) are sent through the channel
|
||||
let mut http_response = transaction
|
||||
.execute_with_cancellation(sendable_request, cancelled_rx.clone(), event_tx)
|
||||
.await?;
|
||||
|
||||
// Prepare the response path before consuming the body
|
||||
let body_path = if response_id.is_empty() {
|
||||
// Ephemeral responses: use OS temp directory for automatic cleanup
|
||||
let temp_dir = std::env::temp_dir().join("yaak-ephemeral-responses");
|
||||
create_dir_all(&temp_dir).await?;
|
||||
temp_dir.join(uuid::Uuid::new_v4().to_string())
|
||||
} else {
|
||||
// Persisted responses: use app data directory
|
||||
let dir = app_handle.path().app_data_dir()?;
|
||||
let base_dir = dir.join("responses");
|
||||
create_dir_all(&base_dir).await?;
|
||||
base_dir.join(&response_id)
|
||||
};
|
||||
|
||||
// Extract metadata before consuming the body (headers are available immediately)
|
||||
// Url might change, so update again
|
||||
response_ctx.update(|r| {
|
||||
r.body_path = Some(body_path.to_string_lossy().to_string());
|
||||
r.elapsed_headers = start.elapsed().as_millis() as i32;
|
||||
r.status = http_response.status as i32;
|
||||
r.status_reason = http_response.status_reason.clone();
|
||||
r.url = http_response.url.clone();
|
||||
r.remote_addr = http_response.remote_addr.clone();
|
||||
r.version = http_response.version.clone();
|
||||
r.headers = http_response
|
||||
.headers
|
||||
.iter()
|
||||
.map(|(name, value)| HttpResponseHeader { name: name.clone(), value: value.clone() })
|
||||
.collect();
|
||||
r.content_length = http_response.content_length.map(|l| l as i32);
|
||||
r.state = HttpResponseState::Connected;
|
||||
r.request_headers = http_response
|
||||
.request_headers
|
||||
.iter()
|
||||
.map(|(n, v)| HttpResponseHeader { name: n.clone(), value: v.clone() })
|
||||
.collect();
|
||||
})?;
|
||||
|
||||
// Get the body stream for manual consumption
|
||||
let mut body_stream = http_response.into_body_stream()?;
|
||||
|
||||
// Open file for writing
|
||||
let mut file = File::options()
|
||||
.create(true)
|
||||
.truncate(true)
|
||||
.write(true)
|
||||
.open(&body_path)
|
||||
.await
|
||||
.map_err(|e| GenericError(format!("Failed to open file: {}", e)))?;
|
||||
|
||||
// Stream body to file, with throttled DB updates to avoid excessive writes
|
||||
let mut written_bytes: usize = 0;
|
||||
let mut last_update_time = start;
|
||||
let mut buf = [0u8; 8192];
|
||||
|
||||
// Throttle settings: update DB at most every 100ms
|
||||
const UPDATE_INTERVAL_MS: u128 = 100;
|
||||
|
||||
loop {
|
||||
// Check for cancellation. If we already have headers/body, just close cleanly without error
|
||||
if *cancelled_rx.borrow() {
|
||||
break;
|
||||
}
|
||||
|
||||
// Use select! to race between reading and cancellation, so cancellation is immediate
|
||||
let read_result = tokio::select! {
|
||||
biased;
|
||||
_ = cancelled_rx.changed() => {
|
||||
break;
|
||||
}
|
||||
result = body_stream.read(&mut buf) => result,
|
||||
};
|
||||
|
||||
match read_result {
|
||||
Ok(0) => break, // EOF
|
||||
Ok(n) => {
|
||||
file.write_all(&buf[..n])
|
||||
.await
|
||||
.map_err(|e| GenericError(format!("Failed to write to file: {}", e)))?;
|
||||
file.flush()
|
||||
.await
|
||||
.map_err(|e| GenericError(format!("Failed to flush file: {}", e)))?;
|
||||
written_bytes += n;
|
||||
|
||||
// Throttle DB updates: only update if enough time has passed
|
||||
let now = Instant::now();
|
||||
let elapsed_since_update = now.duration_since(last_update_time).as_millis();
|
||||
|
||||
if elapsed_since_update >= UPDATE_INTERVAL_MS {
|
||||
response_ctx.update(|r| {
|
||||
r.elapsed = start.elapsed().as_millis() as i32;
|
||||
r.content_length = Some(written_bytes as i32);
|
||||
})?;
|
||||
last_update_time = now;
|
||||
}
|
||||
}
|
||||
Err(e) => {
|
||||
return Err(GenericError(format!("Failed to read response body: {}", e)));
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Final update with closed state and accurate byte count
|
||||
response_ctx.update(|r| {
|
||||
r.elapsed = start.elapsed().as_millis() as i32;
|
||||
r.elapsed_dns = dns_elapsed.load(Ordering::SeqCst);
|
||||
r.content_length = Some(written_bytes as i32);
|
||||
r.state = HttpResponseState::Closed;
|
||||
})?;
|
||||
|
||||
// Clear the event sender from the resolver since this request is done
|
||||
resolver.set_event_sender(None).await;
|
||||
|
||||
Ok((response_ctx.response().clone(), maybe_blob_write_handle))
|
||||
}
|
||||
|
||||
fn write_bytes_to_db_sync<R: Runtime>(
|
||||
response_ctx: &mut ResponseContext<R>,
|
||||
body_id: &str,
|
||||
data: Bytes,
|
||||
) -> Result<()> {
|
||||
if data.is_empty() {
|
||||
return Ok(());
|
||||
}
|
||||
|
||||
// Write in chunks if data is large
|
||||
let mut offset = 0;
|
||||
let mut chunk_index = 0;
|
||||
while offset < data.len() {
|
||||
let end = std::cmp::min(offset + REQUEST_BODY_CHUNK_SIZE, data.len());
|
||||
let chunk_data = data.slice(offset..end).to_vec();
|
||||
let chunk = BodyChunk::new(body_id, chunk_index, chunk_data);
|
||||
response_ctx.app_handle.blobs().insert_chunk(&chunk)?;
|
||||
offset = end;
|
||||
chunk_index += 1;
|
||||
}
|
||||
|
||||
// Update the response with the total request body size
|
||||
response_ctx.update(|r| {
|
||||
r.request_content_length = Some(data.len() as i32);
|
||||
})?;
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
async fn write_stream_chunks_to_db<R: Runtime>(
|
||||
app_handle: AppHandle<R>,
|
||||
body_id: &str,
|
||||
workspace_id: &str,
|
||||
response_id: &str,
|
||||
update_source: &UpdateSource,
|
||||
mut rx: tokio::sync::mpsc::UnboundedReceiver<Vec<u8>>,
|
||||
) -> Result<()> {
|
||||
let mut buffer = Vec::with_capacity(REQUEST_BODY_CHUNK_SIZE);
|
||||
let mut chunk_index = 0;
|
||||
let mut total_bytes: usize = 0;
|
||||
|
||||
while let Some(data) = rx.recv().await {
|
||||
total_bytes += data.len();
|
||||
buffer.extend_from_slice(&data);
|
||||
|
||||
// Flush when buffer reaches chunk size
|
||||
while buffer.len() >= REQUEST_BODY_CHUNK_SIZE {
|
||||
debug!("Writing chunk {chunk_index} to DB");
|
||||
let chunk_data: Vec<u8> = buffer.drain(..REQUEST_BODY_CHUNK_SIZE).collect();
|
||||
let chunk = BodyChunk::new(body_id, chunk_index, chunk_data);
|
||||
app_handle.blobs().insert_chunk(&chunk)?;
|
||||
app_handle.db().upsert_http_response_event(
|
||||
&HttpResponseEvent::new(
|
||||
response_id,
|
||||
workspace_id,
|
||||
yaak_http::sender::HttpResponseEvent::ChunkSent {
|
||||
bytes: REQUEST_BODY_CHUNK_SIZE,
|
||||
}
|
||||
.into(),
|
||||
),
|
||||
update_source,
|
||||
)?;
|
||||
chunk_index += 1;
|
||||
}
|
||||
}
|
||||
|
||||
// Flush remaining data
|
||||
if !buffer.is_empty() {
|
||||
let chunk = BodyChunk::new(body_id, chunk_index, buffer);
|
||||
debug!("Flushing remaining data {chunk_index} {}", chunk.data.len());
|
||||
app_handle.blobs().insert_chunk(&chunk)?;
|
||||
app_handle.db().upsert_http_response_event(
|
||||
&HttpResponseEvent::new(
|
||||
response_id,
|
||||
workspace_id,
|
||||
yaak_http::sender::HttpResponseEvent::ChunkSent { bytes: chunk.data.len() }.into(),
|
||||
),
|
||||
update_source,
|
||||
)?;
|
||||
}
|
||||
|
||||
// Update the response with the total request body size
|
||||
app_handle.with_tx(|tx| {
|
||||
debug!("Updating final body length {total_bytes}");
|
||||
if let Ok(mut response) = tx.get_http_response(&response_id) {
|
||||
response.request_content_length = Some(total_bytes as i32);
|
||||
tx.update_http_response_if_id(&response, update_source)?;
|
||||
}
|
||||
Ok(())
|
||||
})?;
|
||||
|
||||
Ok(())
|
||||
}
|
||||
|
||||
async fn apply_authentication<R: Runtime>(
|
||||
_window: &WebviewWindow<R>,
|
||||
sendable_request: &mut SendableHttpRequest,
|
||||
request: &HttpRequest,
|
||||
auth_context_id: String,
|
||||
plugin_manager: &PluginManager,
|
||||
plugin_context: &PluginContext,
|
||||
) -> Result<()> {
|
||||
match &request.authentication_type {
|
||||
None => {
|
||||
// No authentication found. Not even inherited
|
||||
}
|
||||
Some(authentication_type) if authentication_type == "none" => {
|
||||
// Explicitly no authentication
|
||||
}
|
||||
Some(authentication_type) => {
|
||||
let req = CallHttpAuthenticationRequest {
|
||||
context_id: format!("{:x}", md5::compute(auth_context_id)),
|
||||
values: serde_json::from_value(serde_json::to_value(&request.authentication)?)?,
|
||||
url: sendable_request.url.clone(),
|
||||
method: sendable_request.method.clone(),
|
||||
headers: sendable_request
|
||||
.headers
|
||||
.iter()
|
||||
.map(|(name, value)| HttpHeader {
|
||||
name: name.to_string(),
|
||||
value: value.to_string(),
|
||||
})
|
||||
.collect(),
|
||||
};
|
||||
let plugin_result = plugin_manager
|
||||
.call_http_authentication(plugin_context, &authentication_type, req)
|
||||
.await?;
|
||||
|
||||
for header in plugin_result.set_headers.unwrap_or_default() {
|
||||
sendable_request.insert_header((header.name, header.value));
|
||||
}
|
||||
|
||||
if let Some(params) = plugin_result.set_query_parameters {
|
||||
let params = params.into_iter().map(|p| (p.name, p.value)).collect::<Vec<_>>();
|
||||
sendable_request.url = append_query_params(&sendable_request.url, params);
|
||||
}
|
||||
}
|
||||
}
|
||||
Ok(())
|
||||
}
|
||||
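The body-streaming loop in `execute_transaction` above combines two patterns worth noting: a `biased` `tokio::select!` so cancellation is observed before the next read is polled, and time-based throttling so the response row is written at most every 100ms. Below is a minimal standalone sketch of the same loop, assuming only tokio; the `watch` channel and the `println!` progress sink are illustrative stand-ins for Yaak's `cancelled_rx` and `response_ctx.update`.

use std::time::{Duration, Instant};
use tokio::io::AsyncReadExt;
use tokio::sync::watch;

// Sketch: read `reader` to completion, reporting progress at most every
// 100ms, and stop immediately when `cancelled` flips to true.
async fn drain_with_cancel<R: tokio::io::AsyncRead + Unpin>(
    mut reader: R,
    mut cancelled: watch::Receiver<bool>,
) -> std::io::Result<usize> {
    let mut buf = [0u8; 8192];
    let mut total = 0usize;
    let mut last_report = Instant::now();

    loop {
        if *cancelled.borrow() {
            break; // already cancelled before this iteration
        }
        let n = tokio::select! {
            biased; // poll the cancellation branch before the read
            _ = cancelled.changed() => break,
            res = reader.read(&mut buf) => res?,
        };
        if n == 0 {
            break; // EOF
        }
        total += n;
        if last_report.elapsed() >= Duration::from_millis(100) {
            println!("progress: {total} bytes"); // stand-in for the DB update
            last_report = Instant::now();
        }
    }
    Ok(total)
}

#[tokio::main]
async fn main() -> std::io::Result<()> {
    let (_tx, rx) = watch::channel(false); // keep `_tx` alive so `changed()` doesn't error
    let data = vec![0u8; 1 << 20];
    let n = drain_with_cancel(&mut data.as_slice(), rx).await?;
    println!("read {n} bytes");
    Ok(())
}

The `biased;` keyword matters: without it, select! polls branches in random order, so a ready read could win over a simultaneous cancellation; with it, the cancellation check always runs first.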
@@ -1,359 +0,0 @@
//! Tauri-specific plugin management code.
//!
//! This module contains all Tauri integration for the plugin system:
//! - Plugin initialization and lifecycle management
//! - Tauri commands for plugin search/install/uninstall
//! - Plugin update checking

use crate::PluginContextExt;
use crate::error::Result;
use crate::models_ext::QueryManagerExt;
use log::{error, info, warn};
use serde::Serialize;
use std::sync::Arc;
use std::sync::atomic::{AtomicBool, Ordering};
use std::time::{Duration, Instant};
use tauri::path::BaseDirectory;
use tauri::plugin::{Builder, TauriPlugin};
use tauri::{
    AppHandle, Emitter, Manager, RunEvent, Runtime, State, WebviewWindow, WindowEvent, command,
    is_dev,
};
use tokio::sync::Mutex;
use ts_rs::TS;
use yaak_models::models::Plugin;
use yaak_models::util::UpdateSource;
use yaak_plugins::api::{
    PluginNameVersion, PluginSearchResponse, PluginUpdatesResponse, check_plugin_updates,
    search_plugins,
};
use yaak_plugins::events::{Color, Icon, PluginContext, ShowToastRequest};
use yaak_plugins::install::{delete_and_uninstall, download_and_install};
use yaak_plugins::manager::PluginManager;
use yaak_plugins::plugin_meta::get_plugin_meta;
use yaak_tauri_utils::api_client::yaak_api_client;

static EXITING: AtomicBool = AtomicBool::new(false);

// ============================================================================
// Plugin Updater
// ============================================================================

const MAX_UPDATE_CHECK_HOURS: u64 = 12;

pub struct PluginUpdater {
    last_check: Option<Instant>,
}

#[derive(Debug, Clone, PartialEq, Serialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub struct PluginUpdateNotification {
    pub update_count: usize,
    pub plugins: Vec<PluginUpdateInfo>,
}

#[derive(Debug, Clone, PartialEq, Serialize, TS)]
#[serde(rename_all = "camelCase")]
#[ts(export, export_to = "index.ts")]
pub struct PluginUpdateInfo {
    pub name: String,
    pub current_version: String,
    pub latest_version: String,
}

impl PluginUpdater {
    pub fn new() -> Self {
        Self { last_check: None }
    }

    pub async fn check_now<R: Runtime>(&mut self, window: &WebviewWindow<R>) -> Result<bool> {
        self.last_check = Some(Instant::now());

        info!("Checking for plugin updates");

        let http_client = yaak_api_client(window.app_handle())?;
        let plugins = window.app_handle().db().list_plugins()?;
        let updates = check_plugin_updates(&http_client, plugins.clone()).await?;

        if updates.plugins.is_empty() {
            info!("No plugin updates available");
            return Ok(false);
        }

        // Get current plugin versions to build notification
        let mut update_infos = Vec::new();

        for update in &updates.plugins {
            if let Some(plugin) = plugins.iter().find(|p| {
                if let Ok(meta) = get_plugin_meta(&std::path::Path::new(&p.directory)) {
                    meta.name == update.name
                } else {
                    false
                }
            }) {
                if let Ok(meta) = get_plugin_meta(&std::path::Path::new(&plugin.directory)) {
                    update_infos.push(PluginUpdateInfo {
                        name: update.name.clone(),
                        current_version: meta.version,
                        latest_version: update.version.clone(),
                    });
                }
            }
        }

        let notification =
            PluginUpdateNotification { update_count: update_infos.len(), plugins: update_infos };

        info!("Found {} plugin update(s)", notification.update_count);

        if let Err(e) = window.emit_to(window.label(), "plugin_updates_available", &notification) {
            error!("Failed to emit plugin_updates_available event: {}", e);
        }

        Ok(true)
    }

    pub async fn maybe_check<R: Runtime>(&mut self, window: &WebviewWindow<R>) -> Result<bool> {
        let update_period_seconds = MAX_UPDATE_CHECK_HOURS * 60 * 60;

        if let Some(i) = self.last_check
            && i.elapsed().as_secs() < update_period_seconds
        {
            return Ok(false);
        }

        self.check_now(window).await
    }
}

// ============================================================================
// Tauri Commands
// ============================================================================

#[command]
pub async fn cmd_plugins_search<R: Runtime>(
    app_handle: AppHandle<R>,
    query: &str,
) -> Result<PluginSearchResponse> {
    let http_client = yaak_api_client(&app_handle)?;
    Ok(search_plugins(&http_client, query).await?)
}

#[command]
pub async fn cmd_plugins_install<R: Runtime>(
    window: WebviewWindow<R>,
    name: &str,
    version: Option<String>,
) -> Result<()> {
    let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
    let http_client = yaak_api_client(window.app_handle())?;
    let query_manager = window.state::<yaak_models::query_manager::QueryManager>();
    let plugin_context = window.plugin_context();
    download_and_install(
        plugin_manager,
        &query_manager,
        &http_client,
        &plugin_context,
        name,
        version,
    )
    .await?;
    Ok(())
}

#[command]
pub async fn cmd_plugins_uninstall<R: Runtime>(
    plugin_id: &str,
    window: WebviewWindow<R>,
) -> Result<Plugin> {
    let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
    let query_manager = window.state::<yaak_models::query_manager::QueryManager>();
    let plugin_context = window.plugin_context();
    Ok(delete_and_uninstall(plugin_manager, &query_manager, &plugin_context, plugin_id).await?)
}

#[command]
pub async fn cmd_plugins_updates<R: Runtime>(
    app_handle: AppHandle<R>,
) -> Result<PluginUpdatesResponse> {
    let http_client = yaak_api_client(&app_handle)?;
    let plugins = app_handle.db().list_plugins()?;
    Ok(check_plugin_updates(&http_client, plugins).await?)
}

#[command]
pub async fn cmd_plugins_update_all<R: Runtime>(
    window: WebviewWindow<R>,
) -> Result<Vec<PluginNameVersion>> {
    let http_client = yaak_api_client(window.app_handle())?;
    let plugins = window.db().list_plugins()?;

    // Get list of available updates (already filtered to only registry plugins)
    let updates = check_plugin_updates(&http_client, plugins).await?;

    if updates.plugins.is_empty() {
        return Ok(Vec::new());
    }

    let plugin_manager = Arc::new((*window.state::<PluginManager>()).clone());
    let query_manager = window.state::<yaak_models::query_manager::QueryManager>();
    let plugin_context = window.plugin_context();

    let mut updated = Vec::new();

    for update in updates.plugins {
        info!("Updating plugin: {} to version {}", update.name, update.version);
        match download_and_install(
            plugin_manager.clone(),
            &query_manager,
            &http_client,
            &plugin_context,
            &update.name,
            Some(update.version.clone()),
        )
        .await
        {
            Ok(_) => {
                info!("Successfully updated plugin: {}", update.name);
                updated.push(update.clone());
            }
            Err(e) => {
                log::error!("Failed to update plugin {}: {:?}", update.name, e);
            }
        }
    }

    Ok(updated)
}

// ============================================================================
// Tauri Plugin Initialization
// ============================================================================

pub fn init<R: Runtime>() -> TauriPlugin<R> {
    Builder::new("yaak-plugins")
        .setup(|app_handle, _| {
            // Resolve paths for plugin manager
            let vendored_plugin_dir = app_handle
                .path()
                .resolve("vendored/plugins", BaseDirectory::Resource)
                .expect("failed to resolve plugin directory resource");

            let installed_plugin_dir = app_handle
                .path()
                .app_data_dir()
                .expect("failed to get app data dir")
                .join("installed-plugins");

            #[cfg(target_os = "windows")]
            let node_bin_name = "yaaknode.exe";
            #[cfg(not(target_os = "windows"))]
            let node_bin_name = "yaaknode";

            let node_bin_path = app_handle
                .path()
                .resolve(format!("vendored/node/{}", node_bin_name), BaseDirectory::Resource)
                .expect("failed to resolve yaaknode binary");

            let plugin_runtime_main = app_handle
                .path()
                .resolve("vendored/plugin-runtime", BaseDirectory::Resource)
                .expect("failed to resolve plugin runtime")
                .join("index.cjs");

            let dev_mode = is_dev();

            // Create plugin manager asynchronously
            let app_handle_clone = app_handle.clone();
            tauri::async_runtime::block_on(async move {
                let manager = PluginManager::new(
                    vendored_plugin_dir,
                    installed_plugin_dir,
                    node_bin_path,
                    plugin_runtime_main,
                    dev_mode,
                )
                .await;

                // Initialize all plugins after manager is created
                let bundled_dirs = manager
                    .list_bundled_plugin_dirs()
                    .await
                    .expect("Failed to list bundled plugins");

                // Ensure all bundled plugins make it into the database
                let db = app_handle_clone.db();
                for dir in &bundled_dirs {
                    if db.get_plugin_by_directory(dir).is_none() {
                        db.upsert_plugin(
                            &Plugin {
                                directory: dir.clone(),
                                enabled: true,
                                url: None,
                                ..Default::default()
                            },
                            &UpdateSource::Background,
                        )
                        .expect("Failed to upsert bundled plugin");
                    }
                }

                // Get all plugins from database and initialize
                let plugins = db.list_plugins().expect("Failed to list plugins from database");
                drop(db); // Explicitly drop the connection before await

                let errors =
                    manager.initialize_all_plugins(plugins, &PluginContext::new_empty()).await;

                // Show toast for any failed plugins
                for (plugin_dir, error_msg) in errors {
                    let plugin_name = plugin_dir.split('/').last().unwrap_or(&plugin_dir);
                    let toast = ShowToastRequest {
                        message: format!("Failed to start plugin '{}': {}", plugin_name, error_msg),
                        color: Some(Color::Danger),
                        icon: Some(Icon::AlertTriangle),
                        timeout: Some(10000),
                    };
                    if let Err(emit_err) = app_handle_clone.emit("show_toast", toast) {
                        error!("Failed to emit toast for plugin error: {emit_err:?}");
                    }
                }

                app_handle_clone.manage(manager);
            });

            let plugin_updater = PluginUpdater::new();
            app_handle.manage(Mutex::new(plugin_updater));

            Ok(())
        })
        .on_event(|app, e| match e {
            RunEvent::ExitRequested { api, .. } => {
                if EXITING.swap(true, Ordering::SeqCst) {
                    return; // Only exit once to prevent infinite recursion
                }
                api.prevent_exit();
                tauri::async_runtime::block_on(async move {
                    info!("Exiting plugin runtime due to app exit");
                    let manager: State<PluginManager> = app.state();
                    manager.terminate().await;
                    app.exit(0);
                });
            }
            RunEvent::WindowEvent { event: WindowEvent::Focused(true), label, .. } => {
                // Check for plugin updates on window focus
                let w = app.get_webview_window(&label).unwrap();
                let h = app.clone();
                tauri::async_runtime::spawn(async move {
                    tokio::time::sleep(Duration::from_secs(3)).await;
                    let val: State<'_, Mutex<PluginUpdater>> = h.state();
                    if let Err(e) = val.lock().await.maybe_check(&w).await {
                        warn!("Failed to check for plugin updates {e:?}");
                    }
                });
            }
            _ => {}
        })
        .build()
}
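Two small pieces of state carry most of this module's lifecycle logic: the `EXITING.swap(true, ...)` guard makes `ExitRequested` idempotent, and `PluginUpdater::maybe_check` gates the registry hit behind an `Instant` comparison so focus events trigger at most one check per 12-hour window. A minimal sketch of that gating, independent of Tauri (the `Throttled` type is illustrative, not a Yaak API):

use std::time::{Duration, Instant};

/// Sketch of the `PluginUpdater` gating: run an expensive check at most
/// once per interval, no matter how often callers ask.
struct Throttled {
    last_check: Option<Instant>,
    interval: Duration,
}

impl Throttled {
    fn maybe_run(&mut self, mut check: impl FnMut()) -> bool {
        if let Some(last) = self.last_check {
            if last.elapsed() < self.interval {
                return false; // within the window; skip the check
            }
        }
        self.last_check = Some(Instant::now());
        check();
        true
    }
}

fn main() {
    let mut t = Throttled { last_check: None, interval: Duration::from_secs(12 * 60 * 60) };
    assert!(t.maybe_run(|| println!("checking registry...")));
    assert!(!t.maybe_run(|| println!("skipped"))); // second call lands inside the window
}

As in `check_now`, the timestamp is taken before the check runs, so a failed check also resets the window instead of retrying on every focus event.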
@@ -1,18 +0,0 @@
[package]
name = "yaak-actions-builtin"
version = "0.1.0"
edition = "2024"
authors = ["Gregory Schier"]
publish = false

[dependencies]
yaak-actions = { workspace = true }
yaak-http = { workspace = true }
yaak-models = { workspace = true }
yaak-templates = { workspace = true }
yaak-plugins = { workspace = true }
yaak-crypto = { workspace = true }
serde = { workspace = true }
serde_json = { workspace = true }
tokio = { workspace = true, features = ["sync", "rt-multi-thread"] }
log = { workspace = true }
@@ -1,88 +0,0 @@
//! Dependency injection for built-in actions.

use std::path::{Path, PathBuf};
use std::sync::Arc;
use yaak_crypto::manager::EncryptionManager;
use yaak_models::query_manager::QueryManager;
use yaak_plugins::events::PluginContext;
use yaak_plugins::manager::PluginManager;

/// Dependencies needed by built-in action implementations.
///
/// This struct bundles all the dependencies that action handlers need,
/// providing a clean way to initialize them in different contexts
/// (CLI, Tauri app, MCP server, etc.).
pub struct BuiltinActionDependencies {
    pub query_manager: Arc<QueryManager>,
    pub plugin_manager: Arc<PluginManager>,
    pub encryption_manager: Arc<EncryptionManager>,
}

impl BuiltinActionDependencies {
    /// Create dependencies for standalone usage (CLI, MCP server, etc.)
    ///
    /// This initializes all the necessary managers following the same pattern
    /// as the yaak-cli implementation.
    pub async fn new_standalone(
        db_path: &Path,
        blob_path: &Path,
        app_id: &str,
        plugin_vendored_dir: PathBuf,
        plugin_installed_dir: PathBuf,
        node_path: PathBuf,
    ) -> Result<Self, Box<dyn std::error::Error>> {
        // Initialize database
        let (query_manager, _, _) = yaak_models::init_standalone(db_path, blob_path)?;

        // Initialize encryption manager (takes QueryManager by value)
        let encryption_manager = Arc::new(EncryptionManager::new(
            query_manager.clone(),
            app_id.to_string(),
        ));

        let query_manager = Arc::new(query_manager);

        // Find plugin runtime
        let plugin_runtime_main = std::env::var("YAAK_PLUGIN_RUNTIME")
            .map(PathBuf::from)
            .unwrap_or_else(|_| {
                // Development fallback
                PathBuf::from(env!("CARGO_MANIFEST_DIR"))
                    .join("../../crates-tauri/yaak-app/vendored/plugin-runtime/index.cjs")
            });

        // Initialize plugin manager
        let plugin_manager = Arc::new(
            PluginManager::new(
                plugin_vendored_dir,
                plugin_installed_dir,
                node_path,
                plugin_runtime_main,
                false, // not sandboxed in CLI
            )
            .await,
        );

        // Initialize plugins from database
        let db = query_manager.connect();
        let plugins = db.list_plugins().unwrap_or_default();
        if !plugins.is_empty() {
            let errors = plugin_manager
                .initialize_all_plugins(plugins, &PluginContext::new_empty())
                .await;
            for (plugin_dir, error_msg) in errors {
                log::warn!(
                    "Failed to initialize plugin '{}': {}",
                    plugin_dir,
                    error_msg
                );
            }
        }

        Ok(Self {
            query_manager,
            plugin_manager,
            encryption_manager,
        })
    }
}
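A hedged sketch of how a standalone binary might call `new_standalone`; the paths and app ID below are placeholders, and a real caller would derive them from its own configuration or data directory:

use std::path::{Path, PathBuf};
use yaak_actions_builtin::BuiltinActionDependencies;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let deps = BuiltinActionDependencies::new_standalone(
        Path::new("/tmp/yaak/db.sqlite"),        // db_path (placeholder)
        Path::new("/tmp/yaak/blobs.sqlite"),     // blob_path (placeholder)
        "yaak-cli",                              // app_id (placeholder)
        PathBuf::from("vendored/plugins"),       // plugin_vendored_dir
        PathBuf::from("installed-plugins"),      // plugin_installed_dir
        PathBuf::from("vendored/node/yaaknode"), // node_path
    )
    .await?;

    // Managers are now ready to be handed to action handlers.
    let _db = deps.query_manager.connect();
    Ok(())
}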
@@ -1,24 +0,0 @@
//! HTTP action implementations.

pub mod send;

use crate::BuiltinActionDependencies;
use yaak_actions::{ActionError, ActionExecutor, ActionSource};

/// Register all HTTP-related actions with the executor.
pub async fn register_http_actions(
    executor: &ActionExecutor,
    deps: &BuiltinActionDependencies,
) -> Result<(), ActionError> {
    let handler = send::HttpSendActionHandler {
        query_manager: deps.query_manager.clone(),
        plugin_manager: deps.plugin_manager.clone(),
        encryption_manager: deps.encryption_manager.clone(),
    };

    executor
        .register(send::metadata(), ActionSource::Builtin, handler)
        .await?;

    Ok(())
}
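Wiring this into an executor could look roughly like the following sketch; `ActionExecutor::new()` is an assumed constructor, since the executor itself lives in yaak-actions outside this diff:

use yaak_actions::ActionExecutor;
use yaak_actions_builtin::{BuiltinActionDependencies, register_http_actions};

// Hedged sketch: build an executor and register the built-in HTTP actions.
async fn wire_actions(
    deps: &BuiltinActionDependencies,
) -> Result<ActionExecutor, yaak_actions::ActionError> {
    let executor = ActionExecutor::new(); // assumed constructor
    register_http_actions(&executor, deps).await?;
    Ok(executor)
}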
@@ -1,293 +0,0 @@
//! HTTP send action implementation.

use std::collections::BTreeMap;
use std::sync::Arc;
use serde_json::{json, Value};
use tokio::sync::mpsc;
use yaak_actions::{
    ActionError, ActionGroupId, ActionHandler, ActionId, ActionMetadata,
    ActionParams, ActionResult, ActionScope, CurrentContext,
    RequiredContext,
};
use yaak_crypto::manager::EncryptionManager;
use yaak_http::path_placeholders::apply_path_placeholders;
use yaak_http::sender::{HttpSender, ReqwestSender};
use yaak_http::types::{SendableHttpRequest, SendableHttpRequestOptions};
use yaak_models::models::{HttpRequest, HttpRequestHeader, HttpUrlParameter};
use yaak_models::query_manager::QueryManager;
use yaak_models::render::make_vars_hashmap;
use yaak_plugins::events::{PluginContext, RenderPurpose};
use yaak_plugins::manager::PluginManager;
use yaak_plugins::template_callback::PluginTemplateCallback;
use yaak_templates::{parse_and_render, render_json_value_raw, RenderOptions};

/// Handler for HTTP send action.
pub struct HttpSendActionHandler {
    pub query_manager: Arc<QueryManager>,
    pub plugin_manager: Arc<PluginManager>,
    pub encryption_manager: Arc<EncryptionManager>,
}

/// Metadata for the HTTP send action.
pub fn metadata() -> ActionMetadata {
    ActionMetadata {
        id: ActionId::builtin("http", "send-request"),
        label: "Send HTTP Request".to_string(),
        description: Some("Execute an HTTP request and return the response".to_string()),
        icon: Some("play".to_string()),
        scope: ActionScope::HttpRequest,
        keyboard_shortcut: None,
        requires_selection: true,
        enabled_condition: None,
        group_id: Some(ActionGroupId::builtin("send")),
        order: 10,
        required_context: RequiredContext::requires_target(),
    }
}

impl ActionHandler for HttpSendActionHandler {
    fn handle(
        &self,
        context: CurrentContext,
        params: ActionParams,
    ) -> std::pin::Pin<
        Box<dyn std::future::Future<Output = Result<ActionResult, ActionError>> + Send + 'static>,
    > {
        let query_manager = self.query_manager.clone();
        let plugin_manager = self.plugin_manager.clone();
        let encryption_manager = self.encryption_manager.clone();

        Box::pin(async move {
            // Extract request_id from context
            let request_id = context
                .target
                .as_ref()
                .ok_or_else(|| ActionError::ContextMissing {
                    missing_fields: vec!["target".to_string()],
                })?
                .id()
                .ok_or_else(|| ActionError::ContextMissing {
                    missing_fields: vec!["target.id".to_string()],
                })?
                .to_string();

            // Fetch request and environment from database (synchronous)
            let (request, environment_chain) = {
                let db = query_manager.connect();

                // Fetch HTTP request from database
                let request = db.get_http_request(&request_id).map_err(|e| {
                    ActionError::Internal(format!("Failed to fetch request {}: {}", request_id, e))
                })?;

                // Resolve environment chain for variable substitution
                let environment_chain = if let Some(env_id) = &context.environment_id {
                    db.resolve_environments(
                        &request.workspace_id,
                        request.folder_id.as_deref(),
                        Some(env_id),
                    )
                    .unwrap_or_default()
                } else {
                    db.resolve_environments(
                        &request.workspace_id,
                        request.folder_id.as_deref(),
                        None,
                    )
                    .unwrap_or_default()
                };

                (request, environment_chain)
            }; // db is dropped here

            // Create template callback with plugin support
            let plugin_context = PluginContext::new(None, Some(request.workspace_id.clone()));
            let template_callback = PluginTemplateCallback::new(
                plugin_manager,
                encryption_manager,
                &plugin_context,
                RenderPurpose::Send,
            );

            // Render templates in the request
            let rendered_request = render_http_request(
                &request,
                environment_chain,
                &template_callback,
                &RenderOptions::throw(),
            )
            .await
            .map_err(|e| ActionError::Internal(format!("Failed to render request: {}", e)))?;

            // Build sendable request
            let options = SendableHttpRequestOptions {
                timeout: params
                    .data
                    .get("timeout_ms")
                    .and_then(|v| v.as_u64())
                    .map(|ms| std::time::Duration::from_millis(ms)),
                follow_redirects: params
                    .data
                    .get("follow_redirects")
                    .and_then(|v| v.as_bool())
                    .unwrap_or(false),
            };

            let sendable = SendableHttpRequest::from_http_request(&rendered_request, options)
                .await
                .map_err(|e| ActionError::Internal(format!("Failed to build request: {}", e)))?;

            // Create event channel
            let (event_tx, mut event_rx) = mpsc::channel(100);

            // Spawn task to drain events
            let _event_handle = tokio::spawn(async move {
                while event_rx.recv().await.is_some() {
                    // For now, just drain events
                    // In the future, we could log them or emit them to UI
                }
            });

            // Send the request
            let sender = ReqwestSender::new()
                .map_err(|e| ActionError::Internal(format!("Failed to create HTTP client: {}", e)))?;
            let response = sender
                .send(sendable, event_tx)
                .await
                .map_err(|e| ActionError::Internal(format!("Failed to send request: {}", e)))?;

            // Consume response body
            let status = response.status;
            let status_reason = response.status_reason.clone();
            let headers = response.headers.clone();
            let url = response.url.clone();

            let (body_text, stats) = response
                .text()
                .await
                .map_err(|e| ActionError::Internal(format!("Failed to read response body: {}", e)))?;

            // Return success result with response data
            Ok(ActionResult::Success {
                data: Some(json!({
                    "status": status,
                    "statusReason": status_reason,
                    "headers": headers,
                    "body": body_text,
                    "contentLength": stats.size_decompressed,
                    "url": url,
                })),
                message: Some(format!("HTTP {}", status)),
            })
        })
    }
}

/// Helper function to render templates in an HTTP request.
/// Copied from yaak-cli implementation.
async fn render_http_request(
    r: &HttpRequest,
    environment_chain: Vec<yaak_models::models::Environment>,
    cb: &PluginTemplateCallback,
    opt: &RenderOptions,
) -> Result<HttpRequest, String> {
    let vars = &make_vars_hashmap(environment_chain);

    let mut url_parameters = Vec::new();
    for p in r.url_parameters.clone() {
        if !p.enabled {
            continue;
        }
        url_parameters.push(HttpUrlParameter {
            enabled: p.enabled,
            name: parse_and_render(p.name.as_str(), vars, cb, opt)
                .await
                .map_err(|e| e.to_string())?,
            value: parse_and_render(p.value.as_str(), vars, cb, opt)
                .await
                .map_err(|e| e.to_string())?,
            id: p.id,
        })
    }

    let mut headers = Vec::new();
    for p in r.headers.clone() {
        if !p.enabled {
            continue;
        }
        headers.push(HttpRequestHeader {
            enabled: p.enabled,
            name: parse_and_render(p.name.as_str(), vars, cb, opt)
                .await
                .map_err(|e| e.to_string())?,
            value: parse_and_render(p.value.as_str(), vars, cb, opt)
                .await
                .map_err(|e| e.to_string())?,
            id: p.id,
        })
    }

    let mut body = BTreeMap::new();
    for (k, v) in r.body.clone() {
        body.insert(
            k,
            render_json_value_raw(v, vars, cb, opt)
                .await
                .map_err(|e| e.to_string())?,
        );
    }

    let authentication = {
        let mut disabled = false;
        let mut auth = BTreeMap::new();
        match r.authentication.get("disabled") {
            Some(Value::Bool(true)) => {
                disabled = true;
            }
            Some(Value::String(tmpl)) => {
                disabled = parse_and_render(tmpl.as_str(), vars, cb, opt)
                    .await
                    .unwrap_or_default()
                    .is_empty();
            }
            _ => {}
        }
        if disabled {
            auth.insert("disabled".to_string(), Value::Bool(true));
        } else {
            for (k, v) in r.authentication.clone() {
                if k == "disabled" {
                    auth.insert(k, Value::Bool(false));
                } else {
                    auth.insert(
                        k,
                        render_json_value_raw(v, vars, cb, opt)
                            .await
                            .map_err(|e| e.to_string())?,
                    );
                }
            }
        }
        auth
    };

    let url = parse_and_render(r.url.clone().as_str(), vars, cb, opt)
        .await
        .map_err(|e| e.to_string())?;

    // Apply path placeholders (e.g., /users/:id -> /users/123)
    let (url, url_parameters) = apply_path_placeholders(&url, &url_parameters);

    Ok(HttpRequest {
        url,
        url_parameters,
        headers,
        body,
        authentication,
        ..r.to_owned()
    })
}
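A hedged sketch of driving this handler directly, e.g. from a CLI: only the target ID is required in the context, and `timeout_ms`/`follow_redirects` are read from `ActionParams.data`. The `ActionParams { data }` construction assumes `data` is a public `serde_json::Value` field (matching the generated TS type), and assumes `ActionTarget` is re-exported at the yaak-actions crate root alongside `CurrentContext`:

use yaak_actions::{ActionParams, ActionResult, ActionTarget, CurrentContext};
use yaak_actions_builtin::http::send::HttpSendActionHandler;

// Hedged sketch: invoke the send handler for a stored request by ID.
async fn run_send(handler: &HttpSendActionHandler, id: &str) {
    let context = CurrentContext {
        target: Some(ActionTarget::HttpRequest { id: id.to_string() }),
        ..Default::default()
    };
    let params = ActionParams { data: serde_json::json!({ "timeout_ms": 30_000 }) };
    match handler.handle(context, params).await {
        Ok(ActionResult::Success { message, .. }) => {
            println!("done: {}", message.unwrap_or_default())
        }
        Ok(_) => println!("action did not complete"),
        Err(e) => eprintln!("action failed: {e:?}"),
    }
}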
@@ -1,11 +0,0 @@
//! Built-in action implementations for Yaak.
//!
//! This crate provides concrete implementations of built-in actions using
//! the yaak-actions framework. It depends on domain-specific crates like
//! yaak-http, yaak-models, yaak-plugins, etc.

pub mod dependencies;
pub mod http;

pub use dependencies::BuiltinActionDependencies;
pub use http::register_http_actions;
@@ -1,15 +0,0 @@
[package]
name = "yaak-actions"
version = "0.1.0"
edition = "2021"
description = "Centralized action system for Yaak"

[dependencies]
serde = { workspace = true, features = ["derive"] }
serde_json = { workspace = true }
thiserror = { workspace = true }
tokio = { workspace = true, features = ["sync"] }
ts-rs = { workspace = true }

[dev-dependencies]
tokio = { workspace = true, features = ["rt-multi-thread", "macros"] }
@@ -1,14 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

/**
 * Availability status for an action.
 */
export type ActionAvailability = { "status": "available" } | { "status": "available-with-prompt",
/**
 * Fields that will require prompting.
 */
prompt_fields: Array<string>, } | { "status": "unavailable",
/**
 * Fields that are missing.
 */
missing_fields: Array<string>, } | { "status": "not-found" };
@@ -1,13 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { ActionGroupId } from "./ActionGroupId";
import type { ActionId } from "./ActionId";
import type { ActionScope } from "./ActionScope";

/**
 * Errors that can occur during action operations.
 */
export type ActionError = { "type": "not-found" } & ActionId | { "type": "disabled", action_id: ActionId, reason: string, } | { "type": "invalid-scope", expected: ActionScope, actual: ActionScope, } | { "type": "timeout" } & ActionId | { "type": "plugin-error" } & string | { "type": "validation-error" } & string | { "type": "permission-denied" } & string | { "type": "cancelled" } | { "type": "internal" } & string | { "type": "context-missing",
/**
 * The context fields that are missing.
 */
missing_fields: Array<string>, } | { "type": "group-not-found" } & ActionGroupId | { "type": "group-already-exists" } & ActionGroupId;
@@ -1,10 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

/**
 * Unique identifier for an action group.
 *
 * Format: `namespace:group-name`
 * - Built-in: `yaak:export`
 * - Plugin: `plugin.my-plugin:utilities`
 */
export type ActionGroupId = string;
@@ -1,32 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { ActionGroupId } from "./ActionGroupId";
import type { ActionScope } from "./ActionScope";

/**
 * Metadata about an action group.
 */
export type ActionGroupMetadata = {
/**
 * Unique identifier for this group.
 */
id: ActionGroupId,
/**
 * Display name for the group.
 */
name: string,
/**
 * Optional description of the group's purpose.
 */
description: string | null,
/**
 * Icon to display for the group.
 */
icon: string | null,
/**
 * Sort order for displaying groups (lower = earlier).
 */
order: number,
/**
 * Optional scope restriction (if set, group only appears in this scope).
 */
scope: ActionScope | null, };
@@ -1,18 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

/**
 * Where an action group was registered from.
 */
export type ActionGroupSource = { "type": "builtin" } | { "type": "plugin",
/**
 * Plugin reference ID.
 */
ref_id: string,
/**
 * Plugin name.
 */
name: string, } | { "type": "dynamic",
/**
 * Source identifier.
 */
source_id: string, };
@@ -1,16 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { ActionGroupMetadata } from "./ActionGroupMetadata";
import type { ActionMetadata } from "./ActionMetadata";

/**
 * A group with its actions for UI rendering.
 */
export type ActionGroupWithActions = {
/**
 * Group metadata.
 */
group: ActionGroupMetadata,
/**
 * Actions in this group.
 */
actions: Array<ActionMetadata>, };
@@ -1,10 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

/**
 * Unique identifier for an action.
 *
 * Format: `namespace:category:name`
 * - Built-in: `yaak:http-request:send`
 * - Plugin: `plugin.copy-curl:http-request:copy`
 */
export type ActionId = string;
@@ -1,54 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { ActionGroupId } from "./ActionGroupId";
import type { ActionId } from "./ActionId";
import type { ActionScope } from "./ActionScope";
import type { RequiredContext } from "./RequiredContext";

/**
 * Metadata about an action for discovery.
 */
export type ActionMetadata = {
/**
 * Unique identifier for this action.
 */
id: ActionId,
/**
 * Display label for the action.
 */
label: string,
/**
 * Optional description of what the action does.
 */
description: string | null,
/**
 * Icon name to display.
 */
icon: string | null,
/**
 * The scope this action applies to.
 */
scope: ActionScope,
/**
 * Keyboard shortcut (e.g., "Cmd+Enter").
 */
keyboardShortcut: string | null,
/**
 * Whether the action requires a selection/target.
 */
requiresSelection: boolean,
/**
 * Optional condition expression for when action is enabled.
 */
enabledCondition: string | null,
/**
 * Optional group this action belongs to.
 */
groupId: ActionGroupId | null,
/**
 * Sort order within a group (lower = earlier).
 */
order: number,
/**
 * Context requirements for this action.
 */
requiredContext: RequiredContext, };
@@ -1,10 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

/**
 * Parameters passed to action handlers.
 */
export type ActionParams = {
/**
 * Arbitrary JSON parameters.
 */
data: unknown, };
@@ -1,23 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { InputPrompt } from "./InputPrompt";

/**
 * Result of action execution.
 */
export type ActionResult = { "type": "success",
/**
 * Optional data to return.
 */
data: unknown,
/**
 * Optional message to display.
 */
message: string | null, } | { "type": "requires-input",
/**
 * Prompt to show user.
 */
prompt: InputPrompt,
/**
 * Continuation token.
 */
continuation_id: string, } | { "type": "cancelled" };
@@ -1,6 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

/**
 * The scope in which an action can be invoked.
 */
export type ActionScope = "global" | "http-request" | "websocket-request" | "grpc-request" | "workspace" | "folder" | "environment" | "cookie-jar";
@@ -1,18 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

/**
 * Where an action was registered from.
 */
export type ActionSource = { "type": "builtin" } | { "type": "plugin",
/**
 * Plugin reference ID.
 */
ref_id: string,
/**
 * Plugin name.
 */
name: string, } | { "type": "dynamic",
/**
 * Source identifier.
 */
source_id: string, };
@@ -1,6 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

/**
 * The target entity for an action.
 */
export type ActionTarget = { "type": "none" } | { "type": "http-request", id: string, } | { "type": "websocket-request", id: string, } | { "type": "grpc-request", id: string, } | { "type": "workspace", id: string, } | { "type": "folder", id: string, } | { "type": "environment", id: string, } | { "type": "multiple", targets: Array<ActionTarget>, };
@@ -1,6 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

/**
 * How strictly a context field is required.
 */
export type ContextRequirement = "not-required" | "optional" | "required" | "required-with-prompt";
@@ -1,27 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { ActionTarget } from "./ActionTarget";

/**
 * Current context state from the application.
 */
export type CurrentContext = {
/**
 * Current workspace ID (if any).
 */
workspaceId: string | null,
/**
 * Current environment ID (if any).
 */
environmentId: string | null,
/**
 * Currently selected target (if any).
 */
target: ActionTarget | null,
/**
 * Whether a window context is available.
 */
hasWindow: boolean,
/**
 * Whether the context provider can prompt for missing fields.
 */
canPrompt: boolean, };
@@ -1,7 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { SelectOption } from "./SelectOption";

/**
 * A prompt for user input.
 */
export type InputPrompt = { "type": "text", label: string, placeholder: string | null, default_value: string | null, } | { "type": "select", label: string, options: Array<SelectOption>, } | { "type": "confirm", label: string, };
@@ -1,23 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.
import type { ContextRequirement } from "./ContextRequirement";

/**
 * Specifies what context fields an action requires.
 */
export type RequiredContext = {
/**
 * Action requires a workspace to be active.
 */
workspace: ContextRequirement,
/**
 * Action requires an environment to be selected.
 */
environment: ContextRequirement,
/**
 * Action requires a specific target entity (request, folder, etc.).
 */
target: ContextRequirement,
/**
 * Action requires a window context (for UI operations).
 */
window: ContextRequirement, };
@@ -1,6 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

/**
 * An option in a select prompt.
 */
export type SelectOption = { label: string, value: string, };
@@ -1,331 +0,0 @@
//! Action context types and context-aware filtering.

use serde::{Deserialize, Serialize};
use ts_rs::TS;

use crate::ActionScope;

/// Specifies what context fields an action requires.
#[derive(Clone, Debug, Default, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(rename_all = "camelCase")]
pub struct RequiredContext {
    /// Action requires a workspace to be active.
    #[serde(default)]
    pub workspace: ContextRequirement,

    /// Action requires an environment to be selected.
    #[serde(default)]
    pub environment: ContextRequirement,

    /// Action requires a specific target entity (request, folder, etc.).
    #[serde(default)]
    pub target: ContextRequirement,

    /// Action requires a window context (for UI operations).
    #[serde(default)]
    pub window: ContextRequirement,
}

impl RequiredContext {
    /// Action requires a target entity.
    pub fn requires_target() -> Self {
        Self {
            target: ContextRequirement::Required,
            ..Default::default()
        }
    }

    /// Action requires workspace and target.
    pub fn requires_workspace_and_target() -> Self {
        Self {
            workspace: ContextRequirement::Required,
            target: ContextRequirement::Required,
            ..Default::default()
        }
    }

    /// Action works globally, no specific context needed.
    pub fn global() -> Self {
        Self::default()
    }

    /// Action requires target with prompt if missing.
    pub fn requires_target_with_prompt() -> Self {
        Self {
            target: ContextRequirement::RequiredWithPrompt,
            ..Default::default()
        }
    }

    /// Action requires environment with prompt if missing.
    pub fn requires_environment_with_prompt() -> Self {
        Self {
            environment: ContextRequirement::RequiredWithPrompt,
            ..Default::default()
        }
    }
}

/// How strictly a context field is required.
#[derive(Clone, Debug, Default, PartialEq, Eq, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(rename_all = "kebab-case")]
pub enum ContextRequirement {
    /// Field is not needed.
    #[default]
    NotRequired,

    /// Field is optional but will be used if available.
    Optional,

    /// Field must be present; action will fail without it.
    Required,

    /// Field must be present; prompt user to select if missing.
    RequiredWithPrompt,
}

/// Current context state from the application.
#[derive(Clone, Debug, Default, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(rename_all = "camelCase")]
pub struct CurrentContext {
    /// Current workspace ID (if any).
    #[serde(skip_serializing_if = "Option::is_none")]
    pub workspace_id: Option<String>,

    /// Current environment ID (if any).
    #[serde(skip_serializing_if = "Option::is_none")]
    pub environment_id: Option<String>,

    /// Currently selected target (if any).
    #[serde(skip_serializing_if = "Option::is_none")]
    pub target: Option<ActionTarget>,

    /// Whether a window context is available.
    #[serde(default)]
    pub has_window: bool,

    /// Whether the context provider can prompt for missing fields.
    #[serde(default)]
    pub can_prompt: bool,
}

/// The target entity for an action.
#[derive(Clone, Debug, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(tag = "type", rename_all = "kebab-case")]
pub enum ActionTarget {
    /// No target.
    None,
    /// HTTP request target.
    HttpRequest { id: String },
    /// WebSocket request target.
    WebsocketRequest { id: String },
    /// gRPC request target.
    GrpcRequest { id: String },
    /// Workspace target.
    Workspace { id: String },
    /// Folder target.
    Folder { id: String },
    /// Environment target.
    Environment { id: String },
    /// Multiple targets.
    Multiple { targets: Vec<ActionTarget> },
}

impl ActionTarget {
    /// Get the scope this target corresponds to.
    pub fn scope(&self) -> Option<ActionScope> {
        match self {
            Self::None => None,
            Self::HttpRequest { .. } => Some(ActionScope::HttpRequest),
            Self::WebsocketRequest { .. } => Some(ActionScope::WebsocketRequest),
            Self::GrpcRequest { .. } => Some(ActionScope::GrpcRequest),
            Self::Workspace { .. } => Some(ActionScope::Workspace),
            Self::Folder { .. } => Some(ActionScope::Folder),
            Self::Environment { .. } => Some(ActionScope::Environment),
            Self::Multiple { .. } => None,
        }
    }

    /// Get the ID of the target (if single target).
    pub fn id(&self) -> Option<&str> {
        match self {
            Self::HttpRequest { id }
            | Self::WebsocketRequest { id }
            | Self::GrpcRequest { id }
            | Self::Workspace { id }
            | Self::Folder { id }
            | Self::Environment { id } => Some(id),
            Self::None | Self::Multiple { .. } => None,
        }
    }
}

/// Availability status for an action.
#[derive(Clone, Debug, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(tag = "status", rename_all = "kebab-case")]
pub enum ActionAvailability {
    /// Action is ready to execute.
    Available,

    /// Action can execute but will prompt for missing context.
    AvailableWithPrompt {
        /// Fields that will require prompting.
        prompt_fields: Vec<String>,
    },

    /// Action cannot execute due to missing context.
    Unavailable {
        /// Fields that are missing.
        missing_fields: Vec<String>,
    },

    /// Action not found in registry.
    NotFound,
}

impl ActionAvailability {
    /// Check if the action is available (possibly with prompts).
    pub fn is_available(&self) -> bool {
        matches!(self, Self::Available | Self::AvailableWithPrompt { .. })
    }

    /// Check if the action is immediately available without prompts.
    pub fn is_immediately_available(&self) -> bool {
        matches!(self, Self::Available)
    }
}

/// Check if required context is satisfied by current context.
pub fn check_context_availability(
    required: &RequiredContext,
    current: &CurrentContext,
) -> ActionAvailability {
    let mut missing_fields = Vec::new();
    let mut prompt_fields = Vec::new();

    // Check workspace
    check_field(
        "workspace",
        current.workspace_id.is_some(),
        &required.workspace,
        current.can_prompt,
        &mut missing_fields,
        &mut prompt_fields,
    );

    // Check environment
    check_field(
        "environment",
        current.environment_id.is_some(),
        &required.environment,
        current.can_prompt,
        &mut missing_fields,
        &mut prompt_fields,
    );

    // Check target
    check_field(
        "target",
        current.target.is_some(),
        &required.target,
        current.can_prompt,
        &mut missing_fields,
        &mut prompt_fields,
    );

    // Check window
    check_field(
        "window",
        current.has_window,
        &required.window,
        false, // Can't prompt for window
        &mut missing_fields,
        &mut prompt_fields,
    );

    if !missing_fields.is_empty() {
        ActionAvailability::Unavailable { missing_fields }
    } else if !prompt_fields.is_empty() {
        ActionAvailability::AvailableWithPrompt { prompt_fields }
    } else {
        ActionAvailability::Available
    }
}

fn check_field(
    name: &str,
    has_value: bool,
    requirement: &ContextRequirement,
    can_prompt: bool,
    missing: &mut Vec<String>,
    promptable: &mut Vec<String>,
) {
    match requirement {
        ContextRequirement::NotRequired | ContextRequirement::Optional => {}
        ContextRequirement::Required => {
            if !has_value {
                missing.push(name.to_string());
            }
        }
        ContextRequirement::RequiredWithPrompt => {
            if !has_value {
                if can_prompt {
                    promptable.push(name.to_string());
                } else {
                    missing.push(name.to_string());
                }
            }
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_context_available() {
        let required = RequiredContext::requires_target();
        let current = CurrentContext {
            target: Some(ActionTarget::HttpRequest {
                id: "123".to_string(),
            }),
            ..Default::default()
        };

        let availability = check_context_availability(&required, &current);
        assert!(matches!(availability, ActionAvailability::Available));
    }

    #[test]
    fn test_context_missing() {
        let required = RequiredContext::requires_target();
        let current = CurrentContext::default();

        let availability = check_context_availability(&required, &current);
        assert!(matches!(
            availability,
            ActionAvailability::Unavailable { missing_fields } if missing_fields == vec!["target"]
        ));
    }

    #[test]
    fn test_context_promptable() {
        let required = RequiredContext::requires_target_with_prompt();
        let current = CurrentContext {
            can_prompt: true,
            ..Default::default()
        };

        let availability = check_context_availability(&required, &current);
        assert!(matches!(
            availability,
            ActionAvailability::AvailableWithPrompt { prompt_fields } if prompt_fields == vec!["target"]
        ));
    }
}
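Taken together, the types above make the availability check easy to exercise in isolation. A minimal sketch, mirroring the tests: `RequiredWithPrompt` degrades to a hard requirement when the caller cannot prompt (the crate name `yaak_actions` is assumed here; everything else is defined in the file above).

```rust
use yaak_actions::{
    check_context_availability, ActionAvailability, CurrentContext, RequiredContext,
};

fn main() {
    // An action that wants an environment, prompting the user if none is selected.
    let required = RequiredContext::requires_environment_with_prompt();

    // A headless caller (e.g. a deep link) that cannot prompt is told the action
    // is Unavailable, because RequiredWithPrompt falls back to Required.
    let headless = CurrentContext::default();
    assert!(matches!(
        check_context_availability(&required, &headless),
        ActionAvailability::Unavailable { .. }
    ));

    // A UI caller that can prompt gets AvailableWithPrompt instead.
    let ui = CurrentContext { can_prompt: true, ..Default::default() };
    assert!(matches!(
        check_context_availability(&required, &ui),
        ActionAvailability::AvailableWithPrompt { .. }
    ));
}
```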
@@ -1,131 +0,0 @@
//! Error types for the action system.

use serde::{Deserialize, Serialize};
use thiserror::Error;
use ts_rs::TS;

use crate::{ActionGroupId, ActionId};

/// Errors that can occur during action operations.
#[derive(Debug, Error, Clone, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(tag = "type", rename_all = "kebab-case")]
pub enum ActionError {
    /// Action not found in registry.
    #[error("Action not found: {0}")]
    NotFound(ActionId),

    /// Action is disabled in current context.
    #[error("Action is disabled: {action_id} - {reason}")]
    Disabled { action_id: ActionId, reason: String },

    /// Invalid scope for the action.
    #[error("Invalid scope: expected {expected:?}, got {actual:?}")]
    InvalidScope {
        expected: crate::ActionScope,
        actual: crate::ActionScope,
    },

    /// Action execution timed out.
    #[error("Action timed out: {0}")]
    Timeout(ActionId),

    /// Error from plugin execution.
    #[error("Plugin error: {0}")]
    PluginError(String),

    /// Validation error in action parameters.
    #[error("Validation error: {0}")]
    ValidationError(String),

    /// Permission denied for action.
    #[error("Permission denied: {0}")]
    PermissionDenied(String),

    /// Action was cancelled by user.
    #[error("Action cancelled by user")]
    Cancelled,

    /// Internal error.
    #[error("Internal error: {0}")]
    Internal(String),

    /// Required context is missing.
    #[error("Required context missing: {missing_fields:?}")]
    ContextMissing {
        /// The context fields that are missing.
        missing_fields: Vec<String>,
    },

    /// Action group not found.
    #[error("Group not found: {0}")]
    GroupNotFound(ActionGroupId),

    /// Action group already exists.
    #[error("Group already exists: {0}")]
    GroupAlreadyExists(ActionGroupId),
}

impl ActionError {
    /// Get a user-friendly error message.
    pub fn user_message(&self) -> String {
        match self {
            Self::NotFound(id) => format!("Action '{}' is not available", id),
            Self::Disabled { reason, .. } => reason.clone(),
            Self::InvalidScope { expected, actual } => {
                format!("Action requires {:?} scope, but got {:?}", expected, actual)
            }
            Self::Timeout(_) => "The operation took too long and was cancelled".into(),
            Self::PluginError(msg) => format!("Plugin error: {}", msg),
            Self::ValidationError(msg) => format!("Invalid input: {}", msg),
            Self::PermissionDenied(resource) => format!("Permission denied for {}", resource),
            Self::Cancelled => "Operation was cancelled".into(),
            Self::Internal(_) => "An unexpected error occurred".into(),
            Self::ContextMissing { missing_fields } => {
                format!("Missing required context: {}", missing_fields.join(", "))
            }
            Self::GroupNotFound(id) => format!("Action group '{}' not found", id),
            Self::GroupAlreadyExists(id) => format!("Action group '{}' already exists", id),
        }
    }

    /// Whether this error should be reported to telemetry.
    pub fn is_reportable(&self) -> bool {
        matches!(self, Self::Internal(_) | Self::PluginError(_))
    }

    /// Whether this error can potentially be resolved by user interaction.
    pub fn is_promptable(&self) -> bool {
        matches!(self, Self::ContextMissing { .. })
    }

    /// Whether this is a user-initiated cancellation.
    pub fn is_cancelled(&self) -> bool {
        matches!(self, Self::Cancelled)
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_error_messages() {
        let err = ActionError::ContextMissing {
            missing_fields: vec!["workspace".into()],
        };
        assert_eq!(err.user_message(), "Missing required context: workspace");
        assert!(err.is_promptable());
        assert!(!err.is_cancelled());

        let cancelled = ActionError::Cancelled;
        assert!(cancelled.is_cancelled());
        assert!(!cancelled.is_promptable());

        let not_found = ActionError::NotFound(ActionId::builtin("test", "action"));
        assert_eq!(
            not_found.user_message(),
            "Action 'yaak:test:action' is not available"
        );
    }
}
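A hedged sketch of how a caller might route these errors, using only the helpers defined above (the `yaak_actions` crate name is an assumption):

```rust
use yaak_actions::ActionError;

fn report(err: &ActionError) {
    if err.is_cancelled() {
        return; // the user backed out; nothing to show or report
    }
    if err.is_reportable() {
        // Only Internal and PluginError qualify for telemetry.
        eprintln!("telemetry: {err}");
    }
    // Surface the friendly message, never the Debug form.
    eprintln!("{}", err.user_message());
}

fn main() {
    report(&ActionError::ContextMissing { missing_fields: vec!["workspace".into()] });
}
```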
@@ -1,606 +0,0 @@
//! Action executor - central hub for action registration and invocation.

use std::collections::HashMap;
use std::sync::Arc;
use tokio::sync::RwLock;

use crate::{
    check_context_availability, ActionAvailability, ActionError, ActionGroupId,
    ActionGroupMetadata, ActionGroupSource, ActionGroupWithActions, ActionHandler, ActionId,
    ActionMetadata, ActionParams, ActionResult, ActionScope, ActionSource, CurrentContext,
    RegisteredActionGroup,
};

/// Options for listing actions.
#[derive(Clone, Debug, Default)]
pub struct ListActionsOptions {
    /// Filter by scope.
    pub scope: Option<ActionScope>,
    /// Filter by group.
    pub group_id: Option<ActionGroupId>,
    /// Search term for label/description.
    pub search: Option<String>,
}

/// A registered action with its handler.
struct RegisteredAction {
    /// Action metadata.
    metadata: ActionMetadata,
    /// Where the action was registered from.
    source: ActionSource,
    /// The handler for this action.
    handler: Arc<dyn ActionHandler>,
}

/// Central hub for action registration and invocation.
///
/// The executor owns all action metadata and handlers, ensuring every
/// registered action has a handler by construction.
pub struct ActionExecutor {
    /// All registered actions indexed by ID.
    actions: RwLock<HashMap<ActionId, RegisteredAction>>,

    /// Actions indexed by scope for efficient filtering.
    scope_index: RwLock<HashMap<ActionScope, Vec<ActionId>>>,

    /// All registered groups indexed by ID.
    groups: RwLock<HashMap<ActionGroupId, RegisteredActionGroup>>,
}

impl Default for ActionExecutor {
    fn default() -> Self {
        Self::new()
    }
}

impl ActionExecutor {
    /// Create a new empty executor.
    pub fn new() -> Self {
        Self {
            actions: RwLock::new(HashMap::new()),
            scope_index: RwLock::new(HashMap::new()),
            groups: RwLock::new(HashMap::new()),
        }
    }

    // ─────────────────────────────────────────────────────────────────────────
    // Action Registration
    // ─────────────────────────────────────────────────────────────────────────

    /// Register an action with its handler.
    ///
    /// Every action must have a handler - this is enforced by the API.
    pub async fn register<H: ActionHandler + 'static>(
        &self,
        metadata: ActionMetadata,
        source: ActionSource,
        handler: H,
    ) -> Result<ActionId, ActionError> {
        let id = metadata.id.clone();
        let scope = metadata.scope.clone();

        let action = RegisteredAction {
            metadata,
            source,
            handler: Arc::new(handler),
        };

        // Insert action
        {
            let mut actions = self.actions.write().await;
            actions.insert(id.clone(), action);
        }

        // Update scope index
        {
            let mut index = self.scope_index.write().await;
            index.entry(scope).or_default().push(id.clone());
        }

        Ok(id)
    }

    /// Unregister an action.
    pub async fn unregister(&self, id: &ActionId) -> Result<(), ActionError> {
        let mut actions = self.actions.write().await;

        let action = actions
            .remove(id)
            .ok_or_else(|| ActionError::NotFound(id.clone()))?;

        // Update scope index
        {
            let mut index = self.scope_index.write().await;
            if let Some(ids) = index.get_mut(&action.metadata.scope) {
                ids.retain(|i| i != id);
            }
        }

        // Remove from group if assigned
        if let Some(group_id) = &action.metadata.group_id {
            let mut groups = self.groups.write().await;
            if let Some(group) = groups.get_mut(group_id) {
                group.action_ids.retain(|i| i != id);
            }
        }

        Ok(())
    }

    /// Unregister all actions from a specific source.
    pub async fn unregister_source(&self, source_id: &str) -> Vec<ActionId> {
        let actions_to_remove: Vec<ActionId> = {
            let actions = self.actions.read().await;
            actions
                .iter()
                .filter(|(_, a)| match &a.source {
                    ActionSource::Plugin { ref_id, .. } => ref_id == source_id,
                    ActionSource::Dynamic {
                        source_id: sid, ..
                    } => sid == source_id,
                    ActionSource::Builtin => false,
                })
                .map(|(id, _)| id.clone())
                .collect()
        };

        for id in &actions_to_remove {
            let _ = self.unregister(id).await;
        }

        actions_to_remove
    }

    // ─────────────────────────────────────────────────────────────────────────
    // Action Invocation
    // ─────────────────────────────────────────────────────────────────────────

    /// Invoke an action with the given context and parameters.
    ///
    /// This will:
    /// 1. Look up the action metadata
    /// 2. Check context availability
    /// 3. Execute the handler
    pub async fn invoke(
        &self,
        action_id: &ActionId,
        context: CurrentContext,
        params: ActionParams,
    ) -> Result<ActionResult, ActionError> {
        // Get action and handler
        let (metadata, handler) = {
            let actions = self.actions.read().await;
            let action = actions
                .get(action_id)
                .ok_or_else(|| ActionError::NotFound(action_id.clone()))?;
            (action.metadata.clone(), action.handler.clone())
        };

        // Check context availability
        let availability = check_context_availability(&metadata.required_context, &context);

        match availability {
            ActionAvailability::Available | ActionAvailability::AvailableWithPrompt { .. } => {
                // Context is satisfied, proceed with execution
            }
            ActionAvailability::Unavailable { missing_fields } => {
                return Err(ActionError::ContextMissing { missing_fields });
            }
            ActionAvailability::NotFound => {
                return Err(ActionError::NotFound(action_id.clone()));
            }
        }

        // Execute handler
        handler.handle(context, params).await
    }

    /// Invoke an action, skipping context validation.
    ///
    /// Use this when you've already validated the context externally.
    pub async fn invoke_unchecked(
        &self,
        action_id: &ActionId,
        context: CurrentContext,
        params: ActionParams,
    ) -> Result<ActionResult, ActionError> {
        // Get handler
        let handler = {
            let actions = self.actions.read().await;
            let action = actions
                .get(action_id)
                .ok_or_else(|| ActionError::NotFound(action_id.clone()))?;
            action.handler.clone()
        };

        // Execute handler
        handler.handle(context, params).await
    }

    // ─────────────────────────────────────────────────────────────────────────
    // Action Queries
    // ─────────────────────────────────────────────────────────────────────────

    /// Get action metadata by ID.
    pub async fn get(&self, id: &ActionId) -> Option<ActionMetadata> {
        let actions = self.actions.read().await;
        actions.get(id).map(|a| a.metadata.clone())
    }

    /// List all actions, optionally filtered.
    pub async fn list(&self, options: ListActionsOptions) -> Vec<ActionMetadata> {
        let actions = self.actions.read().await;

        let mut result: Vec<_> = actions
            .values()
            .filter(|a| {
                // Scope filter
                if let Some(scope) = &options.scope {
                    if &a.metadata.scope != scope {
                        return false;
                    }
                }

                // Group filter
                if let Some(group_id) = &options.group_id {
                    if a.metadata.group_id.as_ref() != Some(group_id) {
                        return false;
                    }
                }

                // Search filter
                if let Some(search) = &options.search {
                    let search = search.to_lowercase();
                    let matches_label = a.metadata.label.to_lowercase().contains(&search);
                    let matches_desc = a
                        .metadata
                        .description
                        .as_ref()
                        .map(|d| d.to_lowercase().contains(&search))
                        .unwrap_or(false);
                    if !matches_label && !matches_desc {
                        return false;
                    }
                }

                true
            })
            .map(|a| a.metadata.clone())
            .collect();

        // Sort by order then label
        result.sort_by(|a, b| a.order.cmp(&b.order).then_with(|| a.label.cmp(&b.label)));

        result
    }

    /// List actions available in the given context.
    pub async fn list_available(
        &self,
        context: &CurrentContext,
        options: ListActionsOptions,
    ) -> Vec<(ActionMetadata, ActionAvailability)> {
        let all_actions = self.list(options).await;

        all_actions
            .into_iter()
            .map(|action| {
                let availability =
                    check_context_availability(&action.required_context, context);
                (action, availability)
            })
            .filter(|(_, availability)| availability.is_available())
            .collect()
    }

    /// Get availability status for a specific action.
    pub async fn get_availability(
        &self,
        id: &ActionId,
        context: &CurrentContext,
    ) -> ActionAvailability {
        let actions = self.actions.read().await;

        match actions.get(id) {
            Some(action) => {
                check_context_availability(&action.metadata.required_context, context)
            }
            None => ActionAvailability::NotFound,
        }
    }

    // ─────────────────────────────────────────────────────────────────────────
    // Group Registration
    // ─────────────────────────────────────────────────────────────────────────

    /// Register an action group.
    pub async fn register_group(
        &self,
        metadata: ActionGroupMetadata,
        source: ActionGroupSource,
    ) -> Result<ActionGroupId, ActionError> {
        let id = metadata.id.clone();

        let mut groups = self.groups.write().await;
        if groups.contains_key(&id) {
            return Err(ActionError::GroupAlreadyExists(id));
        }

        groups.insert(
            id.clone(),
            RegisteredActionGroup {
                metadata,
                action_ids: Vec::new(),
                source,
            },
        );

        Ok(id)
    }

    /// Unregister a group (does not unregister its actions).
    pub async fn unregister_group(&self, id: &ActionGroupId) -> Result<(), ActionError> {
        let mut groups = self.groups.write().await;
        groups
            .remove(id)
            .ok_or_else(|| ActionError::GroupNotFound(id.clone()))?;
        Ok(())
    }

    /// Add an action to a group.
    pub async fn add_to_group(
        &self,
        action_id: &ActionId,
        group_id: &ActionGroupId,
    ) -> Result<(), ActionError> {
        // Update action's group_id
        {
            let mut actions = self.actions.write().await;
            let action = actions
                .get_mut(action_id)
                .ok_or_else(|| ActionError::NotFound(action_id.clone()))?;
            action.metadata.group_id = Some(group_id.clone());
        }

        // Add to group's action list
        {
            let mut groups = self.groups.write().await;
            let group = groups
                .get_mut(group_id)
                .ok_or_else(|| ActionError::GroupNotFound(group_id.clone()))?;

            if !group.action_ids.contains(action_id) {
                group.action_ids.push(action_id.clone());
            }
        }

        Ok(())
    }

    // ─────────────────────────────────────────────────────────────────────────
    // Group Queries
    // ─────────────────────────────────────────────────────────────────────────

    /// Get a group by ID.
    pub async fn get_group(&self, id: &ActionGroupId) -> Option<ActionGroupMetadata> {
        let groups = self.groups.read().await;
        groups.get(id).map(|g| g.metadata.clone())
    }

    /// List all groups, optionally filtered by scope.
    pub async fn list_groups(&self, scope: Option<ActionScope>) -> Vec<ActionGroupMetadata> {
        let groups = self.groups.read().await;

        let mut result: Vec<_> = groups
            .values()
            .filter(|g| {
                scope.as_ref().map_or(true, |s| {
                    g.metadata.scope.as_ref().map_or(true, |gs| gs == s)
                })
            })
            .map(|g| g.metadata.clone())
            .collect();

        result.sort_by_key(|g| g.order);
        result
    }

    /// List all actions in a specific group.
    pub async fn list_by_group(&self, group_id: &ActionGroupId) -> Vec<ActionMetadata> {
        let groups = self.groups.read().await;
        let actions = self.actions.read().await;

        groups
            .get(group_id)
            .map(|group| {
                let mut result: Vec<_> = group
                    .action_ids
                    .iter()
                    .filter_map(|id| actions.get(id).map(|a| a.metadata.clone()))
                    .collect();
                result.sort_by_key(|a| a.order);
                result
            })
            .unwrap_or_default()
    }

    /// Get actions organized by their groups.
    pub async fn list_grouped(&self, scope: Option<ActionScope>) -> Vec<ActionGroupWithActions> {
        let group_list = self.list_groups(scope).await;
        let mut result = Vec::new();

        for group in group_list {
            let actions = self.list_by_group(&group.id).await;
            result.push(ActionGroupWithActions { group, actions });
        }

        result
    }

    // ─────────────────────────────────────────────────────────────────────────
    // Built-in Registration
    // ─────────────────────────────────────────────────────────────────────────

    /// Register all built-in groups.
    pub async fn register_builtin_groups(&self) -> Result<(), ActionError> {
        for group in crate::groups::builtin::all() {
            self.register_group(group, ActionGroupSource::Builtin).await?;
        }
        Ok(())
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::{handler_fn, RequiredContext};

    async fn create_test_executor() -> ActionExecutor {
        let executor = ActionExecutor::new();
        executor
            .register(
                ActionMetadata {
                    id: ActionId::builtin("test", "echo"),
                    label: "Echo".to_string(),
                    description: None,
                    icon: None,
                    scope: ActionScope::Global,
                    keyboard_shortcut: None,
                    requires_selection: false,
                    enabled_condition: None,
                    group_id: None,
                    order: 0,
                    required_context: RequiredContext::default(),
                },
                ActionSource::Builtin,
                handler_fn(|_ctx, params| async move {
                    let msg: String = params.get("message").unwrap_or_default();
                    Ok(ActionResult::with_message(msg))
                }),
            )
            .await
            .unwrap();
        executor
    }

    #[tokio::test]
    async fn test_register_and_invoke() {
        let executor = create_test_executor().await;
        let action_id = ActionId::builtin("test", "echo");

        let params = ActionParams::from_json(serde_json::json!({
            "message": "Hello, World!"
        }));

        let result = executor
            .invoke(&action_id, CurrentContext::default(), params)
            .await
            .unwrap();

        match result {
            ActionResult::Success { message, .. } => {
                assert_eq!(message, Some("Hello, World!".to_string()));
            }
            _ => panic!("Expected Success result"),
        }
    }

    #[tokio::test]
    async fn test_invoke_not_found() {
        let executor = ActionExecutor::new();
        let action_id = ActionId::builtin("test", "unknown");

        let result = executor
            .invoke(&action_id, CurrentContext::default(), ActionParams::empty())
            .await;

        assert!(matches!(result, Err(ActionError::NotFound(_))));
    }

    #[tokio::test]
    async fn test_list_by_scope() {
        let executor = ActionExecutor::new();

        executor
            .register(
                ActionMetadata {
                    id: ActionId::builtin("global", "one"),
                    label: "Global One".to_string(),
                    description: None,
                    icon: None,
                    scope: ActionScope::Global,
                    keyboard_shortcut: None,
                    requires_selection: false,
                    enabled_condition: None,
                    group_id: None,
                    order: 0,
                    required_context: RequiredContext::default(),
                },
                ActionSource::Builtin,
                handler_fn(|_ctx, _params| async move { Ok(ActionResult::ok()) }),
            )
            .await
            .unwrap();

        executor
            .register(
                ActionMetadata {
                    id: ActionId::builtin("http", "one"),
                    label: "HTTP One".to_string(),
                    description: None,
                    icon: None,
                    scope: ActionScope::HttpRequest,
                    keyboard_shortcut: None,
                    requires_selection: false,
                    enabled_condition: None,
                    group_id: None,
                    order: 0,
                    required_context: RequiredContext::default(),
                },
                ActionSource::Builtin,
                handler_fn(|_ctx, _params| async move { Ok(ActionResult::ok()) }),
            )
            .await
            .unwrap();

        let global_actions = executor
            .list(ListActionsOptions {
                scope: Some(ActionScope::Global),
                ..Default::default()
            })
            .await;
        assert_eq!(global_actions.len(), 1);

        let http_actions = executor
            .list(ListActionsOptions {
                scope: Some(ActionScope::HttpRequest),
                ..Default::default()
            })
            .await;
        assert_eq!(http_actions.len(), 1);
    }

    #[tokio::test]
    async fn test_groups() {
        let executor = ActionExecutor::new();
        executor.register_builtin_groups().await.unwrap();

        let groups = executor.list_groups(None).await;
        assert!(!groups.is_empty());

        let export_group = executor.get_group(&ActionGroupId::builtin("export")).await;
        assert!(export_group.is_some());
        assert_eq!(export_group.unwrap().name, "Export");
    }

    #[tokio::test]
    async fn test_unregister() {
        let executor = create_test_executor().await;
        let action_id = ActionId::builtin("test", "echo");

        assert!(executor.get(&action_id).await.is_some());

        executor.unregister(&action_id).await.unwrap();
        assert!(executor.get(&action_id).await.is_none());
    }
}
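An end-to-end sketch of the executor flow: register one action and invoke it. All types come from the files in this diff; only the `yaak_actions` crate name and tokio's `macros` feature are assumptions.

```rust
use yaak_actions::{
    handler_fn, ActionExecutor, ActionId, ActionMetadata, ActionParams, ActionResult,
    ActionScope, ActionSource, CurrentContext, RequiredContext,
};

#[tokio::main]
async fn main() {
    let executor = ActionExecutor::new();

    // Register a global built-in action with a closure handler.
    let metadata = ActionMetadata {
        id: ActionId::builtin("demo", "ping"),
        label: "Ping".into(),
        description: None,
        icon: None,
        scope: ActionScope::Global,
        keyboard_shortcut: None,
        requires_selection: false,
        enabled_condition: None,
        group_id: None,
        order: 0,
        required_context: RequiredContext::default(),
    };
    executor
        .register(metadata, ActionSource::Builtin, handler_fn(|_ctx, _params| async move {
            Ok(ActionResult::with_message("pong"))
        }))
        .await
        .unwrap();

    // invoke() checks context availability before running the handler.
    let result = executor
        .invoke(
            &ActionId::builtin("demo", "ping"),
            CurrentContext::default(),
            ActionParams::empty(),
        )
        .await
        .unwrap();
    println!("{result:?}");
}
```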
@@ -1,208 +0,0 @@
//! Action group types and management.

use serde::{Deserialize, Serialize};
use ts_rs::TS;

use crate::{ActionId, ActionMetadata, ActionScope};

/// Unique identifier for an action group.
///
/// Format: `namespace:group-name`
/// - Built-in: `yaak:export`
/// - Plugin: `plugin.my-plugin:utilities`
#[derive(Clone, Debug, Hash, Eq, PartialEq, Serialize, Deserialize, TS)]
#[ts(export)]
pub struct ActionGroupId(pub String);

impl ActionGroupId {
    /// Create a namespaced group ID.
    pub fn new(namespace: &str, name: &str) -> Self {
        Self(format!("{}:{}", namespace, name))
    }

    /// Create ID for built-in groups.
    pub fn builtin(name: &str) -> Self {
        Self::new("yaak", name)
    }

    /// Create ID for plugin groups.
    pub fn plugin(plugin_ref_id: &str, name: &str) -> Self {
        Self::new(&format!("plugin.{}", plugin_ref_id), name)
    }

    /// Get the raw string value.
    pub fn as_str(&self) -> &str {
        &self.0
    }
}

impl std::fmt::Display for ActionGroupId {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        write!(f, "{}", self.0)
    }
}

/// Metadata about an action group.
#[derive(Clone, Debug, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(rename_all = "camelCase")]
pub struct ActionGroupMetadata {
    /// Unique identifier for this group.
    pub id: ActionGroupId,

    /// Display name for the group.
    pub name: String,

    /// Optional description of the group's purpose.
    #[serde(skip_serializing_if = "Option::is_none")]
    pub description: Option<String>,

    /// Icon to display for the group.
    #[serde(skip_serializing_if = "Option::is_none")]
    pub icon: Option<String>,

    /// Sort order for displaying groups (lower = earlier).
    #[serde(default)]
    pub order: i32,

    /// Optional scope restriction (if set, group only appears in this scope).
    #[serde(skip_serializing_if = "Option::is_none")]
    pub scope: Option<ActionScope>,
}

/// Where an action group was registered from.
#[derive(Clone, Debug, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(tag = "type", rename_all = "kebab-case")]
pub enum ActionGroupSource {
    /// Built into Yaak core.
    Builtin,
    /// Registered by a plugin.
    Plugin {
        /// Plugin reference ID.
        ref_id: String,
        /// Plugin name.
        name: String,
    },
    /// Registered at runtime.
    Dynamic {
        /// Source identifier.
        source_id: String,
    },
}

/// A registered action group with its actions.
#[derive(Clone, Debug)]
pub struct RegisteredActionGroup {
    /// Group metadata.
    pub metadata: ActionGroupMetadata,

    /// IDs of actions in this group (ordered by action's order field).
    pub action_ids: Vec<ActionId>,

    /// Where the group was registered from.
    pub source: ActionGroupSource,
}

/// A group with its actions for UI rendering.
#[derive(Clone, Debug, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(rename_all = "camelCase")]
pub struct ActionGroupWithActions {
    /// Group metadata.
    pub group: ActionGroupMetadata,

    /// Actions in this group.
    pub actions: Vec<ActionMetadata>,
}

/// Built-in action group definitions.
pub mod builtin {
    use super::*;

    /// Export group - export and copy actions.
    pub fn export() -> ActionGroupMetadata {
        ActionGroupMetadata {
            id: ActionGroupId::builtin("export"),
            name: "Export".into(),
            description: Some("Export and copy actions".into()),
            icon: Some("download".into()),
            order: 100,
            scope: None,
        }
    }

    /// Code generation group.
    pub fn code_generation() -> ActionGroupMetadata {
        ActionGroupMetadata {
            id: ActionGroupId::builtin("code-generation"),
            name: "Code Generation".into(),
            description: Some("Generate code snippets from requests".into()),
            icon: Some("code".into()),
            order: 200,
            scope: Some(ActionScope::HttpRequest),
        }
    }

    /// Send group - request sending actions.
    pub fn send() -> ActionGroupMetadata {
        ActionGroupMetadata {
            id: ActionGroupId::builtin("send"),
            name: "Send".into(),
            description: Some("Actions for sending requests".into()),
            icon: Some("play".into()),
            order: 50,
            scope: Some(ActionScope::HttpRequest),
        }
    }

    /// Import group.
    pub fn import() -> ActionGroupMetadata {
        ActionGroupMetadata {
            id: ActionGroupId::builtin("import"),
            name: "Import".into(),
            description: Some("Import data from files".into()),
            icon: Some("upload".into()),
            order: 150,
            scope: None,
        }
    }

    /// Workspace management group.
    pub fn workspace() -> ActionGroupMetadata {
        ActionGroupMetadata {
            id: ActionGroupId::builtin("workspace"),
            name: "Workspace".into(),
            description: Some("Workspace management actions".into()),
            icon: Some("folder".into()),
            order: 300,
            scope: Some(ActionScope::Workspace),
        }
    }

    /// Get all built-in group definitions.
    pub fn all() -> Vec<ActionGroupMetadata> {
        vec![send(), export(), import(), code_generation(), workspace()]
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_group_id_creation() {
        let id = ActionGroupId::builtin("export");
        assert_eq!(id.as_str(), "yaak:export");

        let plugin_id = ActionGroupId::plugin("my-plugin", "utilities");
        assert_eq!(plugin_id.as_str(), "plugin.my-plugin:utilities");
    }

    #[test]
    fn test_builtin_groups() {
        let groups = builtin::all();
        assert!(!groups.is_empty());
        assert!(groups.iter().any(|g| g.id == ActionGroupId::builtin("export")));
    }
}
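A sketch of a plugin-provided group registered alongside the built-ins above. The plugin name `my-plugin` and group name `utilities` are hypothetical; the API calls are the ones defined in this diff.

```rust
use yaak_actions::{
    ActionError, ActionExecutor, ActionGroupId, ActionGroupMetadata, ActionGroupSource,
};

async fn register_plugin_group(executor: &ActionExecutor) -> Result<(), ActionError> {
    let group = ActionGroupMetadata {
        id: ActionGroupId::plugin("my-plugin", "utilities"),
        name: "Utilities".into(),
        description: Some("Helpers registered by my-plugin".into()),
        icon: None,
        order: 500, // sorts after the built-ins (50-300)
        scope: None, // visible in every scope
    };
    let source = ActionGroupSource::Plugin {
        ref_id: "my-plugin".into(),
        name: "My Plugin".into(),
    };
    // Fails with GroupAlreadyExists if the ID is taken.
    executor.register_group(group, source).await?;
    Ok(())
}
```

Actions are then attached via `executor.add_to_group(&action_id, &group_id)`, which keeps the action's `group_id` and the group's `action_ids` in sync.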
@@ -1,103 +0,0 @@
//! Action handler types and execution.

use std::future::Future;
use std::pin::Pin;
use std::sync::Arc;

use crate::{ActionError, ActionParams, ActionResult, CurrentContext};

/// A boxed future for async action handlers.
pub type BoxFuture<'a, T> = Pin<Box<dyn Future<Output = T> + Send + 'a>>;

/// Function signature for action handlers.
pub type ActionHandlerFn = Arc<
    dyn Fn(CurrentContext, ActionParams) -> BoxFuture<'static, Result<ActionResult, ActionError>>
        + Send
        + Sync,
>;

/// Trait for types that can handle action invocations.
pub trait ActionHandler: Send + Sync {
    /// Execute the action with the given context and parameters.
    fn handle(
        &self,
        context: CurrentContext,
        params: ActionParams,
    ) -> BoxFuture<'static, Result<ActionResult, ActionError>>;
}

/// Wrapper to create an ActionHandler from a function.
pub struct FnHandler<F>(pub F);

impl<F, Fut> ActionHandler for FnHandler<F>
where
    F: Fn(CurrentContext, ActionParams) -> Fut + Send + Sync,
    Fut: Future<Output = Result<ActionResult, ActionError>> + Send + 'static,
{
    fn handle(
        &self,
        context: CurrentContext,
        params: ActionParams,
    ) -> BoxFuture<'static, Result<ActionResult, ActionError>> {
        Box::pin((self.0)(context, params))
    }
}

/// Create an action handler from an async function.
///
/// # Example
/// ```ignore
/// let handler = handler_fn(|ctx, params| async move {
///     Ok(ActionResult::ok())
/// });
/// ```
pub fn handler_fn<F, Fut>(f: F) -> FnHandler<F>
where
    F: Fn(CurrentContext, ActionParams) -> Fut + Send + Sync,
    Fut: Future<Output = Result<ActionResult, ActionError>> + Send + 'static,
{
    FnHandler(f)
}

#[cfg(test)]
mod tests {
    use super::*;

    #[tokio::test]
    async fn test_handler_fn() {
        let handler = handler_fn(|_ctx, _params| async move { Ok(ActionResult::ok()) });

        let result = handler
            .handle(CurrentContext::default(), ActionParams::empty())
            .await;

        assert!(result.is_ok());
    }

    #[tokio::test]
    async fn test_handler_with_params() {
        let handler = handler_fn(|_ctx, params| async move {
            let name: Option<String> = params.get("name");
            Ok(ActionResult::with_message(format!(
                "Hello, {}!",
                name.unwrap_or_else(|| "World".to_string())
            )))
        });

        let params = ActionParams::from_json(serde_json::json!({
            "name": "Yaak"
        }));

        let result = handler
            .handle(CurrentContext::default(), params)
            .await
            .unwrap();

        match result {
            ActionResult::Success { message, .. } => {
                assert_eq!(message, Some("Hello, Yaak!".to_string()));
            }
            _ => panic!("Expected Success result"),
        }
    }
}
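For handlers that carry state, `ActionHandler` can also be implemented by hand instead of going through the `handler_fn` closure wrapper above. A sketch (crate name `yaak_actions` assumed; the counter handler itself is hypothetical):

```rust
use std::sync::atomic::{AtomicU64, Ordering};
use std::sync::Arc;

use yaak_actions::{
    ActionError, ActionHandler, ActionParams, ActionResult, BoxFuture, CurrentContext,
};

struct CounterHandler {
    hits: Arc<AtomicU64>,
}

impl ActionHandler for CounterHandler {
    fn handle(
        &self,
        _context: CurrentContext,
        _params: ActionParams,
    ) -> BoxFuture<'static, Result<ActionResult, ActionError>> {
        // Clone the Arc so the boxed future can be 'static, as the trait requires.
        let hits = self.hits.clone();
        Box::pin(async move {
            let n = hits.fetch_add(1, Ordering::Relaxed) + 1;
            Ok(ActionResult::with_message(format!("invoked {n} times")))
        })
    }
}
```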
@@ -1,18 +0,0 @@
//! Centralized action system for Yaak.
//!
//! This crate provides a unified hub for registering and invoking actions
//! across all entry points: plugins, Tauri desktop app, CLI, deep links, and MCP server.

mod context;
mod error;
mod executor;
mod groups;
mod handler;
mod types;

pub use context::*;
pub use error::*;
pub use executor::*;
pub use groups::*;
pub use handler::*;
pub use types::*;
@@ -1,273 +0,0 @@
//! Core types for the action system.

use serde::{Deserialize, Serialize};
use ts_rs::TS;

use crate::{ActionGroupId, RequiredContext};

/// Unique identifier for an action.
///
/// Format: `namespace:category:name`
/// - Built-in: `yaak:http-request:send`
/// - Plugin: `plugin.copy-curl:http-request:copy`
#[derive(Clone, Debug, Hash, Eq, PartialEq, Serialize, Deserialize, TS)]
#[ts(export)]
pub struct ActionId(pub String);

impl ActionId {
    /// Create a namespaced action ID.
    pub fn new(namespace: &str, category: &str, name: &str) -> Self {
        Self(format!("{}:{}:{}", namespace, category, name))
    }

    /// Create ID for built-in actions.
    pub fn builtin(category: &str, name: &str) -> Self {
        Self::new("yaak", category, name)
    }

    /// Create ID for plugin actions.
    pub fn plugin(plugin_ref_id: &str, category: &str, name: &str) -> Self {
        Self::new(&format!("plugin.{}", plugin_ref_id), category, name)
    }

    /// Get the raw string value.
    pub fn as_str(&self) -> &str {
        &self.0
    }
}

impl std::fmt::Display for ActionId {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        write!(f, "{}", self.0)
    }
}

/// The scope in which an action can be invoked.
#[derive(Clone, Debug, Hash, Eq, PartialEq, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(rename_all = "kebab-case")]
pub enum ActionScope {
    /// Global actions available everywhere.
    Global,
    /// Actions on HTTP requests.
    HttpRequest,
    /// Actions on WebSocket requests.
    WebsocketRequest,
    /// Actions on gRPC requests.
    GrpcRequest,
    /// Actions on workspaces.
    Workspace,
    /// Actions on folders.
    Folder,
    /// Actions on environments.
    Environment,
    /// Actions on cookie jars.
    CookieJar,
}

/// Metadata about an action for discovery.
#[derive(Clone, Debug, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(rename_all = "camelCase")]
pub struct ActionMetadata {
    /// Unique identifier for this action.
    pub id: ActionId,

    /// Display label for the action.
    pub label: String,

    /// Optional description of what the action does.
    #[serde(skip_serializing_if = "Option::is_none")]
    pub description: Option<String>,

    /// Icon name to display.
    #[serde(skip_serializing_if = "Option::is_none")]
    pub icon: Option<String>,

    /// The scope this action applies to.
    pub scope: ActionScope,

    /// Keyboard shortcut (e.g., "Cmd+Enter").
    #[serde(skip_serializing_if = "Option::is_none")]
    pub keyboard_shortcut: Option<String>,

    /// Whether the action requires a selection/target.
    #[serde(default)]
    pub requires_selection: bool,

    /// Optional condition expression for when action is enabled.
    #[serde(skip_serializing_if = "Option::is_none")]
    pub enabled_condition: Option<String>,

    /// Optional group this action belongs to.
    #[serde(skip_serializing_if = "Option::is_none")]
    pub group_id: Option<ActionGroupId>,

    /// Sort order within a group (lower = earlier).
    #[serde(default)]
    pub order: i32,

    /// Context requirements for this action.
    #[serde(default)]
    pub required_context: RequiredContext,
}

/// Where an action was registered from.
#[derive(Clone, Debug, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(tag = "type", rename_all = "kebab-case")]
pub enum ActionSource {
    /// Built into Yaak core.
    Builtin,
    /// Registered by a plugin.
    Plugin {
        /// Plugin reference ID.
        ref_id: String,
        /// Plugin name.
        name: String,
    },
    /// Registered at runtime (e.g., by MCP tools).
    Dynamic {
        /// Source identifier.
        source_id: String,
    },
}

/// Parameters passed to action handlers.
#[derive(Clone, Debug, Default, Serialize, Deserialize, TS)]
#[ts(export)]
pub struct ActionParams {
    /// Arbitrary JSON parameters.
    #[serde(default)]
    #[ts(type = "unknown")]
    pub data: serde_json::Value,
}

impl ActionParams {
    /// Create empty params.
    pub fn empty() -> Self {
        Self {
            data: serde_json::Value::Null,
        }
    }

    /// Create params from a JSON value.
    pub fn from_json(data: serde_json::Value) -> Self {
        Self { data }
    }

    /// Get a typed value from the params.
    pub fn get<T: serde::de::DeserializeOwned>(&self, key: &str) -> Option<T> {
        self.data
            .get(key)
            .and_then(|v| serde_json::from_value(v.clone()).ok())
    }
}

/// Result of action execution.
#[derive(Clone, Debug, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(tag = "type", rename_all = "kebab-case")]
pub enum ActionResult {
    /// Action completed successfully.
    Success {
        /// Optional data to return.
        #[serde(skip_serializing_if = "Option::is_none")]
        #[ts(type = "unknown")]
        data: Option<serde_json::Value>,
        /// Optional message to display.
        #[serde(skip_serializing_if = "Option::is_none")]
        message: Option<String>,
    },

    /// Action requires user input to continue.
    RequiresInput {
        /// Prompt to show user.
        prompt: InputPrompt,
        /// Continuation token.
        continuation_id: String,
    },

    /// Action was cancelled by the user.
    Cancelled,
}

impl ActionResult {
    /// Create a success result with no data.
    pub fn ok() -> Self {
        Self::Success {
            data: None,
            message: None,
        }
    }

    /// Create a success result with a message.
    pub fn with_message(message: impl Into<String>) -> Self {
        Self::Success {
            data: None,
            message: Some(message.into()),
        }
    }

    /// Create a success result with data.
    pub fn with_data(data: serde_json::Value) -> Self {
        Self::Success {
            data: Some(data),
            message: None,
        }
    }
}

/// A prompt for user input.
#[derive(Clone, Debug, Serialize, Deserialize, TS)]
#[ts(export)]
#[serde(tag = "type", rename_all = "kebab-case")]
pub enum InputPrompt {
    /// Text input prompt.
    Text {
        label: String,
        placeholder: Option<String>,
        default_value: Option<String>,
    },
    /// Selection prompt.
    Select {
        label: String,
        options: Vec<SelectOption>,
    },
    /// Confirmation prompt.
    Confirm { label: String },
}

/// An option in a select prompt.
#[derive(Clone, Debug, Serialize, Deserialize, TS)]
#[ts(export)]
pub struct SelectOption {
    pub label: String,
    pub value: String,
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_action_id_creation() {
        let id = ActionId::builtin("http-request", "send");
        assert_eq!(id.as_str(), "yaak:http-request:send");

        let plugin_id = ActionId::plugin("copy-curl", "http-request", "copy");
        assert_eq!(plugin_id.as_str(), "plugin.copy-curl:http-request:copy");
    }

    #[test]
    fn test_action_params() {
        let params = ActionParams::from_json(serde_json::json!({
            "name": "test",
            "count": 42
        }));

        assert_eq!(params.get::<String>("name"), Some("test".to_string()));
        assert_eq!(params.get::<i32>("count"), Some(42));
        assert_eq!(params.get::<String>("missing"), None);
    }
}
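A sketch of how a front end might branch on `ActionResult`, including the `RequiresInput` continuation case (crate name `yaak_actions` assumed):

```rust
use yaak_actions::{ActionResult, InputPrompt};

fn handle_result(result: ActionResult) {
    match result {
        ActionResult::Success { message, data } => {
            if let Some(msg) = message {
                println!("{msg}");
            }
            if let Some(data) = data {
                println!("data: {data}");
            }
        }
        ActionResult::RequiresInput { prompt, continuation_id } => {
            // Show the prompt, then re-invoke with the answer plus this token.
            match prompt {
                InputPrompt::Text { label, .. } => {
                    println!("text prompt: {label} (continuation {continuation_id})")
                }
                InputPrompt::Select { label, options } => {
                    println!("select prompt: {label}, {} options", options.len())
                }
                InputPrompt::Confirm { label } => println!("confirm: {label}"),
            }
        }
        ActionResult::Cancelled => println!("cancelled"),
    }
}
```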
@@ -1,9 +0,0 @@
[package]
name = "yaak-common"
version = "0.1.0"
edition = "2024"
publish = false

[dependencies]
serde_json = { workspace = true }
tokio = { workspace = true, features = ["process"] }
@@ -1,16 +0,0 @@
use std::ffi::OsStr;

#[cfg(target_os = "windows")]
const CREATE_NO_WINDOW: u32 = 0x0800_0000;

/// Creates a new `tokio::process::Command` that won't spawn a console window on Windows.
pub fn new_xplatform_command<S: AsRef<OsStr>>(program: S) -> tokio::process::Command {
    #[allow(unused_mut)]
    let mut cmd = tokio::process::Command::new(program);
    #[cfg(target_os = "windows")]
    {
        use std::os::windows::process::CommandExt;
        cmd.creation_flags(CREATE_NO_WINDOW);
    }
    cmd
}
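Callers use `new_xplatform_command` exactly like `tokio::process::Command`; the only difference is the hidden-console flag on Windows. A minimal sketch, chaining the builder the same way yaak-git does below:

```rust
use yaak_common::command::new_xplatform_command;

#[tokio::main]
async fn main() -> std::io::Result<()> {
    // Run `git --version` without flashing a console window on Windows.
    let out = new_xplatform_command("git").arg("--version").output().await?;
    println!("{}", String::from_utf8_lossy(&out.stdout));
    Ok(())
}
```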
@@ -1,3 +0,0 @@
pub mod command;
pub mod platform;
pub mod serde;
@@ -1,9 +0,0 @@
[package]
name = "yaak-core"
version = "0.0.0"
edition = "2024"
authors = ["Gregory Schier"]
publish = false

[dependencies]
thiserror = { workspace = true }
@@ -1,56 +0,0 @@
use std::path::PathBuf;

/// Context for a workspace operation.
///
/// In Tauri, this is extracted from the WebviewWindow URL.
/// In CLI, this is constructed from command arguments or config.
#[derive(Debug, Clone, Default)]
pub struct WorkspaceContext {
    pub workspace_id: Option<String>,
    pub environment_id: Option<String>,
    pub cookie_jar_id: Option<String>,
    pub request_id: Option<String>,
}

impl WorkspaceContext {
    pub fn new() -> Self {
        Self::default()
    }

    pub fn with_workspace(mut self, workspace_id: impl Into<String>) -> Self {
        self.workspace_id = Some(workspace_id.into());
        self
    }

    pub fn with_environment(mut self, environment_id: impl Into<String>) -> Self {
        self.environment_id = Some(environment_id.into());
        self
    }

    pub fn with_cookie_jar(mut self, cookie_jar_id: impl Into<String>) -> Self {
        self.cookie_jar_id = Some(cookie_jar_id.into());
        self
    }

    pub fn with_request(mut self, request_id: impl Into<String>) -> Self {
        self.request_id = Some(request_id.into());
        self
    }
}

/// Application context trait for accessing app-level resources.
///
/// This abstracts over Tauri's `AppHandle` for path resolution and app identity.
/// Implemented by Tauri's AppHandle and by CLI's own context struct.
pub trait AppContext: Send + Sync + Clone {
    /// Returns the path to the application data directory.
    /// This is where the database and other persistent data are stored.
    fn app_data_dir(&self) -> PathBuf;

    /// Returns the application identifier (e.g., "app.yaak.desktop").
    /// Used for keyring access and other platform-specific features.
    fn app_identifier(&self) -> &str;

    /// Returns true if running in development mode.
    fn is_dev(&self) -> bool;
}
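A sketch of a CLI-side `AppContext` implementation; the struct name and fields are illustrative, not taken from yaak-cli:

```rust
use std::path::PathBuf;
use yaak_core::AppContext;

#[derive(Clone)]
struct CliContext {
    data_dir: PathBuf,
    dev: bool,
}

impl AppContext for CliContext {
    fn app_data_dir(&self) -> PathBuf {
        self.data_dir.clone()
    }

    fn app_identifier(&self) -> &str {
        // Reusing the identifier from the trait docs above, so keyring
        // entries could be shared with the desktop app (an assumption).
        "app.yaak.desktop"
    }

    fn is_dev(&self) -> bool {
        self.dev
    }
}
```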
@@ -1,15 +0,0 @@
use thiserror::Error;

pub type Result<T> = std::result::Result<T, Error>;

#[derive(Error, Debug)]
pub enum Error {
    #[error("Missing required context: {0}")]
    MissingContext(String),

    #[error("Configuration error: {0}")]
    Config(String),

    #[error("IO error: {0}")]
    Io(#[from] std::io::Error),
}
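The `Result` alias composes with `?`, and the `#[from]` attribute converts `std::io::Error` automatically; a contrived sketch (the file read is purely illustrative):

```rust
use yaak_core::{Error, Result};

fn require_workspace(id: Option<&str>) -> Result<String> {
    let id = id.ok_or_else(|| Error::MissingContext("workspace".into()))?;
    // The `?` below maps io::Error into Error::Io via #[from].
    Ok(std::fs::read_to_string(format!("{id}.json"))?)
}
```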
@@ -1,10 +0,0 @@
//! Core abstractions for Yaak that work without Tauri.
//!
//! This crate provides foundational types and traits that allow Yaak's
//! business logic to run in both Tauri (desktop app) and CLI contexts.

mod context;
mod error;

pub use context::{AppContext, WorkspaceContext};
pub use error::{Error, Result};
@@ -1,17 +0,0 @@
import { invoke } from '@tauri-apps/api/core';

export function enableEncryption(workspaceId: string) {
  return invoke<void>('cmd_enable_encryption', { workspaceId });
}

export function revealWorkspaceKey(workspaceId: string) {
  return invoke<string>('cmd_reveal_workspace_key', { workspaceId });
}

export function setWorkspaceKey(args: { workspaceId: string; key: string }) {
  return invoke<void>('cmd_set_workspace_key', args);
}

export function disableEncryption(workspaceId: string) {
  return invoke<void>('cmd_disable_encryption', { workspaceId });
}
@@ -1,7 +0,0 @@
extern crate core;

pub mod encryption;
pub mod error;
pub mod manager;
mod master_key;
mod workspace_key;
@@ -1,30 +0,0 @@
use crate::error::Error::GitNotFound;
use crate::error::Result;
use std::path::Path;
use std::process::Stdio;
use tokio::process::Command;
use yaak_common::command::new_xplatform_command;

/// Create a git command that runs in the specified directory
pub(crate) async fn new_binary_command(dir: &Path) -> Result<Command> {
    let mut cmd = new_binary_command_global().await?;
    cmd.arg("-C").arg(dir);
    Ok(cmd)
}

/// Create a git command without a specific directory (for global operations)
pub(crate) async fn new_binary_command_global() -> Result<Command> {
    // 1. Probe that `git` exists and is runnable
    let mut probe = new_xplatform_command("git");
    probe.arg("--version").stdin(Stdio::null()).stdout(Stdio::null()).stderr(Stdio::null());

    let status = probe.status().await.map_err(|_| GitNotFound)?;

    if !status.success() {
        return Err(GitNotFound);
    }

    // 2. Build the reusable git command
    let cmd = new_xplatform_command("git");
    Ok(cmd)
}
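Since `new_binary_command` is `pub(crate)`, its callers live inside yaak-git, as in `branch.rs` below. A hypothetical crate-internal helper in the same style (error mapping mirrors the existing functions):

```rust
use std::path::Path;

use crate::binary::new_binary_command;
use crate::error::Error::GenericError;
use crate::error::Result;

/// Hypothetical helper: machine-readable working-tree status.
pub(crate) async fn git_status_porcelain(dir: &Path) -> Result<String> {
    let out = new_binary_command(dir)
        .await?
        .args(["status", "--porcelain"])
        .output()
        .await
        .map_err(|e| GenericError(format!("failed to run git status: {e}")))?;

    Ok(String::from_utf8_lossy(&out.stdout).to_string())
}
```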
@@ -1,153 +0,0 @@
use serde::{Deserialize, Serialize};
use ts_rs::TS;

use crate::binary::new_binary_command;
use crate::error::Error::GenericError;
use crate::error::Result;
use std::path::Path;

#[derive(Debug, Clone, PartialEq, Serialize, Deserialize, TS)]
#[serde(rename_all = "snake_case", tag = "type")]
#[ts(export, export_to = "gen_git.ts")]
pub enum BranchDeleteResult {
    Success { message: String },
    NotFullyMerged,
}

pub async fn git_checkout_branch(dir: &Path, branch_name: &str, force: bool) -> Result<String> {
    let branch_name = branch_name.trim_start_matches("origin/");

    let mut args = vec!["checkout"];
    if force {
        args.push("--force");
    }
    args.push(branch_name);

    let out = new_binary_command(dir)
        .await?
        .args(&args)
        .output()
        .await
        .map_err(|e| GenericError(format!("failed to run git checkout: {e}")))?;

    let stdout = String::from_utf8_lossy(&out.stdout);
    let stderr = String::from_utf8_lossy(&out.stderr);
    let combined = format!("{}{}", stdout, stderr);

    if !out.status.success() {
        return Err(GenericError(format!("Failed to checkout: {}", combined.trim())));
    }

    Ok(branch_name.to_string())
}

pub async fn git_create_branch(dir: &Path, name: &str, base: Option<&str>) -> Result<()> {
    let mut cmd = new_binary_command(dir).await?;
    cmd.arg("branch").arg(name);
    if let Some(base_branch) = base {
        cmd.arg(base_branch);
    }

    let out =
        cmd.output().await.map_err(|e| GenericError(format!("failed to run git branch: {e}")))?;

    let stdout = String::from_utf8_lossy(&out.stdout);
    let stderr = String::from_utf8_lossy(&out.stderr);
    let combined = format!("{}{}", stdout, stderr);

    if !out.status.success() {
        return Err(GenericError(format!("Failed to create branch: {}", combined.trim())));
    }

    Ok(())
}

pub async fn git_delete_branch(dir: &Path, name: &str, force: bool) -> Result<BranchDeleteResult> {
    let mut cmd = new_binary_command(dir).await?;

    let out =
        if force { cmd.args(["branch", "-D", name]) } else { cmd.args(["branch", "-d", name]) }
            .output()
            .await
            .map_err(|e| GenericError(format!("failed to run git branch -d: {e}")))?;

    let stdout = String::from_utf8_lossy(&out.stdout);
    let stderr = String::from_utf8_lossy(&out.stderr);
    let combined = format!("{}{}", stdout, stderr);

    if !out.status.success() && stderr.to_lowercase().contains("not fully merged") {
        return Ok(BranchDeleteResult::NotFullyMerged);
    }

    if !out.status.success() {
        return Err(GenericError(format!("Failed to delete branch: {}", combined.trim())));
    }

    Ok(BranchDeleteResult::Success { message: combined })
}

pub async fn git_merge_branch(dir: &Path, name: &str) -> Result<()> {
    let out = new_binary_command(dir)
        .await?
        .args(["merge", name])
        .output()
        .await
        .map_err(|e| GenericError(format!("failed to run git merge: {e}")))?;

    let stdout = String::from_utf8_lossy(&out.stdout);
    let stderr = String::from_utf8_lossy(&out.stderr);
    let combined = format!("{}{}", stdout, stderr);

    if !out.status.success() {
        // Check for merge conflicts
        if combined.to_lowercase().contains("conflict") {
            return Err(GenericError(
                "Merge conflicts detected. Please resolve them manually.".to_string(),
            ));
        }
        return Err(GenericError(format!("Failed to merge: {}", combined.trim())));
    }

    Ok(())
}

pub async fn git_delete_remote_branch(dir: &Path, name: &str) -> Result<()> {
    // Remote branch names come in as "origin/branch-name", extract the branch name
    let branch_name = name.trim_start_matches("origin/");

    let out = new_binary_command(dir)
        .await?
        .args(["push", "origin", "--delete", branch_name])
        .output()
        .await
        .map_err(|e| GenericError(format!("failed to run git push --delete: {e}")))?;

    let stdout = String::from_utf8_lossy(&out.stdout);
    let stderr = String::from_utf8_lossy(&out.stderr);
    let combined = format!("{}{}", stdout, stderr);

    if !out.status.success() {
        return Err(GenericError(format!("Failed to delete remote branch: {}", combined.trim())));
    }

    Ok(())
}

pub async fn git_rename_branch(dir: &Path, old_name: &str, new_name: &str) -> Result<()> {
    let out = new_binary_command(dir)
        .await?
        .args(["branch", "-m", old_name, new_name])
        .output()
        .await
        .map_err(|e| GenericError(format!("failed to run git branch -m: {e}")))?;

    let stdout = String::from_utf8_lossy(&out.stdout);
    let stderr = String::from_utf8_lossy(&out.stderr);
    let combined = format!("{}{}", stdout, stderr);

    if !out.status.success() {
        return Err(GenericError(format!("Failed to rename branch: {}", combined.trim())));
    }

    Ok(())
}
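A caller-side sketch (hypothetical, not from this diff) of handling `BranchDeleteResult`: try a safe delete first, then fall back to force when git reports the branch as not fully merged. `confirm_force_delete` stands in for a real UI/CLI prompt.

```
use std::path::Path;

async fn delete_branch_interactively(dir: &Path, name: &str) -> yaak_git::error::Result<()> {
    match yaak_git::git_delete_branch(dir, name, false).await? {
        yaak_git::BranchDeleteResult::Success { message } => println!("{message}"),
        yaak_git::BranchDeleteResult::NotFullyMerged => {
            // The safe `-d` was refused; ask before escalating to `-D`.
            if confirm_force_delete(name) {
                yaak_git::git_delete_branch(dir, name, true).await?;
            }
        }
    }
    Ok(())
}

// Stand-in for a real prompt; always forces in this sketch.
fn confirm_force_delete(name: &str) -> bool {
    eprintln!("Branch '{name}' is not fully merged; forcing delete in this sketch");
    true
}
```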
@@ -1,53 +0,0 @@
use crate::binary::new_binary_command;
use crate::error::Error::GenericError;
use crate::error::Result;
use log::info;
use serde::{Deserialize, Serialize};
use std::fs;
use std::path::Path;
use ts_rs::TS;

#[derive(Debug, Clone, PartialEq, Serialize, Deserialize, TS)]
#[serde(rename_all = "snake_case", tag = "type")]
#[ts(export, export_to = "gen_git.ts")]
pub enum CloneResult {
    Success,
    Cancelled,
    NeedsCredentials { url: String, error: Option<String> },
}

pub async fn git_clone(url: &str, dir: &Path) -> Result<CloneResult> {
    let parent = dir.parent().ok_or_else(|| GenericError("Invalid clone directory".to_string()))?;
    fs::create_dir_all(parent)
        .map_err(|e| GenericError(format!("Failed to create directory: {e}")))?;
    let mut cmd = new_binary_command(parent).await?;
    cmd.args(["clone", url]).arg(dir).env("GIT_TERMINAL_PROMPT", "0");

    let out =
        cmd.output().await.map_err(|e| GenericError(format!("failed to run git clone: {e}")))?;

    let stdout = String::from_utf8_lossy(&out.stdout);
    let stderr = String::from_utf8_lossy(&out.stderr);
    let combined = format!("{}{}", stdout, stderr);
    let combined_lower = combined.to_lowercase();

    info!("Cloned status={}: {combined}", out.status);

    if !out.status.success() {
        // Check for credentials error
        if combined_lower.contains("could not read") {
            return Ok(CloneResult::NeedsCredentials { url: url.to_string(), error: None });
        }
        if combined_lower.contains("unable to access")
            || combined_lower.contains("authentication failed")
        {
            return Ok(CloneResult::NeedsCredentials {
                url: url.to_string(),
                error: Some(combined.to_string()),
            });
        }
        return Err(GenericError(format!("Failed to clone: {}", combined.trim())));
    }

    Ok(CloneResult::Success)
}
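A hypothetical caller-side sketch tying `git_clone` to `git_add_credential`: if the clone needs credentials, approve them with git's credential helper and retry once. `prompt_for_credentials` is an assumed helper, not part of the crate.

```
use std::path::Path;

async fn clone_with_credentials(url: &str, dir: &Path) -> yaak_git::error::Result<()> {
    match yaak_git::git_clone(url, dir).await? {
        yaak_git::CloneResult::NeedsCredentials { url, .. } => {
            let (username, password) = prompt_for_credentials(&url);
            // Store the credentials via `git credential approve`, then retry.
            yaak_git::git_add_credential(&url, &username, &password).await?;
            yaak_git::git_clone(&url, dir).await?;
        }
        _ => {}
    }
    Ok(())
}

// Stand-in for a real prompt (UI dialog or terminal input).
fn prompt_for_credentials(_url: &str) -> (String, String) {
    ("user".to_string(), "token".to_string())
}
```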
@@ -1,44 +0,0 @@
use crate::binary::new_binary_command_global;
use crate::error::Error::GenericError;
use crate::error::Result;
use std::process::Stdio;
use tokio::io::AsyncWriteExt;
use url::Url;

pub async fn git_add_credential(remote_url: &str, username: &str, password: &str) -> Result<()> {
    let url = Url::parse(remote_url)
        .map_err(|e| GenericError(format!("Failed to parse remote url {remote_url}: {e:?}")))?;
    let protocol = url.scheme();
    let host = url
        .host_str()
        .ok_or_else(|| GenericError(format!("Remote url {remote_url} has no host")))?;
    let path = url.path();

    let mut child = new_binary_command_global()
        .await?
        .args(["credential", "approve"])
        .stdin(Stdio::piped())
        .stdout(Stdio::null())
        .spawn()?;

    {
        // stdin is configured as piped above, so it is always Some here
        let stdin = child.stdin.as_mut().unwrap();
        stdin.write_all(format!("protocol={}\n", protocol).as_bytes()).await?;
        stdin.write_all(format!("host={}\n", host).as_bytes()).await?;
        if !path.is_empty() {
            stdin
                .write_all(format!("path={}\n", path.trim_start_matches('/')).as_bytes())
                .await?;
        }
        stdin.write_all(format!("username={}\n", username).as_bytes()).await?;
        stdin.write_all(format!("password={}\n", password).as_bytes()).await?;
        stdin.write_all(b"\n").await?; // blank line terminator
    }

    let status = child.wait().await?;
    if !status.success() {
        return Err(GenericError("Failed to approve git credential".to_string()));
    }

    Ok(())
}
@@ -1,36 +0,0 @@
mod add;
mod binary;
mod branch;
mod clone;
mod commit;
mod credential;
pub mod error;
mod fetch;
mod init;
mod log;

mod pull;
mod push;
mod remotes;
mod repository;
mod status;
mod unstage;
mod util;

// Re-export all git functions for external use
pub use add::git_add;
pub use branch::{
    BranchDeleteResult, git_checkout_branch, git_create_branch, git_delete_branch,
    git_delete_remote_branch, git_merge_branch, git_rename_branch,
};
pub use clone::{CloneResult, git_clone};
pub use commit::git_commit;
pub use credential::git_add_credential;
pub use fetch::git_fetch_all;
pub use init::git_init;
pub use log::{GitCommit, git_log};
pub use pull::{PullResult, git_pull};
pub use push::{PushResult, git_push};
pub use remotes::{GitRemote, git_add_remote, git_remotes, git_rm_remote};
pub use status::{GitStatusSummary, git_status};
pub use unstage::git_unstage;
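A minimal sketch of driving the re-exported API from a plain tokio binary (no Tauri involved), which is the point of the decoupling. It assumes an existing git repo at the placeholder path; only functions whose signatures appear in this diff are used.

```
use std::path::Path;

#[tokio::main]
async fn main() -> yaak_git::error::Result<()> {
    let dir = Path::new("/tmp/example-repo"); // placeholder path

    // Create a branch off the current HEAD, switch to it, then push.
    yaak_git::git_create_branch(dir, "feature/detach", None).await?;
    yaak_git::git_checkout_branch(dir, "feature/detach", false).await?;
    match yaak_git::git_push(dir).await? {
        yaak_git::PushResult::UpToDate => println!("nothing to push"),
        other => println!("{other:?}"),
    }
    Ok(())
}
```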
@@ -1,89 +0,0 @@
use crate::binary::new_binary_command;
use crate::error::Error::GenericError;
use crate::error::Result;
use crate::repository::open_repo;
use crate::util::{get_current_branch_name, get_default_remote_for_push_in_repo};
use log::info;
use serde::{Deserialize, Serialize};
use std::path::Path;
use ts_rs::TS;

#[derive(Debug, Clone, PartialEq, Serialize, Deserialize, TS)]
#[serde(rename_all = "snake_case", tag = "type")]
#[ts(export, export_to = "gen_git.ts")]
pub enum PushResult {
    Success { message: String },
    UpToDate,
    NeedsCredentials { url: String, error: Option<String> },
}

pub async fn git_push(dir: &Path) -> Result<PushResult> {
    // Extract all git2 data before any await points (git2 types are not Send)
    let (branch_name, remote_name, remote_url) = {
        let repo = open_repo(dir)?;
        let branch_name = get_current_branch_name(&repo)?;
        let remote = get_default_remote_for_push_in_repo(&repo)?;
        let remote_name =
            remote.name().ok_or(GenericError("Failed to get remote name".to_string()))?.to_string();
        let remote_url =
            remote.url().ok_or(GenericError("Failed to get remote url".to_string()))?.to_string();
        (branch_name, remote_name, remote_url)
    };

    let out = new_binary_command(dir)
        .await?
        .args(["push", &remote_name, &branch_name])
        .env("GIT_TERMINAL_PROMPT", "0")
        .output()
        .await
        .map_err(|e| GenericError(format!("failed to run git push: {e}")))?;

    let stdout = String::from_utf8_lossy(&out.stdout);
    let stderr = String::from_utf8_lossy(&out.stderr);
    let combined = stdout + stderr;
    let combined_lower = combined.to_lowercase();

    info!("Pushed to repo status={} {combined}", out.status);

    // Helper to check if this is a credentials error
    let is_credentials_error = || {
        combined_lower.contains("could not read")
            || combined_lower.contains("unable to access")
            || combined_lower.contains("authentication failed")
    };

    // Check for explicit rejection indicators first (e.g., protected branch rejections)
    // These can occur even if some git servers don't properly set exit codes
    if combined_lower.contains("rejected") || combined_lower.contains("failed to push") {
        if is_credentials_error() {
            return Ok(PushResult::NeedsCredentials {
                url: remote_url.to_string(),
                error: Some(combined.to_string()),
            });
        }
        return Err(GenericError(format!("Failed to push: {combined}")));
    }

    // Check exit status for any other failures
    if !out.status.success() {
        if combined_lower.contains("could not read") {
            return Ok(PushResult::NeedsCredentials { url: remote_url.to_string(), error: None });
        }
        if combined_lower.contains("unable to access")
            || combined_lower.contains("authentication failed")
        {
            return Ok(PushResult::NeedsCredentials {
                url: remote_url.to_string(),
                error: Some(combined.to_string()),
            });
        }
        return Err(GenericError(format!("Failed to push: {combined}")));
    }

    // Success cases (exit code 0 and no rejection indicators)
    if combined_lower.contains("up-to-date") {
        return Ok(PushResult::UpToDate);
    }

    Ok(PushResult::Success { message: format!("Pushed to {}/{}", remote_name, branch_name) })
}
@@ -1,484 +0,0 @@
//! Custom cookie handling for HTTP requests
//!
//! This module provides cookie storage and matching functionality that was previously
//! delegated to reqwest. It implements RFC 6265 cookie domain and path matching.

use log::debug;
use std::sync::{Arc, Mutex};
use std::time::{Duration, SystemTime, UNIX_EPOCH};
use url::Url;
use yaak_models::models::{Cookie, CookieDomain, CookieExpires};

/// A thread-safe cookie store that can be shared across requests
#[derive(Debug, Clone)]
pub struct CookieStore {
    cookies: Arc<Mutex<Vec<Cookie>>>,
}

impl Default for CookieStore {
    fn default() -> Self {
        Self::new()
    }
}

impl CookieStore {
    /// Create a new empty cookie store
    pub fn new() -> Self {
        Self { cookies: Arc::new(Mutex::new(Vec::new())) }
    }

    /// Create a cookie store from existing cookies
    pub fn from_cookies(cookies: Vec<Cookie>) -> Self {
        Self { cookies: Arc::new(Mutex::new(cookies)) }
    }

    /// Get all cookies (for persistence)
    pub fn get_all_cookies(&self) -> Vec<Cookie> {
        self.cookies.lock().unwrap().clone()
    }

    /// Get the Cookie header value for the given URL
    pub fn get_cookie_header(&self, url: &Url) -> Option<String> {
        let cookies = self.cookies.lock().unwrap();
        let now = SystemTime::now();

        let matching_cookies: Vec<_> = cookies
            .iter()
            .filter(|cookie| self.cookie_matches(cookie, url, &now))
            .filter_map(|cookie| {
                // Parse the raw cookie to get name=value
                parse_cookie_name_value(&cookie.raw_cookie)
            })
            .collect();

        if matching_cookies.is_empty() {
            None
        } else {
            Some(
                matching_cookies
                    .into_iter()
                    .map(|(name, value)| format!("{}={}", name, value))
                    .collect::<Vec<_>>()
                    .join("; "),
            )
        }
    }

    /// Parse Set-Cookie headers and add cookies to the store
    pub fn store_cookies_from_response(&self, url: &Url, set_cookie_headers: &[String]) {
        let mut cookies = self.cookies.lock().unwrap();

        for header_value in set_cookie_headers {
            if let Some(cookie) = parse_set_cookie(header_value, url) {
                // Remove any existing cookie with the same name and domain
                cookies.retain(|existing| !cookies_match(existing, &cookie));
                debug!(
                    "Storing cookie: {} for domain {:?}",
                    parse_cookie_name_value(&cookie.raw_cookie)
                        .map(|(n, _)| n)
                        .unwrap_or_else(|| "unknown".to_string()),
                    cookie.domain
                );
                cookies.push(cookie);
            }
        }
    }

    /// Check if a cookie matches the given URL
    fn cookie_matches(&self, cookie: &Cookie, url: &Url, now: &SystemTime) -> bool {
        // Check expiration
        if let CookieExpires::AtUtc(expiry_str) = &cookie.expires {
            if let Ok(expiry) = parse_cookie_date(expiry_str) {
                if expiry < *now {
                    return false;
                }
            }
        }

        // Check domain
        let url_host = match url.host_str() {
            Some(h) => h.to_lowercase(),
            None => return false,
        };

        let domain_matches = match &cookie.domain {
            CookieDomain::HostOnly(domain) => url_host == domain.to_lowercase(),
            CookieDomain::Suffix(domain) => {
                let domain_lower = domain.to_lowercase();
                url_host == domain_lower || url_host.ends_with(&format!(".{}", domain_lower))
            }
            // NotPresent and Empty should never occur in practice since we always set domain
            // when parsing Set-Cookie headers. Treat as non-matching to be safe.
            CookieDomain::NotPresent | CookieDomain::Empty => false,
        };

        if !domain_matches {
            return false;
        }

        // Check path
        let (cookie_path, _) = &cookie.path;
        let url_path = url.path();

        path_matches(url_path, cookie_path)
    }
}

/// Parse name=value from a cookie string (raw_cookie format)
fn parse_cookie_name_value(raw_cookie: &str) -> Option<(String, String)> {
    // The raw_cookie typically looks like "name=value" or "name=value; attr1; attr2=..."
    let first_part = raw_cookie.split(';').next()?;
    let mut parts = first_part.splitn(2, '=');
    let name = parts.next()?.trim().to_string();
    let value = parts.next().unwrap_or("").trim().to_string();

    if name.is_empty() { None } else { Some((name, value)) }
}

/// Parse a Set-Cookie header into a Cookie
fn parse_set_cookie(header_value: &str, request_url: &Url) -> Option<Cookie> {
    let parsed = cookie::Cookie::parse(header_value).ok()?;

    let raw_cookie = format!("{}={}", parsed.name(), parsed.value());

    // Determine domain
    let domain = if let Some(domain_attr) = parsed.domain() {
        // Domain attribute present - this is a suffix match
        let domain = domain_attr.trim_start_matches('.').to_lowercase();

        // Reject single-component domains (TLDs) except localhost
        if is_single_component_domain(&domain) && !is_localhost(&domain) {
            debug!("Rejecting cookie with single-component domain: {}", domain);
            return None;
        }

        CookieDomain::Suffix(domain)
    } else {
        // No domain attribute - host-only cookie
        CookieDomain::HostOnly(request_url.host_str().unwrap_or("").to_lowercase())
    };

    // Determine expiration
    let expires = if let Some(max_age) = parsed.max_age() {
        let duration = Duration::from_secs(max_age.whole_seconds().max(0) as u64);
        let expiry = SystemTime::now() + duration;
        let expiry_secs = expiry.duration_since(UNIX_EPOCH).unwrap_or_default().as_secs();
        CookieExpires::AtUtc(format!("{}", expiry_secs))
    } else if let Some(expires_time) = parsed.expires() {
        match expires_time {
            cookie::Expiration::DateTime(dt) => {
                let timestamp = dt.unix_timestamp();
                CookieExpires::AtUtc(format!("{}", timestamp))
            }
            cookie::Expiration::Session => CookieExpires::SessionEnd,
        }
    } else {
        CookieExpires::SessionEnd
    };

    // Determine path
    let path = if let Some(path_attr) = parsed.path() {
        (path_attr.to_string(), true)
    } else {
        // Default path is the directory of the request URI
        let default_path = default_cookie_path(request_url.path());
        (default_path, false)
    };

    Some(Cookie { raw_cookie, domain, expires, path })
}

/// Get the default cookie path from a request path (RFC 6265 Section 5.1.4)
fn default_cookie_path(request_path: &str) -> String {
    if request_path.is_empty() || !request_path.starts_with('/') {
        return "/".to_string();
    }

    // Find the last slash
    if let Some(last_slash) = request_path.rfind('/') {
        if last_slash == 0 { "/".to_string() } else { request_path[..last_slash].to_string() }
    } else {
        "/".to_string()
    }
}

/// Check if a request path matches a cookie path (RFC 6265 Section 5.1.4)
fn path_matches(request_path: &str, cookie_path: &str) -> bool {
    if request_path == cookie_path {
        return true;
    }

    if request_path.starts_with(cookie_path) {
        // Cookie path must end with / or the char after cookie_path in request_path must be /
        if cookie_path.ends_with('/') {
            return true;
        }
        if request_path.chars().nth(cookie_path.len()) == Some('/') {
            return true;
        }
    }

    false
}

/// Check if two cookies match (same name and domain)
fn cookies_match(a: &Cookie, b: &Cookie) -> bool {
    let name_a = parse_cookie_name_value(&a.raw_cookie).map(|(n, _)| n);
    let name_b = parse_cookie_name_value(&b.raw_cookie).map(|(n, _)| n);

    if name_a != name_b {
        return false;
    }

    // Check domain match
    match (&a.domain, &b.domain) {
        (CookieDomain::HostOnly(d1), CookieDomain::HostOnly(d2)) => {
            d1.to_lowercase() == d2.to_lowercase()
        }
        (CookieDomain::Suffix(d1), CookieDomain::Suffix(d2)) => {
            d1.to_lowercase() == d2.to_lowercase()
        }
        _ => false,
    }
}

/// Parse a cookie date string (Unix timestamp in our format)
fn parse_cookie_date(date_str: &str) -> Result<SystemTime, ()> {
    let timestamp: i64 = date_str.parse().map_err(|_| ())?;
    let duration = Duration::from_secs(timestamp.max(0) as u64);
    Ok(UNIX_EPOCH + duration)
}

/// Check if a domain is a single-component domain (TLD)
/// e.g., "com", "org", "net" - domains without any dots
fn is_single_component_domain(domain: &str) -> bool {
    // Empty or only dots
    let trimmed = domain.trim_matches('.');
    if trimmed.is_empty() {
        return true;
    }
    // IPv6 addresses use colons, not dots - don't consider them single-component
    if domain.contains(':') {
        return false;
    }
    !trimmed.contains('.')
}

/// Check if a domain is localhost or a localhost variant
fn is_localhost(domain: &str) -> bool {
    let lower = domain.to_lowercase();
    lower == "localhost"
        || lower.ends_with(".localhost")
        || lower == "127.0.0.1"
        || lower == "::1"
        || lower == "[::1]"
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_parse_cookie_name_value() {
        assert_eq!(
            parse_cookie_name_value("session=abc123"),
            Some(("session".to_string(), "abc123".to_string()))
        );
        assert_eq!(
            parse_cookie_name_value("name=value; Path=/; HttpOnly"),
            Some(("name".to_string(), "value".to_string()))
        );
        assert_eq!(parse_cookie_name_value("empty="), Some(("empty".to_string(), "".to_string())));
        assert_eq!(parse_cookie_name_value(""), None);
    }

    #[test]
    fn test_path_matches() {
        assert!(path_matches("/", "/"));
        assert!(path_matches("/foo", "/"));
        assert!(path_matches("/foo/bar", "/foo"));
        assert!(path_matches("/foo/bar", "/foo/"));
        assert!(!path_matches("/foobar", "/foo"));
        assert!(!path_matches("/foo", "/foo/bar"));
    }

    #[test]
    fn test_default_cookie_path() {
        assert_eq!(default_cookie_path("/"), "/");
        assert_eq!(default_cookie_path("/foo"), "/");
        assert_eq!(default_cookie_path("/foo/bar"), "/foo");
        assert_eq!(default_cookie_path("/foo/bar/baz"), "/foo/bar");
        assert_eq!(default_cookie_path(""), "/");
    }

    #[test]
    fn test_cookie_store_basic() {
        let store = CookieStore::new();
        let url = Url::parse("https://example.com/path").unwrap();

        // Initially empty
        assert!(store.get_cookie_header(&url).is_none());

        // Add a cookie
        store.store_cookies_from_response(&url, &["session=abc123".to_string()]);

        // Should now have the cookie
        let header = store.get_cookie_header(&url);
        assert_eq!(header, Some("session=abc123".to_string()));
    }

    #[test]
    fn test_cookie_domain_matching() {
        let store = CookieStore::new();
        let url = Url::parse("https://example.com/").unwrap();

        // Cookie with domain attribute (suffix match)
        store.store_cookies_from_response(
            &url,
            &["domain_cookie=value; Domain=example.com".to_string()],
        );

        // Should match example.com
        assert!(store.get_cookie_header(&url).is_some());

        // Should match subdomain
        let subdomain_url = Url::parse("https://sub.example.com/").unwrap();
        assert!(store.get_cookie_header(&subdomain_url).is_some());

        // Should not match different domain
        let other_url = Url::parse("https://other.com/").unwrap();
        assert!(store.get_cookie_header(&other_url).is_none());
    }

    #[test]
    fn test_cookie_path_matching() {
        let store = CookieStore::new();
        let url = Url::parse("https://example.com/api/v1").unwrap();

        // Cookie with path
        store.store_cookies_from_response(&url, &["api_cookie=value; Path=/api".to_string()]);

        // Should match /api/v1
        assert!(store.get_cookie_header(&url).is_some());

        // Should match /api
        let api_url = Url::parse("https://example.com/api").unwrap();
        assert!(store.get_cookie_header(&api_url).is_some());

        // Should not match /other
        let other_url = Url::parse("https://example.com/other").unwrap();
        assert!(store.get_cookie_header(&other_url).is_none());
    }

    #[test]
    fn test_cookie_replacement() {
        let store = CookieStore::new();
        let url = Url::parse("https://example.com/").unwrap();

        // Add a cookie
        store.store_cookies_from_response(&url, &["session=old".to_string()]);
        assert_eq!(store.get_cookie_header(&url), Some("session=old".to_string()));

        // Replace with new value
        store.store_cookies_from_response(&url, &["session=new".to_string()]);
        assert_eq!(store.get_cookie_header(&url), Some("session=new".to_string()));

        // Should only have one cookie
        assert_eq!(store.get_all_cookies().len(), 1);
    }

    #[test]
    fn test_is_single_component_domain() {
        // Single-component domains (TLDs)
        assert!(is_single_component_domain("com"));
        assert!(is_single_component_domain("org"));
        assert!(is_single_component_domain("net"));
        assert!(is_single_component_domain("localhost")); // Still single-component, but allowed separately

        // Multi-component domains
        assert!(!is_single_component_domain("example.com"));
        assert!(!is_single_component_domain("sub.example.com"));
        assert!(!is_single_component_domain("co.uk"));

        // Edge cases
        assert!(is_single_component_domain("")); // Empty is treated as single-component
        assert!(is_single_component_domain(".")); // Only dots
        assert!(is_single_component_domain("..")); // Only dots

        // IPv6 addresses (have colons, not dots)
        assert!(!is_single_component_domain("::1")); // IPv6 localhost
        assert!(!is_single_component_domain("[::1]")); // Bracketed IPv6
        assert!(!is_single_component_domain("2001:db8::1")); // IPv6 address
    }

    #[test]
    fn test_is_localhost() {
        // Localhost variants
        assert!(is_localhost("localhost"));
        assert!(is_localhost("LOCALHOST")); // Case-insensitive
        assert!(is_localhost("sub.localhost"));
        assert!(is_localhost("app.sub.localhost"));

        // IP localhost
        assert!(is_localhost("127.0.0.1"));
        assert!(is_localhost("::1"));
        assert!(is_localhost("[::1]"));

        // Not localhost
        assert!(!is_localhost("example.com"));
        assert!(!is_localhost("localhost.com")); // .com domain, not localhost
        assert!(!is_localhost("notlocalhost"));
    }

    #[test]
    fn test_reject_tld_cookies() {
        let store = CookieStore::new();
        let url = Url::parse("https://example.com/").unwrap();

        // Try to set a cookie with Domain=com (TLD)
        store.store_cookies_from_response(&url, &["bad=cookie; Domain=com".to_string()]);

        // Should be rejected - no cookies stored
        assert_eq!(store.get_all_cookies().len(), 0);
        assert!(store.get_cookie_header(&url).is_none());
    }

    #[test]
    fn test_allow_localhost_cookies() {
        let store = CookieStore::new();
        let url = Url::parse("http://localhost:3000/").unwrap();

        // Cookie with Domain=localhost should be allowed
        store.store_cookies_from_response(&url, &["session=abc; Domain=localhost".to_string()]);

        // Should be accepted
        assert_eq!(store.get_all_cookies().len(), 1);
        assert!(store.get_cookie_header(&url).is_some());
    }

    #[test]
    fn test_allow_127_0_0_1_cookies() {
        let store = CookieStore::new();
        let url = Url::parse("http://127.0.0.1:8080/").unwrap();

        // Cookie without Domain attribute (host-only) should work
        store.store_cookies_from_response(&url, &["session=xyz".to_string()]);

        // Should be accepted
        assert_eq!(store.get_all_cookies().len(), 1);
        assert!(store.get_cookie_header(&url).is_some());
    }

    #[test]
    fn test_allow_normal_domain_cookies() {
        let store = CookieStore::new();
        let url = Url::parse("https://example.com/").unwrap();

        // Cookie with valid domain should be allowed
        store.store_cookies_from_response(&url, &["session=abc; Domain=example.com".to_string()]);

        // Should be accepted
        assert_eq!(store.get_all_cookies().len(), 1);
        assert!(store.get_cookie_header(&url).is_some());
    }
}
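A small sketch (hypothetical wiring, not from this diff) of the persistence round-trip the doc comments hint at: dump cookies with `get_all_cookies` after a request, then rebuild the store with `from_cookies` for the next one. `save_to_db` stands in for the real yaak-models persistence layer.

```
use yaak_http::cookies::CookieStore;
use yaak_models::models::Cookie;

fn persist(store: &CookieStore) -> Vec<Cookie> {
    // `get_all_cookies` clones the Vec out from behind the Mutex.
    let cookies = store.get_all_cookies();
    save_to_db(&cookies);
    cookies
}

fn restore(saved: Vec<Cookie>) -> CookieStore {
    // The restored store matches and replaces cookies exactly like a fresh one.
    CookieStore::from_cookies(saved)
}

// Placeholder; the real app persists cookies in the workspace database.
fn save_to_db(_cookies: &[Cookie]) {}
```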
@@ -1,186 +0,0 @@
use crate::sender::HttpResponseEvent;
use hyper_util::client::legacy::connect::dns::{
    GaiResolver as HyperGaiResolver, Name as HyperName,
};
use log::info;
use reqwest::dns::{Addrs, Name, Resolve, Resolving};
use std::collections::HashMap;
use std::net::{IpAddr, Ipv4Addr, Ipv6Addr, SocketAddr};
use std::str::FromStr;
use std::sync::Arc;
use std::time::Instant;
use tokio::sync::{RwLock, mpsc};
use tower_service::Service;
use yaak_models::models::DnsOverride;

/// Stores resolved addresses for a hostname override
#[derive(Clone)]
pub struct ResolvedOverride {
    pub ipv4: Vec<Ipv4Addr>,
    pub ipv6: Vec<Ipv6Addr>,
}

#[derive(Clone)]
pub struct LocalhostResolver {
    fallback: HyperGaiResolver,
    event_tx: Arc<RwLock<Option<mpsc::Sender<HttpResponseEvent>>>>,
    overrides: Arc<HashMap<String, ResolvedOverride>>,
}

impl LocalhostResolver {
    pub fn new(dns_overrides: Vec<DnsOverride>) -> Arc<Self> {
        let resolver = HyperGaiResolver::new();

        // Pre-parse DNS overrides into a lookup map
        let mut overrides = HashMap::new();
        for o in dns_overrides {
            if !o.enabled {
                continue;
            }
            let hostname = o.hostname.to_lowercase();

            let ipv4: Vec<Ipv4Addr> =
                o.ipv4.iter().filter_map(|s| s.parse::<Ipv4Addr>().ok()).collect();

            let ipv6: Vec<Ipv6Addr> =
                o.ipv6.iter().filter_map(|s| s.parse::<Ipv6Addr>().ok()).collect();

            // Only add if at least one address is valid
            if !ipv4.is_empty() || !ipv6.is_empty() {
                overrides.insert(hostname, ResolvedOverride { ipv4, ipv6 });
            }
        }

        Arc::new(Self {
            fallback: resolver,
            event_tx: Arc::new(RwLock::new(None)),
            overrides: Arc::new(overrides),
        })
    }

    /// Set the event sender for the current request.
    /// This should be called before each request to direct DNS events
    /// to the appropriate channel.
    pub async fn set_event_sender(&self, tx: Option<mpsc::Sender<HttpResponseEvent>>) {
        let mut guard = self.event_tx.write().await;
        *guard = tx;
    }
}

impl Resolve for LocalhostResolver {
    fn resolve(&self, name: Name) -> Resolving {
        let host = name.as_str().to_lowercase();
        let event_tx = self.event_tx.clone();
        let overrides = self.overrides.clone();

        info!("DNS resolve called for: {}", host);

        // Check for DNS override first
        if let Some(resolved) = overrides.get(&host) {
            log::debug!("DNS override found for: {}", host);
            let hostname = host.clone();
            let mut addrs: Vec<SocketAddr> = Vec::new();

            // Add IPv4 addresses
            for ip in &resolved.ipv4 {
                addrs.push(SocketAddr::new(IpAddr::V4(*ip), 0));
            }

            // Add IPv6 addresses
            for ip in &resolved.ipv6 {
                addrs.push(SocketAddr::new(IpAddr::V6(*ip), 0));
            }

            let addresses: Vec<String> = addrs.iter().map(|a| a.ip().to_string()).collect();

            return Box::pin(async move {
                // Emit DNS event for override
                let guard = event_tx.read().await;
                if let Some(tx) = guard.as_ref() {
                    let _ = tx
                        .send(HttpResponseEvent::DnsResolved {
                            hostname,
                            addresses,
                            duration: 0,
                            overridden: true,
                        })
                        .await;
                }

                Ok::<Addrs, Box<dyn std::error::Error + Send + Sync>>(Box::new(addrs.into_iter()))
            });
        }

        // Check for .localhost suffix
        let is_localhost = host.ends_with(".localhost");
        if is_localhost {
            let hostname = host.clone();
            // Port 0 is fine; reqwest replaces it with the URL's explicit
            // port or the scheme's default (80/443, etc.).
            let addrs: Vec<SocketAddr> = vec![
                SocketAddr::new(IpAddr::V4(Ipv4Addr::LOCALHOST), 0),
                SocketAddr::new(IpAddr::V6(Ipv6Addr::LOCALHOST), 0),
            ];

            let addresses: Vec<String> = addrs.iter().map(|a| a.ip().to_string()).collect();

            return Box::pin(async move {
                // Emit DNS event for localhost resolution
                let guard = event_tx.read().await;
                if let Some(tx) = guard.as_ref() {
                    let _ = tx
                        .send(HttpResponseEvent::DnsResolved {
                            hostname,
                            addresses,
                            duration: 0,
                            overridden: false,
                        })
                        .await;
                }

                Ok::<Addrs, Box<dyn std::error::Error + Send + Sync>>(Box::new(addrs.into_iter()))
            });
        }

        // Fall back to system DNS
        let mut fallback = self.fallback.clone();
        let name_str = name.as_str().to_string();
        let hostname = host.clone();

        Box::pin(async move {
            let start = Instant::now();

            let result = match HyperName::from_str(&name_str) {
                Ok(n) => fallback.call(n).await,
                Err(e) => return Err(Box::new(e) as Box<dyn std::error::Error + Send + Sync>),
            };

            let duration = start.elapsed().as_millis() as u64;

            match result {
                Ok(addrs) => {
                    // Collect addresses for event emission
                    let addr_vec: Vec<SocketAddr> = addrs.collect();
                    let addresses: Vec<String> =
                        addr_vec.iter().map(|a| a.ip().to_string()).collect();

                    // Emit DNS event
                    let guard = event_tx.read().await;
                    if let Some(tx) = guard.as_ref() {
                        let _ = tx
                            .send(HttpResponseEvent::DnsResolved {
                                hostname,
                                addresses,
                                duration,
                                overridden: false,
                            })
                            .await;
                    }

                    Ok(Box::new(addr_vec.into_iter()) as Addrs)
                }
                Err(err) => Err(Box::new(err) as Box<dyn std::error::Error + Send + Sync>),
            }
        })
    }
}
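A minimal wiring sketch (not from this diff): because `LocalhostResolver` implements reqwest's `Resolve` trait, it can be handed to `ClientBuilder::dns_resolver`, assuming the reqwest version in use exposes that hook. The empty override list is just for illustration.

```
fn build_client() -> reqwest::Result<reqwest::Client> {
    let resolver = LocalhostResolver::new(Vec::new()); // returns Arc<Self>
    reqwest::Client::builder()
        // Every lookup made by this client now goes through the resolver,
        // so `*.localhost` names short-circuit to 127.0.0.1 / ::1.
        .dns_resolver(resolver)
        .build()
}
```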
@@ -1,13 +0,0 @@
mod chained_reader;
pub mod client;
pub mod cookies;
pub mod decompress;
pub mod dns;
pub mod error;
pub mod manager;
pub mod path_placeholders;
mod proto;
pub mod sender;
pub mod tee_reader;
pub mod transaction;
pub mod types;
@@ -1,159 +0,0 @@
use std::io;
use std::pin::Pin;
use std::task::{Context, Poll};
use tokio::io::{AsyncRead, ReadBuf};
use tokio::sync::mpsc;

/// A reader that forwards all read data to a channel while also returning it to the caller.
/// This allows capturing request body data as it's being sent.
/// Uses an unbounded channel to ensure all data is captured without blocking the request.
pub struct TeeReader<R> {
    inner: R,
    tx: mpsc::UnboundedSender<Vec<u8>>,
}

impl<R> TeeReader<R> {
    pub fn new(inner: R, tx: mpsc::UnboundedSender<Vec<u8>>) -> Self {
        Self { inner, tx }
    }
}

impl<R: AsyncRead + Unpin> AsyncRead for TeeReader<R> {
    fn poll_read(
        mut self: Pin<&mut Self>,
        cx: &mut Context<'_>,
        buf: &mut ReadBuf<'_>,
    ) -> Poll<io::Result<()>> {
        let before_len = buf.filled().len();

        match Pin::new(&mut self.inner).poll_read(cx, buf) {
            Poll::Ready(Ok(())) => {
                let after_len = buf.filled().len();
                if after_len > before_len {
                    // Data was read, send a copy to the channel
                    let data = buf.filled()[before_len..after_len].to_vec();
                    // Send to unbounded channel - this never blocks
                    // Ignore error if receiver is closed
                    let _ = self.tx.send(data);
                }
                Poll::Ready(Ok(()))
            }
            Poll::Ready(Err(e)) => Poll::Ready(Err(e)),
            Poll::Pending => Poll::Pending,
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::io::Cursor;
    use tokio::io::AsyncReadExt;

    #[tokio::test]
    async fn test_tee_reader_captures_all_data() {
        let data = b"Hello, World!";
        let cursor = Cursor::new(data.to_vec());
        let (tx, mut rx) = mpsc::unbounded_channel();

        let mut tee = TeeReader::new(cursor, tx);
        let mut output = Vec::new();
        tee.read_to_end(&mut output).await.unwrap();

        // Verify the reader returns the correct data
        assert_eq!(output, data);

        // Verify the channel received the data
        let mut captured = Vec::new();
        while let Ok(chunk) = rx.try_recv() {
            captured.extend(chunk);
        }
        assert_eq!(captured, data);
    }

    #[tokio::test]
    async fn test_tee_reader_with_chunked_reads() {
        let data = b"ABCDEFGHIJKLMNOPQRSTUVWXYZ";
        let cursor = Cursor::new(data.to_vec());
        let (tx, mut rx) = mpsc::unbounded_channel();

        let mut tee = TeeReader::new(cursor, tx);

        // Read in small chunks
        let mut buf = [0u8; 5];
        let mut output = Vec::new();
        loop {
            let n = tee.read(&mut buf).await.unwrap();
            if n == 0 {
                break;
            }
            output.extend_from_slice(&buf[..n]);
        }

        // Verify the reader returns the correct data
        assert_eq!(output, data);

        // Verify the channel received all chunks
        let mut captured = Vec::new();
        while let Ok(chunk) = rx.try_recv() {
            captured.extend(chunk);
        }
        assert_eq!(captured, data);
    }

    #[tokio::test]
    async fn test_tee_reader_empty_data() {
        let data: Vec<u8> = vec![];
        let cursor = Cursor::new(data.clone());
        let (tx, mut rx) = mpsc::unbounded_channel();

        let mut tee = TeeReader::new(cursor, tx);
        let mut output = Vec::new();
        tee.read_to_end(&mut output).await.unwrap();

        // Verify empty output
        assert!(output.is_empty());

        // Verify no data was sent to channel
        assert!(rx.try_recv().is_err());
    }

    #[tokio::test]
    async fn test_tee_reader_works_when_receiver_dropped() {
        let data = b"Hello, World!";
        let cursor = Cursor::new(data.to_vec());
        let (tx, rx) = mpsc::unbounded_channel();

        // Drop the receiver before reading
        drop(rx);

        let mut tee = TeeReader::new(cursor, tx);
        let mut output = Vec::new();

        // Should still work even though receiver is dropped
        tee.read_to_end(&mut output).await.unwrap();
        assert_eq!(output, data);
    }

    #[tokio::test]
    async fn test_tee_reader_large_data() {
        // Test with 1MB of data
        let data: Vec<u8> = (0..1024 * 1024).map(|i| (i % 256) as u8).collect();
        let cursor = Cursor::new(data.clone());
        let (tx, mut rx) = mpsc::unbounded_channel();

        let mut tee = TeeReader::new(cursor, tx);
        let mut output = Vec::new();
        tee.read_to_end(&mut output).await.unwrap();

        // Verify the reader returns the correct data
        assert_eq!(output, data);

        // Verify the channel received all data
        let mut captured = Vec::new();
        while let Ok(chunk) = rx.try_recv() {
            captured.extend(chunk);
        }
        assert_eq!(captured, data);
    }
}
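A usage sketch beyond the in-file tests (hypothetical, not from this diff): capture an upload body concurrently while it streams, instead of draining the channel afterwards. The file path is a placeholder and `read_to_end` stands in for the HTTP send.

```
use tokio::io::AsyncReadExt;
use tokio::sync::mpsc;
use yaak_http::tee_reader::TeeReader;

async fn stream_with_capture() -> std::io::Result<Vec<u8>> {
    let file = tokio::fs::File::open("/tmp/upload.bin").await?; // placeholder
    let (tx, mut rx) = mpsc::unbounded_channel::<Vec<u8>>();

    // Collect captured chunks on a separate task while the body streams.
    let capture = tokio::spawn(async move {
        let mut captured = Vec::new();
        while let Some(chunk) = rx.recv().await {
            captured.extend(chunk);
        }
        captured
    });

    let mut tee = TeeReader::new(file, tx);
    let mut sink = Vec::new();
    tee.read_to_end(&mut sink).await?; // stands in for the HTTP send
    drop(tee); // drops tx so the capture task's loop ends

    Ok(capture.await.expect("capture task panicked"))
}
```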
@@ -1,729 +0,0 @@
use crate::cookies::CookieStore;
use crate::error::Result;
use crate::sender::{HttpResponse, HttpResponseEvent, HttpSender, RedirectBehavior};
use crate::types::SendableHttpRequest;
use log::debug;
use tokio::sync::mpsc;
use tokio::sync::watch::Receiver;
use url::Url;

/// HTTP Transaction that manages the lifecycle of a request, including redirect handling
pub struct HttpTransaction<S: HttpSender> {
    sender: S,
    max_redirects: usize,
    cookie_store: Option<CookieStore>,
}

impl<S: HttpSender> HttpTransaction<S> {
    /// Create a new transaction with default settings
    pub fn new(sender: S) -> Self {
        Self { sender, max_redirects: 10, cookie_store: None }
    }

    /// Create a new transaction with custom max redirects
    pub fn with_max_redirects(sender: S, max_redirects: usize) -> Self {
        Self { sender, max_redirects, cookie_store: None }
    }

    /// Create a new transaction with a cookie store
    pub fn with_cookie_store(sender: S, cookie_store: CookieStore) -> Self {
        Self { sender, max_redirects: 10, cookie_store: Some(cookie_store) }
    }

    /// Create a new transaction with custom max redirects and a cookie store
    pub fn with_options(
        sender: S,
        max_redirects: usize,
        cookie_store: Option<CookieStore>,
    ) -> Self {
        Self { sender, max_redirects, cookie_store }
    }

    /// Execute the request with cancellation support.
    /// Returns an HttpResponse with unconsumed body - caller decides how to consume it.
    /// Events are sent through the provided channel.
    pub async fn execute_with_cancellation(
        &self,
        request: SendableHttpRequest,
        mut cancelled_rx: Receiver<bool>,
        event_tx: mpsc::Sender<HttpResponseEvent>,
    ) -> Result<HttpResponse> {
        let mut redirect_count = 0;
        let mut current_url = request.url;
        let mut current_method = request.method;
        let mut current_headers = request.headers;
        let mut current_body = request.body;

        // Helper to send events (ignores errors if receiver is dropped or channel is full)
        let send_event = |event: HttpResponseEvent| {
            let _ = event_tx.try_send(event);
        };

        loop {
            // Check for cancellation before each request
            if *cancelled_rx.borrow() {
                return Err(crate::error::Error::RequestCanceledError);
            }

            // Inject cookies into headers if we have a cookie store
            let headers_with_cookies = if let Some(cookie_store) = &self.cookie_store {
                let mut headers = current_headers.clone();
                if let Ok(url) = Url::parse(&current_url) {
                    if let Some(cookie_header) = cookie_store.get_cookie_header(&url) {
                        debug!("Injecting Cookie header: {}", cookie_header);
                        // Check if there's already a Cookie header and merge if so
                        if let Some(existing) =
                            headers.iter_mut().find(|h| h.0.eq_ignore_ascii_case("cookie"))
                        {
                            existing.1 = format!("{}; {}", existing.1, cookie_header);
                        } else {
                            headers.push(("Cookie".to_string(), cookie_header));
                        }
                    }
                }
                headers
            } else {
                current_headers.clone()
            };

            // Build request for this iteration
            let req = SendableHttpRequest {
                url: current_url.clone(),
                method: current_method.clone(),
                headers: headers_with_cookies,
                body: current_body,
                options: request.options.clone(),
            };

            // Report the redirect-following setting for this attempt
            send_event(HttpResponseEvent::Setting(
                "redirects".to_string(),
                request.options.follow_redirects.to_string(),
            ));

            // Execute with cancellation support
            let response = tokio::select! {
                result = self.sender.send(req, event_tx.clone()) => result?,
                _ = cancelled_rx.changed() => {
                    return Err(crate::error::Error::RequestCanceledError);
                }
            };

            // Parse Set-Cookie headers and store cookies
            if let Some(cookie_store) = &self.cookie_store {
                if let Ok(url) = Url::parse(&current_url) {
                    let set_cookie_headers: Vec<String> = response
                        .headers
                        .iter()
                        .filter(|(k, _)| k.eq_ignore_ascii_case("set-cookie"))
                        .map(|(_, v)| v.clone())
                        .collect();

                    if !set_cookie_headers.is_empty() {
                        debug!("Storing {} cookies from response", set_cookie_headers.len());
                        cookie_store.store_cookies_from_response(&url, &set_cookie_headers);
                    }
                }
            }

            if !Self::is_redirect(response.status) {
                // Not a redirect - return the response for caller to consume body
                return Ok(response);
            }

            if !request.options.follow_redirects {
                // Redirects disabled - return the redirect response as-is
                return Ok(response);
            }

            // Check if we've exceeded max redirects
            if redirect_count >= self.max_redirects {
                // Drain the response before returning error
                let _ = response.drain().await;
                return Err(crate::error::Error::RequestError(format!(
                    "Maximum redirect limit ({}) exceeded",
                    self.max_redirects
                )));
            }

            // Extract Location header before draining (headers are available immediately)
            // HTTP headers are case-insensitive, so we need to search for any casing
            let location = response
                .headers
                .iter()
                .find(|(k, _)| k.eq_ignore_ascii_case("location"))
                .map(|(_, v)| v.clone())
                .ok_or_else(|| {
                    crate::error::Error::RequestError(
                        "Redirect response missing Location header".to_string(),
                    )
                })?;

            // Also get status before draining
            let status = response.status;

            send_event(HttpResponseEvent::Info("Ignoring the response body".to_string()));

            // Drain the redirect response body before following
            response.drain().await?;

            // Update the request URL
            current_url = if location.starts_with("http://") || location.starts_with("https://") {
                // Absolute URL
                location
            } else if location.starts_with('/') {
                // Absolute path - need to extract base URL from current request
                let base_url = Self::extract_base_url(&current_url)?;
                format!("{}{}", base_url, location)
            } else {
                // Relative path - need to resolve relative to current path
                let base_path = Self::extract_base_path(&current_url)?;
                format!("{}/{}", base_path, location)
            };

            // Determine redirect behavior based on status code and method
            let behavior = if status == 303 {
                // 303 See Other always changes to GET
                RedirectBehavior::DropBody
            } else if (status == 301 || status == 302) && current_method == "POST" {
                // For 301/302, change POST to GET (common browser behavior)
                RedirectBehavior::DropBody
            } else {
                // For 307 and 308, the method and body are preserved
                // Also for 301/302 with non-POST methods
                RedirectBehavior::Preserve
            };

            send_event(HttpResponseEvent::Redirect {
                url: current_url.clone(),
                status,
                behavior: behavior.clone(),
            });

            // Handle method changes for certain redirect codes
            if matches!(behavior, RedirectBehavior::DropBody) {
                if current_method != "GET" {
                    current_method = "GET".to_string();
                }
                // Remove content-related headers
                current_headers.retain(|h| {
                    let name_lower = h.0.to_lowercase();
                    !name_lower.starts_with("content-") && name_lower != "transfer-encoding"
                });
            }

            // Reset body for next iteration (since it was moved in the send call)
            // For redirects that change method to GET or for all redirects since body was consumed
            current_body = None;

            redirect_count += 1;
        }
    }

    /// Check if a status code indicates a redirect
    fn is_redirect(status: u16) -> bool {
        matches!(status, 301 | 302 | 303 | 307 | 308)
    }

    /// Extract the base URL (scheme + host) from a full URL
    fn extract_base_url(url: &str) -> Result<String> {
        // Find the position after "://"
        let scheme_end = url.find("://").ok_or_else(|| {
            crate::error::Error::RequestError(format!("Invalid URL format: {}", url))
        })?;

        // Find the first '/' after the scheme
        let path_start = url[scheme_end + 3..].find('/');

        if let Some(idx) = path_start {
            Ok(url[..scheme_end + 3 + idx].to_string())
        } else {
            // No path, return entire URL
            Ok(url.to_string())
        }
    }

    /// Extract the base path (everything except the last segment) from a URL
    fn extract_base_path(url: &str) -> Result<String> {
        if let Some(last_slash) = url.rfind('/') {
            // Don't include the trailing slash if it's part of the host or
            // the scheme separator ("://"), e.g. for bare-host URLs
            if url[..last_slash].ends_with("://")
                || url[..last_slash].ends_with(":/")
                || url[..last_slash].ends_with(':')
            {
                Ok(url.to_string())
            } else {
                Ok(url[..last_slash].to_string())
            }
        } else {
            Ok(url.to_string())
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::decompress::ContentEncoding;
    use crate::sender::{HttpResponseEvent, HttpSender};
    use async_trait::async_trait;
    use std::pin::Pin;
    use std::sync::Arc;
    use tokio::io::AsyncRead;
    use tokio::sync::Mutex;

    /// Mock sender for testing
    struct MockSender {
        responses: Arc<Mutex<Vec<MockResponse>>>,
    }

    struct MockResponse {
        status: u16,
        headers: Vec<(String, String)>,
        body: Vec<u8>,
    }

    impl MockSender {
        fn new(responses: Vec<MockResponse>) -> Self {
            Self { responses: Arc::new(Mutex::new(responses)) }
        }
    }

    #[async_trait]
    impl HttpSender for MockSender {
        async fn send(
            &self,
            _request: SendableHttpRequest,
            _event_tx: mpsc::Sender<HttpResponseEvent>,
        ) -> Result<HttpResponse> {
            let mut responses = self.responses.lock().await;
            if responses.is_empty() {
                Err(crate::error::Error::RequestError("No more mock responses".to_string()))
            } else {
                let mock = responses.remove(0);
                // Create a simple in-memory stream from the body
                let body_stream: Pin<Box<dyn AsyncRead + Send>> =
                    Box::pin(std::io::Cursor::new(mock.body));
                Ok(HttpResponse::new(
                    mock.status,
                    None, // status_reason
                    mock.headers,
                    Vec::new(),
                    None, // content_length
                    "https://example.com".to_string(), // url
                    None, // remote_addr
                    Some("HTTP/1.1".to_string()), // version
                    body_stream,
                    ContentEncoding::Identity,
                ))
            }
        }
    }

    #[tokio::test]
    async fn test_transaction_no_redirect() {
        let response = MockResponse { status: 200, headers: Vec::new(), body: b"OK".to_vec() };
        let sender = MockSender::new(vec![response]);
        let transaction = HttpTransaction::new(sender);

        let request = SendableHttpRequest {
            url: "https://example.com".to_string(),
            method: "GET".to_string(),
            headers: vec![],
            ..Default::default()
        };

        let (_tx, rx) = tokio::sync::watch::channel(false);
        let (event_tx, _event_rx) = mpsc::channel(100);
        let result = transaction.execute_with_cancellation(request, rx, event_tx).await.unwrap();
        assert_eq!(result.status, 200);

        // Consume the body to verify it
        let (body, _) = result.bytes().await.unwrap();
        assert_eq!(body, b"OK");
    }

    #[tokio::test]
    async fn test_transaction_single_redirect() {
        let redirect_headers =
            vec![("Location".to_string(), "https://example.com/new".to_string())];

        let responses = vec![
            MockResponse { status: 302, headers: redirect_headers, body: vec![] },
            MockResponse { status: 200, headers: Vec::new(), body: b"Final".to_vec() },
        ];

        let sender = MockSender::new(responses);
        let transaction = HttpTransaction::new(sender);

        let request = SendableHttpRequest {
            url: "https://example.com/old".to_string(),
            method: "GET".to_string(),
            options: crate::types::SendableHttpRequestOptions {
                follow_redirects: true,
                ..Default::default()
            },
            ..Default::default()
        };

        let (_tx, rx) = tokio::sync::watch::channel(false);
        let (event_tx, _event_rx) = mpsc::channel(100);
        let result = transaction.execute_with_cancellation(request, rx, event_tx).await.unwrap();
        assert_eq!(result.status, 200);

        let (body, _) = result.bytes().await.unwrap();
        assert_eq!(body, b"Final");
    }

    #[tokio::test]
    async fn test_transaction_max_redirects_exceeded() {
        let redirect_headers =
            vec![("Location".to_string(), "https://example.com/loop".to_string())];

        // Create more redirects than allowed
        let responses: Vec<MockResponse> = (0..12)
            .map(|_| MockResponse { status: 302, headers: redirect_headers.clone(), body: vec![] })
            .collect();

        let sender = MockSender::new(responses);
        let transaction = HttpTransaction::with_max_redirects(sender, 10);

        let request = SendableHttpRequest {
            url: "https://example.com/start".to_string(),
            method: "GET".to_string(),
            options: crate::types::SendableHttpRequestOptions {
                follow_redirects: true,
                ..Default::default()
            },
            ..Default::default()
        };

        let (_tx, rx) = tokio::sync::watch::channel(false);
        let (event_tx, _event_rx) = mpsc::channel(100);
        let result = transaction.execute_with_cancellation(request, rx, event_tx).await;
        if let Err(crate::error::Error::RequestError(msg)) = result {
            assert!(msg.contains("Maximum redirect limit"));
        } else {
            panic!("Expected RequestError with max redirect message. Got {result:?}");
        }
    }

    #[test]
    fn test_is_redirect() {
        assert!(HttpTransaction::<MockSender>::is_redirect(301));
        assert!(HttpTransaction::<MockSender>::is_redirect(302));
        assert!(HttpTransaction::<MockSender>::is_redirect(303));
        assert!(HttpTransaction::<MockSender>::is_redirect(307));
        assert!(HttpTransaction::<MockSender>::is_redirect(308));
        assert!(!HttpTransaction::<MockSender>::is_redirect(200));
        assert!(!HttpTransaction::<MockSender>::is_redirect(404));
        assert!(!HttpTransaction::<MockSender>::is_redirect(500));
    }

    #[test]
    fn test_extract_base_url() {
        let result =
            HttpTransaction::<MockSender>::extract_base_url("https://example.com/path/to/resource");
        assert_eq!(result.unwrap(), "https://example.com");

        let result = HttpTransaction::<MockSender>::extract_base_url("http://localhost:8080/api");
        assert_eq!(result.unwrap(), "http://localhost:8080");

        let result = HttpTransaction::<MockSender>::extract_base_url("invalid-url");
        assert!(result.is_err());
    }

    #[test]
    fn test_extract_base_path() {
        let result = HttpTransaction::<MockSender>::extract_base_path(
            "https://example.com/path/to/resource",
        );
        assert_eq!(result.unwrap(), "https://example.com/path/to");

        let result = HttpTransaction::<MockSender>::extract_base_path("https://example.com/single");
        assert_eq!(result.unwrap(), "https://example.com");

        let result = HttpTransaction::<MockSender>::extract_base_path("https://example.com/");
        assert_eq!(result.unwrap(), "https://example.com");
    }

    #[tokio::test]
    async fn test_cookie_injection() {
        // Create a mock sender that verifies the Cookie header was injected
        struct CookieVerifyingSender {
            expected_cookie: String,
        }

        #[async_trait]
        impl HttpSender for CookieVerifyingSender {
            async fn send(
                &self,
                request: SendableHttpRequest,
                _event_tx: mpsc::Sender<HttpResponseEvent>,
            ) -> Result<HttpResponse> {
                // Verify the Cookie header was injected
                let cookie_header =
                    request.headers.iter().find(|(k, _)| k.eq_ignore_ascii_case("cookie"));

                assert!(cookie_header.is_some(), "Cookie header should be present");
                assert!(
                    cookie_header.unwrap().1.contains(&self.expected_cookie),
                    "Cookie header should contain expected value"
                );

                let body_stream: Pin<Box<dyn AsyncRead + Send>> =
                    Box::pin(std::io::Cursor::new(vec![]));
                Ok(HttpResponse::new(
                    200,
                    None,
                    Vec::new(),
                    Vec::new(),
                    None,
                    "https://example.com".to_string(),
                    None,
                    Some("HTTP/1.1".to_string()),
                    body_stream,
                    ContentEncoding::Identity,
                ))
            }
        }

        use yaak_models::models::{Cookie, CookieDomain, CookieExpires};

        // Create a cookie store with a test cookie
        let cookie = Cookie {
            raw_cookie: "session=abc123".to_string(),
            domain: CookieDomain::HostOnly("example.com".to_string()),
            expires: CookieExpires::SessionEnd,
            path: ("/".to_string(), false),
        };
        let cookie_store = CookieStore::from_cookies(vec![cookie]);

        let sender = CookieVerifyingSender { expected_cookie: "session=abc123".to_string() };
        let transaction = HttpTransaction::with_cookie_store(sender, cookie_store);

        let request = SendableHttpRequest {
            url: "https://example.com/api".to_string(),
            method: "GET".to_string(),
            headers: vec![],
            ..Default::default()
        };

        let (_tx, rx) = tokio::sync::watch::channel(false);
|
||||
let (event_tx, _event_rx) = mpsc::channel(100);
|
||||
let result = transaction.execute_with_cancellation(request, rx, event_tx).await;
|
||||
assert!(result.is_ok());
|
||||
}
|
||||
|
||||
#[tokio::test]
|
||||
async fn test_set_cookie_parsing() {
|
||||
// Create a cookie store
|
||||
let cookie_store = CookieStore::new();
|
||||
|
||||
// Mock sender that returns a Set-Cookie header
|
||||
struct SetCookieSender;
|
||||
|
||||
#[async_trait]
|
||||
impl HttpSender for SetCookieSender {
|
||||
async fn send(
|
||||
&self,
|
||||
_request: SendableHttpRequest,
|
||||
_event_tx: mpsc::Sender<HttpResponseEvent>,
|
||||
) -> Result<HttpResponse> {
|
||||
let headers =
|
||||
vec![("set-cookie".to_string(), "session=xyz789; Path=/".to_string())];
|
||||
|
||||
let body_stream: Pin<Box<dyn AsyncRead + Send>> =
|
||||
Box::pin(std::io::Cursor::new(vec![]));
|
||||
Ok(HttpResponse::new(
|
||||
200,
|
||||
None,
|
||||
headers,
|
||||
Vec::new(),
|
||||
None,
|
||||
"https://example.com".to_string(),
|
||||
None,
|
||||
Some("HTTP/1.1".to_string()),
|
||||
body_stream,
|
||||
ContentEncoding::Identity,
|
||||
))
|
||||
}
|
||||
}
|
||||
|
||||
let sender = SetCookieSender;
|
||||
let transaction = HttpTransaction::with_cookie_store(sender, cookie_store.clone());
|
||||
|
||||
let request = SendableHttpRequest {
|
||||
url: "https://example.com/login".to_string(),
|
||||
method: "POST".to_string(),
|
||||
headers: vec![],
|
||||
..Default::default()
|
||||
};
|
||||
|
||||
let (_tx, rx) = tokio::sync::watch::channel(false);
|
||||
let (event_tx, _event_rx) = mpsc::channel(100);
|
||||
let result = transaction.execute_with_cancellation(request, rx, event_tx).await;
|
||||
assert!(result.is_ok());
|
||||
|
||||
// Verify the cookie was stored
|
||||
let cookies = cookie_store.get_all_cookies();
|
||||
assert_eq!(cookies.len(), 1);
|
||||
assert!(cookies[0].raw_cookie.contains("session=xyz789"));
|
||||
}
|
||||
|
||||
#[tokio::test]
|
||||
async fn test_multiple_set_cookie_headers() {
|
||||
// Create a cookie store
|
||||
let cookie_store = CookieStore::new();
|
||||
|
||||
// Mock sender that returns multiple Set-Cookie headers
|
||||
struct MultiSetCookieSender;
|
||||
|
||||
#[async_trait]
|
||||
impl HttpSender for MultiSetCookieSender {
|
||||
async fn send(
|
||||
&self,
|
||||
_request: SendableHttpRequest,
|
||||
_event_tx: mpsc::Sender<HttpResponseEvent>,
|
||||
) -> Result<HttpResponse> {
|
||||
// Multiple Set-Cookie headers (this is standard HTTP behavior)
|
||||
let headers = vec![
|
||||
("set-cookie".to_string(), "session=abc123; Path=/".to_string()),
|
||||
("set-cookie".to_string(), "user_id=42; Path=/".to_string()),
|
||||
(
|
||||
"set-cookie".to_string(),
|
||||
"preferences=dark; Path=/; Max-Age=86400".to_string(),
|
||||
),
|
||||
];
|
||||
|
||||
let body_stream: Pin<Box<dyn AsyncRead + Send>> =
|
||||
Box::pin(std::io::Cursor::new(vec![]));
|
||||
Ok(HttpResponse::new(
|
||||
200,
|
||||
None,
|
||||
headers,
|
||||
Vec::new(),
|
||||
None,
|
||||
"https://example.com".to_string(),
|
||||
None,
|
||||
Some("HTTP/1.1".to_string()),
|
||||
body_stream,
|
||||
ContentEncoding::Identity,
|
||||
))
|
||||
}
|
||||
}
|
||||
|
||||
let sender = MultiSetCookieSender;
|
||||
let transaction = HttpTransaction::with_cookie_store(sender, cookie_store.clone());
|
||||
|
||||
let request = SendableHttpRequest {
|
||||
url: "https://example.com/login".to_string(),
|
||||
method: "POST".to_string(),
|
||||
headers: vec![],
|
||||
..Default::default()
|
||||
};
|
||||
|
||||
let (_tx, rx) = tokio::sync::watch::channel(false);
|
||||
let (event_tx, _event_rx) = mpsc::channel(100);
|
||||
let result = transaction.execute_with_cancellation(request, rx, event_tx).await;
|
||||
assert!(result.is_ok());
|
||||
|
||||
// Verify all three cookies were stored
|
||||
let cookies = cookie_store.get_all_cookies();
|
||||
assert_eq!(cookies.len(), 3, "All three Set-Cookie headers should be parsed and stored");
|
||||
|
||||
let cookie_values: Vec<&str> = cookies.iter().map(|c| c.raw_cookie.as_str()).collect();
|
||||
assert!(
|
||||
cookie_values.iter().any(|c| c.contains("session=abc123")),
|
||||
"session cookie should be stored"
|
||||
);
|
||||
assert!(
|
||||
cookie_values.iter().any(|c| c.contains("user_id=42")),
|
||||
"user_id cookie should be stored"
|
||||
);
|
||||
assert!(
|
||||
cookie_values.iter().any(|c| c.contains("preferences=dark")),
|
||||
"preferences cookie should be stored"
|
||||
);
|
||||
}
|
||||
|
||||
#[tokio::test]
|
||||
async fn test_cookies_across_redirects() {
|
||||
use std::sync::atomic::{AtomicUsize, Ordering};
|
||||
|
||||
// Create a cookie store
|
||||
let cookie_store = CookieStore::new();
|
||||
|
||||
// Track request count
|
||||
let request_count = Arc::new(AtomicUsize::new(0));
|
||||
let request_count_clone = request_count.clone();
|
||||
|
||||
struct RedirectWithCookiesSender {
|
||||
request_count: Arc<AtomicUsize>,
|
||||
}
|
||||
|
||||
#[async_trait]
|
||||
impl HttpSender for RedirectWithCookiesSender {
|
||||
async fn send(
|
||||
&self,
|
||||
request: SendableHttpRequest,
|
||||
_event_tx: mpsc::Sender<HttpResponseEvent>,
|
||||
) -> Result<HttpResponse> {
|
||||
let count = self.request_count.fetch_add(1, Ordering::SeqCst);
|
||||
|
||||
let (status, headers) = if count == 0 {
|
||||
// First request: return redirect with Set-Cookie
|
||||
let h = vec![
|
||||
("location".to_string(), "https://example.com/final".to_string()),
|
||||
("set-cookie".to_string(), "redirect_cookie=value1".to_string()),
|
||||
];
|
||||
(302, h)
|
||||
} else {
|
||||
// Second request: verify cookie was sent
|
||||
let cookie_header =
|
||||
request.headers.iter().find(|(k, _)| k.eq_ignore_ascii_case("cookie"));
|
||||
|
||||
assert!(cookie_header.is_some(), "Cookie header should be present on redirect");
|
||||
assert!(
|
||||
cookie_header.unwrap().1.contains("redirect_cookie=value1"),
|
||||
"Redirect cookie should be included"
|
||||
);
|
||||
|
||||
(200, Vec::new())
|
||||
};
|
||||
|
||||
let body_stream: Pin<Box<dyn AsyncRead + Send>> =
|
||||
Box::pin(std::io::Cursor::new(vec![]));
|
||||
Ok(HttpResponse::new(
|
||||
status,
|
||||
None,
|
||||
headers,
|
||||
Vec::new(),
|
||||
None,
|
||||
"https://example.com".to_string(),
|
||||
None,
|
||||
Some("HTTP/1.1".to_string()),
|
||||
body_stream,
|
||||
ContentEncoding::Identity,
|
||||
))
|
||||
}
|
||||
}
|
||||
|
||||
let sender = RedirectWithCookiesSender { request_count: request_count_clone };
|
||||
let transaction = HttpTransaction::with_cookie_store(sender, cookie_store);
|
||||
|
||||
let request = SendableHttpRequest {
|
||||
url: "https://example.com/start".to_string(),
|
||||
method: "GET".to_string(),
|
||||
headers: vec![],
|
||||
options: crate::types::SendableHttpRequestOptions {
|
||||
follow_redirects: true,
|
||||
..Default::default()
|
||||
},
|
||||
..Default::default()
|
||||
};
|
||||
|
||||
let (_tx, rx) = tokio::sync::watch::channel(false);
|
||||
let (event_tx, _event_rx) = mpsc::channel(100);
|
||||
let result = transaction.execute_with_cancellation(request, rx, event_tx).await;
|
||||
assert!(result.is_ok());
|
||||
assert_eq!(request_count.load(Ordering::SeqCst), 2);
|
||||
}
|
||||
}
|
||||
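The watch and mpsc channels threaded through every test above are the same surface a real caller uses to cancel a request and observe its progress. A minimal sketch of that (illustrative only; `RealSender` is a hypothetical `HttpSender` implementation, everything else mirrors the API exercised by the tests):

```rust
// Sketch, not part of the crate: cancel an in-flight request after a timeout.
async fn send_cancellable(transaction: HttpTransaction<RealSender>) -> Result<()> {
    let (cancel_tx, cancel_rx) = tokio::sync::watch::channel(false);
    let (event_tx, mut event_rx) = mpsc::channel(100);

    // Flip the watch value after 30s; execute_with_cancellation observes it.
    tokio::spawn(async move {
        tokio::time::sleep(std::time::Duration::from_secs(30)).await;
        let _ = cancel_tx.send(true);
    });

    // Drain progress events (redirects, headers, chunks) as they arrive.
    // Assumes HttpResponseEvent implements Debug.
    tokio::spawn(async move {
        while let Some(event) = event_rx.recv().await {
            log::debug!("http event: {event:?}");
        }
    });

    let request = SendableHttpRequest {
        url: "https://example.com".to_string(),
        method: "GET".to_string(),
        ..Default::default()
    };
    let response = transaction.execute_with_cancellation(request, cancel_rx, event_tx).await?;
    let status = response.status;
    let (body, _) = response.bytes().await?;
    log::info!("status {status}, {} bytes", body.len());
    Ok(())
}
```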
@@ -1,12 +0,0 @@
CREATE TABLE body_chunks
(
    id          TEXT PRIMARY KEY,
    body_id     TEXT NOT NULL,
    chunk_index INTEGER NOT NULL,
    data        BLOB NOT NULL,
    created_at  DATETIME DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) NOT NULL,

    UNIQUE (body_id, chunk_index)
);

CREATE INDEX idx_body_chunks_body_id ON body_chunks (body_id, chunk_index);
@@ -1,2 +0,0 @@
ALTER TABLE http_responses
    ADD COLUMN request_content_length INTEGER;
@@ -1 +0,0 @@
ALTER TABLE settings ADD COLUMN hotkeys TEXT DEFAULT '{}' NOT NULL;
@@ -1,2 +0,0 @@
-- Add DNS resolution timing to http_responses
ALTER TABLE http_responses ADD COLUMN elapsed_dns INTEGER DEFAULT 0 NOT NULL;
@@ -1,2 +0,0 @@
-- Add DNS overrides setting to workspaces
ALTER TABLE workspaces ADD COLUMN setting_dns_overrides TEXT DEFAULT '[]' NOT NULL;
@@ -1,12 +0,0 @@
-- Filter out headers that match the hardcoded defaults (User-Agent: yaak, Accept: */*),
-- keeping any other custom headers the user may have added.
UPDATE workspaces
SET headers = (
    SELECT json_group_array(json(value))
    FROM json_each(headers)
    WHERE NOT (
        (LOWER(json_extract(value, '$.name')) = 'user-agent' AND json_extract(value, '$.value') = 'yaak')
        OR (LOWER(json_extract(value, '$.name')) = 'accept' AND json_extract(value, '$.value') = '*/*')
    )
)
WHERE json_array_length(headers) > 0;
@@ -1,354 +0,0 @@
use crate::error::Result;
use crate::util::generate_prefixed_id;
use include_dir::{Dir, include_dir};
use log::{debug, info};
use r2d2::Pool;
use r2d2_sqlite::SqliteConnectionManager;
use rusqlite::{OptionalExtension, params};
use std::sync::{Arc, Mutex};

static BLOB_MIGRATIONS_DIR: Dir = include_dir!("$CARGO_MANIFEST_DIR/blob_migrations");

/// A chunk of body data stored in the blob database.
#[derive(Debug, Clone)]
pub struct BodyChunk {
    pub id: String,
    pub body_id: String,
    pub chunk_index: i32,
    pub data: Vec<u8>,
}

impl BodyChunk {
    pub fn new(body_id: impl Into<String>, chunk_index: i32, data: Vec<u8>) -> Self {
        Self { id: generate_prefixed_id("bc"), body_id: body_id.into(), chunk_index, data }
    }
}

/// Manages the blob database connection pool.
#[derive(Debug, Clone)]
pub struct BlobManager {
    pool: Arc<Mutex<Pool<SqliteConnectionManager>>>,
}

impl BlobManager {
    pub fn new(pool: Pool<SqliteConnectionManager>) -> Self {
        Self { pool: Arc::new(Mutex::new(pool)) }
    }

    pub fn connect(&self) -> BlobContext {
        let conn = self
            .pool
            .lock()
            .expect("Failed to gain lock on blob DB")
            .get()
            .expect("Failed to get blob DB connection from pool");
        BlobContext { conn }
    }
}

/// Context for blob database operations.
pub struct BlobContext {
    conn: r2d2::PooledConnection<SqliteConnectionManager>,
}

impl BlobContext {
    /// Insert a single chunk.
    pub fn insert_chunk(&self, chunk: &BodyChunk) -> Result<()> {
        self.conn.execute(
            "INSERT INTO body_chunks (id, body_id, chunk_index, data) VALUES (?1, ?2, ?3, ?4)",
            params![chunk.id, chunk.body_id, chunk.chunk_index, chunk.data],
        )?;
        Ok(())
    }

    /// Get all chunks for a body, ordered by chunk_index.
    pub fn get_chunks(&self, body_id: &str) -> Result<Vec<BodyChunk>> {
        let mut stmt = self.conn.prepare(
            "SELECT id, body_id, chunk_index, data FROM body_chunks
             WHERE body_id = ?1 ORDER BY chunk_index ASC",
        )?;

        let chunks = stmt
            .query_map(params![body_id], |row| {
                Ok(BodyChunk {
                    id: row.get(0)?,
                    body_id: row.get(1)?,
                    chunk_index: row.get(2)?,
                    data: row.get(3)?,
                })
            })?
            .collect::<std::result::Result<Vec<_>, _>>()?;

        Ok(chunks)
    }

    /// Delete all chunks for a body.
    pub fn delete_chunks(&self, body_id: &str) -> Result<()> {
        self.conn.execute("DELETE FROM body_chunks WHERE body_id = ?1", params![body_id])?;
        Ok(())
    }

    /// Delete all chunks matching a body_id prefix (e.g., "rs_abc123.%" to delete all bodies for a response).
    pub fn delete_chunks_like(&self, body_id_prefix: &str) -> Result<()> {
        self.conn
            .execute("DELETE FROM body_chunks WHERE body_id LIKE ?1", params![body_id_prefix])?;
        Ok(())
    }
}

/// Get total size of a body without loading data.
impl BlobContext {
    pub fn get_body_size(&self, body_id: &str) -> Result<usize> {
        let size: i64 = self
            .conn
            .query_row(
                "SELECT COALESCE(SUM(LENGTH(data)), 0) FROM body_chunks WHERE body_id = ?1",
                params![body_id],
                |row| row.get(0),
            )
            .unwrap_or(0);
        Ok(size as usize)
    }

    /// Check if a body exists.
    pub fn body_exists(&self, body_id: &str) -> Result<bool> {
        let count: i64 = self
            .conn
            .query_row(
                "SELECT COUNT(*) FROM body_chunks WHERE body_id = ?1",
                params![body_id],
                |row| row.get(0),
            )
            .unwrap_or(0);
        Ok(count > 0)
    }
}

/// Run migrations for the blob database.
pub fn migrate_blob_db(pool: &Pool<SqliteConnectionManager>) -> Result<()> {
    info!("Running blob database migrations");

    // Create migrations tracking table
    pool.get()?.execute(
        "CREATE TABLE IF NOT EXISTS _blob_migrations (
            version TEXT PRIMARY KEY,
            description TEXT NOT NULL,
            applied_at DATETIME DEFAULT CURRENT_TIMESTAMP NOT NULL
        )",
        [],
    )?;

    // Read and sort all .sql files
    let mut entries: Vec<_> = BLOB_MIGRATIONS_DIR
        .entries()
        .iter()
        .filter(|e| e.path().extension().map(|ext| ext == "sql").unwrap_or(false))
        .collect();

    entries.sort_by_key(|e| e.path());

    let mut ran_migrations = 0;
    for entry in &entries {
        let filename = entry.path().file_name().unwrap().to_str().unwrap();
        let version = filename.split('_').next().unwrap();

        // Check if already applied
        let already_applied: Option<i64> = pool
            .get()?
            .query_row("SELECT 1 FROM _blob_migrations WHERE version = ?", [version], |r| r.get(0))
            .optional()?;

        if already_applied.is_some() {
            debug!("Skipping already applied blob migration: {}", filename);
            continue;
        }

        let sql =
            entry.as_file().unwrap().contents_utf8().expect("Failed to read blob migration file");

        info!("Applying blob migration: {}", filename);
        let conn = pool.get()?;
        conn.execute_batch(sql)?;

        // Record migration
        conn.execute(
            "INSERT INTO _blob_migrations (version, description) VALUES (?, ?)",
            params![version, filename],
        )?;

        ran_migrations += 1;
    }

    if ran_migrations == 0 {
        info!("No blob migrations to run");
    } else {
        info!("Ran {} blob migration(s)", ran_migrations);
    }

    Ok(())
}

#[cfg(test)]
mod tests {
    use super::*;

    fn create_test_pool() -> Pool<SqliteConnectionManager> {
        let manager = SqliteConnectionManager::memory();
        let pool = Pool::builder().max_size(1).build(manager).unwrap();
        migrate_blob_db(&pool).unwrap();
        pool
    }

    #[test]
    fn test_insert_and_get_chunks() {
        let pool = create_test_pool();
        let manager = BlobManager::new(pool);
        let ctx = manager.connect();

        let body_id = "rs_test123.request";
        let chunk1 = BodyChunk::new(body_id, 0, b"Hello, ".to_vec());
        let chunk2 = BodyChunk::new(body_id, 1, b"World!".to_vec());

        ctx.insert_chunk(&chunk1).unwrap();
        ctx.insert_chunk(&chunk2).unwrap();

        let chunks = ctx.get_chunks(body_id).unwrap();
        assert_eq!(chunks.len(), 2);
        assert_eq!(chunks[0].chunk_index, 0);
        assert_eq!(chunks[0].data, b"Hello, ");
        assert_eq!(chunks[1].chunk_index, 1);
        assert_eq!(chunks[1].data, b"World!");
    }

    #[test]
    fn test_get_chunks_ordered_by_index() {
        let pool = create_test_pool();
        let manager = BlobManager::new(pool);
        let ctx = manager.connect();

        let body_id = "rs_test123.request";

        // Insert out of order
        ctx.insert_chunk(&BodyChunk::new(body_id, 2, b"C".to_vec())).unwrap();
        ctx.insert_chunk(&BodyChunk::new(body_id, 0, b"A".to_vec())).unwrap();
        ctx.insert_chunk(&BodyChunk::new(body_id, 1, b"B".to_vec())).unwrap();

        let chunks = ctx.get_chunks(body_id).unwrap();
        assert_eq!(chunks.len(), 3);
        assert_eq!(chunks[0].data, b"A");
        assert_eq!(chunks[1].data, b"B");
        assert_eq!(chunks[2].data, b"C");
    }

    #[test]
    fn test_delete_chunks() {
        let pool = create_test_pool();
        let manager = BlobManager::new(pool);
        let ctx = manager.connect();

        let body_id = "rs_test123.request";
        ctx.insert_chunk(&BodyChunk::new(body_id, 0, b"data".to_vec())).unwrap();

        assert!(ctx.body_exists(body_id).unwrap());

        ctx.delete_chunks(body_id).unwrap();

        assert!(!ctx.body_exists(body_id).unwrap());
        assert_eq!(ctx.get_chunks(body_id).unwrap().len(), 0);
    }

    #[test]
    fn test_delete_chunks_like() {
        let pool = create_test_pool();
        let manager = BlobManager::new(pool);
        let ctx = manager.connect();

        // Insert chunks for same response but different body types
        ctx.insert_chunk(&BodyChunk::new("rs_abc.request", 0, b"req".to_vec())).unwrap();
        ctx.insert_chunk(&BodyChunk::new("rs_abc.response", 0, b"resp".to_vec())).unwrap();
        ctx.insert_chunk(&BodyChunk::new("rs_other.request", 0, b"other".to_vec())).unwrap();

        // Delete all bodies for rs_abc
        ctx.delete_chunks_like("rs_abc.%").unwrap();

        // rs_abc bodies should be gone
        assert!(!ctx.body_exists("rs_abc.request").unwrap());
        assert!(!ctx.body_exists("rs_abc.response").unwrap());

        // rs_other should still exist
        assert!(ctx.body_exists("rs_other.request").unwrap());
    }

    #[test]
    fn test_get_body_size() {
        let pool = create_test_pool();
        let manager = BlobManager::new(pool);
        let ctx = manager.connect();

        let body_id = "rs_test123.request";
        ctx.insert_chunk(&BodyChunk::new(body_id, 0, b"Hello".to_vec())).unwrap();
        ctx.insert_chunk(&BodyChunk::new(body_id, 1, b"World".to_vec())).unwrap();

        let size = ctx.get_body_size(body_id).unwrap();
        assert_eq!(size, 10); // "Hello" + "World" = 10 bytes
    }

    #[test]
    fn test_get_body_size_empty() {
        let pool = create_test_pool();
        let manager = BlobManager::new(pool);
        let ctx = manager.connect();

        let size = ctx.get_body_size("nonexistent").unwrap();
        assert_eq!(size, 0);
    }

    #[test]
    fn test_body_exists() {
        let pool = create_test_pool();
        let manager = BlobManager::new(pool);
        let ctx = manager.connect();

        assert!(!ctx.body_exists("rs_test.request").unwrap());

        ctx.insert_chunk(&BodyChunk::new("rs_test.request", 0, b"data".to_vec())).unwrap();

        assert!(ctx.body_exists("rs_test.request").unwrap());
    }

    #[test]
    fn test_multiple_bodies_isolated() {
        let pool = create_test_pool();
        let manager = BlobManager::new(pool);
        let ctx = manager.connect();

        ctx.insert_chunk(&BodyChunk::new("body1", 0, b"data1".to_vec())).unwrap();
        ctx.insert_chunk(&BodyChunk::new("body2", 0, b"data2".to_vec())).unwrap();

        let chunks1 = ctx.get_chunks("body1").unwrap();
        let chunks2 = ctx.get_chunks("body2").unwrap();

        assert_eq!(chunks1.len(), 1);
        assert_eq!(chunks1[0].data, b"data1");
        assert_eq!(chunks2.len(), 1);
        assert_eq!(chunks2[0].data, b"data2");
    }

    #[test]
    fn test_large_chunk() {
        let pool = create_test_pool();
        let manager = BlobManager::new(pool);
        let ctx = manager.connect();

        // 1MB chunk
        let large_data: Vec<u8> = (0..1024 * 1024).map(|i| (i % 256) as u8).collect();
        let body_id = "rs_large.request";

        ctx.insert_chunk(&BodyChunk::new(body_id, 0, large_data.clone())).unwrap();

        let chunks = ctx.get_chunks(body_id).unwrap();
        assert_eq!(chunks.len(), 1);
        assert_eq!(chunks[0].data, large_data);
        assert_eq!(ctx.get_body_size(body_id).unwrap(), 1024 * 1024);
    }
}
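Bodies are written and read as ordered chunks. A rough sketch of the round trip on top of the `BlobManager` API above (the 512 KiB chunk size is an assumption; this diff does not fix one):

```rust
// Sketch: stream a body into fixed-size chunks, then reassemble it.
fn write_body_chunked(manager: &BlobManager, body_id: &str, data: &[u8]) -> Result<()> {
    const CHUNK_SIZE: usize = 512 * 1024; // assumed size, not from this diff
    let ctx = manager.connect();
    for (i, chunk) in data.chunks(CHUNK_SIZE).enumerate() {
        ctx.insert_chunk(&BodyChunk::new(body_id, i as i32, chunk.to_vec()))?;
    }
    Ok(())
}

fn read_body(manager: &BlobManager, body_id: &str) -> Result<Vec<u8>> {
    let ctx = manager.connect();
    // get_chunks returns rows ordered by chunk_index, so plain concatenation is safe.
    let mut body = Vec::with_capacity(ctx.get_body_size(body_id)?);
    for chunk in ctx.get_chunks(body_id)? {
        body.extend_from_slice(&chunk.data);
    }
    Ok(body)
}
```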
@@ -1,100 +0,0 @@
use crate::blob_manager::{BlobManager, migrate_blob_db};
use crate::error::{Error, Result};
use crate::migrate::migrate_db;
use crate::query_manager::QueryManager;
use crate::util::ModelPayload;
use log::info;
use r2d2::Pool;
use r2d2_sqlite::SqliteConnectionManager;
use std::fs::create_dir_all;
use std::path::Path;
use std::sync::mpsc;
use std::time::Duration;

pub mod blob_manager;
mod connection_or_tx;
pub mod db_context;
pub mod error;
pub mod migrate;
pub mod models;
pub mod queries;
pub mod query_manager;
pub mod render;
pub mod util;

/// Initialize the database managers for standalone (non-Tauri) usage.
///
/// Returns a tuple of (QueryManager, BlobManager, event_receiver).
/// The event_receiver can be used to listen for model change events.
pub fn init_standalone(
    db_path: impl AsRef<Path>,
    blob_path: impl AsRef<Path>,
) -> Result<(QueryManager, BlobManager, mpsc::Receiver<ModelPayload>)> {
    let db_path = db_path.as_ref();
    let blob_path = blob_path.as_ref();

    // Create parent directories if needed
    if let Some(parent) = db_path.parent() {
        create_dir_all(parent)?;
    }
    if let Some(parent) = blob_path.parent() {
        create_dir_all(parent)?;
    }

    // Main database pool
    info!("Initializing app database {db_path:?}");
    let manager = SqliteConnectionManager::file(db_path);
    let pool = Pool::builder()
        .max_size(100)
        .connection_timeout(Duration::from_secs(10))
        .build(manager)
        .map_err(|e| Error::Database(e.to_string()))?;

    migrate_db(&pool)?;

    info!("Initializing blobs database {blob_path:?}");

    // Blob database pool
    let blob_manager = SqliteConnectionManager::file(blob_path);
    let blob_pool = Pool::builder()
        .max_size(50)
        .connection_timeout(Duration::from_secs(10))
        .build(blob_manager)
        .map_err(|e| Error::Database(e.to_string()))?;

    migrate_blob_db(&blob_pool)?;

    let (tx, rx) = mpsc::channel();
    let query_manager = QueryManager::new(pool, tx);
    let blob_manager = BlobManager::new(blob_pool);

    Ok((query_manager, blob_manager, rx))
}

/// Initialize the database managers with in-memory SQLite databases.
/// Useful for testing and CI environments.
pub fn init_in_memory() -> Result<(QueryManager, BlobManager, mpsc::Receiver<ModelPayload>)> {
    // Main database pool
    let manager = SqliteConnectionManager::memory();
    let pool = Pool::builder()
        .max_size(1) // In-memory DB doesn't support multiple connections
        .build(manager)
        .map_err(|e| Error::Database(e.to_string()))?;

    migrate_db(&pool)?;

    // Blob database pool
    let blob_manager = SqliteConnectionManager::memory();
    let blob_pool = Pool::builder()
        .max_size(1)
        .build(blob_manager)
        .map_err(|e| Error::Database(e.to_string()))?;

    migrate_blob_db(&blob_pool)?;

    let (tx, rx) = mpsc::channel();
    let query_manager = QueryManager::new(pool, tx);
    let blob_manager = BlobManager::new(blob_pool);

    Ok((query_manager, blob_manager, rx))
}
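A standalone consumer boots both databases with one call. A sketch of a CLI entry point (the paths are illustrative, and printing the payload assumes `ModelPayload` implements `Debug`):

```rust
// Sketch: initialize the databases outside of Tauri and watch for changes.
fn main() -> yaak_models::error::Result<()> {
    let (query_manager, blob_manager, events) =
        yaak_models::init_standalone("/tmp/yaak/db.sqlite", "/tmp/yaak/blobs.sqlite")?;

    // The receiver is a std::sync::mpsc end, so poll it from a plain thread.
    std::thread::spawn(move || {
        while let Ok(payload) = events.recv() {
            println!("model changed: {payload:?}");
        }
    });

    let _db = query_manager.connect();
    let _blobs = blob_manager.connect();
    // ... run queries and read/write body chunks ...
    Ok(())
}
```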
@@ -1,18 +0,0 @@
use crate::db_context::DbContext;
use crate::error::Result;
use crate::models::{HttpResponseEvent, HttpResponseEventIden};
use crate::util::UpdateSource;

impl<'a> DbContext<'a> {
    pub fn list_http_response_events(&self, response_id: &str) -> Result<Vec<HttpResponseEvent>> {
        self.find_many(HttpResponseEventIden::ResponseId, response_id, None)
    }

    pub fn upsert_http_response_event(
        &self,
        http_response_event: &HttpResponseEvent,
        source: &UpdateSource,
    ) -> Result<HttpResponseEvent> {
        self.upsert(http_response_event, source)
    }
}
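Each model gets a thin `DbContext` extension like this one. Given a `QueryManager` (see `init_standalone` earlier in this diff), usage is roughly the following sketch; the response ID is made up, and printing assumes the event type implements `Debug`:

```rust
fn print_response_events(query_manager: &QueryManager) -> Result<()> {
    let db = query_manager.connect();
    for ev in db.list_http_response_events("rs_abc123")? {
        println!("{:?}", ev.event); // the persisted HttpResponseEventData payload
    }
    Ok(())
}
```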
@@ -1,44 +0,0 @@
pub mod any_request;
mod batch;
mod cookie_jars;
mod environments;
mod folders;
mod graphql_introspections;
mod grpc_connections;
mod grpc_events;
mod grpc_requests;
mod http_requests;
mod http_response_events;
mod http_responses;
mod key_values;
mod plugin_key_values;
mod plugins;
mod settings;
mod sync_states;
mod websocket_connections;
mod websocket_events;
mod websocket_requests;
mod workspace_metas;
pub mod workspaces;

const MAX_HISTORY_ITEMS: usize = 20;

use crate::models::HttpRequestHeader;
use std::collections::HashMap;

/// Deduplicate headers by name (case-insensitive), keeping the latest (most specific) value.
/// Preserves the order of first occurrence for each header name.
pub(crate) fn dedupe_headers(headers: Vec<HttpRequestHeader>) -> Vec<HttpRequestHeader> {
    let mut index_by_name: HashMap<String, usize> = HashMap::new();
    let mut deduped: Vec<HttpRequestHeader> = Vec::new();
    for header in headers {
        let key = header.name.to_lowercase();
        if let Some(&idx) = index_by_name.get(&key) {
            deduped[idx] = header;
        } else {
            index_by_name.insert(key, deduped.len());
            deduped.push(header);
        }
    }
    deduped
}
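The last-write-wins, position-preserving behavior of `dedupe_headers` is easiest to see on a concrete input. A sketch (assumes `HttpRequestHeader` derives `Default`, which this diff does not show):

```rust
let headers = vec![
    HttpRequestHeader { name: "Accept".into(), value: "*/*".into(), ..Default::default() },
    HttpRequestHeader { name: "X-Trace".into(), value: "1".into(), ..Default::default() },
    HttpRequestHeader { name: "accept".into(), value: "application/json".into(), ..Default::default() },
];
let deduped = dedupe_headers(headers);
// Two headers remain; "Accept" keeps its original position but carries the later value.
assert_eq!(deduped.len(), 2);
assert_eq!(deduped[0].value, "application/json");
assert_eq!(deduped[1].name, "X-Trace");
```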
@@ -1,84 +0,0 @@
// This file was generated by [ts-rs](https://github.com/Aleph-Alpha/ts-rs). Do not edit this file manually.

export type AnyModel = CookieJar | Environment | Folder | GraphQlIntrospection | GrpcConnection | GrpcEvent | GrpcRequest | HttpRequest | HttpResponse | HttpResponseEvent | KeyValue | Plugin | Settings | SyncState | WebsocketConnection | WebsocketEvent | WebsocketRequest | Workspace | WorkspaceMeta;

export type ClientCertificate = { host: string, port: number | null, crtFile: string | null, keyFile: string | null, pfxFile: string | null, passphrase: string | null, enabled?: boolean, };

export type Cookie = { raw_cookie: string, domain: CookieDomain, expires: CookieExpires, path: [string, boolean], };

export type CookieDomain = { "HostOnly": string } | { "Suffix": string } | "NotPresent" | "Empty";

export type CookieExpires = { "AtUtc": string } | "SessionEnd";

export type CookieJar = { model: "cookie_jar", id: string, createdAt: string, updatedAt: string, workspaceId: string, cookies: Array<Cookie>, name: string, };

export type DnsOverride = { hostname: string, ipv4: Array<string>, ipv6: Array<string>, enabled?: boolean, };

export type EditorKeymap = "default" | "vim" | "vscode" | "emacs";

export type EncryptedKey = { encryptedKey: string, };

export type Environment = { model: "environment", id: string, workspaceId: string, createdAt: string, updatedAt: string, name: string, public: boolean, parentModel: string, parentId: string | null, variables: Array<EnvironmentVariable>, color: string | null, sortPriority: number, };

export type EnvironmentVariable = { enabled?: boolean, name: string, value: string, id?: string, };

export type Folder = { model: "folder", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authentication: Record<string, any>, authenticationType: string | null, description: string, headers: Array<HttpRequestHeader>, name: string, sortPriority: number, };

export type GraphQlIntrospection = { model: "graphql_introspection", id: string, createdAt: string, updatedAt: string, workspaceId: string, requestId: string, content: string | null, };

export type GrpcConnection = { model: "grpc_connection", id: string, createdAt: string, updatedAt: string, workspaceId: string, requestId: string, elapsed: number, error: string | null, method: string, service: string, status: number, state: GrpcConnectionState, trailers: { [key in string]?: string }, url: string, };

export type GrpcConnectionState = "initialized" | "connected" | "closed";

export type GrpcEvent = { model: "grpc_event", id: string, createdAt: string, updatedAt: string, workspaceId: string, requestId: string, connectionId: string, content: string, error: string | null, eventType: GrpcEventType, metadata: { [key in string]?: string }, status: number | null, };

export type GrpcEventType = "info" | "error" | "client_message" | "server_message" | "connection_start" | "connection_end";

export type GrpcRequest = { model: "grpc_request", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authenticationType: string | null, authentication: Record<string, any>, description: string, message: string, metadata: Array<HttpRequestHeader>, method: string | null, name: string, service: string | null, sortPriority: number, url: string, };

export type HttpRequest = { model: "http_request", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authentication: Record<string, any>, authenticationType: string | null, body: Record<string, any>, bodyType: string | null, description: string, headers: Array<HttpRequestHeader>, method: string, name: string, sortPriority: number, url: string, urlParameters: Array<HttpUrlParameter>, };

export type HttpRequestHeader = { enabled?: boolean, name: string, value: string, id?: string, };

export type HttpResponse = { model: "http_response", id: string, createdAt: string, updatedAt: string, workspaceId: string, requestId: string, bodyPath: string | null, contentLength: number | null, contentLengthCompressed: number | null, elapsed: number, elapsedHeaders: number, elapsedDns: number, error: string | null, headers: Array<HttpResponseHeader>, remoteAddr: string | null, requestContentLength: number | null, requestHeaders: Array<HttpResponseHeader>, status: number, statusReason: string | null, state: HttpResponseState, url: string, version: string | null, };

export type HttpResponseEvent = { model: "http_response_event", id: string, createdAt: string, updatedAt: string, workspaceId: string, responseId: string, event: HttpResponseEventData, };

/**
 * Serializable representation of HTTP response events for DB storage.
 * This mirrors `yaak_http::sender::HttpResponseEvent` but with serde support.
 * The `From` impl is in yaak-http to avoid circular dependencies.
 */
export type HttpResponseEventData = { "type": "setting", name: string, value: string, } | { "type": "info", message: string, } | { "type": "redirect", url: string, status: number, behavior: string, } | { "type": "send_url", method: string, path: string, } | { "type": "receive_url", version: string, status: string, } | { "type": "header_up", name: string, value: string, } | { "type": "header_down", name: string, value: string, } | { "type": "chunk_sent", bytes: number, } | { "type": "chunk_received", bytes: number, } | { "type": "dns_resolved", hostname: string, addresses: Array<string>, duration: bigint, overridden: boolean, };

export type HttpResponseHeader = { name: string, value: string, };

export type HttpResponseState = "initialized" | "connected" | "closed";

export type HttpUrlParameter = { enabled?: boolean, name: string, value: string, id?: string, };

export type KeyValue = { model: "key_value", id: string, createdAt: string, updatedAt: string, key: string, namespace: string, value: string, };

export type Plugin = { model: "plugin", id: string, createdAt: string, updatedAt: string, checkedAt: string | null, directory: string, enabled: boolean, url: string | null, };

export type ProxySetting = { "type": "enabled", http: string, https: string, auth: ProxySettingAuth | null, bypass: string, disabled: boolean, } | { "type": "disabled" };

export type ProxySettingAuth = { user: string, password: string, };

export type Settings = { model: "settings", id: string, createdAt: string, updatedAt: string, appearance: string, clientCertificates: Array<ClientCertificate>, coloredMethods: boolean, editorFont: string | null, editorFontSize: number, editorKeymap: EditorKeymap, editorSoftWrap: boolean, hideWindowControls: boolean, useNativeTitlebar: boolean, interfaceFont: string | null, interfaceFontSize: number, interfaceScale: number, openWorkspaceNewWindow: boolean | null, proxy: ProxySetting | null, themeDark: string, themeLight: string, updateChannel: string, hideLicenseBadge: boolean, autoupdate: boolean, autoDownloadUpdates: boolean, checkNotifications: boolean, hotkeys: { [key in string]?: Array<string> }, };

export type SyncState = { model: "sync_state", id: string, workspaceId: string, createdAt: string, updatedAt: string, flushedAt: string, modelId: string, checksum: string, relPath: string, syncDir: string, };

export type WebsocketConnection = { model: "websocket_connection", id: string, createdAt: string, updatedAt: string, workspaceId: string, requestId: string, elapsed: number, error: string | null, headers: Array<HttpResponseHeader>, state: WebsocketConnectionState, status: number, url: string, };

export type WebsocketConnectionState = "initialized" | "connected" | "closing" | "closed";

export type WebsocketEvent = { model: "websocket_event", id: string, createdAt: string, updatedAt: string, workspaceId: string, requestId: string, connectionId: string, isServer: boolean, message: Array<number>, messageType: WebsocketEventType, };

export type WebsocketEventType = "binary" | "close" | "frame" | "open" | "ping" | "pong" | "text";

export type WebsocketRequest = { model: "websocket_request", id: string, createdAt: string, updatedAt: string, workspaceId: string, folderId: string | null, authentication: Record<string, any>, authenticationType: string | null, description: string, headers: Array<HttpRequestHeader>, message: string, name: string, sortPriority: number, url: string, urlParameters: Array<HttpUrlParameter>, };

export type Workspace = { model: "workspace", id: string, createdAt: string, updatedAt: string, authentication: Record<string, any>, authenticationType: string | null, description: string, headers: Array<HttpRequestHeader>, name: string, encryptionKeyChallenge: string | null, settingValidateCertificates: boolean, settingFollowRedirects: boolean, settingRequestTimeout: number, settingDnsOverrides: Array<DnsOverride>, };

export type WorkspaceMeta = { model: "workspace_meta", id: string, workspaceId: string, createdAt: string, updatedAt: string, encryptionKey: EncryptedKey | null, settingSyncDir: string | null, };
@@ -1,26 +0,0 @@
import { invoke } from '@tauri-apps/api/core';
import { PluginNameVersion, PluginSearchResponse, PluginUpdatesResponse } from './bindings/gen_api';

export * from './bindings/gen_models';
export * from './bindings/gen_events';
export * from './bindings/gen_search';

export async function searchPlugins(query: string) {
  return invoke<PluginSearchResponse>('cmd_plugins_search', { query });
}

export async function installPlugin(name: string, version: string | null) {
  return invoke<void>('cmd_plugins_install', { name, version });
}

export async function uninstallPlugin(pluginId: string) {
  return invoke<void>('cmd_plugins_uninstall', { pluginId });
}

export async function checkPluginUpdates() {
  return invoke<PluginUpdatesResponse>('cmd_plugins_updates', {});
}

export async function updateAllPlugins() {
  return invoke<PluginNameVersion[]>('cmd_plugins_update_all', {});
}
@@ -1,92 +0,0 @@
use crate::api::{PluginVersion, download_plugin_archive, get_plugin};
use crate::checksum::compute_checksum;
use crate::error::Error::PluginErr;
use crate::error::Result;
use crate::events::PluginContext;
use crate::manager::PluginManager;
use chrono::Utc;
use log::info;
use std::fs::{create_dir_all, remove_dir_all};
use std::io::Cursor;
use std::sync::Arc;
use yaak_models::models::Plugin;
use yaak_models::query_manager::QueryManager;
use yaak_models::util::UpdateSource;

/// Delete a plugin from the database and uninstall it.
pub async fn delete_and_uninstall(
    plugin_manager: Arc<PluginManager>,
    query_manager: &QueryManager,
    plugin_context: &PluginContext,
    plugin_id: &str,
) -> Result<Plugin> {
    let update_source = match plugin_context.label.clone() {
        Some(label) => UpdateSource::from_window_label(label),
        None => UpdateSource::Background,
    };
    // Scope the db connection so it doesn't live across await
    let plugin = {
        let db = query_manager.connect();
        db.delete_plugin_by_id(plugin_id, &update_source)?
    };
    plugin_manager.uninstall(plugin_context, plugin.directory.as_str()).await?;
    Ok(plugin)
}

/// Download and install a plugin.
pub async fn download_and_install(
    plugin_manager: Arc<PluginManager>,
    query_manager: &QueryManager,
    http_client: &reqwest::Client,
    plugin_context: &PluginContext,
    name: &str,
    version: Option<String>,
) -> Result<PluginVersion> {
    info!("Installing plugin {} {}", name, version.clone().unwrap_or_default());
    let plugin_version = get_plugin(http_client, name, version).await?;
    let resp = download_plugin_archive(http_client, &plugin_version).await?;
    let bytes = resp.bytes().await?;

    let checksum = compute_checksum(&bytes);
    if checksum != plugin_version.checksum {
        return Err(PluginErr(format!(
            "Checksum mismatch {}b {checksum} != {}",
            bytes.len(),
            plugin_version.checksum
        )));
    }

    info!("Checksum matched {}", checksum);

    let plugin_dir = plugin_manager.installed_plugin_dir.join(name);
    let plugin_dir_str = plugin_dir.to_str().unwrap().to_string();

    // Re-create the plugin directory
    let _ = remove_dir_all(&plugin_dir);
    create_dir_all(&plugin_dir)?;

    zip_extract::extract(Cursor::new(&bytes), &plugin_dir, true)?;
    info!("Extracted plugin {} to {}", plugin_version.id, plugin_dir_str);

    // Scope the db connection so it doesn't live across await
    let plugin = {
        let db = query_manager.connect();
        db.upsert_plugin(
            &Plugin {
                id: plugin_version.id.clone(),
                checked_at: Some(Utc::now().naive_utc()),
                directory: plugin_dir_str.clone(),
                enabled: true,
                url: Some(plugin_version.url.clone()),
                ..Default::default()
            },
            &UpdateSource::Background,
        )?
    };

    plugin_manager.add_plugin(plugin_context, &plugin).await?;

    info!("Installed plugin {} to {}", plugin_version.id, plugin_dir_str);

    Ok(plugin_version)
}
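A call site for `download_and_install` would look roughly like this (sketch; the plugin name is made up, and the surrounding managers come from elsewhere in this diff):

```rust
async fn install_latest(
    plugin_manager: Arc<PluginManager>,
    query_manager: &QueryManager,
    ctx: &PluginContext,
) -> Result<()> {
    let client = reqwest::Client::new();
    // None resolves to the latest published version; the archive checksum is
    // verified before extraction, as shown above.
    let version =
        download_and_install(plugin_manager, query_manager, &client, ctx, "copy-curl", None)
            .await?;
    info!("installed plugin version {}", version.id);
    Ok(())
}
```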
@@ -1,21 +0,0 @@
//! Core plugin system for Yaak.
//!
//! This crate provides the plugin manager and supporting functionality
//! for running JavaScript plugins via a Node.js runtime.
//!
//! Note: This crate is Tauri-independent. Tauri integration is provided
//! by yaak-app's plugins_ext module.

pub mod api;
mod checksum;
pub mod error;
pub mod events;
pub mod install;
pub mod manager;
pub mod native_template_functions;
mod nodejs;
pub mod plugin_handle;
pub mod plugin_meta;
mod server_ws;
pub mod template_callback;
mod util;
@@ -1,78 +0,0 @@
use crate::error::Result;
use log::{info, warn};
use std::net::SocketAddr;
use std::path::Path;
use std::process::Stdio;
use tokio::io::{AsyncBufReadExt, BufReader};
use tokio::sync::watch::Receiver;
use yaak_common::command::new_xplatform_command;

/// Start the Node.js plugin runtime process.
///
/// # Arguments
/// * `node_bin_path` - Path to the yaaknode binary
/// * `plugin_runtime_main` - Path to the plugin runtime index.cjs
/// * `addr` - Socket address for the plugin runtime to connect to
/// * `kill_rx` - Channel to signal shutdown
pub async fn start_nodejs_plugin_runtime(
    node_bin_path: &Path,
    plugin_runtime_main: &Path,
    addr: SocketAddr,
    kill_rx: &Receiver<bool>,
) -> Result<()> {
    // HACK: Remove UNC prefix for Windows paths to pass to sidecar
    let plugin_runtime_main_str =
        dunce::simplified(plugin_runtime_main).to_string_lossy().to_string();

    info!(
        "Starting plugin runtime node={} main={}",
        node_bin_path.display(),
        plugin_runtime_main_str
    );

    let mut cmd = new_xplatform_command(node_bin_path);
    cmd.env("HOST", addr.ip().to_string())
        .env("PORT", addr.port().to_string())
        .arg(&plugin_runtime_main_str)
        .stdout(Stdio::piped())
        .stderr(Stdio::piped());

    let mut child = cmd.spawn()?;

    info!("Spawned plugin runtime");

    // Stream stdout
    if let Some(stdout) = child.stdout.take() {
        tokio::spawn(async move {
            let reader = BufReader::new(stdout);
            let mut lines = reader.lines();
            while let Ok(Some(line)) = lines.next_line().await {
                info!("{}", line);
            }
        });
    }

    // Stream stderr
    if let Some(stderr) = child.stderr.take() {
        tokio::spawn(async move {
            let reader = BufReader::new(stderr);
            let mut lines = reader.lines();
            while let Ok(Some(line)) = lines.next_line().await {
                warn!("{}", line);
            }
        });
    }

    // Handle kill signal
    let mut kill_rx = kill_rx.clone();
    tokio::spawn(async move {
        kill_rx.wait_for(|b| *b == true).await.expect("Kill channel errored");
        info!("Killing plugin runtime");
        if let Err(e) = child.kill().await {
            warn!("Failed to kill plugin runtime: {e}");
        }
        info!("Killed plugin runtime");
    });

    Ok(())
}
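Driving the runtime then amounts to wiring up a kill switch. A sketch (the binary and entrypoint paths and the address are placeholders; in practice the address comes from the plugin server's bound socket):

```rust
async fn run_plugin_runtime() -> Result<()> {
    let (kill_tx, kill_rx) = tokio::sync::watch::channel(false);
    let addr: std::net::SocketAddr = "127.0.0.1:9306".parse().unwrap();

    start_nodejs_plugin_runtime(
        std::path::Path::new("./bin/yaaknode"),
        std::path::Path::new("./plugin-runtime/index.cjs"),
        addr,
        &kill_rx,
    )
    .await?;

    // ... later: flip the watch value; the spawned task kills the child.
    let _ = kill_tx.send(true);
    Ok(())
}
```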
@@ -1,4 +0,0 @@
pub mod error;
pub mod models;
pub mod sync;
pub mod watch;
Binary file not shown.
@@ -1,11 +0,0 @@
/* tslint:disable */
/* eslint-disable */
export const memory: WebAssembly.Memory;
export const escape_template: (a: number, b: number) => [number, number, number];
export const parse_template: (a: number, b: number) => [number, number, number];
export const unescape_template: (a: number, b: number) => [number, number, number];
export const __wbindgen_malloc: (a: number, b: number) => number;
export const __wbindgen_realloc: (a: number, b: number, c: number, d: number) => number;
export const __wbindgen_export_2: WebAssembly.Table;
export const __externref_table_dealloc: (a: number) => void;
export const __wbindgen_start: () => void;
@@ -1,12 +0,0 @@
mod connect;
pub mod error;
pub mod manager;
pub mod render;

pub use connect::ws_connect;
pub use manager::WebsocketManager;
pub use render::render_websocket_request;

// Re-export http types needed by consumers
pub use http::HeaderMap;
pub use tokio_tungstenite::tungstenite::http::HeaderValue;
package-lock.json (8767 lines, generated): diff suppressed because it is too large.
package.json (47 lines changed)
@@ -10,11 +8,8 @@
    "packages/common-lib",
    "packages/plugin-runtime",
    "packages/plugin-runtime-types",
    "plugins-external/mcp-server",
    "plugins-external/template-function-faker",
    "plugins/action-copy-curl",
    "plugins/action-copy-grpcurl",
    "plugins/action-send-folder",
    "plugins/auth-apikey",
    "plugins/auth-aws",
    "plugins/auth-basic",
@@ -47,33 +44,31 @@
    "plugins/template-function-request",
    "plugins/template-function-response",
    "plugins/themes-yaak",
    "crates-tauri/yaak-app",
    "crates-tauri/yaak-fonts",
    "crates-tauri/yaak-license",
    "crates-tauri/yaak-mac-window",
    "crates/yaak-crypto",
    "crates/yaak-git",
    "crates/yaak-models",
    "crates/yaak-plugins",
    "crates/yaak-sse",
    "crates/yaak-sync",
    "crates/yaak-templates",
    "crates/yaak-ws",
    "src-tauri",
    "src-tauri/yaak-crypto",
    "src-tauri/yaak-fonts",
    "src-tauri/yaak-git",
    "src-tauri/yaak-license",
    "src-tauri/yaak-mac-window",
    "src-tauri/yaak-models",
    "src-tauri/yaak-plugins",
    "src-tauri/yaak-sse",
    "src-tauri/yaak-sync",
    "src-tauri/yaak-templates",
    "src-tauri/yaak-ws",
    "src-web"
  ],
  "scripts": {
    "prepare": "husky",
    "init": "npm install && npm run bootstrap",
    "start": "npm run app-dev",
    "app-build": "tauri build",
    "app-dev": "node scripts/run-dev.mjs",
    "app-dev": "tauri dev --no-watch --config ./src-tauri/tauri.development.conf.json",
    "migration": "node scripts/create-migration.cjs",
    "build": "npm run --workspaces --if-present build",
    "build-plugins": "npm run --workspaces --if-present build",
    "test": "npm run --workspaces --if-present test",
    "icons": "run-p icons:*",
    "icons:dev": "tauri icon crates-tauri/yaak-app/icons/icon-dev.png --output crates-tauri/yaak-app/icons/dev",
    "icons:release": "tauri icon crates-tauri/yaak-app/icons/icon.png --output crates-tauri/yaak-app/icons/release",
    "icons:dev": "tauri icon src-tauri/icons/icon-dev.png --output src-tauri/icons/dev",
    "icons:release": "tauri icon src-tauri/icons/icon.png --output src-tauri/icons/release",
    "bootstrap": "run-s bootstrap:*",
    "bootstrap:install-wasm-pack": "node scripts/install-wasm-pack.cjs",
    "bootstrap:build": "npm run build",
@@ -89,20 +84,16 @@
    "replace-version": "node scripts/replace-version.cjs",
    "tauri": "tauri",
    "tauri-before-build": "npm run bootstrap",
    "tauri-before-dev": "node scripts/run-workspaces-dev.mjs"
  },
  "overrides": {
    "js-yaml": "^4.1.1"
    "tauri-before-dev": "workspaces-run --parallel -- npm run --workspaces --if-present dev"
  },
  "devDependencies": {
    "@biomejs/biome": "^2.3.13",
    "@biomejs/biome": "^2.3.7",
    "@tauri-apps/cli": "^2.9.6",
    "@yaakapp/cli": "^0.3.4",
    "dotenv-cli": "^11.0.0",
    "husky": "^9.1.7",
    "nodejs-file-downloader": "^4.13.0",
    "npm-run-all": "^4.1.5",
    "typescript": "^5.8.3",
    "vitest": "^3.2.4"
    "vitest": "^3.2.4",
    "workspaces-run": "^1.0.2"
  }
}
@@ -17,7 +17,7 @@ npx @yaakapp/cli generate
```

For more details on creating plugins, check out
the [Quick Start Guide](https://yaak.app/docs/plugin-development/plugins-quick-start)
the [Quick Start Guide](https://feedback.yaak.app/help/articles/6911763-plugins-quick-start)

## Installation
packages/plugin-runtime-types/package-lock.json (47 lines, generated, new file)
@@ -0,0 +1,47 @@
{
  "name": "@yaakapp/api",
  "version": "0.1.17",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "name": "@yaakapp/api",
      "version": "0.1.17",
      "dependencies": {
        "@types/node": "^22.5.4"
      },
      "devDependencies": {
        "typescript": "^5.6.2"
      }
    },
    "node_modules/@types/node": {
      "version": "22.5.4",
      "resolved": "https://registry.npmjs.org/@types/node/-/node-22.5.4.tgz",
      "integrity": "sha512-FDuKUJQm/ju9fT/SeX/6+gBzoPzlVCzfzmGkwKvRHQVxi4BntVbyIwf6a4Xn62mrvndLiml6z/UBXIdEVjQLXg==",
      "license": "MIT",
      "dependencies": {
        "undici-types": "~6.19.2"
      }
    },
    "node_modules/typescript": {
      "version": "5.6.2",
      "resolved": "https://registry.npmjs.org/typescript/-/typescript-5.6.2.tgz",
      "integrity": "sha512-NW8ByodCSNCwZeghjN3o+JX5OFH0Ojg6sadjEKY4huZ52TqbJTJnDo5+Tw98lSy63NZvi4n+ez5m2u5d4PkZyw==",
      "dev": true,
      "license": "Apache-2.0",
      "bin": {
        "tsc": "bin/tsc",
        "tsserver": "bin/tsserver"
      },
      "engines": {
        "node": ">=14.17"
      }
    },
    "node_modules/undici-types": {
      "version": "6.19.8",
      "resolved": "https://registry.npmjs.org/undici-types/-/undici-types-6.19.8.tgz",
      "integrity": "sha512-ve2KP6f/JnbPBFyobGHuerC9g1FYGn/F8n1LWTwNxCEzd6IfqTwUQcNXgEtmmQ6DlRrC1hrSrBnCZPokRrDHjw==",
      "license": "MIT"
    }
  }
}
@@ -1,6 +1,6 @@
{
  "name": "@yaakapp/api",
  "version": "0.8.0",
  "version": "0.7.1",
  "keywords": [
    "api-client",
    "insomnia-alternative",
@@ -25,8 +25,8 @@
    "build": "run-s build:copy-types build:tsc",
    "build:tsc": "tsc",
    "build:copy-types": "run-p build:copy-types:*",
    "build:copy-types:root": "cpy --flat ../../crates/yaak-plugins/bindings/*.ts ./src/bindings",
    "build:copy-types:next": "cpy --flat ../../crates/yaak-plugins/bindings/serde_json/*.ts ./src/bindings/serde_json",
    "build:copy-types:root": "cpy --flat ../../src-tauri/yaak-plugins/bindings/*.ts ./src/bindings",
    "build:copy-types:next": "cpy --flat ../../src-tauri/yaak-plugins/bindings/serde_json/*.ts ./src/bindings/serde_json",
    "publish": "npm publish",
    "prepublishOnly": "npm run build"
  },
File diff suppressed because one or more lines are too long
Some files were not shown because too many files have changed in this diff.