# Spacedrive Core v2 Development Guide

## Quick Start

### Development Workflow

1. Start daemon: `cargo run --bin sd-daemon`
2. Make code changes
3. Run tests: `cargo test`
4. Rebuild and restart: `cargo run --bin sd-cli -- restart`
5. Test via CLI: `cargo run --bin sd-cli -- <command>`

### Common Commands

```bash
cargo build                          # Build the project
cargo test                           # Run all tests
cargo test <test_name>               # Run specific test
cargo clippy                         # Lint code
cargo fmt                            # Format code
cargo run --bin sd-cli -- <command>  # Run CLI (binary is sd-cli, not spacedrive)
```

### Common Mistakes

- Running `spacedrive` instead of `sd-cli` (the binary name is `sd-cli`)
- Forgetting to restart the daemon after rebuilding
- Using `println!` instead of `tracing` macros (`info!`, `debug!`, etc.)
- Implementing `Wire` manually instead of using the `register_*` macros
- Blocking the async runtime with synchronous I/O operations
## Architecture Overview

Spacedrive uses a daemon-client architecture. A single daemon process manages core functionality. Multiple clients (CLI, GraphQL server, desktop app) connect via Unix domain sockets.

### CQRS and DDD Pattern

- **Domain** (`src/domain/`): Core data structures and business logic (nouns)
- **Operations** (`src/ops/`): Actions and queries (verbs)
  - **Actions**: State-changing operations (writes)
  - **Queries**: Data retrieval without state changes (reads)

### Feature Module Structure

Each feature lives in its own module under `src/ops/`. Example: `src/ops/files/share`

```
src/ops/files/share/
├── action.rs   # State-changing logic
├── input.rs    # Action input structures
├── output.rs   # Action output structures
└── job.rs      # Long-running job implementation (if needed)
```

Complete feature example:

```rust
// src/ops/files/share/input.rs
#[derive(Debug, Serialize, Deserialize)]
pub struct ShareFileInput {
	pub file_id: i32,
	pub recipient: String,
}

// src/ops/files/share/output.rs
#[derive(Debug, Serialize, Deserialize)]
pub struct ShareFileOutput {
	pub share_id: String,
	pub url: String,
}

// src/ops/files/share/action.rs
use super::{ShareFileInput, ShareFileOutput};

pub struct ShareFileAction;

crate::register_library_action!(ShareFileAction, "files.share");

impl Action for ShareFileAction {
	type Input = ShareFileInput;
	type Output = ShareFileOutput;

	async fn run(input: Self::Input, ctx: &ActionContext) -> Result<Self::Output> {
		// Implementation
	}
}
```
## Communication Architecture

Spacedrive supports multiple communication patterns for different platforms and use cases.

### Daemon-Client Communication (Desktop, CLI)

Desktop and CLI clients connect to a daemon process via Unix domain sockets. Communication uses JSON-RPC 2.0 with Wire method strings.
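
For orientation, one request on the socket looks roughly like this. The method string is the versioned name produced by the registration macros; any envelope detail beyond the standard JSON-RPC 2.0 fields is an assumption, and the `params` shape depends on the operation's input type:

```json
{
	"jsonrpc": "2.0",
	"id": 1,
	"method": "query:network.status.v1",
	"params": {}
}
```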

**Registration Macros:**

Never implement `Wire` manually. Use registration macros:

```rust
// Queries
crate::register_query!(NetworkStatusQuery, "network.status");
// Generates: "query:network.status.v1"

// Library Actions
crate::register_library_action!(FileCopyAction, "files.copy");
// Generates: "action:files.copy.input.v1"

// Core Actions
crate::register_core_action!(LibraryCreateAction, "libraries.create");
// Generates: "action:libraries.create.input.v1"
```

**Registry System:**

The `inventory` crate collects operations at compile time. When you use `register_query!` or `register_library_action!`, the operation automatically appears in the global `QUERIES` and `ACTIONS` hashmaps at startup. You never manually register operations.

Location: `core/src/ops/registry.rs`
### Embedded Core (iOS, Mobile)

iOS and mobile apps embed the core directly as a native library rather than connecting to a daemon. Communication uses the same JSON-RPC 2.0 protocol with Wire method strings, but over FFI instead of sockets.

**Architecture:**

```
iOS App (Swift)
    ↓
SpacedriveClient (Swift)
    ↓
SDIOSCore (Rust FFI)
    ↓
RpcServer::execute_json_operation() (Rust)
    ↓
Operation Registry (same as daemon!)
```

**Key Files:**

- `apps/ios/sd-ios-core/src/lib.rs` - FFI bridge to Rust core
- `packages/swift-client/` - Swift client library
- `apps/ios/Spacedrive/Spacedrive/Managers/EmbeddedCoreManager.swift` - iOS manager

**Benefits:**

- No daemon process needed
- Lower latency (in-process)
- Works offline immediately
- Same operation registry as daemon
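
As a sketch of where the FFI boundary sits, a stripped-down entry point is shown below. The function names and the echoed response are hypothetical; the real bridge in `apps/ios/sd-ios-core/src/lib.rs` forwards the request to `RpcServer::execute_json_operation()`:

```rust
use std::ffi::{c_char, CStr, CString};

// Hypothetical FFI surface. Takes a JSON-RPC request string and returns
// a response string the Swift side must release via sd_core_free.
#[no_mangle]
pub extern "C" fn sd_core_execute(request_json: *const c_char) -> *mut c_char {
	let request = unsafe { CStr::from_ptr(request_json) }
		.to_string_lossy()
		.into_owned();

	// Real core: dispatch through RpcServer::execute_json_operation().
	// Here we just echo the request length as a placeholder result.
	let response = format!(
		"{{\"jsonrpc\":\"2.0\",\"id\":1,\"result\":{{\"request_len\":{}}}}}",
		request.len()
	);

	CString::new(response).unwrap().into_raw()
}

#[no_mangle]
pub extern "C" fn sd_core_free(response: *mut c_char) {
	if !response.is_null() {
		// Reclaim the CString allocated in sd_core_execute.
		drop(unsafe { CString::from_raw(response) });
	}
}
```

Because the same operation registry sits behind this entry point, Swift clients and daemon clients speak an identical protocol.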

### Swift Client Auto-Generation

Swift types are automatically generated from Rust types using Specta, similar to TypeScript generation.

**Generation Process:**

```bash
# Types are generated during build
cargo run --bin generate_swift_types
```

**Output Files:**

- `packages/swift-client/Sources/SpacedriveClient/SpacedriveTypes.swift` - All types
- `packages/swift-client/Sources/SpacedriveClient/SpacedriveAPI.swift` - API methods

**How It Works:**

The `generate_swift_types` binary uses Specta to export Rust types to Swift:

```rust
// core/src/bin/generate_swift_types.rs
let (operations, queries, types) = generate_spacedrive_api();
let api_structure = create_spacedrive_api_structure(&operations, &queries);

// Generate Swift code from Specta types
let swift = specta_swift::Swift::new()
	.naming(specta_swift::NamingConvention::PascalCase)
	.export(types)?;
```

**Consumed By:**

- iOS app (`apps/ios/`)
- macOS app (`apps/macos/`)
- Any Swift-based Spacedrive client
### Extension System (WASM)

Extensions run as sandboxed WASM modules that interact with Spacedrive core via host functions. Extensions are distributed as compiled `.wasm` files.

**Architecture:**

```
Extension.wasm (compiled Rust)
    ↓
spacedrive-sdk (Rust crate)
    ↓
Host Functions (FFI boundary)
    ↓
Core (VDFS, Jobs, AI, etc.)
```

**Key Components:**

**SDK Location:** `crates/sdk/`

- High-level Rust API abstracting FFI details
- Procedural macros for extension definition
- Type-safe job, model, and action builders

**Extension Development:**

Extensions use procedural macros to minimize boilerplate:

```rust
use spacedrive_sdk::prelude::*;

#[extension(
	id = "test-extension",
	name = "Test Extension",
	version = "0.1.0",
	jobs = [test_counter],
)]
struct TestExtension;

#[derive(Serialize, Deserialize, Default)]
pub struct CounterState {
	pub current: u32,
	pub target: u32,
	pub processed: Vec<String>,
}

#[job(name = "counter")]
fn test_counter(ctx: &JobContext, state: &mut CounterState) -> Result<()> {
	ctx.log(&format!(
		"Starting counter (current: {}, target: {})",
		state.current, state.target
	));

	while state.current < state.target {
		if ctx.check_interrupt() {
			ctx.checkpoint(state)?;
			return Err(Error::OperationFailed("Interrupted".into()));
		}

		state.current += 1;
		ctx.report_progress(
			state.current as f32 / state.target as f32,
			&format!("Counted {}/{}", state.current, state.target),
		);

		if state.current % 10 == 0 {
			ctx.checkpoint(state)?;
		}
	}

	Ok(())
}
```

**Host Functions:**

Extensions import minimal FFI functions:

```rust
#[link(wasm_import_module = "spacedrive")]
extern "C" {
	fn spacedrive_log(level: u32, msg_ptr: *const u8, msg_len: usize);
	fn register_job(
		job_name_ptr: *const u8,
		job_name_len: u32,
		export_fn_ptr: *const u8,
		export_fn_len: u32,
		resumable: u32,
	) -> i32;
}
```

**Building Extensions:**

```bash
# From extension directory
cargo build --target wasm32-unknown-unknown --release

# Output: target/wasm32-unknown-unknown/release/extension_name.wasm
```

**Extension Capabilities:**

Extensions can define:

- Models: Data structures stored in the `models` table (content-scoped, standalone, or entry-scoped)
- Jobs: Long-running resumable operations
- Actions: User-invoked operations with a preview-commit workflow
- Agents: Autonomous logic with memory and event handling
- UI: Custom views via `ui_manifest.json`

**Example Use Cases:**

- Photos extension: Face detection, scene tagging, album organization
- Finance extension: Receipt extraction, expense tracking
- Research extension: Citation extraction, knowledge graphs

**Key Benefits:**

- Single `.wasm` file works on all platforms
- True sandboxing (WASM isolation)
- Resumable jobs with checkpointing
- Type-safe API with procedural macros
- No core modifications needed for new features

**Documentation:**

- `/docs/sdk/sdk.md` - Complete SDK specification and API reference
- `extensions/test-extension/` - Working example extension
- `crates/sdk/` - SDK implementation
- `crates/sdk-macros/` - SDK procedural macros

**Status:** SDK implementation in progress. The test extension compiles to WASM successfully. Core integration for loading and executing WASM modules is the next phase.
## Code Standards

### Import Organization

Group imports with blank lines between groups:

```rust
// Standard library
use std::path::PathBuf;
use std::sync::Arc;

// External crates
use serde::{Deserialize, Serialize};
use tokio::sync::RwLock;

// Local modules
use crate::domain::library::Library;
use crate::ops::Action;
```

### Naming Conventions

- Functions/variables: `snake_case`
- Types: `PascalCase`
- Constants: `SCREAMING_SNAKE_CASE`
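
A minimal illustration of all three conventions together (every name here is hypothetical):

```rust
const MAX_RETRY_COUNT: u32 = 3; // constants: SCREAMING_SNAKE_CASE

struct ShareRequest {
	// types: PascalCase
	file_id: i32,
}

fn build_share_url(request: &ShareRequest) -> String {
	// functions and variables: snake_case
	let base_url = "https://example.com/share";
	format!("{}/{}?retries={}", base_url, request.file_id, MAX_RETRY_COUNT)
}
```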

### Error Handling

Use `Result<T, E>` for all fallible operations. Use `thiserror` for custom errors, `anyhow` for application errors.

```rust
use thiserror::Error;

#[derive(Error, Debug)]
pub enum ShareError {
	#[error("File not found: {0}")]
	FileNotFound(i32),

	#[error("Permission denied")]
	PermissionDenied,

	#[error("Database error: {0}")]
	Database(#[from] sea_orm::DbErr),
}

pub async fn share_file(id: i32) -> Result<String, ShareError> {
	let file = find_file(id).await.ok_or(ShareError::FileNotFound(id))?;
	// Implementation: create the share, then return its URL
	let share_url = create_share(&file).await?; // create_share: sketch helper
	Ok(share_url)
}
```

### Async Code

- Use `async/await` syntax
- Prefer `tokio` primitives (`tokio::sync::RwLock`, `tokio::spawn`)
- Avoid blocking operations (use `tokio::fs`, not `std::fs`)
- Use `tokio::task::spawn_blocking` for CPU-intensive work

### Resumable Jobs

Store job state within the job struct. Use `#[serde(skip)]` for non-persistent fields.

```rust
#[derive(Serialize, Deserialize)]
pub struct FileCopyJob {
	pub source: PathBuf,
	pub destination: PathBuf,
	pub files_to_copy: Vec<PathBuf>, // Work queue, persisted with the job
	pub copied_files: Vec<PathBuf>,  // Persisted for resumability

	#[serde(skip)]
	pub progress_tx: Option<tokio::sync::mpsc::Sender<Progress>>, // Not persisted
}

impl Job for FileCopyJob {
	async fn run(&mut self, ctx: &JobContext) -> Result<()> {
		ctx.log().info("Starting file copy job");

		for file in &self.files_to_copy {
			if self.copied_files.contains(file) {
				continue; // Skip already copied files on resume
			}

			copy_file(file).await?;
			self.copied_files.push(file.clone());
		}

		Ok(())
	}
}
```
### Documentation

- Module docs: `//!` at top of file
- Public items: `///` with examples
- Focus on why, not what
- Track future work in GitHub issues, not code comments

````rust
//! File sharing operations.
//!
//! Handles creating, revoking, and managing file shares.

/// Creates a new file share with the specified recipient.
///
/// # Example
///
/// ```
/// let output = share_file(ShareFileInput {
///     file_id: 123,
///     recipient: "user@example.com".to_string(),
/// }).await?;
/// ```
pub async fn share_file(input: ShareFileInput) -> Result<ShareFileOutput> {
	// Implementation
}
````

### Formatting

Run `cargo fmt` before committing. Tabs for indentation. No emojis.
## Logging

### Setup

Use `tracing_subscriber` in `main` or examples:

```rust
use tracing_subscriber::EnvFilter;

fn main() {
	tracing_subscriber::fmt()
		.with_env_filter(
			EnvFilter::try_from_default_env()
				.unwrap_or_else(|_| EnvFilter::new("sd_core=info")),
		)
		.init();
}
```
### Macros

Use `tracing` macros, never `println!`:

```rust
use tracing::{info, warn, error, debug};

info!("Server started on port {}", port);
debug!(file_id = %id, "Processing file");
warn!(error = %e, "Retrying operation");
error!("Failed to connect to database");
```

### Job Logging

Use `ctx.log()` in job implementations for automatic `job_id` tagging:

```rust
impl Job for MyJob {
	async fn run(&mut self, ctx: &JobContext) -> Result<()> {
		ctx.log().info("Job started");
		ctx.log().debug(&format!("Processing ({})", self.progress));
		Ok(())
	}
}
```

### Log Levels

- `debug`: Detailed flow for troubleshooting
- `info`: User-relevant events (server start, job completion)
- `warn`: Recoverable issues (retry, fallback)
- `error`: Failures requiring attention

### Environment Control

Use the `RUST_LOG` environment variable:

```bash
RUST_LOG=debug cargo run --bin sd-cli
RUST_LOG=sd_core=trace cargo run
RUST_LOG=sd_core::ops=debug cargo run
```

## Writing Style

This applies to all documentation, code comments, and design documents.

Use clear, simple language. Write short, impactful sentences. Use active voice. Focus on practical, actionable information.

Address the reader directly with "you" and "your". Support claims with data and examples when possible.

Avoid these constructions:

- Em dashes (use commas or periods)
- "Not only this, but also this"
- Metaphors and cliches
- Generalizations
- Setup language like "in conclusion"
- Unnecessary adjectives and adverbs
- Emojis, hashtags, markdown formatting in prose

Avoid these words:
comprehensive, delve, utilize, harness, realm, tapestry, unlock, revolutionary, groundbreaking, remarkable, pivotal
## Testing

### Test Organization

- Unit tests: Colocated in `#[cfg(test)]` modules
- Integration tests: `tests/` directory at the crate root

```rust
// src/ops/files/share/action.rs

#[cfg(test)]
mod tests {
	use super::*;

	#[tokio::test]
	async fn test_share_file() {
		let input = ShareFileInput {
			file_id: 1,
			recipient: "test@example.com".to_string(),
		};

		let output = share_file(input).await.unwrap();
		assert!(!output.share_id.is_empty());
	}
}
```

### Running Tests

```bash
cargo test                  # All tests
cargo test test_share_file  # Specific test
cargo test --lib            # Library tests only
cargo test -- --nocapture   # Show output
```
## Task Tracking

Spacedrive uses a file-based task system in `/.tasks/` to track features, epics, and development work. All task files are version-controlled alongside the code.

### When to Create Tasks

Create tasks for work that:

- Introduces a new feature or capability
- Refactors a significant system or module
- Fixes a bug requiring architectural changes
- Implements a whitepaper specification

Do not create tasks for:

- Routine code formatting or style fixes
- Trivial bug fixes (single-line changes)
- Documentation updates to existing features
- Dependency version bumps

### Task Structure

Each task is a Markdown file: `CATEGORY-###-title-slug.md`

```markdown
---
id: CORE-042
title: "Implement file sharing API"
status: "In Progress"
assignee: "james"
priority: "High"
tags: ["core", "networking"]
whitepaper: "Section 4.2" # And/or design_doc: DESIGN_DOC_NAME.md
---

## Description
Brief overview of what needs to be done and why.

## Implementation Steps
- [ ] Create share action in src/ops/files/share
- [ ] Add database schema for shares table
- [ ] Implement expiration logic

## Acceptance Criteria
- Share links work across all platforms
- Expired shares return 404
- Tests cover edge cases
```

### Managing Tasks

```bash
# List your active tasks
cargo run -p task-validator -- list --assignee "yourname" --status "In Progress"

# List high priority tasks
cargo run -p task-validator -- list --priority "High" --sort-by id

# Validate before committing (automatic via git hook)
cargo run -p task-validator -- validate
```

### Task Lifecycle

1. Create a task file in `/.tasks/` with `status: "To Do"`
2. Update status to `"In Progress"` when you start work
3. Complete implementation and tests
4. Update status to `"Done"` and commit

Full documentation: `/docs/core/task-tracking.md`
## Debugging

### Log Files

Job logs live in the `job_logs` directory in the data folder root.

### Daemon Restart

After rebuilding, restart the daemon to use the latest code:

```bash
cargo build
cargo run --bin sd-cli -- restart
```

### Verbose Logging

```bash
RUST_LOG=debug cargo run --bin sd-daemon
RUST_LOG=sd_core::jobs=trace cargo run
```

## Documentation Locations

- Core architecture: `/docs/core/`
- Design docs and RFCs: `/docs/core/design/`
- Application docs: `/docs/`
- Daemon details: `/docs/core/daemon.md`
- Task tracking: `/docs/core/task-tracking.md`
### Your PR is Merged!

Congratulations! The Spacedrive team thanks you for your contribution!

Once your PR is merged, your changes will be included in the next release of the application.
Spacedrive v2 represents a **major architectural achievement** with approximately **87% of core whitepaper features implemented**. The project has successfully built the foundational VDFS architecture, networking stack, **complete sync infrastructure**, and essential file operations. Development is concentrated in the Rust core (61,831 LOC), with working CLI (4,131 LOC), iOS/macOS apps, extension SDK, and comprehensive documentation (147 docs).

**Status Overview:**
- **30 tasks completed** (Core infrastructure complete, sync infrastructure 95% done)
- **8 tasks in progress** (Model wiring, client features, search)
- **52 tasks remaining** (AI agent, cloud, advanced features)

**Critical Update:** Initial assessment underestimated sync completeness. Comprehensive integration tests prove all sync infrastructure is working - only model wiring remains.

---

## 1. Core VDFS Architecture (~95% Complete)

### Completed Components

### In Progress Components

#### 1.7 Virtual Sidecar System (~70% Complete)
- **Implementation:** `core/src/ops/sidecar/`, `core/src/service/sidecar_manager.rs`
- **Completed:**
  - Sidecar types defined (Thumb, Proxy, Embeddings, OCR, Transcript, LivePhotoVideo)

---

## 2. Indexing Engine (~90% Complete)

### Implementation: `core/src/ops/indexing/`

---

## 3. Transactional Action System (100% Complete)

### Implementation: `core/src/infra/action/`

---

## 4. File Operations (~85% Complete)

### Implementation: `core/src/ops/files/`

---

## 5. Durable Job System (100% Complete)

### Implementation: `core/src/infra/job/`

---

## 6. Networking & Synchronization (~95% Complete)

### 6.1 Iroh P2P Stack
**Implementation:** `core/src/service/network/`

**Status:** Complete and tested

### 6.3 Library Sync Infrastructure (~95% Complete)
**Implementation:** `core/src/service/sync/`, `core/src/infra/sync/`
**Test Coverage:** `core/tests/sync_integration_test.rs` (1,554 lines, all tests passing)

- **Transaction Manager** (`transaction.rs`, 287 lines) - Leaderless coordinator
- **Peer Log** (`peer_log.rs`, 428 lines) - Per-device change log (sync.db)
- **HLC Implementation** (`hlc.rs`, 348 lines) - Hybrid Logical Clock
- **FK Mapper** (`fk_mapper.rs`, 296 lines) - Automatic UUID ↔ ID conversion
- **Dependency Graph** (`dependency_graph.rs`) - Ensures correct sync order
- **Syncable Trait** (`syncable.rs`, 337 lines) - Trait for sync-aware models
- **Registry System** (`registry.rs`, 486 lines) - Model registration

- Automatic FK conversion
- Transparent sync broadcasting

#### What Remains (~5% work)

**Model Wiring Only:**
- Wire remaining 15-20 models to sync (Tags and Locations already done)
- Models: Albums, Collections, UserMetadata, etc.
- Estimated: 1 week of mechanical work

**NOT Missing:**
- HLC implementation (complete, tested)
- Syncable trait (complete, used by Tag and Location)
- Conflict resolution (last-writer-wins implemented)
- Backfill (complete with full state snapshots)
- Transitive sync (proven working)

**Overall Status:** Sync infrastructure 95% complete - all mechanisms working, just needs model wiring
---

## 7. Volume Management (~90% Complete)

### Implementation: `core/src/volume/`

---

## 8. Search System (~40% Complete)

### Implementation: `core/src/ops/search/`

---

## 9. WASM Extension System (~60% Complete)

### Implementation: `core/src/infra/extension/`, `crates/sdk/`

---

## 10. CLI Application (~85% Complete)

### Implementation: `apps/cli/src/`

---

## 12. Documentation (~80% Complete)

### Coverage

## 13. Not Yet Started Features

### 13.1 AI Agent System (0% Complete)
**Tasks:** `AI-000`, `AI-001`, `AI-002`

**Missing:**

**Impact:** High - This is the most transformative feature

### 13.2 Cloud Infrastructure (0% Complete)
**Tasks:** `CLOUD-000` through `CLOUD-003`

**Missing:**

**Impact:** High - Required for full P2P backup

### 13.3 Advanced Client Features (0% Complete)
**Tasks:** `CORE-011` through `CORE-017`

**Missing:**

**Impact:** Medium - Improves client responsiveness

### 13.4 File Sync Conduits (~20% Complete)
**Tasks:** `FSYNC-000` through `FSYNC-014`

**What Exists:**

**Impact:** High - Critical for automated file sync

### 13.5 Security Features (~30% Complete)
**Tasks:** `SEC-000` through `SEC-007`

**Completed:**

**Impact:** High - Required for enterprise use

### 13.6 Advanced Search Features (0% Complete)
**Tasks:** `SEARCH-001` through `SEARCH-003`

**Missing:**

10. **Resumability:** Jobs designed for interruption and recovery
11. **P2P Architecture:** Solid Iroh integration

### Areas for Improvement

1. **Test Coverage:** Limited unit/integration tests visible
2. **Performance Benchmarks:** Benchmark infrastructure exists but limited results
## 16. Comparison to Whitepaper
|
||||
|
||||
### Core VDFS ✅ (~95%)
|
||||
- ✅ Entry-centric model
|
||||
- ✅ SdPath addressing
|
||||
- ✅ Content identity
|
||||
- ✅ Closure tables
|
||||
- ✅ File type system
|
||||
- ✅ Semantic tagging
|
||||
- 🔄 Virtual sidecars (~70%)
|
||||
### Core VDFS (~95%)
|
||||
- Entry-centric model
|
||||
- SdPath addressing
|
||||
- Content identity
|
||||
- Closure tables
|
||||
- File type system
|
||||
- Semantic tagging
|
||||
- Virtual sidecars (~70%)
|
||||
|
||||
### Indexing Engine ✅ (~90%)
|
||||
- ✅ Five-phase pipeline
|
||||
- ✅ Resumability
|
||||
- ✅ Change detection
|
||||
- 🔄 Real-time monitoring (~60%)
|
||||
- 🔄 Offline recovery (~40%)
|
||||
- ❌ Remote volume indexing (OpenDAL integration pending)
|
||||
### Indexing Engine (~90%)
|
||||
- Five-phase pipeline
|
||||
- Resumability
|
||||
- Change detection
|
||||
- Real-time monitoring (~60%)
|
||||
- Offline recovery (~40%)
|
||||
- Remote volume indexing (OpenDAL integration pending)
|
||||
|
||||
### Transactional Actions ✅ (100%)
|
||||
- ✅ Preview, commit, verify
|
||||
- ✅ Durable execution
|
||||
- ✅ Conflict detection
|
||||
- ✅ Audit logging
|
||||
### Transactional Actions (100%)
|
||||
- Preview, commit, verify
|
||||
- Durable execution
|
||||
- Conflict detection
|
||||
- Audit logging
|
||||
|
||||
### File Operations ✅ (~85%)
|
||||
- ✅ Copy with strategy pattern
|
||||
- ✅ Delete with trash support
|
||||
- ✅ Move/rename
|
||||
- 🔄 Validation (~60%)
|
||||
### File Operations (~85%)
|
||||
- Copy with strategy pattern
|
||||
- Delete with trash support
|
||||
- Move/rename
|
||||
- Validation (~60%)
|
||||
|
||||
### Library Sync ✅ (~95%)
|
||||
- ✅ Leaderless architecture
|
||||
- ✅ Domain separation
|
||||
- ✅ State-based sync (device data) - **fully working**
|
||||
- ✅ Log-based sync (shared data) - **fully working with HLC**
|
||||
- ✅ HLC implementation - **complete (348 LOC, tested)**
|
||||
- ✅ Syncable trait - **complete (337 LOC, in use)**
|
||||
- ✅ Backfill with full state snapshots - **complete**
|
||||
- ✅ Transitive sync - **validated end-to-end**
|
||||
- 🔄 Model wiring - remaining 15-20 models (1 week)
|
||||
### Library Sync (~95%)
|
||||
- Leaderless architecture
|
||||
- Domain separation
|
||||
- State-based sync (device data) - **fully working**
|
||||
- Log-based sync (shared data) - **fully working with HLC**
|
||||
- HLC implementation - **complete (348 LOC, tested)**
|
||||
- Syncable trait - **complete (337 LOC, in use)**
|
||||
- Backfill with full state snapshots - **complete**
|
||||
- Transitive sync - **validated end-to-end**
|
||||
- Model wiring - remaining 15-20 models (1 week)
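The HLC mechanism the sync lists above rely on can be illustrated with a minimal sketch. This is not Spacedrive's actual 348-LOC implementation; the struct and method names here are hypothetical, showing only why hybrid logical clocks give log-based sync a total order across peers even when their wall clocks disagree:

```rust
use std::cmp::max;

// Illustrative Hybrid Logical Clock: ordered first by physical time,
// then by a logical counter that breaks same-millisecond ties.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
struct Hlc {
    wall: u64,    // physical time, e.g. unix millis
    counter: u16, // logical counter for events within one millisecond
}

impl Hlc {
    // Advance the clock for a local event, given the current physical time.
    fn tick(&mut self, now: u64) -> Hlc {
        if now > self.wall {
            self.wall = now;
            self.counter = 0;
        } else {
            self.counter += 1;
        }
        *self
    }

    // Merge a timestamp received from a peer so local time never runs backwards,
    // even if our own physical clock lags behind the sender's.
    fn receive(&mut self, remote: Hlc, now: u64) -> Hlc {
        let wall = max(now, max(self.wall, remote.wall));
        self.counter = if wall == self.wall && wall == remote.wall {
            max(self.counter, remote.counter) + 1
        } else if wall == self.wall {
            self.counter + 1
        } else if wall == remote.wall {
            remote.counter + 1
        } else {
            0
        };
        self.wall = wall;
        *self
    }
}

fn main() {
    let mut a = Hlc { wall: 0, counter: 0 };
    let t1 = a.tick(100);
    let t2 = a.tick(100); // same millisecond: the counter breaks the tie
    assert!(t2 > t1);

    let mut b = Hlc { wall: 0, counter: 0 };
    let t3 = b.receive(t2, 90); // b's physical clock lags; HLC still moves forward
    assert!(t3 > t2);
    println!("{:?} < {:?} < {:?}", t1, t2, t3);
}
```

The derived `Ord` on `(wall, counter)` is what makes every tag edit in the shared log comparable, which is the property the leaderless design needs.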
|
||||
|
||||
### Networking ✅ (~85%)
|
||||
- ✅ Iroh P2P stack
|
||||
- ✅ Device pairing
|
||||
- ✅ mDNS discovery
|
||||
- ✅ QUIC transport
|
||||
- ❌ Spacedrop protocol (0%)
|
||||
### Networking (~85%)
|
||||
- Iroh P2P stack
|
||||
- Device pairing
|
||||
- mDNS discovery
|
||||
- QUIC transport
|
||||
- Spacedrop protocol (0%)
|
||||
|
||||
### AI-Native Architecture ❌ (0%)
|
||||
- ❌ AI agent
|
||||
- ❌ Natural language interface
|
||||
- ❌ Proactive assistance
|
||||
- ❌ Local model integration
|
||||
### AI-Native Architecture (0%)
|
||||
- AI agent
|
||||
- Natural language interface
|
||||
- Proactive assistance
|
||||
- Local model integration
|
||||
|
||||
### Temporal-Semantic Search 🔄 (~40%)
|
||||
- ✅ Basic search
|
||||
- 🔄 FTS5 index (migration exists, not integrated)
|
||||
- ❌ Semantic re-ranking (0%)
|
||||
- ❌ Vector repositories (0%)
|
||||
### Temporal-Semantic Search (~40%)
|
||||
- Basic search
|
||||
- FTS5 index (migration exists, not integrated)
|
||||
- Semantic re-ranking (0%)
|
||||
- Vector repositories (0%)
|
||||
|
||||
### Cloud as a Peer ❌ (0%)
|
||||
- ❌ Cloud core infrastructure
|
||||
- ❌ Relay server
|
||||
- ❌ Cloud volumes
|
||||
### Cloud as a Peer (0%)
|
||||
- Cloud core infrastructure
|
||||
- Relay server
|
||||
- Cloud volumes
|
||||
|
||||
### Security 🔄 (~30%)
|
||||
- ✅ Network encryption
|
||||
- ✅ Device keys
|
||||
- ❌ Database encryption (0%)
|
||||
- ❌ RBAC (0%)
|
||||
- ❌ Audit log encryption (0%)
|
||||
### Security (~30%)
|
||||
- Network encryption
|
||||
- Device keys
|
||||
- Database encryption (0%)
|
||||
- RBAC (0%)
|
||||
- Audit log encryption (0%)
|
||||
|
||||
---
|
||||
|
||||
@@ -840,7 +840,7 @@ Rust 35 596 424 4,131
- Query file metadata
- Duplicate detection

3. **Networking & Sync** ⭐ **[UPDATED]**
3. **Networking & Sync** **[UPDATED]**
   - Discover devices on local network
   - Pair devices securely
   - Establish P2P connections
@@ -885,9 +885,9 @@ Rust 35 596 424 4,131
### Partially Working Features 🔄

1. **Library Sync Model Wiring** (~95% → 100%)
   - ✅ All sync infrastructure complete
   - ✅ Tags and locations wired
   - 🔄 Remaining 15-20 models need wiring (1 week of work)
   - All sync infrastructure complete
   - Tags and locations wired
   - Remaining 15-20 models need wiring (1 week of work)

2. **Search**
   - Basic querying works
@@ -905,20 +905,20 @@ Rust 35 596 424 4,131

## 18. Technical Debt & Known Issues

### Critical Issues 🔴
### Critical Issues
1. **Sync Incomplete:** Shared metadata (tags, albums) not syncing between devices
2. **No Database Encryption:** Libraries unencrypted at rest
3. **FTS5 Not Integrated:** Migration exists but search doesn't use it
4. **iOS Background Limitations:** Photo sync requires app to be active

### Medium Issues 🟡
### Medium Issues
1. **Test Coverage:** Limited integration tests
2. **Performance Profiling:** Need more benchmarks
3. **Error Handling:** Some areas use generic errors
4. **API Versioning:** No version negotiation yet
5. **Hot Reload:** Extension updates require restart

### Low Issues 🟢
### Low Issues
1. **Documentation Gaps:** Some features undocumented
2. **CLI Help Text:** Could be more detailed
3. **Logging Verbosity:** Too much debug output in some areas
@@ -1003,51 +1003,51 @@ Spacedrive v2 has achieved **remarkable progress** in building a sophisticated V
- **Complete Sync Infrastructure:** 1,554 lines of passing integration tests prove full functionality
- **Working Networking:** P2P with Iroh and device pairing complete

### What's Missing 🎯
### What's Missing
- **AI Agent:** The "intelligence" layer (0% complete)
- **Cloud Services:** Managed infrastructure (0% complete)
- **Model Wiring:** Remaining 15-20 models need sync wiring (1 week)
- **Semantic Search:** Vector-based search (0% complete)
- **Security Hardening:** Encryption at rest (0% complete)

### Overall Assessment 📊
**Implementation: ~87% of whitepaper core features** ⬆️ *(revised from 82%)*
### Overall Assessment
**Implementation: ~87% of whitepaper core features** *(revised from 82%)*
- Core VDFS: ~95% ✅
- File Operations: ~85% ✅
- Networking: ~85% ✅
- **Sync: ~95% ✅** ⬆️ *(was 75% - sync infrastructure complete, just needs wiring)*
- **Sync: ~95% ✅** *(was 75% - sync infrastructure complete, just needs wiring)*
- Search: ~40% 🔄
- AI: ~0% ❌
- Cloud: ~0% ❌

### Correction to Initial Assessment
**Initial analysis underestimated sync completeness.** Comprehensive integration tests (`sync_integration_test.rs`) prove:
- ✅ State-based sync working (locations, entries)
- ✅ Log-based sync with HLC working (tags)
- ✅ Backfill with full state snapshots
- ✅ Transitive sync validated (A→B→C)
- ✅ All sync infrastructure complete
- State-based sync working (locations, entries)
- Log-based sync with HLC working (tags)
- Backfill with full state snapshots
- Transitive sync validated (A→B→C)
- All sync infrastructure complete

**Only remaining work:** Wire 15-20 models to existing sync API (mechanical, ~1 week)

### Readiness for Production 🚀
### Readiness for Production
**Current State:** Advanced Alpha
- ✅ Safe for technical users and testing
- ✅ Core functionality works reliably
- ✅ **Sync infrastructure complete and validated**
- ⚠️ Missing: AI agent, encryption at rest, model wiring
- ⚠️ Limited testing and hardening
- ❌ Not ready for general release
- Safe for technical users and testing
- Core functionality works reliably
- **Sync infrastructure complete and validated**
- Missing: AI agent, encryption at rest, model wiring
- Limited testing and hardening
- Not ready for general release

**Revised Path to Production:**
1. Complete model wiring (1 week) ⬇️ *(was 2-3 months)*
1. Complete model wiring (1 week) *(was 2-3 months)*
2. Build AI agent basics (3-4 weeks with AI assistance)
3. Add encryption (1 month)
4. Build extensions (3-4 weeks)
5. Comprehensive testing (1 month)
6. Polish UI/UX (2-3 weeks)
7. **Alpha Release: November 2025** ⬅️ **ACHIEVABLE**
8. **Beta Release: Q1 2026** ⬅️ **Updated from Q2**
7. **Alpha Release: November 2025** **ACHIEVABLE**
8. **Beta Release: Q1 2026** **Updated from Q2**

### Final Note
The project demonstrates **exceptional engineering quality** and architectural vision.
@@ -1055,12 +1055,12 @@ The project demonstrates **exceptional engineering quality** and architectural v
**Critical Finding:** Initial assessment failed to recognize that sync is **95% complete** with all core mechanisms working. The comprehensive integration tests prove end-to-end functionality - only mechanical model wiring remains.

With your demonstrated velocity (V2 core built in 4 months) and AI-accelerated workflow, the **November 2025 alpha timeline is realistic**:
- Sync infrastructure: ✅ Complete
- Core VDFS: ✅ Production-ready
- Networking: ✅ Working
- Sync infrastructure: Complete
- Core VDFS: Production-ready
- Networking: Working
- Remaining work: AI agent + extensions + polish (~4-6 weeks at your pace)

**The core VDFS vision is realized and sync is working. November alpha is achievable.** 🚀
**The core VDFS vision is realized and sync is working. November alpha is achievable.**

---
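To give a sense of why the remaining wiring is mechanical, here is a hedged sketch of the kind of trait impl involved. The real `Syncable` trait (337 LOC) certainly has a different shape; the `Album` model, method names, and JSON payload below are hypothetical illustrations only:

```rust
// Hypothetical model not yet wired for sync.
#[derive(Debug, Clone)]
struct Album {
    uuid: String,
    name: String,
}

// Sketch of a Syncable-style trait: a stable identity so peers converge on
// the same record, plus a serialized payload for the shared sync log.
trait Syncable {
    fn sync_id(&self) -> &str;
    fn to_sync_payload(&self) -> String;
}

impl Syncable for Album {
    fn sync_id(&self) -> &str {
        &self.uuid
    }
    fn to_sync_payload(&self) -> String {
        // Hand-rolled JSON keeps the sketch dependency-free; a real impl
        // would use the project's serializer.
        format!("{{\"uuid\":\"{}\",\"name\":\"{}\"}}", self.uuid, self.name)
    }
}

fn main() {
    let album = Album { uuid: "a1".to_string(), name: "Vacation 2024".to_string() };
    assert_eq!(album.sync_id(), "a1");
    println!("{}", album.to_sync_payload());
}
```

Each of the 15-20 outstanding models would get one such impl plus registration with the sync engine, which is why the estimate is a week rather than months.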
@@ -3,10 +3,10 @@

## TL;DR

**Implementation:** ~87% of whitepaper core features complete ⬆️ *(revised from 82%)*
**Implementation:** ~87% of whitepaper core features complete *(revised from 82%)*
**Code:** 68,180 lines (61,831 Rust core + 4,131 CLI + 2,218 docs)
**Status:** Advanced Alpha - **sync infrastructure complete**, missing AI/cloud
**Production Ready:** **Alpha Nov 2025** ⬅️ **ACHIEVABLE** | Beta Q1 2026 *(revised from Q2)*
**Production Ready:** **Alpha Nov 2025** **ACHIEVABLE** | Beta Q1 2026 *(revised from Q2)*

**Critical Update:** Sync infrastructure 95% complete with 1,554 lines of passing integration tests - only model wiring remains.

@@ -16,49 +16,49 @@

| Area | Status | % Complete | Notes |
|------|--------|-----------|-------|
| **Core VDFS** | ✅ Done | 95% | Entry model, SdPath, content identity, file types, tagging all working |
| **Indexing Engine** | ✅ Done | 90% | 5-phase pipeline, resumability, change detection complete |
| **Actions System** | ✅ Done | 100% | Preview-commit-verify, audit logging, all actions implemented |
| **File Operations** | ✅ Done | 85% | Copy/move/delete with strategy pattern working |
| **Job System** | ✅ Done | 100% | Durable jobs, resumability, progress tracking complete |
| **Networking** | ✅ Done | 85% | Iroh P2P, device pairing, mDNS discovery working |
| **Library Sync** | ✅ Done | 95% | **All infrastructure complete with validated tests - just needs model wiring** ⬆️ |
| **Volume System** | ✅ Done | 90% | Detection, classification, tracking, speed testing complete |
| **CLI** | ✅ Done | 85% | All major commands functional |
| **iOS/macOS Apps** | 🔄 Partial | 65% | Core features work, polish needed |
| **Extension System** | 🔄 Partial | 60% | WASM runtime + SDK done, API surface incomplete |
| **Search** | 🔄 Partial | 40% | Basic search works, FTS5/semantic missing |
| **Sidecars** | 🔄 Partial | 70% | Types + paths done, generation workflows incomplete |
| **Security** | 🔄 Partial | 30% | Network encrypted, database encryption missing |
| **AI Agent** | ❌ Not Started | 0% | Greenfield |
| **Cloud Services** | ❌ Not Started | 0% | Greenfield |
| **Core VDFS** | Done | 95% | Entry model, SdPath, content identity, file types, tagging all working |
| **Indexing Engine** | Done | 90% | 5-phase pipeline, resumability, change detection complete |
| **Actions System** | Done | 100% | Preview-commit-verify, audit logging, all actions implemented |
| **File Operations** | Done | 85% | Copy/move/delete with strategy pattern working |
| **Job System** | Done | 100% | Durable jobs, resumability, progress tracking complete |
| **Networking** | Done | 85% | Iroh P2P, device pairing, mDNS discovery working |
| **Library Sync** | Done | 95% | **All infrastructure complete with validated tests - just needs model wiring** |
| **Volume System** | Done | 90% | Detection, classification, tracking, speed testing complete |
| **CLI** | Done | 85% | All major commands functional |
| **iOS/macOS Apps** | Partial | 65% | Core features work, polish needed |
| **Extension System** | Partial | 60% | WASM runtime + SDK done, API surface incomplete |
| **Search** | Partial | 40% | Basic search works, FTS5/semantic missing |
| **Sidecars** | Partial | 70% | Types + paths done, generation workflows incomplete |
| **Security** | Partial | 30% | Network encrypted, database encryption missing |
| **AI Agent** | Not Started | 0% | Greenfield |
| **Cloud Services** | Not Started | 0% | Greenfield |

---

## What Works Today ✅

### You Can:
- ✅ Create and manage libraries
- ✅ Add locations and index directories (millions of files)
- ✅ Copy, move, delete files with intelligent routing
- ✅ Discover and pair devices on local network
- ✅ **Sync tags between devices** ⭐ **[NEW]**
- ✅ **Sync locations and entries between devices** ⭐ **[NEW]**
- ✅ Create semantic tags with hierarchies
- ✅ Search files by metadata and tags
- ✅ Detect and track all volumes
- ✅ Use comprehensive CLI
- ✅ Run iOS app with photo backup to paired devices
- ✅ Load and run WASM extensions
- Create and manage libraries
- Add locations and index directories (millions of files)
- Copy, move, delete files with intelligent routing
- Discover and pair devices on local network
- **Sync tags between devices** **[NEW]**
- **Sync locations and entries between devices** **[NEW]**
- Create semantic tags with hierarchies
- Search files by metadata and tags
- Detect and track all volumes
- Use comprehensive CLI
- Run iOS app with photo backup to paired devices
- Load and run WASM extensions

### You Cannot (Yet):
- 🔄 Sync ALL models (15-20 models need wiring - 1 week) *(was: cannot sync at all)*
- ❌ Use AI for file organization
- ❌ Search by file content semantically
- ❌ Backup to cloud
- ❌ Encrypt libraries at rest
- ❌ Set up automated file sync policies
- ❌ Use Spacedrop (P2P file sharing)
- Sync ALL models (15-20 models need wiring - 1 week) *(was: cannot sync at all)*
- Use AI for file organization
- Search by file content semantically
- Backup to cloud
- Encrypt libraries at rest
- Set up automated file sync policies
- Use Spacedrop (P2P file sharing)

---
@@ -131,36 +131,36 @@
- Per-job logging

### Partially Implemented 🔄
1. **Library Sync** (~95%) ⬆️
   - ✅ Leaderless architecture
   - ✅ Domain separation
   - ✅ State-based sync (device data) - **fully working**
   - ✅ Log-based sync (shared data) - **fully working with HLC**
   - ✅ HLC timestamps - **complete (348 LOC, tested)**
   - ✅ Syncable trait - **complete (337 LOC, in use)**
   - ✅ Backfill with full state snapshots
   - ✅ Transitive sync validated
   - 🔄 Model wiring (15-20 models remaining - 1 week)
1. **Library Sync** (~95%)
   - Leaderless architecture
   - Domain separation
   - State-based sync (device data) - **fully working**
   - Log-based sync (shared data) - **fully working with HLC**
   - HLC timestamps - **complete (348 LOC, tested)**
   - Syncable trait - **complete (337 LOC, in use)**
   - Backfill with full state snapshots
   - Transitive sync validated
   - Model wiring (15-20 models remaining - 1 week)

2. **Search** (~40%)
   - ✅ Basic filtering and sorting
   - 🔄 FTS5 index (migration exists, not integrated)
   - ❌ Semantic re-ranking - 0%
   - ❌ Vector search - 0%
   - Basic filtering and sorting
   - FTS5 index (migration exists, not integrated)
   - Semantic re-ranking - 0%
   - Vector search - 0%

3. **Virtual Sidecars** (~70%)
   - ✅ Types and path system
   - ✅ Database entities
   - 🔄 Generation workflows - 50%
   - ❌ Cross-device availability - 0%
   - Types and path system
   - Database entities
   - Generation workflows - 50%
   - Cross-device availability - 0%

4. **Extensions** (~60%)
   - ✅ WASM runtime
   - ✅ Permission system
   - ✅ Beautiful SDK with macros
   - 🔄 VDFS API - 30%
   - ❌ AI API - 0%
   - ❌ Credential API - 0%
   - WASM runtime
   - Permission system
   - Beautiful SDK with macros
   - VDFS API - 30%
   - AI API - 0%
   - Credential API - 0%

### Not Implemented ❌
1. **AI Agent** (0%)
@@ -175,9 +175,9 @@
   - S3 integration

3. **Security** (~30% done, major pieces missing)
   - ❌ SQLCipher encryption at rest
   - ❌ RBAC system
   - ❌ Cryptographic audit log
   - SQLCipher encryption at rest
   - RBAC system
   - Cryptographic audit log

---
@@ -192,7 +192,7 @@
- Strong type safety
- Resumable job design

### Weaknesses ⚠️
### Weaknesses
- Limited test coverage (integration tests exist but sparse)
- Some APIs still evolving
- iOS app has background processing constraints
@@ -203,11 +203,11 @@
## Critical Path to Production

### Phase 1: Core Completion (3-4 months)
1. ✅ Complete library sync (HLC, shared metadata)
2. ✅ Integrate FTS5 search
3. ✅ Finish virtual sidecars
4. ✅ Add SQLCipher encryption
5. ✅ Basic file sync policies (Replicate, Synchronize)
1. Complete library sync (HLC, shared metadata)
2. Integrate FTS5 search
3. Finish virtual sidecars
4. Add SQLCipher encryption
5. Basic file sync policies (Replicate, Synchronize)

### Phase 2: Testing & Hardening (2 months)
1. Comprehensive integration tests
@@ -257,19 +257,19 @@

## Bottom Line

**Spacedrive v2 is 87% complete** ⬆️ with a **production-ready foundation and working sync**. The core VDFS architecture is solid, **sync infrastructure is complete with validated end-to-end tests**, and file operations are robust.
**Spacedrive v2 is 87% complete** with a **production-ready foundation and working sync**. The core VDFS architecture is solid, **sync infrastructure is complete with validated end-to-end tests**, and file operations are robust.

### Correction to Initial Assessment
Initial analysis **significantly underestimated sync completeness**. The 1,554-line integration test suite proves:
- ✅ State-based sync working
- ✅ Log-based sync with HLC working
- ✅ Backfill with full state snapshots
- ✅ Transitive sync validated (A→B→C)
- State-based sync working
- Log-based sync with HLC working
- Backfill with full state snapshots
- Transitive sync validated (A→B→C)

**Only remaining:** Wire 15-20 models to existing sync API (~1 week, not 3 months)

### What's Actually Missing:
1. **Model wiring** - 1 week ⬇️ *(was: 3-4 months for "sync")*
1. **Model wiring** - 1 week *(was: 3-4 months for "sync")*
2. **AI agent basics** - 3-4 weeks with AI assistance
3. **Extensions** - 3-4 weeks (Chronicle, Cipher, Ledger, Atlas)
4. **Encryption at rest** - 2-3 weeks
@@ -277,9 +277,9 @@ Initial analysis **significantly underestimated sync completeness**. The 1,554-l

**Total: 4-6 weeks at your demonstrated velocity**

**The vision is realized. Sync is working. November alpha is achievable.** 🚀
**The vision is realized. Sync is working. November alpha is achievable.**

**Alpha: November 2025** ⬅️ **ACHIEVABLE** | Beta: Q1 2026 *(revised from Q2)*
**Alpha: November 2025** **ACHIEVABLE** | Beta: Q1 2026 *(revised from Q2)*

---
98
README.md
@@ -9,7 +9,7 @@
<a href="https://spacedrive.com"><strong>spacedrive.com »</strong></a>
<br />
<br />
🚀 <strong>Development resuming with revolutionary new architecture</strong> 🚀
<strong>Development resuming with revolutionary new architecture</strong> 🚀
<br />
<br />
<b>Status:</b> Core rewrite in progress ·
@@ -63,25 +63,25 @@ The original Spacedrive captured imaginations with a bold promise: the **Virtual

Your files are scattered across devices, cloud services, and external drives. Traditional file managers trap you in local boundaries. Spacedrive makes those boundaries disappear:

**🌐 Universal File Access**
**Universal File Access**

- Browse files on any device from any device
- External drives, cloud storage, remote servers - all unified
- Offline files show up with cached metadata

**⚡ Lightning Search**
**Lightning Search**

- Find files across all locations with a single search
- Content search inside documents, PDFs, and media
- AI-powered semantic search: "find sunset photos from vacation"

**🔄 Seamless Operations**
**Seamless Operations**

- Copy, move, and organize files between any devices
- Drag and drop across device boundaries
- Batch operations on distributed collections

**🔒 Privacy First**
**Privacy First**

- Your data stays on your devices
- Optional cloud sync, never required
@@ -113,13 +113,13 @@ We kept the revolutionary vision. We rebuilt the foundation to deliver it.

```
┌─ Spacedrive ──────────────────────────────────────────┐
│ ≡ Locations 📱 iPhone (via P2P) │
│ 📁 Desktop 📁 Photos (1,234 items) │
│ 📁 Documents 📁 Documents │
│ 📁 Downloads 🔗 iCloud Drive │
│ 💾 External Drive 📱 iPad │
│ ☁️ iCloud Drive 📱 Android Phone │
│ 🖥️ Server ⚙️ Background indexing... │
│ ≡ Locations iPhone (via P2P) │
│ Desktop Photos (1,234 items) │
│ Documents Documents │
│ Downloads iCloud Drive │
│ External Drive iPad │
│ iCloud Drive Android Phone │
│ Server Background indexing... │
└───────────────────────────────────────────────────────┘
```

@@ -184,12 +184,12 @@ No more confusion between "indexed" and "direct" files. Every file operation wor
### Real Search Engine

```
🔍 Search: "sunset photos from vacation"
Search: "sunset photos from vacation"

Results across all devices:
📱 iPhone/Photos/Vacation2024/sunset_beach.jpg
💾 External/Backup/2024/vacation_sunset.mov
☁️ iCloud/Memories/golden_hour_sunset.heic
iPhone/Photos/Vacation2024/sunset_beach.jpg
External/Backup/2024/vacation_sunset.mov
iCloud/Memories/golden_hour_sunset.heic
```

**Beyond filename matching:**
@@ -203,31 +203,31 @@ Results across all devices:

### Q1 2025: Foundation

- ✅ **Core rewrite** with unified file system
- ✅ **Working CLI** with daemon architecture
- 🚧 **Desktop app** rebuilt on new foundation
- 🚧 **Real search** with content indexing
- **Core rewrite** with unified file system
- **Working CLI** with daemon architecture
- **Desktop app** rebuilt on new foundation
- **Real search** with content indexing

### Q2 2025: Device Communication

- 🔄 **P2P discovery** and secure connections
- 🔄 **Cross-device operations** (copy, move, sync)
- 🔄 **Mobile apps** with desktop feature parity
- 🔄 **Web interface** for universal access
- **P2P discovery** and secure connections
- **Cross-device operations** (copy, move, sync)
- **Mobile apps** with desktop feature parity
- **Web interface** for universal access

### Q3 2025: Intelligence

- 🎯 **AI-powered organization** with local models
- 🎯 **Smart collections** and auto-tagging
- 🎯 **Cloud integrations** (iCloud, Google Drive, etc.)
- 🎯 **Advanced media analysis**
- **AI-powered organization** with local models
- **Smart collections** and auto-tagging
- **Cloud integrations** (iCloud, Google Drive, etc.)
- **Advanced media analysis**

### Q4 2025: Ecosystem

- 🚀 **Extension system** for community features
- 🚀 **Professional tools** for creators and teams
- 🚀 **Enterprise features** and compliance
- 🚀 **Plugin marketplace** and developer APIs
- **Extension system** for community features
- **Professional tools** for creators and teams
- **Enterprise features** and compliance
- **Plugin marketplace** and developer APIs

## Try It Today

@@ -254,11 +254,11 @@ spacedrive job monitor

**Working today:**

- ✅ Multi-location management
- ✅ Smart indexing with progress tracking
- ✅ Content-aware search
- ✅ Real-time job monitoring
- ✅ Portable library format
- Multi-location management
- Smart indexing with progress tracking
- Content-aware search
- Real-time job monitoring
- Portable library format

## Sustainable Open Source

@@ -315,24 +315,24 @@ No more feature paralysis:

### For Users

- ⭐ **Star the repo** to follow development
- 💬 **Join Discord** for updates and early access
- 🐛 **Report issues** and request features
- 📖 **Beta testing** as features ship
- **Star the repo** to follow development
- **Join Discord** for updates and early access
- **Report issues** and request features
- **Beta testing** as features ship

### For Developers

- 🔧 **Contribute code** to the core rewrite
- 📚 **Improve docs** and tutorials
- 🧪 **Write tests** and benchmarks
- 🎨 **Design interfaces** for new features
- **Contribute code** to the core rewrite
- **Improve docs** and tutorials
- **Write tests** and benchmarks
- **Design interfaces** for new features

### For Organizations

- 💼 **Early access** to enterprise features
- 🤝 **Partnership** opportunities
- 💰 **Sponsorship** and development funding
- 🎯 **Custom development** services
- **Early access** to enterprise features
- **Partnership** opportunities
- **Sponsorship** and development funding
- **Custom development** services

## The Return

@@ -127,10 +127,10 @@ pub async fn run(ctx: &Context, cmd: IndexCmd) -> Result<()> {
|
||||
println!("╠══════════════════════════════════════════════════════════════╣");
|
||||
|
||||
if result.is_valid {
|
||||
println!("║ ✅ STATUS: VALID - Index matches filesystem perfectly! ║");
|
||||
println!("║ STATUS: VALID - Index matches filesystem perfectly! ║");
|
||||
} else {
|
||||
println!(
|
||||
"║ ❌ STATUS: DIVERGED - {} issues found {:24} ║",
|
||||
"║ STATUS: DIVERGED - {} issues found {:24} ║",
|
||||
report.total_issues(),
|
||||
""
|
||||
);
|
||||
@@ -140,7 +140,7 @@ pub async fn run(ctx: &Context, cmd: IndexCmd) -> Result<()> {
|
||||
|
||||
if !report.missing_from_index.is_empty() {
|
||||
println!(
|
||||
"║ ⚠️ Missing from index: {} {:33} ║",
|
||||
"║ ️ Missing from index: {} {:33} ║",
|
||||
report.missing_from_index.len(),
|
||||
""
|
||||
);
|
||||
@@ -168,7 +168,7 @@ pub async fn run(ctx: &Context, cmd: IndexCmd) -> Result<()> {
|
||||
|
||||
if !report.stale_in_index.is_empty() {
|
||||
println!(
|
||||
"║ 🗑️ Stale in index: {} {:36} ║",
|
||||
"║ ️ Stale in index: {} {:36} ║",
|
||||
report.stale_in_index.len(),
|
||||
""
|
||||
);
|
||||
@@ -196,7 +196,7 @@ pub async fn run(ctx: &Context, cmd: IndexCmd) -> Result<()> {
|
||||
|
||||
if !report.metadata_mismatches.is_empty() {
|
||||
println!(
|
||||
"║ ⚙️ Metadata mismatches: {} {:31} ║",
|
||||
"║ ️ Metadata mismatches: {} {:31} ║",
|
||||
report.metadata_mismatches.len(),
|
||||
""
|
||||
);
|
||||
@@ -215,7 +215,7 @@ pub async fn run(ctx: &Context, cmd: IndexCmd) -> Result<()> {
|
||||
|
||||
if !report.hierarchy_errors.is_empty() {
|
||||
println!(
|
||||
"║ 🌳 Hierarchy errors: {} {:34} ║",
|
||||
"║ Hierarchy errors: {} {:34} ║",
|
||||
report.hierarchy_errors.len(),
|
||||
""
|
||||
);
|
||||
@@ -225,7 +225,7 @@ pub async fn run(ctx: &Context, cmd: IndexCmd) -> Result<()> {
|
||||
println!("╠══════════════════════════════════════════════════════════════╣");
|
||||
println!(
|
||||
"║ {}{:59} ║",
|
||||
if result.is_valid { "✅ " } else { "❌ " },
|
||||
if result.is_valid { "" } else { "" },
|
||||
report.summary.chars().take(59).collect::<String>()
|
||||
);
|
||||
println!("╚══════════════════════════════════════════════════════════════╝\n");
|
||||
|
||||
@@ -302,9 +302,9 @@ async fn run_simple_job_monitor(ctx: &Context, args: JobMonitorArgs) -> Result<(
|
||||
|
||||
Event::JobResumed { job_id } => {
|
||||
if let Some(pb) = progress_bars.get(&job_id) {
|
||||
pb.set_message(format!("▶️ Job resumed [{}]", &job_id[..8]));
|
||||
pb.set_message(format!("️ Job resumed [{}]", &job_id[..8]));
|
||||
}
|
||||
println!("▶️ Job resumed: [{}]", &job_id[..8]);
|
||||
println!("️ Job resumed: [{}]", &job_id[..8]);
|
||||
}
|
||||
|
||||
_ => {} // Ignore other events
|
||||
|
||||
@@ -78,7 +78,7 @@ pub async fn run(ctx: &Context, cmd: NetworkCmd) -> Result<()> {
 			let out: PairGenerateOutput = execute_action!(ctx, input);
 			print_output!(ctx, &out, |o: &PairGenerateOutput| {
 				// Show QR code for remote pairing (includes NodeId and relay URL)
-				println!("📱 Scan this QR code with your mobile app for remote pairing:");
+				println!("Scan this QR code with your mobile app for remote pairing:");
 				println!("┌─────────────────────────────────────────────────────────┐");
 				if let Err(e) = qr2term::print_qr(&o.qr_json) {
 					println!("Failed to generate QR code: {}", e);
@@ -87,12 +87,12 @@ pub async fn run(ctx: &Context, cmd: NetworkCmd) -> Result<()> {
 				println!();

 				// Show raw QR JSON for debugging
-				println!("🔍 QR Code JSON (for debugging):");
+				println!("QR Code JSON (for debugging):");
 				println!("  {}", o.qr_json);
 				println!();

 				// Also show words for manual entry (local pairing)
-				println!("💬 Or type these words manually for local pairing:");
+				println!("Or type these words manually for local pairing:");
 				println!("  {}", o.code);

 				println!();
@@ -155,9 +155,9 @@ pub async fn run(ctx: &Context, cmd: NetworkCmd) -> Result<()> {
 			println!(
 				"  Status: {}",
 				if device.is_connected {
-					"🟢 Connected"
+					"Connected"
 				} else {
-					"⚪ Paired"
+					"Paired"
 				}
 			);
 			println!(
@@ -33,12 +33,12 @@ pub fn job_status_color(status: JobStatus) -> Color {
 /// Get status icon for job
 pub fn job_status_icon(status: JobStatus) -> &'static str {
 	match status {
-		JobStatus::Queued => "⏳",
-		JobStatus::Running => "⚡",
-		JobStatus::Paused => "⏸️",
-		JobStatus::Completed => "✅",
-		JobStatus::Failed => "❌",
-		JobStatus::Cancelled => "🚫",
+		JobStatus::Queued => "",
+		JobStatus::Running => "",
+		JobStatus::Paused => "",
+		JobStatus::Completed => "",
+		JobStatus::Failed => "",
+		JobStatus::Cancelled => "",
 	}
 }
@@ -1,7 +0,0 @@
-module.exports = {
-	extends: [require.resolve('@sd/config/eslint/web.js')],
-	parserOptions: {
-		tsconfigRootDir: __dirname,
-		project: './tsconfig.json'
-	}
-};
Binary image removed (125 KiB)
@@ -1,17 +0,0 @@
-[package]
-name = "sd-desktop-linux"
-version = "0.1.0"
-
-edition.workspace = true
-license.workspace = true
-repository.workspace = true
-
-[dependencies]
-libc = { workspace = true }
-tokio = { workspace = true, features = ["fs"] }
-
-[target.'cfg(target_os = "linux")'.dependencies]
-wgpu = { version = "22.1", default-features = false }
-# WARNING: gtk should follow the same version used by tauri
-# https://github.com/tauri-apps/tauri/blob/tauri-v2.0.0/crates/tauri/Cargo.toml#L100
-gtk = { version = "0.18", features = ["v3_24"] }
@@ -1,9 +0,0 @@
-# Linux crate
-
-For some OS specific operations
-
-> The code for parsing Desktop Entries and finding which programs handle a certain mime-type is based on:
->
-> https://github.com/chmln/handlr (MIT)
->
-> thanks @chmln
@@ -1,98 +0,0 @@
-use std::path::Path;
-
-use gtk::{
-	gio::{
-		content_type_guess, prelude::AppInfoExt, prelude::FileExt, AppInfo, AppLaunchContext,
-		DesktopAppInfo, File as GioFile, ResourceError,
-	},
-	glib::error::Error as GlibError,
-};
-use tokio::fs::File;
-use tokio::io::AsyncReadExt;
-
-thread_local! {
-	static LAUNCH_CTX: AppLaunchContext = {
-		// TODO: Display supports requires GDK, which can only run on the main thread
-		// let ctx = Display::default()
-		// 	.and_then(|display| display.app_launch_context())
-		// 	.map(|display| display.to_value().get::<AppLaunchContext>().expect(
-		// 		"This is an Glib type conversion, it should never fail because GDKAppLaunchContext is a subclass of AppLaunchContext"
-		// 	)).unwrap_or_default();
-
-		AppLaunchContext::default()
-	}
-}
-
-pub struct App {
-	pub id: String,
-	pub name: String,
-	// pub icon: Option<Vec<u8>>,
-}
-
-async fn recommended_for_type(file_path: impl AsRef<Path>) -> Vec<AppInfo> {
-	let data = if let Ok(mut file) = File::open(&file_path).await {
-		let mut data = [0; 1024];
-		if file.read_exact(&mut data).await.is_ok() {
-			Some(data)
-		} else {
-			None
-		}
-	} else {
-		None
-	};
-
-	let file_path = Some(file_path);
-	let (content_type, uncertain) = if let Some(data) = data {
-		content_type_guess(file_path, &data)
-	} else {
-		content_type_guess(file_path, &[])
-	};
-
-	if uncertain {
-		vec![]
-	} else {
-		AppInfo::recommended_for_type(content_type.as_str())
-	}
-}
-
-pub async fn list_apps_associated_with_ext(file_path: impl AsRef<Path>) -> Vec<App> {
-	recommended_for_type(file_path)
-		.await
-		.iter()
-		.flat_map(|app_info| {
-			app_info.id().map(|id| App {
-				id: id.to_string(),
-				name: app_info.name().to_string(),
-				// TODO: Icon supports requires GTK, which can only run on the main thread
-				// icon: app_info
-				// 	.icon()
-				// 	.and_then(|icon| {
-				// 		IconTheme::default().and_then(|icon_theme| {
-				// 			icon_theme.lookup_by_gicon(&icon, 128, IconLookupFlags::empty())
-				// 		})
-				// 	})
-				// 	.and_then(|icon_info| icon_info.load_icon().ok())
-				// 	.and_then(|pixbuf| pixbuf.save_to_bufferv("png", &[]).ok()),
-			})
-		})
-		.collect()
-}
-
-pub fn open_files_path_with(file_paths: &[impl AsRef<Path>], id: &str) -> Result<(), GlibError> {
-	let Some(app) = DesktopAppInfo::new(id) else {
-		return Err(GlibError::new(ResourceError::NotFound, "App not found"));
-	};
-
-	LAUNCH_CTX.with(|ctx| {
-		app.launch(
-			&file_paths.iter().map(GioFile::for_path).collect::<Vec<_>>(),
-			Some(ctx),
-		)
-	})
-}
-
-pub fn open_file_path(file_path: impl AsRef<Path>) -> Result<(), GlibError> {
-	let file_uri = GioFile::for_path(file_path).uri().to_string();
-	LAUNCH_CTX.with(|ctx| AppInfo::launch_default_for_uri(&file_uri.to_string(), Some(ctx)))
-}
@@ -1,233 +0,0 @@
-use std::{
-	collections::HashSet,
-	env,
-	ffi::{CStr, OsStr},
-	mem,
-	os::unix::ffi::OsStrExt,
-	path::PathBuf,
-	ptr,
-};
-
-pub fn get_current_user_home() -> Option<PathBuf> {
-	use libc::{getpwuid_r, getuid, passwd, ERANGE};
-
-	if let Some(home) = env::var_os("HOME") {
-		let home = PathBuf::from(home);
-		if home.is_absolute() && home.is_dir() {
-			return Some(home);
-		}
-	}
-
-	let uid = unsafe { getuid() };
-	let mut buf = vec![0; 2048];
-	let mut passwd = unsafe { mem::zeroed::<passwd>() };
-	let mut result = ptr::null_mut::<passwd>();
-
-	loop {
-		let r = unsafe { getpwuid_r(uid, &mut passwd, buf.as_mut_ptr(), buf.len(), &mut result) };
-
-		if r != ERANGE {
-			break;
-		}
-
-		let newsize = buf.len().checked_mul(2)?;
-		buf.resize(newsize, 0);
-	}
-
-	if result.is_null() {
-		// There is no such user, or an error has occurred.
-		// errno gets set if there’s an error.
-		return None;
-	}
-
-	if result != &mut passwd {
-		// The result of getpwuid_r should be its input passwd.
-		return None;
-	}
-
-	let passwd: passwd = unsafe { result.read() };
-	if passwd.pw_dir.is_null() {
-		return None;
-	}
-
-	let home = PathBuf::from(OsStr::from_bytes(
-		unsafe { CStr::from_ptr(passwd.pw_dir) }.to_bytes(),
-	));
-	if home.is_absolute() && home.is_dir() {
-		env::set_var("HOME", &home);
-		Some(home)
-	} else {
-		None
-	}
-}
-
-fn normalize_pathlist(
-	env_name: &str,
-	default_dirs: &[PathBuf],
-) -> Result<Vec<PathBuf>, env::JoinPathsError> {
-	let dirs = if let Some(value) = env::var_os(env_name) {
-		let mut dirs = env::split_paths(&value)
-			.filter(|entry| !entry.as_os_str().is_empty())
-			.collect::<Vec<_>>();
-
-		let mut insert_index = dirs.len();
-		for default_dir in default_dirs {
-			match dirs.iter().rev().position(|dir| dir == default_dir) {
-				Some(mut index) => {
-					index = dirs.len() - index - 1;
-					if index < insert_index {
-						insert_index = index
-					}
-				}
-				None => dirs.insert(insert_index, default_dir.to_path_buf()),
-			}
-		}
-
-		dirs
-	} else {
-		default_dirs.into()
-	};
-
-	let mut unique = HashSet::new();
-	let mut pathlist = dirs
-		.iter()
-		.rev() // Reverse order to remove duplicates from the end
-		.filter(|dir| unique.insert(*dir))
-		.cloned()
-		.collect::<Vec<_>>();
-
-	pathlist.reverse();
-
-	env::set_var(env_name, env::join_paths(&pathlist)?);
-
-	Ok(pathlist)
-}
-
-fn normalize_xdg_environment(name: &str, default_value: PathBuf) -> PathBuf {
-	if let Some(value) = env::var_os(name) {
-		if !value.is_empty() {
-			let path = PathBuf::from(value);
-			if path.is_absolute() && path.is_dir() {
-				return path;
-			}
-		}
-	}
-
-	env::set_var(name, &default_value);
-	default_value
-}
-
-pub fn normalize_environment() {
-	let home = get_current_user_home().expect("No user home directory found");
-
-	// Normalize user XDG dirs environment variables
-	// https://specifications.freedesktop.org/basedir-spec/basedir-spec-latest.html
-	let data_home = normalize_xdg_environment("XDG_DATA_HOME", home.join(".local/share"));
-	normalize_xdg_environment("XDG_CACHE_HOME", home.join(".cache"));
-	normalize_xdg_environment("XDG_CONFIG_HOME", home.join(".config"));
-
-	// Normalize system XDG dirs environment variables
-	// https://specifications.freedesktop.org/basedir-spec/basedir-spec-latest.html
-	normalize_pathlist(
-		"XDG_DATA_DIRS",
-		&[
-			PathBuf::from("/usr/share"),
-			PathBuf::from("/usr/local/share"),
-			PathBuf::from("/var/lib/flatpak/exports/share"),
-			data_home.join("flatpak/exports/share"),
-		],
-	)
-	.expect("XDG_DATA_DIRS must be successfully normalized");
-	normalize_pathlist("XDG_CONFIG_DIRS", &[PathBuf::from("/etc/xdg")])
-		.expect("XDG_CONFIG_DIRS must be successfully normalized");
-
-	// Normalize GStreamer plugin path
-	// https://gstreamer.freedesktop.org/documentation/gstreamer/gstregistry.html#gstregistry-page
-	normalize_pathlist(
-		"GST_PLUGIN_SYSTEM_PATH",
-		&[
-			PathBuf::from("/usr/lib/gstreamer"),
-			data_home.join("gstreamer/plugins"),
-		],
-	)
-	.expect("GST_PLUGIN_SYSTEM_PATH must be successfully normalized");
-	normalize_pathlist(
-		"GST_PLUGIN_SYSTEM_PATH_1_0",
-		&[
-			PathBuf::from("/usr/lib/gstreamer-1.0"),
-			data_home.join("gstreamer-1.0/plugins"),
-		],
-	)
-	.expect("GST_PLUGIN_SYSTEM_PATH_1_0 must be successfully normalized");
-
-	// Normalize PATH
-	normalize_pathlist(
-		"PATH",
-		&[
-			PathBuf::from("/sbin"),
-			PathBuf::from("/bin"),
-			PathBuf::from("/usr/sbin"),
-			PathBuf::from("/usr/bin"),
-			PathBuf::from("/usr/local/sbin"),
-			PathBuf::from("/usr/local/bin"),
-			PathBuf::from("/var/lib/flatpak/exports/bin"),
-			data_home.join("flatpak/exports/bin"),
-		],
-	)
-	.expect("PATH must be successfully normalized");
-
-	if has_nvidia() {
-		// Workaround for: https://github.com/tauri-apps/tauri/issues/9304
-		env::set_var("WEBKIT_DISABLE_DMABUF_RENDERER", "1");
-	}
-}
-
-// Check if snap by looking if SNAP is set and not empty and that the SNAP directory exists
-pub fn is_snap() -> bool {
-	if let Some(snap) = std::env::var_os("SNAP") {
-		if !snap.is_empty() && PathBuf::from(snap).is_dir() {
-			return true;
-		}
-	}
-
-	false
-}
-
-// Check if flatpak by looking if FLATPAK_ID is set and not empty and that the .flatpak-info file exists
-pub fn is_flatpak() -> bool {
-	if let Some(flatpak_id) = std::env::var_os("FLATPAK_ID") {
-		if !flatpak_id.is_empty() && PathBuf::from("/.flatpak-info").is_file() {
-			return true;
-		}
-	}
-
-	false
-}
-
-fn has_nvidia() -> bool {
-	use wgpu::{
-		Backends, DeviceType, Dx12Compiler, Gles3MinorVersion, Instance, InstanceDescriptor,
-		InstanceFlags,
-	};
-
-	let instance = Instance::new(InstanceDescriptor {
-		flags: InstanceFlags::empty(),
-		backends: Backends::VULKAN | Backends::GL,
-		gles_minor_version: Gles3MinorVersion::Automatic,
-		dx12_shader_compiler: Dx12Compiler::default(),
-	});
-	for adapter in instance.enumerate_adapters(Backends::all()) {
-		let info = adapter.get_info();
-		match info.device_type {
-			DeviceType::DiscreteGpu | DeviceType::IntegratedGpu | DeviceType::VirtualGpu => {
-				// Nvidia PCI id
-				if info.vendor == 0x10de {
-					return true;
-				}
-			}
-			_ => {}
-		}
-	}
-
-	false
-}
@@ -1,7 +0,0 @@
-#![cfg(target_os = "linux")]
-
-mod app_info;
-mod env;
-
-pub use app_info::{list_apps_associated_with_ext, open_file_path, open_files_path_with};
-pub use env::{get_current_user_home, is_flatpak, is_snap, normalize_environment};
@@ -1,13 +0,0 @@
-[package]
-name = "sd-desktop-macos"
-version = "0.1.0"
-
-edition.workspace = true
-license.workspace = true
-repository.workspace = true
-
-[target.'cfg(target_os = "macos")'.dependencies]
-swift-rs = { version = "1.0", features = ["serde"] }
-
-[target.'cfg(target_os = "macos")'.build-dependencies]
-swift-rs = { version = "1.0", features = ["build"] }
@@ -1,16 +0,0 @@
-{
-	"object": {
-		"pins": [
-			{
-				"package": "SwiftRs",
-				"repositoryURL": "https://github.com/brendonovich/swift-rs",
-				"state": {
-					"branch": "specta",
-					"revision": "dbefee04115083ad283d1640cdceca3036c41042",
-					"version": null
-				}
-			}
-		]
-	},
-	"version": 1
-}
@@ -1,33 +0,0 @@
-// swift-tools-version: 5.5
-// The swift-tools-version declares the minimum version of Swift required to build this package.
-
-import PackageDescription
-
-let package = Package(
-	name: "sd-desktop-macos",
-	platforms: [
-		.macOS(.v10_15), // macOS Catalina. Earliest version that is officially supported by Apple.
-	],
-	products: [
-		// Products define the executables and libraries a package produces, and make them visible to other packages.
-		.library(
-			name: "sd-desktop-macos",
-			type: .static,
-			targets: ["sd-desktop-macos"]
-		),
-	],
-	dependencies: [
-		// Dependencies declare other packages that this package depends on.
-		.package(url: "https://github.com/brendonovich/swift-rs", branch: "specta"),
-	],
-	targets: [
-		// Targets are the basic building blocks of a package. A target can define a module or a test suite.
-		// Targets can depend on other targets in this package, and on products in packages this package depends on.
-		.target(
-			name: "sd-desktop-macos",
-			dependencies: [
-				.product(name: "SwiftRs", package: "swift-rs") ],
-			path: "src-swift"
-		),
-	]
-)
@@ -1,14 +0,0 @@
-#[cfg(target_os = "macos")]
-use std::env;
-
-fn main() {
-	#[cfg(target_os = "macos")]
-	{
-		let deployment_target =
-			env::var("MACOSX_DEPLOYMENT_TARGET").unwrap_or_else(|_| String::from("10.15"));
-
-		swift_rs::SwiftLinker::new(deployment_target.as_str())
-			.with_package("sd-desktop-macos", "./")
-			.link();
-	}
-}
@@ -1,120 +0,0 @@
-import AppKit
-import SwiftRs
-
-extension NSBitmapImageRep {
-	var png: Data? { representation(using: .png, properties: [:]) }
-}
-
-extension Data {
-	var bitmap: NSBitmapImageRep? { NSBitmapImageRep(data: self) }
-}
-
-extension NSImage {
-	var png: Data? { tiffRepresentation?.bitmap?.png }
-}
-
-class OpenWithApplication: NSObject {
-	var name: SRString
-	var id: SRString
-	var url: SRString
-	var icon: SRData
-
-	init(name: SRString, id: SRString, url: SRString, icon: SRData) {
-		self.name = name
-		self.id = id
-		self.url = url
-		self.icon = icon
-	}
-}
-
-@_cdecl("get_open_with_applications")
-func getOpenWithApplications(urlString: SRString) -> SRObjectArray {
-	let url: URL
-	if #available(macOS 13.0, *) {
-		url = URL(filePath: urlString.toString())
-	} else {
-		// Fallback on earlier versions
-		url = URL(fileURLWithPath: urlString.toString())
-	}
-
-	let appURLs: [URL]
-	if #available(macOS 12.0, *) {
-		appURLs = NSWorkspace.shared.urlsForApplications(toOpen: url)
-	} else {
-		// Fallback for macOS versions prior to 12
-
-		// Get type identifier from file URL
-		let fileType: String
-		if #available(macOS 11.0, *) {
-			guard let _fileType = (try? url.resourceValues(forKeys: [.typeIdentifierKey]))?.typeIdentifier
-			else {
-				print("Failed to fetch file type for the specified file URL")
-				return SRObjectArray([])
-			}
-
-			fileType = _fileType
-		} else {
-			// Fallback for macOS versions prior to 11
-			guard
-				let _fileType = UTTypeCreatePreferredIdentifierForTag(
-					kUTTagClassFilenameExtension, url.pathExtension as CFString, nil)?.takeRetainedValue()
-			else {
-				print("Failed to fetch file type for the specified file URL")
-				return SRObjectArray([])
-			}
-			fileType = _fileType as String
-		}
-
-		// Locates an array of bundle identifiers for apps capable of handling a specified content type with the specified roles.
-		guard
-			let bundleIds = LSCopyAllRoleHandlersForContentType(fileType as CFString, LSRolesMask.all)?
-				.takeRetainedValue() as? [String]
-		else {
-			print("Failed to fetch bundle IDs for the specified file type")
-			return SRObjectArray([])
-		}
-
-		// Retrieve all URLs for the app identified by a bundle id
-		appURLs = bundleIds.compactMap { bundleId -> URL? in
-			guard let retVal = LSCopyApplicationURLsForBundleIdentifier(bundleId as CFString, nil) else {
-				return nil
-			}
-			return retVal.takeRetainedValue() as? URL
-		}
-	}
-
-	return SRObjectArray(
-		appURLs.compactMap { url -> NSObject? in
-			guard url.path.contains("/Applications/"),
-				let infoDict = Bundle(url: url)?.infoDictionary,
-				let name = (infoDict["CFBundleDisplayName"] ?? infoDict["CFBundleName"]) as? String,
-				let appId = infoDict["CFBundleIdentifier"] as? String
-			else {
-				return nil
-			}
-
-			let icon = NSWorkspace.shared.icon(forFile: url.path)
-
-			return OpenWithApplication(
-				name: SRString(name),
-				id: SRString(appId),
-				url: SRString(url.path),
-				icon: SRData([UInt8](icon.png ?? Data()))
-			)
-		})
-}
-
-@_cdecl("open_file_path_with")
-func openFilePathsWith(filePath: SRString, withUrl: SRString) {
-	let config = NSWorkspace.OpenConfiguration()
-	let at = URL(fileURLWithPath: withUrl.toString())
-
-	// FIX-ME(HACK): The NULL split here is because I was not able to make this function accept a SRArray<SRString> argument.
-	// So, considering these are file paths, and \0 is not a valid character for a file path,
-	// I am using it as a delimitor to allow the rust side to pass in an array of files paths to this function
-	let fileURLs = filePath.toString().split(separator: "\0").map {
-		filePath in URL(fileURLWithPath: String(filePath))
-	}
-
-	NSWorkspace.shared.open(fileURLs, withApplicationAt: at, configuration: config)
-}
@@ -1,8 +0,0 @@
-import WebKit
-
-@_cdecl("reload_webview")
-public func reloadWebview(webview: WKWebView) -> () {
-	webview.window!.orderOut(webview);
-	webview.reload();
-	webview.window!.makeKey();
-}
@@ -1,98 +0,0 @@
-import AppKit
-import SwiftRs
-
-@objc
-public enum AppThemeType: Int {
-	case auto = -1
-	case light = 0
-	case dark = 1
-}
-
-private let activityLock = NSLock()
-private var activity: NSObjectProtocol?
-private var isThemeUpdating = false
-
-@_cdecl("disable_app_nap")
-public func disableAppNap(reason: SRString) -> Bool {
-	activityLock.lock()
-	defer { activityLock.unlock() }
-
-	guard activity == nil else {
-		return false
-	}
-
-	activity = ProcessInfo.processInfo.beginActivity(
-		options: .userInitiatedAllowingIdleSystemSleep,
-		reason: reason.toString()
-	)
-	return true
-}
-
-@_cdecl("enable_app_nap")
-public func enableAppNap() -> Bool {
-	activityLock.lock()
-	defer { activityLock.unlock() }
-
-	guard let currentActivity = activity else {
-		return false
-	}
-
-	ProcessInfo.processInfo.endActivity(currentActivity)
-	activity = nil
-	return true
-}
-
-@_cdecl("lock_app_theme")
-public func lockAppTheme(themeType: AppThemeType) {
-	// Prevent concurrent theme updates
-	guard !isThemeUpdating else {
-		return
-	}
-
-	isThemeUpdating = true
-
-	let theme: NSAppearance?
-	switch themeType {
-	case .auto:
-		theme = nil
-	case .dark:
-		theme = NSAppearance(named: .darkAqua)
-	case .light:
-		theme = NSAppearance(named: .aqua)
-	}
-
-	// Use sync to ensure completion before return
-	DispatchQueue.main.sync {
-		autoreleasepool {
-			NSApp.appearance = theme
-
-			if let window = NSApplication.shared.mainWindow {
-				NSAnimationContext.runAnimationGroup({ context in
-					context.duration = 0
-					window.invalidateShadow()
-					window.displayIfNeeded()
-				}, completionHandler: {
-					isThemeUpdating = false
-				})
-			} else {
-				isThemeUpdating = false
-			}
-		}
-	}
-}
-
-@_cdecl("set_titlebar_style")
-public func setTitlebarStyle(window: NSWindow, fullScreen: Bool) {
-	// this results in far less visual artifacts if we just manage it ourselves (the native taskbar re-appears when fullscreening/un-fullscreening)
-	window.titlebarAppearsTransparent = true
-	if fullScreen { // fullscreen, give control back to the native OS
-		window.toolbar = nil
-	} else { // non-fullscreen
-		// here we create a uniquely identifiable invisible toolbar in order to correctly pad out the traffic lights
-		// this MUST be hidden while fullscreen as macos has a unique dropdown bar for that, and it's far easier to just let it do its thing
-		let toolbar = NSToolbar(identifier: "window_invisible_toolbar")
-		toolbar.showsBaselineSeparator = false
-		window.toolbar = toolbar
-	}
-	window.titleVisibility = fullScreen ? .visible : .hidden
-}
@@ -1,32 +0,0 @@
-#![cfg(target_os = "macos")]
-
-use swift_rs::{swift, Bool, Int, SRData, SRObjectArray, SRString};
-
-pub type NSObject = *mut std::ffi::c_void;
-
-pub enum AppThemeType {
-	Light = 0 as Int,
-	Dark = 1 as Int,
-}
-
-swift!(pub fn disable_app_nap(reason: &SRString) -> Bool);
-swift!(pub fn enable_app_nap() -> Bool);
-swift!(pub fn lock_app_theme(theme_type: Int));
-swift!(pub fn set_titlebar_style(window: &NSObject, is_fullscreen: Bool));
-swift!(pub fn reload_webview(webview: &NSObject));
-
-#[repr(C)]
-pub struct OpenWithApplication {
-	pub name: SRString,
-	pub id: SRString,
-	pub url: SRString,
-	pub icon: SRData,
-}
-
-swift!(pub fn get_open_with_applications(url: &SRString) -> SRObjectArray<OpenWithApplication>);
-swift!(pub(crate) fn open_file_path_with(file_url: &SRString, with_url: &SRString));
-
-pub fn open_file_paths_with(file_urls: &[String], with_url: &str) {
-	let file_url = file_urls.join("\0");
-	unsafe { open_file_path_with(&file_url.as_str().into(), &with_url.into()) }
-}
@@ -1,15 +0,0 @@
-[package]
-name = "sd-desktop-windows"
-version = "0.1.0"
-
-edition.workspace = true
-license.workspace = true
-repository.workspace = true
-
-[dependencies]
-libc = { workspace = true }
-normpath = { workspace = true }
-
-[target.'cfg(target_os = "windows")'.dependencies.windows]
-features = ["Win32_Foundation", "Win32_System_Com", "Win32_UI_Shell"]
-version = "0.58"
@@ -1,128 +0,0 @@
-#![cfg(target_os = "windows")]
-
-use std::{
-	ffi::{OsStr, OsString},
-	os::windows::ffi::OsStrExt,
-	path::Path,
-};
-
-use normpath::PathExt;
-use windows::{
-	core::{HSTRING, PCWSTR},
-	Win32::{
-		Foundation::E_FAIL,
-		System::Com::{
-			CoInitializeEx, CoUninitialize, IDataObject, COINIT_APARTMENTTHREADED,
-			COINIT_DISABLE_OLE1DDE,
-		},
-		UI::Shell::{
-			BHID_DataObject, IAssocHandler, IShellItem, SHAssocEnumHandlers,
-			SHCreateItemFromParsingName, ASSOC_FILTER_RECOMMENDED,
-		},
-	},
-};
-
-pub use windows::core::{Error, Result};
-
-// Based on: https://github.com/Byron/trash-rs/blob/841bc1388959ab3be4f05ad1a90b03aa6bcaea67/src/windows.rs#L212-L258
-struct CoInitializer {}
-impl CoInitializer {
-	fn new() -> CoInitializer {
-		let hr = unsafe { CoInitializeEx(None, COINIT_APARTMENTTHREADED | COINIT_DISABLE_OLE1DDE) };
-		if hr.is_err() {
-			panic!("Call to CoInitializeEx failed. HRESULT: {:?}.", hr);
-		}
-		CoInitializer {}
-	}
-}
-
-thread_local! {
-	static CO_INITIALIZER: CoInitializer = {
-		unsafe { libc::atexit(atexit_handler) };
-		CoInitializer::new()
-	};
-}
-
-extern "C" fn atexit_handler() {
-	unsafe {
-		CoUninitialize();
-	}
-}
-
-fn ensure_com_initialized() {
-	CO_INITIALIZER.with(|_| {});
-}
-
-// Use SHAssocEnumHandlers to get the list of apps associated with a file extension.
-// https://learn.microsoft.com/en-us/windows/win32/api/shobjidl_core/nf-shobjidl_core-shassocenumhandlers
-pub fn list_apps_associated_with_ext(ext: &OsStr) -> Result<Vec<IAssocHandler>> {
-	if ext.is_empty() {
-		return Ok(Vec::new());
-	}
-
-	// SHAssocEnumHandlers requires the extension to be prefixed with a dot
-	let ext = {
-		// Get first charact from ext
-		let ext_bytes = ext.encode_wide().collect::<Vec<_>>();
-		if ext_bytes[0] != '.' as u16 {
-			let mut prefixed_ext = OsString::from(".");
-			prefixed_ext.push(ext);
-			prefixed_ext
-		} else {
-			ext.to_os_string()
-		}
-	};
-
-	let assoc_handlers =
-		unsafe { SHAssocEnumHandlers(&HSTRING::from(ext), ASSOC_FILTER_RECOMMENDED) }?;
-
-	let mut vec = Vec::new();
-	loop {
-		let mut rgelt = [None; 1];
-		let mut pceltfetched = 0;
-		unsafe { assoc_handlers.Next(&mut rgelt, Some(&mut pceltfetched)) }?;
-
-		if pceltfetched == 0 {
-			break;
-		}
-
-		if let [Some(handler)] = rgelt {
-			vec.push(handler);
-		}
-	}
-
-	Ok(vec)
-}
-
-pub fn open_file_path_with(path: impl AsRef<Path>, url: &str) -> Result<()> {
-	ensure_com_initialized();
-	let path = path.as_ref();
-
-	let ext = path
-		.extension()
-		.ok_or(Error::new(E_FAIL, "No file extension"))?;
-	for handler in list_apps_associated_with_ext(ext)?.iter() {
-		let name = unsafe { handler.GetName()?.to_string()? };
-		if name == url {
-			let path = path
-				.normalize_virtually()
-				.map_err(|e| Error::new(E_FAIL, e.to_string()))?;
-			let wide_path = path
-				.as_os_str()
-				.encode_wide()
-				.chain(std::iter::once(0))
-				.collect::<Vec<_>>();
-			let factory: IShellItem =
-				unsafe { SHCreateItemFromParsingName(PCWSTR(wide_path.as_ptr()), None) }?;
-			let data: IDataObject = unsafe { factory.BindToHandler(None, &BHID_DataObject) }?;
-			unsafe { handler.Invoke(&data) }?;
-
-			return Ok(());
-		}
-	}
-
-	Err(Error::new(
-		E_FAIL,
-		"No available handler for the given path",
-	))
-}
apps/desktop/dist/.gitignore (vendored, 5 deletions)
@@ -1,5 +0,0 @@
-# Ignore everything in this directory
-*
-# Except this file
-!.gitignore
-# This is done so that Tauri never complains that '../dist does not exist'
@@ -1,48 +0,0 @@
-{
-	"name": "@sd/desktop",
-	"type": "module",
-	"private": true,
-	"scripts": {
-		"vite": "vite",
-		"dev": "vite dev",
-		"build": "vite build",
-		"tauri": "pnpm --filter @sd/scripts -- tauri",
-		"dmg": "open ../../target/release/bundle/dmg/",
-		"typecheck": "tsc -b",
-		"lint": "eslint src --cache"
-	},
-	"dependencies": {
-		"@crabnebula/tauri-plugin-drag": "^2.0.0",
-		"@remix-run/router": "=1.13.1",
-		"@sd/client": "workspace:*",
-		"@sd/interface": "workspace:*",
-		"@sd/ui": "workspace:*",
-		"@spacedrive/rspc-client": "github:spacedriveapp/rspc#path:packages/client&6a77167495",
-		"@spacedrive/rspc-tauri": "github:spacedriveapp/rspc#path:packages/tauri&6a77167495",
-		"@t3-oss/env-core": "^0.7.1",
-		"@tanstack/react-query": "^5.59",
-		"@tauri-apps/api": "=2.0.3",
-		"@tauri-apps/plugin-dialog": "2.0.1",
-		"@tauri-apps/plugin-http": "2.0.1",
-		"@tauri-apps/plugin-os": "2.0.0",
-		"@tauri-apps/plugin-shell": "2.0.1",
-		"consistent-hash": "^1.2.2",
-		"immer": "^10.0.3",
-		"react": "^18.2.0",
-		"react-dom": "^18.2.0",
-		"react-router-dom": "=6.20.1",
-		"sonner": "^1.0.3",
-		"supertokens-web-js": "=0.13.0"
-	},
-	"devDependencies": {
-		"@sd/config": "workspace:*",
-		"@sentry/vite-plugin": "^2.16.0",
-		"@tauri-apps/cli": "2.0.4",
-		"@types/react": "^18.2.67",
-		"@types/react-dom": "^18.2.22",
-		"sass": "^1.72.0",
-		"typescript": "^5.6.2",
-		"vite": "^5.4.9",
-		"vite-tsconfig-paths": "^5.0.1"
-	}
-}
@@ -1 +0,0 @@
-module.exports = require('@sd/ui/postcss');
apps/desktop/src-tauri/.gitignore (vendored, 9 deletions)
@@ -1,9 +0,0 @@
-# Generated by Cargo
-# will have compiled files and executables
-/target/
-gen/
-WixTools
-*.dll
-*.dll.*
-*.so
-*.so.*
@@ -1,91 +0,0 @@
[package]
name = "sd-desktop"
version = "0.5.0"

authors = ["Spacedrive Technology Inc <support@spacedrive.com>"]
default-run = "sd-desktop"
description = "The universal file manager."
edition.workspace = true
license.workspace = true
repository.workspace = true

[dependencies]
# Spacedrive Sub-crates
sd-core = { path = "../../../core", features = ["ffmpeg", "heif"] }
sd-fda = { path = "../../../crates/fda" }

# Workspace dependencies
axum = { workspace = true, features = ["query"] }
axum-extra = { workspace = true, features = ["typed-header"] }
base64 = { workspace = true }
futures = { workspace = true }
http = { workspace = true }
hyper = { workspace = true }
rand = { workspace = true }
rspc = { workspace = true, features = ["tauri"] }
serde = { workspace = true }
serde_json = { workspace = true }
specta = { workspace = true }
strum = { workspace = true, features = ["derive"] }
thiserror = { workspace = true }
tokio = { workspace = true, features = ["sync"] }
tracing = { workspace = true }
uuid = { workspace = true, features = ["serde"] }

# Specific Desktop dependencies
# WARNING: Do NOT enable default features, as that vendors dbus (see below)
drag = { git = "https://github.com/spacedriveapp/drag-rs", branch = "move-operation" }
opener = { version = "0.7.1", features = ["reveal"], default-features = false }
specta-typescript = "=0.0.7"
tauri-plugin-clipboard-manager = "=2.0.1"
tauri-plugin-cors-fetch = { path = "../../../crates/tauri-plugin-cors-fetch" }
tauri-plugin-deep-link = "=2.0.1"
tauri-plugin-dialog = "=2.0.3"
tauri-plugin-drag = "2.0.0"
tauri-plugin-http = "=2.0.3"
tauri-plugin-os = "=2.0.1"
tauri-plugin-shell = "=2.0.2"
tauri-plugin-updater = "=2.0.2"

# memory allocator
mimalloc = { workspace = true }

[dependencies.tauri]
features = ["linux-libxdo", "macos-private-api", "native-tls-vendored", "unstable"]
version = "=2.0.6"

[dependencies.tauri-specta]
features = ["derive", "typescript"]
git = "https://github.com/spacedriveapp/tauri-specta"
rev = "8c85d40eb9"

[target.'cfg(target_os = "linux")'.dependencies]
# Spacedrive Sub-crates
sd-desktop-linux = { path = "../crates/linux" }

# Specific Desktop dependencies
# WARNING: dbus must NOT be vendored, as that breaks the app on Linux,X11,Nvidia
dbus = { version = "0.9.7", features = ["stdfd"] }
# https://github.com/tauri-apps/tauri/blob/tauri-v2.0.0/crates/tauri/Cargo.toml#L101
gtk = { version = "0.18", features = ["v3_24"] }
tao = { version = "0.31.1", features = ["serde"] }
webkit2gtk = { version = "=2.0.1", features = ["v2_40"] }

[target.'cfg(target_os = "macos")'.dependencies]
# Spacedrive Sub-crates
sd-desktop-macos = { path = "../crates/macos" }

[target.'cfg(target_os = "windows")'.dependencies]
# Spacedrive Sub-crates
sd-desktop-windows = { path = "../crates/windows" }

[build-dependencies]
# Specific Desktop dependencies
tauri-build = "=2.0.2"

[features]
ai-models = ["sd-core/ai"]
custom-protocol = ["tauri/custom-protocol"]
default = ["custom-protocol"]
devtools = ["tauri/devtools"]
@@ -1,12 +0,0 @@
fn main() {
	#[cfg(all(not(target_os = "windows"), feature = "ai-models"))]
	// This is required because libonnxruntime.so is incorrectly built with the Initial Executable (IE) thread-local storage access model by zig
	// https://docs.oracle.com/cd/E23824_01/html/819-0690/chapter8-20.html
	// https://github.com/ziglang/zig/issues/16152
	// https://github.com/ziglang/zig/pull/17702
	// Due to this, the linker will fail to dlopen libonnxruntime.so because it runs out of the static TLS space reserved after initial load
	// To work around this problem libonnxruntime.so is added as a dependency to the binaries, which makes the linker allocate its TLS space during initial load
	println!("cargo:rustc-link-lib=onnxruntime");

	tauri_build::build();
}
@@ -1,63 +0,0 @@
{
	"$schema": "../gen/schemas/desktop-schema.json",
	"identifier": "default",
	"description": "Capability for the main window",
	"windows": [
		"main"
	],
	"permissions": [
		"core:app:default",
		"core:event:default",
		"core:image:default",
		"core:menu:default",
		"core:path:default",
		"core:resources:default",
		"core:window:default",
		"core:tray:default",
		"core:webview:default",
		"shell:allow-open",
		"dialog:allow-open",
		"dialog:allow-save",
		"dialog:allow-confirm",
		"deep-link:default",
		"os:allow-os-type",
		"core:window:allow-close",
		"core:window:allow-create",
		"core:window:allow-maximize",
		"core:window:allow-minimize",
		"core:window:allow-toggle-maximize",
		"core:window:allow-start-dragging",
		"core:webview:allow-internal-toggle-devtools",
		"cors-fetch:default",
		"drag:default",
		{
			"identifier": "http:default",
			"allow": [
				{
					"url": "http://ipc.localhost"
				},
				{
					"url": "http://asset.localhost"
				},
				{
					"url": "http://localhost:8001"
				},
				{
					"url": "http://tauri.localhost"
				},
				{
					"url": "http://localhost:9420"
				},
				{
					"url": "https://auth.spacedrive.com"
				},
				{
					"url": "https://plausible.io"
				},
				{
					"url": "http://localhost:3567"
				}
			]
		}
	]
}
(18 deleted image assets; binary diffs not shown)
@@ -1,12 +0,0 @@
edition = "2018"
force_explicit_abi = true
hard_tabs = true
max_width = 100
merge_derives = true
newline_style = "Auto"
remove_nested_parens = true
reorder_imports = true
reorder_modules = true
use_field_init_shorthand = false
use_small_heuristics = "Default"
use_try_shorthand = false
@@ -1,332 +0,0 @@
use base64::{engine::general_purpose::STANDARD, Engine as _};
use drag::{DragItem, Image, Options};
use serde::{Deserialize, Serialize};
use specta::Type;
use std::path::PathBuf;
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::{Arc, Mutex};
use std::time::{Duration, Instant};
use tauri::{ipc::Channel, Manager, PhysicalPosition, State, WebviewWindow};

// DragState wraps a thread-safe boolean flag to track drag operation status
#[derive(Clone)]
pub struct DragState(pub Arc<Mutex<bool>>);

// Default implementation for DragState initializes with false
impl Default for DragState {
	fn default() -> Self {
		Self(Arc::new(Mutex::new(false)))
	}
}

// Enum to represent the result of a drag operation (serializable for IPC)
#[derive(Serialize, Deserialize, Type, Clone)]
pub enum WrappedDragResult {
	Dropped,
	Cancel,
}

// Structure to hold cursor position coordinates (serializable for IPC)
#[derive(Serialize, Deserialize, Type, Clone)]
pub struct WrappedCursorPosition {
	x: i32,
	y: i32,
}

// Combined structure for drag operation results (serializable for IPC)
#[derive(Serialize, Deserialize, Type, Clone)]
pub struct CallbackResult {
	result: WrappedDragResult,
	#[serde(rename = "cursorPos")]
	cursor_pos: WrappedCursorPosition,
}

// Conversion implementations for drag-rs types to our wrapped types
impl From<drag::DragResult> for WrappedDragResult {
	fn from(result: drag::DragResult) -> Self {
		match result {
			drag::DragResult::Dropped => WrappedDragResult::Dropped,
			drag::DragResult::Cancel => WrappedDragResult::Cancel,
		}
	}
}

impl From<drag::CursorPosition> for WrappedCursorPosition {
	fn from(pos: drag::CursorPosition) -> Self {
		WrappedCursorPosition { x: pos.x, y: pos.y }
	}
}

// Global flag to track if position tracking is active
static TRACKING: AtomicBool = AtomicBool::new(false);

/// Initiates a drag and drop operation with cursor position tracking
///
/// # Arguments
/// * `window` - The Tauri window instance
/// * `_state` - Current drag state (unused)
/// * `files` - Vector of file paths to be dragged
/// * `image` - Base64 encoded image to be used as drag icon
/// * `on_event` - Channel for communicating drag operation events back to the frontend
#[tauri::command(async)]
#[specta::specta]
#[cfg(not(target_os = "linux"))]
pub async fn start_drag(
	window: WebviewWindow,
	_state: State<'_, DragState>,
	files: Vec<String>,
	image: String,
	on_event: Channel<CallbackResult>,
) -> Result<(), String> {
	// Check if image string is base64 encoded
	let icon_path = if image.starts_with("data:image/") {
		image
	} else {
		// If not, assume it's a file path and convert to base64
		let icon_data = std::fs::read(&image).map_err(|e| e.to_string())?;
		format!("data:image/png;base64,{}", STANDARD.encode(icon_data))
	};

	// Convert the base64 string to a vec<u8>
	let base64_str = icon_path.split(",").last().unwrap();
	let image_raw = STANDARD.decode(base64_str).unwrap();

	// Fast atomic swap for tracking state
	match TRACKING.compare_exchange(false, true, Ordering::SeqCst, Ordering::SeqCst) {
		Ok(_) => {
			println!("Starting position tracking");
		}
		Err(_) => {
			// If already tracking, stop previous instance quickly
			TRACKING.store(false, Ordering::SeqCst);
			tokio::time::sleep(tokio::time::Duration::from_millis(16)).await;
			TRACKING.store(true, Ordering::SeqCst);
			println!("Restarting position tracking");
		}
	}

	// Pre-allocate resources before spawning task
	let window_handle = Arc::new(window);
	let app_handle = window_handle.app_handle();

	// Initialize control flags
	let cancel_flag = Arc::new(AtomicBool::new(false));
	let is_completed = Arc::new(AtomicBool::new(false));

	// Prepare resources once with minimal cloning
	let tracking_resources = Arc::new((files.clone(), icon_path.clone(), Arc::new(on_event)));

	println!("Starting position tracking");

	// Get handles for window and app management
	let window_clone = window_handle.clone();
	let app_handle_owned = app_handle.to_owned();
	let window_owned = window_clone.to_owned();

	// Control flags for operation state
	let is_completed_clone = is_completed.clone();

	// Spawn background task for cursor tracking
	tokio::spawn(async move {
		// Initialize tracking state
		let mut last_position = (0.0, 0.0);
		let mut last_message_time = Instant::now();
		let threshold = 1.0; // Minimum movement threshold
		let message_debounce = Duration::from_millis(32); // State update interval
		let mut was_inside = false;

		// Main tracking loop
		while TRACKING.load(Ordering::SeqCst) && !is_completed.load(Ordering::SeqCst) {
			let window_for_check = window_owned.clone();
			// Skip if window is not focused
			if !window_for_check.is_focused().unwrap_or(false) {
				tokio::time::sleep(tokio::time::Duration::from_millis(8)).await;
				continue;
			}

			// Get current cursor and window positions
			if let (Ok(cursor_position), Ok(window_position), Ok(window_size)) = (
				window_for_check.cursor_position(),
				window_for_check.outer_position(),
				window_for_check.inner_size(),
			) {
				// Calculate cursor position relative to window
				let relative_position = PhysicalPosition::new(
					cursor_position.x - window_position.x as f64,
					cursor_position.y - window_position.y as f64,
				);

				// Check if cursor is inside window boundaries
				let is_inside = relative_position.x >= 0.0
					&& relative_position.y >= 0.0
					&& relative_position.x <= window_size.width as f64
					&& relative_position.y <= window_size.height as f64;

				// Process state changes if cursor moved enough
				if is_inside != was_inside
					&& ((relative_position.x - last_position.0).abs() > threshold
						|| (relative_position.y - last_position.1).abs() > threshold)
				{
					let now = Instant::now();
					if now.duration_since(last_message_time) >= message_debounce {
						// Prepare resources for drag operation
						let files_for_drag = tracking_resources.0.clone();
						let icon_path_for_drag = tracking_resources.1.clone();
						let on_event_for_drag = tracking_resources.2.clone();
						let is_completed = is_completed_clone.clone();
						let cancel_flag_clone = cancel_flag.clone();
						let window_for_drag = window_owned.clone();
						let image_raw_for_drag = image_raw.clone();

						// Execute drag operation on main thread
						app_handle_owned
							.run_on_main_thread(move || {
								if !is_inside {
									println!("Starting drag operation");
									// Create drag items
									let paths: Vec<PathBuf> =
										files_for_drag.iter().map(PathBuf::from).collect();
									let item = DragItem::Files(paths);
									let preview_icon = Image::Raw(image_raw_for_drag.clone());

									// Start the drag operation
									if let Ok(session) = drag::start_drag(
										&window_for_drag,
										item,
										preview_icon,
										move |result, cursor_pos| {
											println!("Drag operation completed");
											// Send result back to frontend
											let _ = on_event_for_drag.send(CallbackResult {
												result: result.into(),
												cursor_pos: cursor_pos.into(),
											});
											// Mark operation as completed
											is_completed.store(true, Ordering::SeqCst);
											TRACKING.store(false, Ordering::SeqCst);
										},
										Options {
											skip_animatation_on_cancel_or_failure: false,
											mode: drag::DragMode::Move,
										},
									) {
										println!("Drag operation started");
										// Store drag session for cancellation
										// *drag_session_clone.lock().unwrap() = Some(session);
									}
								} else {
									println!("Cursor returned to window");
									cancel_flag_clone.store(true, Ordering::SeqCst);
									// We have this for now, but technically, it doesn't do anything.
									// I'm still trying to figure out how to cancel mid-drag without the user having to cancel the dragging on the frontend too.
									// - @Rocky43007
								}
							})
							.unwrap_or_default();

						// Update tracking state
						last_message_time = now;
						was_inside = is_inside;
						last_position = (relative_position.x, relative_position.y);
					}
				}
			}

			// Prevent excessive CPU usage
			tokio::time::sleep(tokio::time::Duration::from_millis(8)).await;
		}

		println!("Tracking instance stopped");
	});

	Ok(())
}

// /// Initiates a drag and drop operation with cursor position tracking - WIP
// ///
// /// # Arguments
// /// * `window` - The Tauri window instance
// /// * `_state` - Current drag state (unused)
// /// * `files` - Vector of file paths to be dragged
// /// * `image` - Base64 encoded image to be used as drag icon
// /// * `on_event` - Channel for communicating drag operation events back to the frontend
// #[tauri::command(async)]
// #[specta::specta]
// #[cfg(target_os = "linux")]
// pub async fn start_drag(
// 	window: WebviewWindow,
// 	_state: State<'_, DragState>,
// 	files: Vec<String>,
// 	image: String,
// 	on_event: Channel<CallbackResult>,
// ) -> Result<(), String> {
// 	use drag::{CursorPosition, DragResult};
// 	use tao::platform::unix::WindowExtUnix;

// 	// Convert file paths to PathBuf
// 	let paths: Vec<PathBuf> = files.iter().map(PathBuf::from).collect();

// 	// Handle preview image
// 	let preview_icon = if image.starts_with("data:image/") {
// 		let base64_str = image.split(",").last().unwrap();
// 		let image_raw = STANDARD.decode(base64_str).unwrap();
// 		Image::Raw(image_raw)
// 	} else {
// 		Image::File(PathBuf::from(image))
// 	};

// 	// Get main thread handle
// 	let app_handle = window.app_handle();
// 	let window_clone = window.clone();

// 	app_handle
// 		.run_on_main_thread(move || {
// 			// Get GTK window handle
// 			let gtk_window = window_clone.gtk_window().expect("Failed to get GTK window");
// 			let item = DragItem::Files(paths);
// 			println!("Starting drag operation");

// 			// Start drag operation
// 			let _ = drag::start_drag(
// 				&gtk_window,
// 				item,
// 				preview_icon,
// 				move |result, cursor_pos| {
// 					println!("Drag operation completed");
// 					println!("Result: {:?}", result);
// 					println!("Cursor position: {:?}", cursor_pos);
// 					let _ = on_event.send(CallbackResult {
// 						result: result.into(),
// 						cursor_pos: cursor_pos.into(),
// 					});
// 				},
// 				Options {
// 					skip_animatation_on_cancel_or_failure: false,
// 					mode: drag::DragMode::Move,
// 				},
// 			);
// 		})
// 		.unwrap_or_default();

// 	Ok(())
// }

#[tauri::command(async)]
#[specta::specta]
#[cfg(target_os = "linux")]
pub async fn start_drag(
	_window: WebviewWindow,
	_state: State<'_, DragState>,
	_files: Vec<String>,
	_image: String,
	_on_event: Channel<CallbackResult>,
) -> Result<(), String> {
	Err("Drag and drop is not supported on Linux".to_string())
}

/// Stops the cursor position tracking for drag operations
#[tauri::command(async)]
#[specta::specta]
pub async fn stop_drag() {
	TRACKING.store(false, Ordering::SeqCst);
}
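The deleted `start_drag` above hands its background tracking loop off with a lock-free `AtomicBool` compare-exchange: the first caller claims the flag, and any later caller drops and re-arms it so the previous loop exits. A minimal, synchronous sketch of that handoff (the `claim_tracker` helper is hypothetical and omits the 16 ms settle sleep the real code awaits):

```rust
use std::sync::atomic::{AtomicBool, Ordering};

// Single global flag, as in the deleted drag.rs.
static TRACKING: AtomicBool = AtomicBool::new(false);

/// Hypothetical helper mirroring start_drag's handoff: returns true when this
/// call claimed a fresh tracker, false when it restarted a running one.
fn claim_tracker() -> bool {
	// compare_exchange only succeeds for the first caller (false -> true).
	match TRACKING.compare_exchange(false, true, Ordering::SeqCst, Ordering::SeqCst) {
		Ok(_) => true,
		Err(_) => {
			// Already tracking: drop the flag so the old loop observes false
			// and exits, then re-arm it for the new tracker.
			TRACKING.store(false, Ordering::SeqCst);
			TRACKING.store(true, Ordering::SeqCst);
			false
		}
	}
}

fn main() {
	assert!(claim_tracker()); // first caller starts tracking
	assert!(!claim_tracker()); // second caller restarts the existing tracker
	TRACKING.store(false, Ordering::SeqCst); // stop_drag equivalent
	assert!(!TRACKING.load(Ordering::SeqCst));
}
```

The tracking loop polls this flag each iteration, so a plain `store(false, ...)` from `stop_drag` is enough to shut it down without any locking.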
@@ -1,452 +0,0 @@
use sd_core::Node;
use sd_prisma::prisma::{file_path, location};

use std::{
	collections::{BTreeSet, HashMap, HashSet},
	hash::{Hash, Hasher},
	path::PathBuf,
	sync::Arc,
};

use futures::future::join_all;
use serde::Serialize;
use specta::Type;
use tauri::async_runtime::spawn_blocking;
use tracing::error;

type NodeState<'a> = tauri::State<'a, Arc<Node>>;

#[derive(Serialize, Type)]
#[serde(tag = "t", content = "c")]
pub enum OpenFilePathResult {
	NoLibrary,
	NoFile(i32),
	OpenError(i32, String),
	AllGood(i32),
	Internal(String),
}

#[tauri::command(async)]
#[specta::specta]
pub async fn open_file_paths(
	library: uuid::Uuid,
	ids: Vec<i32>,
	node: tauri::State<'_, Arc<Node>>,
) -> Result<Vec<OpenFilePathResult>, ()> {
	let res = if let Some(library) = node.libraries.get_library(&library).await {
		library.get_file_paths(ids).await.map_or_else(
			|e| vec![OpenFilePathResult::Internal(e.to_string())],
			|paths| {
				paths
					.into_iter()
					.map(|(id, maybe_path)| {
						if let Some(path) = maybe_path {
							let open_result = {
								#[cfg(target_os = "linux")]
								{
									sd_desktop_linux::open_file_path(path)
								}

								#[cfg(not(target_os = "linux"))]
								{
									opener::open(path)
								}
							};

							open_result
								.map(|()| OpenFilePathResult::AllGood(id))
								.unwrap_or_else(|err| {
									error!("Failed to open logs dir: {err}");
									OpenFilePathResult::OpenError(id, err.to_string())
								})
						} else {
							OpenFilePathResult::NoFile(id)
						}
					})
					.collect()
			},
		)
	} else {
		vec![OpenFilePathResult::NoLibrary]
	};

	Ok(res)
}

#[derive(Serialize, Type)]
#[serde(tag = "t", content = "c")]
pub enum EphemeralFileOpenResult {
	Ok(PathBuf),
	Err(String),
}

#[tauri::command(async)]
#[specta::specta]
pub async fn open_ephemeral_files(paths: Vec<PathBuf>) -> Result<Vec<EphemeralFileOpenResult>, ()> {
	Ok(paths
		.into_iter()
		.map(|path| {
			if let Err(e) = {
				#[cfg(target_os = "linux")]
				{
					sd_desktop_linux::open_file_path(&path)
				}

				#[cfg(not(target_os = "linux"))]
				{
					opener::open(&path)
				}
			} {
				error!("Failed to open file: {e:#?}");
				EphemeralFileOpenResult::Err(e.to_string())
			} else {
				EphemeralFileOpenResult::Ok(path)
			}
		})
		.collect())
}

#[derive(Serialize, Type, Debug, Clone)]
pub struct OpenWithApplication {
	url: String,
	name: String,
}

impl Hash for OpenWithApplication {
	fn hash<H: Hasher>(&self, state: &mut H) {
		self.url.hash(state);
	}
}

impl PartialEq for OpenWithApplication {
	fn eq(&self, other: &Self) -> bool {
		self.url == other.url
	}
}

impl Eq for OpenWithApplication {}

#[cfg(target_os = "macos")]
async fn get_file_path_open_apps_set(path: PathBuf) -> Option<HashSet<OpenWithApplication>> {
	let Some(path_str) = path.to_str() else {
		error!(
			"File path contains non-UTF8 characters: '{}'",
			path.display()
		);
		return None;
	};

	let res = unsafe { sd_desktop_macos::get_open_with_applications(&path_str.into()) }
		.as_slice()
		.iter()
		.map(|app| OpenWithApplication {
			url: app.url.to_string(),
			name: app.name.to_string(),
		})
		.collect::<HashSet<_>>();

	Some(res)
}

#[cfg(target_os = "linux")]
async fn get_file_path_open_apps_set(path: PathBuf) -> Option<HashSet<OpenWithApplication>> {
	Some(
		sd_desktop_linux::list_apps_associated_with_ext(&path)
			.await
			.into_iter()
			.map(|app| OpenWithApplication {
				url: app.id,
				name: app.name,
			})
			.collect::<HashSet<_>>(),
	)
}

#[cfg(target_os = "windows")]
async fn get_file_path_open_apps_set(path: PathBuf) -> Option<HashSet<OpenWithApplication>> {
	let Some(ext) = path.extension() else {
		error!("Failed to extract file extension for '{}'", path.display());
		return None;
	};

	sd_desktop_windows::list_apps_associated_with_ext(ext)
		.map_err(|e| {
			error!("{e:#?}");
		})
		.map(|handlers| {
			handlers
				.iter()
				.filter_map(|handler| {
					let (Ok(name), Ok(url)) = (
						unsafe { handler.GetUIName() }
							.map_err(|e| {
								error!("Error on '{}': {e:#?}", path.display());
							})
							.and_then(|name| {
								unsafe { name.to_string() }.map_err(|e| {
									error!("Error on '{}': {e:#?}", path.display());
								})
							}),
						unsafe { handler.GetName() }
							.map_err(|e| {
								error!("Error on '{}': {e:#?}", path.display());
							})
							.and_then(|name| {
								unsafe { name.to_string() }.map_err(|e| {
									error!("Error on '{}': {e:#?}", path.display());
								})
							}),
					) else {
						error!("Failed to get handler info for '{}'", path.display());
						return None;
					};

					Some(OpenWithApplication { name, url })
				})
				.collect::<HashSet<_>>()
		})
		.ok()
}

async fn aggregate_open_with_apps(
	paths: impl Iterator<Item = PathBuf>,
) -> Result<Vec<OpenWithApplication>, ()> {
	Ok(join_all(paths.map(get_file_path_open_apps_set))
		.await
		.into_iter()
		.flatten()
		.reduce(|intersection, set| intersection.intersection(&set).cloned().collect())
		.map(|set| set.into_iter().collect())
		.unwrap_or(vec![]))
}

#[tauri::command(async)]
#[specta::specta]
pub async fn get_file_path_open_with_apps(
	library: uuid::Uuid,
	ids: Vec<i32>,
	node: NodeState<'_>,
) -> Result<Vec<OpenWithApplication>, ()> {
	let Some(library) = node.libraries.get_library(&library).await else {
		return Ok(vec![]);
	};

	let Ok(paths) = library.get_file_paths(ids).await.map_err(|e| {
		error!("{e:#?}");
	}) else {
		return Ok(vec![]);
	};

	aggregate_open_with_apps(paths.into_values().filter_map(|maybe_path| {
		if maybe_path.is_none() {
			error!("File not found in database");
		}
		maybe_path
	}))
	.await
}

#[tauri::command(async)]
#[specta::specta]
pub async fn get_ephemeral_files_open_with_apps(
	paths: Vec<PathBuf>,
) -> Result<Vec<OpenWithApplication>, ()> {
	aggregate_open_with_apps(paths.into_iter()).await
}

type FileIdAndUrl = (i32, String);

#[tauri::command(async)]
#[specta::specta]
pub async fn open_file_path_with(
	library: uuid::Uuid,
	file_ids_and_urls: Vec<FileIdAndUrl>,
	node: NodeState<'_>,
) -> Result<(), ()> {
	let Some(library) = node.libraries.get_library(&library).await else {
		return Err(());
	};

	let url_by_id = file_ids_and_urls.into_iter().collect::<HashMap<_, _>>();
	let ids = url_by_id.keys().copied().collect::<Vec<_>>();

	library
		.get_file_paths(ids)
		.await
		.map_err(|e| {
			error!("{e:#?}");
		})
		.and_then(|paths| {
			paths
				.iter()
				.map(|(id, path)| {
					let (Some(path), Some(url)) = (
						#[cfg(any(target_os = "windows", target_os = "linux"))]
						path.as_ref(),
						#[cfg(target_os = "macos")]
						path.as_ref()
							.and_then(|path| path.to_str().map(str::to_string)),
						url_by_id.get(id),
					) else {
						error!("File not found in database");
						return Err(());
					};

					#[cfg(target_os = "macos")]
					return {
						sd_desktop_macos::open_file_paths_with(&[path], url);
						Ok(())
					};

					#[cfg(target_os = "linux")]
					return sd_desktop_linux::open_files_path_with(&[path], url).map_err(|e| {
						error!("{e:#?}");
					});

					#[cfg(target_os = "windows")]
					return sd_desktop_windows::open_file_path_with(path, url).map_err(|e| {
						error!("{e:#?}");
					});

					#[cfg(not(any(
						target_os = "windows",
						target_os = "linux",
						target_os = "macos"
					)))]
					Err(())
				})
				.collect::<Result<Vec<_>, _>>()
				.map(|_| ())
		})
}

type PathAndUrl = (PathBuf, String);

#[tauri::command(async)]
#[specta::specta]
pub async fn open_ephemeral_file_with(paths_and_urls: Vec<PathAndUrl>) -> Result<(), ()> {
	join_all(
		paths_and_urls
			.into_iter()
			.collect::<HashMap<_, _>>() // Just to avoid duplicates
			.into_iter()
			.map(|(path, url)| async move {
				#[cfg(target_os = "macos")]
				if let Some(path) = path.to_str().map(str::to_string) {
					if let Err(e) = spawn_blocking(move || {
						sd_desktop_macos::open_file_paths_with(&[path], &url);
					})
					.await
					{
						error!("Error joining spawned task for opening files with: {e:#?}");
					}
				} else {
					error!(
						"File path contains non-UTF8 characters: '{}'",
						path.display()
					);
				};

				#[cfg(target_os = "linux")]
				match spawn_blocking(move || sd_desktop_linux::open_files_path_with(&[path], &url))
					.await
				{
					Ok(Ok(())) => (),
					Ok(Err(e)) => error!("Error opening file with: {e:#?}"),
					Err(e) => error!("Error joining spawned task for opening files with: {e:#?}"),
				}

				#[cfg(windows)]
				match spawn_blocking(move || sd_desktop_windows::open_file_path_with(path, &url))
					.await
				{
					Ok(Ok(())) => (),
					Ok(Err(e)) => error!("Error opening file with: {e:#?}"),
					Err(e) => error!("Error joining spawned task for opening files with: {e:#?}"),
				}
			}),
	)
	.await;

	Ok(())
}

fn inner_reveal_paths(paths: impl Iterator<Item = PathBuf>) {
	for path in paths {
		if let Err(e) = opener::reveal(path) {
			error!("Failed to open logs dir: {e:#?}");
		}
	}
}

#[derive(specta::Type, serde::Deserialize)]
pub enum RevealItem {
	Location { id: location::id::Type },
	FilePath { id: file_path::id::Type },
	Ephemeral { path: PathBuf },
}

#[tauri::command(async)]
#[specta::specta]
pub async fn reveal_items(
	library: uuid::Uuid,
	items: Vec<RevealItem>,
	node: NodeState<'_>,
) -> Result<(), ()> {
	let Some(library) = node.libraries.get_library(&library).await else {
		return Err(());
	};

	let mut paths_to_open = BTreeSet::new();

	let (paths, locations): (Vec<_>, Vec<_>) =
		items
			.into_iter()
			.fold((vec![], vec![]), |(mut paths, mut locations), item| {
				match item {
					RevealItem::FilePath { id } => paths.push(id),
					RevealItem::Location { id } => locations.push(id),
					RevealItem::Ephemeral { path } => {
						paths_to_open.insert(path);
					}
				}

				(paths, locations)
			});

	if !paths.is_empty() {
		paths_to_open.extend(
			library
				.get_file_paths(paths)
				.await
				.unwrap_or_default()
				.into_values()
				.flatten(),
		);
	}

	if !locations.is_empty() {
		paths_to_open.extend(
			library
				.db
				.location()
				.find_many(vec![
					// TODO(N): This will fall apart with removable media and is making an invalid assumption that the `Node` is fixed for an `Instance`.
					location::instance_id::equals(Some(library.config().await.instance_id)),
					location::id::in_vec(locations),
				])
				.select(location::select!({ path }))
				.exec()
				.await
				.unwrap_or_default()
				.into_iter()
				.filter_map(|location| location.path.map(Into::into)),
		);
	}

	if let Err(e) = spawn_blocking(|| inner_reveal_paths(paths_to_open.into_iter())).await {
		error!("Error joining reveal paths thread: {e:#?}");
	}

	Ok(())
}
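The deleted file.rs dispatches every open/reveal call per platform with `#[cfg]` attributes on block expressions: exactly one branch survives compilation, and the whole block evaluates to that branch's value. A minimal, self-contained sketch of that expression-position cfg pattern (the `platform_name` helper is hypothetical, not part of the deleted file):

```rust
// Expression-position #[cfg] blocks, as used throughout the deleted file.rs:
// cfg-stripping removes all but one branch, and the surviving block is the
// value of the whole expression.
fn platform_name() -> &'static str {
	let name = {
		#[cfg(target_os = "linux")]
		{
			"linux"
		}

		#[cfg(not(target_os = "linux"))]
		{
			"not-linux"
		}
	};
	name
}

fn main() {
	// Only one of the two branches was compiled in; the other does not exist.
	assert!(matches!(platform_name(), "linux" | "not-linux"));
	println!("{}", platform_name());
}
```

This keeps each command's control flow in one function instead of duplicating the whole function body per OS, at the cost of every branch having to produce the same type.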
@@ -1,380 +0,0 @@
#![cfg_attr(
	all(not(debug_assertions), target_os = "windows"),
	windows_subsystem = "windows"
)]

#[global_allocator]
static GLOBAL: mimalloc::MiMalloc = mimalloc::MiMalloc;

use std::{fs, path::PathBuf, process::Command, sync::Arc, time::Duration};

use menu::{set_enabled, MenuEvent};
use sd_core::{Node, NodeError};

use sd_fda::DiskAccess;
use serde::{Deserialize, Serialize};
use specta_typescript::Typescript;
use tauri::{async_runtime::block_on, webview::PlatformWebview, AppHandle, Manager, WindowEvent};
use tauri::{Emitter, Listener};
use tauri_plugins::{sd_error_plugin, sd_server_plugin};
use tauri_specta::{collect_events, Builder};
use tokio::task::block_in_place;
use tokio::time::sleep;
use tracing::{debug, error};

mod drag;
mod file;
mod menu;
mod tauri_plugins;
mod theme;
mod updater;

#[tauri::command(async)]
#[specta::specta]
async fn app_ready(app_handle: AppHandle) {
	let window = app_handle.get_webview_window("main").unwrap();
	window.show().unwrap();
}

#[tauri::command(async)]
#[specta::specta]
// If this errors, we don't have FDA and we need to re-prompt for it
async fn request_fda_macos() {
	DiskAccess::request_fda().expect("Unable to request full disk access");
}

#[tauri::command(async)]
#[specta::specta]
async fn set_menu_bar_item_state(window: tauri::Window, event: MenuEvent, enabled: bool) {
	let menu = window
		.menu()
		.expect("unable to get menu for current window");

	set_enabled(&menu, event, enabled);
}

#[tauri::command(async)]
#[specta::specta]
async fn reload_webview(app_handle: AppHandle) {
	app_handle
		.get_webview_window("main")
		.expect("Error getting window handle")
		.with_webview(reload_webview_inner)
		.expect("Error while reloading webview");
}

fn reload_webview_inner(webview: PlatformWebview) {
	#[cfg(target_os = "macos")]
	{
		unsafe {
			sd_desktop_macos::reload_webview(&webview.inner().cast());
		}
	}
	#[cfg(target_os = "linux")]
	{
		use webkit2gtk::WebViewExt;

		webview.inner().reload();
	}
	#[cfg(target_os = "windows")]
	unsafe {
		webview
			.controller()
			.CoreWebView2()
			.expect("Unable to get handle on inner webview")
			.Reload()
			.expect("Unable to reload webview");
	}
}

#[tauri::command(async)]
#[specta::specta]
async fn reset_spacedrive(app_handle: AppHandle) {
	let data_dir = app_handle
		.path()
		.data_dir()
		.unwrap_or_else(|_| PathBuf::from("./"))
		.join("spacedrive");

	#[cfg(debug_assertions)]
	let data_dir = data_dir.join("dev");

	fs::remove_dir_all(data_dir).unwrap();

	// TODO: Restarting the app doesn't work in dev (cause Tauri's devserver shutdown) and in prod makes the app go unresponsive until you click in/out on macOS
	// app_handle.restart();

	app_handle.exit(0);
}

#[tauri::command(async)]
#[specta::specta]
async fn refresh_menu_bar(node: tauri::State<'_, Arc<Node>>, app: AppHandle) -> Result<(), ()> {
	let has_library = !node.libraries.get_all().await.is_empty();
	menu::refresh_menu_bar(&app, has_library);
	Ok(())
}

#[tauri::command(async)]
#[specta::specta]
async fn open_logs_dir(node: tauri::State<'_, Arc<Node>>) -> Result<(), ()> {
	let logs_path = node.data_dir.join("logs");

	#[cfg(target_os = "linux")]
	let open_result = sd_desktop_linux::open_file_path(logs_path);

	#[cfg(not(target_os = "linux"))]
	let open_result = opener::open(logs_path);

	open_result.map_err(|e| {
		error!("Failed to open logs dir: {e:#?}");
	})
}

#[tauri::command(async)]
#[specta::specta]
async fn open_trash_in_os_explorer() -> Result<(), ()> {
	#[cfg(target_os = "macos")]
	{
		let full_path = format!("{}/.Trash/", std::env::var("HOME").unwrap());

		Command::new("open")
			.arg(full_path)
			.spawn()
			.map_err(|err| error!("Error opening trash: {err:#?}"))?
			.wait()
			.map_err(|err| error!("Error opening trash: {err:#?}"))?;

		Ok(())
	}

	#[cfg(target_os = "windows")]
	{
		Command::new("explorer")
			.arg("shell:RecycleBinFolder")
			.spawn()
			.map_err(|err| error!("Error opening trash: {err:#?}"))?
			.wait()
			.map_err(|err| error!("Error opening trash: {err:#?}"))?;
		return Ok(());
	}

	#[cfg(target_os = "linux")]
	{
		Command::new("xdg-open")
			.arg("trash://")
			.spawn()
			.map_err(|err| error!("Error opening trash: {err:#?}"))?
			.wait()
			.map_err(|err| error!("Error opening trash: {err:#?}"))?;

		Ok(())
	}
}

#[derive(Debug, Clone, Serialize, Deserialize, specta::Type, tauri_specta::Event)]
#[serde(tag = "type")]
pub enum DragAndDropEvent {
	Hovered { paths: Vec<String>, x: f64, y: f64 },
	Dropped { paths: Vec<String>, x: f64, y: f64 },
	Cancelled,
}

#[derive(Debug, Clone, Serialize, Deserialize, specta::Type, tauri_specta::Event)]
#[serde(rename_all = "camelCase")]
pub struct DeepLinkEvent {
	data: String,
}

#[tokio::main]
async fn main() -> tauri::Result<()> {
	#[cfg(target_os = "linux")]
	sd_desktop_linux::normalize_environment();

	let builder = Builder::new()
		.commands(tauri_specta::collect_commands![
			app_ready,
			reset_spacedrive,
			open_logs_dir,
			refresh_menu_bar,
			reload_webview,
			set_menu_bar_item_state,
			request_fda_macos,
			open_trash_in_os_explorer,
			drag::start_drag,
			drag::stop_drag,
			file::open_file_paths,
			file::open_ephemeral_files,
			file::get_file_path_open_with_apps,
			file::get_ephemeral_files_open_with_apps,
			file::open_file_path_with,
			file::open_ephemeral_file_with,
			file::reveal_items,
			theme::lock_app_theme,
			updater::check_for_update,
			updater::install_update
		])
		.events(collect_events![DragAndDropEvent]);

	#[cfg(debug_assertions)]
	builder
		.export(
			Typescript::default()
				.formatter(specta_typescript::formatter::prettier)
				.header("/* eslint-disable */"),
			"../src/commands.ts",
		)
		.expect("Failed to export typescript bindings");

	tauri::Builder::default()
		.invoke_handler(builder.invoke_handler())
		.plugin(tauri_plugin_deep_link::init())
		.plugin(tauri_plugin_cors_fetch::init())
		.setup(move |app| {
			// We need the app handle to determine the data directory now.
			// This means all the setup code has to be within `setup`, however it doesn't support async so we `block_on`.
			let handle = app.handle().clone();
			app.listen("deep-link://new-url", move |event| {
				let deep_link_event = DeepLinkEvent {
					data: event.payload().to_string(),
				};
				println!("Deep link event={:?}", deep_link_event);

				handle.emit("deeplink", deep_link_event).unwrap();
			});

			// #[cfg(debug_assertions)] // only include this code on debug builds
			// {
			// 	let window = app.get_webview_window("main").unwrap();
			// 	window.open_devtools();
			// 	window.close_devtools();
			// }

			block_in_place(|| {
				block_on(async move {
					builder.mount_events(app);

					let data_dir = app
						.path()
						.data_dir()
						.unwrap_or_else(|_| PathBuf::from("./"))
						.join("spacedrive");

					#[cfg(debug_assertions)]
					let data_dir = data_dir.join("dev");

					// The `_guard` must be assigned to a variable for flushing remaining logs on main exit through Drop
					let (_guard, result) = match Node::init_logger(&data_dir) {
						Ok(guard) => (Some(guard), Node::new(data_dir).await),
						Err(err) => (None, Err(NodeError::Logger(err))),
					};

					let handle = app.handle();
					let (node, router) = match result {
						Ok(r) => r,
						Err(err) => {
							error!("Error starting up the node: {err:#?}");
							handle.plugin(sd_error_plugin(err))?;
							return Ok(());
						}
					};

					let should_clear_local_storage = node.libraries.get_all().await.is_empty();

					handle.plugin(rspc::integrations::tauri::plugin(router, {
						let node = node.clone();
						move || node.clone()
					}))?;
					handle.plugin(sd_server_plugin(node.clone()).await.unwrap())?; // TODO: Handle `unwrap`
					handle.manage(node.clone());

					handle.windows().iter().for_each(|(_, window)| {
						if should_clear_local_storage {
							debug!("cleaning localStorage");
							for webview in window.webviews() {
								webview.eval("localStorage.clear();").ok();
							}
						}

						tokio::spawn({
							let window = window.clone();
							async move {
								sleep(Duration::from_secs(3)).await;
								if !window.is_visible().unwrap_or(true) {
									// This happens if the JS bundle crashes and hence doesn't send ready event.
									println!(
										"Window did not emit `app_ready` event fast enough. Showing window..."
									);
									window.show().expect("Main window should show");
								}
							}
						});

						#[cfg(target_os = "windows")]
						window.set_decorations(false).unwrap();

						#[cfg(target_os = "macos")]
						{
							unsafe {
								sd_desktop_macos::set_titlebar_style(
									&window.ns_window().expect("NSWindows must exist on macOS"),
									false,
								);
								sd_desktop_macos::disable_app_nap(
									&"File indexer needs to run unimpeded".into(),
								);
							};
						}
					});

					Ok(())
				})
			})
		})
		.on_window_event(move |window, event| match event {
			// macOS expected behavior is for the app to not exit when the main window is closed.
			// Instead, the window is hidden and the dock icon remains so that on user click it should show the window again.
			#[cfg(target_os = "macos")]
			WindowEvent::CloseRequested { api, .. } => {
				// TODO: make this multi-window compatible in the future
				window
					.app_handle()
					.hide()
					.expect("Window should hide on macOS");
				api.prevent_close();
			}
			WindowEvent::Resized(_) => {
				let (_state, command) =
					if window.is_fullscreen().expect("Can't get fullscreen state") {
						(true, "window_fullscreened")
					} else {
						(false, "window_not_fullscreened")
					};

				window
					.emit("keybind", command)
					.expect("Unable to emit window event");

				#[cfg(target_os = "macos")]
				{
					let nswindow = window.ns_window().unwrap();
					unsafe { sd_desktop_macos::set_titlebar_style(&nswindow, _state) };
				}
			}
			_ => {}
		})
		.menu(menu::setup_menu)
		.plugin(tauri_plugin_dialog::init())
		.plugin(tauri_plugin_os::init())
		.plugin(tauri_plugin_shell::init())
		.plugin(tauri_plugin_http::init())
		// TODO: Bring back Tauri Plugin Window State - it was buggy so we removed it.
		.plugin(tauri_plugin_updater::Builder::new().build())
		.plugin(updater::plugin())
		.manage(updater::State::default())
		.manage(drag::DragState::default())
		.build(tauri::generate_context!())?
		.run(|_, _| {});

	Ok(())
}
@@ -1,296 +0,0 @@
use std::str::FromStr;

use serde::Deserialize;
use specta::Type;
use tauri::{
	menu::{Menu, MenuItemKind},
	AppHandle, Emitter, Manager, Wry,
};
use tracing::error;

#[derive(
	Debug, Clone, Copy, Type, Deserialize, strum::EnumString, strum::AsRefStr, strum::Display,
)]
pub enum MenuEvent {
	NewLibrary,
	NewFile,
	NewDirectory,
	AddLocation,
	OpenOverview,
	OpenSearch,
	OpenSettings,
	ReloadExplorer,
	SetLayoutGrid,
	SetLayoutList,
	SetLayoutMedia,
	ToggleDeveloperTools,
	NewWindow,
	ReloadWebview,
	Copy,
	Cut,
	Paste,
	Duplicate,
	SelectAll,
}

/// Menu items which require a library to be open to use.
/// They will be disabled/enabled automatically.
const LIBRARY_LOCKED_MENU_IDS: &[MenuEvent] = &[
	MenuEvent::NewWindow,
	MenuEvent::OpenOverview,
	MenuEvent::OpenSearch,
	MenuEvent::OpenSettings,
	MenuEvent::ReloadExplorer,
	MenuEvent::SetLayoutGrid,
	MenuEvent::SetLayoutList,
	MenuEvent::SetLayoutMedia,
	MenuEvent::NewFile,
	MenuEvent::NewDirectory,
	MenuEvent::NewLibrary,
	MenuEvent::AddLocation,
];

pub fn setup_menu(app: &AppHandle) -> tauri::Result<Menu<Wry>> {
	app.on_menu_event(move |app, event| {
		if let Ok(event) = MenuEvent::from_str(&event.id().0) {
			handle_menu_event(event, app);
		} else {
			println!("Unknown menu event: {}", event.id().0);
		}
	});

	#[cfg(not(target_os = "macos"))]
	{
		Menu::new(app)
	}
	#[cfg(target_os = "macos")]
	{
		use tauri::menu::{AboutMetadataBuilder, MenuBuilder, MenuItemBuilder, SubmenuBuilder};

		let app_menu = SubmenuBuilder::new(app, "Spacedrive")
			.about(Some(
				AboutMetadataBuilder::new()
					.authors(Some(vec!["Spacedrive Technology Inc.".to_string()]))
					.license(Some(env!("CARGO_PKG_VERSION")))
					.version(Some(env!("CARGO_PKG_VERSION")))
					.website(Some("https://spacedrive.com/"))
					.website_label(Some("Spacedrive.com"))
					.build(),
			))
			.separator()
			.item(&MenuItemBuilder::with_id(MenuEvent::NewLibrary, "New Library").build(app)?)
			// .item(
			// 	&SubmenuBuilder::new(app, "Libraries")
			// 		// TODO: Implement this
			// 		.items(&[])
			// 		.build()?,
			// )
			.separator()
			.hide()
			.hide_others()
			.show_all()
			.separator()
			.quit()
			.build()?;

		// TODO: Re-enable these when they are implemented, and doesn't stop duplicates.
		// let file_menu = SubmenuBuilder::new(app, "File")
		// 	.item(
		// 		&MenuItemBuilder::with_id(MenuEvent::NewFile, "New File")
		// 			.accelerator("CmdOrCtrl+N")
		// 			.build(app)?,
		// 	)
		// 	.item(
		// 		&MenuItemBuilder::with_id(MenuEvent::NewDirectory, "New Directory")
		// 			.accelerator("CmdOrCtrl+D")
		// 			.build(app)?,
		// 	)
		// 	.item(
		// 		&MenuItemBuilder::with_id(MenuEvent::AddLocation, "Add Location")
		// 			// .accelerator("") // TODO
		// 			.build(app)?,
		// 	)
		// 	.build()?;

		let edit_menu = SubmenuBuilder::new(app, "Edit")
			// .item(
			// 	&MenuItemBuilder::with_id(MenuEvent::Copy, "Copy")
			// 		.accelerator("CmdOrCtrl+C")
			// 		.build(app)?,
			// )
			// .item(
			// 	&MenuItemBuilder::with_id(MenuEvent::Cut, "Cut")
			// 		.accelerator("CmdOrCtrl+X")
			// 		.build(app)?,
			// )
			// .item(
			// 	&MenuItemBuilder::with_id(MenuEvent::Paste, "Paste")
			// 		.accelerator("CmdOrCtrl+V")
			// 		.build(app)?,
			// )
			// .item(
			// 	&MenuItemBuilder::with_id(MenuEvent::Duplicate, "Duplicate")
			// 		.accelerator("CmdOrCtrl+D")
			// 		.build(app)?,
			// )
			.select_all()
			.undo()
			.redo()
			.build()?;

		let view_menu = SubmenuBuilder::new(app, "View")
			.item(
				&MenuItemBuilder::with_id(MenuEvent::OpenOverview, "Open Overview")
					.accelerator("CmdOrCtrl+.")
					.build(app)?,
			)
			.item(
				&MenuItemBuilder::with_id(MenuEvent::OpenSearch, "Search")
					.accelerator("CmdOrCtrl+F")
					.build(app)?,
			)
			.item(
				&MenuItemBuilder::with_id(MenuEvent::OpenSettings, "Settings")
					.accelerator("CmdOrCtrl+Comma")
					.build(app)?,
			)
			.item(
				&MenuItemBuilder::with_id(MenuEvent::ReloadExplorer, "Open Explorer")
					.accelerator("CmdOrCtrl+R")
					.build(app)?,
			)
			.item(
				&SubmenuBuilder::new(app, "Layout")
					.item(
						&MenuItemBuilder::with_id(MenuEvent::SetLayoutGrid, "Grid (Default)")
							// .accelerator("") // TODO
							.build(app)?,
					)
					.item(
						&MenuItemBuilder::with_id(MenuEvent::SetLayoutList, "List")
							// .accelerator("") // TODO
							.build(app)?,
					)
					.item(
						&MenuItemBuilder::with_id(MenuEvent::SetLayoutMedia, "Media")
							// .accelerator("") // TODO
							.build(app)?,
					)
					.build()?,
			);

		#[cfg(debug_assertions)]
		let view_menu = view_menu.separator().item(
			&MenuItemBuilder::with_id(MenuEvent::ToggleDeveloperTools, "Toggle Developer Tools")
				.accelerator("CmdOrCtrl+Shift+Alt+I")
				.build(app)?,
		);

		let view_menu = view_menu.build()?;

		let window_menu = SubmenuBuilder::new(app, "Window")
			.minimize()
			// Disabling this fixes the new "Duplicate current tab" shortcut on macOS clients
			// ...and at the time I'm committing this we don't support multi-window so... ¯\_(ツ)_/¯
			// .item(
			// 	&MenuItemBuilder::with_id(MenuEvent::NewWindow, "New Window")
			// 		.accelerator("CmdOrCtrl+Shift+N")
			// 		.build(app)?,
			// )
			.fullscreen()
			.item(
				&MenuItemBuilder::with_id(MenuEvent::ReloadWebview, "Reload Webview")
					.accelerator("CmdOrCtrl+Shift+R")
					.build(app)?,
			)
			.build()?;

		let menu = MenuBuilder::new(app)
			.item(&app_menu)
			// .item(&file_menu)
			.item(&edit_menu)
			.item(&view_menu)
			.item(&window_menu)
			.build()?;

		for event in LIBRARY_LOCKED_MENU_IDS {
			set_enabled(&menu, *event, false);
		}

		Ok(menu)
	}
}

pub fn handle_menu_event(event: MenuEvent, app: &AppHandle) {
	let webview = app
		.get_webview_window("main")
		.expect("unable to find window");

	match event {
		// TODO: Use Tauri Specta with frontend instead of this
		MenuEvent::NewLibrary => webview.emit("keybind", "new_library").unwrap(),
		MenuEvent::NewFile => webview.emit("keybind", "new_file").unwrap(),
		MenuEvent::NewDirectory => webview.emit("keybind", "new_directory").unwrap(),
		MenuEvent::AddLocation => webview.emit("keybind", "add_location").unwrap(),
		MenuEvent::OpenOverview => webview.emit("keybind", "open_overview").unwrap(),
		MenuEvent::OpenSearch => webview.emit("keybind", "open_search".to_string()).unwrap(),
		MenuEvent::OpenSettings => webview.emit("keybind", "open_settings").unwrap(),
		MenuEvent::ReloadExplorer => webview.emit("keybind", "reload_explorer").unwrap(),
		MenuEvent::SetLayoutGrid => webview.emit("keybind", "set_layout_grid").unwrap(),
		MenuEvent::SetLayoutList => webview.emit("keybind", "set_layout_list").unwrap(),
		MenuEvent::SetLayoutMedia => webview.emit("keybind", "set_layout_media").unwrap(),
		MenuEvent::Copy => webview.emit("keybind", "copy").unwrap(),
		MenuEvent::Cut => webview.emit("keybind", "cut").unwrap(),
		MenuEvent::Paste => webview.emit("keybind", "paste").unwrap(),
		MenuEvent::Duplicate => webview.emit("keybind", "duplicate").unwrap(),
		MenuEvent::SelectAll => webview.emit("keybind", "select_all").unwrap(),
		MenuEvent::ToggleDeveloperTools =>
		{
			#[cfg(feature = "devtools")]
			if webview.is_devtools_open() {
				webview.close_devtools();
			} else {
				webview.open_devtools();
			}
		}
		MenuEvent::NewWindow => {
			// TODO: Implement this
		}
		MenuEvent::ReloadWebview => {
			webview
				.with_webview(crate::reload_webview_inner)
				.expect("Error while reloading webview");
		}
	}
}

// Enable/disable all items in `LIBRARY_LOCKED_MENU_IDS`
pub fn refresh_menu_bar(app: &AppHandle, enabled: bool) {
	let menu = app
		.get_window("main")
		.expect("unable to find window")
		.menu()
		.expect("unable to get menu for current window");

	for event in LIBRARY_LOCKED_MENU_IDS {
		set_enabled(&menu, *event, enabled);
	}
}

pub fn set_enabled(menu: &Menu<Wry>, event: MenuEvent, enabled: bool) {
	let result = match menu.get(event.as_ref()) {
		Some(MenuItemKind::MenuItem(i)) => i.set_enabled(enabled),
		Some(MenuItemKind::Submenu(i)) => i.set_enabled(enabled),
		Some(MenuItemKind::Predefined(_)) => return,
		Some(MenuItemKind::Check(i)) => i.set_enabled(enabled),
		Some(MenuItemKind::Icon(i)) => i.set_enabled(enabled),
		None => {
			error!("Unable to get menu item: {event:?}");
			return;
		}
	};

	if let Err(e) = result {
		error!("Error setting menu item state: {e:#?}");
	}
}
@@ -1,135 +0,0 @@
use std::{net::Ipv4Addr, sync::Arc};

use axum::{
	body::Body,
	extract::{Query, State},
	http::{Request, StatusCode},
	middleware::{self, Next},
	response::Response,
	RequestPartsExt,
};
use axum_extra::{
	headers::authorization::{Authorization, Bearer},
	TypedHeader,
};
use http::Method;
use rand::{distr::Alphanumeric, Rng};
use sd_core::{custom_uri, Node, NodeError};
use serde::Deserialize;
use tauri::{async_runtime::block_on, plugin::TauriPlugin, RunEvent, Runtime};
use thiserror::Error;
use tokio::{net::TcpListener, task::block_in_place};
use tracing::info;

/// Inject `window.__SD_ERROR__` so the frontend can render core startup errors.
/// It's assumed the error happened prior to or during setting up the core and rspc.
pub fn sd_error_plugin<R: Runtime>(err: NodeError) -> TauriPlugin<R> {
	tauri::plugin::Builder::new("sd-error")
		.js_init_script(format!(
			r#"window.__SD_ERROR__ = `{}`;"#,
			err.to_string().replace('`', "\"")
		))
		.build()
}

#[derive(Error, Debug)]
pub enum SdServerPluginError {
	#[error("hyper error")]
	HyperError(#[from] hyper::Error),
	#[error("io error")]
	IoError(#[from] std::io::Error),
}

/// Right now Tauri doesn't support async custom URI protocols so we ship an Axum server.
/// I began the upstream work on this: https://github.com/tauri-apps/wry/pull/872
/// Related to https://github.com/tauri-apps/tauri/issues/3725 & https://bugs.webkit.org/show_bug.cgi?id=146351#c5
///
/// The server is on a random port w/ a localhost bind address and requires a random on-startup auth token which is injected into the webview, so this *should* be secure enough.
///
/// We also spin up multiple servers so we can load balance image requests between them to avoid any issue with browser connection limits.
pub async fn sd_server_plugin<R: Runtime>(
	node: Arc<Node>,
) -> Result<TauriPlugin<R>, SdServerPluginError> {
	let auth_token: String = rand::thread_rng()
		.sample_iter(&Alphanumeric)
		.take(15)
		.map(char::from)
		.collect();

	let app = custom_uri::router(node.clone())
		.route_layer(middleware::from_fn_with_state(
			auth_token.clone(),
			auth_middleware,
		))
		.fallback(|| async { "404 Not Found: We're past the event horizon..." });

	// Only allow the current device to access it
	let listener = TcpListener::bind((Ipv4Addr::LOCALHOST, 0)).await?;
	let listen_addr = listener.local_addr()?; // We get it from a listener so `0` is turned into a random port
	let (tx, mut rx) = tokio::sync::mpsc::channel(1);

	info!("Internal server listening on: http://{listen_addr:?}");
	tokio::spawn(async move {
		axum::serve(listener, app)
			.with_graceful_shutdown(async move {
				rx.recv().await;
			})
			.await
			.expect("Error with HTTP server!"); // TODO: Panic handling
	});

	let script = format!(
		r#"window.__SD_CUSTOM_SERVER_AUTH_TOKEN__ = "{auth_token}"; window.__SD_CUSTOM_URI_SERVER__ = ['http://{listen_addr}'];"#,
	);

	Ok(tauri::plugin::Builder::new("sd-server")
		.js_init_script(script.to_owned())
		.on_page_load(move |webview, _payload| {
			webview
				.eval(&script)
				.expect("Spacedrive server URL must be injected")
		})
		.on_event(move |_app, e| {
			if let RunEvent::Exit { .. } = e {
				block_in_place(|| {
					block_on(node.shutdown());
					block_on(tx.send(())).ok();
				});
			}
		})
		.build())
}

#[derive(Deserialize)]
struct QueryParams {
	token: Option<String>,
}

async fn auth_middleware(
	Query(query): Query<QueryParams>,
	State(auth_token): State<String>,
	request: Request<Body>,
	next: Next,
) -> Result<Response, StatusCode> {
	let req = if query.token.as_ref() != Some(&auth_token) {
		let (mut parts, body) = request.into_parts();

		// We don't check auth for OPTIONS requests cause the CORS middleware will handle it
		if parts.method != Method::OPTIONS {
			let auth: TypedHeader<Authorization<Bearer>> = parts
				.extract()
				.await
				.map_err(|_| StatusCode::UNAUTHORIZED)?;

			if auth.token() != auth_token {
				return Err(StatusCode::UNAUTHORIZED);
			}
		}

		Request::from_parts(parts, body)
	} else {
		request
	};

	Ok(next.run(req).await)
}
@@ -1,20 +0,0 @@
use serde::Deserialize;
use specta::Type;

#[derive(Type, Deserialize, Clone, Copy, Debug)]
pub enum AppThemeType {
	Auto = -1,
	Light = 0,
	Dark = 1,
}

#[tauri::command(async)]
#[specta::specta]
#[allow(unused_variables)]
pub async fn lock_app_theme(theme_type: AppThemeType) {
	#[cfg(target_os = "macos")]
	unsafe {
		sd_desktop_macos::lock_app_theme(theme_type as isize);
	}
	// println!("Lock theme, type: {theme_type:?}")
}
@@ -1,117 +0,0 @@
use tauri::{plugin::TauriPlugin, Emitter, Runtime};
use tauri_plugin_updater::{Update as TauriPluginUpdate, UpdaterExt};
use tokio::sync::Mutex;

#[derive(Debug, Clone, specta::Type, serde::Serialize)]
pub struct Update {
	pub version: String,
}

impl Update {
	fn new(update: &TauriPluginUpdate) -> Self {
		Self {
			version: update.version.clone(),
		}
	}
}

#[derive(Default)]
pub struct State {
	install_lock: Mutex<()>,
}

async fn get_update(app: tauri::AppHandle) -> Result<Option<TauriPluginUpdate>, String> {
	app.updater_builder()
		.header("X-Spacedrive-Version", "stable")
		.map_err(|e| e.to_string())?
		.build()
		.map_err(|e| e.to_string())?
		.check()
		.await
		.map_err(|e| e.to_string())
}

#[derive(Clone, serde::Serialize, specta::Type)]
#[serde(rename_all = "camelCase", tag = "status")]
pub enum UpdateEvent {
	Loading,
	Error(String),
	UpdateAvailable { update: Update },
	NoUpdateAvailable,
	Installing,
}

#[tauri::command]
#[specta::specta]
pub async fn check_for_update(app: tauri::AppHandle) -> Result<Option<Update>, String> {
	app.emit("updater", UpdateEvent::Loading).ok();

	let update = match get_update(app.clone()).await {
		Ok(update) => update,
		Err(e) => {
			app.emit("updater", UpdateEvent::Error(e.clone())).ok();
			return Err(e);
		}
	};

	let update = update.map(|update| Update::new(&update));

	app.emit(
		"updater",
		update
			.clone()
			.map_or(UpdateEvent::NoUpdateAvailable, |update| {
				UpdateEvent::UpdateAvailable { update }
			}),
	)
	.ok();

	Ok(update)
}

#[tauri::command]
#[specta::specta]
pub async fn install_update(
	app: tauri::AppHandle,
	state: tauri::State<'_, State>,
) -> Result<(), String> {
	let lock = match state.install_lock.try_lock() {
		Ok(lock) => lock,
		Err(_) => return Err("Update already installing".into()),
	};

	app.emit("updater", UpdateEvent::Installing).ok();

	get_update(app.clone())
		.await?
		.ok_or_else(|| "No update required".to_string())?
		.download_and_install(|_, _| {}, || {})
		.await
		.map_err(|e| e.to_string())?;

	drop(lock);

	Ok(())
}

pub fn plugin<R: Runtime>() -> TauriPlugin<R> {
	tauri::plugin::Builder::new("sd-updater")
		.on_page_load(|window, _| {
			#[cfg(target_os = "linux")]
			let updater_available = false;

			#[cfg(not(target_os = "linux"))]
			let updater_available = true;

			if updater_available {
				window
					.eval("window.__SD_UPDATER__ = true;")
					.expect("Failed to inject updater JS");
			}
		})
		.js_init_script(format!(
			r#"window.__SD_DESKTOP_VERSION__ = "{}";"#,
			env!("CARGO_PKG_VERSION")
		))
		.build()
}
@@ -1,116 +0,0 @@
{
	"$schema": "https://raw.githubusercontent.com/tauri-apps/tauri/tauri-v2.0.0-rc.8/crates/tauri-cli/tauri.config.schema.json",
	"productName": "Spacedrive",
	"identifier": "com.spacedrive.desktop",
	"build": {
		"beforeDevCommand": "pnpm dev",
		"devUrl": "http://localhost:8001",
		"beforeBuildCommand": "pnpm turbo run build --filter=@sd/desktop...",
		"frontendDist": "../dist"
	},
	"app": {
		"withGlobalTauri": true,
		"macOSPrivateApi": true,
		"windows": [
			{
				"title": "Spacedrive",
				"hiddenTitle": true,
				"width": 1400,
				"height": 750,
				"minWidth": 768,
				"minHeight": 500,
				"resizable": true,
				"fullscreen": false,
				"alwaysOnTop": false,
				"focus": false,
				"visible": false,
				"dragDropEnabled": true,
				"decorations": true,
				"transparent": true,
				"center": true,
				"windowEffects": {
					"effects": ["sidebar"],
					"state": "followsWindowActiveState",
					"radius": 9
				}
			}
		],
		"security": {
			"csp": {
				"default-src": "'self' webkit-pdfjs-viewer: asset: http://asset.localhost blob: data: filesystem: http: https: tauri:",
				"connect-src": "'self' ipc: http://ipc.localhost ws: wss: http: https: tauri:",
				"img-src": "'self' asset: http://asset.localhost blob: data: filesystem: http: https: tauri:",
				"style-src": "'self' 'unsafe-inline' http: https: tauri:"
			}
		}
	},
	"bundle": {
		"active": true,
		"targets": ["deb", "msi", "dmg"],
		"publisher": "Spacedrive Technology Inc.",
		"copyright": "Spacedrive Technology Inc.",
		"category": "Productivity",
		"shortDescription": "Spacedrive",
		"longDescription": "Cross-platform universal file explorer, powered by an open-source virtual distributed filesystem.",
		"createUpdaterArtifacts": "v1Compatible",
		"icon": [
			"icons/32x32.png",
			"icons/128x128.png",
			"icons/128x128@2x.png",
			"icons/icon.icns",
			"icons/icon.ico"
		],
		"linux": {
			"deb": {
				"files": {
					"/usr/share/spacedrive/models/yolov8s.onnx": "../../.deps/models/yolov8s.onnx"
				},
				"depends": ["libc6", "libxdo3", "dbus"]
			}
		},
		"macOS": {
			"minimumSystemVersion": "10.15",
			"exceptionDomain": null,
			"entitlements": null,
			"frameworks": ["../../.deps/Spacedrive.framework"],
			"dmg": {
				"background": "dmg-background.png",
				"appPosition": {
					"x": 190,
					"y": 190
				},
				"applicationFolderPosition": {
					"x": 470,
					"y": 190
				}
			}
		},
		"windows": {
			"certificateThumbprint": null,
			"webviewInstallMode": {
				"type": "embedBootstrapper",
				"silent": true
			},
			"digestAlgorithm": "sha256",
			"timestampUrl": "",
			"wix": {
				"dialogImagePath": "icons/WindowsDialogImage.bmp",
				"bannerPath": "icons/WindowsBanner.bmp"
			}
		}
	},
	"plugins": {
		"updater": {
			"pubkey": "dW50cnVzdGVkIGNvbW1lbnQ6IG1pbmlzaWduIHB1YmxpYyBrZXk6IEZBMURCMkU5NEU3NDAyOEMKUldTTUFuUk82YklkK296dlkxUGkrTXhCT3ZMNFFVOWROcXNaS0RqWU1kMUdRV2tDdFdIS0Y3YUsK",
			"endpoints": [
				"https://spacedrive.com/api/releases/tauri/{{version}}/{{target}}/{{arch}}"
			]
		},
		"deep-link": {
			"mobile": [],
			"desktop": {
				"schemes": ["spacedrive"]
			}
		}
	}
}
@@ -1,477 +0,0 @@
|
||||
import { createMemoryHistory } from '@remix-run/router';
import { QueryClientProvider } from '@tanstack/react-query';
import { listen } from '@tauri-apps/api/event';
import { PropsWithChildren, startTransition, useEffect, useMemo, useRef, useState } from 'react';
import { createPortal } from 'react-dom';
import {
	getItemFilePath,
	libraryClient,
	RspcProvider,
	useBridgeMutation,
	useLibraryMutation,
	useSelector
} from '@sd/client';
import {
	createRoutes,
	DeeplinkEvent,
	ErrorPage,
	FileDropEvent,
	KeybindEvent,
	PlatformProvider,
	SpacedriveInterfaceRoot,
	SpacedriveRouterProvider,
	TabsContext
} from '@sd/interface';
import { RouteTitleContext } from '@sd/interface/hooks/useRouteTitle';

import '@sd/ui/style';

import { Channel, invoke } from '@tauri-apps/api/core';
import SuperTokens from 'supertokens-web-js';
import EmailPassword from 'supertokens-web-js/recipe/emailpassword';
import Passwordless from 'supertokens-web-js/recipe/passwordless';
import Session from 'supertokens-web-js/recipe/session';
import ThirdParty from 'supertokens-web-js/recipe/thirdparty';
import { explorerStore } from '@sd/interface/app/$libraryId/Explorer/store';
// TODO: Bring this back once upstream is fixed up.
// const client = hooks.createClient({
// 	links: [
// 		loggerLink({
// 			enabled: () => getDebugState().rspcLogger
// 		}),
// 		tauriLink()
// 	]
// });
import getCookieHandler from '@sd/interface/app/$libraryId/settings/client/account/handlers/cookieHandler';
import getWindowHandler from '@sd/interface/app/$libraryId/settings/client/account/handlers/windowHandler';
import { useLocale } from '@sd/interface/hooks';
import { AUTH_SERVER_URL, getTokens } from '@sd/interface/util';

import { Transparent } from '../../../packages/assets/images';
import { commands } from './commands';
import { platform } from './platform';
import { queryClient } from './query';
import { createMemoryRouterWithHistory } from './router';
import { createUpdater } from './updater';

declare global {
	interface Window {
		enableCORSFetch: (enable: boolean) => void;
		useDragAndDrop: () => void;
	}
}

// Disabling until sync is ready.
SuperTokens.init({
	appInfo: {
		apiDomain: AUTH_SERVER_URL,
		apiBasePath: '/api/auth',
		appName: 'Spacedrive Auth Service'
	},
	cookieHandler: getCookieHandler,
	windowHandler: getWindowHandler,
	recipeList: [
		Session.init({ tokenTransferMethod: 'header' }),
		EmailPassword.init()
		// ThirdParty.init(),
		// Passwordless.init()
	]
});

const startupError = (window as any).__SD_ERROR__ as string | undefined;

function useDragAndDrop() {
	const dragState = useSelector(explorerStore, (s) => s.drag);

	useEffect(() => {
		console.log('Drag effect triggered:', {
			dragStateType: dragState?.type,
			itemCount: dragState?.type === 'dragging' ? dragState?.items?.length : undefined
		});

		(async () => {
			if (['linux', 'browser'].includes(await platform.getOs())) {
				console.log('Skipping drag operation on Linux or Browser');
				return;
			}
			if (dragState?.type === 'dragging' && dragState.items.length > 0) {
				console.log('Starting drag operation with items:', dragState.items);

				const items = await Promise.all(
					dragState.items.map(async (item) => {
						const data = getItemFilePath(item);
						if (!data) {
							console.log('No file path data for item:', item);
							return;
						}

						const file_path =
							'path' in data
								? data.path
								: await libraryClient.query(['files.getPath', data.id]);

						console.log('Resolved file path:', file_path);
						return {
							type: 'explorer-item',
							file_path: file_path
						};
					})
				);

				const validFiles = items.filter(Boolean).map((item) => item?.file_path);
				console.log('Invoking start_drag with files:', validFiles);

				try {
					const channel = new Channel<{
						result: 'Dropped' | 'Cancelled';
						cursorPos: { x: number; y: number };
					}>();

					channel.onmessage = (payload) => {
						console.log('Drag completed:', {
							result: payload.result,
							position: payload.cursorPos,
							timestamp: new Date().toISOString()
						});

						if (payload.result === 'Dropped') {
							console.log('Drop location:', {
								x: payload.cursorPos.x,
								y: payload.cursorPos.y,
								screen: window.screen
							});
							// Refetch explorer files after successful drop
							queryClient.invalidateQueries({ queryKey: ['search.paths'] });
						}

						explorerStore.drag = null;
					};

					const image = !Transparent.includes('/@fs/')
						? Transparent
						: Transparent.replace('/@fs', '');

					await invoke('start_drag', {
						files: validFiles,
						image: image,
						onEvent: channel
					});
					console.log('start_drag invoked successfully');
				} catch (error) {
					console.error('Failed to start drag:', error);
					explorerStore.drag = null;
				}
			} else {
				console.log('Drag operation cancelled');
				await invoke('stop_drag');
			}
		})();
	}, [dragState]);
}

export default function App() {
	useEffect(() => {
		// This tells Tauri to show the current window because it's finished loading
		commands.appReady();
		window.enableCORSFetch(true);
		window.useDragAndDrop = useDragAndDrop;
		// .then(() => {
		// 	if (import.meta.env.PROD) window.fetch = fetch;
		// });
	}, []);

	useEffect(() => {
		const keybindListener = listen('keybind', (input) => {
			document.dispatchEvent(new KeybindEvent(input.payload as string));
		});
		const deeplinkListener = listen('deeplink', async (data) => {
			const payload = (data.payload as any).data as string;
			if (!payload) return;
			const json = JSON.parse(payload)[0];
			if (!json) return;
			// json output: "spacedrive://-/URL"
			if (typeof json !== 'string') return;
			if (!json.startsWith('spacedrive://-')) return;
			const url = (json as string).split('://-/')[1];
			if (!url) return;
			document.dispatchEvent(new DeeplinkEvent(url));
		});
		const fileDropListener = listen('tauri://drag-drop', async (data) => {
			document.dispatchEvent(new FileDropEvent((data.payload as { paths: string[] }).paths));
		});

		return () => {
			keybindListener.then((unlisten) => unlisten());
			deeplinkListener.then((unlisten) => unlisten());
			fileDropListener.then((unlisten) => unlisten());
		};
	}, []);

	return (
		<RspcProvider queryClient={queryClient}>
			<QueryClientProvider client={queryClient}>
				{startupError ? (
					<ErrorPage
						message={startupError}
						submessage="Error occurred starting up the Spacedrive core"
					/>
				) : (
					<AppInner />
				)}
			</QueryClientProvider>
		</RspcProvider>
	);
}

// we have a minimum delay between creating new tabs as react router can't handle creating tabs super fast
const TAB_CREATE_DELAY = 150;

const routes = createRoutes(platform);

type RedirectPath = { pathname: string; search: string | undefined };

function AppInner() {
	const [tabs, setTabs] = useState(() => [createTab()]);
	const [selectedTabIndex, setSelectedTabIndex] = useState(0);
	const cloudBootstrap = useLibraryMutation('cloud.bootstrap');

	useEffect(() => {
		(async () => {
			const tokens = await getTokens();
			// If the access token and/or refresh token are missing, we need to skip the cloud bootstrap
			if (tokens.accessToken.length === 0 || tokens.refreshToken.length === 0) return;
			cloudBootstrap.mutate([tokens.accessToken, tokens.refreshToken]);
		})();
		// eslint-disable-next-line react-hooks/exhaustive-deps
	}, []);

	const selectedTab = tabs[selectedTabIndex]!;

	function createTab(redirect?: RedirectPath) {
		const history = createMemoryHistory();
		const router = createMemoryRouterWithHistory({ routes, history });

		const id = Math.random().toString();

		// for "Open in new tab"
		if (redirect) {
			router.navigate({
				pathname: redirect.pathname,
				search: redirect.search
			});
		}

		const dispose = router.subscribe((event) => {
			// we don't care about non-idle events as those are artifacts of form mutations + suspense
			if (event.navigation.state !== 'idle') return;

			setTabs((routers) => {
				const index = routers.findIndex((r) => r.id === id);
				if (index === -1) return routers;

				const routerAtIndex = routers[index]!;

				routers[index] = {
					...routerAtIndex,
					currentIndex: history.index,
					maxIndex:
						event.historyAction === 'PUSH'
							? history.index
							: Math.max(routerAtIndex.maxIndex, history.index)
				};

				return [...routers];
			});
		});

		return {
			id,
			router,
			history,
			dispose,
			element: document.createElement('div'),
			currentIndex: 0,
			maxIndex: 0,
			title: 'New Tab'
		};
	}

	const createTabPromise = useRef(Promise.resolve());

	const ref = useRef<HTMLDivElement>(null);

	useEffect(() => {
		const div = ref.current;
		if (!div) return;

		div.appendChild(selectedTab.element);

		return () => {
			while (div.firstChild) {
				div.removeChild(div.firstChild);
			}
		};
	}, [selectedTab.element]);

	const SizeDisplay = () => {
		const [size, setSize] = useState({
			width: window.innerWidth,
			height: window.innerHeight
		});

		useEffect(() => {
			const handleResize = () => {
				setSize({
					width: window.innerWidth,
					height: window.innerHeight
				});
			};

			window.addEventListener('resize', handleResize);
			return () => window.removeEventListener('resize', handleResize);
		}, []);

		return (
			<div
				style={{
					position: 'fixed',
					bottom: 10,
					right: 10,
					background: 'rgba(0,0,0,0.7)',
					color: 'white',
					padding: '5px 10px',
					borderRadius: '5px',
					fontSize: '12px',
					zIndex: 9999
				}}
			>
				{size.width} x {size.height}
			</div>
		);
	};

	return (
		<RouteTitleContext.Provider
			value={useMemo(
				() => ({
					setTitle(id, title) {
						setTabs((tabs) => {
							const tabIndex = tabs.findIndex((t) => t.id === id);
							if (tabIndex === -1) return tabs;

							tabs[tabIndex] = { ...tabs[tabIndex]!, title };

							return [...tabs];
						});
					}
				}),
				[]
			)}
		>
			<TabsContext.Provider
				value={{
					tabIndex: selectedTabIndex,
					setTabIndex: setSelectedTabIndex,
					tabs: tabs.map(({ router, title }) => ({ router, title })),
					createTab(redirect?: RedirectPath) {
						createTabPromise.current = createTabPromise.current.then(
							() =>
								new Promise((res) => {
									startTransition(() => {
										setTabs((tabs) => {
											const newTab = createTab(redirect);
											const newTabs = [...tabs, newTab];

											setSelectedTabIndex(newTabs.length - 1);

											return newTabs;
										});
									});

									setTimeout(res, TAB_CREATE_DELAY);
								})
						);
					},
					duplicateTab() {
						createTabPromise.current = createTabPromise.current.then(
							() =>
								new Promise((res) => {
									startTransition(() => {
										setTabs((tabs) => {
											const { pathname, search } =
												selectedTab.router.state.location;
											const newTab = createTab({ pathname, search });
											const newTabs = [...tabs, newTab];

											setSelectedTabIndex(newTabs.length - 1);

											return newTabs;
										});
									});

									setTimeout(res, TAB_CREATE_DELAY);
								})
						);
					},
					removeTab(index: number) {
						startTransition(() => {
							setTabs((tabs) => {
								const tab = tabs[index];
								if (!tab) return tabs;

								tab.dispose();

								tabs.splice(index, 1);

								setSelectedTabIndex(Math.min(selectedTabIndex, tabs.length - 1));

								return [...tabs];
							});
						});
					}
				}}
			>
				<PlatformUpdaterProvider>
					<SpacedriveInterfaceRoot>
						{tabs.map((tab, index) =>
							createPortal(
								<SpacedriveRouterProvider
									key={tab.id}
									routing={{
										routes,
										visible: selectedTabIndex === tabs.indexOf(tab),
										router: tab.router,
										currentIndex: tab.currentIndex,
										tabId: tab.id,
										maxIndex: tab.maxIndex
									}}
								/>,
								tab.element
							)
						)}
						{/* <SizeDisplay /> */}
						<div ref={ref} />
					</SpacedriveInterfaceRoot>
				</PlatformUpdaterProvider>
			</TabsContext.Provider>
		</RouteTitleContext.Provider>
	);
}

function PlatformUpdaterProvider(props: PropsWithChildren) {
	const { t } = useLocale();

	return (
		<PlatformProvider
			platform={useMemo(
				() => ({
					...platform,
					updater: window.__SD_UPDATER__ ? createUpdater(t) : undefined
				}),
				[t]
			)}
		>
			{props.children}
		</PlatformProvider>
	);
}
278
apps/desktop/src/commands.ts
generated
@@ -1,278 +0,0 @@
/** tauri-specta globals **/

import { Channel as TAURI_CHANNEL, invoke as TAURI_INVOKE } from '@tauri-apps/api/core';
import * as TAURI_API_EVENT from '@tauri-apps/api/event';
import { type WebviewWindow as __WebviewWindow__ } from '@tauri-apps/api/webviewWindow';

/* eslint-disable */
// This file was generated by [tauri-specta](https://github.com/oscartbeaumont/tauri-specta). Do not edit this file manually.

/** user-defined commands **/

export const commands = {
	async appReady(): Promise<void> {
		await TAURI_INVOKE('app_ready');
	},
	async resetSpacedrive(): Promise<void> {
		await TAURI_INVOKE('reset_spacedrive');
	},
	async openLogsDir(): Promise<Result<null, null>> {
		try {
			return { status: 'ok', data: await TAURI_INVOKE('open_logs_dir') };
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	async refreshMenuBar(): Promise<Result<null, null>> {
		try {
			return { status: 'ok', data: await TAURI_INVOKE('refresh_menu_bar') };
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	async reloadWebview(): Promise<void> {
		await TAURI_INVOKE('reload_webview');
	},
	async setMenuBarItemState(event: MenuEvent, enabled: boolean): Promise<void> {
		await TAURI_INVOKE('set_menu_bar_item_state', { event, enabled });
	},
	async requestFdaMacos(): Promise<void> {
		await TAURI_INVOKE('request_fda_macos');
	},
	async openTrashInOsExplorer(): Promise<Result<null, null>> {
		try {
			return { status: 'ok', data: await TAURI_INVOKE('open_trash_in_os_explorer') };
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	/**
	 * Initiates a drag and drop operation with cursor position tracking
	 *
	 * # Arguments
	 * * `window` - The Tauri window instance
	 * * `_state` - Current drag state (unused)
	 * * `files` - Vector of file paths to be dragged
	 * * `image` - Base64 encoded image to be used as drag icon
	 * * `on_event` - Channel for communicating drag operation events back to the frontend
	 */
	async startDrag(
		files: string[],
		image: string,
		onEvent: TAURI_CHANNEL<CallbackResult>
	): Promise<Result<null, string>> {
		try {
			return {
				status: 'ok',
				data: await TAURI_INVOKE('start_drag', { files, image, onEvent })
			};
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	/**
	 * Stops the cursor position tracking for drag operations
	 */
	async stopDrag(): Promise<void> {
		await TAURI_INVOKE('stop_drag');
	},
	async openFilePaths(
		library: string,
		ids: number[]
	): Promise<Result<OpenFilePathResult[], null>> {
		try {
			return { status: 'ok', data: await TAURI_INVOKE('open_file_paths', { library, ids }) };
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	async openEphemeralFiles(paths: string[]): Promise<Result<EphemeralFileOpenResult[], null>> {
		try {
			return { status: 'ok', data: await TAURI_INVOKE('open_ephemeral_files', { paths }) };
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	async getFilePathOpenWithApps(
		library: string,
		ids: number[]
	): Promise<Result<OpenWithApplication[], null>> {
		try {
			return {
				status: 'ok',
				data: await TAURI_INVOKE('get_file_path_open_with_apps', { library, ids })
			};
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	async getEphemeralFilesOpenWithApps(
		paths: string[]
	): Promise<Result<OpenWithApplication[], null>> {
		try {
			return {
				status: 'ok',
				data: await TAURI_INVOKE('get_ephemeral_files_open_with_apps', { paths })
			};
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	async openFilePathWith(
		library: string,
		fileIdsAndUrls: [number, string][]
	): Promise<Result<null, null>> {
		try {
			return {
				status: 'ok',
				data: await TAURI_INVOKE('open_file_path_with', { library, fileIdsAndUrls })
			};
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	async openEphemeralFileWith(pathsAndUrls: [string, string][]): Promise<Result<null, null>> {
		try {
			return {
				status: 'ok',
				data: await TAURI_INVOKE('open_ephemeral_file_with', { pathsAndUrls })
			};
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	async revealItems(library: string, items: RevealItem[]): Promise<Result<null, null>> {
		try {
			return { status: 'ok', data: await TAURI_INVOKE('reveal_items', { library, items }) };
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	async lockAppTheme(themeType: AppThemeType): Promise<void> {
		await TAURI_INVOKE('lock_app_theme', { themeType });
	},
	async checkForUpdate(): Promise<Result<Update | null, string>> {
		try {
			return { status: 'ok', data: await TAURI_INVOKE('check_for_update') };
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	},
	async installUpdate(): Promise<Result<null, string>> {
		try {
			return { status: 'ok', data: await TAURI_INVOKE('install_update') };
		} catch (e) {
			if (e instanceof Error) throw e;
			else return { status: 'error', error: e as any };
		}
	}
};

/** user-defined events **/

export const events = __makeEvents__<{
	dragAndDropEvent: DragAndDropEvent;
}>({
	dragAndDropEvent: 'drag-and-drop-event'
});

/** user-defined constants **/

/** user-defined types **/

export type AppThemeType = 'Auto' | 'Light' | 'Dark';
export type CallbackResult = { result: WrappedDragResult; cursorPos: WrappedCursorPosition };
export type DragAndDropEvent =
	| { type: 'Hovered'; paths: string[]; x: number; y: number }
	| { type: 'Dropped'; paths: string[]; x: number; y: number }
	| { type: 'Cancelled' };
export type EphemeralFileOpenResult = { t: 'Ok'; c: string } | { t: 'Err'; c: string };
export type MenuEvent =
	| 'NewLibrary'
	| 'NewFile'
	| 'NewDirectory'
	| 'AddLocation'
	| 'OpenOverview'
	| 'OpenSearch'
	| 'OpenSettings'
	| 'ReloadExplorer'
	| 'SetLayoutGrid'
	| 'SetLayoutList'
	| 'SetLayoutMedia'
	| 'ToggleDeveloperTools'
	| 'NewWindow'
	| 'ReloadWebview'
	| 'Copy'
	| 'Cut'
	| 'Paste'
	| 'Duplicate'
	| 'SelectAll';
export type OpenFilePathResult =
	| { t: 'NoLibrary' }
	| { t: 'NoFile'; c: number }
	| { t: 'OpenError'; c: [number, string] }
	| { t: 'AllGood'; c: number }
	| { t: 'Internal'; c: string };
export type OpenWithApplication = { url: string; name: string };
export type RevealItem =
	| { Location: { id: number } }
	| { FilePath: { id: number } }
	| { Ephemeral: { path: string } };
export type Update = { version: string };
export type WrappedCursorPosition = { x: number; y: number };
export type WrappedDragResult = 'Dropped' | 'Cancel';

type __EventObj__<T> = {
	listen: (cb: TAURI_API_EVENT.EventCallback<T>) => ReturnType<typeof TAURI_API_EVENT.listen<T>>;
	once: (cb: TAURI_API_EVENT.EventCallback<T>) => ReturnType<typeof TAURI_API_EVENT.once<T>>;
	emit: null extends T
		? (payload?: T) => ReturnType<typeof TAURI_API_EVENT.emit>
		: (payload: T) => ReturnType<typeof TAURI_API_EVENT.emit>;
};

export type Result<T, E> = { status: 'ok'; data: T } | { status: 'error'; error: E };

function __makeEvents__<T extends Record<string, any>>(mappings: Record<keyof T, string>) {
	return new Proxy(
		{} as unknown as {
			[K in keyof T]: __EventObj__<T[K]> & {
				(handle: __WebviewWindow__): __EventObj__<T[K]>;
			};
		},
		{
			get: (_, event) => {
				const name = mappings[event as keyof T];

				return new Proxy((() => {}) as any, {
					apply: (_, __, [window]: [__WebviewWindow__]) => ({
						listen: (arg: any) => window.listen(name, arg),
						once: (arg: any) => window.once(name, arg),
						emit: (arg: any) => window.emit(name, arg)
					}),
					get: (_, command: keyof __EventObj__<any>) => {
						switch (command) {
							case 'listen':
								return (arg: any) => TAURI_API_EVENT.listen(name, arg);
							case 'once':
								return (arg: any) => TAURI_API_EVENT.once(name, arg);
							case 'emit':
								return (arg: any) => TAURI_API_EVENT.emit(name, arg);
						}
					}
				});
			}
		}
	);
}
@@ -1,12 +0,0 @@
import { createEnv } from '@t3-oss/env-core';
import { z } from 'zod';

export const env = createEnv({
	clientPrefix: 'VITE_',
	client: {
		VITE_LANDING_ORIGIN: z.string().default('https://www.spacedrive.com')
	},
	runtimeEnv: import.meta.env,
	skipValidation: false,
	emptyStringAsUndefined: true
});
@@ -1,18 +0,0 @@
<!doctype html>
<html lang="en" class="vanilla-theme">
	<head>
		<meta charset="UTF-8" />
		<link rel="icon" type="image/svg+xml" href="/src/favicon.svg" />
		<meta name="viewport" content="width=device-width, initial-scale=1.0" />
		<title>Spacedrive</title>
		<script>
			// Some libs depend on webpack specific stuff
			var global = globalThis || window;
		</script>
	</head>

	<body style="overflow: hidden">
		<div id="root"></div>
		<script type="module" src="./index.tsx"></script>
	</body>
</html>
@@ -1,14 +0,0 @@
// WARNING: Import order is important in this file. Make sure ~/patches comes before App.
import { StrictMode, Suspense } from 'react';
import ReactDOM from 'react-dom/client';
import '~/patches';
import App from './App';

const root = ReactDOM.createRoot(document.getElementById('root') as HTMLElement);
root.render(
	<StrictMode>
		<Suspense>
			<App />
		</Suspense>
	</StrictMode>
);
@@ -1,13 +0,0 @@
import { tauriLink } from '@spacedrive/rspc-tauri/src/v2';

globalThis.isDev = import.meta.env.DEV;
globalThis.rspcLinks = [
	// TODO
	// loggerLink({
	// 	enabled: () => getDebugState().rspcLogger
	// }),
	tauriLink()
];
globalThis.onHotReload = (func: () => void) => {
	if (import.meta.hot) import.meta.hot.dispose(func);
};
@@ -1,92 +0,0 @@
import { invoke } from '@tauri-apps/api/core';
import { homeDir } from '@tauri-apps/api/path';
import { confirm, open as dialogOpen, save as dialogSave } from '@tauri-apps/plugin-dialog';
import { type } from '@tauri-apps/plugin-os';
import { open as shellOpen } from '@tauri-apps/plugin-shell';
// @ts-expect-error: Doesn't have a types package.
import ConsistentHash from 'consistent-hash';
import { OperatingSystem, Platform } from '@sd/interface';

import { commands, events } from './commands';
import { env } from './env';

const customUriAuthToken = (window as any).__SD_CUSTOM_SERVER_AUTH_TOKEN__ as string | undefined;
const customUriServerUrl = (window as any).__SD_CUSTOM_URI_SERVER__ as string[] | undefined;

const queryParams = customUriAuthToken ? `?token=${encodeURIComponent(customUriAuthToken)}` : '';

async function getOs(): Promise<OperatingSystem> {
	switch (await type()) {
		case 'linux':
			return 'linux';
		case 'windows':
			return 'windows';
		case 'macos':
			return 'macOS';
		default:
			return 'unknown';
	}
}

let hr: typeof ConsistentHash | undefined;

function constructServerUrl(urlSuffix: string) {
	if (!hr) {
		if (!customUriServerUrl)
			throw new Error("'window.__SD_CUSTOM_URI_SERVER__' was not injected correctly!");

		hr = new ConsistentHash();
		customUriServerUrl.forEach((url) => hr.add(url));
	}

	// Randomly switch between servers to avoid HTTP connection limits
	return hr.get(urlSuffix) + urlSuffix + queryParams;
}

export const platform = {
	platform: 'tauri',
	getThumbnailUrlByThumbKey: (thumbKey) =>
		constructServerUrl(
			`/thumbnail/${encodeURIComponent(
				thumbKey.base_directory_str
			)}/${encodeURIComponent(thumbKey.shard_hex)}/${encodeURIComponent(thumbKey.cas_id)}.webp`
		),
	getFileUrl: (libraryId, locationLocalId, filePathId) =>
		constructServerUrl(`/file/${libraryId}/${locationLocalId}/${filePathId}`),
	getFileUrlByPath: (path) =>
		constructServerUrl(`/local-file-by-path/${encodeURIComponent(path)}`),
	getRemoteRspcEndpoint: (remote_identity) => ({
		url: `${customUriServerUrl?.[0]
			?.replace('https', 'wss')
			?.replace('http', 'ws')}/remote/${encodeURIComponent(
			remote_identity
		)}/rspc/ws?token=${customUriAuthToken}`
	}),
	constructRemoteRspcPath: (remote_identity, path) =>
		constructServerUrl(
			`/remote/${encodeURIComponent(remote_identity)}/uri/${path}?token=${customUriAuthToken}`
		),
	openLink: shellOpen,
	getOs,
	openDirectoryPickerDialog: (opts) => {
		const result = dialogOpen({ directory: true, ...opts });
		if (opts?.multiple) return result as any; // Tauri don't properly type narrow on `multiple` argument
		return result;
	},
	openFilePickerDialog: () => dialogOpen({ multiple: true }),
	saveFilePickerDialog: (opts) => dialogSave(opts),
	showDevtools: () => invoke('show_devtools'),
	confirm: (msg, cb) => confirm(msg).then(cb),
	subscribeToDragAndDropEvents: (cb) =>
		events.dragAndDropEvent.listen((e) => {
			cb(e.payload);
		}),
	userHomeDir: homeDir,
	auth: {
		start(url) {
			return shellOpen(url);
		}
	},
	...commands,
	landingApiOrigin: env.VITE_LANDING_ORIGIN
} satisfies Omit<Platform, 'updater'>;
@@ -1,12 +0,0 @@
import { QueryClient } from '@tanstack/react-query';

export const queryClient = new QueryClient({
	defaultOptions: {
		queries: {
			networkMode: 'always'
		},
		mutations: {
			networkMode: 'always'
		}
	}
});
@@ -1,21 +0,0 @@
import { createRouter, InitialEntry, MemoryHistory } from '@remix-run/router';
import { UNSAFE_mapRouteProperties } from 'react-router';
import { RouteObject } from 'react-router-dom';

export function createMemoryRouterWithHistory(props: {
	routes: RouteObject[];
	history: MemoryHistory;
	basename?: string;
	initialEntries?: InitialEntry[];
	initialIndex?: number;
}) {
	return createRouter({
		routes: props.routes,
		history: props.history,
		basename: props.basename,
		future: {
			v7_prependBasename: true
		},
		mapRouteProperties: UNSAFE_mapRouteProperties
	}).initialize();
}
@@ -1,138 +0,0 @@
import { listen } from '@tauri-apps/api/event';
import { proxy, useSnapshot } from 'valtio';
import { UpdateStore } from '@sd/interface';
import { useLocale } from '@sd/interface/hooks';
import { toast, ToastId } from '@sd/ui';

import { commands } from './commands';

declare global {
	interface Window {
		__SD_UPDATER__?: true;
		__SD_DESKTOP_VERSION__: string;
	}
}

export function createUpdater(t: ReturnType<typeof useLocale>['t']) {
	if (!window.__SD_UPDATER__) return;

	const updateStore = proxy<UpdateStore>({
		status: 'idle'
	});

	listen<UpdateStore>('updater', (e) => {
		Object.assign(updateStore, e.payload);
	});

	const onInstallCallbacks = new Set<() => void>();

	async function checkForUpdate() {
		const result = await commands.checkForUpdate();

		if (result.status === 'error') {
			console.error('UPDATER ERROR', result.error);
			// TODO: Show some UI?
			return null;
		}
		if (!result.data) return null;
		const update = result.data;

		let id: ToastId | null = null;

		const cb = () => {
			if (id !== null) toast.dismiss(id);
		};

		onInstallCallbacks.add(cb);

		toast.info(
			(_id) => {
				const { t } = useLocale();

				id = _id;

				return {
					title: t('new_update_available'),
					body: t('version', { version: update.version })
				};
			},
			{
				onClose() {
					onInstallCallbacks.delete(cb);
				},
				duration: 10 * 1000,
				action: {
					label: t('update'),
					onClick: installUpdate
				}
			}
		);

		return update;
	}

	function installUpdate() {
		for (const cb of onInstallCallbacks) {
			cb();
		}

		const promise = commands.installUpdate();

		toast.promise(promise, {
			loading: t('downloading_update'),
			success: t('update_downloaded'),
			error: (e: any) => (
				<>
					<p>{t('failed_to_download_update')}</p>
					<p className="text-gray-300">Error: {e.toString()}</p>
				</>
			)
		});

		return promise;
	}

	const SD_VERSION_LOCALSTORAGE = 'sd-version';
	async function runJustUpdatedCheck(onViewChangelog: () => void) {
		const version = window.__SD_DESKTOP_VERSION__;
		const lastVersion = localStorage.getItem(SD_VERSION_LOCALSTORAGE);
		if (!lastVersion) return;

		if (lastVersion !== version) {
			localStorage.setItem(SD_VERSION_LOCALSTORAGE, version);
			let tagline = null;

			try {
				const request = await fetch(
					`${import.meta.env.VITE_LANDING_ORIGIN}/api/releases/${version}`
				);
				const { frontmatter } = await request.json();
				tagline = frontmatter?.tagline;
			} catch (error) {
				console.warn('Failed to fetch release info');
				console.error(error);
			}

			toast.success(
				{
					title: t('updated_successfully', { version }),
					body: tagline
				},
				{
					duration: 10 * 1000,
					action: {
						label: t('view_changes'),
						onClick: onViewChangelog
					}
				}
			);
		}
	}

	return {
		useSnapshot: () => useSnapshot(updateStore),
		checkForUpdate,
		installUpdate,
		runJustUpdatedCheck
	};
}
apps/desktop/src/vite-env.d.ts (7 lines, vendored)
@@ -1,7 +0,0 @@
/// <reference types="vite/client" />

declare interface ImportMetaEnv {
	VITE_OS: string;
}

declare module '@babel/core' {}
@@ -1 +0,0 @@
module.exports = require('@sd/ui/tailwind')('desktop');
@@ -1,17 +0,0 @@
{
	"extends": "../../packages/config/base.tsconfig.json",
	"compilerOptions": {
		"rootDir": "src",
		"declarationDir": "dist",
		"paths": {
			"~/*": ["./src/*"]
		},
		"moduleResolution": "bundler"
	},
	"include": ["src"],
	"references": [
		{
			"path": "../../interface"
		}
	]
}
@@ -1,48 +0,0 @@
import { sentryVitePlugin } from '@sentry/vite-plugin';
import { defineConfig, loadEnv, mergeConfig, Plugin } from 'vite';

import baseConfig from '../../packages/config/vite';

const devtoolsPlugin: Plugin = {
	name: 'devtools-plugin',
	transformIndexHtml(html) {
		const isDev = process.env.NODE_ENV === 'development';
		if (isDev) {
			const devtoolsScript = `<script src="http://localhost:8097"></script>`;
			const headTagIndex = html.indexOf('</head>');
			if (headTagIndex > -1) {
				return html.slice(0, headTagIndex) + devtoolsScript + html.slice(headTagIndex);
			}
		}
		return html;
	}
};

export default defineConfig(({ mode }) => {
	process.env = { ...process.env, ...loadEnv(mode, process.cwd(), '') };

	return mergeConfig(baseConfig, {
		server: {
			port: 8001
		},
		build: {
			rollupOptions: {
				treeshake: 'recommended',
				external: [
					// Don't bundle Fda video for non-macOS platforms
					process.platform !== 'darwin' && /^@sd\/assets\/videos\/Fda.mp4$/
				].filter(Boolean)
			}
		},
		plugins: [
			devtoolsPlugin,
			process.env.SENTRY_AUTH_TOKEN &&
				// All this plugin does is give Sentry access to source maps and release data for errors that users *choose* to report
				sentryVitePlugin({
					authToken: process.env.SENTRY_AUTH_TOKEN,
					org: 'spacedriveapp',
					project: 'desktop'
				})
		]
	});
});
@@ -1,39 +0,0 @@
module.exports = {
	extends: [require.resolve('@sd/config/eslint/reactNative.js')],
	parserOptions: {
		tsconfigRootDir: __dirname,
		project: './tsconfig.json'
	},
	rules: {
		'tailwindcss/classnames-order': [
			'warn',
			{
				config: './tailwind.config.js'
			}
		],
		'tailwindcss/no-contradicting-classname': 'warn',
		'tailwindcss/enforces-shorthand': 'off',
		'@typescript-eslint/no-require-imports': 'off',
		'no-restricted-imports': [
			'error',
			{
				paths: [
					{
						name: 'react-native',
						importNames: ['SafeAreaView'],
						message: 'Import SafeAreaView from react-native-safe-area-context instead'
					},
					{
						name: 'react-native',
						importNames: ['Image', 'ImageProps', 'ImageBackground'],
						message: 'Import it from expo-image instead'
					},
					{
						name: 'react-native-toast-message',
						message: 'Import it from components instead'
					}
				]
			}
		]
	}
};
apps/mobile/.gitattributes (1 line, vendored)
@@ -1 +0,0 @@
*.pbxproj -text
apps/mobile/.gitignore (556 lines, vendored)
@@ -1,556 +0,0 @@
# Created by https://www.toptal.com/developers/gitignore/api/reactnative,android,androidstudio,xcode,objective-c,swift,swiftpm,swiftpackagemanager
# Edit at https://www.toptal.com/developers/gitignore?templates=reactnative,android,androidstudio,xcode,objective-c,swift,swiftpm,swiftpackagemanager

### Android ###
# Gradle files
.gradle/
build/

# Local configuration file (sdk path, etc)
local.properties

# Log/OS Files
*.log

# Android Studio generated files and folders
captures/
.externalNativeBuild/
.cxx/
*.apk
output.json

# IntelliJ
*.iml
.idea/
misc.xml
deploymentTargetDropDown.xml
render.experimental.xml

# Keystore files
*.jks
*.keystore

# Google Services (e.g. APIs or Firebase)
google-services.json

# Android Profiling
*.hprof

### Android Patch ###
gen-external-apklibs

# Replacement of .externalNativeBuild directories introduced
# with Android Studio 3.5.

### Objective-C ###
# Xcode
#
# gitignore contributors: remember to update Global/Xcode.gitignore, Objective-C.gitignore & Swift.gitignore

## User settings
xcuserdata/

## compatibility with Xcode 8 and earlier (ignoring not required starting Xcode 9)
*.xcscmblueprint
*.xccheckout

## compatibility with Xcode 3 and earlier (ignoring not required starting Xcode 4)
DerivedData/
*.moved-aside
*.pbxuser
!default.pbxuser
*.mode1v3
!default.mode1v3
*.mode2v3
!default.mode2v3
*.perspectivev3
!default.perspectivev3

## Obj-C/Swift specific
*.hmap

## App packaging
*.ipa
*.dSYM.zip
*.dSYM

# CocoaPods
# We recommend against adding the Pods directory to your .gitignore. However
# you should judge for yourself, the pros and cons are mentioned at:
# https://guides.cocoapods.org/using/using-cocoapods.html#should-i-check-the-pods-directory-into-source-control
# Pods/
# Add this line if you want to avoid checking in source code from the Xcode workspace
# *.xcworkspace

# Carthage
# Add this line if you want to avoid checking in source code from Carthage dependencies.
# Carthage/Checkouts

Carthage/Build/

# fastlane
# It is recommended to not store the screenshots in the git repo.
# Instead, use fastlane to re-generate the screenshots whenever they are needed.
# For more information about the recommended setup visit:
# https://docs.fastlane.tools/best-practices/source-control/#source-control

fastlane/report.xml
fastlane/Preview.html
fastlane/screenshots/**/*.png
fastlane/test_output

# Code Injection
# After new code Injection tools there's a generated folder /iOSInjectionProject
# https://github.com/johnno1962/injectionforxcode

iOSInjectionProject/

### Objective-C Patch ###

### ReactNative ###
# React Native Stack Base

.expo
__generated__

### ReactNative.Android Stack ###
# Gradle files

# Local configuration file (sdk path, etc)

# Log/OS Files

# Android Studio generated files and folders

# IntelliJ

# Keystore files

# Google Services (e.g. APIs or Firebase)

# Android Profiling

### ReactNative.Linux Stack ###
*~

# temporary files which can be created if a process still has a handle open of a deleted file
.fuse_hidden*

# KDE directory preferences
.directory

# Linux trash folder which might appear on any partition or disk
.Trash-*

# .nfs files are created when an open file is removed but is still being accessed
.nfs*

### ReactNative.Node Stack ###
# Logs
logs
npm-debug.log*
yarn-debug.log*
yarn-error.log*
lerna-debug.log*
.pnpm-debug.log*

# Diagnostic reports (https://nodejs.org/api/report.html)
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json

# Runtime data
pids
*.pid
*.seed
*.pid.lock

# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov

# Coverage directory used by tools like istanbul
coverage
*.lcov

# nyc test coverage
.nyc_output

# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
.grunt

# Bower dependency directory (https://bower.io/)
bower_components

# node-waf configuration
.lock-wscript

# Compiled binary addons (https://nodejs.org/api/addons.html)
build/Release

# Dependency directories
node_modules/
jspm_packages/

# Snowpack dependency directory (https://snowpack.dev/)
web_modules/

# TypeScript cache
*.tsbuildinfo

# Optional npm cache directory
.npm

# Optional eslint cache
.eslintcache

# Optional stylelint cache
.stylelintcache

# Microbundle cache
.rpt2_cache/
.rts2_cache_cjs/
.rts2_cache_es/
.rts2_cache_umd/

# Optional REPL history
.node_repl_history

# Output of 'npm pack'
*.tgz

# Yarn Integrity file
.yarn-integrity

# dotenv environment variable files
.env
.env.development.local
.env.test.local
.env.production.local
.env.local

# parcel-bundler cache (https://parceljs.org/)
.cache
.parcel-cache

# Next.js build output
.next
out

# Nuxt.js build / generate output
.nuxt
dist

# Gatsby files
.cache/
# Comment in the public line in if your project uses Gatsby and not Next.js
# https://nextjs.org/blog/next-9-1#public-directory-support
# public

# vuepress build output
.vuepress/dist

# vuepress v2.x temp and cache directory
.temp

# Docusaurus cache and generated files
.docusaurus

# Serverless directories
.serverless/

# FuseBox cache
.fusebox/

# DynamoDB Local files
.dynamodb/

# TernJS port file
.tern-port

# Stores VSCode versions used for testing VSCode extensions
.vscode-test

# yarn v2
.yarn/cache
.yarn/unplugged
.yarn/build-state.yml
.yarn/install-state.gz
.pnp.*

### ReactNative.macOS Stack ###
# General
.DS_Store
.AppleDouble
.LSOverride

# Icon must end with two \r
Icon

# Thumbnails
._*

# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent

# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk

### ReactNative.Buck Stack ###
buck-out/
.buckconfig.local
.buckd/
.buckversion
.fakebuckversion

### ReactNative.Xcode Stack ###
ios/

## Xcode 8 and earlier

### ReactNative.Gradle Stack ###
.gradle
**/build/
!src/**/build/
android/

# Ignore Gradle GUI config
gradle-app.setting

# Avoid ignoring Gradle wrapper jar file (.jar files are usually ignored)
!gradle-wrapper.jar

# Avoid ignore Gradle wrappper properties
!gradle-wrapper.properties

# Cache of project
.gradletasknamecache

# Eclipse Gradle plugin generated files
# Eclipse Core
.project
# JDT-specific (Eclipse Java Development Tools)
.classpath

### Swift ###
# Xcode
# gitignore contributors: remember to update Global/Xcode.gitignore, Objective-C.gitignore & Swift.gitignore

## Playgrounds
timeline.xctimeline
playground.xcworkspace

# Swift Package Manager
# Add this line if you want to avoid checking in source code from Swift Package Manager dependencies.
# Packages/
# Package.pins
# Package.resolved
# *.xcodeproj
# Xcode automatically generates this directory with a .xcworkspacedata file and xcuserdata
# hence it is not needed unless you have added a package configuration file to your project
# .swiftpm

.build/

# CocoaPods
# We recommend against adding the Pods directory to your .gitignore. However
# you should judge for yourself, the pros and cons are mentioned at:
# https://guides.cocoapods.org/using/using-cocoapods.html#should-i-check-the-pods-directory-into-source-control
# Pods/
# Add this line if you want to avoid checking in source code from the Xcode workspace
# *.xcworkspace

# Carthage
# Add this line if you want to avoid checking in source code from Carthage dependencies.
# Carthage/Checkouts

# Accio dependency management
Dependencies/
.accio/

# fastlane
# It is recommended to not store the screenshots in the git repo.
# Instead, use fastlane to re-generate the screenshots whenever they are needed.
# For more information about the recommended setup visit:
# https://docs.fastlane.tools/best-practices/source-control/#source-control

# Code Injection
# After new code Injection tools there's a generated folder /iOSInjectionProject
# https://github.com/johnno1962/injectionforxcode

### SwiftPackageManager ###
Packages
xcuserdata
*.xcodeproj

### SwiftPM ###

### Xcode ###

### Xcode Patch ###
*.xcodeproj/*
!*.xcodeproj/project.pbxproj
!*.xcodeproj/xcshareddata/
!*.xcodeproj/project.xcworkspace/
!*.xcworkspace/contents.xcworkspacedata
/*.gcno
**/xcshareddata/WorkspaceSettings.xcsettings

### AndroidStudio ###
# Covers files to be ignored for android development using Android Studio.

# Built application files
*.ap_
*.aab

# Files for the ART/Dalvik VM
*.dex

# Java class files
*.class

# Generated files
bin/
gen/
out/

# Gradle files

# Signing files
.signing/

# Local configuration file (sdk path, etc)

# Proguard folder generated by Eclipse
proguard/

# Log Files

# Android Studio
/*/build/
/*/local.properties
/*/out
/*/*/build
/*/*/production
.navigation/
*.ipr
*.swp

# Keystore files

# Google Services (e.g. APIs or Firebase)
# google-services.json

# Android Patch

# External native build folder generated in Android Studio 2.2 and later
.externalNativeBuild

# NDK
obj/

# IntelliJ IDEA
*.iws
/out/

# User-specific configurations
.idea/caches/
.idea/libraries/
.idea/shelf/
.idea/workspace.xml
.idea/tasks.xml
.idea/.name
.idea/compiler.xml
.idea/copyright/profiles_settings.xml
.idea/encodings.xml
.idea/misc.xml
.idea/modules.xml
.idea/scopes/scope_settings.xml
.idea/dictionaries
.idea/vcs.xml
.idea/jsLibraryMappings.xml
.idea/datasources.xml
.idea/dataSources.ids
.idea/sqlDataSources.xml
.idea/dynamic.xml
.idea/uiDesigner.xml
.idea/assetWizardSettings.xml
.idea/gradle.xml
.idea/jarRepositories.xml
.idea/navEditor.xml

# Legacy Eclipse project files
.cproject
.settings/

# Mobile Tools for Java (J2ME)
.mtj.tmp/

# Package Files #
*.war
*.ear

# virtual machine crash logs (Reference: http://www.java.com/en/download/help/error_hotspot.xml)
hs_err_pid*

## Plugin-specific files:

# mpeltonen/sbt-idea plugin
.idea_modules/

# JIRA plugin
atlassian-ide-plugin.xml

# Mongo Explorer plugin
.idea/mongoSettings.xml

# Crashlytics plugin (for Android Studio and IntelliJ)
com_crashlytics_export_strings.xml
crashlytics.properties
crashlytics-build.properties
fabric.properties

### AndroidStudio Patch ###

!/gradle/wrapper/gradle-wrapper.jar

# End of https://www.toptal.com/developers/gitignore/api/reactnative,android,androidstudio,xcode,objective-c,swift,swiftpm,swiftpackagemanager

### Project ###

# Expo
web-build/
android/
ios/
!modules/sd-core/android/
!modules/sd-core/ios/

# Native
*.orig.*
*.p8
*.p12
*.key
*.mobileprovision

# Metro
.metro-health-check*
@@ -1,5 +0,0 @@
module.exports = {
	icon: true,
	typescript: true,
	svgProps: { fill: 'currentColor' }
};
@@ -1,5 +0,0 @@
- Make sure to run `pnpm i` if you make any change to a package mobile uses, such as `assets`.
- If the iOS build fails with a `node not found` error, run `echo "export NODE_BINARY=$(command -v node)" >> .xcode.env.local` in the `mobile/ios/` directory.
- If Xcode can't find node, run `ln -s "$(which node)" /usr/local/bin/node`.
- To view the logs from the Spacedrive Core API, run `xcrun simctl launch --console booted com.spacedrive.app` with the app built in debug mode.
- If Rive assets have been updated, run `pnpm mobile prebuild` to import the latest version of the `.riv` files into the project.
@@ -1,71 +0,0 @@
{
	"expo": {
		"name": "Spacedrive",
		"slug": "spacedrive",
		"owner": "spacedrive",
		"version": "0.1.0",
		"orientation": "portrait",
		"jsEngine": "hermes",
		"scheme": "spacedrive",
		"platforms": ["ios", "android"],
		"userInterfaceStyle": "automatic",
		"icon": "./assets/icon.png",
		"updates": {
			"enabled": false,
			"fallbackToCacheTimeout": 0
		},
		"assetBundlePatterns": ["**/*"],
		"ios": {
			"supportsTablet": false,
			"bundleIdentifier": "com.spacedrive.app",
			"infoPlist": {
				"ITSAppUsesNonExemptEncryption": false,
				"UIBackgroundModes": ["remote-notification"],
				"UIFileSharingEnabled": true
			},
			"entitlements": {
				"com.apple.developer.icloud-container-identifiers": [],
				"com.apple.developer.icloud-services": ["CloudDocuments"],
				"com.apple.developer.ubiquity-container-identifiers": []
			}
		},
		"android": {
			"softwareKeyboardLayoutMode": "pan",
			"permissions": [
				"MANAGE_EXTERNAL_STORAGE",
				"READ_MEDIA_AUDIO",
				"READ_MEDIA_IMAGES",
				"READ_MEDIA_VIDEO"
			],
			"package": "com.spacedrive.app"
		},
		"splash": {
			"image": "./assets/splash.png",
			"backgroundColor": "#000000"
		},
		"privacy": "hidden",
		"plugins": [
			[
				"expo-build-properties",
				{
					"android": {
						"minSdkVersion": 28
					},
					"ios": {
						"useFrameworks": "static",
						"deploymentTarget": "14.0"
					}
				}
			],
			[
				"expo-av",
				{
					"microphonePermission": "Allow Spacedrive to access your microphone."
				}
			],
			["./scripts/withRiveAssets.js"],
			["./scripts/withAndroidIntent.js"],
			["./scripts/withNativeFunctions.js"]
		]
	}
}
(binary image assets removed: 313 KiB and 62 KiB)
@@ -1,29 +0,0 @@
module.exports = function (api) {
	api.cache(true);
	return {
		presets: ['babel-preset-expo'],
		plugins: [
			'react-native-reanimated/plugin',
			[
				'module-resolver',
				{
					extensions: [
						'.js',
						'.jsx',
						'.ts',
						'.tsx',
						'.android.js',
						'.android.tsx',
						'.ios.js',
						'.ios.tsx'
					],
					root: ['src'],
					alias: {
						'~': './src'
					}
				}
			]
		],
		overrides: [{ test: /\.solid.tsx$/, presets: ['solid'] }]
	};
};
@@ -1,5 +0,0 @@
import { registerRootComponent } from 'expo';

import { AppWrapper } from './src/main';

registerRootComponent(AppWrapper);
@@ -1,46 +0,0 @@
const { makeMetroConfig, resolveUniqueModule, exclusionList } = require('@rnx-kit/metro-config');

const path = require('path');

// Needed for transforming svgs from @sd/assets
const [reactSVGPath, reactSVGExclude] = resolveUniqueModule('react-native-svg');

const { getDefaultConfig } = require('expo/metro-config');
const expoDefaultConfig = getDefaultConfig(__dirname);

const projectRoot = __dirname;
const workspaceRoot = path.resolve(projectRoot, '../..');

const metroConfig = makeMetroConfig({
	...expoDefaultConfig,
	projectRoot,
	watchFolders: [workspaceRoot],
	resolver: {
		...expoDefaultConfig.resolver,
		extraNodeModules: {
			'react-native-svg': reactSVGPath
		},
		blockList: exclusionList([reactSVGExclude]),
		sourceExts: [...expoDefaultConfig.resolver.sourceExts, 'svg'],
		assetExts: expoDefaultConfig.resolver.assetExts.filter((ext) => ext !== 'svg'),
		disableHierarchicalLookup: false,
		nodeModulesPaths: [
			path.resolve(projectRoot, 'node_modules'),
			path.resolve(workspaceRoot, 'node_modules')
		],
		platforms: ['ios', 'android']
	},
	transformer: {
		...expoDefaultConfig.transformer,
		getTransformOptions: async () => ({
			transform: {
				// What does this do?
				experimentalImportSupport: false,
				inlineRequires: true
			}
		}),
		babelTransformerPath: require.resolve('react-native-svg-transformer')
	}
});

module.exports = metroConfig;
@@ -1,24 +0,0 @@
//
//  NativeFunctions.m
//  Spacedrive
//
//  Created by Arnab Chakraborty on November 27, 2024.
//

#import <Foundation/Foundation.h>
#import <React/RCTBridgeModule.h>

@interface RCT_EXTERN_MODULE(NativeFunctions, NSObject)

RCT_EXTERN_METHOD(saveLocation:(nonnull NSString *)path
                  locationId:(nonnull NSNumber *)locationId
                  resolver:(RCTPromiseResolveBlock)resolve
                  rejecter:(RCTPromiseRejectBlock)reject)

RCT_EXTERN_METHOD(previewFile:(nonnull NSString *)path
                  locationId:(nonnull NSNumber *)locationId
                  resolver:(RCTPromiseResolveBlock)resolve
                  rejecter:(RCTPromiseRejectBlock)reject)

@end
@@ -1,202 +0,0 @@
//
//  NativeFunctions.swift
//  Spacedrive
//
//  Created by Arnab Chakraborty on November 27, 2024.
//

import Foundation
import UIKit
import QuickLook

@objc(NativeFunctions)
class NativeFunctions: NSObject, QLPreviewControllerDataSource {
	private var fileURL: URL?

	@objc
	static func requiresMainQueueSetup() -> Bool {
		return true
	}

	private func getBookmarkStoragePath(for id: Int) -> URL {
		let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
		return documentsDirectory.appendingPathComponent("\(id).sd_bookmark")
	}

	@objc
	func saveLocation(_ path: String,
	                  locationId: NSNumber,
	                  resolver resolve: @escaping RCTPromiseResolveBlock,
	                  rejecter reject: @escaping RCTPromiseRejectBlock) {
		do {
			let url = URL(fileURLWithPath: path)
			guard url.startAccessingSecurityScopedResource() else {
				reject("ERROR", "Cannot access directory", nil)
				return
			}
			defer { url.stopAccessingSecurityScopedResource() }

			let bookmarkData = try url.bookmarkData(
				options: .minimalBookmark,
				includingResourceValuesForKeys: nil,
				relativeTo: nil
			)

			let bookmarkPath = getBookmarkStoragePath(for: locationId.intValue)
			try bookmarkData.write(to: bookmarkPath, options: .atomicWrite)

			resolve(["success": true])
		} catch {
			reject("ERROR", "Failed to create bookmark: \(error.localizedDescription)", nil)
		}
	}

	@objc
	func previewFile(_ path: String,
	                 locationId: NSNumber,
	                 resolver resolve: @escaping RCTPromiseResolveBlock,
	                 rejecter reject: @escaping RCTPromiseRejectBlock) {
		#if DEBUG
		print("🔍 PreviewFile called with path: \(path), locationId: \(locationId)")
		#endif

		do {
			let bookmarkPath = getBookmarkStoragePath(for: locationId.intValue)
			#if DEBUG
			print("📁 Bookmark path: \(bookmarkPath)")
			#endif

			let fileURL = URL(fileURLWithPath: path)
			#if DEBUG
			print("📄 File URL: \(fileURL)")
			#endif

			if FileManager.default.fileExists(atPath: bookmarkPath.path) {
				#if DEBUG
				print("✅ Bookmark exists at path")
				#endif
				let bookmarkData = try Data(contentsOf: bookmarkPath)
				#if DEBUG
				print("📊 Bookmark data size: \(bookmarkData.count) bytes")
				#endif

				var isStale = false
				let directoryURL = try URL(
					resolvingBookmarkData: bookmarkData,
					options: [],
					relativeTo: nil,
					bookmarkDataIsStale: &isStale
				)
				#if DEBUG
				print("📂 Resolved directory URL: \(directoryURL)")
				print("🔄 Is bookmark stale? \(isStale)")
				#endif

				guard directoryURL.startAccessingSecurityScopedResource() else {
					#if DEBUG
					print("❌ Failed to access security-scoped resource for directory")
					#endif
					reject("ERROR", "Cannot access directory", nil)
					return
				}
				defer {
					directoryURL.stopAccessingSecurityScopedResource()
					#if DEBUG
					print("🔒 Stopped accessing security-scoped resource")
					#endif
				}

				// Get the relative path from the base directory to the file
				let basePath = directoryURL.path
				let fullPath = fileURL.path

				#if DEBUG
				print("📍 Base path: \(basePath)")
				print("📍 Full path: \(fullPath)")
				#endif

				// Ensure the file path starts with the base path
				guard fullPath.hasPrefix(basePath) else {
					#if DEBUG
					print("❌ File is not within the bookmarked directory")
					#endif
					reject("ERROR", "File is not within the bookmarked directory", nil)
					return
				}

				// Use the full file URL directly
				self.fileURL = fileURL
				#if DEBUG
				print("💾 Set fileURL for QuickLook: \(fileURL)")
				#endif

				// Verify file exists
				if FileManager.default.fileExists(atPath: fileURL.path) {
					#if DEBUG
					print("✅ File exists at path")
					#endif
				} else {
					#if DEBUG
					print("⚠️ File does not exist at path")
					#endif
					reject("ERROR", "File not found at path", nil)
					return
				}
			} else {
				#if DEBUG
				print("❌ Bookmark not found at path: \(bookmarkPath)")
				#endif
				reject("ERROR", "Bookmark not found for this location", nil)
				return
			}

			#if DEBUG
			print("🚀 Preparing to present QuickLook controller")
			#endif
			DispatchQueue.main.async {
				let previewController = QLPreviewController()
				previewController.dataSource = self

				guard let presentedVC = RCTPresentedViewController() else {
					#if DEBUG
					print("❌ Failed to get presented view controller")
					#endif
					reject("ERROR", "Cannot present preview", nil)
					return
				}

				#if DEBUG
				print("📱 Presenting QuickLook controller")
				#endif
				presentedVC.present(previewController, animated: true) {
					#if DEBUG
					print("✨ QuickLook controller presented successfully")
					#endif
					resolve(["success": true])
				}
			}
		} catch {
			#if DEBUG
			print("💥 Error occurred: \(error.localizedDescription)")
			print("🔍 Detailed error: \(error)")
			#endif
			reject("ERROR", "Failed to preview file: \(error.localizedDescription)", nil)
		}
	}

	// MARK: - QLPreviewControllerDataSource
	func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
		#if DEBUG
		print("📊 numberOfPreviewItems called, returning: \(fileURL != nil ? 1 : 0)")
		#endif
		return fileURL != nil ? 1 : 0
	}

	func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
		#if DEBUG
		print("🎯 previewItemAt called for index: \(index)")
		print("📄 Returning fileURL: \(String(describing: fileURL))")
		#endif
		return fileURL! as QLPreviewItem
	}
}
@@ -1,103 +0,0 @@
apply plugin: 'com.android.library'
apply plugin: 'kotlin-android'
apply plugin: 'maven-publish'


group = 'com.spacedrive.core'
version = '0.1.0'

buildscript {
    def expoModulesCorePlugin = new File(project(":expo-modules-core").projectDir.absolutePath, "ExpoModulesCorePlugin.gradle")
    if (expoModulesCorePlugin.exists()) {
        apply from: expoModulesCorePlugin
        applyKotlinExpoModulesCorePlugin()
    }

    // Simple helper that allows the root project to override versions declared by this library.
    ext.safeExtGet = { prop, fallback ->
        rootProject.ext.has(prop) ? rootProject.ext.get(prop) : fallback
    }

    // Ensures backward compatibility
    ext.getKotlinVersion = {
        if (ext.has("kotlinVersion")) {
            ext.kotlinVersion()
        } else {
            ext.safeExtGet("kotlinVersion", "1.8.10")
        }
    }

    repositories {
        mavenCentral()
        maven {
            url "https://plugins.gradle.org/m2/"
        }
    }

    dependencies {
        classpath("org.jetbrains.kotlin:kotlin-gradle-plugin:${getKotlinVersion()}")
        classpath 'org.mozilla.rust-android-gradle:plugin:0.9.3'
    }
}

afterEvaluate {
    publishing {
        publications {
            release(MavenPublication) {
                from components.release
            }
        }
        repositories {
            maven {
                url = mavenLocal().url
            }
        }
    }
}

android {
    compileSdkVersion safeExtGet("compileSdkVersion", 34)

    compileOptions {
        sourceCompatibility JavaVersion.VERSION_17
        targetCompatibility JavaVersion.VERSION_17
    }

    kotlinOptions {
        jvmTarget = JavaVersion.VERSION_17.majorVersion
    }

    namespace "com.spacedrive.core"
    defaultConfig {
        minSdkVersion safeExtGet("minSdkVersion", 28)
        targetSdkVersion safeExtGet("targetSdkVersion", 34)
        versionCode 1
        versionName "0.1.0"
    }
    lintOptions {
        abortOnError false
    }
    publishing {
        singleVariant("release") {
            withSourcesJar()
        }
    }
}

repositories {
    mavenCentral()
}

dependencies {
    implementation project(':expo-modules-core')
    implementation "org.jetbrains.kotlin:kotlin-stdlib-jdk7:${getKotlinVersion()}"
}

// Run the ./build.sh script to build the Rust code
task buildRustCode(type: Exec) {
    commandLine "./build.sh"
}

tasks.named('preBuild').configure {
    dependsOn buildRustCode
}
@@ -1,74 +0,0 @@
#!/usr/bin/env sh

set -eu

if [ "${CI:-}" = "true" ]; then
    set -x
fi

err() {
    for _line in "$@"; do
        echo "$_line" >&2
    done
    exit 1
}

if [ -z "${HOME:-}" ]; then
    case "$(uname)" in
        "Darwin")
            HOME="$(CDPATH='' cd -- "$(osascript -e 'set output to (POSIX path of (path to home folder))')" && pwd -P)"
            ;;
        "Linux")
            HOME="$(CDPATH='' cd -- "$(getent passwd "$(id -un)" | cut -d: -f6)" && pwd -P)"
            ;;
        *)
            err "Your OS ($(uname)) is not supported by this script." \
                'We would welcome a PR or some help adding your OS to this script.' \
                'https://github.com/spacedriveapp/spacedrive/issues'
            ;;
    esac

    export HOME
fi

echo "Building 'sd-mobile-android' library..."

__dirname="$(CDPATH='' cd -- "$(dirname -- "$0")" && pwd -P)"

# Ensure output dir exists
OUTPUT_DIRECTORY="${__dirname}/../../../../../apps/mobile/android/app/src/main/jniLibs"
mkdir -p "$OUTPUT_DIRECTORY"

# Required for CI and for everyone I guess?
export PATH="${CARGO_HOME:-"${HOME}/.cargo"}/bin:$PATH"

# Set the targets to build
# If CI, then we build x86_64 else we build all targets
if [ "${CI:-}" = "true" ]; then
    # TODO: This need to be adjusted for future mobile release CI
    case "$(uname -m)" in
        "arm64" | "aarch64")
            ANDROID_BUILD_TARGET_LIST="arm64-v8a"
            ;;
        "x86_64")
            ANDROID_BUILD_TARGET_LIST="x86_64"
            ;;
        *)
            err 'Unsupported architecture for CI build.'
            ;;
    esac
else
    # ANDROID_BUILD_TARGET_LIST="arm64-v8a armeabi-v7a x86_64"
    ANDROID_BUILD_TARGET_LIST="arm64-v8a"
fi

# Configure build targets CLI arg for `cargo ndk`
echo "Building targets: $ANDROID_BUILD_TARGET_LIST"
set --
for _target in $ANDROID_BUILD_TARGET_LIST; do
    set -- "$@" -t "$_target"
done

cd "${__dirname}/crate"
cargo ndk --platform 34 "$@" -o "$OUTPUT_DIRECTORY" build
# \ --release
@@ -1,22 +0,0 @@
[package]
name = "sd-mobile-android"
version = "0.1.0"

edition.workspace = true
license.workspace = true
repository.workspace = true
rust-version.workspace = true

[lib]
# Android can use dynamic linking since all FFI is done via JNI
crate-type = ["cdylib"]

[target.'cfg(target_os = "android")'.dependencies]
# Spacedrive Sub-crates
sd-mobile-core = { path = "../../core" }

# Workspace dependencies
tracing = { workspace = true }

# Specific Mobile Android dependencies
jni = "0.21.1"
@@ -1,110 +0,0 @@
#![cfg(target_os = "android")]

use std::panic;

use jni::{
    objects::{JClass, JObject, JString},
    JNIEnv,
};

use sd_mobile_core::*;

use tracing::error;

#[no_mangle]
pub extern "system" fn Java_com_spacedrive_core_SDCoreModule_registerCoreEventListener(
    env: JNIEnv,
    class: JClass,
) {
    let result = panic::catch_unwind(|| {
        let jvm = env.get_java_vm().unwrap();
        let class = env.new_global_ref(class).unwrap();

        spawn_core_event_listener(move |data| {
            let mut env = jvm.attach_current_thread().unwrap();

            let s = env.new_string(data).expect("Couldn't create java string!");
            env.call_method(
                &class,
                "sendCoreEvent",
                "(Ljava/lang/String;)V",
                &[(&s).into()],
            )
            .unwrap();
        })
    });

    if let Err(err) = result {
        // TODO: Send rspc error or something here so we can show this in the UI.
        // TODO: Maybe reinitialise the core cause it could be in an invalid state?
        error!("Error in Java_com_spacedrive_core_SDCoreModule_registerCoreEventListener: {err:?}");
    }
}

#[no_mangle]
pub extern "system" fn Java_com_spacedrive_core_SDCoreModule_handleCoreMsg(
    env: JNIEnv,
    class: JClass,
    query: JString,
    callback: JObject,
) {
    let jvm = env.get_java_vm().unwrap();
    let mut env = jvm.attach_current_thread().unwrap();
    let callback = env.new_global_ref(callback).unwrap();

    let query: String = env
        .get_string(&query)
        .expect("Couldn't get java string!")
        .into();

    // env.call_method(
    //     class,
    //     "printFromRust",
    //     "(Ljava/lang/Object;)V",
    //     &[env
    //         .new_string("Hello from Rust".to_string())
    //         .expect("Couldn't create java string!")
    //         .into()],
    // )
    // .unwrap();

    let result = panic::catch_unwind(|| {
        let data_directory = {
            let mut env = jvm.attach_current_thread().unwrap();
            let data_dir = env
                .call_method(&class, "getDataDirectory", "()Ljava/lang/String;", &[])
                .unwrap()
                .l()
                .unwrap();

            env.get_string((&data_dir).into()).unwrap().into()
        };

        let jvm = env.get_java_vm().unwrap();
        handle_core_msg(query, data_directory, move |result| match result {
            Ok(data) => {
                let mut env = jvm.attach_current_thread().unwrap();
                let s = env.new_string(data).expect("Couldn't create java string!");
                env.call_method(
                    &callback,
                    "resolve",
                    "(Ljava/lang/String;)V",
                    &[(&s).into()],
                )
                .unwrap();
            }
            Err(err) => error!(err),
        });
    });

    if let Err(err) = result {
        // TODO: Send rspc error or something here so we can show this in the UI.
        // TODO: Maybe reinitialise the core cause it could be in an invalid state?

        // TODO: This log statement doesn't work. I recon the JNI env is being dropped before it's called.
        error!(
            "Error in Java_com_spacedrive_app_SDCore_registerCoreEventListener: {:?}",
            err
        );
    }
}
@@ -1,2 +0,0 @@
<manifest>
</manifest>
@@ -1,61 +0,0 @@
package com.spacedrive.core

import expo.modules.kotlin.Promise
import expo.modules.kotlin.modules.Module
import expo.modules.kotlin.modules.ModuleDefinition

class SDCoreModule : Module() {
    private var registeredWithRust = false
    private var listeners = 0

    init {
        System.loadLibrary("sd_mobile_android")
    }

    // is exposed by Rust and is used to register the subscription
    private external fun registerCoreEventListener()

    private external fun handleCoreMsg(query: String, promise: SDCorePromise)

    public fun getDataDirectory(): String {
        return appContext.persistentFilesDirectory.absolutePath;
    }

    public fun printFromRust(msg: String) {
        print(msg);
    }

    public fun sendCoreEvent(body: String) {
        if (listeners > 0) {
            this@SDCoreModule.sendEvent(
                "SDCoreEvent",
                mapOf(
                    "body" to body
                )
            )
        }
    }

    override fun definition() = ModuleDefinition {
        Name("SDCore")

        Events("SDCoreEvent")

        OnStartObserving {
            if (!registeredWithRust)
            {
                this@SDCoreModule.registerCoreEventListener();
            }

            this@SDCoreModule.listeners++;
        }

        OnStopObserving {
            this@SDCoreModule.listeners--;
        }

        AsyncFunction("sd_core_msg") { query: String, promise: Promise ->
            this@SDCoreModule.handleCoreMsg(query, SDCorePromise(promise))
        }
    }
}
@@ -1,15 +0,0 @@
package com.spacedrive.core;

import expo.modules.kotlin.Promise;

public class SDCorePromise {
    public Promise promise;

    public SDCorePromise(Promise promise) {
        this.promise = promise;
    }

    public void resolve(String msg) {
        this.promise.resolve(msg);
    }
}
@@ -1,31 +0,0 @@
[package]
name = "sd-mobile-core"
version = "0.1.0"

edition.workspace = true
license.workspace = true
repository.workspace = true
rust-version.workspace = true

# Spacedrive Sub-crates
[target.'cfg(target_os = "ios")'.dependencies]
sd-core = { default-features = false, features = [
    "ffmpeg",
    "heif",
    "mobile"
], path = "../../../../../core" }

[target.'cfg(target_os = "android")'.dependencies]
sd-core = { path = "../../../../../core", features = ["mobile"], default-features = false }

[dependencies]
# Workspace dependencies
futures = { workspace = true }
rspc = { workspace = true }
serde_json = { workspace = true }
tokio = { workspace = true }
tracing = { workspace = true }

# Specific Mobile Core dependencies
futures-channel = "0.3.30"
futures-locks = "0.7.1"
@@ -1,145 +0,0 @@
#![cfg(any(target_os = "android", target_os = "ios"))]

use futures::{future::join_all, StreamExt};
use futures_channel::mpsc;
use rspc::internal::jsonrpc::{
    self, handle_json_rpc, OwnedMpscSender, Request, RequestId, Response, Sender,
    SubscriptionUpgrade,
};
use sd_core::{api::Router, Node};
use serde_json::{from_str, from_value, to_string, Value};
use std::{
    borrow::Cow,
    collections::HashMap,
    future::{ready, Ready},
    marker::Send,
    sync::{Arc, LazyLock, OnceLock},
};
use tokio::{
    runtime::Runtime,
    sync::{oneshot, Mutex},
};
use tracing::error;

pub static RUNTIME: LazyLock<Runtime> = LazyLock::new(|| Runtime::new().unwrap());

pub type NodeType = LazyLock<Mutex<Option<(Arc<Node>, Arc<Router>)>>>;

pub static NODE: NodeType = LazyLock::new(|| Mutex::new(None));

#[allow(clippy::type_complexity)]
pub static SUBSCRIPTIONS: LazyLock<
    Arc<futures_locks::Mutex<HashMap<RequestId, oneshot::Sender<()>>>>,
> = LazyLock::new(Default::default);

pub static EVENT_SENDER: OnceLock<mpsc::Sender<Response>> = OnceLock::new();

pub struct MobileSender<'a> {
    resp: &'a mut Option<Response>,
}

impl<'a> Sender<'a> for MobileSender<'a> {
    type SendFut = Ready<()>;
    type SubscriptionMap = Arc<futures_locks::Mutex<HashMap<RequestId, oneshot::Sender<()>>>>;
    type OwnedSender = OwnedMpscSender;

    fn subscription(self) -> SubscriptionUpgrade<'a, Self> {
        SubscriptionUpgrade::Supported(
            OwnedMpscSender::new(
                EVENT_SENDER
                    .get()
                    .expect("Core was not started before making a request!")
                    .clone(),
            ),
            SUBSCRIPTIONS.clone(),
        )
    }

    fn send(self, resp: jsonrpc::Response) -> Self::SendFut {
        *self.resp = Some(resp);
        ready(())
    }
}

pub fn handle_core_msg(
    query: String,
    data_dir: String,
    callback: impl FnOnce(Result<String, String>) + Send + 'static,
) {
    RUNTIME.spawn(async move {
        let (node, router) = {
            let node = &mut *NODE.lock().await;
            match node {
                Some(node) => node.clone(),
                None => {
                    let _guard = Node::init_logger(&data_dir);

                    let new_node = match Node::new(data_dir).await {
                        Ok(node) => node,
                        Err(e) => {
                            error!(?e, "Failed to initialize node;");
                            callback(Err(query));
                            return;
                        }
                    };

                    node.replace(new_node.clone());
                    new_node
                }
            }
        };

        let reqs = match from_str::<Value>(&query).and_then(|v| match v.is_array() {
            true => from_value::<Vec<Request>>(v),
            false => from_value::<Request>(v).map(|v| vec![v]),
        }) {
            Ok(v) => v,
            Err(e) => {
                error!(?e, "Failed to decode JSON-RPC request;");
                callback(Err(query));
                return;
            }
        };

        let responses = join_all(reqs.into_iter().map(|request| {
            let node = node.clone();
            let router = router.clone();
            async move {
                let mut resp = Option::<Response>::None;
                handle_json_rpc(
                    node.clone(),
                    request,
                    Cow::Borrowed(&router),
                    MobileSender { resp: &mut resp },
                )
                .await;
                resp
            }
        }))
        .await;

        callback(Ok(serde_json::to_string(
            &responses.into_iter().flatten().collect::<Vec<_>>(),
        )
        .unwrap()));
    });
}

pub fn spawn_core_event_listener(callback: impl Fn(String) + Send + 'static) {
    let (tx, mut rx) = mpsc::channel(100);
    let _ = EVENT_SENDER.set(tx);

    RUNTIME.spawn(async move {
        while let Some(event) = rx.next().await {
            let data = match to_string(&event) {
                Ok(json) => json,
                Err(e) => {
                    error!(?e, "Failed to serialize event;");
                    continue;
                }
            };

            callback(data);
        }
    });
}