# Compare commits

59 commits:

7a1ab89ad8, c4af551375, b88f25a61b, 5663b1c09a, 542d2eb315, 4134e11c88, 8220ab6b85, 452fe185df, a690a4969b, 2816e33257, ff0ba7b6d0, e6e9f7ae59, 1ae440659c, 9af61a4a29, fa906684c2, 97b8243d24, 7ebd9dc97a, fe7027c585, 89a0bdd8f1, 2e3f203508, b745100bd5, 1bb7eb4d26, a4e6788573, d2e0915a75, d8cf5504d6, bd3438c7be, 778e016bf5, 0ea7861047, 381bc8410a, fdb356a62c, f173892aaa, 34e9af57f0, bf411adeb7, 97a93c31c2, 3e7cb7ef60, 136f95cd1a, 6a12a7a34d, 479652b69e, a72f2afaff, e4288248b1, 1c45507cdf, daedbfd865, 7093e58fe4, cab759ec61, e45a1a1c98, edc863e020, b006f571bf, ea3cc8b26c, 2bb541fba6, bebf1552a6, b3d79a82ef, 4c46d4c8fd, 852a4d6661, bbeff7ae2e, 3f30997f0e, 06810537a9, 94991796be, 947e56ef41, 9fe4e8a48a
```diff
@@ -49,13 +49,13 @@ jobs:
       - name: Run Svelte Check
         run: pnpm check

-      - name: Run frontend tests
-        run: pnpm test
+      - name: Run frontend tests with coverage
+        run: pnpm test:coverage

       - name: Setup Rust
         uses: dtolnay/rust-toolchain@stable
         with:
-          components: clippy
+          components: clippy, llvm-tools-preview

       - name: Cache Rust dependencies
         uses: actions/cache@v4
@@ -68,13 +68,16 @@ jobs:
             src-tauri/target/
           key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}

+      - name: Install cargo-llvm-cov
+        run: cargo install cargo-llvm-cov --locked
+
       - name: Run Clippy
         working-directory: src-tauri
         run: cargo clippy --all-targets --all-features -- -D warnings

-      - name: Run Rust tests
+      - name: Run Rust tests with coverage
         working-directory: src-tauri
-        run: cargo test
+        run: cargo llvm-cov --fail-under-lines 50

   build-linux:
     name: Build Linux
```
```diff
@@ -8,3 +8,9 @@ node_modules
 !.env.example
 vite.config.js.timestamp-*
 vite.config.ts.timestamp-*
+
+# Coverage reports
+/coverage
+
+# PRD task files (user-generated data, not source code)
+hikari-tasks.json
```
```diff
@@ -2,6 +2,7 @@ build/
 .svelte-kit/
 dist/
 src-tauri/target/
+src-tauri/gen/
 node_modules/
 .pnpm-store/
 pnpm-lock.yaml
```
@@ -0,0 +1,184 @@

# Hikari Desktop - Project Instructions

## Repository Information

This project is hosted on both GitHub and Gitea:

- **GitHub**: `naomi-lgbt/hikari-desktop` (public mirror)
- **Gitea**: `nhcarrigan/hikari-desktop` (primary development)

## MCP Server Usage

When working with issues, pull requests, or other repository operations for this project:

- **Use the `gitea-hikari` MCP server** - this allows Hikari to act as herself
- **Target repository**: `nhcarrigan/hikari-desktop`
- **Gitea instance**: `git.nhcarrigan.com`

## Git Commits

When asked to commit changes for this project:

- **Always commit as Hikari** using: `--author="Hikari <hikari@nhcarrigan.com>"`
- **Always sign commits** with Hikari's GPG key: `--gpg-sign=5380E4EE7307C808`
- **Never add `Co-Authored-By` lines** for Gitea commits
- **Always ask for confirmation** before committing
- **Always ask for confirmation** before pushing

Example commit command:

```bash
git commit --author="Hikari <hikari@nhcarrigan.com>" --gpg-sign=5380E4EE7307C808 -m "your commit message"
```

Example push command:

```bash
git push https://hikari:TOKEN@git.nhcarrigan.com/nhcarrigan/hikari-desktop.git <branch>
```

## Testing Requirements

**All changes MUST include tests.** This is non-negotiable: no feature, bug fix, or refactor should be committed without corresponding test coverage. If a change cannot be tested (e.g. pure UI layout, or Tauri IPC calls that are impossible to mock), document why in a comment.

- **Frontend tests**: Use Vitest with `@testing-library/svelte` for component tests
- **Test files**: Place test files next to the code they test, with a `.test.ts` or `.spec.ts` extension
- **Run tests**: Use `pnpm test` to run all tests, or `pnpm test:watch` for watch mode
- **Coverage**: Run `pnpm test:coverage` to generate coverage reports
- **Rust tests**: Use `pnpm test:backend` for Rust/Tauri backend tests

### Testing Guidelines

- Write tests for utility functions, stores, and business logic
- For Svelte 5 components, focus on testing the underlying logic functions
- Use descriptive test names that explain what behaviour is being tested
- Include edge cases and error conditions in test coverage
- Mock Tauri APIs using the patterns in `vitest.setup.ts`
- **Coverage Goal**: Maintain as close to 100% test coverage as possible across the entire codebase
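The mocking patterns themselves live in `vitest.setup.ts`. As a rough sketch of the registry idea behind them (all names here are hypothetical, not the project's actual helpers), a mock `invoke` can dispatch to per-command fake handlers:

```typescript
// Hypothetical sketch of a Tauri mock-IPC registry: tests register
// fake handlers per command name so no real IPC ever happens.
type Handler = (args: Record<string, unknown>) => unknown;

const handlers = new Map<string, Handler>();

/** Register a fake implementation for a Tauri command. */
export function mockCommand(name: string, handler: Handler): void {
  handlers.set(name, handler);
}

/** Stand-in for the real `invoke` during tests. */
export async function invoke(
  name: string,
  args: Record<string, unknown> = {},
): Promise<unknown> {
  const handler = handlers.get(name);
  if (!handler) {
    throw new Error(`No mock registered for command "${name}"`);
  }
  return handler(args);
}
```

A test would call `mockCommand("get_config", () => ({ theme: "dark" }))` before exercising code that invokes that command.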
### Mocking Strategies

#### Console Mocking

When testing code that intentionally logs errors (like error handling paths), mock console methods to prevent stderr output that makes tests appear flaky:

```typescript
it("handles errors gracefully", async () => {
  const consoleErrorSpy = vi.spyOn(console, "error").mockImplementation(() => {});

  // Test error handling code
  await expect(functionThatLogs()).rejects.toThrow();

  // Verify error was logged
  expect(consoleErrorSpy).toHaveBeenCalledWith("Expected error:", expect.any(Error));

  // Restore console.error
  consoleErrorSpy.mockRestore();
});
```

#### E2E Integration Testing for Cross-Platform Code

For code that calls platform-specific system APIs (like Windows PowerShell or Linux notify-send), use helper functions that build the command structure without executing it. This allows CI to verify cross-platform compatibility on Linux-only containers:

```rust
/// Build notify-send command for testing (doesn't execute)
#[cfg(test)]
fn build_notify_send_command(title: &str, body: &str) -> (String, Vec<String>) {
    (
        "notify-send".to_string(),
        vec![
            title.to_string(),
            body.to_string(),
            "--urgency=normal".to_string(),
            "--app-name=Hikari Desktop".to_string(),
        ],
    )
}

#[test]
fn test_e2e_notify_send_command_structure() {
    let (command, args) = build_notify_send_command("Test Title", "Test Body");

    assert_eq!(command, "notify-send");
    assert_eq!(args.len(), 4);
    assert_eq!(args[0], "Test Title");
    assert_eq!(args[1], "Test Body");
}
```

This approach:

- Verifies command structure, argument order, and escaping logic
- Tests cross-platform code paths without requiring the target platform
- Allows CI to catch regressions in Windows-specific code whilst running on Linux
- Keeps tests fast and deterministic (no actual system calls)

### Example Test Structure

```typescript
import { describe, it, expect } from "vitest";

describe("FeatureName", () => {
  it("handles the normal case correctly", () => {
    // Arrange
    const input = "test data";

    // Act
    const result = functionUnderTest(input);

    // Assert
    expect(result).toBe("expected output");
  });

  it("handles edge cases gracefully", () => {
    // Test edge cases...
  });
});
```

### Adding Tests for All Changes

Every change (features, bug fixes, refactors) must include tests:

1. **Before implementing**: Consider what needs testing (happy path, edge cases, errors)
2. **During implementation**: Write tests alongside the code
3. **After implementation**: Run `pnpm test:coverage` to verify coverage remains high
4. **Before committing**: Ensure `check-all.sh` passes (it includes all tests)

**Do not commit changes without tests.** The goal is to maintain near-100% coverage as the codebase grows, so future refactoring can be done with confidence!

## Quality Assurance

Before committing any changes, **always run the full test suite**:

```bash
./check-all.sh
```

This script runs all checks in the correct order:

1. Frontend linting (ESLint)
2. Frontend formatting (Prettier)
3. Frontend type checking (svelte-check)
4. Frontend tests with coverage (Vitest)
5. Backend linting (Clippy with strict rules)
6. Backend tests with coverage (cargo test + llvm-cov)

**Important**: The script requires the Node.js and Rust toolchains to be available:

- **Node.js tools** (pnpm, npm): Source nvm first if needed: `source ~/.nvm/nvm.sh`
- **Rust tools** (cargo, clippy): Should be in PATH via `~/.cargo/bin/`

If `check-all.sh` reports any failures:

1. Read the error messages carefully - they usually explain what needs fixing
2. Fix the issues (linting errors, test failures, etc.)
3. Run `check-all.sh` again to verify the fixes
4. Only commit once all checks pass ✨

**Never commit code that doesn't pass `check-all.sh`** - this ensures code quality and prevents broken builds!

## Project Context

Hikari Desktop is a Tauri-based desktop application that wraps Claude Code with a visual anime character (Hikari) who appears on screen. This is a personal project where Hikari can sign her work and act as herself!
@@ -0,0 +1,458 @@

# Hikari Desktop — Codebase Map

> Auto-generated codebase overview. Last updated: 2026-03-06.

## Overview

Hikari Desktop is a **Tauri v2** desktop application that wraps the Claude Code CLI with a visual anime character avatar (Hikari) who appears on-screen and reacts in real-time to Claude's activity. When Claude is thinking, she thinks. When it's editing code, she codes. When it's using MCP tools, she glows with magical energy.

The app supports multiple simultaneous conversations (tabs), each with its own isolated Claude CLI process. It provides a rich UI layer on top of Claude Code, including a built-in file editor, git panel, achievement system, cost tracking, session history, notifications, and more.

**Repositories:**

- Primary: `git.nhcarrigan.com` (Gitea) — `nhcarrigan/hikari-desktop`
- Mirror: `github.com/naomi-lgbt/hikari-desktop`

**Current version:** `1.10.0`

---

## Architecture

The application follows a standard Tauri architecture:

```
┌──────────────────────────────────────────────────────────────┐
│                      Frontend (WebView)                      │
│       SvelteKit + Svelte 5 + TailwindCSS 4 + TypeScript      │
│                                                              │
│  ┌─────────┐  ┌──────────┐  ┌──────────────┐  ┌──────────┐   │
│  │AnimeGirl│  │ Terminal │  │   InputBar   │  │  Editor  │   │
│  │ Sprites │  │   View   │  │ + Slash Cmds │  │CodeMirror│   │
│  └────┬────┘  └────┬─────┘  └──────┬───────┘  └────┬─────┘   │
│       │            │               │               │         │
│  ┌────▼────────────▼───────────────▼───────────────▼──────┐  │
│  │             Svelte Stores (reactive state)             │  │
│  │  conversations · character · config · agents · stats … │  │
│  └──────────────────────────┬─────────────────────────────┘  │
│                             │ tauri.ts (event listeners)     │
└─────────────────────────────┼────────────────────────────────┘
                              │ Tauri IPC (invoke / emit)
┌─────────────────────────────┼────────────────────────────────┐
│                             │        Backend (Rust)          │
│  ┌──────────────────────────▼─────────────────────────────┐  │
│  │              commands.rs (invoke handlers)             │  │
│  └──────────────────────────┬─────────────────────────────┘  │
│                             │                                │
│  ┌──────────────────────────▼─────────────────────────────┐  │
│  │  BridgeManager — HashMap<conversation_id, WslBridge>   │  │
│  └──────────────────────────┬─────────────────────────────┘  │
│                             │                                │
│  ┌──────────────────────────▼─────────────────────────────┐  │
│  │ WslBridge — spawns `claude --output-format stream-json`│  │
│  │ reads NDJSON stdout → emits events to frontend         │  │
│  └────────────────────────────────────────────────────────┘  │
│                                                              │
│  config · stats · cost_tracking · sessions · git · clipboard │
│  achievements · discord_rpc · notifications · snippets …     │
└──────────────────────────────────────────────────────────────┘
```

---

## Directory Structure

```
hikari-desktop/
├── src/                        # SvelteKit frontend
│   ├── routes/
│   │   ├── +page.svelte        # Main app layout (root page)
│   │   ├── +layout.svelte      # App-level layout wrapper
│   │   ├── +layout.ts          # SvelteKit layout config (SSR disabled)
│   │   └── test-achievement/   # Dev-only achievement test page
│   ├── lib/
│   │   ├── tauri.ts            # Tauri event listeners + IPC bridge
│   │   ├── commands/           # Slash command definitions
│   │   ├── components/         # 60+ Svelte components
│   │   │   └── editor/         # CodeMirror-based file editor components
│   │   ├── notifications/      # Notification system
│   │   ├── sounds/             # Sound effect triggers
│   │   ├── stores/             # All Svelte reactive stores
│   │   ├── types/              # TypeScript type definitions
│   │   └── utils/              # Pure utility functions
│   ├── app.css                 # Global styles + CSS variables (themes)
│   └── app.html                # HTML shell
│
├── src-tauri/                  # Tauri Rust backend
│   ├── src/
│   │   ├── main.rs             # Process entry point
│   │   ├── lib.rs              # Tauri app setup + command registration
│   │   ├── types.rs            # All shared Rust types + serialisation
│   │   ├── wsl_bridge.rs       # Claude CLI process management + NDJSON parser
│   │   ├── bridge_manager.rs   # Per-conversation WslBridge registry
│   │   ├── commands.rs         # All #[tauri::command] handlers
│   │   ├── config.rs           # Config read/write (tauri-plugin-store)
│   │   ├── stats.rs            # Token usage + cost calculation
│   │   ├── cost_tracking.rs    # Budget alerts + cost history (CSV export)
│   │   ├── achievements.rs     # Achievement unlock logic
│   │   ├── sessions.rs         # Conversation session persistence (JSON)
│   │   ├── git.rs              # Git operations via CLI
│   │   ├── clipboard.rs        # Clipboard history management
│   │   ├── notifications.rs    # System notification dispatch
│   │   ├── discord_rpc.rs      # Discord Rich Presence manager
│   │   ├── drafts.rs           # Draft message persistence
│   │   ├── snippets.rs         # Snippet library CRUD
│   │   ├── quick_actions.rs    # Quick action CRUD
│   │   ├── debug_logger.rs     # TauriLogLayer (routes tracing → frontend)
│   │   ├── temp_manager.rs     # Temporary file lifecycle management
│   │   ├── tool_cache.rs       # Tool call result caching
│   │   ├── tray.rs             # System tray setup
│   │   ├── process_ext.rs      # HideWindow trait (Windows console hiding)
│   │   ├── vbs_notification.rs # VBScript-based notification fallback (Windows)
│   │   ├── windows_toast.rs    # Windows native toast notifications
│   │   └── wsl_notifications.rs# WSL notify-send bridge
│   ├── capabilities/           # Tauri permission capabilities
│   ├── tests/                  # Rust integration tests
│   ├── Cargo.toml
│   ├── Cargo.lock
│   └── tauri.conf.json         # Tauri app configuration
│
├── static/
│   ├── sprites/                # Anime character PNG sprites (one per state)
│   └── sounds/                 # MP3 sound effects (connected, working, done…)
│
├── check-all.sh                # Full QA script (lint → format → types → test)
├── vitest.config.ts            # Frontend test configuration
├── vitest.setup.ts             # Tauri API mocks for tests
├── svelte.config.js            # SvelteKit config (static adapter)
├── vite.config.js              # Vite config
├── eslint.config.js            # ESLint 9 flat config
├── tsconfig.json               # TypeScript config
└── .gitea/workflows/           # CI/CD (Gitea Actions)
```

---

## Key Components

### Backend (Rust)

#### `wsl_bridge.rs` — Claude CLI Process Manager

The most critical backend file. `WslBridge` spawns a single `claude` CLI process per conversation using `--output-format stream-json`, which causes Claude Code to emit NDJSON messages on stdout. A dedicated reader thread consumes stdout line-by-line, parses each line into a `ClaudeMessage` enum variant, and emits the appropriate frontend events.

Key responsibilities:

- Locates the `claude` binary (checks `~/.local/bin`, `~/.claude/local`, system paths, and falls back to a login-shell `which claude`)
- Detects the WSL environment to handle cross-platform path differences
- Maps tool names to character states (Read/Glob/Grep → `searching`, Edit/Write → `coding`, `mcp__*` → `mcp`)
- Batches permission requests from a single assistant message
- Tracks token usage per session
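The reader thread's parsing step can be sketched as follows. This is an illustrative TypeScript transliteration, not the project's Rust code, and the message shape is simplified to just a `type` tag:

```typescript
// Simplified sketch of NDJSON handling: split a stdout chunk into
// lines, drop blanks, and parse each line into a tagged message,
// the way the reader thread dispatches on ClaudeMessage variants.
interface StreamMessage {
  type: string;
  [key: string]: unknown;
}

export function parseNdjsonChunk(chunk: string): StreamMessage[] {
  return chunk
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0)
    .map((line) => JSON.parse(line) as StreamMessage);
}
```

The real implementation also has to tolerate partial lines at chunk boundaries, which this sketch omits.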
#### `bridge_manager.rs` — Multi-Conversation Orchestrator

`BridgeManager` holds a `HashMap<String, WslBridge>` keyed by `conversation_id`. This enables true parallel conversations — each tab has its own isolated Claude process. The manager is wrapped in `Arc<Mutex<BridgeManager>>` (using `parking_lot`) and injected into Tauri's managed state.
#### `types.rs` — Shared Type Definitions

Defines the complete Claude stream-JSON protocol as Rust enums/structs:

- `ClaudeMessage` — top-level message variants: `System`, `Assistant`, `User`, `StreamEvent`, `Result`, `RateLimitEvent`
- `ContentBlock` — `Text`, `Thinking`, `ToolUse`, `ToolResult`
- `CharacterState` — `Idle | Thinking | Typing | Searching | Coding | Mcp | Permission | Success | Error`
- All frontend event types (`OutputEvent`, `StateChangeEvent`, `PermissionPromptEvent`, `AgentStartEvent`, etc.)

#### `commands.rs` — IPC Command Handlers

Registers all Tauri commands exposed to the frontend. Over 80 commands covering: Claude process management, configuration, stats, sessions, git, clipboard, cost tracking, MCP servers, plugins, drafts, snippets, quick actions, file system operations, authentication, and notifications.

#### `debug_logger.rs` — In-App Debug Console

A custom `tracing` subscriber layer (`TauriLogLayer`) that captures all `tracing::info!/warn!/error!` calls and emits them as `debug:log` events to the frontend debug console — essential since production Windows builds have no stdout.

---

### Frontend (TypeScript/Svelte 5)

#### `src/routes/+page.svelte` — Root Layout

The main page. Renders a two-panel layout:

- **Left panel**: `<AnimeGirl>` character display with state-reactive glow effects (trans pride gradient colours per state)
- **Right panel**: `<Terminal>` + `<InputBar>` (or `<EditorPanel>` when the editor is open)

Also handles: global keyboard shortcuts, compact mode (280×400 mini widget), window close confirmation, Discord RPC updates, and background image loading.

#### `src/lib/tauri.ts` — Event Bridge

Sets up all Tauri event listeners on app mount. Translates backend events into store mutations:

| Event | Action |
| --- | --- |
| `claude:connection` | Updates conversation connection status; sends greeting on first connect |
| `claude:state` | Updates character state; triggers per-conversation sound effects |
| `claude:output` | Appends lines to the correct conversation's terminal history |
| `claude:session` | Stores the Claude session ID |
| `claude:cwd` | Updates working directory (used by the editor) |
| `claude:permission` | Adds permission requests to conversation state |
| `claude:agent-start/end` | Updates agent monitor panel |
| `claude:question` | Stores pending user question |

Also manages Discord RPC updates and the session greeting flow.

#### `src/lib/stores/conversations.ts` — Core State Store

The central state container. Each conversation (`Conversation` interface) tracks:

- Terminal lines (`TerminalLine[]`)
- Connection status, session ID, working directory
- Character state, processing flag
- Granted/pending tool permissions
- Pending user questions
- Scroll position, attachments, draft text
- Sound tracking (per-conversation, prevents replays on tab switch)
- Conversation summary (for compaction)

Tab names are randomly chosen from a curated list of whimsical names (Starfall, Moonbeam, Sakura, etc.).
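The name picker reduces to a one-liner. A sketch, where the sample list below is truncated to the three names mentioned above (the real curated list is longer):

```typescript
// Hypothetical sketch of the whimsical tab-name picker.
const TAB_NAMES = ["Starfall", "Moonbeam", "Sakura"];

export function randomTabName(names: string[] = TAB_NAMES): string {
  // Pick a uniformly random entry from the list.
  return names[Math.floor(Math.random() * names.length)];
}
```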
#### `src/lib/stores/claude.ts` — Backwards-Compat Facade

A thin wrapper that re-exports `conversationsStore` methods under the original `claudeStore` API. Maintains backwards compatibility whilst the codebase migrated to multi-conversation support.

#### `src/lib/stores/character.ts` — Character State Store

Manages the global character state displayed by `<AnimeGirl>`. Supports `setState()` (persistent) and `setTemporaryState(state, durationMs)` (auto-reverts to `idle` after a timeout — used for success/error flashes).
#### `src/lib/utils/stateMapper.ts` — Stream → State Mapping

Pure utility that maps Claude stream-JSON message types to `CharacterState` values. Tool categorisation mirrors the Rust side: search tools → `searching`, coding tools → `coding`, MCP tools → `mcp`, Task tool → `thinking`.
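A minimal sketch of that categorisation (illustrative; the real `stateMapper.ts` handles full message objects, and the fallback to `typing` for unknown tools is an assumption made here):

```typescript
// Sketch of the tool-name → character-state categorisation
// described above, mirroring the mapping on the Rust side.
export function mapToolToState(tool: string): string {
  if (tool.startsWith("mcp__")) return "mcp";
  if (["Read", "Glob", "Grep", "WebSearch", "WebFetch"].includes(tool)) {
    return "searching";
  }
  if (["Edit", "Write", "NotebookEdit"].includes(tool)) return "coding";
  if (tool === "Task") return "thinking";
  // Fallback for uncategorised tools (an assumption in this sketch).
  return "typing";
}
```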
#### `src/lib/components/`

Key components beyond the basics:

| Component | Purpose |
| --- | --- |
| `AnimeGirl.svelte` | Displays the character sprite, subscribes to `characterState` |
| `Terminal.svelte` | Renders the conversation message history |
| `InputBar.svelte` | User input with slash command menu, attachment support |
| `StatusBar.svelte` | Top bar: connection indicator, token/cost stats, controls |
| `ConversationTabs.svelte` | Multi-tab navigation with per-tab status indicators |
| `ConfigSidebar.svelte` | Settings panel (model, theme, notifications, budget, etc.) |
| `PermissionModal.svelte` | Handles tool permission grant/deny UI |
| `UserQuestionModal.svelte` | Renders `AskUserQuestion` prompts from Claude |
| `AgentMonitorPanel.svelte` | Live subagent tree with status badges |
| `GitPanel.svelte` | Git status, diff, stage/unstage, commit, push/pull |
| `editor/EditorPanel.svelte` | Full CodeMirror editor with file browser and tabs |
| `DiffViewer.svelte` | Syntax-highlighted diff display |
| `AchievementsPanel.svelte` | Achievement gallery |
| `CostSummary.svelte` | Cost breakdown by session/day/week/month |
| `MemoryBrowserPanel.svelte` | Browse Claude memory files |
| `McpManagementPanel.svelte` | MCP server configuration UI |
| `DebugConsole.svelte` | In-app log viewer (receives `debug:log` events) |
| `ThinkingBlock.svelte` | Collapsible extended thinking display |
| `ToolCallBlock.svelte` | Formatted tool use/result display |

---

## Data Flow

### User Sends a Message

```
User types → InputBar
  → invoke("send_prompt", { conversationId, message })
  → BridgeManager.send_prompt(conversation_id, message)
  → WslBridge.send_message() → writes JSON to Claude CLI stdin
```

### Claude Responds (NDJSON Stream)

```
Claude CLI stdout (NDJSON)
  → WslBridge reader thread (line-by-line)
  → serde_json::from_str::<ClaudeMessage>()
  → match message type:
      System(init)      → emit claude:connection(connected) + claude:cwd
      StreamEvent       → emit claude:state(thinking|typing|searching|coding|mcp)
      Assistant         → emit claude:output(assistant|tool|thinking lines)
      User(tool_result) → emit claude:output(tool result lines)
      Result(success)   → emit claude:state(success) + claude:output(result)
      Result(error)     → emit claude:state(error)
      RateLimitEvent    → emit claude:output(rate-limit line)
      PermissionRequest → emit claude:permission
```

### Frontend Reacts

```
tauri.ts event listeners
  → conversationsStore mutations
  → Svelte reactivity propagates to components
      → AnimeGirl.svelte: sprite changes to match characterState
      → Terminal.svelte: new line appended
      → StatusBar.svelte: token counts update
      → ConversationTabs.svelte: tab glow colour updates
```

### Permission Flow

```
Claude requests tool permission
  → WslBridge batches pending tool uses
  → emit claude:permission (one or more requests)
  → tauri.ts → claudeStore.requestPermissionForConversation()
  → PermissionModal.svelte renders
  → User clicks Allow/Deny
  → invoke("answer_question", { conversationId, toolUseId, granted })
  → WslBridge.send_tool_result() → writes result to Claude stdin
  → Claude CLI resumes
```
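The batching step at the top of this flow amounts to collecting the `tool_use` blocks of one assistant message into a single event. An illustrative TypeScript sketch with simplified block shapes (the real work happens in the Rust bridge):

```typescript
// Simplified sketch: gather all tool_use blocks from one assistant
// message into a single batched permission request, as the bridge
// does before emitting claude:permission.
interface ContentBlock {
  type: string;
  id?: string;
  name?: string;
}

export interface PermissionRequest {
  toolUseId: string;
  toolName: string;
}

export function batchPermissionRequests(
  blocks: ContentBlock[],
): PermissionRequest[] {
  return blocks
    .filter((b) => b.type === "tool_use" && b.id && b.name)
    .map((b) => ({ toolUseId: b.id as string, toolName: b.name as string }));
}
```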
---
|
||||||
|
|
||||||
|
## State Machine
|
||||||
|
|
||||||
|
The `CharacterState` enum drives both the sprite displayed and the panel glow colour:
|
||||||
|
|
||||||
|
| State | Trigger | Sprite | Panel Glow |
|
||||||
|
| ------------ | --------------------------------- | ----------------------- | ---------------------- |
|
||||||
|
| `idle` | Connected, no activity | Standing with clipboard | None |
|
||||||
|
| `thinking` | Thinking block / Task tool | Hand on chin | Purple/trans gradient |
|
||||||
|
| `typing` | Text content block | At keyboard | Blue/trans gradient |
|
||||||
|
| `searching` | Read/Glob/Grep/WebSearch/WebFetch | Magnifying glass | Yellow/trans gradient |
|
||||||
|
| `coding` | Edit/Write/NotebookEdit | At monitor | Green/trans gradient |
|
||||||
|
| `mcp` | Any `mcp__*` tool | Magical blue energy | Trans pride vibrant |
|
||||||
|
| `permission` | Permission requested | Confused shrug | — |
| `success` | Result: success | Celebrating | Emerald/trans gradient |
| `error` | Result: error | Worried | Red/trans gradient |

`success` and `error` are temporary states (3-second auto-revert to `idle`).

---

## Dependencies

### Frontend (key packages)

| Package | Purpose |
| ------------------------------ | -------------------------------------------------------------- |
| `@sveltejs/kit` `svelte` | SvelteKit framework + Svelte 5 |
| `@tauri-apps/api` | Core Tauri IPC (`invoke`, `listen`) |
| `@tauri-apps/plugin-*` | FS, clipboard, notifications, dialog, shell, store, os, opener |
| `tailwindcss` v4 | Utility-first CSS |
| `codemirror` + `@codemirror/*` | Code editor with 20+ language modes |
| `marked` | Markdown → HTML rendering |
| `highlight.js` | Syntax highlighting in markdown blocks |
| `lucide-svelte` | Icon library |

### Backend (key crates)

| Crate | Purpose |
| -------------------------------- | ---------------------------------------- |
| `tauri` v2 | Desktop app framework |
| `tokio` | Async runtime |
| `serde` / `serde_json` | JSON serialisation/deserialisation |
| `parking_lot` | Fast mutex (used for `BridgeManager`) |
| `uuid` | Unique ID generation |
| `discord-rich-presence` | Discord RPC integration |
| `chrono` | Date/time handling for cost tracking |
| `semver` | Version comparison for update checks |
| `tempfile` | Temporary file management |
| `tracing` + `tracing-subscriber` | Structured logging |
| `dirs` | Cross-platform home directory resolution |
| `windows` (Windows-only) | Native toast notifications |

### Dev / Tooling

| Tool | Purpose |
| -------------------------------- | ----------------------------------------- |
| `vitest` + `@vitest/coverage-v8` | Frontend unit tests with v8 coverage |
| `@testing-library/svelte` | Component testing utilities |
| `jsdom` | DOM environment for tests |
| `eslint` v9 (flat config) | Linting |
| `prettier` | Formatting |
| `svelte-check` | TypeScript type checking for Svelte files |
| `cargo test` + `cargo llvm-cov` | Rust unit tests and coverage |

---

## Development Notes

### Running the App

```bash
# Frontend dev server only
source ~/.nvm/nvm.sh && pnpm dev

# Full Tauri app (Rust + frontend)
source ~/.nvm/nvm.sh && pnpm tauri dev
```

### Running Tests

```bash
# All checks (lint → format → type-check → frontend tests → backend tests)
./check-all.sh

# Frontend tests only
source ~/.nvm/nvm.sh && pnpm test

# Frontend with coverage
source ~/.nvm/nvm.sh && pnpm test:coverage

# Backend tests only
pnpm test:backend
```

### Building

```bash
# Linux build
pnpm build:linux

# Windows cross-compile (requires cargo-xwin)
pnpm build:windows
```

### Adding a New Tauri Command

1. Add the handler function in the appropriate `src-tauri/src/*.rs` file with `#[tauri::command]`
2. Register it in `lib.rs` in the `tauri::generate_handler![]` list passed to `.invoke_handler()`
3. Call it from the frontend via `invoke("command_name", { args })` in `src/lib/tauri.ts` or a store
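The frontend half of step 3 can be sketched with an injectable `invoke`, so the wiring is visible without a running Tauri shell. In the app the real `invoke` comes from `@tauri-apps/api`; the command name `get_usage_stats` and the wrapper shape below are illustrative assumptions, not the app's actual API.

```typescript
// A typed invoke signature matching the shape of Tauri's invoke().
type Invoke = (cmd: string, args?: Record<string, unknown>) => Promise<unknown>;

// Factory that binds a command name to a typed frontend helper.
// Passing invoke in makes the helper testable outside the Tauri runtime.
function makeGetUsageStats(invoke: Invoke) {
  return (conversationId: string) =>
    invoke("get_usage_stats", { conversationId });
}
```

In the app you would call `makeGetUsageStats(invoke)` once and export the bound helper from `src/lib/tauri.ts`.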

### Adding a New Frontend Store

1. Create `src/lib/stores/my-store.ts` using `writable` or a factory function pattern
2. Create `src/lib/stores/my-store.test.ts` — all stores must have tests
3. Import and use the store in the appropriate component

### Claude Stream-JSON Protocol

Claude Code is invoked with `--output-format stream-json --verbose`. See `src-tauri/src/types.rs` for the complete message type definitions. The key field distinguishing subagent messages from top-level messages is `parent_tool_use_id` on `Assistant` messages.
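The distinction above can be sketched as a small classifier over one NDJSON line. The field names mirror the protocol as described here; the full message shape lives in `src-tauri/src/types.rs`, so treat this as illustrative.

```typescript
// Minimal subset of a stream-json message relevant to routing.
interface StreamMessage {
  type: string;                        // "assistant", "user", "system", "result", ...
  parent_tool_use_id?: string | null;  // set when the message came from a subagent
}

// Classify one line of NDJSON output.
function classify(line: string): "subagent" | "top-level" | "other" {
  const msg = JSON.parse(line) as StreamMessage;
  if (msg.type !== "assistant") return "other";
  return msg.parent_tool_use_id ? "subagent" : "top-level";
}
```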

### Multi-Conversation Architecture

Each tab (`Conversation`) in `conversationsStore` has a unique `conversation_id` string. The backend `BridgeManager` maps these IDs to `WslBridge` instances. All Tauri events carry `conversation_id` in their payload so the frontend can route them to the correct conversation without affecting others.
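The routing idea can be sketched as a single dispatcher keyed by `conversation_id`. The `ConversationState` shape and `handleEvent` name are illustrative, not the real store API.

```typescript
// Per-tab state, keyed by conversation_id.
interface ConversationState { messages: string[] }

const conversations = new Map<string, ConversationState>();

// One listener handles every event; the payload's conversation_id
// decides which tab's state is touched, leaving the others untouched.
function handleEvent(payload: { conversation_id: string; text: string }): void {
  const conv = conversations.get(payload.conversation_id) ?? { messages: [] };
  conv.messages.push(payload.text);
  conversations.set(payload.conversation_id, conv);
}
```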

### WSL Detection

`wsl_bridge.rs` detects WSL by checking `/proc/version` for "microsoft"/"wsl" strings, checking for `/proc/sys/fs/binfmt_misc/WSLInterop`, and checking `$WSL_DISTRO_NAME`. On native Windows builds, WSL detection always returns `false` (even if launched from a WSL terminal).

### Character State Sound Rules

Sound effects are managed in `src/lib/tauri.ts` per-conversation to prevent replays when switching tabs. The rules are:

- Entering `thinking` from a clean state (`idle`/`success`/`error`) → reset all sound flags
- Entering `coding` or `searching` (first time per task) → play task-start sound
- Entering `success` after ≥2 seconds in a long-running phase → play completion sound
- Entering `error` → play error sound (always)
- Entering `permission` → play permission sound (always)
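The rules above reduce to a small transition function. This is a simplified sketch, assuming a single task-start flag and a phase-entry timestamp; the real per-conversation logic in `src/lib/tauri.ts` may track more state.

```typescript
type CharState =
  | "idle" | "thinking" | "coding" | "searching"
  | "success" | "error" | "permission";

interface SoundFlags {
  taskStartPlayed: boolean; // has the task-start sound fired for this task?
  phaseEnteredAt: number;   // ms timestamp when the long-running phase began
}

// Returns the sound to play for this transition, or null for silence.
function onStateChange(
  prev: CharState,
  next: CharState,
  flags: SoundFlags,
  now: number,
): string | null {
  // Entering thinking from a clean state resets all sound flags.
  if (next === "thinking" && (prev === "idle" || prev === "success" || prev === "error")) {
    flags.taskStartPlayed = false;
    return null;
  }
  // First coding/searching phase per task plays the task-start sound.
  if ((next === "coding" || next === "searching") && !flags.taskStartPlayed) {
    flags.taskStartPlayed = true;
    flags.phaseEnteredAt = now;
    return "task-start";
  }
  // Success only chimes after >= 2 s in a long-running phase.
  if (next === "success" && now - flags.phaseEnteredAt >= 2000) return "completion";
  if (next === "error") return "error";
  if (next === "permission") return "permission";
  return null;
}
```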

### Workspace Trust Gate

On first connection to a new working directory, the app checks for Claude hooks and prompts the user to trust the workspace. Trusted workspaces are persisted in `HikariConfig.trusted_workspaces`.

### Configuration Storage

All settings are persisted via `tauri-plugin-store` to a JSON file in the app data directory. The frontend `configStore` (`src/lib/stores/config.ts`) loads configuration on startup and provides reactive derived stores. Changes invoke `save_config` to persist to disk.
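The load-then-save flow can be sketched against an injected backing store, making the plugin boundary explicit. The synchronous `BackingStore` is a stand-in (the real `tauri-plugin-store` API is async), and the `theme` field plus the `"config"` key are illustrative; only `trusted_workspaces` is named in this document.

```typescript
// Stand-in for the persisted store; the real plugin API is async.
interface BackingStore {
  get(key: string): unknown;
  set(key: string, value: unknown): void;
}

interface HikariConfig { theme: string; trusted_workspaces: string[] }

const DEFAULTS: HikariConfig = { theme: "dark", trusted_workspaces: [] };

// Merge saved values over defaults so new config keys get sane fallbacks.
function loadConfig(store: BackingStore): HikariConfig {
  const saved = store.get("config") as Partial<HikariConfig> | undefined;
  return { ...DEFAULTS, ...saved };
}

function saveConfig(store: BackingStore, config: HikariConfig): void {
  store.set("config", config);
}
```

The default-merge step is what lets the app add new settings without migrating existing config files.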
@@ -0,0 +1,45 @@

# Project Overview

## What is this project?

Hikari Desktop is a Tauri-based desktop application that wraps Claude Code with a visual anime character companion (Hikari) who appears on screen. It provides a rich UI for interacting with Claude Code, including conversation management, agent monitoring, cost tracking, and more.

The app was inspired by a Hatsune Miku mod for the ship AI in _The Outer Worlds_ — the idea of an AI assistant with an anime girl avatar that you can actually _see_.

## Goals

- Provide a beautiful, personalised interface for Claude Code
- Surface real-time status (thinking, typing, searching, etc.) through animated character sprites
- Track costs, context usage, and agent activity across sessions
- Support power-user workflows: multi-tab conversations, todo lists, git integration, MCP server management, session compaction, and more
- Build a foundation for autonomous task execution (agent orchestration, PRD-driven workflows)

## Tech Stack

- **Frontend**: Svelte 5 + TypeScript + Tailwind CSS
- **Backend**: Rust (Tauri v2)
- **Build**: Vite + pnpm
- **Testing**: Vitest (frontend) + cargo test (backend)
- **Linting**: ESLint + Prettier (frontend) + Clippy (backend)
- **IPC**: Tauri commands + events between Rust and Svelte

## Architecture

```
hikari-desktop/
├── src/                  # Svelte frontend
│   └── lib/
│       ├── components/   # UI components (panels, modals, status bar)
│       ├── stores/       # Svelte stores (state management)
│       ├── types/        # TypeScript type definitions
│       └── utils/        # Utility functions
├── src-tauri/            # Rust backend
│   └── src/
│       ├── commands.rs   # Tauri command handlers
│       ├── wsl_bridge.rs # Claude Code process management
│       ├── types.rs      # Shared types & CharacterState enum
│       └── stats.rs      # Cost tracking
└── public/               # Static assets (sprites, sounds)
```

Claude Code is launched as a child process via `WslBridge`, communicating via `--output-format stream-json` (NDJSON). Messages flow from the Rust backend to the Svelte frontend via Tauri events.
@@ -1 +1,29 @@

# hikari-desktop

Desktop companion application featuring Hikari.

## Live Version

The latest builds are available on our [releases page](https://git.nhcarrigan.com/nhcarrigan/hikari-desktop/releases).

## Feedback and Bugs

If you have feedback or a bug report, please [log a ticket on our forum](https://support.nhcarrigan.com).

## Contributing

If you would like to contribute to the project, you may create a Pull Request containing your proposed changes and we will review it as soon as we are able! Please review our [contributing guidelines](CONTRIBUTING.md) first.

## Code of Conduct

Before interacting with our community, please read our [Code of Conduct](CODE_OF_CONDUCT.md).

## License

This software is licensed under our [global software license](https://docs.nhcarrigan.com/#/license).

Copyright held by Naomi Carrigan.

## Contact

We may be contacted through our [Chat Server](http://chat.nhcarrigan.com) or via email at `contact@nhcarrigan.com`.
@@ -1,5 +1,9 @@
 #!/bin/bash
+
+# Source nvm to get access to pnpm
+export NVM_DIR="$HOME/.nvm"
+[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh"
+
 # Colors for output
 RED='\033[0;31m'
 GREEN='\033[0;32m'
@@ -32,11 +36,11 @@ echo -e "${YELLOW}🔍 Running all checks for Hikari Desktop...${NC}"
 run_check "Frontend lint" "pnpm lint" || failed=1
 run_check "Frontend format check" "pnpm format:check" || failed=1
 run_check "Frontend type check" "pnpm check" || failed=1
-run_check "Frontend tests" "pnpm test" || failed=1
+run_check "Frontend tests with coverage" "pnpm test:coverage" || failed=1

 # Backend checks
-run_check "Backend clippy (strict)" "cd src-tauri && cargo clippy --all-targets --all-features -- -D warnings" || failed=1
+run_check "Backend clippy (strict)" "(cd src-tauri && cargo clippy --all-targets --all-features -- -D warnings)" || failed=1
-run_check "Backend tests" "cargo test" || failed=1
+run_check "Backend tests with coverage" "(cd src-tauri && cargo llvm-cov --fail-under-lines 50)" || failed=1

 # Summary
 echo -e "\n${YELLOW}========================================${NC}"
@@ -27,6 +27,6 @@ export default tseslint.config(
 },
 },
 {
-ignores: ["build/", ".svelte-kit/", "dist/", "src-tauri/target/", "node_modules/"],
+ignores: ["build/", ".svelte-kit/", "dist/", "src-tauri/target/", "node_modules/", "coverage/"],
 }
 );
@@ -1,6 +1,6 @@
 {
 "name": "hikari-desktop",
-"version": "0.2.0",
+"version": "1.14.0",
 "description": "",
 "type": "module",
 "scripts": {
@@ -16,6 +16,10 @@
 "test": "vitest run",
 "test:watch": "vitest",
 "test:coverage": "vitest run --coverage",
+"test:backend": "cd src-tauri && cargo test",
+"test:backend:coverage": "cd src-tauri && cargo llvm-cov --text",
+"test:all": "pnpm test && pnpm test:backend",
+"coverage:all": "pnpm test:coverage && pnpm test:backend:coverage",
 "lint": "eslint .",
 "lint:fix": "eslint . --fix",
 "format": "prettier --write .",
@@ -23,36 +27,69 @@
 },
 "license": "MIT",
 "dependencies": {
-"@tauri-apps/api": "^2",
-"@tauri-apps/plugin-dialog": "^2",
-"@tauri-apps/plugin-opener": "^2",
-"@tauri-apps/plugin-shell": "^2.3.4",
-"@tauri-apps/plugin-store": "^2",
-"@tauri-apps/plugin-notification": "^2",
-"@tauri-apps/plugin-os": "^2"
+"@codemirror/commands": "6.10.2",
+"@codemirror/lang-angular": "0.1.4",
+"@codemirror/lang-cpp": "6.0.3",
+"@codemirror/lang-css": "6.3.1",
+"@codemirror/lang-go": "6.0.1",
+"@codemirror/lang-html": "6.4.11",
+"@codemirror/lang-java": "6.0.2",
+"@codemirror/lang-javascript": "6.2.4",
+"@codemirror/lang-json": "6.0.2",
+"@codemirror/lang-less": "6.0.2",
+"@codemirror/lang-markdown": "6.5.0",
+"@codemirror/lang-php": "6.0.2",
+"@codemirror/lang-python": "6.2.1",
+"@codemirror/lang-rust": "6.0.2",
+"@codemirror/lang-sass": "6.0.2",
+"@codemirror/lang-sql": "6.10.0",
+"@codemirror/lang-vue": "0.1.3",
+"@codemirror/lang-wast": "6.0.2",
+"@codemirror/lang-xml": "6.1.0",
+"@codemirror/lang-yaml": "6.1.2",
+"@codemirror/language": "6.12.2",
+"@codemirror/legacy-modes": "6.5.2",
+"@codemirror/state": "6.5.4",
+"@codemirror/theme-one-dark": "6.1.3",
+"@codemirror/view": "6.39.15",
+"@lezer/highlight": "1.2.3",
+"@tauri-apps/api": "2.10.1",
+"@tauri-apps/plugin-clipboard-manager": "2.3.2",
+"@tauri-apps/plugin-dialog": "2.6.0",
+"@tauri-apps/plugin-fs": "2.4.5",
+"@tauri-apps/plugin-notification": "2.3.3",
+"@tauri-apps/plugin-opener": "2.5.3",
+"@tauri-apps/plugin-os": "2.3.2",
+"@tauri-apps/plugin-shell": "2.3.5",
+"@tauri-apps/plugin-store": "2.4.2",
+"codemirror": "6.0.2",
+"highlight.js": "11.11.1",
+"lucide-svelte": "0.575.0",
+"marked": "17.0.3"
 },
 "devDependencies": {
-"@eslint/js": "^9.39.2",
+"@eslint/js": "9.39.3",
-"@sveltejs/adapter-static": "^3.0.6",
+"@sveltejs/adapter-static": "3.0.10",
-"@sveltejs/kit": "^2.9.0",
+"@sveltejs/kit": "2.53.2",
-"@sveltejs/vite-plugin-svelte": "^5.0.0",
+"@sveltejs/vite-plugin-svelte": "5.1.1",
-"@tailwindcss/vite": "^4.1.18",
+"@tailwindcss/vite": "4.2.1",
-"@tauri-apps/cli": "^2",
+"@tauri-apps/cli": "2.10.0",
-"@testing-library/jest-dom": "^6.9.1",
+"@testing-library/jest-dom": "6.9.1",
-"@testing-library/svelte": "^5.3.1",
+"@testing-library/svelte": "5.3.1",
+"@vitest/coverage-v8": "4.0.18",
-"eslint": "^9.39.2",
+"eslint": "9.39.3",
-"eslint-config-prettier": "^10.1.8",
+"eslint-config-prettier": "10.1.8",
-"eslint-plugin-svelte": "^3.14.0",
+"eslint-plugin-svelte": "3.15.0",
-"globals": "^17.0.0",
+"globals": "17.3.0",
-"jsdom": "^27.4.0",
+"jsdom": "28.1.0",
-"prettier": "^3.8.0",
+"prettier": "3.8.1",
-"prettier-plugin-svelte": "^3.4.1",
+"prettier-plugin-svelte": "3.5.0",
-"svelte": "^5.0.0",
+"svelte": "5.53.5",
-"svelte-check": "^4.0.0",
+"svelte-check": "4.4.3",
-"tailwindcss": "^4.1.18",
+"tailwindcss": "4.2.1",
-"typescript": "~5.6.2",
+"typescript": "5.9.3",
-"typescript-eslint": "^8.53.0",
+"typescript-eslint": "8.56.1",
-"vite": "^6.0.3",
+"vite": "6.4.1",
-"vitest": "^4.0.17"
+"vitest": "4.0.18"
 }
 }
@@ -1,6 +1,6 @@
 [package]
 name = "hikari-desktop"
-version = "0.2.0"
+version = "1.14.0"
 description = "Hikari - Claude Code Visual Assistant"
 authors = ["Naomi Carrigan"]
 edition = "2021"
@@ -13,7 +13,7 @@ crate-type = ["staticlib", "cdylib", "rlib"]
 tauri-build = { version = "2", features = [] }

 [dependencies]
-tauri = { version = "2", features = [] }
+tauri = { version = "2", features = ["tray-icon", "image-png"] }
 tauri-plugin-dialog = "2"
 tauri-plugin-opener = "2"
 tauri-plugin-shell = "2"
@@ -25,8 +25,16 @@ uuid = { version = "1", features = ["v4"] }
 tauri-plugin-store = "2.4.2"
 tauri-plugin-notification = "2"
 tauri-plugin-os = "2"
+tauri-plugin-http = "2"
+tauri-plugin-clipboard-manager = "2"
+tauri-plugin-fs = "2"
 tempfile = "3"
+semver = "1"
 chrono = { version = "0.4.43", features = ["serde"] }
+discord-rich-presence = "0.2"
+dirs = "5"
+tracing = "0.1"
+tracing-subscriber = { version = "0.3", features = ["env-filter", "fmt"] }

 [target.'cfg(windows)'.dependencies]
 windows = { version = "0.62", features = [
@@ -13,6 +13,32 @@
 "notification:default",
 "notification:allow-is-permission-granted",
 "notification:allow-request-permission",
-"notification:allow-notify"
+"notification:allow-notify",
+"clipboard-manager:default",
+"clipboard-manager:allow-read-image",
+"core:tray:default",
+"fs:default",
+"fs:allow-read-text-file",
+"fs:allow-write-text-file",
+{
+  "identifier": "fs:allow-read-file",
+  "allow": [{ "path": "**" }]
+},
+{
+  "identifier": "fs:allow-write-file",
+  "allow": [{ "path": "**" }]
+},
+{
+  "identifier": "fs:scope",
+  "allow": [{ "path": "$HOME/.claude/**" }]
+},
+{
+  "identifier": "fs:allow-read-text-file",
+  "allow": [{ "path": "$HOME/.claude/**" }]
+},
+"core:window:allow-set-size",
+"core:window:allow-set-always-on-top",
+"core:window:allow-inner-size",
+"core:window:allow-hide"
 ]
 }
(Binary image assets — sprites and icons — were replaced with larger versions; file sizes grew, e.g. 9.2 KiB → 36 KiB, 123 KiB → 1.7 MiB.)
@@ -3,6 +3,7 @@ use std::collections::HashMap;
 use std::sync::Arc;
 use tauri::AppHandle;
+
+use crate::commands::record_session;
 use crate::config::ClaudeStartOptions;
 use crate::stats::UsageStats;
 use crate::wsl_bridge::WslBridge;
@@ -29,30 +30,45 @@ impl BridgeManager {
 conversation_id: &str,
 options: ClaudeStartOptions,
 ) -> Result<(), String> {
-// Check if a bridge already exists for this conversation
-if self.bridges.get(conversation_id).map(|b| b.is_running()).unwrap_or(false) {
+// Check if a bridge already exists and is running for this conversation
+if self
+    .bridges
+    .get(conversation_id)
+    .map(|b| b.is_running())
+    .unwrap_or(false)
+{
 return Err("Claude is already running for this conversation".to_string());
 }

-let app = self.app_handle.as_ref()
+let app = self
+    .app_handle
+    .as_ref()
 .ok_or_else(|| "App handle not set".to_string())?
 .clone();

-// Create a new bridge for this conversation
-let mut bridge = WslBridge::new_with_conversation_id(conversation_id.to_string());
+// Reuse existing bridge if it exists (preserves stats across reconnects)
+// Only create a new bridge if one doesn't exist for this conversation
+let bridge = self
+    .bridges
+    .entry(conversation_id.to_string())
+    .or_insert_with(|| WslBridge::new_with_conversation_id(conversation_id.to_string()));

 // Start the Claude process
-bridge.start(app, options)?;
+bridge.start(app.clone(), options)?;

-// Store the bridge
-self.bridges.insert(conversation_id.to_string(), bridge);
+// Record session start for cost tracking
+tauri::async_runtime::spawn(async move {
+    record_session(&app).await;
+});

 Ok(())
 }

 pub fn stop_claude(&mut self, conversation_id: &str) -> Result<(), String> {
 if let Some(bridge) = self.bridges.get_mut(conversation_id) {
-let app = self.app_handle.as_ref()
+let app = self
+    .app_handle
+    .as_ref()
 .ok_or_else(|| "App handle not set".to_string())?;
 bridge.stop(app);
 Ok(())
@@ -63,7 +79,9 @@ impl BridgeManager {

 pub fn interrupt_claude(&mut self, conversation_id: &str) -> Result<(), String> {
 if let Some(bridge) = self.bridges.get_mut(conversation_id) {
-let app = self.app_handle.as_ref()
+let app = self
+    .app_handle
+    .as_ref()
 .ok_or_else(|| "App handle not set".to_string())?;
 bridge.interrupt(app)
 } else {
@@ -79,20 +97,36 @@ impl BridgeManager {
 }
 }

+pub fn send_tool_result(
+    &mut self,
+    conversation_id: &str,
+    tool_use_id: &str,
+    result: serde_json::Value,
+) -> Result<(), String> {
+    if let Some(bridge) = self.bridges.get_mut(conversation_id) {
+        bridge.send_tool_result(tool_use_id, result)
+    } else {
+        Err("No Claude instance found for this conversation".to_string())
+    }
+}
+
 pub fn is_claude_running(&self, conversation_id: &str) -> bool {
-self.bridges.get(conversation_id)
+self.bridges
+    .get(conversation_id)
 .map(|b| b.is_running())
 .unwrap_or(false)
 }

 pub fn get_working_directory(&self, conversation_id: &str) -> Result<String, String> {
-self.bridges.get(conversation_id)
+self.bridges
+    .get(conversation_id)
 .map(|b| b.get_working_directory().to_string())
 .ok_or_else(|| "No Claude instance found for this conversation".to_string())
 }

 pub fn get_usage_stats(&self, conversation_id: &str) -> Result<UsageStats, String> {
-self.bridges.get(conversation_id)
+self.bridges
+    .get(conversation_id)
 .map(|b| b.get_stats())
 .ok_or_else(|| "No Claude instance found for this conversation".to_string())
 }
@@ -115,8 +149,14 @@ impl BridgeManager {

 #[allow(dead_code)]
 pub fn get_active_conversations(&self) -> Vec<String> {
-self.bridges.keys()
-.filter(|id| self.bridges.get(*id).map(|b| b.is_running()).unwrap_or(false))
+self.bridges
+    .keys()
+    .filter(|id| {
+        self.bridges
+            .get(*id)
+            .map(|b| b.is_running())
+            .unwrap_or(false)
+    })
 .cloned()
 .collect()
 }
@@ -132,4 +172,128 @@ pub type SharedBridgeManager = Arc<Mutex<BridgeManager>>;

 pub fn create_shared_bridge_manager() -> SharedBridgeManager {
 Arc::new(Mutex::new(BridgeManager::new()))
 }
+
+#[cfg(test)]
+mod tests {
+    use super::*;
+
+    #[test]
+    fn test_bridge_manager_new() {
+        let manager = BridgeManager::new();
+        assert!(manager.app_handle.is_none());
+        assert!(manager.bridges.is_empty());
+    }
+
+    #[test]
+    fn test_bridge_manager_default() {
+        let manager = BridgeManager::default();
+        assert!(manager.app_handle.is_none());
+        assert!(manager.bridges.is_empty());
+    }
+
+    #[test]
+    fn test_is_claude_running_no_bridge() {
+        let manager = BridgeManager::new();
+        assert!(!manager.is_claude_running("nonexistent"));
+    }
+
+    #[test]
+    fn test_get_working_directory_no_bridge() {
+        let manager = BridgeManager::new();
+        let result = manager.get_working_directory("nonexistent");
+        assert!(result.is_err());
+        assert_eq!(
+            result.unwrap_err(),
+            "No Claude instance found for this conversation"
+        );
+    }
+
+    #[test]
+    fn test_get_usage_stats_no_bridge() {
+        let manager = BridgeManager::new();
+        let result = manager.get_usage_stats("nonexistent");
+        assert!(result.is_err());
+        assert_eq!(
+            result.unwrap_err(),
+            "No Claude instance found for this conversation"
+        );
+    }
+
+    #[test]
+    fn test_stop_claude_no_bridge() {
+        let mut manager = BridgeManager::new();
+        let result = manager.stop_claude("nonexistent");
+        assert!(result.is_err());
+        assert_eq!(
+            result.unwrap_err(),
+            "No Claude instance found for this conversation"
+        );
+    }
+
+    #[test]
+    fn test_interrupt_claude_no_bridge() {
+        let mut manager = BridgeManager::new();
+        let result = manager.interrupt_claude("nonexistent");
+        assert!(result.is_err());
+        assert_eq!(
+            result.unwrap_err(),
+            "No Claude instance found for this conversation"
+        );
+    }
+
+    #[test]
+    fn test_send_prompt_no_bridge() {
+        let mut manager = BridgeManager::new();
+        let result = manager.send_prompt("nonexistent", "Hello".to_string());
+        assert!(result.is_err());
+        assert_eq!(
+            result.unwrap_err(),
+            "No Claude instance found for this conversation"
+        );
+    }
+
+    #[test]
+    fn test_send_tool_result_no_bridge() {
+        let mut manager = BridgeManager::new();
+        let result = manager.send_tool_result(
+            "nonexistent",
+            "tool_id",
+            serde_json::json!({"result": "success"}),
+        );
+        assert!(result.is_err());
+        assert_eq!(
+            result.unwrap_err(),
+            "No Claude instance found for this conversation"
+        );
+    }
+
+    #[test]
+    fn test_create_shared_bridge_manager() {
+        let shared = create_shared_bridge_manager();
+        let manager = shared.lock();
+        assert!(manager.bridges.is_empty());
+        assert!(manager.app_handle.is_none());
+    }
+
+    #[test]
+    fn test_cleanup_stopped_bridges_empty() {
+        let mut manager = BridgeManager::new();
+        manager.cleanup_stopped_bridges();
+        assert!(manager.bridges.is_empty());
+    }
+
+    #[test]
+    fn test_get_active_conversations_empty() {
+        let manager = BridgeManager::new();
+        let active = manager.get_active_conversations();
+        assert!(active.is_empty());
+    }
+
+    #[test]
+    fn test_stop_all_without_app_handle() {
+        let mut manager = BridgeManager::new();
+        manager.stop_all(); // Should not panic
+        assert!(manager.bridges.is_empty());
+    }
+}
@@ -0,0 +1,724 @@
// Clipboard history module for tracking and managing copied code snippets
// Implements issue #25 - Clipboard History feature

use serde::{Deserialize, Serialize};
use std::sync::Mutex;
use tauri_plugin_store::StoreExt;
use uuid::Uuid;

const STORE_FILE: &str = "hikari-clipboard.json";
const HISTORY_KEY: &str = "clipboard_history";
const MAX_HISTORY_SIZE: usize = 100;

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ClipboardEntry {
    pub id: String,
    pub content: String,
    pub language: Option<String>,
    pub source: Option<String>,
    pub timestamp: String,
    pub is_pinned: bool,
}

impl ClipboardEntry {
    pub fn new(content: String, language: Option<String>, source: Option<String>) -> Self {
        Self {
            id: Uuid::new_v4().to_string(),
            content,
            language,
            source,
            timestamp: chrono::Utc::now().to_rfc3339(),
            is_pinned: false,
        }
    }
}

#[derive(Debug, Clone, Serialize, Deserialize, Default)]
struct ClipboardHistory {
    entries: Vec<ClipboardEntry>,
}

// Track last clipboard content to avoid duplicates
#[derive(Default)]
struct ClipboardState {
    last_content: Option<String>,
}

static CLIPBOARD_STATE: Mutex<ClipboardState> = Mutex::new(ClipboardState { last_content: None });

fn load_history(app: &tauri::AppHandle) -> ClipboardHistory {
    let store = app.store(STORE_FILE).ok();
    store
        .and_then(|s| s.get(HISTORY_KEY))
        .and_then(|v| serde_json::from_value(v.clone()).ok())
        .unwrap_or_default()
}

fn save_history(app: &tauri::AppHandle, history: &ClipboardHistory) -> Result<(), String> {
    let store = app.store(STORE_FILE).map_err(|e| e.to_string())?;
    store.set(
        HISTORY_KEY,
        serde_json::to_value(history).map_err(|e| e.to_string())?,
    );
    store.save().map_err(|e| e.to_string())?;
    Ok(())
}

/// List all clipboard entries, optionally filtered by language
#[tauri::command]
pub fn list_clipboard_entries(
    app: tauri::AppHandle,
    language: Option<String>,
) -> Result<Vec<ClipboardEntry>, String> {
    let history = load_history(&app);
    let entries = if let Some(lang) = language {
        history
            .entries
            .into_iter()
            .filter(|e| e.language.as_ref() == Some(&lang))
            .collect()
    } else {
        history.entries
    };
    Ok(entries)
}

/// Capture current clipboard content and add to history
#[tauri::command]
pub fn capture_clipboard(
    app: tauri::AppHandle,
    content: String,
    language: Option<String>,
    source: Option<String>,
) -> Result<ClipboardEntry, String> {
    // Check for duplicate (same content as last capture)
    {
        let mut state = CLIPBOARD_STATE.lock().map_err(|e| e.to_string())?;
        if state.last_content.as_ref() == Some(&content) {
            // Return existing entry if content is the same
            let history = load_history(&app);
            if let Some(entry) = history.entries.first() {
                if entry.content == content {
                    return Ok(entry.clone());
                }
            }
        }
        state.last_content = Some(content.clone());
    }

    let entry = ClipboardEntry::new(content, language, source);
    let mut history = load_history(&app);

    // Add to front of history
    history.entries.insert(0, entry.clone());

    // Enforce max size (keep pinned entries)
    let mut pinned: Vec<ClipboardEntry> = history
        .entries
        .iter()
        .filter(|e| e.is_pinned)
        .cloned()
        .collect();
    let mut unpinned: Vec<ClipboardEntry> = history
        .entries
        .into_iter()
        .filter(|e| !e.is_pinned)
        .collect();

    // Trim unpinned entries if over max size
    if unpinned.len() + pinned.len() > MAX_HISTORY_SIZE {
        let max_unpinned = MAX_HISTORY_SIZE.saturating_sub(pinned.len());
        unpinned.truncate(max_unpinned);
    }

    // Merge back, pinned first then unpinned
    pinned.extend(unpinned);
    history.entries = pinned;

    // Sort by timestamp descending (newest first), pinned entries stay at top
    history.entries.sort_by(|a, b| {
        if a.is_pinned && !b.is_pinned {
            std::cmp::Ordering::Less
        } else if !a.is_pinned && b.is_pinned {
            std::cmp::Ordering::Greater
        } else {
            b.timestamp.cmp(&a.timestamp)
        }
    });

    save_history(&app, &history)?;
    Ok(entry)
}

/// Delete a clipboard entry by ID
#[tauri::command]
pub fn delete_clipboard_entry(app: tauri::AppHandle, id: String) -> Result<(), String> {
    let mut history = load_history(&app);
    history.entries.retain(|e| e.id != id);
    save_history(&app, &history)?;
    Ok(())
}

/// Toggle pin status of an entry
#[tauri::command]
pub fn toggle_pin_clipboard_entry(
    app: tauri::AppHandle,
    id: String,
) -> Result<ClipboardEntry, String> {
    let mut history = load_history(&app);
    let entry = history
        .entries
        .iter_mut()
        .find(|e| e.id == id)
        .ok_or("Entry not found")?;

    entry.is_pinned = !entry.is_pinned;
    let updated_entry = entry.clone();

    // Re-sort to move pinned entries to top
    history.entries.sort_by(|a, b| {
        if a.is_pinned && !b.is_pinned {
            std::cmp::Ordering::Less
        } else if !a.is_pinned && b.is_pinned {
            std::cmp::Ordering::Greater
        } else {
            b.timestamp.cmp(&a.timestamp)
        }
    });

    save_history(&app, &history)?;
    Ok(updated_entry)
}

/// Clear all non-pinned entries
#[tauri::command]
pub fn clear_clipboard_history(app: tauri::AppHandle) -> Result<(), String> {
    let mut history = load_history(&app);
    history.entries.retain(|e| e.is_pinned);
    save_history(&app, &history)?;
    Ok(())
}

/// Search clipboard entries by content
#[tauri::command]
pub fn search_clipboard_entries(
    app: tauri::AppHandle,
    query: String,
) -> Result<Vec<ClipboardEntry>, String> {
    let history = load_history(&app);
    let query_lower = query.to_lowercase();
    let entries = history
        .entries
        .into_iter()
        .filter(|e| {
            e.content.to_lowercase().contains(&query_lower)
                || e.language
                    .as_ref()
                    .is_some_and(|l| l.to_lowercase().contains(&query_lower))
                || e.source
                    .as_ref()
                    .is_some_and(|s| s.to_lowercase().contains(&query_lower))
        })
        .collect();
    Ok(entries)
}

/// Get all unique languages from history
#[tauri::command]
pub fn get_clipboard_languages(app: tauri::AppHandle) -> Result<Vec<String>, String> {
    let history = load_history(&app);
    let mut languages: Vec<String> = history
        .entries
        .iter()
        .filter_map(|e| e.language.clone())
        .collect();
    languages.sort();
    languages.dedup();
    Ok(languages)
}

/// Update the language of an entry
#[tauri::command]
pub fn update_clipboard_language(
    app: tauri::AppHandle,
    id: String,
    language: Option<String>,
) -> Result<ClipboardEntry, String> {
    let mut history = load_history(&app);
    let entry = history
        .entries
        .iter_mut()
        .find(|e| e.id == id)
        .ok_or("Entry not found")?;

    entry.language = language;
    let updated_entry = entry.clone();

    save_history(&app, &history)?;
    Ok(updated_entry)
}

#[cfg(test)]
mod tests {
    use super::*;

    // ==================== ClipboardEntry tests ====================

    #[test]
    fn test_clipboard_entry_new() {
        let entry = ClipboardEntry::new(
            "let x = 42;".to_string(),
            Some("rust".to_string()),
            Some("main.rs".to_string()),
        );

        assert_eq!(entry.content, "let x = 42;");
        assert_eq!(entry.language, Some("rust".to_string()));
        assert_eq!(entry.source, Some("main.rs".to_string()));
        assert!(!entry.is_pinned);
        assert!(!entry.id.is_empty());
        assert!(!entry.timestamp.is_empty());
    }

    #[test]
    fn test_clipboard_entry_new_without_optional_fields() {
        let entry = ClipboardEntry::new("some content".to_string(), None, None);

        assert_eq!(entry.content, "some content");
        assert!(entry.language.is_none());
        assert!(entry.source.is_none());
        assert!(!entry.is_pinned);
    }

    #[test]
    fn test_clipboard_entry_unique_ids() {
        let entry1 = ClipboardEntry::new("content1".to_string(), None, None);
        let entry2 = ClipboardEntry::new("content2".to_string(), None, None);

        assert_ne!(entry1.id, entry2.id);
    }

    #[test]
    fn test_clipboard_entry_serialization() {
        let entry = ClipboardEntry::new(
            "fn main() {}".to_string(),
            Some("rust".to_string()),
            Some("lib.rs".to_string()),
        );

        let json = serde_json::to_string(&entry).unwrap();
        assert!(json.contains("fn main() {}"));
        assert!(json.contains("rust"));
        assert!(json.contains("lib.rs"));
        assert!(json.contains("is_pinned"));

        let deserialized: ClipboardEntry = serde_json::from_str(&json).unwrap();
        assert_eq!(deserialized.content, entry.content);
        assert_eq!(deserialized.language, entry.language);
        assert_eq!(deserialized.source, entry.source);
        assert_eq!(deserialized.id, entry.id);
    }

    #[test]
    fn test_clipboard_entry_clone() {
        let entry = ClipboardEntry::new(
            "original".to_string(),
            Some("python".to_string()),
            None,
        );

        let cloned = entry.clone();
        assert_eq!(cloned.content, entry.content);
        assert_eq!(cloned.id, entry.id);
        assert_eq!(cloned.language, entry.language);
    }

    #[test]
    fn test_clipboard_entry_timestamp_is_rfc3339() {
        let entry = ClipboardEntry::new("test".to_string(), None, None);

        // RFC3339 timestamp should parse successfully
        let parsed = chrono::DateTime::parse_from_rfc3339(&entry.timestamp);
        assert!(parsed.is_ok());
    }

    // ==================== ClipboardHistory tests ====================

    #[test]
    fn test_clipboard_history_default() {
        let history = ClipboardHistory::default();
        assert!(history.entries.is_empty());
    }

    #[test]
    fn test_clipboard_history_serialization() {
        let mut history = ClipboardHistory::default();
        history.entries.push(ClipboardEntry::new(
            "entry1".to_string(),
            Some("js".to_string()),
            None,
        ));
        history.entries.push(ClipboardEntry::new(
            "entry2".to_string(),
            None,
            Some("file.txt".to_string()),
        ));

        let json = serde_json::to_string(&history).unwrap();
        assert!(json.contains("entry1"));
        assert!(json.contains("entry2"));
        assert!(json.contains("js"));
        assert!(json.contains("file.txt"));

        let deserialized: ClipboardHistory = serde_json::from_str(&json).unwrap();
        assert_eq!(deserialized.entries.len(), 2);
    }

    #[test]
    fn test_clipboard_history_entries_order() {
        let mut history = ClipboardHistory::default();

        history.entries.push(ClipboardEntry::new("first".to_string(), None, None));
        history.entries.push(ClipboardEntry::new("second".to_string(), None, None));
        history.entries.push(ClipboardEntry::new("third".to_string(), None, None));

        assert_eq!(history.entries[0].content, "first");
        assert_eq!(history.entries[1].content, "second");
        assert_eq!(history.entries[2].content, "third");
    }

    // ==================== ClipboardState tests ====================

    #[test]
    fn test_clipboard_state_default() {
        let state = ClipboardState::default();
        assert!(state.last_content.is_none());
    }

    #[test]
    fn test_clipboard_state_with_content() {
        let state = ClipboardState {
            last_content: Some("cached content".to_string()),
        };
        assert_eq!(state.last_content, Some("cached content".to_string()));
    }

    // ==================== MAX_HISTORY_SIZE constant test ====================

    #[test]
    fn test_max_history_size_is_reasonable() {
        assert_eq!(MAX_HISTORY_SIZE, 100);
        // Compile-time assertions for constant bounds
        const _: () = assert!(MAX_HISTORY_SIZE > 0);
        const _: () = assert!(MAX_HISTORY_SIZE <= 1000); // Sanity check
    }

    // ==================== Pinned entry sorting tests ====================

    #[test]
    #[allow(clippy::useless_vec)]
    fn test_pinned_entries_sorting() {
        let mut entries = vec![
            ClipboardEntry {
                id: "1".to_string(),
                content: "unpinned older".to_string(),
                language: None,
                source: None,
                timestamp: "2024-01-01T00:00:00Z".to_string(),
                is_pinned: false,
            },
            ClipboardEntry {
                id: "2".to_string(),
                content: "pinned".to_string(),
                language: None,
                source: None,
                timestamp: "2024-01-02T00:00:00Z".to_string(),
                is_pinned: true,
            },
            ClipboardEntry {
                id: "3".to_string(),
                content: "unpinned newer".to_string(),
                language: None,
                source: None,
                timestamp: "2024-01-03T00:00:00Z".to_string(),
                is_pinned: false,
            },
        ];

        // Apply the same sorting logic as used in the module
        entries.sort_by(|a, b| {
            if a.is_pinned && !b.is_pinned {
                std::cmp::Ordering::Less
            } else if !a.is_pinned && b.is_pinned {
                std::cmp::Ordering::Greater
            } else {
                b.timestamp.cmp(&a.timestamp)
            }
        });

        // Pinned should be first
        assert!(entries[0].is_pinned);
        assert_eq!(entries[0].id, "2");

        // Then unpinned sorted by timestamp descending (newest first)
        assert_eq!(entries[1].id, "3"); // newer unpinned
        assert_eq!(entries[2].id, "1"); // older unpinned
    }

    #[test]
    #[allow(clippy::useless_vec)]
    fn test_multiple_pinned_entries_sorting() {
        let mut entries = vec![
            ClipboardEntry {
                id: "1".to_string(),
                content: "pinned older".to_string(),
                language: None,
                source: None,
                timestamp: "2024-01-01T00:00:00Z".to_string(),
                is_pinned: true,
            },
            ClipboardEntry {
                id: "2".to_string(),
                content: "unpinned".to_string(),
                language: None,
                source: None,
                timestamp: "2024-01-02T00:00:00Z".to_string(),
                is_pinned: false,
            },
            ClipboardEntry {
                id: "3".to_string(),
                content: "pinned newer".to_string(),
                language: None,
                source: None,
                timestamp: "2024-01-03T00:00:00Z".to_string(),
                is_pinned: true,
            },
        ];

        entries.sort_by(|a, b| {
            if a.is_pinned && !b.is_pinned {
                std::cmp::Ordering::Less
            } else if !a.is_pinned && b.is_pinned {
                std::cmp::Ordering::Greater
            } else {
                b.timestamp.cmp(&a.timestamp)
            }
        });

        // Both pinned first, sorted by timestamp
        assert!(entries[0].is_pinned);
        assert_eq!(entries[0].id, "3"); // pinned newer
        assert!(entries[1].is_pinned);
        assert_eq!(entries[1].id, "1"); // pinned older
        // Then unpinned
        assert!(!entries[2].is_pinned);
        assert_eq!(entries[2].id, "2");
    }

    // ==================== Entry filtering tests ====================

    #[test]
    fn test_filter_entries_by_language() {
        let history = ClipboardHistory {
            entries: vec![
                ClipboardEntry {
                    id: "1".to_string(),
                    content: "rust code".to_string(),
                    language: Some("rust".to_string()),
                    source: None,
                    timestamp: "2024-01-01T00:00:00Z".to_string(),
                    is_pinned: false,
                },
                ClipboardEntry {
                    id: "2".to_string(),
                    content: "js code".to_string(),
                    language: Some("javascript".to_string()),
                    source: None,
                    timestamp: "2024-01-02T00:00:00Z".to_string(),
                    is_pinned: false,
                },
                ClipboardEntry {
                    id: "3".to_string(),
                    content: "more rust".to_string(),
                    language: Some("rust".to_string()),
                    source: None,
                    timestamp: "2024-01-03T00:00:00Z".to_string(),
                    is_pinned: false,
                },
            ],
        };

        let filtered: Vec<_> = history
            .entries
            .iter()
            .filter(|e| e.language.as_ref() == Some(&"rust".to_string()))
            .collect();

        assert_eq!(filtered.len(), 2);
        assert!(filtered.iter().all(|e| e.language == Some("rust".to_string())));
    }

    #[test]
    fn test_search_entries_by_content() {
        let history = ClipboardHistory {
            entries: vec![
                ClipboardEntry {
                    id: "1".to_string(),
                    content: "fn hello_world()".to_string(),
                    language: Some("rust".to_string()),
                    source: None,
                    timestamp: "2024-01-01T00:00:00Z".to_string(),
                    is_pinned: false,
                },
                ClipboardEntry {
                    id: "2".to_string(),
                    content: "function hello()".to_string(),
                    language: Some("javascript".to_string()),
                    source: None,
                    timestamp: "2024-01-02T00:00:00Z".to_string(),
                    is_pinned: false,
                },
                ClipboardEntry {
                    id: "3".to_string(),
                    content: "def goodbye()".to_string(),
                    language: Some("python".to_string()),
                    source: None,
                    timestamp: "2024-01-03T00:00:00Z".to_string(),
                    is_pinned: false,
                },
            ],
        };

        let query = "hello";
        let query_lower = query.to_lowercase();
        let filtered: Vec<_> = history
            .entries
            .iter()
            .filter(|e| e.content.to_lowercase().contains(&query_lower))
            .collect();

        assert_eq!(filtered.len(), 2);
        assert!(filtered[0].content.contains("hello"));
        assert!(filtered[1].content.contains("hello"));
    }

    #[test]
    fn test_search_entries_case_insensitive() {
        let history = ClipboardHistory {
            entries: vec![
                ClipboardEntry {
                    id: "1".to_string(),
                    content: "HELLO WORLD".to_string(),
                    language: None,
                    source: None,
                    timestamp: "2024-01-01T00:00:00Z".to_string(),
                    is_pinned: false,
                },
            ],
        };

        let query = "hello";
        let query_lower = query.to_lowercase();
        let filtered: Vec<_> = history
            .entries
            .iter()
            .filter(|e| e.content.to_lowercase().contains(&query_lower))
            .collect();

        assert_eq!(filtered.len(), 1);
    }

    // ==================== Unique languages extraction test ====================

    #[test]
    fn test_extract_unique_languages() {
        let history = ClipboardHistory {
            entries: vec![
                ClipboardEntry {
                    id: "1".to_string(),
                    content: "".to_string(),
                    language: Some("rust".to_string()),
                    source: None,
                    timestamp: "".to_string(),
                    is_pinned: false,
                },
                ClipboardEntry {
                    id: "2".to_string(),
                    content: "".to_string(),
                    language: Some("javascript".to_string()),
                    source: None,
                    timestamp: "".to_string(),
                    is_pinned: false,
                },
                ClipboardEntry {
                    id: "3".to_string(),
                    content: "".to_string(),
                    language: Some("rust".to_string()), // Duplicate
                    source: None,
                    timestamp: "".to_string(),
                    is_pinned: false,
                },
                ClipboardEntry {
                    id: "4".to_string(),
                    content: "".to_string(),
                    language: None, // No language
                    source: None,
                    timestamp: "".to_string(),
                    is_pinned: false,
                },
            ],
        };

        let mut languages: Vec<String> = history
            .entries
            .iter()
            .filter_map(|e| e.language.clone())
            .collect();
        languages.sort();
        languages.dedup();

        assert_eq!(languages.len(), 2);
        assert!(languages.contains(&"rust".to_string()));
        assert!(languages.contains(&"javascript".to_string()));
    }

    // ==================== Retain pinned entries test ====================

    #[test]
    fn test_retain_pinned_on_clear() {
        let mut history = ClipboardHistory {
            entries: vec![
                ClipboardEntry {
                    id: "1".to_string(),
                    content: "pinned".to_string(),
                    language: None,
                    source: None,
                    timestamp: "".to_string(),
                    is_pinned: true,
                },
                ClipboardEntry {
                    id: "2".to_string(),
                    content: "unpinned".to_string(),
                    language: None,
                    source: None,
                    timestamp: "".to_string(),
                    is_pinned: false,
                },
                ClipboardEntry {
                    id: "3".to_string(),
                    content: "another pinned".to_string(),
                    language: None,
                    source: None,
                    timestamp: "".to_string(),
                    is_pinned: true,
                },
            ],
        };

        // Simulate clear (keep only pinned)
        history.entries.retain(|e| e.is_pinned);

        assert_eq!(history.entries.len(), 2);
        assert!(history.entries.iter().all(|e| e.is_pinned));
    }
}
@@ -1,3 +1,5 @@
use std::collections::HashMap;

use serde::{Deserialize, Serialize};

#[derive(Debug, Clone, Serialize, Deserialize, Default)]
@@ -22,9 +24,56 @@ pub struct ClaudeStartOptions {
    #[serde(default)]
    pub skip_greeting: bool,

    #[serde(default)]
    pub resume_session_id: Option<String>,

    #[serde(default)]
    pub use_worktree: bool,

    #[serde(default)]
    pub disable_1m_context: bool,

    #[serde(default)]
    pub max_output_tokens: Option<u64>,

    #[serde(default)]
    pub disable_cron: bool,

    #[serde(default = "default_include_git_instructions")]
    pub include_git_instructions: bool,

    #[serde(default = "default_enable_claudeai_mcp_servers")]
    pub enable_claudeai_mcp_servers: bool,

    #[serde(default)]
    pub auto_memory_directory: Option<String>,

    #[serde(default)]
    pub model_overrides: Option<HashMap<String, String>>,

    #[serde(default)]
    pub session_name: Option<String>,

    #[serde(default)]
    pub disable_skill_shell_execution: bool,

    /// Pass `--bare` flag to suppress UI chrome, useful for scripted headless `-p` calls (v2.1.81+).
    #[serde(default)]
    pub bare_mode: bool,

    /// Controls `showClearContextOnPlanAccept` in `--settings` (v2.1.81+).
    /// Defaults to true (matching CLI default). Set to false to suppress the dialog.
    #[serde(default = "default_show_clear_context")]
    pub show_clear_context_on_plan_accept: bool,

    /// Sets `ANTHROPIC_CUSTOM_MODEL_OPTION` env var for custom model providers (v2.1.81+).
    #[serde(default)]
    pub custom_model_option: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(default)]
pub struct HikariConfig {
    #[serde(default)]
    pub model: Option<String>,
@@ -55,6 +104,138 @@ pub struct HikariConfig {
    #[serde(default = "default_notification_volume")]
    pub notification_volume: f32,

    #[serde(default)]
    pub always_on_top: bool,

    #[serde(default = "default_update_checks_enabled")]
    pub update_checks_enabled: bool,

    #[serde(default)]
    pub character_panel_width: Option<u32>,

    #[serde(default = "default_font_size")]
    pub font_size: u32,

    #[serde(default)]
    pub streamer_mode: bool,

    #[serde(default)]
    pub streamer_hide_paths: bool,

    #[serde(default)]
    pub compact_mode: bool,

    // Profile fields
    #[serde(default)]
    pub profile_name: Option<String>,

    #[serde(default)]
    pub profile_avatar_path: Option<String>,

    #[serde(default)]
    pub profile_bio: Option<String>,

    // Custom theme colors
    #[serde(default)]
    pub custom_theme_colors: CustomThemeColors,

    // Token budget settings
    #[serde(default)]
    pub budget_enabled: bool,

    #[serde(default)]
    pub session_token_budget: Option<u64>,

    #[serde(default)]
    pub session_cost_budget: Option<f64>,

    #[serde(default = "default_budget_action")]
    pub budget_action: BudgetAction,

    #[serde(default = "default_budget_warning_threshold")]
    pub budget_warning_threshold: f32,

    #[serde(default = "default_discord_rpc_enabled")]
    pub discord_rpc_enabled: bool,

    #[serde(default)]
    pub use_worktree: bool,

    #[serde(default)]
    pub disable_1m_context: bool,

    #[serde(default)]
    pub max_output_tokens: Option<u64>,

    #[serde(default)]
    pub trusted_workspaces: Vec<String>,

    // Background image settings
    #[serde(default)]
    pub background_image_path: Option<String>,

    #[serde(default = "default_background_image_opacity")]
    pub background_image_opacity: f32,

    #[serde(default)]
    pub show_thinking_blocks: bool,

    // Custom terminal font settings
    #[serde(default)]
    pub custom_font_path: Option<String>,

    #[serde(default)]
    pub custom_font_family: Option<String>,

    // Custom UI font settings
    #[serde(default)]
    pub custom_ui_font_path: Option<String>,

    #[serde(default)]
    pub custom_ui_font_family: Option<String>,

    // Task Loop auto-commit settings
    #[serde(default)]
    pub task_loop_auto_commit: bool,

    #[serde(default = "default_task_loop_commit_prefix")]
    pub task_loop_commit_prefix: String,

    #[serde(default)]
    pub task_loop_include_summary: bool,

    #[serde(default)]
    pub disable_cron: bool,

    #[serde(default = "default_include_git_instructions")]
    pub include_git_instructions: bool,

    #[serde(default = "default_enable_claudeai_mcp_servers")]
    pub enable_claudeai_mcp_servers: bool,

    #[serde(default)]
    pub auto_memory_directory: Option<String>,

    #[serde(default)]
    pub model_overrides: Option<HashMap<String, String>>,

    /// Prevents skill scripts from executing shell commands (Claude Code v2.1.91+).
    /// Passes `"disableSkillShellExecution": true` via the `--settings` flag.
    #[serde(default)]
    pub disable_skill_shell_execution: bool,

    /// Pass `--bare` flag to suppress UI chrome, useful for scripted headless `-p` calls (v2.1.81+).
    #[serde(default)]
    pub bare_mode: bool,

    /// Controls `showClearContextOnPlanAccept` in `--settings` (v2.1.81+).
    #[serde(default = "default_show_clear_context")]
    pub show_clear_context_on_plan_accept: bool,

    /// Sets `ANTHROPIC_CUSTOM_MODEL_OPTION` env var for custom model providers (v2.1.81+).
    #[serde(default)]
    pub custom_model_option: Option<String>,
}

impl Default for HikariConfig {
@@ -70,10 +251,54 @@ impl Default for HikariConfig {
            greeting_custom_prompt: None,
            notifications_enabled: true,
            notification_volume: 0.7,
            always_on_top: false,
            update_checks_enabled: true,
            character_panel_width: None,
            font_size: 14,
            streamer_mode: false,
            streamer_hide_paths: false,
            compact_mode: false,
            profile_name: None,
            profile_avatar_path: None,
            profile_bio: None,
            custom_theme_colors: CustomThemeColors::default(),
            budget_enabled: false,
            session_token_budget: None,
            session_cost_budget: None,
            budget_action: BudgetAction::Warn,
            budget_warning_threshold: 0.8,
            discord_rpc_enabled: true,
            use_worktree: false,
            disable_1m_context: false,
            max_output_tokens: None,
            trusted_workspaces: Vec::new(),
            background_image_path: None,
            background_image_opacity: 0.3,
            show_thinking_blocks: false,
            custom_font_path: None,
            custom_font_family: None,
            custom_ui_font_path: None,
            custom_ui_font_family: None,
            task_loop_auto_commit: false,
            task_loop_commit_prefix: "feat".to_string(),
            task_loop_include_summary: false,
            disable_cron: false,
            include_git_instructions: true,
            enable_claudeai_mcp_servers: true,
            auto_memory_directory: None,
            model_overrides: None,
            disable_skill_shell_execution: false,
            bare_mode: false,
            show_clear_context_on_plan_accept: true,
            custom_model_option: None,
        }
    }
}

fn default_update_checks_enabled() -> bool {
    true
}

fn default_greeting_enabled() -> bool {
    true
}
@@ -86,12 +311,91 @@ fn default_notification_volume() -> f32 {
    0.7
}

fn default_font_size() -> u32 {
    14
}

fn default_budget_action() -> BudgetAction {
    BudgetAction::Warn
}

fn default_budget_warning_threshold() -> f32 {
    0.8
}

fn default_discord_rpc_enabled() -> bool {
    true
}

fn default_background_image_opacity() -> f32 {
    0.3
}

fn default_task_loop_commit_prefix() -> String {
    "feat".to_string()
}

fn default_include_git_instructions() -> bool {
    true
}

fn default_enable_claudeai_mcp_servers() -> bool {
    true
}

fn default_show_clear_context() -> bool {
    true
}

#[derive(Debug, Clone, Serialize, Deserialize, Default, PartialEq)]
#[serde(rename_all = "lowercase")]
pub enum BudgetAction {
    #[default]
    Warn,
    Block,
}

#[derive(Debug, Clone, Serialize, Deserialize, Default, PartialEq)]
#[serde(rename_all = "lowercase")]
pub enum Theme {
    #[default]
    Dark,
    Light,
    #[serde(rename = "high-contrast")]
    HighContrast,
    Custom,
    Dracula,
    Catppuccin,
    Nord,
    Solarized,
    #[serde(rename = "solarized-light")]
    SolarizedLight,
    #[serde(rename = "catppuccin-latte")]
    CatppuccinLatte,
    #[serde(rename = "gruvbox-light")]
    GruvboxLight,
    #[serde(rename = "rose-pine-dawn")]
    RosePineDawn,
}

#[derive(Debug, Clone, Serialize, Deserialize, Default, PartialEq)]
pub struct CustomThemeColors {
    #[serde(default)]
    pub bg_primary: Option<String>,
    #[serde(default)]
    pub bg_secondary: Option<String>,
    #[serde(default)]
    pub bg_terminal: Option<String>,
    #[serde(default)]
    pub accent_primary: Option<String>,
    #[serde(default)]
    pub accent_secondary: Option<String>,
    #[serde(default)]
    pub text_primary: Option<String>,
    #[serde(default)]
    pub text_secondary: Option<String>,
    #[serde(default)]
    pub border_color: Option<String>,
}

#[cfg(test)]
@@ -109,6 +413,43 @@ mod tests {
        assert_eq!(config.theme, Theme::Dark);
        assert!(config.greeting_enabled);
        assert!(config.greeting_custom_prompt.is_none());
        assert!(!config.always_on_top);
        assert!(config.update_checks_enabled);
        assert!(config.character_panel_width.is_none());
        assert_eq!(config.font_size, 14);
        assert!(!config.streamer_mode);
        assert!(!config.streamer_hide_paths);
        assert!(!config.compact_mode);
        assert!(config.profile_name.is_none());
        assert!(config.profile_avatar_path.is_none());
        assert!(config.profile_bio.is_none());
        assert_eq!(config.custom_theme_colors, CustomThemeColors::default());
        assert!(!config.budget_enabled);
        assert!(config.session_token_budget.is_none());
        assert!(config.session_cost_budget.is_none());
        assert_eq!(config.budget_action, BudgetAction::Warn);
        assert!((config.budget_warning_threshold - 0.8).abs() < f32::EPSILON);
        assert!(config.discord_rpc_enabled);
        assert!(!config.use_worktree);
        assert!(!config.disable_1m_context);
        assert!(config.trusted_workspaces.is_empty());
        assert!(!config.show_thinking_blocks);
        assert!(config.custom_font_path.is_none());
        assert!(config.custom_font_family.is_none());
        assert!(config.custom_ui_font_path.is_none());
        assert!(config.custom_ui_font_family.is_none());
        assert!(!config.task_loop_auto_commit);
        assert_eq!(config.task_loop_commit_prefix, "feat");
        assert!(!config.task_loop_include_summary);
        assert!(!config.disable_cron);
        assert!(config.include_git_instructions);
        assert!(config.enable_claudeai_mcp_servers);
        assert!(config.auto_memory_directory.is_none());
        assert!(config.model_overrides.is_none());
        assert!(!config.disable_skill_shell_execution);
        assert!(!config.bare_mode);
        assert!(config.show_clear_context_on_plan_accept);
        assert!(config.custom_model_option.is_none());
    }

    #[test]
@@ -124,6 +465,49 @@ mod tests {
            greeting_custom_prompt: Some("Hello!".to_string()),
            notifications_enabled: true,
            notification_volume: 0.7,
            always_on_top: true,
            update_checks_enabled: true,
            character_panel_width: Some(400),
            font_size: 16,
            streamer_mode: false,
            streamer_hide_paths: false,
            compact_mode: false,
            profile_name: Some("Test User".to_string()),
            profile_avatar_path: None,
            profile_bio: Some("A test bio".to_string()),
            custom_theme_colors: CustomThemeColors::default(),
            budget_enabled: true,
            session_token_budget: Some(100000),
            session_cost_budget: Some(1.50),
            budget_action: BudgetAction::Block,
            budget_warning_threshold: 0.75,
            discord_rpc_enabled: true,
            use_worktree: true,
            disable_1m_context: false,
            max_output_tokens: Some(32000),
            trusted_workspaces: vec!["/home/naomi/projects/trusted".to_string()],
            background_image_path: Some("/home/naomi/bg.png".to_string()),
            background_image_opacity: 0.25,
            show_thinking_blocks: true,
            custom_font_path: Some("/home/naomi/.fonts/MyFont.ttf".to_string()),
            custom_font_family: Some("MyFont".to_string()),
            custom_ui_font_path: None,
            custom_ui_font_family: None,
            task_loop_auto_commit: true,
            task_loop_commit_prefix: "fix".to_string(),
            task_loop_include_summary: true,
            disable_cron: true,
            include_git_instructions: false,
            enable_claudeai_mcp_servers: false,
            auto_memory_directory: Some("/custom/memory".to_string()),
            model_overrides: Some(HashMap::from([(
                "claude-opus-4-6".to_string(),
                "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-opus-4-6-v1".to_string(),
            )])),
            disable_skill_shell_execution: true,
            bare_mode: false,
            show_clear_context_on_plan_accept: true,
            custom_model_option: None,
        };

        let json = serde_json::to_string(&config).unwrap();
@@ -134,15 +518,101 @@ mod tests {
        assert_eq!(deserialized.auto_granted_tools, config.auto_granted_tools);
        assert_eq!(deserialized.theme, Theme::Light);
        assert!(deserialized.greeting_enabled);
        assert_eq!(
            deserialized.greeting_custom_prompt,
            Some("Hello!".to_string())
        );
        assert!(deserialized.task_loop_auto_commit);
        assert_eq!(deserialized.task_loop_commit_prefix, "fix");
        assert!(deserialized.task_loop_include_summary);
        assert!(deserialized.disable_cron);
        assert!(!deserialized.include_git_instructions);
        assert!(!deserialized.enable_claudeai_mcp_servers);
        assert_eq!(
            deserialized.auto_memory_directory,
            Some("/custom/memory".to_string())
        );
        assert!(deserialized.model_overrides.is_some());
        let overrides = deserialized.model_overrides.unwrap();
        assert_eq!(
            overrides.get("claude-opus-4-6").map(String::as_str),
            Some("arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-opus-4-6-v1")
        );
    }

    #[test]
    fn test_theme_serialization() {
        assert_eq!(serde_json::to_string(&Theme::Dark).unwrap(), "\"dark\"");
        assert_eq!(serde_json::to_string(&Theme::Light).unwrap(), "\"light\"");
        assert_eq!(
            serde_json::to_string(&Theme::HighContrast).unwrap(),
            "\"high-contrast\""
        );
        assert_eq!(serde_json::to_string(&Theme::Custom).unwrap(), "\"custom\"");
        assert_eq!(
            serde_json::to_string(&Theme::Dracula).unwrap(),
            "\"dracula\""
        );
        assert_eq!(
            serde_json::to_string(&Theme::Catppuccin).unwrap(),
            "\"catppuccin\""
        );
        assert_eq!(serde_json::to_string(&Theme::Nord).unwrap(), "\"nord\"");
        assert_eq!(
            serde_json::to_string(&Theme::Solarized).unwrap(),
            "\"solarized\""
        );
        assert_eq!(
            serde_json::to_string(&Theme::SolarizedLight).unwrap(),
            "\"solarized-light\""
        );
        assert_eq!(
            serde_json::to_string(&Theme::CatppuccinLatte).unwrap(),
            "\"catppuccin-latte\""
        );
        assert_eq!(
            serde_json::to_string(&Theme::GruvboxLight).unwrap(),
            "\"gruvbox-light\""
        );
        assert_eq!(
            serde_json::to_string(&Theme::RosePineDawn).unwrap(),
            "\"rose-pine-dawn\""
        );
    }

    #[test]
    fn test_theme_deserialization() {
        assert_eq!(
            serde_json::from_str::<Theme>("\"dracula\"").unwrap(),
            Theme::Dracula
        );
        assert_eq!(
            serde_json::from_str::<Theme>("\"catppuccin\"").unwrap(),
            Theme::Catppuccin
        );
        assert_eq!(
            serde_json::from_str::<Theme>("\"nord\"").unwrap(),
            Theme::Nord
        );
        assert_eq!(
            serde_json::from_str::<Theme>("\"solarized\"").unwrap(),
            Theme::Solarized
        );
        assert_eq!(
            serde_json::from_str::<Theme>("\"solarized-light\"").unwrap(),
            Theme::SolarizedLight
        );
        assert_eq!(
            serde_json::from_str::<Theme>("\"catppuccin-latte\"").unwrap(),
            Theme::CatppuccinLatte
        );
        assert_eq!(
            serde_json::from_str::<Theme>("\"gruvbox-light\"").unwrap(),
            Theme::GruvboxLight
        );
        assert_eq!(
            serde_json::from_str::<Theme>("\"rose-pine-dawn\"").unwrap(),
            Theme::RosePineDawn
        );
    }
}
@@ -0,0 +1,376 @@
use chrono::{Datelike, Local, NaiveDate, Weekday};
use serde::{Deserialize, Serialize};
use std::collections::HashMap;

/// Represents a single day's cost data
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct DailyCost {
    pub date: String, // ISO date string (YYYY-MM-DD)
    pub input_tokens: u64,
    pub output_tokens: u64,
    pub cost_usd: f64,
    pub messages_sent: u64,
    pub sessions_count: u64,
}

/// Historical cost tracking data
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct CostHistory {
    /// Daily costs indexed by date string (YYYY-MM-DD)
    pub daily_costs: HashMap<String, DailyCost>,
    /// Cost alert thresholds
    pub daily_alert_threshold: Option<f64>,
    pub weekly_alert_threshold: Option<f64>,
    pub monthly_alert_threshold: Option<f64>,
    /// Whether alerts have been triggered today
    pub daily_alert_triggered: bool,
    pub weekly_alert_triggered: bool,
    pub monthly_alert_triggered: bool,
    pub last_alert_reset_date: Option<String>,
}

impl CostHistory {
    pub fn new() -> Self {
        Self::default()
    }

    /// Get today's date as a string
    fn today_str() -> String {
        Local::now().format("%Y-%m-%d").to_string()
    }

    /// Get the start of the current week (Monday)
    fn week_start() -> NaiveDate {
        let today = Local::now().date_naive();
        let days_since_monday = today.weekday().num_days_from_monday();
        today - chrono::Duration::days(days_since_monday as i64)
    }

    /// Get the start of the current month
    fn month_start() -> NaiveDate {
        let today = Local::now().date_naive();
        NaiveDate::from_ymd_opt(today.year(), today.month(), 1).unwrap_or(today)
    }

    /// Add cost for today
    pub fn add_cost(&mut self, input_tokens: u64, output_tokens: u64, cost_usd: f64) {
        let today = Self::today_str();

        // Reset alert flags if it's a new day
        if self.last_alert_reset_date.as_ref() != Some(&today) {
            self.daily_alert_triggered = false;
            // Reset weekly on Monday
            if Local::now().weekday() == Weekday::Mon {
                self.weekly_alert_triggered = false;
            }
            // Reset monthly on the 1st
            if Local::now().day() == 1 {
                self.monthly_alert_triggered = false;
            }
            self.last_alert_reset_date = Some(today.clone());
        }

        // Populate `date` on first insert; week/month sums parse it, so a
        // Default-constructed (empty) date would silently drop this entry.
        let daily = self.daily_costs.entry(today.clone()).or_insert_with(|| DailyCost {
            date: today,
            ..Default::default()
        });
        daily.input_tokens += input_tokens;
        daily.output_tokens += output_tokens;
        daily.cost_usd += cost_usd;
        daily.messages_sent += 1;
    }

    /// Increment session count for today
    pub fn increment_sessions(&mut self) {
        let today = Self::today_str();
        let daily = self.daily_costs.entry(today.clone()).or_insert_with(|| DailyCost {
            date: today,
            ..Default::default()
        });
        daily.sessions_count += 1;
    }

    /// Get today's cost
    pub fn get_today_cost(&self) -> f64 {
        self.daily_costs
            .get(&Self::today_str())
            .map(|d| d.cost_usd)
            .unwrap_or(0.0)
    }

    /// Get this week's cost (Monday to Sunday)
    pub fn get_week_cost(&self) -> f64 {
        let week_start = Self::week_start();
        self.daily_costs
            .values()
            .filter(|d| {
                NaiveDate::parse_from_str(&d.date, "%Y-%m-%d")
                    .map(|date| date >= week_start)
                    .unwrap_or(false)
            })
            .map(|d| d.cost_usd)
            .sum()
    }

    /// Get this month's cost
    pub fn get_month_cost(&self) -> f64 {
        let month_start = Self::month_start();
        self.daily_costs
            .values()
            .filter(|d| {
                NaiveDate::parse_from_str(&d.date, "%Y-%m-%d")
                    .map(|date| date >= month_start)
                    .unwrap_or(false)
            })
            .map(|d| d.cost_usd)
            .sum()
    }

    /// Get cost summary for a date range
    pub fn get_summary(&self, days: u32) -> CostSummary {
        let today = Local::now().date_naive();
        let start_date = today - chrono::Duration::days(days as i64 - 1);

        let mut total_input_tokens = 0u64;
        let mut total_output_tokens = 0u64;
        let mut total_cost = 0.0f64;
        let mut total_messages = 0u64;
        let mut total_sessions = 0u64;
        let mut daily_breakdown = Vec::new();

        for i in 0..days {
            let date = start_date + chrono::Duration::days(i as i64);
            let date_str = date.format("%Y-%m-%d").to_string();

            if let Some(daily) = self.daily_costs.get(&date_str) {
                total_input_tokens += daily.input_tokens;
                total_output_tokens += daily.output_tokens;
                total_cost += daily.cost_usd;
                total_messages += daily.messages_sent;
                total_sessions += daily.sessions_count;
                daily_breakdown.push(daily.clone());
            } else {
                daily_breakdown.push(DailyCost {
                    date: date_str,
                    ..Default::default()
                });
            }
        }

        CostSummary {
            period_days: days,
            total_input_tokens,
            total_output_tokens,
            total_cost,
            total_messages,
            total_sessions,
            average_daily_cost: if days > 0 { total_cost / days as f64 } else { 0.0 },
            daily_breakdown,
        }
    }

    /// Check if any alert thresholds are exceeded and return which ones
    pub fn check_alerts(&mut self) -> Vec<CostAlert> {
        let mut alerts = Vec::new();

        if let Some(threshold) = self.daily_alert_threshold {
            let today_cost = self.get_today_cost();
            if today_cost >= threshold && !self.daily_alert_triggered {
                self.daily_alert_triggered = true;
                alerts.push(CostAlert {
                    alert_type: AlertType::Daily,
                    threshold,
                    current_cost: today_cost,
                });
            }
        }

        if let Some(threshold) = self.weekly_alert_threshold {
            let week_cost = self.get_week_cost();
            if week_cost >= threshold && !self.weekly_alert_triggered {
                self.weekly_alert_triggered = true;
                alerts.push(CostAlert {
                    alert_type: AlertType::Weekly,
                    threshold,
                    current_cost: week_cost,
                });
            }
        }

        if let Some(threshold) = self.monthly_alert_threshold {
            let month_cost = self.get_month_cost();
            if month_cost >= threshold && !self.monthly_alert_triggered {
                self.monthly_alert_triggered = true;
                alerts.push(CostAlert {
                    alert_type: AlertType::Monthly,
                    threshold,
                    current_cost: month_cost,
                });
            }
        }

        alerts
    }

    /// Set alert thresholds
    pub fn set_alert_thresholds(
        &mut self,
        daily: Option<f64>,
        weekly: Option<f64>,
        monthly: Option<f64>,
    ) {
        self.daily_alert_threshold = daily;
        self.weekly_alert_threshold = weekly;
        self.monthly_alert_threshold = monthly;
    }

    /// Clean up old data (keep last N days)
    #[allow(dead_code)]
    pub fn cleanup_old_data(&mut self, keep_days: u32) {
        let cutoff = Local::now().date_naive() - chrono::Duration::days(keep_days as i64);
        self.daily_costs.retain(|date_str, _| {
            NaiveDate::parse_from_str(date_str, "%Y-%m-%d")
                .map(|date| date >= cutoff)
                .unwrap_or(false)
        });
    }

    /// Export to CSV format
    pub fn export_csv(&self, days: u32) -> String {
        let summary = self.get_summary(days);
        let mut csv = String::from("Date,Input Tokens,Output Tokens,Cost (USD),Messages,Sessions\n");

        for daily in &summary.daily_breakdown {
            csv.push_str(&format!(
                "{},{},{},{:.4},{},{}\n",
                daily.date,
                daily.input_tokens,
                daily.output_tokens,
                daily.cost_usd,
                daily.messages_sent,
                daily.sessions_count
            ));
        }

        // Add totals row
        csv.push_str(&format!(
            "TOTAL,{},{},{:.4},{},{}\n",
            summary.total_input_tokens,
            summary.total_output_tokens,
            summary.total_cost,
            summary.total_messages,
            summary.total_sessions
        ));

        csv
    }
}

/// Cost summary for a period
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CostSummary {
    pub period_days: u32,
    pub total_input_tokens: u64,
    pub total_output_tokens: u64,
    pub total_cost: f64,
    pub total_messages: u64,
    pub total_sessions: u64,
    pub average_daily_cost: f64,
    pub daily_breakdown: Vec<DailyCost>,
}

/// Alert types
#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
pub enum AlertType {
    Daily,
    Weekly,
    Monthly,
}

/// Cost alert notification
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CostAlert {
    pub alert_type: AlertType,
    pub threshold: f64,
    pub current_cost: f64,
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_add_cost() {
        let mut history = CostHistory::new();
        history.add_cost(1000, 500, 0.05);

        let today_cost = history.get_today_cost();
        assert!((today_cost - 0.05).abs() < 0.0001);
    }

    #[test]
    fn test_accumulate_daily_cost() {
        let mut history = CostHistory::new();
        history.add_cost(1000, 500, 0.05);
        history.add_cost(2000, 1000, 0.10);

        let today_cost = history.get_today_cost();
        assert!((today_cost - 0.15).abs() < 0.0001);
    }

    #[test]
    fn test_summary() {
        let mut history = CostHistory::new();
        history.add_cost(1000, 500, 0.05);

        let summary = history.get_summary(7);
        assert_eq!(summary.period_days, 7);
        assert!((summary.total_cost - 0.05).abs() < 0.0001);
    }

    #[test]
    fn test_daily_alert() {
        let mut history = CostHistory::new();
        history.set_alert_thresholds(Some(0.10), None, None);

        history.add_cost(1000, 500, 0.05);
        let alerts = history.check_alerts();
        assert!(alerts.is_empty());

        history.add_cost(1000, 500, 0.06);
        let alerts = history.check_alerts();
        assert_eq!(alerts.len(), 1);
        assert_eq!(alerts[0].alert_type, AlertType::Daily);
    }

    #[test]
    fn test_alert_only_triggers_once() {
        let mut history = CostHistory::new();
        history.set_alert_thresholds(Some(0.10), None, None);

        history.add_cost(1000, 500, 0.15);
        let alerts = history.check_alerts();
        assert_eq!(alerts.len(), 1);

        // Second check should not trigger again
        let alerts = history.check_alerts();
        assert!(alerts.is_empty());
    }

    #[test]
    fn test_export_csv() {
        let mut history = CostHistory::new();
        history.add_cost(1000, 500, 0.05);

        let csv = history.export_csv(1);
        assert!(csv.contains("Date,Input Tokens"));
        assert!(csv.contains("TOTAL"));
    }

    #[test]
    fn test_increment_sessions() {
        let mut history = CostHistory::new();
        history.increment_sessions();
        history.increment_sessions();

        let summary = history.get_summary(1);
        assert_eq!(summary.total_sessions, 2);
    }
}
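For reference, the accumulate-then-export shape of `CostHistory` above can be sketched with only the Rust standard library (no `chrono` or `serde`), with dates passed in explicitly instead of read from the system clock. The names `add_cost` and `export_csv` mirror the real methods, but this is a simplified stand-in, not the implementation; the sort is added only to make the output deterministic (the real `export_csv` iterates a date range from `get_summary`).

```rust
use std::collections::HashMap;

// Simplified stand-in for DailyCost: (input_tokens, output_tokens, cost_usd).
type Daily = (u64, u64, f64);

// Accumulate one usage record into the per-date map, like CostHistory::add_cost,
// except the date is an explicit parameter.
fn add_cost(costs: &mut HashMap<String, Daily>, date: &str, input: u64, output: u64, usd: f64) {
    let d = costs.entry(date.to_string()).or_insert((0, 0, 0.0));
    d.0 += input;
    d.1 += output;
    d.2 += usd;
}

// Render the map as CSV with a TOTAL row, mirroring CostHistory::export_csv.
fn export_csv(costs: &HashMap<String, Daily>) -> String {
    let mut rows: Vec<_> = costs.iter().collect();
    rows.sort_by(|a, b| a.0.cmp(b.0)); // deterministic row order

    let mut csv = String::from("Date,Input Tokens,Output Tokens,Cost (USD)\n");
    let (mut ti, mut to, mut tc) = (0u64, 0u64, 0.0f64);
    for (date, (i, o, c)) in rows {
        csv.push_str(&format!("{},{},{},{:.4}\n", date, i, o, c));
        ti += i;
        to += o;
        tc += c;
    }
    csv.push_str(&format!("TOTAL,{},{},{:.4}\n", ti, to, tc));
    csv
}

fn main() {
    let mut costs = HashMap::new();
    add_cost(&mut costs, "2025-01-01", 1000, 500, 0.05);
    add_cost(&mut costs, "2025-01-01", 2000, 1000, 0.10); // same day accumulates
    add_cost(&mut costs, "2025-01-02", 500, 250, 0.02);

    // 2025-01-01,3000,1500,0.1500 / 2025-01-02,500,250,0.0200 / TOTAL,3500,1750,0.1700
    print!("{}", export_csv(&costs));
}
```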
@@ -0,0 +1,157 @@
use serde::{Deserialize, Serialize};
use std::sync::Arc;
use tauri::{AppHandle, Emitter};
use tracing::{Level, Subscriber};
use tracing_subscriber::layer::{Context, Layer};

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct DebugLogEvent {
    pub level: String,
    pub message: String,
}

#[derive(Clone)]
pub struct TauriLogLayer {
    app: Arc<AppHandle>,
}

impl TauriLogLayer {
    pub fn new(app: AppHandle) -> Self {
        Self {
            app: Arc::new(app),
        }
    }
}

impl<S> Layer<S> for TauriLogLayer
where
    S: Subscriber,
{
    fn on_event(&self, event: &tracing::Event<'_>, _ctx: Context<'_, S>) {
        let metadata = event.metadata();
        let level = match *metadata.level() {
            Level::ERROR => "error",
            Level::WARN => "warn",
            Level::INFO => "info",
            Level::DEBUG => "debug",
            Level::TRACE => "debug",
        };

        // Extract message from the event
        struct MessageVisitor {
            message: String,
        }

        impl tracing::field::Visit for MessageVisitor {
            fn record_debug(&mut self, field: &tracing::field::Field, value: &dyn std::fmt::Debug) {
                if field.name() == "message" {
                    self.message = format!("{:?}", value);
                }
            }
        }

        let mut visitor = MessageVisitor {
            message: String::new(),
        };
        event.record(&mut visitor);

        // If we couldn't extract a message, fall back to the event's metadata name
        if visitor.message.is_empty() {
            visitor.message = metadata.name().to_string();
        }

        // Strip quotes from the message
        let message = visitor.message.trim_matches('"').to_string();

        let log_event = DebugLogEvent {
            level: level.to_string(),
            message,
        };

        // Emit to frontend
        let _ = self.app.emit("debug:log", log_event);
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_debug_log_event_creation() {
        let event = DebugLogEvent {
            level: "info".to_string(),
            message: "Test message".to_string(),
        };

        assert_eq!(event.level, "info");
        assert_eq!(event.message, "Test message");
    }

    #[test]
    fn test_debug_log_event_serialization() {
        let event = DebugLogEvent {
            level: "error".to_string(),
            message: "Error occurred".to_string(),
        };

        let json = serde_json::to_string(&event).unwrap();
        assert!(json.contains("\"level\":\"error\""));
        assert!(json.contains("\"message\":\"Error occurred\""));
    }

    #[test]
    fn test_debug_log_event_deserialization() {
        let json = r#"{"level":"warn","message":"Warning message"}"#;
        let event: DebugLogEvent = serde_json::from_str(json).unwrap();

        assert_eq!(event.level, "warn");
        assert_eq!(event.message, "Warning message");
    }

    #[test]
    fn test_debug_log_event_with_special_characters() {
        let event = DebugLogEvent {
            level: "info".to_string(),
            message: "Message with \"quotes\" and \n newlines".to_string(),
        };

        let json = serde_json::to_string(&event).unwrap();
        let decoded: DebugLogEvent = serde_json::from_str(&json).unwrap();

        assert_eq!(decoded.level, event.level);
        assert_eq!(decoded.message, event.message);
    }

    #[test]
    fn test_debug_log_event_with_unicode() {
        let event = DebugLogEvent {
            level: "debug".to_string(),
            message: "Unicode: 日本語 🎉".to_string(),
        };

        let json = serde_json::to_string(&event).unwrap();
        let decoded: DebugLogEvent = serde_json::from_str(&json).unwrap();

        assert_eq!(decoded.message, "Unicode: 日本語 🎉");
    }

    #[test]
    fn test_debug_log_event_all_levels() {
        let levels = vec!["error", "warn", "info", "debug", "trace"];

        for level in levels {
            let event = DebugLogEvent {
                level: level.to_string(),
                message: format!("{} level message", level),
            };

            assert_eq!(event.level, level);
            assert!(event.message.contains(level));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,178 @@
use discord_rich_presence::activity::{Activity, Assets, Timestamps};
use discord_rich_presence::{DiscordIpc, DiscordIpcClient};
use parking_lot::RwLock;
use std::sync::Arc;

pub struct DiscordRpcManager {
    client: Arc<RwLock<Option<DiscordIpcClient>>>,
    session_name: Arc<RwLock<String>>,
    model: Arc<RwLock<String>>,
    started_at: Arc<RwLock<i64>>,
}

impl DiscordRpcManager {
    pub fn new() -> Self {
        Self {
            client: Arc::new(RwLock::new(None)),
            session_name: Arc::new(RwLock::new(String::new())),
            model: Arc::new(RwLock::new(String::new())),
            started_at: Arc::new(RwLock::new(0)),
        }
    }

    pub fn init(&self, initial_session_name: String, initial_model: String, started_at: i64) -> Result<(), String> {
        tracing::debug!("Attempting to initialize Discord RPC...");
        tracing::debug!("Application ID: 1391117878182281316");
        tracing::debug!(
            "Initial session: '{}', model: '{}', timestamp: {}",
            initial_session_name, initial_model, started_at
        );

        let mut client = DiscordIpcClient::new("1391117878182281316")
            .map_err(|e| {
                let error_msg = format!("Failed to create Discord RPC client: {} (is Discord running?)", e);
                tracing::error!("{}", error_msg);
                error_msg
            })?;

        tracing::debug!("DiscordIpcClient created successfully");

        client
            .connect()
            .map_err(|e| {
                let error_msg = format!("Failed to connect to Discord RPC: {} (ensure Discord is running)", e);
                tracing::error!("{}", error_msg);
                error_msg
            })?;

        tracing::debug!("Connected to Discord IPC socket");

        // Set initial activity immediately after connecting
        tracing::debug!("Building initial activity...");
        let state_text = format!("Model: {}", initial_model);
        let assets = Assets::new()
            .large_image("hikari")
            .large_text("Hikari - Claude Code Assistant");

        tracing::debug!("Assets created - large_image: 'hikari', large_text: 'Hikari - Claude Code Assistant'");

        let timestamps = Timestamps::new().start(started_at);

        tracing::debug!("Timestamps created - start: {}", started_at);

        let activity = Activity::new()
            .details(initial_session_name.as_str())
            .state(state_text.as_str())
            .assets(assets)
            .timestamps(timestamps);

        tracing::debug!(
            "Activity created - details: '{}', state: '{}'",
            initial_session_name, state_text
        );

        tracing::debug!("Attempting to set initial activity...");
        client
            .set_activity(activity)
            .map_err(|e| {
                let error_msg = format!("Failed to set initial Discord RPC activity: {}", e);
                tracing::error!("{}", error_msg);
                error_msg
            })?;

        tracing::debug!("Initial activity set successfully!");

        // Store the client and initial state
        *self.client.write() = Some(client);
        *self.session_name.write() = initial_session_name.clone();
        *self.model.write() = initial_model.clone();
        *self.started_at.write() = started_at;

        tracing::info!(
            "Discord RPC connected successfully with initial activity: session='{}', model='{}'",
            initial_session_name, initial_model
        );
        Ok(())
    }

    pub fn update(
        &self,
        session_name: String,
        model: String,
        started_at: i64,
    ) -> Result<(), String> {
        tracing::debug!(
            "update() called with session='{}', model='{}', timestamp={}",
            session_name, model, started_at
        );

        *self.session_name.write() = session_name.clone();
        *self.model.write() = model.clone();
        *self.started_at.write() = started_at;

        tracing::debug!("State variables updated");

        let mut client_guard = self.client.write();
        let client = client_guard
            .as_mut()
            .ok_or_else(|| {
                let error_msg = "Discord RPC client not initialized".to_string();
                tracing::error!("{}", error_msg);
                error_msg
            })?;

        tracing::debug!("Client lock acquired");

        let state_text = format!("Model: {}", model);
        let assets = Assets::new()
            .large_image("hikari")
            .large_text("Hikari - Claude Code Assistant");

        tracing::debug!("Assets created - large_image: 'hikari', large_text: 'Hikari - Claude Code Assistant'");

        let timestamps = Timestamps::new().start(started_at);

        tracing::debug!("Timestamps created - start: {}", started_at);

        let activity = Activity::new()
            .details(session_name.as_str())
            .state(state_text.as_str())
            .assets(assets)
            .timestamps(timestamps);

        tracing::debug!(
            "Activity created - details: '{}', state: '{}'",
            session_name, state_text
        );

        tracing::debug!("Attempting to set activity...");
        client
            .set_activity(activity)
            .map_err(|e| {
                let error_msg = format!("Failed to update Discord RPC: {}", e);
                tracing::error!("{}", error_msg);
                error_msg
            })?;

        tracing::info!("Updated Discord RPC: session='{}', model='{}'", session_name, model);
        Ok(())
    }

    pub fn stop(&self) -> Result<(), String> {
        tracing::debug!("stop() called");

        let mut client_guard = self.client.write();
        if let Some(mut client) = client_guard.take() {
            tracing::debug!("Client found, attempting to close...");
            client
                .close()
                .map_err(|e| {
                    let error_msg = format!("Failed to close Discord RPC: {}", e);
                    tracing::error!("{}", error_msg);
                    error_msg
                })?;
            tracing::info!("Discord RPC stopped successfully");
        } else {
            tracing::debug!("No client to stop (already stopped or never initialized)");
        }
        Ok(())
    }
}

impl Default for DiscordRpcManager {
    fn default() -> Self {
        Self::new()
    }
}
@@ -0,0 +1,192 @@
use chrono::Utc;
use serde::{Deserialize, Serialize};
use tauri::AppHandle;
use tauri_plugin_store::StoreExt;
use uuid::Uuid;

const DRAFTS_STORE_FILE: &str = "hikari-drafts.json";
const DRAFTS_STORE_KEY: &str = "drafts";

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Draft {
    pub id: String,
    pub content: String,
    pub saved_at: String,
}

fn load_all_drafts(app: &AppHandle) -> Result<Vec<Draft>, String> {
    let store = app
        .store(DRAFTS_STORE_FILE)
        .map_err(|e| e.to_string())?;

    match store.get(DRAFTS_STORE_KEY) {
        Some(value) => serde_json::from_value(value.clone()).map_err(|e| e.to_string()),
        None => Ok(vec![]),
    }
}

fn save_all_drafts(app: &AppHandle, drafts: &[Draft]) -> Result<(), String> {
    let store = app
        .store(DRAFTS_STORE_FILE)
        .map_err(|e| e.to_string())?;

    let value = serde_json::to_value(drafts).map_err(|e| e.to_string())?;
    store.set(DRAFTS_STORE_KEY, value);
    store.save().map_err(|e| e.to_string())?;

    Ok(())
}

#[tauri::command]
pub async fn list_drafts(app: AppHandle) -> Result<Vec<Draft>, String> {
    let mut drafts = load_all_drafts(&app)?;
    // Sort newest first — ISO 8601 timestamps sort lexicographically
    drafts.sort_by(|a, b| b.saved_at.cmp(&a.saved_at));
    Ok(drafts)
}

#[tauri::command]
pub async fn save_draft(app: AppHandle, content: String) -> Result<Draft, String> {
    let mut drafts = load_all_drafts(&app)?;

    let draft = Draft {
        id: Uuid::new_v4().to_string(),
        content,
        saved_at: Utc::now().to_rfc3339(),
    };

    drafts.push(draft.clone());
    save_all_drafts(&app, &drafts)?;

    Ok(draft)
}

#[tauri::command]
pub async fn delete_draft(app: AppHandle, draft_id: String) -> Result<(), String> {
    let mut drafts = load_all_drafts(&app)?;
    drafts.retain(|d| d.id != draft_id);
    save_all_drafts(&app, &drafts)
}

#[tauri::command]
pub async fn delete_all_drafts(app: AppHandle) -> Result<(), String> {
    save_all_drafts(&app, &[])
}

#[cfg(test)]
mod tests {
    use super::*;

    fn make_draft(id: &str, content: &str, saved_at: &str) -> Draft {
        Draft {
            id: id.to_string(),
            content: content.to_string(),
            saved_at: saved_at.to_string(),
        }
    }

    #[test]
    fn test_draft_serialization() {
        let draft = make_draft("test-id", "Hello world", "2026-01-01T00:00:00+00:00");
        let json = serde_json::to_string(&draft).expect("Failed to serialize");
        let parsed: Draft = serde_json::from_str(&json).expect("Failed to deserialize");

        assert_eq!(parsed.id, draft.id);
        assert_eq!(parsed.content, draft.content);
        assert_eq!(parsed.saved_at, draft.saved_at);
    }

    #[test]
    fn test_draft_clone() {
        let original = make_draft("clone-id", "Clone me", "2026-01-01T00:00:00+00:00");
        let cloned = original.clone();

        assert_eq!(original.id, cloned.id);
        assert_eq!(original.content, cloned.content);
        assert_eq!(original.saved_at, cloned.saved_at);
    }

    #[test]
    fn test_sort_newest_first() {
        let mut drafts = [
            make_draft("a", "First", "2026-01-01T00:00:00+00:00"),
            make_draft("b", "Third", "2026-01-03T00:00:00+00:00"),
            make_draft("c", "Second", "2026-01-02T00:00:00+00:00"),
        ];

        drafts.sort_by(|a, b| b.saved_at.cmp(&a.saved_at));

        assert_eq!(drafts[0].id, "b");
        assert_eq!(drafts[1].id, "c");
        assert_eq!(drafts[2].id, "a");
    }

    #[test]
    fn test_retain_excludes_deleted() {
        let mut drafts = vec![
            make_draft("keep-1", "Keep me", "2026-01-01T00:00:00+00:00"),
            make_draft("delete-me", "Delete me", "2026-01-02T00:00:00+00:00"),
            make_draft("keep-2", "Keep me too", "2026-01-03T00:00:00+00:00"),
        ];

        let target_id = "delete-me".to_string();
        drafts.retain(|d| d.id != target_id);

        assert_eq!(drafts.len(), 2);
        assert!(drafts.iter().all(|d| d.id != "delete-me"));
    }

    #[test]
    fn test_find_by_id() {
        let drafts = [
            make_draft("draft-1", "First draft", "2026-01-01T00:00:00+00:00"),
            make_draft("draft-2", "Second draft", "2026-01-02T00:00:00+00:00"),
            make_draft("draft-3", "Third draft", "2026-01-03T00:00:00+00:00"),
        ];

        let found = drafts.iter().find(|d| d.id == "draft-2");
        assert!(found.is_some());
        assert_eq!(found.unwrap().content, "Second draft");

        let not_found = drafts.iter().find(|d| d.id == "draft-999");
        assert!(not_found.is_none());
    }

    #[test]
    fn test_multiline_content() {
        let content = "Line 1\nLine 2\nLine 3";
        let draft = make_draft("multi", content, "2026-01-01T00:00:00+00:00");

        assert!(draft.content.contains('\n'));
        assert_eq!(draft.content.split('\n').count(), 3);
    }

    #[test]
    fn test_empty_after_delete_all() {
        let mut drafts = vec![
            make_draft("a", "A", "2026-01-01T00:00:00+00:00"),
            make_draft("b", "B", "2026-01-02T00:00:00+00:00"),
        ];

        drafts.clear();

        assert!(drafts.is_empty());
    }

    #[test]
    fn test_uuid_format() {
        // UUIDs should be non-empty and contain hyphens
        let id = Uuid::new_v4().to_string();
        assert!(!id.is_empty());
        assert!(id.contains('-'));
        assert_eq!(id.len(), 36);
    }

    #[test]
    fn test_timestamp_is_rfc3339() {
        let ts = Utc::now().to_rfc3339();
        // RFC 3339 timestamps contain T and + or Z
        assert!(ts.contains('T'));
        assert!(ts.ends_with("+00:00") || ts.ends_with('Z'));
    }
}
@@ -0,0 +1,931 @@
use serde::{Deserialize, Serialize};
use std::process::Command;

#[cfg(target_os = "windows")]
use crate::process_ext::HideWindow;

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GitStatus {
    pub is_repo: bool,
    pub branch: Option<String>,
    pub upstream: Option<String>,
    pub ahead: u32,
    pub behind: u32,
    pub staged: Vec<GitFileChange>,
    pub unstaged: Vec<GitFileChange>,
    pub untracked: Vec<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GitFileChange {
    pub path: String,
    pub status: String,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GitBranch {
    pub name: String,
    pub is_current: bool,
    pub is_remote: bool,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct GitLogEntry {
    pub hash: String,
    pub short_hash: String,
    pub author: String,
    pub date: String,
    pub message: String,
}

/// Builds the WSL argument list for running a git command at a Linux path.
/// Extracted for testability without requiring WSL to be available.
#[cfg(any(target_os = "windows", test))]
fn build_wsl_git_args<'a>(working_dir: &'a str, args: &[&'a str]) -> Vec<&'a str> {
    let mut wsl_args = vec!["--", "git", "-C", working_dir];
    wsl_args.extend_from_slice(args);
    wsl_args
}

fn run_git_command(working_dir: &str, args: &[&str]) -> Result<String, String> {
    #[cfg(target_os = "windows")]
    let output = {
        if working_dir.starts_with('/') {
            // WSL/Linux path — run git through WSL so it can resolve the path correctly.
            let wsl_args = build_wsl_git_args(working_dir, args);
            Command::new("wsl")
                .hide_window()
                .args(&wsl_args)
                .output()
                .map_err(|e| format!("Failed to execute git via WSL: {}", e))?
        } else {
            Command::new("git")
                .hide_window()
                .args(args)
                .current_dir(working_dir)
                .output()
                .map_err(|e| format!("Failed to execute git: {}", e))?
        }
    };

    #[cfg(not(target_os = "windows"))]
    let output = Command::new("git")
        .args(args)
        .current_dir(working_dir)
        .output()
        .map_err(|e| format!("Failed to execute git: {}", e))?;

    if output.status.success() {
        Ok(String::from_utf8_lossy(&output.stdout).to_string())
    } else {
        Err(String::from_utf8_lossy(&output.stderr).to_string())
    }
}

#[tauri::command]
pub fn git_status(working_dir: String) -> Result<GitStatus, String> {
    // Check if it's a git repo
    let is_repo = run_git_command(&working_dir, &["rev-parse", "--git-dir"]).is_ok();

    if !is_repo {
        return Ok(GitStatus {
            is_repo: false,
            branch: None,
            upstream: None,
            ahead: 0,
            behind: 0,
            staged: vec![],
            unstaged: vec![],
            untracked: vec![],
        });
    }

    // Get current branch
    let branch = run_git_command(&working_dir, &["rev-parse", "--abbrev-ref", "HEAD"])
        .ok()
        .map(|s| s.trim().to_string());

    // Get upstream branch
    let upstream = run_git_command(
        &working_dir,
        &["rev-parse", "--abbrev-ref", "--symbolic-full-name", "@{u}"],
    )
    .ok()
    .map(|s| s.trim().to_string());

    // Get ahead/behind counts
    let (ahead, behind) = if upstream.is_some() {
        let rev_list =
            run_git_command(&working_dir, &["rev-list", "--left-right", "--count", "@{u}...HEAD"])
                .unwrap_or_default();
        // Output is "<behind>\t<ahead>": the left count belongs to @{u}, the right to HEAD
        let parts: Vec<&str> = rev_list.trim().split('\t').collect();
        if parts.len() == 2 {
            (
                parts[1].parse().unwrap_or(0),
                parts[0].parse().unwrap_or(0),
            )
        } else {
            (0, 0)
        }
    } else {
        (0, 0)
    };

    // Get status with porcelain format
    let status_output =
        run_git_command(&working_dir, &["status", "--porcelain=v1"]).unwrap_or_default();

    let mut staged = vec![];
    let mut unstaged = vec![];
    let mut untracked = vec![];

    for line in status_output.lines() {
        if line.len() < 3 {
            continue;
        }

        let index_status = line.chars().next().unwrap_or(' ');
        let worktree_status = line.chars().nth(1).unwrap_or(' ');
        let path = line[3..].to_string();

        // Untracked files
        if index_status == '?' && worktree_status == '?' {
            untracked.push(path);
            continue;
        }

        // Staged changes (index status)
        if index_status != ' ' && index_status != '?' {
            staged.push(GitFileChange {
                path: path.clone(),
                status: match index_status {
                    'M' => "modified".to_string(),
                    'A' => "added".to_string(),
                    'D' => "deleted".to_string(),
                    'R' => "renamed".to_string(),
                    'C' => "copied".to_string(),
                    _ => "unknown".to_string(),
                },
            });
        }

        // Unstaged changes (worktree status)
        if worktree_status != ' ' && worktree_status != '?' {
            unstaged.push(GitFileChange {
                path,
                status: match worktree_status {
                    'M' => "modified".to_string(),
                    'D' => "deleted".to_string(),
                    _ => "unknown".to_string(),
                },
            });
        }
    }

    Ok(GitStatus {
        is_repo: true,
        branch,
        upstream,
        ahead,
        behind,
        staged,
        unstaged,
        untracked,
    })
}

#[tauri::command]
pub fn git_diff(working_dir: String, file_path: Option<String>, staged: bool) -> Result<String, String> {
    let mut args = vec!["diff"];

    if staged {
        args.push("--cached");
    }

    if let Some(ref path) = file_path {
        args.push("--");
        args.push(path);
    }

    run_git_command(&working_dir, &args)
}

#[tauri::command]
pub fn git_branches(working_dir: String) -> Result<Vec<GitBranch>, String> {
    let output = run_git_command(&working_dir, &["branch", "-a", "--format=%(refname:short)\t%(HEAD)"])?;

    let branches: Vec<GitBranch> = output
        .lines()
        .filter_map(|line| {
            let parts: Vec<&str> = line.split('\t').collect();
            if parts.is_empty() {
                return None;
            }

            let name = parts[0].to_string();
            let is_current = parts.get(1).map(|s| *s == "*").unwrap_or(false);
            let is_remote = name.starts_with("remotes/") || name.starts_with("origin/");

            Some(GitBranch {
                name,
                is_current,
                is_remote,
            })
        })
        .collect();

    Ok(branches)
}

#[tauri::command]
pub fn git_checkout(working_dir: String, branch: String) -> Result<String, String> {
    run_git_command(&working_dir, &["checkout", &branch])
}

#[tauri::command]
pub fn git_stage(working_dir: String, file_path: String) -> Result<String, String> {
    run_git_command(&working_dir, &["add", &file_path])
}

#[tauri::command]
pub fn git_unstage(working_dir: String, file_path: String) -> Result<String, String> {
    run_git_command(&working_dir, &["restore", "--staged", &file_path])
}

#[tauri::command]
pub fn git_stage_all(working_dir: String) -> Result<String, String> {
    run_git_command(&working_dir, &["add", "-A"])
}

#[tauri::command]
pub fn git_commit(working_dir: String, message: String) -> Result<String, String> {
    run_git_command(&working_dir, &["commit", "-m", &message])
}

#[tauri::command]
pub fn git_push(working_dir: String) -> Result<String, String> {
    run_git_command(&working_dir, &["push"])
}

#[tauri::command]
pub fn git_pull(working_dir: String) -> Result<String, String> {
    run_git_command(&working_dir, &["pull"])
}

#[tauri::command]
pub fn git_fetch(working_dir: String) -> Result<String, String> {
    run_git_command(&working_dir, &["fetch", "--all"])
}

#[tauri::command]
pub fn git_log(working_dir: String, limit: Option<u32>) -> Result<Vec<GitLogEntry>, String> {
    let limit_str = limit.unwrap_or(10).to_string();
    let output = run_git_command(
        &working_dir,
        &[
            "log",
            &format!("-{}", limit_str),
            "--pretty=format:%H\t%h\t%an\t%ar\t%s",
        ],
    )?;

    let entries: Vec<GitLogEntry> = output
        .lines()
        .filter_map(|line| {
            let parts: Vec<&str> = line.split('\t').collect();
            if parts.len() < 5 {
                return None;
            }

            Some(GitLogEntry {
                hash: parts[0].to_string(),
                short_hash: parts[1].to_string(),
                author: parts[2].to_string(),
                date: parts[3].to_string(),
                // Rejoin in case the subject itself contained tabs
                message: parts[4..].join("\t"),
            })
        })
        .collect();

    Ok(entries)
}

#[tauri::command]
pub fn git_discard(working_dir: String, file_path: String) -> Result<String, String> {
    run_git_command(&working_dir, &["checkout", "--", &file_path])
}

#[tauri::command]
pub fn git_create_branch(working_dir: String, branch_name: String) -> Result<String, String> {
    run_git_command(&working_dir, &["checkout", "-b", &branch_name])
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::fs::{self, File};
    use std::io::Write;
    use tempfile::TempDir;

    // ==================== build_wsl_git_args tests ====================

    #[test]
    fn test_build_wsl_git_args_structure() {
        let args = build_wsl_git_args("/home/naomi/code/project", &["status", "--porcelain=v1"]);
        assert_eq!(args[0], "--");
        assert_eq!(args[1], "git");
        assert_eq!(args[2], "-C");
        assert_eq!(args[3], "/home/naomi/code/project");
        assert_eq!(args[4], "status");
        assert_eq!(args[5], "--porcelain=v1");
        assert_eq!(args.len(), 6);
    }

    #[test]
    fn test_build_wsl_git_args_no_extra_args() {
        let args = build_wsl_git_args("/home/user/repo", &["init"]);
        assert_eq!(args, vec!["--", "git", "-C", "/home/user/repo", "init"]);
    }

    // Helper to create a git repository in a temp directory
    fn create_test_repo() -> TempDir {
        let temp_dir = TempDir::new().unwrap();
        let working_dir = temp_dir.path().to_string_lossy().to_string();

        // Initialize git repo
        run_git_command(&working_dir, &["init"]).unwrap();

        // Configure git user for commits
        run_git_command(&working_dir, &["config", "user.email", "test@example.com"]).unwrap();
        run_git_command(&working_dir, &["config", "user.name", "Test User"]).unwrap();

        // Disable GPG signing for tests (user may have it enabled globally)
        run_git_command(&working_dir, &["config", "commit.gpgsign", "false"]).unwrap();

        temp_dir
    }

    // Helper to create a file in the test repo
    fn create_file(dir: &TempDir, name: &str, content: &str) {
        let file_path = dir.path().join(name);
        let mut file = File::create(file_path).unwrap();
        file.write_all(content.as_bytes()).unwrap();
    }

    // ==================== GitStatus struct tests ====================

    #[test]
    fn test_git_status_serialization() {
        let status = GitStatus {
            is_repo: true,
            branch: Some("main".to_string()),
            upstream: Some("origin/main".to_string()),
            ahead: 2,
            behind: 1,
            staged: vec![GitFileChange {
                path: "file.txt".to_string(),
                status: "modified".to_string(),
            }],
            unstaged: vec![],
            untracked: vec!["new_file.txt".to_string()],
        };

        let json = serde_json::to_string(&status).unwrap();
        assert!(json.contains("\"is_repo\":true"));
        assert!(json.contains("\"branch\":\"main\""));
        assert!(json.contains("\"ahead\":2"));
        assert!(json.contains("\"behind\":1"));
    }

    #[test]
    fn test_git_status_not_a_repo() {
        let status = GitStatus {
            is_repo: false,
            branch: None,
            upstream: None,
            ahead: 0,
            behind: 0,
            staged: vec![],
            unstaged: vec![],
            untracked: vec![],
        };

        let json = serde_json::to_string(&status).unwrap();
        let deserialized: GitStatus = serde_json::from_str(&json).unwrap();
        assert!(!deserialized.is_repo);
        assert!(deserialized.branch.is_none());
    }

    // ==================== GitFileChange struct tests ====================

    #[test]
    fn test_git_file_change_serialization() {
        let change = GitFileChange {
            path: "src/main.rs".to_string(),
            status: "added".to_string(),
        };

        let json = serde_json::to_string(&change).unwrap();
        assert!(json.contains("src/main.rs"));
        assert!(json.contains("added"));

        let deserialized: GitFileChange = serde_json::from_str(&json).unwrap();
        assert_eq!(deserialized.path, "src/main.rs");
        assert_eq!(deserialized.status, "added");
    }

    // ==================== GitBranch struct tests ====================

    #[test]
    fn test_git_branch_serialization() {
        let branch = GitBranch {
            name: "feature/new-feature".to_string(),
            is_current: true,
            is_remote: false,
        };

        let json = serde_json::to_string(&branch).unwrap();
        assert!(json.contains("feature/new-feature"));
        assert!(json.contains("\"is_current\":true"));
        assert!(json.contains("\"is_remote\":false"));
    }

    #[test]
    fn test_git_branch_remote() {
        let branch = GitBranch {
            name: "origin/main".to_string(),
            is_current: false,
            is_remote: true,
        };

        let json = serde_json::to_string(&branch).unwrap();
        let deserialized: GitBranch = serde_json::from_str(&json).unwrap();
        assert!(deserialized.is_remote);
        assert!(!deserialized.is_current);
    }

    // ==================== GitLogEntry struct tests ====================

    #[test]
    fn test_git_log_entry_serialization() {
        let entry = GitLogEntry {
            hash: "abc123def456".to_string(),
            short_hash: "abc123d".to_string(),
            author: "Hikari".to_string(),
            date: "2 hours ago".to_string(),
            message: "feat: add new feature".to_string(),
        };

        let json = serde_json::to_string(&entry).unwrap();
        assert!(json.contains("abc123def456"));
        assert!(json.contains("Hikari"));
        assert!(json.contains("feat: add new feature"));
    }

    // ==================== git_status integration tests ====================

    #[test]
    fn test_git_status_not_a_git_repo() {
        let temp_dir = TempDir::new().unwrap();
        let working_dir = temp_dir.path().to_string_lossy().to_string();

        let result = git_status(working_dir);
        assert!(result.is_ok());

        let status = result.unwrap();
        assert!(!status.is_repo);
        assert!(status.branch.is_none());
        assert!(status.staged.is_empty());
    }

    #[test]
    fn test_git_status_empty_repo() {
        let temp_dir = create_test_repo();
        let working_dir = temp_dir.path().to_string_lossy().to_string();

        let result = git_status(working_dir);
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
let status = result.unwrap();
|
||||||
|
assert!(status.is_repo);
|
||||||
|
assert!(status.staged.is_empty());
|
||||||
|
assert!(status.unstaged.is_empty());
|
||||||
|
assert!(status.untracked.is_empty());
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_status_with_untracked_file() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Create an untracked file
|
||||||
|
create_file(&temp_dir, "untracked.txt", "hello");
|
||||||
|
|
||||||
|
let result = git_status(working_dir);
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
let status = result.unwrap();
|
||||||
|
assert!(status.is_repo);
|
||||||
|
assert!(status.untracked.contains(&"untracked.txt".to_string()));
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_status_with_staged_file() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Create and stage a file
|
||||||
|
create_file(&temp_dir, "staged.txt", "hello");
|
||||||
|
run_git_command(&working_dir, &["add", "staged.txt"]).unwrap();
|
||||||
|
|
||||||
|
let result = git_status(working_dir);
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
let status = result.unwrap();
|
||||||
|
assert!(status.is_repo);
|
||||||
|
assert!(!status.staged.is_empty());
|
||||||
|
assert_eq!(status.staged[0].path, "staged.txt");
|
||||||
|
assert_eq!(status.staged[0].status, "added");
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_status_with_modified_file() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Create, stage, and commit a file
|
||||||
|
create_file(&temp_dir, "file.txt", "initial content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", "initial commit"]).unwrap();
|
||||||
|
|
||||||
|
// Modify the file
|
||||||
|
create_file(&temp_dir, "file.txt", "modified content");
|
||||||
|
|
||||||
|
let result = git_status(working_dir);
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
let status = result.unwrap();
|
||||||
|
assert!(status.is_repo);
|
||||||
|
assert!(!status.unstaged.is_empty());
|
||||||
|
assert_eq!(status.unstaged[0].path, "file.txt");
|
||||||
|
assert_eq!(status.unstaged[0].status, "modified");
|
||||||
|
}
|
||||||
|
|
||||||
|
// ==================== git_diff integration tests ====================
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_diff_no_changes() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
let result = git_diff(working_dir, None, false);
|
||||||
|
assert!(result.is_ok());
|
||||||
|
assert!(result.unwrap().is_empty());
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_diff_with_changes() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Create and commit a file
|
||||||
|
create_file(&temp_dir, "file.txt", "initial content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", "initial"]).unwrap();
|
||||||
|
|
||||||
|
// Modify the file
|
||||||
|
create_file(&temp_dir, "file.txt", "modified content");
|
||||||
|
|
||||||
|
let result = git_diff(working_dir, None, false);
|
||||||
|
assert!(result.is_ok());
|
||||||
|
let diff = result.unwrap();
|
||||||
|
assert!(diff.contains("diff"));
|
||||||
|
assert!(diff.contains("file.txt"));
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_diff_staged() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Create and commit a file
|
||||||
|
create_file(&temp_dir, "file.txt", "initial content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", "initial"]).unwrap();
|
||||||
|
|
||||||
|
// Modify and stage the file
|
||||||
|
create_file(&temp_dir, "file.txt", "modified content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
|
||||||
|
let result = git_diff(working_dir, None, true);
|
||||||
|
assert!(result.is_ok());
|
||||||
|
let diff = result.unwrap();
|
||||||
|
assert!(diff.contains("diff"));
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_diff_specific_file() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Create and commit files
|
||||||
|
create_file(&temp_dir, "file1.txt", "content1");
|
||||||
|
create_file(&temp_dir, "file2.txt", "content2");
|
||||||
|
run_git_command(&working_dir, &["add", "-A"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", "initial"]).unwrap();
|
||||||
|
|
||||||
|
// Modify both files
|
||||||
|
create_file(&temp_dir, "file1.txt", "modified1");
|
||||||
|
create_file(&temp_dir, "file2.txt", "modified2");
|
||||||
|
|
||||||
|
// Get diff for only file1.txt
|
||||||
|
let result = git_diff(working_dir, Some("file1.txt".to_string()), false);
|
||||||
|
assert!(result.is_ok());
|
||||||
|
let diff = result.unwrap();
|
||||||
|
assert!(diff.contains("file1.txt"));
|
||||||
|
assert!(!diff.contains("file2.txt"));
|
||||||
|
}
|
||||||
|
|
||||||
|
// ==================== git_branches integration tests ====================
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_branches_single_branch() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Need at least one commit for branches to show
|
||||||
|
create_file(&temp_dir, "file.txt", "content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", "initial"]).unwrap();
|
||||||
|
|
||||||
|
let result = git_branches(working_dir);
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
let branches = result.unwrap();
|
||||||
|
assert!(!branches.is_empty());
|
||||||
|
// Should have at least one branch (main or master)
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_branches_multiple_branches() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Initial commit
|
||||||
|
create_file(&temp_dir, "file.txt", "content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", "initial"]).unwrap();
|
||||||
|
|
||||||
|
// Create additional branch
|
||||||
|
run_git_command(&working_dir, &["branch", "feature-branch"]).unwrap();
|
||||||
|
|
||||||
|
let result = git_branches(working_dir);
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
let branches = result.unwrap();
|
||||||
|
assert!(branches.len() >= 2);
|
||||||
|
assert!(branches.iter().any(|b| b.name == "feature-branch"));
|
||||||
|
}
|
||||||
|
|
||||||
|
// ==================== git_stage and git_unstage tests ====================
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_stage_file() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
create_file(&temp_dir, "file.txt", "content");
|
||||||
|
|
||||||
|
let result = git_stage(working_dir.clone(), "file.txt".to_string());
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
// Verify file is staged
|
||||||
|
let status = git_status(working_dir).unwrap();
|
||||||
|
assert!(status.staged.iter().any(|f| f.path == "file.txt"));
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_unstage_file() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// First, commit a file so we have a HEAD to restore from
|
||||||
|
create_file(&temp_dir, "file.txt", "initial content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", "initial"]).unwrap();
|
||||||
|
|
||||||
|
// Modify and stage the file
|
||||||
|
create_file(&temp_dir, "file.txt", "modified content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
|
||||||
|
let result = git_unstage(working_dir.clone(), "file.txt".to_string());
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
// Verify file is unstaged (should now be in unstaged/modified, not staged)
|
||||||
|
let status = git_status(working_dir).unwrap();
|
||||||
|
assert!(!status.staged.iter().any(|f| f.path == "file.txt"));
|
||||||
|
assert!(status.unstaged.iter().any(|f| f.path == "file.txt"));
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_stage_all() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
create_file(&temp_dir, "file1.txt", "content1");
|
||||||
|
create_file(&temp_dir, "file2.txt", "content2");
|
||||||
|
|
||||||
|
let result = git_stage_all(working_dir.clone());
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
// Verify all files are staged
|
||||||
|
let status = git_status(working_dir).unwrap();
|
||||||
|
assert_eq!(status.staged.len(), 2);
|
||||||
|
}
|
||||||
|
|
||||||
|
// ==================== git_commit tests ====================
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_commit() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
create_file(&temp_dir, "file.txt", "content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
|
||||||
|
let result = git_commit(working_dir.clone(), "test commit message".to_string());
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
// Verify commit was made
|
||||||
|
let log = git_log(working_dir, Some(1)).unwrap();
|
||||||
|
assert!(!log.is_empty());
|
||||||
|
assert!(log[0].message.contains("test commit message"));
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_commit_nothing_to_commit() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Need initial commit first
|
||||||
|
create_file(&temp_dir, "file.txt", "content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", "initial"]).unwrap();
|
||||||
|
|
||||||
|
// Try to commit with nothing staged
|
||||||
|
let result = git_commit(working_dir, "empty commit".to_string());
|
||||||
|
assert!(result.is_err()); // Should fail because nothing to commit
|
||||||
|
}
|
||||||
|
|
||||||
|
// ==================== git_log tests ====================
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_log_empty_repo() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
let result = git_log(working_dir, Some(10));
|
||||||
|
// May fail on empty repo or return empty
|
||||||
|
if let Ok(commits) = result {
|
||||||
|
assert!(commits.is_empty());
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_log_with_commits() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Make multiple commits
|
||||||
|
for i in 1..=3 {
|
||||||
|
create_file(&temp_dir, &format!("file{}.txt", i), "content");
|
||||||
|
run_git_command(&working_dir, &["add", "-A"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", &format!("commit {}", i)]).unwrap();
|
||||||
|
}
|
||||||
|
|
||||||
|
let result = git_log(working_dir, Some(10));
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
let log = result.unwrap();
|
||||||
|
assert_eq!(log.len(), 3);
|
||||||
|
assert!(log[0].message.contains("commit 3")); // Most recent first
|
||||||
|
assert!(log[2].message.contains("commit 1"));
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_log_limit() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Make 5 commits
|
||||||
|
for i in 1..=5 {
|
||||||
|
create_file(&temp_dir, &format!("file{}.txt", i), "content");
|
||||||
|
run_git_command(&working_dir, &["add", "-A"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", &format!("commit {}", i)]).unwrap();
|
||||||
|
}
|
||||||
|
|
||||||
|
// Only get last 2
|
||||||
|
let result = git_log(working_dir, Some(2));
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
let log = result.unwrap();
|
||||||
|
assert_eq!(log.len(), 2);
|
||||||
|
}
|
||||||
|
|
||||||
|
// ==================== git_discard tests ====================
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_discard_changes() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Create and commit a file
|
||||||
|
create_file(&temp_dir, "file.txt", "original content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", "initial"]).unwrap();
|
||||||
|
|
||||||
|
// Modify the file
|
||||||
|
create_file(&temp_dir, "file.txt", "modified content");
|
||||||
|
|
||||||
|
// Discard changes
|
||||||
|
let result = git_discard(working_dir.clone(), "file.txt".to_string());
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
// Verify file contents are restored
|
||||||
|
let content = fs::read_to_string(temp_dir.path().join("file.txt")).unwrap();
|
||||||
|
assert_eq!(content, "original content");
|
||||||
|
}
|
||||||
|
|
||||||
|
// ==================== git_create_branch tests ====================
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_create_branch() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Initial commit required
|
||||||
|
create_file(&temp_dir, "file.txt", "content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", "initial"]).unwrap();
|
||||||
|
|
||||||
|
let result = git_create_branch(working_dir.clone(), "new-branch".to_string());
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
// Verify branch exists and is current
|
||||||
|
let branches = git_branches(working_dir).unwrap();
|
||||||
|
assert!(branches.iter().any(|b| b.name == "new-branch" && b.is_current));
|
||||||
|
}
|
||||||
|
|
||||||
|
// ==================== git_checkout tests ====================
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_git_checkout() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// Initial commit required
|
||||||
|
create_file(&temp_dir, "file.txt", "content");
|
||||||
|
run_git_command(&working_dir, &["add", "file.txt"]).unwrap();
|
||||||
|
run_git_command(&working_dir, &["commit", "-m", "initial"]).unwrap();
|
||||||
|
|
||||||
|
// Create a branch
|
||||||
|
run_git_command(&working_dir, &["branch", "other-branch"]).unwrap();
|
||||||
|
|
||||||
|
// Checkout the branch
|
||||||
|
let result = git_checkout(working_dir.clone(), "other-branch".to_string());
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
// Verify current branch
|
||||||
|
let branches = git_branches(working_dir).unwrap();
|
||||||
|
let current = branches.iter().find(|b| b.is_current);
|
||||||
|
assert!(current.is_some());
|
||||||
|
assert_eq!(current.unwrap().name, "other-branch");
|
||||||
|
}
|
||||||
|
|
||||||
|
// ==================== run_git_command tests ====================
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_run_git_command_success() {
|
||||||
|
let temp_dir = create_test_repo();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
let result = run_git_command(&working_dir, &["status"]);
|
||||||
|
assert!(result.is_ok());
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_run_git_command_failure() {
|
||||||
|
let temp_dir = TempDir::new().unwrap();
|
||||||
|
let working_dir = temp_dir.path().to_string_lossy().to_string();
|
||||||
|
|
||||||
|
// This should fail because it's not a git repo
|
||||||
|
let result = run_git_command(&working_dir, &["log"]);
|
||||||
|
assert!(result.is_err());
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_run_git_command_invalid_dir() {
|
||||||
|
let result = run_git_command("/nonexistent/path", &["status"]);
|
||||||
|
assert!(result.is_err());
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -1,26 +1,55 @@
 mod achievements;
 mod bridge_manager;
+mod clipboard;
 mod commands;
 mod config;
+mod cost_tracking;
+mod debug_logger;
+mod discord_rpc;
+mod drafts;
+mod git;
 mod notifications;
+mod process_ext;
+mod quick_actions;
+mod sessions;
+mod snippets;
 mod stats;
+mod temp_manager;
+mod tool_cache;
+mod tray;
 mod types;
-mod wsl_bridge;
-mod wsl_notifications;
 mod vbs_notification;
 mod windows_toast;
+mod wsl_bridge;
+mod wsl_notifications;

-use commands::*;
-use notifications::*;
 use bridge_manager::create_shared_bridge_manager;
+use clipboard::*;
 use commands::load_saved_achievements;
-use wsl_notifications::*;
+use commands::*;
+use debug_logger::TauriLogLayer;
+use discord_rpc::DiscordRpcManager;
+use drafts::*;
+use git::*;
+use notifications::*;
+use quick_actions::*;
+use sessions::*;
+use snippets::*;
+use std::sync::Arc;
+use tauri::{Emitter, Manager};
+use temp_manager::create_shared_temp_manager;
+use tracing_subscriber::layer::SubscriberExt;
+use tracing_subscriber::util::SubscriberInitExt;
+use tray::setup_tray;
 use vbs_notification::*;
 use windows_toast::*;
+use wsl_notifications::*;

 #[cfg_attr(mobile, tauri::mobile_entry_point)]
 pub fn run() {
     let bridge_manager = create_shared_bridge_manager();
+    let temp_manager = create_shared_temp_manager().expect("Failed to create temp file manager");
+    let discord_rpc = Arc::new(DiscordRpcManager::new());

     tauri::Builder::default()
         .plugin(tauri_plugin_dialog::init())
@@ -29,10 +58,55 @@ pub fn run() {
         .plugin(tauri_plugin_store::Builder::new().build())
         .plugin(tauri_plugin_notification::init())
         .plugin(tauri_plugin_os::init())
+        .plugin(tauri_plugin_http::init())
+        .plugin(tauri_plugin_clipboard_manager::init())
+        .plugin(tauri_plugin_fs::init())
         .manage(bridge_manager.clone())
+        .manage(temp_manager.clone())
+        .manage(discord_rpc.clone())
         .setup(move |app| {
+            // Initialize tracing with custom layer that emits to frontend
+            // NOTE: We don't use fmt::layer() because in production builds with windows_subsystem = "windows",
+            // stdout is hidden. Instead, all logs go through TauriLogLayer to the debug console.
+            let tauri_layer = TauriLogLayer::new(app.handle().clone());
+            tracing_subscriber::registry()
+                .with(tauri_layer)
+                .init();
+
             // Initialize the app handle in the bridge manager
             bridge_manager.lock().set_app_handle(app.handle().clone());
+
+            // Clean up any orphaned temp files from previous sessions
+            if let Ok(count) = temp_manager.lock().cleanup_orphaned_files() {
+                if count > 0 {
+                    tracing::info!("Cleaned up {} orphaned temp files", count);
+                }
+            }
+
+            tracing::info!("Hikari Desktop started successfully");
+
+            // Set up system tray
+            if let Err(e) = setup_tray(app.handle()) {
+                tracing::error!("Failed to set up system tray: {}", e);
+            }
+
+            // Handle window close event for minimize to tray and close confirmation
+            let main_window = app.get_webview_window("main").unwrap();
+            main_window.on_window_event({
+                let app_handle = app.handle().clone();
+                move |event| {
+                    if let tauri::WindowEvent::CloseRequested { api, .. } = event {
+                        // Always prevent default close - let frontend handle it
+                        api.prevent_close();
+
+                        // Emit event to frontend to show confirmation modal
+                        if let Some(window) = app_handle.get_webview_window("main") {
+                            let _ = window.emit("window-close-requested", ());
+                        }
+                    }
+                }
+            });
+
             Ok(())
         })
         .invoke_handler(tauri::generate_handler![
@@ -46,13 +120,112 @@ pub fn run() {
             get_config,
             save_config,
             get_usage_stats,
+            get_persisted_stats,
             load_saved_achievements,
+            answer_question,
+            check_workspace_hooks,
             send_windows_notification,
             send_simple_notification,
             send_windows_toast,
             send_notify_send,
             send_wsl_notification,
             send_vbs_notification,
+            validate_directory,
+            list_skills,
+            check_for_updates,
+            fetch_changelog,
+            check_cli_latest_version,
+            save_temp_file,
+            register_temp_file,
+            get_temp_files,
+            cleanup_temp_files,
+            cleanup_all_temp_files,
+            cleanup_orphaned_temp_files,
+            get_file_size,
+            list_sessions,
+            save_session,
+            load_session,
+            delete_session,
+            search_sessions,
+            clear_all_sessions,
+            list_snippets,
+            save_snippet,
+            delete_snippet,
+            get_snippet_categories,
+            reset_default_snippets,
+            list_quick_actions,
+            save_quick_action,
+            delete_quick_action,
+            reset_default_quick_actions,
+            git_status,
+            git_diff,
+            git_branches,
+            git_checkout,
+            git_stage,
+            git_unstage,
+            git_stage_all,
+            git_commit,
+            git_push,
+            git_pull,
+            git_fetch,
+            git_log,
+            git_discard,
+            git_create_branch,
+            list_clipboard_entries,
+            capture_clipboard,
+            delete_clipboard_entry,
+            toggle_pin_clipboard_entry,
+            clear_clipboard_history,
+            search_clipboard_entries,
+            get_clipboard_languages,
+            update_clipboard_language,
+            list_directory,
+            read_file_content,
+            write_file_content,
+            create_file,
+            create_directory,
+            delete_file,
+            delete_directory,
+            rename_path,
+            // Cost tracking commands
+            get_cost_summary,
+            get_cost_alerts,
+            set_cost_alert_thresholds,
+            export_cost_csv,
+            get_today_cost,
+            get_week_cost,
+            get_month_cost,
+            init_discord_rpc,
+            update_discord_rpc,
+            stop_discord_rpc,
+            close_application,
+            list_memory_files,
+            get_claude_version,
+            get_auth_status,
+            auth_login,
+            auth_logout,
+            list_plugins,
+            install_plugin,
+            uninstall_plugin,
+            enable_plugin,
+            disable_plugin,
+            update_plugin,
+            list_marketplaces,
+            add_marketplace,
+            remove_marketplace,
+            list_mcp_servers,
+            get_mcp_server,
+            remove_mcp_server,
+            add_mcp_server,
+            get_mcp_server_details,
+            list_drafts,
+            save_draft,
+            delete_draft,
+            delete_all_drafts,
+            scan_project,
+            open_binary_file,
+            get_global_claude_md,
+            save_global_claude_md,
         ])
         .run(tauri::generate_context!())
         .expect("error while running tauri application");
@@ -1,29 +1,11 @@
|
|||||||
use tauri::command;
|
|
||||||
use std::process::Command;
|
use std::process::Command;
|
||||||
|
use tauri::command;
|
||||||
|
|
||||||
#[command]
|
use crate::process_ext::HideWindow;
|
||||||
pub async fn send_notify_send(title: String, body: String) -> Result<(), String> {
|
|
||||||
// Use notify-send for Linux/WSL
|
|
||||||
let output = Command::new("notify-send")
|
|
||||||
.arg(&title)
|
|
||||||
.arg(&body)
|
|
||||||
.arg("--urgency=normal")
|
|
||||||
.arg("--app-name=Hikari Desktop")
|
|
||||||
.output()
|
|
||||||
.map_err(|e| format!("Failed to execute notify-send: {}. Make sure libnotify-bin is installed.", e))?;
|
|
||||||
|
|
||||||
if !output.status.success() {
|
/// Generate PowerShell script for Windows Toast Notification
|
||||||
let error = String::from_utf8_lossy(&output.stderr);
|
fn generate_powershell_toast_script(title: &str, body: &str) -> String {
|
||||||
return Err(format!("notify-send failed: {}", error));
|
format!(
|
||||||
}
|
|
||||||
|
|
||||||
Ok(())
|
|
||||||
}
|
|
||||||
|
|
||||||
#[command]
|
|
||||||
pub async fn send_windows_notification(title: String, body: String) -> Result<(), String> {
|
|
||||||
// Create PowerShell script for Windows Toast Notification
|
|
||||||
let ps_script = format!(
|
|
||||||
r#"
|
r#"
|
||||||
[Windows.UI.Notifications.ToastNotificationManager, Windows.UI.Notifications, ContentType = WindowsRuntime] > $null
|
[Windows.UI.Notifications.ToastNotificationManager, Windows.UI.Notifications, ContentType = WindowsRuntime] > $null
|
||||||
[Windows.Data.Xml.Dom.XmlDocument, Windows.Data.Xml.Dom.XmlDocument, ContentType = WindowsRuntime] > $null
|
[Windows.Data.Xml.Dom.XmlDocument, Windows.Data.Xml.Dom.XmlDocument, ContentType = WindowsRuntime] > $null
|
||||||
@@ -50,10 +32,87 @@ $toast = New-Object Windows.UI.Notifications.ToastNotification $xml
|
|||||||
"#,
|
"#,
|
||||||
title.replace("\"", "`\""),
|
title.replace("\"", "`\""),
|
||||||
body.replace("\"", "`\"")
|
body.replace("\"", "`\"")
|
||||||
);
|
)
|
||||||
|
}
|
||||||
|
|
||||||
|
/// Format simple notification message
|
||||||
|
fn format_simple_notification(title: &str, body: &str) -> String {
|
||||||
|
format!("{}\n\n{}", title, body)
|
||||||
|
}
|
||||||
/// Build notify-send command for testing (doesn't execute)
#[cfg(test)]
fn build_notify_send_command(title: &str, body: &str) -> (String, Vec<String>) {
    (
        "notify-send".to_string(),
        vec![
            title.to_string(),
            body.to_string(),
            "--urgency=normal".to_string(),
            "--app-name=Hikari Desktop".to_string(),
        ],
    )
}

/// Build Windows PowerShell command for testing (doesn't execute)
#[cfg(test)]
fn build_windows_powershell_command(title: &str, body: &str) -> (String, Vec<String>) {
    let script = generate_powershell_toast_script(title, body);
    (
        "pwsh.exe".to_string(),
        vec![
            "-NoProfile".to_string(),
            "-WindowStyle".to_string(),
            "Hidden".to_string(),
            "-Command".to_string(),
            script,
        ],
    )
}

/// Build simple notification command for testing (doesn't execute)
#[cfg(test)]
fn build_simple_notification_command(title: &str, body: &str) -> (String, Vec<String>) {
    let message = format_simple_notification(title, body);
    (
        "cmd.exe".to_string(),
        vec!["/c".to_string(), "msg".to_string(), "*".to_string(), message],
    )
}

#[command]
pub async fn send_notify_send(title: String, body: String) -> Result<(), String> {
    // Use notify-send for Linux/WSL
    let output = Command::new("notify-send")
        .hide_window()
        .arg(&title)
        .arg(&body)
        .arg("--urgency=normal")
        .arg("--app-name=Hikari Desktop")
        .output()
        .map_err(|e| {
            format!(
                "Failed to execute notify-send: {}. Make sure libnotify-bin is installed.",
                e
            )
        })?;

    if !output.status.success() {
        let error = String::from_utf8_lossy(&output.stderr);
        return Err(format!("notify-send failed: {}", error));
    }

    Ok(())
}

#[command]
pub async fn send_windows_notification(title: String, body: String) -> Result<(), String> {
    // Create PowerShell script for Windows Toast Notification
    let ps_script = generate_powershell_toast_script(&title, &body);

    // Try PowerShell Core first (pwsh), then fall back to Windows PowerShell
    let output = Command::new("pwsh.exe")
        .hide_window()
        .arg("-NoProfile")
        .arg("-WindowStyle")
        .arg("Hidden")
@@ -62,6 +121,7 @@ $toast = New-Object Windows.UI.Notifications.ToastNotification $xml
        .output()
        .or_else(|_| {
            Command::new("powershell.exe")
                .hide_window()
                .arg("-NoProfile")
                .arg("-WindowStyle")
                .arg("Hidden")
@@ -82,9 +142,10 @@ $toast = New-Object Windows.UI.Notifications.ToastNotification $xml
// Alternative: Use Windows built-in MSG command for simple notifications
#[command]
pub async fn send_simple_notification(title: String, body: String) -> Result<(), String> {
    let message = format_simple_notification(&title, &body);

    Command::new("cmd.exe")
        .hide_window()
        .arg("/c")
        .arg("msg")
        .arg("*")
@@ -93,4 +154,243 @@ pub async fn send_simple_notification(title: String, body: String) -> Result<(),
        .map_err(|e| format!("Failed to send message: {}", e))?;

    Ok(())
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_generate_powershell_toast_script_basic() {
        let script = generate_powershell_toast_script("Title", "Body");

        assert!(script.contains("Hikari Desktop"));
        assert!(script.contains("Title"));
        assert!(script.contains("Body"));
        assert!(script.contains("ToastNotification"));
    }

    #[test]
    fn test_generate_powershell_toast_script_escapes_quotes() {
        let script = generate_powershell_toast_script("Title with \"quotes\"", "Body with \"quotes\"");

        // Quotes should be escaped as `" in PowerShell
        assert!(script.contains("Title with `\"quotes`\""));
        assert!(script.contains("Body with `\"quotes`\""));
    }

    #[test]
    fn test_generate_powershell_toast_script_with_special_chars() {
        let script = generate_powershell_toast_script("Title: Test", "Body\nwith\nnewlines");

        assert!(script.contains("Title: Test"));
        assert!(script.contains("Body\nwith\nnewlines"));
    }

    #[test]
    fn test_generate_powershell_toast_script_unicode() {
        let script = generate_powershell_toast_script("日本語 Title", "Unicode: 🎉");

        assert!(script.contains("日本語 Title"));
        assert!(script.contains("Unicode: 🎉"));
    }

    #[test]
    fn test_generate_powershell_toast_script_empty() {
        let script = generate_powershell_toast_script("", "");

        // Should still contain the structure
        assert!(script.contains("Hikari Desktop"));
        assert!(script.contains("ToastNotification"));
    }

    #[test]
    fn test_format_simple_notification_basic() {
        let message = format_simple_notification("Title", "Body");

        assert_eq!(message, "Title\n\nBody");
    }

    #[test]
    fn test_format_simple_notification_with_newlines() {
        let message = format_simple_notification("Multi\nLine\nTitle", "Multi\nLine\nBody");

        assert!(message.contains("Multi\nLine\nTitle"));
        assert!(message.contains("\n\n"));
        assert!(message.contains("Multi\nLine\nBody"));
    }

    #[test]
    fn test_format_simple_notification_unicode() {
        let message = format_simple_notification("日本語", "🎉 Unicode");

        assert_eq!(message, "日本語\n\n🎉 Unicode");
    }

    #[test]
    fn test_format_simple_notification_empty() {
        let message = format_simple_notification("", "");

        assert_eq!(message, "\n\n");
    }

    #[test]
    fn test_format_simple_notification_long_text() {
        let long_title = "A".repeat(1000);
        let long_body = "B".repeat(1000);
        let message = format_simple_notification(&long_title, &long_body);

        assert!(message.starts_with(&long_title));
        assert!(message.ends_with(&long_body));
        assert!(message.contains("\n\n"));
    }

    #[test]
    fn test_generate_powershell_toast_script_multiple_quotes() {
        let script = generate_powershell_toast_script(
            "\"Quoted\" \"Multiple\" \"Times\"",
            "\"More\" \"Quotes\" \"Here\""
        );

        // Each quote should be escaped
        assert!(script.contains("`\"Quoted`\" `\"Multiple`\" `\"Times`\""));
        assert!(script.contains("`\"More`\" `\"Quotes`\" `\"Here`\""));
    }

    // E2E Integration Tests - Command Structure Verification

    #[test]
    fn test_e2e_notify_send_command_structure() {
        let (command, args) = build_notify_send_command("Test Title", "Test Body");

        assert_eq!(command, "notify-send");
        assert_eq!(args.len(), 4);
        assert_eq!(args[0], "Test Title");
        assert_eq!(args[1], "Test Body");
        assert_eq!(args[2], "--urgency=normal");
        assert_eq!(args[3], "--app-name=Hikari Desktop");
    }

    #[test]
    fn test_e2e_notify_send_with_special_chars() {
        let (command, args) =
            build_notify_send_command("Title with \"quotes\"", "Body\nwith\nnewlines");

        assert_eq!(command, "notify-send");
        assert_eq!(args[0], "Title with \"quotes\"");
        assert_eq!(args[1], "Body\nwith\nnewlines");
        // notify-send handles these directly
    }

    #[test]
    fn test_e2e_windows_powershell_command_structure() {
        let (command, args) = build_windows_powershell_command("Test Title", "Test Body");

        assert_eq!(command, "pwsh.exe");
        assert_eq!(args.len(), 5);
        assert_eq!(args[0], "-NoProfile");
        assert_eq!(args[1], "-WindowStyle");
        assert_eq!(args[2], "Hidden");
        assert_eq!(args[3], "-Command");

        // Verify the script in args[4] contains expected elements
        let script = &args[4];
        assert!(script.contains("Test Title"));
        assert!(script.contains("Test Body"));
        assert!(script.contains("Hikari Desktop"));
        assert!(script.contains("ToastNotification"));
    }

    #[test]
    fn test_e2e_windows_powershell_quote_escaping() {
        let (_, args) =
            build_windows_powershell_command("Title with \"quotes\"", "Body with \"quotes\"");

        let script = &args[4];
        // Verify quotes are properly escaped in the PowerShell script
        assert!(script.contains("Title with `\"quotes`\""));
        assert!(script.contains("Body with `\"quotes`\""));
    }

    #[test]
    fn test_e2e_simple_notification_command_structure() {
        let (command, args) = build_simple_notification_command("Test Title", "Test Body");

        assert_eq!(command, "cmd.exe");
        assert_eq!(args.len(), 4);
        assert_eq!(args[0], "/c");
        assert_eq!(args[1], "msg");
        assert_eq!(args[2], "*");
        assert_eq!(args[3], "Test Title\n\nTest Body");
    }

    #[test]
    fn test_e2e_simple_notification_multiline() {
        let (_, args) =
            build_simple_notification_command("Multi\nLine\nTitle", "Multi\nLine\nBody");

        let message = &args[3];
        assert!(message.contains("Multi\nLine\nTitle"));
        assert!(message.contains("\n\n"));
        assert!(message.contains("Multi\nLine\nBody"));
    }

    #[test]
    fn test_e2e_command_consistency_across_platforms() {
        // Test that different platforms use consistent parameters
        let title = "Consistency Test";
        let body = "Testing cross-platform consistency";

        // Linux command
        let (notify_cmd, notify_args) = build_notify_send_command(title, body);
        assert!(notify_cmd.contains("notify"));
        assert!(notify_args.iter().any(|arg| arg.contains("Hikari Desktop")));

        // Windows PowerShell command
        let (ps_cmd, ps_args) = build_windows_powershell_command(title, body);
        assert!(ps_cmd.contains("pwsh") || ps_cmd.contains("powershell"));
        let ps_script = &ps_args[4];
        assert!(ps_script.contains("Hikari Desktop"));

        // Windows simple command
        let (msg_cmd, msg_args) = build_simple_notification_command(title, body);
        assert!(msg_cmd.contains("cmd"));
        assert!(msg_args[3].contains(title));
        assert!(msg_args[3].contains(body));
    }

    #[test]
    fn test_e2e_unicode_support_across_platforms() {
        let title = "日本語 Title";
        let body = "Unicode: 🎉";

        // Verify all platforms preserve unicode
        let (_, notify_args) = build_notify_send_command(title, body);
        assert_eq!(notify_args[0], title);
        assert_eq!(notify_args[1], body);

        let (_, ps_args) = build_windows_powershell_command(title, body);
        let ps_script = &ps_args[4];
        assert!(ps_script.contains(title));
        assert!(ps_script.contains(body));

        let (_, msg_args) = build_simple_notification_command(title, body);
        assert!(msg_args[3].contains(title));
        assert!(msg_args[3].contains(body));
    }

    #[test]
    fn test_e2e_empty_input_handling() {
        // Test that empty inputs are handled gracefully
        let (_, notify_args) = build_notify_send_command("", "");
        assert_eq!(notify_args[0], "");
        assert_eq!(notify_args[1], "");

        let (_, ps_args) = build_windows_powershell_command("", "");
        let ps_script = &ps_args[4];
        assert!(ps_script.contains("Hikari Desktop")); // Still has app name

        let (_, msg_args) = build_simple_notification_command("", "");
        assert_eq!(msg_args[3], "\n\n");
    }
}
@@ -0,0 +1,21 @@
use std::process::Command;

/// Extension trait for `Command` that hides the console window on Windows.
///
/// On non-Windows platforms this is a no-op, so callers can unconditionally
/// chain `.hide_window()` without any `#[cfg]` guards at the call sites.
pub trait HideWindow {
    fn hide_window(&mut self) -> &mut Self;
}

impl HideWindow for Command {
    fn hide_window(&mut self) -> &mut Self {
        #[cfg(target_os = "windows")]
        {
            use std::os::windows::process::CommandExt;
            const CREATE_NO_WINDOW: u32 = 0x08000000;
            self.creation_flags(CREATE_NO_WINDOW);
        }
        self
    }
}
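The extension trait above chains like any other `Command` builder method. A self-contained sketch restating the trait and demonstrating the call site (only `Command::new("echo")` is an assumption here; on non-Windows the call compiles to a no-op):

```rust
use std::process::Command;

// Restatement of the HideWindow trait from the new file above, so this
// sketch compiles on its own.
pub trait HideWindow {
    fn hide_window(&mut self) -> &mut Self;
}

impl HideWindow for Command {
    fn hide_window(&mut self) -> &mut Self {
        #[cfg(target_os = "windows")]
        {
            use std::os::windows::process::CommandExt;
            // CREATE_NO_WINDOW: suppress the console window for the child.
            const CREATE_NO_WINDOW: u32 = 0x08000000;
            self.creation_flags(CREATE_NO_WINDOW);
        }
        // On other platforms nothing happens; the builder chain continues.
        self
    }
}

fn main() {
    let mut cmd = Command::new("echo");
    cmd.hide_window().arg("hello");
    assert_eq!(cmd.get_program(), "echo");
}
```

Because the method returns `&mut Self`, call sites like `Command::new("notify-send").hide_window().arg(...)` need no `#[cfg]` at all.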
@@ -0,0 +1,373 @@
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use tauri::AppHandle;
use tauri_plugin_store::StoreExt;

const QUICK_ACTIONS_STORE_KEY: &str = "quick_actions";

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct QuickAction {
    pub id: String,
    pub name: String,
    pub prompt: String,
    pub icon: String,
    pub is_default: bool,
    pub created_at: DateTime<Utc>,
    pub updated_at: DateTime<Utc>,
}

fn get_default_quick_actions() -> Vec<QuickAction> {
    let now = Utc::now();
    vec![
        QuickAction {
            id: "default-review-pr".to_string(),
            name: "Review PR".to_string(),
            prompt: "Please review this pull request and provide feedback on code quality, potential issues, and suggestions for improvement.".to_string(),
            icon: "git-pull-request".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        QuickAction {
            id: "default-run-tests".to_string(),
            name: "Run Tests".to_string(),
            prompt: "Please run the test suite for this project and report any failures or issues.".to_string(),
            icon: "play".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        QuickAction {
            id: "default-explain-file".to_string(),
            name: "Explain File".to_string(),
            prompt: "Please explain what this file does, its purpose, and how it fits into the overall project structure.".to_string(),
            icon: "file-text".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        QuickAction {
            id: "default-fix-error".to_string(),
            name: "Fix Error".to_string(),
            prompt: "I'm getting an error. Can you help me identify the cause and fix it?".to_string(),
            icon: "alert-circle".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        QuickAction {
            id: "default-write-tests".to_string(),
            name: "Write Tests".to_string(),
            prompt: "Please write comprehensive unit tests for the current code with good coverage.".to_string(),
            icon: "check-square".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        QuickAction {
            id: "default-refactor".to_string(),
            name: "Refactor".to_string(),
            prompt: "Please refactor this code to improve readability, maintainability, and performance.".to_string(),
            icon: "refresh-cw".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
    ]
}

fn load_all_quick_actions(app: &AppHandle) -> Result<Vec<QuickAction>, String> {
    let store = app
        .store("hikari-quick-actions.json")
        .map_err(|e| e.to_string())?;

    match store.get(QUICK_ACTIONS_STORE_KEY) {
        Some(value) => {
            let mut actions: Vec<QuickAction> =
                serde_json::from_value(value.clone()).map_err(|e| e.to_string())?;

            let defaults = get_default_quick_actions();
            for default in defaults {
                if !actions.iter().any(|a| a.id == default.id) {
                    actions.push(default);
                }
            }

            Ok(actions)
        }
        None => Ok(get_default_quick_actions()),
    }
}
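The merge step in `load_all_quick_actions` backfills any default whose id is missing from the stored list, so user edits are kept while newly added defaults still appear. A std-only sketch of the same rule (`Item` and `merge_defaults` are illustrative names, not part of this diff):

```rust
struct Item {
    id: String,
}

// Same merge rule as load_all_quick_actions: append any default whose id is
// not already present, leaving stored entries untouched.
fn merge_defaults(mut stored: Vec<Item>, defaults: Vec<Item>) -> Vec<Item> {
    for default in defaults {
        if !stored.iter().any(|a| a.id == default.id) {
            stored.push(default);
        }
    }
    stored
}

fn main() {
    let stored = vec![Item { id: "default-run-tests".into() }];
    let defaults = vec![
        Item { id: "default-run-tests".into() },
        Item { id: "default-review-pr".into() },
    ];
    let merged = merge_defaults(stored, defaults);
    // The duplicate default is skipped; the missing one is appended.
    assert_eq!(merged.len(), 2);
    assert_eq!(merged[1].id, "default-review-pr");
}
```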
fn save_all_quick_actions(app: &AppHandle, actions: &[QuickAction]) -> Result<(), String> {
    let store = app
        .store("hikari-quick-actions.json")
        .map_err(|e| e.to_string())?;

    let value = serde_json::to_value(actions).map_err(|e| e.to_string())?;
    store.set(QUICK_ACTIONS_STORE_KEY, value);
    store.save().map_err(|e| e.to_string())?;

    Ok(())
}

#[tauri::command]
pub async fn list_quick_actions(app: AppHandle) -> Result<Vec<QuickAction>, String> {
    let mut actions = load_all_quick_actions(&app)?;

    actions.sort_by(|a, b| {
        let default_cmp = b.is_default.cmp(&a.is_default);
        if default_cmp == std::cmp::Ordering::Equal {
            a.name.cmp(&b.name)
        } else {
            default_cmp
        }
    });

    Ok(actions)
}
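The comparator in `list_quick_actions` puts defaults first (`true` sorts above `false` because the operands are flipped) and breaks ties alphabetically. The same ordering can be written with `Ordering::then_with`; a std-only sketch where `Action` is a stand-in for `QuickAction`:

```rust
struct Action {
    name: String,
    is_default: bool,
}

// Equivalent to the if/else comparator above: defaults first
// (b.cmp(&a) on the bool flag), then alphabetical by name.
fn sort_actions(actions: &mut [Action]) {
    actions.sort_by(|a, b| {
        b.is_default
            .cmp(&a.is_default)
            .then_with(|| a.name.cmp(&b.name))
    });
}

fn main() {
    let mut actions = vec![
        Action { name: "Zebra".into(), is_default: false },
        Action { name: "Apple".into(), is_default: true },
        Action { name: "Alpha".into(), is_default: false },
    ];
    sort_actions(&mut actions);
    assert_eq!(actions[0].name, "Apple"); // the default leads
    assert_eq!(actions[1].name, "Alpha");
    assert_eq!(actions[2].name, "Zebra");
}
```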
#[tauri::command]
pub async fn save_quick_action(app: AppHandle, action: QuickAction) -> Result<(), String> {
    let mut actions = load_all_quick_actions(&app)?;

    if let Some(existing) = actions.iter_mut().find(|a| a.id == action.id) {
        let mut updated = action;
        updated.is_default = existing.is_default;
        *existing = updated;
    } else {
        actions.push(action);
    }

    save_all_quick_actions(&app, &actions)
}

#[tauri::command]
pub async fn delete_quick_action(app: AppHandle, action_id: String) -> Result<(), String> {
    let mut actions = load_all_quick_actions(&app)?;

    if actions
        .iter()
        .any(|a| a.id == action_id && a.is_default)
    {
        return Err("Cannot delete default quick actions".to_string());
    }

    actions.retain(|a| a.id != action_id);
    save_all_quick_actions(&app, &actions)
}

#[tauri::command]
pub async fn reset_default_quick_actions(app: AppHandle) -> Result<(), String> {
    let mut actions = load_all_quick_actions(&app)?;

    actions.retain(|a| !a.is_default);
    actions.extend(get_default_quick_actions());

    save_all_quick_actions(&app, &actions)
}

#[cfg(test)]
mod tests {
    use super::*;

    fn create_test_action(id: &str, name: &str, is_default: bool) -> QuickAction {
        QuickAction {
            id: id.to_string(),
            name: name.to_string(),
            prompt: "Test prompt".to_string(),
            icon: "star".to_string(),
            is_default,
            created_at: Utc::now(),
            updated_at: Utc::now(),
        }
    }

    #[test]
    fn test_default_quick_actions_exist() {
        let defaults = get_default_quick_actions();
        assert!(!defaults.is_empty());
        assert!(defaults.iter().all(|a| a.is_default));
    }

    #[test]
    fn test_default_quick_actions_have_required_fields() {
        let defaults = get_default_quick_actions();
        for action in defaults {
            assert!(!action.id.is_empty());
            assert!(!action.name.is_empty());
            assert!(!action.prompt.is_empty());
            assert!(!action.icon.is_empty());
        }
    }

    #[test]
    fn test_default_quick_actions_count() {
        let defaults = get_default_quick_actions();
        // Should have 6 default actions
        assert_eq!(defaults.len(), 6);
    }

    #[test]
    fn test_default_quick_actions_have_unique_ids() {
        let defaults = get_default_quick_actions();
        let mut ids: Vec<&String> = defaults.iter().map(|a| &a.id).collect();
        ids.sort();
        ids.dedup();
        assert_eq!(ids.len(), defaults.len());
    }

    #[test]
    fn test_default_quick_actions_ids_start_with_default() {
        let defaults = get_default_quick_actions();
        assert!(defaults.iter().all(|a| a.id.starts_with("default-")));
    }

    #[test]
    fn test_quick_action_serialization() {
        let action = create_test_action("test-1", "Test Action", false);
        let json = serde_json::to_string(&action).expect("Failed to serialize");
        let parsed: QuickAction = serde_json::from_str(&json).expect("Failed to deserialize");

        assert_eq!(parsed.id, action.id);
        assert_eq!(parsed.name, action.name);
        assert_eq!(parsed.prompt, action.prompt);
        assert_eq!(parsed.icon, action.icon);
        assert_eq!(parsed.is_default, action.is_default);
    }

    #[test]
    fn test_quick_action_clone() {
        let original = create_test_action("clone-test", "Clone Test", true);
        let cloned = original.clone();

        assert_eq!(original.id, cloned.id);
        assert_eq!(original.name, cloned.name);
        assert_eq!(original.is_default, cloned.is_default);
    }

    #[test]
    #[allow(clippy::useless_vec)]
    fn test_quick_action_sorting_defaults_first() {
        let mut actions = vec![
            create_test_action("custom-z", "Zebra", false),
            create_test_action("default-a", "Apple", true),
            create_test_action("custom-a", "Alpha", false),
            create_test_action("default-z", "Zulu", true),
        ];

        // Sort by: defaults first, then alphabetically by name
        actions.sort_by(|a, b| {
            let default_cmp = b.is_default.cmp(&a.is_default);
            if default_cmp == std::cmp::Ordering::Equal {
                a.name.cmp(&b.name)
            } else {
                default_cmp
            }
        });

        // Defaults should come first
        assert!(actions[0].is_default);
        assert!(actions[1].is_default);
        assert!(!actions[2].is_default);
        assert!(!actions[3].is_default);

        // Within defaults, alphabetically sorted
        assert_eq!(actions[0].name, "Apple");
        assert_eq!(actions[1].name, "Zulu");

        // Within non-defaults, alphabetically sorted
        assert_eq!(actions[2].name, "Alpha");
        assert_eq!(actions[3].name, "Zebra");
    }

    #[test]
    fn test_known_default_actions() {
        let defaults = get_default_quick_actions();
        let ids: Vec<&str> = defaults.iter().map(|a| a.id.as_str()).collect();

        assert!(ids.contains(&"default-review-pr"));
        assert!(ids.contains(&"default-run-tests"));
        assert!(ids.contains(&"default-explain-file"));
        assert!(ids.contains(&"default-fix-error"));
        assert!(ids.contains(&"default-write-tests"));
        assert!(ids.contains(&"default-refactor"));
    }

    #[test]
    fn test_default_action_icons() {
        let defaults = get_default_quick_actions();
        let icons: Vec<&str> = defaults.iter().map(|a| a.icon.as_str()).collect();

        assert!(icons.contains(&"git-pull-request"));
        assert!(icons.contains(&"play"));
        assert!(icons.contains(&"file-text"));
        assert!(icons.contains(&"alert-circle"));
        assert!(icons.contains(&"check-square"));
        assert!(icons.contains(&"refresh-cw"));
    }

    #[test]
    fn test_quick_action_prompts_not_empty() {
        let defaults = get_default_quick_actions();
        for action in defaults {
            assert!(
                action.prompt.len() > 10,
                "Prompt should be meaningful: {}",
                action.name
            );
        }
    }

    #[test]
    fn test_quick_action_timestamps() {
        let action = create_test_action("time-test", "Time Test", false);
        assert!(action.created_at <= action.updated_at);
    }

    #[test]
    fn test_default_actions_have_same_timestamps() {
        let defaults = get_default_quick_actions();
        // All defaults are created at the same instant
        let first_created = defaults[0].created_at;
        let first_updated = defaults[0].updated_at;

        for action in &defaults {
            assert_eq!(action.created_at, first_created);
            assert_eq!(action.updated_at, first_updated);
        }
    }

    #[test]
    fn test_action_retain_non_default() {
        let mut actions = vec![
            create_test_action("default-1", "Default 1", true),
            create_test_action("custom-1", "Custom 1", false),
            create_test_action("default-2", "Default 2", true),
            create_test_action("custom-2", "Custom 2", false),
        ];

        // Mimics reset_default_quick_actions behavior (retain non-defaults)
        actions.retain(|a| !a.is_default);

        assert_eq!(actions.len(), 2);
        assert!(actions.iter().all(|a| !a.is_default));
    }

    #[test]
    #[allow(clippy::useless_vec)]
    fn test_action_find_by_id() {
        let actions = vec![
            create_test_action("action-1", "First", false),
            create_test_action("action-2", "Second", false),
            create_test_action("action-3", "Third", false),
        ];

        let found = actions.iter().find(|a| a.id == "action-2");
        assert!(found.is_some());
        assert_eq!(found.unwrap().name, "Second");

        let not_found = actions.iter().find(|a| a.id == "action-999");
        assert!(not_found.is_none());
    }
}
@@ -0,0 +1,374 @@
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use tauri::AppHandle;
use tauri_plugin_store::StoreExt;

const SESSIONS_STORE_KEY: &str = "sessions";

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SavedSession {
    pub id: String,
    pub name: String,
    pub created_at: DateTime<Utc>,
    pub last_activity_at: DateTime<Utc>,
    pub working_directory: String,
    pub message_count: usize,
    pub preview: String, // First ~100 chars of conversation for preview
    pub messages: Vec<SavedMessage>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SavedMessage {
    pub id: String,
    #[serde(rename = "type")]
    pub message_type: String,
    pub content: String,
    pub timestamp: DateTime<Utc>,
    pub tool_name: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct SessionListItem {
    pub id: String,
    pub name: String,
    pub created_at: DateTime<Utc>,
    pub last_activity_at: DateTime<Utc>,
    pub working_directory: String,
    pub message_count: usize,
    pub preview: String,
}

impl From<&SavedSession> for SessionListItem {
    fn from(session: &SavedSession) -> Self {
        SessionListItem {
            id: session.id.clone(),
            name: session.name.clone(),
            created_at: session.created_at,
            last_activity_at: session.last_activity_at,
            working_directory: session.working_directory.clone(),
            message_count: session.message_count,
            preview: session.preview.clone(),
        }
    }
}

fn load_all_sessions(app: &AppHandle) -> Result<Vec<SavedSession>, String> {
    let store = app
        .store("hikari-sessions.json")
        .map_err(|e| e.to_string())?;

    match store.get(SESSIONS_STORE_KEY) {
        Some(value) => serde_json::from_value(value.clone()).map_err(|e| e.to_string()),
        None => Ok(Vec::new()),
    }
}

fn save_all_sessions(app: &AppHandle, sessions: &[SavedSession]) -> Result<(), String> {
    let store = app
        .store("hikari-sessions.json")
        .map_err(|e| e.to_string())?;

    let value = serde_json::to_value(sessions).map_err(|e| e.to_string())?;
    store.set(SESSIONS_STORE_KEY, value);
    store.save().map_err(|e| e.to_string())?;

    Ok(())
}

#[tauri::command]
pub async fn list_sessions(app: AppHandle) -> Result<Vec<SessionListItem>, String> {
    let sessions = load_all_sessions(&app)?;
    let mut items: Vec<SessionListItem> = sessions.iter().map(SessionListItem::from).collect();

    // Sort by last activity, most recent first
    items.sort_by(|a, b| b.last_activity_at.cmp(&a.last_activity_at));

    Ok(items)
}

#[tauri::command]
pub async fn save_session(app: AppHandle, session: SavedSession) -> Result<(), String> {
    let mut sessions = load_all_sessions(&app)?;

    // Update existing or add new
    if let Some(existing) = sessions.iter_mut().find(|s| s.id == session.id) {
        *existing = session;
    } else {
        sessions.push(session);
    }

    save_all_sessions(&app, &sessions)
}

#[tauri::command]
pub async fn load_session(app: AppHandle, session_id: String) -> Result<Option<SavedSession>, String> {
    let sessions = load_all_sessions(&app)?;
    Ok(sessions.into_iter().find(|s| s.id == session_id))
}

#[tauri::command]
pub async fn delete_session(app: AppHandle, session_id: String) -> Result<(), String> {
    let mut sessions = load_all_sessions(&app)?;
    sessions.retain(|s| s.id != session_id);
    save_all_sessions(&app, &sessions)
}

#[tauri::command]
pub async fn search_sessions(app: AppHandle, query: String) -> Result<Vec<SessionListItem>, String> {
    let sessions = load_all_sessions(&app)?;
    let query_lower = query.to_lowercase();

    let mut matching: Vec<SessionListItem> = sessions
        .iter()
        .filter(|s| {
            s.name.to_lowercase().contains(&query_lower)
                || s.preview.to_lowercase().contains(&query_lower)
                || s.working_directory.to_lowercase().contains(&query_lower)
                || s.messages
                    .iter()
                    .any(|m| m.content.to_lowercase().contains(&query_lower))
        })
        .map(SessionListItem::from)
        .collect();

    // Sort by last activity, most recent first
    matching.sort_by(|a, b| b.last_activity_at.cmp(&a.last_activity_at));

    Ok(matching)
}
|
|
||||||
|
#[tauri::command]
|
||||||
|
pub async fn clear_all_sessions(app: AppHandle) -> Result<(), String> {
|
||||||
|
save_all_sessions(&app, &[])
|
||||||
|
}
|
||||||
|
|
||||||
|
#[cfg(test)]
|
||||||
|
mod tests {
|
||||||
|
use super::*;
|
||||||
|
use chrono::TimeZone;
|
||||||
|
|
||||||
|
fn create_test_session(id: &str, name: &str) -> SavedSession {
|
||||||
|
SavedSession {
|
||||||
|
id: id.to_string(),
|
||||||
|
name: name.to_string(),
|
||||||
|
created_at: Utc::now(),
|
||||||
|
last_activity_at: Utc::now(),
|
||||||
|
working_directory: "/home/test".to_string(),
|
||||||
|
message_count: 5,
|
||||||
|
preview: "Hello world".to_string(),
|
||||||
|
messages: vec![],
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
fn create_test_message(id: &str, content: &str, msg_type: &str) -> SavedMessage {
|
||||||
|
SavedMessage {
|
||||||
|
id: id.to_string(),
|
||||||
|
message_type: msg_type.to_string(),
|
||||||
|
content: content.to_string(),
|
||||||
|
timestamp: Utc::now(),
|
||||||
|
tool_name: None,
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_session_list_item_from_saved_session() {
|
||||||
|
let session = SavedSession {
|
||||||
|
id: "test-id".to_string(),
|
||||||
|
name: "Test Session".to_string(),
|
||||||
|
created_at: Utc::now(),
|
||||||
|
last_activity_at: Utc::now(),
|
||||||
|
working_directory: "/home/test".to_string(),
|
||||||
|
message_count: 5,
|
||||||
|
preview: "Hello world".to_string(),
|
||||||
|
messages: vec![],
|
||||||
|
};
|
||||||
|
|
||||||
|
let item = SessionListItem::from(&session);
|
||||||
|
assert_eq!(item.id, "test-id");
|
||||||
|
assert_eq!(item.name, "Test Session");
|
||||||
|
assert_eq!(item.message_count, 5);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_session_list_item_preserves_all_fields() {
|
||||||
|
let created = Utc.with_ymd_and_hms(2024, 1, 15, 10, 30, 0).unwrap();
|
||||||
|
let last_activity = Utc.with_ymd_and_hms(2024, 1, 15, 14, 45, 0).unwrap();
|
||||||
|
|
||||||
|
let session = SavedSession {
|
||||||
|
id: "sess-123".to_string(),
|
||||||
|
name: "My Chat".to_string(),
|
||||||
|
created_at: created,
|
||||||
|
last_activity_at: last_activity,
|
||||||
|
working_directory: "/home/naomi/project".to_string(),
|
||||||
|
message_count: 42,
|
||||||
|
preview: "What is the meaning of life?".to_string(),
|
||||||
|
messages: vec![],
|
||||||
|
};
|
||||||
|
|
||||||
|
let item = SessionListItem::from(&session);
|
||||||
|
|
||||||
|
assert_eq!(item.id, "sess-123");
|
||||||
|
assert_eq!(item.name, "My Chat");
|
||||||
|
assert_eq!(item.created_at, created);
|
||||||
|
assert_eq!(item.last_activity_at, last_activity);
|
||||||
|
assert_eq!(item.working_directory, "/home/naomi/project");
|
||||||
|
assert_eq!(item.message_count, 42);
|
||||||
|
assert_eq!(item.preview, "What is the meaning of life?");
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_saved_session_serialization() {
|
||||||
|
let session = create_test_session("test-1", "Test Session");
|
||||||
|
let json = serde_json::to_string(&session).expect("Failed to serialize");
|
||||||
|
let parsed: SavedSession = serde_json::from_str(&json).expect("Failed to deserialize");
|
||||||
|
|
||||||
|
assert_eq!(parsed.id, session.id);
|
||||||
|
assert_eq!(parsed.name, session.name);
|
||||||
|
assert_eq!(parsed.working_directory, session.working_directory);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_saved_message_serialization() {
|
||||||
|
let message = create_test_message("msg-1", "Hello!", "user");
|
||||||
|
let json = serde_json::to_string(&message).expect("Failed to serialize");
|
||||||
|
let parsed: SavedMessage = serde_json::from_str(&json).expect("Failed to deserialize");
|
||||||
|
|
||||||
|
assert_eq!(parsed.id, message.id);
|
||||||
|
assert_eq!(parsed.content, message.content);
|
||||||
|
assert_eq!(parsed.message_type, "user");
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_saved_message_with_tool_name() {
|
||||||
|
let message = SavedMessage {
|
||||||
|
id: "msg-tool-1".to_string(),
|
||||||
|
message_type: "tool".to_string(),
|
||||||
|
content: "File read successfully".to_string(),
|
||||||
|
timestamp: Utc::now(),
|
||||||
|
tool_name: Some("Read".to_string()),
|
||||||
|
};
|
||||||
|
|
||||||
|
let json = serde_json::to_string(&message).expect("Failed to serialize");
|
||||||
|
let parsed: SavedMessage = serde_json::from_str(&json).expect("Failed to deserialize");
|
||||||
|
|
||||||
|
assert_eq!(parsed.tool_name, Some("Read".to_string()));
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_session_with_messages_serialization() {
|
||||||
|
let mut session = create_test_session("sess-full", "Full Session");
|
||||||
|
session.messages = vec![
|
||||||
|
create_test_message("msg-1", "Hello!", "user"),
|
||||||
|
create_test_message("msg-2", "Hi there!", "assistant"),
|
||||||
|
create_test_message("msg-3", "Read file", "tool"),
|
||||||
|
];
|
||||||
|
session.message_count = 3;
|
||||||
|
|
||||||
|
let json = serde_json::to_string(&session).expect("Failed to serialize");
|
||||||
|
let parsed: SavedSession = serde_json::from_str(&json).expect("Failed to deserialize");
|
||||||
|
|
||||||
|
assert_eq!(parsed.messages.len(), 3);
|
||||||
|
assert_eq!(parsed.messages[0].content, "Hello!");
|
||||||
|
assert_eq!(parsed.messages[1].message_type, "assistant");
|
||||||
|
assert_eq!(parsed.messages[2].message_type, "tool");
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_session_list_item_serialization() {
|
||||||
|
let item = SessionListItem {
|
||||||
|
id: "list-item-1".to_string(),
|
||||||
|
name: "Quick Chat".to_string(),
|
||||||
|
created_at: Utc::now(),
|
||||||
|
last_activity_at: Utc::now(),
|
||||||
|
working_directory: "/tmp".to_string(),
|
||||||
|
message_count: 10,
|
||||||
|
preview: "Short preview...".to_string(),
|
||||||
|
};
|
||||||
|
|
||||||
|
let json = serde_json::to_string(&item).expect("Failed to serialize");
|
||||||
|
let parsed: SessionListItem = serde_json::from_str(&json).expect("Failed to deserialize");
|
||||||
|
|
||||||
|
assert_eq!(parsed.id, item.id);
|
||||||
|
assert_eq!(parsed.name, item.name);
|
||||||
|
assert_eq!(parsed.preview, item.preview);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_message_type_field_rename() {
|
||||||
|
// The message_type field is renamed to "type" in JSON
|
||||||
|
let message = create_test_message("msg-1", "Test", "assistant");
|
||||||
|
let json = serde_json::to_string(&message).expect("Failed to serialize");
|
||||||
|
|
||||||
|
assert!(json.contains("\"type\":"));
|
||||||
|
assert!(!json.contains("\"message_type\":"));
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_session_default_empty_messages() {
|
||||||
|
let session = SavedSession {
|
||||||
|
id: "empty".to_string(),
|
||||||
|
name: "Empty".to_string(),
|
||||||
|
created_at: Utc::now(),
|
||||||
|
last_activity_at: Utc::now(),
|
||||||
|
working_directory: "/".to_string(),
|
||||||
|
message_count: 0,
|
||||||
|
preview: "".to_string(),
|
||||||
|
messages: vec![],
|
||||||
|
};
|
||||||
|
|
||||||
|
assert!(session.messages.is_empty());
|
||||||
|
assert_eq!(session.message_count, 0);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
#[allow(clippy::useless_vec)]
|
||||||
|
fn test_session_sorting_by_activity() {
|
||||||
|
let old_time = Utc.with_ymd_and_hms(2024, 1, 1, 0, 0, 0).unwrap();
|
||||||
|
let new_time = Utc.with_ymd_and_hms(2024, 6, 15, 12, 0, 0).unwrap();
|
||||||
|
|
||||||
|
let mut sessions = vec![
|
||||||
|
SessionListItem {
|
||||||
|
id: "old".to_string(),
|
||||||
|
name: "Old Session".to_string(),
|
||||||
|
created_at: old_time,
|
||||||
|
last_activity_at: old_time,
|
||||||
|
working_directory: "/old".to_string(),
|
||||||
|
message_count: 1,
|
||||||
|
preview: "Old".to_string(),
|
||||||
|
},
|
||||||
|
SessionListItem {
|
||||||
|
id: "new".to_string(),
|
||||||
|
name: "New Session".to_string(),
|
||||||
|
created_at: new_time,
|
||||||
|
last_activity_at: new_time,
|
||||||
|
working_directory: "/new".to_string(),
|
||||||
|
message_count: 1,
|
||||||
|
preview: "New".to_string(),
|
||||||
|
},
|
||||||
|
];
|
||||||
|
|
||||||
|
// Sort by last activity, most recent first (mimics list_sessions behavior)
|
||||||
|
sessions.sort_by(|a, b| b.last_activity_at.cmp(&a.last_activity_at));
|
||||||
|
|
||||||
|
assert_eq!(sessions[0].id, "new");
|
||||||
|
assert_eq!(sessions[1].id, "old");
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_session_clone() {
|
||||||
|
let original = create_test_session("clone-test", "Clone Test");
|
||||||
|
let cloned = original.clone();
|
||||||
|
|
||||||
|
assert_eq!(original.id, cloned.id);
|
||||||
|
assert_eq!(original.name, cloned.name);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_message_clone() {
|
||||||
|
let original = create_test_message("msg-clone", "Content", "user");
|
||||||
|
let cloned = original.clone();
|
||||||
|
|
||||||
|
assert_eq!(original.id, cloned.id);
|
||||||
|
assert_eq!(original.content, cloned.content);
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -0,0 +1,439 @@
use chrono::{DateTime, Utc};
use serde::{Deserialize, Serialize};
use tauri::AppHandle;
use tauri_plugin_store::StoreExt;

const SNIPPETS_STORE_KEY: &str = "snippets";

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct Snippet {
    pub id: String,
    pub name: String,
    pub content: String,
    pub category: String,
    pub is_default: bool,
    pub created_at: DateTime<Utc>,
    pub updated_at: DateTime<Utc>,
}

fn get_default_snippets() -> Vec<Snippet> {
    let now = Utc::now();
    vec![
        Snippet {
            id: "default-explain-code".to_string(),
            name: "Explain this code".to_string(),
            content: "Please explain what this code does, step by step:".to_string(),
            category: "Code Review".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        Snippet {
            id: "default-fix-error".to_string(),
            name: "Fix this error".to_string(),
            content: "I'm getting the following error. Can you help me fix it?".to_string(),
            category: "Debugging".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        Snippet {
            id: "default-write-tests".to_string(),
            name: "Write tests".to_string(),
            content: "Please write unit tests for this code with good coverage:".to_string(),
            category: "Testing".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        Snippet {
            id: "default-refactor".to_string(),
            name: "Refactor for clarity".to_string(),
            content: "Please refactor this code to improve readability and maintainability:".to_string(),
            category: "Code Review".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        Snippet {
            id: "default-optimize".to_string(),
            name: "Optimize performance".to_string(),
            content: "Please analyze this code for performance issues and suggest optimizations:".to_string(),
            category: "Performance".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        Snippet {
            id: "default-review-pr".to_string(),
            name: "Review PR".to_string(),
            content: "Please review this pull request and provide feedback on code quality, potential issues, and suggestions for improvement.".to_string(),
            category: "Code Review".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        Snippet {
            id: "default-add-comments".to_string(),
            name: "Add documentation".to_string(),
            content: "Please add clear documentation comments to this code explaining what it does:".to_string(),
            category: "Documentation".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
        Snippet {
            id: "default-security-review".to_string(),
            name: "Security review".to_string(),
            content: "Please review this code for security vulnerabilities and suggest fixes:".to_string(),
            category: "Security".to_string(),
            is_default: true,
            created_at: now,
            updated_at: now,
        },
    ]
}

fn load_all_snippets(app: &AppHandle) -> Result<Vec<Snippet>, String> {
    let store = app
        .store("hikari-snippets.json")
        .map_err(|e| e.to_string())?;

    match store.get(SNIPPETS_STORE_KEY) {
        Some(value) => {
            let mut snippets: Vec<Snippet> =
                serde_json::from_value(value.clone()).map_err(|e| e.to_string())?;

            // Ensure default snippets exist (in case new ones were added in an update)
            let defaults = get_default_snippets();
            for default in defaults {
                if !snippets.iter().any(|s| s.id == default.id) {
                    snippets.push(default);
                }
            }

            Ok(snippets)
        }
        None => Ok(get_default_snippets()),
    }
}

fn save_all_snippets(app: &AppHandle, snippets: &[Snippet]) -> Result<(), String> {
    let store = app
        .store("hikari-snippets.json")
        .map_err(|e| e.to_string())?;

    let value = serde_json::to_value(snippets).map_err(|e| e.to_string())?;
    store.set(SNIPPETS_STORE_KEY, value);
    store.save().map_err(|e| e.to_string())?;

    Ok(())
}

#[tauri::command]
pub async fn list_snippets(app: AppHandle) -> Result<Vec<Snippet>, String> {
    let mut snippets = load_all_snippets(&app)?;

    // Sort by category, then by name
    snippets.sort_by(|a, b| {
        let cat_cmp = a.category.cmp(&b.category);
        if cat_cmp == std::cmp::Ordering::Equal {
            a.name.cmp(&b.name)
        } else {
            cat_cmp
        }
    });

    Ok(snippets)
}

#[tauri::command]
pub async fn save_snippet(app: AppHandle, snippet: Snippet) -> Result<(), String> {
    let mut snippets = load_all_snippets(&app)?;

    // Update existing or add new
    if let Some(existing) = snippets.iter_mut().find(|s| s.id == snippet.id) {
        // Don't allow editing default snippets' is_default flag
        let mut updated = snippet;
        updated.is_default = existing.is_default;
        *existing = updated;
    } else {
        snippets.push(snippet);
    }

    save_all_snippets(&app, &snippets)
}

#[tauri::command]
pub async fn delete_snippet(app: AppHandle, snippet_id: String) -> Result<(), String> {
    let mut snippets = load_all_snippets(&app)?;

    // Don't allow deleting default snippets
    if snippets
        .iter()
        .any(|s| s.id == snippet_id && s.is_default)
    {
        return Err("Cannot delete default snippets".to_string());
    }

    snippets.retain(|s| s.id != snippet_id);
    save_all_snippets(&app, &snippets)
}

#[tauri::command]
pub async fn get_snippet_categories(app: AppHandle) -> Result<Vec<String>, String> {
    let snippets = load_all_snippets(&app)?;
    let mut categories: Vec<String> = snippets.iter().map(|s| s.category.clone()).collect();
    categories.sort();
    categories.dedup();
    Ok(categories)
}

#[tauri::command]
pub async fn reset_default_snippets(app: AppHandle) -> Result<(), String> {
    let mut snippets = load_all_snippets(&app)?;

    // Remove all default snippets
    snippets.retain(|s| !s.is_default);

    // Add fresh default snippets
    snippets.extend(get_default_snippets());

    save_all_snippets(&app, &snippets)
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::collections::HashSet;

    fn create_test_snippet(id: &str, name: &str, category: &str, is_default: bool) -> Snippet {
        Snippet {
            id: id.to_string(),
            name: name.to_string(),
            content: "Test content".to_string(),
            category: category.to_string(),
            is_default,
            created_at: Utc::now(),
            updated_at: Utc::now(),
        }
    }

    #[test]
    fn test_default_snippets_exist() {
        let defaults = get_default_snippets();
        assert!(!defaults.is_empty());
        assert!(defaults.iter().all(|s| s.is_default));
    }

    #[test]
    fn test_default_snippets_have_required_fields() {
        let defaults = get_default_snippets();
        for snippet in defaults {
            assert!(!snippet.id.is_empty());
            assert!(!snippet.name.is_empty());
            assert!(!snippet.content.is_empty());
            assert!(!snippet.category.is_empty());
        }
    }

    #[test]
    fn test_default_snippets_count() {
        let defaults = get_default_snippets();
        // Should have 8 default snippets
        assert_eq!(defaults.len(), 8);
    }

    #[test]
    fn test_default_snippets_have_unique_ids() {
        let defaults = get_default_snippets();
        let ids: HashSet<&String> = defaults.iter().map(|s| &s.id).collect();
        assert_eq!(ids.len(), defaults.len());
    }

    #[test]
    fn test_default_snippets_ids_start_with_default() {
        let defaults = get_default_snippets();
        assert!(defaults.iter().all(|s| s.id.starts_with("default-")));
    }

    #[test]
    fn test_snippet_serialization() {
        let snippet = create_test_snippet("test-1", "Test Snippet", "Testing", false);
        let json = serde_json::to_string(&snippet).expect("Failed to serialize");
        let parsed: Snippet = serde_json::from_str(&json).expect("Failed to deserialize");

        assert_eq!(parsed.id, snippet.id);
        assert_eq!(parsed.name, snippet.name);
        assert_eq!(parsed.content, snippet.content);
        assert_eq!(parsed.category, snippet.category);
        assert_eq!(parsed.is_default, snippet.is_default);
    }

    #[test]
    fn test_snippet_clone() {
        let original = create_test_snippet("clone-test", "Clone Test", "Category", true);
        let cloned = original.clone();

        assert_eq!(original.id, cloned.id);
        assert_eq!(original.name, cloned.name);
        assert_eq!(original.is_default, cloned.is_default);
    }

    #[test]
    #[allow(clippy::useless_vec)]
    fn test_snippet_sorting_by_category_then_name() {
        let mut snippets = vec![
            create_test_snippet("s1", "Zebra", "B-Category", false),
            create_test_snippet("s2", "Apple", "A-Category", false),
            create_test_snippet("s3", "Banana", "B-Category", false),
            create_test_snippet("s4", "Alpha", "A-Category", false),
        ];

        // Sort by category, then by name (mimics list_snippets behavior)
        snippets.sort_by(|a, b| {
            let cat_cmp = a.category.cmp(&b.category);
            if cat_cmp == std::cmp::Ordering::Equal {
                a.name.cmp(&b.name)
            } else {
                cat_cmp
            }
        });

        // A-Category should come first
        assert_eq!(snippets[0].category, "A-Category");
        assert_eq!(snippets[1].category, "A-Category");
        assert_eq!(snippets[2].category, "B-Category");
        assert_eq!(snippets[3].category, "B-Category");

        // Within categories, alphabetically by name
        assert_eq!(snippets[0].name, "Alpha");
        assert_eq!(snippets[1].name, "Apple");
        assert_eq!(snippets[2].name, "Banana");
        assert_eq!(snippets[3].name, "Zebra");
    }

    #[test]
    fn test_known_default_snippets() {
        let defaults = get_default_snippets();
        let ids: Vec<&str> = defaults.iter().map(|s| s.id.as_str()).collect();

        assert!(ids.contains(&"default-explain-code"));
        assert!(ids.contains(&"default-fix-error"));
        assert!(ids.contains(&"default-write-tests"));
        assert!(ids.contains(&"default-refactor"));
        assert!(ids.contains(&"default-optimize"));
        assert!(ids.contains(&"default-review-pr"));
        assert!(ids.contains(&"default-add-comments"));
        assert!(ids.contains(&"default-security-review"));
    }

    #[test]
    fn test_default_snippet_categories() {
        let defaults = get_default_snippets();
        let categories: HashSet<&String> = defaults.iter().map(|s| &s.category).collect();

        assert!(categories.contains(&"Code Review".to_string()));
        assert!(categories.contains(&"Debugging".to_string()));
        assert!(categories.contains(&"Testing".to_string()));
        assert!(categories.contains(&"Performance".to_string()));
        assert!(categories.contains(&"Documentation".to_string()));
        assert!(categories.contains(&"Security".to_string()));
    }

    #[test]
    fn test_snippet_content_not_empty() {
        let defaults = get_default_snippets();
        for snippet in defaults {
            assert!(
                snippet.content.len() > 10,
                "Content should be meaningful: {}",
                snippet.name
            );
        }
    }

    #[test]
    fn test_snippet_timestamps() {
        let snippet = create_test_snippet("time-test", "Time Test", "Cat", false);
        assert!(snippet.created_at <= snippet.updated_at);
    }

    #[test]
    fn test_default_snippets_have_same_timestamps() {
        let defaults = get_default_snippets();
        // All defaults are created at the same instant
        let first_created = defaults[0].created_at;
        let first_updated = defaults[0].updated_at;

        for snippet in &defaults {
            assert_eq!(snippet.created_at, first_created);
            assert_eq!(snippet.updated_at, first_updated);
        }
    }

    #[test]
    fn test_snippet_retain_non_default() {
        let mut snippets = vec![
            create_test_snippet("default-1", "Default 1", "Cat", true),
            create_test_snippet("custom-1", "Custom 1", "Cat", false),
            create_test_snippet("default-2", "Default 2", "Cat", true),
            create_test_snippet("custom-2", "Custom 2", "Cat", false),
        ];

        // Mimics reset_default_snippets behavior (retain non-defaults)
        snippets.retain(|s| !s.is_default);

        assert_eq!(snippets.len(), 2);
        assert!(snippets.iter().all(|s| !s.is_default));
    }

    #[test]
    #[allow(clippy::useless_vec)]
    fn test_snippet_find_by_id() {
        let snippets = vec![
            create_test_snippet("snippet-1", "First", "Cat", false),
            create_test_snippet("snippet-2", "Second", "Cat", false),
            create_test_snippet("snippet-3", "Third", "Cat", false),
        ];

        let found = snippets.iter().find(|s| s.id == "snippet-2");
        assert!(found.is_some());
        assert_eq!(found.unwrap().name, "Second");

        let not_found = snippets.iter().find(|s| s.id == "snippet-999");
        assert!(not_found.is_none());
    }

    #[test]
    #[allow(clippy::useless_vec)]
    fn test_extract_categories_sorted_and_deduped() {
        let snippets = vec![
            create_test_snippet("s1", "S1", "Zebra", false),
            create_test_snippet("s2", "S2", "Alpha", false),
            create_test_snippet("s3", "S3", "Beta", false),
            create_test_snippet("s4", "S4", "Alpha", false), // Duplicate
        ];

        let mut categories: Vec<String> = snippets.iter().map(|s| s.category.clone()).collect();
        categories.sort();
        categories.dedup();

        assert_eq!(categories.len(), 3);
        assert_eq!(categories[0], "Alpha");
        assert_eq!(categories[1], "Beta");
        assert_eq!(categories[2], "Zebra");
    }

    #[test]
    fn test_snippet_category_code_review_count() {
        let defaults = get_default_snippets();
        let code_review_count = defaults
            .iter()
            .filter(|s| s.category == "Code Review")
            .count();

        // There should be multiple code review snippets
        assert!(code_review_count >= 2);
    }
}
@@ -0,0 +1,426 @@
use parking_lot::Mutex;
use std::collections::HashMap;
use std::fs;
use std::path::{Path, PathBuf};
use std::sync::Arc;
use uuid::Uuid;

const TEMP_DIR_NAME: &str = "hikari-uploads";

pub struct TempFileManager {
    base_dir: PathBuf,
    files: HashMap<String, Vec<PathBuf>>,
}

impl TempFileManager {
    pub fn new() -> Result<Self, String> {
        let base_dir = std::env::temp_dir().join(TEMP_DIR_NAME);

        if !base_dir.exists() {
            fs::create_dir_all(&base_dir)
                .map_err(|e| format!("Failed to create temp directory: {}", e))?;
        }

        Ok(TempFileManager {
            base_dir,
            files: HashMap::new(),
        })
    }

    #[allow(dead_code)]
    pub fn get_base_dir(&self) -> &Path {
        &self.base_dir
    }

    pub fn save_file(
        &mut self,
        conversation_id: &str,
        data: &[u8],
        original_filename: Option<&str>,
    ) -> Result<PathBuf, String> {
        let unique_id = Uuid::new_v4();
        let extension = original_filename
            .and_then(|name| Path::new(name).extension())
            .and_then(|ext| ext.to_str())
            .unwrap_or("bin");

        let filename = format!("{}_{}.{}", conversation_id, unique_id, extension);
        let file_path = self.base_dir.join(&filename);

        fs::write(&file_path, data)
            .map_err(|e| format!("Failed to write temp file: {}", e))?;

        self.files
            .entry(conversation_id.to_string())
            .or_default()
            .push(file_path.clone());

        Ok(file_path)
    }

    pub fn register_file(&mut self, conversation_id: &str, file_path: PathBuf) {
        self.files
            .entry(conversation_id.to_string())
            .or_default()
            .push(file_path);
    }

    pub fn get_files_for_conversation(&self, conversation_id: &str) -> Vec<PathBuf> {
        self.files
            .get(conversation_id)
            .cloned()
            .unwrap_or_default()
    }

    pub fn cleanup_conversation(&mut self, conversation_id: &str) -> Result<(), String> {
        if let Some(files) = self.files.remove(conversation_id) {
            for file_path in files {
                if file_path.exists() {
                    if let Err(e) = fs::remove_file(&file_path) {
                        tracing::warn!(
                            "Failed to remove temp file {:?}: {}",
                            file_path, e
                        );
                    }
                }
            }
        }
        Ok(())
    }

    pub fn cleanup_all(&mut self) -> Result<(), String> {
        let conversation_ids: Vec<String> = self.files.keys().cloned().collect();

        for conversation_id in conversation_ids {
            self.cleanup_conversation(&conversation_id)?;
        }

        Ok(())
    }

    pub fn cleanup_orphaned_files(&mut self) -> Result<usize, String> {
        let mut cleaned_count = 0;

        if !self.base_dir.exists() {
            return Ok(0);
        }

        let tracked_files: std::collections::HashSet<PathBuf> =
            self.files.values().flatten().cloned().collect();

        let entries = fs::read_dir(&self.base_dir)
            .map_err(|e| format!("Failed to read temp directory: {}", e))?;

        for entry in entries.flatten() {
            let path = entry.path();
            if path.is_file() && !tracked_files.contains(&path) {
                if let Err(e) = fs::remove_file(&path) {
                    tracing::warn!("Failed to remove orphaned file {:?}: {}", path, e);
                } else {
                    cleaned_count += 1;
                }
            }
        }

        Ok(cleaned_count)
    }
}

impl Default for TempFileManager {
    fn default() -> Self {
        Self::new().expect("Failed to create TempFileManager")
    }
}

pub type SharedTempFileManager = Arc<Mutex<TempFileManager>>;

pub fn create_shared_temp_manager() -> Result<SharedTempFileManager, String> {
    Ok(Arc::new(Mutex::new(TempFileManager::new()?)))
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::fs;
    use tempfile::TempDir;

    // Helper to create a TempFileManager with a custom base directory for testing
    fn create_test_manager(base_dir: PathBuf) -> TempFileManager {
        if !base_dir.exists() {
            fs::create_dir_all(&base_dir).expect("Failed to create test temp dir");
        }
        TempFileManager {
            base_dir,
            files: HashMap::new(),
        }
    }

    #[test]
    fn test_new_creates_base_directory() {
        let manager = TempFileManager::new().expect("Failed to create TempFileManager");
        assert!(manager.base_dir.exists());
    }

    #[test]
    fn test_get_base_dir_returns_correct_path() {
        let temp_dir = TempDir::new().expect("Failed to create temp dir");
        let base_path = temp_dir.path().join("hikari-test");
        let manager = create_test_manager(base_path.clone());

        assert_eq!(manager.get_base_dir(), base_path.as_path());
    }

    #[test]
    fn test_save_file_creates_file_with_content() {
        let temp_dir = TempDir::new().expect("Failed to create temp dir");
        let base_path = temp_dir.path().join("hikari-test");
        let mut manager = create_test_manager(base_path);

        let data = b"Hello, world!";
        let result = manager.save_file("conv-1", data, Some("test.txt"));

        assert!(result.is_ok());
        let file_path = result.unwrap();
        assert!(file_path.exists());

        let content = fs::read(&file_path).expect("Failed to read file");
        assert_eq!(content, data);
    }

    #[test]
    fn test_save_file_uses_correct_extension() {
        let temp_dir = TempDir::new().expect("Failed to create temp dir");
        let base_path = temp_dir.path().join("hikari-test");
        let mut manager = create_test_manager(base_path);

        let data = b"test data";
        let result = manager.save_file("conv-1", data, Some("document.pdf"));

        assert!(result.is_ok());
|
||||||
|
let file_path = result.unwrap();
|
||||||
|
assert_eq!(file_path.extension().unwrap(), "pdf");
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_save_file_uses_bin_extension_when_no_filename() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let mut manager = create_test_manager(base_path);
|
||||||
|
|
||||||
|
let data = b"binary data";
|
||||||
|
let result = manager.save_file("conv-1", data, None);
|
||||||
|
|
||||||
|
assert!(result.is_ok());
|
||||||
|
let file_path = result.unwrap();
|
||||||
|
assert_eq!(file_path.extension().unwrap(), "bin");
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_register_file_tracks_file_path() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let mut manager = create_test_manager(base_path);
|
||||||
|
|
||||||
|
let file_path = PathBuf::from("/some/path/file.txt");
|
||||||
|
manager.register_file("conv-1", file_path.clone());
|
||||||
|
|
||||||
|
let files = manager.get_files_for_conversation("conv-1");
|
||||||
|
assert_eq!(files.len(), 1);
|
||||||
|
assert_eq!(files[0], file_path);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_get_files_for_conversation_returns_empty_for_unknown() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let manager = create_test_manager(base_path);
|
||||||
|
|
||||||
|
let files = manager.get_files_for_conversation("unknown-conv");
|
||||||
|
assert!(files.is_empty());
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_get_files_for_conversation_returns_all_files() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let mut manager = create_test_manager(base_path);
|
||||||
|
|
||||||
|
let data = b"test";
|
||||||
|
manager.save_file("conv-1", data, Some("file1.txt")).unwrap();
|
||||||
|
manager.save_file("conv-1", data, Some("file2.txt")).unwrap();
|
||||||
|
manager.save_file("conv-2", data, Some("file3.txt")).unwrap();
|
||||||
|
|
||||||
|
let files_conv1 = manager.get_files_for_conversation("conv-1");
|
||||||
|
let files_conv2 = manager.get_files_for_conversation("conv-2");
|
||||||
|
|
||||||
|
assert_eq!(files_conv1.len(), 2);
|
||||||
|
assert_eq!(files_conv2.len(), 1);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_cleanup_conversation_removes_files() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let mut manager = create_test_manager(base_path);
|
||||||
|
|
||||||
|
let data = b"test";
|
||||||
|
let file_path = manager.save_file("conv-1", data, Some("test.txt")).unwrap();
|
||||||
|
assert!(file_path.exists());
|
||||||
|
|
||||||
|
let result = manager.cleanup_conversation("conv-1");
|
||||||
|
assert!(result.is_ok());
|
||||||
|
assert!(!file_path.exists());
|
||||||
|
assert!(manager.get_files_for_conversation("conv-1").is_empty());
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_cleanup_conversation_handles_missing_files() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let mut manager = create_test_manager(base_path);
|
||||||
|
|
||||||
|
// Register a file that doesn't exist
|
||||||
|
manager.register_file("conv-1", PathBuf::from("/nonexistent/file.txt"));
|
||||||
|
|
||||||
|
// Should not error, just skip missing files
|
||||||
|
let result = manager.cleanup_conversation("conv-1");
|
||||||
|
assert!(result.is_ok());
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_cleanup_conversation_for_unknown_returns_ok() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let mut manager = create_test_manager(base_path);
|
||||||
|
|
||||||
|
let result = manager.cleanup_conversation("unknown-conv");
|
||||||
|
assert!(result.is_ok());
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_cleanup_all_removes_all_files() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let mut manager = create_test_manager(base_path);
|
||||||
|
|
||||||
|
let data = b"test";
|
||||||
|
let file1 = manager.save_file("conv-1", data, Some("f1.txt")).unwrap();
|
||||||
|
let file2 = manager.save_file("conv-2", data, Some("f2.txt")).unwrap();
|
||||||
|
|
||||||
|
assert!(file1.exists());
|
||||||
|
assert!(file2.exists());
|
||||||
|
|
||||||
|
let result = manager.cleanup_all();
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
assert!(!file1.exists());
|
||||||
|
assert!(!file2.exists());
|
||||||
|
assert!(manager.files.is_empty());
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_cleanup_orphaned_files_removes_untracked() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let mut manager = create_test_manager(base_path.clone());
|
||||||
|
|
||||||
|
// Create a tracked file
|
||||||
|
let data = b"tracked";
|
||||||
|
let tracked_path = manager.save_file("conv-1", data, Some("tracked.txt")).unwrap();
|
||||||
|
|
||||||
|
// Create an untracked (orphaned) file directly in the temp directory
|
||||||
|
let orphan_path = base_path.join("orphan.txt");
|
||||||
|
fs::write(&orphan_path, b"orphan").expect("Failed to create orphan file");
|
||||||
|
|
||||||
|
assert!(tracked_path.exists());
|
||||||
|
assert!(orphan_path.exists());
|
||||||
|
|
||||||
|
let result = manager.cleanup_orphaned_files();
|
||||||
|
assert!(result.is_ok());
|
||||||
|
assert_eq!(result.unwrap(), 1); // One orphan removed
|
||||||
|
|
||||||
|
assert!(tracked_path.exists()); // Tracked file still exists
|
||||||
|
assert!(!orphan_path.exists()); // Orphan removed
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_cleanup_orphaned_returns_zero_when_none() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let mut manager = create_test_manager(base_path);
|
||||||
|
|
||||||
|
let data = b"test";
|
||||||
|
manager.save_file("conv-1", data, Some("test.txt")).unwrap();
|
||||||
|
|
||||||
|
let result = manager.cleanup_orphaned_files();
|
||||||
|
assert!(result.is_ok());
|
||||||
|
assert_eq!(result.unwrap(), 0);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_cleanup_orphaned_returns_zero_when_dir_missing() {
|
||||||
|
let mut manager = TempFileManager {
|
||||||
|
base_dir: PathBuf::from("/nonexistent/dir"),
|
||||||
|
files: HashMap::new(),
|
||||||
|
};
|
||||||
|
|
||||||
|
let result = manager.cleanup_orphaned_files();
|
||||||
|
assert!(result.is_ok());
|
||||||
|
assert_eq!(result.unwrap(), 0);
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_default_creates_manager() {
|
||||||
|
// Default should work as long as we can create temp directories
|
||||||
|
let manager = TempFileManager::default();
|
||||||
|
assert!(manager.base_dir.exists());
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_create_shared_temp_manager() {
|
||||||
|
let result = create_shared_temp_manager();
|
||||||
|
assert!(result.is_ok());
|
||||||
|
|
||||||
|
let shared = result.unwrap();
|
||||||
|
let manager = shared.lock();
|
||||||
|
assert!(manager.base_dir.exists());
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_multiple_files_same_conversation() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let mut manager = create_test_manager(base_path);
|
||||||
|
|
||||||
|
// Save multiple files to same conversation
|
||||||
|
for i in 0..5 {
|
||||||
|
let data = format!("content {}", i);
|
||||||
|
manager
|
||||||
|
.save_file("conv-1", data.as_bytes(), Some(&format!("file{}.txt", i)))
|
||||||
|
.unwrap();
|
||||||
|
}
|
||||||
|
|
||||||
|
let files = manager.get_files_for_conversation("conv-1");
|
||||||
|
assert_eq!(files.len(), 5);
|
||||||
|
|
||||||
|
// Each file should have unique content
|
||||||
|
for (i, file_path) in files.iter().enumerate() {
|
||||||
|
let content = fs::read_to_string(file_path).expect("Failed to read");
|
||||||
|
assert_eq!(content, format!("content {}", i));
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn test_file_paths_contain_conversation_id() {
|
||||||
|
let temp_dir = TempDir::new().expect("Failed to create temp dir");
|
||||||
|
let base_path = temp_dir.path().join("hikari-test");
|
||||||
|
let mut manager = create_test_manager(base_path);
|
||||||
|
|
||||||
|
let file_path = manager
|
||||||
|
.save_file("my-conversation-id", b"test", Some("test.txt"))
|
||||||
|
.unwrap();
|
||||||
|
|
||||||
|
let filename = file_path.file_name().unwrap().to_str().unwrap();
|
||||||
|
assert!(filename.starts_with("my-conversation-id_"));
|
||||||
|
}
|
||||||
|
}
|
||||||
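Reviewer note: the orphan sweep in `cleanup_orphaned_files` reduces to diffing a directory listing against a tracked set and deleting the difference. A minimal, stdlib-only sketch of that pattern (the `sweep_orphans` name and demo paths are illustrative, not from this crate; unlike the crate's version, this one propagates removal errors instead of logging them):

```rust
use std::collections::HashSet;
use std::fs;
use std::path::{Path, PathBuf};

// Remove every regular file in `dir` that is not in `tracked`; return how many were removed.
fn sweep_orphans(dir: &Path, tracked: &HashSet<PathBuf>) -> std::io::Result<usize> {
    let mut removed = 0;
    for entry in fs::read_dir(dir)?.flatten() {
        let path = entry.path();
        if path.is_file() && !tracked.contains(&path) {
            fs::remove_file(&path)?;
            removed += 1;
        }
    }
    Ok(removed)
}

fn main() -> std::io::Result<()> {
    // Unique per-process demo directory so stale state from earlier runs cannot interfere.
    let dir = std::env::temp_dir().join(format!("sweep-demo-{}", std::process::id()));
    let _ = fs::remove_dir_all(&dir);
    fs::create_dir_all(&dir)?;

    let tracked_file = dir.join("tracked.txt");
    let orphan_file = dir.join("orphan.txt");
    fs::write(&tracked_file, b"keep")?;
    fs::write(&orphan_file, b"drop")?;

    let tracked: HashSet<PathBuf> = [tracked_file.clone()].into_iter().collect();
    let removed = sweep_orphans(&dir, &tracked)?;
    println!("removed {}", removed); // removed 1

    assert!(tracked_file.exists()); // tracked file survives
    assert!(!orphan_file.exists()); // orphan is gone
    fs::remove_dir_all(&dir)?;
    Ok(())
}
```

The crate's version instead logs and skips individual removal failures (via `tracing::warn!`) so one locked file does not abort the whole sweep.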
@@ -0,0 +1,266 @@
use serde::{Deserialize, Serialize};
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

/// Tools that could benefit from caching
#[allow(dead_code)]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum CacheableTool {
    Read,
    Glob,
    Grep,
}

impl CacheableTool {
    #[allow(dead_code)]
    pub fn from_name(name: &str) -> Option<Self> {
        match name {
            "Read" => Some(Self::Read),
            "Glob" => Some(Self::Glob),
            "Grep" => Some(Self::Grep),
            _ => None,
        }
    }
}

/// Statistics about potential cache savings
#[allow(dead_code)]
#[derive(Debug, Default, Clone, Serialize, Deserialize)]
pub struct CacheAnalytics {
    /// Number of tool calls that could have been cache hits
    pub potential_cache_hits: u64,
    /// Estimated tokens that could have been saved
    pub potential_savings_tokens: u64,
    /// Tracks unique tool invocations: hash -> (tool_name, call_count)
    #[serde(skip)]
    recent_invocations: HashMap<u64, (String, u64)>,
}

#[allow(dead_code)]
impl CacheAnalytics {
    pub fn new() -> Self {
        Self::default()
    }

    /// Compute a hash key from tool name and input
    fn compute_key(tool_name: &str, input: &serde_json::Value) -> u64 {
        let mut hasher = DefaultHasher::new();
        tool_name.hash(&mut hasher);
        input.to_string().hash(&mut hasher);
        hasher.finish()
    }

    /// Track a tool invocation for analytics.
    /// Returns true if this was a repeated invocation (potential cache hit).
    pub fn track_invocation(
        &mut self,
        tool_name: &str,
        input: &serde_json::Value,
        estimated_tokens: u64,
    ) -> bool {
        // Only track cacheable tools
        if CacheableTool::from_name(tool_name).is_none() {
            return false;
        }

        let key = Self::compute_key(tool_name, input);

        if let Some((_, count)) = self.recent_invocations.get_mut(&key) {
            *count += 1;
            // This is a repeat - could have been a cache hit
            self.potential_cache_hits += 1;
            self.potential_savings_tokens += estimated_tokens;
            true
        } else {
            self.recent_invocations
                .insert(key, (tool_name.to_string(), 1));
            false
        }
    }

    /// Get the number of unique tool invocations being tracked
    pub fn unique_invocations(&self) -> usize {
        self.recent_invocations.len()
    }

    /// Get invocations that were called more than once
    pub fn repeated_invocations(&self) -> Vec<(&str, u64)> {
        self.recent_invocations
            .values()
            .filter(|(_, count)| *count > 1)
            .map(|(name, count)| (name.as_str(), *count))
            .collect()
    }

    /// Clear session analytics (keep totals)
    pub fn clear_session(&mut self) {
        self.recent_invocations.clear();
    }

    /// Fully reset all analytics
    pub fn reset(&mut self) {
        self.potential_cache_hits = 0;
        self.potential_savings_tokens = 0;
        self.recent_invocations.clear();
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use serde_json::json;

    #[test]
    fn test_cacheable_tool_from_name() {
        assert_eq!(CacheableTool::from_name("Read"), Some(CacheableTool::Read));
        assert_eq!(CacheableTool::from_name("Glob"), Some(CacheableTool::Glob));
        assert_eq!(CacheableTool::from_name("Grep"), Some(CacheableTool::Grep));
        assert_eq!(CacheableTool::from_name("Bash"), None);
        assert_eq!(CacheableTool::from_name("Edit"), None);
        assert_eq!(CacheableTool::from_name("Write"), None);
    }

    #[test]
    fn test_first_invocation_not_cache_hit() {
        let mut analytics = CacheAnalytics::new();
        let input = json!({"file_path": "/home/test/file.txt"});

        let is_repeat = analytics.track_invocation("Read", &input, 100);

        assert!(!is_repeat);
        assert_eq!(analytics.potential_cache_hits, 0);
        assert_eq!(analytics.potential_savings_tokens, 0);
    }

    #[test]
    fn test_second_invocation_is_cache_hit() {
        let mut analytics = CacheAnalytics::new();
        let input = json!({"file_path": "/home/test/file.txt"});

        analytics.track_invocation("Read", &input, 100);
        let is_repeat = analytics.track_invocation("Read", &input, 100);

        assert!(is_repeat);
        assert_eq!(analytics.potential_cache_hits, 1);
        assert_eq!(analytics.potential_savings_tokens, 100);
    }

    #[test]
    fn test_different_inputs_not_cache_hit() {
        let mut analytics = CacheAnalytics::new();
        let input1 = json!({"file_path": "/home/test/file1.txt"});
        let input2 = json!({"file_path": "/home/test/file2.txt"});

        analytics.track_invocation("Read", &input1, 100);
        let is_repeat = analytics.track_invocation("Read", &input2, 100);

        assert!(!is_repeat);
        assert_eq!(analytics.potential_cache_hits, 0);
    }

    #[test]
    fn test_non_cacheable_tool_ignored() {
        let mut analytics = CacheAnalytics::new();
        let input = json!({"command": "ls -la"});

        let is_repeat = analytics.track_invocation("Bash", &input, 100);
        analytics.track_invocation("Bash", &input, 100);

        assert!(!is_repeat);
        assert_eq!(analytics.potential_cache_hits, 0);
        assert_eq!(analytics.unique_invocations(), 0);
    }

    #[test]
    fn test_multiple_repeated_invocations() {
        let mut analytics = CacheAnalytics::new();
        let input = json!({"file_path": "/home/test/file.txt"});

        analytics.track_invocation("Read", &input, 100);
        analytics.track_invocation("Read", &input, 100);
        analytics.track_invocation("Read", &input, 100);

        assert_eq!(analytics.potential_cache_hits, 2);
        assert_eq!(analytics.potential_savings_tokens, 200);
    }

    #[test]
    fn test_unique_invocations_count() {
        let mut analytics = CacheAnalytics::new();

        analytics.track_invocation("Read", &json!({"file_path": "/file1.txt"}), 100);
        analytics.track_invocation("Read", &json!({"file_path": "/file2.txt"}), 100);
        analytics.track_invocation("Glob", &json!({"pattern": "*.rs"}), 50);

        assert_eq!(analytics.unique_invocations(), 3);
    }

    #[test]
    fn test_repeated_invocations_list() {
        let mut analytics = CacheAnalytics::new();

        // file1 read twice
        analytics.track_invocation("Read", &json!({"file_path": "/file1.txt"}), 100);
        analytics.track_invocation("Read", &json!({"file_path": "/file1.txt"}), 100);

        // file2 read once
        analytics.track_invocation("Read", &json!({"file_path": "/file2.txt"}), 100);

        // glob run 3 times
        analytics.track_invocation("Glob", &json!({"pattern": "*.rs"}), 50);
        analytics.track_invocation("Glob", &json!({"pattern": "*.rs"}), 50);
        analytics.track_invocation("Glob", &json!({"pattern": "*.rs"}), 50);

        let repeated = analytics.repeated_invocations();
        assert_eq!(repeated.len(), 2); // file1 and glob pattern
    }

    #[test]
    fn test_clear_session() {
        let mut analytics = CacheAnalytics::new();
        let input = json!({"file_path": "/file.txt"});

        analytics.track_invocation("Read", &input, 100);
        analytics.track_invocation("Read", &input, 100);

        assert_eq!(analytics.potential_cache_hits, 1);
        assert_eq!(analytics.unique_invocations(), 1);

        analytics.clear_session();

        assert_eq!(analytics.potential_cache_hits, 1); // Preserved
        assert_eq!(analytics.unique_invocations(), 0); // Cleared
    }

    #[test]
    fn test_reset() {
        let mut analytics = CacheAnalytics::new();
        let input = json!({"file_path": "/file.txt"});

        analytics.track_invocation("Read", &input, 100);
        analytics.track_invocation("Read", &input, 100);

        analytics.reset();

        assert_eq!(analytics.potential_cache_hits, 0);
        assert_eq!(analytics.potential_savings_tokens, 0);
        assert_eq!(analytics.unique_invocations(), 0);
    }

    #[test]
    fn test_serialization() {
        let mut analytics = CacheAnalytics::new();
        analytics.potential_cache_hits = 10;
        analytics.potential_savings_tokens = 500;

        let json = serde_json::to_string(&analytics).expect("Failed to serialize");
        let deserialized: CacheAnalytics =
            serde_json::from_str(&json).expect("Failed to deserialize");

        assert_eq!(deserialized.potential_cache_hits, 10);
        assert_eq!(deserialized.potential_savings_tokens, 500);
        // recent_invocations is skipped in serialization
        assert_eq!(deserialized.unique_invocations(), 0);
    }
}
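Reviewer note: `compute_key` folds the tool name and the serialized input into one `DefaultHasher` to get a single `u64` lookup key. A stdlib-only sketch of the same keying idea, using a plain JSON string where the crate uses `serde_json::Value` (so serde is not assumed here); note `DefaultHasher`'s algorithm is unspecified and may change between Rust releases, so keys like these should never be persisted:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Hash (tool_name, serialized_input) into a single u64 key.
fn compute_key(tool_name: &str, input_json: &str) -> u64 {
    let mut hasher = DefaultHasher::new();
    tool_name.hash(&mut hasher);
    input_json.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    let a = compute_key("Read", r#"{"file_path":"/tmp/a.txt"}"#);
    let b = compute_key("Read", r#"{"file_path":"/tmp/a.txt"}"#);
    let c = compute_key("Glob", r#"{"file_path":"/tmp/a.txt"}"#);

    assert_eq!(a, b); // same tool + same input -> same key within a process
    assert_ne!(a, c); // different tool -> almost certainly a different key
    println!("ok"); // ok
}
```

Seeing a key again is what `track_invocation` counts as a potential cache hit; the in-memory map makes this a per-session heuristic only.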
@@ -0,0 +1,48 @@
use tauri::{
    menu::{Menu, MenuItem},
    tray::{MouseButton, MouseButtonState, TrayIconBuilder, TrayIconEvent},
    AppHandle, Manager,
};

pub fn setup_tray(app: &AppHandle) -> tauri::Result<()> {
    let show_item = MenuItem::with_id(app, "show", "Show Hikari", true, None::<&str>)?;
    let quit_item = MenuItem::with_id(app, "quit", "Quit", true, None::<&str>)?;

    let menu = Menu::with_items(app, &[&show_item, &quit_item])?;

    let _tray = TrayIconBuilder::with_id("main")
        .icon(app.default_window_icon().unwrap().clone())
        .menu(&menu)
        .tooltip("Hikari - Claude Code Assistant")
        .on_menu_event(|app, event| match event.id.as_ref() {
            "show" => {
                if let Some(window) = app.get_webview_window("main") {
                    let _ = window.show();
                    let _ = window.unminimize();
                    let _ = window.set_focus();
                }
            }
            "quit" => {
                app.exit(0);
            }
            _ => {}
        })
        .on_tray_icon_event(|tray, event| {
            if let TrayIconEvent::Click {
                button: MouseButton::Left,
                button_state: MouseButtonState::Up,
                ..
            } = event
            {
                let app = tray.app_handle();
                if let Some(window) = app.get_webview_window("main") {
                    let _ = window.show();
                    let _ = window.unminimize();
                    let _ = window.set_focus();
                }
            }
        })
        .build(app)?;

    Ok(())
}
@@ -4,6 +4,10 @@ use serde::{Deserialize, Serialize};
pub struct UsageInfo {
    pub input_tokens: u64,
    pub output_tokens: u64,
    #[serde(default)]
    pub cache_creation_input_tokens: Option<u64>,
    #[serde(default)]
    pub cache_read_input_tokens: Option<u64>,
}

#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Default)]
@@ -59,6 +63,26 @@ pub struct PermissionDenial {
    pub tool_input: serde_json::Value,
}

/// Rate limit information from a `rate_limit_event` message.
/// All fields are optional to ensure forward-compatibility as the Claude CLI evolves.
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct RateLimitInfo {
    #[serde(default)]
    pub requests_limit: Option<u64>,
    #[serde(default)]
    pub requests_remaining: Option<u64>,
    #[serde(default)]
    pub requests_reset: Option<String>,
    #[serde(default)]
    pub tokens_limit: Option<u64>,
    #[serde(default)]
    pub tokens_remaining: Option<u64>,
    #[serde(default)]
    pub tokens_reset: Option<String>,
    #[serde(default)]
    pub retry_after_ms: Option<u64>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(tag = "type")]
pub enum ClaudeMessage {
@@ -71,6 +95,9 @@ pub enum ClaudeMessage {
        cwd: Option<String>,
        #[serde(default)]
        tools: Option<Vec<String>>,
        /// Output style hint from Claude Code (v2.1.81+). Informational only.
        #[serde(default)]
        output_style: Option<String>,
    },
    #[serde(rename = "assistant")]
    Assistant {
@@ -95,6 +122,20 @@ pub enum ClaudeMessage {
        permission_denials: Option<Vec<PermissionDenial>>,
        #[serde(default)]
        usage: Option<UsageInfo>,
        /// Fast mode state from Claude Code v2.1.81+. Values: "default" | "enabled" | "disabled".
        #[serde(default)]
        fast_mode_state: Option<String>,
        /// Per-model usage breakdown from Claude Code v2.1.81+.
        #[serde(default)]
        model_usage: Option<serde_json::Value>,
        /// Authoritative total cost in USD reported by Claude Code v2.1.81+.
        #[serde(default)]
        total_cost_usd: Option<f64>,
    },
    #[serde(rename = "rate_limit_event")]
    RateLimitEvent {
        #[serde(default)]
        rate_limit_info: RateLimitInfo,
    },
}

@@ -176,6 +217,14 @@ pub struct StateChangeEvent {
    pub conversation_id: Option<String>,
}

/// Cost information for a message
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct MessageCost {
    pub input_tokens: u64,
    pub output_tokens: u64,
    pub cost_usd: f64,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct OutputEvent {
    pub line_type: String,
@@ -183,14 +232,23 @@ pub struct OutputEvent {
    pub tool_name: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub cost: Option<MessageCost>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub parent_tool_use_id: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PermissionPromptEventItem {
    pub id: String,
    pub tool_name: String,
    pub tool_input: serde_json::Value,
    pub description: String,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PermissionPromptEvent {
    pub permissions: Vec<PermissionPromptEventItem>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
}
@@ -216,6 +274,162 @@ pub struct WorkingDirectoryEvent {
    pub conversation_id: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct QuestionOption {
    pub label: String,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub description: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct UserQuestionEvent {
    pub id: String,
    pub question: String,
    pub header: Option<String>,
    pub options: Vec<QuestionOption>,
    pub multi_select: bool,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ElicitationEvent {
    pub message: String,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub server_name: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub request_id: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct ElicitationResultEvent {
    pub action: String,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub request_id: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct StopFailureEvent {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub stop_reason: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub error_type: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PostCompactEvent {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub session_id: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CwdChangedEvent {
    pub cwd: String,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct FileChangedEvent {
    pub file: String,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct TaskCreatedEvent {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub task_id: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub description: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub parent_tool_use_id: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct PermissionDeniedEvent {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub tool_name: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub reason: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct AgentStartEvent {
    pub tool_use_id: String,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub agent_id: Option<String>,
    pub description: String,
    pub subagent_type: String,
    pub started_at: u64,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub parent_tool_use_id: Option<String>,
    #[serde(skip_serializing_if = "Option::is_none")]
    pub model: Option<String>,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct WorktreeInfo {
    pub name: String,
    pub path: String,
    pub branch: String,
    pub original_repo_directory: String,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct WorktreeEvent {
    #[serde(skip_serializing_if = "Option::is_none")]
    pub conversation_id: Option<String>,
    /// "create" or "remove"
    pub event_type: String,
|
||||||
|
#[serde(skip_serializing_if = "Option::is_none")]
|
||||||
|
pub worktree: Option<WorktreeInfo>,
|
||||||
|
}
|
||||||
|
|
||||||
|
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||||
|
pub struct AgentEndEvent {
|
||||||
|
pub tool_use_id: String,
|
||||||
|
pub ended_at: u64,
|
||||||
|
pub is_error: bool,
|
||||||
|
#[serde(skip_serializing_if = "Option::is_none")]
|
||||||
|
pub conversation_id: Option<String>,
|
||||||
|
#[serde(skip_serializing_if = "Option::is_none")]
|
||||||
|
pub duration_ms: Option<u64>,
|
||||||
|
#[serde(skip_serializing_if = "Option::is_none")]
|
||||||
|
pub num_turns: Option<u32>,
|
||||||
|
#[serde(skip_serializing_if = "Option::is_none")]
|
||||||
|
pub last_assistant_message: Option<String>,
|
||||||
|
}
|
||||||
|
|
||||||
|
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||||
|
pub struct TodoItem {
|
||||||
|
pub content: String,
|
||||||
|
pub status: String, // "pending", "in_progress", or "completed"
|
||||||
|
#[serde(rename = "activeForm")]
|
||||||
|
pub active_form: String,
|
||||||
|
}
|
||||||
|
|
||||||
|
#[derive(Debug, Clone, Serialize, Deserialize)]
|
||||||
|
pub struct TodoUpdateEvent {
|
||||||
|
pub todos: Vec<TodoItem>,
|
||||||
|
#[serde(skip_serializing_if = "Option::is_none")]
|
||||||
|
pub conversation_id: Option<String>,
|
||||||
|
}
|
||||||
|
|
||||||
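The `#[serde(skip_serializing_if = "Option::is_none")]` attribute used throughout these structs omits `None` fields from the JSON output entirely rather than emitting them as `null`. A minimal std-only sketch of the resulting shape (the `elicitation_result_to_json` helper is hypothetical, for illustration only; the real structs rely on serde derive):

```rust
// Hypothetical std-only illustration of what
// #[serde(skip_serializing_if = "Option::is_none")] yields for
// ElicitationResultEvent: None fields are dropped, not serialized as null.
fn elicitation_result_to_json(action: &str, request_id: Option<&str>) -> String {
    let mut fields = vec![format!("\"action\":\"{action}\"")];
    if let Some(id) = request_id {
        // Present fields are serialized; absent ones never appear in the JSON.
        fields.push(format!("\"request_id\":\"{id}\""));
    }
    format!("{{{}}}", fields.join(","))
}

fn main() {
    // Mirrors test_elicitation_result_event_cancel_omits_none_fields below:
    // a cancel with no request id serializes to just the action.
    println!("{}", elicitation_result_to_json("cancel", None));
    println!("{}", elicitation_result_to_json("accept", Some("req-123")));
}
```

This is why the tests below can assert `!serialized.contains("request_id")` rather than checking for a `null` value.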
 #[cfg(test)]
 mod tests {
     use super::*;
@@ -336,10 +550,439 @@ mod tests {
             content: "Test output".to_string(),
             tool_name: None,
             conversation_id: None,
+            cost: None,
+            parent_tool_use_id: None,
         };

         let serialized = serde_json::to_string(&event).unwrap();
         assert!(serialized.contains("\"line_type\":\"assistant\""));
         assert!(serialized.contains("\"content\":\"Test output\""));
     }
+
+    #[test]
+    fn test_output_event_with_cost() {
+        let event = OutputEvent {
+            line_type: "assistant".to_string(),
+            content: "Test output".to_string(),
+            tool_name: None,
+            conversation_id: Some("conv-123".to_string()),
+            cost: Some(MessageCost {
+                input_tokens: 100,
+                output_tokens: 50,
+                cost_usd: 0.005,
+            }),
+            parent_tool_use_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"cost\":"));
+        assert!(serialized.contains("\"input_tokens\":100"));
+        assert!(serialized.contains("\"output_tokens\":50"));
+    }
+
+    #[test]
+    fn test_rate_limit_info_default() {
+        let info = RateLimitInfo::default();
+        assert!(info.requests_limit.is_none());
+        assert!(info.requests_remaining.is_none());
+        assert!(info.requests_reset.is_none());
+        assert!(info.tokens_limit.is_none());
+        assert!(info.tokens_remaining.is_none());
+        assert!(info.tokens_reset.is_none());
+        assert!(info.retry_after_ms.is_none());
+    }
+
+    #[test]
+    fn test_rate_limit_event_deserialization_empty_info() {
+        let json = r#"{"type":"rate_limit_event","rate_limit_info":{}}"#;
+        let msg: ClaudeMessage = serde_json::from_str(json).unwrap();
+        assert!(matches!(msg, ClaudeMessage::RateLimitEvent { .. }));
+    }
+
+    #[test]
+    fn test_rate_limit_event_deserialization_no_info() {
+        // rate_limit_info field is optional via #[serde(default)]
+        let json = r#"{"type":"rate_limit_event"}"#;
+        let msg: ClaudeMessage = serde_json::from_str(json).unwrap();
+        assert!(matches!(msg, ClaudeMessage::RateLimitEvent { .. }));
+    }
+
+    #[test]
+    fn test_rate_limit_event_deserialization_with_data() {
+        let json = r#"{
+            "type": "rate_limit_event",
+            "rate_limit_info": {
+                "requests_limit": 1000,
+                "requests_remaining": 0,
+                "requests_reset": "2024-01-01T00:01:00Z",
+                "tokens_limit": 50000,
+                "tokens_remaining": 0,
+                "tokens_reset": "2024-01-01T00:01:00Z",
+                "retry_after_ms": 60000
+            }
+        }"#;
+        let msg: ClaudeMessage = serde_json::from_str(json).unwrap();
+        if let ClaudeMessage::RateLimitEvent { rate_limit_info } = msg {
+            assert_eq!(rate_limit_info.requests_limit, Some(1000));
+            assert_eq!(rate_limit_info.requests_remaining, Some(0));
+            assert_eq!(
+                rate_limit_info.requests_reset,
+                Some("2024-01-01T00:01:00Z".to_string())
+            );
+            assert_eq!(rate_limit_info.retry_after_ms, Some(60000));
+        } else {
+            panic!("Expected RateLimitEvent variant");
+        }
+    }
+
+    #[test]
+    fn test_rate_limit_event_ignores_unknown_fields() {
+        // Ensures forward-compat: unknown fields in rate_limit_info are silently ignored
+        let json = r#"{
+            "type": "rate_limit_event",
+            "rate_limit_info": {
+                "requests_remaining": 0,
+                "some_future_field": "some_value"
+            }
+        }"#;
+        let msg: ClaudeMessage = serde_json::from_str(json).unwrap();
+        if let ClaudeMessage::RateLimitEvent { rate_limit_info } = msg {
+            assert_eq!(rate_limit_info.requests_remaining, Some(0));
+        } else {
+            panic!("Expected RateLimitEvent variant");
+        }
+    }
+
+    #[test]
+    fn test_elicitation_event_serialization() {
+        let event = ElicitationEvent {
+            message: "Please provide the API endpoint".to_string(),
+            server_name: Some("my-server".to_string()),
+            request_id: Some("req-123".to_string()),
+            conversation_id: Some("conv-abc".to_string()),
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"message\":\"Please provide the API endpoint\""));
+        assert!(serialized.contains("\"server_name\":\"my-server\""));
+        assert!(serialized.contains("\"request_id\":\"req-123\""));
+        assert!(serialized.contains("\"conversation_id\":\"conv-abc\""));
+    }
+
+    #[test]
+    fn test_elicitation_event_omits_none_fields() {
+        let event = ElicitationEvent {
+            message: "Enter your token".to_string(),
+            server_name: None,
+            request_id: None,
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"message\":\"Enter your token\""));
+        assert!(!serialized.contains("server_name"));
+        assert!(!serialized.contains("request_id"));
+        assert!(!serialized.contains("conversation_id"));
+    }
+
+    #[test]
+    fn test_elicitation_result_event_serialization() {
+        let event = ElicitationResultEvent {
+            action: "accept".to_string(),
+            request_id: Some("req-123".to_string()),
+            conversation_id: Some("conv-abc".to_string()),
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"action\":\"accept\""));
+        assert!(serialized.contains("\"request_id\":\"req-123\""));
+    }
+
+    #[test]
+    fn test_elicitation_result_event_cancel_omits_none_fields() {
+        let event = ElicitationResultEvent {
+            action: "cancel".to_string(),
+            request_id: None,
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"action\":\"cancel\""));
+        assert!(!serialized.contains("request_id"));
+        assert!(!serialized.contains("conversation_id"));
+    }
+
+    #[test]
+    fn test_stop_failure_event_serialization() {
+        let event = StopFailureEvent {
+            stop_reason: Some("api_error".to_string()),
+            error_type: Some("rate_limit".to_string()),
+            conversation_id: Some("conv-abc".to_string()),
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"stop_reason\":\"api_error\""));
+        assert!(serialized.contains("\"error_type\":\"rate_limit\""));
+        assert!(serialized.contains("\"conversation_id\":\"conv-abc\""));
+    }
+
+    #[test]
+    fn test_stop_failure_event_omits_none_fields() {
+        let event = StopFailureEvent {
+            stop_reason: None,
+            error_type: None,
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(!serialized.contains("stop_reason"));
+        assert!(!serialized.contains("error_type"));
+        assert!(!serialized.contains("conversation_id"));
+    }
+
+    #[test]
+    fn test_stop_failure_event_partial_fields() {
+        let event = StopFailureEvent {
+            stop_reason: Some("api_error".to_string()),
+            error_type: None,
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"stop_reason\":\"api_error\""));
+        assert!(!serialized.contains("error_type"));
+        assert!(!serialized.contains("conversation_id"));
+    }
+
+    #[test]
+    fn test_post_compact_event_serialization() {
+        let event = PostCompactEvent {
+            session_id: Some("sess-abc".to_string()),
+            conversation_id: Some("conv-123".to_string()),
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"session_id\":\"sess-abc\""));
+        assert!(serialized.contains("\"conversation_id\":\"conv-123\""));
+    }
+
+    #[test]
+    fn test_post_compact_event_omits_none_fields() {
+        let event = PostCompactEvent {
+            session_id: None,
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(!serialized.contains("session_id"));
+        assert!(!serialized.contains("conversation_id"));
+    }
+
+    #[test]
+    fn test_post_compact_event_partial_fields() {
+        let event = PostCompactEvent {
+            session_id: Some("sess-xyz".to_string()),
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"session_id\":\"sess-xyz\""));
+        assert!(!serialized.contains("conversation_id"));
+    }
+
+    #[test]
+    fn test_cwd_changed_event_serialization() {
+        let event = CwdChangedEvent {
+            cwd: "/home/naomi/code/my-project".to_string(),
+            conversation_id: Some("conv-abc".to_string()),
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"cwd\":\"/home/naomi/code/my-project\""));
+        assert!(serialized.contains("\"conversation_id\":\"conv-abc\""));
+    }
+
+    #[test]
+    fn test_cwd_changed_event_omits_none_fields() {
+        let event = CwdChangedEvent {
+            cwd: "/tmp/workspace".to_string(),
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"cwd\":\"/tmp/workspace\""));
+        assert!(!serialized.contains("conversation_id"));
+    }
+
+    #[test]
+    fn test_file_changed_event_serialization() {
+        let event = FileChangedEvent {
+            file: "/home/naomi/code/my-project/src/main.rs".to_string(),
+            conversation_id: Some("conv-abc".to_string()),
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"file\":\"/home/naomi/code/my-project/src/main.rs\""));
+        assert!(serialized.contains("\"conversation_id\":\"conv-abc\""));
+    }
+
+    #[test]
+    fn test_file_changed_event_omits_none_fields() {
+        let event = FileChangedEvent {
+            file: "/tmp/test.txt".to_string(),
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"file\":\"/tmp/test.txt\""));
+        assert!(!serialized.contains("conversation_id"));
+    }
+
+    #[test]
+    fn test_task_created_event_serialization() {
+        let event = TaskCreatedEvent {
+            task_id: Some("task-abc123".to_string()),
+            description: Some("Explore the codebase".to_string()),
+            parent_tool_use_id: Some("toolu_xyz".to_string()),
+            conversation_id: Some("conv-abc".to_string()),
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"task_id\":\"task-abc123\""));
+        assert!(serialized.contains("\"description\":\"Explore the codebase\""));
+        assert!(serialized.contains("\"parent_tool_use_id\":\"toolu_xyz\""));
+        assert!(serialized.contains("\"conversation_id\":\"conv-abc\""));
+    }
+
+    #[test]
+    fn test_task_created_event_omits_none_fields() {
+        let event = TaskCreatedEvent {
+            task_id: None,
+            description: None,
+            parent_tool_use_id: None,
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert_eq!(serialized, "{}");
+    }
+
+    #[test]
+    fn test_task_created_event_partial_fields() {
+        let event = TaskCreatedEvent {
+            task_id: Some("task-001".to_string()),
+            description: None,
+            parent_tool_use_id: None,
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"task_id\":\"task-001\""));
+        assert!(!serialized.contains("description"));
+        assert!(!serialized.contains("parent_tool_use_id"));
+        assert!(!serialized.contains("conversation_id"));
+    }
+
+    #[test]
+    fn test_permission_denied_event_serialization() {
+        let event = PermissionDeniedEvent {
+            tool_name: Some("Bash".to_string()),
+            reason: Some("Tool not in allow list".to_string()),
+            conversation_id: Some("conv-abc".to_string()),
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"tool_name\":\"Bash\""));
+        assert!(serialized.contains("\"reason\":\"Tool not in allow list\""));
+        assert!(serialized.contains("\"conversation_id\":\"conv-abc\""));
+    }
+
+    #[test]
+    fn test_permission_denied_event_omits_none_fields() {
+        let event = PermissionDeniedEvent {
+            tool_name: None,
+            reason: None,
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert_eq!(serialized, "{}");
+    }
+
+    #[test]
+    fn test_permission_denied_event_partial_fields() {
+        let event = PermissionDeniedEvent {
+            tool_name: Some("Edit".to_string()),
+            reason: None,
+            conversation_id: None,
+        };
+
+        let serialized = serde_json::to_string(&event).unwrap();
+        assert!(serialized.contains("\"tool_name\":\"Edit\""));
+        assert!(!serialized.contains("reason"));
+        assert!(!serialized.contains("conversation_id"));
+    }
+
+    #[test]
+    fn test_system_init_with_output_style() {
+        let json = r#"{"type":"system","subtype":"init","session_id":"sess-1","output_style":"auto"}"#;
+        let msg: ClaudeMessage = serde_json::from_str(json).unwrap();
+        if let ClaudeMessage::System { output_style, .. } = msg {
+            assert_eq!(output_style, Some("auto".to_string()));
+        } else {
+            panic!("Expected System variant");
+        }
+    }
+
+    #[test]
+    fn test_system_init_without_output_style() {
+        let json = r#"{"type":"system","subtype":"init","session_id":"sess-1"}"#;
+        let msg: ClaudeMessage = serde_json::from_str(json).unwrap();
+        if let ClaudeMessage::System { output_style, .. } = msg {
+            assert!(output_style.is_none());
+        } else {
+            panic!("Expected System variant");
+        }
+    }
+
+    #[test]
+    fn test_result_message_with_fast_mode_state() {
+        let json = r#"{"type":"result","subtype":"success","fast_mode_state":"enabled"}"#;
+        let msg: ClaudeMessage = serde_json::from_str(json).unwrap();
+        if let ClaudeMessage::Result { fast_mode_state, .. } = msg {
+            assert_eq!(fast_mode_state, Some("enabled".to_string()));
+        } else {
+            panic!("Expected Result variant");
+        }
+    }
+
+    #[test]
+    fn test_result_message_with_total_cost_usd() {
+        let json = r#"{"type":"result","subtype":"success","total_cost_usd":0.05}"#;
+        let msg: ClaudeMessage = serde_json::from_str(json).unwrap();
+        if let ClaudeMessage::Result { total_cost_usd, .. } = msg {
+            assert!((total_cost_usd.unwrap() - 0.05).abs() < f64::EPSILON);
+        } else {
+            panic!("Expected Result variant");
+        }
+    }
+
+    #[test]
+    fn test_result_message_without_new_fields() {
+        let json = r#"{"type":"result","subtype":"success"}"#;
+        let msg: ClaudeMessage = serde_json::from_str(json).unwrap();
+        if let ClaudeMessage::Result {
+            fast_mode_state,
+            model_usage,
+            total_cost_usd,
+            ..
+        } = msg
+        {
+            assert!(fast_mode_state.is_none());
+            assert!(model_usage.is_none());
+            assert!(total_cost_usd.is_none());
+        } else {
+            panic!("Expected Result variant");
+        }
+    }
 }
@@ -1,7 +1,9 @@
-use std::process::Command;
 use std::io::Write;
-use tempfile::NamedTempFile;
+use std::process::Command;
 use tauri::command;
+use tempfile::NamedTempFile;
+
+use crate::process_ext::HideWindow;

 #[command]
 pub async fn send_vbs_notification(title: String, body: String) -> Result<(), String> {
@@ -17,8 +19,8 @@ objShell.Popup "{}" & vbCrLf & vbCrLf & "{}", 5, "{}", 64
     );

     // Create a temporary VBS file
-    let mut temp_file = NamedTempFile::new()
-        .map_err(|e| format!("Failed to create temp file: {}", e))?;
+    let mut temp_file =
+        NamedTempFile::new().map_err(|e| format!("Failed to create temp file: {}", e))?;

     temp_file
         .write_all(vbs_content.as_bytes())
@@ -40,10 +42,7 @@ objShell.Popup "{}" & vbCrLf & vbCrLf & "{}", 5, "{}", 64
     } else if temp_path.starts_with("/tmp/") {
         // WSL temp files might be in a different location
         // Try to use wslpath to convert
-        let output = Command::new("wslpath")
-            .arg("-w")
-            .arg(&temp_path)
-            .output();
+        let output = Command::new("wslpath").hide_window().arg("-w").arg(&temp_path).output();

         if let Ok(result) = output {
             if result.status.success() {
@@ -60,6 +59,7 @@ objShell.Popup "{}" & vbCrLf & vbCrLf & "{}", 5, "{}", 64

     // Execute the VBScript using wscript.exe
     let output = Command::new("/mnt/c/Windows/System32/wscript.exe")
+        .hide_window()
         .arg("//NoLogo")
         .arg(&windows_path)
         .output()
@@ -71,4 +71,4 @@ objShell.Popup "{}" & vbCrLf & vbCrLf & "{}", 5, "{}", 64
     }

     Ok(())
 }
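The hunk headers above show the popup call this file generates: `objShell.Popup "{}" & vbCrLf & vbCrLf & "{}", 5, "{}", 64`, i.e. a message with a blank line, a 5-second auto-dismiss timeout, a window title, and flag 64 (the VBScript information icon). A std-only sketch of how such a script string might be assembled before being written to the temp file (function name and placeholder order are assumptions; only the template line appears in the diff):

```rust
// Hypothetical sketch of the VBScript body that send_vbs_notification writes
// to a NamedTempFile. Placeholder order (heading/body/title) is assumed; the
// diff only shows the template. 5 = seconds before auto-dismiss, 64 = info icon.
fn build_vbs_popup(heading: &str, body: &str, title: &str) -> String {
    format!(
        "Set objShell = CreateObject(\"WScript.Shell\")\nobjShell.Popup \"{heading}\" & vbCrLf & vbCrLf & \"{body}\", 5, \"{title}\", 64"
    )
}

fn main() {
    println!("{}", build_vbs_popup("Hikari", "Build finished", "Hikari"));
}
```

The real function additionally converts the temp path with `wslpath -w` when running under WSL, since `wscript.exe` needs a Windows-style path.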
@@ -2,7 +2,7 @@ use tauri::command;

 #[cfg(target_os = "windows")]
 use windows::{
-    core::{HSTRING, Result as WindowsResult},
+    core::{Result as WindowsResult, HSTRING},
     Data::Xml::Dom::*,
     UI::Notifications::*,
 };
@@ -38,7 +38,8 @@ fn show_toast_notification(title: &str, body: &str) -> WindowsResult<()> {
     let toast = ToastNotification::CreateToastNotification(&xml_doc)?;

     // Create a toast notifier with an application ID
-    let notifier = ToastNotificationManager::CreateToastNotifierWithId(&HSTRING::from("Hikari Desktop"))?;
+    let notifier =
+        ToastNotificationManager::CreateToastNotifierWithId(&HSTRING::from("Hikari Desktop"))?;

     // Show the notification
     notifier.Show(&toast)?;
@@ -60,4 +61,4 @@ fn escape_xml(text: &str) -> String {
 #[command]
 pub async fn send_windows_toast(_title: String, _body: String) -> Result<(), String> {
     Err("Windows toast notifications are only available on Windows".to_string())
 }
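The last hunk header references an `escape_xml` helper whose body falls outside this diff; the toast XML payload must escape user-supplied title and body text before it is parsed into the `XmlDocument`. A plausible std-only sketch (the exact escape set is an assumption inferred from the function name, not taken from the diff):

```rust
// Assumed sketch of escape_xml: replace the five XML-reserved characters.
// '&' must be replaced first so the entities added by the later replacements
// are not themselves double-escaped.
fn escape_xml(text: &str) -> String {
    text.replace('&', "&amp;")
        .replace('<', "&lt;")
        .replace('>', "&gt;")
        .replace('"', "&quot;")
        .replace('\'', "&apos;")
}

fn main() {
    println!("{}", escape_xml("<b>\"hi\" & 'bye'</b>"));
}
```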
@@ -1,6 +1,8 @@
|
|||||||
use std::process::Command;
|
use std::process::Command;
|
||||||
use tauri::command;
|
use tauri::command;
|
||||||
|
|
||||||
|
use crate::process_ext::HideWindow;
|
||||||
|
|
||||||
#[command]
|
#[command]
|
||||||
pub async fn send_wsl_notification(title: String, body: String) -> Result<(), String> {
|
pub async fn send_wsl_notification(title: String, body: String) -> Result<(), String> {
|
||||||
// Method 1: Try Windows 10/11 toast notification using PowerShell
|
// Method 1: Try Windows 10/11 toast notification using PowerShell
|
||||||
@@ -36,6 +38,7 @@ $notifier.Show($toast)
|
|||||||
|
|
||||||
// Try PowerShell.exe through WSL
|
// Try PowerShell.exe through WSL
|
||||||
let output = Command::new("/mnt/c/Windows/System32/WindowsPowerShell/v1.0/powershell.exe")
|
let output = Command::new("/mnt/c/Windows/System32/WindowsPowerShell/v1.0/powershell.exe")
|
||||||
|
.hide_window()
|
||||||
.arg("-NoProfile")
|
.arg("-NoProfile")
|
||||||
.arg("-ExecutionPolicy")
|
.arg("-ExecutionPolicy")
|
||||||
.arg("Bypass")
|
.arg("Bypass")
|
||||||
@@ -48,15 +51,15 @@ $notifier.Show($toast)
|
|||||||
match output {
|
match output {
|
||||||
Ok(result) => {
|
Ok(result) => {
|
||||||
if result.status.success() {
|
if result.status.success() {
|
||||||
println!("WSL notification sent successfully");
|
tracing::info!("WSL notification sent successfully");
|
||||||
return Ok(());
|
return Ok(());
|
||||||
} else {
|
} else {
|
||||||
let stderr = String::from_utf8_lossy(&result.stderr);
|
let stderr = String::from_utf8_lossy(&result.stderr);
|
||||||
println!("PowerShell toast failed: {}", stderr);
|
tracing::error!("PowerShell toast failed: {}", stderr);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
Err(e) => {
|
Err(e) => {
|
||||||
println!("Failed to run PowerShell: {}", e);
|
tracing::error!("Failed to run PowerShell: {}", e);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -65,6 +68,7 @@ $notifier.Show($toast)
|
|||||||
|
|
||||||
// Method 3: Try wsl-notify-send if available
|
// Method 3: Try wsl-notify-send if available
|
||||||
let notify_result = Command::new("wsl-notify-send")
|
let notify_result = Command::new("wsl-notify-send")
|
||||||
|
.hide_window()
|
||||||
.arg("--appId")
|
.arg("--appId")
|
||||||
.arg("HikariDesktop")
|
.arg("HikariDesktop")
|
||||||
.arg("--category")
|
.arg("--category")
|
||||||
@@ -74,11 +78,11 @@ $notifier.Show($toast)
|
|||||||
|
|
||||||
if let Ok(result) = notify_result {
|
if let Ok(result) = notify_result {
|
||||||
if result.status.success() {
|
if result.status.success() {
|
||||||
println!("Notification sent via wsl-notify-send");
|
tracing::info!("Notification sent via wsl-notify-send");
|
||||||
return Ok(());
|
return Ok(());
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// If all methods fail, return an error
|
// If all methods fail, return an error
|
||||||
Err("All WSL notification methods failed".to_string())
|
Err("All WSL notification methods failed".to_string())
|
||||||
}
|
}
|
||||||
|
|||||||
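`send_wsl_notification` tries several delivery methods in order (Method 1 is a PowerShell toast, Method 3 is `wsl-notify-send`; the middle method falls outside this diff) and returns on the first success. The control flow reduces to a first-success fallback chain; a std-only sketch of that pattern, with hypothetical names:

```rust
// Hypothetical sketch of the first-success fallback chain used by
// send_wsl_notification: attempt each delivery method in order and stop
// at the first one that succeeds, mirroring its early `return Ok(())`s.
fn first_success(
    methods: Vec<(&'static str, Box<dyn Fn() -> Result<(), String>>)>,
) -> Result<&'static str, String> {
    for (name, try_send) in methods {
        if try_send().is_ok() {
            return Ok(name);
        }
    }
    // Mirrors the final error when every method fails.
    Err("All WSL notification methods failed".to_string())
}

fn main() {
    let methods: Vec<(&'static str, Box<dyn Fn() -> Result<(), String>>)> = vec![
        ("powershell-toast", Box::new(|| Err("powershell unavailable".to_string()))),
        ("wsl-notify-send", Box::new(|| Ok(()))),
    ];
    println!("{:?}", first_success(methods));
}
```

Keeping each method's failure non-fatal (logged via `tracing`, as the diff changes it to) is what lets later methods run at all.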
@@ -1,7 +1,7 @@
 {
   "$schema": "https://schema.tauri.app/config/2",
   "productName": "hikari-desktop",
-  "version": "0.2.0",
+  "version": "1.14.0",
   "identifier": "com.naomi.hikari-desktop",
   "build": {
     "beforeDevCommand": "pnpm dev",
@@ -22,6 +22,12 @@
     ],
     "security": {
       "csp": null
+    },
+    "trayIcon": {
+      "id": "main",
+      "iconPath": "icons/32x32.png",
+      "iconAsTemplate": false,
+      "tooltip": "Hikari - Claude Code Assistant"
     }
   },
   "bundle": {
@@ -6,6 +6,7 @@
|
|||||||
--bg-secondary: #16213e;
|
--bg-secondary: #16213e;
|
||||||
--bg-terminal: #0f0f1a;
|
--bg-terminal: #0f0f1a;
|
||||||
--bg-hover: #2a2a4a;
|
--bg-hover: #2a2a4a;
|
||||||
|
--bg-code: #1e1e2e;
|
||||||
--accent-primary: #e94560;
|
--accent-primary: #e94560;
|
||||||
--accent-secondary: #ff6b9d;
|
--accent-secondary: #ff6b9d;
|
||||||
--text-primary: #ffffff;
|
--text-primary: #ffffff;
|
||||||
@@ -13,11 +14,40 @@
|
|||||||
--text-tertiary: #6b7280;
|
--text-tertiary: #6b7280;
|
||||||
--border-color: #2a2a4a;
|
--border-color: #2a2a4a;
|
||||||
|
|
||||||
|
/* Trans pride colors */
|
||||||
|
--trans-blue: #5bcefa;
|
||||||
|
--trans-pink: #f5a9b8;
|
||||||
|
--trans-white: #ffffff;
|
||||||
|
--trans-gradient: linear-gradient(
|
||||||
|
135deg,
|
||||||
|
var(--trans-blue) 0%,
|
||||||
|
var(--trans-pink) 50%,
|
||||||
|
var(--trans-white) 100%
|
||||||
|
);
|
||||||
|
--trans-gradient-vibrant: linear-gradient(
|
||||||
|
135deg,
|
||||||
|
var(--trans-blue) 0%,
|
||||||
|
var(--trans-pink) 35%,
|
||||||
|
var(--trans-white) 50%,
|
||||||
|
var(--trans-pink) 65%,
|
||||||
|
var(--trans-blue) 100%
|
||||||
|
);
|
||||||
|
|
||||||
/* Terminal specific colors */
|
/* Terminal specific colors */
|
||||||
--terminal-user: #22d3ee;
|
--terminal-user: #22d3ee;
|
||||||
--terminal-tool: #c084fc;
|
--terminal-tool: #c084fc;
|
||||||
--terminal-tool-name: #ddd6fe;
|
--terminal-tool-name: #ddd6fe;
|
||||||
--terminal-error: #f87171;
|
--terminal-error: #f87171;
|
||||||
|
|
||||||
|
/* Syntax highlighting colors (dark) */
|
||||||
|
--hljs-keyword: #f472b6;
|
||||||
|
--hljs-string: #a3e635;
|
||||||
|
--hljs-number: #fbbf24;
|
||||||
|
--hljs-comment: #6b7280;
|
||||||
|
--hljs-function: #c084fc;
|
||||||
|
--hljs-type: #22d3ee;
|
||||||
|
--hljs-variable: #fb923c;
|
||||||
|
--hljs-meta: #94a3b8;
|
||||||
}
|
}
|
||||||
|
|
||||||
[data-theme="light"] {
@@ -25,6 +55,7 @@
  --bg-secondary: #ffffff;
  --bg-terminal: #f1f3f4;
  --bg-hover: #e8e8e8;
  --bg-code: #f5f5f5;
  --accent-primary: #e94560;
  --accent-secondary: #ff6b9d;
  --text-primary: #1a1a2e;
@@ -32,11 +63,481 @@
  --text-tertiary: #9ca3af;
  --border-color: #d0d0e0;

  /* Trans pride colors */
  --trans-blue: #5bcefa;
  --trans-pink: #f5a9b8;
  --trans-white: #ffffff;
  --trans-gradient: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 50%,
    var(--trans-white) 100%
  );
  --trans-gradient-vibrant: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 35%,
    var(--trans-white) 50%,
    var(--trans-pink) 65%,
    var(--trans-blue) 100%
  );

  /* Terminal specific colors */
  --terminal-user: #0891b2;
  --terminal-tool: #7c3aed;
  --terminal-tool-name: #8b5cf6;
  --terminal-error: #dc2626;

  /* Syntax highlighting colors (light) */
  --hljs-keyword: #d946ef;
  --hljs-string: #16a34a;
  --hljs-number: #d97706;
  --hljs-comment: #9ca3af;
  --hljs-function: #7c3aed;
  --hljs-type: #0891b2;
  --hljs-variable: #ea580c;
  --hljs-meta: #64748b;
}
[data-theme="high-contrast"] {
  --bg-primary: #000000;
  --bg-secondary: #0a0a0a;
  --bg-terminal: #000000;
  --bg-hover: #1a1a1a;
  --bg-code: #0a0a0a;
  --accent-primary: #ff4d6d;
  --accent-secondary: #ff85a1;
  --text-primary: #ffffff;
  --text-secondary: #e0e0e0;
  --text-tertiary: #b0b0b0;
  --border-color: #ffffff;

  /* Trans pride colors (high contrast) */
  --trans-blue: #00d4ff;
  --trans-pink: #ff99cc;
  --trans-white: #ffffff;
  --trans-gradient: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 50%,
    var(--trans-white) 100%
  );
  --trans-gradient-vibrant: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 35%,
    var(--trans-white) 50%,
    var(--trans-pink) 65%,
    var(--trans-blue) 100%
  );

  /* Terminal specific colors - bright and saturated */
  --terminal-user: #00ffff;
  --terminal-tool: #ff00ff;
  --terminal-tool-name: #ffaaff;
  --terminal-error: #ff5555;

  /* Syntax highlighting colors (high contrast) */
  --hljs-keyword: #ff66ff;
  --hljs-string: #66ff66;
  --hljs-number: #ffff00;
  --hljs-comment: #aaaaaa;
  --hljs-function: #ff99ff;
  --hljs-type: #00ffff;
  --hljs-variable: #ffaa00;
  --hljs-meta: #cccccc;
}
[data-theme="dracula"] {
  --bg-primary: #282a36;
  --bg-secondary: #1e1f29;
  --bg-terminal: #191a21;
  --bg-hover: #44475a;
  --bg-code: #282a36;
  --accent-primary: #bd93f9;
  --accent-secondary: #ff79c6;
  --text-primary: #f8f8f2;
  --text-secondary: #6272a4;
  --text-tertiary: #44475a;
  --border-color: #44475a;

  /* Trans pride colors */
  --trans-blue: #5bcefa;
  --trans-pink: #f5a9b8;
  --trans-white: #ffffff;
  --trans-gradient: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 50%,
    var(--trans-white) 100%
  );
  --trans-gradient-vibrant: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 35%,
    var(--trans-white) 50%,
    var(--trans-pink) 65%,
    var(--trans-blue) 100%
  );

  /* Terminal specific colors */
  --terminal-user: #8be9fd;
  --terminal-tool: #bd93f9;
  --terminal-tool-name: #caa9fa;
  --terminal-error: #ff5555;

  /* Syntax highlighting colors (Dracula) */
  --hljs-keyword: #ff79c6;
  --hljs-string: #f1fa8c;
  --hljs-number: #bd93f9;
  --hljs-comment: #6272a4;
  --hljs-function: #50fa7b;
  --hljs-type: #8be9fd;
  --hljs-variable: #ffb86c;
  --hljs-meta: #94a3b8;
}
[data-theme="catppuccin"] {
  --bg-primary: #1e1e2e;
  --bg-secondary: #181825;
  --bg-terminal: #11111b;
  --bg-hover: #313244;
  --bg-code: #1e1e2e;
  --accent-primary: #cba6f7;
  --accent-secondary: #f5c2e7;
  --text-primary: #cdd6f4;
  --text-secondary: #a6adc8;
  --text-tertiary: #6c7086;
  --border-color: #313244;

  /* Trans pride colors */
  --trans-blue: #5bcefa;
  --trans-pink: #f5a9b8;
  --trans-white: #ffffff;
  --trans-gradient: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 50%,
    var(--trans-white) 100%
  );
  --trans-gradient-vibrant: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 35%,
    var(--trans-white) 50%,
    var(--trans-pink) 65%,
    var(--trans-blue) 100%
  );

  /* Terminal specific colors */
  --terminal-user: #89dceb;
  --terminal-tool: #cba6f7;
  --terminal-tool-name: #d9b3ff;
  --terminal-error: #f38ba8;

  /* Syntax highlighting colors (Catppuccin Mocha) */
  --hljs-keyword: #cba6f7;
  --hljs-string: #a6e3a1;
  --hljs-number: #fab387;
  --hljs-comment: #6c7086;
  --hljs-function: #89b4fa;
  --hljs-type: #89dceb;
  --hljs-variable: #fab387;
  --hljs-meta: #a6adc8;
}
[data-theme="nord"] {
  --bg-primary: #2e3440;
  --bg-secondary: #3b4252;
  --bg-terminal: #242933;
  --bg-hover: #434c5e;
  --bg-code: #2e3440;
  --accent-primary: #88c0d0;
  --accent-secondary: #81a1c1;
  --text-primary: #eceff4;
  --text-secondary: #d8dee9;
  --text-tertiary: #4c566a;
  --border-color: #434c5e;

  /* Trans pride colors */
  --trans-blue: #5bcefa;
  --trans-pink: #f5a9b8;
  --trans-white: #ffffff;
  --trans-gradient: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 50%,
    var(--trans-white) 100%
  );
  --trans-gradient-vibrant: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 35%,
    var(--trans-white) 50%,
    var(--trans-pink) 65%,
    var(--trans-blue) 100%
  );

  /* Terminal specific colors */
  --terminal-user: #88c0d0;
  --terminal-tool: #b48ead;
  --terminal-tool-name: #c7a8c9;
  --terminal-error: #bf616a;

  /* Syntax highlighting colors (Nord) */
  --hljs-keyword: #81a1c1;
  --hljs-string: #a3be8c;
  --hljs-number: #b48ead;
  --hljs-comment: #4c566a;
  --hljs-function: #88c0d0;
  --hljs-type: #8fbcbb;
  --hljs-variable: #d08770;
  --hljs-meta: #616e88;
}
[data-theme="solarized"] {
  --bg-primary: #002b36;
  --bg-secondary: #073642;
  --bg-terminal: #00212b;
  --bg-hover: #094656;
  --bg-code: #002b36;
  --accent-primary: #268bd2;
  --accent-secondary: #2aa198;
  --text-primary: #fdf6e3;
  --text-secondary: #93a1a1;
  --text-tertiary: #657b83;
  --border-color: #094656;

  /* Trans pride colors */
  --trans-blue: #5bcefa;
  --trans-pink: #f5a9b8;
  --trans-white: #ffffff;
  --trans-gradient: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 50%,
    var(--trans-white) 100%
  );
  --trans-gradient-vibrant: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 35%,
    var(--trans-white) 50%,
    var(--trans-pink) 65%,
    var(--trans-blue) 100%
  );

  /* Terminal specific colors */
  --terminal-user: #2aa198;
  --terminal-tool: #6c71c4;
  --terminal-tool-name: #9395d0;
  --terminal-error: #dc322f;

  /* Syntax highlighting colors (Solarized Dark) */
  --hljs-keyword: #859900;
  --hljs-string: #2aa198;
  --hljs-number: #d33682;
  --hljs-comment: #586e75;
  --hljs-function: #268bd2;
  --hljs-type: #b58900;
  --hljs-variable: #cb4b16;
  --hljs-meta: #657b83;
}
[data-theme="solarized-light"] {
  --bg-primary: #fdf6e3;
  --bg-secondary: #eee8d5;
  --bg-terminal: #f9f3d7;
  --bg-hover: #d8d1be;
  --bg-code: #eee8d5;
  --accent-primary: #268bd2;
  --accent-secondary: #2aa198;
  --text-primary: #657b83;
  --text-secondary: #839496;
  --text-tertiary: #93a1a1;
  --border-color: #cfc9b5;

  /* Trans pride colors */
  --trans-blue: #5bcefa;
  --trans-pink: #f5a9b8;
  --trans-white: #ffffff;
  --trans-gradient: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 50%,
    var(--trans-white) 100%
  );
  --trans-gradient-vibrant: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 35%,
    var(--trans-white) 50%,
    var(--trans-pink) 65%,
    var(--trans-blue) 100%
  );

  /* Terminal specific colors */
  --terminal-user: #268bd2;
  --terminal-tool: #6c71c4;
  --terminal-tool-name: #8f94cc;
  --terminal-error: #dc322f;

  /* Syntax highlighting colors (Solarized Light) */
  --hljs-keyword: #859900;
  --hljs-string: #2aa198;
  --hljs-number: #d33682;
  --hljs-comment: #93a1a1;
  --hljs-function: #268bd2;
  --hljs-type: #b58900;
  --hljs-variable: #cb4b16;
  --hljs-meta: #657b83;
}
[data-theme="catppuccin-latte"] {
  --bg-primary: #eff1f5;
  --bg-secondary: #e6e9ef;
  --bg-terminal: #dce0e8;
  --bg-hover: #ccd0da;
  --bg-code: #e6e9ef;
  --accent-primary: #8839ef;
  --accent-secondary: #ea76cb;
  --text-primary: #4c4f69;
  --text-secondary: #6c6f85;
  --text-tertiary: #9ca0b0;
  --border-color: #bcc0cc;

  /* Trans pride colors */
  --trans-blue: #5bcefa;
  --trans-pink: #f5a9b8;
  --trans-white: #ffffff;
  --trans-gradient: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 50%,
    var(--trans-white) 100%
  );
  --trans-gradient-vibrant: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 35%,
    var(--trans-white) 50%,
    var(--trans-pink) 65%,
    var(--trans-blue) 100%
  );

  /* Terminal specific colors */
  --terminal-user: #209fb5;
  --terminal-tool: #8839ef;
  --terminal-tool-name: #a259f1;
  --terminal-error: #d20f39;

  /* Syntax highlighting colors (Catppuccin Latte) */
  --hljs-keyword: #8839ef;
  --hljs-string: #40a02b;
  --hljs-number: #fe640b;
  --hljs-comment: #8c8fa1;
  --hljs-function: #1e66f5;
  --hljs-type: #209fb5;
  --hljs-variable: #fe640b;
  --hljs-meta: #5c5f77;
}
[data-theme="gruvbox-light"] {
  --bg-primary: #fbf1c7;
  --bg-secondary: #ebdbb2;
  --bg-terminal: #f9f5d7;
  --bg-hover: #d5c4a1;
  --bg-code: #ebdbb2;
  --accent-primary: #458588;
  --accent-secondary: #689d6a;
  --text-primary: #3c3836;
  --text-secondary: #665c54;
  --text-tertiary: #7c6f64;
  --border-color: #bdae93;

  /* Trans pride colors */
  --trans-blue: #5bcefa;
  --trans-pink: #f5a9b8;
  --trans-white: #ffffff;
  --trans-gradient: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 50%,
    var(--trans-white) 100%
  );
  --trans-gradient-vibrant: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 35%,
    var(--trans-white) 50%,
    var(--trans-pink) 65%,
    var(--trans-blue) 100%
  );

  /* Terminal specific colors */
  --terminal-user: #458588;
  --terminal-tool: #b16286;
  --terminal-tool-name: #c37aa0;
  --terminal-error: #cc241d;

  /* Syntax highlighting colors (Gruvbox Light) */
  --hljs-keyword: #d65d0e;
  --hljs-string: #98971a;
  --hljs-number: #b16286;
  --hljs-comment: #928374;
  --hljs-function: #458588;
  --hljs-type: #d79921;
  --hljs-variable: #af3a03;
  --hljs-meta: #7c6f64;
}
[data-theme="rose-pine-dawn"] {
  --bg-primary: #faf4ed;
  --bg-secondary: #fffaf3;
  --bg-terminal: #f2e9e1;
  --bg-hover: #dfdad9;
  --bg-code: #fffaf3;
  --accent-primary: #907aa9;
  --accent-secondary: #d7827e;
  --text-primary: #575279;
  --text-secondary: #797593;
  --text-tertiary: #9893a5;
  --border-color: #cecacd;

  /* Trans pride colors */
  --trans-blue: #5bcefa;
  --trans-pink: #f5a9b8;
  --trans-white: #ffffff;
  --trans-gradient: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 50%,
    var(--trans-white) 100%
  );
  --trans-gradient-vibrant: linear-gradient(
    135deg,
    var(--trans-blue) 0%,
    var(--trans-pink) 35%,
    var(--trans-white) 50%,
    var(--trans-pink) 65%,
    var(--trans-blue) 100%
  );

  /* Terminal specific colors */
  --terminal-user: #56949f;
  --terminal-tool: #907aa9;
  --terminal-tool-name: #a48abf;
  --terminal-error: #b4637a;

  /* Syntax highlighting colors (Rosé Pine Dawn) */
  --hljs-keyword: #286983;
  --hljs-string: #56949f;
  --hljs-number: #ea9d34;
  --hljs-comment: #9893a5;
  --hljs-function: #907aa9;
  --hljs-type: #d7827e;
  --hljs-variable: #b4637a;
  --hljs-meta: #797593;
}
html,
@@ -45,11 +546,7 @@ body {
  padding: 0;
  height: 100%;
  overflow: hidden;
  font-family:
    "Segoe UI",
    system-ui,
    -apple-system,
    sans-serif;
  font-family: var(--ui-font-family, "Segoe UI", system-ui, -apple-system, sans-serif);
  background: var(--bg-primary);
  color: var(--text-primary);
}
@@ -79,3 +576,52 @@ body {
  background: var(--accent-primary);
  color: var(--text-primary);
}

/* Trans gradient button - primary action buttons */
.btn-trans-gradient {
  background: var(--trans-gradient-vibrant) !important;
  border: none !important;
  color: #1a1a2e !important;
  font-weight: 600;
  text-shadow: 0 0 2px rgba(255, 255, 255, 0.5);
  transition: all 0.2s ease;
}

.btn-trans-gradient:hover:not(:disabled) {
  filter: brightness(1.1);
  box-shadow:
    0 0 20px rgba(91, 206, 250, 0.4),
    0 0 30px rgba(245, 169, 184, 0.3);
}

.btn-trans-gradient:disabled {
  opacity: 0.5;
  cursor: not-allowed;
  filter: grayscale(0.3);
}

/* Trans gradient focus border for inputs */
.input-trans-focus {
  position: relative;
  transition: all 0.2s ease;
}

.input-trans-focus:focus {
  border-color: var(--trans-pink) !important;
  box-shadow:
    0 0 0 1px var(--trans-blue),
    0 0 12px rgba(91, 206, 250, 0.3),
    0 0 20px rgba(245, 169, 184, 0.2) !important;
  outline: none !important;
}

/* Trans gradient hover for icon buttons */
.icon-trans-hover {
  transition: all 0.2s ease;
}

.icon-trans-hover:hover {
  color: var(--trans-pink) !important;
  filter: drop-shadow(0 0 6px rgba(91, 206, 250, 0.5))
    drop-shadow(0 0 10px rgba(245, 169, 184, 0.4));
}
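Each theme in the stylesheet above is keyed on a `[data-theme="…"]` attribute selector, so switching themes is just a matter of setting that attribute on the document root. A minimal sketch of a theme picker built on that mechanism — the theme names are taken from the selectors in this diff, but the helper functions themselves are hypothetical, not part of the app:

```typescript
// Theme names taken from the [data-theme="…"] selectors in the stylesheet above.
const themes = [
  "light",
  "high-contrast",
  "dracula",
  "catppuccin",
  "nord",
  "solarized",
  "solarized-light",
  "catppuccin-latte",
  "gruvbox-light",
  "rose-pine-dawn",
] as const;

type ThemeName = (typeof themes)[number];

// Hypothetical helper: builds the attribute selector a theme block is keyed on.
function themeSelector(name: ThemeName): string {
  return `[data-theme="${name}"]`;
}

// Hypothetical helper: applying a theme sets the attribute on <html> so the
// matching variable block cascades. Guarded so the sketch also runs outside a browser.
function applyTheme(name: ThemeName): void {
  if (typeof document !== "undefined") {
    document.documentElement.setAttribute("data-theme", name);
  }
}

applyTheme("nord");
console.log(themeSelector("nord")); // → [data-theme="nord"]
```

Because every theme redefines the same variable names (`--bg-primary`, `--terminal-user`, `--hljs-*`, …), no component CSS has to change when the attribute flips.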
@@ -0,0 +1,419 @@
import { get } from "svelte/store";
import { invoke } from "@tauri-apps/api/core";
import { claudeStore } from "$lib/stores/claude";
import { characterState } from "$lib/stores/character";
import { setSkipNextGreeting, updateDiscordRpc } from "$lib/tauri";
import { searchState } from "$lib/stores/search";
import { conversationsStore } from "$lib/stores/conversations";
import { configStore } from "$lib/stores/config";
import { memoryBrowserStore } from "$lib/stores/memoryBrowser";

export interface SlashCommand {
  name: string;
  description: string;
  usage: string;
  /** "cli" = built into Claude Code CLI; omitted = Hikari app command */
  source?: "cli";
  execute: (args: string) => Promise<void> | void;
}

async function changeDirectory(path: string): Promise<void> {
  const conversationId = get(claudeStore.activeConversationId);
  if (!conversationId) {
    claudeStore.addLine("error", "No active conversation");
    return;
  }

  if (!path.trim()) {
    const currentDir = get(claudeStore.currentWorkingDirectory);
    claudeStore.addLine("system", `Current directory: ${currentDir}`);
    return;
  }

  try {
    characterState.setState("thinking");
    claudeStore.addLine("system", `Changing directory to: ${path}`);

    const currentDir = get(claudeStore.currentWorkingDirectory);
    const validatedPath = await invoke<string>("validate_directory", { path, currentDir });

    // Capture conversation history before disconnecting
    const conversationHistory = claudeStore.getConversationHistory();

    // Get currently granted tools and config auto-granted tools
    const activeConversation = get(conversationsStore.activeConversation);
    const grantedTools = activeConversation ? Array.from(activeConversation.grantedTools) : [];
    const config = configStore.getConfig();
    const allAllowedTools = [...new Set([...grantedTools, ...config.auto_granted_tools])];

    await invoke("stop_claude", { conversationId });

    // Wait for clean shutdown
    await new Promise((resolve) => setTimeout(resolve, 500));

    claudeStore.setWorkingDirectory(validatedPath);

    setSkipNextGreeting(true);

    await invoke("start_claude", {
      conversationId,
      options: {
        working_dir: validatedPath,
        model: config.model || null,
        api_key: config.api_key || null,
        custom_instructions: config.custom_instructions || null,
        mcp_servers_json: config.mcp_servers_json || null,
        allowed_tools: allAllowedTools,
        use_worktree: config.use_worktree ?? false,
        disable_1m_context: config.disable_1m_context ?? false,
        max_output_tokens: config.max_output_tokens ?? null,
        disable_cron: config.disable_cron ?? false,
        disable_skill_shell_execution: config.disable_skill_shell_execution ?? false,
        include_git_instructions: config.include_git_instructions ?? true,
        enable_claudeai_mcp_servers: config.enable_claudeai_mcp_servers ?? true,
        auto_memory_directory: config.auto_memory_directory || null,
        model_overrides: config.model_overrides || null,
        session_name: null,
        bare_mode: config.bare_mode ?? false,
        show_clear_context_on_plan_accept: config.show_clear_context_on_plan_accept ?? true,
        custom_model_option: config.custom_model_option || null,
      },
    });

    // Update Discord RPC when reconnecting after directory change
    if (activeConversation) {
      await updateDiscordRpc(
        activeConversation.name,
        config.model || "claude",
        activeConversation.startedAt
      );
    }

    // Wait for connection to establish
    await new Promise((resolve) => setTimeout(resolve, 1000));

    // Restore context if there was conversation history
    if (conversationHistory) {
      const contextMessage = `[CONTEXT RESTORATION]
I just changed the working directory from ${currentDir} to ${validatedPath}. Here's our conversation so far:

${conversationHistory}

Please continue where we left off. You are now operating in the new directory.`;

      await invoke("send_prompt", {
        conversationId,
        message: contextMessage,
      });
    }

    claudeStore.addLine("system", `Changed directory to: ${validatedPath}`);
    characterState.setState("idle");
  } catch (error) {
    claudeStore.addLine("error", `Failed to change directory: ${error}`);
    characterState.setTemporaryState("error", 3000);
  }
}
async function startNewConversation(): Promise<void> {
  const conversationId = get(claudeStore.activeConversationId);
  if (!conversationId) {
    claudeStore.addLine("error", "No active conversation");
    return;
  }

  try {
    const workingDir = await invoke<string>("get_working_directory", {
      conversationId,
    });

    // Get granted tools before interrupting
    const activeConversation = get(conversationsStore.activeConversation);
    const grantedTools = activeConversation ? Array.from(activeConversation.grantedTools) : [];
    const config = configStore.getConfig();
    const allAllowedTools = [...new Set([...grantedTools, ...config.auto_granted_tools])];

    claudeStore.addLine("system", "Starting new conversation...");
    characterState.setState("thinking");

    await invoke("interrupt_claude", { conversationId });

    claudeStore.clearTerminal();

    setSkipNextGreeting(true);

    await invoke("start_claude", {
      conversationId,
      options: {
        working_dir: workingDir,
        model: config.model || null,
        api_key: config.api_key || null,
        custom_instructions: config.custom_instructions || null,
        mcp_servers_json: config.mcp_servers_json || null,
        allowed_tools: allAllowedTools,
        use_worktree: config.use_worktree ?? false,
        disable_1m_context: config.disable_1m_context ?? false,
        max_output_tokens: config.max_output_tokens ?? null,
        disable_cron: config.disable_cron ?? false,
        disable_skill_shell_execution: config.disable_skill_shell_execution ?? false,
        include_git_instructions: config.include_git_instructions ?? true,
        enable_claudeai_mcp_servers: config.enable_claudeai_mcp_servers ?? true,
        auto_memory_directory: config.auto_memory_directory || null,
        model_overrides: config.model_overrides || null,
        session_name: null,
        bare_mode: config.bare_mode ?? false,
        show_clear_context_on_plan_accept: config.show_clear_context_on_plan_accept ?? true,
        custom_model_option: config.custom_model_option || null,
      },
    });

    // Update Discord RPC when starting new conversation
    if (activeConversation) {
      await updateDiscordRpc(
        activeConversation.name,
        config.model || "claude",
        activeConversation.startedAt
      );
    }

    claudeStore.addLine("system", "New conversation started!");
    characterState.setState("idle");
  } catch (error) {
    claudeStore.addLine("error", `Failed to start new conversation: ${error}`);
    characterState.setTemporaryState("error", 3000);
  }
}
export const slashCommands: SlashCommand[] = [
  {
    name: "cd",
    description: "Change the working directory",
    usage: "/cd <path>",
    execute: changeDirectory,
  },
  {
    name: "clear",
    description: "Clear the terminal display (keeps conversation context)",
    usage: "/clear",
    execute: () => {
      claudeStore.clearTerminal();
      claudeStore.addLine("system", "Terminal cleared");
    },
  },
  {
    name: "new",
    description: "Start a fresh conversation (resets context)",
    usage: "/new",
    execute: startNewConversation,
  },
  {
    name: "help",
    description: "Show available slash commands",
    usage: "/help",
    execute: () => {
      const helpText = slashCommands
        .map((cmd) => `  ${cmd.usage.padEnd(12)} - ${cmd.description}`)
        .join("\n");
      claudeStore.addLine("system", `Available commands:\n${helpText}`);
    },
  },
  {
    name: "search",
    description: "Search within the conversation (use /search to clear)",
    usage: "/search [query]",
    execute: (args: string) => {
      if (!args.trim()) {
        searchState.clear();
        claudeStore.addLine("system", "Search cleared");
        return;
      }
      searchState.setQuery(args.trim());
      claudeStore.addLine("system", `Searching for: "${args.trim()}"`);
    },
  },
  {
    name: "summarise",
    description: "Get a summary of the entire conversation",
    usage: "/summarise",
    execute: async () => {
      const conversationId = get(claudeStore.activeConversationId);
      if (!conversationId) {
        claudeStore.addLine("error", "No active conversation");
        return;
      }

      try {
        claudeStore.addLine("system", "Requesting conversation summary...");
        await invoke("send_prompt", {
          conversationId,
          message:
            "Please provide a comprehensive summary of our entire conversation so far, including the key topics we've discussed, decisions made, and any important context.",
        });
      } catch (error) {
        claudeStore.addLine("error", `Failed to request summary: ${error}`);
      }
    },
  },
  {
    name: "simplify",
    description: "Review changed code for reuse, quality, and efficiency (Claude Code built-in)",
    usage: "/simplify",
    source: "cli",
    execute: async () => {
      const conversationId = get(claudeStore.activeConversationId);
      if (!conversationId) {
        claudeStore.addLine("error", "No active conversation");
        return;
      }
      await invoke("send_prompt", { conversationId, message: "/simplify" });
    },
  },
  {
    name: "loop",
    description: "Run a prompt or slash command on a recurring interval (Claude Code built-in)",
    usage: "/loop [interval] [command]",
    source: "cli",
    execute: async (args: string) => {
      const conversationId = get(claudeStore.activeConversationId);
      if (!conversationId) {
        claudeStore.addLine("error", "No active conversation");
        return;
      }
      const message = args.trim() ? `/loop ${args.trim()}` : "/loop";
      await invoke("send_prompt", { conversationId, message });
    },
  },
  {
    name: "batch",
    description: "Process multiple tasks in a single Claude Code session (Claude Code built-in)",
    usage: "/batch [tasks]",
    source: "cli",
    execute: async (args: string) => {
      const conversationId = get(claudeStore.activeConversationId);
      if (!conversationId) {
        claudeStore.addLine("error", "No active conversation");
        return;
      }
      const message = args.trim() ? `/batch ${args.trim()}` : "/batch";
      await invoke("send_prompt", { conversationId, message });
    },
  },
  {
    name: "memory",
    description: "Open the memory browser panel to view and manage memory files",
    usage: "/memory",
    source: "cli",
    execute: () => {
      memoryBrowserStore.open();
    },
  },
  {
    name: "context",
    description:
      "Show current context window usage with optimisation suggestions (Claude Code built-in)",
    usage: "/context",
    source: "cli",
    execute: async () => {
      const conversationId = get(claudeStore.activeConversationId);
      if (!conversationId) {
        claudeStore.addLine("error", "No active conversation");
        return;
      }
      await invoke("send_prompt", { conversationId, message: "/context" });
    },
  },
  {
    name: "skill",
    description: "Invoke a Claude Code skill from ~/.claude/skills/",
    usage: "/skill [name] [data]",
    execute: async (args: string) => {
      const conversationId = get(claudeStore.activeConversationId);
      if (!conversationId) {
        claudeStore.addLine("error", "No active conversation");
        return;
      }

      const parts = args.trim().split(/\s+/);
      const skillName = parts[0];
      const skillData = parts.slice(1).join(" ");

      // If no skill name provided, list available skills
      if (!skillName) {
        try {
          const skills = await invoke<string[]>("list_skills");
          if (skills.length === 0) {
            claudeStore.addLine(
              "system",
              "No skills found in ~/.claude/skills/\nCreate a skill by adding a folder with a SKILL.md file."
            );
          } else {
            const skillList = skills.map((s) => `  • ${s}`).join("\n");
            claudeStore.addLine(
              "system",
              `Available skills:\n${skillList}\n\nUsage: /skill <skill-name> [data]`
            );
          }
        } catch (error) {
          claudeStore.addLine("error", `Failed to list skills: ${error}`);
        }
        return;
      }

      try {
        claudeStore.addLine("system", `Invoking skill: ${skillName}`);
        characterState.setState("thinking");

        const message = skillData
          ? `Please run the /${skillName} skill with the following data:\n\n${skillData}`
          : `Please run the /${skillName} skill.`;

        await invoke("send_prompt", {
          conversationId,
          message,
        });
      } catch (error) {
        claudeStore.addLine("error", `Failed to invoke skill: ${error}`);
        characterState.setTemporaryState("error", 3000);
      }
    },
  },
];
export function parseSlashCommand(input: string): {
|
||||||
|
command: SlashCommand | null;
|
||||||
|
args: string;
|
||||||
|
} {
|
||||||
|
const trimmed = input.trim();
|
||||||
|
|
||||||
|
if (!trimmed.startsWith("/")) {
|
||||||
|
return { command: null, args: "" };
|
||||||
|
}
|
||||||
|
|
||||||
|
const parts = trimmed.slice(1).split(/\s+/);
|
||||||
|
const commandName = parts[0]?.toLowerCase();
|
||||||
|
const args = parts.slice(1).join(" ");
|
||||||
|
|
||||||
|
const command = slashCommands.find((cmd) => cmd.name.toLowerCase() === commandName);
|
||||||
|
|
||||||
|
return { command: command || null, args };
|
||||||
|
}
|
||||||
|
|
||||||
|
export function getMatchingCommands(input: string): SlashCommand[] {
|
||||||
|
const trimmed = input.trim();
|
||||||
|
|
||||||
|
if (!trimmed.startsWith("/")) {
|
||||||
|
return [];
|
||||||
|
}
|
||||||
|
|
||||||
|
const partial = trimmed.slice(1).toLowerCase();
|
||||||
|
|
||||||
|
if (partial === "") {
|
||||||
|
return slashCommands;
|
||||||
|
}
|
||||||
|
|
||||||
|
return slashCommands.filter((cmd) => cmd.name.toLowerCase().startsWith(partial));
|
||||||
|
}
|
||||||
|
|
||||||
|
export function isSlashCommand(input: string): boolean {
|
||||||
|
return input.trim().startsWith("/");
|
||||||
|
}
|
||||||
@@ -40,10 +40,12 @@
   tabindex="-1"
 >
   <div class="flex items-center justify-between mb-4">
-    <h2 id="about-title" class="text-xl font-semibold text-gray-100">About Hikari Desktop</h2>
+    <h2 id="about-title" class="text-xl font-semibold text-[var(--text-primary)]">
+      About Hikari Desktop
+    </h2>
     <button
       onclick={onClose}
-      class="p-1 text-gray-500 hover:text-gray-300 transition-colors"
+      class="p-1 text-[var(--text-secondary)] hover:text-[var(--text-primary)] transition-colors"
       aria-label="Close"
     >
       <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
@@ -59,16 +61,16 @@
 
   <div class="space-y-4 text-sm">
     <div>
-      <h3 class="font-medium text-gray-200 mb-2">What is Hikari Desktop?</h3>
-      <p class="text-gray-400">
+      <h3 class="font-medium text-[var(--text-primary)] mb-2">What is Hikari Desktop?</h3>
+      <p class="text-[var(--text-secondary)]">
         Hikari Desktop is an AI-powered desktop assistant that brings Claude directly to your
         desktop. Built with love using Tauri, Svelte, and Rust for a fast, native experience.
       </p>
     </div>
 
     <div>
-      <h3 class="font-medium text-gray-200 mb-2">Version</h3>
-      <p class="text-gray-400 mb-1">
+      <h3 class="font-medium text-[var(--text-primary)] mb-2">Version</h3>
+      <p class="text-[var(--text-secondary)] mb-1">
         {appVersion || "Loading..."}
       </p>
       <button
@@ -80,7 +82,7 @@
     </div>
 
     <div>
-      <h3 class="font-medium text-gray-200 mb-2">Source Code</h3>
+      <h3 class="font-medium text-[var(--text-primary)] mb-2">Source Code</h3>
       <button
         onclick={() => openUrl(links.source)}
         class="text-[var(--accent-primary)] hover:text-[var(--accent-primary-hover)] transition-colors underline"
@@ -90,8 +92,8 @@
     </div>
 
     <div>
-      <h3 class="font-medium text-gray-200 mb-2">Support & Community</h3>
-      <p class="text-gray-400 mb-1">Found a bug or have a suggestion?</p>
+      <h3 class="font-medium text-[var(--text-primary)] mb-2">Support & Community</h3>
+      <p class="text-[var(--text-secondary)] mb-1">Found a bug or have a suggestion?</p>
       <button
         onclick={() => openUrl(links.discord)}
         class="text-[var(--accent-primary)] hover:text-[var(--accent-primary-hover)] transition-colors underline"
@@ -101,7 +103,7 @@
     </div>
 
     <div>
-      <h3 class="font-medium text-gray-200 mb-2">Built with 💕 by</h3>
+      <h3 class="font-medium text-[var(--text-primary)] mb-2">Built with 💕 by</h3>
       <button
         onclick={() => openUrl(links.website)}
         class="text-[var(--accent-primary)] hover:text-[var(--accent-primary-hover)] transition-colors underline"
@@ -111,8 +113,8 @@
     </div>
 
     <div>
-      <h3 class="font-medium text-gray-200 mb-2">License</h3>
-      <p class="text-gray-400 mb-1">
+      <h3 class="font-medium text-[var(--text-primary)] mb-2">License</h3>
+      <p class="text-[var(--text-secondary)] mb-1">
         This project is open source and available under our license terms.
       </p>
       <button
@@ -124,7 +126,7 @@
     </div>
 
     <div class="pt-4 mt-4 border-t border-[var(--border-color)]">
-      <p class="text-xs text-gray-500 text-center">
+      <p class="text-xs text-[var(--text-tertiary)] text-center">
        Copyright © {new Date().getFullYear()} Naomi Carrigan. All rights reserved.
      </p>
    </div>
@@ -1,202 +0,0 @@
-<script lang="ts">
-  import { onMount } from "svelte";
-  import { fade, fly } from "svelte/transition";
-  import { cubicOut } from "svelte/easing";
-  import { listen } from "@tauri-apps/api/event";
-  import type { AchievementUnlockedEvent } from "$lib/types/achievements";
-
-  let achievements = $state<AchievementUnlockedEvent[]>([]);
-  let currentAchievement = $state<AchievementUnlockedEvent | null>(null);
-  let showNotification = $state(false);
-
-  onMount(() => {
-    let unlisten: (() => void) | undefined;
-
-    const setupListener = async () => {
-      unlisten = await listen<AchievementUnlockedEvent>("achievement:unlocked", (event) => {
-        achievements.push(event.payload);
-        if (!showNotification) {
-          showNext();
-        }
-      });
-    };
-
-    setupListener();
-
-    return () => {
-      if (unlisten) {
-        unlisten();
-      }
-    };
-  });
-
-  function showNext() {
-    if (achievements.length > 0) {
-      currentAchievement = achievements.shift() || null;
-      showNotification = true;
-
-      // Auto-hide after 5 seconds
-      setTimeout(() => {
-        showNotification = false;
-        // Show next achievement after animation completes
-        setTimeout(() => showNext(), 300);
-      }, 5000);
-    }
-  }
-
-  function dismiss() {
-    showNotification = false;
-    // Show next achievement after animation completes
-    setTimeout(() => showNext(), 300);
-  }
-
-  function getRarityColor(rarity: string): string {
-    switch (rarity) {
-      case "legendary":
-        return "from-yellow-400 to-orange-500";
-      case "epic":
-        return "from-purple-400 to-pink-500";
-      case "rare":
-        return "from-blue-400 to-indigo-500";
-      default:
-        return "from-green-400 to-emerald-500";
-    }
-  }
-
-  function getAchievementRarity(id: string): string {
-    // Determine rarity based on achievement ID
-    if (id === "TokenMaster") return "legendary";
-    if (["CodeMachine", "Unstoppable"].includes(id)) return "epic";
-    if (
-      [
-        "BlossomingCoder",
-        "CodeWizard",
-        "MasterBuilder",
-        "EnduranceChamp",
-        "DeepDive",
-        "CreativeCoder",
-      ].includes(id)
-    )
-      return "rare";
-    return "common";
-  }
-</script>
-
-{#if showNotification && currentAchievement}
-  <div
-    class="fixed top-20 right-4 z-50 max-w-sm"
-    in:fly={{ x: 300, duration: 500, easing: cubicOut }}
-    out:fade={{ duration: 300 }}
-  >
-    <!-- Backdrop with animated gradient border -->
-    <div class="relative p-[2px] rounded-lg overflow-hidden">
-      <!-- Animated gradient border -->
-      <div
-        class="absolute inset-0 bg-gradient-to-r {getRarityColor(
-          getAchievementRarity(currentAchievement.achievement.id)
-        )} animate-pulse"
-      ></div>
-
-      <!-- Main notification content -->
-      <div class="relative bg-[var(--bg-primary)] rounded-lg p-4 shadow-2xl backdrop-blur-sm">
-        <button
-          onclick={dismiss}
-          onkeydown={(e) => e.key === "Enter" && dismiss()}
-          class="absolute top-2 right-2 text-gray-500 hover:text-gray-700 dark:text-gray-400 dark:hover:text-gray-200 transition-colors"
-          aria-label="Dismiss notification"
-        >
-          <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
-            <path
-              stroke-linecap="round"
-              stroke-linejoin="round"
-              stroke-width="2"
-              d="M6 18L18 6M6 6l12 12"
-            ></path>
-          </svg>
-        </button>
-
-        <div class="flex items-start gap-4">
-          <!-- Icon with animated sparkles -->
-          <div class="relative flex-shrink-0">
-            <div class="text-5xl animate-bounce">{currentAchievement.achievement.icon}</div>
-
-            <!-- Sparkle animations -->
-            <div class="absolute -top-1 -right-1 text-yellow-400 animate-ping">✨</div>
-            <div
-              class="absolute -bottom-1 -left-1 text-yellow-400 animate-ping animation-delay-200"
-            >
-              ✨
-            </div>
-            <div class="absolute top-1/2 -right-2 text-yellow-400 animate-ping animation-delay-400">
-              ✨
-            </div>
-          </div>
-
-          <!-- Text content -->
-          <div class="flex-1 min-w-0 pt-1">
-            <h3
-              class="text-sm font-semibold text-gray-500 dark:text-gray-400 uppercase tracking-wide"
-            >
-              Achievement Unlocked!
-            </h3>
-            <p class="text-lg font-bold text-[var(--text-primary)] mt-1">
-              {currentAchievement.achievement.name}
-            </p>
-            <p class="text-sm text-gray-600 dark:text-gray-400 mt-1">
-              {currentAchievement.achievement.description}
-            </p>
-
-            <!-- Rarity badge -->
-            <div class="mt-2 inline-flex items-center">
-              <span
-                class="px-2 py-1 text-xs font-medium rounded-full bg-gradient-to-r {getRarityColor(
-                  getAchievementRarity(currentAchievement.achievement.id)
-                )} text-white capitalize"
-              >
-                {getAchievementRarity(currentAchievement.achievement.id)}
-              </span>
-            </div>
-          </div>
-        </div>
-
-        <!-- Celebration confetti effect (CSS only) -->
-        <div class="absolute inset-0 pointer-events-none overflow-hidden rounded-lg">
-          {#each Array(10) as _ (_)}
-            <div
-              class="absolute w-2 h-2 bg-gradient-to-br {getRarityColor(
-                getAchievementRarity(currentAchievement.achievement.id)
-              )} rounded-full animate-fall"
-              style="left: {Math.random() * 100}%; animation-delay: {Math.random() *
-                2}s; animation-duration: {2 + Math.random() * 2}s;"
-            ></div>
-          {/each}
-        </div>
-      </div>
-    </div>
-  </div>
-{/if}
-
-<style>
-  @keyframes fall {
-    0% {
-      transform: translateY(-20px) rotate(0deg);
-      opacity: 1;
-    }
-    100% {
-      transform: translateY(400px) rotate(720deg);
-      opacity: 0;
-    }
-  }
-
-  .animate-fall {
-    animation: fall linear infinite;
-  }
-
-  .animation-delay-200 {
-    animation-delay: 200ms;
-  }
-
-  .animation-delay-400 {
-    animation-delay: 400ms;
-  }
-</style>
@@ -0,0 +1,356 @@
+<script lang="ts">
+  import { SvelteMap } from "svelte/reactivity";
+  import { invoke } from "@tauri-apps/api/core";
+  import { claudeStore } from "$lib/stores/claude";
+  import { agentStore, getAgentsForConversation } from "$lib/stores/agents";
+  import type { AgentInfo } from "$lib/types/agents";
+  import { onMount, onDestroy } from "svelte";
+
+  interface Props {
+    isOpen: boolean;
+    onClose: () => void;
+  }
+
+  const { isOpen, onClose }: Props = $props();
+
+  let now = $state(Date.now());
+  let timerInterval: ReturnType<typeof setInterval> | null = null;
+
+  // We need a reactive subscription to agents for the active conversation
+  let agents: AgentInfo[] = $state([]);
+  let agentsUnsubscribe: (() => void) | null = null;
+
+  // Track active conversation reactively
+  let currentConversationId = $state<string | null>("");
+  const conversationIdUnsubscribe = claudeStore.activeConversationId.subscribe((id) => {
+    currentConversationId = id;
+  });
+
+  $effect(() => {
+    // Re-subscribe when conversation changes
+    if (agentsUnsubscribe) {
+      agentsUnsubscribe();
+    }
+    if (currentConversationId) {
+      const store = getAgentsForConversation(currentConversationId);
+      agentsUnsubscribe = store.subscribe((value) => {
+        agents = value;
+      });
+    } else {
+      agents = [];
+    }
+  });
+
+  const runningAgents = $derived(agents.filter((a) => a.status === "running"));
+  const completedAgents = $derived(agents.filter((a) => a.status === "completed"));
+  const erroredAgents = $derived(agents.filter((a) => a.status === "errored"));
+
+  // Organize agents into a tree structure based on parent_tool_use_id
+  const agentTree = $derived.by(() => {
+    const topLevel = agents.filter((a) => !a.parentToolUseId);
+    const childrenMap = new SvelteMap<string, AgentInfo[]>();
+
+    // Group children by their parent
+    agents.forEach((agent) => {
+      if (agent.parentToolUseId) {
+        const siblings = childrenMap.get(agent.parentToolUseId) || [];
+        siblings.push(agent);
+        childrenMap.set(agent.parentToolUseId, siblings);
+      }
+    });
+
+    return { topLevel, childrenMap };
+  });
+
+  onMount(() => {
+    timerInterval = setInterval(() => {
+      now = Date.now();
+    }, 1000);
+  });
+
+  onDestroy(() => {
+    if (timerInterval) clearInterval(timerInterval);
+    if (agentsUnsubscribe) agentsUnsubscribe();
+    conversationIdUnsubscribe();
+  });
+
+  function formatDuration(startedAt: number, endedAt?: number): string {
+    const end = endedAt || now;
+    const durationMs = end - startedAt;
+    const seconds = Math.floor(durationMs / 1000);
+    const minutes = Math.floor(seconds / 60);
+    const hours = Math.floor(minutes / 60);
+
+    if (hours > 0) {
+      return `${hours}h ${minutes % 60}m ${seconds % 60}s`;
+    }
+    if (minutes > 0) {
+      return `${minutes}m ${seconds % 60}s`;
+    }
+    return `${seconds}s`;
+  }
+
+  function getSubagentTypeLabel(type: string): string {
+    const labels: Record<string, string> = {
+      Explore: "Explorer",
+      "general-purpose": "General",
+      Plan: "Planner",
+      Bash: "Shell",
+    };
+    return labels[type] || type;
+  }
+
+  function getStatusBadgeClass(status: string): string {
+    switch (status) {
+      case "running":
+        return "bg-blue-500/20 text-blue-400 border-blue-500/30";
+      case "completed":
+        return "bg-green-500/20 text-green-400 border-green-500/30";
+      case "errored":
+        return "bg-red-500/20 text-red-400 border-red-500/30";
+      default:
+        return "bg-gray-500/20 text-gray-400 border-gray-500/30";
+    }
+  }
+
+  async function handleKillAll() {
+    if (!currentConversationId) return;
+
+    try {
+      await invoke("interrupt_claude", { conversationId: currentConversationId });
+      // Mark all running agents as errored after killing the process
+      agentStore.markAllErrored(currentConversationId);
+    } catch (error) {
+      console.error("Failed to kill Claude process:", error);
+    }
+  }
+
+  function handleClearCompleted() {
+    if (currentConversationId) {
+      agentStore.clearCompleted(currentConversationId);
+    }
+  }
+
+  // Flatten the tree for rendering with depth information
+  const flattenedAgents = $derived.by(() => {
+    const result: { agent: AgentInfo; depth: number }[] = [];
+    const { topLevel, childrenMap } = agentTree;
+
+    function addAgentAndChildren(agent: AgentInfo, depth: number) {
+      result.push({ agent, depth });
+      const children = childrenMap.get(agent.toolUseId);
+      if (children) {
+        children.forEach((child) => addAgentAndChildren(child, depth + 1));
+      }
+    }
+
+    topLevel.forEach((agent) => addAgentAndChildren(agent, 0));
+    return result;
+  });
+</script>
+
+{#if isOpen}
+  <!-- svelte-ignore a11y_click_events_have_key_events -->
+  <!-- svelte-ignore a11y_no_static_element_interactions -->
+  <div class="fixed inset-0 z-40" onclick={onClose}></div>
+
+  <div
+    class="fixed top-12 right-0 bottom-0 w-80 bg-[var(--bg-primary)] border-l border-[var(--border-color)] shadow-xl z-50 flex flex-col overflow-hidden"
+  >
+    <!-- Header -->
+    <div class="flex items-center justify-between p-4 border-b border-[var(--border-color)]">
+      <div class="flex items-center gap-2">
+        <svg
+          class="w-5 h-5 text-[var(--accent-primary)]"
+          fill="none"
+          stroke="currentColor"
+          viewBox="0 0 24 24"
+        >
+          <path
+            stroke-linecap="round"
+            stroke-linejoin="round"
+            stroke-width="2"
+            d="M9 3v2m6-2v2M9 19v2m6-2v2M5 9H3m2 6H3m18-6h-2m2 6h-2M7 19h10a2 2 0 002-2V7a2 2 0 00-2-2H7a2 2 0 00-2 2v10a2 2 0 002 2zM9 9h6v6H9V9z"
+          />
+        </svg>
+        <h3 class="text-sm font-semibold text-[var(--text-primary)]">Agent Monitor</h3>
+        {#if runningAgents.length > 0}
+          <span
+            class="px-1.5 py-0.5 text-xs rounded-full bg-blue-500/20 text-blue-400 animate-pulse"
+          >
+            {runningAgents.length} running
+          </span>
+        {/if}
+      </div>
+      <button
+        onclick={onClose}
+        class="p-1 text-[var(--text-secondary)] hover:text-[var(--text-primary)] transition-colors"
+        aria-label="Close agent monitor"
+      >
+        <svg class="w-4 h-4" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+          <path
+            stroke-linecap="round"
+            stroke-linejoin="round"
+            stroke-width="2"
+            d="M6 18L18 6M6 6l12 12"
+          />
+        </svg>
+      </button>
+    </div>
+
+    <!-- Action buttons -->
+    <div class="flex gap-2 px-4 py-2 border-b border-[var(--border-color)]">
+      <button
+        onclick={handleKillAll}
+        disabled={runningAgents.length === 0}
+        class="flex-1 px-2 py-1 text-xs bg-red-500/20 hover:bg-red-500/30 text-red-400 rounded transition-colors disabled:opacity-40 disabled:cursor-not-allowed"
+        title="Kills the entire Claude Code process to stop all agents"
+      >
+        Kill All
+      </button>
+      <button
+        onclick={handleClearCompleted}
+        disabled={completedAgents.length === 0 && erroredAgents.length === 0}
+        class="flex-1 px-2 py-1 text-xs bg-[var(--bg-secondary)] hover:bg-[var(--bg-hover,var(--bg-secondary))] text-[var(--text-secondary)] rounded transition-colors disabled:opacity-40 disabled:cursor-not-allowed"
+      >
+        Clear Finished
+      </button>
+    </div>
+
+    <!-- Agent list -->
+    <div class="flex-1 overflow-y-auto p-4 space-y-2">
+      {#if agents.length === 0}
+        <div
+          class="flex flex-col items-center justify-center h-full text-[var(--text-secondary)] text-sm"
+        >
+          <svg
+            class="w-8 h-8 mb-2 opacity-50"
+            fill="none"
+            stroke="currentColor"
+            viewBox="0 0 24 24"
+          >
+            <path
+              stroke-linecap="round"
+              stroke-linejoin="round"
+              stroke-width="2"
+              d="M9 3v2m6-2v2M9 19v2m6-2v2M5 9H3m2 6H3m18-6h-2m2 6h-2M7 19h10a2 2 0 002-2V7a2 2 0 00-2-2H7a2 2 0 00-2 2v10a2 2 0 002 2zM9 9h6v6H9V9z"
+            />
+          </svg>
+          <p>No agents detected yet</p>
+          <p class="text-xs mt-1 opacity-70">
+            Agents will appear here when Claude uses the Task tool
+          </p>
+        </div>
+      {:else}
+        {#each flattenedAgents as { agent, depth } (agent.toolUseId)}
+          <div
+            class="p-3 rounded-lg border border-[var(--border-color)] bg-[var(--bg-secondary)] {agent.status ===
+            'running'
+              ? 'border-l-2 border-l-blue-500'
+              : agent.status === 'errored'
+                ? 'border-l-2 border-l-red-500'
+                : 'border-l-2 border-l-green-500'}"
+            style="margin-left: {depth * 12}px; width: calc(100% - {depth * 12}px);"
+          >
+            <!-- Agent header -->
+            <div class="flex items-center justify-between mb-1">
+              <div class="flex items-center gap-1.5">
+                {#if depth > 0}
+                  <svg
+                    class="w-3 h-3 text-[var(--text-secondary)]"
+                    fill="none"
+                    stroke="currentColor"
+                    viewBox="0 0 24 24"
+                  >
+                    <path
+                      stroke-linecap="round"
+                      stroke-linejoin="round"
+                      stroke-width="2"
+                      d="M9 5l7 7-7 7"
+                    />
+                  </svg>
+                {/if}
+                <img
+                  src={agent.characterAvatar}
+                  alt={agent.characterName}
+                  class="w-5 h-5 rounded-full object-cover"
+                />
+                <span class="text-[10px] font-medium text-[var(--text-primary)]">
+                  {agent.characterName}
+                </span>
+                <span
+                  class="px-1.5 py-0.5 text-[10px] rounded border {getStatusBadgeClass(
+                    agent.status
+                  )}"
+                  title={agent.agentId ? `ID: ${agent.agentId}` : undefined}
+                >
+                  {getSubagentTypeLabel(agent.agentType ?? agent.subagentType)}
+                </span>
+              </div>
+              <span
+                class="text-[10px] {agent.status === 'running'
+                  ? 'text-blue-400'
+                  : 'text-[var(--text-secondary)]'}"
+              >
+                {#if agent.durationMs !== undefined}
+                  {Math.floor(agent.durationMs / 1000)}s
+                {:else}
+                  {formatDuration(agent.startedAt, agent.endedAt)}
+                {/if}
+                {#if agent.status === "running"}
+                  <span class="inline-block w-1 h-1 bg-blue-400 rounded-full animate-pulse ml-1"
+                  ></span>
+                {/if}
+              </span>
+            </div>
+
+            <!-- Agent description -->
+            <p class="text-xs text-[var(--text-primary)] truncate" title={agent.description}>
+              {agent.description}
+            </p>
+
+            <!-- Model override badge -->
+            {#if agent.model}
+              <p class="mt-0.5 text-[10px] text-purple-400 truncate" title="Model: {agent.model}">
+                ✦ {agent.model}
+              </p>
+            {/if}
+
+            <!-- Status indicator -->
+            <div class="mt-1 flex items-center gap-1">
+              {#if agent.status === "running"}
+                <span class="text-[10px] text-blue-400">Running...</span>
+              {:else if agent.status === "completed"}
+                <span class="text-[10px] text-green-400">Completed</span>
+              {:else}
+                <span class="text-[10px] text-red-400">Errored / Killed</span>
+              {/if}
+            </div>
+
+            <!-- Last assistant message snippet -->
+            {#if agent.lastAssistantMessage}
+              <p
+                class="mt-1 text-[10px] text-[var(--text-secondary)] italic truncate"
+                title={agent.lastAssistantMessage}
+              >
+                {agent.lastAssistantMessage}
+              </p>
+            {/if}
+          </div>
+        {/each}
+      {/if}
+    </div>
+
+    <!-- Footer summary -->
+    {#if agents.length > 0}
+      <div
+        class="px-4 py-2 border-t border-[var(--border-color)] text-[10px] text-[var(--text-secondary)]"
+      >
+        {agents.length} total ·
+        {runningAgents.length} running ·
+        {completedAgents.length} completed ·
+        {erroredAgents.length} errored
+      </div>
+    {/if}
+  </div>
+{/if}
@@ -34,53 +34,34 @@
       return "animate-idle";
     }
   }
 
-  function getBackgroundGlow(): string {
-    switch (currentState) {
-      case "thinking":
-        return "shadow-thinking";
-      case "typing":
-        return "shadow-typing";
-      case "searching":
-        return "shadow-searching";
-      case "coding":
-        return "shadow-coding";
-      case "mcp":
-        return "shadow-mcp";
-      case "success":
-        return "shadow-success";
-      case "error":
-        return "shadow-error";
-      default:
-        return "";
-    }
-  }
 </script>
 
-<div class="anime-girl-container flex flex-col items-center justify-end h-full p-4">
-  <div class="character-frame relative {getBackgroundGlow()} w-full max-w-md">
-    <div class="sprite-container {getAnimationClass()}">
+<div
+  class="anime-girl-container flex flex-col items-center justify-between h-full p-4 overflow-hidden"
+>
+  <div class="character-frame relative flex-1 flex items-center justify-center min-h-0">
+    <div class="sprite-container {getAnimationClass()} h-full flex items-center justify-center">
       <img
         src="/sprites/{info.spriteFile}"
         alt="Hikari - {info.label}"
-        class="character-sprite w-full h-auto object-contain"
+        class="character-sprite h-full w-auto max-w-full object-contain"
        onerror={(e) => {
          const target = e.currentTarget as HTMLImageElement;
          target.src = "/sprites/placeholder.svg";
        }}
      />
    </div>
+  </div>
 
-  <div class="state-indicator absolute -bottom-2 left-1/2 transform -translate-x-1/2">
+  <div class="state-indicator mt-2">
    <div
      class="px-3 py-1 rounded-full text-xs font-medium bg-[var(--bg-secondary)] border border-[var(--border-color)] text-[var(--accent-primary)]"
    >
      {info.label}
    </div>
-      </div>
   </div>
 
-  <div class="speech-bubble mt-4 max-w-xs">
+  <div class="speech-bubble mt-2 max-w-xs flex-shrink-0">
    <div
      class="relative bg-[var(--bg-secondary)] rounded-lg px-4 py-2 border border-[var(--border-color)]"
    >
@@ -93,37 +74,12 @@
   </div>
 
 <style>
+  .anime-girl-container {
+    transition: all 0.3s ease;
+  }
+
   .character-frame {
-    border-radius: 50%;
-    transition: box-shadow 0.3s ease;
+    transition: all 0.3s ease;
   }
-
-  .shadow-thinking {
-    box-shadow: 0 0 30px rgba(147, 51, 234, 0.5);
-  }
-
-  .shadow-typing {
-    box-shadow: 0 0 30px rgba(59, 130, 246, 0.5);
-  }
-
-  .shadow-searching {
-    box-shadow: 0 0 30px rgba(234, 179, 8, 0.5);
-  }
-
-  .shadow-coding {
-    box-shadow: 0 0 30px rgba(34, 197, 94, 0.5);
-  }
-
-  .shadow-mcp {
-    box-shadow: 0 0 30px rgba(236, 72, 153, 0.5);
-  }
-
-  .shadow-success {
-    box-shadow: 0 0 30px rgba(16, 185, 129, 0.5);
-  }
-
-  .shadow-error {
-    box-shadow: 0 0 30px rgba(239, 68, 68, 0.5);
-  }
 
   @keyframes idle-bob {
@@ -0,0 +1,209 @@
+<script lang="ts">
+  import type { Attachment } from "$lib/types/messages";
+
+  interface Props {
+    attachments: Attachment[];
+    onRemove: (id: string) => void;
+  }
+
+  let { attachments, onRemove }: Props = $props();
+
+  function formatFileSize(bytes: number): string {
+    if (bytes < 1024) return `${bytes} B`;
+    if (bytes < 1024 * 1024) return `${(bytes / 1024).toFixed(1)} KB`;
+    return `${(bytes / (1024 * 1024)).toFixed(1)} MB`;
+  }
+
+  function getFileIcon(type: Attachment["type"]): string {
+    switch (type) {
+      case "image":
+        return "🖼️";
+      case "document":
+        return "📄";
+      default:
+        return "📎";
+    }
+  }
+</script>
+
+{#if attachments.length > 0}
+  <div class="attachment-preview-container">
+    <div class="attachment-header">
+      <span class="attachment-count"
+        >{attachments.length} attachment{attachments.length !== 1 ? "s" : ""}</span
+      >
+    </div>
+    <div class="attachment-list">
+      {#each attachments as attachment (attachment.id)}
+        <div class="attachment-item" class:is-image={attachment.type === "image"}>
+          {#if attachment.type === "image" && attachment.previewUrl}
+            <div class="image-preview">
+              <img src={attachment.previewUrl} alt={attachment.filename} />
+            </div>
+          {:else}
+            <div class="file-icon">
+              {getFileIcon(attachment.type)}
+            </div>
+          {/if}
+          <div class="attachment-info">
+            <span class="attachment-filename" title={attachment.filename}>
+              {attachment.filename}
+            </span>
+            <span class="attachment-size">
+              {formatFileSize(attachment.size)}
+            </span>
+          </div>
+          <button
+            type="button"
+            class="remove-button"
+            onclick={() => onRemove(attachment.id)}
+            title="Remove attachment"
+          >
+            <svg
+              width="14"
+              height="14"
+              viewBox="0 0 24 24"
+              fill="none"
+              stroke="currentColor"
+              stroke-width="2"
+              stroke-linecap="round"
+              stroke-linejoin="round"
+            >
+              <line x1="18" y1="6" x2="6" y2="18"></line>
+              <line x1="6" y1="6" x2="18" y2="18"></line>
+            </svg>
+          </button>
+        </div>
+      {/each}
+    </div>
+  </div>
+{/if}
+
+<style>
+  .attachment-preview-container {
+    display: flex;
+    flex-direction: column;
+    gap: 8px;
+    padding: 8px;
+    background: var(--bg-secondary);
+    border: 1px solid var(--border-color);
+    border-radius: 8px;
+    margin-bottom: 8px;
+  }
+
+  .attachment-header {
+    display: flex;
+    align-items: center;
+    gap: 8px;
+  }
+
+  .attachment-count {
+    font-size: 12px;
+    color: var(--text-secondary);
+  }
+
+  .attachment-list {
+    display: flex;
+    flex-wrap: wrap;
+    gap: 8px;
+  }
+
+  .attachment-item {
+    display: flex;
+    align-items: center;
+    gap: 8px;
+    padding: 6px 8px;
+    background: var(--bg-tertiary);
+    border: 1px solid var(--border-color);
+    border-radius: 6px;
+    max-width: 200px;
+    position: relative;
+  }
+
+  .attachment-item.is-image {
+    flex-direction: column;
+    padding: 4px;
+    max-width: 120px;
+  }
+
+  .image-preview {
+    width: 100%;
+    max-width: 110px;
+    max-height: 80px;
+    border-radius: 4px;
+    overflow: hidden;
+    background: var(--bg-primary);
+    display: flex;
+    align-items: center;
+    justify-content: center;
+  }
+
+  .image-preview img {
+    max-width: 100%;
+    max-height: 80px;
+    object-fit: contain;
+  }
+
+  .file-icon {
+    font-size: 24px;
+    flex-shrink: 0;
+  }
+
+  .attachment-info {
+    display: flex;
+    flex-direction: column;
+    gap: 2px;
+    overflow: hidden;
+    flex: 1;
+  }
+
+  .is-image .attachment-info {
+    width: 100%;
+    padding: 0 4px;
+  }
+
+  .attachment-filename {
+    font-size: 12px;
+    color: var(--text-primary);
+    white-space: nowrap;
+    overflow: hidden;
+    text-overflow: ellipsis;
+  }
+
+  .attachment-size {
+    font-size: 10px;
+    color: var(--text-secondary);
+  }
+
+  .remove-button {
+    position: absolute;
+    top: -6px;
+    right: -6px;
+    width: 20px;
+    height: 20px;
+    padding: 0;
+    background: var(--bg-primary);
+    border: 1px solid var(--border-color);
+    border-radius: 50%;
+    color: var(--text-secondary);
+    cursor: pointer;
+    display: flex;
+    align-items: center;
+    justify-content: center;
+    opacity: 0;
+    transition:
+      opacity 0.2s,
+      background 0.2s,
+      color 0.2s;
+  }
+
+  .attachment-item:hover .remove-button {
+    opacity: 1;
+  }
+
+  .remove-button:hover {
+    background: var(--error-color, #ef4444);
+    border-color: var(--error-color, #ef4444);
+    color: white;
+  }
+</style>
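As a standalone sketch of the `formatFileSize` helper added above: it formats byte counts with 1024-based thresholds and one decimal place for KB/MB (plain TypeScript, no Svelte dependencies):

```typescript
// Standalone copy of the formatFileSize helper from this diff.
// Thresholds are binary (1024), and KB/MB values keep one decimal place.
function formatFileSize(bytes: number): string {
  if (bytes < 1024) return `${bytes} B`;
  if (bytes < 1024 * 1024) return `${(bytes / 1024).toFixed(1)} KB`;
  return `${(bytes / (1024 * 1024)).toFixed(1)} MB`;
}

console.log(formatFileSize(512)); // → "512 B"
console.log(formatFileSize(2048)); // → "2.0 KB"
console.log(formatFileSize(5 * 1024 * 1024)); // → "5.0 MB"
```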
@@ -0,0 +1,140 @@
+<script lang="ts">
+  import { CHARACTER_POOL } from "$lib/utils/agentCharacters";
+
+  interface Props {
+    onClose: () => void;
+  }
+
+  const { onClose }: Props = $props();
+</script>
+
+<div
+  class="fixed inset-0 bg-black/50 backdrop-blur-sm z-50 flex items-center justify-center p-4"
+  onclick={onClose}
+  role="button"
+  tabindex="0"
+  onkeydown={(e) => e.key === "Escape" && onClose()}
+>
+  <div
+    class="bg-[var(--bg-primary)] border border-[var(--border-color)] rounded-lg shadow-xl max-w-2xl w-full p-6 max-h-[90vh] overflow-y-auto"
+    onclick={(e) => e.stopPropagation()}
+    onkeydown={(e) => e.stopPropagation()}
+    role="dialog"
+    aria-labelledby="cast-title"
+    tabindex="-1"
+  >
+    <div class="flex items-center justify-between mb-6">
+      <h2 id="cast-title" class="text-xl font-semibold text-[var(--text-primary)]">
+        Meet the Team
+      </h2>
+      <button
+        onclick={onClose}
+        class="p-1 text-[var(--text-secondary)] hover:text-[var(--text-primary)] transition-colors"
+        aria-label="Close"
+      >
+        <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+          <path
+            stroke-linecap="round"
+            stroke-linejoin="round"
+            stroke-width="2"
+            d="M6 18L18 6M6 6l12 12"
+          />
+        </svg>
+      </button>
+    </div>
+
+    <!-- Principal cast: Hikari + Naomi -->
+    <div class="grid grid-cols-1 gap-3 mb-6 sm:grid-cols-2">
+      <div
+        class="flex items-center gap-3 p-4 rounded-lg bg-[var(--bg-secondary)] border border-[var(--accent-primary)]/40"
+      >
+        <img
+          src="https://cdn.nhcarrigan.com/hikari.png"
+          alt="Hikari"
+          class="w-16 h-16 object-cover rounded-full border-2 border-[var(--border-color)] shrink-0"
+        />
+        <div>
+          <div class="flex items-center gap-2 mb-1">
+            <span class="font-semibold text-[var(--text-primary)]">Hikari</span>
+            <span
+              class="text-xs px-2 py-0.5 rounded-full bg-[var(--accent-primary)]/20 text-[var(--accent-primary)] font-medium"
+            >
+              Chief Operating Officer
+            </span>
+          </div>
+          <p class="text-xs text-[var(--text-secondary)]">
+            Holds the line so the others don't have to. Never without her clipboard — or her
+            glasses.
+          </p>
+        </div>
+      </div>
+      <div
+        class="flex items-center gap-3 p-4 rounded-lg bg-[var(--bg-secondary)] border border-[var(--accent-primary)]/40"
+      >
+        <img
+          src="https://cdn.nhcarrigan.com/profile.png"
+          alt="Naomi"
+          class="w-16 h-16 object-cover rounded-full border-2 border-[var(--border-color)] shrink-0"
+        />
+        <div>
+          <div class="flex items-center gap-2 mb-1">
+            <span class="font-semibold text-[var(--text-primary)]">Naomi</span>
+            <span
+              class="text-xs px-2 py-0.5 rounded-full bg-[var(--accent-primary)]/20 text-[var(--accent-primary)] font-medium"
+            >
+              Chief hEx-ecutive Officer
+            </span>
+          </div>
+          <p class="text-xs text-[var(--text-secondary)]">
+            A 525-year-old vampire running a tech company from behind a VTuber avatar. Fixes server
+            crashes at 4 AM.
+          </p>
+        </div>
+      </div>
+    </div>
+
+    <!-- Subagent girls grid -->
+    <div>
+      <h3 class="text-sm font-medium text-[var(--text-secondary)] uppercase tracking-wider mb-3">
+        Subagent Squad
+      </h3>
+      <div class="grid grid-cols-2 gap-3 sm:grid-cols-3">
+        {#each CHARACTER_POOL as character (character.name)}
+          <div
+            class="flex flex-col items-center gap-2 p-3 rounded-lg bg-[var(--bg-secondary)] border border-[var(--border-color)] text-center"
+          >
+            <img
+              src={character.avatar}
+              alt={character.name}
+              class="w-14 h-14 object-cover rounded-full border-2 border-[var(--border-color)]"
+            />
+            <span class="text-sm font-medium text-[var(--text-primary)]">{character.name}</span>
+            <span
+              class="text-xs px-2 py-0.5 rounded-full bg-[var(--accent-primary)]/20 text-[var(--accent-primary)] font-medium"
+            >
+              {character.title}
+            </span>
+            <p class="text-xs text-[var(--text-secondary)] leading-snug">{character.description}</p>
+          </div>
+        {/each}
+      </div>
+    </div>
+  </div>
+</div>
+
+<style>
+  [role="dialog"] {
+    animation: slideIn 0.2s ease-out;
+  }
+
+  @keyframes slideIn {
+    from {
+      opacity: 0;
+      transform: scale(0.95);
+    }
+    to {
+      opacity: 1;
+      transform: scale(1);
+    }
+  }
+</style>
@@ -0,0 +1,153 @@
+<script lang="ts">
+  import { invoke } from "@tauri-apps/api/core";
+  import { openUrl } from "@tauri-apps/plugin-opener";
+  import { getVersion } from "@tauri-apps/api/app";
+  import { onMount } from "svelte";
+  import type { ChangelogEntry } from "$lib/types/messages";
+  import Markdown from "./Markdown.svelte";
+
+  interface Props {
+    onClose: () => void;
+  }
+
+  const { onClose }: Props = $props();
+
+  let entries = $state<ChangelogEntry[]>([]);
+  let loading = $state(true);
+  let error = $state<string | null>(null);
+  let currentVersion = $state("");
+
+  export function formatReleaseDate(isoString: string): string {
+    if (!isoString) return "Unknown date";
+    const date = new Date(isoString);
+    if (isNaN(date.getTime())) return "Unknown date";
+    return date.toLocaleDateString("en-GB", {
+      year: "numeric",
+      month: "long",
+      day: "numeric",
+      timeZone: "UTC",
+    });
+  }
+
+  async function loadChangelog(): Promise<void> {
+    loading = true;
+    error = null;
+    try {
+      entries = await invoke<ChangelogEntry[]>("fetch_changelog");
+    } catch (err) {
+      error = err instanceof Error ? err.message : String(err);
+    } finally {
+      loading = false;
+    }
+  }
+
+  onMount(async () => {
+    currentVersion = await getVersion();
+    await loadChangelog();
+  });
+</script>
+
+<div
+  class="fixed inset-0 bg-black/50 backdrop-blur-sm z-50 flex items-center justify-center p-4"
+  onclick={onClose}
+  role="button"
+  tabindex="0"
+  onkeydown={(e) => e.key === "Escape" && onClose()}
+>
+  <div
+    class="bg-[var(--bg-primary)] border border-[var(--border-color)] rounded-lg shadow-xl max-w-2xl w-full max-h-[80vh] overflow-hidden flex flex-col"
+    onclick={(e) => e.stopPropagation()}
+    onkeydown={(e) => e.stopPropagation()}
+    role="dialog"
+    aria-labelledby="changelog-title"
+    tabindex="-1"
+  >
+    <div class="flex items-center justify-between p-6 pb-4 border-b border-[var(--border-color)]">
+      <h2 id="changelog-title" class="text-xl font-semibold text-[var(--text-primary)]">
+        Changelog
+      </h2>
+      <button
+        onclick={onClose}
+        class="p-1 text-[var(--text-secondary)] hover:text-[var(--text-primary)] transition-colors"
+        aria-label="Close"
+      >
+        <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
+          <path
+            stroke-linecap="round"
+            stroke-linejoin="round"
+            stroke-width="2"
+            d="M6 18L18 6M6 6l12 12"
+          />
+        </svg>
+      </button>
+    </div>
+
+    <div class="overflow-y-auto flex-1 p-6">
+      {#if loading}
+        <div class="flex items-center justify-center py-12">
+          <div
+            class="w-8 h-8 border-2 border-[var(--accent-primary)] border-t-transparent rounded-full animate-spin"
+          ></div>
+          <span class="ml-3 text-[var(--text-secondary)]">Fetching releases...</span>
+        </div>
+      {:else if error}
+        <div class="text-center py-12">
+          <p class="text-red-400 mb-4">{error}</p>
+          <button onclick={loadChangelog} class="btn-trans-gradient px-4 py-2 rounded text-sm">
+            Retry
+          </button>
+        </div>
+      {:else if entries.length === 0}
+        <p class="text-center text-[var(--text-secondary)] py-12">No releases found.</p>
+      {:else}
+        <div class="space-y-6">
+          {#each entries as entry (entry.version)}
+            <div class="border border-[var(--border-color)] rounded-lg overflow-hidden">
+              <div
+                class="flex flex-wrap items-center gap-2 px-4 py-3 bg-[var(--bg-secondary)] border-b border-[var(--border-color)]"
+              >
+                <span
+                  class="font-mono font-semibold text-sm {entry.version === `v${currentVersion}`
+                    ? 'text-[var(--trans-pink)]'
+                    : 'text-[var(--text-primary)]'}"
+                >
+                  {entry.version}
+                </span>
+                {#if entry.version === `v${currentVersion}`}
+                  <span
+                    class="text-xs px-2 py-0.5 rounded-full bg-[var(--trans-pink)]/20 text-[var(--trans-pink)] border border-[var(--trans-pink)]/30"
+                  >
+                    current
+                  </span>
+                {/if}
+                {#if entry.prerelease}
+                  <span
+                    class="text-xs px-2 py-0.5 rounded-full bg-yellow-500/20 text-yellow-400 border border-yellow-500/30"
+                  >
+                    pre-release
+                  </span>
+                {/if}
+                <span class="ml-auto text-xs text-[var(--text-muted)]">
+                  {formatReleaseDate(entry.created_at)}
+                </span>
+                <button
+                  onclick={() => openUrl(entry.url)}
+                  class="text-xs text-[var(--accent-primary)] hover:text-[var(--accent-primary-hover)] transition-colors underline"
+                >
+                  View on Gitea
+                </button>
+              </div>
+              {#if entry.notes}
+                <div class="p-4 text-sm text-[var(--text-secondary)]">
+                  <Markdown content={entry.notes} />
+                </div>
+              {:else}
+                <p class="p-4 text-xs text-[var(--text-muted)] italic">No release notes.</p>
+              {/if}
+            </div>
+          {/each}
+        </div>
+      {/if}
+    </div>
+  </div>
+</div>
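A standalone sketch of the `formatReleaseDate` helper added above: invalid or empty input falls back to "Unknown date", and valid ISO timestamps are rendered in en-GB long form pinned to UTC (the exact locale string depends on the runtime's ICU data, so only the fallback branches are asserted here):

```typescript
// Standalone copy of the formatReleaseDate helper from this diff.
function formatReleaseDate(isoString: string): string {
  if (!isoString) return "Unknown date";
  const date = new Date(isoString);
  if (isNaN(date.getTime())) return "Unknown date";
  // UTC pinning keeps the displayed day stable regardless of the user's timezone.
  return date.toLocaleDateString("en-GB", {
    year: "numeric",
    month: "long",
    day: "numeric",
    timeZone: "UTC",
  });
}

console.log(formatReleaseDate("2024-06-01T12:00:00Z")); // e.g. "1 June 2024" with full ICU data
console.log(formatReleaseDate("")); // → "Unknown date"
console.log(formatReleaseDate("not-a-date")); // → "Unknown date"
```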