
fix: sanitize Windows LoRA paths before loading on Linux cloud worker (#770) #773

Open

livepeer-tessa wants to merge 3 commits into main from fix/770-windows-lora-path
Conversation

@livepeer-tessa
Contributor

Summary

Fixes #770.

When a Windows user has LoRA models configured locally and triggers remote inference, the frontend sends absolute Windows paths like C:\Users\RONDO\.daydream-scope\models\lora\foo.safetensors in load_params.loras[].path. The Linux fal.ai cloud worker receives these verbatim and fails with:

LoRA loading failed. File not found: C:\Users\RONDO\.daydream-scope\models\lora\foo.safetensors

Fix

Added PipelineManager._sanitize_lora_paths(), which runs inside _apply_load_params() (the server-side method executed on the cloud worker) before LoRA paths are committed to the pipeline config. It:

  1. Detects Windows absolute paths via regex (X:\... or X:/...)
  2. Detects out-of-lora-dir Unix absolute paths (e.g. a macOS user's home dir path reaching the Linux worker)
  3. Extracts the bare filename and resolves it under get_lora_dir()
  4. Logs a warning when a rewrite happens so it's observable in cloud logs
  5. Leaves relative paths and valid absolute paths unchanged
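The steps above can be sketched as a standalone function. This is a minimal illustration, not the PR's implementation: the name sanitize_lora_path, the LORA_DIR constant, and the /data/models/lora location are assumptions standing in for the PR's _sanitize_lora_paths() and get_lora_dir().

```python
import logging
import re
from pathlib import Path, PureWindowsPath

logger = logging.getLogger(__name__)

# Hypothetical stand-in for the worker's get_lora_dir() helper.
LORA_DIR = Path("/data/models/lora")

# Matches Windows absolute paths: X:\... or X:/...
_WINDOWS_ABS_RE = re.compile(r"^[A-Za-z]:[\\/]")


def sanitize_lora_path(path: str, lora_dir: Path = LORA_DIR) -> str:
    """Rewrite client-supplied LoRA paths so they resolve on the worker."""
    if not path:
        return path
    if _WINDOWS_ABS_RE.match(path):
        # Windows absolute path: keep only the bare filename.
        filename = PureWindowsPath(path).name
    elif path.startswith("/") and not path.startswith(lora_dir.as_posix() + "/"):
        # Unix absolute path outside the worker's LoRA directory
        # (e.g. a macOS user's home dir reaching the Linux worker).
        filename = Path(path).name
    else:
        # Relative paths and in-dir absolute paths pass through unchanged.
        return path
    rewritten = lora_dir.as_posix() + "/" + filename
    # Warn so the rewrite is observable in cloud logs.
    logger.warning("Rewriting LoRA path %r -> %r", path, rewritten)
    return rewritten
```

In a real deployment the in-dir check would likely also normalize the path first; the sketch keeps the prefix comparison simple to mirror the five steps listed above.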

Tests

7 new unit tests covering all cases (Windows backslash, Windows forward-slash, out-of-dir Unix, valid relative, valid absolute, empty path, mixed list). All 332 existing tests continue to pass.

Reviewers

@mjh1 @emranemran

livepeer-robot added 2 commits March 29, 2026 17:30
…ge handling

Port 52178 can fall inside Windows 11's OS-reserved TCP excluded ranges
(visible via `netsh int ipv4 show excludedportrange protocol=tcp`), causing
Scope to hang at startup with 'No available ports found after 5 attempts'.

Changes:
- Change DEFAULT_PORT from 52178 to 18080, which sits well outside both the
  Windows dynamic port range (49152-65535) and known exclusion blocks
- Increase findAvailablePort maxAttempts from 5 to 50
- Add jump-ahead logic: after 10 consecutive failures, skip 200 ports to
  escape a wide OS-excluded block more quickly
- Simplify error handling in isPortAvailable (all errors = port unavailable)

Fixes #606, reported by @viborc and confirmed by @Tobe2d

Signed-off-by: livepeer-robot <robot@livepeer.org>
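The port-scan strategy in this commit (50 attempts, jump ahead 200 ports after 10 consecutive failures) lives in the frontend's findAvailablePort/isPortAvailable; the following is a hedged Python analogue of that logic, not the shipped code, shown only to illustrate the retry-with-jump-ahead behavior.

```python
import socket

# Default chosen outside the Windows dynamic range (49152-65535)
# and known OS-excluded blocks.
DEFAULT_PORT = 18080


def is_port_available(port: int) -> bool:
    """Treat any bind error as 'port unavailable'."""
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.bind(("127.0.0.1", port))
        return True
    except OSError:
        return False


def find_available_port(start: int = DEFAULT_PORT, max_attempts: int = 50) -> int:
    port = start
    consecutive_failures = 0
    for _ in range(max_attempts):
        if is_port_available(port):
            return port
        consecutive_failures += 1
        if consecutive_failures >= 10:
            # Jump ahead to escape a wide OS-excluded block quickly.
            port += 200
            consecutive_failures = 0
        else:
            port += 1
    raise RuntimeError(f"No available ports found after {max_attempts} attempts")
```

The jump size of 200 is a trade-off: large enough to clear typical excludedportrange blocks in one hop, small enough not to wander far from the configured default.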
When a Windows client sends load_params to a Linux fal.ai cloud worker,
LoRA 'path' fields contain Windows absolute paths (e.g.
C:\Users\RONDO\.daydream-scope\models\lora\foo.safetensors) that are
meaningless on Linux. This caused pipeline load failures with
'File not found: C:\...' errors.

Add PipelineManager._sanitize_lora_paths() which:
- Detects Windows absolute paths (X:\... or X:/...) via regex
- Detects Unix absolute paths that fall outside the configured lora_dir
- Extracts the bare filename and rebuilds the path under get_lora_dir()
- Logs a warning when rewriting occurs for observability
- Leaves relative paths and already-valid absolute paths unchanged

Call the sanitizer in _apply_load_params() before writing loras into
the pipeline config dict.

Add 7 unit tests covering: Windows backslash paths, Windows forward-slash paths, out-of-dir Unix paths, valid relative paths, already-valid absolute paths, empty path, and mixed lists.

Fixes #770

Signed-off-by: livepeer-robot <robot@livepeer.org>
@coderabbitai

coderabbitai bot commented Mar 31, 2026

Important

Review skipped

Auto reviews are disabled on this repository. Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.

⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 6a8d7e24-233e-45f3-a426-bd6eaaa365ef


On Windows, Path('/data/models/lora') / filename produces backslash
paths, causing test_pipeline_manager.py Windows CI tests to fail.
Switch to lora_dir.as_posix() + '/' + filename so the output path
always uses forward slashes regardless of the runner OS.

Also run ruff format on tests/test_pipeline_manager.py to fix the
formatting lint failure.

Signed-off-by: livepeer-robot <robot@livepeer.org>
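The cross-OS behavior this commit fixes can be reproduced portably with pathlib's Pure* classes, which simulate either OS's path semantics on any runner. A small sketch (the /data/models/lora location is illustrative):

```python
from pathlib import PurePosixPath, PureWindowsPath

# Simulate both CI runner OSes with the Pure* path classes.
win = PureWindowsPath("/data/models/lora") / "foo.safetensors"
posix = PurePosixPath("/data/models/lora") / "foo.safetensors"

str(win)        # backslash-separated on Windows semantics, so string comparisons fail
str(posix)      # '/data/models/lora/foo.safetensors'
win.as_posix()  # '/data/models/lora/foo.safetensors' regardless of OS
```

This is why building the path as lora_dir.as_posix() + '/' + filename keeps the Windows CI tests green: as_posix() always emits forward slashes.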
@github-actions
Contributor

github-actions bot commented Mar 31, 2026

🚀 fal.ai Preview Deployment

App ID daydream/scope-pr-773--preview
WebSocket wss://fal.run/daydream/scope-pr-773--preview/ws
Commit 52c2aca

Livepeer Runner

App ID daydream/scope-livepeer-pr-773--preview
WebSocket wss://fal.run/daydream/scope-livepeer-pr-773--preview/ws
Auth private

Testing

Connect to this preview deployment by running this on your branch:

uv run build && SCOPE_CLOUD_APP_ID="daydream/scope-pr-773--preview/ws" uv run daydream-scope

Livepeer mode:

SCOPE_CLOUD_MODE=livepeer SCOPE_CLOUD_APP_ID="daydream/scope-livepeer-pr-773--preview/ws" uv run daydream-scope

🧪 E2E tests will run automatically against this deployment.

@github-actions
Contributor

github-actions bot commented Mar 31, 2026

✅ E2E Tests passed

Status passed
fal App daydream/scope-pr-773--preview
Run View logs

Test Artifacts

Check the workflow run for screenshots.


Development

Successfully merging this pull request may close these issues.

[fal.ai] LoRA loading fails with Windows local path sent to Linux cloud worker