Compare commits
9 commits

- 200c7d6d8c
- 53f7750e0d
- cfb1f8ffbc
- ab99ac3f34
- 7e0502ca40
- b48784e2ad
- 4c33cb48a9
- b5d9939193
- cdd27043a2

39 changed files with 3734 additions and 737 deletions
Dockerfile (10 changes)
```diff
@@ -52,7 +52,8 @@ RUN groupadd --gid 1000 appuser \
 # ── Application code ─────────────────────────────────────────────────────────
 WORKDIR /app
-COPY app.py wsgi.py gunicorn.conf.py requirements.txt ./
+COPY app/ app/
+COPY wsgi.py run.py gunicorn.conf.py requirements.txt ./
 COPY templates/ templates/
 COPY static/ static/

@@ -61,6 +62,10 @@ COPY static/ static/
 # All file-system access by the application is restricted to this path.
 RUN mkdir -p /media && chown appuser:appuser /media

+# ── Data directory for the SQLite settings database ──────────────────────────
+# Mounted as a named volume in docker-compose so settings survive restarts.
+RUN mkdir -p /data && chown appuser:appuser /data
+
 # ── File ownership ────────────────────────────────────────────────────────────
 RUN chown -R appuser:appuser /app

@@ -73,6 +78,7 @@ USER appuser
 # PORT      — TCP port Gunicorn listens on (exposed below).
 # LOG_LEVEL — Gunicorn log verbosity (debug | info | warning | error).
 ENV MEDIA_ROOT=/media \
+    DB_PATH=/data/videopress.db \
     PORT=8080 \
     LOG_LEVEL=info \
     PYTHONUNBUFFERED=1 \

@@ -89,4 +95,4 @@ HEALTHCHECK --interval=30s --timeout=10s --start-period=15s --retries=3 \
   || exit 1

 # ── Start Gunicorn ────────────────────────────────────────────────────────────
-CMD ["gunicorn", "-c", "gunicorn.conf.py", "wsgi:app"]
+CMD ["gunicorn", "-c", "gunicorn.conf.py", "wsgi:application"]
```
README.md (91 changes)
````diff
@@ -13,6 +13,24 @@ containerised with **Docker**. FFmpeg compresses video files to approximately
 1/3 their original size. All file-system access is restricted to a single
 configurable media root for security.

+1. Set the folder to scan and the minimum file size to scan for.
+
+
+
+2. Select from the files found.
+
+
+
+3. Start the compression and watch the progress.
+
+
+
+4. Get an email notification when a compression run finishes.
+
+
+
+
 ---

 ## Quick start with Docker (recommended)
````
````diff
@@ -29,8 +47,18 @@ Step 1. You have two options
 Already included in the `docker-compose.yml` file with this project.

 Step 2. Run — replace `/your/video/path` with the real path on your host

+If you want to persist data between updates and changes, create a `data` directory in the same location as your `docker-compose.yml` file:
+
+`mkdir -p data`
+
+Ensure the `data` directory is accessible with permissions 775:
+
+`chmod -R 775 ./data`
+
 A. Using `docker run`:

+- with an image you build:
+
 ```
 docker run -d \
````
````diff
@@ -38,8 +66,22 @@ docker run -d \
   --restart unless-stopped \
   -p 8080:8080 \
   -v /your/video/path:/media \
+  -v ./data:/data \
   videopress
 ```

+- with the pre-built image:
+
+```
+docker run -d \
+  --name videopress \
+  --restart unless-stopped \
+  -p 8080:8080 \
+  -v /your/video/path:/media \
+  -v ./data:/data \
+  bmcgonag/videopress:latest
+```
+
 B. Using Docker Compose

 `docker compose up -d`
````
```diff
@@ -84,7 +126,7 @@ MEDIA_ROOT=/your/video/path ./start.sh --prod 9000
 Every API call that accepts a path validates it with `safe_path()` before any
 OS operation. `safe_path()` resolves symlinks and asserts the result is inside
 `MEDIA_ROOT`. Requests that attempt directory traversal are rejected with
-HTTP 403. The container runs as a non-root user (UID 1000).
+HTTP 403. The container runs as a non-root user (your UID; run `id` in a terminal and note the `uid` value, usually 1000).

 ---
```
````diff
@@ -102,6 +144,8 @@ Browser ──HTTP──▶ Gunicorn (gevent worker)
      └─ POST /api/compress/cancel/<id>
 ```

+
+
 **Why gevent?** SSE (`/api/compress/progress`) is a long-lived streaming
 response. Standard Gunicorn sync workers block for its entire duration.
 Gevent workers use cooperative greenlets so a single worker process can
````
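The referenced gunicorn.conf.py is not shown in this diff. A minimal configuration consistent with the description above (a single gevent worker so SSE streams don't block other requests) might look like this sketch; every value here is an assumption, not the repo's actual file:

```python
# Hypothetical gunicorn.conf.py; a sketch, not the file from this repository.
import os

bind = f"0.0.0.0:{os.environ.get('PORT', '8080')}"
worker_class = 'gevent'  # cooperative greenlets: one process serves many SSE streams
workers = 1              # single worker: job state lives in-process (see app/jobs.py note)
timeout = 0              # disable the worker timeout so long-lived SSE responses survive
loglevel = os.environ.get('LOG_LEVEL', 'info')
```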
````diff
@@ -128,17 +172,44 @@ in-process job store with Redis.

 ```
 videocompressor/
-├── app.py             ← Flask application + all API routes
-├── wsgi.py            ← Gunicorn entry point (imports app from app.py)
-├── gunicorn.conf.py   ← Gunicorn configuration (gevent, timeout, logging)
-├── requirements.txt   ← Python dependencies
-├── Dockerfile         ← Two-stage Docker build
-├── docker-compose.yml ← Volume mapping, port, env vars
-├── start.sh           ← Helper script (dev + prod modes)
+├── app
+│   ├── __init__.py
+│   ├── config.py
+│   ├── db.py
+│   ├── jobs.py
+│   ├── media.py
+│   ├── notify.py
+│   └── routes.py
+├── wsgi.py
+├── gunicorn.conf.py
+├── requirements.txt
+├── Dockerfile
+├── docker-compose.yml
+├── start.sh
+├── README.md
 ├── templates/
 │   └── index.html
 └── static/
-    ├── css/main.css
-    └── js/app.js
+    ├── css
+    │   └── main.css
+    └── js
+        ├── app.js
+        └── modules
+            ├── browser.js
+            ├── compress.js
+            ├── progress.js
+            ├── scan.js
+            ├── session.js
+            ├── settings.js
+            ├── state.js
+            ├── stream.js
+            ├── theme.js
+            └── utils.js
 ```
````
## Contribute

Feel free to clone the repository, make updates, and submit a pull request to make this more feature-rich.

Keep in mind, I like to keep things fairly simple to use.

If you need deep ffmpeg knowledge to configure the perfect compression, that is probably beyond the scope of this app, but feel free to fork this repository and make your own as well.
BIN __pycache__/app.cpython-313.pyc (new file, binary file not shown)
```diff
@@ -62,10 +62,20 @@ VIDEO_EXTENSIONS = {

 def get_video_info(filepath: str) -> dict | None:
     """
     Use ffprobe to get duration, total bitrate, codec, and dimensions.
+
+    Bitrate resolution strategy (handles HEVC/MKV where stream-level
+    bit_rate is absent):
+      1. Stream-level bit_rate — present for H.264/MP4, often missing for HEVC
+      2. Format-level bit_rate — reliable for all containers
+      3. Derived from size/duration — final fallback
     """
     cmd = [
         'ffprobe', '-v', 'error',
         '-select_streams', 'v:0',
-        '-show_entries', 'format=duration,bit_rate,size:stream=codec_name,width,height',
+        '-show_entries',
+        'format=duration,bit_rate,size:stream=codec_name,width,height,bit_rate',
         '-of', 'json',
         filepath,
     ]
```
```diff
@@ -78,15 +88,26 @@ def get_video_info(filepath: str) -> dict | None:
     stream = (data.get('streams') or [{}])[0]

     duration = float(fmt.get('duration', 0))
-    bit_rate = int(fmt.get('bit_rate', 0))
     size_bytes = int(fmt.get('size', 0))
     codec = stream.get('codec_name', 'unknown')
     width = stream.get('width', 0)
     height = stream.get('height', 0)

-    if bit_rate == 0 and duration > 0:
+    # Prefer stream-level bitrate, fall back to format-level, then derive
+    stream_br = int(stream.get('bit_rate') or 0)
+    format_br = int(fmt.get('bit_rate') or 0)
+    if stream_br > 0:
+        bit_rate = stream_br
+    elif format_br > 0:
+        bit_rate = format_br
+    elif duration > 0:
         bit_rate = int((size_bytes * 8) / duration)
+    else:
+        bit_rate = 0

     # Target ≈ 1/3 of the total bitrate, reserving 128 kbps for audio.
+    # For HEVC sources the format bitrate already includes audio, so the
+    # same formula applies regardless of codec.
     audio_bps = 128_000
     video_bps = bit_rate - audio_bps if bit_rate > audio_bps else bit_rate
     target_video_bps = max(int(video_bps / 3), 200_000)
```
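The fallback chain and the one-third target can be checked in isolation. A self-contained sketch with hypothetical probe values (mirror code, not an import from this repository):

```python
AUDIO_BPS = 128_000

def resolve_bitrate(stream_br: int, format_br: int, size_bytes: int, duration: float) -> int:
    # Mirrors the resolution order: stream bit_rate, then format bit_rate,
    # then a value derived from file size and duration.
    if stream_br > 0:
        return stream_br
    if format_br > 0:
        return format_br
    if duration > 0:
        return int((size_bytes * 8) / duration)
    return 0

def target_bitrate(bit_rate: int) -> int:
    # Target ≈ 1/3 of the video bitrate after reserving 128 kbps for audio,
    # floored at 200 kbps.
    video_bps = bit_rate - AUDIO_BPS if bit_rate > AUDIO_BPS else bit_rate
    return max(int(video_bps / 3), 200_000)

# HEVC/MKV case: both probed bitrates missing, so derive from size/duration.
# 750 MB over 600 s works out to 10 Mbps.
bit_rate = resolve_bitrate(0, 0, size_bytes=750_000_000, duration=600.0)
print(bit_rate)                 # 10000000
print(target_bitrate(bit_rate)) # 3290666
```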
```diff
@@ -345,13 +366,21 @@ def run_compression_job(job_id: str) -> None:

         src_path = file_info['path']
         target_bitrate = file_info.get('target_bit_rate_bps', 1_000_000)
+        src_codec = file_info.get('codec', 'unknown').lower()
         p = Path(src_path)
         out_path = str(p.parent / (p.stem + suffix + p.suffix))

+        # Choose encoder to match the source codec.
+        #   hevc / h265 / x265 → libx265
+        #   everything else    → libx264 (safe, universally supported)
+        is_hevc = src_codec in ('hevc', 'h265', 'x265')
+        encoder = 'libx265' if is_hevc else 'libx264'
+
         push_event(job, {
             'type': 'file_start', 'index': idx, 'total': total,
             'filename': p.name, 'output': out_path,
-            'message': f'Compressing ({idx + 1}/{total}): {p.name}',
+            'encoder': encoder,
+            'message': f'Compressing ({idx + 1}/{total}): {p.name} [{encoder}]',
         })

         try:
```
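The output-naming rule above inserts the suffix between the file stem and its extension; a small illustration (the suffix value here is a hypothetical example, not a default from the code):

```python
from pathlib import Path

# Same expression as in the worker: parent / (stem + suffix + suffix-extension).
p = Path('/media/movies/holiday.mkv')
suffix = '_compressed'  # hypothetical example value
out_path = str(p.parent / (p.stem + suffix + p.suffix))
print(out_path)  # /media/movies/holiday_compressed.mkv
```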
```diff
@@ -365,12 +394,34 @@ def run_compression_job(job_id: str) -> None:
             duration_secs = 0

         video_k = max(int(target_bitrate / 1000), 200)

+        # Build the encoder-specific part of the ffmpeg command.
+        #
+        # libx264 uses -maxrate / -bufsize for VBV (Video Buffering Verifier).
+        # libx265 passes those same constraints via -x265-params because its
+        # CLI option names differ from the generic ffmpeg flags.
+        # Both use AAC audio at 128 kbps.
+        # -movflags +faststart is only meaningful for MP4 containers; it is
+        # harmless (silently ignored) for MKV/MOV/etc.
+        if is_hevc:
+            vbv_maxrate = int(video_k * 1.5)
+            vbv_bufsize = video_k * 2
+            encoder_opts = [
+                '-c:v', 'libx265',
+                '-b:v', f'{video_k}k',
+                '-x265-params', f'vbv-maxrate={vbv_maxrate}:vbv-bufsize={vbv_bufsize}',
+            ]
+        else:
+            encoder_opts = [
+                '-c:v', 'libx264',
+                '-b:v', f'{video_k}k',
+                '-maxrate', f'{int(video_k * 1.5)}k',
+                '-bufsize', f'{video_k * 2}k',
+            ]
+
         cmd = [
             'ffmpeg', '-y', '-i', src_path,
-            '-c:v', 'libx264',
-            '-b:v', f'{video_k}k',
-            '-maxrate', f'{int(video_k * 1.5)}k',
-            '-bufsize', f'{video_k * 2}k',
+            *encoder_opts,
             '-c:a', 'aac', '-b:a', '128k',
             '-movflags', '+faststart',
             '-progress', 'pipe:1', '-nostats',
```
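The branch above can be exercised in isolation. A self-contained sketch that mirrors the option-building logic (mirror code with hypothetical inputs, not an import from the repository):

```python
def build_encoder_opts(video_k: int, is_hevc: bool) -> list[str]:
    # Mirrors the diff: x265 takes VBV limits via -x265-params,
    # x264 takes the generic -maxrate / -bufsize flags.
    if is_hevc:
        return [
            '-c:v', 'libx265',
            '-b:v', f'{video_k}k',
            '-x265-params', f'vbv-maxrate={int(video_k * 1.5)}:vbv-bufsize={video_k * 2}',
        ]
    return [
        '-c:v', 'libx264',
        '-b:v', f'{video_k}k',
        '-maxrate', f'{int(video_k * 1.5)}k',
        '-bufsize', f'{video_k * 2}k',
    ]

# A 3000 kbps target yields a 4500 kbps VBV cap and a 6000 kbit buffer.
print(build_encoder_opts(3000, True))
print(build_encoder_opts(3000, False))
```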
app/__init__.py (new file, 37 lines)

```python
"""
app/__init__.py
===============
Flask application factory.

Usage
-----
    from app import create_app
    flask_app = create_app()

Gunicorn (wsgi.py) calls create_app() once at startup.
The dev-server entry point (run.py) does the same.
"""

from flask import Flask

from .config import BASE_DIR, MEDIA_ROOT
from .db import init_db
from .routes import register_routes


def create_app() -> Flask:
    """
    Create and return a configured Flask application instance.
    """
    flask_app = Flask(
        __name__,
        template_folder=str(BASE_DIR / 'templates'),
        static_folder=str(BASE_DIR / 'static'),
    )

    # Initialise the SQLite settings database
    init_db()

    register_routes(flask_app)

    return flask_app
```
BIN app/__pycache__/__init__.cpython-313.pyc (new file, binary file not shown)
BIN app/__pycache__/config.cpython-313.pyc (new file, binary file not shown)
BIN app/__pycache__/jobs.cpython-313.pyc (new file, binary file not shown)
BIN app/__pycache__/media.cpython-313.pyc (new file, binary file not shown)
BIN app/__pycache__/notify.cpython-313.pyc (new file, binary file not shown)
BIN app/__pycache__/routes.cpython-313.pyc (new file, binary file not shown)
app/config.py (new file, 51 lines)

```python
"""
app/config.py
=============
Central configuration and the path-jail helper used by every other module.

All tuneable values can be overridden via environment variables:

    MEDIA_ROOT  Root directory the application may read/write (default: /media)
    DB_PATH     Path to the SQLite database file (default: <project>/videopress.db)
    PORT        TCP port Gunicorn listens on (default: 8080)
    LOG_LEVEL   Gunicorn log verbosity (default: info)
"""

import os
from pathlib import Path

# ---------------------------------------------------------------------------
# Paths
# ---------------------------------------------------------------------------

PACKAGE_DIR = Path(__file__).resolve().parent   # …/app/
BASE_DIR = PACKAGE_DIR.parent                   # …/videocompressor/

# Every file-system operation in the application is restricted to MEDIA_ROOT.
MEDIA_ROOT = Path(os.environ.get('MEDIA_ROOT', '/media')).resolve()

# ---------------------------------------------------------------------------
# Path-jail helper
# ---------------------------------------------------------------------------

def safe_path(raw: str) -> Path:
    """
    Resolve *raw* to an absolute path and assert it is inside MEDIA_ROOT.

    Returns the resolved Path on success.
    Raises PermissionError if the path would escape MEDIA_ROOT (including
    symlink traversal and ../../ attacks).
    """
    try:
        resolved = Path(raw).resolve()
    except Exception:
        raise PermissionError(f"Invalid path: {raw!r}")

    root_str = str(MEDIA_ROOT)
    path_str = str(resolved)
    if path_str != root_str and not path_str.startswith(root_str + os.sep):
        raise PermissionError(
            f"Access denied: '{resolved}' is outside the allowed "
            f"media root ({MEDIA_ROOT})."
        )
    return resolved
```
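The jail test in `safe_path()` can be exercised without touching the file system. A simplified mimic (symlink resolution omitted, so this sketches only the prefix check; not code from the repository):

```python
import os
from pathlib import Path

def is_inside(root: Path, candidate: str) -> bool:
    # Same containment rule as safe_path(): either the root itself, or a
    # path that starts with root plus a separator. The separator matters:
    # '/mediafiles' must not pass for a root of '/media'.
    resolved = Path(candidate)  # real code calls .resolve() to follow symlinks
    root_str, path_str = str(root), str(resolved)
    return path_str == root_str or path_str.startswith(root_str + os.sep)

root = Path('/media')
print(is_inside(root, '/media/movies/a.mkv'))  # True
print(is_inside(root, '/mediafiles/a.mkv'))    # False: prefix alone is not enough
print(is_inside(root, '/etc/passwd'))          # False
```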
app/db.py (new file, 142 lines)

```python
"""
app/db.py
=========
Lightweight SQLite-backed key/value settings store.

The database file is created automatically on first use beside the
application package, or at the path set by the DB_PATH environment
variable (useful for Docker volume persistence).

Public API
----------
init_db()              — create the table if it doesn't exist (call at startup)
get_setting(key)       — return the stored string value, or None
save_setting(key, val) — upsert a key/value pair
get_all_settings()     — return all rows as {key: value}
delete_setting(key)    — remove a key (used to clear optional fields)
"""

import os
import sqlite3
import threading
from pathlib import Path

from .config import BASE_DIR

# ---------------------------------------------------------------------------
# Database location
# ---------------------------------------------------------------------------

# Default: videocompressor/videopress.db — sits beside the app/ package.
# Override with the DB_PATH env var (e.g. to a Docker-mounted volume path).
DB_PATH = Path(os.environ.get('DB_PATH', str(BASE_DIR / 'videopress.db')))

# SQLite connections are not thread-safe across threads; use a per-thread
# connection via threading.local() so each worker greenlet/thread gets its own.
_local = threading.local()

_INIT_LOCK = threading.Lock()
_initialised = False


def _connect() -> sqlite3.Connection:
    """Return (and cache) a per-thread SQLite connection."""
    if not hasattr(_local, 'conn') or _local.conn is None:
        _local.conn = sqlite3.connect(str(DB_PATH), check_same_thread=False)
        _local.conn.row_factory = sqlite3.Row
        # WAL mode allows concurrent reads alongside a single writer
        _local.conn.execute('PRAGMA journal_mode=WAL')
        _local.conn.execute('PRAGMA foreign_keys=ON')
    return _local.conn


# ---------------------------------------------------------------------------
# Schema
# ---------------------------------------------------------------------------

def init_db() -> None:
    """
    Create the settings table if it does not already exist.
    Also creates the parent directory of DB_PATH if needed.
    Safe to call multiple times — idempotent.
    """
    global _initialised
    with _INIT_LOCK:
        if _initialised:
            return

        # Ensure the directory exists before SQLite tries to create the file.
        # This handles the case where the Docker volume mount creates ./data
        # as root before the container user can write to it.
        db_dir = DB_PATH.parent
        try:
            db_dir.mkdir(parents=True, exist_ok=True)
        except PermissionError:
            raise PermissionError(
                f"Cannot create database directory '{db_dir}'. "
                f"If running in Docker, create the directory on the host first "
                f"and ensure it is writable by UID 1000:\n"
                f"    mkdir -p {db_dir} && chown 1000:1000 {db_dir}"
            )

        # Test that we can actually write to the directory before SQLite tries
        test_file = db_dir / '.write_test'
        try:
            test_file.touch()
            test_file.unlink()
        except PermissionError:
            raise PermissionError(
                f"Database directory '{db_dir}' is not writable by the current user. "
                f"If running in Docker, fix permissions on the host:\n"
                f"    chown 1000:1000 {db_dir}"
            )

        conn = _connect()
        conn.execute("""
            CREATE TABLE IF NOT EXISTS settings (
                key   TEXT PRIMARY KEY,
                value TEXT NOT NULL
            )
        """)
        conn.commit()
        _initialised = True


# ---------------------------------------------------------------------------
# CRUD helpers
# ---------------------------------------------------------------------------

def get_setting(key: str) -> str | None:
    """Return the stored value for *key*, or None if not set."""
    init_db()
    row = _connect().execute(
        'SELECT value FROM settings WHERE key = ?', (key,)
    ).fetchone()
    return row['value'] if row else None


def save_setting(key: str, value: str) -> None:
    """Insert or update *key* with *value*."""
    init_db()
    conn = _connect()
    conn.execute(
        'INSERT INTO settings (key, value) VALUES (?, ?)'
        ' ON CONFLICT(key) DO UPDATE SET value = excluded.value',
        (key, value),
    )
    conn.commit()


def delete_setting(key: str) -> None:
    """Remove *key* from the store (silently succeeds if absent)."""
    init_db()
    conn = _connect()
    conn.execute('DELETE FROM settings WHERE key = ?', (key,))
    conn.commit()


def get_all_settings() -> dict[str, str]:
    """Return all stored settings as a plain dict."""
    init_db()
    rows = _connect().execute('SELECT key, value FROM settings').fetchall()
    return {row['key']: row['value'] for row in rows}
```
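The upsert in `save_setting()` relies on SQLite's `ON CONFLICT … DO UPDATE` clause (available since SQLite 3.24). A self-contained demonstration against an in-memory database, using the same SQL (the `smtp_host` key is a hypothetical example):

```python
import sqlite3

# In-memory stand-in for the settings store, using the same schema and upsert.
conn = sqlite3.connect(':memory:')
conn.row_factory = sqlite3.Row
conn.execute('CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT NOT NULL)')

def save_setting(key: str, value: str) -> None:
    # Insert a new row, or overwrite the value if the key already exists.
    conn.execute(
        'INSERT INTO settings (key, value) VALUES (?, ?)'
        ' ON CONFLICT(key) DO UPDATE SET value = excluded.value',
        (key, value),
    )

save_setting('smtp_host', 'mail.example.com')
save_setting('smtp_host', 'smtp.example.com')  # second call updates in place
row = conn.execute('SELECT value FROM settings WHERE key = ?', ('smtp_host',)).fetchone()
print(row['value'])  # smtp.example.com
```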
app/jobs.py (new file, 349 lines)

```python
"""
app/jobs.py
===========
In-process job store and the ffmpeg compression worker thread.

Design note: job state is kept in a plain dict protected by a threading.Lock.
This is intentional — VideoPress uses a single Gunicorn worker process
(required for SSE streaming with gevent), so cross-process state sharing is
not needed. If you ever move to multiple workers, replace `active_jobs` with
a Redis-backed store and remove the threading.Lock.

Public API
----------
active_jobs           : dict {job_id -> job_dict}
job_lock              : Lock protecting mutations to active_jobs
push_event()          : append an SSE event to a job's event queue
run_compression_job() : worker — called in a daemon thread
"""

import os
import subprocess
import threading
import time
from pathlib import Path

from .notify import send_completion_email

# ---------------------------------------------------------------------------
# Job store
# ---------------------------------------------------------------------------

active_jobs: dict = {}
job_lock = threading.Lock()

# ---------------------------------------------------------------------------
# Helpers
# ---------------------------------------------------------------------------

def push_event(job: dict, event: dict) -> None:
    """Append *event* to job['events'] under the job's own lock."""
    with job['lock']:
        job['events'].append(event)


def _choose_encoder(codec: str) -> tuple[str, bool]:
    """
    Return (ffmpeg_encoder_name, is_hevc) for the given source codec string.

    HEVC / H.265 sources are re-encoded with libx265 to preserve efficiency.
    Everything else uses libx264 (universally supported, always available).
    """
    normalised = codec.lower()
    is_hevc = normalised in ('hevc', 'h265', 'x265')
    encoder = 'libx265' if is_hevc else 'libx264'
    return encoder, is_hevc


def _build_ffmpeg_cmd(
    src: str,
    out: str,
    video_k: int,
    is_hevc: bool,
    encoder: str,
) -> list[str]:
    """
    Build the ffmpeg command list for one file.

    libx264 accepts -maxrate / -bufsize directly.
    libx265 requires those same constraints via -x265-params because its
    CLI option names differ from the generic ffmpeg flags.
    Both use AAC audio at 128 kbps.
    -movflags +faststart is only meaningful for MP4 containers but is
    silently ignored for MKV / MOV / etc., so it is always included.
    """
    if is_hevc:
        vbv_maxrate = int(video_k * 1.5)
        vbv_bufsize = video_k * 2
        encoder_opts = [
            '-c:v', encoder,
            '-b:v', f'{video_k}k',
            '-x265-params', f'vbv-maxrate={vbv_maxrate}:vbv-bufsize={vbv_bufsize}',
        ]
    else:
        encoder_opts = [
            '-c:v', encoder,
            '-b:v', f'{video_k}k',
            '-maxrate', f'{int(video_k * 1.5)}k',
            '-bufsize', f'{video_k * 2}k',
        ]

    return [
        'ffmpeg', '-y', '-i', src,
        *encoder_opts,
        '-c:a', 'aac', '-b:a', '128k',
        '-movflags', '+faststart',
        '-progress', 'pipe:1', '-nostats',
        out,
    ]


def _get_duration(filepath: str) -> float:
    """Return the duration of *filepath* in seconds, or 0.0 on failure."""
    try:
        probe = subprocess.run(
            ['ffprobe', '-v', 'error',
             '-show_entries', 'format=duration',
             '-of', 'default=noprint_wrappers=1:nokey=1',
             filepath],
            capture_output=True, text=True, timeout=30,
        )
        return float(probe.stdout.strip()) if probe.stdout.strip() else 0.0
    except Exception:
        return 0.0


def _send_notification(job: dict, email_results: list[dict], cancelled: bool) -> None:
    """Send email and push a 'notify' event regardless of outcome."""
    notify_email = job.get('notify_email', '')
    if not notify_email:
        return
    ok, err = send_completion_email(notify_email, email_results, cancelled)
    push_event(job, {
        'type': 'notify',
        'success': ok,
        'message': (f'Notification sent to {notify_email}.' if ok
                    else f'Could not send notification: {err}'),
    })


# ---------------------------------------------------------------------------
# Compression worker
# ---------------------------------------------------------------------------

def run_compression_job(job_id: str) -> None:
    """
    Worker function executed in a daemon thread for each compression job.

    Iterates over the file list, runs ffmpeg for each file, streams progress
    events, and sends an email notification when finished (if requested).
    """
    with job_lock:
        job = active_jobs.get(job_id)
    if not job:
        return

    files = job['files']
    suffix = job['suffix']
    total = job['total']

    push_event(job, {
        'type': 'start',
        'total': total,
        'message': f'Starting compression of {total} file(s)',
    })

    for idx, file_info in enumerate(files):

        # ── Cancellation check ────────────────────────────────────────────
        with job['lock']:
            cancelled = job['cancelled']
        if cancelled:
            _handle_cancel(job, idx)
            return

        # ── Per-file setup ────────────────────────────────────────────────
        src_path = file_info['path']
        target_bitrate = file_info.get('target_bit_rate_bps', 1_000_000)
        src_codec = file_info.get('codec', 'unknown')
        p = Path(src_path)
        out_path = str(p.parent / (p.stem + suffix + p.suffix))
        encoder, is_hevc = _choose_encoder(src_codec)
        video_k = max(int(target_bitrate / 1000), 200)

        push_event(job, {
            'type': 'file_start',
            'index': idx,
            'total': total,
            'filename': p.name,
            'output': out_path,
            'encoder': encoder,
            'message': f'Compressing ({idx + 1}/{total}): {p.name} [{encoder}]',
        })

        duration_secs = _get_duration(src_path)
        cmd = _build_ffmpeg_cmd(src_path, out_path, video_k, is_hevc, encoder)

        # ── Run ffmpeg ────────────────────────────────────────────────────
        try:
            proc = subprocess.Popen(
                cmd,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
                text=True,
                bufsize=1,
            )
            with job['lock']:
                job['process'] = proc

            _stream_progress(job, proc, idx, duration_secs)
            proc.wait()

            with job['lock']:
                cancelled = job['cancelled']

            if cancelled:
                _remove_partial(out_path)
                _handle_cancel(job, idx)
                return

            if proc.returncode != 0:
                _push_file_error(job, idx, p.name, proc)
            else:
                _push_file_done(job, idx, p.name, out_path, file_info)

            with job['lock']:
                job['current_index'] = idx + 1

        except Exception as exc:
            push_event(job, {
                'type': 'file_error',
                'index': idx,
                'filename': p.name,
                'message': f'Exception: {exc}',
            })

    # ── All files processed ───────────────────────────────────────────────
    push_event(job, {
        'type': 'done',
        'message': f'All {total} file(s) processed.',
    })
    with job['lock']:
        job['status'] = 'done'
        all_events = list(job['events'])

    completed = [{'status': 'done', **e} for e in all_events if e.get('type') == 'file_done']
    errored = [{'status': 'error', **e} for e in all_events if e.get('type') == 'file_error']
    _send_notification(job, completed + errored, cancelled=False)


# ---------------------------------------------------------------------------
# Private sub-helpers
# ---------------------------------------------------------------------------

def _stream_progress(
    job: dict,
    proc: subprocess.Popen,
    idx: int,
    duration_secs: float,
) -> None:
    """Read ffmpeg's -progress output and push progress events."""
    for line in proc.stdout:
        with job['lock']:
            if job['cancelled']:
                proc.terminate()
                return

        line = line.strip()
        if '=' not in line:
            continue
        key, _, value = line.partition('=')
        key, value = key.strip(), value.strip()

        if key == 'out_time_ms' and duration_secs > 0:
            try:
                elapsed = int(value) / 1_000_000
                pct = min(100.0, (elapsed / duration_secs) * 100)
                push_event(job, {
                    'type': 'progress',
                    'index': idx,
                    'percent': round(pct, 1),
                    'elapsed_secs': round(elapsed, 1),
                    'duration_secs': round(duration_secs, 1),
                })
            except (ValueError, ZeroDivisionError):
                pass
        elif key == 'progress' and value == 'end':
            push_event(job, {
                'type': 'progress',
                'index': idx,
                'percent': 100.0,
                'elapsed_secs': duration_secs,
                'duration_secs': duration_secs,
            })


def _remove_partial(path: str) -> None:
    try:
        if os.path.exists(path):
            os.remove(path)
    except OSError:
        pass


def _handle_cancel(job: dict, idx: int) -> None:
    """Push cancel event, set status, send notification for cancelled run."""
    push_event(job, {'type': 'cancelled', 'message': 'Compression cancelled by user'})
    with job['lock']:
        job['status'] = 'cancelled'
        all_events = list(job['events'])

    completed = [{'status': 'done', **e} for e in all_events if e.get('type') == 'file_done']
    errored = [{'status': 'error', **e} for e in all_events if e.get('type') == 'file_error']
    _send_notification(job, completed + errored, cancelled=True)


def _push_file_error(
    job: dict,
    idx: int,
    filename: str,
    proc: subprocess.Popen,
) -> None:
    try:
        tail = proc.stderr.read()[-500:]
    except Exception:
        tail = ''
    push_event(job, {
        'type': 'file_error',
        'index': idx,
        'filename': filename,
        'message': f'ffmpeg exited with code {proc.returncode}',
        'detail': tail,
    })


def _push_file_done(
    job: dict,
    idx: int,
    filename: str,
    out_path: str,
    file_info: dict,
) -> None:
    try:
        out_sz = os.path.getsize(out_path)
        out_gb = round(out_sz / (1024 ** 3), 3)
        orig_sz = file_info.get('size_bytes', 0)
        reduction = round((1 - out_sz / orig_sz) * 100, 1) if orig_sz else 0
    except OSError:
        out_gb = 0
        reduction = 0

    push_event(job, {
        'type': 'file_done',
        'index': idx,
        'filename': filename,
        'output': out_path,
        'output_size_gb': out_gb,
        'reduction_pct': reduction,
        'message': f'Completed: (unknown) → saved {reduction}%',
    })
```
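The `-progress pipe:1` stream parsed by `_stream_progress()` is a sequence of `key=value` lines. A self-contained sketch of the percentage arithmetic (mirror code with a made-up sample; note that ffmpeg's `out_time_ms` value is in microseconds despite the name, which is why the code divides by 1,000,000):

```python
def parse_progress(lines: list[str], duration_secs: float) -> list[float]:
    # Mirrors the progress maths above: microseconds → seconds → percent,
    # clamped at 100.
    events = []
    for line in lines:
        key, _, value = line.strip().partition('=')
        if key == 'out_time_ms' and duration_secs > 0:
            elapsed = int(value) / 1_000_000
            events.append(round(min(100.0, elapsed / duration_secs * 100), 1))
    return events

# Hypothetical slice of ffmpeg's -progress output for a 120 s file.
sample = ['frame=240', 'out_time_ms=30000000', 'out_time_ms=60000000']
print(parse_progress(sample, 120.0))  # [25.0, 50.0]
```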
app/media.py (new file, 140 lines; listing truncated here)

```python
"""
app/media.py
============
File-system scanning and FFprobe metadata helpers.

Public API
----------
VIDEO_EXTENSIONS  : frozenset of lowercase video file suffixes
get_video_info()  : run ffprobe on a single file, return a metadata dict
list_video_files(): walk a directory tree and return files above a size floor
"""

import json
import os
import subprocess
from pathlib import Path

# ---------------------------------------------------------------------------
# Constants
# ---------------------------------------------------------------------------

VIDEO_EXTENSIONS: frozenset[str] = frozenset({
    '.mp4', '.mkv', '.mov', '.avi', '.wmv', '.flv',
    '.webm', '.m4v', '.mpg', '.mpeg', '.ts', '.mts',
    '.m2ts', '.vob', '.ogv', '.3gp', '.3g2',
})

# ---------------------------------------------------------------------------
# FFprobe helper
# ---------------------------------------------------------------------------

def get_video_info(filepath: str) -> dict | None:
    """
    Use ffprobe to get duration, total bitrate, codec, and dimensions.

    Returns a dict with the keys below, or None if ffprobe fails.

    Bitrate resolution order (handles HEVC/MKV where the stream-level
    bit_rate field is absent):
      1. Stream-level bit_rate — present for H.264/MP4, often missing for HEVC
      2. Format-level bit_rate — reliable for all containers
      3. Derived from size / duration — final fallback

    Returned keys
    -------------
    duration, bit_rate_bps, bit_rate_mbps,
    target_bit_rate_bps, target_bit_rate_mbps,
    size_bytes, size_gb, codec, width, height
    """
    cmd = [
        'ffprobe', '-v', 'error',
        '-select_streams', 'v:0',
        '-show_entries',
        'format=duration,bit_rate,size:stream=codec_name,width,height,bit_rate',
        '-of', 'json',
        filepath,
    ]
    try:
        result = subprocess.run(cmd, capture_output=True, text=True, timeout=30)
        if result.returncode != 0:
```
|
||||
return None
|
||||
|
||||
data = json.loads(result.stdout)
|
||||
fmt = data.get('format', {})
|
||||
stream = (data.get('streams') or [{}])[0]
|
||||
|
||||
duration = float(fmt.get('duration', 0))
|
||||
size_bytes = int(fmt.get('size', 0))
|
||||
codec = stream.get('codec_name', 'unknown')
|
||||
width = stream.get('width', 0)
|
||||
height = stream.get('height', 0)
|
||||
|
||||
stream_br = int(stream.get('bit_rate') or 0)
|
||||
format_br = int(fmt.get('bit_rate') or 0)
|
||||
if stream_br > 0:
|
||||
bit_rate = stream_br
|
||||
elif format_br > 0:
|
||||
bit_rate = format_br
|
||||
elif duration > 0:
|
||||
bit_rate = int((size_bytes * 8) / duration)
|
||||
else:
|
||||
bit_rate = 0
|
||||
|
||||
# Target ≈ 1/3 of the total bitrate; reserve 128 kbps for audio.
|
||||
audio_bps = 128_000
|
||||
video_bps = bit_rate - audio_bps if bit_rate > audio_bps else bit_rate
|
||||
target_video_bps = max(int(video_bps / 3), 200_000)
|
||||
|
||||
return {
|
||||
'duration': duration,
|
||||
'bit_rate_bps': bit_rate,
|
||||
'bit_rate_mbps': round(bit_rate / 1_000_000, 2),
|
||||
'target_bit_rate_bps': target_video_bps,
|
||||
'target_bit_rate_mbps': round(target_video_bps / 1_000_000, 2),
|
||||
'size_bytes': size_bytes,
|
||||
'size_gb': round(size_bytes / (1024 ** 3), 3),
|
||||
'codec': codec,
|
||||
'width': width,
|
||||
'height': height,
|
||||
}
|
||||
except Exception:
|
||||
return None
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Directory scanner
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def list_video_files(directory: Path, min_size_gb: float) -> list[dict]:
|
||||
"""
|
||||
Recursively walk *directory* and return video files larger than
|
||||
*min_size_gb* gigabytes.
|
||||
|
||||
Each entry is a dict with: path, name, size_bytes, size_gb.
|
||||
Raises PermissionError if the root directory is inaccessible.
|
||||
"""
|
||||
min_bytes = min_size_gb * (1024 ** 3)
|
||||
results: list[dict] = []
|
||||
|
||||
try:
|
||||
for root, dirs, files in os.walk(directory):
|
||||
dirs[:] = [d for d in dirs if not d.startswith('.')]
|
||||
for fname in files:
|
||||
if Path(fname).suffix.lower() in VIDEO_EXTENSIONS:
|
||||
fpath = os.path.join(root, fname)
|
||||
try:
|
||||
fsize = os.path.getsize(fpath)
|
||||
if fsize >= min_bytes:
|
||||
results.append({
|
||||
'path': fpath,
|
||||
'name': fname,
|
||||
'size_bytes': fsize,
|
||||
'size_gb': round(fsize / (1024 ** 3), 3),
|
||||
})
|
||||
except OSError:
|
||||
continue
|
||||
except PermissionError as exc:
|
||||
raise PermissionError(f"Cannot access directory: {exc}") from exc
|
||||
|
||||
return results
|
||||
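The three-tier bitrate fallback in `get_video_info` can be exercised on its own; this sketch (a hypothetical helper with the same priority order) shows the HEVC/MKV case where only the format-level value is present:

```python
def resolve_bitrate(stream_br: int, format_br: int,
                    size_bytes: int, duration: float) -> int:
    """Pick a total bitrate: stream-level, then format-level, then derived."""
    if stream_br > 0:
        return stream_br          # H.264/MP4 usually lands here
    if format_br > 0:
        return format_br          # HEVC/MKV usually lands here
    if duration > 0:
        return int((size_bytes * 8) / duration)  # last resort: size / time
    return 0

# HEVC in MKV: no stream-level bit_rate, format-level reports 4.5 Mbps.
print(resolve_bitrate(0, 4_500_000, 0, 0))  # → 4500000
```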
329  app/notify.py  Normal file
@@ -0,0 +1,329 @@
"""
|
||||
app/notify.py
|
||||
=============
|
||||
Email notification helper for compression job completion.
|
||||
|
||||
Delivery uses SMTP settings stored in SQLite (via app.db).
|
||||
If no SMTP settings have been configured, the send call returns an
|
||||
informative error rather than silently failing.
|
||||
|
||||
Public API
|
||||
----------
|
||||
get_smtp_config() -> dict with all SMTP fields (safe for the UI)
|
||||
send_completion_email(to, results, cancelled) -> (ok: bool, error: str)
|
||||
|
||||
SMTP settings keys (stored in the 'settings' table)
|
||||
----------------------------------------------------
|
||||
smtp_host — hostname or IP of the SMTP server
|
||||
smtp_port — port number (str)
|
||||
smtp_security — 'tls' (STARTTLS) | 'ssl' (SMTPS) | 'none'
|
||||
smtp_user — login username (optional)
|
||||
smtp_password — login password (optional, stored as-is)
|
||||
smtp_from — From: address used in sent mail
|
||||
"""
|
||||
|
||||
import smtplib
|
||||
import socket
|
||||
import ssl
|
||||
from email.mime.multipart import MIMEMultipart
|
||||
from email.mime.text import MIMEText
|
||||
from email.utils import formatdate, make_msgid
|
||||
|
||||
from .db import get_setting
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# SMTP config helper
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def get_smtp_config() -> dict:
|
||||
"""
|
||||
Read SMTP settings from the database and return them as a dict.
|
||||
The password field is replaced with a placeholder so this dict is
|
||||
safe to serialise and send to the browser.
|
||||
|
||||
Returns
|
||||
-------
|
||||
{
|
||||
host, port, security, user, from_addr,
|
||||
password_set: bool (True if a password is stored)
|
||||
}
|
||||
"""
|
||||
return {
|
||||
'host': get_setting('smtp_host') or '',
|
||||
'port': get_setting('smtp_port') or '587',
|
||||
'security': get_setting('smtp_security') or 'tls',
|
||||
'user': get_setting('smtp_user') or '',
|
||||
'from_addr': get_setting('smtp_from') or '',
|
||||
'password_set': bool(get_setting('smtp_password')),
|
||||
}
|
||||
|
||||
|
||||
def _load_smtp_config() -> dict:
|
||||
"""Load full config including the raw password (server-side only)."""
|
||||
return {
|
||||
'host': get_setting('smtp_host') or '',
|
||||
'port': int(get_setting('smtp_port') or 587),
|
||||
'security': get_setting('smtp_security') or 'tls',
|
||||
'user': get_setting('smtp_user') or '',
|
||||
'password': get_setting('smtp_password') or '',
|
||||
'from_addr': get_setting('smtp_from') or '',
|
||||
}
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Send helper
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def send_completion_email(
|
||||
to_address: str,
|
||||
results: list[dict],
|
||||
cancelled: bool,
|
||||
) -> tuple[bool, str]:
|
||||
"""
|
||||
Send a job-completion notification to *to_address* using the SMTP
|
||||
settings stored in SQLite.
|
||||
|
||||
Returns (success, error_message).
|
||||
"""
|
||||
if not to_address or '@' not in to_address:
|
||||
return False, 'Invalid recipient email address'
|
||||
|
||||
cfg = _load_smtp_config()
|
||||
|
||||
if not cfg['host']:
|
||||
return False, (
|
||||
'No SMTP server configured. '
|
||||
'Please add your SMTP settings in the ⚙ Settings panel.'
|
||||
)
|
||||
if not cfg['from_addr']:
|
||||
return False, (
|
||||
'No From address configured. '
|
||||
'Please add your SMTP settings in the ⚙ Settings panel.'
|
||||
)
|
||||
|
||||
# ── Build message ─────────────────────────────────────────────────────
|
||||
done_files = [r for r in results if r.get('status') == 'done']
|
||||
error_files = [r for r in results if r.get('status') == 'error']
|
||||
total = len(results)
|
||||
hostname = socket.getfqdn()
|
||||
|
||||
if cancelled:
|
||||
subject = (f'VideoPress: compression cancelled '
|
||||
f'({len(done_files)}/{total} completed) on {hostname}')
|
||||
elif error_files:
|
||||
subject = (f'VideoPress: compression complete with '
|
||||
f'{len(error_files)} error(s) on {hostname}')
|
||||
else:
|
||||
subject = (f'VideoPress: compression complete — '
|
||||
f'{total} file(s) processed on {hostname}')
|
||||
|
||||
msg = MIMEMultipart('alternative')
|
||||
msg['Subject'] = subject
|
||||
msg['From'] = cfg['from_addr']
|
||||
msg['To'] = to_address
|
||||
msg['Date'] = formatdate(localtime=True)
|
||||
msg['Message-ID'] = make_msgid(domain=hostname)
|
||||
msg.attach(MIMEText(
|
||||
_build_plain(hostname, cancelled, done_files, error_files, total),
|
||||
'plain', 'utf-8',
|
||||
))
|
||||
msg.attach(MIMEText(
|
||||
_build_html(hostname, subject, cancelled, done_files, error_files, total),
|
||||
'html', 'utf-8',
|
||||
))
|
||||
|
||||
# ── Connect and send ──────────────────────────────────────────────────
|
||||
try:
|
||||
security = cfg['security'].lower()
|
||||
host = cfg['host']
|
||||
port = cfg['port']
|
||||
|
||||
if security == 'ssl':
|
||||
# SMTPS — wrap in SSL from the start (port 465 typically)
|
||||
context = ssl.create_default_context()
|
||||
server = smtplib.SMTP_SSL(host, port, context=context, timeout=15)
|
||||
else:
|
||||
# Plain or STARTTLS (port 587 typically)
|
||||
server = smtplib.SMTP(host, port, timeout=15)
|
||||
server.ehlo()
|
||||
if security == 'tls':
|
||||
context = ssl.create_default_context()
|
||||
server.starttls(context=context)
|
||||
server.ehlo()
|
||||
|
||||
with server:
|
||||
if cfg['user'] and cfg['password']:
|
||||
server.login(cfg['user'], cfg['password'])
|
||||
server.sendmail(cfg['from_addr'], [to_address], msg.as_bytes())
|
||||
|
||||
return True, ''
|
||||
|
||||
except smtplib.SMTPAuthenticationError:
|
||||
return False, (
|
||||
'Authentication failed — check your username and password. '
|
||||
'For Gmail/Google Workspace, use an App Password rather than '
|
||||
'your account password.'
|
||||
)
|
||||
except smtplib.SMTPConnectError as exc:
|
||||
return False, (
|
||||
f'Could not connect to {host}:{port}. '
|
||||
f'Check the host, port, and security setting. ({exc})'
|
||||
)
|
||||
except smtplib.SMTPRecipientsRefused as exc:
|
||||
refused = ', '.join(exc.recipients.keys())
|
||||
return False, f'Recipient address rejected by server: {refused}'
|
||||
except smtplib.SMTPSenderRefused as exc:
|
||||
return False, (
|
||||
f'From address "{cfg["from_addr"]}" was rejected by the server. '
|
||||
f'Ensure it matches your authenticated account. ({exc.smtp_error.decode(errors="replace")})'
|
||||
)
|
||||
except smtplib.SMTPException as exc:
|
||||
return False, f'SMTP error: {exc}'
|
||||
except ssl.SSLError as exc:
|
||||
return False, (
|
||||
f'SSL/TLS error connecting to {host}:{port} — '
|
||||
f'try changing the Security setting. ({exc})'
|
||||
)
|
||||
except TimeoutError:
|
||||
return False, (
|
||||
f'Connection to {host}:{port} timed out. '
|
||||
f'Check the host and port, and that the server is reachable.'
|
||||
)
|
||||
except OSError as exc:
|
||||
return False, (
|
||||
f'Network error connecting to {host}:{port} — {exc}. '
|
||||
f'Check the hostname and that the server is reachable.'
|
||||
)
|
||||
except Exception as exc:
|
||||
return False, f'Unexpected error: {exc}'
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Email body builders
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
def _build_plain(hostname, cancelled, done_files, error_files, total) -> str:
|
||||
lines = [
|
||||
'VideoPress Compression Report',
|
||||
f'Host : {hostname}',
|
||||
f'Status : {"Cancelled" if cancelled else "Complete"}',
|
||||
f'Files : {len(done_files)} succeeded, {len(error_files)} failed, {total} total',
|
||||
'',
|
||||
]
|
||||
if done_files:
|
||||
lines.append('Completed files:')
|
||||
for r in done_files:
|
||||
lines.append(
|
||||
f" ✓ {r.get('filename','?')} "
|
||||
f"({r.get('output_size_gb','?')} GB, "
|
||||
f"-{r.get('reduction_pct','?')}%)"
|
||||
)
|
||||
lines.append('')
|
||||
if error_files:
|
||||
lines.append('Failed files:')
|
||||
for r in error_files:
|
||||
lines.append(
|
||||
f" ✗ {r.get('filename','?')} "
|
||||
f"— {r.get('message','unknown error')}"
|
||||
)
|
||||
lines.append('')
|
||||
lines += ['—', 'Sent by VideoPress FFmpeg Compressor']
|
||||
return '\n'.join(lines)
|
||||
|
||||
|
||||
def _build_html(hostname, subject, cancelled, done_files, error_files, total) -> str:
|
||||
status_colour = (
|
||||
'#166534' if not cancelled and not error_files
|
||||
else '#92400e' if cancelled
|
||||
else '#991b1b'
|
||||
)
|
||||
status_label = (
|
||||
'Cancelled' if cancelled
|
||||
else 'Complete ✓' if not error_files
|
||||
else 'Complete with errors'
|
||||
)
|
||||
|
||||
def file_rows(files, icon, bg):
|
||||
rows = ''
|
||||
for r in files:
|
||||
detail = (
|
||||
f"{r.get('output_size_gb','?')} GB · "
|
||||
f"-{r.get('reduction_pct','?')}%"
|
||||
if r.get('status') == 'done'
|
||||
else r.get('message', 'unknown error')
|
||||
)
|
||||
rows += (
|
||||
f'<tr style="background:{bg}">'
|
||||
f'<td style="padding:6px 10px;font-size:1.1em">{icon}</td>'
|
||||
f'<td style="padding:6px 10px;font-family:monospace;font-size:.9em">'
|
||||
f'{r.get("filename","?")}</td>'
|
||||
f'<td style="padding:6px 10px;color:#555;font-size:.85em">{detail}</td>'
|
||||
f'</tr>'
|
||||
)
|
||||
return rows
|
||||
|
||||
done_rows = file_rows(done_files, '✅', '#f0fdf4')
|
||||
error_rows = file_rows(error_files, '❌', '#fef2f2')
|
||||
|
||||
error_cell = (
|
||||
f'<div><div style="font-size:.7em;text-transform:uppercase;'
|
||||
f'letter-spacing:.06em;color:#6b7280;font-weight:700">Failed</div>'
|
||||
f'<div style="font-size:1.3em;font-weight:700;color:#991b1b">'
|
||||
f'{len(error_files)}</div></div>'
|
||||
) if error_files else ''
|
||||
|
||||
done_section = (
|
||||
f'<h2 style="font-size:1em;color:#166534;margin:0 0 8px">Completed</h2>'
|
||||
f'<table style="width:100%;border-collapse:collapse;margin-bottom:20px">'
|
||||
f'{done_rows}</table>'
|
||||
) if done_files else ''
|
||||
|
||||
error_section = (
|
||||
f'<h2 style="font-size:1em;color:#991b1b;margin:0 0 8px">Errors</h2>'
|
||||
f'<table style="width:100%;border-collapse:collapse;margin-bottom:20px">'
|
||||
f'{error_rows}</table>'
|
||||
) if error_files else ''
|
||||
|
||||
return f"""<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
<head><meta charset="UTF-8"><title>{subject}</title></head>
|
||||
<body style="font-family:system-ui,sans-serif;background:#f9fafb;margin:0;padding:24px">
|
||||
<div style="max-width:640px;margin:0 auto;background:#fff;border-radius:10px;
|
||||
box-shadow:0 2px 8px rgba(0,0,0,.08);overflow:hidden">
|
||||
<div style="background:#1a1a18;padding:20px 28px">
|
||||
<span style="color:#f97316;font-size:1.4em">▶</span>
|
||||
<span style="color:#f5f5f2;font-size:1.15em;font-weight:700;
|
||||
letter-spacing:.03em;margin-left:10px">
|
||||
Video<strong style="color:#f97316">Press</strong>
|
||||
</span>
|
||||
</div>
|
||||
<div style="padding:28px">
|
||||
<h1 style="margin:0 0 4px;font-size:1.2em;color:#111">Compression Run Report</h1>
|
||||
<p style="margin:0 0 20px;color:#6b7280;font-size:.9em">Host: <code>{hostname}</code></p>
|
||||
<div style="background:#f3f4f6;border-radius:8px;padding:16px 20px;
|
||||
margin-bottom:24px;display:flex;gap:32px;flex-wrap:wrap">
|
||||
<div>
|
||||
<div style="font-size:.7em;text-transform:uppercase;letter-spacing:.06em;
|
||||
color:#6b7280;font-weight:700">Status</div>
|
||||
<div style="font-size:1.3em;font-weight:700;color:{status_colour}">{status_label}</div>
|
||||
</div>
|
||||
<div>
|
||||
<div style="font-size:.7em;text-transform:uppercase;letter-spacing:.06em;
|
||||
color:#6b7280;font-weight:700">Total</div>
|
||||
<div style="font-size:1.3em;font-weight:700;color:#111">{total}</div>
|
||||
</div>
|
||||
<div>
|
||||
<div style="font-size:.7em;text-transform:uppercase;letter-spacing:.06em;
|
||||
color:#6b7280;font-weight:700">Succeeded</div>
|
||||
<div style="font-size:1.3em;font-weight:700;color:#166534">{len(done_files)}</div>
|
||||
</div>
|
||||
{error_cell}
|
||||
</div>
|
||||
{done_section}
|
||||
{error_section}
|
||||
<hr style="border:none;border-top:1px solid #e5e7eb;margin:24px 0 16px">
|
||||
<p style="color:#9ca3af;font-size:.78em;margin:0">Sent by VideoPress FFmpeg Compressor</p>
|
||||
</div>
|
||||
</div>
|
||||
</body>
|
||||
</html>"""
|
||||
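The subject line chosen by `send_completion_email` depends only on the cancelled flag and the error count; a standalone sketch of that three-way branch (the function name here is illustrative):

```python
def build_subject(hostname: str, total: int, done: int,
                  errors: int, cancelled: bool) -> str:
    """Mirror the subject-line priority: cancelled > errors > clean run."""
    if cancelled:
        return (f'VideoPress: compression cancelled '
                f'({done}/{total} completed) on {hostname}')
    if errors:
        return (f'VideoPress: compression complete with '
                f'{errors} error(s) on {hostname}')
    return (f'VideoPress: compression complete — '
            f'{total} file(s) processed on {hostname}')

print(build_subject('nas01', 5, 3, 1, cancelled=True))
```

Note that cancellation wins even when errors are also present, matching the `if cancelled / elif error_files / else` order above.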
470  app/routes.py  Normal file
@@ -0,0 +1,470 @@
"""
|
||||
app/routes.py
|
||||
=============
|
||||
All Flask route handlers. Registered on the app object via register_routes()
|
||||
which is called by the application factory in app/__init__.py.
|
||||
|
||||
Routes
|
||||
------
|
||||
GET / → index page
|
||||
GET /api/config → server configuration (media_root)
|
||||
GET /api/browse?path=… → directory listing
|
||||
POST /api/scan → scan for video files
|
||||
POST /api/compress/start → start a compression job
|
||||
GET /api/compress/progress/<id> → SSE progress stream
|
||||
POST /api/compress/cancel/<id> → cancel a running job
|
||||
"""
|
||||
|
||||
import json
|
||||
import time
|
||||
import threading
|
||||
from pathlib import Path
|
||||
|
||||
from flask import Flask, Response, jsonify, render_template, request, stream_with_context
|
||||
|
||||
from .config import MEDIA_ROOT, safe_path
|
||||
from .db import get_all_settings, save_setting, delete_setting
|
||||
from .media import get_video_info, list_video_files
|
||||
from .jobs import active_jobs, job_lock, run_compression_job
|
||||
from .notify import get_smtp_config, send_completion_email
|
||||
|
||||
|
||||
def fmttime(seconds: float) -> str:
|
||||
"""Format *seconds* as M:SS or H:MM:SS."""
|
||||
s = int(seconds)
|
||||
h = s // 3600
|
||||
m = (s % 3600) // 60
|
||||
sec = s % 60
|
||||
if h:
|
||||
return f"{h}:{m:02d}:{sec:02d}"
|
||||
return f"{m}:{sec:02d}"
|
||||
|
||||
|
||||
def register_routes(app: Flask) -> None:
|
||||
"""Attach all routes to *app*."""
|
||||
|
||||
# ── UI ────────────────────────────────────────────────────────────────
|
||||
|
||||
@app.route('/')
|
||||
def index():
|
||||
return render_template('index.html', media_root=str(MEDIA_ROOT))
|
||||
|
||||
# ── Config ────────────────────────────────────────────────────────────
|
||||
|
||||
@app.route('/api/config')
|
||||
def api_config():
|
||||
"""Return server-side settings the frontend needs at startup."""
|
||||
return jsonify({'media_root': str(MEDIA_ROOT)})
|
||||
|
||||
# ── SMTP settings ─────────────────────────────────────────────────────
|
||||
|
||||
@app.route('/api/settings/smtp', methods=['GET'])
|
||||
def smtp_settings_get():
|
||||
"""
|
||||
Return current SMTP settings (password is never sent, only a flag
|
||||
indicating whether one is stored).
|
||||
"""
|
||||
return jsonify(get_smtp_config())
|
||||
|
||||
@app.route('/api/settings/smtp', methods=['POST'])
|
||||
def smtp_settings_save():
|
||||
"""
|
||||
Save SMTP settings to SQLite. Only fields present in the request
|
||||
body are updated; omitting 'password' leaves the stored password
|
||||
unchanged (useful when the user edits other fields but doesn't want
|
||||
to re-enter the password).
|
||||
"""
|
||||
data = request.get_json(silent=True) or {}
|
||||
|
||||
# Fields whose DB key matches smtp_{field} exactly
|
||||
for field in ('host', 'port', 'security'):
|
||||
if field in data:
|
||||
value = str(data[field]).strip()
|
||||
if not value:
|
||||
return jsonify({'error': f"'{field}' cannot be empty"}), 400
|
||||
save_setting(f'smtp_{field}', value)
|
||||
|
||||
# from_addr is stored as 'smtp_from' (not 'smtp_from_addr')
|
||||
if 'from_addr' in data:
|
||||
value = str(data['from_addr']).strip()
|
||||
if not value:
|
||||
return jsonify({'error': "'from_addr' cannot be empty"}), 400
|
||||
save_setting('smtp_from', value)
|
||||
|
||||
# Optional fields
|
||||
if 'user' in data:
|
||||
val = str(data['user']).strip()
|
||||
if val:
|
||||
save_setting('smtp_user', val)
|
||||
else:
|
||||
delete_setting('smtp_user')
|
||||
|
||||
# Password: only update if a non-empty value is explicitly sent
|
||||
if 'password' in data and str(data['password']).strip():
|
||||
save_setting('smtp_password', str(data['password']).strip())
|
||||
|
||||
return jsonify({'ok': True, 'config': get_smtp_config()})
|
||||
|
||||
@app.route('/api/settings/smtp/test', methods=['POST'])
|
||||
def smtp_settings_test():
|
||||
"""
|
||||
Send a test email using the currently saved SMTP settings.
|
||||
Always returns HTTP 200 — SMTP failures are reported in the
|
||||
JSON body as {ok: false, message: "..."} so the browser can
|
||||
display the exact error without interference from proxies or
|
||||
the browser's own error handling for 5xx responses.
|
||||
"""
|
||||
data = request.get_json(silent=True) or {}
|
||||
test_to = data.get('to', '').strip()
|
||||
|
||||
if not test_to or '@' not in test_to:
|
||||
return jsonify({'ok': False, 'message': 'Please enter a valid recipient address.'}), 400
|
||||
|
||||
ok, err = send_completion_email(
|
||||
to_address = test_to,
|
||||
results = [{
|
||||
'status': 'done',
|
||||
'filename': 'test_video.mp4',
|
||||
'output_size_gb': 1.2,
|
||||
'reduction_pct': 33,
|
||||
}],
|
||||
cancelled = False,
|
||||
)
|
||||
|
||||
if ok:
|
||||
return jsonify({'ok': True, 'message': f'Test email sent to {test_to}.'})
|
||||
|
||||
# Always 200 — the caller checks data.ok, not the HTTP status
|
||||
return jsonify({'ok': False, 'message': err})
|
||||
|
||||
# ── Directory browser ─────────────────────────────────────────────────
|
||||
|
||||
@app.route('/api/browse')
|
||||
def browse_directory():
|
||||
raw = request.args.get('path', str(MEDIA_ROOT))
|
||||
try:
|
||||
path = safe_path(raw)
|
||||
except PermissionError as exc:
|
||||
return jsonify({'error': str(exc)}), 403
|
||||
|
||||
if not path.exists():
|
||||
return jsonify({'error': 'Path does not exist'}), 404
|
||||
if not path.is_dir():
|
||||
return jsonify({'error': 'Not a directory'}), 400
|
||||
|
||||
try:
|
||||
entries = [
|
||||
{'name': e.name, 'path': str(e), 'is_dir': e.is_dir()}
|
||||
for e in sorted(
|
||||
path.iterdir(),
|
||||
key=lambda e: (not e.is_dir(), e.name.lower()),
|
||||
)
|
||||
if not e.name.startswith('.')
|
||||
]
|
||||
parent = str(path.parent) if path != MEDIA_ROOT else None
|
||||
return jsonify({
|
||||
'current': str(path),
|
||||
'parent': parent,
|
||||
'entries': entries,
|
||||
'media_root': str(MEDIA_ROOT),
|
||||
})
|
||||
except PermissionError:
|
||||
return jsonify({'error': 'Permission denied'}), 403
|
||||
|
||||
# ── File scanner ──────────────────────────────────────────────────────
|
||||
|
||||
@app.route('/api/scan', methods=['POST'])
|
||||
def scan_directory():
|
||||
data = request.get_json(silent=True) or {}
|
||||
raw_dir = data.get('directory', '')
|
||||
min_size_gb = float(data.get('min_size_gb', 1.0))
|
||||
|
||||
if not raw_dir:
|
||||
return jsonify({'error': 'No directory provided'}), 400
|
||||
try:
|
||||
directory = safe_path(raw_dir)
|
||||
except PermissionError as exc:
|
||||
return jsonify({'error': str(exc)}), 403
|
||||
if not directory.is_dir():
|
||||
return jsonify({'error': 'Invalid directory'}), 400
|
||||
|
||||
try:
|
||||
files = list_video_files(directory, min_size_gb)
|
||||
except PermissionError as exc:
|
||||
return jsonify({'error': str(exc)}), 403
|
||||
|
||||
enriched = []
|
||||
for f in files:
|
||||
info = get_video_info(f['path'])
|
||||
if info:
|
||||
f.update(info)
|
||||
else:
|
||||
# Rough fallback: assume a 90-minute feature film
|
||||
bps = int((f['size_bytes'] * 8) / (90 * 60))
|
||||
f.update({
|
||||
'bit_rate_bps': bps,
|
||||
'bit_rate_mbps': round(bps / 1_000_000, 2),
|
||||
'target_bit_rate_bps': max(bps // 3, 200_000),
|
||||
'target_bit_rate_mbps': round(max(bps // 3, 200_000) / 1_000_000, 2),
|
||||
'duration': 0,
|
||||
'codec': 'unknown',
|
||||
'width': 0,
|
||||
'height': 0,
|
||||
})
|
||||
enriched.append(f)
|
||||
|
||||
enriched.sort(key=lambda x: x['size_bytes'], reverse=True)
|
||||
return jsonify({'files': enriched, 'count': len(enriched)})
|
||||
|
||||
# ── Compression — status snapshot (for reconnect/reload) ─────────────
|
||||
|
||||
@app.route('/api/compress/status/<job_id>')
|
||||
def compression_status(job_id):
|
||||
"""
|
||||
Return a complete point-in-time snapshot of a job's state.
|
||||
|
||||
This is used when the browser reconnects after losing the SSE stream
|
||||
(page reload, tab backgrounded, network blip). The frontend replays
|
||||
this snapshot to rebuild the full progress UI, then re-attaches the
|
||||
live SSE stream from where it left off.
|
||||
|
||||
Response shape
|
||||
--------------
|
||||
{
|
||||
job_id, status, total, current_index,
|
||||
files: [ {path, name, ...original file info} ],
|
||||
file_states: [ # one entry per file, index-aligned
|
||||
{
|
||||
status: 'waiting' | 'running' | 'done' | 'error',
|
||||
percent: 0-100,
|
||||
detail: str, # time elapsed / output size / error msg
|
||||
filename, output, reduction_pct, output_size_gb (done only)
|
||||
message (error only)
|
||||
}
|
||||
],
|
||||
done_count: int,
|
||||
event_count: int # total events stored; SSE stream resumes from here
|
||||
}
|
||||
"""
|
||||
with job_lock:
|
||||
job = active_jobs.get(job_id)
|
||||
if not job:
|
||||
return jsonify({'error': 'Job not found'}), 404
|
||||
|
||||
with job['lock']:
|
||||
events = list(job['events'])
|
||||
status = job['status']
|
||||
total = job['total']
|
||||
current_index = job['current_index']
|
||||
files = job['files']
|
||||
|
||||
# Replay the event log to reconstruct per-file state
|
||||
file_states = [
|
||||
{'status': 'waiting', 'percent': 0, 'detail': '', 'filename': f.get('name', '')}
|
||||
for f in files
|
||||
]
|
||||
done_count = 0
|
||||
|
||||
for evt in events:
|
||||
t = evt.get('type')
|
||||
idx = evt.get('index')
|
||||
|
||||
if t == 'file_start' and idx is not None:
|
||||
file_states[idx].update({
|
||||
'status': 'running',
|
||||
'percent': 0,
|
||||
'detail': '',
|
||||
'filename': evt.get('filename', file_states[idx]['filename']),
|
||||
'output': evt.get('output', ''),
|
||||
'encoder': evt.get('encoder', ''),
|
||||
})
|
||||
|
||||
elif t == 'progress' and idx is not None:
|
||||
file_states[idx].update({
|
||||
'status': 'running',
|
||||
'percent': evt.get('percent', 0),
|
||||
'detail': (
|
||||
f"{fmttime(evt.get('elapsed_secs',0))} / "
|
||||
f"{fmttime(evt.get('duration_secs',0))}"
|
||||
if evt.get('duration_secs', 0) > 0 else ''
|
||||
),
|
||||
})
|
||||
|
||||
elif t == 'file_done' and idx is not None:
|
||||
done_count += 1
|
||||
file_states[idx].update({
|
||||
'status': 'done',
|
||||
'percent': 100,
|
||||
'detail': (f"{evt.get('output_size_gb','?')} GB "
|
||||
f"saved {evt.get('reduction_pct','?')}%"),
|
||||
'filename': evt.get('filename', ''),
|
||||
'output': evt.get('output', ''),
|
||||
'reduction_pct': evt.get('reduction_pct', 0),
|
||||
'output_size_gb': evt.get('output_size_gb', 0),
|
||||
})
|
||||
|
||||
elif t == 'file_error' and idx is not None:
|
||||
file_states[idx].update({
|
||||
'status': 'error',
|
||||
'percent': 0,
|
||||
'detail': evt.get('message', 'Unknown error'),
|
||||
'message': evt.get('message', ''),
|
||||
})
|
||||
|
||||
return jsonify({
|
||||
'job_id': job_id,
|
||||
'status': status,
|
||||
'total': total,
|
||||
'current_index': current_index,
|
||||
'done_count': done_count,
|
||||
'event_count': len(events),
|
||||
'files': files,
|
||||
'file_states': file_states,
|
||||
})
|
||||
|
||||
# ── Compression — list active jobs (for page-load auto-reconnect) ─────
|
||||
|
||||
@app.route('/api/compress/active')
|
||||
def list_active_jobs():
|
||||
"""
|
||||
Return a list of jobs that are currently running or recently finished.
|
||||
The frontend calls this on page load to detect whether a job is in
|
||||
progress and should be reconnected to.
|
||||
"""
|
||||
with job_lock:
|
||||
jobs = list(active_jobs.values())
|
||||
|
||||
result = []
|
||||
for job in jobs:
|
||||
with job['lock']:
|
||||
result.append({
|
||||
'job_id': job['id'],
|
||||
'status': job['status'],
|
||||
'total': job['total'],
|
||||
'current_index': job['current_index'],
|
||||
})
|
||||
|
||||
# Most recent first
|
||||
result.sort(key=lambda j: j['job_id'], reverse=True)
|
||||
return jsonify({'jobs': result})
|
||||
|
||||
# ── Compression — start ───────────────────────────────────────────────
|
||||
|
||||
@app.route('/api/compress/start', methods=['POST'])
|
||||
def start_compression():
|
||||
data = request.get_json(silent=True) or {}
|
||||
files = data.get('files', [])
|
||||
suffix = data.get('suffix', '_new')
|
||||
notify_email = data.get('notify_email', '').strip()
|
||||
|
||||
if not files:
|
||||
return jsonify({'error': 'No files provided'}), 400
|
||||
|
||||
    if notify_email and (len(notify_email) > 254 or '@' not in notify_email):
        return jsonify({'error': 'Invalid notification email address'}), 400

    for f in files:
        try:
            safe_path(f.get('path', ''))
        except PermissionError as exc:
            return jsonify({'error': str(exc)}), 403

    job_id = f"job_{int(time.time() * 1000)}"
    job = {
        'id': job_id,
        'files': files,
        'suffix': suffix,
        'notify_email': notify_email,
        'status': 'running',
        'current_index': 0,
        'total': len(files),
        'events': [],
        'process': None,
        'cancelled': False,
        'lock': threading.Lock(),
    }
    with job_lock:
        active_jobs[job_id] = job

    threading.Thread(
        target=run_compression_job,
        args=(job_id,),
        daemon=True,
    ).start()
    return jsonify({'job_id': job_id})


# ── Compression — SSE progress stream ─────────────────────────────────

@app.route('/api/compress/progress/<job_id>')
def compression_progress(job_id):
    """
    Server-Sent Events stream for real-time job progress.

    Query param: ?from=N — start streaming from event index N (default 0).
    On reconnect the client passes the last event index it saw so it only
    receives new events, not a full replay of the history.

    Compatible with Gunicorn + gevent: time.sleep() yields the greenlet
    rather than blocking a real OS thread.
    """
    try:
        start_from = int(request.args.get('from', 0))
    except (TypeError, ValueError):
        start_from = 0

    def event_stream():
        last_idx = start_from
        while True:
            with job_lock:
                job = active_jobs.get(job_id)
            if not job:
                yield (
                    f"data: {json.dumps({'type': 'error', 'message': 'Job not found'})}\n\n"
                )
                return

            with job['lock']:
                new_events = job['events'][last_idx:]
                last_idx += len(new_events)
                status = job['status']

            for event in new_events:
                yield f"data: {json.dumps(event)}\n\n"

            if status in ('done', 'cancelled', 'error') and not new_events:
                break

            time.sleep(0.25)

    return Response(
        stream_with_context(event_stream()),
        mimetype='text/event-stream',
        headers={
            'Cache-Control': 'no-cache',
            'X-Accel-Buffering': 'no',
        },
    )
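The stream above frames every event as `data: <json>\n\n` and relies on the client echoing back the index of the last event it received (`?from=N`). A minimal sketch of that framing and the replay-from-index slice, with illustrative names that are not part of the app itself:

```python
import json

def frame_sse(event: dict) -> str:
    """Frame one event exactly as the endpoint yields it: 'data: <json>\\n\\n'."""
    return f"data: {json.dumps(event)}\n\n"

def replay_from(events: list, last_idx: int):
    """Return only the events the client has not seen yet, plus the new index."""
    new_events = events[last_idx:]
    return new_events, last_idx + len(new_events)

# A reconnecting client that already saw event 0 asks for ?from=1,
# so only the second event is re-sent.
events = [{'type': 'progress', 'pct': 10}, {'type': 'progress', 'pct': 55}]
pending, idx = replay_from(events, 1)
```

Because `replay_from` advances the index by exactly the number of events it returned, a client that reconnects with the returned index never receives duplicates.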
# ── Compression — cancel ──────────────────────────────────────────────

@app.route('/api/compress/cancel/<job_id>', methods=['POST'])
def cancel_compression(job_id):
    with job_lock:
        job = active_jobs.get(job_id)
    if not job:
        return jsonify({'error': 'Job not found'}), 404

    with job['lock']:
        job['cancelled'] = True
        proc = job.get('process')

    if proc and proc.poll() is None:
        try:
            proc.terminate()
            time.sleep(1)
            if proc.poll() is None:
                proc.kill()
        except Exception:
            pass

    return jsonify({'status': 'cancellation requested'})
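The cancel route above uses the common terminate-then-kill escalation on the FFmpeg child process. The same pattern in isolation, using `Popen.wait(timeout=...)` as a slight variation on the sleep-and-poll form, with a dummy sleeper standing in for FFmpeg:

```python
import subprocess
import sys

def stop_process(proc: subprocess.Popen, grace: float = 1.0) -> None:
    """Ask the process to exit; force-kill if it ignores the request."""
    if proc.poll() is not None:
        return  # already exited
    proc.terminate()  # SIGTERM: gives the child a chance to clean up
    try:
        proc.wait(timeout=grace)
    except subprocess.TimeoutExpired:
        proc.kill()   # SIGKILL: last resort
        proc.wait()

# Spawn a long-running child, then stop it.
proc = subprocess.Popen([sys.executable, '-c', 'import time; time.sleep(60)'])
stop_process(proc)
```

Calling `wait()` after `kill()` also reaps the child, avoiding a zombie process.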
@@ -13,13 +13,20 @@

services:
  videopress:
    # build:
    #   context: .
    #   dockerfile: Dockerfile
    build:
      context: .
      dockerfile: Dockerfile

    # ── Alternatively, use a pre-built image: ───────────────────────────────
    image: bmcgonag/videopress:latest
    # image: videopress:latest

    container_name: videopress

    # Run as UID:GID 1000:1000 (matches the 'appuser' created in the Dockerfile).
    # This ensures the container can write to bind-mounted host directories
    # that are owned by UID 1000.
    user: "1000:1000"

    restart: unless-stopped

    # ── Port mapping ─────────────────────────────────────────────────────────

@@ -38,8 +45,19 @@ services:
    # You can also set MEDIA_HOST_PATH as an environment variable before
    # running docker compose:
    #   export MEDIA_HOST_PATH=/mnt/nas/videos && docker compose up -d
    #
    # IMPORTANT — before first run, create the data directory on the HOST
    # and give it to UID 1000 (the container's non-root user) so SQLite can
    # write the settings database:
    #
    #   mkdir -p ./data
    #   chown 1000:1000 ./data
    #
    # If you skip this step Docker will create ./data as root and the
    # container will fail to start with "unable to open database file".
    volumes:
      - ${MEDIA_HOST_PATH:-/path/to/your/videos}:/media
      - ./data:/data

    # ── Environment variables ─────────────────────────────────────────────────
    environment:

@@ -47,6 +65,10 @@
      # Must match the right-hand side of the volume mount above.
      MEDIA_ROOT: /media

      # SQLite database path inside the container.
      # Must match the right-hand side of the ./data:/data volume mount.
      DB_PATH: /data/videopress.db

      # TCP port Gunicorn listens on (must match EXPOSE in Dockerfile and
      # the right-hand side of the ports mapping above).
      PORT: 8080

@@ -57,15 +79,14 @@
    # ── Resource limits (optional — uncomment to enable) ─────────────────────
    # Compressing large video files is CPU-intensive. Limits prevent the
    # container from starving other workloads on the host.
    # Feel free to comment out this whole section if you want it to run full blast.
    deploy:
      resources:
        limits:
          cpus: '4'
          memory: 2G
        reservations:
          cpus: '1'
          memory: 512M
    # deploy:
    #   resources:
    #     limits:
    #       cpus: '4'
    #       memory: 2G
    #     reservations:
    #       cpus: '1'
    #       memory: 512M

    # ── Health check ──────────────────────────────────────────────────────────
    healthcheck:
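The compose comments above warn that `./data` must be writable by UID 1000 before SQLite can create the settings database. A small startup check in that spirit; the function name is hypothetical and not part of the app:

```python
import os
import sqlite3
import tempfile

def ensure_db_writable(db_path: str) -> None:
    """Fail fast with a clear message if the DB directory is not writable."""
    parent = os.path.dirname(db_path) or '.'
    if not os.access(parent, os.W_OK):
        raise PermissionError(
            f"{parent} is not writable; run: mkdir -p {parent} && chown 1000:1000 {parent}"
        )
    # Opening a connection creates the file, mirroring first startup.
    sqlite3.connect(db_path).close()

# Demo against a throwaway directory that is always writable.
with tempfile.TemporaryDirectory() as d:
    db = os.path.join(d, 'videopress.db')
    ensure_db_writable(db)
    created = os.path.exists(db)
```

Failing at startup with an actionable message beats the bare "unable to open database file" error SQLite raises on a root-owned directory.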
22 run.py Normal file

@@ -0,0 +1,22 @@
#!/usr/bin/env python3
"""
run.py — Development server entry point.

Usage:
    python3 run.py [PORT]

Do NOT use this in production — use Gunicorn via wsgi.py instead.
"""

import sys
from app import create_app
from app.config import MEDIA_ROOT

if __name__ == '__main__':
    port = int(sys.argv[1]) if len(sys.argv) > 1 else 5000
    print(f"\n{'='*60}")
    print(f" VideoPress — dev server http://localhost:{port}")
    print(f" MEDIA_ROOT : {MEDIA_ROOT}")
    print(f" WARNING : dev server only — use Gunicorn for production")
    print(f"{'='*60}\n")
    create_app().run(host='0.0.0.0', port=port, debug=False, threaded=True)
BIN screens/VideoPress_Screen_1.png Normal file (binary, not shown; 100 KiB)
BIN screens/VideoPress_Screen_2.png Normal file (binary, not shown; 124 KiB)
BIN screens/VideoPress_Screen_3.png Normal file (binary, not shown; 59 KiB)
BIN screens/VideoPress_Screen_4.png Normal file (binary, not shown; 64 KiB)
BIN screens/VideoPress_Screen_5.png Normal file (binary, not shown; 33 KiB)
4 start.sh Normal file → Executable file

@@ -50,11 +50,11 @@ if [[ "$MODE" == "prod" ]]; then
  echo " Press Ctrl+C to stop."
  echo "============================================================"
  echo ""
  PORT="$PORT" exec gunicorn -c gunicorn.conf.py wsgi:app
  PORT="$PORT" exec gunicorn -c gunicorn.conf.py wsgi:application
else
  echo " WARNING: Dev server only — use --prod or Docker for production."
  echo " Starting Flask on http://localhost:${PORT}"
  echo "============================================================"
  echo ""
  exec python3 app.py "$PORT"
  exec python3 run.py "$PORT"
fi
@@ -767,6 +767,35 @@ body {
  color: var(--text-muted);
  text-transform: uppercase;
  letter-spacing: 0.04em;
  padding: 2px 7px;
  border-radius: var(--radius-pill);
  border: 1px solid var(--border-base);
  background: var(--bg-card2);
  white-space: nowrap;
}

/* H.265 / HEVC — amber tint to flag it uses libx265 */
.codec-tag.hevc {
  background: rgba(180, 100, 0, 0.10);
  border-color: rgba(180, 100, 0, 0.35);
  color: #7a4500;
}
[data-theme="dark"] .codec-tag.hevc {
  background: rgba(251, 191, 36, 0.12);
  border-color: rgba(251, 191, 36, 0.30);
  color: #fbbf24;
}

/* H.264 / AVC — subtle blue tint */
.codec-tag.h264 {
  background: rgba(26, 86, 219, 0.07);
  border-color: rgba(26, 86, 219, 0.25);
  color: #1e40af;
}
[data-theme="dark"] .codec-tag.h264 {
  background: rgba(147, 197, 253, 0.10);
  border-color: rgba(147, 197, 253, 0.25);
  color: #93c5fd;
}

/* Checkbox styling */

@@ -1173,7 +1202,107 @@ body {
  color: rgba(245,245,242,0.65);
}

/* ── Animations ─────────────────────────────────────────────── */
/* ── Notification opt-in ────────────────────────────────────── */
.notify-group {
  display: flex;
  flex-direction: column;
  gap: var(--space-sm);
  flex: 1;
}

.notify-checkbox-row {
  display: flex;
  align-items: center;
  gap: var(--space-sm);
}

.notify-checkbox {
  width: 18px;
  height: 18px;
  min-width: 18px;
  min-height: 18px;
  cursor: pointer;
  accent-color: var(--accent);
  flex-shrink: 0;
}

.notify-label {
  font-size: 0.9rem;
  color: var(--text-primary);
  cursor: pointer;
  font-weight: 500;
  line-height: 1.3;
}

.notify-email-row {
  display: flex;
  flex-direction: column;
  gap: var(--space-xs);
  padding-left: 26px; /* indent under checkbox */
  animation: slide-down 180ms ease;
}

.notify-email-row[hidden] {
  display: none !important;
}

@keyframes slide-down {
  from { opacity: 0; transform: translateY(-6px); }
  to { opacity: 1; transform: translateY(0); }
}

.notify-email-label {
  margin-bottom: 0;
}

.notify-email-input {
  max-width: 340px;
}

.notify-divider {
  width: 1px;
  background: var(--border-base);
  align-self: stretch;
  margin: 0 var(--space-sm);
  flex-shrink: 0;
}

/* Notification send status shown in progress footer */
.notify-status {
  font-size: 0.85rem;
  display: inline-flex;
  align-items: center;
  gap: var(--space-xs);
  padding: var(--space-xs) var(--space-md);
  border-radius: var(--radius-pill);
  font-weight: 600;
}

.notify-status[hidden] {
  display: none !important;
}

.notify-status.ok {
  background: rgba(22, 101, 52, 0.10);
  color: var(--text-success);
  border: 1px solid rgba(22, 101, 52, 0.25);
}

.notify-status.fail {
  background: rgba(185, 28, 28, 0.10);
  color: var(--text-danger);
  border: 1px solid rgba(185, 28, 28, 0.25);
}

[data-theme="dark"] .notify-status.ok {
  background: rgba(134, 239, 172, 0.10);
  border-color: rgba(134, 239, 172, 0.25);
}

[data-theme="dark"] .notify-status.fail {
  background: rgba(252, 165, 165, 0.10);
  border-color: rgba(252, 165, 165, 0.25);
}
@keyframes pulse {
  0%, 100% { opacity: 1; }
  50% { opacity: 0.5; }

@@ -1183,6 +1312,60 @@ body {
  animation: pulse 1.8s ease infinite;
}

/* ── Stream-lost banner ─────────────────────────────────────── */
.stream-lost-banner {
  display: flex;
  align-items: center;
  gap: var(--space-md);
  flex-wrap: wrap;
  background: rgba(180, 100, 0, 0.10);
  border: 1.5px solid rgba(180, 100, 0, 0.35);
  border-radius: var(--radius-md);
  padding: var(--space-md) var(--space-lg);
  margin-bottom: var(--space-lg);
  color: #7a4500;
}

[data-theme="dark"] .stream-lost-banner {
  background: rgba(251, 191, 36, 0.10);
  border-color: rgba(251, 191, 36, 0.30);
  color: #fbbf24;
}

.stream-lost-banner[hidden] {
  display: none !important;
}

.banner-icon {
  font-size: 1.2rem;
  flex-shrink: 0;
}

.banner-text {
  flex: 1;
  font-size: 0.88rem;
  font-weight: 500;
  line-height: 1.4;
}

/* Reconnect button sits in the card title row */
.card-title .reconnect-btn {
  margin-left: auto;
  font-size: 0.78rem;
  padding: 5px 12px;
  min-height: 32px;
  animation: pulse-reconnect 1.8s ease infinite;
}

.reconnect-btn[hidden] {
  display: none !important;
}

@keyframes pulse-reconnect {
  0%, 100% { border-color: var(--btn-outline-border); }
  50% { border-color: var(--accent); color: var(--accent); }
}

/* ── Responsive ─────────────────────────────────────────────── */
@media (max-width: 768px) {
  .app-main {

@@ -1229,3 +1412,123 @@ body {
[data-theme="dark"] .file-table th {
  color: var(--text-primary);
}

/* ── Settings modal ─────────────────────────────────────────── */
.settings-panel {
  max-width: 560px;
  max-height: 90vh;
  display: flex;
  flex-direction: column;
}

.settings-body {
  flex: 1;
  overflow-y: auto;
  padding: var(--space-lg) var(--space-xl);
}

.settings-body::-webkit-scrollbar { width: 6px; }
.settings-body::-webkit-scrollbar-track { background: var(--bg-card2); }
.settings-body::-webkit-scrollbar-thumb { background: var(--border-strong); border-radius: 3px; }

.settings-intro {
  font-size: 0.85rem;
  color: var(--text-muted);
  margin-bottom: var(--space-lg);
  line-height: 1.5;
}

.settings-grid {
  display: flex;
  flex-direction: column;
  gap: var(--space-lg);
}

.settings-row-2 {
  display: grid;
  grid-template-columns: 100px 1fr;
  gap: var(--space-md);
}

.settings-divider-above {
  border-top: 1px solid var(--border-base);
  padding-top: var(--space-lg);
  margin-top: var(--space-sm);
}

.settings-save-status {
  font-size: 0.82rem;
  text-align: center;
  min-height: 1.4em;
  padding: var(--space-xs) var(--space-xl) var(--space-md);
  color: var(--text-muted);
}

.settings-test-result {
  min-height: 1.4em;
}

/* Password row with toggle */
.password-row {
  display: flex;
  gap: var(--space-sm);
  align-items: center;
}

.password-row .text-input { flex: 1; }

.btn-icon-inline {
  width: 40px;
  height: 40px;
  min-width: 40px;
  min-height: 40px;
  border: 1.5px solid var(--border-input);
  border-radius: var(--radius-md);
  background: var(--bg-input);
  color: var(--text-muted);
  font-size: 1rem;
  cursor: pointer;
  display: inline-flex;
  align-items: center;
  justify-content: center;
  transition: background var(--transition-fast), border-color var(--transition-fast);
  flex-shrink: 0;
}

.btn-icon-inline:hover {
  background: var(--bg-row-alt);
  border-color: var(--border-strong);
}

/* Select input to match text-input style */
.select-input {
  cursor: pointer;
}

/* Inline button link (used in hint text) */
.btn-link {
  background: none;
  border: none;
  padding: 0;
  color: var(--text-link);
  font: inherit;
  font-size: inherit;
  cursor: pointer;
  text-decoration: underline;
  text-underline-offset: 2px;
}

.btn-link:hover {
  color: var(--accent);
}

/* Settings save status colours */
.settings-save-status.ok { color: var(--text-success); }
.settings-save-status.fail { color: var(--text-danger); }

/* SMTP not configured warning on the notify row */
.smtp-warn {
  color: var(--accent);
  font-size: 0.78rem;
  font-weight: 600;
}
720 static/js/app.js

@@ -1,696 +1,38 @@
/**
 * VideoPress — Frontend Application
 * Handles all UI interactions, API calls, and SSE progress streaming.
 * WCAG 2.2 compliant, fully functional, no stubs.
 * app.js — VideoPress entry point
 * --------------------------------
 * Imports every feature module and calls its init function.
 * No application logic lives here — this file is intentionally thin.
 *
 * Module layout
 * -------------
 * utils.js       Pure helpers: esc(), fmtTime(), pad()
 * state.js       Shared state object, DOM refs (els), announce()
 * theme.js       Dark / light mode toggle
 * browser.js     Server-side directory browser modal
 * scan.js        /api/scan, file selection table, select-all controls
 * progress.js    Progress bars, results card, stream-lost banner
 * stream.js      SSE stream, reconnect, snapshot restore (applySnapshot)
 * compress.js    Start / cancel / restart compression, notification opt-in
 * session.js     Page-load restore via /api/compress/active
 * settings.js    SMTP email settings modal
 */

'use strict';

// ─── State ──────────────────────────────────────────────────────────────────
const state = {
  scannedFiles: [],          // enriched file objects from API
  selectedPaths: new Set(),  // paths of selected files
  currentJobId: null,
  eventSource: null,
  compressionResults: [],
  browserPath: '/',
};
import { initTheme } from './modules/theme.js';
import { initBrowser } from './modules/browser.js';
import { initScan } from './modules/scan.js';
import { initStreamControls } from './modules/stream.js';
import { initCompress } from './modules/compress.js';
import { tryRestoreSession } from './modules/session.js';
import { initSettings } from './modules/settings.js';

// ─── DOM References ──────────────────────────────────────────────────────────
const $ = (id) => document.getElementById(id);

const els = {
  // Config section
  dirInput: $('dir-input'),
  browseBtn: $('browse-btn'),
  minSizeInput: $('min-size-input'),
  suffixInput: $('suffix-input'),
  scanBtn: $('scan-btn'),
  scanStatus: $('scan-status'),

  // Browser modal
  browserModal: $('browser-modal'),
  browserList: $('browser-list'),
  browserPath: $('browser-current-path'),
  closeBrowser: $('close-browser'),
  browserCancel: $('browser-cancel'),
  browserSelect: $('browser-select'),

  // Files section
  sectionFiles: $('section-files'),
  selectAllBtn: $('select-all-btn'),
  deselectAllBtn: $('deselect-all-btn'),
  selectionSummary: $('selection-summary'),
  fileTbody: $('file-tbody'),
  compressBtn: $('compress-btn'),

  // Progress section
  sectionProgress: $('section-progress'),
  progTotal: $('prog-total'),
  progDone: $('prog-done'),
  progStatus: $('prog-status'),
  overallBar: $('overall-bar'),
  overallBarFill: $('overall-bar-fill'),
  overallPct: $('overall-pct'),
  fileProgressList: $('file-progress-list'),
  cancelBtn: $('cancel-btn'),

  // Results
  sectionResults: $('section-results'),
  resultsContent: $('results-content'),
  restartBtn: $('restart-btn'),

  // Theme
  themeToggle: $('theme-toggle'),
  themeIcon: $('theme-icon'),

  // Screen reader announce
  srAnnounce: $('sr-announce'),
};

// ─── Accessibility Helper ─────────────────────────────────────────────────────
function announce(msg) {
  els.srAnnounce.textContent = '';
  requestAnimationFrame(() => {
    els.srAnnounce.textContent = msg;
  });
}

// ─── Theme Management ─────────────────────────────────────────────────────────
function initTheme() {
  const saved = localStorage.getItem('vp-theme');
  const prefersDark = window.matchMedia('(prefers-color-scheme: dark)').matches;
  const theme = saved || (prefersDark ? 'dark' : 'light');
  applyTheme(theme);
}

function applyTheme(theme) {
  document.documentElement.setAttribute('data-theme', theme);
  els.themeIcon.textContent = theme === 'dark' ? '☀' : '◑';
  els.themeToggle.setAttribute('aria-label', `Switch to ${theme === 'dark' ? 'light' : 'dark'} mode`);
  localStorage.setItem('vp-theme', theme);
}

els.themeToggle.addEventListener('click', () => {
  const current = document.documentElement.getAttribute('data-theme') || 'light';
  applyTheme(current === 'dark' ? 'light' : 'dark');
});

// ─── Directory Browser ────────────────────────────────────────────────────────
async function loadBrowserPath(path) {
  els.browserList.innerHTML = '<p class="browser-loading" aria-live="polite">Loading…</p>';
  els.browserPath.textContent = path;

  try {
    const resp = await fetch(`/api/browse?path=${encodeURIComponent(path)}`);
    if (!resp.ok) throw new Error((await resp.json()).error || 'Error loading directory');
    const data = await resp.json();

    state.browserPath = data.current;
    els.browserPath.textContent = data.current;

    let html = '';

    // Parent directory link
    if (data.parent !== null) {
      html += `
        <button class="browser-item parent-dir" data-path="${escHtml(data.parent)}" data-is-dir="true">
          <span class="item-icon" aria-hidden="true">↑</span>
          <span>.. (parent directory)</span>
        </button>`;
    }

    if (data.entries.length === 0 && !data.parent) {
      html += '<p class="browser-loading">No accessible directories found.</p>';
    }

    for (const entry of data.entries) {
      if (!entry.is_dir) continue;
      html += `
        <button class="browser-item" data-path="${escHtml(entry.path)}" data-is-dir="true"
                role="option" aria-label="Directory: ${escHtml(entry.name)}">
          <span class="item-icon" aria-hidden="true">📁</span>
          <span>${escHtml(entry.name)}</span>
        </button>`;
    }

    if (html === '') {
      html = '<p class="browser-loading">No subdirectories found.</p>';
    }

    els.browserList.innerHTML = html;

    // Attach click events
    els.browserList.querySelectorAll('.browser-item').forEach(item => {
      item.addEventListener('click', () => loadBrowserPath(item.dataset.path));
      item.addEventListener('keydown', (e) => {
        if (e.key === 'Enter' || e.key === ' ') {
          e.preventDefault();
          loadBrowserPath(item.dataset.path);
        }
      });
    });

  } catch (err) {
    els.browserList.innerHTML = `<p class="browser-error" role="alert">Error: ${escHtml(err.message)}</p>`;
  }
}

function openBrowserModal() {
  els.browserModal.hidden = false;
  document.body.style.overflow = 'hidden';
  loadBrowserPath(els.dirInput.value || '/');
  // Focus trap
  els.closeBrowser.focus();
  announce('Directory browser opened');
}

function closeBrowserModal() {
  els.browserModal.hidden = true;
  document.body.style.overflow = '';
  els.browseBtn.focus();
  announce('Directory browser closed');
}

els.browseBtn.addEventListener('click', openBrowserModal);
els.closeBrowser.addEventListener('click', closeBrowserModal);
els.browserCancel.addEventListener('click', closeBrowserModal);

els.browserSelect.addEventListener('click', () => {
  els.dirInput.value = state.browserPath;
  closeBrowserModal();
  announce(`Directory selected: ${state.browserPath}`);
});

// Close modal on backdrop click
els.browserModal.addEventListener('click', (e) => {
  if (e.target === els.browserModal) closeBrowserModal();
});

// Keyboard: close on Escape
document.addEventListener('keydown', (e) => {
  if (e.key === 'Escape' && !els.browserModal.hidden) {
    closeBrowserModal();
  }
});

// ─── Scan for Files ───────────────────────────────────────────────────────────
els.scanBtn.addEventListener('click', async () => {
  const directory = els.dirInput.value.trim();
  const minSize = parseFloat(els.minSizeInput.value);

  if (!directory) {
    showScanStatus('Please enter a directory path.', 'error');
    els.dirInput.focus();
    return;
  }
  if (isNaN(minSize) || minSize <= 0) {
    showScanStatus('Please enter a valid minimum size greater than 0.', 'error');
    els.minSizeInput.focus();
    return;
  }

  els.scanBtn.disabled = true;
  els.scanBtn.textContent = '⟳ Scanning…';
  showScanStatus('Scanning directory, please wait…', 'info');
  announce('Scanning directory for video files, please wait.');

  // Hide previous results
  els.sectionFiles.hidden = true;

  try {
    const resp = await fetch('/api/scan', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ directory, min_size_gb: minSize }),
    });

    const data = await resp.json();

    if (!resp.ok) {
      showScanStatus(`Error: ${data.error}`, 'error');
      announce(`Scan failed: ${data.error}`);
      return;
    }

    state.scannedFiles = data.files;
    state.selectedPaths.clear();

    if (data.files.length === 0) {
      showScanStatus(
        `No video files larger than ${minSize} GB found in that directory.`,
        'warn'
      );
      announce('No video files found matching your criteria.');
      return;
    }

    showScanStatus(`Found ${data.files.length} file(s).`, 'success');
    announce(`Scan complete. Found ${data.files.length} video files.`);
    renderFileTable(data.files);
    els.sectionFiles.hidden = false;
    els.sectionFiles.scrollIntoView({ behavior: 'smooth', block: 'start' });

  } catch (err) {
    showScanStatus(`Network error: ${err.message}`, 'error');
    announce(`Scan error: ${err.message}`);
  } finally {
    els.scanBtn.disabled = false;
    els.scanBtn.innerHTML = '<span class="btn-icon-prefix" aria-hidden="true">⊙</span> Scan for Files';
  }
});

function showScanStatus(msg, type) {
  els.scanStatus.textContent = msg;
  els.scanStatus.style.color = type === 'error' ? 'var(--text-danger)'
    : type === 'success' ? 'var(--text-success)'
    : type === 'warn' ? 'var(--accent)'
    : 'var(--text-muted)';
}

// ─── File Table Rendering ─────────────────────────────────────────────────────
function renderFileTable(files) {
  let html = '';
  files.forEach((f, idx) => {
    const sizeFmt = f.size_gb.toFixed(3);
    const curBitrate = f.bit_rate_mbps ? `${f.bit_rate_mbps} Mbps` : 'Unknown';
    const tgtBitrate = f.target_bit_rate_mbps ? `${f.target_bit_rate_mbps} Mbps` : '—';
    const codec = f.codec || 'unknown';
    const pathDir = f.path.replace(f.name, '');

    html += `
      <tr id="row-${idx}" data-path="${escHtml(f.path)}">
        <td class="col-check">
          <input
            type="checkbox"
            class="file-checkbox"
            id="chk-${idx}"
            data-path="${escHtml(f.path)}"
            aria-label="Select ${escHtml(f.name)} for compression"
          />
        </td>
        <td class="col-name">
          <label for="chk-${idx}" class="file-name-cell" style="cursor:pointer">
            <span class="file-name">${escHtml(f.name)}</span>
            <span class="file-path" title="${escHtml(f.path)}">${escHtml(pathDir)}</span>
          </label>
        </td>
        <td class="col-size">
          <strong>${sizeFmt}</strong> GB
        </td>
        <td class="col-bitrate">
          <span class="bitrate-badge">${escHtml(curBitrate)}</span>
        </td>
        <td class="col-target">
          <span class="bitrate-badge target">${escHtml(tgtBitrate)}</span>
        </td>
        <td class="col-codec">
          <span class="codec-tag">${escHtml(codec)}</span>
        </td>
      </tr>`;
  });

  els.fileTbody.innerHTML = html;

  // Attach change events
  els.fileTbody.querySelectorAll('.file-checkbox').forEach(chk => {
    chk.addEventListener('change', () => {
      const path = chk.dataset.path;
      const row = chk.closest('tr');
      if (chk.checked) {
        state.selectedPaths.add(path);
        row.classList.add('selected');
      } else {
        state.selectedPaths.delete(path);
        row.classList.remove('selected');
      }
      updateSelectionUI();
    });
  });

  updateSelectionUI();
}

function updateSelectionUI() {
  const total = state.scannedFiles.length;
  const sel = state.selectedPaths.size;
  els.selectionSummary.textContent = `${sel} of ${total} selected`;
  els.compressBtn.disabled = sel === 0;
}

els.selectAllBtn.addEventListener('click', () => {
  els.fileTbody.querySelectorAll('.file-checkbox').forEach(chk => {
    chk.checked = true;
    state.selectedPaths.add(chk.dataset.path);
    chk.closest('tr').classList.add('selected');
  });
  updateSelectionUI();
  announce(`All ${state.scannedFiles.length} files selected.`);
});

els.deselectAllBtn.addEventListener('click', () => {
  els.fileTbody.querySelectorAll('.file-checkbox').forEach(chk => {
    chk.checked = false;
    chk.closest('tr').classList.remove('selected');
  });
  state.selectedPaths.clear();
  updateSelectionUI();
  announce('All files deselected.');
});

// ─── Start Compression ────────────────────────────────────────────────────────
els.compressBtn.addEventListener('click', async () => {
  const selectedFiles = state.scannedFiles.filter(f => state.selectedPaths.has(f.path));
  if (selectedFiles.length === 0) return;

  const suffix = els.suffixInput.value.trim() || '_new';

  const payload = {
    files: selectedFiles.map(f => ({
      path: f.path,
      size_bytes: f.size_bytes,
      target_bit_rate_bps: f.target_bit_rate_bps || 1000000,
    })),
    suffix,
  };

  els.compressBtn.disabled = true;
  els.compressBtn.textContent = 'Starting…';

  try {
    const resp = await fetch('/api/compress/start', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload),
    });

    const data = await resp.json();

    if (!resp.ok) {
      alert(`Failed to start compression: ${data.error}`);
      els.compressBtn.disabled = false;
      els.compressBtn.innerHTML = '<span class="btn-icon-prefix" aria-hidden="true">⚡</span> Compress Selected Files';
      return;
    }

    state.currentJobId = data.job_id;
    state.compressionResults = [];

    // Show progress section
    setupProgressSection(selectedFiles);
    els.sectionProgress.hidden = false;
    els.sectionProgress.scrollIntoView({ behavior: 'smooth', block: 'start' });
    announce(`Compression started for ${selectedFiles.length} file(s).`);

    // Start SSE stream
    startProgressStream(data.job_id, selectedFiles);

  } catch (err) {
    alert(`Error: ${err.message}`);
    els.compressBtn.disabled = false;
    els.compressBtn.innerHTML = '<span class="btn-icon-prefix" aria-hidden="true">⚡</span> Compress Selected Files';
  }
});

// ─── Progress Setup ───────────────────────────────────────────────────────────
function setupProgressSection(files) {
  els.progTotal.textContent = files.length;
  els.progDone.textContent = '0';
  els.progStatus.textContent = 'Running';
  setOverallProgress(0);

  // Create per-file items
  let html = '';
  files.forEach((f, idx) => {
    html += `
      <div class="file-progress-item" id="fpi-${idx}" role="listitem"
           aria-label="File ${idx+1} of ${files.length}: ${escHtml(f.name)}">
        <div class="fp-header">
          <span class="fp-name">${escHtml(f.name)}</span>
          <span class="fp-status waiting" id="fps-${idx}" aria-live="polite">Waiting</span>
        </div>
        <div class="fp-bar-wrap" aria-label="Progress for ${escHtml(f.name)}">
          <div class="fp-bar" role="progressbar" aria-valuenow="0" aria-valuemin="0" aria-valuemax="100"
               id="fpbar-${idx}" aria-label="${escHtml(f.name)} progress">
            <div class="fp-bar-fill" id="fpfill-${idx}" style="width:0%"></div>
          </div>
          <span class="fp-pct" id="fppct-${idx}" aria-hidden="true">0%</span>
        </div>
        <div class="fp-detail" id="fpdetail-${idx}"></div>
      </div>`;
  });

  els.fileProgressList.innerHTML = html;
}

function setOverallProgress(pct) {
  const p = Math.round(pct);
  els.overallBarFill.style.width = `${p}%`;
  els.overallBar.setAttribute('aria-valuenow', p);
  els.overallPct.textContent = `${p}%`;
}

function updateFileProgress(idx, pct, statusClass, statusText, detail, detailClass) {
  const fill = $(`fpfill-${idx}`);
  const bar = $(`fpbar-${idx}`);
  const pctEl = $(`fppct-${idx}`);
  const status = $(`fps-${idx}`);
  const item = $(`fpi-${idx}`);
  const det = $(`fpdetail-${idx}`);

  if (!fill) return;

  const p = Math.round(pct);
  fill.style.width = `${p}%`;
  bar.setAttribute('aria-valuenow', p);
  pctEl.textContent = `${p}%`;

  status.className = `fp-status ${statusClass}`;
  status.textContent = statusText;

  item.className = `file-progress-item ${statusClass}`;

  // Toggle animation on bar fill
  fill.classList.toggle('active', statusClass === 'running');

  if (detail !== undefined) {
    det.textContent = detail;
    det.className = `fp-detail ${detailClass || ''}`;
  }
}

// ─── SSE Stream Handling ──────────────────────────────────────────────────────
|
||||
function startProgressStream(jobId, files) {
|
||||
if (state.eventSource) {
|
||||
state.eventSource.close();
|
||||
}
|
||||
|
||||
state.eventSource = new EventSource(`/api/compress/progress/${jobId}`);
|
||||
let doneCount = 0;
|
||||
|
||||
state.eventSource.onmessage = (evt) => {
|
||||
let data;
|
||||
try { data = JSON.parse(evt.data); }
|
||||
catch { return; }
|
||||
|
||||
switch (data.type) {
|
||||
case 'start':
|
||||
els.progStatus.textContent = 'Running';
|
||||
break;
|
||||
|
||||
case 'file_start':
|
||||
updateFileProgress(data.index, 0, 'running', 'Compressing…', '', '');
|
||||
// Scroll to active item
|
||||
const activeItem = $(`fpi-${data.index}`);
|
||||
if (activeItem) {
|
||||
activeItem.scrollIntoView({ behavior: 'smooth', block: 'nearest' });
|
||||
}
|
||||
announce(`Compressing file ${data.index + 1} of ${data.total}: ${files[data.index]?.name || ''}`);
|
||||
break;
|
||||
|
||||
case 'progress': {
|
||||
const pct = data.percent || 0;
|
||||
let detail = '';
|
||||
if (data.elapsed_secs > 0 && data.duration_secs > 0) {
|
||||
detail = `${fmtTime(data.elapsed_secs)} / ${fmtTime(data.duration_secs)}`;
|
||||
}
|
||||
updateFileProgress(data.index, pct, 'running', 'Compressing…', detail, '');
|
||||
|
||||
// Update overall progress
|
||||
const overallPct = ((doneCount + (pct / 100)) / files.length) * 100;
|
||||
setOverallProgress(overallPct);
|
||||
break;
|
||||
}
|
||||
|
||||
case 'file_done': {
|
||||
doneCount++;
|
||||
els.progDone.textContent = doneCount;
|
||||
const detail = data.reduction_pct
|
||||
? `Saved ${data.reduction_pct}% → ${data.output_size_gb} GB`
|
||||
: 'Complete';
|
||||
updateFileProgress(data.index, 100, 'done', '✓ Done', detail, 'success');
|
||||
setOverallProgress((doneCount / files.length) * 100);
|
||||
state.compressionResults.push({ ...data, status: 'done' });
|
||||
announce(`File complete: ${files[data.index]?.name}. Saved ${data.reduction_pct}%.`);
|
||||
break;
|
||||
}
|
||||
|
||||
case 'file_error': {
|
||||
doneCount++;
|
||||
els.progDone.textContent = doneCount;
|
||||
updateFileProgress(data.index, 0, 'error', '✗ Error', data.message, 'error');
|
||||
state.compressionResults.push({ ...data, status: 'error' });
|
||||
announce(`Error compressing file ${files[data.index]?.name}: ${data.message}`);
|
||||
break;
|
||||
}
|
||||
|
||||
case 'done':
|
||||
state.eventSource.close();
|
||||
els.progStatus.textContent = 'Complete';
|
||||
setOverallProgress(100);
|
||||
els.cancelBtn.disabled = true;
|
||||
announce('All compression operations complete.');
|
||||
showResults('done');
|
||||
break;
|
||||
|
||||
case 'cancelled':
|
||||
state.eventSource.close();
|
||||
els.progStatus.textContent = 'Cancelled';
|
||||
announce('Compression cancelled.');
|
||||
showResults('cancelled');
|
||||
break;
|
||||
|
||||
case 'error':
|
||||
state.eventSource.close();
|
||||
els.progStatus.textContent = 'Error';
|
||||
announce(`Compression error: ${data.message}`);
|
||||
break;
|
||||
}
|
||||
};
|
||||
|
||||
state.eventSource.onerror = () => {
|
||||
if (state.eventSource.readyState === EventSource.CLOSED) return;
|
||||
console.error('SSE connection error');
|
||||
};
|
||||
}
|
||||
|
||||
// ─── Cancel ───────────────────────────────────────────────────────────────────
|
||||
els.cancelBtn.addEventListener('click', async () => {
|
||||
if (!state.currentJobId) return;
|
||||
|
||||
const confirmed = window.confirm(
|
||||
'Cancel all compression operations? Any files currently being processed will be deleted.'
|
||||
);
|
||||
if (!confirmed) return;
|
||||
|
||||
els.cancelBtn.disabled = true;
|
||||
els.cancelBtn.textContent = 'Cancelling…';
|
||||
|
||||
try {
|
||||
await fetch(`/api/compress/cancel/${state.currentJobId}`, { method: 'POST' });
|
||||
announce('Cancellation requested.');
|
||||
} catch (err) {
|
||||
console.error('Cancel error:', err);
|
||||
}
|
||||
});
|
||||
|
||||
// ─── Results ──────────────────────────────────────────────────────────────────
|
||||
function showResults(finalStatus) {
|
||||
const results = state.compressionResults;
|
||||
let html = '';
|
||||
|
||||
if (finalStatus === 'cancelled') {
|
||||
html += `<p style="color:var(--text-muted); margin-bottom: var(--space-md)">
|
||||
Compression was cancelled. Any completed files are listed below.
|
||||
</p>`;
|
||||
}
|
||||
|
||||
if (results.length === 0 && finalStatus === 'cancelled') {
|
||||
html += '<p style="color:var(--text-muted)">No files were completed before cancellation.</p>';
|
||||
}
|
||||
|
||||
results.forEach(r => {
|
||||
if (r.status === 'done') {
|
||||
html += `
|
||||
<div class="result-row">
|
||||
<span class="result-icon">✅</span>
|
||||
<div class="result-info">
|
||||
<div class="result-name">${escHtml(r.filename)}</div>
|
||||
<div class="result-meta">→ ${escHtml(r.output || '')}</div>
|
||||
</div>
|
||||
<span class="result-reduction">-${r.reduction_pct}%</span>
|
||||
</div>`;
|
||||
} else if (r.status === 'error') {
|
||||
html += `
|
||||
<div class="result-row">
|
||||
<span class="result-icon">❌</span>
|
||||
<div class="result-info">
|
||||
<div class="result-name">${escHtml(r.filename)}</div>
|
||||
<div class="result-meta" style="color:var(--text-danger)">${escHtml(r.message)}</div>
|
||||
</div>
|
||||
</div>`;
|
||||
}
|
||||
});
|
||||
|
||||
if (html === '') {
|
||||
html = '<p style="color:var(--text-muted)">No results to display.</p>';
|
||||
}
|
||||
|
||||
els.resultsContent.innerHTML = html;
|
||||
els.sectionResults.hidden = false;
|
||||
els.sectionResults.scrollIntoView({ behavior: 'smooth', block: 'start' });
|
||||
}
|
||||
|
||||
// ─── Restart ──────────────────────────────────────────────────────────────────
|
||||
els.restartBtn.addEventListener('click', () => {
|
||||
// Reset state
|
||||
state.scannedFiles = [];
|
||||
state.selectedPaths.clear();
|
||||
state.currentJobId = null;
|
||||
state.compressionResults = [];
|
||||
if (state.eventSource) { state.eventSource.close(); state.eventSource = null; }
|
||||
|
||||
// Reset UI
|
||||
els.sectionFiles.hidden = true;
|
||||
els.sectionProgress.hidden = true;
|
||||
els.sectionResults.hidden = true;
|
||||
els.fileTbody.innerHTML = '';
|
||||
els.fileProgressList.innerHTML = '';
|
||||
els.scanStatus.textContent = '';
|
||||
els.compressBtn.innerHTML = '<span class="btn-icon-prefix" aria-hidden="true">⚡</span> Compress Selected Files';
|
||||
els.compressBtn.disabled = true;
|
||||
els.cancelBtn.disabled = false;
|
||||
els.cancelBtn.textContent = '✕ Cancel Compression';
|
||||
|
||||
// Scroll to top
|
||||
document.getElementById('section-config').scrollIntoView({ behavior: 'smooth' });
|
||||
els.dirInput.focus();
|
||||
announce('Session reset. Ready to scan again.');
|
||||
});
|
||||
|
||||
// ─── Helpers ──────────────────────────────────────────────────────────────────
|
||||
function escHtml(str) {
|
||||
if (!str) return '';
|
||||
return String(str)
|
||||
.replace(/&/g, '&')
|
||||
.replace(/</g, '<')
|
||||
.replace(/>/g, '>')
|
||||
.replace(/"/g, '"')
|
||||
.replace(/'/g, ''');
|
||||
}
|
||||
|
||||
function fmtTime(seconds) {
|
||||
const s = Math.floor(seconds);
|
||||
const h = Math.floor(s / 3600);
|
||||
const m = Math.floor((s % 3600) / 60);
|
||||
const sec = s % 60;
|
||||
if (h > 0) return `${h}:${pad(m)}:${pad(sec)}`;
|
||||
return `${m}:${pad(sec)}`;
|
||||
}
|
||||
|
||||
function pad(n) {
|
||||
return String(n).padStart(2, '0');
|
||||
}
|
||||
|
||||
// ─── Init ─────────────────────────────────────────────────────────────────────
|
||||
initTheme();
|
||||
initBrowser();
|
||||
initScan();
|
||||
initStreamControls();
|
||||
initCompress();
|
||||
initSettings();
|
||||
|
||||
tryRestoreSession();
|
||||
|
|
|
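The `escHtml` and `fmtTime` helpers above are pure functions and easy to check in isolation. A standalone sketch, copied from the module so it runs under Node without a DOM (note `escHtml` must replace `&` first so later entities aren't double-escaped):

```javascript
// Standalone copies of the app.js helpers — no DOM required.
function escHtml(str) {
  if (!str) return '';
  return String(str)
    .replace(/&/g, '&amp;')   // must run first to avoid double-escaping
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}

function pad(n) {
  return String(n).padStart(2, '0');
}

function fmtTime(seconds) {
  const s = Math.floor(seconds);
  const h = Math.floor(s / 3600);
  const m = Math.floor((s % 3600) / 60);
  const sec = s % 60;
  if (h > 0) return `${h}:${pad(m)}:${pad(sec)}`;
  return `${m}:${pad(sec)}`;
}

console.log(escHtml('<b>"A & B"</b>')); // &lt;b&gt;&quot;A &amp; B&quot;&lt;/b&gt;
console.log(fmtTime(3661));             // 1:01:01
console.log(fmtTime(125.9));            // 2:05
```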
static/js/modules/browser.js — new file, 111 lines
@@ -0,0 +1,111 @@
/**
 * browser.js
 * ----------
 * Server-side directory browser modal.
 *
 * Fetches directory listings from /api/browse and renders them inside the
 * modal panel. The user navigates the server filesystem and selects a
 * directory to populate the scan path input.
 *
 * Exports
 * -------
 * initBrowser() — attach all event listeners; call once at startup
 */

import { state, els, announce } from './state.js';
import { esc } from './utils.js';

// ─── Internal helpers ─────────────────────────────────────────────────────────

async function loadBrowserPath(path) {
  els.browserList.innerHTML =
    '<p class="browser-loading" aria-live="polite">Loading…</p>';
  els.browserPath.textContent = path;

  try {
    const resp = await fetch(`/api/browse?path=${encodeURIComponent(path)}`);
    if (!resp.ok) throw new Error((await resp.json()).error || 'Error loading directory');
    const data = await resp.json();

    state.browserPath = data.current;
    els.browserPath.textContent = data.current;

    let html = '';

    if (data.parent !== null) {
      html += `
        <button class="browser-item parent-dir" data-path="${esc(data.parent)}">
          <span class="item-icon" aria-hidden="true">↑</span>
          <span>.. (parent directory)</span>
        </button>`;
    }

    for (const entry of data.entries) {
      if (!entry.is_dir) continue;
      html += `
        <button class="browser-item" data-path="${esc(entry.path)}"
                role="option" aria-label="Directory: ${esc(entry.name)}">
          <span class="item-icon" aria-hidden="true">📁</span>
          <span>${esc(entry.name)}</span>
        </button>`;
    }

    if (!html) html = '<p class="browser-loading">No subdirectories found.</p>';
    els.browserList.innerHTML = html;

    els.browserList.querySelectorAll('.browser-item').forEach(btn => {
      btn.addEventListener('click', () => loadBrowserPath(btn.dataset.path));
      btn.addEventListener('keydown', e => {
        if (e.key === 'Enter' || e.key === ' ') {
          e.preventDefault();
          loadBrowserPath(btn.dataset.path);
        }
      });
    });

  } catch (err) {
    els.browserList.innerHTML =
      `<p class="browser-error" role="alert">Error: ${esc(err.message)}</p>`;
  }
}

function openBrowser() {
  els.browserModal.hidden = false;
  document.body.style.overflow = 'hidden';
  loadBrowserPath(els.dirInput.value || '/');
  els.closeBrowser.focus();
  announce('Directory browser opened');
}

function closeBrowser() {
  els.browserModal.hidden = true;
  document.body.style.overflow = '';
  els.browseBtn.focus();
  announce('Directory browser closed');
}

// ─── Public init ─────────────────────────────────────────────────────────────

/**
 * Attach all event listeners for the directory browser modal.
 * Call once during app initialisation.
 */
export function initBrowser() {
  els.browseBtn.addEventListener('click', openBrowser);
  els.closeBrowser.addEventListener('click', closeBrowser);
  els.browserCancel.addEventListener('click', closeBrowser);

  els.browserModal.addEventListener('click', e => {
    if (e.target === els.browserModal) closeBrowser();
  });

  els.browserSelect.addEventListener('click', () => {
    els.dirInput.value = state.browserPath;
    closeBrowser();
    announce(`Directory selected: ${state.browserPath}`);
  });

  document.addEventListener('keydown', e => {
    if (e.key === 'Escape' && !els.browserModal.hidden) closeBrowser();
  });
}
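The markup-building part of `loadBrowserPath` is worth seeing DOM-free. A minimal sketch of just that rendering step, with `esc` inlined as a stand-in for the `utils.js` import (assumed here to be a standard HTML-escaper):

```javascript
// DOM-free sketch of the listing markup loadBrowserPath builds.
// `esc` stands in for the utils.js import (assumption: plain HTML-escaper).
const esc = s => String(s).replace(/&/g, '&amp;').replace(/</g, '&lt;')
  .replace(/>/g, '&gt;').replace(/"/g, '&quot;');

function renderListing(data) {
  let html = '';
  if (data.parent !== null) {
    html += `<button class="browser-item parent-dir" data-path="${esc(data.parent)}">..</button>`;
  }
  for (const entry of data.entries) {
    if (!entry.is_dir) continue; // only directories are selectable; files are skipped
    html += `<button class="browser-item" data-path="${esc(entry.path)}">${esc(entry.name)}</button>`;
  }
  return html || '<p class="browser-loading">No subdirectories found.</p>';
}

const html = renderListing({
  parent: '/media',
  entries: [
    { name: 'movies',   path: '/media/movies',   is_dir: true },
    { name: 'clip.mkv', path: '/media/clip.mkv', is_dir: false },
  ],
});
// html contains a parent-dir button and one directory button; clip.mkv is filtered out
```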
static/js/modules/compress.js — new file, 195 lines
@@ -0,0 +1,195 @@
/**
 * compress.js
 * -----------
 * Compression job lifecycle: start, notification opt-in, cancel, and restart.
 *
 * Exports
 * -------
 * initCompress() — attach all event listeners; call once at startup
 */

import { state, els, announce } from './state.js';
import { setupProgressSection, showResults } from './progress.js';
import { startProgressStream } from './stream.js';
import { smtpIsConfigured } from './settings.js';

// ─── Public init ─────────────────────────────────────────────────────────────

/**
 * Attach event listeners for:
 *   - Notification checkbox toggle
 *   - "Compress Selected Files" button
 *   - "Cancel Compression" button
 *   - "Start New Session" (restart) button
 *
 * Call once during app initialisation.
 */
export function initCompress() {
  _initNotifyToggle();
  _initCompressButton();
  _initCancelButton();
  _initRestartButton();
}

// ─── Notification opt-in ─────────────────────────────────────────────────────

function _initNotifyToggle() {
  els.notifyChk.addEventListener('change', () => {
    const show = els.notifyChk.checked;
    els.notifyEmailRow.hidden = !show;
    els.notifyEmail.setAttribute('aria-required', show ? 'true' : 'false');
    const warn = document.getElementById('smtp-not-configured-warn');
    if (show) {
      els.notifyEmail.focus();
      if (warn) warn.hidden = smtpIsConfigured();
    } else {
      els.notifyEmail.value = '';
      if (warn) warn.hidden = true;
    }
  });
}

// ─── Start compression ────────────────────────────────────────────────────────

function _initCompressButton() {
  els.compressBtn.addEventListener('click', async () => {
    const selectedFiles = state.scannedFiles.filter(
      f => state.selectedPaths.has(f.path),
    );
    if (!selectedFiles.length) return;

    const suffix = els.suffixInput.value.trim() || '_new';
    const notifyEmail = els.notifyChk.checked ? els.notifyEmail.value.trim() : '';

    // Client-side email validation
    if (els.notifyChk.checked) {
      if (!notifyEmail) {
        els.notifyEmail.setCustomValidity('Please enter your email address.');
        els.notifyEmail.reportValidity();
        return;
      }
      if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(notifyEmail)) {
        els.notifyEmail.setCustomValidity('Please enter a valid email address.');
        els.notifyEmail.reportValidity();
        return;
      }
      els.notifyEmail.setCustomValidity('');
    }

    els.compressBtn.disabled = true;
    els.compressBtn.textContent = 'Starting…';

    try {
      const resp = await fetch('/api/compress/start', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          files: selectedFiles.map(f => ({
            path: f.path,
            size_bytes: f.size_bytes,
            target_bit_rate_bps: f.target_bit_rate_bps || 1_000_000,
            codec: f.codec || 'unknown',
          })),
          suffix,
          notify_email: notifyEmail,
        }),
      });
      const data = await resp.json();

      if (!resp.ok) {
        alert(`Failed to start compression: ${data.error}`);
        _resetCompressBtn();
        return;
      }

      state.currentJobId = data.job_id;
      state.seenEventCount = 0;
      state.compressionResults = [];
      sessionStorage.setItem('vp-job-id', data.job_id);

      setupProgressSection(selectedFiles);
      els.sectionProgress.hidden = false;
      els.sectionProgress.scrollIntoView({ behavior: 'smooth', block: 'start' });
      announce(`Compression started for ${selectedFiles.length} file(s).`);
      startProgressStream(data.job_id, selectedFiles);

    } catch (err) {
      alert(`Error: ${err.message}`);
      _resetCompressBtn();
    }
  });
}

function _resetCompressBtn() {
  els.compressBtn.disabled = false;
  els.compressBtn.innerHTML =
    '<span class="btn-icon-prefix" aria-hidden="true">⚡</span> Compress Selected Files';
}

// ─── Cancel ───────────────────────────────────────────────────────────────────

function _initCancelButton() {
  els.cancelBtn.addEventListener('click', async () => {
    if (!state.currentJobId) return;
    if (!confirm(
      'Cancel all compression operations? ' +
      'Any file currently being processed will be deleted.',
    )) return;

    els.cancelBtn.disabled = true;
    els.cancelBtn.textContent = 'Cancelling…';

    try {
      await fetch(`/api/compress/cancel/${state.currentJobId}`, { method: 'POST' });
      announce('Cancellation requested.');
    } catch (err) {
      console.error('Cancel error:', err);
    }
  });
}

// ─── Restart (new session) ────────────────────────────────────────────────────

function _initRestartButton() {
  els.restartBtn.addEventListener('click', () => {
    // Clear state
    state.scannedFiles = [];
    state.selectedPaths.clear();
    state.currentJobId = null;
    state.compressionResults = [];
    state.seenEventCount = 0;

    if (state.eventSource) {
      state.eventSource.close();
      state.eventSource = null;
    }
    if (state.reconnectTimer) {
      clearTimeout(state.reconnectTimer);
      state.reconnectTimer = null;
    }
    sessionStorage.removeItem('vp-job-id');

    // Reset UI
    els.sectionFiles.hidden = true;
    els.sectionProgress.hidden = true;
    els.sectionResults.hidden = true;
    els.fileTbody.innerHTML = '';
    els.fileProgressList.innerHTML = '';
    els.scanStatus.textContent = '';
    els.notifyChk.checked = false;
    els.notifyEmailRow.hidden = true;
    els.notifyEmail.value = '';
    els.notifyStatus.hidden = true;
    els.notifyStatus.textContent = '';
    els.streamLostBanner.hidden = true;
    els.reconnectBtn.hidden = true;
    els.cancelBtn.disabled = false;
    els.cancelBtn.textContent = '✕ Cancel Compression';
    _resetCompressBtn();

    document.getElementById('section-config')
      .scrollIntoView({ behavior: 'smooth' });
    els.dirInput.focus();
    announce('Session reset. Ready to scan again.');
  });
}
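The client-side check in `_initCompressButton` uses a deliberately loose pattern: exactly one `@`, no whitespace, and at least one `.` in the domain part. A quick sketch of what that regex accepts and rejects:

```javascript
// Same pattern as in _initCompressButton.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

// Valid shapes: local@domain.tld, with '+' tags and subdomains allowed.
const accepted = ['user@example.com', 'a.b+tag@sub.domain.org']
  .every(e => EMAIL_RE.test(e));

// Rejected: no dot in the domain, double '@', embedded whitespace.
const rejected = ['user@localhost', 'two@@example.com', 'spaced @example.com']
  .some(e => EMAIL_RE.test(e));

console.log(accepted); // true
console.log(rejected); // false
```

This is a plausibility filter only; the server is still the authority on whether the address is deliverable.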
static/js/modules/progress.js — new file, 172 lines
@@ -0,0 +1,172 @@
/**
 * progress.js
 * -----------
 * Progress section DOM management: per-file bars, overall bar, and
 * the final results summary.
 *
 * These functions are called by both stream.js (live SSE updates) and
 * session.js (snapshot restore on reconnect / page reload).
 *
 * Exports
 * -------
 * setupProgressSection(files)
 * setOverallProgress(pct)
 * updateFileProgress(idx, pct, statusClass, statusText, detail, detailClass)
 * showStreamLost()
 * hideStreamLost()
 * showResults(finalStatus)
 */

import { state, els, announce } from './state.js';
import { esc } from './utils.js';

// ─── Progress section setup ───────────────────────────────────────────────────

/**
 * Render the initial per-file progress items and reset counters.
 * Called when a new compression job starts or when the DOM needs to be
 * rebuilt after a full page reload.
 * @param {Array} files — file objects from the job (name, path, …)
 */
export function setupProgressSection(files) {
  els.progTotal.textContent = files.length;
  els.progDone.textContent = '0';
  els.progStatus.textContent = 'Running';
  setOverallProgress(0);

  let html = '';
  files.forEach((f, idx) => {
    html += `
      <div class="file-progress-item" id="fpi-${idx}" role="listitem"
           aria-label="File ${idx + 1} of ${files.length}: ${esc(f.name)}">
        <div class="fp-header">
          <span class="fp-name">${esc(f.name)}</span>
          <span class="fp-status waiting" id="fps-${idx}"
                aria-live="polite">Waiting</span>
        </div>
        <div class="fp-bar-wrap" aria-label="Progress for ${esc(f.name)}">
          <div class="fp-bar" role="progressbar"
               aria-valuenow="0" aria-valuemin="0" aria-valuemax="100"
               id="fpbar-${idx}">
            <div class="fp-bar-fill" id="fpfill-${idx}" style="width:0%"></div>
          </div>
          <span class="fp-pct" id="fppct-${idx}" aria-hidden="true">0%</span>
        </div>
        <div class="fp-detail" id="fpdetail-${idx}"></div>
      </div>`;
  });
  els.fileProgressList.innerHTML = html;
}

// ─── Bar helpers ─────────────────────────────────────────────────────────────

/**
 * Update the overall progress bar.
 * @param {number} pct 0–100
 */
export function setOverallProgress(pct) {
  const p = Math.min(100, Math.round(pct));
  els.overallBarFill.style.width = `${p}%`;
  els.overallBar.setAttribute('aria-valuenow', p);
  els.overallPct.textContent = `${p}%`;
}

/**
 * Update a single file's progress bar, status badge, and detail text.
 *
 * @param {number} idx          — file index (0-based)
 * @param {number} pct          — 0–100
 * @param {string} statusClass  — 'waiting' | 'running' | 'done' | 'error'
 * @param {string} statusText   — visible badge text
 * @param {string} [detail]     — optional sub-text (elapsed time, size saved…)
 * @param {string} [detailClass]— optional class applied to the detail element
 */
export function updateFileProgress(idx, pct, statusClass, statusText, detail, detailClass) {
  const fill   = document.getElementById(`fpfill-${idx}`);
  const bar    = document.getElementById(`fpbar-${idx}`);
  const pctEl  = document.getElementById(`fppct-${idx}`);
  const status = document.getElementById(`fps-${idx}`);
  const item   = document.getElementById(`fpi-${idx}`);
  const det    = document.getElementById(`fpdetail-${idx}`);
  if (!fill) return;

  const p = Math.min(100, Math.round(pct));
  fill.style.width = `${p}%`;
  bar.setAttribute('aria-valuenow', p);
  pctEl.textContent = `${p}%`;
  status.className = `fp-status ${statusClass}`;
  status.textContent = statusText;
  item.className = `file-progress-item ${statusClass}`;
  fill.classList.toggle('active', statusClass === 'running');

  if (detail !== undefined) {
    det.textContent = detail;
    det.className = `fp-detail ${detailClass || ''}`;
  }
}

// ─── Stream-lost banner ───────────────────────────────────────────────────────

/** Show the disconnection warning banner and reveal the Reconnect button. */
export function showStreamLost() {
  els.streamLostBanner.hidden = false;
  els.reconnectBtn.hidden = false;
  els.progStatus.textContent = 'Disconnected';
  announce('Live progress stream disconnected. Use Reconnect to resume.');
}

/** Hide the disconnection warning banner and Reconnect button. */
export function hideStreamLost() {
  els.streamLostBanner.hidden = true;
  els.reconnectBtn.hidden = true;
}

// ─── Results summary ─────────────────────────────────────────────────────────

/**
 * Render the Step 4 results card and scroll it into view.
 * @param {'done'|'cancelled'} finalStatus
 */
export function showResults(finalStatus) {
  const results = state.compressionResults;
  let html = '';

  if (finalStatus === 'cancelled') {
    html += `<p style="color:var(--text-muted);margin-bottom:var(--space-md)">
      Compression was cancelled. Completed files are listed below.</p>`;
  }

  if (!results.length && finalStatus === 'cancelled') {
    html += '<p style="color:var(--text-muted)">No files were completed before cancellation.</p>';
  }

  results.forEach(r => {
    if (r.status === 'done') {
      html += `
        <div class="result-row">
          <span class="result-icon">✅</span>
          <div class="result-info">
            <div class="result-name">${esc(r.filename)}</div>
            <div class="result-meta">→ ${esc(r.output || '')}</div>
          </div>
          <span class="result-reduction">-${r.reduction_pct}%</span>
        </div>`;
    } else if (r.status === 'error') {
      html += `
        <div class="result-row">
          <span class="result-icon">❌</span>
          <div class="result-info">
            <div class="result-name">${esc(r.filename)}</div>
            <div class="result-meta" style="color:var(--text-danger)">
              ${esc(r.message)}
            </div>
          </div>
        </div>`;
    }
  });

  if (!html) html = '<p style="color:var(--text-muted)">No results to display.</p>';
  els.resultsContent.innerHTML = html;
  els.sectionResults.hidden = false;
  els.sectionResults.scrollIntoView({ behavior: 'smooth', block: 'start' });
}
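The overall bar combines completed files with the active file's percentage, `((doneCount + pct/100) / total) * 100`, which `setOverallProgress` then rounds and clamps to 100. A sketch of just that arithmetic (the `overallPct` helper name is illustrative, not from the module):

```javascript
// Overall-progress arithmetic: completed files plus the active file's
// fraction, then rounded and clamped exactly as setOverallProgress does.
function overallPct(doneCount, activePct, totalFiles) {
  const raw = ((doneCount + activePct / 100) / totalFiles) * 100;
  return Math.min(100, Math.round(raw));
}

console.log(overallPct(1, 50, 4)); // 38  (1.5 of 4 files ≈ 37.5%, rounded)
console.log(overallPct(4, 0, 4));  // 100

// The clamp matters when a late 'progress' event arrives after the last
// 'file_done' has already pushed doneCount to the total.
console.log(overallPct(3, 120, 3)); // 100
```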
static/js/modules/scan.js — new file, 194 lines
@@ -0,0 +1,194 @@
/**
 * scan.js
 * -------
 * Directory scan and file selection table.
 *
 * Handles the "Scan for Files" button, the /api/scan fetch, rendering the
 * results table, and the select-all / deselect-all controls.
 *
 * Exports
 * -------
 * initScan() — attach all event listeners; call once at startup
 */

import { state, els, announce } from './state.js';
import { esc } from './utils.js';

// ─── Status helper ────────────────────────────────────────────────────────────

function setScanStatus(msg, type) {
  els.scanStatus.textContent = msg;
  els.scanStatus.style.color =
    type === 'error'     ? 'var(--text-danger)'
    : type === 'success' ? 'var(--text-success)'
    : type === 'warn'    ? 'var(--accent)'
    : 'var(--text-muted)';
}

// ─── File table ───────────────────────────────────────────────────────────────

function updateSelectionUI() {
  els.selectionSummary.textContent =
    `${state.selectedPaths.size} of ${state.scannedFiles.length} selected`;
  els.compressBtn.disabled = state.selectedPaths.size === 0;
}

/**
 * Build and inject the file selection table from the scan results.
 * Attaches checkbox change handlers for each row.
 * @param {Array} files — enriched file objects from /api/scan
 */
export function renderFileTable(files) {
  let html = '';

  files.forEach((f, idx) => {
    const codec = (f.codec || 'unknown').toLowerCase();
    const isHevc = ['hevc', 'h265', 'x265'].includes(codec);
    const isH264 = ['h264', 'avc', 'x264'].includes(codec);
    const codecLabel = isHevc ? 'H.265 / HEVC'
      : isH264 ? 'H.264 / AVC'
      : codec.toUpperCase();
    const codecMod = isHevc ? 'hevc' : isH264 ? 'h264' : '';
    const pathDir = f.path.replace(f.name, '');

    html += `
      <tr id="row-${idx}" data-path="${esc(f.path)}">
        <td class="col-check">
          <input type="checkbox" class="file-checkbox" id="chk-${idx}"
                 data-path="${esc(f.path)}"
                 aria-label="Select ${esc(f.name)} for compression" />
        </td>
        <td class="col-name">
          <label for="chk-${idx}" class="file-name-cell" style="cursor:pointer">
            <span class="file-name">${esc(f.name)}</span>
            <span class="file-path" title="${esc(f.path)}">${esc(pathDir)}</span>
          </label>
        </td>
        <td class="col-size"><strong>${f.size_gb.toFixed(3)}</strong> GB</td>
        <td class="col-bitrate">
          <span class="bitrate-badge">
            ${esc(f.bit_rate_mbps ? f.bit_rate_mbps + ' Mbps' : 'Unknown')}
          </span>
        </td>
        <td class="col-target">
          <span class="bitrate-badge target">
            ${esc(f.target_bit_rate_mbps ? f.target_bit_rate_mbps + ' Mbps' : '—')}
          </span>
        </td>
        <td class="col-codec">
          <span class="codec-tag ${codecMod}"
                title="Encoder: ${isHevc ? 'libx265' : 'libx264'}">
            ${esc(codecLabel)}
          </span>
        </td>
      </tr>`;
  });

  els.fileTbody.innerHTML = html;

  els.fileTbody.querySelectorAll('.file-checkbox').forEach(chk => {
    chk.addEventListener('change', () => {
      const row = chk.closest('tr');
      if (chk.checked) {
        state.selectedPaths.add(chk.dataset.path);
        row.classList.add('selected');
      } else {
        state.selectedPaths.delete(chk.dataset.path);
        row.classList.remove('selected');
      }
      updateSelectionUI();
    });
  });

  updateSelectionUI();
}

// ─── Public init ─────────────────────────────────────────────────────────────

/**
 * Attach event listeners for the scan button and file selection controls.
 * Call once during app initialisation.
 */
export function initScan() {
  // ── Scan button ──────────────────────────────────────────────────────────
  els.scanBtn.addEventListener('click', async () => {
    const directory = els.dirInput.value.trim();
    const minSize = parseFloat(els.minSizeInput.value);

    if (!directory) {
      setScanStatus('Please enter a directory path.', 'error');
      els.dirInput.focus();
      return;
    }
    if (isNaN(minSize) || minSize <= 0) {
      setScanStatus('Please enter a valid minimum size > 0.', 'error');
      els.minSizeInput.focus();
      return;
    }

    els.scanBtn.disabled = true;
    els.scanBtn.textContent = '⟳ Scanning…';
    setScanStatus('Scanning directory, please wait…', 'info');
    announce('Scanning directory for video files.');
    els.sectionFiles.hidden = true;

    try {
      const resp = await fetch('/api/scan', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ directory, min_size_gb: minSize }),
      });
      const data = await resp.json();

      if (!resp.ok) {
        setScanStatus(`Error: ${data.error}`, 'error');
        announce(`Scan failed: ${data.error}`);
        return;
      }

      state.scannedFiles = data.files;
      state.selectedPaths.clear();

      if (!data.files.length) {
        setScanStatus(`No video files larger than ${minSize} GB found.`, 'warn');
        announce('No video files found matching your criteria.');
        return;
      }

      setScanStatus(`Found ${data.files.length} file(s).`, 'success');
      announce(`Scan complete. Found ${data.files.length} video files.`);
      renderFileTable(data.files);
      els.sectionFiles.hidden = false;
      els.sectionFiles.scrollIntoView({ behavior: 'smooth', block: 'start' });

    } catch (err) {
      setScanStatus(`Network error: ${err.message}`, 'error');
    } finally {
      els.scanBtn.disabled = false;
      els.scanBtn.innerHTML =
        '<span class="btn-icon-prefix" aria-hidden="true">⊙</span> Scan for Files';
    }
  });

  // ── Select all ───────────────────────────────────────────────────────────
  els.selectAllBtn.addEventListener('click', () => {
    els.fileTbody.querySelectorAll('.file-checkbox').forEach(chk => {
      chk.checked = true;
      state.selectedPaths.add(chk.dataset.path);
      chk.closest('tr').classList.add('selected');
|
||||
});
|
||||
updateSelectionUI();
|
||||
announce(`All ${state.scannedFiles.length} files selected.`);
|
||||
});
|
||||
|
||||
// ── Deselect all ─────────────────────────────────────────────────────────
|
||||
els.deselectAllBtn.addEventListener('click', () => {
|
||||
els.fileTbody.querySelectorAll('.file-checkbox').forEach(chk => {
|
||||
chk.checked = false;
|
||||
state.selectedPaths.delete(chk.dataset.path);
|
||||
chk.closest('tr').classList.remove('selected');
|
||||
});
|
||||
updateSelectionUI();
|
||||
announce('All files deselected.');
|
||||
});
|
||||
}
|
||||
59  static/js/modules/session.js  Normal file
@@ -0,0 +1,59 @@
/**
 * session.js
 * ----------
 * Page-load session restore.
 *
 * On every page load — including hard browser reloads (Ctrl+Shift+R) and
 * opening the app in a new tab — asks the server whether a job is active,
 * fetches its full snapshot, and reconnects the live SSE stream if needed.
 *
 * This does NOT depend on sessionStorage surviving the reload (though
 * sessionStorage is still written as a fast secondary hint).
 *
 * Exports
 * -------
 * tryRestoreSession() — call once at startup
 */

import { announce } from './state.js';
import { applySnapshot, startProgressStream } from './stream.js';
import { showResults } from './progress.js';

/**
 * Query the server for active/recent jobs and restore the UI if one is found.
 *
 * Strategy:
 *   1. GET /api/compress/active — find the most recent running job (or any job)
 *   2. GET /api/compress/status/<id> — fetch the full snapshot
 *   3. applySnapshot() to rebuild all progress bars
 *   4. If still running: re-attach the SSE stream
 *   5. If done/cancelled: show the results card
 */
export async function tryRestoreSession() {
  try {
    const activeResp = await fetch('/api/compress/active');
    if (!activeResp.ok) return;

    const { jobs } = await activeResp.json();
    if (!jobs.length) return;

    // Prefer the most recent running job; fall back to any job
    const candidate = jobs.find(j => j.status === 'running') || jobs[0];

    const snapResp = await fetch(`/api/compress/status/${candidate.job_id}`);
    if (!snapResp.ok) return;

    const snap = await snapResp.json();
    applySnapshot(snap);
    announce('Active compression job restored.');

    if (snap.status === 'running') {
      startProgressStream(snap.job_id, snap.files);
    } else if (snap.status === 'done' || snap.status === 'cancelled') {
      showResults(snap.status);
      sessionStorage.removeItem('vp-job-id');
    }
  } catch {
    // Server unreachable or no jobs — start fresh, no action needed
  }
}
260  static/js/modules/settings.js  Normal file
@@ -0,0 +1,260 @@
/**
 * settings.js
 * -----------
 * SMTP email settings modal.
 *
 * Loads saved settings from the server on open, lets the user edit and
 * save them, and sends a test email to verify the configuration works.
 *
 * Exports
 * -------
 * initSettings()     — wire up all listeners; call once at startup
 * smtpIsConfigured() — returns true if the server has smtp_host saved
 */

import { announce } from './state.js';

// ─── DOM refs (local to this module) ─────────────────────────────────────────

const $ = id => document.getElementById(id);

const modal         = $('settings-modal');
const openBtn       = $('settings-btn');
const openFromHint  = $('open-settings-from-hint');
const closeBtn      = $('close-settings');
const cancelBtn     = $('settings-cancel');
const saveBtn       = $('settings-save');
const saveStatus    = $('settings-save-status');

const hostInput     = $('smtp-host');
const portInput     = $('smtp-port');
const securitySel   = $('smtp-security');
const fromInput     = $('smtp-from');
const userInput     = $('smtp-user');
const passwordInput = $('smtp-password');
const passwordHint  = $('smtp-password-hint');
const togglePwBtn   = $('toggle-password');

const testToInput   = $('smtp-test-to');
const testBtn       = $('smtp-test-btn');
const testResult    = $('smtp-test-result');

// ─── Module-level state ───────────────────────────────────────────────────────

let _configured = false; // whether smtp_host is set on the server

// ─── Public API ───────────────────────────────────────────────────────────────

/**
 * Returns true if the server has an SMTP host configured.
 * Used by compress.js to warn the user before they start a job with
 * notifications enabled but no SMTP server set up.
 */
export function smtpIsConfigured() {
  return _configured;
}

/**
 * Attach all event listeners for the settings modal.
 * Call once during app initialisation.
 */
export function initSettings() {
  openBtn.addEventListener('click', openSettings);
  if (openFromHint) openFromHint.addEventListener('click', openSettings);
  const openFromWarn = document.getElementById('open-settings-from-warn');
  if (openFromWarn) openFromWarn.addEventListener('click', openSettings);

  closeBtn.addEventListener('click', closeSettings);
  cancelBtn.addEventListener('click', closeSettings);
  modal.addEventListener('click', e => { if (e.target === modal) closeSettings(); });
  document.addEventListener('keydown', e => {
    if (e.key === 'Escape' && !modal.hidden) closeSettings();
  });

  saveBtn.addEventListener('click', saveSettings);
  testBtn.addEventListener('click', sendTestEmail);

  // Password show/hide toggle
  togglePwBtn.addEventListener('click', () => {
    const isHidden = passwordInput.type === 'password';
    passwordInput.type = isHidden ? 'text' : 'password';
    togglePwBtn.setAttribute('aria-label', isHidden ? 'Hide password' : 'Show password');
  });

  // Auto-fill port when security mode changes
  securitySel.addEventListener('change', () => {
    const presets = { tls: '587', ssl: '465', none: '25' };
    portInput.value = presets[securitySel.value] || portInput.value;
  });

  // Load current config silently at startup so smtpIsConfigured() works
  _fetchConfig(false);
}

// ─── Open / close ─────────────────────────────────────────────────────────────

async function openSettings() {
  modal.hidden = false;
  document.body.style.overflow = 'hidden';
  clearStatus();
  await _fetchConfig(true);
  closeBtn.focus();
  announce('SMTP settings panel opened');
}

function closeSettings() {
  modal.hidden = true;
  document.body.style.overflow = '';
  openBtn.focus();
  announce('SMTP settings panel closed');
}

// ─── Load settings from server ────────────────────────────────────────────────

async function _fetchConfig(populateForm) {
  try {
    const resp = await fetch('/api/settings/smtp');
    if (!resp.ok) return;
    const cfg = await resp.json();

    _configured = Boolean(cfg.host);

    if (!populateForm) return;

    hostInput.value = cfg.host || '';
    portInput.value = cfg.port || '587';
    fromInput.value = cfg.from_addr || '';
    userInput.value = cfg.user || '';
    passwordInput.value = ''; // never pre-fill passwords

    // Select the right security option
    const opt = securitySel.querySelector(`option[value="${cfg.security || 'tls'}"]`);
    if (opt) opt.selected = true;

    passwordHint.textContent = cfg.password_set
      ? 'A password is saved. Enter a new value to replace it, or leave blank to keep it.'
      : '';

  } catch {
    // Silently ignore — server may not be reachable during init
  }
}

// ─── Save ────────────────────────────────────────────────────────────────────

async function saveSettings() {
  const host     = hostInput.value.trim();
  const port     = portInput.value.trim();
  const security = securitySel.value;
  const from     = fromInput.value.trim();
  const user     = userInput.value.trim();
  const password = passwordInput.value; // not trimmed — passwords may have spaces

  if (!host) {
    showStatus('SMTP server host is required.', 'fail');
    hostInput.focus();
    return;
  }
  if (!port || isNaN(Number(port))) {
    showStatus('A valid port number is required.', 'fail');
    portInput.focus();
    return;
  }
  if (!from || !from.includes('@')) {
    showStatus('A valid From address is required.', 'fail');
    fromInput.focus();
    return;
  }

  saveBtn.disabled = true;
  saveBtn.textContent = 'Saving…';
  clearStatus();

  try {
    const body = { host, port, security, from_addr: from, user };
    if (password) body.password = password;

    const resp = await fetch('/api/settings/smtp', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(body),
    });
    const data = await resp.json();

    if (!resp.ok) {
      showStatus(`Error: ${data.error}`, 'fail');
      return;
    }

    _configured = Boolean(data.config?.host);
    passwordInput.value = '';
    passwordHint.textContent =
      'Password saved. Enter a new value to replace it, or leave blank to keep it.';
    showStatus('Settings saved successfully.', 'ok');
    announce('SMTP settings saved.');

  } catch (err) {
    showStatus(`Network error: ${err.message}`, 'fail');
  } finally {
    saveBtn.disabled = false;
    saveBtn.textContent = 'Save Settings';
  }
}

// ─── Test email ───────────────────────────────────────────────────────────────

async function sendTestEmail() {
  const to = testToInput.value.trim();
  if (!to || !to.includes('@')) {
    setTestResult('Please enter a valid recipient address.', 'fail');
    testToInput.focus();
    return;
  }

  testBtn.disabled = true;
  testBtn.textContent = 'Sending…';
  setTestResult('Sending test email…', '');

  try {
    const resp = await fetch('/api/settings/smtp/test', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ to }),
    });
    const data = await resp.json();

    if (data.ok) {
      setTestResult(`✓ ${data.message}`, 'ok');
      announce(`Test email sent to ${to}.`);
    } else {
      setTestResult(`✗ ${data.message}`, 'fail');
      announce(`Test email failed: ${data.message}`);
    }
  } catch (err) {
    setTestResult(`Network error: ${err.message}`, 'fail');
  } finally {
    testBtn.disabled = false;
    testBtn.textContent = 'Send Test';
  }
}

// ─── Helpers ─────────────────────────────────────────────────────────────────

function showStatus(msg, type) {
  saveStatus.textContent = msg;
  saveStatus.className = `settings-save-status ${type}`;
}

function clearStatus() {
  saveStatus.textContent = '';
  saveStatus.className = 'settings-save-status';
  setTestResult('', '');
}

function setTestResult(msg, type) {
  testResult.textContent = msg;
  testResult.style.color =
    type === 'ok'     ? 'var(--text-success)'
    : type === 'fail' ? 'var(--text-danger)'
    : 'var(--text-muted)';
}
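For reference, the request body `saveSettings()` builds and the config object `_fetchConfig()` consumes look roughly like this. Field names are the ones the module reads and writes; the values, and any extra server-side fields, are assumptions:

```javascript
// Illustrative POST /api/settings/smtp body (values are made up).
// 'password' is omitted entirely when the field is left blank, which
// tells the server to keep the existing password.
const saveBody = {
  host: 'smtp.example.com',
  port: '587',
  security: 'tls',   // 'tls' | 'ssl' | 'none', with port presets 587/465/25
  from_addr: 'videopress@example.com',
  user: 'videopress',
  password: 'secret',
};

// Illustrative GET /api/settings/smtp response, as read by _fetchConfig().
// The password itself is never echoed back, only a password_set flag.
const cfg = {
  host: 'smtp.example.com',
  port: '587',
  security: 'tls',
  from_addr: 'videopress@example.com',
  user: 'videopress',
  password_set: true,
};

console.assert(Boolean(cfg.host)); // this is what drives smtpIsConfigured()
```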
119  static/js/modules/state.js  Normal file
@@ -0,0 +1,119 @@
/**
 * state.js
 * --------
 * Single shared application state object and all DOM element references.
 *
 * Centralising these here means every module imports the same live object —
 * mutations made in one module are immediately visible to all others without
 * any event bus or pub/sub layer.
 *
 * Also exports announce(), which every module uses to push messages to the
 * ARIA live region for screen-reader users.
 */

// ─── Shared mutable state ────────────────────────────────────────────────────
export const state = {
  /** Files returned by the last /api/scan call. */
  scannedFiles: [],

  /** Set of file paths the user has checked for compression. */
  selectedPaths: new Set(),

  /** job_id of the currently active or most-recently-seen compression job. */
  currentJobId: null,

  /** Active EventSource for the SSE progress stream. */
  eventSource: null,

  /** Per-file result objects accumulated during a compression run. */
  compressionResults: [],

  /** Current path shown in the server-side directory browser modal. */
  browserPath: '/',

  /**
   * Index of the last SSE event we have processed.
   * Passed as ?from=N when reconnecting so the server skips events
   * we already applied to the UI.
   */
  seenEventCount: 0,

  /** Handle returned by setTimeout for the auto-reconnect retry. */
  reconnectTimer: null,
};

// ─── DOM element references ───────────────────────────────────────────────────
const $ = id => document.getElementById(id);

export const els = {
  // Step 1 — Configure source
  dirInput:     $('dir-input'),
  browseBtn:    $('browse-btn'),
  minSizeInput: $('min-size-input'),
  suffixInput:  $('suffix-input'),
  scanBtn:      $('scan-btn'),
  scanStatus:   $('scan-status'),

  // Directory browser modal
  browserModal:  $('browser-modal'),
  browserList:   $('browser-list'),
  browserPath:   $('browser-current-path'),
  closeBrowser:  $('close-browser'),
  browserCancel: $('browser-cancel'),
  browserSelect: $('browser-select'),

  // Step 2 — File selection
  sectionFiles:     $('section-files'),
  selectAllBtn:     $('select-all-btn'),
  deselectAllBtn:   $('deselect-all-btn'),
  selectionSummary: $('selection-summary'),
  fileTbody:        $('file-tbody'),
  compressBtn:      $('compress-btn'),

  // Email notification opt-in
  notifyChk:      $('notify-chk'),
  notifyEmailRow: $('notify-email-row'),
  notifyEmail:    $('notify-email'),

  // Step 3 — Compression progress
  sectionProgress:    $('section-progress'),
  progTotal:          $('prog-total'),
  progDone:           $('prog-done'),
  progStatus:         $('prog-status'),
  overallBar:         $('overall-bar'),
  overallBarFill:     $('overall-bar-fill'),
  overallPct:         $('overall-pct'),
  fileProgressList:   $('file-progress-list'),
  cancelBtn:          $('cancel-btn'),
  notifyStatus:       $('notify-status'),
  reconnectBtn:       $('reconnect-btn'),
  reconnectBtnBanner: $('reconnect-btn-banner'),
  streamLostBanner:   $('stream-lost-banner'),

  // Step 4 — Results
  sectionResults: $('section-results'),
  resultsContent: $('results-content'),
  restartBtn:     $('restart-btn'),

  // Header
  themeToggle: $('theme-toggle'),
  themeIcon:   $('theme-icon'),
  settingsBtn: $('settings-btn'),

  // Accessibility live region
  srAnnounce: $('sr-announce'),
};

// ─── Screen-reader announcements ─────────────────────────────────────────────

/**
 * Push a message to the ARIA assertive live region.
 * Clears first so repeated identical messages are still announced.
 * @param {string} msg
 */
export function announce(msg) {
  els.srAnnounce.textContent = '';
  requestAnimationFrame(() => {
    els.srAnnounce.textContent = msg;
  });
}
276  static/js/modules/stream.js  Normal file
@@ -0,0 +1,276 @@
/**
 * stream.js
 * ---------
 * SSE progress stream management and reconnect / snapshot-restore logic.
 *
 * Exports
 * -------
 * startProgressStream(jobId, files) — open (or re-open) the SSE connection
 * reconnectToJob(jobId)             — fetch snapshot then re-open stream
 * applySnapshot(snap)               — paint a server snapshot onto the UI
 * initStreamControls()              — wire up Reconnect buttons; call once
 */

import { state, els, announce } from './state.js';
import { fmtTime } from './utils.js';
import {
  setupProgressSection,
  setOverallProgress,
  updateFileProgress,
  showStreamLost,
  hideStreamLost,
  showResults,
} from './progress.js';

// ─── SSE stream ───────────────────────────────────────────────────────────────

/**
 * Open a Server-Sent Events connection for *jobId*.
 *
 * Resumes from state.seenEventCount so no events are replayed or skipped
 * after a reconnect. doneCount is seeded from already-known results so
 * the overall progress bar is correct on the first incoming event.
 *
 * @param {string} jobId
 * @param {Array} files — file objects (need .name for announcements)
 */
export function startProgressStream(jobId, files) {
  // Cancel any pending auto-reconnect timer
  if (state.reconnectTimer) {
    clearTimeout(state.reconnectTimer);
    state.reconnectTimer = null;
  }
  // Close any stale connection
  if (state.eventSource) {
    state.eventSource.close();
    state.eventSource = null;
  }
  hideStreamLost();

  state.eventSource = new EventSource(
    `/api/compress/progress/${jobId}?from=${state.seenEventCount}`,
  );

  // Seed from results already recorded by applySnapshot (reconnect path)
  let doneCount = state.compressionResults.filter(
    r => r.status === 'done' || r.status === 'error',
  ).length;

  state.eventSource.onmessage = evt => {
    let data;
    try { data = JSON.parse(evt.data); } catch { return; }
    state.seenEventCount++;

    switch (data.type) {

      case 'start':
        els.progStatus.textContent = 'Running';
        break;

      case 'file_start':
        updateFileProgress(data.index, 0, 'running', 'Compressing…', '', '');
        document.getElementById(`fpi-${data.index}`)
          ?.scrollIntoView({ behavior: 'smooth', block: 'nearest' });
        announce(
          `Compressing file ${data.index + 1} of ${data.total}: ` +
          `${files[data.index]?.name || ''}`,
        );
        break;

      case 'progress': {
        const pct = data.percent || 0;
        const detail = (data.elapsed_secs > 0 && data.duration_secs > 0)
          ? `${fmtTime(data.elapsed_secs)} / ${fmtTime(data.duration_secs)}` : '';
        updateFileProgress(data.index, pct, 'running', 'Compressing…', detail, '');
        setOverallProgress(((doneCount + pct / 100) / files.length) * 100);
        break;
      }

      case 'file_done': {
        doneCount++;
        els.progDone.textContent = doneCount;
        const detail = data.reduction_pct
          ? `Saved ${data.reduction_pct}% → ${data.output_size_gb} GB` : 'Complete';
        updateFileProgress(data.index, 100, 'done', '✓ Done', detail, 'success');
        setOverallProgress((doneCount / files.length) * 100);
        // Guard against replay on reconnect
        if (!state.compressionResults.find(
          r => r.index === data.index && r.status === 'done',
        )) {
          state.compressionResults.push({ ...data, status: 'done' });
        }
        announce(
          `File complete: ${files[data.index]?.name}. Saved ${data.reduction_pct}%.`,
        );
        break;
      }

      case 'file_error': {
        doneCount++;
        els.progDone.textContent = doneCount;
        updateFileProgress(data.index, 0, 'error', '✗ Error', data.message, 'error');
        if (!state.compressionResults.find(
          r => r.index === data.index && r.status === 'error',
        )) {
          state.compressionResults.push({ ...data, status: 'error' });
        }
        announce(`Error: ${files[data.index]?.name}: ${data.message}`);
        break;
      }

      case 'notify':
        els.notifyStatus.hidden = false;
        els.notifyStatus.className = `notify-status ${data.success ? 'ok' : 'fail'}`;
        els.notifyStatus.textContent = `✉ ${data.message}`;
        announce(data.message);
        break;

      case 'done':
        state.eventSource.close();
        sessionStorage.removeItem('vp-job-id');
        els.progStatus.textContent = 'Complete';
        setOverallProgress(100);
        els.cancelBtn.disabled = true;
        announce('All compression operations complete.');
        showResults('done');
        break;

      case 'cancelled':
        state.eventSource.close();
        sessionStorage.removeItem('vp-job-id');
        els.progStatus.textContent = 'Cancelled';
        announce('Compression cancelled.');
        showResults('cancelled');
        break;

      case 'error':
        state.eventSource.close();
        els.progStatus.textContent = 'Error';
        announce(`Compression error: ${data.message}`);
        break;
    }
  };

  state.eventSource.onerror = () => {
    // CLOSED means the stream ended cleanly (done/cancelled) — ignore.
    if (!state.eventSource || state.eventSource.readyState === EventSource.CLOSED) return;
    state.eventSource.close();
    state.eventSource = null;
    showStreamLost();
    // Auto-retry after 5 s
    state.reconnectTimer = setTimeout(() => {
      if (state.currentJobId) reconnectToJob(state.currentJobId);
    }, 5_000);
  };
}

// ─── Reconnect ────────────────────────────────────────────────────────────────

/**
 * Fetch a fresh status snapshot from the server, rebuild the progress UI to
 * reflect everything that happened while disconnected, then re-open the SSE
 * stream starting from the last event already processed.
 *
 * @param {string} jobId
 */
export async function reconnectToJob(jobId) {
  if (state.reconnectTimer) {
    clearTimeout(state.reconnectTimer);
    state.reconnectTimer = null;
  }
  hideStreamLost();
  els.progStatus.textContent = 'Reconnecting…';
  announce('Reconnecting to compression job…');

  try {
    const resp = await fetch(`/api/compress/status/${jobId}`);
    if (!resp.ok) throw new Error('Job no longer available on server.');
    const snap = await resp.json();

    applySnapshot(snap);

    if (snap.status === 'done' || snap.status === 'cancelled') {
      showResults(snap.status);
      sessionStorage.removeItem('vp-job-id');
    } else {
      startProgressStream(jobId, snap.files);
      announce('Reconnected. Progress restored.');
    }
  } catch (err) {
    els.progStatus.textContent = 'Reconnect failed';
    showStreamLost();
    els.streamLostBanner.querySelector('.banner-text').textContent =
      `Could not reconnect: ${err.message}`;
    announce(`Reconnect failed: ${err.message}`);
  }
}

// ─── Snapshot restore ────────────────────────────────────────────────────────

/**
 * Paint a server-supplied status snapshot onto the progress UI.
 *
 * Called by:
 *   - reconnectToJob() after a mid-session SSE drop
 *   - tryRestoreSession() on every page load to recover an active job
 *
 * @param {object} snap — response from GET /api/compress/status/<id>
 */
export function applySnapshot(snap) {
  // Rebuild the per-file DOM if the page was reloaded and lost it
  if (!document.getElementById('fpi-0')) {
    setupProgressSection(snap.files);
  }

  state.currentJobId = snap.job_id;
  state.seenEventCount = snap.event_count;
  sessionStorage.setItem('vp-job-id', snap.job_id);

  els.sectionProgress.hidden = false;
  els.progTotal.textContent = snap.total;
  els.progDone.textContent = snap.done_count;
  els.progStatus.textContent =
    snap.status === 'running'     ? 'Running'
    : snap.status === 'done'      ? 'Complete'
    : snap.status === 'cancelled' ? 'Cancelled'
    : snap.status;

  // Restore each file bar from the snapshot's computed file_states
  snap.file_states.forEach((fs, idx) => {
    const statusClass = { done: 'done', error: 'error', running: 'running' }[fs.status] || 'waiting';
    const statusText  = { done: '✓ Done', error: '✗ Error', running: 'Compressing…' }[fs.status] || 'Waiting';
    const detailClass = { done: 'success', error: 'error' }[fs.status] || '';
    updateFileProgress(idx, fs.percent || 0, statusClass, statusText, fs.detail || '', detailClass);
  });

  // Restore overall bar
  const runningPct = snap.file_states.find(f => f.status === 'running')?.percent || 0;
  const overall = snap.total > 0
    ? ((snap.done_count + runningPct / 100) / snap.total) * 100 : 0;
  setOverallProgress(Math.min(overall, 100));

  // Seed compressionResults so showResults() has data if job is already done
  state.compressionResults = snap.file_states
    .filter(fs => fs.status === 'done' || fs.status === 'error')
    .map((fs, idx) => ({ ...fs, index: idx }));

  if (snap.status === 'done') {
    els.cancelBtn.disabled = true;
    setOverallProgress(100);
  }
}

// ─── Button wiring ────────────────────────────────────────────────────────────

/**
 * Attach click handlers to both Reconnect buttons (title-bar and banner).
 * Call once during app initialisation.
 */
export function initStreamControls() {
  els.reconnectBtn.addEventListener('click', () => {
    if (state.currentJobId) reconnectToJob(state.currentJobId);
  });
  els.reconnectBtnBanner.addEventListener('click', () => {
    if (state.currentJobId) reconnectToJob(state.currentJobId);
  });
}
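The onmessage switch above implies a small per-event schema. A sketch of the event types and the fields each handler actually touches (names come from the code; the exact server payloads beyond these fields are an assumption), plus the overall-progress formula used by the 'progress' handler:

```javascript
// Event shapes inferred from the switch in startProgressStream().
// Only the fields this client reads are listed; anything else the
// server includes is simply ignored.
const examples = [
  { type: 'start' },
  { type: 'file_start', index: 0, total: 3 },
  { type: 'progress',   index: 0, percent: 42.5, elapsed_secs: 63, duration_secs: 148 },
  { type: 'file_done',  index: 0, reduction_pct: 57, output_size_gb: 1.84 },
  { type: 'file_error', index: 1, message: 'ffmpeg exited with code 1' },
  { type: 'notify',     success: true, message: 'Summary email sent' },
  { type: 'done' },
];

// Overall progress: finished files count fully, the in-flight file
// contributes its fractional percent.
const doneCount = 1, total = 3, pct = 42.5;
const overall = ((doneCount + pct / 100) / total) * 100;
console.assert(Math.abs(overall - 47.5) < 1e-9); // 47.5 %
console.assert(examples.length === 7);
```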
46  static/js/modules/theme.js  Normal file
@@ -0,0 +1,46 @@
/**
 * theme.js
 * --------
 * Dark / light theme management.
 *
 * Reads the user's saved preference from localStorage and falls back to the
 * OS-level prefers-color-scheme media query. Writes back on every change
 * so the choice persists across page loads.
 *
 * Exports
 * -------
 * initTheme()  — call once at startup; reads saved pref and applies it
 * applyTheme() — apply a specific theme string ('dark' | 'light')
 */

import { els } from './state.js';

/**
 * Apply *theme* to the document and persist the choice.
 * @param {'dark'|'light'} theme
 */
export function applyTheme(theme) {
  document.documentElement.setAttribute('data-theme', theme);
  els.themeIcon.textContent = theme === 'dark' ? '☀' : '◑';
  els.themeToggle.setAttribute(
    'aria-label',
    `Switch to ${theme === 'dark' ? 'light' : 'dark'} mode`,
  );
  localStorage.setItem('vp-theme', theme);
}

/**
 * Read the saved theme preference (or detect from OS) and apply it.
 * Attaches the toggle button's click listener.
 * Call once during app initialisation.
 */
export function initTheme() {
  const saved = localStorage.getItem('vp-theme');
  const prefersDark = window.matchMedia('(prefers-color-scheme: dark)').matches;
  applyTheme(saved || (prefersDark ? 'dark' : 'light'));

  els.themeToggle.addEventListener('click', () => {
    const current = document.documentElement.getAttribute('data-theme');
    applyTheme(current === 'dark' ? 'light' : 'dark');
  });
}
45
static/js/modules/utils.js
Normal file
45
static/js/modules/utils.js
Normal file
|
|
@ -0,0 +1,45 @@
|
|||
/**
 * utils.js
 * --------
 * Pure utility functions with no DOM or state dependencies.
 * Safe to import anywhere without side-effects.
 */

/**
 * Escape a string for safe insertion into HTML.
 * @param {*} str
 * @returns {string}
 */
export function esc(str) {
  if (!str) return '';
  return String(str)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}

/**
 * Format a duration in seconds as M:SS or H:MM:SS.
 * @param {number} seconds
 * @returns {string}
 */
export function fmtTime(seconds) {
  const s = Math.floor(seconds);
  const h = Math.floor(s / 3600);
  const m = Math.floor((s % 3600) / 60);
  const sec = s % 60;
  return h > 0
    ? `${h}:${pad(m)}:${pad(sec)}`
    : `${m}:${pad(sec)}`;
}

/**
 * Zero-pad a number to at least 2 digits.
 * @param {number} n
 * @returns {string}
 */
export function pad(n) {
  return String(n).padStart(2, '0');
}

@@ -21,6 +21,14 @@
        <span class="logo-text">Video<strong>Press</strong></span>
      </div>
      <div class="header-actions">
        <button
          id="settings-btn"
          class="btn-icon"
          aria-label="Open SMTP email settings"
          title="Email Settings"
        >
          <span aria-hidden="true">⚙</span>
        </button>
        <button
          id="theme-toggle"
          class="btn-icon"

@@ -184,6 +192,48 @@
</div>

<div class="card-footer">
  <!-- Notification opt-in -->
  <div class="notify-group" id="notify-group">
    <div class="notify-checkbox-row">
      <input
        type="checkbox"
        id="notify-chk"
        class="notify-checkbox"
        aria-describedby="notify-hint"
        aria-controls="notify-email-row"
      />
      <label for="notify-chk" class="notify-label">
        Notify me when the compression run is complete.
      </label>
    </div>
    <div class="notify-email-row" id="notify-email-row" hidden>
      <label for="notify-email" class="field-label notify-email-label">
        Email address
      </label>
      <input
        type="email"
        id="notify-email"
        class="text-input notify-email-input"
        placeholder="you@example.com"
        autocomplete="email"
        aria-describedby="notify-hint"
        aria-required="false"
        maxlength="254"
      />
      <p id="notify-hint" class="field-hint">
        An email will be sent via your configured SMTP server when all files are processed.
        Configure SMTP in <button class="btn-link" id="open-settings-from-hint">⚙ Settings</button>.
      </p>
      <p id="smtp-not-configured-warn" class="field-hint smtp-warn" hidden>
        ⚠ No SMTP server configured yet.
        <button class="btn-link" id="open-settings-from-warn">Open ⚙ Settings</button>
        to set one up before starting compression.
      </p>
    </div>
  </div>

  <div class="notify-divider" aria-hidden="true"></div>

  <button class="btn btn-primary btn-lg" id="compress-btn" disabled>
    <span class="btn-icon-prefix" aria-hidden="true">⚡</span>
    Compress Selected Files

@@ -196,8 +246,34 @@
<h2 id="progress-heading" class="card-title">
  <span class="step-badge" aria-hidden="true">03</span>
  Compression Progress
  <button
    class="btn btn-sm btn-outline reconnect-btn"
    id="reconnect-btn"
    aria-label="Reconnect to live progress stream"
    title="Stream disconnected — click to reconnect"
    hidden
  >
    ⟳ Reconnect
  </button>
</h2>

<!-- Stream-lost warning banner -->
<div
  id="stream-lost-banner"
  class="stream-lost-banner"
  role="alert"
  aria-live="assertive"
  hidden
>
  <span class="banner-icon" aria-hidden="true">⚠</span>
  <span class="banner-text">
    Live progress stream disconnected — the compression is still running on the server.
  </span>
  <button class="btn btn-sm btn-primary" id="reconnect-btn-banner">
    ⟳ Reconnect Now
  </button>
</div>

<div class="progress-overview" role="region" aria-label="Overall progress">
  <div class="overview-stats">
    <div class="stat-chip">

@@ -229,6 +305,7 @@
  <button class="btn btn-danger" id="cancel-btn" aria-label="Cancel all compression operations">
    ✕ Cancel Compression
  </button>
  <span id="notify-status" class="notify-status" aria-live="polite" aria-atomic="true" hidden></span>
</div>
</section>


@@ -246,6 +323,120 @@
  </div>
</section>

<!-- Settings modal -->
<div
  id="settings-modal"
  class="modal-backdrop"
  role="dialog"
  aria-modal="true"
  aria-labelledby="settings-modal-title"
  hidden
>
  <div class="modal-panel settings-panel">
    <div class="modal-header">
      <h3 id="settings-modal-title" class="modal-title">⚙ Email / SMTP Settings</h3>
      <button class="btn-icon" id="close-settings" aria-label="Close settings">✕</button>
    </div>

    <div class="settings-body">
      <p class="settings-intro">
        Configure your outgoing mail server so VideoPress can send completion
        notifications. Settings are saved on the server in a local SQLite database.
      </p>

      <div class="settings-grid">
        <!-- SMTP Host -->
        <div class="field-group">
          <label for="smtp-host" class="field-label">SMTP Server Host</label>
          <input type="text" id="smtp-host" class="text-input"
                 placeholder="smtp.example.com"
                 autocomplete="off" spellcheck="false" />
        </div>

        <!-- Port + Security -->
        <div class="settings-row-2">
          <div class="field-group">
            <label for="smtp-port" class="field-label">Port</label>
            <input type="number" id="smtp-port" class="text-input"
                   value="587" min="1" max="65535" />
          </div>
          <div class="field-group">
            <label for="smtp-security" class="field-label">Security</label>
            <select id="smtp-security" class="text-input select-input">
              <option value="tls">STARTTLS (587)</option>
              <option value="ssl">SSL / TLS (465)</option>
              <option value="none">None (25)</option>
            </select>
          </div>
        </div>

        <!-- From address -->
        <div class="field-group">
          <label for="smtp-from" class="field-label">From Address</label>
          <input type="text" id="smtp-from" class="text-input"
                 placeholder="videopress@yourdomain.com"
                 autocomplete="off"
                 spellcheck="false" />
        </div>

        <!-- Username -->
        <div class="field-group">
          <label for="smtp-user" class="field-label">
            Username
            <span class="field-unit">(optional)</span>
          </label>
          <input type="text" id="smtp-user" class="text-input"
                 placeholder="user@yourdomain.com"
                 autocomplete="off" />
        </div>

        <!-- Password -->
        <div class="field-group">
          <label for="smtp-password" class="field-label">
            Password
            <span class="field-unit">(optional)</span>
          </label>
          <div class="password-row">
            <input type="password" id="smtp-password" class="text-input"
                   placeholder="Leave blank to keep existing password"
                   autocomplete="new-password" />
            <button type="button" class="btn-icon btn-icon-inline"
                    id="toggle-password"
                    aria-label="Show or hide password"
                    title="Show / hide password">👁</button>
          </div>
          <p class="field-hint" id="smtp-password-hint"></p>
        </div>

        <!-- Test recipient -->
        <div class="field-group settings-divider-above">
          <label for="smtp-test-to" class="field-label">Send Test Email To</label>
          <div class="dir-input-row">
            <input type="text" id="smtp-test-to" class="text-input"
                   placeholder="you@example.com"
                   autocomplete="email"
                   spellcheck="false" />
            <button class="btn btn-secondary" id="smtp-test-btn">
              Send Test
            </button>
          </div>
          <p class="field-hint settings-test-result" id="smtp-test-result"
             aria-live="polite" aria-atomic="true"></p>
        </div>
      </div>
    </div>

    <div class="modal-footer">
      <button class="btn btn-secondary" id="settings-cancel">Cancel</button>
      <button class="btn btn-primary" id="settings-save">Save Settings</button>
    </div>

    <!-- Save status -->
    <p class="settings-save-status" id="settings-save-status"
       aria-live="polite" aria-atomic="true"></p>
  </div>
</div>

</main>

<footer class="app-footer" role="contentinfo">

@@ -255,6 +446,6 @@
<!-- Live region for screen reader announcements -->
<div id="sr-announce" class="sr-only" aria-live="assertive" aria-atomic="true"></div>

<script src="/static/js/app.js"></script>
<script type="module" src="/static/js/app.js"></script>
</body>
</html>
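The settings modal's intro text says SMTP settings are saved server-side in a local SQLite database, and the Dockerfile in this PR adds a `/data` volume with a `DB_PATH` default for exactly that. As a hedged illustration of what such key/value persistence could look like (the table name, column layout, and helper names are assumptions, not taken from the PR):

```python
import os
import sqlite3

# DB_PATH mirrors the Dockerfile's env var; the fallback filename is an assumption.
DB_PATH = os.environ.get("DB_PATH", "videopress.db")

_SCHEMA = "CREATE TABLE IF NOT EXISTS smtp_settings (key TEXT PRIMARY KEY, value TEXT)"


def save_smtp_settings(settings: dict) -> None:
    """Upsert SMTP settings as key/value rows (assumed schema, not the PR's)."""
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(_SCHEMA)
        conn.executemany(
            "INSERT INTO smtp_settings (key, value) VALUES (?, ?) "
            "ON CONFLICT(key) DO UPDATE SET value = excluded.value",
            settings.items(),
        )


def load_smtp_settings() -> dict:
    """Return all stored settings as a plain dict."""
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(_SCHEMA)
        return dict(conn.execute("SELECT key, value FROM smtp_settings"))
```

A per-key upsert maps naturally onto the modal's "leave blank to keep existing password" behaviour: omitting the password key from the dict leaves the stored value untouched.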

17
wsgi.py

@@ -1,18 +1,13 @@
"""
wsgi.py — Gunicorn entry point for VideoPress.

Start with:
    gunicorn -c gunicorn.conf.py wsgi:app
Start the production server with:
    gunicorn -c gunicorn.conf.py wsgi:application

Or directly:
    gunicorn \
        --worker-class geventwebsocket.gunicorn.workers.GeventWebSocketWorker \
        --workers 1 \
        --bind 0.0.0.0:8080 \
        wsgi:app
The variable must be named 'application' (or 'app') — Gunicorn looks for
a WSGI callable at this module level.
"""

from app import app  # noqa: F401 — 'app' is the Flask application object
from app import create_app

# Gunicorn imports this module and looks for a callable named 'app'.
# Nothing else is needed here.
application = create_app()
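The rewritten module swaps a ready-made `app` import for a call to an application factory. Since `app/__init__.py` is not part of this excerpt, the pattern is sketched framework-free below: the real `create_app` returns a configured Flask instance, but the Gunicorn-facing shape is the same, a module-level callable built at import time.

```python
def create_app():
    """Application-factory sketch (hypothetical stand-in for app.create_app).

    A real factory would create a Flask app and register config, routes,
    and extensions before returning it.
    """
    def wsgi_app(environ, start_response):
        # Minimal WSGI callable so the factory shape is runnable on its own.
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"ok"]

    return wsgi_app


# Gunicorn resolves the 'wsgi:application' URI to this module-level callable.
application = create_app()
```

The factory pattern lets tests build fresh, independently configured app instances, which is the usual motivation behind a `from app import create_app` change.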