Mirror of https://github.com/kyantech/Palmr.git (synced 2025-10-23 16:14:18 +00:00)

Compare commits — 10 commits: feat/remov... → v3.2.3-bet...

Commits (SHA1):

- 94e021d8c6
- 95ac0f195b
- d6c9b0d7d2
- 59f9e19ffb
- 6086d2a0ac
- f3aeaf66df
- 6b979a22fb
- e4bae380c9
- 331624e2f2
- ba512ebe95
README.md

@@ -6,6 +6,17 @@

**Palmr.** is a **flexible** and **open-source** alternative to file transfer services like **WeTransfer**, **SendGB**, **Send Anywhere**, and **Files.fm**.

<div align="center">
  <div style="background: linear-gradient(135deg, #ff4757, #ff3838); padding: 20px; border-radius: 12px; margin: 20px 0; box-shadow: 0 4px 15px rgba(255, 71, 87, 0.3); border: 2px solid #ff3838;">
    <h3 style="color: white; margin: 0 0 10px 0; font-size: 18px; font-weight: bold;">
      ⚠️ BETA VERSION
    </h3>
    <p style="color: white; margin: 0; font-size: 14px; opacity: 0.95;">
      <strong>This project is currently in beta phase.</strong><br>
      Not recommended for production environments.
    </p>
  </div>
</div>

🔗 **For detailed documentation visit:** [Palmr. - Documentation](https://palmr.kyantech.com.br)
@@ -165,6 +165,27 @@ cp .env.example .env

This creates a `.env` file with the necessary configurations for the frontend.

##### Upload Configuration

Palmr. supports configurable chunked uploading for large files. You can customize the chunk size by setting the following environment variable in your `.env` file:

```bash
NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB=100
```

**How it works:**

- If `NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB` is set, Palmr. uses this value (in megabytes) as the chunk size for every upload larger than this threshold.
- If not set or left empty, Palmr. automatically calculates an optimal chunk size from the file size:
  - Files ≤ 100MB: uploaded without chunking
  - Files > 100MB and ≤ 1GB: 75MB chunks
  - Files > 1GB: 150MB chunks

**When to configure:**

- **Default (not set):** Recommended for most use cases; Palmr. determines a sensible chunk size automatically.
- **Custom value:** Set this if you have specific network conditions or infrastructure constraints: slower connections may benefit from smaller chunks (e.g., 50MB), fast networks can handle larger ones (e.g., 200MB), and a proxy such as Cloudflare may cap the upload size per request, forcing a smaller chunk.
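The selection rules above can be sketched as a small helper. This is a hypothetical illustration (the function name and shape are not from Palmr's codebase), assuming sizes are expressed in MB:

```typescript
// Hypothetical sketch of the chunk-size rules described above;
// not Palmr's actual implementation.
function chunkSizeMB(fileSizeMB: number, envChunkMB?: number): number | null {
  // A configured NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB wins; files at or below
  // that threshold are sent in a single request (null = no chunking).
  if (envChunkMB !== undefined && envChunkMB > 0) {
    return fileSizeMB > envChunkMB ? envChunkMB : null;
  }
  if (fileSizeMB <= 100) return null; // <= 100MB: no chunking
  if (fileSizeMB <= 1024) return 75;  // <= 1GB: 75MB chunks
  return 150;                         // > 1GB: 150MB chunks
}
```

With the defaults, a 500MB file would be split into 75MB chunks; with `NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB=100` it would use 100MB chunks instead.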
#### Install dependencies

Install all the frontend dependencies:
@@ -76,6 +76,7 @@ Choose your storage method based on your needs:

      # - DOWNLOAD_MIN_FILE_SIZE_GB=3.0 # Minimum file size in GB to activate memory management (default: 3.0)
      # - DOWNLOAD_AUTO_SCALE=true # Enable auto-scaling based on system memory (default: true)
      # - NODE_OPTIONS=--expose-gc # Enable garbage collection for large downloads (recommended for production)
      # - NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB=100 # Chunk size in MB for large file uploads (OPTIONAL - auto-calculates if not set)
    volumes:
      - palmr_data:/app/server
@@ -151,32 +152,33 @@ Choose your storage method based on your needs:

Customize Palmr's behavior with these environment variables:

| Variable | Default | Description |
| --- | --- | --- |
| `ENABLE_S3` | `false` | Enable S3-compatible storage backends |
| `S3_ENDPOINT` | - | S3 server endpoint URL (required when using S3) |
| `S3_PORT` | - | S3 server port (optional when using S3) |
| `S3_USE_SSL` | - | Enable SSL for S3 connections (optional when using S3) |
| `S3_ACCESS_KEY` | - | S3 access key for authentication (required when using S3) |
| `S3_SECRET_KEY` | - | S3 secret key for authentication (required when using S3) |
| `S3_REGION` | - | S3 region configuration (optional when using S3) |
| `S3_BUCKET_NAME` | - | S3 bucket name for file storage (required when using S3) |
| `S3_FORCE_PATH_STYLE` | `false` | Force path-style S3 URLs (optional when using S3) |
| `S3_REJECT_UNAUTHORIZED` | `true` | Enable strict SSL certificate validation for S3 (set to `false` for self-signed certificates) |
| `ENCRYPTION_KEY` | - | **Required when encryption is enabled**: 32+ character key for file encryption |
| `DISABLE_FILESYSTEM_ENCRYPTION` | `true` | Disable file encryption for better performance (set to `false` to enable encryption) |
| `PRESIGNED_URL_EXPIRATION` | `3600` | Duration in seconds for presigned URL expiration (applies to both filesystem and S3 storage) |
| `CUSTOM_PATH` | - | Custom base path for disk space detection in manual installations with symlinks |
| `SECURE_SITE` | `false` | Enable secure cookies for HTTPS/reverse proxy deployments |
| `DEFAULT_LANGUAGE` | `en-US` | Default application language ([see available languages](/docs/3.2-beta/available-languages)) |
| `PALMR_UID` | `1000` | User ID for container processes (helps with file permissions) |
| `PALMR_GID` | `1000` | Group ID for container processes (helps with file permissions) |
| `NODE_OPTIONS` | - | Node.js options (recommended: `--expose-gc` for garbage collection in production) |
| `DOWNLOAD_MAX_CONCURRENT` | auto-scale | Maximum number of simultaneous downloads (see [Download Memory Management](/docs/3.2-beta/download-memory-management)) |
| `DOWNLOAD_MEMORY_THRESHOLD_MB` | auto-scale | Memory threshold in MB before throttling |
| `DOWNLOAD_QUEUE_SIZE` | auto-scale | Maximum queue size for pending downloads |
| `DOWNLOAD_MIN_FILE_SIZE_GB` | `3.0` | Minimum file size in GB to activate memory management |
| `NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB` | auto-calculate | Chunk size in MB for large file uploads (see [Chunked Upload Configuration](/docs/3.2-beta/quick-start#chunked-upload-configuration)) |
| `DOWNLOAD_AUTO_SCALE` | `true` | Enable auto-scaling based on system memory |
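The required/optional split the table documents for S3 can be sketched as a small check. This is a hypothetical helper for illustration, not Palmr's actual startup validation:

```typescript
// Hypothetical check of the variables marked "required when using S3"
// in the table above; not Palmr's actual validation logic.
function missingS3Vars(env: Record<string, string | undefined>): string[] {
  if (env.ENABLE_S3 !== "true") return []; // S3 disabled: nothing to check
  const required = ["S3_ENDPOINT", "S3_ACCESS_KEY", "S3_SECRET_KEY", "S3_BUCKET_NAME"];
  return required.filter((name) => !env[name]);
}
```

With `ENABLE_S3=true` and only `S3_ENDPOINT` set, this reports the missing access key, secret key, and bucket name; port, SSL, and region stay optional.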
<Callout type="info">
  **Performance First**: Palmr runs without encryption by default for optimal speed and lower resource usage—perfect for

@@ -314,6 +316,28 @@ environment:

**Note:** S3 storage handles encryption through your S3 provider's encryption features.

### Chunked Upload Configuration

Palmr supports configurable chunked uploading for large files. You can customize the chunk size by setting the following environment variable:

```yaml
environment:
  - NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB=100 # Chunk size in MB
```

**How it works:**

- If `NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB` is set, Palmr uses this value (in megabytes) as the chunk size for every upload larger than this threshold.
- If not set or left empty, Palmr automatically calculates an optimal chunk size from the file size:
  - Files ≤ 100MB: uploaded without chunking
  - Files > 100MB and ≤ 1GB: 75MB chunks
  - Files > 1GB: 150MB chunks

**When to configure:**

- **Default (not set):** Recommended for most use cases; Palmr determines a sensible chunk size automatically.
- **Custom value:** Set this if you have specific network conditions or infrastructure constraints: slower connections may benefit from smaller chunks (e.g., 50MB), fast networks can handle larger ones (e.g., 200MB), and a proxy such as Cloudflare may cap the upload size per request, forcing a smaller chunk.

---

## Maintenance
@@ -1,6 +1,6 @@
{
  "name": "palmr-docs",
  "version": "3.2.1-beta",
  "version": "3.2.3-beta",
  "description": "Docs for Palmr",
  "private": true,
  "author": "Daniel Luiz Alves <daniel@kyantech.com.br>",

@@ -62,4 +62,4 @@
  "tw-animate-css": "^1.2.8",
  "typescript": "^5.8.3"
  }
}
}
@@ -1,6 +1,6 @@
{
  "name": "palmr-api",
  "version": "3.2.1-beta",
  "version": "3.2.3-beta",
  "description": "API for Palmr",
  "private": true,
  "author": "Daniel Luiz Alves <daniel@kyantech.com.br>",

@@ -42,9 +42,7 @@
  "@fastify/swagger-ui": "^5.2.3",
  "@prisma/client": "^6.11.0",
  "@scalar/fastify-api-reference": "^1.32.1",
  "@types/archiver": "^6.0.3",
  "@types/crypto-js": "^4.2.2",
  "archiver": "^7.0.1",
  "bcryptjs": "^2.4.3",
  "crypto-js": "^4.2.0",
  "fastify": "^5.4.0",

@@ -79,4 +77,4 @@
  "tsx": "^4.19.2",
  "typescript": "^5.7.3"
  }
}
}
apps/server/pnpm-lock.yaml (generated)

@@ -41,15 +41,9 @@ importers:
  '@scalar/fastify-api-reference':
    specifier: ^1.32.1
    version: 1.32.1
  '@types/archiver':
    specifier: ^6.0.3
    version: 6.0.3
  '@types/crypto-js':
    specifier: ^4.2.2
    version: 4.2.2
  archiver:
    specifier: ^7.0.1
    version: 7.0.1
  bcryptjs:
    specifier: ^2.4.3
    version: 2.4.3
|
||||
@@ -3843,16 +3638,10 @@ snapshots:
|
||||
'@fastify/error': 4.2.0
|
||||
fastq: 1.19.1
|
||||
|
||||
b4a@1.7.2: {}
|
||||
|
||||
balanced-match@1.0.2: {}
|
||||
|
||||
bare-events@2.7.0: {}
|
||||
|
||||
base32.js@0.0.1: {}
|
||||
|
||||
base64-js@1.5.1: {}
|
||||
|
||||
bcryptjs@2.4.3: {}
|
||||
|
||||
bn.js@4.12.2: {}
|
||||
@@ -3872,13 +3661,6 @@ snapshots:
|
||||
dependencies:
|
||||
fill-range: 7.1.1
|
||||
|
||||
buffer-crc32@1.0.0: {}
|
||||
|
||||
buffer@6.0.3:
|
||||
dependencies:
|
||||
base64-js: 1.5.1
|
||||
ieee754: 1.2.1
|
||||
|
||||
callsites@3.1.0: {}
|
||||
|
||||
camelcase@5.3.1: {}
|
||||
@@ -3910,14 +3692,6 @@ snapshots:
|
||||
color-convert: 2.0.1
|
||||
color-string: 1.9.1
|
||||
|
||||
compress-commons@6.0.2:
|
||||
dependencies:
|
||||
crc-32: 1.2.2
|
||||
crc32-stream: 6.0.0
|
||||
is-stream: 2.0.1
|
||||
normalize-path: 3.0.0
|
||||
readable-stream: 4.7.0
|
||||
|
||||
concat-map@0.0.1: {}
|
||||
|
||||
content-disposition@0.5.4:
|
||||
@@ -3926,15 +3700,6 @@ snapshots:
|
||||
|
||||
cookie@1.0.2: {}
|
||||
|
||||
core-util-is@1.0.3: {}
|
||||
|
||||
crc-32@1.2.2: {}
|
||||
|
||||
crc32-stream@6.0.0:
|
||||
dependencies:
|
||||
crc-32: 1.2.2
|
||||
readable-stream: 4.7.0
|
||||
|
||||
create-require@1.1.1: {}
|
||||
|
||||
cross-spawn@7.0.6:
|
||||
@@ -4089,22 +3854,12 @@ snapshots:
|
||||
|
||||
esutils@2.0.3: {}
|
||||
|
||||
event-target-shim@5.0.1: {}
|
||||
|
||||
events-universal@1.0.1:
|
||||
dependencies:
|
||||
bare-events: 2.7.0
|
||||
|
||||
events@3.3.0: {}
|
||||
|
||||
fast-decode-uri-component@1.0.1: {}
|
||||
|
||||
fast-deep-equal@3.1.3: {}
|
||||
|
||||
fast-diff@1.3.0: {}
|
||||
|
||||
fast-fifo@1.3.2: {}
|
||||
|
||||
fast-glob@3.3.3:
|
||||
dependencies:
|
||||
'@nodelib/fs.stat': 2.0.5
|
||||
@@ -4256,15 +4011,6 @@ snapshots:
|
||||
dependencies:
|
||||
is-glob: 4.0.3
|
||||
|
||||
glob@10.4.5:
|
||||
dependencies:
|
||||
foreground-child: 3.3.1
|
||||
jackspeak: 3.4.3
|
||||
minimatch: 9.0.5
|
||||
minipass: 7.1.2
|
||||
package-json-from-dist: 1.0.1
|
||||
path-scurry: 1.11.1
|
||||
|
||||
glob@11.0.3:
|
||||
dependencies:
|
||||
foreground-child: 3.3.1
|
||||
@@ -4276,8 +4022,6 @@ snapshots:
|
||||
|
||||
globals@14.0.0: {}
|
||||
|
||||
graceful-fs@4.2.11: {}
|
||||
|
||||
graphemer@1.4.0: {}
|
||||
|
||||
has-flag@4.0.0: {}
|
||||
@@ -4290,8 +4034,6 @@ snapshots:
|
||||
statuses: 2.0.1
|
||||
toidentifier: 1.0.1
|
||||
|
||||
ieee754@1.2.1: {}
|
||||
|
||||
ignore@5.3.2: {}
|
||||
|
||||
ignore@7.0.5: {}
|
||||
@@ -4319,18 +4061,8 @@ snapshots:
|
||||
|
||||
is-number@7.0.0: {}
|
||||
|
||||
is-stream@2.0.1: {}
|
||||
|
||||
isarray@1.0.0: {}
|
||||
|
||||
isexe@2.0.0: {}
|
||||
|
||||
jackspeak@3.4.3:
|
||||
dependencies:
|
||||
'@isaacs/cliui': 8.0.2
|
||||
optionalDependencies:
|
||||
'@pkgjs/parseargs': 0.11.0
|
||||
|
||||
jackspeak@4.1.1:
|
||||
dependencies:
|
||||
'@isaacs/cliui': 8.0.2
|
||||
@@ -4375,10 +4107,6 @@ snapshots:
|
||||
dependencies:
|
||||
json-buffer: 3.0.1
|
||||
|
||||
lazystream@1.0.1:
|
||||
dependencies:
|
||||
readable-stream: 2.3.8
|
||||
|
||||
leven@4.0.0: {}
|
||||
|
||||
levn@0.4.1:
|
||||
@@ -4402,10 +4130,6 @@ snapshots:
|
||||
|
||||
lodash.merge@4.6.2: {}
|
||||
|
||||
lodash@4.17.21: {}
|
||||
|
||||
lru-cache@10.4.3: {}
|
||||
|
||||
lru-cache@11.1.0: {}
|
||||
|
||||
make-error@1.3.6: {}
|
||||
@@ -4429,10 +4153,6 @@ snapshots:
|
||||
dependencies:
|
||||
brace-expansion: 1.1.12
|
||||
|
||||
minimatch@5.1.6:
|
||||
dependencies:
|
||||
brace-expansion: 2.0.2
|
||||
|
||||
minimatch@9.0.5:
|
||||
dependencies:
|
||||
brace-expansion: 2.0.2
|
||||
@@ -4463,8 +4183,6 @@ snapshots:
|
||||
|
||||
nodemailer@6.10.1: {}
|
||||
|
||||
normalize-path@3.0.0: {}
|
||||
|
||||
oauth4webapi@3.5.5: {}
|
||||
|
||||
obliterator@2.0.5: {}
|
||||
@@ -4515,11 +4233,6 @@ snapshots:
|
||||
|
||||
path-key@3.1.1: {}
|
||||
|
||||
path-scurry@1.11.1:
|
||||
dependencies:
|
||||
lru-cache: 10.4.3
|
||||
minipass: 7.1.2
|
||||
|
||||
path-scurry@2.0.0:
|
||||
dependencies:
|
||||
lru-cache: 11.1.0
|
||||
@@ -4570,14 +4283,10 @@ snapshots:
|
||||
optionalDependencies:
|
||||
typescript: 5.8.3
|
||||
|
||||
process-nextick-args@2.0.1: {}
|
||||
|
||||
process-warning@4.0.1: {}
|
||||
|
||||
process-warning@5.0.0: {}
|
||||
|
||||
process@0.11.10: {}
|
||||
|
||||
punycode@2.3.1: {}
|
||||
|
||||
qrcode@1.5.4:
|
||||
@@ -4590,28 +4299,6 @@ snapshots:
|
||||
|
||||
quick-format-unescaped@4.0.4: {}
|
||||
|
||||
readable-stream@2.3.8:
|
||||
dependencies:
|
||||
core-util-is: 1.0.3
|
||||
inherits: 2.0.4
|
||||
isarray: 1.0.0
|
||||
process-nextick-args: 2.0.1
|
||||
safe-buffer: 5.1.2
|
||||
string_decoder: 1.1.1
|
||||
util-deprecate: 1.0.2
|
||||
|
||||
readable-stream@4.7.0:
|
||||
dependencies:
|
||||
abort-controller: 3.0.0
|
||||
buffer: 6.0.3
|
||||
events: 3.3.0
|
||||
process: 0.11.10
|
||||
string_decoder: 1.3.0
|
||||
|
||||
readdir-glob@1.1.3:
|
||||
dependencies:
|
||||
minimatch: 5.1.6
|
||||
|
||||
real-require@0.2.0: {}
|
||||
|
||||
require-directory@2.1.1: {}
|
||||
@@ -4634,8 +4321,6 @@ snapshots:
|
||||
dependencies:
|
||||
queue-microtask: 1.2.3
|
||||
|
||||
safe-buffer@5.1.2: {}
|
||||
|
||||
safe-buffer@5.2.1: {}
|
||||
|
||||
safe-regex2@5.0.0:
|
||||
@@ -4718,14 +4403,6 @@ snapshots:
|
||||
fastseries: 1.7.2
|
||||
reusify: 1.1.0
|
||||
|
||||
streamx@2.23.0:
|
||||
dependencies:
|
||||
events-universal: 1.0.1
|
||||
fast-fifo: 1.3.2
|
||||
text-decoder: 1.2.3
|
||||
transitivePeerDependencies:
|
||||
- react-native-b4a
|
||||
|
||||
string-width@4.2.3:
|
||||
dependencies:
|
||||
emoji-regex: 8.0.0
|
||||
@@ -4738,14 +4415,6 @@ snapshots:
|
||||
emoji-regex: 9.2.2
|
||||
strip-ansi: 7.1.0
|
||||
|
||||
string_decoder@1.1.1:
|
||||
dependencies:
|
||||
safe-buffer: 5.1.2
|
||||
|
||||
string_decoder@1.3.0:
|
||||
dependencies:
|
||||
safe-buffer: 5.2.1
|
||||
|
||||
strip-ansi@6.0.1:
|
||||
dependencies:
|
||||
ansi-regex: 5.0.1
|
||||
@@ -4766,20 +4435,6 @@ snapshots:
|
||||
dependencies:
|
||||
'@pkgr/core': 0.2.7
|
||||
|
||||
tar-stream@3.1.7:
|
||||
dependencies:
|
||||
b4a: 1.7.2
|
||||
fast-fifo: 1.3.2
|
||||
streamx: 2.23.0
|
||||
transitivePeerDependencies:
|
||||
- react-native-b4a
|
||||
|
||||
text-decoder@1.2.3:
|
||||
dependencies:
|
||||
b4a: 1.7.2
|
||||
transitivePeerDependencies:
|
||||
- react-native-b4a
|
||||
|
||||
thread-stream@3.1.0:
|
||||
dependencies:
|
||||
real-require: 0.2.0
|
||||
@@ -4835,8 +4490,6 @@ snapshots:
|
||||
dependencies:
|
||||
punycode: 2.3.1
|
||||
|
||||
util-deprecate@1.0.2: {}
|
||||
|
||||
uuid@9.0.1: {}
|
||||
|
||||
v8-compile-cache-lib@3.0.1: {}
|
||||
@@ -4898,12 +4551,6 @@ snapshots:
|
||||
|
||||
yocto-queue@0.1.0: {}
|
||||
|
||||
zip-stream@6.0.1:
|
||||
dependencies:
|
||||
archiver-utils: 5.0.2
|
||||
compress-commons: 6.0.2
|
||||
readable-stream: 4.7.0
|
||||
|
||||
zod-to-json-schema@3.24.6(zod@3.25.74):
|
||||
dependencies:
|
||||
zod: 3.25.74
|
||||
|
@@ -23,18 +23,21 @@ if (storageConfig.useSSL && env.S3_REJECT_UNAUTHORIZED === "false") {
   }
 }

-export const s3Client = new S3Client({
-  endpoint: storageConfig.useSSL
-    ? `https://${storageConfig.endpoint}${storageConfig.port ? `:${storageConfig.port}` : ""}`
-    : `http://${storageConfig.endpoint}${storageConfig.port ? `:${storageConfig.port}` : ""}`,
-  region: storageConfig.region,
-  credentials: {
-    accessKeyId: storageConfig.accessKey,
-    secretAccessKey: storageConfig.secretKey,
-  },
-  forcePathStyle: storageConfig.forcePathStyle,
-});
+export const s3Client =
+  env.ENABLE_S3 === "true"
+    ? new S3Client({
+        endpoint: storageConfig.useSSL
+          ? `https://${storageConfig.endpoint}${storageConfig.port ? `:${storageConfig.port}` : ""}`
+          : `http://${storageConfig.endpoint}${storageConfig.port ? `:${storageConfig.port}` : ""}`,
+        region: storageConfig.region,
+        credentials: {
+          accessKeyId: storageConfig.accessKey,
+          secretAccessKey: storageConfig.secretKey,
+        },
+        forcePathStyle: storageConfig.forcePathStyle,
+      })
+    : null;

 export const bucketName = storageConfig.bucketName;

-export const isS3Enabled = true;
+export const isS3Enabled = env.ENABLE_S3 === "true";
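This change makes the exported client nullable when S3 is disabled, so every call site now has to guard before using it. A minimal self-contained sketch of the same toggle pattern (the `Client` type and `makeClient` helper here are hypothetical stand-ins for `S3Client` and the config module, not the project's actual API):

```typescript
// Hypothetical stand-ins, illustrating the ENABLE_S3 toggle:
// the exported value is null when S3 is off.
type Client = { endpoint: string };

function makeClient(enableS3: string | undefined, endpoint: string): Client | null {
  return enableS3 === "true" ? { endpoint } : null;
}

const client = makeClient("false", "http://localhost:9000"); // ENABLE_S3 unset or "false"
const isS3Enabled = client !== null;

// Call sites must narrow the null away before dereferencing:
if (isS3Enabled && client) {
  console.log(`S3 mode: ${client.endpoint}`);
} else {
  console.log("filesystem mode"); // prints this branch
}
```

The design choice is that misconfiguration fails loudly at the call site (a null dereference caught by the type checker) rather than constructing an S3 client with empty credentials.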
@@ -1,18 +1,38 @@
 import { z } from "zod";

 const envSchema = z.object({
-  S3_ENDPOINT: z.string().min(1, "S3_ENDPOINT is required"),
+  ENABLE_S3: z.union([z.literal("true"), z.literal("false")]).default("false"),
+  ENCRYPTION_KEY: z.string().optional(),
+  DISABLE_FILESYSTEM_ENCRYPTION: z.union([z.literal("true"), z.literal("false")]).default("true"),
+  S3_ENDPOINT: z.string().optional(),
   S3_PORT: z.string().optional(),
   S3_USE_SSL: z.string().optional(),
-  S3_ACCESS_KEY: z.string().min(1, "S3_ACCESS_KEY is required"),
-  S3_SECRET_KEY: z.string().min(1, "S3_SECRET_KEY is required"),
-  S3_REGION: z.string().min(1, "S3_REGION is required"),
-  S3_BUCKET_NAME: z.string().min(1, "S3_BUCKET_NAME is required"),
+  S3_ACCESS_KEY: z.string().optional(),
+  S3_SECRET_KEY: z.string().optional(),
+  S3_REGION: z.string().optional(),
+  S3_BUCKET_NAME: z.string().optional(),
   S3_FORCE_PATH_STYLE: z.union([z.literal("true"), z.literal("false")]).default("false"),
   S3_REJECT_UNAUTHORIZED: z.union([z.literal("true"), z.literal("false")]).default("true"),
   PRESIGNED_URL_EXPIRATION: z.string().optional().default("3600"),
   SECURE_SITE: z.union([z.literal("true"), z.literal("false")]).default("false"),
+  DATABASE_URL: z.string().optional().default("file:/app/server/prisma/palmr.db"),
+  DOWNLOAD_MAX_CONCURRENT: z
+    .string()
+    .optional()
+    .transform((val) => (val ? parseInt(val, 10) : undefined)),
+  DOWNLOAD_MEMORY_THRESHOLD_MB: z
+    .string()
+    .optional()
+    .transform((val) => (val ? parseInt(val, 10) : undefined)),
+  DOWNLOAD_QUEUE_SIZE: z
+    .string()
+    .optional()
+    .transform((val) => (val ? parseInt(val, 10) : undefined)),
+  DOWNLOAD_AUTO_SCALE: z.union([z.literal("true"), z.literal("false")]).default("true"),
+  DOWNLOAD_MIN_FILE_SIZE_GB: z
+    .string()
+    .optional()
+    .transform((val) => (val ? parseFloat(val) : undefined)),
+  CUSTOM_PATH: z.string().optional(),
 });
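The new `DOWNLOAD_*` entries all share one rule, expressed by the zod `.transform(...)` calls: an unset or empty string variable becomes `undefined`, anything else is parsed as a number. A dependency-free sketch of that same transform:

```typescript
// Same semantics as the zod transforms above, without the zod dependency:
// unset/empty env vars yield undefined, anything else parses as a number.
function numericEnv(val: string | undefined): number | undefined {
  return val ? parseInt(val, 10) : undefined;
}

function floatEnv(val: string | undefined): number | undefined {
  return val ? parseFloat(val) : undefined;
}

console.log(numericEnv("4")); // 4
console.log(numericEnv(undefined)); // undefined
console.log(floatEnv("1.5")); // 1.5
```

Note the booleans are modeled as the string literals `"true"`/`"false"` rather than `z.boolean()`, since environment variables are always strings.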
@@ -1,263 +0,0 @@
import { GetObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";
import archiver from "archiver";
import { FastifyReply, FastifyRequest } from "fastify";

import { bucketName, s3Client } from "../../config/storage.config";
import { prisma } from "../../shared/prisma";
import { ReverseShareService } from "../reverse-share/service";

export class BulkDownloadController {
  private reverseShareService = new ReverseShareService();

  async downloadFiles(request: FastifyRequest, reply: FastifyReply) {
    try {
      const { fileIds, folderIds, zipName } = request.body as {
        fileIds: string[];
        folderIds: string[];
        zipName: string;
      };

      if (!fileIds.length && !folderIds.length) {
        return reply.status(400).send({ error: "No files or folders to download" });
      }

      // Get files from database
      const files = await prisma.file.findMany({
        where: {
          id: { in: fileIds },
        },
        select: {
          id: true,
          name: true,
          objectName: true,
          size: true,
        },
      });

      // Get folders and their files
      const folders = await prisma.folder.findMany({
        where: {
          id: { in: folderIds },
        },
        include: {
          files: {
            select: {
              id: true,
              name: true,
              objectName: true,
              size: true,
            },
          },
        },
      });

      // Create ZIP stream
      const archive = archiver("zip", {
        zlib: { level: 9 },
      });

      reply.raw.setHeader("Content-Type", "application/zip");
      reply.raw.setHeader("Content-Disposition", `attachment; filename="${zipName}"`);
      reply.raw.setHeader("Transfer-Encoding", "chunked");

      archive.pipe(reply.raw);

      // Add files to ZIP
      for (const file of files) {
        try {
          const downloadUrl = await getSignedUrl(
            s3Client,
            new GetObjectCommand({
              Bucket: bucketName,
              Key: file.objectName,
            }),
            { expiresIn: 300 } // 5 minutes
          );

          const response = await fetch(downloadUrl);
          if (response.ok) {
            const buffer = await response.arrayBuffer();
            archive.append(Buffer.from(buffer), { name: file.name });
          }
        } catch (error) {
          console.error(`Error downloading file ${file.name}:`, error);
        }
      }

      // Add folder files to ZIP
      for (const folder of folders) {
        for (const file of folder.files) {
          try {
            const downloadUrl = await getSignedUrl(
              s3Client,
              new GetObjectCommand({
                Bucket: bucketName,
                Key: file.objectName,
              }),
              { expiresIn: 300 }
            );

            const response = await fetch(downloadUrl);
            if (response.ok) {
              const buffer = await response.arrayBuffer();
              archive.append(Buffer.from(buffer), {
                name: `${folder.name}/${file.name}`,
              });
            }
          } catch (error) {
            console.error(`Error downloading file ${file.name}:`, error);
          }
        }
      }

      await archive.finalize();
    } catch (error) {
      console.error("Bulk download error:", error);
      return reply.status(500).send({ error: "Internal server error" });
    }
  }

  async downloadFolder(request: FastifyRequest, reply: FastifyReply) {
    try {
      const { folderId, folderName } = request.params as {
        folderId: string;
        folderName: string;
      };

      // Get folder and all its files recursively
      const folder = await prisma.folder.findUnique({
        where: { id: folderId },
        include: {
          files: {
            select: {
              id: true,
              name: true,
              objectName: true,
              size: true,
            },
          },
        },
      });

      if (!folder) {
        return reply.status(404).send({ error: "Folder not found" });
      }

      // Create ZIP stream
      const archive = archiver("zip", {
        zlib: { level: 9 },
      });

      reply.raw.setHeader("Content-Type", "application/zip");
      reply.raw.setHeader("Content-Disposition", `attachment; filename="${folderName}.zip"`);
      reply.raw.setHeader("Transfer-Encoding", "chunked");

      archive.pipe(reply.raw);

      // Add all files to ZIP
      for (const file of folder.files) {
        try {
          const downloadUrl = await getSignedUrl(
            s3Client,
            new GetObjectCommand({
              Bucket: bucketName,
              Key: file.objectName,
            }),
            { expiresIn: 300 }
          );

          const response = await fetch(downloadUrl);
          if (response.ok) {
            const buffer = await response.arrayBuffer();
            archive.append(Buffer.from(buffer), { name: file.name });
          }
        } catch (error) {
          console.error(`Error downloading file ${file.name}:`, error);
        }
      }

      await archive.finalize();
    } catch (error) {
      console.error("Folder download error:", error);
      return reply.status(500).send({ error: "Internal server error" });
    }
  }

  async downloadReverseShareFiles(request: FastifyRequest, reply: FastifyReply) {
    try {
      await request.jwtVerify();
      const userId = (request as any).user?.userId;
      if (!userId) {
        return reply.status(401).send({ error: "Unauthorized: a valid token is required to access this resource." });
      }

      const { fileIds, zipName } = request.body as {
        fileIds: string[];
        zipName: string;
      };

      if (!fileIds.length) {
        return reply.status(400).send({ error: "No files to download" });
      }

      // Get reverse share files from database
      const files = await prisma.reverseShareFile.findMany({
        where: {
          id: { in: fileIds },
          reverseShare: {
            creatorId: userId, // Only allow creator to download
          },
        },
        select: {
          id: true,
          name: true,
          objectName: true,
          size: true,
        },
      });

      if (files.length === 0) {
        return reply.status(404).send({ error: "No files found or unauthorized" });
      }

      // Create ZIP stream
      const archive = archiver("zip", {
        zlib: { level: 9 },
      });

      reply.raw.setHeader("Content-Type", "application/zip");
      reply.raw.setHeader("Content-Disposition", `attachment; filename="${zipName}"`);
      reply.raw.setHeader("Transfer-Encoding", "chunked");

      archive.pipe(reply.raw);

      // Add files to ZIP
      for (const file of files) {
        try {
          const downloadUrl = await getSignedUrl(
            s3Client,
            new GetObjectCommand({
              Bucket: bucketName,
              Key: file.objectName,
            }),
            { expiresIn: 300 } // 5 minutes
          );

          const response = await fetch(downloadUrl);
          if (response.ok) {
            const buffer = await response.arrayBuffer();
            archive.append(Buffer.from(buffer), { name: file.name });
          }
        } catch (error) {
          console.error(`Error downloading reverse share file ${file.name}:`, error);
        }
      }

      await archive.finalize();
    } catch (error) {
      console.error("Reverse share bulk download error:", error);
      return reply.status(500).send({ error: "Internal server error" });
    }
  }
}
@@ -1,88 +0,0 @@
import { FastifyInstance, FastifyReply, FastifyRequest } from "fastify";
import { z } from "zod";

import { BulkDownloadController } from "./controller";

export async function bulkDownloadRoutes(app: FastifyInstance) {
  const bulkDownloadController = new BulkDownloadController();

  const preValidation = async (request: FastifyRequest, reply: FastifyReply) => {
    try {
      await request.jwtVerify();
    } catch (err) {
      console.error(err);
      reply.status(401).send({ error: "Token inválido ou ausente." });
    }
  };

  app.post(
    "/bulk-download",
    {
      schema: {
        tags: ["Bulk Download"],
        operationId: "bulkDownloadFiles",
        summary: "Download multiple files as ZIP",
        description: "Downloads multiple files and folders as a ZIP archive",
        body: z.object({
          fileIds: z.array(z.string()).describe("Array of file IDs to download"),
          folderIds: z.array(z.string()).describe("Array of folder IDs to download"),
          zipName: z.string().describe("Name of the ZIP file"),
        }),
        response: {
          200: z.string().describe("ZIP file stream"),
          400: z.object({ error: z.string().describe("Error message") }),
          500: z.object({ error: z.string().describe("Error message") }),
        },
      },
    },
    bulkDownloadController.downloadFiles.bind(bulkDownloadController)
  );

  app.get(
    "/bulk-download/folder/:folderId/:folderName",
    {
      schema: {
        tags: ["Bulk Download"],
        operationId: "downloadFolder",
        summary: "Download folder as ZIP",
        description: "Downloads a folder and all its files as a ZIP archive",
        params: z.object({
          folderId: z.string().describe("Folder ID"),
          folderName: z.string().describe("Folder name"),
        }),
        response: {
          200: z.string().describe("ZIP file stream"),
          404: z.object({ error: z.string().describe("Error message") }),
          500: z.object({ error: z.string().describe("Error message") }),
        },
      },
    },
    bulkDownloadController.downloadFolder.bind(bulkDownloadController)
  );

  app.post(
    "/bulk-download/reverse-share",
    {
      preValidation,
      schema: {
        tags: ["Bulk Download"],
        operationId: "bulkDownloadReverseShareFiles",
        summary: "Download multiple reverse share files as ZIP",
        description:
          "Downloads multiple reverse share files as a ZIP archive. Only the creator of the reverse share can download files.",
        body: z.object({
          fileIds: z.array(z.string()).describe("Array of reverse share file IDs to download"),
          zipName: z.string().describe("Name of the ZIP file"),
        }),
        response: {
          200: z.string().describe("ZIP file stream"),
          400: z.object({ error: z.string().describe("Error message") }),
          401: z.object({ error: z.string().describe("Unauthorized") }),
          404: z.object({ error: z.string().describe("No files found or unauthorized") }),
          500: z.object({ error: z.string().describe("Error message") }),
        },
      },
    },
    bulkDownloadController.downloadReverseShareFiles.bind(bulkDownloadController)
  );
}
@@ -1,3 +1,5 @@
+import { isS3Enabled } from "../../config/storage.config";
+import { FilesystemStorageProvider } from "../../providers/filesystem-storage.provider";
 import { S3StorageProvider } from "../../providers/s3-storage.provider";
 import { StorageProvider } from "../../types/storage";

@@ -5,7 +7,11 @@ export class FileService {
   private storageProvider: StorageProvider;

   constructor() {
-    this.storageProvider = new S3StorageProvider();
+    if (isS3Enabled) {
+      this.storageProvider = new S3StorageProvider();
+    } else {
+      this.storageProvider = FilesystemStorageProvider.getInstance();
+    }
   }

   async getPresignedPutUrl(objectName: string, expires: number): Promise<string> {

@@ -34,4 +40,8 @@ export class FileService {
       throw err;
     }
   }
+
+  isFilesystemMode(): boolean {
+    return !isS3Enabled;
+  }
 }
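The constructor change above is a small strategy switch: one `StorageProvider` interface, two concrete providers, chosen once at startup from the `isS3Enabled` flag. A self-contained sketch of that selection pattern (the names here are illustrative, not the project's actual provider signatures):

```typescript
// Illustrative strategy selection mirroring the FileService constructor above.
interface StorageProvider {
  name(): string;
}

class S3Provider implements StorageProvider {
  name() {
    return "s3";
  }
}

class FsProvider implements StorageProvider {
  name() {
    return "filesystem";
  }
}

function selectProvider(isS3Enabled: boolean): StorageProvider {
  // Chosen once; the rest of the service only sees the interface.
  return isS3Enabled ? new S3Provider() : new FsProvider();
}

console.log(selectProvider(false).name()); // filesystem
```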
apps/server/src/modules/filesystem/chunk-manager.ts (new file)
@@ -0,0 +1,345 @@
import * as fs from "fs";
import * as path from "path";

import { getTempFilePath } from "../../config/directories.config";
import { FilesystemStorageProvider } from "../../providers/filesystem-storage.provider";

export interface ChunkMetadata {
  fileId: string;
  chunkIndex: number;
  totalChunks: number;
  chunkSize: number;
  totalSize: number;
  fileName: string;
  isLastChunk: boolean;
}

export interface ChunkInfo {
  fileId: string;
  fileName: string;
  totalSize: number;
  totalChunks: number;
  uploadedChunks: Set<number>;
  tempPath: string;
  createdAt: number;
}
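Given the `ChunkMetadata` shape above, an uploader has to derive `totalChunks` and `isLastChunk` from the file size and the configured chunk size. A sketch of that arithmetic (the `chunkMetadata` helper is hypothetical; only the interface fields match the declaration above):

```typescript
interface ChunkMetadata {
  fileId: string;
  chunkIndex: number;
  totalChunks: number;
  chunkSize: number;
  totalSize: number;
  fileName: string;
  isLastChunk: boolean;
}

// Hypothetical helper: build the metadata for chunk `index` of a file.
function chunkMetadata(
  fileId: string,
  fileName: string,
  totalSize: number,
  chunkSize: number,
  index: number
): ChunkMetadata {
  // Even a zero-byte file is sent as one (empty) chunk.
  const totalChunks = Math.max(1, Math.ceil(totalSize / chunkSize));
  return {
    fileId,
    fileName,
    totalSize,
    chunkSize,
    chunkIndex: index,
    totalChunks,
    isLastChunk: index === totalChunks - 1,
  };
}

// e.g. a 250 MB file with 100 MB chunks needs 3 chunks; chunk index 2 is the last.
const m = chunkMetadata("id-1", "video.mp4", 250 * 1024 * 1024, 100 * 1024 * 1024, 2);
console.log(m.totalChunks, m.isLastChunk); // 3 true
```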
export class ChunkManager {
  private static instance: ChunkManager;
  private activeUploads = new Map<string, ChunkInfo>();
  private finalizingUploads = new Set<string>(); // Track uploads currently being finalized
  private cleanupInterval: NodeJS.Timeout;

  private constructor() {
    // Cleanup expired uploads every 30 minutes
    this.cleanupInterval = setInterval(
      () => {
        this.cleanupExpiredUploads();
      },
      30 * 60 * 1000
    );
  }

  public static getInstance(): ChunkManager {
    if (!ChunkManager.instance) {
      ChunkManager.instance = new ChunkManager();
    }
    return ChunkManager.instance;
  }

  /**
   * Process a chunk upload with streaming
   */
  async processChunk(
    metadata: ChunkMetadata,
    inputStream: NodeJS.ReadableStream,
    originalObjectName: string
  ): Promise<{ isComplete: boolean; finalPath?: string }> {
    const startTime = Date.now();
    const { fileId, chunkIndex, totalChunks, fileName, totalSize, isLastChunk } = metadata;

    console.log(`Processing chunk ${chunkIndex + 1}/${totalChunks} for file ${fileName} (${fileId})`);

    let chunkInfo = this.activeUploads.get(fileId);
    if (!chunkInfo) {
      if (chunkIndex !== 0) {
        throw new Error("First chunk must be chunk 0");
      }

      const tempPath = getTempFilePath(fileId);
      chunkInfo = {
        fileId,
        fileName,
        totalSize,
        totalChunks,
        uploadedChunks: new Set(),
        tempPath,
        createdAt: Date.now(),
      };
      this.activeUploads.set(fileId, chunkInfo);
      console.log(`Created new upload session for ${fileName} at ${tempPath}`);
    }

    console.log(
      `Validating chunk ${chunkIndex} (total: ${totalChunks}, uploaded: ${Array.from(chunkInfo.uploadedChunks).join(",")})`
    );

    if (chunkIndex < 0 || chunkIndex >= totalChunks) {
      throw new Error(`Invalid chunk index: ${chunkIndex} (must be 0-${totalChunks - 1})`);
    }

    if (chunkInfo.uploadedChunks.has(chunkIndex)) {
      console.log(`Chunk ${chunkIndex} already uploaded, treating as success`);

      if (isLastChunk && chunkInfo.uploadedChunks.size === totalChunks) {
        if (this.finalizingUploads.has(fileId)) {
          console.log(`Upload ${fileId} is already being finalized, waiting...`);
          return { isComplete: false };
        }

        console.log(`All chunks uploaded, finalizing ${fileName}`);
        return await this.finalizeUpload(chunkInfo, metadata, originalObjectName);
      }

      return { isComplete: false };
    }

    const tempDir = path.dirname(chunkInfo.tempPath);
    await fs.promises.mkdir(tempDir, { recursive: true });
    console.log(`Temp directory ensured: ${tempDir}`);

    await this.writeChunkToFile(chunkInfo.tempPath, inputStream, chunkIndex === 0);

    chunkInfo.uploadedChunks.add(chunkIndex);

    try {
      const stats = await fs.promises.stat(chunkInfo.tempPath);
      const processingTime = Date.now() - startTime;
      console.log(
        `Chunk ${chunkIndex + 1}/${totalChunks} uploaded successfully in ${processingTime}ms. Temp file size: ${stats.size} bytes`
      );
    } catch (error) {
      console.warn(`Could not get temp file stats:`, error);
    }

    console.log(
      `Checking completion: isLastChunk=${isLastChunk}, uploadedChunks.size=${chunkInfo.uploadedChunks.size}, totalChunks=${totalChunks}`
    );

    if (isLastChunk && chunkInfo.uploadedChunks.size === totalChunks) {
      if (this.finalizingUploads.has(fileId)) {
        console.log(`Upload ${fileId} is already being finalized, waiting...`);
        return { isComplete: false };
      }

      console.log(`All chunks uploaded, finalizing ${fileName}`);

      const uploadedChunksArray = Array.from(chunkInfo.uploadedChunks).sort((a, b) => a - b);
      console.log(`Uploaded chunks in order: ${uploadedChunksArray.join(", ")}`);

      const expectedChunks = Array.from({ length: totalChunks }, (_, i) => i);
      const missingChunks = expectedChunks.filter((chunk) => !chunkInfo.uploadedChunks.has(chunk));

      if (missingChunks.length > 0) {
        throw new Error(`Missing chunks: ${missingChunks.join(", ")}`);
      }

      return await this.finalizeUpload(chunkInfo, metadata, originalObjectName);
    } else {
      console.log(
        `Not ready for finalization: isLastChunk=${isLastChunk}, uploadedChunks.size=${chunkInfo.uploadedChunks.size}, totalChunks=${totalChunks}`
      );
    }

    return { isComplete: false };
  }

  /**
   * Write chunk to file using streaming
   */
  private async writeChunkToFile(
    filePath: string,
    inputStream: NodeJS.ReadableStream,
    isFirstChunk: boolean
  ): Promise<void> {
    return new Promise((resolve, reject) => {
      console.log(`Writing chunk to ${filePath} (first: ${isFirstChunk})`);

      if (isFirstChunk) {
        const writeStream = fs.createWriteStream(filePath, {
          highWaterMark: 64 * 1024 * 1024, // 64MB buffer for better performance
        });
        writeStream.on("error", (error) => {
          console.error("Write stream error:", error);
          reject(error);
        });
        writeStream.on("finish", () => {
          console.log("Write stream finished successfully");
          resolve();
        });
        inputStream.pipe(writeStream);
      } else {
        const writeStream = fs.createWriteStream(filePath, {
          flags: "a",
          highWaterMark: 64 * 1024 * 1024, // 64MB buffer for better performance
        });
        writeStream.on("error", (error) => {
          console.error("Write stream error:", error);
          reject(error);
        });
        writeStream.on("finish", () => {
          console.log("Write stream finished successfully");
          resolve();
        });
        inputStream.pipe(writeStream);
      }
    });
  }

  /**
   * Finalize upload by moving temp file to final location and encrypting (if enabled)
   */
  private async finalizeUpload(
    chunkInfo: ChunkInfo,
    metadata: ChunkMetadata,
    originalObjectName: string
  ): Promise<{ isComplete: boolean; finalPath: string }> {
    // Mark as finalizing to prevent race conditions
    this.finalizingUploads.add(chunkInfo.fileId);

    try {
      console.log(`Finalizing upload for ${chunkInfo.fileName}`);

      const tempStats = await fs.promises.stat(chunkInfo.tempPath);
      console.log(`Temp file size: ${tempStats.size} bytes, expected: ${chunkInfo.totalSize} bytes`);

      if (tempStats.size !== chunkInfo.totalSize) {
        console.warn(`Size mismatch! Temp: ${tempStats.size}, Expected: ${chunkInfo.totalSize}`);
      }

      const provider = FilesystemStorageProvider.getInstance();
      const finalObjectName = originalObjectName;
      const filePath = provider.getFilePath(finalObjectName);
      const dir = path.dirname(filePath);

      console.log(`Starting finalization: ${finalObjectName}`);

      await fs.promises.mkdir(dir, { recursive: true });

      const tempReadStream = fs.createReadStream(chunkInfo.tempPath, {
        highWaterMark: 64 * 1024 * 1024, // 64MB buffer for better performance
      });
      const writeStream = fs.createWriteStream(filePath, {
        highWaterMark: 64 * 1024 * 1024,
      });
      const encryptStream = provider.createEncryptStream();

      await new Promise<void>((resolve, reject) => {
        const startTime = Date.now();

        tempReadStream
          .pipe(encryptStream)
          .pipe(writeStream)
          .on("finish", () => {
            const duration = Date.now() - startTime;
            console.log(`File processed and saved to: ${filePath} in ${duration}ms`);
            resolve();
          })
          .on("error", (error) => {
            console.error("Error during processing:", error);
            reject(error);
          });
      });

      console.log(`File successfully uploaded and processed: ${finalObjectName}`);

      await this.cleanupTempFile(chunkInfo.tempPath);

      this.activeUploads.delete(chunkInfo.fileId);
      this.finalizingUploads.delete(chunkInfo.fileId);

      return { isComplete: true, finalPath: finalObjectName };
    } catch (error) {
      console.error("Error during finalization:", error);
      await this.cleanupTempFile(chunkInfo.tempPath);
      this.activeUploads.delete(chunkInfo.fileId);
|
||||
this.finalizingUploads.delete(chunkInfo.fileId);
|
||||
throw error;
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Cleanup temporary file
|
||||
*/
|
||||
private async cleanupTempFile(tempPath: string): Promise<void> {
|
||||
try {
|
||||
await fs.promises.access(tempPath);
|
||||
await fs.promises.unlink(tempPath);
|
||||
console.log(`Temp file cleaned up: ${tempPath}`);
|
||||
} catch (error: any) {
|
||||
if (error.code === "ENOENT") {
|
||||
console.log(`Temp file already cleaned up: ${tempPath}`);
|
||||
} else {
|
||||
console.warn(`Failed to cleanup temp file ${tempPath}:`, error);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Cleanup expired uploads (older than 2 hours)
|
||||
*/
|
||||
private async cleanupExpiredUploads(): Promise<void> {
|
||||
const now = Date.now();
|
||||
const maxAge = 2 * 60 * 60 * 1000; // 2 hours
|
||||
|
||||
for (const [fileId, chunkInfo] of this.activeUploads.entries()) {
|
||||
if (now - chunkInfo.createdAt > maxAge) {
|
||||
console.log(`Cleaning up expired upload: ${fileId}`);
|
||||
await this.cleanupTempFile(chunkInfo.tempPath);
|
||||
this.activeUploads.delete(fileId);
|
||||
this.finalizingUploads.delete(fileId);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Get upload progress
|
||||
*/
|
||||
getUploadProgress(fileId: string): { uploaded: number; total: number; percentage: number } | null {
|
||||
const chunkInfo = this.activeUploads.get(fileId);
|
||||
if (!chunkInfo) return null;
|
||||
|
||||
return {
|
||||
uploaded: chunkInfo.uploadedChunks.size,
|
||||
total: chunkInfo.totalChunks,
|
||||
percentage: Math.round((chunkInfo.uploadedChunks.size / chunkInfo.totalChunks) * 100),
|
||||
};
|
||||
}
|
||||
|
||||
/**
|
||||
* Cancel upload
|
||||
*/
|
||||
async cancelUpload(fileId: string): Promise<void> {
|
||||
const chunkInfo = this.activeUploads.get(fileId);
|
||||
if (chunkInfo) {
|
||||
await this.cleanupTempFile(chunkInfo.tempPath);
|
||||
this.activeUploads.delete(fileId);
|
||||
this.finalizingUploads.delete(fileId);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Cleanup on shutdown
|
||||
*/
|
||||
destroy(): void {
|
||||
if (this.cleanupInterval) {
|
||||
clearInterval(this.cleanupInterval);
|
||||
}
|
||||
|
||||
for (const [fileId, chunkInfo] of this.activeUploads.entries()) {
|
||||
this.cleanupTempFile(chunkInfo.tempPath);
|
||||
}
|
||||
this.activeUploads.clear();
|
||||
this.finalizingUploads.clear();
|
||||
}
|
||||
}
|
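Progress reporting in `getUploadProgress` is plain chunk counting. A minimal standalone sketch of the same arithmetic (the helper name is ours, not part of the codebase):

```typescript
// Hypothetical helper mirroring the math in ChunkManager.getUploadProgress:
// percentage is whole chunks received over total chunks, rounded.
function chunkProgress(uploadedChunks: number, totalChunks: number) {
  return {
    uploaded: uploadedChunks,
    total: totalChunks,
    percentage: Math.round((uploadedChunks / totalChunks) * 100),
  };
}

// e.g. 3 of 7 chunks received
const p = chunkProgress(3, 7);
console.log(p.percentage); // 43
```

Note the percentage counts completed chunks, not bytes, so it is coarse for small chunk counts.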
apps/server/src/modules/filesystem/controller.ts (new file, 416 lines)
@@ -0,0 +1,416 @@
import * as fs from "fs";
import { pipeline } from "stream/promises";
import { FastifyReply, FastifyRequest } from "fastify";

import { FilesystemStorageProvider } from "../../providers/filesystem-storage.provider";
import { DownloadCancelResponse, QueueClearResponse, QueueStatusResponse } from "../../types/download-queue";
import { DownloadMemoryManager } from "../../utils/download-memory-manager";
import { getContentType } from "../../utils/mime-types";
import { ChunkManager, ChunkMetadata } from "./chunk-manager";

export class FilesystemController {
  private chunkManager = ChunkManager.getInstance();
  private memoryManager = DownloadMemoryManager.getInstance();

  private encodeFilenameForHeader(filename: string): string {
    if (!filename || filename.trim() === "") {
      return 'attachment; filename="download"';
    }

    let sanitized = filename
      .replace(/"/g, "'")
      .replace(/[\r\n\t\v\f]/g, "")
      .replace(/[\\|/]/g, "-")
      .replace(/[<>:|*?]/g, "");

    sanitized = sanitized
      .split("")
      .filter((char) => {
        const code = char.charCodeAt(0);
        return code >= 32 && !(code >= 127 && code <= 159);
      })
      .join("")
      .trim();

    if (!sanitized) {
      return 'attachment; filename="download"';
    }

    const asciiSafe = sanitized
      .split("")
      .filter((char) => {
        const code = char.charCodeAt(0);
        return code >= 32 && code <= 126;
      })
      .join("");

    if (asciiSafe && asciiSafe.trim()) {
      const encoded = encodeURIComponent(sanitized);
      return `attachment; filename="${asciiSafe}"; filename*=UTF-8''${encoded}`;
    } else {
      const encoded = encodeURIComponent(sanitized);
      return `attachment; filename*=UTF-8''${encoded}`;
    }
  }

  async upload(request: FastifyRequest, reply: FastifyReply) {
    try {
      const { token } = request.params as { token: string };

      const provider = FilesystemStorageProvider.getInstance();

      const tokenData = provider.validateUploadToken(token);

      if (!tokenData) {
        return reply.status(400).send({ error: "Invalid or expired upload token" });
      }

      const chunkMetadata = this.extractChunkMetadata(request);

      if (chunkMetadata) {
        try {
          const result = await this.handleChunkedUpload(request, chunkMetadata, tokenData.objectName);

          if (result.isComplete) {
            provider.consumeUploadToken(token);
            reply.status(200).send({
              message: "File uploaded successfully",
              objectName: result.finalPath,
              finalObjectName: result.finalPath,
            });
          } else {
            reply.status(200).send({
              message: "Chunk uploaded successfully",
              progress: this.chunkManager.getUploadProgress(chunkMetadata.fileId),
            });
          }
        } catch (chunkError: any) {
          return reply.status(400).send({
            error: chunkError.message || "Chunked upload failed",
            details: chunkError.toString(),
          });
        }
      } else {
        await this.uploadFileStream(request, provider, tokenData.objectName);
        provider.consumeUploadToken(token);
        reply.status(200).send({ message: "File uploaded successfully" });
      }
    } catch (error) {
      return reply.status(500).send({ error: "Internal server error" });
    }
  }

  private async uploadFileStream(request: FastifyRequest, provider: FilesystemStorageProvider, objectName: string) {
    await provider.uploadFileFromStream(objectName, request.raw);
  }

  private extractChunkMetadata(request: FastifyRequest): ChunkMetadata | null {
    const fileId = request.headers["x-file-id"] as string;
    const chunkIndex = request.headers["x-chunk-index"] as string;
    const totalChunks = request.headers["x-total-chunks"] as string;
    const chunkSize = request.headers["x-chunk-size"] as string;
    const totalSize = request.headers["x-total-size"] as string;
    const fileName = request.headers["x-file-name"] as string;
    const isLastChunk = request.headers["x-is-last-chunk"] as string;

    if (!fileId || !chunkIndex || !totalChunks || !chunkSize || !totalSize || !fileName) {
      return null;
    }

    const metadata = {
      fileId,
      chunkIndex: parseInt(chunkIndex, 10),
      totalChunks: parseInt(totalChunks, 10),
      chunkSize: parseInt(chunkSize, 10),
      totalSize: parseInt(totalSize, 10),
      fileName,
      isLastChunk: isLastChunk === "true",
    };

    return metadata;
  }

  private async handleChunkedUpload(request: FastifyRequest, metadata: ChunkMetadata, originalObjectName: string) {
    const stream = request.raw;

    stream.on("error", (error) => {
      console.error("Request stream error:", error);
    });

    return await this.chunkManager.processChunk(metadata, stream, originalObjectName);
  }

  async getUploadProgress(request: FastifyRequest, reply: FastifyReply) {
    try {
      const { fileId } = request.params as { fileId: string };

      const progress = this.chunkManager.getUploadProgress(fileId);

      if (!progress) {
        return reply.status(404).send({ error: "Upload not found" });
      }

      reply.status(200).send(progress);
    } catch (error) {
      return reply.status(500).send({ error: "Internal server error" });
    }
  }

  async cancelUpload(request: FastifyRequest, reply: FastifyReply) {
    try {
      const { fileId } = request.params as { fileId: string };

      await this.chunkManager.cancelUpload(fileId);

      reply.status(200).send({ message: "Upload cancelled successfully" });
    } catch (error) {
      return reply.status(500).send({ error: "Internal server error" });
    }
  }

  async download(request: FastifyRequest, reply: FastifyReply) {
    const downloadId = `${Date.now()}-${Math.random().toString(36).substring(2, 11)}`;

    try {
      const { token } = request.params as { token: string };
      const provider = FilesystemStorageProvider.getInstance();

      const tokenData = provider.validateDownloadToken(token);

      if (!tokenData) {
        return reply.status(400).send({ error: "Invalid or expired download token" });
      }

      const filePath = provider.getFilePath(tokenData.objectName);
      const stats = await fs.promises.stat(filePath);
      const fileSize = stats.size;
      const fileName = tokenData.fileName || "download";

      const fileSizeMB = fileSize / (1024 * 1024);
      console.log(`[DOWNLOAD] Requesting slot for ${downloadId}: ${tokenData.objectName} (${fileSizeMB.toFixed(1)}MB)`);

      try {
        await this.memoryManager.requestDownloadSlot(downloadId, {
          fileName,
          fileSize,
          objectName: tokenData.objectName,
        });
      } catch (error: any) {
        console.warn(`[DOWNLOAD] Queue full for ${downloadId}: ${error.message}`);
        return reply.status(503).send({
          error: "Download queue is full",
          message: error.message,
          retryAfter: 60,
        });
      }

      console.log(`[DOWNLOAD] Starting ${downloadId}: ${tokenData.objectName} (${fileSizeMB.toFixed(1)}MB)`);
      this.memoryManager.startDownload(downloadId);

      const range = request.headers.range;

      reply.header("Content-Disposition", this.encodeFilenameForHeader(fileName));
      reply.header("Content-Type", getContentType(fileName));
      reply.header("Accept-Ranges", "bytes");
      reply.header("X-Download-ID", downloadId);

      reply.raw.on("close", () => {
        this.memoryManager.endDownload(downloadId);
        console.log(`[DOWNLOAD] Client disconnected: ${downloadId}`);
      });

      reply.raw.on("error", () => {
        this.memoryManager.endDownload(downloadId);
        console.log(`[DOWNLOAD] Client error: ${downloadId}`);
      });

      try {
        if (range) {
          const parts = range.replace(/bytes=/, "").split("-");
          const start = parseInt(parts[0], 10);
          const end = parts[1] ? parseInt(parts[1], 10) : fileSize - 1;

          reply.status(206);
          reply.header("Content-Range", `bytes ${start}-${end}/${fileSize}`);
          reply.header("Content-Length", end - start + 1);

          await this.downloadFileRange(reply, provider, tokenData.objectName, start, end, downloadId);
        } else {
          reply.header("Content-Length", fileSize);
          await this.downloadFileStream(reply, provider, tokenData.objectName, downloadId);
        }

        provider.consumeDownloadToken(token);
      } finally {
        this.memoryManager.endDownload(downloadId);
      }
    } catch (error) {
      this.memoryManager.endDownload(downloadId);
      console.error(`[DOWNLOAD] Error in ${downloadId}:`, error);
      return reply.status(500).send({ error: "Internal server error" });
    }
  }

  private async downloadFileStream(
    reply: FastifyReply,
    provider: FilesystemStorageProvider,
    objectName: string,
    downloadId?: string
  ) {
    try {
      FilesystemStorageProvider.logMemoryUsage(`Download start: ${objectName} (${downloadId})`);

      const downloadStream = provider.createDownloadStream(objectName);

      downloadStream.on("error", (error) => {
        console.error("Download stream error:", error);
        FilesystemStorageProvider.logMemoryUsage(`Download error: ${objectName} (${downloadId})`);
        if (!reply.sent) {
          reply.status(500).send({ error: "Download failed" });
        }
      });

      reply.raw.on("close", () => {
        if (downloadStream.readable && typeof (downloadStream as any).destroy === "function") {
          (downloadStream as any).destroy();
        }
        FilesystemStorageProvider.logMemoryUsage(`Download client disconnect: ${objectName} (${downloadId})`);
      });

      if (this.memoryManager.shouldThrottleStream()) {
        console.log(
          `[MEMORY THROTTLE] ${objectName} - Pausing stream due to high memory usage: ${this.memoryManager.getCurrentMemoryUsageMB().toFixed(0)}MB`
        );

        const { Transform } = require("stream");
        const memoryManager = this.memoryManager;
        const throttleStream = new Transform({
          highWaterMark: 256 * 1024,
          transform(chunk: Buffer, _encoding: BufferEncoding, callback: (error?: Error | null, data?: any) => void) {
            if (memoryManager.shouldThrottleStream()) {
              setImmediate(() => {
                this.push(chunk);
                callback();
              });
            } else {
              this.push(chunk);
              callback();
            }
          },
        });

        await pipeline(downloadStream, throttleStream, reply.raw);
      } else {
        await pipeline(downloadStream, reply.raw);
      }

      FilesystemStorageProvider.logMemoryUsage(`Download complete: ${objectName} (${downloadId})`);
    } catch (error) {
      console.error("Download error:", error);
      FilesystemStorageProvider.logMemoryUsage(`Download failed: ${objectName} (${downloadId})`);
      if (!reply.sent) {
        reply.status(500).send({ error: "Download failed" });
      }
    }
  }

  private async downloadFileRange(
    reply: FastifyReply,
    provider: FilesystemStorageProvider,
    objectName: string,
    start: number,
    end: number,
    downloadId?: string
  ) {
    try {
      FilesystemStorageProvider.logMemoryUsage(`Range download start: ${objectName} (${start}-${end}) (${downloadId})`);

      const rangeStream = await provider.createDownloadRangeStream(objectName, start, end);

      rangeStream.on("error", (error) => {
        console.error("Range download stream error:", error);
        FilesystemStorageProvider.logMemoryUsage(
          `Range download error: ${objectName} (${start}-${end}) (${downloadId})`
        );
        if (!reply.sent) {
          reply.status(500).send({ error: "Download failed" });
        }
      });

      reply.raw.on("close", () => {
        if (rangeStream.readable && typeof (rangeStream as any).destroy === "function") {
          (rangeStream as any).destroy();
        }
        FilesystemStorageProvider.logMemoryUsage(
          `Range download client disconnect: ${objectName} (${start}-${end}) (${downloadId})`
        );
      });

      await pipeline(rangeStream, reply.raw);

      FilesystemStorageProvider.logMemoryUsage(
        `Range download complete: ${objectName} (${start}-${end}) (${downloadId})`
      );
    } catch (error) {
      console.error("Range download error:", error);
      FilesystemStorageProvider.logMemoryUsage(
        `Range download failed: ${objectName} (${start}-${end}) (${downloadId})`
      );
      if (!reply.sent) {
        reply.status(500).send({ error: "Download failed" });
      }
    }
  }

  async getQueueStatus(_request: FastifyRequest, reply: FastifyReply) {
    try {
      const queueStatus = this.memoryManager.getQueueStatus();
      const response: QueueStatusResponse = {
        status: "success",
        data: queueStatus,
      };
      reply.status(200).send(response);
    } catch (error) {
      console.error("Error getting queue status:", error);
      return reply.status(500).send({ error: "Internal server error" });
    }
  }

  async cancelQueuedDownload(request: FastifyRequest, reply: FastifyReply) {
    try {
      const { downloadId } = request.params as { downloadId: string };

      const cancelled = this.memoryManager.cancelQueuedDownload(downloadId);

      if (cancelled) {
        const response: DownloadCancelResponse = {
          message: "Download cancelled successfully",
          downloadId,
        };
        reply.status(200).send(response);
      } else {
        reply.status(404).send({
          error: "Download not found in queue",
          downloadId,
        });
      }
    } catch (error) {
      console.error("Error cancelling queued download:", error);
      return reply.status(500).send({ error: "Internal server error" });
    }
  }

  async clearDownloadQueue(_request: FastifyRequest, reply: FastifyReply) {
    try {
      const clearedCount = this.memoryManager.clearQueue();
      const response: QueueClearResponse = {
        message: "Download queue cleared successfully",
        clearedCount,
      };
      reply.status(200).send(response);
    } catch (error) {
      console.error("Error clearing download queue:", error);
      return reply.status(500).send({ error: "Internal server error" });
    }
  }
}
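`download()` above parses the `Range` header with a simple split, defaulting an open-ended range to the last byte of the file. A standalone sketch of that parsing and the header values it produces (a hypothetical helper mirroring the controller's logic, not part of the codebase):

```typescript
// Sketch of the Range parsing used in download():
// "bytes=START-END", where a missing END means "through the last byte".
function parseRange(range: string, fileSize: number) {
  const parts = range.replace(/bytes=/, "").split("-");
  const start = parseInt(parts[0], 10);
  const end = parts[1] ? parseInt(parts[1], 10) : fileSize - 1;
  return {
    start,
    end,
    contentLength: end - start + 1,                     // Content-Length header
    contentRange: `bytes ${start}-${end}/${fileSize}`,  // Content-Range header
  };
}

const first = parseRange("bytes=0-1048575", 5_000_000);
console.log(first.contentLength); // 1048576

const tail = parseRange("bytes=4000000-", 5_000_000);
console.log(tail.contentRange); // bytes 4000000-4999999/5000000
```

Note the controller does not validate that `start <= end < fileSize`; a malformed range would flow through to the range stream.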
apps/server/src/modules/filesystem/download-queue-routes.ts (new file, 95 lines)
@@ -0,0 +1,95 @@
import { FastifyInstance } from "fastify";
import { z } from "zod";

import { FilesystemController } from "./controller";

export async function downloadQueueRoutes(app: FastifyInstance) {
  const filesystemController = new FilesystemController();

  app.get(
    "/filesystem/download-queue/status",
    {
      schema: {
        tags: ["Download Queue"],
        operationId: "getDownloadQueueStatus",
        summary: "Get download queue status",
        description: "Get current status of the download queue including active downloads and queue length",
        response: {
          200: z.object({
            status: z.string(),
            data: z.object({
              queueLength: z.number(),
              maxQueueSize: z.number(),
              activeDownloads: z.number(),
              maxConcurrent: z.number(),
              queuedDownloads: z.array(
                z.object({
                  downloadId: z.string(),
                  position: z.number(),
                  waitTime: z.number(),
                  fileName: z.string().optional(),
                  fileSize: z.number().optional(),
                })
              ),
            }),
          }),
          500: z.object({
            error: z.string(),
          }),
        },
      },
    },
    filesystemController.getQueueStatus.bind(filesystemController)
  );

  app.delete(
    "/filesystem/download-queue/:downloadId",
    {
      schema: {
        tags: ["Download Queue"],
        operationId: "cancelQueuedDownload",
        summary: "Cancel a queued download",
        description: "Cancel a specific download that is waiting in the queue",
        params: z.object({
          downloadId: z.string().describe("Download ID"),
        }),
        response: {
          200: z.object({
            message: z.string(),
            downloadId: z.string(),
          }),
          404: z.object({
            error: z.string(),
            downloadId: z.string(),
          }),
          500: z.object({
            error: z.string(),
          }),
        },
      },
    },
    filesystemController.cancelQueuedDownload.bind(filesystemController)
  );

  app.delete(
    "/filesystem/download-queue",
    {
      schema: {
        tags: ["Download Queue"],
        operationId: "clearDownloadQueue",
        summary: "Clear entire download queue",
        description: "Cancel all downloads waiting in the queue (admin operation)",
        response: {
          200: z.object({
            message: z.string(),
            clearedCount: z.number(),
          }),
          500: z.object({
            error: z.string(),
          }),
        },
      },
    },
    filesystemController.clearDownloadQueue.bind(filesystemController)
  );
}
apps/server/src/modules/filesystem/routes.ts (new file, 123 lines)
@@ -0,0 +1,123 @@
import { FastifyInstance, FastifyRequest } from "fastify";
import { z } from "zod";

import { FilesystemController } from "./controller";

export async function filesystemRoutes(app: FastifyInstance) {
  const filesystemController = new FilesystemController();

  app.addContentTypeParser("*", async (request: FastifyRequest, payload: any) => {
    return payload;
  });

  app.addContentTypeParser("application/json", async (request: FastifyRequest, payload: any) => {
    return payload;
  });

  app.put(
    "/filesystem/upload/:token",
    {
      bodyLimit: 1024 * 1024 * 1024 * 1024 * 1024, // 1PB limit
      schema: {
        tags: ["Filesystem"],
        operationId: "uploadToFilesystem",
        summary: "Upload file to filesystem storage",
        description: "Upload a file directly to the encrypted filesystem storage",
        params: z.object({
          token: z.string().describe("Upload token"),
        }),
        response: {
          200: z.object({
            message: z.string(),
          }),
          400: z.object({
            error: z.string(),
          }),
          500: z.object({
            error: z.string(),
          }),
        },
      },
    },
    filesystemController.upload.bind(filesystemController)
  );

  app.get(
    "/filesystem/download/:token",
    {
      bodyLimit: 1024 * 1024 * 1024 * 1024 * 1024, // 1PB limit
      schema: {
        tags: ["Filesystem"],
        operationId: "downloadFromFilesystem",
        summary: "Download file from filesystem storage",
        description: "Download a file directly from the encrypted filesystem storage",
        params: z.object({
          token: z.string().describe("Download token"),
        }),
        response: {
          200: z.string().describe("File content"),
          400: z.object({
            error: z.string(),
          }),
          500: z.object({
            error: z.string(),
          }),
        },
      },
    },
    filesystemController.download.bind(filesystemController)
  );

  app.get(
    "/filesystem/upload-progress/:fileId",
    {
      schema: {
        tags: ["Filesystem"],
        operationId: "getUploadProgress",
        summary: "Get chunked upload progress",
        description: "Get the progress of a chunked upload",
        params: z.object({
          fileId: z.string().describe("File ID"),
        }),
        response: {
          200: z.object({
            uploaded: z.number(),
            total: z.number(),
            percentage: z.number(),
          }),
          404: z.object({
            error: z.string(),
          }),
          500: z.object({
            error: z.string(),
          }),
        },
      },
    },
    filesystemController.getUploadProgress.bind(filesystemController)
  );

  app.delete(
    "/filesystem/cancel-upload/:fileId",
    {
      schema: {
        tags: ["Filesystem"],
        operationId: "cancelUpload",
        summary: "Cancel chunked upload",
        description: "Cancel an ongoing chunked upload",
        params: z.object({
          fileId: z.string().describe("File ID"),
        }),
        response: {
          200: z.object({
            message: z.string(),
          }),
          500: z.object({
            error: z.string(),
          }),
        },
      },
    },
    filesystemController.cancelUpload.bind(filesystemController)
  );
}
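The upload route ultimately feeds `extractChunkMetadata` in the controller, which expects one `x-*` header per chunk field. A client splitting a file would compute chunk boundaries and headers along these lines (a hypothetical sketch; Palmr's actual frontend uploader is not shown in this diff):

```typescript
// Hypothetical client-side chunk planner for the x-* headers read by
// extractChunkMetadata. The splitting math is ours; only the header
// names come from the route's controller.
function planChunks(totalSize: number, chunkSize: number, fileId: string, fileName: string) {
  const totalChunks = Math.ceil(totalSize / chunkSize);
  return Array.from({ length: totalChunks }, (_, i) => ({
    start: i * chunkSize,                               // byte offset of this chunk
    end: Math.min((i + 1) * chunkSize, totalSize),      // exclusive end offset
    headers: {
      "x-file-id": fileId,
      "x-chunk-index": String(i),
      "x-total-chunks": String(totalChunks),
      "x-chunk-size": String(chunkSize),
      "x-total-size": String(totalSize),
      "x-file-name": fileName,
      "x-is-last-chunk": String(i === totalChunks - 1),
    },
  }));
}

// A 250MB file with 100MB chunks yields three PUTs to /filesystem/upload/:token.
const chunks = planChunks(250 * 1024 * 1024, 100 * 1024 * 1024, "abc123", "video.mp4");
console.log(chunks.length); // 3
console.log(chunks[2].headers["x-is-last-chunk"]); // true
```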
@@ -1,3 +1,5 @@
+import { isS3Enabled } from "../../config/storage.config";
+import { FilesystemStorageProvider } from "../../providers/filesystem-storage.provider";
 import { S3StorageProvider } from "../../providers/s3-storage.provider";
 import { prisma } from "../../shared/prisma";
 import { StorageProvider } from "../../types/storage";
@@ -6,7 +8,11 @@ export class FolderService {
   private storageProvider: StorageProvider;
 
   constructor() {
-    this.storageProvider = new S3StorageProvider();
+    if (isS3Enabled) {
+      this.storageProvider = new S3StorageProvider();
+    } else {
+      this.storageProvider = FilesystemStorageProvider.getInstance();
+    }
   }
 
   async getPresignedPutUrl(objectName: string, expires: number): Promise<string> {
@@ -36,6 +42,10 @@ export class FolderService {
     }
   }
 
+  isFilesystemMode(): boolean {
+    return !isS3Enabled;
+  }
+
   async getAllFilesInFolder(folderId: string, userId: string, basePath: string = ""): Promise<any[]> {
     const files = await prisma.file.findMany({
       where: { folderId, userId },
|
||||
const fileInfo = await this.reverseShareService.getFileInfo(fileId, userId);
|
||||
const downloadId = `reverse-${Date.now()}-${Math.random().toString(36).substring(2, 11)}`;
|
||||
|
||||
const { DownloadMemoryManager } = await import("../../utils/download-memory-manager.js");
|
||||
const memoryManager = DownloadMemoryManager.getInstance();
|
||||
|
||||
const fileSizeMB = Number(fileInfo.size) / (1024 * 1024);
|
||||
console.log(
|
||||
`[REVERSE-DOWNLOAD] Requesting slot for ${downloadId}: ${fileInfo.name} (${fileSizeMB.toFixed(1)}MB)`
|
||||
);
|
||||
|
||||
try {
|
||||
await memoryManager.requestDownloadSlot(downloadId, {
|
||||
fileName: fileInfo.name,
|
||||
fileSize: Number(fileInfo.size),
|
||||
objectName: fileInfo.objectName,
|
||||
});
|
||||
} catch (error: any) {
|
||||
console.warn(`[REVERSE-DOWNLOAD] Queued ${downloadId}: ${error.message}`);
|
||||
return reply.status(202).send({
|
||||
queued: true,
|
||||
downloadId: downloadId,
|
||||
message: "Download queued due to memory constraints",
|
||||
estimatedWaitTime: error.estimatedWaitTime || 60,
|
||||
});
|
||||
}
|
||||
|
||||
console.log(`[REVERSE-DOWNLOAD] Starting ${downloadId}: ${fileInfo.name} (${fileSizeMB.toFixed(1)}MB)`);
|
||||
memoryManager.startDownload(downloadId);
|
||||
|
||||
try {
|
||||
const result = await this.reverseShareService.downloadReverseShareFile(fileId, userId);
|
||||
|
||||
const originalUrl = result.url;
|
||||
reply.header("X-Download-ID", downloadId);
|
||||
|
||||
reply.raw.on("finish", () => {
|
||||
memoryManager.endDownload(downloadId);
|
||||
});
|
||||
|
||||
reply.raw.on("close", () => {
|
||||
memoryManager.endDownload(downloadId);
|
||||
});
|
||||
|
||||
reply.raw.on("error", () => {
|
||||
memoryManager.endDownload(downloadId);
|
||||
});
|
||||
|
||||
return reply.send(result);
|
||||
} catch (downloadError) {
|
||||
memoryManager.endDownload(downloadId);
|
||||
throw downloadError;
|
||||
}
|
||||
} catch (error: any) {
|
||||
|
@@ -568,57 +568,76 @@ export class ReverseShareService {
|
||||
|
||||
const newObjectName = `${creatorId}/${Date.now()}-${file.name}`;
|
||||
|
||||
const fileSizeMB = Number(file.size) / (1024 * 1024);
|
||||
const needsStreaming = fileSizeMB > 100;
|
||||
if (this.fileService.isFilesystemMode()) {
|
||||
const { FilesystemStorageProvider } = await import("../../providers/filesystem-storage.provider.js");
|
||||
const provider = FilesystemStorageProvider.getInstance();
|
||||
|
||||
const downloadUrl = await this.fileService.getPresignedGetUrl(file.objectName, 300);
|
||||
const uploadUrl = await this.fileService.getPresignedPutUrl(newObjectName, 300);
|
||||
const sourcePath = provider.getFilePath(file.objectName);
const targetPath = provider.getFilePath(newObjectName);

const fs = await import("fs");
const path = await import("path");
const targetDir = path.dirname(targetPath);
if (!fs.existsSync(targetDir)) {
  fs.mkdirSync(targetDir, { recursive: true });
}

const { copyFile } = await import("fs/promises");
await copyFile(sourcePath, targetPath);
} else {
  const fileSizeMB = Number(file.size) / (1024 * 1024);
  const needsStreaming = fileSizeMB > 100;

  const downloadUrl = await this.fileService.getPresignedGetUrl(file.objectName, 300);
  const uploadUrl = await this.fileService.getPresignedPutUrl(newObjectName, 300);

  let retries = 0;
  const maxRetries = 3;
  let success = false;

  while (retries < maxRetries && !success) {
    try {
      const response = await fetch(downloadUrl, {
        signal: AbortSignal.timeout(600000), // 10 minutes timeout
      });

      if (!response.ok) {
        throw new Error(`Failed to download file: ${response.statusText}`);
      }

      if (!response.body) {
        throw new Error("No response body received");
      }

      const uploadOptions: any = {
        method: "PUT",
        body: response.body,
        headers: {
          "Content-Type": "application/octet-stream",
          "Content-Length": file.size.toString(),
        },
        signal: AbortSignal.timeout(600000), // 10 minutes timeout
      };

      const uploadResponse = await fetch(uploadUrl, uploadOptions);

      if (!uploadResponse.ok) {
        const errorText = await uploadResponse.text();
        throw new Error(`Failed to upload file: ${uploadResponse.statusText} - ${errorText}`);
      }

      success = true;
    } catch (error: any) {
      retries++;

      if (retries >= maxRetries) {
        throw new Error(`Failed to copy file after ${maxRetries} attempts: ${error.message}`);
      }

      const delay = Math.min(1000 * Math.pow(2, retries - 1), 10000);
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
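The retry loop above waits between attempts using exponential backoff capped at 10 seconds. A minimal standalone sketch of that delay schedule (the `backoffMs` helper name is illustrative, not part of the codebase):

```typescript
// Exponential backoff as used in the retry loop: 2^(retry - 1) seconds,
// capped at 10 000 ms. `backoffMs` is a hypothetical helper for illustration.
const backoffMs = (retry: number): number => Math.min(1000 * Math.pow(2, retry - 1), 10000);

// Delays for attempts 1..6:
console.log([1, 2, 3, 4, 5, 6].map(backoffMs)); // [ 1000, 2000, 4000, 8000, 10000, 10000 ]
```

With `maxRetries = 3`, only the first two delays (1 s and 2 s) are ever used; the cap matters only if the retry budget is raised.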
724
apps/server/src/providers/filesystem-storage.provider.ts
Normal file
@@ -0,0 +1,724 @@
import * as crypto from "crypto";
import * as fsSync from "fs";
import * as fs from "fs/promises";
import * as path from "path";
import { Transform } from "stream";
import { pipeline } from "stream/promises";

import { directoriesConfig, getTempFilePath } from "../config/directories.config";
import { env } from "../env";
import { StorageProvider } from "../types/storage";

export class FilesystemStorageProvider implements StorageProvider {
  private static instance: FilesystemStorageProvider;
  private uploadsDir: string;
  private encryptionKey = env.ENCRYPTION_KEY;
  private isEncryptionDisabled = env.DISABLE_FILESYSTEM_ENCRYPTION === "true";
  private uploadTokens = new Map<string, { objectName: string; expiresAt: number }>();
  private downloadTokens = new Map<string, { objectName: string; expiresAt: number; fileName?: string }>();

  private constructor() {
    this.uploadsDir = directoriesConfig.uploads;

    if (!this.isEncryptionDisabled && !this.encryptionKey) {
      throw new Error(
        "Encryption is enabled but ENCRYPTION_KEY is not provided. " +
          "Please set ENCRYPTION_KEY environment variable or set DISABLE_FILESYSTEM_ENCRYPTION=true to disable encryption."
      );
    }

    this.ensureUploadsDir();
    setInterval(() => this.cleanExpiredTokens(), 5 * 60 * 1000);
    setInterval(() => this.cleanupEmptyTempDirs(), 10 * 60 * 1000);
  }

  public static getInstance(): FilesystemStorageProvider {
    if (!FilesystemStorageProvider.instance) {
      FilesystemStorageProvider.instance = new FilesystemStorageProvider();
    }
    return FilesystemStorageProvider.instance;
  }

  private async ensureUploadsDir(): Promise<void> {
    try {
      await fs.access(this.uploadsDir);
    } catch {
      await fs.mkdir(this.uploadsDir, { recursive: true });
    }
  }

  private cleanExpiredTokens(): void {
    const now = Date.now();

    for (const [token, data] of this.uploadTokens.entries()) {
      if (now > data.expiresAt) {
        this.uploadTokens.delete(token);
      }
    }

    for (const [token, data] of this.downloadTokens.entries()) {
      if (now > data.expiresAt) {
        this.downloadTokens.delete(token);
      }
    }
  }

  public getFilePath(objectName: string): string {
    const sanitizedName = objectName.replace(/[^a-zA-Z0-9\-_./]/g, "_");
    return path.join(this.uploadsDir, sanitizedName);
  }

  private createEncryptionKey(): Buffer {
    if (!this.encryptionKey) {
      throw new Error(
        "Encryption key is required when encryption is enabled. Please set ENCRYPTION_KEY environment variable."
      );
    }
    return crypto.scryptSync(this.encryptionKey, "salt", 32);
  }

  public createEncryptStream(): Transform {
    if (this.isEncryptionDisabled) {
      return new Transform({
        highWaterMark: 64 * 1024,
        transform(chunk, _encoding, callback) {
          this.push(chunk);
          callback();
        },
      });
    }

    const key = this.createEncryptionKey();
    const iv = crypto.randomBytes(16);
    const cipher = crypto.createCipheriv("aes-256-cbc", key, iv);

    let isFirstChunk = true;

    return new Transform({
      highWaterMark: 64 * 1024,
      transform(chunk, _encoding, callback) {
        try {
          if (isFirstChunk) {
            this.push(iv);
            isFirstChunk = false;
          }

          const encrypted = cipher.update(chunk);
          this.push(encrypted);
          callback();
        } catch (error) {
          callback(error as Error);
        }
      },

      flush(callback) {
        try {
          const final = cipher.final();
          this.push(final);
          callback();
        } catch (error) {
          callback(error as Error);
        }
      },
    });
  }

  public createDecryptStream(): Transform {
    if (this.isEncryptionDisabled) {
      return new Transform({
        highWaterMark: 64 * 1024,
        transform(chunk, _encoding, callback) {
          this.push(chunk);
          callback();
        },
      });
    }

    const key = this.createEncryptionKey();
    let iv: Buffer | null = null;
    let decipher: crypto.Decipher | null = null;
    let ivBuffer = Buffer.alloc(0);

    return new Transform({
      highWaterMark: 64 * 1024,
      transform(chunk, _encoding, callback) {
        try {
          if (!iv) {
            ivBuffer = Buffer.concat([ivBuffer, chunk]);

            if (ivBuffer.length >= 16) {
              iv = ivBuffer.subarray(0, 16);
              decipher = crypto.createDecipheriv("aes-256-cbc", key, iv);
              const remainingData = ivBuffer.subarray(16);
              if (remainingData.length > 0) {
                const decrypted = decipher.update(remainingData);
                this.push(decrypted);
              }
            }
            callback();
            return;
          }

          if (decipher) {
            const decrypted = decipher.update(chunk);
            this.push(decrypted);
          }
          callback();
        } catch (error) {
          callback(error as Error);
        }
      },

      flush(callback) {
        try {
          if (decipher) {
            const final = decipher.final();
            this.push(final);
          }
          callback();
        } catch (error) {
          callback(error as Error);
        }
      },
    });
  }

  async getPresignedPutUrl(objectName: string, expires: number): Promise<string> {
    const token = crypto.randomBytes(32).toString("hex");
    const expiresAt = Date.now() + expires * 1000;

    this.uploadTokens.set(token, { objectName, expiresAt });

    return `/api/filesystem/upload/${token}`;
  }

  async getPresignedGetUrl(objectName: string, expires: number, fileName?: string): Promise<string> {
    const token = crypto.randomBytes(32).toString("hex");
    const expiresAt = Date.now() + expires * 1000;

    this.downloadTokens.set(token, { objectName, expiresAt, fileName });

    return `/api/filesystem/download/${token}`;
  }

  async deleteObject(objectName: string): Promise<void> {
    const filePath = this.getFilePath(objectName);
    try {
      await fs.unlink(filePath);
    } catch (error: any) {
      if (error.code !== "ENOENT") {
        throw error;
      }
    }
  }

  async uploadFile(objectName: string, buffer: Buffer): Promise<void> {
    const filePath = this.getFilePath(objectName);
    const dir = path.dirname(filePath);

    await fs.mkdir(dir, { recursive: true });

    const { Readable } = await import("stream");
    const readable = Readable.from(buffer);

    await this.uploadFileFromStream(objectName, readable);
  }

  async uploadFileFromStream(objectName: string, inputStream: NodeJS.ReadableStream): Promise<void> {
    const filePath = this.getFilePath(objectName);
    const dir = path.dirname(filePath);

    await fs.mkdir(dir, { recursive: true });

    const tempPath = getTempFilePath(objectName);
    const tempDir = path.dirname(tempPath);

    await fs.mkdir(tempDir, { recursive: true });

    const writeStream = fsSync.createWriteStream(tempPath);
    const encryptStream = this.createEncryptStream();

    try {
      await pipeline(inputStream, encryptStream, writeStream);
      await this.moveFile(tempPath, filePath);
    } catch (error) {
      await this.cleanupTempFile(tempPath);
      throw error;
    }
  }

  async downloadFile(objectName: string): Promise<Buffer> {
    const filePath = this.getFilePath(objectName);
    const fileBuffer = await fs.readFile(filePath);

    if (this.isEncryptionDisabled) {
      return fileBuffer;
    }

    if (fileBuffer.length > 16) {
      try {
        return this.decryptFileBuffer(fileBuffer);
      } catch (error: unknown) {
        if (error instanceof Error) {
          console.warn("Failed to decrypt with new method, trying legacy format", error.message);
        }
        return this.decryptFileLegacy(fileBuffer);
      }
    }

    return this.decryptFileLegacy(fileBuffer);
  }

  createDownloadStream(objectName: string): NodeJS.ReadableStream {
    const filePath = this.getFilePath(objectName);

    const streamOptions = {
      highWaterMark: 64 * 1024,
      autoDestroy: true,
      emitClose: true,
    };

    const fileStream = fsSync.createReadStream(filePath, streamOptions);

    if (this.isEncryptionDisabled) {
      this.setupStreamMemoryManagement(fileStream, objectName);
      return fileStream;
    }

    const decryptStream = this.createDecryptStream();
    const { PassThrough } = require("stream");
    const outputStream = new PassThrough(streamOptions);

    let isDestroyed = false;
    let memoryCheckInterval: NodeJS.Timeout;

    const cleanup = () => {
      if (isDestroyed) return;
      isDestroyed = true;

      if (memoryCheckInterval) {
        clearInterval(memoryCheckInterval);
      }

      try {
        if (fileStream && !fileStream.destroyed) {
          fileStream.destroy();
        }
        if (decryptStream && !decryptStream.destroyed) {
          decryptStream.destroy();
        }
        if (outputStream && !outputStream.destroyed) {
          outputStream.destroy();
        }
      } catch (error) {
        console.warn("Error during download stream cleanup:", error);
      }

      setImmediate(() => {
        if (global.gc) {
          global.gc();
        }
      });
    };

    memoryCheckInterval = setInterval(() => {
      const memUsage = process.memoryUsage();
      const memoryUsageMB = memUsage.heapUsed / 1024 / 1024;

      if (memoryUsageMB > 1024) {
        if (!fileStream.readableFlowing) return;

        console.warn(
          `[MEMORY THROTTLE] ${objectName} - Pausing stream due to high memory usage: ${memoryUsageMB.toFixed(2)}MB`
        );
        fileStream.pause();

        if (global.gc) {
          global.gc();
        }

        setTimeout(() => {
          if (!isDestroyed && fileStream && !fileStream.destroyed) {
            fileStream.resume();
            console.log(`[MEMORY THROTTLE] ${objectName} - Stream resumed`);
          }
        }, 100);
      }
    }, 1000);

    fileStream.on("error", (error: any) => {
      console.error("File stream error:", error);
      cleanup();
    });

    decryptStream.on("error", (error: any) => {
      console.error("Decrypt stream error:", error);
      cleanup();
    });

    outputStream.on("error", (error: any) => {
      console.error("Output stream error:", error);
      cleanup();
    });

    outputStream.on("close", cleanup);
    outputStream.on("finish", cleanup);

    outputStream.on("pipe", (src: any) => {
      if (src && src.on) {
        src.on("close", cleanup);
        src.on("error", cleanup);
      }
    });

    pipeline(fileStream, decryptStream, outputStream)
      .then(() => {})
      .catch((error: any) => {
        console.error("Pipeline error during download:", error);
        cleanup();
      });

    this.setupStreamMemoryManagement(outputStream, objectName);
    return outputStream;
  }

  private setupStreamMemoryManagement(stream: NodeJS.ReadableStream, objectName: string): void {
    let lastMemoryLog = 0;

    stream.on("data", () => {
      const now = Date.now();
      if (now - lastMemoryLog > 30000) {
        FilesystemStorageProvider.logMemoryUsage(`Active download: ${objectName}`);
        lastMemoryLog = now;
      }
    });

    stream.on("end", () => {
      FilesystemStorageProvider.logMemoryUsage(`Download completed: ${objectName}`);
      setImmediate(() => {
        if (global.gc) {
          global.gc();
        }
      });
    });

    stream.on("close", () => {
      FilesystemStorageProvider.logMemoryUsage(`Download closed: ${objectName}`);
    });
  }

  async createDownloadRangeStream(objectName: string, start: number, end: number): Promise<NodeJS.ReadableStream> {
    if (!this.isEncryptionDisabled) {
      return this.createRangeStreamFromDecrypted(objectName, start, end);
    }

    const filePath = this.getFilePath(objectName);
    return fsSync.createReadStream(filePath, { start, end });
  }

  private createRangeStreamFromDecrypted(objectName: string, start: number, end: number): NodeJS.ReadableStream {
    const { Transform, PassThrough } = require("stream");
    const filePath = this.getFilePath(objectName);
    const fileStream = fsSync.createReadStream(filePath);
    const decryptStream = this.createDecryptStream();
    const rangeStream = new PassThrough();

    let bytesRead = 0;
    let rangeEnded = false;
    let isDestroyed = false;

    const rangeTransform = new Transform({
      transform(chunk: Buffer, encoding: any, callback: any) {
        if (rangeEnded || isDestroyed) {
          callback();
          return;
        }

        const chunkStart = bytesRead;
        const chunkEnd = bytesRead + chunk.length - 1;
        bytesRead += chunk.length;

        if (chunkEnd < start) {
          callback();
          return;
        }

        if (chunkStart > end) {
          rangeEnded = true;
          this.end();
          callback();
          return;
        }

        let sliceStart = 0;
        let sliceEnd = chunk.length;

        if (chunkStart < start) {
          sliceStart = start - chunkStart;
        }

        if (chunkEnd > end) {
          sliceEnd = end - chunkStart + 1;
          rangeEnded = true;
        }

        const slicedChunk = chunk.slice(sliceStart, sliceEnd);
        this.push(slicedChunk);

        if (rangeEnded) {
          this.end();
        }

        callback();
      },

      flush(callback: any) {
        if (global.gc) {
          global.gc();
        }
        callback();
      },
    });

    const cleanup = () => {
      if (isDestroyed) return;
      isDestroyed = true;

      try {
        if (fileStream && !fileStream.destroyed) {
          fileStream.destroy();
        }
        if (decryptStream && !decryptStream.destroyed) {
          decryptStream.destroy();
        }
        if (rangeTransform && !rangeTransform.destroyed) {
          rangeTransform.destroy();
        }
        if (rangeStream && !rangeStream.destroyed) {
          rangeStream.destroy();
        }
      } catch (error) {
        console.warn("Error during stream cleanup:", error);
      }

      if (global.gc) {
        global.gc();
      }
    };

    fileStream.on("error", cleanup);
    decryptStream.on("error", cleanup);
    rangeTransform.on("error", cleanup);
    rangeStream.on("error", cleanup);

    rangeStream.on("close", cleanup);
    rangeStream.on("end", cleanup);

    rangeStream.on("pipe", (src: any) => {
      if (src && src.on) {
        src.on("close", cleanup);
        src.on("error", cleanup);
      }
    });

    fileStream.pipe(decryptStream).pipe(rangeTransform).pipe(rangeStream);

    return rangeStream;
  }

  private decryptFileBuffer(encryptedBuffer: Buffer): Buffer {
    const key = this.createEncryptionKey();
    const iv = encryptedBuffer.slice(0, 16);
    const encrypted = encryptedBuffer.slice(16);

    const decipher = crypto.createDecipheriv("aes-256-cbc", key, iv);

    return Buffer.concat([decipher.update(encrypted), decipher.final()]);
  }

  private decryptFileLegacy(encryptedBuffer: Buffer): Buffer {
    if (!this.encryptionKey) {
      throw new Error(
        "Encryption key is required when encryption is enabled. Please set ENCRYPTION_KEY environment variable."
      );
    }
    const CryptoJS = require("crypto-js");
    const decrypted = CryptoJS.AES.decrypt(encryptedBuffer.toString("utf8"), this.encryptionKey);
    return Buffer.from(decrypted.toString(CryptoJS.enc.Utf8), "base64");
  }

  static logMemoryUsage(context: string = "Unknown"): void {
    const memUsage = process.memoryUsage();
    const formatBytes = (bytes: number) => {
      const mb = bytes / 1024 / 1024;
      return `${mb.toFixed(2)} MB`;
    };

    const rssInMB = memUsage.rss / 1024 / 1024;
    const heapUsedInMB = memUsage.heapUsed / 1024 / 1024;

    if (rssInMB > 1024 || heapUsedInMB > 512) {
      console.warn(`[MEMORY WARNING] ${context} - High memory usage detected:`);
      console.warn(`  RSS: ${formatBytes(memUsage.rss)}`);
      console.warn(`  Heap Used: ${formatBytes(memUsage.heapUsed)}`);
      console.warn(`  Heap Total: ${formatBytes(memUsage.heapTotal)}`);
      console.warn(`  External: ${formatBytes(memUsage.external)}`);

      if (global.gc) {
        console.warn("  Forcing garbage collection...");
        global.gc();

        const afterGC = process.memoryUsage();
        console.warn(`  After GC - RSS: ${formatBytes(afterGC.rss)}, Heap: ${formatBytes(afterGC.heapUsed)}`);
      }
    } else {
      console.log(
        `[MEMORY INFO] ${context} - RSS: ${formatBytes(memUsage.rss)}, Heap: ${formatBytes(memUsage.heapUsed)}`
      );
    }
  }

  static forceGarbageCollection(context: string = "Manual"): void {
    if (global.gc) {
      const beforeGC = process.memoryUsage();
      global.gc();
      const afterGC = process.memoryUsage();

      const formatBytes = (bytes: number) => `${(bytes / 1024 / 1024).toFixed(2)} MB`;

      console.log(`[GC] ${context} - Before: RSS ${formatBytes(beforeGC.rss)}, Heap ${formatBytes(beforeGC.heapUsed)}`);
      console.log(`[GC] ${context} - After: RSS ${formatBytes(afterGC.rss)}, Heap ${formatBytes(afterGC.heapUsed)}`);

      const rssSaved = beforeGC.rss - afterGC.rss;
      const heapSaved = beforeGC.heapUsed - afterGC.heapUsed;

      if (rssSaved > 0 || heapSaved > 0) {
        console.log(`[GC] ${context} - Freed: RSS ${formatBytes(rssSaved)}, Heap ${formatBytes(heapSaved)}`);
      }
    } else {
      console.warn(`[GC] ${context} - Garbage collection not available. Start Node.js with --expose-gc flag.`);
    }
  }

  async fileExists(objectName: string): Promise<boolean> {
    const filePath = this.getFilePath(objectName);
    try {
      await fs.access(filePath);
      return true;
    } catch {
      return false;
    }
  }

  validateUploadToken(token: string): { objectName: string } | null {
    const data = this.uploadTokens.get(token);
    if (!data || Date.now() > data.expiresAt) {
      this.uploadTokens.delete(token);
      return null;
    }
    return { objectName: data.objectName };
  }

  validateDownloadToken(token: string): { objectName: string; fileName?: string } | null {
    const data = this.downloadTokens.get(token);

    if (!data) {
      return null;
    }

    const now = Date.now();

    if (now > data.expiresAt) {
      this.downloadTokens.delete(token);
      return null;
    }

    return { objectName: data.objectName, fileName: data.fileName };
  }

  consumeUploadToken(token: string): void {
    this.uploadTokens.delete(token);
  }

  consumeDownloadToken(token: string): void {
    this.downloadTokens.delete(token);
  }

  private async cleanupTempFile(tempPath: string): Promise<void> {
    try {
      await fs.unlink(tempPath);

      const tempDir = path.dirname(tempPath);
      try {
        const files = await fs.readdir(tempDir);
        if (files.length === 0) {
          await fs.rmdir(tempDir);
        }
      } catch (dirError: any) {
        if (dirError.code !== "ENOTEMPTY" && dirError.code !== "ENOENT") {
          console.warn("Warning: Could not remove temp directory:", dirError.message);
        }
      }
    } catch (cleanupError: any) {
      if (cleanupError.code !== "ENOENT") {
        console.error("Error deleting temp file:", cleanupError);
      }
    }
  }

  private async cleanupEmptyTempDirs(): Promise<void> {
    try {
      const tempUploadsDir = directoriesConfig.tempUploads;

      try {
        await fs.access(tempUploadsDir);
      } catch {
        return;
      }

      const items = await fs.readdir(tempUploadsDir);

      for (const item of items) {
        const itemPath = path.join(tempUploadsDir, item);

        try {
          const stat = await fs.stat(itemPath);

          if (stat.isDirectory()) {
            const dirContents = await fs.readdir(itemPath);
            if (dirContents.length === 0) {
              await fs.rmdir(itemPath);
              console.log(`🧹 Cleaned up empty temp directory: ${itemPath}`);
            }
          } else if (stat.isFile()) {
            const oneHourAgo = Date.now() - 60 * 60 * 1000;
            if (stat.mtime.getTime() < oneHourAgo) {
              await fs.unlink(itemPath);
              console.log(`🧹 Cleaned up stale temp file: ${itemPath}`);
            }
          }
        } catch (error: any) {
          if (error.code !== "ENOENT") {
            console.warn(`Warning: Could not process temp item ${itemPath}:`, error.message);
          }
        }
      }
    } catch (error) {
      console.error("Error during temp directory cleanup:", error);
    }
  }

  private async moveFile(src: string, dest: string): Promise<void> {
    try {
      await fs.rename(src, dest);
    } catch (err: any) {
      if (err.code === "EXDEV") {
        // cross-device: fallback to copy + delete
        await fs.copyFile(src, dest);
        await fs.unlink(src);
      } else {
        throw err;
      }
    }
  }
}
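The provider's on-disk format prefixes each encrypted file with the random 16-byte IV emitted by `createEncryptStream`; `createDecryptStream` and `decryptFileBuffer` strip it off before deciphering. A self-contained round-trip sketch of that IV-prefixed AES-256-CBC layout (buffer-based rather than streaming, for brevity; the key string and helper names are illustrative, not part of the codebase):

```typescript
import * as crypto from "crypto";

// Same key derivation the provider uses (scrypt with a static "salt" string).
const key = crypto.scryptSync("example-encryption-key", "salt", 32);

// Encrypt: emit the random IV first, then the ciphertext (the on-disk layout).
function encryptBuffer(plain: Buffer): Buffer {
  const iv = crypto.randomBytes(16);
  const cipher = crypto.createCipheriv("aes-256-cbc", key, iv);
  return Buffer.concat([iv, cipher.update(plain), cipher.final()]);
}

// Decrypt: read the IV from the first 16 bytes, decipher the rest.
function decryptBuffer(payload: Buffer): Buffer {
  const iv = payload.subarray(0, 16);
  const decipher = crypto.createDecipheriv("aes-256-cbc", key, iv);
  return Buffer.concat([decipher.update(payload.subarray(16)), decipher.final()]);
}

const original = Buffer.from("hello palmr");
const roundTrip = decryptBuffer(encryptBuffer(original));
console.log(roundTrip.equals(original)); // true
```

Because the IV travels with the file, no per-file state needs to be stored elsewhere; this is also why `downloadFile` treats any payload of 16 bytes or fewer as the legacy (crypto-js) format.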
@@ -1,12 +1,19 @@
import * as fs from "fs/promises";
import crypto from "node:crypto";
import path from "path";
import fastifyMultipart from "@fastify/multipart";
import fastifyStatic from "@fastify/static";

import { buildApp } from "./app";
import { directoriesConfig } from "./config/directories.config";
import { env } from "./env";
import { appRoutes } from "./modules/app/routes";
import { authProvidersRoutes } from "./modules/auth-providers/routes";
import { authRoutes } from "./modules/auth/routes";
import { bulkDownloadRoutes } from "./modules/bulk-download/routes";
import { fileRoutes } from "./modules/file/routes";
import { ChunkManager } from "./modules/filesystem/chunk-manager";
import { downloadQueueRoutes } from "./modules/filesystem/download-queue-routes";
import { filesystemRoutes } from "./modules/filesystem/routes";
import { folderRoutes } from "./modules/folder/routes";
import { healthRoutes } from "./modules/health/routes";
import { reverseShareRoutes } from "./modules/reverse-share/routes";
@@ -14,6 +21,7 @@ import { shareRoutes } from "./modules/share/routes";
import { storageRoutes } from "./modules/storage/routes";
import { twoFactorRoutes } from "./modules/two-factor/routes";
import { userRoutes } from "./modules/user/routes";
import { IS_RUNNING_IN_CONTAINER } from "./utils/container-detection";

if (typeof globalThis.crypto === "undefined") {
  globalThis.crypto = crypto.webcrypto as any;
@@ -23,9 +31,27 @@ if (typeof global.crypto === "undefined") {
  (global as any).crypto = crypto.webcrypto;
}

async function ensureDirectories() {
  const dirsToCreate = [
    { path: directoriesConfig.uploads, name: "uploads" },
    { path: directoriesConfig.tempUploads, name: "temp-uploads" },
  ];

  for (const dir of dirsToCreate) {
    try {
      await fs.access(dir.path);
    } catch {
      await fs.mkdir(dir.path, { recursive: true });
      console.log(`📁 Created ${dir.name} directory: ${dir.path}`);
    }
  }
}

async function startServer() {
  const app = await buildApp();

  await ensureDirectories();

  await app.register(fastifyMultipart, {
    limits: {
      fieldNameSize: 100,
@@ -37,19 +63,31 @@ async function startServer() {
    },
  });

  if (env.ENABLE_S3 !== "true") {
    await app.register(fastifyStatic, {
      root: directoriesConfig.uploads,
      prefix: "/uploads/",
      decorateReply: false,
    });
  }

  app.register(authRoutes);
  app.register(authProvidersRoutes, { prefix: "/auth" });
  app.register(twoFactorRoutes, { prefix: "/auth" });
  app.register(userRoutes);
  app.register(fileRoutes);
  app.register(folderRoutes);
  app.register(downloadQueueRoutes);
  app.register(shareRoutes);
  app.register(reverseShareRoutes);
  app.register(storageRoutes);
  app.register(bulkDownloadRoutes);
  app.register(appRoutes);
  app.register(healthRoutes);

  if (env.ENABLE_S3 !== "true") {
    app.register(filesystemRoutes);
  }

  await app.listen({
    port: 3333,
    host: "0.0.0.0",
@@ -66,16 +104,23 @@ async function startServer() {
  }

  console.log(`🌴 Palmr server running on port 3333 🌴`);
  console.log(
    `📦 Storage mode: ${env.ENABLE_S3 === "true" ? "S3" : `Local Filesystem ${env.DISABLE_FILESYSTEM_ENCRYPTION === "true" ? "(Unencrypted)" : "(Encrypted)"}`}`
  );
  console.log(`🔐 Auth Providers: ${authProviders}`);

  console.log("\n📚 API Documentation:");
  console.log(`  - API Reference: http://localhost:3333/docs\n`);

  process.on("SIGINT", async () => {
    const chunkManager = ChunkManager.getInstance();
    chunkManager.destroy();
    process.exit(0);
  });

  process.on("SIGTERM", async () => {
    const chunkManager = ChunkManager.getInstance();
    chunkManager.destroy();
    process.exit(0);
  });
}
52
apps/server/src/types/download-queue.ts
Normal file
@@ -0,0 +1,52 @@
/**
 * TypeScript interfaces for download queue management
 */

export interface QueuedDownloadInfo {
  downloadId: string;
  position: number;
  waitTime: number;
  fileName?: string;
  fileSize?: number;
}

export interface QueueStatus {
  queueLength: number;
  maxQueueSize: number;
  activeDownloads: number;
  maxConcurrent: number;
  queuedDownloads: QueuedDownloadInfo[];
}

export interface DownloadCancelResponse {
  message: string;
  downloadId: string;
}

export interface QueueClearResponse {
  message: string;
  clearedCount: number;
}

export interface ApiResponse<T = any> {
  status: "success" | "error";
  data?: T;
  error?: string;
  message?: string;
}

export interface QueueStatusResponse extends ApiResponse<QueueStatus> {
  status: "success";
  data: QueueStatus;
}

export interface DownloadSlotRequest {
  fileName?: string;
  fileSize?: number;
  objectName: string;
}

export interface ActiveDownloadInfo {
  startTime: number;
  memoryAtStart: number;
}
423
apps/server/src/utils/download-memory-manager.ts
Normal file
@@ -0,0 +1,423 @@
import { ActiveDownloadInfo, DownloadSlotRequest, QueuedDownloadInfo, QueueStatus } from "../types/download-queue";

interface QueuedDownload {
  downloadId: string;
  queuedAt: number;
  resolve: () => void;
  reject: (error: Error) => void;
  metadata?: DownloadSlotRequest;
}

export class DownloadMemoryManager {
  private static instance: DownloadMemoryManager;
  private activeDownloads = new Map<string, ActiveDownloadInfo>();
  private downloadQueue: QueuedDownload[] = [];
  private maxConcurrentDownloads: number;
  private memoryThresholdMB: number;
  private maxQueueSize: number;
  private cleanupInterval: NodeJS.Timeout;
  private isAutoScalingEnabled: boolean;
  private minFileSizeGB: number;

  private constructor() {
    const { env } = require("../env");

    const totalMemoryGB = require("os").totalmem() / 1024 ** 3;
    this.isAutoScalingEnabled = env.DOWNLOAD_AUTO_SCALE === "true";

    if (env.DOWNLOAD_MAX_CONCURRENT !== undefined) {
      this.maxConcurrentDownloads = env.DOWNLOAD_MAX_CONCURRENT;
    } else if (this.isAutoScalingEnabled) {
      this.maxConcurrentDownloads = this.calculateDefaultConcurrentDownloads(totalMemoryGB);
    } else {
      this.maxConcurrentDownloads = 3;
    }

    if (env.DOWNLOAD_MEMORY_THRESHOLD_MB !== undefined) {
      this.memoryThresholdMB = env.DOWNLOAD_MEMORY_THRESHOLD_MB;
    } else if (this.isAutoScalingEnabled) {
      this.memoryThresholdMB = this.calculateDefaultMemoryThreshold(totalMemoryGB);
    } else {
      this.memoryThresholdMB = 1024;
    }

    if (env.DOWNLOAD_QUEUE_SIZE !== undefined) {
      this.maxQueueSize = env.DOWNLOAD_QUEUE_SIZE;
    } else if (this.isAutoScalingEnabled) {
      this.maxQueueSize = this.calculateDefaultQueueSize(totalMemoryGB);
    } else {
      this.maxQueueSize = 15;
    }

    if (env.DOWNLOAD_MIN_FILE_SIZE_GB !== undefined) {
      this.minFileSizeGB = env.DOWNLOAD_MIN_FILE_SIZE_GB;
    } else {
      this.minFileSizeGB = 3.0;
    }

    this.validateConfiguration();

    console.log(`[DOWNLOAD MANAGER] Configuration loaded:`);
    console.log(`  System Memory: ${totalMemoryGB.toFixed(1)}GB`);
    console.log(
      `  Max Concurrent: ${this.maxConcurrentDownloads} ${env.DOWNLOAD_MAX_CONCURRENT !== undefined ? "(ENV)" : "(AUTO)"}`
    );
    console.log(
      `  Memory Threshold: ${this.memoryThresholdMB}MB ${env.DOWNLOAD_MEMORY_THRESHOLD_MB !== undefined ? "(ENV)" : "(AUTO)"}`
    );
    console.log(`  Queue Size: ${this.maxQueueSize} ${env.DOWNLOAD_QUEUE_SIZE !== undefined ? "(ENV)" : "(AUTO)"}`);
    console.log(
      `  Min File Size: ${this.minFileSizeGB}GB ${env.DOWNLOAD_MIN_FILE_SIZE_GB !== undefined ? "(ENV)" : "(DEFAULT)"}`
    );
    console.log(`  Auto-scaling: ${this.isAutoScalingEnabled ? "enabled" : "disabled"}`);

    this.cleanupInterval = setInterval(() => {
      this.cleanupStaleDownloads();
    }, 30000);
  }

  public static getInstance(): DownloadMemoryManager {
    if (!DownloadMemoryManager.instance) {
      DownloadMemoryManager.instance = new DownloadMemoryManager();
    }
    return DownloadMemoryManager.instance;
  }

  private calculateDefaultConcurrentDownloads(totalMemoryGB: number): number {
    if (totalMemoryGB > 16) return 10;
    if (totalMemoryGB > 8) return 5;
    if (totalMemoryGB > 4) return 3;
    if (totalMemoryGB > 2) return 2;
    return 1;
  }

  private calculateDefaultMemoryThreshold(totalMemoryGB: number): number {
    if (totalMemoryGB > 16) return 4096; // 4GB
    if (totalMemoryGB > 8) return 2048; // 2GB
    if (totalMemoryGB > 4) return 1024; // 1GB
    if (totalMemoryGB > 2) return 512; // 512MB
    return 256; // 256MB
  }

  private calculateDefaultQueueSize(totalMemoryGB: number): number {
    if (totalMemoryGB > 16) return 50; // Large queue for powerful servers
    if (totalMemoryGB > 8) return 25; // Medium queue
    if (totalMemoryGB > 4) return 15; // Small queue
    if (totalMemoryGB > 2) return 10; // Very small queue
    return 5; // Minimal queue
  }

  private validateConfiguration(): void {
    const warnings: string[] = [];
    const errors: string[] = [];

    if (this.maxConcurrentDownloads < 1) {
      errors.push(`DOWNLOAD_MAX_CONCURRENT must be >= 1, got: ${this.maxConcurrentDownloads}`);
    }
    if (this.maxConcurrentDownloads > 50) {
      warnings.push(
        `DOWNLOAD_MAX_CONCURRENT is very high (${this.maxConcurrentDownloads}), this may cause performance issues`
      );
    }

    if (this.memoryThresholdMB < 128) {
      warnings.push(
        `DOWNLOAD_MEMORY_THRESHOLD_MB is very low (${this.memoryThresholdMB}MB), downloads may be throttled frequently`
      );
    }
    if (this.memoryThresholdMB > 16384) {
      warnings.push(
        `DOWNLOAD_MEMORY_THRESHOLD_MB is very high (${this.memoryThresholdMB}MB), system may run out of memory`
      );
    }

    if (this.maxQueueSize < 1) {
      errors.push(`DOWNLOAD_QUEUE_SIZE must be >= 1, got: ${this.maxQueueSize}`);
    }
    if (this.maxQueueSize > 1000) {
      warnings.push(`DOWNLOAD_QUEUE_SIZE is very high (${this.maxQueueSize}), this may consume significant memory`);
    }

    if (this.minFileSizeGB < 0.1) {
      warnings.push(
        `DOWNLOAD_MIN_FILE_SIZE_GB is very low (${this.minFileSizeGB}GB), most downloads will use memory management`
      );
    }
    if (this.minFileSizeGB > 50) {
      warnings.push(
        `DOWNLOAD_MIN_FILE_SIZE_GB is very high (${this.minFileSizeGB}GB), memory management may rarely activate`
      );
    }

    const recommendedQueueSize = this.maxConcurrentDownloads * 5;
    if (this.maxQueueSize < this.maxConcurrentDownloads) {
      warnings.push(
        `DOWNLOAD_QUEUE_SIZE (${this.maxQueueSize}) is smaller than DOWNLOAD_MAX_CONCURRENT (${this.maxConcurrentDownloads})`
      );
    } else if (this.maxQueueSize < recommendedQueueSize) {
      warnings.push(
        `DOWNLOAD_QUEUE_SIZE (${this.maxQueueSize}) might be too small. Recommended: ${recommendedQueueSize} (5x concurrent downloads)`
      );
    }

    if (warnings.length > 0) {
      console.warn(`[DOWNLOAD MANAGER] Configuration warnings:`);
      warnings.forEach((warning) => console.warn(`  - ${warning}`));
    }

    if (errors.length > 0) {
      console.error(`[DOWNLOAD MANAGER] Configuration errors:`);
      errors.forEach((error) => console.error(`  - ${error}`));
      throw new Error(`Invalid download manager configuration: ${errors.join(", ")}`);
    }
  }

  public async requestDownloadSlot(downloadId: string, metadata?: DownloadSlotRequest): Promise<void> {
    if (metadata?.fileSize) {
      const fileSizeGB = metadata.fileSize / 1024 ** 3;
      if (fileSizeGB < this.minFileSizeGB) {
        console.log(
          `[DOWNLOAD MANAGER] File ${metadata.fileName || "unknown"} (${fileSizeGB.toFixed(2)}GB) below threshold (${this.minFileSizeGB}GB), bypassing queue`
        );
        return Promise.resolve();
      }
    }

    if (this.canStartImmediately()) {
      console.log(`[DOWNLOAD MANAGER] Immediate start: ${downloadId}`);
      return Promise.resolve();
    }

    if (this.downloadQueue.length >= this.maxQueueSize) {
      const error = new Error(`Download queue is full: ${this.downloadQueue.length}/${this.maxQueueSize}`);
      throw error;
    }

    return new Promise<void>((resolve, reject) => {
      const queuedDownload: QueuedDownload = {
        downloadId,
        queuedAt: Date.now(),
        resolve,
        reject,
        metadata,
      };

      this.downloadQueue.push(queuedDownload);

      const position = this.downloadQueue.length;
      console.log(`[DOWNLOAD MANAGER] Queued: ${downloadId} (Position: ${position}/${this.maxQueueSize})`);

      if (metadata?.fileName && metadata?.fileSize) {
        const sizeMB = (metadata.fileSize / (1024 * 1024)).toFixed(1);
        console.log(`[DOWNLOAD MANAGER] Queued file: ${metadata.fileName} (${sizeMB}MB)`);
      }
    });
  }

  private canStartImmediately(): boolean {
    const currentMemoryMB = this.getCurrentMemoryUsage();

    if (currentMemoryMB > this.memoryThresholdMB) {
      return false;
    }

    if (this.activeDownloads.size >= this.maxConcurrentDownloads) {
      return false;
    }

    return true;
  }

  public canStartDownload(): { allowed: boolean; reason?: string } {
    if (this.canStartImmediately()) {
      return { allowed: true };
    }

    const currentMemoryMB = this.getCurrentMemoryUsage();

    if (currentMemoryMB > this.memoryThresholdMB) {
      return {
        allowed: false,
        reason: `Memory usage too high: ${currentMemoryMB.toFixed(0)}MB > ${this.memoryThresholdMB}MB`,
      };
    }

    return {
      allowed: false,
      reason: `Too many concurrent downloads: ${this.activeDownloads.size}/${this.maxConcurrentDownloads}`,
    };
  }

  public startDownload(downloadId: string): void {
    const memUsage = process.memoryUsage();
    this.activeDownloads.set(downloadId, {
      startTime: Date.now(),
      memoryAtStart: memUsage.rss + memUsage.external,
    });

    console.log(
      `[DOWNLOAD MANAGER] Started: ${downloadId} (${this.activeDownloads.size}/${this.maxConcurrentDownloads} active)`
    );
  }

  public endDownload(downloadId: string): void {
    const downloadInfo = this.activeDownloads.get(downloadId);
    this.activeDownloads.delete(downloadId);

    if (downloadInfo) {
      const duration = Date.now() - downloadInfo.startTime;
      const memUsage = process.memoryUsage();
      const currentMemory = memUsage.rss + memUsage.external;
      const memoryDiff = currentMemory - downloadInfo.memoryAtStart;

      console.log(
        `[DOWNLOAD MANAGER] Ended: ${downloadId} (Duration: ${(duration / 1000).toFixed(1)}s, Memory delta: ${(memoryDiff / 1024 / 1024).toFixed(1)}MB)`
      );

      if (memoryDiff > 100 * 1024 * 1024 && global.gc) {
        setImmediate(() => {
          global.gc!();
          console.log(`[DOWNLOAD MANAGER] Forced GC after download ${downloadId}`);
        });
      }
    }

    this.processQueue();
  }

  private processQueue(): void {
    if (this.downloadQueue.length === 0 || !this.canStartImmediately()) {
      return;
    }

    const nextDownload = this.downloadQueue.shift();
    if (!nextDownload) {
      return;
    }

    console.log(
      `[DOWNLOAD MANAGER] Processing queue: ${nextDownload.downloadId} (${this.downloadQueue.length} remaining)`
    );

    if (nextDownload.metadata?.fileName && nextDownload.metadata?.fileSize) {
      const sizeMB = (nextDownload.metadata.fileSize / (1024 * 1024)).toFixed(1);
      console.log(`[DOWNLOAD MANAGER] Starting queued file: ${nextDownload.metadata.fileName} (${sizeMB}MB)`);
    }

    nextDownload.resolve();
  }

  public getActiveDownloadsCount(): number {
    return this.activeDownloads.size;
  }

  private getCurrentMemoryUsage(): number {
    const usage = process.memoryUsage();
    return (usage.rss + usage.external) / (1024 * 1024);
  }

  public getCurrentMemoryUsageMB(): number {
    return this.getCurrentMemoryUsage();
  }

  public getQueueStatus(): QueueStatus {
    return {
      queueLength: this.downloadQueue.length,
      maxQueueSize: this.maxQueueSize,
      activeDownloads: this.activeDownloads.size,
      maxConcurrent: this.maxConcurrentDownloads,
      queuedDownloads: this.downloadQueue.map((download, index) => ({
        downloadId: download.downloadId,
        position: index + 1,
        waitTime: Date.now() - download.queuedAt,
        fileName: download.metadata?.fileName,
        fileSize: download.metadata?.fileSize,
      })),
    };
  }

  public cancelQueuedDownload(downloadId: string): boolean {
    const index = this.downloadQueue.findIndex((item) => item.downloadId === downloadId);

    if (index === -1) {
      return false;
    }

    const canceledDownload = this.downloadQueue.splice(index, 1)[0];
    canceledDownload.reject(new Error(`Download ${downloadId} was cancelled`));

    console.log(`[DOWNLOAD MANAGER] Cancelled queued download: ${downloadId} (was at position ${index + 1})`);
    return true;
  }

  private cleanupStaleDownloads(): void {
    const now = Date.now();
    const staleThreshold = 10 * 60 * 1000; // 10 minutes
    const queueStaleThreshold = 30 * 60 * 1000;

    for (const [downloadId, info] of this.activeDownloads.entries()) {
      if (now - info.startTime > staleThreshold) {
        console.warn(`[DOWNLOAD MANAGER] Cleaning up stale active download: ${downloadId}`);
        this.activeDownloads.delete(downloadId);
      }
    }

    const initialQueueLength = this.downloadQueue.length;
    this.downloadQueue = this.downloadQueue.filter((download) => {
      if (now - download.queuedAt > queueStaleThreshold) {
        console.warn(`[DOWNLOAD MANAGER] Cleaning up stale queued download: ${download.downloadId}`);
        download.reject(new Error(`Download ${download.downloadId} timed out in queue`));
        return false;
      }
      return true;
    });

    if (this.downloadQueue.length < initialQueueLength) {
      console.log(
        `[DOWNLOAD MANAGER] Cleaned up ${initialQueueLength - this.downloadQueue.length} stale queued downloads`
      );
    }

    this.processQueue();
  }

  public shouldThrottleStream(): boolean {
    const currentMemoryMB = this.getCurrentMemoryUsageMB();
    return currentMemoryMB > this.memoryThresholdMB * 0.8;
  }

  public getThrottleDelay(): number {
    const currentMemoryMB = this.getCurrentMemoryUsageMB();
    const thresholdRatio = currentMemoryMB / this.memoryThresholdMB;

    if (thresholdRatio > 0.9) return 200;
    if (thresholdRatio > 0.8) return 100;
    return 50;
  }

  public destroy(): void {
    if (this.cleanupInterval) {
      clearInterval(this.cleanupInterval);
    }

    this.downloadQueue.forEach((download) => {
      download.reject(new Error("Download manager is shutting down"));
    });

    this.activeDownloads.clear();
    this.downloadQueue = [];
    console.log("[DOWNLOAD MANAGER] Shutdown completed");
  }

  public clearQueue(): number {
    const clearedCount = this.downloadQueue.length;

    this.downloadQueue.forEach((download) => {
      download.reject(new Error("Queue was cleared by administrator"));
    });

    this.downloadQueue = [];
    console.log(`[DOWNLOAD MANAGER] Cleared queue: ${clearedCount} downloads cancelled`);
    return clearedCount;
  }
}
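The core of the manager above is the slot-gating pattern: `requestDownloadSlot` resolves immediately while capacity is free, otherwise the caller waits in a FIFO queue until `endDownload` frees a slot and `processQueue` wakes the oldest waiter. The following is a minimal, self-contained sketch of that pattern, not Palmr's actual implementation — `DownloadSemaphore` and its method names are illustrative stand-ins that omit the memory-threshold and file-size checks:

```typescript
// Simplified stand-in for the slot-gating done by DownloadMemoryManager:
// at most `maxConcurrent` downloads run at once; later requests wait in a
// FIFO queue until a running download releases its slot.
class DownloadSemaphore {
  private active = 0;
  private queue: Array<() => void> = [];

  constructor(private maxConcurrent: number) {}

  // Resolves immediately if a slot is free; otherwise parks the caller.
  async acquire(): Promise<void> {
    if (this.active < this.maxConcurrent) {
      this.active++;
      return;
    }
    await new Promise<void>((resolve) => this.queue.push(resolve));
    this.active++;
  }

  // Frees the slot and wakes the oldest queued caller, if any.
  release(): void {
    this.active--;
    this.queue.shift()?.();
  }
}

async function run(): Promise<string[]> {
  const semaphore = new DownloadSemaphore(2); // allow 2 concurrent downloads
  const log: string[] = [];

  const download = async (id: string) => {
    await semaphore.acquire();
    log.push(`start:${id}`);
    await new Promise((resolve) => setTimeout(resolve, 10)); // simulate streaming
    log.push(`end:${id}`);
    semaphore.release();
  };

  // Three downloads, but only two slots: "c" waits until "a" or "b" finishes.
  await Promise.all(["a", "b", "c"].map(download));
  return log;
}

run().then((log) => console.log(log.join(" ")));
```

The real manager adds a memory check on top of the concurrency check, which is why its `processQueue` can decline to wake a waiter even when a slot count is free.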
@@ -1,2 +1,5 @@
 API_BASE_URL=http://localhost:3333
 NEXT_PUBLIC_DEFAULT_LANGUAGE=en-US
+
+# Configuration options
+NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB=
@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "لوحة التحكم"
   },
+  "notifications": {
+    "permissionGranted": "تم تمكين إشعارات التنزيل",
+    "permissionDenied": "تم تعطيل إشعارات التنزيل",
+    "downloadComplete": {
+      "title": "اكتمل التنزيل",
+      "body": "اكتمل تنزيل {fileName}"
+    },
+    "downloadFailed": {
+      "title": "فشل التنزيل",
+      "body": "فشل تنزيل {fileName}: {error}",
+      "unknownError": "خطأ غير معروف"
+    },
+    "queueProcessing": {
+      "title": "بدء التنزيل",
+      "body": "يتم الآن تنزيل {fileName}{position}",
+      "position": " (كان #{position} في قائمة الانتظار)"
+    }
+  },
   "profile": {
     "password": {
       "title": "تغيير كلمة المرور",
@@ -1906,4 +1924,4 @@
     "nameRequired": "الاسم مطلوب",
     "required": "هذا الحقل مطلوب"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "Übersicht"
   },
+  "notifications": {
+    "permissionGranted": "Download-Benachrichtigungen aktiviert",
+    "permissionDenied": "Download-Benachrichtigungen deaktiviert",
+    "downloadComplete": {
+      "title": "Download abgeschlossen",
+      "body": "{fileName} wurde erfolgreich heruntergeladen"
+    },
+    "downloadFailed": {
+      "title": "Download fehlgeschlagen",
+      "body": "Fehler beim Herunterladen von {fileName}: {error}",
+      "unknownError": "Unbekannter Fehler"
+    },
+    "queueProcessing": {
+      "title": "Download startet",
+      "body": "{fileName} wird jetzt heruntergeladen{position}",
+      "position": " (war #{position} in der Warteschlange)"
+    }
+  },
   "profile": {
     "password": {
       "title": "Passwort ändern",
@@ -1904,4 +1922,4 @@
     "nameRequired": "Name ist erforderlich",
     "required": "Dieses Feld ist erforderlich"
   }
 }

@@ -125,10 +125,7 @@
     "zipNameLabel": "ZIP file name",
     "zipNamePlaceholder": "Enter file name",
     "description": "{count, plural, =1 {1 file will be compressed} other {# files will be compressed}}",
-    "download": "Download ZIP",
-    "creatingZip": "Creating ZIP file...",
-    "zipCreated": "ZIP file created successfully, download started",
-    "zipError": "Failed to create ZIP file"
+    "download": "Download ZIP"
   },
   "common": {
     "loading": "Loading, please wait...",
@@ -553,6 +550,24 @@
   "navigation": {
     "dashboard": "Dashboard"
   },
+  "notifications": {
+    "permissionGranted": "Download notifications enabled",
+    "permissionDenied": "Download notifications disabled",
+    "downloadComplete": {
+      "title": "Download Complete",
+      "body": "{fileName} has finished downloading"
+    },
+    "downloadFailed": {
+      "title": "Download Failed",
+      "body": "Failed to download {fileName}: {error}",
+      "unknownError": "Unknown error"
+    },
+    "queueProcessing": {
+      "title": "Download Starting",
+      "body": "{fileName} is now downloading{position}",
+      "position": " (was #{position} in queue)"
+    }
+  },
   "profile": {
     "password": {
       "title": "Change Password",
@@ -1872,4 +1887,4 @@
     "nameRequired": "Name is required",
     "required": "This field is required"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "Panel de control"
   },
+  "notifications": {
+    "permissionGranted": "Notificaciones de descarga habilitadas",
+    "permissionDenied": "Notificaciones de descarga deshabilitadas",
+    "downloadComplete": {
+      "title": "Descarga Completada",
+      "body": "{fileName} ha terminado de descargarse"
+    },
+    "downloadFailed": {
+      "title": "Descarga Fallida",
+      "body": "Error al descargar {fileName}: {error}",
+      "unknownError": "Error desconocido"
+    },
+    "queueProcessing": {
+      "title": "Descarga Iniciando",
+      "body": "{fileName} está descargándose ahora{position}",
+      "position": " (estaba en posición #{position} en la cola)"
+    }
+  },
   "profile": {
     "password": {
       "title": "Cambiar contraseña",
@@ -1904,4 +1922,4 @@
     "nameRequired": "El nombre es obligatorio",
     "required": "Este campo es obligatorio"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "Tableau de bord"
   },
+  "notifications": {
+    "permissionGranted": "Notifications de téléchargement activées",
+    "permissionDenied": "Notifications de téléchargement désactivées",
+    "downloadComplete": {
+      "title": "Téléchargement Terminé",
+      "body": "{fileName} a fini de télécharger"
+    },
+    "downloadFailed": {
+      "title": "Échec du Téléchargement",
+      "body": "Échec du téléchargement de {fileName} : {error}",
+      "unknownError": "Erreur inconnue"
+    },
+    "queueProcessing": {
+      "title": "Démarrage du Téléchargement",
+      "body": "{fileName} est en cours de téléchargement{position}",
+      "position": " (était n°{position} dans la file d'attente)"
+    }
+  },
   "profile": {
     "password": {
       "title": "Changer le Mot de Passe",
@@ -1904,4 +1922,4 @@
     "nameRequired": "Nome é obrigatório",
     "required": "Este campo é obrigatório"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "डैशबोर्ड"
   },
+  "notifications": {
+    "permissionGranted": "डाउनलोड सूचनाएं सक्षम की गईं",
+    "permissionDenied": "डाउनलोड सूचनाएं अक्षम की गईं",
+    "downloadComplete": {
+      "title": "डाउनलोड पूर्ण",
+      "body": "{fileName} का डाउनलोड समाप्त हो गया है"
+    },
+    "downloadFailed": {
+      "title": "डाउनलोड विफल",
+      "body": "{fileName} डाउनलोड करने में विफल: {error}",
+      "unknownError": "अज्ञात त्रुटि"
+    },
+    "queueProcessing": {
+      "title": "डाउनलोड प्रारंभ",
+      "body": "{fileName} अब डाउनलोड हो रहा है{position}",
+      "position": " (कतार में #{position} था)"
+    }
+  },
   "profile": {
     "password": {
       "title": "पासवर्ड बदलें",
@@ -1904,4 +1922,4 @@
     "nameRequired": "नाम आवश्यक है",
     "required": "यह फ़ील्ड आवश्यक है"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "Pannello di controllo"
   },
+  "notifications": {
+    "permissionGranted": "Notifiche download abilitate",
+    "permissionDenied": "Notifiche download disabilitate",
+    "downloadComplete": {
+      "title": "Download Completato",
+      "body": "Il download di {fileName} è terminato"
+    },
+    "downloadFailed": {
+      "title": "Download Fallito",
+      "body": "Impossibile scaricare {fileName}: {error}",
+      "unknownError": "Errore sconosciuto"
+    },
+    "queueProcessing": {
+      "title": "Download in Avvio",
+      "body": "{fileName} sta ora scaricando{position}",
+      "position": " (era #{position} in coda)"
+    }
+  },
   "profile": {
     "password": {
       "title": "Cambia Parola d'accesso",
@@ -1904,4 +1922,4 @@
     "nameRequired": "Il nome è obbligatorio",
     "required": "Questo campo è obbligatorio"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "ダッシュボード"
   },
+  "notifications": {
+    "permissionGranted": "ダウンロード通知が有効になりました",
+    "permissionDenied": "ダウンロード通知が無効になりました",
+    "downloadComplete": {
+      "title": "ダウンロード完了",
+      "body": "{fileName}のダウンロードが完了しました"
+    },
+    "downloadFailed": {
+      "title": "ダウンロード失敗",
+      "body": "{fileName}のダウンロードに失敗: {error}",
+      "unknownError": "不明なエラー"
+    },
+    "queueProcessing": {
+      "title": "ダウンロード開始",
+      "body": "{fileName}のダウンロードを開始しています{position}",
+      "position": "(キュー内{position}番目)"
+    }
+  },
   "profile": {
     "password": {
       "title": "パスワードを変更",
@@ -1904,4 +1922,4 @@
     "nameRequired": "名前は必須です",
     "required": "このフィールドは必須です"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "대시보드"
   },
+  "notifications": {
+    "permissionGranted": "다운로드 알림이 활성화되었습니다",
+    "permissionDenied": "다운로드 알림이 비활성화되었습니다",
+    "downloadComplete": {
+      "title": "다운로드 완료",
+      "body": "{fileName} 다운로드가 완료되었습니다"
+    },
+    "downloadFailed": {
+      "title": "다운로드 실패",
+      "body": "{fileName} 다운로드 실패: {error}",
+      "unknownError": "알 수 없는 오류"
+    },
+    "queueProcessing": {
+      "title": "다운로드 시작",
+      "body": "{fileName} 다운로드가 시작되었습니다{position}",
+      "position": " (대기열 #{position}번이었음)"
+    }
+  },
   "profile": {
     "password": {
       "title": "비밀번호 변경",
@@ -1904,4 +1922,4 @@
     "nameRequired": "이름은 필수입니다",
     "required": "이 필드는 필수입니다"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "Controlepaneel"
   },
+  "notifications": {
+    "permissionGranted": "Download meldingen ingeschakeld",
+    "permissionDenied": "Download meldingen uitgeschakeld",
+    "downloadComplete": {
+      "title": "Download Voltooid",
+      "body": "{fileName} is klaar met downloaden"
+    },
+    "downloadFailed": {
+      "title": "Download Mislukt",
+      "body": "Downloaden van {fileName} mislukt: {error}",
+      "unknownError": "Onbekende fout"
+    },
+    "queueProcessing": {
+      "title": "Download Start",
+      "body": "{fileName} wordt nu gedownload{position}",
+      "position": " (was #{position} in wachtrij)"
+    }
+  },
   "profile": {
     "password": {
       "title": "Wachtwoord Wijzigen",
@@ -1904,4 +1922,4 @@
     "nameRequired": "Naam is verplicht",
     "required": "Dit veld is verplicht"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "Panel główny"
   },
+  "notifications": {
+    "permissionGranted": "Powiadomienia o pobieraniu włączone",
+    "permissionDenied": "Powiadomienia o pobieraniu wyłączone",
+    "downloadComplete": {
+      "title": "Pobieranie zakończone",
+      "body": "Plik {fileName} został pobrany"
+    },
+    "downloadFailed": {
+      "title": "Błąd pobierania",
+      "body": "Nie udało się pobrać pliku {fileName}: {error}",
+      "unknownError": "Nieznany błąd"
+    },
+    "queueProcessing": {
+      "title": "Rozpoczęcie pobierania",
+      "body": "Trwa pobieranie pliku {fileName}{position}",
+      "position": " (był #{position} w kolejce)"
+    }
+  },
   "profile": {
     "password": {
       "title": "Zmień hasło",
@@ -1904,4 +1922,4 @@
     "nameRequired": "Nazwa jest wymagana",
     "required": "To pole jest wymagane"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "Painel"
   },
+  "notifications": {
+    "permissionGranted": "Notificações de download ativadas",
+    "permissionDenied": "Notificações de download desativadas",
+    "downloadComplete": {
+      "title": "Download Concluído",
+      "body": "{fileName} terminou de baixar"
+    },
+    "downloadFailed": {
+      "title": "Download Falhou",
+      "body": "Falha ao baixar {fileName}: {error}",
+      "unknownError": "Erro desconhecido"
+    },
+    "queueProcessing": {
+      "title": "Download Iniciando",
+      "body": "{fileName} está sendo baixado agora{position}",
+      "position": " (estava na posição #{position} da fila)"
+    }
+  },
   "profile": {
     "password": {
       "title": "Alterar Senha",
@@ -1905,4 +1923,4 @@
     "usernameLength": "O nome de usuário deve ter pelo menos 3 caracteres",
     "usernameSpaces": "O nome de usuário não pode conter espaços"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "Панель управления"
   },
+  "notifications": {
+    "permissionGranted": "Уведомления о загрузках включены",
+    "permissionDenied": "Уведомления о загрузках отключены",
+    "downloadComplete": {
+      "title": "Загрузка завершена",
+      "body": "Файл {fileName} успешно загружен"
+    },
+    "downloadFailed": {
+      "title": "Ошибка загрузки",
+      "body": "Не удалось загрузить {fileName}: {error}",
+      "unknownError": "Неизвестная ошибка"
+    },
+    "queueProcessing": {
+      "title": "Начало загрузки",
+      "body": "Файл {fileName} загружается{position}",
+      "position": " (был №{position} в очереди)"
+    }
+  },
   "profile": {
     "password": {
       "title": "Изменить пароль",
@@ -1904,4 +1922,4 @@
     "nameRequired": "Требуется имя",
     "required": "Это поле обязательно"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "Gösterge Paneli"
   },
+  "notifications": {
+    "permissionGranted": "İndirme bildirimleri etkinleştirildi",
+    "permissionDenied": "İndirme bildirimleri devre dışı bırakıldı",
+    "downloadComplete": {
+      "title": "İndirme Tamamlandı",
+      "body": "{fileName} indirmesi tamamlandı"
+    },
+    "downloadFailed": {
+      "title": "İndirme Başarısız",
+      "body": "{fileName} indirilemedi: {error}",
+      "unknownError": "Bilinmeyen hata"
+    },
+    "queueProcessing": {
+      "title": "İndirme Başlıyor",
+      "body": "{fileName} şimdi indiriliyor{position}",
+      "position": " (kuyrukta #{position} sıradaydı)"
+    }
+  },
   "profile": {
     "password": {
       "title": "Şifreyi Değiştir",
@@ -1904,4 +1922,4 @@
     "nameRequired": "İsim gereklidir",
     "required": "Bu alan zorunludur"
   }
 }

@@ -550,6 +550,24 @@
   "navigation": {
     "dashboard": "仪表盘"
   },
+  "notifications": {
+    "permissionGranted": "下载通知已启用",
+    "permissionDenied": "下载通知已禁用",
+    "downloadComplete": {
+      "title": "下载完成",
+      "body": "{fileName} 已下载完成"
+    },
+    "downloadFailed": {
+      "title": "下载失败",
+      "body": "下载 {fileName} 失败:{error}",
+      "unknownError": "未知错误"
+    },
+    "queueProcessing": {
+      "title": "开始下载",
+      "body": "{fileName} 正在下载{position}",
+      "position": "(队列中第 {position} 位)"
+    }
+  },
   "profile": {
     "password": {
       "title": "修改密码",
@@ -1666,11 +1684,7 @@
     "copyToClipboard": "复制到剪贴板",
     "savedMessage": "我已保存备用码",
     "available": "可用备用码:{count}个",
-    "instructions": [
-      "• 将这些代码保存在安全的位置",
-      "• 每个备用码只能使用一次",
-      "• 您可以随时生成新的备用码"
-    ]
+    "instructions": ["• 将这些代码保存在安全的位置", "• 每个备用码只能使用一次", "• 您可以随时生成新的备用码"]
   },
   "verification": {
     "title": "双重认证",
@@ -1904,4 +1918,4 @@
     "nameRequired": "名称为必填项",
     "required": "此字段为必填项"
   }
 }

@@ -1,6 +1,6 @@
 {
   "name": "palmr-web",
-  "version": "3.2.1-beta",
+  "version": "3.2.3-beta",
   "description": "Frontend for Palmr",
   "private": true,
   "author": "Daniel Luiz Alves <daniel@kyantech.com.br>",
@@ -100,4 +100,4 @@
     "tailwindcss": "4.1.11",
     "typescript": "5.8.3"
   }
 }

@@ -38,14 +38,13 @@ import {
|
||||
import { Input } from "@/components/ui/input";
|
||||
import { Separator } from "@/components/ui/separator";
|
||||
import { Table, TableBody, TableCell, TableHead, TableHeader, TableRow } from "@/components/ui/table";
|
||||
import { bulkDownloadReverseShareFiles } from "@/http/endpoints/bulk-download";
|
||||
import {
|
||||
copyReverseShareFileToUserFiles,
|
||||
deleteReverseShareFile,
|
||||
downloadReverseShareFile,
|
||||
updateReverseShareFile,
|
||||
} from "@/http/endpoints/reverse-shares";
|
||||
import type { ReverseShareFile } from "@/http/endpoints/reverse-shares/types";
import { bulkDownloadWithQueue, downloadReverseShareWithQueue } from "@/utils/download-queue-utils";
import { getFileIcon } from "@/utils/file-icons";
import { truncateFileName } from "@/utils/file-utils";
import { ReverseShare } from "../hooks/use-reverse-shares";

@@ -472,20 +471,13 @@ export function ReceivedFilesModal({

  const handleDownload = async (file: ReverseShareFile) => {
    try {
-     const response = await downloadReverseShareFile(file.id);
-
-     // Direct S3 download
-     const link = document.createElement("a");
-     link.href = response.data.url;
-     link.download = file.name;
-     document.body.appendChild(link);
-     link.click();
-     document.body.removeChild(link);
-
-     toast.success(t("reverseShares.modals.receivedFiles.downloadSuccess"));
+     await downloadReverseShareWithQueue(file.id, file.name, {
+       onComplete: () => toast.success(t("reverseShares.modals.receivedFiles.downloadSuccess")),
+       onFail: () => toast.error(t("reverseShares.modals.receivedFiles.downloadError")),
+     });
    } catch (error) {
-     console.error("Download error:", error);
-     toast.error(t("reverseShares.modals.receivedFiles.downloadError"));
+     // Error already handled in downloadReverseShareWithQueue
    }
  };

@@ -610,31 +602,23 @@ export function ReceivedFilesModal({
    try {
      const zipName = `${reverseShare.name || t("reverseShares.defaultLinkName")}_files.zip`;

-     const fileIds = selectedFileObjects.map((file) => file.id);
-
-     // Show creating ZIP toast
-     const creatingToast = toast.loading(t("bulkDownload.creatingZip"));
-
-     const blob = await bulkDownloadReverseShareFiles({
-       fileIds,
-       zipName,
-     });
-
-     // Update toast to success
-     toast.dismiss(creatingToast);
-     toast.success(t("bulkDownload.zipCreated"));
-
-     // Create download link
-     const url = window.URL.createObjectURL(blob);
-     const link = document.createElement("a");
-     link.href = url;
-     link.download = zipName;
-     document.body.appendChild(link);
-     link.click();
-     document.body.removeChild(link);
-     window.URL.revokeObjectURL(url);
-
-     setSelectedFiles(new Set());
+     toast.promise(
+       bulkDownloadWithQueue(
+         selectedFileObjects.map((file) => ({
+           name: file.name,
+           id: file.id,
+           isReverseShare: true,
+         })),
+         zipName
+       ).then(() => {
+         setSelectedFiles(new Set());
+       }),
+       {
+         loading: t("shareManager.creatingZip"),
+         success: t("shareManager.zipDownloadSuccess"),
+         error: t("shareManager.zipDownloadError"),
+       }
+     );
    } catch (error) {
      console.error("Error creating ZIP:", error);
    }
@@ -6,7 +6,8 @@ import { useTranslations } from "next-intl";
import { toast } from "sonner";

import { Button } from "@/components/ui/button";
-import { deleteReverseShareFile, downloadReverseShareFile } from "@/http/endpoints/reverse-shares";
+import { deleteReverseShareFile } from "@/http/endpoints/reverse-shares";
+import { downloadReverseShareWithQueue } from "@/utils/download-queue-utils";
import { getFileIcon } from "@/utils/file-icons";
import { ReverseShareFilePreviewModal } from "./reverse-share-file-preview-modal";

@@ -67,20 +68,13 @@ export function ReceivedFilesSection({ files, onFileDeleted }: ReceivedFilesSect

  const handleDownload = async (file: ReverseShareFile) => {
    try {
-     const response = await downloadReverseShareFile(file.id);
-
-     // Direct S3 download
-     const link = document.createElement("a");
-     link.href = response.data.url;
-     link.download = file.name;
-     document.body.appendChild(link);
-     link.click();
-     document.body.removeChild(link);
-
-     toast.success(t("reverseShares.modals.details.downloadSuccess"));
+     await downloadReverseShareWithQueue(file.id, file.name, {
+       onComplete: () => toast.success(t("reverseShares.modals.details.downloadSuccess")),
+       onFail: () => toast.error(t("reverseShares.modals.details.downloadError")),
+     });
    } catch (error) {
-     console.error("Download error:", error);
-     toast.error(t("reverseShares.modals.details.downloadError"));
+     // Error already handled in downloadReverseShareWithQueue
    }
  };
@@ -5,10 +5,13 @@ import { useParams, useRouter, useSearchParams } from "next/navigation";
import { useTranslations } from "next-intl";
import { toast } from "sonner";

-import { getDownloadUrl } from "@/http/endpoints";
-import { bulkDownloadFiles, downloadFolder } from "@/http/endpoints/bulk-download";
import { getShareByAlias } from "@/http/endpoints/index";
import type { Share } from "@/http/endpoints/shares/types";
+import {
+  bulkDownloadShareWithQueue,
+  downloadFileWithQueue,
+  downloadShareFolderWithQueue,
+} from "@/utils/download-queue-utils";

const createSlug = (name: string): string => {
  return name

@@ -226,17 +229,11 @@ export function usePublicShare() {
        throw new Error("Share data not available");
      }

-     const blob = await downloadFolder(folderId, folderName);
-
-     // Create download link
-     const url = window.URL.createObjectURL(blob);
-     const link = document.createElement("a");
-     link.href = url;
-     link.download = `${folderName}.zip`;
-     document.body.appendChild(link);
-     link.click();
-     document.body.removeChild(link);
-     window.URL.revokeObjectURL(url);
+     await downloadShareFolderWithQueue(folderId, folderName, share.files || [], share.folders || [], {
+       silent: true,
+       showToasts: false,
+       sharePassword: password,
+     });
    } catch (error) {
      console.error("Error downloading folder:", error);
      throw error;

@@ -253,24 +250,18 @@ export function usePublicShare() {
          error: t("share.errors.downloadFailed"),
        });
      } else {
-       const encodedObjectName = encodeURIComponent(objectName);
-       const params: Record<string, string> = {};
-       if (password) params.password = password;
-
-       const response = await getDownloadUrl(
-         encodedObjectName,
-         Object.keys(params).length > 0 ? { params } : undefined
+       await toast.promise(
+         downloadFileWithQueue(objectName, fileName, {
+           silent: true,
+           showToasts: false,
+           sharePassword: password,
+         }),
+         {
+           loading: t("share.messages.downloadStarted"),
+           success: t("shareManager.downloadSuccess"),
+           error: t("share.errors.downloadFailed"),
+         }
        );
-
-       // Direct S3 download
-       const link = document.createElement("a");
-       link.href = response.data.url;
-       link.download = fileName;
-       document.body.appendChild(link);
-       link.click();
-       document.body.removeChild(link);
-
-       toast.success(t("shareManager.downloadSuccess"));
      }
    } catch {}
  };

@@ -330,31 +321,22 @@ export function usePublicShare() {
        return;
      }

-     const fileIds = share.files?.map((file) => file.id) || [];
-     const folderIds = share.folders?.map((folder) => folder.id) || [];
-
-     // Show creating ZIP toast
-     const creatingToast = toast.loading(t("bulkDownload.creatingZip"));
-
-     const blob = await bulkDownloadFiles({
-       fileIds,
-       folderIds,
-       zipName,
-     });
-
-     // Update toast to success
-     toast.dismiss(creatingToast);
-     toast.success(t("bulkDownload.zipCreated"));
-
-     // Create download link
-     const url = window.URL.createObjectURL(blob);
-     const link = document.createElement("a");
-     link.href = url;
-     link.download = zipName;
-     document.body.appendChild(link);
-     link.click();
-     document.body.removeChild(link);
-     window.URL.revokeObjectURL(url);
+     toast.promise(
+       bulkDownloadShareWithQueue(
+         allItems,
+         share.files || [],
+         share.folders || [],
+         zipName,
+         undefined,
+         true,
+         password
+       ).then(() => {}),
+       {
+         loading: t("shareManager.creatingZip"),
+         success: t("shareManager.zipDownloadSuccess"),
+         error: t("shareManager.zipDownloadError"),
+       }
+     );
    } catch (error) {
      console.error("Error creating ZIP:", error);
    }

@@ -390,33 +372,44 @@ export function usePublicShare() {
        checkNestedFolders(folder.id);
      }

+     const allItems = [
+       ...files
+         .filter((file) => !filesInSelectedFolders.has(file.id))
+         .map((file) => ({
+           objectName: file.objectName,
+           name: file.name,
+           type: "file" as const,
+         })),
+       // Add only top-level folders (avoid duplicating nested folders)
+       ...folders
+         .filter((folder) => {
+           return !folder.parentId || !folders.some((f) => f.id === folder.parentId);
+         })
+         .map((folder) => ({
+           id: folder.id,
+           name: folder.name,
+           type: "folder" as const,
+         })),
+     ];
+
      const zipName = `${share.name || t("shareManager.defaultShareName")}-selected.zip`;

-     const fileIds = files.map((file) => file.id);
-     const folderIds = folders.map((folder) => folder.id);
-
-     // Show creating ZIP toast
-     const creatingToast = toast.loading(t("bulkDownload.creatingZip"));
-
-     const blob = await bulkDownloadFiles({
-       fileIds,
-       folderIds,
-       zipName,
-     });
-
-     // Update toast to success
-     toast.dismiss(creatingToast);
-     toast.success(t("bulkDownload.zipCreated"));
-
-     // Create download link
-     const url = window.URL.createObjectURL(blob);
-     const link = document.createElement("a");
-     link.href = url;
-     link.download = zipName;
-     document.body.appendChild(link);
-     link.click();
-     document.body.removeChild(link);
-     window.URL.revokeObjectURL(url);
+     toast.promise(
+       bulkDownloadShareWithQueue(
+         allItems,
+         share.files || [],
+         share.folders || [],
+         zipName,
+         undefined,
+         false,
+         password
+       ).then(() => {}),
+       {
+         loading: t("shareManager.creatingZip"),
+         success: t("shareManager.zipDownloadSuccess"),
+         error: t("shareManager.zipDownloadError"),
+       }
+     );
    } catch (error) {
      console.error("Error creating ZIP:", error);
      toast.error(t("shareManager.zipDownloadError"));
@@ -1,39 +0,0 @@
import { NextRequest, NextResponse } from "next/server";

const API_BASE_URL = process.env.API_BASE_URL || "http://localhost:3333";

export async function GET(
  request: NextRequest,
  { params }: { params: Promise<{ folderId: string; folderName: string }> }
) {
  try {
    const cookieHeader = request.headers.get("cookie");
    const { folderId, folderName } = await params;

    const apiRes = await fetch(`${API_BASE_URL}/bulk-download/folder/${folderId}/${folderName}`, {
      method: "GET",
      headers: {
        cookie: cookieHeader || "",
      },
    });

    if (!apiRes.ok) {
      const errorText = await apiRes.text();
      return NextResponse.json({ error: errorText }, { status: apiRes.status });
    }

    // For binary responses (ZIP files), we need to handle them differently
    const buffer = await apiRes.arrayBuffer();

    return new NextResponse(buffer, {
      status: 200,
      headers: {
        "Content-Type": "application/zip",
        "Content-Disposition": `attachment; filename=${folderName}.zip`,
      },
    });
  } catch (error) {
    console.error("Folder download proxy error:", error);
    return NextResponse.json({ error: "Internal server error" }, { status: 500 });
  }
}
@@ -1,38 +0,0 @@
import { NextRequest, NextResponse } from "next/server";

const API_BASE_URL = process.env.API_BASE_URL || "http://localhost:3333";

export async function POST(request: NextRequest) {
  try {
    const cookieHeader = request.headers.get("cookie");
    const body = await request.text();

    const apiRes = await fetch(`${API_BASE_URL}/bulk-download/reverse-share`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        cookie: cookieHeader || "",
      },
      body,
    });

    if (!apiRes.ok) {
      const errorText = await apiRes.text();
      return NextResponse.json({ error: errorText }, { status: apiRes.status });
    }

    // For binary responses (ZIP files), we need to handle them differently
    const buffer = await apiRes.arrayBuffer();

    return new NextResponse(buffer, {
      status: 200,
      headers: {
        "Content-Type": "application/zip",
        "Content-Disposition": apiRes.headers.get("Content-Disposition") || "attachment; filename=download.zip",
      },
    });
  } catch (error) {
    console.error("Reverse share bulk download proxy error:", error);
    return NextResponse.json({ error: "Internal server error" }, { status: 500 });
  }
}
@@ -1,38 +0,0 @@
import { NextRequest, NextResponse } from "next/server";

const API_BASE_URL = process.env.API_BASE_URL || "http://localhost:3333";

export async function POST(request: NextRequest) {
  try {
    const cookieHeader = request.headers.get("cookie");
    const body = await request.text();

    const apiRes = await fetch(`${API_BASE_URL}/bulk-download`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        cookie: cookieHeader || "",
      },
      body,
    });

    if (!apiRes.ok) {
      const errorText = await apiRes.text();
      return NextResponse.json({ error: errorText }, { status: apiRes.status });
    }

    // For binary responses (ZIP files), we need to handle them differently
    const buffer = await apiRes.arrayBuffer();

    return new NextResponse(buffer, {
      status: 200,
      headers: {
        "Content-Type": "application/zip",
        "Content-Disposition": apiRes.headers.get("Content-Disposition") || "attachment; filename=download.zip",
      },
    });
  } catch (error) {
    console.error("Bulk download proxy error:", error);
    return NextResponse.json({ error: "Internal server error" }, { status: 500 });
  }
}
@@ -0,0 +1,38 @@
import { NextRequest, NextResponse } from "next/server";

const API_BASE_URL = process.env.API_BASE_URL || "http://localhost:3333";

export async function DELETE(req: NextRequest, { params }: { params: Promise<{ downloadId: string }> }) {
  const { downloadId } = await params;
  const cookieHeader = req.headers.get("cookie");
  const url = `${API_BASE_URL}/filesystem/download-queue/${downloadId}`;

  try {
    const apiRes = await fetch(url, {
      method: "DELETE",
      headers: {
        "Content-Type": "application/json",
        cookie: cookieHeader || "",
      },
      redirect: "manual",
    });

    const resBody = await apiRes.text();
    const res = new NextResponse(resBody, {
      status: apiRes.status,
      headers: {
        "Content-Type": "application/json",
      },
    });

    const setCookie = apiRes.headers.getSetCookie?.() || [];
    if (setCookie.length > 0) {
      res.headers.set("Set-Cookie", setCookie.join(","));
    }

    return res;
  } catch (error) {
    console.error("Error proxying cancel download request:", error);
    return NextResponse.json({ error: "Internal server error" }, { status: 500 });
  }
}
@@ -0,0 +1,37 @@
import { NextRequest, NextResponse } from "next/server";

const API_BASE_URL = process.env.API_BASE_URL || "http://localhost:3333";

export async function DELETE(req: NextRequest) {
  const cookieHeader = req.headers.get("cookie");
  const url = `${API_BASE_URL}/filesystem/download-queue`;

  try {
    const apiRes = await fetch(url, {
      method: "DELETE",
      headers: {
        "Content-Type": "application/json",
        cookie: cookieHeader || "",
      },
      redirect: "manual",
    });

    const resBody = await apiRes.text();
    const res = new NextResponse(resBody, {
      status: apiRes.status,
      headers: {
        "Content-Type": "application/json",
      },
    });

    const setCookie = apiRes.headers.getSetCookie?.() || [];
    if (setCookie.length > 0) {
      res.headers.set("Set-Cookie", setCookie.join(","));
    }

    return res;
  } catch (error) {
    console.error("Error proxying clear download queue request:", error);
    return NextResponse.json({ error: "Internal server error" }, { status: 500 });
  }
}
@@ -0,0 +1,37 @@
import { NextRequest, NextResponse } from "next/server";

const API_BASE_URL = process.env.API_BASE_URL || "http://localhost:3333";

export async function GET(req: NextRequest) {
  const cookieHeader = req.headers.get("cookie");
  const url = `${API_BASE_URL}/filesystem/download-queue/status`;

  try {
    const apiRes = await fetch(url, {
      method: "GET",
      headers: {
        "Content-Type": "application/json",
        cookie: cookieHeader || "",
      },
      redirect: "manual",
    });

    const resBody = await apiRes.text();
    const res = new NextResponse(resBody, {
      status: apiRes.status,
      headers: {
        "Content-Type": "application/json",
      },
    });

    const setCookie = apiRes.headers.getSetCookie?.() || [];
    if (setCookie.length > 0) {
      res.headers.set("Set-Cookie", setCookie.join(","));
    }

    return res;
  } catch (error) {
    console.error("Error proxying download queue status request:", error);
    return NextResponse.json({ error: "Internal server error" }, { status: 500 });
  }
}
@@ -46,6 +46,8 @@ export default function DashboardPage() {
        icon={<IconLayoutDashboardFilled className="text-xl" />}
        showBreadcrumb={false}
        title={t("dashboard.pageTitle")}
+       pendingDownloads={fileManager.pendingDownloads}
+       onCancelDownload={fileManager.cancelPendingDownload}
      >
        <StorageUsage diskSpace={diskSpace} diskSpaceError={diskSpaceError} onRetry={handleRetryDiskSpace} />
        <QuickAccessCards />
@@ -119,6 +119,8 @@ export default function FilesPage() {
        breadcrumbLabel={t("files.breadcrumb")}
        icon={<IconFolderOpen size={20} />}
        title={t("files.pageTitle")}
+       pendingDownloads={fileManager.pendingDownloads}
+       onCancelDownload={fileManager.cancelPendingDownload}
      >
        <Card>
          <CardContent>
apps/web/src/components/download-queue-indicator.tsx (new file, 268 additions)
@@ -0,0 +1,268 @@
"use client";

import { useEffect, useState } from "react";
import {
  IconAlertCircle,
  IconBell,
  IconBellOff,
  IconClock,
  IconDownload,
  IconLoader2,
  IconX,
} from "@tabler/icons-react";
import { useTranslations } from "next-intl";

import { Badge } from "@/components/ui/badge";
import { Button } from "@/components/ui/button";
import { Progress } from "@/components/ui/progress";
import { useDownloadQueue } from "@/hooks/use-download-queue";
import { usePushNotifications } from "@/hooks/use-push-notifications";
import { formatFileSize } from "@/utils/format-file-size";

interface PendingDownload {
  downloadId: string;
  fileName: string;
  objectName: string;
  startTime: number;
  status: "pending" | "queued" | "downloading" | "completed" | "failed";
}

interface DownloadQueueIndicatorProps {
  pendingDownloads?: PendingDownload[];
  onCancelDownload?: (downloadId: string) => void;
  className?: string;
}

export function DownloadQueueIndicator({
  pendingDownloads = [],
  onCancelDownload,
  className = "",
}: DownloadQueueIndicatorProps) {
  const t = useTranslations();

  const shouldAutoRefresh = pendingDownloads.length > 0;
  const { queueStatus, refreshQueue, cancelDownload, getEstimatedWaitTime } = useDownloadQueue(shouldAutoRefresh);
  const notifications = usePushNotifications();
  const [isOpen, setIsOpen] = useState(false);

  useEffect(() => {
    if (pendingDownloads.length > 0 || (queueStatus && queueStatus.queueLength > 0)) {
      setIsOpen(true);
    }
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [pendingDownloads.length, queueStatus?.queueLength]);

  const totalDownloads = pendingDownloads.length + (queueStatus?.queueLength || 0);
  const activeDownloads = queueStatus?.activeDownloads || 0;

  if (totalDownloads === 0 && activeDownloads === 0) {
    return null;
  }

  const getStatusIcon = (status: string) => {
    switch (status) {
      case "pending":
        return <IconLoader2 className="h-4 w-4 animate-spin text-blue-500" />;
      case "queued":
        return <IconClock className="h-4 w-4 text-yellow-500" />;
      case "downloading":
        return <IconDownload className="h-4 w-4 text-green-500" />;
      case "completed":
        return <IconDownload className="h-4 w-4 text-green-600" />;
      case "failed":
        return <IconAlertCircle className="h-4 w-4 text-red-500" />;
      default:
        return <IconLoader2 className="h-4 w-4 animate-spin" />;
    }
  };

  const getStatusText = (status: string) => {
    switch (status) {
      case "pending":
        return t("downloadQueue.status.pending");
      case "queued":
        return t("downloadQueue.status.queued");
      case "downloading":
        return t("downloadQueue.status.downloading");
      case "completed":
        return t("downloadQueue.status.completed");
      case "failed":
        return t("downloadQueue.status.failed");
      default:
        return status;
    }
  };

  return (
    <div className={`fixed bottom-6 right-6 z-50 max-w-sm ${className}`} data-download-indicator>
      <div className="flex flex-col gap-3">
        <Button
          variant="outline"
          size="sm"
          onClick={() => setIsOpen(!isOpen)}
          className="min-w-fit bg-background/80 backdrop-blur-md border-border/50 shadow-lg hover:shadow-xl transition-all duration-200 text-sm font-medium"
        >
          <IconDownload className="h-4 w-4 mr-2 text-primary" />
          Downloads
          {totalDownloads > 0 && (
            <Badge variant="secondary" className="ml-2 text-xs font-semibold bg-primary/10 text-primary border-0">
              {totalDownloads}
            </Badge>
          )}
        </Button>

        {isOpen && (
          <div className="border border-border/50 rounded-xl bg-background/95 backdrop-blur-md shadow-xl animate-in slide-in-from-bottom-2 duration-200">
            <div className="p-4 border-b border-border/50">
              <div className="flex items-center justify-between">
                <h3 className="font-semibold text-sm text-foreground">Download Manager</h3>
                <div className="flex items-center gap-2">
                  {notifications.isSupported && (
                    <Button
                      variant="ghost"
                      size="sm"
                      onClick={notifications.requestPermission}
                      className="h-7 w-7 p-0 rounded-md hover:bg-muted/80"
                      title={
                        notifications.hasPermission
                          ? t("notifications.permissionGranted")
                          : "Enable download notifications"
                      }
                    >
                      {notifications.hasPermission ? (
                        <IconBell className="h-3.5 w-3.5 text-green-600" />
                      ) : (
                        <IconBellOff className="h-3.5 w-3.5 text-muted-foreground" />
                      )}
                    </Button>
                  )}
                  <Button
                    variant="ghost"
                    size="sm"
                    onClick={() => setIsOpen(false)}
                    className="h-7 w-7 p-0 rounded-md hover:bg-muted/80"
                  >
                    <IconX className="h-3.5 w-3.5 text-muted-foreground" />
                  </Button>
                </div>
              </div>

              {queueStatus && (
                <div className="mt-3 space-y-2">
                  <div className="flex items-center justify-between text-xs">
                    <span className="text-muted-foreground">Active:</span>
                    <span className="font-medium text-foreground">
                      {activeDownloads}/{queueStatus.maxConcurrent}
                    </span>
                  </div>
                  <div className="flex items-center justify-between text-xs">
                    <span className="text-muted-foreground">Queued:</span>
                    <span className="font-medium text-foreground">
                      {queueStatus.queueLength}/{queueStatus.maxQueueSize}
                    </span>
                  </div>
                  {queueStatus.maxConcurrent > 0 && (
                    <div className="space-y-1">
                      <Progress value={(activeDownloads / queueStatus.maxConcurrent) * 100} className="h-1.5" />
                      <p className="text-xs text-muted-foreground">
                        {Math.round((activeDownloads / queueStatus.maxConcurrent) * 100)}% capacity
                      </p>
                    </div>
                  )}
                </div>
              )}
            </div>

            <div className="p-3 space-y-2">
              {pendingDownloads.map((download) => (
                <div
                  key={download.downloadId}
                  className="group flex items-center justify-between p-2.5 rounded-lg bg-muted/30 hover:bg-muted/50 transition-colors border border-transparent hover:border-border/50"
                >
                  <div className="flex items-center gap-3 flex-1 min-w-0">
                    <div className="shrink-0">{getStatusIcon(download.status)}</div>
                    <div className="flex-1 min-w-0">
                      <p className="text-sm font-medium text-foreground truncate leading-tight">{download.fileName}</p>
                      <p className="text-xs text-muted-foreground mt-0.5">{getStatusText(download.status)}</p>
                    </div>
                  </div>

                  {(download.status === "pending" || download.status === "queued") && onCancelDownload && (
                    <Button
                      variant="ghost"
                      size="sm"
                      onClick={() => onCancelDownload(download.downloadId)}
                      className="h-7 w-7 p-0 opacity-0 group-hover:opacity-100 transition-opacity shrink-0 hover:bg-destructive/10 hover:text-destructive"
                    >
                      <IconX className="h-3.5 w-3.5" />
                    </Button>
                  )}
                </div>
              ))}

              {(queueStatus?.queuedDownloads || []).map((download) => {
                const waitTime = getEstimatedWaitTime(download.downloadId);

                return (
                  <div
                    key={download.downloadId}
                    className="group flex items-center justify-between p-2.5 rounded-lg bg-muted/30 hover:bg-muted/50 transition-colors border border-transparent hover:border-border/50"
                  >
                    <div className="flex items-center gap-3 flex-1 min-w-0">
                      <div className="shrink-0">
                        <IconClock className="h-4 w-4 text-amber-500" />
                      </div>
                      <div className="flex-1 min-w-0">
                        <p className="text-sm font-medium text-foreground truncate leading-tight">
                          {download.fileName || t("downloadQueue.indicator.unknownFile")}
                        </p>
                        <div className="text-xs text-muted-foreground space-y-0.5">
                          <div className="flex items-center gap-2">
                            <span>#{download.position} in queue</span>
                            {download.fileSize && (
                              <span className="text-muted-foreground/70">• {formatFileSize(download.fileSize)}</span>
                            )}
                          </div>
                          {waitTime && <p className="text-xs text-muted-foreground/80">~{waitTime} remaining</p>}
                        </div>
                      </div>
                    </div>

                    <Button
                      variant="ghost"
                      size="sm"
                      onClick={() => cancelDownload(download.downloadId)}
                      className="h-7 w-7 p-0 opacity-0 group-hover:opacity-100 transition-opacity shrink-0 hover:bg-destructive/10 hover:text-destructive"
                    >
                      <IconX className="h-3.5 w-3.5" />
                    </Button>
                  </div>
                );
              })}

              {totalDownloads === 0 && (
                <div className="text-center py-8">
                  <IconDownload className="h-8 w-8 mx-auto text-muted-foreground/50 mb-2" />
                  <p className="text-sm text-muted-foreground">No active downloads</p>
                </div>
              )}
            </div>

            {queueStatus && queueStatus.queueLength > 0 && (
              <div className="p-3 border-t border-border/50">
                <Button
                  variant="outline"
                  size="sm"
                  onClick={refreshQueue}
                  className="w-full text-xs font-medium hover:bg-muted/80"
                >
                  Refresh Queue
                </Button>
              </div>
            )}
          </div>
        )}
      </div>
    </div>
  );
}
@@ -3,6 +3,7 @@ import Link from "next/link";
import { IconLayoutDashboard } from "@tabler/icons-react";
import { useTranslations } from "next-intl";

+import { DownloadQueueIndicator } from "@/components/download-queue-indicator";
import { Navbar } from "@/components/layout/navbar";
import {
  Breadcrumb,

@@ -20,6 +21,14 @@ interface FileManagerLayoutProps {
  icon: ReactNode;
  breadcrumbLabel?: string;
  showBreadcrumb?: boolean;
+ pendingDownloads?: Array<{
+   downloadId: string;
+   fileName: string;
+   objectName: string;
+   startTime: number;
+   status: "pending" | "queued" | "downloading" | "completed" | "failed";
+ }>;
+ onCancelDownload?: (downloadId: string) => void;
}

export function FileManagerLayout({

@@ -28,6 +37,8 @@ export function FileManagerLayout({
  icon,
  breadcrumbLabel,
  showBreadcrumb = true,
+ pendingDownloads = [],
+ onCancelDownload,
}: FileManagerLayoutProps) {
  const t = useTranslations();

@@ -68,6 +79,8 @@ export function FileManagerLayout({
        </div>
      </div>
      <DefaultFooter />
+
+     <DownloadQueueIndicator pendingDownloads={pendingDownloads} onCancelDownload={onCancelDownload} />
    </div>
  );
}
apps/web/src/hooks/use-download-queue.ts (new file, 144 additions)
@@ -0,0 +1,144 @@
import { useCallback, useEffect, useState } from "react";
import { useTranslations } from "next-intl";
import { toast } from "sonner";

import {
  cancelQueuedDownload,
  getDownloadQueueStatus,
  type DownloadQueueStatus,
} from "@/http/endpoints/download-queue";

export interface DownloadQueueHook {
  queueStatus: DownloadQueueStatus | null;
  isLoading: boolean;
  error: string | null;
  refreshQueue: () => Promise<void>;
  cancelDownload: (downloadId: string) => Promise<void>;
  getQueuePosition: (downloadId: string) => number | null;
  isDownloadQueued: (downloadId: string) => boolean;
  getEstimatedWaitTime: (downloadId: string) => string | null;
}

export function useDownloadQueue(autoRefresh = true, initialIntervalMs = 3000) {
  const t = useTranslations();
  const [queueStatus, setQueueStatus] = useState<DownloadQueueStatus | null>(null);
  const [isLoading, setIsLoading] = useState(false);
  const [error, setError] = useState<string | null>(null);
  const [currentInterval, setCurrentInterval] = useState(initialIntervalMs);
  const [noActivityCount, setNoActivityCount] = useState(0);

  const refreshQueue = useCallback(async () => {
    try {
      setIsLoading(true);
      setError(null);
      const response = await getDownloadQueueStatus();
      const newStatus = response.data;

      const hasActivity = newStatus.activeDownloads > 0 || newStatus.queueLength > 0;
      const previousActivity = (queueStatus?.activeDownloads || 0) > 0 || (queueStatus?.queueLength || 0) > 0;
      const statusChanged = JSON.stringify(queueStatus) !== JSON.stringify(newStatus);

      if (!hasActivity && !previousActivity && !statusChanged) {
        setNoActivityCount((prev) => prev + 1);
      } else {
        setNoActivityCount(0);
        setCurrentInterval(initialIntervalMs);
      }

      setQueueStatus(newStatus);
    } catch (err: any) {
      const errorMessage = err?.response?.data?.error || err?.message || "Failed to fetch queue status";
      setError(errorMessage);
      console.error("Error fetching download queue status:", err);
    } finally {
      setIsLoading(false);
    }
  }, [queueStatus, initialIntervalMs]);

  const cancelDownload = useCallback(
    async (downloadId: string) => {
      try {
        await cancelQueuedDownload(downloadId);
        toast.success(t("downloadQueue.cancelSuccess"));
        await refreshQueue();
      } catch (err: any) {
        const errorMessage = err?.response?.data?.error || err?.message || "Failed to cancel download";
        toast.error(t("downloadQueue.cancelError", { error: errorMessage }));
        console.error("Error cancelling download:", err);
      }
    },
    [refreshQueue, t]
  );

  const getQueuePosition = useCallback(
    (downloadId: string): number | null => {
      if (!queueStatus) return null;
      const download = queueStatus.queuedDownloads.find((d) => d.downloadId === downloadId);
      return download?.position || null;
    },
    [queueStatus]
  );

  const isDownloadQueued = useCallback(
    (downloadId: string): boolean => {
      if (!queueStatus) return false;
      return queueStatus.queuedDownloads.some((d) => d.downloadId === downloadId);
    },
    [queueStatus]
  );

  const getEstimatedWaitTime = useCallback(
    (downloadId: string): string | null => {
      if (!queueStatus) return null;

      const download = queueStatus.queuedDownloads.find((d) => d.downloadId === downloadId);
      if (!download) return null;

      const waitTimeMs = download.waitTime;
      const waitTimeSeconds = Math.floor(waitTimeMs / 1000);

      if (waitTimeSeconds < 60) {
        return t("downloadQueue.waitTime.seconds", { seconds: waitTimeSeconds });
      } else if (waitTimeSeconds < 3600) {
        const minutes = Math.floor(waitTimeSeconds / 60);
        return t("downloadQueue.waitTime.minutes", { minutes });
      } else {
        const hours = Math.floor(waitTimeSeconds / 3600);
        const minutes = Math.floor((waitTimeSeconds % 3600) / 60);
        return t("downloadQueue.waitTime.hoursMinutes", { hours, minutes });
      }
    },
    [queueStatus, t]
  );

  useEffect(() => {
    if (!autoRefresh) return;

    let actualInterval = currentInterval;

    if (noActivityCount > 5) {
      console.log("[DOWNLOAD QUEUE] No activity detected, stopping polling");
      return;
    } else if (noActivityCount > 2) {
      actualInterval = 10000;
      setCurrentInterval(10000);
    }

    refreshQueue();

    const interval = setInterval(refreshQueue, actualInterval);

    return () => clearInterval(interval);
  }, [autoRefresh, refreshQueue, currentInterval, noActivityCount]);

  return {
    queueStatus,
    isLoading,
    error,
    refreshQueue,
    cancelDownload,
|
||||
getQueuePosition,
|
||||
isDownloadQueued,
|
||||
getEstimatedWaitTime,
|
||||
};
|
||||
}
|
@@ -1,10 +1,11 @@
import { useCallback, useState } from "react";
import { useCallback, useEffect, useState } from "react";
import { useTranslations } from "next-intl";
import { toast } from "sonner";

import { deleteFile, getDownloadUrl, updateFile } from "@/http/endpoints";
import { bulkDownloadFiles, downloadFolder } from "@/http/endpoints/bulk-download";
import { deleteFolder, registerFolder, updateFolder } from "@/http/endpoints/folders";
import { useDownloadQueue } from "./use-download-queue";
import { usePushNotifications } from "./use-push-notifications";

interface FileToRename {
  id: string;
@@ -150,6 +151,8 @@ export interface EnhancedFileManagerHook {

export function useEnhancedFileManager(onRefresh: () => Promise<void>, clearSelection?: () => void) {
  const t = useTranslations();
  const downloadQueue = useDownloadQueue(true, 3000);
  const notifications = usePushNotifications();

  const [previewFile, setPreviewFile] = useState<PreviewFile | null>(null);
  const [fileToRename, setFileToRename] = useState<FileToRename | null>(null);
@@ -171,33 +174,124 @@ export function useEnhancedFileManager(onRefresh: () => Promise<void>, clearSele
  const [foldersToShare, setFoldersToShare] = useState<BulkFolder[] | null>(null);
  const [foldersToDownload, setFoldersToDownload] = useState<BulkFolder[] | null>(null);

  const startActualDownload = async (
    downloadId: string,
    objectName: string,
    fileName: string,
    downloadUrl?: string
  ) => {
    try {
      setPendingDownloads((prev) =>
        prev.map((d) => (d.downloadId === downloadId ? { ...d, status: "downloading" } : d))
      );

      let url = downloadUrl;
      if (!url) {
        const encodedObjectName = encodeURIComponent(objectName);
        const response = await getDownloadUrl(encodedObjectName);
        url = response.data.url;
      }

      const link = document.createElement("a");
      link.href = url;
      link.download = fileName;
      document.body.appendChild(link);
      link.click();
      document.body.removeChild(link);

      const wasQueued = pendingDownloads.some((d) => d.downloadId === downloadId);

      if (wasQueued) {
        setPendingDownloads((prev) =>
          prev.map((d) => (d.downloadId === downloadId ? { ...d, status: "completed" } : d))
        );

        const completedDownload = pendingDownloads.find((d) => d.downloadId === downloadId);
        if (completedDownload) {
          const fileSize = completedDownload.startTime ? Date.now() - completedDownload.startTime : undefined;
          await notifications.notifyDownloadComplete(fileName, fileSize);
        }

        setTimeout(() => {
          setPendingDownloads((prev) => prev.filter((d) => d.downloadId !== downloadId));
        }, 5000);
      }

      if (!wasQueued) {
        toast.success(t("files.downloadStart", { fileName }));
      }
    } catch (error: any) {
      const wasQueued = pendingDownloads.some((d) => d.downloadId === downloadId);

      if (wasQueued) {
        setPendingDownloads((prev) => prev.map((d) => (d.downloadId === downloadId ? { ...d, status: "failed" } : d)));

        const errorMessage =
          error?.response?.data?.message || error?.message || t("notifications.downloadFailed.unknownError");
        await notifications.notifyDownloadFailed(fileName, errorMessage);

        setTimeout(() => {
          setPendingDownloads((prev) => prev.filter((d) => d.downloadId !== downloadId));
        }, 10000);
      }

      if (!pendingDownloads.some((d) => d.downloadId === downloadId)) {
        toast.error(t("files.downloadError"));
      }
      throw error;
    }
  };

  useEffect(() => {
    if (!downloadQueue.queueStatus) return;

    pendingDownloads.forEach(async (download) => {
      if (download.status === "queued") {
        const stillQueued = downloadQueue.queueStatus?.queuedDownloads.find((qd) => qd.fileName === download.fileName);

        if (!stillQueued) {
          console.log(`[DOWNLOAD] Processing queued download: ${download.fileName}`);

          await notifications.notifyQueueProcessing(download.fileName);

          await startActualDownload(download.downloadId, download.objectName, download.fileName);
        }
      }
    });
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [downloadQueue.queueStatus, pendingDownloads, notifications]);

  const setClearSelectionCallback = useCallback((callback: () => void) => {
    setClearSelectionCallbackState(() => callback);
  }, []);

  const handleDownload = async (objectName: string, fileName: string) => {
    try {
      const encodedObjectName = encodeURIComponent(objectName);
      const response = await getDownloadUrl(encodedObjectName);
      const { downloadFileWithQueue } = await import("@/utils/download-queue-utils");

      // Direct S3 download - no queue needed
      const link = document.createElement("a");
      link.href = response.data.url;
      link.download = fileName;
      document.body.appendChild(link);
      link.click();
      document.body.removeChild(link);

      toast.success(t("shareManager.downloadSuccess"));
      await toast.promise(
        downloadFileWithQueue(objectName, fileName, {
          silent: true,
          showToasts: false,
        }),
        {
          loading: t("share.messages.downloadStarted"),
          success: t("shareManager.downloadSuccess"),
          error: t("share.errors.downloadFailed"),
        }
      );
    } catch (error) {
      console.error("Download error:", error);
      toast.error(t("share.errors.downloadFailed"));
    }
  };

  const cancelPendingDownload = async (downloadId: string) => {
    // Queue functionality removed - just remove from local state
    setPendingDownloads((prev) => prev.filter((d) => d.downloadId !== downloadId));
    try {
      await downloadQueue.cancelDownload(downloadId);
      setPendingDownloads((prev) => prev.filter((d) => d.downloadId !== downloadId));
    } catch (error) {
      console.error("Error cancelling download:", error);
    }
  };

  const getDownloadStatus = useCallback(
@@ -271,78 +365,68 @@ export function useEnhancedFileManager(onRefresh: () => Promise<void>, clearSele

  const handleSingleFolderDownload = async (folderId: string, folderName: string) => {
    try {
      // Show creating ZIP toast
      const creatingToast = toast.loading(t("bulkDownload.creatingZip"));
      const { downloadFolderWithQueue } = await import("@/utils/download-queue-utils");

      const blob = await downloadFolder(folderId, folderName);

      // Update toast to success
      toast.dismiss(creatingToast);
      toast.success(t("bulkDownload.zipCreated"));

      // Create download link
      const url = window.URL.createObjectURL(blob);
      const link = document.createElement("a");
      link.href = url;
      link.download = `${folderName}.zip`;
      document.body.appendChild(link);
      link.click();
      document.body.removeChild(link);
      window.URL.revokeObjectURL(url);
      await toast.promise(
        downloadFolderWithQueue(folderId, folderName, {
          silent: true,
          showToasts: false,
        }),
        {
          loading: t("shareManager.creatingZip"),
          success: t("shareManager.zipDownloadSuccess"),
          error: t("share.errors.downloadFailed"),
        }
      );
    } catch (error) {
      console.error("Folder download error:", error);
      toast.error(t("bulkDownload.zipError"));
      console.error("Error downloading folder:", error);
    }
  };

  const handleBulkDownloadWithZip = async (files: BulkFile[], zipName: string) => {
    try {
      const folders = foldersToDownload || [];
      const { bulkDownloadWithQueue } = await import("@/utils/download-queue-utils");

      if (files.length === 0 && folders.length === 0) {
      const allItems = [
        ...files.map((file) => ({
          objectName: file.objectName,
          name: file.relativePath || file.name,
          isReverseShare: false,
          type: "file" as const,
        })),
        ...folders.map((folder) => ({
          id: folder.id,
          name: folder.name,
          type: "folder" as const,
        })),
      ];

      if (allItems.length === 0) {
        toast.error(t("shareManager.noFilesToDownload"));
        return;
      }

      const fileIds = files.map((file) => file.id);
      const folderIds = folders.map((folder) => folder.id);

      // Show creating ZIP toast
      const creatingToast = toast.loading(t("bulkDownload.creatingZip"));

      const blob = await bulkDownloadFiles({
        fileIds,
        folderIds,
        zipName,
      });

      // Update toast to success
      toast.dismiss(creatingToast);
      toast.success(t("bulkDownload.zipCreated"));

      // Create download link
      const url = window.URL.createObjectURL(blob);
      const link = document.createElement("a");
      link.href = url;
      link.download = zipName;
      document.body.appendChild(link);
      link.click();
      document.body.removeChild(link);
      window.URL.revokeObjectURL(url);

      setBulkDownloadModalOpen(false);
      setFilesToDownload(null);
      setFoldersToDownload(null);

      if (clearSelectionCallback) {
        clearSelectionCallback();
      }
      toast.promise(
        bulkDownloadWithQueue(allItems, zipName, undefined, false).then(() => {
          setBulkDownloadModalOpen(false);
          setFilesToDownload(null);
          setFoldersToDownload(null);
          if (clearSelectionCallback) {
            clearSelectionCallback();
          }
        }),
        {
          loading: t("shareManager.creatingZip"),
          success: t("shareManager.zipDownloadSuccess"),
          error: t("shareManager.zipDownloadError"),
        }
      );
    } catch (error) {
      console.error("Error in bulk download:", error);
      setBulkDownloadModalOpen(false);
      setFilesToDownload(null);
      setFoldersToDownload(null);
      toast.error(t("bulkDownload.zipError"));
    }
  };
@@ -4,6 +4,7 @@ import { toast } from "sonner";

import { getDownloadUrl } from "@/http/endpoints";
import { downloadReverseShareFile } from "@/http/endpoints/reverse-shares";
import { downloadFileWithQueue, downloadReverseShareWithQueue } from "@/utils/download-queue-utils";
import { getFileExtension, getFileType, type FileType } from "@/utils/file-types";

interface FilePreviewState {
@@ -242,40 +243,17 @@ export function useFilePreview({ file, isOpen, isReverseShare = false, sharePass

    try {
      if (isReverseShare) {
        const response = await downloadReverseShareFile(file.id!);

        // Direct S3 download
        const link = document.createElement("a");
        link.href = response.data.url;
        link.download = file.name;
        document.body.appendChild(link);
        link.click();
        document.body.removeChild(link);

        toast.success(t("filePreview.downloadSuccess"));
        await downloadReverseShareWithQueue(file.id!, file.name, {
          onFail: () => toast.error(t("filePreview.downloadError")),
        });
      } else {
        const encodedObjectName = encodeURIComponent(file.objectName);
        const params: Record<string, string> = {};
        if (sharePassword) params.password = sharePassword;

        const response = await getDownloadUrl(
          encodedObjectName,
          Object.keys(params).length > 0 ? { params } : undefined
        );

        // Direct S3 download
        const link = document.createElement("a");
        link.href = response.data.url;
        link.download = file.name;
        document.body.appendChild(link);
        link.click();
        document.body.removeChild(link);

        toast.success(t("filePreview.downloadSuccess"));
        await downloadFileWithQueue(file.objectName, file.name, {
          sharePassword,
          onFail: () => toast.error(t("filePreview.downloadError")),
        });
      }
    } catch (error) {
      console.error("Download error:", error);
      toast.error(t("filePreview.downloadError"));
    }
  }, [isReverseShare, file.id, file.objectName, file.name, sharePassword, t]);
185
apps/web/src/hooks/use-push-notifications.ts
Normal file
@@ -0,0 +1,185 @@
import { useCallback, useEffect, useRef, useState } from "react";
import { useTranslations } from "next-intl";
import { toast } from "sonner";

interface NotificationOptions {
  title: string;
  body: string;
  icon?: string;
  badge?: string;
  tag?: string;
  requireInteraction?: boolean;
  silent?: boolean;
  data?: any;
}

export function usePushNotifications() {
  const t = useTranslations();
  const [permissionGranted, setPermissionGranted] = useState(false);
  const isSupported = useRef(typeof window !== "undefined" && "Notification" in window);

  const requestPermission = useCallback(async (): Promise<boolean> => {
    if (!isSupported.current) {
      console.warn("Push notifications are not supported in this browser");
      return false;
    }

    try {
      const permission = await Notification.requestPermission();
      const granted = permission === "granted";
      setPermissionGranted(granted);

      if (permission === "granted") {
        console.log("🔔 Push notifications enabled");
        toast.success(t("notifications.permissionGranted"));
      } else if (permission === "denied") {
        console.warn("🚫 Push notifications denied");
        toast.warning(t("notifications.permissionDenied"));
      } else {
        console.info("⏸️ Push notifications dismissed");
      }

      return granted;
    } catch (error) {
      console.error("Error requesting notification permission:", error);
      return false;
    }
  }, [t]);

  const sendNotification = useCallback(
    async (options: NotificationOptions): Promise<boolean> => {
      if (!isSupported.current) {
        console.warn("Push notifications not supported");
        return false;
      }

      if (Notification.permission !== "granted") {
        const granted = await requestPermission();
        if (!granted) return false;
      }

      try {
        const notification = new Notification(options.title, {
          body: options.body,
          icon: options.icon || "/favicon.ico",
          badge: options.badge,
          tag: options.tag,
          requireInteraction: options.requireInteraction ?? false,
          silent: options.silent ?? false,
          data: options.data,
        });

        if (!options.requireInteraction) {
          setTimeout(() => {
            notification.close();
          }, 5000);
        }

        notification.onclick = (event) => {
          event.preventDefault();
          window.focus();
          notification.close();

          if (options.data?.action === "focus-downloads") {
            const downloadIndicator = document.querySelector("[data-download-indicator]");
            if (downloadIndicator) {
              downloadIndicator.scrollIntoView({ behavior: "smooth" });
            }
          }
        };

        return true;
      } catch (error) {
        console.error("Error sending notification:", error);
        return false;
      }
    },
    [requestPermission]
  );

  useEffect(() => {
    if (isSupported.current) {
      setPermissionGranted(Notification.permission === "granted");
    }
  }, []);

  const notifyDownloadComplete = useCallback(
    async (fileName: string, fileSize?: number) => {
      const sizeText = fileSize ? ` (${(fileSize / 1024 / 1024).toFixed(1)}MB)` : "";

      return sendNotification({
        title: t("notifications.downloadComplete.title"),
        body: t("notifications.downloadComplete.body", {
          fileName: fileName + sizeText,
        }),
        icon: "/favicon.ico",
        tag: `download-complete-${Date.now()}`,
        requireInteraction: false,
        data: {
          action: "focus-downloads",
          type: "download-complete",
          fileName,
          fileSize,
        },
      });
    },
    [sendNotification, t]
  );

  const notifyDownloadFailed = useCallback(
    async (fileName: string, error?: string) => {
      return sendNotification({
        title: t("notifications.downloadFailed.title"),
        body: t("notifications.downloadFailed.body", {
          fileName,
          error: error || t("notifications.downloadFailed.unknownError"),
        }),
        icon: "/favicon.ico",
        tag: `download-failed-${Date.now()}`,
        requireInteraction: true,
        data: {
          action: "focus-downloads",
          type: "download-failed",
          fileName,
          error,
        },
      });
    },
    [sendNotification, t]
  );

  const notifyQueueProcessing = useCallback(
    async (fileName: string, position?: number) => {
      const positionText = position ? t("notifications.queueProcessing.position", { position }) : "";

      return sendNotification({
        title: t("notifications.queueProcessing.title"),
        body: t("notifications.queueProcessing.body", {
          fileName,
          position: positionText,
        }),
        icon: "/favicon.ico",
        tag: `queue-processing-${Date.now()}`,
        requireInteraction: false,
        silent: true,
        data: {
          action: "focus-downloads",
          type: "queue-processing",
          fileName,
          position,
        },
      });
    },
    [sendNotification, t]
  );

  return {
    isSupported: isSupported.current,
    hasPermission: permissionGranted,
    requestPermission,
    sendNotification,
    notifyDownloadComplete,
    notifyDownloadFailed,
    notifyQueueProcessing,
  };
}
@@ -4,17 +4,10 @@ import { useCallback, useState } from "react";
import { useTranslations } from "next-intl";
import { toast } from "sonner";

import {
  addRecipients,
  createShareAlias,
  deleteShare,
  getDownloadUrl,
  notifyRecipients,
  updateShare,
} from "@/http/endpoints";
import { bulkDownloadFiles } from "@/http/endpoints/bulk-download";
import { addRecipients, createShareAlias, deleteShare, notifyRecipients, updateShare } from "@/http/endpoints";
import { updateFolder } from "@/http/endpoints/folders";
import type { Share } from "@/http/endpoints/shares/types";
import { bulkDownloadShareWithQueue, downloadFileWithQueue } from "@/utils/download-queue-utils";

export interface ShareManagerHook {
  shareToDelete: Share | null;
@@ -237,31 +230,20 @@ export function useShareManager(onSuccess: () => void) {
        return;
      }

      const fileIds = share.files?.map((file) => file.id) || [];
      const folderIds = share.folders?.map((folder) => folder.id) || [];

      // Show creating ZIP toast
      const creatingToast = toast.loading(t("bulkDownload.creatingZip"));

      const blob = await bulkDownloadFiles({
        fileIds,
        folderIds,
        zipName,
      });

      // Update toast to success
      toast.dismiss(creatingToast);
      toast.success(t("bulkDownload.zipCreated"));

      // Create download link
      const url = window.URL.createObjectURL(blob);
      const link = document.createElement("a");
      link.href = url;
      link.download = zipName;
      document.body.appendChild(link);
      link.click();
      document.body.removeChild(link);
      window.URL.revokeObjectURL(url);
      toast.promise(
        bulkDownloadShareWithQueue(allItems, share.files || [], share.folders || [], zipName, undefined, true).then(
          () => {
            if (clearSelectionCallback) {
              clearSelectionCallback();
            }
          }
        ),
        {
          loading: t("shareManager.creatingZip"),
          success: t("shareManager.zipDownloadSuccess"),
          error: t("shareManager.zipDownloadError"),
        }
      );
    } else {
      toast.error("Multiple share download not yet supported - please download shares individually");
    }
@@ -291,21 +273,12 @@ export function useShareManager(onSuccess: () => void) {
    if (totalFiles === 1 && totalFolders === 0) {
      const file = share.files[0];
      try {
        const encodedObjectName = encodeURIComponent(file.objectName);
        const response = await getDownloadUrl(encodedObjectName);

        // Direct S3 download
        const link = document.createElement("a");
        link.href = response.data.url;
        link.download = file.name;
        document.body.appendChild(link);
        link.click();
        document.body.removeChild(link);

        toast.success(t("shareManager.downloadSuccess"));
        await downloadFileWithQueue(file.objectName, file.name, {
          onComplete: () => toast.success(t("shareManager.downloadSuccess")),
          onFail: () => toast.error(t("shareManager.downloadError")),
        });
      } catch (error) {
        console.error("Download error:", error);
        toast.error(t("shareManager.downloadError"));
      }
    } else {
      const zipName = t("shareManager.singleShareZipName", {
@@ -1,33 +0,0 @@
import apiInstance from "@/config/api";

export interface BulkDownloadRequest {
  fileIds: string[];
  folderIds: string[];
  zipName: string;
}

export interface BulkDownloadResponse {
  success: boolean;
  message: string;
}

export const bulkDownloadFiles = async (data: BulkDownloadRequest): Promise<Blob> => {
  const response = await apiInstance.post("/api/files/bulk-download", data, {
    responseType: "blob",
  });
  return response.data;
};

export const downloadFolder = async (folderId: string, folderName: string): Promise<Blob> => {
  const response = await apiInstance.get(`/api/files/bulk-download/folder/${folderId}/${folderName}`, {
    responseType: "blob",
  });
  return response.data;
};

export const bulkDownloadReverseShareFiles = async (data: { fileIds: string[]; zipName: string }): Promise<Blob> => {
  const response = await apiInstance.post("/api/files/bulk-download/reverse-share", data, {
    responseType: "blob",
  });
  return response.data;
};
63
apps/web/src/http/endpoints/download-queue/index.ts
Normal file
@@ -0,0 +1,63 @@
import type { AxiosRequestConfig } from "axios";

import apiInstance from "@/config/api";

export interface QueuedDownload {
  downloadId: string;
  position: number;
  waitTime: number;
  fileName?: string;
  fileSize?: number;
}

export interface DownloadQueueStatus {
  queueLength: number;
  maxQueueSize: number;
  activeDownloads: number;
  maxConcurrent: number;
  queuedDownloads: QueuedDownload[];
}

export interface DownloadQueueStatusResult {
  status: string;
  data: DownloadQueueStatus;
}

export interface CancelDownloadResult {
  message: string;
  downloadId: string;
}

export interface ClearQueueResult {
  message: string;
  clearedCount: number;
}

/**
 * Get current download queue status
 * @summary Get Download Queue Status
 */
export const getDownloadQueueStatus = <TData = DownloadQueueStatusResult>(
  options?: AxiosRequestConfig
): Promise<TData> => {
  return apiInstance.get(`/api/filesystem/download-queue/status`, options);
};

/**
 * Cancel a specific queued download
 * @summary Cancel Queued Download
 */
export const cancelQueuedDownload = <TData = CancelDownloadResult>(
  downloadId: string,
  options?: AxiosRequestConfig
): Promise<TData> => {
  return apiInstance.delete(`/api/filesystem/download-queue/${downloadId}`, options);
};

/**
 * Clear the entire download queue (admin operation)
 * @summary Clear Download Queue
 */
export const clearDownloadQueue = <TData = ClearQueueResult>(options?: AxiosRequestConfig): Promise<TData> => {
  return apiInstance.delete(`/api/filesystem/download-queue`, options);
};
@@ -18,6 +18,8 @@ export interface ChunkedUploadResult {
}

export class ChunkedUploader {
  private static defaultChunkSizeInBytes = 100 * 1024 * 1024; // 100MB

  /**
   * Upload a file in chunks with streaming
   */
@@ -246,7 +248,7 @@ export class ChunkedUploader {
      return false;
    }

    const threshold = 100 * 1024 * 1024; // 100MB
    const threshold = this.getConfiguredChunkSize() || this.defaultChunkSizeInBytes;
    const shouldUse = fileSize > threshold;

    return shouldUse;
@@ -256,12 +258,19 @@ export class ChunkedUploader {
   * Calculate optimal chunk size based on file size
   */
  static calculateOptimalChunkSize(fileSize: number): number {
    if (fileSize <= 100 * 1024 * 1024) {
    const configuredChunkSize = this.getConfiguredChunkSize();
    const chunkSize = configuredChunkSize || this.defaultChunkSizeInBytes;

    if (fileSize <= chunkSize) {
      throw new Error(
        `calculateOptimalChunkSize should not be called for files <= 100MB. File size: ${(fileSize / (1024 * 1024)).toFixed(2)}MB`
        `calculateOptimalChunkSize should not be called for files <= ${chunkSize}. File size: ${(fileSize / (1024 * 1024)).toFixed(2)}MB`
      );
    }

    if (configuredChunkSize) {
      return configuredChunkSize;
    }

    // For files > 1GB, use 150MB chunks
    if (fileSize > 1024 * 1024 * 1024) {
      return 150 * 1024 * 1024;
@@ -275,4 +284,24 @@ export class ChunkedUploader {
    // For files > 100MB, use 75MB chunks (minimum for chunked upload)
    return 75 * 1024 * 1024;
  }

  private static getConfiguredChunkSize(): number | null {
    const configuredChunkSizeMb = process.env.NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB;

    if (!configuredChunkSizeMb) {
      return null;
    }

    const parsedValue = Number(configuredChunkSizeMb);

    if (Number.isNaN(parsedValue) || parsedValue <= 0) {
      console.warn(
        `Invalid NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB value: ${configuredChunkSizeMb}. Falling back to optimal chunk size.`
      );

      return null;
    }

    return Math.floor(parsedValue * 1024 * 1024);
  }
}
611
apps/web/src/utils/download-queue-utils.ts
Normal file
@@ -0,0 +1,611 @@
|
||||
import { toast } from "sonner";
|
||||
|
||||
import { getDownloadUrl } from "@/http/endpoints";
|
||||
import { downloadReverseShareFile } from "@/http/endpoints/reverse-shares";
|
||||
|
||||
interface DownloadWithQueueOptions {
  useQueue?: boolean;
  silent?: boolean;
  showToasts?: boolean;
  sharePassword?: string;
  onStart?: (downloadId: string) => void;
  onComplete?: (downloadId: string) => void;
  onFail?: (downloadId: string, error: string) => void;
}

// Polls the download endpoint until the file leaves the queue (any non-202
// response), backing off from 2s toward a 10s cap after the first few attempts.
async function waitForDownloadReady(objectName: string, fileName: string): Promise<string> {
  let attempts = 0;
  const maxAttempts = 30;
  let currentDelay = 2000;
  const maxDelay = 10000;

  while (attempts < maxAttempts) {
    try {
      const encodedObjectName = encodeURIComponent(objectName);
      const response = await getDownloadUrl(encodedObjectName);

      if (response.status !== 202) {
        return response.data.url;
      }

      await new Promise((resolve) => setTimeout(resolve, currentDelay));

      if (attempts > 3 && currentDelay < maxDelay) {
        currentDelay = Math.min(currentDelay * 1.5, maxDelay);
      }

      attempts++;
    } catch (error) {
      console.error(`Error checking download status for ${fileName}:`, error);
      await new Promise((resolve) => setTimeout(resolve, currentDelay * 2));
      attempts++;
    }
  }

  throw new Error(`Download timeout for ${fileName} after ${attempts} attempts`);
}

// Same polling strategy for reverse-share downloads, keyed by file ID.
async function waitForReverseShareDownloadReady(fileId: string, fileName: string): Promise<string> {
  let attempts = 0;
  const maxAttempts = 30;
  let currentDelay = 2000;
  const maxDelay = 10000;

  while (attempts < maxAttempts) {
    try {
      const response = await downloadReverseShareFile(fileId);

      if (response.status !== 202) {
        return response.data.url;
      }

      await new Promise((resolve) => setTimeout(resolve, currentDelay));

      if (attempts > 3 && currentDelay < maxDelay) {
        currentDelay = Math.min(currentDelay * 1.5, maxDelay);
      }

      attempts++;
    } catch (error) {
      console.error(`Error checking reverse share download status for ${fileName}:`, error);
      await new Promise((resolve) => setTimeout(resolve, currentDelay * 2));
      attempts++;
    }
  }

  throw new Error(`Reverse share download timeout for ${fileName} after ${attempts} attempts`);
}
// Triggers a browser download by clicking a temporary anchor element.
async function performDownload(url: string, fileName: string): Promise<void> {
  const link = document.createElement("a");
  link.href = url;
  link.download = fileName;
  document.body.appendChild(link);
  link.click();
  document.body.removeChild(link);
}

export async function downloadFileWithQueue(
  objectName: string,
  fileName: string,
  options: DownloadWithQueueOptions = {}
): Promise<void> {
  const { useQueue = true, silent = false, showToasts = true, sharePassword } = options;
  const downloadId = `${Date.now()}-${Math.random().toString(36).substring(2, 11)}`;

  try {
    if (!silent) {
      options.onStart?.(downloadId);
    }

    const encodedObjectName = encodeURIComponent(objectName);

    const params: Record<string, string> = {};
    if (sharePassword) params.password = sharePassword;

    const response = await getDownloadUrl(
      encodedObjectName,
      Object.keys(params).length > 0
        ? {
            params: { ...params },
          }
        : undefined
    );

    if (response.status === 202 && useQueue) {
      if (!silent && showToasts) {
        toast.info(`${fileName} was added to download queue`, {
          description: "Download will start automatically when queue space is available",
          duration: 5000,
        });
      }

      const actualDownloadUrl = await waitForDownloadReady(objectName, fileName);
      await performDownload(actualDownloadUrl, fileName);
    } else {
      await performDownload(response.data.url, fileName);
    }

    if (!silent) {
      options.onComplete?.(downloadId);
      if (showToasts) {
        toast.success(`${fileName} downloaded successfully`);
      }
    }
  } catch (error: any) {
    if (!silent) {
      options.onFail?.(downloadId, error?.message || "Download failed");
      if (showToasts) {
        toast.error(`Failed to download ${fileName}`);
      }
    }
    throw error;
  }
}
export async function downloadReverseShareWithQueue(
  fileId: string,
  fileName: string,
  options: DownloadWithQueueOptions = {}
): Promise<void> {
  const { silent = false, showToasts = true } = options;
  const downloadId = `reverse-${Date.now()}-${Math.random().toString(36).substring(2, 11)}`;

  try {
    if (!silent) {
      options.onStart?.(downloadId);
    }

    const response = await downloadReverseShareFile(fileId);

    if (response.status === 202) {
      if (!silent && showToasts) {
        toast.info(`${fileName} was added to download queue`, {
          description: "Download will start automatically when queue space is available",
          duration: 5000,
        });
      }

      const actualDownloadUrl = await waitForReverseShareDownloadReady(fileId, fileName);
      await performDownload(actualDownloadUrl, fileName);
    } else {
      await performDownload(response.data.url, fileName);
    }

    if (!silent) {
      options.onComplete?.(downloadId);
      if (showToasts) {
        toast.success(`${fileName} downloaded successfully`);
      }
    }
  } catch (error: any) {
    if (!silent) {
      options.onFail?.(downloadId, error?.message || "Download failed");
      if (showToasts) {
        toast.error(`Failed to download ${fileName}`);
      }
    }
    throw error;
  }
}
// Fetches a file into memory as a Blob, waiting on the queue first when the
// server responds with 202. Used when assembling ZIP archives client-side.
export async function downloadFileAsBlobWithQueue(
  objectName: string,
  fileName: string,
  isReverseShare: boolean = false,
  fileId?: string,
  sharePassword?: string
): Promise<Blob> {
  try {
    let downloadUrl: string;

    if (isReverseShare && fileId) {
      const response = await downloadReverseShareFile(fileId);

      if (response.status === 202) {
        downloadUrl = await waitForReverseShareDownloadReady(fileId, fileName);
      } else {
        downloadUrl = response.data.url;
      }
    } else {
      const encodedObjectName = encodeURIComponent(objectName);

      const params: Record<string, string> = {};
      if (sharePassword) params.password = sharePassword;

      const response = await getDownloadUrl(
        encodedObjectName,
        Object.keys(params).length > 0
          ? {
              params: { ...params },
            }
          : undefined
      );

      if (response.status === 202) {
        downloadUrl = await waitForDownloadReady(objectName, fileName);
      } else {
        downloadUrl = response.data.url;
      }
    }

    const fetchResponse = await fetch(downloadUrl);
    if (!fetchResponse.ok) {
      throw new Error(`Failed to download ${fileName}: ${fetchResponse.status}`);
    }

    return await fetchResponse.blob();
  } catch (error: any) {
    console.error(`Error downloading ${fileName}:`, error);
    throw error;
  }
}
// Recursively collects every file under a folder, building the path each file
// should have inside the resulting ZIP archive.
function collectFolderFiles(
  folderId: string,
  allFiles: any[],
  allFolders: any[],
  folderPath: string = ""
): Array<{ objectName: string; name: string; zipPath: string }> {
  const result: Array<{ objectName: string; name: string; zipPath: string }> = [];

  const directFiles = allFiles.filter((file: any) => file.folderId === folderId);
  for (const file of directFiles) {
    result.push({
      objectName: file.objectName,
      name: file.name,
      zipPath: folderPath + file.name,
    });
  }

  const subfolders = allFolders.filter((folder: any) => folder.parentId === folderId);
  for (const subfolder of subfolders) {
    const subfolderPath = folderPath + subfolder.name + "/";
    const subFiles = collectFolderFiles(subfolder.id, allFiles, allFolders, subfolderPath);
    result.push(...subFiles);
  }

  return result;
}

// Recursively collects subfolders that contain no files, so they can still be
// represented in the ZIP as empty directories.
function collectEmptyFolders(folderId: string, allFiles: any[], allFolders: any[], folderPath: string = ""): string[] {
  const emptyFolders: string[] = [];

  const subfolders = allFolders.filter((folder: any) => folder.parentId === folderId);
  for (const subfolder of subfolders) {
    const subfolderPath = folderPath + subfolder.name + "/";

    const subfolderFiles = collectFolderFiles(subfolder.id, allFiles, allFolders, "");

    if (subfolderFiles.length === 0) {
      emptyFolders.push(subfolderPath.slice(0, -1));
    }

    const nestedEmptyFolders = collectEmptyFolders(subfolder.id, allFiles, allFolders, subfolderPath);
    emptyFolders.push(...nestedEmptyFolders);
  }

  return emptyFolders;
}
export async function downloadFolderWithQueue(
  folderId: string,
  folderName: string,
  options: DownloadWithQueueOptions = {}
): Promise<void> {
  const { silent = false, showToasts = true } = options;
  const downloadId = `folder-${Date.now()}-${Math.random().toString(36).substring(2, 11)}`;

  try {
    if (!silent) {
      options.onStart?.(downloadId);
    }

    const { listFiles } = await import("@/http/endpoints/files");
    const { listFolders } = await import("@/http/endpoints/folders");

    const [allFilesResponse, allFoldersResponse] = await Promise.all([listFiles(), listFolders()]);
    const allFiles = allFilesResponse.data.files || [];
    const allFolders = allFoldersResponse.data.folders || [];

    const folderFiles = collectFolderFiles(folderId, allFiles, allFolders, `${folderName}/`);
    const emptyFolders = collectEmptyFolders(folderId, allFiles, allFolders, `${folderName}/`);

    if (folderFiles.length === 0 && emptyFolders.length === 0) {
      const message = "Folder is empty";
      if (showToasts) {
        toast.error(message);
      }
      throw new Error(message);
    }

    const JSZip = (await import("jszip")).default;
    const zip = new JSZip();

    for (const emptyFolderPath of emptyFolders) {
      zip.folder(emptyFolderPath);
    }

    for (const file of folderFiles) {
      try {
        const blob = await downloadFileAsBlobWithQueue(file.objectName, file.name);
        zip.file(file.zipPath, blob);
      } catch (error) {
        console.error(`Error downloading file ${file.name}:`, error);
      }
    }

    const zipBlob = await zip.generateAsync({ type: "blob" });
    const url = URL.createObjectURL(zipBlob);
    const a = document.createElement("a");
    a.href = url;
    a.download = `${folderName}.zip`;
    document.body.appendChild(a);
    a.click();
    document.body.removeChild(a);
    URL.revokeObjectURL(url);

    if (!silent) {
      options.onComplete?.(downloadId);
      if (showToasts) {
        toast.success(`${folderName} downloaded successfully`);
      }
    }
  } catch (error: any) {
    if (!silent) {
      options.onFail?.(downloadId, error?.message || "Download failed");
      if (showToasts) {
        toast.error(`Failed to download ${folderName}`);
      }
    }
    throw error;
  }
}
export async function downloadShareFolderWithQueue(
  folderId: string,
  folderName: string,
  shareFiles: any[],
  shareFolders: any[],
  options: DownloadWithQueueOptions = {}
): Promise<void> {
  const { silent = false, showToasts = true, sharePassword } = options;
  const downloadId = `share-folder-${Date.now()}-${Math.random().toString(36).substring(2, 11)}`;

  try {
    if (!silent) {
      options.onStart?.(downloadId);
    }

    const folderFiles = collectFolderFiles(folderId, shareFiles, shareFolders, `${folderName}/`);
    const emptyFolders = collectEmptyFolders(folderId, shareFiles, shareFolders, `${folderName}/`);

    if (folderFiles.length === 0 && emptyFolders.length === 0) {
      const message = "Folder is empty";
      if (showToasts) {
        toast.error(message);
      }
      throw new Error(message);
    }

    const JSZip = (await import("jszip")).default;
    const zip = new JSZip();

    for (const emptyFolderPath of emptyFolders) {
      zip.folder(emptyFolderPath);
    }

    for (const file of folderFiles) {
      try {
        const blob = await downloadFileAsBlobWithQueue(file.objectName, file.name, false, undefined, sharePassword);
        zip.file(file.zipPath, blob);
      } catch (error) {
        console.error(`Error downloading file ${file.name}:`, error);
      }
    }

    const zipBlob = await zip.generateAsync({ type: "blob" });
    const url = URL.createObjectURL(zipBlob);
    const a = document.createElement("a");
    a.href = url;
    a.download = `${folderName}.zip`;
    document.body.appendChild(a);
    a.click();
    document.body.removeChild(a);
    URL.revokeObjectURL(url);

    if (!silent) {
      options.onComplete?.(downloadId);
      if (showToasts) {
        toast.success(`${folderName} downloaded successfully`);
      }
    }
  } catch (error: any) {
    if (!silent) {
      options.onFail?.(downloadId, error?.message || "Download failed");
      if (showToasts) {
        toast.error(`Failed to download ${folderName}`);
      }
    }
    throw error;
  }
}
export async function bulkDownloadWithQueue(
  items: Array<{
    objectName?: string;
    name: string;
    id?: string;
    isReverseShare?: boolean;
    type?: "file" | "folder";
  }>,
  zipName: string,
  onProgress?: (current: number, total: number) => void,
  wrapInFolder?: boolean
): Promise<void> {
  try {
    const JSZip = (await import("jszip")).default;
    const zip = new JSZip();

    const files = items.filter((item) => item.type !== "folder");
    const folders = items.filter((item) => item.type === "folder");

    const allFilesToDownload: Array<{ objectName: string; name: string; zipPath: string }> = [];
    const allEmptyFolders: string[] = [];

    if (folders.length > 0) {
      const { listFiles } = await import("@/http/endpoints/files");
      const { listFolders } = await import("@/http/endpoints/folders");

      const [allFilesResponse, allFoldersResponse] = await Promise.all([listFiles(), listFolders()]);
      const allFiles = allFilesResponse.data.files || [];
      const allFolders = allFoldersResponse.data.folders || [];

      const wrapperPath = wrapInFolder ? `${zipName.replace(".zip", "")}/` : "";
      for (const folder of folders) {
        const folderPath = wrapperPath + `${folder.name}/`;
        const folderFiles = collectFolderFiles(folder.id!, allFiles, allFolders, folderPath);
        const emptyFolders = collectEmptyFolders(folder.id!, allFiles, allFolders, folderPath);

        allFilesToDownload.push(...folderFiles);
        allEmptyFolders.push(...emptyFolders);

        // A selected folder with no contents still appears in the ZIP as an empty directory.
        if (folderFiles.length === 0 && emptyFolders.length === 0) {
          allEmptyFolders.push(folderPath.slice(0, -1));
        }
      }

      // Skip loose files that are already included via one of the selected folders.
      const filesInFolders = new Set(allFilesToDownload.map((f) => f.objectName));
      for (const file of files) {
        if (!file.objectName || !filesInFolders.has(file.objectName)) {
          allFilesToDownload.push({
            objectName: file.objectName || file.name,
            name: file.name,
            zipPath: wrapperPath + file.name,
          });
        }
      }
    } else {
      const wrapperPath = wrapInFolder ? `${zipName.replace(".zip", "")}/` : "";
      for (const file of files) {
        allFilesToDownload.push({
          objectName: file.objectName || file.name,
          name: file.name,
          zipPath: wrapperPath + file.name,
        });
      }
    }

    for (const emptyFolderPath of allEmptyFolders) {
      zip.folder(emptyFolderPath);
    }

    for (let i = 0; i < allFilesToDownload.length; i++) {
      const file = allFilesToDownload[i];
      try {
        const blob = await downloadFileAsBlobWithQueue(file.objectName, file.name);
        zip.file(file.zipPath, blob);
        onProgress?.(i + 1, allFilesToDownload.length);
      } catch (error) {
        console.error(`Error downloading file ${file.name}:`, error);
      }
    }

    const zipBlob = await zip.generateAsync({ type: "blob" });
    const url = URL.createObjectURL(zipBlob);
    const a = document.createElement("a");
    a.href = url;
    a.download = zipName.endsWith(".zip") ? zipName : `${zipName}.zip`;
    document.body.appendChild(a);
    a.click();
    document.body.removeChild(a);
    URL.revokeObjectURL(url);
  } catch (error) {
    console.error("Error creating ZIP:", error);
    throw error;
  }
}
export async function bulkDownloadShareWithQueue(
  items: Array<{
    objectName?: string;
    name: string;
    id?: string;
    type?: "file" | "folder";
  }>,
  shareFiles: any[],
  shareFolders: any[],
  zipName: string,
  onProgress?: (current: number, total: number) => void,
  wrapInFolder?: boolean,
  sharePassword?: string
): Promise<void> {
  try {
    const JSZip = (await import("jszip")).default;
    const zip = new JSZip();

    const files = items.filter((item) => item.type !== "folder");
    const folders = items.filter((item) => item.type === "folder");

    const allFilesToDownload: Array<{ objectName: string; name: string; zipPath: string }> = [];
    const allEmptyFolders: string[] = [];

    const wrapperPath = wrapInFolder ? `${zipName.replace(".zip", "")}/` : "";

    for (const folder of folders) {
      const folderPath = wrapperPath + `${folder.name}/`;
      const folderFiles = collectFolderFiles(folder.id!, shareFiles, shareFolders, folderPath);
      const emptyFolders = collectEmptyFolders(folder.id!, shareFiles, shareFolders, folderPath);

      allFilesToDownload.push(...folderFiles);
      allEmptyFolders.push(...emptyFolders);

      if (folderFiles.length === 0 && emptyFolders.length === 0) {
        allEmptyFolders.push(folderPath.slice(0, -1));
      }
    }

    const filesInFolders = new Set(allFilesToDownload.map((f) => f.objectName));
    for (const file of files) {
      if (!file.objectName || !filesInFolders.has(file.objectName)) {
        allFilesToDownload.push({
          objectName: file.objectName!,
          name: file.name,
          zipPath: wrapperPath + file.name,
        });
      }
    }

    for (const emptyFolderPath of allEmptyFolders) {
      zip.folder(emptyFolderPath);
    }

    for (let i = 0; i < allFilesToDownload.length; i++) {
      const file = allFilesToDownload[i];
      try {
        const blob = await downloadFileAsBlobWithQueue(file.objectName, file.name, false, undefined, sharePassword);
        zip.file(file.zipPath, blob);
        onProgress?.(i + 1, allFilesToDownload.length);
      } catch (error) {
        console.error(`Error downloading file ${file.name}:`, error);
      }
    }

    const zipBlob = await zip.generateAsync({ type: "blob" });
    const url = URL.createObjectURL(zipBlob);
    const a = document.createElement("a");
    a.href = url;
    a.download = zipName.endsWith(".zip") ? zipName : `${zipName}.zip`;
    document.body.appendChild(a);
    a.click();
    document.body.removeChild(a);
    URL.revokeObjectURL(url);
  } catch (error) {
    console.error("Error creating ZIP:", error);
    throw error;
  }
}
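The polling helpers above (`waitForDownloadReady` and its reverse-share counterpart) use a capped exponential backoff: a fixed 2-second delay for the first few attempts, then 1.5x growth up to a 10-second ceiling. A standalone sketch of the resulting delay schedule (the name `backoffSchedule` is illustrative, not part of the codebase):

```typescript
// Reproduces the delay schedule the polling helpers apply between 202 responses:
// fixed 2s for the first attempts, then *1.5 growth capped at 10s.
function backoffSchedule(maxAttempts: number, initialDelay = 2000, maxDelay = 10000): number[] {
  const delays: number[] = [];
  let currentDelay = initialDelay;
  for (let attempts = 0; attempts < maxAttempts; attempts++) {
    delays.push(currentDelay); // delay slept on this attempt
    if (attempts > 3 && currentDelay < maxDelay) {
      currentDelay = Math.min(currentDelay * 1.5, maxDelay);
    }
  }
  return delays;
}

console.log(backoffSchedule(10));
// → [2000, 2000, 2000, 2000, 2000, 3000, 4500, 6750, 10000, 10000]
```

With `maxAttempts = 30`, the schedule reaches the 10s cap by the ninth attempt, so a queued file is polled for roughly four minutes before the timeout error is thrown.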
@@ -13,7 +13,7 @@ services:
       # - DEFAULT_LANGUAGE=en-US # Default language for the application (optional, defaults to en-US) | See the docs for see all supported languages
       # - PRESIGNED_URL_EXPIRATION=3600 # Duration in seconds for presigned URL expiration (OPTIONAL - default is 3600 seconds / 1 hour)
       # - SECURE_SITE=true # Set to true if you are using a reverse proxy (OPTIONAL - default is false)

       # Download Memory Management Configuration (OPTIONAL - See documentation for details)
       # - DOWNLOAD_MAX_CONCURRENT=5 # Maximum number of simultaneous downloads (OPTIONAL - auto-scales based on system memory if not set)
       # - DOWNLOAD_MEMORY_THRESHOLD_MB=2048 # Memory threshold in MB before throttling (OPTIONAL - auto-scales based on system memory if not set)
@@ -21,6 +21,7 @@ services:
       # - DOWNLOAD_MIN_FILE_SIZE_GB=3.0 # Minimum file size in GB to activate memory management (OPTIONAL - default is 3.0)
       # - DOWNLOAD_AUTO_SCALE=true # Enable auto-scaling based on system memory (OPTIONAL - default is true)
       # - NODE_OPTIONS=--expose-gc # Enable garbage collection for large file downloads (RECOMMENDED for production)
+      # - NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB=100 # Chunk size in MB for large file uploads (OPTIONAL - auto-calculates if not set)
     ports:
       - "5487:5487" # Web port
       - "3333:3333" # API port (OPTIONAL EXPOSED - ONLY IF YOU WANT TO ACCESS THE API DIRECTLY)
@@ -18,7 +18,7 @@ services:
       # - DEFAULT_LANGUAGE=en-US # Default language for the application (optional, defaults to en-US) | See the docs for see all supported languages
       # - PRESIGNED_URL_EXPIRATION=3600 # Duration in seconds for presigned URL expiration (OPTIONAL - default is 3600 seconds / 1 hour)
       # - SECURE_SITE=true # Set to true if you are using a reverse proxy (OPTIONAL - default is false)

       # Download Memory Management Configuration (OPTIONAL - See documentation for details)
       # - DOWNLOAD_MAX_CONCURRENT=5 # Maximum number of simultaneous downloads (OPTIONAL - auto-scales based on system memory if not set)
       # - DOWNLOAD_MEMORY_THRESHOLD_MB=2048 # Memory threshold in MB before throttling (OPTIONAL - auto-scales based on system memory if not set)
@@ -26,6 +26,7 @@ services:
       # - DOWNLOAD_MIN_FILE_SIZE_GB=3.0 # Minimum file size in GB to activate memory management (OPTIONAL - default is 3.0)
       # - DOWNLOAD_AUTO_SCALE=true # Enable auto-scaling based on system memory (OPTIONAL - default is true)
       # - NODE_OPTIONS=--expose-gc # Enable garbage collection for large file downloads (RECOMMENDED for production)
+      # - NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB=100 # Chunk size in MB for large file uploads (OPTIONAL - auto-calculates if not set)
     ports:
       - "5487:5487" # Web port
       - "3333:3333" # API port (OPTIONAL EXPOSED - ONLY IF YOU WANT TO ACCESS THE API DIRECTLY)
@@ -18,7 +18,7 @@ services:
       # - DEFAULT_LANGUAGE=en-US # Default language for the application (optional, defaults to en-US) | See the docs for see all supported languages
       # - PRESIGNED_URL_EXPIRATION=3600 # Duration in seconds for presigned URL expiration (OPTIONAL - default is 3600 seconds / 1 hour)
       # - SECURE_SITE=true # Set to true if you are using a reverse proxy (OPTIONAL - default is false)

       # Download Memory Management Configuration (OPTIONAL - See documentation for details)
       # - DOWNLOAD_MAX_CONCURRENT=5 # Maximum number of simultaneous downloads (OPTIONAL - auto-scales based on system memory if not set)
       # - DOWNLOAD_MEMORY_THRESHOLD_MB=2048 # Memory threshold in MB before throttling (OPTIONAL - auto-scales based on system memory if not set)
@@ -26,6 +26,7 @@ services:
       # - DOWNLOAD_MIN_FILE_SIZE_GB=3.0 # Minimum file size in GB to activate memory management (OPTIONAL - default is 3.0)
       # - DOWNLOAD_AUTO_SCALE=true # Enable auto-scaling based on system memory (OPTIONAL - default is true)
       # - NODE_OPTIONS=--expose-gc # Enable garbage collection for large file downloads (RECOMMENDED for production)
+      # - NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB=100 # Chunk size in MB for large file uploads (OPTIONAL - auto-calculates if not set)
     ports:
       - "5487:5487" # Web port
       - "3333:3333" # API port (OPTIONAL EXPOSED - ONLY IF YOU WANT TO ACCESS THE API DIRECTLY)
@@ -13,7 +13,7 @@ services:
       # - DEFAULT_LANGUAGE=en-US # Default language for the application (optional, defaults to en-US) | See the docs to see all supported languages
       # - PRESIGNED_URL_EXPIRATION=3600 # Duration in seconds for presigned URL expiration (OPTIONAL - default is 3600 seconds / 1 hour)
       # - SECURE_SITE=true # Set to true if you are using a reverse proxy (OPTIONAL - default is false)

       # Download Memory Management Configuration (OPTIONAL - See documentation for details)
       # - DOWNLOAD_MAX_CONCURRENT=5 # Maximum number of simultaneous downloads (OPTIONAL - auto-scales based on system memory if not set)
       # - DOWNLOAD_MEMORY_THRESHOLD_MB=2048 # Memory threshold in MB before throttling (OPTIONAL - auto-scales based on system memory if not set)
@@ -21,6 +21,7 @@ services:
       # - DOWNLOAD_MIN_FILE_SIZE_GB=3.0 # Minimum file size in GB to activate memory management (OPTIONAL - default is 3.0)
       # - DOWNLOAD_AUTO_SCALE=true # Enable auto-scaling based on system memory (OPTIONAL - default is true)
       # - NODE_OPTIONS=--expose-gc # Enable garbage collection for large file downloads (RECOMMENDED for production)
+      # - NEXT_PUBLIC_UPLOAD_CHUNK_SIZE_MB=100 # Chunk size in MB for large file uploads (OPTIONAL - auto-calculates if not set)
     ports:
       - "5487:5487" # Web port
       - "3333:3333" # API port (OPTIONAL EXPOSED - ONLY IF YOU WANT TO ACCESS THE API DIRECTLY)
@@ -1,6 +1,6 @@
 {
   "name": "palmr-monorepo",
-  "version": "3.2.1-beta",
+  "version": "3.2.3-beta",
   "description": "Palmr monorepo with Husky configuration",
   "private": true,
   "packageManager": "pnpm@10.6.0",