Compare commits

...

30 Commits

Author SHA1 Message Date
ed fb81de3b36 v0.11.24 2021-06-22 17:28:09 +02:00
ed aa4f352301 prefer audio tags in audio files 2021-06-22 17:21:24 +02:00
ed f1a1c2ea45 recover from opening a corrupt database 2021-06-22 17:19:56 +02:00
ed 6249bd4163 add pebkac hints 2021-06-22 17:18:34 +02:00
ed 2579dc64ce update notes 2021-06-21 22:49:28 +00:00
ed 356512270a file extensions dont contain whitespace 2021-06-21 23:50:35 +02:00
ed bed27f2b43 mention fix for the OSD popup on windows 2021-06-21 23:43:07 +02:00
ed 54013d861b v0.11.23 2021-06-21 21:15:56 +02:00
ed ec100210dc support showing album-cover on windows lockscreen 2021-06-21 19:15:22 +00:00
ed 3ab1acf32c v0.11.22 2021-06-21 20:30:29 +02:00
ed 8c28266418 subscribe to media-keys globally as a media player 2021-06-21 20:26:11 +02:00
ed 7f8b8dcb92 scandir is not withable before py3.6 2021-06-21 20:23:35 +02:00
ed 6dd39811d4 disable u2idx if sqlite3 is unavailable 2021-06-21 20:22:54 +02:00
ed 35e2138e3e doc: macos support 2021-06-21 18:42:15 +02:00
ed 239b4e9fe6 v0.11.21 2021-06-20 21:25:18 +02:00
ed 2fcd0e7e72 abandon listing tags in browser when db busy 2021-06-20 21:19:47 +02:00
ed 357347ce3a lower timeout on db reads 2021-06-20 21:03:35 +02:00
ed 36dc1107fb update dbtool desc 2021-06-20 20:05:43 +02:00
ed 0a3bbc4b4a v0.11.20 for real 2021-06-20 19:32:17 +02:00
ed 855b93dcf6 v0.11.20 2021-06-20 18:53:58 +02:00
ed 89b79ba267 fix histpath getting indexed on windows 2021-06-20 17:59:27 +02:00
ed f5651b7d94 dont include hidden colums in /np clips 2021-06-20 17:45:59 +02:00
ed 1881019ede support cygpaths for mtag binaries 2021-06-20 17:45:23 +02:00
ed caba4e974c upgrade dbtool for v4 2021-06-20 17:44:24 +02:00
ed bc3c9613bc cosmetic macos fix on shutdown 2021-06-20 15:50:37 +02:00
ed 15a3ee252e support backslash in filenames 2021-06-20 15:50:06 +02:00
ed be055961ae adjust up2k hashlen to match base64 window 2021-06-20 15:32:36 +02:00
ed e3031bdeec fix up2k folder-upload 2021-06-20 00:00:50 +00:00
ed 75917b9f7c better fallback 2021-06-19 16:21:39 +02:00
ed 910732e02c update build notes 2021-06-19 16:20:35 +02:00
20 changed files with 418 additions and 195 deletions

View File

@@ -51,8 +51,10 @@ turn your phone or raspi into a portable file server with resumable uploads/down
* [sfx](#sfx) * [sfx](#sfx)
* [sfx repack](#sfx-repack) * [sfx repack](#sfx-repack)
* [install on android](#install-on-android) * [install on android](#install-on-android)
* [building](#building)
* [dev env setup](#dev-env-setup) * [dev env setup](#dev-env-setup)
* [how to release](#how-to-release) * [just the sfx](#just-the-sfx)
* [complete release](#complete-release)
* [todo](#todo) * [todo](#todo)
@@ -109,7 +111,7 @@ summary: all planned features work! now please enjoy the bloatening
* ☑ FUSE client (read-only) * ☑ FUSE client (read-only)
* browser * browser
* ☑ tree-view * ☑ tree-view
* ☑ audio player * ☑ audio player (with OS media controls)
* ☑ thumbnails * ☑ thumbnails
* ☑ images using Pillow * ☑ images using Pillow
* ☑ videos using FFmpeg * ☑ videos using FFmpeg
@@ -141,6 +143,9 @@ summary: all planned features work! now please enjoy the bloatening
## not my bugs ## not my bugs
* Windows: folders cannot be accessed if the name ends with `.`
* python or windows bug
* Windows: msys2-python 3.8.6 occasionally throws "RuntimeError: release unlocked lock" when leaving a scoped mutex in up2k * Windows: msys2-python 3.8.6 occasionally throws "RuntimeError: release unlocked lock" when leaving a scoped mutex in up2k
* this is an msys2 bug, the regular windows edition of python is fine * this is an msys2 bug, the regular windows edition of python is fine
@@ -163,15 +168,16 @@ summary: all planned features work! now please enjoy the bloatening
## hotkeys ## hotkeys
the browser has the following hotkeys the browser has the following hotkeys
* `B` toggle breadcrumbs / directory tree
* `I/K` prev/next folder * `I/K` prev/next folder
* `P` parent folder * `M` parent folder
* `G` toggle list / grid view * `G` toggle list / grid view
* `T` toggle thumbnails / icons * `T` toggle thumbnails / icons
* when playing audio: * when playing audio:
* `0..9` jump to 10%..90% * `0..9` jump to 10%..90%
* `U/O` skip 10sec back/forward * `U/O` skip 10sec back/forward
* `J/L` prev/next song * `J/L` prev/next song
* `M` play/pause (also starts playing the folder) * `P` play/pause (also starts playing the folder)
* in the grid view: * in the grid view:
* `S` toggle multiselect * `S` toggle multiselect
* `A/D` zoom * `A/D` zoom
@@ -179,9 +185,9 @@ the browser has the following hotkeys
## tree-mode ## tree-mode
by default there's a breadcrumbs path; you can replace this with a tree-browser sidebar thing by clicking the 🌲 by default there's a breadcrumbs path; you can replace this with a tree-browser sidebar thing by clicking the `🌲` or pressing the `B` hotkey
click `[-]` and `[+]` to adjust the size, and the `[a]` toggles if the tree should widen dynamically as you go deeper or stay fixed-size click `[-]` and `[+]` (or hotkeys `A`/`D`) to adjust the size, and the `[a]` toggles if the tree should widen dynamically as you go deeper or stay fixed-size
## thumbnails ## thumbnails
@@ -275,6 +281,8 @@ up2k has saved a few uploads from becoming corrupted in-transfer already; caught
* you can link a particular timestamp in an audio file by adding it to the URL, such as `&20` / `&20s` / `&1m20` / `&t=1:20` after the `.../#af-c8960dab` * you can link a particular timestamp in an audio file by adding it to the URL, such as `&20` / `&20s` / `&1m20` / `&t=1:20` after the `.../#af-c8960dab`
* if you are using media hotkeys to switch songs and are getting tired of seeing the OSD popup which Windows doesn't let you disable, consider https://ocv.me/dev/?media-osd-bgone.ps1
# searching # searching
@@ -437,7 +445,7 @@ quick summary of more eccentric web-browsers trying to view a directory index:
copyparty returns a truncated sha512sum of your PUT/POST as base64; you can generate the same checksum locally to verify uploads: copyparty returns a truncated sha512sum of your PUT/POST as base64; you can generate the same checksum locally to verify uploads:
b512(){ printf "$((sha512sum||shasum -a512)|sed -E 's/ .*//;s/(..)/\\x\1/g')"|base64|head -c43;} b512(){ printf "$((sha512sum||shasum -a512)|sed -E 's/ .*//;s/(..)/\\x\1/g')"|base64|tr '+/' '-_'|head -c44;}
b512 <movie.mkv b512 <movie.mkv
@@ -532,18 +540,45 @@ echo $?
after the initial setup, you can launch copyparty at any time by running `copyparty` anywhere in Termux after the initial setup, you can launch copyparty at any time by running `copyparty` anywhere in Termux
# dev env setup # building
## dev env setup
mostly optional; if you need a working env for vscode or similar
```sh ```sh
python3 -m venv .venv python3 -m venv .venv
. .venv/bin/activate . .venv/bin/activate
pip install jinja2 # mandatory deps pip install jinja2 # mandatory
pip install Pillow # thumbnail deps pip install mutagen # audio metadata
pip install Pillow pyheif-pillow-opener pillow-avif-plugin # thumbnails
pip install black bandit pylint flake8 # vscode tooling pip install black bandit pylint flake8 # vscode tooling
``` ```
# how to release ## just the sfx
unless you need to modify something in the web-dependencies, it's faster to grab those from a previous release:
```sh
rm -rf copyparty/web/deps
curl -L https://github.com/9001/copyparty/releases/latest/download/copyparty-sfx.py >x.py
python3 x.py -h
rm x.py
mv /tmp/pe-copyparty/copyparty/web/deps/ copyparty/web/
```
then build the sfx using any of the following examples:
```sh
./scripts/make-sfx.sh # both python and sh editions
./scripts/make-sfx.sh no-sh gz # just python with gzip
```
## complete release
also builds the sfx so disregard the sfx section above
in the `scripts` folder: in the `scripts` folder:
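the `b512` shell helper changed in this file can be mirrored in Python; a sketch, assuming (per the updated one-liner) the server returns the first 44 characters of the URL-safe base64 of the full sha512 digest:

```python
import base64
import hashlib

def b512(data: bytes) -> str:
    # sha512 -> URL-safe base64 ('+/' swapped for '-_'), first 44 chars;
    # matches the shell version's `tr '+/' '-_' | head -c44`
    digest = hashlib.sha512(data).digest()
    return base64.urlsafe_b64encode(digest).decode("ascii")[:44]
```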

View File

@@ -48,15 +48,16 @@ you could replace winfsp with [dokan](https://github.com/dokan-dev/dokany/releas
# [`dbtool.py`](dbtool.py) # [`dbtool.py`](dbtool.py)
upgrade utility which can show db info and help transfer data between databases, for example when a new version of copyparty recommends to wipe the DB and reindex because it now collects additional metadata during analysis, but you have some really expensive `-mtp` parsers and want to copy over the tags from the old db upgrade utility which can show db info and help transfer data between databases, for example when a new version of copyparty is incompatible with the old DB and automatically rebuilds the DB from scratch, but you have some really expensive `-mtp` parsers and want to copy over the tags from the old db
for that example (upgrading to v0.11.0), first move the old db aside, launch copyparty, let it rebuild the db until the point where it starts running mtp (colored messages as it adds the mtp tags), then CTRL-C and patch in the old mtp tags from the old db instead for that example (upgrading to v0.11.20), first launch the new version of copyparty like usual, let it make a backup of the old db and rebuild the new db until the point where it starts running mtp (colored messages as it adds the mtp tags), that's when you hit CTRL-C and patch in the old mtp tags from the old db instead
so assuming you have `-mtp` parsers to provide the tags `key` and `.bpm`: so assuming you have `-mtp` parsers to provide the tags `key` and `.bpm`:
``` ```
~/bin/dbtool.py -ls up2k.db cd /mnt/nas/music/.hist
~/bin/dbtool.py -src up2k.db.v0.10.22 up2k.db -cmp ~/src/copyparty/bin/dbtool.py -ls up2k.db
~/bin/dbtool.py -src up2k.db.v0.10.22 up2k.db -rm-mtp-flag -copy key ~/src/copyparty/bin/dbtool.py -src up2k.*.v3 up2k.db -cmp
~/bin/dbtool.py -src up2k.db.v0.10.22 up2k.db -rm-mtp-flag -copy .bpm -vac ~/src/copyparty/bin/dbtool.py -src up2k.*.v3 up2k.db -rm-mtp-flag -copy key
~/src/copyparty/bin/dbtool.py -src up2k.*.v3 up2k.db -rm-mtp-flag -copy .bpm -vac
``` ```

View File

@@ -2,10 +2,13 @@
import os import os
import sys import sys
import time
import shutil
import sqlite3 import sqlite3
import argparse import argparse
DB_VER = 3 DB_VER1 = 3
DB_VER2 = 4
def die(msg): def die(msg):
@@ -45,18 +48,21 @@ def compare(n1, d1, n2, d2, verbose):
nt = next(d1.execute("select count(w) from up"))[0] nt = next(d1.execute("select count(w) from up"))[0]
n = 0 n = 0
miss = 0 miss = 0
for w, rd, fn in d1.execute("select w, rd, fn from up"): for w1, rd, fn in d1.execute("select w, rd, fn from up"):
n += 1 n += 1
if n % 25_000 == 0: if n % 25_000 == 0:
m = f"\033[36mchecked {n:,} of {nt:,} files in {n1} against {n2}\033[0m" m = f"\033[36mchecked {n:,} of {nt:,} files in {n1} against {n2}\033[0m"
print(m) print(m)
q = "select w from up where substr(w,1,16) = ?" if rd.split("/", 1)[0] == ".hist":
hit = d2.execute(q, (w[:16],)).fetchone() continue
q = "select w from up where rd = ? and fn = ?"
hit = d2.execute(q, (rd, fn)).fetchone()
if not hit: if not hit:
miss += 1 miss += 1
if verbose: if verbose:
print(f"file in {n1} missing in {n2}: [{w}] {rd}/{fn}") print(f"file in {n1} missing in {n2}: [{w1}] {rd}/{fn}")
print(f" {miss} files in {n1} missing in {n2}\n") print(f" {miss} files in {n1} missing in {n2}\n")
@@ -64,13 +70,28 @@ def compare(n1, d1, n2, d2, verbose):
n = 0 n = 0
miss = {} miss = {}
nmiss = 0 nmiss = 0
for w, k, v in d1.execute("select * from mt"): for w1, k, v in d1.execute("select * from mt"):
n += 1 n += 1
if n % 100_000 == 0: if n % 100_000 == 0:
m = f"\033[36mchecked {n:,} of {nt:,} tags in {n1} against {n2}, so far {nmiss} missing tags\033[0m" m = f"\033[36mchecked {n:,} of {nt:,} tags in {n1} against {n2}, so far {nmiss} missing tags\033[0m"
print(m) print(m)
v2 = d2.execute("select v from mt where w = ? and +k = ?", (w, k)).fetchone() q = "select rd, fn from up where substr(w,1,16) = ?"
rd, fn = d1.execute(q, (w1,)).fetchone()
if rd.split("/", 1)[0] == ".hist":
continue
q = "select substr(w,1,16) from up where rd = ? and fn = ?"
w2 = d2.execute(q, (rd, fn)).fetchone()
if w2:
w2 = w2[0]
v2 = None
if w2:
v2 = d2.execute(
"select v from mt where w = ? and +k = ?", (w2, k)
).fetchone()
if v2: if v2:
v2 = v2[0] v2 = v2[0]
@@ -99,9 +120,7 @@ def compare(n1, d1, n2, d2, verbose):
miss[k] = 1 miss[k] = 1
if verbose: if verbose:
q = "select rd, fn from up where substr(w,1,16) = ?" print(f"missing in {n2}: [{w1}] [{rd}/{fn}] {k} = {v}")
rd, fn = d1.execute(q, (w,)).fetchone()
print(f"missing in {n2}: [{w}] [{rd}/{fn}] {k} = {v}")
for k, v in sorted(miss.items()): for k, v in sorted(miss.items()):
if v: if v:
@@ -114,24 +133,35 @@ def copy_mtp(d1, d2, tag, rm):
nt = next(d1.execute("select count(w) from mt where k = ?", (tag,)))[0] nt = next(d1.execute("select count(w) from mt where k = ?", (tag,)))[0]
n = 0 n = 0
ndone = 0 ndone = 0
for w, k, v in d1.execute("select * from mt where k = ?", (tag,)): for w1, k, v in d1.execute("select * from mt where k = ?", (tag,)):
n += 1 n += 1
if n % 25_000 == 0: if n % 25_000 == 0:
m = f"\033[36m{n:,} of {nt:,} tags checked, so far {ndone} copied\033[0m" m = f"\033[36m{n:,} of {nt:,} tags checked, so far {ndone} copied\033[0m"
print(m) print(m)
hit = d2.execute("select v from mt where w = ? and +k = ?", (w, k)).fetchone() q = "select rd, fn from up where substr(w,1,16) = ?"
rd, fn = d1.execute(q, (w1,)).fetchone()
if rd.split("/", 1)[0] == ".hist":
continue
q = "select substr(w,1,16) from up where rd = ? and fn = ?"
w2 = d2.execute(q, (rd, fn)).fetchone()
if not w2:
continue
w2 = w2[0]
hit = d2.execute("select v from mt where w = ? and +k = ?", (w2, k)).fetchone()
if hit: if hit:
hit = hit[0] hit = hit[0]
if hit != v: if hit != v:
ndone += 1 ndone += 1
if hit is not None: if hit is not None:
d2.execute("delete from mt where w = ? and +k = ?", (w, k)) d2.execute("delete from mt where w = ? and +k = ?", (w2, k))
d2.execute("insert into mt values (?,?,?)", (w, k, v)) d2.execute("insert into mt values (?,?,?)", (w2, k, v))
if rm: if rm:
d2.execute("delete from mt where w = ? and +k = 't:mtp'", (w,)) d2.execute("delete from mt where w = ? and +k = 't:mtp'", (w2,))
d2.commit() d2.commit()
print(f"copied {ndone} {tag} tags over") print(f"copied {ndone} {tag} tags over")
@@ -168,6 +198,23 @@ def main():
db = sqlite3.connect(ar.db) db = sqlite3.connect(ar.db)
ds = sqlite3.connect(ar.src) if ar.src else None ds = sqlite3.connect(ar.src) if ar.src else None
# revert journals
for d, p in [[db, ar.db], [ds, ar.src]]:
if not d:
continue
pj = "{}-journal".format(p)
if not os.path.exists(pj):
continue
d.execute("create table foo (bar int)")
d.execute("drop table foo")
if ar.copy:
db.close()
shutil.copy2(ar.db, "{}.bak.dbtool.{:x}".format(ar.db, int(time.time())))
db = sqlite3.connect(ar.db)
for d, n in [[ds, "src"], [db, "dst"]]: for d, n in [[ds, "src"], [db, "dst"]]:
if not d: if not d:
continue continue
@@ -176,8 +223,8 @@ def main():
if ver == "corrupt": if ver == "corrupt":
die("{} database appears to be corrupt, sorry") die("{} database appears to be corrupt, sorry")
if ver != DB_VER: if ver < DB_VER1 or ver > DB_VER2:
m = f"{n} db is version {ver}, this tool only supports version {DB_VER}, please upgrade it with copyparty first" m = f"{n} db is version {ver}, this tool only supports versions between {DB_VER1} and {DB_VER2}, please upgrade it with copyparty first"
die(m) die(m)
if ar.ls: if ar.ls:
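the rewritten compare/copy logic above matches rows by path instead of by wark, since warks changed between DB v3 and v4; a toy sketch of that idea with a hypothetical pair of in-memory databases mirroring the `up` schema:

```python
import sqlite3

def make_db(rows):
    # minimal stand-in for up2k's "up" table (w=wark, rd=dir, fn=name)
    db = sqlite3.connect(":memory:")
    db.execute("create table up (w text, mt int, sz int, rd text, fn text)")
    db.executemany("insert into up values (?,?,?,?,?)", rows)
    return db

old = make_db([("wark-v3-aaa", 1, 10, "album", "song.mp3"),
               ("wark-v3-bbb", 1, 10, ".hist/th", "x.webp")])
new = make_db([("wark-v4-ccc", 1, 10, "album", "song.mp3")])

# warks differ between db versions, so pair rows up by (rd, fn)
# and skip anything under .hist, like the new compare() does
matched = []
for w1, rd, fn in old.execute("select w, rd, fn from up"):
    if rd.split("/", 1)[0] == ".hist":
        continue
    hit = new.execute("select w from up where rd = ? and fn = ?", (rd, fn)).fetchone()
    if hit:
        matched.append((w1, hit[0]))
```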

View File

@@ -410,7 +410,7 @@ def main(argv=None):
+ " (if you crash with codec errors then that is why)" + " (if you crash with codec errors then that is why)"
) )
if WINDOWS and sys.version_info < (3, 6): if sys.version_info < (3, 6):
al.no_scandir = True al.no_scandir = True
# signal.signal(signal.SIGINT, sighandler) # signal.signal(signal.SIGINT, sighandler)

View File

@@ -1,8 +1,8 @@
# coding: utf-8 # coding: utf-8
VERSION = (0, 11, 19) VERSION = (0, 11, 24)
CODENAME = "the grid" CODENAME = "the grid"
BUILD_DT = (2021, 6, 19) BUILD_DT = (2021, 6, 22)
S_VERSION = ".".join(map(str, VERSION)) S_VERSION = ".".join(map(str, VERSION))
S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT) S_BUILD_DT = "{0:04d}-{1:02d}-{2:02d}".format(*BUILD_DT)

View File

@@ -10,7 +10,7 @@ import hashlib
import threading import threading
from .__init__ import WINDOWS from .__init__ import WINDOWS
from .util import IMPLICATIONS, undot, Pebkac, fsdec, fsenc, statdir, nuprint from .util import IMPLICATIONS, uncyg, undot, Pebkac, fsdec, fsenc, statdir, nuprint
class VFS(object): class VFS(object):
@@ -439,8 +439,8 @@ class AuthSrv(object):
raise Exception("invalid -v argument: [{}]".format(v_str)) raise Exception("invalid -v argument: [{}]".format(v_str))
src, dst, perms = m.groups() src, dst, perms = m.groups()
if WINDOWS and src.startswith("/"): if WINDOWS:
src = "{}:\\{}".format(src[1], src[3:]) src = uncyg(src)
# print("\n".join([src, dst, perms])) # print("\n".join([src, dst, perms]))
src = fsdec(os.path.abspath(fsenc(src))) src = fsdec(os.path.abspath(fsenc(src)))
@@ -469,6 +469,17 @@ class AuthSrv(object):
print(m.format(cfg_fn, self.line_ctr)) print(m.format(cfg_fn, self.line_ctr))
raise raise
# case-insensitive; normalize
if WINDOWS:
cased = {}
for k, v in mount.items():
try:
cased[k] = fsdec(os.path.realpath(fsenc(v)))
except:
cased[k] = v
mount = cased
if not mount: if not mount:
# -h says our defaults are CWD at root and read/write for everyone # -h says our defaults are CWD at root and read/write for everyone
vfs = VFS(os.path.abspath("."), "", ["*"], ["*"]) vfs = VFS(os.path.abspath("."), "", ["*"], ["*"])
@@ -524,9 +535,7 @@ class AuthSrv(object):
if vflag == "-": if vflag == "-":
pass pass
elif vflag: elif vflag:
if WINDOWS and vflag.startswith("/"): vol.histpath = uncyg(vflag) if WINDOWS else vflag
vflag = "{}:\\{}".format(vflag[1], vflag[3:])
vol.histpath = vflag
elif self.args.hist: elif self.args.hist:
for nch in range(len(hid)): for nch in range(len(hid)):
hpath = os.path.join(self.args.hist, hid[: nch + 1]) hpath = os.path.join(self.args.hist, hid[: nch + 1])
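`uncyg` itself is not shown in full in this comparison; judging by the inline code it replaces (`"{}:\\{}".format(src[1], src[3:])`), a sketch might look like this — note it only rewrites the drive-letter prefix, leaving inner forward slashes alone:

```python
def uncyg(path: str) -> str:
    # sketch: cygwin/msys drive paths like /c/users/ed become c:\users/ed
    # (windows accepts mixed separators); anything else passes through
    if len(path) < 2 or not path.startswith("/"):
        return path
    if len(path) > 2 and path[2] != "/":
        return path  # e.g. /usr/bin is not a drive path
    return "{}:\\{}".format(path[1], path[3:])
```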

View File

@@ -10,7 +10,6 @@ import json
import string import string
import socket import socket
import ctypes import ctypes
import traceback
from datetime import datetime from datetime import datetime
import calendar import calendar
@@ -50,6 +49,7 @@ class HttpCli(object):
self.tls = hasattr(self.s, "cipher") self.tls = hasattr(self.s, "cipher")
self.bufsz = 1024 * 32 self.bufsz = 1024 * 32
self.hint = None
self.absolute_urls = False self.absolute_urls = False
self.out_headers = {"Access-Control-Allow-Origin": "*"} self.out_headers = {"Access-Control-Allow-Origin": "*"}
@@ -72,6 +72,7 @@ class HttpCli(object):
"""returns true if connection can be reused""" """returns true if connection can be reused"""
self.keepalive = False self.keepalive = False
self.headers = {} self.headers = {}
self.hint = None
try: try:
headerlines = read_header(self.sr) headerlines = read_header(self.sr)
if not headerlines: if not headerlines:
@@ -115,7 +116,7 @@ class HttpCli(object):
try: try:
self.ip = vs[n].strip() self.ip = vs[n].strip()
except: except:
self.ip = vs[-1].strip() self.ip = vs[0].strip()
self.log("rproxy={} oob x-fwd {}".format(self.args.rproxy, v), c=3) self.log("rproxy={} oob x-fwd {}".format(self.args.rproxy, v), c=3)
self.log_src = self.conn.set_rproxy(self.ip) self.log_src = self.conn.set_rproxy(self.ip)
@@ -130,6 +131,9 @@ class HttpCli(object):
if v is not None: if v is not None:
self.log("[H] {}: \033[33m[{}]".format(k, v), 6) self.log("[H] {}: \033[33m[{}]".format(k, v), 6)
if "&" in self.req and "?" not in self.req:
self.hint = "did you mean '?' instead of '&'"
# split req into vpath + uparam # split req into vpath + uparam
uparam = {} uparam = {}
if "?" not in self.req: if "?" not in self.req:
@@ -199,6 +203,9 @@ class HttpCli(object):
self.log("{}\033[0m, {}".format(str(ex), self.vpath), 3) self.log("{}\033[0m, {}".format(str(ex), self.vpath), 3)
msg = "<pre>{}\r\nURL: {}\r\n".format(str(ex), self.vpath) msg = "<pre>{}\r\nURL: {}\r\n".format(str(ex), self.vpath)
if self.hint:
msg += "hint: {}\r\n".format(self.hint)
self.reply(msg.encode("utf-8", "replace"), status=ex.code) self.reply(msg.encode("utf-8", "replace"), status=ex.code)
return self.keepalive return self.keepalive
except Pebkac: except Pebkac:
@@ -581,8 +588,10 @@ class HttpCli(object):
if sub: if sub:
try: try:
dst = os.path.join(vfs.realpath, rem) dst = os.path.join(vfs.realpath, rem)
if not os.path.isdir(fsenc(dst)):
os.makedirs(fsenc(dst)) os.makedirs(fsenc(dst))
except OSError as ex: except OSError as ex:
self.log("makedirs failed [{}]".format(dst))
if ex.errno == 13: if ex.errno == 13:
raise Pebkac(500, "the server OS denied write-access") raise Pebkac(500, "the server OS denied write-access")
@@ -1763,16 +1772,27 @@ class HttpCli(object):
fn = f["name"] fn = f["name"]
rd = f["rd"] rd = f["rd"]
del f["rd"] del f["rd"]
if icur: if not icur:
break
if vn != dbv: if vn != dbv:
_, rd = vn.get_dbv(rd) _, rd = vn.get_dbv(rd)
q = "select w from up where rd = ? and fn = ?" q = "select w from up where rd = ? and fn = ?"
r = None
try: try:
r = icur.execute(q, (rd, fn)).fetchone() r = icur.execute(q, (rd, fn)).fetchone()
except: except Exception as ex:
if "database is locked" in str(ex):
break
try:
args = s3enc(idx.mem_cur, rd, fn) args = s3enc(idx.mem_cur, rd, fn)
r = icur.execute(q, args).fetchone() r = icur.execute(q, args).fetchone()
except:
m = "tag list error, {}/{}\n{}"
self.log(m.format(rd, fn, min_ex()))
break
tags = {} tags = {}
f["tags"] = tags f["tags"] = tags
@@ -1782,9 +1802,14 @@ class HttpCli(object):
w = r[0][:16] w = r[0][:16]
q = "select k, v from mt where w = ? and k != 'x'" q = "select k, v from mt where w = ? and k != 'x'"
try:
for k, v in icur.execute(q, (w,)): for k, v in icur.execute(q, (w,)):
taglist[k] = True taglist[k] = True
tags[k] = v tags[k] = v
except:
m = "tag read error, {}/{} [{}]:\n{}"
self.log(m.format(rd, fn, w, min_ex()))
break
if icur: if icur:
taglist = [k for k in vn.flags.get("mte", "").split(",") if k in taglist] taglist = [k for k in vn.flags.get("mte", "").split(",") if k in taglist]
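the "better fallback" change above swaps `vs[-1]` for `vs[0]` when the configured `rproxy` index overshoots the X-Forwarded-For list; a sketch of that parsing, with the function name and rproxy semantics assumed from context:

```python
def pick_client_ip(xff: str, rproxy: int) -> str:
    # x-forwarded-for is "client, proxy1, proxy2, ..."; rproxy picks
    # which entry to trust (e.g. -1 = the entry added by the nearest proxy)
    vs = xff.split(",")
    try:
        return vs[rproxy].strip()
    except IndexError:
        # index out of bounds: fall back to the leftmost entry,
        # matching the vs[-1] -> vs[0] change in this diff
        return vs[0].strip()
```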

View File

@@ -3,7 +3,6 @@ from __future__ import print_function, unicode_literals
import re import re
import os import os
import sys
import time import time
import socket import socket

View File

@@ -8,7 +8,7 @@ import shutil
import subprocess as sp import subprocess as sp
from .__init__ import PY2, WINDOWS from .__init__ import PY2, WINDOWS
from .util import fsenc, fsdec, REKOBO_LKEY from .util import fsenc, fsdec, uncyg, REKOBO_LKEY
if not PY2: if not PY2:
unicode = str unicode = str
@@ -44,6 +44,9 @@ class MParser(object):
while True: while True:
try: try:
bp = os.path.expanduser(args) bp = os.path.expanduser(args)
if WINDOWS:
bp = uncyg(bp)
if os.path.exists(bp): if os.path.exists(bp):
self.bin = bp self.bin = bp
return return
@@ -112,6 +115,19 @@ def parse_ffprobe(txt):
ret = {} # processed ret = {} # processed
md = {} # raw tags md = {} # raw tags
is_audio = fmt.get("format_name") in ["mp3", "ogg", "flac", "wav"]
if fmt.get("filename", "").split(".")[-1].lower() in ["m4a", "aac"]:
is_audio = True
# if audio file, ensure audio stream appears first
if (
is_audio
and len(streams) > 2
and streams[1].get("codec_type") != "audio"
and streams[2].get("codec_type") == "audio"
):
streams = [fmt, streams[2], streams[1]] + streams[3:]
have = {} have = {}
for strm in streams: for strm in streams:
typ = strm.get("codec_type") typ = strm.get("codec_type")
@@ -131,9 +147,7 @@ def parse_ffprobe(txt):
] ]
if typ == "video": if typ == "video":
if strm.get("DISPOSITION:attached_pic") == "1" or fmt.get( if strm.get("DISPOSITION:attached_pic") == "1" or is_audio:
"format_name"
) in ["mp3", "ogg", "flac"]:
continue continue
kvm = [ kvm = [
@@ -177,7 +191,7 @@ def parse_ffprobe(txt):
k = k[4:].strip() k = k[4:].strip()
v = v.strip() v = v.strip()
if k and v: if k and v and k not in md:
md[k] = [v] md[k] = [v]
for k in [".q", ".vq", ".aq"]: for k in [".q", ".vq", ".aq"]:
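the stream-reordering added to `parse_ffprobe` above ensures the audio stream of an audio file is listed before any embedded cover-art video stream, so its tags take priority; extracted as a standalone sketch (the function name is invented, `streams[0]` being the format entry is taken from the diff):

```python
def prefer_audio(streams, is_audio):
    # for audio files where ffprobe lists cover-art video first,
    # swap so the audio stream comes before the video stream
    if (
        is_audio
        and len(streams) > 2
        and streams[1].get("codec_type") != "audio"
        and streams[2].get("codec_type") == "audio"
    ):
        return [streams[0], streams[2], streams[1]] + streams[3:]
    return streams
```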

View File

@@ -79,7 +79,14 @@ class TcpSrv(object):
if self.args.log_conn: if self.args.log_conn:
self.log("tcpsrv", "|%sC-acc1" % ("-" * 2,), c="1;30") self.log("tcpsrv", "|%sC-acc1" % ("-" * 2,), c="1;30")
try:
# macos throws bad-fd
ready, _, _ = select.select(self.srv, [], []) ready, _, _ = select.select(self.srv, [], [])
except:
ready = []
if not self.stopping:
raise
for srv in ready: for srv in ready:
if self.stopping: if self.stopping:
break break

View File

@@ -84,14 +84,14 @@ def thumb_path(histpath, rem, mtime, fmt):
fn = rem fn = rem
if rd: if rd:
h = hashlib.sha512(fsenc(rd)).digest()[:24] h = hashlib.sha512(fsenc(rd)).digest()
b64 = base64.urlsafe_b64encode(h).decode("ascii")[:24] b64 = base64.urlsafe_b64encode(h).decode("ascii")[:24]
rd = "{}/{}/".format(b64[:2], b64[2:4]).lower() + b64 rd = "{}/{}/".format(b64[:2], b64[2:4]).lower() + b64
else: else:
rd = "top" rd = "top"
# could keep original filenames but this is safer re pathlen # could keep original filenames but this is safer re pathlen
h = hashlib.sha512(fsenc(fn)).digest()[:24] h = hashlib.sha512(fsenc(fn)).digest()
fn = base64.urlsafe_b64encode(h).decode("ascii")[:24] fn = base64.urlsafe_b64encode(h).decode("ascii")[:24]
return "{}/th/{}/{}.{:x}.{}".format( return "{}/th/{}/{}.{:x}.{}".format(
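the `thumb_path` change above hashes the full sha512 digest before taking 24 base64 characters (instead of truncating the digest first); the directory-sharding part can be sketched like so, following the code in the hunk:

```python
import base64
import hashlib

def thumb_rd(rd: str) -> str:
    # derive a fixed-length, filesystem-safe directory name from the
    # original path: sha512 -> urlsafe base64, keep 24 chars, and shard
    # into two lowercased 2-char levels like aa/bb/AaBbCcDd...
    if not rd:
        return "top"
    h = hashlib.sha512(rd.encode("utf-8")).digest()
    b64 = base64.urlsafe_b64encode(h).decode("ascii")[:24]
    return "{}/{}/".format(b64[:2], b64[2:4]).lower() + b64
```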

View File

@@ -26,7 +26,7 @@ class U2idx(object):
self.timeout = self.args.srch_time self.timeout = self.args.srch_time
if not HAVE_SQLITE3: if not HAVE_SQLITE3:
self.log("could not load sqlite3; searchign wqill be disabled") self.log("your python does not have sqlite3; searching will be disabled")
return return
self.cur = {} self.cur = {}
@@ -57,6 +57,9 @@ class U2idx(object):
raise Pebkac(500, min_ex()) raise Pebkac(500, min_ex())
def get_cur(self, ptop): def get_cur(self, ptop):
if not HAVE_SQLITE3:
return None
cur = self.cur.get(ptop) cur = self.cur.get(ptop)
if cur: if cur:
return cur return cur
@@ -66,7 +69,7 @@ class U2idx(object):
if not os.path.exists(db_path): if not os.path.exists(db_path):
return None return None
cur = sqlite3.connect(db_path).cursor() cur = sqlite3.connect(db_path, 2).cursor()
self.cur[ptop] = cur self.cur[ptop] = cur
return cur return cur
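the `sqlite3.connect(db_path, 2)` change above uses the second positional argument of `connect`, which is the busy-timeout in seconds; lowering it from the 5-second default means a search gives up quickly when up2k holds the write lock, per the "lower timeout on db reads" commit. A minimal demonstration of the call shape:

```python
import sqlite3

# second positional arg = busy-timeout in seconds (default 5.0);
# an in-memory db stands in for the real up2k.db here
cur = sqlite3.connect(":memory:", 2).cursor()
cur.execute("create table up (w text)")
cur.execute("insert into up values ('x')")
rows = cur.execute("select w from up").fetchall()
```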

View File

@@ -30,6 +30,7 @@ from .util import (
s3dec, s3dec,
statdir, statdir,
s2hms, s2hms,
min_ex,
) )
from .mtag import MTag, MParser from .mtag import MTag, MParser
@@ -39,6 +40,8 @@ try:
except: except:
HAVE_SQLITE3 = False HAVE_SQLITE3 = False
DB_VER = 4
class Up2k(object): class Up2k(object):
""" """
@@ -91,7 +94,7 @@ class Up2k(object):
thr.start() thr.start()
# static # static
self.r_hash = re.compile("^[0-9a-zA-Z_-]{43}$") self.r_hash = re.compile("^[0-9a-zA-Z_-]{44}$")
if not HAVE_SQLITE3: if not HAVE_SQLITE3:
self.log("could not initialize sqlite3, will use in-memory registry only") self.log("could not initialize sqlite3, will use in-memory registry only")
@@ -422,7 +425,10 @@ class Up2k(object):
ret += self._build_dir(dbw, top, excl, abspath, nohash) ret += self._build_dir(dbw, top, excl, abspath, nohash)
else: else:
# self.log("file: {}".format(abspath)) # self.log("file: {}".format(abspath))
rp = abspath[len(top) :].replace("\\", "/").strip("/") rp = abspath[len(top) + 1 :]
if WINDOWS:
rp = rp.replace("\\", "/").strip("/")
rd, fn = rp.rsplit("/", 1) if "/" in rp else ["", rp] rd, fn = rp.rsplit("/", 1) if "/" in rp else ["", rp]
sql = "select w, mt, sz from up where rd = ? and fn = ?" sql = "select w, mt, sz from up where rd = ? and fn = ?"
try: try:
@@ -647,7 +653,7 @@ class Up2k(object):
try: try:
parser = MParser(parser) parser = MParser(parser)
except: except:
self.log("invalid argument: " + parser, 1) self.log("invalid argument (could not find program): " + parser, 1)
return return
for tag in entags: for tag in entags:
@@ -887,59 +893,31 @@ class Up2k(object):
if not existed and ver is None: if not existed and ver is None:
return self._create_db(db_path, cur) return self._create_db(db_path, cur)
orig_ver = ver if ver == DB_VER:
if not ver or ver < 3: try:
nfiles = next(cur.execute("select count(w) from up"))[0]
self.log("OK: {} |{}|".format(db_path, nfiles))
return cur
except:
self.log("WARN: could not list files; DB corrupt?\n" + min_ex())
if (ver or 0) > DB_VER:
m = "database is version {}, this copyparty only supports versions <= {}"
raise Exception(m.format(ver, DB_VER))
bak = "{}.bak.{:x}.v{}".format(db_path, int(time.time()), ver) bak = "{}.bak.{:x}.v{}".format(db_path, int(time.time()), ver)
db = cur.connection db = cur.connection
cur.close() cur.close()
db.close() db.close()
msg = "creating new DB (old is bad); backup: {}" msg = "creating new DB (old is bad); backup: {}"
if ver: if ver:
msg = "creating backup before upgrade: {}" msg = "creating new DB (too old to upgrade); backup: {}"
self.log(msg.format(bak)) self.log(msg.format(bak))
shutil.copy2(db_path, bak) os.rename(fsenc(db_path), fsenc(bak))
cur = self._orz(db_path)
if ver == 1:
cur = self._upgrade_v1(cur, db_path)
if cur:
ver = 2
if ver == 2:
cur = self._create_v3(cur)
ver = self._read_ver(cur) if cur else None
if ver == 3:
if orig_ver != ver:
cur.connection.commit()
cur.execute("vacuum")
cur.connection.commit()
try:
nfiles = next(cur.execute("select count(w) from up"))[0]
self.log("OK: {} |{}|".format(db_path, nfiles))
return cur
except Exception as ex:
self.log("WARN: could not list files, DB corrupt?\n " + repr(ex))
if cur:
db = cur.connection
cur.close()
db.close()
return self._create_db(db_path, None) return self._create_db(db_path, None)
def _create_db(self, db_path, cur):
if not cur:
cur = self._orz(db_path)
self._create_v2(cur)
self._create_v3(cur)
cur.connection.commit()
self.log("created DB at {}".format(db_path))
return cur
def _read_ver(self, cur): def _read_ver(self, cur):
for tab in ["ki", "kv"]: for tab in ["ki", "kv"]:
try: try:
@@ -951,65 +929,38 @@ class Up2k(object):
if rows: if rows:
return int(rows[0][0]) return int(rows[0][0])
def _create_v2(self, cur): def _create_db(self, db_path, cur):
for cmd in [
r"create table up (w text, mt int, sz int, rd text, fn text)",
r"create index up_rd on up(rd)",
r"create index up_fn on up(fn)",
]:
cur.execute(cmd)
return cur
def _create_v3(self, cur):
""" """
collision in 2^(n/2) files where n = bits (6 bits/ch) collision in 2^(n/2) files where n = bits (6 bits/ch)
10*6/2 = 2^30 = 1'073'741'824, 24.1mb idx 1<<(3*10) 10*6/2 = 2^30 = 1'073'741'824, 24.1mb idx 1<<(3*10)
12*6/2 = 2^36 = 68'719'476'736, 24.8mb idx 12*6/2 = 2^36 = 68'719'476'736, 24.8mb idx
16*6/2 = 2^48 = 281'474'976'710'656, 26.1mb idx 16*6/2 = 2^48 = 281'474'976'710'656, 26.1mb idx
""" """
for c, ks in [["drop table k", "isv"], ["drop index up_", "w"]]: if not cur:
for k in ks: cur = self._orz(db_path)
try:
cur.execute(c + k)
except:
pass
idx = r"create index up_w on up(substr(w,1,16))" idx = r"create index up_w on up(substr(w,1,16))"
if self.no_expr_idx: if self.no_expr_idx:
idx = r"create index up_w on up(w)" idx = r"create index up_w on up(w)"
for cmd in [ for cmd in [
r"create table up (w text, mt int, sz int, rd text, fn text)",
r"create index up_rd on up(rd)",
r"create index up_fn on up(fn)",
idx, idx,
r"create table mt (w text, k text, v int)", r"create table mt (w text, k text, v int)",
r"create index mt_w on mt(w)", r"create index mt_w on mt(w)",
r"create index mt_k on mt(k)", r"create index mt_k on mt(k)",
r"create index mt_v on mt(v)", r"create index mt_v on mt(v)",
r"create table kv (k text, v int)", r"create table kv (k text, v int)",
r"insert into kv values ('sver', 3)", r"insert into kv values ('sver', {})".format(DB_VER),
]: ]:
cur.execute(cmd) cur.execute(cmd)
cur.connection.commit()
self.log("created DB at {}".format(db_path))
return cur return cur
def _upgrade_v1(self, odb, db_path):
npath = db_path + ".next"
if os.path.exists(npath):
os.unlink(npath)
ndb = self._orz(npath)
self._create_v2(ndb)
c = odb.execute("select * from up")
for wark, ts, sz, rp in c:
rd, fn = rp.rsplit("/", 1) if "/" in rp else ["", rp]
v = (wark, ts, sz, rd, fn)
ndb.execute("insert into up values (?,?,?,?,?)", v)
ndb.connection.commit()
ndb.connection.close()
odb.connection.close()
atomic_move(npath, db_path)
return self._orz(db_path)
    def handle_json(self, cj):
        with self.mutex:
            if not self.register_vpath(cj["ptop"], cj["vcfg"]):
@@ -1316,9 +1267,9 @@ class Up2k(object):
                hashobj.update(buf)
                rem -= len(buf)
-            digest = hashobj.digest()[:32]
+            digest = hashobj.digest()[:33]
            digest = base64.urlsafe_b64encode(digest)
-            ret.append(digest.decode("utf-8").rstrip("="))
+            ret.append(digest.decode("utf-8"))
        return ret
@@ -1518,12 +1469,12 @@ def up2k_wark_from_hashlist(salt, filesize, hashes):
    ident.extend(hashes)
    ident = "\n".join(ident)
-    wark = hashlib.sha512(ident.encode("utf-8")).digest()
+    wark = hashlib.sha512(ident.encode("utf-8")).digest()[:33]
    wark = base64.urlsafe_b64encode(wark)
-    return wark.decode("ascii")[:43]
+    return wark.decode("ascii")

def up2k_wark_from_metadata(salt, sz, lastmod, rd, fn):
    ret = fsenc("{}\n{}\n{}\n{}\n{}".format(salt, lastmod, sz, rd, fn))
    ret = base64.urlsafe_b64encode(hashlib.sha512(ret).digest())
-    return "#{}".format(ret[:42].decode("ascii"))
+    return "#{}".format(ret.decode("ascii"))[:44]
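These hunks are the "adjust up2k hashlen to match base64 window" change: base64 encodes 3 bytes per 4 output characters, so a 33-byte digest fills the encoding window exactly (44 chars, no `=` padding), whereas the old 32-byte truncation produced padding that had to be stripped. A quick standalone demonstration:

```python
import base64
import hashlib

# 32 bytes -> 44 chars incl. one '=' pad (43 useful chars);
# 33 bytes -> exactly 44 chars, padding-free
digest = hashlib.sha512(b"demo").digest()

enc32 = base64.urlsafe_b64encode(digest[:32]).decode("ascii")
enc33 = base64.urlsafe_b64encode(digest[:33]).decode("ascii")

print(len(enc32), enc32.count("="))  # 44 1
print(len(enc33), enc33.count("="))  # 44 0
```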


@@ -351,7 +351,7 @@ def ren_open(fname, *args, **kwargs):
        if not b64:
            b64 = (bname + ext).encode("utf-8", "replace")
            b64 = hashlib.sha512(b64).digest()[:12]
-            b64 = base64.urlsafe_b64encode(b64).decode("utf-8").rstrip("=")
+            b64 = base64.urlsafe_b64encode(b64).decode("utf-8")

        badlen = len(fname)
        while len(fname) >= badlen:
@@ -648,6 +648,16 @@ def s2hms(s, optional_h=False):
    return "{}:{:02}:{:02}".format(h, m, s)

+def uncyg(path):
+    if len(path) < 2 or not path.startswith("/"):
+        return path
+
+    if len(path) > 2 and path[2] != "/":
+        return path
+
+    return "{}:\\{}".format(path[1], path[3:])
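The `uncyg` helper added above translates cygwin-style paths into windows drive paths (per the "support cygpaths for mtag binaries" commit); anything not matching the `/<drive>/...` shape passes through untouched. A standalone copy to show the behavior:

```python
def uncyg(path):
    # cygpath -> windows drive path; non-matching paths unchanged
    if len(path) < 2 or not path.startswith("/"):
        return path

    if len(path) > 2 and path[2] != "/":
        return path

    return "{}:\\{}".format(path[1], path[3:])

print(uncyg("/c/ffmpeg/bin"))  # c:\ffmpeg/bin
print(uncyg("relative/path"))  # relative/path (unchanged)
print(uncyg("/tmp/x"))         # /tmp/x ("tm" is not a drive pattern)
```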
def undot(path):
    ret = []
    for node in path.split("/"):
@@ -908,8 +918,8 @@ def hashcopy(actor, fin, fout):
        hashobj.update(buf)
        fout.write(buf)
-    digest32 = hashobj.digest()[:32]
-    digest_b64 = base64.urlsafe_b64encode(digest32).decode("utf-8").rstrip("=")
+    digest = hashobj.digest()[:33]
+    digest_b64 = base64.urlsafe_b64encode(digest).decode("utf-8")
    return tlen, hashobj.hexdigest(), digest_b64


@@ -811,10 +811,12 @@ input.eq_gain {
    padding: 0;
    border-bottom: 1px solid #555;
}
-#thumbs {
+#thumbs,
+#au_osd_cv {
    opacity: .3;
}
-#griden.on+#thumbs {
+#griden.on+#thumbs,
+#au_os_ctl.on+#au_osd_cv {
    opacity: 1;
}
#ghead {
@@ -969,6 +971,9 @@ html.light #treeul a.hl {
    background: #07a;
    color: #fff;
}
+html.light #treeul a.hl:hover {
+    background: #059;
+}
html.light #tree li {
    border-color: #f7f7f7 #fff #ddd #fff;
}


@@ -222,10 +222,14 @@ var have_webp = null;
var mpl = (function () {
+    var have_mctl = 'mediaSession' in navigator && window.MediaMetadata;

    ebi('op_player').innerHTML = (
        '<div><h3>switches</h3><div>' +
        '<a href="#" class="tgl btn" id="au_preload" tt="start loading the next song near the end for gapless playback">preload</a>' +
        '<a href="#" class="tgl btn" id="au_npclip" tt="show buttons for clipboarding the currently playing song">/np clip</a>' +
+        '<a href="#" class="tgl btn" id="au_os_ctl" tt="os integration (media hotkeys / osd)">os-ctl</a>' +
+        '<a href="#" class="tgl btn" id="au_osd_cv" tt="show album cover in osd">osd-cv</a>' +
        '</div></div>' +
        '<div><h3>playback mode</h3><div id="pb_mode">' +
@@ -238,7 +242,9 @@ var mpl = (function () {
    var r = {
        "pb_mode": sread('pb_mode') || 'loop-folder',
        "preload": bcfg_get('au_preload', true),
-        "clip": bcfg_get('au_npclip', false)
+        "clip": bcfg_get('au_npclip', false),
+        "os_ctl": bcfg_get('au_os_ctl', false) && have_mctl,
+        "osd_cv": bcfg_get('au_osd_cv', true),
    };

    ebi('au_preload').onclick = function (e) {
@@ -254,6 +260,20 @@ var mpl = (function () {
        clmod(ebi('wtoggle'), 'np', r.clip && mp.au);
    };

+    ebi('au_os_ctl').onclick = function (e) {
+        ev(e);
+        r.os_ctl = !r.os_ctl && have_mctl;
+        bcfg_set('au_os_ctl', r.os_ctl);
+        if (!have_mctl)
+            alert('need firefox 82+ or chrome 73+');
+    };
+
+    ebi('au_osd_cv').onclick = function (e) {
+        ev(e);
+        r.osd_cv = !r.osd_cv;
+        bcfg_set('au_osd_cv', r.osd_cv);
+    };
+
    function draw_pb_mode() {
        var btns = QSA('#pb_mode>a');
        for (var a = 0, aa = btns.length; a < aa; a++) {
@@ -270,6 +290,55 @@ var mpl = (function () {
        draw_pb_mode();
    }

+    r.announce = function () {
+        if (!r.os_ctl)
+            return;
+
+        var np = get_np()[0],
+            fns = np.file.split(' - '),
+            artist = (np.circle ? np.circle + ' // ' : '') + (np.artist || (fns.length > 1 ? fns[0] : '')),
+            tags = {
+                title: np.title || fns.slice(-1)[0]
+            };
+
+        if (artist)
+            tags.artist = artist;
+
+        if (np.album)
+            tags.album = np.album;
+
+        if (r.osd_cv) {
+            var files = QSA("#files tr>td:nth-child(2)>a[id]"),
+                cover = null;
+
+            for (var a = 0, aa = files.length; a < aa; a++) {
+                if (/^(cover|folder)\.(jpe?g|png|gif)$/.test(files[a].textContent)) {
+                    cover = files[a].getAttribute('href');
+                    break;
+                }
+            }
+            if (cover) {
+                cover += (cover.indexOf('?') === -1 ? '?' : '&') + 'th=j';
+                var pwd = get_pwd();
+                if (pwd)
+                    cover += '&pw=' + uricom_enc(pwd);
+
+                tags.artwork = [{ "src": cover, type: "image/jpeg" }];
+            }
+        }
+
+        navigator.mediaSession.metadata = new MediaMetadata(tags);
+        navigator.mediaSession.playbackState = mp.au.paused ? "paused" : "playing";
+        navigator.mediaSession.setActionHandler('play', playpause);
+        navigator.mediaSession.setActionHandler('pause', playpause);
+        navigator.mediaSession.setActionHandler('seekbackward', function () { seek_au_rel(-10); });
+        navigator.mediaSession.setActionHandler('seekforward', function () { seek_au_rel(10); });
+        navigator.mediaSession.setActionHandler('previoustrack', prev_song);
+        navigator.mediaSession.setActionHandler('nexttrack', next_song);
+    };
+
    return r;
})();
@@ -365,6 +434,28 @@ var mp = new MPlayer();
makeSortable(ebi('files'), mp.read_order.bind(mp));

+function get_np() {
+    var th = ebi('files').tHead.rows[0].cells,
+        tr = QS('#files tr.play').cells,
+        rv = [],
+        ra = [],
+        rt = {};
+
+    for (var a = 1, aa = th.length; a < aa; a++) {
+        var tv = tr[a].textContent,
+            tk = a == 1 ? 'file' : th[a].getAttribute('name').split('/').slice(-1)[0],
+            vis = th[a].className.indexOf('min') === -1;
+
+        if (!tv)
+            continue;
+
+        (vis ? rv : ra).push(tk);
+        rt[tk] = tv;
+    }
+    return [rt, rv, ra];
+};
// toggle player widget
var widget = (function () {
    var ret = {},
@@ -411,19 +502,16 @@ var widget = (function () {
    };
    npirc.onclick = nptxt.onclick = function (e) {
        ev(e);
-        var th = ebi('files').tHead.rows[0].cells,
-            tr = QS('#files tr.play').cells,
-            irc = this.getAttribute('id') == 'npirc',
+        var irc = this.getAttribute('id') == 'npirc',
            ck = irc ? '06' : '',
            cv = irc ? '07' : '',
-            m = ck + 'np: ';
+            m = ck + 'np: ',
+            npr = get_np(),
+            npk = npr[1],
+            np = npr[0];

-        for (var a = 1, aa = th.length; a < aa; a++) {
-            var tv = tr[a].textContent,
-                tk = a == 1 ? '' : th[a].getAttribute('name').split('/').slice(-1)[0];
-            m += tk + '(' + cv + tv + ck + ') // ';
-        }
+        for (var a = 0; a < npk.length; a++)
+            m += (npk[a] == 'file' ? '' : npk[a]) + '(' + cv + np[npk[a]] + ck + ') // ';

        m += '[' + cv + s2ms(mp.au.currentTime) + ck + '/' + cv + s2ms(mp.au.duration) + ck + ']';
@@ -632,6 +720,11 @@ function seek_au_mul(mul) {
    seek_au_sec(mp.au.duration * mul);
}

+function seek_au_rel(sec) {
+    if (mp.au)
+        seek_au_sec(mp.au.currentTime + sec);
+}

function seek_au_sec(seek) {
    if (!mp.au)
        return;
@@ -682,6 +775,9 @@ function playpause(e) {
    }
    else
        play(0);

+    if (navigator.mediaSession)
+        navigator.mediaSession.playbackState = mp.au.paused ? "paused" : "playing";
};
@@ -1121,6 +1217,7 @@ function play(tid, seek, call_depth) {
        mpui.progress_updater();
        pbar.drawbuf();
+        mpl.announce();
        return true;
    }
    catch (ex) {
@@ -1203,6 +1300,8 @@ function autoplay_blocked(seek) {
            seek_au_sec(seek);
        else
            mpui.progress_updater();
+
+        mpl.announce();
    };
    na.onclick = unblocked;
}
@@ -1512,18 +1611,18 @@ document.onkeydown = function (e) {
        pos = parseInt(k.slice(-1)) * 0.1;

    if (pos !== -1)
-        return seek_au_mul(pos);
+        return seek_au_mul(pos) || true;

    var n = k == 'KeyJ' ? -1 : k == 'KeyL' ? 1 : 0;
    if (n !== 0)
-        return song_skip(n);
+        return song_skip(n) || true;

    if (k == 'KeyP')
-        return playpause();
+        return playpause() || true;

    n = k == 'KeyU' ? -10 : k == 'KeyO' ? 10 : 0;
    if (n !== 0)
-        return mp.au ? seek_au_sec(mp.au.currentTime + n) : true;
+        return seek_au_rel(n) || true;

    n = k == 'KeyI' ? -1 : k == 'KeyK' ? 1 : 0;
    if (n !== 0)
@@ -1533,7 +1632,6 @@ document.onkeydown = function (e) {
        return tree_up();

    if (k == 'KeyB')
-        //return treectl.hidden ? treectl.show() : treectl.hide();
        return treectl.hidden ? treectl.entree() : treectl.detree();

    if (k == 'KeyG')


@@ -970,8 +970,8 @@ function up2k_init(subtle) {
        while (segm_next());

        var hash_done = function (hashbuf) {
-            var hslice = new Uint8Array(hashbuf).subarray(0, 32),
-                b64str = buf2b64(hslice).replace(/=$/, '');
+            var hslice = new Uint8Array(hashbuf).subarray(0, 33),
+                b64str = buf2b64(hslice);

            hashtab[nch] = b64str;
            t.hash.push(nch);
@@ -996,6 +996,7 @@ function up2k_init(subtle) {
            pvis.seth(t.n, 1, '📦 wait');
            st.busy.hash.splice(st.busy.hash.indexOf(t), 1);
            st.todo.handshake.push(t);
+            tasker();
        };

        if (subtle)
@@ -1041,6 +1042,7 @@ function up2k_init(subtle) {
            console.log('handshake onerror, retrying');
            st.busy.handshake.splice(st.busy.handshake.indexOf(t), 1);
            st.todo.handshake.unshift(t);
+            tasker();
        };
        xhr.onload = function (e) {
            if (t.busied != me) {


@@ -359,6 +359,15 @@ function get_vpath() {
}

+function get_pwd() {
+    var pwd = ('; ' + document.cookie).split('; cppwd=');
+    if (pwd.length < 2)
+        return null;
+
+    return pwd[1].split(';')[0];
+}

function unix2iso(ts) {
    return new Date(ts * 1000).toISOString().replace("T", " ").slice(0, -5);
}


@@ -103,6 +103,9 @@ cat warks | while IFS= read -r x; do sqlite3 up2k.db "delete from mt where w = '
# dump all dbs
find -iname up2k.db | while IFS= read -r x; do sqlite3 "$x" 'select substr(w,1,12), rd, fn from up' | sed -r 's/\|/ \| /g' | while IFS= read -r y; do printf '%s | %s\n' "$x" "$y"; done; done

+# unschedule mtp scan for all files somewhere under "enc/"
+sqlite3 -readonly up2k.db 'select substr(up.w,1,16) from up inner join mt on mt.w = substr(up.w,1,16) where rd like "enc/%" and +mt.k = "t:mtp"' > keys; awk '{printf "delete from mt where w = \"%s\" and +k = \"t:mtp\";\n", $0}' <keys | tee /dev/stderr | sqlite3 up2k.db
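The same cleanup can be expressed as a single statement from Python's sqlite3 module; a sketch against an in-memory db using the up/mt layout from the schema in this changeset (the sample rows are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table up (w text, mt int, sz int, rd text, fn text)")
con.execute("create table mt (w text, k text, v int)")

# one file under enc/, one outside; mt rows key on the first 16 wark chars
con.execute("insert into up values ('aaaaaaaaaaaaaaaaZZ',0,0,'enc/x','a.flac')")
con.execute("insert into up values ('bbbbbbbbbbbbbbbbZZ',0,0,'music','b.flac')")
con.execute("insert into mt values ('aaaaaaaaaaaaaaaa','t:mtp',1)")
con.execute("insert into mt values ('bbbbbbbbbbbbbbbb','t:mtp',1)")

# unschedule pending mtp scans for everything under enc/
con.execute(
    "delete from mt where k = 't:mtp' and w in "
    "(select substr(w,1,16) from up where rd like 'enc/%')"
)

rows = con.execute("select w from mt").fetchall()
print(rows)  # only the row outside enc/ survives
```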
##
## media

@@ -157,7 +160,7 @@ dbg.asyncStore.pendingBreakpoints = {}
about:config >> devtools.debugger.prefs-schema-version = -1

# determine server version
-git reset --hard origin/HEAD && git log --format=format:"%H %ai %d" --decorate=full > /dev/shm/revs && cat /dev/shm/revs | while read -r rev extra; do (git reset --hard $rev >/dev/null 2>/dev/null && dsz=$(cat copyparty/web/{util,browser,up2k}.js 2>/dev/null | diff -wNarU0 - <(cat /mnt/Users/ed/Downloads/ref/{util,browser,up2k}.js) | wc -c) && printf '%s %6s %s\n' "$rev" $dsz "$extra") </dev/null; done
+git pull; git reset --hard origin/HEAD && git log --format=format:"%H %ai %d" --decorate=full > ../revs && cat ../{util,browser}.js >../vr && cat ../revs | while read -r rev extra; do (git reset --hard $rev >/dev/null 2>/dev/null && dsz=$(cat copyparty/web/{util,browser}.js >../vg 2>/dev/null && diff -wNarU0 ../{vg,vr} | wc -c) && printf '%s %6s %s\n' "$rev" $dsz "$extra") </dev/null; done

##
@@ -200,3 +203,4 @@ mk() { rm -rf /tmp/foo; sudo -u ed bash -c 'mkdir /tmp/foo; echo hi > /tmp/foo/b
mk && t0="$(date)" && while true; do date -s "$(date '+ 1 hour')"; systemd-tmpfiles --clean; ls -1 /tmp | grep foo || break; done; echo "$t0"
mk && sudo -u ed flock /tmp/foo sleep 40 & sleep 1; ps aux | grep -E 'sleep 40$' && t0="$(date)" && for n in {1..40}; do date -s "$(date '+ 1 day')"; systemd-tmpfiles --clean; ls -1 /tmp | grep foo || break; done; echo "$t0"
mk && t0="$(date)" && for n in {1..40}; do date -s "$(date '+ 1 day')"; systemd-tmpfiles --clean; ls -1 /tmp | grep foo || break; tar -cf/dev/null /tmp/foo; done; echo "$t0"


@@ -1,6 +1,7 @@
FROM alpine:3.13
WORKDIR /z
ENV ver_asmcrypto=5b994303a9d3e27e0915f72a10b6c2c51535a4dc \
+    ver_hashwasm=4.7.0 \
    ver_marked=1.1.0 \
    ver_ogvjs=1.8.0 \
    ver_mde=2.14.0 \
@@ -9,12 +10,6 @@ ENV ver_asmcrypto=5b994303a9d3e27e0915f72a10b6c2c51535a4dc \
    ver_zopfli=1.0.3

-# TODO
-#   sha512.hw.js https://github.com/Daninet/hash-wasm
-#   sha512.kc.js https://github.com/chm-diederichs/sha3-wasm
-# awk '/HMAC state/{o=1} /var HEAP/{o=0} /function hmac_reset/{o=1} /return \{/{o=0} /var __extends =/{o=1} /var Hash =/{o=0} /hmac_|pbkdf2_/{next} o{next} {gsub(/IllegalStateError/,"Exception")} {sub(/^ +/,"");sub(/^\/\/ .*/,"");sub(/;$/," ;")} 1' <sha512.ac.js.orig >sha512.ac.js; for fn in sha512.ac.js.orig sha512.ac.js; do wc -c <$fn; wc -c <$fn.gz ; for n in {1..9}; do printf '%8d %d bz\n' $(bzip2 -c$n <$fn | wc -c) $n; done; done

# download;
# the scp url is latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap
RUN mkdir -p /z/dist/no-pk \
@@ -27,7 +22,11 @@ RUN mkdir -p /z/dist/no-pk \
    && wget https://github.com/codemirror/CodeMirror/archive/$ver_codemirror.tar.gz -O codemirror.tgz \
    && wget https://github.com/FortAwesome/Font-Awesome/releases/download/$ver_fontawesome/fontawesome-free-$ver_fontawesome-web.zip -O fontawesome.zip \
    && wget https://github.com/google/zopfli/archive/zopfli-$ver_zopfli.tar.gz -O zopfli.tgz \
+    && wget https://github.com/Daninet/hash-wasm/releases/download/v$ver_hashwasm/hash-wasm@$ver_hashwasm.zip -O hash-wasm.zip \
    && unzip ogvjs.zip \
+    && (mkdir hash-wasm \
+        && cd hash-wasm \
+        && unzip ../hash-wasm.zip) \
    && (tar -xf asmcrypto.tgz \
        && cd asmcrypto.js-$ver_asmcrypto \
        && npm install ) \
@@ -64,7 +63,12 @@ RUN tar -xf zopfli.tgz \

RUN cd asmcrypto.js-$ver_asmcrypto \
    && echo "export { Sha512 } from './hash/sha512/sha512';" > src/entry-export_all.ts \
    && node -r esm build.js \
-    && mv asmcrypto.all.es5.js /z/dist/sha512.js
+    && awk '/HMAC state/{o=1} /var HEAP/{o=0} /function hmac_reset/{o=1} /return \{/{o=0} /var __extends =/{o=1} /var Hash =/{o=0} /hmac_|pbkdf2_/{next} o{next} {gsub(/IllegalStateError/,"Exception")} {sub(/^ +/,"");sub(/^\/\/ .*/,"");sub(/;$/," ;")} 1' < asmcrypto.all.es5.js > /z/dist/sha512.ac.js
+
+# build hash-wasm
+RUN cd hash-wasm \
+    && mv sha512.umd.min.js /z/dist/sha512.hw.js

# build ogvjs