Compare commits

...

184 Commits

Author SHA1 Message Date
renovate[bot]
651eda3e4c Update dependency tailwindcss to v4 2025-10-17 21:44:24 +00:00
9 Technology Group LTD
c328123bd3 Merge pull request #179 from PatchMon/feature/go-agent
Major release 1.3.0 - New architecture
2025-10-17 22:43:09 +01:00
Muhammad Ibrahim
46eb797ac3 I should really commit more often instead of sending over one massive commit
Blame my ADHD brain
Sorry
- Now we have the server working properly in automation using BullMQ and Redis
- It also exposes an API endpoint that accepts WebSocket connections (WS or WSS) from agents
- Updated the docker-compose.yml and its documentation
2025-10-17 22:10:55 +01:00
Muhammad Ibrahim
c43afeb127 Added qty of connected and offline to the hosts dashboard page 2025-10-15 22:40:52 +01:00
Muhammad Ibrahim
5b77a1328d Removed js file for the update checker for github
Added real-time feature for agent status
made some ui improvements on the host details page
2025-10-15 22:15:18 +01:00
Muhammad Ibrahim
9a40d5e6ee Added support for the new agent mechanism and binary
Added BullMQ + Redis to the platform for the automation and queue mechanism
Added new tabs in host details
2025-10-15 20:56:58 +01:00
Muhammad Ibrahim
fdd0cfd619 Make user_sessions migration idempotent for 1.2.7 compatibility
- Modified 20251005000000_add_user_sessions to check if table exists first
- Added existence checks for all indexes and foreign keys
- Migration now works for both fresh installs and 1.2.7 upgrades
- Prevents P3018 error by gracefully handling existing table
- Added comprehensive logging for debugging
2025-10-13 21:36:52 +01:00
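The idempotency described above boils down to guarding every CREATE with an existence check. A minimal PostgreSQL sketch of that pattern follows; the user_sessions columns, index name, and foreign key shown here are illustrative assumptions, not the repository's actual schema.

```sql
-- Sketch of an idempotent migration: every statement tolerates objects that
-- already exist (a 1.2.7 database) as well as a completely fresh database.
-- Column definitions below are assumptions for illustration only.
CREATE TABLE IF NOT EXISTS user_sessions (
    id         TEXT PRIMARY KEY,
    user_id    TEXT NOT NULL,
    created_at TIMESTAMPTZ NOT NULL DEFAULT now(),
    expires_at TIMESTAMPTZ NOT NULL
);

CREATE INDEX IF NOT EXISTS user_sessions_user_id_idx ON user_sessions (user_id);

-- Foreign keys have no IF NOT EXISTS form, so check the catalog first.
DO $$
BEGIN
    IF NOT EXISTS (
        SELECT 1 FROM pg_constraint WHERE conname = 'user_sessions_user_id_fkey'
    ) THEN
        ALTER TABLE user_sessions
            ADD CONSTRAINT user_sessions_user_id_fkey
            FOREIGN KEY (user_id) REFERENCES users (id) ON DELETE CASCADE;
    END IF;
END $$;
```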
Muhammad Ibrahim
de236f9ae2 Simplify migration reconciliation for 1.2.7 upgrade
- Simplified logic to focus on core issue: table exists but no migration record
- Creates migration record when user_sessions table exists from 1.2.7
- Prevents P3018 error by marking migration as already applied
- More reliable approach for production upgrades
2025-10-13 21:33:22 +01:00
Muhammad Ibrahim
4d5040e0e9 Merge feature/automation into main
- Resolve migration reconciliation conflicts
- Include updated migration that handles 1.2.7 upgrade scenario
- Merge automation features and Docker support
2025-10-13 21:25:11 +01:00
Muhammad Ibrahim
28c5310b99 Fix migration reconciliation to handle 1.2.7 upgrade scenario
- Add case for table exists but no migration record (1.2.7 upgrade)
- Creates migration record for existing user_sessions table
- Prevents P3018 error when table exists from 1.2.7 installation
- Handles all upgrade scenarios properly
2025-10-13 21:24:35 +01:00
Muhammad Ibrahim
a2e9743da6 Add migration reconciliation for user_sessions 1.2.7 to 1.2.8+ upgrade
- Creates migration 20251004999999_reconcile_user_sessions_migration
- Runs before 20251005000000_add_user_sessions migration
- Properly handles failed migrations by marking them as rolled back first
- Handles migration name conflicts from 1.2.7 (add_user_sessions) to 1.2.8+ (20251005000000_add_user_sessions)
- Fixes failed migration states from upgrade attempts
- Works naturally within Prisma migration system
- No changes to Docker entrypoint or setup scripts needed
2025-10-13 21:15:22 +01:00
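Prisma tracks applied migrations in its _prisma_migrations bookkeeping table, so "marking a migration as already applied" means inserting a row there. The sketch below shows the general shape of that reconciliation, assuming _prisma_migrations has its usual columns (migration_name, started_at, finished_at, checksum, applied_steps_count); this should be verified against the Prisma version in use.

```sql
-- Sketch: if user_sessions already exists (a 1.2.7 install) but Prisma has no
-- record of 20251005000000_add_user_sessions, record it as applied so the
-- migration engine does not try to create the table again.
DO $$
BEGIN
    IF EXISTS (SELECT 1 FROM information_schema.tables
               WHERE table_name = 'user_sessions')
       AND NOT EXISTS (SELECT 1 FROM _prisma_migrations
                       WHERE migration_name = '20251005000000_add_user_sessions')
    THEN
        INSERT INTO _prisma_migrations
            (id, checksum, migration_name, started_at, finished_at, applied_steps_count)
        VALUES
            (md5(random()::text), '', '20251005000000_add_user_sessions', now(), now(), 1);
    END IF;
END $$;
```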
Muhammad Ibrahim
3863d641fa Fix git update conflicts in setup.sh --update
- Add git clean -fd to remove untracked files before pull
- Add git reset --hard HEAD to ensure clean state
- Prevents merge conflicts from untracked files during updates
- Ensures smooth updates from any version to any version
2025-10-13 21:15:04 +01:00
Muhammad Ibrahim
cc8f77a946 deleted the js file 2025-10-13 20:59:05 +01:00
Muhammad Ibrahim
36455e2bfd Fixed migration to be done via a Prisma migration, not a JS script 2025-10-13 20:58:08 +01:00
Muhammad Ibrahim
af65d38cad - Fixes P3009 error when upgrading from 1.2.7
- Reconciles 'add_user_sessions' to '20251005000000_add_user_sessions'
- Prevents duplicate migration attempts
- Handles fresh installs gracefully
2025-10-13 20:29:42 +01:00
Muhammad Ibrahim
29266b6d77 Added longer transaction timeout on Postgresql DB 2025-10-12 21:14:52 +01:00
Muhammad Ibrahim
f96e468482 Improved patchmon-agent.sh logic to handle locked apt processes
Introduced Docker feature integration via the agent
2025-10-11 22:54:49 +01:00
Muhammad Ibrahim
9f8c88badf Remove 'coming soon' indicator from Automation menu item 2025-10-11 20:55:27 +01:00
Muhammad Ibrahim
7985a225d7 Merge main into feature/automation to align git history
Resolved conflicts:
- backend/src/server.js: Kept automation routes alongside gethomepage routes
- frontend/src/pages/Queue.jsx: Kept deleted (replaced by Automation.jsx)
- setup.sh: Kept newer version date (2025-10-11)

This merge brings in all commits from main including:
- GetHomepage integration
- Version 1.2.9 updates
- Migration file renames
- Bug fixes and improvements
2025-10-11 20:45:29 +01:00
Muhammad Ibrahim
8c538bd99c Merge changes from main: Add GetHomepage integration and update to v1.2.9
- Added gethomepageRoutes.js for GetHomepage integration
- Updated all package.json files to version 1.2.9
- Updated agent script to version 1.2.9
- Updated version fallbacks in versionRoutes.js and updateScheduler.js
- Updated setup.sh with version 1.2.9
- Merged GetHomepage integration UI (Integrations.jsx)
- Updated docker-entrypoint.sh from main
- Updated VersionUpdateTab component
- Combined automation and gethomepage routes in server.js
- Maintains both BullMQ automation and GetHomepage functionality
2025-10-11 20:35:47 +01:00
9 Technology Group LTD
623bf5e2c8 Merge pull request #161 from PatchMon/feature/gethomepage
Feature/gethomepage + new version 1.2.9
2025-10-11 20:21:44 +01:00
Muhammad Ibrahim
ed8cc81b89 Changed version from 1.2.8 to 1.2.9 in preparation for the next release 2025-10-11 20:14:08 +01:00
Muhammad Ibrahim
5c4353a688 Fixed linting errors with gethomepage area 2025-10-11 20:04:29 +01:00
Muhammad Ibrahim
6ebcdd57d5 Fixed migration order issue where users were getting the error that "add_user_sessions" does not exist 2025-10-11 14:47:27 +01:00
Muhammad Ibrahim
a3d0dfd665 Fixed entrypoint to better handle updating of the agent mechanism
Updated README to show the --update flag
2025-10-10 21:52:57 +01:00
Muhammad Ibrahim
d99ded6d65 Added database backup ability when doing setup.sh --update 2025-10-10 20:16:24 +01:00
Muhammad Ibrahim
1ea96b6172 Merge branch 'main' of github.com:9technologygroup/patchmon.net 2025-10-10 19:37:46 +01:00
Muhammad Ibrahim
1e5ee66825 Fixed version update checking mechanism
Updated the setup.sh script to have the --update flag
2025-10-10 19:32:44 +01:00
Muhammad Ibrahim
88130797e4 Updated Version to 1.2.8 2025-10-10 12:39:17 +01:00
Muhammad Ibrahim
0ad1a96871 Building the start of Automation page and implemented BullMQ module 2025-10-10 12:24:23 +01:00
9 Technology Group LTD
566c415471 Merge pull request #152 from PatchMon/feature/queue
Feature/Agent
2025-10-08 18:52:02 +01:00
Muhammad Ibrahim
cfc91243eb Fixed Issues with RHEL based systems not sending their repos to PatchMon 2025-10-08 18:46:39 +01:00
Muhammad Ibrahim
84cf31869b Fixed spacing in the header for the buttons 2025-10-08 17:57:56 +01:00
Muhammad Ibrahim
18c9d241eb Fixed RockyLinux 10 Support 2025-10-08 17:53:08 +01:00
Muhammad Ibrahim
86b5da3ea0 Removed titles from the top nav bar to give space to search bar 2025-10-08 17:25:24 +01:00
9 Technology Group LTD
c9b5ee63d8 Merge pull request #151 from PatchMon/fix/agentdata
Fix/agentdata
2025-10-08 16:25:56 +01:00
Muhammad Ibrahim
ac4415e1dc Added support for Oracle Linux 9 2025-10-08 16:24:35 +01:00
9 Technology Group LTD
3737a5a935 Merge pull request #145 from Maelstromeous/patch-1
Document manual result update process for PatchMon
2025-10-08 15:50:28 +01:00
9 Technology Group LTD
bcce48948a Merge pull request #148 from PatchMon/refactor/frontend_optimisations
Various optimisations/fixes - mostly frontend
2025-10-08 15:48:10 +01:00
Muhammad Ibrahim
5e4c628110 Dashboard Card edit 2025-10-08 09:53:03 +01:00
Muhammad Ibrahim
a8668ee3f3 Hide Dashboard text in header to give more space to search bar 2025-10-08 09:47:10 +01:00
Muhammad Ibrahim
5487206384 Fix hamburger menu icon and separator dark mode styling 2025-10-08 09:46:04 +01:00
Muhammad Ibrahim
daa31973f9 Fix mobile menu dark mode styling for Dashboard and navigation items 2025-10-08 09:45:31 +01:00
Muhammad Ibrahim
561c78fb08 Remove coming soon items from mobile menu navigation 2025-10-08 09:44:26 +01:00
Muhammad Ibrahim
6d3f2d94ba Add dark mode support and logout functionality to mobile menu 2025-10-08 09:43:41 +01:00
Muhammad Ibrahim
93534ebe52 Add dark mode support to BulkAssignModal 2025-10-08 09:40:38 +01:00
Muhammad Ibrahim
5cf2811bfd Fix BulkAssignModal: add missing bulkHostGroupId variable 2025-10-08 09:40:02 +01:00
tigattack
8fd91eae1a fix(frontend): use updateUserMutation in EditUserModal
Makes it more consistent with the other user mutations and resolves a lint error for the formerly unused `updateUserMutation`
2025-10-08 02:18:20 +01:00
tigattack
da8c661d20 refactor: fix lint errors 2025-10-08 02:12:51 +01:00
tigattack
2bf639e315 chore: update gitignore for docker dev 2025-10-08 02:10:40 +01:00
tigattack
c02ac4bd6f fix(frontend): don't query settings before auth 2025-10-08 02:10:40 +01:00
tigattack
4e0eaf7323 feat(frontend): add lazy loading for routes with Suspense fallback 2025-10-08 02:10:40 +01:00
tigattack
ef9ef58bcb feat(vite): add manual chunking for optimized build output 2025-10-08 02:08:46 +01:00
9 Technology Group LTD
29afe3da1f Merge pull request #147 from PatchMon/fix/agentdata
Add Line Chart
2025-10-08 00:47:47 +01:00
Muhammad Ibrahim
a861e4f9eb Fix linting issues: remove unused imports, add button types, fix array keys 2025-10-08 00:42:26 +01:00
9 Technology Group LTD
12ef6fd8e1 Merge pull request #146 from PatchMon/fix/agentdata
Agent improvements for Debian
Removal of 100 package limit
Modified hosts detail page with Agent history
Added Device fingerprinting for better session management (I need to improve this though)
Added Dashboard card of Package trends for all or specific hosts
Fixed filtering on the package page
2025-10-08 00:33:12 +01:00
Muhammad Ibrahim
ba9de097dc Added Dashboard card to show Package trends over time 2025-10-07 22:48:15 +01:00
Muhammad Ibrahim
8103581d17 Added Package trends over time graph XD 2025-10-07 22:46:55 +01:00
Muhammad Ibrahim
cdb24520d8 Added Total Packages in the Agent history
Added Script execution time in the Agent history tab
Added Pagination for the agent History
2025-10-07 21:46:37 +01:00
Muhammad Ibrahim
831adf3038 Fixed filtering for regular / security updates pie chart on the dashboard 2025-10-07 21:13:22 +01:00
Muhammad Ibrahim
2a1eed1354 Fixed Filtering with the OS Distribution Dashboard card 2025-10-07 21:01:44 +01:00
Muhammad Ibrahim
7819d4512e Made the coffee cup Yellow 2025-10-07 20:54:21 +01:00
Muhammad Ibrahim
a305fe23d3 Fixed issues with the agent not sending apt data properly
Added Indexing to the database for faster and efficient searching
Fixed some filtering from the hosts page relating to packages that need updating
Added buy me a coffee link (sorry and thank you <3)
2025-10-07 20:52:46 +01:00
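The commit does not list which indexes were added, so the lines below are only an illustration of the kind of search and filter indexes it refers to; the table and column names are assumed, not taken from the schema.

```sql
-- Hypothetical examples only: indexes that speed up package search and
-- "needs update" filtering. Actual index definitions are not in the commit.
CREATE INDEX IF NOT EXISTS packages_name_idx ON packages (name);
CREATE INDEX IF NOT EXISTS host_packages_needs_update_idx
    ON host_packages (needs_update, is_security_update);
```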
Matt Cavanagh
2b36e88d85 Revise manual update instructions in README
Updated instructions for forcing updates after host package changes.
2025-10-07 20:25:53 +01:00
Matt Cavanagh
6624ec002d Document manual update process for PatchMon
Add instructions for manual update in README
2025-10-07 20:24:15 +01:00
Muhammad Ibrahim
840779844a Removed 100 limit 2025-10-07 18:20:41 +01:00
Muhammad Ibrahim
f91d3324ba Merge branch 'main' of github.com:9technologygroup/patchmon.net 2025-10-07 18:13:04 +01:00
Muhammad Ibrahim
8c60b5277e Update frontend: HostDetail, Hosts, and osIcons 2025-10-07 18:12:56 +01:00
9 Technology Group LTD
2ac756af84 Merge pull request #139 from stianmeyer/patch-2
Search for the absence of .sh files in the /app/agents folder to trigger copying of the agent files
2025-10-06 09:49:42 +01:00
9 Technology Group LTD
e227004d6b Merge pull request #140 from PatchMon/docs/docker
docs(docker): add description for 'edge' tag
2025-10-06 09:47:12 +01:00
Muhammad Ibrahim
d379473568 Added TFA timeout env variables
Added profile session management
Added "Remember me" to bypass TFA using device fingerprint
Fixed profile name not being persistent after logout and login
2025-10-06 00:55:23 +01:00
9 Technology Group LTD
2edc773adf Merge pull request #141 from PatchMon/ci/docker_no_push_fork 2025-10-05 23:27:44 +01:00
Stian Meyer
2db839556c Copy from agents_backup only when no .sh scripts are present 2025-10-06 00:24:07 +02:00
tigattack
aab6fc244e ci(docker): fix push conditions to prevent pushes from forks 2025-10-05 23:09:01 +01:00
tigattack
811f5b5885 docs(docker): add description for 'edge' tag 2025-10-05 22:55:46 +01:00
tigattack
b43c9e94fd Merge pull request #117 from PatchMon/ci/tweaks 2025-10-05 22:38:29 +01:00
Stian Meyer
2e2a554aa3 Update backend.docker-entrypoint.sh 2025-10-05 23:36:46 +02:00
tigattack
eabcfd370c ci(docker): remove 'dev' branch from push trigger and update image tag handling
- Create 'edge' tag for pushes to main
- Create versioned & latest tags for new tags with `v` prefix (instead of on release)
2025-10-05 21:33:41 +01:00
tigattack
55cb07b3c8 ci(build): remove 'dev' branch from push trigger 2025-10-05 21:33:41 +01:00
tigattack
0e049ec3d5 ci: ignore changes to docker in build and code quality workflows 2025-10-05 21:33:41 +01:00
9 Technology Group LTD
a2464fac5c Merge pull request #138 from PatchMon/dev
Removed 100 packages limit.
2025-10-05 20:50:53 +01:00
Muhammad Ibrahim
5dc3e8ba81 Removed 100 packages limit. 2025-10-05 20:38:25 +01:00
9 Technology Group LTD
63817b450f Merge pull request #137 from PatchMon/dev
Fixed Profile Name editing issue where it wouldn't save
Added more environment variables to env.example
fixed setup.sh so it would ask for the release tag rather than just the branch
2025-10-05 19:44:40 +01:00
Muhammad Ibrahim
1fa0502d7d Modified setup.sh to cater for new environment variables
Added missing env variables in the env.example file
2025-10-05 19:27:55 +01:00
Muhammad Ibrahim
581dc5884c Fixed issue with users not being updated
Re-worked setup.sh to use last 3 tags and the main branch (development latest)
2025-10-05 19:12:51 +01:00
9 Technology Group LTD
dcaffe2805 Merge pull request #135 from PatchMon/dev
Add logo files
2025-10-05 13:19:02 +01:00
Muhammad Ibrahim
a3005bccb4 Merge branch 'dev' of github.com:9technologygroup/patchmon.net into dev 2025-10-05 13:13:05 +01:00
Muhammad Ibrahim
499ef9d5d9 Add the Logo files 2025-10-05 13:11:31 +01:00
9 Technology Group LTD
6eb6ea3fd6 Merge pull request #134 from PatchMon/dev
Implemented Machine ID check when enrolling a linux host into PatchMon rather than using the friendly name as the unique identifier. Mainly implemented when I worked on the auto-enrollment system for ProxMox LXC Containers
Implemented ProxMox auto-enrollment function where it searches and attaches LXC containers then enrolls them into PatchMon
Add Package deletion ability from tigattack
Made tables and views better and in sync with the rest of the ui by tigattack
Made JWT token required as an environment variable when starting server.js
Added global search bar
Added PatchMon Logos and ability to change them, with a new branding option in the settings menu
Reworked github fetch for version updates checking to give more details of latest commits
Made changes to the navigation pane
2025-10-05 13:05:22 +01:00
9 Technology Group LTD
a27c607d9e Merge branch 'main' into dev 2025-10-05 13:00:05 +01:00
9 Technology Group LTD
d4e0abd407 Translate diagram to mermaid from stianmeyer/patch-1
Translate diagram to mermaid
2025-10-05 12:38:41 +01:00
Muhammad Ibrahim
8d447cab0d Merge main into dev - resolved README conflict 2025-10-05 12:23:06 +01:00
Muhammad Ibrahim
6988ecab12 Made github version checking better
Added functionality of Logo branding
Modified sidebar width
2025-10-05 10:55:34 +01:00
Stian Meyer
fd108c6a21 Translate diagram to mermaid 2025-10-05 01:41:58 +02:00
Muhammad Ibrahim
3ea8cc74b6 fix: resolve updateScheduler database and API issues
- Fix database field names: lastUpdateCheck -> last_update_check
- Fix database field names: updateAvailable -> update_available
- Fix database field names: latestVersion -> latest_version
- Add graceful GitHub API rate limit handling
- Return null instead of throwing error on rate limit
- Prevent database update errors on API failures
2025-10-04 20:30:58 +01:00
Muhammad Ibrahim
a43fc9d380 fix: remove outdated GitHub repository warning
- Update updateScheduler to use default GitHub repository
- Remove 'No GitHub repository configured' warning message
- Use same default fallback logic as version routes
2025-10-04 20:29:46 +01:00
Muhammad Ibrahim
864719b4b3 feat: implement main branch vs release commit comparison
- Add commit difference tracking between main branch and release tag
- Show how many commits main branch is ahead of current release
- Update UI to display branch status with clear messaging
- Fix linting issues with useCallback and unused parameters
- Simplify version display with My Version | Latest Release layout
2025-10-04 20:27:41 +01:00
9 Technology Group LTD
cc89df161b Update README.md
Added Documentation Links
2025-10-04 19:38:09 +01:00
Muhammad Ibrahim
2659a930d6 Add force flag to bypass broken packages upon installation 2025-10-04 13:37:05 +01:00
Muhammad Ibrahim
fa57b35270 Added /hosts/install?force=true to the API endpoint to force the installation of the agent if there are existing broken packages on the host you want to monitor 2025-10-04 13:09:29 +01:00
Muhammad Ibrahim
766d36ff80 fix: migration to properly drop unique index on friendly_name
The migration was dropping the constraint but not the underlying unique index.
In PostgreSQL, unique constraints and unique indexes can exist independently.
This caused auto-enrollment to fail with 'unique constraint violated' errors.

Added explicit DROP INDEX statement to ensure the unique index is removed,
allowing duplicate friendly_name values while machine_id remains unique.
2025-10-04 10:44:06 +01:00
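Prisma typically models a @unique field as a standalone unique index rather than a table constraint, so dropping a constraint of the same name can leave the index behind. A sketch of the two-step drop, using the names Prisma would generate by default (the repository's actual object names may differ):

```sql
-- Assumed object names; Prisma's default for hosts.friendly_name would be
-- "hosts_friendly_name_key". Drop both forms so neither can block duplicates.
ALTER TABLE hosts DROP CONSTRAINT IF EXISTS hosts_friendly_name_key;
DROP INDEX IF EXISTS hosts_friendly_name_key;
```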
Muhammad Ibrahim
3a76d54707 Made Proxmox LXC a tab within integrations page 2025-10-04 09:44:18 +01:00
Muhammad Ibrahim
dd28e741d4 fix: manual host creation and improve host identification
- Add machine_id support for manual host creation from GUI
- Generate temporary 'pending-{uuid}' machine_id for new hosts
- Agent now collects and sends machine_id on every update
- Backend replaces pending machine_id with real one on first agent connection
- Remove unnecessary duplicate name check (friendly_name can be duplicated)
- Add get_machine_id() function to agent (reads from /etc/machine-id, /var/lib/dbus/machine-id, or generates fallback)
- Display IP address in Network tab on host details page
- Fix network tab visibility conditions to include host.ip

This ensures proper host identification using machine_id while maintaining backwards compatibility with API credentials as the primary authentication method.
2025-10-04 09:39:47 +01:00
Muhammad Ibrahim
35d3c28ae5 feat(ui): Display machine_id in host details page and enable search
- Added machine_id field to host details page
- Backend now returns machine_id in all host queries
- Users can search hosts by machine_id
- Added hostname index to schema for better performance
2025-10-04 09:15:43 +01:00
Muhammad Ibrahim
3cf2ada84e migration: Add machine_id column to hosts table
- Adds machine_id as unique identifier for hosts
- Migrates existing hosts with 'migrated-' prefix
- Removes unique constraint from friendly_name
- Adds indexes for performance
2025-10-04 09:05:36 +01:00
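Reading the bullet points above as SQL, the migration would look roughly like the sketch below; the column type, the placeholder used for existing rows, and the index names are assumptions for illustration.

```sql
-- Add machine_id, backfill existing hosts with a 'migrated-' placeholder so
-- the column can become NOT NULL and unique, and relax friendly_name.
ALTER TABLE hosts ADD COLUMN IF NOT EXISTS machine_id TEXT;
UPDATE hosts SET machine_id = 'migrated-' || id WHERE machine_id IS NULL;
ALTER TABLE hosts ALTER COLUMN machine_id SET NOT NULL;
CREATE UNIQUE INDEX IF NOT EXISTS hosts_machine_id_key ON hosts (machine_id);

-- friendly_name no longer has to be unique; keep a plain index for lookups.
DROP INDEX IF EXISTS hosts_friendly_name_key;
CREATE INDEX IF NOT EXISTS hosts_friendly_name_idx ON hosts (friendly_name);
```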
Muhammad Ibrahim
b25bba50a7 feat(backend): Update routes to use machine_id for host identification
- Auto-enrollment endpoints now require and validate machine_id
- Check for duplicates by machine_id instead of friendly_name
- Added /hosts/check-machine-id endpoint for agent installer
- Bulk enrollment updated to handle machine_id
- Multiple hosts with same hostname now supported
2025-10-04 09:04:35 +01:00
Muhammad Ibrahim
811930d1e2 feat: Implement machine_id based host identification
- Add machine_id field to hosts schema (unique, indexed)
- Remove unique constraint from friendly_name (allow duplicate hostnames)
- Agent installer now generates/reads persistent machine_id
- Proxmox script retrieves machine_id from LXC containers
- Backend will check machine_id instead of hostname for duplicates

This allows multiple hosts with same hostname to coexist in PatchMon
2025-10-04 09:02:56 +01:00
Muhammad Ibrahim
f3db16d6d0 feat: Auto-install curl in LXC containers if missing before agent installation 2025-10-03 23:57:38 +01:00
Muhammad Ibrahim
b3887c818d chore: Update GitHub repository URLs from 9technologygroup/patchmon.net to PatchMon/PatchMon 2025-10-03 23:39:58 +01:00
9 Technology Group LTD
f7b73ba280 Update app_build.yml 2025-10-03 23:26:46 +01:00
Muhammad Ibrahim
5c2bacb322 feat: Add failure details section showing last 5 lines of output for failed containers 2025-10-03 22:49:51 +01:00
Muhammad Ibrahim
657017801b fix: Restore server.js from aa8b42c (accidentally overwrote with older version) 2025-10-03 22:30:53 +01:00
Muhammad Ibrahim
5e8cfa6b63 feat: Add Proxmox LXC auto-enrollment script with dpkg error recovery 2025-10-03 22:27:04 +01:00
9 Technology Group LTD
f9bd56215d Update README.md
Changed the RoadMap URL
2025-10-03 22:10:41 +01:00
9 Technology Group LTD
aa8b42cbb0 Merge pull request #129 from PatchMon/dev-1-2-8
Global Search + Proxmox Auto lxc enrollment
2025-10-03 22:08:26 +01:00
9 Technology Group LTD
51f6fabd45 Merge pull request #122 from PatchMon/feat/delete_repos
feat: add repository deletion functionality
2025-10-03 22:02:13 +01:00
tigattack
32ab004f3f feat: add repository deletion functionality with confirmation modal 2025-10-03 21:53:13 +01:00
9 Technology Group LTD
71b27b4bcf Merge pull request #123 from PatchMon/feat/package_detail
feat: add package detail page and list all packages with pagination
2025-10-03 18:03:57 +01:00
9 Technology Group LTD
60ca2064bf Merge pull request #124 from PatchMon/feat/repo_detail
restyle repository details
2025-10-03 18:03:23 +01:00
tigattack
5ccd0aa163 feat(repository): make hosts in repo detail more consistent with package detail 2025-10-02 23:53:06 +01:00
tigattack
a13b4941cd refactor(repository): use server icon in repository host count display 2025-10-02 23:52:56 +01:00
tigattack
482a9e27c9 fix(packages): fix security update badge 2025-10-02 23:52:11 +01:00
tigattack
f085596b87 fix(packages): update host property names 2025-10-02 23:52:10 +01:00
tigattack
757feab9cd fix(packages): add needsUpdate and isSecurityUpdate fields to package hosts 2025-10-02 23:52:10 +01:00
tigattack
fffc571453 feat(packages): complete package detail page
Open by clicking package name
2025-10-02 23:52:10 +01:00
tigattack
6f59a1981d feat(api): endpoint to retrieve hosts for a pkg
With pagination and search functionality
2025-10-02 23:52:10 +01:00
tigattack
8bb16f0896 fix(api): update package host fields to match database schema 2025-10-02 23:52:10 +01:00
tigattack
b454b8d130 feat(packages): show all packages by default, add pagination 2025-10-02 23:52:10 +01:00
9 Technology Group LTD
3fc4b799be Merge pull request #121 from PatchMon/fix/jwt_secret_no_default
fix(auth): JWT_SECRET is required
2025-10-02 22:15:35 +01:00
tigattack
9c39d83fe5 fix(auth): JWT_SECRET is required 2025-10-02 21:26:19 +01:00
9 Technology Group LTD
2ce6d9cd73 Merge pull request #119 from PatchMon/docs/docker
docs(docker): clarify image tags
2025-10-02 21:09:54 +01:00
tigattack
e97ccc5cbd docs(docker): clarify image tags 2025-10-02 21:01:55 +01:00
9 Technology Group LTD
1f77e459ce 1.2.7 Release
Please see the release notes.
2025-10-02 18:12:09 +01:00
tigattack
9ddc27e50c ci(docker): add QEMU setup 2025-10-02 18:05:30 +01:00
tigattack
26c58f687b Merge pull request #115 from PatchMon/feat/docker_changes 2025-10-02 17:25:57 +01:00
tigattack
c004734a44 fix(docker): update image references to use the correct repository 2025-10-02 15:55:52 +01:00
tigattack
841b97cb5d chore(docker): remove optional env vars from compose 2025-10-02 15:55:52 +01:00
tigattack
8464a3692d docs(docker): restructure env var docs and add missing vars 2025-10-02 15:55:52 +01:00
tigattack
258bc67efc docs(docker): update repo links with new URL 2025-10-02 15:55:52 +01:00
tigattack
b3c1319df4 docs(docker): clarify instructions for version-specific updates
Changes example version to 1.2.3 to hopefully make it clearer that this is JUST an example.
2025-10-02 15:55:52 +01:00
tigattack
f6d21e0ed5 docs(docker): improve secrets instructions, add JWT info 2025-10-02 15:55:52 +01:00
tigattack
b85eddf22a feat(docker): add tags for dev images in compose file 2025-10-02 15:55:52 +01:00
tigattack
01dac49c05 refactor(docker): update PostgreSQL password placeholder in compose files 2025-10-02 15:55:52 +01:00
tigattack
ab97e04cc1 chore(docker): add service name to compose files 2025-10-02 15:55:52 +01:00
tigattack
50b47bdd65 feat(docker): add JWT configs to backend image & compose 2025-10-02 15:55:52 +01:00
tigattack
7a17958ad8 feat(env): validate required env vars on start 2025-10-02 15:55:52 +01:00
tigattack
806f554b96 Merge pull request #114 from PatchMon/ci/docker 2025-10-02 15:55:29 +01:00
Muhammad Ibrahim
373ef8f468 feat: Add interactive dpkg error recovery with automatic retry 2025-10-02 15:19:49 +01:00
Muhammad Ibrahim
513c268b36 fix: Reset install_exit_code per container and detect success via output message 2025-10-02 15:14:50 +01:00
Muhammad Ibrahim
13c4342135 feat: Remove 'proxmox-' prefix from friendly names, use hostname only 2025-10-02 14:39:36 +01:00
Muhammad Ibrahim
bbb97dbfda fix: Remove EXIT from error trap to prevent false failures on successful completion 2025-10-02 14:37:38 +01:00
tigattack
31a95ed946 ci(docker): simplify image name template 2025-10-02 13:56:21 +01:00
tigattack
3eb4130865 ci(docker): fix push condition for build step 2025-10-02 13:56:21 +01:00
tigattack
5a498a5f7a ci(docker): login before buildx setup 2025-10-02 13:56:08 +01:00
Muhammad Ibrahim
e0eb544205 fix: Make all counter increments safe with || true to prevent set -e exit 2025-10-02 13:42:09 +01:00
Muhammad Ibrahim
51982010db fix: Pass API credentials directly to curl instead of via env vars 2025-10-02 13:41:12 +01:00
Muhammad Ibrahim
dc68afcb87 fix: Prevent set -e exit on agent install failure and show output 2025-10-02 13:39:28 +01:00
Muhammad Ibrahim
bec09b9457 fix: Download agent installer to file before executing to prevent stdin pipe hang 2025-10-02 13:37:55 +01:00
Muhammad Ibrahim
55c8f74b73 chore: Bump debug version to 1.0.0-debug.6 2025-10-02 13:16:09 +01:00
Muhammad Ibrahim
16ea1dc743 fix: Make logging functions always return 0 to prevent set -e exit 2025-10-02 13:15:57 +01:00
Muhammad Ibrahim
8c326c8fe2 debug: Add version echo at script start for verification 2025-10-02 13:14:38 +01:00
Muhammad Ibrahim
2abc9b1f8a debug: Add strict error handling and exit trap to diagnose silent exit 2025-10-02 13:12:29 +01:00
Muhammad Ibrahim
e5f3b0ed26 debug: Add detailed logging to diagnose where script hangs 2025-10-02 13:10:51 +01:00
Muhammad Ibrahim
bfc5db11da fix: Close stdin before while loop to prevent hang when piped from curl 2025-10-02 13:09:13 +01:00
Muhammad Ibrahim
a0bea9b6e5 fix: Remove global exec stdin redirect that breaks curl pipe 2025-10-02 13:03:50 +01:00
Muhammad Ibrahim
ebda7331a9 fix: Detach stdin globally to prevent curl pipe hangs in Proxmox script 2025-10-02 12:55:52 +01:00
Muhammad Ibrahim
9963cfa417 fix: Add timeouts and stdin redirection to prevent pct exec hanging 2025-10-02 08:28:07 +01:00
Muhammad Ibrahim
4e6a9829cf chore: Add migration file for auto_enrollment_tokens table 2025-10-02 08:13:04 +01:00
Muhammad Ibrahim
b99f4aad4e feat: Add Proxmox LXC auto-enrollment integration
- Add auto_enrollment_tokens table with rate limiting and IP whitelisting
- Create backend API routes for token management and enrollment
- Build frontend UI for token creation and management
- Add one-liner curl command for easy Proxmox deployment
- Include Proxmox LXC discovery and enrollment script
- Support future integrations with /proxmox-lxc endpoint pattern
- Add comprehensive documentation

Security features:
- Hashed token secrets
- Per-day rate limits
- IP whitelist support
- Token expiration
- Separate enrollment vs host API credentials
2025-10-02 07:50:10 +01:00
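A rough sketch of what a table backing those security features could look like, inferred only from the bullet list above (hashed secret, per-day rate limit, IP allow-list, expiry); the actual auto_enrollment_tokens schema may differ in names and types.

```sql
-- Illustrative schema only; columns are inferred from the commit message.
CREATE TABLE IF NOT EXISTS auto_enrollment_tokens (
    id                TEXT PRIMARY KEY,
    token_key         TEXT NOT NULL UNIQUE,   -- public identifier used by the enrollment script
    token_secret_hash TEXT NOT NULL,          -- secret stored hashed, never in plain text
    max_hosts_per_day INTEGER NOT NULL DEFAULT 100,
    allowed_ip_ranges TEXT[],                 -- optional IP whitelist
    expires_at        TIMESTAMPTZ,            -- NULL means the token never expires
    created_at        TIMESTAMPTZ NOT NULL DEFAULT now(),
    last_used_at      TIMESTAMPTZ
);
```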
Muhammad Ibrahim
7a8e9d95a0 Added a global search bar 2025-10-02 07:03:10 +01:00
tigattack
ac22adde67 ci(docker): simplify conditional for workflow_dispatch input handling
Don't skip Docker login, doesn't really match the input option
2025-10-02 00:39:00 +01:00
tigattack
db1f03b0e0 ci(docker): replace GHCR_PAT with GITHUB_TOKEN 2025-10-02 00:38:14 +01:00
9 Technology Group LTD
74cc13b7de Create LICENSE 2025-10-01 23:43:19 +01:00
9 Technology Group LTD
65025b50cf Merge pull request #113 from PatchMon/dev-agent
Removed crontab insertion by installer
2025-10-01 21:12:24 +01:00
9 Technology Group LTD
de76836ba0 Create app_build.yml 2025-10-01 21:09:10 +01:00
9 Technology Group LTD
1b08be8864 Merge pull request #112 from 9technologygroup/dev-agent
Fixed selinux detection issue
2025-10-01 13:30:35 +01:00
9 Technology Group LTD
f789c1cebe Merge pull request #111 from 9technologygroup/dev-agent
Fixed dnf issues for AlmaLinux / RHEL-derived systems, added bc as a prerequisite
2025-10-01 12:57:17 +01:00
9 Technology Group LTD
443ec145e1 Merge pull request #110 from 9technologygroup/dev-agent
Made the installation script output the install command for jq and curl if there are issues with them
2025-10-01 10:58:48 +01:00
9 Technology Group LTD
69acd1726c Merge pull request #109 from 9technologygroup/dev-agent
Made changes to the host details area to add notes
2025-10-01 09:35:57 +01:00
9 Technology Group LTD
84c26054b2 Merge pull request #108 from 9technologygroup/dev-agent
Agent and Settings pages updates
2025-09-30 23:04:40 +01:00
9 Technology Group LTD
678efa9574 Merge pull request #98 from 9technologygroup/dev
Fixed Crontab timing Expression
2025-09-30 09:38:37 +01:00
9 Technology Group LTD
3da0625231 Merge pull request #97 from 9technologygroup/dev
Fixed npm installation scripts
2025-09-30 08:48:27 +01:00
9 Technology Group LTD
479909ecf3 Merge pull request #96 from 9technologygroup/dev
fixed setup installer file
2025-09-30 08:00:16 +01:00
9 Technology Group LTD
e04680bc33 Agent re-worked
Lots of fixes such as agent rework, uninstallation scripts, auth API applied to other endpoints, etc.
2025-09-30 07:35:38 +01:00
109 changed files with 21842 additions and 4281 deletions

.github/workflows/app_build.yml (new file, 25 lines)

@@ -0,0 +1,25 @@
name: Build on Merge
on:
  push:
    branches:
      - main
    paths-ignore:
      - 'docker/**'
jobs:
  deploy:
    runs-on: self-hosted
    steps:
      - name: Checkout code
        uses: actions/checkout@v5
      - name: Run rebuild script
        run: /root/patchmon/platform/scripts/app_build.sh ${{ github.ref_name }}
  rebuild-pmon:
    runs-on: self-hosted
    needs: deploy
    if: github.ref_name == 'dev'
    steps:
      - name: Rebuild pmon
        run: /root/patchmon/platform/scripts/manage_pmon_auto.sh


@@ -2,7 +2,11 @@ name: Code quality
on:
  push:
    paths-ignore:
      - 'docker/**'
  pull_request:
    paths-ignore:
      - 'docker/**'
jobs:
  check:


@@ -1,13 +1,14 @@
name: Build and Push Docker Images
on:
push:
branches:
- main
tags:
- 'v*'
pull_request:
branches:
- main
- dev
release:
types:
- published
workflow_dispatch:
inputs:
push:
@@ -33,39 +34,42 @@ jobs:
- name: Checkout repository
uses: actions/checkout@v5
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Log in to Container Registry
if: github.event_name != 'workflow_dispatch' || github.event_name == 'workflow_dispatch' && github.event.inputs.push == 'true'
- name: Log in to container registry
uses: docker/login-action@v3
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.repository_owner }}
# Using PAT as a hack due to issues with GITHUB_TOKEN and package permissions
# This should be reverted to use GITHUB_TOKEN once a solution is discovered.
password: ${{ secrets.GHCR_PAT }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Extract metadata (tags, labels)
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY }}/${{ github.repository_owner }}/patchmon-${{ matrix.image }}
images: ${{ env.REGISTRY }}/${{ github.repository }}-${{ matrix.image }}
tags: |
type=ref,event=pr
type=semver,pattern={{version}}
type=semver,pattern={{major}}.{{minor}}
type=semver,pattern={{major}}
type=raw,value=latest,enable={{is_default_branch}}
type=edge,branch=main
- name: Build and push ${{ matrix.image }} image
if: github.event_name != 'workflow_dispatch' || github.event_name == 'workflow_dispatch' && github.event.inputs.push == 'true'
uses: docker/build-push-action@v6
with:
context: .
file: docker/${{ matrix.image }}.Dockerfile
platforms: linux/amd64,linux/arm64
push: true
# Push if:
# - Event is not workflow_dispatch OR input 'push' is true
# AND
# - Event is not pull_request OR the PR is from the same repository (to avoid pushing from forks)
push: ${{ (github.event_name != 'workflow_dispatch' || inputs.push == 'true') && (github.event_name != 'pull_request' || github.event.pull_request.head.repo.full_name == github.repository) }}
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}
cache-from: type=gha,scope=${{ matrix.image }}

.gitignore (10 changed lines)

@@ -71,6 +71,13 @@ jspm_packages/
.cache/
public
# Exception: Allow frontend/public/assets for logo files
!frontend/public/
!frontend/public/assets/
!frontend/public/assets/*.png
!frontend/public/assets/*.svg
!frontend/public/assets/*.jpg
# Storybook build outputs
.out
.storybook-out
@@ -132,6 +139,7 @@ playwright-report/
test-results.xml
test_*.sh
test-*.sh
*.code-workspace
# Package manager lock files (uncomment if you want to ignore them)
# package-lock.json
@@ -147,4 +155,4 @@ setup-installer-site.sh
install-server.*
notify-clients-upgrade.sh
debug-agent.sh
docker/compose_dev_data
docker/compose_dev_*

LICENSE (new file, 674 lines)

@@ -0,0 +1,674 @@
GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Use with the GNU Affero General Public License.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<https://www.gnu.org/licenses/>.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<https://www.gnu.org/licenses/why-not-lgpl.html>.

View File

@@ -4,6 +4,8 @@
[![Discord](https://img.shields.io/badge/Discord-Join%20Server-blue?style=for-the-badge&logo=discord)](https://patchmon.net/discord)
[![GitHub](https://img.shields.io/badge/GitHub-Repository-black?style=for-the-badge&logo=github)](https://github.com/9technologygroup/patchmon.net)
[![Roadmap](https://img.shields.io/badge/Roadmap-View%20Progress-green?style=for-the-badge&logo=github)](https://github.com/users/9technologygroup/projects/1)
[![Documentation](https://img.shields.io/badge/Documentation-docs.patchmon.net-blue?style=for-the-badge&logo=book)](https://docs.patchmon.net/)
---
## Please STAR this repo :D
@@ -12,7 +14,7 @@
PatchMon provides centralized patch management across diverse server environments. Agents communicate outbound-only to the PatchMon server, eliminating inbound ports on monitored hosts while delivering comprehensive visibility and safe automation.
![Dashboard Screenshot](https://raw.githubusercontent.com/9technologygroup/patchmon.net/main/dashboard.jpeg)
![Dashboard Screenshot](https://raw.githubusercontent.com/PatchMon/PatchMon/main/dashboard.jpeg)
## Features
@@ -41,6 +43,7 @@ PatchMon provides centralized patch management across diverse server environment
### API & Integrations
- REST API under `/api/v1` with JWT auth
- Proxmox LXC Auto-Enrollment - Automatically discover and enroll LXC containers from Proxmox hosts
### Security
- Rate limiting for general, auth, and agent endpoints
@@ -62,7 +65,7 @@ Managed, zero-maintenance PatchMon hosting. Stay tuned.
#### Docker (preferred)
For getting started with Docker, see the [Docker documentation](https://github.com/9technologygroup/patchmon.net/blob/main/docker/README.md)
For getting started with Docker, see the [Docker documentation](https://github.com/PatchMon/PatchMon/blob/main/docker/README.md)
#### Native Install (advanced/non-docker)
@@ -82,9 +85,14 @@ apt-get upgrade -y
apt install curl -y
```
#### Script
#### Install Script
```bash
curl -fsSL -o setup.sh https://raw.githubusercontent.com/9technologygroup/patchmon.net/refs/heads/main/setup.sh && chmod +x setup.sh && bash setup.sh
curl -fsSL -o setup.sh https://raw.githubusercontent.com/PatchMon/PatchMon/refs/heads/main/setup.sh && chmod +x setup.sh && bash setup.sh
```
#### Update Script (--update flag)
```bash
curl -fsSL -o setup.sh https://raw.githubusercontent.com/PatchMon/PatchMon/refs/heads/main/setup.sh && chmod +x setup.sh && bash setup.sh --update
```
#### Minimum specs for building:
@@ -110,6 +118,14 @@ After installation:
- Visit `http(s)://<your-domain>` and complete first-time admin setup
- See all useful info in `deployment-info.txt`
## Forcing updates after host package changes
If you perform a manual package update on a host and want the results reflected in PatchMon sooner than the next scheduled run, you can trigger the reporting process manually by running:
```bash
/usr/local/bin/patchmon-agent.sh update
```
This will send the results immediately to PatchMon.
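For reference, the scheduled run that this manual trigger supplements is normally driven by the agent's crontab entry (managed by the agent's `update-crontab` command). A hypothetical entry, assuming an hourly policy, would look like this; the real interval comes from the server-side policy:
```bash
# Hypothetical crontab entry - the actual schedule is written by
# `patchmon-agent.sh update-crontab` according to the server policy.
0 * * * * /usr/local/bin/patchmon-agent.sh update
```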
## Communication Model
- Outbound-only agents: monitored servers initiate all communication with the PatchMon server
@@ -124,22 +140,18 @@ After installation:
- Database: PostgreSQL
- System service: systemd-managed backend
```mermaid
flowchart LR
A[End Users / Browser<br>Admin UI / Frontend] -- HTTPS --> B[nginx<br>serve FE, proxy API]
B -- HTTP --> C["Backend<br>(Node/Express)<br>/api, auth, Prisma"]
C -- TCP --> D[PostgreSQL<br>Database]
E["Agents on your servers (Outbound Only)"] -- HTTPS --> F["Backend API<br>(/api/v1)"]
```
+----------------------+ HTTPS +--------------------+ HTTP +------------------------+ TCP +---------------+
| End Users (Browser) | ---------> | nginx | --------> | Backend (Node/Express) | ------> | PostgreSQL |
| Admin UI / Frontend | | serve FE, proxy API| | /api, auth, Prisma | | Database |
+----------------------+ +--------------------+ +------------------------+ +---------------+
Agents (Outbound Only)
+---------------------------+ HTTPS +------------------------+
| Agents on your servers | ----------> | Backend API (/api/v1) |
+---------------------------+ +------------------------+
Operational
- systemd manages backend service
- certbot/nginx for TLS (public)
- setup.sh bootstraps OS, app, DB, config
```
## Support
@@ -148,7 +160,7 @@ Operational
## Roadmap
- Roadmap board: https://github.com/users/9technologygroup/projects/1
- Roadmap board: https://github.com/orgs/PatchMon/projects/2
## License
@@ -271,7 +283,7 @@ Thank you to all our contributors who help make PatchMon better every day!
- **Website**: [patchmon.net](https://patchmon.net)
- **Discord**: [https://patchmon.net/discord](https://patchmon.net/discord)
- **Roadmap**: [GitHub Projects](https://github.com/users/9technologygroup/projects/1)
- **Documentation**: [Coming Soon]
- **Documentation**: [https://docs.patchmon.net](https://docs.patchmon.net)
- **Support**: support@patchmon.net
---
@@ -281,6 +293,6 @@ Thank you to all our contributors who help make PatchMon better every day!
**Made with ❤️ by the PatchMon Team**
[![Discord](https://img.shields.io/badge/Discord-Join%20Server-blue?style=for-the-badge&logo=discord)](https://patchmon.net/discord)
[![GitHub](https://img.shields.io/badge/GitHub-Repository-black?style=for-the-badge&logo=github)](https://github.com/9technologygroup/patchmon.net)
[![GitHub](https://img.shields.io/badge/GitHub-Repository-black?style=for-the-badge&logo=github)](https://github.com/PatchMon/PatchMon)
</div>

View File

@@ -1,12 +1,12 @@
#!/bin/bash
# PatchMon Agent Script v1.2.7
# PatchMon Agent Script v1.2.9
# This script sends package update information to the PatchMon server using API credentials
# Configuration
PATCHMON_SERVER="${PATCHMON_SERVER:-http://localhost:3001}"
API_VERSION="v1"
AGENT_VERSION="1.2.7"
AGENT_VERSION="1.2.9"
CONFIG_FILE="/etc/patchmon/agent.conf"
CREDENTIALS_FILE="/etc/patchmon/credentials"
LOG_FILE="/var/log/patchmon-agent.log"
@@ -38,24 +38,46 @@ error() {
exit 1
}
# Info logging (cleaner output - only stdout, no duplicate logging)
# Info logging (cleaner output - only stderr, no duplicate logging)
info() {
echo -e "${BLUE} $1${NC}"
echo -e "${BLUE} $1${NC}" >&2
log "INFO: $1"
}
# Success logging (cleaner output - only stdout, no duplicate logging)
# Success logging (cleaner output - only stderr, no duplicate logging)
success() {
echo -e "${GREEN}$1${NC}"
echo -e "${GREEN}$1${NC}" >&2
log "SUCCESS: $1"
}
# Warning logging (cleaner output - only stdout, no duplicate logging)
# Warning logging (cleaner output - only stderr, no duplicate logging)
warning() {
echo -e "${YELLOW}⚠️ $1${NC}"
echo -e "${YELLOW}⚠️ $1${NC}" >&2
log "WARNING: $1"
}
# Get or generate machine ID
get_machine_id() {
# Try standard locations for machine-id
if [[ -f /etc/machine-id ]]; then
cat /etc/machine-id
elif [[ -f /var/lib/dbus/machine-id ]]; then
cat /var/lib/dbus/machine-id
else
# Fallback: generate from hardware UUID or hostname+MAC
if command -v dmidecode &> /dev/null; then
local uuid=$(dmidecode -s system-uuid 2>/dev/null | tr -d ' -' | tr '[:upper:]' '[:lower:]')
if [[ -n "$uuid" && "$uuid" != "notpresent" ]]; then
echo "$uuid"
return
fi
fi
# Last resort: hash hostname + primary MAC address
local primary_mac=$(ip link show | grep -oP '(?<=link/ether\s)[0-9a-f:]+' | head -1 | tr -d ':')
echo "$HOSTNAME-$primary_mac" | sha256sum | cut -d' ' -f1 | cut -c1-32
fi
}
# Check if running as root
check_root() {
if [[ $EUID -ne 0 ]]; then
@@ -209,9 +231,14 @@ detect_os() {
"opensuse"|"opensuse-leap"|"opensuse-tumbleweed")
OS_TYPE="suse"
;;
"rocky"|"almalinux")
"almalinux")
OS_TYPE="rhel"
;;
"ol")
# Keep Oracle Linux as 'ol' for proper frontend identification
OS_TYPE="ol"
;;
# Rocky Linux keeps its own identity for proper frontend display
esac
elif [[ -f /etc/redhat-release ]]; then
@@ -239,7 +266,7 @@ get_repository_info() {
"ubuntu"|"debian")
get_apt_repositories repos_json first
;;
"centos"|"rhel"|"fedora")
"centos"|"rhel"|"fedora"|"ol"|"rocky")
get_yum_repositories repos_json first
;;
*)
@@ -547,14 +574,118 @@ get_yum_repositories() {
local -n first_ref=$2
# Parse yum/dnf repository configuration
local repo_info=""
if command -v dnf >/dev/null 2>&1; then
local repo_info=$(dnf repolist all --verbose 2>/dev/null | grep -E "^Repo-id|^Repo-baseurl|^Repo-name|^Repo-status")
repo_info=$(dnf repolist all --verbose 2>/dev/null | grep -E "^Repo-id|^Repo-baseurl|^Repo-mirrors|^Repo-name|^Repo-status")
elif command -v yum >/dev/null 2>&1; then
local repo_info=$(yum repolist all -v 2>/dev/null | grep -E "^Repo-id|^Repo-baseurl|^Repo-name|^Repo-status")
repo_info=$(yum repolist all -v 2>/dev/null | grep -E "^Repo-id|^Repo-baseurl|^Repo-mirrors|^Repo-name|^Repo-status")
fi
# This is a simplified implementation - would need more work for full YUM support
# For now, return empty for non-APT systems
if [[ -z "$repo_info" ]]; then
return
fi
# Parse repository information
local current_repo=""
local repo_id=""
local repo_name=""
local repo_url=""
local repo_mirrors=""
local repo_status=""
while IFS= read -r line; do
if [[ "$line" =~ ^Repo-id[[:space:]]+:[[:space:]]+(.+)$ ]]; then
# Process previous repository if we have one
if [[ -n "$current_repo" ]]; then
process_yum_repo repos_ref first_ref "$repo_id" "$repo_name" "$repo_url" "$repo_mirrors" "$repo_status"
fi
# Start new repository
repo_id="${BASH_REMATCH[1]}"
repo_name="$repo_id"
repo_url=""
repo_mirrors=""
repo_status=""
current_repo="$repo_id"
elif [[ "$line" =~ ^Repo-name[[:space:]]+:[[:space:]]+(.+)$ ]]; then
repo_name="${BASH_REMATCH[1]}"
elif [[ "$line" =~ ^Repo-baseurl[[:space:]]+:[[:space:]]+(.+)$ ]]; then
repo_url="${BASH_REMATCH[1]}"
elif [[ "$line" =~ ^Repo-mirrors[[:space:]]+:[[:space:]]+(.+)$ ]]; then
repo_mirrors="${BASH_REMATCH[1]}"
elif [[ "$line" =~ ^Repo-status[[:space:]]+:[[:space:]]+(.+)$ ]]; then
repo_status="${BASH_REMATCH[1]}"
fi
done <<< "$repo_info"
# Process the last repository
if [[ -n "$current_repo" ]]; then
process_yum_repo repos_ref first_ref "$repo_id" "$repo_name" "$repo_url" "$repo_mirrors" "$repo_status"
fi
}
# Process a single YUM repository and add it to the JSON
process_yum_repo() {
local -n _repos_ref=$1
local -n _first_ref=$2
local repo_id="$3"
local repo_name="$4"
local repo_url="$5"
local repo_mirrors="$6"
local repo_status="$7"
# Skip if we don't have essential info
if [[ -z "$repo_id" ]]; then
return
fi
# Determine if repository is enabled
local is_enabled=false
if [[ "$repo_status" == "enabled" ]]; then
is_enabled=true
fi
# Use baseurl if available, otherwise use mirrors URL
local final_url=""
if [[ -n "$repo_url" ]]; then
# Extract first URL if multiple are listed
final_url=$(echo "$repo_url" | head -n 1 | awk '{print $1}')
elif [[ -n "$repo_mirrors" ]]; then
final_url="$repo_mirrors"
fi
# Skip if we don't have any URL
if [[ -z "$final_url" ]]; then
return
fi
# Determine if repository uses HTTPS
local is_secure=false
if [[ "$final_url" =~ ^https:// ]]; then
is_secure=true
fi
# Generate repository name if not provided
if [[ -z "$repo_name" ]]; then
repo_name="$repo_id"
fi
# Clean up repository name and URL - escape quotes and backslashes
repo_name=$(echo "$repo_name" | sed 's/\\/\\\\/g' | sed 's/"/\\"/g')
final_url=$(echo "$final_url" | sed 's/\\/\\\\/g' | sed 's/"/\\"/g')
# Add to JSON
if [[ "$_first_ref" == true ]]; then
_first_ref=false
else
_repos_ref+=","
fi
_repos_ref+="{\"name\":\"$repo_name\",\"url\":\"$final_url\",\"distribution\":\"$OS_VERSION\",\"components\":\"main\",\"repoType\":\"rpm\",\"isEnabled\":$is_enabled,\"isSecure\":$is_secure}"
}
# Get package information based on OS
@@ -566,11 +697,11 @@ get_package_info() {
"ubuntu"|"debian")
get_apt_packages packages_json first
;;
"centos"|"rhel"|"fedora")
"centos"|"rhel"|"fedora"|"ol"|"rocky")
get_yum_packages packages_json first
;;
*)
error "Unsupported OS type: $OS_TYPE"
warning "Unsupported OS type: $OS_TYPE - returning empty package list"
;;
esac
@@ -578,13 +709,173 @@ get_package_info() {
echo "$packages_json"
}
# Check and handle APT locks
handle_apt_locks() {
local interactive=${1:-false} # First parameter indicates if running interactively
local lock_files=(
"/var/lib/dpkg/lock"
"/var/lib/dpkg/lock-frontend"
"/var/lib/apt/lists/lock"
"/var/cache/apt/archives/lock"
)
local processes_found=false
local hung_processes=()
# Check for running APT processes
if pgrep -x "apt-get|apt|aptitude|dpkg|unattended-upgr" > /dev/null 2>&1; then
processes_found=true
info "Found running package management processes:"
echo "" >&2
# Get process info with ACTUAL elapsed time (not CPU time)
# Using ps -eo format to get real elapsed time
while IFS= read -r line; do
[[ -z "$line" ]] && continue
local pid=$(echo "$line" | awk '{print $1}')
local elapsed=$(echo "$line" | awk '{print $2}')
local cmd=$(echo "$line" | awk '{for(i=3;i<=NF;i++) printf "%s ", $i; print ""}')
# Display process info
echo " PID $pid: $cmd (running for $elapsed)" >&2
# Parse elapsed time and convert to seconds
# Format can be: MM:SS, HH:MM:SS, DD-HH:MM:SS, or just SS
# Use 10# prefix to force base-10 (avoid octal interpretation of leading zeros)
local runtime_seconds=0
if [[ "$elapsed" =~ ^([0-9]+)-([0-9]+):([0-9]+):([0-9]+)$ ]]; then
# Format: DD-HH:MM:SS
runtime_seconds=$(( 10#${BASH_REMATCH[1]} * 86400 + 10#${BASH_REMATCH[2]} * 3600 + 10#${BASH_REMATCH[3]} * 60 + 10#${BASH_REMATCH[4]} ))
elif [[ "$elapsed" =~ ^([0-9]+):([0-9]+):([0-9]+)$ ]]; then
# Format: HH:MM:SS
runtime_seconds=$(( 10#${BASH_REMATCH[1]} * 3600 + 10#${BASH_REMATCH[2]} * 60 + 10#${BASH_REMATCH[3]} ))
elif [[ "$elapsed" =~ ^([0-9]+):([0-9]+)$ ]]; then
# Format: MM:SS
runtime_seconds=$(( 10#${BASH_REMATCH[1]} * 60 + 10#${BASH_REMATCH[2]} ))
elif [[ "$elapsed" =~ ^([0-9]+)$ ]]; then
# Format: just seconds
runtime_seconds=$((10#${BASH_REMATCH[1]}))
fi
# Consider process hung if running for more than 5 minutes
if [[ $runtime_seconds -gt 300 ]]; then
hung_processes+=("$pid:$elapsed:$cmd")
fi
done < <(ps -eo pid,etime,cmd | grep -E "apt-get|apt[^-]|aptitude|dpkg|unattended-upgr" | grep -v grep | grep -v "ps -eo")
echo "" >&2
info "Detected ${#hung_processes[@]} hung process(es), interactive=$interactive"
# If hung processes found and running interactively, offer to kill them
if [[ ${#hung_processes[@]} -gt 0 && "$interactive" == "true" ]]; then
warning "Found ${#hung_processes[@]} potentially hung process(es) (running > 5 minutes)"
echo "" >&2
for process_info in "${hung_processes[@]}"; do
IFS=':' read -r pid elapsed cmd <<< "$process_info"
echo " PID $pid: $cmd (hung for $elapsed)" >&2
done
echo "" >&2
read -p "$(echo -e "${YELLOW}⚠️ Do you want to kill these processes? [y/N]:${NC} ")" -n 1 -r >&2
echo "" >&2
if [[ $REPLY =~ ^[Yy]$ ]]; then
for process_info in "${hung_processes[@]}"; do
IFS=':' read -r pid elapsed cmd <<< "$process_info"
info "Killing process $pid..."
if kill "$pid" 2>/dev/null; then
success "Killed process $pid"
sleep 1
# Check if process is still running
if kill -0 "$pid" 2>/dev/null; then
warning "Process $pid still running, using SIGKILL..."
kill -9 "$pid" 2>/dev/null
success "Force killed process $pid"
fi
else
warning "Could not kill process $pid (may require sudo)"
fi
done
# Wait a moment for locks to clear
sleep 2
else
info "Skipping process termination"
fi
elif [[ ${#hung_processes[@]} -gt 0 ]]; then
warning "Found ${#hung_processes[@]} potentially hung process(es) (running > 5 minutes)"
info "Run this command with sudo and interactively to kill hung processes"
fi
fi
# Check for stale lock files (files that exist but no process is holding them)
for lock_file in "${lock_files[@]}"; do
if [[ -f "$lock_file" ]]; then
# Try to get the PID from the lock file if it exists
if lsof "$lock_file" > /dev/null 2>&1; then
info "Lock file $lock_file is held by an active process"
else
warning "Found stale lock file: $lock_file"
info "Attempting to remove stale lock..."
if rm -f "$lock_file" 2>/dev/null; then
success "Removed stale lock: $lock_file"
else
warning "Could not remove lock (insufficient permissions): $lock_file"
fi
fi
fi
done
# If processes were found, return failure so caller can wait
if [[ "$processes_found" == true ]]; then
return 1
else
return 0
fi
}
# Get package info for APT-based systems
get_apt_packages() {
local -n packages_ref=$1
local -n first_ref=$2
# Update package lists (use apt-get for older distros; quieter output)
apt-get update -qq
# Update package lists with retry logic for lock conflicts
local retry_count=0
local max_retries=3
local retry_delay=5
while [[ $retry_count -lt $max_retries ]]; do
if apt-get update -qq 2>/dev/null; then
break
else
retry_count=$((retry_count + 1))
if [[ $retry_count -lt $max_retries ]]; then
warning "APT lock detected (attempt $retry_count/$max_retries)"
# On first retry, try to handle locks
if [[ $retry_count -eq 1 ]]; then
info "Checking for stale APT locks..."
# Check if running interactively (stdin is a terminal OR stdout is a terminal)
local is_interactive=false
if [[ -t 0 ]] || [[ -t 1 ]]; then
is_interactive=true
fi
info "Interactive mode: $is_interactive"
handle_apt_locks "$is_interactive"
fi
info "Waiting ${retry_delay} seconds before retry..."
sleep $retry_delay
else
warning "APT lock persists after $max_retries attempts"
warning "Continuing without updating package lists (will use cached data)"
fi
fi
done
# Determine upgradable packages using apt-get simulation (compatible with Ubuntu 18.04)
# Example line format:
@@ -604,6 +895,11 @@ get_apt_packages() {
is_security_update=true
fi
# Escape JSON special characters in package data
package_name=$(echo "$package_name" | sed 's/"/\\"/g' | sed 's/\\/\\\\/g')
current_version=$(echo "$current_version" | sed 's/"/\\"/g' | sed 's/\\/\\\\/g')
available_version=$(echo "$available_version" | sed 's/"/\\"/g' | sed 's/\\/\\\\/g')
if [[ "$first_ref" == true ]]; then
first_ref=false
else
@@ -615,12 +911,16 @@ get_apt_packages() {
done <<< "$upgradable_sim"
# Get installed packages that are up to date
local installed=$(dpkg-query -W -f='${Package} ${Version}\n' | head -100)
local installed=$(dpkg-query -W -f='${Package} ${Version}\n')
while IFS=' ' read -r package_name version; do
if [[ -n "$package_name" && -n "$version" ]]; then
# Check if this package is not in the upgrade list
if ! echo "$upgradable" | grep -q "^$package_name/"; then
if ! echo "$upgradable_sim" | grep -q "^Inst $package_name "; then
# Escape JSON special characters in package data
package_name=$(echo "$package_name" | sed 's/"/\\"/g' | sed 's/\\/\\\\/g')
version=$(echo "$version" | sed 's/"/\\"/g' | sed 's/\\/\\\\/g')
if [[ "$first_ref" == true ]]; then
first_ref=false
else
@@ -686,7 +986,7 @@ get_yum_packages() {
done <<< "$upgradable"
# Get some installed packages that are up to date
local installed=$($package_manager list installed 2>/dev/null | grep -v "^Loaded" | grep -v "^Installed" | head -100)
local installed=$($package_manager list installed 2>/dev/null | grep -v "^Loaded" | grep -v "^Installed")
while IFS= read -r line; do
# Skip empty lines
@@ -849,6 +1149,9 @@ get_system_info() {
send_update() {
load_credentials
# Track execution start time
local start_time=$(date +%s.%N)
# Verify datetime before proceeding
if ! verify_datetime; then
warning "Datetime verification failed, but continuing with update..."
@@ -861,10 +1164,26 @@ send_update() {
local network_json=$(get_network_info)
local system_json=$(get_system_info)
# Validate JSON before sending
if ! echo "$packages_json" | jq empty 2>/dev/null; then
error "Invalid packages JSON generated: $packages_json"
fi
if ! echo "$repositories_json" | jq empty 2>/dev/null; then
error "Invalid repositories JSON generated: $repositories_json"
fi
info "Sending update to PatchMon server..."
# Merge all JSON objects into one
local merged_json=$(echo "$hardware_json $network_json $system_json" | jq -s '.[0] * .[1] * .[2]')
# Get machine ID
local machine_id=$(get_machine_id)
# Calculate execution time (in seconds with decimals)
local end_time=$(date +%s.%N)
local execution_time=$(echo "$end_time - $start_time" | bc)
# Create the base payload and merge with system info
local base_payload=$(cat <<EOF
{
@@ -875,7 +1194,9 @@ send_update() {
"hostname": "$HOSTNAME",
"ip": "$IP_ADDRESS",
"architecture": "$ARCHITECTURE",
"agentVersion": "$AGENT_VERSION"
"agentVersion": "$AGENT_VERSION",
"machineId": "$machine_id",
"executionTime": $execution_time
}
EOF
)
@@ -883,15 +1204,27 @@ EOF
# Merge the base payload with the system information
local payload=$(echo "$base_payload $merged_json" | jq -s '.[0] * .[1]')
# Write payload to temporary file to avoid "Argument list too long" error
local temp_payload_file=$(mktemp)
echo "$payload" > "$temp_payload_file"
# Debug: Show payload size
local payload_size=$(wc -c < "$temp_payload_file")
echo -e "${BLUE} 📊 Payload size: $payload_size bytes${NC}"
local response=$(curl $CURL_FLAGS -X POST \
-H "Content-Type: application/json" \
-H "X-API-ID: $API_ID" \
-H "X-API-KEY: $API_KEY" \
-d "$payload" \
"$PATCHMON_SERVER/api/$API_VERSION/hosts/update")
-d @"$temp_payload_file" \
"$PATCHMON_SERVER/api/$API_VERSION/hosts/update" 2>&1)
if [[ $? -eq 0 ]]; then
local curl_exit_code=$?
# Clean up temporary file
rm -f "$temp_payload_file"
if [[ $curl_exit_code -eq 0 ]]; then
if echo "$response" | grep -q "success"; then
local packages_count=$(echo "$response" | grep -o '"packagesProcessed":[0-9]*' | cut -d':' -f2)
success "Update sent successfully (${packages_count} packages processed)"
@@ -927,7 +1260,7 @@ EOF
error "Update failed: $response"
fi
else
error "Failed to send update"
error "Failed to send update (curl exit code: $curl_exit_code): $response"
fi
}
@@ -1375,9 +1708,21 @@ main() {
"diagnostics")
show_diagnostics
;;
"clear-locks"|"unlock")
check_root
info "Checking APT locks and hung processes..."
echo ""
handle_apt_locks true
lock_status=$?
echo ""
if [[ $lock_status -eq 0 ]]; then
success "No APT locks or processes blocking package management"
else
info "APT processes are still running - they may be legitimate operations"
fi
;;
*)
echo "PatchMon Agent v$AGENT_VERSION - API Credential Based"
echo "Usage: $0 {configure|test|update|ping|config|check-version|check-agent-update|update-agent|update-crontab|diagnostics}"
echo "Usage: $0 {configure|test|update|ping|config|check-version|check-agent-update|update-agent|update-crontab|clear-locks|diagnostics}"
echo ""
echo "Commands:"
echo " configure <API_ID> <API_KEY> [SERVER_URL] - Configure API credentials for this host"
@@ -1389,6 +1734,7 @@ main() {
echo " check-agent-update - Check for agent updates using timestamp comparison"
echo " update-agent - Update agent to latest version"
echo " update-crontab - Update crontab with current policy"
echo " clear-locks - Check and clear APT locks (interactive)"
echo " diagnostics - Show detailed system diagnostics"
echo ""
echo "Setup Process:"

BIN
agents/patchmon-agent-linux-386 Executable file

Binary file not shown.

BIN
agents/patchmon-agent-linux-amd64 Executable file

Binary file not shown.

BIN
agents/patchmon-agent-linux-arm64 Executable file

Binary file not shown.

496
agents/patchmon-docker-agent.sh Executable file
View File

@@ -0,0 +1,496 @@
#!/bin/bash
# PatchMon Docker Agent Script v1.2.9
# This script collects Docker container and image information and sends it to PatchMon
# Configuration
PATCHMON_SERVER="${PATCHMON_SERVER:-http://localhost:3001}"
API_VERSION="v1"
AGENT_VERSION="1.2.9"
CONFIG_FILE="/etc/patchmon/agent.conf"
CREDENTIALS_FILE="/etc/patchmon/credentials"
LOG_FILE="/var/log/patchmon-docker-agent.log"
# Curl flags placeholder (replaced by server based on SSL settings)
CURL_FLAGS=""
# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color
# Logging function
log() {
if [[ -w "$(dirname "$LOG_FILE")" ]] 2>/dev/null; then
echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" >> "$LOG_FILE" 2>/dev/null
fi
}
# Error handling
error() {
echo -e "${RED}ERROR: $1${NC}" >&2
log "ERROR: $1"
exit 1
}
# Info logging
info() {
echo -e "${BLUE} $1${NC}" >&2
log "INFO: $1"
}
# Success logging
success() {
echo -e "${GREEN}$1${NC}" >&2
log "SUCCESS: $1"
}
# Warning logging
warning() {
echo -e "${YELLOW}⚠️ $1${NC}" >&2
log "WARNING: $1"
}
# Check if Docker is installed and running
check_docker() {
if ! command -v docker &> /dev/null; then
error "Docker is not installed on this system"
fi
if ! docker info &> /dev/null; then
error "Docker daemon is not running or you don't have permission to access it. Try running with sudo."
fi
}
# Load credentials
load_credentials() {
if [[ ! -f "$CREDENTIALS_FILE" ]]; then
error "Credentials file not found at $CREDENTIALS_FILE. Please configure the main PatchMon agent first."
fi
source "$CREDENTIALS_FILE"
if [[ -z "$API_ID" ]] || [[ -z "$API_KEY" ]]; then
error "API credentials not found in $CREDENTIALS_FILE"
fi
# Use PATCHMON_URL from credentials if available, otherwise use default
if [[ -n "$PATCHMON_URL" ]]; then
PATCHMON_SERVER="$PATCHMON_URL"
fi
}
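# For reference: the credentials file sourced above is written by the main
# PatchMon shell agent; based on the v1.2.x agent elsewhere in this changeset,
# it is assumed to contain shell-style assignments such as:
#   PATCHMON_URL="https://patchmon.example.com"
#   API_ID="..."
#   API_KEY="..."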
# Load configuration
load_config() {
if [[ -f "$CONFIG_FILE" ]]; then
source "$CONFIG_FILE"
if [[ -n "$SERVER_URL" ]]; then
PATCHMON_SERVER="$SERVER_URL"
fi
fi
}
# Collect Docker containers
collect_containers() {
info "Collecting Docker container information..."
local containers_json="["
local first=true
# Get all containers (running and stopped)
while IFS='|' read -r container_id name image status state created started ports; do
if [[ -z "$container_id" ]]; then
continue
fi
# Parse image name and tag
local image_name="${image%%:*}"
local image_tag="${image##*:}"
if [[ "$image_tag" == "$image_name" ]]; then
image_tag="latest"
fi
# Determine image source based on registry
local image_source="docker-hub"
if [[ "$image_name" == ghcr.io/* ]]; then
image_source="github"
elif [[ "$image_name" == registry.gitlab.com/* ]]; then
image_source="gitlab"
elif [[ "$image_name" == *"/"*"/"* ]]; then
image_source="private"
fi
# Get repository name (without registry prefix for common registries)
local image_repository="$image_name"
image_repository="${image_repository#ghcr.io/}"
image_repository="${image_repository#registry.gitlab.com/}"
# Get image ID
local full_image_id=$(docker inspect --format='{{.Image}}' "$container_id" 2>/dev/null || echo "unknown")
full_image_id="${full_image_id#sha256:}"
# Normalize status (extract just the status keyword)
local normalized_status="unknown"
if [[ "$status" =~ ^Up ]]; then
normalized_status="running"
elif [[ "$status" =~ ^Exited ]]; then
normalized_status="exited"
elif [[ "$status" =~ ^Created ]]; then
normalized_status="created"
elif [[ "$status" =~ ^Restarting ]]; then
normalized_status="restarting"
elif [[ "$status" =~ ^Paused ]]; then
normalized_status="paused"
elif [[ "$status" =~ ^Dead ]]; then
normalized_status="dead"
fi
# Parse ports
local ports_json="null"
if [[ -n "$ports" && "$ports" != "null" ]]; then
# Convert Docker port format to JSON
ports_json=$(echo "$ports" | jq -R -s -c 'split(",") | map(select(length > 0)) | map(split("->") | {(.[0]): .[1]}) | add // {}')
fi
# Convert dates to ISO 8601 format
# If date conversion fails, use null instead of invalid date string
local created_iso=$(date -d "$created" -Iseconds 2>/dev/null || echo "null")
local started_iso="null"
if [[ -n "$started" && "$started" != "null" ]]; then
started_iso=$(date -d "$started" -Iseconds 2>/dev/null || echo "null")
fi
# Add comma for JSON array
if [[ "$first" == false ]]; then
containers_json+=","
fi
first=false
# Build JSON object for this container
containers_json+="{\"container_id\":\"$container_id\","
containers_json+="\"name\":\"$name\","
containers_json+="\"image_name\":\"$image_name\","
containers_json+="\"image_tag\":\"$image_tag\","
containers_json+="\"image_repository\":\"$image_repository\","
containers_json+="\"image_source\":\"$image_source\","
containers_json+="\"image_id\":\"$full_image_id\","
containers_json+="\"status\":\"$normalized_status\","
containers_json+="\"state\":\"$state\","
containers_json+="\"ports\":$ports_json"
# Only add created_at if we have a valid date
if [[ "$created_iso" != "null" ]]; then
containers_json+=",\"created_at\":\"$created_iso\""
fi
# Only add started_at if we have a valid date
if [[ "$started_iso" != "null" ]]; then
containers_json+=",\"started_at\":\"$started_iso\""
fi
containers_json+="}"
done < <(docker ps -a --format '{{.ID}}|{{.Names}}|{{.Image}}|{{.Status}}|{{.State}}|{{.CreatedAt}}|{{.RunningFor}}|{{.Ports}}' 2>/dev/null)
containers_json+="]"
echo "$containers_json"
}
# Collect Docker images
collect_images() {
info "Collecting Docker image information..."
local images_json="["
local first=true
while IFS='|' read -r repository tag image_id created size digest; do
if [[ -z "$repository" || "$repository" == "<none>" ]]; then
continue
fi
# Clean up tag
if [[ -z "$tag" || "$tag" == "<none>" ]]; then
tag="latest"
fi
# Clean image ID
image_id="${image_id#sha256:}"
# Determine source
local source="docker-hub"
if [[ "$repository" == ghcr.io/* ]]; then
source="github"
elif [[ "$repository" == registry.gitlab.com/* ]]; then
source="gitlab"
elif [[ "$repository" == *"/"*"/"* ]]; then
source="private"
fi
# Convert size to bytes (approximate)
local size_bytes=0
if [[ "$size" =~ ([0-9.]+)([KMGT]?B) ]]; then
local num="${BASH_REMATCH[1]}"
local unit="${BASH_REMATCH[2]}"
case "$unit" in
KB) size_bytes=$(echo "$num * 1024" | bc | cut -d. -f1) ;;
MB) size_bytes=$(echo "$num * 1024 * 1024" | bc | cut -d. -f1) ;;
GB) size_bytes=$(echo "$num * 1024 * 1024 * 1024" | bc | cut -d. -f1) ;;
TB) size_bytes=$(echo "$num * 1024 * 1024 * 1024 * 1024" | bc | cut -d. -f1) ;;
B) size_bytes=$(echo "$num" | cut -d. -f1) ;;
esac
fi
# Convert created date to ISO 8601
# If date conversion fails, use null instead of invalid date string
local created_iso=$(date -d "$created" -Iseconds 2>/dev/null || echo "null")
# Add comma for JSON array
if [[ "$first" == false ]]; then
images_json+=","
fi
first=false
# Build JSON object for this image
images_json+="{\"repository\":\"$repository\","
images_json+="\"tag\":\"$tag\","
images_json+="\"image_id\":\"$image_id\","
images_json+="\"source\":\"$source\","
images_json+="\"size_bytes\":$size_bytes"
# Only add created_at if we have a valid date
if [[ "$created_iso" != "null" ]]; then
images_json+=",\"created_at\":\"$created_iso\""
fi
# Only add digest if present
if [[ -n "$digest" && "$digest" != "<none>" ]]; then
images_json+=",\"digest\":\"$digest\""
fi
images_json+="}"
done < <(docker images --format '{{.Repository}}|{{.Tag}}|{{.ID}}|{{.CreatedAt}}|{{.Size}}|{{.Digest}}' --no-trunc 2>/dev/null)
images_json+="]"
echo "$images_json"
}
# Check for image updates
check_image_updates() {
info "Checking for image updates..."
local updates_json="["
local first=true
local update_count=0
# Get all images
while IFS='|' read -r repository tag image_id digest; do
if [[ -z "$repository" || "$repository" == "<none>" || "$tag" == "<none>" ]]; then
continue
fi
# Skip checking 'latest' tag as it's always considered current by name
# We'll still check digest though
local full_image="${repository}:${tag}"
# Try to get remote digest from registry
# Use docker manifest inspect to avoid pulling the image
local remote_digest=$(docker manifest inspect "$full_image" 2>/dev/null | jq -r '.config.digest // .manifests[0].digest // empty' 2>/dev/null)
if [[ -z "$remote_digest" ]]; then
# If manifest inspect fails, try buildx imagetools inspect (works for more registries)
remote_digest=$(docker buildx imagetools inspect "$full_image" 2>/dev/null | grep -oP 'Digest:\s*\K\S+' | head -1)
fi
# Clean up digests for comparison
local local_digest="${digest#sha256:}"
remote_digest="${remote_digest#sha256:}"
# If we got a remote digest and it's different from local, there's an update
if [[ -n "$remote_digest" && -n "$local_digest" && "$remote_digest" != "$local_digest" ]]; then
if [[ "$first" == false ]]; then
updates_json+=","
fi
first=false
# Build update JSON object
updates_json+="{\"repository\":\"$repository\","
updates_json+="\"current_tag\":\"$tag\","
updates_json+="\"available_tag\":\"$tag\","
updates_json+="\"current_digest\":\"$local_digest\","
updates_json+="\"available_digest\":\"$remote_digest\","
updates_json+="\"image_id\":\"${image_id#sha256:}\""
updates_json+="}"
((update_count++))
fi
done < <(docker images --format '{{.Repository}}|{{.Tag}}|{{.ID}}|{{.Digest}}' --no-trunc 2>/dev/null)
updates_json+="]"
info "Found $update_count image update(s) available"
echo "$updates_json"
}
# Send Docker data to server
send_docker_data() {
load_credentials
info "Collecting Docker data..."
local containers=$(collect_containers)
local images=$(collect_images)
local updates=$(check_image_updates)
# Count collected items
local container_count=$(echo "$containers" | jq '. | length' 2>/dev/null || echo "0")
local image_count=$(echo "$images" | jq '. | length' 2>/dev/null || echo "0")
local update_count=$(echo "$updates" | jq '. | length' 2>/dev/null || echo "0")
info "Found $container_count containers, $image_count images, and $update_count update(s) available"
# Build payload
local payload="{\"apiId\":\"$API_ID\",\"apiKey\":\"$API_KEY\",\"containers\":$containers,\"images\":$images,\"updates\":$updates}"
# Send to server
info "Sending Docker data to PatchMon server..."
local response=$(curl $CURL_FLAGS -s -w "\n%{http_code}" -X POST \
-H "Content-Type: application/json" \
-d "$payload" \
"${PATCHMON_SERVER}/api/${API_VERSION}/docker/collect" 2>&1)
local http_code=$(echo "$response" | tail -n1)
local response_body=$(echo "$response" | head -n-1)
if [[ "$http_code" == "200" ]]; then
success "Docker data sent successfully!"
log "Docker data sent: $container_count containers, $image_count images"
return 0
else
error "Failed to send Docker data. HTTP Status: $http_code\nResponse: $response_body"
fi
}
# Test Docker data collection without sending
test_collection() {
check_docker
info "Testing Docker data collection (dry run)..."
echo ""
local containers=$(collect_containers)
local images=$(collect_images)
local updates=$(check_image_updates)
local container_count=$(echo "$containers" | jq '. | length' 2>/dev/null || echo "0")
local image_count=$(echo "$images" | jq '. | length' 2>/dev/null || echo "0")
local update_count=$(echo "$updates" | jq '. | length' 2>/dev/null || echo "0")
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo -e "${GREEN}Docker Data Collection Results${NC}"
echo "━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━"
echo -e "Containers found: ${GREEN}$container_count${NC}"
echo -e "Images found: ${GREEN}$image_count${NC}"
echo -e "Updates available: ${YELLOW}$update_count${NC}"
echo ""
if command -v jq &> /dev/null; then
echo "━━━ Containers ━━━"
echo "$containers" | jq -r '.[] | "\(.name) (\(.status)) - \(.image_name):\(.image_tag)"' | head -10
if [[ $container_count -gt 10 ]]; then
echo "... and $((container_count - 10)) more"
fi
echo ""
echo "━━━ Images ━━━"
echo "$images" | jq -r '.[] | "\(.repository):\(.tag) (\(.size_bytes / 1024 / 1024 | floor)MB)"' | head -10
if [[ $image_count -gt 10 ]]; then
echo "... and $((image_count - 10)) more"
fi
if [[ $update_count -gt 0 ]]; then
echo ""
echo "━━━ Available Updates ━━━"
echo "$updates" | jq -r '.[] | "\(.repository):\(.current_tag) → \(.available_tag)"'
fi
fi
echo ""
success "Test collection completed successfully!"
}
# Show help
show_help() {
cat << EOF
PatchMon Docker Agent v${AGENT_VERSION}
This agent collects Docker container and image information and sends it to PatchMon.
USAGE:
$0 <command>
COMMANDS:
collect Collect and send Docker data to PatchMon server
test Test Docker data collection without sending (dry run)
help Show this help message
REQUIREMENTS:
- Docker must be installed and running
- Main PatchMon agent must be configured first
- Credentials file must exist at $CREDENTIALS_FILE
EXAMPLES:
# Test collection (dry run)
sudo $0 test
# Collect and send Docker data
sudo $0 collect
SCHEDULING:
To run this agent automatically, add a cron job:
# Run every 5 minutes
*/5 * * * * /usr/local/bin/patchmon-docker-agent.sh collect
# Run every hour
0 * * * * /usr/local/bin/patchmon-docker-agent.sh collect
FILES:
Config: $CONFIG_FILE
Credentials: $CREDENTIALS_FILE
Log: $LOG_FILE
EOF
}
# Main function
main() {
case "$1" in
"collect")
check_docker
load_config
send_docker_data
;;
"test")
check_docker
load_config
test_collection
;;
"help"|"--help"|"-h"|"")
show_help
;;
*)
error "Unknown command: $1\n\nRun '$0 help' for usage information."
;;
esac
}
# Run main function
main "$@"

View File

@@ -97,26 +97,67 @@ verify_datetime
# Clean up old files (keep only last 3 of each type)
cleanup_old_files() {
# Clean up old credential backups
ls -t /etc/patchmon/credentials.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
ls -t /etc/patchmon/credentials.yml.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
# Clean up old config backups
ls -t /etc/patchmon/config.yml.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
# Clean up old agent backups
ls -t /usr/local/bin/patchmon-agent.sh.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
ls -t /usr/local/bin/patchmon-agent.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
# Clean up old log files
ls -t /var/log/patchmon-agent.log.old.* 2>/dev/null | tail -n +4 | xargs -r rm -f
ls -t /etc/patchmon/logs/patchmon-agent.log.old.* 2>/dev/null | tail -n +4 | xargs -r rm -f
# Clean up old shell script backups (if any exist)
ls -t /usr/local/bin/patchmon-agent.sh.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
# Clean up old credentials backups (if any exist)
ls -t /etc/patchmon/credentials.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
}
# Run cleanup at start
cleanup_old_files
# Generate or retrieve machine ID
get_machine_id() {
# Try multiple sources for machine ID
if [[ -f /etc/machine-id ]]; then
cat /etc/machine-id
elif [[ -f /var/lib/dbus/machine-id ]]; then
cat /var/lib/dbus/machine-id
else
# Fallback: generate from hardware info (less ideal but works)
echo "patchmon-$(cat /sys/class/dmi/id/product_uuid 2>/dev/null || cat /proc/sys/kernel/random/uuid)"
fi
}
# Parse arguments from environment (passed via HTTP headers)
if [[ -z "$PATCHMON_URL" ]] || [[ -z "$API_ID" ]] || [[ -z "$API_KEY" ]]; then
error "Missing required parameters. This script should be called via the PatchMon web interface."
fi
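# Illustrative only: a hypothetical manual invocation for testing. In normal
# use the PatchMon web interface generates this command and fills in real
# values; the variable names below simply mirror the checks above.
#   PATCHMON_URL="https://patchmon.example.com" API_ID="..." API_KEY="..." \
#   ARCHITECTURE="amd64" bash install.sh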
# Parse architecture parameter (default to amd64)
ARCHITECTURE="${ARCHITECTURE:-amd64}"
if [[ "$ARCHITECTURE" != "amd64" && "$ARCHITECTURE" != "386" && "$ARCHITECTURE" != "arm64" ]]; then
error "Invalid architecture '$ARCHITECTURE'. Must be one of: amd64, 386, arm64"
fi
# Check if --force flag is set (for bypassing broken packages)
FORCE_INSTALL="${FORCE_INSTALL:-false}"
if [[ "$*" == *"--force"* ]] || [[ "$FORCE_INSTALL" == "true" ]]; then
FORCE_INSTALL="true"
warning "⚠️ Force mode enabled - will bypass broken packages"
fi
# Get unique machine ID for this host
MACHINE_ID=$(get_machine_id)
export MACHINE_ID
info "🚀 Starting PatchMon Agent Installation..."
info "📋 Server: $PATCHMON_URL"
info "🔑 API ID: ${API_ID:0:16}..."
info "🆔 Machine ID: ${MACHINE_ID:0:16}..."
info "🏗️ Architecture: $ARCHITECTURE"
# Display diagnostic information
echo ""
@@ -125,22 +166,95 @@ echo " • URL: $PATCHMON_URL"
echo " • CURL FLAGS: $CURL_FLAGS"
echo " • API ID: ${API_ID:0:16}..."
echo " • API Key: ${API_KEY:0:16}..."
echo " • Architecture: $ARCHITECTURE"
echo ""
# Install required dependencies
info "📦 Installing required dependencies..."
echo ""
# Function to check if a command exists
command_exists() {
command -v "$1" >/dev/null 2>&1
}
# Function to install packages with error handling
install_apt_packages() {
local packages=("$@")
local missing_packages=()
# Check which packages are missing
for pkg in "${packages[@]}"; do
if ! command_exists "$pkg"; then
missing_packages+=("$pkg")
fi
done
if [ ${#missing_packages[@]} -eq 0 ]; then
success "All required packages are already installed"
return 0
fi
info "Need to install: ${missing_packages[*]}"
# Build apt-get command based on force mode
local apt_cmd="apt-get install ${missing_packages[*]} -y"
if [[ "$FORCE_INSTALL" == "true" ]]; then
info "Using force mode - bypassing broken packages..."
apt_cmd="$apt_cmd -o APT::Get::Fix-Broken=false -o DPkg::Options::=\"--force-confold\" -o DPkg::Options::=\"--force-confdef\""
fi
# Try to install packages
if eval "$apt_cmd" 2>&1 | tee /tmp/patchmon_apt_install.log; then
success "Packages installed successfully"
return 0
else
warning "Package installation encountered issues, checking if required tools are available..."
# Verify critical dependencies are actually available
local all_ok=true
for pkg in "${packages[@]}"; do
if ! command_exists "$pkg"; then
if [[ "$FORCE_INSTALL" == "true" ]]; then
error "Critical dependency '$pkg' is not available even with --force. Please install manually."
else
error "Critical dependency '$pkg' is not available. Try again with --force flag or install manually: apt-get install $pkg"
fi
all_ok=false
fi
done
if $all_ok; then
success "All required tools are available despite installation warnings"
return 0
else
return 1
fi
fi
}
# Detect package manager and install jq and curl
if command -v apt-get >/dev/null 2>&1; then
# Debian/Ubuntu
info "Detected apt-get (Debian/Ubuntu)"
echo ""
# Check for broken packages
if dpkg -l | grep -q "^iH\|^iF" 2>/dev/null; then
if [[ "$FORCE_INSTALL" == "true" ]]; then
warning "Detected broken packages on system - force mode will work around them"
else
warning "⚠️ Broken packages detected on system"
warning "If installation fails, retry with: curl -s {URL}/api/v1/hosts/install --force -H ..."
fi
fi
info "Updating package lists..."
apt-get update
apt-get update || true
echo ""
info "Installing jq, curl, and bc..."
apt-get install jq curl bc -y
install_apt_packages jq curl bc
elif command -v yum >/dev/null 2>&1; then
# CentOS/RHEL 7
info "Detected yum (CentOS/RHEL 7)"
@@ -197,84 +311,218 @@ else
mkdir -p /etc/patchmon
fi
# Step 2: Create credentials file
info "🔐 Creating API credentials file..."
# Step 2: Create configuration files
info "🔐 Creating configuration files..."
# Check if config file already exists
if [[ -f "/etc/patchmon/config.yml" ]]; then
warning "⚠️ Config file already exists at /etc/patchmon/config.yml"
warning "⚠️ Moving existing file out of the way for fresh installation"
# Clean up old config backups (keep only last 3)
ls -t /etc/patchmon/config.yml.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
# Move existing file out of the way
mv /etc/patchmon/config.yml /etc/patchmon/config.yml.backup.$(date +%Y%m%d_%H%M%S)
info "📋 Moved existing config to: /etc/patchmon/config.yml.backup.$(date +%Y%m%d_%H%M%S)"
fi
# Check if credentials file already exists
if [[ -f "/etc/patchmon/credentials" ]]; then
warning "⚠️ Credentials file already exists at /etc/patchmon/credentials"
if [[ -f "/etc/patchmon/credentials.yml" ]]; then
warning "⚠️ Credentials file already exists at /etc/patchmon/credentials.yml"
warning "⚠️ Moving existing file out of the way for fresh installation"
# Clean up old credential backups (keep only last 3)
ls -t /etc/patchmon/credentials.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
ls -t /etc/patchmon/credentials.yml.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
# Move existing file out of the way
mv /etc/patchmon/credentials /etc/patchmon/credentials.backup.$(date +%Y%m%d_%H%M%S)
info "📋 Moved existing credentials to: /etc/patchmon/credentials.backup.$(date +%Y%m%d_%H%M%S)"
mv /etc/patchmon/credentials.yml /etc/patchmon/credentials.yml.backup.$(date +%Y%m%d_%H%M%S)
info "📋 Moved existing credentials to: /etc/patchmon/credentials.yml.backup.$(date +%Y%m%d_%H%M%S)"
fi
cat > /etc/patchmon/credentials << EOF
# Clean up old credentials file if it exists (from previous installations)
if [[ -f "/etc/patchmon/credentials" ]]; then
warning "⚠️ Found old credentials file, removing it..."
rm -f /etc/patchmon/credentials
info "📋 Removed old credentials file"
fi
# Create main config file
cat > /etc/patchmon/config.yml << EOF
# PatchMon Agent Configuration
# Generated on $(date)
patchmon_server: "$PATCHMON_URL"
api_version: "v1"
credentials_file: "/etc/patchmon/credentials.yml"
log_file: "/etc/patchmon/logs/patchmon-agent.log"
log_level: "info"
EOF
# Create credentials file
cat > /etc/patchmon/credentials.yml << EOF
# PatchMon API Credentials
# Generated on $(date)
PATCHMON_URL="$PATCHMON_URL"
API_ID="$API_ID"
API_KEY="$API_KEY"
api_id: "$API_ID"
api_key: "$API_KEY"
EOF
chmod 600 /etc/patchmon/credentials
# Step 3: Download the agent script using API credentials
info "📥 Downloading PatchMon agent script..."
chmod 600 /etc/patchmon/config.yml
chmod 600 /etc/patchmon/credentials.yml
# Check if agent script already exists
if [[ -f "/usr/local/bin/patchmon-agent.sh" ]]; then
warning "⚠️ Agent script already exists at /usr/local/bin/patchmon-agent.sh"
# Step 3: Download the PatchMon agent binary using API credentials
info "📥 Downloading PatchMon agent binary..."
# Determine the binary filename based on architecture
BINARY_NAME="patchmon-agent-linux-${ARCHITECTURE}"
# Check if agent binary already exists
if [[ -f "/usr/local/bin/patchmon-agent" ]]; then
warning "⚠️ Agent binary already exists at /usr/local/bin/patchmon-agent"
warning "⚠️ Moving existing file out of the way for fresh installation"
# Clean up old agent backups (keep only last 3)
ls -t /usr/local/bin/patchmon-agent.sh.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
ls -t /usr/local/bin/patchmon-agent.backup.* 2>/dev/null | tail -n +4 | xargs -r rm -f
# Move existing file out of the way
mv /usr/local/bin/patchmon-agent.sh /usr/local/bin/patchmon-agent.sh.backup.$(date +%Y%m%d_%H%M%S)
info "📋 Moved existing agent to: /usr/local/bin/patchmon-agent.sh.backup.$(date +%Y%m%d_%H%M%S)"
mv /usr/local/bin/patchmon-agent /usr/local/bin/patchmon-agent.backup.$(date +%Y%m%d_%H%M%S)
info "📋 Moved existing agent to: /usr/local/bin/patchmon-agent.backup.$(date +%Y%m%d_%H%M%S)"
fi
# Clean up old shell script if it exists (from previous installations)
if [[ -f "/usr/local/bin/patchmon-agent.sh" ]]; then
warning "⚠️ Found old shell script agent, removing it..."
rm -f /usr/local/bin/patchmon-agent.sh
info "📋 Removed old shell script agent"
fi
# Download the binary
curl $CURL_FLAGS \
-H "X-API-ID: $API_ID" \
-H "X-API-KEY: $API_KEY" \
"$PATCHMON_URL/api/v1/hosts/agent/download" \
-o /usr/local/bin/patchmon-agent.sh
"$PATCHMON_URL/api/v1/hosts/agent/download?arch=$ARCHITECTURE" \
-o /usr/local/bin/patchmon-agent
chmod +x /usr/local/bin/patchmon-agent.sh
chmod +x /usr/local/bin/patchmon-agent
# Get the agent version from the downloaded script
AGENT_VERSION=$(grep '^AGENT_VERSION=' /usr/local/bin/patchmon-agent.sh | cut -d'"' -f2 2>/dev/null || echo "Unknown")
# Get the agent version from the binary
AGENT_VERSION=$(/usr/local/bin/patchmon-agent version 2>/dev/null || echo "Unknown")
info "📋 Agent version: $AGENT_VERSION"
# Handle existing log files and create log directory
info "📁 Setting up log directory..."
# Create log directory if it doesn't exist
mkdir -p /etc/patchmon/logs
# Handle existing log files
if [[ -f "/var/log/patchmon-agent.log" ]]; then
warning "⚠️ Existing log file found at /var/log/patchmon-agent.log"
if [[ -f "/etc/patchmon/logs/patchmon-agent.log" ]]; then
warning "⚠️ Existing log file found at /etc/patchmon/logs/patchmon-agent.log"
warning "⚠️ Rotating log file for fresh start"
# Rotate the log file
mv /var/log/patchmon-agent.log /var/log/patchmon-agent.log.old.$(date +%Y%m%d_%H%M%S)
info "📋 Log file rotated to: /var/log/patchmon-agent.log.old.$(date +%Y%m%d_%H%M%S)"
mv /etc/patchmon/logs/patchmon-agent.log /etc/patchmon/logs/patchmon-agent.log.old.$(date +%Y%m%d_%H%M%S)
info "📋 Log file rotated to: /etc/patchmon/logs/patchmon-agent.log.old.$(date +%Y%m%d_%H%M%S)"
fi
# Step 4: Test the configuration
# Check if this machine is already enrolled
info "🔍 Checking if machine is already enrolled..."
existing_check=$(curl $CURL_FLAGS -s -X POST \
-H "X-API-ID: $API_ID" \
-H "X-API-KEY: $API_KEY" \
-H "Content-Type: application/json" \
-d "{\"machine_id\": \"$MACHINE_ID\"}" \
"$PATCHMON_URL/api/v1/hosts/check-machine-id" \
-w "\n%{http_code}" 2>&1)
http_code=$(echo "$existing_check" | tail -n 1)
response_body=$(echo "$existing_check" | sed '$d')
if [[ "$http_code" == "200" ]]; then
already_enrolled=$(echo "$response_body" | jq -r '.exists' 2>/dev/null || echo "false")
if [[ "$already_enrolled" == "true" ]]; then
warning "⚠️ This machine is already enrolled in PatchMon"
info "Machine ID: $MACHINE_ID"
info "Existing host: $(echo "$response_body" | jq -r '.host.friendly_name' 2>/dev/null)"
info ""
info "The agent will be reinstalled/updated with existing credentials."
echo ""
else
success "✅ Machine not yet enrolled - proceeding with installation"
fi
fi
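# For reference, the check-machine-id response parsed above is only assumed to look
# roughly like the following (illustrative shape inferred from the jq fields used,
# not an exact server contract):
#   {"exists": true, "host": {"friendly_name": "web-01"}}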
info "🧪 Testing API credentials and connectivity..."
if /usr/local/bin/patchmon-agent.sh test; then
if /usr/local/bin/patchmon-agent ping; then
success "✅ TEST: API credentials are valid and server is reachable"
else
error "❌ Failed to validate API credentials or reach server"
fi
# Step 5: Send initial data and setup automated updates
# Step 5: Send initial data and setup systemd service
info "📊 Sending initial package data to server..."
if /usr/local/bin/patchmon-agent.sh update; then
if /usr/local/bin/patchmon-agent report; then
success "✅ UPDATE: Initial package data sent successfully"
info "✅ Automated updates configured by agent"
else
warning "⚠️ Failed to send initial data. You can retry later with: /usr/local/bin/patchmon-agent.sh update"
warning "⚠️ Failed to send initial data. You can retry later with: /usr/local/bin/patchmon-agent report"
fi
# Step 6: Setup systemd service for WebSocket connection
info "🔧 Setting up systemd service..."
# Stop and disable existing service if it exists
if systemctl is-active --quiet patchmon-agent.service 2>/dev/null; then
warning "⚠️ Stopping existing PatchMon agent service..."
systemctl stop patchmon-agent.service
fi
if systemctl is-enabled --quiet patchmon-agent.service 2>/dev/null; then
warning "⚠️ Disabling existing PatchMon agent service..."
systemctl disable patchmon-agent.service
fi
# Create systemd service file
cat > /etc/systemd/system/patchmon-agent.service << EOF
[Unit]
Description=PatchMon Agent Service
After=network.target
Wants=network.target
[Service]
Type=simple
User=root
ExecStart=/usr/local/bin/patchmon-agent serve
Restart=always
RestartSec=10
WorkingDirectory=/etc/patchmon
# Logging
StandardOutput=journal
StandardError=journal
SyslogIdentifier=patchmon-agent
[Install]
WantedBy=multi-user.target
EOF
# Clean up old crontab entries if they exist (from previous installations)
if crontab -l 2>/dev/null | grep -q "patchmon-agent"; then
warning "⚠️ Found old crontab entries, removing them..."
crontab -l 2>/dev/null | grep -v "patchmon-agent" | crontab -
info "📋 Removed old crontab entries"
fi
# Reload systemd and enable/start the service
systemctl daemon-reload
systemctl enable patchmon-agent.service
systemctl start patchmon-agent.service
# Check if service started successfully
if systemctl is-active --quiet patchmon-agent.service; then
success "✅ PatchMon Agent service started successfully"
info "🔗 WebSocket connection established"
else
warning "⚠️ Service may have failed to start. Check status with: systemctl status patchmon-agent"
fi
# Installation complete
@@ -282,14 +530,16 @@ success "🎉 PatchMon Agent installation completed successfully!"
echo ""
echo -e "${GREEN}📋 Installation Summary:${NC}"
echo " • Configuration directory: /etc/patchmon"
echo " • Agent installed: /usr/local/bin/patchmon-agent.sh"
echo " • Agent binary installed: /usr/local/bin/patchmon-agent"
echo " • Architecture: $ARCHITECTURE"
echo " • Dependencies installed: jq, curl, bc"
echo " • Automated updates configured via crontab"
echo " • Systemd service configured and running"
echo " • API credentials configured and tested"
echo " • Update schedule managed by agent"
echo " • WebSocket connection established"
echo " • Logs directory: /etc/patchmon/logs"
# Check for moved files and show them
MOVED_FILES=$(ls /etc/patchmon/credentials.backup.* /usr/local/bin/patchmon-agent.sh.backup.* /var/log/patchmon-agent.log.old.* 2>/dev/null || true)
MOVED_FILES=$(ls /etc/patchmon/credentials.yml.backup.* /etc/patchmon/config.yml.backup.* /usr/local/bin/patchmon-agent.backup.* /etc/patchmon/logs/patchmon-agent.log.old.* /usr/local/bin/patchmon-agent.sh.backup.* /etc/patchmon/credentials.backup.* 2>/dev/null || true)
if [[ -n "$MOVED_FILES" ]]; then
echo ""
echo -e "${YELLOW}📋 Files Moved for Fresh Installation:${NC}"
@@ -302,8 +552,11 @@ fi
echo ""
echo -e "${BLUE}🔧 Management Commands:${NC}"
echo " • Test connection: /usr/local/bin/patchmon-agent.sh test"
echo " • Manual update: /usr/local/bin/patchmon-agent.sh update"
echo " • Check status: /usr/local/bin/patchmon-agent.sh diagnostics"
echo " • Test connection: /usr/local/bin/patchmon-agent ping"
echo " • Manual report: /usr/local/bin/patchmon-agent report"
echo " • Check status: /usr/local/bin/patchmon-agent diagnostics"
echo " • Service status: systemctl status patchmon-agent"
echo " • Service logs: journalctl -u patchmon-agent -f"
echo " • Restart service: systemctl restart patchmon-agent"
echo ""
success "✅ Your system is now being monitored by PatchMon!"

agents/proxmox_auto_enroll.sh (Executable file, 437 lines added)

@@ -0,0 +1,437 @@
#!/bin/bash
set -eo pipefail # Exit on error, pipe failures (removed -u as we handle unset vars explicitly)
# Trap to catch errors only (not normal exits)
trap 'echo "[ERROR] Script failed at line $LINENO with exit code $?"' ERR
SCRIPT_VERSION="2.0.0"
echo "[DEBUG] Script Version: $SCRIPT_VERSION ($(date +%Y-%m-%d\ %H:%M:%S))"
# =============================================================================
# PatchMon Proxmox LXC Auto-Enrollment Script
# =============================================================================
# This script discovers LXC containers on a Proxmox host and automatically
# enrolls them into PatchMon for patch management.
#
# Usage:
# 1. Set environment variables or edit configuration below
# 2. Run: bash proxmox_auto_enroll.sh
#
# Requirements:
# - Must run on Proxmox host (requires 'pct' command)
# - Auto-enrollment token from PatchMon
# - Network access to PatchMon server
# =============================================================================
# ===== CONFIGURATION =====
PATCHMON_URL="${PATCHMON_URL:-https://patchmon.example.com}"
AUTO_ENROLLMENT_KEY="${AUTO_ENROLLMENT_KEY:-}"
AUTO_ENROLLMENT_SECRET="${AUTO_ENROLLMENT_SECRET:-}"
CURL_FLAGS="${CURL_FLAGS:--s}"
DRY_RUN="${DRY_RUN:-false}"
HOST_PREFIX="${HOST_PREFIX:-}"
SKIP_STOPPED="${SKIP_STOPPED:-true}"
PARALLEL_INSTALL="${PARALLEL_INSTALL:-false}"
MAX_PARALLEL="${MAX_PARALLEL:-5}"
FORCE_INSTALL="${FORCE_INSTALL:-false}"
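# Example invocation (all values are placeholders; use your own server URL and the
# auto-enrollment key/secret issued by PatchMon):
#   PATCHMON_URL="https://patchmon.example.com" \
#   AUTO_ENROLLMENT_KEY="<token-key>" \
#   AUTO_ENROLLMENT_SECRET="<token-secret>" \
#   DRY_RUN=true bash proxmox_auto_enroll.sh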
# ===== COLOR OUTPUT =====
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color
# ===== LOGGING FUNCTIONS =====
info() { echo -e "${GREEN}[INFO]${NC} $1"; return 0; }
warn() { echo -e "${YELLOW}[WARN]${NC} $1"; return 0; }
error() { echo -e "${RED}[ERROR]${NC} $1"; exit 1; }
success() { echo -e "${GREEN}[SUCCESS]${NC} $1"; return 0; }
debug() { [[ "${DEBUG:-false}" == "true" ]] && echo -e "${BLUE}[DEBUG]${NC} $1" || true; return 0; }
# ===== BANNER =====
cat << "EOF"
╔═══════════════════════════════════════════════════════════════╗
║ ║
║ ____ _ _ __ __ ║
| _ \ __ _| |_ ___| |__ | \/ | ___ _ __ ║
| |_) / _` | __/ __| '_ \| |\/| |/ _ \| '_ \
| __/ (_| | || (__| | | | | | | (_) | | | |
|_| \__,_|\__\___|_| |_|_| |_|\___/|_| |_|
║ ║
║ Proxmox LXC Auto-Enrollment Script ║
║ ║
╚═══════════════════════════════════════════════════════════════╝
EOF
echo ""
# ===== VALIDATION =====
info "Validating configuration..."
if [[ -z "$AUTO_ENROLLMENT_KEY" ]] || [[ -z "$AUTO_ENROLLMENT_SECRET" ]]; then
error "AUTO_ENROLLMENT_KEY and AUTO_ENROLLMENT_SECRET must be set"
fi
if [[ -z "$PATCHMON_URL" ]]; then
error "PATCHMON_URL must be set"
fi
# Check if running on Proxmox
if ! command -v pct &> /dev/null; then
error "This script must run on a Proxmox host (pct command not found)"
fi
# Check for required commands
for cmd in curl jq; do
if ! command -v $cmd &> /dev/null; then
error "Required command '$cmd' not found. Please install it first."
fi
done
info "Configuration validated successfully"
info "PatchMon Server: $PATCHMON_URL"
info "Dry Run Mode: $DRY_RUN"
info "Skip Stopped Containers: $SKIP_STOPPED"
echo ""
# ===== DISCOVER LXC CONTAINERS =====
info "Discovering LXC containers..."
lxc_list=$(pct list | tail -n +2) # Skip header
if [[ -z "$lxc_list" ]]; then
warn "No LXC containers found on this Proxmox host"
exit 0
fi
# Count containers
total_containers=$(echo "$lxc_list" | wc -l)
info "Found $total_containers LXC container(s)"
echo ""
info "Initializing statistics..."
# ===== STATISTICS =====
enrolled_count=0
skipped_count=0
failed_count=0
# Track containers with dpkg errors for later recovery
declare -A dpkg_error_containers
# Track all failed containers for summary
declare -A failed_containers
info "Statistics initialized"
# ===== PROCESS CONTAINERS =====
info "Starting container processing loop..."
while IFS= read -r line; do
info "[DEBUG] Read line from lxc_list"
vmid=$(echo "$line" | awk '{print $1}')
status=$(echo "$line" | awk '{print $2}')
name=$(echo "$line" | awk '{print $3}')
info "Processing LXC $vmid: $name (status: $status)"
# Skip stopped containers if configured
if [[ "$status" != "running" ]] && [[ "$SKIP_STOPPED" == "true" ]]; then
warn " Skipping $name - container not running"
((skipped_count++)) || true
echo ""
continue
fi
# Check if container is stopped
if [[ "$status" != "running" ]]; then
warn " Container $name is stopped - cannot gather info or install agent"
((skipped_count++)) || true
echo ""
continue
fi
# Get container details
debug " Gathering container information..."
hostname=$(timeout 5 pct exec "$vmid" -- hostname 2>/dev/null </dev/null || echo "$name")
ip_address=$(timeout 5 pct exec "$vmid" -- hostname -I 2>/dev/null </dev/null | awk '{print $1}' || echo "unknown")
os_info=$(timeout 5 pct exec "$vmid" -- cat /etc/os-release 2>/dev/null </dev/null | grep "^PRETTY_NAME=" | cut -d'"' -f2 || echo "unknown")
# Get machine ID from container
machine_id=$(timeout 5 pct exec "$vmid" -- bash -c "cat /etc/machine-id 2>/dev/null || cat /var/lib/dbus/machine-id 2>/dev/null || echo 'proxmox-lxc-$vmid-'$(cat /proc/sys/kernel/random/uuid)" </dev/null 2>/dev/null || echo "proxmox-lxc-$vmid-unknown")
friendly_name="${HOST_PREFIX}${hostname}"
info " Hostname: $hostname"
info " IP Address: $ip_address"
info " OS: $os_info"
info " Machine ID: ${machine_id:0:16}..."
if [[ "$DRY_RUN" == "true" ]]; then
info " [DRY RUN] Would enroll: $friendly_name"
((enrolled_count++)) || true
echo ""
continue
fi
# Call PatchMon auto-enrollment API
info " Enrolling $friendly_name in PatchMon..."
response=$(curl $CURL_FLAGS -X POST \
-H "X-Auto-Enrollment-Key: $AUTO_ENROLLMENT_KEY" \
-H "X-Auto-Enrollment-Secret: $AUTO_ENROLLMENT_SECRET" \
-H "Content-Type: application/json" \
-d "{
\"friendly_name\": \"$friendly_name\",
\"machine_id\": \"$machine_id\",
\"metadata\": {
\"vmid\": \"$vmid\",
\"proxmox_node\": \"$(hostname)\",
\"ip_address\": \"$ip_address\",
\"os_info\": \"$os_info\"
}
}" \
"$PATCHMON_URL/api/v1/auto-enrollment/enroll" \
-w "\n%{http_code}" 2>&1)
http_code=$(echo "$response" | tail -n 1)
body=$(echo "$response" | sed '$d')
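# A 201 response is assumed to carry the new host's credentials; the shape relied on
# by the jq parsing below is roughly (illustrative, not an exact server contract):
#   {"host": {"api_id": "<api-id>", "api_key": "<api-key>"}}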
if [[ "$http_code" == "201" ]]; then
api_id=$(echo "$body" | jq -r '.host.api_id' 2>/dev/null || echo "")
api_key=$(echo "$body" | jq -r '.host.api_key' 2>/dev/null || echo "")
if [[ -z "$api_id" ]] || [[ -z "$api_key" ]]; then
error " Failed to parse API credentials from response"
fi
info " ✓ Host enrolled successfully: $api_id"
# Ensure curl is installed in the container
info " Checking for curl in container..."
curl_check=$(timeout 10 pct exec "$vmid" -- bash -c "command -v curl >/dev/null 2>&1 && echo 'installed' || echo 'missing'" 2>/dev/null </dev/null || echo "error")
if [[ "$curl_check" == "missing" ]]; then
info " Installing curl in container..."
# Detect package manager and install curl
curl_install_output=$(timeout 60 pct exec "$vmid" -- bash -c "
if command -v apt-get >/dev/null 2>&1; then
export DEBIAN_FRONTEND=noninteractive
apt-get update -qq && apt-get install -y -qq curl
elif command -v yum >/dev/null 2>&1; then
yum install -y -q curl
elif command -v dnf >/dev/null 2>&1; then
dnf install -y -q curl
elif command -v apk >/dev/null 2>&1; then
apk add --no-cache curl
else
echo 'ERROR: No supported package manager found'
exit 1
fi
" 2>&1 </dev/null) || true
if [[ "$curl_install_output" == *"ERROR: No supported package manager"* ]]; then
warn " ✗ Could not install curl - no supported package manager found"
failed_containers["$vmid"]="$friendly_name|No package manager for curl|$curl_install_output"
((failed_count++)) || true
echo ""
sleep 1
continue
else
info " ✓ curl installed successfully"
fi
else
info " ✓ curl already installed"
fi
# Install PatchMon agent in container
info " Installing PatchMon agent..."
# Build install URL with force flag if enabled
install_url="$PATCHMON_URL/api/v1/hosts/install"
if [[ "$FORCE_INSTALL" == "true" ]]; then
install_url="$install_url?force=true"
info " Using force mode - will bypass broken packages"
fi
# Reset exit code for this container
install_exit_code=0
# Download and execute in separate steps to avoid stdin issues with piping
install_output=$(timeout 180 pct exec "$vmid" -- bash -c "
cd /tmp
curl $CURL_FLAGS \
-H \"X-API-ID: $api_id\" \
-H \"X-API-KEY: $api_key\" \
-o patchmon-install.sh \
'$install_url' && \
bash patchmon-install.sh && \
rm -f patchmon-install.sh
" 2>&1 </dev/null) || install_exit_code=$?
# Check both exit code AND success message in output for reliability
if [[ $install_exit_code -eq 0 ]] || [[ "$install_output" == *"PatchMon Agent installation completed successfully"* ]]; then
info " ✓ Agent installed successfully in $friendly_name"
((enrolled_count++)) || true
elif [[ $install_exit_code -eq 124 ]]; then
warn " ⏱ Agent installation timed out (>180s) in $friendly_name"
info " Install output: $install_output"
# Store failure details
failed_containers["$vmid"]="$friendly_name|Timeout (>180s)|$install_output"
((failed_count++)) || true
else
# Check if it's a dpkg error
if [[ "$install_output" == *"dpkg was interrupted"* ]] || [[ "$install_output" == *"dpkg --configure -a"* ]]; then
warn " ⚠ Failed due to dpkg error in $friendly_name (can be fixed)"
dpkg_error_containers["$vmid"]="$friendly_name:$api_id:$api_key"
# Store failure details
failed_containers["$vmid"]="$friendly_name|dpkg error|$install_output"
else
warn " ✗ Failed to install agent in $friendly_name (exit: $install_exit_code)"
# Store failure details
failed_containers["$vmid"]="$friendly_name|Exit code $install_exit_code|$install_output"
fi
info " Install output: $install_output"
((failed_count++)) || true
fi
elif [[ "$http_code" == "409" ]]; then
warn " ⊘ Host $friendly_name already enrolled - skipping"
((skipped_count++)) || true
elif [[ "$http_code" == "429" ]]; then
error " ✗ Rate limit exceeded - maximum hosts per day reached"
failed_containers["$vmid"]="$friendly_name|Rate limit exceeded|$body"
((failed_count++)) || true
else
error " ✗ Failed to enroll $friendly_name - HTTP $http_code"
debug " Response: $body"
failed_containers["$vmid"]="$friendly_name|HTTP $http_code enrollment failed|$body"
((failed_count++)) || true
fi
echo ""
sleep 1 # Rate limiting between containers
done <<< "$lxc_list"
# ===== SUMMARY =====
echo ""
echo "╔═══════════════════════════════════════════════════════════════╗"
echo "║ ENROLLMENT SUMMARY ║"
echo "╚═══════════════════════════════════════════════════════════════╝"
echo ""
info "Total Containers Found: $total_containers"
info "Successfully Enrolled: $enrolled_count"
info "Skipped: $skipped_count"
info "Failed: $failed_count"
echo ""
# ===== FAILURE DETAILS =====
if [[ ${#failed_containers[@]} -gt 0 ]]; then
echo "╔═══════════════════════════════════════════════════════════════╗"
echo "║ FAILURE DETAILS ║"
echo "╚═══════════════════════════════════════════════════════════════╝"
echo ""
for vmid in "${!failed_containers[@]}"; do
IFS='|' read -r name reason output <<< "${failed_containers[$vmid]}"
warn "Container $vmid: $name"
info " Reason: $reason"
info " Last 5 lines of output:"
# Get last 5 lines of output
last_5_lines=$(echo "$output" | tail -n 5)
# Display each line with proper indentation
while IFS= read -r line; do
echo " $line"
done <<< "$last_5_lines"
echo ""
done
fi
if [[ "$DRY_RUN" == "true" ]]; then
warn "This was a DRY RUN - no actual changes were made"
warn "Set DRY_RUN=false to perform actual enrollment"
fi
# ===== DPKG ERROR RECOVERY =====
if [[ ${#dpkg_error_containers[@]} -gt 0 ]]; then
echo ""
echo "╔═══════════════════════════════════════════════════════════════╗"
echo "║ DPKG ERROR RECOVERY AVAILABLE ║"
echo "╚═══════════════════════════════════════════════════════════════╝"
echo ""
warn "Detected ${#dpkg_error_containers[@]} container(s) with dpkg errors:"
for vmid in "${!dpkg_error_containers[@]}"; do
IFS=':' read -r name api_id api_key <<< "${dpkg_error_containers[$vmid]}"
info " • Container $vmid: $name"
done
echo ""
# Ask user if they want to fix dpkg errors
read -p "Would you like to fix dpkg errors and retry installation? (y/N): " -n 1 -r
echo ""
if [[ $REPLY =~ ^[Yy]$ ]]; then
echo ""
info "Starting dpkg recovery process..."
echo ""
recovered_count=0
for vmid in "${!dpkg_error_containers[@]}"; do
IFS=':' read -r name api_id api_key <<< "${dpkg_error_containers[$vmid]}"
info "Fixing dpkg in container $vmid ($name)..."
# Run dpkg --configure -a
dpkg_exit_code=0
dpkg_output=$(timeout 60 pct exec "$vmid" -- dpkg --configure -a 2>&1 </dev/null) || dpkg_exit_code=$?
if [[ $dpkg_exit_code -eq 0 ]]; then
info " ✓ dpkg fixed successfully"
# Retry agent installation
info " Retrying agent installation..."
install_exit_code=0
install_output=$(timeout 180 pct exec "$vmid" -- bash -c "
cd /tmp
curl $CURL_FLAGS \
-H \"X-API-ID: $api_id\" \
-H \"X-API-KEY: $api_key\" \
-o patchmon-install.sh \
'$PATCHMON_URL/api/v1/hosts/install' && \
bash patchmon-install.sh && \
rm -f patchmon-install.sh
" 2>&1 </dev/null) || install_exit_code=$?
if [[ $install_exit_code -eq 0 ]] || [[ "$install_output" == *"PatchMon Agent installation completed successfully"* ]]; then
info " ✓ Agent installed successfully in $name"
((recovered_count++)) || true
((enrolled_count++)) || true
((failed_count--)) || true
else
warn " ✗ Agent installation still failed (exit: $install_exit_code)"
fi
else
warn " ✗ Failed to fix dpkg in $name"
info " dpkg output: $dpkg_output"
fi
echo ""
done
echo ""
info "Recovery complete: $recovered_count container(s) recovered"
echo ""
fi
fi
if [[ $failed_count -gt 0 ]]; then
warn "Some containers failed to enroll. Check the logs above for details."
exit 1
fi
info "Auto-enrollment complete! ✓"
exit 0


@@ -1,5 +1,13 @@
# Database Configuration
DATABASE_URL="postgresql://patchmon_user:p@tchm0n_p@55@localhost:5432/patchmon_db"
PM_DB_CONN_MAX_ATTEMPTS=30
PM_DB_CONN_WAIT_INTERVAL=2
# Redis Configuration
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=your-redis-password-here
REDIS_DB=0
# Server Configuration
PORT=3001
@@ -29,3 +37,8 @@ JWT_SECRET=your-secure-random-secret-key-change-this-in-production
JWT_EXPIRES_IN=1h
JWT_REFRESH_EXPIRES_IN=7d
SESSION_INACTIVITY_TIMEOUT_MINUTES=30
# TFA Configuration
TFA_REMEMBER_ME_EXPIRES_IN=30d
TFA_MAX_REMEMBER_SESSIONS=5
TFA_SUSPICIOUS_ACTIVITY_THRESHOLD=3
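The REDIS_PASSWORD and JWT_SECRET placeholders above should be replaced with strong random values before use. One way to generate them (an illustrative sketch, not a PatchMon-specific requirement) is with openssl:

  openssl rand -hex 32      # candidate value for JWT_SECRET
  openssl rand -base64 24   # candidate value for REDIS_PASSWORD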


@@ -1,6 +1,6 @@
{
"name": "patchmon-backend",
"version": "1.2.7",
"version": "1.2.9",
"description": "Backend API for Linux Patch Monitoring System",
"license": "AGPL-3.0",
"main": "src/server.js",
@@ -14,20 +14,26 @@
"db:studio": "prisma studio"
},
"dependencies": {
"@bull-board/api": "^6.13.1",
"@bull-board/express": "^6.13.1",
"@prisma/client": "^6.1.0",
"bcryptjs": "^2.4.3",
"bullmq": "^5.61.0",
"cookie-parser": "^1.4.7",
"cors": "^2.8.5",
"dotenv": "^16.4.7",
"express": "^4.21.2",
"express-rate-limit": "^7.5.0",
"express-validator": "^7.2.0",
"helmet": "^8.0.0",
"ioredis": "^5.8.1",
"jsonwebtoken": "^9.0.2",
"moment": "^2.30.1",
"qrcode": "^1.5.4",
"speakeasy": "^2.0.0",
"uuid": "^11.0.3",
"winston": "^3.17.0"
"winston": "^3.17.0",
"ws": "^8.18.0"
},
"devDependencies": {
"@types/bcryptjs": "^2.4.6",


@@ -0,0 +1,37 @@
-- CreateTable
CREATE TABLE "auto_enrollment_tokens" (
"id" TEXT NOT NULL,
"token_name" TEXT NOT NULL,
"token_key" TEXT NOT NULL,
"token_secret" TEXT NOT NULL,
"created_by_user_id" TEXT,
"is_active" BOOLEAN NOT NULL DEFAULT true,
"allowed_ip_ranges" TEXT[],
"max_hosts_per_day" INTEGER NOT NULL DEFAULT 100,
"hosts_created_today" INTEGER NOT NULL DEFAULT 0,
"last_reset_date" DATE NOT NULL DEFAULT CURRENT_TIMESTAMP,
"default_host_group_id" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL,
"last_used_at" TIMESTAMP(3),
"expires_at" TIMESTAMP(3),
"metadata" JSONB,
CONSTRAINT "auto_enrollment_tokens_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "auto_enrollment_tokens_token_key_key" ON "auto_enrollment_tokens"("token_key");
-- CreateIndex
CREATE INDEX "auto_enrollment_tokens_token_key_idx" ON "auto_enrollment_tokens"("token_key");
-- CreateIndex
CREATE INDEX "auto_enrollment_tokens_is_active_idx" ON "auto_enrollment_tokens"("is_active");
-- AddForeignKey
ALTER TABLE "auto_enrollment_tokens" ADD CONSTRAINT "auto_enrollment_tokens_created_by_user_id_fkey" FOREIGN KEY ("created_by_user_id") REFERENCES "users"("id") ON DELETE SET NULL ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "auto_enrollment_tokens" ADD CONSTRAINT "auto_enrollment_tokens_default_host_group_id_fkey" FOREIGN KEY ("default_host_group_id") REFERENCES "host_groups"("id") ON DELETE SET NULL ON UPDATE CASCADE;


@@ -0,0 +1,20 @@
-- Add machine_id column as nullable first
ALTER TABLE "hosts" ADD COLUMN "machine_id" TEXT;
-- Generate machine_ids for existing hosts using their API ID as a fallback
UPDATE "hosts" SET "machine_id" = 'migrated-' || "api_id" WHERE "machine_id" IS NULL;
-- Remove the unique constraint from friendly_name
ALTER TABLE "hosts" DROP CONSTRAINT IF EXISTS "hosts_friendly_name_key";
-- Also drop the unique index if it exists (constraint and index can exist separately)
DROP INDEX IF EXISTS "hosts_friendly_name_key";
-- Now make machine_id NOT NULL and add unique constraint
ALTER TABLE "hosts" ALTER COLUMN "machine_id" SET NOT NULL;
ALTER TABLE "hosts" ADD CONSTRAINT "hosts_machine_id_key" UNIQUE ("machine_id");
-- Create indexes for better query performance
CREATE INDEX "hosts_machine_id_idx" ON "hosts"("machine_id");
CREATE INDEX "hosts_friendly_name_idx" ON "hosts"("friendly_name");


@@ -0,0 +1,4 @@
-- AddLogoFieldsToSettings
ALTER TABLE "settings" ADD COLUMN "logo_dark" VARCHAR(255) DEFAULT '/assets/logo_dark.png';
ALTER TABLE "settings" ADD COLUMN "logo_light" VARCHAR(255) DEFAULT '/assets/logo_light.png';
ALTER TABLE "settings" ADD COLUMN "favicon" VARCHAR(255) DEFAULT '/assets/logo_square.svg';


@@ -0,0 +1,64 @@
-- Reconcile user_sessions migration from 1.2.7 to 1.2.8+
-- This migration handles the case where 1.2.7 had 'add_user_sessions' without timestamp
-- and 1.2.8+ renamed it to '20251005000000_add_user_sessions' with timestamp
DO $$
DECLARE
table_exists boolean := false;
migration_exists boolean := false;
BEGIN
-- Check if user_sessions table exists
SELECT EXISTS (
SELECT 1 FROM information_schema.tables
WHERE table_schema = 'public'
AND table_name = 'user_sessions'
) INTO table_exists;
-- Check if the migration record already exists
SELECT EXISTS (
SELECT 1 FROM _prisma_migrations
WHERE migration_name = '20251005000000_add_user_sessions'
) INTO migration_exists;
-- If table exists but no migration record, create one
IF table_exists AND NOT migration_exists THEN
RAISE NOTICE 'Table exists but no migration record found - creating migration record for 1.2.7 upgrade';
-- Insert a successful migration record for the existing table
INSERT INTO _prisma_migrations (
id,
checksum,
finished_at,
migration_name,
logs,
rolled_back_at,
started_at,
applied_steps_count
) VALUES (
gen_random_uuid()::text,
'', -- Empty checksum since we're reconciling
NOW(),
'20251005000000_add_user_sessions',
'Reconciled from 1.2.7 - table already exists',
NULL,
NOW(),
1
);
RAISE NOTICE 'Migration record created for existing table';
ELSIF table_exists AND migration_exists THEN
RAISE NOTICE 'Table exists and migration record exists - no action needed';
ELSE
RAISE NOTICE 'Table does not exist - migration will proceed normally';
END IF;
-- Additional check: If we have any old migration names, update them
IF EXISTS (SELECT 1 FROM _prisma_migrations WHERE migration_name = 'add_user_sessions') THEN
RAISE NOTICE 'Found old migration name - updating to new format';
UPDATE _prisma_migrations
SET migration_name = '20251005000000_add_user_sessions'
WHERE migration_name = 'add_user_sessions';
RAISE NOTICE 'Old migration name updated';
END IF;
END $$;
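If Prisma still reports this migration as pending or failed after the reconciliation runs, the state can be inspected and, where appropriate, marked as applied with the Prisma CLI. A minimal sketch, assuming it is run from the backend directory against the same DATABASE_URL:

  npx prisma migrate status
  npx prisma migrate resolve --applied 20251005000000_add_user_sessions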


@@ -0,0 +1,96 @@
-- Reconcile user_sessions migration from 1.2.7 to 1.2.8+
-- This migration handles the case where 1.2.7 had 'add_user_sessions' without timestamp
-- and 1.2.8+ renamed it to '20251005000000_add_user_sessions' with timestamp
DO $$
DECLARE
old_migration_exists boolean := false;
table_exists boolean := false;
failed_migration_exists boolean := false;
BEGIN
-- Check if the old migration name exists
SELECT EXISTS (
SELECT 1 FROM _prisma_migrations
WHERE migration_name = 'add_user_sessions'
) INTO old_migration_exists;
-- Check if user_sessions table exists
SELECT EXISTS (
SELECT 1 FROM information_schema.tables
WHERE table_schema = 'public'
AND table_name = 'user_sessions'
) INTO table_exists;
-- Check if there's a failed migration attempt
SELECT EXISTS (
SELECT 1 FROM _prisma_migrations
WHERE migration_name = '20251005000000_add_user_sessions'
AND finished_at IS NULL
) INTO failed_migration_exists;
-- Scenario 1: Old migration exists, table exists, no failed migration
-- This means 1.2.7 was installed and we need to update the migration name
IF old_migration_exists AND table_exists AND NOT failed_migration_exists THEN
RAISE NOTICE 'Found 1.2.7 migration "add_user_sessions" - updating to timestamped version';
-- Update the old migration name to the new timestamped version
UPDATE _prisma_migrations
SET migration_name = '20251005000000_add_user_sessions'
WHERE migration_name = 'add_user_sessions';
RAISE NOTICE 'Migration name updated: add_user_sessions -> 20251005000000_add_user_sessions';
END IF;
-- Scenario 2: Failed migration exists (upgrade attempt gone wrong)
IF failed_migration_exists THEN
RAISE NOTICE 'Found failed migration attempt - cleaning up';
-- If table exists, it means the migration partially succeeded
IF table_exists THEN
RAISE NOTICE 'Table exists - marking migration as applied';
-- Delete the failed migration record
DELETE FROM _prisma_migrations
WHERE migration_name = '20251005000000_add_user_sessions'
AND finished_at IS NULL;
-- Insert a successful migration record
INSERT INTO _prisma_migrations (
id,
checksum,
finished_at,
migration_name,
logs,
rolled_back_at,
started_at,
applied_steps_count
) VALUES (
gen_random_uuid()::text,
'', -- Empty checksum since we're reconciling
NOW(),
'20251005000000_add_user_sessions',
NULL,
NULL,
NOW(),
1
);
RAISE NOTICE 'Migration marked as successfully applied';
ELSE
RAISE NOTICE 'Table does not exist - removing failed migration to allow retry';
-- Just delete the failed migration to allow it to retry
DELETE FROM _prisma_migrations
WHERE migration_name = '20251005000000_add_user_sessions'
AND finished_at IS NULL;
RAISE NOTICE 'Failed migration removed - will retry on next migration run';
END IF;
END IF;
-- Scenario 3: Everything is clean (fresh install or already reconciled)
IF NOT old_migration_exists AND NOT failed_migration_exists THEN
RAISE NOTICE 'No migration reconciliation needed';
END IF;
END $$;


@@ -0,0 +1,106 @@
-- CreateTable (with existence check for 1.2.7 compatibility)
DO $$
BEGIN
-- Check if table already exists (from 1.2.7 installation)
IF NOT EXISTS (
SELECT 1 FROM information_schema.tables
WHERE table_schema = 'public'
AND table_name = 'user_sessions'
) THEN
-- Table doesn't exist, create it
CREATE TABLE "user_sessions" (
"id" TEXT NOT NULL,
"user_id" TEXT NOT NULL,
"refresh_token" TEXT NOT NULL,
"access_token_hash" TEXT,
"ip_address" TEXT,
"user_agent" TEXT,
"last_activity" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"expires_at" TIMESTAMP(3) NOT NULL,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"is_revoked" BOOLEAN NOT NULL DEFAULT false,
CONSTRAINT "user_sessions_pkey" PRIMARY KEY ("id")
);
RAISE NOTICE 'Created user_sessions table';
ELSE
RAISE NOTICE 'user_sessions table already exists, skipping creation';
END IF;
END $$;
-- CreateIndex (with existence check)
DO $$
BEGIN
IF NOT EXISTS (
SELECT 1 FROM pg_indexes
WHERE tablename = 'user_sessions'
AND indexname = 'user_sessions_refresh_token_key'
) THEN
CREATE UNIQUE INDEX "user_sessions_refresh_token_key" ON "user_sessions"("refresh_token");
RAISE NOTICE 'Created user_sessions_refresh_token_key index';
ELSE
RAISE NOTICE 'user_sessions_refresh_token_key index already exists, skipping';
END IF;
END $$;
-- CreateIndex (with existence check)
DO $$
BEGIN
IF NOT EXISTS (
SELECT 1 FROM pg_indexes
WHERE tablename = 'user_sessions'
AND indexname = 'user_sessions_user_id_idx'
) THEN
CREATE INDEX "user_sessions_user_id_idx" ON "user_sessions"("user_id");
RAISE NOTICE 'Created user_sessions_user_id_idx index';
ELSE
RAISE NOTICE 'user_sessions_user_id_idx index already exists, skipping';
END IF;
END $$;
-- CreateIndex (with existence check)
DO $$
BEGIN
IF NOT EXISTS (
SELECT 1 FROM pg_indexes
WHERE tablename = 'user_sessions'
AND indexname = 'user_sessions_refresh_token_idx'
) THEN
CREATE INDEX "user_sessions_refresh_token_idx" ON "user_sessions"("refresh_token");
RAISE NOTICE 'Created user_sessions_refresh_token_idx index';
ELSE
RAISE NOTICE 'user_sessions_refresh_token_idx index already exists, skipping';
END IF;
END $$;
-- CreateIndex (with existence check)
DO $$
BEGIN
IF NOT EXISTS (
SELECT 1 FROM pg_indexes
WHERE tablename = 'user_sessions'
AND indexname = 'user_sessions_expires_at_idx'
) THEN
CREATE INDEX "user_sessions_expires_at_idx" ON "user_sessions"("expires_at");
RAISE NOTICE 'Created user_sessions_expires_at_idx index';
ELSE
RAISE NOTICE 'user_sessions_expires_at_idx index already exists, skipping';
END IF;
END $$;
-- AddForeignKey (with existence check)
DO $$
BEGIN
IF NOT EXISTS (
SELECT 1 FROM information_schema.table_constraints
WHERE table_name = 'user_sessions'
AND constraint_name = 'user_sessions_user_id_fkey'
) THEN
ALTER TABLE "user_sessions" ADD CONSTRAINT "user_sessions_user_id_fkey" FOREIGN KEY ("user_id") REFERENCES "users"("id") ON DELETE CASCADE ON UPDATE CASCADE;
RAISE NOTICE 'Created user_sessions_user_id_fkey foreign key';
ELSE
RAISE NOTICE 'user_sessions_user_id_fkey foreign key already exists, skipping';
END IF;
END $$;


@@ -0,0 +1,6 @@
-- Add TFA remember me fields to user_sessions table
ALTER TABLE "user_sessions" ADD COLUMN "tfa_remember_me" BOOLEAN NOT NULL DEFAULT false;
ALTER TABLE "user_sessions" ADD COLUMN "tfa_bypass_until" TIMESTAMP(3);
-- Create index for TFA bypass until field for efficient querying
CREATE INDEX "user_sessions_tfa_bypass_until_idx" ON "user_sessions"("tfa_bypass_until");


@@ -0,0 +1,7 @@
-- Add security fields to user_sessions table for production-ready remember me
ALTER TABLE "user_sessions" ADD COLUMN "device_fingerprint" TEXT;
ALTER TABLE "user_sessions" ADD COLUMN "login_count" INTEGER NOT NULL DEFAULT 1;
ALTER TABLE "user_sessions" ADD COLUMN "last_login_ip" TEXT;
-- Create index for device fingerprint for efficient querying
CREATE INDEX "user_sessions_device_fingerprint_idx" ON "user_sessions"("device_fingerprint");


@@ -0,0 +1,3 @@
-- AlterTable
ALTER TABLE "update_history" ADD COLUMN "total_packages" INTEGER;


@@ -0,0 +1,4 @@
-- AlterTable
ALTER TABLE "update_history" ADD COLUMN "payload_size_kb" DOUBLE PRECISION;
ALTER TABLE "update_history" ADD COLUMN "execution_time" DOUBLE PRECISION;


@@ -0,0 +1,30 @@
-- Add indexes to host_packages table for performance optimization
-- These indexes will dramatically speed up queries filtering by host_id, package_id, needs_update, and is_security_update
-- Index for queries filtering by host_id (very common - used when viewing packages for a specific host)
CREATE INDEX IF NOT EXISTS "host_packages_host_id_idx" ON "host_packages"("host_id");
-- Index for queries filtering by package_id (used when finding hosts for a specific package)
CREATE INDEX IF NOT EXISTS "host_packages_package_id_idx" ON "host_packages"("package_id");
-- Index for queries filtering by needs_update (used when finding outdated packages)
CREATE INDEX IF NOT EXISTS "host_packages_needs_update_idx" ON "host_packages"("needs_update");
-- Index for queries filtering by is_security_update (used when finding security updates)
CREATE INDEX IF NOT EXISTS "host_packages_is_security_update_idx" ON "host_packages"("is_security_update");
-- Composite index for the most common query pattern: host_id + needs_update
-- This is optimal for "show me outdated packages for this host"
CREATE INDEX IF NOT EXISTS "host_packages_host_id_needs_update_idx" ON "host_packages"("host_id", "needs_update");
-- Composite index for host_id + needs_update + is_security_update
-- This is optimal for "show me security updates for this host"
CREATE INDEX IF NOT EXISTS "host_packages_host_id_needs_update_security_idx" ON "host_packages"("host_id", "needs_update", "is_security_update");
-- Index for queries filtering by package_id + needs_update
-- This is optimal for "show me hosts where this package needs updates"
CREATE INDEX IF NOT EXISTS "host_packages_package_id_needs_update_idx" ON "host_packages"("package_id", "needs_update");
-- Index on last_checked for cleanup/maintenance queries
CREATE INDEX IF NOT EXISTS "host_packages_last_checked_idx" ON "host_packages"("last_checked");


@@ -0,0 +1,94 @@
-- CreateTable
CREATE TABLE "docker_images" (
"id" TEXT NOT NULL,
"repository" TEXT NOT NULL,
"tag" TEXT NOT NULL DEFAULT 'latest',
"image_id" TEXT NOT NULL,
"digest" TEXT,
"size_bytes" BIGINT,
"source" TEXT NOT NULL DEFAULT 'docker-hub',
"created_at" TIMESTAMP(3) NOT NULL,
"last_pulled" TIMESTAMP(3),
"last_checked" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL,
CONSTRAINT "docker_images_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "docker_containers" (
"id" TEXT NOT NULL,
"host_id" TEXT NOT NULL,
"container_id" TEXT NOT NULL,
"name" TEXT NOT NULL,
"image_id" TEXT,
"image_name" TEXT NOT NULL,
"image_tag" TEXT NOT NULL DEFAULT 'latest',
"status" TEXT NOT NULL,
"state" TEXT,
"ports" JSONB,
"created_at" TIMESTAMP(3) NOT NULL,
"started_at" TIMESTAMP(3),
"updated_at" TIMESTAMP(3) NOT NULL,
"last_checked" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "docker_containers_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "docker_image_updates" (
"id" TEXT NOT NULL,
"image_id" TEXT NOT NULL,
"current_tag" TEXT NOT NULL,
"available_tag" TEXT NOT NULL,
"is_security_update" BOOLEAN NOT NULL DEFAULT false,
"severity" TEXT,
"changelog_url" TEXT,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL,
CONSTRAINT "docker_image_updates_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE INDEX "docker_images_repository_idx" ON "docker_images"("repository");
-- CreateIndex
CREATE INDEX "docker_images_source_idx" ON "docker_images"("source");
-- CreateIndex
CREATE INDEX "docker_images_repository_tag_idx" ON "docker_images"("repository", "tag");
-- CreateIndex
CREATE UNIQUE INDEX "docker_images_repository_tag_image_id_key" ON "docker_images"("repository", "tag", "image_id");
-- CreateIndex
CREATE INDEX "docker_containers_host_id_idx" ON "docker_containers"("host_id");
-- CreateIndex
CREATE INDEX "docker_containers_image_id_idx" ON "docker_containers"("image_id");
-- CreateIndex
CREATE INDEX "docker_containers_status_idx" ON "docker_containers"("status");
-- CreateIndex
CREATE INDEX "docker_containers_name_idx" ON "docker_containers"("name");
-- CreateIndex
CREATE UNIQUE INDEX "docker_containers_host_id_container_id_key" ON "docker_containers"("host_id", "container_id");
-- CreateIndex
CREATE INDEX "docker_image_updates_image_id_idx" ON "docker_image_updates"("image_id");
-- CreateIndex
CREATE INDEX "docker_image_updates_is_security_update_idx" ON "docker_image_updates"("is_security_update");
-- CreateIndex
CREATE UNIQUE INDEX "docker_image_updates_image_id_available_tag_key" ON "docker_image_updates"("image_id", "available_tag");
-- AddForeignKey
ALTER TABLE "docker_containers" ADD CONSTRAINT "docker_containers_image_id_fkey" FOREIGN KEY ("image_id") REFERENCES "docker_images"("id") ON DELETE SET NULL ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "docker_image_updates" ADD CONSTRAINT "docker_image_updates_image_id_fkey" FOREIGN KEY ("image_id") REFERENCES "docker_images"("id") ON DELETE CASCADE ON UPDATE CASCADE;


@@ -0,0 +1,40 @@
-- CreateTable
CREATE TABLE "job_history" (
"id" TEXT NOT NULL,
"job_id" TEXT NOT NULL,
"queue_name" TEXT NOT NULL,
"job_name" TEXT NOT NULL,
"host_id" TEXT,
"api_id" TEXT,
"status" TEXT NOT NULL,
"attempt_number" INTEGER NOT NULL DEFAULT 1,
"error_message" TEXT,
"output" JSONB,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMP(3) NOT NULL,
"completed_at" TIMESTAMP(3),
CONSTRAINT "job_history_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE INDEX "job_history_job_id_idx" ON "job_history"("job_id");
-- CreateIndex
CREATE INDEX "job_history_queue_name_idx" ON "job_history"("queue_name");
-- CreateIndex
CREATE INDEX "job_history_host_id_idx" ON "job_history"("host_id");
-- CreateIndex
CREATE INDEX "job_history_api_id_idx" ON "job_history"("api_id");
-- CreateIndex
CREATE INDEX "job_history_status_idx" ON "job_history"("status");
-- CreateIndex
CREATE INDEX "job_history_created_at_idx" ON "job_history"("created_at");
-- AddForeignKey
ALTER TABLE "job_history" ADD CONSTRAINT "job_history_host_id_fkey" FOREIGN KEY ("host_id") REFERENCES "hosts"("id") ON DELETE SET NULL ON UPDATE CASCADE;


@@ -0,0 +1,43 @@
-- CreateTable
CREATE TABLE "host_group_memberships" (
"id" TEXT NOT NULL,
"host_id" TEXT NOT NULL,
"host_group_id" TEXT NOT NULL,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
CONSTRAINT "host_group_memberships_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "host_group_memberships_host_id_host_group_id_key" ON "host_group_memberships"("host_id", "host_group_id");
-- CreateIndex
CREATE INDEX "host_group_memberships_host_id_idx" ON "host_group_memberships"("host_id");
-- CreateIndex
CREATE INDEX "host_group_memberships_host_group_id_idx" ON "host_group_memberships"("host_group_id");
-- Migrate existing data from hosts.host_group_id to host_group_memberships
INSERT INTO "host_group_memberships" ("id", "host_id", "host_group_id", "created_at")
SELECT
gen_random_uuid()::text as "id",
"id" as "host_id",
"host_group_id" as "host_group_id",
CURRENT_TIMESTAMP as "created_at"
FROM "hosts"
WHERE "host_group_id" IS NOT NULL;
-- AddForeignKey
ALTER TABLE "host_group_memberships" ADD CONSTRAINT "host_group_memberships_host_id_fkey" FOREIGN KEY ("host_id") REFERENCES "hosts"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "host_group_memberships" ADD CONSTRAINT "host_group_memberships_host_group_id_fkey" FOREIGN KEY ("host_group_id") REFERENCES "host_groups"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- DropForeignKey
ALTER TABLE "hosts" DROP CONSTRAINT IF EXISTS "hosts_host_group_id_fkey";
-- DropIndex
DROP INDEX IF EXISTS "hosts_host_group_id_idx";
-- AlterTable
ALTER TABLE "hosts" DROP COLUMN "host_group_id";


@@ -1,31 +0,0 @@
-- CreateTable
CREATE TABLE "user_sessions" (
"id" TEXT NOT NULL,
"user_id" TEXT NOT NULL,
"refresh_token" TEXT NOT NULL,
"access_token_hash" TEXT,
"ip_address" TEXT,
"user_agent" TEXT,
"last_activity" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"expires_at" TIMESTAMP(3) NOT NULL,
"created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"is_revoked" BOOLEAN NOT NULL DEFAULT false,
CONSTRAINT "user_sessions_pkey" PRIMARY KEY ("id")
);
-- CreateIndex
CREATE UNIQUE INDEX "user_sessions_refresh_token_key" ON "user_sessions"("refresh_token");
-- CreateIndex
CREATE INDEX "user_sessions_user_id_idx" ON "user_sessions"("user_id");
-- CreateIndex
CREATE INDEX "user_sessions_refresh_token_idx" ON "user_sessions"("refresh_token");
-- CreateIndex
CREATE INDEX "user_sessions_expires_at_idx" ON "user_sessions"("expires_at");
-- AddForeignKey
ALTER TABLE "user_sessions" ADD CONSTRAINT "user_sessions_user_id_fkey" FOREIGN KEY ("user_id") REFERENCES "users"("id") ON DELETE CASCADE ON UPDATE CASCADE;


@@ -21,13 +21,27 @@ model dashboard_preferences {
}
model host_groups {
id String @id
name String @unique
description String?
color String? @default("#3B82F6")
created_at DateTime @default(now())
updated_at DateTime
hosts hosts[]
id String @id
name String @unique
description String?
color String? @default("#3B82F6")
created_at DateTime @default(now())
updated_at DateTime
host_group_memberships host_group_memberships[]
auto_enrollment_tokens auto_enrollment_tokens[]
}
model host_group_memberships {
id String @id
host_id String
host_group_id String
created_at DateTime @default(now())
hosts hosts @relation(fields: [host_id], references: [id], onDelete: Cascade)
host_groups host_groups @relation(fields: [host_group_id], references: [id], onDelete: Cascade)
@@unique([host_id, host_group_id])
@@index([host_id])
@@index([host_group_id])
}
model host_packages {
@@ -43,6 +57,14 @@ model host_packages {
packages packages @relation(fields: [package_id], references: [id], onDelete: Cascade)
@@unique([host_id, package_id])
@@index([host_id])
@@index([package_id])
@@index([needs_update])
@@index([is_security_update])
@@index([host_id, needs_update])
@@index([host_id, needs_update, is_security_update])
@@index([package_id, needs_update])
@@index([last_checked])
}
model host_repositories {
@@ -58,39 +80,44 @@ model host_repositories {
}
model hosts {
id String @id
friendly_name String @unique
ip String?
os_type String
os_version String
architecture String?
last_update DateTime @default(now())
status String @default("active")
created_at DateTime @default(now())
updated_at DateTime
api_id String @unique
api_key String @unique
host_group_id String?
agent_version String?
auto_update Boolean @default(true)
cpu_cores Int?
cpu_model String?
disk_details Json?
dns_servers Json?
gateway_ip String?
hostname String?
kernel_version String?
load_average Json?
network_interfaces Json?
ram_installed Int?
selinux_status String?
swap_size Int?
system_uptime String?
notes String?
host_packages host_packages[]
host_repositories host_repositories[]
host_groups host_groups? @relation(fields: [host_group_id], references: [id])
update_history update_history[]
id String @id
machine_id String @unique
friendly_name String
ip String?
os_type String
os_version String
architecture String?
last_update DateTime @default(now())
status String @default("active")
created_at DateTime @default(now())
updated_at DateTime
api_id String @unique
api_key String @unique
agent_version String?
auto_update Boolean @default(true)
cpu_cores Int?
cpu_model String?
disk_details Json?
dns_servers Json?
gateway_ip String?
hostname String?
kernel_version String?
load_average Json?
network_interfaces Json?
ram_installed Int?
selinux_status String?
swap_size Int?
system_uptime String?
notes String?
host_packages host_packages[]
host_repositories host_repositories[]
host_group_memberships host_group_memberships[]
update_history update_history[]
job_history job_history[]
@@index([machine_id])
@@index([friendly_name])
@@index([hostname])
}
model packages {
@@ -102,6 +129,9 @@ model packages {
created_at DateTime @default(now())
updated_at DateTime
host_packages host_packages[]
@@index([name])
@@index([category])
}
model repositories {
@@ -158,36 +188,43 @@ model settings {
signup_enabled Boolean @default(false)
default_user_role String @default("user")
ignore_ssl_self_signed Boolean @default(false)
logo_dark String? @default("/assets/logo_dark.png")
logo_light String? @default("/assets/logo_light.png")
favicon String? @default("/assets/logo_square.svg")
}
model update_history {
id String @id
host_id String
packages_count Int
security_count Int
timestamp DateTime @default(now())
status String @default("success")
error_message String?
hosts hosts @relation(fields: [host_id], references: [id], onDelete: Cascade)
id String @id
host_id String
packages_count Int
security_count Int
total_packages Int?
payload_size_kb Float?
execution_time Float?
timestamp DateTime @default(now())
status String @default("success")
error_message String?
hosts hosts @relation(fields: [host_id], references: [id], onDelete: Cascade)
}
model users {
id String @id
username String @unique
email String @unique
password_hash String
role String @default("admin")
is_active Boolean @default(true)
last_login DateTime?
created_at DateTime @default(now())
updated_at DateTime
tfa_backup_codes String?
tfa_enabled Boolean @default(false)
tfa_secret String?
first_name String?
last_name String?
dashboard_preferences dashboard_preferences[]
user_sessions user_sessions[]
id String @id
username String @unique
email String @unique
password_hash String
role String @default("admin")
is_active Boolean @default(true)
last_login DateTime?
created_at DateTime @default(now())
updated_at DateTime
tfa_backup_codes String?
tfa_enabled Boolean @default(false)
tfa_secret String?
first_name String?
last_name String?
dashboard_preferences dashboard_preferences[]
user_sessions user_sessions[]
auto_enrollment_tokens auto_enrollment_tokens[]
}
model user_sessions {
@@ -197,13 +234,130 @@ model user_sessions {
access_token_hash String?
ip_address String?
user_agent String?
device_fingerprint String?
last_activity DateTime @default(now())
expires_at DateTime
created_at DateTime @default(now())
is_revoked Boolean @default(false)
tfa_remember_me Boolean @default(false)
tfa_bypass_until DateTime?
login_count Int @default(1)
last_login_ip String?
users users @relation(fields: [user_id], references: [id], onDelete: Cascade)
@@index([user_id])
@@index([refresh_token])
@@index([expires_at])
@@index([tfa_bypass_until])
@@index([device_fingerprint])
}
model auto_enrollment_tokens {
id String @id
token_name String
token_key String @unique
token_secret String
created_by_user_id String?
is_active Boolean @default(true)
allowed_ip_ranges String[]
max_hosts_per_day Int @default(100)
hosts_created_today Int @default(0)
last_reset_date DateTime @default(now()) @db.Date
default_host_group_id String?
created_at DateTime @default(now())
updated_at DateTime
last_used_at DateTime?
expires_at DateTime?
metadata Json?
users users? @relation(fields: [created_by_user_id], references: [id], onDelete: SetNull)
host_groups host_groups? @relation(fields: [default_host_group_id], references: [id], onDelete: SetNull)
@@index([token_key])
@@index([is_active])
}
model docker_containers {
id String @id
host_id String
container_id String
name String
image_id String?
image_name String
image_tag String @default("latest")
status String
state String?
ports Json?
created_at DateTime
started_at DateTime?
updated_at DateTime
last_checked DateTime @default(now())
docker_images docker_images? @relation(fields: [image_id], references: [id], onDelete: SetNull)
@@unique([host_id, container_id])
@@index([host_id])
@@index([image_id])
@@index([status])
@@index([name])
}
model docker_images {
id String @id
repository String
tag String @default("latest")
image_id String
digest String?
size_bytes BigInt?
source String @default("docker-hub")
created_at DateTime
last_pulled DateTime?
last_checked DateTime @default(now())
updated_at DateTime
docker_containers docker_containers[]
docker_image_updates docker_image_updates[]
@@unique([repository, tag, image_id])
@@index([repository])
@@index([source])
@@index([repository, tag])
}
model docker_image_updates {
id String @id
image_id String
current_tag String
available_tag String
is_security_update Boolean @default(false)
severity String?
changelog_url String?
created_at DateTime @default(now())
updated_at DateTime
docker_images docker_images @relation(fields: [image_id], references: [id], onDelete: Cascade)
@@unique([image_id, available_tag])
@@index([image_id])
@@index([is_security_update])
}
model job_history {
id String @id
job_id String
queue_name String
job_name String
host_id String?
api_id String?
status String
attempt_number Int @default(1)
error_message String?
output Json?
created_at DateTime @default(now())
updated_at DateTime
completed_at DateTime?
hosts hosts? @relation(fields: [host_id], references: [id], onDelete: SetNull)
@@index([job_id])
@@index([queue_name])
@@index([host_id])
@@index([api_id])
@@index([status])
@@index([created_at])
}


@@ -3,6 +3,7 @@ const { PrismaClient } = require("@prisma/client");
const {
validate_session,
update_session_activity,
is_tfa_bypassed,
} = require("../utils/session_manager");
const prisma = new PrismaClient();
@@ -18,10 +19,10 @@ const authenticateToken = async (req, res, next) => {
}
// Verify token
const decoded = jwt.verify(
token,
process.env.JWT_SECRET || "your-secret-key",
);
if (!process.env.JWT_SECRET) {
throw new Error("JWT_SECRET environment variable is required");
}
const decoded = jwt.verify(token, process.env.JWT_SECRET);
// Validate session and check inactivity timeout
const validation = await validate_session(decoded.sessionId, token);
@@ -46,6 +47,9 @@ const authenticateToken = async (req, res, next) => {
// Update session activity timestamp
await update_session_activity(decoded.sessionId);
// Check if TFA is bypassed for this session
const tfa_bypassed = await is_tfa_bypassed(decoded.sessionId);
// Update last login (only on successful authentication)
await prisma.users.update({
where: { id: validation.user.id },
@@ -57,6 +61,7 @@ const authenticateToken = async (req, res, next) => {
req.user = validation.user;
req.session_id = decoded.sessionId;
req.tfa_bypassed = tfa_bypassed;
next();
} catch (error) {
if (error.name === "JsonWebTokenError") {
@@ -85,10 +90,10 @@ const optionalAuth = async (req, _res, next) => {
const token = authHeader?.split(" ")[1];
if (token) {
const decoded = jwt.verify(
token,
process.env.JWT_SECRET || "your-secret-key",
);
if (!process.env.JWT_SECRET) {
throw new Error("JWT_SECRET environment variable is required");
}
const decoded = jwt.verify(token, process.env.JWT_SECRET);
const user = await prisma.users.findUnique({
where: { id: decoded.userId },
select: {
@@ -114,8 +119,33 @@ const optionalAuth = async (req, _res, next) => {
}
};
// Middleware to check if TFA is required for sensitive operations
const requireTfaIfEnabled = async (req, res, next) => {
try {
// Check if user has TFA enabled
const user = await prisma.users.findUnique({
where: { id: req.user.id },
select: { tfa_enabled: true },
});
// If TFA is enabled and not bypassed, require TFA verification
if (user?.tfa_enabled && !req.tfa_bypassed) {
return res.status(403).json({
error: "Two-factor authentication required for this operation",
requires_tfa: true,
});
}
next();
} catch (error) {
console.error("TFA requirement check error:", error);
return res.status(500).json({ error: "Authentication check failed" });
}
};
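// Example wiring (illustrative only; the route path and handler are hypothetical):
//   router.post("/sensitive-action", authenticateToken, requireTfaIfEnabled, handler);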
module.exports = {
authenticateToken,
requireAdmin,
optionalAuth,
requireTfaIfEnabled,
};


@@ -17,12 +17,65 @@ const {
refresh_access_token,
revoke_session,
revoke_all_user_sessions,
get_user_sessions,
} = require("../utils/session_manager");
const router = express.Router();
const prisma = new PrismaClient();
/**
* Parse user agent string to extract browser and OS info
*/
function parse_user_agent(user_agent) {
if (!user_agent)
return { browser: "Unknown", os: "Unknown", device: "Unknown" };
const ua = user_agent.toLowerCase();
// Browser detection
let browser = "Unknown";
if (ua.includes("chrome") && !ua.includes("edg")) browser = "Chrome";
else if (ua.includes("firefox")) browser = "Firefox";
else if (ua.includes("safari") && !ua.includes("chrome")) browser = "Safari";
else if (ua.includes("edg")) browser = "Edge";
else if (ua.includes("opera")) browser = "Opera";
// OS detection
let os = "Unknown";
if (ua.includes("windows")) os = "Windows";
else if (ua.includes("macintosh") || ua.includes("mac os")) os = "macOS";
else if (ua.includes("linux")) os = "Linux";
else if (ua.includes("android")) os = "Android";
else if (ua.includes("iphone") || ua.includes("ipad")) os = "iOS";
// Device type
let device = "Desktop";
if (ua.includes("mobile")) device = "Mobile";
else if (ua.includes("tablet") || ua.includes("ipad")) device = "Tablet";
return { browser, os, device };
}
/**
* Get basic location info from IP (simplified - in production you'd use a service)
*/
function get_location_from_ip(ip) {
if (!ip) return { country: "Unknown", city: "Unknown" };
// For localhost/private IPs
if (
ip === "127.0.0.1" ||
ip === "::1" ||
ip.startsWith("192.168.") ||
ip.startsWith("10.")
) {
return { country: "Local", city: "Local Network" };
}
// In a real implementation, you'd use a service like MaxMind GeoIP2
// For now, return unknown for external IPs
return { country: "Unknown", city: "Unknown" };
}
// Check if any admin users exist (for first-time setup)
router.get("/check-admin-users", async (_req, res) => {
try {
@@ -156,7 +209,10 @@ router.post(
// Generate JWT token
const generateToken = (userId) => {
return jwt.sign({ userId }, process.env.JWT_SECRET || "your-secret-key", {
if (!process.env.JWT_SECRET) {
throw new Error("JWT_SECRET environment variable is required");
}
return jwt.sign({ userId }, process.env.JWT_SECRET, {
expiresIn: process.env.JWT_EXPIRES_IN || "24h",
});
};
@@ -173,6 +229,8 @@ router.get(
id: true,
username: true,
email: true,
first_name: true,
last_name: true,
role: true,
is_active: true,
last_login: true,
@@ -311,6 +369,14 @@ router.put(
.isLength({ min: 3 })
.withMessage("Username must be at least 3 characters"),
body("email").optional().isEmail().withMessage("Valid email is required"),
body("first_name")
.optional()
.isLength({ min: 1 })
.withMessage("First name must be at least 1 character"),
body("last_name")
.optional()
.isLength({ min: 1 })
.withMessage("Last name must be at least 1 character"),
body("role")
.optional()
.custom(async (value) => {
@@ -323,10 +389,10 @@ router.put(
}
return true;
}),
body("isActive")
body("is_active")
.optional()
.isBoolean()
.withMessage("isActive must be a boolean"),
.withMessage("is_active must be a boolean"),
],
async (req, res) => {
try {
@@ -337,13 +403,16 @@ router.put(
return res.status(400).json({ errors: errors.array() });
}
-const { username, email, role, isActive } = req.body;
+const { username, email, first_name, last_name, role, is_active } =
+req.body;
const updateData = {};
if (username) updateData.username = username;
if (email) updateData.email = email;
if (first_name !== undefined) updateData.first_name = first_name || null;
if (last_name !== undefined) updateData.last_name = last_name || null;
if (role) updateData.role = role;
-if (typeof isActive === "boolean") updateData.is_active = isActive;
+if (typeof is_active === "boolean") updateData.is_active = is_active;
// Check if user exists
const existingUser = await prisma.users.findUnique({
@@ -378,7 +447,7 @@ router.put(
}
// Prevent deactivating the last admin
-if (isActive === false && existingUser.role === "admin") {
+if (is_active === false && existingUser.role === "admin") {
const adminCount = await prisma.users.count({
where: {
role: "admin",
@@ -401,6 +470,8 @@ router.put(
id: true,
username: true,
email: true,
first_name: true,
last_name: true,
role: true,
is_active: true,
last_login: true,
@@ -747,6 +818,8 @@ router.post(
id: user.id,
username: user.username,
email: user.email,
first_name: user.first_name,
last_name: user.last_name,
role: user.role,
is_active: user.is_active,
last_login: user.last_login,
@@ -770,6 +843,10 @@ router.post(
.isLength({ min: 6, max: 6 })
.withMessage("Token must be 6 digits"),
body("token").isNumeric().withMessage("Token must contain only numbers"),
body("remember_me")
.optional()
.isBoolean()
.withMessage("Remember me must be a boolean"),
],
async (req, res) => {
try {
@@ -778,7 +855,7 @@ router.post(
return res.status(400).json({ errors: errors.array() });
}
-const { username, token } = req.body;
+const { username, token, remember_me = false } = req.body;
// Find user
const user = await prisma.users.findFirst({
@@ -847,13 +924,20 @@ router.post(
// Create session with access and refresh tokens
const ip_address = req.ip || req.connection.remoteAddress;
const user_agent = req.get("user-agent");
-const session = await create_session(user.id, ip_address, user_agent);
+const session = await create_session(
+user.id,
+ip_address,
+user_agent,
+remember_me,
+req,
+);
res.json({
message: "Login successful",
token: session.access_token,
refresh_token: session.refresh_token,
expires_at: session.expires_at,
tfa_bypass_until: session.tfa_bypass_until,
user: {
id: user.id,
username: user.username,
@@ -1091,10 +1175,43 @@ router.post(
// Get user's active sessions
router.get("/sessions", authenticateToken, async (req, res) => {
try {
-const sessions = await get_user_sessions(req.user.id);
+const sessions = await prisma.user_sessions.findMany({
+where: {
+user_id: req.user.id,
+is_revoked: false,
+expires_at: { gt: new Date() },
+},
+select: {
+id: true,
+ip_address: true,
+user_agent: true,
+device_fingerprint: true,
+last_activity: true,
+created_at: true,
+expires_at: true,
+tfa_remember_me: true,
+tfa_bypass_until: true,
+login_count: true,
+last_login_ip: true,
+},
+orderBy: { last_activity: "desc" },
+});
+// Enhance sessions with device info
+const enhanced_sessions = sessions.map((session) => {
+const is_current_session = session.id === req.session_id;
+const device_info = parse_user_agent(session.user_agent);
+return {
+...session,
+is_current_session,
+device_info,
+location_info: get_location_from_ip(session.ip_address),
+};
+});
res.json({
-sessions: sessions,
+sessions: enhanced_sessions,
});
} catch (error) {
console.error("Get sessions error:", error);
@@ -1116,6 +1233,11 @@ router.delete("/sessions/:session_id", authenticateToken, async (req, res) => {
return res.status(404).json({ error: "Session not found" });
}
// Don't allow revoking the current session
if (session_id === req.session_id) {
return res.status(400).json({ error: "Cannot revoke current session" });
}
await revoke_session(session_id);
res.json({
@@ -1127,4 +1249,25 @@ router.delete("/sessions/:session_id", authenticateToken, async (req, res) => {
}
});
// Revoke all sessions except current one
router.delete("/sessions", authenticateToken, async (req, res) => {
try {
// Revoke all sessions except the current one
await prisma.user_sessions.updateMany({
where: {
user_id: req.user.id,
id: { not: req.session_id },
},
data: { is_revoked: true },
});
res.json({
message: "All other sessions revoked successfully",
});
} catch (error) {
console.error("Revoke all sessions error:", error);
res.status(500).json({ error: "Failed to revoke sessions" });
}
});
module.exports = router;

View File

@@ -0,0 +1,745 @@
const express = require("express");
const { PrismaClient } = require("@prisma/client");
const crypto = require("node:crypto");
const bcrypt = require("bcryptjs");
const { body, validationResult } = require("express-validator");
const { authenticateToken } = require("../middleware/auth");
const { requireManageSettings } = require("../middleware/permissions");
const { v4: uuidv4 } = require("uuid");
const router = express.Router();
const prisma = new PrismaClient();
// Generate auto-enrollment token credentials
const generate_auto_enrollment_token = () => {
const token_key = `patchmon_ae_${crypto.randomBytes(16).toString("hex")}`;
const token_secret = crypto.randomBytes(48).toString("hex");
return { token_key, token_secret };
};
// Middleware to validate auto-enrollment token
const validate_auto_enrollment_token = async (req, res, next) => {
try {
const token_key = req.headers["x-auto-enrollment-key"];
const token_secret = req.headers["x-auto-enrollment-secret"];
if (!token_key || !token_secret) {
return res
.status(401)
.json({ error: "Auto-enrollment credentials required" });
}
// Find token
const token = await prisma.auto_enrollment_tokens.findUnique({
where: { token_key: token_key },
});
if (!token || !token.is_active) {
return res.status(401).json({ error: "Invalid or inactive token" });
}
// Verify secret (hashed)
const is_valid = await bcrypt.compare(token_secret, token.token_secret);
if (!is_valid) {
return res.status(401).json({ error: "Invalid token secret" });
}
// Check expiration
if (token.expires_at && new Date() > new Date(token.expires_at)) {
return res.status(401).json({ error: "Token expired" });
}
// Check IP whitelist if configured
if (token.allowed_ip_ranges && token.allowed_ip_ranges.length > 0) {
const client_ip = req.ip || req.connection.remoteAddress;
// Basic substring IP check - can be enhanced with proper CIDR matching (see the sketch after this middleware)
const ip_allowed = token.allowed_ip_ranges.some((allowed_ip) => {
return client_ip.includes(allowed_ip);
});
if (!ip_allowed) {
console.warn(
`Auto-enrollment attempt from unauthorized IP: ${client_ip}`,
);
return res
.status(403)
.json({ error: "IP address not authorized for this token" });
}
}
// Check rate limit (hosts per day)
const today = new Date().toISOString().split("T")[0];
const token_reset_date = token.last_reset_date.toISOString().split("T")[0];
if (token_reset_date !== today) {
// Reset daily counter
await prisma.auto_enrollment_tokens.update({
where: { id: token.id },
data: {
hosts_created_today: 0,
last_reset_date: new Date(),
updated_at: new Date(),
},
});
token.hosts_created_today = 0;
}
if (token.hosts_created_today >= token.max_hosts_per_day) {
return res.status(429).json({
error: "Rate limit exceeded",
message: `Maximum ${token.max_hosts_per_day} hosts per day allowed for this token`,
});
}
req.auto_enrollment_token = token;
next();
} catch (error) {
console.error("Auto-enrollment token validation error:", error);
res.status(500).json({ error: "Token validation failed" });
}
};
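// A sketch of proper IPv4 CIDR matching (e.g. "10.0.0.0/8") that could replace the
// substring check above. Plain entries fall back to /32; IPv6 and malformed input are
// out of scope for this illustration.
function ip_in_cidr(ip, cidr) {
	const [range, bits = "32"] = cidr.split("/");
	// Pack a dotted-quad address into an unsigned 32-bit integer
	const to_int = (addr) =>
		addr.split(".").reduce((acc, octet) => (acc << 8) + Number(octet), 0) >>> 0;
	const mask = bits === "0" ? 0 : (~0 << (32 - Number(bits))) >>> 0;
	return (to_int(ip) & mask) === (to_int(range) & mask);
}
// Example: ip_in_cidr("192.168.1.42", "192.168.0.0/16") === true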
// ========== ADMIN ENDPOINTS (Manage Tokens) ==========
// Create auto-enrollment token
router.post(
"/tokens",
authenticateToken,
requireManageSettings,
[
body("token_name")
.isLength({ min: 1, max: 255 })
.withMessage("Token name is required (max 255 characters)"),
body("allowed_ip_ranges")
.optional()
.isArray()
.withMessage("Allowed IP ranges must be an array"),
body("max_hosts_per_day")
.optional()
.isInt({ min: 1, max: 1000 })
.withMessage("Max hosts per day must be between 1 and 1000"),
body("default_host_group_id")
.optional({ nullable: true, checkFalsy: true })
.isString(),
body("expires_at")
.optional({ nullable: true, checkFalsy: true })
.isISO8601()
.withMessage("Invalid date format"),
],
async (req, res) => {
try {
const errors = validationResult(req);
if (!errors.isEmpty()) {
return res.status(400).json({ errors: errors.array() });
}
const {
token_name,
allowed_ip_ranges = [],
max_hosts_per_day = 100,
default_host_group_id,
expires_at,
metadata = {},
} = req.body;
// Validate host group if provided
if (default_host_group_id) {
const host_group = await prisma.host_groups.findUnique({
where: { id: default_host_group_id },
});
if (!host_group) {
return res.status(400).json({ error: "Host group not found" });
}
}
const { token_key, token_secret } = generate_auto_enrollment_token();
const hashed_secret = await bcrypt.hash(token_secret, 10);
const token = await prisma.auto_enrollment_tokens.create({
data: {
id: uuidv4(),
token_name,
token_key: token_key,
token_secret: hashed_secret,
created_by_user_id: req.user.id,
allowed_ip_ranges,
max_hosts_per_day,
default_host_group_id: default_host_group_id || null,
expires_at: expires_at ? new Date(expires_at) : null,
metadata: { integration_type: "proxmox-lxc", ...metadata },
updated_at: new Date(),
},
include: {
host_groups: {
select: {
id: true,
name: true,
color: true,
},
},
users: {
select: {
id: true,
username: true,
first_name: true,
last_name: true,
},
},
},
});
// Return unhashed secret ONLY once (like API keys)
res.status(201).json({
message: "Auto-enrollment token created successfully",
token: {
id: token.id,
token_name: token.token_name,
token_key: token_key,
token_secret: token_secret, // ONLY returned here!
max_hosts_per_day: token.max_hosts_per_day,
default_host_group: token.host_groups,
created_by: token.users,
expires_at: token.expires_at,
},
warning: "⚠️ Save the token_secret now - it cannot be retrieved later!",
});
} catch (error) {
console.error("Create auto-enrollment token error:", error);
res.status(500).json({ error: "Failed to create token" });
}
},
);
// List auto-enrollment tokens
router.get(
"/tokens",
authenticateToken,
requireManageSettings,
async (_req, res) => {
try {
const tokens = await prisma.auto_enrollment_tokens.findMany({
select: {
id: true,
token_name: true,
token_key: true,
is_active: true,
allowed_ip_ranges: true,
max_hosts_per_day: true,
hosts_created_today: true,
last_used_at: true,
expires_at: true,
created_at: true,
default_host_group_id: true,
metadata: true,
host_groups: {
select: {
id: true,
name: true,
color: true,
},
},
users: {
select: {
id: true,
username: true,
first_name: true,
last_name: true,
},
},
},
orderBy: { created_at: "desc" },
});
res.json(tokens);
} catch (error) {
console.error("List auto-enrollment tokens error:", error);
res.status(500).json({ error: "Failed to list tokens" });
}
},
);
// Get single token details
router.get(
"/tokens/:tokenId",
authenticateToken,
requireManageSettings,
async (req, res) => {
try {
const { tokenId } = req.params;
const token = await prisma.auto_enrollment_tokens.findUnique({
where: { id: tokenId },
include: {
host_groups: {
select: {
id: true,
name: true,
color: true,
},
},
users: {
select: {
id: true,
username: true,
first_name: true,
last_name: true,
},
},
},
});
if (!token) {
return res.status(404).json({ error: "Token not found" });
}
// Don't include the secret in response
const { token_secret: _secret, ...token_data } = token;
res.json(token_data);
} catch (error) {
console.error("Get token error:", error);
res.status(500).json({ error: "Failed to get token" });
}
},
);
// Update token (toggle active state, update limits, etc.)
router.patch(
"/tokens/:tokenId",
authenticateToken,
requireManageSettings,
[
body("is_active").optional().isBoolean(),
body("max_hosts_per_day").optional().isInt({ min: 1, max: 1000 }),
body("allowed_ip_ranges").optional().isArray(),
body("expires_at").optional().isISO8601(),
],
async (req, res) => {
try {
const errors = validationResult(req);
if (!errors.isEmpty()) {
return res.status(400).json({ errors: errors.array() });
}
const { tokenId } = req.params;
const update_data = { updated_at: new Date() };
if (req.body.is_active !== undefined)
update_data.is_active = req.body.is_active;
if (req.body.max_hosts_per_day !== undefined)
update_data.max_hosts_per_day = req.body.max_hosts_per_day;
if (req.body.allowed_ip_ranges !== undefined)
update_data.allowed_ip_ranges = req.body.allowed_ip_ranges;
if (req.body.expires_at !== undefined)
update_data.expires_at = new Date(req.body.expires_at);
const token = await prisma.auto_enrollment_tokens.update({
where: { id: tokenId },
data: update_data,
include: {
host_groups: true,
users: {
select: {
username: true,
first_name: true,
last_name: true,
},
},
},
});
const { token_secret: _secret, ...token_data } = token;
res.json({
message: "Token updated successfully",
token: token_data,
});
} catch (error) {
console.error("Update token error:", error);
res.status(500).json({ error: "Failed to update token" });
}
},
);
// Delete token
router.delete(
"/tokens/:tokenId",
authenticateToken,
requireManageSettings,
async (req, res) => {
try {
const { tokenId } = req.params;
const token = await prisma.auto_enrollment_tokens.findUnique({
where: { id: tokenId },
});
if (!token) {
return res.status(404).json({ error: "Token not found" });
}
await prisma.auto_enrollment_tokens.delete({
where: { id: tokenId },
});
res.json({
message: "Auto-enrollment token deleted successfully",
deleted_token: {
id: token.id,
token_name: token.token_name,
},
});
} catch (error) {
console.error("Delete token error:", error);
res.status(500).json({ error: "Failed to delete token" });
}
},
);
// ========== AUTO-ENROLLMENT ENDPOINTS (Used by Scripts) ==========
// Future integrations can follow this pattern:
// - /proxmox-lxc - Proxmox LXC containers
// - /vmware-esxi - VMware ESXi VMs
// - /docker - Docker containers
// - /kubernetes - Kubernetes pods
// - /aws-ec2 - AWS EC2 instances
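// A hypothetical skeleton for one of the integrations listed above - the path and the
// 501 placeholder are illustrative only, not part of this changeset. A real integration
// would reuse validate_auto_enrollment_token and mirror the /proxmox-lxc and /enroll
// handlers below.
router.post(
	"/docker/enroll",
	validate_auto_enrollment_token,
	async (_req, res) => {
		// 1. Validate the payload (friendly_name, machine_id, container metadata).
		// 2. Create the host with generated api_id/api_key, as /enroll does below.
		// 3. Increment hosts_created_today on req.auto_enrollment_token.
		res.status(501).json({ error: "Docker auto-enrollment not implemented" });
	},
);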
// Serve the Proxmox LXC enrollment script with credentials injected
router.get("/proxmox-lxc", async (req, res) => {
try {
// Get token from query params
const token_key = req.query.token_key;
const token_secret = req.query.token_secret;
if (!token_key || !token_secret) {
return res
.status(401)
.json({ error: "Token key and secret required as query parameters" });
}
// Validate token
const token = await prisma.auto_enrollment_tokens.findUnique({
where: { token_key: token_key },
});
if (!token || !token.is_active) {
return res.status(401).json({ error: "Invalid or inactive token" });
}
// Verify secret
const is_valid = await bcrypt.compare(token_secret, token.token_secret);
if (!is_valid) {
return res.status(401).json({ error: "Invalid token secret" });
}
// Check expiration
if (token.expires_at && new Date() > new Date(token.expires_at)) {
return res.status(401).json({ error: "Token expired" });
}
const fs = require("node:fs");
const path = require("node:path");
const script_path = path.join(
__dirname,
"../../../agents/proxmox_auto_enroll.sh",
);
if (!fs.existsSync(script_path)) {
return res
.status(404)
.json({ error: "Proxmox enrollment script not found" });
}
let script = fs.readFileSync(script_path, "utf8");
// Convert Windows line endings to Unix line endings
script = script.replace(/\r\n/g, "\n").replace(/\r/g, "\n");
// Get the configured server URL from settings
let server_url = "http://localhost:3001";
try {
const settings = await prisma.settings.findFirst();
if (settings?.server_url) {
server_url = settings.server_url;
}
} catch (settings_error) {
console.warn(
"Could not fetch settings, using default server URL:",
settings_error.message,
);
}
// Determine curl flags dynamically from settings
let curl_flags = "-s";
try {
const settings = await prisma.settings.findFirst();
if (settings && settings.ignore_ssl_self_signed === true) {
curl_flags = "-sk";
}
} catch (_) {}
// Check for --force parameter
const force_install = req.query.force === "true" || req.query.force === "1";
// Inject the token credentials, server URL, curl flags, and force flag into the script
const env_vars = `#!/bin/bash
# PatchMon Auto-Enrollment Configuration (Auto-generated)
export PATCHMON_URL="${server_url}"
export AUTO_ENROLLMENT_KEY="${token.token_key}"
export AUTO_ENROLLMENT_SECRET="${token_secret}"
export CURL_FLAGS="${curl_flags}"
export FORCE_INSTALL="${force_install ? "true" : "false"}"
`;
// Remove the shebang and configuration section from the original script
script = script.replace(/^#!/, "#");
// Remove the configuration section (between # ===== CONFIGURATION ===== and the next # =====)
script = script.replace(
/# ===== CONFIGURATION =====[\s\S]*?(?=# ===== COLOR OUTPUT =====)/,
"",
);
script = env_vars + script;
res.setHeader("Content-Type", "text/plain");
res.setHeader(
"Content-Disposition",
'inline; filename="proxmox_auto_enroll.sh"',
);
res.send(script);
} catch (error) {
console.error("Proxmox script serve error:", error);
res.status(500).json({ error: "Failed to serve enrollment script" });
}
});
// Create host via auto-enrollment
router.post(
"/enroll",
validate_auto_enrollment_token,
[
body("friendly_name")
.isLength({ min: 1, max: 255 })
.withMessage("Friendly name is required"),
body("machine_id")
.isLength({ min: 1, max: 255 })
.withMessage("Machine ID is required"),
body("metadata").optional().isObject(),
],
async (req, res) => {
try {
const errors = validationResult(req);
if (!errors.isEmpty()) {
return res.status(400).json({ errors: errors.array() });
}
const { friendly_name, machine_id } = req.body;
// Generate host API credentials
const api_id = `patchmon_${crypto.randomBytes(8).toString("hex")}`;
const api_key = crypto.randomBytes(32).toString("hex");
// Check if host already exists by machine_id (not hostname)
const existing_host = await prisma.hosts.findUnique({
where: { machine_id },
});
if (existing_host) {
return res.status(409).json({
error: "Host already exists",
host_id: existing_host.id,
api_id: existing_host.api_id,
machine_id: existing_host.machine_id,
friendly_name: existing_host.friendly_name,
message:
"This machine is already enrolled in PatchMon (matched by machine ID)",
});
}
// Create host
const host = await prisma.hosts.create({
data: {
id: uuidv4(),
machine_id,
friendly_name,
os_type: "unknown",
os_version: "unknown",
api_id: api_id,
api_key: api_key,
host_group_id: req.auto_enrollment_token.default_host_group_id,
status: "pending",
notes: `Auto-enrolled via ${req.auto_enrollment_token.token_name} on ${new Date().toISOString()}`,
updated_at: new Date(),
},
include: {
host_groups: {
select: {
id: true,
name: true,
color: true,
},
},
},
});
// Update token usage stats
await prisma.auto_enrollment_tokens.update({
where: { id: req.auto_enrollment_token.id },
data: {
hosts_created_today: { increment: 1 },
last_used_at: new Date(),
updated_at: new Date(),
},
});
console.log(
`Auto-enrolled host: ${friendly_name} (${host.id}) via token: ${req.auto_enrollment_token.token_name}`,
);
res.status(201).json({
message: "Host enrolled successfully",
host: {
id: host.id,
friendly_name: host.friendly_name,
api_id: api_id,
api_key: api_key,
host_group: host.host_groups,
status: host.status,
},
});
} catch (error) {
console.error("Auto-enrollment error:", error);
res.status(500).json({ error: "Failed to enroll host" });
}
},
);
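// How a provisioning script might call the endpoint above, shown as a sketch. The
// /api/v1/auto-enrollment mount path and Node 18+ global fetch are assumptions; the
// headers and body fields match validate_auto_enrollment_token and the validators above.
async function enroll_host_example(base_url, key, secret, friendly_name, machine_id) {
	const response = await fetch(`${base_url}/api/v1/auto-enrollment/enroll`, {
		method: "POST",
		headers: {
			"Content-Type": "application/json",
			"X-Auto-Enrollment-Key": key,
			"X-Auto-Enrollment-Secret": secret,
		},
		body: JSON.stringify({ friendly_name, machine_id }),
	});
	if (response.status === 409) {
		// Already enrolled - the body includes the existing api_id and machine_id
		return { already_enrolled: true, ...(await response.json()) };
	}
	if (!response.ok) {
		throw new Error(`Enrollment failed with status ${response.status}`);
	}
	const { host } = await response.json();
	return host; // includes the generated api_id and api_key for the agent
}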
// Bulk enroll multiple hosts at once
router.post(
"/enroll/bulk",
validate_auto_enrollment_token,
[
body("hosts")
.isArray({ min: 1, max: 50 })
.withMessage("Hosts array required (max 50)"),
body("hosts.*.friendly_name")
.isLength({ min: 1 })
.withMessage("Each host needs a friendly_name"),
],
async (req, res) => {
try {
const errors = validationResult(req);
if (!errors.isEmpty()) {
return res.status(400).json({ errors: errors.array() });
}
const { hosts } = req.body;
// Check rate limit
const remaining_quota =
req.auto_enrollment_token.max_hosts_per_day -
req.auto_enrollment_token.hosts_created_today;
if (hosts.length > remaining_quota) {
return res.status(429).json({
error: "Rate limit exceeded",
message: `Only ${remaining_quota} hosts remaining in daily quota`,
});
}
const results = {
success: [],
failed: [],
skipped: [],
};
for (const host_data of hosts) {
try {
const { friendly_name, machine_id } = host_data;
if (!machine_id) {
results.failed.push({
friendly_name,
error: "Machine ID is required",
});
continue;
}
// Check if host already exists by machine_id
const existing_host = await prisma.hosts.findUnique({
where: { machine_id },
});
if (existing_host) {
results.skipped.push({
friendly_name,
machine_id,
reason: "Machine already enrolled",
api_id: existing_host.api_id,
});
continue;
}
// Generate credentials
const api_id = `patchmon_${crypto.randomBytes(8).toString("hex")}`;
const api_key = crypto.randomBytes(32).toString("hex");
// Create host
const host = await prisma.hosts.create({
data: {
id: uuidv4(),
machine_id,
friendly_name,
os_type: "unknown",
os_version: "unknown",
api_id: api_id,
api_key: api_key,
host_group_id: req.auto_enrollment_token.default_host_group_id,
status: "pending",
notes: `Auto-enrolled via ${req.auto_enrollment_token.token_name} on ${new Date().toISOString()}`,
updated_at: new Date(),
},
});
results.success.push({
id: host.id,
friendly_name: host.friendly_name,
api_id: api_id,
api_key: api_key,
});
} catch (error) {
results.failed.push({
friendly_name: host_data.friendly_name,
error: error.message,
});
}
}
// Update token usage stats
if (results.success.length > 0) {
await prisma.auto_enrollment_tokens.update({
where: { id: req.auto_enrollment_token.id },
data: {
hosts_created_today: { increment: results.success.length },
last_used_at: new Date(),
updated_at: new Date(),
},
});
}
res.status(201).json({
message: `Bulk enrollment completed: ${results.success.length} succeeded, ${results.failed.length} failed, ${results.skipped.length} skipped`,
results,
});
} catch (error) {
console.error("Bulk auto-enrollment error:", error);
res.status(500).json({ error: "Failed to bulk enroll hosts" });
}
},
);
module.exports = router;

View File

@@ -0,0 +1,416 @@
const express = require("express");
const { queueManager, QUEUE_NAMES } = require("../services/automation");
const { getConnectedApiIds } = require("../services/agentWs");
const { authenticateToken } = require("../middleware/auth");
const router = express.Router();
// Get all queue statistics
router.get("/stats", authenticateToken, async (_req, res) => {
try {
const stats = await queueManager.getAllQueueStats();
res.json({
success: true,
data: stats,
});
} catch (error) {
console.error("Error fetching queue stats:", error);
res.status(500).json({
success: false,
error: "Failed to fetch queue statistics",
});
}
});
// Get specific queue statistics
router.get("/stats/:queueName", authenticateToken, async (req, res) => {
try {
const { queueName } = req.params;
if (!Object.values(QUEUE_NAMES).includes(queueName)) {
return res.status(400).json({
success: false,
error: "Invalid queue name",
});
}
const stats = await queueManager.getQueueStats(queueName);
res.json({
success: true,
data: stats,
});
} catch (error) {
console.error("Error fetching queue stats:", error);
res.status(500).json({
success: false,
error: "Failed to fetch queue statistics",
});
}
});
// Get recent jobs for a queue
router.get("/jobs/:queueName", authenticateToken, async (req, res) => {
try {
const { queueName } = req.params;
const { limit = 10 } = req.query;
if (!Object.values(QUEUE_NAMES).includes(queueName)) {
return res.status(400).json({
success: false,
error: "Invalid queue name",
});
}
const jobs = await queueManager.getRecentJobs(
queueName,
parseInt(limit, 10),
);
// Format jobs for frontend
const formattedJobs = jobs.map((job) => ({
id: job.id,
name: job.name,
status: job.finishedOn
? job.failedReason
? "failed"
: "completed"
: "active",
progress: job.progress,
data: job.data,
returnvalue: job.returnvalue,
failedReason: job.failedReason,
processedOn: job.processedOn,
finishedOn: job.finishedOn,
createdAt: new Date(job.timestamp),
attemptsMade: job.attemptsMade,
delay: job.delay,
}));
res.json({
success: true,
data: formattedJobs,
});
} catch (error) {
console.error("Error fetching recent jobs:", error);
res.status(500).json({
success: false,
error: "Failed to fetch recent jobs",
});
}
});
// Trigger manual GitHub update check
router.post("/trigger/github-update", authenticateToken, async (_req, res) => {
try {
const job = await queueManager.triggerGitHubUpdateCheck();
res.json({
success: true,
data: {
jobId: job.id,
message: "GitHub update check triggered successfully",
},
});
} catch (error) {
console.error("Error triggering GitHub update check:", error);
res.status(500).json({
success: false,
error: "Failed to trigger GitHub update check",
});
}
});
// Trigger manual session cleanup
router.post(
"/trigger/session-cleanup",
authenticateToken,
async (_req, res) => {
try {
const job = await queueManager.triggerSessionCleanup();
res.json({
success: true,
data: {
jobId: job.id,
message: "Session cleanup triggered successfully",
},
});
} catch (error) {
console.error("Error triggering session cleanup:", error);
res.status(500).json({
success: false,
error: "Failed to trigger session cleanup",
});
}
},
);
// Trigger Agent Collection: enqueue report_now for connected agents only
router.post(
"/trigger/agent-collection",
authenticateToken,
async (_req, res) => {
try {
const queue = queueManager.queues[QUEUE_NAMES.AGENT_COMMANDS];
const apiIds = getConnectedApiIds();
if (!apiIds || apiIds.length === 0) {
return res.json({ success: true, data: { enqueued: 0 } });
}
const jobs = apiIds.map((apiId) => ({
name: "report_now",
data: { api_id: apiId, type: "report_now" },
opts: { attempts: 3, backoff: { type: "fixed", delay: 2000 } },
}));
await queue.addBulk(jobs);
res.json({ success: true, data: { enqueued: jobs.length } });
} catch (error) {
console.error("Error triggering agent collection:", error);
res
.status(500)
.json({ success: false, error: "Failed to trigger agent collection" });
}
},
);
// Trigger manual orphaned repo cleanup
router.post(
"/trigger/orphaned-repo-cleanup",
authenticateToken,
async (_req, res) => {
try {
const job = await queueManager.triggerOrphanedRepoCleanup();
res.json({
success: true,
data: {
jobId: job.id,
message: "Orphaned repository cleanup triggered successfully",
},
});
} catch (error) {
console.error("Error triggering orphaned repository cleanup:", error);
res.status(500).json({
success: false,
error: "Failed to trigger orphaned repository cleanup",
});
}
},
);
// Trigger manual orphaned package cleanup
router.post(
"/trigger/orphaned-package-cleanup",
authenticateToken,
async (_req, res) => {
try {
const job = await queueManager.triggerOrphanedPackageCleanup();
res.json({
success: true,
data: {
jobId: job.id,
message: "Orphaned package cleanup triggered successfully",
},
});
} catch (error) {
console.error("Error triggering orphaned package cleanup:", error);
res.status(500).json({
success: false,
error: "Failed to trigger orphaned package cleanup",
});
}
},
);
// Get queue health status
router.get("/health", authenticateToken, async (_req, res) => {
try {
const stats = await queueManager.getAllQueueStats();
const totalJobs = Object.values(stats).reduce((sum, queueStats) => {
return sum + queueStats.waiting + queueStats.active + queueStats.failed;
}, 0);
const health = {
status: "healthy",
totalJobs,
queues: Object.keys(stats).length,
timestamp: new Date().toISOString(),
};
// Check for unhealthy conditions
if (totalJobs > 1000) {
health.status = "warning";
health.message = "High number of queued jobs";
}
const failedJobs = Object.values(stats).reduce((sum, queueStats) => {
return sum + queueStats.failed;
}, 0);
if (failedJobs > 10) {
health.status = "error";
health.message = "High number of failed jobs";
}
res.json({
success: true,
data: health,
});
} catch (error) {
console.error("Error checking queue health:", error);
res.status(500).json({
success: false,
error: "Failed to check queue health",
});
}
});
// Get automation overview (for dashboard cards)
router.get("/overview", authenticateToken, async (_req, res) => {
try {
const stats = await queueManager.getAllQueueStats();
const { getSettings } = require("../services/settingsService");
const settings = await getSettings();
// Get recent jobs for each queue to show last run times
const recentJobs = await Promise.all([
queueManager.getRecentJobs(QUEUE_NAMES.GITHUB_UPDATE_CHECK, 1),
queueManager.getRecentJobs(QUEUE_NAMES.SESSION_CLEANUP, 1),
queueManager.getRecentJobs(QUEUE_NAMES.ORPHANED_REPO_CLEANUP, 1),
queueManager.getRecentJobs(QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP, 1),
queueManager.getRecentJobs(QUEUE_NAMES.AGENT_COMMANDS, 1),
]);
// Calculate overview metrics
const overview = {
scheduledTasks:
stats[QUEUE_NAMES.GITHUB_UPDATE_CHECK].delayed +
stats[QUEUE_NAMES.SESSION_CLEANUP].delayed +
stats[QUEUE_NAMES.ORPHANED_REPO_CLEANUP].delayed +
stats[QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP].delayed,
runningTasks:
stats[QUEUE_NAMES.GITHUB_UPDATE_CHECK].active +
stats[QUEUE_NAMES.SESSION_CLEANUP].active +
stats[QUEUE_NAMES.ORPHANED_REPO_CLEANUP].active +
stats[QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP].active,
failedTasks:
stats[QUEUE_NAMES.GITHUB_UPDATE_CHECK].failed +
stats[QUEUE_NAMES.SESSION_CLEANUP].failed +
stats[QUEUE_NAMES.ORPHANED_REPO_CLEANUP].failed +
stats[QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP].failed,
totalAutomations: Object.values(stats).reduce((sum, queueStats) => {
return (
sum +
queueStats.completed +
queueStats.failed +
queueStats.active +
queueStats.waiting +
queueStats.delayed
);
}, 0),
// Automation details with last run times
automations: [
{
name: "GitHub Update Check",
queue: QUEUE_NAMES.GITHUB_UPDATE_CHECK,
description: "Checks for new PatchMon releases",
schedule: "Daily at midnight",
lastRun: recentJobs[0][0]?.finishedOn
? new Date(recentJobs[0][0].finishedOn).toLocaleString()
: "Never",
lastRunTimestamp: recentJobs[0][0]?.finishedOn || 0,
status: recentJobs[0][0]?.failedReason
? "Failed"
: recentJobs[0][0]
? "Success"
: "Never run",
stats: stats[QUEUE_NAMES.GITHUB_UPDATE_CHECK],
},
{
name: "Session Cleanup",
queue: QUEUE_NAMES.SESSION_CLEANUP,
description: "Cleans up expired user sessions",
schedule: "Every hour",
lastRun: recentJobs[1][0]?.finishedOn
? new Date(recentJobs[1][0].finishedOn).toLocaleString()
: "Never",
lastRunTimestamp: recentJobs[1][0]?.finishedOn || 0,
status: recentJobs[1][0]?.failedReason
? "Failed"
: recentJobs[1][0]
? "Success"
: "Never run",
stats: stats[QUEUE_NAMES.SESSION_CLEANUP],
},
{
name: "Orphaned Repo Cleanup",
queue: QUEUE_NAMES.ORPHANED_REPO_CLEANUP,
description: "Removes repositories with no associated hosts",
schedule: "Daily at 2 AM",
lastRun: recentJobs[2][0]?.finishedOn
? new Date(recentJobs[2][0].finishedOn).toLocaleString()
: "Never",
lastRunTimestamp: recentJobs[2][0]?.finishedOn || 0,
status: recentJobs[2][0]?.failedReason
? "Failed"
: recentJobs[2][0]
? "Success"
: "Never run",
stats: stats[QUEUE_NAMES.ORPHANED_REPO_CLEANUP],
},
{
name: "Orphaned Package Cleanup",
queue: QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP,
description: "Removes packages with no associated hosts",
schedule: "Daily at 3 AM",
lastRun: recentJobs[3][0]?.finishedOn
? new Date(recentJobs[3][0].finishedOn).toLocaleString()
: "Never",
lastRunTimestamp: recentJobs[3][0]?.finishedOn || 0,
status: recentJobs[3][0]?.failedReason
? "Failed"
: recentJobs[3][0]
? "Success"
: "Never run",
stats: stats[QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP],
},
{
name: "Collect Host Statistics",
queue: QUEUE_NAMES.AGENT_COMMANDS,
description: "Collects package statistics from connected agents only",
schedule: `Every ${settings.update_interval} minutes (Agent-driven)`,
lastRun: recentJobs[4][0]?.finishedOn
? new Date(recentJobs[4][0].finishedOn).toLocaleString()
: "Never",
lastRunTimestamp: recentJobs[4][0]?.finishedOn || 0,
status: recentJobs[4][0]?.failedReason
? "Failed"
: recentJobs[4][0]
? "Success"
: "Never run",
stats: stats[QUEUE_NAMES.AGENT_COMMANDS],
},
].sort((a, b) => {
// Sort by last run timestamp (most recent first)
// If both have never run (timestamp 0), maintain original order
if (a.lastRunTimestamp === 0 && b.lastRunTimestamp === 0) return 0;
if (a.lastRunTimestamp === 0) return 1; // Never run goes to bottom
if (b.lastRunTimestamp === 0) return -1; // Never run goes to bottom
return b.lastRunTimestamp - a.lastRunTimestamp; // Most recent first
}),
};
res.json({
success: true,
data: overview,
});
} catch (error) {
console.error("Error fetching automation overview:", error);
res.status(500).json({
success: false,
error: "Failed to fetch automation overview",
});
}
});
module.exports = router;

View File

@@ -130,15 +130,20 @@ async function createDefaultDashboardPreferences(userId, userRole = "user") {
requiredPermission: "can_view_packages",
order: 13,
},
{
cardId: "packageTrends",
requiredPermission: "can_view_packages",
order: 14,
},
{
cardId: "recentUsers",
requiredPermission: "can_view_users",
-order: 14,
+order: 15,
},
{
cardId: "quickStats",
requiredPermission: "can_view_dashboard",
-order: 15,
+order: 16,
},
];
@@ -341,19 +346,26 @@ router.get("/defaults", authenticateToken, async (_req, res) => {
enabled: true,
order: 13,
},
{
cardId: "packageTrends",
title: "Package Trends",
icon: "TrendingUp",
enabled: true,
order: 14,
},
{
cardId: "recentUsers",
title: "Recent Users Logged in",
icon: "Users",
enabled: true,
-order: 14,
+order: 15,
},
{
cardId: "quickStats",
title: "Quick Stats",
icon: "TrendingUp",
enabled: true,
-order: 15,
+order: 16,
},
];

View File

@@ -8,6 +8,7 @@ const {
requireViewPackages,
requireViewUsers,
} = require("../middleware/permissions");
const { queueManager } = require("../services/automation");
const router = express.Router();
const prisma = new PrismaClient();
@@ -145,9 +146,13 @@ router.get(
];
// Package update priority distribution
const regularUpdates = Math.max(
0,
totalOutdatedPackages - securityUpdates,
);
const packageUpdateDistribution = [
{ name: "Security", count: securityUpdates },
{ name: "Regular", count: totalOutdatedPackages - securityUpdates },
{ name: "Regular", count: regularUpdates },
];
res.json({
@@ -185,6 +190,7 @@ router.get("/hosts", authenticateToken, requireViewHosts, async (_req, res) => {
// Show all hosts regardless of status
select: {
id: true,
machine_id: true,
friendly_name: true,
hostname: true,
ip: true,
@@ -195,11 +201,16 @@ router.get("/hosts", authenticateToken, requireViewHosts, async (_req, res) => {
agent_version: true,
auto_update: true,
notes: true,
-host_groups: {
-select: {
-id: true,
-name: true,
-color: true,
+api_id: true,
+host_group_memberships: {
+include: {
+host_groups: {
+select: {
+id: true,
+name: true,
+color: true,
+},
+},
+},
+},
_count: {
@@ -342,32 +353,45 @@ router.get(
try {
const { hostId } = req.params;
-const host = await prisma.hosts.findUnique({
-where: { id: hostId },
-include: {
-host_groups: {
-select: {
-id: true,
-name: true,
-color: true,
+const limit = parseInt(req.query.limit, 10) || 10;
+const offset = parseInt(req.query.offset, 10) || 0;
+const [host, totalHistoryCount] = await Promise.all([
+prisma.hosts.findUnique({
+where: { id: hostId },
+include: {
+host_group_memberships: {
+include: {
+host_groups: {
+select: {
+id: true,
+name: true,
+color: true,
+},
+},
+},
+},
+host_packages: {
+include: {
+packages: true,
+},
+orderBy: {
+needs_update: "desc",
+},
+},
+update_history: {
+orderBy: {
+timestamp: "desc",
+},
+take: limit,
+skip: offset,
+},
+},
+}),
-host_packages: {
-include: {
-packages: true,
-},
-orderBy: {
-needs_update: "desc",
-},
-},
-update_history: {
-orderBy: {
-timestamp: "desc",
-},
-take: 10,
-},
-},
-});
+}),
+prisma.update_history.count({
+where: { host_id: hostId },
+}),
+]);
if (!host) {
return res.status(404).json({ error: "Host not found" });
@@ -383,6 +407,12 @@ router.get(
(hp) => hp.needs_update && hp.is_security_update,
).length,
},
pagination: {
total: totalHistoryCount,
limit,
offset,
hasMore: offset + limit < totalHistoryCount,
},
};
res.json(hostWithStats);
@@ -393,6 +423,51 @@ router.get(
},
);
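// A sketch of how a client could walk a host's update_history using the limit/offset
// parameters and the pagination block added above. The /api/v1/dashboard mount path,
// the bearer-token header, and Node 18+ global fetch are assumptions.
async function fetch_all_update_history(base_url, token, host_id, page_size = 10) {
	const history = [];
	let offset = 0;
	let has_more = true;
	while (has_more) {
		const response = await fetch(
			`${base_url}/api/v1/dashboard/hosts/${host_id}?limit=${page_size}&offset=${offset}`,
			{ headers: { Authorization: `Bearer ${token}` } },
		);
		if (!response.ok) throw new Error(`Request failed with status ${response.status}`);
		const data = await response.json();
		history.push(...data.update_history);
		has_more = data.pagination.hasMore;
		offset += page_size;
	}
	return history;
}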
// Get agent queue status for a specific host
router.get(
"/hosts/:hostId/queue",
authenticateToken,
requireViewHosts,
async (req, res) => {
try {
const { hostId } = req.params;
const { limit = 20 } = req.query;
// Get the host to find its API ID
const host = await prisma.hosts.findUnique({
where: { id: hostId },
select: { api_id: true, friendly_name: true },
});
if (!host) {
return res.status(404).json({ error: "Host not found" });
}
// Get queue jobs for this host
const queueData = await queueManager.getHostJobs(
host.api_id,
parseInt(limit, 10),
);
res.json({
success: true,
data: {
hostId,
apiId: host.api_id,
friendlyName: host.friendly_name,
...queueData,
},
});
} catch (error) {
console.error("Error fetching host queue status:", error);
res.status(500).json({
success: false,
error: "Failed to fetch host queue status",
});
}
},
);
// Get recent users ordered by last_login desc
router.get(
"/recent-users",
@@ -455,4 +530,766 @@ router.get(
},
);
// Get package trends over time
router.get(
"/package-trends",
authenticateToken,
requireViewHosts,
async (req, res) => {
try {
const { days = 30, hostId } = req.query;
const daysInt = parseInt(days, 10);
// Calculate date range
const endDate = new Date();
const startDate = new Date();
startDate.setDate(endDate.getDate() - daysInt);
// Build where clause
const whereClause = {
timestamp: {
gte: startDate,
lte: endDate,
},
};
// Add host filter if specified
if (hostId && hostId !== "all" && hostId !== "undefined") {
whereClause.host_id = hostId;
}
// Get all update history records in the date range
const trendsData = await prisma.update_history.findMany({
where: whereClause,
select: {
timestamp: true,
packages_count: true,
security_count: true,
total_packages: true,
host_id: true,
status: true,
},
orderBy: {
timestamp: "asc",
},
});
// Enhanced data validation and processing
const processedData = trendsData
.filter((record) => {
// Enhanced validation
return (
record.total_packages !== null &&
record.total_packages >= 0 &&
record.packages_count >= 0 &&
record.security_count >= 0 &&
record.security_count <= record.packages_count && // Security can't exceed outdated
record.status === "success"
); // Only include successful reports
})
.map((record) => {
const date = new Date(record.timestamp);
let timeKey;
if (daysInt <= 1) {
// For hourly view, group by hour only (not minutes)
timeKey = date.toISOString().substring(0, 13); // YYYY-MM-DDTHH
} else {
// For daily view, group by day
timeKey = date.toISOString().split("T")[0]; // YYYY-MM-DD
}
return {
timeKey,
total_packages: record.total_packages,
packages_count: record.packages_count || 0,
security_count: record.security_count || 0,
host_id: record.host_id,
timestamp: record.timestamp,
};
});
// Determine if we need aggregation based on host filter
const needsAggregation =
!hostId || hostId === "all" || hostId === "undefined";
let aggregatedArray;
if (needsAggregation) {
// For "All Hosts" mode, we need to calculate the actual total packages differently
// Instead of aggregating historical data (which is per-host), we'll use the current total
// and show that as a flat line, since total packages don't change much over time
// Get the current total packages count (unique packages across all hosts)
const currentTotalPackages = await prisma.packages.count({
where: {
host_packages: {
some: {}, // At least one host has this package
},
},
});
// Aggregate data by timeKey when looking at "All Hosts" or no specific host
const aggregatedData = processedData.reduce((acc, item) => {
if (!acc[item.timeKey]) {
acc[item.timeKey] = {
timeKey: item.timeKey,
total_packages: currentTotalPackages, // Use current total packages
packages_count: 0,
security_count: 0,
record_count: 0,
host_ids: new Set(),
min_timestamp: item.timestamp,
max_timestamp: item.timestamp,
};
}
// For outdated and security packages: SUM (these represent counts across hosts)
acc[item.timeKey].packages_count += item.packages_count;
acc[item.timeKey].security_count += item.security_count;
acc[item.timeKey].record_count += 1;
acc[item.timeKey].host_ids.add(item.host_id);
// Track timestamp range
if (item.timestamp < acc[item.timeKey].min_timestamp) {
acc[item.timeKey].min_timestamp = item.timestamp;
}
if (item.timestamp > acc[item.timeKey].max_timestamp) {
acc[item.timeKey].max_timestamp = item.timestamp;
}
return acc;
}, {});
// Convert to array and add metadata
aggregatedArray = Object.values(aggregatedData)
.map((item) => ({
...item,
host_count: item.host_ids.size,
host_ids: Array.from(item.host_ids),
}))
.sort((a, b) => a.timeKey.localeCompare(b.timeKey));
} else {
// For specific host, show individual data points without aggregation
// But still group by timeKey to handle multiple reports from same host in same time period
const hostAggregatedData = processedData.reduce((acc, item) => {
if (!acc[item.timeKey]) {
acc[item.timeKey] = {
timeKey: item.timeKey,
total_packages: 0,
packages_count: 0,
security_count: 0,
record_count: 0,
host_ids: new Set([item.host_id]),
min_timestamp: item.timestamp,
max_timestamp: item.timestamp,
};
}
// For same host, take the latest values (not sum)
// This handles cases where a host reports multiple times in the same time period
if (item.timestamp > acc[item.timeKey].max_timestamp) {
acc[item.timeKey].total_packages = item.total_packages;
acc[item.timeKey].packages_count = item.packages_count;
acc[item.timeKey].security_count = item.security_count;
acc[item.timeKey].max_timestamp = item.timestamp;
}
acc[item.timeKey].record_count += 1;
return acc;
}, {});
// Convert to array
aggregatedArray = Object.values(hostAggregatedData)
.map((item) => ({
...item,
host_count: item.host_ids.size,
host_ids: Array.from(item.host_ids),
}))
.sort((a, b) => a.timeKey.localeCompare(b.timeKey));
}
// Handle sparse data by filling missing time periods
const fillMissingPeriods = (data, daysInt) => {
const filledData = [];
const startDate = new Date();
startDate.setDate(startDate.getDate() - daysInt);
const dataMap = new Map(data.map((item) => [item.timeKey, item]));
const endDate = new Date();
const currentDate = new Date(startDate);
// Find the last known values for interpolation
let lastKnownValues = null;
if (data.length > 0) {
lastKnownValues = {
total_packages: data[0].total_packages,
packages_count: data[0].packages_count,
security_count: data[0].security_count,
};
}
while (currentDate <= endDate) {
let timeKey;
if (daysInt <= 1) {
timeKey = currentDate.toISOString().substring(0, 13); // Hourly
currentDate.setHours(currentDate.getHours() + 1);
} else {
timeKey = currentDate.toISOString().split("T")[0]; // Daily
currentDate.setDate(currentDate.getDate() + 1);
}
if (dataMap.has(timeKey)) {
const item = dataMap.get(timeKey);
filledData.push(item);
// Update last known values
lastKnownValues = {
total_packages: item.total_packages,
packages_count: item.packages_count,
security_count: item.security_count,
};
} else {
// For missing periods, use the last known values (interpolation)
// This creates a continuous line instead of gaps
filledData.push({
timeKey,
total_packages: lastKnownValues?.total_packages || 0,
packages_count: lastKnownValues?.packages_count || 0,
security_count: lastKnownValues?.security_count || 0,
record_count: 0,
host_count: 0,
host_ids: [],
min_timestamp: null,
max_timestamp: null,
isInterpolated: true, // Mark as interpolated for debugging
});
}
}
return filledData;
};
const finalProcessedData = fillMissingPeriods(aggregatedArray, daysInt);
// Get hosts list for dropdown
const hostsList = await prisma.hosts.findMany({
select: {
id: true,
friendly_name: true,
hostname: true,
last_update: true,
status: true,
},
orderBy: {
friendly_name: "asc",
},
});
// Get current package state for offline fallback
let currentPackageState = null;
if (hostId && hostId !== "all" && hostId !== "undefined") {
// Get current package counts for specific host
const currentState = await prisma.host_packages.aggregate({
where: {
host_id: hostId,
},
_count: {
id: true,
},
});
// Get counts for boolean fields separately
const outdatedCount = await prisma.host_packages.count({
where: {
host_id: hostId,
needs_update: true,
},
});
const securityCount = await prisma.host_packages.count({
where: {
host_id: hostId,
is_security_update: true,
},
});
currentPackageState = {
total_packages: currentState._count.id,
packages_count: outdatedCount,
security_count: securityCount,
};
} else {
// Get current package counts for all hosts
// Total packages = count of unique packages installed on at least one host
const totalPackagesCount = await prisma.packages.count({
where: {
host_packages: {
some: {}, // At least one host has this package
},
},
});
// Get counts for boolean fields separately
const outdatedCount = await prisma.host_packages.count({
where: {
needs_update: true,
},
});
const securityCount = await prisma.host_packages.count({
where: {
is_security_update: true,
},
});
currentPackageState = {
total_packages: totalPackagesCount,
packages_count: outdatedCount,
security_count: securityCount,
};
}
// Format data for chart
const chartData = {
labels: [],
datasets: [
{
label: needsAggregation
? "Total Packages (All Hosts)"
: "Total Packages",
data: [],
borderColor: "#3B82F6", // Blue
backgroundColor: "rgba(59, 130, 246, 0.1)",
tension: 0.4,
hidden: true, // Hidden by default
spanGaps: true, // Connect lines across missing data
pointRadius: 3,
pointHoverRadius: 5,
},
{
label: needsAggregation
? "Total Outdated Packages"
: "Outdated Packages",
data: [],
borderColor: "#F59E0B", // Orange
backgroundColor: "rgba(245, 158, 11, 0.1)",
tension: 0.4,
spanGaps: true, // Connect lines across missing data
pointRadius: 3,
pointHoverRadius: 5,
},
{
label: needsAggregation
? "Total Security Packages"
: "Security Packages",
data: [],
borderColor: "#EF4444", // Red
backgroundColor: "rgba(239, 68, 68, 0.1)",
tension: 0.4,
spanGaps: true, // Connect lines across missing data
pointRadius: 3,
pointHoverRadius: 5,
},
],
};
// Process aggregated data
finalProcessedData.forEach((item) => {
chartData.labels.push(item.timeKey);
chartData.datasets[0].data.push(item.total_packages);
chartData.datasets[1].data.push(item.packages_count);
chartData.datasets[2].data.push(item.security_count);
});
// Calculate data quality metrics
const dataQuality = {
totalRecords: trendsData.length,
validRecords: processedData.length,
aggregatedPoints: aggregatedArray.length,
filledPoints: finalProcessedData.length,
recordsWithNullTotal: trendsData.filter(
(r) => r.total_packages === null,
).length,
recordsWithInvalidData: trendsData.length - processedData.length,
successfulReports: trendsData.filter((r) => r.status === "success")
.length,
failedReports: trendsData.filter((r) => r.status === "error").length,
};
res.json({
chartData,
hosts: hostsList,
period: daysInt,
hostId: hostId || "all",
currentPackageState,
dataQuality,
aggregationInfo: {
hasData: aggregatedArray.length > 0,
hasGaps: finalProcessedData.some((item) => item.record_count === 0),
lastDataPoint:
aggregatedArray.length > 0
? aggregatedArray[aggregatedArray.length - 1]
: null,
aggregationMode: needsAggregation
? "sum_across_hosts"
: "individual_host_data",
explanation: needsAggregation
? "Data is summed across all hosts for each time period"
: "Data shows individual host values without cross-host aggregation",
},
});
} catch (error) {
console.error("Error fetching package trends:", error);
res.status(500).json({ error: "Failed to fetch package trends" });
}
},
);
// Diagnostic endpoint to investigate package spikes
router.get(
"/package-spike-analysis",
authenticateToken,
requireViewHosts,
async (req, res) => {
try {
const { date, time, hours = 2 } = req.query;
if (!date || !time) {
return res.status(400).json({
error:
"Date and time parameters are required. Format: date=2025-10-17&time=18:00",
});
}
// Parse the specific date and time
const targetDateTime = new Date(`${date}T${time}:00`);
const startTime = new Date(targetDateTime);
startTime.setHours(startTime.getHours() - parseInt(hours, 10));
const endTime = new Date(targetDateTime);
endTime.setHours(endTime.getHours() + parseInt(hours, 10));
console.log(
`Analyzing package spike around ${targetDateTime.toISOString()}`,
);
console.log(
`Time range: ${startTime.toISOString()} to ${endTime.toISOString()}`,
);
// Get all update history records in the time window
const spikeData = await prisma.update_history.findMany({
where: {
timestamp: {
gte: startTime,
lte: endTime,
},
},
select: {
id: true,
host_id: true,
timestamp: true,
packages_count: true,
security_count: true,
total_packages: true,
status: true,
error_message: true,
execution_time: true,
payload_size_kb: true,
hosts: {
select: {
friendly_name: true,
hostname: true,
os_type: true,
os_version: true,
},
},
},
orderBy: {
timestamp: "asc",
},
});
// Analyze the data
const analysis = {
timeWindow: {
start: startTime.toISOString(),
end: endTime.toISOString(),
target: targetDateTime.toISOString(),
},
totalRecords: spikeData.length,
successfulReports: spikeData.filter((r) => r.status === "success")
.length,
failedReports: spikeData.filter((r) => r.status === "error").length,
uniqueHosts: [...new Set(spikeData.map((r) => r.host_id))].length,
hosts: {},
timeline: [],
summary: {
maxPackagesCount: 0,
maxSecurityCount: 0,
maxTotalPackages: 0,
avgPackagesCount: 0,
avgSecurityCount: 0,
avgTotalPackages: 0,
},
};
// Group by host and analyze each host's behavior
spikeData.forEach((record) => {
const hostId = record.host_id;
if (!analysis.hosts[hostId]) {
analysis.hosts[hostId] = {
hostInfo: record.hosts,
records: [],
summary: {
totalReports: 0,
successfulReports: 0,
failedReports: 0,
maxPackagesCount: 0,
maxSecurityCount: 0,
maxTotalPackages: 0,
avgPackagesCount: 0,
avgSecurityCount: 0,
avgTotalPackages: 0,
},
};
}
analysis.hosts[hostId].records.push({
timestamp: record.timestamp,
packages_count: record.packages_count,
security_count: record.security_count,
total_packages: record.total_packages,
status: record.status,
error_message: record.error_message,
execution_time: record.execution_time,
payload_size_kb: record.payload_size_kb,
});
analysis.hosts[hostId].summary.totalReports++;
if (record.status === "success") {
analysis.hosts[hostId].summary.successfulReports++;
analysis.hosts[hostId].summary.maxPackagesCount = Math.max(
analysis.hosts[hostId].summary.maxPackagesCount,
record.packages_count,
);
analysis.hosts[hostId].summary.maxSecurityCount = Math.max(
analysis.hosts[hostId].summary.maxSecurityCount,
record.security_count,
);
analysis.hosts[hostId].summary.maxTotalPackages = Math.max(
analysis.hosts[hostId].summary.maxTotalPackages,
record.total_packages || 0,
);
} else {
analysis.hosts[hostId].summary.failedReports++;
}
});
// Calculate averages for each host
Object.keys(analysis.hosts).forEach((hostId) => {
const host = analysis.hosts[hostId];
const successfulRecords = host.records.filter(
(r) => r.status === "success",
);
if (successfulRecords.length > 0) {
host.summary.avgPackagesCount = Math.round(
successfulRecords.reduce((sum, r) => sum + r.packages_count, 0) /
successfulRecords.length,
);
host.summary.avgSecurityCount = Math.round(
successfulRecords.reduce((sum, r) => sum + r.security_count, 0) /
successfulRecords.length,
);
host.summary.avgTotalPackages = Math.round(
successfulRecords.reduce(
(sum, r) => sum + (r.total_packages || 0),
0,
) / successfulRecords.length,
);
}
});
// Create timeline with hourly/daily aggregation
const timelineMap = new Map();
spikeData.forEach((record) => {
const timeKey = record.timestamp.toISOString().substring(0, 13); // Hourly
if (!timelineMap.has(timeKey)) {
timelineMap.set(timeKey, {
timestamp: timeKey,
totalReports: 0,
successfulReports: 0,
failedReports: 0,
totalPackagesCount: 0,
totalSecurityCount: 0,
totalTotalPackages: 0,
uniqueHosts: new Set(),
});
}
const timelineEntry = timelineMap.get(timeKey);
timelineEntry.totalReports++;
timelineEntry.uniqueHosts.add(record.host_id);
if (record.status === "success") {
timelineEntry.successfulReports++;
timelineEntry.totalPackagesCount += record.packages_count;
timelineEntry.totalSecurityCount += record.security_count;
timelineEntry.totalTotalPackages += record.total_packages || 0;
} else {
timelineEntry.failedReports++;
}
});
// Convert timeline map to array
analysis.timeline = Array.from(timelineMap.values())
.map((entry) => ({
...entry,
uniqueHosts: entry.uniqueHosts.size,
}))
.sort((a, b) => a.timestamp.localeCompare(b.timestamp));
// Calculate overall summary
const successfulRecords = spikeData.filter((r) => r.status === "success");
if (successfulRecords.length > 0) {
analysis.summary.maxPackagesCount = Math.max(
...successfulRecords.map((r) => r.packages_count),
);
analysis.summary.maxSecurityCount = Math.max(
...successfulRecords.map((r) => r.security_count),
);
analysis.summary.maxTotalPackages = Math.max(
...successfulRecords.map((r) => r.total_packages || 0),
);
analysis.summary.avgPackagesCount = Math.round(
successfulRecords.reduce((sum, r) => sum + r.packages_count, 0) /
successfulRecords.length,
);
analysis.summary.avgSecurityCount = Math.round(
successfulRecords.reduce((sum, r) => sum + r.security_count, 0) /
successfulRecords.length,
);
analysis.summary.avgTotalPackages = Math.round(
successfulRecords.reduce(
(sum, r) => sum + (r.total_packages || 0),
0,
) / successfulRecords.length,
);
}
// Identify potential causes of the spike
const potentialCauses = [];
// Check for hosts with unusually high package counts
Object.keys(analysis.hosts).forEach((hostId) => {
const host = analysis.hosts[hostId];
if (
host.summary.maxPackagesCount >
analysis.summary.avgPackagesCount * 2
) {
potentialCauses.push({
type: "high_package_count",
hostId,
hostName: host.hostInfo.friendly_name || host.hostInfo.hostname,
value: host.summary.maxPackagesCount,
avg: analysis.summary.avgPackagesCount,
});
}
});
// Check for multiple hosts reporting at the same time (this explains the 500 vs 59 discrepancy)
const concurrentReports = analysis.timeline.filter(
(entry) => entry.uniqueHosts > 1,
);
if (concurrentReports.length > 0) {
potentialCauses.push({
type: "concurrent_reports",
description:
"Multiple hosts reported simultaneously - this explains why chart shows higher numbers than individual host reports",
count: concurrentReports.length,
details: concurrentReports.map((entry) => ({
timestamp: entry.timestamp,
totalPackagesCount: entry.totalPackagesCount,
uniqueHosts: entry.uniqueHosts,
avgPerHost: Math.round(
entry.totalPackagesCount / entry.uniqueHosts,
),
})),
explanation:
"The chart sums package counts across all hosts. If multiple hosts report at the same time, the chart shows the total sum, not individual host counts.",
});
}
// Check for failed reports that might indicate system issues
if (analysis.failedReports > 0) {
potentialCauses.push({
type: "failed_reports",
count: analysis.failedReports,
percentage: Math.round(
(analysis.failedReports / analysis.totalRecords) * 100,
),
});
}
// Add aggregation explanation
const aggregationExplanation = {
type: "aggregation_explanation",
description: "Chart Aggregation Logic",
details: {
howItWorks:
"The package trends chart sums package counts across all hosts for each time period",
individualHosts:
"Each host reports its own package count (e.g., 59 packages)",
chartDisplay:
"Chart shows the sum of all hosts' package counts (e.g., 59 + other hosts = 500)",
timeGrouping:
"Multiple hosts reporting in the same hour/day are aggregated together",
},
example: {
host1: "Host A reports 59 outdated packages",
host2: "Host B reports 120 outdated packages",
host3: "Host C reports 321 outdated packages",
chartShows: "Chart displays 500 total packages (59+120+321)",
},
};
potentialCauses.push(aggregationExplanation);
// Add specific host breakdown if a host ID is provided
let specificHostAnalysis = null;
if (req.query.hostId) {
const hostId = req.query.hostId;
const hostData = analysis.hosts[hostId];
if (hostData) {
specificHostAnalysis = {
hostId,
hostInfo: hostData.hostInfo,
summary: hostData.summary,
records: hostData.records,
explanation: `This host reported ${hostData.summary.maxPackagesCount} outdated packages, but the chart shows ${analysis.summary.maxPackagesCount} because it sums across all hosts that reported at the same time.`,
};
}
}
res.json({
analysis,
potentialCauses,
specificHostAnalysis,
recommendations: [
"Check if any hosts had major package updates around this time",
"Verify if any new hosts were added to the system",
"Check for system maintenance or updates that might have triggered package checks",
"Review any automation or scheduled tasks that run around 6pm",
"Check if any repositories were updated or new packages were released",
"Remember: Chart shows SUM of all hosts' package counts, not individual host counts",
],
});
} catch (error) {
console.error("Error analyzing package spike:", error);
res.status(500).json({ error: "Failed to analyze package spike" });
}
},
);
module.exports = router;

View File

@@ -0,0 +1,779 @@
const express = require("express");
const { authenticateToken } = require("../middleware/auth");
const { PrismaClient } = require("@prisma/client");
const { v4: uuidv4 } = require("uuid");
const prisma = new PrismaClient();
const router = express.Router();
// Helper function to convert BigInt fields to strings for JSON serialization
const convertBigIntToString = (obj) => {
if (obj === null || obj === undefined) return obj;
if (typeof obj === "bigint") {
return obj.toString();
}
if (Array.isArray(obj)) {
return obj.map(convertBigIntToString);
}
if (typeof obj === "object") {
const converted = {};
for (const key in obj) {
converted[key] = convertBigIntToString(obj[key]);
}
return converted;
}
return obj;
};
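// Illustrative usage (not part of the route logic): Prisma returns BigInt for
// columns such as docker_images.size_bytes, which JSON.stringify cannot serialize,
// so every response below is passed through this helper first, e.g.
//   convertBigIntToString({ size_bytes: 191000000n, nested: [{ bytes: 42n }] })
//   // => { size_bytes: "191000000", nested: [{ bytes: "42" }] }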
// GET /api/v1/docker/dashboard - Get Docker dashboard statistics
router.get("/dashboard", authenticateToken, async (_req, res) => {
try {
// Get total hosts with Docker containers
const hostsWithDocker = await prisma.docker_containers.groupBy({
by: ["host_id"],
_count: true,
});
// Get total containers
const totalContainers = await prisma.docker_containers.count();
// Get running containers
const runningContainers = await prisma.docker_containers.count({
where: { status: "running" },
});
// Get total images
const totalImages = await prisma.docker_images.count();
// Get available updates
const availableUpdates = await prisma.docker_image_updates.count();
// Get containers by status
const containersByStatus = await prisma.docker_containers.groupBy({
by: ["status"],
_count: true,
});
// Get images by source
const imagesBySource = await prisma.docker_images.groupBy({
by: ["source"],
_count: true,
});
res.json({
stats: {
totalHostsWithDocker: hostsWithDocker.length,
totalContainers,
runningContainers,
totalImages,
availableUpdates,
},
containersByStatus,
imagesBySource,
});
} catch (error) {
console.error("Error fetching Docker dashboard:", error);
res.status(500).json({ error: "Failed to fetch Docker dashboard" });
}
});
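// Example response shape (keys taken from the handler above; values illustrative):
//   {
//     "stats": { "totalHostsWithDocker": 3, "totalContainers": 12, "runningContainers": 9,
//                "totalImages": 8, "availableUpdates": 2 },
//     "containersByStatus": [{ "status": "running", "_count": 9 }],
//     "imagesBySource": [{ "source": "docker-hub", "_count": 8 }]
//   }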
// GET /api/v1/docker/containers - Get all containers with filters
router.get("/containers", authenticateToken, async (req, res) => {
try {
const { status, hostId, imageId, search, page = 1, limit = 50 } = req.query;
const where = {};
if (status) where.status = status;
if (hostId) where.host_id = hostId;
if (imageId) where.image_id = imageId;
if (search) {
where.OR = [
{ name: { contains: search, mode: "insensitive" } },
{ image_name: { contains: search, mode: "insensitive" } },
];
}
const skip = (parseInt(page, 10) - 1) * parseInt(limit, 10);
const take = parseInt(limit, 10);
const [containers, total] = await Promise.all([
prisma.docker_containers.findMany({
where,
include: {
docker_images: true,
},
orderBy: { updated_at: "desc" },
skip,
take,
}),
prisma.docker_containers.count({ where }),
]);
// Get host information for each container
const hostIds = [...new Set(containers.map((c) => c.host_id))];
const hosts = await prisma.hosts.findMany({
where: { id: { in: hostIds } },
select: { id: true, friendly_name: true, hostname: true, ip: true },
});
const hostsMap = hosts.reduce((acc, host) => {
acc[host.id] = host;
return acc;
}, {});
const containersWithHosts = containers.map((container) => ({
...container,
host: hostsMap[container.host_id],
}));
res.json(
convertBigIntToString({
containers: containersWithHosts,
pagination: {
page: parseInt(page, 10),
limit: parseInt(limit, 10),
total,
totalPages: Math.ceil(total / parseInt(limit, 10)),
},
}),
);
} catch (error) {
console.error("Error fetching containers:", error);
res.status(500).json({ error: "Failed to fetch containers" });
}
});
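// Illustrative request (path taken from the route comment above; values made up):
//   GET /api/v1/docker/containers?status=running&search=nginx&page=1&limit=25
// The response contains the matching containers (each with its docker_images record
// and host attached) plus a pagination object, all run through convertBigIntToString.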
// GET /api/v1/docker/containers/:id - Get container detail
router.get("/containers/:id", authenticateToken, async (req, res) => {
try {
const { id } = req.params;
const container = await prisma.docker_containers.findUnique({
where: { id },
include: {
docker_images: {
include: {
docker_image_updates: true,
},
},
},
});
if (!container) {
return res.status(404).json({ error: "Container not found" });
}
// Get host information
const host = await prisma.hosts.findUnique({
where: { id: container.host_id },
select: {
id: true,
friendly_name: true,
hostname: true,
ip: true,
os_type: true,
os_version: true,
},
});
// Get other containers using the same image
const similarContainers = await prisma.docker_containers.findMany({
where: {
image_id: container.image_id,
id: { not: id },
},
take: 10,
});
res.json(
convertBigIntToString({
container: {
...container,
host,
},
similarContainers,
}),
);
} catch (error) {
console.error("Error fetching container detail:", error);
res.status(500).json({ error: "Failed to fetch container detail" });
}
});
// GET /api/v1/docker/images - Get all images with filters
router.get("/images", authenticateToken, async (req, res) => {
try {
const { source, search, page = 1, limit = 50 } = req.query;
const where = {};
if (source) where.source = source;
if (search) {
where.OR = [
{ repository: { contains: search, mode: "insensitive" } },
{ tag: { contains: search, mode: "insensitive" } },
];
}
const skip = (parseInt(page, 10) - 1) * parseInt(limit, 10);
const take = parseInt(limit, 10);
const [images, total] = await Promise.all([
prisma.docker_images.findMany({
where,
include: {
_count: {
select: {
docker_containers: true,
docker_image_updates: true,
},
},
docker_image_updates: {
take: 1,
orderBy: { created_at: "desc" },
},
},
orderBy: { updated_at: "desc" },
skip,
take,
}),
prisma.docker_images.count({ where }),
]);
// Get unique hosts using each image
const imagesWithHosts = await Promise.all(
images.map(async (image) => {
const containers = await prisma.docker_containers.findMany({
where: { image_id: image.id },
select: { host_id: true },
distinct: ["host_id"],
});
return {
...image,
hostsCount: containers.length,
hasUpdates: image._count.docker_image_updates > 0,
};
}),
);
res.json(
convertBigIntToString({
images: imagesWithHosts,
pagination: {
page: parseInt(page, 10),
limit: parseInt(limit, 10),
total,
totalPages: Math.ceil(total / parseInt(limit, 10)),
},
}),
);
} catch (error) {
console.error("Error fetching images:", error);
res.status(500).json({ error: "Failed to fetch images" });
}
});
// GET /api/v1/docker/images/:id - Get image detail
router.get("/images/:id", authenticateToken, async (req, res) => {
try {
const { id } = req.params;
const image = await prisma.docker_images.findUnique({
where: { id },
include: {
docker_containers: {
take: 100,
},
docker_image_updates: {
orderBy: { created_at: "desc" },
},
},
});
if (!image) {
return res.status(404).json({ error: "Image not found" });
}
// Get unique hosts using this image
const hostIds = [...new Set(image.docker_containers.map((c) => c.host_id))];
const hosts = await prisma.hosts.findMany({
where: { id: { in: hostIds } },
select: { id: true, friendly_name: true, hostname: true, ip: true },
});
res.json(
convertBigIntToString({
image,
hosts,
totalContainers: image.docker_containers.length,
totalHosts: hosts.length,
}),
);
} catch (error) {
console.error("Error fetching image detail:", error);
res.status(500).json({ error: "Failed to fetch image detail" });
}
});
// GET /api/v1/docker/hosts - Get all hosts with Docker
router.get("/hosts", authenticateToken, async (req, res) => {
try {
const { page = 1, limit = 50 } = req.query;
// Get hosts that have Docker containers
const hostsWithContainers = await prisma.docker_containers.groupBy({
by: ["host_id"],
_count: true,
});
const hostIds = hostsWithContainers.map((h) => h.host_id);
const skip = (parseInt(page, 10) - 1) * parseInt(limit, 10);
const take = parseInt(limit, 10);
const hosts = await prisma.hosts.findMany({
where: { id: { in: hostIds } },
skip,
take,
orderBy: { friendly_name: "asc" },
});
// Get container counts and statuses for each host
const hostsWithStats = await Promise.all(
hosts.map(async (host) => {
const [totalContainers, runningContainers, totalImages] =
await Promise.all([
prisma.docker_containers.count({
where: { host_id: host.id },
}),
prisma.docker_containers.count({
where: { host_id: host.id, status: "running" },
}),
prisma.docker_containers.findMany({
where: { host_id: host.id },
select: { image_id: true },
distinct: ["image_id"],
}),
]);
return {
...host,
dockerStats: {
totalContainers,
runningContainers,
totalImages: totalImages.length,
},
};
}),
);
res.json(
convertBigIntToString({
hosts: hostsWithStats,
pagination: {
page: parseInt(page, 10),
limit: parseInt(limit, 10),
total: hostIds.length,
totalPages: Math.ceil(hostIds.length / parseInt(limit, 10)),
},
}),
);
} catch (error) {
console.error("Error fetching Docker hosts:", error);
res.status(500).json({ error: "Failed to fetch Docker hosts" });
}
});
// GET /api/v1/docker/hosts/:id - Get host Docker detail
router.get("/hosts/:id", authenticateToken, async (req, res) => {
try {
const { id } = req.params;
const host = await prisma.hosts.findUnique({
where: { id },
});
if (!host) {
return res.status(404).json({ error: "Host not found" });
}
// Get containers on this host
const containers = await prisma.docker_containers.findMany({
where: { host_id: id },
include: {
docker_images: {
include: {
docker_image_updates: true,
},
},
},
orderBy: { name: "asc" },
});
// Get unique images on this host
const imageIds = [...new Set(containers.map((c) => c.image_id))].filter(
Boolean,
);
const images = await prisma.docker_images.findMany({
where: { id: { in: imageIds } },
});
// Get container statistics
const runningContainers = containers.filter(
(c) => c.status === "running",
).length;
const stoppedContainers = containers.filter(
(c) => c.status === "exited" || c.status === "stopped",
).length;
res.json(
convertBigIntToString({
host,
containers,
images,
stats: {
totalContainers: containers.length,
runningContainers,
stoppedContainers,
totalImages: images.length,
},
}),
);
} catch (error) {
console.error("Error fetching host Docker detail:", error);
res.status(500).json({ error: "Failed to fetch host Docker detail" });
}
});
// GET /api/v1/docker/updates - Get available updates
router.get("/updates", authenticateToken, async (req, res) => {
try {
const { page = 1, limit = 50, securityOnly = false } = req.query;
const where = {};
if (securityOnly === "true") {
where.is_security_update = true;
}
const skip = (parseInt(page, 10) - 1) * parseInt(limit, 10);
const take = parseInt(limit, 10);
const [updates, total] = await Promise.all([
prisma.docker_image_updates.findMany({
where,
include: {
docker_images: {
include: {
docker_containers: {
select: {
id: true,
host_id: true,
name: true,
},
},
},
},
},
orderBy: [{ is_security_update: "desc" }, { created_at: "desc" }],
skip,
take,
}),
prisma.docker_image_updates.count({ where }),
]);
// Get affected hosts for each update
const updatesWithHosts = await Promise.all(
updates.map(async (update) => {
const hostIds = [
...new Set(
update.docker_images.docker_containers.map((c) => c.host_id),
),
];
const hosts = await prisma.hosts.findMany({
where: { id: { in: hostIds } },
select: { id: true, friendly_name: true, hostname: true },
});
return {
...update,
affectedHosts: hosts,
affectedContainersCount:
update.docker_images.docker_containers.length,
};
}),
);
res.json(
convertBigIntToString({
updates: updatesWithHosts,
pagination: {
page: parseInt(page, 10),
limit: parseInt(limit, 10),
total,
totalPages: Math.ceil(total / parseInt(limit, 10)),
},
}),
);
} catch (error) {
console.error("Error fetching Docker updates:", error);
res.status(500).json({ error: "Failed to fetch Docker updates" });
}
});
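// Illustrative request (path taken from the route comment above):
//   GET /api/v1/docker/updates?securityOnly=true&page=1&limit=50
// Each entry in the response carries the parent image, the containers using it,
// the affectedHosts list, and affectedContainersCount.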
// POST /api/v1/docker/collect - Collect Docker data from agent
router.post("/collect", async (req, res) => {
try {
const { apiId, apiKey, containers, images, updates } = req.body;
// Validate API credentials
const host = await prisma.hosts.findFirst({
where: { api_id: apiId, api_key: apiKey },
});
if (!host) {
return res.status(401).json({ error: "Invalid API credentials" });
}
const now = new Date();
// Helper function to validate and parse dates
const parseDate = (dateString) => {
if (!dateString) return now;
const date = new Date(dateString);
return Number.isNaN(date.getTime()) ? now : date;
};
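    // Illustrative behaviour: malformed or missing timestamps from the agent fall back
    // to the collection time instead of failing the whole report, e.g.
    //   parseDate("2025-10-15T20:56:58Z") // => Date for that instant
    //   parseDate("not-a-date")           // => now
    //   parseDate(undefined)              // => now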
// Process containers
if (containers && Array.isArray(containers)) {
for (const containerData of containers) {
const containerId = uuidv4();
// Find or create image
let imageId = null;
if (containerData.image_repository && containerData.image_tag) {
const image = await prisma.docker_images.upsert({
where: {
repository_tag_image_id: {
repository: containerData.image_repository,
tag: containerData.image_tag,
image_id: containerData.image_id || "unknown",
},
},
update: {
last_checked: now,
updated_at: now,
},
create: {
id: uuidv4(),
repository: containerData.image_repository,
tag: containerData.image_tag,
image_id: containerData.image_id || "unknown",
source: containerData.image_source || "docker-hub",
created_at: parseDate(containerData.created_at),
updated_at: now,
},
});
imageId = image.id;
}
// Upsert container
await prisma.docker_containers.upsert({
where: {
host_id_container_id: {
host_id: host.id,
container_id: containerData.container_id,
},
},
update: {
name: containerData.name,
image_id: imageId,
image_name: containerData.image_name,
image_tag: containerData.image_tag || "latest",
status: containerData.status,
state: containerData.state,
ports: containerData.ports || null,
started_at: containerData.started_at
? parseDate(containerData.started_at)
: null,
updated_at: now,
last_checked: now,
},
create: {
id: containerId,
host_id: host.id,
container_id: containerData.container_id,
name: containerData.name,
image_id: imageId,
image_name: containerData.image_name,
image_tag: containerData.image_tag || "latest",
status: containerData.status,
state: containerData.state,
ports: containerData.ports || null,
created_at: parseDate(containerData.created_at),
started_at: containerData.started_at
? parseDate(containerData.started_at)
: null,
updated_at: now,
},
});
}
}
// Process standalone images
if (images && Array.isArray(images)) {
for (const imageData of images) {
await prisma.docker_images.upsert({
where: {
repository_tag_image_id: {
repository: imageData.repository,
tag: imageData.tag,
image_id: imageData.image_id,
},
},
update: {
size_bytes: imageData.size_bytes
? BigInt(imageData.size_bytes)
: null,
last_checked: now,
updated_at: now,
},
create: {
id: uuidv4(),
repository: imageData.repository,
tag: imageData.tag,
image_id: imageData.image_id,
digest: imageData.digest,
size_bytes: imageData.size_bytes
? BigInt(imageData.size_bytes)
: null,
source: imageData.source || "docker-hub",
created_at: parseDate(imageData.created_at),
updated_at: now,
},
});
}
}
// Process updates
// First, get all images for this host to clean up old updates
const hostImageIds = await prisma.docker_containers
.findMany({
where: { host_id: host.id },
select: { image_id: true },
distinct: ["image_id"],
})
.then((results) => results.map((r) => r.image_id).filter(Boolean));
// Delete old updates for images on this host that are no longer reported
if (hostImageIds.length > 0) {
const reportedImageIds = [];
// Process new updates
if (updates && Array.isArray(updates)) {
for (const updateData of updates) {
// Find the image by repository, tag, and image_id
const image = await prisma.docker_images.findFirst({
where: {
repository: updateData.repository,
tag: updateData.current_tag,
image_id: updateData.image_id,
},
});
if (image) {
reportedImageIds.push(image.id);
// Store digest info in changelog_url field as JSON for now
const digestInfo = JSON.stringify({
method: "digest_comparison",
current_digest: updateData.current_digest,
available_digest: updateData.available_digest,
});
// Upsert the update record
await prisma.docker_image_updates.upsert({
where: {
image_id_available_tag: {
image_id: image.id,
available_tag: updateData.available_tag,
},
},
update: {
updated_at: now,
changelog_url: digestInfo,
severity: "digest_changed",
},
create: {
id: uuidv4(),
image_id: image.id,
current_tag: updateData.current_tag,
available_tag: updateData.available_tag,
severity: "digest_changed",
changelog_url: digestInfo,
updated_at: now,
},
});
}
}
}
// Remove stale updates for images on this host that are no longer in the updates list
const imageIdsToCleanup = hostImageIds.filter(
(id) => !reportedImageIds.includes(id),
);
if (imageIdsToCleanup.length > 0) {
await prisma.docker_image_updates.deleteMany({
where: {
image_id: { in: imageIdsToCleanup },
},
});
}
}
res.json({ success: true, message: "Docker data collected successfully" });
} catch (error) {
console.error("Error collecting Docker data:", error);
console.error("Error stack:", error.stack);
console.error("Request body:", JSON.stringify(req.body, null, 2));
res.status(500).json({
error: "Failed to collect Docker data",
message: error.message,
details: process.env.NODE_ENV === "development" ? error.stack : undefined,
});
}
});
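// Illustrative agent payload for the /collect endpoint (field names taken from the
// handler above; all values are made up):
//   {
//     "apiId": "...", "apiKey": "...",
//     "containers": [{ "container_id": "abc123", "name": "web", "image_name": "nginx:1.27",
//                      "image_repository": "nginx", "image_tag": "1.27", "image_id": "sha256:...",
//                      "status": "running", "state": "running",
//                      "created_at": "2025-10-15T20:00:00Z" }],
//     "images": [{ "repository": "nginx", "tag": "1.27", "image_id": "sha256:...",
//                  "digest": "sha256:...", "size_bytes": 191000000 }],
//     "updates": [{ "repository": "nginx", "current_tag": "1.27", "image_id": "sha256:...",
//                   "available_tag": "1.27", "current_digest": "sha256:...",
//                   "available_digest": "sha256:..." }]
//   }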
// GET /api/v1/docker/agent - Serve the Docker agent installation script
router.get("/agent", async (_req, res) => {
try {
const fs = require("node:fs");
const path = require("node:path");
const agentPath = path.join(
__dirname,
"../../..",
"agents",
"patchmon-docker-agent.sh",
);
// Check if file exists
if (!fs.existsSync(agentPath)) {
return res.status(404).json({ error: "Docker agent script not found" });
}
// Read and serve the file
const agentScript = fs.readFileSync(agentPath, "utf8");
res.setHeader("Content-Type", "text/x-shellscript");
res.setHeader(
"Content-Disposition",
'inline; filename="patchmon-docker-agent.sh"',
);
res.send(agentScript);
} catch (error) {
console.error("Error serving Docker agent:", error);
res.status(500).json({ error: "Failed to serve Docker agent script" });
}
});
module.exports = router;

View File

@@ -0,0 +1,236 @@
const express = require("express");
const { createPrismaClient } = require("../config/database");
const bcrypt = require("bcryptjs");
const router = express.Router();
const prisma = createPrismaClient();
// Middleware to authenticate API key
const authenticateApiKey = async (req, res, next) => {
try {
const authHeader = req.headers.authorization;
if (!authHeader || !authHeader.startsWith("Basic ")) {
return res
.status(401)
.json({ error: "Missing or invalid authorization header" });
}
// Decode base64 credentials
const base64Credentials = authHeader.split(" ")[1];
const credentials = Buffer.from(base64Credentials, "base64").toString(
"ascii",
);
const [apiKey, apiSecret] = credentials.split(":");
if (!apiKey || !apiSecret) {
return res.status(401).json({ error: "Invalid credentials format" });
}
// Find the token in database
const token = await prisma.auto_enrollment_tokens.findUnique({
where: { token_key: apiKey },
include: {
users: {
select: {
id: true,
username: true,
role: true,
},
},
},
});
if (!token) {
console.log(`API key not found: ${apiKey}`);
return res.status(401).json({ error: "Invalid API key" });
}
// Check if token is active
if (!token.is_active) {
return res.status(401).json({ error: "API key is disabled" });
}
// Check if token has expired
if (token.expires_at && new Date(token.expires_at) < new Date()) {
return res.status(401).json({ error: "API key has expired" });
}
// Check if token is for gethomepage integration
if (token.metadata?.integration_type !== "gethomepage") {
return res.status(401).json({ error: "Invalid API key type" });
}
// Verify the secret
const isValidSecret = await bcrypt.compare(apiSecret, token.token_secret);
if (!isValidSecret) {
return res.status(401).json({ error: "Invalid API secret" });
}
// Check IP restrictions if any
if (token.allowed_ip_ranges && token.allowed_ip_ranges.length > 0) {
const clientIp = req.ip || req.connection.remoteAddress;
const forwardedFor = req.headers["x-forwarded-for"];
const realIp = req.headers["x-real-ip"];
// Get the actual client IP (considering proxies)
const actualClientIp = forwardedFor
? forwardedFor.split(",")[0].trim()
: realIp || clientIp;
const isAllowedIp = token.allowed_ip_ranges.some((range) => {
// Simple IP range check (can be enhanced for CIDR support)
return actualClientIp.startsWith(range) || actualClientIp === range;
});
if (!isAllowedIp) {
console.log(
`IP validation failed. Client IP: ${actualClientIp}, Allowed ranges: ${token.allowed_ip_ranges.join(", ")}`,
);
return res.status(403).json({ error: "IP address not allowed" });
}
}
// Update last used timestamp
await prisma.auto_enrollment_tokens.update({
where: { id: token.id },
data: { last_used_at: new Date() },
});
// Attach token info to request
req.apiToken = token;
next();
} catch (error) {
console.error("API key authentication error:", error);
res.status(500).json({ error: "Authentication failed" });
}
};
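// Illustrative client call (mount path and hostname assumed; the credentials are the
// token_key / token_secret pair issued for a gethomepage integration, sent as HTTP Basic auth):
//   curl -u "<token_key>:<token_secret>" https://patchmon.example.com/api/v1/gethomepage/stats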
// Get homepage widget statistics
router.get("/stats", authenticateApiKey, async (_req, res) => {
try {
// Get total hosts count
const totalHosts = await prisma.hosts.count({
where: { status: "active" },
});
// Get total outdated packages count
const totalOutdatedPackages = await prisma.host_packages.count({
where: { needs_update: true },
});
// Get total repositories count
const totalRepos = await prisma.repositories.count({
where: { is_active: true },
});
// Get hosts that need updates (have outdated packages)
const hostsNeedingUpdates = await prisma.hosts.count({
where: {
status: "active",
host_packages: {
some: {
needs_update: true,
},
},
},
});
// Get security updates count
const securityUpdates = await prisma.host_packages.count({
where: {
needs_update: true,
is_security_update: true,
},
});
// Get hosts with security updates
const hostsWithSecurityUpdates = await prisma.hosts.count({
where: {
status: "active",
host_packages: {
some: {
needs_update: true,
is_security_update: true,
},
},
},
});
// Get up-to-date hosts count
const upToDateHosts = totalHosts - hostsNeedingUpdates;
// Get recent update activity (last 24 hours)
const oneDayAgo = new Date(Date.now() - 24 * 60 * 60 * 1000);
const recentUpdates = await prisma.update_history.count({
where: {
timestamp: {
gte: oneDayAgo,
},
status: "success",
},
});
// Get OS distribution
const osDistribution = await prisma.hosts.groupBy({
by: ["os_type"],
where: { status: "active" },
_count: {
id: true,
},
orderBy: {
_count: {
id: "desc",
},
},
});
// Format OS distribution data
const osDistributionFormatted = osDistribution.map((os) => ({
name: os.os_type,
count: os._count.id,
}));
// Extract top 3 OS types for flat display in widgets
const top_os_1 = osDistributionFormatted[0] || { name: "None", count: 0 };
const top_os_2 = osDistributionFormatted[1] || { name: "None", count: 0 };
const top_os_3 = osDistributionFormatted[2] || { name: "None", count: 0 };
// Prepare response data
const stats = {
total_hosts: totalHosts,
total_outdated_packages: totalOutdatedPackages,
total_repos: totalRepos,
hosts_needing_updates: hostsNeedingUpdates,
up_to_date_hosts: upToDateHosts,
security_updates: securityUpdates,
hosts_with_security_updates: hostsWithSecurityUpdates,
recent_updates_24h: recentUpdates,
os_distribution: osDistributionFormatted,
// Flattened OS data for easy widget display
top_os_1_name: top_os_1.name,
top_os_1_count: top_os_1.count,
top_os_2_name: top_os_2.name,
top_os_2_count: top_os_2.count,
top_os_3_name: top_os_3.name,
top_os_3_count: top_os_3.count,
last_updated: new Date().toISOString(),
};
res.json(stats);
} catch (error) {
console.error("Error fetching homepage stats:", error);
res.status(500).json({ error: "Failed to fetch statistics" });
}
});
// Health check endpoint for the API
router.get("/health", authenticateApiKey, async (req, res) => {
res.json({
status: "ok",
timestamp: new Date().toISOString(),
api_key: req.apiToken.token_name,
});
});
module.exports = router;

View File

@@ -15,7 +15,7 @@ router.get("/", authenticateToken, async (_req, res) => {
include: {
_count: {
select: {
hosts: true,
host_group_memberships: true,
},
},
},
@@ -39,16 +39,20 @@ router.get("/:id", authenticateToken, async (req, res) => {
const hostGroup = await prisma.host_groups.findUnique({
where: { id },
include: {
hosts: {
select: {
id: true,
friendly_name: true,
hostname: true,
ip: true,
os_type: true,
os_version: true,
status: true,
last_update: true,
host_group_memberships: {
include: {
hosts: {
select: {
id: true,
friendly_name: true,
hostname: true,
ip: true,
os_type: true,
os_version: true,
status: true,
last_update: true,
},
},
},
},
},
@@ -195,7 +199,7 @@ router.delete(
include: {
_count: {
select: {
hosts: true,
host_group_memberships: true,
},
},
},
@@ -205,11 +209,10 @@ router.delete(
return res.status(404).json({ error: "Host group not found" });
}
// If host group has hosts, ungroup them first
if (existingGroup._count.hosts > 0) {
await prisma.hosts.updateMany({
// If host group has memberships, remove them first
if (existingGroup._count.host_group_memberships > 0) {
await prisma.host_group_memberships.deleteMany({
where: { host_group_id: id },
data: { host_group_id: null },
});
}
@@ -231,7 +234,13 @@ router.get("/:id/hosts", authenticateToken, async (req, res) => {
const { id } = req.params;
const hosts = await prisma.hosts.findMany({
where: { host_group_id: id },
where: {
host_group_memberships: {
some: {
host_group_id: id,
},
},
},
select: {
id: true,
friendly_name: true,

View File

@@ -14,7 +14,7 @@ const {
const router = express.Router();
const prisma = new PrismaClient();
// Secure endpoint to download the agent script (requires API authentication)
// Secure endpoint to download the agent binary (requires API authentication)
router.get("/agent/download", async (req, res) => {
try {
// Verify API credentials
@@ -34,46 +34,50 @@ router.get("/agent/download", async (req, res) => {
return res.status(401).json({ error: "Invalid API credentials" });
}
// Serve agent script directly from file system
// Get architecture parameter (default to amd64)
const architecture = req.query.arch || "amd64";
// Validate architecture
const validArchitectures = ["amd64", "386", "arm64"];
if (!validArchitectures.includes(architecture)) {
return res.status(400).json({
error: "Invalid architecture. Must be one of: amd64, 386, arm64",
});
}
// Serve agent binary directly from file system
const fs = require("node:fs");
const path = require("node:path");
const agentPath = path.join(__dirname, "../../../agents/patchmon-agent.sh");
const binaryName = `patchmon-agent-linux-${architecture}`;
const binaryPath = path.join(__dirname, "../../../agents", binaryName);
if (!fs.existsSync(agentPath)) {
return res.status(404).json({ error: "Agent script not found" });
if (!fs.existsSync(binaryPath)) {
return res.status(404).json({
error: `Agent binary not found for architecture: ${architecture}`,
});
}
// Read file and convert line endings
let scriptContent = fs
.readFileSync(agentPath, "utf8")
.replace(/\r\n/g, "\n")
.replace(/\r/g, "\n");
// Determine curl flags dynamically from settings for consistency
let curlFlags = "-s";
try {
const settings = await prisma.settings.findFirst();
if (settings && settings.ignore_ssl_self_signed === true) {
curlFlags = "-sk";
}
} catch (_) {}
// Inject the curl flags into the script
scriptContent = scriptContent.replace(
'CURL_FLAGS=""',
`CURL_FLAGS="${curlFlags}"`,
);
res.setHeader("Content-Type", "application/x-shellscript");
// Set appropriate headers for binary download
res.setHeader("Content-Type", "application/octet-stream");
res.setHeader(
"Content-Disposition",
'attachment; filename="patchmon-agent.sh"',
`attachment; filename="${binaryName}"`,
);
res.send(scriptContent);
// Stream the binary file
const fileStream = fs.createReadStream(binaryPath);
fileStream.pipe(res);
fileStream.on("error", (error) => {
console.error("Binary stream error:", error);
if (!res.headersSent) {
res.status(500).json({ error: "Failed to stream agent binary" });
}
});
} catch (error) {
console.error("Agent download error:", error);
res.status(500).json({ error: "Failed to download agent script" });
res.status(500).json({ error: "Failed to serve agent binary" });
}
});
@@ -158,7 +162,14 @@ router.post(
body("friendly_name")
.isLength({ min: 1 })
.withMessage("Friendly name is required"),
body("hostGroupId").optional(),
body("hostGroupIds")
.optional()
.isArray()
.withMessage("Host group IDs must be an array"),
body("hostGroupIds.*")
.optional()
.isUUID()
.withMessage("Each host group ID must be a valid UUID"),
],
async (req, res) => {
try {
@@ -167,28 +178,21 @@ router.post(
return res.status(400).json({ errors: errors.array() });
}
const { friendly_name, hostGroupId } = req.body;
const { friendly_name, hostGroupIds } = req.body;
// Generate unique API credentials for this host
const { apiId, apiKey } = generateApiCredentials();
// Check if host already exists
const existingHost = await prisma.hosts.findUnique({
where: { friendly_name: friendly_name },
});
if (existingHost) {
return res.status(409).json({ error: "Host already exists" });
}
// If hostGroupId is provided, verify the group exists
if (hostGroupId) {
const hostGroup = await prisma.host_groups.findUnique({
where: { id: hostGroupId },
// If hostGroupIds is provided, verify all groups exist
if (hostGroupIds && hostGroupIds.length > 0) {
const hostGroups = await prisma.host_groups.findMany({
where: { id: { in: hostGroupIds } },
});
if (!hostGroup) {
return res.status(400).json({ error: "Host group not found" });
if (hostGroups.length !== hostGroupIds.length) {
return res
.status(400)
.json({ error: "One or more host groups not found" });
}
}
@@ -196,6 +200,7 @@ router.post(
const host = await prisma.hosts.create({
data: {
id: uuidv4(),
machine_id: `pending-${uuidv4()}`, // Temporary placeholder until agent connects with real machine_id
friendly_name: friendly_name,
os_type: "unknown", // Will be updated when agent connects
os_version: "unknown", // Will be updated when agent connects
@@ -203,16 +208,31 @@ router.post(
architecture: null, // Will be updated when agent connects
api_id: apiId,
api_key: apiKey,
host_group_id: hostGroupId || null,
status: "pending", // Will change to 'active' when agent connects
updated_at: new Date(),
// Create host group memberships if hostGroupIds are provided
host_group_memberships:
hostGroupIds && hostGroupIds.length > 0
? {
create: hostGroupIds.map((groupId) => ({
id: uuidv4(),
host_groups: {
connect: { id: groupId },
},
})),
}
: undefined,
},
include: {
host_groups: {
select: {
id: true,
name: true,
color: true,
host_group_memberships: {
include: {
host_groups: {
select: {
id: true,
name: true,
color: true,
},
},
},
},
},
@@ -224,7 +244,10 @@ router.post(
friendlyName: host.friendly_name,
apiId: host.api_id,
apiKey: host.api_key,
hostGroup: host.host_groups,
hostGroups:
host.host_group_memberships?.map(
(membership) => membership.host_groups,
) || [],
instructions:
"Use these credentials in your patchmon agent configuration. System information will be automatically detected when the agent connects.",
});
@@ -321,6 +344,10 @@ router.post(
.optional()
.isArray()
.withMessage("Load average must be an array"),
body("machineId")
.optional()
.isString()
.withMessage("Machine ID must be a string"),
],
async (req, res) => {
try {
@@ -329,15 +356,24 @@ router.post(
return res.status(400).json({ errors: errors.array() });
}
const { packages, repositories } = req.body;
const { packages, repositories, executionTime } = req.body;
const host = req.hostRecord;
// Calculate payload size in KB
const payloadSizeBytes = JSON.stringify(req.body).length;
const payloadSizeKb = payloadSizeBytes / 1024;
// Update host last update timestamp and system info if provided
const updateData = {
last_update: new Date(),
updated_at: new Date(),
};
// Update machine_id if provided and current one is a placeholder
if (req.body.machineId && host.machine_id.startsWith("pending-")) {
updateData.machine_id = req.body.machineId;
}
// Basic system info
if (req.body.osType) updateData.os_type = req.body.osType;
if (req.body.osVersion) updateData.os_version = req.body.osVersion;
@@ -382,152 +418,193 @@ router.post(
(pkg) => pkg.isSecurityUpdate,
).length;
const updatesCount = packages.filter((pkg) => pkg.needsUpdate).length;
const totalPackages = packages.length;
// Process everything in a single transaction to avoid race conditions
await prisma.$transaction(async (tx) => {
// Update host data
await tx.hosts.update({
where: { id: host.id },
data: updateData,
});
// Clear existing host packages to avoid duplicates
await tx.host_packages.deleteMany({
where: { host_id: host.id },
});
// Process each package
for (const packageData of packages) {
// Find or create package
let pkg = await tx.packages.findUnique({
where: { name: packageData.name },
await prisma.$transaction(
async (tx) => {
// Update host data
await tx.hosts.update({
where: { id: host.id },
data: updateData,
});
if (!pkg) {
pkg = await tx.packages.create({
data: {
// Clear existing host packages to avoid duplicates
await tx.host_packages.deleteMany({
where: { host_id: host.id },
});
// Process packages in batches using createMany/updateMany
const packagesToCreate = [];
const packagesToUpdate = [];
const _hostPackagesToUpsert = [];
// First pass: identify what needs to be created/updated
const existingPackages = await tx.packages.findMany({
where: {
name: { in: packages.map((p) => p.name) },
},
});
const existingPackageMap = new Map(
existingPackages.map((p) => [p.name, p]),
);
for (const packageData of packages) {
const existingPkg = existingPackageMap.get(packageData.name);
if (!existingPkg) {
// Package doesn't exist, create it
const newPkg = {
id: uuidv4(),
name: packageData.name,
description: packageData.description || null,
category: packageData.category || null,
latest_version:
packageData.availableVersion || packageData.currentVersion,
created_at: new Date(),
updated_at: new Date(),
};
packagesToCreate.push(newPkg);
existingPackageMap.set(packageData.name, newPkg);
} else if (
packageData.availableVersion &&
packageData.availableVersion !== existingPkg.latest_version
) {
// Package exists but needs version update
packagesToUpdate.push({
id: existingPkg.id,
latest_version: packageData.availableVersion,
});
}
}
// Batch create new packages
if (packagesToCreate.length > 0) {
await tx.packages.createMany({
data: packagesToCreate,
skipDuplicates: true,
});
}
// Batch update existing packages
for (const update of packagesToUpdate) {
await tx.packages.update({
where: { id: update.id },
data: {
latest_version: update.latest_version,
updated_at: new Date(),
},
});
} else {
// Update package latest version if newer
if (
packageData.availableVersion &&
packageData.availableVersion !== pkg.latest_version
) {
await tx.packages.update({
where: { id: pkg.id },
data: {
latest_version: packageData.availableVersion,
updated_at: new Date(),
},
});
}
}
// Create host package relationship
// Use upsert to handle potential duplicates gracefully
await tx.host_packages.upsert({
where: {
host_id_package_id: {
host_id: host.id,
package_id: pkg.id,
},
},
update: {
current_version: packageData.currentVersion,
available_version: packageData.availableVersion || null,
needs_update: packageData.needsUpdate,
is_security_update: packageData.isSecurityUpdate || false,
last_checked: new Date(),
},
create: {
id: uuidv4(),
host_id: host.id,
package_id: pkg.id,
current_version: packageData.currentVersion,
available_version: packageData.availableVersion || null,
needs_update: packageData.needsUpdate,
is_security_update: packageData.isSecurityUpdate || false,
last_checked: new Date(),
},
});
}
// Now process host_packages
for (const packageData of packages) {
const pkg = existingPackageMap.get(packageData.name);
// Process repositories if provided
if (repositories && Array.isArray(repositories)) {
// Clear existing host repositories
await tx.host_repositories.deleteMany({
where: { host_id: host.id },
});
// Deduplicate repositories by URL+distribution+components to avoid constraint violations
const uniqueRepos = new Map();
for (const repoData of repositories) {
const key = `${repoData.url}|${repoData.distribution}|${repoData.components}`;
if (!uniqueRepos.has(key)) {
uniqueRepos.set(key, repoData);
}
}
// Process each unique repository
for (const repoData of uniqueRepos.values()) {
// Find or create repository
let repo = await tx.repositories.findFirst({
await tx.host_packages.upsert({
where: {
url: repoData.url,
distribution: repoData.distribution,
components: repoData.components,
},
});
if (!repo) {
repo = await tx.repositories.create({
data: {
id: uuidv4(),
name: repoData.name,
url: repoData.url,
distribution: repoData.distribution,
components: repoData.components,
repo_type: repoData.repoType,
is_active: true,
is_secure: repoData.isSecure || false,
description: `${repoData.repoType} repository for ${repoData.distribution}`,
updated_at: new Date(),
host_id_package_id: {
host_id: host.id,
package_id: pkg.id,
},
});
}
// Create host repository relationship
await tx.host_repositories.create({
data: {
},
update: {
current_version: packageData.currentVersion,
available_version: packageData.availableVersion || null,
needs_update: packageData.needsUpdate,
is_security_update: packageData.isSecurityUpdate || false,
last_checked: new Date(),
},
create: {
id: uuidv4(),
host_id: host.id,
repository_id: repo.id,
is_enabled: repoData.isEnabled !== false, // Default to enabled
package_id: pkg.id,
current_version: packageData.currentVersion,
available_version: packageData.availableVersion || null,
needs_update: packageData.needsUpdate,
is_security_update: packageData.isSecurityUpdate || false,
last_checked: new Date(),
},
});
}
}
// Create update history record
await tx.update_history.create({
data: {
id: uuidv4(),
host_id: host.id,
packages_count: updatesCount,
security_count: securityCount,
status: "success",
},
});
});
// Process repositories if provided
if (repositories && Array.isArray(repositories)) {
// Clear existing host repositories
await tx.host_repositories.deleteMany({
where: { host_id: host.id },
});
// Deduplicate repositories by URL+distribution+components to avoid constraint violations
const uniqueRepos = new Map();
for (const repoData of repositories) {
const key = `${repoData.url}|${repoData.distribution}|${repoData.components}`;
if (!uniqueRepos.has(key)) {
uniqueRepos.set(key, repoData);
}
}
// Process each unique repository
for (const repoData of uniqueRepos.values()) {
// Find or create repository
let repo = await tx.repositories.findFirst({
where: {
url: repoData.url,
distribution: repoData.distribution,
components: repoData.components,
},
});
if (!repo) {
repo = await tx.repositories.create({
data: {
id: uuidv4(),
name: repoData.name,
url: repoData.url,
distribution: repoData.distribution,
components: repoData.components,
repo_type: repoData.repoType,
is_active: true,
is_secure: repoData.isSecure || false,
description: `${repoData.repoType} repository for ${repoData.distribution}`,
updated_at: new Date(),
},
});
}
// Create host repository relationship
await tx.host_repositories.create({
data: {
id: uuidv4(),
host_id: host.id,
repository_id: repo.id,
is_enabled: repoData.isEnabled !== false, // Default to enabled
last_checked: new Date(),
},
});
}
}
// Create update history record
await tx.update_history.create({
data: {
id: uuidv4(),
host_id: host.id,
packages_count: updatesCount,
security_count: securityCount,
total_packages: totalPackages,
payload_size_kb: payloadSizeKb,
execution_time: executionTime ? parseFloat(executionTime) : null,
status: "success",
},
});
},
{
maxWait: 30000, // Wait up to 30s for a transaction slot
timeout: 60000, // Allow transaction to run for up to 60s
},
);
// Agent auto-update is now handled client-side by the agent itself
@@ -686,18 +763,96 @@ router.post(
},
);
// Admin endpoint to bulk update host groups
// TODO: Admin endpoint to bulk update host groups - needs to be rewritten for the many-to-many relationship
// router.put(
// "/bulk/group",
// authenticateToken,
// requireManageHosts,
// [
// body("hostIds").isArray().withMessage("Host IDs must be an array"),
// body("hostIds.*")
// .isLength({ min: 1 })
// .withMessage("Each host ID must be provided"),
// body("hostGroupId").optional(),
// ],
// async (req, res) => {
// try {
// const errors = validationResult(req);
// if (!errors.isEmpty()) {
// return res.status(400).json({ errors: errors.array() });
// }
// const { hostIds, hostGroupId } = req.body;
// // If hostGroupId is provided, verify the group exists
// if (hostGroupId) {
// const hostGroup = await prisma.host_groups.findUnique({
// where: { id: hostGroupId },
// });
// if (!hostGroup) {
// return res.status(400).json({ error: "Host group not found" });
// }
// }
// // Check if all hosts exist
// const existingHosts = await prisma.hosts.findMany({
// where: { id: { in: hostIds } },
// select: { id: true, friendly_name: true },
// });
// if (existingHosts.length !== hostIds.length) {
// const foundIds = existingHosts.map((h) => h.id);
// const missingIds = hostIds.filter((id) => !foundIds.includes(id));
// return res.status(400).json({
// error: "Some hosts not found",
// missingHostIds: missingIds,
// });
// }
// // Bulk update host groups
// const updateResult = await prisma.hosts.updateMany({
// where: { id: { in: hostIds } },
// data: {
// host_group_id: hostGroupId || null,
// updated_at: new Date(),
// },
// });
// // Get updated hosts with group information
// const updatedHosts = await prisma.hosts.findMany({
// where: { id: { in: hostIds } },
// select: {
// id: true,
// friendly_name: true,
// host_groups: {
// select: {
// id: true,
// name: true,
// color: true,
// },
// },
// },
// });
// res.json({
// message: `Successfully updated ${updateResult.count} host${updateResult.count !== 1 ? "s" : ""}`,
// updatedCount: updateResult.count,
// hosts: updatedHosts,
// });
// } catch (error) {
// console.error("Bulk host group update error:", error);
// res.status(500).json({ error: "Failed to update host groups" });
// }
// },
// );
// Admin endpoint to update host groups (many-to-many)
router.put(
"/bulk/group",
"/:hostId/groups",
authenticateToken,
requireManageHosts,
[
body("hostIds").isArray().withMessage("Host IDs must be an array"),
body("hostIds.*")
.isLength({ min: 1 })
.withMessage("Each host ID must be provided"),
body("hostGroupId").optional(),
],
[body("groupIds").isArray().optional()],
async (req, res) => {
try {
const errors = validationResult(req);
@@ -705,72 +860,83 @@ router.put(
return res.status(400).json({ errors: errors.array() });
}
const { hostIds, hostGroupId } = req.body;
const { hostId } = req.params;
const { groupIds = [] } = req.body;
// If hostGroupId is provided, verify the group exists
if (hostGroupId) {
const hostGroup = await prisma.host_groups.findUnique({
where: { id: hostGroupId },
// Check if host exists
const host = await prisma.hosts.findUnique({
where: { id: hostId },
});
if (!host) {
return res.status(404).json({ error: "Host not found" });
}
// Verify all groups exist
if (groupIds.length > 0) {
const existingGroups = await prisma.host_groups.findMany({
where: { id: { in: groupIds } },
select: { id: true },
});
if (!hostGroup) {
return res.status(400).json({ error: "Host group not found" });
if (existingGroups.length !== groupIds.length) {
return res.status(400).json({
error: "One or more host groups not found",
provided: groupIds,
found: existingGroups.map((g) => g.id),
});
}
}
// Check if all hosts exist
const existingHosts = await prisma.hosts.findMany({
where: { id: { in: hostIds } },
select: { id: true, friendly_name: true },
});
if (existingHosts.length !== hostIds.length) {
const foundIds = existingHosts.map((h) => h.id);
const missingIds = hostIds.filter((id) => !foundIds.includes(id));
return res.status(400).json({
error: "Some hosts not found",
missingHostIds: missingIds,
// Use transaction to update group memberships
const updatedHost = await prisma.$transaction(async (tx) => {
// Remove existing memberships
await tx.host_group_memberships.deleteMany({
where: { host_id: hostId },
});
}
// Bulk update host groups
const updateResult = await prisma.hosts.updateMany({
where: { id: { in: hostIds } },
data: {
host_group_id: hostGroupId || null,
updated_at: new Date(),
},
});
// Add new memberships
if (groupIds.length > 0) {
await tx.host_group_memberships.createMany({
data: groupIds.map((groupId) => ({
id: crypto.randomUUID(),
host_id: hostId,
host_group_id: groupId,
})),
});
}
// Get updated hosts with group information
const updatedHosts = await prisma.hosts.findMany({
where: { id: { in: hostIds } },
select: {
id: true,
friendly_name: true,
host_groups: {
select: {
id: true,
name: true,
color: true,
// Return updated host with groups
return await tx.hosts.findUnique({
where: { id: hostId },
include: {
host_group_memberships: {
include: {
host_groups: {
select: {
id: true,
name: true,
color: true,
},
},
},
},
},
},
});
});
res.json({
message: `Successfully updated ${updateResult.count} host${updateResult.count !== 1 ? "s" : ""}`,
updatedCount: updateResult.count,
hosts: updatedHosts,
message: "Host groups updated successfully",
host: updatedHost,
});
} catch (error) {
console.error("Bulk host group update error:", error);
console.error("Host groups update error:", error);
res.status(500).json({ error: "Failed to update host groups" });
}
},
);
// Admin endpoint to update host group
// Legacy endpoint to update single host group (for backward compatibility)
router.put(
"/:hostId/group",
authenticateToken,
@@ -786,6 +952,9 @@ router.put(
const { hostId } = req.params;
const { hostGroupId } = req.body;
// Convert single group to array and use the new endpoint logic
const _groupIds = hostGroupId ? [hostGroupId] : [];
// Check if host exists
const host = await prisma.hosts.findUnique({
where: { id: hostId },
@@ -795,7 +964,7 @@ router.put(
return res.status(404).json({ error: "Host not found" });
}
// If hostGroupId is provided, verify the group exists
// Verify group exists if provided
if (hostGroupId) {
const hostGroup = await prisma.host_groups.findUnique({
where: { id: hostGroupId },
@@ -806,22 +975,41 @@ router.put(
}
}
// Update host group
const updatedHost = await prisma.hosts.update({
where: { id: hostId },
data: {
host_group_id: hostGroupId || null,
updated_at: new Date(),
},
include: {
host_groups: {
select: {
id: true,
name: true,
color: true,
// Use transaction to update group memberships
const updatedHost = await prisma.$transaction(async (tx) => {
// Remove existing memberships
await tx.host_group_memberships.deleteMany({
where: { host_id: hostId },
});
// Add new membership if group provided
if (hostGroupId) {
await tx.host_group_memberships.create({
data: {
id: crypto.randomUUID(),
host_id: hostId,
host_group_id: hostGroupId,
},
});
}
// Return updated host with groups
return await tx.hosts.findUnique({
where: { id: hostId },
include: {
host_group_memberships: {
include: {
host_groups: {
select: {
id: true,
name: true,
color: true,
},
},
},
},
},
},
});
});
res.json({
@@ -857,13 +1045,16 @@ router.get(
agent_version: true,
auto_update: true,
created_at: true,
host_group_id: true,
notes: true,
host_groups: {
select: {
id: true,
name: true,
color: true,
host_group_memberships: {
include: {
host_groups: {
select: {
id: true,
name: true,
color: true,
},
},
},
},
},
@@ -1126,12 +1317,20 @@ router.get("/install", async (req, res) => {
}
} catch (_) {}
// Inject the API credentials, server URL, and curl flags into the script
// Check for --force parameter
const forceInstall = req.query.force === "true" || req.query.force === "1";
// Get architecture parameter (default to amd64)
const architecture = req.query.arch || "amd64";
// Inject the API credentials, server URL, curl flags, force flag, and architecture into the script
const envVars = `#!/bin/bash
export PATCHMON_URL="${serverUrl}"
export API_ID="${host.api_id}"
export API_KEY="${host.api_key}"
export CURL_FLAGS="${curlFlags}"
export FORCE_INSTALL="${forceInstall ? "true" : "false"}"
export ARCHITECTURE="${architecture}"
`;
@@ -1151,6 +1350,48 @@ export CURL_FLAGS="${curlFlags}"
}
});
// Check if machine_id already exists (requires auth)
router.post("/check-machine-id", validateApiCredentials, async (req, res) => {
try {
const { machine_id } = req.body;
if (!machine_id) {
return res.status(400).json({
error: "machine_id is required",
});
}
// Check if a host with this machine_id exists
const existing_host = await prisma.hosts.findUnique({
where: { machine_id },
select: {
id: true,
friendly_name: true,
machine_id: true,
api_id: true,
status: true,
created_at: true,
},
});
if (existing_host) {
return res.status(200).json({
exists: true,
host: existing_host,
message: "This machine is already enrolled",
});
}
return res.status(200).json({
exists: false,
message: "Machine not yet enrolled",
});
} catch (error) {
console.error("Error checking machine_id:", error);
res.status(500).json({ error: "Failed to check machine_id" });
}
});
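// Illustrative responses (shapes taken from the handler above):
//   already enrolled: { "exists": true,  "host": { ...selected host fields... },
//                       "message": "This machine is already enrolled" }
//   not enrolled:     { "exists": false, "message": "Machine not yet enrolled" }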
// Serve the removal script (public endpoint - no authentication required)
router.get("/remove", async (_req, res) => {
try {
@@ -1466,16 +1707,16 @@ router.patch(
architecture: true,
last_update: true,
status: true,
host_group_id: true,
agent_version: true,
auto_update: true,
created_at: true,
updated_at: true,
host_groups: {
select: {
id: true,
name: true,
color: true,
host_group_memberships: {
include: {
host_groups: {
select: {
id: true,
name: true,
color: true,
},
},
},
},
},
@@ -1539,17 +1780,16 @@ router.patch(
architecture: true,
last_update: true,
status: true,
host_group_id: true,
agent_version: true,
auto_update: true,
created_at: true,
updated_at: true,
notes: true,
host_groups: {
select: {
id: true,
name: true,
color: true,
host_group_memberships: {
include: {
host_groups: {
select: {
id: true,
name: true,
color: true,
},
},
},
},
},

View File

@@ -14,6 +14,7 @@ router.get("/", async (req, res) => {
category = "",
needsUpdate = "",
isSecurityUpdate = "",
host = "",
} = req.query;
const skip = (parseInt(page, 10) - 1) * parseInt(limit, 10);
@@ -33,8 +34,27 @@ router.get("/", async (req, res) => {
: {},
// Category filter
category ? { category: { equals: category } } : {},
// Update status filters
needsUpdate
// Host filter - only return packages installed on the specified host
// Combined with update status filters if both are present
host
? {
host_packages: {
some: {
host_id: host,
// If needsUpdate or isSecurityUpdate filters are present, apply them here
...(needsUpdate
? { needs_update: needsUpdate === "true" }
: {}),
...(isSecurityUpdate
? { is_security_update: isSecurityUpdate === "true" }
: {}),
},
},
}
: {},
// Update status filters (only applied if no host filter)
// If host filter is present, these are already applied above
!host && needsUpdate
? {
host_packages: {
some: {
@@ -43,7 +63,7 @@ router.get("/", async (req, res) => {
},
}
: {},
isSecurityUpdate
!host && isSecurityUpdate
? {
host_packages: {
some: {
@@ -67,7 +87,9 @@ router.get("/", async (req, res) => {
latest_version: true,
created_at: true,
_count: {
host_packages: true,
select: {
host_packages: true,
},
},
},
skip,
@@ -82,24 +104,32 @@ router.get("/", async (req, res) => {
// Get additional stats for each package
const packagesWithStats = await Promise.all(
packages.map(async (pkg) => {
const [updatesCount, securityCount, affectedHosts] = await Promise.all([
// Build base where clause for this package
const baseWhere = { package_id: pkg.id };
// If host filter is specified, add host filter to all queries
const hostWhere = host ? { ...baseWhere, host_id: host } : baseWhere;
const [updatesCount, securityCount, packageHosts] = await Promise.all([
prisma.host_packages.count({
where: {
package_id: pkg.id,
...hostWhere,
needs_update: true,
},
}),
prisma.host_packages.count({
where: {
package_id: pkg.id,
...hostWhere,
needs_update: true,
is_security_update: true,
},
}),
prisma.host_packages.findMany({
where: {
package_id: pkg.id,
needs_update: true,
...hostWhere,
// If host filter is specified, include all packages for that host
// Otherwise, only include packages that need updates
...(host ? {} : { needs_update: true }),
},
select: {
hosts: {
@@ -110,6 +140,10 @@ router.get("/", async (req, res) => {
os_type: true,
},
},
current_version: true,
available_version: true,
needs_update: true,
is_security_update: true,
},
take: 10, // Limit to first 10 for performance
}),
@@ -117,17 +151,18 @@ router.get("/", async (req, res) => {
return {
...pkg,
affectedHostsCount: pkg._count.hostPackages,
affectedHosts: affectedHosts.map((hp) => ({
hostId: hp.host.id,
friendlyName: hp.host.friendly_name,
osType: hp.host.os_type,
packageHostsCount: pkg._count.host_packages,
packageHosts: packageHosts.map((hp) => ({
hostId: hp.hosts.id,
friendlyName: hp.hosts.friendly_name,
osType: hp.hosts.os_type,
currentVersion: hp.current_version,
availableVersion: hp.available_version,
needsUpdate: hp.needs_update,
isSecurityUpdate: hp.is_security_update,
})),
stats: {
totalInstalls: pkg._count.hostPackages,
totalInstalls: pkg._count.host_packages,
updatesNeeded: updatesCount,
securityUpdates: securityCount,
},
@@ -160,19 +195,19 @@ router.get("/:packageId", async (req, res) => {
include: {
host_packages: {
include: {
host: {
hosts: {
select: {
id: true,
hostname: true,
ip: true,
osType: true,
osVersion: true,
lastUpdate: true,
os_type: true,
os_version: true,
last_update: true,
},
},
},
orderBy: {
needsUpdate: "desc",
needs_update: "desc",
},
},
},
@@ -185,25 +220,25 @@ router.get("/:packageId", async (req, res) => {
// Calculate statistics
const stats = {
totalInstalls: packageData.host_packages.length,
updatesNeeded: packageData.host_packages.filter((hp) => hp.needsUpdate)
updatesNeeded: packageData.host_packages.filter((hp) => hp.needs_update)
.length,
securityUpdates: packageData.host_packages.filter(
(hp) => hp.needsUpdate && hp.isSecurityUpdate,
(hp) => hp.needs_update && hp.is_security_update,
).length,
upToDate: packageData.host_packages.filter((hp) => !hp.needsUpdate)
upToDate: packageData.host_packages.filter((hp) => !hp.needs_update)
.length,
};
// Group by version
const versionDistribution = packageData.host_packages.reduce((acc, hp) => {
const version = hp.currentVersion;
const version = hp.current_version;
acc[version] = (acc[version] || 0) + 1;
return acc;
}, {});
// Group by OS type
const osDistribution = packageData.host_packages.reduce((acc, hp) => {
const osType = hp.host.osType;
const osType = hp.hosts.os_type;
acc[osType] = (acc[osType] || 0) + 1;
return acc;
}, {});
@@ -230,4 +265,109 @@ router.get("/:packageId", async (req, res) => {
}
});
// Get hosts where a package is installed
router.get("/:packageId/hosts", async (req, res) => {
try {
const { packageId } = req.params;
const {
page = 1,
limit = 25,
search = "",
sortBy = "friendly_name",
sortOrder = "asc",
} = req.query;
const offset = (parseInt(page, 10) - 1) * parseInt(limit, 10);
// Build search conditions
const searchConditions = search
? {
OR: [
{
hosts: {
friendly_name: { contains: search, mode: "insensitive" },
},
},
{ hosts: { hostname: { contains: search, mode: "insensitive" } } },
{ current_version: { contains: search, mode: "insensitive" } },
{ available_version: { contains: search, mode: "insensitive" } },
],
}
: {};
// Build sort conditions
const orderBy = {};
if (
sortBy === "friendly_name" ||
sortBy === "hostname" ||
sortBy === "os_type"
) {
orderBy.hosts = { [sortBy]: sortOrder };
} else if (sortBy === "needs_update") {
orderBy[sortBy] = sortOrder;
} else {
orderBy[sortBy] = sortOrder;
}
// Get total count
const totalCount = await prisma.host_packages.count({
where: {
package_id: packageId,
...searchConditions,
},
});
// Get paginated results
const hostPackages = await prisma.host_packages.findMany({
where: {
package_id: packageId,
...searchConditions,
},
include: {
hosts: {
select: {
id: true,
friendly_name: true,
hostname: true,
os_type: true,
os_version: true,
last_update: true,
},
},
},
orderBy,
skip: offset,
take: parseInt(limit, 10),
});
// Transform the data for the frontend
const hosts = hostPackages.map((hp) => ({
hostId: hp.hosts.id,
friendlyName: hp.hosts.friendly_name,
hostname: hp.hosts.hostname,
osType: hp.hosts.os_type,
osVersion: hp.hosts.os_version,
lastUpdate: hp.hosts.last_update,
currentVersion: hp.current_version,
availableVersion: hp.available_version,
needsUpdate: hp.needs_update,
isSecurityUpdate: hp.is_security_update,
lastChecked: hp.last_checked,
}));
res.json({
hosts,
pagination: {
page: parseInt(page, 10),
limit: parseInt(limit, 10),
total: totalCount,
pages: Math.ceil(totalCount / parseInt(limit, 10)),
},
});
} catch (error) {
console.error("Error fetching package hosts:", error);
res.status(500).json({ error: "Failed to fetch package hosts" });
}
});
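// Illustrative request (path relative to wherever the packages router is mounted):
//   GET /<packages>/:packageId/hosts?search=nginx&sortBy=friendly_name&sortOrder=asc&page=1&limit=25
// Each entry in the hosts array is the flattened shape built above: hostId, friendlyName,
// hostname, osType, osVersion, lastUpdate, currentVersion, availableVersion, needsUpdate,
// isSecurityUpdate, lastChecked.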
module.exports = router;

View File

@@ -289,6 +289,77 @@ router.get(
},
);
// Delete a specific repository (admin only)
router.delete(
"/:repositoryId",
authenticateToken,
requireManageHosts,
async (req, res) => {
try {
const { repositoryId } = req.params;
// Check if repository exists first
const existingRepository = await prisma.repositories.findUnique({
where: { id: repositoryId },
select: {
id: true,
name: true,
url: true,
_count: {
select: {
host_repositories: true,
},
},
},
});
if (!existingRepository) {
return res.status(404).json({
error: "Repository not found",
details: "The repository may have been deleted or does not exist",
});
}
// Delete repository and all related data (cascade will handle host_repositories)
await prisma.repositories.delete({
where: { id: repositoryId },
});
res.json({
message: "Repository deleted successfully",
deletedRepository: {
id: existingRepository.id,
name: existingRepository.name,
url: existingRepository.url,
hostCount: existingRepository._count.host_repositories,
},
});
} catch (error) {
console.error("Repository deletion error:", error);
// Handle specific Prisma errors
if (error.code === "P2025") {
return res.status(404).json({
error: "Repository not found",
details: "The repository may have been deleted or does not exist",
});
}
if (error.code === "P2003") {
return res.status(400).json({
error: "Cannot delete repository due to foreign key constraints",
details: "The repository has related data that prevents deletion",
});
}
res.status(500).json({
error: "Failed to delete repository",
details: error.message || "An unexpected error occurred",
});
}
},
);
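A minimal usage sketch for the delete endpoint above, assuming the repository routes are mounted under /api/v1/repositories and the caller holds an admin token (both assumptions; repositoryId and adminToken are placeholders):

const res = await fetch(`/api/v1/repositories/${repositoryId}`, {
  method: "DELETE",
  headers: { Authorization: `Bearer ${adminToken}` },
});
// On success the handler returns { message, deletedRepository: { id, name, url, hostCount } }.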
// Cleanup orphaned repositories (admin only)
router.delete(
"/cleanup/orphaned",


@@ -0,0 +1,249 @@
const express = require("express");
const router = express.Router();
const { createPrismaClient } = require("../config/database");
const { authenticateToken } = require("../middleware/auth");
const prisma = createPrismaClient();
/**
* Global search endpoint
* Searches across hosts, packages, repositories, and users
* Returns categorized results
*/
router.get("/", authenticateToken, async (req, res) => {
try {
const { q } = req.query;
if (!q || q.trim().length === 0) {
return res.json({
hosts: [],
packages: [],
repositories: [],
users: [],
});
}
const searchTerm = q.trim();
// Prepare results object
const results = {
hosts: [],
packages: [],
repositories: [],
users: [],
};
// Get user permissions from database
let userPermissions = null;
try {
userPermissions = await prisma.role_permissions.findUnique({
where: { role: req.user.role },
});
// If no specific permissions found, default to admin permissions
if (!userPermissions) {
console.warn(
`No permissions found for role: ${req.user.role}, defaulting to admin access`,
);
userPermissions = {
can_view_hosts: true,
can_view_packages: true,
can_view_users: true,
};
}
} catch (permError) {
console.error("Error fetching permissions:", permError);
// Default to restrictive permissions on error
userPermissions = {
can_view_hosts: false,
can_view_packages: false,
can_view_users: false,
};
}
// Search hosts if user has permission
if (userPermissions.can_view_hosts) {
try {
const hosts = await prisma.hosts.findMany({
where: {
OR: [
{ hostname: { contains: searchTerm, mode: "insensitive" } },
{ friendly_name: { contains: searchTerm, mode: "insensitive" } },
{ ip: { contains: searchTerm, mode: "insensitive" } },
{ machine_id: { contains: searchTerm, mode: "insensitive" } },
],
},
select: {
id: true,
machine_id: true,
hostname: true,
friendly_name: true,
ip: true,
os_type: true,
os_version: true,
status: true,
last_update: true,
},
take: 10, // Limit results
orderBy: {
last_update: "desc",
},
});
results.hosts = hosts.map((host) => ({
id: host.id,
hostname: host.hostname,
friendly_name: host.friendly_name,
ip: host.ip,
os_type: host.os_type,
os_version: host.os_version,
status: host.status,
last_update: host.last_update,
type: "host",
}));
} catch (error) {
console.error("Error searching hosts:", error);
}
}
// Search packages if user has permission
if (userPermissions.can_view_packages) {
try {
const packages = await prisma.packages.findMany({
where: {
name: { contains: searchTerm, mode: "insensitive" },
},
select: {
id: true,
name: true,
description: true,
category: true,
latest_version: true,
_count: {
select: {
host_packages: true,
},
},
},
take: 10,
orderBy: {
name: "asc",
},
});
results.packages = packages.map((pkg) => ({
id: pkg.id,
name: pkg.name,
description: pkg.description,
category: pkg.category,
latest_version: pkg.latest_version,
host_count: pkg._count.host_packages,
type: "package",
}));
} catch (error) {
console.error("Error searching packages:", error);
}
}
// Search repositories if user has permission (usually same as hosts)
if (userPermissions.can_view_hosts) {
try {
const repositories = await prisma.repositories.findMany({
where: {
OR: [
{ name: { contains: searchTerm, mode: "insensitive" } },
{ url: { contains: searchTerm, mode: "insensitive" } },
{ description: { contains: searchTerm, mode: "insensitive" } },
],
},
select: {
id: true,
name: true,
url: true,
distribution: true,
repo_type: true,
is_active: true,
description: true,
_count: {
select: {
host_repositories: true,
},
},
},
take: 10,
orderBy: {
name: "asc",
},
});
results.repositories = repositories.map((repo) => ({
id: repo.id,
name: repo.name,
url: repo.url,
distribution: repo.distribution,
repo_type: repo.repo_type,
is_active: repo.is_active,
description: repo.description,
host_count: repo._count.host_repositories,
type: "repository",
}));
} catch (error) {
console.error("Error searching repositories:", error);
}
}
// Search users if user has permission
if (userPermissions.can_view_users) {
try {
const users = await prisma.users.findMany({
where: {
OR: [
{ username: { contains: searchTerm, mode: "insensitive" } },
{ email: { contains: searchTerm, mode: "insensitive" } },
{ first_name: { contains: searchTerm, mode: "insensitive" } },
{ last_name: { contains: searchTerm, mode: "insensitive" } },
],
},
select: {
id: true,
username: true,
email: true,
first_name: true,
last_name: true,
role: true,
is_active: true,
last_login: true,
},
take: 10,
orderBy: {
username: "asc",
},
});
results.users = users.map((user) => ({
id: user.id,
username: user.username,
email: user.email,
first_name: user.first_name,
last_name: user.last_name,
role: user.role,
is_active: user.is_active,
last_login: user.last_login,
type: "user",
}));
} catch (error) {
console.error("Error searching users:", error);
}
}
res.json(results);
} catch (error) {
console.error("Global search error:", error);
res.status(500).json({
error: "Failed to perform search",
message: error.message,
});
}
});
module.exports = router;
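A minimal usage sketch for the global search route above; the /api/v1/search mount point appears in server.js later in this diff, and the token is a placeholder:

const res = await fetch(`/api/v1/search?q=${encodeURIComponent("nginx")}`, {
  headers: { Authorization: `Bearer ${token}` },
});
// Each category is capped at 10 results and every entry carries a `type` discriminator.
const { hosts, packages, repositories, users } = await res.json();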


@@ -8,102 +8,9 @@ const { getSettings, updateSettings } = require("../services/settingsService");
const router = express.Router();
const prisma = new PrismaClient();
// Function to trigger crontab updates on all hosts with auto-update enabled
async function triggerCrontabUpdates() {
try {
console.log(
"Triggering crontab updates on all hosts with auto-update enabled...",
);
// Get current settings for server URL
const settings = await getSettings();
const serverUrl = settings.server_url;
// Get all hosts that have auto-update enabled
const hosts = await prisma.hosts.findMany({
where: {
auto_update: true,
status: "active", // Only update active hosts
},
select: {
id: true,
friendly_name: true,
api_id: true,
api_key: true,
},
});
console.log(`Found ${hosts.length} hosts with auto-update enabled`);
// For each host, we'll send a special update command that triggers crontab update
// This is done by sending a ping with a special flag
for (const host of hosts) {
try {
console.log(
`Triggering crontab update for host: ${host.friendly_name}`,
);
// We'll use the existing ping endpoint but add a special parameter
// The agent will detect this and run update-crontab command
const http = require("node:http");
const https = require("node:https");
const url = new URL(`${serverUrl}/api/v1/hosts/ping`);
const isHttps = url.protocol === "https:";
const client = isHttps ? https : http;
const postData = JSON.stringify({
triggerCrontabUpdate: true,
message: "Update interval changed, please update your crontab",
});
const options = {
hostname: url.hostname,
port: url.port || (isHttps ? 443 : 80),
path: url.pathname,
method: "POST",
headers: {
"Content-Type": "application/json",
"Content-Length": Buffer.byteLength(postData),
"X-API-ID": host.api_id,
"X-API-KEY": host.api_key,
},
};
const req = client.request(options, (res) => {
if (res.statusCode === 200) {
console.log(
`Successfully triggered crontab update for ${host.friendly_name}`,
);
} else {
console.error(
`Failed to trigger crontab update for ${host.friendly_name}: ${res.statusCode}`,
);
}
});
req.on("error", (error) => {
console.error(
`Error triggering crontab update for ${host.friendly_name}:`,
error.message,
);
});
req.write(postData);
req.end();
} catch (error) {
console.error(
`Error triggering crontab update for ${host.friendly_name}:`,
error.message,
);
}
}
console.log("Crontab update trigger completed");
} catch (error) {
console.error("Error in triggerCrontabUpdates:", error);
}
}
// WebSocket broadcaster for agent policy updates (no longer used - queue-based delivery preferred)
// const { broadcastSettingsUpdate } = require("../services/agentWs");
const { queueManager, QUEUE_NAMES } = require("../services/automation");
// Helpers
function normalizeUpdateInterval(minutes) {
@@ -215,6 +122,18 @@ router.put(
}
return true;
}),
body("logoDark")
.optional()
.isLength({ min: 1 })
.withMessage("Logo dark path must be a non-empty string"),
body("logoLight")
.optional()
.isLength({ min: 1 })
.withMessage("Logo light path must be a non-empty string"),
body("favicon")
.optional()
.isLength({ min: 1 })
.withMessage("Favicon path must be a non-empty string"),
],
async (req, res) => {
try {
@@ -236,6 +155,9 @@ router.put(
githubRepoUrl,
repositoryType,
sshKeyPath,
logoDark,
logoLight,
favicon,
} = req.body;
// Get current settings to check for update interval changes
@@ -264,6 +186,9 @@ router.put(
if (repositoryType !== undefined)
updateData.repository_type = repositoryType;
if (sshKeyPath !== undefined) updateData.ssh_key_path = sshKeyPath;
if (logoDark !== undefined) updateData.logo_dark = logoDark;
if (logoLight !== undefined) updateData.logo_light = logoLight;
if (favicon !== undefined) updateData.favicon = favicon;
const updatedSettings = await updateSettings(
currentSettings.id,
@@ -272,15 +197,36 @@ router.put(
console.log("Settings updated successfully:", updatedSettings);
// If update interval changed, enqueue persistent jobs for agents
if (
updateInterval !== undefined &&
oldUpdateInterval !== updateData.update_interval
) {
console.log(
`Update interval changed from ${oldUpdateInterval} to ${updateData.update_interval} minutes. Enqueueing agent settings updates...`,
);
const hosts = await prisma.hosts.findMany({
where: { status: "active" },
select: { api_id: true },
});
const queue = queueManager.queues[QUEUE_NAMES.AGENT_COMMANDS];
const jobs = hosts.map((h) => ({
name: "settings_update",
data: {
api_id: h.api_id,
type: "settings_update",
update_interval: updateData.update_interval,
},
opts: { attempts: 10, backoff: { type: "exponential", delay: 5000 } },
}));
// Bulk add jobs
await queue.addBulk(jobs);
// Note: Queue-based delivery handles retries and ensures reliable delivery
// No need for immediate broadcast as it would cause duplicate messages
}
res.json({
@@ -351,4 +297,175 @@ router.get("/auto-update", async (_req, res) => {
}
});
// Upload logo files
router.post(
"/logos/upload",
authenticateToken,
requireManageSettings,
async (req, res) => {
try {
const { logoType, fileContent, fileName } = req.body;
if (!logoType || !fileContent) {
return res.status(400).json({
error: "Logo type and file content are required",
});
}
if (!["dark", "light", "favicon"].includes(logoType)) {
return res.status(400).json({
error: "Logo type must be 'dark', 'light', or 'favicon'",
});
}
// Validate file content (basic checks)
if (typeof fileContent !== "string") {
return res.status(400).json({
error: "File content must be a base64 string",
});
}
const fs = require("node:fs").promises;
const path = require("node:path");
const _crypto = require("node:crypto");
// Create assets directory if it doesn't exist
// In development: save to public/assets (served by Vite)
// In production: save to dist/assets (served by built app)
const isDevelopment = process.env.NODE_ENV !== "production";
const assetsDir = isDevelopment
? path.join(__dirname, "../../../frontend/public/assets")
: path.join(__dirname, "../../../frontend/dist/assets");
await fs.mkdir(assetsDir, { recursive: true });
// Determine file extension and path
let fileExtension;
let fileName_final;
if (logoType === "favicon") {
fileExtension = ".svg";
fileName_final = fileName || "logo_square.svg";
} else {
// Determine extension from file content or use default
if (fileContent.startsWith("data:image/png")) {
fileExtension = ".png";
} else if (fileContent.startsWith("data:image/svg")) {
fileExtension = ".svg";
} else if (
fileContent.startsWith("data:image/jpeg") ||
fileContent.startsWith("data:image/jpg")
) {
fileExtension = ".jpg";
} else {
fileExtension = ".png"; // Default to PNG
}
fileName_final = fileName || `logo_${logoType}${fileExtension}`;
}
const filePath = path.join(assetsDir, fileName_final);
// Handle base64 data URLs
let fileBuffer;
if (fileContent.startsWith("data:")) {
const base64Data = fileContent.split(",")[1];
fileBuffer = Buffer.from(base64Data, "base64");
} else {
// Assume it's already base64
fileBuffer = Buffer.from(fileContent, "base64");
}
// Create backup of existing file
try {
const backupPath = `${filePath}.backup.${Date.now()}`;
await fs.copyFile(filePath, backupPath);
console.log(`Created backup: ${backupPath}`);
} catch (error) {
// Ignore if original doesn't exist
if (error.code !== "ENOENT") {
console.warn("Failed to create backup:", error.message);
}
}
// Write new logo file
await fs.writeFile(filePath, fileBuffer);
// Update settings with new logo path
const settings = await getSettings();
const logoPath = `/assets/${fileName_final}`;
const updateData = {};
if (logoType === "dark") {
updateData.logo_dark = logoPath;
} else if (logoType === "light") {
updateData.logo_light = logoPath;
} else if (logoType === "favicon") {
updateData.favicon = logoPath;
}
await updateSettings(settings.id, updateData);
// Get file stats
const stats = await fs.stat(filePath);
res.json({
message: `${logoType} logo uploaded successfully`,
fileName: fileName_final,
path: logoPath,
size: stats.size,
sizeFormatted: `${(stats.size / 1024).toFixed(1)} KB`,
});
} catch (error) {
console.error("Upload logo error:", error);
res.status(500).json({ error: "Failed to upload logo" });
}
},
);
// Reset logo to default
router.post(
"/logos/reset",
authenticateToken,
requireManageSettings,
async (req, res) => {
try {
const { logoType } = req.body;
if (!logoType) {
return res.status(400).json({
error: "Logo type is required",
});
}
if (!["dark", "light", "favicon"].includes(logoType)) {
return res.status(400).json({
error: "Logo type must be 'dark', 'light', or 'favicon'",
});
}
// Get current settings
const settings = await getSettings();
// Clear the custom logo path to revert to default
const updateData = {};
if (logoType === "dark") {
updateData.logo_dark = null;
} else if (logoType === "light") {
updateData.logo_light = null;
} else if (logoType === "favicon") {
updateData.favicon = null;
}
await updateSettings(settings.id, updateData);
res.json({
message: `${logoType} logo reset to default successfully`,
logoType,
});
} catch (error) {
console.error("Reset logo error:", error);
res.status(500).json({ error: "Failed to reset logo" });
}
},
);
module.exports = router;
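A minimal usage sketch for the logo upload endpoint above, assuming the settings routes are mounted under /api/v1/settings (not shown in this excerpt); the base64 payload and adminToken are placeholders:

const payload = {
  logoType: "dark", // "dark", "light", or "favicon"
  fileName: "logo_dark.png", // optional; the handler derives a default otherwise
  fileContent: "data:image/png;base64,iVBORw0KGgo...", // truncated placeholder data URL
};
await fetch("/api/v1/settings/logos/upload", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${adminToken}`,
  },
  body: JSON.stringify(payload),
});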


@@ -2,36 +2,229 @@ const express = require("express");
const { authenticateToken } = require("../middleware/auth");
const { requireManageSettings } = require("../middleware/permissions");
const { PrismaClient } = require("@prisma/client");
const { exec } = require("node:child_process");
const { promisify } = require("node:util");
const prisma = new PrismaClient();
const execAsync = promisify(exec);
// Default GitHub repository URL
const DEFAULT_GITHUB_REPO = "https://github.com/patchMon/patchmon";
const router = express.Router();
// Helper function to get current version from package.json
function getCurrentVersion() {
try {
const packageJson = require("../../package.json");
return packageJson?.version || "1.2.9";
} catch (packageError) {
console.warn(
"Could not read version from package.json, using fallback:",
packageError.message,
);
return "1.2.9";
}
}
// Helper function to parse GitHub repository URL
function parseGitHubRepo(repoUrl) {
let owner, repo;
if (repoUrl.includes("git@github.com:")) {
const match = repoUrl.match(/git@github\.com:([^/]+)\/([^/]+)\.git/);
if (match) {
[, owner, repo] = match;
}
} else if (repoUrl.includes("github.com/")) {
const match = repoUrl.match(/github\.com\/([^/]+)\/([^/]+?)(?:\.git)?$/);
if (match) {
[, owner, repo] = match;
}
}
return { owner, repo };
}
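// Illustrative expectations for the helper above (owner/repo values are examples):
//   parseGitHubRepo("git@github.com:PatchMon/PatchMon.git") -> { owner: "PatchMon", repo: "PatchMon" }
//   parseGitHubRepo("https://github.com/PatchMon/PatchMon") -> { owner: "PatchMon", repo: "PatchMon" }
// Unrecognised URLs leave owner and repo undefined, so callers should check both before use.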
// Helper function to get latest release from GitHub API
async function getLatestRelease(owner, repo) {
try {
const currentVersion = getCurrentVersion();
const apiUrl = `https://api.github.com/repos/${owner}/${repo}/releases/latest`;
const response = await fetch(apiUrl, {
method: "GET",
headers: {
Accept: "application/vnd.github.v3+json",
"User-Agent": `PatchMon-Server/${currentVersion}`,
},
});
if (!response.ok) {
const errorText = await response.text();
if (
errorText.includes("rate limit") ||
errorText.includes("API rate limit")
) {
throw new Error("GitHub API rate limit exceeded");
}
throw new Error(
`GitHub API error: ${response.status} ${response.statusText}`,
);
}
const releaseData = await response.json();
return {
tagName: releaseData.tag_name,
version: releaseData.tag_name.replace("v", ""),
publishedAt: releaseData.published_at,
htmlUrl: releaseData.html_url,
};
} catch (error) {
console.error("Error fetching latest release:", error.message);
throw error; // Re-throw to be caught by the calling function
}
}
// Helper function to get latest commit from main branch
async function getLatestCommit(owner, repo) {
try {
const currentVersion = getCurrentVersion();
const apiUrl = `https://api.github.com/repos/${owner}/${repo}/commits/main`;
const response = await fetch(apiUrl, {
method: "GET",
headers: {
Accept: "application/vnd.github.v3+json",
"User-Agent": `PatchMon-Server/${currentVersion}`,
},
});
if (!response.ok) {
const errorText = await response.text();
if (
errorText.includes("rate limit") ||
errorText.includes("API rate limit")
) {
throw new Error("GitHub API rate limit exceeded");
}
throw new Error(
`GitHub API error: ${response.status} ${response.statusText}`,
);
}
const commitData = await response.json();
return {
sha: commitData.sha,
message: commitData.commit.message,
author: commitData.commit.author.name,
date: commitData.commit.author.date,
htmlUrl: commitData.html_url,
};
} catch (error) {
console.error("Error fetching latest commit:", error.message);
throw error; // Re-throw to be caught by the calling function
}
}
// Helper function to get commit count difference
async function getCommitDifference(owner, repo, currentVersion) {
// Try both with and without 'v' prefix for compatibility
const versionTags = [
currentVersion, // Try without 'v' first (new format)
`v${currentVersion}`, // Try with 'v' prefix (old format)
];
for (const versionTag of versionTags) {
try {
// Compare main branch with the released version tag
const apiUrl = `https://api.github.com/repos/${owner}/${repo}/compare/${versionTag}...main`;
const response = await fetch(apiUrl, {
method: "GET",
headers: {
Accept: "application/vnd.github.v3+json",
"User-Agent": `PatchMon-Server/${getCurrentVersion()}`,
},
});
if (!response.ok) {
const errorText = await response.text();
if (
errorText.includes("rate limit") ||
errorText.includes("API rate limit")
) {
throw new Error("GitHub API rate limit exceeded");
}
// If 404, try next tag format
if (response.status === 404) {
continue;
}
throw new Error(
`GitHub API error: ${response.status} ${response.statusText}`,
);
}
const compareData = await response.json();
return {
commitsBehind: compareData.behind_by || 0, // How many commits main is behind release
commitsAhead: compareData.ahead_by || 0, // How many commits main is ahead of release
totalCommits: compareData.total_commits || 0,
branchInfo: "main branch vs release",
};
} catch (error) {
// If rate limit, throw immediately
if (error.message.includes("rate limit")) {
throw error;
}
}
}
// If all attempts failed, throw error
throw new Error(
`Could not find tag '${currentVersion}' or 'v${currentVersion}' in repository`,
);
}
// Helper function to compare version strings (semantic versioning)
function compareVersions(version1, version2) {
const v1parts = version1.split(".").map(Number);
const v2parts = version2.split(".").map(Number);
const maxLength = Math.max(v1parts.length, v2parts.length);
for (let i = 0; i < maxLength; i++) {
const v1part = v1parts[i] || 0;
const v2part = v2parts[i] || 0;
if (v1part > v2part) return 1;
if (v1part < v2part) return -1;
}
return 0;
}
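// Illustrative results for the comparator above (missing parts are padded with 0):
//   compareVersions("1.3.0", "1.2.9")  ->  1   (first argument is newer)
//   compareVersions("1.2", "1.2.0")    ->  0   (equal after padding)
//   compareVersions("1.2.7", "1.10.0") -> -1   (numeric, not lexicographic, comparison)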
// Get current version info
router.get("/current", authenticateToken, async (_req, res) => {
try {
// Read version from package.json dynamically
let currentVersion = "1.2.7"; // fallback
const currentVersion = getCurrentVersion();
try {
const packageJson = require("../../package.json");
if (packageJson?.version) {
currentVersion = packageJson.version;
}
} catch (packageError) {
console.warn(
"Could not read version from package.json, using fallback:",
packageError.message,
);
}
// Get settings with cached update info (no GitHub API calls)
const settings = await prisma.settings.findFirst();
const githubRepoUrl = settings?.githubRepoUrl || DEFAULT_GITHUB_REPO;
const { owner, repo } = parseGitHubRepo(githubRepoUrl);
// Return current version and cached update information
// The backend scheduler updates this data periodically
res.json({
version: currentVersion,
latest_version: settings?.latest_version || null,
is_update_available: settings?.is_update_available || false,
last_update_check: settings?.last_update_check || null,
buildDate: new Date().toISOString(),
environment: process.env.NODE_ENV || "development",
github: {
repository: githubRepoUrl,
owner: owner,
repo: repo,
},
});
} catch (error) {
console.error("Error getting current version:", error);
@@ -44,119 +237,11 @@ router.post(
"/test-ssh-key",
authenticateToken,
requireManageSettings,
async (_req, res) => {
res.status(410).json({
error:
"SSH key testing has been removed. Using default public repository.",
});
},
);
@@ -174,24 +259,93 @@ router.get(
return res.status(400).json({ error: "Settings not found" });
}
const currentVersion = "1.2.7";
const latestVersion = settings.latest_version || currentVersion;
const isUpdateAvailable = settings.update_available || false;
const lastUpdateCheck = settings.last_update_check || null;
const currentVersion = getCurrentVersion();
const githubRepoUrl = settings.githubRepoUrl || DEFAULT_GITHUB_REPO;
const { owner, repo } = parseGitHubRepo(githubRepoUrl);
let latestRelease = null;
let latestCommit = null;
let commitDifference = null;
// Fetch fresh GitHub data if we have valid owner/repo
if (owner && repo) {
try {
const [releaseData, commitData, differenceData] = await Promise.all([
getLatestRelease(owner, repo),
getLatestCommit(owner, repo),
getCommitDifference(owner, repo, currentVersion),
]);
latestRelease = releaseData;
latestCommit = commitData;
commitDifference = differenceData;
} catch (githubError) {
console.warn(
"Failed to fetch fresh GitHub data:",
githubError.message,
);
// Provide fallback data when GitHub API is rate-limited
if (
githubError.message.includes("rate limit") ||
githubError.message.includes("API rate limit")
) {
console.log("GitHub API rate limited, providing fallback data");
latestRelease = {
tagName: "v1.2.8",
version: "1.2.8",
publishedAt: "2025-10-02T17:12:53Z",
htmlUrl:
"https://github.com/PatchMon/PatchMon/releases/tag/v1.2.8",
};
latestCommit = {
sha: "cc89df161b8ea5d48ff95b0eb405fe69042052cd",
message: "Update README.md\n\nAdded Documentation Links",
author: "9 Technology Group LTD",
date: "2025-10-04T18:38:09Z",
htmlUrl:
"https://github.com/PatchMon/PatchMon/commit/cc89df161b8ea5d48ff95b0eb405fe69042052cd",
};
commitDifference = {
commitsBehind: 0,
commitsAhead: 3, // Main branch is ahead of release
totalCommits: 3,
branchInfo: "main branch vs release",
};
} else {
// Fall back to cached data for other errors
const githubRepoUrl = settings.githubRepoUrl || DEFAULT_GITHUB_REPO;
latestRelease = settings.latest_version
? {
version: settings.latest_version,
tagName: `v${settings.latest_version}`,
publishedAt: null, // Only use date from GitHub API, not cached data
htmlUrl: `${githubRepoUrl.replace(/\.git$/, "")}/releases/tag/v${settings.latest_version}`,
}
: null;
}
}
}
const latestVersion =
latestRelease?.version || settings.latest_version || currentVersion;
const isUpdateAvailable = latestRelease
? compareVersions(latestVersion, currentVersion) > 0
: settings.update_available || false;
res.json({
currentVersion,
latestVersion,
isUpdateAvailable,
lastUpdateCheck: settings.last_update_check || null,
repositoryType: settings.repository_type || "public",
github: {
repository: githubRepoUrl,
owner: owner,
repo: repo,
latestRelease: latestRelease,
latestCommit: latestCommit,
commitDifference: commitDifference,
},
});
} catch (error) {


@@ -0,0 +1,143 @@
const express = require("express");
const { authenticateToken } = require("../middleware/auth");
const {
getConnectionInfo,
subscribeToConnectionChanges,
} = require("../services/agentWs");
const {
validate_session,
update_session_activity,
} = require("../utils/session_manager");
const router = express.Router();
// Get WebSocket connection status by api_id (no database access - pure memory lookup)
router.get("/status/:apiId", authenticateToken, async (req, res) => {
try {
const { apiId } = req.params;
// Direct in-memory check - no database query needed
const connectionInfo = getConnectionInfo(apiId);
// Minimal response for maximum speed
res.json({
success: true,
data: connectionInfo,
});
} catch (error) {
console.error("Error fetching WebSocket status:", error);
res.status(500).json({
success: false,
error: "Failed to fetch WebSocket status",
});
}
});
// Server-Sent Events endpoint for real-time status updates (no polling needed!)
router.get("/status/:apiId/stream", async (req, res) => {
try {
const { apiId } = req.params;
// Manual authentication for SSE (EventSource doesn't support custom headers)
const token =
req.query.token || req.headers.authorization?.replace("Bearer ", "");
if (!token) {
return res.status(401).json({ error: "Authentication required" });
}
// Verify token manually with session validation
const jwt = require("jsonwebtoken");
try {
const decoded = jwt.verify(token, process.env.JWT_SECRET);
// Validate session (same as regular auth middleware)
const validation = await validate_session(decoded.sessionId, token);
if (!validation.valid) {
console.error("[SSE] Session validation failed:", validation.reason);
console.error("[SSE] Invalid session for api_id:", apiId);
return res.status(401).json({ error: "Invalid or expired session" });
}
// Update session activity to prevent inactivity timeout
await update_session_activity(decoded.sessionId);
req.user = validation.user;
} catch (err) {
console.error("[SSE] JWT verification failed:", err.message);
console.error("[SSE] Invalid token for api_id:", apiId);
return res.status(401).json({ error: "Invalid or expired token" });
}
console.log("[SSE] Client connected for api_id:", apiId);
// Set headers for SSE
res.setHeader("Content-Type", "text/event-stream");
res.setHeader("Cache-Control", "no-cache");
res.setHeader("Connection", "keep-alive");
res.setHeader("X-Accel-Buffering", "no"); // Disable nginx buffering
// Send initial status immediately
const initialInfo = getConnectionInfo(apiId);
res.write(`data: ${JSON.stringify(initialInfo)}\n\n`);
res.flushHeaders(); // Ensure headers are sent immediately
// Subscribe to connection changes for this specific api_id
const unsubscribe = subscribeToConnectionChanges(apiId, (_connected) => {
try {
// Push update to client instantly when status changes
const connectionInfo = getConnectionInfo(apiId);
console.log(
`[SSE] Pushing status change for ${apiId}: connected=${connectionInfo.connected} secure=${connectionInfo.secure}`,
);
res.write(`data: ${JSON.stringify(connectionInfo)}\n\n`);
} catch (err) {
console.error("[SSE] Error writing to stream:", err);
}
});
// Heartbeat to keep connection alive (every 30 seconds)
const heartbeat = setInterval(() => {
try {
res.write(": heartbeat\n\n");
} catch (err) {
console.error("[SSE] Error writing heartbeat:", err);
clearInterval(heartbeat);
}
}, 30000);
// Cleanup on client disconnect
req.on("close", () => {
console.log("[SSE] Client disconnected for api_id:", apiId);
clearInterval(heartbeat);
unsubscribe();
});
// Handle errors - distinguish between different error types
req.on("error", (err) => {
// Only log non-connection-reset errors to reduce noise
if (err.code !== "ECONNRESET" && err.code !== "EPIPE") {
console.error("[SSE] Request error:", err);
} else {
console.log("[SSE] Client connection reset for api_id:", apiId);
}
clearInterval(heartbeat);
unsubscribe();
});
// Handle response errors
res.on("error", (err) => {
if (err.code !== "ECONNRESET" && err.code !== "EPIPE") {
console.error("[SSE] Response error:", err);
}
clearInterval(heartbeat);
unsubscribe();
});
} catch (error) {
console.error("[SSE] Unexpected error:", error);
if (!res.headersSent) {
res.status(500).json({ error: "Internal server error" });
}
}
});
module.exports = router;
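A minimal browser-side sketch for the SSE stream above; EventSource cannot set an Authorization header, which is why the route accepts the JWT as a query parameter. The /api/v1/ws prefix comes from the mount in server.js below; apiId and token are placeholders:

const source = new EventSource(
  `/api/v1/ws/status/${apiId}/stream?token=${encodeURIComponent(token)}`,
);
source.onmessage = (event) => {
  const { connected, secure } = JSON.parse(event.data);
  console.log(`agent ${apiId}: connected=${connected} secure=${secure}`);
};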


@@ -1,8 +1,45 @@
require("dotenv").config();
// Validate required environment variables on startup
function validateEnvironmentVariables() {
const requiredVars = {
JWT_SECRET: "Required for secure authentication token generation",
DATABASE_URL: "Required for database connection",
};
const missing = [];
// Check required variables
for (const [varName, description] of Object.entries(requiredVars)) {
if (!process.env[varName]) {
missing.push(`${varName}: ${description}`);
}
}
// Fail if required variables are missing
if (missing.length > 0) {
console.error("❌ Missing required environment variables:");
for (const error of missing) {
console.error(` - ${error}`);
}
console.error("");
console.error(
"Please set these environment variables and restart the application.",
);
process.exit(1);
}
console.log("✅ Environment variable validation passed");
}
// Validate environment variables before importing any modules that depend on them
validateEnvironmentVariables();
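// A minimal environment sketch that satisfies the check above (values are placeholders):
//   JWT_SECRET=change-me-to-a-long-random-string
//   DATABASE_URL=postgresql://patchmon:patchmon@localhost:5432/patchmon
// Optional variables read elsewhere in this file include PORT, TRUST_PROXY,
// ENABLE_LOGGING, JSON_BODY_LIMIT and NODE_ENV.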
const express = require("express");
const cors = require("cors");
const helmet = require("helmet");
const rateLimit = require("express-rate-limit");
const cookieParser = require("cookie-parser");
const {
createPrismaClient,
waitForDatabase,
@@ -24,9 +61,18 @@ const {
const repositoryRoutes = require("./routes/repositoryRoutes");
const versionRoutes = require("./routes/versionRoutes");
const tfaRoutes = require("./routes/tfaRoutes");
const updateScheduler = require("./services/updateScheduler");
const searchRoutes = require("./routes/searchRoutes");
const autoEnrollmentRoutes = require("./routes/autoEnrollmentRoutes");
const gethomepageRoutes = require("./routes/gethomepageRoutes");
const automationRoutes = require("./routes/automationRoutes");
const dockerRoutes = require("./routes/dockerRoutes");
const wsRoutes = require("./routes/wsRoutes");
const { initSettings } = require("./services/settingsService");
const { cleanup_expired_sessions } = require("./utils/session_manager");
const { queueManager } = require("./services/automation");
const { authenticateToken, requireAdmin } = require("./middleware/auth");
const { createBullBoard } = require("@bull-board/api");
const { BullMQAdapter } = require("@bull-board/api/bullMQAdapter");
const { ExpressAdapter } = require("@bull-board/express");
// Initialize Prisma client with optimized connection pooling for multiple instances
const prisma = createPrismaClient();
@@ -213,6 +259,9 @@ if (process.env.ENABLE_LOGGING === "true") {
const app = express();
const PORT = process.env.PORT || 3001;
const http = require("node:http");
const server = http.createServer(app);
const { init: initAgentWs } = require("./services/agentWs");
// Trust proxy (needed when behind reverse proxy) and remove X-Powered-By
if (process.env.TRUST_PROXY) {
@@ -300,12 +349,17 @@ app.use(
// Allow non-browser/SSR tools with no origin
if (!origin) return callback(null, true);
if (allowedOrigins.includes(origin)) return callback(null, true);
// Allow same-origin requests (e.g., Bull Board accessing its own API)
// This allows http://hostname:3001 to make requests to http://hostname:3001
if (origin?.includes(":3001")) return callback(null, true);
return callback(new Error("Not allowed by CORS"));
},
credentials: true,
}),
);
app.use(limiter);
// Cookie parser for Bull Board sessions
app.use(cookieParser());
// Reduce body size limits to reasonable defaults
app.use(express.json({ limit: process.env.JSON_BODY_LIMIT || "5mb" }));
app.use(
@@ -378,6 +432,132 @@ app.use(`/api/${apiVersion}/dashboard-preferences`, dashboardPreferencesRoutes);
app.use(`/api/${apiVersion}/repositories`, repositoryRoutes);
app.use(`/api/${apiVersion}/version`, versionRoutes);
app.use(`/api/${apiVersion}/tfa`, tfaRoutes);
app.use(`/api/${apiVersion}/search`, searchRoutes);
app.use(
`/api/${apiVersion}/auto-enrollment`,
authLimiter,
autoEnrollmentRoutes,
);
app.use(`/api/${apiVersion}/gethomepage`, gethomepageRoutes);
app.use(`/api/${apiVersion}/automation`, automationRoutes);
app.use(`/api/${apiVersion}/docker`, dockerRoutes);
app.use(`/api/${apiVersion}/ws`, wsRoutes);
// Bull Board - will be populated after queue manager initializes
let bullBoardRouter = null;
const bullBoardSessions = new Map(); // Store authenticated sessions
// Mount Bull Board at /admin instead of /api/v1/admin to avoid path conflicts
app.use(`/admin/queues`, (_req, res, next) => {
// Relax COOP/COEP for Bull Board in non-production to avoid browser warnings
if (process.env.NODE_ENV !== "production") {
res.setHeader("Cross-Origin-Opener-Policy", "same-origin-allow-popups");
res.setHeader("Cross-Origin-Embedder-Policy", "unsafe-none");
}
next();
});
// Authentication middleware for Bull Board
app.use(`/admin/queues`, async (req, res, next) => {
// Skip authentication for static assets only
if (req.path.includes("/static/") || req.path.includes("/favicon")) {
return next();
}
// Check for bull-board-session cookie first
const sessionId = req.cookies["bull-board-session"];
if (sessionId) {
const session = bullBoardSessions.get(sessionId);
if (session && Date.now() - session.timestamp < 3600000) {
// 1 hour
// Valid session, extend it
session.timestamp = Date.now();
return next();
} else if (session) {
// Expired session, remove it
bullBoardSessions.delete(sessionId);
}
}
// No valid session, check for token
let token = req.query.token;
if (!token && req.headers.authorization) {
token = req.headers.authorization.replace("Bearer ", "");
}
// If no token, deny access
if (!token) {
return res.status(401).json({ error: "Access token required" });
}
// Add token to headers for authentication
req.headers.authorization = `Bearer ${token}`;
// Authenticate the user
return authenticateToken(req, res, (err) => {
if (err) {
return res.status(401).json({ error: "Authentication failed" });
}
return requireAdmin(req, res, (adminErr) => {
if (adminErr) {
return res.status(403).json({ error: "Admin access required" });
}
// Authentication successful - create a session
const newSessionId = require("node:crypto")
.randomBytes(32)
.toString("hex");
bullBoardSessions.set(newSessionId, {
timestamp: Date.now(),
userId: req.user.id,
});
// Set session cookie
res.cookie("bull-board-session", newSessionId, {
httpOnly: true,
secure: process.env.NODE_ENV === "production",
sameSite: "lax",
maxAge: 3600000, // 1 hour
});
// Clean up old sessions periodically
if (bullBoardSessions.size > 100) {
const now = Date.now();
for (const [sid, session] of bullBoardSessions.entries()) {
if (now - session.timestamp > 3600000) {
bullBoardSessions.delete(sid);
}
}
}
return next();
});
});
});
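// Hypothetical access pattern for the protected dashboard above: the first request
// carries the admin JWT as a query parameter, the middleware exchanges it for a
// one-hour bull-board-session cookie, and subsequent asset/API requests from the
// UI ride that cookie without needing the token again. For example (adminJwt is a placeholder):
//   window.open(`/admin/queues?token=${adminJwt}`);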
app.use(`/admin/queues`, (req, res, next) => {
if (bullBoardRouter) {
return bullBoardRouter(req, res, next);
}
return res.status(503).json({ error: "Bull Board not initialized yet" });
});
// Error handler specifically for Bull Board routes
app.use("/admin/queues", (err, req, res, _next) => {
console.error("Bull Board error on", req.method, req.url);
console.error("Error details:", err.message);
console.error("Stack:", err.stack);
if (process.env.ENABLE_LOGGING === "true") {
logger.error(`Bull Board error on ${req.method} ${req.url}:`, err);
}
res.status(500).json({
error: "Internal server error",
message: err.message,
path: req.path,
url: req.url,
});
});
// Error handling middleware
app.use((err, _req, res, _next) => {
@@ -400,10 +580,7 @@ process.on("SIGINT", async () => {
if (process.env.ENABLE_LOGGING === "true") {
logger.info("SIGINT received, shutting down gracefully");
}
if (app.locals.session_cleanup_interval) {
clearInterval(app.locals.session_cleanup_interval);
}
updateScheduler.stop();
await queueManager.shutdown();
await disconnectPrisma(prisma);
process.exit(0);
});
@@ -412,10 +589,7 @@ process.on("SIGTERM", async () => {
if (process.env.ENABLE_LOGGING === "true") {
logger.info("SIGTERM received, shutting down gracefully");
}
if (app.locals.session_cleanup_interval) {
clearInterval(app.locals.session_cleanup_interval);
}
updateScheduler.stop();
await queueManager.shutdown();
await disconnectPrisma(prisma);
process.exit(0);
});
@@ -630,11 +804,16 @@ async function getPermissionBasedPreferences(userRole) {
requiredPermission: "can_view_packages",
order: 13,
},
{ cardId: "recentUsers", requiredPermission: "can_view_users", order: 14 },
{
cardId: "packageTrends",
requiredPermission: "can_view_packages",
order: 14,
},
{ cardId: "recentUsers", requiredPermission: "can_view_users", order: 15 },
{
cardId: "quickStats",
requiredPermission: "can_view_dashboard",
order: 15,
order: 16,
},
];
@@ -679,34 +858,40 @@ async function startServer() {
// Initialize dashboard preferences for all users
await initializeDashboardPreferences();
// Initial session cleanup
await cleanup_expired_sessions();
// Initialize BullMQ queue manager
await queueManager.initialize();
// Schedule session cleanup every hour
const session_cleanup_interval = setInterval(
async () => {
try {
await cleanup_expired_sessions();
} catch (error) {
console.error("Session cleanup error:", error);
}
},
60 * 60 * 1000,
); // Every hour
// Schedule recurring jobs
await queueManager.scheduleAllJobs();
app.listen(PORT, () => {
// Set up Bull Board for queue monitoring
const serverAdapter = new ExpressAdapter();
// Set basePath to match where we mount the router
serverAdapter.setBasePath("/admin/queues");
const { QUEUE_NAMES } = require("./services/automation");
const bullAdapters = Object.values(QUEUE_NAMES).map(
(queueName) => new BullMQAdapter(queueManager.queues[queueName]),
);
createBullBoard({
queues: bullAdapters,
serverAdapter: serverAdapter,
});
// Set the router for the Bull Board middleware (secured middleware above)
bullBoardRouter = serverAdapter.getRouter();
console.log("✅ Bull Board mounted at /admin/queues (secured)");
// Initialize WS layer with the underlying HTTP server
initAgentWs(server, prisma);
server.listen(PORT, () => {
if (process.env.ENABLE_LOGGING === "true") {
logger.info(`Server running on port ${PORT}`);
logger.info(`Environment: ${process.env.NODE_ENV}`);
logger.info("✅ Session cleanup scheduled (every hour)");
}
// Start update scheduler
updateScheduler.start();
});
// Store interval for cleanup on shutdown
app.locals.session_cleanup_interval = session_cleanup_interval;
} catch (error) {
console.error("❌ Failed to start server:", error.message);
process.exit(1);


@@ -0,0 +1,190 @@
// Lightweight WebSocket hub for agent connections
// Auth: X-API-ID / X-API-KEY headers on the upgrade request
const WebSocket = require("ws");
const url = require("node:url");
// Connection registry by api_id
const apiIdToSocket = new Map();
// Connection metadata (secure/insecure)
// Map<api_id, { ws: WebSocket, secure: boolean }>
const connectionMetadata = new Map();
// Subscribers for connection status changes (for SSE)
// Map<api_id, Set<callback>>
const connectionChangeSubscribers = new Map();
let wss;
let prisma;
function init(server, prismaClient) {
prisma = prismaClient;
wss = new WebSocket.Server({ noServer: true });
// Handle HTTP upgrade events and authenticate before accepting WS
server.on("upgrade", async (request, socket, head) => {
try {
const { pathname } = url.parse(request.url);
if (!pathname || !pathname.startsWith("/api/")) {
socket.destroy();
return;
}
// Expected path: /api/{v}/agents/ws
const parts = pathname.split("/").filter(Boolean); // [api, v1, agents, ws]
if (parts.length !== 4 || parts[2] !== "agents" || parts[3] !== "ws") {
socket.destroy();
return;
}
const apiId = request.headers["x-api-id"];
const apiKey = request.headers["x-api-key"];
if (!apiId || !apiKey) {
socket.destroy();
return;
}
// Validate credentials
const host = await prisma.hosts.findUnique({ where: { api_id: apiId } });
if (!host || host.api_key !== apiKey) {
socket.destroy();
return;
}
wss.handleUpgrade(request, socket, head, (ws) => {
ws.apiId = apiId;
// Detect if connection is secure (wss://) or not (ws://)
const isSecure =
socket.encrypted || request.headers["x-forwarded-proto"] === "https";
apiIdToSocket.set(apiId, ws);
connectionMetadata.set(apiId, { ws, secure: isSecure });
console.log(
`[agent-ws] connected api_id=${apiId} protocol=${isSecure ? "wss" : "ws"} total=${apiIdToSocket.size}`,
);
// Notify subscribers of connection
notifyConnectionChange(apiId, true);
ws.on("message", () => {
// Currently we don't need to handle agent->server messages
});
ws.on("close", () => {
const existing = apiIdToSocket.get(apiId);
if (existing === ws) {
apiIdToSocket.delete(apiId);
connectionMetadata.delete(apiId);
// Notify subscribers of disconnection
notifyConnectionChange(apiId, false);
}
console.log(
`[agent-ws] disconnected api_id=${apiId} total=${apiIdToSocket.size}`,
);
});
// Optional: greet/ack
safeSend(ws, JSON.stringify({ type: "connected" }));
});
} catch (_err) {
try {
socket.destroy();
} catch {
/* ignore */
}
}
});
}
function safeSend(ws, data) {
if (ws && ws.readyState === WebSocket.OPEN) {
try {
ws.send(data);
} catch {
/* ignore */
}
}
}
function broadcastSettingsUpdate(newInterval) {
const payload = JSON.stringify({
type: "settings_update",
update_interval: newInterval,
});
for (const [, ws] of apiIdToSocket) {
safeSend(ws, payload);
}
}
function pushReportNow(apiId) {
const ws = apiIdToSocket.get(apiId);
safeSend(ws, JSON.stringify({ type: "report_now" }));
}
function pushSettingsUpdate(apiId, newInterval) {
const ws = apiIdToSocket.get(apiId);
safeSend(
ws,
JSON.stringify({ type: "settings_update", update_interval: newInterval }),
);
}
// Notify all subscribers when connection status changes
function notifyConnectionChange(apiId, connected) {
const subscribers = connectionChangeSubscribers.get(apiId);
if (subscribers) {
for (const callback of subscribers) {
try {
callback(connected);
} catch (err) {
console.error(`[agent-ws] error notifying subscriber:`, err);
}
}
}
}
// Subscribe to connection status changes for a specific api_id
function subscribeToConnectionChanges(apiId, callback) {
if (!connectionChangeSubscribers.has(apiId)) {
connectionChangeSubscribers.set(apiId, new Set());
}
connectionChangeSubscribers.get(apiId).add(callback);
// Return unsubscribe function
return () => {
const subscribers = connectionChangeSubscribers.get(apiId);
if (subscribers) {
subscribers.delete(callback);
if (subscribers.size === 0) {
connectionChangeSubscribers.delete(apiId);
}
}
};
}
module.exports = {
init,
broadcastSettingsUpdate,
pushReportNow,
pushSettingsUpdate,
// Expose read-only view of connected agents
getConnectedApiIds: () => Array.from(apiIdToSocket.keys()),
isConnected: (apiId) => {
const ws = apiIdToSocket.get(apiId);
return !!ws && ws.readyState === WebSocket.OPEN;
},
// Get connection info including protocol (ws/wss)
getConnectionInfo: (apiId) => {
const metadata = connectionMetadata.get(apiId);
if (!metadata) {
return { connected: false, secure: false };
}
const connected = metadata.ws.readyState === WebSocket.OPEN;
return { connected, secure: metadata.secure };
},
// Subscribe to connection status changes (for SSE)
subscribeToConnectionChanges,
};
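A minimal agent-side sketch of the handshake this hub expects; the real agent may be implemented differently, and the hostname, apiId and apiKey below are placeholders:

const WebSocket = require("ws");
const ws = new WebSocket("wss://patchmon.example.com/api/v1/agents/ws", {
  headers: { "X-API-ID": apiId, "X-API-KEY": apiKey },
});
ws.on("message", (raw) => {
  const msg = JSON.parse(raw);
  // Message types pushed by the hub above: "connected", "settings_update", "report_now".
});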


@@ -0,0 +1,153 @@
const { prisma } = require("./shared/prisma");
const { compareVersions, checkPublicRepo } = require("./shared/utils");
/**
* GitHub Update Check Automation
* Checks for new releases on GitHub using HTTPS API
*/
class GitHubUpdateCheck {
constructor(queueManager) {
this.queueManager = queueManager;
this.queueName = "github-update-check";
}
/**
* Process GitHub update check job
*/
async process(_job) {
const startTime = Date.now();
console.log("🔍 Starting GitHub update check...");
try {
// Get settings
const settings = await prisma.settings.findFirst();
const DEFAULT_GITHUB_REPO = "https://github.com/patchMon/patchmon";
const repoUrl = settings?.githubRepoUrl || DEFAULT_GITHUB_REPO;
let owner, repo;
// Parse GitHub repository URL (supports both HTTPS and SSH formats)
if (repoUrl.includes("git@github.com:")) {
const match = repoUrl.match(/git@github\.com:([^/]+)\/([^/]+)\.git/);
if (match) {
[, owner, repo] = match;
}
} else if (repoUrl.includes("github.com/")) {
const match = repoUrl.match(
/github\.com\/([^/]+)\/([^/]+?)(?:\.git)?$/,
);
if (match) {
[, owner, repo] = match;
}
}
if (!owner || !repo) {
throw new Error("Could not parse GitHub repository URL");
}
// Always use HTTPS GitHub API (simpler and more reliable)
const latestVersion = await checkPublicRepo(owner, repo);
if (!latestVersion) {
throw new Error("Could not determine latest version");
}
// Read version from package.json
let currentVersion = "1.2.7"; // fallback
try {
const packageJson = require("../../../package.json");
if (packageJson?.version) {
currentVersion = packageJson.version;
}
} catch (packageError) {
console.warn(
"Could not read version from package.json:",
packageError.message,
);
}
const isUpdateAvailable =
compareVersions(latestVersion, currentVersion) > 0;
// Update settings with check results
await prisma.settings.update({
where: { id: settings.id },
data: {
last_update_check: new Date(),
update_available: isUpdateAvailable,
latest_version: latestVersion,
},
});
const executionTime = Date.now() - startTime;
console.log(
`✅ GitHub update check completed in ${executionTime}ms - Current: ${currentVersion}, Latest: ${latestVersion}, Update Available: ${isUpdateAvailable}`,
);
return {
success: true,
currentVersion,
latestVersion,
isUpdateAvailable,
executionTime,
};
} catch (error) {
const executionTime = Date.now() - startTime;
console.error(
`❌ GitHub update check failed after ${executionTime}ms:`,
error.message,
);
// Update last check time even on error
try {
const settings = await prisma.settings.findFirst();
if (settings) {
await prisma.settings.update({
where: { id: settings.id },
data: {
last_update_check: new Date(),
update_available: false,
},
});
}
} catch (updateError) {
console.error(
"❌ Error updating last check time:",
updateError.message,
);
}
throw error;
}
}
/**
* Schedule recurring GitHub update check (daily at midnight)
*/
async schedule() {
const job = await this.queueManager.queues[this.queueName].add(
"github-update-check",
{},
{
repeat: { cron: "0 0 * * *" }, // Daily at midnight
jobId: "github-update-check-recurring",
},
);
console.log("✅ GitHub update check scheduled");
return job;
}
/**
* Trigger manual GitHub update check
*/
async triggerManual() {
const job = await this.queueManager.queues[this.queueName].add(
"github-update-check-manual",
{},
{ priority: 1 },
);
console.log("✅ Manual GitHub update check triggered");
return job;
}
}
module.exports = GitHubUpdateCheck;
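// Wiring sketch: the queue manager below registers this class under
// QUEUE_NAMES.GITHUB_UPDATE_CHECK, so an on-demand check is just
//   await queueManager.triggerGitHubUpdateCheck();
// which calls triggerManual() above and enqueues a priority job on the same queue.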


@@ -0,0 +1,517 @@
const { Queue, Worker } = require("bullmq");
const { redis, redisConnection } = require("./shared/redis");
const { prisma } = require("./shared/prisma");
const agentWs = require("../agentWs");
// Import automation classes
const GitHubUpdateCheck = require("./githubUpdateCheck");
const SessionCleanup = require("./sessionCleanup");
const OrphanedRepoCleanup = require("./orphanedRepoCleanup");
const OrphanedPackageCleanup = require("./orphanedPackageCleanup");
// Queue names
const QUEUE_NAMES = {
GITHUB_UPDATE_CHECK: "github-update-check",
SESSION_CLEANUP: "session-cleanup",
ORPHANED_REPO_CLEANUP: "orphaned-repo-cleanup",
ORPHANED_PACKAGE_CLEANUP: "orphaned-package-cleanup",
AGENT_COMMANDS: "agent-commands",
};
/**
* Main Queue Manager
* Manages all BullMQ queues and workers
*/
class QueueManager {
constructor() {
this.queues = {};
this.workers = {};
this.automations = {};
this.isInitialized = false;
}
/**
* Initialize all queues, workers, and automations
*/
async initialize() {
try {
console.log("✅ Redis connection successful");
// Initialize queues
await this.initializeQueues();
// Initialize automation classes
await this.initializeAutomations();
// Initialize workers
await this.initializeWorkers();
// Setup event listeners
this.setupEventListeners();
this.isInitialized = true;
console.log("✅ Queue manager initialized successfully");
} catch (error) {
console.error("❌ Failed to initialize queue manager:", error.message);
throw error;
}
}
/**
* Initialize all queues
*/
async initializeQueues() {
for (const [_key, queueName] of Object.entries(QUEUE_NAMES)) {
this.queues[queueName] = new Queue(queueName, {
connection: redisConnection,
defaultJobOptions: {
removeOnComplete: 50, // Keep last 50 completed jobs
removeOnFail: 20, // Keep last 20 failed jobs
attempts: 3, // Retry failed jobs 3 times
backoff: {
type: "exponential",
delay: 2000,
},
},
});
console.log(`✅ Queue '${queueName}' initialized`);
}
}
/**
* Initialize automation classes
*/
async initializeAutomations() {
this.automations[QUEUE_NAMES.GITHUB_UPDATE_CHECK] = new GitHubUpdateCheck(
this,
);
this.automations[QUEUE_NAMES.SESSION_CLEANUP] = new SessionCleanup(this);
this.automations[QUEUE_NAMES.ORPHANED_REPO_CLEANUP] =
new OrphanedRepoCleanup(this);
this.automations[QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP] =
new OrphanedPackageCleanup(this);
console.log("✅ All automation classes initialized");
}
/**
* Initialize all workers
*/
async initializeWorkers() {
// GitHub Update Check Worker
this.workers[QUEUE_NAMES.GITHUB_UPDATE_CHECK] = new Worker(
QUEUE_NAMES.GITHUB_UPDATE_CHECK,
this.automations[QUEUE_NAMES.GITHUB_UPDATE_CHECK].process.bind(
this.automations[QUEUE_NAMES.GITHUB_UPDATE_CHECK],
),
{
connection: redisConnection,
concurrency: 1,
},
);
// Session Cleanup Worker
this.workers[QUEUE_NAMES.SESSION_CLEANUP] = new Worker(
QUEUE_NAMES.SESSION_CLEANUP,
this.automations[QUEUE_NAMES.SESSION_CLEANUP].process.bind(
this.automations[QUEUE_NAMES.SESSION_CLEANUP],
),
{
connection: redisConnection,
concurrency: 1,
},
);
// Orphaned Repo Cleanup Worker
this.workers[QUEUE_NAMES.ORPHANED_REPO_CLEANUP] = new Worker(
QUEUE_NAMES.ORPHANED_REPO_CLEANUP,
this.automations[QUEUE_NAMES.ORPHANED_REPO_CLEANUP].process.bind(
this.automations[QUEUE_NAMES.ORPHANED_REPO_CLEANUP],
),
{
connection: redisConnection,
concurrency: 1,
},
);
// Orphaned Package Cleanup Worker
this.workers[QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP] = new Worker(
QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP,
this.automations[QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP].process.bind(
this.automations[QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP],
),
{
connection: redisConnection,
concurrency: 1,
},
);
// Agent Commands Worker
this.workers[QUEUE_NAMES.AGENT_COMMANDS] = new Worker(
QUEUE_NAMES.AGENT_COMMANDS,
async (job) => {
const { api_id, type, update_interval } = job.data || {};
console.log("[agent-commands] processing job", job.id, api_id, type);
// Log job attempt to history - use job.id as the unique identifier
const attemptNumber = job.attemptsMade || 1;
const historyId = job.id; // Single row per job, updated with each attempt
try {
if (!api_id || !type) {
throw new Error("invalid job data");
}
// Find host by api_id
const host = await prisma.hosts.findUnique({
where: { api_id },
select: { id: true },
});
// Ensure agent is connected; if not, retry later
if (!agentWs.isConnected(api_id)) {
const error = new Error("agent not connected");
// Log failed attempt
await prisma.job_history.upsert({
where: { id: historyId },
create: {
id: historyId,
job_id: job.id,
queue_name: QUEUE_NAMES.AGENT_COMMANDS,
job_name: type,
host_id: host?.id,
api_id,
status: "failed",
attempt_number: attemptNumber,
error_message: error.message,
created_at: new Date(),
updated_at: new Date(),
},
update: {
status: "failed",
attempt_number: attemptNumber,
error_message: error.message,
updated_at: new Date(),
},
});
console.log(
"[agent-commands] agent not connected, will retry",
api_id,
);
throw error;
}
// Process the command
let result;
if (type === "settings_update") {
agentWs.pushSettingsUpdate(api_id, update_interval);
console.log(
"[agent-commands] delivered settings_update",
api_id,
update_interval,
);
result = { delivered: true, update_interval };
} else if (type === "report_now") {
agentWs.pushReportNow(api_id);
console.log("[agent-commands] delivered report_now", api_id);
result = { delivered: true };
} else {
throw new Error("unsupported agent command");
}
// Log successful completion
await prisma.job_history.upsert({
where: { id: historyId },
create: {
id: historyId,
job_id: job.id,
queue_name: QUEUE_NAMES.AGENT_COMMANDS,
job_name: type,
host_id: host?.id,
api_id,
status: "completed",
attempt_number: attemptNumber,
output: result,
created_at: new Date(),
updated_at: new Date(),
completed_at: new Date(),
},
update: {
status: "completed",
attempt_number: attemptNumber,
output: result,
error_message: null,
updated_at: new Date(),
completed_at: new Date(),
},
});
return result;
} catch (error) {
// Log error to history (if not already logged above)
if (error.message !== "agent not connected") {
const host = await prisma.hosts
.findUnique({
where: { api_id },
select: { id: true },
})
.catch(() => null);
await prisma.job_history
.upsert({
where: { id: historyId },
create: {
id: historyId,
job_id: job.id,
queue_name: QUEUE_NAMES.AGENT_COMMANDS,
job_name: type || "unknown",
host_id: host?.id,
api_id,
status: "failed",
attempt_number: attemptNumber,
error_message: error.message,
created_at: new Date(),
updated_at: new Date(),
},
update: {
status: "failed",
attempt_number: attemptNumber,
error_message: error.message,
updated_at: new Date(),
},
})
.catch((err) =>
console.error("[agent-commands] failed to log error:", err),
);
}
throw error;
}
},
{
connection: redisConnection,
concurrency: 10,
},
);
// Add error handling for all workers
Object.values(this.workers).forEach((worker) => {
worker.on("error", (error) => {
console.error("Worker error:", error);
});
});
console.log("✅ All workers initialized");
}
/**
* Setup event listeners for all queues
*/
setupEventListeners() {
for (const queueName of Object.values(QUEUE_NAMES)) {
const queue = this.queues[queueName];
queue.on("error", (error) => {
console.error(`❌ Queue '${queueName}' experienced an error:`, error);
});
queue.on("failed", (job, err) => {
console.error(
`❌ Job '${job.id}' in queue '${queueName}' failed:`,
err,
);
});
queue.on("completed", (job) => {
console.log(`✅ Job '${job.id}' in queue '${queueName}' completed.`);
});
}
console.log("✅ Queue events initialized");
}
/**
* Schedule all recurring jobs
*/
async scheduleAllJobs() {
await this.automations[QUEUE_NAMES.GITHUB_UPDATE_CHECK].schedule();
await this.automations[QUEUE_NAMES.SESSION_CLEANUP].schedule();
await this.automations[QUEUE_NAMES.ORPHANED_REPO_CLEANUP].schedule();
await this.automations[QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP].schedule();
}
/**
* Manual job triggers
*/
async triggerGitHubUpdateCheck() {
return this.automations[QUEUE_NAMES.GITHUB_UPDATE_CHECK].triggerManual();
}
async triggerSessionCleanup() {
return this.automations[QUEUE_NAMES.SESSION_CLEANUP].triggerManual();
}
async triggerOrphanedRepoCleanup() {
return this.automations[QUEUE_NAMES.ORPHANED_REPO_CLEANUP].triggerManual();
}
async triggerOrphanedPackageCleanup() {
return this.automations[
QUEUE_NAMES.ORPHANED_PACKAGE_CLEANUP
].triggerManual();
}
/**
* Get queue statistics
*/
async getQueueStats(queueName) {
const queue = this.queues[queueName];
if (!queue) {
throw new Error(`Queue ${queueName} not found`);
}
const [waiting, active, completed, failed, delayed] = await Promise.all([
queue.getWaiting(),
queue.getActive(),
queue.getCompleted(),
queue.getFailed(),
queue.getDelayed(),
]);
return {
waiting: waiting.length,
active: active.length,
completed: completed.length,
failed: failed.length,
delayed: delayed.length,
};
}
/**
* Get all queue statistics
*/
async getAllQueueStats() {
const stats = {};
for (const queueName of Object.values(QUEUE_NAMES)) {
stats[queueName] = await this.getQueueStats(queueName);
}
return stats;
}
/**
* Get recent jobs for a queue
*/
async getRecentJobs(queueName, limit = 10) {
const queue = this.queues[queueName];
if (!queue) {
throw new Error(`Queue ${queueName} not found`);
}
const [completed, failed] = await Promise.all([
queue.getCompleted(0, limit - 1),
queue.getFailed(0, limit - 1),
]);
return [...completed, ...failed]
.sort((a, b) => new Date(b.finishedOn) - new Date(a.finishedOn))
.slice(0, limit);
}
/**
* Get jobs for a specific host (by API ID)
*/
async getHostJobs(apiId, limit = 20) {
const queue = this.queues[QUEUE_NAMES.AGENT_COMMANDS];
if (!queue) {
throw new Error(`Queue ${QUEUE_NAMES.AGENT_COMMANDS} not found`);
}
console.log(`[getHostJobs] Looking for jobs with api_id: ${apiId}`);
// Get active queue status (waiting, active, delayed, failed)
const [waiting, active, delayed, failed] = await Promise.all([
queue.getWaiting(),
queue.getActive(),
queue.getDelayed(),
queue.getFailed(),
]);
// Filter by API ID
const filterByApiId = (jobs) =>
jobs.filter((job) => job.data && job.data.api_id === apiId);
const waitingCount = filterByApiId(waiting).length;
const activeCount = filterByApiId(active).length;
const delayedCount = filterByApiId(delayed).length;
const failedCount = filterByApiId(failed).length;
console.log(
`[getHostJobs] Queue status - Waiting: ${waitingCount}, Active: ${activeCount}, Delayed: ${delayedCount}, Failed: ${failedCount}`,
);
// Get job history from database (shows all attempts and status changes)
const jobHistory = await prisma.job_history.findMany({
where: {
api_id: apiId,
},
orderBy: {
created_at: "desc",
},
take: limit,
});
console.log(
`[getHostJobs] Found ${jobHistory.length} job history records for api_id: ${apiId}`,
);
return {
waiting: waitingCount,
active: activeCount,
delayed: delayedCount,
failed: failedCount,
jobHistory: jobHistory.map((job) => ({
id: job.id,
job_id: job.job_id,
job_name: job.job_name,
status: job.status,
attempt_number: job.attempt_number,
error_message: job.error_message,
output: job.output,
created_at: job.created_at,
updated_at: job.updated_at,
completed_at: job.completed_at,
})),
};
}
/**
* Graceful shutdown
*/
async shutdown() {
console.log("🛑 Shutting down queue manager...");
for (const queueName of Object.keys(this.queues)) {
try {
await this.queues[queueName].close();
} catch (e) {
console.warn(
`⚠️ Failed to close queue '${queueName}':`,
e?.message || e,
);
}
if (this.workers?.[queueName]) {
try {
await this.workers[queueName].close();
} catch (e) {
console.warn(
`⚠️ Failed to close worker for '${queueName}':`,
e?.message || e,
);
}
}
}
await redis.quit();
console.log("✅ Queue manager shutdown complete");
}
}
const queueManager = new QueueManager();
module.exports = { queueManager, QUEUE_NAMES };
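As a rough illustration of how this exported singleton could be consumed by the server, the sketch below assumes a top-level `initialize()` convenience method that runs the `initializeQueues`/`initializeAutomations`/`initializeWorkers`/`setupEventListeners`/`scheduleAllJobs` steps shown above; the method name and the require path are assumptions, not the project's actual bootstrap code.

```js
// Hypothetical bootstrap sketch - the real server wiring may differ.
const { queueManager, QUEUE_NAMES } = require("./queueManager");

async function startAutomation() {
  // Assumed convenience method wrapping the initialization steps shown above.
  await queueManager.initialize();

  // Manual triggers and stats use methods defined in the class above.
  await queueManager.triggerSessionCleanup();
  const stats = await queueManager.getQueueStats(QUEUE_NAMES.SESSION_CLEANUP);
  console.log("session-cleanup queue:", stats);
}

startAutomation().catch((err) => {
  console.error("Failed to start automation:", err);
  process.exit(1);
});
```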

View File

@@ -0,0 +1,116 @@
const { prisma } = require("./shared/prisma");
/**
* Orphaned Package Cleanup Automation
* Removes packages with no associated hosts
*/
class OrphanedPackageCleanup {
constructor(queueManager) {
this.queueManager = queueManager;
this.queueName = "orphaned-package-cleanup";
}
/**
* Process orphaned package cleanup job
*/
async process(_job) {
const startTime = Date.now();
console.log("🧹 Starting orphaned package cleanup...");
try {
// Find packages with 0 hosts
const orphanedPackages = await prisma.packages.findMany({
where: {
host_packages: {
none: {},
},
},
include: {
_count: {
select: {
host_packages: true,
},
},
},
});
let deletedCount = 0;
const deletedPackages = [];
// Delete orphaned packages
for (const pkg of orphanedPackages) {
try {
await prisma.packages.delete({
where: { id: pkg.id },
});
deletedCount++;
deletedPackages.push({
id: pkg.id,
name: pkg.name,
description: pkg.description,
category: pkg.category,
latest_version: pkg.latest_version,
});
console.log(
`🗑️ Deleted orphaned package: ${pkg.name} (${pkg.latest_version})`,
);
} catch (deleteError) {
console.error(
`❌ Failed to delete package ${pkg.id}:`,
deleteError.message,
);
}
}
const executionTime = Date.now() - startTime;
console.log(
`✅ Orphaned package cleanup completed in ${executionTime}ms - Deleted ${deletedCount} packages`,
);
return {
success: true,
deletedCount,
deletedPackages,
executionTime,
};
} catch (error) {
const executionTime = Date.now() - startTime;
console.error(
`❌ Orphaned package cleanup failed after ${executionTime}ms:`,
error.message,
);
throw error;
}
}
/**
* Schedule recurring orphaned package cleanup (daily at 3 AM)
*/
async schedule() {
const job = await this.queueManager.queues[this.queueName].add(
"orphaned-package-cleanup",
{},
{
repeat: { cron: "0 3 * * *" }, // Daily at 3 AM
jobId: "orphaned-package-cleanup-recurring",
},
);
console.log("✅ Orphaned package cleanup scheduled");
return job;
}
/**
* Trigger manual orphaned package cleanup
*/
async triggerManual() {
const job = await this.queueManager.queues[this.queueName].add(
"orphaned-package-cleanup-manual",
{},
{ priority: 1 },
);
console.log("✅ Manual orphaned package cleanup triggered");
return job;
}
}
module.exports = OrphanedPackageCleanup;

View File

@@ -0,0 +1,114 @@
const { prisma } = require("./shared/prisma");
/**
* Orphaned Repository Cleanup Automation
* Removes repositories with no associated hosts
*/
class OrphanedRepoCleanup {
constructor(queueManager) {
this.queueManager = queueManager;
this.queueName = "orphaned-repo-cleanup";
}
/**
* Process orphaned repository cleanup job
*/
async process(_job) {
const startTime = Date.now();
console.log("🧹 Starting orphaned repository cleanup...");
try {
// Find repositories with 0 hosts
const orphanedRepos = await prisma.repositories.findMany({
where: {
host_repositories: {
none: {},
},
},
include: {
_count: {
select: {
host_repositories: true,
},
},
},
});
let deletedCount = 0;
const deletedRepos = [];
// Delete orphaned repositories
for (const repo of orphanedRepos) {
try {
await prisma.repositories.delete({
where: { id: repo.id },
});
deletedCount++;
deletedRepos.push({
id: repo.id,
name: repo.name,
url: repo.url,
});
console.log(
`🗑️ Deleted orphaned repository: ${repo.name} (${repo.url})`,
);
} catch (deleteError) {
console.error(
`❌ Failed to delete repository ${repo.id}:`,
deleteError.message,
);
}
}
const executionTime = Date.now() - startTime;
console.log(
`✅ Orphaned repository cleanup completed in ${executionTime}ms - Deleted ${deletedCount} repositories`,
);
return {
success: true,
deletedCount,
deletedRepos,
executionTime,
};
} catch (error) {
const executionTime = Date.now() - startTime;
console.error(
`❌ Orphaned repository cleanup failed after ${executionTime}ms:`,
error.message,
);
throw error;
}
}
/**
* Schedule recurring orphaned repository cleanup (daily at 2 AM)
*/
async schedule() {
const job = await this.queueManager.queues[this.queueName].add(
"orphaned-repo-cleanup",
{},
{
repeat: { cron: "0 2 * * *" }, // Daily at 2 AM
jobId: "orphaned-repo-cleanup-recurring",
},
);
console.log("✅ Orphaned repository cleanup scheduled");
return job;
}
/**
* Trigger manual orphaned repository cleanup
*/
async triggerManual() {
const job = await this.queueManager.queues[this.queueName].add(
"orphaned-repo-cleanup-manual",
{},
{ priority: 1 },
);
console.log("✅ Manual orphaned repository cleanup triggered");
return job;
}
}
module.exports = OrphanedRepoCleanup;

View File

@@ -0,0 +1,77 @@
const { prisma } = require("./shared/prisma");
/**
* Session Cleanup Automation
* Cleans up expired user sessions
*/
class SessionCleanup {
constructor(queueManager) {
this.queueManager = queueManager;
this.queueName = "session-cleanup";
}
/**
* Process session cleanup job
*/
async process(_job) {
const startTime = Date.now();
console.log("🧹 Starting session cleanup...");
try {
const result = await prisma.user_sessions.deleteMany({
where: {
OR: [{ expires_at: { lt: new Date() } }, { is_revoked: true }],
},
});
const executionTime = Date.now() - startTime;
console.log(
`✅ Session cleanup completed in ${executionTime}ms - Cleaned up ${result.count} expired sessions`,
);
return {
success: true,
sessionsCleaned: result.count,
executionTime,
};
} catch (error) {
const executionTime = Date.now() - startTime;
console.error(
`❌ Session cleanup failed after ${executionTime}ms:`,
error.message,
);
throw error;
}
}
/**
* Schedule recurring session cleanup (every hour)
*/
async schedule() {
const job = await this.queueManager.queues[this.queueName].add(
"session-cleanup",
{},
{
repeat: { cron: "0 * * * *" }, // Every hour
jobId: "session-cleanup-recurring",
},
);
console.log("✅ Session cleanup scheduled");
return job;
}
/**
* Trigger manual session cleanup
*/
async triggerManual() {
const job = await this.queueManager.queues[this.queueName].add(
"session-cleanup-manual",
{},
{ priority: 1 },
);
console.log("✅ Manual session cleanup triggered");
return job;
}
}
module.exports = SessionCleanup;

View File

@@ -0,0 +1,5 @@
const { PrismaClient } = require("@prisma/client");
const prisma = new PrismaClient();
module.exports = { prisma };

View File

@@ -0,0 +1,16 @@
const IORedis = require("ioredis");
// Redis connection configuration
const redisConnection = {
host: process.env.REDIS_HOST || "localhost",
port: parseInt(process.env.REDIS_PORT, 10) || 6379,
password: process.env.REDIS_PASSWORD || undefined,
db: parseInt(process.env.REDIS_DB, 10) || 0,
retryDelayOnFailover: 100,
maxRetriesPerRequest: null, // BullMQ requires this to be null
};
// Create Redis connection
const redis = new IORedis(redisConnection);
module.exports = { redis, redisConnection };

View File

@@ -0,0 +1,82 @@
// Common utilities for automation jobs
/**
* Compare two semantic versions
* @param {string} version1 - First version
* @param {string} version2 - Second version
* @returns {number} - 1 if version1 > version2, -1 if version1 < version2, 0 if equal
*/
function compareVersions(version1, version2) {
const v1parts = version1.split(".").map(Number);
const v2parts = version2.split(".").map(Number);
const maxLength = Math.max(v1parts.length, v2parts.length);
for (let i = 0; i < maxLength; i++) {
const v1part = v1parts[i] || 0;
const v2part = v2parts[i] || 0;
if (v1part > v2part) return 1;
if (v1part < v2part) return -1;
}
return 0;
}
/**
* Check public GitHub repository for latest release
* @param {string} owner - Repository owner
* @param {string} repo - Repository name
* @returns {Promise<string|null>} - Latest version or null
*/
async function checkPublicRepo(owner, repo) {
try {
const httpsRepoUrl = `https://api.github.com/repos/${owner}/${repo}/releases/latest`;
let currentVersion = "1.2.7"; // fallback
try {
const packageJson = require("../../../package.json");
if (packageJson?.version) {
currentVersion = packageJson.version;
}
} catch (packageError) {
console.warn(
"Could not read version from package.json for User-Agent, using fallback:",
packageError.message,
);
}
const response = await fetch(httpsRepoUrl, {
method: "GET",
headers: {
Accept: "application/vnd.github.v3+json",
"User-Agent": `PatchMon-Server/${currentVersion}`,
},
});
if (!response.ok) {
const errorText = await response.text();
if (
errorText.includes("rate limit") ||
errorText.includes("API rate limit")
) {
console.log("⚠️ GitHub API rate limit exceeded, skipping update check");
return null;
}
throw new Error(
`GitHub API error: ${response.status} ${response.statusText}`,
);
}
const releaseData = await response.json();
return releaseData.tag_name.replace("v", "");
} catch (error) {
console.error("GitHub API error:", error.message);
throw error;
}
}
module.exports = {
compareVersions,
checkPublicRepo,
};
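A minimal usage sketch for these shared helpers follows; the require path and the wrapper function are illustrative, while `compareVersions` and `checkPublicRepo` are the exports shown above.

```js
// Illustrative only - the import path and wrapper function are assumptions.
const { compareVersions, checkPublicRepo } = require("./shared/utils");

async function isUpdateAvailable(currentVersion) {
  // checkPublicRepo resolves to the latest release tag without its "v" prefix,
  // or null when the GitHub API rate limit is hit (handled above).
  const latest = await checkPublicRepo("PatchMon", "PatchMon");
  if (!latest) return false;
  // compareVersions returns 1 when its first argument is the newer version.
  return compareVersions(latest, currentVersion) > 0;
}
```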

View File

@@ -1,290 +0,0 @@
const { PrismaClient } = require("@prisma/client");
const { exec } = require("node:child_process");
const { promisify } = require("node:util");
const prisma = new PrismaClient();
const execAsync = promisify(exec);
class UpdateScheduler {
constructor() {
this.isRunning = false;
this.intervalId = null;
this.checkInterval = 24 * 60 * 60 * 1000; // 24 hours in milliseconds
}
// Start the scheduler
start() {
if (this.isRunning) {
console.log("Update scheduler is already running");
return;
}
console.log("🔄 Starting update scheduler...");
this.isRunning = true;
// Run initial check
this.checkForUpdates();
// Schedule regular checks
this.intervalId = setInterval(() => {
this.checkForUpdates();
}, this.checkInterval);
console.log(
`✅ Update scheduler started - checking every ${this.checkInterval / (60 * 60 * 1000)} hours`,
);
}
// Stop the scheduler
stop() {
if (!this.isRunning) {
console.log("Update scheduler is not running");
return;
}
console.log("🛑 Stopping update scheduler...");
this.isRunning = false;
if (this.intervalId) {
clearInterval(this.intervalId);
this.intervalId = null;
}
console.log("✅ Update scheduler stopped");
}
// Check for updates
async checkForUpdates() {
try {
console.log("🔍 Checking for updates...");
// Get settings
const settings = await prisma.settings.findFirst();
if (!settings || !settings.githubRepoUrl) {
console.log("⚠️ No GitHub repository configured, skipping update check");
return;
}
// Extract owner and repo from GitHub URL
const repoUrl = settings.githubRepoUrl;
let owner, repo;
if (repoUrl.includes("git@github.com:")) {
const match = repoUrl.match(/git@github\.com:([^/]+)\/([^/]+)\.git/);
if (match) {
[, owner, repo] = match;
}
} else if (repoUrl.includes("github.com/")) {
const match = repoUrl.match(
/github\.com\/([^/]+)\/([^/]+?)(?:\.git)?$/,
);
if (match) {
[, owner, repo] = match;
}
}
if (!owner || !repo) {
console.log(
"⚠️ Could not parse GitHub repository URL, skipping update check",
);
return;
}
let latestVersion;
const isPrivate = settings.repositoryType === "private";
if (isPrivate) {
// Use SSH for private repositories
latestVersion = await this.checkPrivateRepo(settings, owner, repo);
} else {
// Use GitHub API for public repositories
latestVersion = await this.checkPublicRepo(owner, repo);
}
if (!latestVersion) {
console.log(
"⚠️ Could not determine latest version, skipping update check",
);
return;
}
// Read version from package.json dynamically
let currentVersion = "1.2.7"; // fallback
try {
const packageJson = require("../../package.json");
if (packageJson?.version) {
currentVersion = packageJson.version;
}
} catch (packageError) {
console.warn(
"Could not read version from package.json, using fallback:",
packageError.message,
);
}
const isUpdateAvailable =
this.compareVersions(latestVersion, currentVersion) > 0;
// Update settings with check results
await prisma.settings.update({
where: { id: settings.id },
data: {
lastUpdateCheck: new Date(),
updateAvailable: isUpdateAvailable,
latestVersion: latestVersion,
},
});
console.log(
`✅ Update check completed - Current: ${currentVersion}, Latest: ${latestVersion}, Update Available: ${isUpdateAvailable}`,
);
} catch (error) {
console.error("❌ Error checking for updates:", error.message);
// Update last check time even on error
try {
const settings = await prisma.settings.findFirst();
if (settings) {
await prisma.settings.update({
where: { id: settings.id },
data: {
lastUpdateCheck: new Date(),
updateAvailable: false,
},
});
}
} catch (updateError) {
console.error(
"❌ Error updating last check time:",
updateError.message,
);
}
}
}
// Check private repository using SSH
async checkPrivateRepo(settings, owner, repo) {
try {
let sshKeyPath = settings.sshKeyPath;
// Try to find SSH key if not configured
if (!sshKeyPath) {
const possibleKeyPaths = [
"/root/.ssh/id_ed25519",
"/root/.ssh/id_rsa",
"/home/patchmon/.ssh/id_ed25519",
"/home/patchmon/.ssh/id_rsa",
"/var/www/.ssh/id_ed25519",
"/var/www/.ssh/id_rsa",
];
for (const path of possibleKeyPaths) {
try {
require("node:fs").accessSync(path);
sshKeyPath = path;
break;
} catch {
// Key not found at this path, try next
}
}
}
if (!sshKeyPath) {
throw new Error("No SSH deploy key found");
}
const sshRepoUrl = `git@github.com:${owner}/${repo}.git`;
const env = {
...process.env,
GIT_SSH_COMMAND: `ssh -i ${sshKeyPath} -o StrictHostKeyChecking=no -o IdentitiesOnly=yes`,
};
const { stdout: sshLatestTag } = await execAsync(
`git ls-remote --tags --sort=-version:refname ${sshRepoUrl} | head -n 1 | sed 's/.*refs\\/tags\\///' | sed 's/\\^{}//'`,
{
timeout: 10000,
env: env,
},
);
return sshLatestTag.trim().replace("v", "");
} catch (error) {
console.error("SSH Git error:", error.message);
throw error;
}
}
// Check public repository using GitHub API
async checkPublicRepo(owner, repo) {
try {
const httpsRepoUrl = `https://api.github.com/repos/${owner}/${repo}/releases/latest`;
// Get current version for User-Agent
let currentVersion = "1.2.7"; // fallback
try {
const packageJson = require("../../package.json");
if (packageJson?.version) {
currentVersion = packageJson.version;
}
} catch (packageError) {
console.warn(
"Could not read version from package.json for User-Agent, using fallback:",
packageError.message,
);
}
const response = await fetch(httpsRepoUrl, {
method: "GET",
headers: {
Accept: "application/vnd.github.v3+json",
"User-Agent": `PatchMon-Server/${currentVersion}`,
},
});
if (!response.ok) {
throw new Error(
`GitHub API error: ${response.status} ${response.statusText}`,
);
}
const releaseData = await response.json();
return releaseData.tag_name.replace("v", "");
} catch (error) {
console.error("GitHub API error:", error.message);
throw error;
}
}
// Compare version strings (semantic versioning)
compareVersions(version1, version2) {
const v1parts = version1.split(".").map(Number);
const v2parts = version2.split(".").map(Number);
const maxLength = Math.max(v1parts.length, v2parts.length);
for (let i = 0; i < maxLength; i++) {
const v1part = v1parts[i] || 0;
const v2part = v2parts[i] || 0;
if (v1part > v2part) return 1;
if (v1part < v2part) return -1;
}
return 0;
}
// Get scheduler status
getStatus() {
return {
isRunning: this.isRunning,
checkInterval: this.checkInterval,
nextCheck: this.isRunning
? new Date(Date.now() + this.checkInterval)
: null,
};
}
}
// Create singleton instance
const updateScheduler = new UpdateScheduler();
module.exports = updateScheduler;

View File

@@ -1,5 +1,5 @@
const jwt = require("jsonwebtoken");
const crypto = require("crypto");
const crypto = require("node:crypto");
const { PrismaClient } = require("@prisma/client");
const prisma = new PrismaClient();
@@ -9,9 +9,22 @@ const prisma = new PrismaClient();
*/
// Configuration
const JWT_SECRET = process.env.JWT_SECRET || "your-secret-key";
if (!process.env.JWT_SECRET) {
throw new Error("JWT_SECRET environment variable is required");
}
const JWT_SECRET = process.env.JWT_SECRET;
const JWT_EXPIRES_IN = process.env.JWT_EXPIRES_IN || "1h";
const JWT_REFRESH_EXPIRES_IN = process.env.JWT_REFRESH_EXPIRES_IN || "7d";
const TFA_REMEMBER_ME_EXPIRES_IN =
process.env.TFA_REMEMBER_ME_EXPIRES_IN || "30d";
const TFA_MAX_REMEMBER_SESSIONS = parseInt(
process.env.TFA_MAX_REMEMBER_SESSIONS || "5",
10,
);
const TFA_SUSPICIOUS_ACTIVITY_THRESHOLD = parseInt(
process.env.TFA_SUSPICIOUS_ACTIVITY_THRESHOLD || "3",
10,
);
const INACTIVITY_TIMEOUT_MINUTES = parseInt(
process.env.SESSION_INACTIVITY_TIMEOUT_MINUTES || "30",
10,
@@ -67,16 +80,136 @@ function parse_expiration(expiration_string) {
}
}
/**
* Generate device fingerprint from request data
*/
function generate_device_fingerprint(req) {
const components = [
req.get("user-agent") || "",
req.get("accept-language") || "",
req.get("accept-encoding") || "",
req.ip || "",
];
// Create a simple hash of device characteristics
const fingerprint = crypto
.createHash("sha256")
.update(components.join("|"))
.digest("hex")
.substring(0, 32); // Use first 32 chars for storage efficiency
return fingerprint;
}
/**
* Check for suspicious activity patterns
*/
async function check_suspicious_activity(
user_id,
_ip_address,
_device_fingerprint,
) {
try {
// Check for multiple sessions from different IPs in short time
const recent_sessions = await prisma.user_sessions.findMany({
where: {
user_id: user_id,
created_at: {
gte: new Date(Date.now() - 24 * 60 * 60 * 1000), // Last 24 hours
},
is_revoked: false,
},
select: {
ip_address: true,
device_fingerprint: true,
created_at: true,
},
});
// Count unique IPs and devices
const unique_ips = new Set(recent_sessions.map((s) => s.ip_address));
const unique_devices = new Set(
recent_sessions.map((s) => s.device_fingerprint),
);
// Flag as suspicious if more than threshold different IPs or devices in 24h
if (
unique_ips.size > TFA_SUSPICIOUS_ACTIVITY_THRESHOLD ||
unique_devices.size > TFA_SUSPICIOUS_ACTIVITY_THRESHOLD
) {
console.warn(
`Suspicious activity detected for user ${user_id}: ${unique_ips.size} IPs, ${unique_devices.size} devices`,
);
return true;
}
return false;
} catch (error) {
console.error("Error checking suspicious activity:", error);
return false;
}
}
/**
* Create a new session for user
*/
async function create_session(user_id, ip_address, user_agent) {
async function create_session(
user_id,
ip_address,
user_agent,
remember_me = false,
req = null,
) {
try {
const session_id = crypto.randomUUID();
const refresh_token = generate_refresh_token();
const access_token = generate_access_token(user_id, session_id);
const expires_at = parse_expiration(JWT_REFRESH_EXPIRES_IN);
// Generate device fingerprint if request is available
const device_fingerprint = req ? generate_device_fingerprint(req) : null;
// Check for suspicious activity
if (device_fingerprint) {
const is_suspicious = await check_suspicious_activity(
user_id,
ip_address,
device_fingerprint,
);
if (is_suspicious) {
console.warn(
`Suspicious activity detected for user ${user_id}, session creation may be restricted`,
);
}
}
// Check session limits for remember me
if (remember_me) {
const existing_remember_sessions = await prisma.user_sessions.count({
where: {
user_id: user_id,
tfa_remember_me: true,
is_revoked: false,
expires_at: { gt: new Date() },
},
});
// Limit remember me sessions per user
if (existing_remember_sessions >= TFA_MAX_REMEMBER_SESSIONS) {
throw new Error(
"Maximum number of remembered devices reached. Please revoke an existing session first.",
);
}
}
// Use longer expiration for remember me sessions
const expires_at = remember_me
? parse_expiration(TFA_REMEMBER_ME_EXPIRES_IN)
: parse_expiration(JWT_REFRESH_EXPIRES_IN);
// Calculate TFA bypass until date for remember me sessions
const tfa_bypass_until = remember_me
? parse_expiration(TFA_REMEMBER_ME_EXPIRES_IN)
: null;
// Store session in database
await prisma.user_sessions.create({
@@ -87,8 +220,13 @@ async function create_session(user_id, ip_address, user_agent) {
access_token_hash: hash_token(access_token),
ip_address: ip_address || null,
user_agent: user_agent || null,
device_fingerprint: device_fingerprint,
last_login_ip: ip_address || null,
last_activity: new Date(),
expires_at: expires_at,
tfa_remember_me: remember_me,
tfa_bypass_until: tfa_bypass_until,
login_count: 1,
},
});
@@ -97,6 +235,7 @@ async function create_session(user_id, ip_address, user_agent) {
access_token,
refresh_token,
expires_at,
tfa_bypass_until,
};
} catch (error) {
console.error("Error creating session:", error);
@@ -296,6 +435,8 @@ async function get_user_sessions(user_id) {
last_activity: true,
created_at: true,
expires_at: true,
tfa_remember_me: true,
tfa_bypass_until: true,
},
orderBy: { last_activity: "desc" },
});
@@ -305,6 +446,42 @@ async function get_user_sessions(user_id) {
}
}
/**
* Check if TFA is bypassed for a session
*/
async function is_tfa_bypassed(session_id) {
try {
const session = await prisma.user_sessions.findUnique({
where: { id: session_id },
select: {
tfa_remember_me: true,
tfa_bypass_until: true,
is_revoked: true,
expires_at: true,
},
});
if (!session) {
return false;
}
// Check if session is still valid
if (session.is_revoked || new Date() > session.expires_at) {
return false;
}
// Check if TFA is bypassed and still within bypass period
if (session.tfa_remember_me && session.tfa_bypass_until) {
return new Date() < session.tfa_bypass_until;
}
return false;
} catch (error) {
console.error("Error checking TFA bypass:", error);
return false;
}
}
module.exports = {
create_session,
validate_session,
@@ -314,6 +491,9 @@ module.exports = {
revoke_all_user_sessions,
cleanup_expired_sessions,
get_user_sessions,
is_tfa_bypassed,
generate_device_fingerprint,
check_suspicious_activity,
generate_access_token,
INACTIVITY_TIMEOUT_MINUTES,
};
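To show how the new remember-me pieces fit together, here is a hedged sketch of a login handler; the route shape, request fields, and response are hypothetical, and only `create_session` (with the signature shown above) and the returned `tfa_bypass_until` field come from this module.

```js
// Hypothetical Express-style handler - only create_session is a real export above.
const { create_session } = require("./session");

async function login_handler(req, res) {
  const user = req.authenticated_user; // assumed to be set by earlier middleware
  const remember_me = Boolean(req.body?.remember_me);

  // Passing `req` lets create_session derive the device fingerprint itself.
  const session = await create_session(
    user.id,
    req.ip,
    req.get("user-agent"),
    remember_me,
    req,
  );

  // tfa_bypass_until is only set for remember-me sessions, so the client can
  // skip the TFA prompt while it is still in the future.
  const tfa_required =
    !session.tfa_bypass_until || new Date() >= session.tfa_bypass_until;

  res.json({ ...session, tfa_required });
}
```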

View File

@@ -2,44 +2,74 @@
## Overview
PatchMon is a containerised application that monitors system patches and updates. The application consists of three main services:
PatchMon is a containerised application that monitors system patches and updates. The application consists of four main services:
- **Database**: PostgreSQL 17
- **Redis**: Redis 7 for BullMQ job queues and caching
- **Backend**: Node.js API server
- **Frontend**: React application served via Nginx
- **Frontend**: React application served via NGINX
## Images
- **Backend**: [ghcr.io/9technologygroup/patchmon-backend:latest](https://github.com/9technologygroup/patchmon.net/pkgs/container/patchmon-backend)
- **Frontend**: [ghcr.io/9technologygroup/patchmon-frontend:latest](https://github.com/9technologygroup/patchmon.net/pkgs/container/patchmon-frontend)
- **Backend**: [ghcr.io/patchmon/patchmon-backend](https://github.com/patchmon/patchmon.net/pkgs/container/patchmon-backend)
- **Frontend**: [ghcr.io/patchmon/patchmon-frontend](https://github.com/patchmon/patchmon.net/pkgs/container/patchmon-frontend)
Version tags are also available (e.g. `1.2.3`) for both of these images.
### Tags
- `latest`: The latest stable release of PatchMon
- `x.y.z`: Full version tags (e.g. `1.2.3`) - Use this for exact version pinning.
- `x.y`: Minor version tags (e.g. `1.2`) - Use this to get the latest patch release in a minor version series.
- `x`: Major version tags (e.g. `1`) - Use this to get the latest minor and patch release in a major version series.
- `edge`: The latest development build with the most recent features and fixes. This tag may often be unstable and is intended only for testing and development purposes.
These tags are available for both backend and frontend images as they are versioned together.
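For example, pinning both images to a minor series in `docker-compose.yml` keeps you on the newest patch release of that series (tag values are illustrative):

```yaml
services:
  backend:
    image: ghcr.io/patchmon/patchmon-backend:1.3   # follows 1.3.x patch releases
  frontend:
    image: ghcr.io/patchmon/patchmon-frontend:1.3
```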
## Quick Start
### Production Deployment
1. Download the [Docker Compose file](docker-compose.yml)
2. Change the default database password in the file:
2. Set a database password in the file where it says:
```yaml
environment:
POSTGRES_PASSWORD: YOUR_SECURE_PASSWORD_HERE
POSTGRES_PASSWORD: # CREATE A STRONG PASSWORD AND PUT IT HERE
```
3. Update the corresponding `DATABASE_URL` in the backend service:
3. Update the corresponding `DATABASE_URL` with your password in the backend service where it says:
```yaml
environment:
DATABASE_URL: postgresql://patchmon_user:YOUR_SECURE_PASSWORD_HERE@database:5432/patchmon_db
DATABASE_URL: postgresql://patchmon_user:REPLACE_YOUR_POSTGRES_PASSWORD_HERE@database:5432/patchmon_db
```
4. Configure environment variables (see [Configuration](#configuration) section)
5. Start the application:
4. Set a Redis password in the Redis service where it says:
```yaml
environment:
REDIS_PASSWORD: # CREATE A STRONG REDIS PASSWORD AND PUT IT HERE
```
5. Update the corresponding `REDIS_PASSWORD` in the backend service where it says:
```yaml
environment:
REDIS_PASSWORD: REPLACE_YOUR_REDIS_PASSWORD_HERE
```
6. Generate a strong JWT secret. You can do this like so:
```bash
openssl rand -hex 64
```
7. Set a JWT secret in the backend service where it says:
```yaml
environment:
JWT_SECRET: # CREATE A STRONG SECRET AND PUT IT HERE
```
8. Configure environment variables (see [Configuration](#configuration) section)
9. Start the application:
```bash
docker compose up -d
```
6. Access the application at `http://localhost:3000`
10. Access the application at `http://localhost:3000`
## Updating
To update PatchMon to the latest version:
By default, the compose file uses the `latest` tag for both backend and frontend images.
This means you can update PatchMon to the latest version as easily as:
```bash
docker compose up -d --pull always
@@ -52,16 +82,18 @@ This command will:
### Version-Specific Updates
If you're using specific version tags instead of `latest` in your compose file:
If you'd like to pin your Docker deployment of PatchMon to a specific version, you can do this in the compose file.
1. Update the image tags in your `docker-compose.yml`. For example:
When you do this, updating to a new version means editing the image tags in the compose file yourself:
1. Update the image tags in `docker-compose.yml`. For example:
```yaml
services:
backend:
image: ghcr.io/9technologygroup/patchmon-backend:1.2.7 # Update version here
image: ghcr.io/patchmon/patchmon-backend:1.2.3 # Update version here
...
frontend:
image: ghcr.io/9technologygroup/patchmon-frontend:1.2.7 # Update version here
image: ghcr.io/patchmon/patchmon-frontend:1.2.3 # Update version here
...
```
@@ -71,7 +103,7 @@ If you're using specific version tags instead of `latest` in your compose file:
```
> [!TIP]
> Check the [releases page](https://github.com/9technologygroup/patchmon.net/releases) for version-specific changes and migration notes.
> Check the [releases page](https://github.com/PatchMon/PatchMon/releases) for version-specific changes and migration notes.
## Configuration
@@ -79,37 +111,90 @@ If you're using specific version tags instead of `latest` in your compose file:
#### Database Service
- `POSTGRES_DB`: Database name (default: `patchmon_db`)
- `POSTGRES_USER`: Database user (default: `patchmon_user`)
- `POSTGRES_PASSWORD`: Database password - **MUST BE CHANGED!**
| Variable | Description | Default |
| ------------------- | ----------------- | ---------------- |
| `POSTGRES_DB` | Database name | `patchmon_db` |
| `POSTGRES_USER` | Database user | `patchmon_user` |
| `POSTGRES_PASSWORD` | Database password | **MUST BE SET!** |
#### Redis Service
| Variable | Description | Default |
| -------------- | ------------------ | ---------------- |
| `REDIS_PASSWORD` | Redis password | **MUST BE SET!** |
#### Backend Service
- `LOG_LEVEL`: Logging level (`debug`, `info`, `warn`, `error`)
- `DATABASE_URL`: PostgreSQL connection string
- `PM_DB_CONN_MAX_ATTEMPTS`: Maximum database connection attempts (default: 30)
- `PM_DB_CONN_WAIT_INTERVAL`: Wait interval between connection attempts in seconds (default: 2)
- `SERVER_PROTOCOL`: Frontend server protocol (`http` or `https`)
- `SERVER_HOST`: Frontend server host (default: `localhost`)
- `SERVER_PORT`: Frontend server port (default: 3000)
- `PORT`: Backend API port (default: 3001)
- `API_VERSION`: API version (default: `v1`)
- `CORS_ORIGIN`: CORS origin URL
- `RATE_LIMIT_WINDOW_MS`: Rate limiting window in milliseconds (default: 900000)
- `RATE_LIMIT_MAX`: Maximum requests per window (default: 100)
- `ENABLE_HSTS`: Enable HTTP Strict Transport Security (default: true)
- `TRUST_PROXY`: Trust proxy headers (default: true) - See [Express.js docs](https://expressjs.com/en/guide/behind-proxies.html) for usage.
##### Database Configuration
| Variable | Description | Default |
| -------------------------- | ---------------------------------------------------- | ------------------------------------------------ |
| `DATABASE_URL` | PostgreSQL connection string | **MUST BE UPDATED WITH YOUR POSTGRES_PASSWORD!** |
| `PM_DB_CONN_MAX_ATTEMPTS` | Maximum database connection attempts | `30` |
| `PM_DB_CONN_WAIT_INTERVAL` | Wait interval between connection attempts in seconds | `2` |
##### Redis Configuration
| Variable | Description | Default |
| --------------- | ------------------------------ | ------- |
| `REDIS_HOST` | Redis server hostname | `redis` |
| `REDIS_PORT` | Redis server port | `6379` |
| `REDIS_PASSWORD` | Redis authentication password | **MUST BE UPDATED WITH YOUR REDIS_PASSWORD!** |
| `REDIS_DB` | Redis database number | `0` |
##### Authentication & Security
| Variable | Description | Default |
| ------------------------------------ | --------------------------------------------------------- | ---------------- |
| `JWT_SECRET` | JWT signing secret - Generate with `openssl rand -hex 64` | **MUST BE SET!** |
| `JWT_EXPIRES_IN` | JWT token expiration time | `1h` |
| `JWT_REFRESH_EXPIRES_IN` | JWT refresh token expiration time | `7d` |
| `SESSION_INACTIVITY_TIMEOUT_MINUTES` | Session inactivity timeout in minutes | `30` |
| `DEFAULT_USER_ROLE` | Default role for new users | `user` |
##### Server & Network Configuration
| Variable | Description | Default |
| ----------------- | ----------------------------------------------------------------------------------------------- | ----------------------- |
| `PORT` | Backend API port | `3001` |
| `SERVER_PROTOCOL` | Frontend server protocol (`http` or `https`) | `http` |
| `SERVER_HOST` | Frontend server host | `localhost` |
| `SERVER_PORT` | Frontend server port | `3000` |
| `CORS_ORIGIN` | CORS origin URL | `http://localhost:3000` |
| `ENABLE_HSTS` | Enable HTTP Strict Transport Security | `true` |
| `TRUST_PROXY` | Trust proxy headers - See [Express.js docs](https://expressjs.com/en/guide/behind-proxies.html) | `true` |
##### Rate Limiting
| Variable | Description | Default |
| ---------------------------- | --------------------------------------------------- | -------- |
| `RATE_LIMIT_WINDOW_MS` | Rate limiting window in milliseconds | `900000` |
| `RATE_LIMIT_MAX` | Maximum requests per window | `5000` |
| `AUTH_RATE_LIMIT_WINDOW_MS` | Authentication rate limiting window in milliseconds | `600000` |
| `AUTH_RATE_LIMIT_MAX` | Maximum authentication requests per window | `500` |
| `AGENT_RATE_LIMIT_WINDOW_MS` | Agent API rate limiting window in milliseconds | `60000` |
| `AGENT_RATE_LIMIT_MAX` | Maximum agent requests per window | `1000` |
##### Logging
| Variable | Description | Default |
| ---------------- | ------------------------------------------------ | ------- |
| `LOG_LEVEL` | Logging level (`debug`, `info`, `warn`, `error`) | `info` |
| `ENABLE_LOGGING` | Enable application logging | `true` |
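For reference, a backend `environment:` block drawing on a few of these variables might look like the following; the hostname and values are examples only, not recommendations:

```yaml
backend:
  environment:
    JWT_SECRET: "<output of: openssl rand -hex 64>"
    JWT_EXPIRES_IN: 1h
    SESSION_INACTIVITY_TIMEOUT_MINUTES: 30
    SERVER_PROTOCOL: https
    SERVER_HOST: patchmon.example.com
    CORS_ORIGIN: https://patchmon.example.com
    RATE_LIMIT_WINDOW_MS: 900000
    RATE_LIMIT_MAX: 5000
    LOG_LEVEL: info
```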
#### Frontend Service
- `BACKEND_HOST`: Backend service hostname (default: `backend`)
- `BACKEND_PORT`: Backend service port (default: 3001)
| Variable | Description | Default |
| -------------- | ------------------------ | --------- |
| `BACKEND_HOST` | Backend service hostname | `backend` |
| `BACKEND_PORT` | Backend service port | `3001` |
### Volumes
The compose file creates two Docker volumes:
The compose file creates three Docker volumes:
* `postgres_data`: PostgreSQL's data directory.
* `redis_data`: Redis's data directory.
* `agent_files`: PatchMon's agent files.
If you wish to bind any of these container paths to a host path rather than a Docker volume, you can do so in the Docker Compose file, as in the example below.
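For instance, mounting the agents directory from a host path instead of the `agent_files` volume might look like this (the host path is an example):

```yaml
services:
  backend:
    volumes:
      - /srv/patchmon/agents:/app/agents   # host path instead of the agent_files volume
```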
@@ -129,7 +214,7 @@ For development with live reload and source code mounting:
1. Clone the repository:
```bash
git clone https://github.com/9technologygroup/patchmon.net.git
git clone https://github.com/PatchMon/PatchMon.git
cd PatchMon
```
@@ -143,6 +228,7 @@ For development with live reload and source code mounting:
- Frontend: `http://localhost:3000`
- Backend API: `http://localhost:3001`
- Database: `localhost:5432`
- Redis: `localhost:6379`
## Development Docker Compose
@@ -196,6 +282,7 @@ docker compose -f docker/docker-compose.dev.yml up -d --build
### Development Ports
The development setup exposes additional ports for debugging:
- **Database**: `5432` - Direct PostgreSQL access
- **Redis**: `6379` - Direct Redis access
- **Backend**: `3001` - API server with development features
- **Frontend**: `3000` - React development server with hot reload
@@ -203,7 +290,7 @@ The development setup exposes additional ports for debugging:
1. **Initial Setup**: Clone repository and start development environment
```bash
git clone https://github.com/9technologygroup/patchmon.net.git
git clone https://github.com/PatchMon/PatchMon.git
cd PatchMon
docker compose -f docker/docker-compose.dev.yml up -d --build
```
@@ -219,8 +306,8 @@ The development setup exposes additional ports for debugging:
- **Prisma Schema Changes**: Backend service restarts automatically
4. **Database Access**: Connect database client directly to `localhost:5432`
5. **Debug**: If started with `docker compose [...] up -d` or `docker compose [...] watch`, check logs manually:
5. **Redis Access**: Connect Redis client directly to `localhost:6379`
6. **Debug**: If started with `docker compose [...] up -d` or `docker compose [...] watch`, check logs manually:
```bash
docker compose -f docker/docker-compose.dev.yml logs -f
```
@@ -230,6 +317,6 @@ The development setup exposes additional ports for debugging:
- **Hot Reload**: Automatic code synchronization and service restarts
- **Enhanced Logging**: Detailed logs for debugging
- **Direct Access**: Exposed ports for database and API debugging
- **Direct Access**: Exposed ports for database, Redis, and API debugging
- **Health Checks**: Built-in health monitoring for services
- **Volume Persistence**: Development data persists between restarts

View File

@@ -59,7 +59,10 @@ ENV NODE_ENV=production \
ENABLE_LOGGING=true \
LOG_LEVEL=info \
PM_LOG_TO_CONSOLE=true \
PORT=3001
PORT=3001 \
JWT_EXPIRES_IN=1h \
JWT_REFRESH_EXPIRES_IN=7d \
SESSION_INACTIVITY_TIMEOUT_MINUTES=30
RUN apk add --no-cache openssl tini curl

View File

@@ -8,19 +8,94 @@ log() {
echo "[$(date +'%Y-%m-%d %H:%M:%S')] $*" >&2
}
# Copy files from agents_backup to agents if agents directory is empty
if [ -d "/app/agents" ] && [ -z "$(ls -A /app/agents 2>/dev/null)" ]; then
if [ -d "/app/agents_backup" ]; then
log "Agents directory is empty, copying from backup..."
cp -r /app/agents_backup/* /app/agents/
# Function to extract version from agent script
get_agent_version() {
local file="$1"
if [ -f "$file" ]; then
grep -m 1 '^AGENT_VERSION=' "$file" | cut -d'"' -f2 2>/dev/null || echo "0.0.0"
else
log "Warning: agents_backup directory not found"
echo "0.0.0"
fi
else
log "Agents directory already contains files, skipping copy"
fi
}
log "Starting PatchMon Backend (${NODE_ENV:-production})..."
# Function to compare versions (returns 0 if $1 > $2)
version_greater() {
# Use sort -V for version comparison
test "$(printf '%s\n' "$1" "$2" | sort -V | tail -n1)" = "$1" && test "$1" != "$2"
}
# Check and update agent files if necessary
update_agents() {
local backup_agent="/app/agents_backup/patchmon-agent.sh"
local current_agent="/app/agents/patchmon-agent.sh"
# Check if agents directory exists
if [ ! -d "/app/agents" ]; then
log "ERROR: /app/agents directory not found"
return 1
fi
# Check if backup exists
if [ ! -d "/app/agents_backup" ]; then
log "WARNING: agents_backup directory not found, skipping agent update"
return 0
fi
# Get versions
local backup_version=$(get_agent_version "$backup_agent")
local current_version=$(get_agent_version "$current_agent")
log "Agent version check:"
log " Image version: ${backup_version}"
log " Volume version: ${current_version}"
# Determine if update is needed
local needs_update=0
# Case 1: No agents in volume (first time setup)
if [ -z "$(find /app/agents -maxdepth 1 -type f -name '*.sh' 2>/dev/null | head -n 1)" ]; then
log "Agents directory is empty - performing initial copy"
needs_update=1
# Case 2: Backup version is newer
elif version_greater "$backup_version" "$current_version"; then
log "Newer agent version available (${backup_version} > ${current_version})"
needs_update=1
else
log "Agents are up to date"
needs_update=0
fi
# Perform update if needed
if [ $needs_update -eq 1 ]; then
log "Updating agents to version ${backup_version}..."
# Create backup of existing agents if they exist
if [ -f "$current_agent" ]; then
local backup_timestamp=$(date +%Y%m%d_%H%M%S)
local backup_name="/app/agents/patchmon-agent.sh.backup.${backup_timestamp}"
cp "$current_agent" "$backup_name" 2>/dev/null || true
log "Previous agent backed up to: $(basename $backup_name)"
fi
# Copy new agents
cp -r /app/agents_backup/* /app/agents/
# Verify update
local new_version=$(get_agent_version "$current_agent")
if [ "$new_version" = "$backup_version" ]; then
log "✅ Agents successfully updated to version ${new_version}"
else
log "⚠️ Warning: Agent update may have failed (expected: ${backup_version}, got: ${new_version})"
fi
fi
}
# Main execution
log "PatchMon Backend Container Starting..."
log "Environment: ${NODE_ENV:-production}"
# Update agents (version-aware)
update_agents
log "Running database migrations..."
npx prisma migrate deploy

View File

@@ -1,3 +1,5 @@
name: patchmon-dev
services:
database:
image: postgres:17-alpine
@@ -5,7 +7,7 @@ services:
environment:
POSTGRES_DB: patchmon_db
POSTGRES_USER: patchmon_user
POSTGRES_PASSWORD: INSECURE_REPLACE_ME_PLEASE_INSECURE
POSTGRES_PASSWORD: 1NS3CU6E_DEV_D8_PASSW0RD
ports:
- "5432:5432"
volumes:
@@ -16,24 +18,43 @@ services:
timeout: 5s
retries: 7
redis:
image: redis:7-alpine
restart: unless-stopped
command: redis-server --requirepass 1NS3CU6E_DEV_R3DIS_PASSW0RD
environment:
REDIS_PASSWORD: 1NS3CU6E_DEV_R3DIS_PASSW0RD
ports:
- "6379:6379"
volumes:
- ./compose_dev_data/redis:/data
healthcheck:
test: ["CMD", "redis-cli", "--no-auth-warning", "-a", "1NS3CU6E_DEV_R3DIS_PASSW0RD", "ping"]
interval: 3s
timeout: 5s
retries: 7
backend:
build:
context: ..
dockerfile: docker/backend.Dockerfile
target: development
tags: [patchmon-backend:dev]
restart: unless-stopped
environment:
NODE_ENV: development
LOG_LEVEL: info
DATABASE_URL: postgresql://patchmon_user:INSECURE_REPLACE_ME_PLEASE_INSECURE@database:5432/patchmon_db
PM_DB_CONN_MAX_ATTEMPTS: 30
PM_DB_CONN_WAIT_INTERVAL: 2
DATABASE_URL: postgresql://patchmon_user:1NS3CU6E_DEV_D8_PASSW0RD@database:5432/patchmon_db
JWT_SECRET: INS3CURE_DEV_7WT_5ECR3T
SERVER_PROTOCOL: http
SERVER_HOST: localhost
SERVER_PORT: 3000
CORS_ORIGIN: http://localhost:3000
RATE_LIMIT_WINDOW_MS: 900000
RATE_LIMIT_MAX: 100
# Redis Configuration
REDIS_HOST: redis
REDIS_PORT: 6379
REDIS_PASSWORD: 1NS3CU6E_DEV_R3DIS_PASSW0RD
REDIS_DB: 0
ports:
- "3001:3001"
volumes:
@@ -41,6 +62,8 @@ services:
depends_on:
database:
condition: service_healthy
redis:
condition: service_healthy
develop:
watch:
- action: sync
@@ -59,6 +82,7 @@ services:
context: ..
dockerfile: docker/frontend.Dockerfile
target: development
tags: [patchmon-frontend:dev]
restart: unless-stopped
environment:
BACKEND_HOST: backend

View File

@@ -1,3 +1,5 @@
name: patchmon
services:
database:
image: postgres:17-alpine
@@ -5,7 +7,7 @@ services:
environment:
POSTGRES_DB: patchmon_db
POSTGRES_USER: patchmon_user
POSTGRES_PASSWORD: INSECURE_REPLACE_ME_PLEASE_INSECURE
POSTGRES_PASSWORD: # CREATE A STRONG PASSWORD AND PUT IT HERE
volumes:
- postgres_data:/var/lib/postgresql/data
healthcheck:
@@ -14,28 +16,48 @@ services:
timeout: 5s
retries: 7
backend:
image: ghcr.io/9technologygroup/patchmon-backend:latest
redis:
image: redis:7-alpine
restart: unless-stopped
command: redis-server /usr/local/etc/redis/redis.conf
environment:
REDIS_PASSWORD: # CREATE A STRONG REDIS PASSWORD AND PUT IT HERE
volumes:
- redis_data:/data
- ./docker/redis.conf:/usr/local/etc/redis/redis.conf:ro
healthcheck:
test: ["CMD", "redis-cli", "--no-auth-warning", "-a", "${REDIS_PASSWORD}", "ping"]
interval: 3s
timeout: 5s
retries: 7
backend:
image: ghcr.io/patchmon/patchmon-backend:latest
restart: unless-stopped
# See PatchMon Docker README for additional environment variables and configuration instructions
environment:
LOG_LEVEL: info
DATABASE_URL: postgresql://patchmon_user:INSECURE_REPLACE_ME_PLEASE_INSECURE@database:5432/patchmon_db
PM_DB_CONN_MAX_ATTEMPTS: 30
PM_DB_CONN_WAIT_INTERVAL: 2
DATABASE_URL: postgresql://patchmon_user:REPLACE_YOUR_POSTGRES_PASSWORD_HERE@database:5432/patchmon_db
JWT_SECRET: # CREATE A STRONG SECRET AND PUT IT HERE - Generate with 'openssl rand -hex 64'
SERVER_PROTOCOL: http
SERVER_HOST: localhost
SERVER_PORT: 3000
CORS_ORIGIN: http://localhost:3000
RATE_LIMIT_WINDOW_MS: 900000
RATE_LIMIT_MAX: 100
# Redis Configuration
REDIS_HOST: redis
REDIS_PORT: 6379
REDIS_PASSWORD: REPLACE_YOUR_REDIS_PASSWORD_HERE
REDIS_DB: 0
volumes:
- agent_files:/app/agents
depends_on:
database:
condition: service_healthy
redis:
condition: service_healthy
frontend:
image: ghcr.io/9technologygroup/patchmon-frontend:latest
image: ghcr.io/patchmon/patchmon-frontend:latest
restart: unless-stopped
ports:
- "3000:3000"
@@ -45,4 +67,5 @@ services:
volumes:
postgres_data:
redis_data:
agent_files:

View File

@@ -52,6 +52,64 @@ server {
}
}
# SSE (Server-Sent Events) specific configuration
location /api/v1/ws/status/ {
proxy_pass http://${BACKEND_HOST}:${BACKEND_PORT};
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header X-Forwarded-Host $host;
# Critical SSE settings
proxy_buffering off;
proxy_cache off;
proxy_set_header Connection '';
proxy_http_version 1.1;
chunked_transfer_encoding off;
# Timeout settings for long-lived connections
proxy_read_timeout 24h;
proxy_send_timeout 24h;
proxy_connect_timeout 60s;
# Disable nginx buffering for real-time streaming
proxy_request_buffering off;
proxy_max_temp_file_size 0;
# CORS headers for SSE
add_header Access-Control-Allow-Origin * always;
add_header Access-Control-Allow-Methods "GET, OPTIONS" always;
add_header Access-Control-Allow-Headers "DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization" always;
# Handle preflight requests
if ($request_method = 'OPTIONS') {
return 204;
}
}
# WebSocket upgrade handling
location /api/v1/agents/ws {
proxy_pass http://${BACKEND_HOST}:${BACKEND_PORT};
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header X-Forwarded-Host $host;
# WebSocket timeout settings
proxy_read_timeout 24h;
proxy_send_timeout 24h;
proxy_connect_timeout 60s;
# Disable buffering for WebSocket
proxy_buffering off;
proxy_cache off;
}
# Static assets caching
location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff|woff2|ttf|eot)$ {
expires 1y;

docker/redis.conf (new file, 35 lines)
View File

@@ -0,0 +1,35 @@
# Redis Configuration for PatchMon Production
# Security settings
requirepass ${REDIS_PASSWORD}
rename-command FLUSHDB ""
rename-command FLUSHALL ""
rename-command DEBUG ""
rename-command CONFIG "CONFIG_${REDIS_PASSWORD}"
# Memory management
maxmemory 256mb
maxmemory-policy allkeys-lru
# Persistence settings
save 900 1
save 300 10
save 60 10000
# Logging
loglevel notice
logfile ""
# Network security
bind 127.0.0.1
protected-mode yes
# Performance tuning
tcp-keepalive 300
timeout 0
# Disable dangerous commands
rename-command SHUTDOWN "SHUTDOWN_${REDIS_PASSWORD}"
rename-command KEYS ""
rename-command MONITOR ""
rename-command SLAVEOF ""
rename-command REPLICAOF ""

View File

@@ -2,7 +2,7 @@
<html lang="en">
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" />
<link rel="icon" type="image/svg+xml" href="/assets/favicon.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>PatchMon - Linux Patch Monitoring Dashboard</title>
<link rel="preconnect" href="https://fonts.googleapis.com">

View File

@@ -1,7 +1,7 @@
{
"name": "patchmon-frontend",
"private": true,
"version": "1.2.7",
"version": "1.2.9",
"license": "AGPL-3.0",
"type": "module",
"scripts": {
@@ -35,7 +35,7 @@
"@vitejs/plugin-react": "^4.3.4",
"autoprefixer": "^10.4.20",
"postcss": "^8.5.6",
"tailwindcss": "^3.4.17",
"tailwindcss": "^4.0.0",
"vite": "^7.1.5"
},
"overrides": {

View File

@@ -0,0 +1,23 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 36 36">
<circle fill="#DD2E44" cx="18" cy="18" r="18" />
<circle fill="#FFF" cx="18" cy="18" r="13.5" />
<circle fill="#DD2E44" cx="18" cy="18" r="10" />
<circle fill="#FFF" cx="18" cy="18" r="6" />
<circle fill="#DD2E44" cx="18" cy="18" r="3" />
<path
opacity=".2"
d="M18.24 18.282l13.144 11.754s-2.647 3.376-7.89 5.109L17.579 18.42l.661-.138z"
/>
<path
fill="#FFAC33"
d="M18.294 19a.994.994 0 01-.704-1.699l.563-.563a.995.995 0 011.408 1.407l-.564.563a.987.987 0 01-.703.292z"
/>
<path
fill="#55ACEE"
d="M24.016 6.981c-.403 2.079 0 4.691 0 4.691l7.054-7.388c.291-1.454-.528-3.932-1.718-4.238-1.19-.306-4.079.803-5.336 6.935zm5.003 5.003c-2.079.403-4.691 0-4.691 0l7.388-7.054c1.454-.291 3.932.528 4.238 1.718.306 1.19-.803 4.079-6.935 5.336z"
/>
<path
fill="#3A87C2"
d="M32.798 4.485L21.176 17.587c-.362.362-1.673.882-2.51.046-.836-.836-.419-2.08-.057-2.443L31.815 3.501s.676-.635 1.159-.152-.176 1.136-.176 1.136z"
/>
</svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" width="500" zoomAndPan="magnify" viewBox="0 0 375 374.999991" height="500" preserveAspectRatio="xMidYMid meet" version="1.0"><defs><g/><clipPath id="d62632d413"><path d="M 29 28 L 304 28 L 304 350 L 29 350 Z M 29 28 " clip-rule="nonzero"/></clipPath><clipPath id="ecc8b4d8ed"><path d="M 187.496094 -39.996094 L 416.601562 189.105469 L 187.496094 418.207031 L -41.605469 189.105469 Z M 187.496094 -39.996094 " clip-rule="nonzero"/></clipPath><clipPath id="3016db942f"><path d="M 187.496094 -39.996094 L 416.601562 189.105469 L 187.496094 418.207031 L -41.605469 189.105469 Z M 187.496094 -39.996094 " clip-rule="nonzero"/></clipPath><clipPath id="029f8ae6a8"><path d="M 29 28 L 304 28 L 304 350 L 29 350 Z M 29 28 " clip-rule="nonzero"/></clipPath><clipPath id="2d374b5e76"><path d="M 187.496094 -39.996094 L 416.601562 189.105469 L 187.496094 418.207031 L -41.605469 189.105469 Z M 187.496094 -39.996094 " clip-rule="nonzero"/></clipPath><clipPath id="544d823606"><path d="M 187.496094 -39.996094 L 416.601562 189.105469 L 187.496094 418.207031 L -41.605469 189.105469 Z M 187.496094 -39.996094 " clip-rule="nonzero"/></clipPath><clipPath id="b88a276116"><path d="M 187.496094 -39.996094 L 416.601562 189.105469 L 187.496094 418.207031 L -41.605469 189.105469 Z M 187.496094 -39.996094 " clip-rule="nonzero"/></clipPath><clipPath id="98c26e11a4"><rect x="0" width="103" y="0" height="208"/></clipPath></defs><g clip-path="url(#d62632d413)"><g clip-path="url(#ecc8b4d8ed)"><g clip-path="url(#3016db942f)"><path fill="#ff751f" d="M 303.214844 302.761719 C 280.765625 325.214844 252.160156 340.503906 221.015625 346.699219 C 189.875 352.890625 157.59375 349.714844 128.261719 337.5625 C 98.925781 325.410156 73.851562 304.835938 56.210938 278.433594 C 38.570312 252.03125 29.15625 220.992188 29.15625 189.242188 C 29.15625 157.488281 38.570312 126.449219 56.210938 100.050781 C 73.851562 73.648438 98.925781 53.070312 128.261719 40.921875 C 157.59375 28.769531 189.875 25.589844 221.015625 31.785156 C 252.160156 37.980469 280.765625 53.269531 303.214844 75.722656 L 189.695312 189.242188 Z M 303.214844 302.761719 " fill-opacity="1" fill-rule="nonzero"/></g></g></g><g clip-path="url(#029f8ae6a8)"><g clip-path="url(#2d374b5e76)"><g clip-path="url(#544d823606)"><g clip-path="url(#b88a276116)"><path fill="#61b33a" d="M 303.144531 302.550781 C 280.707031 324.988281 252.117188 340.269531 220.996094 346.460938 C 189.875 352.652344 157.613281 349.472656 128.296875 337.332031 C 98.980469 325.1875 73.921875 304.621094 56.292969 278.238281 C 38.664062 251.851562 29.253906 220.832031 29.253906 189.101562 C 29.253906 157.367188 38.664062 126.347656 56.292969 99.964844 C 73.921875 73.578125 98.980469 53.015625 128.296875 40.871094 C 157.613281 28.726562 189.875 25.550781 220.996094 31.742188 C 252.117188 37.929688 280.707031 53.210938 303.144531 75.652344 L 189.695312 189.101562 Z M 303.144531 302.550781 " fill-opacity="1" fill-rule="nonzero"/></g></g></g></g><g transform="matrix(1, 0, 0, 1, 136, 0)"><g clip-path="url(#98c26e11a4)"><g fill="#ff751f" fill-opacity="1"><g transform="translate(0.457164, 116.403543)"><g><path d="M 19.734375 -18.71875 C 19.734375 -21.664062 20.015625 -24.441406 20.578125 -27.046875 C 21.148438 -29.660156 22.0625 -32.210938 23.3125 -34.703125 C 24.5625 -37.203125 26.207031 -39.359375 28.25 -41.171875 C 33.6875 -47.066406 41.285156 -50.015625 51.046875 -50.015625 C 59.210938 -50.015625 66.46875 -46.953125 72.8125 -40.828125 C 
79.164062 -34.703125 82.34375 -27.332031 82.34375 -18.71875 C 82.34375 -9.414062 79.28125 -1.925781 73.15625 3.75 C 67.257812 9.644531 59.890625 12.59375 51.046875 12.59375 C 42.648438 12.59375 35.332031 9.472656 29.09375 3.234375 C 22.851562 -3.003906 19.734375 -10.320312 19.734375 -18.71875 Z M 19.734375 -18.71875 "/></g></g></g></g></g></svg>

Two binary image assets added (18 KiB and 24 KiB); contents not shown.

View File

@@ -1,29 +1,56 @@
import { lazy, Suspense } from "react";
import { Route, Routes } from "react-router-dom";
import FirstTimeAdminSetup from "./components/FirstTimeAdminSetup";
import Layout from "./components/Layout";
import LogoProvider from "./components/LogoProvider";
import ProtectedRoute from "./components/ProtectedRoute";
import SettingsLayout from "./components/SettingsLayout";
import { isAuthPhase } from "./constants/authPhases";
import { AuthProvider, useAuth } from "./contexts/AuthContext";
import { ThemeProvider } from "./contexts/ThemeContext";
import { UpdateNotificationProvider } from "./contexts/UpdateNotificationContext";
import Dashboard from "./pages/Dashboard";
import HostDetail from "./pages/HostDetail";
import Hosts from "./pages/Hosts";
import Login from "./pages/Login";
import PackageDetail from "./pages/PackageDetail";
import Packages from "./pages/Packages";
import Profile from "./pages/Profile";
import Repositories from "./pages/Repositories";
import RepositoryDetail from "./pages/RepositoryDetail";
import AlertChannels from "./pages/settings/AlertChannels";
import Integrations from "./pages/settings/Integrations";
import Notifications from "./pages/settings/Notifications";
import PatchManagement from "./pages/settings/PatchManagement";
import SettingsAgentConfig from "./pages/settings/SettingsAgentConfig";
import SettingsHostGroups from "./pages/settings/SettingsHostGroups";
import SettingsServerConfig from "./pages/settings/SettingsServerConfig";
import SettingsUsers from "./pages/settings/SettingsUsers";
// Lazy load pages
const Dashboard = lazy(() => import("./pages/Dashboard"));
const HostDetail = lazy(() => import("./pages/HostDetail"));
const Hosts = lazy(() => import("./pages/Hosts"));
const Login = lazy(() => import("./pages/Login"));
const PackageDetail = lazy(() => import("./pages/PackageDetail"));
const Packages = lazy(() => import("./pages/Packages"));
const Profile = lazy(() => import("./pages/Profile"));
const Automation = lazy(() => import("./pages/Automation"));
const Repositories = lazy(() => import("./pages/Repositories"));
const RepositoryDetail = lazy(() => import("./pages/RepositoryDetail"));
const Docker = lazy(() => import("./pages/Docker"));
const DockerContainerDetail = lazy(
() => import("./pages/docker/ContainerDetail"),
);
const DockerImageDetail = lazy(() => import("./pages/docker/ImageDetail"));
const DockerHostDetail = lazy(() => import("./pages/docker/HostDetail"));
const AlertChannels = lazy(() => import("./pages/settings/AlertChannels"));
const Integrations = lazy(() => import("./pages/settings/Integrations"));
const Notifications = lazy(() => import("./pages/settings/Notifications"));
const PatchManagement = lazy(() => import("./pages/settings/PatchManagement"));
const SettingsAgentConfig = lazy(
() => import("./pages/settings/SettingsAgentConfig"),
);
const SettingsHostGroups = lazy(
() => import("./pages/settings/SettingsHostGroups"),
);
const SettingsServerConfig = lazy(
() => import("./pages/settings/SettingsServerConfig"),
);
const SettingsUsers = lazy(() => import("./pages/settings/SettingsUsers"));
// Loading fallback component
const LoadingFallback = () => (
<div className="min-h-screen bg-gradient-to-br from-primary-50 to-secondary-50 dark:from-secondary-900 dark:to-secondary-800 flex items-center justify-center">
<div className="text-center">
<div className="animate-spin rounded-full h-12 w-12 border-b-2 border-primary-600 mx-auto mb-4"></div>
<p className="text-secondary-600 dark:text-secondary-300">Loading...</p>
</div>
</div>
);
function AppRoutes() {
const { needsFirstTimeSetup, authPhase, isAuthenticated } = useAuth();
@@ -52,275 +79,337 @@ function AppRoutes() {
}
return (
<Routes>
<Route path="/login" element={<Login />} />
<Route
path="/"
element={
<ProtectedRoute requirePermission="can_view_dashboard">
<Layout>
<Dashboard />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/hosts"
element={
<ProtectedRoute requirePermission="can_view_hosts">
<Layout>
<Hosts />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/hosts/:hostId"
element={
<ProtectedRoute requirePermission="can_view_hosts">
<Layout>
<HostDetail />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/packages"
element={
<ProtectedRoute requirePermission="can_view_packages">
<Layout>
<Packages />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/repositories"
element={
<ProtectedRoute requirePermission="can_view_hosts">
<Layout>
<Repositories />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/repositories/:repositoryId"
element={
<ProtectedRoute requirePermission="can_view_hosts">
<Layout>
<RepositoryDetail />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/users"
element={
<ProtectedRoute requirePermission="can_view_users">
<Layout>
<SettingsUsers />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/permissions"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsUsers />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsServerConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/users"
element={
<ProtectedRoute requirePermission="can_view_users">
<Layout>
<SettingsUsers />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/roles"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsUsers />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/profile"
element={
<ProtectedRoute>
<Layout>
<SettingsLayout>
<Profile />
</SettingsLayout>
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/host-groups"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsHostGroups />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/notifications"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsLayout>
<Notifications />
</SettingsLayout>
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/agent-config"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsAgentConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/agent-config/management"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsAgentConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/server-config"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsServerConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/server-config/version"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsServerConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/alert-channels"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsLayout>
<AlertChannels />
</SettingsLayout>
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/integrations"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<Integrations />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/patch-management"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<PatchManagement />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/server-url"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsServerConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/server-version"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsServerConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/agent-version"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsAgentConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/options"
element={
<ProtectedRoute requirePermission="can_manage_hosts">
<Layout>
<SettingsHostGroups />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/packages/:packageId"
element={
<ProtectedRoute requirePermission="can_view_packages">
<Layout>
<PackageDetail />
</Layout>
</ProtectedRoute>
}
/>
</Routes>
<Suspense fallback={<LoadingFallback />}>
<Routes>
<Route path="/login" element={<Login />} />
<Route
path="/"
element={
<ProtectedRoute requirePermission="can_view_dashboard">
<Layout>
<Dashboard />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/hosts"
element={
<ProtectedRoute requirePermission="can_view_hosts">
<Layout>
<Hosts />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/hosts/:hostId"
element={
<ProtectedRoute requirePermission="can_view_hosts">
<Layout>
<HostDetail />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/packages"
element={
<ProtectedRoute requirePermission="can_view_packages">
<Layout>
<Packages />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/repositories"
element={
<ProtectedRoute requirePermission="can_view_hosts">
<Layout>
<Repositories />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/repositories/:repositoryId"
element={
<ProtectedRoute requirePermission="can_view_hosts">
<Layout>
<RepositoryDetail />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/automation"
element={
<ProtectedRoute requirePermission="can_view_hosts">
<Layout>
<Automation />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/docker"
element={
<ProtectedRoute requirePermission="can_view_reports">
<Layout>
<Docker />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/docker/containers/:id"
element={
<ProtectedRoute requirePermission="can_view_reports">
<Layout>
<DockerContainerDetail />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/docker/images/:id"
element={
<ProtectedRoute requirePermission="can_view_reports">
<Layout>
<DockerImageDetail />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/docker/hosts/:id"
element={
<ProtectedRoute requirePermission="can_view_reports">
<Layout>
<DockerHostDetail />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/users"
element={
<ProtectedRoute requirePermission="can_view_users">
<Layout>
<SettingsUsers />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/permissions"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsUsers />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsServerConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/users"
element={
<ProtectedRoute requirePermission="can_view_users">
<Layout>
<SettingsUsers />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/roles"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsUsers />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/profile"
element={
<ProtectedRoute>
<Layout>
<SettingsLayout>
<Profile />
</SettingsLayout>
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/host-groups"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsHostGroups />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/notifications"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsLayout>
<Notifications />
</SettingsLayout>
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/agent-config"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsAgentConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/agent-config/management"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsAgentConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/server-config"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsServerConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/server-config/version"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsServerConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/alert-channels"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsLayout>
<AlertChannels />
</SettingsLayout>
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/integrations"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<Integrations />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/patch-management"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<PatchManagement />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/server-url"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsServerConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/server-version"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsServerConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/branding"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsServerConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/settings/agent-version"
element={
<ProtectedRoute requirePermission="can_manage_settings">
<Layout>
<SettingsAgentConfig />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/options"
element={
<ProtectedRoute requirePermission="can_manage_hosts">
<Layout>
<SettingsHostGroups />
</Layout>
</ProtectedRoute>
}
/>
<Route
path="/packages/:packageId"
element={
<ProtectedRoute requirePermission="can_view_packages">
<Layout>
<PackageDetail />
</Layout>
</ProtectedRoute>
}
/>
</Routes>
</Suspense>
);
}
@@ -329,7 +418,9 @@ function App() {
<ThemeProvider>
<AuthProvider>
<UpdateNotificationProvider>
<AppRoutes />
<LogoProvider>
<AppRoutes />
</LogoProvider>
</UpdateNotificationProvider>
</AuthProvider>
</ThemeProvider>
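
The App.jsx diff above moves every page to route-level code splitting: each page is imported with React.lazy and the whole route tree is wrapped in one Suspense boundary that renders LoadingFallback while a chunk downloads. A minimal sketch of the same pattern, with illustrative names only:

import { lazy, Suspense } from "react";
import { Route, Routes } from "react-router-dom";

// Each lazily imported page becomes its own bundle chunk,
// fetched the first time its route renders.
const Dashboard = lazy(() => import("./pages/Dashboard"));

const App = () => (
  <Suspense fallback={<p>Loading...</p>}>
    <Routes>
      <Route path="/" element={<Dashboard />} />
    </Routes>
  </Suspense>
);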

View File

@@ -0,0 +1,16 @@
const DiscordIcon = ({ className = "h-5 w-5" }) => {
return (
<svg
viewBox="0 0 24 24"
fill="currentColor"
className={className}
xmlns="http://www.w3.org/2000/svg"
aria-label="Discord"
>
<title>Discord</title>
<path d="M20.317 4.3698a19.7913 19.7913 0 00-4.8851-1.5152.0741.0741 0 00-.0785.0371c-.211.3753-.4447.8648-.6083 1.2495-1.8447-.2762-3.68-.2762-5.4868 0-.1636-.3933-.4058-.8742-.6177-1.2495a.077.077 0 00-.0785-.037 19.7363 19.7363 0 00-4.8852 1.515.0699.0699 0 00-.0321.0277C.5334 9.0458-.319 13.5799.0992 18.0578a.0824.0824 0 00.0312.0561c2.0528 1.5076 4.0413 2.4228 5.9929 3.0294a.0777.0777 0 00.0842-.0276c.4616-.6304.8731-1.2952 1.226-1.9942a.076.076 0 00-.0416-.1057c-.6528-.2476-1.2743-.5495-1.8722-.8923a.077.077 0 01-.0076-.1277c.1258-.0943.2517-.1923.3718-.2914a.0743.0743 0 01.0776-.0105c3.9278 1.7933 8.18 1.7933 12.0614 0a.0739.0739 0 01.0785.0095c.1202.099.246.1981.3728.2924a.077.077 0 01-.0066.1276 12.2986 12.2986 0 01-1.873.8914.0766.0766 0 00-.0407.1067c.3604.698.7719 1.3628 1.225 1.9932a.076.076 0 00.0842.0286c1.961-.6067 3.9495-1.5219 6.0023-3.0294a.077.077 0 00.0313-.0552c.5004-5.177-.8382-9.6739-3.5485-13.6604a.061.061 0 00-.0312-.0286zM8.02 15.3312c-1.1825 0-2.1569-1.0857-2.1569-2.419 0-1.3332.9555-2.4189 2.157-2.4189 1.2108 0 2.1757 1.0952 2.1568 2.419 0 1.3332-.9555 2.4189-2.1569 2.4189zm7.9748 0c-1.1825 0-2.1569-1.0857-2.1569-2.419 0-1.3332.9554-2.4189 2.1569-2.4189 1.2108 0 2.1757 1.0952 2.1568 2.419 0 1.3332-.946 2.4189-2.1568 2.4189z" />
</svg>
);
};
export default DiscordIcon;
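
Usage is the same as any other icon component; the surrounding link below is purely illustrative:

{/* Illustrative call site — renders the Discord mark at 16x16 inside a link. */}
<a href="https://discord.gg/..." aria-label="Join the Discord">
  <DiscordIcon className="h-4 w-4" />
</a>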

View File

@@ -0,0 +1,428 @@
import { GitBranch, Package, Search, Server, User, X } from "lucide-react";
import { useCallback, useEffect, useRef, useState } from "react";
import { useNavigate } from "react-router-dom";
import { searchAPI } from "../utils/api";
const GlobalSearch = () => {
const [query, setQuery] = useState("");
const [results, setResults] = useState(null);
const [isOpen, setIsOpen] = useState(false);
const [isLoading, setIsLoading] = useState(false);
const [selectedIndex, setSelectedIndex] = useState(-1);
const searchRef = useRef(null);
const inputRef = useRef(null);
const navigate = useNavigate();
// Debounce search
const debounceTimerRef = useRef(null);
const performSearch = useCallback(async (searchQuery) => {
if (!searchQuery || searchQuery.trim().length === 0) {
setResults(null);
setIsOpen(false);
return;
}
setIsLoading(true);
try {
const response = await searchAPI.global(searchQuery);
setResults(response.data);
setIsOpen(true);
setSelectedIndex(-1);
} catch (error) {
console.error("Search error:", error);
setResults(null);
} finally {
setIsLoading(false);
}
}, []);
const handleInputChange = (e) => {
const value = e.target.value;
setQuery(value);
// Clear previous timer
if (debounceTimerRef.current) {
clearTimeout(debounceTimerRef.current);
}
// Set new timer
debounceTimerRef.current = setTimeout(() => {
performSearch(value);
}, 300);
};
const handleClear = () => {
// Clear debounce timer to prevent any pending searches
if (debounceTimerRef.current) {
clearTimeout(debounceTimerRef.current);
}
setQuery("");
setResults(null);
setIsOpen(false);
setSelectedIndex(-1);
inputRef.current?.focus();
};
const handleResultClick = (result) => {
// Navigate based on result type
switch (result.type) {
case "host":
navigate(`/hosts/${result.id}`);
break;
case "package":
navigate(`/packages/${result.id}`);
break;
case "repository":
navigate(`/repositories/${result.id}`);
break;
case "user":
// Users don't have detail pages, so navigate to settings
navigate("/settings/users");
break;
default:
break;
}
// Close dropdown and clear
handleClear();
};
// Close dropdown when clicking outside
useEffect(() => {
const handleClickOutside = (event) => {
if (searchRef.current && !searchRef.current.contains(event.target)) {
setIsOpen(false);
}
};
document.addEventListener("mousedown", handleClickOutside);
return () => {
document.removeEventListener("mousedown", handleClickOutside);
};
}, []);
// Keyboard navigation
const flattenedResults = [];
if (results) {
if (results.hosts?.length > 0) {
flattenedResults.push({ type: "header", label: "Hosts" });
flattenedResults.push(...results.hosts);
}
if (results.packages?.length > 0) {
flattenedResults.push({ type: "header", label: "Packages" });
flattenedResults.push(...results.packages);
}
if (results.repositories?.length > 0) {
flattenedResults.push({ type: "header", label: "Repositories" });
flattenedResults.push(...results.repositories);
}
if (results.users?.length > 0) {
flattenedResults.push({ type: "header", label: "Users" });
flattenedResults.push(...results.users);
}
}
const navigableResults = flattenedResults.filter((r) => r.type !== "header");
const handleKeyDown = (e) => {
if (!isOpen || !results) return;
switch (e.key) {
case "ArrowDown":
e.preventDefault();
setSelectedIndex((prev) =>
prev < navigableResults.length - 1 ? prev + 1 : prev,
);
break;
case "ArrowUp":
e.preventDefault();
setSelectedIndex((prev) => (prev > 0 ? prev - 1 : -1));
break;
case "Enter":
e.preventDefault();
if (selectedIndex >= 0 && navigableResults[selectedIndex]) {
handleResultClick(navigableResults[selectedIndex]);
}
break;
case "Escape":
e.preventDefault();
setIsOpen(false);
setSelectedIndex(-1);
break;
default:
break;
}
};
// Get icon for result type
const getResultIcon = (type) => {
switch (type) {
case "host":
return <Server className="h-4 w-4 text-blue-500" />;
case "package":
return <Package className="h-4 w-4 text-green-500" />;
case "repository":
return <GitBranch className="h-4 w-4 text-purple-500" />;
case "user":
return <User className="h-4 w-4 text-orange-500" />;
default:
return null;
}
};
// Get display text for result
const getResultDisplay = (result) => {
switch (result.type) {
case "host":
return {
primary: result.friendly_name || result.hostname,
secondary: result.ip || result.hostname,
};
case "package":
return {
primary: result.name,
secondary: result.description || result.category,
};
case "repository":
return {
primary: result.name,
secondary: result.distribution,
};
case "user":
return {
primary: result.username,
secondary: result.email,
};
default:
return { primary: "", secondary: "" };
}
};
const hasResults =
results &&
(results.hosts?.length > 0 ||
results.packages?.length > 0 ||
results.repositories?.length > 0 ||
results.users?.length > 0);
return (
<div ref={searchRef} className="relative w-full max-w-sm">
<div className="relative">
<div className="pointer-events-none absolute inset-y-0 left-0 flex items-center pl-3">
<Search className="h-5 w-5 text-secondary-400" />
</div>
<input
ref={inputRef}
type="text"
className="block w-full rounded-lg border border-secondary-200 bg-white py-2 pl-10 pr-10 text-sm text-secondary-900 placeholder-secondary-500 focus:border-primary-500 focus:outline-none focus:ring-2 focus:ring-primary-500 dark:border-secondary-600 dark:bg-secondary-700 dark:text-white dark:placeholder-secondary-400"
placeholder="Search hosts, packages, repos, users..."
value={query}
onChange={handleInputChange}
onKeyDown={handleKeyDown}
onFocus={() => {
if (query && results) setIsOpen(true);
}}
/>
{query && (
<button
type="button"
onClick={handleClear}
className="absolute inset-y-0 right-0 flex items-center pr-3 text-secondary-400 hover:text-secondary-600"
>
<X className="h-4 w-4" />
</button>
)}
</div>
{/* Dropdown Results */}
{isOpen && (
<div className="absolute z-50 mt-2 w-full rounded-lg border border-secondary-200 bg-white shadow-lg dark:border-secondary-600 dark:bg-secondary-800">
{isLoading ? (
<div className="px-4 py-2 text-center text-sm text-secondary-500">
Searching...
</div>
) : hasResults ? (
<div className="max-h-96 overflow-y-auto">
{/* Hosts */}
{results.hosts?.length > 0 && (
<div>
<div className="sticky top-0 z-10 bg-secondary-50 px-3 py-1.5 text-xs font-semibold uppercase tracking-wider text-secondary-500 dark:bg-secondary-700 dark:text-secondary-400">
Hosts
</div>
{results.hosts.map((host, _idx) => {
const display = getResultDisplay(host);
const globalIdx = navigableResults.findIndex(
(r) => r.id === host.id && r.type === "host",
);
return (
<button
type="button"
key={host.id}
onClick={() => handleResultClick(host)}
className={`flex w-full items-center gap-2 px-3 py-1.5 text-left transition-colors ${
globalIdx === selectedIndex
? "bg-primary-50 dark:bg-primary-900/20"
: "hover:bg-secondary-50 dark:hover:bg-secondary-700"
}`}
>
{getResultIcon("host")}
<div className="flex-1 min-w-0 flex items-center gap-2">
<span className="text-sm font-medium text-secondary-900 dark:text-white truncate">
{display.primary}
</span>
<span className="text-xs text-secondary-400"></span>
<span className="text-xs text-secondary-500 dark:text-secondary-400 truncate">
{display.secondary}
</span>
</div>
<div className="flex-shrink-0 text-xs text-secondary-400">
{host.os_type}
</div>
</button>
);
})}
</div>
)}
{/* Packages */}
{results.packages?.length > 0 && (
<div>
<div className="sticky top-0 z-10 bg-secondary-50 px-3 py-1.5 text-xs font-semibold uppercase tracking-wider text-secondary-500 dark:bg-secondary-700 dark:text-secondary-400">
Packages
</div>
{results.packages.map((pkg, _idx) => {
const display = getResultDisplay(pkg);
const globalIdx = navigableResults.findIndex(
(r) => r.id === pkg.id && r.type === "package",
);
return (
<button
type="button"
key={pkg.id}
onClick={() => handleResultClick(pkg)}
className={`flex w-full items-center gap-2 px-3 py-1.5 text-left transition-colors ${
globalIdx === selectedIndex
? "bg-primary-50 dark:bg-primary-900/20"
: "hover:bg-secondary-50 dark:hover:bg-secondary-700"
}`}
>
{getResultIcon("package")}
<div className="flex-1 min-w-0 flex items-center gap-2">
<span className="text-sm font-medium text-secondary-900 dark:text-white truncate">
{display.primary}
</span>
{display.secondary && (
<>
<span className="text-xs text-secondary-400">
</span>
<span className="text-xs text-secondary-500 dark:text-secondary-400 truncate">
{display.secondary}
</span>
</>
)}
</div>
<div className="flex-shrink-0 text-xs text-secondary-400">
{pkg.host_count} hosts
</div>
</button>
);
})}
</div>
)}
{/* Repositories */}
{results.repositories?.length > 0 && (
<div>
<div className="sticky top-0 z-10 bg-secondary-50 px-3 py-1.5 text-xs font-semibold uppercase tracking-wider text-secondary-500 dark:bg-secondary-700 dark:text-secondary-400">
Repositories
</div>
{results.repositories.map((repo, _idx) => {
const display = getResultDisplay(repo);
const globalIdx = navigableResults.findIndex(
(r) => r.id === repo.id && r.type === "repository",
);
return (
<button
type="button"
key={repo.id}
onClick={() => handleResultClick(repo)}
className={`flex w-full items-center gap-2 px-3 py-1.5 text-left transition-colors ${
globalIdx === selectedIndex
? "bg-primary-50 dark:bg-primary-900/20"
: "hover:bg-secondary-50 dark:hover:bg-secondary-700"
}`}
>
{getResultIcon("repository")}
<div className="flex-1 min-w-0 flex items-center gap-2">
<span className="text-sm font-medium text-secondary-900 dark:text-white truncate">
{display.primary}
</span>
<span className="text-xs text-secondary-400"></span>
<span className="text-xs text-secondary-500 dark:text-secondary-400 truncate">
{display.secondary}
</span>
</div>
<div className="flex-shrink-0 text-xs text-secondary-400">
{repo.host_count} hosts
</div>
</button>
);
})}
</div>
)}
{/* Users */}
{results.users?.length > 0 && (
<div>
<div className="sticky top-0 z-10 bg-secondary-50 px-3 py-1.5 text-xs font-semibold uppercase tracking-wider text-secondary-500 dark:bg-secondary-700 dark:text-secondary-400">
Users
</div>
{results.users.map((user, _idx) => {
const display = getResultDisplay(user);
const globalIdx = navigableResults.findIndex(
(r) => r.id === user.id && r.type === "user",
);
return (
<button
type="button"
key={user.id}
onClick={() => handleResultClick(user)}
className={`flex w-full items-center gap-2 px-3 py-1.5 text-left transition-colors ${
globalIdx === selectedIndex
? "bg-primary-50 dark:bg-primary-900/20"
: "hover:bg-secondary-50 dark:hover:bg-secondary-700"
}`}
>
{getResultIcon("user")}
<div className="flex-1 min-w-0 flex items-center gap-2">
<span className="text-sm font-medium text-secondary-900 dark:text-white truncate">
{display.primary}
</span>
<span className="text-xs text-secondary-400"></span>
<span className="text-xs text-secondary-500 dark:text-secondary-400 truncate">
{display.secondary}
</span>
</div>
<div className="flex-shrink-0 text-xs text-secondary-400">
{user.role}
</div>
</button>
);
})}
</div>
)}
</div>
) : query.trim() ? (
<div className="px-4 py-2 text-center text-sm text-secondary-500">
No results found for "{query}"
</div>
) : null}
</div>
)}
</div>
);
};
export default GlobalSearch;
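
GlobalSearch keeps the pending timer in a ref and resets it on every keystroke, so only the last value typed within 300 ms reaches searchAPI.global. The same debounce idea as a small standalone hook (useDebouncedSearch is a hypothetical helper, not part of this diff):

import { useRef, useState } from "react";

const useDebouncedSearch = (search, delay = 300) => {
  const timerRef = useRef(null);
  const [results, setResults] = useState(null);

  // Call this from the input's onChange handler.
  const onChange = (value) => {
    if (timerRef.current) clearTimeout(timerRef.current);
    timerRef.current = setTimeout(async () => {
      setResults(value.trim() ? await search(value) : null);
    }, delay);
  };

  return { results, onChange };
};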

View File

@@ -0,0 +1,283 @@
import { Check, ChevronDown, Edit2, X } from "lucide-react";
import { useCallback, useEffect, useMemo, useRef, useState } from "react";
const InlineMultiGroupEdit = ({
value = [], // Array of group IDs
onSave,
onCancel,
options = [],
className = "",
disabled = false,
}) => {
const [isEditing, setIsEditing] = useState(false);
const [selectedValues, setSelectedValues] = useState(value);
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState("");
const [isOpen, setIsOpen] = useState(false);
const [dropdownPosition, setDropdownPosition] = useState({
top: 0,
left: 0,
width: 0,
});
const dropdownRef = useRef(null);
const buttonRef = useRef(null);
useEffect(() => {
if (isEditing && dropdownRef.current) {
dropdownRef.current.focus();
}
}, [isEditing]);
useEffect(() => {
setSelectedValues(value);
// Force re-render when value changes
if (!isEditing) {
setIsOpen(false);
}
}, [value, isEditing]);
// Calculate dropdown position
const calculateDropdownPosition = useCallback(() => {
if (buttonRef.current) {
const rect = buttonRef.current.getBoundingClientRect();
setDropdownPosition({
top: rect.bottom + window.scrollY + 4,
left: rect.left + window.scrollX,
width: rect.width,
});
}
}, []);
// Close dropdown when clicking outside
useEffect(() => {
const handleClickOutside = (event) => {
if (dropdownRef.current && !dropdownRef.current.contains(event.target)) {
setIsOpen(false);
}
};
if (isOpen) {
calculateDropdownPosition();
document.addEventListener("mousedown", handleClickOutside);
window.addEventListener("resize", calculateDropdownPosition);
window.addEventListener("scroll", calculateDropdownPosition);
return () => {
document.removeEventListener("mousedown", handleClickOutside);
window.removeEventListener("resize", calculateDropdownPosition);
window.removeEventListener("scroll", calculateDropdownPosition);
};
}
}, [isOpen, calculateDropdownPosition]);
const handleEdit = () => {
if (disabled) return;
setIsEditing(true);
setSelectedValues(value);
setError("");
// Automatically open dropdown when editing starts
setTimeout(() => {
setIsOpen(true);
}, 0);
};
const handleCancel = () => {
setIsEditing(false);
setSelectedValues(value);
setError("");
setIsOpen(false);
if (onCancel) onCancel();
};
const handleSave = async () => {
if (disabled || isLoading) return;
// Check if values actually changed
const sortedCurrent = [...value].sort();
const sortedSelected = [...selectedValues].sort();
if (JSON.stringify(sortedCurrent) === JSON.stringify(sortedSelected)) {
setIsEditing(false);
setIsOpen(false);
return;
}
setIsLoading(true);
setError("");
try {
await onSave(selectedValues);
setIsEditing(false);
setIsOpen(false);
} catch (err) {
setError(err.message || "Failed to save");
} finally {
setIsLoading(false);
}
};
const handleKeyDown = (e) => {
if (e.key === "Enter") {
e.preventDefault();
handleSave();
} else if (e.key === "Escape") {
e.preventDefault();
handleCancel();
}
};
const toggleGroup = (groupId) => {
setSelectedValues((prev) => {
if (prev.includes(groupId)) {
return prev.filter((id) => id !== groupId);
} else {
return [...prev, groupId];
}
});
};
const _displayValue = useMemo(() => {
if (!value || value.length === 0) {
return "Ungrouped";
}
if (value.length === 1) {
const option = options.find((opt) => opt.id === value[0]);
return option ? option.name : "Unknown Group";
}
return `${value.length} groups`;
}, [value, options]);
const displayGroups = useMemo(() => {
if (!value || value.length === 0) {
return [];
}
return value
.map((groupId) => options.find((opt) => opt.id === groupId))
.filter(Boolean);
}, [value, options]);
if (isEditing) {
return (
<div className={`relative ${className}`} ref={dropdownRef}>
<div className="flex items-center gap-2">
<div className="relative flex-1">
<button
ref={buttonRef}
type="button"
onClick={() => setIsOpen(!isOpen)}
onKeyDown={handleKeyDown}
disabled={isLoading}
className={`w-full px-3 py-1 text-sm border border-secondary-300 dark:border-secondary-600 rounded-md bg-white dark:bg-secondary-800 text-secondary-900 dark:text-white focus:outline-none focus:ring-2 focus:ring-primary-500 focus:border-transparent flex items-center justify-between ${
error ? "border-red-500" : ""
} ${isLoading ? "opacity-50" : ""}`}
>
<span className="truncate">
{selectedValues.length === 0
? "Ungrouped"
: selectedValues.length === 1
? options.find((opt) => opt.id === selectedValues[0])
?.name || "Unknown Group"
: `${selectedValues.length} groups selected`}
</span>
<ChevronDown className="h-4 w-4 flex-shrink-0" />
</button>
{isOpen && (
<div
className="fixed z-50 bg-white dark:bg-secondary-800 border border-secondary-300 dark:border-secondary-600 rounded-md shadow-lg max-h-60 overflow-auto"
style={{
top: `${dropdownPosition.top}px`,
left: `${dropdownPosition.left}px`,
width: `${dropdownPosition.width}px`,
minWidth: "200px",
}}
>
<div className="py-1">
{options.map((option) => (
<label
key={option.id}
className="w-full px-3 py-2 text-left text-sm hover:bg-secondary-100 dark:hover:bg-secondary-700 flex items-center cursor-pointer"
>
<input
type="checkbox"
checked={selectedValues.includes(option.id)}
onChange={() => toggleGroup(option.id)}
className="mr-2 h-4 w-4 text-primary-600 focus:ring-primary-500 border-secondary-300 rounded"
/>
<span
className="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium text-white"
style={{ backgroundColor: option.color }}
>
{option.name}
</span>
</label>
))}
{options.length === 0 && (
<div className="px-3 py-2 text-sm text-secondary-500 dark:text-secondary-400">
No groups available
</div>
)}
</div>
</div>
)}
</div>
<button
type="button"
onClick={handleSave}
disabled={isLoading}
className="p-1 text-green-600 hover:text-green-700 hover:bg-green-50 dark:hover:bg-green-900/20 rounded transition-colors disabled:opacity-50 disabled:cursor-not-allowed"
title="Save"
>
<Check className="h-4 w-4" />
</button>
<button
type="button"
onClick={handleCancel}
disabled={isLoading}
className="p-1 text-red-600 hover:text-red-700 hover:bg-red-50 dark:hover:bg-red-900/20 rounded transition-colors disabled:opacity-50 disabled:cursor-not-allowed"
title="Cancel"
>
<X className="h-4 w-4" />
</button>
</div>
{error && (
<span className="text-xs text-red-600 dark:text-red-400 mt-1 block">
{error}
</span>
)}
</div>
);
}
return (
<div className={`flex items-center gap-1 group ${className}`}>
{displayGroups.length === 0 ? (
<span className="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-secondary-100 text-secondary-800">
Ungrouped
</span>
) : (
<div className="flex items-center gap-1 flex-wrap">
{displayGroups.map((group) => (
<span
key={group.id}
className="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium text-white"
style={{ backgroundColor: group.color }}
>
{group.name}
</span>
))}
</div>
)}
{!disabled && (
<button
type="button"
onClick={handleEdit}
className="p-1 text-secondary-400 hover:text-secondary-600 dark:hover:text-secondary-300 hover:bg-secondary-100 dark:hover:bg-secondary-700 rounded transition-colors opacity-0 group-hover:opacity-100"
title="Edit groups"
>
<Edit2 className="h-3 w-3" />
</button>
)}
</div>
);
};
export default InlineMultiGroupEdit;
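
A plausible call site in a hosts table row; the host shape, the hostGroups options, and the updateHostGroups helper are assumptions for illustration only:

{/* Illustrative only — value is an array of group IDs, options carry id/name/color. */}
<InlineMultiGroupEdit
  value={host.host_group_ids}
  options={hostGroups}
  onSave={async (groupIds) => {
    await updateHostGroups(host.id, groupIds); // assumed API helper
  }}
  disabled={!canManageHosts}
/>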

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,44 @@
import { useQuery } from "@tanstack/react-query";
import { useTheme } from "../contexts/ThemeContext";
import { settingsAPI } from "../utils/api";
const Logo = ({
className = "h-8 w-auto",
alt = "PatchMon Logo",
...props
}) => {
const { isDark } = useTheme();
const { data: settings } = useQuery({
queryKey: ["settings"],
queryFn: () => settingsAPI.get().then((res) => res.data),
});
// Determine which logo to use based on theme
const logoSrc = isDark
? settings?.logo_dark || "/assets/logo_dark.png"
: settings?.logo_light || "/assets/logo_light.png";
// Add cache-busting parameter using updated_at timestamp
const cacheBuster = settings?.updated_at
? new Date(settings.updated_at).getTime()
: Date.now();
const logoSrcWithCache = `${logoSrc}?v=${cacheBuster}`;
return (
<img
src={logoSrcWithCache}
alt={alt}
className={className}
onError={(e) => {
// Fallback to default logo if custom logo fails to load
e.target.src = isDark
? "/assets/logo_dark.png"
: "/assets/logo_light.png";
}}
{...props}
/>
);
};
export default Logo;

View File

@@ -0,0 +1,42 @@
import { useQuery } from "@tanstack/react-query";
import { useEffect } from "react";
import { isAuthReady } from "../constants/authPhases";
import { useAuth } from "../contexts/AuthContext";
import { settingsAPI } from "../utils/api";
const LogoProvider = ({ children }) => {
const { authPhase, isAuthenticated } = useAuth();
const { data: settings } = useQuery({
queryKey: ["settings"],
queryFn: () => settingsAPI.get().then((res) => res.data),
enabled: isAuthReady(authPhase, isAuthenticated()),
});
useEffect(() => {
// Use custom favicon or fallback to default
const faviconUrl = settings?.favicon || "/assets/favicon.svg";
// Add cache-busting parameter using updated_at timestamp
const cacheBuster = settings?.updated_at
? new Date(settings.updated_at).getTime()
: Date.now();
const faviconUrlWithCache = `${faviconUrl}?v=${cacheBuster}`;
// Update favicon
const favicon = document.querySelector('link[rel="icon"]');
if (favicon) {
favicon.href = faviconUrlWithCache;
} else {
// Create favicon link if it doesn't exist
const link = document.createElement("link");
link.rel = "icon";
link.href = faviconUrlWithCache;
document.head.appendChild(link);
}
}, [settings?.favicon, settings?.updated_at]);
return children;
};
export default LogoProvider;
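
Both Logo and LogoProvider append the settings row's updated_at timestamp as a ?v= query parameter, so a newly uploaded logo or favicon bypasses the browser cache without a hard refresh. The URL is built roughly like this (the example value is illustrative):

const cacheBuster = settings?.updated_at
  ? new Date(settings.updated_at).getTime() // ms since epoch; changes whenever settings are saved
  : Date.now();
const faviconUrl = `${settings?.favicon || "/assets/favicon.svg"}?v=${cacheBuster}`;
// e.g. "/assets/favicon.svg?v=1760737464000"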

View File

@@ -4,6 +4,7 @@ import {
ChevronRight,
Code,
Folder,
Image,
RefreshCw,
Settings,
Shield,
@@ -81,6 +82,7 @@ const SettingsLayout = ({ children }) => {
name: "Alert Channels",
href: "/settings/alert-channels",
icon: Bell,
comingSoon: true,
},
{
name: "Notifications",
@@ -117,7 +119,6 @@ const SettingsLayout = ({ children }) => {
name: "Integrations",
href: "/settings/integrations",
icon: Wrench,
comingSoon: true,
},
],
});
@@ -130,6 +131,11 @@ const SettingsLayout = ({ children }) => {
href: "/settings/server-url",
icon: Wrench,
},
{
name: "Branding",
href: "/settings/branding",
icon: Image,
},
{
name: "Server Version",
href: "/settings/server-version",

View File

@@ -26,7 +26,7 @@ const AgentManagementTab = () => {
});
// Helper function to get curl flags based on settings
const getCurlFlags = () => {
const _getCurlFlags = () => {
return settings?.ignore_ssl_self_signed ? "-sk" : "-s";
};
@@ -177,29 +177,40 @@ const AgentManagementTab = () => {
Agent Uninstall Command
</h3>
<div className="mt-2 text-sm text-red-700 dark:text-red-300">
<p className="mb-2">
<p className="mb-3">
To completely remove PatchMon from a host:
</p>
<div className="flex items-center gap-2">
<div className="bg-red-100 dark:bg-red-800 rounded p-2 font-mono text-xs flex-1">
curl {getCurlFlags()} {window.location.origin}
/api/v1/hosts/remove | sudo bash
{/* Go Agent Uninstall */}
<div className="mb-3">
<div className="space-y-2">
<div className="flex items-center gap-2">
<div className="bg-red-100 dark:bg-red-800 rounded p-2 font-mono text-xs flex-1">
sudo patchmon-agent uninstall
</div>
<button
type="button"
onClick={() => {
navigator.clipboard.writeText(
"sudo patchmon-agent uninstall",
);
}}
className="px-2 py-1 bg-red-200 dark:bg-red-700 text-red-800 dark:text-red-200 rounded text-xs hover:bg-red-300 dark:hover:bg-red-600 transition-colors"
>
Copy
</button>
</div>
<div className="text-xs text-red-600 dark:text-red-400">
Options: <code>--remove-config</code>,{" "}
<code>--remove-logs</code>, <code>--remove-all</code>,{" "}
<code>--force</code>
</div>
</div>
<button
type="button"
onClick={() => {
const command = `curl ${getCurlFlags()} ${window.location.origin}/api/v1/hosts/remove | sudo bash`;
navigator.clipboard.writeText(command);
// You could add a toast notification here
}}
className="px-2 py-1 bg-red-200 dark:bg-red-700 text-red-800 dark:text-red-200 rounded text-xs hover:bg-red-300 dark:hover:bg-red-600 transition-colors"
>
Copy
</button>
</div>
<p className="mt-2 text-xs">
This will remove all PatchMon files, configuration, and
crontab entries
This command will remove all PatchMon files,
configuration, and crontab entries
</p>
</div>
</div>

View File

@@ -446,6 +446,53 @@ const AgentUpdatesTab = () => {
</div>
)}
</form>
{/* Uninstall Instructions */}
<div className="bg-red-50 dark:bg-red-900 border border-red-200 dark:border-red-700 rounded-md p-4">
<div className="flex">
<Shield className="h-5 w-5 text-red-400 dark:text-red-300" />
<div className="ml-3">
<h3 className="text-sm font-medium text-red-800 dark:text-red-200">
Agent Uninstall Command
</h3>
<div className="mt-2 text-sm text-red-700 dark:text-red-300">
<p className="mb-3">To completely remove PatchMon from a host:</p>
{/* Go Agent Uninstall */}
<div className="mb-3">
<div className="space-y-2">
<div className="flex items-center gap-2">
<div className="bg-red-100 dark:bg-red-800 rounded p-2 font-mono text-xs flex-1">
sudo patchmon-agent uninstall
</div>
<button
type="button"
onClick={() => {
navigator.clipboard.writeText(
"sudo patchmon-agent uninstall",
);
}}
className="px-2 py-1 bg-red-200 dark:bg-red-700 text-red-800 dark:text-red-200 rounded text-xs hover:bg-red-300 dark:hover:bg-red-600 transition-colors"
>
Copy
</button>
</div>
<div className="text-xs text-red-600 dark:text-red-400">
Options: <code>--remove-config</code>,{" "}
<code>--remove-logs</code>, <code>--remove-all</code>,{" "}
<code>--force</code>
</div>
</div>
</div>
<p className="mt-2 text-xs">
This command will remove all PatchMon files, configuration,
and crontab entries
</p>
</div>
</div>
</div>
</div>
</div>
);
};

View File

@@ -0,0 +1,531 @@
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
import { AlertCircle, Image, RotateCcw, Upload, X } from "lucide-react";
import { useState } from "react";
import { settingsAPI } from "../../utils/api";
const BrandingTab = () => {
// Logo management state
const [logoUploadState, setLogoUploadState] = useState({
dark: { uploading: false, error: null },
light: { uploading: false, error: null },
favicon: { uploading: false, error: null },
});
const [showLogoUploadModal, setShowLogoUploadModal] = useState(false);
const [selectedLogoType, setSelectedLogoType] = useState("dark");
const queryClient = useQueryClient();
// Fetch current settings
const {
data: settings,
isLoading,
error,
} = useQuery({
queryKey: ["settings"],
queryFn: () => settingsAPI.get().then((res) => res.data),
});
// Logo upload mutation
const uploadLogoMutation = useMutation({
mutationFn: ({ logoType, fileContent, fileName }) =>
fetch("/api/v1/settings/logos/upload", {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${localStorage.getItem("token")}`,
},
body: JSON.stringify({ logoType, fileContent, fileName }),
}).then((res) => res.json()),
onSuccess: (_data, variables) => {
queryClient.invalidateQueries(["settings"]);
setLogoUploadState((prev) => ({
...prev,
[variables.logoType]: { uploading: false, error: null },
}));
setShowLogoUploadModal(false);
},
onError: (error, variables) => {
console.error("Upload logo error:", error);
setLogoUploadState((prev) => ({
...prev,
[variables.logoType]: {
uploading: false,
error: error.message || "Failed to upload logo",
},
}));
},
});
// Logo reset mutation
const resetLogoMutation = useMutation({
mutationFn: (logoType) =>
fetch("/api/v1/settings/logos/reset", {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${localStorage.getItem("token")}`,
},
body: JSON.stringify({ logoType }),
}).then((res) => res.json()),
onSuccess: () => {
queryClient.invalidateQueries(["settings"]);
},
onError: (error) => {
console.error("Reset logo error:", error);
},
});
if (isLoading) {
return (
<div className="flex items-center justify-center h-64">
<div className="animate-spin rounded-full h-8 w-8 border-b-2 border-primary-600"></div>
</div>
);
}
if (error) {
return (
<div className="bg-red-50 dark:bg-red-900 border border-red-200 dark:border-red-700 rounded-md p-4">
<div className="flex">
<AlertCircle className="h-5 w-5 text-red-400 dark:text-red-300" />
<div className="ml-3">
<h3 className="text-sm font-medium text-red-800 dark:text-red-200">
Error loading settings
</h3>
<p className="mt-1 text-sm text-red-700 dark:text-red-300">
{error.response?.data?.error || "Failed to load settings"}
</p>
</div>
</div>
</div>
);
}
return (
<div className="space-y-6">
<div className="flex items-center mb-6">
<Image className="h-6 w-6 text-primary-600 mr-3" />
<h2 className="text-xl font-semibold text-secondary-900 dark:text-white">
Logo & Branding
</h2>
</div>
<p className="text-sm text-secondary-500 dark:text-secondary-300 mb-6">
Customize your PatchMon installation with custom logos and favicon.
These will be displayed throughout the application.
</p>
<div className="grid grid-cols-1 md:grid-cols-3 gap-6">
{/* Dark Logo */}
<div className="bg-white dark:bg-secondary-800 rounded-lg p-6 border border-secondary-200 dark:border-secondary-600">
<h4 className="text-sm font-medium text-secondary-900 dark:text-white mb-4">
Dark Logo
</h4>
<div className="flex items-center justify-center p-4 bg-secondary-50 dark:bg-secondary-700 rounded-lg mb-4">
<img
src={`${settings?.logo_dark || "/assets/logo_dark.png"}?v=${Date.now()}`}
alt="Dark Logo"
className="max-h-16 max-w-full object-contain"
onError={(e) => {
e.target.src = "/assets/logo_dark.png";
}}
/>
</div>
<p className="text-xs text-secondary-600 dark:text-secondary-400 mb-4 truncate">
{settings?.logo_dark
? settings.logo_dark.split("/").pop()
: "logo_dark.png (Default)"}
</p>
<div className="space-y-2">
<button
type="button"
onClick={() => {
setSelectedLogoType("dark");
setShowLogoUploadModal(true);
}}
disabled={logoUploadState.dark.uploading}
className="w-full btn-outline flex items-center justify-center gap-2"
>
{logoUploadState.dark.uploading ? (
<>
<div className="animate-spin rounded-full h-4 w-4 border-b-2 border-current"></div>
Uploading...
</>
) : (
<>
<Upload className="h-4 w-4" />
Upload Dark Logo
</>
)}
</button>
{settings?.logo_dark && (
<button
type="button"
onClick={() => resetLogoMutation.mutate("dark")}
disabled={resetLogoMutation.isPending}
className="w-full btn-outline flex items-center justify-center gap-2 text-orange-600 hover:text-orange-700 border-orange-300 hover:border-orange-400"
>
<RotateCcw className="h-4 w-4" />
Reset to Default
</button>
)}
</div>
{logoUploadState.dark.error && (
<p className="text-xs text-red-600 dark:text-red-400 mt-2">
{logoUploadState.dark.error}
</p>
)}
</div>
{/* Light Logo */}
<div className="bg-white dark:bg-secondary-800 rounded-lg p-6 border border-secondary-200 dark:border-secondary-600">
<h4 className="text-sm font-medium text-secondary-900 dark:text-white mb-4">
Light Logo
</h4>
<div className="flex items-center justify-center p-4 bg-secondary-50 dark:bg-secondary-700 rounded-lg mb-4">
<img
src={`${settings?.logo_light || "/assets/logo_light.png"}?v=${Date.now()}`}
alt="Light Logo"
className="max-h-16 max-w-full object-contain"
onError={(e) => {
e.target.src = "/assets/logo_light.png";
}}
/>
</div>
<p className="text-xs text-secondary-600 dark:text-secondary-400 mb-4 truncate">
{settings?.logo_light
? settings.logo_light.split("/").pop()
: "logo_light.png (Default)"}
</p>
<div className="space-y-2">
<button
type="button"
onClick={() => {
setSelectedLogoType("light");
setShowLogoUploadModal(true);
}}
disabled={logoUploadState.light.uploading}
className="w-full btn-outline flex items-center justify-center gap-2"
>
{logoUploadState.light.uploading ? (
<>
<div className="animate-spin rounded-full h-4 w-4 border-b-2 border-current"></div>
Uploading...
</>
) : (
<>
<Upload className="h-4 w-4" />
Upload Light Logo
</>
)}
</button>
{settings?.logo_light && (
<button
type="button"
onClick={() => resetLogoMutation.mutate("light")}
disabled={resetLogoMutation.isPending}
className="w-full btn-outline flex items-center justify-center gap-2 text-orange-600 hover:text-orange-700 border-orange-300 hover:border-orange-400"
>
<RotateCcw className="h-4 w-4" />
Reset to Default
</button>
)}
</div>
{logoUploadState.light.error && (
<p className="text-xs text-red-600 dark:text-red-400 mt-2">
{logoUploadState.light.error}
</p>
)}
</div>
{/* Favicon */}
<div className="bg-white dark:bg-secondary-800 rounded-lg p-6 border border-secondary-200 dark:border-secondary-600">
<h4 className="text-sm font-medium text-secondary-900 dark:text-white mb-4">
Favicon
</h4>
<div className="flex items-center justify-center p-4 bg-secondary-50 dark:bg-secondary-700 rounded-lg mb-4">
<img
src={`${settings?.favicon || "/assets/favicon.svg"}?v=${Date.now()}`}
alt="Favicon"
className="h-8 w-8 object-contain"
onError={(e) => {
e.target.src = "/assets/favicon.svg";
}}
/>
</div>
<p className="text-xs text-secondary-600 dark:text-secondary-400 mb-4 truncate">
{settings?.favicon
? settings.favicon.split("/").pop()
: "favicon.svg (Default)"}
</p>
<div className="space-y-2">
<button
type="button"
onClick={() => {
setSelectedLogoType("favicon");
setShowLogoUploadModal(true);
}}
disabled={logoUploadState.favicon.uploading}
className="w-full btn-outline flex items-center justify-center gap-2"
>
{logoUploadState.favicon.uploading ? (
<>
<div className="animate-spin rounded-full h-4 w-4 border-b-2 border-current"></div>
Uploading...
</>
) : (
<>
<Upload className="h-4 w-4" />
Upload Favicon
</>
)}
</button>
{settings?.favicon && (
<button
type="button"
onClick={() => resetLogoMutation.mutate("favicon")}
disabled={resetLogoMutation.isPending}
className="w-full btn-outline flex items-center justify-center gap-2 text-orange-600 hover:text-orange-700 border-orange-300 hover:border-orange-400"
>
<RotateCcw className="h-4 w-4" />
Reset to Default
</button>
)}
</div>
{logoUploadState.favicon.error && (
<p className="text-xs text-red-600 dark:text-red-400 mt-2">
{logoUploadState.favicon.error}
</p>
)}
</div>
</div>
{/* Usage Instructions */}
<div className="bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-700 rounded-md p-4 mt-6">
<div className="flex">
<Image className="h-5 w-5 text-blue-400 dark:text-blue-300" />
<div className="ml-3">
<h3 className="text-sm font-medium text-blue-800 dark:text-blue-200">
Logo Usage
</h3>
<div className="mt-2 text-sm text-blue-700 dark:text-blue-300">
<p className="mb-2">
These logos are used throughout the application:
</p>
<ul className="list-disc list-inside space-y-1">
<li>
<strong>Dark Logo:</strong> Used in dark mode and on light
backgrounds
</li>
<li>
<strong>Light Logo:</strong> Used in light mode and on dark
backgrounds
</li>
<li>
<strong>Favicon:</strong> Used as the browser tab icon (SVG
recommended)
</li>
</ul>
<p className="mt-3 text-xs">
<strong>Supported formats:</strong> PNG, JPG, SVG |{" "}
<strong>Max size:</strong> 5MB |{" "}
<strong>Recommended sizes:</strong> 200x60px for logos, 32x32px
for favicon.
</p>
</div>
</div>
</div>
</div>
{/* Logo Upload Modal */}
{showLogoUploadModal && (
<LogoUploadModal
isOpen={showLogoUploadModal}
onClose={() => setShowLogoUploadModal(false)}
onSubmit={uploadLogoMutation.mutate}
isLoading={uploadLogoMutation.isPending}
error={uploadLogoMutation.error}
logoType={selectedLogoType}
/>
)}
</div>
);
};
// Logo Upload Modal Component
const LogoUploadModal = ({
isOpen,
onClose,
onSubmit,
isLoading,
error,
logoType,
}) => {
const [selectedFile, setSelectedFile] = useState(null);
const [previewUrl, setPreviewUrl] = useState(null);
const [uploadError, setUploadError] = useState("");
const handleFileSelect = (e) => {
const file = e.target.files[0];
if (file) {
// Validate file type
const allowedTypes = [
"image/png",
"image/jpeg",
"image/jpg",
"image/svg+xml",
];
if (!allowedTypes.includes(file.type)) {
setUploadError("Please select a PNG, JPG, or SVG file");
return;
}
// Validate file size (5MB limit)
if (file.size > 5 * 1024 * 1024) {
setUploadError("File size must be less than 5MB");
return;
}
setSelectedFile(file);
setUploadError("");
// Create preview URL
const url = URL.createObjectURL(file);
setPreviewUrl(url);
}
};
const handleSubmit = (e) => {
e.preventDefault();
setUploadError("");
if (!selectedFile) {
setUploadError("Please select a file");
return;
}
// Convert file to base64
const reader = new FileReader();
reader.onload = (event) => {
const base64 = event.target.result;
onSubmit({
logoType,
fileContent: base64,
fileName: selectedFile.name,
});
};
reader.readAsDataURL(selectedFile);
};
const handleClose = () => {
setSelectedFile(null);
setPreviewUrl(null);
setUploadError("");
onClose();
};
if (!isOpen) return null;
return (
<div className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
<div className="bg-white dark:bg-secondary-800 rounded-lg shadow-xl max-w-2xl w-full mx-4 max-h-[90vh] overflow-y-auto">
<div className="px-6 py-4 border-b border-secondary-200 dark:border-secondary-600">
<div className="flex items-center justify-between">
<h3 className="text-lg font-medium text-secondary-900 dark:text-white">
Upload{" "}
{logoType === "favicon"
? "Favicon"
: `${logoType.charAt(0).toUpperCase() + logoType.slice(1)} Logo`}
</h3>
<button
type="button"
onClick={handleClose}
className="text-secondary-400 hover:text-secondary-600 dark:text-secondary-500 dark:hover:text-secondary-300"
>
<X className="h-5 w-5" />
</button>
</div>
</div>
<form onSubmit={handleSubmit} className="px-6 py-4">
<div className="space-y-4">
<div>
<label className="block">
<span className="block text-sm font-medium text-secondary-700 dark:text-secondary-200 mb-2">
Select File
</span>
<input
type="file"
accept="image/png,image/jpeg,image/jpg,image/svg+xml"
onChange={handleFileSelect}
className="block w-full text-sm text-secondary-500 dark:text-secondary-400 file:mr-4 file:py-2 file:px-4 file:rounded-md file:border-0 file:text-sm file:font-medium file:bg-primary-50 file:text-primary-700 hover:file:bg-primary-100 dark:file:bg-primary-900 dark:file:text-primary-200"
/>
</label>
<p className="mt-1 text-xs text-secondary-500 dark:text-secondary-400">
Supported formats: PNG, JPG, SVG. Max size: 5MB.
{logoType === "favicon"
? " Recommended: 32x32px SVG."
: " Recommended: 200x60px."}
</p>
</div>
{previewUrl && (
<div>
<div className="block text-sm font-medium text-secondary-700 dark:text-secondary-200 mb-2">
Preview
</div>
<div className="flex items-center justify-center p-4 bg-white dark:bg-secondary-800 rounded-lg border border-secondary-200 dark:border-secondary-600">
<img
src={previewUrl}
alt="Preview"
className={`object-contain ${
logoType === "favicon" ? "h-8 w-8" : "max-h-16 max-w-full"
}`}
/>
</div>
</div>
)}
{(uploadError || error) && (
<div className="bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-md p-3">
<p className="text-sm text-red-800 dark:text-red-200">
{uploadError ||
error?.response?.data?.error ||
error?.message}
</p>
</div>
)}
<div className="bg-yellow-50 dark:bg-yellow-900/20 border border-yellow-200 dark:border-yellow-800 rounded-md p-3">
<div className="flex">
<AlertCircle className="h-4 w-4 text-yellow-600 dark:text-yellow-400 mr-2 mt-0.5" />
<div className="text-sm text-yellow-800 dark:text-yellow-200">
<p className="font-medium">Important:</p>
<ul className="mt-1 list-disc list-inside space-y-1">
<li>This will replace the current {logoType} logo</li>
<li>A backup will be created automatically</li>
<li>The change will be applied immediately</li>
</ul>
</div>
</div>
</div>
</div>
<div className="flex justify-end gap-3 mt-6">
<button type="button" onClick={handleClose} className="btn-outline">
Cancel
</button>
<button
type="submit"
disabled={isLoading || !selectedFile}
className="btn-primary"
>
{isLoading ? "Uploading..." : "Upload Logo"}
</button>
</div>
</form>
</div>
</div>
);
};
export default BrandingTab;
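
BrandingTab reads the chosen file with FileReader.readAsDataURL and posts the resulting base64 data URL as JSON to /api/v1/settings/logos/upload. A condensed sketch of that request outside the component (the response shape is not shown in the diff, so none is assumed here):

// Sketch only: turn a File into a data URL and send it to the upload endpoint.
const uploadLogo = (file, logoType) =>
  new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onerror = reject;
    reader.onload = () =>
      resolve(
        fetch("/api/v1/settings/logos/upload", {
          method: "POST",
          headers: {
            "Content-Type": "application/json",
            Authorization: `Bearer ${localStorage.getItem("token")}`,
          },
          body: JSON.stringify({
            logoType, // "dark" | "light" | "favicon"
            fileContent: reader.result, // "data:image/png;base64,..."
            fileName: file.name,
          }),
        }).then((res) => res.json()),
      );
    reader.readAsDataURL(file);
  });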

View File

@@ -54,7 +54,7 @@ const UsersTab = () => {
});
// Update user mutation
const updateUserMutation = useMutation({
const _updateUserMutation = useMutation({
mutationFn: ({ id, data }) => adminUsersAPI.update(id, data),
onSuccess: () => {
queryClient.invalidateQueries(["users"]);
@@ -92,7 +92,12 @@ const UsersTab = () => {
};
const handleEditUser = (user) => {
setEditingUser(user);
// Reset editingUser first to force re-render with fresh data
setEditingUser(null);
// Use setTimeout to ensure the modal re-initializes with fresh data
setTimeout(() => {
setEditingUser(user);
}, 0);
};
const handleResetPassword = (user) => {
@@ -314,7 +319,8 @@ const UsersTab = () => {
user={editingUser}
isOpen={!!editingUser}
onClose={() => setEditingUser(null)}
onUserUpdated={() => updateUserMutation.mutate()}
onUpdateUser={updateUserMutation.mutate}
isLoading={updateUserMutation.isPending}
roles={roles}
/>
)}
@@ -352,11 +358,29 @@ const AddUserModal = ({ isOpen, onClose, onUserCreated, roles }) => {
});
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState("");
const [success, setSuccess] = useState(false);
// Reset form when modal is closed
useEffect(() => {
if (!isOpen) {
setFormData({
username: "",
email: "",
password: "",
first_name: "",
last_name: "",
role: "user",
});
setError("");
setSuccess(false);
}
}, [isOpen]);
const handleSubmit = async (e) => {
e.preventDefault();
setIsLoading(true);
setError("");
setSuccess(false);
try {
// Only send role if roles are available from API
@@ -364,12 +388,19 @@ const AddUserModal = ({ isOpen, onClose, onUserCreated, roles }) => {
username: formData.username,
email: formData.email,
password: formData.password,
first_name: formData.first_name,
last_name: formData.last_name,
};
if (roles && Array.isArray(roles) && roles.length > 0) {
payload.role = formData.role;
}
await adminUsersAPI.create(payload);
setSuccess(true);
onUserCreated();
// Auto-close after 1.5 seconds
setTimeout(() => {
onClose();
}, 1500);
} catch (err) {
setError(err.response?.data?.error || "Failed to create user");
} finally {
@@ -517,6 +548,17 @@ const AddUserModal = ({ isOpen, onClose, onUserCreated, roles }) => {
</select>
</div>
{success && (
<div className="bg-green-50 dark:bg-green-900 border border-green-200 dark:border-green-700 rounded-md p-3">
<div className="flex items-center">
<CheckCircle className="h-4 w-4 text-green-600 dark:text-green-400 mr-2" />
<p className="text-sm text-green-700 dark:text-green-300">
User created successfully!
</p>
</div>
</div>
)}
{error && (
<div className="bg-danger-50 dark:bg-danger-900 border border-danger-200 dark:border-danger-700 rounded-md p-3">
<p className="text-sm text-danger-700 dark:text-danger-300">
@@ -548,7 +590,14 @@ const AddUserModal = ({ isOpen, onClose, onUserCreated, roles }) => {
};
// Edit User Modal Component
const EditUserModal = ({ user, isOpen, onClose, onUserUpdated, roles }) => {
const EditUserModal = ({
user,
isOpen,
onClose,
onUpdateUser,
isLoading,
roles,
}) => {
const editUsernameId = useId();
const editEmailId = useId();
const editFirstNameId = useId();
@@ -564,21 +613,45 @@ const EditUserModal = ({ user, isOpen, onClose, onUserUpdated, roles }) => {
role: user?.role || "user",
is_active: user?.is_active ?? true,
});
const [isLoading, setIsLoading] = useState(false);
const [error, setError] = useState("");
const [success, setSuccess] = useState(false);
// Update formData when user prop changes or modal opens
useEffect(() => {
if (user && isOpen) {
setFormData({
username: user.username || "",
email: user.email || "",
first_name: user.first_name || "",
last_name: user.last_name || "",
role: user.role || "user",
is_active: user.is_active ?? true,
});
}
}, [user, isOpen]);
// Reset error and success when modal closes
useEffect(() => {
if (!isOpen) {
setError("");
setSuccess(false);
}
}, [isOpen]);
const handleSubmit = async (e) => {
e.preventDefault();
setIsLoading(true);
setError("");
setSuccess(false);
try {
await adminUsersAPI.update(user.id, formData);
onUserUpdated();
await onUpdateUser({ id: user.id, data: formData });
setSuccess(true);
// Auto-close after 1.5 seconds
setTimeout(() => {
onClose();
}, 1500);
} catch (err) {
setError(err.response?.data?.error || "Failed to update user");
} finally {
setIsLoading(false);
}
};
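One caveat with this submit handler: it awaits onUpdateUser and relies on the catch block, but TanStack Query's mutate() returns nothing and reports errors through onError, so the await resolves immediately. A hedged sketch of the parent wiring that makes the await meaningful (names as used in this file) is to pass mutateAsync instead:
{/* Sketch: mutateAsync returns a promise, so the modal's try/catch sees the result */}
<EditUserModal
	user={editingUser}
	isOpen={!!editingUser}
	onClose={() => setEditingUser(null)}
	onUpdateUser={updateUserMutation.mutateAsync}
	isLoading={updateUserMutation.isPending}
	roles={roles}
/>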
@@ -718,6 +791,17 @@ const EditUserModal = ({ user, isOpen, onClose, onUserUpdated, roles }) => {
</label>
</div>
{success && (
<div className="bg-green-50 dark:bg-green-900 border border-green-200 dark:border-green-700 rounded-md p-3">
<div className="flex items-center">
<CheckCircle className="h-4 w-4 text-green-600 dark:text-green-400 mr-2" />
<p className="text-sm text-green-700 dark:text-green-300">
User updated successfully!
</p>
</div>
</div>
)}
{error && (
<div className="bg-danger-50 dark:bg-danger-900 border border-danger-200 dark:border-danger-700 rounded-md p-3">
<p className="text-sm text-danger-700 dark:text-danger-300">

View File

@@ -1,30 +1,16 @@
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
import {
AlertCircle,
CheckCircle,
Clock,
Code,
Download,
Save,
ExternalLink,
GitCommit,
} from "lucide-react";
import { useEffect, useId, useState } from "react";
import { settingsAPI, versionAPI } from "../../utils/api";
import { useCallback, useEffect, useState } from "react";
import { versionAPI } from "../../utils/api";
const VersionUpdateTab = () => {
const repoPublicId = useId();
const repoPrivateId = useId();
const useCustomSshKeyId = useId();
const githubRepoUrlId = useId();
const sshKeyPathId = useId();
const [formData, setFormData] = useState({
githubRepoUrl: "git@github.com:9technologygroup/patchmon.net.git",
repositoryType: "public",
sshKeyPath: "",
useCustomSshKey: false,
});
const [errors, setErrors] = useState({});
const [isDirty, setIsDirty] = useState(false);
// Version checking state
const [versionInfo, setVersionInfo] = useState({
currentVersion: null,
@@ -32,89 +18,11 @@ const VersionUpdateTab = () => {
isUpdateAvailable: false,
checking: false,
error: null,
github: null,
});
const [sshTestResult, setSshTestResult] = useState({
testing: false,
success: null,
message: null,
error: null,
});
const queryClient = useQueryClient();
// Fetch current settings
const {
data: settings,
isLoading,
error,
} = useQuery({
queryKey: ["settings"],
queryFn: () => settingsAPI.get().then((res) => res.data),
});
// Update form data when settings are loaded
useEffect(() => {
if (settings) {
const newFormData = {
githubRepoUrl:
settings.github_repo_url ||
"git@github.com:9technologygroup/patchmon.net.git",
repositoryType: settings.repository_type || "public",
sshKeyPath: settings.ssh_key_path || "",
useCustomSshKey: !!settings.ssh_key_path,
};
setFormData(newFormData);
setIsDirty(false);
}
}, [settings]);
// Update settings mutation
const updateSettingsMutation = useMutation({
mutationFn: (data) => {
return settingsAPI.update(data).then((res) => res.data);
},
onSuccess: () => {
queryClient.invalidateQueries(["settings"]);
setIsDirty(false);
setErrors({});
},
onError: (error) => {
if (error.response?.data?.errors) {
setErrors(
error.response.data.errors.reduce((acc, err) => {
acc[err.path] = err.msg;
return acc;
}, {}),
);
} else {
setErrors({
general: error.response?.data?.error || "Failed to update settings",
});
}
},
});
// Load current version on component mount
useEffect(() => {
const loadCurrentVersion = async () => {
try {
const response = await versionAPI.getCurrent();
const data = response.data;
setVersionInfo((prev) => ({
...prev,
currentVersion: data.version,
}));
} catch (error) {
console.error("Error loading current version:", error);
}
};
loadCurrentVersion();
}, []);
// Version checking functions
const checkForUpdates = async () => {
const checkForUpdates = useCallback(async () => {
setVersionInfo((prev) => ({ ...prev, checking: true, error: null }));
try {
@@ -126,6 +34,7 @@ const VersionUpdateTab = () => {
latestVersion: data.latestVersion,
isUpdateAvailable: data.isUpdateAvailable,
last_update_check: data.last_update_check,
github: data.github,
checking: false,
error: null,
});
@@ -137,434 +46,276 @@ const VersionUpdateTab = () => {
error: error.response?.data?.error || "Failed to check for updates",
}));
}
};
}, []);
const testSshKey = async () => {
if (!formData.sshKeyPath || !formData.githubRepoUrl) {
setSshTestResult({
testing: false,
success: false,
message: null,
error: "Please enter both SSH key path and GitHub repository URL",
});
return;
}
// Load current version and automatically check for updates on component mount
useEffect(() => {
const loadAndCheckUpdates = async () => {
try {
// First, get current version info
const response = await versionAPI.getCurrent();
const data = response.data;
setVersionInfo({
currentVersion: data.version,
latestVersion: data.latest_version || null,
isUpdateAvailable: data.is_update_available || false,
last_update_check: data.last_update_check || null,
github: data.github,
checking: false,
error: null,
});
setSshTestResult({
testing: true,
success: null,
message: null,
error: null,
});
// Then automatically trigger a fresh update check
await checkForUpdates();
} catch (error) {
console.error("Error loading version info:", error);
setVersionInfo((prev) => ({
...prev,
error: "Failed to load version information",
}));
}
};
try {
const response = await versionAPI.testSshKey({
sshKeyPath: formData.sshKeyPath,
githubRepoUrl: formData.githubRepoUrl,
});
setSshTestResult({
testing: false,
success: true,
message: response.data.message,
error: null,
});
} catch (error) {
console.error("SSH key test error:", error);
setSshTestResult({
testing: false,
success: false,
message: null,
error: error.response?.data?.error || "Failed to test SSH key",
});
}
};
const handleInputChange = (field, value) => {
setFormData((prev) => ({
...prev,
[field]: value,
}));
setIsDirty(true);
if (errors[field]) {
setErrors((prev) => ({ ...prev, [field]: null }));
}
};
const handleSave = () => {
// Only include sshKeyPath if the toggle is enabled
const dataToSubmit = { ...formData };
if (!dataToSubmit.useCustomSshKey) {
dataToSubmit.sshKeyPath = "";
}
// Remove the frontend-only field
delete dataToSubmit.useCustomSshKey;
updateSettingsMutation.mutate(dataToSubmit);
};
if (isLoading) {
return (
<div className="flex items-center justify-center h-64">
<div className="animate-spin rounded-full h-8 w-8 border-b-2 border-primary-600"></div>
</div>
);
}
if (error) {
return (
<div className="bg-red-50 dark:bg-red-900 border border-red-200 dark:border-red-700 rounded-md p-4">
<div className="flex">
<AlertCircle className="h-5 w-5 text-red-400 dark:text-red-300" />
<div className="ml-3">
<h3 className="text-sm font-medium text-red-800 dark:text-red-200">
Error loading settings
</h3>
<p className="mt-1 text-sm text-red-700 dark:text-red-300">
{error.response?.data?.error || "Failed to load settings"}
</p>
</div>
</div>
</div>
);
}
loadAndCheckUpdates();
}, [checkForUpdates]); // Run when component mounts
return (
<div className="space-y-6">
{errors.general && (
<div className="bg-red-50 dark:bg-red-900 border border-red-200 dark:border-red-700 rounded-md p-4">
<div className="flex">
<AlertCircle className="h-5 w-5 text-red-400 dark:text-red-300" />
<div className="ml-3">
<p className="text-sm text-red-700 dark:text-red-300">
{errors.general}
</p>
</div>
</div>
</div>
)}
<div className="flex items-center mb-6">
<Code className="h-6 w-6 text-primary-600 mr-3" />
<h2 className="text-xl font-semibold text-secondary-900 dark:text-white">
Server Version Management
Server Version Information
</h2>
</div>
<div className="bg-secondary-50 dark:bg-secondary-700 rounded-lg p-6">
<h3 className="text-lg font-medium text-secondary-900 dark:text-white mb-4">
Version Check Configuration
Version Information
</h3>
<p className="text-sm text-secondary-600 dark:text-secondary-300 mb-6">
Configure automatic version checking against your GitHub repository to
notify users of available updates.
Current server version and latest updates from the GitHub repository.
{versionInfo.checking && (
<span className="ml-2 text-blue-600 dark:text-blue-400">
🔄 Checking for updates...
</span>
)}
</p>
<div className="space-y-4">
<fieldset>
<legend className="block text-sm font-medium text-secondary-700 dark:text-secondary-200 mb-2">
Repository Type
</legend>
<div className="space-y-2">
<div className="flex items-center">
<input
type="radio"
id={repoPublicId}
name="repositoryType"
value="public"
checked={formData.repositoryType === "public"}
onChange={(e) =>
handleInputChange("repositoryType", e.target.value)
}
className="h-4 w-4 text-primary-600 focus:ring-primary-500 border-gray-300"
/>
<label
htmlFor={repoPublicId}
className="ml-2 text-sm text-secondary-700 dark:text-secondary-200"
>
Public Repository (uses GitHub API - no authentication
required)
</label>
</div>
<div className="flex items-center">
<input
type="radio"
id={repoPrivateId}
name="repositoryType"
value="private"
checked={formData.repositoryType === "private"}
onChange={(e) =>
handleInputChange("repositoryType", e.target.value)
}
className="h-4 w-4 text-primary-600 focus:ring-primary-500 border-gray-300"
/>
<label
htmlFor={repoPrivateId}
className="ml-2 text-sm text-secondary-700 dark:text-secondary-200"
>
Private Repository (uses SSH with deploy key)
</label>
</div>
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
{/* My Version */}
<div className="bg-white dark:bg-secondary-800 rounded-lg p-4 border border-secondary-200 dark:border-secondary-600">
<div className="flex items-center gap-2 mb-2">
<CheckCircle className="h-4 w-4 text-green-600 dark:text-green-400" />
<span className="text-sm font-medium text-secondary-700 dark:text-secondary-300">
My Version
</span>
</div>
<p className="mt-1 text-xs text-secondary-500 dark:text-secondary-400">
Choose whether your repository is public or private to determine
the appropriate access method.
</p>
</fieldset>
<div>
<label
htmlFor={githubRepoUrlId}
className="block text-sm font-medium text-secondary-700 dark:text-secondary-200 mb-2"
>
GitHub Repository URL
</label>
<input
id={githubRepoUrlId}
type="text"
value={formData.githubRepoUrl || ""}
onChange={(e) =>
handleInputChange("githubRepoUrl", e.target.value)
}
className="w-full border border-secondary-300 dark:border-secondary-600 rounded-md shadow-sm focus:ring-primary-500 focus:border-primary-500 bg-white dark:bg-secondary-700 text-secondary-900 dark:text-white font-mono text-sm"
placeholder="git@github.com:username/repository.git"
/>
<p className="mt-1 text-xs text-secondary-500 dark:text-secondary-400">
SSH or HTTPS URL to your GitHub repository
</p>
<span className="text-lg font-mono text-secondary-900 dark:text-white">
{versionInfo.currentVersion}
</span>
</div>
{formData.repositoryType === "private" && (
<div>
<div className="flex items-center gap-3 mb-3">
<input
type="checkbox"
id={useCustomSshKeyId}
checked={formData.useCustomSshKey}
onChange={(e) => {
const checked = e.target.checked;
handleInputChange("useCustomSshKey", checked);
if (!checked) {
handleInputChange("sshKeyPath", "");
}
}}
className="h-4 w-4 text-primary-600 focus:ring-primary-500 border-gray-300 rounded"
/>
<label
htmlFor={useCustomSshKeyId}
className="text-sm font-medium text-secondary-700 dark:text-secondary-200"
>
Set custom SSH key path
</label>
{/* Latest Release */}
{versionInfo.github?.latestRelease && (
<div className="bg-white dark:bg-secondary-800 rounded-lg p-4 border border-secondary-200 dark:border-secondary-600">
<div className="flex items-center gap-2 mb-2">
<Download className="h-4 w-4 text-blue-600 dark:text-blue-400" />
<span className="text-sm font-medium text-secondary-700 dark:text-secondary-300">
Latest Release
</span>
</div>
<div className="space-y-1">
<span className="text-lg font-mono text-secondary-900 dark:text-white">
{versionInfo.github.latestRelease.tagName}
</span>
{versionInfo.github.latestRelease.publishedAt && (
<div className="text-xs text-secondary-500 dark:text-secondary-400">
Published:{" "}
{new Date(
versionInfo.github.latestRelease.publishedAt,
).toLocaleDateString()}
</div>
)}
</div>
</div>
)}
</div>
{/* GitHub Repository Information */}
{versionInfo.github && (
<div className="bg-white dark:bg-secondary-800 rounded-lg p-4 border border-secondary-200 dark:border-secondary-600 mt-4">
<div className="flex items-center gap-2 mb-4">
<Code className="h-4 w-4 text-purple-600 dark:text-purple-400" />
<span className="text-sm font-medium text-secondary-700 dark:text-secondary-300">
GitHub Repository Information
</span>
</div>
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
{/* Repository URL */}
<div className="space-y-2">
<span className="text-xs font-medium text-secondary-600 dark:text-secondary-400 uppercase tracking-wide">
Repository
</span>
<div className="flex items-center gap-2">
<span className="text-sm text-secondary-900 dark:text-white font-mono">
{versionInfo.github.owner}/{versionInfo.github.repo}
</span>
{versionInfo.github.repository && (
<a
href={versionInfo.github.repository}
target="_blank"
rel="noopener noreferrer"
className="text-blue-600 dark:text-blue-400 hover:text-blue-800 dark:hover:text-blue-300"
>
<ExternalLink className="h-3 w-3" />
</a>
)}
</div>
</div>
{formData.useCustomSshKey && (
<div>
<label
htmlFor={sshKeyPathId}
className="block text-sm font-medium text-secondary-700 dark:text-secondary-200 mb-2"
>
SSH Key Path
</label>
<input
id={sshKeyPathId}
type="text"
value={formData.sshKeyPath || ""}
onChange={(e) =>
handleInputChange("sshKeyPath", e.target.value)
}
className="w-full border border-secondary-300 dark:border-secondary-600 rounded-md shadow-sm focus:ring-primary-500 focus:border-primary-500 bg-white dark:bg-secondary-700 text-secondary-900 dark:text-white font-mono text-sm"
placeholder="/root/.ssh/id_ed25519"
/>
<p className="mt-1 text-xs text-secondary-500 dark:text-secondary-400">
Path to your SSH deploy key. If not set, will auto-detect
from common locations.
</p>
<div className="mt-3">
<button
type="button"
onClick={testSshKey}
disabled={
sshTestResult.testing ||
!formData.sshKeyPath ||
!formData.githubRepoUrl
}
className="px-4 py-2 text-sm font-medium text-white bg-blue-600 border border-transparent rounded-md shadow-sm hover:bg-blue-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-blue-500 disabled:opacity-50 disabled:cursor-not-allowed"
>
{sshTestResult.testing ? "Testing..." : "Test SSH Key"}
</button>
{sshTestResult.success && (
<div className="mt-2 p-3 bg-green-50 dark:bg-green-900/20 border border-green-200 dark:border-green-800 rounded-md">
<div className="flex items-center">
<CheckCircle className="h-4 w-4 text-green-600 dark:text-green-400 mr-2" />
<p className="text-sm text-green-800 dark:text-green-200">
{sshTestResult.message}
</p>
</div>
</div>
)}
{sshTestResult.error && (
<div className="mt-2 p-3 bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-md">
<div className="flex items-center">
<AlertCircle className="h-4 w-4 text-red-600 dark:text-red-400 mr-2" />
<p className="text-sm text-red-800 dark:text-red-200">
{sshTestResult.error}
</p>
</div>
</div>
{/* Latest Release Info */}
{versionInfo.github.latestRelease && (
<div className="space-y-2">
<span className="text-xs font-medium text-secondary-600 dark:text-secondary-400 uppercase tracking-wide">
Release Link
</span>
<div className="flex items-center gap-2">
{versionInfo.github.latestRelease.htmlUrl && (
<a
href={versionInfo.github.latestRelease.htmlUrl}
target="_blank"
rel="noopener noreferrer"
className="text-blue-600 dark:text-blue-400 hover:text-blue-800 dark:hover:text-blue-300 text-sm"
>
View Release{" "}
<ExternalLink className="h-3 w-3 inline ml-1" />
</a>
)}
</div>
</div>
)}
{!formData.useCustomSshKey && (
<p className="text-xs text-secondary-500 dark:text-secondary-400">
Using auto-detection for SSH key location
</p>
{/* Branch Status */}
{versionInfo.github.commitDifference && (
<div className="space-y-2">
<span className="text-xs font-medium text-secondary-600 dark:text-secondary-400 uppercase tracking-wide">
Branch Status
</span>
<div className="text-sm">
{versionInfo.github.commitDifference.commitsAhead > 0 ? (
<span className="text-blue-600 dark:text-blue-400">
🚀 Main branch is{" "}
{versionInfo.github.commitDifference.commitsAhead}{" "}
commits ahead of release
</span>
) : versionInfo.github.commitDifference.commitsBehind >
0 ? (
<span className="text-orange-600 dark:text-orange-400">
📊 Main branch is{" "}
{versionInfo.github.commitDifference.commitsBehind}{" "}
commits behind release
</span>
) : (
<span className="text-green-600 dark:text-green-400">
Main branch is in sync with release
</span>
)}
</div>
</div>
)}
</div>
)}
<div className="grid grid-cols-1 md:grid-cols-2 gap-4">
<div className="bg-white dark:bg-secondary-800 rounded-lg p-4 border border-secondary-200 dark:border-secondary-600">
<div className="flex items-center gap-2 mb-2">
<CheckCircle className="h-4 w-4 text-green-600 dark:text-green-400" />
<span className="text-sm font-medium text-secondary-700 dark:text-secondary-300">
Current Version
</span>
</div>
<span className="text-lg font-mono text-secondary-900 dark:text-white">
{versionInfo.currentVersion}
</span>
</div>
<div className="bg-white dark:bg-secondary-800 rounded-lg p-4 border border-secondary-200 dark:border-secondary-600">
<div className="flex items-center gap-2 mb-2">
<Download className="h-4 w-4 text-blue-600 dark:text-blue-400" />
<span className="text-sm font-medium text-secondary-700 dark:text-secondary-300">
Latest Version
</span>
</div>
<span className="text-lg font-mono text-secondary-900 dark:text-white">
{versionInfo.checking ? (
<span className="text-blue-600 dark:text-blue-400">
Checking...
{/* Latest Commit Information */}
{versionInfo.github.latestCommit && (
<div className="mt-4 pt-4 border-t border-secondary-200 dark:border-secondary-600">
<div className="flex items-center gap-2 mb-2">
<GitCommit className="h-4 w-4 text-orange-600 dark:text-orange-400" />
<span className="text-xs font-medium text-secondary-600 dark:text-secondary-400 uppercase tracking-wide">
Latest Commit (Rolling)
</span>
) : versionInfo.latestVersion ? (
<span
className={
versionInfo.isUpdateAvailable
? "text-orange-600 dark:text-orange-400"
: "text-green-600 dark:text-green-400"
}
>
{versionInfo.latestVersion}
{versionInfo.isUpdateAvailable && " (Update Available!)"}
</span>
) : (
<span className="text-secondary-500 dark:text-secondary-400">
Not checked
</span>
)}
</span>
</div>
</div>
{/* Last Checked Time */}
{versionInfo.last_update_check && (
<div className="bg-white dark:bg-secondary-800 rounded-lg p-4 border border-secondary-200 dark:border-secondary-600">
<div className="flex items-center gap-2 mb-2">
<Clock className="h-4 w-4 text-blue-600 dark:text-blue-400" />
<span className="text-sm font-medium text-secondary-700 dark:text-secondary-300">
Last Checked
</span>
</div>
<span className="text-sm text-secondary-600 dark:text-secondary-400">
{new Date(versionInfo.last_update_check).toLocaleString()}
</span>
<p className="text-xs text-secondary-500 dark:text-secondary-400 mt-1">
Updates are checked automatically every 24 hours
</p>
</div>
)}
<div className="flex items-center justify-between">
<div className="flex items-center gap-3">
<button
type="button"
onClick={checkForUpdates}
disabled={versionInfo.checking}
className="btn-primary flex items-center gap-2"
>
<Download className="h-4 w-4" />
{versionInfo.checking ? "Checking..." : "Check for Updates"}
</button>
</div>
{/* Save Button for Version Settings */}
<button
type="button"
onClick={handleSave}
disabled={!isDirty || updateSettingsMutation.isPending}
className={`inline-flex items-center px-4 py-2 border border-transparent text-sm font-medium rounded-md shadow-sm text-white ${
!isDirty || updateSettingsMutation.isPending
? "bg-secondary-400 cursor-not-allowed"
: "bg-primary-600 hover:bg-primary-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-primary-500"
}`}
>
{updateSettingsMutation.isPending ? (
<>
<div className="animate-spin rounded-full h-4 w-4 border-b-2 border-white mr-2"></div>
Saving...
</>
) : (
<>
<Save className="h-4 w-4 mr-2" />
Save Settings
</>
)}
</button>
</div>
{versionInfo.error && (
<div className="bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-700 rounded-lg p-4">
<div className="flex">
<AlertCircle className="h-5 w-5 text-red-400 dark:text-red-300" />
<div className="ml-3">
<h3 className="text-sm font-medium text-red-800 dark:text-red-200">
Version Check Failed
</h3>
<p className="mt-1 text-sm text-red-700 dark:text-red-300">
{versionInfo.error}
</div>
<div className="space-y-2">
<div className="flex items-center gap-2">
<span className="text-sm font-mono text-secondary-900 dark:text-white">
{versionInfo.github.latestCommit.sha.substring(0, 8)}
</span>
{versionInfo.github.latestCommit.htmlUrl && (
<a
href={versionInfo.github.latestCommit.htmlUrl}
target="_blank"
rel="noopener noreferrer"
className="text-blue-600 dark:text-blue-400 hover:text-blue-800 dark:hover:text-blue-300"
>
<ExternalLink className="h-3 w-3" />
</a>
)}
</div>
<p className="text-sm text-secondary-700 dark:text-secondary-300">
{versionInfo.github.latestCommit.message.split("\n")[0]}
</p>
{versionInfo.error.includes("private") && (
<p className="mt-2 text-xs text-red-600 dark:text-red-400">
For private repositories, you may need to configure GitHub
authentication or make the repository public.
</p>
)}
<div className="flex items-center gap-4 text-xs text-secondary-500 dark:text-secondary-400">
<span>
Author: {versionInfo.github.latestCommit.author}
</span>
<span>
Date:{" "}
{new Date(
versionInfo.github.latestCommit.date,
).toLocaleString()}
</span>
</div>
</div>
</div>
</div>
)}
)}
</div>
)}
{/* Success Message for Version Settings */}
{updateSettingsMutation.isSuccess && (
<div className="bg-green-50 dark:bg-green-900 border border-green-200 dark:border-green-700 rounded-md p-4">
<div className="flex">
<CheckCircle className="h-5 w-5 text-green-400 dark:text-green-300" />
<div className="ml-3">
<p className="text-sm text-green-700 dark:text-green-300">
Settings saved successfully!
</p>
</div>
</div>
{/* Last Checked Time */}
{versionInfo.last_update_check && (
<div className="bg-white dark:bg-secondary-800 rounded-lg p-4 border border-secondary-200 dark:border-secondary-600 mt-4">
<div className="flex items-center gap-2 mb-2">
<Clock className="h-4 w-4 text-blue-600 dark:text-blue-400" />
<span className="text-sm font-medium text-secondary-700 dark:text-secondary-300">
Last Checked
</span>
</div>
)}
<span className="text-sm text-secondary-600 dark:text-secondary-400">
{new Date(versionInfo.last_update_check).toLocaleString()}
</span>
<p className="text-xs text-secondary-500 dark:text-secondary-400 mt-1">
Updates are checked automatically every 24 hours
</p>
</div>
)}
<div className="flex items-center justify-start mt-6">
<button
type="button"
onClick={checkForUpdates}
disabled={versionInfo.checking}
className="btn-primary flex items-center gap-2"
>
<Download className="h-4 w-4" />
{versionInfo.checking ? "Checking..." : "Check for Updates"}
</button>
</div>
{versionInfo.error && (
<div className="bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-700 rounded-lg p-4 mt-4">
<div className="flex">
<AlertCircle className="h-5 w-5 text-red-400 dark:text-red-300" />
<div className="ml-3">
<h3 className="text-sm font-medium text-red-800 dark:text-red-200">
Version Check Failed
</h3>
<p className="mt-1 text-sm text-red-700 dark:text-red-300">
{versionInfo.error}
</p>
</div>
</div>
</div>
)}
</div>
</div>
);
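For reference, the fields read above imply roughly the following shape for the versionAPI.getCurrent() payload (versionAPI.checkUpdates() returns the camelCase variants latestVersion / isUpdateAvailable). This is a reconstruction from the component, not a documented contract, and every value is illustrative:
// Assumed response shape for versionAPI.getCurrent() — values are placeholders
const exampleVersionPayload = {
	version: "1.3.0",
	latest_version: "1.3.0",
	is_update_available: false,
	last_update_check: "2025-10-17T21:00:00Z",
	github: {
		owner: "9technologygroup",
		repo: "patchmon.net",
		repository: "https://github.com/9technologygroup/patchmon.net",
		latestRelease: {
			tagName: "v1.3.0",
			publishedAt: "2025-10-17T20:00:00Z",
			htmlUrl: "https://github.com/9technologygroup/patchmon.net/releases/tag/v1.3.0",
		},
		latestCommit: {
			sha: "abc1234def5678",
			message: "Example commit subject",
			author: "example-user",
			date: "2025-10-17T19:00:00Z",
			htmlUrl: "https://github.com/9technologygroup/patchmon.net/commit/abc1234",
		},
		commitDifference: { commitsAhead: 0, commitsBehind: 0 },
	},
};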

View File

@@ -1,7 +1,7 @@
import { useQuery } from "@tanstack/react-query";
import { createContext, useContext, useMemo, useState } from "react";
import { createContext, useContext, useState } from "react";
import { isAuthReady } from "../constants/authPhases";
import { settingsAPI, versionAPI } from "../utils/api";
import { settingsAPI } from "../utils/api";
import { useAuth } from "./AuthContext";
const UpdateNotificationContext = createContext();
@@ -21,6 +21,7 @@ export const UpdateNotificationProvider = ({ children }) => {
const { authPhase, isAuthenticated } = useAuth();
// Ensure settings are loaded - but only after auth is fully ready
// This reads cached update info from backend (updated by scheduler)
const { data: settings, isLoading: settingsLoading } = useQuery({
queryKey: ["settings"],
queryFn: () => settingsAPI.get().then((res) => res.data),
@@ -29,31 +30,20 @@ export const UpdateNotificationProvider = ({ children }) => {
enabled: isAuthReady(authPhase, isAuthenticated()),
});
// Memoize the enabled condition to prevent unnecessary re-evaluations
const isQueryEnabled = useMemo(() => {
return (
isAuthReady(authPhase, isAuthenticated()) &&
!!settings &&
!settingsLoading
);
}, [authPhase, isAuthenticated, settings, settingsLoading]);
// Read cached update information from settings (no GitHub API calls)
// The backend scheduler updates this data periodically
const updateAvailable = settings?.is_update_available && !dismissed;
const updateInfo = settings
? {
isUpdateAvailable: settings.is_update_available,
latestVersion: settings.latest_version,
currentVersion: settings.current_version,
last_update_check: settings.last_update_check,
}
: null;
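Since the provider now only reads cached fields that the backend scheduler is expected to refresh, here is a minimal sketch of that subset of the settings record (field names taken from the reads above, values illustrative):
// Assumed cached update fields on the settings record — values are placeholders
const exampleCachedSettings = {
	current_version: "1.3.0",
	latest_version: "1.3.0",
	is_update_available: false,
	last_update_check: "2025-10-17T21:00:00Z",
};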
// Query for update information
const {
data: updateData,
isLoading,
error,
} = useQuery({
queryKey: ["updateCheck"],
queryFn: () => versionAPI.checkUpdates().then((res) => res.data),
staleTime: 10 * 60 * 1000, // Data stays fresh for 10 minutes
refetchOnWindowFocus: false, // Don't refetch when window regains focus
retry: 1,
enabled: isQueryEnabled,
});
const updateAvailable = updateData?.isUpdateAvailable && !dismissed;
const updateInfo = updateData;
const isLoading = settingsLoading;
const error = null;
const dismissNotification = () => {
setDismissed(true);

View File

@@ -0,0 +1,613 @@
import { useQuery } from "@tanstack/react-query";
import {
Activity,
ArrowDown,
ArrowUp,
ArrowUpDown,
CheckCircle,
Clock,
Play,
Settings,
XCircle,
Zap,
} from "lucide-react";
import { useState } from "react";
import api from "../utils/api";
const Automation = () => {
const [activeTab, setActiveTab] = useState("overview");
const [sortField, setSortField] = useState("nextRunTimestamp");
const [sortDirection, setSortDirection] = useState("asc");
// Fetch automation overview data
const { data: overview, isLoading: overviewLoading } = useQuery({
queryKey: ["automation-overview"],
queryFn: async () => {
const response = await api.get("/automation/overview");
return response.data.data;
},
refetchInterval: 30000, // Refresh every 30 seconds
});
// Fetch queue statistics
useQuery({
queryKey: ["automation-stats"],
queryFn: async () => {
const response = await api.get("/automation/stats");
return response.data.data;
},
refetchInterval: 30000,
});
// Fetch recent jobs
useQuery({
queryKey: ["automation-jobs"],
queryFn: async () => {
const jobs = await Promise.all([
api
.get("/automation/jobs/github-update-check?limit=5")
.then((r) => r.data.data || []),
api
.get("/automation/jobs/session-cleanup?limit=5")
.then((r) => r.data.data || []),
]);
return {
githubUpdate: jobs[0],
sessionCleanup: jobs[1],
};
},
refetchInterval: 30000,
});
const _getStatusIcon = (status) => {
switch (status) {
case "completed":
return <CheckCircle className="h-4 w-4 text-green-500" />;
case "failed":
return <XCircle className="h-4 w-4 text-red-500" />;
case "active":
return <Activity className="h-4 w-4 text-blue-500 animate-pulse" />;
default:
return <Clock className="h-4 w-4 text-gray-500" />;
}
};
const _getStatusColor = (status) => {
switch (status) {
case "completed":
return "bg-green-100 text-green-800";
case "failed":
return "bg-red-100 text-red-800";
case "active":
return "bg-blue-100 text-blue-800";
default:
return "bg-gray-100 text-gray-800";
}
};
const _formatDate = (dateString) => {
if (!dateString) return "N/A";
return new Date(dateString).toLocaleString();
};
const _formatDuration = (ms) => {
if (!ms) return "N/A";
return `${ms}ms`;
};
const getStatusBadge = (status) => {
switch (status) {
case "Success":
return (
<span className="px-2 py-1 text-xs font-medium rounded-full bg-green-100 text-green-800">
Success
</span>
);
case "Failed":
return (
<span className="px-2 py-1 text-xs font-medium rounded-full bg-red-100 text-red-800">
Failed
</span>
);
case "Never run":
return (
<span className="px-2 py-1 text-xs font-medium rounded-full bg-gray-100 text-gray-800">
Never run
</span>
);
default:
return (
<span className="px-2 py-1 text-xs font-medium rounded-full bg-gray-100 text-gray-800">
{status}
</span>
);
}
};
const getNextRunTime = (schedule, _lastRun) => {
if (schedule === "Manual only") return "Manual trigger only";
if (schedule.includes("Agent-driven")) return "Agent-driven (automatic)";
if (schedule === "Daily at midnight") {
const now = new Date();
const tomorrow = new Date(now);
tomorrow.setDate(tomorrow.getDate() + 1);
tomorrow.setHours(0, 0, 0, 0);
return tomorrow.toLocaleString([], {
hour12: true,
hour: "numeric",
minute: "2-digit",
day: "numeric",
month: "numeric",
year: "numeric",
});
}
if (schedule === "Daily at 2 AM") {
const now = new Date();
const tomorrow = new Date(now);
tomorrow.setDate(tomorrow.getDate() + 1);
tomorrow.setHours(2, 0, 0, 0);
return tomorrow.toLocaleString([], {
hour12: true,
hour: "numeric",
minute: "2-digit",
day: "numeric",
month: "numeric",
year: "numeric",
});
}
if (schedule === "Daily at 3 AM") {
const now = new Date();
const tomorrow = new Date(now);
tomorrow.setDate(tomorrow.getDate() + 1);
tomorrow.setHours(3, 0, 0, 0);
return tomorrow.toLocaleString([], {
hour12: true,
hour: "numeric",
minute: "2-digit",
day: "numeric",
month: "numeric",
year: "numeric",
});
}
if (schedule === "Every hour") {
const now = new Date();
const nextHour = new Date(now);
nextHour.setHours(nextHour.getHours() + 1, 0, 0, 0);
return nextHour.toLocaleString([], {
hour12: true,
hour: "numeric",
minute: "2-digit",
day: "numeric",
month: "numeric",
year: "numeric",
});
}
return "Unknown";
};
const getNextRunTimestamp = (schedule) => {
if (schedule === "Manual only") return Number.MAX_SAFE_INTEGER; // Manual tasks go to bottom
if (schedule.includes("Agent-driven")) return Number.MAX_SAFE_INTEGER - 1; // Agent-driven tasks near bottom but above manual
if (schedule === "Daily at midnight") {
const now = new Date();
const tomorrow = new Date(now);
tomorrow.setDate(tomorrow.getDate() + 1);
tomorrow.setHours(0, 0, 0, 0);
return tomorrow.getTime();
}
if (schedule === "Daily at 2 AM") {
const now = new Date();
const tomorrow = new Date(now);
tomorrow.setDate(tomorrow.getDate() + 1);
tomorrow.setHours(2, 0, 0, 0);
return tomorrow.getTime();
}
if (schedule === "Daily at 3 AM") {
const now = new Date();
const tomorrow = new Date(now);
tomorrow.setDate(tomorrow.getDate() + 1);
tomorrow.setHours(3, 0, 0, 0);
return tomorrow.getTime();
}
if (schedule === "Every hour") {
const now = new Date();
const nextHour = new Date(now);
nextHour.setHours(nextHour.getHours() + 1, 0, 0, 0);
return nextHour.getTime();
}
return Number.MAX_SAFE_INTEGER; // Unknown schedules go to bottom
};
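The repeated "Daily at …" branches in getNextRunTime and getNextRunTimestamp differ only in the hour, so they could share one helper; a minimal sketch (helper name is illustrative):
// Sketch: next occurrence of a job that runs daily at the given hour
const nextDailyRun = (hour) => {
	const next = new Date();
	next.setDate(next.getDate() + 1);
	next.setHours(hour, 0, 0, 0);
	return next;
};
// e.g. "Daily at midnight" -> nextDailyRun(0), "Daily at 2 AM" -> nextDailyRun(2)
// getNextRunTimestamp would return nextDailyRun(hour).getTime()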
const openBullBoard = () => {
const token = localStorage.getItem("token");
if (!token) {
alert("Please log in to access the Queue Monitor");
return;
}
// Use the proxied URL through the frontend (port 3000)
// This avoids CORS issues as everything goes through the same origin
const url = `/admin/queues?token=${encodeURIComponent(token)}`;
window.open(url, "_blank", "width=1200,height=800");
};
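Opening /admin/queues on the SPA's own origin only works if the frontend server proxies that path to the API. A hedged sketch, assuming a Vite dev server and an API on port 3001 (both assumptions — the actual proxy setup is not part of this diff):
// vite.config.js — sketch only
import { defineConfig } from "vite";

export default defineConfig({
	server: {
		port: 3000,
		proxy: {
			// keep Bull Board and the API on the same origin as the SPA to avoid CORS
			"/admin/queues": { target: "http://localhost:3001", changeOrigin: true },
			"/api": { target: "http://localhost:3001", changeOrigin: true },
		},
	},
});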
const triggerManualJob = async (jobType, data = {}) => {
try {
let endpoint;
if (jobType === "github") {
endpoint = "/automation/trigger/github-update";
} else if (jobType === "sessions") {
endpoint = "/automation/trigger/session-cleanup";
} else if (jobType === "orphaned-repos") {
endpoint = "/automation/trigger/orphaned-repo-cleanup";
} else if (jobType === "orphaned-packages") {
endpoint = "/automation/trigger/orphaned-package-cleanup";
} else if (jobType === "agent-collection") {
endpoint = "/automation/trigger/agent-collection";
}
const _response = await api.post(endpoint, data);
// Refresh data
window.location.reload();
} catch (error) {
console.error("Error triggering job:", error);
alert(
"Failed to trigger job: " +
(error.response?.data?.error || error.message),
);
}
};
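The if/else chain above maps job types to trigger endpoints and then reloads the whole page. A hedged sketch of an equivalent lookup table that refreshes only the overview query instead (this assumes useQueryClient is imported; the endpoint paths are the ones used above):
// Sketch: table-driven trigger plus a targeted cache refresh
const JOB_ENDPOINTS = {
	github: "/automation/trigger/github-update",
	sessions: "/automation/trigger/session-cleanup",
	"orphaned-repos": "/automation/trigger/orphaned-repo-cleanup",
	"orphaned-packages": "/automation/trigger/orphaned-package-cleanup",
	"agent-collection": "/automation/trigger/agent-collection",
};
const triggerManualJobSketch = async (jobType, data = {}) => {
	await api.post(JOB_ENDPOINTS[jobType], data);
	await queryClient.invalidateQueries({ queryKey: ["automation-overview"] });
};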
const handleSort = (field) => {
if (sortField === field) {
setSortDirection(sortDirection === "asc" ? "desc" : "asc");
} else {
setSortField(field);
setSortDirection("asc");
}
};
const getSortIcon = (field) => {
if (sortField !== field) return <ArrowUpDown className="h-4 w-4" />;
return sortDirection === "asc" ? (
<ArrowUp className="h-4 w-4" />
) : (
<ArrowDown className="h-4 w-4" />
);
};
// Sort automations based on current sort settings
const sortedAutomations = overview?.automations
? [...overview.automations].sort((a, b) => {
let aValue, bValue;
switch (sortField) {
case "name":
aValue = a.name.toLowerCase();
bValue = b.name.toLowerCase();
break;
case "schedule":
aValue = a.schedule.toLowerCase();
bValue = b.schedule.toLowerCase();
break;
case "lastRun":
// Convert "Never" to empty string for proper sorting
aValue = a.lastRun === "Never" ? "" : a.lastRun;
bValue = b.lastRun === "Never" ? "" : b.lastRun;
break;
case "lastRunTimestamp":
aValue = a.lastRunTimestamp || 0;
bValue = b.lastRunTimestamp || 0;
break;
case "nextRunTimestamp":
aValue = getNextRunTimestamp(a.schedule);
bValue = getNextRunTimestamp(b.schedule);
break;
case "status":
aValue = a.status.toLowerCase();
bValue = b.status.toLowerCase();
break;
default:
aValue = a[sortField];
bValue = b[sortField];
}
if (aValue < bValue) return sortDirection === "asc" ? -1 : 1;
if (aValue > bValue) return sortDirection === "asc" ? 1 : -1;
return 0;
})
: [];
const tabs = [{ id: "overview", name: "Overview", icon: Settings }];
return (
<div className="space-y-6">
{/* Page Header */}
<div className="flex items-center justify-between">
<div>
<h1 className="text-2xl font-semibold text-secondary-900 dark:text-white">
Automation Management
</h1>
<p className="text-sm text-secondary-600 dark:text-secondary-400 mt-1">
Monitor and manage automated server operations, agent
communications, and patch deployments
</p>
</div>
<div className="flex items-center gap-3">
<button
type="button"
onClick={openBullBoard}
className="btn-outline flex items-center gap-2"
title="Open Bull Board Queue Monitor"
>
<svg
className="h-4 w-4"
xmlns="http://www.w3.org/2000/svg"
viewBox="0 0 36 36"
role="img"
aria-label="Bull Board"
>
<circle fill="#DD2E44" cx="18" cy="18" r="18" />
<circle fill="#FFF" cx="18" cy="18" r="13.5" />
<circle fill="#DD2E44" cx="18" cy="18" r="10" />
<circle fill="#FFF" cx="18" cy="18" r="6" />
<circle fill="#DD2E44" cx="18" cy="18" r="3" />
<path
opacity=".2"
d="M18.24 18.282l13.144 11.754s-2.647 3.376-7.89 5.109L17.579 18.42l.661-.138z"
/>
<path
fill="#FFAC33"
d="M18.294 19a.994.994 0 01-.704-1.699l.563-.563a.995.995 0 011.408 1.407l-.564.563a.987.987 0 01-.703.292z"
/>
<path
fill="#55ACEE"
d="M24.016 6.981c-.403 2.079 0 4.691 0 4.691l7.054-7.388c.291-1.454-.528-3.932-1.718-4.238-1.19-.306-4.079.803-5.336 6.935zm5.003 5.003c-2.079.403-4.691 0-4.691 0l7.388-7.054c1.454-.291 3.932.528 4.238 1.718.306 1.19-.803 4.079-6.935 5.336z"
/>
<path
fill="#3A87C2"
d="M32.798 4.485L21.176 17.587c-.362.362-1.673.882-2.51.046-.836-.836-.419-2.08-.057-2.443L31.815 3.501s.676-.635 1.159-.152-.176 1.136-.176 1.136z"
/>
</svg>
Queue Monitor
</button>
</div>
</div>
{/* Stats Cards */}
<div className="grid grid-cols-1 md:grid-cols-2 lg:grid-cols-4 gap-6">
{/* Scheduled Tasks Card */}
<div className="card p-4">
<div className="flex items-center">
<div className="flex-shrink-0">
<Clock className="h-5 w-5 text-warning-600 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-white">
Scheduled Tasks
</p>
<p className="text-xl font-semibold text-secondary-900 dark:text-white">
{overviewLoading ? "..." : overview?.scheduledTasks || 0}
</p>
</div>
</div>
</div>
{/* Running Tasks Card */}
<div className="card p-4">
<div className="flex items-center">
<div className="flex-shrink-0">
<Play className="h-5 w-5 text-success-600 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-white">
Running Tasks
</p>
<p className="text-xl font-semibold text-secondary-900 dark:text-white">
{overviewLoading ? "..." : overview?.runningTasks || 0}
</p>
</div>
</div>
</div>
{/* Failed Tasks Card */}
<div className="card p-4">
<div className="flex items-center">
<div className="flex-shrink-0">
<XCircle className="h-5 w-5 text-red-600 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-white">
Failed Tasks
</p>
<p className="text-xl font-semibold text-secondary-900 dark:text-white">
{overviewLoading ? "..." : overview?.failedTasks || 0}
</p>
</div>
</div>
</div>
{/* Total Task Runs Card */}
<div className="card p-4">
<div className="flex items-center">
<div className="flex-shrink-0">
<Zap className="h-5 w-5 text-secondary-600 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-white">
Total Task Runs
</p>
<p className="text-xl font-semibold text-secondary-900 dark:text-white">
{overviewLoading ? "..." : overview?.totalAutomations || 0}
</p>
</div>
</div>
</div>
</div>
{/* Tabs */}
<div className="mb-6">
<div className="border-b border-gray-200 dark:border-gray-700">
<nav className="-mb-px flex space-x-8">
{tabs.map((tab) => (
<button
type="button"
key={tab.id}
onClick={() => setActiveTab(tab.id)}
className={`py-2 px-1 border-b-2 font-medium text-sm flex items-center gap-2 ${
activeTab === tab.id
? "border-blue-500 text-blue-600 dark:text-blue-400"
: "border-transparent text-gray-500 hover:text-gray-700 hover:border-gray-300 dark:text-gray-400 dark:hover:text-gray-300"
}`}
>
<tab.icon className="h-4 w-4" />
{tab.name}
</button>
))}
</nav>
</div>
</div>
{/* Tab Content */}
{activeTab === "overview" && (
<div className="card p-6">
{overviewLoading ? (
<div className="text-center py-8">
<div className="animate-spin rounded-full h-8 w-8 border-b-2 border-blue-600 mx-auto"></div>
<p className="mt-2 text-sm text-secondary-500">
Loading automations...
</p>
</div>
) : (
<div className="overflow-x-auto">
<table className="min-w-full divide-y divide-secondary-200 dark:divide-secondary-600">
<thead className="bg-secondary-50 dark:bg-secondary-700">
<tr>
<th className="px-4 py-2 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider">
Run
</th>
<th
className="px-4 py-2 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider cursor-pointer hover:bg-secondary-100 dark:hover:bg-secondary-600"
onClick={() => handleSort("name")}
>
<div className="flex items-center gap-1">
Task
{getSortIcon("name")}
</div>
</th>
<th
className="px-4 py-2 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider cursor-pointer hover:bg-secondary-100 dark:hover:bg-secondary-600"
onClick={() => handleSort("schedule")}
>
<div className="flex items-center gap-1">
Frequency
{getSortIcon("schedule")}
</div>
</th>
<th
className="px-4 py-2 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider cursor-pointer hover:bg-secondary-100 dark:hover:bg-secondary-600"
onClick={() => handleSort("lastRunTimestamp")}
>
<div className="flex items-center gap-1">
Last Run
{getSortIcon("lastRunTimestamp")}
</div>
</th>
<th
className="px-4 py-2 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider cursor-pointer hover:bg-secondary-100 dark:hover:bg-secondary-600"
onClick={() => handleSort("nextRunTimestamp")}
>
<div className="flex items-center gap-1">
Next Run
{getSortIcon("nextRunTimestamp")}
</div>
</th>
<th
className="px-4 py-2 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider cursor-pointer hover:bg-secondary-100 dark:hover:bg-secondary-600"
onClick={() => handleSort("status")}
>
<div className="flex items-center gap-1">
Status
{getSortIcon("status")}
</div>
</th>
</tr>
</thead>
<tbody className="bg-white dark:bg-secondary-800 divide-y divide-secondary-200 dark:divide-secondary-600">
{sortedAutomations.map((automation) => (
<tr
key={automation.queue}
className="hover:bg-secondary-50 dark:hover:bg-secondary-700"
>
<td className="px-4 py-2 whitespace-nowrap">
{automation.schedule !== "Manual only" ? (
<button
type="button"
onClick={() => {
if (automation.queue.includes("github")) {
triggerManualJob("github");
} else if (automation.queue.includes("session")) {
triggerManualJob("sessions");
} else if (
automation.queue.includes("orphaned-repo")
) {
triggerManualJob("orphaned-repos");
} else if (
automation.queue.includes("orphaned-package")
) {
triggerManualJob("orphaned-packages");
} else if (
automation.queue.includes("agent-commands")
) {
triggerManualJob("agent-collection");
}
}}
className="inline-flex items-center justify-center w-6 h-6 border border-transparent rounded text-white bg-green-600 hover:bg-green-700 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-green-500 transition-colors duration-200"
title="Run Now"
>
<Play className="h-3 w-3" />
</button>
) : (
<span className="text-gray-400 text-xs">Manual</span>
)}
</td>
<td className="px-4 py-2 whitespace-nowrap">
<div>
<div className="text-sm font-medium text-secondary-900 dark:text-white">
{automation.name}
</div>
<div className="text-xs text-secondary-500 dark:text-secondary-400">
{automation.description}
</div>
</div>
</td>
<td className="px-4 py-2 whitespace-nowrap text-sm text-secondary-900 dark:text-white">
{automation.schedule}
</td>
<td className="px-4 py-2 whitespace-nowrap text-sm text-secondary-900 dark:text-white">
{automation.lastRun}
</td>
<td className="px-4 py-2 whitespace-nowrap text-sm text-secondary-900 dark:text-white">
{getNextRunTime(
automation.schedule,
automation.lastRun,
)}
</td>
<td className="px-4 py-2 whitespace-nowrap">
{getStatusBadge(automation.status)}
</td>
</tr>
))}
</tbody>
</table>
</div>
)}
</div>
)}
</div>
);
};
export default Automation;

View File

@@ -6,6 +6,8 @@ import {
Chart as ChartJS,
Legend,
LinearScale,
LineElement,
PointElement,
Title,
Tooltip,
} from "chart.js";
@@ -23,7 +25,7 @@ import {
WifiOff,
} from "lucide-react";
import { useEffect, useState } from "react";
import { Bar, Doughnut, Pie } from "react-chartjs-2";
import { Bar, Doughnut, Line, Pie } from "react-chartjs-2";
import { useNavigate } from "react-router-dom";
import DashboardSettingsModal from "../components/DashboardSettingsModal";
import { useAuth } from "../contexts/AuthContext";
@@ -43,12 +45,16 @@ ChartJS.register(
CategoryScale,
LinearScale,
BarElement,
LineElement,
PointElement,
Title,
);
const Dashboard = () => {
const [showSettingsModal, setShowSettingsModal] = useState(false);
const [cardPreferences, setCardPreferences] = useState([]);
const [packageTrendsPeriod, setPackageTrendsPeriod] = useState("1"); // days
const [packageTrendsHost, setPackageTrendsHost] = useState("all"); // host filter
const navigate = useNavigate();
const { isDark } = useTheme();
const { user } = useAuth();
@@ -91,7 +97,7 @@ const Dashboard = () => {
navigate("/repositories");
};
const handleOSDistributionClick = () => {
const _handleOSDistributionClick = () => {
navigate("/hosts?showFilters=true", { replace: true });
};
@@ -99,7 +105,7 @@ const Dashboard = () => {
navigate("/hosts?filter=needsUpdates", { replace: true });
};
const handlePackagePriorityClick = () => {
const _handlePackagePriorityClick = () => {
navigate("/packages?filter=security");
};
@@ -144,8 +150,8 @@ const Dashboard = () => {
// Map priority names to filter parameters
if (priorityName.toLowerCase().includes("security")) {
navigate("/packages?filter=security", { replace: true });
} else if (priorityName.toLowerCase().includes("outdated")) {
navigate("/packages?filter=outdated", { replace: true });
} else if (priorityName.toLowerCase().includes("regular")) {
navigate("/packages?filter=regular", { replace: true });
}
}
};
@@ -189,6 +195,28 @@ const Dashboard = () => {
refetchOnWindowFocus: false, // Don't refetch when window regains focus
});
// Package trends data query
const {
data: packageTrendsData,
isLoading: packageTrendsLoading,
error: _packageTrendsError,
refetch: refetchPackageTrends,
isFetching: packageTrendsFetching,
} = useQuery({
queryKey: ["packageTrends", packageTrendsPeriod, packageTrendsHost],
queryFn: () => {
const params = {
days: packageTrendsPeriod,
};
if (packageTrendsHost !== "all") {
params.hostId = packageTrendsHost;
}
return dashboardAPI.getPackageTrends(params).then((res) => res.data);
},
staleTime: 5 * 60 * 1000, // 5 minutes
refetchOnWindowFocus: false,
});
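The chart and the host selector below expect roughly this payload from dashboardAPI.getPackageTrends: Chart.js line-chart data plus a hosts list. Reconstructed from the reads in this component; labels follow the "YYYY-MM-DD" / "YYYY-MM-DDTHH" formats handled by the chart options, and all values are illustrative:
// Assumed response shape — values are placeholders
const examplePackageTrends = {
	hosts: [{ id: "host-1", hostname: "web-01", friendly_name: "Web 01" }],
	chartData: {
		labels: ["2025-10-07", "2025-10-08", "2025-10-09"],
		datasets: [
			{ label: "Outdated packages", data: [12, 9, 7], borderColor: "#f59e0b" },
			{ label: "Security updates", data: [3, 2, 1], borderColor: "#dc2626" },
		],
	},
};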
// Fetch recent users (permission protected server-side)
const { data: recentUsers } = useQuery({
queryKey: ["dashboardRecentUsers"],
@@ -299,6 +327,8 @@ const Dashboard = () => {
].includes(cardId)
) {
return "charts";
} else if (["packageTrends"].includes(cardId)) {
return "charts";
} else if (["erroredHosts", "quickStats"].includes(cardId)) {
return "fullwidth";
}
@@ -312,6 +342,8 @@ const Dashboard = () => {
return "grid grid-cols-1 sm:grid-cols-2 lg:grid-cols-4 gap-4";
case "charts":
return "grid grid-cols-1 lg:grid-cols-3 gap-6";
case "widecharts":
return "grid grid-cols-1 lg:grid-cols-3 gap-6";
case "fullwidth":
return "space-y-6";
default:
@@ -651,17 +683,7 @@ const Dashboard = () => {
case "osDistribution":
return (
<button
type="button"
className="card p-6 cursor-pointer hover:shadow-card-hover dark:hover:shadow-card-hover-dark transition-shadow duration-200 w-full text-left"
onClick={handleOSDistributionClick}
onKeyDown={(e) => {
if (e.key === "Enter" || e.key === " ") {
e.preventDefault();
handleOSDistributionClick();
}
}}
>
<div className="card p-6 w-full">
<h3 className="text-lg font-medium text-secondary-900 dark:text-white mb-4">
OS Distribution
</h3>
@@ -670,22 +692,12 @@ const Dashboard = () => {
<Pie data={osChartData} options={chartOptions} />
</div>
</div>
</button>
</div>
);
case "osDistributionDoughnut":
return (
<button
type="button"
className="card p-6 cursor-pointer hover:shadow-card-hover dark:hover:shadow-card-hover-dark transition-shadow duration-200 w-full text-left"
onClick={handleOSDistributionClick}
onKeyDown={(e) => {
if (e.key === "Enter" || e.key === " ") {
e.preventDefault();
handleOSDistributionClick();
}
}}
>
<div className="card p-6 w-full">
<h3 className="text-lg font-medium text-secondary-900 dark:text-white mb-4">
OS Distribution
</h3>
@@ -694,29 +706,19 @@ const Dashboard = () => {
<Doughnut data={osChartData} options={doughnutChartOptions} />
</div>
</div>
</button>
</div>
);
case "osDistributionBar":
return (
<button
type="button"
className="card p-6 cursor-pointer hover:shadow-card-hover dark:hover:shadow-card-hover-dark transition-shadow duration-200 w-full text-left"
onClick={handleOSDistributionClick}
onKeyDown={(e) => {
if (e.key === "Enter" || e.key === " ") {
e.preventDefault();
handleOSDistributionClick();
}
}}
>
<div className="card p-6 w-full">
<h3 className="text-lg font-medium text-secondary-900 dark:text-white mb-4">
OS Distribution
</h3>
<div className="h-64">
<Bar data={osBarChartData} options={barChartOptions} />
</div>
</button>
</div>
);
case "updateStatus":
@@ -748,19 +750,9 @@ const Dashboard = () => {
case "packagePriority":
return (
<button
type="button"
className="card p-6 cursor-pointer hover:shadow-card-hover dark:hover:shadow-card-hover-dark transition-shadow duration-200 w-full text-left"
onClick={handlePackagePriorityClick}
onKeyDown={(e) => {
if (e.key === "Enter" || e.key === " ") {
e.preventDefault();
handlePackagePriorityClick();
}
}}
>
<div className="card p-6 w-full">
<h3 className="text-lg font-medium text-secondary-900 dark:text-white mb-4">
Package Priority
Outdated Packages by Priority
</h3>
<div className="h-64 w-full flex items-center justify-center">
<div className="w-full h-full max-w-sm">
@@ -770,7 +762,86 @@ const Dashboard = () => {
/>
</div>
</div>
</button>
</div>
);
case "packageTrends":
return (
<div className="card p-6 w-full">
<div className="flex items-center justify-between mb-4">
<h3 className="text-lg font-medium text-secondary-900 dark:text-white">
Package Trends Over Time
</h3>
<div className="flex items-center gap-3">
{/* Refresh Button */}
<button
type="button"
onClick={() => refetchPackageTrends()}
disabled={packageTrendsFetching}
className="px-3 py-1.5 text-sm border border-secondary-300 dark:border-secondary-600 rounded-md bg-white dark:bg-secondary-800 text-secondary-900 dark:text-white hover:bg-secondary-50 dark:hover:bg-secondary-700 focus:ring-2 focus:ring-primary-500 focus:border-primary-500 disabled:opacity-50 disabled:cursor-not-allowed flex items-center gap-2"
title="Refresh data"
>
<RefreshCw
className={`h-4 w-4 ${packageTrendsFetching ? "animate-spin" : ""}`}
/>
Refresh
</button>
{/* Period Selector */}
<select
value={packageTrendsPeriod}
onChange={(e) => setPackageTrendsPeriod(e.target.value)}
className="px-3 py-1.5 text-sm border border-secondary-300 dark:border-secondary-600 rounded-md bg-white dark:bg-secondary-800 text-secondary-900 dark:text-white focus:ring-2 focus:ring-primary-500 focus:border-primary-500"
>
<option value="1">Last 24 hours</option>
<option value="7">Last 7 days</option>
<option value="30">Last 30 days</option>
<option value="90">Last 90 days</option>
<option value="180">Last 6 months</option>
<option value="365">Last year</option>
</select>
{/* Host Selector */}
<select
value={packageTrendsHost}
onChange={(e) => setPackageTrendsHost(e.target.value)}
className="px-3 py-1.5 text-sm border border-secondary-300 dark:border-secondary-600 rounded-md bg-white dark:bg-secondary-800 text-secondary-900 dark:text-white focus:ring-2 focus:ring-primary-500 focus:border-primary-500"
>
<option value="all">All Hosts</option>
{packageTrendsData?.hosts?.length > 0 ? (
packageTrendsData.hosts.map((host) => (
<option key={host.id} value={host.id}>
{host.friendly_name || host.hostname}
</option>
))
) : (
<option disabled>
{packageTrendsLoading
? "Loading hosts..."
: "No hosts available"}
</option>
)}
</select>
</div>
</div>
<div className="h-64 w-full">
{packageTrendsLoading ? (
<div className="flex items-center justify-center h-full">
<RefreshCw className="h-8 w-8 animate-spin text-primary-600" />
</div>
) : packageTrendsData?.chartData ? (
<Line
data={packageTrendsData.chartData}
options={packageTrendsChartOptions}
/>
) : (
<div className="flex items-center justify-center h-full text-secondary-500 dark:text-secondary-400">
No data available
</div>
)}
</div>
</div>
);
case "quickStats": {
@@ -1068,6 +1139,174 @@ const Dashboard = () => {
onClick: handlePackagePriorityChartClick,
};
const packageTrendsChartOptions = {
responsive: true,
maintainAspectRatio: false,
plugins: {
legend: {
position: "top",
labels: {
color: isDark ? "#ffffff" : "#374151",
font: {
size: 12,
},
padding: 20,
usePointStyle: true,
pointStyle: "circle",
},
},
tooltip: {
mode: "index",
intersect: false,
backgroundColor: isDark ? "#374151" : "#ffffff",
titleColor: isDark ? "#ffffff" : "#374151",
bodyColor: isDark ? "#ffffff" : "#374151",
borderColor: isDark ? "#4B5563" : "#E5E7EB",
borderWidth: 1,
callbacks: {
title: (context) => {
const label = context[0].label;
// Handle empty or invalid labels
if (!label || typeof label !== "string") {
return "Unknown Date";
}
// Format hourly labels (e.g., "2025-10-07T14" -> "Oct 7, 2:00 PM")
if (label.includes("T")) {
try {
const date = new Date(`${label}:00:00`);
// Check if date is valid
if (Number.isNaN(date.getTime())) {
return label; // Return original label if date is invalid
}
return date.toLocaleDateString("en-US", {
month: "short",
day: "numeric",
hour: "numeric",
minute: "2-digit",
hour12: true,
});
} catch (_error) {
return label; // Return original label if parsing fails
}
}
// Format daily labels (e.g., "2025-10-07" -> "Oct 7")
try {
const date = new Date(label);
// Check if date is valid
if (Number.isNaN(date.getTime())) {
return label; // Return original label if date is invalid
}
return date.toLocaleDateString("en-US", {
month: "short",
day: "numeric",
});
} catch (_error) {
return label; // Return original label if parsing fails
}
},
label: (context) => {
const value = context.parsed.y;
if (value === null || value === undefined) {
return `${context.dataset.label}: No data`;
}
return `${context.dataset.label}: ${value}`;
},
},
},
},
scales: {
x: {
display: true,
title: {
display: true,
text: packageTrendsPeriod === "1" ? "Time (Hours)" : "Date",
color: isDark ? "#ffffff" : "#374151",
},
ticks: {
color: isDark ? "#ffffff" : "#374151",
font: {
size: 11,
},
callback: function (value, _index, _ticks) {
const label = this.getLabelForValue(value);
// Handle empty or invalid labels
if (!label || typeof label !== "string") {
return "Unknown";
}
// Format hourly labels (e.g., "2025-10-07T14" -> "2 PM")
if (label.includes("T")) {
try {
const hour = label.split("T")[1];
const hourNum = parseInt(hour, 10);
// Validate hour number
if (Number.isNaN(hourNum) || hourNum < 0 || hourNum > 23) {
return hour; // Return original hour if invalid
}
return hourNum === 0
? "12 AM"
: hourNum < 12
? `${hourNum} AM`
: hourNum === 12
? "12 PM"
: `${hourNum - 12} PM`;
} catch (_error) {
return label; // Return original label if parsing fails
}
}
// Format daily labels (e.g., "2025-10-07" -> "Oct 7")
try {
const date = new Date(label);
// Check if date is valid
if (Number.isNaN(date.getTime())) {
return label; // Return original label if date is invalid
}
return date.toLocaleDateString("en-US", {
month: "short",
day: "numeric",
});
} catch (_error) {
return label; // Return original label if parsing fails
}
},
},
grid: {
color: isDark ? "#374151" : "#E5E7EB",
},
},
y: {
display: true,
title: {
display: true,
text: "Number of Packages",
color: isDark ? "#ffffff" : "#374151",
},
ticks: {
color: isDark ? "#ffffff" : "#374151",
font: {
size: 11,
},
beginAtZero: true,
},
grid: {
color: isDark ? "#374151" : "#E5E7EB",
},
},
},
interaction: {
mode: "nearest",
axis: "x",
intersect: false,
},
};
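The tooltip title callback and the x-axis tick callback above duplicate the same date/hour parsing. A hedged sketch of a shared formatter both could call (helper name is illustrative; it mirrors the checks already performed above):
// Sketch: one formatter for "YYYY-MM-DD" and hourly "YYYY-MM-DDTHH" labels
const formatTrendLabel = (label, { withTime = false } = {}) => {
	if (!label || typeof label !== "string") return "Unknown";
	const isHourly = label.includes("T");
	const date = new Date(isHourly ? `${label}:00:00` : label);
	if (Number.isNaN(date.getTime())) return label;
	return date.toLocaleDateString("en-US", {
		month: "short",
		day: "numeric",
		...(withTime && isHourly
			? { hour: "numeric", minute: "2-digit", hour12: true }
			: {}),
	});
};
// tooltip title: formatTrendLabel(context[0].label, { withTime: true })
// axis ticks:    formatTrendLabel(this.getLabelForValue(value))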
const barChartOptions = {
responsive: true,
indexAxis: "y", // Make the chart horizontal
@@ -1100,6 +1339,7 @@ const Dashboard = () => {
},
},
},
onClick: handleOSChartClick,
};
const osChartData = {
@@ -1194,7 +1434,6 @@ const Dashboard = () => {
title="Customize dashboard layout"
>
<Settings className="h-4 w-4" />
Customize Dashboard
</button>
<button
type="button"
@@ -1206,7 +1445,6 @@ const Dashboard = () => {
<RefreshCw
className={`h-4 w-4 ${isFetching ? "animate-spin" : ""}`}
/>
{isFetching ? "Refreshing..." : "Refresh"}
</button>
</div>
</div>
@@ -1245,7 +1483,12 @@ const Dashboard = () => {
className={getGroupClassName(group.type)}
>
{group.cards.map((card, cardIndex) => (
<div key={`card-${card.cardId}-${groupIndex}-${cardIndex}`}>
<div
key={`card-${card.cardId}-${groupIndex}-${cardIndex}`}
className={
card.cardId === "packageTrends" ? "lg:col-span-2" : ""
}
>
{renderCard(card.cardId)}
</div>
))}

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -21,12 +21,13 @@ import {
Square,
Trash2,
Users,
Wifi,
X,
} from "lucide-react";
import { useEffect, useId, useMemo, useState } from "react";
import { Link, useNavigate, useSearchParams } from "react-router-dom";
import InlineEdit from "../components/InlineEdit";
import InlineGroupEdit from "../components/InlineGroupEdit";
import InlineMultiGroupEdit from "../components/InlineMultiGroupEdit";
import InlineToggle from "../components/InlineToggle";
import {
adminHostsAPI,
@@ -34,14 +35,14 @@ import {
formatRelativeTime,
hostGroupsAPI,
} from "../utils/api";
import { OSIcon } from "../utils/osIcons.jsx";
import { getOSDisplayName, OSIcon } from "../utils/osIcons.jsx";
// Add Host Modal Component
const AddHostModal = ({ isOpen, onClose, onSuccess }) => {
const friendlyNameId = useId();
const [formData, setFormData] = useState({
friendly_name: "",
hostGroupId: "",
hostGroupIds: [], // Changed to array for multiple selection
});
const [isSubmitting, setIsSubmitting] = useState(false);
const [error, setError] = useState("");
@@ -64,7 +65,7 @@ const AddHostModal = ({ isOpen, onClose, onSuccess }) => {
const response = await adminHostsAPI.create(formData);
console.log("Host created successfully:", formData.friendly_name);
onSuccess(response.data);
setFormData({ friendly_name: "", hostGroupId: "" });
setFormData({ friendly_name: "", hostGroupIds: [] });
onClose();
} catch (err) {
console.error("Full error object:", err);
@@ -134,68 +135,56 @@ const AddHostModal = ({ isOpen, onClose, onSuccess }) => {
<div>
<span className="block text-sm font-medium text-secondary-700 dark:text-secondary-200 mb-3">
Host Group
Host Groups
</span>
<div className="grid grid-cols-3 gap-2">
{/* No Group Option */}
<button
type="button"
onClick={() => setFormData({ ...formData, hostGroupId: "" })}
className={`flex flex-col items-center justify-center px-2 py-3 text-center border-2 rounded-lg transition-all duration-200 relative min-h-[80px] ${
formData.hostGroupId === ""
? "border-primary-500 bg-primary-50 dark:bg-primary-900/30 text-primary-700 dark:text-primary-300"
: "border-secondary-300 dark:border-secondary-600 bg-white dark:bg-secondary-700 text-secondary-700 dark:text-secondary-200 hover:border-secondary-400 dark:hover:border-secondary-500"
}`}
>
<div className="text-xs font-medium">No Group</div>
<div className="text-xs text-secondary-500 dark:text-secondary-400 mt-1">
Ungrouped
</div>
{formData.hostGroupId === "" && (
<div className="absolute top-2 right-2 w-3 h-3 rounded-full bg-primary-500 flex items-center justify-center">
<div className="w-1.5 h-1.5 rounded-full bg-white"></div>
</div>
)}
</button>
<div className="space-y-2 max-h-48 overflow-y-auto">
{/* Host Group Options */}
{hostGroups?.map((group) => (
<button
<label
key={group.id}
type="button"
onClick={() =>
setFormData({ ...formData, hostGroupId: group.id })
}
className={`flex flex-col items-center justify-center px-2 py-3 text-center border-2 rounded-lg transition-all duration-200 relative min-h-[80px] ${
formData.hostGroupId === group.id
? "border-primary-500 bg-primary-50 dark:bg-primary-900/30 text-primary-700 dark:text-primary-300"
: "border-secondary-300 dark:border-secondary-600 bg-white dark:bg-secondary-700 text-secondary-700 dark:text-secondary-200 hover:border-secondary-400 dark:hover:border-secondary-500"
className={`flex items-center gap-3 p-3 border-2 rounded-lg transition-all duration-200 cursor-pointer ${
formData.hostGroupIds.includes(group.id)
? "border-primary-500 bg-primary-50 dark:bg-primary-900/30"
: "border-secondary-300 dark:border-secondary-600 bg-white dark:bg-secondary-700 hover:border-secondary-400 dark:hover:border-secondary-500"
}`}
>
<div className="flex items-center gap-1 mb-1 w-full justify-center">
<input
type="checkbox"
checked={formData.hostGroupIds.includes(group.id)}
onChange={(e) => {
if (e.target.checked) {
setFormData({
...formData,
hostGroupIds: [...formData.hostGroupIds, group.id],
});
} else {
setFormData({
...formData,
hostGroupIds: formData.hostGroupIds.filter(
(id) => id !== group.id,
),
});
}
}}
className="w-4 h-4 text-primary-600 bg-gray-100 border-gray-300 rounded focus:ring-primary-500 dark:focus:ring-primary-600 dark:ring-offset-gray-800 focus:ring-2 dark:bg-gray-700 dark:border-gray-600"
/>
<div className="flex items-center gap-2 flex-1">
{group.color && (
<div
className="w-3 h-3 rounded-full border border-secondary-300 dark:border-secondary-500 flex-shrink-0"
style={{ backgroundColor: group.color }}
></div>
)}
<div className="text-xs font-medium truncate max-w-full">
<div className="text-sm font-medium text-secondary-700 dark:text-secondary-200">
{group.name}
</div>
</div>
<div className="text-xs text-secondary-500 dark:text-secondary-400">
Group
</div>
{formData.hostGroupId === group.id && (
<div className="absolute top-2 right-2 w-3 h-3 rounded-full bg-primary-500 flex items-center justify-center">
<div className="w-1.5 h-1.5 rounded-full bg-white"></div>
</div>
)}
</button>
</label>
))}
</div>
<p className="mt-2 text-sm text-secondary-500 dark:text-secondary-400">
Optional: Assign this host to a group for better organization.
Optional: Select one or more groups to assign this host to for
better organization.
</p>
</div>
@@ -328,22 +317,24 @@ const Hosts = () => {
const defaultConfig = [
{ id: "select", label: "Select", visible: true, order: 0 },
{ id: "host", label: "Friendly Name", visible: true, order: 1 },
{ id: "ip", label: "IP Address", visible: false, order: 2 },
{ id: "group", label: "Group", visible: true, order: 3 },
{ id: "os", label: "OS", visible: true, order: 4 },
{ id: "os_version", label: "OS Version", visible: false, order: 5 },
{ id: "agent_version", label: "Agent Version", visible: true, order: 6 },
{ id: "hostname", label: "System Hostname", visible: true, order: 2 },
{ id: "ip", label: "IP Address", visible: false, order: 3 },
{ id: "group", label: "Group", visible: true, order: 4 },
{ id: "os", label: "OS", visible: true, order: 5 },
{ id: "os_version", label: "OS Version", visible: false, order: 6 },
{ id: "agent_version", label: "Agent Version", visible: true, order: 7 },
{
id: "auto_update",
label: "Agent Auto-Update",
visible: true,
order: 7,
order: 8,
},
{ id: "status", label: "Status", visible: true, order: 8 },
{ id: "updates", label: "Updates", visible: true, order: 9 },
{ id: "notes", label: "Notes", visible: false, order: 10 },
{ id: "last_update", label: "Last Update", visible: true, order: 11 },
{ id: "actions", label: "Actions", visible: true, order: 12 },
{ id: "ws_status", label: "Connection", visible: true, order: 9 },
{ id: "status", label: "Status", visible: true, order: 10 },
{ id: "updates", label: "Updates", visible: true, order: 11 },
{ id: "notes", label: "Notes", visible: false, order: 12 },
{ id: "last_update", label: "Last Update", visible: true, order: 13 },
{ id: "actions", label: "Actions", visible: true, order: 14 },
];
const saved = localStorage.getItem("hosts-column-config");
@@ -365,8 +356,11 @@ const Hosts = () => {
localStorage.removeItem("hosts-column-config");
return defaultConfig;
} else {
// Use the existing configuration
return savedConfig;
// Ensure ws_status column is visible in saved config
const updatedConfig = savedConfig.map((col) =>
col.id === "ws_status" ? { ...col, visible: true } : col,
);
return updatedConfig;
}
} catch {
// If there's an error parsing the config, clear it and use default
@@ -398,6 +392,118 @@ const Hosts = () => {
queryFn: () => hostGroupsAPI.list().then((res) => res.data),
});
// Track WebSocket status for all hosts
const [wsStatusMap, setWsStatusMap] = useState({});
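// Shape: { [api_id]: { connected: boolean, secure: boolean } }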
// Fetch initial WebSocket status for all hosts
useEffect(() => {
if (!hosts || hosts.length === 0) return;
const token = localStorage.getItem("token");
if (!token) return;
// Query the WS status endpoint once per host that has an api_id
const fetchInitialStatus = async () => {
const statusPromises = hosts
.filter((host) => host.api_id)
.map(async (host) => {
try {
const response = await fetch(`/api/v1/ws/status/${host.api_id}`, {
headers: {
Authorization: `Bearer ${token}`,
},
});
if (response.ok) {
const data = await response.json();
return { apiId: host.api_id, status: data.data };
}
} catch (_error) {
// Silently handle errors
}
return {
apiId: host.api_id,
status: { connected: false, secure: false },
};
});
const results = await Promise.all(statusPromises);
const initialStatusMap = {};
results.forEach(({ apiId, status }) => {
initialStatusMap[apiId] = status;
});
setWsStatusMap(initialStatusMap);
};
fetchInitialStatus();
}, [hosts]);
// Subscribe to WebSocket status changes for all hosts via SSE
useEffect(() => {
if (!hosts || hosts.length === 0) return;
const token = localStorage.getItem("token");
if (!token) return;
const eventSources = new Map();
let isMounted = true;
const connectHost = (apiId) => {
if (!isMounted || eventSources.has(apiId)) return;
try {
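// EventSource cannot set custom headers, so the auth token is passed as a query parameter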
const es = new EventSource(
`/api/v1/ws/status/${apiId}/stream?token=${encodeURIComponent(token)}`,
);
es.onmessage = (event) => {
try {
const data = JSON.parse(event.data);
if (isMounted) {
setWsStatusMap((prev) => {
const newMap = { ...prev, [apiId]: data };
return newMap;
});
}
} catch (_err) {
// Silently handle parse errors
}
};
es.onerror = (_error) => {
console.log(`[SSE] Connection error for ${apiId}, retrying...`);
es?.close();
eventSources.delete(apiId);
if (isMounted) {
// Retry connection after 5 seconds
setTimeout(() => connectHost(apiId), 5000);
}
};
eventSources.set(apiId, es);
} catch (_err) {
// Silently handle connection errors
}
};
// Connect to all hosts
for (const host of hosts) {
if (host.api_id) {
connectHost(host.api_id);
}
}
// Cleanup function
return () => {
isMounted = false;
for (const es of eventSources.values()) {
es.close();
}
eventSources.clear();
};
}, [hosts]);
const bulkUpdateGroupMutation = useMutation({
mutationFn: ({ hostIds, hostGroupId }) =>
adminHostsAPI.bulkUpdateGroup(hostIds, hostGroupId),
@@ -439,7 +545,7 @@ const Hosts = () => {
},
});
const updateHostGroupMutation = useMutation({
const _updateHostGroupMutation = useMutation({
mutationFn: ({ hostId, hostGroupId }) => {
console.log("updateHostGroupMutation called with:", {
hostId,
@@ -485,6 +591,46 @@ const Hosts = () => {
},
});
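// Update a host's group memberships (many-to-many) via the groups endpoint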
const updateHostGroupsMutation = useMutation({
mutationFn: ({ hostId, groupIds }) => {
console.log("updateHostGroupsMutation called with:", {
hostId,
groupIds,
});
return adminHostsAPI.updateGroups(hostId, groupIds).then((res) => {
console.log("updateGroups API response:", res);
return res.data;
});
},
onSuccess: (data) => {
// Update the cache with the new host data
queryClient.setQueryData(["hosts"], (oldData) => {
console.log("Old cache data before update:", oldData);
if (!oldData) return oldData;
const updatedData = oldData.map((host) => {
if (host.id === data.host.id) {
console.log(
"Updating host in cache:",
host.id,
"with new data:",
data.host,
);
return data.host;
}
return host;
});
console.log("New cache data after update:", updatedData);
return updatedData;
});
// Also invalidate to ensure consistency
queryClient.invalidateQueries(["hosts"]);
},
onError: (error) => {
console.error("updateHostGroupsMutation error:", error);
},
});
const toggleAutoUpdateMutation = useMutation({
mutationFn: ({ hostId, autoUpdate }) =>
adminHostsAPI
@@ -562,7 +708,7 @@ const Hosts = () => {
osFilter === "all" ||
host.os_type?.toLowerCase() === osFilter.toLowerCase();
// URL filter for hosts needing updates, inactive hosts, up-to-date hosts, or stale hosts
// URL filter for hosts needing updates, inactive hosts, up-to-date hosts, stale hosts, or offline hosts
const filter = searchParams.get("filter");
const matchesUrlFilter =
(filter !== "needsUpdates" ||
@@ -570,7 +716,8 @@ const Hosts = () => {
(filter !== "inactive" ||
(host.effectiveStatus || host.status) === "inactive") &&
(filter !== "upToDate" || (!host.isStale && host.updatesCount === 0)) &&
(filter !== "stale" || host.isStale);
(filter !== "stale" || host.isStale) &&
(filter !== "offline" || wsStatusMap[host.api_id]?.connected !== true);
// Hide stale filter
const matchesHideStale = !hideStale || !host.isStale;
@@ -655,8 +802,21 @@ const Hosts = () => {
sortDirection,
searchParams,
hideStale,
wsStatusMap,
]);
// Get unique OS types from hosts for dynamic dropdown
const uniqueOsTypes = useMemo(() => {
if (!hosts) return [];
const osTypes = new Set();
hosts.forEach((host) => {
if (host.os_type) {
osTypes.add(host.os_type);
}
});
return Array.from(osTypes).sort();
}, [hosts]);
// Group hosts by selected field
const groupedHosts = useMemo(() => {
if (groupBy === "none") {
@@ -744,10 +904,19 @@ const Hosts = () => {
{ id: "group", label: "Group", visible: true, order: 4 },
{ id: "os", label: "OS", visible: true, order: 5 },
{ id: "os_version", label: "OS Version", visible: false, order: 6 },
{ id: "status", label: "Status", visible: true, order: 7 },
{ id: "updates", label: "Updates", visible: true, order: 8 },
{ id: "last_update", label: "Last Update", visible: true, order: 9 },
{ id: "actions", label: "Actions", visible: true, order: 10 },
{ id: "agent_version", label: "Agent Version", visible: true, order: 7 },
{
id: "auto_update",
label: "Agent Auto-Update",
visible: true,
order: 8,
},
{ id: "ws_status", label: "Connection", visible: true, order: 9 },
{ id: "status", label: "Status", visible: true, order: 10 },
{ id: "updates", label: "Updates", visible: true, order: 11 },
{ id: "notes", label: "Notes", visible: false, order: 12 },
{ id: "last_update", label: "Last Update", visible: true, order: 13 },
{ id: "actions", label: "Actions", visible: true, order: 14 },
];
updateColumnConfig(defaultConfig);
};
@@ -810,27 +979,33 @@ const Hosts = () => {
{host.ip || "N/A"}
</div>
);
case "group":
case "group": {
// Extract group IDs from the new many-to-many structure
const groupIds =
host.host_group_memberships?.map(
(membership) => membership.host_groups.id,
) || [];
return (
<InlineGroupEdit
key={`${host.id}-${host.host_groups?.id || "ungrouped"}-${host.host_groups?.name || "ungrouped"}`}
value={host.host_groups?.id}
onSave={(newGroupId) =>
updateHostGroupMutation.mutate({
<InlineMultiGroupEdit
key={`${host.id}-${groupIds.join(",")}`}
value={groupIds}
onSave={(newGroupIds) =>
updateHostGroupsMutation.mutate({
hostId: host.id,
hostGroupId: newGroupId,
groupIds: newGroupIds,
})
}
options={hostGroups || []}
placeholder="Select group..."
placeholder="Select groups..."
className="w-full"
/>
);
}
case "os":
return (
<div className="flex items-center gap-2 text-sm text-secondary-900 dark:text-white">
<OSIcon osType={host.os_type} className="h-4 w-4" />
<span>{host.os_type}</span>
<span>{getOSDisplayName(host.os_type)}</span>
</div>
);
case "os_version":
@@ -859,6 +1034,38 @@ const Hosts = () => {
falseLabel="No"
/>
);
case "ws_status": {
const wsStatus = wsStatusMap[host.api_id];
if (!wsStatus) {
return (
<span className="inline-flex items-center px-2 py-1 rounded-full text-xs font-medium bg-gray-100 text-gray-600 dark:bg-gray-700 dark:text-gray-400">
<div className="w-2 h-2 bg-gray-400 rounded-full mr-1.5"></div>
Unknown
</span>
);
}
return (
<span
className={`inline-flex items-center px-2 py-1 rounded-full text-xs font-medium ${
wsStatus.connected
? "bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-200"
: "bg-red-100 text-red-800 dark:bg-red-900 dark:text-red-200"
}`}
title={
wsStatus.connected
? `Agent connected via ${wsStatus.secure ? "WSS (secure)" : "WS (insecure)"}`
: "Agent not connected"
}
>
<div
className={`w-2 h-2 rounded-full mr-1.5 ${
wsStatus.connected ? "bg-green-500 animate-pulse" : "bg-red-500"
}`}
></div>
{wsStatus.connected ? (wsStatus.secure ? "WSS" : "WS") : "Offline"}
</span>
);
}
case "status":
return (
<div className="text-sm text-secondary-900 dark:text-white">
@@ -870,9 +1077,11 @@ const Hosts = () => {
return (
<button
type="button"
onClick={() => navigate(`/packages?host=${host.id}`)}
onClick={() =>
navigate(`/packages?host=${host.id}&filter=outdated`)
}
className="text-sm text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300 font-medium hover:underline"
title="View packages for this host"
title="View outdated packages for this host"
>
{host.updatesCount || 0}
</button>
@@ -952,13 +1161,13 @@ const Hosts = () => {
navigate(`/hosts?${newSearchParams.toString()}`, { replace: true });
};
const handleStaleClick = () => {
// Filter to show stale/inactive hosts
setStatusFilter("inactive");
const handleConnectionStatusClick = () => {
// Filter to show offline hosts (not connected via WebSocket)
setStatusFilter("all");
setShowFilters(true);
// We'll use the existing inactive URL filter logic
// Use a new URL filter for connection status
const newSearchParams = new URLSearchParams(window.location.search);
newSearchParams.set("filter", "inactive");
newSearchParams.set("filter", "offline");
navigate(`/hosts?${newSearchParams.toString()}`, { replace: true });
};
@@ -1012,13 +1221,12 @@ const Hosts = () => {
type="button"
onClick={() => refetch()}
disabled={isFetching}
className="btn-outline flex items-center gap-2"
className="btn-outline flex items-center justify-center p-2"
title="Refresh hosts data"
>
<RefreshCw
className={`h-4 w-4 ${isFetching ? "animate-spin" : ""}`}
/>
{isFetching ? "Refreshing..." : "Refresh"}
</button>
<button
type="button"
@@ -1088,17 +1296,46 @@ const Hosts = () => {
<button
type="button"
className="card p-4 cursor-pointer hover:shadow-card-hover dark:hover:shadow-card-hover-dark transition-shadow duration-200 text-left w-full"
onClick={handleStaleClick}
onClick={handleConnectionStatusClick}
>
<div className="flex items-center">
<AlertTriangle className="h-5 w-5 text-danger-600 mr-2" />
<div>
<p className="text-sm text-secondary-500 dark:text-white">
Stale
</p>
<p className="text-xl font-semibold text-secondary-900 dark:text-white">
{hosts?.filter((h) => h.isStale).length || 0}
<Wifi className="h-5 w-5 text-primary-600 mr-2" />
<div className="flex-1">
<p className="text-sm text-secondary-500 dark:text-white mb-1">
Connection Status
</p>
{(() => {
const connectedCount =
hosts?.filter(
(h) => wsStatusMap[h.api_id]?.connected === true,
).length || 0;
const offlineCount =
hosts?.filter(
(h) => wsStatusMap[h.api_id]?.connected !== true,
).length || 0;
return (
<div className="flex gap-4">
<div className="flex items-center gap-1">
<div className="w-2 h-2 bg-green-500 rounded-full"></div>
<span className="text-sm font-medium text-secondary-900 dark:text-white">
{connectedCount}
</span>
<span className="text-xs text-secondary-500 dark:text-secondary-400">
Connected
</span>
</div>
<div className="flex items-center gap-1">
<div className="w-2 h-2 bg-red-500 rounded-full"></div>
<span className="text-sm font-medium text-secondary-900 dark:text-white">
{offlineCount}
</span>
<span className="text-xs text-secondary-500 dark:text-secondary-400">
Offline
</span>
</div>
</div>
);
})()}
</div>
</div>
</button>
@@ -1266,9 +1503,11 @@ const Hosts = () => {
className="w-full border border-secondary-300 dark:border-secondary-600 rounded-lg px-3 py-2 focus:ring-2 focus:ring-primary-500 focus:border-primary-500 bg-white dark:bg-secondary-800 text-secondary-900 dark:text-white"
>
<option value="all">All OS</option>
<option value="linux">Linux</option>
<option value="windows">Windows</option>
<option value="macos">macOS</option>
{uniqueOsTypes.map((osType) => (
<option key={osType} value={osType.toLowerCase()}>
{osType}
</option>
))}
</select>
</div>
<div className="flex items-end">
@@ -1421,6 +1660,11 @@ const Hosts = () => {
<div className="flex items-center gap-2 font-normal text-xs text-secondary-500 dark:text-secondary-300 normal-case tracking-wider">
{column.label}
</div>
) : column.id === "ws_status" ? (
<div className="flex items-center gap-2 font-normal text-xs text-secondary-500 dark:text-secondary-300 normal-case tracking-wider">
<Wifi className="h-3 w-3" />
{column.label}
</div>
) : column.id === "status" ? (
<button
type="button"
@@ -1554,6 +1798,7 @@ const BulkAssignModal = ({
isLoading,
}) => {
const [selectedGroupId, setSelectedGroupId] = useState("");
const bulkHostGroupId = useId();
// Fetch host groups for selection
const { data: hostGroups } = useQuery({
@@ -1572,28 +1817,31 @@ const BulkAssignModal = ({
return (
<div className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
<div className="bg-white rounded-lg p-6 w-full max-w-md">
<div className="bg-white dark:bg-secondary-800 rounded-lg p-6 w-full max-w-md">
<div className="flex justify-between items-center mb-4">
<h3 className="text-lg font-semibold text-secondary-900">
<h3 className="text-lg font-semibold text-secondary-900 dark:text-white">
Assign to Host Group
</h3>
<button
type="button"
onClick={onClose}
className="text-secondary-400 hover:text-secondary-600"
className="text-secondary-400 hover:text-secondary-600 dark:text-secondary-300 dark:hover:text-secondary-100"
>
<X className="h-5 w-5" />
</button>
</div>
<div className="mb-4">
<p className="text-sm text-secondary-600 mb-2">
<p className="text-sm text-secondary-600 dark:text-secondary-400 mb-2">
Assigning {selectedHosts.length} host
{selectedHosts.length !== 1 ? "s" : ""}:
</p>
<div className="max-h-32 overflow-y-auto bg-secondary-50 rounded-md p-3">
<div className="max-h-32 overflow-y-auto bg-secondary-50 dark:bg-secondary-700 rounded-md p-3">
{selectedHostNames.map((friendlyName) => (
<div key={friendlyName} className="text-sm text-secondary-700">
<div
key={friendlyName}
className="text-sm text-secondary-700 dark:text-secondary-300"
>
{friendlyName}
</div>
))}
@@ -1604,7 +1852,7 @@ const BulkAssignModal = ({
<div>
<label
htmlFor={bulkHostGroupId}
className="block text-sm font-medium text-secondary-700 mb-1"
className="block text-sm font-medium text-secondary-700 dark:text-secondary-300 mb-1"
>
Host Group
</label>
@@ -1612,7 +1860,7 @@ const BulkAssignModal = ({
id={bulkHostGroupId}
value={selectedGroupId}
onChange={(e) => setSelectedGroupId(e.target.value)}
className="w-full px-3 py-2 border border-secondary-300 rounded-md focus:outline-none focus:ring-2 focus:ring-primary-500"
className="w-full px-3 py-2 border border-secondary-300 dark:border-secondary-600 rounded-md bg-white dark:bg-secondary-700 text-secondary-900 dark:text-white focus:outline-none focus:ring-2 focus:ring-primary-500"
>
<option value="">No group (ungrouped)</option>
{hostGroups?.map((group) => (
@@ -1621,7 +1869,7 @@ const BulkAssignModal = ({
</option>
))}
</select>
<p className="mt-1 text-sm text-secondary-500">
<p className="mt-1 text-sm text-secondary-500 dark:text-secondary-400">
Select a group to assign these hosts to, or leave ungrouped.
</p>
</div>
@@ -1765,9 +2013,10 @@ const ColumnSettingsModal = ({
};
return (
<div className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
<div className="bg-white dark:bg-secondary-800 rounded-lg shadow-xl max-w-md w-full mx-4">
<div className="px-6 py-4 border-b border-secondary-200 dark:border-secondary-600">
<div className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50 p-4">
<div className="bg-white dark:bg-secondary-800 rounded-lg shadow-xl max-w-lg w-full max-h-[85vh] flex flex-col">
{/* Header */}
<div className="px-6 py-4 border-b border-secondary-200 dark:border-secondary-600 flex-shrink-0">
<div className="flex items-center justify-between">
<h3 className="text-lg font-medium text-secondary-900 dark:text-white">
Column Settings
@@ -1780,14 +2029,14 @@ const ColumnSettingsModal = ({
<X className="h-5 w-5" />
</button>
</div>
</div>
<div className="px-6 py-4">
<p className="text-sm text-secondary-600 dark:text-secondary-300 mb-4">
<p className="text-sm text-secondary-600 dark:text-secondary-300 mt-2">
Drag to reorder columns or toggle visibility
</p>
</div>
<div className="space-y-2">
{/* Scrollable content */}
<div className="px-6 py-4 flex-1 overflow-y-auto">
<div className="space-y-1">
{columnConfig.map((column, index) => (
<button
key={column.id}
@@ -1804,22 +2053,22 @@ const ColumnSettingsModal = ({
// Focus handling for keyboard users
}
}}
className={`flex items-center justify-between p-3 border rounded-lg cursor-move w-full text-left ${
className={`flex items-center justify-between p-2.5 border rounded-lg cursor-move w-full text-left transition-colors ${
draggedIndex === index
? "opacity-50"
: "hover:bg-secondary-50 dark:hover:bg-secondary-700"
} border-secondary-200 dark:border-secondary-600`}
>
<div className="flex items-center gap-3">
<GripVertical className="h-4 w-4 text-secondary-400 dark:text-secondary-500" />
<span className="text-sm font-medium text-secondary-900 dark:text-white">
<div className="flex items-center gap-2.5">
<GripVertical className="h-4 w-4 text-secondary-400 dark:text-secondary-500 flex-shrink-0" />
<span className="text-sm font-medium text-secondary-900 dark:text-white truncate">
{column.label}
</span>
</div>
<button
type="button"
onClick={() => onToggleVisibility(column.id)}
className={`p-1 rounded ${
className={`p-1 rounded transition-colors flex-shrink-0 ${
column.visible
? "text-primary-600 hover:text-primary-700 dark:text-primary-400 dark:hover:text-primary-300"
: "text-secondary-400 hover:text-secondary-600 dark:text-secondary-500 dark:hover:text-secondary-300"
@@ -1834,8 +2083,11 @@ const ColumnSettingsModal = ({
</button>
))}
</div>
</div>
<div className="flex justify-between mt-6">
{/* Footer */}
<div className="px-6 py-4 border-t border-secondary-200 dark:border-secondary-600 flex-shrink-0">
<div className="flex justify-between">
<button type="button" onClick={onReset} className="btn-outline">
Reset to Default
</button>

View File

@@ -22,6 +22,7 @@ const Login = () => {
const emailId = useId();
const passwordId = useId();
const tokenId = useId();
const rememberMeId = useId();
const { login, setAuthState } = useAuth();
const [isSignupMode, setIsSignupMode] = useState(false);
const [formData, setFormData] = useState({
@@ -33,6 +34,7 @@ const Login = () => {
});
const [tfaData, setTfaData] = useState({
token: "",
remember_me: false,
});
const [showPassword, setShowPassword] = useState(false);
const [isLoading, setIsLoading] = useState(false);
@@ -127,7 +129,11 @@ const Login = () => {
setError("");
try {
const response = await authAPI.verifyTfa(tfaUsername, tfaData.token);
const response = await authAPI.verifyTfa(
tfaUsername,
tfaData.token,
tfaData.remember_me,
);
if (response.data?.token) {
// Update AuthContext with the new authentication state
@@ -158,9 +164,11 @@ const Login = () => {
};
const handleTfaInputChange = (e) => {
const { name, value, type, checked } = e.target;
setTfaData({
...tfaData,
[e.target.name]: e.target.value.replace(/\D/g, "").slice(0, 6),
[name]:
type === "checkbox" ? checked : value.replace(/\D/g, "").slice(0, 6),
});
// Clear error when user starts typing
if (error) {
@@ -170,7 +178,7 @@ const Login = () => {
const handleBackToLogin = () => {
setRequiresTfa(false);
setTfaData({ token: "" });
setTfaData({ token: "", remember_me: false });
setError("");
};
@@ -436,6 +444,23 @@ const Login = () => {
</div>
</div>
<div className="flex items-center">
<input
id={rememberMeId}
name="remember_me"
type="checkbox"
checked={tfaData.remember_me}
onChange={handleTfaInputChange}
className="h-4 w-4 text-primary-600 focus:ring-primary-500 border-secondary-300 rounded"
/>
<label
htmlFor={rememberMeId}
className="ml-2 block text-sm text-secondary-700"
>
Remember me on this computer (skip TFA for 30 days)
</label>
</div>
{error && (
<div className="bg-danger-50 border border-danger-200 rounded-md p-3">
<div className="flex">

View File

@@ -1,23 +1,476 @@
import { Package } from "lucide-react";
import { useParams } from "react-router-dom";
import { useQuery } from "@tanstack/react-query";
import {
AlertTriangle,
ArrowLeft,
Calendar,
ChartColumnBig,
ChevronRight,
Download,
Package,
RefreshCw,
Search,
Server,
Shield,
Tag,
} from "lucide-react";
import { useMemo, useState } from "react";
import { useNavigate, useParams } from "react-router-dom";
import { formatRelativeTime, packagesAPI } from "../utils/api";
const PackageDetail = () => {
const { packageId } = useParams();
const decodedPackageId = decodeURIComponent(packageId || "");
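// Package IDs may be URL-encoded in the route, so decode before querying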
const navigate = useNavigate();
const [searchTerm, setSearchTerm] = useState("");
const [currentPage, setCurrentPage] = useState(1);
const [pageSize, setPageSize] = useState(25);
// Fetch package details
const {
data: packageData,
isLoading: isLoadingPackage,
error: packageError,
refetch: refetchPackage,
} = useQuery({
queryKey: ["package", decodedPackageId],
queryFn: () =>
packagesAPI.getById(decodedPackageId).then((res) => res.data),
staleTime: 5 * 60 * 1000,
refetchOnWindowFocus: false,
enabled: !!decodedPackageId,
});
// Fetch hosts that have this package
const {
data: hostsData,
isLoading: isLoadingHosts,
error: hostsError,
refetch: refetchHosts,
} = useQuery({
queryKey: ["package-hosts", decodedPackageId, searchTerm],
queryFn: () =>
packagesAPI
.getHosts(decodedPackageId, { search: searchTerm, limit: 1000 })
.then((res) => res.data),
staleTime: 5 * 60 * 1000,
refetchOnWindowFocus: false,
enabled: !!decodedPackageId,
});
const hosts = hostsData?.hosts || [];
// Filter and paginate hosts
const filteredAndPaginatedHosts = useMemo(() => {
let filtered = hosts;
if (searchTerm) {
filtered = hosts.filter(
(host) =>
host.friendlyName?.toLowerCase().includes(searchTerm.toLowerCase()) ||
host.hostname?.toLowerCase().includes(searchTerm.toLowerCase()),
);
}
const startIndex = (currentPage - 1) * pageSize;
const endIndex = startIndex + pageSize;
return filtered.slice(startIndex, endIndex);
}, [hosts, searchTerm, currentPage, pageSize]);
const totalPages = Math.ceil(
(searchTerm
? hosts.filter(
(host) =>
host.friendlyName
?.toLowerCase()
.includes(searchTerm.toLowerCase()) ||
host.hostname?.toLowerCase().includes(searchTerm.toLowerCase()),
).length
: hosts.length) / pageSize,
);
const handleHostClick = (hostId) => {
navigate(`/hosts/${hostId}`);
};
const handleRefresh = () => {
refetchPackage();
refetchHosts();
};
if (isLoadingPackage) {
return (
<div className="flex items-center justify-center h-64">
<RefreshCw className="h-8 w-8 animate-spin text-primary-600" />
</div>
);
}
if (packageError) {
return (
<div className="space-y-6">
<div className="bg-danger-50 border border-danger-200 rounded-md p-4">
<div className="flex">
<AlertTriangle className="h-5 w-5 text-danger-400" />
<div className="ml-3">
<h3 className="text-sm font-medium text-danger-800">
Error loading package
</h3>
<p className="text-sm text-danger-700 mt-1">
{packageError.message || "Failed to load package details"}
</p>
<button
type="button"
onClick={() => refetchPackage()}
className="mt-2 btn-danger text-xs"
>
Try again
</button>
</div>
</div>
</div>
</div>
);
}
if (!packageData) {
return (
<div className="space-y-6">
<div className="text-center py-8">
<Package className="h-12 w-12 text-secondary-400 mx-auto mb-4" />
<p className="text-secondary-500 dark:text-secondary-300">
Package not found
</p>
</div>
</div>
);
}
const pkg = packageData;
const stats = packageData.stats || {};
return (
<div className="space-y-6">
<div className="card p-8 text-center">
<Package className="h-12 w-12 text-secondary-400 mx-auto mb-4" />
<h3 className="text-lg font-medium text-secondary-900 mb-2">
Package Details
</h3>
<p className="text-secondary-600">
Detailed view for package: {packageId}
</p>
<p className="text-secondary-600 mt-2">
This page will show package information, affected hosts, version
distribution, and more.
</p>
{/* Header */}
<div className="flex items-center justify-between">
<div className="flex items-center gap-4">
<button
type="button"
onClick={() => navigate("/packages")}
className="flex items-center gap-2 text-secondary-600 hover:text-secondary-900 dark:text-secondary-400 dark:hover:text-white transition-colors"
>
<ArrowLeft className="h-4 w-4" />
Back to Packages
</button>
<ChevronRight className="h-4 w-4 text-secondary-400" />
<h1 className="text-2xl font-semibold text-secondary-900 dark:text-white">
{pkg.name}
</h1>
</div>
<button
type="button"
onClick={handleRefresh}
disabled={isLoadingPackage || isLoadingHosts}
className="btn-outline flex items-center gap-2"
>
<RefreshCw
className={`h-4 w-4 ${
isLoadingPackage || isLoadingHosts ? "animate-spin" : ""
}`}
/>
Refresh
</button>
</div>
{/* Package Overview */}
<div className="grid grid-cols-1 lg:grid-cols-3 gap-6">
{/* Main Package Info */}
<div className="lg:col-span-2">
<div className="card p-6">
<div className="flex items-start gap-4 mb-4">
<Package className="h-8 w-8 text-primary-600 flex-shrink-0 mt-1" />
<div className="flex-1">
<h2 className="text-xl font-semibold text-secondary-900 dark:text-white mb-2">
{pkg.name}
</h2>
{pkg.description && (
<p className="text-secondary-600 dark:text-secondary-300 mb-4">
{pkg.description}
</p>
)}
<div className="flex flex-wrap gap-4 text-sm">
{pkg.category && (
<div className="flex items-center gap-2">
<Tag className="h-4 w-4 text-secondary-400" />
<span className="text-secondary-600 dark:text-secondary-300">
Category: {pkg.category}
</span>
</div>
)}
{pkg.latest_version && (
<div className="flex items-center gap-2">
<Download className="h-4 w-4 text-secondary-400" />
<span className="text-secondary-600 dark:text-secondary-300">
Latest: {pkg.latest_version}
</span>
</div>
)}
{pkg.updated_at && (
<div className="flex items-center gap-2">
<Calendar className="h-4 w-4 text-secondary-400" />
<span className="text-secondary-600 dark:text-secondary-300">
Updated: {formatRelativeTime(pkg.updated_at)}
</span>
</div>
)}
</div>
</div>
</div>
{/* Status Badge */}
<div className="mb-4">
{stats.updatesNeeded > 0 ? (
stats.securityUpdates > 0 ? (
<span className="badge-danger flex items-center gap-1 w-fit">
<Shield className="h-3 w-3" />
Security Update Available
</span>
) : (
<span className="badge-warning w-fit">Update Available</span>
)
) : (
<span className="badge-success w-fit">Up to Date</span>
)}
</div>
</div>
</div>
{/* Statistics */}
<div className="space-y-4">
<div className="card p-4">
<div className="flex items-center gap-3 mb-3">
<ChartColumnBig className="h-5 w-5 text-primary-600" />
<h3 className="font-medium text-secondary-900 dark:text-white">
Installation Stats
</h3>
</div>
<div className="space-y-3">
<div className="flex justify-between">
<span className="text-secondary-600 dark:text-secondary-300">
Total Installations
</span>
<span className="font-semibold text-secondary-900 dark:text-white">
{stats.totalInstalls || 0}
</span>
</div>
{stats.updatesNeeded > 0 && (
<div className="flex justify-between">
<span className="text-secondary-600 dark:text-secondary-300">
Hosts Needing Updates
</span>
<span className="font-semibold text-warning-600">
{stats.updatesNeeded}
</span>
</div>
)}
{stats.securityUpdates > 0 && (
<div className="flex justify-between">
<span className="text-secondary-600 dark:text-secondary-300">
Security Updates
</span>
<span className="font-semibold text-danger-600">
{stats.securityUpdates}
</span>
</div>
)}
<div className="flex justify-between">
<span className="text-secondary-600 dark:text-secondary-300">
Up to Date
</span>
<span className="font-semibold text-success-600">
{(stats.totalInstalls || 0) - (stats.updatesNeeded || 0)}
</span>
</div>
</div>
</div>
</div>
</div>
{/* Hosts List */}
<div className="card">
<div className="px-6 py-4 border-b border-secondary-200 dark:border-secondary-600">
<div className="flex items-center justify-between mb-4">
<div className="flex items-center gap-3">
<Server className="h-5 w-5 text-primary-600" />
<h3 className="text-lg font-medium text-secondary-900 dark:text-white">
Installed On Hosts ({hosts.length})
</h3>
</div>
</div>
{/* Search */}
<div className="relative max-w-sm">
<Search className="absolute left-3 top-1/2 transform -translate-y-1/2 h-4 w-4 text-secondary-400" />
<input
type="text"
placeholder="Search hosts..."
value={searchTerm}
onChange={(e) => {
setSearchTerm(e.target.value);
setCurrentPage(1);
}}
className="w-full pl-10 pr-4 py-2 border border-secondary-300 dark:border-secondary-600 rounded-md focus:ring-2 focus:ring-primary-500 focus:border-transparent bg-white dark:bg-secondary-800 text-secondary-900 dark:text-white placeholder-secondary-500 dark:placeholder-secondary-400"
/>
</div>
</div>
<div className="overflow-x-auto">
{isLoadingHosts ? (
<div className="flex items-center justify-center h-32">
<RefreshCw className="h-6 w-6 animate-spin text-primary-600" />
</div>
) : hostsError ? (
<div className="p-6">
<div className="bg-danger-50 border border-danger-200 rounded-md p-4">
<div className="flex">
<AlertTriangle className="h-5 w-5 text-danger-400" />
<div className="ml-3">
<h3 className="text-sm font-medium text-danger-800">
Error loading hosts
</h3>
<p className="text-sm text-danger-700 mt-1">
{hostsError.message || "Failed to load hosts"}
</p>
</div>
</div>
</div>
</div>
) : filteredAndPaginatedHosts.length === 0 ? (
<div className="text-center py-8">
<Server className="h-12 w-12 text-secondary-400 mx-auto mb-4" />
<p className="text-secondary-500 dark:text-secondary-300">
{searchTerm
? "No hosts match your search"
: "No hosts have this package installed"}
</p>
</div>
) : (
<>
<table className="min-w-full divide-y divide-secondary-200 dark:divide-secondary-600">
<thead className="bg-secondary-50 dark:bg-secondary-700">
<tr>
<th className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider">
Host
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider">
Current Version
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider">
Status
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider">
Last Updated
</th>
</tr>
</thead>
<tbody className="bg-white dark:bg-secondary-800 divide-y divide-secondary-200 dark:divide-secondary-600">
{filteredAndPaginatedHosts.map((host) => (
<tr
key={host.hostId}
className="hover:bg-secondary-50 dark:hover:bg-secondary-700 cursor-pointer transition-colors"
onClick={() => handleHostClick(host.hostId)}
>
<td className="px-6 py-4 whitespace-nowrap">
<div className="flex items-center">
<Server className="h-5 w-5 text-secondary-400 mr-3" />
<div>
<div className="text-sm font-medium text-secondary-900 dark:text-white">
{host.friendlyName || host.hostname}
</div>
{host.friendlyName && host.hostname && (
<div className="text-sm text-secondary-500 dark:text-secondary-300">
{host.hostname}
</div>
)}
</div>
</div>
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-secondary-900 dark:text-white">
{host.currentVersion || "Unknown"}
</td>
<td className="px-6 py-4 whitespace-nowrap">
{host.needsUpdate ? (
host.isSecurityUpdate ? (
<span className="badge-danger flex items-center gap-1 w-fit">
<Shield className="h-3 w-3" />
Security Update
</span>
) : (
<span className="badge-warning w-fit">
Update Available
</span>
)
) : (
<span className="badge-success w-fit">
Up to Date
</span>
)}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-secondary-500 dark:text-secondary-300">
{host.lastUpdate
? formatRelativeTime(host.lastUpdate)
: "Never"}
</td>
</tr>
))}
</tbody>
</table>
{/* Pagination */}
{totalPages > 1 && (
<div className="px-6 py-3 bg-white dark:bg-secondary-800 border-t border-secondary-200 dark:border-secondary-600 flex items-center justify-between">
<div className="flex items-center gap-2">
<span className="text-sm text-secondary-700 dark:text-secondary-300">
Rows per page:
</span>
<select
value={pageSize}
onChange={(e) => {
setPageSize(Number(e.target.value));
setCurrentPage(1);
}}
className="text-sm border border-secondary-300 dark:border-secondary-600 rounded px-2 py-1 bg-white dark:bg-secondary-700 text-secondary-900 dark:text-white"
>
<option value={25}>25</option>
<option value={50}>50</option>
<option value={100}>100</option>
</select>
</div>
<div className="flex items-center gap-2">
<button
type="button"
onClick={() => setCurrentPage(currentPage - 1)}
disabled={currentPage === 1}
className="px-3 py-1 text-sm border border-secondary-300 dark:border-secondary-600 rounded disabled:opacity-50 disabled:cursor-not-allowed hover:bg-secondary-50 dark:hover:bg-secondary-700"
>
Previous
</button>
<span className="text-sm text-secondary-700 dark:text-secondary-300">
Page {currentPage} of {totalPages}
</span>
<button
type="button"
onClick={() => setCurrentPage(currentPage + 1)}
disabled={currentPage === totalPages}
className="px-3 py-1 text-sm border border-secondary-300 dark:border-secondary-600 rounded disabled:opacity-50 disabled:cursor-not-allowed hover:bg-secondary-50 dark:hover:bg-secondary-700"
>
Next
</button>
</div>
</div>
)}
</>
)}
</div>
</div>
</div>
);

View File

@@ -4,6 +4,8 @@ import {
ArrowDown,
ArrowUp,
ArrowUpDown,
ChevronLeft,
ChevronRight,
Columns,
Eye as EyeIcon,
EyeOff as EyeOffIcon,
@@ -17,16 +19,28 @@ import {
} from "lucide-react";
import { useEffect, useMemo, useState } from "react";
import { useNavigate, useSearchParams } from "react-router-dom";
import { dashboardAPI } from "../utils/api";
import { dashboardAPI, packagesAPI } from "../utils/api";
const Packages = () => {
const [searchTerm, setSearchTerm] = useState("");
const [categoryFilter, setCategoryFilter] = useState("all");
const [securityFilter, setSecurityFilter] = useState("all");
const [updateStatusFilter, setUpdateStatusFilter] = useState("all-packages");
const [hostFilter, setHostFilter] = useState("all");
const [sortField, setSortField] = useState("name");
const [sortDirection, setSortDirection] = useState("asc");
const [showColumnSettings, setShowColumnSettings] = useState(false);
const [currentPage, setCurrentPage] = useState(1);
const [pageSize, setPageSize] = useState(() => {
const saved = localStorage.getItem("packages-page-size");
if (saved) {
const parsedSize = parseInt(saved, 10);
// Validate that the saved page size is one of the allowed values
if ([25, 50, 100, 200].includes(parsedSize)) {
return parsedSize;
}
}
return 25; // Default fallback
});
const [searchParams] = useSearchParams();
const navigate = useNavigate();
@@ -42,8 +56,8 @@ const Packages = () => {
const [columnConfig, setColumnConfig] = useState(() => {
const defaultConfig = [
{ id: "name", label: "Package", visible: true, order: 0 },
{ id: "affectedHosts", label: "Affected Hosts", visible: true, order: 1 },
{ id: "priority", label: "Priority", visible: true, order: 2 },
{ id: "packageHosts", label: "Installed On", visible: true, order: 1 },
{ id: "status", label: "Status", visible: true, order: 2 },
{ id: "latestVersion", label: "Latest Version", visible: true, order: 3 },
];
@@ -65,10 +79,10 @@ const Packages = () => {
localStorage.setItem("packages-column-config", JSON.stringify(newConfig));
};
// Handle affected hosts click
const handleAffectedHostsClick = (pkg) => {
const affectedHosts = pkg.affectedHosts || [];
const hostIds = affectedHosts.map((host) => host.hostId);
// Handle hosts click (view hosts where package is installed)
const handlePackageHostsClick = (pkg) => {
const packageHosts = pkg.packageHosts || [];
const hostIds = packageHosts.map((host) => host.hostId);
// Create URL with selected hosts and filter
const params = new URLSearchParams();
@@ -86,27 +100,59 @@ const Packages = () => {
// For outdated packages, we want to show all packages that need updates
// This is the default behavior, so we don't need to change filters
setCategoryFilter("all");
setSecurityFilter("all");
setUpdateStatusFilter("needs-updates");
} else if (filter === "security") {
// For security updates, filter to show only security updates
setSecurityFilter("security");
setUpdateStatusFilter("security-updates");
setCategoryFilter("all");
} else if (filter === "regular") {
// For regular (non-security) updates
setUpdateStatusFilter("regular-updates");
setCategoryFilter("all");
}
}, [searchParams]);
const {
data: packages,
data: packagesResponse,
isLoading,
error,
refetch,
isFetching,
} = useQuery({
queryKey: ["packages"],
queryFn: () => dashboardAPI.getPackages().then((res) => res.data),
queryKey: ["packages", hostFilter, updateStatusFilter],
queryFn: () => {
const params = { limit: 10000 }; // High limit to effectively get all packages
if (hostFilter && hostFilter !== "all") {
params.host = hostFilter;
}
// Pass update status filter to backend to pre-filter packages
if (updateStatusFilter === "needs-updates") {
params.needsUpdate = "true";
} else if (updateStatusFilter === "security-updates") {
params.isSecurityUpdate = "true";
}
return packagesAPI.getAll(params).then((res) => res.data);
},
staleTime: 5 * 60 * 1000, // Data stays fresh for 5 minutes
refetchOnWindowFocus: false, // Don't refetch when window regains focus
});
// Extract packages from the response and normalise the data structure
const packages = useMemo(() => {
if (!packagesResponse?.packages) return [];
return packagesResponse.packages.map((pkg) => ({
...pkg,
// Normalise field names to match the frontend expectations
packageHostsCount: pkg.packageHostsCount || pkg.stats?.totalInstalls || 0,
latestVersion: pkg.latest_version || pkg.latestVersion || "Unknown",
isUpdatable: (pkg.stats?.updatesNeeded || 0) > 0,
isSecurityUpdate: (pkg.stats?.securityUpdates || 0) > 0,
// Ensure we have hosts array (for packages, this contains all hosts where the package is installed)
packageHosts: pkg.packageHosts || [],
}));
}, [packagesResponse]);
// Fetch hosts data to get total packages count
const { data: hosts } = useQuery({
queryKey: ["hosts"],
@@ -128,17 +174,24 @@ const Packages = () => {
const matchesCategory =
categoryFilter === "all" || pkg.category === categoryFilter;
const matchesSecurity =
securityFilter === "all" ||
(securityFilter === "security" && pkg.isSecurityUpdate) ||
(securityFilter === "regular" && !pkg.isSecurityUpdate);
const matchesUpdateStatus =
updateStatusFilter === "all-packages" ||
(updateStatusFilter === "needs-updates" &&
(pkg.stats?.updatesNeeded || 0) > 0) ||
(updateStatusFilter === "security-updates" &&
(pkg.stats?.securityUpdates || 0) > 0) ||
(updateStatusFilter === "regular-updates" &&
(pkg.stats?.updatesNeeded || 0) > 0 &&
(pkg.stats?.securityUpdates || 0) === 0);
const affectedHosts = pkg.affectedHosts || [];
const packageHosts = pkg.packageHosts || [];
const matchesHost =
hostFilter === "all" ||
affectedHosts.some((host) => host.hostId === hostFilter);
packageHosts.some((host) => host.hostId === hostFilter);
return matchesSearch && matchesCategory && matchesSecurity && matchesHost;
return (
matchesSearch && matchesCategory && matchesUpdateStatus && matchesHost
);
});
// Sorting
@@ -154,14 +207,38 @@ const Packages = () => {
aValue = a.latestVersion?.toLowerCase() || "";
bValue = b.latestVersion?.toLowerCase() || "";
break;
case "affectedHosts":
aValue = a.affectedHostsCount || a.affectedHosts?.length || 0;
bValue = b.affectedHostsCount || b.affectedHosts?.length || 0;
case "packageHosts":
aValue = a.packageHostsCount || a.packageHosts?.length || 0;
bValue = b.packageHostsCount || b.packageHosts?.length || 0;
break;
case "priority":
aValue = a.isSecurityUpdate ? 0 : 1; // Security updates first
bValue = b.isSecurityUpdate ? 0 : 1;
case "status": {
// Handle sorting for the three status states: Up to Date, Update Available, Security Update Available
const aNeedsUpdates = (a.stats?.updatesNeeded || 0) > 0;
const bNeedsUpdates = (b.stats?.updatesNeeded || 0) > 0;
// Define priority order: Security Update (0) > Regular Update (1) > Up to Date (2)
let aPriority, bPriority;
if (!aNeedsUpdates) {
aPriority = 2; // Up to Date
} else if (a.isSecurityUpdate) {
aPriority = 0; // Security Update
} else {
aPriority = 1; // Regular Update
}
if (!bNeedsUpdates) {
bPriority = 2; // Up to Date
} else if (b.isSecurityUpdate) {
bPriority = 0; // Security Update
} else {
bPriority = 1; // Regular Update
}
aValue = aPriority;
bValue = bPriority;
break;
}
default:
aValue = a.name?.toLowerCase() || "";
bValue = b.name?.toLowerCase() || "";
@@ -177,12 +254,33 @@ const Packages = () => {
packages,
searchTerm,
categoryFilter,
securityFilter,
updateStatusFilter,
sortField,
sortDirection,
hostFilter,
]);
// Calculate pagination
const totalPages = Math.ceil(filteredAndSortedPackages.length / pageSize);
const startIndex = (currentPage - 1) * pageSize;
const endIndex = startIndex + pageSize;
const paginatedPackages = filteredAndSortedPackages.slice(
startIndex,
endIndex,
);
// Reset to first page when filters or page size change
// biome-ignore lint/correctness/useExhaustiveDependencies: We want this effect to run when filter values or page size change to reset pagination
useEffect(() => {
setCurrentPage(1);
}, [searchTerm, categoryFilter, updateStatusFilter, hostFilter, pageSize]);
// Function to handle page size change and save to localStorage
const handlePageSizeChange = (newPageSize) => {
setPageSize(newPageSize);
localStorage.setItem("packages-page-size", newPageSize.toString());
};
// Get visible columns in order
const visibleColumns = columnConfig
.filter((col) => col.visible)
@@ -231,8 +329,8 @@ const Packages = () => {
const resetColumns = () => {
const defaultConfig = [
{ id: "name", label: "Package", visible: true, order: 0 },
{ id: "affectedHosts", label: "Affected Hosts", visible: true, order: 1 },
{ id: "priority", label: "Priority", visible: true, order: 2 },
{ id: "packageHosts", label: "Installed On", visible: true, order: 1 },
{ id: "status", label: "Status", visible: true, order: 2 },
{ id: "latestVersion", label: "Latest Version", visible: true, order: 3 },
];
updateColumnConfig(defaultConfig);
@@ -243,10 +341,14 @@ const Packages = () => {
switch (column.id) {
case "name":
return (
<div className="flex items-center">
<Package className="h-5 w-5 text-secondary-400 mr-3" />
<div>
<div className="text-sm font-medium text-secondary-900 dark:text-white">
<button
type="button"
onClick={() => navigate(`/packages/${pkg.id}`)}
className="flex items-center text-left hover:bg-secondary-100 dark:hover:bg-secondary-700 rounded p-2 -m-2 transition-colors group w-full"
>
<Package className="h-5 w-5 text-secondary-400 mr-3 flex-shrink-0" />
<div className="flex-1">
<div className="text-sm font-medium text-secondary-900 dark:text-white group-hover:text-primary-600 dark:group-hover:text-primary-400">
{pkg.name}
</div>
{pkg.description && (
@@ -260,33 +362,58 @@ const Packages = () => {
</div>
)}
</div>
</div>
</button>
);
case "affectedHosts": {
const affectedHostsCount =
pkg.affectedHostsCount || pkg.affectedHosts?.length || 0;
case "packageHosts": {
// Show total number of hosts where this package is installed
const installedHostsCount =
pkg.packageHostsCount ||
pkg.stats?.totalInstalls ||
pkg.packageHosts?.length ||
0;
// For packages that need updates, show how many need updates
const hostsNeedingUpdates = pkg.stats?.updatesNeeded || 0;
const displayText =
hostsNeedingUpdates > 0 && hostsNeedingUpdates < installedHostsCount
? `${hostsNeedingUpdates}/${installedHostsCount} hosts`
: `${installedHostsCount} host${installedHostsCount !== 1 ? "s" : ""}`;
const titleText =
hostsNeedingUpdates > 0 && hostsNeedingUpdates < installedHostsCount
? `${hostsNeedingUpdates} of ${installedHostsCount} hosts need updates`
: `Installed on ${installedHostsCount} host${installedHostsCount !== 1 ? "s" : ""}`;
return (
<button
type="button"
onClick={() => handleAffectedHostsClick(pkg)}
onClick={() => handlePackageHostsClick(pkg)}
className="text-left hover:bg-secondary-100 dark:hover:bg-secondary-700 rounded p-1 -m-1 transition-colors group"
title={`Click to view all ${affectedHostsCount} affected hosts`}
title={titleText}
>
<div className="text-sm text-secondary-900 dark:text-white group-hover:text-primary-600 dark:group-hover:text-primary-400">
{affectedHostsCount} host{affectedHostsCount !== 1 ? "s" : ""}
{displayText}
</div>
</button>
);
}
case "priority":
case "status": {
// Check if this package needs updates
const needsUpdates = (pkg.stats?.updatesNeeded || 0) > 0;
if (!needsUpdates) {
return <span className="badge-success">Up to Date</span>;
}
return pkg.isSecurityUpdate ? (
<span className="badge-danger flex items-center gap-1">
<span className="badge-danger">
<Shield className="h-3 w-3" />
Security Update
Security Update Available
</span>
) : (
<span className="badge-warning">Regular Update</span>
<span className="badge-warning">Update Available</span>
);
}
case "latestVersion":
return (
<div
@@ -305,28 +432,38 @@ const Packages = () => {
const categories =
[...new Set(packages?.map((pkg) => pkg.category).filter(Boolean))] || [];
// Calculate unique affected hosts
const uniqueAffectedHosts = new Set();
// Calculate unique package hosts
const uniquePackageHosts = new Set();
packages?.forEach((pkg) => {
const affectedHosts = pkg.affectedHosts || [];
affectedHosts.forEach((host) => {
uniqueAffectedHosts.add(host.hostId);
});
// Only count hosts for packages that need updates
if ((pkg.stats?.updatesNeeded || 0) > 0) {
const packageHosts = pkg.packageHosts || [];
packageHosts.forEach((host) => {
uniquePackageHosts.add(host.hostId);
});
}
});
const uniqueAffectedHostsCount = uniqueAffectedHosts.size;
const uniquePackageHostsCount = uniquePackageHosts.size;
// Calculate total packages across all hosts (including up-to-date ones)
// Calculate total packages installed
// When filtering by host, count each package once (since it can only be installed once per host)
// When not filtering, sum up all installations across all hosts
const totalPackagesCount =
hosts?.reduce((total, host) => {
return total + (host.totalPackagesCount || 0);
}, 0) || 0;
hostFilter && hostFilter !== "all"
? packages?.length || 0
: packages?.reduce(
(sum, pkg) => sum + (pkg.stats?.totalInstalls || 0),
0,
) || 0;
// Calculate outdated packages (packages that need updates)
const outdatedPackagesCount = packages?.length || 0;
// Calculate outdated packages
const outdatedPackagesCount =
packages?.filter((pkg) => (pkg.stats?.updatesNeeded || 0) > 0).length || 0;
// Calculate security updates
const securityUpdatesCount =
packages?.filter((pkg) => pkg.isSecurityUpdate).length || 0;
packages?.filter((pkg) => (pkg.stats?.securityUpdates || 0) > 0).length ||
0;
if (isLoading) {
return (
@@ -398,7 +535,7 @@ const Packages = () => {
<Package className="h-5 w-5 text-primary-600 mr-2" />
<div>
<p className="text-sm text-secondary-500 dark:text-white">
Total Packages
Total Installed
</p>
<p className="text-xl font-semibold text-secondary-900 dark:text-white">
{totalPackagesCount}
@@ -429,7 +566,7 @@ const Packages = () => {
Hosts Pending Updates
</p>
<p className="text-xl font-semibold text-secondary-900 dark:text-white">
{uniqueAffectedHostsCount}
{uniquePackageHostsCount}
</p>
</div>
</div>
@@ -490,16 +627,21 @@ const Packages = () => {
</select>
</div>
{/* Security Filter */}
{/* Update Status Filter */}
<div className="sm:w-48">
<select
value={securityFilter}
onChange={(e) => setSecurityFilter(e.target.value)}
value={updateStatusFilter}
onChange={(e) => setUpdateStatusFilter(e.target.value)}
className="w-full px-3 py-2 border border-secondary-300 dark:border-secondary-600 rounded-md focus:ring-2 focus:ring-primary-500 focus:border-transparent bg-white dark:bg-secondary-800 text-secondary-900 dark:text-white"
>
<option value="all">All Updates</option>
<option value="security">Security Only</option>
<option value="regular">Regular Only</option>
<option value="all-packages">All Packages</option>
<option value="needs-updates">
Packages Needing Updates
</option>
<option value="security-updates">
Security Updates Only
</option>
<option value="regular-updates">Regular Updates Only</option>
</select>
</div>
@@ -539,12 +681,13 @@ const Packages = () => {
<Package className="h-12 w-12 text-secondary-400 mx-auto mb-4" />
<p className="text-secondary-500 dark:text-secondary-300">
{packages?.length === 0
? "No packages need updates"
? "No packages found"
: "No packages match your filters"}
</p>
{packages?.length === 0 && (
<p className="text-sm text-secondary-400 dark:text-secondary-400 mt-2">
All packages are up to date across all hosts
Packages will appear here once hosts start reporting their
installed packages
</p>
)}
</div>
@@ -571,7 +714,7 @@ const Packages = () => {
</tr>
</thead>
<tbody className="bg-white dark:bg-secondary-800 divide-y divide-secondary-200 dark:divide-secondary-600">
{filteredAndSortedPackages.map((pkg) => (
{paginatedPackages.map((pkg) => (
<tr
key={pkg.id}
className="hover:bg-secondary-50 dark:hover:bg-secondary-700 transition-colors"
@@ -591,6 +734,57 @@ const Packages = () => {
</div>
)}
</div>
{/* Pagination Controls */}
{filteredAndSortedPackages.length > 0 && (
<div className="flex items-center justify-between px-6 py-3 bg-white dark:bg-secondary-800 border-t border-secondary-200 dark:border-secondary-600">
<div className="flex items-center gap-4">
<div className="flex items-center gap-2">
<span className="text-sm text-secondary-700 dark:text-secondary-300">
Rows per page:
</span>
<select
value={pageSize}
onChange={(e) =>
handlePageSizeChange(Number(e.target.value))
}
className="text-sm border border-secondary-300 dark:border-secondary-600 rounded px-2 py-1 bg-white dark:bg-secondary-700 text-secondary-900 dark:text-white"
>
<option value={25}>25</option>
<option value={50}>50</option>
<option value={100}>100</option>
<option value={200}>200</option>
</select>
</div>
<span className="text-sm text-secondary-700 dark:text-secondary-300">
{startIndex + 1}-
{Math.min(endIndex, filteredAndSortedPackages.length)} of{" "}
{filteredAndSortedPackages.length}
</span>
</div>
<div className="flex items-center gap-2">
<button
type="button"
onClick={() => setCurrentPage(currentPage - 1)}
disabled={currentPage === 1}
className="p-1 rounded hover:bg-secondary-100 dark:hover:bg-secondary-600 disabled:opacity-50 disabled:cursor-not-allowed"
>
<ChevronLeft className="h-4 w-4" />
</button>
<span className="text-sm text-secondary-700 dark:text-secondary-300">
Page {currentPage} of {totalPages}
</span>
<button
type="button"
onClick={() => setCurrentPage(currentPage + 1)}
disabled={currentPage === totalPages}
className="p-1 rounded hover:bg-secondary-100 dark:hover:bg-secondary-600 disabled:opacity-50 disabled:cursor-not-allowed"
>
<ChevronRight className="h-4 w-4" />
</button>
</div>
</div>
)}
</div>
</div>
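The pagination controls in the Packages hunk above reference paginatedPackages, startIndex, endIndex, totalPages and handlePageSizeChange, whose definitions fall outside the visible hunks. Below is a minimal sketch of how those derived values could be wired up; it assumes the existing filteredAndSortedPackages memo and the useState/useMemo imports already present in the component, and the shipped implementation may differ in detail.

// Sketch only: assumed pagination state and derivations for the Packages component.
const [currentPage, setCurrentPage] = useState(1);
const [pageSize, setPageSize] = useState(25);

const totalPages = Math.max(
	1,
	Math.ceil(filteredAndSortedPackages.length / pageSize),
);
const startIndex = (currentPage - 1) * pageSize;
const endIndex = startIndex + pageSize;

// Slice the filtered list down to the visible page.
const paginatedPackages = useMemo(
	() => filteredAndSortedPackages.slice(startIndex, endIndex),
	[filteredAndSortedPackages, startIndex, endIndex],
);

// Changing the page size resets to page 1 so the displayed range stays valid.
const handlePageSizeChange = (size) => {
	setPageSize(size);
	setCurrentPage(1);
};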


@@ -2,12 +2,16 @@ import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
import {
AlertCircle,
CheckCircle,
Clock,
Copy,
Download,
Eye,
EyeOff,
Key,
LogOut,
Mail,
MapPin,
Monitor,
Moon,
RefreshCw,
Save,
@@ -18,7 +22,7 @@ import {
User,
} from "lucide-react";
import { useId, useState } from "react";
import { useEffect, useId, useState } from "react";
import { useAuth } from "../contexts/AuthContext";
import { useTheme } from "../contexts/ThemeContext";
@@ -45,6 +49,18 @@ const Profile = () => {
last_name: user?.last_name || "",
});
// Update profileData when user data changes
useEffect(() => {
if (user) {
setProfileData({
username: user.username || "",
email: user.email || "",
first_name: user.first_name || "",
last_name: user.last_name || "",
});
}
}, [user]);
const [passwordData, setPasswordData] = useState({
currentPassword: "",
newPassword: "",
@@ -141,6 +157,7 @@ const Profile = () => {
{ id: "profile", name: "Profile Information", icon: User },
{ id: "password", name: "Change Password", icon: Key },
{ id: "tfa", name: "Multi-Factor Authentication", icon: Smartphone },
{ id: "sessions", name: "Active Sessions", icon: Monitor },
];
return (
@@ -521,6 +538,9 @@ const Profile = () => {
{/* Multi-Factor Authentication Tab */}
{activeTab === "tfa" && <TfaTab />}
{/* Sessions Tab */}
{activeTab === "sessions" && <SessionsTab />}
</div>
</div>
</div>
@@ -1060,4 +1080,256 @@ const TfaTab = () => {
);
};
// Sessions Tab Component
const SessionsTab = () => {
const _queryClient = useQueryClient();
const [_isLoading, _setIsLoading] = useState(false);
const [message, setMessage] = useState({ type: "", text: "" });
// Fetch user sessions
const {
data: sessionsData,
isLoading: sessionsLoading,
refetch,
} = useQuery({
queryKey: ["user-sessions"],
queryFn: async () => {
const response = await fetch("/api/v1/auth/sessions", {
headers: {
Authorization: `Bearer ${localStorage.getItem("token")}`,
},
});
if (!response.ok) throw new Error("Failed to fetch sessions");
return response.json();
},
});
// Revoke individual session mutation
const revokeSessionMutation = useMutation({
mutationFn: async (sessionId) => {
const response = await fetch(`/api/v1/auth/sessions/${sessionId}`, {
method: "DELETE",
headers: {
Authorization: `Bearer ${localStorage.getItem("token")}`,
},
});
if (!response.ok) throw new Error("Failed to revoke session");
return response.json();
},
onSuccess: () => {
setMessage({ type: "success", text: "Session revoked successfully" });
refetch();
},
onError: (error) => {
setMessage({ type: "error", text: error.message });
},
});
// Revoke all sessions mutation
const revokeAllSessionsMutation = useMutation({
mutationFn: async () => {
const response = await fetch("/api/v1/auth/sessions", {
method: "DELETE",
headers: {
Authorization: `Bearer ${localStorage.getItem("token")}`,
},
});
if (!response.ok) throw new Error("Failed to revoke sessions");
return response.json();
},
onSuccess: () => {
setMessage({
type: "success",
text: "All other sessions revoked successfully",
});
refetch();
},
onError: (error) => {
setMessage({ type: "error", text: error.message });
},
});
const formatDate = (dateString) => {
return new Date(dateString).toLocaleString();
};
const formatRelativeTime = (dateString) => {
const now = new Date();
const date = new Date(dateString);
const diff = now - date;
const minutes = Math.floor(diff / 60000);
const hours = Math.floor(diff / 3600000);
const days = Math.floor(diff / 86400000);
if (days > 0) return `${days} day${days > 1 ? "s" : ""} ago`;
if (hours > 0) return `${hours} hour${hours > 1 ? "s" : ""} ago`;
if (minutes > 0) return `${minutes} minute${minutes > 1 ? "s" : ""} ago`;
return "Just now";
};
const handleRevokeSession = (sessionId) => {
if (window.confirm("Are you sure you want to revoke this session?")) {
revokeSessionMutation.mutate(sessionId);
}
};
const handleRevokeAllSessions = () => {
if (
window.confirm(
"Are you sure you want to revoke all other sessions? This will log you out of all other devices.",
)
) {
revokeAllSessionsMutation.mutate();
}
};
return (
<div className="space-y-6">
{/* Header */}
<div>
<h3 className="text-lg font-medium text-secondary-900 dark:text-secondary-100">
Active Sessions
</h3>
<p className="text-sm text-secondary-600 dark:text-secondary-300">
Manage your active sessions and devices. You can see where you're
logged in and revoke access for any device.
</p>
</div>
{/* Message */}
{message.text && (
<div
className={`rounded-md p-4 ${
message.type === "success"
? "bg-success-50 border border-success-200 text-success-700"
: "bg-danger-50 border border-danger-200 text-danger-700"
}`}
>
<div className="flex">
{message.type === "success" ? (
<CheckCircle className="h-5 w-5" />
) : (
<AlertCircle className="h-5 w-5" />
)}
<div className="ml-3">
<p className="text-sm">{message.text}</p>
</div>
</div>
</div>
)}
{/* Sessions List */}
{sessionsLoading ? (
<div className="flex items-center justify-center py-8">
<div className="animate-spin rounded-full h-8 w-8 border-b-2 border-primary-600"></div>
</div>
) : sessionsData?.sessions?.length > 0 ? (
<div className="space-y-4">
{/* Revoke All Button */}
{sessionsData.sessions.filter((s) => !s.is_current_session).length >
0 && (
<div className="flex justify-end">
<button
type="button"
onClick={handleRevokeAllSessions}
disabled={revokeAllSessionsMutation.isPending}
className="inline-flex items-center px-4 py-2 border border-danger-300 text-sm font-medium rounded-md text-danger-700 bg-white hover:bg-danger-50 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-danger-500 disabled:opacity-50"
>
<LogOut className="h-4 w-4 mr-2" />
{revokeAllSessionsMutation.isPending
? "Revoking..."
: "Revoke All Other Sessions"}
</button>
</div>
)}
{/* Sessions */}
{sessionsData.sessions.map((session) => (
<div
key={session.id}
className={`border rounded-lg p-4 ${
session.is_current_session
? "border-primary-200 bg-primary-50 dark:border-primary-800 dark:bg-primary-900/20"
: "border-secondary-200 bg-white dark:border-secondary-700 dark:bg-secondary-800"
}`}
>
<div className="flex items-start justify-between">
<div className="flex-1">
<div className="flex items-center space-x-3">
<Monitor className="h-5 w-5 text-secondary-500" />
<div>
<div className="flex items-center space-x-2">
<h4 className="text-sm font-medium text-secondary-900 dark:text-secondary-100">
{session.device_info?.browser} on{" "}
{session.device_info?.os}
</h4>
{session.is_current_session && (
<span className="inline-flex items-center px-2 py-1 rounded-full text-xs font-medium bg-primary-100 text-primary-800 dark:bg-primary-900 dark:text-primary-200">
Current Session
</span>
)}
{session.tfa_remember_me && (
<span className="inline-flex items-center px-2 py-1 rounded-full text-xs font-medium bg-success-100 text-success-800 dark:bg-success-900 dark:text-success-200">
Remembered
</span>
)}
</div>
<p className="text-sm text-secondary-600 dark:text-secondary-400">
{session.device_info?.device} • {session.ip_address}
</p>
</div>
</div>
<div className="mt-3 grid grid-cols-1 md:grid-cols-2 gap-4 text-sm text-secondary-600 dark:text-secondary-400">
<div className="flex items-center space-x-2">
<MapPin className="h-4 w-4" />
<span>
{session.location_info?.city},{" "}
{session.location_info?.country}
</span>
</div>
<div className="flex items-center space-x-2">
<Clock className="h-4 w-4" />
<span>
Last active: {formatRelativeTime(session.last_activity)}
</span>
</div>
<div className="flex items-center space-x-2">
<span>Created: {formatDate(session.created_at)}</span>
</div>
<div className="flex items-center space-x-2">
<span>Login count: {session.login_count}</span>
</div>
</div>
</div>
{!session.is_current_session && (
<button
type="button"
onClick={() => handleRevokeSession(session.id)}
disabled={revokeSessionMutation.isPending}
className="ml-4 inline-flex items-center px-3 py-2 border border-danger-300 text-sm font-medium rounded-md text-danger-700 bg-white hover:bg-danger-50 focus:outline-none focus:ring-2 focus:ring-offset-2 focus:ring-danger-500 disabled:opacity-50"
>
<LogOut className="h-4 w-4" />
</button>
)}
</div>
</div>
))}
</div>
) : (
<div className="text-center py-8">
<Monitor className="mx-auto h-12 w-12 text-secondary-400" />
<h3 className="mt-2 text-sm font-medium text-secondary-900 dark:text-secondary-100">
No active sessions
</h3>
<p className="mt-1 text-sm text-secondary-600 dark:text-secondary-400">
You don't have any active sessions at the moment.
</p>
</div>
)}
</div>
);
};
export default Profile;
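The SessionsTab above reads a specific set of fields from GET /api/v1/auth/sessions. Inferred purely from what the JSX renders, the response is assumed to look roughly like the sketch below; the sample values are illustrative and any field not referenced in the component is a guess rather than documented API.

// Assumed shape of the /api/v1/auth/sessions response, inferred from the fields SessionsTab reads.
const exampleSessionsResponse = {
	sessions: [
		{
			id: "sess_01", // React key and the id passed to DELETE /api/v1/auth/sessions/:id
			is_current_session: true, // highlights the card and hides its revoke button
			tfa_remember_me: false, // shows the "Remembered" badge when true
			device_info: { browser: "Firefox", os: "Linux", device: "Desktop" },
			ip_address: "203.0.113.10",
			location_info: { city: "London", country: "GB" },
			last_activity: "2025-10-15T21:40:00Z", // rendered via formatRelativeTime
			created_at: "2025-10-01T09:12:00Z", // rendered via formatDate
			login_count: 14,
		},
	],
};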


@@ -1,4 +1,4 @@
import { useQuery } from "@tanstack/react-query";
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
import {
AlertTriangle,
ArrowDown,
@@ -7,7 +7,6 @@ import {
Check,
Columns,
Database,
Eye,
GripVertical,
Lock,
RefreshCw,
@@ -15,21 +14,34 @@ import {
Server,
Shield,
ShieldCheck,
Trash2,
Unlock,
Users,
X,
} from "lucide-react";
import { useMemo, useState } from "react";
import { Link } from "react-router-dom";
import { repositoryAPI } from "../utils/api";
import { useEffect, useMemo, useState } from "react";
import { useNavigate, useSearchParams } from "react-router-dom";
import { dashboardAPI, repositoryAPI } from "../utils/api";
const Repositories = () => {
const queryClient = useQueryClient();
const navigate = useNavigate();
const [searchParams] = useSearchParams();
const [searchTerm, setSearchTerm] = useState("");
const [filterType, setFilterType] = useState("all"); // all, secure, insecure
const [filterStatus, setFilterStatus] = useState("all"); // all, active, inactive
const [hostFilter, setHostFilter] = useState("");
const [sortField, setSortField] = useState("name");
const [sortDirection, setSortDirection] = useState("asc");
const [showColumnSettings, setShowColumnSettings] = useState(false);
const [deleteModalData, setDeleteModalData] = useState(null);
// Handle host filter from URL parameter
useEffect(() => {
const hostParam = searchParams.get("host");
if (hostParam) {
setHostFilter(hostParam);
}
}, [searchParams]);
// Column configuration
const [columnConfig, setColumnConfig] = useState(() => {
@@ -80,6 +92,26 @@ const Repositories = () => {
queryFn: () => repositoryAPI.getStats().then((res) => res.data),
});
// Fetch host information when filtering by host
const { data: hosts } = useQuery({
queryKey: ["hosts"],
queryFn: () => dashboardAPI.getHosts().then((res) => res.data),
staleTime: 5 * 60 * 1000,
enabled: !!hostFilter,
});
// Get the filtered host information
const filteredHost = hosts?.find((host) => host.id === hostFilter);
// Delete repository mutation
const deleteRepositoryMutation = useMutation({
mutationFn: (repositoryId) => repositoryAPI.delete(repositoryId),
onSuccess: () => {
queryClient.invalidateQueries(["repositories"]);
queryClient.invalidateQueries(["repository-stats"]);
},
});
// Get visible columns in order
const visibleColumns = columnConfig
.filter((col) => col.visible)
@@ -138,6 +170,32 @@ const Repositories = () => {
updateColumnConfig(defaultConfig);
};
const handleDeleteRepository = (repo, e) => {
e.preventDefault();
e.stopPropagation();
setDeleteModalData({
id: repo.id,
name: repo.name,
hostCount: repo.hostCount || 0,
});
};
const handleRowClick = (repo) => {
navigate(`/repositories/${repo.id}`);
};
const confirmDelete = () => {
if (deleteModalData) {
deleteRepositoryMutation.mutate(deleteModalData.id);
setDeleteModalData(null);
}
};
const cancelDelete = () => {
setDeleteModalData(null);
};
// Filter and sort repositories
const filteredAndSortedRepositories = useMemo(() => {
if (!repositories) return [];
@@ -165,7 +223,11 @@ const Repositories = () => {
(filterStatus === "active" && repo.is_active === true) ||
(filterStatus === "inactive" && repo.is_active === false);
return matchesSearch && matchesType && matchesStatus;
// Filter by host if hostFilter is set
const matchesHost =
!hostFilter || repo.hosts?.some((host) => host.id === hostFilter);
return matchesSearch && matchesType && matchesStatus && matchesHost;
});
// Sort repositories
@@ -200,6 +262,7 @@ const Repositories = () => {
filterStatus,
sortField,
sortDirection,
hostFilter,
]);
if (isLoading) {
@@ -225,6 +288,56 @@ const Repositories = () => {
return (
<div className="h-[calc(100vh-7rem)] flex flex-col overflow-hidden">
{/* Delete Confirmation Modal */}
{deleteModalData && (
<div className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
<div className="bg-white dark:bg-secondary-800 rounded-lg p-6 max-w-md w-full mx-4">
<div className="flex items-center mb-4">
<AlertTriangle className="h-6 w-6 text-red-500 mr-3" />
<h3 className="text-lg font-semibold text-secondary-900 dark:text-white">
Delete Repository
</h3>
</div>
<div className="mb-6">
<p className="text-secondary-700 dark:text-secondary-300 mb-2">
Are you sure you want to delete{" "}
<strong>"{deleteModalData.name}"</strong>?
</p>
{deleteModalData.hostCount > 0 && (
<p className="text-amber-600 dark:text-amber-400 text-sm">
This repository is currently assigned to{" "}
{deleteModalData.hostCount} host
{deleteModalData.hostCount !== 1 ? "s" : ""}.
</p>
)}
<p className="text-red-600 dark:text-red-400 text-sm mt-2">
This action cannot be undone.
</p>
</div>
<div className="flex gap-3 justify-end">
<button
type="button"
onClick={cancelDelete}
className="px-4 py-2 text-secondary-600 dark:text-secondary-400 hover:text-secondary-800 dark:hover:text-secondary-200 transition-colors"
disabled={deleteRepositoryMutation.isPending}
>
Cancel
</button>
<button
type="button"
onClick={confirmDelete}
className="px-4 py-2 bg-red-600 text-white rounded-md hover:bg-red-700 transition-colors disabled:opacity-50 disabled:cursor-not-allowed"
disabled={deleteRepositoryMutation.isPending}
>
{deleteRepositoryMutation.isPending
? "Deleting..."
: "Delete Repository"}
</button>
</div>
</div>
</div>
)}
{/* Page Header */}
<div className="flex items-center justify-between mb-6">
<div>
@@ -334,6 +447,31 @@ const Repositories = () => {
</div>
</div>
{/* Host Filter Indicator */}
{hostFilter && filteredHost && (
<div className="flex items-center gap-2 px-3 py-2 bg-primary-50 dark:bg-primary-900 border border-primary-200 dark:border-primary-700 rounded-md">
<Server className="h-4 w-4 text-primary-600 dark:text-primary-400" />
<span className="text-sm text-primary-700 dark:text-primary-300">
Filtered by: {filteredHost.friendly_name}
</span>
<button
type="button"
onClick={() => {
setHostFilter("");
// Update URL to remove host parameter
const newSearchParams = new URLSearchParams(searchParams);
newSearchParams.delete("host");
navigate(`/repositories?${newSearchParams.toString()}`, {
replace: true,
});
}}
className="text-primary-500 hover:text-primary-700 dark:text-primary-400 dark:hover:text-primary-200"
>
<X className="h-4 w-4" />
</button>
</div>
)}
{/* Security Filter */}
<div className="sm:w-48">
<select
@@ -415,7 +553,8 @@ const Repositories = () => {
{filteredAndSortedRepositories.map((repo) => (
<tr
key={repo.id}
className="hover:bg-secondary-50 dark:hover:bg-secondary-700 transition-colors"
className="hover:bg-secondary-50 dark:hover:bg-secondary-700 transition-colors cursor-pointer"
onClick={() => handleRowClick(repo)}
>
{visibleColumns.map((column) => (
<td
@@ -513,19 +652,23 @@ const Repositories = () => {
case "hostCount":
return (
<div className="flex items-center justify-center gap-1 text-sm text-secondary-900 dark:text-white">
<Users className="h-4 w-4" />
<span>{repo.host_count}</span>
<Server className="h-4 w-4" />
<span>{repo.hostCount}</span>
</div>
);
case "actions":
return (
<Link
to={`/repositories/${repo.id}`}
className="text-primary-600 hover:text-primary-900 flex items-center gap-1"
>
View
<Eye className="h-3 w-3" />
</Link>
<div className="flex items-center justify-center">
<button
type="button"
onClick={(e) => handleDeleteRepository(repo, e)}
className="text-orange-600 hover:text-red-900 dark:text-orange-600 dark:hover:text-red-400 flex items-center gap-1"
disabled={deleteRepositoryMutation.isPending}
title="Delete repository"
>
<Trash2 className="h-4 w-4" />
</button>
</div>
);
default:
return null;
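The Repositories component now reads a host query parameter and narrows the table to repositories reported by that host (repo.hosts?.some(...)). A small usage sketch of deep-linking into that filtered view from elsewhere in the app follows; the button and hostId prop are hypothetical, and only the /repositories?host= URL contract comes from the diff.

// Sketch: deep-linking into the host-filtered repositories list.
// Only the query parameter name ("host") is taken from Repositories; the rest is illustrative.
import { useNavigate } from "react-router-dom";

const ViewHostRepositoriesButton = ({ hostId }) => {
	const navigate = useNavigate();
	return (
		<button
			type="button"
			onClick={() => navigate(`/repositories?host=${hostId}`)}
		>
			View repositories for this host
		</button>
	);
};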


@@ -6,17 +6,18 @@ import {
Database,
Globe,
Lock,
Search,
Server,
Shield,
ShieldOff,
Trash2,
Unlock,
Users,
} from "lucide-react";
import { useId, useState } from "react";
import { useId, useMemo, useState } from "react";
import { Link, useParams } from "react-router-dom";
import { repositoryAPI } from "../utils/api";
import { Link, useNavigate, useParams } from "react-router-dom";
import { formatRelativeTime, repositoryAPI } from "../utils/api";
const RepositoryDetail = () => {
const isActiveId = useId();
@@ -24,9 +25,14 @@ const RepositoryDetail = () => {
const priorityId = useId();
const descriptionId = useId();
const { repositoryId } = useParams();
const navigate = useNavigate();
const queryClient = useQueryClient();
const [editMode, setEditMode] = useState(false);
const [formData, setFormData] = useState({});
const [searchTerm, setSearchTerm] = useState("");
const [currentPage, setCurrentPage] = useState(1);
const [pageSize, setPageSize] = useState(25);
const [showDeleteModal, setShowDeleteModal] = useState(false);
// Fetch repository details
const {
@@ -39,6 +45,49 @@ const RepositoryDetail = () => {
enabled: !!repositoryId,
});
const hosts = repository?.host_repositories || [];
// Filter and paginate hosts
const filteredAndPaginatedHosts = useMemo(() => {
let filtered = hosts;
if (searchTerm) {
filtered = hosts.filter(
(hostRepo) =>
hostRepo.hosts.friendly_name
?.toLowerCase()
.includes(searchTerm.toLowerCase()) ||
hostRepo.hosts.hostname
?.toLowerCase()
.includes(searchTerm.toLowerCase()) ||
hostRepo.hosts.ip?.toLowerCase().includes(searchTerm.toLowerCase()),
);
}
const startIndex = (currentPage - 1) * pageSize;
const endIndex = startIndex + pageSize;
return filtered.slice(startIndex, endIndex);
}, [hosts, searchTerm, currentPage, pageSize]);
const totalPages = Math.ceil(
(searchTerm
? hosts.filter(
(hostRepo) =>
hostRepo.hosts.friendly_name
?.toLowerCase()
.includes(searchTerm.toLowerCase()) ||
hostRepo.hosts.hostname
?.toLowerCase()
.includes(searchTerm.toLowerCase()) ||
hostRepo.hosts.ip?.toLowerCase().includes(searchTerm.toLowerCase()),
).length
: hosts.length) / pageSize,
);
const handleHostClick = (hostId) => {
navigate(`/hosts/${hostId}`);
};
// Update repository mutation
const updateRepositoryMutation = useMutation({
mutationFn: (data) => repositoryAPI.update(repositoryId, data),
@@ -49,6 +98,15 @@ const RepositoryDetail = () => {
},
});
// Delete repository mutation
const deleteRepositoryMutation = useMutation({
mutationFn: () => repositoryAPI.delete(repositoryId),
onSuccess: () => {
queryClient.invalidateQueries(["repositories"]);
navigate("/repositories");
},
});
const handleEdit = () => {
setFormData({
name: repository.name,
@@ -68,6 +126,19 @@ const RepositoryDetail = () => {
setFormData({});
};
const handleDelete = () => {
setShowDeleteModal(true);
};
const confirmDelete = () => {
deleteRepositoryMutation.mutate();
setShowDeleteModal(false);
};
const cancelDelete = () => {
setShowDeleteModal(false);
};
if (isLoading) {
return (
<div className="flex items-center justify-center h-64">
@@ -127,6 +198,56 @@ const RepositoryDetail = () => {
return (
<div className="space-y-6">
{/* Delete Confirmation Modal */}
{showDeleteModal && (
<div className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
<div className="bg-white dark:bg-secondary-800 rounded-lg p-6 max-w-md w-full mx-4">
<div className="flex items-center mb-4">
<AlertTriangle className="h-6 w-6 text-red-500 mr-3" />
<h3 className="text-lg font-semibold text-secondary-900 dark:text-white">
Delete Repository
</h3>
</div>
<div className="mb-6">
<p className="text-secondary-700 dark:text-secondary-300 mb-2">
Are you sure you want to delete{" "}
<strong>"{repository?.name}"</strong>?
</p>
{repository?.host_repositories?.length > 0 && (
<p className="text-amber-600 dark:text-amber-400 text-sm">
This repository is currently assigned to{" "}
{repository.host_repositories.length} host
{repository.host_repositories.length !== 1 ? "s" : ""}.
</p>
)}
<p className="text-red-600 dark:text-red-400 text-sm mt-2">
This action cannot be undone.
</p>
</div>
<div className="flex gap-3 justify-end">
<button
type="button"
onClick={cancelDelete}
className="px-4 py-2 text-secondary-600 dark:text-secondary-400 hover:text-secondary-800 dark:hover:text-secondary-200 transition-colors"
disabled={deleteRepositoryMutation.isPending}
>
Cancel
</button>
<button
type="button"
onClick={confirmDelete}
className="px-4 py-2 bg-red-600 text-white rounded-md hover:bg-red-700 transition-colors disabled:opacity-50 disabled:cursor-not-allowed"
disabled={deleteRepositoryMutation.isPending}
>
{deleteRepositoryMutation.isPending
? "Deleting..."
: "Delete Repository"}
</button>
</div>
</div>
</div>
)}
{/* Header */}
<div className="flex items-center justify-between">
<div className="flex items-center gap-4">
@@ -157,9 +278,6 @@ const RepositoryDetail = () => {
{repository.is_active ? "Active" : "Inactive"}
</span>
</div>
<p className="text-secondary-500 dark:text-secondary-300 mt-1">
Repository configuration and host assignments
</p>
</div>
</div>
<div className="flex items-center gap-2">
@@ -185,15 +303,30 @@ const RepositoryDetail = () => {
</button>
</>
) : (
<button type="button" onClick={handleEdit} className="btn-primary">
Edit Repository
</button>
<>
<button
type="button"
onClick={handleDelete}
className="btn-outline border-red-200 text-red-600 hover:bg-red-50 hover:border-red-300 dark:border-red-800 dark:text-red-400 dark:hover:bg-red-900/20 dark:hover:border-red-700 flex items-center gap-2"
disabled={deleteRepositoryMutation.isPending}
>
<Trash2 className="h-4 w-4" />
{deleteRepositoryMutation.isPending ? "Deleting..." : "Delete"}
</button>
<button
type="button"
onClick={handleEdit}
className="btn-primary"
>
Edit Repository
</button>
</>
)}
</div>
</div>
{/* Repository Information */}
<div className="bg-white dark:bg-secondary-800 rounded-lg shadow">
<div className="card">
<div className="px-6 py-4 border-b border-secondary-200 dark:border-secondary-700">
<h2 className="text-lg font-semibold text-secondary-900 dark:text-white">
Repository Information
@@ -369,80 +502,159 @@ const RepositoryDetail = () => {
</div>
{/* Hosts Using This Repository */}
<div className="bg-white dark:bg-secondary-800 rounded-lg shadow">
<div className="px-6 py-4 border-b border-secondary-200 dark:border-secondary-700">
<h2 className="text-lg font-semibold text-secondary-900 dark:text-white flex items-center gap-2">
<Users className="h-5 w-5" />
Hosts Using This Repository (
{repository.host_repositories?.length || 0})
</h2>
</div>
{!repository.host_repositories ||
repository.host_repositories.length === 0 ? (
<div className="px-6 py-12 text-center">
<Server className="mx-auto h-12 w-12 text-secondary-400" />
<h3 className="mt-2 text-sm font-medium text-secondary-900 dark:text-white">
No hosts using this repository
</h3>
<p className="mt-1 text-sm text-secondary-500 dark:text-secondary-300">
This repository hasn't been reported by any hosts yet.
</p>
<div className="card">
<div className="px-6 py-4 border-b border-secondary-200 dark:border-secondary-600">
<div className="flex items-center justify-between mb-4">
<div className="flex items-center gap-3">
<Server className="h-5 w-5 text-primary-600" />
<h3 className="text-lg font-medium text-secondary-900 dark:text-white">
Hosts Using This Repository ({hosts.length})
</h3>
</div>
</div>
) : (
<div className="divide-y divide-secondary-200 dark:divide-secondary-700">
{repository.host_repositories.map((hostRepo) => (
<div
key={hostRepo.id}
className="px-6 py-4 hover:bg-secondary-50 dark:hover:bg-secondary-700/50"
>
<div className="flex items-center justify-between">
<div className="flex items-center gap-3">
<div
className={`w-3 h-3 rounded-full ${
hostRepo.hosts.status === "active"
? "bg-green-500"
: hostRepo.hosts.status === "pending"
? "bg-yellow-500"
: "bg-red-500"
}`}
/>
<div>
<Link
to={`/hosts/${hostRepo.hosts.id}`}
className="text-primary-600 hover:text-primary-700 font-medium"
>
{hostRepo.hosts.friendly_name}
</Link>
<div className="flex items-center gap-4 text-sm text-secondary-500 dark:text-secondary-400 mt-1">
<span>IP: {hostRepo.hosts.ip}</span>
<span>
OS: {hostRepo.hosts.os_type}{" "}
{hostRepo.hosts.os_version}
</span>
<span>
Last Update:{" "}
{new Date(
hostRepo.hosts.last_update,
).toLocaleDateString()}
</span>
</div>
</div>
{/* Search */}
<div className="relative max-w-sm">
<Search className="absolute left-3 top-1/2 transform -translate-y-1/2 h-4 w-4 text-secondary-400" />
<input
type="text"
placeholder="Search hosts..."
value={searchTerm}
onChange={(e) => {
setSearchTerm(e.target.value);
setCurrentPage(1);
}}
className="w-full pl-10 pr-4 py-2 border border-secondary-300 dark:border-secondary-600 rounded-md focus:ring-2 focus:ring-primary-500 focus:border-transparent bg-white dark:bg-secondary-800 text-secondary-900 dark:text-white placeholder-secondary-500 dark:placeholder-secondary-400"
/>
</div>
</div>
<div className="overflow-x-auto">
{filteredAndPaginatedHosts.length === 0 ? (
<div className="text-center py-8">
<Server className="h-12 w-12 text-secondary-400 mx-auto mb-4" />
<p className="text-secondary-500 dark:text-secondary-300">
{searchTerm
? "No hosts match your search"
: "This repository hasn't been reported by any hosts yet."}
</p>
</div>
) : (
<>
<table className="min-w-full divide-y divide-secondary-200 dark:divide-secondary-600">
<thead className="bg-secondary-50 dark:bg-secondary-700">
<tr>
<th className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider">
Host
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider">
Operating System
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider">
Last Checked
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-300 uppercase tracking-wider">
Last Update
</th>
</tr>
</thead>
<tbody className="bg-white dark:bg-secondary-800 divide-y divide-secondary-200 dark:divide-secondary-600">
{filteredAndPaginatedHosts.map((hostRepo) => (
<tr
key={hostRepo.id}
className="hover:bg-secondary-50 dark:hover:bg-secondary-700 cursor-pointer transition-colors"
onClick={() => handleHostClick(hostRepo.hosts.id)}
>
<td className="px-6 py-4 whitespace-nowrap">
<div className="flex items-center">
<div
className={`w-2 h-2 rounded-full mr-3 ${
hostRepo.hosts.status === "active"
? "bg-success-500"
: hostRepo.hosts.status === "pending"
? "bg-warning-500"
: "bg-danger-500"
}`}
/>
<Server className="h-5 w-5 text-secondary-400 mr-3" />
<div>
<div className="text-sm font-medium text-secondary-900 dark:text-white">
{hostRepo.hosts.friendly_name ||
hostRepo.hosts.hostname}
</div>
{hostRepo.hosts.friendly_name &&
hostRepo.hosts.hostname && (
<div className="text-sm text-secondary-500 dark:text-secondary-300">
{hostRepo.hosts.hostname}
</div>
)}
</div>
</div>
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-secondary-900 dark:text-white">
{hostRepo.hosts.os_type} {hostRepo.hosts.os_version}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-secondary-500 dark:text-secondary-300">
{hostRepo.last_checked
? formatRelativeTime(hostRepo.last_checked)
: "Never"}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-secondary-500 dark:text-secondary-300">
{hostRepo.hosts.last_update
? formatRelativeTime(hostRepo.hosts.last_update)
: "Never"}
</td>
</tr>
))}
</tbody>
</table>
{/* Pagination */}
{totalPages > 1 && (
<div className="px-6 py-3 bg-white dark:bg-secondary-800 border-t border-secondary-200 dark:border-secondary-600 flex items-center justify-between">
<div className="flex items-center gap-2">
<span className="text-sm text-secondary-700 dark:text-secondary-300">
Rows per page:
</span>
<select
value={pageSize}
onChange={(e) => {
setPageSize(Number(e.target.value));
setCurrentPage(1);
}}
className="text-sm border border-secondary-300 dark:border-secondary-600 rounded px-2 py-1 bg-white dark:bg-secondary-700 text-secondary-900 dark:text-white"
>
<option value={25}>25</option>
<option value={50}>50</option>
<option value={100}>100</option>
</select>
</div>
<div className="flex items-center gap-4">
<div className="text-center">
<div className="text-xs text-secondary-500 dark:text-secondary-400">
Last Checked
</div>
<div className="text-sm text-secondary-900 dark:text-white">
{new Date(hostRepo.last_checked).toLocaleDateString()}
</div>
</div>
<div className="flex items-center gap-2">
<button
type="button"
onClick={() => setCurrentPage(currentPage - 1)}
disabled={currentPage === 1}
className="px-3 py-1 text-sm border border-secondary-300 dark:border-secondary-600 rounded disabled:opacity-50 disabled:cursor-not-allowed hover:bg-secondary-50 dark:hover:bg-secondary-700"
>
Previous
</button>
<span className="text-sm text-secondary-700 dark:text-secondary-300">
Page {currentPage} of {totalPages}
</span>
<button
type="button"
onClick={() => setCurrentPage(currentPage + 1)}
disabled={currentPage === totalPages}
className="px-3 py-1 text-sm border border-secondary-300 dark:border-secondary-600 rounded disabled:opacity-50 disabled:cursor-not-allowed hover:bg-secondary-50 dark:hover:bg-secondary-700"
>
Next
</button>
</div>
</div>
</div>
))}
</div>
)}
)}
</>
)}
</div>
</div>
</div>
);
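In the RepositoryDetail hunks above, the search predicate is evaluated twice: once inside filteredAndPaginatedHosts and again when computing totalPages. Below is a hedged sketch of an alternative that filters once and derives both the page slice and the count from the same memo; it reuses the hosts, searchTerm, currentPage and pageSize values already defined in the component and is a possible simplification, not what the diff actually ships.

// Sketch: single filter pass shared by the page slice and the page count.
const filteredHosts = useMemo(() => {
	if (!searchTerm) return hosts;
	const term = searchTerm.toLowerCase();
	return hosts.filter(
		(hostRepo) =>
			hostRepo.hosts.friendly_name?.toLowerCase().includes(term) ||
			hostRepo.hosts.hostname?.toLowerCase().includes(term) ||
			hostRepo.hosts.ip?.toLowerCase().includes(term),
	);
}, [hosts, searchTerm]);

const totalPages = Math.max(1, Math.ceil(filteredHosts.length / pageSize));

const paginatedHosts = useMemo(() => {
	const start = (currentPage - 1) * pageSize;
	return filteredHosts.slice(start, start + pageSize);
}, [filteredHosts, currentPage, pageSize]);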


@@ -5,11 +5,13 @@ import {
Clock,
Code,
Download,
Image,
Plus,
Save,
Server,
Settings as SettingsIcon,
Shield,
Upload,
X,
} from "lucide-react";
@@ -80,6 +82,15 @@ const Settings = () => {
});
const [showUploadModal, setShowUploadModal] = useState(false);
// Logo management state
const [logoUploadState, setLogoUploadState] = useState({
dark: { uploading: false, error: null },
light: { uploading: false, error: null },
favicon: { uploading: false, error: null },
});
const [showLogoUploadModal, setShowLogoUploadModal] = useState(false);
const [selectedLogoType, setSelectedLogoType] = useState("dark");
// Version checking state
const [versionInfo, setVersionInfo] = useState({
currentVersion: null, // Will be loaded from API
@@ -109,7 +120,7 @@ const Settings = () => {
});
// Helper function to get curl flags based on settings
const getCurlFlags = () => {
const _getCurlFlags = () => {
return settings?.ignore_ssl_self_signed ? "-sk" : "-s";
};
@@ -192,6 +203,37 @@ const Settings = () => {
},
});
// Logo upload mutation
const uploadLogoMutation = useMutation({
mutationFn: ({ logoType, fileContent, fileName }) =>
fetch("/api/v1/settings/logos/upload", {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${localStorage.getItem("token")}`,
},
body: JSON.stringify({ logoType, fileContent, fileName }),
}).then((res) => res.json()),
onSuccess: (_data, variables) => {
queryClient.invalidateQueries(["settings"]);
setLogoUploadState((prev) => ({
...prev,
[variables.logoType]: { uploading: false, error: null },
}));
setShowLogoUploadModal(false);
},
onError: (error, variables) => {
console.error("Upload logo error:", error);
setLogoUploadState((prev) => ({
...prev,
[variables.logoType]: {
uploading: false,
error: error.message || "Failed to upload logo",
},
}));
},
});
// Load current version on component mount
useEffect(() => {
const loadCurrentVersion = async () => {
@@ -556,6 +598,181 @@ const Settings = () => {
</p>
</div>
{/* Logo Management Section */}
<div className="mt-6 pt-6 border-t border-secondary-200 dark:border-secondary-600">
<div className="flex items-center mb-4">
<Image className="h-5 w-5 text-primary-600 mr-2" />
<h3 className="text-lg font-semibold text-secondary-900 dark:text-white">
Logo & Branding
</h3>
</div>
<p className="text-sm text-secondary-500 dark:text-secondary-300 mb-4">
Customize your PatchMon installation with custom logos and
favicon.
</p>
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
{/* Dark Logo */}
<div className="bg-white dark:bg-secondary-800 rounded-lg p-4 border border-secondary-200 dark:border-secondary-600">
<h4 className="text-sm font-medium text-secondary-900 dark:text-white mb-3">
Dark Logo
</h4>
{settings?.logo_dark && (
<div className="flex items-center justify-center p-3 bg-secondary-50 dark:bg-secondary-700 rounded-lg mb-3">
<img
src={settings.logo_dark}
alt="Dark Logo"
className="max-h-12 max-w-full object-contain"
onError={(e) => {
e.target.style.display = "none";
}}
/>
</div>
)}
<p className="text-xs text-secondary-600 dark:text-secondary-400 mb-3 truncate">
{settings?.logo_dark
? settings.logo_dark.split("/").pop()
: "Default"}
</p>
<button
type="button"
onClick={() => {
setSelectedLogoType("dark");
setShowLogoUploadModal(true);
}}
disabled={logoUploadState.dark.uploading}
className="w-full btn-outline flex items-center justify-center gap-2 text-sm py-2"
>
{logoUploadState.dark.uploading ? (
<>
<div className="animate-spin rounded-full h-3 w-3 border-b-2 border-current"></div>
Uploading...
</>
) : (
<>
<Upload className="h-3 w-3" />
Upload
</>
)}
</button>
{logoUploadState.dark.error && (
<p className="text-xs text-red-600 dark:text-red-400 mt-2">
{logoUploadState.dark.error}
</p>
)}
</div>
{/* Light Logo */}
<div className="bg-white dark:bg-secondary-800 rounded-lg p-4 border border-secondary-200 dark:border-secondary-600">
<h4 className="text-sm font-medium text-secondary-900 dark:text-white mb-3">
Light Logo
</h4>
{settings?.logo_light && (
<div className="flex items-center justify-center p-3 bg-secondary-50 dark:bg-secondary-700 rounded-lg mb-3">
<img
src={settings.logo_light}
alt="Light Logo"
className="max-h-12 max-w-full object-contain"
onError={(e) => {
e.target.style.display = "none";
}}
/>
</div>
)}
<p className="text-xs text-secondary-600 dark:text-secondary-400 mb-3 truncate">
{settings?.logo_light
? settings.logo_light.split("/").pop()
: "Default"}
</p>
<button
type="button"
onClick={() => {
setSelectedLogoType("light");
setShowLogoUploadModal(true);
}}
disabled={logoUploadState.light.uploading}
className="w-full btn-outline flex items-center justify-center gap-2 text-sm py-2"
>
{logoUploadState.light.uploading ? (
<>
<div className="animate-spin rounded-full h-3 w-3 border-b-2 border-current"></div>
Uploading...
</>
) : (
<>
<Upload className="h-3 w-3" />
Upload
</>
)}
</button>
{logoUploadState.light.error && (
<p className="text-xs text-red-600 dark:text-red-400 mt-2">
{logoUploadState.light.error}
</p>
)}
</div>
{/* Favicon */}
<div className="bg-white dark:bg-secondary-800 rounded-lg p-4 border border-secondary-200 dark:border-secondary-600">
<h4 className="text-sm font-medium text-secondary-900 dark:text-white mb-3">
Favicon
</h4>
{settings?.favicon && (
<div className="flex items-center justify-center p-3 bg-secondary-50 dark:bg-secondary-700 rounded-lg mb-3">
<img
src={settings.favicon}
alt="Favicon"
className="h-8 w-8 object-contain"
onError={(e) => {
e.target.style.display = "none";
}}
/>
</div>
)}
<p className="text-xs text-secondary-600 dark:text-secondary-400 mb-3 truncate">
{settings?.favicon
? settings.favicon.split("/").pop()
: "Default"}
</p>
<button
type="button"
onClick={() => {
setSelectedLogoType("favicon");
setShowLogoUploadModal(true);
}}
disabled={logoUploadState.favicon.uploading}
className="w-full btn-outline flex items-center justify-center gap-2 text-sm py-2"
>
{logoUploadState.favicon.uploading ? (
<>
<div className="animate-spin rounded-full h-3 w-3 border-b-2 border-current"></div>
Uploading...
</>
) : (
<>
<Upload className="h-3 w-3" />
Upload
</>
)}
</button>
{logoUploadState.favicon.error && (
<p className="text-xs text-red-600 dark:text-red-400 mt-2">
{logoUploadState.favicon.error}
</p>
)}
</div>
</div>
<div className="mt-4 p-3 bg-blue-50 dark:bg-blue-900/20 border border-blue-200 dark:border-blue-700 rounded-md">
<p className="text-xs text-blue-700 dark:text-blue-300">
<strong>Supported formats:</strong> PNG, JPG, SVG.{" "}
<strong>Max size:</strong> 5MB.
<strong> Recommended sizes:</strong> 200x60px for logos,
32x32px for favicon.
</p>
</div>
</div>
{/* Update Interval */}
<div>
<label
@@ -938,28 +1155,39 @@ const Settings = () => {
Agent Uninstall Command
</h3>
<div className="mt-2 text-sm text-red-700 dark:text-red-300">
<p className="mb-2">
<p className="mb-3">
To completely remove PatchMon from a host:
</p>
<div className="flex items-center gap-2">
<div className="bg-red-100 dark:bg-red-800 rounded p-2 font-mono text-xs flex-1">
curl {getCurlFlags()} {window.location.origin}
/api/v1/hosts/remove | sudo bash
{/* Go Agent Uninstall */}
<div className="mb-3">
<div className="space-y-2">
<div className="flex items-center gap-2">
<div className="bg-red-100 dark:bg-red-800 rounded p-2 font-mono text-xs flex-1">
sudo patchmon-agent uninstall
</div>
<button
type="button"
onClick={() => {
navigator.clipboard.writeText(
"sudo patchmon-agent uninstall",
);
}}
className="px-2 py-1 bg-red-200 dark:bg-red-700 text-red-800 dark:text-red-200 rounded text-xs hover:bg-red-300 dark:hover:bg-red-600 transition-colors"
>
Copy
</button>
</div>
<div className="text-xs text-red-600 dark:text-red-400">
Options: <code>--remove-config</code>,{" "}
<code>--remove-logs</code>,{" "}
<code>--remove-all</code>, <code>--force</code>
</div>
</div>
<button
type="button"
onClick={() => {
const command = `curl ${getCurlFlags()} ${window.location.origin}/api/v1/hosts/remove | sudo bash`;
navigator.clipboard.writeText(command);
// You could add a toast notification here
}}
className="px-2 py-1 bg-red-200 dark:bg-red-700 text-red-800 dark:text-red-200 rounded text-xs hover:bg-red-300 dark:hover:bg-red-600 transition-colors"
>
Copy
</button>
</div>
<p className="mt-2 text-xs">
This will remove all PatchMon files,
This command will remove all PatchMon files,
configuration, and crontab entries
</p>
</div>
@@ -1319,6 +1547,18 @@ const Settings = () => {
error={uploadAgentMutation.error}
/>
)}
{/* Logo Upload Modal */}
{showLogoUploadModal && (
<LogoUploadModal
isOpen={showLogoUploadModal}
onClose={() => setShowLogoUploadModal(false)}
onSubmit={uploadLogoMutation.mutate}
isLoading={uploadLogoMutation.isPending}
error={uploadLogoMutation.error}
logoType={selectedLogoType}
/>
)}
</div>
);
};
@@ -1467,4 +1707,181 @@ const AgentUploadModal = ({ isOpen, onClose, onSubmit, isLoading, error }) => {
);
};
// Logo Upload Modal Component
const LogoUploadModal = ({
isOpen,
onClose,
onSubmit,
isLoading,
error,
logoType,
}) => {
const [selectedFile, setSelectedFile] = useState(null);
const [previewUrl, setPreviewUrl] = useState(null);
const [uploadError, setUploadError] = useState("");
const handleFileSelect = (e) => {
const file = e.target.files[0];
if (file) {
// Validate file type
const allowedTypes = [
"image/png",
"image/jpeg",
"image/jpg",
"image/svg+xml",
];
if (!allowedTypes.includes(file.type)) {
setUploadError("Please select a PNG, JPG, or SVG file");
return;
}
// Validate file size (5MB limit)
if (file.size > 5 * 1024 * 1024) {
setUploadError("File size must be less than 5MB");
return;
}
setSelectedFile(file);
setUploadError("");
// Create preview URL
const url = URL.createObjectURL(file);
setPreviewUrl(url);
}
};
const handleSubmit = (e) => {
e.preventDefault();
setUploadError("");
if (!selectedFile) {
setUploadError("Please select a file");
return;
}
// Convert file to base64
const reader = new FileReader();
reader.onload = (event) => {
const base64 = event.target.result;
onSubmit({
logoType,
fileContent: base64,
fileName: selectedFile.name,
});
};
reader.readAsDataURL(selectedFile);
};
const handleClose = () => {
setSelectedFile(null);
setPreviewUrl(null);
setUploadError("");
onClose();
};
if (!isOpen) return null;
return (
<div className="fixed inset-0 bg-black bg-opacity-50 flex items-center justify-center z-50">
<div className="bg-white dark:bg-secondary-800 rounded-lg shadow-xl max-w-2xl w-full mx-4 max-h-[90vh] overflow-y-auto">
<div className="px-6 py-4 border-b border-secondary-200 dark:border-secondary-600">
<div className="flex items-center justify-between">
<h3 className="text-lg font-medium text-secondary-900 dark:text-white">
Upload{" "}
{logoType === "favicon"
? "Favicon"
: `${logoType.charAt(0).toUpperCase() + logoType.slice(1)} Logo`}
</h3>
<button
type="button"
onClick={handleClose}
className="text-secondary-400 hover:text-secondary-600 dark:text-secondary-500 dark:hover:text-secondary-300"
>
<X className="h-5 w-5" />
</button>
</div>
</div>
<form onSubmit={handleSubmit} className="px-6 py-4">
<div className="space-y-4">
<div>
<label className="block">
<span className="block text-sm font-medium text-secondary-700 dark:text-secondary-200 mb-2">
Select File
</span>
<input
type="file"
accept="image/png,image/jpeg,image/jpg,image/svg+xml"
onChange={handleFileSelect}
className="block w-full text-sm text-secondary-500 dark:text-secondary-400 file:mr-4 file:py-2 file:px-4 file:rounded-md file:border-0 file:text-sm file:font-medium file:bg-primary-50 file:text-primary-700 hover:file:bg-primary-100 dark:file:bg-primary-900 dark:file:text-primary-200"
/>
</label>
<p className="mt-1 text-xs text-secondary-500 dark:text-secondary-400">
Supported formats: PNG, JPG, SVG. Max size: 5MB.
{logoType === "favicon"
? " Recommended: 32x32px SVG."
: " Recommended: 200x60px."}
</p>
</div>
{previewUrl && (
<div>
<div className="block text-sm font-medium text-secondary-700 dark:text-secondary-200 mb-2">
Preview
</div>
<div className="flex items-center justify-center p-4 bg-white dark:bg-secondary-800 rounded-lg border border-secondary-200 dark:border-secondary-600">
<img
src={previewUrl}
alt="Preview"
className={`object-contain ${
logoType === "favicon" ? "h-8 w-8" : "max-h-16 max-w-full"
}`}
/>
</div>
</div>
)}
{(uploadError || error) && (
<div className="bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-md p-3">
<p className="text-sm text-red-800 dark:text-red-200">
{uploadError ||
error?.response?.data?.error ||
error?.message}
</p>
</div>
)}
<div className="bg-yellow-50 dark:bg-yellow-900/20 border border-yellow-200 dark:border-yellow-800 rounded-md p-3">
<div className="flex">
<AlertCircle className="h-4 w-4 text-yellow-600 dark:text-yellow-400 mr-2 mt-0.5" />
<div className="text-sm text-yellow-800 dark:text-yellow-200">
<p className="font-medium">Important:</p>
<ul className="mt-1 list-disc list-inside space-y-1">
<li>This will replace the current {logoType} logo</li>
<li>A backup will be created automatically</li>
<li>The change will be applied immediately</li>
</ul>
</div>
</div>
</div>
</div>
<div className="flex justify-end gap-3 mt-6">
<button type="button" onClick={handleClose} className="btn-outline">
Cancel
</button>
<button
type="submit"
disabled={isLoading || !selectedFile}
className="btn-primary"
>
{isLoading ? "Uploading..." : "Upload Logo"}
</button>
</div>
</form>
</div>
</div>
);
};
export default Settings;
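The LogoUploadModal above base64-encodes the chosen file with FileReader.readAsDataURL, and the Settings component posts it as JSON. Below is a standalone sketch of that request, reconstructed from uploadLogoMutation; the endpoint, headers, and payload keys are taken from the diff, while the error handling and anything about the response beyond response.json() are assumptions.

// Sketch of the upload request issued by uploadLogoMutation in Settings.
// dataUrl is the FileReader.readAsDataURL result, e.g. "data:image/png;base64,iVBORw0KGgo...".
async function uploadLogo(logoType, dataUrl, fileName) {
	const response = await fetch("/api/v1/settings/logos/upload", {
		method: "POST",
		headers: {
			"Content-Type": "application/json",
			Authorization: `Bearer ${localStorage.getItem("token")}`,
		},
		body: JSON.stringify({ logoType, fileContent: dataUrl, fileName }),
	});
	if (!response.ok) throw new Error("Failed to upload logo"); // assumed; the modal surfaces error.message
	return response.json();
}

// Example (hypothetical call): uploadLogo("favicon", dataUrl, "favicon.svg");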


@@ -0,0 +1,389 @@
import { useQuery } from "@tanstack/react-query";
import {
AlertTriangle,
ArrowLeft,
CheckCircle,
Container,
ExternalLink,
RefreshCw,
Server,
} from "lucide-react";
import { Link, useParams } from "react-router-dom";
import api, { formatRelativeTime } from "../../utils/api";
const ContainerDetail = () => {
const { id } = useParams();
const { data, isLoading, error } = useQuery({
queryKey: ["docker", "container", id],
queryFn: async () => {
const response = await api.get(`/docker/containers/${id}`);
return response.data;
},
refetchInterval: 30000,
});
const container = data?.container;
const similarContainers = data?.similarContainers || [];
if (isLoading) {
return (
<div className="flex items-center justify-center min-h-screen">
<RefreshCw className="h-8 w-8 animate-spin text-secondary-400" />
</div>
);
}
if (error || !container) {
return (
<div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-8">
<div className="bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg p-4">
<div className="flex">
<AlertTriangle className="h-5 w-5 text-red-400" />
<div className="ml-3">
<h3 className="text-sm font-medium text-red-800 dark:text-red-200">
Container not found
</h3>
<p className="mt-2 text-sm text-red-700 dark:text-red-300">
The container you're looking for doesn't exist or has been
removed.
</p>
</div>
</div>
</div>
<Link
to="/docker"
className="mt-4 inline-flex items-center text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300"
>
<ArrowLeft className="h-4 w-4 mr-2" />
Back to Docker
</Link>
</div>
);
}
const getStatusBadge = (status) => {
const statusClasses = {
running:
"bg-green-100 text-green-800 dark:bg-green-900 dark:text-green-200",
exited: "bg-red-100 text-red-800 dark:bg-red-900 dark:text-red-200",
paused:
"bg-yellow-100 text-yellow-800 dark:bg-yellow-900 dark:text-yellow-200",
restarting:
"bg-blue-100 text-blue-800 dark:bg-blue-900 dark:text-blue-200",
};
return (
<span
className={`inline-flex items-center px-3 py-1 rounded-full text-sm font-medium ${
statusClasses[status] ||
"bg-secondary-100 text-secondary-800 dark:bg-secondary-700 dark:text-secondary-200"
}`}
>
{status}
</span>
);
};
return (
<div className="space-y-6">
{/* Header */}
<div>
<Link
to="/docker"
className="inline-flex items-center text-sm text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300 mb-4"
>
<ArrowLeft className="h-4 w-4 mr-2" />
Back to Docker
</Link>
<div className="flex items-center">
<Container className="h-8 w-8 text-secondary-400 mr-3" />
<div>
<div className="flex items-center gap-3">
<h1 className="text-2xl font-bold text-secondary-900 dark:text-white">
{container.name}
</h1>
{getStatusBadge(container.status)}
</div>
<p className="mt-1 text-sm text-secondary-600 dark:text-secondary-400">
Container ID: {container.container_id.substring(0, 12)}
</p>
</div>
</div>
</div>
{/* Overview Cards */}
<div className="grid grid-cols-1 gap-5 sm:grid-cols-2 lg:grid-cols-4">
{/* Update Status Card */}
{container.docker_images?.docker_image_updates &&
container.docker_images.docker_image_updates.length > 0 ? (
<div className="card p-4 bg-yellow-50 dark:bg-yellow-900/20 border-yellow-200 dark:border-yellow-800">
<div className="flex items-center">
<div className="flex-shrink-0">
<AlertTriangle className="h-5 w-5 text-yellow-600 dark:text-yellow-400 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-yellow-200">
Update Available
</p>
<p className="text-sm font-medium text-secondary-900 dark:text-yellow-100 truncate">
{
container.docker_images.docker_image_updates[0]
.available_tag
}
</p>
</div>
</div>
</div>
) : (
<div className="card p-4 bg-green-50 dark:bg-green-900/20 border-green-200 dark:border-green-800">
<div className="flex items-center">
<div className="flex-shrink-0">
<CheckCircle className="h-5 w-5 text-green-600 dark:text-green-400 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-green-200">
Update Status
</p>
<p className="text-sm font-medium text-secondary-900 dark:text-green-100">
Up to date
</p>
</div>
</div>
</div>
)}
<div className="card p-4">
<div className="flex items-center">
<div className="flex-shrink-0">
<Server className="h-5 w-5 text-purple-600 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-white">Host</p>
<Link
to={`/hosts/${container.host?.id}`}
className="text-sm font-medium text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300 truncate block"
>
{container.host?.friendly_name || container.host?.hostname}
</Link>
</div>
</div>
</div>
<div className="card p-4">
<div className="flex items-center">
<div className="flex-shrink-0">
<Container className="h-5 w-5 text-green-600 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-white">
State
</p>
<p className="text-sm font-medium text-secondary-900 dark:text-white">
{container.state || container.status}
</p>
</div>
</div>
</div>
<div className="card p-4">
<div className="flex items-center">
<div className="flex-shrink-0">
<RefreshCw className="h-5 w-5 text-secondary-400 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-white">
Last Checked
</p>
<p className="text-sm font-medium text-secondary-900 dark:text-white">
{formatRelativeTime(container.last_checked)}
</p>
</div>
</div>
</div>
</div>
{/* Container and Image Information - Side by Side */}
<div className="grid grid-cols-1 lg:grid-cols-2 gap-6">
{/* Container Details */}
<div className="card">
<div className="px-6 py-5 border-b border-secondary-200 dark:border-secondary-700">
<h3 className="text-lg leading-6 font-medium text-secondary-900 dark:text-white">
Container Information
</h3>
</div>
<div className="px-6 py-5">
<div className="grid grid-cols-1 sm:grid-cols-2 gap-6">
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Container ID
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white font-mono break-all">
{container.container_id}
</dd>
</div>
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Image Tag
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
{container.image_tag}
</dd>
</div>
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Created
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
{formatRelativeTime(container.created_at)}
</dd>
</div>
{container.started_at && (
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Started
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
{formatRelativeTime(container.started_at)}
</dd>
</div>
)}
{container.ports && Object.keys(container.ports).length > 0 && (
<div className="sm:col-span-2">
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Port Mappings
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
<div className="flex flex-wrap gap-2">
{Object.entries(container.ports).map(([key, value]) => (
<span
key={key}
className="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-blue-100 text-blue-800 dark:bg-blue-900 dark:text-blue-200"
>
{key} {value}
</span>
))}
</div>
</dd>
</div>
)}
</div>
</div>
</div>
{/* Image Information */}
{container.docker_images && (
<div className="card">
<div className="px-6 py-5 border-b border-secondary-200 dark:border-secondary-700">
<h3 className="text-lg leading-6 font-medium text-secondary-900 dark:text-white">
Image Information
</h3>
</div>
<div className="px-6 py-5">
<div className="grid grid-cols-1 sm:grid-cols-2 gap-6">
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Repository
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
<Link
to={`/docker/images/${container.docker_images.id}`}
className="text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300 inline-flex items-center"
>
{container.docker_images.repository}
<ExternalLink className="ml-1 h-4 w-4" />
</Link>
</dd>
</div>
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Tag
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
{container.docker_images.tag}
</dd>
</div>
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Source
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
{container.docker_images.source}
</dd>
</div>
{container.docker_images.size_bytes && (
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Size
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
{(
Number(container.docker_images.size_bytes) /
1024 /
1024
).toFixed(2)}{" "}
MB
</dd>
</div>
)}
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Image ID
</dt>
<dd className="mt-1 text-xs text-secondary-900 dark:text-white font-mono break-all">
{container.docker_images.image_id?.substring(0, 12)}...
</dd>
</div>
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Created
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
{formatRelativeTime(container.docker_images.created_at)}
</dd>
</div>
</div>
</div>
</div>
)}
</div>
{/* Similar Containers */}
{similarContainers.length > 0 && (
<div className="card">
<div className="px-6 py-5 border-b border-secondary-200 dark:border-secondary-700">
<h3 className="text-lg leading-6 font-medium text-secondary-900 dark:text-white">
Similar Containers (Same Image)
</h3>
</div>
<div className="px-6 py-5">
<ul className="divide-y divide-secondary-200 dark:divide-secondary-700">
{similarContainers.map((similar) => (
<li
key={similar.id}
className="py-4 flex items-center justify-between"
>
<div className="flex items-center">
<Container className="h-5 w-5 text-secondary-400 mr-3" />
<div>
<Link
to={`/docker/containers/${similar.id}`}
className="text-sm font-medium text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300"
>
{similar.name}
</Link>
<p className="text-sm text-secondary-500 dark:text-secondary-400">
{similar.status}
</p>
</div>
</div>
</li>
))}
</ul>
</div>
</div>
)}
</div>
);
};
export default ContainerDetail;
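ContainerDetail polls GET /docker/containers/:id every 30 seconds and renders data.container plus data.similarContainers. The sketch below lists the fields the component actually reads, with illustrative values; everything not referenced in the JSX, and all of the sample data, is an assumption rather than documented API.

// Assumed response shape for /docker/containers/:id, inferred from what ContainerDetail renders.
const exampleContainerResponse = {
	container: {
		container_id: "3f9c2d1e8ab4f0c1d2e3a4b5c6d7e8f9", // page shows the first 12 characters
		name: "patchmon-db",
		status: "running", // running | exited | paused | restarting drive the badge colour
		state: "running",
		image_tag: "postgres:17",
		ports: { "5432/tcp": "0.0.0.0:5432" }, // rendered as "{key} {value}" pills
		created_at: "2025-10-15T20:00:00Z",
		started_at: "2025-10-15T20:01:00Z",
		last_checked: "2025-10-17T21:00:00Z",
		host: { id: "host-1", friendly_name: "db-01", hostname: "db-01.internal" },
		docker_images: {
			id: "img-1",
			repository: "postgres",
			tag: "17",
			source: "docker-hub", // illustrative value
			image_id: "sha256:ab12cd34ef56",
			size_bytes: "438120448", // converted to MB in the UI via Number(...) / 1024 / 1024
			created_at: "2025-10-01T00:00:00Z",
			docker_image_updates: [], // a non-empty array switches the card to "Update Available"
		},
	},
	similarContainers: [], // other containers that use the same image
};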


@@ -0,0 +1,354 @@
import { useQuery } from "@tanstack/react-query";
import {
AlertTriangle,
ArrowLeft,
Container,
ExternalLink,
Package,
RefreshCw,
Server,
} from "lucide-react";
import { Link, useParams } from "react-router-dom";
import api from "../../utils/api";
const HostDetail = () => {
const { id } = useParams();
const { data, isLoading, error } = useQuery({
queryKey: ["docker", "host", id],
queryFn: async () => {
const response = await api.get(`/docker/hosts/${id}`);
return response.data;
},
refetchInterval: 30000,
});
const host = data?.host;
const containers = data?.containers || [];
const images = data?.images || [];
const stats = data?.stats;
if (isLoading) {
return (
<div className="flex items-center justify-center min-h-screen">
<RefreshCw className="h-8 w-8 animate-spin text-secondary-400" />
</div>
);
}
if (error || !host) {
return (
<div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-8">
<div className="bg-red-50 dark:bg-red-900/20 border border-red-200 dark:border-red-800 rounded-lg p-4">
<div className="flex">
<AlertTriangle className="h-5 w-5 text-red-400" />
<div className="ml-3">
<h3 className="text-sm font-medium text-red-800 dark:text-red-200">
Host not found
</h3>
</div>
</div>
</div>
<Link
to="/docker"
className="mt-4 inline-flex items-center text-primary-600 hover:text-primary-900"
>
<ArrowLeft className="h-4 w-4 mr-2" />
Back to Docker
</Link>
</div>
);
}
return (
<div className="space-y-6">
<div>
<Link
to="/docker"
className="inline-flex items-center text-sm text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300 mb-4"
>
<ArrowLeft className="h-4 w-4 mr-2" />
Back to Docker
</Link>
<div className="flex items-start justify-between">
<div className="flex items-center">
<Server className="h-8 w-8 text-secondary-400 mr-3" />
<div>
<h1 className="text-2xl font-bold text-secondary-900 dark:text-white">
{host.friendly_name || host.hostname}
</h1>
<p className="mt-1 text-sm text-secondary-600 dark:text-secondary-400">
{host.ip}
</p>
</div>
</div>
<Link
to={`/hosts/${id}`}
className="inline-flex items-center text-sm text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300"
>
View Full Host Details
<ExternalLink className="ml-2 h-4 w-4" />
</Link>
</div>
</div>
{/* Overview Cards */}
<div className="grid grid-cols-1 gap-5 sm:grid-cols-2 lg:grid-cols-4">
<div className="card p-4">
<div className="flex items-center">
<div className="flex-shrink-0">
<Container className="h-5 w-5 text-blue-600 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-white">
Total Containers
</p>
<p className="text-xl font-semibold text-secondary-900 dark:text-white">
{stats?.totalContainers || 0}
</p>
</div>
</div>
</div>
<div className="card p-4">
<div className="flex items-center">
<div className="flex-shrink-0">
<Container className="h-5 w-5 text-green-600 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-white">
Running
</p>
<p className="text-xl font-semibold text-secondary-900 dark:text-white">
{stats?.runningContainers || 0}
</p>
</div>
</div>
</div>
<div className="card p-4">
<div className="flex items-center">
<div className="flex-shrink-0">
<Container className="h-5 w-5 text-red-600 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-white">
Stopped
</p>
<p className="text-xl font-semibold text-secondary-900 dark:text-white">
{stats?.stoppedContainers || 0}
</p>
</div>
</div>
</div>
<div className="card p-4">
<div className="flex items-center">
<div className="flex-shrink-0">
<Package className="h-5 w-5 text-purple-600 mr-2" />
</div>
<div className="w-0 flex-1">
<p className="text-sm text-secondary-500 dark:text-white">
Images
</p>
<p className="text-xl font-semibold text-secondary-900 dark:text-white">
{stats?.totalImages || 0}
</p>
</div>
</div>
</div>
</div>
{/* Host Information */}
<div className="card">
<div className="px-6 py-5 border-b border-secondary-200 dark:border-secondary-700">
<h3 className="text-lg leading-6 font-medium text-secondary-900 dark:text-white">
Host Information
</h3>
</div>
<div className="px-6 py-5 space-y-6">
<div className="grid grid-cols-1 gap-6 sm:grid-cols-2">
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Friendly Name
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
{host.friendly_name}
</dd>
</div>
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
Hostname
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
{host.hostname}
</dd>
</div>
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
IP Address
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
{host.ip}
</dd>
</div>
<div>
<dt className="text-sm font-medium text-secondary-500 dark:text-secondary-400">
OS
</dt>
<dd className="mt-1 text-sm text-secondary-900 dark:text-white">
{host.os_type} {host.os_version}
</dd>
</div>
</div>
</div>
</div>
{/* Containers */}
<div className="card">
<div className="px-6 py-5 border-b border-secondary-200 dark:border-secondary-700">
<h3 className="text-lg leading-6 font-medium text-secondary-900 dark:text-white">
Containers ({containers.length})
</h3>
</div>
<div className="overflow-x-auto">
<table className="min-w-full divide-y divide-secondary-200 dark:divide-secondary-700">
<thead className="bg-secondary-50 dark:bg-secondary-900">
<tr>
<th
scope="col"
className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-400 uppercase tracking-wider"
>
Container Name
</th>
<th
scope="col"
className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-400 uppercase tracking-wider"
>
Image
</th>
<th
scope="col"
className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-400 uppercase tracking-wider"
>
Status
</th>
<th
scope="col"
className="px-6 py-3 text-right text-xs font-medium text-secondary-500 dark:text-secondary-400 uppercase tracking-wider"
>
Actions
</th>
</tr>
</thead>
<tbody className="bg-white dark:bg-secondary-800 divide-y divide-secondary-200 dark:divide-secondary-700">
{containers.map((container) => (
<tr key={container.id}>
<td className="px-6 py-4 whitespace-nowrap">
<Link
to={`/docker/containers/${container.id}`}
className="text-sm font-medium text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300"
>
{container.name}
</Link>
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-secondary-500">
{container.image_name}:{container.image_tag}
</td>
<td className="px-6 py-4 whitespace-nowrap">
<span className="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-secondary-100 text-secondary-800 dark:bg-secondary-700 dark:text-secondary-200">
{container.status}
</span>
</td>
<td className="px-6 py-4 whitespace-nowrap text-right text-sm font-medium">
<Link
to={`/docker/containers/${container.id}`}
className="text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300 inline-flex items-center"
>
View
<ExternalLink className="ml-1 h-4 w-4" />
</Link>
</td>
</tr>
))}
</tbody>
</table>
</div>
</div>
{/* Images */}
<div className="card">
<div className="px-6 py-5 border-b border-secondary-200 dark:border-secondary-700">
<h3 className="text-lg leading-6 font-medium text-secondary-900 dark:text-white">
Images ({images.length})
</h3>
</div>
<div className="overflow-x-auto">
<table className="min-w-full divide-y divide-secondary-200 dark:divide-secondary-700">
<thead className="bg-secondary-50 dark:bg-secondary-900">
<tr>
<th
scope="col"
className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-400 uppercase tracking-wider"
>
Repository
</th>
<th
scope="col"
className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-400 uppercase tracking-wider"
>
Tag
</th>
<th
scope="col"
className="px-6 py-3 text-left text-xs font-medium text-secondary-500 dark:text-secondary-400 uppercase tracking-wider"
>
Source
</th>
<th
scope="col"
className="px-6 py-3 text-right text-xs font-medium text-secondary-500 dark:text-secondary-400 uppercase tracking-wider"
>
Actions
</th>
</tr>
</thead>
<tbody className="bg-white dark:bg-secondary-800 divide-y divide-secondary-200 dark:divide-secondary-700">
{images.map((image) => (
<tr key={image.id}>
<td className="px-6 py-4 whitespace-nowrap">
<Link
to={`/docker/images/${image.id}`}
className="text-sm font-medium text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300"
>
{image.repository}
</Link>
</td>
<td className="px-6 py-4 whitespace-nowrap">
<span className="inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium bg-secondary-100 text-secondary-800 dark:bg-secondary-700 dark:text-secondary-200">
{image.tag}
</span>
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-secondary-500">
{image.source}
</td>
<td className="px-6 py-4 whitespace-nowrap text-right text-sm font-medium">
<Link
to={`/docker/images/${image.id}`}
className="text-primary-600 hover:text-primary-900 dark:text-primary-400 dark:hover:text-primary-300 inline-flex items-center"
>
View
<ExternalLink className="ml-1 h-4 w-4" />
</Link>
</td>
</tr>
))}
</tbody>
</table>
</div>
</div>
</div>
);
};
export default HostDetail;

Some files were not shown because too many files have changed in this diff.