Compare commits


163 Commits

Author SHA1 Message Date
wh1te909
8046a3ccae Release 0.3.0 2021-01-17 02:16:06 +00:00
wh1te909
bf91d60b31 natsapi bin 1.0.0 2021-01-17 02:07:53 +00:00
wh1te909
539c047ec8 update go 2021-01-17 01:53:45 +00:00
wh1te909
290c18fa87 bump versions 2021-01-17 01:22:08 +00:00
wh1te909
98c46f5e57 fix domain 2021-01-17 01:21:21 +00:00
wh1te909
f8bd5b5b4e update configs/scripts and add migration docs for 0.3.0 2021-01-17 01:16:28 +00:00
wh1te909
816d32edad black 2021-01-16 23:34:55 +00:00
wh1te909
8453835c05 Merge branch 'develop' of https://github.com/wh1te909/tacticalrmm into develop 2021-01-16 23:32:54 +00:00
wh1te909
9328c356c8 possible fix for mesh scaling 2021-01-16 23:32:46 +00:00
sadnub
89e3c1fc94 remove my print statements 2021-01-16 17:46:56 -05:00
sadnub
67e54cd15d Remove pending action duplicates and make policy check/task propogation more efficient 2021-01-16 17:46:56 -05:00
sadnub
278ea24786 improve dev env 2021-01-16 17:46:56 -05:00
wh1te909
5e9a8f4806 new natsapi binary 2021-01-16 21:55:06 +00:00
wh1te909
4cb274e9bc update to celery 5 2021-01-16 21:52:30 +00:00
wh1te909
8b9b1a6a35 update mesh docker conf 2021-01-16 21:50:29 +00:00
wh1te909
188bad061b add wmi task 2021-01-16 10:31:00 +00:00
wh1te909
3af4c329aa update reqs 2021-01-16 09:42:03 +00:00
wh1te909
6c13395f7d add debug 2021-01-16 09:41:27 +00:00
wh1te909
77b32ba360 remove import 2021-01-16 09:39:15 +00:00
sadnub
91dba291ac nats-api fixes 2021-01-15 23:41:21 -05:00
sadnub
a6bc293640 Finish up check charts 2021-01-15 22:11:40 -05:00
sadnub
53882d6e5f fix dev port 2021-01-15 21:25:32 -05:00
sadnub
d68adfbf10 docker nats-api rework 2021-01-15 21:11:27 -05:00
sadnub
498a392d7f check graphs wip 2021-01-15 21:10:25 -05:00
sadnub
740f6c05db docker cli additions 2021-01-15 21:10:25 -05:00
wh1te909
d810ce301f update natsapi flags 2021-01-16 00:01:31 +00:00
wh1te909
5ef6a14d24 add nats-api binary 2021-01-15 18:21:25 +00:00
wh1te909
a13f6f1e68 move recovery to natsapi 2021-01-15 10:19:01 +00:00
wh1te909
d2d0f1aaee fix tests 2021-01-15 09:57:46 +00:00
wh1te909
e64c72cc89 #234 sort proc mem using bytes wh1te909/rmmagent@04470dd4ce 2021-01-15 09:44:18 +00:00
wh1te909
9ab915a08b Release 0.2.23 2021-01-14 02:43:56 +00:00
wh1te909
e26fbf0328 bump versions 2021-01-14 02:29:14 +00:00
wh1te909
d9a52c4a2a update reqs 2021-01-14 02:27:40 +00:00
wh1te909
7b2ec90de9 feat: double-click agent action #232 2021-01-14 02:21:08 +00:00
wh1te909
d310bf8bbf add community scripts from dinger #242 2021-01-14 01:17:58 +00:00
wh1te909
2abc6cc939 partially fix sort 2021-01-14 00:01:08 +00:00
sadnub
56d4e694a2 fix annotations and error for the check chart 2021-01-13 18:43:09 -05:00
wh1te909
5f002c9cdc bump mesh 2021-01-13 23:35:14 +00:00
wh1te909
759daf4b4a add wording 2021-01-13 23:35:01 +00:00
wh1te909
3a8d9568e3 split some tasks into chunks to reduce load 2021-01-13 22:26:54 +00:00
wh1te909
ff22a9d94a fix deployments in docker 2021-01-13 22:19:09 +00:00
sadnub
a6e42d5374 fix removing pendingactions that are outstanding 2021-01-13 13:21:09 -05:00
wh1te909
a2f74e0488 add natsapi flags 2021-01-12 21:14:43 +00:00
wh1te909
ee44240569 black 2021-01-12 21:06:44 +00:00
wh1te909
d0828744a2 update nginx conf
(cherry picked from commit bf61e27f8a)
2021-01-12 06:38:52 +00:00
wh1te909
6e2e576b29 start natsapi 2021-01-12 06:32:00 +00:00
wh1te909
bf61e27f8a update nginx conf 2021-01-12 03:02:03 +00:00
Tragic Bronson
c441c30b46 Merge pull request #243 from sadnub/develop
Move Check Runs from Audit to its own table
2021-01-11 00:29:59 -08:00
Tragic Bronson
0e741230ea Merge pull request #242 from dinger1986/develop
Added some scripts checks etc
2021-01-11 00:29:47 -08:00
sadnub
1bfe9ac2db complete other pending actions with same task if task is deleted 2021-01-10 20:19:38 -05:00
sadnub
6812e72348 fix process sorting 2021-01-10 19:35:39 -05:00
sadnub
b6449d2f5b black 2021-01-10 16:33:10 -05:00
sadnub
7e3ea20dce add some tests and bug fixes 2021-01-10 16:27:48 -05:00
sadnub
c9d6fe9dcd allow returning all check data 2021-01-10 15:14:02 -05:00
sadnub
4a649a6b8b black 2021-01-10 14:47:34 -05:00
sadnub
8fef184963 add check history graph for cpu, memory, and diskspace 2021-01-10 14:15:05 -05:00
sadnub
69583ca3c0 docker dev fixes 2021-01-10 13:17:49 -05:00
dinger1986
6038a68e91 Win Defender exclusions for Tactical 2021-01-10 17:56:12 +00:00
dinger1986
fa8bd8db87 Manually reinstall Mesh just incase 2021-01-10 17:54:41 +00:00
dinger1986
18b4f0ed0f Runs DNS check on host as defined 2021-01-10 17:53:53 +00:00
dinger1986
461f9d66c9 Disable Faststartup on Windows 10 2021-01-10 17:51:33 +00:00
dinger1986
2155103c7a Check Win Defender for detections etc 2021-01-10 17:51:06 +00:00
dinger1986
c9a6839c45 Clears Win Defender log files 2021-01-10 17:50:13 +00:00
dinger1986
9fbe331a80 Allows the following Apps access by Win Defender 2021-01-10 17:49:36 +00:00
dinger1986
a56389c4ce Sync time with DC 2021-01-10 17:46:47 +00:00
dinger1986
64656784cb Powershell Speedtest 2021-01-10 17:46:00 +00:00
dinger1986
6eff2c181e Install RDP and change power config 2021-01-10 17:44:23 +00:00
dinger1986
1aa48c6d62 Install OpenSSH on PCs 2021-01-10 17:42:11 +00:00
dinger1986
c7ca1a346d Enable Windows Defender and set preferences 2021-01-10 17:40:06 +00:00
dinger1986
fa0ec7b502 check Duplicati Backup is running properly 2021-01-10 17:38:06 +00:00
dinger1986
768438c136 Checks disks for errors reported in event viewer 2021-01-10 17:36:42 +00:00
dinger1986
9badea0b3c Update DiskStatus.ps1
Checks local disks for errors reported in event viewer within the last 24 hours
2021-01-10 17:35:50 +00:00
dinger1986
43263a1650 Add files via upload 2021-01-10 17:33:48 +00:00
wh1te909
821e02dc75 update mesh docker conf 2021-01-10 00:20:44 +00:00
wh1te909
ed011ecf28 remove old mesh overrides #217 2021-01-10 00:15:11 +00:00
wh1te909
d861de4c2f update community scripts 2021-01-09 22:26:02 +00:00
Tragic Bronson
3a3b2449dc Merge pull request #241 from RVL-Solutions/develop
Create Windows10Upgrade.ps1
2021-01-09 14:12:05 -08:00
Ruben van Leusden
d2614406ca Create Windows10Upgrade.ps1
Shared by Kyt through Discord
2021-01-08 22:20:33 +01:00
Tragic Bronson
0798d098ae Merge pull request #238 from wh1te909/revert-235-master
Revert "Create Windows10Upgrade.ps1"
2021-01-08 10:38:33 -08:00
Tragic Bronson
dab7ddc2bb Revert "Create Windows10Upgrade.ps1" 2021-01-08 10:36:42 -08:00
Tragic Bronson
081a96e281 Merge pull request #235 from RVL-Solutions/master
Create Windows10Upgrade.ps1
2021-01-08 10:36:19 -08:00
wh1te909
a7dd881d79 Release 0.2.22 2021-01-08 18:16:17 +00:00
wh1te909
8134d5e24d remove threading 2021-01-08 18:15:55 +00:00
Ruben van Leusden
ba6756cd45 Create Windows10Upgrade.ps1 2021-01-06 23:19:14 +01:00
Tragic Bronson
5d8fce21ac Merge pull request #230 from wh1te909/dependabot/npm_and_yarn/web/axios-0.21.1
Bump axios from 0.21.0 to 0.21.1 in /web
2021-01-05 13:51:18 -08:00
dependabot[bot]
e7e4a5bcd4 Bump axios from 0.21.0 to 0.21.1 in /web
Bumps [axios](https://github.com/axios/axios) from 0.21.0 to 0.21.1.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v0.21.1/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v0.21.0...v0.21.1)

Signed-off-by: dependabot[bot] <support@github.com>
2021-01-05 15:54:54 +00:00
wh1te909
55f33357ea Release 0.2.21 2021-01-05 08:55:54 +00:00
wh1te909
90568bba31 bump versions 2021-01-05 08:55:08 +00:00
wh1te909
5d6e2dc2e4 feat: add send script results by email #212 2021-01-05 08:52:17 +00:00
sadnub
6bb33f2559 fix unassigned scripts not show if not categories are present 2021-01-04 20:22:42 -05:00
wh1te909
ced92554ed update community scripts 2021-01-04 22:00:17 +00:00
Tragic Bronson
dff3383158 Merge pull request #228 from azulskyknight/patch-2
Create SetHighPerformancePowerProfile.ps1
2021-01-04 13:42:20 -08:00
Tragic Bronson
bf03c89cb2 Merge pull request #227 from azulskyknight/patch-1
Create ResetHighPerformancePowerProfiletoDefaults.ps1
2021-01-04 13:42:10 -08:00
azulskyknight
9f1484bbef Create SetHighPerformancePowerProfile.ps1
Script sets the High Performance Power profile to the active power profile.
Use this to keep machines from falling asleep.
2021-01-04 13:21:00 -07:00
azulskyknight
3899680e26 Create ResetHighPerformancePowerProfiletoDefaults.ps1
Script resets monitor, disk, standby, and hibernate timers in the default High Performance power profile to their default values.
It also re-indexes the AC and DC power profiles into their default order.
2021-01-04 13:19:03 -07:00
sadnub
6bb2eb25a1 sort script folders alphabetically and fix showing community scripts when no user scripts present 2021-01-03 21:01:50 -05:00
sadnub
f8dfd8edb3 Make pip copy the binaries versus symlink them in dev env 2021-01-03 20:15:40 -05:00
sadnub
042be624a3 Update .dockerignore 2021-01-03 15:16:13 -05:00
sadnub
6bafa4c79a fix mesh init on dev 2021-01-03 15:15:43 -05:00
wh1te909
58b42fac5c Release 0.2.20 2021-01-03 09:13:28 +00:00
wh1te909
3b47b9558a let python calculate default threadpool workers based on cpu count 2021-01-03 09:12:38 +00:00
wh1te909
ccf9636296 Release 0.2.19 2021-01-02 09:34:12 +00:00
wh1te909
96942719f2 bump versions 2021-01-02 09:32:04 +00:00
wh1te909
69cf1c1adc update quasar 2021-01-02 07:38:33 +00:00
wh1te909
d77cba40b8 black 2021-01-02 07:26:34 +00:00
wh1te909
968735b555 fix scroll 2021-01-02 07:21:10 +00:00
wh1te909
ceed9d29eb task changes 2021-01-02 07:20:52 +00:00
sadnub
41329039ee add .env example 2021-01-02 00:09:56 -05:00
sadnub
f68b102ca8 Add Dev Containers 2021-01-02 00:05:54 -05:00
wh1te909
fa36e54298 change agent update 2021-01-02 01:30:51 +00:00
wh1te909
b689f57435 black 2021-01-01 00:51:44 +00:00
sadnub
885fa0ff56 add api tests to core app 2020-12-31 17:18:25 -05:00
Tragic Bronson
303acb72a3 Merge pull request #225 from sadnub/develop
add folder view to script manager
2020-12-31 13:12:33 -08:00
sadnub
b2a46cd0cd add folder view to script manager 2020-12-31 15:46:44 -05:00
wh1te909
5a5ecb3ee3 install curl/wget first fixes #224 2020-12-30 19:04:14 +00:00
wh1te909
60b4ab6a63 fix logging 2020-12-22 05:15:44 +00:00
wh1te909
e4b096a08f fix logging 2020-12-22 05:14:44 +00:00
wh1te909
343f55049b prevent duplicate cpu/mem checks from being created 2020-12-19 20:38:22 +00:00
wh1te909
6b46025261 Release 0.2.18 2020-12-19 08:44:45 +00:00
wh1te909
5ea503f23e bump version 2020-12-19 08:43:47 +00:00
wh1te909
ce95f9ac23 add codestyle to tests 2020-12-19 08:24:47 +00:00
wh1te909
c3fb87501b black 2020-12-19 08:20:12 +00:00
wh1te909
dc6a343612 bump mesh 2020-12-19 07:55:39 +00:00
wh1te909
3a61053957 update reqs 2020-12-19 07:50:32 +00:00
wh1te909
570129e4d4 add debian 10 to readme 2020-12-19 07:50:05 +00:00
wh1te909
3315c7045f if ubuntu, force 20.04 2020-12-19 07:45:21 +00:00
wh1te909
5ae50e242c always run npm install during update 2020-12-18 21:59:23 +00:00
Tragic Bronson
bbcf449719 Merge pull request #214 from mckinnon81/debian
Updated install.sh for Debian
2020-12-18 13:56:14 -08:00
Matthew McKinnon
aab10f7184 Removed certbot test-cert. Not needed 2020-12-18 08:32:40 +10:00
Matthew McKinnon
8d43488cb8 Updated install.sh for Debian
Updated api\tacticalrmm\accounts\views.py valid_window=10
2020-12-18 08:28:01 +10:00
Tragic Bronson
0a9c647e19 Merge pull request #211 from sadnub/develop
Fix default policies
2020-12-16 13:51:37 -08:00
wh1te909
40db5d4aa8 remove debug print 2020-12-16 21:50:43 +00:00
Josh
9254532baa fix applying default policies in certain situations 2020-12-16 20:38:36 +00:00
Josh
7abed47cf0 Merge branch 'develop' of https://github.com/wh1te909/tacticalrmm into develop 2020-12-16 19:08:12 +00:00
Tragic Bronson
5c6ac758f7 Merge pull request #210 from mckinnon81/scripts
Fixed Paths in ClearFirefoxCache.ps1 & ClearGoogleChromeCache.ps1
2020-12-16 09:36:33 -08:00
Matthew McKinnon
007677962c Fixed Paths in ClearFirefoxCache.ps1 & ClearGoogleChromeCache.ps1 2020-12-16 22:32:04 +10:00
wh1te909
9c4aeab64a back to develop 2020-12-16 10:47:05 +00:00
wh1te909
48e6fc0efe test coveralls 2 2020-12-16 10:41:39 +00:00
wh1te909
c8be713d11 test coveralls 2020-12-16 10:38:00 +00:00
wh1te909
ae887c8648 switch to branch head for coveralls 2020-12-16 10:20:50 +00:00
wh1te909
5daac2531b add accounts tests for new settings 2020-12-16 10:09:58 +00:00
wh1te909
68def00327 fix tests 2020-12-16 09:40:36 +00:00
wh1te909
67e7976710 pipelines attempt 2 2020-12-16 09:25:28 +00:00
wh1te909
35747e937e try to get pipelines to fail 2020-12-16 09:10:53 +00:00
wh1te909
fb439787a4 Release 0.2.17 2020-12-16 00:37:59 +00:00
wh1te909
8fa368f473 bump versions 2020-12-16 00:36:43 +00:00
sadnub
c84a9d07b1 tactical-cli for managing docker installations 2020-12-15 13:41:03 -05:00
wh1te909
7fb46cdfc4 add more targeting options to bulk actions 2020-12-15 08:30:55 +00:00
Tragic Bronson
52985e5ddc Merge pull request #203 from wh1te909/dependabot/npm_and_yarn/docs/ini-1.3.8
Bump ini from 1.3.5 to 1.3.8 in /docs
2020-12-15 00:10:01 -08:00
wh1te909
e880935dc3 make script name required 2020-12-15 07:37:37 +00:00
wh1te909
cc22b1bca5 send favorite data when adding new script 2020-12-15 07:37:09 +00:00
wh1te909
49a5128918 remove extra migrations already handled by another func 2020-12-15 05:06:33 +00:00
wh1te909
fedc7dcb44 #204 add optional setting to prevent initial admin user from being modified or deleted 2020-12-14 21:00:25 +00:00
wh1te909
cd32b20215 remove vue tests for now 2020-12-14 20:59:43 +00:00
wh1te909
15cd9832c4 change fav script context menu style 2020-12-14 20:41:07 +00:00
wh1te909
f25d4e4553 add agent recovery periodic task 2020-12-14 19:27:09 +00:00
Tragic Bronson
12d1c82b63 Merge pull request #200 from sadnub/develop
Scripts Manager Rework
2020-12-14 10:35:19 -08:00
wh1te909
aebe855078 add a favorite menu to agent's context menu for easy way to run scripts 2020-12-14 11:28:00 +00:00
wh1te909
3416a71ebd add community scripts to migration 2020-12-14 07:17:51 +00:00
Tragic Bronson
94b3fea528 Create FUNDING.yml 2020-12-13 20:57:05 -08:00
Josh
ad1a9ecca1 fix agent table pending actions filter 2020-12-14 04:39:42 +00:00
Josh
715accfb8a scripts rework 2020-12-14 04:39:02 +00:00
dependabot[bot]
eedfbe5846 Bump ini from 1.3.5 to 1.3.8 in /docs
Bumps [ini](https://github.com/isaacs/ini) from 1.3.5 to 1.3.8.
- [Release notes](https://github.com/isaacs/ini/releases)
- [Commits](https://github.com/isaacs/ini/compare/v1.3.5...v1.3.8)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-13 07:18:22 +00:00
157 changed files with 7180 additions and 2270 deletions


@@ -0,0 +1,28 @@
COMPOSE_PROJECT_NAME=trmm
IMAGE_REPO=tacticalrmm/
VERSION=latest
# tactical credentials (Used to login to dashboard)
TRMM_USER=tactical
TRMM_PASS=tactical
# dns settings
APP_HOST=rmm.example.com
API_HOST=api.example.com
MESH_HOST=mesh.example.com
# mesh settings
MESH_USER=tactical
MESH_PASS=tactical
MONGODB_USER=mongouser
MONGODB_PASSWORD=mongopass
# database settings
POSTGRES_USER=postgres
POSTGRES_PASS=postgrespass
# DEV SETTINGS
APP_PORT=8000
API_PORT=8080
HTTP_PROTOCOL=https
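docker-compose picks this `.env` up automatically from its project directory; the same variables can also be loaded into a plain shell for ad-hoc commands. A minimal sketch, using a trimmed copy of the example above (the `/tmp` path is just for illustration):

```shell
# Write a trimmed copy of the example .env, then source it with set -a,
# which exports every assignment so child processes inherit it.
cat > /tmp/trmm-example.env <<'EOF'
APP_PORT=8000
API_PORT=8080
HTTP_PROTOCOL=https
EOF
set -a
. /tmp/trmm-example.env
set +a
echo "dev frontend: ${HTTP_PROTOCOL}://localhost:${APP_PORT}"
```

Compose performs the same substitution when it expands `${API_PORT}` and friends in the service definitions that follow.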


@@ -0,0 +1,28 @@
FROM python:3.8-slim
ENV TACTICAL_DIR /opt/tactical
ENV TACTICAL_GO_DIR /usr/local/rmmgo
ENV TACTICAL_READY_FILE ${TACTICAL_DIR}/tmp/tactical.ready
ENV WORKSPACE_DIR /workspace
ENV TACTICAL_USER tactical
ENV VIRTUAL_ENV ${WORKSPACE_DIR}/api/tacticalrmm/env
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
EXPOSE 8000
RUN groupadd -g 1000 tactical && \
useradd -u 1000 -g 1000 tactical
# Copy Go Files
COPY --from=golang:1.15 /usr/local/go ${TACTICAL_GO_DIR}/go
# Copy Dev python reqs
COPY ./requirements.txt /
# Copy Docker Entrypoint
COPY ./entrypoint.sh /
RUN chmod +x /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
WORKDIR ${WORKSPACE_DIR}/api/tacticalrmm


@@ -0,0 +1,19 @@
version: '3.4'
services:
api-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
command: ["sh", "-c", "pip install debugpy -t /tmp && python /tmp/debugpy --wait-for-client --listen 0.0.0.0:5678 manage.py runserver 0.0.0.0:8000 --nothreading --noreload"]
ports:
- 8000:8000
- 5678:5678
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
networks:
dev:
aliases:
- tactical-backend


@@ -0,0 +1,242 @@
version: '3.4'
services:
api-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
command: ["tactical-api"]
environment:
API_PORT: ${API_PORT}
ports:
- "8000:${API_PORT}"
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
networks:
dev:
aliases:
- tactical-backend
app-dev:
image: node:12-alpine
command: /bin/sh -c "npm install && npm run serve -- --host 0.0.0.0 --port ${APP_PORT}"
working_dir: /workspace/web
volumes:
- ..:/workspace:cached
ports:
- "8080:${APP_PORT}"
networks:
dev:
aliases:
- tactical-frontend
# salt master and api
salt-dev:
image: ${IMAGE_REPO}tactical-salt:${VERSION}
restart: always
volumes:
- tactical-data-dev:/opt/tactical
- salt-data-dev:/etc/salt
ports:
- "4505:4505"
- "4506:4506"
networks:
dev:
aliases:
- tactical-salt
# nats
nats-dev:
image: ${IMAGE_REPO}tactical-nats:${VERSION}
restart: always
environment:
API_HOST: ${API_HOST}
API_PORT: ${API_PORT}
DEV: 1
ports:
- "4222:4222"
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
networks:
dev:
aliases:
- ${API_HOST}
- tactical-nats
# meshcentral container
meshcentral-dev:
image: ${IMAGE_REPO}tactical-meshcentral:${VERSION}
restart: always
environment:
MESH_HOST: ${MESH_HOST}
MESH_USER: ${MESH_USER}
MESH_PASS: ${MESH_PASS}
MONGODB_USER: ${MONGODB_USER}
MONGODB_PASSWORD: ${MONGODB_PASSWORD}
NGINX_HOST_IP: 172.21.0.20
networks:
dev:
aliases:
- tactical-meshcentral
- ${MESH_HOST}
volumes:
- tactical-data-dev:/opt/tactical
- mesh-data-dev:/home/node/app/meshcentral-data
depends_on:
- mongodb-dev
# mongodb container for meshcentral
mongodb-dev:
image: mongo:4.4
restart: always
environment:
MONGO_INITDB_ROOT_USERNAME: ${MONGODB_USER}
MONGO_INITDB_ROOT_PASSWORD: ${MONGODB_PASSWORD}
MONGO_INITDB_DATABASE: meshcentral
networks:
dev:
aliases:
- tactical-mongodb
volumes:
- mongo-dev-data:/data/db
# postgres database for api service
postgres-dev:
image: postgres:13-alpine
restart: always
environment:
POSTGRES_DB: tacticalrmm
POSTGRES_USER: ${POSTGRES_USER}
POSTGRES_PASSWORD: ${POSTGRES_PASS}
volumes:
- postgres-data-dev:/var/lib/postgresql/data
networks:
dev:
aliases:
- tactical-postgres
# redis container for celery tasks
redis-dev:
restart: always
image: redis:6.0-alpine
networks:
dev:
aliases:
- tactical-redis
init-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
restart: on-failure
command: ["tactical-init-dev"]
environment:
POSTGRES_USER: ${POSTGRES_USER}
POSTGRES_PASS: ${POSTGRES_PASS}
APP_HOST: ${APP_HOST}
API_HOST: ${API_HOST}
MESH_HOST: ${MESH_HOST}
MESH_USER: ${MESH_USER}
TRMM_USER: ${TRMM_USER}
TRMM_PASS: ${TRMM_PASS}
HTTP_PROTOCOL: ${HTTP_PROTOCOL}
APP_PORT: ${APP_PORT}
depends_on:
- postgres-dev
- meshcentral-dev
networks:
- dev
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
# container for celery worker service
celery-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
command: ["tactical-celery-dev"]
restart: always
networks:
- dev
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
depends_on:
- postgres-dev
- redis-dev
# container for celery beat service
celerybeat-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
command: ["tactical-celerybeat-dev"]
restart: always
networks:
- dev
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
depends_on:
- postgres-dev
- redis-dev
# container for celery winupdate tasks
celerywinupdate-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
command: ["tactical-celerywinupdate-dev"]
restart: always
networks:
- dev
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
depends_on:
- postgres-dev
- redis-dev
nginx-dev:
# container for tactical reverse proxy
image: ${IMAGE_REPO}tactical-nginx:${VERSION}
restart: always
environment:
APP_HOST: ${APP_HOST}
API_HOST: ${API_HOST}
MESH_HOST: ${MESH_HOST}
CERT_PUB_KEY: ${CERT_PUB_KEY}
CERT_PRIV_KEY: ${CERT_PRIV_KEY}
APP_PORT: ${APP_PORT}
API_PORT: ${API_PORT}
networks:
dev:
ipv4_address: 172.21.0.20
ports:
- "80:80"
- "443:443"
volumes:
- tactical-data-dev:/opt/tactical
volumes:
tactical-data-dev:
postgres-data-dev:
mongo-dev-data:
mesh-data-dev:
salt-data-dev:
networks:
dev:
driver: bridge
ipam:
driver: default
config:
- subnet: 172.21.0.0/24
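Most services above derive their image reference from `IMAGE_REPO` and `VERSION`, which the `.env` example sets to `tacticalrmm/` and `latest`. A quick shell sketch of how that interpolation resolves:

```shell
# Same values as the example .env; compose substitutes them identically.
IMAGE_REPO=tacticalrmm/
VERSION=latest
image="${IMAGE_REPO}tactical-nats:${VERSION}"
echo "$image"   # -> tacticalrmm/tactical-nats:latest
```

Leaving `IMAGE_REPO` empty would instead resolve to a bare local tag like `tactical-nats:latest`.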

.devcontainer/entrypoint.sh

@@ -0,0 +1,187 @@
#!/usr/bin/env bash
set -e
: "${TRMM_USER:=tactical}"
: "${TRMM_PASS:=tactical}"
: "${POSTGRES_HOST:=tactical-postgres}"
: "${POSTGRES_PORT:=5432}"
: "${POSTGRES_USER:=tactical}"
: "${POSTGRES_PASS:=tactical}"
: "${POSTGRES_DB:=tacticalrmm}"
: "${SALT_HOST:=tactical-salt}"
: "${SALT_USER:=saltapi}"
: "${MESH_CONTAINER:=tactical-meshcentral}"
: "${MESH_USER:=meshcentral}"
: "${MESH_PASS:=meshcentralpass}"
: "${MESH_HOST:=tactical-meshcentral}"
: "${API_HOST:=tactical-backend}"
: "${APP_HOST:=tactical-frontend}"
: "${REDIS_HOST:=tactical-redis}"
: "${HTTP_PROTOCOL:=http}"
: "${APP_PORT:=8080}"
: "${API_PORT:=8000}"
# Add python venv to path
export PATH="${VIRTUAL_ENV}/bin:$PATH"
function check_tactical_ready {
sleep 15
until [ -f "${TACTICAL_READY_FILE}" ]; do
echo "waiting for init container to finish install or update..."
sleep 10
done
}
function django_setup {
until (echo > /dev/tcp/"${POSTGRES_HOST}"/"${POSTGRES_PORT}") &> /dev/null; do
echo "waiting for postgresql container to be ready..."
sleep 5
done
until (echo > /dev/tcp/"${MESH_CONTAINER}"/443) &> /dev/null; do
echo "waiting for meshcentral container to be ready..."
sleep 5
done
echo "setting up django environment"
# configure django settings
MESH_TOKEN=$(cat ${TACTICAL_DIR}/tmp/mesh_token)
DJANGO_SEKRET=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 80 | head -n 1)
# write salt pass to tmp dir
if [ ! -f "${TACTICAL_DIR}/tmp/salt_pass" ]; then
SALT_PASS=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 20 | head -n 1)
echo "${SALT_PASS}" > ${TACTICAL_DIR}/tmp/salt_pass
else
SALT_PASS=$(cat ${TACTICAL_DIR}/tmp/salt_pass)
fi
localvars="$(cat << EOF
SECRET_KEY = '${DJANGO_SEKRET}'
DEBUG = True
DOCKER_BUILD = True
CERT_FILE = '/opt/tactical/certs/fullchain.pem'
KEY_FILE = '/opt/tactical/certs/privkey.pem'
SCRIPTS_DIR = '${WORKSPACE_DIR}/scripts'
ALLOWED_HOSTS = ['${API_HOST}', '*']
ADMIN_URL = 'admin/'
CORS_ORIGIN_ALLOW_ALL = True
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': '${POSTGRES_DB}',
'USER': '${POSTGRES_USER}',
'PASSWORD': '${POSTGRES_PASS}',
'HOST': '${POSTGRES_HOST}',
'PORT': '${POSTGRES_PORT}',
}
}
REST_FRAMEWORK = {
'DATETIME_FORMAT': '%b-%d-%Y - %H:%M',
'DEFAULT_PERMISSION_CLASSES': (
'rest_framework.permissions.IsAuthenticated',
),
'DEFAULT_AUTHENTICATION_CLASSES': (
'knox.auth.TokenAuthentication',
),
}
if not DEBUG:
REST_FRAMEWORK.update({
'DEFAULT_RENDERER_CLASSES': (
'rest_framework.renderers.JSONRenderer',
)
})
SALT_USERNAME = '${SALT_USER}'
SALT_PASSWORD = '${SALT_PASS}'
SALT_HOST = '${SALT_HOST}'
MESH_USERNAME = '${MESH_USER}'
MESH_SITE = 'https://${MESH_HOST}'
MESH_TOKEN_KEY = '${MESH_TOKEN}'
REDIS_HOST = '${REDIS_HOST}'
EOF
)"
echo "${localvars}" > ${WORKSPACE_DIR}/api/tacticalrmm/tacticalrmm/local_settings.py
# run migrations and init scripts
python manage.py migrate --no-input
python manage.py collectstatic --no-input
python manage.py initial_db_setup
python manage.py initial_mesh_setup
python manage.py load_chocos
python manage.py load_community_scripts
python manage.py reload_nats
# create super user
echo "from accounts.models import User; User.objects.create_superuser('${TRMM_USER}', 'admin@example.com', '${TRMM_PASS}') if not User.objects.filter(username='${TRMM_USER}').exists() else 0;" | python manage.py shell
}
if [ "$1" = 'tactical-init-dev' ]; then
# make directories if they don't exist
mkdir -p ${TACTICAL_DIR}/tmp
test -f "${TACTICAL_READY_FILE}" && rm "${TACTICAL_READY_FILE}"
# setup Python virtual env and install dependencies
test -d "${VIRTUAL_ENV}" || python -m venv --copies "${VIRTUAL_ENV}"
pip install --no-cache-dir -r /requirements.txt
django_setup
# create .env file for frontend
webenv="$(cat << EOF
PROD_URL = "${HTTP_PROTOCOL}://${API_HOST}"
DEV_URL = "${HTTP_PROTOCOL}://${API_HOST}"
APP_URL = https://${APP_HOST}
EOF
)"
echo "${webenv}" | tee ${WORKSPACE_DIR}/web/.env > /dev/null
# chown everything to tactical user
chown -R "${TACTICAL_USER}":"${TACTICAL_USER}" "${WORKSPACE_DIR}"
chown -R "${TACTICAL_USER}":"${TACTICAL_USER}" "${TACTICAL_DIR}"
# create install ready file
su -c "echo 'tactical-init' > ${TACTICAL_READY_FILE}" "${TACTICAL_USER}"
fi
if [ "$1" = 'tactical-api' ]; then
cp ${WORKSPACE_DIR}/api/tacticalrmm/core/goinstaller/bin/goversioninfo /usr/local/bin/goversioninfo
chmod +x /usr/local/bin/goversioninfo
check_tactical_ready
python manage.py runserver 0.0.0.0:${API_PORT}
fi
if [ "$1" = 'tactical-celery-dev' ]; then
check_tactical_ready
env/bin/celery -A tacticalrmm worker -l debug
fi
if [ "$1" = 'tactical-celerybeat-dev' ]; then
check_tactical_ready
test -f "${WORKSPACE_DIR}/api/tacticalrmm/celerybeat.pid" && rm "${WORKSPACE_DIR}/api/tacticalrmm/celerybeat.pid"
env/bin/celery -A tacticalrmm beat -l debug
fi
if [ "$1" = 'tactical-celerywinupdate-dev' ]; then
check_tactical_ready
env/bin/celery -A tacticalrmm worker -Q wupdate -l debug
fi
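The entrypoint generates `DJANGO_SEKRET` and `SALT_PASS` with the same urandom pipeline; factored into a helper it looks like this (the function name is ours, not the script's):

```shell
# Read random bytes, strip everything but alphanumerics, wrap to the
# requested width, and keep the first line as the secret.
gen_secret() {
  cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w "$1" | head -n 1
}
pass=$(gen_secret 20)    # 20 chars, like salt_pass; DJANGO_SEKRET uses 80
echo "length: ${#pass}"
```

`head -n 1` ends the otherwise infinite `/dev/urandom` stream via SIGPIPE, so the pipeline terminates cleanly.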


@@ -0,0 +1,44 @@
# To ensure app dependencies are ported from your virtual environment/host machine into your container, run 'pip freeze > requirements.txt' in the terminal to overwrite this file
amqp==2.6.1
asgiref==3.3.1
asyncio-nats-client==0.11.4
billiard==3.6.3.0
celery==4.4.6
certifi==2020.12.5
cffi==1.14.3
chardet==3.0.4
cryptography==3.2.1
decorator==4.4.2
Django==3.1.4
django-cors-headers==3.5.0
django-rest-knox==4.1.0
djangorestframework==3.12.2
future==0.18.2
idna==2.10
kombu==4.6.11
loguru==0.5.3
msgpack==1.0.0
packaging==20.4
psycopg2-binary==2.8.6
pycparser==2.20
pycryptodome==3.9.9
pyotp==2.4.1
pyparsing==2.4.7
pytz==2020.4
qrcode==6.1
redis==3.5.3
requests==2.25.0
six==1.15.0
sqlparse==0.4.1
twilio==6.49.0
urllib3==1.26.2
validators==0.18.1
vine==1.3.0
websockets==8.1
zipp==3.4.0
black
Werkzeug
django-extensions
coverage
coveralls
model_bakery


@@ -1,5 +1,25 @@
.git
.cache
**/*.env
**/env
**/__pycache__
**/.classpath
**/.dockerignore
**/.env
**/.git
**/.gitignore
**/.project
**/.settings
**/.toolstarget
**/.vs
**/.vscode
**/*.*proj.user
**/*.dbmdl
**/*.jfm
**/azds.yaml
**/charts
**/docker-compose*
**/Dockerfile*
**/node_modules
**/npm-debug.log
**/obj
**/secrets.dev.yaml
**/values.dev.yaml
**/env
README.md

.github/FUNDING.yml

@@ -0,0 +1,12 @@
# These are supported funding model platforms
github: wh1te909
patreon: # Replace with a single Patreon username
open_collective: # Replace with a single Open Collective username
ko_fi: # Replace with a single Ko-fi username
tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
liberapay: # Replace with a single Liberapay username
issuehunt: # Replace with a single IssueHunt username
otechie: # Replace with a single Otechie username
custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']

.vscode/launch.json

@@ -14,6 +14,20 @@
"0.0.0.0:8000"
],
"django": true
},
{
"name": "Django: Docker Remote Attach",
"type": "python",
"request": "attach",
"port": 5678,
"host": "localhost",
"preLaunchTask": "docker debug",
"pathMappings": [
{
"localRoot": "${workspaceFolder}/api/tacticalrmm",
"remoteRoot": "/workspace/api/tacticalrmm"
}
]
}
]
}

.vscode/settings.json

@@ -41,4 +41,23 @@
"**/*.zip": true
},
},
"go.useLanguageServer": true,
"[go]": {
"editor.formatOnSave": true,
"editor.codeActionsOnSave": {
"source.organizeImports": false,
},
"editor.snippetSuggestions": "none",
},
"[go.mod]": {
"editor.formatOnSave": true,
"editor.codeActionsOnSave": {
"source.organizeImports": true,
},
},
"gopls": {
"usePlaceholders": true,
"completeUnimported": true,
"staticcheck": true,
}
}

.vscode/tasks.json

@@ -0,0 +1,23 @@
{
// See https://go.microsoft.com/fwlink/?LinkId=733558
// for the documentation about the tasks.json format
"version": "2.0.0",
"tasks": [
{
"label": "docker debug",
"type": "shell",
"command": "docker-compose",
"args": [
"-p",
"trmm",
"-f",
".devcontainer/docker-compose.yml",
"-f",
".devcontainer/docker-compose.debug.yml",
"up",
"-d",
"--build"
]
}
]
}
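The "docker debug" task is a plain docker-compose invocation; assembled as a string here (rather than executed, since it needs a running Docker daemon) it reads:

```shell
# The command the task runs, flattened from the args array above.
cmd="docker-compose -p trmm -f .devcontainer/docker-compose.yml -f .devcontainer/docker-compose.debug.yml up -d --build"
echo "$cmd"
```

Running it by hand from the repo root gives the same debug stack the launch.json "Django: Docker Remote Attach" configuration attaches to.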


@@ -36,7 +36,7 @@ Demo database resets every hour. Alot of features are disabled for obvious reaso
## Installation
### Requirements
- VPS with 4GB ram (an install script is provided for Ubuntu Server 20.04)
- VPS with 4GB ram (an install script is provided for Ubuntu Server 20.04 / Debian 10)
- A domain you own with at least 3 subdomains
- Google Authenticator app (2 factor is NOT optional)


@@ -6,28 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('accounts', '0002_auto_20200810_0544'),
("accounts", "0002_auto_20200810_0544"),
]
operations = [
migrations.AddField(
model_name='user',
name='created_by',
model_name="user",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='created_time',
model_name="user",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='user',
name='modified_by',
model_name="user",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='modified_time',
model_name="user",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]


@@ -6,24 +6,24 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('accounts', '0003_auto_20200922_1344'),
("accounts", "0003_auto_20200922_1344"),
]
operations = [
migrations.RemoveField(
model_name='user',
name='created_by',
model_name="user",
name="created_by",
),
migrations.RemoveField(
model_name='user',
name='created_time',
model_name="user",
name="created_time",
),
migrations.RemoveField(
model_name='user',
name='modified_by',
model_name="user",
name="modified_by",
),
migrations.RemoveField(
model_name='user',
name='modified_time',
model_name="user",
name="modified_time",
),
]


@@ -6,28 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('accounts', '0004_auto_20201002_1257'),
("accounts", "0004_auto_20201002_1257"),
]
operations = [
migrations.AddField(
model_name='user',
name='created_by',
model_name="user",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='created_time',
model_name="user",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='user',
name='modified_by',
model_name="user",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='modified_time',
model_name="user",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]


@@ -6,13 +6,13 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('accounts', '0007_update_agent_primary_key'),
("accounts", "0007_update_agent_primary_key"),
]
operations = [
migrations.AddField(
model_name='user',
name='dark_mode',
model_name="user",
name="dark_mode",
field=models.BooleanField(default=True),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 3.1.4 on 2020-12-10 17:00
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("accounts", "0008_user_dark_mode"),
]
operations = [
migrations.AddField(
model_name="user",
name="show_community_scripts",
field=models.BooleanField(default=True),
),
]

View File

@@ -0,0 +1,26 @@
# Generated by Django 3.1.4 on 2021-01-14 01:23
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("accounts", "0009_user_show_community_scripts"),
]
operations = [
migrations.AddField(
model_name="user",
name="agent_dblclick_action",
field=models.CharField(
choices=[
("editagent", "Edit Agent"),
("takecontrol", "Take Control"),
("remotebg", "Remote Background"),
],
default="editagent",
max_length=50,
),
),
]

View File

@@ -3,11 +3,21 @@ from django.contrib.auth.models import AbstractUser
from logs.models import BaseAuditModel
AGENT_DBLCLICK_CHOICES = [
("editagent", "Edit Agent"),
("takecontrol", "Take Control"),
("remotebg", "Remote Background"),
]
class User(AbstractUser, BaseAuditModel):
is_active = models.BooleanField(default=True)
totp_key = models.CharField(max_length=50, null=True, blank=True)
dark_mode = models.BooleanField(default=True)
show_community_scripts = models.BooleanField(default=True)
agent_dblclick_action = models.CharField(
max_length=50, choices=AGENT_DBLCLICK_CHOICES, default="editagent"
)
agent = models.OneToOneField(
"agents.Agent",

View File

@@ -155,6 +155,33 @@ class GetUpdateDeleteUser(TacticalTestCase):
self.check_not_authenticated("put", url)
@override_settings(ROOT_USER="john")
def test_put_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
data = {
"id": self.john.pk,
"username": "john",
"email": "johndoe@xlawgaming.com",
"first_name": "John",
"last_name": "Doe",
}
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 200)
@override_settings(ROOT_USER="john")
def test_put_not_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
data = {
"id": self.john.pk,
"username": "john",
"email": "johndoe@xlawgaming.com",
"first_name": "John",
"last_name": "Doe",
}
self.client.force_authenticate(user=self.alice)
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 400)
def test_delete(self):
url = f"/accounts/{self.john.pk}/users/"
r = self.client.delete(url)
@@ -166,6 +193,19 @@ class GetUpdateDeleteUser(TacticalTestCase):
self.check_not_authenticated("delete", url)
@override_settings(ROOT_USER="john")
def test_delete_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
r = self.client.delete(url)
self.assertEqual(r.status_code, 200)
@override_settings(ROOT_USER="john")
def test_delete_non_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
self.client.force_authenticate(user=self.alice)
r = self.client.delete(url)
self.assertEqual(r.status_code, 400)
class TestUserAction(TacticalTestCase):
def setUp(self):
@@ -184,6 +224,21 @@ class TestUserAction(TacticalTestCase):
self.check_not_authenticated("post", url)
@override_settings(ROOT_USER="john")
def test_post_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk, "password": "3ASDjh2345kJA!@#)#@__123"}
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 200)
@override_settings(ROOT_USER="john")
def test_post_non_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk, "password": "3ASDjh2345kJA!@#)#@__123"}
self.client.force_authenticate(user=self.alice)
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 400)
def test_put(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk}
@@ -195,12 +250,46 @@ class TestUserAction(TacticalTestCase):
self.check_not_authenticated("put", url)
def test_darkmode(self):
@override_settings(ROOT_USER="john")
def test_put_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk}
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 200)
user = User.objects.get(pk=self.john.pk)
self.assertEqual(user.totp_key, "")
@override_settings(ROOT_USER="john")
def test_put_non_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk}
self.client.force_authenticate(user=self.alice)
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 400)
def test_user_ui(self):
url = "/accounts/users/ui/"
data = {"dark_mode": False}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
data = {"show_community_scripts": True}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
data = {"agent_dblclick_action": "editagent"}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
data = {"agent_dblclick_action": "remotebg"}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
data = {"agent_dblclick_action": "takecontrol"}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.check_not_authenticated("patch", url)
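The root-user tests above all exercise the same guard: any request that targets the `ROOT_USER` account from a different user is rejected with a 400. A minimal sketch of that condition (plain strings standing in for `settings.ROOT_USER`, `request.user`, and the target user):

```python
def root_user_blocked(root_user, request_username, target_username):
    # Mirrors the hasattr(settings, "ROOT_USER") checks in the views:
    # block when the target is the root user and the requester is someone else.
    return (
        root_user is not None
        and request_username != target_username
        and target_username == root_user
    )
```

Hypothetical names; the real views compare `request.user != user` and `user.username == settings.ROOT_USER` on model instances.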

View File

@@ -60,7 +60,7 @@ class LoginView(KnoxLoginView):
if settings.DEBUG and token == "sekret":
valid = True
elif totp.verify(token, valid_window=1):
elif totp.verify(token, valid_window=10):
valid = True
if valid:
@@ -108,6 +108,13 @@ class GetUpdateDeleteUser(APIView):
def put(self, request, pk):
user = get_object_or_404(User, pk=pk)
if (
hasattr(settings, "ROOT_USER")
and request.user != user
and user.username == settings.ROOT_USER
):
return notify_error("The root user cannot be modified from the UI")
serializer = UserSerializer(instance=user, data=request.data, partial=True)
serializer.is_valid(raise_exception=True)
serializer.save()
@@ -115,7 +122,15 @@ class GetUpdateDeleteUser(APIView):
return Response("ok")
def delete(self, request, pk):
get_object_or_404(User, pk=pk).delete()
user = get_object_or_404(User, pk=pk)
if (
hasattr(settings, "ROOT_USER")
and request.user != user
and user.username == settings.ROOT_USER
):
return notify_error("The root user cannot be deleted from the UI")
user.delete()
return Response("ok")
@@ -124,8 +139,14 @@ class UserActions(APIView):
# reset password
def post(self, request):
user = get_object_or_404(User, pk=request.data["id"])
if (
hasattr(settings, "ROOT_USER")
and request.user != user
and user.username == settings.ROOT_USER
):
return notify_error("The root user cannot be modified from the UI")
user.set_password(request.data["password"])
user.save()
@@ -133,8 +154,14 @@ class UserActions(APIView):
# reset two factor token
def put(self, request):
user = get_object_or_404(User, pk=request.data["id"])
if (
hasattr(settings, "ROOT_USER")
and request.user != user
and user.username == settings.ROOT_USER
):
return notify_error("The root user cannot be modified from the UI")
user.totp_key = ""
user.save()
@@ -161,6 +188,17 @@ class TOTPSetup(APIView):
class UserUI(APIView):
def patch(self, request):
user = request.user
user.dark_mode = request.data["dark_mode"]
user.save(update_fields=["dark_mode"])
return Response("ok")
if "dark_mode" in request.data:
user.dark_mode = request.data["dark_mode"]
user.save(update_fields=["dark_mode"])
if "show_community_scripts" in request.data:
user.show_community_scripts = request.data["show_community_scripts"]
user.save(update_fields=["show_community_scripts"])
if "agent_dblclick_action" in request.data:
user.agent_dblclick_action = request.data["agent_dblclick_action"]
user.save(update_fields=["agent_dblclick_action"])
return Response("ok")
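The reworked `UserUI.patch` above applies each UI preference only when its key is present in the request payload, and scopes every save to that one field via `update_fields`. A minimal sketch of the same pattern, with plain dicts and a hypothetical `apply_ui_patch` helper in place of the Django model:

```python
UI_FIELDS = ("dark_mode", "show_community_scripts", "agent_dblclick_action")

def apply_ui_patch(user: dict, data: dict) -> list:
    """Apply only the UI fields present in the payload; return what changed."""
    updated = []
    for field in UI_FIELDS:
        if field in data:
            user[field] = data[field]
            updated.append(field)  # stands in for user.save(update_fields=[field])
    return updated
```

Fields absent from the payload are left untouched, which is what lets the frontend PATCH one preference at a time.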

View File

@@ -7,14 +7,20 @@ import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('clients', '0006_deployment'),
('agents', '0020_auto_20201025_2129'),
("clients", "0006_deployment"),
("agents", "0020_auto_20201025_2129"),
]
operations = [
migrations.AddField(
model_name='agent',
name='site_link',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='agents', to='clients.site'),
model_name="agent",
name="site_link",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="agents",
to="clients.site",
),
),
]

View File

@@ -6,16 +6,16 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('agents', '0022_update_site_primary_key'),
("agents", "0022_update_site_primary_key"),
]
operations = [
migrations.RemoveField(
model_name='agent',
name='client',
model_name="agent",
name="client",
),
migrations.RemoveField(
model_name='agent',
name='site',
model_name="agent",
name="site",
),
]

View File

@@ -6,13 +6,13 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('agents', '0023_auto_20201101_2312'),
("agents", "0023_auto_20201101_2312"),
]
operations = [
migrations.RenameField(
model_name='agent',
old_name='site_link',
new_name='site',
model_name="agent",
old_name="site_link",
new_name="site",
),
]

View File

@@ -6,13 +6,22 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('agents', '0024_auto_20201101_2319'),
("agents", "0024_auto_20201101_2319"),
]
operations = [
migrations.AlterField(
model_name='recoveryaction',
name='mode',
field=models.CharField(choices=[('salt', 'Salt'), ('mesh', 'Mesh'), ('command', 'Command'), ('rpc', 'Nats RPC')], default='mesh', max_length=50),
model_name="recoveryaction",
name="mode",
field=models.CharField(
choices=[
("salt", "Salt"),
("mesh", "Mesh"),
("command", "Command"),
("rpc", "Nats RPC"),
],
default="mesh",
max_length=50,
),
),
]

View File

@@ -6,13 +6,23 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('agents', '0025_auto_20201122_0407'),
("agents", "0025_auto_20201122_0407"),
]
operations = [
migrations.AlterField(
model_name='recoveryaction',
name='mode',
field=models.CharField(choices=[('salt', 'Salt'), ('mesh', 'Mesh'), ('command', 'Command'), ('rpc', 'Nats RPC'), ('checkrunner', 'Checkrunner')], default='mesh', max_length=50),
model_name="recoveryaction",
name="mode",
field=models.CharField(
choices=[
("salt", "Salt"),
("mesh", "Mesh"),
("command", "Command"),
("rpc", "Nats RPC"),
("checkrunner", "Checkrunner"),
],
default="mesh",
max_length=50,
),
),
]

View File

@@ -382,32 +382,18 @@ class Agent(BaseAuditModel):
return patch_policy
# clear is used to delete managed policy checks from agent
# parent_checks specifies a list of checks to delete from agent with matching parent_check field
def generate_checks_from_policies(self, clear=False):
def generate_checks_from_policies(self):
from automation.models import Policy
# Clear agent checks managed by policy
if clear:
self.agentchecks.filter(managed_by_policy=True).delete()
# Clear agent checks that have overriden_by_policy set
self.agentchecks.update(overriden_by_policy=False)
# Generate checks based on policies
Policy.generate_policy_checks(self)
# clear is used to delete managed policy tasks from agent
# parent_tasks specifies a list of tasks to delete from agent with matching parent_task field
def generate_tasks_from_policies(self, clear=False):
from autotasks.tasks import delete_win_task_schedule
def generate_tasks_from_policies(self):
from automation.models import Policy
# Clear agent tasks managed by policy
if clear:
for task in self.autotasks.filter(managed_by_policy=True):
delete_win_task_schedule.delay(task.pk)
# Generate tasks based on policies
Policy.generate_policy_tasks(self)
@@ -625,6 +611,13 @@ class Agent(BaseAuditModel):
elif action.details["action"] == "taskdelete":
delete_win_task_schedule.delay(task_id, pending_action=action.id)
# for clearing duplicate pending actions on agent
def remove_matching_pending_task_actions(self, task_id):
# remove any other pending actions on agent with same task_id
for action in self.pendingactions.exclude(status="completed"):
if action.details["task_id"] == task_id:
action.delete()
class AgentOutage(models.Model):
agent = models.ForeignKey(
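The new `remove_matching_pending_task_actions` above drops every non-completed pending action that targets the same `task_id`, which is what clears the duplicate pending actions mentioned in the commit log. A hedged sketch with plain dicts standing in for `PendingAction` rows:

```python
def remove_matching_pending(actions: list, task_id: int) -> list:
    # Keep completed actions; drop any non-completed action for this task.
    return [
        a
        for a in actions
        if a["status"] == "completed" or a["details"]["task_id"] != task_id
    ]
```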

View File

@@ -7,6 +7,7 @@ from packaging import version as pyver
from typing import List
from django.conf import settings
from scripts.models import Script
from tacticalrmm.celery import app
from agents.models import Agent, AgentOutage
@@ -16,6 +17,45 @@ from logs.models import PendingAction
logger.configure(**settings.LOG_CONFIG)
def _check_agent_service(pk: int) -> None:
agent = Agent.objects.get(pk=pk)
r = asyncio.run(agent.nats_cmd({"func": "ping"}, timeout=2))
if r == "pong":
logger.info(
f"Detected crashed tacticalagent service on {agent.hostname}, attempting recovery"
)
data = {"func": "recover", "payload": {"mode": "tacagent"}}
asyncio.run(agent.nats_cmd(data, wait=False))
def _check_in_full(pk: int) -> None:
agent = Agent.objects.get(pk=pk)
asyncio.run(agent.nats_cmd({"func": "checkinfull"}, wait=False))
@app.task
def check_in_task() -> None:
q = Agent.objects.only("pk", "version")
agents: List[int] = [
i.pk for i in q if pyver.parse(i.version) == pyver.parse("1.1.12")
]
chunks = (agents[i : i + 50] for i in range(0, len(agents), 50))
for chunk in chunks:
for pk in chunk:
_check_in_full(pk)
sleep(0.1)
rand = random.randint(3, 7)
sleep(rand)
@app.task
def monitor_agents_task() -> None:
q = Agent.objects.all()
agents: List[int] = [i.pk for i in q if i.has_nats and i.status != "online"]
for agent in agents:
_check_agent_service(agent)
def agent_update(pk: int) -> str:
agent = Agent.objects.get(pk=pk)
# skip if we can't determine the arch
@@ -23,55 +63,46 @@ def agent_update(pk: int) -> str:
logger.warning(f"Unable to determine arch on {agent.hostname}. Skipping.")
return "noarch"
-    # force an update to 1.1.5 since 1.1.6 needs agent to be on 1.1.5 first
-    if pyver.parse(agent.version) < pyver.parse("1.1.5"):
-        version = "1.1.5"
-        if agent.arch == "64":
-            url = "https://github.com/wh1te909/rmmagent/releases/download/v1.1.5/winagent-v1.1.5.exe"
-            inno = "winagent-v1.1.5.exe"
-        elif agent.arch == "32":
-            url = "https://github.com/wh1te909/rmmagent/releases/download/v1.1.5/winagent-v1.1.5-x86.exe"
-            inno = "winagent-v1.1.5-x86.exe"
-        else:
-            return "nover"
-    else:
-        version = settings.LATEST_AGENT_VER
-        url = agent.winagent_dl
-        inno = agent.win_inno_exe
+    version = settings.LATEST_AGENT_VER
+    url = agent.winagent_dl
+    inno = agent.win_inno_exe
-    if agent.has_nats:
-        if agent.pendingactions.filter(
-            action_type="agentupdate", status="pending"
-        ).exists():
-            action = agent.pendingactions.filter(
-                action_type="agentupdate", status="pending"
-            ).last()
-            if pyver.parse(action.details["version"]) < pyver.parse(version):
-                action.delete()
-            else:
-                return "pending"
+    if pyver.parse(agent.version) <= pyver.parse("1.1.11"):
+        if agent.pendingactions.filter(
+            action_type="agentupdate", status="pending"
+        ).exists():
+            action = agent.pendingactions.filter(
+                action_type="agentupdate", status="pending"
+            ).last()
+            if pyver.parse(action.details["version"]) < pyver.parse(version):
+                action.delete()
+            else:
+                return "pending"
         PendingAction.objects.create(
             agent=agent,
             action_type="agentupdate",
             details={
                 "url": url,
                 "version": version,
                 "inno": inno,
             },
         )
+    else:
+        nats_data = {
+            "func": "agentupdate",
+            "payload": {
+                "url": url,
+                "version": version,
+                "inno": inno,
+            },
+        }
+        asyncio.run(agent.nats_cmd(nats_data, wait=False))
+        PendingAction.objects.create(
+            agent=agent,
+            action_type="agentupdate",
+            details={
+                "url": url,
+                "version": version,
+                "inno": inno,
+            },
+        )
     return "created"
-    # TODO
-    # Salt is deprecated, remove this once salt is gone
-    else:
-        agent.salt_api_async(
-            func="win_agent.do_agent_update_v2",
-            kwargs={
-                "inno": inno,
-                "url": url,
-            },
-        )
-        return "salt"
-    return "not supported"
@app.task
@@ -89,7 +120,6 @@ def send_agent_update_task(pks: List[int], version: str) -> None:
def auto_self_agent_update_task() -> None:
core = CoreSettings.objects.first()
if not core.agent_auto_update:
logger.info("Agent auto update is disabled. Skipping.")
return
q = Agent.objects.only("pk", "version")
@@ -103,16 +133,41 @@ def auto_self_agent_update_task() -> None:
agent_update(pk)
@app.task
def get_wmi_task():
agents = Agent.objects.all()
online = [
i
for i in agents
if pyver.parse(i.version) >= pyver.parse("1.2.0") and i.status == "online"
]
chunks = (online[i : i + 50] for i in range(0, len(online), 50))
for chunk in chunks:
for agent in chunk:
asyncio.run(agent.nats_cmd({"func": "wmi"}, wait=False))
sleep(0.1)
rand = random.randint(3, 7)
sleep(rand)
@app.task
def sync_sysinfo_task():
agents = Agent.objects.all()
online = [
i
for i in agents
if pyver.parse(i.version) >= pyver.parse("1.1.3") and i.status == "online"
if pyver.parse(i.version) >= pyver.parse("1.1.3")
and pyver.parse(i.version) <= pyver.parse("1.1.12")
and i.status == "online"
]
for agent in online:
asyncio.run(agent.nats_cmd({"func": "sync"}, wait=False))
chunks = (online[i : i + 50] for i in range(0, len(online), 50))
for chunk in chunks:
for agent in chunk:
asyncio.run(agent.nats_cmd({"func": "sync"}, wait=False))
sleep(0.1)
rand = random.randint(3, 7)
sleep(rand)
@app.task
@@ -255,8 +310,82 @@ def agent_outages_task():
outage = AgentOutage(agent=agent)
outage.save()
# add a null check history to allow gaps in graph
for check in agent.agentchecks.all():
check.add_check_history(None)
if agent.overdue_email_alert and not agent.maintenance_mode:
agent_outage_email_task.delay(pk=outage.pk)
if agent.overdue_text_alert and not agent.maintenance_mode:
agent_outage_sms_task.delay(pk=outage.pk)
@app.task
def install_salt_task(pk: int) -> None:
sleep(20)
agent = Agent.objects.get(pk=pk)
asyncio.run(agent.nats_cmd({"func": "installsalt"}, wait=False))
@app.task
def handle_agent_recovery_task(pk: int) -> None:
sleep(10)
from agents.models import RecoveryAction
action = RecoveryAction.objects.get(pk=pk)
if action.mode == "command":
data = {"func": "recoverycmd", "recoverycommand": action.command}
else:
data = {"func": "recover", "payload": {"mode": action.mode}}
asyncio.run(action.agent.nats_cmd(data, wait=False))
@app.task
def run_script_email_results_task(
agentpk: int, scriptpk: int, nats_timeout: int, nats_data: dict, emails: List[str]
):
agent = Agent.objects.get(pk=agentpk)
script = Script.objects.get(pk=scriptpk)
nats_data["func"] = "runscriptfull"
r = asyncio.run(agent.nats_cmd(nats_data, timeout=nats_timeout))
if r == "timeout":
logger.error(f"{agent.hostname} timed out running script.")
return
CORE = CoreSettings.objects.first()
subject = f"{agent.hostname} {script.name} Results"
exec_time = "{:.4f}".format(r["execution_time"])
body = (
subject
+ f"\nReturn code: {r['retcode']}\nExecution time: {exec_time} seconds\nStdout: {r['stdout']}\nStderr: {r['stderr']}"
)
import smtplib
from email.message import EmailMessage
msg = EmailMessage()
msg["Subject"] = subject
msg["From"] = CORE.smtp_from_email
if emails:
msg["To"] = ", ".join(emails)
else:
msg["To"] = ", ".join(CORE.email_alert_recipients)
msg.set_content(body)
try:
with smtplib.SMTP(CORE.smtp_host, CORE.smtp_port, timeout=20) as server:
if CORE.smtp_requires_auth:
server.ehlo()
server.starttls()
server.login(CORE.smtp_host_user, CORE.smtp_host_password)
server.send_message(msg)
server.quit()
else:
server.send_message(msg)
server.quit()
except Exception as e:
logger.error(e)
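`check_in_task`, `get_wmi_task`, and `sync_sysinfo_task` in this file all fan NATS commands out in chunks of 50 with short sleeps between sends and a random 3-7 second pause between batches, so a large fleet doesn't get hit all at once. The slicing idiom can be sketched as follows (`send` and `pause` are hypothetical stand-ins for the per-agent `nats_cmd` call and the random inter-batch sleep):

```python
import time

def chunked(seq, size=50):
    """Yield successive size-length slices of seq (the seq[i : i + 50] idiom above)."""
    return (seq[i : i + size] for i in range(0, len(seq), size))

def fanout(pks, send, size=50, throttle=0.1, pause=None):
    # Throttled fan-out: small sleep per item, optional pause per batch.
    for chunk in chunked(pks, size):
        for pk in chunk:
            send(pk)
            time.sleep(throttle)
        if pause is not None:
            pause()  # e.g. lambda: time.sleep(random.randint(3, 7))
```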

View File

@@ -127,7 +127,13 @@ class TestAgentViews(TacticalTestCase):
@patch("agents.models.Agent.nats_cmd")
def test_get_processes(self, mock_ret):
url = f"/agents/{self.agent.pk}/getprocs/"
agent_old = baker.make_recipe("agents.online_agent", version="1.1.12")
url_old = f"/agents/{agent_old.pk}/getprocs/"
r = self.client.get(url_old)
self.assertEqual(r.status_code, 400)
agent = baker.make_recipe("agents.online_agent", version="1.2.0")
url = f"/agents/{agent.pk}/getprocs/"
with open(
os.path.join(settings.BASE_DIR, "tacticalrmm/test_data/procs.json")
@@ -137,9 +143,7 @@ class TestAgentViews(TacticalTestCase):
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
assert any(i["name"] == "Registry" for i in mock_ret.return_value)
assert any(
i["memory_percent"] == 0.004843281375620747 for i in mock_ret.return_value
)
assert any(i["membytes"] == 434655234324 for i in mock_ret.return_value)
mock_ret.return_value = "timeout"
r = self.client.get(url)
@@ -550,6 +554,7 @@ class TestAgentViews(TacticalTestCase):
payload = {
"mode": "command",
"monType": "all",
"target": "agents",
"client": None,
"site": None,
@@ -567,6 +572,7 @@ class TestAgentViews(TacticalTestCase):
payload = {
"mode": "command",
"monType": "servers",
"target": "agents",
"client": None,
"site": None,
@@ -579,14 +585,13 @@ class TestAgentViews(TacticalTestCase):
r = self.client.post(url, payload, format="json")
self.assertEqual(r.status_code, 400)
payload = {
""" payload = {
"mode": "command",
"monType": "workstations",
"target": "client",
"client": self.agent.client.id,
"site": None,
"agentPKs": [
self.agent.pk,
],
"agentPKs": [],
"cmd": "gpupdate /force",
"timeout": 300,
"shell": "cmd",
@@ -594,10 +599,11 @@ class TestAgentViews(TacticalTestCase):
r = self.client.post(url, payload, format="json")
self.assertEqual(r.status_code, 200)
bulk_command.assert_called_with([self.agent.pk], "gpupdate /force", "cmd", 300)
bulk_command.assert_called_with([self.agent.pk], "gpupdate /force", "cmd", 300) """
payload = {
"mode": "command",
"monType": "all",
"target": "client",
"client": self.agent.client.id,
"site": self.agent.site.id,
@@ -615,6 +621,7 @@ class TestAgentViews(TacticalTestCase):
payload = {
"mode": "scan",
"monType": "all",
"target": "agents",
"client": None,
"site": None,
@@ -628,6 +635,7 @@ class TestAgentViews(TacticalTestCase):
payload = {
"mode": "install",
"monType": "all",
"target": "client",
"client": self.agent.client.id,
"site": None,
@@ -786,14 +794,14 @@ class TestAgentTasks(TacticalTestCase):
self.assertEqual(salt_batch_async.call_count, 4)
self.assertEqual(ret.status, "SUCCESS")
@patch("agents.models.Agent.salt_api_async")
def test_agent_update(self, salt_api_async):
@patch("agents.models.Agent.nats_cmd")
def test_agent_update(self, nats_cmd):
from agents.tasks import agent_update
agent_noarch = baker.make_recipe(
"agents.agent",
operating_system="Error getting OS",
version="1.1.0",
version="1.1.11",
)
r = agent_update(agent_noarch.pk)
self.assertEqual(r, "noarch")
@@ -804,15 +812,15 @@ class TestAgentTasks(TacticalTestCase):
0,
)
agent64_nats = baker.make_recipe(
agent64_111 = baker.make_recipe(
"agents.agent",
operating_system="Windows 10 Pro, 64 bit (build 19041.450)",
version="1.1.0",
version="1.1.11",
)
r = agent_update(agent64_nats.pk)
r = agent_update(agent64_111.pk)
self.assertEqual(r, "created")
action = PendingAction.objects.get(agent__pk=agent64_nats.pk)
action = PendingAction.objects.get(agent__pk=agent64_111.pk)
self.assertEqual(action.action_type, "agentupdate")
self.assertEqual(action.status, "pending")
self.assertEqual(action.details["url"], settings.DL_64)
@@ -821,33 +829,24 @@ class TestAgentTasks(TacticalTestCase):
)
self.assertEqual(action.details["version"], settings.LATEST_AGENT_VER)
agent64_salt = baker.make_recipe(
agent64 = baker.make_recipe(
"agents.agent",
operating_system="Windows 10 Pro, 64 bit (build 19041.450)",
version="1.0.0",
version="1.1.12",
)
-        salt_api_async.return_value = True
-        r = agent_update(agent64_salt.pk)
-        self.assertEqual(r, "salt")
-        salt_api_async.assert_called_with(
-            func="win_agent.do_agent_update_v2",
-            kwargs={
-                "inno": f"winagent-v{settings.LATEST_AGENT_VER}.exe",
-                "url": settings.DL_64,
-            },
-        )
-        salt_api_async.reset_mock()
-        agent32_nats = baker.make_recipe(
-            "agents.agent",
-            operating_system="Windows 7 Professional, 32 bit (build 7601.23964)",
-            version="1.1.0",
-        )
-        agent32_salt = baker.make_recipe(
-            "agents.agent",
-            operating_system="Windows 7 Professional, 32 bit (build 7601.23964)",
-            version="1.0.0",
-        )
+        nats_cmd.return_value = "ok"
+        r = agent_update(agent64.pk)
+        self.assertEqual(r, "created")
+        nats_cmd.assert_called_with(
+            {
+                "func": "agentupdate",
+                "payload": {
+                    "url": settings.DL_64,
+                    "version": settings.LATEST_AGENT_VER,
+                    "inno": f"winagent-v{settings.LATEST_AGENT_VER}.exe",
+                },
+            },
+            wait=False,
+        )
""" @patch("agents.models.Agent.salt_api_async")
@@ -953,4 +952,4 @@ class TestAgentTasks(TacticalTestCase):
"url": OLD_32_PY_AGENT,
},
)
self.assertEqual(ret.status, "SUCCESS") """

View File

@@ -32,7 +32,11 @@ from .serializers import (
)
from winupdate.serializers import WinUpdatePolicySerializer
from .tasks import uninstall_agent_task, send_agent_update_task
from .tasks import (
uninstall_agent_task,
send_agent_update_task,
run_script_email_results_task,
)
from winupdate.tasks import bulk_check_for_updates_task
from scripts.tasks import handle_bulk_command_task, handle_bulk_script_task
@@ -110,8 +114,8 @@ def edit_agent(request):
# check if site changed and initiate generating correct policies
if old_site != request.data["site"]:
agent.generate_checks_from_policies(clear=True)
agent.generate_tasks_from_policies(clear=True)
agent.generate_checks_from_policies()
agent.generate_tasks_from_policies()
return Response("ok")
@@ -155,12 +159,12 @@ def agent_detail(request, pk):
@api_view()
def get_processes(request, pk):
agent = get_object_or_404(Agent, pk=pk)
if not agent.has_nats:
return notify_error("Requires agent version 1.1.0 or greater")
if pyver.parse(agent.version) < pyver.parse("1.2.0"):
return notify_error("Requires agent version 1.2.0 or greater")
r = asyncio.run(agent.nats_cmd(data={"func": "procs"}, timeout=5))
if r == "timeout":
return notify_error("Unable to contact the agent")
return Response(r)
@@ -738,6 +742,21 @@ def run_script(request):
if output == "wait":
r = asyncio.run(agent.nats_cmd(data, timeout=req_timeout))
return Response(r)
elif output == "email":
if not pyver.parse(agent.version) >= pyver.parse("1.1.12"):
return notify_error("Requires agent version 1.1.12 or greater")
emails = (
[] if request.data["emailmode"] == "default" else request.data["emails"]
)
run_script_email_results_task.delay(
agentpk=agent.pk,
scriptpk=script.pk,
nats_timeout=req_timeout,
nats_data=data,
emails=emails,
)
return Response(f"{script.name} will now be run on {agent.hostname}")
else:
asyncio.run(agent.nats_cmd(data, wait=False))
return Response(f"{script.name} will now be run on {agent.hostname}")
@@ -825,6 +844,11 @@ def bulk(request):
else:
return notify_error("Something went wrong")
if request.data["monType"] == "servers":
q = q.filter(monitoring_type="server")
elif request.data["monType"] == "workstations":
q = q.filter(monitoring_type="workstation")
minions = [agent.salt_id for agent in q]
agents = [agent.pk for agent in q]
@@ -909,4 +933,4 @@ class WMI(APIView):
r = asyncio.run(agent.nats_cmd({"func": "sysinfo"}, timeout=20))
if r != "ok":
return notify_error("Unable to contact the agent")
return Response("ok")
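`get_processes` and `run_script` above gate features on the agent version with `pyver.parse` from the `packaging` library, which compares versions numerically rather than lexically (a plain string comparison would sort "1.10.0" before "1.9.0"). A stdlib-only sketch of the same check, with `parse_ver` as a simplified stand-in for `packaging.version.parse` that only handles plain dotted numeric versions:

```python
def parse_ver(v: str) -> tuple:
    # "1.1.12" -> (1, 1, 12); tuples compare element-wise, like real versions
    return tuple(int(part) for part in v.split("."))

def meets_min_version(agent_version: str, minimum: str) -> bool:
    return parse_ver(agent_version) >= parse_ver(minimum)
```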

View File

@@ -7,19 +7,25 @@ import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('checks', '0010_auto_20200922_1344'),
('alerts', '0002_auto_20200815_1618'),
("checks", "0010_auto_20200922_1344"),
("alerts", "0002_auto_20200815_1618"),
]
operations = [
migrations.AddField(
model_name='alert',
name='assigned_check',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='alert', to='checks.check'),
model_name="alert",
name="assigned_check",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="alert",
to="checks.check",
),
),
migrations.AlterField(
model_name='alert',
name='alert_time',
model_name="alert",
name="alert_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
]

View File

@@ -37,7 +37,7 @@ class Alert(models.Model):
@classmethod
def create_availability_alert(cls, agent):
pass
@classmethod
def create_check_alert(cls, check):
pass

View File

@@ -16,4 +16,4 @@ class AlertSerializer(ModelSerializer):
class Meta:
model = Alert
fields = "__all__"

View File

@@ -2,4 +2,4 @@ from django.apps import AppConfig
class Apiv2Config(AppConfig):
name = 'apiv2'
name = "apiv2"

View File

@@ -61,7 +61,7 @@ class TestAPIv3(TacticalTestCase):
def test_sysinfo(self):
# TODO replace this with golang wmi sample data
url = f"/api/v3/sysinfo/"
url = "/api/v3/sysinfo/"
with open(
os.path.join(
settings.BASE_DIR, "tacticalrmm/test_data/wmi_python_agent.json"
@@ -77,7 +77,7 @@ class TestAPIv3(TacticalTestCase):
self.check_not_authenticated("patch", url)
def test_hello_patch(self):
url = f"/api/v3/hello/"
url = "/api/v3/hello/"
payload = {
"agent_id": self.agent.agent_id,
"logged_in_username": "None",
@@ -92,3 +92,12 @@ class TestAPIv3(TacticalTestCase):
self.assertEqual(r.status_code, 200)
self.check_not_authenticated("patch", url)
@patch("agents.tasks.install_salt_task.delay")
def test_install_salt(self, mock_task):
url = f"/api/v3/{self.agent.agent_id}/installsalt/"
r = self.client.get(url, format="json")
self.assertEqual(r.status_code, 200)
mock_task.assert_called_with(self.agent.pk)
self.check_not_authenticated("get", url)

View File

@@ -17,4 +17,5 @@ urlpatterns = [
path("<str:agentid>/winupdater/", views.WinUpdater.as_view()),
path("software/", views.Software.as_view()),
path("installer/", views.Installer.as_view()),
path("<str:agentid>/installsalt/", views.InstallSalt.as_view()),
]

View File

@@ -30,6 +30,7 @@ from agents.tasks import (
agent_recovery_email_task,
agent_recovery_sms_task,
sync_salt_modules_task,
install_salt_task,
)
from winupdate.tasks import check_for_updates_task
from software.tasks import install_chocolatey
@@ -258,23 +259,11 @@ class CheckRunner(APIView):
return Response(ret)
def patch(self, request):
from logs.models import AuditLog
check = get_object_or_404(Check, pk=request.data["id"])
check.last_run = djangotime.now()
check.save(update_fields=["last_run"])
status = check.handle_checkv2(request.data)
# create audit entry
AuditLog.objects.create(
username=check.agent.hostname,
agent=check.agent.hostname,
object_type="agent",
action="check_run",
message=f"{check.readable_desc} was run on {check.agent.hostname}. Status: {status}",
after_value=Check.serialize(check),
)
return Response(status)
@@ -626,3 +615,13 @@ class Installer(APIView):
)
return Response("ok")
class InstallSalt(APIView):
authentication_classes = [TokenAuthentication]
permission_classes = [IsAuthenticated]
def get(self, request, agentid):
agent = get_object_or_404(Agent, agent_id=agentid)
install_salt_task.delay(agent.pk)
return Response("ok")

View File

@@ -6,11 +6,11 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('automation', '0005_auto_20200922_1344'),
("automation", "0005_auto_20200922_1344"),
]
operations = [
migrations.DeleteModel(
name='PolicyExclusions',
name="PolicyExclusions",
),
]

View File

@@ -1,6 +1,5 @@
from django.db import models
from agents.models import Agent
from clients.models import Site, Client
from core.models import CoreSettings
from logs.models import BaseAuditModel
@@ -58,6 +57,11 @@ class Policy(BaseAuditModel):
@staticmethod
def cascade_policy_tasks(agent):
from autotasks.tasks import delete_win_task_schedule
from autotasks.models import AutomatedTask
from logs.models import PendingAction
# List of all tasks to be applied
tasks = list()
added_task_pks = list()
@@ -80,7 +84,7 @@ class Policy(BaseAuditModel):
default_policy = CoreSettings.objects.first().server_policy
client_policy = client.server_policy
site_policy = site.server_policy
else:
elif agent.monitoring_type == "workstation":
default_policy = CoreSettings.objects.first().workstation_policy
client_policy = client.workstation_policy
site_policy = site.workstation_policy
@@ -107,6 +111,33 @@ class Policy(BaseAuditModel):
tasks.append(task)
added_task_pks.append(task.pk)
# remove policy tasks from agent not included in policy
for task in agent.autotasks.filter(
parent_task__in=[
taskpk
for taskpk in agent_tasks_parent_pks
if taskpk not in added_task_pks
]
):
delete_win_task_schedule.delay(task.pk)
# handle matching tasks that haven't synced to agent yet or pending deletion due to agent being offline
for action in agent.pendingactions.exclude(status="completed"):
task = AutomatedTask.objects.get(pk=action.details["task_id"])
if (
task.parent_task in agent_tasks_parent_pks
and task.parent_task in added_task_pks
):
agent.remove_matching_pending_task_actions(task.id)
PendingAction(
agent=agent,
action_type="taskaction",
details={"action": "taskcreate", "task_id": task.id},
).save()
task.sync_status = "notsynced"
task.save(update_fields=["sync_status"])
return [task for task in tasks if task.pk not in agent_tasks_parent_pks]
@staticmethod
@@ -132,7 +163,7 @@ class Policy(BaseAuditModel):
default_policy = CoreSettings.objects.first().server_policy
client_policy = client.server_policy
site_policy = site.server_policy
else:
elif agent.monitoring_type == "workstation":
default_policy = CoreSettings.objects.first().workstation_policy
client_policy = client.workstation_policy
site_policy = site.workstation_policy
@@ -280,6 +311,15 @@ class Policy(BaseAuditModel):
+ eventlog_checks
)
# remove policy checks from agent that fell out of policy scope
agent.agentchecks.filter(
parent_check__in=[
checkpk
for checkpk in agent_checks_parent_pks
if checkpk not in [check.pk for check in final_list]
]
).delete()
return [
check for check in final_list if check.pk not in agent_checks_parent_pks
]
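The reconciliation in `cascade_policy_tasks` boils down to two set differences between the policy tasks already copied to the agent and the set the current policy traversal produces. A minimal plain-Python sketch (the pk lists are made-up stand-ins for the ORM querysets):

```python
# Hypothetical pks: policy tasks already copied to the agent vs. the set
# produced by the current policy traversal (stand-ins for the querysets).
agent_tasks_parent_pks = [1, 2, 3, 5]
added_task_pks = [2, 3, 7]

# tasks to delete from the agent: copied earlier, no longer in the policy set
to_delete = [pk for pk in agent_tasks_parent_pks if pk not in added_task_pks]

# tasks to create on the agent: in the policy set, not yet copied
to_create = [pk for pk in added_task_pks if pk not in agent_tasks_parent_pks]

print(to_delete)  # [1, 5]
print(to_create)  # [7]
```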

View File

@@ -6,46 +6,42 @@ from tacticalrmm.celery import app
@app.task
def generate_agent_checks_from_policies_task(
###
# copies the policy checks to all affected agents
#
# clear: clears all policy checks first
# create_tasks: also create tasks after checks are generated
###
policypk,
clear=False,
create_tasks=False,
):
def generate_agent_checks_from_policies_task(policypk, create_tasks=False):
policy = Policy.objects.get(pk=policypk)
for agent in policy.related_agents():
agent.generate_checks_from_policies(clear=clear)
if policy.is_default_server_policy and policy.is_default_workstation_policy:
agents = Agent.objects.all()
elif policy.is_default_server_policy:
agents = Agent.objects.filter(monitoring_type="server")
elif policy.is_default_workstation_policy:
agents = Agent.objects.filter(monitoring_type="workstation")
else:
agents = policy.related_agents()
for agent in agents:
agent.generate_checks_from_policies()
if create_tasks:
agent.generate_tasks_from_policies(
clear=clear,
)
agent.generate_tasks_from_policies()
@app.task
def generate_agent_checks_by_location_task(
location, mon_type, clear=False, create_tasks=False
):
def generate_agent_checks_by_location_task(location, mon_type, create_tasks=False):
for agent in Agent.objects.filter(**location).filter(monitoring_type=mon_type):
agent.generate_checks_from_policies(clear=clear)
agent.generate_checks_from_policies()
if create_tasks:
agent.generate_tasks_from_policies(clear=clear)
agent.generate_tasks_from_policies()
@app.task
def generate_all_agent_checks_task(mon_type, clear=False, create_tasks=False):
def generate_all_agent_checks_task(mon_type, create_tasks=False):
for agent in Agent.objects.filter(monitoring_type=mon_type):
agent.generate_checks_from_policies(clear=clear)
agent.generate_checks_from_policies()
if create_tasks:
agent.generate_tasks_from_policies(clear=clear)
agent.generate_tasks_from_policies()
@app.task
@@ -83,18 +79,28 @@ def update_policy_check_fields_task(checkpk):
@app.task
def generate_agent_tasks_from_policies_task(policypk, clear=False):
def generate_agent_tasks_from_policies_task(policypk):
policy = Policy.objects.get(pk=policypk)
for agent in policy.related_agents():
agent.generate_tasks_from_policies(clear=clear)
if policy.is_default_server_policy and policy.is_default_workstation_policy:
agents = Agent.objects.all()
elif policy.is_default_server_policy:
agents = Agent.objects.filter(monitoring_type="server")
elif policy.is_default_workstation_policy:
agents = Agent.objects.filter(monitoring_type="workstation")
else:
agents = policy.related_agents()
for agent in agents:
agent.generate_tasks_from_policies()
@app.task
def generate_agent_tasks_by_location_task(location, mon_type, clear=False):
def generate_agent_tasks_by_location_task(location, mon_type):
for agent in Agent.objects.filter(**location).filter(monitoring_type=mon_type):
agent.generate_tasks_from_policies(clear=clear)
agent.generate_tasks_from_policies()
@app.task
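The new branching in these tasks selects the affected agents from the default-policy flags before regenerating checks, instead of always using `related_agents()`. A rough stand-in using plain dicts (field names assumed from the diff, not the actual ORM API):

```python
def agents_for_policy(policy, all_agents):
    # a policy that is the default for both monitoring types affects everyone
    if policy["is_default_server_policy"] and policy["is_default_workstation_policy"]:
        return all_agents
    if policy["is_default_server_policy"]:
        return [a for a in all_agents if a["monitoring_type"] == "server"]
    if policy["is_default_workstation_policy"]:
        return [a for a in all_agents if a["monitoring_type"] == "workstation"]
    # otherwise only agents directly related to the policy
    return [a for a in all_agents if a.get("policy") == policy["pk"]]

agents = [
    {"monitoring_type": "server", "policy": None},
    {"monitoring_type": "workstation", "policy": 1},
]
policy = {"pk": 1, "is_default_server_policy": True,
          "is_default_workstation_policy": False}
print(len(agents_for_policy(policy, agents)))  # 1
```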

View File

@@ -121,9 +121,7 @@ class TestPolicyViews(TacticalTestCase):
resp = self.client.put(url, data, format="json")
self.assertEqual(resp.status_code, 200)
mock_checks_task.assert_called_with(
policypk=policy.pk, clear=True, create_tasks=True
)
mock_checks_task.assert_called_with(policypk=policy.pk, create_tasks=True)
self.check_not_authenticated("put", url)
@@ -140,8 +138,8 @@ class TestPolicyViews(TacticalTestCase):
resp = self.client.delete(url, format="json")
self.assertEqual(resp.status_code, 200)
mock_checks_task.assert_called_with(policypk=policy.pk, clear=True)
mock_tasks_task.assert_called_with(policypk=policy.pk, clear=True)
mock_checks_task.assert_called_with(policypk=policy.pk)
mock_tasks_task.assert_called_with(policypk=policy.pk)
self.check_not_authenticated("delete", url)
@@ -298,7 +296,6 @@ class TestPolicyViews(TacticalTestCase):
mock_checks_location_task.assert_called_with(
location={"site__client_id": client.id},
mon_type="server",
clear=True,
create_tasks=True,
)
mock_checks_location_task.reset_mock()
@@ -311,7 +308,6 @@ class TestPolicyViews(TacticalTestCase):
mock_checks_location_task.assert_called_with(
location={"site__client_id": client.id},
mon_type="workstation",
clear=True,
create_tasks=True,
)
mock_checks_location_task.reset_mock()
@@ -324,7 +320,6 @@ class TestPolicyViews(TacticalTestCase):
mock_checks_location_task.assert_called_with(
location={"site_id": site.id},
mon_type="server",
clear=True,
create_tasks=True,
)
mock_checks_location_task.reset_mock()
@@ -337,7 +332,6 @@ class TestPolicyViews(TacticalTestCase):
mock_checks_location_task.assert_called_with(
location={"site_id": site.id},
mon_type="workstation",
clear=True,
create_tasks=True,
)
mock_checks_location_task.reset_mock()
@@ -347,7 +341,7 @@ class TestPolicyViews(TacticalTestCase):
self.assertEqual(resp.status_code, 200)
# called because the relation changed
mock_checks_task.assert_called_with(clear=True)
mock_checks_task.assert_called()
mock_checks_task.reset_mock()
# Adding the same relations shouldn't trigger mocks
@@ -396,7 +390,6 @@ class TestPolicyViews(TacticalTestCase):
mock_checks_location_task.assert_called_with(
location={"site__client_id": client.id},
mon_type="server",
clear=True,
create_tasks=True,
)
mock_checks_location_task.reset_mock()
@@ -409,7 +402,6 @@ class TestPolicyViews(TacticalTestCase):
mock_checks_location_task.assert_called_with(
location={"site__client_id": client.id},
mon_type="workstation",
clear=True,
create_tasks=True,
)
mock_checks_location_task.reset_mock()
@@ -422,7 +414,6 @@ class TestPolicyViews(TacticalTestCase):
mock_checks_location_task.assert_called_with(
location={"site_id": site.id},
mon_type="server",
clear=True,
create_tasks=True,
)
mock_checks_location_task.reset_mock()
@@ -435,7 +426,6 @@ class TestPolicyViews(TacticalTestCase):
mock_checks_location_task.assert_called_with(
location={"site_id": site.id},
mon_type="workstation",
clear=True,
create_tasks=True,
)
mock_checks_location_task.reset_mock()
@@ -444,7 +434,7 @@ class TestPolicyViews(TacticalTestCase):
resp = self.client.post(url, agent_payload, format="json")
self.assertEqual(resp.status_code, 200)
# called because the relation changed
mock_checks_task.assert_called_with(clear=True)
mock_checks_task.assert_called()
mock_checks_task.reset_mock()
# adding the same relations shouldn't trigger mocks
@@ -753,7 +743,7 @@ class TestPolicyTasks(TacticalTestCase):
agent = baker.make_recipe("agents.agent", site=site, policy=policy)
# test policy assigned to agent
generate_agent_checks_from_policies_task(policy.id, clear=True)
generate_agent_checks_from_policies_task(policy.id)
# make sure all checks were created. should be 7
agent_checks = Agent.objects.get(pk=agent.id).agentchecks.all()
@@ -832,7 +822,6 @@ class TestPolicyTasks(TacticalTestCase):
generate_agent_checks_by_location_task(
{"site_id": sites[0].id},
"server",
clear=True,
create_tasks=True,
)
@@ -846,7 +835,6 @@ class TestPolicyTasks(TacticalTestCase):
generate_agent_checks_by_location_task(
{"site__client_id": clients[0].id},
"workstation",
clear=True,
create_tasks=True,
)
# workstation_agent should now have policy checks and the other agents should not
@@ -875,7 +863,7 @@ class TestPolicyTasks(TacticalTestCase):
core.workstation_policy = policy
core.save()
generate_all_agent_checks_task("server", clear=True, create_tasks=True)
generate_all_agent_checks_task("server", create_tasks=True)
# all servers should have 7 checks
for agent in server_agents:
@@ -884,7 +872,7 @@ class TestPolicyTasks(TacticalTestCase):
for agent in workstation_agents:
self.assertEqual(Agent.objects.get(pk=agent.id).agentchecks.count(), 0)
generate_all_agent_checks_task("workstation", clear=True, create_tasks=True)
generate_all_agent_checks_task("workstation", create_tasks=True)
# all agents should have 7 checks now
for agent in server_agents:
@@ -961,7 +949,7 @@ class TestPolicyTasks(TacticalTestCase):
site = baker.make("clients.Site")
agent = baker.make_recipe("agents.server_agent", site=site, policy=policy)
generate_agent_tasks_from_policies_task(policy.id, clear=True)
generate_agent_tasks_from_policies_task(policy.id)
agent_tasks = Agent.objects.get(pk=agent.id).autotasks.all()
@@ -1000,9 +988,7 @@ class TestPolicyTasks(TacticalTestCase):
agent1 = baker.make_recipe("agents.agent", site=sites[1])
agent2 = baker.make_recipe("agents.agent", site=sites[3])
generate_agent_tasks_by_location_task(
{"site_id": sites[0].id}, "server", clear=True
)
generate_agent_tasks_by_location_task({"site_id": sites[0].id}, "server")
# all servers in site1 and site2 should have 3 tasks
self.assertEqual(
@@ -1013,7 +999,7 @@ class TestPolicyTasks(TacticalTestCase):
self.assertEqual(Agent.objects.get(pk=agent2.id).autotasks.count(), 0)
generate_agent_tasks_by_location_task(
{"site__client_id": clients[0].id}, "workstation", clear=True
{"site__client_id": clients[0].id}, "workstation"
)
# all workstations in Default1 should have 3 tasks

View File

@@ -83,7 +83,6 @@ class GetUpdateDeletePolicy(APIView):
if saved_policy.active != old_active or saved_policy.enforced != old_enforced:
generate_agent_checks_from_policies_task.delay(
policypk=policy.pk,
clear=(not saved_policy.active or not saved_policy.enforced),
create_tasks=(saved_policy.active != old_active),
)
@@ -93,8 +92,8 @@ class GetUpdateDeletePolicy(APIView):
policy = get_object_or_404(Policy, pk=pk)
# delete all managed policy checks off of agents
generate_agent_checks_from_policies_task.delay(policypk=policy.pk, clear=True)
generate_agent_tasks_from_policies_task.delay(policypk=policy.pk, clear=True)
generate_agent_checks_from_policies_task.delay(policypk=policy.pk)
generate_agent_tasks_from_policies_task.delay(policypk=policy.pk)
policy.delete()
return Response("ok")
@@ -218,7 +217,6 @@ class GetRelated(APIView):
generate_agent_checks_by_location_task.delay(
location={"site__client_id": client.id},
mon_type="workstation",
clear=True,
create_tasks=True,
)
@@ -236,7 +234,6 @@ class GetRelated(APIView):
generate_agent_checks_by_location_task.delay(
location={"site_id": site.id},
mon_type="workstation",
clear=True,
create_tasks=True,
)
@@ -258,7 +255,6 @@ class GetRelated(APIView):
generate_agent_checks_by_location_task.delay(
location={"site__client_id": client.id},
mon_type="server",
clear=True,
create_tasks=True,
)
@@ -276,7 +272,6 @@ class GetRelated(APIView):
generate_agent_checks_by_location_task.delay(
location={"site_id": site.id},
mon_type="server",
clear=True,
create_tasks=True,
)
@@ -296,7 +291,6 @@ class GetRelated(APIView):
generate_agent_checks_by_location_task.delay(
location={"site__client_id": client.id},
mon_type="workstation",
clear=True,
create_tasks=True,
)
@@ -311,7 +305,6 @@ class GetRelated(APIView):
generate_agent_checks_by_location_task.delay(
location={"site_id": site.id},
mon_type="workstation",
clear=True,
create_tasks=True,
)
@@ -329,7 +322,6 @@ class GetRelated(APIView):
generate_agent_checks_by_location_task.delay(
location={"site__client_id": client.id},
mon_type="server",
clear=True,
create_tasks=True,
)
@@ -343,7 +335,6 @@ class GetRelated(APIView):
generate_agent_checks_by_location_task.delay(
location={"site_id": site.pk},
mon_type="server",
clear=True,
create_tasks=True,
)
@@ -358,14 +349,14 @@ class GetRelated(APIView):
if not agent.policy or agent.policy and agent.policy.pk != policy.pk:
agent.policy = policy
agent.save()
agent.generate_checks_from_policies(clear=True)
agent.generate_tasks_from_policies(clear=True)
agent.generate_checks_from_policies()
agent.generate_tasks_from_policies()
else:
if agent.policy:
agent.policy = None
agent.save()
agent.generate_checks_from_policies(clear=True)
agent.generate_tasks_from_policies(clear=True)
agent.generate_checks_from_policies()
agent.generate_tasks_from_policies()
return Response("ok")

View File

@@ -6,13 +6,13 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('autotasks', '0008_auto_20201030_1515'),
("autotasks", "0008_auto_20201030_1515"),
]
operations = [
migrations.AddField(
model_name='automatedtask',
name='run_time_bit_weekdays',
model_name="automatedtask",
name="run_time_bit_weekdays",
field=models.IntegerField(blank=True, null=True),
),
]

View File

@@ -6,7 +6,6 @@ import datetime as dt
from django.db import models
from django.contrib.postgres.fields import ArrayField
from django.db.models.fields import DateTimeField
from automation.models import Policy
from logs.models import BaseAuditModel
from tacticalrmm.utils import bitdays_to_string
@@ -43,7 +42,7 @@ class AutomatedTask(BaseAuditModel):
blank=True,
)
policy = models.ForeignKey(
Policy,
"automation.Policy",
related_name="autotasks",
null=True,
blank=True,

View File

@@ -76,9 +76,14 @@ def create_win_task_schedule(pk, pending_action=False):
return "error"
r = asyncio.run(task.agent.nats_cmd(nats_data, timeout=10))
if r != "ok":
# don't create pending action if this task was initiated by a pending action
if not pending_action:
# complete any other pending actions on agent with same task_id
task.agent.remove_matching_pending_task_actions(task.id)
PendingAction(
agent=task.agent,
action_type="taskaction",
@@ -144,6 +149,7 @@ def enable_or_disable_win_task(pk, action, pending_action=False):
task.sync_status = "synced"
task.save(update_fields=["sync_status"])
return "ok"
@@ -157,9 +163,13 @@ def delete_win_task_schedule(pk, pending_action=False):
}
r = asyncio.run(task.agent.nats_cmd(nats_data, timeout=10))
if r != "ok":
if r != "ok" and "The system cannot find the file specified" not in r:
# don't create pending action if this task was initiated by a pending action
if not pending_action:
# complete any other pending actions on agent with same task_id
task.agent.remove_matching_pending_task_actions(task.id)
PendingAction(
agent=task.agent,
action_type="taskaction",
@@ -168,7 +178,7 @@ def delete_win_task_schedule(pk, pending_action=False):
task.sync_status = "pendingdeletion"
task.save(update_fields=["sync_status"])
return
return "timeout"
# complete pending action since it was successful
if pending_action:
@@ -176,6 +186,9 @@ def delete_win_task_schedule(pk, pending_action=False):
pendingaction.status = "completed"
pendingaction.save(update_fields=["status"])
# complete any other pending actions on agent with same task_id
task.agent.remove_matching_pending_task_actions(task.id)
task.delete()
return "ok"
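The guard added here avoids a retry loop: a failed NATS call queues a pending action only when the run was not itself triggered by one, and a "file not found" reply is treated as already deleted. A simplified decision sketch (the return labels are illustrative, not the task's real return values):

```python
FILE_NOT_FOUND = "The system cannot find the file specified"

def decide(nats_result, pending_action):
    # a missing scheduled task on the agent counts as already deleted
    if nats_result != "ok" and FILE_NOT_FOUND not in nats_result:
        if not pending_action:
            # queue a retry for when the agent comes back online
            return "queue_pending_action"
        return "timeout"
    return "proceed_with_delete"

print(decide("timeout waiting for agent", False))  # queue_pending_action
print(decide(FILE_NOT_FOUND, True))                # proceed_with_delete
```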

View File

@@ -1,5 +1,6 @@
from django.contrib import admin
from .models import Check
from .models import Check, CheckHistory
admin.site.register(Check)
admin.site.register(CheckHistory)

View File

@@ -0,0 +1,30 @@
# Generated by Django 3.1.4 on 2021-01-09 02:56
import django.contrib.postgres.fields
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("checks", "0010_auto_20200922_1344"),
]
operations = [
migrations.AddField(
model_name="check",
name="run_history",
field=django.contrib.postgres.fields.ArrayField(
base_field=django.contrib.postgres.fields.ArrayField(
base_field=models.PositiveIntegerField(),
blank=True,
null=True,
size=None,
),
blank=True,
default=list,
null=True,
size=None,
),
),
]

View File

@@ -0,0 +1,39 @@
# Generated by Django 3.1.4 on 2021-01-09 21:36
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
("checks", "0010_auto_20200922_1344"),
]
operations = [
migrations.CreateModel(
name="CheckHistory",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("x", models.DateTimeField()),
("y", models.PositiveIntegerField()),
("results", models.JSONField(blank=True, null=True)),
(
"check_history",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="check_history",
to="checks.check",
),
),
],
),
]

View File

@@ -0,0 +1,18 @@
# Generated by Django 3.1.4 on 2021-01-10 05:03
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("checks", "0011_checkhistory"),
]
operations = [
migrations.AlterField(
model_name="checkhistory",
name="y",
field=models.PositiveIntegerField(blank=True, null=True),
),
]

View File

@@ -0,0 +1,18 @@
# Generated by Django 3.1.4 on 2021-01-10 05:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("checks", "0012_auto_20210110_0503"),
]
operations = [
migrations.AlterField(
model_name="checkhistory",
name="y",
field=models.PositiveIntegerField(null=True),
),
]

View File

@@ -0,0 +1,13 @@
# Generated by Django 3.1.4 on 2021-01-10 18:08
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("checks", "0013_auto_20210110_0505"),
("checks", "0011_check_run_history"),
]
operations = []

View File

@@ -0,0 +1,27 @@
# Generated by Django 3.1.4 on 2021-01-10 18:08
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("checks", "0014_merge_20210110_1808"),
]
operations = [
migrations.RemoveField(
model_name="check",
name="run_history",
),
migrations.AlterField(
model_name="checkhistory",
name="x",
field=models.DateTimeField(auto_now_add=True),
),
migrations.AlterField(
model_name="checkhistory",
name="y",
field=models.PositiveIntegerField(blank=True, default=None, null=True),
),
]

View File

@@ -3,12 +3,13 @@ import string
import os
import json
import pytz
from statistics import mean
from statistics import mean, mode
from django.db import models
from django.conf import settings
from django.contrib.postgres.fields import ArrayField
from django.core.validators import MinValueValidator, MaxValueValidator
from rest_framework.fields import JSONField
from core.models import CoreSettings
from logs.models import BaseAuditModel
@@ -214,6 +215,9 @@ class Check(BaseAuditModel):
"modified_time",
]
def add_check_history(self, value, more_info=None):
CheckHistory.objects.create(check_history=self, y=value, results=more_info)
def handle_checkv2(self, data):
# cpuload or mem checks
if self.check_type == "cpuload" or self.check_type == "memory":
@@ -232,6 +236,9 @@ class Check(BaseAuditModel):
else:
self.status = "passing"
# add check history
self.add_check_history(data["percent"])
# diskspace checks
elif self.check_type == "diskspace":
if data["exists"]:
@@ -245,6 +252,9 @@ class Check(BaseAuditModel):
self.status = "passing"
self.more_info = f"Total: {total}B, Free: {free}B"
# add check history
self.add_check_history(percent_used)
else:
self.status = "failing"
self.more_info = f"Disk {self.disk} does not exist"
@@ -277,6 +287,17 @@ class Check(BaseAuditModel):
]
)
# add check history
self.add_check_history(
1 if self.status == "failing" else 0,
{
"retcode": data["retcode"],
"stdout": data["stdout"][:60],
"stderr": data["stderr"][:60],
"execution_time": self.execution_time,
},
)
# ping checks
elif self.check_type == "ping":
success = ["Reply", "bytes", "time", "TTL"]
@@ -293,6 +314,10 @@ class Check(BaseAuditModel):
self.more_info = output
self.save(update_fields=["more_info"])
self.add_check_history(
1 if self.status == "failing" else 0, self.more_info[:60]
)
# windows service checks
elif self.check_type == "winsvc":
svc_stat = data["status"]
@@ -332,6 +357,10 @@ class Check(BaseAuditModel):
self.save(update_fields=["more_info"])
self.add_check_history(
1 if self.status == "failing" else 0, self.more_info[:60]
)
elif self.check_type == "eventlog":
log = []
is_wildcard = self.event_id_is_wildcard
@@ -391,6 +420,11 @@ class Check(BaseAuditModel):
self.extra_details = {"log": log}
self.save(update_fields=["extra_details"])
self.add_check_history(
1 if self.status == "failing" else 0,
"Events Found:" + str(len(self.extra_details["log"])),
)
# handle status
if self.status == "failing":
self.fail_count += 1
@@ -645,3 +679,17 @@ class Check(BaseAuditModel):
body = subject
CORE.send_sms(body)
class CheckHistory(models.Model):
check_history = models.ForeignKey(
Check,
related_name="check_history",
on_delete=models.CASCADE,
)
x = models.DateTimeField(auto_now_add=True)
y = models.PositiveIntegerField(null=True, blank=True, default=None)
results = models.JSONField(null=True, blank=True)
def __str__(self):
return self.check_history.readable_desc
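Each check result now lands in `CheckHistory` as an `(x, y, results)` point: a timestamp, a numeric value (a percentage, or 1/0 for fail/pass), and optional trimmed output. A plain-dict sketch of the point `add_check_history` stores for script checks (field shapes inferred from the diff):

```python
def history_point(status, retcode, stdout, stderr, execution_time):
    # y encodes pass/fail; stdout/stderr are trimmed to 60 chars as in the diff
    return {
        "y": 1 if status == "failing" else 0,
        "results": {
            "retcode": retcode,
            "stdout": stdout[:60],
            "stderr": stderr[:60],
            "execution_time": execution_time,
        },
    }

point = history_point("failing", 1, "x" * 100, "", "1.20")
print(point["y"], len(point["results"]["stdout"]))  # 1 60
```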

View File

@@ -1,8 +1,8 @@
import validators as _v
import pytz
from rest_framework import serializers
from .models import Check
from .models import Check, CheckHistory
from autotasks.models import AutomatedTask
from scripts.serializers import ScriptSerializer, ScriptCheckSerializer
@@ -65,6 +65,26 @@ class CheckSerializer(serializers.ModelSerializer):
"Please enter a valid IP address or domain name"
)
if check_type == "cpuload" and not self.instance:
if (
Check.objects.filter(**self.context, check_type="cpuload")
.exclude(managed_by_policy=True)
.exists()
):
raise serializers.ValidationError(
"A cpuload check for this agent already exists"
)
if check_type == "memory" and not self.instance:
if (
Check.objects.filter(**self.context, check_type="memory")
.exclude(managed_by_policy=True)
.exists()
):
raise serializers.ValidationError(
"A memory check for this agent already exists"
)
return val
@@ -217,3 +237,15 @@ class CheckResultsSerializer(serializers.ModelSerializer):
class Meta:
model = Check
fields = "__all__"
class CheckHistorySerializer(serializers.ModelSerializer):
x = serializers.SerializerMethodField()
def get_x(self, obj):
return obj.x.astimezone(pytz.timezone(self.context["timezone"])).isoformat()
# used for returning large amounts of graph data
class Meta:
model = CheckHistory
fields = ("x", "y", "results")
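The serializer change above enforces at most one non-policy cpuload and one non-policy memory check per agent. A stand-alone sketch of that validation rule over plain data (not the DRF serializer itself):

```python
def validate_unique_check(existing_checks, check_type):
    # only cpuload and memory checks are limited to one per agent;
    # checks managed by a policy are excluded from the duplicate test
    if check_type in ("cpuload", "memory"):
        dupes = [
            c for c in existing_checks
            if c["check_type"] == check_type and not c["managed_by_policy"]
        ]
        if dupes:
            raise ValueError(f"A {check_type} check for this agent already exists")

validate_unique_check([], "cpuload")  # passes: no existing check
```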

View File

@@ -5,8 +5,6 @@ from time import sleep
from tacticalrmm.celery import app
from django.utils import timezone as djangotime
from agents.models import Agent
@app.task
def handle_check_email_alert_task(pk):
@@ -56,3 +54,15 @@ def handle_check_sms_alert_task(pk):
check.save(update_fields=["text_sent"])
return "ok"
@app.task
def prune_check_history(older_than_days: int) -> str:
from .models import CheckHistory
CheckHistory.objects.filter(
x__lt=djangotime.make_aware(dt.datetime.today())
- djangotime.timedelta(days=older_than_days)
).delete()
return "ok"
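`prune_check_history` deletes every history point older than the cutoff. The same windowing with plain datetimes, stripped of the ORM (a sketch, not the task itself):

```python
import datetime as dt

def prune(history_dates, older_than_days):
    # keep only points at or after the cutoff; with all-past data,
    # older_than_days=0 prunes everything (cutoff is "now")
    cutoff = dt.datetime.today() - dt.timedelta(days=older_than_days)
    return [x for x in history_dates if x >= cutoff]

now = dt.datetime.today()
dates = [now - dt.timedelta(days=d) for d in (1, 10, 35)]
print(len(prune(dates, 30)))  # 2
```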

View File

@@ -1,5 +1,7 @@
from checks.models import CheckHistory
from tacticalrmm.test import TacticalTestCase
from .serializers import CheckSerializer
from django.utils import timezone as djangotime
from model_bakery import baker
from itertools import cycle
@@ -8,6 +10,7 @@ from itertools import cycle
class TestCheckViews(TacticalTestCase):
def setUp(self):
self.authenticate()
self.setup_coresettings()
def test_get_disk_check(self):
# setup data
@@ -55,6 +58,52 @@ class TestCheckViews(TacticalTestCase):
resp = self.client.post(url, invalid_payload, format="json")
self.assertEqual(resp.status_code, 400)
def test_add_cpuload_check(self):
url = "/checks/checks/"
agent = baker.make_recipe("agents.agent")
payload = {
"pk": agent.pk,
"check": {
"check_type": "cpuload",
"threshold": 66,
"fails_b4_alert": 9,
},
}
resp = self.client.post(url, payload, format="json")
self.assertEqual(resp.status_code, 200)
payload["threshold"] = 87
resp = self.client.post(url, payload, format="json")
self.assertEqual(resp.status_code, 400)
self.assertEqual(
resp.json()["non_field_errors"][0],
"A cpuload check for this agent already exists",
)
def test_add_memory_check(self):
url = "/checks/checks/"
agent = baker.make_recipe("agents.agent")
payload = {
"pk": agent.pk,
"check": {
"check_type": "memory",
"threshold": 78,
"fails_b4_alert": 1,
},
}
resp = self.client.post(url, payload, format="json")
self.assertEqual(resp.status_code, 200)
payload["threshold"] = 55
resp = self.client.post(url, payload, format="json")
self.assertEqual(resp.status_code, 400)
self.assertEqual(
resp.json()["non_field_errors"][0],
"A memory check for this agent already exists",
)
def test_get_policy_disk_check(self):
# setup data
policy = baker.make("automation.Policy")
@@ -134,3 +183,69 @@ class TestCheckViews(TacticalTestCase):
self.assertEqual(resp.status_code, 200)
self.check_not_authenticated("patch", url_a)
def test_get_check_history(self):
# setup data
agent = baker.make_recipe("agents.agent")
check = baker.make_recipe("checks.diskspace_check", agent=agent)
baker.make("checks.CheckHistory", check_history=check, _quantity=30)
check_history_data = baker.make(
"checks.CheckHistory",
check_history=check,
_quantity=30,
)
# need to manually set the date back 35 days
for check_history in check_history_data:
check_history.x = djangotime.now() - djangotime.timedelta(days=35)
check_history.save()
# test invalid check pk
resp = self.client.patch("/checks/history/500/", format="json")
self.assertEqual(resp.status_code, 404)
url = f"/checks/history/{check.id}/"
# test with timeFilter last 30 days
data = {"timeFilter": 30}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(len(resp.data), 30)
# test with timeFilter equal to 0
data = {"timeFilter": 0}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(len(resp.data), 60)
self.check_not_authenticated("patch", url)
class TestCheckTasks(TacticalTestCase):
def setUp(self):
self.setup_coresettings()
def test_prune_check_history(self):
from .tasks import prune_check_history
# setup data
check = baker.make_recipe("checks.diskspace_check")
baker.make("checks.CheckHistory", check_history=check, _quantity=30)
check_history_data = baker.make(
"checks.CheckHistory",
check_history=check,
_quantity=30,
)
# need to manually set the date back 35 days
for check_history in check_history_data:
check_history.x = djangotime.now() - djangotime.timedelta(days=35)
check_history.save()
# prune data 30 days old
prune_check_history(30)
self.assertEqual(CheckHistory.objects.count(), 30)
# prune all Check history Data
prune_check_history(0)
self.assertEqual(CheckHistory.objects.count(), 0)

View File

@@ -7,4 +7,5 @@ urlpatterns = [
path("<pk>/loadchecks/", views.load_checks),
path("getalldisks/", views.get_disks_for_policies),
path("runchecks/<pk>/", views.run_checks),
path("history/<int:checkpk>/", views.CheckHistory.as_view()),
]

View File

@@ -1,6 +1,10 @@
import asyncio
from django.shortcuts import get_object_or_404
from django.db.models import Q
from django.utils import timezone as djangotime
from datetime import datetime as dt
from rest_framework.views import APIView
from rest_framework.response import Response
@@ -13,7 +17,7 @@ from automation.models import Policy
from .models import Check
from scripts.models import Script
from .serializers import CheckSerializer
from .serializers import CheckSerializer, CheckHistorySerializer
from automation.tasks import (
@@ -135,6 +139,29 @@ class GetUpdateDeleteCheck(APIView):
return Response(f"{check.readable_desc} was deleted!")
class CheckHistory(APIView):
def patch(self, request, checkpk):
check = get_object_or_404(Check, pk=checkpk)
timeFilter = Q()
if "timeFilter" in request.data:
if request.data["timeFilter"] != 0:
timeFilter = Q(
x__lte=djangotime.make_aware(dt.today()),
x__gt=djangotime.make_aware(dt.today())
- djangotime.timedelta(days=request.data["timeFilter"]),
)
check_history = check.check_history.filter(timeFilter).order_by("-x")
return Response(
CheckHistorySerializer(
check_history, context={"timezone": check.agent.timezone}, many=True
).data
)
@api_view()
def run_checks(request, pk):
agent = get_object_or_404(Agent, pk=pk)
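The `CheckHistory` view builds its `Q` filter as a half-open window: `timeFilter == 0` means no filtering, otherwise only points newer than N days are returned. The same predicate with plain datetimes (a sketch mirroring the Q objects above):

```python
import datetime as dt

def in_window(x, time_filter_days, now=None):
    # timeFilter == 0 means no filtering at all
    if time_filter_days == 0:
        return True
    now = now or dt.datetime.today()
    # mirrors x__gt=now - days, x__lte=now from the view
    return now - dt.timedelta(days=time_filter_days) < x <= now

now = dt.datetime(2021, 1, 17)
print(in_window(now - dt.timedelta(days=10), 30, now))  # True
print(in_window(now - dt.timedelta(days=40), 30, now))  # False
```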

View File

@@ -6,48 +6,48 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('clients', '0004_auto_20200821_2115'),
("clients", "0004_auto_20200821_2115"),
]
operations = [
migrations.AddField(
model_name='client',
name='created_by',
model_name="client",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='client',
name='created_time',
model_name="client",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='client',
name='modified_by',
model_name="client",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='client',
name='modified_time',
model_name="client",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='site',
name='created_by',
model_name="site",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='site',
name='created_time',
model_name="site",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='site',
name='modified_by',
model_name="site",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='site',
name='modified_time',
model_name="site",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]

View File

@@ -8,24 +8,67 @@ import uuid
class Migration(migrations.Migration):
dependencies = [
('knox', '0007_auto_20190111_0542'),
('clients', '0005_auto_20200922_1344'),
("knox", "0007_auto_20190111_0542"),
("clients", "0005_auto_20200922_1344"),
]
operations = [
migrations.CreateModel(
name='Deployment',
name="Deployment",
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('uid', models.UUIDField(default=uuid.uuid4, editable=False)),
('mon_type', models.CharField(choices=[('server', 'Server'), ('workstation', 'Workstation')], default='server', max_length=255)),
('arch', models.CharField(choices=[('64', '64 bit'), ('32', '32 bit')], default='64', max_length=255)),
('expiry', models.DateTimeField(blank=True, null=True)),
('token_key', models.CharField(max_length=255)),
('install_flags', models.JSONField(blank=True, null=True)),
('auth_token', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='deploytokens', to='knox.authtoken')),
('client', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='deployclients', to='clients.client')),
('site', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='deploysites', to='clients.site')),
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("uid", models.UUIDField(default=uuid.uuid4, editable=False)),
(
"mon_type",
models.CharField(
choices=[("server", "Server"), ("workstation", "Workstation")],
default="server",
max_length=255,
),
),
(
"arch",
models.CharField(
choices=[("64", "64 bit"), ("32", "32 bit")],
default="64",
max_length=255,
),
),
("expiry", models.DateTimeField(blank=True, null=True)),
("token_key", models.CharField(max_length=255)),
("install_flags", models.JSONField(blank=True, null=True)),
(
"auth_token",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="deploytokens",
to="knox.authtoken",
),
),
(
"client",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="deployclients",
to="clients.client",
),
),
(
"site",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="deploysites",
to="clients.site",
),
),
],
),
]
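The Deployment model above pairs a uuid4 `uid` with a nullable `expiry` timestamp. A minimal sketch of how an expiry check on such a row might look (hypothetical helper, not part of this diff):

```python
import uuid
from datetime import datetime, timedelta, timezone

def deployment_is_expired(expiry):
    """True once a deployment's optional expiry has passed; None means no expiry."""
    if expiry is None:
        return False
    return datetime.now(timezone.utc) > expiry

# A fresh deployment token: unique uid, valid for one hour
uid = uuid.uuid4()
one_hour_out = datetime.now(timezone.utc) + timedelta(hours=1)
```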


@@ -6,18 +6,18 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('clients', '0006_deployment'),
("clients", "0006_deployment"),
]
operations = [
migrations.RenameField(
model_name='client',
old_name='client',
new_name='name',
model_name="client",
old_name="client",
new_name="name",
),
migrations.RenameField(
model_name='site',
old_name='site',
new_name='name',
model_name="site",
old_name="site",
new_name="name",
),
]


@@ -6,16 +6,16 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('clients', '0007_auto_20201102_1920'),
("clients", "0007_auto_20201102_1920"),
]
operations = [
migrations.AlterModelOptions(
name='client',
options={'ordering': ('name',)},
name="client",
options={"ordering": ("name",)},
),
migrations.AlterModelOptions(
name='site',
options={'ordering': ('name',)},
name="site",
options={"ordering": ("name",)},
),
]


@@ -192,7 +192,7 @@ class GenerateAgent(APIView):
if not os.path.exists(go_bin):
return notify_error("Missing golang")
api = f"{request.scheme}://{request.get_host()}"
api = f"https://{request.get_host()}"
inno = (
f"winagent-v{settings.LATEST_AGENT_VER}.exe"
if d.arch == "64"
@@ -282,4 +282,4 @@ class GenerateAgent(APIView):
response = HttpResponse()
response["Content-Disposition"] = f"attachment; filename={file_name}"
response["X-Accel-Redirect"] = f"/private/exe/{file_name}"
return response
return response


@@ -6,4 +6,4 @@ class Command(BaseCommand):
help = "Reload Nats"
def handle(self, *args, **kwargs):
reload_nats()
reload_nats()


@@ -6,13 +6,13 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0008_auto_20200910_1434'),
("core", "0008_auto_20200910_1434"),
]
operations = [
migrations.AddField(
model_name='coresettings',
name='agent_auto_update',
model_name="coresettings",
name="agent_auto_update",
field=models.BooleanField(default=True),
),
]


@@ -6,28 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0009_coresettings_agent_auto_update'),
("core", "0009_coresettings_agent_auto_update"),
]
operations = [
migrations.AddField(
model_name='coresettings',
name='created_by',
model_name="coresettings",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='coresettings',
name='created_time',
model_name="coresettings",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='coresettings',
name='modified_by',
model_name="coresettings",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='coresettings',
name='modified_time',
model_name="coresettings",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]


@@ -7,28 +7,34 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0010_auto_20201002_1257'),
("core", "0010_auto_20201002_1257"),
]
operations = [
migrations.AddField(
model_name='coresettings',
name='sms_alert_recipients',
field=django.contrib.postgres.fields.ArrayField(base_field=models.CharField(blank=True, max_length=255, null=True), blank=True, default=list, null=True, size=None),
model_name="coresettings",
name="sms_alert_recipients",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(blank=True, max_length=255, null=True),
blank=True,
default=list,
null=True,
size=None,
),
),
migrations.AddField(
model_name='coresettings',
name='twilio_account_sid',
model_name="coresettings",
name="twilio_account_sid",
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AddField(
model_name='coresettings',
name='twilio_auth_token',
model_name="coresettings",
name="twilio_auth_token",
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AddField(
model_name='coresettings',
name='twilio_number',
model_name="coresettings",
name="twilio_number",
field=models.CharField(blank=True, max_length=255, null=True),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 3.1.4 on 2021-01-10 18:08
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("core", "0011_auto_20201026_0719"),
]
operations = [
migrations.AddField(
model_name="coresettings",
name="check_history_prune_days",
field=models.PositiveIntegerField(default=30),
),
]


@@ -49,6 +49,8 @@ class CoreSettings(BaseAuditModel):
default_time_zone = models.CharField(
max_length=255, choices=TZ_CHOICES, default="America/Los_Angeles"
)
# removes check history older than days
check_history_prune_days = models.PositiveIntegerField(default=30)
mesh_token = models.CharField(max_length=255, null=True, blank=True, default="")
mesh_username = models.CharField(max_length=255, null=True, blank=True, default="")
mesh_site = models.CharField(max_length=255, null=True, blank=True, default="")


@@ -4,8 +4,10 @@ from loguru import logger
from django.conf import settings
from django.utils import timezone as djangotime
from tacticalrmm.celery import app
from core.models import CoreSettings
from autotasks.models import AutomatedTask
from autotasks.tasks import delete_win_task_schedule
from checks.tasks import prune_check_history
logger.configure(**settings.LOG_CONFIG)
@@ -25,3 +27,7 @@ def core_maintenance_tasks():
if now > task_time_utc:
delete_win_task_schedule.delay(task.pk)
# remove old CheckHistory data
older_than = CoreSettings.objects.first().check_history_prune_days
prune_check_history.delay(older_than)
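core_maintenance_tasks now hands `check_history_prune_days` to `prune_check_history`. The task body is not shown in this diff, but the retention cutoff it presumably applies can be sketched as:

```python
from datetime import datetime, timedelta, timezone

def prune_cutoff(older_than_days: int) -> datetime:
    # CheckHistory rows stamped before this moment would be pruned
    return datetime.now(timezone.utc) - timedelta(days=older_than_days)

def rows_to_keep(timestamps, older_than_days: int):
    # Keep only entries at or after the cutoff
    cutoff = prune_cutoff(older_than_days)
    return [t for t in timestamps if t >= cutoff]
```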


@@ -1,6 +1,7 @@
from tacticalrmm.test import TacticalTestCase
from core.tasks import core_maintenance_tasks
from unittest.mock import patch
from core.models import CoreSettings
from model_bakery import baker, seq
@@ -34,6 +35,54 @@ class TestCoreTasks(TacticalTestCase):
self.check_not_authenticated("get", url)
@patch("automation.tasks.generate_all_agent_checks_task.delay")
def test_edit_coresettings(self, generate_all_agent_checks_task):
url = "/core/editsettings/"
# setup
policies = baker.make("Policy", _quantity=2)
# test normal request
data = {
"smtp_from_email": "newexample@example.com",
"mesh_token": "New_Mesh_Token",
}
r = self.client.patch(url, data)
self.assertEqual(r.status_code, 200)
self.assertEqual(
CoreSettings.objects.first().smtp_from_email, data["smtp_from_email"]
)
self.assertEqual(CoreSettings.objects.first().mesh_token, data["mesh_token"])
generate_all_agent_checks_task.assert_not_called()
# test adding policy
data = {
"workstation_policy": policies[0].id,
"server_policy": policies[1].id,
}
r = self.client.patch(url, data)
self.assertEqual(r.status_code, 200)
self.assertEqual(CoreSettings.objects.first().server_policy.id, policies[1].id)
self.assertEqual(
CoreSettings.objects.first().workstation_policy.id, policies[0].id
)
self.assertEqual(generate_all_agent_checks_task.call_count, 2)
generate_all_agent_checks_task.reset_mock()
# test remove policy
data = {
"workstation_policy": "",
}
r = self.client.patch(url, data)
self.assertEqual(r.status_code, 200)
self.assertEqual(CoreSettings.objects.first().workstation_policy, None)
self.assertEqual(generate_all_agent_checks_task.call_count, 1)
self.check_not_authenticated("patch", url)
@patch("autotasks.tasks.remove_orphaned_win_tasks.delay")
def test_ui_maintenance_actions(self, remove_orphaned_win_tasks):
url = "/core/servermaintenance/"


@@ -42,21 +42,19 @@ def get_core_settings(request):
@api_view(["PATCH"])
def edit_settings(request):
settings = CoreSettings.objects.first()
serializer = CoreSettingsSerializer(instance=settings, data=request.data)
coresettings = CoreSettings.objects.first()
old_server_policy = coresettings.server_policy
old_workstation_policy = coresettings.workstation_policy
serializer = CoreSettingsSerializer(instance=coresettings, data=request.data)
serializer.is_valid(raise_exception=True)
new_settings = serializer.save()
# check if default policies changed
if settings.server_policy != new_settings.server_policy:
generate_all_agent_checks_task.delay(
mon_type="server", clear=True, create_tasks=True
)
if old_server_policy != new_settings.server_policy:
generate_all_agent_checks_task.delay(mon_type="server", create_tasks=True)
if settings.workstation_policy != new_settings.workstation_policy:
generate_all_agent_checks_task.delay(
mon_type="workstation", clear=True, create_tasks=True
)
if old_workstation_policy != new_settings.workstation_policy:
generate_all_agent_checks_task.delay(mon_type="workstation", create_tasks=True)
return Response("ok")
@@ -69,7 +67,12 @@ def version(request):
@api_view()
def dashboard_info(request):
return Response(
{"trmm_version": settings.TRMM_VERSION, "dark_mode": request.user.dark_mode}
{
"trmm_version": settings.TRMM_VERSION,
"dark_mode": request.user.dark_mode,
"show_community_scripts": request.user.show_community_scripts,
"dbl_click_action": request.user.agent_dblclick_action,
}
)
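The edit_settings rewrite above snapshots `old_server_policy` before `serializer.save()` because saving a ModelSerializer mutates the same instance in place, so the old post-save comparison against `settings.server_policy` could never detect a change. A minimal illustration of the aliasing pitfall (plain Python, not Django):

```python
class Settings:
    """Stand-in for CoreSettings; plain Python, not Django."""
    def __init__(self, server_policy):
        self.server_policy = server_policy

def save_update(instance, data):
    # Mimics ModelSerializer.save(): mutates and returns the SAME object
    for key, value in data.items():
        setattr(instance, key, value)
    return instance

core = Settings(server_policy="default")
old_policy = core.server_policy              # snapshot taken before save
new_settings = save_update(core, {"server_policy": "strict"})

# core and new_settings alias one object, so comparing them post-save
# can never detect a change; only the pre-save snapshot can.
changed = old_policy != new_settings.server_policy
```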


@@ -6,13 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0007_auditlog_debug_info'),
("logs", "0007_auditlog_debug_info"),
]
operations = [
migrations.AlterField(
model_name='auditlog',
name='action',
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command')], max_length=100),
model_name="auditlog",
name="action",
field=models.CharField(
choices=[
("login", "User Login"),
("failed_login", "Failed User Login"),
("delete", "Delete Object"),
("modify", "Modify Object"),
("add", "Add Object"),
("view", "View Object"),
("check_run", "Check Run"),
("task_run", "Task Run"),
("remote_session", "Remote Session"),
("execute_script", "Execute Script"),
("execute_command", "Execute Command"),
],
max_length=100,
),
),
]


@@ -6,13 +6,29 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0008_auto_20201110_1431'),
("logs", "0008_auto_20201110_1431"),
]
operations = [
migrations.AlterField(
model_name='auditlog',
name='action',
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('agent_install', 'Agent Install'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command')], max_length=100),
model_name="auditlog",
name="action",
field=models.CharField(
choices=[
("login", "User Login"),
("failed_login", "Failed User Login"),
("delete", "Delete Object"),
("modify", "Modify Object"),
("add", "Add Object"),
("view", "View Object"),
("check_run", "Check Run"),
("task_run", "Task Run"),
("agent_install", "Agent Install"),
("remote_session", "Remote Session"),
("execute_script", "Execute Script"),
("execute_command", "Execute Command"),
],
max_length=100,
),
),
]


@@ -6,18 +6,50 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0009_auto_20201110_1431'),
("logs", "0009_auto_20201110_1431"),
]
operations = [
migrations.AlterField(
model_name='auditlog',
name='action',
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('agent_install', 'Agent Install'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command'), ('bulk_action', 'Bulk Action')], max_length=100),
model_name="auditlog",
name="action",
field=models.CharField(
choices=[
("login", "User Login"),
("failed_login", "Failed User Login"),
("delete", "Delete Object"),
("modify", "Modify Object"),
("add", "Add Object"),
("view", "View Object"),
("check_run", "Check Run"),
("task_run", "Task Run"),
("agent_install", "Agent Install"),
("remote_session", "Remote Session"),
("execute_script", "Execute Script"),
("execute_command", "Execute Command"),
("bulk_action", "Bulk Action"),
],
max_length=100,
),
),
migrations.AlterField(
model_name='auditlog',
name='object_type',
field=models.CharField(choices=[('user', 'User'), ('script', 'Script'), ('agent', 'Agent'), ('policy', 'Policy'), ('winupdatepolicy', 'Patch Policy'), ('client', 'Client'), ('site', 'Site'), ('check', 'Check'), ('automatedtask', 'Automated Task'), ('coresettings', 'Core Settings'), ('bulk', 'Bulk')], max_length=100),
model_name="auditlog",
name="object_type",
field=models.CharField(
choices=[
("user", "User"),
("script", "Script"),
("agent", "Agent"),
("policy", "Policy"),
("winupdatepolicy", "Patch Policy"),
("client", "Client"),
("site", "Site"),
("check", "Check"),
("automatedtask", "Automated Task"),
("coresettings", "Core Settings"),
("bulk", "Bulk"),
],
max_length=100,
),
),
]


@@ -6,13 +6,22 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0010_auto_20201110_2238'),
("logs", "0010_auto_20201110_2238"),
]
operations = [
migrations.AlterField(
model_name='pendingaction',
name='action_type',
field=models.CharField(blank=True, choices=[('schedreboot', 'Scheduled Reboot'), ('taskaction', 'Scheduled Task Action'), ('agentupdate', 'Agent Update')], max_length=255, null=True),
model_name="pendingaction",
name="action_type",
field=models.CharField(
blank=True,
choices=[
("schedreboot", "Scheduled Reboot"),
("taskaction", "Scheduled Task Action"),
("agentupdate", "Agent Update"),
],
max_length=255,
null=True,
),
),
]


@@ -0,0 +1,5 @@
from django.apps import AppConfig
class NatsapiConfig(AppConfig):
name = "natsapi"


@@ -0,0 +1,8 @@
from django.urls import path
from . import views
urlpatterns = [
path("natsinfo/", views.nats_info),
path("checkin/", views.NatsCheckIn.as_view()),
path("syncmesh/", views.SyncMeshNodeID.as_view()),
]


@@ -0,0 +1,126 @@
from django.utils import timezone as djangotime
from rest_framework.response import Response
from rest_framework.views import APIView
from rest_framework.decorators import (
api_view,
permission_classes,
authentication_classes,
)
from django.conf import settings
from django.shortcuts import get_object_or_404
from agents.models import Agent
from software.models import InstalledSoftware
from checks.utils import bytes2human
from agents.serializers import WinAgentSerializer
from agents.tasks import (
agent_recovery_email_task,
agent_recovery_sms_task,
handle_agent_recovery_task,
)
from tacticalrmm.utils import notify_error, filter_software, SoftwareList
@api_view()
@permission_classes([])
@authentication_classes([])
def nats_info(request):
return Response({"user": "tacticalrmm", "password": settings.SECRET_KEY})
class NatsCheckIn(APIView):
authentication_classes = []
permission_classes = []
def patch(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
agent.version = request.data["version"]
agent.last_seen = djangotime.now()
agent.save(update_fields=["version", "last_seen"])
if agent.agentoutages.exists() and agent.agentoutages.last().is_active:
last_outage = agent.agentoutages.last()
last_outage.recovery_time = djangotime.now()
last_outage.save(update_fields=["recovery_time"])
if agent.overdue_email_alert:
agent_recovery_email_task.delay(pk=last_outage.pk)
if agent.overdue_text_alert:
agent_recovery_sms_task.delay(pk=last_outage.pk)
recovery = agent.recoveryactions.filter(last_run=None).last()
if recovery is not None:
recovery.last_run = djangotime.now()
recovery.save(update_fields=["last_run"])
handle_agent_recovery_task.delay(pk=recovery.pk)
return Response("ok")
# get any pending actions
if agent.pendingactions.filter(status="pending").exists():
agent.handle_pending_actions()
return Response("ok")
def put(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
serializer = WinAgentSerializer(instance=agent, data=request.data, partial=True)
if request.data["func"] == "disks":
disks = request.data["disks"]
new = []
for disk in disks:
    # build each entry once; the previous inner loop over disk.items()
    # redundantly reassigned the same keys for every dict entry
    tmp = {
        "device": disk["device"],
        "fstype": disk["fstype"],
        "total": bytes2human(disk["total"]),
        "used": bytes2human(disk["used"]),
        "free": bytes2human(disk["free"]),
        "percent": int(disk["percent"]),
    }
    new.append(tmp)
serializer.is_valid(raise_exception=True)
serializer.save(disks=new)
return Response("ok")
if request.data["func"] == "loggedonuser":
if request.data["logged_in_username"] != "None":
serializer.is_valid(raise_exception=True)
serializer.save(last_logged_in_user=request.data["logged_in_username"])
return Response("ok")
if request.data["func"] == "software":
raw: SoftwareList = request.data["software"]
if not isinstance(raw, list):
return notify_error("err")
sw = filter_software(raw)
if not InstalledSoftware.objects.filter(agent=agent).exists():
InstalledSoftware(agent=agent, software=sw).save()
else:
s = agent.installedsoftware_set.first()
s.software = sw
s.save(update_fields=["software"])
return Response("ok")
serializer.is_valid(raise_exception=True)
serializer.save()
return Response("ok")
class SyncMeshNodeID(APIView):
authentication_classes = []
permission_classes = []
def post(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
if agent.mesh_node_id != request.data["nodeid"]:
agent.mesh_node_id = request.data["nodeid"]
agent.save(update_fields=["mesh_node_id"])
return Response("ok")
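NatsCheckIn converts raw disk byte counts with `checks.utils.bytes2human` before saving. The helper's body is not in this diff, but a conventional implementation of such a converter looks like:

```python
def bytes2human(n, fmt="{0:.1f}{1}"):
    # Sketch of a conventional bytes-to-human converter; the real
    # checks.utils.bytes2human may differ in formatting details.
    symbols = ("KB", "MB", "GB", "TB", "PB")
    prefix = {s: 1 << ((i + 1) * 10) for i, s in enumerate(symbols)}
    for s in reversed(symbols):
        if n >= prefix[s]:
            return fmt.format(n / prefix[s], s)
    return fmt.format(n, "B")
```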


@@ -1,38 +1,38 @@
amqp==2.6.1
amqp==5.0.2
asgiref==3.3.1
asyncio-nats-client==0.11.4
billiard==3.6.3.0
celery==4.4.6
celery==5.0.5
certifi==2020.12.5
cffi==1.14.3
chardet==3.0.4
cryptography==3.2.1
cffi==1.14.4
chardet==4.0.0
cryptography==3.3.1
decorator==4.4.2
Django==3.1.4
django-cors-headers==3.5.0
Django==3.1.5
django-cors-headers==3.6.0
django-rest-knox==4.1.0
djangorestframework==3.12.2
future==0.18.2
idna==2.10
kombu==4.6.11
kombu==5.0.2
loguru==0.5.3
msgpack==1.0.0
packaging==20.4
msgpack==1.0.2
packaging==20.8
psycopg2-binary==2.8.6
pycparser==2.20
pycryptodome==3.9.9
pyotp==2.4.1
pyparsing==2.4.7
pytz==2020.4
pytz==2020.5
qrcode==6.1
redis==3.5.3
requests==2.25.0
requests==2.25.1
six==1.15.0
sqlparse==0.4.1
twilio==6.49.0
twilio==6.51.0
urllib3==1.26.2
uWSGI==2.0.19.1
validators==0.18.1
vine==1.3.0
validators==0.18.2
vine==5.0.0
websockets==8.1
zipp==3.4.0


@@ -6,8 +6,5 @@ script = Recipe(
name="Test Script",
description="Test Desc",
shell="cmd",
filename="test.bat",
script_type="userdefined",
)
builtin_script = script.extend(script_type="builtin")


@@ -96,5 +96,103 @@
"name": "Check BIOS Information",
"description": "Retrieves and reports on BIOS make, version, and date.",
"shell": "powershell"
},
{
"filename": "ResetHighPerformancePowerProfiletoDefaults.ps1",
"submittedBy": "https://github.com/azulskyknight",
"name": "Reset High Perf Power Profile",
"description": "Resets monitor, disk, standby, and hibernate timers in the default High Performance power profile to their default values. It also re-indexes the AC and DC power profiles into their default order.",
"shell": "powershell"
},
{
"filename": "SetHighPerformancePowerProfile.ps1",
"submittedBy": "https://github.com/azulskyknight",
"name": "Set High Perf Power Profile",
"description": "Sets the High Performance Power profile to the active power profile. Use this to keep machines from falling asleep.",
"shell": "powershell"
},
{
"filename": "Windows10Upgrade.ps1",
"submittedBy": "https://github.com/RVL-Solutions and https://github.com/darimm",
"name": "Windows 10 Upgrade",
"description": "Forces an upgrade to the latest release of Windows 10.",
"shell": "powershell"
},
{
"filename": "DiskStatus.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Check Disks",
"description": "Checks local disks for errors reported in event viewer within the last 24 hours",
"shell": "powershell"
},
{
"filename": "DuplicatiStatus.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Check Duplicati",
"description": "Checks that Duplicati Backup has been running properly over the last 24 hours",
"shell": "powershell"
},
{
"filename": "EnableDefender.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Enable Windows Defender",
"description": "Enables Windows Defender and sets preferences",
"shell": "powershell"
},
{
"filename": "OpenSSHServerInstall.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Install SSH",
"description": "Installs and enables OpenSSH Server",
"shell": "powershell"
},
{
"filename": "RDP_enable.bat",
"submittedBy": "https://github.com/dinger1986",
"name": "Enable RDP",
"description": "Enables RDP",
"shell": "cmd"
},
{
"filename": "Speedtest.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "PS Speed Test",
"description": "PowerShell speed test (Windows 10 or Server 2016+)",
"shell": "powershell"
},
{
"filename": "SyncTime.bat",
"submittedBy": "https://github.com/dinger1986",
"name": "Sync DC Time",
"description": "Syncs time with domain controller",
"shell": "cmd"
},
{
"filename": "WinDefenderClearLogs.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Clear Defender Logs",
"description": "Clears Windows Defender Logs",
"shell": "powershell"
},
{
"filename": "WinDefenderStatus.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Defender Status",
"description": "Checks for malware and antispyware detections, verifies Windows Defender is healthy, and reports the last scan time, all within the last 24 hours",
"shell": "powershell"
},
{
"filename": "disable_FastStartup.bat",
"submittedBy": "https://github.com/dinger1986",
"name": "Disable Fast Startup",
"description": "Disables Fast Startup on Windows 10",
"shell": "cmd"
},
{
"filename": "updatetacticalexclusion.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "TRMM Defender Exclusions",
"description": "Windows Defender Exclusions for Tactical RMM",
"shell": "powershell"
}
]
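Each community-script entry above carries filename, submittedBy, name, description, and shell. A quick sanity check over such entries (a hypothetical validation helper, not part of the repo):

```python
REQUIRED_KEYS = {"filename", "name", "description", "shell"}
VALID_SHELLS = {"powershell", "cmd", "python"}

def valid_entry(entry: dict) -> bool:
    # Every entry needs the required keys and a recognized shell
    return REQUIRED_KEYS <= entry.keys() and entry["shell"] in VALID_SHELLS
```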


@@ -0,0 +1,28 @@
# Generated by Django 3.1.3 on 2020-12-07 15:58
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("scripts", "0003_auto_20200922_1344"),
]
operations = [
migrations.AddField(
model_name="script",
name="category",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name="script",
name="favorite",
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name="script",
name="script_base64",
field=models.TextField(blank=True, null=True),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 3.1.3 on 2020-12-07 16:06
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("scripts", "0004_auto_20201207_1558"),
]
operations = [
migrations.RenameField(
model_name="script",
old_name="script_base64",
new_name="code_base64",
),
]


@@ -0,0 +1,42 @@
# Generated by Django 3.1.4 on 2020-12-10 21:45
from django.db import migrations
from django.conf import settings
import os
import base64
from pathlib import Path
def move_scripts_to_db(apps, schema_editor):
print("")
Script = apps.get_model("scripts", "Script")
for script in Script.objects.all():
if not script.script_type == "builtin":
if script.filename:
filepath = f"{settings.SCRIPTS_DIR}/userdefined/{script.filename}"
else:
print("No filename found on script. Skipping.")
continue
# test if file exists
if os.path.exists(filepath):
print(f"Found script {script.name}. Importing code.")
with open(filepath, "rb") as f:
script_bytes = f.read().decode("utf-8").encode("ascii", "ignore")
script.code_base64 = base64.b64encode(script_bytes).decode("ascii")
script.save(update_fields=["code_base64"])
else:
print(
f"Script file {script.name} was not found on the disk. You will need to edit the script in the UI"
)
class Migration(migrations.Migration):
dependencies = [
("scripts", "0005_auto_20201207_1606"),
]
operations = [migrations.RunPython(move_scripts_to_db, migrations.RunPython.noop)]


@@ -1,3 +1,4 @@
import base64
from django.db import models
from logs.models import BaseAuditModel
from django.conf import settings
@@ -17,41 +18,27 @@ SCRIPT_TYPES = [
class Script(BaseAuditModel):
name = models.CharField(max_length=255)
description = models.TextField(null=True, blank=True)
filename = models.CharField(max_length=255)
filename = models.CharField(max_length=255) # deprecated
shell = models.CharField(
max_length=100, choices=SCRIPT_SHELLS, default="powershell"
)
script_type = models.CharField(
max_length=100, choices=SCRIPT_TYPES, default="userdefined"
)
favorite = models.BooleanField(default=False)
category = models.CharField(max_length=100, null=True, blank=True)
code_base64 = models.TextField(null=True, blank=True)
def __str__(self):
return self.filename
@property
def filepath(self):
# for the windows agent when using 'salt-call'
if self.script_type == "userdefined":
return f"salt://scripts//userdefined//{self.filename}"
else:
return f"salt://scripts//{self.filename}"
@property
def file(self):
if self.script_type == "userdefined":
return f"{settings.SCRIPTS_DIR}/userdefined/{self.filename}"
else:
return f"{settings.SCRIPTS_DIR}/{self.filename}"
return self.name
@property
def code(self):
try:
with open(self.file, "r") as f:
text = f.read()
except:
text = "n/a"
return text
if self.code_base64:
base64_bytes = self.code_base64.encode("ascii", "ignore")
return base64.b64decode(base64_bytes).decode("ascii", "ignore")
else:
return ""
@classmethod
def load_community_scripts(cls):
@@ -79,22 +66,41 @@ class Script(BaseAuditModel):
for script in info:
if os.path.exists(os.path.join(scripts_dir, script["filename"])):
s = cls.objects.filter(script_type="builtin").filter(
filename=script["filename"]
name=script["name"]
)
if s.exists():
i = s.first()
i.name = script["name"]
i.description = script["description"]
i.save(update_fields=["name", "description"])
i.category = "Community"
with open(os.path.join(scripts_dir, script["filename"]), "rb") as f:
script_bytes = (
f.read().decode("utf-8").encode("ascii", "ignore")
)
i.code_base64 = base64.b64encode(script_bytes).decode("ascii")
i.save(
update_fields=["name", "description", "category", "code_base64"]
)
else:
print(f"Adding new community script: {script['name']}")
cls(
name=script["name"],
description=script["description"],
filename=script["filename"],
shell=script["shell"],
script_type="builtin",
).save()
with open(os.path.join(scripts_dir, script["filename"]), "rb") as f:
script_bytes = (
f.read().decode("utf-8").encode("ascii", "ignore")
)
code_base64 = base64.b64encode(script_bytes).decode("ascii")
cls(
code_base64=code_base64,
name=script["name"],
description=script["description"],
filename=script["filename"],
shell=script["shell"],
script_type="builtin",
category="Community",
).save()
@staticmethod
def serialize(script):
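Both `move_scripts_to_db` and `Script.code` round-trip script bodies through ASCII-sanitized base64. The encode/decode pair used above, in isolation:

```python
import base64

def encode_script(text: str) -> str:
    # As in the migration: drop non-ASCII characters, then base64-encode
    script_bytes = text.encode("ascii", "ignore")
    return base64.b64encode(script_bytes).decode("ascii")

def decode_script(code_base64: str) -> str:
    # As in Script.code: decode the stored base64 back to ASCII text
    base64_bytes = code_base64.encode("ascii", "ignore")
    return base64.b64decode(base64_bytes).decode("ascii", "ignore")
```

The same `VGVzdA==` fixture used in the tests below decodes to `Test` with this pair.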


@@ -1,41 +1,33 @@
import os
from django.conf import settings
from rest_framework.serializers import ModelSerializer, ValidationError, ReadOnlyField
from rest_framework.serializers import ModelSerializer, ReadOnlyField
from .models import Script
class ScriptSerializer(ModelSerializer):
code = ReadOnlyField()
filepath = ReadOnlyField()
class ScriptTableSerializer(ModelSerializer):
class Meta:
model = Script
fields = "__all__"
fields = [
"id",
"name",
"description",
"script_type",
"shell",
"category",
"favorite",
]
def validate(self, val):
if "filename" in val:
# validate the filename
if (
not val["filename"].endswith(".py")
and not val["filename"].endswith(".ps1")
and not val["filename"].endswith(".bat")
):
raise ValidationError("File types supported are .py, .ps1 and .bat")
# make sure file doesn't already exist on server
# but only if adding, not if editing since will overwrite if edit
if not self.instance:
script_path = os.path.join(
f"{settings.SCRIPTS_DIR}/userdefined", val["filename"]
)
if os.path.exists(script_path):
raise ValidationError(
f"{val['filename']} already exists. Delete or edit the existing script first."
)
return val
class ScriptSerializer(ModelSerializer):
class Meta:
model = Script
fields = [
"id",
"name",
"description",
"shell",
"category",
"favorite",
"code_base64",
]
class ScriptCheckSerializer(ModelSerializer):


@@ -1,10 +1,11 @@
import json
import os
from django.core.files.uploadedfile import SimpleUploadedFile
from pathlib import Path
from django.conf import settings
from tacticalrmm.test import TacticalTestCase
from model_bakery import baker
from .serializers import ScriptSerializer
from .serializers import ScriptSerializer, ScriptTableSerializer
from .models import Script
@@ -16,16 +17,50 @@ class TestScriptViews(TacticalTestCase):
url = "/scripts/scripts/"
scripts = baker.make("scripts.Script", _quantity=3)
serializer = ScriptSerializer(scripts, many=True)
serializer = ScriptTableSerializer(scripts, many=True)
resp = self.client.get(url, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(serializer.data, resp.data)
self.check_not_authenticated("get", url)
# TODO Need to test file uploads and saves
def test_add_script(self):
pass
url = f"/scripts/scripts/"
data = {
"name": "Name",
"description": "Description",
"shell": "powershell",
"category": "New",
"code": "Some Test Code\nnew Line",
}
# test without file upload
resp = self.client.post(url, data)
self.assertEqual(resp.status_code, 200)
self.assertTrue(Script.objects.filter(name="Name").exists())
self.assertEqual(Script.objects.get(name="Name").code, data["code"])
# test with file upload
# file with 'Test' as content
file = SimpleUploadedFile(
"test_script.bat", b"\x54\x65\x73\x74", content_type="text/plain"
)
data = {
"name": "New Name",
"description": "Description",
"shell": "cmd",
"category": "New",
"filename": file,
}
# test with file upload
resp = self.client.post(url, data, format="multipart")
self.assertEqual(resp.status_code, 200)
script = Script.objects.filter(name="New Name").first()
self.assertEquals(script.code, "Test")
self.check_not_authenticated("post", url)
def test_modify_script(self):
# test a call where script doesn't exist
@@ -40,23 +75,39 @@ class TestScriptViews(TacticalTestCase):
"name": script.name,
"description": "Description Change",
"shell": script.shell,
"code": "Test Code\nAnother Line",
}
# test edit a userdefined script
resp = self.client.put(url, data, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEquals(
Script.objects.get(pk=script.pk).description, "Description Change"
)
script = Script.objects.get(pk=script.pk)
self.assertEquals(script.description, "Description Change")
self.assertEquals(script.code, "Test Code\nAnother Line")
# test edit a builtin script
builtin_script = baker.make_recipe("scripts.builtin_script")
data = {"name": "New Name", "description": "New Desc", "code": "Some New Code"}
builtin_script = baker.make_recipe("scripts.script", script_type="builtin")
resp = self.client.put(
f"/scripts/{builtin_script.pk}/script/", data, format="json"
)
self.assertEqual(resp.status_code, 400)
# TODO Test changing script file
data = {
"name": script.name,
"description": "Description Change",
"shell": script.shell,
"favorite": True,
"code": "Test Code\nAnother Line",
}
# test marking a builtin script as favorite
resp = self.client.put(
f"/scripts/{builtin_script.pk}/script/", data, format="json"
)
self.assertEqual(resp.status_code, 200)
self.assertTrue(Script.objects.get(pk=builtin_script.pk).favorite)
self.check_not_authenticated("put", url)
@@ -79,6 +130,7 @@ class TestScriptViews(TacticalTestCase):
resp = self.client.delete("/scripts/500/script/", format="json")
self.assertEqual(resp.status_code, 404)
# test delete script
script = baker.make_recipe("scripts.script")
url = f"/scripts/{script.pk}/script/"
resp = self.client.delete(url, format="json")
@@ -86,13 +138,50 @@ class TestScriptViews(TacticalTestCase):
self.assertFalse(Script.objects.filter(pk=script.pk).exists())
# test delete community script
script = baker.make_recipe("scripts.script", script_type="builtin")
url = f"/scripts/{script.pk}/script/"
resp = self.client.delete(url, format="json")
self.assertEqual(resp.status_code, 400)
self.check_not_authenticated("delete", url)
# TODO Need to mock file open
def test_download_script(self):
pass
# test a call where script doesn't exist
resp = self.client.get("/scripts/500/download/", format="json")
self.assertEqual(resp.status_code, 404)
def test_load_community_scripts(self):
# the script's code property should return "Test"
# test powershell file
script = baker.make(
"scripts.Script", code_base64="VGVzdA==", shell="powershell"
)
url = f"/scripts/{script.pk}/download/"
resp = self.client.get(url, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(resp.data, {"filename": f"{script.name}.ps1", "code": "Test"})
# test batch file
script = baker.make("scripts.Script", code_base64="VGVzdA==", shell="cmd")
url = f"/scripts/{script.pk}/download/"
resp = self.client.get(url, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(resp.data, {"filename": f"{script.name}.bat", "code": "Test"})
# test python file
script = baker.make("scripts.Script", code_base64="VGVzdA==", shell="python")
url = f"/scripts/{script.pk}/download/"
resp = self.client.get(url, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(resp.data, {"filename": f"{script.name}.py", "code": "Test"})
self.check_not_authenticated("get", url)
def test_community_script_json_file(self):
valid_shells = ["powershell", "python", "cmd"]
if not settings.DOCKER_BUILD:
@@ -113,5 +202,19 @@ class TestScriptViews(TacticalTestCase):
self.assertTrue(script["name"])
self.assertTrue(script["description"])
self.assertTrue(script["shell"])
self.assertTrue(script["description"])
self.assertIn(script["shell"], valid_shells)
def test_load_community_scripts(self):
with open(
os.path.join(settings.BASE_DIR, "scripts/community_scripts.json")
) as f:
info = json.load(f)
Script.load_community_scripts()
community_scripts = Script.objects.filter(script_type="builtin").count()
self.assertEqual(len(info), community_scripts)
# test updating already added community scripts
Script.load_community_scripts()
self.assertEqual(len(info), community_scripts)
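The test above asserts that `Script.load_community_scripts()` is idempotent: running it twice leaves the builtin-script count equal to the number of entries in `community_scripts.json`. A minimal sketch of an upsert-style loader with that property, using a plain dict keyed on script name (the names `load_community_scripts`, `store`, and `definitions` here are illustrative, not the project's actual implementation):

```python
import base64

def load_community_scripts(store, definitions):
    # Upsert keyed on name: re-running updates existing entries
    # instead of duplicating them, which keeps the loader idempotent.
    for info in definitions:
        code_b64 = base64.b64encode(info["code"].encode("ascii")).decode("ascii")
        store[info["name"]] = {
            "shell": info["shell"],
            "script_type": "builtin",
            "code_base64": code_b64,
        }
    return store

scripts = [
    {"name": "Clear Temp", "shell": "powershell", "code": "Remove-Item $env:TEMP"},
    {"name": "Ping", "shell": "cmd", "code": "ping 8.8.8.8"},
]
db = {}
load_community_scripts(db, scripts)
load_community_scripts(db, scripts)  # second run must not add duplicates
assert len(db) == len(scripts)
```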


@@ -1,4 +1,4 @@
import os
import base64
from loguru import logger
from django.shortcuts import get_object_or_404
@@ -11,9 +11,10 @@ from rest_framework.response import Response
from rest_framework.parsers import FileUploadParser
from .models import Script
from .serializers import ScriptSerializer
from .serializers import ScriptSerializer, ScriptTableSerializer
from tacticalrmm.utils import notify_error
logger.configure(**settings.LOG_CONFIG)
@@ -22,74 +23,65 @@ class GetAddScripts(APIView):
def get(self, request):
scripts = Script.objects.all()
return Response(ScriptSerializer(scripts, many=True).data)
return Response(ScriptTableSerializer(scripts, many=True).data)
def put(self, request, format=None):
def post(self, request, format=None):
file_obj = request.data["filename"] # the actual file in_memory object
# need to manually create the serialized data
# since javascript formData doesn't support JSON
filename = str(file_obj)
data = {
"name": request.data["name"],
"filename": filename,
"category": request.data["category"],
"description": request.data["description"],
"shell": request.data["shell"],
"script_type": "userdefined", # force all uploads to be userdefined. built in scripts cannot be edited by user
}
if "favorite" in request.data:
data["favorite"] = request.data["favorite"]
if "filename" in request.data:
message_bytes = request.data["filename"].read()
data["code_base64"] = base64.b64encode(message_bytes).decode(
"ascii", "ignore"
)
elif "code" in request.data:
message_bytes = request.data["code"].encode("ascii", "ignore")
data["code_base64"] = base64.b64encode(message_bytes).decode("ascii")
serializer = ScriptSerializer(data=data, partial=True)
serializer.is_valid(raise_exception=True)
obj = serializer.save()
with open(obj.file, "wb+") as f:
for chunk in file_obj.chunks():
f.write(chunk)
return Response(f"{obj.name} was added!")
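The `post` handler above stores the script body as ASCII base64 in `code_base64`, whether it arrives as raw bytes from an uploaded file or as a unicode string in the `code` field. A self-contained sketch of that two-branch encoding (the `encode_script` helper is hypothetical, for illustration only):

```python
import base64

def encode_script(payload):
    # Mirrors the view's two branches: raw bytes from an uploaded
    # file, or a unicode string from the "code" field.
    if isinstance(payload, bytes):
        return base64.b64encode(payload).decode("ascii", "ignore")
    return base64.b64encode(payload.encode("ascii", "ignore")).decode("ascii")

# "VGVzdA==" is the same fixture value the tests above use for "Test"
assert encode_script(b"Test") == "VGVzdA=="
assert base64.b64decode(encode_script("Test")) == b"Test"
```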
class GetUpdateDeleteScript(APIView):
parser_class = (FileUploadParser,)
def get(self, request, pk):
script = get_object_or_404(Script, pk=pk)
return Response(ScriptSerializer(script).data)
def put(self, request, pk, format=None):
def put(self, request, pk):
script = get_object_or_404(Script, pk=pk)
# this will never trigger but check anyway
data = request.data
if script.script_type == "builtin":
return notify_error("Built in scripts cannot be edited")
# allow only favoriting builtin scripts
if "favorite" in data:
# overwrite request data
data = {"favorite": data["favorite"]}
else:
return notify_error("Community scripts cannot be edited.")
data = {
"name": request.data["name"],
"description": request.data["description"],
"shell": request.data["shell"],
}
# if uploading a new version of the script
if "filename" in request.data:
file_obj = request.data["filename"]
data["filename"] = str(file_obj)
elif "code" in data:
message_bytes = data["code"].encode("ascii")
data["code_base64"] = base64.b64encode(message_bytes).decode("ascii")
data.pop("code")
serializer = ScriptSerializer(data=data, instance=script, partial=True)
serializer.is_valid(raise_exception=True)
obj = serializer.save()
if "filename" in request.data:
try:
os.remove(obj.file)
except OSError:
pass
with open(obj.file, "wb+") as f:
for chunk in file_obj.chunks():
f.write(chunk)
return Response(f"{obj.name} was edited!")
def delete(self, request, pk):
@@ -97,12 +89,7 @@ class GetUpdateDeleteScript(APIView):
# this will never trigger but check anyway
if script.script_type == "builtin":
return notify_error("Built in scripts cannot be deleted")
try:
os.remove(script.file)
except OSError:
pass
return notify_error("Community scripts cannot be deleted")
script.delete()
return Response(f"{script.name} was deleted!")
@@ -111,33 +98,12 @@ class GetUpdateDeleteScript(APIView):
@api_view()
def download(request, pk):
script = get_object_or_404(Script, pk=pk)
use_nginx = False
conf = "/etc/nginx/sites-available/rmm.conf"
if os.path.exists(conf):
try:
with open(conf) as f:
for line in f.readlines():
if "location" and "builtin" in line:
use_nginx = True
break
except Exception as e:
logger.error(e)
if script.shell == "powershell":
filename = f"{script.name}.ps1"
elif script.shell == "cmd":
filename = f"{script.name}.bat"
else:
use_nginx = True
filename = f"{script.name}.py"
if settings.DEBUG or not use_nginx:
with open(script.file, "rb") as f:
response = HttpResponse(f.read(), content_type="text/plain")
response["Content-Disposition"] = f"attachment; filename={script.filename}"
return response
else:
response = HttpResponse()
response["Content-Disposition"] = f"attachment; filename={script.filename}"
response["X-Accel-Redirect"] = (
f"/saltscripts/{script.filename}"
if script.script_type == "userdefined"
else f"/builtin/{script.filename}"
)
return response
return Response({"filename": filename, "code": script.code})
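The rewritten `download` view drops the nginx `X-Accel-Redirect` file-serving path entirely and instead returns the filename and decoded code as JSON, deriving the extension from the script's shell. A sketch of that mapping (the `EXTENSIONS` table and `download_filename` helper are illustrative names, not from the source):

```python
# powershell -> .ps1, cmd -> .bat, anything else falls through to .py,
# matching the if/elif/else chain in the view above.
EXTENSIONS = {"powershell": "ps1", "cmd": "bat"}

def download_filename(name, shell):
    return f"{name}.{EXTENSIONS.get(shell, 'py')}"

assert download_filename("Clear Temp", "powershell") == "Clear Temp.ps1"
assert download_filename("Reboot", "cmd") == "Reboot.bat"
assert download_filename("Check Disk", "python") == "Check Disk.py"
```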


@@ -33,9 +33,9 @@ app.conf.beat_schedule = {
"task": "winupdate.tasks.check_agent_update_schedule_task",
"schedule": crontab(minute=5, hour="*"),
},
"sync-modules": {
"task": "agents.tasks.batch_sync_modules_task",
"schedule": crontab(minute=25, hour="*/4"),
"agents-checkinfull": {
"task": "agents.tasks.check_in_task",
"schedule": crontab(minute="*/24"),
},
"agent-auto-update": {
"task": "agents.tasks.auto_self_agent_update_task",
@@ -45,6 +45,14 @@ app.conf.beat_schedule = {
"task": "agents.tasks.sync_sysinfo_task",
"schedule": crontab(minute=55, hour="*"),
},
"get-wmi": {
"task": "agents.tasks.get_wmi_task",
"schedule": crontab(minute="*/18"),
},
"check-agentservice": {
"task": "agents.tasks.monitor_agents_task",
"schedule": crontab(minute="*/15"),
},
}
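The new beat entries above use crontab step expressions: `minute="*/15"` fires at :00, :15, :30 and :45 of every hour, and `*/18` and `*/24` work the same way on their steps. A stdlib-only sketch of how such a minute expression matches (illustrative only; Celery's `crontab` class does the real parsing and supports far more syntax):

```python
def minute_matches(expr, minute):
    # Supports the two forms used in the schedule above:
    # "*/N" step expressions and a single literal minute.
    if expr.startswith("*/"):
        return minute % int(expr[2:]) == 0
    return minute == int(expr)

# "check-agentservice" with minute="*/15" runs at :00, :15, :30 and :45
assert [m for m in range(60) if minute_matches("*/15", m)] == [0, 15, 30, 45]
# literal-minute form, as in crontab(minute=5, hour="*")
assert minute_matches("5", 5)
```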


@@ -14,6 +14,7 @@ def get_debug_info():
EXCLUDE_PATHS = (
"/natsapi",
"/api/v3",
"/api/v2",
"/logs/auditlogs",
@@ -80,4 +81,4 @@ class AuditMiddleware:
def process_template_response(self, request, response):
request_local.debug_info = None
request_local.username = None
return response
return response


@@ -15,25 +15,25 @@ EXE_DIR = os.path.join(BASE_DIR, "tacticalrmm/private/exe")
AUTH_USER_MODEL = "accounts.User"
# latest release
TRMM_VERSION = "0.2.16"
TRMM_VERSION = "0.3.0"
# bump this version every time vue code is changed
# to alert user they need to manually refresh their browser
APP_VER = "0.0.99"
APP_VER = "0.0.104"
# https://github.com/wh1te909/salt
LATEST_SALT_VER = "1.1.0"
# https://github.com/wh1te909/rmmagent
LATEST_AGENT_VER = "1.1.11"
LATEST_AGENT_VER = "1.2.0"
MESH_VER = "0.7.14"
MESH_VER = "0.7.45"
SALT_MASTER_VER = "3002.2"
# for the update script, bump when need to recreate venv or npm install
PIP_VER = "4"
NPM_VER = "3"
PIP_VER = "7"
NPM_VER = "6"
DL_64 = f"https://github.com/wh1te909/rmmagent/releases/download/v{LATEST_AGENT_VER}/winagent-v{LATEST_AGENT_VER}.exe"
DL_32 = f"https://github.com/wh1te909/rmmagent/releases/download/v{LATEST_AGENT_VER}/winagent-v{LATEST_AGENT_VER}-x86.exe"
@@ -72,6 +72,7 @@ INSTALLED_APPS = [
"logs",
"scripts",
"alerts",
"natsapi",
]
if not "TRAVIS" in os.environ and not "AZPIPELINE" in os.environ:


@@ -13,6 +13,9 @@ class TacticalTestCase(TestCase):
self.john = User(username="john")
self.john.set_password("hunter2")
self.john.save()
self.alice = User(username="alice")
self.alice.set_password("hunter2")
self.alice.save()
self.client_setup()
self.client.force_authenticate(user=self.john)


@@ -2,7 +2,7 @@
{
"name": "System",
"cpu_percent": 0.0,
"memory_percent": 7.754021906781984e-05,
"membytes": 434655234324,
"pid": 4,
"ppid": 0,
"status": "running",
@@ -12,7 +12,7 @@
{
"name": "Registry",
"cpu_percent": 0.0,
"memory_percent": 0.009720362333912082,
"membytes": 0.009720362333912082,
"pid": 280,
"ppid": 4,
"status": "running",
@@ -22,7 +22,7 @@
{
"name": "smss.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0006223099632878874,
"membytes": 0.0006223099632878874,
"pid": 976,
"ppid": 4,
"status": "running",
@@ -32,7 +32,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.005682306310149464,
"membytes": 0.005682306310149464,
"pid": 1160,
"ppid": 1388,
"status": "running",
@@ -42,7 +42,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004793576106987529,
"membytes": 0.004793576106987529,
"pid": 1172,
"ppid": 1388,
"status": "running",
@@ -52,7 +52,7 @@
{
"name": "csrss.exe",
"cpu_percent": 0.0,
"memory_percent": 0.002459416691971619,
"membytes": 0.002459416691971619,
"pid": 1240,
"ppid": 1220,
"status": "running",
@@ -62,7 +62,7 @@
{
"name": "wininit.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0031970428784885716,
"membytes": 0.0031970428784885716,
"pid": 1316,
"ppid": 1220,
"status": "running",
@@ -72,7 +72,7 @@
{
"name": "csrss.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0023719354191771556,
"membytes": 0.0023719354191771556,
"pid": 1324,
"ppid": 1308,
"status": "running",
@@ -82,7 +82,7 @@
{
"name": "services.exe",
"cpu_percent": 0.0,
"memory_percent": 0.00596662044673147,
"membytes": 0.00596662044673147,
"pid": 1388,
"ppid": 1316,
"status": "running",
@@ -92,7 +92,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006052113508780605,
"membytes": 0.006052113508780605,
"pid": 1396,
"ppid": 1388,
"status": "running",
@@ -102,7 +102,7 @@
{
"name": "LsaIso.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0016124389144615866,
"membytes": 0.0016124389144615866,
"pid": 1408,
"ppid": 1316,
"status": "running",
@@ -112,7 +112,7 @@
{
"name": "lsass.exe",
"cpu_percent": 0.0,
"memory_percent": 0.012698702030414497,
"membytes": 0.012698702030414497,
"pid": 1416,
"ppid": 1316,
"status": "running",
@@ -122,7 +122,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.007129723732748768,
"membytes": 0.007129723732748768,
"pid": 1444,
"ppid": 1388,
"status": "running",
@@ -132,7 +132,7 @@
{
"name": "winlogon.exe",
"cpu_percent": 0.0,
"memory_percent": 0.005396003962822129,
"membytes": 0.005396003962822129,
"pid": 1492,
"ppid": 1308,
"status": "running",
@@ -142,7 +142,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0027815068327148706,
"membytes": 0.0027815068327148706,
"pid": 1568,
"ppid": 1388,
"status": "running",
@@ -152,7 +152,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.001936517265950167,
"membytes": 0.001936517265950167,
"pid": 1604,
"ppid": 1388,
"status": "running",
@@ -162,7 +162,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.011187661863964672,
"membytes": 0.011187661863964672,
"pid": 1628,
"ppid": 1388,
"status": "running",
@@ -172,7 +172,7 @@
{
"name": "fontdrvhost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.002765601146752241,
"membytes": 0.002765601146752241,
"pid": 1652,
"ppid": 1492,
"status": "running",
@@ -182,7 +182,7 @@
{
"name": "fontdrvhost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0017794486170691988,
"membytes": 0.0017794486170691988,
"pid": 1660,
"ppid": 1316,
"status": "running",
@@ -192,7 +192,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006676411682813821,
"membytes": 0.006676411682813821,
"pid": 1752,
"ppid": 1388,
"status": "running",
@@ -202,7 +202,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004892986644253965,
"membytes": 0.004892986644253965,
"pid": 1796,
"ppid": 1388,
"status": "running",
@@ -212,7 +212,7 @@
{
"name": "dwm.exe",
"cpu_percent": 0.0,
"memory_percent": 0.02493216274642207,
"membytes": 0.02493216274642207,
"pid": 1868,
"ppid": 1492,
"status": "running",
@@ -222,7 +222,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.011945170157934911,
"membytes": 0.011945170157934911,
"pid": 1972,
"ppid": 1388,
"status": "running",
@@ -232,7 +232,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006616765360453959,
"membytes": 0.006616765360453959,
"pid": 1980,
"ppid": 1388,
"status": "running",
@@ -242,7 +242,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0034435810109093323,
"membytes": 0.0034435810109093323,
"pid": 2008,
"ppid": 1388,
"status": "running",
@@ -252,7 +252,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004722000520155695,
"membytes": 0.004722000520155695,
"pid": 2160,
"ppid": 1388,
"status": "running",
@@ -262,7 +262,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004264712048730091,
"membytes": 0.004264712048730091,
"pid": 2196,
"ppid": 1388,
"status": "running",
@@ -272,7 +272,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.005493426289343236,
"membytes": 0.005493426289343236,
"pid": 2200,
"ppid": 1388,
"status": "running",
@@ -282,7 +282,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.002757648303770926,
"membytes": 0.002757648303770926,
"pid": 2212,
"ppid": 1388,
"status": "running",
@@ -292,7 +292,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0038113999987951447,
"membytes": 0.0038113999987951447,
"pid": 2224,
"ppid": 1388,
"status": "running",
@@ -302,7 +302,7 @@
{
"name": "mmc.exe",
"cpu_percent": 0.084375,
"memory_percent": 0.027600341566653204,
"membytes": 0.027600341566653204,
"pid": 2272,
"ppid": 4664,
"status": "running",
@@ -312,7 +312,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004185183618916942,
"membytes": 0.004185183618916942,
"pid": 2312,
"ppid": 1388,
"status": "running",
@@ -322,7 +322,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.003334229419916253,
"membytes": 0.003334229419916253,
"pid": 2352,
"ppid": 1388,
"status": "running",
@@ -332,7 +332,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.003841223159975075,
"membytes": 0.003841223159975075,
"pid": 2400,
"ppid": 1388,
"status": "running",
@@ -342,7 +342,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.00720527574107126,
"membytes": 0.00720527574107126,
"pid": 2440,
"ppid": 1388,
"status": "running",
@@ -352,7 +352,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.008088041311997208,
"membytes": 0.008088041311997208,
"pid": 2512,
"ppid": 1388,
"status": "running",
@@ -362,7 +362,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.005859257066483719,
"membytes": 0.005859257066483719,
"pid": 2600,
"ppid": 1388,
"status": "running",
@@ -372,7 +372,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004566920082020056,
"membytes": 0.004566920082020056,
"pid": 2724,
"ppid": 1388,
"status": "running",
@@ -382,7 +382,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004475462387734934,
"membytes": 0.004475462387734934,
"pid": 2732,
"ppid": 1388,
"status": "running",
@@ -392,7 +392,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004006244651837358,
"membytes": 0.004006244651837358,
"pid": 2748,
"ppid": 1388,
"status": "running",
@@ -402,7 +402,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.003240783514885803,
"membytes": 0.003240783514885803,
"pid": 2796,
"ppid": 1388,
"status": "running",
@@ -412,7 +412,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0036404138746968747,
"membytes": 0.0036404138746968747,
"pid": 2852,
"ppid": 1388,
"status": "running",
@@ -422,7 +422,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.005932820864060882,
"membytes": 0.005932820864060882,
"pid": 2936,
"ppid": 1388,
"status": "running",
@@ -432,7 +432,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004240853519786147,
"membytes": 0.004240853519786147,
"pid": 2944,
"ppid": 1388,
"status": "running",
@@ -442,7 +442,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.009068229209444265,
"membytes": 0.009068229209444265,
"pid": 2952,
"ppid": 1388,
"status": "running",
@@ -452,7 +452,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.008205345745971602,
"membytes": 0.008205345745971602,
"pid": 3036,
"ppid": 1388,
"status": "running",
@@ -462,7 +462,7 @@
{
"name": "spaceman.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0003360076159605526,
"membytes": 0.0003360076159605526,
"pid": 3112,
"ppid": 2440,
"status": "running",
@@ -472,7 +472,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.00409571413537715,
"membytes": 0.00409571413537715,
"pid": 3216,
"ppid": 1388,
"status": "running",
@@ -482,7 +482,7 @@
{
"name": "ShellExperienceHost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.030085604998314096,
"membytes": 0.030085604998314096,
"pid": 3228,
"ppid": 1628,
"status": "running",
@@ -492,7 +492,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004664342408541163,
"membytes": 0.004664342408541163,
"pid": 3244,
"ppid": 1388,
"status": "running",
@@ -502,7 +502,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004843281375620747,
"membytes": 0.004843281375620747,
"pid": 3268,
"ppid": 1388,
"status": "running",
@@ -512,7 +512,7 @@
{
"name": "python.exe",
"cpu_percent": 0.559375,
"memory_percent": 0.029455342192044896,
"membytes": 0.029455342192044896,
"pid": 3288,
"ppid": 4708,
"status": "running",
@@ -522,7 +522,7 @@
{
"name": "RuntimeBroker.exe",
"cpu_percent": 0.0,
"memory_percent": 0.010283025974840107,
"membytes": 0.010283025974840107,
"pid": 3296,
"ppid": 1628,
"status": "running",
@@ -532,7 +532,7 @@
{
"name": "RuntimeBroker.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006596883253000673,
"membytes": 0.006596883253000673,
"pid": 3308,
"ppid": 1628,
"status": "running",
@@ -542,7 +542,7 @@
{
"name": "spoolsv.exe",
"cpu_percent": 0.0,
"memory_percent": 0.008095994154978522,
"membytes": 0.008095994154978522,
"pid": 3708,
"ppid": 1388,
"status": "running",
@@ -552,7 +552,7 @@
{
"name": "conhost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.011507763793962596,
"membytes": 0.011507763793962596,
"pid": 3752,
"ppid": 6620,
"status": "running",
@@ -562,7 +562,7 @@
{
"name": "LogMeInSystray.exe",
"cpu_percent": 0.0,
"memory_percent": 0.010300919871548067,
"membytes": 0.010300919871548067,
"pid": 3780,
"ppid": 4664,
"status": "running",
@@ -572,7 +572,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.005767799372198599,
"membytes": 0.005767799372198599,
"pid": 3808,
"ppid": 1388,
"status": "running",
@@ -582,7 +582,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.007070077410388906,
"membytes": 0.007070077410388906,
"pid": 3816,
"ppid": 1388,
"status": "running",
@@ -592,7 +592,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.014217695039845633,
"membytes": 0.014217695039845633,
"pid": 3824,
"ppid": 1388,
"status": "running",
@@ -602,7 +602,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.022611920806623463,
"membytes": 0.022611920806623463,
"pid": 3832,
"ppid": 1388,
"status": "running",
@@ -612,7 +612,7 @@
{
"name": "nssm.exe",
"cpu_percent": 0.0,
"memory_percent": 0.003163243295817984,
"membytes": 0.003163243295817984,
"pid": 3840,
"ppid": 1388,
"status": "running",
@@ -622,7 +622,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0030717856015328626,
"membytes": 0.0030717856015328626,
"pid": 3856,
"ppid": 1388,
"status": "running",
@@ -632,7 +632,7 @@
{
"name": "LMIGuardianSvc.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004441662805064347,
"membytes": 0.004441662805064347,
"pid": 3868,
"ppid": 1388,
"status": "running",
@@ -642,7 +642,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0026781198739577773,
"membytes": 0.0026781198739577773,
"pid": 3876,
"ppid": 1388,
"status": "running",
@@ -652,7 +652,7 @@
{
"name": "ramaint.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0038471877922110613,
"membytes": 0.0038471877922110613,
"pid": 3884,
"ppid": 1388,
"status": "running",
@@ -662,7 +662,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.005374133644623514,
"membytes": 0.005374133644623514,
"pid": 3892,
"ppid": 1388,
"status": "running",
@@ -672,7 +672,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006421920707411746,
"membytes": 0.006421920707411746,
"pid": 3900,
"ppid": 1388,
"status": "running",
@@ -682,7 +682,7 @@
{
"name": "ssm.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0031612550850726546,
"membytes": 0.0031612550850726546,
"pid": 3908,
"ppid": 1388,
"status": "running",
@@ -692,7 +692,7 @@
{
"name": "MeshAgent.exe",
"cpu_percent": 0.0,
"memory_percent": 0.01894963661372797,
"membytes": 0.01894963661372797,
"pid": 3920,
"ppid": 1388,
"status": "running",
@@ -702,7 +702,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006905055918526623,
"membytes": 0.006905055918526623,
"pid": 4076,
"ppid": 1388,
"status": "running",
@@ -712,7 +712,7 @@
{
"name": "sihost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.012527715906316225,
"membytes": 0.012527715906316225,
"pid": 4136,
"ppid": 3268,
"status": "running",
@@ -722,7 +722,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004169277932954313,
"membytes": 0.004169277932954313,
"pid": 4160,
"ppid": 1388,
"status": "running",
@@ -732,7 +732,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006851374228402747,
"membytes": 0.006851374228402747,
"pid": 4192,
"ppid": 1388,
"status": "running",
@@ -742,7 +742,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006024278558346003,
"membytes": 0.006024278558346003,
"pid": 4208,
"ppid": 1388,
"status": "running",
@@ -752,7 +752,7 @@
{
"name": "LogMeIn.exe",
"cpu_percent": 0.0,
"memory_percent": 0.017691099211934895,
"membytes": 0.017691099211934895,
"pid": 4232,
"ppid": 1388,
"status": "running",
@@ -762,7 +762,7 @@
{
"name": "vmms.exe",
"cpu_percent": 0.0,
"memory_percent": 0.017331233067030397,
"membytes": 0.017331233067030397,
"pid": 4292,
"ppid": 1388,
"status": "running",
@@ -772,7 +772,7 @@
{
"name": "TabTip32.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0023441004687425535,
"membytes": 0.0023441004687425535,
"pid": 4304,
"ppid": 5916,
"status": "running",
@@ -782,7 +782,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.022273924979917578,
"membytes": 0.022273924979917578,
"pid": 4436,
"ppid": 1388,
"status": "running",
@@ -792,7 +792,7 @@
{
"name": "explorer.exe",
"cpu_percent": 0.0,
"memory_percent": 0.040491900039364585,
"membytes": 0.040491900039364585,
"pid": 4664,
"ppid": 2804,
"status": "running",
@@ -802,7 +802,7 @@
{
"name": "tacticalrmm.exe",
"cpu_percent": 0.0,
"memory_percent": 0.019854272502852533,
"membytes": 0.019854272502852533,
"pid": 4696,
"ppid": 3840,
"status": "running",
@@ -812,7 +812,7 @@
{
"name": "python.exe",
"cpu_percent": 0.0,
"memory_percent": 0.03651547854870715,
"membytes": 0.03651547854870715,
"pid": 4708,
"ppid": 3908,
"status": "running",
@@ -822,7 +822,7 @@
{
"name": "conhost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0060938659344325075,
"membytes": 0.0060938659344325075,
"pid": 4728,
"ppid": 4708,
"status": "running",
@@ -832,7 +832,7 @@
{
"name": "conhost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006127665517103096,
"membytes": 0.006127665517103096,
"pid": 4736,
"ppid": 4696,
"status": "running",
@@ -842,7 +842,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0035111801762505086,
"membytes": 0.0035111801762505086,
"pid": 4752,
"ppid": 1388,
"status": "running",
@@ -852,7 +852,7 @@
{
"name": "vmcompute.exe",
"cpu_percent": 0.0,
"memory_percent": 0.005598801458845658,
"membytes": 0.005598801458845658,
"pid": 5020,
"ppid": 1388,
"status": "running",
@@ -862,7 +862,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.005260805632139777,
"membytes": 0.005260805632139777,
"pid": 5088,
"ppid": 1388,
"status": "running",
@@ -872,7 +872,7 @@
{
"name": "vmwp.exe",
"cpu_percent": 0.0,
"memory_percent": 0.011384494727752215,
"membytes": 0.011384494727752215,
"pid": 5276,
"ppid": 5020,
"status": "running",
@@ -882,7 +882,7 @@
{
"name": "python.exe",
"cpu_percent": 0.0,
"memory_percent": 0.020685344594399937,
"membytes": 0.020685344594399937,
"pid": 5472,
"ppid": 4708,
"status": "running",
@@ -892,7 +892,7 @@
{
"name": "WmiPrvSE.exe",
"cpu_percent": 0.0,
"memory_percent": 0.010167709751611041,
"membytes": 0.010167709751611041,
"pid": 5712,
"ppid": 1628,
"status": "running",
@@ -902,7 +902,7 @@
{
"name": "TabTip.exe",
"cpu_percent": 0.0,
"memory_percent": 0.008543341572677483,
"membytes": 0.008543341572677483,
"pid": 5916,
"ppid": 4752,
"status": "running",
@@ -912,7 +912,7 @@
{
"name": "vmwp.exe",
"cpu_percent": 0.0,
"memory_percent": 0.011780148666072628,
"membytes": 0.011780148666072628,
"pid": 5924,
"ppid": 5020,
"status": "running",
@@ -922,7 +922,7 @@
{
"name": "msdtc.exe",
"cpu_percent": 0.0,
"memory_percent": 0.004956609388104484,
"membytes": 0.004956609388104484,
"pid": 6016,
"ppid": 1388,
"status": "running",
@@ -932,7 +932,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0025468979647660824,
"membytes": 0.0025468979647660824,
"pid": 6056,
"ppid": 1388,
"status": "running",
@@ -942,7 +942,7 @@
{
"name": "vmwp.exe",
"cpu_percent": 0.06875,
"memory_percent": 0.01141034146744149,
"membytes": 0.01141034146744149,
"pid": 6092,
"ppid": 5020,
"status": "running",
@@ -952,7 +952,7 @@
{
"name": "vmwp.exe",
"cpu_percent": 0.0,
"memory_percent": 0.011595245066757059,
"membytes": 0.011595245066757059,
"pid": 6296,
"ppid": 5020,
"status": "running",
@@ -962,7 +962,7 @@
{
"name": "cmd.exe",
"cpu_percent": 0.0,
"memory_percent": 0.00203990422470726,
"membytes": 0.00203990422470726,
"pid": 6620,
"ppid": 4664,
"status": "running",
@@ -972,7 +972,7 @@
{
"name": "ctfmon.exe",
"cpu_percent": 0.0,
"memory_percent": 0.007632741051316932,
"membytes": 0.007632741051316932,
"pid": 6648,
"ppid": 4752,
"status": "running",
@@ -982,7 +982,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.007199311108835272,
"membytes": 0.007199311108835272,
"pid": 6716,
"ppid": 1388,
"status": "running",
@@ -992,7 +992,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.0038054353665591583,
"membytes": 0.0038054353665591583,
"pid": 6760,
"ppid": 1388,
"status": "running",
@@ -1002,7 +1002,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.013456210324384736,
"membytes": 0.013456210324384736,
"pid": 6868,
"ppid": 1388,
"status": "running",
@@ -1012,7 +1012,7 @@
{
"name": "SearchUI.exe",
"cpu_percent": 0.0,
"memory_percent": 0.04596743243199986,
"membytes": 0.04596743243199986,
"pid": 6904,
"ppid": 1628,
"status": "stopped",
@@ -1022,7 +1022,7 @@
{
"name": "tacticalrmm.exe",
"cpu_percent": 0.0,
"memory_percent": 0.023025468641651836,
"membytes": 0.023025468641651836,
"pid": 6908,
"ppid": 7592,
"status": "running",
@@ -1032,7 +1032,7 @@
{
"name": "taskhostw.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006147547624556384,
"membytes": 0.006147547624556384,
"pid": 6984,
"ppid": 2440,
"status": "running",
@@ -1042,7 +1042,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.017520113087836627,
"membytes": 0.017520113087836627,
"pid": 7092,
"ppid": 1388,
"status": "running",
@@ -1052,7 +1052,7 @@
{
"name": "RuntimeBroker.exe",
"cpu_percent": 0.0,
"memory_percent": 0.011543551587378511,
"membytes": 0.011543551587378511,
"pid": 7148,
"ppid": 1628,
"status": "running",
@@ -1062,7 +1062,7 @@
{
"name": "dllhost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006175382574990985,
"membytes": 0.006175382574990985,
"pid": 7232,
"ppid": 1628,
"status": "running",
@@ -1072,7 +1072,7 @@
{
"name": "conhost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.006191288260953614,
"membytes": 0.006191288260953614,
"pid": 7288,
"ppid": 6908,
"status": "running",
@@ -1082,7 +1082,7 @@
{
"name": "nssm.exe",
"cpu_percent": 0.0,
"memory_percent": 0.003252712779357776,
"membytes": 0.003252712779357776,
"pid": 7592,
"ppid": 1388,
"status": "running",
@@ -1092,7 +1092,7 @@
{
"name": "svchost.exe",
"cpu_percent": 0.0,
"memory_percent": 0.005972585078967456,
"membytes": 0.005972585078967456,
"pid": 8012,
"ppid": 1388,
"status": "running",


@@ -25,4 +25,5 @@ urlpatterns = [
path("scripts/", include("scripts.urls")),
path("alerts/", include("alerts.urls")),
path("accounts/", include("accounts.urls")),
path("natsapi/", include("natsapi.urls")),
]


@@ -27,30 +27,36 @@ jobs:
source env/bin/activate
cd /myagent/_work/1/s/api/tacticalrmm
pip install --no-cache-dir --upgrade pip
pip install --no-cache-dir setuptools==50.3.2 wheel==0.36.1
pip install --no-cache-dir -r requirements.txt -r requirements-test.txt
pip install --no-cache-dir setuptools==51.1.2 wheel==0.36.2
pip install --no-cache-dir -r requirements.txt -r requirements-test.txt -r requirements-dev.txt
displayName: "Install Python Dependencies"
- script: |
cd /myagent/_work/1/s/api
git config user.email "admin@example.com"
git config user.name "Bob"
git fetch
git checkout develop
git pull
source env/bin/activate
cd /myagent/_work/1/s/api/tacticalrmm
coverage run manage.py test -v 2
coveralls
if [ $? -ne 0 ]; then
exit 1
fi
displayName: "Run django tests"
- script: |
rm -rf /myagent/_work/1/s/web/node_modules
cd /myagent/_work/1/s/web
npm install
displayName: "Install Frontend"
cd /myagent/_work/1/s/api
source env/bin/activate
black --check tacticalrmm
if [ $? -ne 0 ]; then
exit 1
fi
displayName: "Codestyle black"
- script: |
cd /myagent/_work/1/s/web
npm run test:unit
displayName: "Run Vue Tests"
cd /myagent/_work/1/s/api
source env/bin/activate
cd /myagent/_work/1/s/api/tacticalrmm
export CIRCLE_BRANCH=$BUILD_SOURCEBRANCH
coveralls
displayName: "coveralls"
env:
CIRCLECI: 1
CIRCLE_BUILD_NUM: $(Build.BuildNumber)


@@ -1,6 +1,6 @@
#!/bin/bash
SCRIPT_VERSION="5"
SCRIPT_VERSION="6"
SCRIPT_URL='https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/backup.sh'
GREEN='\033[0;32m'
@@ -83,7 +83,7 @@ sudo tar -czvf ${tmp_dir}/nginx/etc-nginx.tar.gz -C /etc/nginx .
sudo tar -czvf ${tmp_dir}/confd/etc-confd.tar.gz -C /etc/conf.d .
sudo cp ${sysd}/rmm.service ${sysd}/celery.service ${sysd}/celerybeat.service ${sysd}/celery-winupdate.service ${sysd}/meshcentral.service ${sysd}/nats.service ${tmp_dir}/systemd/
sudo cp ${sysd}/rmm.service ${sysd}/celery.service ${sysd}/celerybeat.service ${sysd}/celery-winupdate.service ${sysd}/meshcentral.service ${sysd}/nats.service ${sysd}/natsapi.service ${tmp_dir}/systemd/
cat /rmm/api/tacticalrmm/tacticalrmm/private/log/debug.log | gzip -9 > ${tmp_dir}/rmm/debug.log.gz
cp /rmm/api/tacticalrmm/tacticalrmm/local_settings.py /rmm/api/tacticalrmm/app.ini ${tmp_dir}/rmm/


@@ -11,8 +11,8 @@ API_HOST=api.example.com
MESH_HOST=mesh.example.com
# mesh settings
MESH_USER=meshcentral
MESH_PASS=meshcentralpass
MESH_USER=tactical
MESH_PASS=tactical
MONGODB_USER=mongouser
MONGODB_PASSWORD=mongopass

Some files were not shown because too many files have changed in this diff Show More