Compare commits

..

149 Commits

Author SHA1 Message Date
wh1te909
6b46025261 Release 0.2.18 2020-12-19 08:44:45 +00:00
wh1te909
5ea503f23e bump version 2020-12-19 08:43:47 +00:00
wh1te909
ce95f9ac23 add codestyle to tests 2020-12-19 08:24:47 +00:00
wh1te909
c3fb87501b black 2020-12-19 08:20:12 +00:00
wh1te909
dc6a343612 bump mesh 2020-12-19 07:55:39 +00:00
wh1te909
3a61053957 update reqs 2020-12-19 07:50:32 +00:00
wh1te909
570129e4d4 add debian 10 to readme 2020-12-19 07:50:05 +00:00
wh1te909
3315c7045f if ubuntu, force 20.04 2020-12-19 07:45:21 +00:00
wh1te909
5ae50e242c always run npm install during update 2020-12-18 21:59:23 +00:00
Tragic Bronson
bbcf449719 Merge pull request #214 from mckinnon81/debian
Updated install.sh for Debian
2020-12-18 13:56:14 -08:00
Matthew McKinnon
aab10f7184 Removed certbot test-cert. Not needed 2020-12-18 08:32:40 +10:00
Matthew McKinnon
8d43488cb8 Updated install.sh for Debian
Updated api\tacticalrmm\accounts\views.py valid_window=10
2020-12-18 08:28:01 +10:00
Tragic Bronson
0a9c647e19 Merge pull request #211 from sadnub/develop
Fix default policies
2020-12-16 13:51:37 -08:00
wh1te909
40db5d4aa8 remove debug print 2020-12-16 21:50:43 +00:00
Josh
9254532baa fix applying default policies in certain situations 2020-12-16 20:38:36 +00:00
Josh
7abed47cf0 Merge branch 'develop' of https://github.com/wh1te909/tacticalrmm into develop 2020-12-16 19:08:12 +00:00
Tragic Bronson
5c6ac758f7 Merge pull request #210 from mckinnon81/scripts
Fixed Paths in ClearFirefoxCache.ps1 & ClearGoogleChromeCache.ps1
2020-12-16 09:36:33 -08:00
Matthew McKinnon
007677962c Fixed Paths in ClearFirefoxCache.ps1 & ClearGoogleChromeCache.ps1 2020-12-16 22:32:04 +10:00
wh1te909
9c4aeab64a back to develop 2020-12-16 10:47:05 +00:00
wh1te909
48e6fc0efe test coveralls 2 2020-12-16 10:41:39 +00:00
wh1te909
c8be713d11 test coveralls 2020-12-16 10:38:00 +00:00
wh1te909
ae887c8648 switch to branch head for coveralls 2020-12-16 10:20:50 +00:00
wh1te909
5daac2531b add accounts tests for new settings 2020-12-16 10:09:58 +00:00
wh1te909
68def00327 fix tests 2020-12-16 09:40:36 +00:00
wh1te909
67e7976710 pipelines attempt 2 2020-12-16 09:25:28 +00:00
wh1te909
35747e937e try to get pipelines to fail 2020-12-16 09:10:53 +00:00
wh1te909
fb439787a4 Release 0.2.17 2020-12-16 00:37:59 +00:00
wh1te909
8fa368f473 bump versions 2020-12-16 00:36:43 +00:00
sadnub
c84a9d07b1 tactical-cli for managing docker installations 2020-12-15 13:41:03 -05:00
wh1te909
7fb46cdfc4 add more targeting options to bulk actions 2020-12-15 08:30:55 +00:00
Tragic Bronson
52985e5ddc Merge pull request #203 from wh1te909/dependabot/npm_and_yarn/docs/ini-1.3.8
Bump ini from 1.3.5 to 1.3.8 in /docs
2020-12-15 00:10:01 -08:00
wh1te909
e880935dc3 make script name required 2020-12-15 07:37:37 +00:00
wh1te909
cc22b1bca5 send favorite data when adding new script 2020-12-15 07:37:09 +00:00
wh1te909
49a5128918 remove extra migrations already handled by another func 2020-12-15 05:06:33 +00:00
wh1te909
fedc7dcb44 #204 add optional setting to prevent initial admin user from being modified or deleted 2020-12-14 21:00:25 +00:00
wh1te909
cd32b20215 remove vue tests for now 2020-12-14 20:59:43 +00:00
wh1te909
15cd9832c4 change fav script context menu style 2020-12-14 20:41:07 +00:00
wh1te909
f25d4e4553 add agent recovery periodic task 2020-12-14 19:27:09 +00:00
Tragic Bronson
12d1c82b63 Merge pull request #200 from sadnub/develop
Scripts Manager Rework
2020-12-14 10:35:19 -08:00
wh1te909
aebe855078 add a favorite menu to agent's context menu for easy way to run scripts 2020-12-14 11:28:00 +00:00
wh1te909
3416a71ebd add community scripts to migration 2020-12-14 07:17:51 +00:00
Tragic Bronson
94b3fea528 Create FUNDING.yml 2020-12-13 20:57:05 -08:00
Josh
ad1a9ecca1 fix agent table pending actions filter 2020-12-14 04:39:42 +00:00
Josh
715accfb8a scripts rework 2020-12-14 04:39:02 +00:00
wh1te909
a8e03c6138 Release 0.2.16 2020-12-13 11:46:12 +00:00
wh1te909
f69446b648 agent 1.1.11 wh1te909/rmmagent@f693d15322 2020-12-13 11:45:24 +00:00
dependabot[bot]
eedfbe5846 Bump ini from 1.3.5 to 1.3.8 in /docs
Bumps [ini](https://github.com/isaacs/ini) from 1.3.5 to 1.3.8.
- [Release notes](https://github.com/isaacs/ini/releases)
- [Commits](https://github.com/isaacs/ini/compare/v1.3.5...v1.3.8)

Signed-off-by: dependabot[bot] <support@github.com>
2020-12-13 07:18:22 +00:00
wh1te909
153351cc9f Release 0.2.15 2020-12-12 09:40:08 +00:00
wh1te909
1b1eec40a7 agent check-in and recovery improvements 2020-12-12 09:39:20 +00:00
wh1te909
763877541a Release 0.2.14 2020-12-12 01:59:47 +00:00
wh1te909
1fad7d72a2 fix for special chars in computer hostname closes #201 2020-12-12 01:59:10 +00:00
wh1te909
51ea2ea879 Release 0.2.13 2020-12-11 20:48:11 +00:00
wh1te909
d77a478bf0 agent 1.1.8 2020-12-11 20:47:54 +00:00
wh1te909
e413c0264a Release 0.2.12 2020-12-11 07:28:27 +00:00
wh1te909
f88e7f898c bump versions 2020-12-11 07:27:42 +00:00
wh1te909
d07bd4a6db add optional silent flag to installer 2020-12-11 07:25:42 +00:00
wh1te909
fb34c099d5 Release 0.2.11 2020-12-10 19:13:24 +00:00
wh1te909
1d2ee56a15 bump versions 2020-12-10 19:12:30 +00:00
wh1te909
86665f7f09 change update task for agent 1.1.6 2020-12-10 19:08:29 +00:00
wh1te909
0d2b4af986 Release 0.2.10 2020-12-10 10:34:40 +00:00
wh1te909
dc2b2eeb9f bump versions 2020-12-10 10:33:44 +00:00
wh1te909
e5dbb66d53 cleanup agent update func 2020-12-10 10:31:58 +00:00
wh1te909
3474b1c471 fix failing checks alert 2020-12-10 00:01:54 +00:00
wh1te909
3886de5b7c add postgres vacuum 2020-12-10 00:00:02 +00:00
wh1te909
2b3cec06b3 Release 0.2.9 2020-12-09 05:07:11 +00:00
wh1te909
8536754d14 bump version for new agent 2020-12-09 05:06:19 +00:00
wh1te909
1f36235801 fix wording 2020-12-09 05:04:25 +00:00
wh1te909
a4194b14f9 Release 0.2.8 2020-12-09 00:50:48 +00:00
wh1te909
2dcc629d9d bump versions 2020-12-09 00:31:33 +00:00
wh1te909
98ddadc6bc add sync task 2020-12-08 23:02:05 +00:00
wh1te909
f6e47b7383 remove extra services view 2020-12-08 20:09:09 +00:00
wh1te909
f073ddc906 Release 0.2.7 2020-12-07 09:50:37 +00:00
wh1te909
3e00631925 cleanup older pending action agent updates if one exists with an older agent version 2020-12-07 09:50:15 +00:00
wh1te909
9b7ac58562 Release 0.2.6 2020-12-07 08:56:20 +00:00
wh1te909
f242ddd801 bump versions 2020-12-07 08:55:49 +00:00
wh1te909
c129886fe2 change sleeps 2020-12-07 08:30:21 +00:00
wh1te909
f577e814cf add refresh summary 2020-12-07 08:29:37 +00:00
wh1te909
c860a0cedd update reqs 2020-12-07 00:35:38 +00:00
wh1te909
ae7e28e492 try fixing coveralls branch 2020-12-06 00:43:36 +00:00
wh1te909
90a63234ad add coveralls 2020-12-04 06:40:44 +00:00
wh1te909
14bca52e8f remove dead code, update middleware 2020-12-04 06:25:53 +00:00
wh1te909
2f3c3361cf remove static clients list from audit log 2020-12-04 06:05:25 +00:00
wh1te909
4034134055 add task scheduler expire after wh1te909/rmmagent@fe91e5f110 2020-12-03 22:46:25 +00:00
sadnub
c04f94cb7b fix certificates on docker 2020-12-03 12:29:03 -05:00
sadnub
fd1bbc7925 Update docker-build-push.yml 2020-12-02 07:53:12 -05:00
wh1te909
ff69bed394 Release 0.2.5 2020-12-02 11:06:55 +00:00
wh1te909
d6e8c5146f bump version 2020-12-02 11:06:34 +00:00
wh1te909
9a04cf99d7 fix pending actions ui 2020-12-02 11:05:29 +00:00
wh1te909
86e7c11e71 fix mesh nginx 2020-12-02 10:40:20 +00:00
wh1te909
361cc08faa Release 0.2.4 2020-12-02 05:45:55 +00:00
wh1te909
70dc771052 bump rmm and agent ver 2020-12-02 05:35:13 +00:00
wh1te909
c14873a799 update optional args 2020-12-02 05:33:35 +00:00
wh1te909
bba5abd74b bump script vers 2020-12-02 05:16:16 +00:00
wh1te909
a224e79c1f bump mesh and vue 2020-12-02 04:51:05 +00:00
wh1te909
c305d98186 remove old code 2020-12-02 04:14:35 +00:00
wh1te909
7c5a473e71 add flag to skip salt during agent install 2020-12-02 04:00:36 +00:00
wh1te909
5e0f5d1eed check for old installers 2020-12-02 03:23:16 +00:00
wh1te909
238b269bc4 remove update salt task 2020-12-02 03:22:19 +00:00
Josh
0ad121b9d2 fix tests attempt 2 2020-12-01 16:46:38 +00:00
Josh
7088acd9fd fix tests and remove travis config 2020-12-01 16:41:59 +00:00
Josh
e0a900d4b6 test for rm_orphaned_task in core maintenance 2020-12-01 16:35:34 +00:00
Josh
a0fe2f0c7d fix tests 2020-12-01 16:11:03 +00:00
Josh
d5b9bc2f26 get cert file locations from settings in docker build 2020-12-01 16:10:49 +00:00
Josh
584254e6ca fix/add tests 2020-12-01 15:55:26 +00:00
wh1te909
a2963ed7bb reload table when pending action changed 2020-12-01 07:01:50 +00:00
wh1te909
2a3c2e133d fix wording 2020-12-01 06:43:52 +00:00
wh1te909
3e7dcb2755 don't hide refresh when sw list empty 2020-12-01 06:27:34 +00:00
wh1te909
faeec00b39 remove more tasks now handled by the agent 2020-12-01 06:16:09 +00:00
wh1te909
eeed81392f add rm orphaned tasks to maintenance tab 2020-12-01 05:55:27 +00:00
wh1te909
95dce9e992 check for supported agent 2020-12-01 05:52:32 +00:00
wh1te909
502bd2a191 patch nats 2020-12-01 05:16:47 +00:00
wh1te909
17ac92a9d0 remove dead code 2020-12-01 05:16:37 +00:00
wh1te909
ba028cde0c remove old api app 2020-12-01 05:00:13 +00:00
wh1te909
6e751e7a9b remove bg task that's handled by the agent now 2020-12-01 04:51:51 +00:00
wh1te909
948b56d0e6 add a ghetto check for non standard cert 2020-12-01 04:47:09 +00:00
wh1te909
4bf2dc9ece don't create unnecessary outage records 2020-12-01 04:44:38 +00:00
Josh
125823f8ab add server maintenance to tools menu 2020-12-01 03:44:58 +00:00
Josh
24d33397e9 add virtual scroll to audit log table 2020-12-01 02:17:20 +00:00
Josh
2c553825f4 add server-side pagination for audit logging 2020-12-01 02:01:10 +00:00
wh1te909
198c485e9a reduce threads 2020-11-30 21:51:25 +00:00
wh1te909
0138505507 reduce threads 2020-11-30 21:49:47 +00:00
wh1te909
5d50dcc600 add api endpoint for software 2020-11-30 21:45:12 +00:00
wh1te909
7bdd8c4626 add some type hints 2020-11-30 10:28:25 +00:00
wh1te909
fc82c35f0c finish moving schedtasks to nats 2020-11-30 08:18:47 +00:00
wh1te909
426ebad300 start moving schedtasks to nats wh1te909/rmmagent@0cde11a067 2020-11-29 23:40:29 +00:00
sadnub
1afe61c593 fix docker-compose.yml 2020-11-29 14:24:32 -05:00
wh1te909
c20751829b create migration for schedtask weekdays 2020-11-29 10:37:46 +00:00
Tragic Bronson
a3b8ee8392 Merge pull request #194 from sadnub/develop
Get mesh version for settings.py
2020-11-28 21:02:58 -08:00
Josh
156c0fe7f6 add dockerignore and get MESH_VER from settings.py 2020-11-29 04:47:34 +00:00
wh1te909
216f7a38cf support mesh > 0.6.84 wh1te909/rmmagent@85aab2facf 2020-11-29 04:15:57 +00:00
Tragic Bronson
fd04dc10d4 Merge pull request #193 from sadnub/feature-uichanges
Some fixes
2020-11-28 19:48:41 -08:00
Josh
d39bdce926 add install agent to site context menu 2020-11-29 03:30:31 +00:00
Josh
c6e01245b0 fix disabled prop on edit agent patch policy and agent checks tab 2020-11-29 02:56:35 +00:00
Josh
c168ee7ba4 bump app version and mesh version 2020-11-29 02:44:29 +00:00
Josh
7575253000 regenerate policies and tasks on site/client change on agent 2020-11-29 02:35:30 +00:00
Josh
c28c1efbb1 Add pending actions to agent table and filter 2020-11-29 02:13:50 +00:00
sadnub
e6aa2c3b78 Delete docker-build-publish.yml 2020-11-28 09:47:41 -05:00
sadnub
ab7c481f83 Create docker-build-push.yml 2020-11-28 09:47:27 -05:00
wh1te909
84ad1c352d Release 0.2.3 2020-11-28 06:09:38 +00:00
wh1te909
e9aad39ac9 bump version 2020-11-28 06:09:01 +00:00
wh1te909
c3444a87bc update backup/restore scripts for nats 2020-11-28 06:05:47 +00:00
sadnub
67b224b340 get automated builds working 2020-11-28 00:23:11 -05:00
sadnub
bded14d36b fix action file 2020-11-27 23:12:22 -05:00
sadnub
73fa0b6631 create github action for testing 2020-11-27 23:09:45 -05:00
Josh Krawczyk
2f07337588 fix mesh container and wait for nginx 2020-11-27 21:15:27 -05:00
wh1te909
da163d44e7 fix nats reload for old agents, fix domain parsing for non standard domains 2020-11-27 22:41:32 +00:00
Josh
56fbf8ae0c docker fixes for salt modules and nats config reload 2020-11-27 19:31:33 +00:00
wh1te909
327eb4b39b Release 0.2.2 2020-11-26 07:37:00 +00:00
wh1te909
ae7873a7e3 fix duplicate key error causing UI to freeze 2020-11-26 07:36:26 +00:00
147 changed files with 4466 additions and 2876 deletions

5
.dockerignore Normal file

@@ -0,0 +1,5 @@
.git
.cache
**/*.env
**/env
**/node_modules

12
.github/FUNDING.yml vendored Normal file

@@ -0,0 +1,12 @@
# These are supported funding model platforms
github: wh1te909
patreon: # Replace with a single Patreon username
open_collective: # Replace with a single Open Collective username
ko_fi: # Replace with a single Ko-fi username
tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
liberapay: # Replace with a single Liberapay username
issuehunt: # Replace with a single IssueHunt username
otechie: # Replace with a single Otechie username
custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']

88
.github/workflows/docker-build-push.yml vendored Normal file

@@ -0,0 +1,88 @@
name: Publish Tactical Docker Images
on:
push:
tags:
- "v*.*.*"
jobs:
docker:
name: Build and Push Docker Images
runs-on: ubuntu-latest
steps:
- name: Check out the repo
uses: actions/checkout@v2
- name: Get Github Tag
id: prep
run: |
echo ::set-output name=version::${GITHUB_REF#refs/tags/v}
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to DockerHub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Build and Push Tactical Image
uses: docker/build-push-action@v2
with:
context: .
push: true
pull: true
file: ./docker/containers/tactical/dockerfile
platforms: linux/amd64
tags: tacticalrmm/tactical:${{ steps.prep.outputs.version }},tacticalrmm/tactical:latest
- name: Build and Push Tactical MeshCentral Image
uses: docker/build-push-action@v2
with:
context: .
push: true
pull: true
file: ./docker/containers/tactical-meshcentral/dockerfile
platforms: linux/amd64
tags: tacticalrmm/tactical-meshcentral:${{ steps.prep.outputs.version }},tacticalrmm/tactical-meshcentral:latest
- name: Build and Push Tactical NATS Image
uses: docker/build-push-action@v2
with:
context: .
push: true
pull: true
file: ./docker/containers/tactical-nats/dockerfile
platforms: linux/amd64
tags: tacticalrmm/tactical-nats:${{ steps.prep.outputs.version }},tacticalrmm/tactical-nats:latest
- name: Build and Push Tactical Salt Image
uses: docker/build-push-action@v2
with:
context: .
push: true
pull: true
file: ./docker/containers/tactical-salt/dockerfile
platforms: linux/amd64
tags: tacticalrmm/tactical-salt:${{ steps.prep.outputs.version }},tacticalrmm/tactical-salt:latest
- name: Build and Push Tactical Frontend Image
uses: docker/build-push-action@v2
with:
context: .
push: true
pull: true
file: ./docker/containers/tactical-frontend/dockerfile
platforms: linux/amd64
tags: tacticalrmm/tactical-frontend:${{ steps.prep.outputs.version }},tacticalrmm/tactical-frontend:latest
- name: Build and Push Tactical Nginx Image
uses: docker/build-push-action@v2
with:
context: .
push: true
pull: true
file: ./docker/containers/tactical-nginx/dockerfile
platforms: linux/amd64
tags: tacticalrmm/tactical-nginx:${{ steps.prep.outputs.version }},tacticalrmm/tactical-nginx:latest
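The "Get Github Tag" step above derives the Docker tag version with shell parameter expansion: when a tag like v0.2.18 is pushed, `${GITHUB_REF#refs/tags/v}` strips the `refs/tags/v` prefix. A minimal sketch of the same transformation (the tag value is an example, not from this release):

```python
# When a git tag matching "v*.*.*" is pushed, GitHub Actions sets GITHUB_REF
# to refs/tags/<tagname>; the workflow strips the "refs/tags/v" prefix with
# ${GITHUB_REF#refs/tags/v}. The equivalent in Python:
ref = "refs/tags/v0.2.18"  # example value GITHUB_REF would hold
version = ref.removeprefix("refs/tags/v")
print(version)  # 0.2.18
```

Note that the `::set-output` workflow command used here has since been deprecated by GitHub Actions in favor of writing to `$GITHUB_OUTPUT`.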


@@ -2,7 +2,7 @@
"python.pythonPath": "api/tacticalrmm/env/bin/python",
"python.languageServer": "Pylance",
"python.analysis.extraPaths": [
"api/tacticalrmm"
"api/tacticalrmm",
],
"python.analysis.typeCheckingMode": "basic",
"python.formatting.provider": "black",


@@ -36,7 +36,7 @@ Demo database resets every hour. Alot of features are disabled for obvious reaso
## Installation
### Requirements
- VPS with 4GB ram (an install script is provided for Ubuntu Server 20.04)
- VPS with 4GB ram (an install script is provided for Ubuntu Server 20.04 / Debian 10)
- A domain you own with at least 3 subdomains
- Google Authenticator app (2 factor is NOT optional)


@@ -20,6 +20,5 @@ omit =
*/urls.py
*/tests.py
*/test.py
api/*.py
checks/utils.py


@@ -6,28 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('accounts', '0002_auto_20200810_0544'),
("accounts", "0002_auto_20200810_0544"),
]
operations = [
migrations.AddField(
model_name='user',
name='created_by',
model_name="user",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='created_time',
model_name="user",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='user',
name='modified_by',
model_name="user",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='modified_time',
model_name="user",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]


@@ -6,24 +6,24 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('accounts', '0003_auto_20200922_1344'),
("accounts", "0003_auto_20200922_1344"),
]
operations = [
migrations.RemoveField(
model_name='user',
name='created_by',
model_name="user",
name="created_by",
),
migrations.RemoveField(
model_name='user',
name='created_time',
model_name="user",
name="created_time",
),
migrations.RemoveField(
model_name='user',
name='modified_by',
model_name="user",
name="modified_by",
),
migrations.RemoveField(
model_name='user',
name='modified_time',
model_name="user",
name="modified_time",
),
]


@@ -6,28 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('accounts', '0004_auto_20201002_1257'),
("accounts", "0004_auto_20201002_1257"),
]
operations = [
migrations.AddField(
model_name='user',
name='created_by',
model_name="user",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='created_time',
model_name="user",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='user',
name='modified_by',
model_name="user",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='modified_time',
model_name="user",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]


@@ -6,13 +6,13 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('accounts', '0007_update_agent_primary_key'),
("accounts", "0007_update_agent_primary_key"),
]
operations = [
migrations.AddField(
model_name='user',
name='dark_mode',
model_name="user",
name="dark_mode",
field=models.BooleanField(default=True),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 3.1.4 on 2020-12-10 17:00
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("accounts", "0008_user_dark_mode"),
]
operations = [
migrations.AddField(
model_name="user",
name="show_community_scripts",
field=models.BooleanField(default=True),
),
]


@@ -8,6 +8,7 @@ class User(AbstractUser, BaseAuditModel):
is_active = models.BooleanField(default=True)
totp_key = models.CharField(max_length=50, null=True, blank=True)
dark_mode = models.BooleanField(default=True)
show_community_scripts = models.BooleanField(default=True)
agent = models.OneToOneField(
"agents.Agent",

View File

@@ -155,6 +155,33 @@ class GetUpdateDeleteUser(TacticalTestCase):
self.check_not_authenticated("put", url)
@override_settings(ROOT_USER="john")
def test_put_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
data = {
"id": self.john.pk,
"username": "john",
"email": "johndoe@xlawgaming.com",
"first_name": "John",
"last_name": "Doe",
}
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 200)
@override_settings(ROOT_USER="john")
def test_put_not_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
data = {
"id": self.john.pk,
"username": "john",
"email": "johndoe@xlawgaming.com",
"first_name": "John",
"last_name": "Doe",
}
self.client.force_authenticate(user=self.alice)
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 400)
def test_delete(self):
url = f"/accounts/{self.john.pk}/users/"
r = self.client.delete(url)
@@ -166,6 +193,19 @@ class GetUpdateDeleteUser(TacticalTestCase):
self.check_not_authenticated("delete", url)
@override_settings(ROOT_USER="john")
def test_delete_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
r = self.client.delete(url)
self.assertEqual(r.status_code, 200)
@override_settings(ROOT_USER="john")
def test_delete_non_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
self.client.force_authenticate(user=self.alice)
r = self.client.delete(url)
self.assertEqual(r.status_code, 400)
class TestUserAction(TacticalTestCase):
def setUp(self):
@@ -184,6 +224,21 @@ class TestUserAction(TacticalTestCase):
self.check_not_authenticated("post", url)
@override_settings(ROOT_USER="john")
def test_post_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk, "password": "3ASDjh2345kJA!@#)#@__123"}
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 200)
@override_settings(ROOT_USER="john")
def test_post_non_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk, "password": "3ASDjh2345kJA!@#)#@__123"}
self.client.force_authenticate(user=self.alice)
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 400)
def test_put(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk}
@@ -195,12 +250,34 @@ class TestUserAction(TacticalTestCase):
self.check_not_authenticated("put", url)
def test_darkmode(self):
@override_settings(ROOT_USER="john")
def test_put_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk}
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 200)
user = User.objects.get(pk=self.john.pk)
self.assertEqual(user.totp_key, "")
@override_settings(ROOT_USER="john")
def test_put_non_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk}
self.client.force_authenticate(user=self.alice)
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 400)
def test_user_ui(self):
url = "/accounts/users/ui/"
data = {"dark_mode": False}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
data = {"show_community_scripts": True}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.check_not_authenticated("patch", url)
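These tests exercise the optional ROOT_USER protection (#204): when the setting is present, only the root user itself may modify or delete the root account, and other users get a 400. A framework-free sketch of that guard, using hypothetical stand-ins (the real code compares Django's `request.user` against the target `User` object):

```python
# Standalone sketch of the ROOT_USER guard the tests above exercise.
# Settings and username strings here are hypothetical stand-ins; the project
# itself checks hasattr(settings, "ROOT_USER") and compares user objects.
class Settings:
    ROOT_USER = "john"  # optional setting naming the protected initial admin


def may_modify(settings, requester: str, target: str) -> bool:
    # Block changes to the root user unless the root user acts on itself
    if (
        hasattr(settings, "ROOT_USER")
        and requester != target
        and target == settings.ROOT_USER
    ):
        return False
    return True


print(may_modify(Settings, "alice", "john"))  # False -> the API returns 400
print(may_modify(Settings, "john", "john"))   # True  -> 200
print(may_modify(Settings, "alice", "bob"))   # True: only ROOT_USER is protected
```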


@@ -60,7 +60,7 @@ class LoginView(KnoxLoginView):
if settings.DEBUG and token == "sekret":
valid = True
elif totp.verify(token, valid_window=1):
elif totp.verify(token, valid_window=10):
valid = True
if valid:
@@ -108,6 +108,13 @@ class GetUpdateDeleteUser(APIView):
def put(self, request, pk):
user = get_object_or_404(User, pk=pk)
if (
hasattr(settings, "ROOT_USER")
and request.user != user
and user.username == settings.ROOT_USER
):
return notify_error("The root user cannot be modified from the UI")
serializer = UserSerializer(instance=user, data=request.data, partial=True)
serializer.is_valid(raise_exception=True)
serializer.save()
@@ -115,7 +122,15 @@ class GetUpdateDeleteUser(APIView):
return Response("ok")
def delete(self, request, pk):
get_object_or_404(User, pk=pk).delete()
user = get_object_or_404(User, pk=pk)
if (
hasattr(settings, "ROOT_USER")
and request.user != user
and user.username == settings.ROOT_USER
):
return notify_error("The root user cannot be deleted from the UI")
user.delete()
return Response("ok")
@@ -124,8 +139,14 @@ class UserActions(APIView):
# reset password
def post(self, request):
user = get_object_or_404(User, pk=request.data["id"])
if (
hasattr(settings, "ROOT_USER")
and request.user != user
and user.username == settings.ROOT_USER
):
return notify_error("The root user cannot be modified from the UI")
user.set_password(request.data["password"])
user.save()
@@ -133,8 +154,14 @@ class UserActions(APIView):
# reset two factor token
def put(self, request):
user = get_object_or_404(User, pk=request.data["id"])
if (
hasattr(settings, "ROOT_USER")
and request.user != user
and user.username == settings.ROOT_USER
):
return notify_error("The root user cannot be modified from the UI")
user.totp_key = ""
user.save()
@@ -161,6 +188,13 @@ class TOTPSetup(APIView):
class UserUI(APIView):
def patch(self, request):
user = request.user
user.dark_mode = request.data["dark_mode"]
user.save(update_fields=["dark_mode"])
return Response("ok")
if "dark_mode" in request.data:
user.dark_mode = request.data["dark_mode"]
user.save(update_fields=["dark_mode"])
if "show_community_scripts" in request.data:
user.show_community_scripts = request.data["show_community_scripts"]
user.save(update_fields=["show_community_scripts"])
return Response("ok")
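The `valid_window` change above widens TOTP acceptance from 1 to 10 time steps on either side of server time, i.e. codes up to roughly five minutes stale (10 × 30 s) still verify. A stdlib-only RFC 6238-style sketch of what that parameter does; this is illustrative, not the TOTP library the project actually calls:

```python
import hashlib
import hmac
import struct


def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    # RFC 4226: HMAC-SHA1 over the 8-byte big-endian counter, dynamic truncation
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10**digits
    return str(code).zfill(digits)


def totp_verify(key: bytes, token: str, now: float, step: int = 30,
                valid_window: int = 0) -> bool:
    # Accept a token generated in any time step within +/- valid_window of now
    counter = int(now // step)
    return any(
        hmac.compare_digest(hotp(key, counter + i), token)
        for i in range(-valid_window, valid_window + 1)
    )


key = b"not-a-real-secret"  # hypothetical key for illustration only
now = 1_600_000_000.0
stale = hotp(key, int((now - 90) // 30))  # token from three 30s steps ago

print(totp_verify(key, stale, now, valid_window=0))   # rejected by the default window
print(totp_verify(key, stale, now, valid_window=10))  # True: within +/-10 steps
```

The trade-off is clock-drift tolerance versus a longer window in which an intercepted code remains usable.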


@@ -26,7 +26,7 @@ def get_wmi_data():
agent = Recipe(
Agent,
hostname="DESKTOP-TEST123",
version="1.1.0",
version="1.1.1",
monitoring_type=cycle(["workstation", "server"]),
salt_id=generate_agent_id("DESKTOP-TEST123"),
agent_id="71AHC-AA813-HH1BC-AAHH5-00013|DESKTOP-TEST123",


@@ -7,14 +7,20 @@ import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('clients', '0006_deployment'),
('agents', '0020_auto_20201025_2129'),
("clients", "0006_deployment"),
("agents", "0020_auto_20201025_2129"),
]
operations = [
migrations.AddField(
model_name='agent',
name='site_link',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='agents', to='clients.site'),
model_name="agent",
name="site_link",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="agents",
to="clients.site",
),
),
]


@@ -6,16 +6,16 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('agents', '0022_update_site_primary_key'),
("agents", "0022_update_site_primary_key"),
]
operations = [
migrations.RemoveField(
model_name='agent',
name='client',
model_name="agent",
name="client",
),
migrations.RemoveField(
model_name='agent',
name='site',
model_name="agent",
name="site",
),
]


@@ -6,13 +6,13 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('agents', '0023_auto_20201101_2312'),
("agents", "0023_auto_20201101_2312"),
]
operations = [
migrations.RenameField(
model_name='agent',
old_name='site_link',
new_name='site',
model_name="agent",
old_name="site_link",
new_name="site",
),
]


@@ -6,13 +6,22 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('agents', '0024_auto_20201101_2319'),
("agents", "0024_auto_20201101_2319"),
]
operations = [
migrations.AlterField(
model_name='recoveryaction',
name='mode',
field=models.CharField(choices=[('salt', 'Salt'), ('mesh', 'Mesh'), ('command', 'Command'), ('rpc', 'Nats RPC')], default='mesh', max_length=50),
model_name="recoveryaction",
name="mode",
field=models.CharField(
choices=[
("salt", "Salt"),
("mesh", "Mesh"),
("command", "Command"),
("rpc", "Nats RPC"),
],
default="mesh",
max_length=50,
),
),
]


@@ -6,13 +6,23 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('agents', '0025_auto_20201122_0407'),
("agents", "0025_auto_20201122_0407"),
]
operations = [
migrations.AlterField(
model_name='recoveryaction',
name='mode',
field=models.CharField(choices=[('salt', 'Salt'), ('mesh', 'Mesh'), ('command', 'Command'), ('rpc', 'Nats RPC'), ('checkrunner', 'Checkrunner')], default='mesh', max_length=50),
model_name="recoveryaction",
name="mode",
field=models.CharField(
choices=[
("salt", "Salt"),
("mesh", "Mesh"),
("command", "Command"),
("rpc", "Nats RPC"),
("checkrunner", "Checkrunner"),
],
default="mesh",
max_length=50,
),
),
]


@@ -1,5 +1,4 @@
import requests
import datetime as dt
import time
import base64
from Crypto.Cipher import AES
@@ -8,9 +7,7 @@ from Crypto.Hash import SHA3_384
from Crypto.Util.Padding import pad
import validators
import msgpack
import random
import re
import string
from collections import Counter
from loguru import logger
from packaging import version as pyver
@@ -89,6 +86,10 @@ class Agent(BaseAuditModel):
def has_nats(self):
return pyver.parse(self.version) >= pyver.parse("1.1.0")
@property
def has_gotasks(self):
return pyver.parse(self.version) >= pyver.parse("1.1.1")
@property
def timezone(self):
# return the default timezone unless the timezone is explicity set per agent
@@ -163,13 +164,11 @@ class Agent(BaseAuditModel):
elif i.status == "failing":
failing += 1
has_failing_checks = True if failing > 0 else False
ret = {
"total": total,
"passing": passing,
"failing": failing,
"has_failing_checks": has_failing_checks,
"has_failing_checks": failing > 0,
}
return ret
@@ -545,6 +544,7 @@ class Agent(BaseAuditModel):
ret = AgentEditSerializer(agent).data
del ret["all_timezones"]
del ret["client"]
return ret
@staticmethod
@@ -573,61 +573,6 @@ class Agent(BaseAuditModel):
return resp
def schedule_reboot(self, obj):
start_date = dt.datetime.strftime(obj, "%Y-%m-%d")
start_time = dt.datetime.strftime(obj, "%H:%M")
# let windows task scheduler automatically delete the task after it runs
end_obj = obj + dt.timedelta(minutes=15)
end_date = dt.datetime.strftime(end_obj, "%Y-%m-%d")
end_time = dt.datetime.strftime(end_obj, "%H:%M")
task_name = "TacticalRMM_SchedReboot_" + "".join(
random.choice(string.ascii_letters) for _ in range(10)
)
r = self.salt_api_cmd(
timeout=15,
func="task.create_task",
arg=[
f"name={task_name}",
"force=True",
"action_type=Execute",
'cmd="C:\\Windows\\System32\\shutdown.exe"',
'arguments="/r /t 5 /f"',
"trigger_type=Once",
f'start_date="{start_date}"',
f'start_time="{start_time}"',
f'end_date="{end_date}"',
f'end_time="{end_time}"',
"ac_only=False",
"stop_if_on_batteries=False",
"delete_after=Immediately",
],
)
if r == "error" or (isinstance(r, bool) and not r):
return "failed"
elif r == "timeout":
return "timeout"
elif isinstance(r, bool) and r:
from logs.models import PendingAction
details = {
"taskname": task_name,
"time": str(obj),
}
PendingAction(agent=self, action_type="schedreboot", details=details).save()
nice_time = dt.datetime.strftime(obj, "%B %d, %Y at %I:%M %p")
return {"msg": {"time": nice_time, "agent": self.hostname}}
else:
return "failed"
def not_supported(self, version_added):
return pyver.parse(self.version) < pyver.parse(version_added)
def delete_superseded_updates(self):
try:
pks = [] # list of pks to delete
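The new `has_gotasks` property, like `has_nats` above it, gates features on the agent's version using `packaging.version` (`pyver.parse`) rather than raw string comparison. A stdlib sketch of why that matters; `ver` is a minimal stand-in for the real parser:

```python
# Why the model parses versions instead of comparing strings:
# lexicographic comparison mis-orders multi-digit components.
def ver(v: str) -> tuple:
    # Minimal stand-in for packaging.version.parse; numeric dotted parts only
    return tuple(int(part) for part in v.split("."))


print("1.1.10" > "1.1.9")            # False: string compare is lexicographic
print(ver("1.1.10") > ver("1.1.9"))  # True: numeric compare orders correctly


def has_gotasks(agent_version: str) -> bool:
    # Mirrors the new property: feature requires agent >= 1.1.1
    return ver(agent_version) >= ver("1.1.1")


print(has_gotasks("1.1.0"))   # False
print(has_gotasks("1.1.11"))  # True
```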


@@ -36,12 +36,16 @@ class AgentSerializer(serializers.ModelSerializer):
class AgentTableSerializer(serializers.ModelSerializer):
patches_pending = serializers.ReadOnlyField(source="has_patches_pending")
pending_actions = serializers.SerializerMethodField()
status = serializers.ReadOnlyField()
checks = serializers.ReadOnlyField()
last_seen = serializers.SerializerMethodField()
client_name = serializers.ReadOnlyField(source="client.name")
site_name = serializers.ReadOnlyField(source="site.name")
def get_pending_actions(self, obj):
return obj.pendingactions.filter(status="pending").count()
def get_last_seen(self, obj):
if obj.time_zone is not None:
agent_tz = pytz.timezone(obj.time_zone)
@@ -62,6 +66,7 @@ class AgentTableSerializer(serializers.ModelSerializer):
"description",
"needs_reboot",
"patches_pending",
"pending_actions",
"status",
"overdue_text_alert",
"overdue_email_alert",


@@ -3,12 +3,12 @@ from loguru import logger
from time import sleep
import random
import requests
from concurrent.futures import ThreadPoolExecutor
from packaging import version as pyver
from typing import List
from django.conf import settings
from tacticalrmm.celery import app
from agents.models import Agent, AgentOutage
from core.models import CoreSettings
@@ -16,174 +16,123 @@ from logs.models import PendingAction
logger.configure(**settings.LOG_CONFIG)
OLD_64_PY_AGENT = "https://github.com/wh1te909/winagent/releases/download/v0.11.2/winagent-v0.11.2.exe"
OLD_32_PY_AGENT = "https://github.com/wh1te909/winagent/releases/download/v0.11.2/winagent-v0.11.2-x86.exe"
def _check_agent_service(pk: int) -> None:
agent = Agent.objects.get(pk=pk)
r = asyncio.run(agent.nats_cmd({"func": "ping"}, timeout=2))
if r == "pong":
logger.info(
f"Detected crashed tacticalagent service on {agent.hostname}, attempting recovery"
)
data = {"func": "recover", "payload": {"mode": "tacagent"}}
asyncio.run(agent.nats_cmd(data, wait=False))
@app.task
def send_agent_update_task(pks, version):
assert isinstance(pks, list)
def monitor_agents_task() -> None:
q = Agent.objects.all()
agents: List[int] = [i.pk for i in q if i.has_nats and i.status != "online"]
with ThreadPoolExecutor(max_workers=15) as executor:
executor.map(_check_agent_service, agents)
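`monitor_agents_task` fans the per-agent service check out across a thread pool. A self-contained sketch of the same `executor.map` pattern, with a placeholder standing in for the real ping/recover call:

```python
from concurrent.futures import ThreadPoolExecutor

def _check_agent(pk: int) -> str:
    # stand-in for the real per-agent nats ping + recovery logic
    return f"checked {pk}"

agent_pks = [1, 2, 3, 4, 5]
with ThreadPoolExecutor(max_workers=15) as executor:
    # map runs calls concurrently but yields results in input order
    results = list(executor.map(_check_agent, agent_pks))
```

Because the real `_check_agent_service` does blocking network I/O per agent, a thread pool keeps the task's wall-clock time roughly bounded by the slowest agent rather than the sum of all of them.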
def agent_update(pk: int) -> str:
agent = Agent.objects.get(pk=pk)
# skip if we can't determine the arch
if agent.arch is None:
logger.warning(f"Unable to determine arch on {agent.hostname}. Skipping.")
return "noarch"
# force an update to 1.1.5 since 1.1.6 needs agent to be on 1.1.5 first
if pyver.parse(agent.version) < pyver.parse("1.1.5"):
version = "1.1.5"
if agent.arch == "64":
url = "https://github.com/wh1te909/rmmagent/releases/download/v1.1.5/winagent-v1.1.5.exe"
inno = "winagent-v1.1.5.exe"
elif agent.arch == "32":
url = "https://github.com/wh1te909/rmmagent/releases/download/v1.1.5/winagent-v1.1.5-x86.exe"
inno = "winagent-v1.1.5-x86.exe"
else:
return "nover"
else:
version = settings.LATEST_AGENT_VER
url = agent.winagent_dl
inno = agent.win_inno_exe
if agent.has_nats:
if agent.pendingactions.filter(
action_type="agentupdate", status="pending"
).exists():
action = agent.pendingactions.filter(
action_type="agentupdate", status="pending"
).last()
if pyver.parse(action.details["version"]) < pyver.parse(version):
action.delete()
else:
return "pending"
PendingAction.objects.create(
agent=agent,
action_type="agentupdate",
details={
"url": url,
"version": version,
"inno": inno,
},
)
return "created"
# TODO
# Salt is deprecated, remove this once salt is gone
else:
agent.salt_api_async(
func="win_agent.do_agent_update_v2",
kwargs={
"inno": inno,
"url": url,
},
)
return "salt"
@app.task
def send_agent_update_task(pks: List[int], version: str) -> None:
q = Agent.objects.filter(pk__in=pks)
agents = [i.pk for i in q if pyver.parse(i.version) < pyver.parse(version)]
agents: List[int] = [
i.pk for i in q if pyver.parse(i.version) < pyver.parse(version)
]
chunks = (agents[i : i + 30] for i in range(0, len(agents), 30))
for chunk in chunks:
for pk in chunk:
agent = Agent.objects.get(pk=pk)
# skip if we can't determine the arch
if agent.arch is None:
logger.warning(
f"Unable to determine arch on {agent.salt_id}. Skipping."
)
continue
# golang agent only backwards compatible with py agent 0.11.2
# force an upgrade to the latest python agent if version < 0.11.2
if pyver.parse(agent.version) < pyver.parse("0.11.2"):
url = OLD_64_PY_AGENT if agent.arch == "64" else OLD_32_PY_AGENT
inno = (
"winagent-v0.11.2.exe"
if agent.arch == "64"
else "winagent-v0.11.2-x86.exe"
)
else:
url = agent.winagent_dl
inno = agent.win_inno_exe
logger.info(
f"Updating {agent.salt_id} current version {agent.version} using {inno}"
)
if agent.has_nats:
if agent.pendingactions.filter(
action_type="agentupdate", status="pending"
).exists():
continue
PendingAction.objects.create(
agent=agent,
action_type="agentupdate",
details={
"url": agent.winagent_dl,
"version": settings.LATEST_AGENT_VER,
"inno": agent.win_inno_exe,
},
)
# TODO
# Salt is deprecated, remove this once salt is gone
else:
r = agent.salt_api_async(
func="win_agent.do_agent_update_v2",
kwargs={
"inno": inno,
"url": url,
},
)
sleep(10)
for pk in agents:
agent_update(pk)
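The update tasks above batch agents with a slice-based generator (`agents[i : i + 30]`). A minimal sketch of that chunking pattern on its own:

```python
def chunked(items: list, size: int = 30):
    # lazily yield fixed-size slices; the final chunk may be shorter
    return (items[i : i + size] for i in range(0, len(items), size))

batches = [batch for batch in chunked(list(range(7)), 3)]
```

Chunking like this lets the task sleep between batches so a large fleet isn't told to update all at once.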
@app.task
def auto_self_agent_update_task():
def auto_self_agent_update_task() -> None:
core = CoreSettings.objects.first()
if not core.agent_auto_update:
logger.info("Agent auto update is disabled. Skipping.")
return
q = Agent.objects.only("pk", "version")
agents = [
pks: List[int] = [
i.pk
for i in q
if pyver.parse(i.version) < pyver.parse(settings.LATEST_AGENT_VER)
]
logger.info(f"Updating {len(agents)}")
chunks = (agents[i : i + 30] for i in range(0, len(agents), 30))
for chunk in chunks:
for pk in chunk:
agent = Agent.objects.get(pk=pk)
# skip if we can't determine the arch
if agent.arch is None:
logger.warning(
f"Unable to determine arch on {agent.salt_id}. Skipping."
)
continue
# golang agent only backwards compatible with py agent 0.11.2
# force an upgrade to the latest python agent if version < 0.11.2
if pyver.parse(agent.version) < pyver.parse("0.11.2"):
url = OLD_64_PY_AGENT if agent.arch == "64" else OLD_32_PY_AGENT
inno = (
"winagent-v0.11.2.exe"
if agent.arch == "64"
else "winagent-v0.11.2-x86.exe"
)
else:
url = agent.winagent_dl
inno = agent.win_inno_exe
logger.info(
f"Updating {agent.salt_id} current version {agent.version} using {inno}"
)
if agent.has_nats:
if agent.pendingactions.filter(
action_type="agentupdate", status="pending"
).exists():
continue
PendingAction.objects.create(
agent=agent,
action_type="agentupdate",
details={
"url": agent.winagent_dl,
"version": settings.LATEST_AGENT_VER,
"inno": agent.win_inno_exe,
},
)
# TODO
# Salt is deprecated, remove this once salt is gone
else:
r = agent.salt_api_async(
func="win_agent.do_agent_update_v2",
kwargs={
"inno": inno,
"url": url,
},
)
sleep(10)
for pk in pks:
agent_update(pk)
@app.task
def update_salt_minion_task():
q = Agent.objects.all()
agents = [
i.pk
for i in q
if pyver.parse(i.version) >= pyver.parse("0.11.0")
and pyver.parse(i.salt_ver) < pyver.parse(settings.LATEST_SALT_VER)
def sync_sysinfo_task():
agents = Agent.objects.all()
online = [
i
for i in agents
if pyver.parse(i.version) >= pyver.parse("1.1.3") and i.status == "online"
]
chunks = (agents[i : i + 50] for i in range(0, len(agents), 50))
for chunk in chunks:
for pk in chunk:
agent = Agent.objects.get(pk=pk)
r = agent.salt_api_async(func="win_agent.update_salt")
sleep(20)
@app.task
def get_wmi_detail_task(pk):
agent = Agent.objects.get(pk=pk)
if agent.has_nats:
asyncio.run(agent.nats_cmd({"func": "sysinfo"}, wait=False))
else:
agent.salt_api_async(timeout=30, func="win_agent.local_sys_info")
return "ok"
for agent in online:
asyncio.run(agent.nats_cmd({"func": "sync"}, wait=False))
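`sync_sysinfo_task` first filters the fleet down to agents that are both new enough for the nats `sync` command and currently online. A sketch of that filter with hypothetical `(hostname, version, status)` tuples and a simplified version parser:

```python
def eligible(agents):
    # keep agents new enough for the nats "sync" command and currently online
    def parse(v):
        # plain X.Y.Z comparison; stand-in for packaging.version.parse
        return tuple(int(p) for p in v.split("."))
    return [
        host for host, ver, status in agents
        if parse(ver) >= parse("1.1.3") and status == "online"
    ]

sample = [
    ("host1", "1.1.3", "online"),
    ("host2", "1.0.2", "online"),   # too old for "sync"
    ("host3", "1.1.4", "offline"),  # right version, but unreachable
]
```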
@app.task
@@ -209,25 +158,6 @@ def batch_sync_modules_task():
sleep(10)
@app.task
def batch_sysinfo_task():
# update system info using WMI
agents = Agent.objects.all()
agents_nats = [agent for agent in agents if agent.has_nats]
minions = [
agent.salt_id
for agent in agents
if not agent.has_nats and pyver.parse(agent.version) >= pyver.parse("0.11.0")
]
if minions:
Agent.salt_batch_async(minions=minions, func="win_agent.local_sys_info")
for agent in agents_nats:
asyncio.run(agent.nats_cmd({"func": "sysinfo"}, wait=False))
@app.task
def uninstall_agent_task(salt_id, has_nats):
attempts = 0
@@ -331,19 +261,22 @@ def agent_recovery_sms_task(pk):
@app.task
def agent_outages_task():
agents = Agent.objects.only("pk")
agents = Agent.objects.only(
"pk", "last_seen", "overdue_time", "overdue_email_alert", "overdue_text_alert"
)
for agent in agents:
if agent.status == "overdue":
outages = AgentOutage.objects.filter(agent=agent)
if outages and outages.last().is_active:
continue
if agent.overdue_email_alert or agent.overdue_text_alert:
if agent.status == "overdue":
outages = AgentOutage.objects.filter(agent=agent)
if outages and outages.last().is_active:
continue
outage = AgentOutage(agent=agent)
outage.save()
outage = AgentOutage(agent=agent)
outage.save()
if agent.overdue_email_alert and not agent.maintenance_mode:
agent_outage_email_task.delay(pk=outage.pk)
if agent.overdue_email_alert and not agent.maintenance_mode:
agent_outage_email_task.delay(pk=outage.pk)
if agent.overdue_text_alert and not agent.maintenance_mode:
agent_outage_sms_task.delay(pk=outage.pk)
if agent.overdue_text_alert and not agent.maintenance_mode:
agent_outage_sms_task.delay(pk=outage.pk)
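The reworked `agent_outages_task` only records an outage when the agent is overdue, at least one alert type is enabled, and no outage is already open. That gating logic can be sketched as a pure function (names here are illustrative, not the repo's API):

```python
def should_record_outage(
    status: str,
    email_alert: bool,
    text_alert: bool,
    has_active_outage: bool,
) -> bool:
    # outages are only recorded for overdue agents that have at least one
    # alert type enabled and no outage already open for them
    if not (email_alert or text_alert):
        return False
    if status != "overdue":
        return False
    return not has_active_outage
```

Checking the alert flags first matches the diff's intent: agents with no alerting enabled never create outage rows at all.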

View File

@@ -5,22 +5,20 @@ from unittest.mock import patch
from model_bakery import baker
from itertools import cycle
from django.test import TestCase, override_settings
from django.conf import settings
from django.utils import timezone as djangotime
from logs.models import PendingAction
from tacticalrmm.test import TacticalTestCase
from .serializers import AgentSerializer
from winupdate.serializers import WinUpdatePolicySerializer
from .models import Agent
from .tasks import (
agent_recovery_sms_task,
auto_self_agent_update_task,
update_salt_minion_task,
get_wmi_detail_task,
sync_salt_modules_task,
batch_sync_modules_task,
batch_sysinfo_task,
OLD_64_PY_AGENT,
OLD_32_PY_AGENT,
)
from winupdate.models import WinUpdatePolicy
@@ -33,7 +31,7 @@ class TestAgentViews(TacticalTestCase):
client = baker.make("clients.Client", name="Google")
site = baker.make("clients.Site", client=client, name="LA Office")
self.agent = baker.make_recipe(
"agents.online_agent", site=site, version="1.1.0"
"agents.online_agent", site=site, version="1.1.1"
)
baker.make_recipe("winupdate.winupdate_policy", agent=self.agent)
@@ -186,10 +184,10 @@ class TestAgentViews(TacticalTestCase):
self.check_not_authenticated("get", url)
@patch("agents.models.Agent.nats_cmd")
def test_power_action(self, nats_cmd):
url = f"/agents/poweraction/"
def test_reboot_now(self, nats_cmd):
url = f"/agents/reboot/"
data = {"pk": self.agent.pk, "action": "rebootnow"}
data = {"pk": self.agent.pk}
nats_cmd.return_value = "ok"
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 200)
@@ -222,30 +220,37 @@ class TestAgentViews(TacticalTestCase):
self.check_not_authenticated("post", url)
@patch("agents.models.Agent.salt_api_cmd")
def test_reboot_later(self, mock_ret):
url = f"/agents/rebootlater/"
@patch("agents.models.Agent.nats_cmd")
def test_reboot_later(self, nats_cmd):
url = f"/agents/reboot/"
data = {
"pk": self.agent.pk,
"datetime": "2025-08-29 18:41",
}
mock_ret.return_value = True
r = self.client.post(url, data, format="json")
nats_cmd.return_value = "ok"
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertEqual(r.data["time"], "August 29, 2025 at 06:41 PM")
self.assertEqual(r.data["agent"], self.agent.hostname)
mock_ret.return_value = "failed"
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 400)
nats_data = {
"func": "schedtask",
"schedtaskpayload": {
"type": "schedreboot",
"trigger": "once",
"name": r.data["task_name"],
"year": 2025,
"month": "August",
"day": 29,
"hour": 18,
"min": 41,
},
}
nats_cmd.assert_called_with(nats_data, timeout=10)
mock_ret.return_value = "timeout"
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 400)
mock_ret.return_value = False
nats_cmd.return_value = "error creating task"
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 400)
@@ -253,12 +258,12 @@ class TestAgentViews(TacticalTestCase):
"pk": self.agent.pk,
"datetime": "rm -rf /",
}
r = self.client.post(url, data_invalid, format="json")
r = self.client.patch(url, data_invalid, format="json")
self.assertEqual(r.status_code, 400)
self.assertEqual(r.data, "Invalid date")
self.check_not_authenticated("post", url)
self.check_not_authenticated("patch", url)
@patch("os.path.exists")
@patch("subprocess.run")
@@ -428,7 +433,14 @@ class TestAgentViews(TacticalTestCase):
self.assertIn("&viewmode=13", r.data["file"])
self.assertIn("&viewmode=12", r.data["terminal"])
self.assertIn("&viewmode=11", r.data["control"])
self.assertIn("mstsc.html?login=", r.data["webrdp"])
self.assertIn("&gotonode=", r.data["file"])
self.assertIn("&gotonode=", r.data["terminal"])
self.assertIn("&gotonode=", r.data["control"])
self.assertIn("?login=", r.data["file"])
self.assertIn("?login=", r.data["terminal"])
self.assertIn("?login=", r.data["control"])
self.assertEqual(self.agent.hostname, r.data["hostname"])
self.assertEqual(self.agent.client.name, r.data["client"])
@@ -538,6 +550,7 @@ class TestAgentViews(TacticalTestCase):
payload = {
"mode": "command",
"monType": "all",
"target": "agents",
"client": None,
"site": None,
@@ -555,6 +568,7 @@ class TestAgentViews(TacticalTestCase):
payload = {
"mode": "command",
"monType": "servers",
"target": "agents",
"client": None,
"site": None,
@@ -569,6 +583,7 @@ class TestAgentViews(TacticalTestCase):
payload = {
"mode": "command",
"monType": "workstations",
"target": "client",
"client": self.agent.client.id,
"site": None,
@@ -586,6 +601,7 @@ class TestAgentViews(TacticalTestCase):
payload = {
"mode": "command",
"monType": "all",
"target": "client",
"client": self.agent.client.id,
"site": self.agent.site.id,
@@ -603,6 +619,7 @@ class TestAgentViews(TacticalTestCase):
payload = {
"mode": "scan",
"monType": "all",
"target": "agents",
"client": None,
"site": None,
@@ -616,6 +633,7 @@ class TestAgentViews(TacticalTestCase):
payload = {
"mode": "install",
"monType": "all",
"target": "client",
"client": self.agent.client.id,
"site": None,
@@ -739,19 +757,6 @@ class TestAgentTasks(TacticalTestCase):
self.authenticate()
self.setup_coresettings()
@patch("agents.models.Agent.nats_cmd")
@patch("agents.models.Agent.salt_api_async", return_value=None)
def test_get_wmi_detail_task(self, salt_api_async, nats_cmd):
self.agent_salt = baker.make_recipe("agents.agent", version="1.0.2")
ret = get_wmi_detail_task.s(self.agent_salt.pk).apply()
salt_api_async.assert_called_with(timeout=30, func="win_agent.local_sys_info")
self.assertEqual(ret.status, "SUCCESS")
self.agent_nats = baker.make_recipe("agents.agent", version="1.1.0")
ret = get_wmi_detail_task.s(self.agent_nats.pk).apply()
nats_cmd.assert_called_with({"func": "sysinfo"}, wait=False)
self.assertEqual(ret.status, "SUCCESS")
@patch("agents.models.Agent.salt_api_cmd")
def test_sync_salt_modules_task(self, salt_api_cmd):
self.agent = baker.make_recipe("agents.agent")
@@ -787,84 +792,77 @@ class TestAgentTasks(TacticalTestCase):
self.assertEqual(salt_batch_async.call_count, 4)
self.assertEqual(ret.status, "SUCCESS")
@patch("agents.models.Agent.nats_cmd")
@patch("agents.models.Agent.salt_batch_async", return_value=None)
@patch("agents.tasks.sleep", return_value=None)
def test_batch_sysinfo_task(self, mock_sleep, salt_batch_async, nats_cmd):
self.agents_nats = baker.make_recipe(
"agents.agent", version="1.1.0", _quantity=20
)
# test nats
ret = batch_sysinfo_task.s().apply()
self.assertEqual(nats_cmd.call_count, 20)
nats_cmd.assert_called_with({"func": "sysinfo"}, wait=False)
self.assertEqual(ret.status, "SUCCESS")
self.agents_salt = baker.make_recipe(
"agents.agent", version="1.0.2", _quantity=70
)
minions = [i.salt_id for i in self.agents_salt]
ret = batch_sysinfo_task.s().apply()
self.assertEqual(salt_batch_async.call_count, 1)
salt_batch_async.assert_called_with(
minions=minions, func="win_agent.local_sys_info"
)
self.assertEqual(ret.status, "SUCCESS")
salt_batch_async.reset_mock()
[i.delete() for i in self.agents_salt]
# test old agents, should not run
self.agents_old = baker.make_recipe(
"agents.agent", version="0.10.2", _quantity=70
)
ret = batch_sysinfo_task.s().apply()
salt_batch_async.assert_not_called()
self.assertEqual(ret.status, "SUCCESS")
@patch("agents.models.Agent.salt_api_async", return_value=None)
@patch("agents.tasks.sleep", return_value=None)
def test_update_salt_minion_task(self, mock_sleep, salt_api_async):
# test agents that need salt update
self.agents = baker.make_recipe(
"agents.agent",
version=settings.LATEST_AGENT_VER,
salt_ver="1.0.3",
_quantity=53,
)
ret = update_salt_minion_task.s().apply()
self.assertEqual(salt_api_async.call_count, 53)
self.assertEqual(ret.status, "SUCCESS")
[i.delete() for i in self.agents]
salt_api_async.reset_mock()
# test agents that need salt update but agent version too low
self.agents = baker.make_recipe(
"agents.agent",
version="0.10.2",
salt_ver="1.0.3",
_quantity=53,
)
ret = update_salt_minion_task.s().apply()
self.assertEqual(ret.status, "SUCCESS")
salt_api_async.assert_not_called()
[i.delete() for i in self.agents]
salt_api_async.reset_mock()
# test agents already on latest salt ver
self.agents = baker.make_recipe(
"agents.agent",
version=settings.LATEST_AGENT_VER,
salt_ver=settings.LATEST_SALT_VER,
_quantity=53,
)
ret = update_salt_minion_task.s().apply()
self.assertEqual(ret.status, "SUCCESS")
salt_api_async.assert_not_called()
@patch("agents.models.Agent.salt_api_async")
def test_agent_update(self, salt_api_async):
from agents.tasks import agent_update
agent_noarch = baker.make_recipe(
"agents.agent",
operating_system="Error getting OS",
version="1.1.11",
)
r = agent_update(agent_noarch.pk)
self.assertEqual(r, "noarch")
self.assertEqual(
PendingAction.objects.filter(
agent=agent_noarch, action_type="agentupdate"
).count(),
0,
)
agent64_nats = baker.make_recipe(
"agents.agent",
operating_system="Windows 10 Pro, 64 bit (build 19041.450)",
version="1.1.11",
)
r = agent_update(agent64_nats.pk)
self.assertEqual(r, "created")
action = PendingAction.objects.get(agent__pk=agent64_nats.pk)
self.assertEqual(action.action_type, "agentupdate")
self.assertEqual(action.status, "pending")
self.assertEqual(action.details["url"], settings.DL_64)
self.assertEqual(
action.details["inno"], f"winagent-v{settings.LATEST_AGENT_VER}.exe"
)
self.assertEqual(action.details["version"], settings.LATEST_AGENT_VER)
agent64_nats_before16 = baker.make_recipe(
"agents.agent",
operating_system="Windows 10 Pro, 64 bit (build 19041.450)",
version="1.1.4",
)
r = agent_update(agent64_nats_before16.pk)
self.assertEqual(r, "created")
action = PendingAction.objects.get(agent__pk=agent64_nats_before16.pk)
self.assertEqual(action.action_type, "agentupdate")
self.assertEqual(action.status, "pending")
self.assertEqual(
action.details["url"],
"https://github.com/wh1te909/rmmagent/releases/download/v1.1.5/winagent-v1.1.5.exe",
)
self.assertEqual(action.details["inno"], "winagent-v1.1.5.exe")
self.assertEqual(action.details["version"], "1.1.5")
agent64_salt = baker.make_recipe(
"agents.agent",
operating_system="Windows 10 Pro, 64 bit (build 19041.450)",
version="1.0.0",
)
salt_api_async.return_value = True
r = agent_update(agent64_salt.pk)
self.assertEqual(r, "salt")
salt_api_async.assert_called_with(
func="win_agent.do_agent_update_v2",
kwargs={
"inno": "winagent-v1.1.5.exe",
"url": "https://github.com/wh1te909/rmmagent/releases/download/v1.1.5/winagent-v1.1.5.exe",
},
)
salt_api_async.reset_mock()
""" @patch("agents.models.Agent.salt_api_async")
@patch("agents.tasks.sleep", return_value=None)
def test_auto_self_agent_update_task(self, mock_sleep, salt_api_async):
# test 64bit golang agent
@@ -967,4 +965,4 @@ class TestAgentTasks(TacticalTestCase):
"url": OLD_32_PY_AGENT,
},
)
self.assertEqual(ret.status, "SUCCESS")
self.assertEqual(ret.status, "SUCCESS") """

View File

@@ -12,7 +12,6 @@ urlpatterns = [
path("<pk>/agentdetail/", views.agent_detail),
path("<int:pk>/meshcentral/", views.meshcentral),
path("<str:arch>/getmeshexe/", views.get_mesh_exe),
path("poweraction/", views.power_action),
path("uninstall/", views.uninstall),
path("editagent/", views.edit_agent),
path("<pk>/geteventlog/<logtype>/<days>/", views.get_event_log),
@@ -20,7 +19,7 @@ urlpatterns = [
path("updateagents/", views.update_agents),
path("<pk>/getprocs/", views.get_processes),
path("<pk>/<pid>/killproc/", views.kill_proc),
path("rebootlater/", views.reboot_later),
path("reboot/", views.Reboot.as_view()),
path("installagent/", views.install_agent),
path("<int:pk>/ping/", views.ping),
path("recover/", views.recover),
@@ -31,4 +30,5 @@ urlpatterns = [
path("bulk/", views.bulk),
path("agent_counts/", views.agent_counts),
path("maintenance/", views.agent_maintenance),
path("<int:pk>/wmi/", views.WMI.as_view()),
]

View File

@@ -3,6 +3,8 @@ from loguru import logger
import os
import subprocess
import pytz
import random
import string
import datetime as dt
from packaging import version as pyver
@@ -18,7 +20,7 @@ from rest_framework import status, generics
from .models import Agent, AgentOutage, RecoveryAction, Note
from core.models import CoreSettings
from scripts.models import Script
from logs.models import AuditLog
from logs.models import AuditLog, PendingAction
from .serializers import (
AgentSerializer,
@@ -93,6 +95,8 @@ def uninstall(request):
@api_view(["PATCH"])
def edit_agent(request):
agent = get_object_or_404(Agent, pk=request.data["id"])
old_site = agent.site.pk
a_serializer = AgentSerializer(instance=agent, data=request.data, partial=True)
a_serializer.is_valid(raise_exception=True)
a_serializer.save()
@@ -104,6 +108,11 @@ def edit_agent(request):
p_serializer.is_valid(raise_exception=True)
p_serializer.save()
# check if site changed and initiate generating correct policies
if old_site != request.data["site"]:
agent.generate_checks_from_policies(clear=True)
agent.generate_tasks_from_policies(clear=True)
return Response("ok")
@@ -119,16 +128,9 @@ def meshcentral(request, pk):
if token == "err":
return notify_error("Invalid mesh token")
control = (
f"{core.mesh_site}/?login={token}&node={agent.mesh_node_id}&viewmode=11&hide=31"
)
terminal = (
f"{core.mesh_site}/?login={token}&node={agent.mesh_node_id}&viewmode=12&hide=31"
)
file = (
f"{core.mesh_site}/?login={token}&node={agent.mesh_node_id}&viewmode=13&hide=31"
)
webrdp = f"{core.mesh_site}/mstsc.html?login={token}&node={agent.mesh_node_id}"
control = f"{core.mesh_site}/?login={token}&gotonode={agent.mesh_node_id}&viewmode=11&hide=31"
terminal = f"{core.mesh_site}/?login={token}&gotonode={agent.mesh_node_id}&viewmode=12&hide=31"
file = f"{core.mesh_site}/?login={token}&gotonode={agent.mesh_node_id}&viewmode=13&hide=31"
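The three rewritten URLs differ only in `viewmode` (11 control, 12 terminal, 13 file transfer). A sketch with hypothetical values showing the `gotonode`-style MeshCentral URLs this view now builds:

```python
# hypothetical values; the real view pulls these from CoreSettings and the agent
mesh_site = "https://mesh.example.com"
token = "tok123"
node_id = "abc"

urls = {
    mode: f"{mesh_site}/?login={token}&gotonode={node_id}&viewmode={vm}&hide=31"
    for mode, vm in {"control": 11, "terminal": 12, "file": 13}.items()
}
```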
AuditLog.audit_mesh_session(username=request.user.username, hostname=agent.hostname)
@@ -137,7 +139,6 @@ def meshcentral(request, pk):
"control": control,
"terminal": terminal,
"file": file,
"webrdp": webrdp,
"status": agent.status,
"client": agent.client.name,
"site": agent.site.name,
@@ -201,19 +202,6 @@ def get_event_log(request, pk, logtype, days):
return Response(r)
@api_view(["POST"])
def power_action(request):
agent = get_object_or_404(Agent, pk=request.data["pk"])
if not agent.has_nats:
return notify_error("Requires agent version 1.1.0 or greater")
if request.data["action"] == "rebootnow":
r = asyncio.run(agent.nats_cmd({"func": "rebootnow"}, timeout=10))
if r != "ok":
return notify_error("Unable to contact the agent")
return Response("ok")
@api_view(["POST"])
def send_raw_cmd(request):
agent = get_object_or_404(Agent, pk=request.data["pk"])
@@ -372,24 +360,63 @@ def overdue_action(request):
return Response(agent.hostname)
@api_view(["POST"])
def reboot_later(request):
agent = get_object_or_404(Agent, pk=request.data["pk"])
date_time = request.data["datetime"]
class Reboot(APIView):
# reboot now
def post(self, request):
agent = get_object_or_404(Agent, pk=request.data["pk"])
if not agent.has_nats:
return notify_error("Requires agent version 1.1.0 or greater")
try:
obj = dt.datetime.strptime(date_time, "%Y-%m-%d %H:%M")
except Exception:
return notify_error("Invalid date")
r = asyncio.run(agent.nats_cmd({"func": "rebootnow"}, timeout=10))
if r != "ok":
return notify_error("Unable to contact the agent")
r = agent.schedule_reboot(obj)
return Response("ok")
if r == "timeout":
return notify_error("Unable to contact the agent")
elif r == "failed":
return notify_error("Something went wrong")
# reboot later
def patch(self, request):
agent = get_object_or_404(Agent, pk=request.data["pk"])
if not agent.has_gotasks:
return notify_error("Requires agent version 1.1.1 or greater")
return Response(r["msg"])
try:
obj = dt.datetime.strptime(request.data["datetime"], "%Y-%m-%d %H:%M")
except Exception:
return notify_error("Invalid date")
task_name = "TacticalRMM_SchedReboot_" + "".join(
random.choice(string.ascii_letters) for _ in range(10)
)
nats_data = {
"func": "schedtask",
"schedtaskpayload": {
"type": "schedreboot",
"trigger": "once",
"name": task_name,
"year": int(dt.datetime.strftime(obj, "%Y")),
"month": dt.datetime.strftime(obj, "%B"),
"day": int(dt.datetime.strftime(obj, "%d")),
"hour": int(dt.datetime.strftime(obj, "%H")),
"min": int(dt.datetime.strftime(obj, "%M")),
},
}
if pyver.parse(agent.version) >= pyver.parse("1.1.2"):
nats_data["schedtaskpayload"]["deleteafter"] = True
r = asyncio.run(agent.nats_cmd(nats_data, timeout=10))
if r != "ok":
return notify_error(r)
details = {"taskname": task_name, "time": str(obj)}
PendingAction.objects.create(
agent=agent, action_type="schedreboot", details=details
)
nice_time = dt.datetime.strftime(obj, "%B %d, %Y at %I:%M %p")
return Response(
{"time": nice_time, "agent": agent.hostname, "task_name": task_name}
)
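The `patch` handler converts the submitted `"%Y-%m-%d %H:%M"` string into the mixed int/name fields the go agent's `schedtask` payload expects, plus the human-readable confirmation time. The conversion in isolation (using the same date the test above uses; month name assumes the default C locale):

```python
import datetime as dt

obj = dt.datetime.strptime("2025-08-29 18:41", "%Y-%m-%d %H:%M")

payload = {
    "year": int(obj.strftime("%Y")),
    "month": obj.strftime("%B"),   # full month name, e.g. "August"
    "day": int(obj.strftime("%d")),
    "hour": int(obj.strftime("%H")),
    "min": int(obj.strftime("%M")),
}

# 12-hour format with AM/PM for the confirmation shown to the user
nice_time = obj.strftime("%B %d, %Y at %I:%M %p")
```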
@api_view(["POST"])
@@ -798,6 +825,11 @@ def bulk(request):
else:
return notify_error("Something went wrong")
if request.data["monType"] == "servers":
q = q.filter(monitoring_type="server")
elif request.data["monType"] == "workstations":
q = q.filter(monitoring_type="workstation")
minions = [agent.salt_id for agent in q]
agents = [agent.pk for agent in q]
@@ -871,3 +903,15 @@ def agent_maintenance(request):
return notify_error("Invalid data")
return Response("ok")
class WMI(APIView):
def get(self, request, pk):
agent = get_object_or_404(Agent, pk=pk)
if pyver.parse(agent.version) < pyver.parse("1.1.2"):
return notify_error("Requires agent version 1.1.2 or greater")
r = asyncio.run(agent.nats_cmd({"func": "sysinfo"}, timeout=20))
if r != "ok":
return notify_error("Unable to contact the agent")
return Response("ok")

View File

@@ -7,19 +7,25 @@ import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('checks', '0010_auto_20200922_1344'),
('alerts', '0002_auto_20200815_1618'),
("checks", "0010_auto_20200922_1344"),
("alerts", "0002_auto_20200815_1618"),
]
operations = [
migrations.AddField(
model_name='alert',
name='assigned_check',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='alert', to='checks.check'),
model_name="alert",
name="assigned_check",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="alert",
to="checks.check",
),
),
migrations.AlterField(
model_name='alert',
name='alert_time',
model_name="alert",
name="alert_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
]

View File

@@ -37,7 +37,7 @@ class Alert(models.Model):
@classmethod
def create_availability_alert(cls, agent):
pass
@classmethod
def create_check_alert(cls, check):
pass
pass

View File

@@ -16,4 +16,4 @@ class AlertSerializer(ModelSerializer):
class Meta:
model = Alert
fields = "__all__"
fields = "__all__"

View File

@@ -1,5 +0,0 @@
from django.apps import AppConfig
class ApiConfig(AppConfig):
name = "api"

View File

@@ -1,11 +0,0 @@
from django.urls import path
from . import views
from apiv3 import views as v3_views
urlpatterns = [
path("triggerpatchscan/", views.trigger_patch_scan),
path("<int:pk>/checkrunner/", views.CheckRunner.as_view()),
path("<int:pk>/taskrunner/", views.TaskRunner.as_view()),
path("<int:pk>/saltinfo/", views.SaltInfo.as_view()),
path("<int:pk>/meshinfo/", v3_views.MeshInfo.as_view()),
]

View File

@@ -1,149 +0,0 @@
from loguru import logger
from django.conf import settings
from django.shortcuts import get_object_or_404
from django.utils import timezone as djangotime
from rest_framework.response import Response
from rest_framework.views import APIView
from rest_framework.authentication import TokenAuthentication
from rest_framework.permissions import IsAuthenticated
from rest_framework.decorators import (
api_view,
authentication_classes,
permission_classes,
)
from agents.models import Agent
from checks.models import Check
from autotasks.models import AutomatedTask
from winupdate.tasks import check_for_updates_task
from autotasks.serializers import TaskRunnerGetSerializer, TaskRunnerPatchSerializer
from checks.serializers import CheckRunnerGetSerializer, CheckResultsSerializer
logger.configure(**settings.LOG_CONFIG)
@api_view(["PATCH"])
@authentication_classes((TokenAuthentication,))
@permission_classes((IsAuthenticated,))
def trigger_patch_scan(request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
reboot_policy = agent.get_patch_policy().reboot_after_install
reboot = False
if reboot_policy == "always":
reboot = True
if request.data["reboot"]:
if reboot_policy == "required":
reboot = True
elif reboot_policy == "never":
agent.needs_reboot = True
agent.save(update_fields=["needs_reboot"])
if reboot:
r = agent.salt_api_cmd(
timeout=15,
func="system.reboot",
arg=7,
kwargs={"in_seconds": True},
)
if r == "timeout" or r == "error" or (isinstance(r, bool) and not r):
check_for_updates_task.apply_async(
queue="wupdate", kwargs={"pk": agent.pk, "wait": False}
)
else:
logger.info(f"{agent.hostname} is rebooting after updates were installed.")
else:
check_for_updates_task.apply_async(
queue="wupdate", kwargs={"pk": agent.pk, "wait": False}
)
return Response("ok")
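`trigger_patch_scan` reduces the reboot decision to the patch policy plus whether the agent reported a reboot is needed. That decision can be sketched as a pure function (the `needs_reboot` flagging for the "never" policy is a side effect the sketch omits):

```python
def should_reboot(policy: str, agent_requested: bool) -> bool:
    # "always": reboot unconditionally after installing updates
    # "required": reboot only when the agent says one is required
    # "never": defer; the real view flags agent.needs_reboot instead
    if policy == "always":
        return True
    return agent_requested and policy == "required"
```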
class CheckRunner(APIView):
"""
For the Windows agent
"""
authentication_classes = [TokenAuthentication]
permission_classes = [IsAuthenticated]
def get(self, request, pk):
agent = get_object_or_404(Agent, pk=pk)
checks = Check.objects.filter(agent__pk=pk, overriden_by_policy=False)
ret = {
"agent": agent.pk,
"check_interval": agent.check_interval,
"checks": CheckRunnerGetSerializer(checks, many=True).data,
}
return Response(ret)
def patch(self, request, pk):
check = get_object_or_404(Check, pk=pk)
if check.check_type != "cpuload" and check.check_type != "memory":
serializer = CheckResultsSerializer(
instance=check, data=request.data, partial=True
)
serializer.is_valid(raise_exception=True)
serializer.save(last_run=djangotime.now())
else:
check.last_run = djangotime.now()
check.save(update_fields=["last_run"])
check.handle_check(request.data)
return Response("ok")
class TaskRunner(APIView):
"""
For the Windows Python agent
"""
authentication_classes = [TokenAuthentication]
permission_classes = [IsAuthenticated]
def get(self, request, pk):
task = get_object_or_404(AutomatedTask, pk=pk)
return Response(TaskRunnerGetSerializer(task).data)
def patch(self, request, pk):
task = get_object_or_404(AutomatedTask, pk=pk)
serializer = TaskRunnerPatchSerializer(
instance=task, data=request.data, partial=True
)
serializer.is_valid(raise_exception=True)
serializer.save(last_run=djangotime.now())
return Response("ok")
class SaltInfo(APIView):
authentication_classes = [TokenAuthentication]
permission_classes = [IsAuthenticated]
def get(self, request, pk):
agent = get_object_or_404(Agent, pk=pk)
ret = {
"latestVer": settings.LATEST_SALT_VER,
"currentVer": agent.salt_ver,
"salt_id": agent.salt_id,
}
return Response(ret)
def patch(self, request, pk):
agent = get_object_or_404(Agent, pk=pk)
agent.salt_ver = request.data["ver"]
agent.save(update_fields=["salt_ver"])
return Response("ok")

View File

@@ -2,4 +2,4 @@ from django.apps import AppConfig
class Apiv2Config(AppConfig):
name = 'apiv2'
name = "apiv2"

View File

@@ -45,15 +45,11 @@ class TestAPIv3(TacticalTestCase):
def test_get_mesh_info(self):
url = f"/api/v3/{self.agent.pk}/meshinfo/"
url2 = f"/api/v1/{self.agent.pk}/meshinfo/"
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
r = self.client.get(url2)
self.assertEqual(r.status_code, 200)
self.check_not_authenticated("get", url)
self.check_not_authenticated("get", url2)
def test_get_winupdater(self):
url = f"/api/v3/{self.agent.agent_id}/winupdater/"

View File

@@ -2,6 +2,7 @@ from django.urls import path
from . import views
urlpatterns = [
path("checkin/", views.CheckIn.as_view()),
path("hello/", views.Hello.as_view()),
path("checkrunner/", views.CheckRunner.as_view()),
path("<str:agentid>/checkrunner/", views.CheckRunner.as_view()),
@@ -14,4 +15,6 @@ urlpatterns = [
path("newagent/", views.NewAgent.as_view()),
path("winupdater/", views.WinUpdater.as_view()),
path("<str:agentid>/winupdater/", views.WinUpdater.as_view()),
path("software/", views.Software.as_view()),
path("installer/", views.Installer.as_view()),
]


@@ -2,12 +2,12 @@ import asyncio
import os
import requests
from loguru import logger
from packaging import version as pyver
from django.conf import settings
from django.shortcuts import get_object_or_404
from django.utils import timezone as djangotime
from django.http import HttpResponse
from rest_framework import serializers
from rest_framework.response import Response
from rest_framework.views import APIView
@@ -20,6 +20,7 @@ from checks.models import Check
from autotasks.models import AutomatedTask
from accounts.models import User
from winupdate.models import WinUpdatePolicy
from software.models import InstalledSoftware
from checks.serializers import CheckRunnerGetSerializerV3
from agents.serializers import WinAgentSerializer
from autotasks.serializers import TaskGOGetSerializer, TaskRunnerPatchSerializer
@@ -28,18 +29,122 @@ from winupdate.serializers import ApprovedUpdateSerializer
from agents.tasks import (
agent_recovery_email_task,
agent_recovery_sms_task,
get_wmi_detail_task,
sync_salt_modules_task,
)
from winupdate.tasks import check_for_updates_task
from software.tasks import get_installed_software, install_chocolatey
from software.tasks import install_chocolatey
from checks.utils import bytes2human
from tacticalrmm.utils import notify_error, reload_nats
from tacticalrmm.utils import notify_error, reload_nats, filter_software, SoftwareList
logger.configure(**settings.LOG_CONFIG)
class CheckIn(APIView):
"""
The agent's checkin endpoint
patch: called every 45 to 110 seconds, handles agent updates and recovery
put: called every 5 to 10 minutes, handles basic system info
post: called once on windows service startup
"""
authentication_classes = [TokenAuthentication]
permission_classes = [IsAuthenticated]
def patch(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
agent.version = request.data["version"]
agent.last_seen = djangotime.now()
agent.save(update_fields=["version", "last_seen"])
if agent.agentoutages.exists() and agent.agentoutages.last().is_active:
last_outage = agent.agentoutages.last()
last_outage.recovery_time = djangotime.now()
last_outage.save(update_fields=["recovery_time"])
if agent.overdue_email_alert:
agent_recovery_email_task.delay(pk=last_outage.pk)
if agent.overdue_text_alert:
agent_recovery_sms_task.delay(pk=last_outage.pk)
recovery = agent.recoveryactions.filter(last_run=None).last()
if recovery is not None:
recovery.last_run = djangotime.now()
recovery.save(update_fields=["last_run"])
return Response(recovery.send())
# handle agent update
if agent.pendingactions.filter(
action_type="agentupdate", status="pending"
).exists():
update = agent.pendingactions.filter(
action_type="agentupdate", status="pending"
).last()
update.status = "completed"
update.save(update_fields=["status"])
return Response(update.details)
# get any pending actions
if agent.pendingactions.filter(status="pending").exists():
agent.handle_pending_actions()
return Response("ok")
def put(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
serializer = WinAgentSerializer(instance=agent, data=request.data, partial=True)
serializer.is_valid(raise_exception=True)
if "disks" in request.data.keys():
disks = request.data["disks"]
new = []
# python agent
if isinstance(disks, dict):
for k, v in disks.items():
new.append(v)
else:
# golang agent
for disk in disks:
tmp = {
"device": disk["device"],
"fstype": disk["fstype"],
"total": bytes2human(disk["total"]),
"used": bytes2human(disk["used"]),
"free": bytes2human(disk["free"]),
"percent": int(disk["percent"]),
}
new.append(tmp)
serializer.save(disks=new)
return Response("ok")
if "logged_in_username" in request.data.keys():
if request.data["logged_in_username"] != "None":
serializer.save(last_logged_in_user=request.data["logged_in_username"])
return Response("ok")
serializer.save()
return Response("ok")
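The golang-agent branch above normalizes each disk payload with `bytes2human` before saving. A stand-alone sketch of that normalization, with a guessed implementation of the helper (the real one is imported from `checks.utils` and may round or format differently):

```python
def bytes2human(n: int) -> str:
    # guessed implementation of the checks.utils helper; the real
    # one may differ in rounding and suffix style
    symbols = ("K", "M", "G", "T", "P")
    prefix = {s: 1 << (i + 1) * 10 for i, s in enumerate(symbols)}
    for s in reversed(symbols):
        if n >= prefix[s]:
            return f"{n / prefix[s]:.1f}{s}B"
    return f"{n}B"


def normalize_disk(disk: dict) -> dict:
    # mirrors the golang-agent branch of CheckIn.put
    return {
        "device": disk["device"],
        "fstype": disk["fstype"],
        "total": bytes2human(disk["total"]),
        "used": bytes2human(disk["used"]),
        "free": bytes2human(disk["free"]),
        "percent": int(disk["percent"]),
    }
```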
def post(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
serializer = WinAgentSerializer(instance=agent, data=request.data, partial=True)
serializer.is_valid(raise_exception=True)
serializer.save(last_seen=djangotime.now())
sync_salt_modules_task.delay(agent.pk)
check_for_updates_task.apply_async(
queue="wupdate", kwargs={"pk": agent.pk, "wait": True}
)
if not agent.choco_installed:
install_chocolatey.delay(agent.pk, wait=True)
return Response("ok")
class Hello(APIView):
#### DEPRECATED, for agents <= 1.1.9 ####
"""
The agent's checkin endpoint
patch: called every 30 to 120 seconds
@@ -123,8 +228,6 @@ class Hello(APIView):
serializer.save(last_seen=djangotime.now())
sync_salt_modules_task.delay(agent.pk)
get_installed_software.delay(agent.pk)
get_wmi_detail_task.delay(agent.pk)
check_for_updates_task.apply_async(
queue="wupdate", kwargs={"pk": agent.pk, "wait": True}
)
@@ -386,7 +489,15 @@ class MeshInfo(APIView):
def patch(self, request, pk):
agent = get_object_or_404(Agent, pk=pk)
agent.mesh_node_id = request.data["nodeidhex"]
if "nodeidhex" in request.data:
# agent <= 1.1.0
nodeid = request.data["nodeidhex"]
else:
# agent >= 1.1.1
nodeid = request.data["nodeid"]
agent.mesh_node_id = nodeid
agent.save(update_fields=["mesh_node_id"])
return Response("ok")
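The compatibility shim above can be stated as a pure function; the key names come straight from the hunk:

```python
def extract_node_id(data: dict) -> str:
    # agents <= 1.1.0 send "nodeidhex", agents >= 1.1.1 send "nodeid"
    if "nodeidhex" in data:
        return data["nodeidhex"]
    return data["nodeid"]
```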
@@ -476,3 +587,42 @@ class NewAgent(APIView):
"token": token.key,
}
)
class Software(APIView):
authentication_classes = [TokenAuthentication]
permission_classes = [IsAuthenticated]
def post(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
raw: SoftwareList = request.data["software"]
if not isinstance(raw, list):
return notify_error("err")
sw = filter_software(raw)
if not InstalledSoftware.objects.filter(agent=agent).exists():
InstalledSoftware(agent=agent, software=sw).save()
else:
s = agent.installedsoftware_set.first()
s.software = sw
s.save(update_fields=["software"])
return Response("ok")
class Installer(APIView):
def get(self, request):
# used to check if token is valid. will return 401 if not
return Response("ok")
def post(self, request):
if "version" not in request.data:
return notify_error("Invalid data")
ver = request.data["version"]
if pyver.parse(ver) < pyver.parse(settings.LATEST_AGENT_VER):
return notify_error(
f"Old installer detected (version {ver}). Latest version is {settings.LATEST_AGENT_VER}. Please generate a new installer from the RMM."
)
return Response("ok")
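The version gate in `Installer.post` leans on `packaging.version` for real semantic ordering rather than string comparison. A minimal sketch of the same check (`LATEST_AGENT_VER` here is a placeholder value, not the project's real setting):

```python
from packaging import version as pyver

# placeholder stand-in for settings.LATEST_AGENT_VER
LATEST_AGENT_VER = "1.2.0"


def installer_is_outdated(ver: str) -> bool:
    # pyver.parse orders versions numerically, so "1.10.0" > "1.2.0"
    return pyver.parse(ver) < pyver.parse(LATEST_AGENT_VER)
```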


@@ -6,11 +6,11 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('automation', '0005_auto_20200922_1344'),
("automation", "0005_auto_20200922_1344"),
]
operations = [
migrations.DeleteModel(
name='PolicyExclusions',
name="PolicyExclusions",
),
]


@@ -80,7 +80,7 @@ class Policy(BaseAuditModel):
default_policy = CoreSettings.objects.first().server_policy
client_policy = client.server_policy
site_policy = site.server_policy
else:
elif agent.monitoring_type == "workstation":
default_policy = CoreSettings.objects.first().workstation_policy
client_policy = client.workstation_policy
site_policy = site.workstation_policy
@@ -132,7 +132,7 @@ class Policy(BaseAuditModel):
default_policy = CoreSettings.objects.first().server_policy
client_policy = client.server_policy
site_policy = site.server_policy
else:
elif agent.monitoring_type == "workstation":
default_policy = CoreSettings.objects.first().workstation_policy
client_policy = client.workstation_policy
site_policy = site.workstation_policy


@@ -19,7 +19,17 @@ def generate_agent_checks_from_policies_task(
):
policy = Policy.objects.get(pk=policypk)
for agent in policy.related_agents():
if policy.is_default_server_policy and policy.is_default_workstation_policy:
agents = Agent.objects.all()
elif policy.is_default_server_policy:
agents = Agent.objects.filter(monitoring_type="server")
elif policy.is_default_workstation_policy:
agents = Agent.objects.filter(monitoring_type="workstation")
else:
agents = policy.related_agents()
for agent in agents:
agent.generate_checks_from_policies(clear=clear)
if create_tasks:
agent.generate_tasks_from_policies(
@@ -86,7 +96,17 @@ def update_policy_check_fields_task(checkpk):
def generate_agent_tasks_from_policies_task(policypk, clear=False):
policy = Policy.objects.get(pk=policypk)
for agent in policy.related_agents():
if policy.is_default_server_policy and policy.is_default_workstation_policy:
agents = Agent.objects.all()
elif policy.is_default_server_policy:
agents = Agent.objects.filter(monitoring_type="server")
elif policy.is_default_workstation_policy:
agents = Agent.objects.filter(monitoring_type="workstation")
else:
agents = policy.related_agents()
for agent in agents:
agent.generate_tasks_from_policies(clear=clear)
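Both task functions above apply the same default-policy fan-out rule. Isolated as a plain function with a simple stand-in for the Policy model (not the Django ORM):

```python
from dataclasses import dataclass


@dataclass
class FakePolicy:
    # stand-in for automation.models.Policy, just enough for the rule
    is_default_server_policy: bool = False
    is_default_workstation_policy: bool = False


def select_monitoring_types(policy: FakePolicy) -> list:
    # which monitoring_type values the policy fans out to, mirroring the
    # branches above; "related" means only explicitly related agents
    if policy.is_default_server_policy and policy.is_default_workstation_policy:
        return ["server", "workstation"]
    if policy.is_default_server_policy:
        return ["server"]
    if policy.is_default_workstation_policy:
        return ["workstation"]
    return ["related"]
```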


@@ -1051,10 +1051,13 @@ class TestPolicyTasks(TacticalTestCase):
for task in tasks:
run_win_task.assert_any_call(task.id)
def test_update_policy_tasks(self):
@patch("agents.models.Agent.nats_cmd")
def test_update_policy_tasks(self, nats_cmd):
from .tasks import update_policy_task_fields_task
from autotasks.models import AutomatedTask
nats_cmd.return_value = "ok"
# setup data
policy = baker.make("automation.Policy", active=True)
tasks = baker.make(


@@ -0,0 +1,18 @@
# Generated by Django 3.1.3 on 2020-11-29 09:12
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("autotasks", "0008_auto_20201030_1515"),
]
operations = [
migrations.AddField(
model_name="automatedtask",
name="run_time_bit_weekdays",
field=models.IntegerField(blank=True, null=True),
),
]


@@ -0,0 +1,33 @@
from django.db import migrations
from tacticalrmm.utils import get_bit_days
DAYS_OF_WEEK = {
0: "Monday",
1: "Tuesday",
2: "Wednesday",
3: "Thursday",
4: "Friday",
5: "Saturday",
6: "Sunday",
}
def migrate_days(apps, schema_editor):
AutomatedTask = apps.get_model("autotasks", "AutomatedTask")
for task in AutomatedTask.objects.exclude(run_time_days__isnull=True).exclude(
run_time_days=[]
):
run_days = [DAYS_OF_WEEK.get(day) for day in task.run_time_days]
task.run_time_bit_weekdays = get_bit_days(run_days)
task.save(update_fields=["run_time_bit_weekdays"])
class Migration(migrations.Migration):
dependencies = [
("autotasks", "0009_automatedtask_run_time_bit_weekdays"),
]
operations = [
migrations.RunPython(migrate_days),
]
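The data migration relies on `get_bit_days` from `tacticalrmm.utils`. A hedged sketch of what that helper presumably does, assuming the bit order follows the Windows weekly-trigger mask (Sunday = 0x1 through Saturday = 0x40) — an assumption consistent with the test suite using 127 for all seven days:

```python
# assumed bit assignment, matching the Windows Task Scheduler weekly mask
WEEK_DAYS = {
    "Sunday": 0x1,
    "Monday": 0x2,
    "Tuesday": 0x4,
    "Wednesday": 0x8,
    "Thursday": 0x10,
    "Friday": 0x20,
    "Saturday": 0x40,
}


def get_bit_days(days: list) -> int:
    # OR together one bit per named weekday
    ret = 0
    for day in days:
        ret |= WEEK_DAYS[day]
    return ret
```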


@@ -8,6 +8,7 @@ from django.contrib.postgres.fields import ArrayField
from django.db.models.fields import DateTimeField
from automation.models import Policy
from logs.models import BaseAuditModel
from tacticalrmm.utils import bitdays_to_string
RUN_TIME_DAY_CHOICES = [
(0, "Monday"),
@@ -69,6 +70,8 @@ class AutomatedTask(BaseAuditModel):
on_delete=models.SET_NULL,
)
name = models.CharField(max_length=255)
run_time_bit_weekdays = models.IntegerField(null=True, blank=True)
# run_time_days is deprecated, use bit weekdays
run_time_days = ArrayField(
models.IntegerField(choices=RUN_TIME_DAY_CHOICES, null=True, blank=True),
null=True,
@@ -107,21 +110,12 @@ class AutomatedTask(BaseAuditModel):
elif self.task_type == "runonce":
return f'Run once on {self.run_time_date.strftime("%m/%d/%Y %I:%M%p")}'
elif self.task_type == "scheduled":
ret = []
for i in self.run_time_days:
for j in RUN_TIME_DAY_CHOICES:
if i in j:
ret.append(j[1][0:3])
run_time_nice = dt.datetime.strptime(
self.run_time_minute, "%H:%M"
).strftime("%I:%M %p")
if len(ret) == 7:
return f"Every day at {run_time_nice}"
else:
days = ",".join(ret)
return f"{days} at {run_time_nice}"
days = bitdays_to_string(self.run_time_bit_weekdays)
return f"{days} at {run_time_nice}"
@property
def last_run_as_timezone(self):
@@ -169,6 +163,7 @@ class AutomatedTask(BaseAuditModel):
name=self.name,
run_time_days=self.run_time_days,
run_time_minute=self.run_time_minute,
run_time_bit_weekdays=self.run_time_bit_weekdays,
run_time_date=self.run_time_date,
task_type=self.task_type,
win_task_name=self.win_task_name,
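The `schedule` property now delegates to `bitdays_to_string`, presumably the inverse of `get_bit_days`. A sketch under the same bit-order assumption, with the three-letter abbreviations and the "Every day" special case guessed from the code this hunk removes:

```python
# assumed bit assignment, matching the Windows Task Scheduler weekly mask
WEEK_DAYS = {
    "Sunday": 0x1,
    "Monday": 0x2,
    "Tuesday": 0x4,
    "Wednesday": 0x8,
    "Thursday": 0x10,
    "Friday": 0x20,
    "Saturday": 0x40,
}


def bitdays_to_string(day_bits: int) -> str:
    # sketch of the tacticalrmm.utils helper; abbreviation style and the
    # "Every day" case are assumptions carried over from the old code path
    if day_bits == 0x7F:
        return "Every day"
    ret = [day[:3] for day, bit in WEEK_DAYS.items() if day_bits & bit]
    return ", ".join(ret)
```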


@@ -1,52 +1,37 @@
import asyncio
import datetime as dt
from loguru import logger
from tacticalrmm.celery import app
from django.conf import settings
import pytz
from django.utils import timezone as djangotime
from packaging import version as pyver
from .models import AutomatedTask
from logs.models import PendingAction
logger.configure(**settings.LOG_CONFIG)
DAYS_OF_WEEK = {
0: "Monday",
1: "Tuesday",
2: "Wednesday",
3: "Thursday",
4: "Friday",
5: "Saturday",
6: "Sunday",
}
@app.task
def create_win_task_schedule(pk, pending_action=False):
task = AutomatedTask.objects.get(pk=pk)
if task.task_type == "scheduled":
run_days = [DAYS_OF_WEEK.get(day) for day in task.run_time_days]
r = task.agent.salt_api_cmd(
timeout=20,
func="task.create_task",
arg=[
f"name={task.win_task_name}",
"force=True",
"action_type=Execute",
'cmd="C:\\Program Files\\TacticalAgent\\tacticalrmm.exe"',
f'arguments="-m taskrunner -p {task.pk}"',
"start_in=C:\\Program Files\\TacticalAgent",
"trigger_type=Weekly",
f'start_time="{task.run_time_minute}"',
"ac_only=False",
"stop_if_on_batteries=False",
],
kwargs={"days_of_week": run_days},
)
nats_data = {
"func": "schedtask",
"schedtaskpayload": {
"type": "rmm",
"trigger": "weekly",
"weekdays": task.run_time_bit_weekdays,
"pk": task.pk,
"name": task.win_task_name,
"hour": dt.datetime.strptime(task.run_time_minute, "%H:%M").hour,
"min": dt.datetime.strptime(task.run_time_minute, "%H:%M").minute,
},
}
elif task.task_type == "runonce":
# check if scheduled time is in the past
agent_tz = pytz.timezone(task.agent.timezone)
task_time_utc = task.run_time_date.replace(tzinfo=agent_tz).astimezone(pytz.utc)
@@ -57,45 +42,41 @@ def create_win_task_schedule(pk, pending_action=False):
) + djangotime.timedelta(minutes=5)
task.save()
r = task.agent.salt_api_cmd(
timeout=20,
func="task.create_task",
arg=[
f"name={task.win_task_name}",
"force=True",
"action_type=Execute",
'cmd="C:\\Program Files\\TacticalAgent\\tacticalrmm.exe"',
f'arguments="-m taskrunner -p {task.pk}"',
"start_in=C:\\Program Files\\TacticalAgent",
"trigger_type=Once",
f'start_date="{task.run_time_date.strftime("%Y-%m-%d")}"',
f'start_time="{task.run_time_date.strftime("%H:%M")}"',
"ac_only=False",
"stop_if_on_batteries=False",
"start_when_available=True",
],
)
nats_data = {
"func": "schedtask",
"schedtaskpayload": {
"type": "rmm",
"trigger": "once",
"pk": task.pk,
"name": task.win_task_name,
"year": int(dt.datetime.strftime(task.run_time_date, "%Y")),
"month": dt.datetime.strftime(task.run_time_date, "%B"),
"day": int(dt.datetime.strftime(task.run_time_date, "%d")),
"hour": int(dt.datetime.strftime(task.run_time_date, "%H")),
"min": int(dt.datetime.strftime(task.run_time_date, "%M")),
},
}
if task.remove_if_not_scheduled and pyver.parse(
task.agent.version
) >= pyver.parse("1.1.2"):
nats_data["schedtaskpayload"]["deleteafter"] = True
elif task.task_type == "checkfailure" or task.task_type == "manual":
r = task.agent.salt_api_cmd(
timeout=20,
func="task.create_task",
arg=[
f"name={task.win_task_name}",
"force=True",
"action_type=Execute",
'cmd="C:\\Program Files\\TacticalAgent\\tacticalrmm.exe"',
f'arguments="-m taskrunner -p {task.pk}"',
"start_in=C:\\Program Files\\TacticalAgent",
"trigger_type=Once",
'start_date="1975-01-01"',
'start_time="01:00"',
"ac_only=False",
"stop_if_on_batteries=False",
],
)
nats_data = {
"func": "schedtask",
"schedtaskpayload": {
"type": "rmm",
"trigger": "manual",
"pk": task.pk,
"name": task.win_task_name,
},
}
else:
return "error"
if r == "timeout" or r == "error" or (isinstance(r, bool) and not r):
r = asyncio.run(task.agent.nats_cmd(nats_data, timeout=10))
if r != "ok":
# don't create pending action if this task was initiated by a pending action
if not pending_action:
PendingAction(
@@ -129,13 +110,16 @@ def create_win_task_schedule(pk, pending_action=False):
def enable_or_disable_win_task(pk, action, pending_action=False):
task = AutomatedTask.objects.get(pk=pk)
r = task.agent.salt_api_cmd(
timeout=20,
func="task.edit_task",
arg=[f"name={task.win_task_name}", f"enabled={action}"],
)
nats_data = {
"func": "enableschedtask",
"schedtaskpayload": {
"name": task.win_task_name,
"enabled": action,
},
}
r = asyncio.run(task.agent.nats_cmd(nats_data))
if r == "timeout" or r == "error" or (isinstance(r, bool) and not r):
if r != "ok":
# don't create pending action if this task was initiated by a pending action
if not pending_action:
PendingAction(
@@ -150,9 +134,6 @@ def enable_or_disable_win_task(pk, action, pending_action=False):
task.sync_status = "notsynced"
task.save(update_fields=["sync_status"])
logger.error(
f"Unable to update the scheduled task {task.win_task_name} on {task.agent.hostname}. It will be updated when the agent checks in."
)
return
# clear pending action since it was successful
@@ -163,7 +144,6 @@ def enable_or_disable_win_task(pk, action, pending_action=False):
task.sync_status = "synced"
task.save(update_fields=["sync_status"])
logger.info(f"{task.agent.hostname} task {task.name} was edited.")
return "ok"
@@ -171,13 +151,13 @@ def enable_or_disable_win_task(pk, action, pending_action=False):
def delete_win_task_schedule(pk, pending_action=False):
task = AutomatedTask.objects.get(pk=pk)
r = task.agent.salt_api_cmd(
timeout=20,
func="task.delete_task",
arg=[f"name={task.win_task_name}"],
)
nats_data = {
"func": "delschedtask",
"schedtaskpayload": {"name": task.win_task_name},
}
r = asyncio.run(task.agent.nats_cmd(nats_data, timeout=10))
if r == "timeout" or r == "error" or (isinstance(r, bool) and not r):
if r != "ok":
# don't create pending action if this task was initiated by a pending action
if not pending_action:
PendingAction(
@@ -188,9 +168,6 @@ def delete_win_task_schedule(pk, pending_action=False):
task.sync_status = "pendingdeletion"
task.save(update_fields=["sync_status"])
logger.error(
f"Unable to delete scheduled task {task.win_task_name} on {task.agent.hostname}. It was marked pending deletion and will be removed when the agent checks in."
)
return
# complete pending action since it was successful
@@ -200,15 +177,13 @@ def delete_win_task_schedule(pk, pending_action=False):
pendingaction.save(update_fields=["status"])
task.delete()
logger.info(f"{task.agent.hostname} task {task.name} was deleted.")
return "ok"
@app.task
def run_win_task(pk):
# TODO deprecated, remove this function once salt gone
task = AutomatedTask.objects.get(pk=pk)
r = task.agent.salt_api_async(func="task.run", arg=[f"name={task.win_task_name}"])
asyncio.run(task.agent.nats_cmd({"func": "runtask", "taskpk": task.pk}, wait=False))
return "ok"
@@ -220,18 +195,9 @@ def remove_orphaned_win_tasks(agentpk):
logger.info(f"Orphaned task cleanup initiated on {agent.hostname}.")
r = agent.salt_api_cmd(
timeout=15,
func="task.list_tasks",
)
r = asyncio.run(agent.nats_cmd({"func": "listschedtasks"}, timeout=10))
if r == "timeout" or r == "error":
logger.error(
f"Unable to clean up scheduled tasks on {agent.hostname}. Agent might be offline"
)
return "errtimeout"
if not isinstance(r, list):
if not isinstance(r, list) and not r: # empty list
logger.error(f"Unable to clean up scheduled tasks on {agent.hostname}: {r}")
return "notlist"
@@ -240,7 +206,8 @@ def remove_orphaned_win_tasks(agentpk):
exclude_tasks = (
"TacticalRMM_fixmesh",
"TacticalRMM_SchedReboot",
"TacticalRMM_saltwatchdog", # will be implemented in future
"TacticalRMM_sync",
"TacticalRMM_agentupdate",
)
for task in r:
@@ -250,16 +217,16 @@ def remove_orphaned_win_tasks(agentpk):
if task.startswith("TacticalRMM_") and task not in agent_task_names:
# delete task since it doesn't exist in UI
ret = agent.salt_api_cmd(
timeout=20,
func="task.delete_task",
arg=[f"name={task}"],
)
if isinstance(ret, bool) and ret is True:
logger.info(f"Removed orphaned task {task} from {agent.hostname}")
else:
nats_data = {
"func": "delschedtask",
"schedtaskpayload": {"name": task},
}
ret = asyncio.run(agent.nats_cmd(nats_data, timeout=10))
if ret != "ok":
logger.error(
f"Unable to clean up orphaned task {task} on {agent.hostname}: {ret}"
)
else:
logger.info(f"Removed orphaned task {task} from {agent.hostname}")
logger.info(f"Orphaned task cleanup finished on {agent.hostname}")
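The cleanup loop's orphan test can be stated as a pure function, with the prefix and exclude list taken from the hunk above (the exclusion check is assumed to sit in the elided context lines):

```python
EXCLUDE_TASKS = (
    "TacticalRMM_fixmesh",
    "TacticalRMM_SchedReboot",
    "TacticalRMM_sync",
    "TacticalRMM_agentupdate",
)


def orphaned_tasks(listed: list, known_names: list) -> list:
    # a task is orphaned if it carries the RMM prefix, is not excluded,
    # and no AutomatedTask row references it anymore
    return [
        t
        for t in listed
        if t.startswith("TacticalRMM_")
        and t not in EXCLUDE_TASKS
        and t not in known_names
    ]
```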


@@ -1,3 +1,4 @@
import datetime as dt
from unittest.mock import patch, call
from model_bakery import baker
from django.utils import timezone as djangotime
@@ -25,9 +26,9 @@ class TestAutotaskViews(TacticalTestCase):
# setup data
script = baker.make_recipe("scripts.script")
agent = baker.make_recipe("agents.agent")
agent_old = baker.make_recipe("agents.agent", version="0.9.0")
policy = baker.make("automation.Policy")
check = baker.make_recipe("checks.diskspace_check", agent=agent)
old_agent = baker.make_recipe("agents.agent", version="1.1.0")
# test script set to invalid pk
data = {"autotask": {"script": 500}}
@@ -50,10 +51,10 @@ class TestAutotaskViews(TacticalTestCase):
resp = self.client.post(url, data, format="json")
self.assertEqual(resp.status_code, 404)
# test invalid agent version
# test old agent version
data = {
"autotask": {"script": script.id, "script_args": ["args"]},
"agent": agent_old.id,
"autotask": {"script": script.id},
"agent": old_agent.id,
}
resp = self.client.post(url, data, format="json")
@@ -63,7 +64,7 @@ class TestAutotaskViews(TacticalTestCase):
data = {
"autotask": {
"name": "Test Task Scheduled with Assigned Check",
"run_time_days": [0, 1, 2],
"run_time_days": ["Sunday", "Monday", "Friday"],
"run_time_minute": "10:00",
"timeout": 120,
"enabled": True,
@@ -84,6 +85,7 @@ class TestAutotaskViews(TacticalTestCase):
data = {
"autotask": {
"name": "Test Task Manual",
"run_time_days": [],
"timeout": 120,
"enabled": True,
"script": script.id,
@@ -213,8 +215,8 @@ class TestAutoTaskCeleryTasks(TacticalTestCase):
self.authenticate()
self.setup_coresettings()
@patch("agents.models.Agent.salt_api_cmd")
def test_remove_orphaned_win_task(self, salt_api_cmd):
@patch("agents.models.Agent.nats_cmd")
def test_remove_orphaned_win_task(self, nats_cmd):
self.agent = baker.make_recipe("agents.agent")
self.task1 = AutomatedTask.objects.create(
agent=self.agent,
@@ -222,20 +224,6 @@ class TestAutoTaskCeleryTasks(TacticalTestCase):
win_task_name=AutomatedTask.generate_task_name(),
)
salt_api_cmd.return_value = "timeout"
ret = remove_orphaned_win_tasks.s(self.agent.pk).apply()
self.assertEqual(ret.result, "errtimeout")
salt_api_cmd.return_value = "error"
ret = remove_orphaned_win_tasks.s(self.agent.pk).apply()
self.assertEqual(ret.result, "errtimeout")
salt_api_cmd.return_value = "task not found in"
ret = remove_orphaned_win_tasks.s(self.agent.pk).apply()
self.assertEqual(ret.result, "notlist")
salt_api_cmd.reset_mock()
# test removing an orphaned task
win_tasks = [
"Adobe Acrobat Update Task",
@@ -250,50 +238,54 @@ class TestAutoTaskCeleryTasks(TacticalTestCase):
]
self.calls = [
call(timeout=15, func="task.list_tasks"),
call({"func": "listschedtasks"}, timeout=10),
call(
timeout=20,
func="task.delete_task",
arg=["name=TacticalRMM_iggrLcOaldIZnUzLuJWPLNwikiOoJJHHznb"],
{
"func": "delschedtask",
"schedtaskpayload": {
"name": "TacticalRMM_iggrLcOaldIZnUzLuJWPLNwikiOoJJHHznb"
},
},
timeout=10,
),
]
salt_api_cmd.side_effect = [win_tasks, True]
nats_cmd.side_effect = [win_tasks, "ok"]
ret = remove_orphaned_win_tasks.s(self.agent.pk).apply()
self.assertEqual(salt_api_cmd.call_count, 2)
salt_api_cmd.assert_has_calls(self.calls)
self.assertEqual(nats_cmd.call_count, 2)
nats_cmd.assert_has_calls(self.calls)
self.assertEqual(ret.status, "SUCCESS")
# test salt delete_task fail
salt_api_cmd.reset_mock()
salt_api_cmd.side_effect = [win_tasks, False]
# test nats delete task fail
nats_cmd.reset_mock()
nats_cmd.side_effect = [win_tasks, "error deleting task"]
ret = remove_orphaned_win_tasks.s(self.agent.pk).apply()
salt_api_cmd.assert_has_calls(self.calls)
self.assertEqual(salt_api_cmd.call_count, 2)
nats_cmd.assert_has_calls(self.calls)
self.assertEqual(nats_cmd.call_count, 2)
self.assertEqual(ret.status, "SUCCESS")
# no orphaned tasks
salt_api_cmd.reset_mock()
nats_cmd.reset_mock()
win_tasks.remove("TacticalRMM_iggrLcOaldIZnUzLuJWPLNwikiOoJJHHznb")
salt_api_cmd.side_effect = [win_tasks, True]
nats_cmd.side_effect = [win_tasks, "ok"]
ret = remove_orphaned_win_tasks.s(self.agent.pk).apply()
self.assertEqual(salt_api_cmd.call_count, 1)
self.assertEqual(nats_cmd.call_count, 1)
self.assertEqual(ret.status, "SUCCESS")
@patch("agents.models.Agent.salt_api_async")
def test_run_win_task(self, salt_api_async):
@patch("agents.models.Agent.nats_cmd")
def test_run_win_task(self, nats_cmd):
self.agent = baker.make_recipe("agents.agent")
self.task1 = AutomatedTask.objects.create(
agent=self.agent,
name="test task 1",
win_task_name=AutomatedTask.generate_task_name(),
)
salt_api_async.return_value = "Response 200"
nats_cmd.return_value = "ok"
ret = run_win_task.s(self.task1.pk).apply()
self.assertEqual(ret.status, "SUCCESS")
@patch("agents.models.Agent.salt_api_cmd")
def test_create_win_task_schedule(self, salt_api_cmd):
@patch("agents.models.Agent.nats_cmd")
def test_create_win_task_schedule(self, nats_cmd):
self.agent = baker.make_recipe("agents.agent")
task_name = AutomatedTask.generate_task_name()
@@ -303,46 +295,32 @@ class TestAutoTaskCeleryTasks(TacticalTestCase):
name="test task 1",
win_task_name=task_name,
task_type="scheduled",
run_time_days=[0, 1, 6],
run_time_bit_weekdays=127,
run_time_minute="21:55",
)
self.assertEqual(self.task1.sync_status, "notsynced")
salt_api_cmd.return_value = True
nats_cmd.return_value = "ok"
ret = create_win_task_schedule.s(pk=self.task1.pk, pending_action=False).apply()
self.assertEqual(salt_api_cmd.call_count, 1)
salt_api_cmd.assert_called_with(
timeout=20,
func="task.create_task",
arg=[
f"name={task_name}",
"force=True",
"action_type=Execute",
'cmd="C:\\Program Files\\TacticalAgent\\tacticalrmm.exe"',
f'arguments="-m taskrunner -p {self.task1.pk}"',
"start_in=C:\\Program Files\\TacticalAgent",
"trigger_type=Weekly",
'start_time="21:55"',
"ac_only=False",
"stop_if_on_batteries=False",
],
kwargs={"days_of_week": ["Monday", "Tuesday", "Sunday"]},
self.assertEqual(nats_cmd.call_count, 1)
nats_cmd.assert_called_with(
{
"func": "schedtask",
"schedtaskpayload": {
"type": "rmm",
"trigger": "weekly",
"weekdays": 127,
"pk": self.task1.pk,
"name": task_name,
"hour": 21,
"min": 55,
},
},
timeout=10,
)
self.task1 = AutomatedTask.objects.get(pk=self.task1.pk)
self.assertEqual(self.task1.sync_status, "synced")
salt_api_cmd.return_value = "timeout"
ret = create_win_task_schedule.s(pk=self.task1.pk, pending_action=False).apply()
self.assertEqual(ret.status, "SUCCESS")
self.task1 = AutomatedTask.objects.get(pk=self.task1.pk)
self.assertEqual(self.task1.sync_status, "notsynced")
salt_api_cmd.return_value = "error"
ret = create_win_task_schedule.s(pk=self.task1.pk, pending_action=False).apply()
self.assertEqual(ret.status, "SUCCESS")
self.task1 = AutomatedTask.objects.get(pk=self.task1.pk)
self.assertEqual(self.task1.sync_status, "notsynced")
salt_api_cmd.return_value = False
nats_cmd.return_value = "timeout"
ret = create_win_task_schedule.s(pk=self.task1.pk, pending_action=False).apply()
self.assertEqual(ret.status, "SUCCESS")
self.task1 = AutomatedTask.objects.get(pk=self.task1.pk)
@@ -353,7 +331,7 @@ class TestAutoTaskCeleryTasks(TacticalTestCase):
agent=self.agent, action_type="taskaction"
)
self.assertEqual(self.pending_action.status, "pending")
salt_api_cmd.return_value = True
nats_cmd.return_value = "ok"
ret = create_win_task_schedule.s(
pk=self.task1.pk, pending_action=self.pending_action.pk
).apply()
@@ -362,7 +340,7 @@ class TestAutoTaskCeleryTasks(TacticalTestCase):
self.assertEqual(self.pending_action.status, "completed")
# test runonce with future date
salt_api_cmd.reset_mock()
nats_cmd.reset_mock()
task_name = AutomatedTask.generate_task_name()
run_time_date = djangotime.now() + djangotime.timedelta(hours=22)
self.task2 = AutomatedTask.objects.create(
@@ -372,30 +350,29 @@ class TestAutoTaskCeleryTasks(TacticalTestCase):
task_type="runonce",
run_time_date=run_time_date,
)
salt_api_cmd.return_value = True
nats_cmd.return_value = "ok"
ret = create_win_task_schedule.s(pk=self.task2.pk, pending_action=False).apply()
salt_api_cmd.assert_called_with(
timeout=20,
func="task.create_task",
arg=[
f"name={task_name}",
"force=True",
"action_type=Execute",
'cmd="C:\\Program Files\\TacticalAgent\\tacticalrmm.exe"',
f'arguments="-m taskrunner -p {self.task2.pk}"',
"start_in=C:\\Program Files\\TacticalAgent",
"trigger_type=Once",
f'start_date="{run_time_date.strftime("%Y-%m-%d")}"',
f'start_time="{run_time_date.strftime("%H:%M")}"',
"ac_only=False",
"stop_if_on_batteries=False",
"start_when_available=True",
],
nats_cmd.assert_called_with(
{
"func": "schedtask",
"schedtaskpayload": {
"type": "rmm",
"trigger": "once",
"pk": self.task2.pk,
"name": task_name,
"year": int(dt.datetime.strftime(self.task2.run_time_date, "%Y")),
"month": dt.datetime.strftime(self.task2.run_time_date, "%B"),
"day": int(dt.datetime.strftime(self.task2.run_time_date, "%d")),
"hour": int(dt.datetime.strftime(self.task2.run_time_date, "%H")),
"min": int(dt.datetime.strftime(self.task2.run_time_date, "%M")),
},
},
timeout=10,
)
self.assertEqual(ret.status, "SUCCESS")
# test runonce with date in the past
salt_api_cmd.reset_mock()
nats_cmd.reset_mock()
task_name = AutomatedTask.generate_task_name()
run_time_date = djangotime.now() - djangotime.timedelta(days=13)
self.task3 = AutomatedTask.objects.create(
@@ -405,31 +382,13 @@ class TestAutoTaskCeleryTasks(TacticalTestCase):
task_type="runonce",
run_time_date=run_time_date,
)
salt_api_cmd.return_value = True
nats_cmd.return_value = "ok"
ret = create_win_task_schedule.s(pk=self.task3.pk, pending_action=False).apply()
self.task3 = AutomatedTask.objects.get(pk=self.task3.pk)
salt_api_cmd.assert_called_with(
timeout=20,
func="task.create_task",
arg=[
f"name={task_name}",
"force=True",
"action_type=Execute",
'cmd="C:\\Program Files\\TacticalAgent\\tacticalrmm.exe"',
f'arguments="-m taskrunner -p {self.task3.pk}"',
"start_in=C:\\Program Files\\TacticalAgent",
"trigger_type=Once",
f'start_date="{self.task3.run_time_date.strftime("%Y-%m-%d")}"',
f'start_time="{self.task3.run_time_date.strftime("%H:%M")}"',
"ac_only=False",
"stop_if_on_batteries=False",
"start_when_available=True",
],
)
self.assertEqual(ret.status, "SUCCESS")
# test checkfailure
salt_api_cmd.reset_mock()
nats_cmd.reset_mock()
self.check = baker.make_recipe("checks.diskspace_check", agent=self.agent)
task_name = AutomatedTask.generate_task_name()
self.task4 = AutomatedTask.objects.create(
@@ -439,29 +398,24 @@ class TestAutoTaskCeleryTasks(TacticalTestCase):
task_type="checkfailure",
assigned_check=self.check,
)
salt_api_cmd.return_value = True
nats_cmd.return_value = "ok"
ret = create_win_task_schedule.s(pk=self.task4.pk, pending_action=False).apply()
salt_api_cmd.assert_called_with(
timeout=20,
func="task.create_task",
arg=[
f"name={task_name}",
"force=True",
"action_type=Execute",
'cmd="C:\\Program Files\\TacticalAgent\\tacticalrmm.exe"',
f'arguments="-m taskrunner -p {self.task4.pk}"',
"start_in=C:\\Program Files\\TacticalAgent",
"trigger_type=Once",
'start_date="1975-01-01"',
'start_time="01:00"',
"ac_only=False",
"stop_if_on_batteries=False",
],
nats_cmd.assert_called_with(
{
"func": "schedtask",
"schedtaskpayload": {
"type": "rmm",
"trigger": "manual",
"pk": self.task4.pk,
"name": task_name,
},
},
timeout=10,
)
self.assertEqual(ret.status, "SUCCESS")
# test manual
salt_api_cmd.reset_mock()
nats_cmd.reset_mock()
task_name = AutomatedTask.generate_task_name()
self.task5 = AutomatedTask.objects.create(
agent=self.agent,
@@ -469,23 +423,18 @@ class TestAutoTaskCeleryTasks(TacticalTestCase):
win_task_name=task_name,
task_type="manual",
)
salt_api_cmd.return_value = True
nats_cmd.return_value = "ok"
ret = create_win_task_schedule.s(pk=self.task5.pk, pending_action=False).apply()
salt_api_cmd.assert_called_with(
timeout=20,
func="task.create_task",
arg=[
f"name={task_name}",
"force=True",
"action_type=Execute",
'cmd="C:\\Program Files\\TacticalAgent\\tacticalrmm.exe"',
f'arguments="-m taskrunner -p {self.task5.pk}"',
"start_in=C:\\Program Files\\TacticalAgent",
"trigger_type=Once",
'start_date="1975-01-01"',
'start_time="01:00"',
"ac_only=False",
"stop_if_on_batteries=False",
],
)
nats_cmd.assert_called_with(
{
"func": "schedtask",
"schedtaskpayload": {
"type": "rmm",
"trigger": "manual",
"pk": self.task5.pk,
"name": task_name,
},
},
timeout=10,
)
self.assertEqual(ret.status, "SUCCESS")

View File

@@ -20,7 +20,7 @@ from .tasks import (
delete_win_task_schedule,
enable_or_disable_win_task,
)
from tacticalrmm.utils import notify_error
from tacticalrmm.utils import notify_error, get_bit_days
class AddAutoTask(APIView):
@@ -38,17 +38,20 @@ class AddAutoTask(APIView):
parent = {"policy": policy}
else:
agent = get_object_or_404(Agent, pk=data["agent"])
if not agent.has_gotasks:
return notify_error("Requires agent version 1.1.1 or greater")
parent = {"agent": agent}
added = "0.11.0"
if data["autotask"]["script_args"] and agent.not_supported(added):
return notify_error(
f"Script arguments only available in agent {added} or greater"
)
check = None
if data["autotask"]["assigned_check"]:
check = get_object_or_404(Check, pk=data["autotask"]["assigned_check"])
bit_weekdays = None
if data["autotask"]["run_time_days"]:
bit_weekdays = get_bit_days(data["autotask"]["run_time_days"])
del data["autotask"]["run_time_days"]
serializer = TaskSerializer(data=data["autotask"], partial=True, context=parent)
serializer.is_valid(raise_exception=True)
obj = serializer.save(
@@ -56,6 +59,7 @@ class AddAutoTask(APIView):
script=script,
win_task_name=AutomatedTask.generate_task_name(),
assigned_check=check,
run_time_bit_weekdays=bit_weekdays,
)
if not "policy" in data:

View File

@@ -36,17 +36,6 @@ class AddCheck(APIView):
else:
agent = get_object_or_404(Agent, pk=request.data["pk"])
parent = {"agent": agent}
added = "0.11.0"
if (
request.data["check"]["check_type"] == "script"
and request.data["check"]["script_args"]
and agent.not_supported(version_added=added)
):
return notify_error(
{
"non_field_errors": f"Script arguments only available in agent {added} or greater"
}
)
script = None
if "script" in request.data["check"]:
@@ -58,13 +47,6 @@ class AddCheck(APIView):
request.data["check"]["check_type"] == "eventlog"
and request.data["check"]["event_id_is_wildcard"]
):
if agent and agent.not_supported(version_added="0.10.2"):
return notify_error(
{
"non_field_errors": "Wildcard is only available in agent 0.10.2 or greater"
}
)
request.data["check"]["event_id"] = 0
serializer = CheckSerializer(
@@ -116,31 +98,8 @@ class GetUpdateDeleteCheck(APIView):
pass
else:
if request.data["event_id_is_wildcard"]:
if check.agent.not_supported(version_added="0.10.2"):
return notify_error(
{
"non_field_errors": "Wildcard is only available in agent 0.10.2 or greater"
}
)
request.data["event_id"] = 0
elif check.check_type == "script":
added = "0.11.0"
try:
request.data["script_args"]
except KeyError:
pass
else:
if request.data["script_args"] and check.agent.not_supported(
version_added=added
):
return notify_error(
{
"non_field_errors": f"Script arguments only available in agent {added} or greater"
}
)
serializer = CheckSerializer(instance=check, data=request.data, partial=True)
serializer.is_valid(raise_exception=True)
obj = serializer.save()

View File

@@ -6,48 +6,48 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('clients', '0004_auto_20200821_2115'),
("clients", "0004_auto_20200821_2115"),
]
operations = [
migrations.AddField(
model_name='client',
name='created_by',
model_name="client",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='client',
name='created_time',
model_name="client",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='client',
name='modified_by',
model_name="client",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='client',
name='modified_time',
model_name="client",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='site',
name='created_by',
model_name="site",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='site',
name='created_time',
model_name="site",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='site',
name='modified_by',
model_name="site",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='site',
name='modified_time',
model_name="site",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]

View File

@@ -8,24 +8,67 @@ import uuid
class Migration(migrations.Migration):
dependencies = [
('knox', '0007_auto_20190111_0542'),
('clients', '0005_auto_20200922_1344'),
("knox", "0007_auto_20190111_0542"),
("clients", "0005_auto_20200922_1344"),
]
operations = [
migrations.CreateModel(
name='Deployment',
name="Deployment",
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('uid', models.UUIDField(default=uuid.uuid4, editable=False)),
('mon_type', models.CharField(choices=[('server', 'Server'), ('workstation', 'Workstation')], default='server', max_length=255)),
('arch', models.CharField(choices=[('64', '64 bit'), ('32', '32 bit')], default='64', max_length=255)),
('expiry', models.DateTimeField(blank=True, null=True)),
('token_key', models.CharField(max_length=255)),
('install_flags', models.JSONField(blank=True, null=True)),
('auth_token', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='deploytokens', to='knox.authtoken')),
('client', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='deployclients', to='clients.client')),
('site', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='deploysites', to='clients.site')),
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("uid", models.UUIDField(default=uuid.uuid4, editable=False)),
(
"mon_type",
models.CharField(
choices=[("server", "Server"), ("workstation", "Workstation")],
default="server",
max_length=255,
),
),
(
"arch",
models.CharField(
choices=[("64", "64 bit"), ("32", "32 bit")],
default="64",
max_length=255,
),
),
("expiry", models.DateTimeField(blank=True, null=True)),
("token_key", models.CharField(max_length=255)),
("install_flags", models.JSONField(blank=True, null=True)),
(
"auth_token",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="deploytokens",
to="knox.authtoken",
),
),
(
"client",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="deployclients",
to="clients.client",
),
),
(
"site",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="deploysites",
to="clients.site",
),
),
],
),
]

View File

@@ -6,18 +6,18 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('clients', '0006_deployment'),
("clients", "0006_deployment"),
]
operations = [
migrations.RenameField(
model_name='client',
old_name='client',
new_name='name',
model_name="client",
old_name="client",
new_name="name",
),
migrations.RenameField(
model_name='site',
old_name='site',
new_name='name',
model_name="site",
old_name="site",
new_name="name",
),
]

View File

@@ -6,16 +6,16 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('clients', '0007_auto_20201102_1920'),
("clients", "0007_auto_20201102_1920"),
]
operations = [
migrations.AlterModelOptions(
name='client',
options={'ordering': ('name',)},
name="client",
options={"ordering": ("name",)},
),
migrations.AlterModelOptions(
name='site',
options={'ordering': ('name',)},
name="site",
options={"ordering": ("name",)},
),
]

View File

@@ -38,7 +38,6 @@ class Client(BaseAuditModel):
@property
def has_failing_checks(self):
agents = (
Agent.objects.only(
"pk",
@@ -50,14 +49,17 @@ class Client(BaseAuditModel):
.filter(site__client=self)
.prefetch_related("agentchecks")
)
failing = 0
for agent in agents:
if agent.checks["has_failing_checks"]:
return True
failing += 1
if agent.overdue_email_alert or agent.overdue_text_alert:
return agent.status == "overdue"
if agent.status == "overdue":
failing += 1
return False
return failing > 0
@staticmethod
def serialize(client):
@@ -98,7 +100,6 @@ class Site(BaseAuditModel):
@property
def has_failing_checks(self):
agents = (
Agent.objects.only(
"pk",
@@ -110,14 +111,17 @@ class Site(BaseAuditModel):
.filter(site=self)
.prefetch_related("agentchecks")
)
failing = 0
for agent in agents:
if agent.checks["has_failing_checks"]:
return True
failing += 1
if agent.overdue_email_alert or agent.overdue_text_alert:
return agent.status == "overdue"
if agent.status == "overdue":
failing += 1
return False
return failing > 0
@staticmethod
def serialize(site):
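The rewritten `has_failing_checks` property above replaces early returns with a counter, so an overdue agent now also marks its client or site as failing instead of short-circuiting. The counting logic can be sketched standalone (plain dicts stand in for `Agent` objects; the key names mirror the diff):

```python
def has_failing_checks(agents):
    # An agent contributes to "failing" if it has a failing check
    # or is overdue; any such agent flags the whole client/site.
    failing = 0
    for agent in agents:
        if agent["checks"]["has_failing_checks"]:
            failing += 1
        if agent["status"] == "overdue":
            failing += 1
    return failing > 0
```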

View File

@@ -282,4 +282,4 @@ class GenerateAgent(APIView):
response = HttpResponse()
response["Content-Disposition"] = f"attachment; filename={file_name}"
response["X-Accel-Redirect"] = f"/private/exe/{file_name}"
return response
return response

View File

@@ -56,8 +56,9 @@ func downloadAgent(filepath string) (err error) {
func main() {
debugLog := flag.String("log", "", "Verbose output")
localSalt := flag.String("local-salt", "", "Use local salt minion")
localMesh := flag.String("local-mesh", "", "Use local mesh agent")
noSalt := flag.Bool("nosalt", false, "Do not install salt")
silent := flag.Bool("silent", false, "Do not pop up any message boxes during installation")
cert := flag.String("cert", "", "Path to ca.pem")
timeout := flag.String("timeout", "", "Timeout for subprocess calls")
flag.Parse()
@@ -78,35 +79,39 @@ func main() {
}
if debug {
cmdArgs = append(cmdArgs, "--log", "DEBUG")
cmdArgs = append(cmdArgs, "-log", "debug")
}
if len(strings.TrimSpace(*localSalt)) != 0 {
cmdArgs = append(cmdArgs, "--local-salt", *localSalt)
if *silent {
cmdArgs = append(cmdArgs, "-silent")
}
if *noSalt {
cmdArgs = append(cmdArgs, "-nosalt")
}
if len(strings.TrimSpace(*localMesh)) != 0 {
cmdArgs = append(cmdArgs, "--local-mesh", *localMesh)
cmdArgs = append(cmdArgs, "-local-mesh", *localMesh)
}
if len(strings.TrimSpace(*cert)) != 0 {
cmdArgs = append(cmdArgs, "--cert", *cert)
cmdArgs = append(cmdArgs, "-cert", *cert)
}
if len(strings.TrimSpace(*timeout)) != 0 {
cmdArgs = append(cmdArgs, "--timeout", *timeout)
cmdArgs = append(cmdArgs, "-timeout", *timeout)
}
if Rdp == "1" {
cmdArgs = append(cmdArgs, "--rdp")
cmdArgs = append(cmdArgs, "-rdp")
}
if Ping == "1" {
cmdArgs = append(cmdArgs, "--ping")
cmdArgs = append(cmdArgs, "-ping")
}
if Power == "1" {
cmdArgs = append(cmdArgs, "--power")
cmdArgs = append(cmdArgs, "-power")
}
if debug {

View File

@@ -11,12 +11,11 @@ class Command(BaseCommand):
help = "Sets up initial mesh central configuration"
async def websocket_call(self, mesh_settings):
token = get_auth_token(
mesh_settings.mesh_username, mesh_settings.mesh_token
)
token = get_auth_token(mesh_settings.mesh_username, mesh_settings.mesh_token)
if settings.MESH_WS_URL:
uri = f"{settings.MESH_WS_URL}/control.ashx?auth={token}"
if settings.DOCKER_BUILD:
site = mesh_settings.mesh_site.replace("https", "ws")
uri = f"{site}:443/control.ashx?auth={token}"
else:
site = mesh_settings.mesh_site.replace("https", "wss")
uri = f"{site}/control.ashx?auth={token}"

View File

@@ -12,12 +12,11 @@ class Command(BaseCommand):
async def websocket_call(self, mesh_settings):
token = get_auth_token(
mesh_settings.mesh_username, mesh_settings.mesh_token
)
token = get_auth_token(mesh_settings.mesh_username, mesh_settings.mesh_token)
if settings.MESH_WS_URL:
uri = f"{settings.MESH_WS_URL}/control.ashx?auth={token}"
if settings.DOCKER_BUILD:
site = mesh_settings.mesh_site.replace("https", "ws")
uri = f"{site}:443/control.ashx?auth={token}"
else:
site = mesh_settings.mesh_site.replace("https", "wss")
uri = f"{site}/control.ashx?auth={token}"
@@ -52,11 +51,17 @@ class Command(BaseCommand):
try:
# Check for Mesh Username
if not mesh_settings.mesh_username or settings.MESH_USERNAME != mesh_settings.mesh_username:
if (
not mesh_settings.mesh_username
or settings.MESH_USERNAME != mesh_settings.mesh_username
):
mesh_settings.mesh_username = settings.MESH_USERNAME
# Check for Mesh Site
if not mesh_settings.mesh_site or settings.MESH_SITE != mesh_settings.mesh_site:
if (
not mesh_settings.mesh_site
or settings.MESH_SITE != mesh_settings.mesh_site
):
mesh_settings.mesh_site = settings.MESH_SITE
# Check for Mesh Token
@@ -75,7 +80,9 @@ class Command(BaseCommand):
return
try:
asyncio.get_event_loop().run_until_complete(self.websocket_call(mesh_settings))
asyncio.get_event_loop().run_until_complete(
self.websocket_call(mesh_settings)
)
self.stdout.write("Initial Mesh Central setup complete")
except websockets.exceptions.ConnectionClosedError:
self.stdout.write(

View File

@@ -1,9 +1,7 @@
import os
import shutil
import subprocess
import sys
import tempfile
from time import sleep
from django.core.management.base import BaseCommand
@@ -15,18 +13,6 @@ class Command(BaseCommand):
help = "Collection of tasks to run after updating the rmm, after migrations"
def handle(self, *args, **kwargs):
if not os.path.exists("/usr/local/bin/goversioninfo"):
self.stdout.write(self.style.ERROR("*" * 100))
self.stdout.write("\n")
self.stdout.write(
self.style.ERROR(
"ERROR: New update script available. Delete this one and re-download."
)
)
self.stdout.write("\n")
sys.exit(1)
# 10-16-2020 changed the type of the agent's 'disks' model field
# from a dict of dicts, to a list of disks in the golang agent
# the following will convert dicts to lists for agents still on the python agent
@@ -43,88 +29,17 @@ class Command(BaseCommand):
self.style.SUCCESS(f"Migrated disks on {agent.hostname}")
)
# sync modules. split into chunks of 60 agents to not overload the salt master
agents = Agent.objects.all()
online = [i.salt_id for i in agents if i.status == "online"]
chunks = (online[i : i + 60] for i in range(0, len(online), 60))
self.stdout.write(self.style.SUCCESS("Syncing agent modules..."))
for chunk in chunks:
r = Agent.salt_batch_async(minions=chunk, func="saltutil.sync_modules")
sleep(5)
has_old_config = True
rmm_conf = "/etc/nginx/sites-available/rmm.conf"
if os.path.exists(rmm_conf):
with open(rmm_conf) as f:
for line in f:
if "location" and "builtin" in line:
has_old_config = False
break
if has_old_config:
new_conf = """
location /builtin/ {
internal;
add_header "Access-Control-Allow-Origin" "https://rmm.yourwebsite.com";
alias /srv/salt/scripts/;
}
"""
after_this = """
location /saltscripts/ {
internal;
add_header "Access-Control-Allow-Origin" "https://rmm.yourwebsite.com";
alias /srv/salt/scripts/userdefined/;
}
"""
self.stdout.write(self.style.ERROR("*" * 100))
self.stdout.write("\n")
self.stdout.write(
self.style.ERROR(
"WARNING: A recent update requires you to manually edit your nginx config"
)
)
self.stdout.write("\n")
self.stdout.write(
self.style.ERROR("Please add the following location block to ")
+ self.style.WARNING(rmm_conf)
)
self.stdout.write(self.style.SUCCESS(new_conf))
self.stdout.write("\n")
self.stdout.write(
self.style.ERROR(
"You can paste the above right after the following block that's already in your nginx config:"
)
)
self.stdout.write(after_this)
self.stdout.write("\n")
self.stdout.write(
self.style.ERROR(
"Make sure to replace rmm.yourwebsite.com with your domain"
)
)
self.stdout.write(
self.style.ERROR("After editing, restart nginx with the command ")
+ self.style.WARNING("sudo systemctl restart nginx")
)
self.stdout.write("\n")
self.stdout.write(self.style.ERROR("*" * 100))
input("Press Enter to continue...")
# install go
if not os.path.exists("/usr/local/rmmgo/"):
self.stdout.write(self.style.SUCCESS("Installing golang"))
subprocess.run("sudo mkdir -p /usr/local/rmmgo", shell=True)
tmpdir = tempfile.mkdtemp()
r = subprocess.run(
f"wget https://golang.org/dl/go1.15.linux-amd64.tar.gz -P {tmpdir}",
f"wget https://golang.org/dl/go1.15.5.linux-amd64.tar.gz -P {tmpdir}",
shell=True,
)
gotar = os.path.join(tmpdir, "go1.15.linux-amd64.tar.gz")
gotar = os.path.join(tmpdir, "go1.15.5.linux-amd64.tar.gz")
subprocess.run(f"tar -xzf {gotar} -C {tmpdir}", shell=True)

View File

@@ -6,4 +6,4 @@ class Command(BaseCommand):
help = "Reload Nats"
def handle(self, *args, **kwargs):
reload_nats()
reload_nats()

View File

@@ -6,13 +6,13 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0008_auto_20200910_1434'),
("core", "0008_auto_20200910_1434"),
]
operations = [
migrations.AddField(
model_name='coresettings',
name='agent_auto_update',
model_name="coresettings",
name="agent_auto_update",
field=models.BooleanField(default=True),
),
]

View File

@@ -6,28 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0009_coresettings_agent_auto_update'),
("core", "0009_coresettings_agent_auto_update"),
]
operations = [
migrations.AddField(
model_name='coresettings',
name='created_by',
model_name="coresettings",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='coresettings',
name='created_time',
model_name="coresettings",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='coresettings',
name='modified_by',
model_name="coresettings",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='coresettings',
name='modified_time',
model_name="coresettings",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]

View File

@@ -7,28 +7,34 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0010_auto_20201002_1257'),
("core", "0010_auto_20201002_1257"),
]
operations = [
migrations.AddField(
model_name='coresettings',
name='sms_alert_recipients',
field=django.contrib.postgres.fields.ArrayField(base_field=models.CharField(blank=True, max_length=255, null=True), blank=True, default=list, null=True, size=None),
model_name="coresettings",
name="sms_alert_recipients",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(blank=True, max_length=255, null=True),
blank=True,
default=list,
null=True,
size=None,
),
),
migrations.AddField(
model_name='coresettings',
name='twilio_account_sid',
model_name="coresettings",
name="twilio_account_sid",
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AddField(
model_name='coresettings',
name='twilio_auth_token',
model_name="coresettings",
name="twilio_auth_token",
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AddField(
model_name='coresettings',
name='twilio_number',
model_name="coresettings",
name="twilio_number",
field=models.CharField(blank=True, max_length=255, null=True),
),
]

View File

@@ -1,5 +1,7 @@
from tacticalrmm.test import TacticalTestCase
from core.tasks import core_maintenance_tasks
from unittest.mock import patch
from model_bakery import baker, seq
class TestCoreTasks(TacticalTestCase):
@@ -31,3 +33,45 @@ class TestCoreTasks(TacticalTestCase):
self.assertEqual(r.status_code, 200)
self.check_not_authenticated("get", url)
@patch("autotasks.tasks.remove_orphaned_win_tasks.delay")
def test_ui_maintenance_actions(self, remove_orphaned_win_tasks):
url = "/core/servermaintenance/"
agents = baker.make_recipe("agents.online_agent", _quantity=3)
# test with empty data
r = self.client.post(url, {})
self.assertEqual(r.status_code, 400)
# test with invalid action
data = {"action": "invalid_action"}
r = self.client.post(url, data)
self.assertEqual(r.status_code, 400)
# test reload nats action
data = {"action": "reload_nats"}
r = self.client.post(url, data)
self.assertEqual(r.status_code, 200)
# test prune db with no tables
data = {"action": "prune_db"}
r = self.client.post(url, data)
self.assertEqual(r.status_code, 400)
# test prune db with tables
data = {
"action": "prune_db",
"prune_tables": ["audit_logs", "agent_outages", "pending_actions"],
}
r = self.client.post(url, data)
self.assertEqual(r.status_code, 200)
# test remove orphaned tasks
data = {"action": "rm_orphaned_tasks"}
r = self.client.post(url, data)
self.assertEqual(r.status_code, 200)
remove_orphaned_win_tasks.assert_called()
self.check_not_authenticated("post", url)

View File

@@ -8,4 +8,5 @@ urlpatterns = [
path("version/", views.version),
path("emailtest/", views.email_test),
path("dashinfo/", views.dashboard_info),
path("servermaintenance/", views.server_maintenance),
]

View File

@@ -42,18 +42,20 @@ def get_core_settings(request):
@api_view(["PATCH"])
def edit_settings(request):
settings = CoreSettings.objects.first()
serializer = CoreSettingsSerializer(instance=settings, data=request.data)
coresettings = CoreSettings.objects.first()
old_server_policy = coresettings.server_policy
old_workstation_policy = coresettings.workstation_policy
serializer = CoreSettingsSerializer(instance=coresettings, data=request.data)
serializer.is_valid(raise_exception=True)
new_settings = serializer.save()
# check if default policies changed
if settings.server_policy != new_settings.server_policy:
if old_server_policy != new_settings.server_policy:
generate_all_agent_checks_task.delay(
mon_type="server", clear=True, create_tasks=True
)
if settings.workstation_policy != new_settings.workstation_policy:
if old_workstation_policy != new_settings.workstation_policy:
generate_all_agent_checks_task.delay(
mon_type="workstation", clear=True, create_tasks=True
)
@@ -69,7 +71,11 @@ def version(request):
@api_view()
def dashboard_info(request):
return Response(
{"trmm_version": settings.TRMM_VERSION, "dark_mode": request.user.dark_mode}
{
"trmm_version": settings.TRMM_VERSION,
"dark_mode": request.user.dark_mode,
"show_community_scripts": request.user.show_community_scripts,
}
)
@@ -84,3 +90,56 @@ def email_test(request):
return notify_error(r)
return Response("Email Test OK!")
@api_view(["POST"])
def server_maintenance(request):
from tacticalrmm.utils import reload_nats
if "action" not in request.data:
return notify_error("The data is incorrect")
if request.data["action"] == "reload_nats":
reload_nats()
return Response("Nats configuration was reloaded successfully.")
if request.data["action"] == "rm_orphaned_tasks":
from agents.models import Agent
from autotasks.tasks import remove_orphaned_win_tasks
agents = Agent.objects.all()
online = [i for i in agents if i.status == "online"]
for agent in online:
remove_orphaned_win_tasks.delay(agent.pk)
return Response(
"The task has been initiated. Check the Debug Log in the UI for progress."
)
if request.data["action"] == "prune_db":
from agents.models import AgentOutage
from logs.models import AuditLog, PendingAction
if "prune_tables" not in request.data:
return notify_error("The data is incorrect.")
tables = request.data["prune_tables"]
records_count = 0
if "agent_outages" in tables:
agentoutages = AgentOutage.objects.exclude(recovery_time=None)
records_count += agentoutages.count()
agentoutages.delete()
if "audit_logs" in tables:
auditlogs = AuditLog.objects.filter(action="check_run")
records_count += auditlogs.count()
auditlogs.delete()
if "pending_actions" in tables:
pendingactions = PendingAction.objects.filter(status="completed")
records_count += pendingactions.count()
pendingactions.delete()
return Response(f"{records_count} records were pruned from the database")
return notify_error("The data is incorrect")

View File

@@ -6,13 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0007_auditlog_debug_info'),
("logs", "0007_auditlog_debug_info"),
]
operations = [
migrations.AlterField(
model_name='auditlog',
name='action',
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command')], max_length=100),
model_name="auditlog",
name="action",
field=models.CharField(
choices=[
("login", "User Login"),
("failed_login", "Failed User Login"),
("delete", "Delete Object"),
("modify", "Modify Object"),
("add", "Add Object"),
("view", "View Object"),
("check_run", "Check Run"),
("task_run", "Task Run"),
("remote_session", "Remote Session"),
("execute_script", "Execute Script"),
("execute_command", "Execute Command"),
],
max_length=100,
),
),
]

View File

@@ -6,13 +6,29 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0008_auto_20201110_1431'),
("logs", "0008_auto_20201110_1431"),
]
operations = [
migrations.AlterField(
model_name='auditlog',
name='action',
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('agent_install', 'Agent Install'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command')], max_length=100),
model_name="auditlog",
name="action",
field=models.CharField(
choices=[
("login", "User Login"),
("failed_login", "Failed User Login"),
("delete", "Delete Object"),
("modify", "Modify Object"),
("add", "Add Object"),
("view", "View Object"),
("check_run", "Check Run"),
("task_run", "Task Run"),
("agent_install", "Agent Install"),
("remote_session", "Remote Session"),
("execute_script", "Execute Script"),
("execute_command", "Execute Command"),
],
max_length=100,
),
),
]

View File

@@ -6,18 +6,50 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0009_auto_20201110_1431'),
("logs", "0009_auto_20201110_1431"),
]
operations = [
migrations.AlterField(
model_name='auditlog',
name='action',
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('agent_install', 'Agent Install'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command'), ('bulk_action', 'Bulk Action')], max_length=100),
model_name="auditlog",
name="action",
field=models.CharField(
choices=[
("login", "User Login"),
("failed_login", "Failed User Login"),
("delete", "Delete Object"),
("modify", "Modify Object"),
("add", "Add Object"),
("view", "View Object"),
("check_run", "Check Run"),
("task_run", "Task Run"),
("agent_install", "Agent Install"),
("remote_session", "Remote Session"),
("execute_script", "Execute Script"),
("execute_command", "Execute Command"),
("bulk_action", "Bulk Action"),
],
max_length=100,
),
),
migrations.AlterField(
model_name='auditlog',
name='object_type',
field=models.CharField(choices=[('user', 'User'), ('script', 'Script'), ('agent', 'Agent'), ('policy', 'Policy'), ('winupdatepolicy', 'Patch Policy'), ('client', 'Client'), ('site', 'Site'), ('check', 'Check'), ('automatedtask', 'Automated Task'), ('coresettings', 'Core Settings'), ('bulk', 'Bulk')], max_length=100),
model_name="auditlog",
name="object_type",
field=models.CharField(
choices=[
("user", "User"),
("script", "Script"),
("agent", "Agent"),
("policy", "Policy"),
("winupdatepolicy", "Patch Policy"),
("client", "Client"),
("site", "Site"),
("check", "Check"),
("automatedtask", "Automated Task"),
("coresettings", "Core Settings"),
("bulk", "Bulk"),
],
max_length=100,
),
),
]

View File

@@ -6,13 +6,22 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0010_auto_20201110_2238'),
("logs", "0010_auto_20201110_2238"),
]
operations = [
migrations.AlterField(
model_name='pendingaction',
name='action_type',
field=models.CharField(blank=True, choices=[('schedreboot', 'Scheduled Reboot'), ('taskaction', 'Scheduled Task Action'), ('agentupdate', 'Agent Update')], max_length=255, null=True),
model_name="pendingaction",
name="action_type",
field=models.CharField(
blank=True,
choices=[
("schedreboot", "Scheduled Reboot"),
("taskaction", "Scheduled Task Action"),
("agentupdate", "Agent Update"),
],
max_length=255,
null=True,
),
),
]

View File

@@ -1,5 +1,4 @@
import datetime as dt
import json
from abc import abstractmethod
from django.db import models
from tacticalrmm.middleware import get_username, get_debug_info

View File

@@ -1,29 +0,0 @@
from loguru import logger
from tacticalrmm.celery import app
from django.conf import settings
logger.configure(**settings.LOG_CONFIG)
@app.task
def cancel_pending_action_task(data):
if data["action_type"] == "schedreboot" and data["status"] == "pending":
from agents.models import Agent
agent = Agent.objects.get(pk=data["agent"])
task_name = data["details"]["taskname"]
r = agent.salt_api_cmd(
timeout=30, func="task.delete_task", arg=[f"name={task_name}"]
)
if r == "timeout" or r == "error" or (isinstance(r, bool) and not r):
logger.error(
f"Unable to contact {agent.hostname}. Task {task_name} will need to cancelled manually."
)
return
else:
logger.info(f"Scheduled reboot cancelled on {agent.hostname}")
return "ok"

View File

@@ -122,10 +122,25 @@ class TestAuditViews(TacticalTestCase):
{"filter": {"clientFilter": [site.client.id]}, "count": 23},
]
pagination = {
"rowsPerPage": 25,
"page": 1,
"sortBy": "entry_time",
"descending": True,
}
for req in data:
resp = self.client.patch(url, req["filter"], format="json")
resp = self.client.patch(
url, {**req["filter"], "pagination": pagination}, format="json"
)
self.assertEqual(resp.status_code, 200)
self.assertEqual(len(resp.data), req["count"])
self.assertEqual(
len(resp.data["audit_logs"]),
pagination["rowsPerPage"]
if req["count"] > pagination["rowsPerPage"]
else req["count"],
)
self.assertEqual(resp.data["total"], req["count"])
self.check_not_authenticated("patch", url)
@@ -190,54 +205,31 @@ class TestAuditViews(TacticalTestCase):
self.check_not_authenticated("get", url)
@patch("logs.tasks.cancel_pending_action_task.delay")
def test_cancel_pending_action(self, mock_task):
@patch("agents.models.Agent.nats_cmd")
def test_cancel_pending_action(self, nats_cmd):
url = "/logs/cancelpendingaction/"
pending_action = baker.make("logs.PendingAction")
# TODO fix this TypeError: Object of type coroutine is not JSON serializable
""" agent = baker.make("agents.Agent", version="1.1.1")
pending_action = baker.make(
"logs.PendingAction",
agent=agent,
details={
"time": "2021-01-13 18:20:00",
"taskname": "TacticalRMM_SchedReboot_wYzCCDVXlc",
},
)
serializer = PendingActionSerializer(pending_action).data
data = {"pk": pending_action.id}
resp = self.client.delete(url, data, format="json")
self.assertEqual(resp.status_code, 200)
mock_task.assert_called_with(serializer)
nats_data = {
"func": "delschedtask",
"schedtaskpayload": {"name": "TacticalRMM_SchedReboot_wYzCCDVXlc"},
}
nats_cmd.assert_called_with(nats_data, timeout=10)
# try request again and it should fail since pending action doesn't exist
resp = self.client.delete(url, data, format="json")
self.assertEqual(resp.status_code, 404)
self.assertEqual(resp.status_code, 404) """
self.check_not_authenticated("delete", url)
class TestLogsTasks(TacticalTestCase):
def setUp(self):
self.authenticate()
@patch("agents.models.Agent.salt_api_cmd")
def test_cancel_pending_action_task(self, mock_salt_cmd):
from .tasks import cancel_pending_action_task
pending_action = baker.make(
"logs.PendingAction",
action_type="schedreboot",
status="pending",
details={"taskname": "test_name"},
)
# data that is passed to the task
data = PendingActionSerializer(pending_action).data
# set return value on mock to success
mock_salt_cmd.return_value = "success"
# call task with valid data and see if salt is called with correct data
ret = cancel_pending_action_task(data)
mock_salt_cmd.assert_called_with(
timeout=30, func="task.delete_task", arg=["name=test_name"]
)
# this should return successful
self.assertEquals(ret, "ok")
# this run should return false
mock_salt_cmd.reset_mock()
mock_salt_cmd.return_value = "timeout"
ret = cancel_pending_action_task(data)
self.assertEquals(ret, None)

View File
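The commented-out test above fails because `Agent.nats_cmd` is an async method: patching it with a plain `Mock` makes `asyncio.run(...)` choke on the non-awaitable return value. A minimal sketch of the working pattern uses `unittest.mock.AsyncMock`; the `Agent` class and `cancel_sched_task` helper here are illustrative stand-ins, not the real model or view:

```python
import asyncio
from unittest.mock import AsyncMock, patch

class Agent:
    # stand-in for agents.models.Agent; the real nats_cmd talks to NATS
    async def nats_cmd(self, data, timeout=30):
        raise RuntimeError("would hit the network")

def cancel_sched_task(agent, taskname):
    # mirrors the view: build the payload and run the coroutine to completion
    nats_data = {
        "func": "delschedtask",
        "schedtaskpayload": {"name": taskname},
    }
    return asyncio.run(agent.nats_cmd(nats_data, timeout=10))

with patch.object(Agent, "nats_cmd", new_callable=AsyncMock) as nats_cmd:
    nats_cmd.return_value = "ok"
    r = cancel_sched_task(Agent(), "TacticalRMM_SchedReboot_wYzCCDVXlc")
    assert r == "ok"
    nats_cmd.assert_called_once()
```

Because `AsyncMock()` returns an awaitable, `asyncio.run` can drive it to completion exactly like the real coroutine.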

@@ -1,3 +1,4 @@
import asyncio
import subprocess
from django.conf import settings
@@ -5,6 +6,7 @@ from django.shortcuts import get_object_or_404
from django.http import HttpResponse
from django.utils import timezone as djangotime
from django.db.models import Q
from django.core.paginator import Paginator
from datetime import datetime as dt
from rest_framework.response import Response
@@ -18,7 +20,7 @@ from accounts.models import User
from .serializers import PendingActionSerializer, AuditLogSerializer
from agents.serializers import AgentHostnameSerializer
from accounts.serializers import UserSerializer
from .tasks import cancel_pending_action_task
from tacticalrmm.utils import notify_error
class GetAuditLogs(APIView):
@@ -26,6 +28,14 @@ class GetAuditLogs(APIView):
from clients.models import Client
from agents.models import Agent
pagination = request.data["pagination"]
order_by = (
f"-{pagination['sortBy']}"
if pagination["descending"]
else f"{pagination['sortBy']}"
)
agentFilter = Q()
clientFilter = Q()
actionFilter = Q()
@@ -67,9 +77,18 @@ class GetAuditLogs(APIView):
.filter(actionFilter)
.filter(objectFilter)
.filter(timeFilter)
)
).order_by(order_by)
return Response(AuditLogSerializer(audit_logs, many=True).data)
paginator = Paginator(audit_logs, pagination["rowsPerPage"])
return Response(
{
"audit_logs": AuditLogSerializer(
paginator.get_page(pagination["page"]), many=True
).data,
"total": paginator.count,
}
)
class FilterOptionsAuditLog(APIView):
@@ -95,19 +114,26 @@ def agent_pending_actions(request, pk):
@api_view()
def all_pending_actions(request):
actions = PendingAction.objects.all()
actions = PendingAction.objects.all().select_related("agent")
return Response(PendingActionSerializer(actions, many=True).data)
@api_view(["DELETE"])
def cancel_pending_action(request):
action = get_object_or_404(PendingAction, pk=request.data["pk"])
data = PendingActionSerializer(action).data
cancel_pending_action_task.delay(data)
if not action.agent.has_gotasks:
return notify_error("Requires agent version 1.1.1 or greater")
nats_data = {
"func": "delschedtask",
"schedtaskpayload": {"name": action.details["taskname"]},
}
r = asyncio.run(action.agent.nats_cmd(nats_data, timeout=10))
if r != "ok":
return notify_error(r)
action.delete()
return Response(
f"{action.agent.hostname}: {action.description} will be cancelled shortly"
)
return Response(f"{action.agent.hostname}: {action.description} was cancelled")
@api_view()

View File
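The view above hands the ordered queryset to Django's `Paginator` and returns one page of serialized rows plus the total count. The slicing arithmetic it relies on can be sketched in plain Python (the function and sample sizes here are illustrative, not the actual implementation):

```python
def paginate(rows, page, rows_per_page):
    """Mirror of what Paginator does for the audit log view:
    slice out one page and report the total row count."""
    total = len(rows)
    start = (page - 1) * rows_per_page
    return {"audit_logs": rows[start:start + rows_per_page], "total": total}

logs = list(range(86))  # 86 fake audit entries
first = paginate(logs, page=1, rows_per_page=25)
last = paginate(logs, page=4, rows_per_page=25)
assert len(first["audit_logs"]) == 25 and first["total"] == 86
assert len(last["audit_logs"]) == 11  # 86 - 3*25
```

This is also why the test asserts `len(resp.data["audit_logs"])` equals `rowsPerPage` only when the filtered count exceeds the page size.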

@@ -1,23 +1,23 @@
amqp==2.6.1
asgiref==3.3.0
asgiref==3.3.1
asyncio-nats-client==0.11.4
billiard==3.6.3.0
celery==4.4.6
certifi==2020.11.8
cffi==1.14.3
chardet==3.0.4
cryptography==3.2.1
certifi==2020.12.5
cffi==1.14.4
chardet==4.0.0
cryptography==3.3.1
decorator==4.4.2
Django==3.1.3
django-cors-headers==3.5.0
Django==3.1.4
django-cors-headers==3.6.0
django-rest-knox==4.1.0
djangorestframework==3.12.2
future==0.18.2
idna==2.10
kombu==4.6.11
loguru==0.5.3
msgpack==1.0.0
packaging==20.4
msgpack==1.0.2
packaging==20.8
psycopg2-binary==2.8.6
pycparser==2.20
pycryptodome==3.9.9
@@ -26,14 +26,13 @@ pyparsing==2.4.7
pytz==2020.4
qrcode==6.1
redis==3.5.3
requests==2.24.0
requests==2.25.1
six==1.15.0
sqlparse==0.4.1
tldextract==3.0.2
twilio==6.47.0
urllib3==1.25.11
twilio==6.50.1
urllib3==1.26.2
uWSGI==2.0.19.1
validators==0.18.1
validators==0.18.2
vine==1.3.0
websockets==8.1
zipp==3.4.0

View File

@@ -6,8 +6,5 @@ script = Recipe(
name="Test Script",
description="Test Desc",
shell="cmd",
filename="test.bat",
script_type="userdefined",
)
builtin_script = script.extend(script_type="builtin")

View File

@@ -0,0 +1,28 @@
# Generated by Django 3.1.3 on 2020-12-07 15:58
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("scripts", "0003_auto_20200922_1344"),
]
operations = [
migrations.AddField(
model_name="script",
name="category",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name="script",
name="favorite",
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name="script",
name="script_base64",
field=models.TextField(blank=True, null=True),
),
]

View File

@@ -0,0 +1,18 @@
# Generated by Django 3.1.3 on 2020-12-07 16:06
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("scripts", "0004_auto_20201207_1558"),
]
operations = [
migrations.RenameField(
model_name="script",
old_name="script_base64",
new_name="code_base64",
),
]

View File

@@ -0,0 +1,42 @@
# Generated by Django 3.1.4 on 2020-12-10 21:45
from django.db import migrations
from django.conf import settings
import os
import base64
from pathlib import Path
def move_scripts_to_db(apps, schema_editor):
print("")
Script = apps.get_model("scripts", "Script")
for script in Script.objects.all():
if not script.script_type == "builtin":
if script.filename:
filepath = f"{settings.SCRIPTS_DIR}/userdefined/{script.filename}"
else:
print("No filename found on script. Skipping.")

continue
# test if file exists
if os.path.exists(filepath):
print(f"Found script {script.name}. Importing code.")
with open(filepath, "rb") as f:
script_bytes = f.read().decode("utf-8").encode("ascii", "ignore")
script.code_base64 = base64.b64encode(script_bytes).decode("ascii")
script.save(update_fields=["code_base64"])
else:
print(
f"Script file {script.name} was not found on the disk. You will need to edit the script in the UI"
)
class Migration(migrations.Migration):
dependencies = [
("scripts", "0005_auto_20201207_1606"),
]
operations = [migrations.RunPython(move_scripts_to_db, migrations.RunPython.noop)]

View File
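The encode/decode dance in `move_scripts_to_db` (and reversed by the model's `code` property in the next diff) round-trips like this; the sample script line is made up for illustration:

```python
import base64

# the migration reads each script file, strips non-ASCII bytes, and stores
# the result base64-encoded in code_base64
raw = "Write-Host 'héllo'\n".encode("utf-8")
script_bytes = raw.decode("utf-8").encode("ascii", "ignore")  # drops the é
code_base64 = base64.b64encode(script_bytes).decode("ascii")

# the model's .code property reverses it
restored = base64.b64decode(code_base64.encode("ascii")).decode("ascii", "ignore")
assert restored == "Write-Host 'hllo'\n"
```

The `"ignore"` error handler means non-ASCII characters are silently dropped rather than raising, so the stored code is always plain ASCII.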

@@ -1,3 +1,4 @@
import base64
from django.db import models
from logs.models import BaseAuditModel
from django.conf import settings
@@ -17,41 +18,27 @@ SCRIPT_TYPES = [
class Script(BaseAuditModel):
name = models.CharField(max_length=255)
description = models.TextField(null=True, blank=True)
filename = models.CharField(max_length=255)
filename = models.CharField(max_length=255) # deprecated
shell = models.CharField(
max_length=100, choices=SCRIPT_SHELLS, default="powershell"
)
script_type = models.CharField(
max_length=100, choices=SCRIPT_TYPES, default="userdefined"
)
favorite = models.BooleanField(default=False)
category = models.CharField(max_length=100, null=True, blank=True)
code_base64 = models.TextField(null=True, blank=True)
def __str__(self):
return self.filename
@property
def filepath(self):
# for the windows agent when using 'salt-call'
if self.script_type == "userdefined":
return f"salt://scripts//userdefined//{self.filename}"
else:
return f"salt://scripts//{self.filename}"
@property
def file(self):
if self.script_type == "userdefined":
return f"{settings.SCRIPTS_DIR}/userdefined/{self.filename}"
else:
return f"{settings.SCRIPTS_DIR}/{self.filename}"
return self.name
@property
def code(self):
try:
with open(self.file, "r") as f:
text = f.read()
except:
text = "n/a"
return text
if self.code_base64:
base64_bytes = self.code_base64.encode("ascii", "ignore")
return base64.b64decode(base64_bytes).decode("ascii", "ignore")
else:
return ""
@classmethod
def load_community_scripts(cls):
@@ -79,22 +66,41 @@ class Script(BaseAuditModel):
for script in info:
if os.path.exists(os.path.join(scripts_dir, script["filename"])):
s = cls.objects.filter(script_type="builtin").filter(
filename=script["filename"]
name=script["name"]
)
if s.exists():
i = s.first()
i.name = script["name"]
i.description = script["description"]
i.save(update_fields=["name", "description"])
i.category = "Community"
with open(os.path.join(scripts_dir, script["filename"]), "rb") as f:
script_bytes = (
f.read().decode("utf-8").encode("ascii", "ignore")
)
i.code_base64 = base64.b64encode(script_bytes).decode("ascii")
i.save(
update_fields=["name", "description", "category", "code_base64"]
)
else:
print(f"Adding new community script: {script['name']}")
cls(
name=script["name"],
description=script["description"],
filename=script["filename"],
shell=script["shell"],
script_type="builtin",
).save()
with open(os.path.join(scripts_dir, script["filename"]), "rb") as f:
script_bytes = (
f.read().decode("utf-8").encode("ascii", "ignore")
)
code_base64 = base64.b64encode(script_bytes).decode("ascii")
cls(
code_base64=code_base64,
name=script["name"],
description=script["description"],
filename=script["filename"],
shell=script["shell"],
script_type="builtin",
category="Community",
).save()
@staticmethod
def serialize(script):

View File
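`load_community_scripts` now matches existing rows by `name` and either updates them in place or creates them, which keeps repeated runs idempotent. A hypothetical in-memory sketch of that upsert pattern (a dict stands in for the Script table):

```python
# dict keyed by script name stands in for Script.objects.filter(name=...)
db = {}

def load_community_scripts(info):
    for script in info:
        if script["name"] in db:
            # existing row: refresh mutable fields in place
            db[script["name"]].update(
                description=script["description"], category="Community"
            )
        else:
            # new row: insert with the Community category
            db[script["name"]] = {**script, "category": "Community"}

info = [{"name": "CleanTemp", "description": "clears temp"}]
load_community_scripts(info)
load_community_scripts(info)  # second run updates in place, no duplicates
assert len(db) == 1
```

This mirrors why the test suite calls `Script.load_community_scripts()` twice and asserts the builtin count stays equal to the JSON entry count.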

@@ -1,41 +1,33 @@
import os
from django.conf import settings
from rest_framework.serializers import ModelSerializer, ValidationError, ReadOnlyField
from rest_framework.serializers import ModelSerializer, ReadOnlyField
from .models import Script
class ScriptSerializer(ModelSerializer):
code = ReadOnlyField()
filepath = ReadOnlyField()
class ScriptTableSerializer(ModelSerializer):
class Meta:
model = Script
fields = "__all__"
fields = [
"id",
"name",
"description",
"script_type",
"shell",
"category",
"favorite",
]
def validate(self, val):
if "filename" in val:
# validate the filename
if (
not val["filename"].endswith(".py")
and not val["filename"].endswith(".ps1")
and not val["filename"].endswith(".bat")
):
raise ValidationError("File types supported are .py, .ps1 and .bat")
# make sure file doesn't already exist on server
# but only if adding, not if editing since will overwrite if edit
if not self.instance:
script_path = os.path.join(
f"{settings.SCRIPTS_DIR}/userdefined", val["filename"]
)
if os.path.exists(script_path):
raise ValidationError(
f"{val['filename']} already exists. Delete or edit the existing script first."
)
return val
class ScriptSerializer(ModelSerializer):
class Meta:
model = Script
fields = [
"id",
"name",
"description",
"shell",
"category",
"favorite",
"code_base64",
]
class ScriptCheckSerializer(ModelSerializer):

View File

@@ -1,10 +1,11 @@
import json
import os
from django.core.files.uploadedfile import SimpleUploadedFile
from pathlib import Path
from django.conf import settings
from tacticalrmm.test import TacticalTestCase
from model_bakery import baker
from .serializers import ScriptSerializer
from .serializers import ScriptSerializer, ScriptTableSerializer
from .models import Script
@@ -16,16 +17,50 @@ class TestScriptViews(TacticalTestCase):
url = "/scripts/scripts/"
scripts = baker.make("scripts.Script", _quantity=3)
serializer = ScriptSerializer(scripts, many=True)
serializer = ScriptTableSerializer(scripts, many=True)
resp = self.client.get(url, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(serializer.data, resp.data)
self.check_not_authenticated("get", url)
# TODO Need to test file uploads and saves
def test_add_script(self):
pass
url = f"/scripts/scripts/"
data = {
"name": "Name",
"description": "Description",
"shell": "powershell",
"category": "New",
"code": "Some Test Code\nnew Line",
}
# test without file upload
resp = self.client.post(url, data)
self.assertEqual(resp.status_code, 200)
self.assertTrue(Script.objects.filter(name="Name").exists())
self.assertEqual(Script.objects.get(name="Name").code, data["code"])
# test with file upload
# file with 'Test' as content
file = SimpleUploadedFile(
"test_script.bat", b"\x54\x65\x73\x74", content_type="text/plain"
)
data = {
"name": "New Name",
"description": "Description",
"shell": "cmd",
"category": "New",
"filename": file,
}
# test with file upload
resp = self.client.post(url, data, format="multipart")
self.assertEqual(resp.status_code, 200)
script = Script.objects.filter(name="New Name").first()
self.assertEquals(script.code, "Test")
self.check_not_authenticated("post", url)
def test_modify_script(self):
# test a call where script doesn't exist
@@ -40,23 +75,39 @@ class TestScriptViews(TacticalTestCase):
"name": script.name,
"description": "Description Change",
"shell": script.shell,
"code": "Test Code\nAnother Line",
}
# test edit a userdefined script
resp = self.client.put(url, data, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEquals(
Script.objects.get(pk=script.pk).description, "Description Change"
)
script = Script.objects.get(pk=script.pk)
self.assertEquals(script.description, "Description Change")
self.assertEquals(script.code, "Test Code\nAnother Line")
# test edit a builtin script
builtin_script = baker.make_recipe("scripts.builtin_script")
data = {"name": "New Name", "description": "New Desc", "code": "Some New Code"}
builtin_script = baker.make_recipe("scripts.script", script_type="builtin")
resp = self.client.put(
f"/scripts/{builtin_script.pk}/script/", data, format="json"
)
self.assertEqual(resp.status_code, 400)
# TODO Test changing script file
data = {
"name": script.name,
"description": "Description Change",
"shell": script.shell,
"favorite": True,
"code": "Test Code\nAnother Line",
}
# test marking a builtin script as favorite
resp = self.client.put(
f"/scripts/{builtin_script.pk}/script/", data, format="json"
)
self.assertEqual(resp.status_code, 200)
self.assertTrue(Script.objects.get(pk=builtin_script.pk).favorite)
self.check_not_authenticated("put", url)
@@ -79,6 +130,7 @@ class TestScriptViews(TacticalTestCase):
resp = self.client.delete("/scripts/500/script/", format="json")
self.assertEqual(resp.status_code, 404)
# test delete script
script = baker.make_recipe("scripts.script")
url = f"/scripts/{script.pk}/script/"
resp = self.client.delete(url, format="json")
@@ -86,13 +138,50 @@ class TestScriptViews(TacticalTestCase):
self.assertFalse(Script.objects.filter(pk=script.pk).exists())
# test delete community script
script = baker.make_recipe("scripts.script", script_type="builtin")
url = f"/scripts/{script.pk}/script/"
resp = self.client.delete(url, format="json")
self.assertEqual(resp.status_code, 400)
self.check_not_authenticated("delete", url)
# TODO Need to mock file open
def test_download_script(self):
pass
# test a call where script doesn't exist
resp = self.client.get("/scripts/500/download/", format="json")
self.assertEqual(resp.status_code, 404)
def test_load_community_scripts(self):
# return script code property should be "Test"
# test powershell file
script = baker.make(
"scripts.Script", code_base64="VGVzdA==", shell="powershell"
)
url = f"/scripts/{script.pk}/download/"
resp = self.client.get(url, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(resp.data, {"filename": f"{script.name}.ps1", "code": "Test"})
# test batch file
script = baker.make("scripts.Script", code_base64="VGVzdA==", shell="cmd")
url = f"/scripts/{script.pk}/download/"
resp = self.client.get(url, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(resp.data, {"filename": f"{script.name}.bat", "code": "Test"})
# test python file
script = baker.make("scripts.Script", code_base64="VGVzdA==", shell="python")
url = f"/scripts/{script.pk}/download/"
resp = self.client.get(url, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(resp.data, {"filename": f"{script.name}.py", "code": "Test"})
self.check_not_authenticated("get", url)
def test_community_script_json_file(self):
valid_shells = ["powershell", "python", "cmd"]
if not settings.DOCKER_BUILD:
@@ -113,5 +202,19 @@ class TestScriptViews(TacticalTestCase):
self.assertTrue(script["name"])
self.assertTrue(script["description"])
self.assertTrue(script["shell"])
self.assertTrue(script["description"])
self.assertIn(script["shell"], valid_shells)
def test_load_community_scripts(self):
with open(
os.path.join(settings.BASE_DIR, "scripts/community_scripts.json")
) as f:
info = json.load(f)
Script.load_community_scripts()
community_scripts = Script.objects.filter(script_type="builtin").count()
self.assertEqual(len(info), community_scripts)
# test updating already added community scripts
Script.load_community_scripts()
self.assertEqual(len(info), community_scripts)

View File

@@ -1,4 +1,4 @@
import os
import base64
from loguru import logger
from django.shortcuts import get_object_or_404
@@ -11,9 +11,10 @@ from rest_framework.response import Response
from rest_framework.parsers import FileUploadParser
from .models import Script
from .serializers import ScriptSerializer
from .serializers import ScriptSerializer, ScriptTableSerializer
from tacticalrmm.utils import notify_error
logger.configure(**settings.LOG_CONFIG)
@@ -22,74 +23,65 @@ class GetAddScripts(APIView):
def get(self, request):
scripts = Script.objects.all()
return Response(ScriptSerializer(scripts, many=True).data)
return Response(ScriptTableSerializer(scripts, many=True).data)
def put(self, request, format=None):
def post(self, request, format=None):
file_obj = request.data["filename"] # the actual file in_memory object
# need to manually create the serialized data
# since javascript formData doesn't support JSON
filename = str(file_obj)
data = {
"name": request.data["name"],
"filename": filename,
"category": request.data["category"],
"description": request.data["description"],
"shell": request.data["shell"],
"script_type": "userdefined", # force all uploads to be userdefined. built in scripts cannot be edited by user
}
if "favorite" in request.data:
data["favorite"] = request.data["favorite"]
if "filename" in request.data:
message_bytes = request.data["filename"].read()
data["code_base64"] = base64.b64encode(message_bytes).decode(
"ascii", "ignore"
)
elif "code" in request.data:
message_bytes = request.data["code"].encode("ascii", "ignore")
data["code_base64"] = base64.b64encode(message_bytes).decode("ascii")
serializer = ScriptSerializer(data=data, partial=True)
serializer.is_valid(raise_exception=True)
obj = serializer.save()
with open(obj.file, "wb+") as f:
for chunk in file_obj.chunks():
f.write(chunk)
return Response(f"{obj.name} was added!")
class GetUpdateDeleteScript(APIView):
parser_class = (FileUploadParser,)
def get(self, request, pk):
script = get_object_or_404(Script, pk=pk)
return Response(ScriptSerializer(script).data)
def put(self, request, pk, format=None):
def put(self, request, pk):
script = get_object_or_404(Script, pk=pk)
# this will never trigger but check anyway
data = request.data
if script.script_type == "builtin":
return notify_error("Built in scripts cannot be edited")
# allow only favoriting builtin scripts
if "favorite" in data:
# overwrite request data
data = {"favorite": data["favorite"]}
else:
return notify_error("Community scripts cannot be edited.")
data = {
"name": request.data["name"],
"description": request.data["description"],
"shell": request.data["shell"],
}
# if uploading a new version of the script
if "filename" in request.data:
file_obj = request.data["filename"]
data["filename"] = str(file_obj)
elif "code" in data:
message_bytes = data["code"].encode("ascii")
data["code_base64"] = base64.b64encode(message_bytes).decode("ascii")
data.pop("code")
serializer = ScriptSerializer(data=data, instance=script, partial=True)
serializer.is_valid(raise_exception=True)
obj = serializer.save()
if "filename" in request.data:
try:
os.remove(obj.file)
except OSError:
pass
with open(obj.file, "wb+") as f:
for chunk in file_obj.chunks():
f.write(chunk)
return Response(f"{obj.name} was edited!")
def delete(self, request, pk):
@@ -97,12 +89,7 @@ class GetUpdateDeleteScript(APIView):
# this will never trigger but check anyway
if script.script_type == "builtin":
return notify_error("Built in scripts cannot be deleted")
try:
os.remove(script.file)
except OSError:
pass
return notify_error("Community scripts cannot be deleted")
script.delete()
return Response(f"{script.name} was deleted!")
@@ -111,33 +98,12 @@ class GetUpdateDeleteScript(APIView):
@api_view()
def download(request, pk):
script = get_object_or_404(Script, pk=pk)
use_nginx = False
conf = "/etc/nginx/sites-available/rmm.conf"
if os.path.exists(conf):
try:
with open(conf) as f:
for line in f.readlines():
if "location" and "builtin" in line:
use_nginx = True
break
except Exception as e:
logger.error(e)
if script.shell == "powershell":
filename = f"{script.name}.ps1"
elif script.shell == "cmd":
filename = f"{script.name}.bat"
else:
use_nginx = True
filename = f"{script.name}.py"
if settings.DEBUG or not use_nginx:
with open(script.file, "rb") as f:
response = HttpResponse(f.read(), content_type="text/plain")
response["Content-Disposition"] = f"attachment; filename={script.filename}"
return response
else:
response = HttpResponse()
response["Content-Disposition"] = f"attachment; filename={script.filename}"
response["X-Accel-Redirect"] = (
f"/saltscripts/{script.filename}"
if script.script_type == "userdefined"
else f"/builtin/{script.filename}"
)
return response
return Response({"filename": filename, "code": script.code})

View File
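The rewritten `download` view drops the nginx `X-Accel-Redirect` file streaming entirely: it maps the script's shell to a file extension and returns the decoded code as JSON. The mapping can be sketched as (helper name is illustrative):

```python
def download_payload(name, shell, code):
    """Mirror of the rewritten download view: pick an extension from the
    shell and return the decoded code instead of streaming a file."""
    ext = {"powershell": ".ps1", "cmd": ".bat"}.get(shell, ".py")
    return {"filename": f"{name}{ext}", "code": code}

assert download_payload("Reboot", "powershell", "Test")["filename"] == "Reboot.ps1"
assert download_payload("Reboot", "cmd", "Test")["filename"] == "Reboot.bat"
assert download_payload("Reboot", "python", "Test")["filename"] == "Reboot.py"
```

This matches the three `test_download_script` cases above, which all use `code_base64="VGVzdA=="` (base64 for `Test`).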

@@ -9,21 +9,6 @@ class TestServiceViews(TacticalTestCase):
def setUp(self):
self.authenticate()
def test_get_services(self):
# test a call where agent doesn't exist
resp = self.client.get("/services/500/services/", format="json")
self.assertEqual(resp.status_code, 404)
agent = baker.make_recipe("agents.agent_with_services")
url = f"/services/{agent.pk}/services/"
serializer = ServicesSerializer(agent)
resp = self.client.get(url, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(serializer.data, resp.data)
self.check_not_authenticated("get", url)
def test_default_services(self):
url = "/services/defaultservices/"
resp = self.client.get(url, format="json")
@@ -33,13 +18,13 @@ class TestServiceViews(TacticalTestCase):
self.check_not_authenticated("get", url)
@patch("agents.models.Agent.nats_cmd")
def test_get_refreshed_services(self, nats_cmd):
def test_get_services(self, nats_cmd):
# test a call where agent doesn't exist
resp = self.client.get("/services/500/refreshedservices/", format="json")
resp = self.client.get("/services/500/services/", format="json")
self.assertEqual(resp.status_code, 404)
agent = baker.make_recipe("agents.agent_with_services")
url = f"/services/{agent.pk}/refreshedservices/"
url = f"/services/{agent.pk}/services/"
nats_return = [
{

View File

@@ -4,7 +4,6 @@ from . import views
urlpatterns = [
path("<int:pk>/services/", views.get_services),
path("defaultservices/", views.default_services),
path("<int:pk>/refreshedservices/", views.get_refreshed_services),
path("serviceaction/", views.service_action),
path("<int:pk>/<svcname>/servicedetail/", views.service_detail),
path("editservice/", views.edit_service),

View File

@@ -19,17 +19,6 @@ logger.configure(**settings.LOG_CONFIG)
@api_view()
def get_services(request, pk):
agent = get_object_or_404(Agent, pk=pk)
return Response(ServicesSerializer(agent).data)
@api_view()
def default_services(request):
return Response(Check.load_default_services())
@api_view()
def get_refreshed_services(request, pk):
agent = get_object_or_404(Agent, pk=pk)
if not agent.has_nats:
return notify_error("Requires agent version 1.1.0 or greater")
@@ -43,6 +32,11 @@ def get_refreshed_services(request, pk):
return Response(ServicesSerializer(agent).data)
@api_view()
def default_services(request):
return Response(Check.load_default_services())
@api_view(["POST"])
def service_action(request):
agent = get_object_or_404(Agent, pk=request.data["pk"])

View File

@@ -1,5 +1,4 @@
import asyncio
import string
from time import sleep
from loguru import logger
from tacticalrmm.celery import app
@@ -8,6 +7,7 @@ from django.utils import timezone as djangotime
from agents.models import Agent
from .models import ChocoSoftware, ChocoLog, InstalledSoftware
from tacticalrmm.utils import filter_software
logger.configure(**settings.LOG_CONFIG)
@@ -87,44 +87,6 @@ def update_chocos():
return "ok"
@app.task
def get_installed_software(pk):
agent = Agent.objects.get(pk=pk)
if not agent.has_nats:
logger.error(f"{agent.salt_id} software list only available in agent >= 1.1.0")
return
r = asyncio.run(agent.nats_cmd({"func": "softwarelist"}, timeout=20))
if r == "timeout" or r == "natsdown":
logger.error(f"{agent.salt_id} {r}")
return
printable = set(string.printable)
sw = []
for s in r:
sw.append(
{
"name": "".join(filter(lambda x: x in printable, s["name"])),
"version": "".join(filter(lambda x: x in printable, s["version"])),
"publisher": "".join(filter(lambda x: x in printable, s["publisher"])),
"install_date": s["install_date"],
"size": s["size"],
"source": s["source"],
"location": s["location"],
"uninstall": s["uninstall"],
}
)
if not InstalledSoftware.objects.filter(agent=agent).exists():
InstalledSoftware(agent=agent, software=sw).save()
else:
s = agent.installedsoftware_set.first()
s.software = sw
s.save(update_fields=["software"])
return "ok"
@app.task
def install_program(pk, name, version):
agent = Agent.objects.get(pk=pk)
@@ -169,6 +131,4 @@ def install_program(pk, name, version):
agent=agent, name=name, version=version, message=output, installed=installed
).save()
get_installed_software.delay(agent.pk)
return "ok"

View File

@@ -120,61 +120,8 @@ class TestSoftwareTasks(TacticalTestCase):
salt_api_cmd.assert_any_call(timeout=200, func="chocolatey.list")
self.assertEquals(salt_api_cmd.call_count, 2)
@patch("agents.models.Agent.nats_cmd")
def test_get_installed_software(self, nats_cmd):
from .tasks import get_installed_software
agent = baker.make_recipe("agents.agent")
nats_return = [
{
"name": "Mozilla Maintenance Service",
"size": "336.9 kB",
"source": "",
"version": "73.0.1",
"location": "",
"publisher": "Mozilla",
"uninstall": '"C:\\Program Files (x86)\\Mozilla Maintenance Service\\uninstall.exe"',
"install_date": "0001-01-01 00:00:00 +0000 UTC",
},
{
"name": "OpenVPN 2.4.9-I601-Win10 ",
"size": "8.7 MB",
"source": "",
"version": "2.4.9-I601-Win10",
"location": "C:\\Program Files\\OpenVPN\\",
"publisher": "OpenVPN Technologies, Inc.",
"uninstall": "C:\\Program Files\\OpenVPN\\Uninstall.exe",
"install_date": "0001-01-01 00:00:00 +0000 UTC",
},
{
"name": "Microsoft Office Professional Plus 2019 - en-us",
"size": "0 B",
"source": "",
"version": "16.0.10368.20035",
"location": "C:\\Program Files\\Microsoft Office",
"publisher": "Microsoft Corporation",
"uninstall": '"C:\\Program Files\\Common Files\\Microsoft Shared\\ClickToRun\\OfficeClickToRun.exe" scenario=install scenariosubtype=ARP sourcetype=None productstoremove=ProPlus2019Volume.16_en-us_x-none culture=en-us version.16=16.0',
"install_date": "0001-01-01 00:00:00 +0000 UTC",
},
]
# test failed attempt
nats_cmd.return_value = "timeout"
ret = get_installed_software(agent.pk)
self.assertFalse(ret)
nats_cmd.assert_called_with({"func": "softwarelist"}, timeout=20)
nats_cmd.reset_mock()
# test successful attempt
nats_cmd.return_value = nats_return
ret = get_installed_software(agent.pk)
self.assertTrue(ret)
nats_cmd.assert_called_with({"func": "softwarelist"}, timeout=20)
@patch("agents.models.Agent.salt_api_cmd")
@patch("software.tasks.get_installed_software.delay")
def test_install_program(self, get_installed_software, salt_api_cmd):
def test_install_program(self, salt_api_cmd):
from .tasks import install_program
agent = baker.make_recipe("agents.agent")
@@ -195,6 +142,5 @@ class TestSoftwareTasks(TacticalTestCase):
salt_api_cmd.assert_called_with(
timeout=900, func="chocolatey.install", arg=["git", "version=2.3.4"]
)
get_installed_software.assert_called_with(agent.pk)
self.assertTrue(ChocoLog.objects.filter(agent=agent, name="git").exists())

View File

@@ -1,5 +1,5 @@
import asyncio
import string
from typing import Any
from django.shortcuts import get_object_or_404
@@ -10,7 +10,7 @@ from agents.models import Agent
from .models import ChocoSoftware, InstalledSoftware
from .serializers import InstalledSoftwareSerializer
from .tasks import install_program
from tacticalrmm.utils import notify_error
from tacticalrmm.utils import notify_error, filter_software
@api_view()
@@ -45,25 +45,11 @@ def refresh_installed(request, pk):
if not agent.has_nats:
return notify_error("Requires agent version 1.1.0 or greater")
r = asyncio.run(agent.nats_cmd({"func": "softwarelist"}, timeout=15))
r: Any = asyncio.run(agent.nats_cmd({"func": "softwarelist"}, timeout=15))
if r == "timeout" or r == "natsdown":
return notify_error("Unable to contact the agent")
printable = set(string.printable)
sw = []
for s in r:
sw.append(
{
"name": "".join(filter(lambda x: x in printable, s["name"])),
"version": "".join(filter(lambda x: x in printable, s["version"])),
"publisher": "".join(filter(lambda x: x in printable, s["publisher"])),
"install_date": s["install_date"],
"size": s["size"],
"source": s["source"],
"location": s["location"],
"uninstall": s["uninstall"],
}
)
sw = filter_software(r)
if not InstalledSoftware.objects.filter(agent=agent).exists():
InstalledSoftware(agent=agent, software=sw).save()

View File
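The duplicated printable-character loop that `filter_software` replaces boils down to filtering each field against `string.printable`; a minimal standalone version (the function name here is illustrative, not the actual `filter_software` signature):

```python
import string

def strip_unprintable(s):
    # same idea as the removed loop: keep only characters Python considers
    # printable, dropping stray control bytes from agent-reported names
    printable = set(string.printable)
    return "".join(filter(lambda x: x in printable, s))

assert strip_unprintable("OpenVPN\x00 2.4.9\x07") == "OpenVPN 2.4.9"
```

Centralizing this in `tacticalrmm.utils.filter_software` removes the copy-pasted version that previously lived in both the celery task and the view.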

@@ -37,18 +37,18 @@ app.conf.beat_schedule = {
"task": "agents.tasks.batch_sync_modules_task",
"schedule": crontab(minute=25, hour="*/4"),
},
"sys-info": {
"task": "agents.tasks.batch_sysinfo_task",
"schedule": crontab(minute=15, hour="*/2"),
},
"update-salt": {
"task": "agents.tasks.update_salt_minion_task",
"schedule": crontab(minute=20, hour="*/6"),
},
"agent-auto-update": {
"task": "agents.tasks.auto_self_agent_update_task",
"schedule": crontab(minute=35, hour="*"),
},
"agents-sync": {
"task": "agents.tasks.sync_sysinfo_task",
"schedule": crontab(minute=55, hour="*"),
},
"check-agentservice": {
"task": "agents.tasks.monitor_agents_task",
"schedule": crontab(minute="*/15"),
},
}

View File

@@ -16,16 +16,15 @@ def get_debug_info():
EXCLUDE_PATHS = (
"/api/v3",
"/api/v2",
"/api/v1",
"/logs/auditlogs",
"/winupdate/winupdater",
"/winupdate/results",
f"/{settings.ADMIN_URL}",
"/logout",
"/agents/installagent",
"/logs/downloadlog",
)
ENDS_WITH = "/services/"
class AuditMiddleware:
def __init__(self, get_response):
@@ -37,7 +36,9 @@ class AuditMiddleware:
return response
def process_view(self, request, view_func, view_args, view_kwargs):
if not request.path.startswith(EXCLUDE_PATHS):
if not request.path.startswith(EXCLUDE_PATHS) and not request.path.endswith(
ENDS_WITH
):
# https://stackoverflow.com/questions/26240832/django-and-middleware-which-uses-request-user-is-always-anonymous
try:
# DRF saves the class of the view function as the .cls property
@@ -79,4 +80,4 @@ class AuditMiddleware:
def process_template_response(self, request, response):
request_local.debug_info = None
request_local.username = None
return response
return response

View File
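The middleware change adds an `endswith` guard alongside the existing prefix check; `str.startswith` accepts a tuple, so one call covers every excluded prefix. A trimmed sketch (only a few of the real `EXCLUDE_PATHS` entries are shown):

```python
EXCLUDE_PATHS = ("/api/v3", "/logs/auditlogs", "/winupdate/winupdater")
ENDS_WITH = "/services/"

def should_audit(path):
    # str.startswith accepts a tuple, so one call covers every prefix;
    # the endswith check is the new guard added in this diff
    return not path.startswith(EXCLUDE_PATHS) and not path.endswith(ENDS_WITH)

assert should_audit("/agents/listagents/")
assert not should_audit("/logs/auditlogs/")
assert not should_audit("/services/12/services/")
```

The `ENDS_WITH` guard replaces the removed `"/logs/auditlogs"`-style entries for the services endpoints, whose agent pk sits in the middle of the path and so cannot be matched by a prefix alone.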

@@ -15,25 +15,25 @@ EXE_DIR = os.path.join(BASE_DIR, "tacticalrmm/private/exe")
AUTH_USER_MODEL = "accounts.User"
# latest release
TRMM_VERSION = "0.2.1"
TRMM_VERSION = "0.2.18"
# bump this version every time vue code is changed
# to alert users that they need to manually refresh their browser
APP_VER = "0.0.91"
APP_VER = "0.0.100"
# https://github.com/wh1te909/salt
LATEST_SALT_VER = "1.1.0"
# https://github.com/wh1te909/rmmagent
LATEST_AGENT_VER = "1.1.0"
LATEST_AGENT_VER = "1.1.11"
MESH_VER = "0.6.84"
MESH_VER = "0.7.29"
SALT_MASTER_VER = "3002.2"
# for the update script, bump when need to recreate venv or npm install
PIP_VER = "3"
NPM_VER = "2"
PIP_VER = "5"
NPM_VER = "4"
DL_64 = f"https://github.com/wh1te909/rmmagent/releases/download/v{LATEST_AGENT_VER}/winagent-v{LATEST_AGENT_VER}.exe"
DL_32 = f"https://github.com/wh1te909/rmmagent/releases/download/v{LATEST_AGENT_VER}/winagent-v{LATEST_AGENT_VER}-x86.exe"
@@ -58,7 +58,6 @@ INSTALLED_APPS = [
"knox",
"corsheaders",
"accounts",
-    "api",
"apiv2",
"apiv3",
"clients",
@@ -156,39 +155,6 @@ LOG_CONFIG = {
"handlers": [{"sink": os.path.join(LOG_DIR, "debug.log"), "serialize": False}]
}
-if "TRAVIS" in os.environ:
-    DATABASES = {
-        "default": {
-            "ENGINE": "django.db.backends.postgresql",
-            "NAME": "travisci",
-            "USER": "travisci",
-            "PASSWORD": "travisSuperSekret6645",
-            "HOST": "127.0.0.1",
-            "PORT": "",
-        }
-    }
-    REST_FRAMEWORK = {
-        "DATETIME_FORMAT": "%b-%d-%Y - %H:%M",
-        "DEFAULT_PERMISSION_CLASSES": ("rest_framework.permissions.IsAuthenticated",),
-        "DEFAULT_AUTHENTICATION_CLASSES": ("knox.auth.TokenAuthentication",),
-        "DEFAULT_RENDERER_CLASSES": ("rest_framework.renderers.JSONRenderer",),
-    }
-    DEBUG = True
-    SECRET_KEY = "abcdefghijklmnoptravis123456789"
-    ADMIN_URL = "abc123456/"
-    SCRIPTS_DIR = os.path.join(Path(BASE_DIR).parents[1], "scripts")
-    SALT_USERNAME = "travis"
-    SALT_PASSWORD = "travis"
-    MESH_USERNAME = "travis"
-    MESH_SITE = "https://example.com"
-    MESH_TOKEN_KEY = "bd65e957a1e70c622d32523f61508400d6cd0937001a7ac12042227eba0b9ed625233851a316d4f489f02994145f74537a331415d00047dbbf13d940f556806dffe7a8ce1de216dc49edbad0c1a7399c"
-    REDIS_HOST = "localhost"
-    SALT_HOST = "127.0.0.1"
if "AZPIPELINE" in os.environ:
DATABASES = {
"default": {
@@ -208,6 +174,8 @@ if "AZPIPELINE" in os.environ:
"DEFAULT_RENDERER_CLASSES": ("rest_framework.renderers.JSONRenderer",),
}
ALLOWED_HOSTS = ["api.example.com"]
DOCKER_BUILD = True
DEBUG = True
SECRET_KEY = "abcdefghijklmnoptravis123456789"


@@ -13,6 +13,9 @@ class TacticalTestCase(TestCase):
self.john = User(username="john")
self.john.set_password("hunter2")
self.john.save()
+        self.alice = User(username="alice")
+        self.alice.set_password("hunter2")
+        self.alice.save()
self.client_setup()
self.client.force_authenticate(user=self.john)


@@ -10,7 +10,6 @@ urlpatterns = [
path("login/", LoginView.as_view()),
path("logout/", knox_views.LogoutView.as_view()),
path("logoutall/", knox_views.LogoutAllView.as_view()),
-    path("api/v1/", include("api.urls")),
path("api/v2/", include("apiv2.urls")),
path("api/v3/", include("apiv3.urls")),
path("clients/", include("clients.urls")),


@@ -1,7 +1,10 @@
import json
import os
import string
import subprocess
import tldextract
import time
from typing import List, Dict
from loguru import logger
from django.conf import settings
from rest_framework import status
@@ -9,26 +12,102 @@ from rest_framework.response import Response
from agents.models import Agent
logger.configure(**settings.LOG_CONFIG)
notify_error = lambda msg: Response(msg, status=status.HTTP_400_BAD_REQUEST)
SoftwareList = List[Dict[str, str]]
WEEK_DAYS = {
"Sunday": 0x1,
"Monday": 0x2,
"Tuesday": 0x4,
"Wednesday": 0x8,
"Thursday": 0x10,
"Friday": 0x20,
"Saturday": 0x40,
}
def get_bit_days(days: List[str]) -> int:
bit_days = 0
for day in days:
bit_days |= WEEK_DAYS.get(day)
return bit_days
def bitdays_to_string(day: int) -> str:
ret = []
if day == 127:
return "Every day"
if day & WEEK_DAYS["Sunday"]:
ret.append("Sunday")
if day & WEEK_DAYS["Monday"]:
ret.append("Monday")
if day & WEEK_DAYS["Tuesday"]:
ret.append("Tuesday")
if day & WEEK_DAYS["Wednesday"]:
ret.append("Wednesday")
if day & WEEK_DAYS["Thursday"]:
ret.append("Thursday")
if day & WEEK_DAYS["Friday"]:
ret.append("Friday")
if day & WEEK_DAYS["Saturday"]:
ret.append("Saturday")
return ", ".join(ret)
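The helpers above pack weekdays into a Task Scheduler-style bitmask. A quick standalone round-trip check, re-stating `WEEK_DAYS` and `get_bit_days` outside Django so they can run in isolation:

```python
# Standalone restatement of the WEEK_DAYS bitmask and get_bit_days from the
# hunk above, so the values can be checked in isolation.
WEEK_DAYS = {
    "Sunday": 0x1,
    "Monday": 0x2,
    "Tuesday": 0x4,
    "Wednesday": 0x8,
    "Thursday": 0x10,
    "Friday": 0x20,
    "Saturday": 0x40,
}

def get_bit_days(days):
    bit_days = 0
    for day in days:
        bit_days |= WEEK_DAYS[day]
    return bit_days

print(get_bit_days(["Monday", "Wednesday", "Friday"]))  # 2 | 8 | 32 = 42
print(get_bit_days(list(WEEK_DAYS)))  # 127, rendered as "Every day" above
```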
def filter_software(sw: SoftwareList) -> SoftwareList:
ret: SoftwareList = []
printable = set(string.printable)
for s in sw:
ret.append(
{
"name": "".join(filter(lambda x: x in printable, s["name"])),
"version": "".join(filter(lambda x: x in printable, s["version"])),
"publisher": "".join(filter(lambda x: x in printable, s["publisher"])),
"install_date": s["install_date"],
"size": s["size"],
"source": s["source"],
"location": s["location"],
"uninstall": s["uninstall"],
}
)
return ret
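`filter_software` above strips non-printable characters from each field via `string.printable`; the core of that filter pulled out as a standalone sketch (`clean` is a made-up name for the inline `filter` call):

```python
import string

# Core of filter_software's sanitization: keep only characters that appear
# in string.printable.
printable = set(string.printable)

def clean(value: str) -> str:
    return "".join(filter(lambda ch: ch in printable, value))

print(clean("7-Zip\x00 19.00"))  # '7-Zip 19.00' (NUL byte stripped)
```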
def reload_nats():
users = [{"user": "tacticalrmm", "password": settings.SECRET_KEY}]
agents = Agent.objects.prefetch_related("user").only("pk", "agent_id")
for agent in agents:
-        users.append({"user": agent.agent_id, "password": agent.user.auth_token.key})
+        try:
+            users.append(
+                {"user": agent.agent_id, "password": agent.user.auth_token.key}
+            )
+        except:
+            logger.critical(
+                f"{agent.hostname} does not have a user account, NATS will not work"
+            )
if not settings.DOCKER_BUILD:
-        tld = tldextract.extract(settings.ALLOWED_HOSTS[0])
-        domain = tld.domain + "." + tld.suffix
-        cert_path = f"/etc/letsencrypt/live/{domain}"
+        domain = settings.ALLOWED_HOSTS[0].split(".", 1)[1]
+        if hasattr(settings, "CERT_FILE") and hasattr(settings, "KEY_FILE"):
+            if os.path.exists(settings.CERT_FILE) and os.path.exists(settings.KEY_FILE):
+                cert_file = settings.CERT_FILE
+                key_file = settings.KEY_FILE
+            else:
+                cert_file = f"/etc/letsencrypt/live/{domain}/fullchain.pem"
+                key_file = f"/etc/letsencrypt/live/{domain}/privkey.pem"
+        else:
-        cert_path = "/opt/tactical/certs"
+            cert_file = f"/etc/letsencrypt/live/{domain}/fullchain.pem"
+            key_file = f"/etc/letsencrypt/live/{domain}/privkey.pem"
config = {
"tls": {
-            "cert_file": f"{cert_path}/fullchain.pem",
-            "key_file": f"{cert_path}/privkey.pem",
+            "cert_file": cert_file,
+            "key_file": key_file,
},
"authorization": {"users": users},
"max_payload": 2048576005,
@@ -39,6 +118,7 @@ def reload_nats():
json.dump(config, f)
if not settings.DOCKER_BUILD:
time.sleep(0.5)
subprocess.run(
["/usr/local/bin/nats-server", "-signal", "reload"], capture_output=True
)
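The `reload_nats` change above replaces `tldextract` with a plain split: dropping the first label of the API hostname yields the certificate domain. Note this assumes the cert was issued for the parent domain. A quick illustration with a stand-in hostname:

```python
# Stand-in for settings.ALLOWED_HOSTS[0]; the split keeps everything after
# the first dot, i.e. the parent domain the Let's Encrypt cert covers.
allowed_host = "api.example.com"
domain = allowed_host.split(".", 1)[1]
print(domain)  # example.com
```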


@@ -107,7 +107,7 @@ def check_agent_update_schedule_task():
def check_for_updates_task(pk, wait=False, auto_approve=False):
if wait:
-        sleep(70)
+        sleep(120)
agent = Agent.objects.get(pk=pk)
ret = agent.salt_api_cmd(


@@ -175,7 +175,7 @@ class WinupdateTasks(TacticalTestCase):
agent_salt_cmd.assert_called_with(func="win_agent.install_updates")
self.assertEquals(agent_salt_cmd.call_count, 2)
-    @patch("agents.models.Agent.salt_api_async")
+    """ @patch("agents.models.Agent.salt_api_async")
def test_check_agent_update_monthly_schedule(self, agent_salt_cmd):
from .tasks import check_agent_update_schedule_task
@@ -204,7 +204,7 @@ class WinupdateTasks(TacticalTestCase):
check_agent_update_schedule_task()
agent_salt_cmd.assert_called_with(func="win_agent.install_updates")
-        self.assertEquals(agent_salt_cmd.call_count, 2)
+        self.assertEquals(agent_salt_cmd.call_count, 2) """
@patch("agents.models.Agent.salt_api_cmd")
def test_check_for_updates(self, salt_api_cmd):


@@ -1,12 +1,9 @@
from django.urls import path
from . import views
+from apiv3 import views as v3_views
urlpatterns = [
path("<int:pk>/getwinupdates/", views.get_win_updates),
path("<int:pk>/runupdatescan/", views.run_update_scan),
path("editpolicy/", views.edit_policy),
-    path("winupdater/", views.win_updater),
+    path("results/", v3_views.WinUpdater.as_view()),
path("<int:pk>/installnow/", views.install_updates),
]


@@ -58,20 +58,3 @@ def edit_policy(request):
patch.action = request.data["policy"]
patch.save(update_fields=["action"])
return Response("ok")
-@api_view()
-@authentication_classes((TokenAuthentication,))
-@permission_classes((IsAuthenticated,))
-def win_updater(request):
-    agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
-    agent.delete_superseded_updates()
-    patches = (
-        WinUpdate.objects.filter(agent=agent)
-        .exclude(installed=True)
-        .filter(action="approve")
-    )
-    if patches:
-        return Response(ApprovedUpdateSerializer(patches, many=True).data)
-    return Response("nopatches")


@@ -27,24 +27,36 @@ jobs:
source env/bin/activate
cd /myagent/_work/1/s/api/tacticalrmm
pip install --no-cache-dir --upgrade pip
-    pip install --no-cache-dir setuptools==49.6.0 wheel==0.35.1
-    pip install --no-cache-dir -r requirements.txt -r requirements-test.txt
+    pip install --no-cache-dir setuptools==50.3.2 wheel==0.36.1
+    pip install --no-cache-dir -r requirements.txt -r requirements-test.txt -r requirements-dev.txt
displayName: "Install Python Dependencies"
- script: |
cd /myagent/_work/1/s/api
source env/bin/activate
cd /myagent/_work/1/s/api/tacticalrmm
-    python manage.py test -v 2
+    coverage run manage.py test -v 2
+    if [ $? -ne 0 ]; then
+      exit 1
+    fi
displayName: "Run django tests"
- script: |
-    rm -rf /myagent/_work/1/s/web/node_modules
-    cd /myagent/_work/1/s/web
-    npm install
-  displayName: "Install Frontend"
+    cd /myagent/_work/1/s/api
+    source env/bin/activate
+    black --check tacticalrmm
+    if [ $? -ne 0 ]; then
+      exit 1
+    fi
+  displayName: "Codestyle black"
- script: |
-    cd /myagent/_work/1/s/web
-    npm run test:unit
-  displayName: "Run Vue Tests"
+    cd /myagent/_work/1/s/api
+    source env/bin/activate
+    cd /myagent/_work/1/s/api/tacticalrmm
+    export CIRCLE_BRANCH=$BUILD_SOURCEBRANCH
+    coveralls
+  displayName: "coveralls"
+  env:
+    CIRCLECI: 1
+    CIRCLE_BUILD_NUM: $(Build.BuildNumber)
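The pipeline steps above guard each command with an explicit `$? -ne 0` check so a failing run fails the stage. The same pattern sketched in Python, using a child process whose exit code is inspected (the inline `-c` snippet is just a stand-in for the real test command):

```python
import subprocess
import sys

# Run a child command and inspect its exit status, mirroring the
# `if [ $? -ne 0 ]; then exit 1; fi` guard in the pipeline script.
# The child deliberately exits with status 3 as a stand-in for a failed run.
result = subprocess.run([sys.executable, "-c", "raise SystemExit(3)"])
print(result.returncode)  # 3
if result.returncode != 0:
    # the pipeline step would `exit 1` here to fail the stage
    pass
```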
