Compare commits

...

108 Commits

Author SHA1 Message Date
wh1te909
6a55ca20f3 Release 0.4.20 2021-03-02 23:42:38 +00:00
wh1te909
c56c537f7f HOTFIX 0.4.20 temporarily disable some sorting 2021-03-02 23:42:00 +00:00
wh1te909
fd7d776121 Release 0.4.19 2021-03-02 22:18:18 +00:00
wh1te909
1af28190d8 bump versions 2021-03-02 22:11:40 +00:00
wh1te909
6b305be567 add dash 2021-03-02 22:08:15 +00:00
wh1te909
3bf70513b7 isort 2021-03-02 09:18:35 +00:00
wh1te909
7e64404654 add type hints 2021-03-02 09:13:24 +00:00
wh1te909
e1b5226f34 fix alert 2021-03-02 08:46:41 +00:00
wh1te909
0d7128ad31 Revert "bump versions"
This reverts commit 5778626087.
2021-03-02 08:41:17 +00:00
wh1te909
5778626087 bump versions 2021-03-02 08:07:39 +00:00
wh1te909
3ff48756ed continue on defender errors 2021-03-02 07:38:14 +00:00
sadnub
0ce9a6eeba black 2021-03-01 22:14:48 -05:00
sadnub
ad527b4aed alerts rework and tests 2021-03-01 22:10:38 -05:00
sadnub
6633bb452e remove jest and add cypress for frontend testing 2021-03-01 22:10:38 -05:00
wh1te909
efeb0b4feb add tests 2021-03-02 00:45:37 +00:00
wh1te909
8cc11fc102 fix pendingactions ui 2021-03-02 00:39:42 +00:00
Tragic Bronson
ee6a167220 Merge pull request #302 from silversword411/patch-2
tweak for workflow
2021-03-01 16:16:38 -08:00
silversword411
8d4ad3c405 tweak for workflow 2021-03-01 19:11:01 -05:00
Tragic Bronson
072fbf4d60 Merge pull request #299 from silversword411/patch-3
Linking to FAQ
2021-03-01 15:24:55 -08:00
silversword411
727c41c283 Update install_server.md 2021-03-01 18:15:12 -05:00
silversword411
e2266838b6 Linking to FAQ
minor update and link to FAQ
2021-03-01 17:59:53 -05:00
Tragic Bronson
775762d615 Merge pull request #298 from silversword411/patch-2
Fixing bash commands
2021-03-01 14:56:34 -08:00
silversword411
900c3008cb Fixing bash commands
Removing ID/server so paste will work
2021-03-01 17:44:35 -05:00
sadnub
09379213a6 fix formatting 2021-03-01 17:37:24 -05:00
sadnub
ceb97048e3 Update mkdocs.yml 2021-03-01 17:34:27 -05:00
sadnub
4561515517 Create update_docker.md 2021-03-01 17:33:41 -05:00
wh1te909
a7b285759f delete chocolog model 2021-03-01 21:43:54 +00:00
wh1te909
b4531b2a12 ui tweaks 2021-03-01 21:37:59 +00:00
wh1te909
9e1d261c76 update faq 2021-03-01 21:09:12 +00:00
Tragic Bronson
e35fa15cd2 Merge pull request #297 from silversword411/patch-1
Docs addition - Recover login for Mesh Central
2021-03-01 13:01:11 -08:00
wh1te909
dbd1f0d4f9 pending actions refactor 2021-03-01 20:40:46 +00:00
wh1te909
9ade78b703 fix restore docs 2021-03-01 19:45:10 +00:00
silversword411
f20e244b5f Recover login for Mesh Central 2021-03-01 12:50:56 -05:00
wh1te909
0989308b7e fix tests 2021-03-01 09:35:26 +00:00
wh1te909
12c7140536 more choco rework 2021-03-01 09:26:37 +00:00
wh1te909
2a0b605e92 return empty val for missing software install date 2021-03-01 08:21:56 +00:00
wh1te909
6978890e6a add contributing docs 2021-03-01 07:51:31 +00:00
Tragic Bronson
561abd6cb9 Merge pull request #296 from beejayzed/develop
Add community script to verify antivirus status
2021-02-28 23:32:55 -08:00
beejayzed
4dd6227f0b Update community_scripts.json 2021-03-01 07:55:31 +07:00
beejayzed
1ec314c31c Rename VerifyAntivirus to VerifyAntivirus.ps1 2021-03-01 07:52:43 +07:00
beejayzed
a2be5a00be Create VerifyAntivirus 2021-03-01 07:50:56 +07:00
wh1te909
4e2241c115 start chocolatey rework 2021-02-28 11:00:45 +00:00
wh1te909
8459bca64a fix nats ping dict 2021-02-28 09:54:53 +00:00
wh1te909
24cb0565b9 add pagination to agent table 2021-02-28 09:18:04 +00:00
wh1te909
9442acb028 fix pipeline typo 2021-02-27 23:41:08 +00:00
wh1te909
4f7f181a42 fix pipeline 2021-02-27 23:24:06 +00:00
wh1te909
b7dd8737a7 make django admin disabled by default 2021-02-27 23:19:35 +00:00
wh1te909
2207eeb727 add missing import 2021-02-27 23:09:01 +00:00
wh1te909
89dad7dfe7 add sponsors info to docs 2021-02-27 22:37:11 +00:00
wh1te909
e5803d0cf3 bump mesh 2021-02-27 07:45:56 +00:00
wh1te909
c1fffe9ae6 add timeout to net 2021-02-27 06:08:42 +00:00
wh1te909
9e6cbd3d32 set uwsgi procs based on cpu count 2021-02-27 05:28:42 +00:00
wh1te909
2ea8742510 natsapi refactor 2021-02-27 00:23:03 +00:00
wh1te909
5cfa0254f9 isort 2021-02-26 23:25:44 +00:00
wh1te909
8cd2544f78 add new management command 2021-02-26 22:05:42 +00:00
wh1te909
c03b768364 fix typos 2021-02-26 09:01:14 +00:00
wh1te909
d60481ead4 add docs for management commands 2021-02-25 20:55:56 +00:00
Tragic Bronson
126be3827d Merge pull request #292 from bradhawkins85/patch-6
Update installer.ps1
2021-02-25 10:06:04 -08:00
bradhawkins85
121274dca2 Update installer.ps1
Don't try and add Windows Defender exceptions if Defender is not enabled, prevents errors during script execution.
2021-02-25 19:59:29 +10:00
wh1te909
0ecf8da27e add management commands for resetting pw/2fa 2021-02-25 07:56:17 +00:00
wh1te909
4a6bcb525d update docs 2021-02-25 07:55:13 +00:00
wh1te909
83f9ee50dd add management commands for resetting pw/2fa 2021-02-25 07:55:03 +00:00
wh1te909
2bff297f79 Release 0.4.18 2021-02-24 20:52:49 +00:00
wh1te909
dee68f6933 bump versions 2021-02-24 20:51:47 +00:00
wh1te909
afa1e19c83 also grep postgres info during restore #285 2021-02-24 20:39:02 +00:00
wh1te909
6052088eb4 grab postgres creds automatically for backup closes #285 2021-02-24 19:23:47 +00:00
wh1te909
c7fa5167c4 also reinstall py env / node modules during forced update 2021-02-24 11:25:42 +00:00
wh1te909
1034b0b146 also reinstall py env / node modules during forced update 2021-02-24 11:24:47 +00:00
wh1te909
8bcc4e5945 fix docs styling 2021-02-24 10:04:45 +00:00
wh1te909
c3c24aa1db black 2021-02-24 09:46:38 +00:00
wh1te909
281c75d2d2 add find_software management command 2021-02-24 09:42:24 +00:00
wh1te909
52307420f3 more docs 2021-02-24 09:36:59 +00:00
wh1te909
6185347cd8 remove border 2021-02-24 09:34:30 +00:00
wh1te909
b6cd29f77e change wording 2021-02-24 09:26:36 +00:00
wh1te909
b8ea8b1567 typo 2021-02-24 08:38:44 +00:00
wh1te909
2f7dc98830 change save query 2021-02-24 07:37:48 +00:00
wh1te909
e248a99f79 add option to run sched task asap after scheduled start was missed #247 2021-02-24 06:14:28 +00:00
wh1te909
4fb6d9aa5d more docs 2021-02-24 05:32:16 +00:00
sadnub
f092ea8d67 black 2021-02-23 23:58:28 -05:00
sadnub
c32cbbdda6 check run tests and agent alert actions tests 2021-02-23 23:53:55 -05:00
sadnub
2497675259 UI changes for AddAutomated Task and ScriptCheck models 2021-02-23 23:53:55 -05:00
sadnub
8d084ab90a docker dev changes 2021-02-23 23:53:55 -05:00
wh1te909
2398773ef0 moar docs 2021-02-24 03:33:39 +00:00
wh1te909
a05998a30e docs 2021-02-24 00:12:55 +00:00
wh1te909
f863c29194 more docs 2021-02-23 22:19:58 +00:00
wh1te909
d16a98c788 Release 0.4.17 2021-02-23 19:26:54 +00:00
wh1te909
9421b02e96 bump versions 2021-02-23 19:26:17 +00:00
wh1te909
10256864e4 improve typing support 2021-02-23 09:50:57 +00:00
wh1te909
85d010615d black 2021-02-23 08:27:22 +00:00
wh1te909
cd1cb186be deploy docs with gh actions 2021-02-23 08:24:19 +00:00
wh1te909
4458354d70 more docs 2021-02-23 08:14:25 +00:00
wh1te909
0f27da8808 add management command to show outdated agents 2021-02-22 20:31:57 +00:00
wh1te909
dd76bfa3c2 fix python build from source 2021-02-22 10:06:47 +00:00
wh1te909
5780a66f7d fix python build from source 2021-02-22 10:05:46 +00:00
wh1te909
d4342c034c add test for run_script 2021-02-22 09:46:48 +00:00
wh1te909
1ec43f2530 refactor to remove duplicate code 2021-02-22 08:46:59 +00:00
wh1te909
3c300d8fdf remove print 2021-02-22 08:45:57 +00:00
wh1te909
23119b55d1 isort 2021-02-22 08:43:21 +00:00
wh1te909
c8fb0e8f8a remove unneeded imports that are now builtin in python 3.9 2021-02-22 08:05:30 +00:00
sadnub
0ec32a77ef make check results chart more responsive with large amounts of data 2021-02-21 19:00:43 -05:00
sadnub
52921bfce8 black 2021-02-21 18:56:14 -05:00
sadnub
960b929097 move annotation labels to the left for check history chart 2021-02-21 18:51:45 -05:00
sadnub
d4ce23eced adding tests to agent alert actions and a bunch of fixes 2021-02-21 18:45:34 -05:00
wh1te909
6925510f44 no cgo 2021-02-21 10:18:05 +00:00
wh1te909
9827ad4c22 add isort to dev reqs 2021-02-21 10:17:47 +00:00
wh1te909
ef8aaee028 Release 0.4.16 2021-02-21 09:58:41 +00:00
wh1te909
3d7d39f248 bump version 2021-02-21 09:58:28 +00:00
wh1te909
3eac620560 add go mod to fix docker agent exe 2021-02-21 09:56:16 +00:00
150 changed files with 5903 additions and 51482 deletions

View File

@@ -100,6 +100,7 @@ MESH_USERNAME = '${MESH_USER}'
MESH_SITE = 'https://${MESH_HOST}'
MESH_TOKEN_KEY = '${MESH_TOKEN}'
REDIS_HOST = '${REDIS_HOST}'
ADMIN_ENABLED = True
EOF
)"
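This hunk only flips the flag for the docker dev environment; on a standard install the same toggle would presumably go in local_settings.py (the path referenced in the backup notes further down). A hedged sketch, not taken from this diff:
```
# hedged sketch: re-enable the now-disabled-by-default Django admin by adding the
# same flag to local_settings.py; path assumed from the backup instructions below
echo "ADMIN_ENABLED = True" >> /rmm/api/tacticalrmm/tacticalrmm/local_settings.py
```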
@@ -126,7 +127,7 @@ if [ "$1" = 'tactical-init-dev' ]; then
test -f "${TACTICAL_READY_FILE}" && rm "${TACTICAL_READY_FILE}"
# setup Python virtual env and install dependencies
! test -e "${VIRTUAL_ENV}" && python -m venv --copies ${VIRTUAL_ENV}
! test -e "${VIRTUAL_ENV}" && python -m venv ${VIRTUAL_ENV}
"${VIRTUAL_ENV}"/bin/pip install --no-cache-dir -r /requirements.txt
django_setup

View File

@@ -1,40 +1,24 @@
# To ensure app dependencies are ported from your virtual environment/host machine into your container, run 'pip freeze > requirements.txt' in the terminal to overwrite this file
amqp==5.0.5
asgiref==3.3.1
asyncio-nats-client==0.11.4
billiard==3.6.3.0
celery==5.0.5
certifi==2020.12.5
cffi==1.14.5
chardet==4.0.0
cryptography==3.4.6
decorator==4.4.2
Django==3.1.7
django-cors-headers==3.7.0
django-rest-knox==4.1.0
djangorestframework==3.12.2
future==0.18.2
kombu==5.0.2
loguru==0.5.3
msgpack==1.0.2
packaging==20.8
psycopg2-binary==2.8.6
pycparser==2.20
pycryptodome==3.10.1
pyotp==2.6.0
pyparsing==2.4.7
pytz==2021.1
qrcode==6.1
redis==3.5.3
requests==2.25.1
six==1.15.0
sqlparse==0.4.1
twilio==6.52.0
urllib3==1.26.3
validators==0.18.2
vine==5.0.0
websockets==8.1
zipp==3.4.0
asyncio-nats-client
celery
Django
django-cors-headers
django-rest-knox
djangorestframework
loguru
msgpack
psycopg2-binary
pycparser
pycryptodome
pyotp
pyparsing
pytz
qrcode
redis
twilio
packaging
validators
websockets
black
Werkzeug
django-extensions
@@ -44,3 +28,5 @@ model_bakery
mkdocs
mkdocs-material
pymdown-extensions
Pygments
mypy

.github/FUNDING.yml vendored
View File

@@ -3,7 +3,7 @@
github: wh1te909
patreon: # Replace with a single Patreon username
open_collective: # Replace with a single Open Collective username
ko_fi: # Replace with a single Ko-fi username
ko_fi: tacticalrmm
tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
liberapay: # Replace with a single Liberapay username

.github/workflows/deploy-docs.yml vendored Normal file
View File

@@ -0,0 +1,22 @@
name: Deploy Docs
on:
  push:
    branches:
      - develop

defaults:
  run:
    working-directory: docs

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: 3.x
      - run: pip install --upgrade pip
      - run: pip install --upgrade setuptools wheel
      - run: pip install mkdocs mkdocs-material pymdown-extensions
      - run: mkdocs gh-deploy --force
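The workflow installs only three packages before deploying, so the same toolchain can be used for a local preview; a minimal sketch (mkdocs serve is the standard live-reload command, not part of this diff):
```
pip install mkdocs mkdocs-material pymdown-extensions
cd docs && mkdocs serve  # live preview at http://127.0.0.1:8000 by default
```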

.gitignore vendored
View File

@@ -45,3 +45,5 @@ htmlcov/
docker-compose.dev.yml
docs/.vuepress/dist
nats-rmm.conf
.mypy_cache
docs/site/

View File

@@ -3,7 +3,14 @@
"python.languageServer": "Pylance",
"python.analysis.extraPaths": [
"api/tacticalrmm",
"api/env",
],
"python.analysis.diagnosticSeverityOverrides": {
"reportUnusedImport": "error",
"reportDuplicateImport": "error",
},
"python.analysis.memory.keepLibraryAst": true,
"python.linting.mypyEnabled": true,
"python.analysis.typeCheckingMode": "basic",
"python.formatting.provider": "black",
"editor.formatOnSave": true,

View File

@@ -15,6 +15,8 @@ Demo database resets every hour. Alot of features are disabled for obvious reaso
### [Discord Chat](https://discord.gg/upGTkWp)
### [Documentation](https://wh1te909.github.io/tacticalrmm/)
## Features
- Teamviewer-like remote desktop control
@@ -33,98 +35,6 @@ Demo database resets every hour. Alot of features are disabled for obvious reaso
- Windows 7, 8.1, 10, Server 2008R2, 2012R2, 2016, 2019
## Installation
## Installation / Backup / Restore / Usage
### Requirements
- VPS with 2GB ram (an install script is provided for Ubuntu Server 20.04 / Debian 10)
- A domain you own with at least 3 subdomains
- Google Authenticator app (2 factor is NOT optional)
### Docker
Refer to the [docker setup](docker/readme.md)
### Installation example (Ubuntu server 20.04 LTS)
Fresh VPS with latest updates\
login as root and create a user and add to sudoers group (we will be creating a user called tactical)
```
apt update && apt -y upgrade
adduser tactical
usermod -a -G sudo tactical
```
switch to the tactical user and setup the firewall
```
su - tactical
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow ssh
sudo ufw allow http
sudo ufw allow https
sudo ufw allow proto tcp from any to any port 4222
sudo ufw enable && sudo ufw reload
```
Our domain for this example is tacticalrmm.com
In the DNS manager of wherever our domain is hosted, we will create three A records, all pointing to the public IP address of our VPS
Create A record ```api.tacticalrmm.com``` for the django rest backend\
Create A record ```rmm.tacticalrmm.com``` for the vue frontend\
Create A record ```mesh.tacticalrmm.com``` for meshcentral
Download the install script and run it
```
wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/install.sh
chmod +x install.sh
./install.sh
```
Links will be provided at the end of the install script.\
Download the executable from the first link, then open ```rmm.tacticalrmm.com``` and login.\
Upload the executable when prompted during the initial setup page.
### Install an agent
From the app's dashboard, choose Agents > Install Agent to generate an installer.
## Updating
Download and run [update.sh](https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/update.sh)
```
wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/update.sh
chmod +x update.sh
./update.sh
```
## Backup
Download [backup.sh](https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/backup.sh)
```
wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/backup.sh
```
Change the postgres username and password at the top of the file (you can find them in `/rmm/api/tacticalrmm/tacticalrmm/local_settings.py` under the DATABASES section)
Run it
```
chmod +x backup.sh
./backup.sh
```
## Restore
Change your 3 A records to point to new server's public IP
Create same linux user account as old server and add to sudoers group and setup firewall (see install instructions above)
Copy backup file to new server
Download the restore script, and edit the postgres username/password at the top of the file. Same instructions as above in the backup steps.
```
wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/restore.sh
```
Run the restore script, passing it the backup tar file as the first argument
```
chmod +x restore.sh
./restore.sh rmm-backup-xxxxxxx.tar
```
### Refer to the [documentation](https://wh1te909.github.io/tacticalrmm/)

View File

@@ -7,7 +7,7 @@ from accounts.models import User
class Command(BaseCommand):
help = "Generates barcode for Google Authenticator and creates totp for user"
help = "Generates barcode for Authenticator and creates totp for user"
def add_arguments(self, parser):
parser.add_argument("code", type=str)
@@ -26,12 +26,10 @@ class Command(BaseCommand):
url = pyotp.totp.TOTP(code).provisioning_uri(username, issuer_name=domain)
subprocess.run(f'qr "{url}"', shell=True)
self.stdout.write(
self.style.SUCCESS(
"Scan the barcode above with your google authenticator app"
)
self.style.SUCCESS("Scan the barcode above with your authenticator app")
)
self.stdout.write(
self.style.SUCCESS(
f"If that doesn't work you may manually enter the key: {code}"
f"If that doesn't work you may manually enter the setup key: {code}"
)
)

View File

@@ -0,0 +1,57 @@
import os
import subprocess

import pyotp
from django.core.management.base import BaseCommand

from accounts.models import User


class Command(BaseCommand):
    help = "Reset 2fa"

    def add_arguments(self, parser):
        parser.add_argument("username", type=str)

    def handle(self, *args, **kwargs):
        username = kwargs["username"]
        try:
            user = User.objects.get(username=username)
        except User.DoesNotExist:
            self.stdout.write(self.style.ERROR(f"User {username} doesn't exist"))
            return

        domain = "Tactical RMM"
        nginx = "/etc/nginx/sites-available/frontend.conf"
        found = None
        if os.path.exists(nginx):
            try:
                with open(nginx, "r") as f:
                    for line in f:
                        if "server_name" in line:
                            found = line
                            break

                if found:
                    rep = found.replace("server_name", "").replace(";", "")
                    domain = "".join(rep.split())
            except:
                pass

        code = pyotp.random_base32()
        user.totp_key = code
        user.save(update_fields=["totp_key"])

        url = pyotp.totp.TOTP(code).provisioning_uri(username, issuer_name=domain)
        subprocess.run(f'qr "{url}"', shell=True)

        self.stdout.write(
            self.style.WARNING("Scan the barcode above with your authenticator app")
        )
        self.stdout.write(
            self.style.WARNING(
                f"If that doesn't work you may manually enter the setup key: {code}"
            )
        )
        self.stdout.write(
            self.style.SUCCESS(f"2fa was successfully reset for user {username}")
        )

View File

@@ -0,0 +1,22 @@
from django.core.management.base import BaseCommand

from accounts.models import User


class Command(BaseCommand):
    help = "Reset password for user"

    def add_arguments(self, parser):
        parser.add_argument("username", type=str)

    def handle(self, *args, **kwargs):
        username = kwargs["username"]
        try:
            user = User.objects.get(username=username)
        except User.DoesNotExist:
            self.stdout.write(self.style.ERROR(f"User {username} doesn't exist"))
            return

        passwd = input("Enter new password: ")
        user.set_password(passwd)
        user.save()
        self.stdout.write(self.style.SUCCESS(f"Password for {username} was reset!"))
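Neither of these two new command files has its filename shown in the hunks; assuming they are registered as reset_2fa.py and reset_password.py (matching their help text), usage would look roughly like:
```
# assumed command names; each takes a single username argument
python manage.py reset_2fa jsmith
python manage.py reset_password jsmith
```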

View File

@@ -0,0 +1,18 @@
# Generated by Django 3.1.7 on 2021-02-28 06:38

from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('accounts', '0011_user_default_agent_tbl_tab'),
    ]

    operations = [
        migrations.AddField(
            model_name='user',
            name='agents_per_page',
            field=models.PositiveIntegerField(default=50),
        ),
    ]

View File

@@ -27,6 +27,7 @@ class User(AbstractUser, BaseAuditModel):
default_agent_tbl_tab = models.CharField(
max_length=50, choices=AGENT_TBL_TAB_CHOICES, default="server"
)
agents_per_page = models.PositiveIntegerField(default=50)
agent = models.OneToOneField(
"agents.Agent",

View File

@@ -283,6 +283,7 @@ class TestUserAction(TacticalTestCase):
"userui": True,
"agent_dblclick_action": "editagent",
"default_agent_tbl_tab": "mixed",
"agents_per_page": 1000,
}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)

View File

@@ -78,7 +78,7 @@ class GetAddUsers(APIView):
def post(self, request):
# add new user
try:
user = User.objects.create_user(
user = User.objects.create_user( # type: ignore
request.data["username"],
request.data["email"],
request.data["password"],
@@ -199,4 +199,8 @@ class UserUI(APIView):
user.default_agent_tbl_tab = request.data["default_agent_tbl_tab"]
user.save(update_fields=["agent_dblclick_action", "default_agent_tbl_tab"])
if "agents_per_page" in request.data.keys():
user.agents_per_page = request.data["agents_per_page"]
user.save(update_fields=["agents_per_page"])
return Response("ok")
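A hedged sketch of the request this new branch handles (the route for the UserUI view is not shown in this diff, so the path below is assumed; auth headers omitted):
```
# assumed endpoint path for the UserUI view patched above
curl -X PATCH https://api.example.com/accounts/users/ui/ \
  -H "Content-Type: application/json" \
  -d '{"agents_per_page": 100}'
```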

View File

@@ -6,7 +6,7 @@ from itertools import cycle
from django.conf import settings
from django.utils import timezone as djangotime
from model_bakery.recipe import Recipe, foreign_key
from model_bakery.recipe import Recipe, foreign_key, seq
def generate_agent_id(hostname):
@@ -30,8 +30,7 @@ agent = Recipe(
hostname="DESKTOP-TEST123",
version="1.3.0",
monitoring_type=cycle(["workstation", "server"]),
salt_id=generate_agent_id("DESKTOP-TEST123"),
agent_id="71AHC-AA813-HH1BC-AAHH5-00013|DESKTOP-TEST123",
agent_id=seq("asdkj3h4234-1234hg3h4g34-234jjh34|DESKTOP-TEST123"),
)
server_agent = agent.extend(
@@ -44,6 +43,10 @@ workstation_agent = agent.extend(
online_agent = agent.extend(last_seen=djangotime.now())
offline_agent = agent.extend(
last_seen=djangotime.now() - djangotime.timedelta(minutes=7)
)
overdue_agent = agent.extend(
last_seen=djangotime.now() - djangotime.timedelta(minutes=35)
)

View File

@@ -0,0 +1,93 @@
from django.core.management.base import BaseCommand

from agents.models import Agent
from clients.models import Client, Site


class Command(BaseCommand):
    help = "Bulk update agent offline/overdue time"

    def add_arguments(self, parser):
        parser.add_argument("time", type=int, help="Time in minutes")
        parser.add_argument(
            "--client",
            type=str,
            help="Client Name",
        )
        parser.add_argument(
            "--site",
            type=str,
            help="Site Name",
        )
        parser.add_argument(
            "--offline",
            action="store_true",
            help="Offline",
        )
        parser.add_argument(
            "--overdue",
            action="store_true",
            help="Overdue",
        )
        parser.add_argument(
            "--all",
            action="store_true",
            help="All agents",
        )

    def handle(self, *args, **kwargs):
        time = kwargs["time"]
        client_name = kwargs["client"]
        site_name = kwargs["site"]
        all_agents = kwargs["all"]
        offline = kwargs["offline"]
        overdue = kwargs["overdue"]
        agents = None

        if offline and time < 2:
            self.stdout.write(self.style.ERROR("Minimum offline time is 2 minutes"))
            return

        if overdue and time < 3:
            self.stdout.write(self.style.ERROR("Minimum overdue time is 3 minutes"))
            return

        if client_name:
            try:
                client = Client.objects.get(name=client_name)
            except Client.DoesNotExist:
                self.stdout.write(
                    self.style.ERROR(f"Client {client_name} doesn't exist")
                )
                return
            agents = Agent.objects.filter(site__client=client)
        elif site_name:
            try:
                site = Site.objects.get(name=site_name)
            except Site.DoesNotExist:
                self.stdout.write(self.style.ERROR(f"Site {site_name} doesn't exist"))
                return
            agents = Agent.objects.filter(site=site)
        elif all_agents:
            agents = Agent.objects.all()

        if agents:
            if offline:
                agents.update(offline_time=time)
                self.stdout.write(
                    self.style.SUCCESS(
                        f"Changed offline time on {len(agents)} agents to {time} minutes"
                    )
                )

            if overdue:
                agents.update(overdue_time=time)
                self.stdout.write(
                    self.style.SUCCESS(
                        f"Changed overdue time on {len(agents)} agents to {time} minutes"
                    )
                )
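This command's filename isn't shown either; assuming something like bulk_change_time.py, a usage sketch based on the arguments defined above (time is in minutes, and one of --client/--site/--all selects the agents):
```
# assumed command name; flags and minimum values taken from the handler above
python manage.py bulk_change_time 5 --offline --client "Example Client"
python manage.py bulk_change_time 35 --overdue --all
```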

View File

@@ -0,0 +1,18 @@
from django.conf import settings
from django.core.management.base import BaseCommand

from agents.models import Agent


class Command(BaseCommand):
    help = "Shows online agents that are not on the latest version"

    def handle(self, *args, **kwargs):
        q = Agent.objects.exclude(version=settings.LATEST_AGENT_VER).only(
            "pk", "version", "last_seen", "overdue_time", "offline_time"
        )
        agents = [i for i in q if i.status == "online"]
        for agent in agents:
            self.stdout.write(
                self.style.SUCCESS(f"{agent.hostname} - v{agent.version}")
            )

View File

@@ -4,7 +4,7 @@ import re
import time
from collections import Counter
from distutils.version import LooseVersion
from typing import Any, List, Union
from typing import Any, Union
import msgpack
import validators
@@ -164,14 +164,14 @@ class Agent(BaseAuditModel):
@property
def has_patches_pending(self):
return self.winupdates.filter(action="approve").filter(installed=False).exists()
return self.winupdates.filter(action="approve").filter(installed=False).exists() # type: ignore
@property
def checks(self):
total, passing, failing = 0, 0, 0
if self.agentchecks.exists():
for i in self.agentchecks.all():
if self.agentchecks.exists(): # type: ignore
for i in self.agentchecks.all(): # type: ignore
total += 1
if i.status == "passing":
passing += 1
@@ -241,6 +241,7 @@ class Agent(BaseAuditModel):
pass
try:
comp_sys_prod = self.wmi_detail["comp_sys_prod"][0]
return [x["Version"] for x in comp_sys_prod if "Version" in x][0]
except:
pass
@@ -273,7 +274,7 @@ class Agent(BaseAuditModel):
def run_script(
self,
scriptpk: int,
args: List[str] = [],
args: list[str] = [],
timeout: int = 120,
full: bool = False,
wait: bool = False,
@@ -295,10 +296,10 @@ class Agent(BaseAuditModel):
running_agent = self
if run_on_any:
nats_ping = {"func": "ping", "timeout": 1}
nats_ping = {"func": "ping"}
# try on self first
r = asyncio.run(self.nats_cmd(nats_ping))
r = asyncio.run(self.nats_cmd(nats_ping, timeout=1))
if r == "pong":
running_agent = self
@@ -312,7 +313,7 @@ class Agent(BaseAuditModel):
]
for agent in online:
r = asyncio.run(agent.nats_cmd(nats_ping))
r = asyncio.run(agent.nats_cmd(nats_ping, timeout=1))
if r == "pong":
running_agent = agent
break
@@ -333,27 +334,27 @@ class Agent(BaseAuditModel):
updates = list()
if patch_policy.critical == "approve":
updates += self.winupdates.filter(
updates += self.winupdates.filter( # type: ignore
severity="Critical", installed=False
).exclude(action="approve")
if patch_policy.important == "approve":
updates += self.winupdates.filter(
updates += self.winupdates.filter( # type: ignore
severity="Important", installed=False
).exclude(action="approve")
if patch_policy.moderate == "approve":
updates += self.winupdates.filter(
updates += self.winupdates.filter( # type: ignore
severity="Moderate", installed=False
).exclude(action="approve")
if patch_policy.low == "approve":
updates += self.winupdates.filter(severity="Low", installed=False).exclude(
updates += self.winupdates.filter(severity="Low", installed=False).exclude( # type: ignore
action="approve"
)
if patch_policy.other == "approve":
updates += self.winupdates.filter(severity="", installed=False).exclude(
updates += self.winupdates.filter(severity="", installed=False).exclude( # type: ignore
action="approve"
)
@@ -368,7 +369,7 @@ class Agent(BaseAuditModel):
site = self.site
core_settings = CoreSettings.objects.first()
patch_policy = None
agent_policy = self.winupdatepolicy.get()
agent_policy = self.winupdatepolicy.get() # type: ignore
if self.monitoring_type == "server":
# check agent policy first which should override client or site policy
@@ -453,9 +454,9 @@ class Agent(BaseAuditModel):
return patch_policy
def get_approved_update_guids(self) -> List[str]:
def get_approved_update_guids(self) -> list[str]:
return list(
self.winupdates.filter(action="approve", installed=False).values_list(
self.winupdates.filter(action="approve", installed=False).values_list( # type: ignore
"guid", flat=True
)
)
@@ -571,7 +572,7 @@ class Agent(BaseAuditModel):
from automation.models import Policy
# Clear agent checks that have overriden_by_policy set
self.agentchecks.update(overriden_by_policy=False)
self.agentchecks.update(overriden_by_policy=False) # type: ignore
# Generate checks based on policies
Policy.generate_policy_checks(self)
@@ -606,7 +607,7 @@ class Agent(BaseAuditModel):
except Exception:
return "err"
async def nats_cmd(self, data, timeout=30, wait=True):
async def nats_cmd(self, data: dict, timeout: int = 30, wait: bool = True):
nc = NATS()
options = {
"servers": f"tls://{settings.ALLOWED_HOSTS[0]}:4222",
@@ -628,7 +629,7 @@ class Agent(BaseAuditModel):
except ErrTimeout:
ret = "timeout"
else:
ret = msgpack.loads(msg.data)
ret = msgpack.loads(msg.data) # type: ignore
await nc.close()
return ret
@@ -650,12 +651,12 @@ class Agent(BaseAuditModel):
def delete_superseded_updates(self):
try:
pks = [] # list of pks to delete
kbs = list(self.winupdates.values_list("kb", flat=True))
kbs = list(self.winupdates.values_list("kb", flat=True)) # type: ignore
d = Counter(kbs)
dupes = [k for k, v in d.items() if v > 1]
for dupe in dupes:
titles = self.winupdates.filter(kb=dupe).values_list("title", flat=True)
titles = self.winupdates.filter(kb=dupe).values_list("title", flat=True) # type: ignore
# extract the version from the title and sort from oldest to newest
# skip if no version info is available therefore nothing to parse
try:
@@ -668,17 +669,17 @@ class Agent(BaseAuditModel):
continue
# append all but the latest version to our list of pks to delete
for ver in sorted_vers[:-1]:
q = self.winupdates.filter(kb=dupe).filter(title__contains=ver)
q = self.winupdates.filter(kb=dupe).filter(title__contains=ver) # type: ignore
pks.append(q.first().pk)
pks = list(set(pks))
self.winupdates.filter(pk__in=pks).delete()
self.winupdates.filter(pk__in=pks).delete() # type: ignore
except:
pass
# define how the agent should handle pending actions
def handle_pending_actions(self):
pending_actions = self.pendingactions.filter(status="pending")
pending_actions = self.pendingactions.filter(status="pending") # type: ignore
for action in pending_actions:
if action.action_type == "taskaction":
@@ -702,171 +703,24 @@ class Agent(BaseAuditModel):
# for clearing duplicate pending actions on agent
def remove_matching_pending_task_actions(self, task_id):
# remove any other pending actions on agent with same task_id
for action in self.pendingactions.exclude(status="completed"):
for action in self.pendingactions.exclude(status="completed"): # type: ignore
if action.details["task_id"] == task_id:
action.delete()
def handle_alert(self, checkin: bool = False) -> None:
from agents.tasks import (
agent_outage_email_task,
agent_outage_sms_task,
agent_recovery_email_task,
agent_recovery_sms_task,
)
from alerts.models import Alert
# return if agent is in maintenace mode
if self.maintenance_mode:
return
alert_template = self.get_alert_template()
# called when agent is back online
if checkin:
if Alert.objects.filter(agent=self, resolved=False).exists():
# resolve alert if exists
alert = Alert.objects.get(agent=self, resolved=False)
alert.resolve()
# check if a resolved notification should be emailed
if (
not alert.resolved_email_sent
and alert_template
and alert_template.agent_email_on_resolved
or self.overdue_email_alert
):
agent_recovery_email_task.delay(pk=alert.pk)
# check if a resolved notification should be texted
if (
not alert.resolved_sms_sent
and alert_template
and alert_template.agent_text_on_resolved
or self.overdue_text_alert
):
agent_recovery_sms_task.delay(pk=alert.pk)
# check if any scripts should be run
if (
not alert.resolved_action_run
and alert_template
and alert_template.resolved_action
):
r = self.run_script(
scriptpk=alert_template.resolved_action.pk,
args=alert_template.resolved_action_args,
timeout=alert_template.resolved_action_timeout,
wait=True,
full=True,
run_on_any=True,
)
# command was successful
if type(r) == dict:
alert.resolved_action_retcode = r["retcode"]
alert.resolved_action_stdout = r["stdout"]
alert.resolved_action_stderr = r["stderr"]
alert.resolved_action_execution_time = "{:.4f}".format(
r["execution_time"]
)
alert.resolved_action_run = djangotime.now()
alert.save()
else:
logger.error(
f"Resolved action: {alert_template.resolved_action} failed to run on any agent for {self.hostname} resolved outage"
)
# called when agent is offline
else:
# check if alert hasn't been created yet so create it
if not Alert.objects.filter(agent=self, resolved=False).exists():
# check if alert should be created and if not return
if (
self.overdue_dashboard_alert
or self.overdue_email_alert
or self.overdue_text_alert
or (
alert_template
and (
alert_template.agent_always_alert
or alert_template.agent_always_email
or alert_template.agent_always_text
)
)
):
alert = Alert.create_availability_alert(self)
else:
return
# add a null check history to allow gaps in graph
for check in self.agentchecks.all():
check.add_check_history(None)
else:
alert = Alert.objects.get(agent=self, resolved=False)
# create dashboard alert if enabled
if (
def should_create_alert(self, alert_template):
return (
self.overdue_dashboard_alert
or self.overdue_email_alert
or self.overdue_text_alert
or (
alert_template
and alert_template.agent_always_alert
or self.overdue_dashboard_alert
):
alert.hidden = False
alert.save()
# send email alert if enabled
if (
not alert.email_sent
and alert_template
and alert_template.agent_always_email
or self.overdue_email_alert
):
agent_outage_email_task.delay(
pk=alert.pk,
alert_interval=alert_template.check_periodic_alert_days
if alert_template
else None,
and (
alert_template.agent_always_alert
or alert_template.agent_always_email
or alert_template.agent_always_text
)
# send text message if enabled
if (
not alert.sms_sent
and alert_template
and alert_template.agent_always_text
or self.overdue_text_alert
):
agent_outage_sms_task.delay(
pk=alert.pk,
alert_interval=alert_template.check_periodic_alert_days
if alert_template
else None,
)
# check if any scripts should be run
if not alert.action_run and alert_template and alert_template.action:
r = self.run_script(
scriptpk=alert_template.action.pk,
args=alert_template.action_args,
timeout=alert_template.action_timeout,
wait=True,
full=True,
run_on_any=True,
)
# command was successful
if isinstance(r, dict):
alert.action_retcode = r["retcode"]
alert.action_stdout = r["stdout"]
alert.action_stderr = r["stderr"]
alert.action_execution_time = "{:.4f}".format(r["execution_time"])
alert.action_run = djangotime.now()
alert.save()
else:
logger.error(
f"Failure action: {alert_template.action.name} failed to run on any agent for {self.hostname} outage"
)
)
)
def send_outage_email(self):
from core.models import CoreSettings

View File

@@ -2,7 +2,7 @@ import asyncio
import datetime as dt
import random
from time import sleep
from typing import List, Union
from typing import Union
from django.conf import settings
from django.utils import timezone as djangotime
@@ -77,7 +77,7 @@ def agent_update(pk: int) -> str:
@app.task
def send_agent_update_task(pks: List[int]) -> None:
def send_agent_update_task(pks: list[int]) -> None:
chunks = (pks[i : i + 30] for i in range(0, len(pks), 30))
for chunk in chunks:
for pk in chunk:
@@ -93,7 +93,7 @@ def auto_self_agent_update_task() -> None:
return
q = Agent.objects.only("pk", "version")
pks: List[int] = [
pks: list[int] = [
i.pk
for i in q
if pyver.parse(i.version) < pyver.parse(settings.LATEST_AGENT_VER)
@@ -183,6 +183,8 @@ def agent_recovery_sms_task(pk: int) -> str:
@app.task
def agent_outages_task() -> None:
from alerts.models import Alert
agents = Agent.objects.only(
"pk",
"last_seen",
@@ -195,7 +197,7 @@ def agent_outages_task() -> None:
for agent in agents:
if agent.status == "overdue":
agent.handle_alert()
Alert.handle_alert_failure(agent)
@app.task
@@ -217,8 +219,8 @@ def run_script_email_results_task(
agentpk: int,
scriptpk: int,
nats_timeout: int,
emails: List[str],
args: List[str] = [],
emails: list[str],
args: list[str] = [],
):
agent = Agent.objects.get(pk=agentpk)
script = Script.objects.get(pk=scriptpk)

View File

@@ -1,7 +1,6 @@
import json
import os
from itertools import cycle
from typing import List
from unittest.mock import patch
from django.conf import settings
@@ -18,6 +17,107 @@ from .serializers import AgentSerializer
from .tasks import auto_self_agent_update_task
class TestAgentsList(TacticalTestCase):
def setUp(self):
self.authenticate()
self.setup_coresettings()
def test_agents_list(self):
url = "/agents/listagents/"
# 36 total agents
company1 = baker.make("clients.Client")
company2 = baker.make("clients.Client")
site1 = baker.make("clients.Site", client=company1)
site2 = baker.make("clients.Site", client=company1)
site3 = baker.make("clients.Site", client=company2)
baker.make_recipe(
"agents.online_agent", site=site1, monitoring_type="server", _quantity=15
)
baker.make_recipe(
"agents.online_agent",
site=site2,
monitoring_type="workstation",
_quantity=10,
)
baker.make_recipe(
"agents.online_agent",
site=site3,
monitoring_type="server",
_quantity=4,
)
baker.make_recipe(
"agents.online_agent",
site=site3,
monitoring_type="workstation",
_quantity=7,
)
data = {
"pagination": {
"rowsPerPage": 50,
"rowsNumber": None,
"sortBy": "hostname",
"descending": False,
"page": 1,
},
"monType": "mixed",
}
# test mixed
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertEqual(r.data["total"], 36) # type: ignore
self.assertEqual(len(r.data["agents"]), 36) # type: ignore
# test servers
data["monType"] = "server"
data["pagination"]["rowsPerPage"] = 6
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertEqual(r.data["total"], 19) # type: ignore
self.assertEqual(len(r.data["agents"]), 6) # type: ignore
# test workstations
data["monType"] = "server"
data["pagination"]["rowsPerPage"] = 6
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertEqual(r.data["total"], 19) # type: ignore
self.assertEqual(len(r.data["agents"]), 6) # type: ignore
# test client1 mixed
data = {
"pagination": {
"rowsPerPage": 3,
"rowsNumber": None,
"sortBy": "hostname",
"descending": False,
"page": 1,
},
"monType": "mixed",
"clientPK": company1.pk, # type: ignore
}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertEqual(r.data["total"], 25) # type: ignore
self.assertEqual(len(r.data["agents"]), 3) # type: ignore
# test site3 workstations
del data["clientPK"]
data["monType"] = "workstation"
data["sitePK"] = site3.pk # type: ignore
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertEqual(r.data["total"], 7) # type: ignore
self.assertEqual(len(r.data["agents"]), 3) # type: ignore
self.check_not_authenticated("patch", url)
class TestAgentViews(TacticalTestCase):
def setUp(self):
self.authenticate()
@@ -78,12 +178,12 @@ class TestAgentViews(TacticalTestCase):
_quantity=15,
)
pks: List[int] = list(
pks: list[int] = list(
Agent.objects.only("pk", "version").values_list("pk", flat=True)
)
data = {"pks": pks}
expected: List[int] = [
expected: list[int] = [
i.pk
for i in Agent.objects.only("pk", "version")
if pyver.parse(i.version) < pyver.parse(settings.LATEST_AGENT_VER)
@@ -257,7 +357,7 @@ class TestAgentViews(TacticalTestCase):
mock_ret.return_value = "nt authority\system"
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertIsInstance(r.data, str)
self.assertIsInstance(r.data, str) # type: ignore
mock_ret.return_value = "timeout"
r = self.client.post(url, data, format="json")
@@ -277,15 +377,15 @@ class TestAgentViews(TacticalTestCase):
nats_cmd.return_value = "ok"
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertEqual(r.data["time"], "August 29, 2025 at 06:41 PM")
self.assertEqual(r.data["agent"], self.agent.hostname)
self.assertEqual(r.data["time"], "August 29, 2025 at 06:41 PM") # type: ignore
self.assertEqual(r.data["agent"], self.agent.hostname) # type: ignore
nats_data = {
"func": "schedtask",
"schedtaskpayload": {
"type": "schedreboot",
"trigger": "once",
"name": r.data["task_name"],
"name": r.data["task_name"], # type: ignore
"year": 2025,
"month": "August",
"day": 29,
@@ -306,7 +406,7 @@ class TestAgentViews(TacticalTestCase):
r = self.client.patch(url, data_invalid, format="json")
self.assertEqual(r.status_code, 400)
self.assertEqual(r.data, "Invalid date")
self.assertEqual(r.data, "Invalid date") # type: ignore
self.check_not_authenticated("patch", url)
@@ -317,8 +417,8 @@ class TestAgentViews(TacticalTestCase):
site = baker.make("clients.Site")
data = {
"client": site.client.id,
"site": site.id,
"client": site.client.id, # type: ignore
"site": site.id, # type: ignore
"arch": "64",
"expires": 23,
"installMethod": "exe",
@@ -402,14 +502,6 @@ class TestAgentViews(TacticalTestCase):
self.check_not_authenticated("post", url)
def test_agents_list(self):
url = "/agents/listagents/"
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
self.check_not_authenticated("get", url)
def test_agents_agent_detail(self):
url = f"/agents/{self.agent.pk}/agentdetail/"
@@ -426,7 +518,7 @@ class TestAgentViews(TacticalTestCase):
edit = {
"id": self.agent.pk,
"site": site.id,
"site": site.id, # type: ignore
"monitoring_type": "workstation",
"description": "asjdk234andasd",
"offline_time": 4,
@@ -457,7 +549,7 @@ class TestAgentViews(TacticalTestCase):
agent = Agent.objects.get(pk=self.agent.pk)
data = AgentSerializer(agent).data
self.assertEqual(data["site"], site.id)
self.assertEqual(data["site"], site.id) # type: ignore
policy = WinUpdatePolicy.objects.get(agent=self.agent)
data = WinUpdatePolicySerializer(policy).data
@@ -475,21 +567,21 @@ class TestAgentViews(TacticalTestCase):
# TODO
# decode the cookie
self.assertIn("&viewmode=13", r.data["file"])
self.assertIn("&viewmode=12", r.data["terminal"])
self.assertIn("&viewmode=11", r.data["control"])
self.assertIn("&viewmode=13", r.data["file"]) # type: ignore
self.assertIn("&viewmode=12", r.data["terminal"]) # type: ignore
self.assertIn("&viewmode=11", r.data["control"]) # type: ignore
self.assertIn("&gotonode=", r.data["file"])
self.assertIn("&gotonode=", r.data["terminal"])
self.assertIn("&gotonode=", r.data["control"])
self.assertIn("&gotonode=", r.data["file"]) # type: ignore
self.assertIn("&gotonode=", r.data["terminal"]) # type: ignore
self.assertIn("&gotonode=", r.data["control"]) # type: ignore
self.assertIn("?login=", r.data["file"])
self.assertIn("?login=", r.data["terminal"])
self.assertIn("?login=", r.data["control"])
self.assertIn("?login=", r.data["file"]) # type: ignore
self.assertIn("?login=", r.data["terminal"]) # type: ignore
self.assertIn("?login=", r.data["control"]) # type: ignore
self.assertEqual(self.agent.hostname, r.data["hostname"])
self.assertEqual(self.agent.client.name, r.data["client"])
self.assertEqual(self.agent.site.name, r.data["site"])
self.assertEqual(self.agent.hostname, r.data["hostname"]) # type: ignore
self.assertEqual(self.agent.client.name, r.data["client"]) # type: ignore
self.assertEqual(self.agent.site.name, r.data["site"]) # type: ignore
self.assertEqual(r.status_code, 200)
@@ -499,32 +591,6 @@ class TestAgentViews(TacticalTestCase):
self.check_not_authenticated("get", url)
def test_by_client(self):
url = f"/agents/byclient/{self.agent.client.id}/"
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
self.assertTrue(r.data)
url = f"/agents/byclient/500/"
r = self.client.get(url)
self.assertFalse(r.data) # returns empty list
self.check_not_authenticated("get", url)
def test_by_site(self):
url = f"/agents/bysite/{self.agent.site.id}/"
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
self.assertTrue(r.data)
url = f"/agents/bysite/500/"
r = self.client.get(url)
self.assertEqual(r.data, [])
self.check_not_authenticated("get", url)
def test_overdue_action(self):
url = "/agents/overdueaction/"
@@ -533,14 +599,14 @@ class TestAgentViews(TacticalTestCase):
self.assertEqual(r.status_code, 200)
agent = Agent.objects.get(pk=self.agent.pk)
self.assertTrue(agent.overdue_email_alert)
self.assertEqual(self.agent.hostname, r.data)
self.assertEqual(self.agent.hostname, r.data) # type: ignore
payload = {"pk": self.agent.pk, "overdue_text_alert": False}
r = self.client.post(url, payload, format="json")
self.assertEqual(r.status_code, 200)
agent = Agent.objects.get(pk=self.agent.pk)
self.assertFalse(agent.overdue_text_alert)
self.assertEqual(self.agent.hostname, r.data)
self.assertEqual(self.agent.hostname, r.data) # type: ignore
self.check_not_authenticated("post", url)
@@ -684,7 +750,7 @@ class TestAgentViews(TacticalTestCase):
nats_cmd.return_value = "ok"
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
self.assertIn(self.agent.hostname, r.data)
self.assertIn(self.agent.hostname, r.data) # type: ignore
nats_cmd.assert_called_with(
{"func": "recover", "payload": {"mode": "mesh"}}, timeout=45
)
@@ -699,6 +765,77 @@ class TestAgentViews(TacticalTestCase):
self.check_not_authenticated("get", url)
@patch("agents.tasks.run_script_email_results_task.delay")
@patch("agents.models.Agent.run_script")
def test_run_script(self, run_script, email_task):
run_script.return_value = "ok"
url = "/agents/runscript/"
script = baker.make_recipe("scripts.script")
# test wait
data = {
"pk": self.agent.pk,
"scriptPK": script.pk,
"output": "wait",
"args": [],
"timeout": 15,
}
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 200)
run_script.assert_called_with(
scriptpk=script.pk, args=[], timeout=18, wait=True
)
run_script.reset_mock()
# test email default
data = {
"pk": self.agent.pk,
"scriptPK": script.pk,
"output": "email",
"args": ["abc", "123"],
"timeout": 15,
"emailmode": "default",
"emails": ["admin@example.com", "bob@example.com"],
}
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 200)
email_task.assert_called_with(
agentpk=self.agent.pk,
scriptpk=script.pk,
nats_timeout=18,
emails=[],
args=["abc", "123"],
)
email_task.reset_mock()
# test email overrides
data["emailmode"] = "custom"
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 200)
email_task.assert_called_with(
agentpk=self.agent.pk,
scriptpk=script.pk,
nats_timeout=18,
emails=["admin@example.com", "bob@example.com"],
args=["abc", "123"],
)
# test fire and forget
data = {
"pk": self.agent.pk,
"scriptPK": script.pk,
"output": "forget",
"args": ["hello", "world"],
"timeout": 22,
}
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 200)
run_script.assert_called_with(
scriptpk=script.pk, args=["hello", "world"], timeout=25
)
class TestAgentViewsNew(TacticalTestCase):
def setUp(self):
@@ -730,7 +867,7 @@ class TestAgentViewsNew(TacticalTestCase):
r = self.client.post(url, format="json")
self.assertEqual(r.status_code, 200)
self.assertEqual(r.data, data)
self.assertEqual(r.data, data) # type: ignore
self.check_not_authenticated("post", url)
@@ -742,14 +879,14 @@ class TestAgentViewsNew(TacticalTestCase):
agent = baker.make_recipe("agents.agent", site=site)
# Test client toggle maintenance mode
data = {"type": "Client", "id": site.client.id, "action": True}
data = {"type": "Client", "id": site.client.id, "action": True} # type: ignore
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertTrue(Agent.objects.get(pk=agent.pk).maintenance_mode)
# Test site toggle maintenance mode
data = {"type": "Site", "id": site.id, "action": False}
data = {"type": "Site", "id": site.id, "action": False} # type: ignore
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 200)

View File

@@ -6,8 +6,6 @@ urlpatterns = [
path("listagents/", views.AgentsTableList.as_view()),
path("listagentsnodetail/", views.list_agents_no_detail),
path("<int:pk>/agenteditdetails/", views.agent_edit_details),
path("byclient/<int:clientpk>/", views.by_client),
path("bysite/<int:sitepk>/", views.by_site),
path("overdueaction/", views.overdue_action),
path("sendrawcmd/", views.send_raw_cmd),
path("<pk>/agentdetail/", views.agent_detail),

View File

@@ -3,15 +3,15 @@ import datetime as dt
import os
import random
import string
import subprocess
from typing import List
from django.conf import settings
from django.core.paginator import Paginator
from django.db.models import Q
from django.http import HttpResponse
from django.shortcuts import get_object_or_404
from loguru import logger
from packaging import version as pyver
from rest_framework import generics, status
from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response
from rest_framework.views import APIView
@@ -20,7 +20,12 @@ from core.models import CoreSettings
from logs.models import AuditLog, PendingAction
from scripts.models import Script
from scripts.tasks import handle_bulk_command_task, handle_bulk_script_task
from tacticalrmm.utils import get_default_timezone, notify_error, reload_nats
from tacticalrmm.utils import (
generate_installer_exe,
get_default_timezone,
notify_error,
reload_nats,
)
from winupdate.serializers import WinUpdatePolicySerializer
from winupdate.tasks import bulk_check_for_updates_task, bulk_install_updates_task
@@ -53,7 +58,7 @@ def get_agent_versions(request):
@api_view(["POST"])
def update_agents(request):
q = Agent.objects.filter(pk__in=request.data["pks"]).only("pk", "version")
pks: List[int] = [
pks: list[int] = [
i.pk
for i in q
if pyver.parse(i.version) < pyver.parse(settings.LATEST_AGENT_VER)
@@ -95,7 +100,7 @@ def edit_agent(request):
a_serializer.save()
if "winupdatepolicy" in request.data.keys():
policy = agent.winupdatepolicy.get()
policy = agent.winupdatepolicy.get() # type: ignore
p_serializer = WinUpdatePolicySerializer(
instance=policy, data=request.data["winupdatepolicy"][0]
)
@@ -221,37 +226,74 @@ def send_raw_cmd(request):
return Response(r)
class AgentsTableList(generics.ListAPIView):
queryset = (
Agent.objects.select_related("site")
.prefetch_related("agentchecks")
.only(
"pk",
"hostname",
"agent_id",
"site",
"monitoring_type",
"description",
"needs_reboot",
"overdue_text_alert",
"overdue_email_alert",
"overdue_time",
"offline_time",
"last_seen",
"boot_time",
"logged_in_username",
"last_logged_in_user",
"time_zone",
"maintenance_mode",
)
)
serializer_class = AgentTableSerializer
class AgentsTableList(APIView):
def patch(self, request):
pagination = request.data["pagination"]
monType = request.data["monType"]
client = Q()
site = Q()
mon_type = Q()
if pagination["sortBy"] == "agentstatus":
sort = "last_seen"
elif pagination["sortBy"] == "client_name":
sort = "site__client__name"
elif pagination["sortBy"] == "site_name":
sort = "site__name"
elif pagination["sortBy"] == "user":
sort = "logged_in_username"
else:
sort = pagination["sortBy"]
order_by = f"-{sort}" if pagination["descending"] else sort
if monType == "server":
mon_type = Q(monitoring_type="server")
elif monType == "workstation":
mon_type = Q(monitoring_type="workstation")
if "clientPK" in request.data:
client = Q(site__client_id=request.data["clientPK"])
if "sitePK" in request.data:
site = Q(site_id=request.data["sitePK"])
queryset = (
Agent.objects.select_related("site")
.prefetch_related("agentchecks")
.filter(mon_type)
.filter(client)
.filter(site)
.only(
"pk",
"hostname",
"agent_id",
"site",
"monitoring_type",
"description",
"needs_reboot",
"overdue_text_alert",
"overdue_email_alert",
"overdue_time",
"offline_time",
"last_seen",
"boot_time",
"logged_in_username",
"last_logged_in_user",
"time_zone",
"maintenance_mode",
)
.order_by(order_by)
)
paginator = Paginator(queryset, pagination["rowsPerPage"])
def list(self, request):
queryset = self.get_queryset()
ctx = {"default_tz": get_default_timezone()}
serializer = AgentTableSerializer(queryset, many=True, context=ctx)
return Response(serializer.data)
serializer = AgentTableSerializer(
paginator.get_page(pagination["page"]), many=True, context=ctx
)
ret = {"agents": serializer.data, "total": paginator.count}
return Response(ret)
@api_view()
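The new TestAgentsList cases further down exercise this handler with a PATCH to /agents/listagents/; an equivalent curl sketch (host and auth are placeholders) would be:
```
# payload mirrors the "test mixed" case in the tests below
curl -X PATCH https://api.example.com/agents/listagents/ \
  -H "Content-Type: application/json" \
  -d '{"monType": "mixed", "pagination": {"rowsPerPage": 50, "rowsNumber": null, "sortBy": "hostname", "descending": false, "page": 1}}'
```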
@@ -266,66 +308,6 @@ def agent_edit_details(request, pk):
return Response(AgentEditSerializer(agent).data)
@api_view()
def by_client(request, clientpk):
agents = (
Agent.objects.select_related("site")
.filter(site__client_id=clientpk)
.prefetch_related("agentchecks")
.only(
"pk",
"hostname",
"agent_id",
"site",
"monitoring_type",
"description",
"needs_reboot",
"overdue_text_alert",
"overdue_email_alert",
"overdue_time",
"offline_time",
"last_seen",
"boot_time",
"logged_in_username",
"last_logged_in_user",
"time_zone",
"maintenance_mode",
)
)
ctx = {"default_tz": get_default_timezone()}
return Response(AgentTableSerializer(agents, many=True, context=ctx).data)
@api_view()
def by_site(request, sitepk):
agents = (
Agent.objects.filter(site_id=sitepk)
.select_related("site")
.prefetch_related("agentchecks")
.only(
"pk",
"hostname",
"agent_id",
"site",
"monitoring_type",
"description",
"needs_reboot",
"overdue_text_alert",
"overdue_email_alert",
"overdue_time",
"offline_time",
"last_seen",
"boot_time",
"logged_in_username",
"last_logged_in_user",
"time_zone",
"maintenance_mode",
)
)
ctx = {"default_tz": get_default_timezone()}
return Response(AgentTableSerializer(agents, many=True, context=ctx).data)
@api_view(["POST"])
def overdue_action(request):
agent = get_object_or_404(Agent, pk=request.data["pk"])
@@ -427,124 +409,20 @@ def install_agent(request):
)
if request.data["installMethod"] == "exe":
go_bin = "/usr/local/rmmgo/go/bin/go"
if not os.path.exists(go_bin):
return Response("nogolang", status=status.HTTP_409_CONFLICT)
api = request.data["api"]
atype = request.data["agenttype"]
rdp = request.data["rdp"]
ping = request.data["ping"]
power = request.data["power"]
file_name = "rmm-installer.exe"
exe = os.path.join(settings.EXE_DIR, file_name)
if os.path.exists(exe):
try:
os.remove(exe)
except Exception as e:
logger.error(str(e))
goarch = "amd64" if arch == "64" else "386"
cmd = [
"env",
"GOOS=windows",
f"GOARCH={goarch}",
go_bin,
"build",
f"-ldflags=\"-s -w -X 'main.Inno={inno}'",
f"-X 'main.Api={api}'",
f"-X 'main.Client={client_id}'",
f"-X 'main.Site={site_id}'",
f"-X 'main.Atype={atype}'",
f"-X 'main.Rdp={rdp}'",
f"-X 'main.Ping={ping}'",
f"-X 'main.Power={power}'",
f"-X 'main.DownloadUrl={download_url}'",
f"-X 'main.Token={token}'\"",
"-o",
exe,
]
build_error = False
gen_error = False
gen = [
"env",
"GOOS=windows",
f"GOARCH={goarch}",
go_bin,
"generate",
]
try:
r1 = subprocess.run(
" ".join(gen),
capture_output=True,
shell=True,
cwd=os.path.join(settings.BASE_DIR, "core/goinstaller"),
)
except Exception as e:
gen_error = True
logger.error(str(e))
return Response(
"genfailed", status=status.HTTP_413_REQUEST_ENTITY_TOO_LARGE
)
if r1.returncode != 0:
gen_error = True
if r1.stdout:
logger.error(r1.stdout.decode("utf-8", errors="ignore"))
if r1.stderr:
logger.error(r1.stderr.decode("utf-8", errors="ignore"))
logger.error(f"Go build failed with return code {r1.returncode}")
if gen_error:
return Response(
"genfailed", status=status.HTTP_413_REQUEST_ENTITY_TOO_LARGE
)
try:
r = subprocess.run(
" ".join(cmd),
capture_output=True,
shell=True,
cwd=os.path.join(settings.BASE_DIR, "core/goinstaller"),
)
except Exception as e:
build_error = True
logger.error(str(e))
return Response("buildfailed", status=status.HTTP_412_PRECONDITION_FAILED)
if r.returncode != 0:
build_error = True
if r.stdout:
logger.error(r.stdout.decode("utf-8", errors="ignore"))
if r.stderr:
logger.error(r.stderr.decode("utf-8", errors="ignore"))
logger.error(f"Go build failed with return code {r.returncode}")
if build_error:
return Response("buildfailed", status=status.HTTP_412_PRECONDITION_FAILED)
if settings.DEBUG:
with open(exe, "rb") as f:
response = HttpResponse(
f.read(),
content_type="application/vnd.microsoft.portable-executable",
)
response["Content-Disposition"] = f"inline; filename={file_name}"
return response
else:
response = HttpResponse()
response["Content-Disposition"] = f"attachment; filename={file_name}"
response["X-Accel-Redirect"] = f"/private/exe/{file_name}"
return response
return generate_installer_exe(
file_name="rmm-installer.exe",
goarch="amd64" if arch == "64" else "386",
inno=inno,
api=request.data["api"],
client_id=client_id,
site_id=site_id,
atype=request.data["agenttype"],
rdp=request.data["rdp"],
ping=request.data["ping"],
power=request.data["power"],
download_url=download_url,
token=token,
)
elif request.data["installMethod"] == "manual":
cmd = [
@@ -653,7 +531,7 @@ def recover(request):
if r == "ok":
return Response("Successfully completed recovery")
if agent.recoveryactions.filter(last_run=None).exists():
if agent.recoveryactions.filter(last_run=None).exists(): # type: ignore
return notify_error(
"A recovery action is currently pending. Please wait for the next agent check-in."
)
@@ -681,8 +559,6 @@ def recover(request):
@api_view(["POST"])
def run_script(request):
agent = get_object_or_404(Agent, pk=request.data["pk"])
if not agent.has_nats:
return notify_error("Requires agent version 1.1.0 or greater")
script = get_object_or_404(Script, pk=request.data["scriptPK"])
output = request.data["output"]
args = request.data["args"]
@@ -701,9 +577,6 @@ def run_script(request):
return Response(r)
elif output == "email":
if not pyver.parse(agent.version) >= pyver.parse("1.1.12"):
return notify_error("Requires agent version 1.1.12 or greater")
emails = (
[] if request.data["emailmode"] == "default" else request.data["emails"]
)
@@ -807,7 +680,7 @@ def bulk(request):
elif request.data["monType"] == "workstations":
q = q.filter(monitoring_type="workstation")
agents: List[int] = [agent.pk for agent in q]
agents: list[int] = [agent.pk for agent in q]
AuditLog.audit_bulk_action(request.user, request.data["mode"], request.data)

View File

@@ -1,7 +1,20 @@
from __future__ import annotations
from typing import TYPE_CHECKING, Union
from django.conf import settings
from django.contrib.postgres.fields import ArrayField
from django.db import models
from django.db.models.fields import BooleanField, PositiveIntegerField
from django.utils import timezone as djangotime
from loguru import logger
if TYPE_CHECKING:
from agents.models import Agent
from autotasks.models import AutomatedTask
from checks.models import Check
logger.configure(**settings.LOG_CONFIG)
SEVERITY_CHOICES = [
("info", "Informational"),
@@ -78,7 +91,7 @@ class Alert(models.Model):
self.save()
@classmethod
def create_availability_alert(cls, agent):
def create_or_return_availability_alert(cls, agent):
if not cls.objects.filter(agent=agent, resolved=False).exists():
return cls.objects.create(
agent=agent,
@@ -87,9 +100,11 @@ class Alert(models.Model):
message=f"{agent.hostname} in {agent.client.name}\\{agent.site.name} is overdue.",
hidden=True,
)
else:
return cls.objects.get(agent=agent, resolved=False)
@classmethod
def create_check_alert(cls, check):
def create_or_return_check_alert(cls, check):
if not cls.objects.filter(assigned_check=check, resolved=False).exists():
return cls.objects.create(
@@ -99,9 +114,11 @@ class Alert(models.Model):
message=f"{check.agent.hostname} has a {check.check_type} check: {check.readable_desc} that failed.",
hidden=True,
)
else:
return cls.objects.get(assigned_check=check, resolved=False)
@classmethod
def create_task_alert(cls, task):
def create_or_return_task_alert(cls, task):
if not cls.objects.filter(assigned_task=task, resolved=False).exists():
return cls.objects.create(
@@ -111,10 +128,305 @@ class Alert(models.Model):
message=f"{task.agent.hostname} has task: {task.name} that failed.",
hidden=True,
)
else:
return cls.objects.get(assigned_task=task, resolved=False)
@classmethod
def create_custom_alert(cls, custom):
pass
@classmethod
def handle_alert_failure(cls, instance: Union[Agent, AutomatedTask, Check]) -> None:
from agents.models import Agent
from autotasks.models import AutomatedTask
from checks.models import Check
# set variables
dashboard_severities = None
email_severities = None
text_severities = None
always_dashboard = None
always_email = None
always_text = None
alert_interval = None
email_task = None
text_task = None
# check what the instance passed is
if isinstance(instance, Agent):
from agents.tasks import agent_outage_email_task, agent_outage_sms_task
email_task = agent_outage_email_task
text_task = agent_outage_sms_task
email_alert = instance.overdue_email_alert
text_alert = instance.overdue_text_alert
dashboard_alert = instance.overdue_dashboard_alert
alert_template = instance.get_alert_template()
maintenance_mode = instance.maintenance_mode
alert_severity = "error"
agent = instance
# set alert_template settings
if alert_template:
dashboard_severities = ["error"]
email_severities = ["error"]
text_severities = ["error"]
always_dashboard = alert_template.agent_always_alert
always_email = alert_template.agent_always_email
always_text = alert_template.agent_always_text
alert_interval = alert_template.agent_periodic_alert_days
if instance.should_create_alert(alert_template):
alert = cls.create_or_return_availability_alert(instance)
else:
# check if there is an alert that exists
if cls.objects.filter(agent=instance, resolved=False).exists():
alert = cls.objects.get(agent=instance, resolved=False)
else:
alert = None
elif isinstance(instance, Check):
from checks.tasks import (
handle_check_email_alert_task,
handle_check_sms_alert_task,
)
email_task = handle_check_email_alert_task
text_task = handle_check_sms_alert_task
email_alert = instance.email_alert
text_alert = instance.text_alert
dashboard_alert = instance.dashboard_alert
alert_template = instance.agent.get_alert_template()
maintenance_mode = instance.agent.maintenance_mode
alert_severity = instance.alert_severity
agent = instance.agent
# set alert_template settings
if alert_template:
dashboard_severities = alert_template.check_dashboard_alert_severity
email_severities = alert_template.check_email_alert_severity
text_severities = alert_template.check_text_alert_severity
always_dashboard = alert_template.check_always_alert
always_email = alert_template.check_always_email
always_text = alert_template.check_always_text
alert_interval = alert_template.check_periodic_alert_days
if instance.should_create_alert(alert_template):
alert = cls.create_or_return_check_alert(instance)
else:
# check if there is an alert that exists
if cls.objects.filter(assigned_check=instance, resolved=False).exists():
alert = cls.objects.get(assigned_check=instance, resolved=False)
else:
alert = None
elif isinstance(instance, AutomatedTask):
from autotasks.tasks import handle_task_email_alert, handle_task_sms_alert
email_task = handle_task_email_alert
text_task = handle_task_sms_alert
email_alert = instance.email_alert
text_alert = instance.text_alert
dashboard_alert = instance.dashboard_alert
alert_template = instance.agent.get_alert_template()
maintenance_mode = instance.agent.maintenance_mode
alert_severity = instance.alert_severity
agent = instance.agent
# set alert_template settings
if alert_template:
dashboard_severities = alert_template.task_dashboard_alert_severity
email_severities = alert_template.task_email_alert_severity
text_severities = alert_template.task_text_alert_severity
always_dashboard = alert_template.task_always_alert
always_email = alert_template.task_always_email
always_text = alert_template.task_always_text
alert_interval = alert_template.task_periodic_alert_days
if instance.should_create_alert(alert_template):
alert = cls.create_or_return_task_alert(instance)
else:
# check if there is an alert that exists
if cls.objects.filter(assigned_task=instance, resolved=False).exists():
alert = cls.objects.get(assigned_task=instance, resolved=False)
else:
alert = None
else:
return
# return if agent is in maintenance mode
if maintenance_mode or not alert:
return
# check if alert severity changed on check and update the alert
if alert_severity != alert.severity:
alert.severity = alert_severity
alert.save(update_fields=["severity"])
# create alert in dashboard if enabled
if dashboard_alert or always_dashboard:
# check if alert template is set and specific severities are configured
if alert_template and alert.severity not in dashboard_severities: # type: ignore
pass
else:
alert.hidden = False
alert.save()
# send email if enabled
if email_alert or always_email:
# check if alert template is set and specific severities are configured
if alert_template and alert.severity not in email_severities: # type: ignore
pass
else:
email_task.delay(
pk=alert.pk,
alert_interval=alert_interval,
)
# send text if enabled
if text_alert or always_text:
# check if alert template is set and specific severities are configured
if alert_template and alert.severity not in text_severities: # type: ignore
pass
else:
text_task.delay(pk=alert.pk, alert_interval=alert_interval)
# check if any scripts should be run
if alert_template and alert_template.action and not alert.action_run:
r = agent.run_script(
scriptpk=alert_template.action.pk,
args=alert_template.action_args,
timeout=alert_template.action_timeout,
wait=True,
full=True,
run_on_any=True,
)
# command was successful
if type(r) == dict:
alert.action_retcode = r["retcode"]
alert.action_stdout = r["stdout"]
alert.action_stderr = r["stderr"]
alert.action_execution_time = "{:.4f}".format(r["execution_time"])
alert.action_run = djangotime.now()
alert.save()
else:
logger.error(
f"Failure action: {alert_template.action.name} failed to run on any agent for {agent.hostname} failure alert"
)
@classmethod
def handle_alert_resolve(cls, instance: Union[Agent, AutomatedTask, Check]) -> None:
from agents.models import Agent
from autotasks.models import AutomatedTask
from checks.models import Check
# set variables
email_on_resolved = False
text_on_resolved = False
resolved_email_task = None
resolved_text_task = None
# check what the instance passed is
if isinstance(instance, Agent):
from agents.tasks import agent_recovery_email_task, agent_recovery_sms_task
resolved_email_task = agent_recovery_email_task
resolved_text_task = agent_recovery_sms_task
alert_template = instance.get_alert_template()
alert = cls.objects.get(agent=instance, resolved=False)
maintenance_mode = instance.maintenance_mode
agent = instance
if alert_template:
email_on_resolved = alert_template.agent_email_on_resolved
text_on_resolved = alert_template.agent_text_on_resolved
elif isinstance(instance, Check):
from checks.tasks import (
handle_resolved_check_email_alert_task,
handle_resolved_check_sms_alert_task,
)
resolved_email_task = handle_resolved_check_email_alert_task
resolved_text_task = handle_resolved_check_sms_alert_task
alert_template = instance.agent.get_alert_template()
alert = cls.objects.get(assigned_check=instance, resolved=False)
maintenance_mode = instance.agent.maintenance_mode
agent = instance.agent
if alert_template:
email_on_resolved = alert_template.check_email_on_resolved
text_on_resolved = alert_template.check_text_on_resolved
elif isinstance(instance, AutomatedTask):
from autotasks.tasks import (
handle_resolved_task_email_alert,
handle_resolved_task_sms_alert,
)
resolved_email_task = handle_resolved_task_email_alert
resolved_text_task = handle_resolved_task_sms_alert
alert_template = instance.agent.get_alert_template()
alert = cls.objects.get(assigned_task=instance, resolved=False)
maintenance_mode = instance.agent.maintenance_mode
agent = instance.agent
if alert_template:
email_on_resolved = alert_template.task_email_on_resolved
text_on_resolved = alert_template.task_text_on_resolved
else:
return
# return if agent is in maintenance mode
if maintenance_mode:
return
alert.resolve()
# check if a resolved email notification should be sent
if email_on_resolved and not alert.resolved_email_sent:
resolved_email_task.delay(pk=alert.pk)
# check if resolved text should be sent
if text_on_resolved and not alert.resolved_sms_sent:
resolved_text_task.delay(pk=alert.pk)
# check if resolved script should be run
if (
alert_template
and alert_template.resolved_action
and not alert.resolved_action_run
):
r = agent.run_script(
scriptpk=alert_template.resolved_action.pk,
args=alert_template.resolved_action_args,
timeout=alert_template.resolved_action_timeout,
wait=True,
full=True,
run_on_any=True,
)
# command was successful
if type(r) == dict:
alert.resolved_action_retcode = r["retcode"]
alert.resolved_action_stdout = r["stdout"]
alert.resolved_action_stderr = r["stderr"]
alert.resolved_action_execution_time = "{:.4f}".format(
r["execution_time"]
)
alert.resolved_action_run = djangotime.now()
alert.save()
else:
logger.error(
f"Resolved action: {alert_template.action.name} failed to run on any agent for {agent.hostname} resolved alert"
)
class AlertTemplate(models.Model):
@@ -283,4 +595,4 @@ class AlertTemplate(models.Model):
@property
def is_default_template(self) -> bool:
return self.default_alert_template.exists()
return self.default_alert_template.exists() # type: ignore
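
The consolidated Alert.handle_alert_failure above gates each notification channel (dashboard, email, text) on the alert template's per-channel severity lists before dispatching the matching Celery task. Below is a standalone sketch of that gating pattern only; the channel names, severities, and helper are invented for illustration and are not part of the repo.

# Illustrative only: minimal sketch of the severity-gated dispatch idea used above.
def dispatch_alert(severity, channels):
    sent = []
    for name, cfg in channels.items():
        # dispatch only when the channel is enabled and the severity passes the filter
        if cfg["enabled"] and (cfg["severities"] is None or severity in cfg["severities"]):
            sent.append(name)
    return sent

channels = {
    "dashboard": {"enabled": True, "severities": ["error", "warning"]},
    "email": {"enabled": True, "severities": ["error"]},
    "text": {"enabled": False, "severities": None},
}
print(dispatch_alert("warning", channels))  # ['dashboard']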

File diff suppressed because it is too large

View File

@@ -53,3 +53,39 @@ class TestAPIv3(TacticalTestCase):
r.json(),
{"agent": self.agent.pk, "check_interval": self.agent.check_interval},
)
def test_checkin_patch(self):
from logs.models import PendingAction
url = "/api/v3/checkin/"
agent_updated = baker.make_recipe("agents.agent", version="1.3.0")
PendingAction.objects.create(
agent=agent_updated,
action_type="agentupdate",
details={
"url": agent_updated.winagent_dl,
"version": agent_updated.version,
"inno": agent_updated.win_inno_exe,
},
)
action = agent_updated.pendingactions.filter(action_type="agentupdate").first()
self.assertEqual(action.status, "pending")
# test agent failed to update and still on same version
payload = {
"func": "hello",
"agent_id": agent_updated.agent_id,
"version": "1.3.0",
}
r = self.client.patch(url, payload, format="json")
self.assertEqual(r.status_code, 200)
action = agent_updated.pendingactions.filter(action_type="agentupdate").first()
self.assertEqual(action.status, "pending")
# test agent successful update
payload["version"] = settings.LATEST_AGENT_VER
r = self.client.patch(url, payload, format="json")
self.assertEqual(r.status_code, 200)
action = agent_updated.pendingactions.filter(action_type="agentupdate").first()
self.assertEqual(action.status, "completed")
action.delete()

View File

@@ -17,4 +17,5 @@ urlpatterns = [
path("choco/", views.Choco.as_view()),
path("winupdates/", views.WinUpdates.as_view()),
path("superseded/", views.SupersededWinUpdate.as_view()),
path("<int:pk>/chocoresult/", views.ChocoResult.as_view()),
]

View File

@@ -22,6 +22,7 @@ from autotasks.serializers import TaskGOGetSerializer, TaskRunnerPatchSerializer
from checks.models import Check
from checks.serializers import CheckRunnerGetSerializer
from checks.utils import bytes2human
from logs.models import PendingAction
from software.models import InstalledSoftware
from tacticalrmm.utils import SoftwareList, filter_software, notify_error, reload_nats
from winupdate.models import WinUpdate, WinUpdatePolicy
@@ -35,6 +36,8 @@ class CheckIn(APIView):
permission_classes = [IsAuthenticated]
def patch(self, request):
from alerts.models import Alert
updated = False
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
if pyver.parse(request.data["version"]) > pyver.parse(
@@ -50,26 +53,27 @@ class CheckIn(APIView):
# change agent update pending status to completed if agent has just updated
if (
updated
and agent.pendingactions.filter(
and agent.pendingactions.filter( # type: ignore
action_type="agentupdate", status="pending"
).exists()
):
agent.pendingactions.filter(
agent.pendingactions.filter( # type: ignore
action_type="agentupdate", status="pending"
).update(status="completed")
# handles any alerting actions
agent.handle_alert(checkin=True)
if Alert.objects.filter(agent=agent, resolved=False).exists():
Alert.handle_alert_resolve(agent)
recovery = agent.recoveryactions.filter(last_run=None).last()
recovery = agent.recoveryactions.filter(last_run=None).last() # type: ignore
if recovery is not None:
recovery.last_run = djangotime.now()
recovery.save(update_fields=["last_run"])
handle_agent_recovery_task.delay(pk=recovery.pk)
handle_agent_recovery_task.delay(pk=recovery.pk) # type: ignore
return Response("ok")
# get any pending actions
if agent.pendingactions.filter(status="pending").exists():
if agent.pendingactions.filter(status="pending").exists(): # type: ignore
agent.handle_pending_actions()
return Response("ok")
@@ -111,7 +115,7 @@ class CheckIn(APIView):
if not InstalledSoftware.objects.filter(agent=agent).exists():
InstalledSoftware(agent=agent, software=sw).save()
else:
s = agent.installedsoftware_set.first()
s = agent.installedsoftware_set.first() # type: ignore
s.software = sw
s.save(update_fields=["software"])
@@ -184,7 +188,7 @@ class WinUpdates(APIView):
def patch(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
u = agent.winupdates.filter(guid=request.data["guid"]).last()
u = agent.winupdates.filter(guid=request.data["guid"]).last() # type: ignore
success: bool = request.data["success"]
if success:
u.result = "success"
@@ -210,8 +214,8 @@ class WinUpdates(APIView):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
updates = request.data["wua_updates"]
for update in updates:
if agent.winupdates.filter(guid=update["guid"]).exists():
u = agent.winupdates.filter(guid=update["guid"]).last()
if agent.winupdates.filter(guid=update["guid"]).exists(): # type: ignore
u = agent.winupdates.filter(guid=update["guid"]).last() # type: ignore
u.downloaded = update["downloaded"]
u.installed = update["installed"]
u.save(update_fields=["downloaded", "installed"])
@@ -242,7 +246,7 @@ class WinUpdates(APIView):
# more superseded updates cleanup
if pyver.parse(agent.version) <= pyver.parse("1.4.2"):
for u in agent.winupdates.filter(
for u in agent.winupdates.filter( # type: ignore
date_installed__isnull=True, result="failed"
).exclude(installed=True):
u.delete()
@@ -256,7 +260,7 @@ class SupersededWinUpdate(APIView):
def post(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
updates = agent.winupdates.filter(guid=request.data["guid"])
updates = agent.winupdates.filter(guid=request.data["guid"]) # type: ignore
for u in updates:
u.delete()
@@ -264,10 +268,6 @@ class SupersededWinUpdate(APIView):
class CheckRunner(APIView):
"""
For the windows golang agent
"""
authentication_classes = [TokenAuthentication]
permission_classes = [IsAuthenticated]
@@ -301,10 +301,6 @@ class CheckRunnerInterval(APIView):
class TaskRunner(APIView):
"""
For the windows golang agent
"""
authentication_classes = [TokenAuthentication]
permission_classes = [IsAuthenticated]
@@ -314,6 +310,7 @@ class TaskRunner(APIView):
return Response(TaskGOGetSerializer(task).data)
def patch(self, request, pk, agentid):
from alerts.models import Alert
from logs.models import AuditLog
agent = get_object_or_404(Agent, agent_id=agentid)
@@ -325,8 +322,17 @@ class TaskRunner(APIView):
serializer.is_valid(raise_exception=True)
serializer.save(last_run=djangotime.now())
new_task = AutomatedTask.objects.get(pk=task.pk)
new_task.handle_alert()
status = "failing" if task.retcode != 0 else "passing"
new_task: AutomatedTask = AutomatedTask.objects.get(pk=task.pk)
new_task.status = status
new_task.save()
if status == "passing":
if Alert.objects.filter(assigned_task=new_task, resolved=False).exists():
Alert.handle_alert_resolve(new_task)
else:
Alert.handle_alert_failure(new_task)
AuditLog.objects.create(
username=agent.hostname,
@@ -404,10 +410,10 @@ class NewAgent(APIView):
agent.salt_id = f"{agent.hostname}-{agent.pk}"
agent.save(update_fields=["salt_id"])
user = User.objects.create_user(
user = User.objects.create_user( # type: ignore
username=request.data["agent_id"],
agent=agent,
password=User.objects.make_random_password(60),
password=User.objects.make_random_password(60), # type: ignore
)
token = Token.objects.create(user=user)
@@ -452,7 +458,7 @@ class Software(APIView):
if not InstalledSoftware.objects.filter(agent=agent).exists():
InstalledSoftware(agent=agent, software=sw).save()
else:
s = agent.installedsoftware_set.first()
s = agent.installedsoftware_set.first() # type: ignore
s.software = sw
s.save(update_fields=["software"])
@@ -475,3 +481,35 @@ class Installer(APIView):
)
return Response("ok")
class ChocoResult(APIView):
authentication_classes = [TokenAuthentication]
permission_classes = [IsAuthenticated]
def patch(self, request, pk):
action = get_object_or_404(PendingAction, pk=pk)
results: str = request.data["results"]
software_name = action.details["name"].lower()
success = [
"install",
"of",
software_name,
"was",
"successful",
"installed",
]
duplicate = [software_name, "already", "installed", "--force", "reinstall"]
installed = False
if all(x in results.lower() for x in success):
installed = True
elif all(x in results.lower() for x in duplicate):
installed = True
action.details["output"] = results
action.details["installed"] = installed
action.status = "completed"
action.save(update_fields=["details", "status"])
return Response("ok")

View File

@@ -43,11 +43,11 @@ class Policy(BaseAuditModel):
@property
def is_default_server_policy(self):
return self.default_server_policy.exists()
return self.default_server_policy.exists() # type: ignore
@property
def is_default_workstation_policy(self):
return self.default_workstation_policy.exists()
return self.default_workstation_policy.exists() # type: ignore
def __str__(self):
return self.name
@@ -56,7 +56,7 @@ class Policy(BaseAuditModel):
return self.get_related("server") | self.get_related("workstation")
def get_related(self, mon_type):
explicit_agents = self.agents.filter(monitoring_type=mon_type)
explicit_agents = self.agents.filter(monitoring_type=mon_type) # type: ignore
explicit_clients = getattr(self, f"{mon_type}_clients").all()
explicit_sites = getattr(self, f"{mon_type}_sites").all()

View File

@@ -505,12 +505,12 @@ class TestPolicyTasks(TacticalTestCase):
self.assertEqual(check.ip, checks[1].ip)
elif check.check_type == "cpuload":
self.assertEqual(check.parent_check, checks[2].id)
self.assertEqual(check.error_threshold, checks[0].error_threshold)
self.assertEqual(check.warning_threshold, checks[0].warning_threshold)
self.assertEqual(check.error_threshold, checks[2].error_threshold)
self.assertEqual(check.warning_threshold, checks[2].warning_threshold)
elif check.check_type == "memory":
self.assertEqual(check.parent_check, checks[3].id)
self.assertEqual(check.error_threshold, checks[0].error_threshold)
self.assertEqual(check.warning_threshold, checks[0].warning_threshold)
self.assertEqual(check.error_threshold, checks[3].error_threshold)
self.assertEqual(check.warning_threshold, checks[3].warning_threshold)
elif check.check_type == "winsvc":
self.assertEqual(check.parent_check, checks[4].id)
self.assertEqual(check.svc_name, checks[4].svc_name)

View File

@@ -171,7 +171,7 @@ class UpdatePatchPolicy(APIView):
serializer = WinUpdatePolicySerializer(data=request.data, partial=True)
serializer.is_valid(raise_exception=True)
serializer.policy = policy
serializer.policy = policy # type: ignore
serializer.save()
return Response("ok")

View File

@@ -0,0 +1,18 @@
# Generated by Django 3.1.7 on 2021-02-24 05:37
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('autotasks', '0017_auto_20210210_1512'),
]
operations = [
migrations.AddField(
model_name='automatedtask',
name='run_asap_after_missed',
field=models.BooleanField(default=False),
),
]

View File

@@ -96,6 +96,7 @@ class AutomatedTask(BaseAuditModel):
)
run_time_date = DateTimeField(null=True, blank=True)
remove_if_not_scheduled = models.BooleanField(default=False)
run_asap_after_missed = models.BooleanField(default=False) # added in agent v1.4.7
managed_by_policy = models.BooleanField(default=False)
parent_task = models.PositiveIntegerField(null=True, blank=True)
win_task_name = models.CharField(max_length=255, null=True, blank=True)
@@ -218,176 +219,26 @@ class AutomatedTask(BaseAuditModel):
timeout=self.timeout,
enabled=self.enabled,
remove_if_not_scheduled=self.remove_if_not_scheduled,
run_asap_after_missed=self.run_asap_after_missed,
)
create_win_task_schedule.delay(task.pk)
def handle_alert(self) -> None:
from alerts.models import Alert
from autotasks.tasks import (
handle_resolved_task_email_alert,
handle_resolved_task_sms_alert,
handle_task_email_alert,
handle_task_sms_alert,
def should_create_alert(self, alert_template):
return (
self.dashboard_alert
or self.email_alert
or self.text_alert
or (
alert_template
and (
alert_template.task_always_alert
or alert_template.task_always_email
or alert_template.task_always_text
)
)
)
self.status = "failing" if self.retcode != 0 else "passing"
self.save()
# return if agent is in maintenance mode
if self.agent.maintenance_mode:
return
# see if agent has an alert template and use that
alert_template = self.agent.get_alert_template()
# resolve alert if it exists
if self.status == "passing":
if Alert.objects.filter(assigned_task=self, resolved=False).exists():
alert = Alert.objects.get(assigned_task=self, resolved=False)
alert.resolve()
# check if resolved email should be send
if (
not alert.resolved_email_sent
and self.email_alert
or alert_template
and alert_template.task_email_on_resolved
):
handle_resolved_task_email_alert.delay(pk=alert.pk)
# check if resolved text should be sent
if (
not alert.resolved_sms_sent
and self.text_alert
or alert_template
and alert_template.task_text_on_resolved
):
handle_resolved_task_sms_alert.delay(pk=alert.pk)
# check if resolved script should be run
if (
alert_template
and alert_template.resolved_action
and not alert.resolved_action_run
):
r = self.agent.run_script(
scriptpk=alert_template.resolved_action.pk,
args=alert_template.resolved_action_args,
timeout=alert_template.resolved_action_timeout,
wait=True,
full=True,
run_on_any=True,
)
# command was successful
if type(r) == dict:
alert.resolved_action_retcode = r["retcode"]
alert.resolved_action_stdout = r["stdout"]
alert.resolved_action_stderr = r["stderr"]
alert.resolved_action_execution_time = "{:.4f}".format(
r["execution_time"]
)
alert.resolved_action_run = djangotime.now()
alert.save()
else:
logger.error(
f"Resolved action: {alert_template.action.name} failed to run on any agent for {self.agent.hostname} resolved alert for task: {self.name}"
)
# create alert if task is failing
else:
if not Alert.objects.filter(assigned_task=self, resolved=False).exists():
# check if alert should be created and if not return
if (
self.dashboard_alert
or self.email_alert
or self.text_alert
or (
alert_template
and (
alert_template.task_always_alert
or alert_template.task_always_email
or alert_template.task_always_text
)
)
):
alert = Alert.create_task_alert(self)
else:
return
else:
alert = Alert.objects.get(assigned_task=self, resolved=False)
# check if alert severity changed on task and update the alert
if self.alert_severity != alert.severity:
alert.severity = self.alert_severity
alert.save(update_fields=["severity"])
# create alert in dashboard if enabled
if (
self.dashboard_alert
or alert_template
and alert_template.task_always_alert
):
alert.hidden = False
alert.save()
# send email if enabled
if (
not alert.email_sent
and self.email_alert
or alert_template
and self.alert_severity in alert_template.task_email_alert_severity
and alert_template.check_always_email
):
handle_task_email_alert.delay(
pk=alert.pk,
alert_template=alert_template.check_periodic_alert_days
if alert_template
else None,
)
# send text if enabled
if (
not alert.sms_sent
and self.text_alert
or alert_template
and self.alert_severity in alert_template.task_text_alert_severity
and alert_template.check_always_text
):
handle_task_sms_alert.delay(
pk=alert.pk,
alert_template=alert_template.check_periodic_alert_days
if alert_template
else None,
)
# check if any scripts should be run
if alert_template and alert_template.action and not alert.action_run:
r = self.agent.run_script(
scriptpk=alert_template.action.pk,
args=alert_template.action_args,
timeout=alert_template.action_timeout,
wait=True,
full=True,
run_on_any=True,
)
# command was successful
if type(r) == dict:
alert.action_retcode = r["retcode"]
alert.action_stdout = r["stdout"]
alert.action_stderr = r["stderr"]
alert.action_execution_time = "{:.4f}".format(r["execution_time"])
alert.action_run = djangotime.now()
alert.save()
else:
logger.error(
f"Failure action: {alert_template.action.name} failed to run on any agent for {self.agent.hostname} failure alert for task: {self.name}"
)
def send_email(self):
from core.models import CoreSettings

View File

@@ -45,7 +45,7 @@ def create_win_task_schedule(pk, pending_action=False):
task.run_time_date = now.astimezone(agent_tz).replace(
tzinfo=pytz.utc
) + djangotime.timedelta(minutes=5)
task.save()
task.save(update_fields=["run_time_date"])
nats_data = {
"func": "schedtask",
@@ -62,9 +62,12 @@ def create_win_task_schedule(pk, pending_action=False):
},
}
if task.remove_if_not_scheduled and pyver.parse(
if task.run_asap_after_missed and pyver.parse(
task.agent.version
) >= pyver.parse("1.1.2"):
) >= pyver.parse("1.4.7"):
nats_data["schedtaskpayload"]["run_asap_after_missed"] = True
if task.remove_if_not_scheduled:
nats_data["schedtaskpayload"]["deleteafter"] = True
elif task.task_type == "checkfailure" or task.task_type == "manual":
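
With the new run_asap_after_missed field, create_win_task_schedule adds an extra key to the "schedtask" NATS payload when the agent is new enough (1.4.7+), alongside the existing deleteafter flag. A hedged sketch of that payload shape is shown below; only the keys visible in the diff are taken from the code, and the base trigger/name values are placeholders.

# Illustrative only: approximate shape of the "schedtask" NATS message built above.
def build_schedtask_payload(base, run_asap_after_missed, remove_if_not_scheduled, agent_supports_asap):
    payload = {"func": "schedtask", "schedtaskpayload": dict(base)}
    if run_asap_after_missed and agent_supports_asap:  # gated on agent >= 1.4.7 in the real code
        payload["schedtaskpayload"]["run_asap_after_missed"] = True
    if remove_if_not_scheduled:
        payload["schedtaskpayload"]["deleteafter"] = True
    return payload

example = build_schedtask_payload(
    {"type": "rmm", "trigger": "once", "name": "example-task-123"},  # placeholder values
    run_asap_after_missed=True,
    remove_if_not_scheduled=False,
    agent_supports_asap=True,
)
print(example)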

View File

@@ -3,7 +3,7 @@ from model_bakery.recipe import Recipe
check = Recipe("checks.Check")
diskspace_check = check.extend(
check_type="diskspace", disk="C:", warning_threshold=30, error_threshold=75
check_type="diskspace", disk="C:", warning_threshold=30, error_threshold=10
)
cpuload_check = check.extend(
@@ -13,7 +13,7 @@ cpuload_check = check.extend(
ping_check = check.extend(check_type="ping", ip="10.10.10.10")
memory_check = check.extend(
check_type="memory", warning_threshold=30, error_threshold=75
check_type="memory", warning_threshold=60, error_threshold=75
)
winsvc_check = check.extend(
@@ -21,6 +21,7 @@ winsvc_check = check.extend(
svc_name="ServiceName",
svc_display_name="ServiceName",
svc_policy_mode="manual",
pass_if_svc_not_exist=False,
)
eventlog_check = check.extend(

View File

@@ -3,27 +3,19 @@ import json
import os
import string
from statistics import mean
from typing import Any, List, Union
from typing import Any
import pytz
from django.conf import settings
from django.contrib.postgres.fields import ArrayField
from django.core.validators import MaxValueValidator, MinValueValidator
from django.db import models
from django.utils import timezone as djangotime
from loguru import logger
from rest_framework.fields import JSONField
from alerts.models import SEVERITY_CHOICES
from core.models import CoreSettings
from logs.models import BaseAuditModel
from .tasks import (
handle_check_email_alert_task,
handle_check_sms_alert_task,
handle_resolved_check_email_alert_task,
handle_resolved_check_sms_alert_task,
)
from .utils import bytes2human
logger.configure(**settings.LOG_CONFIG)
@@ -206,9 +198,9 @@ class Check(BaseAuditModel):
if self.error_threshold:
text += f" Error Threshold: {self.error_threshold}%"
return f"{self.get_check_type_display()}: Drive {self.disk} < {text}"
return f"{self.get_check_type_display()}: Drive {self.disk} - {text}" # type: ignore
elif self.check_type == "ping":
return f"{self.get_check_type_display()}: {self.name}"
return f"{self.get_check_type_display()}: {self.name}" # type: ignore
elif self.check_type == "cpuload" or self.check_type == "memory":
text = ""
@@ -217,13 +209,13 @@ class Check(BaseAuditModel):
if self.error_threshold:
text += f" Error Threshold: {self.error_threshold}%"
return f"{self.get_check_type_display()} > {text}"
return f"{self.get_check_type_display()} - {text}" # type: ignore
elif self.check_type == "winsvc":
return f"{self.get_check_type_display()}: {self.svc_display_name}"
return f"{self.get_check_type_display()}: {self.svc_display_name}" # type: ignore
elif self.check_type == "eventlog":
return f"{self.get_check_type_display()}: {self.name}"
return f"{self.get_check_type_display()}: {self.name}" # type: ignore
elif self.check_type == "script":
return f"{self.get_check_type_display()}: {self.script.name}"
return f"{self.get_check_type_display()}: {self.script.name}" # type: ignore
else:
return "n/a"
@@ -242,7 +234,7 @@ class Check(BaseAuditModel):
return self.last_run
@property
def non_editable_fields(self) -> List[str]:
def non_editable_fields(self) -> list[str]:
return [
"check_type",
"status",
@@ -267,164 +259,27 @@ class Check(BaseAuditModel):
"modified_time",
]
def handle_alert(self) -> None:
from alerts.models import Alert, AlertTemplate
def should_create_alert(self, alert_template):
# return if agent is in maintenance mode
if self.agent.maintenance_mode:
return
# see if agent has an alert template and use that
alert_template: Union[AlertTemplate, None] = self.agent.get_alert_template()
# resolve alert if it exists
if self.status == "passing":
if Alert.objects.filter(assigned_check=self, resolved=False).exists():
alert = Alert.objects.get(assigned_check=self, resolved=False)
alert.resolve()
# check if a resolved email notification should be send
if (
alert_template
and alert_template.check_email_on_resolved
and not alert.resolved_email_sent
):
handle_resolved_check_email_alert_task.delay(pk=alert.pk)
# check if resolved text should be sent
if (
alert_template
and alert_template.check_text_on_resolved
and not alert.resolved_sms_sent
):
handle_resolved_check_sms_alert_task.delay(pk=alert.pk)
# check if resolved script should be run
if (
alert_template
and alert_template.resolved_action
and not alert.resolved_action_run
):
r = self.agent.run_script(
scriptpk=alert_template.resolved_action.pk,
args=alert_template.resolved_action_args,
timeout=alert_template.resolved_action_timeout,
wait=True,
full=True,
run_on_any=True,
)
# command was successful
if type(r) == dict:
alert.resolved_action_retcode = r["retcode"]
alert.resolved_action_stdout = r["stdout"]
alert.resolved_action_stderr = r["stderr"]
alert.resolved_action_execution_time = "{:.4f}".format(
r["execution_time"]
)
alert.resolved_action_run = djangotime.now()
alert.save()
else:
logger.error(
f"Resolved action: {alert_template.action.name} failed to run on any agent for {self.agent.hostname} resolved alert for {self.check_type} check"
)
elif self.fail_count >= self.fails_b4_alert:
if not Alert.objects.filter(assigned_check=self, resolved=False).exists():
# check if alert should be created and if not return
if (
self.dashboard_alert
or self.email_alert
or self.text_alert
or (
alert_template
and (
alert_template.check_always_alert
or alert_template.check_always_email
or alert_template.check_always_text
)
)
):
alert = Alert.create_check_alert(self)
else:
return
else:
alert = Alert.objects.get(assigned_check=self, resolved=False)
# check if alert severity changed on check and update the alert
if self.alert_severity != alert.severity:
alert.severity = self.alert_severity
alert.save(update_fields=["severity"])
# create alert in dashboard if enabled
if (
self.dashboard_alert
or alert_template
and self.alert_severity in alert_template.check_dashboard_alert_severity
and alert_template.check_always_alert
):
alert.hidden = False
alert.save()
# send email if enabled
if (
not alert.email_sent
and self.email_alert
or alert_template
and self.alert_severity in alert_template.check_email_alert_severity
and alert_template.check_always_email
):
handle_check_email_alert_task.delay(
pk=alert.pk,
alert_interval=alert_template.check_periodic_alert_days
if alert_template
else None,
return (
self.dashboard_alert
or self.email_alert
or self.text_alert
or (
alert_template
and (
alert_template.check_always_alert
or alert_template.check_always_email
or alert_template.check_always_text
)
# send text if enabled
if (
not alert.sms_sent
and self.text_alert
or alert_template
and self.alert_severity in alert_template.check_text_alert_severity
and alert_template.check_always_text
):
handle_check_sms_alert_task.delay(
pk=alert.pk,
alert_interval=alert_template.check_periodic_alert_days
if alert_template
else None,
)
# check if any scripts should be run
if alert_template and alert_template.action and not alert.action_run:
r = self.agent.run_script(
scriptpk=alert_template.action.pk,
args=alert_template.action_args,
timeout=alert_template.action_timeout,
wait=True,
full=True,
run_on_any=True,
)
# command was successful
if type(r) == dict:
alert.action_retcode = r["retcode"]
alert.action_stdout = r["stdout"]
alert.action_stderr = r["stderr"]
alert.action_execution_time = "{:.4f}".format(r["execution_time"])
alert.action_run = djangotime.now()
alert.save()
else:
logger.error(
f"Failure action: {alert_template.action.name} failed to run on any agent for {self.agent.hostname} failure alert for {self.check_type} check{r}"
)
)
)
def add_check_history(self, value: int, more_info: Any = None) -> None:
CheckHistory.objects.create(check_history=self, y=value, results=more_info)
def handle_checkv2(self, data):
from alerts.models import Alert
# cpuload or mem checks
if self.check_type == "cpuload" or self.check_type == "memory":
@@ -657,11 +512,14 @@ class Check(BaseAuditModel):
self.fail_count += 1
self.save(update_fields=["status", "fail_count", "alert_severity"])
if self.fail_count >= self.fails_b4_alert:
Alert.handle_alert_failure(self)
elif self.status == "passing":
self.fail_count = 0
self.save(update_fields=["status", "fail_count", "alert_severity"])
self.handle_alert()
if Alert.objects.filter(assigned_check=self, resolved=False).exists():
Alert.handle_alert_resolve(self)
return self.status

View File

@@ -24,7 +24,7 @@ class TestCheckViews(TacticalTestCase):
serializer = CheckSerializer(disk_check)
self.assertEqual(resp.status_code, 200)
self.assertEqual(resp.data, serializer.data)
self.assertEqual(resp.data, serializer.data) # type: ignore
self.check_not_authenticated("get", url)
def test_add_disk_check(self):
@@ -211,7 +211,7 @@ class TestCheckViews(TacticalTestCase):
serializer = CheckSerializer(disk_check)
self.assertEqual(resp.status_code, 200)
self.assertEqual(resp.data, serializer.data)
self.assertEqual(resp.data, serializer.data) # type: ignore
self.check_not_authenticated("post", url)
def test_add_policy_disk_check(self):
@@ -221,7 +221,7 @@ class TestCheckViews(TacticalTestCase):
url = "/checks/checks/"
valid_payload = {
"policy": policy.pk,
"policy": policy.pk, # type: ignore
"check": {
"check_type": "diskspace",
"disk": "M:",
@@ -233,7 +233,7 @@ class TestCheckViews(TacticalTestCase):
# should fail because both error and warning thresholds are 0
invalid_payload = {
"policy": policy.pk,
"policy": policy.pk, # type: ignore
"check": {
"check_type": "diskspace",
"error_threshold": 0,
@@ -247,7 +247,7 @@ class TestCheckViews(TacticalTestCase):
# should fail because warning is less than error
invalid_payload = {
"policy": policy.pk,
"policy": policy.pk, # type: ignore
"check": {
"check_type": "diskspace",
"error_threshold": 80,
@@ -261,7 +261,7 @@ class TestCheckViews(TacticalTestCase):
# this should fail because we already have a check for drive M: in setup
invalid_payload = {
"policy": policy.pk,
"policy": policy.pk, # type: ignore
"check": {
"check_type": "diskspace",
"disk": "M:",
@@ -277,8 +277,8 @@ class TestCheckViews(TacticalTestCase):
def test_get_disks_for_policies(self):
url = "/checks/getalldisks/"
r = self.client.get(url)
self.assertIsInstance(r.data, list)
self.assertEqual(26, len(r.data))
self.assertIsInstance(r.data, list) # type: ignore
self.assertEqual(26, len(r.data)) # type: ignore
def test_edit_check_alert(self):
# setup data
@@ -361,8 +361,8 @@ class TestCheckViews(TacticalTestCase):
)
# need to manually set the date back 35 days
for check_history in check_history_data:
check_history.x = djangotime.now() - djangotime.timedelta(days=35)
for check_history in check_history_data: # type: ignore
check_history.x = djangotime.now() - djangotime.timedelta(days=35) # type: ignore
check_history.save()
# test invalid check pk
@@ -375,20 +375,22 @@ class TestCheckViews(TacticalTestCase):
data = {"timeFilter": 30}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(len(resp.data), 30)
self.assertEqual(len(resp.data), 30) # type: ignore
# test with timeFilter equal to 0
data = {"timeFilter": 0}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(len(resp.data), 60)
self.assertEqual(len(resp.data), 60) # type: ignore
self.check_not_authenticated("patch", url)
class TestCheckTasks(TacticalTestCase):
def setUp(self):
self.authenticate()
self.setup_coresettings()
self.agent = baker.make_recipe("agents.agent")
def test_prune_check_history(self):
from .tasks import prune_check_history
@@ -403,8 +405,8 @@ class TestCheckTasks(TacticalTestCase):
)
# need to manually set the date back 35 days
for check_history in check_history_data:
check_history.x = djangotime.now() - djangotime.timedelta(days=35)
for check_history in check_history_data: # type: ignore
check_history.x = djangotime.now() - djangotime.timedelta(days=35) # type: ignore
check_history.save()
# prune data 30 days old
@@ -414,3 +416,694 @@ class TestCheckTasks(TacticalTestCase):
# prune all Check history Data
prune_check_history(0)
self.assertEqual(CheckHistory.objects.count(), 0)
def test_handle_script_check(self):
from checks.models import Check
url = "/api/v3/checkrunner/"
script = baker.make_recipe("checks.script_check", agent=self.agent)
# test failing
data = {
"id": script.id,
"retcode": 500,
"stderr": "error",
"stdout": "message",
"runtime": 5.000,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=script.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "error")
# test passing
data = {
"id": script.id,
"retcode": 0,
"stderr": "error",
"stdout": "message",
"runtime": 5.000,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=script.id)
self.assertEqual(new_check.status, "passing")
# test failing info
script.info_return_codes = [20, 30, 50]
script.save()
data = {
"id": script.id,
"retcode": 30,
"stderr": "error",
"stdout": "message",
"runtime": 5.000,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=script.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "info")
# test failing warning
script.warning_return_codes = [80, 100, 1040]
script.save()
data = {
"id": script.id,
"retcode": 1040,
"stderr": "error",
"stdout": "message",
"runtime": 5.000,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=script.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "warning")
def test_handle_diskspace_check(self):
from checks.models import Check
url = "/api/v3/checkrunner/"
diskspace = baker.make_recipe(
"checks.diskspace_check",
warning_threshold=20,
error_threshold=10,
agent=self.agent,
)
# test warning threshold failure
data = {
"id": diskspace.id,
"exists": True,
"percent_used": 85,
"total": 500,
"free": 400,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=diskspace.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "warning")
# test error failure
data = {
"id": diskspace.id,
"exists": True,
"percent_used": 95,
"total": 500,
"free": 400,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=diskspace.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "error")
# test disk not exist
data = {"id": diskspace.id, "exists": False}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=diskspace.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "error")
# test warning threshold 0
diskspace.warning_threshold = 0
diskspace.save()
data = {
"id": diskspace.id,
"exists": True,
"percent_used": 95,
"total": 500,
"free": 400,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=diskspace.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "error")
# test error threshold 0
diskspace.warning_threshold = 50
diskspace.error_threshold = 0
diskspace.save()
data = {
"id": diskspace.id,
"exists": True,
"percent_used": 95,
"total": 500,
"free": 400,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=diskspace.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "warning")
# test passing
data = {
"id": diskspace.id,
"exists": True,
"percent_used": 50,
"total": 500,
"free": 400,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=diskspace.id)
self.assertEqual(new_check.status, "passing")
def test_handle_cpuload_check(self):
from checks.models import Check
url = "/api/v3/checkrunner/"
cpuload = baker.make_recipe(
"checks.cpuload_check",
warning_threshold=70,
error_threshold=90,
agent=self.agent,
)
# test failing warning
data = {"id": cpuload.id, "percent": 80}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=cpuload.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "warning")
# test failing error
data = {"id": cpuload.id, "percent": 95}
# reset check history
cpuload.history = []
cpuload.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=cpuload.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "error")
# test passing
data = {"id": cpuload.id, "percent": 50}
# reset check history
cpuload.history = []
cpuload.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=cpuload.id)
self.assertEqual(new_check.status, "passing")
# test warning threshold 0
cpuload.warning_threshold = 0
cpuload.save()
data = {"id": cpuload.id, "percent": 95}
# reset check history
cpuload.history = []
cpuload.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=cpuload.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "error")
# test error threshold 0
cpuload.warning_threshold = 50
cpuload.error_threshold = 0
cpuload.save()
data = {"id": cpuload.id, "percent": 95}
# reset check history
cpuload.history = []
cpuload.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=cpuload.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "warning")
def test_handle_memory_check(self):
from checks.models import Check
url = "/api/v3/checkrunner/"
memory = baker.make_recipe(
"checks.memory_check",
warning_threshold=70,
error_threshold=90,
agent=self.agent,
)
# test failing warning
data = {"id": memory.id, "percent": 80}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=memory.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "warning")
# test failing error
data = {"id": memory.id, "percent": 95}
# reset check history
memory.history = []
memory.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=memory.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "error")
# test passing
data = {"id": memory.id, "percent": 50}
# reset check history
memory.history = []
memory.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=memory.id)
self.assertEqual(new_check.status, "passing")
# test warning threshold 0
memory.warning_threshold = 0
memory.save()
data = {"id": memory.id, "percent": 95}
# reset check history
memory.history = []
memory.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=memory.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "error")
# test error threshold 0
memory.warning_threshold = 50
memory.error_threshold = 0
memory.save()
data = {"id": memory.id, "percent": 95}
# reset check history
memory.history = []
memory.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=memory.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "warning")
def test_handle_ping_check(self):
from checks.models import Check
url = "/api/v3/checkrunner/"
ping = baker.make_recipe(
"checks.ping_check", agent=self.agent, alert_severity="info"
)
# test failing info
data = {
"id": ping.id,
"output": "Reply from 192.168.1.27: Destination host unreachable",
"has_stdout": True,
"has_stderr": False,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=ping.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "info")
# test failing warning
data = {
"id": ping.id,
"output": "Reply from 192.168.1.27: Destination host unreachable",
"has_stdout": True,
"has_stderr": False,
}
ping.alert_severity = "warning"
ping.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=ping.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "warning")
# test failing error
data = {
"id": ping.id,
"output": "Reply from 192.168.1.27: Destination host unreachable",
"has_stdout": True,
"has_stderr": False,
}
ping.alert_severity = "error"
ping.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=ping.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "error")
# test failing error
data = {
"id": ping.id,
"output": "some output",
"has_stdout": False,
"has_stderr": True,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=ping.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "error")
# test passing
data = {
"id": ping.id,
"output": "Reply from 192.168.1.1: bytes=32 time<1ms TTL=64",
"has_stdout": True,
"has_stderr": False,
}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=ping.id)
self.assertEqual(new_check.status, "passing")
@patch("agents.models.Agent.nats_cmd")
def test_handle_winsvc_check(self, nats_cmd):
from checks.models import Check
url = "/api/v3/checkrunner/"
winsvc = baker.make_recipe(
"checks.winsvc_check", agent=self.agent, alert_severity="info"
)
# test passing running
data = {"id": winsvc.id, "exists": True, "status": "running"}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=winsvc.id)
self.assertEqual(new_check.status, "passing")
# test passing start pending
winsvc.pass_if_start_pending = True
winsvc.save()
data = {"id": winsvc.id, "exists": True, "status": "start_pending"}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=winsvc.id)
self.assertEqual(new_check.status, "passing")
# test failing no start
data = {"id": winsvc.id, "exists": True, "status": "not running"}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=winsvc.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "info")
# test failing and attempt start
winsvc.restart_if_stopped = True
winsvc.alert_severity = "warning"
winsvc.save()
nats_cmd.return_value = "timeout"
data = {"id": winsvc.id, "exists": True, "status": "not running"}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=winsvc.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "warning")
nats_cmd.assert_called()
nats_cmd.reset_mock()
# test failing and attempt start
winsvc.alert_severity = "error"
winsvc.save()
nats_cmd.return_value = {"success": False, "errormsg": "Some Error"}
data = {"id": winsvc.id, "exists": True, "status": "not running"}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=winsvc.id)
self.assertEqual(new_check.status, "failing")
self.assertEqual(new_check.alert_severity, "error")
nats_cmd.assert_called()
nats_cmd.reset_mock()
# test success and attempt start
nats_cmd.return_value = {"success": True}
data = {"id": winsvc.id, "exists": True, "status": "not running"}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=winsvc.id)
self.assertEqual(new_check.status, "passing")
nats_cmd.assert_called()
nats_cmd.reset_mock()
# test failing and service not exist
data = {"id": winsvc.id, "exists": False, "status": ""}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=winsvc.id)
self.assertEqual(new_check.status, "failing")
# test success and service not exist
winsvc.pass_if_svc_not_exist = True
winsvc.save()
data = {"id": winsvc.id, "exists": False, "status": ""}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=winsvc.id)
self.assertEqual(new_check.status, "passing")
def test_handle_eventlog_check(self):
from checks.models import Check
url = "/api/v3/checkrunner/"
eventlog = baker.make_recipe(
"checks.eventlog_check",
event_type="warning",
fail_when="contains",
event_id=123,
alert_severity="warning",
agent=self.agent,
)
data = {
"id": eventlog.id,
"log": [
{
"eventType": "warning",
"eventID": 150,
"source": "source",
"message": "a test message",
},
{
"eventType": "warning",
"eventID": 123,
"source": "source",
"message": "a test message",
},
{
"eventType": "error",
"eventID": 123,
"source": "source",
"message": "a test message",
},
],
}
# test failing when contains
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=eventlog.id)
self.assertEquals(new_check.alert_severity, "warning")
self.assertEquals(new_check.status, "failing")
# test passing when not contains and message
eventlog.event_message = "doesnt exist"
eventlog.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=eventlog.id)
self.assertEquals(new_check.status, "passing")
# test failing when not contains and message and source
eventlog.fail_when = "not_contains"
eventlog.alert_severity = "error"
eventlog.event_message = "doesnt exist"
eventlog.event_source = "doesnt exist"
eventlog.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=eventlog.id)
self.assertEquals(new_check.status, "failing")
self.assertEquals(new_check.alert_severity, "error")
# test passing when contains with source and message
eventlog.event_message = "test"
eventlog.event_source = "source"
eventlog.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=eventlog.id)
self.assertEquals(new_check.status, "passing")
# test failing with wildcard not contains and source
eventlog.event_id_is_wildcard = True
eventlog.event_source = "doesn't exist"
eventlog.event_message = ""
eventlog.event_id = 0
eventlog.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=eventlog.id)
self.assertEquals(new_check.status, "failing")
self.assertEquals(new_check.alert_severity, "error")
# test passing with wildcard contains
eventlog.event_source = ""
eventlog.event_message = ""
eventlog.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=eventlog.id)
self.assertEquals(new_check.status, "passing")
# test failing with wildcard contains and message
eventlog.fail_when = "contains"
eventlog.event_type = "error"
eventlog.alert_severity = "info"
eventlog.event_message = "test"
eventlog.event_source = ""
eventlog.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=eventlog.id)
self.assertEquals(new_check.status, "failing")
self.assertEquals(new_check.alert_severity, "info")
# test passing with wildcard not contains message and source
eventlog.event_message = "doesnt exist"
eventlog.event_source = "doesnt exist"
eventlog.save()
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
new_check = Check.objects.get(pk=eventlog.id)
self.assertEquals(new_check.status, "passing")

View File

@@ -59,7 +59,7 @@ class AddCheck(APIView):
if policy:
generate_agent_checks_from_policies_task.delay(policypk=policy.pk)
elif agent:
checks = agent.agentchecks.filter(
checks = agent.agentchecks.filter( # type: ignore
check_type=obj.check_type, managed_by_policy=True
)
@@ -149,7 +149,7 @@ class CheckHistory(APIView):
- djangotime.timedelta(days=request.data["timeFilter"]),
)
check_history = check.check_history.filter(timeFilter).order_by("-x")
check_history = check.check_history.filter(timeFilter).order_by("-x") # type: ignore
return Response(
CheckHistorySerializer(

View File

@@ -1,12 +1,9 @@
import datetime as dt
import os
import re
import subprocess
import uuid
import pytz
from django.conf import settings
from django.http import HttpResponse
from django.shortcuts import get_object_or_404
from django.utils import timezone as djangotime
from rest_framework.permissions import AllowAny
@@ -15,7 +12,7 @@ from rest_framework.views import APIView
from agents.models import Agent
from core.models import CoreSettings
from tacticalrmm.utils import notify_error
from tacticalrmm.utils import generate_installer_exe, notify_error
from .models import Client, Deployment, Site
from .serializers import (
@@ -183,99 +180,28 @@ class GenerateAgent(APIView):
d = get_object_or_404(Deployment, uid=uid)
go_bin = "/usr/local/rmmgo/go/bin/go"
if not os.path.exists(go_bin):
return notify_error("Missing golang")
api = f"https://{request.get_host()}"
inno = (
f"winagent-v{settings.LATEST_AGENT_VER}.exe"
if d.arch == "64"
else f"winagent-v{settings.LATEST_AGENT_VER}-x86.exe"
)
download_url = settings.DL_64 if d.arch == "64" else settings.DL_32
client = d.client.name.replace(" ", "").lower()
site = d.site.name.replace(" ", "").lower()
client = re.sub(r"([^a-zA-Z0-9]+)", "", client)
site = re.sub(r"([^a-zA-Z0-9]+)", "", site)
ext = ".exe" if d.arch == "64" else "-x86.exe"
file_name = f"rmm-{client}-{site}-{d.mon_type}{ext}"
exe = os.path.join(settings.EXE_DIR, file_name)
if os.path.exists(exe):
try:
os.remove(exe)
except:
pass
goarch = "amd64" if d.arch == "64" else "386"
cmd = [
"env",
"GOOS=windows",
f"GOARCH={goarch}",
go_bin,
"build",
f"-ldflags=\"-s -w -X 'main.Inno={inno}'",
f"-X 'main.Api={api}'",
f"-X 'main.Client={d.client.pk}'",
f"-X 'main.Site={d.site.pk}'",
f"-X 'main.Atype={d.mon_type}'",
f"-X 'main.Rdp={d.install_flags['rdp']}'",
f"-X 'main.Ping={d.install_flags['ping']}'",
f"-X 'main.Power={d.install_flags['power']}'",
f"-X 'main.DownloadUrl={download_url}'",
f"-X 'main.Token={d.token_key}'\"",
"-o",
exe,
]
gen = [
"env",
"GOOS=windows",
f"GOARCH={goarch}",
go_bin,
"generate",
]
try:
r1 = subprocess.run(
" ".join(gen),
capture_output=True,
shell=True,
cwd=os.path.join(settings.BASE_DIR, "core/goinstaller"),
)
except:
return notify_error("genfailed")
if r1.returncode != 0:
return notify_error("genfailed")
try:
r = subprocess.run(
" ".join(cmd),
capture_output=True,
shell=True,
cwd=os.path.join(settings.BASE_DIR, "core/goinstaller"),
)
except:
return notify_error("buildfailed")
if r.returncode != 0:
return notify_error("buildfailed")
if settings.DEBUG:
with open(exe, "rb") as f:
response = HttpResponse(
f.read(),
content_type="application/vnd.microsoft.portable-executable",
)
response["Content-Disposition"] = f"inline; filename={file_name}"
return response
else:
response = HttpResponse()
response["Content-Disposition"] = f"attachment; filename={file_name}"
response["X-Accel-Redirect"] = f"/private/exe/{file_name}"
return response
return generate_installer_exe(
file_name=f"rmm-{client}-{site}-{d.mon_type}{ext}",
goarch="amd64" if d.arch == "64" else "386",
inno=inno,
api=f"https://{request.get_host()}",
client_id=d.client.pk,
site_id=d.site.pk,
atype=d.mon_type,
rdp=d.install_flags["rdp"],
ping=d.install_flags["ping"],
power=d.install_flags["power"],
download_url=settings.DL_64 if d.arch == "64" else settings.DL_32,
token=d.token_key,
)

View File

@@ -0,0 +1,5 @@
module github.com/wh1te909/goinstaller
go 1.16
require github.com/josephspurrier/goversioninfo v1.2.0 // indirect

View File

@@ -0,0 +1,10 @@
github.com/akavel/rsrc v0.8.0 h1:zjWn7ukO9Kc5Q62DOJCcxGpXC18RawVtYAGdz2aLlfw=
github.com/akavel/rsrc v0.8.0/go.mod h1:uLoCtb9J+EyAqh+26kdrTgmzRBFPGOolLWKpdxkKq+c=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/josephspurrier/goversioninfo v1.2.0 h1:tpLHXAxLHKHg/dCU2AAYx08A4m+v9/CWg6+WUvTF4uQ=
github.com/josephspurrier/goversioninfo v1.2.0/go.mod h1:AGP2a+Y/OVJZ+s6XM4IwFUpkETwvn0orYurY8qpw1+0=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.6.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=

View File

@@ -6,6 +6,7 @@ import (
"flag"
"fmt"
"io"
"net"
"net/http"
"os"
"os/exec"
@@ -27,6 +28,18 @@ var (
DownloadUrl string
)
var netTransport = &http.Transport{
Dial: (&net.Dialer{
Timeout: 5 * time.Second,
}).Dial,
TLSHandshakeTimeout: 5 * time.Second,
}
var netClient = &http.Client{
Timeout: time.Second * 900,
Transport: netTransport,
}
func downloadAgent(filepath string) (err error) {
out, err := os.Create(filepath)
@@ -35,7 +48,7 @@ func downloadAgent(filepath string) (err error) {
}
defer out.Close()
resp, err := http.Get(DownloadUrl)
resp, err := netClient.Get(DownloadUrl)
if err != nil {
return err
}
@@ -59,7 +72,6 @@ func main() {
localMesh := flag.String("local-mesh", "", "Use local mesh agent")
silent := flag.Bool("silent", false, "Do not popup any message boxes during installation")
cert := flag.String("cert", "", "Path to ca.pem")
timeout := flag.String("timeout", "", "Timeout for subprocess calls")
flag.Parse()
var debug bool = false
@@ -93,10 +105,6 @@ func main() {
cmdArgs = append(cmdArgs, "-cert", *cert)
}
if len(strings.TrimSpace(*timeout)) != 0 {
cmdArgs = append(cmdArgs, "-timeout", *timeout)
}
if Rdp == "1" {
cmdArgs = append(cmdArgs, "-rdp")
}

View File

@@ -33,11 +33,20 @@ If (Get-Service $serviceName -ErrorAction SilentlyContinue) {
Try
{
Add-MpPreference -ExclusionPath 'C:\Program Files\TacticalAgent\*'
Add-MpPreference -ExclusionPath 'C:\Windows\Temp\winagent-v*.exe'
Add-MpPreference -ExclusionPath 'C:\Program Files\Mesh Agent\*'
Add-MpPreference -ExclusionPath 'C:\Windows\Temp\trmm*\*'
$DefenderStatus = Get-MpComputerStatus | select AntivirusEnabled
if ($DefenderStatus -match "True") {
Add-MpPreference -ExclusionPath 'C:\Program Files\TacticalAgent\*'
Add-MpPreference -ExclusionPath 'C:\Windows\Temp\winagent-v*.exe'
Add-MpPreference -ExclusionPath 'C:\Program Files\Mesh Agent\*'
Add-MpPreference -ExclusionPath 'C:\Windows\Temp\trmm*\*'
}
}
Catch {
# pass
}
Try
{
Invoke-WebRequest -Uri $downloadlink -OutFile $OutPath\$output
Start-Process -FilePath $OutPath\$output -ArgumentList ('/VERYSILENT /SUPPRESSMSGBOXES') -Wait
write-host ('Extracting...')

View File

@@ -63,6 +63,7 @@ def dashboard_info(request):
"show_community_scripts": request.user.show_community_scripts,
"dbl_click_action": request.user.agent_dblclick_action,
"default_agent_tbl_tab": request.user.default_agent_tbl_tab,
"agents_per_page": request.user.agents_per_page,
}
)

View File

@@ -0,0 +1,18 @@
# Generated by Django 3.1.7 on 2021-02-28 09:43
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0011_auto_20201119_0854'),
]
operations = [
migrations.AlterField(
model_name='pendingaction',
name='action_type',
field=models.CharField(blank=True, choices=[('schedreboot', 'Scheduled Reboot'), ('taskaction', 'Scheduled Task Action'), ('agentupdate', 'Agent Update'), ('chocoinstall', 'Chocolatey Software Install')], max_length=255, null=True),
),
]

View File

@@ -9,6 +9,7 @@ ACTION_TYPE_CHOICES = [
("schedreboot", "Scheduled Reboot"),
("taskaction", "Scheduled Task Action"),
("agentupdate", "Agent Update"),
("chocoinstall", "Chocolatey Software Install"),
]
AUDIT_ACTION_TYPE_CHOICES = [
@@ -249,9 +250,10 @@ class PendingAction(models.Model):
if self.action_type == "schedreboot":
obj = dt.datetime.strptime(self.details["time"], "%Y-%m-%d %H:%M:%S")
return dt.datetime.strftime(obj, "%B %d, %Y at %I:%M %p")
elif self.action_type == "taskaction" or self.action_type == "agentupdate":
return "Next agent check-in"
elif self.action_type == "chocoinstall":
return "ASAP"
@property
def description(self):
@@ -261,6 +263,9 @@ class PendingAction(models.Model):
elif self.action_type == "agentupdate":
return f"Agent update to {self.details['version']}"
elif self.action_type == "chocoinstall":
return f"{self.details['name']} software install"
elif self.action_type == "taskaction":
if self.details["action"] == "taskdelete":
return "Device pending task deletion"

View File

@@ -3,10 +3,9 @@ from unittest.mock import patch
from model_bakery import baker, seq
from logs.models import PendingAction
from tacticalrmm.test import TacticalTestCase
from .serializers import PendingActionSerializer
class TestAuditViews(TacticalTestCase):
def setUp(self):
@@ -177,63 +176,97 @@ class TestAuditViews(TacticalTestCase):
self.check_not_authenticated("post", url)
def test_agent_pending_actions(self):
agent = baker.make_recipe("agents.agent")
pending_actions = baker.make(
def test_get_pending_actions(self):
url = "/logs/pendingactions/"
agent1 = baker.make_recipe("agents.online_agent")
agent2 = baker.make_recipe("agents.online_agent")
baker.make(
"logs.PendingAction",
agent=agent,
_quantity=6,
agent=agent1,
action_type="chocoinstall",
details={"name": "googlechrome", "output": None, "installed": False},
_quantity=12,
)
baker.make(
"logs.PendingAction",
agent=agent2,
action_type="chocoinstall",
status="completed",
details={"name": "adobereader", "output": None, "installed": False},
_quantity=14,
)
url = f"/logs/{agent.pk}/pendingactions/"
resp = self.client.get(url, format="json")
serializer = PendingActionSerializer(pending_actions, many=True)
data = {"showCompleted": False}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertEqual(len(r.data["actions"]), 12) # type: ignore
self.assertEqual(r.data["completed_count"], 14) # type: ignore
self.assertEqual(r.data["total"], 26) # type: ignore
self.assertEqual(resp.status_code, 200)
self.assertEqual(len(resp.data), 6)
self.assertEqual(resp.data, serializer.data)
PendingAction.objects.filter(action_type="chocoinstall").update(
status="completed"
)
data = {"showCompleted": True}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertEqual(len(r.data["actions"]), 26) # type: ignore
self.assertEqual(r.data["completed_count"], 26) # type: ignore
self.assertEqual(r.data["total"], 26) # type: ignore
self.check_not_authenticated("get", url)
data = {"showCompleted": True, "agentPK": agent1.pk}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.assertEqual(len(r.data["actions"]), 12) # type: ignore
self.assertEqual(r.data["completed_count"], 26) # type: ignore
self.assertEqual(r.data["total"], 26) # type: ignore
def test_all_pending_actions(self):
url = "/logs/allpendingactions/"
agent = baker.make_recipe("agents.agent")
pending_actions = baker.make("logs.PendingAction", agent=agent, _quantity=6)
resp = self.client.get(url, format="json")
serializer = PendingActionSerializer(pending_actions, many=True)
self.assertEqual(resp.status_code, 200)
self.assertEqual(len(resp.data), 6)
self.assertEqual(resp.data, serializer.data)
self.check_not_authenticated("get", url)
self.check_not_authenticated("patch", url)
@patch("agents.models.Agent.nats_cmd")
def test_cancel_pending_action(self, nats_cmd):
url = "/logs/cancelpendingaction/"
# TODO fix this TypeError: Object of type coroutine is not JSON serializable
""" agent = baker.make("agents.Agent", version="1.1.1")
pending_action = baker.make(
nats_cmd.return_value = "ok"
url = "/logs/pendingactions/"
agent = baker.make_recipe("agents.online_agent")
action = baker.make(
"logs.PendingAction",
agent=agent,
action_type="schedreboot",
details={
"time": "2021-01-13 18:20:00",
"taskname": "TacticalRMM_SchedReboot_wYzCCDVXlc",
},
)
data = {"pk": pending_action.id}
resp = self.client.delete(url, data, format="json")
self.assertEqual(resp.status_code, 200)
data = {"pk": action.pk} # type: ignore
r = self.client.delete(url, data, format="json")
self.assertEqual(r.status_code, 200)
nats_data = {
"func": "delschedtask",
"schedtaskpayload": {"name": "TacticalRMM_SchedReboot_wYzCCDVXlc"},
}
nats_cmd.assert_called_with(nats_data, timeout=10)
# try request again and it should fail since pending action doesn't exist
resp = self.client.delete(url, data, format="json")
self.assertEqual(resp.status_code, 404) """
# try request again and it should 404 since pending action doesn't exist
r = self.client.delete(url, data, format="json")
self.assertEqual(r.status_code, 404)
nats_cmd.reset_mock()
action2 = baker.make(
"logs.PendingAction",
agent=agent,
action_type="schedreboot",
details={
"time": "2021-01-13 18:20:00",
"taskname": "TacticalRMM_SchedReboot_wYzCCDVXlc",
},
)
data = {"pk": action2.pk} # type: ignore
nats_cmd.return_value = "error deleting sched task"
r = self.client.delete(url, data, format="json")
self.assertEqual(r.status_code, 400)
self.assertEqual(r.data, "error deleting sched task") # type: ignore
self.check_not_authenticated("delete", url)

View File

@@ -3,11 +3,9 @@ from django.urls import path
from . import views
urlpatterns = [
path("pendingactions/", views.PendingActions.as_view()),
path("auditlogs/", views.GetAuditLogs.as_view()),
path("auditlogs/optionsfilter/", views.FilterOptionsAuditLog.as_view()),
path("<int:pk>/pendingactions/", views.agent_pending_actions),
path("allpendingactions/", views.all_pending_actions),
path("cancelpendingaction/", views.cancel_pending_action),
path("debuglog/<mode>/<hostname>/<order>/", views.debug_log),
path("downloadlog/", views.download_log),
]

View File

@@ -106,34 +106,38 @@ class FilterOptionsAuditLog(APIView):
return Response("error", status=status.HTTP_400_BAD_REQUEST)
@api_view()
def agent_pending_actions(request, pk):
action = PendingAction.objects.filter(agent__pk=pk)
return Response(PendingActionSerializer(action, many=True).data)
class PendingActions(APIView):
def patch(self, request):
status_filter = "completed" if request.data["showCompleted"] else "pending"
if "agentPK" in request.data.keys():
actions = PendingAction.objects.filter(
agent__pk=request.data["agentPK"], status=status_filter
)
else:
actions = PendingAction.objects.filter(status=status_filter).select_related(
"agent"
)
@api_view()
def all_pending_actions(request):
actions = PendingAction.objects.all().select_related("agent")
return Response(PendingActionSerializer(actions, many=True).data)
ret = {
"actions": PendingActionSerializer(actions, many=True).data,
"completed_count": PendingAction.objects.filter(status="completed").count(),
"total": PendingAction.objects.count(),
}
return Response(ret)
def delete(self, request):
action = get_object_or_404(PendingAction, pk=request.data["pk"])
nats_data = {
"func": "delschedtask",
"schedtaskpayload": {"name": action.details["taskname"]},
}
r = asyncio.run(action.agent.nats_cmd(nats_data, timeout=10))
if r != "ok":
return notify_error(r)
@api_view(["DELETE"])
def cancel_pending_action(request):
action = get_object_or_404(PendingAction, pk=request.data["pk"])
if not action.agent.has_gotasks:
return notify_error("Requires agent version 1.1.1 or greater")
nats_data = {
"func": "delschedtask",
"schedtaskpayload": {"name": action.details["taskname"]},
}
r = asyncio.run(action.agent.nats_cmd(nats_data, timeout=10))
if r != "ok":
return notify_error(r)
action.delete()
return Response(f"{action.agent.hostname}: {action.description} was cancelled")
action.delete()
return Response(f"{action.agent.hostname}: {action.description} was cancelled")
@api_view()

View File

@@ -9,52 +9,28 @@ class TestNatsAPIViews(TacticalTestCase):
self.authenticate()
self.setup_coresettings()
def test_nats_wmi(self):
url = "/natsapi/wmi/"
baker.make_recipe("agents.online_agent", version="1.2.0", _quantity=14)
def test_nats_agents(self):
baker.make_recipe(
"agents.online_agent", version=settings.LATEST_AGENT_VER, _quantity=3
"agents.online_agent", version=settings.LATEST_AGENT_VER, _quantity=14
)
baker.make_recipe(
"agents.offline_agent", version=settings.LATEST_AGENT_VER, _quantity=6
)
baker.make_recipe(
"agents.overdue_agent", version=settings.LATEST_AGENT_VER, _quantity=5
)
baker.make_recipe("agents.online_agent", version="1.1.12", _quantity=7)
url = "/natsapi/online/agents/"
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
self.assertEqual(len(r.json()["agent_ids"]), 17)
self.assertEqual(len(r.json()["agent_ids"]), 14)
def test_natscheckin_patch(self):
from logs.models import PendingAction
url = "/natsapi/checkin/"
agent_updated = baker.make_recipe("agents.agent", version="1.3.0")
PendingAction.objects.create(
agent=agent_updated,
action_type="agentupdate",
details={
"url": agent_updated.winagent_dl,
"version": agent_updated.version,
"inno": agent_updated.win_inno_exe,
},
)
action = agent_updated.pendingactions.filter(action_type="agentupdate").first()
self.assertEqual(action.status, "pending")
# test agent failed to update and still on same version
payload = {
"func": "hello",
"agent_id": agent_updated.agent_id,
"version": "1.3.0",
}
r = self.client.patch(url, payload, format="json")
url = "/natsapi/offline/agents/"
r = self.client.get(url)
self.assertEqual(r.status_code, 200)
action = agent_updated.pendingactions.filter(action_type="agentupdate").first()
self.assertEqual(action.status, "pending")
self.assertEqual(len(r.json()["agent_ids"]), 11)
# test agent successful update
payload["version"] = settings.LATEST_AGENT_VER
r = self.client.patch(url, payload, format="json")
self.assertEqual(r.status_code, 200)
action = agent_updated.pendingactions.filter(action_type="agentupdate").first()
self.assertEqual(action.status, "completed")
action.delete()
url = "/natsapi/asdjaksdasd/agents/"
r = self.client.get(url)
self.assertEqual(r.status_code, 400)

View File

@@ -4,12 +4,6 @@ from . import views
urlpatterns = [
path("natsinfo/", views.nats_info),
path("checkin/", views.NatsCheckIn.as_view()),
path("syncmesh/", views.SyncMeshNodeID.as_view()),
path("winupdates/", views.NatsWinUpdates.as_view()),
path("choco/", views.NatsChoco.as_view()),
path("wmi/", views.NatsWMI.as_view()),
path("offline/", views.OfflineAgents.as_view()),
path("<str:stat>/agents/", views.NatsAgents.as_view()),
path("logcrash/", views.LogCrash.as_view()),
path("superseded/", views.SupersededWinUpdate.as_view()),
]

View File

@@ -1,12 +1,7 @@
import asyncio
import time
from typing import List
from django.conf import settings
from django.shortcuts import get_object_or_404
from django.utils import timezone as djangotime
from loguru import logger
from packaging import version as pyver
from rest_framework.decorators import (
api_view,
authentication_classes,
@@ -16,16 +11,7 @@ from rest_framework.response import Response
from rest_framework.views import APIView
from agents.models import Agent
from agents.serializers import WinAgentSerializer
from agents.tasks import (
agent_recovery_email_task,
agent_recovery_sms_task,
handle_agent_recovery_task,
)
from checks.utils import bytes2human
from software.models import InstalledSoftware
from tacticalrmm.utils import SoftwareList, filter_software, notify_error
from winupdate.models import WinUpdate
from tacticalrmm.utils import notify_error
logger.configure(**settings.LOG_CONFIG)
@@ -37,277 +23,38 @@ def nats_info(request):
return Response({"user": "tacticalrmm", "password": settings.SECRET_KEY})
class NatsCheckIn(APIView):
class NatsAgents(APIView):
authentication_classes = [] # type: ignore
permission_classes = [] # type: ignore
authentication_classes = []
permission_classes = []
def get(self, request, stat: str):
if stat not in ["online", "offline"]:
return notify_error("invalid request")
def patch(self, request):
updated = False
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
if pyver.parse(request.data["version"]) > pyver.parse(
agent.version
) or pyver.parse(request.data["version"]) == pyver.parse(
settings.LATEST_AGENT_VER
):
updated = True
agent.version = request.data["version"]
agent.last_seen = djangotime.now()
agent.save(update_fields=["version", "last_seen"])
# change agent update pending status to completed if agent has just updated
if (
updated
and agent.pendingactions.filter(
action_type="agentupdate", status="pending"
).exists()
):
agent.pendingactions.filter(
action_type="agentupdate", status="pending"
).update(status="completed")
# handles any alerting actions
agent.handle_alert(checkin=True)
recovery = agent.recoveryactions.filter(last_run=None).last()
if recovery is not None:
recovery.last_run = djangotime.now()
recovery.save(update_fields=["last_run"])
handle_agent_recovery_task.delay(pk=recovery.pk)
return Response("ok")
# get any pending actions
if agent.pendingactions.filter(status="pending").exists():
agent.handle_pending_actions()
return Response("ok")
def put(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
serializer = WinAgentSerializer(instance=agent, data=request.data, partial=True)
if request.data["func"] == "disks":
disks = request.data["disks"]
new = []
for disk in disks:
tmp = {}
for _, _ in disk.items():
tmp["device"] = disk["device"]
tmp["fstype"] = disk["fstype"]
tmp["total"] = bytes2human(disk["total"])
tmp["used"] = bytes2human(disk["used"])
tmp["free"] = bytes2human(disk["free"])
tmp["percent"] = int(disk["percent"])
new.append(tmp)
serializer.is_valid(raise_exception=True)
serializer.save(disks=new)
return Response("ok")
if request.data["func"] == "loggedonuser":
if request.data["logged_in_username"] != "None":
serializer.is_valid(raise_exception=True)
serializer.save(last_logged_in_user=request.data["logged_in_username"])
return Response("ok")
if request.data["func"] == "software":
raw: SoftwareList = request.data["software"]
if not isinstance(raw, list):
return notify_error("err")
sw = filter_software(raw)
if not InstalledSoftware.objects.filter(agent=agent).exists():
InstalledSoftware(agent=agent, software=sw).save()
else:
s = agent.installedsoftware_set.first()
s.software = sw
s.save(update_fields=["software"])
return Response("ok")
serializer.is_valid(raise_exception=True)
serializer.save()
return Response("ok")
# called once during tacticalagent windows service startup
def post(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
if not agent.choco_installed:
asyncio.run(agent.nats_cmd({"func": "installchoco"}, wait=False))
time.sleep(0.5)
asyncio.run(agent.nats_cmd({"func": "getwinupdates"}, wait=False))
return Response("ok")
class SyncMeshNodeID(APIView):
authentication_classes = []
permission_classes = []
def post(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
if agent.mesh_node_id != request.data["nodeid"]:
agent.mesh_node_id = request.data["nodeid"]
agent.save(update_fields=["mesh_node_id"])
return Response("ok")
class NatsChoco(APIView):
authentication_classes = []
permission_classes = []
def post(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
agent.choco_installed = request.data["installed"]
agent.save(update_fields=["choco_installed"])
return Response("ok")
class NatsWinUpdates(APIView):
authentication_classes = []
permission_classes = []
def put(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
reboot_policy: str = agent.get_patch_policy().reboot_after_install
reboot = False
if reboot_policy == "always":
reboot = True
if request.data["needs_reboot"]:
if reboot_policy == "required":
reboot = True
elif reboot_policy == "never":
agent.needs_reboot = True
agent.save(update_fields=["needs_reboot"])
if reboot:
asyncio.run(agent.nats_cmd({"func": "rebootnow"}, wait=False))
logger.info(f"{agent.hostname} is rebooting after updates were installed.")
agent.delete_superseded_updates()
return Response("ok")
def patch(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
u = agent.winupdates.filter(guid=request.data["guid"]).last()
success: bool = request.data["success"]
if success:
u.result = "success"
u.downloaded = True
u.installed = True
u.date_installed = djangotime.now()
u.save(
update_fields=[
"result",
"downloaded",
"installed",
"date_installed",
]
)
ret: list[str] = []
agents = Agent.objects.only(
"pk", "agent_id", "version", "last_seen", "overdue_time", "offline_time"
)
if stat == "online":
ret = [i.agent_id for i in agents if i.status == "online"]
else:
u.result = "failed"
u.save(update_fields=["result"])
ret = [i.agent_id for i in agents if i.status != "online"]
agent.delete_superseded_updates()
return Response("ok")
def post(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
updates = request.data["wua_updates"]
for update in updates:
if agent.winupdates.filter(guid=update["guid"]).exists():
u = agent.winupdates.filter(guid=update["guid"]).last()
u.downloaded = update["downloaded"]
u.installed = update["installed"]
u.save(update_fields=["downloaded", "installed"])
else:
try:
kb = "KB" + update["kb_article_ids"][0]
except:
continue
WinUpdate(
agent=agent,
guid=update["guid"],
kb=kb,
title=update["title"],
installed=update["installed"],
downloaded=update["downloaded"],
description=update["description"],
severity=update["severity"],
categories=update["categories"],
category_ids=update["category_ids"],
kb_article_ids=update["kb_article_ids"],
more_info_urls=update["more_info_urls"],
support_url=update["support_url"],
revision_number=update["revision_number"],
).save()
agent.delete_superseded_updates()
# more superseded updates cleanup
if pyver.parse(agent.version) <= pyver.parse("1.4.2"):
for u in agent.winupdates.filter(
date_installed__isnull=True, result="failed"
).exclude(installed=True):
u.delete()
return Response("ok")
class SupersededWinUpdate(APIView):
authentication_classes = []
permission_classes = []
def post(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
updates = agent.winupdates.filter(guid=request.data["guid"])
for u in updates:
u.delete()
return Response("ok")
class NatsWMI(APIView):
authentication_classes = []
permission_classes = []
def get(self, request):
agents = Agent.objects.only(
"pk", "agent_id", "version", "last_seen", "overdue_time", "offline_time"
)
online: List[str] = [
i.agent_id
for i in agents
if pyver.parse(i.version) >= pyver.parse("1.2.0") and i.status == "online"
]
return Response({"agent_ids": online})
class OfflineAgents(APIView):
authentication_classes = []
permission_classes = []
def get(self, request):
agents = Agent.objects.only(
"pk", "agent_id", "version", "last_seen", "overdue_time", "offline_time"
)
offline: List[str] = [
i.agent_id for i in agents if i.has_nats and i.status != "online"
]
return Response({"agent_ids": offline})
return Response({"agent_ids": ret})
class LogCrash(APIView):
authentication_classes = []
permission_classes = []
authentication_classes = [] # type: ignore
permission_classes = [] # type: ignore
def post(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agentid"])
agent.last_seen = djangotime.now()
agent.save(update_fields=["last_seen"])
if hasattr(settings, "DEBUGTEST") and settings.DEBUGTEST:
logger.info(
f"Detected crashed tacticalagent service on {agent.hostname} v{agent.version}, attempting recovery"
)
return Response("ok")

View File

@@ -3,4 +3,7 @@ Werkzeug
django-extensions
mkdocs
mkdocs-material
pymdown-extensions
pymdown-extensions
Pygments
isort
mypy

View File

@@ -201,5 +201,12 @@
"name": "Display Message To User",
"description": "Displays a popup message to the currently logged on user",
"shell": "powershell"
},
{
"filename": "VerifyAntivirus.ps1",
"submittedBy": "https://github.com/beejayzed",
"name": "Verify Antivirus Status",
"description": "Verify and display status for all installed Antiviruses",
"shell": "powershell"
}
]
]

View File

@@ -1,16 +1,11 @@
from django.contrib import admin
from .models import ChocoLog, ChocoSoftware, InstalledSoftware
from .models import ChocoSoftware, InstalledSoftware
class ChocoAdmin(admin.ModelAdmin):
readonly_fields = ("added",)
class ChocoLogAdmin(admin.ModelAdmin):
readonly_fields = ("time",)
admin.site.register(ChocoSoftware, ChocoAdmin)
admin.site.register(ChocoLog, ChocoLogAdmin)
admin.site.register(InstalledSoftware)

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,25 @@
from django.core.management.base import BaseCommand
from agents.models import Agent
class Command(BaseCommand):
help = "Find all agents that have a certain software installed"
def add_arguments(self, parser):
parser.add_argument("name", type=str)
def handle(self, *args, **kwargs):
search = kwargs["name"].lower()
agents = Agent.objects.all()
for agent in agents:
sw = agent.installedsoftware_set.first().software
for i in sw:
if search in i["name"].lower():
self.stdout.write(
self.style.SUCCESS(
f"Found {i['name']} installed on {agent.hostname}"
)
)
break

View File

@@ -18,4 +18,4 @@ class Command(BaseCommand):
ChocoSoftware.objects.all().delete()
ChocoSoftware(chocos=chocos).save()
self.stdout.write("Chocos saved to db")
self.stdout.write(self.style.SUCCESS("Chocos saved to db"))

View File

@@ -0,0 +1,16 @@
# Generated by Django 3.1.7 on 2021-03-01 21:43
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('software', '0002_auto_20200810_0544'),
]
operations = [
migrations.DeleteModel(
name='ChocoLog',
),
]

View File

@@ -8,23 +8,7 @@ class ChocoSoftware(models.Model):
added = models.DateTimeField(auto_now_add=True)
def __str__(self):
from .serializers import ChocoSoftwareSerializer
return (
str(len(ChocoSoftwareSerializer(self).data["chocos"])) + f" - {self.added}"
)
class ChocoLog(models.Model):
agent = models.ForeignKey(Agent, related_name="chocolog", on_delete=models.CASCADE)
name = models.CharField(max_length=255)
version = models.CharField(max_length=255)
message = models.TextField()
installed = models.BooleanField(default=False)
time = models.DateTimeField(auto_now_add=True)
def __str__(self):
return f"{self.agent.hostname} | {self.name} | {self.time}"
return f"{len(self.chocos)} - {self.added}"
class InstalledSoftware(models.Model):

View File

@@ -1,12 +1,6 @@
from rest_framework import serializers
from .models import ChocoSoftware, InstalledSoftware
class ChocoSoftwareSerializer(serializers.ModelSerializer):
class Meta:
model = ChocoSoftware
fields = "__all__"
from .models import InstalledSoftware
class InstalledSoftwareSerializer(serializers.ModelSerializer):

View File

@@ -1,57 +0,0 @@
import asyncio
from django.conf import settings
from loguru import logger
from agents.models import Agent
from tacticalrmm.celery import app
from .models import ChocoLog
logger.configure(**settings.LOG_CONFIG)
@app.task
def install_program(pk, name, version):
agent = Agent.objects.get(pk=pk)
nats_data = {
"func": "installwithchoco",
"choco_prog_name": name,
"choco_prog_ver": version,
}
r: str = asyncio.run(agent.nats_cmd(nats_data, timeout=915))
if r == "timeout":
logger.error(f"Failed to install {name} {version} on {agent.salt_id}: timeout")
return
try:
output = r.lower()
except Exception as e:
logger.error(f"Failed to install {name} {version} on {agent.salt_id}: {e}")
return
success = [
"install",
"of",
name.lower(),
"was",
"successful",
"installed",
]
duplicate = [name.lower(), "already", "installed", "--force", "reinstall"]
installed = False
if all(x in output for x in success):
installed = True
logger.info(f"Successfully installed {name} {version} on {agent.salt_id}")
elif all(x in output for x in duplicate):
logger.warning(f"Already installed: {name} {version} on {agent.salt_id}")
else:
logger.error(f"Something went wrong - {name} {version} on {agent.salt_id}")
ChocoLog(
agent=agent, name=name, version=version, message=output, installed=installed
).save()
return "ok"

View File

@@ -1,10 +1,12 @@
from unittest.mock import patch
import json
import os
from django.conf import settings
from model_bakery import baker
from tacticalrmm.test import TacticalTestCase
from .models import ChocoLog
from .models import ChocoSoftware
from .serializers import InstalledSoftwareSerializer
@@ -15,29 +17,17 @@ class TestSoftwareViews(TacticalTestCase):
def test_chocos_get(self):
url = "/software/chocos/"
resp = self.client.get(url, format="json")
with open(os.path.join(settings.BASE_DIR, "software/chocos.json")) as f:
chocos = json.load(f)
if ChocoSoftware.objects.exists():
ChocoSoftware.objects.all().delete()
ChocoSoftware(chocos=chocos).save()
resp = self.client.get(url)
self.assertEqual(resp.status_code, 200)
self.check_not_authenticated("get", url)
@patch("software.tasks.install_program.delay")
def test_chocos_install(self, install_program):
url = "/software/install/"
agent = baker.make_recipe("agents.agent")
# test a call where agent doesn't exist
invalid_data = {"pk": 500, "name": "Test Software", "version": "1.0.0"}
resp = self.client.post(url, invalid_data, format="json")
self.assertEqual(resp.status_code, 404)
data = {"pk": agent.pk, "name": "Test Software", "version": "1.0.0"}
resp = self.client.post(url, data, format="json")
self.assertEqual(resp.status_code, 200)
install_program.assert_called_with(data["pk"], data["name"], data["version"])
self.check_not_authenticated("post", url)
def test_chocos_installed(self):
# test a call where agent doesn't exist
resp = self.client.get("/software/installed/500/", format="json")
@@ -64,26 +54,3 @@ class TestSoftwareViews(TacticalTestCase):
self.assertEquals(resp.data, serializer.data)
self.check_not_authenticated("get", url)
class TestSoftwareTasks(TacticalTestCase):
def setUp(self):
self.setup_coresettings()
@patch("agents.models.Agent.nats_cmd")
def test_install_program(self, nats_cmd):
from .tasks import install_program
agent = baker.make_recipe("agents.agent")
nats_cmd.return_value = "install of git was successful"
_ = install_program(agent.pk, "git", "2.3.4")
nats_cmd.assert_called_with(
{
"func": "installwithchoco",
"choco_prog_name": "git",
"choco_prog_ver": "2.3.4",
},
timeout=915,
)
self.assertTrue(ChocoLog.objects.filter(agent=agent, name="git").exists())

View File

@@ -2,31 +2,51 @@ import asyncio
from typing import Any
from django.shortcuts import get_object_or_404
from packaging import version as pyver
from rest_framework.decorators import api_view
from rest_framework.response import Response
from agents.models import Agent
from logs.models import PendingAction
from tacticalrmm.utils import filter_software, notify_error
from .models import ChocoSoftware, InstalledSoftware
from .serializers import ChocoSoftwareSerializer, InstalledSoftwareSerializer
from .tasks import install_program
from .serializers import InstalledSoftwareSerializer
@api_view()
def chocos(request):
chocos = ChocoSoftware.objects.last()
return Response(ChocoSoftwareSerializer(chocos).data["chocos"])
return Response(ChocoSoftware.objects.last().chocos)
@api_view(["POST"])
def install(request):
pk = request.data["pk"]
agent = get_object_or_404(Agent, pk=pk)
agent = get_object_or_404(Agent, pk=request.data["pk"])
if pyver.parse(agent.version) < pyver.parse("1.4.8"):
return notify_error("Requires agent v1.4.8")
name = request.data["name"]
version = request.data["version"]
install_program.delay(pk, name, version)
return Response(f"{name} will be installed shortly on {agent.hostname}")
action = PendingAction.objects.create(
agent=agent,
action_type="chocoinstall",
details={"name": name, "output": None, "installed": False},
)
nats_data = {
"func": "installwithchoco",
"choco_prog_name": name,
"pending_action_pk": action.pk,
}
r = asyncio.run(agent.nats_cmd(nats_data, timeout=2))
if r != "ok":
action.delete()
return notify_error("Unable to contact the agent")
return Response(
f"{name} will be installed shortly on {agent.hostname}. Check the Pending Actions menu to see the status/output"
)
@api_view()
@@ -55,7 +75,7 @@ def refresh_installed(request, pk):
if not InstalledSoftware.objects.filter(agent=agent).exists():
InstalledSoftware(agent=agent, software=sw).save()
else:
s = agent.installedsoftware_set.first()
s = agent.installedsoftware_set.first() # type: ignore
s.software = sw
s.save(update_fields=["software"])

View File

@@ -14,11 +14,11 @@ app = Celery(
broker="redis://" + settings.REDIS_HOST,
)
# app.config_from_object('django.conf:settings', namespace='CELERY')
app.broker_url = "redis://" + settings.REDIS_HOST + ":6379"
app.result_backend = "redis://" + settings.REDIS_HOST + ":6379"
app.accept_content = ["application/json"]
app.result_serializer = "json"
app.task_serializer = "json"
app.broker_url = "redis://" + settings.REDIS_HOST + ":6379" # type: ignore
app.result_backend = "redis://" + settings.REDIS_HOST + ":6379" # type: ignore
app.accept_content = ["application/json"] # type: ignore
app.result_serializer = "json" # type: ignore
app.task_serializer = "json" # type: ignore
app.conf.task_track_started = True
app.autodiscover_tasks()

View File

@@ -51,6 +51,8 @@ class AuditMiddleware:
# Here's our fully formed and authenticated (or not, depending on credentials) request
request = view.initialize_request(request)
except (AttributeError, TypeError):
from rest_framework.views import APIView
# Can't initialize the request from this view. Fallback to using default permission classes
request = APIView().initialize_request(request)

View File

@@ -2,7 +2,7 @@ import os
from datetime import timedelta
from pathlib import Path
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
BASE_DIR = Path(__file__).resolve().parent.parent
SCRIPTS_DIR = "/srv/salt/scripts"
@@ -15,16 +15,16 @@ EXE_DIR = os.path.join(BASE_DIR, "tacticalrmm/private/exe")
AUTH_USER_MODEL = "accounts.User"
# latest release
TRMM_VERSION = "0.4.15"
TRMM_VERSION = "0.4.20"
# bump this version everytime vue code is changed
# to alert user they need to manually refresh their browser
APP_VER = "0.0.114"
APP_VER = "0.0.118"
# https://github.com/wh1te909/rmmagent
LATEST_AGENT_VER = "1.4.6"
LATEST_AGENT_VER = "1.4.8"
MESH_VER = "0.7.72"
MESH_VER = "0.7.79"
# for the update script, bump when need to recreate venv or npm install
PIP_VER = "10"
@@ -39,11 +39,9 @@ except ImportError:
pass
INSTALLED_APPS = [
"django.contrib.admin",
"django.contrib.auth",
"django.contrib.contenttypes",
"django.contrib.sessions",
"django.contrib.messages",
"django.contrib.staticfiles",
"rest_framework",
"rest_framework.authtoken",
@@ -66,10 +64,20 @@ INSTALLED_APPS = [
"natsapi",
]
if not "TRAVIS" in os.environ and not "AZPIPELINE" in os.environ:
if DEBUG:
if not "AZPIPELINE" in os.environ:
if DEBUG: # type: ignore
INSTALLED_APPS += ("django_extensions",)
if "AZPIPELINE" in os.environ:
ADMIN_ENABLED = False
if ADMIN_ENABLED: # type: ignore
INSTALLED_APPS += (
"django.contrib.admin",
"django.contrib.messages",
)
MIDDLEWARE = [
"django.middleware.security.SecurityMiddleware",
"django.contrib.sessions.middleware.SessionMiddleware",
@@ -78,10 +86,11 @@ MIDDLEWARE = [
"django.middleware.csrf.CsrfViewMiddleware",
"django.contrib.auth.middleware.AuthenticationMiddleware",
"tacticalrmm.middleware.AuditMiddleware",
"django.contrib.messages.middleware.MessageMiddleware",
"django.middleware.clickjacking.XFrameOptionsMiddleware",
]
if ADMIN_ENABLED: # type: ignore
MIDDLEWARE += ("django.contrib.messages.middleware.MessageMiddleware",)
REST_KNOX = {
"TOKEN_TTL": timedelta(hours=5),

View File

@@ -1,12 +1,10 @@
from django.conf import settings
from django.contrib import admin
from django.urls import include, path
from knox import views as knox_views
from accounts.views import CheckCreds, LoginView
urlpatterns = [
path(settings.ADMIN_URL, admin.site.urls),
path("checkcreds/", CheckCreds.as_view()),
path("login/", LoginView.as_view()),
path("logout/", knox_views.LogoutView.as_view()),
@@ -27,3 +25,8 @@ urlpatterns = [
path("accounts/", include("accounts.urls")),
path("natsapi/", include("natsapi.urls")),
]
if hasattr(settings, "ADMIN_ENABLED") and settings.ADMIN_ENABLED:
from django.contrib import admin
urlpatterns += (path(settings.ADMIN_URL, admin.site.urls),)

View File

@@ -3,10 +3,11 @@ import os
import string
import subprocess
import time
from typing import Dict, List
from typing import Union
import pytz
from django.conf import settings
from django.http import HttpResponse
from loguru import logger
from rest_framework import status
from rest_framework.response import Response
@@ -17,7 +18,7 @@ logger.configure(**settings.LOG_CONFIG)
notify_error = lambda msg: Response(msg, status=status.HTTP_400_BAD_REQUEST)
SoftwareList = List[Dict[str, str]]
SoftwareList = list[dict[str, str]]
WEEK_DAYS = {
"Sunday": 0x1,
@@ -30,13 +31,137 @@ WEEK_DAYS = {
}
def generate_installer_exe(
file_name: str,
goarch: str,
inno: str,
api: str,
client_id: int,
site_id: int,
atype: str,
rdp: int,
ping: int,
power: int,
download_url: str,
token: str,
) -> Union[Response, HttpResponse]:
go_bin = "/usr/local/rmmgo/go/bin/go"
if not os.path.exists(go_bin):
return Response("nogolang", status=status.HTTP_409_CONFLICT)
exe = os.path.join(settings.EXE_DIR, file_name)
if os.path.exists(exe):
try:
os.remove(exe)
except Exception as e:
logger.error(str(e))
cmd = [
"env",
"CGO_ENABLED=0",
"GOOS=windows",
f"GOARCH={goarch}",
go_bin,
"build",
f"-ldflags=\"-s -w -X 'main.Inno={inno}'",
f"-X 'main.Api={api}'",
f"-X 'main.Client={client_id}'",
f"-X 'main.Site={site_id}'",
f"-X 'main.Atype={atype}'",
f"-X 'main.Rdp={rdp}'",
f"-X 'main.Ping={ping}'",
f"-X 'main.Power={power}'",
f"-X 'main.DownloadUrl={download_url}'",
f"-X 'main.Token={token}'\"",
"-o",
exe,
]
build_error = False
gen_error = False
gen = [
"env",
"GOOS=windows",
"CGO_ENABLED=0",
f"GOARCH={goarch}",
go_bin,
"generate",
]
try:
r1 = subprocess.run(
" ".join(gen),
capture_output=True,
shell=True,
cwd=os.path.join(settings.BASE_DIR, "core/goinstaller"),
)
except Exception as e:
gen_error = True
logger.error(str(e))
return Response("genfailed", status=status.HTTP_413_REQUEST_ENTITY_TOO_LARGE)
if r1.returncode != 0:
gen_error = True
if r1.stdout:
logger.error(r1.stdout.decode("utf-8", errors="ignore"))
if r1.stderr:
logger.error(r1.stderr.decode("utf-8", errors="ignore"))
logger.error(f"Go build failed with return code {r1.returncode}")
if gen_error:
return Response("genfailed", status=status.HTTP_413_REQUEST_ENTITY_TOO_LARGE)
try:
r = subprocess.run(
" ".join(cmd),
capture_output=True,
shell=True,
cwd=os.path.join(settings.BASE_DIR, "core/goinstaller"),
)
except Exception as e:
build_error = True
logger.error(str(e))
return Response("buildfailed", status=status.HTTP_412_PRECONDITION_FAILED)
if r.returncode != 0:
build_error = True
if r.stdout:
logger.error(r.stdout.decode("utf-8", errors="ignore"))
if r.stderr:
logger.error(r.stderr.decode("utf-8", errors="ignore"))
logger.error(f"Go build failed with return code {r.returncode}")
if build_error:
return Response("buildfailed", status=status.HTTP_412_PRECONDITION_FAILED)
if settings.DEBUG:
with open(exe, "rb") as f:
response = HttpResponse(
f.read(),
content_type="application/vnd.microsoft.portable-executable",
)
response["Content-Disposition"] = f"inline; filename={file_name}"
return response
else:
response = HttpResponse()
response["Content-Disposition"] = f"attachment; filename={file_name}"
response["X-Accel-Redirect"] = f"/private/exe/{file_name}"
return response
def get_default_timezone():
from core.models import CoreSettings
return pytz.timezone(CoreSettings.objects.first().default_time_zone)
def get_bit_days(days: List[str]) -> int:
def get_bit_days(days: list[str]) -> int:
bit_days = 0
for day in days:
bit_days |= WEEK_DAYS.get(day)

View File

@@ -1,7 +1,6 @@
import asyncio
import datetime as dt
import time
from typing import List
import pytz
from django.conf import settings
@@ -126,7 +125,7 @@ def check_agent_update_schedule_task():
@app.task
def bulk_install_updates_task(pks: List[int]) -> None:
def bulk_install_updates_task(pks: list[int]) -> None:
q = Agent.objects.filter(pk__in=pks)
agents = [i for i in q if pyver.parse(i.version) >= pyver.parse("1.3.0")]
chunks = (agents[i : i + 40] for i in range(0, len(agents), 40))
@@ -147,7 +146,7 @@ def bulk_install_updates_task(pks: List[int]) -> None:
@app.task
def bulk_check_for_updates_task(pks: List[int]) -> None:
def bulk_check_for_updates_task(pks: list[int]) -> None:
q = Agent.objects.filter(pk__in=pks)
agents = [i for i in q if pyver.parse(i.version) >= pyver.parse("1.3.0")]
chunks = (agents[i : i + 40] for i in range(0, len(agents), 40))

View File

@@ -1,13 +1,6 @@
#!/bin/bash
#####################################################
POSTGRES_USER="changeme"
POSTGRES_PW="hunter2"
#####################################################
SCRIPT_VERSION="9"
SCRIPT_VERSION="10"
SCRIPT_URL='https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/backup.sh'
GREEN='\033[0;32m'
@@ -31,11 +24,9 @@ if [ $EUID -eq 0 ]; then
exit 1
fi
if [[ "$POSTGRES_USER" == "changeme" || "$POSTGRES_PW" == "hunter2" ]]; then
printf >&2 "${RED}You must change the postgres username/password at the top of this file.${NC}\n"
printf >&2 "${RED}Check the github readme for where to find them.${NC}\n"
exit 1
fi
POSTGRES_USER=$(grep -w USER /rmm/api/tacticalrmm/tacticalrmm/local_settings.py | sed 's/^.*: //' | sed 's/.//' | sed -r 's/.{2}$//')
POSTGRES_PW=$(grep -w PASSWORD /rmm/api/tacticalrmm/tacticalrmm/local_settings.py | sed 's/^.*: //' | sed 's/.//' | sed -r 's/.{2}$//')
if [ ! -d /rmmbackups ]; then
sudo mkdir /rmmbackups

View File

@@ -106,6 +106,7 @@ MESH_SITE = 'https://${MESH_HOST}'
MESH_TOKEN_KEY = '${MESH_TOKEN}'
REDIS_HOST = '${REDIS_HOST}'
MESH_WS_URL = 'ws://${MESH_CONTAINER}:443'
ADMIN_ENABLED = False
EOF
)"

View File

@@ -1,14 +0,0 @@
#!/usr/bin/env sh
set -e
npm run build
cd .vuepress/dist
git init
git add -A
git commit -m 'deploy'
git push -f git@github.com:wh1te909/tacticalrmm.git develop:gh-pages
cd -

26
docs/docs/backup.md Normal file
View File

@@ -0,0 +1,26 @@
# Backing up the RMM
A backup script is provided as a quick and easy way to back up all settings into a single file that can be moved to another server.
Download the backup script:
```bash
wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/backup.sh
```
From the Web UI, click **Tools > Server Maintenance**
Choose **Prune DB Tables** from the dropdown and check the `Audit Log` and `Pending Actions` checkboxes, and then click **Submit**
Doing a prune first before running the backup will significantly speed up the postgres vacuum command that is run during backup.
Run the backup script
```bash
chmod +x backup.sh
./backup.sh
```
The backup tar file will be saved in `/rmmbackups` with the following format:
`rmm-backup-CURRENTDATETIME.tar`

29
docs/docs/contributing.md Normal file
View File

@@ -0,0 +1,29 @@
# Contributing
### Contributing to the docs
Docs are built with [MKDocs for Material](https://squidfunk.github.io/mkdocs-material/)
To set up a local environment for adding to or editing this documentation site:
```bash
mkdir ~/rmmdocs && cd ~/rmmdocs
git clone https://github.com/wh1te909/tacticalrmm.git .
python3 -m venv env
source env/bin/activate
pip install --upgrade pip
pip install --upgrade setuptools wheel
pip install -r api/tacticalrmm/requirements-dev.txt
cd docs
mkdocs serve
```
Open your browser and navigate to `http://yourserverip:8005`
Add/edit markdown files in the `docs/docs` folder and you'll see live changes at the url above.
Edit `docs/mkdocs.yml` to edit structure and add new files.
Full mkdocs documentation [here](https://squidfunk.github.io/mkdocs-material/getting-started/)
Once finished, [create a pull request](https://www.digitalocean.com/community/tutorials/how-to-create-a-pull-request-on-github) to the `develop` branch for review.

43
docs/docs/faq.md Normal file
View File

@@ -0,0 +1,43 @@
# FAQ
#### How do I do X feature in the web UI?
A lot of features in the web UI are hidden behind right-click menus; almost everything has a right-click menu, so if you don't see something, try right-clicking on it.
#### Where are the Linux / Mac agents?
Linux / Mac agents are currently under development.
#### Can I run Tactical RMM locally behind NAT without exposing anything to the internet?
Yes, you will just need to set up local DNS for the 3 subdomains, either by editing the hosts file on all your agents or through a local DNS server.
#### I am locked out of the web UI. How do I reset my password?
SSH into your server and run:
```bash
/rmm/api/env/bin/python /rmm/api/tacticalrmm/manage.py reset_password <username>
```
<br/>
#### How do I reset a password or 2-factor token?
From the web UI, click **Settings > User Administration** and then right-click on a user:<br/><br/>
![reset2fa](images/reset2fa.png)
<br/><br/>
Or from the command line:<br/>
```bash
/rmm/api/env/bin/python /rmm/api/tacticalrmm/manage.py reset_2fa <username>
```
Then simply log out of the web UI; the next time the user logs in, they will be redirected to the 2FA setup page, which will present a barcode to be scanned with the authenticator app.
<br/>
#### How do I recover my MeshCentral login credentials?
From Tactical's web UI: *Settings > Global Settings > MeshCentral*
Copy the username then ssh into the server and run:
```bash
cd /meshcentral/
sudo systemctl stop meshcentral
node node_modules/meshcentral --resetaccount <username> --pass <newpassword>
sudo systemctl start meshcentral
```

View File

@@ -0,0 +1,12 @@
# Maintenance Mode
Enabling maintenance mode for an agent will prevent any overdue/check/task email/sms alerts from being sent.
It will also prevent clients/sites/agents from showing up as red in the dashboard if they have any failing checks or are overdue.
To enable maintenance mode for all agents in a client/site, **Right Click** on a client / site and choose **Enable Maintenance Mode**
![maint_mode](../images/maint_mode.png)
To enable maintenance mode for a single agent, **Right Click** on the agent and choose **Enable Maintenance Mode**

View File

@@ -0,0 +1,21 @@
# Remote Background
To access: **Right click** on an agent > **Remote Background**
#### Services Tab
Right click on a service to show the context menu where you can start/stop/restart services:
![services_contextmenu](../images/services_contextmenu.png)
Click *Service Details* to bring up the details tab where you can edit more service options:
![service_detail](../images/service_detail.png)
#### Processes Tab
A very basic task manager that shows real time process usage.
**Right click** on a process to end the task:
![taskmanager](../images/taskmanager.png)

View File

@@ -0,0 +1,8 @@
# User Interface Preferences
Click on your username at the top right of the dashboard > Preferences:
![user_prefs](../images/user_prefs.png)
![user_prefs2](../images/user_prefs2.png)

(Binary image files added under `docs/docs/images/`, including `dnstxt.png`; previews not shown.)
View File

@@ -6,10 +6,9 @@
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/python/black)
Tactical RMM is a remote monitoring & management tool for Windows computers, built with Django, Vue and Golang.
It uses an [agent](https://github.com/wh1te909/rmmagent) written in golang and integrates with [MeshCentral](https://github.com/Ylianst/MeshCentral)
It uses an [agent](https://github.com/wh1te909/rmmagent) written in Golang and integrates with [MeshCentral](https://github.com/Ylianst/MeshCentral)
## [LIVE DEMO](https://rmm.xlawgaming.com/)
The demo database resets every hour. A lot of features are disabled for obvious reasons due to the nature of this app.
*Tactical RMM is currently in alpha and subject to breaking changes. Use in production at your own risk.*

View File

@@ -0,0 +1,66 @@
# Installing an agent
!!!warning
You must add antivirus exclusions for the tactical agent.<br/>
Any decent AV will flag the agent as a virus, since it technically is one due to the nature of this software.<br/>
Adding the following exclusions will make sure everything works, including agent updates:<br/>
`C:\Program Files\TacticalAgent\*`<br/>
`C:\Program Files\Mesh Agent\*`<br/>
`C:\Windows\Temp\winagent-v*.exe`<br/>
`C:\Windows\Temp\trmm\*`<br/>
#### Dynamically generated executable
The generated exe is simply a wrapper around the Manual install method, using a single exe/command without the need to pass any command line flags to the installer.<br/><br/>
All it does is download the generic installer from the agent's github [release page](https://github.com/wh1te909/rmmagent/releases) and call it using predefined command line args that you choose from the web UI.<br/><br/>
It "bakes" the command line args into the executable.<br/><br/>
From the UI, click **Agents > Install Agent**<br/>
You can also **right click on a site > Install Agent**. This will automatically fill in the client/site dropdown for you.<br/><br/>
![siteagentinstall](images/siteagentinstall.png)
#### Powershell
The PowerShell method is very similar to the generated exe in that it simply downloads the installer from GitHub and calls the exe for you.
#### Manual
The manual installation method requires you to first download the generic installer and call it using command line args.<br/><br/>
This is useful for scripting the installation using Group Policy or some other batch deployment method.<br/>
!!!tip
You can reuse the installer for any of the deployment methods; you don't need to create a new installer for each new agent.<br/>
The installer stays valid for however long you set the token expiry time when generating an agent.
<br/>
#### Using a deployment link
Creating a deployment link is the recommended way to deploy agents.<br/><br/>
The main benefit of this method is that the executable is generated only when the deployment download link is accessed, whereas with the other methods it's generated right away and the agent's version is hardcoded into the exe.<br/><br/>
Using a deployment link means you don't have to worry about installing an older version of the agent, which will fail to install if you have updated your RMM to a version that is no longer compatible with an older installer you might have lying around.<br/><br/>
To create a deployment, from the web UI click **Agents > Manage Deployments**.<br/><br/>
![managedeployments](images/managedeployments.png)
!!!tip
Create a client/site named "Default" and create a deployment for it with a very long expiry to have a generic installer that can be deployed anytime at any client/site.<br/><br/>
You can then move the agent into the correct client/site from the web UI after it's been installed.
Copy/paste the download link from the deployment into your browser. It will take a few seconds to dynamically generate the executable and then your browser will automatically download the exe.
#### Optional installer args
The following optional arguments can be passed to any of the installation method executables:
```
-log debug
```
Prints very verbose logging during the agent install; useful for troubleshooting.
```
-silent
```
This will not pop up any message boxes during install, neither error messages nor the "Installation was successful" message box that appears at the end of a successful install.
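The flags can be combined in a single run; for example (the installer filename below is only an illustration, yours will differ):
```
rmm-clientname-sitename-workstation.exe -silent -log debug
```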

View File

@@ -0,0 +1,72 @@
# Docker Setup
- Install docker and docker-compose
- Obtain a valid wildcard certificate for your domain. If certificates are not provided, a self-signed certificate will be generated and most agent functions won't work. See below for how to generate a free Let's Encrypt certificate.
## Generate certificates with certbot
Install Certbot
```
sudo apt-get install certbot
```
Generate the wildcard certificate. Add the DNS entry for domain validation. Replace `example.com` with your root domain
```
sudo certbot certonly --manual -d *.example.com --agree-tos --no-bootstrap --manual-public-ip-logging-ok --preferred-challenges dns
```
## Configure DNS and firewall
You will need to add DNS entries so that the three subdomains resolve to the IP of the docker host. There is a reverse proxy running that will route the hostnames to the correct container. On the host, you will need to ensure the firewall is open on tcp ports 80, 443 and 4222.
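As a minimal sketch, assuming the docker host uses `ufw` (adjust for whatever firewall you actually run), opening those ports could look like:
```
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
sudo ufw allow proto tcp from any to any port 4222
sudo ufw enable && sudo ufw reload
```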
## Setting up the environment
Download the docker-compose and .env.example files onto the host you wish to install on
```
wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/docker/docker-compose.yml
wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/docker/.env.example
mv .env.example .env
```
Change the values in .env to match your environment.
If you are supplying certificates through Let's Encrypt or another source, see the section below about base64 encoding the certificate files.
## Base64 encoding certificates to pass as env variables
Use the commands below to add the correct values to the .env file.
Running this command multiple times will add redundant entries, so those will need to be removed.
The Let's Encrypt cert paths are below. Replace ${rootdomain} with your own.
public key
`/etc/letsencrypt/live/${rootdomain}/fullchain.pem`
private key
`/etc/letsencrypt/live/${rootdomain}/privkey.pem`
```
echo "CERT_PUB_KEY=$(sudo base64 -w 0 /path/to/pub/key)" >> .env
echo "CERT_PRIV_KEY=$(sudo base64 -w 0 /path/to/priv/key)" >> .env
```
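To confirm each key was only added once (in case the commands above were run more than once), a quick check:
```
grep -c "CERT_PUB_KEY" .env
grep -c "CERT_PRIV_KEY" .env
```
Each should print `1`; if not, edit `.env` and remove the duplicate lines.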
## Starting the environment
Run the below command to start the environment.
```
sudo docker-compose up -d
```
Removing the `-d` will start the containers in the foreground, which is useful for debugging.
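When running detached, you can still follow a container's logs; for example, using the `tactical-backend` service name from the command further below:
```
sudo docker-compose logs -f tactical-backend
```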
## Get MeshCentral EXE download link
Run the below command to get the download link for the MeshCentral exe. This exe needs to be uploaded during the first successful sign-in.
```
sudo docker-compose exec tactical-backend python manage.py get_mesh_exe_url
```

docs/docs/install_server.md Normal file

@@ -0,0 +1,122 @@
# Installation
## Minimum requirements
- A fresh linux VM running either Ubuntu 20.04 or Debian 10, with a minimum of 2GB RAM.<br/>
!!!warning
The provided install script assumes a fresh server with no software installed on it. Attempting to run it on an existing server with other services **will** break things and the install will fail.<br/><br/>
The install script has been tested on the following public cloud providers: DigitalOcean, Linode, Vultr, BuyVM (highly recommended), Hetzner, AWS, Google Cloud and Azure, as well as behind NAT on Hyper-V, Proxmox and ESXi.
- A real domain is needed to generate a Let's Encrypt wildcard cert.<br/>If you cannot afford to purchase a domain ($12 a year), you can get one for free at [freenom.com](https://www.freenom.com/)<br/><br/>
- A TOTP based authenticator app. Some popular ones are Google Authenticator, Authy and Microsoft Authenticator.<br/><br/>
## Install
#### Run updates and setup the linux user
SSH into the server as **root**.<br/><br/>
Install the prerequisites and latest updates<br/>
```bash
apt update
apt install -y wget curl sudo
apt -y upgrade
```
If a new kernel is installed, then reboot the server with the `reboot` command<br/><br/>
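On Ubuntu (and usually Debian), you can tell whether a reboot is needed by checking for the marker file the package manager leaves behind:
```bash
# present after an update that requires a reboot, such as a new kernel
[ -f /var/run/reboot-required ] && echo "reboot required" || echo "no reboot needed"
```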
Create a linux user to run the rmm and add it to the sudoers group.<br/>For this example we'll be using a user named `tactical`, but feel free to use whatever name you want.
```bash
adduser tactical
usermod -a -G sudo tactical
```
!!!tip
[Enable passwordless sudo to make your life easier](https://linuxconfig.org/configure-sudo-without-password-on-ubuntu-20-04-focal-fossa-linux)
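A minimal sketch of what that looks like for the `tactical` user created above; edit the sudoers file with `visudo` rather than directly:
```bash
# run: sudo visudo
# then add this line at the end of the file (assumes the username `tactical` from above)
tactical ALL=(ALL) NOPASSWD: ALL
```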
#### Setup the firewall (optional but highly recommended)
!!!info
Skip this step if your VM is __not__ publicly exposed to the world, e.g. running behind NAT. You should set up the firewall rules in your router instead (ports 22, 443 and 4222 TCP).
```bash
ufw default deny incoming
ufw default allow outgoing
ufw allow https
ufw allow proto tcp from any to any port 4222
```
!!!info
SSH is only required for you to remotely log in and do basic linux server administration for your rmm. It is not needed for any agent communication.<br/>
Allow ssh from everywhere (__not__ recommended)
```bash
ufw allow ssh
```
Allow ssh only from allowed IPs (__highly__ recommended)
```bash
ufw allow proto tcp from X.X.X.X to any port 22
ufw allow proto tcp from X.X.X.X to any port 22
```
Enable and activate the firewall
```bash
ufw enable && ufw reload
```
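You can confirm the active rule set afterwards with:
```bash
sudo ufw status verbose
```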
#### Create the A records
We'll be using `example.com` as our domain for this example.
!!!info
The RMM uses 3 different sites: the Vue frontend, e.g. `rmm.example.com`, which is where you'll be accessing your RMM from the browser; the REST backend, e.g. `api.example.com`; and MeshCentral, e.g. `mesh.example.com`
Get the public IP of your server with `curl https://icanhazip.tacticalrmm.io`<br/>
Open the DNS manager of wherever the domain you purchased is hosted.<br/>
Create 3 A records: `rmm`, `api` and `mesh` and point them to the public IP of your server:
![arecords](images/arecords.png)
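Before moving on, it's worth confirming all three records resolve to your server's public IP (replace `example.com` with your domain):
```bash
dig +short rmm.example.com api.example.com mesh.example.com
```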
#### Run the install script
Switch to the `tactical` user
```bash
su - tactical
```
Download and run the install script
```bash
wget https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/install.sh
chmod +x install.sh
./install.sh
```
Answer the initial questions when prompted. Replace `example.com` with your domain.
![questions](images/install_questions.png)
#### Deploy the TXT record in your DNS manager:
!!!warning
TXT records can take anywhere from 1 minute to a few hours to propagate, depending on your DNS provider.<br/>
Verify the TXT record has propagated before pressing Enter.<br/>
A quick way to check is with the following command:<br/> `dig -t txt _acme-challenge.example.com`
![txtrecord](images/txtrecord.png)
![dnstxt](images/dnstxt.png)
Create a login for the RMM web UI:
![rmmlogin](images/rmmlogin.png)
A bunch of URLs / usernames / passwords will be printed out at the end of the install script. **Save these somewhere safe.** If you didn't, [here's how to recover them](faq.md#how-do-i-recover-my-meshcentral-login-credentials)
Copy the URL for the mesh agent exe (`https://mesh.example.com/agentinvite?c=......`), paste it in your browser and download the mesh agent:
![meshagentdl](images/meshagentdl.png)
Navigate to `https://rmm.example.com` and login with the username/password you created during install.<br/><br/>
Once logged in, you will be redirected to the initial setup page.<br/><br/>
Create your first client/site, choose the default timezone and then upload the mesh agent you just downloaded.

docs/docs/license.md Normal file

@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2019-present wh1te909
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
