Compare commits

...

115 Commits

Author SHA1 Message Date
wh1te909
9ab915a08b Release 0.2.23 2021-01-14 02:43:56 +00:00
wh1te909
e26fbf0328 bump versions 2021-01-14 02:29:14 +00:00
wh1te909
d9a52c4a2a update reqs 2021-01-14 02:27:40 +00:00
wh1te909
7b2ec90de9 feat: double-click agent action #232 2021-01-14 02:21:08 +00:00
wh1te909
d310bf8bbf add community scripts from dinger #242 2021-01-14 01:17:58 +00:00
wh1te909
2abc6cc939 partially fix sort 2021-01-14 00:01:08 +00:00
sadnub
56d4e694a2 fix annotations and error for the check chart 2021-01-13 18:43:09 -05:00
wh1te909
5f002c9cdc bump mesh 2021-01-13 23:35:14 +00:00
wh1te909
759daf4b4a add wording 2021-01-13 23:35:01 +00:00
wh1te909
3a8d9568e3 split some tasks into chunks to reduce load 2021-01-13 22:26:54 +00:00
wh1te909
ff22a9d94a fix deployments in docker 2021-01-13 22:19:09 +00:00
sadnub
a6e42d5374 fix removing pendingactions that are outstanding 2021-01-13 13:21:09 -05:00
wh1te909
a2f74e0488 add natsapi flags 2021-01-12 21:14:43 +00:00
wh1te909
ee44240569 black 2021-01-12 21:06:44 +00:00
wh1te909
d0828744a2 update nginx conf
(cherry picked from commit bf61e27f8a)
2021-01-12 06:38:52 +00:00
wh1te909
6e2e576b29 start natsapi 2021-01-12 06:32:00 +00:00
wh1te909
bf61e27f8a update nginx conf 2021-01-12 03:02:03 +00:00
Tragic Bronson
c441c30b46 Merge pull request #243 from sadnub/develop
Move Check Runs from Audit to its own table
2021-01-11 00:29:59 -08:00
Tragic Bronson
0e741230ea Merge pull request #242 from dinger1986/develop
Added some scripts checks etc
2021-01-11 00:29:47 -08:00
sadnub
1bfe9ac2db complete other pending actions with same task if task is deleted 2021-01-10 20:19:38 -05:00
sadnub
6812e72348 fix process sorting 2021-01-10 19:35:39 -05:00
sadnub
b6449d2f5b black 2021-01-10 16:33:10 -05:00
sadnub
7e3ea20dce add some tests and bug fixes 2021-01-10 16:27:48 -05:00
sadnub
c9d6fe9dcd allow returning all check data 2021-01-10 15:14:02 -05:00
sadnub
4a649a6b8b black 2021-01-10 14:47:34 -05:00
sadnub
8fef184963 add check history graph for cpu, memory, and diskspace 2021-01-10 14:15:05 -05:00
sadnub
69583ca3c0 docker dev fixes 2021-01-10 13:17:49 -05:00
dinger1986
6038a68e91 Win Defender exclusions for Tactical 2021-01-10 17:56:12 +00:00
dinger1986
fa8bd8db87 Manually reinstall Mesh just incase 2021-01-10 17:54:41 +00:00
dinger1986
18b4f0ed0f Runs DNS check on host as defined 2021-01-10 17:53:53 +00:00
dinger1986
461f9d66c9 Disable Faststartup on Windows 10 2021-01-10 17:51:33 +00:00
dinger1986
2155103c7a Check Win Defender for detections etc 2021-01-10 17:51:06 +00:00
dinger1986
c9a6839c45 Clears Win Defender log files 2021-01-10 17:50:13 +00:00
dinger1986
9fbe331a80 Allows the following Apps access by Win Defender 2021-01-10 17:49:36 +00:00
dinger1986
a56389c4ce Sync time with DC 2021-01-10 17:46:47 +00:00
dinger1986
64656784cb Powershell Speedtest 2021-01-10 17:46:00 +00:00
dinger1986
6eff2c181e Install RDP and change power config 2021-01-10 17:44:23 +00:00
dinger1986
1aa48c6d62 Install OpenSSH on PCs 2021-01-10 17:42:11 +00:00
dinger1986
c7ca1a346d Enable Windows Defender and set preferences 2021-01-10 17:40:06 +00:00
dinger1986
fa0ec7b502 check Duplicati Backup is running properly 2021-01-10 17:38:06 +00:00
dinger1986
768438c136 Checks disks for errors reported in event viewer 2021-01-10 17:36:42 +00:00
dinger1986
9badea0b3c Update DiskStatus.ps1
Checks local disks for errors reported in event viewer within the last 24 hours
2021-01-10 17:35:50 +00:00
dinger1986
43263a1650 Add files via upload 2021-01-10 17:33:48 +00:00
wh1te909
821e02dc75 update mesh docker conf 2021-01-10 00:20:44 +00:00
wh1te909
ed011ecf28 remove old mesh overrides #217 2021-01-10 00:15:11 +00:00
wh1te909
d861de4c2f update community scripts 2021-01-09 22:26:02 +00:00
Tragic Bronson
3a3b2449dc Merge pull request #241 from RVL-Solutions/develop
Create Windows10Upgrade.ps1
2021-01-09 14:12:05 -08:00
Ruben van Leusden
d2614406ca Create Windows10Upgrade.ps1
Shared by Kyt through Discord
2021-01-08 22:20:33 +01:00
Tragic Bronson
0798d098ae Merge pull request #238 from wh1te909/revert-235-master
Revert "Create Windows10Upgrade.ps1"
2021-01-08 10:38:33 -08:00
Tragic Bronson
dab7ddc2bb Revert "Create Windows10Upgrade.ps1" 2021-01-08 10:36:42 -08:00
Tragic Bronson
081a96e281 Merge pull request #235 from RVL-Solutions/master
Create Windows10Upgrade.ps1
2021-01-08 10:36:19 -08:00
wh1te909
a7dd881d79 Release 0.2.22 2021-01-08 18:16:17 +00:00
wh1te909
8134d5e24d remove threading 2021-01-08 18:15:55 +00:00
Ruben van Leusden
ba6756cd45 Create Windows10Upgrade.ps1 2021-01-06 23:19:14 +01:00
Tragic Bronson
5d8fce21ac Merge pull request #230 from wh1te909/dependabot/npm_and_yarn/web/axios-0.21.1
Bump axios from 0.21.0 to 0.21.1 in /web
2021-01-05 13:51:18 -08:00
dependabot[bot]
e7e4a5bcd4 Bump axios from 0.21.0 to 0.21.1 in /web
Bumps [axios](https://github.com/axios/axios) from 0.21.0 to 0.21.1.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v0.21.1/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v0.21.0...v0.21.1)

Signed-off-by: dependabot[bot] <support@github.com>
2021-01-05 15:54:54 +00:00
wh1te909
55f33357ea Release 0.2.21 2021-01-05 08:55:54 +00:00
wh1te909
90568bba31 bump versions 2021-01-05 08:55:08 +00:00
wh1te909
5d6e2dc2e4 feat: add send script results by email #212 2021-01-05 08:52:17 +00:00
sadnub
6bb33f2559 fix unassigned scripts not show if not categories are present 2021-01-04 20:22:42 -05:00
wh1te909
ced92554ed update community scripts 2021-01-04 22:00:17 +00:00
Tragic Bronson
dff3383158 Merge pull request #228 from azulskyknight/patch-2
Create SetHighPerformancePowerProfile.ps1
2021-01-04 13:42:20 -08:00
Tragic Bronson
bf03c89cb2 Merge pull request #227 from azulskyknight/patch-1
Create ResetHighPerformancePowerProfiletoDefaults.ps1
2021-01-04 13:42:10 -08:00
azulskyknight
9f1484bbef Create SetHighPerformancePowerProfile.ps1
Script sets the High Performance Power profile to the active power profile.
Use this to keep machines from falling asleep.
2021-01-04 13:21:00 -07:00
azulskyknight
3899680e26 Create ResetHighPerformancePowerProfiletoDefaults.ps1
Script resets monitor, disk, standby, and hibernate timers in the default High Performance power profile to their default values.
It also re-indexes the AC and DC power profiles into their default order.
2021-01-04 13:19:03 -07:00
sadnub
6bb2eb25a1 sort script folders alphabetically and fix showing community scripts when no user scripts present 2021-01-03 21:01:50 -05:00
sadnub
f8dfd8edb3 Make pip copy the binaries versus symlink them in dev env 2021-01-03 20:15:40 -05:00
sadnub
042be624a3 Update .dockerignore 2021-01-03 15:16:13 -05:00
sadnub
6bafa4c79a fix mesh init on dev 2021-01-03 15:15:43 -05:00
wh1te909
58b42fac5c Release 0.2.20 2021-01-03 09:13:28 +00:00
wh1te909
3b47b9558a let python calculate default threadpool workers based on cpu count 2021-01-03 09:12:38 +00:00
wh1te909
ccf9636296 Release 0.2.19 2021-01-02 09:34:12 +00:00
wh1te909
96942719f2 bump versions 2021-01-02 09:32:04 +00:00
wh1te909
69cf1c1adc update quasar 2021-01-02 07:38:33 +00:00
wh1te909
d77cba40b8 black 2021-01-02 07:26:34 +00:00
wh1te909
968735b555 fix scroll 2021-01-02 07:21:10 +00:00
wh1te909
ceed9d29eb task changes 2021-01-02 07:20:52 +00:00
sadnub
41329039ee add .env example 2021-01-02 00:09:56 -05:00
sadnub
f68b102ca8 Add Dev Containers 2021-01-02 00:05:54 -05:00
wh1te909
fa36e54298 change agent update 2021-01-02 01:30:51 +00:00
wh1te909
b689f57435 black 2021-01-01 00:51:44 +00:00
sadnub
885fa0ff56 add api tests to core app 2020-12-31 17:18:25 -05:00
Tragic Bronson
303acb72a3 Merge pull request #225 from sadnub/develop
add folder view to script manager
2020-12-31 13:12:33 -08:00
sadnub
b2a46cd0cd add folder view to script manager 2020-12-31 15:46:44 -05:00
wh1te909
5a5ecb3ee3 install curl/wget first fixes #224 2020-12-30 19:04:14 +00:00
wh1te909
60b4ab6a63 fix logging 2020-12-22 05:15:44 +00:00
wh1te909
e4b096a08f fix logging 2020-12-22 05:14:44 +00:00
wh1te909
343f55049b prevent duplicate cpu/mem checks from being created 2020-12-19 20:38:22 +00:00
wh1te909
6b46025261 Release 0.2.18 2020-12-19 08:44:45 +00:00
wh1te909
5ea503f23e bump version 2020-12-19 08:43:47 +00:00
wh1te909
ce95f9ac23 add codestyle to tests 2020-12-19 08:24:47 +00:00
wh1te909
c3fb87501b black 2020-12-19 08:20:12 +00:00
wh1te909
dc6a343612 bump mesh 2020-12-19 07:55:39 +00:00
wh1te909
3a61053957 update reqs 2020-12-19 07:50:32 +00:00
wh1te909
570129e4d4 add debian 10 to readme 2020-12-19 07:50:05 +00:00
wh1te909
3315c7045f if ubuntu, force 20.04 2020-12-19 07:45:21 +00:00
wh1te909
5ae50e242c always run npm install during update 2020-12-18 21:59:23 +00:00
Tragic Bronson
bbcf449719 Merge pull request #214 from mckinnon81/debian
Updated install.sh for Debian
2020-12-18 13:56:14 -08:00
Matthew McKinnon
aab10f7184 Removed certbot test-cert. Not needed 2020-12-18 08:32:40 +10:00
Matthew McKinnon
8d43488cb8 Updated install.sh for Debian
Updated api\tacticalrmm\accounts\views.py valid_window=10
2020-12-18 08:28:01 +10:00
Tragic Bronson
0a9c647e19 Merge pull request #211 from sadnub/develop
Fix default policies
2020-12-16 13:51:37 -08:00
wh1te909
40db5d4aa8 remove debug print 2020-12-16 21:50:43 +00:00
Josh
9254532baa fix applying default policies in certain situations 2020-12-16 20:38:36 +00:00
Josh
7abed47cf0 Merge branch 'develop' of https://github.com/wh1te909/tacticalrmm into develop 2020-12-16 19:08:12 +00:00
Tragic Bronson
5c6ac758f7 Merge pull request #210 from mckinnon81/scripts
Fixed Paths in ClearFirefoxCache.ps1 & ClearGoogleChromeCache.ps1
2020-12-16 09:36:33 -08:00
Matthew McKinnon
007677962c Fixed Paths in ClearFirefoxCache.ps1 & ClearGoogleChromeCache.ps1 2020-12-16 22:32:04 +10:00
wh1te909
9c4aeab64a back to develop 2020-12-16 10:47:05 +00:00
wh1te909
48e6fc0efe test coveralls 2 2020-12-16 10:41:39 +00:00
wh1te909
c8be713d11 test coveralls 2020-12-16 10:38:00 +00:00
wh1te909
ae887c8648 switch to branch head for coveralls 2020-12-16 10:20:50 +00:00
wh1te909
5daac2531b add accounts tests for new settings 2020-12-16 10:09:58 +00:00
wh1te909
68def00327 fix tests 2020-12-16 09:40:36 +00:00
wh1te909
67e7976710 pipelines attempt 2 2020-12-16 09:25:28 +00:00
wh1te909
35747e937e try to get pipelines to fail 2020-12-16 09:10:53 +00:00
sadnub
c84a9d07b1 tactical-cli for managing docker installations 2020-12-15 13:41:03 -05:00
128 changed files with 4760 additions and 1341 deletions

View File

@@ -0,0 +1,27 @@
COMPOSE_PROJECT_NAME=trmm
IMAGE_REPO=tacticalrmm/
VERSION=latest
# tactical credentials (Used to login to dashboard)
TRMM_USER=tactical
TRMM_PASS=tactical
# dns settings
APP_HOST=rmm.example.com
API_HOST=api.example.com
MESH_HOST=mesh.example.com
# mesh settings
MESH_USER=tactical
MESH_PASS=tactical
MONGODB_USER=mongouser
MONGODB_PASSWORD=mongopass
# database settings
POSTGRES_USER=postgres
POSTGRES_PASS=postgrespass
# DEV SETTINGS
APP_PORT=8080
API_PORT=8000

View File

@@ -0,0 +1,28 @@
FROM python:3.8-slim
ENV TACTICAL_DIR /opt/tactical
ENV TACTICAL_GO_DIR /usr/local/rmmgo
ENV TACTICAL_READY_FILE ${TACTICAL_DIR}/tmp/tactical.ready
ENV WORKSPACE_DIR /workspace
ENV TACTICAL_USER tactical
ENV VIRTUAL_ENV ${WORKSPACE_DIR}/api/tacticalrmm/env
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1
EXPOSE 8000
RUN groupadd -g 1000 tactical && \
useradd -u 1000 -g 1000 tactical
# Copy Go Files
COPY --from=golang:1.15 /usr/local/go ${TACTICAL_GO_DIR}/go
# Copy Dev python reqs
COPY ./requirements.txt /
# Copy Docker Entrypoint
COPY ./entrypoint.sh /
RUN chmod +x /entrypoint.sh
ENTRYPOINT ["/entrypoint.sh"]
WORKDIR ${WORKSPACE_DIR}/api/tacticalrmm

View File

@@ -0,0 +1,19 @@
version: '3.4'
services:
api-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
command: ["sh", "-c", "pip install debugpy -t /tmp && python /tmp/debugpy --wait-for-client --listen 0.0.0.0:5678 manage.py runserver 0.0.0.0:8000 --nothreading --noreload"]
ports:
- 8000:8000
- 5678:5678
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
networks:
dev:
aliases:
- tactical-backend

View File

@@ -0,0 +1,233 @@
version: '3.4'
services:
api-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
command: ["tactical-api"]
ports:
- 8000:8000
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
networks:
dev:
aliases:
- tactical-backend
app-dev:
image: node:12-alpine
ports:
- 8080:8080
command: /bin/sh -c "npm install && npm run serve -- --host 0.0.0.0 --port 8080"
working_dir: /workspace/web
volumes:
- ..:/workspace:cached
networks:
dev:
aliases:
- tactical-frontend
# salt master and api
salt-dev:
image: ${IMAGE_REPO}tactical-salt:${VERSION}
restart: always
volumes:
- tactical-data-dev:/opt/tactical
- salt-data-dev:/etc/salt
ports:
- "4505:4505"
- "4506:4506"
networks:
dev:
aliases:
- tactical-salt
# nats
nats-dev:
image: ${IMAGE_REPO}tactical-nats:${VERSION}
restart: always
ports:
- "4222:4222"
volumes:
- tactical-data-dev:/opt/tactical
networks:
dev:
aliases:
- ${API_HOST}
- tactical-nats
# meshcentral container
meshcentral-dev:
image: ${IMAGE_REPO}tactical-meshcentral:${VERSION}
restart: always
environment:
MESH_HOST: ${MESH_HOST}
MESH_USER: ${MESH_USER}
MESH_PASS: ${MESH_PASS}
MONGODB_USER: ${MONGODB_USER}
MONGODB_PASSWORD: ${MONGODB_PASSWORD}
NGINX_HOST_IP: 172.21.0.20
networks:
dev:
aliases:
- tactical-meshcentral
- ${MESH_HOST}
volumes:
- tactical-data-dev:/opt/tactical
- mesh-data-dev:/home/node/app/meshcentral-data
depends_on:
- mongodb-dev
# mongodb container for meshcentral
mongodb-dev:
image: mongo:4.4
restart: always
environment:
MONGO_INITDB_ROOT_USERNAME: ${MONGODB_USER}
MONGO_INITDB_ROOT_PASSWORD: ${MONGODB_PASSWORD}
MONGO_INITDB_DATABASE: meshcentral
networks:
dev:
aliases:
- tactical-mongodb
volumes:
- mongo-dev-data:/data/db
# postgres database for api service
postgres-dev:
image: postgres:13-alpine
restart: always
environment:
POSTGRES_DB: tacticalrmm
POSTGRES_USER: ${POSTGRES_USER}
POSTGRES_PASSWORD: ${POSTGRES_PASS}
volumes:
- postgres-data-dev:/var/lib/postgresql/data
networks:
dev:
aliases:
- tactical-postgres
# redis container for celery tasks
redis-dev:
restart: always
image: redis:6.0-alpine
networks:
dev:
aliases:
- tactical-redis
init-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
restart: on-failure
command: ["tactical-init-dev"]
environment:
POSTGRES_USER: ${POSTGRES_USER}
POSTGRES_PASS: ${POSTGRES_PASS}
APP_HOST: ${APP_HOST}
API_HOST: ${API_HOST}
MESH_HOST: ${MESH_HOST}
MESH_USER: ${MESH_USER}
TRMM_USER: ${TRMM_USER}
TRMM_PASS: ${TRMM_PASS}
depends_on:
- postgres-dev
- meshcentral-dev
networks:
- dev
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
# container for celery worker service
celery-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
command: ["tactical-celery-dev"]
restart: always
networks:
- dev
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
depends_on:
- postgres-dev
- redis-dev
# container for celery beat service
celerybeat-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
command: ["tactical-celerybeat-dev"]
restart: always
networks:
- dev
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
depends_on:
- postgres-dev
- redis-dev
# container for celery winupdate tasks
celerywinupdate-dev:
image: api-dev
build:
context: .
dockerfile: ./api.dockerfile
command: ["tactical-celerywinupdate-dev"]
restart: always
networks:
- dev
volumes:
- tactical-data-dev:/opt/tactical
- ..:/workspace:cached
depends_on:
- postgres-dev
- redis-dev
nginx-dev:
# container for tactical reverse proxy
image: ${IMAGE_REPO}tactical-nginx:${VERSION}
restart: always
environment:
APP_HOST: ${APP_HOST}
API_HOST: ${API_HOST}
MESH_HOST: ${MESH_HOST}
CERT_PUB_KEY: ${CERT_PUB_KEY}
CERT_PRIV_KEY: ${CERT_PRIV_KEY}
APP_PORT: 8080
API_PORT: 8000
networks:
dev:
ipv4_address: 172.21.0.20
ports:
- "80:80"
- "443:443"
volumes:
- tactical-data-dev:/opt/tactical
volumes:
tactical-data-dev:
postgres-data-dev:
mongo-dev-data:
mesh-data-dev:
salt-data-dev:
networks:
dev:
driver: bridge
ipam:
driver: default
config:
- subnet: 172.21.0.0/24

.devcontainer/entrypoint.sh (new file, 182 lines)
View File

@@ -0,0 +1,182 @@
#!/usr/bin/env bash
set -e
: "${TRMM_USER:=tactical}"
: "${TRMM_PASS:=tactical}"
: "${POSTGRES_HOST:=tactical-postgres}"
: "${POSTGRES_PORT:=5432}"
: "${POSTGRES_USER:=tactical}"
: "${POSTGRES_PASS:=tactical}"
: "${POSTGRES_DB:=tacticalrmm}"
: "${SALT_HOST:=tactical-salt}"
: "${SALT_USER:=saltapi}"
: "${MESH_CONTAINER:=tactical-meshcentral}"
: "${MESH_USER:=meshcentral}"
: "${MESH_PASS:=meshcentralpass}"
: "${MESH_HOST:=tactical-meshcentral}"
: "${API_HOST:=tactical-backend}"
: "${APP_HOST:=tactical-frontend}"
: "${REDIS_HOST:=tactical-redis}"
# Add python venv to path
export PATH="${VIRTUAL_ENV}/bin:$PATH"
function check_tactical_ready {
sleep 15
until [ -f "${TACTICAL_READY_FILE}" ]; do
echo "waiting for init container to finish install or update..."
sleep 10
done
}
function django_setup {
until (echo > /dev/tcp/"${POSTGRES_HOST}"/"${POSTGRES_PORT}") &> /dev/null; do
echo "waiting for postgresql container to be ready..."
sleep 5
done
until (echo > /dev/tcp/"${MESH_CONTAINER}"/443) &> /dev/null; do
echo "waiting for meshcentral container to be ready..."
sleep 5
done
echo "setting up django environment"
# configure django settings
MESH_TOKEN=$(cat ${TACTICAL_DIR}/tmp/mesh_token)
DJANGO_SEKRET=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 80 | head -n 1)
# write salt pass to tmp dir
if [ ! -f "${TACTICAL_DIR}/tmp/salt_pass" ]; then
SALT_PASS=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 20 | head -n 1)
echo "${SALT_PASS}" > ${TACTICAL_DIR}/tmp/salt_pass
else
SALT_PASS=$(cat ${TACTICAL_DIR}/tmp/salt_pass)
fi
localvars="$(cat << EOF
SECRET_KEY = '${DJANGO_SEKRET}'
DEBUG = True
DOCKER_BUILD = True
CERT_FILE = '/opt/tactical/certs/fullchain.pem'
KEY_FILE = '/opt/tactical/certs/privkey.pem'
SCRIPTS_DIR = '${WORKSPACE_DIR}/scripts'
ALLOWED_HOSTS = ['${API_HOST}', 'localhost', '127.0.0.1']
ADMIN_URL = 'admin/'
CORS_ORIGIN_ALLOW_ALL = True
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': '${POSTGRES_DB}',
'USER': '${POSTGRES_USER}',
'PASSWORD': '${POSTGRES_PASS}',
'HOST': '${POSTGRES_HOST}',
'PORT': '${POSTGRES_PORT}',
}
}
REST_FRAMEWORK = {
'DATETIME_FORMAT': '%b-%d-%Y - %H:%M',
'DEFAULT_PERMISSION_CLASSES': (
'rest_framework.permissions.IsAuthenticated',
),
'DEFAULT_AUTHENTICATION_CLASSES': (
'knox.auth.TokenAuthentication',
),
}
if not DEBUG:
REST_FRAMEWORK.update({
'DEFAULT_RENDERER_CLASSES': (
'rest_framework.renderers.JSONRenderer',
)
})
SALT_USERNAME = '${SALT_USER}'
SALT_PASSWORD = '${SALT_PASS}'
SALT_HOST = '${SALT_HOST}'
MESH_USERNAME = '${MESH_USER}'
MESH_SITE = 'https://${MESH_HOST}'
MESH_TOKEN_KEY = '${MESH_TOKEN}'
REDIS_HOST = '${REDIS_HOST}'
EOF
)"
echo "${localvars}" > ${WORKSPACE_DIR}/api/tacticalrmm/tacticalrmm/local_settings.py
# run migrations and init scripts
python manage.py migrate --no-input
python manage.py collectstatic --no-input
python manage.py initial_db_setup
python manage.py initial_mesh_setup
python manage.py load_chocos
python manage.py load_community_scripts
python manage.py reload_nats
# create super user
echo "from accounts.models import User; User.objects.create_superuser('${TRMM_USER}', 'admin@example.com', '${TRMM_PASS}') if not User.objects.filter(username='${TRMM_USER}').exists() else 0;" | python manage.py shell
}
if [ "$1" = 'tactical-init-dev' ]; then
# make directories if they don't exist
mkdir -p ${TACTICAL_DIR}/tmp
test -f "${TACTICAL_READY_FILE}" && rm "${TACTICAL_READY_FILE}"
# setup Python virtual env and install dependencies
test -d "${VIRTUAL_ENV}" || python -m venv --copies "${VIRTUAL_ENV}"
pip install --no-cache-dir -r /requirements.txt
django_setup
# create .env file for frontend
webenv="$(cat << EOF
PROD_URL = "http://${API_HOST}:8000"
DEV_URL = "http://${API_HOST}:8000"
DEV_HOST = 0.0.0.0
DEV_PORT = 8080
EOF
)"
echo "${webenv}" | tee ${WORKSPACE_DIR}/web/.env > /dev/null
# chown everything to tactical user
chown -R "${TACTICAL_USER}":"${TACTICAL_USER}" "${WORKSPACE_DIR}"
chown -R "${TACTICAL_USER}":"${TACTICAL_USER}" "${TACTICAL_DIR}"
# create install ready file
su -c "echo 'tactical-init' > ${TACTICAL_READY_FILE}" "${TACTICAL_USER}"
fi
if [ "$1" = 'tactical-api' ]; then
check_tactical_ready
python manage.py runserver 0.0.0.0:8000
fi
if [ "$1" = 'tactical-celery-dev' ]; then
check_tactical_ready
celery -A tacticalrmm worker -l debug
fi
if [ "$1" = 'tactical-celerybeat-dev' ]; then
check_tactical_ready
test -f "${WORKSPACE_DIR}/api/tacticalrmm/celerybeat.pid" && rm "${WORKSPACE_DIR}/api/tacticalrmm/celerybeat.pid"
celery -A tacticalrmm beat -l debug
fi
if [ "$1" = 'tactical-celerywinupdate-dev' ]; then
check_tactical_ready
celery -A tacticalrmm worker -Q wupdate -l debug
fi
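
The superuser bootstrap in the tactical-init-dev branch above is a shell one-liner piped into manage.py shell. Restated as a readable sketch of the same logic (not part of the diff; TRMM_USER/TRMM_PASS and the admin@example.com address match the defaults used above):

    import os

    from accounts.models import User

    username = os.getenv("TRMM_USER", "tactical")
    password = os.getenv("TRMM_PASS", "tactical")

    # Idempotent: only create the dashboard login if it does not already exist.
    if not User.objects.filter(username=username).exists():
        User.objects.create_superuser(username, "admin@example.com", password)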

View File

@@ -0,0 +1,44 @@
# To ensure app dependencies are ported from your virtual environment/host machine into your container, run 'pip freeze > requirements.txt' in the terminal to overwrite this file
amqp==2.6.1
asgiref==3.3.1
asyncio-nats-client==0.11.4
billiard==3.6.3.0
celery==4.4.6
certifi==2020.12.5
cffi==1.14.3
chardet==3.0.4
cryptography==3.2.1
decorator==4.4.2
Django==3.1.4
django-cors-headers==3.5.0
django-rest-knox==4.1.0
djangorestframework==3.12.2
future==0.18.2
idna==2.10
kombu==4.6.11
loguru==0.5.3
msgpack==1.0.0
packaging==20.4
psycopg2-binary==2.8.6
pycparser==2.20
pycryptodome==3.9.9
pyotp==2.4.1
pyparsing==2.4.7
pytz==2020.4
qrcode==6.1
redis==3.5.3
requests==2.25.0
six==1.15.0
sqlparse==0.4.1
twilio==6.49.0
urllib3==1.26.2
validators==0.18.1
vine==1.3.0
websockets==8.1
zipp==3.4.0
black
Werkzeug
django-extensions
coverage
coveralls
model_bakery

View File

@@ -1,5 +1,25 @@
.git
.cache
**/*.env
**/env
**/__pycache__
**/.classpath
**/.dockerignore
**/.env
**/.git
**/.gitignore
**/.project
**/.settings
**/.toolstarget
**/.vs
**/.vscode
**/*.*proj.user
**/*.dbmdl
**/*.jfm
**/azds.yaml
**/charts
**/docker-compose*
**/Dockerfile*
**/node_modules
**/npm-debug.log
**/obj
**/secrets.dev.yaml
**/values.dev.yaml
**/env
README.md

.vscode/launch.json (14 lines added)
View File

@@ -14,6 +14,20 @@
"0.0.0.0:8000"
],
"django": true
},
{
"name": "Django: Docker Remote Attach",
"type": "python",
"request": "attach",
"port": 5678,
"host": "localhost",
"preLaunchTask": "docker debug",
"pathMappings": [
{
"localRoot": "${workspaceFolder}/api/tacticalrmm",
"remoteRoot": "/workspace/api/tacticalrmm"
}
]
}
]
}

.vscode/settings.json (19 lines added)
View File

@@ -41,4 +41,23 @@
"**/*.zip": true
},
},
"go.useLanguageServer": true,
"[go]": {
"editor.formatOnSave": true,
"editor.codeActionsOnSave": {
"source.organizeImports": false,
},
"editor.snippetSuggestions": "none",
},
"[go.mod]": {
"editor.formatOnSave": true,
"editor.codeActionsOnSave": {
"source.organizeImports": true,
},
},
"gopls": {
"usePlaceholders": true,
"completeUnimported": true,
"staticcheck": true,
}
}

.vscode/tasks.json (new file, 23 lines)
View File

@@ -0,0 +1,23 @@
{
// See https://go.microsoft.com/fwlink/?LinkId=733558
// for the documentation about the tasks.json format
"version": "2.0.0",
"tasks": [
{
"label": "docker debug",
"type": "shell",
"command": "docker-compose",
"args": [
"-p",
"trmm",
"-f",
".devcontainer/docker-compose.yml",
"-f",
".devcontainer/docker-compose.debug.yml",
"up",
"-d",
"--build"
]
}
]
}

View File

@@ -36,7 +36,7 @@ Demo database resets every hour. Alot of features are disabled for obvious reaso
## Installation
### Requirements
- VPS with 4GB ram (an install script is provided for Ubuntu Server 20.04)
- VPS with 4GB ram (an install script is provided for Ubuntu Server 20.04 / Debian 10)
- A domain you own with at least 3 subdomains
- Google Authenticator app (2 factor is NOT optional)

View File

@@ -6,28 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('accounts', '0002_auto_20200810_0544'),
("accounts", "0002_auto_20200810_0544"),
]
operations = [
migrations.AddField(
model_name='user',
name='created_by',
model_name="user",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='created_time',
model_name="user",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='user',
name='modified_by',
model_name="user",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='modified_time',
model_name="user",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]

View File

@@ -6,24 +6,24 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('accounts', '0003_auto_20200922_1344'),
("accounts", "0003_auto_20200922_1344"),
]
operations = [
migrations.RemoveField(
model_name='user',
name='created_by',
model_name="user",
name="created_by",
),
migrations.RemoveField(
model_name='user',
name='created_time',
model_name="user",
name="created_time",
),
migrations.RemoveField(
model_name='user',
name='modified_by',
model_name="user",
name="modified_by",
),
migrations.RemoveField(
model_name='user',
name='modified_time',
model_name="user",
name="modified_time",
),
]

View File

@@ -6,28 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('accounts', '0004_auto_20201002_1257'),
("accounts", "0004_auto_20201002_1257"),
]
operations = [
migrations.AddField(
model_name='user',
name='created_by',
model_name="user",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='created_time',
model_name="user",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='user',
name='modified_by',
model_name="user",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='user',
name='modified_time',
model_name="user",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]

View File

@@ -6,13 +6,13 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('accounts', '0007_update_agent_primary_key'),
("accounts", "0007_update_agent_primary_key"),
]
operations = [
migrations.AddField(
model_name='user',
name='dark_mode',
model_name="user",
name="dark_mode",
field=models.BooleanField(default=True),
),
]

View File

@@ -6,13 +6,13 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('accounts', '0008_user_dark_mode'),
("accounts", "0008_user_dark_mode"),
]
operations = [
migrations.AddField(
model_name='user',
name='show_community_scripts',
model_name="user",
name="show_community_scripts",
field=models.BooleanField(default=True),
),
]

View File

@@ -0,0 +1,26 @@
# Generated by Django 3.1.4 on 2021-01-14 01:23
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("accounts", "0009_user_show_community_scripts"),
]
operations = [
migrations.AddField(
model_name="user",
name="agent_dblclick_action",
field=models.CharField(
choices=[
("editagent", "Edit Agent"),
("takecontrol", "Take Control"),
("remotebg", "Remote Background"),
],
default="editagent",
max_length=50,
),
),
]

View File

@@ -3,12 +3,21 @@ from django.contrib.auth.models import AbstractUser
from logs.models import BaseAuditModel
AGENT_DBLCLICK_CHOICES = [
("editagent", "Edit Agent"),
("takecontrol", "Take Control"),
("remotebg", "Remote Background"),
]
class User(AbstractUser, BaseAuditModel):
is_active = models.BooleanField(default=True)
totp_key = models.CharField(max_length=50, null=True, blank=True)
dark_mode = models.BooleanField(default=True)
show_community_scripts = models.BooleanField(default=True)
agent_dblclick_action = models.CharField(
max_length=50, choices=AGENT_DBLCLICK_CHOICES, default="editagent"
)
agent = models.OneToOneField(
"agents.Agent",

View File

@@ -155,6 +155,33 @@ class GetUpdateDeleteUser(TacticalTestCase):
self.check_not_authenticated("put", url)
@override_settings(ROOT_USER="john")
def test_put_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
data = {
"id": self.john.pk,
"username": "john",
"email": "johndoe@xlawgaming.com",
"first_name": "John",
"last_name": "Doe",
}
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 200)
@override_settings(ROOT_USER="john")
def test_put_not_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
data = {
"id": self.john.pk,
"username": "john",
"email": "johndoe@xlawgaming.com",
"first_name": "John",
"last_name": "Doe",
}
self.client.force_authenticate(user=self.alice)
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 400)
def test_delete(self):
url = f"/accounts/{self.john.pk}/users/"
r = self.client.delete(url)
@@ -166,6 +193,19 @@ class GetUpdateDeleteUser(TacticalTestCase):
self.check_not_authenticated("delete", url)
@override_settings(ROOT_USER="john")
def test_delete_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
r = self.client.delete(url)
self.assertEqual(r.status_code, 200)
@override_settings(ROOT_USER="john")
def test_delete_non_root_user(self):
url = f"/accounts/{self.john.pk}/users/"
self.client.force_authenticate(user=self.alice)
r = self.client.delete(url)
self.assertEqual(r.status_code, 400)
class TestUserAction(TacticalTestCase):
def setUp(self):
@@ -184,6 +224,21 @@ class TestUserAction(TacticalTestCase):
self.check_not_authenticated("post", url)
@override_settings(ROOT_USER="john")
def test_post_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk, "password": "3ASDjh2345kJA!@#)#@__123"}
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 200)
@override_settings(ROOT_USER="john")
def test_post_non_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk, "password": "3ASDjh2345kJA!@#)#@__123"}
self.client.force_authenticate(user=self.alice)
r = self.client.post(url, data, format="json")
self.assertEqual(r.status_code, 400)
def test_put(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk}
@@ -195,12 +250,46 @@ class TestUserAction(TacticalTestCase):
self.check_not_authenticated("put", url)
def test_darkmode(self):
@override_settings(ROOT_USER="john")
def test_put_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk}
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 200)
user = User.objects.get(pk=self.john.pk)
self.assertEqual(user.totp_key, "")
@override_settings(ROOT_USER="john")
def test_put_non_root_user(self):
url = "/accounts/users/reset/"
data = {"id": self.john.pk}
self.client.force_authenticate(user=self.alice)
r = self.client.put(url, data, format="json")
self.assertEqual(r.status_code, 400)
def test_user_ui(self):
url = "/accounts/users/ui/"
data = {"dark_mode": False}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
data = {"show_community_scripts": True}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
data = {"agent_dblclick_action": "editagent"}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
data = {"agent_dblclick_action": "remotebg"}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
data = {"agent_dblclick_action": "takecontrol"}
r = self.client.patch(url, data, format="json")
self.assertEqual(r.status_code, 200)
self.check_not_authenticated("patch", url)

View File

@@ -60,7 +60,7 @@ class LoginView(KnoxLoginView):
if settings.DEBUG and token == "sekret":
valid = True
elif totp.verify(token, valid_window=1):
elif totp.verify(token, valid_window=10):
valid = True
if valid:
@@ -197,4 +197,8 @@ class UserUI(APIView):
user.show_community_scripts = request.data["show_community_scripts"]
user.save(update_fields=["show_community_scripts"])
return Response("ok")
if "agent_dblclick_action" in request.data:
user.agent_dblclick_action = request.data["agent_dblclick_action"]
user.save(update_fields=["agent_dblclick_action"])
return Response("ok")
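
The agent_dblclick_action branch above follows the same one-field-per-request pattern as dark_mode and show_community_scripts. A minimal sketch of setting it through the API, assuming the /accounts/users/ui/ route exercised in the tests and a knox token in the Authorization header (the base URL and token below are placeholders):

    import requests

    API = "https://api.example.com"    # placeholder
    TOKEN = "knox-token-from-login"    # placeholder

    # Make double-clicking an agent open Remote Background instead of Edit Agent.
    r = requests.patch(
        f"{API}/accounts/users/ui/",
        json={"agent_dblclick_action": "remotebg"},
        headers={"Authorization": f"Token {TOKEN}"},
        timeout=10,
    )
    r.raise_for_status()  # the view returns "ok" on success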

View File

@@ -7,14 +7,20 @@ import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('clients', '0006_deployment'),
('agents', '0020_auto_20201025_2129'),
("clients", "0006_deployment"),
("agents", "0020_auto_20201025_2129"),
]
operations = [
migrations.AddField(
model_name='agent',
name='site_link',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='agents', to='clients.site'),
model_name="agent",
name="site_link",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="agents",
to="clients.site",
),
),
]

View File

@@ -6,16 +6,16 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('agents', '0022_update_site_primary_key'),
("agents", "0022_update_site_primary_key"),
]
operations = [
migrations.RemoveField(
model_name='agent',
name='client',
model_name="agent",
name="client",
),
migrations.RemoveField(
model_name='agent',
name='site',
model_name="agent",
name="site",
),
]

View File

@@ -6,13 +6,13 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('agents', '0023_auto_20201101_2312'),
("agents", "0023_auto_20201101_2312"),
]
operations = [
migrations.RenameField(
model_name='agent',
old_name='site_link',
new_name='site',
model_name="agent",
old_name="site_link",
new_name="site",
),
]

View File

@@ -6,13 +6,22 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('agents', '0024_auto_20201101_2319'),
("agents", "0024_auto_20201101_2319"),
]
operations = [
migrations.AlterField(
model_name='recoveryaction',
name='mode',
field=models.CharField(choices=[('salt', 'Salt'), ('mesh', 'Mesh'), ('command', 'Command'), ('rpc', 'Nats RPC')], default='mesh', max_length=50),
model_name="recoveryaction",
name="mode",
field=models.CharField(
choices=[
("salt", "Salt"),
("mesh", "Mesh"),
("command", "Command"),
("rpc", "Nats RPC"),
],
default="mesh",
max_length=50,
),
),
]

View File

@@ -6,13 +6,23 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('agents', '0025_auto_20201122_0407'),
("agents", "0025_auto_20201122_0407"),
]
operations = [
migrations.AlterField(
model_name='recoveryaction',
name='mode',
field=models.CharField(choices=[('salt', 'Salt'), ('mesh', 'Mesh'), ('command', 'Command'), ('rpc', 'Nats RPC'), ('checkrunner', 'Checkrunner')], default='mesh', max_length=50),
model_name="recoveryaction",
name="mode",
field=models.CharField(
choices=[
("salt", "Salt"),
("mesh", "Mesh"),
("command", "Command"),
("rpc", "Nats RPC"),
("checkrunner", "Checkrunner"),
],
default="mesh",
max_length=50,
),
),
]

View File

@@ -3,11 +3,11 @@ from loguru import logger
from time import sleep
import random
import requests
from concurrent.futures import ThreadPoolExecutor
from packaging import version as pyver
from typing import List
from django.conf import settings
from scripts.models import Script
from tacticalrmm.celery import app
from agents.models import Agent, AgentOutage
@@ -28,12 +28,32 @@ def _check_agent_service(pk: int) -> None:
asyncio.run(agent.nats_cmd(data, wait=False))
def _check_in_full(pk: int) -> None:
agent = Agent.objects.get(pk=pk)
asyncio.run(agent.nats_cmd({"func": "checkinfull"}, wait=False))
@app.task
def check_in_task() -> None:
q = Agent.objects.only("pk", "version")
agents: List[int] = [
i.pk for i in q if pyver.parse(i.version) >= pyver.parse("1.1.12")
]
chunks = (agents[i : i + 50] for i in range(0, len(agents), 50))
for chunk in chunks:
for pk in chunk:
_check_in_full(pk)
sleep(0.1)
rand = random.randint(3, 7)
sleep(rand)
@app.task
def monitor_agents_task() -> None:
q = Agent.objects.all()
agents: List[int] = [i.pk for i in q if i.has_nats and i.status != "online"]
with ThreadPoolExecutor(max_workers=15) as executor:
executor.map(_check_agent_service, agents)
for agent in agents:
_check_agent_service(agent)
def agent_update(pk: int) -> str:
@@ -43,55 +63,46 @@ def agent_update(pk: int) -> str:
logger.warning(f"Unable to determine arch on {agent.hostname}. Skipping.")
return "noarch"
# force an update to 1.1.5 since 1.1.6 needs agent to be on 1.1.5 first
if pyver.parse(agent.version) < pyver.parse("1.1.5"):
version = "1.1.5"
if agent.arch == "64":
url = "https://github.com/wh1te909/rmmagent/releases/download/v1.1.5/winagent-v1.1.5.exe"
inno = "winagent-v1.1.5.exe"
elif agent.arch == "32":
url = "https://github.com/wh1te909/rmmagent/releases/download/v1.1.5/winagent-v1.1.5-x86.exe"
inno = "winagent-v1.1.5-x86.exe"
else:
return "nover"
else:
version = settings.LATEST_AGENT_VER
url = agent.winagent_dl
inno = agent.win_inno_exe
version = settings.LATEST_AGENT_VER
url = agent.winagent_dl
inno = agent.win_inno_exe
if agent.has_nats:
if agent.pendingactions.filter(
action_type="agentupdate", status="pending"
).exists():
action = agent.pendingactions.filter(
if pyver.parse(agent.version) <= pyver.parse("1.1.11"):
if agent.pendingactions.filter(
action_type="agentupdate", status="pending"
).last()
if pyver.parse(action.details["version"]) < pyver.parse(version):
action.delete()
else:
return "pending"
).exists():
action = agent.pendingactions.filter(
action_type="agentupdate", status="pending"
).last()
if pyver.parse(action.details["version"]) < pyver.parse(version):
action.delete()
else:
return "pending"
PendingAction.objects.create(
agent=agent,
action_type="agentupdate",
details={
"url": url,
"version": version,
"inno": inno,
},
)
else:
nats_data = {
"func": "agentupdate",
"payload": {
"url": url,
"version": version,
"inno": inno,
},
}
asyncio.run(agent.nats_cmd(nats_data, wait=False))
PendingAction.objects.create(
agent=agent,
action_type="agentupdate",
details={
"url": url,
"version": version,
"inno": inno,
},
)
return "created"
# TODO
# Salt is deprecated, remove this once salt is gone
else:
agent.salt_api_async(
func="win_agent.do_agent_update_v2",
kwargs={
"inno": inno,
"url": url,
},
)
return "salt"
return "not supported"
@app.task
@@ -109,7 +120,6 @@ def send_agent_update_task(pks: List[int], version: str) -> None:
def auto_self_agent_update_task() -> None:
core = CoreSettings.objects.first()
if not core.agent_auto_update:
logger.info("Agent auto update is disabled. Skipping.")
return
q = Agent.objects.only("pk", "version")
@@ -131,8 +141,14 @@ def sync_sysinfo_task():
for i in agents
if pyver.parse(i.version) >= pyver.parse("1.1.3") and i.status == "online"
]
for agent in online:
asyncio.run(agent.nats_cmd({"func": "sync"}, wait=False))
chunks = (online[i : i + 50] for i in range(0, len(online), 50))
for chunk in chunks:
for agent in chunk:
asyncio.run(agent.nats_cmd({"func": "sync"}, wait=False))
sleep(0.1)
rand = random.randint(3, 7)
sleep(rand)
@app.task
@@ -275,8 +291,68 @@ def agent_outages_task():
outage = AgentOutage(agent=agent)
outage.save()
# add a null check history to allow gaps in graph
for check in agent.agentchecks.all():
check.add_check_history(None)
if agent.overdue_email_alert and not agent.maintenance_mode:
agent_outage_email_task.delay(pk=outage.pk)
if agent.overdue_text_alert and not agent.maintenance_mode:
agent_outage_sms_task.delay(pk=outage.pk)
@app.task
def install_salt_task(pk: int) -> None:
sleep(20)
agent = Agent.objects.get(pk=pk)
asyncio.run(agent.nats_cmd({"func": "installsalt"}, wait=False))
@app.task
def run_script_email_results_task(
agentpk: int, scriptpk: int, nats_timeout: int, nats_data: dict, emails: List[str]
):
agent = Agent.objects.get(pk=agentpk)
script = Script.objects.get(pk=scriptpk)
nats_data["func"] = "runscriptfull"
r = asyncio.run(agent.nats_cmd(nats_data, timeout=nats_timeout))
if r == "timeout":
logger.error(f"{agent.hostname} timed out running script.")
return
CORE = CoreSettings.objects.first()
subject = f"{agent.hostname} {script.name} Results"
exec_time = "{:.4f}".format(r["execution_time"])
body = (
subject
+ f"\nReturn code: {r['retcode']}\nExecution time: {exec_time} seconds\nStdout: {r['stdout']}\nStderr: {r['stderr']}"
)
import smtplib
from email.message import EmailMessage
msg = EmailMessage()
msg["Subject"] = subject
msg["From"] = CORE.smtp_from_email
if emails:
msg["To"] = ", ".join(emails)
else:
msg["To"] = ", ".join(CORE.email_alert_recipients)
msg.set_content(body)
try:
with smtplib.SMTP(CORE.smtp_host, CORE.smtp_port, timeout=20) as server:
if CORE.smtp_requires_auth:
server.ehlo()
server.starttls()
server.login(CORE.smtp_host_user, CORE.smtp_host_password)
server.send_message(msg)
server.quit()
else:
server.send_message(msg)
server.quit()
except Exception as e:
logger.error(e)
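
check_in_task and sync_sysinfo_task above (commit 3a8d9568e3, "split some tasks into chunks to reduce load") now walk the agent list in slices of 50, pausing briefly between agents and 3-7 seconds between slices rather than dispatching everything at once. The same chunking idiom in isolation, with generic placeholder names:

    import random
    from time import sleep
    from typing import List


    def process_in_chunks(agent_pks: List[int], size: int = 50) -> None:
        # Generator of consecutive slices, same idiom as agents[i : i + 50] above.
        chunks = (agent_pks[i : i + size] for i in range(0, len(agent_pks), size))
        for chunk in chunks:
            for pk in chunk:
                handle_agent(pk)             # e.g. fire a NATS command for this agent
                sleep(0.1)                   # small gap between agents
            sleep(random.randint(3, 7))      # larger random gap between chunks


    def handle_agent(pk: int) -> None:
        pass  # placeholder for the per-agent work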

View File

@@ -581,15 +581,13 @@ class TestAgentViews(TacticalTestCase):
r = self.client.post(url, payload, format="json")
self.assertEqual(r.status_code, 400)
payload = {
""" payload = {
"mode": "command",
"monType": "workstations",
"target": "client",
"client": self.agent.client.id,
"site": None,
"agentPKs": [
self.agent.pk,
],
"agentPKs": [],
"cmd": "gpupdate /force",
"timeout": 300,
"shell": "cmd",
@@ -597,7 +595,7 @@ class TestAgentViews(TacticalTestCase):
r = self.client.post(url, payload, format="json")
self.assertEqual(r.status_code, 200)
bulk_command.assert_called_with([self.agent.pk], "gpupdate /force", "cmd", 300)
bulk_command.assert_called_with([self.agent.pk], "gpupdate /force", "cmd", 300) """
payload = {
"mode": "command",
@@ -792,14 +790,14 @@ class TestAgentTasks(TacticalTestCase):
self.assertEqual(salt_batch_async.call_count, 4)
self.assertEqual(ret.status, "SUCCESS")
@patch("agents.models.Agent.salt_api_async")
def test_agent_update(self, salt_api_async):
@patch("agents.models.Agent.nats_cmd")
def test_agent_update(self, nats_cmd):
from agents.tasks import agent_update
agent_noarch = baker.make_recipe(
"agents.agent",
operating_system="Error getting OS",
version="1.1.0",
version="1.1.11",
)
r = agent_update(agent_noarch.pk)
self.assertEqual(r, "noarch")
@@ -810,15 +808,15 @@ class TestAgentTasks(TacticalTestCase):
0,
)
agent64_nats = baker.make_recipe(
agent64_111 = baker.make_recipe(
"agents.agent",
operating_system="Windows 10 Pro, 64 bit (build 19041.450)",
version="1.1.0",
version="1.1.11",
)
r = agent_update(agent64_nats.pk)
r = agent_update(agent64_111.pk)
self.assertEqual(r, "created")
action = PendingAction.objects.get(agent__pk=agent64_nats.pk)
action = PendingAction.objects.get(agent__pk=agent64_111.pk)
self.assertEqual(action.action_type, "agentupdate")
self.assertEqual(action.status, "pending")
self.assertEqual(action.details["url"], settings.DL_64)
@@ -827,33 +825,24 @@ class TestAgentTasks(TacticalTestCase):
)
self.assertEqual(action.details["version"], settings.LATEST_AGENT_VER)
agent64_salt = baker.make_recipe(
agent64 = baker.make_recipe(
"agents.agent",
operating_system="Windows 10 Pro, 64 bit (build 19041.450)",
version="1.0.0",
version="1.1.12",
)
salt_api_async.return_value = True
r = agent_update(agent64_salt.pk)
self.assertEqual(r, "salt")
salt_api_async.assert_called_with(
func="win_agent.do_agent_update_v2",
kwargs={
"inno": f"winagent-v{settings.LATEST_AGENT_VER}.exe",
"url": settings.DL_64,
nats_cmd.return_value = "ok"
r = agent_update(agent64.pk)
self.assertEqual(r, "created")
nats_cmd.assert_called_with(
{
"func": "agentupdate",
"payload": {
"url": settings.DL_64,
"version": settings.LATEST_AGENT_VER,
"inno": f"winagent-v{settings.LATEST_AGENT_VER}.exe",
},
},
)
salt_api_async.reset_mock()
agent32_nats = baker.make_recipe(
"agents.agent",
operating_system="Windows 7 Professional, 32 bit (build 7601.23964)",
version="1.1.0",
)
agent32_salt = baker.make_recipe(
"agents.agent",
operating_system="Windows 7 Professional, 32 bit (build 7601.23964)",
version="1.0.0",
wait=False,
)
""" @patch("agents.models.Agent.salt_api_async")
@@ -959,4 +948,4 @@ class TestAgentTasks(TacticalTestCase):
"url": OLD_32_PY_AGENT,
},
)
self.assertEqual(ret.status, "SUCCESS") """
self.assertEqual(ret.status, "SUCCESS") """

View File

@@ -32,7 +32,11 @@ from .serializers import (
)
from winupdate.serializers import WinUpdatePolicySerializer
from .tasks import uninstall_agent_task, send_agent_update_task
from .tasks import (
uninstall_agent_task,
send_agent_update_task,
run_script_email_results_task,
)
from winupdate.tasks import bulk_check_for_updates_task
from scripts.tasks import handle_bulk_command_task, handle_bulk_script_task
@@ -738,6 +742,21 @@ def run_script(request):
if output == "wait":
r = asyncio.run(agent.nats_cmd(data, timeout=req_timeout))
return Response(r)
elif output == "email":
if not pyver.parse(agent.version) >= pyver.parse("1.1.12"):
return notify_error("Requires agent version 1.1.12 or greater")
emails = (
[] if request.data["emailmode"] == "default" else request.data["emails"]
)
run_script_email_results_task.delay(
agentpk=agent.pk,
scriptpk=script.pk,
nats_timeout=req_timeout,
nats_data=data,
emails=emails,
)
return Response(f"{script.name} will now be run on {agent.hostname}")
else:
asyncio.run(agent.nats_cmd(data, wait=False))
return Response(f"{script.name} will now be run on {agent.hostname}")
@@ -914,4 +933,4 @@ class WMI(APIView):
r = asyncio.run(agent.nats_cmd({"func": "sysinfo"}, timeout=20))
if r != "ok":
return notify_error("Unable to contact the agent")
return Response("ok")
return Response("ok")
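
The output == "email" branch added to run_script above (feature #212) rejects agents older than 1.1.12 and queues run_script_email_results_task so the HTTP request returns immediately. Judging only from the fields the view reads, the request payload would carry something like the sketch below; the remaining fields (script, timeout, endpoint URL) come from parts of the view not shown here, so treat this as an assumption:

    # Hypothetical payload fragment; only "output", "emailmode" and "emails"
    # are confirmed by the excerpt above.
    payload = {
        "output": "email",
        "emailmode": "custom",             # "default" falls back to CORE.email_alert_recipients
        "emails": ["alerts@example.com"],  # ignored when emailmode is "default"
        # ...plus the script, timeout and other fields the rest of the view expects
    }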

View File

@@ -7,19 +7,25 @@ import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('checks', '0010_auto_20200922_1344'),
('alerts', '0002_auto_20200815_1618'),
("checks", "0010_auto_20200922_1344"),
("alerts", "0002_auto_20200815_1618"),
]
operations = [
migrations.AddField(
model_name='alert',
name='assigned_check',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='alert', to='checks.check'),
model_name="alert",
name="assigned_check",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="alert",
to="checks.check",
),
),
migrations.AlterField(
model_name='alert',
name='alert_time',
model_name="alert",
name="alert_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
]

View File

@@ -37,7 +37,7 @@ class Alert(models.Model):
@classmethod
def create_availability_alert(cls, agent):
pass
@classmethod
def create_check_alert(cls, check):
pass
pass

View File

@@ -16,4 +16,4 @@ class AlertSerializer(ModelSerializer):
class Meta:
model = Alert
fields = "__all__"
fields = "__all__"

View File

@@ -2,4 +2,4 @@ from django.apps import AppConfig
class Apiv2Config(AppConfig):
name = 'apiv2'
name = "apiv2"

View File

@@ -61,7 +61,7 @@ class TestAPIv3(TacticalTestCase):
def test_sysinfo(self):
# TODO replace this with golang wmi sample data
url = f"/api/v3/sysinfo/"
url = "/api/v3/sysinfo/"
with open(
os.path.join(
settings.BASE_DIR, "tacticalrmm/test_data/wmi_python_agent.json"
@@ -77,7 +77,7 @@ class TestAPIv3(TacticalTestCase):
self.check_not_authenticated("patch", url)
def test_hello_patch(self):
url = f"/api/v3/hello/"
url = "/api/v3/hello/"
payload = {
"agent_id": self.agent.agent_id,
"logged_in_username": "None",
@@ -92,3 +92,12 @@ class TestAPIv3(TacticalTestCase):
self.assertEqual(r.status_code, 200)
self.check_not_authenticated("patch", url)
@patch("agents.tasks.install_salt_task.delay")
def test_install_salt(self, mock_task):
url = f"/api/v3/{self.agent.agent_id}/installsalt/"
r = self.client.get(url, format="json")
self.assertEqual(r.status_code, 200)
mock_task.assert_called_with(self.agent.pk)
self.check_not_authenticated("get", url)

View File

@@ -17,4 +17,5 @@ urlpatterns = [
path("<str:agentid>/winupdater/", views.WinUpdater.as_view()),
path("software/", views.Software.as_view()),
path("installer/", views.Installer.as_view()),
path("<str:agentid>/installsalt/", views.InstallSalt.as_view()),
]

View File

@@ -30,6 +30,7 @@ from agents.tasks import (
agent_recovery_email_task,
agent_recovery_sms_task,
sync_salt_modules_task,
install_salt_task,
)
from winupdate.tasks import check_for_updates_task
from software.tasks import install_chocolatey
@@ -265,16 +266,6 @@ class CheckRunner(APIView):
check.save(update_fields=["last_run"])
status = check.handle_checkv2(request.data)
# create audit entry
AuditLog.objects.create(
username=check.agent.hostname,
agent=check.agent.hostname,
object_type="agent",
action="check_run",
message=f"{check.readable_desc} was run on {check.agent.hostname}. Status: {status}",
after_value=Check.serialize(check),
)
return Response(status)
@@ -626,3 +617,13 @@ class Installer(APIView):
)
return Response("ok")
class InstallSalt(APIView):
authentication_classes = [TokenAuthentication]
permission_classes = [IsAuthenticated]
def get(self, request, agentid):
agent = get_object_or_404(Agent, agent_id=agentid)
install_salt_task.delay(agent.pk)
return Response("ok")

View File

@@ -6,11 +6,11 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('automation', '0005_auto_20200922_1344'),
("automation", "0005_auto_20200922_1344"),
]
operations = [
migrations.DeleteModel(
name='PolicyExclusions',
name="PolicyExclusions",
),
]

View File

@@ -80,7 +80,7 @@ class Policy(BaseAuditModel):
default_policy = CoreSettings.objects.first().server_policy
client_policy = client.server_policy
site_policy = site.server_policy
else:
elif agent.monitoring_type == "workstation":
default_policy = CoreSettings.objects.first().workstation_policy
client_policy = client.workstation_policy
site_policy = site.workstation_policy
@@ -132,7 +132,7 @@ class Policy(BaseAuditModel):
default_policy = CoreSettings.objects.first().server_policy
client_policy = client.server_policy
site_policy = site.server_policy
else:
elif agent.monitoring_type == "workstation":
default_policy = CoreSettings.objects.first().workstation_policy
client_policy = client.workstation_policy
site_policy = site.workstation_policy

View File

@@ -19,7 +19,17 @@ def generate_agent_checks_from_policies_task(
):
policy = Policy.objects.get(pk=policypk)
for agent in policy.related_agents():
if policy.is_default_server_policy and policy.is_default_workstation_policy:
agents = Agent.objects.all()
elif policy.is_default_server_policy:
agents = Agent.objects.filter(monitoring_type="server")
elif policy.is_default_workstation_policy:
agents = Agent.objects.filter(monitoring_type="workstation")
else:
agents = policy.related_agents()
for agent in agents:
agent.generate_checks_from_policies(clear=clear)
if create_tasks:
agent.generate_tasks_from_policies(
@@ -86,7 +96,17 @@ def update_policy_check_fields_task(checkpk):
def generate_agent_tasks_from_policies_task(policypk, clear=False):
policy = Policy.objects.get(pk=policypk)
for agent in policy.related_agents():
if policy.is_default_server_policy and policy.is_default_workstation_policy:
agents = Agent.objects.all()
elif policy.is_default_server_policy:
agents = Agent.objects.filter(monitoring_type="server")
elif policy.is_default_workstation_policy:
agents = Agent.objects.filter(monitoring_type="workstation")
else:
agents = policy.related_agents()
for agent in agents:
agent.generate_tasks_from_policies(clear=clear)
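
The fix for #211 above replaces the plain policy.related_agents() loop in both tasks: default policies now fan out to every agent of the matching monitoring type instead of only explicitly related ones. The interleaved diff is easier to read consolidated into a helper; this is a restatement of the logic shown, not code from the repository:

    from agents.models import Agent


    def agents_for_policy(policy):
        # Default policies apply to every matching agent, not just related ones.
        if policy.is_default_server_policy and policy.is_default_workstation_policy:
            return Agent.objects.all()
        if policy.is_default_server_policy:
            return Agent.objects.filter(monitoring_type="server")
        if policy.is_default_workstation_policy:
            return Agent.objects.filter(monitoring_type="workstation")
        return policy.related_agents()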

View File

@@ -6,13 +6,13 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('autotasks', '0008_auto_20201030_1515'),
("autotasks", "0008_auto_20201030_1515"),
]
operations = [
migrations.AddField(
model_name='automatedtask',
name='run_time_bit_weekdays',
model_name="automatedtask",
name="run_time_bit_weekdays",
field=models.IntegerField(blank=True, null=True),
),
]

View File

@@ -176,6 +176,12 @@ def delete_win_task_schedule(pk, pending_action=False):
pendingaction.status = "completed"
pendingaction.save(update_fields=["status"])
# complete any other pending actions on agent with same task_id
for action in task.agent.pendingactions.all():
if action.details["task_id"] == task.id:
action.status = "completed"
action.save()
task.delete()
return "ok"

View File

@@ -1,5 +1,6 @@
from django.contrib import admin
from .models import Check
from .models import Check, CheckHistory
admin.site.register(Check)
admin.site.register(CheckHistory)

View File

@@ -0,0 +1,30 @@
# Generated by Django 3.1.4 on 2021-01-09 02:56
import django.contrib.postgres.fields
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("checks", "0010_auto_20200922_1344"),
]
operations = [
migrations.AddField(
model_name="check",
name="run_history",
field=django.contrib.postgres.fields.ArrayField(
base_field=django.contrib.postgres.fields.ArrayField(
base_field=models.PositiveIntegerField(),
blank=True,
null=True,
size=None,
),
blank=True,
default=list,
null=True,
size=None,
),
),
]

View File

@@ -0,0 +1,39 @@
# Generated by Django 3.1.4 on 2021-01-09 21:36
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
("checks", "0010_auto_20200922_1344"),
]
operations = [
migrations.CreateModel(
name="CheckHistory",
fields=[
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("x", models.DateTimeField()),
("y", models.PositiveIntegerField()),
("results", models.JSONField(blank=True, null=True)),
(
"check_history",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="check_history",
to="checks.check",
),
),
],
),
]

View File

@@ -0,0 +1,18 @@
# Generated by Django 3.1.4 on 2021-01-10 05:03
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("checks", "0011_checkhistory"),
]
operations = [
migrations.AlterField(
model_name="checkhistory",
name="y",
field=models.PositiveIntegerField(blank=True, null=True),
),
]

View File

@@ -0,0 +1,18 @@
# Generated by Django 3.1.4 on 2021-01-10 05:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("checks", "0012_auto_20210110_0503"),
]
operations = [
migrations.AlterField(
model_name="checkhistory",
name="y",
field=models.PositiveIntegerField(null=True),
),
]

View File

@@ -0,0 +1,13 @@
# Generated by Django 3.1.4 on 2021-01-10 18:08
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
("checks", "0013_auto_20210110_0505"),
("checks", "0011_check_run_history"),
]
operations = []

View File

@@ -0,0 +1,27 @@
# Generated by Django 3.1.4 on 2021-01-10 18:08
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("checks", "0014_merge_20210110_1808"),
]
operations = [
migrations.RemoveField(
model_name="check",
name="run_history",
),
migrations.AlterField(
model_name="checkhistory",
name="x",
field=models.DateTimeField(auto_now_add=True),
),
migrations.AlterField(
model_name="checkhistory",
name="y",
field=models.PositiveIntegerField(blank=True, default=None, null=True),
),
]

View File

@@ -3,12 +3,13 @@ import string
import os
import json
import pytz
from statistics import mean
from statistics import mean, mode
from django.db import models
from django.conf import settings
from django.contrib.postgres.fields import ArrayField
from django.core.validators import MinValueValidator, MaxValueValidator
from rest_framework.fields import JSONField
from core.models import CoreSettings
from logs.models import BaseAuditModel
@@ -214,6 +215,10 @@ class Check(BaseAuditModel):
"modified_time",
]
def add_check_history(self, value):
if self.check_type in ["memory", "cpuload", "diskspace"]:
CheckHistory.objects.create(check_history=self, y=value)
def handle_checkv2(self, data):
# cpuload or mem checks
if self.check_type == "cpuload" or self.check_type == "memory":
@@ -232,6 +237,9 @@ class Check(BaseAuditModel):
else:
self.status = "passing"
# add check history
self.add_check_history(data["percent"])
# diskspace checks
elif self.check_type == "diskspace":
if data["exists"]:
@@ -245,6 +253,9 @@ class Check(BaseAuditModel):
self.status = "passing"
self.more_info = f"Total: {total}B, Free: {free}B"
# add check history
self.add_check_history(percent_used)
else:
self.status = "failing"
self.more_info = f"Disk {self.disk} does not exist"
@@ -645,3 +656,17 @@ class Check(BaseAuditModel):
body = subject
CORE.send_sms(body)
class CheckHistory(models.Model):
check_history = models.ForeignKey(
Check,
related_name="check_history",
on_delete=models.CASCADE,
)
x = models.DateTimeField(auto_now_add=True)
y = models.PositiveIntegerField(null=True, blank=True, default=None)
results = models.JSONField(null=True, blank=True)
def __str__(self):
return self.check_history.readable_desc
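
add_check_history above only records history for memory, cpuload and diskspace checks, and agent_outages_task (earlier in this diff) calls it with None so that an offline agent leaves a visible gap in the graph instead of a flat line. Illustrative usage, assuming check is a loaded cpuload/memory/diskspace Check instance:

    # One CheckHistory row per check run; y is the reported percentage.
    check.add_check_history(42)     # normal run: 42% reported by the agent
    check.add_check_history(None)   # agent offline: null y renders as a gap in the chart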

View File

@@ -1,8 +1,8 @@
import validators as _v
import pytz
from rest_framework import serializers
from .models import Check
from .models import Check, CheckHistory
from autotasks.models import AutomatedTask
from scripts.serializers import ScriptSerializer, ScriptCheckSerializer
@@ -65,6 +65,26 @@ class CheckSerializer(serializers.ModelSerializer):
"Please enter a valid IP address or domain name"
)
if check_type == "cpuload" and not self.instance:
if (
Check.objects.filter(**self.context, check_type="cpuload")
.exclude(managed_by_policy=True)
.exists()
):
raise serializers.ValidationError(
"A cpuload check for this agent already exists"
)
if check_type == "memory" and not self.instance:
if (
Check.objects.filter(**self.context, check_type="memory")
.exclude(managed_by_policy=True)
.exists()
):
raise serializers.ValidationError(
"A memory check for this agent already exists"
)
return val
@@ -217,3 +237,15 @@ class CheckResultsSerializer(serializers.ModelSerializer):
class Meta:
model = Check
fields = "__all__"
class CheckHistorySerializer(serializers.ModelSerializer):
x = serializers.SerializerMethodField()
def get_x(self, obj):
return obj.x.astimezone(pytz.timezone(self.context["timezone"])).isoformat()
# used for returning large amounts of graph data
class Meta:
model = CheckHistory
fields = ("x", "y")

View File

@@ -5,8 +5,6 @@ from time import sleep
from tacticalrmm.celery import app
from django.utils import timezone as djangotime
from agents.models import Agent
@app.task
def handle_check_email_alert_task(pk):
@@ -56,3 +54,15 @@ def handle_check_sms_alert_task(pk):
check.save(update_fields=["text_sent"])
return "ok"
@app.task
def prune_check_history(older_than_days: int) -> str:
from .models import CheckHistory
CheckHistory.objects.filter(
x__lt=djangotime.make_aware(dt.datetime.today())
- djangotime.timedelta(days=older_than_days)
).delete()
return "ok"

View File

@@ -1,5 +1,7 @@
from checks.models import CheckHistory
from tacticalrmm.test import TacticalTestCase
from .serializers import CheckSerializer
from django.utils import timezone as djangotime
from model_bakery import baker
from itertools import cycle
@@ -8,6 +10,7 @@ from itertools import cycle
class TestCheckViews(TacticalTestCase):
def setUp(self):
self.authenticate()
self.setup_coresettings()
def test_get_disk_check(self):
# setup data
@@ -55,6 +58,52 @@ class TestCheckViews(TacticalTestCase):
resp = self.client.post(url, invalid_payload, format="json")
self.assertEqual(resp.status_code, 400)
def test_add_cpuload_check(self):
url = "/checks/checks/"
agent = baker.make_recipe("agents.agent")
payload = {
"pk": agent.pk,
"check": {
"check_type": "cpuload",
"threshold": 66,
"fails_b4_alert": 9,
},
}
resp = self.client.post(url, payload, format="json")
self.assertEqual(resp.status_code, 200)
payload["threshold"] = 87
resp = self.client.post(url, payload, format="json")
self.assertEqual(resp.status_code, 400)
self.assertEqual(
resp.json()["non_field_errors"][0],
"A cpuload check for this agent already exists",
)
def test_add_memory_check(self):
url = "/checks/checks/"
agent = baker.make_recipe("agents.agent")
payload = {
"pk": agent.pk,
"check": {
"check_type": "memory",
"threshold": 78,
"fails_b4_alert": 1,
},
}
resp = self.client.post(url, payload, format="json")
self.assertEqual(resp.status_code, 200)
payload["threshold"] = 55
resp = self.client.post(url, payload, format="json")
self.assertEqual(resp.status_code, 400)
self.assertEqual(
resp.json()["non_field_errors"][0],
"A memory check for this agent already exists",
)
def test_get_policy_disk_check(self):
# setup data
policy = baker.make("automation.Policy")
@@ -134,3 +183,69 @@ class TestCheckViews(TacticalTestCase):
self.assertEqual(resp.status_code, 200)
self.check_not_authenticated("patch", url_a)
def test_get_check_history(self):
# setup data
agent = baker.make_recipe("agents.agent")
check = baker.make_recipe("checks.diskspace_check", agent=agent)
baker.make("checks.CheckHistory", check_history=check, _quantity=30)
check_history_data = baker.make(
"checks.CheckHistory",
check_history=check,
_quantity=30,
)
# need to manually set the date back 35 days
for check_history in check_history_data:
check_history.x = djangotime.now() - djangotime.timedelta(days=35)
check_history.save()
# test invalid check pk
resp = self.client.patch("/checks/history/500/", format="json")
self.assertEqual(resp.status_code, 404)
url = f"/checks/history/{check.id}/"
# test with timeFilter last 30 days
data = {"timeFilter": 30}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(len(resp.data), 30)
# test with timeFilter equal to 0
data = {"timeFilter": 0}
resp = self.client.patch(url, data, format="json")
self.assertEqual(resp.status_code, 200)
self.assertEqual(len(resp.data), 60)
self.check_not_authenticated("patch", url)
class TestCheckTasks(TacticalTestCase):
def setUp(self):
self.setup_coresettings()
def test_prune_check_history(self):
from .tasks import prune_check_history
# setup data
check = baker.make_recipe("checks.diskspace_check")
baker.make("checks.CheckHistory", check_history=check, _quantity=30)
check_history_data = baker.make(
"checks.CheckHistory",
check_history=check,
_quantity=30,
)
# need to manually set the date back 35 days
for check_history in check_history_data:
check_history.x = djangotime.now() - djangotime.timedelta(days=35)
check_history.save()
# prune data 30 days old
prune_check_history(30)
self.assertEqual(CheckHistory.objects.count(), 30)
# prune all Check history Data
prune_check_history(0)
self.assertEqual(CheckHistory.objects.count(), 0)

View File

@@ -7,4 +7,5 @@ urlpatterns = [
path("<pk>/loadchecks/", views.load_checks),
path("getalldisks/", views.get_disks_for_policies),
path("runchecks/<pk>/", views.run_checks),
path("history/<int:checkpk>/", views.CheckHistory.as_view()),
]

View File

@@ -1,6 +1,10 @@
import asyncio
from django.shortcuts import get_object_or_404
from django.db.models import Q
from django.utils import timezone as djangotime
from datetime import datetime as dt
from rest_framework.views import APIView
from rest_framework.response import Response
@@ -13,7 +17,7 @@ from automation.models import Policy
from .models import Check
from scripts.models import Script
from .serializers import CheckSerializer
from .serializers import CheckSerializer, CheckHistorySerializer
from automation.tasks import (
@@ -135,6 +139,29 @@ class GetUpdateDeleteCheck(APIView):
return Response(f"{check.readable_desc} was deleted!")
class CheckHistory(APIView):
def patch(self, request, checkpk):
check = get_object_or_404(Check, pk=checkpk)
timeFilter = Q()
if "timeFilter" in request.data:
if request.data["timeFilter"] != 0:
timeFilter = Q(
x__lte=djangotime.make_aware(dt.today()),
x__gt=djangotime.make_aware(dt.today())
- djangotime.timedelta(days=request.data["timeFilter"]),
)
check_history = check.check_history.filter(timeFilter).order_by("-x")
return Response(
CheckHistorySerializer(
check_history, context={"timezone": check.agent.timezone}, many=True
).data
)
@api_view()
def run_checks(request, pk):
agent = get_object_or_404(Agent, pk=pk)
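For clarity, a minimal client-side sketch of the new check history endpoint above (the host, check pk, and auth header are placeholders; only the /checks/history/<pk>/ route and the timeFilter body come from the view and tests in this changeset):

    import requests

    # timeFilter is a number of days; 0 (or omitting it) returns all history
    resp = requests.patch(
        "https://api.example.com/checks/history/1/",
        json={"timeFilter": 30},
        headers={"Authorization": "Token <api token>"},  # placeholder auth
    )
    # each item is {"x": "<ISO timestamp in the agent's timezone>", "y": <value>}
    for point in resp.json():
        print(point["x"], point["y"])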

View File

@@ -6,48 +6,48 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('clients', '0004_auto_20200821_2115'),
("clients", "0004_auto_20200821_2115"),
]
operations = [
migrations.AddField(
model_name='client',
name='created_by',
model_name="client",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='client',
name='created_time',
model_name="client",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='client',
name='modified_by',
model_name="client",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='client',
name='modified_time',
model_name="client",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
migrations.AddField(
model_name='site',
name='created_by',
model_name="site",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='site',
name='created_time',
model_name="site",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='site',
name='modified_by',
model_name="site",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='site',
name='modified_time',
model_name="site",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]

View File

@@ -8,24 +8,67 @@ import uuid
class Migration(migrations.Migration):
dependencies = [
('knox', '0007_auto_20190111_0542'),
('clients', '0005_auto_20200922_1344'),
("knox", "0007_auto_20190111_0542"),
("clients", "0005_auto_20200922_1344"),
]
operations = [
migrations.CreateModel(
name='Deployment',
name="Deployment",
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('uid', models.UUIDField(default=uuid.uuid4, editable=False)),
('mon_type', models.CharField(choices=[('server', 'Server'), ('workstation', 'Workstation')], default='server', max_length=255)),
('arch', models.CharField(choices=[('64', '64 bit'), ('32', '32 bit')], default='64', max_length=255)),
('expiry', models.DateTimeField(blank=True, null=True)),
('token_key', models.CharField(max_length=255)),
('install_flags', models.JSONField(blank=True, null=True)),
('auth_token', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='deploytokens', to='knox.authtoken')),
('client', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='deployclients', to='clients.client')),
('site', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='deploysites', to='clients.site')),
(
"id",
models.AutoField(
auto_created=True,
primary_key=True,
serialize=False,
verbose_name="ID",
),
),
("uid", models.UUIDField(default=uuid.uuid4, editable=False)),
(
"mon_type",
models.CharField(
choices=[("server", "Server"), ("workstation", "Workstation")],
default="server",
max_length=255,
),
),
(
"arch",
models.CharField(
choices=[("64", "64 bit"), ("32", "32 bit")],
default="64",
max_length=255,
),
),
("expiry", models.DateTimeField(blank=True, null=True)),
("token_key", models.CharField(max_length=255)),
("install_flags", models.JSONField(blank=True, null=True)),
(
"auth_token",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="deploytokens",
to="knox.authtoken",
),
),
(
"client",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="deployclients",
to="clients.client",
),
),
(
"site",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="deploysites",
to="clients.site",
),
),
],
),
]

View File

@@ -6,18 +6,18 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('clients', '0006_deployment'),
("clients", "0006_deployment"),
]
operations = [
migrations.RenameField(
model_name='client',
old_name='client',
new_name='name',
model_name="client",
old_name="client",
new_name="name",
),
migrations.RenameField(
model_name='site',
old_name='site',
new_name='name',
model_name="site",
old_name="site",
new_name="name",
),
]

View File

@@ -6,16 +6,16 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('clients', '0007_auto_20201102_1920'),
("clients", "0007_auto_20201102_1920"),
]
operations = [
migrations.AlterModelOptions(
name='client',
options={'ordering': ('name',)},
name="client",
options={"ordering": ("name",)},
),
migrations.AlterModelOptions(
name='site',
options={'ordering': ('name',)},
name="site",
options={"ordering": ("name",)},
),
]

View File

@@ -192,7 +192,7 @@ class GenerateAgent(APIView):
if not os.path.exists(go_bin):
return notify_error("Missing golang")
api = f"{request.scheme}://{request.get_host()}"
api = f"https://{request.get_host()}"
inno = (
f"winagent-v{settings.LATEST_AGENT_VER}.exe"
if d.arch == "64"
@@ -282,4 +282,4 @@ class GenerateAgent(APIView):
response = HttpResponse()
response["Content-Disposition"] = f"attachment; filename={file_name}"
response["X-Accel-Redirect"] = f"/private/exe/{file_name}"
return response
return response

View File

@@ -6,4 +6,4 @@ class Command(BaseCommand):
help = "Reload Nats"
def handle(self, *args, **kwargs):
reload_nats()
reload_nats()

View File

@@ -6,13 +6,13 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0008_auto_20200910_1434'),
("core", "0008_auto_20200910_1434"),
]
operations = [
migrations.AddField(
model_name='coresettings',
name='agent_auto_update',
model_name="coresettings",
name="agent_auto_update",
field=models.BooleanField(default=True),
),
]

View File

@@ -6,28 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0009_coresettings_agent_auto_update'),
("core", "0009_coresettings_agent_auto_update"),
]
operations = [
migrations.AddField(
model_name='coresettings',
name='created_by',
model_name="coresettings",
name="created_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='coresettings',
name='created_time',
model_name="coresettings",
name="created_time",
field=models.DateTimeField(auto_now_add=True, null=True),
),
migrations.AddField(
model_name='coresettings',
name='modified_by',
model_name="coresettings",
name="modified_by",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='coresettings',
name='modified_time',
model_name="coresettings",
name="modified_time",
field=models.DateTimeField(auto_now=True, null=True),
),
]

View File

@@ -7,28 +7,34 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('core', '0010_auto_20201002_1257'),
("core", "0010_auto_20201002_1257"),
]
operations = [
migrations.AddField(
model_name='coresettings',
name='sms_alert_recipients',
field=django.contrib.postgres.fields.ArrayField(base_field=models.CharField(blank=True, max_length=255, null=True), blank=True, default=list, null=True, size=None),
model_name="coresettings",
name="sms_alert_recipients",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.CharField(blank=True, max_length=255, null=True),
blank=True,
default=list,
null=True,
size=None,
),
),
migrations.AddField(
model_name='coresettings',
name='twilio_account_sid',
model_name="coresettings",
name="twilio_account_sid",
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AddField(
model_name='coresettings',
name='twilio_auth_token',
model_name="coresettings",
name="twilio_auth_token",
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AddField(
model_name='coresettings',
name='twilio_number',
model_name="coresettings",
name="twilio_number",
field=models.CharField(blank=True, max_length=255, null=True),
),
]

View File

@@ -0,0 +1,18 @@
# Generated by Django 3.1.4 on 2021-01-10 18:08
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("core", "0011_auto_20201026_0719"),
]
operations = [
migrations.AddField(
model_name="coresettings",
name="check_history_prune_days",
field=models.PositiveIntegerField(default=30),
),
]

View File

@@ -49,6 +49,8 @@ class CoreSettings(BaseAuditModel):
default_time_zone = models.CharField(
max_length=255, choices=TZ_CHOICES, default="America/Los_Angeles"
)
# prune check history older than this many days
check_history_prune_days = models.PositiveIntegerField(default=30)
mesh_token = models.CharField(max_length=255, null=True, blank=True, default="")
mesh_username = models.CharField(max_length=255, null=True, blank=True, default="")
mesh_site = models.CharField(max_length=255, null=True, blank=True, default="")

View File

@@ -4,8 +4,10 @@ from loguru import logger
from django.conf import settings
from django.utils import timezone as djangotime
from tacticalrmm.celery import app
from core.models import CoreSettings
from autotasks.models import AutomatedTask
from autotasks.tasks import delete_win_task_schedule
from checks.tasks import prune_check_history
logger.configure(**settings.LOG_CONFIG)
@@ -25,3 +27,7 @@ def core_maintenance_tasks():
if now > task_time_utc:
delete_win_task_schedule.delay(task.pk)
# remove old CheckHistory data
older_than = CoreSettings.objects.first().check_history_prune_days
prune_check_history.delay(older_than)
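Since the prune window is just a CoreSettings field, it can be adjusted from a Django shell; a small sketch (the 90-day value is only an example):

    from core.models import CoreSettings
    from checks.tasks import prune_check_history

    coresettings = CoreSettings.objects.first()
    coresettings.check_history_prune_days = 90  # keep roughly three months of graph data
    coresettings.save(update_fields=["check_history_prune_days"])

    # the maintenance task above will pick this up on its next run,
    # or the prune can be queued immediately:
    prune_check_history.delay(coresettings.check_history_prune_days)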

View File

@@ -1,6 +1,7 @@
from tacticalrmm.test import TacticalTestCase
from core.tasks import core_maintenance_tasks
from unittest.mock import patch
from core.models import CoreSettings
from model_bakery import baker, seq
@@ -34,6 +35,54 @@ class TestCoreTasks(TacticalTestCase):
self.check_not_authenticated("get", url)
@patch("automation.tasks.generate_all_agent_checks_task.delay")
def test_edit_coresettings(self, generate_all_agent_checks_task):
url = "/core/editsettings/"
# setup
policies = baker.make("Policy", _quantity=2)
# test normal request
data = {
"smtp_from_email": "newexample@example.com",
"mesh_token": "New_Mesh_Token",
}
r = self.client.patch(url, data)
self.assertEqual(r.status_code, 200)
self.assertEqual(
CoreSettings.objects.first().smtp_from_email, data["smtp_from_email"]
)
self.assertEqual(CoreSettings.objects.first().mesh_token, data["mesh_token"])
generate_all_agent_checks_task.assert_not_called()
# test adding policy
data = {
"workstation_policy": policies[0].id,
"server_policy": policies[1].id,
}
r = self.client.patch(url, data)
self.assertEqual(r.status_code, 200)
self.assertEqual(CoreSettings.objects.first().server_policy.id, policies[1].id)
self.assertEqual(
CoreSettings.objects.first().workstation_policy.id, policies[0].id
)
self.assertEqual(generate_all_agent_checks_task.call_count, 2)
generate_all_agent_checks_task.reset_mock()
# test remove policy
data = {
"workstation_policy": "",
}
r = self.client.patch(url, data)
self.assertEqual(r.status_code, 200)
self.assertEqual(CoreSettings.objects.first().workstation_policy, None)
self.assertEqual(generate_all_agent_checks_task.call_count, 1)
self.check_not_authenticated("patch", url)
@patch("autotasks.tasks.remove_orphaned_win_tasks.delay")
def test_ui_maintenance_actions(self, remove_orphaned_win_tasks):
url = "/core/servermaintenance/"

View File

@@ -42,18 +42,20 @@ def get_core_settings(request):
@api_view(["PATCH"])
def edit_settings(request):
settings = CoreSettings.objects.first()
serializer = CoreSettingsSerializer(instance=settings, data=request.data)
coresettings = CoreSettings.objects.first()
old_server_policy = coresettings.server_policy
old_workstation_policy = coresettings.workstation_policy
serializer = CoreSettingsSerializer(instance=coresettings, data=request.data)
serializer.is_valid(raise_exception=True)
new_settings = serializer.save()
# check if default policies changed
if settings.server_policy != new_settings.server_policy:
if old_server_policy != new_settings.server_policy:
generate_all_agent_checks_task.delay(
mon_type="server", clear=True, create_tasks=True
)
if settings.workstation_policy != new_settings.workstation_policy:
if old_workstation_policy != new_settings.workstation_policy:
generate_all_agent_checks_task.delay(
mon_type="workstation", clear=True, create_tasks=True
)
@@ -73,6 +75,7 @@ def dashboard_info(request):
"trmm_version": settings.TRMM_VERSION,
"dark_mode": request.user.dark_mode,
"show_community_scripts": request.user.show_community_scripts,
"dbl_click_action": request.user.agent_dblclick_action,
}
)

View File

@@ -6,13 +6,28 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0007_auditlog_debug_info'),
("logs", "0007_auditlog_debug_info"),
]
operations = [
migrations.AlterField(
model_name='auditlog',
name='action',
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command')], max_length=100),
model_name="auditlog",
name="action",
field=models.CharField(
choices=[
("login", "User Login"),
("failed_login", "Failed User Login"),
("delete", "Delete Object"),
("modify", "Modify Object"),
("add", "Add Object"),
("view", "View Object"),
("check_run", "Check Run"),
("task_run", "Task Run"),
("remote_session", "Remote Session"),
("execute_script", "Execute Script"),
("execute_command", "Execute Command"),
],
max_length=100,
),
),
]

View File

@@ -6,13 +6,29 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0008_auto_20201110_1431'),
("logs", "0008_auto_20201110_1431"),
]
operations = [
migrations.AlterField(
model_name='auditlog',
name='action',
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('agent_install', 'Agent Install'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command')], max_length=100),
model_name="auditlog",
name="action",
field=models.CharField(
choices=[
("login", "User Login"),
("failed_login", "Failed User Login"),
("delete", "Delete Object"),
("modify", "Modify Object"),
("add", "Add Object"),
("view", "View Object"),
("check_run", "Check Run"),
("task_run", "Task Run"),
("agent_install", "Agent Install"),
("remote_session", "Remote Session"),
("execute_script", "Execute Script"),
("execute_command", "Execute Command"),
],
max_length=100,
),
),
]

View File

@@ -6,18 +6,50 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0009_auto_20201110_1431'),
("logs", "0009_auto_20201110_1431"),
]
operations = [
migrations.AlterField(
model_name='auditlog',
name='action',
field=models.CharField(choices=[('login', 'User Login'), ('failed_login', 'Failed User Login'), ('delete', 'Delete Object'), ('modify', 'Modify Object'), ('add', 'Add Object'), ('view', 'View Object'), ('check_run', 'Check Run'), ('task_run', 'Task Run'), ('agent_install', 'Agent Install'), ('remote_session', 'Remote Session'), ('execute_script', 'Execute Script'), ('execute_command', 'Execute Command'), ('bulk_action', 'Bulk Action')], max_length=100),
model_name="auditlog",
name="action",
field=models.CharField(
choices=[
("login", "User Login"),
("failed_login", "Failed User Login"),
("delete", "Delete Object"),
("modify", "Modify Object"),
("add", "Add Object"),
("view", "View Object"),
("check_run", "Check Run"),
("task_run", "Task Run"),
("agent_install", "Agent Install"),
("remote_session", "Remote Session"),
("execute_script", "Execute Script"),
("execute_command", "Execute Command"),
("bulk_action", "Bulk Action"),
],
max_length=100,
),
),
migrations.AlterField(
model_name='auditlog',
name='object_type',
field=models.CharField(choices=[('user', 'User'), ('script', 'Script'), ('agent', 'Agent'), ('policy', 'Policy'), ('winupdatepolicy', 'Patch Policy'), ('client', 'Client'), ('site', 'Site'), ('check', 'Check'), ('automatedtask', 'Automated Task'), ('coresettings', 'Core Settings'), ('bulk', 'Bulk')], max_length=100),
model_name="auditlog",
name="object_type",
field=models.CharField(
choices=[
("user", "User"),
("script", "Script"),
("agent", "Agent"),
("policy", "Policy"),
("winupdatepolicy", "Patch Policy"),
("client", "Client"),
("site", "Site"),
("check", "Check"),
("automatedtask", "Automated Task"),
("coresettings", "Core Settings"),
("bulk", "Bulk"),
],
max_length=100,
),
),
]

View File

@@ -6,13 +6,22 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('logs', '0010_auto_20201110_2238'),
("logs", "0010_auto_20201110_2238"),
]
operations = [
migrations.AlterField(
model_name='pendingaction',
name='action_type',
field=models.CharField(blank=True, choices=[('schedreboot', 'Scheduled Reboot'), ('taskaction', 'Scheduled Task Action'), ('agentupdate', 'Agent Update')], max_length=255, null=True),
model_name="pendingaction",
name="action_type",
field=models.CharField(
blank=True,
choices=[
("schedreboot", "Scheduled Reboot"),
("taskaction", "Scheduled Task Action"),
("agentupdate", "Agent Update"),
],
max_length=255,
null=True,
),
),
]

View File

View File

@@ -0,0 +1,5 @@
from django.apps import AppConfig
class NatsapiConfig(AppConfig):
name = "natsapi"

View File

@@ -0,0 +1,8 @@
from django.urls import path
from . import views
urlpatterns = [
path("natsinfo/", views.nats_info),
path("checkin/", views.NatsCheckIn.as_view()),
path("syncmesh/", views.SyncMeshNodeID.as_view()),
]

View File

@@ -0,0 +1,99 @@
from django.utils import timezone as djangotime
from rest_framework.response import Response
from rest_framework.views import APIView
from rest_framework.decorators import (
api_view,
permission_classes,
authentication_classes,
)
from django.conf import settings
from django.shortcuts import get_object_or_404
from agents.models import Agent
from software.models import InstalledSoftware
from checks.utils import bytes2human
from agents.serializers import WinAgentSerializer
from tacticalrmm.utils import notify_error, filter_software, SoftwareList
@api_view()
@permission_classes([])
@authentication_classes([])
def nats_info(request):
return Response({"user": "tacticalrmm", "password": settings.SECRET_KEY})
class NatsCheckIn(APIView):
authentication_classes = []
permission_classes = []
def patch(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
agent.version = request.data["version"]
agent.last_seen = djangotime.now()
agent.save(update_fields=["version", "last_seen"])
return Response("ok")
def put(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
serializer = WinAgentSerializer(instance=agent, data=request.data, partial=True)
if request.data["func"] == "disks":
disks = request.data["disks"]
new = []
for disk in disks:
tmp = {}
for _, _ in disk.items():
tmp["device"] = disk["device"]
tmp["fstype"] = disk["fstype"]
tmp["total"] = bytes2human(disk["total"])
tmp["used"] = bytes2human(disk["used"])
tmp["free"] = bytes2human(disk["free"])
tmp["percent"] = int(disk["percent"])
new.append(tmp)
serializer.is_valid(raise_exception=True)
serializer.save(disks=new)
return Response("ok")
if request.data["func"] == "loggedonuser":
if request.data["logged_in_username"] != "None":
serializer.is_valid(raise_exception=True)
serializer.save(last_logged_in_user=request.data["logged_in_username"])
return Response("ok")
if request.data["func"] == "software":
raw: SoftwareList = request.data["software"]
if not isinstance(raw, list):
return notify_error("err")
sw = filter_software(raw)
if not InstalledSoftware.objects.filter(agent=agent).exists():
InstalledSoftware(agent=agent, software=sw).save()
else:
s = agent.installedsoftware_set.first()
s.software = sw
s.save(update_fields=["software"])
return Response("ok")
serializer.is_valid(raise_exception=True)
serializer.save()
return Response("ok")
class SyncMeshNodeID(APIView):
authentication_classes = []
permission_classes = []
def post(self, request):
agent = get_object_or_404(Agent, agent_id=request.data["agent_id"])
if agent.mesh_node_id != request.data["nodeid"]:
agent.mesh_node_id = request.data["nodeid"]
agent.save(update_fields=["mesh_node_id"])
return Response("ok")
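For reference, a rough sketch of the request bodies these unauthenticated natsapi endpoints expect (key names are taken from the views above; every value shown here is made up):

    checkin_patch = {"agent_id": "abc123", "version": "1.1.12"}  # PATCH /natsapi/checkin/

    # PUT /natsapi/checkin/ with one of the "func" variants
    checkin_user = {
        "agent_id": "abc123",
        "func": "loggedonuser",
        "logged_in_username": "jdoe",
    }
    checkin_disks = {
        "agent_id": "abc123",
        "func": "disks",
        "disks": [
            {"device": "C:", "fstype": "NTFS", "total": 500000000000,
             "used": 250000000000, "free": 250000000000, "percent": 50}
        ],
    }

    sync_mesh = {"agent_id": "abc123", "nodeid": "<mesh node id>"}  # POST /natsapi/syncmesh/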

View File

@@ -4,35 +4,35 @@ asyncio-nats-client==0.11.4
billiard==3.6.3.0
celery==4.4.6
certifi==2020.12.5
cffi==1.14.3
chardet==3.0.4
cryptography==3.2.1
cffi==1.14.4
chardet==4.0.0
cryptography==3.3.1
decorator==4.4.2
Django==3.1.4
django-cors-headers==3.5.0
Django==3.1.5
django-cors-headers==3.6.0
django-rest-knox==4.1.0
djangorestframework==3.12.2
future==0.18.2
idna==2.10
kombu==4.6.11
loguru==0.5.3
msgpack==1.0.0
packaging==20.4
msgpack==1.0.2
packaging==20.8
psycopg2-binary==2.8.6
pycparser==2.20
pycryptodome==3.9.9
pyotp==2.4.1
pyparsing==2.4.7
pytz==2020.4
pytz==2020.5
qrcode==6.1
redis==3.5.3
requests==2.25.0
requests==2.25.1
six==1.15.0
sqlparse==0.4.1
twilio==6.49.0
twilio==6.51.0
urllib3==1.26.2
uWSGI==2.0.19.1
validators==0.18.1
validators==0.18.2
vine==1.3.0
websockets==8.1
zipp==3.4.0

View File

@@ -96,5 +96,103 @@
"name": "Check BIOS Information",
"description": "Retreives and reports on BIOS make, version, and date .",
"shell": "powershell"
},
{
"filename": "ResetHighPerformancePowerProfiletoDefaults.ps1",
"submittedBy": "https://github.com/azulskyknight",
"name": "Reset High Perf Power Profile",
"description": "Resets monitor, disk, standby, and hibernate timers in the default High Performance power profile to their default values. It also re-indexes the AC and DC power profiles into their default order.",
"shell": "powershell"
},
{
"filename": "SetHighPerformancePowerProfile.ps1",
"submittedBy": "https://github.com/azulskyknight",
"name": "Set High Perf Power Profile",
"description": "Sets the High Performance Power profile to the active power profile. Use this to keep machines from falling asleep.",
"shell": "powershell"
},
{
"filename": "Windows10Upgrade.ps1",
"submittedBy": "https://github.com/RVL-Solutions and https://github.com/darimm",
"name": "Windows 10 Upgrade",
"description": "Forces an upgrade to the latest release of Windows 10.",
"shell": "powershell"
},
{
"filename": "DiskStatus.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Check Disks",
"description": "Checks local disks for errors reported in event viewer within the last 24 hours",
"shell": "powershell"
},
{
"filename": "DuplicatiStatus.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Check Duplicati",
"description": "Checks Duplicati Backup is running properly over the last 24 hours",
"shell": "powershell"
},
{
"filename": "EnableDefender.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Enable Windows Defender",
"description": "Enables Windows Defender and sets preferences",
"shell": "powershell"
},
{
"filename": "OpenSSHServerInstall.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Install SSH",
"description": "Installs and enabled OpenSSH Server",
"shell": "powershell"
},
{
"filename": "RDP_enable.bat",
"submittedBy": "https://github.com/dinger1986",
"name": "Enable RDP",
"description": "Enables RDP",
"shell": "cmd"
},
{
"filename": "Speedtest.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "PS Speed Test",
"description": "Powershell speed test (win 10 or server2016+)",
"shell": "powershell"
},
{
"filename": "SyncTime.bat",
"submittedBy": "https://github.com/dinger1986",
"name": "Sync DC Time",
"description": "Syncs time with domain controller",
"shell": "cmd"
},
{
"filename": "WinDefenderClearLogs.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Clear Defender Logs",
"description": "Clears Windows Defender Logs",
"shell": "powershell"
},
{
"filename": "WinDefenderStatus.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "Defender Status",
"description": "This will check for Malware, Antispyware, that Windows Defender is Healthy, last scan etc within the last 24 hours",
"shell": "powershell"
},
{
"filename": "disable_FastStartup.bat",
"submittedBy": "https://github.com/dinger1986",
"name": "Disable Fast Startup",
"description": "Disables Faststartup on Windows 10",
"shell": "cmd"
},
{
"filename": "updatetacticalexclusion.ps1",
"submittedBy": "https://github.com/dinger1986",
"name": "TRMM Defender Exclusions",
"description": "Windows Defender Exclusions for Tactical RMM",
"shell": "cmd"
}
]

View File

@@ -6,23 +6,23 @@ from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('scripts', '0003_auto_20200922_1344'),
("scripts", "0003_auto_20200922_1344"),
]
operations = [
migrations.AddField(
model_name='script',
name='category',
model_name="script",
name="category",
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AddField(
model_name='script',
name='favorite',
model_name="script",
name="favorite",
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='script',
name='script_base64',
model_name="script",
name="script_base64",
field=models.TextField(blank=True, null=True),
),
]

View File

@@ -6,13 +6,13 @@ from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('scripts', '0004_auto_20201207_1558'),
("scripts", "0004_auto_20201207_1558"),
]
operations = [
migrations.RenameField(
model_name='script',
old_name='script_base64',
new_name='code_base64',
model_name="script",
old_name="script_base64",
new_name="code_base64",
),
]

View File

@@ -13,7 +13,11 @@ def move_scripts_to_db(apps, schema_editor):
for script in Script.objects.all():
if not script.script_type == "builtin":
filepath = f"{settings.SCRIPTS_DIR}/userdefined/{script.filename}"
if script.filename:
filepath = f"{settings.SCRIPTS_DIR}/userdefined/{script.filename}"
else:
print(f"No filename on script found. Skipping")
continue
# test if file exists
if os.path.exists(filepath):

View File

@@ -33,9 +33,9 @@ app.conf.beat_schedule = {
"task": "winupdate.tasks.check_agent_update_schedule_task",
"schedule": crontab(minute=5, hour="*"),
},
"sync-modules": {
"task": "agents.tasks.batch_sync_modules_task",
"schedule": crontab(minute=25, hour="*/4"),
"agents-checkinfull": {
"task": "agents.tasks.check_in_task",
"schedule": crontab(minute="*/24"),
},
"agent-auto-update": {
"task": "agents.tasks.auto_self_agent_update_task",

View File

@@ -14,6 +14,7 @@ def get_debug_info():
EXCLUDE_PATHS = (
"/natsapi",
"/api/v3",
"/api/v2",
"/logs/auditlogs",
@@ -80,4 +81,4 @@ class AuditMiddleware:
def process_template_response(self, request, response):
request_local.debug_info = None
request_local.username = None
return response
return response

View File

@@ -15,25 +15,25 @@ EXE_DIR = os.path.join(BASE_DIR, "tacticalrmm/private/exe")
AUTH_USER_MODEL = "accounts.User"
# latest release
TRMM_VERSION = "0.2.17"
TRMM_VERSION = "0.2.23"
# bump this version everytime vue code is changed
# to alert user they need to manually refresh their browser
APP_VER = "0.0.100"
APP_VER = "0.0.103"
# https://github.com/wh1te909/salt
LATEST_SALT_VER = "1.1.0"
# https://github.com/wh1te909/rmmagent
LATEST_AGENT_VER = "1.1.11"
LATEST_AGENT_VER = "1.1.12"
MESH_VER = "0.7.24"
MESH_VER = "0.7.45"
SALT_MASTER_VER = "3002.2"
# for the update script, bump when need to recreate venv or npm install
PIP_VER = "4"
NPM_VER = "4"
PIP_VER = "6"
NPM_VER = "6"
DL_64 = f"https://github.com/wh1te909/rmmagent/releases/download/v{LATEST_AGENT_VER}/winagent-v{LATEST_AGENT_VER}.exe"
DL_32 = f"https://github.com/wh1te909/rmmagent/releases/download/v{LATEST_AGENT_VER}/winagent-v{LATEST_AGENT_VER}-x86.exe"
@@ -72,6 +72,7 @@ INSTALLED_APPS = [
"logs",
"scripts",
"alerts",
"natsapi",
]
if not "TRAVIS" in os.environ and not "AZPIPELINE" in os.environ:

View File

@@ -13,6 +13,9 @@ class TacticalTestCase(TestCase):
self.john = User(username="john")
self.john.set_password("hunter2")
self.john.save()
self.alice = User(username="alice")
self.alice.set_password("hunter2")
self.alice.save()
self.client_setup()
self.client.force_authenticate(user=self.john)

View File

@@ -25,4 +25,5 @@ urlpatterns = [
path("scripts/", include("scripts.urls")),
path("alerts/", include("alerts.urls")),
path("accounts/", include("accounts.urls")),
path("natsapi/", include("natsapi.urls")),
]

View File

@@ -28,18 +28,35 @@ jobs:
cd /myagent/_work/1/s/api/tacticalrmm
pip install --no-cache-dir --upgrade pip
pip install --no-cache-dir setuptools==50.3.2 wheel==0.36.1
pip install --no-cache-dir -r requirements.txt -r requirements-test.txt
pip install --no-cache-dir -r requirements.txt -r requirements-test.txt -r requirements-dev.txt
displayName: "Install Python Dependencies"
- script: |
cd /myagent/_work/1/s/api
git config user.email "admin@example.com"
git config user.name "Bob"
git fetch
git checkout develop
git pull
source env/bin/activate
cd /myagent/_work/1/s/api/tacticalrmm
coverage run manage.py test -v 2
coveralls
if [ $? -ne 0 ]; then
exit 1
fi
displayName: "Run django tests"
- script: |
cd /myagent/_work/1/s/api
source env/bin/activate
black --check tacticalrmm
if [ $? -ne 0 ]; then
exit 1
fi
displayName: "Codestyle black"
- script: |
cd /myagent/_work/1/s/api
source env/bin/activate
cd /myagent/_work/1/s/api/tacticalrmm
export CIRCLE_BRANCH=$BUILD_SOURCEBRANCH
coveralls
displayName: "coveralls"
env:
CIRCLECI: 1
CIRCLE_BUILD_NUM: $(Build.BuildNumber)

View File

@@ -41,12 +41,7 @@ mesh_config="$(cat << EOF
"NewAccounts": false,
"mstsc": true,
"GeoLocation": true,
"CertUrl": "https://${NGINX_HOST_IP}:443",
"httpheaders": {
"Strict-Transport-Security": "max-age=360000",
"_x-frame-options": "sameorigin",
"Content-Security-Policy": "default-src 'none'; script-src 'self' 'unsafe-inline'; connect-src 'self'; img-src 'self' data:; style-src 'self' 'unsafe-inline'; frame-src 'self'; media-src 'self'"
}
"CertUrl": "https://${NGINX_HOST_IP}:443"
}
}
}

View File

@@ -2,6 +2,9 @@
set -e
: "${APP_PORT:=80}"
: "${API_PORT:=80}"
CERT_PRIV_PATH=${TACTICAL_DIR}/certs/privkey.pem
CERT_PUB_PATH=${TACTICAL_DIR}/certs/fullchain.pem
@@ -31,7 +34,7 @@ server {
location / {
#Using variable to disable start checks
set \$api http://tactical-backend;
set \$api http://tactical-backend:${API_PORT};
proxy_pass \$api;
proxy_http_version 1.1;
@@ -95,7 +98,7 @@ server {
location / {
#Using variable to disable start checks
set \$app http://tactical-frontend;
set \$app http://tactical-frontend:${APP_PORT};
proxy_pass \$app;
proxy_http_version 1.1;

44
docker/install.sh Executable file
View File

@@ -0,0 +1,44 @@
#!/usr/bin/env bash
set -o nounset
set -o errexit
set -o pipefail
temp="/tmp/tactical"
args="$*"
version="latest"
branch="master"
branchRegex=" --branch ([^ ]+)"
if [[ " ${args}" =~ ${branchRegex} ]]; then
branch="${BASH_REMATCH[1]}"
fi
echo "branch=${branch}"
tactical_cli="https://raw.githubusercontent.com/wh1te909/tacticalrmm/${branch}/docker/tactical-cli"
versionRegex=" --version ([^ ]+)"
if [[ " ${args}" =~ ${versionRegex} ]]; then
version="${BASH_REMATCH[1]}"
fi
rm -rf "${temp}"
if ! mkdir "${temp}"; then
echo >&2 "Failed to create temporary directory"
exit 1
fi
cd "${temp}"
echo "Downloading tactical-cli from branch ${branch}"
if ! curl -sS "${tactical_cli}"; then
echo >&2 "Failed to download installation package ${tactical_cli}"
exit 1
fi
chmod +x tactical-cli
./tactical-cli ${args} --version "${version}" 2>&1 | tee -a ~/install.log
cd ~
if ! rm -rf "${temp}"; then
echo >&2 "Warning: Failed to remove temporary directory ${temp}"
fi

439
docker/tactical-cli Normal file
View File

@@ -0,0 +1,439 @@
#!/usr/bin/env bash
set -o nounset
set -o errexit
set -o pipefail
# FUNCTIONS
function ask_questions {
while [[ -z "$API_HOST" ]] && [[ "$API_HOST" != *[.]*[.]* ]]
do
echo -ne "Enter the subdomain for the backend (e.g. api.example.com): "
read API_HOST
done
echo "API_HOST is set to ${API_HOST}"
while [[ -z "$APP_HOST" ]] && [[ "$APP_HOST" != *[.]*[.]* ]]
do
echo -ne "Enter the subdomain for the frontend (e.g. rmm.example.com): "
read APP_HOST
done
echo "APP_HOST is set to ${APP_HOST}"
while [[ -z "$MESH_HOST" ]] && [[ "$MESH_HOST" != *[.]*[.]* ]]
do
echo -ne "Enter the subdomain for meshcentral (e.g. mesh.example.com): "
read MESH_HOST
done
echo "MESH_HOST is set to ${MESH_HOST}"
while [[ -z "$EMAIL" ]] && [[ "$EMAIL" != *[@]*[.]* ]]
do
echo -ne "Enter a valid email address for django and meshcentral: "
read EMAIL
done
echo "EMAIL is set to ${EMAIL}"
while [[ -z "$USERNAME" ]]
do
echo -ne "Set username for mesh and tactical login: "
read USERNAME
done
echo "USERNAME is set to ${USERNAME}"
while [[ -z "$PASSWORD" ]]
do
echo -ne "Set password for mesh and tactical password: "
read PASSWORD
done
echo "PASSWORD is set"
# check if let's encrypt or cert-keys options were set
if [[ -z "$LETS_ENCRYPT" ]] && [[ -z "$CERT_PRIV_FILE" ]] || [[ -z "$CERT_PUB_FILE" ]]; then
echo -ne "Create a let's encrypt certificate?[Y,n]: "
read USE_LETS_ENCRYPT
[[ "$USE_LETS_ENCRYPT" == "" ]] || [[ "$USE_LETS_ENCRYPT" ~= [Yy] ]] && LETS_ENCRYPT=1
if [[ -z "$LET_ENCRYPT" ]]; then
echo "Let's Encrypt will not be used"
echo -ne "Do you want to specify paths to a certificate public key and private key?[Y,n]: "
read PRIVATE_CERTS
if [[ "$PRIVATE_CERTS" == "" ]] || [[ "$PRIVATE_CERTS" ~= [yY] ]]; then
# check for valid public certificate file
while [[ ! -f $CERT_PUB_FILE ]]
do
echo -ne "Enter a valid full path to public key file: "
read CERT_PUB_FILE
done
# check for valid private key file
while [[ ! -f $CERT_PRIV_FILE ]]
do
echo -ne "Enter a valid full path to private key file: "
read CERT_PRIV_FILE
done
fi
fi
fi
}
function encode_certificates {
echo "Base64 encoding certificates"
CERT_PUB_BASE64="$(sudo base64 -w 0 ${CERT_PUB_FILE})"
CERT_PRIV_BASE64="$(sudo base64 -w 0 ${CERT_PRIV_FILE})"
}
function generate_env {
[[ -f "$ENV_FILE" ]] && echo "Env file already exists"; return 0;
local mongodb_user=$(cat /dev/urandom | tr -dc 'a-z' | fold -w 8 | head -n 1)
local mongodb_pass=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 20 | head -n 1)
local postgres_user=$(cat /dev/urandom | tr -dc 'a-z' | fold -w 8 | head -n 1)
local postgres_pass=$(cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 20 | head -n 1)
echo "Generating env file in ${INSTALL_DIR}"
local config_file="$(cat << EOF
IMAGE_REPO=${REPO}
VERSION=${VERSION}
TRMM_USER=${USERNAME}
TRMM_PASS=${PASSWORD}
APP_HOST=${APP_HOST}
API_HOST=${API_HOST}
MESH_HOST=${MESH_HOST}
MESH_USER=${USERNAME}
MESH_PASS=${PASSWORD}
MONGODB_USER=${mongodb_user}
MONGODB_PASSWORD=${mongodb_pass}
POSTGRES_USER=${postgres_user}
POSTGRES_PASS=${postgres_pass}
EOF
)"
echo "${env_file}" > "$ENV_FILE"
}
function update_env_field {
: # placeholder; updating individual env fields is not implemented in this version of the script
}
function get_env_field {
local search_field="$1"
awk -F "=" '{if ($1==$search_field) { print $2" } }' $ENV_FILE
}
function initiate_letsencrypt {
echo "Starting Let's Encrypt"
ROOT_DOMAIN=$(echo ${API_HOST} | cut -d "." -f2- )
echo "Root domain is ${ROOTDOMAIN}"
sudo certbot certonly --manual -d *.${ROOT_DOMAIN} --agree-tos --no-bootstrap --manual-public-ip-logging-ok --preferred-challenges dns -m ${EMAIL} --no-eff-email
while [[ $? -ne 0 ]]
do
sudo certbot certonly --manual -d *.${ROOT_DOMAIN} --agree-tos --no-bootstrap --manual-public-ip-logging-ok --preferred-challenges dns -m ${EMAIL} --no-eff-email
done
CERT_PRIV_FILE=/etc/letsencrypt/live/${ROOT_DOMAIN}/privkey.pem
CERT_PUB_FILE=/etc/letsencrypt/live/${ROOT_DOMAIN}/fullchain.pem
}
# setup defaults
# keep track of first arg
FIRST_ARG="$1"
# defaults
REPO="tacticalrmm/"
BRANCH="master"
VERSION="latest"
# file locations
INSTALL_DIR=/opt/tactical
ENV_FILE=/opt/tactical/.env
# check prerequisites
command -v docker >/dev/null 2>&1 || { echo >&2 "Docker must be installed. Exiting..."; exit 1; }
command -v docker-compose >/dev/null 2>&1 || { echo >&2 "Docker Compose must be installed. Exiting..."; exit 1; }
command -v curl >/dev/null 2>&1 || { echo >&2 "Curl must be installed. Exiting..."; exit 1; }
command -v bash >/dev/null 2>&1 || { echo >&2 "Bash must be installed. Exiting..."; exit 1; }
# check for arguments
[ -z "$1" ] && echo >&2 "No arguments supplied. Exiting..."; exit 1;
# parse arguments
while [[ $# -gt 0 ]]
do
key="$1"
case $key in
# install arg
-i|install)
if [[ "$key" != "$FIRST_ARG" ]]; then echo >&2 "install must be the first argument. Exiting..."; exit 1; fi
MODE="install"
shift # past argument
;;
# update arg
-u|update)
if [[ "$key" != "$FIRST_ARG" ]]; then echo >&2 "update must be the first argument. Exiting..."; exit 1; fi
MODE="update"
shift # past argument
;;
# backup arg
-b|backup)
if [[ "$key" != "$FIRST_ARG" ]]; then echo >&2 "backup must be the first argument. Exiting..."; exit 1; fi
MODE="backup"
shift # past argument
;;
# restore arg
-r|restore)
if [[ "$key" != "$FIRST_ARG" ]]; then echo >&2 "restore must be the first argument. Exiting..."; exit 1; fi
MODE="restore"
shift # past argument
;;
# update-cert arg
-c|update-cert)
if [[ "$key" != "$FIRST_ARG" ]]; then echo >&2 "update-cert must be the first argument. Exiting..."; exit 1; fi
MODE="update-cert"
shift # past argument
;;
# use-lets-encrypt arg
--use-lets-encrypt)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install or update-cert as first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]] && [[ "$MODE" != "update-cert" ]]; then echo >&2 "--use-lets-encrypt option only valid for install and update-cert. Exiting..."; exit 1; fi
LETS_ENCRYPT=1
shift # past argument
;;
# cert-priv-file arg
--cert-priv-file)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install or update-cert first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]] && [[ "$MODE" != "update-cert" ]]; then echo >&2 "--cert-priv-file option only valid for install and update-cert. Exiting..."; exit 1; fi
shift # past argument
if [ ! -f "${1:-}" ]; then echo >&2 "Certificate private key file ${1:-} does not exist. Use absolute paths. Exiting..."; exit 1; fi
CERT_PRIV_FILE="$1"
shift # past value
;;
# cert-pub-file arg
--cert-pub-file)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install or update-cert first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]] && [[ "$MODE" != "update-cert" ]]; then echo >&2 "--cert-pub-file option only valid for install and update-cert. Exiting..."; exit 1; fi
shift # past argument
if [ ! -f "${1:-}" ]; then echo >&2 "Certificate public key file ${1:-} does not exist. Use absolute paths. Exiting..."; exit 1; fi
CERT_PUB_FILE="$1"
shift # past value
;;
# local arg
--local)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install or update first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]] && [[ "$MODE" != "update" ]]; then echo >&2 "--local option only valid for install and update. Exiting..."; exit 1; fi
REPO=""
shift # past argument
;;
# branch arg
--branch)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install or update first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]] && [[ "$MODE" != "update" ]]; then echo >&2 "--branch option only valid for install and update. Exiting..."; exit 1; fi
shift # past argument
BRANCH="$1"
shift # past value
;;
# version arg
--version)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install or update first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]] && [[ "$MODE" != "update" ]]; then echo >&2 "--version option only valid for install and update. Exiting..."; exit 1; fi
shift # past argument
VERSION="$1"
shift # past value
;;
# noninteractive arg
--noninteractive)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]]; then echo >&2 "--noninteractive option only valid for install. Exiting..."; exit 1; fi
NONINTERACTIVE=1
shift # past argument
;;
# app host arg
--app-host)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]]; then echo >&2 "--app-host option only valid for install. Exiting..."; exit 1; fi
shift # past argument
APP_HOST="$1"
shift # past value
;;
# api host arg
--api-host)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]]; then echo >&2 "--api-host option only valid for install. Exiting..."; exit 1; fi
shift # past argument
API_HOST="$1"
shift # past value
;;
# mesh host arg
--mesh-host)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]]; then echo >&2 "--mesh-host option only valid for install. Exiting..."; exit 1; fi
shift # past argument
MESH_HOST="$1"
shift # past value
;;
# tactical user arg
--tactical-user)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]]; then echo >&2 "--tactical-user option only valid for install. Exiting..."; exit 1; fi
shift # past argument
USERNAME="$1"
shift # past value
;;
# tactical password arg
--tactical-password)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]]; then echo >&2 "--tactical-password option only valid for install. Exiting..."; exit 1; fi
shift # past argument
PASSWORD="$1"
shift # past value
;;
# email arg
--email)
if [[ -z "$MODE" ]]; then echo >&2 "Missing install first argument. Exiting..."; exit 1; fi
if [[ "$MODE" != "install" ]]; then echo >&2 "--email option only valid for install. Exiting..."; exit 1; fi
shift # past argument
EMAIL="$1"
shift # past value
;;
# Unknown arg
*)
echo >&2 "Unknown argument ${key}. Exiting..."
exit 1
;;
esac
done
# for install mode
if [[ "$MODE" == "install" ]]; then
echo "Starting installation in ${INSTALL_DIR}"
# move to install dir
mkdir -p "${INSTALL_DIR}"
cd "$INSTALL_DIR"
# pull docker-compose.yml file
echo "Downloading docker-compose.yml from branch ${branch}"
COMPOSE_FILE="https://raw.githubusercontent.com/wh1te909/tacticalrmm/${branch}/docker/docker-compose.yml"
if ! curl -sS "${COMPOSE_FILE}"; then
echo >&2 "Failed to download installation package ${COMPOSE_FILE}"
exit 1
fi
# check if install is noninteractive
if [[ -z "$NONINTERACTIVE" ]]; then
# ask user for information not supplied as arguments
ask_questions
else
echo "NonInteractive mode set."
# check for required noninteractive arguments
[[ -z "$API_HOST" ]] || \
[[ -z "$APP_HOST" ]] || \
[[ -z "$MESH_HOST" ]] || \
[[ -z "$EMAIL" ]] || \
[[ -z "$USERNAME" ]] || \
[[ -z "$PASSWORD" ]] && \
echo "You must supply additional arguments for noninteractive install."; exit 1;
fi
# if certificates are available base64 encode them
if [[ -n "$LET_ENCRYPT" ]] && [[ -z "$NONINTERACTIVE" ]]; then
initiate_letsencrypt
encode_certificates
elif [[ -n "$CERT_PUB_FILE" ]] && [[ -n "$CERT_PRIV_FILE" ]]; then
encode_certificates
# generate config file
generate_config
# generate env file
generate_env
echo "Configuration complete. Starting environment."
# start environment
docker-compose pull
docker-compose up -d
fi
# for update mode
if [[ "$MODE" == "update" ]]; then
[[ "$VERSION" != "latest" ]]
docker-compose pull
docker-compose up -d
fi
# for update cert mode
if [[ "$MODE" == "update-cert" ]]; then
# check for required parameters
[[ -z "$LET_ENCRYPT" ]] || \
[[ -z "$CERT_PUB_FILE" ]] && \
[[ -z "$CERT_PRIV_FILE" ]] && \
echo >&2 "Provide the --lets-encrypt option or use --cert-pub-file and --cert-priv-file. Exiting..."; exit;
if [[ -n "$LET_ENCRYPT" ]]; then
initiate_letsencrypt
encode_certificates
generate_env
elif [[ -n "$CERT_PUB_FILE" ]] && [[ -n "$CERT_PRIV_FILE" ]]; then
encode_certificates
generate_env
docker-compose restart
fi
# for backup mode
if [[ "$MODE" == "backup" ]]; then
echo "backup not yet implemented"
fi
# for restore mode
if [[ "$MODE" == "restore" ]] then;
echo "restore not yet implemented"
fi

17
go.mod Normal file
View File

@@ -0,0 +1,17 @@
module github.com/wh1te909/tacticalrmm
go 1.15
require (
github.com/davecgh/go-spew v1.1.1 // indirect
github.com/go-resty/resty/v2 v2.3.0
github.com/josephspurrier/goversioninfo v1.2.0
github.com/kr/pretty v0.1.0 // indirect
github.com/nats-io/nats.go v1.10.1-0.20210107160453-a133396829fc
github.com/ugorji/go/codec v1.2.2
github.com/wh1te909/rmmagent v1.1.13-0.20210112033642-9b310c2c7f53
golang.org/x/net v0.0.0-20201031054903-ff519b6c9102 // indirect
golang.org/x/sys v0.0.0-20201113233024-12cec1faf1ba // indirect
golang.org/x/xerrors v0.0.0-20200804184101-5ec99f83aff1 // indirect
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127 // indirect
)

155
go.sum Normal file
View File

@@ -0,0 +1,155 @@
github.com/StackExchange/wmi v0.0.0-20190523213315-cbe66965904d/go.mod h1:3eOhrUMpNV+6aFIbp5/iudMxNCF27Vw2OZgy4xEx0Fg=
github.com/akavel/rsrc v0.8.0 h1:zjWn7ukO9Kc5Q62DOJCcxGpXC18RawVtYAGdz2aLlfw=
github.com/akavel/rsrc v0.8.0/go.mod h1:uLoCtb9J+EyAqh+26kdrTgmzRBFPGOolLWKpdxkKq+c=
github.com/capnspacehook/taskmaster v0.0.0-20201022195506-c2d8b114cec0/go.mod h1:257CYs3Wd/CTlLQ3c72jKv+fFE2MV3WPNnV5jiroYUU=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/elastic/go-sysinfo v1.4.0/go.mod h1:i1ZYdU10oLNfRzq4vq62BEwD2fH8KaWh6eh0ikPT9F0=
github.com/elastic/go-windows v1.0.0/go.mod h1:TsU0Nrp7/y3+VwE82FoZF8gC/XFg/Elz6CcloAxnPgU=
github.com/fsnotify/fsnotify v1.4.7/go.mod h1:jwhsz4b93w/PPRr/qN1Yymfu8t87LnFCMoQvtojpjFo=
github.com/go-ole/go-ole v1.2.4/go.mod h1:XCwSNxSkXRo4vlyPy93sltvi/qJq0jqQhjqQNIwKuxM=
github.com/go-resty/resty/v2 v2.3.0 h1:JOOeAvjSlapTT92p8xiS19Zxev1neGikoHsXJeOq8So=
github.com/go-resty/resty/v2 v2.3.0/go.mod h1:UpN9CgLZNsv4e9XG50UU8xdI0F43UQ4HmxLBDwaroHU=
github.com/golang/protobuf v1.2.0/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
github.com/golang/protobuf v1.4.0-rc.1/go.mod h1:ceaxUfeHdC40wWswd/P6IGgMaK3YpKi5j83Wpe3EHw8=
github.com/golang/protobuf v1.4.0-rc.1.0.20200221234624-67d41d38c208/go.mod h1:xKAWHe0F5eneWXFV3EuXVDTCmh+JuBKY0li0aMyXATA=
github.com/golang/protobuf v1.4.0-rc.2/go.mod h1:LlEzMj4AhA7rCAGe4KMBDvJI+AwstrUpVNzEA03Pprs=
github.com/golang/protobuf v1.4.0-rc.4.0.20200313231945-b860323f09d0/go.mod h1:WU3c8KckQ9AFe+yFwt9sWVRKCVIyN9cPHBJSNnbL67w=
github.com/golang/protobuf v1.4.0/go.mod h1:jodUvKwWbYaEsadDk5Fwe5c77LiNKVO9IDvqG2KuDX0=
github.com/golang/protobuf v1.4.2 h1:+Z5KGCizgyZCbGh1KZqA0fcLLkwbsjIzS4aV2v7wJX0=
github.com/golang/protobuf v1.4.2/go.mod h1:oDoupMAO8OvCJWAcko0GGGIgR6R6ocIYbsSw735rRwI=
github.com/gonutz/w32 v1.0.1-0.20201105145118-e88c649a9470/go.mod h1:Rc/YP5K9gv0FW4p6X9qL3E7Y56lfMflEol1fLElfMW4=
github.com/google/go-cmp v0.3.0/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
github.com/google/go-cmp v0.3.1/go.mod h1:8QqcDgzrUqlUb/G2PQTWiueGozuR1884gddMywk6iLU=
github.com/google/go-cmp v0.4.0 h1:xsAVV57WRhGj6kEIi8ReJzQlHHqcBYCElAvkovg3B/4=
github.com/google/go-cmp v0.4.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/hpcloud/tail v1.0.0/go.mod h1:ab1qPbhIpdTxEkNHXyeSf5vhxWSCs/tWer42PpOxQnU=
github.com/iamacarpet/go-win64api v0.0.0-20200715182619-8cbc936e1a5a/go.mod h1:oGJx9dz0Ny7HC7U55RZ0Smd6N9p3hXP/+hOFtuYrAxM=
github.com/jessevdk/go-flags v1.4.0/go.mod h1:4FA24M0QyGHXBuZZK/XkWh8h0e1EYbRYJSGM75WSRxI=
github.com/joeshaw/multierror v0.0.0-20140124173710-69b34d4ec901/go.mod h1:Z86h9688Y0wesXCyonoVr47MasHilkuLMqGhRZ4Hpak=
github.com/josephspurrier/goversioninfo v1.2.0 h1:tpLHXAxLHKHg/dCU2AAYx08A4m+v9/CWg6+WUvTF4uQ=
github.com/josephspurrier/goversioninfo v1.2.0/go.mod h1:AGP2a+Y/OVJZ+s6XM4IwFUpkETwvn0orYurY8qpw1+0=
github.com/kr/pretty v0.1.0 h1:L/CwN0zerZDmRFUapSPitk6f+Q3+0za1rQkzVuMiMFI=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
github.com/kr/text v0.1.0 h1:45sCR5RtlFHMR4UwH9sdQ5TC8v0qDQCHnXt+kaKSTVE=
github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
github.com/mattn/go-sqlite3 v1.14.5/go.mod h1:WVKg1VTActs4Qso6iwGbiFih2UIHo0ENGwNd0Lj+XmI=
github.com/minio/highwayhash v1.0.0 h1:iMSDhgUILCr0TNm8LWlSjF8N0ZIj2qbO8WHp6Q/J2BA=
github.com/minio/highwayhash v1.0.0/go.mod h1:xQboMTeM9nY9v/LlAOxFctujiv5+Aq2hR5dxBpaMbdc=
github.com/nats-io/jwt v0.3.2/go.mod h1:/euKqTS1ZD+zzjYrY7pseZrTtWQSjujC7xjPc8wL6eU=
github.com/nats-io/jwt v0.3.3-0.20200519195258-f2bf5ce574c7 h1:RnGotxlghqR5D2KDAu4TyuLqyjuylOsJiAFhXvMvQIc=
github.com/nats-io/jwt v0.3.3-0.20200519195258-f2bf5ce574c7/go.mod h1:n3cvmLfBfnpV4JJRN7lRYCyZnw48ksGsbThGXEk4w9M=
github.com/nats-io/jwt/v2 v2.0.0-20200916203241-1f8ce17dff02/go.mod h1:vs+ZEjP+XKy8szkBmQwCB7RjYdIlMaPsFPs4VdS4bTQ=
github.com/nats-io/jwt/v2 v2.0.0-20201015190852-e11ce317263c h1:Hc1D9ChlsCMVwCxJ6QT5xqfk2zJ4XNea+LtdfaYhd20=
github.com/nats-io/jwt/v2 v2.0.0-20201015190852-e11ce317263c/go.mod h1:vs+ZEjP+XKy8szkBmQwCB7RjYdIlMaPsFPs4VdS4bTQ=
github.com/nats-io/nats-server/v2 v2.1.8-0.20200524125952-51ebd92a9093/go.mod h1:rQnBf2Rv4P9adtAs/Ti6LfFmVtFG6HLhl/H7cVshcJU=
github.com/nats-io/nats-server/v2 v2.1.8-0.20200601203034-f8d6dd992b71/go.mod h1:Nan/1L5Sa1JRW+Thm4HNYcIDcVRFc5zK9OpSZeI2kk4=
github.com/nats-io/nats-server/v2 v2.1.8-0.20200929001935-7f44d075f7ad/go.mod h1:TkHpUIDETmTI7mrHN40D1pzxfzHZuGmtMbtb83TGVQw=
github.com/nats-io/nats-server/v2 v2.1.8-0.20201129161730-ebe63db3e3ed h1:/FdiqqED2Wy6pyVh7K61gN5G0WfbvFVQzGgpHTcAlHA=
github.com/nats-io/nats-server/v2 v2.1.8-0.20201129161730-ebe63db3e3ed/go.mod h1:XD0zHR/jTXdZvWaQfS5mQgsXj6x12kMjKLyAk/cOGgY=
github.com/nats-io/nats.go v1.10.0/go.mod h1:AjGArbfyR50+afOUotNX2Xs5SYHf+CoOa5HH1eEl2HE=
github.com/nats-io/nats.go v1.10.1-0.20200531124210-96f2130e4d55/go.mod h1:ARiFsjW9DVxk48WJbO3OSZ2DG8fjkMi7ecLmXoY/n9I=
github.com/nats-io/nats.go v1.10.1-0.20200606002146-fc6fed82929a/go.mod h1:8eAIv96Mo9QW6Or40jUHejS7e4VwZ3VRYD6Sf0BTDp4=
github.com/nats-io/nats.go v1.10.1-0.20201021145452-94be476ad6e0/go.mod h1:VU2zERjp8xmF+Lw2NH4u2t5qWZxwc7jB3+7HVMWQXPI=
github.com/nats-io/nats.go v1.10.1-0.20210107160453-a133396829fc h1:bjYoZsMFpySzGUZCFrPk9+ncZ47kPCVHPoQS/QTXYVQ=
github.com/nats-io/nats.go v1.10.1-0.20210107160453-a133396829fc/go.mod h1:Sa3kLIonafChP5IF0b55i9uvGR10I3hPETFbi4+9kOI=
github.com/nats-io/nkeys v0.1.3/go.mod h1:xpnFELMwJABBLVhffcfd1MZx6VsNRFpEugbxziKVo7w=
github.com/nats-io/nkeys v0.1.4/go.mod h1:XdZpAbhgyyODYqjTawOnIOI7VlbKSarI9Gfy1tqEu/s=
github.com/nats-io/nkeys v0.2.0 h1:WXKF7diOaPU9cJdLD7nuzwasQy9vT1tBqzXZZf3AMJM=
github.com/nats-io/nkeys v0.2.0/go.mod h1:XdZpAbhgyyODYqjTawOnIOI7VlbKSarI9Gfy1tqEu/s=
github.com/nats-io/nuid v1.0.1 h1:5iA8DT8V7q8WK2EScv2padNa/rTESc1KdnPw4TC2paw=
github.com/nats-io/nuid v1.0.1/go.mod h1:19wcPz3Ph3q0Jbyiqsd0kePYG7A95tJPxeL+1OSON2c=
github.com/nxadm/tail v1.4.4/go.mod h1:kenIhsEOeOJmVchQTgglprH7qJGnHDVpk1VPCcaMI8A=
github.com/onsi/ginkgo v1.6.0/go.mod h1:lLunBs/Ym6LB5Z9jYTR76FiuTmxDTDusOGeTQH+WWjE=
github.com/onsi/ginkgo v1.12.1/go.mod h1:zj2OWP4+oCPe1qIXoGWkgMRwljMUYCdkwsT2108oapk=
github.com/onsi/gomega v1.7.1/go.mod h1:XdKZgCCFLUoM/7CFJVPcG8C1xQ1AJ0vpAezJrB7JYyY=
github.com/onsi/gomega v1.10.2/go.mod h1:iN09h71vgCQne3DLsj+A5owkum+a2tYe+TOCB1ybHNo=
github.com/onsi/gomega v1.10.3/go.mod h1:V9xEwhxec5O8UDM77eCW8vLymOMltsqPVYWrpDsH8xc=
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/prometheus/procfs v0.0.0-20190425082905-87a4384529e0/go.mod h1:TjEm7ze935MbeOT/UhFTIMYKhuLP4wbCsTZCD3I8kEA=
github.com/rickb777/date v1.14.2/go.mod h1:swmf05C+hN+m8/Xh7gEq3uB6QJDNc5pQBWojKdHetOs=
github.com/rickb777/date v1.14.3/go.mod h1:mes+vf4wqTD6l4zgZh4Z5TQkrLA57dpuzEGVeTk/XSc=
github.com/rickb777/plural v1.2.2/go.mod h1:xyHbelv4YvJE51gjMnHvk+U2e9zIysg6lTnSQK8XUYA=
github.com/shirou/gopsutil/v3 v3.20.12/go.mod h1:igHnfak0qnw1biGeI2qKQvu0ZkwvEkUcCLlYhZzdr/4=
github.com/sirupsen/logrus v1.7.0/go.mod h1:yWOB1SBYBC5VeMP7gHvWumXLIWorT60ONWic61uBYv0=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
github.com/stretchr/testify v1.6.1 h1:hDPOHmpOpP40lSULcqw7IrRb/u7w6RpDC9399XyoNd0=
github.com/stretchr/testify v1.6.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
github.com/tc-hib/goversioninfo v0.0.0-20200813185747-90ffbaa484a7/go.mod h1:NaPIGx19A2KXQEoek0x88NbM0lNgRooZS0xmrETzcjI=
github.com/tc-hib/rsrc v0.9.1/go.mod h1:JGDB/TLOdMTvEEvjv3yetUTFnjXWYLbZDDeH4BTXG/8=
github.com/ugorji/go v1.2.0/go.mod h1:1ny++pKMXhLWrwWV5Nf+CbOuZJhMoaFD+0GMFfd8fEc=
github.com/ugorji/go v1.2.2 h1:60ZHIOcsJlo3bJm9CbTVu7OSqT2mxaEmyQbK2NwCkn0=
github.com/ugorji/go v1.2.2/go.mod h1:bitgyERdV7L7Db/Z5gfd5v2NQMNhhiFiZwpgMw2SP7k=
github.com/ugorji/go/codec v1.2.0/go.mod h1:dXvG35r7zTX6QImXOSFhGMmKtX+wJ7VTWzGvYQGIjBs=
github.com/ugorji/go/codec v1.2.2 h1:08Gah8d+dXj4cZNUHhtuD/S4PXD5WpVbj5B8/ClELAQ=
github.com/ugorji/go/codec v1.2.2/go.mod h1:OM8g7OAy52uYl3Yk+RE/3AS1nXFn1Wh4PPLtupCxbuU=
github.com/wh1te909/go-win64api v0.0.0-20201021040544-8fba2a0fc3d0/go.mod h1:cfD5/vNQFm5PD5Q32YYYBJ6VIs9etzp8CJ9dinUcpUA=
github.com/wh1te909/rmmagent v1.1.13-0.20210111092134-7c83da579caa h1:ZV7qIUJ5M3HDFLi3bun6a2A5+g9DoThbLWI7egBYYkQ=
github.com/wh1te909/rmmagent v1.1.13-0.20210111092134-7c83da579caa/go.mod h1:05MQOAiC/kGvJjDlCOjaTsMNpf6wZFqOTkHqK0ATfW0=
github.com/wh1te909/rmmagent v1.1.13-0.20210111205455-e6620a17aebe h1:xsutMbsAJL2xTvE119BVyK4RdBWx1IBvC7azoEpioEE=
github.com/wh1te909/rmmagent v1.1.13-0.20210111205455-e6620a17aebe/go.mod h1:05MQOAiC/kGvJjDlCOjaTsMNpf6wZFqOTkHqK0ATfW0=
github.com/wh1te909/rmmagent v1.1.13-0.20210112033642-9b310c2c7f53 h1:Q47sibbW09BWaQoPZQTzblGd+rnNIc3W8W/jOYbMe10=
github.com/wh1te909/rmmagent v1.1.13-0.20210112033642-9b310c2c7f53/go.mod h1:05MQOAiC/kGvJjDlCOjaTsMNpf6wZFqOTkHqK0ATfW0=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20190701094942-4def268fd1a4/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=
golang.org/x/crypto v0.0.0-20200323165209-0ec3e9974c59/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
golang.org/x/crypto v0.0.0-20200622213623-75b288015ac9/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
golang.org/x/crypto v0.0.0-20201016220609-9e8e0b390897 h1:pLI5jrR7OSLijeIDcmRxNmw2api+jEfxLoykJVice/E=
golang.org/x/crypto v0.0.0-20201016220609-9e8e0b390897/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
golang.org/x/net v0.0.0-20180906233101-161cd47e91fd/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20190404232315-eb5bcb51f2a3/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
golang.org/x/net v0.0.0-20200513185701-a91f0712d120/go.mod h1:qpuaurCH72eLCgpAm/N6yyVIVM9cpaDIP3A8BGJEC5A=
golang.org/x/net v0.0.0-20200520004742-59133d7f0dd7/go.mod h1:qpuaurCH72eLCgpAm/N6yyVIVM9cpaDIP3A8BGJEC5A=
golang.org/x/net v0.0.0-20201006153459-a7d1128ccaa0/go.mod h1:sp8m0HH+o8qH0wwXwYZr8TS3Oi6o0r6Gce1SSxlDquU=
golang.org/x/net v0.0.0-20201031054903-ff519b6c9102 h1:42cLlJJdEh+ySyeUUbEQ5bsTiq8voBeTuweGVkY6Puw=
golang.org/x/net v0.0.0-20201031054903-ff519b6c9102/go.mod h1:sp8m0HH+o8qH0wwXwYZr8TS3Oi6o0r6Gce1SSxlDquU=
golang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20181221193216-37e7f081c4d4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sys v0.0.0-20180909124046-d0be0721c37e/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190130150945-aca44879d564/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20190904154756-749cb33beabd/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20191022100944-742c48ecaeb7/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20191025021431-6c3a3bfe00ae/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20191026070338-33540a1f6037/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20191120155948-bd437916bb0e/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200323222414-85ca7c5b95cd/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200622182413-4b0db7f3f76b/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200930185726-fdedc70b468f/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20201024232916-9f70ab9862d5/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20201113233024-12cec1faf1ba h1:xmhUJGQGbxlod18iJGqVEp9cHIPLl7QiX2aA3to708s=
golang.org/x/sys v0.0.0-20201113233024-12cec1faf1ba/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.4/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/time v0.0.0-20200416051211-89c76fbcd5d1 h1:NusfzzA6yGQ+ua51ck7E3omNUX/JuqbFSaRGqU8CcLI=
golang.org/x/time v0.0.0-20200416051211-89c76fbcd5d1/go.mod h1:tRJNPiyCQ0inRvYxbN9jk5I+vvW/OXSQhTDSoE431IQ=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
golang.org/x/xerrors v0.0.0-20200804184101-5ec99f83aff1 h1:go1bK/D/BFZV2I8cIQd1NKEZ+0owSTG1fDTci4IqFcE=
golang.org/x/xerrors v0.0.0-20200804184101-5ec99f83aff1/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
google.golang.org/protobuf v0.0.0-20200109180630-ec00e32a8dfd/go.mod h1:DFci5gLYBciE7Vtevhsrf46CRTquxDuWsQurQQe4oz8=
google.golang.org/protobuf v0.0.0-20200221191635-4d8936d0db64/go.mod h1:kwYJMbMJ01Woi6D6+Kah6886xMZcty6N08ah7+eCXa0=
google.golang.org/protobuf v0.0.0-20200228230310-ab0ca4ff8a60/go.mod h1:cfTl7dwQJ+fmap5saPgwCLgHXTUD7jkjRqWcaiX5VyM=
google.golang.org/protobuf v1.20.1-0.20200309200217-e05f789c0967/go.mod h1:A+miEFZTKqfCUM6K7xSMQL9OKL/b6hQv+e19PK+JZNE=
google.golang.org/protobuf v1.21.0/go.mod h1:47Nbq4nVaFHyn7ilMalzfO3qCViNmqZ2kzikPIcrTAo=
google.golang.org/protobuf v1.23.0 h1:4MY060fB1DLGMB/7MBTLnwQUY6+F09GEiz6SsrNqyzM=
google.golang.org/protobuf v1.23.0/go.mod h1:EGpADcykh3NcUnDUJcl1+ZksZNG86OlYog2l/sGQquU=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127 h1:qIbj1fsPNlZgppZ+VLlY7N33q108Sa+fhmuc+sWQYwY=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/fsnotify.v1 v1.4.7/go.mod h1:Tz8NjZHkW78fSQdbUxIjBTcgA1z1m8ZHf0WmKUhAMys=
gopkg.in/tomb.v1 v1.0.0-20141024135613-dd632973f1e7/go.mod h1:dt/ZhP58zS4L8KSrWDmTeBkI65Dw0HsyUHuEVlX15mw=
gopkg.in/yaml.v2 v2.2.1/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.4/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.3.0/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c h1:dUUwHk2QECo/6vqA44rthZ8ie2QXMNeKRTHCNY2nXvo=
gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
howett.net/plist v0.0.0-20181124034731-591f970eefbb/go.mod h1:vMygbs4qMhSZSc4lCUl2OEE+rDiIIJAIdR4m7MiMcm0=


@@ -1,8 +1,10 @@
#!/bin/bash
SCRIPT_VERSION="27"
SCRIPT_VERSION="32"
SCRIPT_URL='https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/install.sh'
sudo apt install -y curl wget
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
@@ -23,14 +25,32 @@ fi
rm -f $TMP_FILE
UBU20=$(grep 20.04 "/etc/"*"release")
if ! [[ $UBU20 ]]; then
echo -ne "\033[0;31mThis script will only work on Ubuntu 20.04\e[0m\n"
exit 1
osname=$(lsb_release -si); osname=${osname^}
osname=$(echo "$osname" | tr '[A-Z]' '[a-z]')
fullrel=$(lsb_release -sd)
codename=$(lsb_release -sc)
relno=$(lsb_release -sr | cut -d. -f1)
fullrelno=$(lsb_release -sr)
# Fallback if lsb_release -si returns anything other than Ubuntu, Debian or Raspbian
if [ ! "$osname" = "ubuntu" ] && [ ! "$osname" = "debian" ]; then
osname=$(grep -oP '(?<=^ID=).+' /etc/os-release | tr -d '"')
osname=${osname^}
fi
# determine system
if ([ "$osname" = "ubuntu" ] && [ "$fullrelno" = "20.04" ]) || ([ "$osname" = "debian" ] && [ $relno -ge 10 ]); then
echo $fullrel
else
echo $fullrel
echo -ne "${RED}Only Ubuntu release 20.04 and Debian 10 and later, are supported\n"
echo -ne "Your system does not appear to be supported${NC}\n"
exit 1
fi
if [ $EUID -eq 0 ]; then
echo -ne "\033[0;31mDo NOT run this script as root. Exiting.\e[0m\n"
echo -ne "${RED}Do NOT run this script as root. Exiting.${NC}\n"
exit 1
fi
@@ -42,6 +62,16 @@ if [[ "$LANG" != *".UTF-8" ]]; then
exit 1
fi
if ([ "$osname" = "ubuntu" ]); then
mongodb_repo="deb [arch=amd64] https://repo.mongodb.org/apt/$osname $codename/mongodb-org/4.4 multiverse"
else
mongodb_repo="deb [arch=amd64] https://repo.mongodb.org/apt/$osname $codename/mongodb-org/4.4 main"
fi
postgresql_repo="deb [arch=amd64] https://apt.postgresql.org/pub/repos/apt/ $codename-pgdg main"
# prevents logging issues with some VPS providers like Vultr if this is a freshly provisioned instance that hasn't been rebooted yet
sudo systemctl restart systemd-journald.service
@@ -93,7 +123,7 @@ echo -ne "${YELLOW}Enter a valid email address for django and meshcentral${NC}:
read letsemail
done
# if server is behind NAT we need to add the 3 subdomains to the host file
# so that nginx can properly route between the frontend, backend and meshcentral
# EDIT 8-29-2020
# running this even if server is __not__ behind NAT just to make DNS resolving faster
@@ -114,7 +144,7 @@ if ! [[ $CHECK_HOSTS ]]; then
else
echo "127.0.1.1 ${rmmdomain} $frontenddomain $meshdomain" | sudo tee --append /etc/hosts > /dev/null
fi
else
if [[ $HAS_11 ]]; then
echo -ne "${GREEN}Please manually edit your /etc/hosts file to match the line below and re-run this script.${NC}\n"
sed "/127.0.1.1/s/$/ ${rmmdomain} $frontenddomain $meshdomain/" /etc/hosts | grep 127.0.1.1
@@ -130,7 +160,7 @@ fi
BEHIND_NAT=false
IPV4=$(ip -4 addr | sed -ne 's|^.* inet \([^/]*\)/.* scope global.*$|\1|p' | head -1)
if echo "$IPV4" | grep -qE '^(10\.|172\.1[6789]\.|172\.2[0-9]\.|172\.3[01]\.|192\.168)'; then
BEHIND_NAT=true
fi
echo -ne "${YELLOW}Create a username for meshcentral${NC}: "
@@ -145,7 +175,7 @@ print_green 'Getting wildcard cert'
sudo certbot certonly --manual -d *.${rootdomain} --agree-tos --no-bootstrap --manual-public-ip-logging-ok --preferred-challenges dns -m ${letsemail} --no-eff-email
while [[ $? -ne 0 ]]
do
sudo certbot certonly --manual -d *.${rootdomain} --agree-tos --no-bootstrap --manual-public-ip-logging-ok --preferred-challenges dns -m ${letsemail} --no-eff-email
done
CERT_PRIV_KEY=/etc/letsencrypt/live/${rootdomain}/privkey.pem
@@ -161,8 +191,6 @@ echo "saltapi:${SALTPW}" | sudo chpasswd
print_green 'Installing golang'
sudo apt install -y curl wget
sudo mkdir -p /usr/local/rmmgo
go_tmp=$(mktemp -d -t rmmgo-XXXXXXXXXX)
wget https://golang.org/dl/go1.15.5.linux-amd64.tar.gz -P ${go_tmp}
@@ -198,8 +226,8 @@ sudo apt install -y nodejs
print_green 'Installing MongoDB'
wget -qO - https://www.mongodb.org/static/pgp/server-4.2.asc | sudo apt-key add -
echo "deb [ arch=amd64 ] https://repo.mongodb.org/apt/ubuntu bionic/mongodb-org/4.2 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-4.2.list
wget -qO - https://www.mongodb.org/static/pgp/server-4.4.asc | sudo apt-key add -
echo "$mongodb_repo" | sudo tee /etc/apt/sources.list.d/mongodb-org-4.4.list
sudo apt update
sudo apt install -y mongodb-org
sudo systemctl enable mongod
@@ -210,11 +238,12 @@ sudo systemctl restart mongod
print_green 'Installing python, redis and git'
sudo apt update
sudo apt install -y python3.8-venv python3.8-dev python3-pip python3-cherrypy3 python3-setuptools python3-wheel ca-certificates redis git
sudo apt install -y python3-venv python3-dev python3-pip python3-cherrypy3 python3-setuptools python3-wheel ca-certificates redis git
print_green 'Installing postgresql'
sudo sh -c 'echo "deb https://apt.postgresql.org/pub/repos/apt/ $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
echo "$postgresql_repo" | sudo tee /etc/apt/sources.list.d/pgdg.list
wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | sudo apt-key add -
sudo apt update
sudo apt install -y postgresql-13
@@ -275,12 +304,7 @@ meshcfg="$(cat << EOF
"CertUrl": "https://${meshdomain}:443/",
"GeoLocation": true,
"CookieIpCheck": false,
"mstsc": true,
"httpheaders": {
"Strict-Transport-Security": "max-age=360000",
"_x-frame-options": "sameorigin",
"Content-Security-Policy": "default-src 'none'; script-src 'self' 'unsafe-inline'; connect-src 'self'; img-src 'self' data:; style-src 'self' 'unsafe-inline'; frame-src 'self'; media-src 'self'"
}
"mstsc": true
}
}
}
@@ -463,7 +487,7 @@ server {
listen 443 ssl;
server_name ${rmmdomain};
client_max_body_size 300M;
access_log /rmm/api/tacticalrmm/tacticalrmm/private/log/access.log;
access_log /rmm/api/tacticalrmm/tacticalrmm/private/log/access.log combined if=\$ignore_ua;
error_log /rmm/api/tacticalrmm/tacticalrmm/private/log/error.log;
ssl_certificate ${CERT_PUB_KEY};
ssl_certificate_key ${CERT_PRIV_KEY};
@@ -491,6 +515,14 @@ server {
alias /srv/salt/scripts/;
}
location ~ ^/(natsapi) {
allow 127.0.0.1;
deny all;
uwsgi_pass tacticalrmm;
include /etc/nginx/uwsgi_params;
uwsgi_read_timeout 9999s;
uwsgi_ignore_client_abort on;
}
location / {
uwsgi_pass tacticalrmm;
@@ -543,9 +575,8 @@ sudo ln -s /etc/nginx/sites-available/rmm.conf /etc/nginx/sites-enabled/rmm.conf
sudo ln -s /etc/nginx/sites-available/meshcentral.conf /etc/nginx/sites-enabled/meshcentral.conf
print_green 'Installing Salt Master'
wget -O - https://repo.saltstack.com/py3/ubuntu/20.04/amd64/latest/SALTSTACK-GPG-KEY.pub | sudo apt-key add -
echo 'deb http://repo.saltstack.com/py3/ubuntu/20.04/amd64/latest focal main' | sudo tee /etc/apt/sources.list.d/saltstack.list
wget -O - 'https://repo.saltstack.com/py3/'$osname'/'$fullrelno'/amd64/latest/SALTSTACK-GPG-KEY.pub' | sudo apt-key add -
echo 'deb http://repo.saltstack.com/py3/'$osname'/'$fullrelno'/amd64/latest '$codename' main' | sudo tee /etc/apt/sources.list.d/saltstack.list
sudo apt update
sudo apt install -y salt-master
@@ -621,7 +652,7 @@ CELERYD_OPTS="--time-limit=2900 --autoscale=50,5"
CELERYD_PID_FILE="/rmm/api/tacticalrmm/%n.pid"
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_LOG_LEVEL="INFO"
CELERYD_LOG_LEVEL="ERROR"
CELERYBEAT_PID_FILE="/rmm/api/tacticalrmm/beat.pid"
CELERYBEAT_LOG_FILE="/var/log/celery/beat.log"
@@ -873,7 +904,7 @@ printf >&2 "${YELLOW}Django admin url: ${GREEN}https://${rmmdomain}/${ADMINURL}$
printf >&2 "${YELLOW}MeshCentral password: ${GREEN}${MESHPASSWD}${NC}\n\n"
if [ "$BEHIND_NAT" = true ]; then
echo -ne "${YELLOW}Read below if your router does NOT support Hairpin NAT${NC}\n\n"
echo -ne "${YELLOW}Read below if your router does NOT support Hairpin NAT${NC}\n\n"
echo -ne "${GREEN}If you will be accessing the web interface of the RMM from the same LAN as this server,${NC}\n"
echo -ne "${GREEN}you'll need to make sure your 3 subdomains resolve to ${IPV4}${NC}\n"
echo -ne "${GREEN}This also applies to any agents that will be on the same local network as the rmm.${NC}\n"

main.go Normal file

@@ -0,0 +1,15 @@
package main
import (
"flag"
"github.com/wh1te909/tacticalrmm/natsapi"
)
func main() {
apiHost := flag.String("api-host", "", "django api base url")
debug := flag.Bool("debug", false, "Debug")
flag.Parse()
api.Listen(*apiHost, *debug)
}

natsapi/api.go Normal file

@@ -0,0 +1,151 @@
package api
import (
"bufio"
"errors"
"fmt"
"log"
"os"
"runtime"
"strings"
"time"
"github.com/go-resty/resty/v2"
nats "github.com/nats-io/nats.go"
"github.com/ugorji/go/codec"
rmm "github.com/wh1te909/rmmagent/shared"
)
var rClient = resty.New()
func getAPI(apihost string) (string, error) {
if apihost != "" {
return apihost, nil
}
f, err := os.Open(`/etc/nginx/sites-available/rmm.conf`)
if err != nil {
return "", err
}
defer f.Close()
scanner := bufio.NewScanner(f)
for scanner.Scan() {
if strings.Contains(scanner.Text(), "server_name") && !strings.Contains(scanner.Text(), "301") {
r := strings.NewReplacer("server_name", "", ";", "")
return strings.ReplaceAll(r.Replace(scanner.Text()), " ", ""), nil
}
}
return "", errors.New("unable to parse api from nginx conf")
}
func Listen(apihost string, debug bool) {
var baseURL string
api, err := getAPI(apihost)
if err != nil {
log.Fatalln(err)
}
if debug {
baseURL = fmt.Sprintf("http://%s:8000/natsapi", api)
} else {
baseURL = fmt.Sprintf("https://%s/natsapi", api)
}
rClient.SetHostURL(baseURL)
rClient.SetTimeout(30 * time.Second)
natsinfo, err := rClient.R().SetResult(&NatsInfo{}).Get("/natsinfo/")
if err != nil {
log.Fatalln(err)
}
opts := []nats.Option{
nats.Name("TacticalRMM"),
nats.UserInfo(natsinfo.Result().(*NatsInfo).User,
natsinfo.Result().(*NatsInfo).Password),
nats.ReconnectWait(time.Second * 5),
nats.RetryOnFailedConnect(true),
nats.MaxReconnects(-1),
nats.ReconnectBufSize(-1),
}
server := fmt.Sprintf("tls://%s:4222", api)
nc, err := nats.Connect(server, opts...)
if err != nil {
log.Fatalln(err)
}
nc.Subscribe("*", func(msg *nats.Msg) {
var mh codec.MsgpackHandle
mh.RawToString = true
dec := codec.NewDecoderBytes(msg.Data, &mh)
switch msg.Reply {
case "hello":
go func() {
var p *rmm.CheckIn
if err := dec.Decode(&p); err == nil {
rClient.R().SetBody(p).Patch("/checkin/")
}
}()
case "osinfo":
go func() {
var p *rmm.CheckInOS
if err := dec.Decode(&p); err == nil {
rClient.R().SetBody(p).Put("/checkin/")
}
}()
case "winservices":
go func() {
var p *rmm.CheckInWinServices
if err := dec.Decode(&p); err == nil {
rClient.R().SetBody(p).Put("/checkin/")
}
}()
case "publicip":
go func() {
var p *rmm.CheckInPublicIP
if err := dec.Decode(&p); err == nil {
rClient.R().SetBody(p).Put("/checkin/")
}
}()
case "disks":
go func() {
var p *rmm.CheckInDisk
if err := dec.Decode(&p); err == nil {
rClient.R().SetBody(p).Put("/checkin/")
}
}()
case "loggedonuser":
go func() {
var p *rmm.CheckInLoggedUser
if err := dec.Decode(&p); err == nil {
rClient.R().SetBody(p).Put("/checkin/")
}
}()
case "software":
go func() {
var p *rmm.CheckInSW
if err := dec.Decode(&p); err == nil {
rClient.R().SetBody(p).Put("/checkin/")
}
}()
case "syncmesh":
go func() {
var p *rmm.MeshNodeID
if err := dec.Decode(&p); err == nil {
rClient.R().SetBody(p).Post("/syncmesh/")
}
}()
}
})
nc.Flush()
if err := nc.LastError(); err != nil {
fmt.Println(err)
os.Exit(1)
}
runtime.Goexit()
}
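For orientation, the Listen handler above routes each NATS message by its reply subject ("hello", "osinfo", "syncmesh", etc.) and forwards the msgpack-decoded body to the matching Django /natsapi/ route. Below is a minimal, hypothetical agent-side sketch of that wire format; the server address, credentials, publish subject, and payload fields are placeholders and are not part of this changeset (the real agent encodes rmm.CheckIn and friends).
package main
import (
	"log"
	nats "github.com/nats-io/nats.go"
	"github.com/ugorji/go/codec"
)
func main() {
	// Placeholder server address and credentials; the real values come from /natsinfo/.
	nc, err := nats.Connect("tls://api.example.com:4222", nats.UserInfo("natsuser", "natspass"))
	if err != nil {
		log.Fatalln(err)
	}
	defer nc.Close()
	// msgpack-encode an illustrative payload; the real agent encodes an rmm.CheckIn struct.
	var mh codec.MsgpackHandle
	payload := map[string]interface{}{"agent_id": "example", "version": "1.1.13"}
	var data []byte
	if err := codec.NewEncoderBytes(&data, &mh).Encode(payload); err != nil {
		log.Fatalln(err)
	}
	// The reply subject ("hello") is what Listen switches on to pick the API route (PATCH /checkin/).
	if err := nc.PublishRequest("example.agent.subject", "hello", data); err != nil {
		log.Fatalln(err)
	}
	nc.Flush()
}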

natsapi/types.go Normal file

@@ -0,0 +1,6 @@
package api
type NatsInfo struct {
User string `json:"user"`
Password string `json:"password"`
}
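The NatsInfo struct implies the JSON shape that the Django /natsinfo/ endpoint returns and that api.go decodes via SetResult(&NatsInfo{}). A small sketch for illustration only; the helper name and example values are assumptions, not part of this changeset.
package api
import "encoding/json"
// decodeNatsInfo is a hypothetical helper showing the assumed response body,
// e.g. {"user":"natsuser","password":"natspass"}, mapped onto NatsInfo.
func decodeNatsInfo(body []byte) (NatsInfo, error) {
	var info NatsInfo
	err := json.Unmarshal(body, &info)
	return info, err
}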


@@ -7,9 +7,11 @@ pgpw="hunter2"
#####################################################
SCRIPT_VERSION="10"
SCRIPT_VERSION="11"
SCRIPT_URL='https://raw.githubusercontent.com/wh1te909/tacticalrmm/master/restore.sh'
sudo apt install -y curl wget
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
@@ -90,7 +92,6 @@ sudo systemctl restart systemd-journald.service
print_green 'Installing golang'
sudo apt update
sudo apt install -y curl wget
sudo mkdir -p /usr/local/rmmgo
go_tmp=$(mktemp -d -t rmmgo-XXXXXXXXXX)
wget https://golang.org/dl/go1.15.5.linux-amd64.tar.gz -P ${go_tmp}


@@ -1,17 +1,9 @@
Write-Host Exporting the list of users to c:\users.csv
# List the users in c:\users and export to csv file for calling later
dir C:\Users | select Name | Export-Csv -Path C:\users.csv -NoTypeInformation
$list=Test-Path C:\users.csv
# Clear Google Chrome
Write-Host Clearing FireFox caches
Import-CSV -Path C:\users.csv -Header Name | foreach {
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Mozilla\Firefox\Profiles\*.default\cache\* -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Mozilla\Firefox\Profiles\*.default\cache\*.* -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Mozilla\Firefox\Profiles\*.default\cache2\entries\*.* -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Mozilla\Firefox\Profiles\*.default\thumbnails\* -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Mozilla\Firefox\Profiles\*.default\cookies.sqlite -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Mozilla\Firefox\Profiles\*.default\webappsstore.sqlite -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Mozilla\Firefox\Profiles\*.default\chromeappsstore.sqlite -Recurse -Force -EA SilentlyContinue -Verbose
}
Remove-Item -path c:\users.csv
Write-Host FireFox cache is cleared
Write-Host "Clearing FireFox caches"
Remove-Item -path "C:\Users\*\AppData\Local\Mozilla\Firefox\Profiles\*.default\cache\*" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path "C:\Users\*\AppData\Local\Mozilla\Firefox\Profiles\*.default\cache\*.*" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path "C:\Users\*\AppData\Local\Mozilla\Firefox\Profiles\*.default\cache2\entries\*.*" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path "C:\Users\*\AppData\Local\Mozilla\Firefox\Profiles\*.default\thumbnails\*" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path "C:\Users\*\AppData\Local\Mozilla\Firefox\Profiles\*.default\cookies.sqlite" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path "C:\Users\*\AppData\Local\Mozilla\Firefox\Profiles\*.default\webappsstore.sqlite" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path "C:\Users\*\AppData\Local\Mozilla\Firefox\Profiles\*.default\chromeappsstore.sqlite" -Recurse -Force -EA SilentlyContinue -Verbose
Write-Host "FireFox cache is cleared"


@@ -1,15 +1,7 @@
Write-Host Exporting the list of users to c:\users.csv
# List the users in c:\users and export to csv file for calling later
dir C:\Users | select Name | Export-Csv -Path C:\users.csv -NoTypeInformation
$list=Test-Path C:\users.csv
# Clear Google Chrome
Write-Host Clearing Google caches
Import-CSV -Path C:\users.csv -Header Name | foreach {
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Google\Chrome\User Data\Default\Cache\* -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Google\Chrome\User Data\Default\Cache2\entries\* -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Google\Chrome\User Data\Default\Cookies -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Google\Chrome\User Data\Default\Media Cache -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path C:\Users\$($_.Name)\AppData\Local\Google\Chrome\User Data\Default\Cookies-Journal -Recurse -Force -EA SilentlyContinue -Verbose
}
Remove-Item -path c:\users.csv
Write-Host Google Chrome cache is cleared
Write-Host "Clearing Google caches"
Remove-Item -path "C:\Users\*\AppData\Local\Google\Chrome\User Data\Default\Cache\*" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path "C:\Users\*\AppData\Local\Google\Chrome\User Data\Default\Cache2\entries\*" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path "C:\Users\*\AppData\Local\Google\Chrome\User Data\Default\Cookies" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path "C:\Users\*\AppData\Local\Google\Chrome\User Data\Default\Media Cache" -Recurse -Force -EA SilentlyContinue -Verbose
Remove-Item -path "C:\Users\*\AppData\Local\Google\Chrome\User Data\Default\Cookies-Journal" -Recurse -Force -EA SilentlyContinue -Verbose
Write-Host "Google Chrome cache is cleared"

Some files were not shown because too many files have changed in this diff.